In the latest episode of The Secure Developer, Guy is joined by Geoff Belknap, Chief Security Officer at Slack. Geoff discusses what drew him into security and reveals why it's critical for security teams to be recognized as full-fledged members of engineering. He explains why it makes sense for companies to develop a track record of transparency and actively encourage community participation through bug bounty programs. Geoff also concludes that companies should encourage basic security hygiene rather than seek a silver bullet that does not exist.
About the Guest
Geoff Belknap is Chief Security Officer at Slack, and an executive-level security professional with over 20 years of experience in enterprise software, telecom, and finance. Geoff is a technical and strategic advisor to policymakers, start-ups, and investors, and a regular speaker at public and private industry events such as CIO 100, True University, Structure Security, Okta, SAP, and the FBI.
Guy Podjarny: Welcome back, everybody, to the show, The Secure Developer. Today we have on the show Geoff Belknap, the Chief Security Officer for Slack. Geoff, thanks for joining us.
Geoff Belknap: Yeah thanks for having me.
Guy: So there are a lot of questions that I have when I think about security at Slack. I personally at Snyk heavily rely on Slack for many things, and therefore heavily rely on the security of Slack.
Geoff: Thanks for being a customer.
Guy: Yeah always. Thanks for building the product. You know there's a lot of goodness and importance around the security of Slack as a whole. But specifically the reason that we're sort of having this chat, that I thought would be really interesting, is that Slack is a very fast-growing company.
It's definitely a company that moves quickly, that develops software quickly, that sort of ships software quickly, and what I'd love to dig through as we have the conversation here, is chat a little bit around how you build and how you handle security, in such a fast paced environment.
Geoff: Great yeah.
Guy: Can you spend a couple minutes to tell us about your background, and how you sort of got to Slack, and what you do there?
Geoff: I've been at Slack about, just about two years now. Before that I was at Palantir for almost six years, and working on some really interesting, challenging data security problems. This is the second half of my career. So for about 15 years before that I did a bunch of network engineering, and telecommunications architecture work at startups, and banks, and telcos.
So this is really the extension of me finding the magic and connecting two things together with cable and the lights blinking on either end. And realizing that means bad things can happen as a result of lights blinking on either end of a cable as well.
In my career I've really appreciated opportunities that allow me to have an impact, and certainly over the last few startups, you know, the measure of whether something is going to be interesting to me is whether a good decision I make is just as impactful as a bad decision. And I think, certainly for people in security, you're not really having an impact unless a bad decision could contribute to a terrible outcome as well. And while nobody wants that, for somebody that really wants to have a big impact, I think you need to understand that it matters.
Guy: Yeah, I think it's one of the areas that definitely contributes. So was there like an aha moment that made you see the light, or the dark? I don't know how you refer to the security side of things.
Geoff: No I think it's something I've always been interested in, in general. I think I was definitely one of those kids that wanted to first be a pilot, and then it was a cop, or a firefighter, or something like that. And I think as I grew up and matured, I was always very interested in the legal system, and law, and justice, and security sort of naturally played to those ideals.
I think in the mid 90s, when I was getting into telecommunications originally, the industry was just getting deregulated and taking off. The natural gravitation for somebody who's smart and likes engineering problems was to go work on that, go build the internet, and certainly I had the opportunity to work on some of the first cable internet deployments.
And we built some of the first pre-standard broadband internet networks, and that was a lot of fun. At the time, in the 90s, nobody really thought you could abuse that. It was the time of Kevin Mitnick, when abusing a telecom network meant making free long-distance phone calls and getting infamy from that.
So now, when the telecommunications industry has really matured to the point where our entire world economy is built on top of it, not only is that how those economies are driven, but entire businesses are built on top of this fabric that we've added to the world. It's really interesting to be able to contribute to how you manage risk, and how you make sure that you can enable technological change and economic innovation, but do that in a way that manages the risk, and makes sure that we're enabling that change and not enabling something negative.
Guy: Yeah for sure. The table stakes are definitely higher.
Geoff: Yeah for sure.
Guy: You know the game has gone up a few notches. So you joined Slack when? How long ago?
Geoff: I think it was January 2016. So not quite two years yet.
Guy: Okay, cool. Can you tell us a little bit about the evolution of the security team at Slack? I mean, we spoke to Shaun Gordon here from New Relic. He talked about being kind of the first security hire at 140 people, and at Optimizely it was maybe a similar story. What was Slack's trajectory around size and first security hires?
Geoff: I joined Slack when we were about 300 people, which, depending on what kind of startup you are, is a reasonable time to really focus on executive-level leadership for security.
And especially for Slack, as we try to compete in the enterprise space that we're in with some really big names, it becomes a critical part of the business.
And that was what was really interesting to me. I was very lucky in that I started out with four engineers, which is probably the biggest team that I've started with. So you know once or twice before this, when I've started security teams, it's usually one or two engineers that are on loan from some other team.
Slack had four engineers that were full-time dedicated to security. We've built from that, in not quite two years, to where we'll probably finish out this year at about 30 people full-time on the security team, which is a full-fledged part of the engineering organization at Slack. The idea is that we are focused on the things that reduce harm to our customers, and ensure that they can have this foundational trust with us.
Because, as you said, your whole business is built on this. It's both a thrilling and, you know, sobering thing when I ride Caltrain into work every day: I look around the car that I'm sitting in, look across at an engineer using Slack on his phone, and see people using Slack all over the place. For me, that's certainly the first time I've worked somewhere where the product is just ever-present in people's lives and business.
And this you know influences how they do everything. So we focused on how we ensure that we have foundational trust. We have our application or platform security team, we have our operational security teams, we have our teams that are focused on Incident Response and management, and then of course we have our teams that are focused on how we handle risk and compliance.
How do we manage risk in the Slack environment? How do we make sure that customers can align whatever compliance programs they have to what we're doing? All towards the end goal of making it so that, if you're developing on the platform, or if you're using the platform to run your business, you don't have to think about those things, right?
So you can develop an app, or build something on top of the platform, and understand you can plug right into all these things that we've built. You can bring your Fortune 100 organization to Slack, and understand that we've already provided all these touch points and mapped everything to your program.
And any of the apps you pull out of the App Directory are suitable and make sense to use in your environment; they're not just, you know, toys or fun shiny things for your devs to play with, but actual parts of your business.
Guy: So you came in and said you had four engineers, and then you talked about security as part of the engineering team. So first of all, maybe let's unpack that a little bit. It's not always a given that you consider security to be a part of the engineering team. Is that the way it's structured? Like, security as a whole is a part of the engineering organization?
Geoff: That's certainly the way it is now. I think when you start a security program, and you invite somebody in to sort of be the leader of that security program, you're never really sure exactly where that's going to go. In fact I just had coffee with someone this morning, who said, "Hey, you know we're almost at the same size Slack was when they started their security program. What do you recommend we do?"
And there's really no one common path that everybody can follow, other than the fact that you start with generalists, and you sort of set them to work on your highest priority problems, and you build from there.
The Slack Security team was originally part of the privacy and policy organization, which sort of flowed through the business side of the organization. Now I report directly to Cal Henderson, our CTO, and we're a full-fledged part of engineering.
And, you know, a first-class citizen in engineering, which really helps people understand that security matters to Slack. In this field, I feel so cliché saying, "Ah, security is important to us, and security really matters," but to some extent that's the only term I can use to describe it: it matters! It matters deeply to me, and it matters deeply to people at Slack.
You know we do treat security in a first-class way, and make sure that the platform security teams, or the application security teams are involved in reviewing new products that the engineering teams are working on. They're involved in customer integration and customer engagement when we're talking to an enterprise customer who wants to bring their entire environment over to Slack.
We're involved in that discussion and making sure that we understand their risk model, and they understand how we address that. We're involved when it comes to talking to developers that might be developing an app, or an integration for a customer, or trying to build a business on top of the Slack platform, and helping them understand how to do that in the most secure way possible.
You know, how to make the most impact, but also be able to offer products to our enterprise customers on top of the platform. So I think everywhere that we can possibly touch security in the organization, it's represented in a really positive way, and I think ultimately that should be your goal. If you really do take security seriously in your organization as you're building a start-up, or if you have an established organization, that's what you have to do, and you have to do it in a first-class way.
Guy: So to me, the choice of using the term, and putting the team in engineering, conveys a few things, and I don't know if I am wrongfully reading into it. One is it seems to prioritize building security systems, right? Building things that are a little bit more engineered to work well, as opposed to relying on manual efforts or actions. So that's maybe one. Definitely an aspect of an engineering culture.
The second is, it talks about building security in versus bolting it on. But, you know, I wonder how much of this comes from being a startup of the DevOps revolution era. It's an environment in which the mandate of breaking down walls between different teams, of sharing responsibility, at least in the development and ops world, kind of permeates into security.
But if you go to an organization that has ten years more lifespan than Slack, let alone 20, 30, or 40, the structure is not necessarily built that way. Do you think this is a prerogative of an early company, or can you envision, you know, a 20-year-old company having the security team led from within the engineering organization?
Geoff: I think, to your point, I'm definitely the very fortunate beneficiary of being able to start a security program, and to have the ability to make some of these foundational changes to how we think about it culturally. I think it really comes down to just that, right? How does your organization think about security culturally? Are problems secrets, or are they things to discuss and analyze, and really engineer solutions for? Or are they things that you bring up at your Audit Committee meeting and then never speak of again?
Specifically to your question of "can I see a 20-year-old, or older, established organization managing security out of engineering?" I think the answer is, "Yeah, I can see that, but that organization would have to work a lot harder at it than I have." Again, I'm very fortunate in that I'm starting with a clean slate. There's always this really tempting tendency, certainly with people on Twitter, to go, "Well, what if this company had just done X?" or "Why don't they just X?"
And it's a great example of not understanding how complex some of these problems are, and how complex large businesses are. But I think you've seen a lot of large organizations pivot recently. You've seen leadership changes at Microsoft wholesale shift that organization.
You've seen places like IBM make dramatic changes to how they operate their environment. And it shows that if some of these largest organizations in the world can make that change, then you know your organization can make that change too. It just has to be a high priority for that organization.
Guy: Sort of on that note, talking a little bit about the pace. I would say that the premise, the motivation, or at least one more reason that you would want to shift your organization as a whole, or your security organization, seems to fundamentally come down to pace. The Internet is moving at an increasingly fast pace, and a business needs to catch up.
You need to iterate, you need to ship stuff quickly, you need to try out ideas and fail quickly, so you can switch and find the one that wins. So, you know, everything needs to be fast, fast, fast, and fast is a scary thing sometimes for security. So on that note, let me dig a little bit into that speed element, right?
That's one of my curiosity points indeed. How do you see that? At least from the outside, it seems like Slack is shipping stuff very quickly. What kind of philosophy do you have, and how does it come into practice, around balancing this speed of delivery?
The faster you deliver something, the better you need to be at understanding the risk, and finding some levers to control or adjust it.
Geoff: I think in Slack's case, just like everybody else's, they're either trying to play agile or actually be agile. I know that's probably a terribly cliché way to describe it, but the reality is, people want to move fast, change fast, and develop things fast. Slack pushes hundreds of changes a day that are constantly improving the product and the platform. That's just completely counter to how enterprise software was delivered even five, maybe ten years ago.
And certainly that's something we talk with our auditors or our customers about, but you know you can move at that speed, and you can do that, even if it confounds the auditors, and scares people that were more traditional risk managers. Because the reality is your customers want that. They want you to go fast. They want it to be improved fast.
The reason people are signing up for SaaS products and subscription-based services is that their expectation is that the product is going to constantly improve, and certainly Slack, just like everyone else, wants to meet that need. Well, that makes my job both very exciting and very stressful, in that I have to constantly be building a program, and controls, and visibility.
You know, if I just think about the tools that can tell us what's happening and alert us if something's going wrong, it means we have to respond faster when we do find something that's gone wrong, or if we think something's veering off in a direction that's less than safe. But it also means we have to have, culturally, the ability to flag problems inside the organization.
To escalate those, to have discussions with comms, or general counsel, or with your CEO or your CTO, about what these problems are, how they impact the business, and what we should do. Do we change a high-level, top-level strategy?
Is this a simple logistical change? You need the ability to quickly surface, discuss, and make decisions, and then disseminate an action plan inside your organization. All those things contribute deeply to whether you are secure or not, and whether your business is going to survive.
Guy: So how does that play out in reality? You know, I share the concepts, right? You want to be able to communicate. You want to talk about risk without some zero-tolerance mindset; sometimes you need to move fast. How does that manifest? Can you share some best practices, some tools, or maybe principles that you apply, that we can adopt elsewhere as well in kind of our daily life?
Geoff: I think a great example of how this manifests, and how to measure whether this is working well in your organization, is a bug bounty. If you're at a maturity phase where you can have one, it's a great test of a) whether your organization is mature in terms of how it handles risk and how it responds to things, and b) whether you're mature enough that you can accept, you know, random people on the internet telling you how bad your product is.
Other than your customers, that is. I think it's really a big step. So for us, we have a bug bounty. I was very lucky in that when I walked in the door, Slack had already established this. But, you know,
A bug bounty is really something that from the outside seems like an interesting idea, and is valuable, but to do it well becomes part of the fabric of your technical and your business culture.
You're putting something out there where you tell people: please find flaws in my product, tell me about them, and we will fix them and compensate you for the trouble. That seems really straightforward. But what it means is, we get submissions all day long from the bug bounty program, and we scrutinize each of them through a very serious eye.
We triage those, and then we have an internal discussion with the team that looks at that, and they make a decision: is this really a problem or not? Do we need more information? And then the stuff nobody ever sees kicks into work: we file a ticket, we might go over to that product team, or the product management team, and try to understand, you know, is this a problem?
Or, once we've confirmed that this is a problem: what priority is this? How fast do we think this needs to be fixed? Is this something where we think customer data is at risk immediately, and we need all hands on deck to fix it? Or is this something that can just go into the next release cycle?
And that's a hard discussion to have if you don't have a strong engineering culture and a strong security culture, because people generally don't like to fix bugs. I'm sure nobody listening to this podcast hates to fix bugs, but everybody else does.
Guy: You can include the listeners.
Geoff: Yeah. I think everybody else would sort of like to wait till next quarter, or maybe the next release, or something else that's going to make money. So it's very easy to see the friction between the need to address risk, and the need to sort of drive the business forward, and you know your security program has to act in a credible way to represent that risk to the business, and sort of get priority on that.
So now we've established that there's priority there, and now we get down to fixing it and confirming it. Then you have to go out and tell that researcher, "You know what? You were right. That's a bug. We fixed it. Can you please confirm whether it's fixed?" And then you're giving that researcher permission: if you want to go tell people about this bug, if you're going to write a blog post, or post it on Twitter, or whatever it is, you have our permission.
And you know, there are some reasonable constraints on that, but at the end of the day, you know, you're agreeing as part of this sort of social contract, that they can tell people about this flaw that they found. You know and you have to take a moment and sort of let that settle in, that you're agreeing to let someone tell other people about something you fucked up.
I think there are very few industries where companies are willing to let people tell them what they're doing wrong, take that feedback very seriously, fix it, and then let those people tell other people about it, right? Could you imagine being in a relationship where somebody posts all your flaws online? Well, maybe there are relationships like that.
Guy: They do exist, yeah. I think bug bounties are an amazing thing. We have one at Snyk, you know, from an early age. I see them as such a massive boon and advantage. You basically get a bunch of auditors of varying skills going off and testing and finding issues, and whatever it is that you pay them, it's probably nowhere near as much as you would have paid hiring those auditors in.
And we love those audits, and even if people submit the wrong thing, we send them schwag; you know, I send them some stickers and magic ones that we have, and it's definitely successful. I'd never really considered it that way, though. I like the idea of thinking about bug bounties as a transparency vehicle. I think when DevOps happened, a lot of the revolution maybe came from people acknowledging failure, right?
Embracing failure: internally, talking about blameless post-mortems and things like that, but even externally, right? Somebody getting up on stage and talking about this massive outage they had, and how they screwed up, kind of, multiple times along the way, you know? And how they handled it, how they learned from it, and how they're doing it better today.
That didn't really happen in security. It's risky. It's scary to do this, you know? It's really scary to stand up on stage and describe your existing security process, because you feel like everybody's going to find the gaps, and it definitely is scary to get up and talk about security mistakes that you've made. But bug bounties are kind of a safe way to do it, because, you know, it was a mistake, somebody found it, but it wasn't a breach.
Geoff: Yeah, I think it's a safe way for organizations to dip their toes in the waters of transparency, and I think the reality is we're not headed towards a future where there's less transparency, right? I'm looking at a future where people need to have more information, and consumers need more information about, you know, what the privacy impacts of your products are, and what security features you have.
I'd love to see an environment where there was something like an Energy Star logo, or Nutrition Facts like you have on the side of the cereal box, so people can evaluate one SaaS provider versus another and go, "Oh, this one's got worse SSL or TLS certificates than this one."
Or, "This one handles my data in a different way." We're going to move towards a time like that. Maybe five years ago you could try to sue somebody into not presenting about your product at Black Hat or DEF CON, or writing a blog post. The reality is you can't, and going that way, trying to keep a lid on how you're approaching these things and how slow or fast you're fixing them, is the wrong way to go.
The right way to go is to treat these things right: give them the right priority, do a consistent job of addressing them, even if you're not fixing them as fast as someone else might like. As long as you're fixing it, and doing it in a transparent way, you're in a good place.
Companies, especially startups, are starting to figure out that having a track record of transparently and consistently addressing security problems lends you towards a place where you're improving the brand.
You're improving the trust you have with your customers, or your prospective customers. And without getting too far into the weeds of the news, if you look at breaches and other things that happen, for companies that historically have not been transparent, and have not been consistent in how they handle these things, the fallout is going to be much more dramatic than if people know that you've been doing the right thing.
Or making an honest attempt at taking security seriously all along. It makes it much less of a TV trope to say you take security seriously when you're in front of Congress.
Guy: But once again, the table stakes are high there. So let's maybe continue down this line of laying blame. I'm not sure if I'm remembering my facts correctly, but I think maybe it was when Cloudbleed happened: the Slack security team sent flowers, or some box of chocolates, or was it the other way around? There was some showing of love from the Slack security team to another company after a breach, or vice versa.
And while I don't remember the details, these types of occasions stick in mind, because they're oh so unusual in the security chatter. We just had a massive breach in Equifax. We've had massive ones before, maybe not quite that size, but definitely each time one of those happens, when there's some big breach, some big data leak, the finger-pointing begins, right? It's all about laying blame. Part of it is washing your hands: "It's not my fault!" And part of it is glee.
You know, maybe just sort of gloating, sorry to say, over somebody else's failure. And I find, from the perspective of somebody providing security tools and talking about reducing risk, and I suspect you share this as well, that it's really hard to talk about security in a positive tone, right? To educate for security, to educate about reducing risk, without having the narrative be about hyping up the risk, right?
Or having this fear-induced, blame-induced environment: if you don't do this, you're going to lose your job. If you don't do this, you're going to be breached, the world will come to an end. What are your thoughts on this? How do we advance on that path?
Geoff: I think the answer is definitely more cake.
Guy: That's always good.
Geoff: I think what you're thinking of is, we sent Atlassian some cake or some cookies recently, at the launch of their product. I think in the past we've also you know sent cake or pizza when you know when friends are having a bad day, because the reality is, even though we're all in this market, and we're competing against each other, whether it be you know Microsoft, or Atlassian, or anyone else, we all rise and fall together, right?
So the tide comes in and out; we all go up and down together. A breach at one cloud provider is not a cause for joyous celebration. It's a time for us all to reflect on: that could have been us, and what are we doing to make sure that doesn't happen? And quite frankly, for people in my position, and a lot of people on my team, it's a "do those guys need any help?", right?
Is there anything that we can do to help them out? If there's any information that we have, whether it be threat intelligence or smart people that are working on a problem, quite frankly, we're often very ready to help, and I've seen that go both ways. People have reached out and offered assistance across the community, and that's one of the parts of the community I really like.
The alternative to that is, we also live in an environment where people are very ready to pitch their product on top of whatever the latest breach is. I think that's because security is very hard to understand and grasp the ROI of, and to understand when you should be buying versus building, and what you should be buying. And there are a ton of things to buy: there's something like 1,700 different startups being covered by different analysts right now.
All of them are in this market trying to sell a security-relevant product, and there's a ton of money floating around. They all need an opportunity to market. So anytime there seems to be some sort of news about security, I get a flood of emails, as does everybody else in the industry, I think, about how vendor X's product would have stopped this.
And it's both infuriating and unhelpful to the industry as a whole, because that's not what we're here to do. Quite frankly, I don't think I've ever received an email where that was true, that vendor X's product would have prevented whatever I just read about in the news. It makes it harder for our industry, our discipline, and our engineering practice to be better recognized, because there's this sort of ambulance-chasing.
You know, I think eventually, between M&A and just the natural momentum of the market, that will settle down and this will make sense. I think it's going to take longer than we'd like, but people will realize that security, cyber security, risk management, information security, whatever you want to call it, has to become a core part of how you operate a business.
And, you know, the thing you should be spending most of your time on when you're deciding what to buy, or what to build, is: what are the highest-priority things for your business? What's strategic, what's important? What are you spending the most time on in terms of problems to solve? Then maybe those are the things to spend your money on.
But otherwise, like we talked about before, you really should be building things that are directly going to enable your business to achieve its strategic outcomes. And if you're not focused on those things because you're distracted by something else that's taking all your time, that's a great thing to spend money on to make go away, either with people or technology. Everything else should be focused on giving you the best visibility, and the best ability to control or influence those outcomes.
Guy: I definitely relate to the vision, right? The goal, shall we say? You know, maybe I'm a tiny bit more pessimistic about whether consolidation of the security market would lead us there. I feel like there's more, you know, salesmanship; it's kind of never really gone away from our culture at any point in time. But I do think it's something that, if we don't do anything about it, is definitely going to get worse, and we need to work on it, right?
We need to push for it, and try to incentivize it. I guess we have time for one more of these topics of conversation. How do you see incentives in this world, right? Because you talked about the counterexample of ambulance-chasing after a hack, but how do you celebrate security success? How do you reward good behavior, you know? Good achievements in security?
Geoff: The way I think about incentives is, transparency helps a lot here. The economic incentives have been misaligned for security for a long time, in that if you look at breaches historically (for the current breach everyone's talking about, it's probably too early to tell), you'll find that the cost of being breached is very short-term.
So especially if you're a public company, you'll find that your stock price might take a hit, at least temporarily, but the reality is like, that will come back, and people will buy their diapers, or hammers, or whatever it is that you sell, and things will stabilize again, and right now the economic incentive is to scare people into buying your thing, or spending money in a certain place.
It's not directly aligned with actually making things better, because people by and large have not spent much time studying, and putting information out in a broad way about, what is better, and what it takes to make things better. Somebody I'm a big fan of, Bob Lord, talks about this in the sense of: we all know we should eat less and exercise more, right?
Certainly I'm well aware of that fact personally, but occasionally you have to skip a day. You don't go to the gym, or you skip a leg day, or whatever it is, and that's fine, but you understand that there are consequences with that, and you understand that making different choices about what you eat, or having a cheat day, or whatever it is, is different than deciding, well I know I'm supposed to eat less and exercise more, but I really like eating a full sheet cake, and drinking an entire bottle of whisky every day.
And while that sounds wonderful, you can't sustain that on a daily basis. Something bad is going to happen; chances are, probably more bad things than wonderful. So you have to manage risk in a way that, you know, leads you down a path where you're constantly improving, and that's not any fun. It is fun to buy the brand new APT dark web threat intelligence machine-learned something-or-other.
And you can, you know, wrap yourself in this comfortable blanket of, oh good, now all the threats will be found by whatever this thing is. But the reality is, the things that make you safer every day are the really boring things, right? Managing inventory, managing your risk, understanding where your risk is.
Guy: Security hygiene right?
Geoff: Yeah, understanding the hygiene. And you know, while it's easy to say, oh, if you just keep up to date on your patches, then you'll be fine. Well, in a complex environment, understanding what is there to be patched, what its current patch state is, how many machines you actually own, or how many VMs you're actually running at any given time, that's a really complex problem, and it doesn't have a straightforward solution.
You know, it is easy to get distracted by buying that magical silver bullet, versus doing that hard hygiene work, or that hard, you know, self-care work. So I think more transparency lends itself to putting the incentives in the right place. Because if you're being transparent, if you had to disclose what your security status was, what your hygiene was, as part of your quarterly filings.
If the market was incentivized in a way where you had to put, like we talked about, this nutrition-facts kind of label on the side of your product, you would be incentivized to make sure that you're constantly making improvements in these areas. You'd be less excited about buying silver bullets, and more focused on making steady improvements, and listening to what your consumers want, or giving consumers what you think they need, versus just trying to defend yourself against, you know, an inevitable lawsuit.
Guy: Definitely a complicated equation to have, and it's indeed fun. It's fun to build these sort of advanced APT machine learning darkweb thingy as well.
Geoff: Yeah and to be honest, like I think there is certainly, there is a need for some of these, and many of them are valuable.
Guy: There's no silver bullet.
Geoff: Yeah, yeah. Having the best algorithm is not going to solve any problems for you if you aren't doing any of these basic hygiene things, and it's not that the hygiene things are easy. If you look at sort of Maslow's hierarchy of needs applied to security, you need to address some of these things first before you're spending time on self-actualization, or making sentient AI.
Guy: So this was fascinating. I have all sorts of questions for you, but I think we're out of time. So before we let go, I'll ask a question I ask all of my guests here. If there was one tip, one piece of advice, or maybe the other way around, some pet peeve you have about what people aren't doing, that could help a security team, a development team, a company level up their level of security, what's your tip?
Geoff: I think I was just ranting about this on Twitter, which honestly I have to be more specific about, but the best tip I have is: if you only have one place to focus, focus on people. Focus on the really hard, not instantly gratifying thing of investing in your people. Invest in hiring great people. Invest in giving the people that you have hired the things that they need.
Listen to them. Give them your time. Give them your support, your trust, your respect, and they're going to do great things for you. Putting great people on your team is going to be way better than spending double or triple that amount of actual hard money on some security product, right?
In fact, some of the best security products that we have are the least expensive things we spend money on. I think spending money, but also just investing time, in your people is the best thing you can do, and it's also, quite frankly, the least expensive, easiest thing you can do.
Guy: Yeah, excellent tip. Definitely, if you invest in people, I think the rest will come. You will choose the right tools, you will build the right practices. Fully agree. Well, this was a super great conversation. Thanks a lot for coming. If people want to keep up with your thoughts, want to follow you on Twitter, or contact you some other way, can you share how we can find you?
Geoff: You can follow me on Twitter, which is probably a terrible idea if you want to have a rational conversation, but I'm @GeoffBelknap on Twitter. I don't know, come work at Slack! That's the easiest way to spend a bunch of time with me.
Guy: That works.
Guy: Cool, well thanks a lot Geoff for coming on the show.
Geoff: Thanks for having me. It was super fun.
Guy: And thanks to everybody that tuned in, and join us for the next one.