Transcript - Episode 2: When Corporate Culture Threatens Data Security
Guest: Phil Huggins, Vice President, Stroz Friedberg

Welcome to Episode 2 of the Business of Truth podcast by Stroz Friedberg, "When Corporate Culture Threatens Data Security." Our guest for this episode is Phil Huggins, a Vice President in the firm's London office. Contrary to popular belief, it's not an organization's IT function that's responsible for ensuring the security of its information. Rather, as Phil will explain, culture is today's greatest driver of cyber security. He looks at which cultural aspects and behaviors are most likely to put a company at risk, what it means to infuse security into an organization and what IT's role should be, and whether it's even possible to remain secure while taking the business risks necessary to be profitable. So let's get started with Episode 2 of our show, right now.

Hello, Phil. Welcome to the show.

Hey, thanks for having me.

It's great to have you on. We have a very interesting topic, so why don't we jump right in? What aspects, Phil, of a company's culture are most likely to put its information at risk?

One of the key drivers of information risk is how everybody in the organization handles information. One of the fundamental problems underlying this is that security needs to be everybody's responsibility. If security is just the responsibility of the security team, we see bad decisions and bad behaviors across an organization. There are lots of reasons that drive that. Organizations that value collaboration, organizations that value convenience, mission-driven organizations, and organizations that are very agile all have incentives and drivers that push them away from security if it's not part of everybody's responsibility. For example, collaboration makes people less tolerant of stricter controls - controls that make it harder to collaborate. Also, where there are large amounts of collaboration in an organization, we find that people have less time to focus on their core tasks. As a result, they're much less tolerant of inconvenience, which leads to the next problem.
There's a lot of bad security out there. Security decisions get made in the past, they get kept, they get recycled into the future. There's a lot of security that doesn't necessarily make sense any more, and that's really inconvenient. So let's be clear: security can often inconvenience people, but it's the difference between security that says "no" and security that says "yes, but."

Mission-driven organizations are very interesting. They're very focused on the outcomes they're trying to achieve. They're very focused on the mission, on where they're going, and anything that is not the mission is not incentivized. That can be a real problem for security; it means security is always deprioritized.

We're starting to see agile methodologies - agile business and technology approaches - turning up in much larger businesses, which is great, because there's some advantage and value to be driven out of that incremental, fail-fast process. But it has to be in context. Most start-ups fail. Most start-ups take business risks that don't work out. We need to understand that sometimes a small agile team performing a function fails, and learning from that failure is great when it's just a business failure. When it's a failure of security, because we're so interconnected and such broad-based organizations these days, that failure can infect all sorts of areas of the business. We also find that we end up building up security debt. Phase two of projects rarely happens.

So Phil, what sort of behaviors result from all this? And what's risky about them?

What we're seeing is that security isn't front and center. People make risk decisions every day: when they decide whether to click a link; whether to apply a patch now or later; whether the current version of the software is good enough to be deployed in the real world; whether sharing information with the person who's asked for it seems like a good idea, a friendly thing to do. Those are all risk decisions. If that's not incentivized, if that's not communicated, if that's not explained, then people will fall back onto the key drivers of the organization. If it's a collaborative organization, they'll share. If it's an organization that wants to move quickly, to be agile, they'll look for convenience. If it's a mission-driven organization, they'll run straight through security, because it's not contributing to the mission.

Given what you've just described, why do businesses continue to tolerate this kind of activity in light of all the recent major data breaches?

Well, what we've known for years is that this is not just a security problem. People aren't great at judging risk. Even security people like myself are not great at judging risk. They won't understand it by default; it needs to be explained, it needs to be communicated. Also, the incentives behind the risk decisions they're making are often out of sync. The organization at a macro level may well not want to take those sorts of risks, but at a micro level it may be rewarding behaviors that are riskier than it would otherwise want.

So is the solution then to change company culture? That seems like a pretty big shift to make.

Changing culture is hard. In all of the security work I've ever done, this is about the hardest thing you can ever do. The goal probably isn't to destroy an existing culture. It's not to replace an existing culture. It's to build security.
It's to infuse security into the DNA of the business. It's about helping people accept responsibility for security. It's about communicating and engaging with them so that they're prepared to deal with security.

And this, as you say, is not easy at all, right?
No, no. This is amongst the hardest things I've had to do in my career. To date, we're still learning how to do it. There have been some great examples out there, some great programs, some great successes. Making the change happen is hard. Sustaining the change - that's even harder.

So let's talk about infusing security into the organization. What exactly is this? And is there something concrete that businesses should do?

Absolutely. The first thing is that a business needs to understand what security it wants. It needs to understand its risk tolerance: what's the goal in terms of how much risk it wants to take, and what's the trade-off with the benefits it wants? In order to do that, the people at the top need to have a strategy. They need to have really understood their appetite for risk, and they need to document it. That might be a policy - it normally is a policy - and it might be a strategy as well. It's about making sure they understand how the statements they're making directly relate to the risk they want to manage.

One of the big problems we've found with policy in the past is that it's pretty dry and pretty technical, and a lot of people don't really understand it. It needs to be focused on the problems of the business. It needs to be focused on the threats; it needs to be focused on the risks. It's also got to hit those compliance targets: if you've got an outside party, if you're being assessed for PCI DSS compliance, if you've got an outside auditor, you need to make sure you can tick the boxes they need you to tick. But don't just list out the sorts of things that every policy has ever had because it seems like a good idea. It has to make sense for your business.

The other thing, and this is key: you've got to make exceptions to the policy smooth. You've got to be rigorous in how you assess those exceptions, but you have to have a smooth exception process. Otherwise people will just run roughshod around it.

Is this a common mistake people make? They don't have that in place?

Absolutely. The two most common mistakes are a big, weighty, technical policy that makes little sense in the modern business, combined with an exception process that takes months for security to tell them "no, you can't have an exception." Both of those things do not help engagement; they do not help the people who are making the decisions make better decisions.
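To make the "smooth but rigorous" exception process Phil describes concrete, here is a minimal sketch in Python. It is purely illustrative - the class and field names are invented for this example, not anything Phil specifies - but it captures the two properties he calls for: an exception is quick to record, and every one is time-boxed so it comes back for review instead of quietly becoming permanent.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class PolicyException:
    """One approved deviation from a security policy (illustrative model)."""
    requester: str          # who needs the exception
    control: str            # which policy control is being waived
    justification: str      # the business reason, assessed by security
    approved_on: date
    review_after: timedelta = timedelta(days=90)  # time-boxed by default

    def needs_review(self, today: date) -> bool:
        # Exceptions resurface automatically rather than living forever.
        return today >= self.approved_on + self.review_after

# Hypothetical register: captured in minutes, reassessed on schedule.
register = [
    PolicyException("finance-team", "USB storage ban",
                    "quarterly audit data transfer", date(2015, 3, 1)),
]
due_for_review = [e for e in register if e.needs_review(date.today())]
```

The design choice that matters here is the default expiry: an exception that never lapses quietly becomes the policy.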
Okay, so what are the advantages of having a risk-based policy over one that's based on compliance, as you described?

One of the big ones is cost effectiveness. We only have so much money to spend on security these days, and we have an increasingly worrying threat environment that we operate within. We can't protect everything; I think that's becoming accepted now. We're now focusing much more on the resilience of the business - the ability to withstand attack and just survive. Cost effectiveness means we can start applying the security budget in the places that matter. The other thing, and we've mentioned it before, is inconvenience. People don't like inconvenience. They work around it and they fight it. If you're going to bring some sort of inconvenience in, if it's inescapable in meeting that security requirement, only do it where it matters. Don't do it just because you do it in every other process. Don't do it in a process that doesn't matter.

So you've found that people are willing to accept inconvenience if they understand the need?

They're a lot more willing to engage; the acceptance is a much longer-term cultural change issue. They're much more willing to engage with a real issue than they are with some theoretical statement of risk.

Okay. So what is IT's role in all this?

Cyber security and information security are very wide topics. They cut across all parts of the business, from finance through to HR. But IT is the front line. IT is where the rubber hits the road. It's where most of the threats are initiated. Most of the problems and most of the issues we need to address are related to human behavior, but it's human behavior using the IT. One of the great things IT can do is come up with good defaults. Make some of these decisions for people. Make them so they help people do their jobs, so they're not inconvenient, but make them secure to start with. That's a great first step for IT. As I said, keeping exceptions to policy smooth and fast - that's a big step forward. And this is something we'd expect IT to do: look after the hygiene. Don't expect the people outside of IT who are making risk decisions to have to make the sorts of decisions they shouldn't have to make. Should I apply this patch or not? Most people in the business wouldn't care; they assume it's being done. IT needs to change quickly, too. So talking about patching: how quickly can you get those patches out? How quickly can you change the business? That's an important part of this. Don't just plan for failure, design for failures. Understand that they're going to happen. But the key thing is: advise and engage. Get out and explain what's happening, and advise people on what they're doing. Help them make those decisions.
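As an illustration of the "good defaults" Phil describes, the sketch below uses Python's standard ssl module; the wrapper function itself is hypothetical, not an API he names. The point is that the secure behavior is what callers get without thinking about it, and opting out is a deliberate, visible act rather than a silent omission.

```python
import ssl

def make_tls_context(verify: bool = True) -> ssl.SSLContext:
    """Illustrative secure-by-default helper: certificates are verified
    unless the caller explicitly, visibly opts out."""
    context = ssl.create_default_context()  # secure settings out of the box
    if not verify:
        # The insecure path exists for the rare legitimate case, and it is
        # loud enough to stand out in code review.
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE
    return context
```

The decision has been made once, centrally, by IT; everyone else simply calls the helper and gets the secure behavior for free.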
Do most IT departments these days understand these things?

I would say not. I would say most IT departments are operationally responsible for security but still sit separate from security, even if they've got the same reporting line. They still see the incentive of the IT department as being to deliver benefits, to deliver change, to make the IT reliable and make it work - and security is someone else's problem. This comes back to accepting responsibility, and to spreading that acceptance of responsibility throughout the business.

That's interesting. If a business were, say, completely 100% secure, could business get done? Or is risk inherent in any kind of operation, and we just have to accept this?

We'll never get rid of security risk. Security risk is a human problem; it'll always be with us, like crime. I would also argue that taking risk is where we make our profit. The more we evaporate the risk, the more we mitigate the risk, the less profit we're going to make.

Last question: how do you know you've struck the right balance between risk and convenience?

Well, there's some great work going on at the moment. There are a number of indicators and factors we can look at around staff behaviors. We can start doing some very coarse measurements - stuff we've done for years but probably haven't really thought about in this way - about the sorts of security decisions we can see, password quality and that sort of thing. We're starting to see the deployment of big data platforms that do behavioral analysis looking for bad guys. One of the things on the horizon will be the use of those platforms to measure the risk decisions our staff make and to act as a platform for feedback. That's a key part here: staff need to know when they're making a good decision and when they're making a bad decision. They need to be able to pick up on that. Being able to monitor while it's happening is great, but we need to be honest: we're generally going to find out whether we've done the right thing in retrospect. We're generally going to find out by understanding our own security events, the security events in our sector, and the security events for everybody, and looking at the differences to see: are you ahead of the game? Are you ahead of the curve?
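Phil's "very coarse measurements" of staff risk decisions might look something like the sketch below. The event data and the simulated-phishing metric are invented for illustration - he doesn't name a specific platform or metric - but it shows the idea of aggregating one observable decision per team so that feedback can flow back to staff.

```python
from collections import defaultdict

# Hypothetical event stream: (team, clicked_simulated_phishing_link)
events = [
    ("finance", True), ("finance", False), ("finance", False),
    ("engineering", False), ("engineering", True),
]

def click_rates(events):
    """Coarse per-team rate of one observable risk decision."""
    clicks, totals = defaultdict(int), defaultdict(int)
    for team, clicked in events:
        totals[team] += 1
        clicks[team] += int(clicked)
    return {team: clicks[team] / totals[team] for team in totals}

for team, rate in sorted(click_rates(events).items()):
    # Feedback, not blame: each team sees its own trend over time.
    print(f"{team}: {rate:.0%} of simulated phishing links clicked")
```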
Interesting. Fascinating stuff, Phil. Thanks so much for your time and for sharing your thoughts on this important topic.

Thanks a lot.

Well, that ends Episode 2 of the Business of Truth podcast by Stroz Friedberg. Many thanks to Phil Huggins for sharing his thoughts on organizational culture and information security. As Phil explains, no business can be 100% secure, so organizations must strike a balance between risk and convenience. Thanks for joining us, and please look for future episodes of our show, where we explore information security, business intelligence, and investigations in a data-driven world.

© 2015 Stroz Friedberg. All rights reserved.