85. Cyber Behaviour & Influence - part 1 with Lloyd Evans


Opening this season is part 1 of the webinar recording Claire co-hosted with Lloyd Evans from LastPass, as they discuss human behaviours and the impact of culture and values on cybersecurity.

Lloyd Evans leads the LastPass business across JAPAC (including India). When he's not training for his next ultra-marathon, Lloyd and the global LastPass teams are helping companies address the human habits and behaviours behind password risk, to reduce the leading cause of data breaches globally: compromised credentials.

A cybersecurity, cloud and technology industry veteran, Lloyd has previously held senior management roles with SolarWinds, Commonwealth Bank Australia, St. George Bank and Macquarie Bank.

This season we have partnered with LastPass, the leading password manager, and we are discussing behaviour and influence when it comes to cybersecurity.

Links:

Lloyd's LinkedIn

LastPass website


Transcript

CP: Hello, and welcome to The Security Collective podcast. I'm Claire Pales, and welcome to season nine. This season looks a little different. We've got a few firsts. This season is the first time we've had a theme for all 10 episodes. We're 100% focused on behaviour change and influence. And I've spoken to some awesome leaders who are shifting away from awareness and very much towards action. This season for the first time, we've also got a partner. I joined forces with LastPass to bring you season nine, and we'll be kicking off the season with another first, the first time we recorded a live episode via webinar. Today's episode was recorded in early February 2022 when Lloyd Evans from LastPass and I came together to talk about behaviours, education, and what positive cyber culture looks like. It was a longer chat than usual for the podcast. So this week, we have part one for you, and make sure you tune in for part two, which includes the audience questions. I hope you enjoy part one of my chat with Lloyd Evans of LastPass.

LE: So straight into it I think is probably the easiest way to do it, Claire. When we spoke originally, we were talking about this dichotomy between, you know, awareness and the actual behaviours of staff as it relates to cybersecurity. And so, given the work you do with boards and other things, and the experience that lends you, why do you think awareness doesn't work on its own?

CP: Yeah, I mean, we can talk at length, I guess, about this particular topic, because awareness in itself has been a part of cybersecurity for a long time. Trying to get our employee communities to be a lot more focused on the role that they play in cybersecurity and the protection of an organisation and its information and digital assets. But just because somebody is aware of something doesn't necessarily mean that there's a call to action or that they are going to do something different. Most people are just trying to get their jobs done and will take the desired path to get that done. And so what we see is organisations putting in what they think is best practice in terms of making their staff aware. And they even call people cybersecurity awareness managers and cybersecurity awareness leads and cybersecurity awareness projects. But this awareness part is really a compliance activity, I think. It allows a business to put training in place, or brown bag lunches, and sometimes that information can be quite one way. It doesn't necessarily lead to behaviour change. And even if you do enough awareness briefings and awareness sessions and awareness training, it makes your employee community more aware, but it doesn't necessarily drill into their behaviours and the way that they might act when it comes to reviewing or understanding what a phishing email is, for example. Or using a strong password, or even inviting security into a meeting where they might be changing a product or using sensitive information. So my major issue is with the word awareness, because in itself it talks about being aware, but it doesn't necessarily come down to the security behaviours that we want our organisation and our culture to be instilling in our employee community.

LE: Which I think is super interesting, right? Because, you know, I try and stay away from using the term awareness and really look at influence. I think it's important to talk about it in that sort of context and to meet people where they are. Like, if we're talking to certain staff within an organisation, really tailoring the message to them. It's sort of akin to, you know, most people know that they should have eight hours of sleep, and they should drink X number of glasses of water, but do they actually do it? They have awareness around it. But how do you change that and get the behaviour into it? A lot of what we talk about is really getting to what we refer to as cognitive dissonance and this change of habit. It's more the habit aspect of it that we tend to deal with here at LastPass, and when we're dealing with security professionals. One thing that's been interesting to me is looking at the progression of what's happened with remote working and COVID. We've obviously gone through this digital shift, I suppose, not that it's really been that much of a digital shift, we've just moved to remote working. But one thing that I was interested to hear is, you know, what type of methods can companies implement to improve security habits with people working remotely? Like, I know we've chatted before about this, and I think the story, if I'm correct, was that someone actually mailed you a computer with the password attached, I believe. What type of things have you seen from your experience, just generally, with people working remotely?

CP: Yeah, I think this has made the lives of security professionals even more challenging. A lot of organisations two years ago, when COVID first became an issue, were already ready for this. You know, they already had a remote workforce, or they had remote working set up. But I don't think the policies were necessarily in place, or the expectations were in place, for this magnitude. And so what has happened is organisations overnight were sent home and had to work from home, and the environment that they work from became kitchen tables and kitchen benches, alongside their kids who were home schooling. Some people already had a great setup and had an office where you're able to close the door or shut the lid of your laptop, and that was the end of it. But not everybody had that luxury. And most people were balancing things in quite a difficult fashion. So yes, you know, I did have an example where I was mailed a laptop, and it had the password stuck to the top. And on the outside of the packaging, it actually said this is a laptop as well. So, you know, it didn't leave much to the imagination. But I want to come back a little bit to the piece you said before about influence, because just because our staff are outside of the office doesn't necessarily mean we no longer have the ability to influence their behaviours and to set expectations. And I think it comes back a lot to the culture of an organisation, but also back to the way that staff are treated and the loyalty that they retain to their organisation, even though they're now no longer in the staff environment or in the office. And you know, our homes are now an extension of the workplace. And helping our staff to understand that what was expected in the workplace is just as important as what is now expected at home. I mean, we wouldn't download, or hopefully people wouldn't download, games onto their corporate laptop in the workplace. So why would they allow their kids to do that in the home environment? And helping them to understand why those types of behaviours are not appropriate. But also helping people who are just trying to, you know, get a laptop out to a consultant to understand the risks of putting a post-it note on top of a laptop and sending that out as well. And that hasn't happened to me once, it's happened to me a number of times. And so much of the onboarding that has happened during the last few years has been remote. And so I think the best intentions are there, but it certainly doesn't set people up from day one with a good understanding of the security culture of a business if your laptop lands with a whole lot of paraphernalia with your new company brand on it, as well as a post-it note that says Welcome123 or whatever the appropriate password is. So I know it's a real challenge for organisations, but for security leaders and awareness managers, I think it's been an even more difficult time to continue that influence. But I think it's still possible.

LE: Yeah, I agree with you on that. I think it's been interesting. You talked about the fact that some people are downloading apps for their kids on their work laptop, and that the culture really needs to be pushed forward in the same way as if you're in the office. I think it has been somewhat difficult to do that, but I've heard a number of stories of companies doing it remotely and being effective at it. One question I did have was, you talked about cyber culture generally, and obviously with working remotely and having staff at home, it can be difficult to distil that culture through to multiple people's homes. What have you seen as, let's say, good culture versus bad culture? What are the main differences you see between the two?

CP: I probably wouldn't call it good culture versus bad culture. It's difficult to, I guess, put a label on it. But a lot of it comes down to maturity, and also to the way that the leaders in the business operate, and the board for that matter. If you have board members who expect exceptions because it's easier for them to use their Gmail account, or easier for them to print reams of board papers that then sit lying around or get stuffed into briefcases, that sort of culture does disseminate through the organisation. And, you know, if you have a CEO that doesn't talk about security on a regular basis, that visibility of cyber also doesn't filter through the organisation. If your security leader or your CIO is one of the only people ever banging the drum about cyber, it doesn't necessarily become part of the culture of the business. It might become part of the culture of IT, but even then, if you're just hearing the same messages within that one environment and it's not permeating out, how can we expect our organisation and our employee community to make change if they're only hearing it from the person who's the most paranoid in the business? You know, that security beacon, or the staff within the security team that people associate security with. So I don't think it's good culture or bad culture, I think it's about having a culture of security, and making sure that gets nurtured. And I talk often about how it's not the most comfortable conversation, and I'm sure back in the 80s it wasn't the most comfortable conversation to talk about hard hats and boots. And I know it's an age-old analogy to talk about safety and cyber, but really, it is a conversation that we want to be having on a regular basis. And I really try and encourage CEOs, when they have town halls, to talk about cybersecurity and talk about incidents that have happened inside the business. They don't have to point out staff and say, you know, it was this person who clicked on the link, or it was that person who divulged thousands of records. That's not the expectation. It's more about normalising the conversation: these events happen, and as a business we don't want to expose our customer data to risk. And so having a mature risk culture and a mature conversation about cybersecurity is really important. Often we see cybersecurity reduced to cartoons and funny, quirky things that I guess are engaging because organisations want their staff to see them and want to watch the videos or be part of it. And I'm on the fence about whether or not that's the right way to approach cybersecurity, with that sort of jovial, jokey, fun element, because it really can be quite serious and detrimental to a business. So, you know, having a culture where we take cyber seriously and make it part of the way we do business, that's how I guess I would describe a good, mature cyber risk culture. I hope that's answered your question.

LE: Yeah. Thanks, Claire. Maybe just to add a couple more points on that. I think where I've seen it work well, and where it's maybe not so good, is what you touched on: the echo chamber of, you know, the IT function or just the security team banging the drum around cyber, and it's not really coming down from the board. One example that I could provide is Equifax. Although they've obviously had a very large breach, the way the business has shifted to a positive cyber culture since the breach has been outstanding. It's definitely worth having a look at, it's a great use case. But I agree with you: it coming down from a board-level perspective, and normalising the conversation, is definitely something that we need to do as a community, but also individually where we can as well.

CP: And I think Equifax are one of the first to come out, and you know, if you Google cybersecurity culture change, or cybersecurity culture off the back of cyber incidents, Equifax will come up time and again. And their CISO has spoken very publicly about how they went about their culture change. Prior to their particular incident, they had a culture of complacency, and they're not my words, they're certainly what's come out in the press. And they didn't necessarily see a need to keep up to date with patching or put appropriate measures in place. And then when they had the incident, the whole thing got turned on its head, and they made millions and millions of dollars of investment. And they made a huge investment in resources and people as well. But at the board level, they put in place these mechanisms that put bonuses on the line. And I think that's an interesting approach. Because if we scare people into thinking that if they don't comply, they won't get their bonuses, or they won't, I guess, meet the expectations of their shareholders, I worry about that a little bit. Because does that change behaviour, or does that just scare people into doing things differently? And I suppose safety campaigns, and any campaign to stop people doing unhealthy things as well, have to have an element of scare tactics. But I'd love to see people get motivated to change, and then maybe that change will stay with them. I mean, are you seeing change that is long lasting through the work that you're doing as well? Because I'd love to know how people are doing this without fear, without putting fear into the conversation.

LE: Yeah, it's a difficult one. I think the ones that I've seen do it well, as you articulated, Claire, is through the use of stories and incidents, right. Actually bringing it live and to the forefront. So talking about things that have happened within a company or to them personally. I know there was an article published a little while ago now about one of the ANZ executives, and there's a publicly facing note about it. She fell victim to a cyber attack and she talked publicly about it. I think using those stories and reinforcing it from the C-level down is super important. And that's typically where the longer lasting change comes from, this continual conversation around it and normalising it, as you touched on. The other thing I would say is that, from what I've found, longer lasting behavioural change typically happens through different types of habits and behaviours, and changing the way people actually interact with different elements within their business from a security standpoint. Obviously here at LastPass, we're talking about the use of password managers and managing passwords and credentials better. Because, you know, for the most part, that is the reason why most companies are breached in this country. But other aspects of change really come through stories, as well as other elements like that as well.

CP: And when you see people at that more senior level taking an opportunity to talk about incidents that they have maybe been exposed to, what impact do you think that has, I guess, on the employee communities of that business, but also on other businesses?

LE: From my perspective, with the ANZ one, you know, the commentary around it was that anyone can fall victim to it. I think in that incident one of the accounts was compromised, I think it was a LinkedIn account, something like that. But it just dispels the notion, because I think a lot of people just assume that they'll be okay, and if they lose their data, they're fine. But the reality is, there are ripple effects that relate to that. And with board members or C-levels talking about this type of incident, for them personally or through the business, it gets back to normalising the conversation and gets people talking about it, which I think is important, right. I think it's important to be able to discuss this type of stuff and why it matters. You know, the government here is obviously pushing hard on data protection, and it's a global thing as well. And we just have to be conscious of it, not just as security professionals; we're all interacting with digital aspects of our lives now more so than we were 10 years ago. And it's super important to make sure the data is protected.

CP: And do you think, I mean, when we talk about data protection, we can't just rely on our employees. But do you think employees are relying really heavily on tech? Because a lot of companies will talk to employees about this, and they will say, well, yes, I clicked on the link, but I was expecting the right controls to be in place and the right tech to be in place. And therefore it shouldn't matter if I click on a link, because that email should have been contained way before it even got to my inbox. Or, you know, it shouldn't matter if my password is weak, because I've got multi-factor authentication. I mean, this lack of ownership, how do you see this problem changing in order to have that true behaviour change? If people just think, well, there's so much tech we've got now, surely the tech is taking care of it?

LE: Yeah, I mean, I think it's interesting. Accountability comes to mind. It's a bit like, you know, parking your car at the supermarket, locking the doors, and then suing the car manufacturer when it gets broken into. The accountability still sits with the individual. At the end of the day, the data that you're trying to secure, whether it's your personal data or it's corporate data, it's valuable. And if it's leaked, as we've seen multiple times before, that information can be used for all sorts of different things, not just financial impact, but ransomware attacks and fraud as well. And so I think it really just comes back to, you know, if you're putting information into a system, an application, a website, whatever it might be, you are the one putting information in there, and you need to make sure that it's secure. It's not that dissimilar to putting locks on the front door, right? You want to make sure that the passwords are secure to gain access into that application. It's no different. So I think it really does come back to the accountability of the individual and understanding that their habits and behaviours, if they're not good, could potentially put both the company and them personally at risk.

CP: I guess from a personal perspective, do you think that if somebody has an incident, or is involved in an incident or even a near miss, that's enough to change their behaviour? I'm undecided on this, because I've worked in organisations before where we've seen, time and time again, the same people ending up being served up training because they've clicked on the phishing exercise links, or, you know, they have been the ones that maybe haven't put protections in place on a number of occasions. We've seen that occur where the same people kind of get hit and they're not changing their behaviours. Do you think if people get hit, then it personally will change their behaviours?

LE: I'm probably a little bit on the fence with it too, if I'm honest. I think, as we sort of talked about before, in a lot of cases when we're talking about influence and changing people's behaviours and habits, we really need to meet them where they are at the moment. Like, if people are continuing to click on links, there has to be a reason as to why they're doing it. Carelessness, whatever it might be. And are there ways that we can adjust that behaviour and the habit by putting in some level of circuit breaker to stop that? I think doing the same thing over and over and over again, providing training and hoping for a different outcome, is kind of a recipe for disaster really. And we really need to think about, well, what are the steps that actually go into that habit and that behaviour, and how do we put a circuit breaker in between to stop it from happening in the first place? Yes, not everything is going to get protected by security tools, right, stuff is going to slip through the cracks. But how do we prevent an individual from, you know, clicking on a link, or typing a password into a website? I think we really need to think about that as well.

CP: So I was just going to say, you pointed out earlier that we all know the benefits of eating healthily and drinking water and exercising, and yet not everybody does that. And even when they have health scares, they still go on to, you know, eat foods that aren't necessarily good for their waistline. It just feels like maybe we have that information, but what is it, deep down, that's going to change people's behaviour?

LE: It's hard, because I think every person is an individual, right. So, like, certain people will exercise and certain people won't. But what is the fundamental underlying reason for them doing that? I think if you can get to that, that's the way you change and get that long lasting change. There has to be something at the end of the day; everyone's motivated by something. It could be career, reputation, whatever it might be. I think if you get to the understanding of what that is, that's when you're able to actually drive longer term change in individuals to get a better outcome, I think, personally. But it is hard. I'm not going to say it's easy, but that's where I've seen it work quite well.

CP: It does feel like it needs to be values based, and having organisations where this is part of the way that their values align is maybe something that we should be tapping into. Because I guess there's that real gut feeling in people that they want to do the right thing and that they want to change. Or that there's an expectation when they arrive in the business that these are our values, and this is how we want to behave. You wonder whether, if the employee community have an alignment with that, they'll behave differently. I don't know. It's a really tough one. But maybe it's got to get back to that ethical and moral and values based conversation.

LE: Yeah, I think that's interesting. I mean, thinking through that, you talk to boards all the time. It would be interesting to know, from your experience, how have boards communicated that in the past to get that longer term change? Have you seen that work well from a board perspective?

CP: Yeah, it's a very topical conversation at the moment, because a lot of boards are really starting to look at cybersecurity through a different lens. Because of new regulation, and because of old regulation, and directors' duties now really being under the microscope: should cyber be separated out as a director's duty, one of their fiduciary duties, or should it just sit under the umbrella of everything else that a board is expected to do? And cyber is, I guess we like to say, like every other risk for boards: 70% of it they should already know, but there's still that 30% where they really need to understand the subject matter of cyber and the context within which it pertains to their business. But no matter what business you work in, the fundamentals remain the same. And that is whether you have a culture of making risk-based investments and risk-based decisions, as opposed to a culture of compliance, or a culture of complacency. All of that gets set at the board level. And certainly at the board level, they're making decisions around mergers and acquisitions, where they might buy companies and where you'd like to see the cyber due diligence done as well. So directors can always make decisions at that strategic performance level of an organisation that filter down around cybersecurity. It's a difficult one at the moment, because not all boards are doing it. But there are certainly some boards out there who are known to be putting cybersecurity at the front of every conversation, as opposed to just having it as number one on the risk register but not necessarily making the right investments. And individually, I have different conversations with directors when I've got them one on one than when I get them in a boardroom and they're all sitting around the table amongst their peers. So I think it's very topical at the moment, and something that directors need to be looking at on an ongoing basis.

LE: With the change in regulation for boards and so forth, do you think that's a little bit like, you know, the carrot and the stick? Like they've only really acted on it because, you know, their potential fiduciary duties are changing. Do you think that's a good outcome for the general community? I mean, it's hard to say. But, you know, Equifax obviously is docking people's bonuses if they don't comply. Do you see a similar thing with that change?

CP: It does worry me, only from the perspective that I don't want the cyber conversation to become one of compliance. I don't want it to be about: if we do these five things, we can sign at the end of the financial year to say that, you know, all of these ducks are in a row. We want it to be a security and a risk-based conversation, not a compliance conversation because there's a regulator saying here are the five things that you need to do. So I think there's always room for the regulator to play their part, and you know, I certainly don't have any qualms about that. I'm more concerned about making sure that we're having the right conversation, not, as you said, having the conversation because all of a sudden there could be consequences. And that's the challenge with cyber: unless you have an incident, it's quite virtual, and it's hard to show the consequences. It's not like safety, where you can see the number of days since the last incident, or since somebody was injured, or somebody unfortunately lost their life. It's quite different to that. But I think over time, cybersecurity incidents are starting to impact the community more and more. So at the board level, it shouldn't be about what the regulator is saying, but about what the impact on the community would be if we were to have a cyber incident. We've seen that in the last 12 months: people couldn't buy petrol; meat supply was cut off because we couldn't check whether or not the quality of that meat was up to standard; and people turned up to pack boxes and there were no jobs for them that day because of cyber incidents. These are starting to filter into the community. That's the conversation we should be having at the board level. How are our customers impacted if we can't operate?

LE: Yeah, I think that's a good point. You know, it comes back to corporate responsibility, right. I think, you know, the number of supply chain attacks we've seen over the last couple of years, it's just ripple effects, right. So if your company is unfortunate and falls victim to a cyber attack, what other impacts could that potentially have? We saw that happen with, obviously, the SolarWinds attack, and you mentioned Colonial Pipeline as well. And that's the ripple effect and the corporate responsibility that we should really be talking about.

CP: As mentioned, we couldn't pack our webinar into one episode of The Security Collective. So this marks the end of part one, and please look out for part two of my chat with Lloyd Evans of LastPass.
