The Security Collective

105. Developing a secure engineering mindset with Stephen Kennedy

Claire is joined by Stephen Kennedy as they cover the balance engineers strike between security and functionality. They talk about secure coding expectations, and also the role compliance plays in software development. Stephen shares his experience moving from being an engineer into C-level leadership and the security lens through which he then had to look.

Stephen's background is as a software engineer, but he's since transitioned into CTO and CIO roles. He's worked across Australia, New Zealand, and the United Kingdom for organisations ranging from start-ups to large-scale enterprises. His most recent role has involved increased security scrutiny in working with large multi-billion-dollar partners (e.g. shipping lines) with compliance mandates, and as such he's had to evolve his career to take on more of a security, privacy, and compliance focus.

Links:

Stephen LinkedIn

Stephen Twitter

The Security Collective podcast is proudly brought to you in partnership with LastPass, the leading password manager.


Transcript

CP: Hello, I'm Claire Pales, and welcome to The Security Collective podcast. Today's guest is Stephen Kennedy. Stephen is a CIO these days after spending the early part of his career in software engineering. Our conversation today covered the balance engineers strike between security and functionality. We talked about secure coding expectations, and also the role compliance plays in software development. There's some really great learnings in this week's episode as Stephen shares his experience moving from being an engineer into C-level leadership and the security lens through which he then had to look. So please welcome Stephen Kennedy to The Security Collective. So Stephen, it's great to have you joining us on The Security Collective podcast today.

SK: Thanks for having me.

CP: So you're a CIO now, but you're a software engineer by trade. And I'm really keen to talk today a little bit about software engineering from a security sense. I'm interested to understand what the vibe is for engineers when it comes to security, because is there a desire from leaders like CIOs for engineers to ensure their code is secure? Or is there much more of a focus on producing code that meets customer and business requirements as the priority?

SK: So I think the first thing is that security is what's considered a non-functional requirement. So it's like other things like performance, reliability, robustness, those sorts of things, which engineers need to think about when they're producing code. And a lot of it tends to be an implicit requirement from businesses, where they sort of just expect it to work. So the level of security I find tends to depend on the organisation, its values and other aspects, like the problem domain they're working in. Obviously, something like banking has got a lot more of a compliance need around it, so security just naturally becomes a more pressing concern. And the size of the business: larger businesses tend to have more of a security focus, because the severity of the impact, if there is a security issue, tends to be higher for a larger business. In some businesses, you know, security can be a bit of a sausage factory. Again, the businesses don't necessarily care how it's there, they just want to make sure it's in place. So therefore, it's really up to the leaders in the business, so the CIO or the CTO and, if they have one, the CISO, to come up with the initiatives and sell those initiatives to the business. In terms of who's on the hook for it, again, it tends to be those C-level executives, so they're the ones that really need to drive that. And partially as well, it also depends on the CIO and CTO in terms of, you know, their own disposition towards security. Some can be quite lax and others can be quite stringent around what sort of security they want to have in place. So coming back to the business requirements, I think it just depends on the problem being solved, right. So obviously, security can't get in the way of meeting product deliverables, in terms of producing the outcomes the business needs.
But just like, you know, if you've got to produce a website, then you need to make sure that the software performs well, as well as, you know, being secure.

CP: Yeah, I mean, there definitely must be, and I've seen this before myself, a balance for engineers, when you think about what security will often mean. There are these trade-offs, often for functionality; sometimes it's perceived as a slower development process because security is involved. That balance, as you say, would be hard for the engineers to be making the call on themselves; it much more has to be a cultural or a leadership decision as to where that balance lies, to make sure the customer is getting what they need, but also, from a security consciousness perspective, that the organisation is doing the right thing too.

SK: Yeah, I mean, it can be a trade-off, just like the other non-functional requirements. So there's things where it impacts stuff like code complexity; you can do quite simple code changes to make the software a lot more secure, but that can also in turn result in what's called regression issues. So software that was previously working now breaks, because you've introduced something where you're putting maybe restrictions on the browser and what it's allowed to do, and that in turn breaks things. And computer software isn't just what the software engineers themselves are building; it tends to be built on the shoulders of giants, right? So you might introduce a security measure that seems to make sense for your product, but a dependency that it's got in place might be broken because of, you know, a Content Security Policy, for instance, that you've put in place which blocks content from a site which this other component depends on. That's where a lot of the complexity can come from, in terms of you don't necessarily know what you're going to break at times when you are introducing additional security measures. There can also be performance issues. You've got to go check permission from another server, for instance, around whether or not you can do something, so that in turn introduces additional complexity. I also think there's a bit of a knowledge gap that happens at times between a software engineer delivering what they need to deliver and maybe a CISO going, well, from a compliance and from a policy and from a best practice standpoint, you need to do this. And what tends to happen at times where there is this gap is that changes take a lot longer to implement than they should, because of a gap or a misunderstanding.
Or, you know, some of the scariest security issues I've seen are where the engineers have either sort of ignored the requirement because it's too hard or complex, or they've done a hack or a workaround because they've not really understood the fundamentals of things like OAuth or, you know, other technologies, which in turn creates a potential attack vector for someone malicious coming in, because they've created a security hole basically. So, you know, it's really important, but I think, if you are in a security-conscious organisation, you've got training in place and you've got subject matter experts who are able to bridge that gap, just like, to an extent, with business analysts, right. They're there to bridge the gap between the business, in terms of requirements, and the software engineers implementing it. I think there's also a bit of an issue at the moment between, you know, security experts saying, from a policy perspective, this is what we really should be doing and delivering, and a software engineer who may not come from a security background. And you often see that a lot also with penetration testing results; you might see some reports that can be baffling to people. Why is this a security issue? And it's because there's such a big knowledge gap.
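
Stephen's Content Security Policy example is worth making concrete. The sketch below (all directive names and origins are illustrative, not taken from the conversation) shows how a policy written with only first-party assets in mind can silently break a third-party component that loads content from another origin:

```python
# Sketch of how a Content-Security-Policy header can silently break a
# third-party dependency. Directives and origins here are illustrative.

def build_csp(directives):
    """Serialise a directive map into a Content-Security-Policy header value."""
    return "; ".join(f"{name} {' '.join(sources)}"
                     for name, sources in directives.items())

def origin_allowed(directives, directive, origin):
    """Crude check: is an origin listed for a directive (or its default-src fallback)?"""
    sources = directives.get(directive, directives.get("default-src", []))
    return origin in sources or "*" in sources

# A policy written with only first-party assets in mind...
policy = {
    "default-src": ["'self'"],
    "script-src": ["'self'", "https://cdn.example.com"],
}

header = build_csp(policy)
# header == "default-src 'self'; script-src 'self' https://cdn.example.com"

# ...but a mapping widget that loads tiles from another origin now fails,
# because img-src falls back to default-src, which only allows 'self':
origin_allowed(policy, "img-src", "https://tiles.example.org")  # False
```

The fix is usually not to loosen `default-src` but to add the dependency's origin to the specific directive it needs, which is exactly the kind of change that is hard to predict without testing.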

CP: Yeah, I mean, I definitely see, from my clients and my perspective, that there's a lot of reviews that are still managed by third parties, particularly pen tests, because you want an independent review. But are you seeing, in the engineering community, an uplift in secure coding skills? Is there an interest there for engineers to start to take this on as a skill set? And maybe it's a differentiator for their CV if they've had exposure to secure coding. Are you seeing that as a desire? Or is it really left to the CISO and those independent third parties to make sure the security is in place?

SK: Again, I think a lot of it comes down to organisational culture. You know, I've been in IT for 20-plus years now, and I certainly think over the last five years there's definitely been a significant uplift in software engineers' appreciation of, and appetite for, security. Certainly, when I started 20 years ago, a data breach or an issue like that just didn't get the media attention that it gets now. The security community is a lot more mature these days as well, in terms of also being a lot more pragmatic about what should be in place. Coming back to organisational culture, I think agile, in terms of software methodology, has done a lot for security. DevOps has been around for a while, but you're now starting to see the next progression of that, which is DevSecOps, and I think software engineers are, maybe not necessarily leading the charge, but working quite closely with operations and, you know, security people to produce these improvements. So things like automatically scanning for vulnerabilities when you do a deployment. And, as I said before, computer software tends to be built on packages or images or other things from other vendors or other parties, or open source libraries that people have produced. So scanning those for malware or viruses, checking they're verified, that they are who they say they are, those sorts of things. I think that's really starting to see an evolution. And I think, particularly for smaller businesses, that's where it's really valuable, right? Because if you can automate it, then the cost to the business to implement good, high-quality secure software is lower.
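
The automated scanning step Stephen describes can be sketched very simply. In practice a pipeline would call a real tool such as pip-audit or OWASP Dependency-Check against a live advisory database; the package names and advisory IDs below are invented purely for illustration:

```python
# Minimal sketch of the dependency-scanning step in a deployment pipeline:
# flag any pinned package that appears in a known-vulnerable list.
# Real pipelines use tools like pip-audit or OWASP Dependency-Check against
# a live advisory database; the packages and IDs here are illustrative.

KNOWN_VULNERABLE = {
    ("leftpadder", "1.0.0"): "EXAMPLE-ADVISORY-0001",
    ("jsonmagic", "2.3.1"): "EXAMPLE-ADVISORY-0002",
}

def scan_requirements(lines):
    """Parse 'name==version' lines and return findings for vulnerable pins."""
    findings = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue  # skip blanks, comments, and unpinned entries
        name, version = line.split("==", 1)
        advisory = KNOWN_VULNERABLE.get((name.strip(), version.strip()))
        if advisory:
            findings.append((name.strip(), version.strip(), advisory))
    return findings

requirements = ["requests==2.31.0", "jsonmagic==2.3.1", "# dev only", ""]
findings = scan_requirements(requirements)
# findings -> [("jsonmagic", "2.3.1", "EXAMPLE-ADVISORY-0002")]
```

A pipeline would fail the build whenever `findings` is non-empty, which is what makes the check cheap enough for a small business to run on every deployment.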

CP: Your point about smaller businesses is really important, because they don't have the resources to have big teams or have penetration tests done all the time. Are there basics, in your mind, for any organisation, whether they're developing the code themselves or they're outsourcing it, around what their expectations should be for secure code?

SK: Yeah, I mean, I think the super basics are obviously making sure that the people working on the code have a good understanding of the OWASP Top 10, the top ten web application vulnerabilities. So making sure your engineers, or at least, depending again on the org culture, the people in charge of quality and doing the reviews and things like that, have a good understanding of not only what those vulnerabilities are, but how you mitigate or remediate those vulnerabilities as well. I think, again super basic, making sure that your data is encrypted at rest and in transit is really important. No homebrew sort of cryptography or auth solutions, which are still rampant, I think, in terms of code that I've looked at as well. So using off-the-shelf products or using well-known, well-trusted open source libraries is really important. Automated deployments are something that you really should be looking to do. I think security misconfigurations are a really big issue, and so if you can automate those as well, that just alleviates that problem and in turn also forms part of the documentation, in terms of how things are set up and how things are secured. If you're outsourcing, I think you still need to have trusted onshore people to review and look at things. And also, even if they're developing it offshore, it should still be in repositories and things like that which you control and own. Ultimately, customers and partners aren't really going to care if a breach was a result of you or your outsourcer. And finally, I think it's really important to understand where your personal data is being stored, how it's protected, and for what purpose it's being stored.
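
To make one of those OWASP Top 10 items concrete: injection is the classic case where a small coding habit is the whole mitigation. This sketch uses Python's standard sqlite3 module to contrast a string-built query with a parameterised one (the table and data are invented for the example):

```python
# Illustrates the mitigation for the classic OWASP injection vulnerability:
# never build SQL by string concatenation; bind user input as parameters.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'staff')")

user_input = "alice' OR '1'='1"  # hostile input trying to dump every row

# Vulnerable pattern: the payload rewrites the query's logic.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'").fetchall()

# Safe pattern: the driver treats the whole payload as a literal string.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # [('alice',), ('bob',)] - the injection succeeded
print(safe)    # [] - no user is literally named "alice' OR '1'='1"
```

The same principle applies with any database driver: the query text and the user-supplied values travel separately, so hostile input can never rewrite the query's logic.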

CP: We've talked at length, this season, about third party risk, and about making sure that your code and your data and your IP and your business information is under your own control when it comes to security and that you as an organisation have the responsibility no matter who your third or fourth parties are. And, you know, I think when it comes to code, it couldn't be more important, especially when it is the one thing that's often done offshore and is often something that's outsourced, especially by smaller organisations. Or if there's a very specific need by an organisation, then they might look to a third party who has a specific set of skills. It's not as simple as just outsourcing the development, there are so many other moving parts.

SK: Yep, definitely. You've got all sorts of compliance things as well, particularly when you're working with larger parties; they're also interested in, you know, who's looking at the code or the data. Non-production data is also important. People tend to think a lot about the security of just what's in production, but I still see a lot of organisations where they will store real personal data and things like that in non-production environments. So how is that protected? Ideally, you know, scrambled, anonymised, whatever. Those are things that also need to be thought of.
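
The scrambling Stephen mentions can be as simple as pseudonymising personal fields before data is copied into a non-production environment. The sketch below is a minimal illustration, not a full solution: the field names are invented, and a salted hash keeps values consistent across tables (so joins still work) without exposing the real identifiers. In practice the salt would be managed as a secret and rotated per refresh.

```python
# Sketch of pseudonymising personal data before it reaches non-production.
# Field names and the salt handling are illustrative only.
import hashlib

SALT = b"rotate-me-per-refresh"  # illustrative; store as a managed secret

def pseudonymise(value):
    """Deterministically replace a personal value with an opaque token."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return f"user-{digest[:12]}"

def scrub_record(record, personal_fields=("email", "full_name", "phone")):
    """Return a copy of the record with personal fields pseudonymised."""
    return {k: pseudonymise(v) if k in personal_fields else v
            for k, v in record.items()}

prod_row = {"id": 42, "email": "jane@example.com",
            "full_name": "Jane Citizen", "plan": "pro"}
nonprod_row = scrub_record(prod_row)
# nonprod_row keeps id and plan, but email and full_name become stable tokens
```

Because the mapping is deterministic, the same person hashes to the same token everywhere, so test environments keep realistic relational structure without holding real personal data.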

CP: Just picking up on your point about compliance. What role do you see compliance programmes playing in the application security space, and on the engineering side of things? The offshore piece is a whole other conversation when you think about the different laws in different countries. But I know you've had some experience in relation to things like ISO 27001. What's your opinion around compliance and engineering and, you know, where they meet?

SK: I think, again, it sort of goes back to organisational culture, and it really comes down to how you want to implement those policies and also make sure that you're protected. The old-school way, it's all through documentation and processes and those sorts of things, right. And so the verification side, actually making sure that these things are being followed, is difficult. I think it leads back again to, you know, I see DevOps and DevSecOps playing a large part in that sort of stuff. So yes, you need to document, because parties need to be able to see the processes that are in place. But equally, I think what you need to be able to do is have automated systems in place which are verifying and checking those things. So you've got tools which can potentially help you to generate your ISO documents and things like that, but what you also want to be doing is automating as much as possible. So rather than going, here is the step-by-step process of how we deploy something, it's the build pipeline, the serverless pipeline; it's all in the code. Equally with role-based access control: rather than documenting, as you're supposed to from a compliance standpoint, who's got access, especially privileged admin access to things, well, it's all in the code, and it's verifiable, and it's executed every single time. So in terms of where I see the software engineering side of things fall into place here, it's really around, how do we automate all this sort of stuff so that it meets the compliance requirements, in terms of actually also supplying that documentation. Then also, fundamentally, there are just general good software practices which I think also naturally fit into ISO. So making sure you've got things documented, like, well, this is the flow of something, from an idea to how it gets into production.
So good practices like doing code reviews, talking through training expectations, making sure people have got training at an admin or org level, you know, phishing training and those sorts of things. I think the development environment is also an area where there's an impact. So how do you have data practices around your production and non-production data, and also how is your development environment set up? You know, we briefly mentioned offshore before. You've got a whole lot of valuable IP, which is your code. So if you've got offshore developers, particularly if they're not through your company but through a third-party vendor, how do you reduce the risk around, you know, losing IP? Do you set environments up in the cloud? That's a way you can also potentially reduce the risk around IP and data loss and those sorts of things. So I think there's a lot there in terms of how you comply with these sorts of things, and then it's up to the org to go, what's the best way of moving forward?
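
Stephen's "it's all in the code" point about role-based access control can be sketched in a few lines. Here the role definitions double as the compliance record, and the same data is evaluated on every check; the roles, users, and permission names are all invented for illustration:

```python
# Sketch of role-based access control "as code": the role definitions ARE
# the auditable compliance record, and they are checked on every request.
# Roles, users, and permission names are illustrative.

ROLES = {
    "engineer": {"repo:read", "repo:write", "deploy:staging"},
    "sre":      {"repo:read", "deploy:staging", "deploy:production"},
    "auditor":  {"repo:read", "audit:read"},
}

ASSIGNMENTS = {"alice": {"engineer"}, "bob": {"sre"}, "carol": {"auditor"}}

def permissions_for(user):
    """Union of permissions across all of a user's roles (empty if unknown)."""
    return set().union(*(ROLES[r] for r in ASSIGNMENTS.get(user, set())))

def can(user, permission):
    """Single enforcement point: evaluated the same way on every call."""
    return permission in permissions_for(user)

can("alice", "deploy:production")  # False - engineers cannot deploy to prod
can("bob", "deploy:production")    # True  - SREs can
```

An auditor asking "who has privileged access?" can be answered by reading (or querying) these definitions directly, rather than trusting a document that may have drifted from reality.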

CP: I want to finish up by asking you a bit about your transition from engineering to being a CIO, because it's not a common path. You don't see as many engineers become CIOs as you might see strategy and architecture leads or tech ops leads make that sort of step into a leadership role. So for you, coming out of engineering and into that leadership position, are there particular things from a security perspective that surprised you? You know, what might other CIOs need to know or want to know about that transition from the engineering side through to leadership, where you go from one area of responsibility, I suppose, through to spinning many, many plates, and then putting that security layer over the top?

SK: Yeah, the spinning many, many plates, that's it. The first thing I would say to someone looking to make that move is to keep it as simple as possible. Now, obviously, you've got to meet the needs of the organisation that you're working for, but you want to limit your attack surfaces. You want to make sure that not only you've got sufficient knowledge to protect the resources that you put in place, but also your team, because you can only be spread so thin. So you may think, oh, I've got adequate knowledge in terms of how to prevent and manage things at an implementation level; the rest of your team may not. So, generally speaking, I think the cloud is a really good place to be. There are vendors like Microsoft with Azure, or AWS, or Google, who have invested a lot of time, money and energy to try and make these platforms as secure as possible. Now, that doesn't mean that you can just go, okay, I'm on Azure or whatever, therefore all my security problems go away because Microsoft is managing it all for me. But if you look at things like using database as a service, or platform as a service, where you're not necessarily managing the virtual machines themselves, your code is operating at a higher level where these vendors are managing the patches for you. They're naturally constrained environments in terms of access to the operating system, or certain ports, or whatever. Also, from a CIO perspective, you're reducing your burden from an operational standpoint around having to manage and patch and all this sort of stuff. So it keeps things as simple as possible. On top of that, where it makes commercial sense, you also want to minimise the number of vendors that you need to deal with.
Again, it's fewer potential avenues of attack, fewer data privacy policies and things like that you need to be across and aware of, because these people are managing and have access to your data; it just makes things as simple as possible. Automate where pragmatically possible. For smaller businesses in particular, you obviously can't just build the Taj Mahal and automate everything, so you're going to need to make sure that you've got a pragmatic blend of things. But automation will definitely save you a lot of time and energy. I think single sign-on is also something I found really useful; it's something we drive, and we pick the vendors we work with based on how easy and effective it is to use single sign-on with them. Employees do leave businesses, so it's quite nice, typically, when someone leaves on a Friday night, if you can just disable their single sign-on and then worry about cleaning things up on the Monday, for instance, when you don't have automated de-provisioning of access in place. The other thing, certainly when I moved into the CIO role, and it's a little bit of a Captain Obvious sort of moment, is that you really need to take more of a holistic view of everything. When you're more of a CTO, or you're a software engineer, you tend to just focus on the products, particularly if it's a digital product business that you're working in. You really need to take a step back and go, well, okay, now I need to look at the entire organisation from an information systems standpoint, but also, obviously, from a security standpoint. So how does data flow? How does identity flow, in terms of the source of truth? Who's got access to what? If I disable this identity, or I make changes to these security groups, what does that impact? So you've got to take more of a holistic view around things.
Learnings for me, I guess: I'd not had to deal a lot with DNS security or email security, around things like DMARC, DKIM, SPF and all these other sorts of ways of protecting and making sure that the emails your organisation is sending out are verified and trusted. So those sorts of things are things you're definitely going to need to wrap your head around. The last couple of titbits, I guess: even if you don't have any customers or partners based in the EU, you really should get your head around GDPR. Even if you're not dealing with the EU, a lot of countries are building data privacy requirements that are very similar to GDPR, and it's sort of the gold standard. So I think it's really important that you understand what a data processor is, as well as what a data controller is, and what your obligations really are under GDPR if you were to move into that space. I think that's really important in terms of setting up that baseline around how you should consider the protection of personal data. And finally, you probably do need to get a little bit familiar with reading legal documents, particularly privacy policies, and understanding how other parties are using, managing and storing the data that you either control or are getting them to process.
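
For anyone wrapping their head around DMARC for the first time: the policy is simply a DNS TXT record published at `_dmarc.` under your domain. The sketch below parses an illustrative record into its tags; a real check would fetch the record over DNS (for example with the dnspython library) rather than hard-coding it:

```python
# Sketch of what a DMARC policy record looks like and how to read it.
# In production you would fetch the TXT record at _dmarc.<domain> via DNS;
# the record string below is illustrative.

def parse_dmarc(record):
    """Parse a DMARC TXT record like 'v=DMARC1; p=reject; ...' into a dict."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com; pct=100"
policy = parse_dmarc(record)

policy["p"]    # 'reject' - receivers should reject mail that fails DMARC
policy["pct"]  # '100'    - apply the policy to all messages
```

DMARC only works on top of SPF and DKIM, which is why Stephen names all three: SPF and DKIM authenticate the mail, and DMARC tells receivers what to do when that authentication fails.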

CP: And do you feel like your technical background was a help or a hindrance in moving into a leadership role? This could be a CIO or CSO or any C-level position; there are many technical leaders who don't necessarily make that transition smoothly, and are very detail focused and love the technical side of things. Do you think it worked well for you, that you had that technical knowledge base? Or do you still feel kind of drawn back into the detail?

SK: I do think it definitely helps having a technical background, particularly for me working in small business; lately I've been working mainly for start-ups. So it was definitely easier when talking to, you know, the cybersecurity teams of big multibillion-dollar shipping lines, to have that deep level of detail on the phone calls and things like that, rather than being high level. I think, for me, the biggest benefit in terms of being able to do that shift was having a bit more of a consulting background. Again, it was more around the software engineering space, but I was fortunate to work for a business where, through their lifetime, they went through a maturing from, you know, sort of okay security practices to where they actually brought in people who came from very security-conscious backgrounds. And I actually got to see the evolution of how they took that business forward, pragmatically. I think the other bit where the consulting piece really played out was, essentially, I got to see other people's mistakes, and they were paying for those mistakes, and I was sort of watching it.

CP: Never waste a crisis, right!

SK: Exactly, exactly. So, you know, I got to see a lot of security penetration test reports. So when it was our turn to do penetration testing, I kind of already knew what the penetration testers were looking for. And whilst you're not trying to secure the platform just for the sake of getting an A grade on a penetration test, you know what the common sorts of vulnerabilities are, so you can already prepare the business for them. You know the sort of common issues that other businesses face, right. So I think, for any C-level sort of role, it's beneficial to spend at least some time in some sort of consulting capacity, so you get to see a wide range of different businesses, rather than having only worked for one or two businesses and being sort of in that little bubble. I think it's really important and valuable to go out and see all the different opportunities and problems out there.

CP: I think that's great advice. And, you know, being a consultant has its pros and cons, but it certainly gives you the other side of the coin. And I guess you get to give advice that doesn't always get implemented, but you do get to see a variety of organisations and some of the challenges that they're faced with. So it doesn't surprise me that you were able to leverage that skill set, as well as your holistic background, I suppose, but your technical skills as well. And as a leader, you know, I guess you're drawing on those every day.

SK: Yeah, definitely. And I think also, as a consultant, generally they only bring you in if there is a problem, right? So it's never dull. You get to see them, in air quotes, at their worst a lot of the time. Generally speaking, they don't tend to bring in consultants when everything's unicorns, rainbows and sunshine, right? So it's one of those things where you get an eye-opener in terms of all the different problems that organisations face. And for me, my six or seven years of experience across different roles as a consultant was definitely something that was very valuable in terms of my transition.

CP: Stephen, thanks so much for your time today. I've really enjoyed the chat. And I think it's been a bit different to some of the other guests we've had on because we've really started to talk about some different areas of security leadership that I hope will evolve even further. I really want to see the engineering space really open up more to security. And so thanks for your time and for bringing the software engineering community into The Security Collective today.

SK: No worries, thanks very much Claire.