Why Does Everything Get Hacked?
Originally aired on April 14, 2022 @ 12:00 AM - 12:30 AM EDT
Best of: Internet Summit 2018
- Karen Renaud - Chair of Cybersecurity, Abertay University
- John Moor - Managing Director, IoT Security Foundation
- Moderator: Chris Merritt - Chief Revenue Officer, Cloudflare
English
Internet Summit
Security
Transcript (Beta)
Alright, so I don't have any props, anybody drawing anything in the audience.
If you want to draw something and send it up on a little paper airplane, that'd be fine.
So the next panel is all about why does everything get hacked. And to kick us off, we have two great panelists.
Karen Renaud is the Chair of Cybersecurity at Abertay University.
And John Moor is the MD of the IoT Security Foundation.
And I'll let them say a little bit more about their background. So Karen, why don't you start us off.
Hi, I started off as a hardcore computer scientist, a software engineer.
And then during the late 1990s, I saw the cyber security field starting to emerge and decided this was far more interesting than software development.
And so I've been looking at the human side of security for almost 20 years now, trying to figure out how we can help humans to interact with security, how we can help them to behave more securely, and what are the barriers that get in the way, and trying to figure out how we can do security with acknowledging the fact that the human is an integral part of the social technical system that we develop and that they have to use.
John, you want to say a little about yourself? Yeah, sure, thank you.
First thing I'll say, thank you to Cloudflare for inviting me along to talk about a subject that I've become quite passionate about.
So the first thing I would say is that I'm John Moor.
I'm an expired embedded systems engineer.
What does that mean to be expired? What it means is that I no longer practice.
So back in the 80s, I learned my craft. I practiced during the 90s. I became a founder of a semiconductor company based in Bristol.
We made graphics accelerators, graphics products.
In fact, we grew out of a company that was the first company in the U.K.
to go public on the back of a VR story. I guess that's VR 1.0 because it's come back again.
But for about the last dozen years or so, I've been working at a level above the company to look at industry-level problems.
So my background really has been about innovation, and it's only been the last three years that I've been looking at, I guess, the underside of innovation, the darker belly of security.
And I think it's fair to say that it's been, for somebody who just sees good in technology, it's made me reappraise technology in ways that I just didn't think about before.
And quite often when I do talks on security, I often start by talking about, for me, what is the epiphany of the obvious.
And I talk about that because it's obvious to me now.
It's obviously obvious to me now because I've thought about it, and before I didn't think about it.
So my opening gambit to get the discussion going is, I submit to you, to the audience here, that IoT security is a wicked challenge.
It's a wicked challenge of our time. It's multifaceted, and it's significantly more than just technology alone.
So, just to jump off on that, you referenced IoT. What does IoT stand for, and are there variations on it?
Okay, yes, I know where you're going with this one. So Internet of Things, I think hopefully the audience knows what the Internet of Things is.
There are various definitions, some of them are quite clunky, but for me, very simply, the Internet of Things is the next wave of the Internet.
But I was saying that, again, when I talk about this, and just to underline what I was saying about thinking about things differently, IoT could be the Internet of Treats.
That's what we all want from the Internet of Things.
But also, if you just take that small letter H and you insert it into treats, you have the Internet of Threats.
And I think that's something that we need to think much more carefully about as we move forward as an industry.
So let's park that for a moment. Karen, we talked about, let's start with WannaCry.
So everybody has a pretty good surface-level insight, but walk us through the chronology.
Why did WannaCry happen? And then let's talk thereafter about what we could have learned from it.
Okay, so WannaCry is a ransomware attack.
They used a piece of software that the NSA developed and then lost, and it was sold on the dark web and the hackers got hold of it.
So they used it to encrypt victims' files.
It hit this country particularly hard. And as this thing was happening, before that really smart hacker found the kill switch and activated it, the first response was that somebody clicked on a phishing message.
Then within a day or so, oh no, okay, that's not what happened.
Oh, it's because people didn't patch their Windows XP systems.
Then it was, well, they left a port open to the world. Then they were blaming Microsoft for not patching, and Microsoft said no, they had already released the patch.
And there was just this complete blaming frenzy that was going on. At the moment, the blame is with North Korea, who apparently unleashed this attack on the rest of us.
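The kill switch Karen mentions worked because WannaCry checked a hard-coded domain before spreading: while the domain was unregistered the malware ran, and once a researcher registered it, the check succeeded and the malware halted. A minimal sketch of that pattern, using DNS resolution rather than the HTTP request the real malware made, and a made-up domain name:

```python
import socket

def kill_switch_active(domain: str) -> bool:
    """Return True if the kill-switch domain resolves, i.e. someone registered it."""
    try:
        socket.gethostbyname(domain)
        return True   # domain exists -> malware halts
    except socket.gaierror:
        return False  # domain unregistered -> malware would proceed

# Hypothetical domain; the real one was a long pseudo-random string.
if kill_switch_active("some-kill-switch-domain.invalid"):
    print("halt")
else:
    print("proceed and spread")
```

Registering the domain flipped that check for every infected machine at once, which is why one registration stopped the worldwide spread.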
But my real epiphany, if I can talk about it the way John does, was what is the point of all this blaming?
We didn't achieve anything. We've got to stop trying to blame and find the guilty party and really move to, well, hang on, why did this happen?
So if we go back a step, the NHS was very hard hit. They had to cancel operations and everything.
And there was blame, well, why didn't they patch? And the NHS said, well, we have machines like MRI machines that you cannot patch because it breaks them if you patch it.
Well, why did you leave the ports open so the hackers could get in?
Well, there are good reasons for that as well. You need to get the files off the MRI machines and you have to leave that port open.
So there were good reasons.
But it's like a storm. All the right factors lined up. People didn't patch, yes.
All sorts of things went wrong. And then the hackers exploited all those little things.
So there were lots and lots of reasons. There's no one reason why WannaCry happened.
There were lots of things that came together to create the perfect storm.
But at the end of the day, I think we, as a security arena, we've got to stop figuring out who did this thing, who made this thing, who allowed this thing to happen.
We've got to move to why did it happen and how can we prevent it from happening again.
Because the cyber attackers are going to continuously find these ways to attack.
So we have to rather start understanding that we have to fix the problems and we have to do it together instead of pointing at other people and blaming them for what's going on.
So if you decompose that, you talked about the NSA developing something; it was lost, it was sold and exploited, and then there was patching and blame down the line.
It ends up in North Korea's lap, which is a fitting sort of story for the media.
But if you walk it back a little bit, John, talk to us a little bit about how the emergence of all these connected devices and the amount of infrastructure that's being built out for what is considered to be public benefit.
So how many messages have we seen that we're being surveilled as we walked through the city?
It's many.
So John, talk about the proliferation of devices and how that infrastructure is growing.
And then we'll talk about how those things may be used for nefarious purposes.
Yeah, okay. So I think it's fair to say that although I do security now, it's not something that I chose.
I was invited to take a look at it by our chairman at the time and when I was first invited to look at it, I thought, well, this is curious.
There's lots of things going on in tech. There are things I'd like to get out like AI and maybe blockchain, but I need to go and have a look at this thing called security.
And I thought it was quite small and esoteric, but in fact it took me not long at all to realize that this is actually massive.
It is the challenge of our time.
If we go back in history a little bit, we came through the PC era, and in the PC era we made some mistakes.
We fixed them. We're still fixing them now.
We've kind of gone through this mobile era and things got quite a lot better.
And then we come into the era of IoT, and guess what? Things are actually worse than they were back in the PC era.
And there are reasons for that. If I look at the technology trends, and there are many, but I'll pick out two for you.
The first is hyper-connectivity.
That's massive. The second is the software definition of products.
And if you put those two together, what you have is hyper-vulnerability and you have hyper-hackability.
Now, when you also put that against what I would typically recognize as Moore's Law, you have this lowering of the cost of technology, which, to use the parlance, is the democratization of technology, which means it's very cheap to put connectivity into pretty much everything.
And I'm sure you've seen it. You've got your own favorite examples. Should everything be connected?
Well, personally, I don't think so. I think this is the thing.
I look at the engineering me, and I think, oh, wouldn't it be cool if I could do these things?
But you know what? Just because you can do it doesn't mean you should do it.
I've actually got a screen grab of a Twitter feed that came off a famous Internet of Things site, and I'll be polite in present company.
And there was a chap bemoaning the fact that he couldn't turn his hall lights off because his device couldn't get an SSL certificate, and he couldn't just turn it off on the device itself because the battery had gone.
And you're there wondering, you know, what was wrong with the switch? So just because it can be done, I don't think it necessarily should be done.
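The light-switch anecdote describes a device that fails closed: when the cloud path breaks, the light becomes unusable. A toy sketch of the alternative design, where local control never depends on the network; all of the names here are made up for illustration:

```python
class ConnectedLight:
    """Toy model of a smart light that keeps working when the cloud path fails."""

    def __init__(self) -> None:
        self.on = False

    def cloud_command(self, turn_on: bool, tls_ok: bool) -> str:
        # Refuse unauthenticated cloud commands, but never brick the device.
        if not tls_ok:
            return "cloud unreachable: certificate check failed, use the wall switch"
        self.on = turn_on
        return "ok"

    def wall_switch(self, turn_on: bool) -> None:
        # The physical switch works regardless of network or certificate state.
        self.on = turn_on

light = ConnectedLight()
light.cloud_command(turn_on=True, tls_ok=False)  # rejected: TLS validation failed
light.wall_switch(turn_on=True)                  # local control still works
print(light.on)  # True
```

The design choice is that security checks gate the remote path only; the local path stays available, so a certificate failure is an inconvenience rather than a lockout.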
But what I would also weigh that up against is I do believe that there are huge societal benefits to connecting certain systems up.
And I think certainly it's easy to find products which we can be disparaging about in the consumer space.
I think if you look in the business space, the business case is much more solid.
So we will get new innovation.
We will get efficiency. We will increase productivity. But I think, you know, what I was about to say, you know, once you have that technology and you can put it in, the fact is it's connected.
And that thing may not be the point of the attack.
That thing may just be a pivot into something else. So the fact that everything becomes connected up really exposes us all because what we have conceptually, if you think about it, the Internet of Things is an expanding attack surface, and it's going in completely the opposite direction to where we want security to be.
Because we want security to be nice and compact and small. We want a nice small surface.
We want to make sure that we authenticate, authorize, that we have confidentiality and all those wonderful things.
But hey, Internet of Things, let's just hook all this stuff up and mash up all that data.
So at its very heart, provenance is a big issue that we're trying to get our head around.
Not just provenance of data, you know, where that's come from, where it's going to, what networks has it gone through, but even in the supply chain.
So counterfeit is a big challenge for us going forward too, which will impact security.
And that's, I hope you can start to see.
I'm just talking about the technology right now.
I haven't started to talk about processes and culture. We should talk about motivations a little bit.
We should. Karen, let's talk a little bit about motivations.
So the obvious motivation is money. Anyone who puts ransomware on somebody's machine is wanting to get money from them.
Sometimes it works, other times people don't pay.
But a lot of the time, it's just so easy. There's a huge dark web where people can buy malware really cheaply and use that to attack other people.
Sometimes people may decide to attack somebody because they don't agree with them politically.
Or they may decide to attack a website or a system because these people are hurting animals in testing products or something.
So there's a vast number of motivations.
But I think there's two things that really make this kind of crime different from physical crime.
The first place is people hardly ever get prosecuted.
So somebody could attack me from the other side of the world and I know that nobody will ever prosecute them.
And the other part is just the ease of doing it.
The attack surface is huge. Lots of people are not securing their things properly.
And so these guys can attack us easily and they're never going to get prosecuted.
And so there's no deterrent. So whether they want money or whether they're just doing it to show how clever they are, there's just all sorts of people out there that are attacking us.
Now, when I came down into the underground this morning, I saw this huge billboard that said, you're 40% more likely to fall victim to a cyber attack than you are to fall victim to a physical crime attack.
This is kind of interesting because this is exactly what I'm saying.
It's just so much easier to attack people.
And if nobody's ever going to take you to court for doing it, why wouldn't you just try it?
So I think there's lots of motivations, but for me the facilitators are the ease and the lack of deterrents.
So what's to be done if law enforcement, in your words, is not going to prosecute, or can't quite bring these folks to justice for their crimes?
What's to be done?
Do we just rage against the system? Do we hunker down and try to limit the surface area that, John, you would suggest is not limitable?
What are we to do?
Well, I think there's a range of measures that we can take. I think attribution actually is something we didn't discuss in the kind of warm-up, but attribution is really quite difficult.
And so figuring out who has conducted the attack is quite a difficult thing.
And we know from recent attacks that actually nation-states are starting to hide behind criminal gangs just to cover their tracks.
But I think where we should start is making sure we're as secure as we can be.
And so one of the discussions that we get into an awful lot at the Foundation is what is the role of regulation?
Do we need regulation? Do we want it? And then in a world in technology which traditionally doesn't want regulation, I'll quote Bruce Schneier, who in 2016 said, that ship has already sailed.
It's not a case of if, it's a case of when.
And what's important here now is that we make sure we get smart regulation and we don't get dumb regulation.
Can you regulate across borders, though?
So if this thing does become a... Not if it does, it has become a global footprint.
So how do you regulate that? We've already got three sets of regulations that are coming in now that somehow conflict.
So we have GDPR in Europe, and we've got the Chinese doing something different in China.
And then we have the FBI apparently mandating backdoors for all encryption used in the US.
So those are already conflicting, which means that you're never going to be able to have a global solution anymore if you have a global company.
It's going to have to start being more disjointed.
I think that's actually quite a challenge. So does that dysfunction...
These three countries haven't been able to agree what should be done.
So they've done their own thing. Can I give you an aspirational vision?
When I was starting to look at this and recognizing that clearly the Internet doesn't stop at borders, there are lots of geopolitical, cultural issues to deal with.
But kind of inspired by Martin Luther King and his vision, I would like to think that if we accept that nations have a duty of care to protect their citizens, and pretty much every government has that, then wouldn't it be good if we could get some international accord that said security is so important for us all?
Because in this digital world, we are all naked.
We can all be manipulated in certain ways.
So why don't we have some international accord and then get our politicians to start that conversation?
Certainly here, whilst the UK is still in Europe, there are discussions about actually how do we start to put regulation around cybersecurity?
And I think certainly where IoT gets talked about an awful lot, there are a lot bigger issues than IoT right now.
I think also, actually, I should bring you in on this, Karen, because you brought it up originally, the Five Eyes.
We have common allies. It's in all our interests. So we kind of have to, we need to work bottom up and we need to work top down too.
And I think it's something that we as a society are going to have to understand that I believe that security is everybody's responsibility.
And I think there are, it's not evenly distributed.
We've all got some role to play. But I was talking about, I wrote a blog recently, and I called the blog, Why IoT Security Needs More BS.
And BS, translate BS. Yeah, BS. So it was deliberate. Again, in the warm-up, I was telling my colleagues here that I love the power of three.
I can generally hold three things in my head at a time.
Once you go beyond that, it gets a bit harder.
So we have this very complicated problem. How do you deal with complexity? Well, you have to start breaking it down.
And so at the foundation, we've been looking at, I guess, three major stakeholder groups.
The first group is the vendors.
We look at the vendors because we believe that if you do not provision security right at the very start, it becomes very difficult to deal with later.
So that's the first group.
The second group that we need to influence, in the void of regulation, we need market forces to do their job.
So we need to get the procurers, the installers, the integrators to be asking for security.
That's the second piece.
And the third piece, it's for the users. We need to make sure that they do their part too.
And here comes the BS. The first bit is build secure. The second part is buy secure.
And the third part is be secure. And as I was telling our members about, this is what we want to go at to try to give a very simple message, one of our members put his hand up and said, hey, John, that's great.
You've just given us three BSes.
And I said, hey, if that's how you remember it, then I'm very happy.
So that inspired the blog. So one of the things that was referenced earlier is, and I think perhaps Karen, you were talking about it back before we came on, is that security is not a vaccination.
No. So it's required but insufficient.
Can you just talk about that? It's a journey, I think. Well, with a vaccination, you can give vaccinations for bugs that don't change.
So we have polio vaccination and that sort of thing.
But when the arena changes all the time, technology changes, the hackers change, the way they attack us, everything is changing.
And therefore, we cannot just deliver a vaccination the first day somebody comes to work and go, right, they know about security now.
It has to become part of the culture.
It has to become part of the responsibility for everyone. If we silo our organizations, with the end users over there, and IT over here whose job is to be the policemen and keep security, and the users treated as people who have to be controlled all the time because they keep making mistakes, we're never going to make any progress in this game.
We really have to start, I would add a fourth BS, although I can't think of an S.
So we'll just say: be together, collaborate, be inclusive, include everyone in the whole security endeavor.
Let's try to work together.
Because while we're still pointing fingers and pushing people and blaming them, we're just never going to win.
We're going to have to start figuring out how to work together in this, collaborate.
But maybe you can think of an S.
I'll work on that one. Karen, I completely agree. Because if you look at the security challenge, it's no longer the preserve of security experts.
It's for us all to accept.
One of the things that I was delighted to see, certainly here in the UK, we used to have this secret organization at Cheltenham.
But just in recent years, we brought that out of the shadows and we put it front and center and said this is now part of the UK strategy.
And that is the National Cyber Security Center.
And I think that should be applauded because what we were saying is that we do need to shine a light on the problem.
It's very difficult to fix problems that you cannot see.
So let's bring them out into the open. Let's have a discussion.
And again, another reason why I'm delighted to be here to share some of my epiphanies along the way.
So I completely agree. At the first conference we did for the foundation, just three months after we were founded, I said, I need to figure out how we collaborate at scale.
I don't quite know how to do that yet.
I'm still working on it. But the reason why I said that is because defense is harder than attack.
An attacker needs to find just one route in. But we have to keep the system safe all of the time.
And you can't do that just with one group looking at it.
What we need to do is make sure we're inclusive, that we've got users that kind of keep their wits about them to spot suspicious activity.
As I've said, we need to make sure we build security in. We also need to understand the nature of security.
And again, this is something which, you know, I've been for a long time in tech, and I've never really thought about the difference between safety and security.
And I was forced to do that because a lot of people were coming to me saying, well, it's just the same as safety, isn't it?
And the answer is, well, it kind of isn't, isn't it?
And it isn't. And there's a very simple reason why it's not: with security, there's a human motivation behind it.
With safety, we're looking at accidents happening, an unfortunate kind of combination of events which creates a safety incident.
Now, it's fair to say that it's not quite as simple as that because a safety issue can become a security issue, and vice versa.
But it's important that we understand that.
Now, I was talking to somebody last night to say, look, you know, just in very simple layman's terms, think about the fire brigade and think about the police.
Police are there for security. The fire brigade are there to help us, you know, get out of tricky situations when there's a fire or when, you know, cats are stuck up trees and things like this.
But, you know, it's important that we split these things out.
You know, what's also very interesting is you can't just talk about security in isolation.
You know, we've been talking quite a lot about privacy.
So privacy, safety, security, reliability, all of these things are interrelated.
So that's, again, I'm just kind of supporting my opening gambit of it's, you know, it's a wicked challenge.
It's wicked. It's wicked.
So let's do this. Let's open it up. We have a number of folks here that are probably going to have interesting questions or perhaps comments.
There's one right in the front here.
And if we could queue there. Thanks for the house lights. And we'll go ahead and get started with questions.
Thanks. Hi, this is a question for Karen.
You touched upon this a bit, but didn't quite get into what we can do to help users, end users, understand what they need to do about security.
Because I think the problem is that there's so much to do, it's hard to know what's important.
I bet every single person in this room has done something, like they know they've got an open security issue, like a reused password or something they don't have 2FA on.
So how can we make non-technical users understand what they have to do and care? Thanks.
That's a really good question. I kind of had this epiphany last year, and I've sort of started to formulate all the ideas in my mind.
I think we have to try and figure out, we've been trying to fix the users for years, right?
So we've been trying to beat them over the head to choose stronger passwords.
And I don't know if anyone else in this room manages to do that.
But generally, it's really easy to solve the password problem.
Just give everyone, all your employees a password manager.
So you find the barrier that is preventing people from behaving securely, and you remove it.
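Karen's password-manager point is concrete: the manager generates and stores strong passwords so the human never has to invent or remember them. A minimal sketch of just the generation step, using Python's `secrets` module; the encrypted storage a real manager provides is omitted:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a high-entropy random password, as a password manager would."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets (not random) draws from a cryptographically secure source.
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Each call produces an independent, unguessable password.
print(generate_password())
```

Because the tool carries the memory burden, "use a unique strong password everywhere" stops being advice the user has to strain to follow.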
But I also think users have, for a long time, not felt part of the solution.
They get a bunch of rules from above that they're asked to follow, and sometimes they can't do what they're being asked to do, but they're not allowed to appeal that.
So we have to figure out a better way for them to understand that they're part of the solution.
And yeah, there are rules, but if they're not possible, you need to tell us so we can work together to find something that's doable.
I've seen a lot of situations where people have been told, well, you must do X, Y, and Z.
And when they try to say, well, that's not possible in my job situation, you're preventing me from doing my job.
It's, look, we know about security, do as you're told.
I don't think that's helpful. I think we have to start giving respect to people's jobs and their job expertise and trying to make sure that security is seen as an enabler rather than a barrier.
But also, what about end users, not employees, but people who are buying light switches?
People who are, sorry? People who are buying light switches.
No, like end users, people who aren't employees, like people who aren't in our workplace.
Consumers. But consumers, yeah. People who are buying light switches.
Yeah, of course, this is a huge challenge because I think at the moment this field has moved so fast that people don't quite get it.
And so there's a massive field of uncertainty.
And so when you have that uncertainty, people don't like that.
People will deal with uncertainty any way they can. And the way they do that is find a quick and easy solution and we'll deal with it.
So I don't think I have an easy fix for this either, but I kind of feel like I know which direction we need to go.
Perhaps I could add something to that. Certainly where the foundation's concerned, where we started off was in consumer because that's where we saw most of the acute problems, the crimes against humanity.
It's very clear that what you want to do in the consumer space is security needs to almost be like magic, so it has to be designed in right up front.
What we also need to do, though, is educate consumers because this is the digital transformation.
You know, this is the world that we now live in.
You know, I don't think anybody can divorce themselves from digital hygiene.
So we have to, you know, we have to say to them, make sure we speak to them in their language.
We don't talk about authentication and authorization.
We don't do that. We say, you know, make sure your password is not easy to guess.
By the way, if there's a software update... Consumers shouldn't have to worry about software updates, by the way, but if there's one, if there's a situation, make sure you do it.
And, you know, the third thing I would say is if they think there's something suspicious going on, they should tell somebody about it.
So keep that part really simple. You know, the complexity should be designed out right from the beginning.
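John's first rule for consumers, "make sure your password is not easy to guess," is a check a product can run for the user at sign-up rather than something the user has to judge. A minimal sketch against a tiny deny-list; real products check far larger corpora of breached passwords:

```python
# Tiny illustrative deny-list; production systems use millions of breached entries.
COMMON_PASSWORDS = {"123456", "password", "qwerty", "letmein", "admin", "iloveyou"}

def easy_to_guess(password: str) -> bool:
    """Flag passwords that are too short or on the common-password deny-list."""
    return len(password) < 8 or password.lower() in COMMON_PASSWORDS

for candidate in ["password", "Tr1cky-but-long-enough"]:
    verdict = "rejected" if easy_to_guess(candidate) else "accepted"
    print(candidate, "->", verdict)
```

This is the "designed in up front" idea: the device or service enforces the rule in the consumer's language, instead of lecturing them about authentication.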
We need to think about that. Now, if we move this into an enterprise setting, which is where we're moving, the big difference is that you have professional staff that can look after those systems.
So you can do different things and, you know, you can put more sophisticated mechanisms in and you can, you know...
But I do think, you know, part of it is to make sure we acknowledge that security is no longer the preserve of security experts.
There were a few things that I picked up along the way, my favourite expressions.
Number one, perfect security is asymptotic. We need to let people know that, in fact, it won't be perfect.
Security is context-dependent, right? You would not put Fort Knox security on a consumer device because it just breaks the cost model.
Why let good get in the way... Sorry, why let perfect get in the way of good?
So let's go for good first. Certainly some parts of IoT are, I think, below ground.
We need to at least get them above ground level. And I think, you know, if we can do that, we'll start to win.
And we certainly shouldn't be putting divisions in.
We need to make sure that we all own the problem. Let's recognise that it's a...
I was saying that the question we ask is, who owns security in IoT? And there's a very academic answer.
And the academic answer says that it's a highly distributed moral responsibility.
And what we mean by that, we've all got a role to play.
So let's all do our part. This is a collaboration. So I think that's a great note to end on.
I want to thank Karen and John for being here today. And thank you for sharing time with us.
And I appreciate it and look forward to the next session. Thank you.
Thank you.