Säkerhetspodcasten #50 - LIVE at Sec-T with F1nux Tech Weekly
Listen
Contents
Throwback Monday to Sec-T 2015. The panel takes over the stage at Nalen together with F1nux Tech Weekly, answering listener questions and discussing security news and interesting bugs, among other things on airplanes…
Recorded: 2015-09-18. Length: 40:05.
AI transcription
The AI is trying to understand us… Please bear with the occasional wild mistranscription.
1 00:00:00,000 --> 00:00:04,400
So, give it up for, what is it, the SecT podcast, then?
2 00:00:30,000 --> 00:05:26,000
[intro, applause and crowd noise; no speech recovered]
20 00:05:26,000 --> 00:05:28,060
have common sense, I mean common sense
21 00:05:28,060 --> 00:05:29,480
is not very common really
22 00:05:29,480 --> 00:05:32,280
especially not in governments
23 00:05:32,280 --> 00:05:34,500
but as long as our research
24 00:05:34,500 --> 00:05:36,000
is open to some extent
25 00:05:36,000 --> 00:05:37,960
and we share our results, we can’t really
26 00:05:37,960 --> 00:05:39,880
be expected to
27 00:05:39,880 --> 00:05:42,440
control what people then do with that research
28 00:05:42,440 --> 00:05:44,020
I mean people
29 00:05:44,020 --> 00:05:46,300
will be evil whether we like it or not
30 00:05:46,300 --> 00:05:47,980
so we could either
31 00:05:47,980 --> 00:05:50,260
stop researching new vulnerabilities
32 00:05:50,260 --> 00:05:52,180
or not disclose
33 00:05:52,180 --> 00:05:54,180
our findings which is not a good way to go
34 00:05:54,180 --> 00:05:56,020
either. No but
35 00:05:56,020 --> 00:05:58,160
I still think it’s a matter
36 00:05:58,160 --> 00:06:00,320
of being ultra responsible
37 00:06:00,320 --> 00:06:01,800
and
38 00:06:01,800 --> 00:06:04,120
ultra responsible
39 00:06:04,120 --> 00:06:05,880
disclosure, yeah. I mean
40 00:06:05,880 --> 00:06:07,300
it’s about
41 00:06:07,300 --> 00:06:10,220
giving the industry
42 00:06:10,220 --> 00:06:12,180
some time. On the other hand you need
43 00:06:12,180 --> 00:06:14,240
to put pressure on them because
44 00:06:14,240 --> 00:06:14,960
otherwise
45 00:06:14,960 --> 00:06:18,140
speaking from experience they won’t fix
46 00:06:18,140 --> 00:06:19,220
that security vulnerability.
47 00:06:20,000 --> 00:06:22,180
You say ultra responsible
48 00:06:22,180 --> 00:06:23,820
disclosure, that’s a way forward when you come
49 00:06:23,820 --> 00:06:24,140
to
50 00:06:24,180 --> 00:06:25,920
finding vulnerabilities
51 00:06:25,920 --> 00:06:27,800
and disclosing them to the vendors
52 00:06:27,800 --> 00:06:30,220
but what about exploit development?
53 00:06:30,460 --> 00:06:31,940
Could you sort of morally
54 00:06:31,940 --> 00:06:33,820
defend exploit development
55 00:06:33,820 --> 00:06:36,200
in any scenario? Do you mean as a
56 00:06:36,200 --> 00:06:36,900
business model?
57 00:06:38,260 --> 00:06:40,140
Whatever, let’s say
58 00:06:40,140 --> 00:06:42,100
you do it for free, let’s not
59 00:06:42,100 --> 00:06:43,980
talk about the money, you find a
60 00:06:43,980 --> 00:06:45,940
vulnerability, you produce an exploit
61 00:06:45,940 --> 00:06:48,240
and then you give it
62 00:06:48,240 --> 00:06:49,340
to someone. Let’s say
63 00:06:49,340 --> 00:06:52,240
the easy case is like okay you
64 00:06:52,240 --> 00:06:54,140
give, you actually work for the
65 00:06:54,140 --> 00:06:56,260
government or the law enforcement, the local
66 00:06:56,260 --> 00:06:58,100
law enforcement agency and you develop
67 00:06:58,100 --> 00:07:00,140
an exploit for them to be able
68 00:07:00,140 --> 00:07:02,320
to intercept communication from
69 00:07:02,320 --> 00:07:04,100
organized crime. That
70 00:07:04,100 --> 00:07:06,240
seems like a pretty moral good way
71 00:07:06,240 --> 00:07:06,820
to go, right?
72 00:07:08,300 --> 00:07:10,160
So, the
73 00:07:10,160 --> 00:07:12,060
interesting part, the beginning part of your kind of
74 00:07:12,060 --> 00:07:13,680
statement was let’s forget about the money.
75 00:07:14,520 --> 00:07:16,180
And the problem is, is that
76 00:07:16,180 --> 00:07:17,900
money rules the world around me, right?
77 00:07:17,900 --> 00:07:19,160
Yeah, yeah, of course. And
78 00:07:19,160 --> 00:07:22,460
the people that are advising governments
79 00:07:22,460 --> 00:07:23,860
have a vested
80 00:07:23,860 --> 00:07:26,000
interest in advising
81 00:07:26,000 --> 00:07:27,740
on offensive measures, right?
82 00:07:28,300 --> 00:07:29,640
It's like, you know,
83 00:07:29,960 --> 00:07:32,000
you take road safety
84 00:07:32,000 --> 00:07:34,140
advice from car manufacturers, right?
85 00:07:34,200 --> 00:07:36,100
You take security advice from
86 00:07:36,100 --> 00:07:37,660
weapon manufacturers, right?
87 00:07:38,420 --> 00:07:40,160
It’s like polar bears voting
88 00:07:40,160 --> 00:07:41,360
for no carbon emissions,
89 00:07:42,180 --> 00:07:44,180
right? You know, they’re not going to
90 00:07:44,180 --> 00:07:46,180
turn around and say the world is
91 00:07:46,180 --> 00:07:48,100
safe. You know, by the way, can we
92 00:07:48,100 --> 00:07:49,620
just have the 50 billion for nothing?
93 00:07:50,100 --> 00:07:52,120
Yeah, yeah, when we come to cyber
94 00:07:52,120 --> 00:07:53,040
warfare,
95 00:07:53,860 --> 00:07:55,620
I’m totally with you. But let’s go for
96 00:07:55,620 --> 00:07:57,900
an easier and more down-to-earth example, like
97 00:07:57,900 --> 00:07:59,960
law enforcement agency needing
98 00:07:59,960 --> 00:08:02,040
tools to counter
99 00:08:02,040 --> 00:08:04,040
the advanced
100 00:08:04,040 --> 00:08:05,360
criminals in our society.
101 00:08:05,680 --> 00:08:07,920
But they’ve always had tools. They’ve always had
102 00:08:07,920 --> 00:08:09,720
comparable tools. They've intercepted
103 00:08:09,720 --> 00:08:11,440
mails, they’ve intercepted telephone
104 00:08:11,440 --> 00:08:13,440
communications, they’ve intercepted wires,
105 00:08:13,880 --> 00:08:15,760
radio transmissions. If we
106 00:08:15,760 --> 00:08:17,720
look throughout history, they’ve been
107 00:08:17,720 --> 00:08:19,660
at the forefront of intercept. Whatever
108 00:08:19,660 --> 00:08:21,880
the intercept is. Yeah, but let’s say
109 00:08:21,880 --> 00:08:23,720
me and my
110 00:08:23,720 --> 00:08:25,580
guys in the terrorist cell Säkerhetspodcasten
111 00:08:25,580 --> 00:08:27,480
podcast, and we use iMessage
112 00:08:27,480 --> 00:08:29,040
and
113 00:08:29,040 --> 00:08:31,200
Then you’re one of the small ones, right?
114 00:08:31,300 --> 00:08:33,200
Yeah, exactly. And now
115 00:08:33,200 --> 00:08:34,600
they want to sort of
116 00:08:34,600 --> 00:08:37,460
the local law enforcement wants to get a hold of
117 00:08:37,460 --> 00:08:39,220
our communications. And without
118 00:08:39,220 --> 00:08:41,520
a good tool to crack the latest
119 00:08:41,520 --> 00:08:43,420
iPhone, they won’t get a hold of
120 00:08:43,420 --> 00:08:45,400
our information. No, they’d have to give Apple
121 00:08:45,400 --> 00:08:47,480
a call. Yeah, let’s say
122 00:08:47,480 --> 00:08:49,660
Apple is one of the good guys, just for argument’s
123 00:08:49,660 --> 00:08:49,820
sake.
124 00:08:51,160 --> 00:08:53,440
So, my angle is,
125 00:08:53,720 --> 00:08:55,540
is there a valid
126 00:08:55,540 --> 00:08:57,400
and morally okay
127 00:08:57,400 --> 00:08:58,980
scenario where
128 00:08:58,980 --> 00:09:01,540
exploit development is okay?
129 00:09:01,920 --> 00:09:03,440
So, you’re asking if the ends
130 00:09:03,440 --> 00:09:04,360
justify the means?
131 00:09:04,960 --> 00:09:07,320
To some extent. Depends on what you want to do,
132 00:09:07,440 --> 00:09:08,980
what you want to use the exploits for.
133 00:09:09,400 --> 00:09:10,040
Mics, mics.
134 00:09:11,180 --> 00:09:12,580
This I really like.
135 00:09:12,880 --> 00:09:15,440
You know, whenever you hear the term
136 00:09:15,440 --> 00:09:17,640
you gotta break some eggs to make an omelette.
137 00:09:18,640 --> 00:09:19,600
It’s usually not a
138 00:09:19,600 --> 00:09:21,560
good sign. Turn around and walk away.
139 00:09:21,660 --> 00:09:22,200
Yeah, yeah, yeah.
140 00:09:22,200 --> 00:09:23,400
All right, mics.
141 00:09:23,720 --> 00:09:25,520
Just out of curiosity,
142 00:09:25,980 --> 00:09:27,920
how do you define morality?
143 00:09:28,160 --> 00:09:30,320
Is it within the nation border or international?
144 00:09:30,580 --> 00:09:32,320
Because different nations kind of define
145 00:09:32,320 --> 00:09:34,140
that differently. Yeah, that’s
146 00:09:34,140 --> 00:09:35,500
a very, very good question.
147 00:09:35,840 --> 00:09:38,360
Yeah, and according to Ethiopia, I think
148 00:09:38,360 --> 00:09:40,380
it’s morally okay to spy
149 00:09:40,380 --> 00:09:42,280
on U.S. journalists.
150 00:09:42,780 --> 00:09:44,540
So, it’s
151 00:09:44,540 --> 00:09:45,660
entirely up to
152 00:09:45,660 --> 00:09:47,920
whoever to define what moral is.
153 00:09:48,420 --> 00:09:50,300
Hey, I recognize this is a really
154 00:09:50,300 --> 00:09:52,300
complex issue, but for me, I’m trying to
155 00:09:52,300 --> 00:09:53,480
sort of, to be,
156 00:09:53,720 --> 00:09:55,980
to be able to
157 00:09:55,980 --> 00:09:57,340
sort of, in a
158 00:09:57,340 --> 00:09:59,720
confined space, find
159 00:09:59,720 --> 00:10:01,760
some truth. I’m trying
160 00:10:01,760 --> 00:10:03,600
to look at sort of a local
161 00:10:03,600 --> 00:10:05,620
Swedish problem where perhaps the Swedish
162 00:10:05,620 --> 00:10:06,960
police would like to
163 00:10:06,960 --> 00:10:09,700
do some surveillance of
164 00:10:09,700 --> 00:10:11,920
a local criminal. So I’m
165 00:10:11,920 --> 00:10:13,680
skipping all the cyber and the
166 00:10:13,680 --> 00:10:15,660
United States of Save the World
167 00:10:15,660 --> 00:10:17,880
and all that stuff. I’m just looking at a local
168 00:10:17,880 --> 00:10:19,360
easy problem, like
169 00:10:19,360 --> 00:10:21,700
enhancing, creating better
170 00:10:21,700 --> 00:10:23,180
tools for local law enforcement.
171 00:10:23,180 --> 00:10:31,760
And then, of course, when you answer that question, you can go into really complex discussions about the wider moral implications of that one.
172 00:10:32,020 --> 00:10:32,800
Another comment?
173 00:10:33,320 --> 00:10:39,260
Yeah, well, another question comes up when we start talking about organized crime.
174 00:10:40,140 --> 00:10:46,800
And the question is, if you look back at the history of people doing these things in terms of digging into people’s privacy,
175 00:10:47,080 --> 00:10:52,520
most of it goes into a lot of government policies that were questionable in the first place.
176 00:10:52,520 --> 00:10:56,340
Yeah, foreign policy, right? Foreign policy determines international…
177 00:10:56,340 --> 00:10:57,140
Drug policy.
178 00:10:57,320 --> 00:11:04,780
Yeah, foreign policy is international morality. Drug policy is so we didn’t have to make the FBI redundant.
179 00:11:04,840 --> 00:11:06,080
Well, the war on drugs.
180 00:11:06,160 --> 00:11:07,000
Yeah, right, okay.
181 00:11:07,020 --> 00:11:11,820
We suddenly can justify whatever we need to do because it’s the war on drugs.
182 00:11:12,540 --> 00:11:14,160
We’ve got a war on everything, right?
183 00:11:14,800 --> 00:11:16,340
How’s the war going, by the way?
184 00:11:17,280 --> 00:11:19,960
We give up, at least in certain places.
185 00:11:20,700 --> 00:11:22,460
And then we have like in…
186 00:11:22,520 --> 00:11:32,920
In Sweden, I think, in the UK as well, that we introduce surveillance to stop organized crime and stop foreign military threats.
187 00:11:33,760 --> 00:11:37,660
And then it turns out that we actually provide some of it to the tax department.
188 00:11:38,580 --> 00:11:48,020
And I don’t think anyone here ever sat down and agreed to these tools we developed for serious, serious cases.
189 00:11:48,020 --> 00:11:51,020
We’re now doing it to help…
190 00:11:52,520 --> 00:11:55,860
Get the guys who possibly might be cheating on their taxes.
191 00:11:56,360 --> 00:11:58,720
Yeah, let’s give all our new exploits to the IRS.
192 00:11:59,520 --> 00:12:00,500
They need all the help they can get.
193 00:12:00,500 --> 00:12:04,540
But what I’m hearing from you guys is that, okay, exploit development is never okay?
194 00:12:06,200 --> 00:12:06,440
No.
195 00:12:06,760 --> 00:12:08,480
I would disagree.
196 00:12:09,060 --> 00:12:10,840
I get those vibes from you.
197 00:12:10,840 --> 00:12:21,920
But so far we have talked about selling exploits to someone who’s actually intending to use them for, for example, law enforcement or for something else.
198 00:12:21,920 --> 00:12:22,420
But…
199 00:12:22,420 --> 00:12:25,660
Why would you buy an exploit that you weren’t going to use?
200 00:12:25,740 --> 00:12:26,140
Exactly.
201 00:12:26,280 --> 00:12:26,480
Right?
202 00:12:26,740 --> 00:12:27,200
I mean…
203 00:12:27,200 --> 00:12:27,220
Yeah.
204 00:12:27,500 --> 00:12:28,240
But why do you build new bombs?
205 00:12:28,240 --> 00:12:30,220
But I mean, a lot of research is just published.
206 00:12:30,240 --> 00:12:34,060
Because they were planning on using them, like, the minute they weren’t looking.
207 00:12:35,180 --> 00:12:36,720
Like, the shield’s down.
208 00:12:36,860 --> 00:12:37,060
Go.
209 00:12:37,720 --> 00:12:38,080
Okay.
210 00:12:38,600 --> 00:12:39,640
I’m not going to let this go.
211 00:12:40,000 --> 00:12:43,720
Is it okay to work with exploit development?
212 00:12:43,720 --> 00:12:49,720
I can see where it’s a valid point to…
213 00:12:50,800 --> 00:12:51,200
To…
214 00:12:51,200 --> 00:12:51,720
At least create a…
215 00:12:51,920 --> 00:12:53,100
A proof of concept code.
216 00:12:53,320 --> 00:12:59,480
Because you might have to convince company X that has a vulnerable product.
217 00:12:59,940 --> 00:13:06,200
And you have to show them that if you do this, in this particular way, things go bad.
218 00:13:06,800 --> 00:13:08,100
PoC or GTFO, right?
219 00:13:08,440 --> 00:13:08,800
Yeah.
220 00:13:08,980 --> 00:13:19,740
So, I mean, it’s the one scenario where I personally would create an exploit or an attack tool.
221 00:13:20,500 --> 00:13:20,640
Yeah.
222 00:13:20,800 --> 00:13:21,740
I think you got a point there.
223 00:13:21,740 --> 00:13:23,660
Let’s look at the FireSheep plug-in.
224 00:13:25,040 --> 00:13:30,580
Facebook did nothing to secure their communication to the website until the FireSheep plug-in went live.
225 00:13:30,740 --> 00:13:31,820
And it was very visual.
226 00:13:31,940 --> 00:13:34,620
It was very easy for everyone to understand that this is bad.
227 00:13:34,680 --> 00:13:36,500
That was developed by a friend of mine.
228 00:13:37,360 --> 00:13:37,640
Cool.
229 00:13:38,040 --> 00:13:38,480
So…
230 00:13:38,480 --> 00:13:39,240
Kudos to your friend.
231 00:13:39,400 --> 00:13:39,600
Yeah.
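The FireSheep point in miniature: at the time, Facebook sent its session cookie over plain HTTP, so anyone on the same open Wi-Fi could sniff it and replay it. A minimal sketch of the server-side fix, using only Python's standard library and a made-up token value:

```python
from http.cookies import SimpleCookie

# Build the Set-Cookie header a hardened server would emit after login.
cookie = SimpleCookie()
cookie["session_id"] = "hypothetical-session-token"
cookie["session_id"]["secure"] = True    # only ever sent over HTTPS
cookie["session_id"]["httponly"] = True  # not readable from page JavaScript

print(cookie.output())
# Set-Cookie: session_id=hypothetical-session-token; Secure; HttpOnly
```

Without the Secure flag, and without HTTPS on every request rather than just the login page, the token crosses an open network in cleartext, which is exactly what FireSheep made visible.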
232 00:13:40,060 --> 00:13:45,660
And I think the thing is, you’ve got to remember, it’s easy for us to, when we talk about software,
233 00:13:46,100 --> 00:13:48,200
we kind of attach our own personal feelings to it.
234 00:13:48,380 --> 00:13:49,180
Is this good software?
235 00:13:49,300 --> 00:13:50,140
Is this bad software?
236 00:13:50,140 --> 00:13:51,220
Software is amoral.
237 00:13:51,220 --> 00:13:52,580
It’s you that is good.
238 00:13:52,740 --> 00:13:53,660
It’s you that’s bad.
239 00:13:53,960 --> 00:13:54,880
The software is nothing.
240 00:13:55,240 --> 00:13:57,740
And maps are a weapon or a defense, right?
241 00:13:58,400 --> 00:13:59,640
Well, guns don’t kill people.
242 00:13:59,820 --> 00:14:02,780
Yeah, we had that analogy in, like, the arms industry.
243 00:14:03,140 --> 00:14:07,100
I mean, a weapon, per se, is neither good or bad.
244 00:14:07,400 --> 00:14:08,680
Yeah, it doesn’t give a shit about you, right?
245 00:14:08,680 --> 00:14:09,920
But in the end, it is built to shoot someone.
246 00:14:10,600 --> 00:14:10,780
Hmm?
247 00:14:11,160 --> 00:14:12,640
It is built to shoot someone.
248 00:14:12,800 --> 00:14:13,340
Yeah, definitely.
249 00:14:13,760 --> 00:14:17,260
You know, my rifles at home, they’re made for hunting moose.
250 00:14:17,920 --> 00:14:20,620
Well, yeah, but an M16 isn’t.
251 00:14:21,220 --> 00:14:23,640
That’s for shooting lots of moose.
252 00:14:26,740 --> 00:14:33,680
Another good example of proof of concept actually being what is required is
253 00:14:33,680 --> 00:14:36,740
Charlie Miller and Christopher Lessig’s hack this summer.
254 00:14:37,320 --> 00:14:41,060
Because we, as a community, we knew cars were hackable.
255 00:14:41,560 --> 00:14:45,620
Autosec.org released a paper, like, three or four years ago with the exact same results.
256 00:14:46,180 --> 00:14:47,580
And nothing really happened.
257 00:14:47,580 --> 00:14:50,600
The automotive industry started looking into these issues,
258 00:14:50,600 --> 00:14:52,520
but they didn’t really do it for real.
259 00:14:53,020 --> 00:14:55,860
There’s a lot of movement in the automotive industries right now
260 00:14:55,860 --> 00:14:58,600
after the revelations of Chris and Charlie.
261 00:14:58,800 --> 00:15:00,320
And that’s really the point there.
262 00:15:00,460 --> 00:15:04,240
You need to create something that’s really spectacular
263 00:15:04,240 --> 00:15:11,420
in order for the companies with vulnerable products to really do something about it.
264 00:15:11,620 --> 00:15:13,540
Or an entire industry, I think.
265 00:15:14,480 --> 00:15:15,580
We have another question there.
266 00:15:16,080 --> 00:15:17,860
Before the question jumps in, right,
267 00:15:17,860 --> 00:15:20,280
what we have to remember with full disclosure,
268 00:15:20,600 --> 00:15:24,300
in particular cases, is the automobile industry had problems beforehand.
269 00:15:24,740 --> 00:15:26,660
And they had responsible disclosure.
270 00:15:27,140 --> 00:15:30,540
And rather than reaching for the developer, they reached for the lawyer.
271 00:15:31,060 --> 00:15:37,300
And I think a lot of us could argue that responsible disclosure on paper seems a beautiful thing, right?
272 00:15:37,460 --> 00:15:40,080
I’m going to help you help yourself.
273 00:15:40,360 --> 00:15:42,840
But the fact of it is, it’s being abused.
274 00:15:43,280 --> 00:15:46,480
So that you’ve been left with that your only true choice now
275 00:15:46,480 --> 00:15:48,440
is if you responsibly disclose something,
276 00:15:49,000 --> 00:15:50,480
you’re likely to get your ass hacked.
277 00:15:50,600 --> 00:15:52,900
You’re likely to get your ass handed to you by a legal team.
278 00:15:52,900 --> 00:15:56,100
Whereas if you full disclose it, fuck you, we’re done.
279 00:15:56,100 --> 00:15:58,900
Right? It’s over. You can’t silence me now. It’s out.
280 00:15:58,900 --> 00:16:00,400
And good luck with your lawyer.
281 00:16:00,400 --> 00:16:04,900
And it’s a shame because it shows you how a process that we came up with
282 00:16:04,900 --> 00:16:07,900
as a community to make things better
283 00:16:07,900 --> 00:16:10,600
was abused by a legal structure.
284 00:16:10,600 --> 00:16:15,400
And now, literally, we have to say, you know what, we fuck you harder.
285 00:16:15,400 --> 00:16:16,600
That’s what we’re going to do.
286 00:16:16,600 --> 00:16:19,000
So it’s a shaming process, right?
287 00:16:19,000 --> 00:16:20,500
It’s moral shaming.
288 00:16:20,500 --> 00:16:21,500
It’s shaming people.
289 00:16:21,500 --> 00:16:23,500
But we’re the bad guys for it now as well, right?
290 00:16:23,500 --> 00:16:29,500
But I mean, I can really relate to that because in a fairly recent,
291 00:16:29,500 --> 00:16:33,500
not so recent, really, situation,
292 00:16:33,500 --> 00:16:36,500
I discovered a vulnerability in a product
293 00:16:36,500 --> 00:16:39,500
and talked to the company about this
294 00:16:39,500 --> 00:16:43,000
and showed them that this is really bad.
295 00:16:43,000 --> 00:16:44,500
You need to do something about it.
296 00:16:44,500 --> 00:16:48,500
And I was served with a very long letter from their lawyers
297 00:16:48,500 --> 00:16:50,500
telling me, you know, what they would do
298 00:16:50,500 --> 00:16:53,500
to us if this came out.
299 00:16:53,500 --> 00:16:55,500
You’re going to get medieval on your ass, right?
300 00:16:55,500 --> 00:16:56,500
Yes.
301 00:16:56,500 --> 00:16:57,500
What was the question?
302 00:16:57,500 --> 00:17:01,500
Well, this is sort of, I'll start off by saying Free Kevin.
303 00:17:01,500 --> 00:17:06,500
The thing about it is that’s part of,
304 00:17:06,500 --> 00:17:09,500
I ended up moving to Sweden about five years ago
305 00:17:09,500 --> 00:17:12,500
and part of my decision about it is if I’m ever going to do security,
306 00:17:12,500 --> 00:17:14,500
I didn’t want to do it in the United States
307 00:17:14,500 --> 00:17:17,500
because of the possibility of getting thrown in jail
308 00:17:17,500 --> 00:17:19,500
or getting sued into nothingness.
309 00:17:20,500 --> 00:17:25,500
And it’s interesting because I had a discussion group
310 00:17:25,500 --> 00:17:27,500
that I have to remain nameless on
311 00:17:27,500 --> 00:17:30,500
that was a bunch of people talking about Internet of Things
312 00:17:30,500 --> 00:17:32,500
and connected devices.
313 00:17:32,500 --> 00:17:35,500
And I was tossing in the fact that we really needed to worry
314 00:17:35,500 --> 00:17:40,500
about the security and building a security standard for IoT devices.
315 00:17:40,500 --> 00:17:43,500
And it was funny because I brought up,
316 00:17:43,500 --> 00:17:46,500
one of the things I brought up was the Malaysian flight.
317 00:17:46,500 --> 00:17:49,500
And the person immediately jumped on me and said,
318 00:17:49,500 --> 00:17:51,500
you’re absolutely fucking crazy.
319 00:17:51,500 --> 00:17:55,500
There’s no way that a plane can get hacked and this is crap.
320 00:17:55,500 --> 00:17:58,500
And this is about seven or eight months ago.
321 00:17:58,500 --> 00:18:02,500
And so a few weeks ago we had the person arrested in the United States
322 00:18:02,500 --> 00:18:04,500
for hacking the plane.
323 00:18:04,500 --> 00:18:08,500
And I immediately had to send the article to him and say,
324 00:18:08,500 --> 00:18:11,500
okay, so they just arrested somebody for this.
325 00:18:11,500 --> 00:18:14,500
Well, to be fair, he wasn’t arrested.
326 00:18:14,500 --> 00:18:16,500
He was questioned, detained at the airport, right?
327 00:18:16,500 --> 00:18:18,500
You’re talking about Chris Roberts, right?
328 00:18:18,500 --> 00:18:20,500
And United.
329 00:18:20,500 --> 00:18:24,500
So this is a great infosec drama
330 00:18:24,500 --> 00:18:28,500
where the infosec community spins things something chronic.
331 00:18:28,500 --> 00:18:31,500
Chris Roberts did passive sniffing on a plane, right?
332 00:18:31,500 --> 00:18:34,500
Allegedly, sorry Chris, right? Allegedly, right?
333 00:18:34,500 --> 00:18:38,500
But this is "man hacks plane and flies sideways", right?
334 00:18:38,500 --> 00:18:40,500
Because it sells.
335 00:18:40,500 --> 00:18:42,500
Full spectrum cyber.
336 00:18:42,500 --> 00:18:44,500
Back to the spectacular.
337 00:18:44,500 --> 00:18:46,500
I mean, people forget that the dude was actually on the plane.
338 00:18:46,500 --> 00:18:48,500
How much do you love your research?
339 00:18:48,500 --> 00:18:51,500
Enough to kill yourself on a plane, right?
340 00:18:51,500 --> 00:18:53,500
But the way that people are talking,
341 00:18:53,500 --> 00:18:55,500
he recklessly endangered lives.
342 00:18:55,500 --> 00:18:57,500
Yes, his, right?
343 00:18:57,500 --> 00:19:00,500
People, we jump to conclusions about this.
344 00:19:00,500 --> 00:19:02,500
It’s interesting that you look at the Malaysian Airlines
345 00:19:02,500 --> 00:19:04,500
and think hack.
346 00:19:04,500 --> 00:19:06,500
I’ve never made that.
347 00:19:06,500 --> 00:19:08,500
I’m not saying hack.
348 00:19:08,500 --> 00:19:10,500
I’m just saying if there’s a possibility.
349 00:19:10,500 --> 00:19:12,500
Yeah, I mean.
350 00:19:12,500 --> 00:19:14,500
Of course it is.
351 00:19:14,500 --> 00:19:18,500
What’s interesting in security is we are a self-fulfilling prophecy in its own right.
352 00:19:18,500 --> 00:19:22,500
So one thing we can guarantee is that planes will be hacked now
353 00:19:22,500 --> 00:19:24,500
because we talked about planes being hacked now.
354 00:19:24,500 --> 00:19:27,500
And this will carry on and perpetuate and perpetuate
355 00:19:27,500 --> 00:19:29,500
until we definitely take a plane out of the sky.
356 00:19:29,500 --> 00:19:32,500
And part of the reason why I went nuts with this guy
357 00:19:32,500 --> 00:19:34,500
is because he said never.
358 00:19:34,500 --> 00:19:35,500
Yeah.
359 00:19:35,500 --> 00:19:38,500
I mean, look at Boeing and their 787, right?
360 00:19:38,500 --> 00:19:45,500
They basically had a bug, a counter bug in their software on the planes.
361 00:19:45,500 --> 00:19:47,500
And the fix was you had to reboot your 787.
362 00:19:47,500 --> 00:19:49,500
No shit.
363 00:19:49,500 --> 00:19:53,500
Like you had to power the thing down and back up again to reset the counter.
364 00:19:53,500 --> 00:19:57,500
These people are building planes that we all fly on, right?
365 00:19:57,500 --> 00:20:00,500
And it’s an overflow bug that they’ve had.
366 00:20:00,500 --> 00:20:02,500
These overflow bugs are pretty old, right?
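For reference, the reported 787 issue was a counter in the generator control units that fails after 248 days of continuous power. The arithmetic points at a signed 32-bit counter of hundredths of a second; a sketch under that assumption (this is not the actual avionics code):

```python
# Emulate C-style signed 32-bit wraparound in Python.
def as_int32(n: int) -> int:
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n & 0x80000000 else n

CENTISECONDS_PER_DAY = 24 * 60 * 60 * 100

print(2**31 / CENTISECONDS_PER_DAY)          # 248.55 -> days until the wrap
print(as_int32(248 * CENTISECONDS_PER_DAY))  #  2142720000 (still positive)
print(as_int32(249 * CENTISECONDS_PER_DAY))  # -2143607296 (wrapped negative)
```

Power-cycling before day 248 resets the counter to zero, which is why the fix really was "turn it off and on again".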
367 00:20:02,500 --> 00:20:05,500
I’d like to listen to that IT support call.
368 00:20:05,500 --> 00:20:07,500
Actually.
369 00:20:07,500 --> 00:20:10,500
Have you rebooted your 777?
370 00:20:10,500 --> 00:20:12,500
Yeah, have you tried turning it on and off again?
371 00:20:12,500 --> 00:20:16,500
Actually, that question was a really good sort of
372 00:20:16,500 --> 00:20:20,500
bridge over to our next question for today.
373 00:20:20,500 --> 00:20:24,500
Because we had a question from Olof Lindqvist on Twitter.
374 00:20:24,500 --> 00:20:27,500
And he wondered about the future of IT security.
375 00:20:27,500 --> 00:20:30,500
What software and hardware threats do we see in the next five years
376 00:20:30,500 --> 00:20:31,500
or perhaps ten years?
377 00:20:31,500 --> 00:20:33,500
And how do we prepare for them?
378 00:20:33,500 --> 00:20:35,500
The internet of flying things.
379 00:20:35,500 --> 00:20:37,500
Yeah, and that’s why I caught on to the question.
380 00:20:37,500 --> 00:20:40,500
It’s like internet of things is coming and stuff like that.
381 00:20:40,500 --> 00:20:42,500
So what do we see in the future?
382 00:20:42,500 --> 00:20:45,500
I could just say that I was on a small flight
383 00:20:45,500 --> 00:20:47,500
with a private pilot.
384 00:20:47,500 --> 00:20:52,500
And we were listening in to the general air transmissions.
385 00:20:52,500 --> 00:20:56,500
And there was a big airliner
386 00:20:56,500 --> 00:21:02,500
who had the problem that their display was black.
387 00:21:02,500 --> 00:21:07,500
And that wasn’t in any newspaper the day after.
388 00:21:07,500 --> 00:21:13,500
So I think a lot more scary shit happens
389 00:21:13,500 --> 00:21:15,500
than is ever reported.
390 00:21:15,500 --> 00:21:17,500
Because since they resolved it,
391 00:21:17,500 --> 00:21:19,500
they didn’t need to do an emergency landing.
392 00:21:19,500 --> 00:21:22,500
They managed to reboot something and cope with it.
393 00:21:22,500 --> 00:21:25,500
And they got their IT systems up.
394 00:21:25,500 --> 00:21:27,500
They could continue like nothing happened.
395 00:21:27,500 --> 00:21:30,500
And I’m pretty sure they didn’t tell the passengers back
396 00:21:30,500 --> 00:21:33,500
that the pilots were out of instrumentation
397 00:21:33,500 --> 00:21:35,500
and didn’t see a shit on their computer.
398 00:21:35,500 --> 00:21:38,500
Because if the screens are black, they have zero instruments.
399 00:21:38,500 --> 00:21:40,500
Because it’s all computerized.
400 00:21:40,500 --> 00:21:42,500
I think perhaps Johan, you were right.
401 00:21:42,500 --> 00:21:44,500
It was a call to the IT support department.
402 00:21:44,500 --> 00:21:48,500
So who in here are flying home on Sunday or tomorrow?
403 00:21:48,500 --> 00:21:50,500
Good luck.
404 00:21:50,500 --> 00:21:53,500
Make sure it’s Airbus, right?
405 00:21:53,500 --> 00:21:56,500
So, what do we see in the future?
406 00:21:56,500 --> 00:22:00,500
Apart from planes with Windows XP screensavers on them?
407 00:22:00,500 --> 00:22:03,500
I see your toaster being a pivot point, right?
408 00:22:03,500 --> 00:22:04,500
Yeah.
409 00:22:04,500 --> 00:22:07,500
I see your toaster being a pivot point.
410 00:22:07,500 --> 00:22:10,500
And then from there on inwards you’re getting pwned.
411 00:22:10,500 --> 00:22:13,500
But to be fair, the Internet of Things is just bullshit, right?
412 00:22:13,500 --> 00:22:15,500
Because it’s IPv6.
413 00:22:15,500 --> 00:22:17,500
Borderless networking.
414 00:22:17,500 --> 00:22:19,500
I mean, this is what it boils down to.
415 00:22:19,500 --> 00:22:21,500
I mean, this is probably why we’ll colonize Mars
416 00:22:21,500 --> 00:22:23,500
so we can finally have a planet that’s IPv6.
417 00:22:25,500 --> 00:22:28,500
But, I mean, this is the concept.
418 00:22:28,500 --> 00:22:30,500
Nothing has a border.
419 00:22:30,500 --> 00:22:33,500
So it doesn’t make a difference if it’s a toaster or a computer, right?
420 00:22:33,500 --> 00:22:35,500
I mean, that’s the whole idea.
421 00:22:35,500 --> 00:22:38,500
We have so many IP addresses that we’ll be able to connect to.
422 00:22:38,500 --> 00:22:41,500
So, you know, those that have got their head around,
423 00:22:41,500 --> 00:22:44,500
there are no borders, there is no perimeter, no external,
424 00:22:44,500 --> 00:22:46,500
and so on and so forth in networking,
425 00:22:46,500 --> 00:22:50,500
will have a better time than people who are still building walls
426 00:22:50,500 --> 00:22:52,500
as a defense mechanism.
427 00:22:52,500 --> 00:22:55,500
You know, this was good in the medieval times in Rome and so on and so forth.
428 00:22:55,500 --> 00:22:57,500
Bit shitty in our tech world.
429 00:22:57,500 --> 00:22:59,500
We’re just going to build a wall.
430 00:22:59,500 --> 00:23:01,500
And everything on the outside of this wall is a bad guy.
431 00:23:01,500 --> 00:23:03,500
And everything on the inside of this is a good guy.
432 00:23:03,500 --> 00:23:05,500
It’s armadillo of security.
433 00:23:05,500 --> 00:23:08,500
So those that get their head around this probably won’t have a bad time.
434 00:23:08,500 --> 00:23:10,500
Those that don’t will probably die.
435 00:23:10,500 --> 00:23:15,500
I do think that the IT support market will be exploding soon.
436 00:23:15,500 --> 00:23:19,500
I mean, I’ll have to do firmware updates for my fridge in the near future.
437 00:23:19,500 --> 00:23:21,500
And I just see how many ways that can go wrong.
438 00:23:21,500 --> 00:23:25,500
Hey, I do firmware updates with my furnace.
439 00:23:25,500 --> 00:23:27,500
Yeah, today.
440 00:23:27,500 --> 00:23:31,500
I’m looking forward to patch Tuesday for your home.
441 00:23:31,500 --> 00:23:35,500
But if you look at that, throwing away the borders,
442 00:23:35,500 --> 00:23:39,500
IPv6 for everyone, a lot of possibility to
443 00:23:39,500 --> 00:23:42,500
connect to all the things.
444 00:23:42,500 --> 00:23:48,500
With what we see today from an embedded software development perspective,
445 00:23:48,500 --> 00:23:51,500
I would say there’s going to be a lot worse when it comes to
446 00:23:51,500 --> 00:23:55,500
exploitable software and hardware because they aren’t ready.
447 00:23:55,500 --> 00:23:58,500
The embedded software I’ve seen is really, really bad.
448 00:23:58,500 --> 00:24:00,500
Nothing’s changed, right?
449 00:24:00,500 --> 00:24:03,500
I mean, we’ve been talking about inputting,
450 00:24:03,500 --> 00:24:05,500
sanitizing input for God knows how long.
451 00:24:05,500 --> 00:24:08,500
I mean, at some point we have to look at it as an industry and say we suck.
452 00:24:09,500 --> 00:24:11,500
Because we’ve talked about this.
453 00:24:11,500 --> 00:24:13,500
We’ve even got OWASP, right?
454 00:24:13,500 --> 00:24:16,500
And we still have SQLi, right?
455 00:24:16,500 --> 00:24:18,500
We still have this going on.
456 00:24:18,500 --> 00:24:20,500
At some point, you’re going to have to look in the mirror and say,
457 00:24:20,500 --> 00:24:22,500
maybe it’s not the developers.
458 00:24:22,500 --> 00:24:25,500
Maybe it’s us that can’t convey the message.
459 00:24:25,500 --> 00:24:26,500
Definitely true.
460 00:24:26,500 --> 00:24:28,500
There is a problem here.
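The SQLi complaint in one runnable example; the table and the input are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

name = "' OR '1'='1"  # the classic injection payload

# Broken: string building lets the input rewrite the query itself.
rows = conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()
print(rows)  # [('alice',)] -- the OR clause matches every row

# Fixed: a parameterized query treats the input as data, never as SQL.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
print(rows)  # [] -- no user is literally named "' OR '1'='1"
```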
461 00:24:28,500 --> 00:24:31,500
So, you know, all we’re doing is just moving the attack surface.
462 00:24:31,500 --> 00:24:33,500
It’s going to be the same attacks.
463 00:24:33,500 --> 00:24:37,500
I mean, look at what we’re about to see with ICMP.
464 00:24:37,500 --> 00:24:39,500
All the, like, IPv6 stuff,
465 00:24:39,500 --> 00:24:41,500
and with ICMP in IPv4,
466 00:24:41,500 --> 00:24:44,500
we turned off ping attacks and ping tunnels and so on and so forth.
467 00:24:44,500 --> 00:24:46,500
We didn’t mitigate the problem.
468 00:24:46,500 --> 00:24:47,500
We turned it off.
469 00:24:47,500 --> 00:24:48,500
And guess what?
470 00:24:48,500 --> 00:24:51,500
We can’t do half of those defense mode processes in IPv6.
471 00:24:51,500 --> 00:24:52,500
So guess what?
472 00:24:52,500 --> 00:24:54,500
We just put off a technical debt.
473 00:24:54,500 --> 00:24:55,500
That’s all we did.
474 00:24:55,500 --> 00:24:57,500
So everything that’s old will be shiny again, right?
475 00:24:57,500 --> 00:24:58,500
Yeah.
476 00:24:58,500 --> 00:25:01,500
That’s sort of a process we’ve seen before.
477 00:25:01,500 --> 00:25:05,500
So let’s go back to the good old days
478 00:25:05,500 --> 00:25:07,500
when we put computers on the internet for the first time.
479 00:25:07,500 --> 00:25:09,500
And then we solved those problems.
480 00:25:09,500 --> 00:25:10,500
More or less.
481 00:25:10,500 --> 00:25:12,500
And then we put phones on the internet.
482 00:25:12,500 --> 00:25:14,500
And then we got the same problems.
483 00:25:14,500 --> 00:25:15,500
And now we’re trying to solve them.
484 00:25:15,500 --> 00:25:19,500
And now we’re putting small, tiny, embedded devices with Linux.
485 00:25:19,500 --> 00:25:21,500
Bad configured Linux.
486 00:25:21,500 --> 00:25:23,500
And we have that same problem all over again.
487 00:25:23,500 --> 00:25:24,500
And cars.
488 00:25:24,500 --> 00:25:25,500
And airplanes.
489 00:25:25,500 --> 00:25:30,500
I’m not so worried about refrigerators and toasters being connected to the internet.
490 00:25:30,500 --> 00:25:32,500
I’m slightly worried about tanks.
491 00:25:32,500 --> 00:25:35,500
And you laugh, right?
492 00:25:35,500 --> 00:25:38,500
I swear to God there is a tank with an internet connection.
493 00:25:38,500 --> 00:25:39,500
Right?
494 00:25:39,500 --> 00:25:41,500
There’s a gun with an internet connection.
495 00:25:41,500 --> 00:25:42,500
Yeah, right.
496 00:25:42,500 --> 00:25:43,500
Okay, cool.
497 00:25:43,500 --> 00:25:47,500
That sounds like an interesting project to hack a tank.
498 00:25:47,500 --> 00:25:51,500
At what point in that board meeting do you go,
499 00:25:51,500 --> 00:25:53,500
we should totally connect this thing, right?
500 00:25:53,500 --> 00:25:55,500
It would be fucking awesome.
501 00:25:55,500 --> 00:25:58,500
If we can just blow shit up remotely.
502 00:25:58,500 --> 00:26:00,500
At what point do you all agree?
503 00:26:00,500 --> 00:26:02,500
It’s a unique selling point, right?
504 00:26:02,500 --> 00:26:05,500
I like this assault rifle, but is it Wi-Fi-enabled?
505 00:26:05,500 --> 00:26:07,500
Can I update my status?
506 00:26:07,500 --> 00:26:12,500
To connect back to yesterday’s presentation,
507 00:26:12,500 --> 00:26:18,500
they talked about security issues in the FIS,
508 00:26:18,500 --> 00:26:20,500
the physical internet.
509 00:26:20,500 --> 00:26:25,500
And the NCC group published a month or so ago
510 00:26:25,500 --> 00:26:31,500
attacks where they attacked cars via the DAB radio.
511 00:26:31,500 --> 00:26:36,500
And they owned the infotainment systems over the radio.
512 00:26:36,500 --> 00:26:42,500
So we’re actually, in exception to the attack surfaces
513 00:26:42,500 --> 00:26:44,500
we’re intentionally putting there,
514 00:26:44,500 --> 00:26:48,500
we also got unintentional attack surface there.
515 00:26:48,500 --> 00:26:50,500
An intentional attack surface.
516 00:26:50,500 --> 00:26:52,500
Introducing new attack vectors.
517 00:26:52,500 --> 00:26:54,500
That’s pretty awesome.
518 00:26:54,500 --> 00:26:58,500
We actually have an episode on odd attack vectors.
519 00:26:58,500 --> 00:27:02,500
It’s a really old program, so we should probably do a new one.
520 00:27:02,500 --> 00:27:03,500
An update.
521 00:27:03,500 --> 00:27:05,500
Yeah, I think there’s been a few more surfaces since we recorded that one.
522 00:27:05,500 --> 00:27:07,500
Definitely.
523 00:27:07,500 --> 00:27:11,500
Actually, I just felt that hacking cars is sort of getting old.
524 00:27:11,500 --> 00:27:13,500
Tanks?
525 00:27:13,500 --> 00:27:15,500
Tanks?
526 00:27:15,500 --> 00:27:17,500
You’re like the epitaph of retro hacker there.
527 00:27:17,500 --> 00:27:19,500
I mean, hacking cars is pretty cool, right?
528 00:27:19,500 --> 00:27:21,500
But that’s so like six months ago.
529 00:27:21,500 --> 00:27:23,500
So like August.
530 00:27:23,500 --> 00:27:27,500
I mean, I like the original when we hack submarines, right?
531 00:27:27,500 --> 00:27:28,500
That’s cool.
532 00:27:28,500 --> 00:27:30,500
I’m going for tanks next.
533 00:27:30,500 --> 00:27:33,500
And I think it’s funny, but when we saw Terminator 3
534 00:27:33,500 --> 00:27:36,500
we all thought that was a stupid sucky movie.
535 00:27:36,500 --> 00:27:38,500
You saw Terminator 3?
536 00:27:38,500 --> 00:27:44,500
Yeah, and it was so unrealistic with all the cars being hacked remotely
537 00:27:44,500 --> 00:27:46,500
and starting to attack the humans.
538 00:27:46,500 --> 00:27:50,500
And now it doesn’t feel so remote anymore.
539 00:27:50,500 --> 00:27:52,500
Well, yeah, life imitating art.
540 00:27:52,500 --> 00:27:54,500
Just look at Star Trek for that, right?
541 00:27:54,500 --> 00:27:57,500
We can blame Kirk and Picard.
542 00:27:57,500 --> 00:28:02,500
What’s that other movie where the trucks sort of attack humans?
543 00:28:02,500 --> 00:28:03,500
Transformers?
544 00:28:03,500 --> 00:28:05,500
Maximum Overdrive or something like that.
545 00:28:05,500 --> 00:28:08,500
I think it’s a Stephen King movie or something, right?
546 00:28:08,500 --> 00:28:10,500
Yeah.
547 00:28:10,500 --> 00:28:13,500
Yeah, I’m seeing that future.
548 00:28:13,500 --> 00:28:17,500
I think that book was written during Stephen King's cocaine years.
549 00:28:17,500 --> 00:28:18,500
Okay.
550 00:28:18,500 --> 00:28:19,500
So, anything else?
551 00:28:19,500 --> 00:28:21,500
If we’re looking like 10 years…
552 00:28:21,500 --> 00:28:22,500
Do we have a question?
553 00:28:22,500 --> 00:28:24,500
I think we have a question from the audience here.
554 00:28:24,500 --> 00:28:25,500
One question.
555 00:28:25,500 --> 00:28:28,500
When we look at the general trends the recent years
556 00:28:28,500 --> 00:28:31,500
we see that there’s been a shift towards…
557 00:28:31,500 --> 00:28:36,500
from general computing towards locked-down devices, iPads and stuff like that.
558 00:28:36,500 --> 00:28:41,500
And the sophistication level of how these devices are locked down is increasing all the time
559 00:28:41,500 --> 00:28:44,500
and it gets more difficult to break.
560 00:28:44,500 --> 00:28:47,500
There are HSM chips and everything.
561 00:28:47,500 --> 00:28:50,500
So, what do you think about the future in that sense?
562 00:28:50,500 --> 00:28:56,500
Will there be an iPad that won’t be jailbroken?
563 00:28:56,500 --> 00:28:59,500
And what are the consequences of this trend?
564 00:28:59,500 --> 00:29:01,500
Well, everything can get hacked.
565 00:29:01,500 --> 00:29:04,500
But as you say, it’s been more and more difficult
566 00:29:04,500 --> 00:29:07,500
to actually jailbreak an iDevice.
567 00:29:07,500 --> 00:29:10,500
And there are fewer and fewer exploit developers
568 00:29:10,500 --> 00:29:14,500
that have the competencies to be able to exploit those.
569 00:29:14,500 --> 00:29:17,500
I don’t think we’ll ever see an unhackable device
570 00:29:17,500 --> 00:29:20,500
but perhaps it will be that hard
571 00:29:20,500 --> 00:29:25,500
that there won’t be any public ways to jailbreak a device.
572 00:29:25,500 --> 00:29:27,500
It’s cost-benefit, right?
573 00:29:27,500 --> 00:29:28,500
Yeah, yeah.
574 00:29:28,500 --> 00:29:30,500
Nobody is interested in…
575 00:29:30,500 --> 00:29:33,500
Let’s say, nobody…
576 00:29:33,500 --> 00:29:37,500
No civilian will be interested in paying that much money
577 00:29:37,500 --> 00:29:39,500
to develop an exploit for…
578 00:29:39,500 --> 00:29:41,500
What I would say to this, right,
579 00:29:41,500 --> 00:29:44,500
is that remember that we used to have this kind of mantra
580 00:29:44,500 --> 00:29:47,500
that the only safe computer was a disconnected computer.
581 00:29:47,500 --> 00:29:49,500
Do you remember when we had this as a mindset?
582 00:29:49,500 --> 00:29:51,500
Air gap all the things.
583 00:29:51,500 --> 00:29:53,500
And then the fuckers went and fucked that too.
584 00:29:55,500 --> 00:29:59,500
So even now, an unplugged device is not necessarily so safe.
585 00:29:59,500 --> 00:30:01,500
Take the sound card out, right?
586 00:30:01,500 --> 00:30:02,500
That’s where we’re at next.
587 00:30:02,500 --> 00:30:04,500
And what will happen from there on inwards?
588 00:30:04,500 --> 00:30:07,500
So, yeah, I’m with you on the unhackable…
589 00:30:07,500 --> 00:30:10,500
The only thing that’s unhackable is unhackable, right?
590 00:30:10,500 --> 00:30:14,500
I mean, I kind of get that you can’t fix anything.
591 00:30:14,500 --> 00:30:17,500
But yeah, I mean, like I say, who would have thought 10 years ago
592 00:30:17,500 --> 00:30:20,500
that air-gap jumping not only would be there
593 00:30:20,500 --> 00:30:23,500
but it would be readily available to most people
594 00:30:23,500 --> 00:30:25,500
that can use Google?
595 00:30:25,500 --> 00:30:28,500
It’s mind-blowing in some ways.
596 00:30:28,500 --> 00:30:31,500
It’s about sort of malicious firmware in hard drives
597 00:30:31,500 --> 00:30:34,500
and malicious UEFI BIOS stuff.
598 00:30:34,500 --> 00:30:36,500
It scares the hell out of me.
599 00:30:36,500 --> 00:30:39,500
Yeah, it’s like you can’t trust hardware anymore.
600 00:30:39,500 --> 00:30:41,500
No, no, no, you have to build your computer yourself
601 00:30:41,500 --> 00:30:43,500
from transistors.
602 00:30:43,500 --> 00:30:45,500
Trust no one and nothing.
603 00:30:45,500 --> 00:30:50,500
And I think the most common TrustZone implementation
604 00:30:50,500 --> 00:30:58,500
is the QSEE operating system for ARM TrustZone.
605 00:30:58,500 --> 00:31:05,500
And there’s four presentations on people actually succeeding
606 00:31:05,500 --> 00:31:07,500
in hacking those.
607 00:31:07,500 --> 00:31:12,500
And my understanding is that once you actually get root
608 00:31:12,500 --> 00:31:16,500
on a Linux system and can start talking to the TrustZone
609 00:31:16,500 --> 00:31:18,500
and issue commands to it,
610 00:31:18,500 --> 00:31:23,500
the TrustZone operating system actually has fewer modern security features
611 00:31:23,500 --> 00:31:27,500
than the Linux system you already succeeded in attacking
612 00:31:27,500 --> 00:31:32,500
to get to the point where you can start attacking the TrustZone.
613 00:31:32,500 --> 00:31:36,500
And to get back to the question,
614 00:31:36,500 --> 00:31:38,500
I think that the development we will see
615 00:31:38,500 --> 00:31:45,500
is that more devices will get more efficient tools
616 00:31:45,500 --> 00:31:48,500
to make exploits harder.
617 00:31:48,500 --> 00:31:51,500
We will see the same development that we have seen with iDevices.
618 00:31:51,500 --> 00:31:53,500
We will see those in the Android field as well.
619 00:31:53,500 --> 00:31:56,500
And we will see those in the embedded field as well.
620 00:31:56,500 --> 00:31:58,500
So things will get better and better.
621 00:31:58,500 --> 00:32:01,500
But then we are so creative in this world
622 00:32:01,500 --> 00:32:04,500
that we always create new types of tools.
623 00:32:04,500 --> 00:32:08,500
And those will not have these fancy features.
624 00:32:08,500 --> 00:32:11,500
So they will in turn be exploitable instead.
625 00:32:11,500 --> 00:32:15,500
So even if the old stuff will be hard to hack,
626 00:32:15,500 --> 00:32:17,500
there will always be new gadgets that we can hack.
627 00:32:17,500 --> 00:32:21,500
The toasters, the fridges, the planes.
628 00:32:21,500 --> 00:32:23,500
And the tanks.
629 00:32:23,500 --> 00:32:25,500
The tanks.
630 00:32:25,500 --> 00:32:29,500
And as a counter to that,
631 00:32:29,500 --> 00:32:32,500
I think since the consumer market
632 00:32:32,500 --> 00:32:35,500
is pushing for more and more connected devices
633 00:32:35,500 --> 00:32:37,500
and more and more gadgets.
634 00:32:37,500 --> 00:32:40,500
We are talking about wearables and things like that.
635 00:32:40,500 --> 00:32:44,500
The marketing departments and the tech departments
636 00:32:44,500 --> 00:32:49,500
will have a huge drive to come out with new features.
637 00:32:49,500 --> 00:32:53,500
And those will be customer oriented features,
638 00:32:53,500 --> 00:32:55,500
not security features.
639 00:32:55,500 --> 00:32:59,500
So I think there is still going to be
640 00:32:59,500 --> 00:33:02,500
so much new technology coming in
641 00:33:02,500 --> 00:33:06,500
with the same vulnerabilities we looked at 10 years ago.
642 00:33:06,500 --> 00:33:08,500
That will be introduced.
643 00:33:08,500 --> 00:33:11,500
And they will become the pivot point that you,
644 00:33:11,500 --> 00:33:13,500
like you say the pivot point is the toaster,
645 00:33:13,500 --> 00:33:16,500
that you can use to get a foothold
646 00:33:16,500 --> 00:33:18,500
and then hack the secure systems.
647 00:33:18,500 --> 00:33:21,500
Because they will be connected via Bluetooth or Wi-Fi or whatever.
648 00:33:21,500 --> 00:33:23,500
So we need support.
649 00:33:23,500 --> 00:33:25,500
I have a question just over there.
650 00:33:25,500 --> 00:33:27,500
Yeah, it’s quite easy question I think.
651 00:33:27,500 --> 00:33:29,500
We’ll tell you afterwards.
652 00:33:29,500 --> 00:33:31,500
Yeah, if you could just talk a bit about it.
653 00:33:31,500 --> 00:33:33,500
How do you see risk and possibilities
654 00:33:33,500 --> 00:33:37,500
in creating botnets in the Internet of Things?
655 00:33:37,500 --> 00:33:39,500
Great.
656 00:33:41,500 --> 00:33:43,500
Depending on your viewpoint.
657 00:33:43,500 --> 00:33:47,500
Yeah, I’ve seen a couple of papers already like 2-3 years ago
658 00:33:47,500 --> 00:33:52,500
regarding really successful botnet creation from meshed
659 00:33:52,500 --> 00:33:55,500
Internet of Things devices.
660 00:33:55,500 --> 00:33:57,500
So it’s…
661 00:33:57,500 --> 00:34:00,500
Yeah, I would say there are great possibilities.
662 00:34:00,500 --> 00:34:02,500
The more Internet-connected things you have,
663 00:34:02,500 --> 00:34:04,500
the higher the likelihood, right?
664 00:34:04,500 --> 00:34:06,500
Yeah.
665 00:34:06,500 --> 00:34:08,500
I would say very likely.
666 00:34:08,500 --> 00:34:11,500
Wasn’t it the Internet Census paper or something
667 00:34:11,500 --> 00:34:14,500
where they claimed that the data was produced
668 00:34:14,500 --> 00:34:18,500
from hacked routers?
669 00:34:18,500 --> 00:34:20,500
So you already have this,
670 00:34:20,500 --> 00:34:22,500
that non-computer device,
671 00:34:22,500 --> 00:34:25,500
non-computer devices are being hacked
672 00:34:25,500 --> 00:34:27,500
at a pretty large scale.
673 00:34:27,500 --> 00:34:31,500
And they also said that they found other malware
674 00:34:31,500 --> 00:34:33,500
on some of the routers.
675 00:34:33,500 --> 00:34:36,500
So, I mean, it’s already happening.
676 00:34:36,500 --> 00:34:40,500
Let’s say Shordan is a really scary tool.
677 00:34:40,500 --> 00:34:43,500
Beautiful tool.
678 00:34:43,500 --> 00:34:46,500
So do we have any more questions
679 00:34:46,500 --> 00:34:48,500
from you lovely people in the audience?
680 00:34:48,500 --> 00:34:50,500
Don’t be shy.
681 00:34:50,500 --> 00:34:52,500
We’ll just point at you in a minute, right?
682 00:34:52,500 --> 00:34:53,500
Yeah.
683 00:34:53,500 --> 00:34:55,500
Are you still hungover from last night?
684 00:34:55,500 --> 00:34:58,500
Some of us on the stage are.
685 00:34:58,500 --> 00:35:00,500
A little bit.
686 00:35:00,500 --> 00:35:02,500
I’ve got a question for us all.
687 00:35:02,500 --> 00:35:04,500
We talked about this yesterday.
688 00:35:04,500 --> 00:35:07,500
Jester, hero or villain?
689 00:35:07,500 --> 00:35:09,500
It’s connected to the other discusses.
690 00:35:09,500 --> 00:35:12,500
The jester, hero or villain?
691 00:35:12,500 --> 00:35:13,500
So what do you say?
692 00:35:13,500 --> 00:35:15,500
Let’s go from left to right.
693 00:35:15,500 --> 00:35:18,500
I’m not sure, but he is entertaining.
694 00:35:18,500 --> 00:35:19,500
Once again?
695 00:35:19,500 --> 00:35:22,500
He’s entertaining at least.
696 00:35:22,500 --> 00:35:25,500
I don’t know, it’s really tough to answer actually
697 00:35:25,500 --> 00:35:27,500
because he’s doing some good stuff
698 00:35:27,500 --> 00:35:29,500
and some really weird stuff as well.
699 00:35:29,500 --> 00:35:32,500
So it’s not black and white, I’d say.
700 00:35:32,500 --> 00:35:35,500
It’s a lot of patriotism, isn’t it?
701 00:35:35,500 --> 00:35:37,500
It’s like bald eagles are flying.
702 00:35:37,500 --> 00:35:38,500
Yeah.
703 00:35:38,500 --> 00:35:40,500
It’s a lot of America.
704 00:35:40,500 --> 00:35:42,500
Alleged.
705 00:35:42,500 --> 00:35:44,500
I’m going to go for villain.
706 00:35:44,500 --> 00:35:46,500
I’m going to go for like the super villain
707 00:35:46,500 --> 00:35:48,500
but he’s actually quite a cool guy.
708 00:35:48,500 --> 00:35:50,500
Super villain like Bond villain?
709 00:35:50,500 --> 00:35:54,500
No, more like Doctor Evil, I guess.
710 00:35:54,500 --> 00:35:56,500
Secret Lair.
711 00:35:56,500 --> 00:35:58,500
Yeah, that kind of evil.
712 00:35:58,500 --> 00:36:00,500
I would say prankster
713 00:36:00,500 --> 00:36:02,500
because my understanding is that
714 00:36:02,500 --> 00:36:05,500
apparently he does some stuff
715 00:36:05,500 --> 00:36:07,500
but he has also been very efficient at
716 00:36:07,500 --> 00:36:11,500
looking out for when sites will go down
717 00:36:11,500 --> 00:36:14,500
for expiring DNS’s etc.
718 00:36:14,500 --> 00:36:16,500
and has published tango down
719 00:36:16,500 --> 00:36:17,500
the minute they go down.
720 00:36:17,500 --> 00:36:20,500
So he’s a very clever guy
721 00:36:20,500 --> 00:36:24,500
at creating PR for stuff he didn’t do.
722 00:36:24,500 --> 00:36:26,500
Although I’m sure someone will correct me
723 00:36:26,500 --> 00:36:29,500
and name quite a lot of things he has actually done.
724 00:36:29,500 --> 00:36:31,500
But some of it is just pranks
725 00:36:31,500 --> 00:36:33,500
where he takes credit for
726 00:36:33,500 --> 00:36:36,500
naturally occurring events on the internet.
727 00:36:36,500 --> 00:36:38,500
Yeah, I sort of agree with you, Peter.
728 00:36:38,500 --> 00:36:40,500
I think he’s an internet troll, mainly.
729 00:36:40,500 --> 00:36:42,500
He’s an amusing internet troll
730 00:36:42,500 --> 00:36:44,500
and especially when he attacks
731 00:36:44,500 --> 00:36:45,500
the Westboro Baptist Church.
732 00:36:45,500 --> 00:36:46,500
I really love him.
733 00:36:47,500 --> 00:36:50,500
Yeah, I can’t agree more.
734 00:36:50,500 --> 00:36:52,500
I would say an internet troll.
735 00:36:52,500 --> 00:36:55,500
So for me, it’s interesting because
736 00:36:55,500 --> 00:36:59,500
how we determine if he’s a hero or a villain
737 00:36:59,500 --> 00:37:01,500
is by the victims of his crimes, right?
738 00:37:01,500 --> 00:37:04,500
Those that are deserving and those that are not.
739 00:37:04,500 --> 00:37:07,500
So we’re all cool with the Baptist Church
740 00:37:07,500 --> 00:37:09,500
getting pwned, right?
741 00:37:09,500 --> 00:37:11,500
Because, hey, you’re a bunch of assholes.
742 00:37:11,500 --> 00:37:13,500
And then we’ve got no sympathy.
743 00:37:13,500 --> 00:37:15,500
But if we look at other targets,
744 00:37:15,500 --> 00:37:16,500
we can, you know,
745 00:37:16,500 --> 00:37:18,500
if you change the target face,
746 00:37:18,500 --> 00:37:20,500
we suddenly have different attachments to it.
747 00:37:20,500 --> 00:37:22,500
The problem is that he’s, you know,
748 00:37:22,500 --> 00:37:25,500
a judge, you’re an executioner himself.
749 00:37:25,500 --> 00:37:27,500
Kan’t buy versus Indians.
750 00:37:27,500 --> 00:37:29,500
And we’re really back to where we started
751 00:37:29,500 --> 00:37:32,500
with the morality and the background.
752 00:37:32,500 --> 00:37:34,500
It depends.
753 00:37:34,500 --> 00:37:36,500
It shouldn’t really, but yeah.
754 00:37:36,500 --> 00:37:39,500
You would think as a bunch of
755 00:37:39,500 --> 00:37:41,500
computer-orientated people
756 00:37:41,500 --> 00:37:43,500
we would be far more black and white
757 00:37:43,500 --> 00:37:44,500
and not so grey.
758 00:37:44,500 --> 00:37:45,500
But it is.
759 00:37:45,500 --> 00:37:47,500
For me it’s always,
760 00:37:47,500 --> 00:37:48,500
like an example,
761 00:37:48,500 --> 00:37:50,500
we hop back to the Chris Roberts stuff, right?
762 00:37:50,500 --> 00:37:53,500
So, you know, Chris Roberts passively,
763 00:37:53,500 --> 00:37:55,500
so let me explain it this way.
764 00:37:55,500 --> 00:37:58,500
Someone who passively intercepts packets on a network,
765 00:37:58,500 --> 00:38:01,500
someone who arbitrarily executes code
766 00:38:01,500 --> 00:38:03,500
on someone else’s property,
767 00:38:03,500 --> 00:38:05,500
which would we decide is the worst?
768 00:38:05,500 --> 00:38:07,500
In that argument being framed that way,
769 00:38:07,500 --> 00:38:10,500
we would say the person that executes code arbitrarily
770 00:38:10,500 --> 00:38:13,500
on someone else’s property without their permission.
771 00:38:13,500 --> 00:38:14,500
Now I tell you,
772 00:38:14,500 --> 00:38:17,500
the people that are having that code executed
773 00:38:17,500 --> 00:38:19,500
on their property are Wi-Fi pineapple,
774 00:38:19,500 --> 00:38:22,500
the pineapple users and their skiddies.
775 00:38:22,500 --> 00:38:24,500
And then all of a sudden people are like,
776 00:38:24,500 --> 00:38:25,500
oh, that’s so funny.
777 00:38:25,500 --> 00:38:27,500
It’s Lee, it’s Lowell, blah, blah, blah, blah, blah.
778 00:38:27,500 --> 00:38:30,500
So the minute that we change the face of the victim
779 00:38:30,500 --> 00:38:33,500
is the minute that we change the crime.
780 00:38:33,500 --> 00:38:35,500
It’s really interesting to kind of look at it
781 00:38:35,500 --> 00:38:36,500
in that context.
782 00:38:36,500 --> 00:38:39,500
And, you know, kind of like the hacking back,
783 00:38:39,500 --> 00:38:42,500
I mean, is it okay to hack back?
784 00:38:42,500 --> 00:38:43,500
That’s very interesting.
785 00:38:43,500 --> 00:38:45,500
That’s, again, you know,
786 00:38:45,500 --> 00:38:48,500
who’s on the other side of that system?
787 00:38:48,500 --> 00:38:51,500
Yeah, I mean, if you’re throwing packets in vain,
788 00:38:51,500 --> 00:38:53,500
we’d like to think if you’re throwing packets in vain,
789 00:38:53,500 --> 00:38:55,500
then all’s fair and good, right?
790 00:38:55,500 --> 00:38:57,500
I mean, if you haven’t thrown a packet in vain,
791 00:38:57,500 --> 00:38:58,500
then you’re not hacking.
792 00:38:58,500 --> 00:39:01,500
But, yeah, I mean, I’d agree.
793 00:39:01,500 --> 00:39:02,500
And let’s say you’re hacking back,
794 00:39:02,500 --> 00:39:05,500
you could actually be attacking the police.
795 00:39:05,500 --> 00:39:06,500
Yes.
796 00:39:06,500 --> 00:39:08,500
And with that, I think we’re done.
797 00:39:08,500 --> 00:39:10,500
Yeah, it’s time to wrap up, I think.
798 00:39:10,500 --> 00:39:11,500
It’s time for a wrap.
799 00:39:11,500 --> 00:39:12,500
You go for that?
800 00:39:12,500 --> 00:39:13,500
Yeah.
801 00:39:13,500 --> 00:39:15,500
Well, it’s been really great
802 00:39:15,500 --> 00:39:17,500
standing here talking in front of you all,
803 00:39:17,500 --> 00:39:19,500
even though I am terribly hungover.
804 00:39:19,500 --> 00:39:21,500
I hope you guys enjoyed it
805 00:39:21,500 --> 00:39:24,500
as much as we did on stage, I’m sure.
806 00:39:24,500 --> 00:39:27,500
And just to let you know,
807 00:39:27,500 --> 00:39:29,500
we do have some t-shirts up here on stage,
808 00:39:29,500 --> 00:39:30,500
so if you want them,
809 00:39:30,500 --> 00:39:32,500
I mean, there’s not too many left,
810 00:39:32,500 --> 00:39:34,500
but run up here and grab one,
811 00:39:34,500 --> 00:39:36,500
you know, first come, first served.
812 00:39:36,500 --> 00:39:37,500
Yeah.
813 00:39:37,500 --> 00:39:40,500
And anything else, guys, we want to add?
814 00:39:40,500 --> 00:39:41,500
All right.
815 00:39:41,500 --> 00:39:43,500
I said this yesterday, but, like,
816 00:39:43,500 --> 00:39:44,500
you guys have been cool,
817 00:39:44,500 --> 00:39:46,500
but a big round of applause for the Sec-T
818 00:39:46,500 --> 00:39:48,500
volunteers for doing such a great job as well.
819 00:39:48,500 --> 00:39:49,500
Oh, yeah, really.
820 00:39:51,500 --> 00:39:52,500
Bye, guys.
821 00:39:54,500 --> 00:39:55,500
All right, see you guys later.
822 00:39:55,500 --> 00:39:56,500
Thanks.
823 00:39:56,500 --> 00:39:57,500
Thanks.
824 00:39:57,500 --> 00:39:59,500
All right, thank you very much.