Säkerhetspodcasten #209 - Sec-T 2021 Part 1
Listen
Contents
This is part one of a number of interview episodes recorded during the Sec-T security conference in Stockholm 2021. This episode contains interviews with Edwin van Andel, Fabio Viggiani & Fredrik Alexandersson, and Lars Haulin.
Recorded: 2021-09-10. Length: 00:58:59.
AI transcription
The AI does its best to understand us… Please bear with the occasional wild mistranscription.
1 00:00:00,000 --> 00:00:05,840
Okay, so we're sitting here together with Edwin van Andel from Zerocopter, who just came off the stage.
2 00:00:05,840 --> 00:00:09,760
This is Säkerhetspodcasten broadcasting and transmitting from Sec-T 2021.
3 00:00:09,760 --> 00:00:16,280
It's a great pleasure to have you here, sitting together with my older colleague Peter.
4 00:00:16,280 --> 00:00:22,520
Yes, thank you. That was the start-up session of Sec-T this year.
5 00:00:22,520 --> 00:00:25,800
This year and last year, so this is a two-year start-up.
6 00:00:25,800 --> 00:00:31,520
So it's a tough crowd, but I think you managed to get them off to a good start.
7 00:00:31,520 --> 00:00:32,840
Thanks for doing that.
8 00:00:32,840 --> 00:00:37,120
I always hope that my talks are a bit funny on one side,
9 00:00:37,120 --> 00:00:40,600
to keep the audience interactive and attentive and so on.
10 00:00:40,600 --> 00:00:46,000
But when they walk away, they have something they say they might have to change.
11 00:00:46,000 --> 00:00:49,200
That's basically the idea of my talks.
12 00:00:49,200 --> 00:00:55,120
Yes, and I think you had some really good stories to start bringing up those thoughts.
13 00:00:55,120 --> 00:00:55,600
Yes.
14 00:00:55,600 --> 00:01:02,240
I mean, regarding the bugs that come back over and over again, as you say.
15 00:01:02,240 --> 00:01:08,520
But as I understand it, you're part of a vulnerability disclosure setup in the Netherlands, right?
16 00:01:08,520 --> 00:01:13,040
Yes, correct. We work outside the government, so not the government.
17 00:01:13,040 --> 00:01:16,360
So we get to do a bit more than the government can do.
18 00:01:16,360 --> 00:01:18,720
And we try to keep the Netherlands a bit safe.
19 00:01:18,720 --> 00:01:24,040
So we scan, and we try to help kids who might be going down the wrong path.
20 00:01:24,040 --> 00:01:25,600
For example, we had one kid recently.
21 00:01:25,600 --> 00:01:35,080
He found a database with a lot of credit card information, and he talked about it with his friends, but he knew he couldn't use it because he would get arrested.
22 00:01:35,080 --> 00:01:39,960
And then a criminal comes along and offers him around 3,000 or 4,000 euros to buy the database.
23 00:01:39,960 --> 00:01:48,560
And before he knows it, he's inside a criminal organization, and we try, as grumpy old hackers, the elders, to say: okay, don't do that.
24 00:01:48,560 --> 00:01:53,160
Just disclose it to the company, we'll help you, we'll make sure you don't get arrested.
25 00:01:53,160 --> 00:01:55,440
Instead you basically get praised for helping.
26 00:01:55,600 --> 00:02:01,360
We help the company get better, and that's basically the idea we try to spread in the Netherlands.
27 00:02:01,360 --> 00:02:07,880
So it's a bit about finding the youngsters before things have fully developed?
28 00:02:07,880 --> 00:02:09,160
Yes, that's correct.
29 00:02:09,160 --> 00:02:11,440
And we also work with the Dutch police.
30 00:02:11,440 --> 00:02:23,120
They have a program called Hack_Right, where if someone under 21 is arrested for their first cybercrime-related incident, they're not put on trial but sent to us.
31 00:02:23,120 --> 00:02:38,600
And we teach them and show them how to do responsible disclosure, and we mentor them a bit, and we try to get them not to do it, to try to help make the world better, and it works, so it's very nice.
32 00:02:38,600 --> 00:02:49,080
That's a fantastic setup; it's maybe something other countries should look at too, finding them and helping them on their way, like a correctional institution, but in the right way.
33 00:02:49,080 --> 00:02:51,080
Yes, but it's also logical.
34 00:02:51,080 --> 00:02:52,480
In the Netherlands, for example.
35 00:02:52,480 --> 00:03:08,560
If you steal something from a store, your parents get involved, maybe the police, the local police get involved, the store owner gets involved, the school gives you a lecture, so you have four or five different people correcting you.
36 00:03:08,560 --> 00:03:16,080
But if you're in your room and you don't have parents looking over you or whatever, you're free, you don't get corrected.
37 00:03:16,080 --> 00:03:22,240
And I think that's kind of needed, and parents always ask me: what can I do, my kid is always up in his room at the computer.
38 00:03:22,240 --> 00:03:28,880
Should I install all this software to monitor him? And I say: if he's a good hacker, it doesn't matter, because he'll bypass it.
39 00:03:28,880 --> 00:03:31,520
The best thing you can do is teach them ethics.
40 00:03:31,520 --> 00:03:44,160
So tell them it's okay if you find something, it's okay if you want to hack, but if you do, report it to the company, be a good person, do it ethically, and then you're even known in the community.
41 00:03:44,160 --> 00:03:48,160
And you don't get arrested, you don't have to look over your shoulder. That's the only thing you can do.
42 00:03:48,160 --> 00:03:52,120
I know that one of our podcast colleagues is part of this.
45 00:03:52,120 --> 00:04:01,860
And this is a bit like Growthrack; they made a video that went to Swedish Television in Sweden, where they covered a bit of this.
46 00:04:01,860 --> 00:04:24,160
So, people like us, we might know
60 00:04:24,160 --> 00:04:25,940
about IT security and
61 00:04:25,940 --> 00:04:28,160
IT-security-related
62 00:04:28,160 --> 00:04:29,500
moral dilemmas, but
63 00:04:29,500 --> 00:04:31,800
I think that if we
64 00:04:31,800 --> 00:04:33,980
pick a generic parent, then maybe
65 00:04:33,980 --> 00:04:36,260
they're not familiar with
66 00:04:36,260 --> 00:04:37,860
IT security and crime and
67 00:04:37,860 --> 00:04:39,980
things like that. Yes, that's right. And also
68 00:04:39,980 --> 00:04:42,000
the kids themselves, they see it as a game.
69 00:04:42,300 --> 00:04:43,740
You know, they play games
70 00:04:43,740 --> 00:04:45,980
and they can do something, and then there's the internet
71 00:04:45,980 --> 00:04:47,980
which is also on the screen, so you can do
72 00:04:47,980 --> 00:04:50,040
something, and that's where it goes wrong.
73 00:04:50,040 --> 00:04:52,220
And I would suggest
74 00:04:52,220 --> 00:04:54,040
trying to educate the parents
75 00:04:54,040 --> 00:04:56,220
and not so much all the technical
76 00:04:56,220 --> 00:04:57,960
stuff, but more on the moral
77 00:04:57,960 --> 00:04:59,260
side, the ethical side.
78 00:04:59,900 --> 00:05:02,080
Try to do the same thing as
79 00:05:02,080 --> 00:05:03,740
you would do in normal life.
80 00:05:03,900 --> 00:05:06,060
If you find somebody’s keys outside the
81 00:05:06,060 --> 00:05:08,240
door, don’t steal them, but ring the door
82 00:05:08,240 --> 00:05:10,060
and give
83 00:05:10,060 --> 00:05:12,220
them back. And if you do that on the internet
84 00:05:12,220 --> 00:05:14,080
in the same way, then it’s all fine.
85 00:05:14,320 --> 00:05:15,820
But you also need to educate the
86 00:05:15,820 --> 00:05:17,840
companies, right? Oh, yes.
87 00:05:17,840 --> 00:05:19,780
I mean, because they will freak out
88 00:05:19,780 --> 00:05:21,900
when someone tells them, oh, you have a
89 00:05:21,900 --> 00:05:23,280
vulnerability, and okay, we’ll
90 00:05:23,280 --> 00:05:26,000
pin you down, right? So the vulnerability disclosure
91 00:05:26,000 --> 00:05:27,900
setup needs to
92 00:05:27,900 --> 00:05:29,300
be taught also to the
93 00:05:29,300 --> 00:05:32,200
kids at the companies.
94 00:05:32,520 --> 00:05:33,880
Yeah, well, that’s true. We have
95 00:05:33,880 --> 00:05:35,820
a lot of companies that come to us and
96 00:05:35,820 --> 00:05:37,780
want to join, and if we ask
97 00:05:37,780 --> 00:05:39,800
them about what is your current status
98 00:05:39,800 --> 00:05:41,820
in security, and they haven’t done anything,
99 00:05:41,940 --> 00:05:43,340
we don’t accept them as a client,
100 00:05:43,580 --> 00:05:45,800
because it’s unworkable. You have to
101 00:05:45,800 --> 00:05:47,880
have processes in place, and what we
102 00:05:47,880 --> 00:05:49,760
do is we filter in between, so that
103 00:05:49,780 --> 00:05:51,840
means that you don’t get all the bullshit, but
104 00:05:51,840 --> 00:05:53,920
only the serious things go to the company,
105 00:05:54,360 --> 00:05:55,640
but still, you have to
106 00:05:55,640 --> 00:05:57,400
discuss with the researcher,
107 00:05:57,780 --> 00:05:59,860
because, you know, they're proud of
108 00:05:59,860 --> 00:06:01,620
what they found, so they want interaction
109 00:06:01,620 --> 00:06:03,780
with you. And if you can do that, and
110 00:06:03,780 --> 00:06:05,880
if you can also fix it within, like, say
111 00:06:05,880 --> 00:06:07,880
60 or 90 days, according
112 00:06:07,880 --> 00:06:09,440
to standards, then it’s fine.
113 00:06:09,820 --> 00:06:11,640
But you have to have that process in place.
114 00:06:11,680 --> 00:06:14,020
If you don’t, then, yeah, you get a lot of
115 00:06:14,020 --> 00:06:15,760
miscommunication, and
116 00:06:15,760 --> 00:06:17,500
miscommunication leads to problems.
117 00:06:17,780 --> 00:06:19,700
So you could have, like, a group
118 00:06:19,700 --> 00:06:21,600
of hang-around companies that
119 00:06:21,600 --> 00:06:23,780
want to join the disclosure
120 00:06:23,780 --> 00:06:25,800
program, but they’re not there yet.
121 00:06:25,820 --> 00:06:27,580
They’re not there yet, no, and they have to be
122 00:06:27,580 --> 00:06:29,820
educated and taught, and often
123 00:06:29,820 --> 00:06:31,660
it’s also within a company there are a few
124 00:06:31,660 --> 00:06:33,200
people who want to work with us,
125 00:06:33,440 --> 00:06:35,660
and then it goes up the chain, you know, it goes
126 00:06:35,660 --> 00:06:37,600
to procurement, and to legal,
127 00:06:37,800 --> 00:06:39,720
and to the board, and they say, hackers?
128 00:06:40,140 --> 00:06:41,280
No way! You know?
129 00:06:41,600 --> 00:06:43,620
Yeah, but you’re hacked already, you know, with
130 00:06:43,620 --> 00:06:45,660
just the same stuff as is happening
131 00:06:45,660 --> 00:06:47,660
already, only now you know it, instead
132 00:06:47,660 --> 00:06:49,560
of that it’s criminally used,
133 00:06:49,700 --> 00:06:51,560
or whatever, but still, you know,
134 00:06:51,680 --> 00:06:53,540
and I think it’s also the term
135 00:06:53,540 --> 00:06:55,260
hacker that’s still too negative.
136 00:06:55,860 --> 00:06:57,140
I mean, if you look at the media,
137 00:06:57,700 --> 00:06:59,520
somebody hacked, a hacker did this,
138 00:06:59,580 --> 00:07:01,640
it’s all bad, you know, and for us, hacking
139 00:07:01,640 --> 00:07:03,540
is our pride, you know, so
140 00:07:03,540 --> 00:07:05,780
it’s always difficult. If a hacker
141 00:07:05,780 --> 00:07:07,560
would be positive, then I think
142 00:07:07,560 --> 00:07:09,740
the rest of the world will follow along
143 00:07:09,740 --> 00:07:11,020
a bit, I hope.
144 00:07:11,760 --> 00:07:13,900
So, talking about hacking and hackers,
145 00:07:14,520 --> 00:07:15,660
your name, Edwin,
146 00:07:16,160 --> 00:07:17,740
it comes
147 00:07:17,740 --> 00:07:19,660
up, some persons might
148 00:07:19,700 --> 00:07:21,340
have heard the Darknet Diary
149 00:07:21,340 --> 00:07:23,180
podcast, where you
150 00:07:23,180 --> 00:07:25,520
disclosed some details
151 00:07:25,520 --> 00:07:27,660
about the hacking of the Trump
152 00:07:27,660 --> 00:07:29,160
Twitter account, right?
153 00:07:29,340 --> 00:07:31,680
Well, it wasn’t hacking. No, no, good
154 00:07:31,680 --> 00:07:33,540
guessing. Yeah, good guessing, yeah.
155 00:07:33,860 --> 00:07:35,300
No, not good guessing, there was,
156 00:07:35,640 --> 00:07:37,400
we were at BruCON, a conference
157 00:07:37,400 --> 00:07:39,260
in 2016, and
158 00:07:39,260 --> 00:07:41,700
the databases leak all
159 00:07:41,700 --> 00:07:43,380
the time, and people crack databases,
160 00:07:43,620 --> 00:07:45,520
and LinkedIn was hacked in 2012,
161 00:07:46,220 --> 00:07:47,720
and around that time
162 00:07:47,720 --> 00:07:49,580
the database came publicly available, so
163 00:07:49,580 --> 00:07:51,140
we looked at it, and we were just
164 00:07:51,140 --> 00:07:53,680
in our hotel room, because it came out
165 00:07:53,680 --> 00:07:55,820
that day, warning Dutch government officials
166 00:07:55,820 --> 00:07:57,500
like, your password is in here,
167 00:07:57,680 --> 00:07:59,660
and etc, etc, so please fix it,
168 00:08:00,120 --> 00:08:01,780
and then Trump was on TV,
169 00:08:01,780 --> 00:08:03,660
and then one of my colleagues said, oh,
170 00:08:03,760 --> 00:08:05,840
I wonder if he would be in there, and we looked,
171 00:08:05,960 --> 00:08:07,720
and he was in there with a password, which
172 00:08:07,720 --> 00:08:09,780
was like, okay, it’s your TV
173 00:08:09,780 --> 00:08:11,800
show, it was "you're fired", you know?
174 00:08:12,260 --> 00:08:13,800
So, and I had a little
175 00:08:13,800 --> 00:08:15,360
bit too much beer,
176 00:08:15,860 --> 00:08:17,200
so I was at my laptop,
177 00:08:17,820 --> 00:08:19,500
and somebody said,
178 00:08:19,580 --> 00:08:21,520
I wonder if it would work on Twitter, and I
179 00:08:21,520 --> 00:08:23,060
just entered his account.
180 00:08:23,060 --> 00:08:25,660
So you were the one with the fingers at the keyboard, right?
181 00:08:25,800 --> 00:08:27,460
And I pressed enter, and it was,
182 00:08:27,600 --> 00:08:29,640
we weren’t in yet, but we, yeah,
183 00:08:29,820 --> 00:08:31,600
the password was correct, so
184 00:08:31,600 --> 00:08:33,380
then it was like, uh-oh, you know?
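[A side note for the curious: the defensive counterpart of this story is checking whether a password already circulates in public breach dumps. Below is a minimal sketch, our illustration rather than the method used in the story, using the public Pwned Passwords k-anonymity range API.]

```python
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """How many times a password appears in known breach corpora,
    via the k-anonymity range API: only the first five hex chars
    of the SHA-1 hash ever leave the machine."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        # Each response line has the form "HASH_SUFFIX:COUNT"
        for line in resp.read().decode("utf-8").splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

if __name__ == "__main__":
    # e.g. a catchphrase-style password like the one in the story
    print(breach_count("yourefired"))
```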
185 00:08:33,720 --> 00:08:35,620
And it was two weeks before election, so
186 00:08:35,620 --> 00:08:37,660
what do we do? If we stop now,
187 00:08:37,760 --> 00:08:39,600
then, yeah, it all traces back to us,
188 00:08:39,600 --> 00:08:41,800
and what if somebody else already
189 00:08:41,800 --> 00:08:43,620
knows this, and now blames us
190 00:08:43,620 --> 00:08:45,360
for everything that might happen, so
191 00:08:45,360 --> 00:08:47,660
we went on, tried to get in fully,
192 00:08:48,160 --> 00:08:49,140
took screenshots,
193 00:08:49,580 --> 00:08:51,460
made timelines, and then did responsible
194 00:08:51,460 --> 00:08:52,780
disclosure to the Trump
195 00:08:52,780 --> 00:08:55,420
administration, and the American
196 00:08:55,420 --> 00:08:57,620
government, etc., and nobody
197 00:08:57,620 --> 00:08:59,480
responded, so then we tried
198 00:08:59,480 --> 00:09:01,400
the Dutch government, they have liaisons
199 00:09:01,400 --> 00:09:03,080
in the US, so they picked it up,
200 00:09:03,520 --> 00:09:05,420
and in the end, we got, like, two weeks
201 00:09:05,420 --> 00:09:07,060
later, like, it’s fixed, and
202 00:09:07,060 --> 00:09:09,240
that was it. But we were scared
203 00:09:09,240 --> 00:09:11,220
to go to the US after that, because
204 00:09:11,220 --> 00:09:13,400
Trump isn’t the most friendly guy.
205 00:09:13,860 --> 00:09:15,340
And you could easily become one
206 00:09:15,340 --> 00:09:17,520
of the persons that get randomly selected
207 00:09:17,520 --> 00:09:19,260
at the airport, right? Yeah, randomly selected,
208 00:09:19,260 --> 00:09:21,040
put in an orange jumpsuit, and
209 00:09:21,040 --> 00:09:23,180
bye, you know, so that was
210 00:09:23,180 --> 00:09:25,260
scared. But then there was a second time, right?
211 00:09:25,360 --> 00:09:27,140
Yeah, yeah. And there
212 00:09:27,140 --> 00:09:29,360
was a lot of doubt in the community
213 00:09:29,360 --> 00:09:31,020
whether it was really, like,
214 00:09:31,100 --> 00:09:32,840
fake news, or if it was actually
215 00:09:32,840 --> 00:09:34,460
a correct
216 00:09:34,460 --> 00:09:37,320
entry. I know it was correct
217 00:09:37,320 --> 00:09:38,820
because we did it, but
218 00:09:38,820 --> 00:09:41,160
it’s also stupid that it happened
219 00:09:41,160 --> 00:09:42,600
the second time. But
220 00:09:42,600 --> 00:09:45,100
for us, the bigger problems were
221 00:09:45,100 --> 00:09:47,080
first of all, you should have had
222 00:09:47,080 --> 00:09:49,160
two-factor authentication, which was off
223 00:09:49,160 --> 00:09:51,100
again, probably because
224 00:09:51,100 --> 00:09:53,060
of the campaign team also using
225 00:09:53,060 --> 00:09:55,160
the same account. So that’s why
226 00:09:55,160 --> 00:09:57,200
they put one password on it and leave
227 00:09:57,200 --> 00:09:59,240
it that way. But the biggest thing
228 00:09:59,240 --> 00:10:01,300
was that Twitter said that they couldn’t
229 00:10:01,300 --> 00:10:03,180
find anything in their logs. And
230 00:10:03,180 --> 00:10:05,160
that was something that the community
231 00:10:05,160 --> 00:10:06,700
said, oh, but then it’s fake.
232 00:10:07,480 --> 00:10:09,320
But we, well,
233 00:10:09,460 --> 00:10:11,340
not heard officially, but
234 00:10:11,340 --> 00:10:13,260
the idea now is,
235 00:10:13,380 --> 00:10:15,340
let’s put it that way, that Twitter
236 00:10:15,340 --> 00:10:16,980
wasn’t allowed to
237 00:10:16,980 --> 00:10:19,040
log anything on that account, because
238 00:10:19,040 --> 00:10:20,980
if you can see the location of the
239 00:10:20,980 --> 00:10:22,900
person, and you can see all the other
240 00:10:22,900 --> 00:10:24,420
things, so Twitter
241 00:10:24,420 --> 00:10:26,860
couldn’t log it. So that’s
242 00:10:26,860 --> 00:10:28,780
why they didn’t see it. But the second
243 00:10:28,780 --> 00:10:31,060
time, was it basically the same thing
244 00:10:31,060 --> 00:10:32,780
or was it something different? It was
245 00:10:32,780 --> 00:10:34,720
the password was different.
246 00:10:36,180 --> 00:10:37,260
Exclamation mark, right?
247 00:10:37,600 --> 00:10:38,920
Well, no, it was a
248 00:10:38,920 --> 00:10:40,220
different password. It was
249 00:10:40,220 --> 00:10:42,700
make America great again, something, something,
250 00:10:42,880 --> 00:10:44,800
exclamation mark. And it was the
251 00:10:44,800 --> 00:10:46,740
same password which was used for the
252 00:10:46,740 --> 00:10:48,780
Wi-Fi, open Wi-Fi in the rallies.
253 00:10:49,040 --> 00:10:51,100
So, it was,
254 00:10:51,160 --> 00:10:52,660
yeah, it was way too easy again.
255 00:10:52,880 --> 00:10:54,700
But wasn’t this also that you entered,
256 00:10:54,980 --> 00:10:56,880
like, you were logging in from
257 00:10:56,880 --> 00:10:58,940
a different IP address? Yeah. So you had
258 00:10:58,940 --> 00:11:00,960
to, like, a certain number? Yeah, we had to
259 00:11:00,960 --> 00:11:02,940
take a proxy in the US, close to
260 00:11:02,940 --> 00:11:04,940
where it was. That was one. And
261 00:11:04,940 --> 00:11:06,440
the email address validation
262 00:11:06,440 --> 00:11:08,760
had also, we also have to do,
263 00:11:09,300 --> 00:11:11,040
and it wasn’t Trump, but it was
264 00:11:11,040 --> 00:11:13,060
twitter@trump.com or something.
265 00:11:13,060 --> 00:11:14,940
So it was very easy. But the
266 00:11:14,940 --> 00:11:16,960
problem was, if we wouldn’t have done it
267 00:11:16,960 --> 00:11:18,980
and somebody else misused it, published
268 00:11:18,980 --> 00:11:21,080
all the DMs or whatever, then
269 00:11:21,080 --> 00:11:23,180
it might lead back to us. So that’s
270 00:11:23,180 --> 00:11:24,300
why we had to go.
271 00:11:25,420 --> 00:11:27,240
Let’s show, okay, we can really
272 00:11:27,240 --> 00:11:29,240
get in. We can really do anything.
273 00:11:29,460 --> 00:11:31,100
Here’s all our proof, all our screenshots.
274 00:11:31,580 --> 00:11:33,140
These were the times that we did it.
275 00:11:33,200 --> 00:11:34,400
This was the IP address.
276 00:11:35,040 --> 00:11:36,780
Please fix this. Blah, blah, blah.
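[To make "all our proof" concrete: a sketch of what such a disclosure evidence bundle might look like as structured data. Every name and value below is a hypothetical placeholder, not the actual report described here.]

```python
# Hypothetical responsible-disclosure evidence bundle; all values are
# placeholders, not the real report from the interview.
report = {
    "summary":   "Account accessible with a password from a public breach dump",
    "timeline":  ["22:41 login form submitted", "22:43 access confirmed, logged out"],
    "source_ip": "192.0.2.10",  # RFC 5737 documentation address
    "evidence":  ["screenshot-01.png", "screenshot-02.png"],
    "advice":    "Enable 2FA, rotate the password, audit recent sessions",
}

for field, value in report.items():
    print(f"{field:>10}: {value}")
```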
277 00:11:37,080 --> 00:11:39,260
But this is kind of a classic example
278 00:11:39,260 --> 00:11:40,420
of where
279 00:11:40,420 --> 00:11:43,180
it’s not black and white.
280 00:11:43,640 --> 00:11:44,320
Like, you
281 00:11:44,320 --> 00:11:46,440
can’t say this is
282 00:11:46,440 --> 00:11:48,920
100% fine, no issue at all.
283 00:11:48,980 --> 00:11:50,680
And you can’t say
284 00:11:50,680 --> 00:11:54,840
this is really, really bad.
285 00:11:54,940 --> 00:11:56,500
This would be obviously criminal.
286 00:11:57,180 --> 00:11:59,160
Because when we go back
287 00:11:59,160 --> 00:12:01,200
to ethics discussion, which we had before,
288 00:12:01,820 --> 00:12:03,120
like, often
289 00:12:03,120 --> 00:12:03,980
things are
290 00:12:03,980 --> 00:12:07,000
gray. Yeah, they’re gray.
291 00:12:07,520 --> 00:12:09,260
If nothing
292 00:12:09,260 --> 00:12:10,420
you are doing
293 00:12:10,420 --> 00:12:13,280
makes you ask
294 00:12:13,280 --> 00:12:15,480
any ethical questions,
295 00:12:16,220 --> 00:12:17,000
maybe either
296 00:12:17,000 --> 00:12:18,960
you’re not doing
297 00:12:18,980 --> 00:12:20,920
anything interesting or you
298 00:12:20,920 --> 00:12:22,940
have a very weird sense of morals
299 00:12:22,940 --> 00:12:24,600
or ethics. It’s true.
300 00:12:24,740 --> 00:12:26,460
But it’s also true for our researchers
301 00:12:26,460 --> 00:12:28,700
because a lot of times clients have indeed
302 00:12:28,700 --> 00:12:30,860
scopes, but then sometimes you
303 00:12:30,860 --> 00:12:32,700
think, well, I think there’s a problem
304 00:12:32,700 --> 00:12:34,780
here, but I have to go basically
305 00:12:34,780 --> 00:12:36,980
too far, but then I can show
306 00:12:36,980 --> 00:12:38,940
it to you. You know, what do you want?
307 00:12:39,060 --> 00:12:40,960
And with our clients, we can ask
308 00:12:40,960 --> 00:12:42,760
but in responsible
309 00:12:42,760 --> 00:12:44,940
disclosure or whatever, you basically can’t ask.
310 00:12:45,260 --> 00:12:46,900
So then it’s up to you. Are you going to
311 00:12:46,900 --> 00:12:48,940
do it? And then in the Netherlands, we have
312 00:12:48,940 --> 00:12:51,000
a law and all the
313 00:12:51,000 --> 00:12:53,140
judges have it. And it’s basically
314 00:12:53,140 --> 00:12:55,140
said that if you break into a system
315 00:12:55,140 --> 00:12:56,900
and you can download all client
316 00:12:56,900 --> 00:12:59,020
data, don’t do it. Just
317 00:12:59,020 --> 00:13:01,040
download two records to show that
318 00:13:01,040 --> 00:13:03,120
you can. So you are allowed to break
319 00:13:03,120 --> 00:13:04,920
in. You are allowed to download
320 00:13:04,920 --> 00:13:06,980
but only two records to
321 00:13:06,980 --> 00:13:09,020
show that you can. And then you won’t
322 00:13:09,020 --> 00:13:11,040
get arrested. All the judges have
323 00:13:11,040 --> 00:13:12,880
that thing. And that’s really awesome.
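[To make the "two records" guideline concrete, here is a self-contained sketch (toy in-memory database, hypothetical table name): prove impact with the smallest possible sample instead of dumping everything.]

```python
import sqlite3

# Toy stand-in for an exposed database, so the example runs on its own.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(i, f"user{i}@example.com") for i in range(10_000)])

# The guideline: LIMIT 2 is enough to show the data is reachable.
sample = conn.execute("SELECT id, email FROM customers LIMIT 2").fetchall()
print(f"Proof-of-concept sample ({len(sample)} records, nothing more): {sample}")
```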
324 00:13:14,100 --> 00:13:17,000
So you say it’s a law, like
325 00:13:17,000 --> 00:13:18,880
is it written in a law or is it…
326 00:13:18,940 --> 00:13:20,920
Is it a clarified practice from
327 00:13:20,920 --> 00:13:23,000
historical precedents? There is a law
328 00:13:23,000 --> 00:13:25,040
which says that responsible disclosure
329 00:13:25,040 --> 00:13:26,160
is allowed in the Netherlands.
330 00:13:26,800 --> 00:13:29,060
And all the judges have the rulings
331 00:13:29,060 --> 00:13:30,400
about it near their desk.
332 00:13:30,840 --> 00:13:33,020
So that basically means that if you do something
333 00:13:33,020 --> 00:13:34,880
in the Netherlands as a hacker and a company
334 00:13:34,880 --> 00:13:36,960
wants to sue you, then
335 00:13:36,960 --> 00:13:38,980
the judge looks at it and says: if you
336 00:13:38,980 --> 00:13:41,120
apply to, you didn’t publish
337 00:13:41,120 --> 00:13:42,780
it, you didn’t download too much,
338 00:13:43,160 --> 00:13:44,900
you kept to all the rules, you announced
339 00:13:44,900 --> 00:13:46,880
it, you did blah, blah, blah, we’re not
340 00:13:46,880 --> 00:13:48,900
going to prosecute. And that’s
341 00:13:48,940 --> 00:13:50,840
awesome. Because it pushes companies
342 00:13:50,840 --> 00:13:53,160
to implement responsible
343 00:13:53,160 --> 00:13:54,960
disclosure. But going into that,
344 00:13:55,060 --> 00:13:56,980
if we look at the Kaseya situation that you
345 00:13:56,980 --> 00:13:58,060
also covered on your talk,
346 00:13:58,600 --> 00:14:00,700
there were findings, right?
347 00:14:00,960 --> 00:14:03,020
That were reported
348 00:14:03,020 --> 00:14:04,560
that were actually
349 00:14:04,560 --> 00:14:06,580
being used in the attack.
350 00:14:06,700 --> 00:14:06,920
Correct.
351 00:14:07,660 --> 00:14:10,940
So was it
352 00:14:10,940 --> 00:14:13,200
after the disclosure was out?
353 00:14:13,500 --> 00:14:14,900
No. So it was before this.
354 00:14:15,040 --> 00:14:17,120
So there could be ideas
355 00:14:17,120 --> 00:14:18,780
or conspiracies that
356 00:14:18,940 --> 00:14:20,300
someone had access too.
357 00:14:20,420 --> 00:14:22,920
Yeah, that’s always the problem.
358 00:14:23,040 --> 00:14:24,800
But we see that also in our regular
359 00:14:24,800 --> 00:14:27,060
programs, that when something starts
360 00:14:27,060 --> 00:14:28,940
that a hacker finds something and three minutes
361 00:14:28,940 --> 00:14:30,900
later another one finds the same thing.
362 00:14:31,060 --> 00:14:32,000
So it’s happened.
363 00:14:32,940 --> 00:14:34,420
It’s not always
364 00:14:34,420 --> 00:14:36,960
obvious, but I think often
365 00:14:36,960 --> 00:14:39,040
you can see some sort
366 00:14:39,040 --> 00:14:40,800
of trends that if
367 00:14:40,800 --> 00:14:42,340
there’s some new
368 00:14:42,340 --> 00:14:43,960
kind of
369 00:14:43,960 --> 00:14:46,860
entry point into thinking about
370 00:14:46,860 --> 00:14:48,860
security or one new
371 00:14:48,940 --> 00:14:50,580
kind of, it doesn’t have to be
372 00:14:50,580 --> 00:14:52,780
like a new way of exploiting, but just
373 00:14:52,780 --> 00:14:54,780
some new way of getting to
374 00:14:54,780 --> 00:14:55,360
a problem.
375 00:14:56,840 --> 00:14:58,520
It seems like
376 00:14:58,520 --> 00:15:00,760
once there is some sort
377 00:15:00,760 --> 00:15:02,280
of breakthrough, then
378 00:15:02,280 --> 00:15:04,740
you will see independently
379 00:15:04,740 --> 00:15:06,620
very similar
380 00:15:06,620 --> 00:15:08,880
findings grow out at different
381 00:15:08,880 --> 00:15:09,400
places.
382 00:15:11,120 --> 00:15:12,960
But that’s the deal, right?
383 00:15:13,020 --> 00:15:14,640
With the Kaseya now. They managed
384 00:15:14,640 --> 00:15:16,760
to show that managed software
385 00:15:16,760 --> 00:15:18,460
or remote access software is
386 00:15:18,940 --> 00:15:20,940
a good way to get in because you have all the
387 00:15:20,940 --> 00:15:22,460
privilege. It’s just
388 00:15:22,460 --> 00:15:25,000
open doors. So just
389 00:15:25,000 --> 00:15:27,040
go after those guys and you
390 00:15:27,040 --> 00:15:28,420
will be into everything.
391 00:15:28,680 --> 00:15:30,920
But that’s the world. I mean, in the beginning
392 00:15:30,920 --> 00:15:32,720
all the hackers were focused on Windows,
393 00:15:32,880 --> 00:15:35,040
you know, because everybody had Windows
394 00:15:35,040 --> 00:15:36,680
so it’s the most easy access point.
395 00:15:36,800 --> 00:15:38,520
Then Windows 10 got basically
396 00:15:38,520 --> 00:15:40,820
more secure. So what we’re looking
397 00:15:40,820 --> 00:15:42,880
at now, what does everybody have
398 00:15:42,880 --> 00:15:44,920
besides Windows? Antivirus.
399 00:15:45,300 --> 00:15:46,840
So now all the hackers focus on
400 00:15:46,840 --> 00:15:48,640
antivirus because if you are in
401 00:15:48,640 --> 00:15:50,220
antivirus, you can access anything.
402 00:15:50,480 --> 00:15:52,300
And that will go on and on and on.
403 00:15:52,360 --> 00:15:54,520
And it’s the same with this. I mean, first
404 00:15:54,520 --> 00:15:56,560
the companies were attacked. Now the companies get
405 00:15:56,560 --> 00:15:58,760
more and more secure. So now you go for the vendors
406 00:15:58,760 --> 00:16:00,460
which are used in the companies.
407 00:16:00,660 --> 00:16:02,480
And companies don’t test it. I think
408 00:16:02,480 --> 00:16:04,820
if you want to do a true security
409 00:16:04,820 --> 00:16:06,580
test of are you vulnerable for
410 00:16:06,580 --> 00:16:08,520
this, give somebody, a pen
411 00:16:08,520 --> 00:16:10,620
tester, access to one computer
412 00:16:10,620 --> 00:16:11,820
in your network fully.
413 00:16:12,280 --> 00:16:14,480
Because that’s what basically happens if you
414 00:16:14,480 --> 00:16:16,380
have like an air conditioning system and
415 00:16:16,380 --> 00:16:18,340
it’s remotely managed. There’s a computer
416 00:16:18,340 --> 00:16:20,620
in your network. And if that one
417 00:16:20,620 --> 00:16:22,440
is breached, then they can reach anything.
418 00:16:22,920 --> 00:16:24,420
And those tests are
419 00:16:24,420 --> 00:16:26,400
the next thing we should do. But yeah,
420 00:16:26,540 --> 00:16:28,640
you’re always one step behind of
421 00:16:28,640 --> 00:16:30,040
criminals thinking.
422 00:16:30,560 --> 00:16:32,420
But it’s also, I mean, if you
423 00:16:32,420 --> 00:16:34,760
lock down everything to the 100%
424 00:16:34,760 --> 00:16:35,540
security,
425 00:16:35,820 --> 00:16:38,420
we will go out of business because we couldn’t
426 00:16:38,420 --> 00:16:40,460
work, right? Correct. And even
427 00:16:40,460 --> 00:16:42,620
then, 100% security can never be.
428 00:16:42,780 --> 00:16:44,520
No, exactly. No, I don’t
429 00:16:44,520 --> 00:16:46,100
think so. So what’s the next step?
430 00:16:46,180 --> 00:16:48,260
You’re talking about supply chain as the
431 00:16:48,260 --> 00:16:50,220
thing that’s going on for the last couple of years,
432 00:16:50,220 --> 00:16:52,000
right? Because you trust your vendors.
433 00:16:52,180 --> 00:16:54,140
You don’t want to check them. You know how they’re trustworthy
434 00:16:54,140 --> 00:16:56,240
and then you get in there. But what’s
435 00:16:56,240 --> 00:16:57,620
the next step? Would you say the
436 00:16:57,620 --> 00:17:00,360
I mean, what’s coming next after supply
437 00:17:00,360 --> 00:17:02,120
chain? Well,
438 00:17:02,520 --> 00:17:04,240
I just had a brilliant talk
439 00:17:04,240 --> 00:17:06,280
with a friend of mine about this and he says
440 00:17:06,280 --> 00:17:07,780
the next step is
441 00:17:07,780 --> 00:17:10,180
configuration issues. And I was
442 00:17:10,180 --> 00:17:12,400
like, hmm. And he said, yeah, everybody’s building
443 00:17:12,400 --> 00:17:14,420
now systems which might be
444 00:17:14,420 --> 00:17:16,240
more secure, etc. You will
445 00:17:16,240 --> 00:17:17,900
still have bugs, but it doesn’t matter.
446 00:17:18,260 --> 00:17:20,220
The problem now is configuration
447 00:17:20,220 --> 00:17:22,220
issues. And there have been research
448 00:17:22,220 --> 00:17:24,220
with some very advanced companies that
449 00:17:24,220 --> 00:17:26,100
did a scan of all their bugs and
450 00:17:26,100 --> 00:17:28,060
about 70% of all their
451 00:17:28,060 --> 00:17:29,860
bugs were configuration
452 00:17:29,860 --> 00:17:32,120
issues. So that means I have vendor A,
453 00:17:32,540 --> 00:17:34,320
I have vendor B, and I have vendor C
454 00:17:34,320 --> 00:17:36,000
and I have to combine them
455 00:17:36,000 --> 00:17:38,000
and if I make one mistake, then
456 00:17:38,000 --> 00:17:40,040
somebody can misuse that. And
457 00:17:40,040 --> 00:17:42,040
he did some research and he said
458 00:17:42,040 --> 00:17:44,160
if you have a clear PC, so
459 00:17:44,160 --> 00:17:45,700
a new computer,
460 00:17:46,040 --> 00:17:48,120
and you install Windows on it, there
461 00:17:48,260 --> 00:17:49,980
are 10 to the power of 600
462 00:17:49,980 --> 00:17:52,180
possibilities to configure Windows.
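[The 10^600 figure is the speaker's; the arithmetic below is ours, just to give a sense of scale: it corresponds to roughly two thousand independent on/off settings.]

```python
import math

# If a system exposed only independent boolean settings, 10^600 distinct
# configurations would require about log2(10^600) ≈ 1993 such switches.
switches = 600 * math.log2(10)
print(f"about {switches:.0f} independent on/off settings")
```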
463 00:17:53,060 --> 00:17:54,400
So there might
464 00:17:54,400 --> 00:17:56,360
be one million mistakes in there.
465 00:17:56,640 --> 00:17:58,220
You don’t know. And I think
466 00:17:58,220 --> 00:17:59,980
that will be,
467 00:18:00,140 --> 00:18:02,300
but not now because we’re way too busy
468 00:18:02,300 --> 00:18:04,580
with still perimeters, with still
469 00:18:04,580 --> 00:18:06,040
client-side
470 00:18:06,040 --> 00:18:08,040
attacks, with still everything that's there,
471 00:18:08,140 --> 00:18:10,200
bug bounties, etc. But I think in a couple
472 00:18:10,200 --> 00:18:11,900
of years you will see a shift to, okay,
473 00:18:12,260 --> 00:18:14,060
we got the basis of security
474 00:18:14,060 --> 00:18:16,020
fixed. What now? Well, the
475 00:18:16,020 --> 00:18:18,140
interaction thing, the configuration stuff,
476 00:18:18,260 --> 00:18:19,820
all that stuff, I think that will be
477 00:18:19,820 --> 00:18:22,200
the biggest thing in three to
478 00:18:22,200 --> 00:18:24,100
five years. Because, I mean, a lot of
479 00:18:24,100 --> 00:18:26,000
stuff that you have around you
480 00:18:26,000 --> 00:18:27,940
in our IT world today is
481 00:18:27,940 --> 00:18:29,880
configuration, right? It’s not program anymore.
482 00:18:30,280 --> 00:18:31,920
It’s configuration on an AVS account
483 00:18:31,920 --> 00:18:34,180
or, yeah, some other… And if you make
484 00:18:34,180 --> 00:18:36,040
a mistake there, then you can be as
485 00:18:36,040 --> 00:18:37,620
secure as you want, but you’re still gone.
486 00:18:37,980 --> 00:18:40,100
So I think that will be the biggest thing
487 00:18:40,100 --> 00:18:42,000
in a couple of years. So we’re looking forward
488 00:18:42,000 --> 00:18:44,300
to the future, right? A bright or a
489 00:18:44,300 --> 00:18:46,140
scary future? Oh, a fun
490 00:18:46,140 --> 00:18:47,780
future. A fun future for us.
491 00:18:48,260 --> 00:18:49,600
Funny stuff. So I’m not
492 00:18:49,600 --> 00:18:52,260
even, you know, everything is
493 00:18:52,260 --> 00:18:54,180
the same now. I’m not curious
494 00:18:54,180 --> 00:18:56,300
or I’m still curious how stuff
495 00:18:56,300 --> 00:18:57,720
work, of course, but I’m not
496 00:18:57,720 --> 00:19:00,320
surprised when things are broken or whatever.
497 00:19:00,400 --> 00:19:02,260
It’s the same, same. Over and over
498 00:19:02,980 --> 00:19:03,140
again.
499 00:19:04,060 --> 00:19:06,260
Okay, Edwin, thank you so much for
500 00:19:06,260 --> 00:19:07,540
coming to Secretary’s Podcast and
501 00:19:07,540 --> 00:19:10,020
transmission here from SEC-T 2021.
502 00:19:10,440 --> 00:19:12,320
It’s been a pleasure having you at the seat here
503 00:19:12,320 --> 00:19:14,300
together with Peter, Edwin,
504 00:19:14,480 --> 00:19:15,980
Robin. Thank you for
505 00:19:15,980 --> 00:19:18,200
sharing with us your thoughts. Yeah, thank
506 00:19:18,200 --> 00:19:20,100
you for having me, and have a good one. Have a
507 00:19:20,100 --> 00:19:21,720
nice time in Stockholm. Thank you very much.
508 00:19:21,720 --> 00:19:23,720
I will. Thank you. Cheers. Cheers.
509 00:19:25,880 --> 00:19:28,020
So, we are live here from
510 00:19:28,020 --> 00:19:29,400
SEC-T 2021.
511 00:19:31,140 --> 00:19:31,940
Stök and
512 00:19:31,940 --> 00:19:33,840
Fabio just came off stage
513 00:19:33,840 --> 00:19:36,080
giving an excellent talk about
514 00:19:36,080 --> 00:19:37,820
some threat hunting they’ve been doing,
515 00:19:37,980 --> 00:19:39,980
some tools you developed
516 00:19:39,980 --> 00:19:42,020
to figure out how to be
517 00:19:42,020 --> 00:19:44,060
a little bit ahead of the bad
518 00:19:44,060 --> 00:19:45,820
guys, right? Yeah. And
519 00:19:45,820 --> 00:19:48,120
maybe a little bit about how to be the bad guys.
520 00:19:48,200 --> 00:19:50,240
Or? It’s all of
521 00:19:50,240 --> 00:19:52,140
them, right? That’s it. It can’t just be
522 00:19:52,140 --> 00:19:53,180
all defense, right?
523 00:19:54,160 --> 00:19:56,260
Everybody likes to hack stuff, but
524 00:19:56,260 --> 00:19:58,040
what I love about this
525 00:19:58,040 --> 00:20:00,280
whole thing that we presented today, which
526 00:20:00,280 --> 00:20:01,180
was
527 00:20:01,180 --> 00:20:04,200
incident response,
528 00:20:04,860 --> 00:20:06,320
right, is that that’s red teaming
529 00:20:06,320 --> 00:20:08,240
in reverse. Because you need to
530 00:20:08,240 --> 00:20:10,320
understand all the offensive tools being used.
531 00:20:10,620 --> 00:20:12,220
And if you are a good red teamer,
532 00:20:12,700 --> 00:20:14,320
it’s going to feel natural for you
533 00:20:14,320 --> 00:20:16,120
when you’re doing the whole incident response
534 00:20:16,120 --> 00:20:18,180
thing, because you identify patterns,
535 00:20:18,200 --> 00:20:20,660
and it’s like reading redacted
536 00:20:20,660 --> 00:20:22,320
reports that somebody wrote up
537 00:20:22,320 --> 00:20:24,240
and you can see what happens. So I think it’s
538 00:20:24,240 --> 00:20:26,360
really interesting. Yeah, and we have the luxury
539 00:20:26,360 --> 00:20:28,120
to have both red teamers and
540 00:20:28,120 --> 00:20:29,900
blue teamers and incident responders
541 00:20:29,900 --> 00:20:32,020
that help doing this,
542 00:20:32,320 --> 00:20:33,620
handling these cases.
543 00:20:34,260 --> 00:20:36,180
And it’s always nice to have a red teamer
544 00:20:36,180 --> 00:20:38,400
perspective when you’re trying to figure out what an actual
545 00:20:38,400 --> 00:20:40,300
cybercriminal did, because then
546 00:20:40,300 --> 00:20:42,180
you can think from their perspective, like
547 00:20:42,180 --> 00:20:44,420
if I were there, I had this type of access,
548 00:20:44,580 --> 00:20:46,400
I was in this system, and my goal was that.
549 00:20:46,680 --> 00:20:48,140
How would I do it? And then you can
550 00:20:48,140 --> 00:20:50,080
check, is that how they actually did it or not?
551 00:20:50,200 --> 00:20:51,480
And it actually helps a lot.
552 00:20:52,080 --> 00:20:54,180
And the other way around as well, right? What we
553 00:20:54,180 --> 00:20:56,260
learned from incident response, you can apply
554 00:20:56,260 --> 00:20:58,180
into your own red teaming, because if
555 00:20:58,180 --> 00:21:00,220
you know how to find stuff, you also know how to
556 00:21:00,220 --> 00:21:02,260
not be found. Exactly. And I think
557 00:21:02,260 --> 00:21:04,160
also to be a good red teamer,
558 00:21:04,200 --> 00:21:05,980
it helps if you work blue team.
559 00:21:06,500 --> 00:21:08,380
Because you know where
560 00:21:08,380 --> 00:21:10,180
sysadmins get sloppy
561 00:21:10,180 --> 00:21:12,220
and where they sort of
562 00:21:12,220 --> 00:21:14,200
take shortcuts and things like that.
563 00:21:14,580 --> 00:21:16,280
So yeah, it all ties
564 00:21:16,280 --> 00:21:17,960
together. I think it’s a good idea
565 00:21:17,960 --> 00:21:20,200
to have both sides of the
566 00:21:20,200 --> 00:21:22,280
house represented in an incident
567 00:21:22,280 --> 00:21:22,760
response.
568 00:21:24,840 --> 00:21:26,400
I would throw
569 00:21:26,400 --> 00:21:28,300
into the mix also the monitoring and the SOC.
570 00:21:29,000 --> 00:21:29,480
Because
571 00:21:29,480 --> 00:21:32,000
they actually do the detection
572 00:21:32,000 --> 00:21:34,160
and quick response, right? So they
573 00:21:34,160 --> 00:21:36,320
are the ones setting the rules for what to check for
574 00:21:36,320 --> 00:21:37,620
and
575 00:21:37,620 --> 00:21:39,860
alerting and reacting to that.
576 00:21:40,200 --> 00:21:42,280
So should the SOC people
577 00:21:42,280 --> 00:21:44,040
also go into the red teaming?
578 00:21:44,040 --> 00:21:46,260
I think that’s awesome. And we have people
579 00:21:46,260 --> 00:21:47,940
from our SOC that move into pentesting.
580 00:21:47,960 --> 00:21:51,860
So I think that’s the best.
581 00:21:52,080 --> 00:21:53,500
The more perspectives you can get
582 00:21:53,500 --> 00:21:55,580
the more you act
583 00:21:55,580 --> 00:21:57,700
effectively.
584 00:21:58,700 --> 00:22:00,060
I think for me
585 00:22:00,060 --> 00:22:01,600
one of the things
586 00:22:01,600 --> 00:22:04,380
I’ve worked a lot of incident response
587 00:22:04,380 --> 00:22:05,640
and worked in forensics
588 00:22:05,640 --> 00:22:07,860
in the
589 00:22:07,860 --> 00:22:09,900
before times. But
590 00:22:09,900 --> 00:22:12,100
one thing I liked about your talk
591 00:22:12,100 --> 00:22:13,860
was that you did the analogy about
592 00:22:13,860 --> 00:22:15,940
a jigsaw puzzle
593 00:22:15,940 --> 00:22:17,920
that you have to find the little
594 00:22:17,920 --> 00:22:18,760
pieces and
595 00:22:18,760 --> 00:22:22,000
you might not even have all the pieces
596 00:22:22,000 --> 00:22:23,460
but it somehow
597 00:22:23,460 --> 00:22:25,440
creates a picture and
598 00:22:25,440 --> 00:22:28,040
you’re keeping that timeline
599 00:22:28,040 --> 00:22:30,080
as basically a security blanket
600 00:22:30,080 --> 00:22:31,940
that you’re attaching
601 00:22:31,940 --> 00:22:33,820
everything to and try to correlate
602 00:22:33,820 --> 00:22:34,900
things. So
603 00:22:34,900 --> 00:22:37,080
I really enjoyed that.
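[A minimal sketch of that "timeline as the backbone" idea: normalize events from any source to one shape and sort them on a single axis. The events below are invented for illustration.]

```python
from datetime import datetime

# Invented events from three hypothetical sources, normalized to
# (timestamp, source, description) so everything correlates on one axis.
events = [
    (datetime(2021, 9, 1, 3, 40), "edr",  "rundll32 spawned by winword.exe"),
    (datetime(2021, 9, 1, 2, 58), "mail", "phishing mail delivered to jdoe"),
    (datetime(2021, 9, 1, 3, 12), "vpn",  "login as svc-backup from 198.51.100.23"),
]

for ts, source, description in sorted(events):
    print(f"{ts:%Y-%m-%d %H:%M} [{source:>4}] {description}")
```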
604 00:22:38,880 --> 00:22:40,140
What do you think?
605 00:22:40,220 --> 00:22:41,680
I mean you guys develop tools
606 00:22:41,680 --> 00:22:43,880
to sort of build into
607 00:22:43,880 --> 00:22:45,700
that but do you think
608 00:22:45,700 --> 00:22:46,740
there should be
609 00:22:46,740 --> 00:22:47,640
like
610 00:22:47,920 --> 00:22:49,480
you discussed in the talk
611 00:22:49,480 --> 00:22:51,960
Microsoft’s
612 00:22:51,960 --> 00:22:54,080
cache for
613 00:22:54,080 --> 00:22:55,040
remote desktop.
614 00:22:55,400 --> 00:22:57,580
Do you think that software companies
615 00:22:57,580 --> 00:22:59,320
like Microsoft should be more
616 00:22:59,320 --> 00:23:01,940
helpful in terms of
617 00:23:01,940 --> 00:23:03,780
creating tools to use
618 00:23:03,780 --> 00:23:05,620
that as incident response
619 00:23:05,620 --> 00:23:07,620
or should that be
620 00:23:07,620 --> 00:23:08,440
thwarted by?
621 00:23:09,740 --> 00:23:11,740
I think yes.
622 00:23:11,840 --> 00:23:12,660
Ideally yes.
623 00:23:13,400 --> 00:23:15,400
But I mean if I’m trying to look from
624 00:23:15,400 --> 00:23:16,520
their perspective
625 00:23:17,920 --> 00:23:19,360
they of course want to work
626 00:23:19,360 --> 00:23:21,240
with giving possibility to detect
627 00:23:21,240 --> 00:23:23,240
and possibly even block stuff
628 00:23:23,240 --> 00:23:24,760
and they do it in their way.
629 00:23:25,080 --> 00:23:26,560
They push their new solutions
630 00:23:26,560 --> 00:23:29,000
and all the new fancy
631 00:23:29,000 --> 00:23:30,540
solutions they have
632 00:23:30,540 --> 00:23:33,620
which they’re usually good.
633 00:23:34,020 --> 00:23:35,240
They do the right things
634 00:23:35,240 --> 00:23:37,140
provided that you are in a modern
635 00:23:37,140 --> 00:23:39,340
environment and that you have other things in place.
636 00:23:39,720 --> 00:23:41,140
It doesn’t make sense to use the latest
637 00:23:41,140 --> 00:23:43,000
and greatest if you don’t even do the basics.
638 00:23:43,480 --> 00:23:45,380
But here we’re talking about missing the basics
639 00:23:45,380 --> 00:23:47,300
and this is the stuff we’re left with.
640 00:23:47,300 --> 00:23:49,400
So I don’t know. Yes it would be great
641 00:23:49,400 --> 00:23:51,380
to have but I don’t see
642 00:23:51,380 --> 00:23:53,240
that there is huge value
643 00:23:53,240 --> 00:23:54,940
for a company like Microsoft to
644 00:23:54,940 --> 00:23:57,360
do that. Then again they are sometimes
645 00:23:57,360 --> 00:23:58,800
very transparent. Sometimes not.
646 00:23:59,760 --> 00:24:01,340
But they have tools like
647 00:24:01,340 --> 00:24:03,500
Autoruns and all this stuff anyway.
648 00:24:03,820 --> 00:24:05,100
Yeah I mean that’s system terminals
649 00:24:05,100 --> 00:24:06,860
that’s you know it’s Microsoft.
650 00:24:07,300 --> 00:24:08,620
It’s a part of Microsoft that does those.
651 00:24:08,640 --> 00:24:11,280
They could probably just create one that’s the RDP
652 00:24:11,280 --> 00:24:12,160
cache explorer.
653 00:24:13,280 --> 00:24:14,900
Sure. Right? Sure. In theory.
654 00:24:14,900 --> 00:24:16,780
And yeah. I mean it’s
655 00:24:16,780 --> 00:24:18,700
if possible it’s going to come up like but
656 00:24:18,700 --> 00:24:20,820
In theory they will start to use the
657 00:24:20,820 --> 00:24:22,840
CAPTCHAs to do the puzzling
658 00:24:22,840 --> 00:24:25,040
from the RDP cache to show
659 00:24:25,040 --> 00:24:25,980
that you’re human right?
660 00:24:26,300 --> 00:24:28,940
You just gave me a great idea. I’m going to put the CAPTCHA
661 00:24:28,940 --> 00:24:30,880
Do that. Can you figure out how to
662 00:24:30,880 --> 00:24:32,600
spot Mimikatz in this or
663 00:24:32,600 --> 00:24:34,440
Hydra or whatever? Awesome.
664 00:24:34,760 --> 00:24:35,200
Great idea.
665 00:24:36,500 --> 00:24:38,220
You’re happy to explore that one.
666 00:24:38,600 --> 00:24:40,900
Perfect. One thing that I picked up on your talk
667 00:24:40,900 --> 00:24:42,920
was all the different tactics
668 00:24:42,920 --> 00:24:44,920
and technologies that you showed
669 00:24:44,920 --> 00:24:46,300
up was a way of describing
670 00:24:46,780 --> 00:24:49,160
your actual attack towards
671 00:24:49,160 --> 00:24:50,920
the responders, right? Throw out
672 00:24:50,920 --> 00:24:53,060
some stupid error message
673 00:24:53,060 --> 00:24:54,180
that will make
674 00:24:54,180 --> 00:24:57,020
the incident responders stop
675 00:24:57,020 --> 00:24:59,200
and look another way.
676 00:24:59,240 --> 00:25:00,940
Oh this is just a phishing
677 00:25:00,940 --> 00:25:03,040
attack or this is just this and that.
678 00:25:03,680 --> 00:25:05,040
But how common
679 00:25:05,040 --> 00:25:07,020
would you say it is that someone
680 00:25:07,020 --> 00:25:09,180
goes in, goes after data and then
681 00:25:09,180 --> 00:25:11,020
throws out some bitcoin miner
682 00:25:11,020 --> 00:25:12,940
just to see that oh the
683 00:25:12,940 --> 00:25:14,840
incident responder, oh it’s just
684 00:25:14,840 --> 00:25:16,740
a bitcoin miner or a money
685 00:25:16,780 --> 00:25:18,940
miner person that’s after us.
686 00:25:18,940 --> 00:25:20,460
Do you have any data on that?
687 00:25:20,760 --> 00:25:22,920
I don’t have any data up to the top of my head
688 00:25:22,920 --> 00:25:24,980
but I don’t think that’s very common among criminal
689 00:25:24,980 --> 00:25:26,800
groups. It is
690 00:25:26,800 --> 00:25:28,820
more common when you have an APT
691 00:25:28,820 --> 00:25:30,920
that actually
692 00:25:30,920 --> 00:25:33,100
has, you know, they have
693 00:25:33,100 --> 00:25:34,500
a lot of resources, they have time
694 00:25:34,500 --> 00:25:36,940
and they can actually do this and they actually
695 00:25:36,940 --> 00:25:38,480
have value in doing that.
696 00:25:39,860 --> 00:25:40,000
But
697 00:25:40,000 --> 00:25:43,080
it would be quite
698 00:25:43,080 --> 00:25:45,220
easy just to throw off something.
699 00:25:45,340 --> 00:25:46,760
For sure it would but it also
700 00:25:46,780 --> 00:25:48,720
tells you that it’s important to do a thorough
701 00:25:48,720 --> 00:25:50,860
investigation. So when do you stop
702 00:25:50,860 --> 00:25:53,060
investigating? When you’re
703 00:25:53,060 --> 00:25:54,940
ready to do a full kick out
704 00:25:54,940 --> 00:25:56,880
and you have a plan
705 00:25:56,880 --> 00:25:58,880
for how you’re rebuilding. Yeah, because
706 00:25:58,880 --> 00:26:00,340
that’s the thing, when you want to do a kick out
707 00:26:00,340 --> 00:26:02,980
you need to
708 00:26:02,980 --> 00:26:04,940
make sure you do all the things you need
709 00:26:04,940 --> 00:26:07,140
to do at once. Right, say you have
710 00:26:07,140 --> 00:26:08,740
X number of systems compromised
711 00:26:08,740 --> 00:26:11,080
and there are backdoors talking
712 00:26:11,080 --> 00:26:12,940
to servers, you know
713 00:26:12,940 --> 00:26:13,940
threat actor servers
714 00:26:13,940 --> 00:26:16,400
their accounts are compromised.
715 00:26:16,780 --> 00:26:18,680
Then you know, okay, I need to
716 00:26:18,680 --> 00:26:20,640
clean that server, I need to block
717 00:26:20,640 --> 00:26:22,740
I need to block
718 00:26:22,740 --> 00:26:24,600
those IPs in the firewall and I need to
719 00:26:24,600 --> 00:26:26,140
reset the passwords for these accounts
720 00:26:26,140 --> 00:26:28,540
and they came in through RDP for example
721 00:26:28,540 --> 00:26:30,500
so we need to close RDP so they can't
722 00:26:30,500 --> 00:26:32,420
come back in the same way and so on.
723 00:26:32,620 --> 00:26:34,600
So all these things, you need to have found them
724 00:26:34,600 --> 00:26:36,680
all ideally so you can do them
725 00:26:36,680 --> 00:26:38,360
at the same time. Because if you do one
726 00:26:38,360 --> 00:26:40,760
but you leave the other, they’re going to figure out
727 00:26:40,760 --> 00:26:42,580
that you’re doing
728 00:26:42,580 --> 00:26:44,580
that and then they’re going to hide it differently.
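[To illustrate the "all at once" point, here is a hypothetical eviction runbook expressed as data; the host names and addresses are invented, and each handler would call whatever firewall, IdP, or EDR API is actually in play.]

```python
# Hypothetical kick-out plan: every step is prepared in advance and then
# executed in a single window, so the intruder never sees a partial cleanup.
KICKOUT_PLAN = [
    {"action": "block_ips",       "targets": ["203.0.113.7", "198.51.100.23"]},  # documentation-range C2 IPs
    {"action": "reset_passwords", "targets": ["svc-backup", "admin-jdoe"]},
    {"action": "rebuild_host",    "targets": ["FILESRV01"]},
    {"action": "close_rdp",       "targets": ["perimeter-fw"]},
]

def execute(plan):
    for step in plan:
        # placeholder: a real runbook would call firewall / IdP / EDR APIs here
        print(f"{step['action']:>16} -> {', '.join(step['targets'])}")

execute(KICKOUT_PLAN)
```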
729 00:26:44,900 --> 00:26:46,620
It doesn’t really apply for ransomware cases
730 00:26:46,780 --> 00:26:48,800
so much because, you know, obviously
731 00:26:48,800 --> 00:26:49,620
you’ve noticed
732 00:26:49,620 --> 00:26:51,360
your shit is gone.
733 00:26:51,780 --> 00:26:54,620
So then they know of course there’s going to be
734 00:26:54,620 --> 00:26:56,200
an investigation. They’re not really in
735 00:26:56,200 --> 00:26:58,680
in most cases, I guess they could be
736 00:26:58,680 --> 00:26:59,040
but
737 00:26:59,040 --> 00:27:02,540
that’s more for targeted attacks that are not
738 00:27:02,540 --> 00:27:04,700
ransomware. Then you really
739 00:27:04,700 --> 00:27:06,800
want to make sure you find everything before you do the kickout.
740 00:27:07,580 --> 00:27:08,720
And that takes
741 00:27:08,720 --> 00:27:10,600
time and I don’t know how, it depends how complex
742 00:27:10,600 --> 00:27:12,520
the environment is and how advanced
743 00:27:12,520 --> 00:27:14,640
and extended the attack is.
744 00:27:15,140 --> 00:27:16,580
It’s easy to say also that you
745 00:27:16,780 --> 00:27:18,580
probably just change the passwords, right?
746 00:27:18,580 --> 00:27:22,940
But you need to change all the passwords, all of them, everything.
747 00:27:22,940 --> 00:27:23,900
At the same time, right?
748 00:27:23,900 --> 00:27:25,180
Yeah, at the same time.
749 00:27:25,180 --> 00:27:27,940
And then you need to have another cleaned domain
750 00:27:27,940 --> 00:27:29,980
where you can start working with and, you know,
751 00:27:29,980 --> 00:27:31,260
this is complex stuff.
752 00:27:31,260 --> 00:27:33,860
And you have people working, trying to take care of stuff,
753 00:27:33,860 --> 00:27:35,140
trying to do what they’re doing.
754 00:27:35,140 --> 00:27:37,100
Customer is distressed all the time, right?
755 00:27:37,100 --> 00:27:39,140
And asking you, when can we start doing this?
756 00:27:39,140 --> 00:27:40,980
Then losing money by the minute.
757 00:27:40,980 --> 00:27:44,900
And so that’s also the thing that I love about doing this work
758 00:27:44,900 --> 00:27:46,680
is that when you enter an incident,
759 00:27:46,680 --> 00:27:50,400
in the first stages, if it isn’t a ransomware,
760 00:27:50,400 --> 00:27:54,600
you are, you’re getting called in, adrenaline is pumping,
761 00:27:54,600 --> 00:27:57,200
you’re starting to figure things out and you’re,
762 00:27:57,200 --> 00:27:59,640
and you know the threat actor is in the system
763 00:27:59,640 --> 00:28:02,240
and you need to figure out how did they get in
764 00:28:02,240 --> 00:28:04,860
and which, and how do we stop them?
765 00:28:04,860 --> 00:28:08,340
Because the system can be ransomwared at any second
766 00:28:08,340 --> 00:28:09,980
because they don’t know you’re there.
767 00:28:09,980 --> 00:28:12,560
So you need to be stealthy as well, doing the research.
768 00:28:12,560 --> 00:28:16,680
You can’t just start a bunch of new user accounts in the domain
769 00:28:16,680 --> 00:28:19,920
naming them your company, consultant one,
770 00:28:19,920 --> 00:28:21,540
because that’s going to flag them.
771 00:28:21,540 --> 00:28:24,380
So you need to stealth that, look what’s going on
772 00:28:24,380 --> 00:28:27,120
and then eventually figure out like, do we have backups?
773 00:28:27,120 --> 00:28:28,920
So we create, are we ready?
774 00:28:28,920 --> 00:28:32,440
So we don’t get that vengeance ransomware
775 00:28:32,440 --> 00:28:33,280
where they just launch it.
776 00:28:33,280 --> 00:28:34,120
Yeah, exactly.
777 00:28:34,120 --> 00:28:36,660
Then they do things, you know, quickly.
778 00:28:36,660 --> 00:28:37,500
Yeah.
779 00:28:37,500 --> 00:28:38,420
Like, you know, the phases are usually,
780 00:28:38,420 --> 00:28:40,260
you take over the entire infrastructure first,
781 00:28:40,260 --> 00:28:42,200
then you go after data.
782 00:28:42,200 --> 00:28:44,320
It usually takes a while to find the right data
783 00:28:44,320 --> 00:28:45,260
and then you steal that.
784 00:28:45,260 --> 00:28:46,200
So you have the additional leverage,
785 00:28:46,200 --> 00:28:51,200
when you do the ransomware and then you destroy backups.
786 00:28:51,300 --> 00:28:54,880
They usually also, like, uninstall AV centrally,
787 00:28:54,880 --> 00:28:57,120
if it’s a group policy or whatever, they just push it out.
788 00:28:57,120 --> 00:28:59,440
And then they do the ransomware everywhere at the same time.
789 00:28:59,440 --> 00:29:02,440
If, for example, if you catch them in the middle phase there
790 00:29:02,440 --> 00:29:05,280
when they already have a lot of access,
791 00:29:05,280 --> 00:29:06,740
but they’re still, you know,
792 00:29:06,740 --> 00:29:08,560
they still need to go through data to steal
793 00:29:08,560 --> 00:29:10,560
and stuff like that and find the backups.
794 00:29:10,560 --> 00:29:11,920
And they see that you’re onto them.
795 00:29:11,920 --> 00:29:15,360
They’re probably going to be kicked out any minute now.
796 00:29:15,360 --> 00:29:16,200
You know,
797 00:29:16,200 --> 00:29:17,960
the question could be, well, okay,
798 00:29:17,960 --> 00:29:21,280
forget the backups and the data, just do ransomware anyways.
799 00:29:21,280 --> 00:29:24,300
So that’s a very bad spot to find them
800 00:29:24,300 --> 00:29:27,140
or to at least let them know that you found them.
801 00:29:27,140 --> 00:29:28,740
How common would you say it is
802 00:29:28,740 --> 00:29:31,280
that they go after the backups and crash that?
803 00:29:31,280 --> 00:29:32,780
All the time, pretty much.
804 00:29:32,780 --> 00:29:35,480
Also that, like you could go persistent
805 00:29:35,480 --> 00:29:38,840
and just encrypt the backups, right?
806 00:29:38,840 --> 00:29:40,380
So during half a year,
807 00:29:40,380 --> 00:29:43,480
you start doing the encryption of the backup.
808 00:29:43,480 --> 00:29:46,200
When you do the readback, you do the decryption.
809 00:29:46,200 --> 00:29:48,080
So you don’t see that there is a problem.
810 00:29:48,080 --> 00:29:49,000
Yeah.
811 00:29:49,000 --> 00:29:50,880
And then you drop the key.
812 00:29:50,880 --> 00:29:51,580
I see that.
813 00:29:51,580 --> 00:29:54,800
But then it’s easier to wipe them and do the ransomware.
814 00:29:54,800 --> 00:29:57,080
Yeah, but then you would recognize that, right?
815 00:29:57,080 --> 00:29:59,200
And then you have your offsite.
816 00:29:59,200 --> 00:29:59,780
Yeah, okay.
817 00:29:59,780 --> 00:30:02,580
If you have actual offsite and offline backups.
818 00:30:02,580 --> 00:30:03,260
Yeah, that’s true.
819 00:30:03,260 --> 00:30:04,920
Fair enough.
820 00:30:04,920 --> 00:30:08,480
And it’s definitely something that could potentially be done.
821 00:30:08,480 --> 00:30:13,600
But I’m just going to say they are so successful already.
822 00:30:13,600 --> 00:30:16,200
They haven’t even started thinking about more sophisticated
823 00:30:16,200 --> 00:30:17,320
stuff like that.
824 00:30:17,320 --> 00:30:18,160
They don’t need to.
825 00:30:18,160 --> 00:30:19,120
They don’t need to, yeah.
826 00:30:19,120 --> 00:30:20,920
It’s an ROI thing.
827 00:30:20,920 --> 00:30:23,080
You know the delay there is between the first breach
828 00:30:23,080 --> 00:30:26,120
and when they actually start to manually work in the environment?
829 00:30:26,120 --> 00:30:28,200
That can be weeks or months.
830 00:30:28,200 --> 00:30:29,640
And no one has noticed anything by then.
831 00:30:29,640 --> 00:30:32,280
Imagine how many companies are currently breached
832 00:30:32,280 --> 00:30:34,440
and they are not being actively worked with
833 00:30:34,440 --> 00:30:37,920
because they’re working on some other higher priority target.
834 00:30:37,920 --> 00:30:39,040
That can be weeks or months.
835 00:30:39,040 --> 00:30:40,640
Imagine how many they’re already in.
836 00:30:40,640 --> 00:30:43,640
They’re overloaded with networks they already have access to.
837 00:30:43,640 --> 00:30:46,200
They own you, create persistence, and then they idle.
838 00:30:46,200 --> 00:30:49,000
They just idle until it’s time to harvest.
839 00:30:49,000 --> 00:30:53,040
One thing that I really liked as well about your presentation
840 00:30:53,040 --> 00:30:56,880
was that you tied in the importance of threat intelligence
841 00:30:56,880 --> 00:31:03,640
to be able to proactively find when things are going to hit the fan.
842 00:31:03,640 --> 00:31:05,720
Could you elaborate some more about that?
843 00:31:05,720 --> 00:31:09,120
Because I guess that’s a case-by-case thing.
844 00:31:09,120 --> 00:31:16,120
You gave one example where you had a typical MO for a certain threat actor.
845 00:31:16,200 --> 00:31:24,200
But how could you use threat intelligence in a more general way to sort of be on top of things?
846 00:31:24,200 --> 00:31:29,400
Yeah, so there are the more technical IOCs type of things that are pretty straightforward.
847 00:31:29,400 --> 00:31:32,080
Like, oh, this IP address is a C2 server.
848 00:31:32,080 --> 00:31:33,880
You can proactively block that, right?
849 00:31:33,880 --> 00:31:35,120
And simple stuff like that.
850 00:31:35,120 --> 00:31:38,520
Or, you know, this domain is sending out phishing emails.
851 00:31:38,520 --> 00:31:40,440
Like, you know, there is so much of that.
852 00:31:40,440 --> 00:31:42,760
And that can be done proactively.
853 00:31:42,760 --> 00:31:45,040
If it’s shared, people can block it.
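A minimal sketch of what acting on such a shared IOC feed could look like (my own illustration, not something from the interview; the file name c2-ips.txt and the nft table/chain names are assumptions):

```c
/* Turn a shared list of known C2 IP addresses into nftables drop rules.
 * Assumes an "inet filter" table with an "output" chain already exists. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("c2-ips.txt", "r");   /* one IPv4 address per line */
    char ip[64];

    if (!f) { perror("c2-ips.txt"); return 1; }
    while (fgets(ip, sizeof ip, f)) {
        ip[strcspn(ip, "\r\n")] = '\0';   /* strip the newline */
        if (ip[0] != '\0')
            printf("nft add rule inet filter output ip daddr %s drop\n", ip);
    }
    fclose(f);
    return 0;
}
```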
854 00:31:45,040 --> 00:31:46,000
Then the other part is more of, like, you know…
857 00:31:48,600 --> 00:31:49,600
I guess there are different layers.
858 00:31:49,600 --> 00:31:51,200
One is the techniques.
859 00:31:51,200 --> 00:31:54,600
Like the first example we showed about the RTF.
860 00:31:54,600 --> 00:31:58,080
That you can talk about, we see this attack that works this way.
861 00:31:58,080 --> 00:32:03,320
And this is a relatively generic way you can detect this type of attack even if it’s a variation of it.
862 00:32:03,320 --> 00:32:06,240
That’s tricky to define because it really depends on case-by-case.
863 00:32:06,240 --> 00:32:08,000
So I don’t have a good answer for that.
864 00:32:08,000 --> 00:32:14,240
But then the layer on top of that, I think it’s more generic in terms of what type of attacks are there even?
865 00:32:14,240 --> 00:32:15,960
Like, how common is it?
866 00:32:15,960 --> 00:32:16,960
How common is ransomware?
867 00:32:16,960 --> 00:32:19,920
How common is that they actually delete the backups before the ransomware?
868 00:32:19,920 --> 00:32:24,400
If you have those statistics, if you know that in 100% of the cases they delete the backups,
869 00:32:24,400 --> 00:32:27,800
that’s probably going to put that at the top of the list if you’re going to handle something.
870 00:32:27,800 --> 00:32:31,400
Start with securing your backups and testing your backups, right?
871 00:32:31,400 --> 00:32:36,840
We do this work when we do like threat intelligence reports and profiles that are specific to customers.
872 00:32:36,840 --> 00:32:43,000
We start with like the generic landscape and then narrow it down to what they actually expose and what problems they actually have.
873 00:32:43,000 --> 00:32:45,760
And then we give recommendations that are actually
874 00:32:45,760 --> 00:32:49,400
mapped to the threats that are currently seen in the wild.
875 00:32:49,400 --> 00:32:55,640
Because, you know, if you need to start from very low security, there is so many things you can do, right?
876 00:32:55,640 --> 00:32:57,720
Where do you start, right?
877 00:32:57,720 --> 00:33:02,080
You should probably do most of them anyways, but you need to have an order for doing it.
878 00:33:02,080 --> 00:33:03,080
Where do you start?
879 00:33:03,080 --> 00:33:07,560
Start from the ones that are actively exploited in the wild right now.
880 00:33:07,560 --> 00:33:13,760
So we try to do the mapping and give them a plan that is actually mapped to urgency based on what’s happening.
881 00:33:13,760 --> 00:33:15,720
But are you also mapping the…
882 00:33:15,760 --> 00:33:20,760
Priorities depending on the criticality of the asset that you’re protecting?
883 00:33:20,760 --> 00:33:23,760
I mean, it’s not just a generic IT environment.
884 00:33:23,760 --> 00:33:25,520
It could be production environments.
885 00:33:25,520 --> 00:33:28,120
It could be ERP systems, et cetera, et cetera.
886 00:33:28,120 --> 00:33:28,560
Exactly.
887 00:33:28,560 --> 00:33:29,880
And that needs to start from the business.
888 00:33:29,880 --> 00:33:33,360
But it’s a question that, of course, is usually asked or is always asked.
889 00:33:33,360 --> 00:33:36,200
Like, what’s the worst that can happen, right?
890 00:33:36,200 --> 00:33:41,680
How would your business die tomorrow if we don’t have, for example, this database or this ERP?
891 00:33:41,680 --> 00:33:45,640
Or for us, the uptime is very important, for example.
892 00:33:45,760 --> 00:33:48,960
And then you map it to, okay, these are important assets.
893 00:33:48,960 --> 00:33:51,640
What do they depend on?
894 00:33:51,640 --> 00:33:53,000
Because that’s the other challenging part.
895 00:33:53,000 --> 00:33:57,320
Like, they maybe depend on a hundred different third-party software that we have no idea about.
896 00:33:57,320 --> 00:33:59,320
So what if they get breached?
897 00:33:59,320 --> 00:34:02,240
And do they even know what they depend on?
898 00:34:02,240 --> 00:34:03,400
They usually don’t.
899 00:34:03,400 --> 00:34:05,320
They usually don’t.
900 00:34:05,320 --> 00:34:05,960
And it’s not easy.
901 00:34:05,960 --> 00:34:07,400
In their defense, it’s hard.
902 00:34:07,400 --> 00:34:08,760
The systems are complex.
903 00:34:08,760 --> 00:34:10,120
Yeah, absolutely.
904 00:34:10,120 --> 00:34:12,080
Looking especially at the Kaseya stuff.
905 00:34:12,080 --> 00:34:14,920
I mean, you’re looking at several layers of dependencies, right?
906 00:34:14,920 --> 00:34:15,120
Yeah.
907 00:34:15,120 --> 00:34:15,560
Yeah.
908 00:34:15,760 --> 00:34:21,240
But then again, facts in hindsight proved that Kaseya was pretty broken.
909 00:34:21,240 --> 00:34:24,480
And, you know, that they weren’t focusing on that.
910 00:34:24,480 --> 00:34:28,200
And people are relying on the trust from suppliers.
911 00:34:28,200 --> 00:34:35,000
So, but if we remove that idea and say, okay, I got this IT environment that’s mine that I’m the guard of.
912 00:34:35,000 --> 00:34:36,840
I’m the sysadmin or whatever.
913 00:34:36,840 --> 00:34:39,560
And I have a bunch of things that are exposed to the internet.
914 00:34:39,560 --> 00:34:45,120
What I would do personally is I would constantly just git pull the latest nuclei.
915 00:34:45,120 --> 00:34:46,880
templates and whatever.
916 00:34:46,880 --> 00:34:49,480
And just scan my environment continuously.
917 00:34:49,480 --> 00:34:54,680
Because that’s where, like, the one-day CVEs come out, and you see if one smashes your system.
918 00:34:54,680 --> 00:34:57,800
If it hits, send yourself a notification in Slack.
919 00:34:57,800 --> 00:34:59,240
This CVE fired.
920 00:34:59,240 --> 00:35:00,720
False positive or not.
921 00:35:00,720 --> 00:35:04,840
Like use the bug bounty methodology and attack your system externally.
922 00:35:04,840 --> 00:35:08,120
Because if you don’t do it, somebody else is going to do it for you.
923 00:35:08,120 --> 00:35:10,320
And if it fires, at least then you know.
924 00:35:10,320 --> 00:35:13,000
And it’s 100% better to know what you have that’s vulnerable
927 00:35:15,000 --> 00:35:16,000
than not knowing it.
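As a rough sketch of that loop (my own illustration, not the speakers’ tooling; it assumes git, nuclei and curl are installed, a targets.txt listing your exposed hosts, a local clone of the nuclei-templates repo, and a SLACK_WEBHOOK environment variable):

```c
/* Continuously pull the latest nuclei templates, scan your own exposed
 * hosts, and ping Slack if any template fires, false positive or not. */
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    for (;;) {
        /* Refresh community templates so new one-day CVE checks arrive fast. */
        system("git -C nuclei-templates pull --quiet");

        /* Scan only what you actually expose; findings land in findings.txt. */
        system("nuclei -l targets.txt -t nuclei-templates -o findings.txt -silent");

        /* Any finding at all -> notify yourself in Slack. */
        if (system("test -s findings.txt") == 0)
            system("curl -s -X POST -H 'Content-type: application/json' "
                   "--data '{\"text\":\"nuclei: a CVE template fired\"}' "
                   "\"$SLACK_WEBHOOK\"");

        sleep(3600);   /* once an hour is plenty for a sketch */
    }
}
```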
929 00:35:17,000 --> 00:35:20,640
And if you don’t have the capabilities to do it yourself, I guess that maps to whatever
930 00:35:20,640 --> 00:35:26,120
sort of vulnerability exposure type of thing you can put on the outside, right?
931 00:35:26,120 --> 00:35:28,760
Now, I don’t even know what products there are to be honest.
932 00:35:28,760 --> 00:35:30,760
I know there are many.
933 00:35:30,760 --> 00:35:35,760
Ideally, you have hackers and stuff that can do that for you continuously.
934 00:35:35,760 --> 00:35:39,440
But you know, they got other stuff to do as well.
935 00:35:39,440 --> 00:35:44,800
But I mean, identifying the exposure is very important, and also being very quick when things
936 00:35:44,800 --> 00:35:45,800
happen.
937 00:35:45,800 --> 00:35:47,360
Because step number one is inventory.
938 00:35:47,360 --> 00:35:48,800
Know what you actually have.
939 00:35:48,800 --> 00:35:50,800
Especially what you’re exposed to in the internet.
940 00:35:50,800 --> 00:35:55,800
And step number two is be consistent with the patching level.
941 00:35:55,800 --> 00:35:58,720
And it’s boring to talk about patching.
942 00:35:58,720 --> 00:36:03,560
No one likes to like, you know, go through the process of doing the inventory and check
943 00:36:03,560 --> 00:36:04,560
the version.
944 00:36:04,560 --> 00:36:05,560
Because, you know, they’re the heroes of the uptime.
945 00:36:05,560 --> 00:36:09,420
Make sure it’s patched and get a notification when there is a new patch available and stuff.
946 00:36:09,420 --> 00:36:10,420
But look at the stats.
947 00:36:10,420 --> 00:36:12,800
It’s actually, if you look at our own stats.
952 00:36:16,800 --> 00:36:17,800
You can see, there are a lot of stats that have been released, from around 100 different
953 00:36:17,800 --> 00:36:19,000
cases so far this year.
954 00:36:19,000 --> 00:36:27,800
I think the highest number on what initial access vector they use is external exploitation.
955 00:36:27,800 --> 00:36:28,800
And next is phishing.
956 00:36:28,800 --> 00:36:35,800
And you will see those nice firewalls with VPNs enabled just for the IT admins that
957 00:36:35,800 --> 00:36:37,800
they forgot about.
958 00:36:37,800 --> 00:36:39,800
Or that kind of stuff that’s happening all the time.
959 00:36:39,800 --> 00:36:43,800
Because you take the easiest route, right?
960 00:36:43,800 --> 00:36:45,800
It’s on the internet.
961 00:36:45,800 --> 00:36:49,800
No one likes to talk about patching, but it’s actually very important.
962 00:36:49,800 --> 00:36:54,800
First, reduce the exposure. If you don’t have to have it exposed, don’t have it.
963 00:36:54,800 --> 00:37:00,800
If you need to have it exposed, but not necessarily have to have it hosted in your on-prem infrastructure,
964 00:37:00,800 --> 00:37:02,800
put it on a cloud service. That’s a lot better.
965 00:37:02,800 --> 00:37:06,800
If it gets pwned, at least it’s not your infrastructure.
966 00:37:06,800 --> 00:37:12,800
It’s your app, but still, it’s a lot better than having it into your internal network.
967 00:37:12,800 --> 00:37:17,800
Exactly. I think that’s a good note to end on.
968 00:37:17,800 --> 00:37:21,800
Thank you a lot for taking the time to talk to us.
969 00:37:21,800 --> 00:37:25,800
You guys are done now, so you can just enjoy it.
970 00:37:25,800 --> 00:37:29,800
Enjoy the sun, enjoy the show, enjoy the venue.
971 00:37:29,800 --> 00:37:31,800
And have a nice day in Stockholm.
972 00:37:31,800 --> 00:37:34,800
Thank you for sharing with us on Säkerhetspodcasten.
973 00:37:34,800 --> 00:37:36,800
Fabio, Fredrik, thank you.
974 00:37:36,800 --> 00:37:38,800
Thank you very much.
975 00:37:38,800 --> 00:37:41,800
We’re rolling. Welcome off the stage.
976 00:37:41,800 --> 00:37:42,800
Lars, how are you?
977 00:37:42,800 --> 00:37:50,800
Just gave an excellent talk about exploiting memory corruption in RTOS, real-time operating systems.
978 00:37:50,800 --> 00:37:56,800
Together here with me at the Säkerhetspodcasten transmission from Sec-T is also Peter Magnusson.
979 00:37:56,800 --> 00:37:57,800
Hello!
980 00:37:57,800 --> 00:37:58,800
And Lars Haulin.
981 00:37:58,800 --> 00:37:59,800
Yes.
982 00:37:59,800 --> 00:38:05,800
Cool. And let’s just go and head right into the questions, right?
983 00:38:05,800 --> 00:38:08,800
And then follow up on your talk on a little…
984 00:38:08,800 --> 00:38:11,800
You described this as some kind of a summer experience.
985 00:38:11,800 --> 00:38:15,800
A summer experiment or summer hobby project that you brought up.
986 00:38:15,800 --> 00:38:17,800
Yes, that’s right.
987 00:38:17,800 --> 00:38:20,800
So, where did it come from?
988 00:38:20,800 --> 00:38:25,800
I mean, where did the first idea to go into this come from?
989 00:38:25,800 --> 00:38:31,800
Well, I have previously been working as an embedded systems developer.
990 00:38:31,800 --> 00:38:38,800
So these questions like, how secure is this when we add these compiler flags to these operating systems?
991 00:38:38,800 --> 00:38:41,800
I have been thinking about that question for a while.
992 00:38:41,800 --> 00:38:48,800
But then I thought, well, maybe I should try to learn Docker and make something with it.
993 00:38:48,800 --> 00:38:51,800
So that was my main driving force.
994 00:38:51,800 --> 00:38:59,800
So the Docker drive turned into building a playground for playing around with memory corruption in RTOS?
995 00:38:59,800 --> 00:39:00,800
Yes.
996 00:39:00,800 --> 00:39:06,800
Myself, I have all kinds of electronics equipment at home.
997 00:39:06,800 --> 00:39:08,800
Debuggers and debugging probes.
998 00:39:08,800 --> 00:39:10,800
And microcontrollers.
999 00:39:10,800 --> 00:39:12,800
So that makes it a bit easier for me.
1000 00:39:12,800 --> 00:39:22,800
But I wanted to make something that could be used by anyone as long as they have a computer or cloud resources that they can run a Docker container in.
1001 00:39:22,800 --> 00:39:26,800
So a huge improvement on accessibility?
1002 00:39:26,800 --> 00:39:28,800
Well, that depends.
1003 00:39:28,800 --> 00:39:30,800
I hope so, at least.
1004 00:39:30,800 --> 00:39:31,800
We will see.
1005 00:39:31,800 --> 00:39:35,800
I hope to get comments and so on, on the actual thing.
1006 00:39:35,800 --> 00:39:37,800
What about compared to getting…
1007 00:39:38,800 --> 00:39:44,800
The microcontroller and rigging it up to your computer and making everything work?
1008 00:39:44,800 --> 00:39:47,800
I would actually say that that is a better experience.
1009 00:39:47,800 --> 00:39:50,800
There are really, really good tools for that.
1010 00:39:50,800 --> 00:39:53,800
Some of them are free.
1011 00:39:53,800 --> 00:39:56,800
But there are also some that you pay a bit for.
1012 00:39:56,800 --> 00:40:02,800
And wow, they are made to make your work efficient.
1013 00:40:02,800 --> 00:40:05,800
And you can really feel that.
1014 00:40:05,800 --> 00:40:07,800
So what price level are we talking about?
1015 00:40:07,800 --> 00:40:09,800
Are we talking about your oscilloscopes at home?
1016 00:40:09,800 --> 00:40:12,800
I don’t have any oscilloscopes at home at the moment.
1017 00:40:12,800 --> 00:40:21,800
But if I ask nicely, I might be able to use one at work.
1018 00:40:21,800 --> 00:40:25,800
Not the calibrated ones, because those are only for work.
1019 00:40:25,800 --> 00:40:27,800
But maybe one of the test oscilloscopes.
1020 00:40:27,800 --> 00:40:29,800
But now we have the playground, right?
1021 00:40:29,800 --> 00:40:30,800
Yes, we have.
1022 00:40:30,800 --> 00:40:31,800
Okay.
1023 00:40:31,800 --> 00:40:33,800
So tell us a little bit.
1024 00:40:33,800 --> 00:40:35,800
You made a demo on stage.
1025 00:40:35,800 --> 00:40:36,800
Mm.
1026 00:40:36,800 --> 00:40:38,800
Did you analyze it and play around a little bit with it?
1027 00:40:38,800 --> 00:40:44,800
And one thing that came up and one thing that you explored a little bit was the canary functionality.
1028 00:40:44,800 --> 00:40:45,800
Yes, that’s right.
1029 00:40:45,800 --> 00:40:50,800
How common would you say it is in the RTOSes, that they use these canary protections?
1030 00:40:50,800 --> 00:40:52,800
Well, I would say it depends.
1031 00:40:52,800 --> 00:40:57,800
I think you should add the protection if you can.
1032 00:40:57,800 --> 00:41:00,800
But often you have some memory constraints.
1033 00:41:00,800 --> 00:41:05,800
And I have a fun story from when I was working
1034 00:41:05,800 --> 00:41:07,800
as an embedded developer.
1035 00:41:07,800 --> 00:41:11,800
And I was asking, can we make this program 80 bytes smaller?
1036 00:41:11,800 --> 00:41:15,800
Because if we can do that, we can fit it into one kilobyte of flash
1037 00:41:15,800 --> 00:41:17,800
instead of two kilobytes of flash.
1038 00:41:17,800 --> 00:41:19,800
And you save a lot of money, right?
1039 00:41:19,800 --> 00:41:20,800
Yes.
1040 00:41:20,800 --> 00:41:22,800
And you don’t realize how much money you save.
1041 00:41:22,800 --> 00:41:28,800
But if that is 30 cents per chip and you manufacture 500,000 per year,
1042 00:41:28,800 --> 00:41:30,800
it adds up.
1043 00:41:30,800 --> 00:41:32,800
It really adds up.
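(For scale, a worked example of my own: saving 0.30 USD per chip at 500,000 chips per year is 0.30 × 500,000 = 150,000 USD per year.)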
1044 00:41:32,800 --> 00:41:34,800
You try to tell that to the programmers over there.
1045 00:41:34,800 --> 00:41:42,800
I have this, like: we’re talking about this many hours of engineering,
1046 00:41:42,800 --> 00:41:51,800
and yeah, but the bill of materials for making this profitable is this much?
1047 00:41:51,800 --> 00:41:54,800
We don’t care about how many hours you spend.
1048 00:41:54,800 --> 00:41:59,800
That’s a really, really strange equation where you’re like, this is silly.
1049 00:41:59,800 --> 00:42:02,800
And that’s why television sets are slow.
1050 00:42:02,800 --> 00:42:03,800
I mean, when you switch channels.
1051 00:42:03,800 --> 00:42:05,800
It’s like they save 50 cents on a microcontroller.
1052 00:42:05,800 --> 00:42:06,800
Yeah, sure.
1053 00:42:06,800 --> 00:42:10,800
They could put something fast in there.
1054 00:42:10,800 --> 00:42:11,800
Yeah.
1055 00:42:11,800 --> 00:42:15,800
So, I mean, the talk was, of course, of interest because normally in a real-time
1056 00:42:15,800 --> 00:42:19,800
operating system, you put that in important stuff, right?
1057 00:42:19,800 --> 00:42:20,800
What do you mean?
1058 00:42:20,800 --> 00:42:22,800
I mean, stuff that keeps you alive.
1059 00:42:22,800 --> 00:42:23,800
Yes.
1060 00:42:23,800 --> 00:42:25,800
Or keeps things flying in the air.
1061 00:42:25,800 --> 00:42:26,800
Yeah.
1062 00:42:26,800 --> 00:42:29,800
Segways, also, probably.
1063 00:42:29,800 --> 00:42:31,800
All kind of control systems and things.
1064 00:42:31,800 --> 00:42:32,800
Yeah.
1065 00:42:32,800 --> 00:42:39,800
So, I had some, like… how common are real-time operating systems?
1066 00:42:39,800 --> 00:42:45,800
Because you were looking at two different ones, FreeRTOS and one was called Zephyr or
1067 00:42:45,800 --> 00:42:46,800
something.
1068 00:42:46,800 --> 00:42:47,800
Zephyr, yeah.
1069 00:42:47,800 --> 00:42:48,800
Yeah.
1070 00:42:48,800 --> 00:42:50,800
From the Linux Foundation.
1071 00:42:50,800 --> 00:42:58,800
And how different experience is that compared to some corporations where, for example,
1072 00:42:58,800 --> 00:43:01,800
you get a small base from…
1073 00:43:01,800 --> 00:43:06,800
From STM32 or some other company where they…
1074 00:43:06,800 --> 00:43:13,800
You basically just build the embedded thing upon a very small C library and start there.
1075 00:43:13,800 --> 00:43:14,800
It’s…
1076 00:43:14,800 --> 00:43:21,000
Do you know which companies choose to go for a real-time operating system and which ones
1077 00:43:21,000 --> 00:43:24,800
would just go like raw upon a small…
1078 00:43:24,800 --> 00:43:27,800
Yeah, a small loop like an Arduino.
1079 00:43:27,800 --> 00:43:28,800
Yeah.
1080 00:43:28,800 --> 00:43:32,800
Usually, this is the decision that you take on an architectural level.
1081 00:43:32,800 --> 00:43:35,800
So, you look at what does this system need to do.
1082 00:43:35,800 --> 00:43:43,800
And usually, it’s when you have some kind of shared resource that you need to have accessible
1083 00:43:43,800 --> 00:43:45,800
to several different parts of the program.
1084 00:43:45,800 --> 00:43:53,800
Let’s say that you access a network port or some kind of SD card or something, some
1085 00:43:53,800 --> 00:43:57,800
storage or some communication bus.
1086 00:43:57,800 --> 00:43:59,800
But you need to have shared between the…
1087 00:43:59,800 --> 00:44:02,800
Then you need to have a mutual exclusion, mutexes.
1088 00:44:02,800 --> 00:44:03,800
Yeah, yeah.
1089 00:44:03,800 --> 00:44:08,800
So, if you’re just controlling, like, turning a lamp on and off or something like that, maybe
1090 00:44:08,800 --> 00:44:10,800
you don’t need an operating system.
1091 00:44:10,800 --> 00:44:14,800
But if you got network connections and you got…
1092 00:44:14,800 --> 00:44:20,800
It’s about the same level as when you want to have multi-threaded programs in a
1093 00:44:20,800 --> 00:44:23,800
normal general purpose computer.
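A minimal sketch of the shared-resource case being described, using the FreeRTOS mutex API (my illustration, not from the talk; uart_write stands in for a board-specific driver):

```c
#include "FreeRTOS.h"
#include "semphr.h"
#include "task.h"

static SemaphoreHandle_t uart_mutex;    /* guards the shared UART */

static void uart_write(const char *s) { (void)s; /* board-specific driver here */ }

static void log_task(void *arg)
{
    for (;;) {
        /* Block until the shared UART is free, then print and release. */
        if (xSemaphoreTake(uart_mutex, portMAX_DELAY) == pdTRUE) {
            uart_write((const char *)arg);
            xSemaphoreGive(uart_mutex);
        }
        vTaskDelay(pdMS_TO_TICKS(100));
    }
}

int main(void)
{
    uart_mutex = xSemaphoreCreateMutex();
    xTaskCreate(log_task, "a", 128, (void *)"task A\r\n", 1, NULL);
    xTaskCreate(log_task, "b", 128, (void *)"task B\r\n", 1, NULL);
    vTaskStartScheduler();              /* hands control to the RTOS */
    for (;;) { }                        /* never reached */
}
```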
1094 00:44:23,800 --> 00:44:26,800
But do you also see…
1095 00:44:26,800 --> 00:44:32,800
I mean, the threat level towards smart cards, for instance, and mobile phones, et cetera,
1096 00:44:32,800 --> 00:44:33,800
has raised the…
1097 00:44:33,800 --> 00:44:37,800
I mean, the architectural protection of these kinds of systems.
1098 00:44:37,800 --> 00:44:43,800
I mean, also the embedded systems get, I mean, memory protection and all kinds of protection
1099 00:44:43,800 --> 00:44:47,800
and safe environments for trust zones, et cetera.
1100 00:44:47,800 --> 00:44:54,800
Would you say that this will start to be implemented also in these kinds of equipment that you’re
1101 00:44:54,800 --> 00:44:55,800
looking at?
1102 00:44:55,800 --> 00:44:59,800
Because those systems are more bare-bones…
1103 00:44:59,800 --> 00:45:02,800
Like secure boot, for instance.
1104 00:45:02,800 --> 00:45:04,800
To be honest, I don’t know.
1105 00:45:04,800 --> 00:45:06,800
It really depends.
1106 00:45:06,800 --> 00:45:12,800
Because some of these systems are so small that you can make them provably correct or
1107 00:45:12,800 --> 00:45:14,800
that you are sure that…
1108 00:45:14,800 --> 00:45:16,800
We have bounds checks everywhere.
1109 00:45:16,800 --> 00:45:21,800
So, if you have a small enough program, it’s much easier to prove that it doesn’t have
1110 00:45:21,800 --> 00:45:22,800
any bugs.
1111 00:45:22,800 --> 00:45:24,800
However, when you bring in code that you haven’t written yourself, it’s much
1112 00:45:24,800 --> 00:45:25,800
harder to be sure that it doesn’t have any bugs.
1113 00:45:25,800 --> 00:45:27,800
That happens once you go beyond your own code, to, like, external libraries,
1114 00:45:27,800 --> 00:45:28,800
et cetera.
1115 00:45:28,800 --> 00:45:29,800
I mean, I mentioned this…
1116 00:45:29,800 --> 00:45:30,800
Network stack, or USB stack, or…
1117 00:45:30,800 --> 00:45:31,800
Yeah.
1118 00:45:31,800 --> 00:45:37,800
I mean, I mentioned this in the talk, and the formal term is Software of Unknown Provenance.
1119 00:45:37,800 --> 00:45:39,800
Then you don’t know what it is.
1120 00:45:39,800 --> 00:45:49,200
You add these extra things so that, let’s say, if there should be a bug in this code, then
1121 00:45:49,200 --> 00:45:52,800
we have at least done something to mitigate that risk.
1122 00:45:52,800 --> 00:45:54,800
So, it won’t be a catastrophic error or something?
1123 00:45:54,800 --> 00:45:58,240
for the person with the pacemaker?
1124 00:45:58,240 --> 00:46:05,280
If you have a pacemaker, I’m not so sure they would even run a
1125 00:46:05,280 --> 00:46:08,640
real-time operating system, since that is so resource-intensive.
1126 00:46:08,640 --> 00:46:17,040
I haven’t worked with pacemakers, but I had some colleagues who had worked with
1127 00:46:17,040 --> 00:46:20,400
pacemakers, if I remember correctly, or they had colleagues who had.
1128 00:46:20,400 --> 00:46:25,280
It was no more than three levels deep.
1129 00:46:25,280 --> 00:46:29,760
It was written in assembly language.
1130 00:46:29,760 --> 00:46:33,520
And that is completely safe, right?
1131 00:46:33,520 --> 00:46:38,560
But those were not big programs?
1132 00:46:39,280 --> 00:46:45,920
At the company I worked at, one of the problems was a C compiler error.
1133 00:46:45,920 --> 00:46:47,760
The code base
1134 00:46:47,760 --> 00:46:49,840
did not work and broke
1135 00:46:49,840 --> 00:46:50,320
on a
1136 00:46:50,320 --> 00:46:54,080
specific system, compiled in a specific way.
1137 00:46:54,080 --> 00:47:01,440
And in the end they determined that this compiler, under such and such conditions,
1138 00:47:01,440 --> 00:47:06,640
generates something that does not do what the C code says.
1139 00:47:06,640 --> 00:47:11,760
Undisputed, this is the problem, and in the end I think they
1140 00:47:11,760 --> 00:47:14,080
got a bug fix for this compiler problem.
1141 00:47:14,080 --> 00:47:19,760
But I’m not sure how you would make a decision there.
1142 00:47:19,760 --> 00:47:24,560
But in that case, and I would say in 99% of cases
1143 00:47:24,560 --> 00:47:29,920
it would be crazy, if you have a problem you just go to the assembly language,
1144 00:47:29,920 --> 00:47:32,800
because then I know what it does.
1145 00:47:32,800 --> 00:47:38,240
It seems crazy, but in this specific scenario, when
1146 00:47:38,240 --> 00:47:43,600
we have lost so many hours and we know that this compiler,
1147 00:47:43,600 --> 00:47:47,600
for this specific configuration, generates garbage.
1148 00:47:49,760 --> 00:47:56,560
So can you actually trust a C compiler to execute critical code in an RTOS?
1149 00:47:56,560 --> 00:48:00,000
Yes, you can, but not just any C compiler.
1150 00:48:00,000 --> 00:48:02,640
Okay, so choose carefully, right?
1151 00:48:02,640 --> 00:48:09,600
Yes, there are some C compilers that actually give guarantees that their code is correct.
1152 00:48:09,600 --> 00:48:11,200
That is interesting.
1153 00:48:11,200 --> 00:48:13,200
But those are not the free ones.
1154 00:48:13,200 --> 00:48:15,200
No, I would expect that.
1155 00:48:15,200 --> 00:48:19,560
I can hardly think of any software company that gives any guarantee on their code.
1156 00:48:19,560 --> 00:48:27,360
Well, I actually wrote my master’s thesis on a compiler that did that.
1157 00:48:27,360 --> 00:48:36,360
That was a very good question, completely unprepared, just pure luck.
1158 00:48:36,360 --> 00:48:46,360
I heard there was a talk about aviation safety and programming for avionics systems.
1159 00:48:46,360 --> 00:48:48,360
And there were a lot of C programmers and C++ programmers there.
1162 00:48:51,360 --> 00:48:57,360
And this person asked about quality assurance.
1163 00:48:57,360 --> 00:49:07,360
And he said: stand up if any of you would board an airplane where your company makes the software that controls the plane.
1164 00:49:07,360 --> 00:49:10,360
And only one person raised their hand.
1165 00:49:10,360 --> 00:49:15,360
And I said, okay, yeah, cool, that won’t be a problem.
1166 00:49:15,360 --> 00:49:17,360
We will not even board a plane if we write that code.
1168 00:49:19,360 --> 00:49:21,360
Yeah, we don’t write that.
1169 00:49:21,360 --> 00:49:25,360
So it is hard to write good software anywhere.
1170 00:49:25,360 --> 00:49:27,360
Yes, it is.
1171 00:49:27,360 --> 00:49:29,360
Okay, we…
1172 00:49:29,360 --> 00:49:32,360
Let’s get back to the questions for this interview.
1173 00:49:32,360 --> 00:49:38,360
But there is a well-known aviation incident where they were fairly certain…
1174 00:49:38,360 --> 00:49:40,360
that the air…
1179 00:49:47,360 --> 00:49:50,620
I think it’s an Airbus incident
1180 00:49:50,620 --> 00:49:51,320
Or something like that
1181 00:49:51,320 --> 00:49:54,380
It’s not
1182 00:49:54,380 --> 00:49:55,940
100% proven
1183 00:49:55,940 --> 00:49:58,220
But it seems like the accident
1184 00:49:58,220 --> 00:50:00,680
Started with one of the flight computers
1185 00:50:00,680 --> 00:50:01,520
Flipping a bit
1186 00:50:01,520 --> 00:50:03,880
And it’s hard to prove
1187 00:50:03,880 --> 00:50:05,220
Because it’s
1188 00:50:05,220 --> 00:50:07,920
I think they call it soft faults
1189 00:50:07,920 --> 00:50:08,600
Or something like
1190 00:50:08,600 --> 00:50:11,200
You flip a bit but there’s no
1191 00:50:11,200 --> 00:50:13,660
Permanent damage to the computer
1192 00:50:13,660 --> 00:50:15,360
So you can just look at
1193 00:50:15,360 --> 00:50:17,200
The output and what was registered
1194 00:50:17,200 --> 00:50:19,640
And the simplest explanation
1195 00:50:19,640 --> 00:50:21,660
Was one bit was flipped
1196 00:50:21,660 --> 00:50:23,520
So could you actually do that
1197 00:50:23,520 --> 00:50:25,560
Kind of simulation in the playground
1198 00:50:25,560 --> 00:50:27,180
The random bit flipping
1199 00:50:27,180 --> 00:50:27,920
Not yet
1200 00:50:27,920 --> 00:50:31,620
But you could probably just stop
1201 00:50:31,620 --> 00:50:32,780
The QEMU
1202 00:50:32,780 --> 00:50:34,800
And flip a bit in RAM
1203 00:50:34,800 --> 00:50:36,980
You could of course do that
1204 00:50:36,980 --> 00:50:37,860
But that’s not really
1205 00:50:37,860 --> 00:50:40,220
Just play something radioactive
1206 00:50:40,220 --> 00:50:41,320
Close to the processor
1207 00:50:41,320 --> 00:50:43,420
Close to the docker image
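A small host-side sketch of that kind of soft-fault injection (my own illustration, not part of the playground): flip one random bit in a buffer, the way you might pause the emulator and poke RAM:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Emulate a "soft fault": one transient bit flip, no permanent damage. */
static void flip_random_bit(uint8_t *mem, size_t len)
{
    size_t byte = (size_t)rand() % len;
    int bit = rand() % 8;
    mem[byte] ^= (uint8_t)(1u << bit);   /* the cosmic ray */
    printf("flipped bit %d of byte %zu\n", bit, byte);
}

int main(void)
{
    static uint8_t ram[1024];            /* stand-in for target RAM */
    srand((unsigned)time(NULL));
    flip_random_bit(ram, sizeof ram);
    return 0;
}
```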
1208 00:50:43,420 --> 00:50:45,340
That’s what we have
1209 00:50:45,340 --> 00:50:46,080
The ECC
1210 00:50:46,080 --> 00:50:48,580
Or whatever it’s called
1211 00:50:48,580 --> 00:50:51,340
Self-correcting RAM memories
1212 00:50:51,340 --> 00:50:53,520
So we can mitigate those risks
1213 00:50:53,520 --> 00:50:55,420
At least it reduces it
1214 00:50:55,420 --> 00:50:56,060
heavily.
1215 00:50:56,060 --> 00:50:58,020
But then you have the Rowhammer attack
1216 00:50:58,020 --> 00:51:02,120
Writing rapidly to one memory row
1217 00:51:02,120 --> 00:51:03,760
And get bit flips
1218 00:51:03,760 --> 00:51:04,620
On the other side
1219 00:51:04,620 --> 00:51:07,560
So we were asked
1220 00:51:07,560 --> 00:51:10,020
By the organizers of the event
1221 00:51:10,020 --> 00:51:11,620
To look through the questions
1222 00:51:11,620 --> 00:51:12,020
And
1223 00:51:12,020 --> 00:51:14,960
Most of the questions
1224 00:51:14,960 --> 00:51:15,320
Were the same
1225 00:51:15,340 --> 00:51:17,500
That they either were
1226 00:51:17,500 --> 00:51:20,660
Weird or we didn’t understand them
1227 00:51:20,660 --> 00:51:22,260
There was one good question
1228 00:51:22,260 --> 00:51:23,300
That was
1229 00:51:23,300 --> 00:51:25,280
Should you write
1230 00:51:25,280 --> 00:51:27,420
Embedded software in Rust
1231 00:51:27,420 --> 00:51:28,220
Instead of C
1232 00:51:28,220 --> 00:51:31,320
That depends
1233 00:51:31,320 --> 00:51:36,000
I personally haven’t been working much
1234 00:51:36,000 --> 00:51:36,520
With Rust
1235 00:51:36,520 --> 00:51:38,440
Then we’re on the team
1236 00:51:38,440 --> 00:51:41,000
But I have
1237 00:51:41,000 --> 00:51:42,980
Friends
1238 00:51:42,980 --> 00:51:45,320
Who are really liking it
1239 00:51:45,340 --> 00:51:46,380
And speaking for it
1240 00:51:46,380 --> 00:51:49,040
But one thing you need to consider
1241 00:51:49,040 --> 00:51:50,540
Is say that you’re using
1242 00:51:50,540 --> 00:51:51,360
An old code base
1243 00:51:51,360 --> 00:51:52,360
That is written in C
1244 00:51:52,360 --> 00:51:55,640
Do you really rewrite everything
1245 00:51:55,640 --> 00:51:57,900
And if you make a new implementation
1246 00:51:57,900 --> 00:52:00,060
Do you really make a completely new implementation
1247 00:52:00,060 --> 00:52:02,020
Or are you basing it on libraries
1248 00:52:02,020 --> 00:52:04,280
And what are those libraries written in
1249 00:52:04,280 --> 00:52:05,980
I don’t think it’s
1250 00:52:05,980 --> 00:52:07,720
It is not yet
1251 00:52:07,720 --> 00:52:08,740
Close
1252 00:52:08,740 --> 00:52:11,520
In time
1253 00:52:11,520 --> 00:52:14,180
That we see the majority of the systems
1254 00:52:14,180 --> 00:52:15,000
Written in Rust
1255 00:52:15,340 --> 00:52:16,320
But I definitely
1256 00:52:16,320 --> 00:52:18,260
Think that
1257 00:52:18,260 --> 00:52:19,860
There will be good things
1258 00:52:19,860 --> 00:52:20,540
Coming there
1259 00:52:20,540 --> 00:52:23,260
That’s a good
1260 00:52:23,260 --> 00:52:25,720
Ending of this presentation
1261 00:52:25,720 --> 00:52:27,600
I think Lars
1262 00:52:27,600 --> 00:52:29,800
It’s been a pleasure having you on board
1263 00:52:29,800 --> 00:52:31,620
This small
1264 00:52:31,620 --> 00:52:33,800
Follow up interview on your presentation on stage
1265 00:52:33,800 --> 00:52:34,640
Here at
1266 00:52:34,640 --> 00:52:38,020
Can I ask one question
1267 00:52:38,020 --> 00:52:38,980
The last question
1268 00:52:38,980 --> 00:52:41,560
I’ll let you do that
1269 00:52:41,560 --> 00:52:43,280
So
1270 00:52:43,280 --> 00:52:45,340
One of your demos
1271 00:52:45,340 --> 00:52:46,780
Was on
1272 00:52:46,780 --> 00:52:48,620
Bad
1273 00:52:48,620 --> 00:52:50,540
Stack canaries
1274 00:52:50,540 --> 00:52:52,580
Was it FreeRTOS?
1275 00:52:52,580 --> 00:52:54,060
No that was in
1276 00:52:54,060 --> 00:52:56,240
All the live demos were in Zephyr
1277 00:52:56,240 --> 00:52:56,800
Okay
1278 00:52:56,800 --> 00:52:59,680
Do you know if Zephyr
1279 00:52:59,680 --> 00:53:02,040
Has recommendations for
1280 00:53:02,040 --> 00:53:04,960
Like how you should do
1281 00:53:04,960 --> 00:53:05,760
Stack canaries
1282 00:53:05,760 --> 00:53:07,960
Do they instruct
1283 00:53:07,960 --> 00:53:11,180
Is there some instruction on changing the code
1284 00:53:11,180 --> 00:53:11,920
To make it
1285 00:53:11,920 --> 00:53:13,000
Because one…
1286 00:53:13,000 --> 00:53:13,320
One
1287 00:53:13,320 --> 00:53:13,460
One
1288 00:53:13,460 --> 00:53:14,820
One
1289 00:53:14,820 --> 00:53:15,320
One
1290 00:53:15,320 --> 00:53:33,760
One
1291 00:53:33,760 --> 00:53:39,140
[inaudible]
1305 00:53:45,300 --> 00:53:54,300
But I have certainly heard about that, and also about other embedded architectures that are less capable.
1306 00:53:54,300 --> 00:54:00,300
But I’m not an embedded developer, I only have a small amount of embedded experience.
1307 00:54:00,300 --> 00:54:07,300
I have worked more alongside embedded developers than with it in my own work.
1308 00:54:07,300 --> 00:54:15,300
But for example the STM32s come with a small random number generator.
1309 00:54:15,300 --> 00:54:20,300
So you could theoretically start…
1310 00:54:20,300 --> 00:54:28,300
It might be too heavy to always pick a new random value, because that’s a lot of code you pull in.
1311 00:54:28,300 --> 00:54:33,300
But you could at least have a random starting value for the stack canary.
1312 00:54:33,300 --> 00:54:36,300
But…
1313 00:54:36,300 --> 00:54:37,300
But I mean…
1314 00:54:37,300 --> 00:54:39,300
In GCC and other things…
1315 00:54:39,300 --> 00:54:43,300
I think you need to implement something yourself to get a good stack protector.
1316 00:54:43,300 --> 00:54:45,300
So it’s a bit…
1317 00:54:45,300 --> 00:54:49,300
Do they say anything about how to implement it?
1318 00:54:49,300 --> 00:54:55,300
I haven’t read that much of the documentation about it.
1319 00:54:55,300 --> 00:54:58,300
I just found this flag, enable the stack protector.
1320 00:54:58,300 --> 00:55:03,300
And I have to be honest, I don’t know if that flag was correct.
1321 00:55:03,300 --> 00:55:06,300
Because as you may have seen in the presentation…
1322 00:55:06,300 --> 00:55:12,300
What it holds is the address of the stack canary.
1323 00:55:12,300 --> 00:55:14,300
And not the value of the stack canary.
1324 00:55:14,300 --> 00:55:16,300
So it was actually the same the whole time.
1325 00:55:16,300 --> 00:55:18,300
And that’s not how it’s supposed to be.
1326 00:55:18,300 --> 00:55:22,300
I wondered about that myself at first.
1327 00:55:22,300 --> 00:55:25,300
But I will definitely investigate it a bit more.
1328 00:55:25,300 --> 00:55:28,300
I only found it these last few days.
1329 00:55:28,300 --> 00:55:33,300
And I already had the areas I planned to present.
1330 00:55:33,300 --> 00:55:34,300
Yeah, yeah, yeah.
1331 00:55:34,300 --> 00:55:35,300
But it was really…
1332 00:55:35,300 --> 00:55:36,300
It was really…
1333 00:55:36,300 --> 00:55:38,300
This is so unfortunate.
1334 00:55:38,300 --> 00:55:40,300
So it might be something you have discovered.
1335 00:55:40,300 --> 00:55:43,300
It could be the title of your next talk.
1336 00:55:43,300 --> 00:55:53,300
But normally, regardless of whether you have implemented the randomness for the stack canary…
1337 00:55:53,300 --> 00:55:59,300
The default implementation is bound to be trivial if there is nothing there.
1338 00:55:59,300 --> 00:56:01,300
I mean…
1339 00:56:01,300 --> 00:56:03,300
If there is nothing there…
1340 00:56:03,300 --> 00:56:07,300
How do you generate randomness when you have no randomness source?
1341 00:56:07,300 --> 00:56:08,300
That’s a classic problem.
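For reference, a hedged bare-metal sketch of what “implementing something yourself” for GCC’s stack protector can look like: on freestanding targets GCC expects you to provide the guard word and the failure handler yourself, and hw_rng_read() below is a hypothetical stand-in for whatever TRNG the chip offers (for example the STM32 RNG peripheral mentioned above). Compile with -fstack-protector-strong.

```c
#include <stdint.h>

uintptr_t __stack_chk_guard;            /* GCC reads the canary from here */

extern uint32_t hw_rng_read(void);      /* hypothetical TRNG driver */

/* Call early at boot, before any protected function runs. */
void stack_guard_init(void)
{
    __stack_chk_guard = (uintptr_t)hw_rng_read();
}

/* GCC calls this when a function epilogue finds a smashed canary. */
void __stack_chk_fail(void)
{
    for (;;) { }                        /* halt; a real system might reset */
}
```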
1342 00:56:08,300 --> 00:56:13,300
I mentioned another talk during my talk.
1343 00:56:13,300 --> 00:56:18,300
And if I remember correctly, it was in that talk that he actually had a plan for it.
1344 00:56:18,300 --> 00:56:22,300
You can use the initial state of the SRAM.
1345 00:56:22,300 --> 00:56:23,300
That bias there.
1346 00:56:23,300 --> 00:56:26,300
Because it fluctuates a bit.
1347 00:56:26,300 --> 00:56:28,300
But I recommend that talk.
1348 00:56:28,300 --> 00:56:29,300
Go watch it.
1349 00:56:29,300 --> 00:56:32,300
And how you simulate that is…
1350 00:56:32,300 --> 00:56:35,300
In my case, I had this emulator.
1351 00:56:35,300 --> 00:56:38,300
And that emulator will probably come up with all zeroes.
1352 00:56:38,300 --> 00:56:40,300
So there probably wouldn’t be any entropy.
1353 00:56:40,300 --> 00:56:45,300
There’s a company, I don’t remember if they’re Swedish or something.
1354 00:56:45,300 --> 00:56:50,300
But they think they can generate…
1355 00:56:50,300 --> 00:56:53,300
I think they’re called Intrinsic or something.
1356 00:56:53,300 --> 00:56:56,300
But their idea is something like…
1357 00:56:56,300 --> 00:57:01,300
SRAM tends to power up the same way.
1358 00:57:02,300 --> 00:57:05,300
Or power up the same way on every boot.
1359 00:57:05,300 --> 00:57:10,300
It’s something where, from one kilobyte of SRAM…
1360 00:57:10,300 --> 00:57:15,300
The random initial state should make it possible to derive…
1361 00:57:15,300 --> 00:57:20,300
A key that comes out the same on every boot.
1362 00:57:20,300 --> 00:57:22,300
But it’s not clonable?
1363 00:57:22,300 --> 00:57:25,300
No, it’s not clonable.
1364 00:57:25,300 --> 00:57:30,300
But it should be unique for every board.
1365 00:57:30,300 --> 00:57:35,300
Because the idea is that the bit drift is different.
1366 00:57:35,300 --> 00:57:39,300
So I think the idea is something like…
1367 00:57:39,300 --> 00:57:42,300
The initialization of the SRAM…
1368 00:57:42,300 --> 00:57:45,300
Since you never write anything to it at boot…
1369 00:57:45,300 --> 00:57:50,300
It won’t be the same for all boards.
1370 00:57:50,300 --> 00:57:54,300
So every board turns out a bit different, because it was manufactured slightly differently.
1371 00:57:54,300 --> 00:57:56,300
PUF? Is that P-U-F?
1372 00:57:56,300 --> 00:57:58,300
Yes, it’s one of…
1373 00:57:58,300 --> 00:58:00,300
It’s one of those unclonable functions.
1374 00:58:00,300 --> 00:58:02,300
It’s one of the PUF techniques, yes.
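A hedged sketch of that SRAM idea (mine, not from the talk): leave a block of SRAM uninitialized at boot and hash its power-up pattern into a per-board value. Real SRAM PUFs add error correction for the few bits that flip between boots, and the .noinit section name depends on your linker script:

```c
#include <stddef.h>
#include <stdint.h>

/* Placed in a section the startup code never zeroes or copies, so it
 * still holds the SRAM power-up pattern when we read it. */
__attribute__((section(".noinit")))
static uint8_t puf_block[1024];

/* FNV-1a over the raw power-up bits. */
static uint32_t fnv1a(const uint8_t *p, size_t n)
{
    uint32_t h = 2166136261u;
    while (n--) { h ^= *p++; h *= 16777619u; }
    return h;
}

/* Roughly stable per board across boots; unique between boards. */
uint32_t board_unique_value(void)
{
    return fnv1a(puf_block, sizeof puf_block);
}
```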
1375 00:58:02,300 --> 00:58:03,300
Cool!
1376 00:58:03,300 --> 00:58:08,300
So let’s leave it at that question and maybe look more into PUFs.
1377 00:58:08,300 --> 00:58:11,300
And see whether that could be the solution to the canary problem.
1378 00:58:11,300 --> 00:58:15,300
Which maybe exists in some of these RTOSes.
1379 00:58:15,300 --> 00:58:19,300
Okay Lars, thank you so much for bringing us your thoughts.
1380 00:58:19,300 --> 00:58:21,300
Well, thank you for having me on the podcast.
1381 00:58:21,300 --> 00:58:23,300
No problem at all. The pleasure is ours.
1382 00:58:23,300 --> 00:58:27,300
And have a good time here for the rest of the conference.
1383 00:58:27,300 --> 00:58:29,300
See you, 2021 this time.
1384 00:58:29,300 --> 00:58:31,300
Bye!
1385 00:58:33,300 --> 00:58:35,300
Outro coming soon.