
Säkerhetspodcasten #166 - CS3STHLM 2019

Listen

mp3

Contents

In today's episode our roving reporters present four interviews recorded during CS3STHLM 2019. We talk with Andy Greenberg, Monta Elkins, Stephen Hilt, Jimmy Wylie and Reid Wightman.

Recorded: 2019-10-23/24. Length: 01:15:38.

AI transcription

The AI tries to understand us… Please bear with the occasional wild mistranscription.

1 00:00:00,000 --> 00:00:07,480 Hello and welcome to Säkerhetspodcasten from CS3 Stockholm 2019.

2 00:00:08,060 --> 00:00:17,480 And I'm very glad about this interview, because on this side of the table I have Jesper Larsson from Great Britain.

3 00:00:17,480 --> 00:00:19,220 Yes, that's true. I did this one.

4 00:00:19,780 --> 00:00:25,720 And on the other side of the table we have the guest and keynote speaker of the day, Andy Greenberg.

5 00:00:26,080 --> 00:00:27,400 Hi, glad to be here.

6 00:00:27,400 --> 00:00:34,500 And let's start with a short introduction about yourself and the topic of your talk.

7 00:00:34,900 --> 00:00:41,540 Yes, so I'm a writer for Wired Magazine and I have written a book called Sandworm.

8 00:00:42,140 --> 00:00:49,640 It is, I think, the story of the first full-blown cyberwar that is really happening.

9 00:00:50,140 --> 00:00:56,640 And it follows this one hacker group called Sandworm from their first appearance.

[Segments 10-32 (00:00:57-00:10:57): the transcription loops on the previous sentence; no usable transcript exists for this span.]

33 00:10:57,300 --> 00:10:59,380 I just talked about today at CS3 Stockholm.

34 00:10:59,520 --> 00:11:04,420 I don’t think, you know, unless you’ve read the book, it’s not public anywhere.

35 00:11:05,140 --> 00:11:06,160 I don’t think.

36 00:11:06,320 --> 00:11:08,460 Like I, maybe it’s been…

37 00:11:08,460 --> 00:11:10,000 To the data loss thing?

38 00:11:10,320 --> 00:11:15,100 Yeah, this effect of NotPetya on hospitals is something that I was surprised to learn about.

39 00:11:15,240 --> 00:11:19,600 It was not, because, you know, you hear about the damages of NotPetya on public companies.

40 00:11:19,600 --> 00:11:20,780 They have to tell their shareholders.

41 00:11:22,260 --> 00:11:25,340 Hospitals have not always been so forthcoming.

42 00:11:25,340 --> 00:11:31,820 And I think it was surprising to me that NotPetya actually affected probably hundreds of American hospitals.

43 00:11:32,040 --> 00:11:36,480 Not directly, but because they shut down this speech-to-text software firm, Nuance,

44 00:11:36,900 --> 00:11:39,800 that these hospitals use for medical record transcription.

45 00:11:40,540 --> 00:11:46,780 And as a result, hospitals were like losing millions of updates to medical records.

46 00:11:46,840 --> 00:11:48,980 And they didn’t know which ones were updated and which ones weren’t.

47 00:11:48,980 --> 00:11:54,980 Which in the case of, you know, I told the story in the talk today of like a central IT

48 00:11:55,340 --> 00:12:02,320 staffer at a major hospital who like a nurse came up to her in this panic telling her,

49 00:12:02,780 --> 00:12:05,520 we have a child who needs to be transferred for a procedure.

50 00:12:05,900 --> 00:12:09,740 And we don’t know if the child has been cleared for surgery because their medical record is incomplete.

51 00:12:10,160 --> 00:12:15,620 And they had to track down in the raw audio files the missing change that had been lost

52 00:12:15,620 --> 00:12:19,640 because Nuance was down, this software firm.

53 00:12:20,420 --> 00:12:25,120 And then that happened three more times in a week with just barely like hours,

54 00:12:25,340 --> 00:12:28,020 before actual effects on someone’s health.

55 00:12:28,780 --> 00:12:33,560 So I couldn’t find a case where somebody was actually harmed, you know,

56 00:12:33,780 --> 00:12:36,780 their health was harmed directly as a result of NotPetya.

57 00:12:37,060 --> 00:12:42,820 But when you hear those stories like three times in a week that they almost had to delay

58 00:12:42,820 --> 00:12:49,580 or cancel a surgery, and then you multiply that out by hundreds and thousands of patients

59 00:12:49,580 --> 00:12:54,860 at dozens or hundreds of hospitals in the U.S. alone, and there were hospitals shut down

60 00:12:54,860 --> 00:12:55,260 in Ukraine.

61 00:12:55,340 --> 00:12:58,780 Yeah, yeah, yeah, I think that like,

62 00:12:58,780 --> 00:13:04,660 it’s hard to say that there wasn’t some harm done to some human’s health or life as a result of NotPetya too.

63 00:13:05,100 --> 00:13:05,580 Exactly.

64 00:13:05,660 --> 00:13:08,660 And that’s not captured in the dollar numbers.

65 00:13:08,700 --> 00:13:09,380 No, no.

66 00:13:09,740 --> 00:13:17,060 But, and I mean, you also mentioned that there were probably a lot of collateral effects of NotPetya.

67 00:13:17,100 --> 00:13:22,700 Because, I mean, there were companies struck in Russia as well, and so on and so on.

68 00:13:22,700 --> 00:13:24,300 Yeah, yeah, I think that like,

69 00:13:25,340 --> 00:13:28,100 it may mostly have been collateral damage.

70 00:13:28,100 --> 00:13:30,340 It’s very hard to say it’s collateral damage.

71 00:13:30,340 --> 00:13:33,220 You have to be a little bit inside the heads of the hackers.

72 00:13:33,220 --> 00:13:36,700 You have to know what they were intending to do to know what they weren’t intending to do.

73 00:13:36,940 --> 00:13:43,700 But it seems like they probably were just targeting Ukraine, and then were just extremely

74 00:13:43,700 --> 00:13:47,620 reckless about allowing the worm to spread, you know, willy nilly.

75 00:13:47,900 --> 00:13:53,180 That, you know, we know this was Russia, and we know that NotPetya hit Russia very badly also.

76 00:13:53,180 --> 00:13:55,180 So it’s,

77 00:13:55,340 --> 00:13:57,380 it’s hard to imagine that that was intentional.

78 00:13:57,660 --> 00:14:04,580 But we also know that the way that NotPetya was seeded out to all these victims was using this

79 00:14:04,580 --> 00:14:05,500 accounting software.

80 00:14:05,820 --> 00:14:10,420 And that accounting software would have had, you know, the hackers could have accessed via the

81 00:14:10,420 --> 00:14:16,180 accounting software, the unique tax ID numbers of all of the victims, and very easily figured out

82 00:14:16,180 --> 00:14:21,940 exactly who was going to be infected, and even targeted their attack if they wanted to.

83 00:14:21,980 --> 00:14:22,380 Yeah.

84 00:14:22,420 --> 00:14:25,300 And they just didn’t seem to care, like, they seemed fine.

85 00:14:25,340 --> 00:14:31,460 Like, they were just fine with allowing this to just be an amazingly, like, widespread and indiscriminate attack.

86 00:14:31,460 --> 00:14:32,960 Yeah.

87 00:14:33,180 --> 00:14:42,820 So, at the end of your talk, you're also discussing the wild west mentality of this group, that impressed your boss of the day,

88 00:14:42,820 --> 00:14:44,820 Right, I tried.

89 00:14:44,820 --> 00:14:49,660 Once I knew, once it became clear that Sandworm and this whole series of attacks with these sort of, you know, maybe a group or a network of how to be targeted, when the attack was so obvious, I was completely shocked.

90 00:14:49,660 --> 00:14:51,660 Yeah.

91 00:14:51,660 --> 00:14:52,660 Yeah.

92 00:14:52,660 --> 00:14:53,660 Yeah.

93 00:14:53,660 --> 00:14:54,660 I was completely shocked.

[Segments 94-137 (00:14:54-00:28:25): the transcription degenerates into repeated filler ("Yeah.", "Thank you for watching this video.") and no usable transcript exists for this span.]

138 00:28:25,300 --> 00:28:31,300 They came up with pictures, and not everyone thought it had really happened.

139 00:28:31,460 --> 00:28:34,540 They got a lot of criticism for putting out the article.

140 00:28:34,700 --> 00:28:40,140 They had an incentive, in that if they put out articles that went viral-

141 00:28:40,300 --> 00:28:44,100 -then they would get more money. -I don't know about that.

142 00:28:44,260 --> 00:28:48,140 I just look at the technical side.

143 00:28:48,300 --> 00:28:53,180 We never saw samples, pictures of a real device.

144 00:28:53,180 --> 00:28:58,300 They had pictures that I thought were a kind of mock-up.

145 00:28:58,460 --> 00:29:02,820 So I thought we should do it for real.

146 00:29:02,980 --> 00:29:05,940 We are going to try to do it.

147 00:29:06,100 --> 00:29:11,060 As usual, I try to see how easy something is.

148 00:29:11,220 --> 00:29:16,580 Can I do it relatively cheaply and easily?

149 00:29:16,740 --> 00:29:23,140 We hear about the 80-20 rule, where you get 80% of the job done with 20% of the effort.

150 00:29:23,300 --> 00:29:26,380 My challenges are more like the 50-10 rule.

151 00:29:26,540 --> 00:29:32,140 I try to get 50% of the job done with 10% of the effort.

152 00:29:32,300 --> 00:29:36,500 So then you thought, we are going to put something into something.

153 00:29:36,660 --> 00:29:39,020 How did you decide what to go after?

154 00:29:39,180 --> 00:29:42,220 I work in industrial control systems.

155 00:29:42,380 --> 00:29:48,900 I looked around at the industrial appliances in my lab that I have access to.

156 00:29:49,060 --> 00:29:52,420 To a certain degree I decided-

157 00:29:52,540 --> 00:29:56,020 that a serial port could be an easy attack vector.

158 00:29:56,180 --> 00:29:59,340 They are almost all configured via serial ports.

159 00:29:59,500 --> 00:30:04,660 Then I thought that, yes, the industrial control system world-

160 00:30:04,820 --> 00:30:11,700 is often run via serial ports, but so are Cisco firewalls and appliances.

161 00:30:11,860 --> 00:30:14,940 That could get a bit broader attention.

162 00:30:15,100 --> 00:30:19,660 I started in the ICS space, but a serial port is a serial port.

163 00:30:19,820 --> 00:30:21,500 I think the Cisco attack-

164 00:30:21,500 --> 00:30:23,980 got more attention.

165 00:30:24,140 --> 00:30:28,180 So you opened it up and started exploring-

166 00:30:28,340 --> 00:30:31,420 where to put this grain of rice, right?

167 00:30:31,580 --> 00:30:36,300 It's a bit bigger than that. It's not as small-

168 00:30:36,460 --> 00:30:39,340 as the other article said.

169 00:30:39,500 --> 00:30:41,380 A big grain of rice.

170 00:30:41,540 --> 00:30:46,060 Yes, so 50% of the effect at 10% of the cost.

171 00:30:46,220 --> 00:30:51,300 I'm not building custom chips. It's about 5 mm in size.

172 00:30:51,300 --> 00:30:53,900 It's a little grain of rice.

173 00:30:54,060 --> 00:30:57,500 I didn't just open it up and start exploring.

174 00:30:57,660 --> 00:31:00,260 I had to test that it would work first.

175 00:31:00,420 --> 00:31:02,780 I wanted to fail fast and early.

176 00:31:02,940 --> 00:31:05,580 There's no point spending a lot of work on it if it doesn't work.

177 00:31:05,740 --> 00:31:08,740 It was about testing it with other hardware first.

178 00:31:08,900 --> 00:31:10,780 I started with Unos.

179 00:31:10,940 --> 00:31:12,180 Arduino Unos.

180 00:31:12,340 --> 00:31:16,940 Once I understood that I could interface with them and get them to do the attack-

181 00:31:17,100 --> 00:31:19,380 I chose a smaller chip.

182 00:31:19,540 --> 00:31:20,860 And then-

183 00:31:20,860 --> 00:31:22,500 I’m looking for where and how to place it.

184 00:31:23,560 --> 00:31:24,940 And how much

185 00:31:24,940 --> 00:31:26,920 in the approach that you

186 00:31:26,920 --> 00:31:28,720 made, how much hardware

187 00:31:28,720 --> 00:31:30,740 versus how much software do you need

188 00:31:30,740 --> 00:31:33,000 to approach

189 00:31:33,000 --> 00:31:34,260 this attack with?

190 00:31:36,400 --> 00:31:37,580 Yeah, I mean

191 00:31:37,580 --> 00:31:39,060 in retrospect

192 00:31:39,060 --> 00:31:41,240 I don’t think

193 00:31:41,240 --> 00:31:42,860 it’s very difficult for somebody

194 00:31:42,860 --> 00:31:45,040 with a little bit of technical

195 00:31:45,040 --> 00:31:47,040 skill in like programming Arduinos

196 00:31:47,040 --> 00:31:48,400 or using a soldering iron.

197 00:31:48,860 --> 00:31:50,740 It seems like for a lot of

198 00:31:50,740 --> 00:31:53,020 tasks, technical

199 00:31:53,020 --> 00:31:55,180 tasks, the hard thing

200 00:31:55,180 --> 00:31:56,840 is the belief that it can be done.

201 00:31:57,220 --> 00:31:59,080 And once you decide that it can be done

202 00:31:59,080 --> 00:32:01,180 it’s not very difficult. And it should be

203 00:32:01,180 --> 00:32:03,240 easy if you want to follow along and do this

204 00:32:03,240 --> 00:32:03,820 on your own.

205 00:32:04,800 --> 00:32:07,420 The hardware is an ATtiny

206 00:32:07,420 --> 00:32:08,560 85 chip.

207 00:32:09,140 --> 00:32:11,340 You can program it with the Arduino IDE

208 00:32:11,340 --> 00:32:13,420 environment. And the software

209 00:32:13,420 --> 00:32:15,360 basically just sends a set

210 00:32:15,360 --> 00:32:17,060 of commands that you can

211 00:32:17,060 --> 00:32:19,020 follow along from a Cisco

212 00:32:19,020 --> 00:32:20,580 password recovery document.

213 00:32:20,740 --> 00:32:22,620 So if you can print

214 00:32:22,620 --> 00:32:24,680 to a serial port, that’s the

215 00:32:24,680 --> 00:32:26,720 software. And

216 00:32:26,720 --> 00:32:28,400 with just a little bit of

217 00:32:28,400 --> 00:32:30,480 fiddling, if you can

218 00:32:30,480 --> 00:32:32,540 wire up that hardware

219 00:32:32,540 --> 00:32:34,680 chip to the motherboard, you can send

220 00:32:34,680 --> 00:32:36,660 those commands on reboot when they’re needed.

221 00:32:36,960 --> 00:32:38,680 So the implant itself is

222 00:32:38,680 --> 00:32:40,600 something that you pick up in any hardware

223 00:32:40,600 --> 00:32:42,680 store, or not hardware

224 00:32:42,680 --> 00:32:44,100 store, but an electronics store

225 00:32:44,100 --> 00:32:46,220 in town, right?

226 00:32:46,580 --> 00:32:48,920 Yeah, you know, China’s my favorite

227 00:32:48,920 --> 00:32:50,720 hardware provider.

228 00:32:50,740 --> 00:32:52,580 And I think there’s some irony

229 00:32:52,580 --> 00:32:54,600 in using cheap Chinese equipment to

230 00:32:54,600 --> 00:32:55,680 build this implant.

231 00:32:57,000 --> 00:32:58,260 Now you can get these

232 00:32:58,260 --> 00:33:00,700 from Amazon. I’m

233 00:33:00,700 --> 00:33:03,000 using the Digispark

234 00:33:03,000 --> 00:33:04,540 board, which has the small

235 00:33:04,540 --> 00:33:06,660 processor on it. And I’m programming

236 00:33:06,660 --> 00:33:08,640 it on that board and then desoldering

237 00:33:08,640 --> 00:33:10,620 it from there and soldering it

238 00:33:10,620 --> 00:33:12,360 onto the Cisco

239 00:33:12,360 --> 00:33:13,260 motherboard.

240 00:33:14,500 --> 00:33:16,600 They’re $2 US

241 00:33:16,600 --> 00:33:18,640 each, so the hardware

242 00:33:18,640 --> 00:33:19,460 is not expensive.

243 00:33:20,740 --> 00:33:22,400 The software is not that

244 00:33:22,400 --> 00:33:24,480 difficult. You spend a little

245 00:33:24,480 --> 00:33:26,600 time looking around the best way to install

246 00:33:26,600 --> 00:33:28,540 it. Again, I think the key is

247 00:33:28,540 --> 00:33:30,540 just knowing that it can be done.

248 00:33:30,860 --> 00:33:32,260 And then go for it, right?

249 00:33:32,380 --> 00:33:32,540 Yeah.

250 00:33:33,820 --> 00:33:36,020 That’s quite amazing. And I mean,

251 00:33:36,100 --> 00:33:38,520 you got a lot of attention around

252 00:33:38,520 --> 00:33:40,080 when you put out the article

253 00:33:40,080 --> 00:33:41,440 and the spin

254 00:33:41,440 --> 00:33:43,860 went quite viral

255 00:33:43,860 --> 00:33:46,280 in itself, that you could actually

256 00:33:46,280 --> 00:33:48,120 do this kind of attack.

257 00:33:48,900 --> 00:33:50,240 What’s the response been

258 00:33:50,240 --> 00:33:50,580 from

259 00:33:50,580 --> 00:33:52,880 the others?

260 00:33:55,080 --> 00:33:55,260 Well,

261 00:33:55,680 --> 00:33:58,160 one of the most interesting things

262 00:33:58,160 --> 00:33:59,320 for me is

263 00:33:59,320 --> 00:34:02,400 using my Android phone

264 00:34:02,400 --> 00:34:04,180 and occasionally, more than once,

265 00:34:04,400 --> 00:34:06,300 I have popped up on my own news feed.

266 00:34:06,780 --> 00:34:07,940 And there’s something when

267 00:34:07,940 --> 00:34:09,580 I open it up and I see

268 00:34:09,580 --> 00:34:11,880 somebody has quoted me

269 00:34:11,880 --> 00:34:14,520 in my news feed. It’s like, oh wow, that’s

270 00:34:14,520 --> 00:34:16,340 interesting. That doesn’t happen to me

271 00:34:16,340 --> 00:34:16,760 every day.

272 00:34:17,700 --> 00:34:20,500 But I mean, if we say Cisco

273 00:34:20,580 --> 00:34:22,380 for instance, have they replied

274 00:34:22,380 --> 00:34:22,960 in any way?

275 00:34:23,540 --> 00:34:25,040 Cisco called me.

276 00:34:26,720 --> 00:34:28,100 That’s a sort of very

277 00:34:28,100 --> 00:34:30,280 interesting question is

278 00:34:30,280 --> 00:34:32,020 whether or not this is a vulnerability.

279 00:34:32,580 --> 00:34:34,300 I didn’t think it was. I didn’t talk to

280 00:34:34,300 --> 00:34:36,500 them originally if I thought it was a vulnerability.

281 00:34:36,940 --> 00:34:38,340 And after some discussion

282 00:34:38,340 --> 00:34:40,360 with them, they decided that it’s

283 00:34:40,360 --> 00:34:41,400 not a vulnerability.

284 00:34:42,260 --> 00:34:44,080 It uses an existing

285 00:34:44,080 --> 00:34:46,360 feature in the devices that would allow

286 00:34:46,360 --> 00:34:48,020 you to recover passwords.

287 00:34:48,580 --> 00:34:50,240 The normal default, though, is that you have to

288 00:34:50,240 --> 00:34:52,220 physically be present in front of the

289 00:34:52,220 --> 00:34:53,660 device and plugged into it.

290 00:34:54,160 --> 00:34:56,220 My twist is leaving a chip behind that

291 00:34:56,220 --> 00:34:57,360 can do that later on.

292 00:34:57,880 --> 00:34:58,760 I think that

293 00:34:58,760 --> 00:35:02,200 although we don’t consider that

294 00:35:02,200 --> 00:35:03,600 a vulnerability now,

295 00:35:04,200 --> 00:35:05,860 that in the future, maybe

296 00:35:05,860 --> 00:35:07,940 five years from now,

297 00:35:08,500 --> 00:35:09,840 some of this

298 00:35:09,840 --> 00:35:12,280 unauthenticated local

299 00:35:12,280 --> 00:35:14,280 physical access might be

300 00:35:14,280 --> 00:35:16,080 considered a vulnerability just as we

301 00:35:16,080 --> 00:35:18,060 evolve our understanding

302 00:35:18,060 --> 00:35:20,080 of what vulnerabilities and security is

303 00:35:20,080 --> 00:35:20,580 over time.

304 00:35:22,060 --> 00:35:24,240 It’s not about being physically present

305 00:35:24,240 --> 00:35:26,080 in the place where you approach

306 00:35:26,080 --> 00:35:28,020 it, right? It’s a supply chain that’s

307 00:35:28,020 --> 00:35:29,920 interesting. Can you intercept

308 00:35:29,920 --> 00:35:32,100 it while it’s delivered to that

309 00:35:32,100 --> 00:35:33,880 factory or anything else?

310 00:35:34,640 --> 00:35:35,980 Yeah, and frankly, that’s

311 00:35:35,980 --> 00:35:36,740 the hard part.

312 00:35:37,960 --> 00:35:39,960 We get a lot of play about building the

313 00:35:39,960 --> 00:35:42,060 device. The hard part would be introducing

314 00:35:42,060 --> 00:35:43,980 it in the supply chain. I could

315 00:35:43,980 --> 00:35:46,100 do small numbers, perhaps.

316 00:35:46,160 --> 00:35:48,000 I could resell these devices on

317 00:35:48,000 --> 00:35:49,880 eBay. Maybe

318 00:35:49,880 --> 00:35:52,060 I could social engineer my way

319 00:35:52,060 --> 00:35:54,020 into a site and replace a

320 00:35:54,020 --> 00:35:56,200 device or provide a device.

321 00:35:56,420 --> 00:35:57,920 But that would be

322 00:35:57,920 --> 00:35:59,860 hard to do on

323 00:35:59,860 --> 00:36:02,080 a large scale. The hardware

324 00:36:02,080 --> 00:36:03,780 part is easy. I think probably

325 00:36:03,780 --> 00:36:05,780 the supply chain interdiction is

326 00:36:05,780 --> 00:36:06,880 the harder task.

327 00:36:07,220 --> 00:36:09,920 But once you get a foothold there in some way,

328 00:36:10,040 --> 00:36:12,040 that’s also a soft spot

329 00:36:12,040 --> 00:36:13,820 for the asset owner

330 00:36:13,820 --> 00:36:15,980 because you trust what you’re getting

331 00:36:15,980 --> 00:36:18,040 in the original package,

332 00:36:18,040 --> 00:36:19,780 so to speak. You just open it up.

333 00:36:19,880 --> 00:36:21,600 It has all the manuals and

334 00:36:21,600 --> 00:36:24,040 everything in the box, right?

335 00:36:24,440 --> 00:36:25,840 So why should you worry about

336 00:36:25,840 --> 00:36:27,560 that it has a latent

337 00:36:27,560 --> 00:36:29,640 implant

338 00:36:29,640 --> 00:36:31,700 device installed?

339 00:36:32,160 --> 00:36:33,260 Yeah, and I have

340 00:36:33,260 --> 00:36:35,680 my secret weapon, which is

341 00:36:35,680 --> 00:36:37,620 sort of supply chain

342 00:36:37,620 --> 00:36:39,700 jujitsu, where I have

343 00:36:39,700 --> 00:36:41,720 some very nice warranty void

344 00:36:41,720 --> 00:36:43,920 if removed stickers. They have holograms

345 00:36:43,920 --> 00:36:45,820 on them. They have barcodes and serial

346 00:36:45,820 --> 00:36:47,820 numbers. And if you put one of those on a

347 00:36:47,820 --> 00:36:49,860 device after chipping it, some

348 00:36:49,860 --> 00:36:51,840 body is a little more hesitant

349 00:36:51,840 --> 00:36:53,780 to peel it off and open it up and

350 00:36:53,780 --> 00:36:55,760 take the motherboard out and turn it over and look

351 00:36:55,760 --> 00:36:57,760 to see if there’s anything that looks a little weird

352 00:36:57,760 --> 00:36:59,660 on there. Exactly, so you hide

353 00:36:59,660 --> 00:37:01,400 your implant in that

354 00:37:01,400 --> 00:37:03,360 sense. Yes,

355 00:37:04,200 --> 00:37:05,160 but right now

356 00:37:05,160 --> 00:37:07,700 you have to take 14 screws

357 00:37:07,700 --> 00:37:09,720 out, pull the motherboard out, turn it upside

358 00:37:09,720 --> 00:37:11,560 down and look. And

359 00:37:11,560 --> 00:37:13,780 I think in general people are hesitant

360 00:37:13,780 --> 00:37:15,360 to do that. If they were going to buy

361 00:37:15,360 --> 00:37:16,840 new security equipment,

362 00:37:17,880 --> 00:37:19,620 they’d probably maybe take the

363 00:37:19,620 --> 00:37:21,460 case off when you have technical people, but

364 00:37:21,460 --> 00:37:23,580 actually have to pull the motherboard out and turn it

365 00:37:23,580 --> 00:37:25,480 upside down is, I’d say,

366 00:37:25,520 --> 00:37:27,480 a little more than people are comfortable with

367 00:37:27,480 --> 00:37:29,800 in general, for new equipment especially.

368 00:37:29,800 --> 00:37:31,940 Yeah, I mean, it’s

369 00:37:31,940 --> 00:37:33,640 all coming back to the threat

370 00:37:33,640 --> 00:37:35,360 modeling then, I guess. I mean,

371 00:37:35,840 --> 00:37:36,480 if it’s

372 00:37:36,480 --> 00:37:39,700 some factory

373 00:37:39,700 --> 00:37:41,700 with non-crucial parts or if

374 00:37:41,700 --> 00:37:43,580 it’s a nuclear facility that you’re

375 00:37:43,580 --> 00:37:45,140 putting this stuff into,

376 00:37:45,760 --> 00:37:47,600 I guess you have different threat models

377 00:37:47,600 --> 00:37:49,600 to take

378 00:37:49,600 --> 00:37:50,880 into account in that sense.

379 00:37:51,260 --> 00:37:53,420 Yeah, and there’s actually a good place

380 00:37:53,420 --> 00:37:55,640 to hide it inside the connector

381 00:37:55,640 --> 00:37:57,560 where you would plug

382 00:37:57,560 --> 00:37:59,340 in this RJ45 connection

383 00:37:59,340 --> 00:38:00,780 for the serial cable, so

384 00:38:00,780 --> 00:38:03,540 with that in mind, you might actually be able

385 00:38:03,540 --> 00:38:05,460 to attack the

386 00:38:05,460 --> 00:38:07,440 supply chain of the supplier. In other

387 00:38:07,440 --> 00:38:09,720 words, if you’re providing those

388 00:38:09,720 --> 00:38:11,680 sockets, those RF-shielded

389 00:38:11,680 --> 00:38:13,400 cans, you could

390 00:38:13,400 --> 00:38:15,300 put the chip in there

391 00:38:15,300 --> 00:38:17,500 and then sell it to the

392 00:38:17,500 --> 00:38:18,880 manufacturer, be it

393 00:38:19,600 --> 00:38:20,660 Cisco or others.

394 00:38:21,660 --> 00:38:23,720 And by the way, this isn’t, I mean,

395 00:38:23,780 --> 00:38:25,720 it uses the Cisco recovery documents

396 00:38:25,720 --> 00:38:27,580 and I’m doing it in that place, but

397 00:38:27,580 --> 00:38:29,340 they just happen to

398 00:38:29,340 --> 00:38:31,320 be the lucky winner

399 00:38:31,320 --> 00:38:33,040 for picking a device.

400 00:38:34,260 --> 00:38:35,540 It’s largely

401 00:38:35,540 --> 00:38:37,680 applicable to anything that has

402 00:38:37,680 --> 00:38:39,020 a serial configuration port.

403 00:38:39,560 --> 00:38:40,480 Yeah, that’s cool.

404 00:38:41,180 --> 00:38:43,520 If we round off this small

405 00:38:43,520 --> 00:38:45,520 teaser, so to speak,

406 00:38:45,680 --> 00:38:47,260 it will be available on the

407 00:38:47,260 --> 00:38:49,360 CS3 YouTube channel,

408 00:38:49,600 --> 00:38:51,440 once it's online, the presentation that you gave

409 00:38:51,440 --> 00:38:53,480 for all the people that are interested

410 00:38:53,480 --> 00:38:54,180 in the details.

411 00:38:54,980 --> 00:38:57,480 You have your own YouTube channel that you

412 00:38:57,480 --> 00:38:58,920 could plug again? Yeah, yeah.

413 00:38:59,380 --> 00:39:02,000 Look for Monta Elkins, M-O-N-T-A

414 00:39:02,000 --> 00:39:04,460 Elkins, E-L-K-I-N-S

415 00:39:04,460 --> 00:39:05,760 Coke and Strippers

416 00:39:05,760 --> 00:39:06,540 on YouTube.

417 00:39:08,000 --> 00:39:09,660 And in a few

418 00:39:09,660 --> 00:39:11,160 months, I should be doing this

419 00:39:11,160 --> 00:39:13,620 presentation again in the U.S. if you’re not

420 00:39:13,620 --> 00:39:15,020 able to get out there.

421 00:39:16,060 --> 00:39:17,660 TDI Technologies sponsored

422 00:39:17,660 --> 00:39:19,580 my trip to the SANS ICS

423 00:39:19,600 --> 00:39:21,800 Summit, and if you want to come by

424 00:39:21,800 --> 00:39:23,180 and see it in person and

425 00:39:23,180 --> 00:39:25,620 more details perhaps than you can

426 00:39:25,620 --> 00:39:27,680 get here, look for

427 00:39:27,680 --> 00:39:29,380 that. I believe that’s in March of

428 00:39:29,380 --> 00:39:31,380 2020. Oh, excellent.

429 00:39:31,620 --> 00:39:32,980 So the next possible

430 00:39:32,980 --> 00:39:35,420 spot to spot you, so to speak.

431 00:39:35,640 --> 00:39:37,580 Yeah, yeah. That’s cool. Okay, so on

432 00:39:37,580 --> 00:39:39,600 behalf of Säkerhetspodcasten and the listeners, I thank you

433 00:39:39,600 --> 00:39:41,820 Monta for being on the show, and

434 00:39:41,820 --> 00:39:43,520 I hope you have a nice day

435 00:39:43,520 --> 00:39:45,500 here in Stockholm and following up

436 00:39:45,500 --> 00:39:47,060 on this one, and

437 00:39:47,060 --> 00:39:49,260 this is Robin von Post speaking.

438 00:39:49,600 --> 00:39:51,800 Thank you for tuning in to Säkerhetspodcasten

439 00:39:51,800 --> 00:39:53,640 and once again, thank you all.

440 00:39:54,640 --> 00:39:55,080 Thank you.

441 00:39:56,140 --> 00:39:57,940 Welcome to Säkerhetspodcasten

442 00:39:57,940 --> 00:39:59,480 and transmitting from

443 00:39:59,480 --> 00:40:00,880 CS3 Stockholm

444 00:40:00,880 --> 00:40:03,900 2019. We just

445 00:40:03,900 --> 00:40:05,580 got off stage here. This is

446 00:40:05,580 --> 00:40:07,240 Robin von Post speaking, and

447 00:40:07,240 --> 00:40:09,760 on the other side of the table, I have

448 00:40:09,760 --> 00:40:11,480 Stephen Hilt from Trend Micro.

449 00:40:11,920 --> 00:40:13,280 Welcome. Thank you.

450 00:40:13,780 --> 00:40:15,580 So you had a really interesting

451 00:40:15,580 --> 00:40:16,960 talk here about

452 00:40:16,960 --> 00:40:19,580 hacking cranes.

453 00:40:19,600 --> 00:40:21,680 So the remote controls of

454 00:40:21,680 --> 00:40:23,780 construction cranes, right?

455 00:40:23,920 --> 00:40:25,660 Yeah, it focused the

456 00:40:25,660 --> 00:40:27,340 presentation because we have the crane,

457 00:40:27,760 --> 00:40:29,680 but it applies to many of the industrial

458 00:40:29,680 --> 00:40:31,680 radios that control lots of other things

459 00:40:31,680 --> 00:40:33,180 than just cranes themselves.

460 00:40:33,800 --> 00:40:35,040 That’s really cool. And

461 00:40:35,040 --> 00:40:37,700 what was the basic

462 00:40:37,700 --> 00:40:39,740 interest, or how did you get this basic

463 00:40:39,740 --> 00:40:41,600 interest for this research

464 00:40:41,600 --> 00:40:43,620 that you did? Yeah, so originally

465 00:40:43,620 --> 00:40:45,600 we were pitching ideas for

466 00:40:45,600 --> 00:40:47,080 research as we do

467 00:40:48,080 --> 00:40:49,520 yearly, and a couple

468 00:40:49,520 --> 00:40:51,500 of us got together, and

469 00:40:51,500 --> 00:40:53,500 one of my co-workers, Federico,

470 00:40:54,160 --> 00:40:55,580 in Italy, had

471 00:40:55,580 --> 00:40:57,480 noticed a construction crane across

472 00:40:57,480 --> 00:40:59,360 from his house, and started wondering

473 00:40:59,360 --> 00:41:00,200 to himself,

474 00:41:00,860 --> 00:41:03,440 are these things vulnerable? Are

475 00:41:03,440 --> 00:41:05,540 they secure? How does the communications

476 00:41:05,540 --> 00:41:07,540 work? Things like that. And so when

477 00:41:07,540 --> 00:41:09,400 we were talking about it, you know, we

478 00:41:09,400 --> 00:41:11,360 had noticed the same things, and

479 00:41:11,360 --> 00:41:12,960 lots of us,

480 00:41:13,420 --> 00:41:15,400 in total seven people on our team

481 00:41:15,400 --> 00:41:17,400 that did this project, got

482 00:41:17,400 --> 00:41:19,440 together and decided that we wanted

483 00:41:19,440 --> 00:41:21,280 to look at the security

484 00:41:21,280 --> 00:41:22,920 if there was any

485 00:41:22,920 --> 00:41:25,540 on the controllers

486 00:41:25,540 --> 00:41:27,700 for large construction

487 00:41:27,700 --> 00:41:29,480 cranes was the original piece.

488 00:41:29,660 --> 00:41:31,400 And once we started digging into it, we

489 00:41:31,400 --> 00:41:33,560 found more and more systems

490 00:41:33,560 --> 00:41:35,380 that utilize the same

491 00:41:35,380 --> 00:41:37,420 communications. And how many

492 00:41:37,420 --> 00:41:39,620 different vendors or different brands

493 00:41:39,620 --> 00:41:41,440 are you looking at? We looked at

494 00:41:41,440 --> 00:41:43,660 seven vendors that were

495 00:41:43,660 --> 00:41:45,720 globally dispersed between

496 00:41:45,720 --> 00:41:47,300 multiple countries

497 00:41:47,300 --> 00:41:49,420 and by the

498 00:41:49,440 --> 00:41:50,940 country of origin where they are

499 00:41:50,940 --> 00:41:53,720 manufactured. From Taiwan,

500 00:41:53,960 --> 00:41:55,380 there was even a vendor

501 00:41:55,380 --> 00:41:57,640 from Sweden, Italy, US,

502 00:41:58,420 --> 00:41:59,180 and Japan.

503 00:41:59,440 --> 00:42:01,440 So this is a global

504 00:42:01,440 --> 00:42:03,440 issue, so to speak?

505 00:42:03,640 --> 00:42:05,580 Yeah, out of the seven vendors we looked at,

506 00:42:05,580 --> 00:42:07,500 all of them were susceptible to

507 00:42:07,500 --> 00:42:09,460 easy replay attacks.

508 00:42:09,900 --> 00:42:11,480 That’s quite amazing, right?

509 00:42:11,760 --> 00:42:13,500 I mean, what did you think

510 00:42:13,500 --> 00:42:15,120 when you figured that out?

511 00:42:15,720 --> 00:42:17,540 When we had

512 00:42:17,540 --> 00:42:19,420 conversations, and we originally found this,

513 00:42:19,440 --> 00:42:21,540 going back

514 00:42:21,540 --> 00:42:23,400 to Federico, I was

515 00:42:23,400 --> 00:42:25,600 more surprised that his garage

516 00:42:25,600 --> 00:42:27,620 door has more security

517 00:42:27,620 --> 00:42:29,540 than these systems, because

518 00:42:29,540 --> 00:42:30,320 they have no

519 00:42:30,320 --> 00:42:33,540 implementation of rolling codes when we

520 00:42:33,540 --> 00:42:34,920 looked at it, or any security

521 00:42:34,920 --> 00:42:37,860 when we looked at those specific

522 00:42:37,860 --> 00:42:38,860 vendors.
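
As a rough illustration of the gap being described, the snippet below contrasts a fixed-code receiver, which will accept a recorded frame replayed verbatim, with a rolling-code receiver that rejects reused counters. The frame layout, window size and values are invented for the example and do not correspond to any of the vendors discussed.

```cpp
// Illustrative only: why a captured fixed-code frame can be replayed,
// while a rolling-code receiver rejects reused counters.
#include <cstdint>
#include <iostream>

struct Frame {
  uint32_t device_id;
  uint32_t counter;   // a fixed-code transmitter never changes this field
  uint8_t  command;
};

// Fixed-code check: any frame with the paired id is accepted,
// so a recorded frame replays forever.
bool fixed_code_accept(const Frame& f, uint32_t paired_id) {
  return f.device_id == paired_id;
}

// Rolling-code check: the counter must also move forward within a
// small window, so an identical replayed frame is rejected.
bool rolling_code_accept(const Frame& f, uint32_t paired_id,
                         uint32_t& last_counter, uint32_t window = 16) {
  if (f.device_id != paired_id) return false;
  if (f.counter <= last_counter || f.counter > last_counter + window)
    return false;             // stale or wildly out-of-sync counter
  last_counter = f.counter;   // accept and advance
  return true;
}

int main() {
  const uint32_t paired_id = 0xC0FFEE;  // invented pairing id
  uint32_t last_counter = 41;

  Frame captured{paired_id, 42, 1};     // frame an attacker recorded earlier

  std::cout << "fixed code, first use:   " << fixed_code_accept(captured, paired_id) << "\n";
  std::cout << "fixed code, replay:      " << fixed_code_accept(captured, paired_id) << "\n";
  std::cout << "rolling code, first use: "
            << rolling_code_accept(captured, paired_id, last_counter) << "\n";
  std::cout << "rolling code, replay:    "
            << rolling_code_accept(captured, paired_id, last_counter) << "\n";
  return 0;
}
```

Real rolling-code schemes, as used in garage door openers, also encrypt the counter; the bare comparison here is only meant to show why replaying an identical frame fails.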

523 00:42:39,200 --> 00:42:41,480 Because in your talk you said

524 00:42:41,480 --> 00:42:43,400 you had an escalating

525 00:42:43,400 --> 00:42:45,460 approach to just

526 00:42:45,460 --> 00:42:47,600 start by replaying, then start to try to figure

527 00:42:47,600 --> 00:42:49,380 out the general commands,

528 00:42:49,440 --> 00:42:51,560 to send to it. I mean, you could have stopped

529 00:42:51,560 --> 00:42:53,640 just by doing the replay, and

530 00:42:53,640 --> 00:42:56,060 this is too bad, right?

531 00:42:56,520 --> 00:42:57,740 Yeah, we could,

532 00:42:57,840 --> 00:42:59,420 but one of the things we wanted to do,

533 00:42:59,520 --> 00:43:01,540 because replay does require you to record

534 00:43:01,540 --> 00:43:04,040 messages of the crane in operation,

535 00:43:04,160 --> 00:43:05,840 so we wanted to prove that

536 00:43:05,840 --> 00:43:07,960 an attacker could pre-plan something,

537 00:43:08,440 --> 00:43:10,180 record it, be able

538 00:43:10,180 --> 00:43:11,940 to figure out the communications and how the

539 00:43:11,940 --> 00:43:14,060 protocol works, come back, and

540 00:43:14,060 --> 00:43:16,020 issue commands that you may not have issued

541 00:43:16,020 --> 00:43:16,720 while they recorded.

542 00:43:17,300 --> 00:43:19,340 And you could carry the

543 00:43:19,340 --> 00:43:21,600 transmitter on like a drone or

544 00:43:21,600 --> 00:43:23,760 anything, I guess, because it’s not a

545 00:43:23,760 --> 00:43:25,820 huge effect that you need

546 00:43:25,820 --> 00:43:26,700 to transmit on.

547 00:43:27,140 --> 00:43:29,620 No, a drone, but we also

548 00:43:29,620 --> 00:43:31,840 built a tool called RFQuack

549 00:43:31,840 --> 00:43:33,320 that

550 00:43:33,320 --> 00:43:35,880 you can deploy

551 00:43:35,880 --> 00:43:37,740 the device, it can run

552 00:43:37,740 --> 00:43:39,560 on a battery, it has wifi

553 00:43:39,560 --> 00:43:41,800 and cellular capabilities

554 00:43:41,800 --> 00:43:43,680 to remotely get into the

555 00:43:43,680 --> 00:43:45,960 device, listen, and then also transmit

556 00:43:45,960 --> 00:43:46,940 as well.

557 00:43:46,940 --> 00:43:49,240 And the reason why we did that

558 00:43:49,240 --> 00:43:50,780 was to just show that

559 00:43:50,780 --> 00:43:53,060 you always hear that in practice

560 00:43:53,060 --> 00:43:55,160 you have to be near it, you have to

561 00:43:55,160 --> 00:43:57,100 do all these things, but now we built a device

562 00:43:57,100 --> 00:43:59,300 that we can remotely deploy, and then

563 00:43:59,300 --> 00:44:01,240 get into later and make our attack

564 00:44:01,240 --> 00:44:02,560 from even further away.

565 00:44:02,940 --> 00:44:04,440 At a safe distance, right?

566 00:44:05,360 --> 00:44:06,760 So what was the

567 00:44:06,760 --> 00:44:09,100 response from the vendors

568 00:44:09,100 --> 00:44:10,040 when you approached them?

569 00:44:11,900 --> 00:44:13,220 We dove into that

570 00:44:13,220 --> 00:44:15,480 during the talk a bunch,

571 00:44:15,780 --> 00:44:16,800 and I don’t want to spoil

572 00:44:16,800 --> 00:44:19,220 too much for those

573 00:44:19,220 --> 00:44:20,900 who are going to watch it when it comes on

574 00:44:20,900 --> 00:44:22,800 YouTube, but the

575 00:44:22,800 --> 00:44:24,900 response was more or

576 00:44:24,900 --> 00:44:26,620 less, we

577 00:44:26,620 --> 00:44:28,640 only had one vendor who never

578 00:44:28,640 --> 00:44:30,960 responded to us. Three of the

579 00:44:30,960 --> 00:44:32,560 seven vendors have

580 00:44:32,560 --> 00:44:33,960 issued patches,

581 00:44:34,800 --> 00:44:36,800 and the other ones addressed it

582 00:44:36,800 --> 00:44:38,660 by either saying that the product is end of

583 00:44:38,660 --> 00:44:40,720 life, or other

584 00:44:40,720 --> 00:44:42,800 reasons why it

585 00:44:42,800 --> 00:44:43,680 couldn’t be fixed.

586 00:44:44,820 --> 00:44:46,780 So, I mean, given

587 00:44:46,780 --> 00:44:48,560 your insight into this,

588 00:44:48,680 --> 00:44:50,360 all the protocols and the details,

589 00:44:50,660 --> 00:44:52,620 and as you said, the details will be

590 00:44:52,620 --> 00:44:54,300 on the YouTube channel from CS3

591 00:44:54,300 --> 00:44:55,100 later this year,

592 00:44:56,680 --> 00:44:58,120 what would you say,

593 00:44:58,280 --> 00:44:59,940 would be the recommendations

594 00:44:59,940 --> 00:45:02,620 to a typical manufacturer

595 00:45:02,620 --> 00:45:04,180 of a remote-controlled

596 00:45:04,180 --> 00:45:06,100 facility?

597 00:45:06,240 --> 00:45:08,600 So, recommendations are, we need

598 00:45:08,600 --> 00:45:10,580 to look

599 00:45:10,580 --> 00:45:12,180 at sub-gigahertz

600 00:45:12,180 --> 00:45:14,640 radio communications to make

601 00:45:14,640 --> 00:45:16,660 sure not only that in these cases

602 00:45:16,780 --> 00:45:18,420 of industrial radios that

603 00:45:18,420 --> 00:45:20,420 for specifics that we were talking about were

604 00:45:20,420 --> 00:45:22,760 cranes, make sure

605 00:45:22,760 --> 00:45:24,760 that these kinds of security issues

606 00:45:24,760 --> 00:45:26,620 are getting fixed now before they become

607 00:45:26,620 --> 00:45:28,520 a really big problem

608 00:45:28,520 --> 00:45:30,700 as radio equipment

609 00:45:30,700 --> 00:45:32,520 becomes cheaper and cheaper, and

610 00:45:32,520 --> 00:45:34,480 people have the ability

611 00:45:34,480 --> 00:45:36,780 to buy

612 00:45:36,780 --> 00:45:38,880 it and use it with software-defined radios

613 00:45:38,880 --> 00:45:40,580 becoming more

614 00:45:40,580 --> 00:45:41,220 and more prevalent.

615 00:45:42,000 --> 00:45:44,660 That sounds like a good plan.

616 00:45:45,200 --> 00:45:46,680 And I mean, looking at, as you

617 00:45:46,680 --> 00:45:48,300 said, like key fobs for cars

618 00:45:48,300 --> 00:45:50,360 and garage doors, etc., I mean, there is

619 00:45:50,360 --> 00:45:52,040 some advances in cryptography

620 00:45:52,040 --> 00:45:54,000 and in basic design

621 00:45:54,000 --> 00:45:56,400 that they could just adopt, I guess.

622 00:45:56,620 --> 00:45:58,000 Yes, yeah, they could.

623 00:45:58,940 --> 00:46:00,260 The only thing is,

624 00:46:00,680 --> 00:46:02,420 and I explained that

625 00:46:02,420 --> 00:46:04,260 I had a conversation with somebody here

626 00:46:04,260 --> 00:46:06,200 at the conference about this, and they’re like,

627 00:46:06,320 --> 00:46:08,460 but why does my garage door have that

628 00:46:08,460 --> 00:46:10,560 when these don’t? And it’s because

629 00:46:10,560 --> 00:46:12,280 people break into

630 00:46:12,280 --> 00:46:14,220 people’s houses using replay

631 00:46:14,220 --> 00:46:16,180 attacks previously. There’s been

632 00:46:16,180 --> 00:46:16,640 no

633 00:46:16,680 --> 00:46:18,480 proof or anything like that of a

634 00:46:18,480 --> 00:46:20,560 wide-scale issue on

635 00:46:20,560 --> 00:46:22,000 these systems having happened.

636 00:46:22,380 --> 00:46:24,560 And we don’t want to wait until something happens

637 00:46:24,560 --> 00:46:26,600 to fix it, so we need to try

638 00:46:26,600 --> 00:46:28,520 to plan before the

639 00:46:28,520 --> 00:46:30,660 attackers really start going after these

640 00:46:30,660 --> 00:46:32,480 communication

641 00:46:32,480 --> 00:46:34,300 protocols to

642 00:46:34,300 --> 00:46:36,740 go ahead and get that security added to it.

643 00:46:37,740 --> 00:46:38,780 That’s excellent.

644 00:46:38,960 --> 00:46:40,580 I think being on the toes instead of

645 00:46:40,580 --> 00:46:42,000 the heels is a good

646 00:46:42,000 --> 00:46:44,420 strategy for them, since, I mean,

647 00:46:44,500 --> 00:46:46,420 it’s not toys we’re talking.

648 00:46:46,680 --> 00:46:47,820 No, no.

649 00:46:47,940 --> 00:46:49,200 This is real stuff.

650 00:46:49,300 --> 00:46:51,680 Yeah, and that’s one of the reasons why in the presentation

651 00:46:51,680 --> 00:46:53,600 I showed a video of us moving

652 00:46:53,600 --> 00:46:55,300 a large

653 00:46:55,300 --> 00:46:58,180 real crane and not just the toy one

654 00:46:58,180 --> 00:46:58,660 on stage.

655 00:46:58,840 --> 00:47:02,240 It sounds like a critical

656 00:47:02,240 --> 00:47:03,960 infrastructure that I don’t want

657 00:47:03,960 --> 00:47:05,580 anyone else to replay or

658 00:47:05,580 --> 00:47:07,660 toy around with in that sense.

659 00:47:08,800 --> 00:47:10,180 So the RFQuack, is that

660 00:47:10,180 --> 00:47:12,260 possible to look into?

661 00:47:12,980 --> 00:47:13,800 Yeah, it is

662 00:47:13,800 --> 00:47:15,960 publicly available on GitHub.

663 00:47:16,680 --> 00:47:18,280 The code to run it is

664 00:47:18,280 --> 00:47:20,360 on Trend Micro’s GitHub

665 00:47:20,360 --> 00:47:22,380 page. That’s cool. So you search

666 00:47:22,380 --> 00:47:23,880 for RFQuack

667 00:47:23,880 --> 00:47:25,660 and then you’re done.

668 00:47:26,000 --> 00:47:28,380 And it’ll show up in the board

669 00:47:28,380 --> 00:47:29,920 schematics and how to build it.

670 00:47:30,260 --> 00:47:31,960 Everything is there on the GitHub page.

671 00:47:32,160 --> 00:47:34,560 Excellent. Let’s keep the

672 00:47:34,560 --> 00:47:36,340 crane remote controls on

673 00:47:36,340 --> 00:47:38,260 the toes, right? Yeah. Okay, Stephen.

674 00:47:38,600 --> 00:47:40,460 Thank you so much for sharing with

675 00:47:40,460 --> 00:47:42,560 Secrets Podcasten. A small

676 00:47:42,560 --> 00:47:44,300 teaser for the presentation that you

677 00:47:44,300 --> 00:47:46,440 gave and will be on the YouTube channel

678 00:47:46,440 --> 00:47:48,260 later this year. My pleasure

679 00:47:48,260 --> 00:47:50,000 and thanks for having me. Thank you for coming.

680 00:47:50,380 --> 00:47:52,500 Take care. All right. This is

681 00:47:52,500 --> 00:47:54,340 the Security Podcast,

682 00:47:54,340 --> 00:47:56,360 Säkerhetspodcasten. I'm coming to you now from

683 00:47:56,360 --> 00:47:58,320 CS3 Stockholm and

684 00:47:58,320 --> 00:48:00,360 I am very honored

685 00:48:00,360 --> 00:48:02,180 to have Jimmy Wylie

686 00:48:02,180 --> 00:48:03,900 and Reid Wightman

687 00:48:03,900 --> 00:48:05,660 from Dragos

688 00:48:05,660 --> 00:48:08,460 who just delivered a talk on

689 00:48:08,460 --> 00:48:12,120 security research into

690 00:48:12,120 --> 00:48:13,240 PLCs.

691 00:48:14,500 --> 00:48:16,380 I thought it

692 00:48:16,440 --> 00:48:18,200 was a pretty interesting talk

693 00:48:18,200 --> 00:48:20,400 specifically since you were

694 00:48:20,400 --> 00:48:22,060 looking at

695 00:48:22,060 --> 00:48:24,360 a particular problem and

696 00:48:24,360 --> 00:48:25,940 wanted to find out if that

697 00:48:25,940 --> 00:48:28,260 applied to other PLCs.

698 00:48:28,960 --> 00:48:30,320 Could you just, for our

699 00:48:30,320 --> 00:48:32,180 listeners, because they probably haven’t seen

700 00:48:32,180 --> 00:48:34,060 the talk, could you just do the

701 00:48:34,060 --> 00:48:36,220 elevator pitch on what it was

702 00:48:36,220 --> 00:48:38,300 about? Yeah, so we looked at this

703 00:48:38,300 --> 00:48:40,460 trisis attack that affected

704 00:48:40,460 --> 00:48:42,220 the Triconic safety controller

705 00:48:42,220 --> 00:48:44,720 a few years ago at a

706 00:48:44,720 --> 00:48:46,300 gas plant in Saudi

707 00:48:46,300 --> 00:48:48,180 Arabia and we just wanted to know

708 00:48:48,180 --> 00:48:50,200 would the

709 00:48:50,200 --> 00:48:52,220 vulnerability or the issue that was

710 00:48:52,220 --> 00:48:54,520 used in that attack affect other

711 00:48:54,520 --> 00:48:56,240 vendors’ controllers?

712 00:48:56,500 --> 00:48:57,520 In other words, could they

713 00:48:57,520 --> 00:48:59,960 take what’s a normal

714 00:48:59,960 --> 00:49:02,140 operation by an engineer and

715 00:49:02,140 --> 00:49:04,440 cause the entire controller to become

716 00:49:04,440 --> 00:49:06,180 untrusted? So we

717 00:49:06,180 --> 00:49:07,920 wanted to look at a few different control

718 00:49:07,920 --> 00:49:10,080 vendors and see if they might be impacted

719 00:49:10,080 --> 00:49:12,060 by the same issue. And that

720 00:49:12,060 --> 00:49:14,200 particular malware

721 00:49:14,200 --> 00:49:16,060 was sort of like a root

722 00:49:16,060 --> 00:49:17,620 kit for PLCs, right?

723 00:49:18,200 --> 00:49:20,040 Yeah, so the

724 00:49:20,040 --> 00:49:21,180 main thing here is

725 00:49:21,180 --> 00:49:23,560 arbitrary code execution.

726 00:49:24,260 --> 00:49:25,960 Can we achieve arbitrary code

727 00:49:25,960 --> 00:49:27,900 execution? That is step one. So understand

728 00:49:27,900 --> 00:49:30,220 the protocol, understand the code format

729 00:49:30,220 --> 00:49:31,920 and then once you have the code, examine the

730 00:49:31,920 --> 00:49:33,760 privilege on the device

731 00:49:33,760 --> 00:49:35,860 and see whether or not an exploit

732 00:49:35,860 --> 00:49:37,320 is necessary to do anything else.

733 00:49:37,840 --> 00:49:39,680 So with Trisis, they needed

734 00:49:39,680 --> 00:49:42,120 privilege, supervisor privilege

735 00:49:42,120 --> 00:49:43,940 in order to install a root kit because in order

736 00:49:43,940 --> 00:49:46,000 to achieve persistence, they need to be in a

737 00:49:46,000 --> 00:49:47,960 particular part of firmware memory which would have

738 00:49:47,960 --> 00:49:49,940 been inaccessible as a sort of

739 00:49:49,940 --> 00:49:52,020 regular quote-unquote user program.

740 00:49:52,300 --> 00:49:52,780 Yeah, yeah.

741 00:49:53,600 --> 00:49:55,200 And that, the

742 00:49:55,200 --> 00:49:57,740 possibility

743 00:49:57,740 --> 00:49:59,920 to do an attack

744 00:49:59,920 --> 00:50:01,760 like that, it’s also very

745 00:50:01,760 --> 00:50:03,860 dependent on the type of processor, right?

746 00:50:04,960 --> 00:50:06,140 I mean, if

747 00:50:06,140 --> 00:50:07,760 you look at older

748 00:50:07,760 --> 00:50:09,820 types of processors that

749 00:50:09,820 --> 00:50:11,840 do not separate between

750 00:50:11,840 --> 00:50:12,500 user

751 00:50:12,500 --> 00:50:15,220 execution

752 00:50:15,220 --> 00:50:15,940 and

753 00:50:15,940 --> 00:50:18,440 ring zero execution

754 00:50:18,440 --> 00:50:19,420 and so on and so forth.

755 00:50:19,560 --> 00:50:21,140 Yeah, certainly some older processors

756 00:50:21,140 --> 00:50:23,760 you saw on the talk, you know, there are PLCs

757 00:50:23,760 --> 00:50:25,740 that use these old CPUs that don’t

758 00:50:25,740 --> 00:50:28,020 even have this concept of separating

759 00:50:28,020 --> 00:50:29,960 user code from supervisor

760 00:50:29,960 --> 00:50:31,840 code. So all code

761 00:50:31,840 --> 00:50:32,660 is treated equally.

762 00:50:33,760 --> 00:50:35,420 And that makes it easier.

763 00:50:35,720 --> 00:50:36,600 Right, for sure.

764 00:50:37,040 --> 00:50:40,040 Yeah, because it means arbitrary code execution

765 00:50:40,040 --> 00:50:40,620 is winning.

766 00:50:41,340 --> 00:50:43,880 As opposed to needing to do an exploit afterwards,

767 00:50:43,880 --> 00:50:45,880 right? So it would have been like steps one and two of Trisis.

768 00:50:45,940 --> 00:50:47,580 And then it would have been over with, right?

769 00:50:47,580 --> 00:50:49,700 Yeah, and that’s, I mean, that’s basically

770 00:50:49,700 --> 00:50:51,820 what our talk was. How do we get steps one and two?

771 00:50:52,140 --> 00:50:53,920 And then once we have steps one and two,

772 00:50:54,320 --> 00:50:55,760 you know, then step three, we could

773 00:50:55,760 --> 00:50:57,780 do if we wanted to like sit there and play with it.

774 00:50:57,800 --> 00:50:59,720 But that’s not as important as how easy it is

775 00:50:59,720 --> 00:51:02,000 just to get our code to execute.

776 00:51:02,300 --> 00:51:03,000 Right, right.

777 00:51:04,400 --> 00:51:05,500 What would you say,

778 00:51:05,840 --> 00:51:07,580 I mean, when you

779 00:51:07,580 --> 00:51:09,760 point out this problem to the

780 00:51:09,760 --> 00:51:11,640 vendors, you

781 00:51:11,640 --> 00:51:13,620 sort of received the same question on stage

782 00:51:13,620 --> 00:51:15,060 now, but would you say

783 00:51:15,940 --> 00:51:17,840 that information is received

784 00:51:17,840 --> 00:51:19,800 well, or are you treated

785 00:51:19,800 --> 00:51:21,220 sort of like…

786 00:51:21,220 --> 00:51:23,340 It’s been pretty neutral, I would say. It’s not

787 00:51:23,340 --> 00:51:25,740 an issue that a lot of vendors have considered

788 00:51:25,740 --> 00:51:27,660 in the past. And, you know, I totally understand

789 00:51:27,660 --> 00:51:29,800 that, honestly. As a researcher, I’m like, well,

790 00:51:29,880 --> 00:51:31,280 you know, I used to work for a vendor,

791 00:51:31,720 --> 00:51:33,740 and I know, you know, back then, that

792 00:51:33,740 --> 00:51:35,900 was not a design consideration. It was just

793 00:51:35,900 --> 00:51:37,360 that’s how it works.

794 00:51:38,060 --> 00:51:39,680 You know, that went into

795 00:51:39,680 --> 00:51:40,820 the product design. So,

796 00:51:41,220 --> 00:51:43,740 you know, it’s kind of a little

797 00:51:43,740 --> 00:51:45,720 give and take with the vendor, because you have

798 00:51:45,720 --> 00:51:47,600 to understand that the product

799 00:51:47,600 --> 00:51:49,720 wasn’t designed that way to have this

800 00:51:49,720 --> 00:51:50,820 separation of code.

801 00:51:51,720 --> 00:51:53,180 They’re not going to fix it overnight.

802 00:51:53,820 --> 00:51:55,700 In fact, they’re probably not going to fix it at

803 00:51:55,700 --> 00:51:57,500 all in the current generation

804 00:51:57,500 --> 00:51:59,780 products. It’s probably going to be something

805 00:51:59,780 --> 00:52:01,520 that requires years of development

806 00:52:01,520 --> 00:52:03,620 effort internally, and they’re really only

807 00:52:03,620 --> 00:52:05,600 going to go down that path if their customers ask

808 00:52:05,600 --> 00:52:07,660 for it. You know, I can

809 00:52:07,660 --> 00:52:09,700 kick and scream all I want. They’re probably not going to listen

810 00:52:09,700 --> 00:52:11,580 to me, ultimately. I’ll have to mention

811 00:52:11,580 --> 00:52:13,640 the original design consideration was

812 00:52:13,640 --> 00:52:15,600 like, they only assumed

813 00:52:15,600 --> 00:52:17,400 that their software is generating the code that will

814 00:52:17,400 --> 00:52:19,620 execute. Like, that is

815 00:52:19,620 --> 00:52:21,680 part of the assumption. That’s the assumption

816 00:52:21,680 --> 00:52:23,680 we’re trying to break, but that’s the assumption they’re operating

817 00:52:23,680 --> 00:52:25,500 under. So, they’re not really thinking

818 00:52:25,500 --> 00:52:27,920 like, oh, well, this isn’t like a

819 00:52:27,920 --> 00:52:29,560 web browser situation where

820 00:52:29,560 --> 00:52:31,520 you’re always downloading code. Anytime you go

821 00:52:31,520 --> 00:52:33,480 to any website, you’re downloading code into your box, and

822 00:52:33,480 --> 00:52:34,400 it’s executing, right?

823 00:52:35,320 --> 00:52:37,660 And you trust that everything’s going to happen safely

824 00:52:37,660 --> 00:52:39,260 except for the one time it gets away from you.

825 00:52:39,700 --> 00:52:41,280 And the vendors,

826 00:52:41,680 --> 00:52:43,480 they don’t think that way.

827 00:52:43,520 --> 00:52:45,560 They don’t think like, oh, you know, my controller

828 00:52:45,600 --> 00:52:47,580 will execute code from any place. My code is

829 00:52:47,580 --> 00:52:49,500 only going to execute from things that I already

830 00:52:49,500 --> 00:52:50,800 wrote and formatted correctly.

831 00:52:51,860 --> 00:52:53,480 Yeah, I totally understand.

832 00:52:53,660 --> 00:52:55,560 I mean, it’s, taking

833 00:52:55,560 --> 00:52:56,920 the example from

834 00:52:56,920 --> 00:52:59,560 the thing I will

835 00:52:59,560 --> 00:53:00,460 talk about tomorrow,

836 00:53:01,060 --> 00:53:03,600 I discovered a

837 00:53:03,600 --> 00:53:05,660 serious flaw in the security

838 00:53:05,660 --> 00:53:07,060 architecture of

839 00:53:07,060 --> 00:53:09,020 a SCADA component.

840 00:53:09,820 --> 00:53:11,460 And that

841 00:53:11,460 --> 00:53:13,200 was, you know,

842 00:53:13,820 --> 00:53:15,380 almost impossible to get

843 00:53:15,380 --> 00:53:17,340 the vendor first to accept that this was

844 00:53:17,340 --> 00:53:19,180 a huge problem. So, I

845 00:53:19,180 --> 00:53:21,280 wrote a proof of concept code, and then it

846 00:53:21,280 --> 00:53:23,100 dawned on them, and it’s like, this is

847 00:53:23,100 --> 00:53:25,300 damn bad. And then they

848 00:53:25,300 --> 00:53:26,160 tried to bury it.

849 00:53:26,160 --> 00:53:26,420 Yeah.

850 00:53:27,060 --> 00:53:29,980 And, I mean,

851 00:53:30,200 --> 00:53:32,320 it’s, yeah.

852 00:53:32,760 --> 00:53:34,140 Don’t sign the NDA.

853 00:53:34,660 --> 00:53:36,340 No, no. And also,

854 00:53:36,560 --> 00:53:37,520 also, I mean,

855 00:53:37,740 --> 00:53:40,400 the tail end of that problem

856 00:53:40,400 --> 00:53:42,240 is that there’s

857 00:53:42,240 --> 00:53:43,940 going to be customers out there with

858 00:53:43,940 --> 00:53:45,320 vulnerable PLCs,

859 00:53:45,380 --> 00:53:47,240 vulnerable SCADA systems, vulnerable

860 00:53:47,240 --> 00:53:49,320 this and that. And

861 00:53:49,320 --> 00:53:51,240 they have a, you know,

862 00:53:51,320 --> 00:53:53,420 life cycle of over

863 00:53:53,420 --> 00:53:55,260 10, 15 years.

864 00:53:55,580 --> 00:53:57,300 So, even if the

865 00:53:57,300 --> 00:53:59,460 vendor comes out with a new product that is not

866 00:53:59,460 --> 00:54:01,080 vulnerable, I mean, it’s going to take

867 00:54:01,080 --> 00:54:03,340 for ages before the

868 00:54:03,340 --> 00:54:04,560 customers are secure.

869 00:54:04,560 --> 00:54:07,300 I mean, it’ll probably be five years before we even see

870 00:54:07,300 --> 00:54:09,440 a product that considers this particular design

871 00:54:09,440 --> 00:54:11,340 flaw as part of its

872 00:54:11,340 --> 00:54:13,400 design model. So, the PLCs

873 00:54:13,400 --> 00:54:15,340 that are being vetted and installed for

874 00:54:15,340 --> 00:54:17,300 the next five years will have this

875 00:54:17,300 --> 00:54:18,500 problem, most likely.

876 00:54:19,220 --> 00:54:21,440 And then they’ll be around for another 20 or 30

877 00:54:21,440 --> 00:54:22,920 beyond that. So, it’s, yeah.

878 00:54:23,120 --> 00:54:25,080 I’ll probably be retired by the time

879 00:54:25,080 --> 00:54:27,240 security can,

880 00:54:27,560 --> 00:54:29,360 but by the time these things become,

881 00:54:29,380 --> 00:54:31,440 you know, commonplace where you have

882 00:54:31,440 --> 00:54:33,340 a, you know, controller that’s

883 00:54:33,340 --> 00:54:35,360 running code in a

884 00:54:35,360 --> 00:54:36,580 secure enclave somehow.

885 00:54:37,580 --> 00:54:39,500 I think, but that’s,

886 00:54:39,620 --> 00:54:41,300 I don’t want, well, that’s not an okay problem. I mean, that’s a

887 00:54:41,300 --> 00:54:43,360 problem, but that’s part of the reason why we’re doing

888 00:54:43,360 --> 00:54:45,200 this work. We know that these

889 00:54:45,200 --> 00:54:47,220 things aren’t going to get fixed. And as much as we want

890 00:54:47,220 --> 00:54:49,240 to complain to Rockwell or whoever, well, not

891 00:54:49,240 --> 00:54:51,260 Rockwell necessarily yet, but, you know,

892 00:54:51,320 --> 00:54:53,480 ProConOS or CODESYS or whatever about these problems,

893 00:54:53,840 --> 00:54:55,380 that’s not why we started doing this, right?

894 00:54:55,440 --> 00:54:57,300 So, it was two things, right? One,

895 00:54:58,760 --> 00:54:59,320 what is it going

896 00:54:59,320 --> 00:55:01,280 to take? Like, what is, after

897 00:55:01,280 --> 00:55:03,160 TRISIS, we got a lot of questions. You know,

898 00:55:03,180 --> 00:55:05,180 what do you think the level of effort would have been to do this

899 00:55:05,180 --> 00:55:07,240 with the Triconex? Like, how long would it have taken you

900 00:55:07,240 --> 00:55:09,200 all to do it? Which is already, like, kind of a loaded question

901 00:55:09,200 --> 00:55:11,120 because, you know, I’ve been doing RE for

902 00:55:11,120 --> 00:55:13,160 10 years. He’s been doing ICS stuff for 10 or

903 00:55:13,160 --> 00:55:15,080 15 years. You’re looking at two people that have lots of

904 00:55:15,080 --> 00:55:17,120 experience and asking them how long it’ll take. Well, I don’t

905 00:55:17,120 --> 00:55:19,080 know. Like, their team could have taken less or

906 00:55:19,080 --> 00:55:20,560 more time depending on who was on it.

907 00:55:21,600 --> 00:55:23,160 And so, for us, it’s like, well, how do

908 00:55:23,160 --> 00:55:25,200 we, like, how do we actually understand the level

909 00:55:25,200 --> 00:55:27,120 of effort to get from, you know, point A to point

910 00:55:27,120 --> 00:55:29,120 B, right? Without the help of

911 00:55:29,120 --> 00:55:31,120 the vendor. Like, I don’t want to talk to them. I don’t want them giving me

912 00:55:31,120 --> 00:55:33,260 any hints because, you know, ideally the attacker,

913 00:55:33,540 --> 00:55:34,980 you know, in the worst case,

914 00:55:35,300 --> 00:55:36,580 the attacker has no help, right?

915 00:55:37,220 --> 00:55:39,180 And then, the sort of second part

916 00:55:39,180 --> 00:55:40,840 of this, you know, so there’s the understanding

917 00:55:40,840 --> 00:55:43,140 what it takes to actually execute

918 00:55:43,140 --> 00:55:45,060 one of these attacks. The sort

919 00:55:45,060 --> 00:55:47,120 of second part is, we know that this thing isn’t

920 00:55:47,120 --> 00:55:48,820 going to get fixed necessarily right away.

921 00:55:49,820 --> 00:55:51,080 Understanding the first

922 00:55:51,080 --> 00:55:53,280 part lets us build detections

923 00:55:53,280 --> 00:55:54,980 and analytics on the second part. So,

924 00:55:55,160 --> 00:55:57,140 it doesn’t matter if you

925 00:55:57,140 --> 00:55:59,120 get it, well, I mean, it does matter, but it doesn’t matter if an attack

926 00:55:59,120 --> 00:56:01,140 comes through on the wire because the fact is we have a way to detect

927 00:56:01,140 --> 00:56:03,260 it, we have a way to detect a code upload, and we have a way to analyze

928 00:56:03,260 --> 00:56:05,240 the code that gets uploaded. Right. So, we can sort things

929 00:56:05,240 --> 00:56:07,160 out ourselves and the incident responders can do the job

930 00:56:07,160 --> 00:56:09,180 that they’re supposed to do, right? And that’s really what we’re

931 00:56:09,180 --> 00:56:10,000 trying to get at here.

932 00:56:11,500 --> 00:56:13,340 I mean, personally, I didn’t even think to contact

933 00:56:13,340 --> 00:56:15,020 Rockwell for anything when I was doing my work.

934 00:56:15,060 --> 00:56:15,880 Like, why bother?

935 00:56:17,140 --> 00:56:18,940 And also, it would just spoil the fun.

936 00:56:19,220 --> 00:56:20,120 Yeah, exactly.

937 00:56:21,400 --> 00:56:23,200 Take away some of the

938 00:56:23,200 --> 00:56:24,060 engineering problems.

939 00:56:25,080 --> 00:56:26,120 Absolutely, yeah.

940 00:56:26,840 --> 00:56:29,020 But, I mean, what would you

941 00:56:29,020 --> 00:56:30,920 say, how should

942 00:56:30,920 --> 00:56:32,400 the end customers

943 00:56:32,400 --> 00:56:34,740 deal with this? Because it is

944 00:56:34,740 --> 00:56:37,200 a problem that is probably

945 00:56:37,200 --> 00:56:39,080 out

946 00:56:39,080 --> 00:56:41,060 there, regardless if it says

947 00:56:41,060 --> 00:56:43,360 ABB, Rockwell, Honeywell,

948 00:56:43,880 --> 00:56:44,320 whatever.

949 00:56:45,060 --> 00:56:47,180 Yeah. Mitsubishi

950 00:56:47,180 --> 00:56:48,860 on the equipment.

951 00:56:49,200 --> 00:56:51,380 Right. I hate to say perimeter

952 00:56:51,380 --> 00:56:53,280 protection, but perimeter protection, especially

953 00:56:53,280 --> 00:56:55,280 around the controllers, you know, make some

954 00:56:55,280 --> 00:56:57,340 firewall rules that basically say, here are

955 00:56:57,340 --> 00:56:59,460 my engineering workstations. They are allowed

956 00:56:59,460 --> 00:57:01,520 to communicate with the PLCs using

957 00:57:01,520 --> 00:57:03,400 these ports. And, I mean, part of our talk

958 00:57:03,400 --> 00:57:05,540 was listing, you know, what ports and services

959 00:57:05,540 --> 00:57:07,260 are associated with CODESYS and

960 00:57:07,260 --> 00:57:09,420 ProConOS. So, you know, you can at least take that

961 00:57:09,420 --> 00:57:11,440 list and say, okay, let my engineering workstations

962 00:57:11,440 --> 00:57:13,440 talk to the PLC on those ports. Deny

963 00:57:13,440 --> 00:57:15,040 everything else, and if I see anything else try to

964 00:57:15,060 --> 00:57:17,200 use those services on the PLC, I want to know about it.

965 00:57:17,280 --> 00:57:19,140 Flag it. Yeah. Again,

966 00:57:19,260 --> 00:57:21,040 that’s not foolproof, though, right? No.

967 00:57:21,040 --> 00:57:22,920 Because we saw, like, in the

968 00:57:22,920 --> 00:57:24,820 TRISIS case, the attacker

969 00:57:24,820 --> 00:57:26,880 had access to the engineering workstation.

970 00:57:27,400 --> 00:57:28,940 So, at that point, it really becomes

971 00:57:28,940 --> 00:57:31,040 network monitoring. You know, look for the

972 00:57:31,040 --> 00:57:32,980 abnormal behavior. Like, if we can

973 00:57:32,980 --> 00:57:34,840 get to the point where we are

974 00:57:34,840 --> 00:57:36,880 parsing the file that’s being transferred

975 00:57:36,880 --> 00:57:38,700 and can actually analyze that file,

976 00:57:39,160 --> 00:57:41,020 that would be awesome. Yeah. Because then we could

977 00:57:41,020 --> 00:57:43,100 actually, you know, okay, now we have passive detection.

978 00:57:43,780 --> 00:57:44,880 Yes, that came from the

979 00:57:45,060 --> 00:57:47,120 engineering workstation, and yes, it was a

980 00:57:47,120 --> 00:57:49,100 logic update, but it happens to be

981 00:57:49,100 --> 00:57:51,200 a malicious logic update, or at least a suspicious

982 00:57:51,200 --> 00:57:53,340 one. This is something that your, you know,

983 00:57:53,820 --> 00:57:55,020 ICS SOC analyst

984 00:57:55,020 --> 00:57:57,080 wants to dig into. Yeah. Go talk to the

985 00:57:57,080 --> 00:57:59,300 engineer, see if he actually initiated that update,

986 00:57:59,820 --> 00:58:01,140 make sure that the time

987 00:58:01,140 --> 00:58:02,760 window fits, and all that. Exactly.
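
A minimal sketch of the allowlist-plus-window check described above, assuming hypothetical engineering workstation addresses, a hypothetical PLC subnet, placeholder programming ports, and an example maintenance window; none of these values come from the interview.

```python
from datetime import datetime, time
from ipaddress import ip_address, ip_network

# Hypothetical inventory values -- replace with your own plant's data.
ENGINEERING_WORKSTATIONS = {ip_address("10.0.10.5"), ip_address("10.0.10.6")}
PLC_SUBNET = ip_network("10.0.20.0/24")
PROGRAMMING_PORTS = {1962, 2455, 20547}          # placeholder ports; use your runtime's list
MAINTENANCE_WINDOW = (time(8, 0), time(16, 0))   # when logic updates are expected

def check_flow(src, dst, dport, timestamp):
    """Return alert strings for one observed flow toward a controller."""
    alerts = []
    src, dst = ip_address(src), ip_address(dst)
    if dst not in PLC_SUBNET or dport not in PROGRAMMING_PORTS:
        return alerts  # not controller programming traffic, ignore here
    if src not in ENGINEERING_WORKSTATIONS:
        alerts.append(f"{src} reached PLC {dst}:{dport} but is not an approved EWS")
    start, end = MAINTENANCE_WINDOW
    if not (start <= timestamp.time() <= end):
        alerts.append(f"programming traffic to {dst}:{dport} outside the maintenance window")
    return alerts

# Example: a logic download attempt from an unknown host in the middle of the night.
print(check_flow("10.0.30.99", "10.0.20.17", 2455, datetime(2019, 10, 23, 2, 14)))
```

The point is not the specific rule set but that, in an OT network, the set of hosts that should ever talk to a controller on its programming ports is small and knowable, so a violation is exactly the kind of event an ICS SOC analyst can follow up with the engineers.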

988 00:58:03,000 --> 00:58:04,840 I mean, and it’s, when I

989 00:58:04,840 --> 00:58:06,320 used to work for

990 00:58:06,320 --> 00:58:08,240 a public

991 00:58:08,240 --> 00:58:11,020 critical infrastructure

992 00:58:11,020 --> 00:58:13,100 company as an IT manager,

993 00:58:13,440 --> 00:58:15,040 and I had the

994 00:58:15,060 --> 00:58:16,660 notion that I wanted,

995 00:58:17,380 --> 00:58:19,080 I want to know if there is

996 00:58:19,080 --> 00:58:21,260 a software update, regardless if it’s

997 00:58:21,260 --> 00:58:22,840 approved or not. Yeah.

998 00:58:23,020 --> 00:58:24,540 So I want to know that.

999 00:58:24,720 --> 00:58:27,060 Because then I can go back to the

1000 00:58:27,060 --> 00:58:29,300 automation engineers

1001 00:58:29,300 --> 00:58:31,240 and say, did you guys do this?

1002 00:58:31,300 --> 00:58:32,720 Right. Yes or no? Right.

1003 00:58:33,080 --> 00:58:34,680 And then, you know,

1004 00:58:34,880 --> 00:58:37,020 in all the cases, the answer was

1005 00:58:37,020 --> 00:58:38,780 yes. So, I mean, I guess we were

1006 00:58:38,780 --> 00:58:41,080 pretty well off, but…

1007 00:58:41,080 --> 00:58:42,800 I mean, that’s another part of, like,

1008 00:58:43,200 --> 00:58:44,980 getting back to, like, what can

1009 00:58:44,980 --> 00:58:47,020 you do now? Yeah. Right? I would also

1010 00:58:47,020 --> 00:58:48,940 say, like, some of the things we say about

1011 00:58:48,940 --> 00:58:50,080 TRISIS is, like, one,

1012 00:58:50,900 --> 00:58:52,900 you know, the SOC people, whoever’s in charge of your

1013 00:58:52,900 --> 00:58:54,840 security, you know, should

1014 00:58:54,840 --> 00:58:56,880 at least have enough communication with the plant managers to

1015 00:58:56,880 --> 00:58:58,620 understand when these things are getting configured.

1016 00:58:59,040 --> 00:59:00,860 Right? So then they know, like, that’s when the traffic’s

1017 00:59:00,860 --> 00:59:02,820 supposed to be there. Yeah. Right? At least that.

1018 00:59:03,040 --> 00:59:05,060 Yeah. The other thing I’d recommend

1019 00:59:05,060 --> 00:59:06,800 is, you know, whoever’s monitoring

1020 00:59:06,800 --> 00:59:08,840 your network, if they are monitoring the communications

1021 00:59:08,840 --> 00:59:10,900 between the EWS and whatever the system

1022 00:59:10,900 --> 00:59:13,020 is, or whatever the controller is, like,

1023 00:59:13,100 --> 00:59:14,780 pay attention the next time things get configured.

1024 00:59:14,980 --> 00:59:16,900 Look at what that traffic looks like. Yeah. So, if you

1025 00:59:16,900 --> 00:59:18,860 see it outside the normal operating windows, you know, like,

1026 00:59:18,920 --> 00:59:20,800 oh, crap, like, this isn’t something. Right. Just

1027 00:59:20,800 --> 00:59:22,920 even to get a general sense, right? So, at the very least, you can be

1028 00:59:22,920 --> 00:59:24,880 like, okay, this is what it is. Because right now

1029 00:59:24,880 --> 00:59:26,520 we just don’t have the tools, right? Like, what we’re

1030 00:59:26,520 --> 00:59:28,680 trying to do, I mean, this is the problem,

1031 00:59:29,060 --> 00:59:30,880 the big problem with ICS is we’re trying to

1032 00:59:30,880 --> 00:59:32,840 catch up to what we’ve done in the

1033 00:59:32,840 --> 00:59:34,660 Microsoft Windows and Apple and stuff, right?

1034 00:59:35,180 --> 00:59:37,020 Right now, you know, comparably,

1035 00:59:37,580 --> 00:59:38,660 if we saw a

1036 00:59:38,660 --> 00:59:40,780 PE32 executable fly across the wire,

1037 00:59:40,880 --> 00:59:42,720 you know, a tool would parse it, it would run against all these signatures

1038 00:59:42,720 --> 00:59:44,720 and everything, tell you what it is and give you a bunch of

1039 00:59:44,720 --> 00:59:46,720 information on the status. If we had no idea

1040 00:59:46,720 --> 00:59:48,660 what Windows was and a PE executable flew by,

1041 00:59:48,700 --> 00:59:50,860 it would just look like binary. We’d have no idea what it is. And that’s what we’re doing.

1042 00:59:51,100 --> 00:59:52,780 We’re playing catch up before we can do any of the

1043 00:59:52,780 --> 00:59:54,760 things that, in the IT space,

1044 00:59:54,760 --> 00:59:56,640 we’re doing that are like advanced detections.

1045 00:59:57,040 --> 00:59:58,420 You know, we just need visibility.
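
As a toy illustration of the visibility gap being described: recognizing a Windows PE executable on the wire takes only a magic-byte check before handing it off to parsers and signature tooling, while a vendor-specific logic download stays an opaque blob until someone does the reverse engineering. This is only an illustration of the point, not a detection tool.

```python
import struct

def identify_payload(data: bytes) -> str:
    """Crudely classify a captured payload: PE executable or opaque blob."""
    # A PE file starts with the DOS header magic 'MZ'; the 4-byte offset at
    # 0x3C points to the 'PE\0\0' signature of the NT headers.
    if len(data) >= 0x40 and data[:2] == b"MZ":
        (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
        if data[pe_offset:pe_offset + 4] == b"PE\x00\x00":
            return "PE executable -- parse headers, run signatures"
    # Anything else (for example a proprietary controller logic download)
    # is just bytes until the format has been documented or reversed.
    return "unknown binary -- no parser available"
```

For CODESYS, ProConOS, or Triconex logic formats there is no equivalent off-the-shelf check yet, which is exactly the catch-up work the speakers are describing.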

1046 00:59:58,960 --> 01:00:00,840 Right, right. And I think

1047 01:00:00,840 --> 01:00:02,580 that’s an excellent

1048 01:00:02,580 --> 01:00:04,720 takeaway. I mean, gain that visibility

1049 01:00:04,720 --> 01:00:06,200 within your OT

1050 01:00:06,200 --> 01:00:08,720 environment to be able

1051 01:00:08,720 --> 01:00:10,540 to do analysis on

1052 01:00:10,540 --> 01:00:12,740 traffic patterns and so on and so

1053 01:00:12,740 --> 01:00:14,580 forth. At least

1054 01:00:14,580 --> 01:00:16,500 in my experience, I think, looking

1055 01:00:16,500 --> 01:00:18,520 at traffic in an OT network where you

1056 01:00:18,520 --> 01:00:19,220 don’t have any

1057 01:00:19,220 --> 01:00:22,420 users browsing the internet

1058 01:00:22,420 --> 01:00:24,060 and doing all kinds of crazy stuff

1059 01:00:24,060 --> 01:00:26,620 on the wire, it’s

1060 01:00:26,620 --> 01:00:28,480 fairly easy to build up a baseline

1061 01:00:28,480 --> 01:00:30,420 of what does my plant

1062 01:00:30,420 --> 01:00:31,040 look like?

1063 01:00:32,500 --> 01:00:34,620 And what is normal traffic? What are

1064 01:00:34,620 --> 01:00:36,620 anomalies? And so on.

1065 01:00:37,020 --> 01:00:38,020 Yeah, yeah. On the one hand,

1066 01:00:38,920 --> 01:00:40,460 yeah, so background, I guess, for

1067 01:00:40,460 --> 01:00:42,180 the people here, but I came from IT

1068 01:00:42,180 --> 01:00:44,100 doing malware analysis and things like that and

1069 01:00:44,100 --> 01:00:44,560 moving

1070 01:00:44,580 --> 01:00:46,260 to Dragos and started doing this ICS stuff.

1071 01:00:46,640 --> 01:00:47,860 And one of the things I’ve noticed is that

1072 01:00:47,860 --> 01:00:50,420 the malware that’s ICS targeted or

1073 01:00:50,420 --> 01:00:52,440 targeting ICS companies, it’s a lot of living

1074 01:00:52,440 --> 01:00:54,440 off the land stuff or really simple implants and from

1075 01:00:54,440 --> 01:00:56,540 a malware analysis perspective,

1076 01:00:56,660 --> 01:00:58,620 it’s not really that difficult to deal with in terms

1077 01:00:58,620 --> 01:01:00,140 of like, oh, what does this do and whatever.

1078 01:01:00,960 --> 01:01:02,400 The problem here that’s

1079 01:01:02,400 --> 01:01:04,420 very difficult, right, in the

1080 01:01:04,420 --> 01:01:06,360 IT space, we have fairly homogenous

1081 01:01:06,360 --> 01:01:08,440 operating systems, right? There’s Linux, there’s Windows,

1082 01:01:08,740 --> 01:01:10,440 there’s, well, really Linux and

1083 01:01:10,440 --> 01:01:11,640 Windows in the enterprise, let’s be real.

1084 01:01:12,140 --> 01:01:14,500 As much as I like Apple

1085 01:01:14,500 --> 01:01:16,500 products, you know, like I see a lot of enterprise Apple

1086 01:01:16,500 --> 01:01:18,400 stuff, right? And, you know,

1087 01:01:18,440 --> 01:01:20,460 all the understandings there, the operating system understanding there,

1088 01:01:20,540 --> 01:01:22,460 the executable understanding is there, how they’re put together,

1089 01:01:22,560 --> 01:01:24,620 how things are loaded, all these things are well-researched, documented

1090 01:01:24,620 --> 01:01:25,440 or open source.

1091 01:01:26,680 --> 01:01:28,380 The larger problem in ICS is we have a

1092 01:01:28,380 --> 01:01:30,200 fragmented operating system

1093 01:01:30,200 --> 01:01:32,460 landscape. And that’s what’s difficult.

1094 01:01:32,980 --> 01:01:34,500 You know, CODESYS is basically an operating

1095 01:01:34,500 --> 01:01:36,460 system, right? ProConOS is an operating system.

1096 01:01:36,820 --> 01:01:38,280 Whatever RSLogix is doing is

1097 01:01:38,280 --> 01:01:40,000 an operating system. Triconex.

1098 01:01:40,440 --> 01:01:42,320 And it’s like, yo, why are we doing this to ourselves?

1099 01:01:42,500 --> 01:01:44,360 We’ve made the problem so much harder than it

1100 01:01:44,360 --> 01:01:46,440 needs to be, right? And the same thing

1101 01:01:46,440 --> 01:01:48,180 with, like, whatever they do.

1102 01:01:48,260 --> 01:01:50,440 And then we have the custom compilers that can only…

1103 01:01:50,440 --> 01:01:52,560 And all these things aren’t well-researched because the ICS

1104 01:01:52,560 --> 01:01:54,080 community doesn’t really

1105 01:01:54,080 --> 01:01:56,020 approach things from, like, a…

1106 01:01:56,020 --> 01:01:58,300 Well, they haven’t, from, like, sort of, like, what would be easy

1107 01:01:58,300 --> 01:02:00,340 to defend against, right? Or defend…

1108 01:02:00,340 --> 01:02:01,580 What would be easy to defend, really?

1109 01:02:02,720 --> 01:02:04,380 Or even, like, a more, like, I mean,

1110 01:02:04,380 --> 01:02:06,580 honestly, like, a programming-language-centric

1111 01:02:06,580 --> 01:02:08,380 kind of point of view. Like, what would make things

1112 01:02:08,380 --> 01:02:10,400 easier, you know? It just hasn’t happened.

1113 01:02:10,920 --> 01:02:12,380 But do you guys think that is

1114 01:02:12,960 --> 01:02:14,360 a big part

1115 01:02:14,360 --> 01:02:16,420 of the problem? That there’s so

1116 01:02:16,420 --> 01:02:18,400 much… I mean,

1117 01:02:18,460 --> 01:02:20,400 it’s so hard to do research

1118 01:02:20,400 --> 01:02:22,500 on these things. And it’s cost-prohibitive.

1119 01:02:23,020 --> 01:02:24,540 And it’s

1120 01:02:24,540 --> 01:02:26,300 hard to get hold of unless

1121 01:02:26,300 --> 01:02:27,840 you have a, you know,

1122 01:02:27,960 --> 01:02:29,600 a big load of money.

1123 01:02:31,300 --> 01:02:31,320 And

1124 01:02:31,320 --> 01:02:34,440 there’s really not

1125 01:02:34,440 --> 01:02:36,080 much…

1126 01:02:36,080 --> 01:02:37,940 Unless you’re working for someone

1127 01:02:37,940 --> 01:02:40,480 particular that has, you know,

1128 01:02:40,580 --> 01:02:42,300 a big box of money

1129 01:02:42,300 --> 01:02:42,680 and

1130 01:02:42,680 --> 01:02:46,080 they want you to do research

1131 01:02:46,080 --> 01:02:47,340 on a specific product.

1132 01:02:47,760 --> 01:02:49,280 You don’t get access to that.

1133 01:02:49,480 --> 01:02:50,240 I mean…

1134 01:02:50,240 --> 01:02:52,200 Right. And we’re lucky to be in a

1135 01:02:52,200 --> 01:02:54,400 place where, yeah, we do have the money

1136 01:02:54,400 --> 01:02:56,360 where, okay, we can go acquire this hardware

1137 01:02:56,360 --> 01:02:58,280 and have the time to analyze the network

1138 01:02:58,280 --> 01:03:00,120 protocol. Partnerships and…

1139 01:03:00,120 --> 01:03:02,320 This isn’t something that, unfortunately,

1140 01:03:02,460 --> 01:03:04,120 the whole community can do yet because of that.

1141 01:03:04,660 --> 01:03:05,660 That’s hurting us.

1142 01:03:06,360 --> 01:03:08,400 Right. And that’s, I think,

1143 01:03:08,840 --> 01:03:10,380 why there are so few security

1144 01:03:10,380 --> 01:03:12,560 researchers looking at,

1145 01:03:12,560 --> 01:03:13,300 you know, that ICS stuff.

1146 01:03:14,220 --> 01:03:15,640 Yeah. Because it is

1147 01:03:15,640 --> 01:03:18,260 hard to get hold of.

1148 01:03:18,820 --> 01:03:20,940 And, yeah.

1149 01:03:20,960 --> 01:03:22,420 Yeah. Both the hardware and the software

1150 01:03:22,420 --> 01:03:24,380 are hard to get a hold of. I mean, you can buy

1151 01:03:24,380 --> 01:03:26,240 the hardware on eBay, sometimes cheaply,

1152 01:03:26,340 --> 01:03:27,640 you know, but even getting

1153 01:03:27,640 --> 01:03:30,340 a licensed copy of the software

1154 01:03:30,340 --> 01:03:31,700 is hard. Yes.

1155 01:03:31,960 --> 01:03:34,140 You can buy the PLCs,

1156 01:03:34,420 --> 01:03:36,160 but to get… Good luck

1157 01:03:36,160 --> 01:03:37,560 programming it. Yeah.

1158 01:03:37,680 --> 01:03:39,400 Good luck programming it. Exactly.

1159 01:03:39,400 --> 01:03:39,960 Yeah.

1160 01:03:41,440 --> 01:03:42,000 That’s,

1161 01:03:42,560 --> 01:03:44,980 what would you say,

1162 01:03:45,820 --> 01:03:47,420 where do you think the ICS

1163 01:03:47,420 --> 01:03:48,880 community

1164 01:03:48,880 --> 01:03:51,680 in terms of security research will be

1165 01:03:51,680 --> 01:03:53,720 in, let’s say, 10 years? Will we have

1166 01:03:53,720 --> 01:03:55,520 caught up to where

1167 01:03:55,520 --> 01:03:57,640 IT is today, or are we still

1168 01:03:57,640 --> 01:03:59,580 lagging behind? We’ll

1169 01:03:59,580 --> 01:04:01,660 probably still be lagging behind where IT

1170 01:04:01,660 --> 01:04:03,700 is now. And I think, of course,

1171 01:04:03,820 --> 01:04:06,020 you know, IT methodologies

1172 01:04:06,020 --> 01:04:07,520 will be continuing to plod

1173 01:04:07,520 --> 01:04:09,520 forward in that time, too. I mean, it’s kind of

1174 01:04:09,520 --> 01:04:11,480 always been the case with ICS

1175 01:04:11,480 --> 01:04:12,340 security, I think.

1176 01:04:12,560 --> 01:04:14,540 We’re always quite a ways behind.

1177 01:04:14,680 --> 01:04:16,240 You know, we used to refer to this

1178 01:04:16,240 --> 01:04:18,240 industrial control system security

1179 01:04:18,240 --> 01:04:20,260 lost decade in the early

1180 01:04:20,260 --> 01:04:22,540 2000s. It was like, oh, why don’t even

1181 01:04:22,540 --> 01:04:24,580 PLCs have basic security

1182 01:04:24,580 --> 01:04:26,140 features like authentication? Well,

1183 01:04:26,440 --> 01:04:28,060 for the most part, they still don’t.

1184 01:04:28,680 --> 01:04:30,580 You know, and that’s, we’re now 10 or 15

1185 01:04:30,580 --> 01:04:32,620 years after the lost decade, and we still

1186 01:04:32,620 --> 01:04:33,780 haven’t quite caught up.

1187 01:04:34,240 --> 01:04:36,560 What happened to code signatures and things like

1188 01:04:36,560 --> 01:04:37,520 that? Yeah.

1189 01:04:38,160 --> 01:04:40,580 Yeah, I mean, it kind of depends

1190 01:04:40,580 --> 01:04:41,860 on what you mean by catch up.

1191 01:04:42,560 --> 01:04:44,220 Right? Like, what are we catching up

1192 01:04:44,220 --> 01:04:46,100 to? Like, what’s the sort of

1193 01:04:46,100 --> 01:04:48,280 goalposts that you say

1194 01:04:48,280 --> 01:04:49,480 are going to keep moving?

1195 01:04:49,980 --> 01:04:52,400 Like, so, for example,

1196 01:04:52,680 --> 01:04:54,220 like, when I think about it, if

1197 01:04:54,220 --> 01:04:55,980 the Triconex operating system is like Windows

1198 01:04:55,980 --> 01:04:57,900 95, and we happen to get,

1199 01:04:58,080 --> 01:04:59,980 we don’t have it yet, maybe we’ll get it eventually,

1200 01:05:00,060 --> 01:05:01,300 I don’t know. I say yet, we don’t know.

1201 01:05:02,120 --> 01:05:04,020 That was firmware version 10, and then we get firmware

1202 01:05:04,020 --> 01:05:05,920 version 11. Sure. And we look at that.

1203 01:05:06,020 --> 01:05:07,880 Is that operating system a little bit more up to date?

1204 01:05:07,980 --> 01:05:10,120 Is that operating system more like Windows 98 or XP?

1205 01:05:10,340 --> 01:05:12,020 I mean, we don’t know. Like, we don’t

1206 01:05:12,560 --> 01:05:14,220 know what the ICS vendors are learning

1207 01:05:14,220 --> 01:05:16,420 from the modern operating system world,

1208 01:05:16,520 --> 01:05:18,260 or if they are even trying to learn

1209 01:05:18,260 --> 01:05:20,260 from the modern operating system world. So, for me,

1210 01:05:20,300 --> 01:05:22,380 it’s hard to, it’s like when you calculate

1211 01:05:22,380 --> 01:05:24,240 slope, you need two points to figure out, like,

1212 01:05:24,260 --> 01:05:26,340 what the, you know, the direction is. I don’t have

1213 01:05:26,340 --> 01:05:28,420 two points of reference for operating systems yet

1214 01:05:28,420 --> 01:05:29,820 in the ICS field

1215 01:05:29,820 --> 01:05:32,480 to be able to say, like, oh yeah, we’re progressing

1216 01:05:32,480 --> 01:05:34,480 in any sort of way. Reid’s been around longer

1217 01:05:34,480 --> 01:05:36,320 than me in terms of, like, the authentication and all these

1218 01:05:36,320 --> 01:05:38,120 things, but that’s what I’m looking at. I’m like,

1219 01:05:38,440 --> 01:05:40,100 how secure are these operating systems?

1220 01:05:40,360 --> 01:05:42,040 And I don’t have the data points for that yet.

1221 01:05:42,560 --> 01:05:44,520 So, pretty dismal.

1222 01:05:44,900 --> 01:05:46,580 Yeah, yeah. It’s depressing

1223 01:05:46,580 --> 01:05:47,460 at times, I think.

1224 01:05:48,420 --> 01:05:50,400 Maybe because it’s like I’m still a little green.

1225 01:05:50,620 --> 01:05:51,760 Yeah, yeah. You’ll get there.

1226 01:05:51,860 --> 01:05:54,560 I’m not really that depressed by it. It’s just a,

1227 01:05:54,560 --> 01:05:56,740 you know, it’s a reality you have to accept,

1228 01:05:56,840 --> 01:05:58,260 I guess. And another thing, I mean,

1229 01:05:58,360 --> 01:06:00,140 that I will touch upon

1230 01:06:00,140 --> 01:06:01,280 tomorrow, but

1231 01:06:01,280 --> 01:06:04,420 about disclosure. I mean,

1232 01:06:04,440 --> 01:06:06,120 how should you work with that?

1233 01:06:06,900 --> 01:06:08,200 Because, like I say,

1234 01:06:08,320 --> 01:06:09,360 even if there’s

1235 01:06:09,360 --> 01:06:11,360 a new product or

1236 01:06:11,360 --> 01:06:12,540 a new software,

1237 01:06:12,560 --> 01:06:14,360 firmware or something

1238 01:06:14,360 --> 01:06:16,360 that is fixing the problem,

1239 01:06:17,080 --> 01:06:18,020 it’s going to be

1240 01:06:18,020 --> 01:06:20,540 ages before that is implemented

1241 01:06:20,540 --> 01:06:22,340 in all the customer systems.

1242 01:06:22,700 --> 01:06:24,620 This goes back to what we were talking about

1243 01:06:24,620 --> 01:06:26,480 before. It’s not…

1244 01:06:26,480 --> 01:06:28,360 Patching is one small part of a defensive

1245 01:06:28,360 --> 01:06:30,380 strategy. Visibility

1246 01:06:30,380 --> 01:06:31,860 and detection is more important.

1247 01:06:32,680 --> 01:06:34,280 So, if you know the thing is wrong,

1248 01:06:34,500 --> 01:06:36,540 figure out how you detect the wrong thing. And then it doesn’t

1249 01:06:36,540 --> 01:06:38,620 matter if you weren’t patched. I mean, it matters a little

1250 01:06:38,620 --> 01:06:40,180 bit, but it doesn’t matter because you’ll be able to see.

1251 01:06:40,180 --> 01:06:42,420 So, everything is known. And to some

1252 01:06:42,420 --> 01:06:43,820 extent, I feel like that is more important.

1253 01:06:44,440 --> 01:06:46,480 Right? And that’s sort of the way

1254 01:06:46,480 --> 01:06:48,640 I look at it. Because if you’re going to have this long time to patch

1255 01:06:48,640 --> 01:06:50,520 and you’re not necessarily going to be able to update

1256 01:06:50,520 --> 01:06:52,180 it, because we’ve dealt with sensors where they’re like, no,

1257 01:06:52,440 --> 01:06:54,420 the customer can’t update that. One of our guys has to fly

1258 01:06:54,420 --> 01:06:56,300 out there and update it. It’s like, well,

1259 01:06:56,320 --> 01:06:58,380 if that’s the process, then you just need

1260 01:06:58,380 --> 01:07:00,340 visibility. I don’t want to say that’s

1261 01:07:00,340 --> 01:07:02,380 a solution, but it’s as close as

1262 01:07:02,380 --> 01:07:04,500 we’re going to get to a solution, at least for the time

1263 01:07:04,500 --> 01:07:06,420 being, until something significant changes.

1264 01:07:07,220 --> 01:07:08,500 Yeah. But yeah,

1265 01:07:08,580 --> 01:07:10,420 on the disclosure window thing, I mean,

1266 01:07:10,440 --> 01:07:12,340 I’ve been all over on disclosure, you know, back

1267 01:07:12,340 --> 01:07:14,360 in the day, I used to drop 0-days on vendors all

1268 01:07:14,360 --> 01:07:16,340 the time. Now we do a lot more

1269 01:07:16,340 --> 01:07:18,020 private reporting where we say,

1270 01:07:18,400 --> 01:07:20,360 we’re never going to publicly report

1271 01:07:20,360 --> 01:07:22,160 on vulnerabilities. We’ll report it to the vendor.

1272 01:07:22,600 --> 01:07:24,580 We’ll get in touch with as many customers as we can

1273 01:07:24,580 --> 01:07:26,640 and let them know, like, hey, there was this problem.

1274 01:07:26,780 --> 01:07:27,540 It’s been fixed.

1275 01:07:28,180 --> 01:07:29,520 And try to do it quietly

1276 01:07:29,520 --> 01:07:32,500 and see how that works out. But I won’t

1277 01:07:32,500 --> 01:07:34,520 fault anyone if they decide that they’re

1278 01:07:34,520 --> 01:07:36,740 going to take a vulnerability and release

1279 01:07:36,740 --> 01:07:38,100 a working exploit, even.

1280 01:07:38,300 --> 01:07:40,560 I totally get it, because

1281 01:07:40,560 --> 01:07:42,320 having worked with a lot of

1282 01:07:42,320 --> 01:07:44,200 vendors, it’s such a slow

1283 01:07:44,200 --> 01:07:46,240 process. It requires a lot

1284 01:07:46,240 --> 01:07:47,520 of input from the researcher.

1285 01:07:48,020 --> 01:07:50,160 It can be really difficult, and I don’t think

1286 01:07:50,160 --> 01:07:52,200 vendors always appreciate how much time goes

1287 01:07:52,200 --> 01:07:54,340 into, you know, the security

1288 01:07:54,340 --> 01:07:56,220 research. They kind of

1289 01:07:56,220 --> 01:07:58,300 treat it as a free resource, and they may

1290 01:07:58,300 --> 01:08:00,260 sit and take two, three years

1291 01:08:00,260 --> 01:08:02,100 to actually fix the vulnerability.

1292 01:08:02,860 --> 01:08:04,160 And in that time, you have to

1293 01:08:04,160 --> 01:08:06,160 constantly be poking them and saying, like, you

1294 01:08:06,160 --> 01:08:08,160 really need to put out an advisory about this

1295 01:08:08,160 --> 01:08:10,300 and let your customers know that they need

1296 01:08:10,300 --> 01:08:12,220 to update or at least, you know, put

1297 01:08:12,220 --> 01:08:14,160 in some firewall rules and what to block

1298 01:08:14,160 --> 01:08:16,040 and what to look for for anomaly detection.

1299 01:08:17,800 --> 01:08:20,200 Do you think there will ever be

1300 01:08:20,200 --> 01:08:22,480 like bug bounty programs for ICS?

1301 01:08:23,120 --> 01:08:23,980 Probably eventually.

1302 01:08:24,220 --> 01:08:26,040 Yeah, I hope so. I think that

1303 01:08:26,040 --> 01:08:28,240 right now, just because of the,

1304 01:08:28,400 --> 01:08:30,140 you know, most of these systems lack

1305 01:08:30,140 --> 01:08:32,220 really basic security features like authentication,

1306 01:08:32,660 --> 01:08:34,360 I don’t view the bug bounty program

1307 01:08:34,360 --> 01:08:36,240 as being particularly worthwhile, just because

1308 01:08:36,240 --> 01:08:38,060 it’s like, oh, well, you found a buffer

1309 01:08:38,060 --> 01:08:40,220 overflow vulnerability in some runtime,

1310 01:08:40,220 --> 01:08:42,180 but the runtime didn’t have any security

1311 01:08:42,180 --> 01:08:44,000 anyway, so a

1312 01:08:44,000 --> 01:08:46,260 real attacker probably isn’t going to use the

1313 01:08:46,260 --> 01:08:48,300 bug, they’re just going to use whatever features

1314 01:08:48,300 --> 01:08:50,140 are present in the service to

1315 01:08:50,140 --> 01:08:52,160 compromise the system

1316 01:08:52,160 --> 01:08:54,160 and go that route. I’ve always

1317 01:08:54,160 --> 01:08:56,440 found the vulnerability problem in both sectors,

1318 01:08:56,580 --> 01:08:58,380 ICS and IT, to be a little…

1319 01:08:58,380 --> 01:09:00,320 It’s a symptom of a

1320 01:09:00,320 --> 01:09:02,080 larger problem, which is just better development

1321 01:09:02,080 --> 01:09:04,600 practices, and that’s what needs to happen in ICS.

1322 01:09:05,100 --> 01:09:06,140 Right? Because then you

1323 01:09:06,140 --> 01:09:07,860 get the basic stuff out of the way,

1324 01:09:07,860 --> 01:09:09,940 and then the software is to a point

1325 01:09:09,940 --> 01:09:12,100 where you can have a bug bounty program and it actually makes sense.

1326 01:09:12,180 --> 01:09:13,980 Otherwise, the companies are going to go broke.

1327 01:09:14,140 --> 01:09:14,380 Yeah.

1328 01:09:15,380 --> 01:09:18,080 I think that’s a very

1329 01:09:18,080 --> 01:09:19,840 valid point, and I think

1330 01:09:19,840 --> 01:09:22,280 that’s where I believe

1331 01:09:22,280 --> 01:09:23,720 the ICS

1332 01:09:23,720 --> 01:09:27,500 side of the house could

1333 01:09:27,500 --> 01:09:29,240 catch up faster.

1334 01:09:29,700 --> 01:09:32,100 They don’t have to go through and learn

1335 01:09:32,100 --> 01:09:34,280 by making all the mistakes,

1336 01:09:34,520 --> 01:09:35,960 because there are

1337 01:09:35,960 --> 01:09:38,040 sound practices for code development,

1338 01:09:38,760 --> 01:09:39,920 and there are

1339 01:09:39,920 --> 01:09:41,660 processes that you can implement

1340 01:09:41,660 --> 01:09:43,900 to make sure that you don’t have

1341 01:09:43,900 --> 01:09:45,860 issues. I mean, do a threat

1342 01:09:45,860 --> 01:09:47,420 modeling and things like that.

1343 01:09:47,800 --> 01:09:49,920 Right, and come up with your list of

1344 01:09:49,920 --> 01:09:51,860 okay, here are my

1345 01:09:51,860 --> 01:09:53,700 inputs to the system,

1346 01:09:54,360 --> 01:09:55,940 and develop a threat

1347 01:09:55,940 --> 01:09:57,000 model against those inputs.
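
A sketch of the exercise being described, listing a controller's inputs and the questions to threat-model against each one; the inputs and questions below are generic examples, not taken from any specific product or from the talk.

```python
# A deliberately tiny, hypothetical threat-model table: enumerate the inputs
# a controller accepts and the questions to ask about each one.
INPUTS = {
    "engineering protocol (logic download)": [
        "Is the sender authenticated?",
        "Is the downloaded logic validated before it runs?",
    ],
    "fieldbus I/O values": [
        "Can a spoofed sensor value drive an unsafe output?",
    ],
    "firmware update channel": [
        "Are images signed, and is the signature checked on the device?",
    ],
    "web/diagnostic interface": [
        "Does it expose functions that bypass the engineering workflow?",
    ],
}

for source, questions in INPUTS.items():
    print(f"input: {source}")
    for question in questions:
        print(f"  - {question}")
```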

1348 01:09:58,060 --> 01:09:59,880 So I think that is

1349 01:09:59,880 --> 01:10:01,820 true. I mean, there is a huge benefit to

1350 01:10:01,820 --> 01:10:03,900 Microsoft and other companies already

1351 01:10:03,900 --> 01:10:05,880 having gone through that process,

1352 01:10:05,880 --> 01:10:07,840 and now you can just take their process and

1353 01:10:07,840 --> 01:10:09,980 hopefully work your pipeline around it.

1354 01:10:10,000 --> 01:10:11,520 Microsoft SDL book,

1355 01:10:11,660 --> 01:10:13,780 that should be a required reading for all

1356 01:10:13,780 --> 01:10:14,340 developers.

1357 01:10:16,000 --> 01:10:17,900 Oh, for sure. And some vendors

1358 01:10:17,900 --> 01:10:19,540 are starting to take that seriously.

1359 01:10:19,780 --> 01:10:21,740 I mean, they actually, when they develop products now,

1360 01:10:21,840 --> 01:10:23,440 they actually do

1361 01:10:23,440 --> 01:10:25,420 a software development life cycle.

1362 01:10:25,600 --> 01:10:27,940 They say, okay, this is

1363 01:10:27,940 --> 01:10:29,840 starting

1364 01:10:29,840 --> 01:10:31,420 with a threat model on the new product,

1365 01:10:31,580 --> 01:10:33,980 and going through that process, and actually

1366 01:10:33,980 --> 01:10:35,880 testing the artifacts that come out of that,

1367 01:10:35,880 --> 01:10:37,580 and all that. Yeah, and then all the other

1368 01:10:37,580 --> 01:10:39,380 nice useful tools, right? Like actual

1369 01:10:39,380 --> 01:10:41,640 runtime testing, and not runtime testing,

1370 01:10:41,660 --> 01:10:43,660 I’m sorry, like your

1371 01:10:43,660 --> 01:10:45,400 what is it called?

1372 01:10:47,540 --> 01:10:48,080 You know,

1373 01:10:48,360 --> 01:10:49,840 well, just doing the basic testing

1374 01:10:49,840 --> 01:10:51,860 stuff, like your unit tests, and then

1375 01:10:51,860 --> 01:10:53,800 your static code analyzers, and

1376 01:10:53,800 --> 01:10:55,940 and fuzzing, like these things

1377 01:10:55,940 --> 01:10:58,080 that need to sort of be built into your larger dev process

1378 01:10:58,080 --> 01:10:59,760 need to just be there. And I don’t know how much of that

1379 01:10:59,760 --> 01:11:01,920 they’re actually doing. I think some

1380 01:11:01,920 --> 01:11:03,900 vendors are better than others, but even the ones

1381 01:11:03,900 --> 01:11:05,780 that claim they’re doing

1382 01:11:05,780 --> 01:11:08,180 a secure development process

1383 01:11:08,180 --> 01:11:10,000 end up releasing new products that clearly

1384 01:11:10,000 --> 01:11:11,640 had not been tested, or at least not.

1385 01:11:11,640 --> 01:11:13,480 Yeah, not to any sufficient degree.

1386 01:11:14,480 --> 01:11:15,040 You know what I think?

1387 01:11:15,420 --> 01:11:17,180 Bugs are going to be there. Oh, yeah, for sure.

1388 01:11:17,800 --> 01:11:19,680 We want certain bugs not

1389 01:11:19,680 --> 01:11:21,600 to be there. Yeah. Right, there are certain classes

1390 01:11:21,600 --> 01:11:23,280 of bugs I think we can eliminate, like

1391 01:11:23,280 --> 01:11:25,720 we were talking about in our talk, or maybe during the Q&A

1392 01:11:25,720 --> 01:11:26,600 for a second, like

1393 01:11:26,600 --> 01:11:29,680 you know, the JVM has

1394 01:11:29,680 --> 01:11:31,700 like a bytecode verifier, and it guarantees

1395 01:11:31,700 --> 01:11:33,320 that certain problems won’t occur.

1396 01:11:33,860 --> 01:11:35,680 And we need things like that, right?

1397 01:11:35,780 --> 01:11:37,600 Like, you know, right now

1398 01:11:37,600 --> 01:11:39,600 as it stands, a lot of these controllers that we

1399 01:11:39,600 --> 01:11:41,600 can get arbitrary code execution out of, we could just

1400 01:11:41,640 --> 01:11:43,760 brick. Right? We could just crash

1401 01:11:43,760 --> 01:11:45,300 the thing and be done with it. Yeah.

1402 01:11:45,440 --> 01:11:47,960 Which isn’t an ideal situation. That shouldn’t be a possibility.

1403 01:11:48,400 --> 01:11:49,700 Right? Like, so

1404 01:11:49,700 --> 01:11:51,660 if, for example, like a controller, I’m not saying a controller

1405 01:11:51,660 --> 01:11:53,580 needs to run the JVM, because that would be a little insane,

1406 01:11:53,960 --> 01:11:55,740 but something like it, that could

1407 01:11:55,740 --> 01:11:57,760 actually, before it ran its code, it just

1408 01:11:57,760 --> 01:11:59,700 did a little check and said, okay, it meets these

1409 01:11:59,700 --> 01:12:01,700 properties. That would be miles ahead of

1410 01:12:01,700 --> 01:12:03,600 what we have now. Yeah. Because it would mean certain things we

1411 01:12:03,600 --> 01:12:05,480 wouldn’t necessarily have to worry about. Not to say that

1412 01:12:05,480 --> 01:12:07,600 that verifier couldn’t have vulnerabilities or that there couldn’t be

1413 01:12:07,600 --> 01:12:09,760 some problems, but we’ve kind of narrowed things

1414 01:12:09,760 --> 01:12:11,600 down now. It’s like a little bit easier, right? We can

1415 01:12:11,640 --> 01:12:13,160 focus on this one little verification thing.
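
A sketch of the "little check before the code runs" idea, not of how any real controller or the JVM verifier actually works: the instruction encoding, opcode set, and limits below are invented for the example, and a loader would only accept an uploaded logic blob if it passes all the checks.

```python
# Hypothetical 2-byte instruction encoding: (opcode, operand).
ALLOWED_OPCODES = {
    0x01: "LOAD_INPUT",    # operand = input channel
    0x02: "STORE_OUTPUT",  # operand = output channel
    0x03: "COMPARE",       # operand = constant index
    0x04: "JUMP_IF",       # operand = instruction index
    0x05: "HALT",
}
MAX_PROGRAM_WORDS = 1024
IO_CHANNELS = 64

def verify_logic(blob: bytes) -> list[str]:
    """Return reasons to reject the uploaded logic; an empty list means accept."""
    if len(blob) % 2 != 0 or len(blob) // 2 > MAX_PROGRAM_WORDS:
        return ["malformed or oversized program"]
    errors = []
    n_instructions = len(blob) // 2
    for i in range(n_instructions):
        opcode, operand = blob[2 * i], blob[2 * i + 1]
        if opcode not in ALLOWED_OPCODES:
            errors.append(f"instruction {i}: unknown opcode 0x{opcode:02x}")
        elif ALLOWED_OPCODES[opcode] == "JUMP_IF" and operand >= n_instructions:
            errors.append(f"instruction {i}: jump target {operand} out of range")
        elif ALLOWED_OPCODES[opcode] in ("LOAD_INPUT", "STORE_OUTPUT") and operand >= IO_CHANNELS:
            errors.append(f"instruction {i}: I/O channel {operand} out of range")
    return errors
```

Even a check this small rules out whole classes of malformed or out-of-bounds uploads before they ever reach the runtime.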

1416 01:12:14,280 --> 01:12:15,560 And we don’t even have that, and

1417 01:12:15,560 --> 01:12:17,720 they could very easily do that. So, like, the one thing that

1418 01:12:17,720 --> 01:12:19,620 I really wish vendors would do is when

1419 01:12:19,620 --> 01:12:21,620 they’re looking at re-releasing their next operating system,

1420 01:12:21,700 --> 01:12:23,760 they’re like, okay, what would it cost, right,

1421 01:12:24,320 --> 01:12:25,760 to, like, port this

1422 01:12:25,760 --> 01:12:27,560 OS or whatever it is to, like,

1423 01:12:27,880 --> 01:12:29,560 a type-safe language. Just that.

1424 01:12:29,760 --> 01:12:31,640 Just a type-safe language, you know,

1425 01:12:31,720 --> 01:12:32,960 would work. You know, like,

1426 01:12:33,080 --> 01:12:35,320 even if it’s, like, Rust or something. Right.

1427 01:12:35,600 --> 01:12:37,520 That wouldn’t be the best, but it would still work.

1428 01:12:37,520 --> 01:12:39,520 You know, because at least it would be type-safe and there’s a whole class of

1429 01:12:39,520 --> 01:12:40,380 errors you don’t have to deal with.

1430 01:12:40,380 --> 01:12:42,300 Yeah. Golang.

1431 01:12:42,760 --> 01:12:44,380 Yeah. Maybe not Golang, but…

1432 01:12:45,780 --> 01:12:47,840 I mean, well, you’d have to write…

1433 01:12:47,840 --> 01:12:49,660 The thing is, you’d have to be able to…

1434 01:12:49,660 --> 01:12:52,360 Golang still… The problem with Rust and Golang is you still have to compile it.

1435 01:12:52,460 --> 01:12:54,500 Yeah. Right, so it’ll compile down to the assembly language.

1436 01:12:54,600 --> 01:12:56,380 So you’d still be sending raw bytecodes to the

1437 01:12:57,020 --> 01:12:58,420 system. And that

1438 01:12:58,420 --> 01:12:59,600 could be exploited then. Yeah.

1439 01:12:59,840 --> 01:13:02,200 The system needs a way to sort of, like,

1440 01:13:02,580 --> 01:13:04,520 I don’t want to say type-check, but sort of, like, type-check

1441 01:13:04,520 --> 01:13:06,440 the assembly it’s getting. So, like,

1442 01:13:06,520 --> 01:13:08,300 the JVM and the .NET runtime

1443 01:13:08,300 --> 01:13:10,360 makes sense because the bytecodes that

1444 01:13:10,380 --> 01:13:12,320 it emits are… It’s basically like

1445 01:13:12,320 --> 01:13:14,500 a typed assembly language, right? The type information

1446 01:13:14,500 --> 01:13:16,560 is still present. So the controller

1447 01:13:16,560 --> 01:13:18,540 or whatever is loading, in this case the JVM

1448 01:13:18,540 --> 01:13:20,320 or whatever that is loading the code, has

1449 01:13:20,320 --> 01:13:22,420 enough information to say, oh, well, you know,

1450 01:13:22,480 --> 01:13:24,360 this operation is illegal because these two

1451 01:13:24,360 --> 01:13:26,440 types don’t match. Yeah. Right, and you can start checking

1452 01:13:26,440 --> 01:13:28,540 for, like, out-of-bounds memory access and these kinds of things

1453 01:13:28,540 --> 01:13:30,140 because there’s extra metadata

1454 01:13:30,140 --> 01:13:32,460 inside of the assembly language that allows it to do

1455 01:13:32,460 --> 01:13:34,300 that, right? So any of these sort of, like,

1456 01:13:34,360 --> 01:13:36,300 higher-level languages would allow it, but the

1457 01:13:36,300 --> 01:13:38,480 compiled ones, you’d have to develop a mechanism for doing

1458 01:13:38,480 --> 01:13:40,320 those kinds of checks or, like, pass extra

1459 01:13:40,380 --> 01:13:42,420 metadata. You know,

1460 01:13:42,800 --> 01:13:44,440 I mean, that’s the kind of thing I think needs

1461 01:13:44,440 --> 01:13:46,400 to happen is because these programs are simple.

1462 01:13:46,700 --> 01:13:49,120 Yeah. They are. You don’t need to be

1463 01:13:49,120 --> 01:13:50,720 running straight PowerPC to, like,

1464 01:13:50,800 --> 01:13:52,620 you know, check a

1465 01:13:52,620 --> 01:13:54,620 pressure sensor. Yeah. Why does

1466 01:13:54,620 --> 01:13:56,520 it matter if it’s… Yeah, it always surprises

1467 01:13:56,520 --> 01:13:58,520 me that, you know, IEC 61131-3,

1468 01:13:58,620 --> 01:14:00,540 the standard for programmable logic

1469 01:14:00,540 --> 01:14:02,500 controllers, is a Turing-complete programming

1470 01:14:02,500 --> 01:14:04,620 language. And it always just surprises me

1471 01:14:04,620 --> 01:14:06,620 a little bit to think about that. I’m like, do we really

1472 01:14:06,620 --> 01:14:08,500 need that? You know, if we’re just

1473 01:14:08,500 --> 01:14:10,360 going to be comparing sensor inputs and

1474 01:14:10,380 --> 01:14:12,560 controlling some outputs, yeah, you want

1475 01:14:12,560 --> 01:14:14,540 some, maybe some more complex control logic

1476 01:14:14,540 --> 01:14:16,440 in some cases, but do we really

1477 01:14:16,440 --> 01:14:18,460 need a Turing complete programming language? And some

1478 01:14:18,460 --> 01:14:20,500 of these, like, some of these run

1479 01:14:20,500 --> 01:14:22,440 times actually have more advanced features.

1480 01:14:22,700 --> 01:14:24,420 You know, it’s like, oh, you can actually program

1481 01:14:24,420 --> 01:14:26,440 an arbitrary TCP client and

1482 01:14:26,440 --> 01:14:28,580 server in the PLC for,

1483 01:14:28,680 --> 01:14:30,600 like, grabbing data off of some remote system.

1484 01:14:30,680 --> 01:14:32,480 I’m like, do we really need that? I

1485 01:14:32,480 --> 01:14:34,560 know it’s there for a reason. I know it’s there because

1486 01:14:34,560 --> 01:14:36,520 some customer wanted it. They said, oh, we’ve

1487 01:14:36,520 --> 01:14:38,580 got this weird system with some weird

1488 01:14:38,580 --> 01:14:40,340 protocol, and we want the PLC,

1489 01:14:40,380 --> 01:14:42,500 to be able to pull that weird system, get

1490 01:14:42,500 --> 01:14:44,560 some data out of it, parse that response,

1491 01:14:44,680 --> 01:14:46,560 and then do some control logic based on it.

1492 01:14:47,180 --> 01:14:48,380 But you have to be like, why?

1493 01:14:48,740 --> 01:14:50,220 I wish we

1494 01:14:50,220 --> 01:14:52,160 hadn’t agreed to do this.

1495 01:14:52,340 --> 01:14:53,600 What does that remind me of? Like,

1496 01:14:54,020 --> 01:14:56,440 one of the number one rules of security? I feel like you learn

1497 01:14:56,440 --> 01:14:58,140 it when you’re in college, and you think security

1498 01:14:58,140 --> 01:15:00,380 is whatever. It’s like, that’s sort of like the principle

1499 01:15:00,380 --> 01:15:02,400 of least privilege. Things only need

1500 01:15:02,400 --> 01:15:03,780 as much access as they require.

1501 01:15:04,080 --> 01:15:05,660 And, like, ICS just completely

1502 01:15:05,660 --> 01:15:07,660 failed at that off the bat.

1503 01:15:07,660 --> 01:15:08,220 Right, yeah.

1504 01:15:08,380 --> 01:15:10,120 It can do everything.

1505 01:15:10,380 --> 01:15:15,040 Awesome. I think that’s a good note

1506 01:15:15,040 --> 01:15:16,920 to end this interview. And, Jimmy,

1507 01:15:17,340 --> 01:15:19,280 Reid, thank you very much for taking

1508 01:15:19,280 --> 01:15:21,000 time to talk to us. It was

1509 01:15:21,000 --> 01:15:23,340 really nice listening

1510 01:15:23,340 --> 01:15:25,240 to your talk, and it was even nicer

1511 01:15:25,240 --> 01:15:26,740 talking to you now. Thank you.

1512 01:15:26,940 --> 01:15:27,680 Thank you.

1513 01:15:27,800 --> 01:15:30,700 And that is over and out from

1514 01:15:30,700 --> 01:15:33,020 Säkerhetspodcasten, and the one speaking here is

1515 01:15:33,020 --> 01:15:34,820 Rickard Borgfors. Take care. Bye.