
Säkerhetspodcasten #153 - Shira Shamban & Dave Lewis

Listen

mp3

Contents

In today's episode, the podcast's flying reporter Robin von Post talks with Security Fest's two keynote speakers, Shira Shamban and Dave Lewis, about their respective talks. Enjoy!

Recorded: 2019-05-23. Length: 00:33:04.

AI transcription

The AI tries to understand us… Bear with the occasional wild mistranscription.

1 00:00:00,000 --> 00:00:11,280 So, you're listening to Säkerhetspodcasten, broadcasting from Security Fest 2019, and we have just come off stage.

2 00:00:11,920 --> 00:00:22,000 This is Robin von Post speaking for Säkerhetspodcasten, and I'm sitting here together with Shira Shamban. Welcome to Säkerhetspodcasten!

3 00:00:22,760 --> 00:00:25,480 Thank you so much! It's a pleasure to be here, seriously!

4 00:00:25,480 --> 00:00:33,120 It's actually our great pleasure to have you at Security Fest, and also on Säkerhetspodcasten.

5 00:00:33,400 --> 00:00:33,580 Thank you!

6 00:00:33,720 --> 00:00:39,820 You just came off stage after giving the keynote address here, setting the stage for the rest of the speakers, right?

7 00:00:40,820 --> 00:00:44,620 Yes, I hope I have set the bar, or set it in a good place.

8 00:00:44,880 --> 00:00:51,640 The bar is at the right level, and I'm sure they will have great challenges getting over it.

9 00:00:52,040 --> 00:00:54,340 So we're very happy to have you here.

10 00:00:54,340 --> 00:00:57,860 And we'll follow up a little on your stories.

11 00:00:58,280 --> 00:01:03,640 What was the main theme? How would you describe it as an elevator pitch?

12 00:01:03,800 --> 00:01:16,980 I would say that I like to talk about state-sponsored attacks, the motives behind them, and the differences between them and the criminal hackers that we are more familiar with.

13 00:01:16,980 --> 00:01:21,060 Yes. And give us a little of your background.

14 00:01:21,060 --> 00:01:25,600 Because you got on that stage for a reason, right?

15 00:01:25,600 --> 00:01:31,600 Yes, I think I got on stage because I have a military background.

16 00:01:31,600 --> 00:01:44,600 I may just look blue-eyed and harmless, but I'm a major in the Israeli IDF, and I started my career in the intelligence corps.

17 00:01:44,600 --> 00:01:50,000 In the beginning I was an analyst, but I took on different roles in different units.


20 00:01:52,620 --> 00:02:06,040 I also went through officer training, and at a certain point I came to a unit called 8200, which is the Israeli signals intelligence and cyber unit.

21 00:02:06,040 --> 00:02:11,840 Okay, so you hear a lot about all these different operations all over the world.

22 00:02:11,840 --> 00:02:19,580 And Israel is one of the biggest players, so we're very happy to have you here to talk about this.

23 00:02:19,580 --> 00:02:20,960 It's not a very open world.

24 00:02:21,000 --> 00:02:24,560 I think a lot of experience and insight goes into your presentation.

25 00:02:25,120 --> 00:02:31,740 And it went live on YouTube and there is a possibility to see your whole talk there as well.

26 00:02:31,740 --> 00:02:37,820 But let’s dive a little bit more into the details here.

27 00:02:38,420 --> 00:02:39,260 Questions arise.

28 00:02:40,140 --> 00:02:44,880 And one thing, the challenge for a state-sponsored attack,

29 00:02:44,980 --> 00:02:48,700 you were talking about why they do it.

30 00:02:48,700 --> 00:02:55,400 I mean, comparing the criminal gangsters or the motivations for just financial…

31 00:02:55,400 --> 00:03:00,460 Yeah, so we can understand the criminal hackers better, I guess.

32 00:03:00,460 --> 00:03:05,700 Because, first of all, their hacks are more known than the state-sponsored attacks.

33 00:03:06,260 --> 00:03:09,000 So we understand that hackers usually are after the money.

34 00:03:09,560 --> 00:03:13,840 So they will cause the damage and get the data that will get them the money.

35 00:03:14,100 --> 00:03:17,860 Either breaking into a bank or a stock exchange

36 00:03:17,860 --> 00:03:18,680 and getting the money.

37 00:03:18,700 --> 00:03:24,100 Or stealing private data that they can sell

38 00:03:24,100 --> 00:03:26,340 or use in other ways to get money.

39 00:03:26,500 --> 00:03:31,600 But at the end of the day, we’re talking about getting richer in general.

40 00:03:32,360 --> 00:03:35,780 Or even sell your capabilities to other countries

41 00:03:35,780 --> 00:03:37,280 but in order to get money.

42 00:03:37,360 --> 00:03:39,280 You care about the money, less about the motives.

43 00:03:40,120 --> 00:03:44,540 And compared to that, the state-sponsored attacks are way more interesting

44 00:03:44,540 --> 00:03:48,360 because they have so many different and secret motives.

45 00:03:48,700 --> 00:03:51,140 Like the house of cards kind of thing.

46 00:03:51,480 --> 00:03:55,260 You do one thing because you mean to do another thing.

47 00:03:56,900 --> 00:04:00,060 It’s a very complicated ecosystem.

48 00:04:00,820 --> 00:04:06,100 So one country might want to get information about their neighbors,

49 00:04:06,440 --> 00:04:07,660 which they’re good friends with.

50 00:04:07,860 --> 00:04:12,380 They’re not in a war, but they want to understand better about their economics,

51 00:04:12,900 --> 00:04:15,660 about their foreign affairs with the third country,

52 00:04:16,100 --> 00:04:18,640 about different interests that they have around.

53 00:04:18,700 --> 00:04:24,640 So this is one kind of an interest that two countries have in the cyber domain.

54 00:04:25,400 --> 00:04:32,760 Another interest might be that one country wants to get another country’s intellectual property

55 00:04:32,760 --> 00:04:39,360 because they want to build factories that make better products.

56 00:04:39,920 --> 00:04:42,540 They want to get a piece of that market.

57 00:04:43,360 --> 00:04:47,220 Or they want to have better factories and so on.

58 00:04:47,220 --> 00:04:48,460 So they want to know.

59 00:04:48,700 --> 00:04:52,800 All your intellectual property secrets, they break into universities,

60 00:04:53,100 --> 00:04:57,720 they break into the factories and get that kind of information.

61 00:04:58,280 --> 00:05:04,300 But after all, it’s financial motives on that level, but it’s on a higher level.

62 00:05:04,580 --> 00:05:06,760 It's like state-level money.

63 00:05:07,000 --> 00:05:08,380 And it doesn’t look legitimate.

64 00:05:09,140 --> 00:05:13,180 I mean, when two criminals do it to one another, of course it’s not legitimate.

65 00:05:13,180 --> 00:05:18,200 But when a country does that, it seems a lot more illegitimate.

66 00:05:18,700 --> 00:05:24,680 And then we also have the cyber warfare for war purposes.

67 00:05:25,360 --> 00:05:29,380 So if in the past we used to see bombs explode,

68 00:05:30,160 --> 00:05:35,640 today we see countries that cause damage in the digital domain,

69 00:05:35,900 --> 00:05:37,520 not in the physical domain.

70 00:05:38,440 --> 00:05:40,340 They do it in many different ways.

71 00:05:40,340 --> 00:05:44,940 One way would be, and I think one of the more popular ways,

72 00:05:45,320 --> 00:05:47,340 would be to cause people…

73 00:05:48,700 --> 00:05:54,560 to be horrified, to be afraid, to feel unsafe in their own homes.

74 00:05:55,200 --> 00:05:56,540 How is this done?

75 00:05:56,680 --> 00:06:01,660 By shutting down electricity, by shutting down communications

76 00:06:01,660 --> 00:06:06,020 and making it look like an accident or a misconfiguration.

77 00:06:06,020 --> 00:06:11,760 But when weird things start to happen and you’re not getting serious answers,

78 00:06:12,300 --> 00:06:13,860 you get worried.

79 00:06:14,560 --> 00:06:18,380 And usually the country that is suffering from these kind of attacks,

80 00:06:18,700 --> 00:06:22,300 they don’t hurry to confirm that they’ve been hacked.

81 00:06:22,800 --> 00:06:27,920 Because unlike other kinds of attacks, this is not very easy to prevent

82 00:06:27,920 --> 00:06:31,940 and not easy to protect your citizens against.

83 00:06:32,340 --> 00:06:37,380 But you lose trust, right, in your management of the country, so to speak.

84 00:06:37,920 --> 00:06:38,240 Exactly.

85 00:06:38,800 --> 00:06:45,380 So coming into that, you were talking a little bit about Ukraine and the attack on that.

86 00:06:46,360 --> 00:06:48,240 Would you say that there is a specific…

87 00:06:48,700 --> 00:06:55,520 A specific motive to have the citizens or the Ukraine people

88 00:06:55,520 --> 00:06:58,320 to lose trust in the country in that operation?

89 00:06:59,140 --> 00:07:06,600 So to our understanding, this was a hack made by the Russian government

90 00:07:06,600 --> 00:07:09,920 or people working for the Russian government on their behalf.

91 00:07:10,720 --> 00:07:16,060 And assuming this is true, then I understand that Russia and Ukraine

92 00:07:16,060 --> 00:07:17,700 have their own issues.

93 00:07:18,700 --> 00:07:20,940 And problems with one another.

94 00:07:21,660 --> 00:07:28,740 But Russia decided to use the cyber domain as another place where they can influence.

95 00:07:29,100 --> 00:07:33,200 And not only by taking over land and real estate,

96 00:07:33,200 --> 00:07:37,780 but also by making people know who’s the boss around here.

97 00:07:38,040 --> 00:07:42,300 So even if we didn’t take over the whole country,

98 00:07:42,960 --> 00:07:46,060 we’re letting you know that we control the electricity.

99 00:07:46,580 --> 00:07:48,580 With no electricity, people are going…

100 00:07:48,700 --> 00:07:49,960 to die in hospitals.

101 00:07:50,500 --> 00:07:53,880 And you will not have access to your most basic needs.

102 00:07:54,780 --> 00:07:58,900 This is going back 100 years with no electricity.

103 00:07:59,220 --> 00:08:02,400 This was the Russians' way of saying to the Ukrainian people,

104 00:08:02,980 --> 00:08:03,920 We’re here.

105 00:08:05,160 --> 00:08:09,680 And to my understanding, this might have another layer of meaning.

106 00:08:10,320 --> 00:08:15,980 Because Russia has had its own issues with other countries in Europe and in the world.

107 00:08:15,980 --> 00:08:18,580 And I don’t think that…

108 00:08:18,700 --> 00:08:21,100 They would easily go and attack.

109 00:08:21,100 --> 00:08:28,300 I believe they hack, but they don’t create an effect in other Western countries.

110 00:08:28,300 --> 00:08:33,700 So this is their way of saying, listen, we can do the same to you.

111 00:08:33,700 --> 00:08:35,460 So don’t mess with us.

112 00:08:36,460 --> 00:08:42,140 And one interesting part about the Ukraine attack was actually that they left a backdoor.

113 00:08:42,380 --> 00:08:46,600 And that was something that you brought up a couple of times when you speak around the Easter eggs.

114 00:08:46,600 --> 00:08:48,600 Or always leave a backdoor.

115 00:08:48,700 --> 00:08:49,520 Yes.

116 00:08:49,740 --> 00:08:56,640 So when a state is conducting a hack, they always leave a re-infection capability.

117 00:08:57,440 --> 00:09:01,820 They always make sure they can get back there if they need to.

118 00:09:02,720 --> 00:09:05,020 And continue from where they stopped.

119 00:09:06,180 --> 00:09:12,380 And the Easter egg is not just getting the backdoor to where you want to be.

120 00:09:12,520 --> 00:09:16,040 This is actually leaving some kind of code that can cause damage.

121 00:09:16,260 --> 00:09:18,040 And I can…

122 00:09:18,040 --> 00:09:18,600 Like a bomb.

123 00:09:18,700 --> 00:09:21,400 I can activate it from far away.

124 00:09:21,820 --> 00:09:28,400 So I can leave Trojan horses or backdoors or malicious code in different places.

125 00:09:28,880 --> 00:09:32,900 And if you piss me off at some point, I will just activate it.

126 00:09:32,900 --> 00:09:34,960 And this is the whole purpose of the Easter egg.

127 00:09:35,200 --> 00:09:36,020 I will use it when I see fit.

128 00:09:36,020 --> 00:09:42,160 So we're getting into a cold war again: don't mess with me, because we know we can mess with you.

129 00:09:42,580 --> 00:09:42,600 Yeah.

130 00:09:42,600 --> 00:09:46,800 As we said on the talk, from the dirty bomb to the dirty worm.

131 00:09:47,060 --> 00:09:47,200 Yeah.

132 00:09:48,700 --> 00:09:56,820 So also one item that you brought up was the hacking back issue about, I mean…

133 00:09:56,820 --> 00:09:57,680 Attack the attackers.

134 00:09:57,920 --> 00:09:58,680 Attack the attackers.

135 00:09:58,940 --> 00:10:02,940 What’s your, I mean, from your military background, what’s your feeling?

136 00:10:02,940 --> 00:10:10,820 Because it’s quite a hot potato in this area to discuss, I guess.

137 00:10:10,920 --> 00:10:18,640 Because there are so many ethical and practical issues around hacking back.

138 00:10:18,700 --> 00:10:19,500 Yes.

139 00:10:21,260 --> 00:10:24,480 If necessary, then I think it’s legitimate.

140 00:10:25,160 --> 00:10:31,260 I think that hacking the hackers is a good way of getting to the source of things.

141 00:10:32,000 --> 00:10:40,320 And actually, I think that a lot of the research that was published by, you know, well-known security companies,

142 00:10:40,840 --> 00:10:43,960 they were able to publish it because they got to the source.

143 00:10:44,060 --> 00:10:45,000 They got to the hackers.

144 00:10:45,620 --> 00:10:48,200 And this is where they actually saw the source code.


147 00:10:48,680 --> 00:10:51,280 And the IP addresses of the people who got attacked.

148 00:10:52,000 --> 00:11:00,020 So this was a great way of understanding the whole story behind one piece of malware they found somewhere.

149 00:11:01,680 --> 00:11:10,120 It wouldn’t surprise me if, you know, a company publishes a very nice white paper about a hacking campaign.

150 00:11:10,940 --> 00:11:15,920 The whole white paper began because they were tipped off by another government to do that.

151 00:11:16,840 --> 00:11:18,240 But attacking the attackers enabled…

152 00:11:18,680 --> 00:11:24,400 You access to the attacking tools, to information about whoever it was that got attacked.

153 00:11:24,540 --> 00:11:26,140 So sometimes you even find victims.

154 00:11:26,580 --> 00:11:29,600 that you had no idea were even victims.

155 00:11:29,600 --> 00:11:31,820 And the victims have no idea that they are victims.

156 00:11:32,680 --> 00:11:40,560 So it’s a good way of understanding the whole campaign, finding the tools, finding who got attacked, and getting the whole story behind it.

157 00:11:40,840 --> 00:11:46,340 But if you want to hack back, you need to have your tools or you need to have the abilities to do it.

158 00:11:46,760 --> 00:11:48,600 And, I mean, you could buy those…

159 00:11:48,680 --> 00:11:54,260 You could buy those capabilities or you can develop them yourself and find vulnerabilities.

160 00:11:54,720 --> 00:11:57,980 And that’s where one discussion comes up, of course.

161 00:11:58,160 --> 00:12:03,760 You knew all along about this SMB attack on Windows.

162 00:12:03,860 --> 00:12:05,160 Why didn’t you say anything?

163 00:12:05,540 --> 00:12:07,100 It brought down our own country.

164 00:12:07,360 --> 00:12:07,660 Yes.

165 00:12:09,140 --> 00:12:14,360 Now, this is taking us to another question about responsibility of the software companies, right?

166 00:12:14,360 --> 00:12:15,600 So you have…

167 00:12:15,600 --> 00:12:17,360 If you use Windows on your computer…

168 00:12:18,680 --> 00:12:21,620 Why would you expect Microsoft to feel accountable to you?

169 00:12:22,260 --> 00:12:29,140 Unfortunately, our software suppliers don't feel accountable enough for our privacy.

170 00:12:29,480 --> 00:12:32,160 Because we hear about hacks all the time.

171 00:12:32,720 --> 00:12:41,060 You know, a couple of years ago, Uber lost 57 million usernames and passwords to hackers.

172 00:12:41,700 --> 00:12:42,800 Did you stop using Uber?

173 00:12:44,460 --> 00:12:45,060 No.

174 00:12:45,740 --> 00:12:46,720 No, I know.

175 00:12:46,720 --> 00:12:48,540 I know many people did not stop using Uber.

176 00:12:48,680 --> 00:12:51,140 Because it’s comfortable and we still want to use it.

177 00:12:51,200 --> 00:12:53,300 We did not punish Uber for that.

178 00:12:53,380 --> 00:12:55,960 And we did not punish Microsoft for other things.

179 00:12:55,960 --> 00:13:02,300 We did not punish a lot of other companies that hurt our privacy and did not inform us about this.

180 00:13:02,600 --> 00:13:03,700 We did not punish Yahoo.

181 00:13:04,180 --> 00:13:06,180 Yahoo were sold to Verizon.

182 00:13:06,400 --> 00:13:07,780 Verizon got a small discount.

183 00:13:08,260 --> 00:13:11,160 But Yahoo’s stock did not get hurt by this.

184 00:13:11,540 --> 00:13:12,340 They only went up.

185 00:13:12,360 --> 00:13:13,540 But are people still using Yahoo?

186 00:13:14,040 --> 00:13:15,040 It’s a good question.

187 00:13:15,160 --> 00:13:15,700 It’s a good point.

188 00:13:16,020 --> 00:13:18,560 And surprisingly enough, every now and then, I still see…

189 00:13:18,560 --> 00:13:20,340 I see people using a Yahoo domain.

190 00:13:21,740 --> 00:13:22,580 This is true.

191 00:13:22,840 --> 00:13:26,460 But still, so how come their stock is only on the rise?

192 00:13:27,480 --> 00:13:32,420 But one current issue is all the Facebook situations that have been going on.

193 00:13:32,980 --> 00:13:34,480 And I mean, people are still there.

194 00:13:34,880 --> 00:13:36,720 They still want to get invited to parties.

195 00:13:37,380 --> 00:13:38,160 Of course.

196 00:13:38,520 --> 00:13:40,020 Of course, we are also addicted.

197 00:13:40,360 --> 00:13:41,640 So it’s hard for us to stop.

198 00:13:42,500 --> 00:13:48,240 I think that in the near future, we will see governments enforcing policies.

199 00:13:48,560 --> 00:13:49,980 This is about privacy.

200 00:13:50,100 --> 00:13:52,300 We already see this with the GDPR.

201 00:13:52,880 --> 00:14:01,080 And we will be seeing that more and more until software companies will learn to respect our privacy.

202 00:14:01,980 --> 00:14:04,900 They’re making a lot of money out of our data, right?

203 00:14:04,980 --> 00:14:08,600 We get special ads because yesterday you googled, I don’t know, sneakers.

204 00:14:09,080 --> 00:14:13,300 So today you keep on getting advertisements about cool shoes.

205 00:14:14,040 --> 00:14:15,620 This is how they make money.

206 00:14:15,760 --> 00:14:16,940 And we get free software.

207 00:14:16,940 --> 00:14:17,940 We get free stuff.

208 00:14:18,560 --> 00:14:21,360 We get storage space in Google Drive.

209 00:14:21,520 --> 00:14:25,260 And we get Facebook and all the other stuff we are totally addicted to.

210 00:14:25,820 --> 00:14:27,820 And we pay with our privacy.

211 00:14:28,140 --> 00:14:30,880 And it’s time to find a new equation.

212 00:14:31,700 --> 00:14:38,020 But still, I mean, one of my main concerns is even when you pay for something, you could still pay with your privacy.

213 00:14:38,280 --> 00:14:38,660 It’s not that.

214 00:14:38,660 --> 00:14:39,300 This is true.

215 00:14:39,700 --> 00:14:42,660 So you both lose your privacy and your money.

216 00:14:43,180 --> 00:14:44,020 You’re absolutely right.

217 00:14:44,120 --> 00:14:46,700 And this is bad.

218 00:14:46,700 --> 00:14:48,120 This shouldn’t be this way.

219 00:14:48,560 --> 00:14:55,800 And I'm looking forward to our governments enforcing stronger policies about our privacy.

220 00:14:56,180 --> 00:15:00,060 It’s rather annoying that we need the government to enforce that.

221 00:15:00,600 --> 00:15:04,660 And that the companies just don't feel accountable for our data.

222 00:15:04,820 --> 00:15:09,820 Because I’m keeping my data with you and I expect you not to use it at all.

223 00:15:10,060 --> 00:15:10,620 It’s mine.

224 00:15:11,980 --> 00:15:12,560 True that.

225 00:15:12,640 --> 00:15:17,240 If we come back a little bit to the more targeted attacks, so to speak.

226 00:15:17,240 --> 00:15:18,120 I mean.

227 00:15:18,560 --> 00:15:26,300 When the Snowden revelations were done, the tools were quite sophisticated, the implants and the details about that.

228 00:15:26,520 --> 00:15:30,140 But then, of course, you said something in the beginning.

229 00:15:30,300 --> 00:15:34,620 We talk about not using all your tools everywhere, right?

230 00:15:34,760 --> 00:15:39,820 So it’s about not spreading the butter too thin on the sandwich.

231 00:15:40,960 --> 00:15:47,440 How would you say, how afraid should I be as a common citizen in a country?

232 00:15:47,440 --> 00:15:48,440 On the one hand,

233 00:15:48,560 --> 00:15:51,140 being targeted by an implant in my computer.

234 00:15:51,600 --> 00:15:54,320 So this is a question I get asked often.

235 00:15:54,320 --> 00:15:59,900 Because when I talk to people about awareness, they tell me, yeah, but I don’t have anything interesting on my computer.

236 00:16:00,940 --> 00:16:02,600 I have no secrets.

237 00:16:02,600 --> 00:16:07,760 If someone wants my nude pictures, I will feel uncomfortable, but whatever.

238 00:16:09,600 --> 00:16:11,360 So people say that all the time.

239 00:16:11,420 --> 00:16:13,200 And on the one hand, it’s true.

240 00:16:13,400 --> 00:16:15,140 But there are a few problems.

241 00:16:15,140 --> 00:16:18,520 One problem is that if this does happen,

242 00:16:18,560 --> 00:16:30,320 and someone duplicates your identity, opens a bank account on your behalf, takes your money and runs away, and no one is able to find them,

243 00:16:31,260 --> 00:16:32,920 This will be a serious problem for you.

244 00:16:32,980 --> 00:16:38,420 I think that in the US, this is already a problem because fraud there is a little easier.

245 00:16:38,780 --> 00:16:41,860 And this affects your credit score.

246 00:16:42,340 --> 00:16:42,520 Yeah.

247 00:16:42,640 --> 00:16:42,900 Right.

248 00:16:42,900 --> 00:16:46,060 So if your credit card was stolen, it happens.

249 00:16:46,600 --> 00:16:48,360 You would probably just cancel it.

250 00:16:48,560 --> 00:16:51,120 And get your credit company to get you a new one.

251 00:16:51,200 --> 00:16:57,100 And in the US, if they registered that the card was canceled, no matter what the reason was.

252 00:16:57,540 --> 00:16:59,700 So this affects your credit score.

253 00:17:00,000 --> 00:17:05,200 Because the credit score company, they don’t care if it was canceled because you didn’t pay your bills.

254 00:17:05,420 --> 00:17:07,580 Or because the card was stolen.

255 00:17:08,160 --> 00:17:10,900 So next time you go and get a mortgage, this will hurt you.

256 00:17:11,620 --> 00:17:12,420 This is one problem.

257 00:17:12,420 --> 00:17:15,200 The other problem is the problem of scale.

258 00:17:16,000 --> 00:17:16,100 Right.

259 00:17:16,220 --> 00:17:18,420 We talked about devices with a…

260 00:17:18,560 --> 00:17:19,040 Backdoor.

261 00:17:19,180 --> 00:17:22,340 Should we or should we not buy Chinese cell phones and computers?

262 00:17:23,120 --> 00:17:27,000 And the problem here is with the scale of the impact.

263 00:17:27,780 --> 00:17:33,700 If one million Swedish people will have a Chinese computer and all the computers has a backdoor.

264 00:17:33,880 --> 00:17:36,440 And from the backdoor, you can get to other places.

265 00:17:36,440 --> 00:17:39,100 Because maybe you and I share the same Wi-Fi network.

266 00:17:39,180 --> 00:17:41,860 And from your computer, someone can get to my computer.

267 00:17:41,860 --> 00:17:44,760 So at some point, it will proliferate.

268 00:17:45,680 --> 00:17:47,860 And the Chinese or someone else can get…

269 00:17:48,560 --> 00:17:54,800 Theoretically get access to many areas we did not intend to give them access to.

270 00:17:55,060 --> 00:17:57,040 So it’s not about your data.

271 00:17:57,160 --> 00:18:03,600 It’s about the access that they get by using your computer as a router.

272 00:18:03,700 --> 00:18:11,760 So even if I’m just a normal citizen, I can be like a proxy towards a more interesting target, so to speak.

273 00:18:12,060 --> 00:18:13,220 Unfortunately, I think so.

274 00:18:13,380 --> 00:18:13,900 Yeah, okay.

275 00:18:14,120 --> 00:18:15,640 That makes sense.

276 00:18:15,800 --> 00:18:17,500 So you should keep your heads up.

277 00:18:17,580 --> 00:18:18,480 Even if you’re not…

278 00:18:18,480 --> 00:18:20,480 A specific target in that sense.

279 00:18:20,500 --> 00:18:27,880 And there are always things that even common people who are not security professionals can do to protect themselves.

280 00:18:28,300 --> 00:18:33,140 Of course, if anyone wants to target them for some reason, they can do it.

281 00:18:33,140 --> 00:18:37,000 But being responsible and not reusing passwords, for example.

282 00:18:37,520 --> 00:18:41,060 This is something that most people, they don’t do.

283 00:18:41,220 --> 00:18:42,060 They reuse passwords.

284 00:18:42,180 --> 00:18:45,480 They have the same password to their Facebook and LinkedIn and bank account.

285 00:18:46,240 --> 00:18:48,140 And then, you know, LinkedIn gets hacked.

286 00:18:48,480 --> 00:18:53,080 And a few hundred million passwords are leaked.

287 00:18:53,420 --> 00:18:58,900 And that database of usernames and passwords is still used today.

288 00:18:59,040 --> 00:19:00,900 I think it happened in 2016.

289 00:19:01,400 --> 00:19:03,160 And it’s still used until today.

290 00:19:03,160 --> 00:19:06,600 Because these are very good passwords for dictionary attacks.

291 00:19:06,800 --> 00:19:07,160 Absolutely.

292 00:19:07,540 --> 00:19:15,100 I mean, the password spraying that will obviously happen with all those kinds of sources.
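What makes those leaked credential dumps so durable is that they double as ready-made attack dictionaries, which is also why checking new passwords against known breach data has become standard advice. Below is a minimal TypeScript sketch of such a check against the Have I Been Pwned "Pwned Passwords" range API; the endpoint and its k-anonymity scheme are real and documented, while the helper names are our own illustration:

```typescript
// Minimal sketch: check a password against known breach corpora via the
// Have I Been Pwned range API. Only the first 5 hex characters of the
// SHA-1 hash are ever sent, so the password itself never leaves the machine.
// Runs in modern browsers or Node 18+ (global fetch and Web Crypto).

async function sha1Hex(input: string): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-1", new TextEncoder().encode(input));
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("")
    .toUpperCase();
}

async function timesSeenInBreaches(password: string): Promise<number> {
  const hash = await sha1Hex(password);
  const prefix = hash.slice(0, 5);
  const suffix = hash.slice(5);
  // The API returns every known suffix sharing our 5-character prefix,
  // each with a count of how often it appears in breach data.
  const res = await fetch(`https://api.pwnedpasswords.com/range/${prefix}`);
  for (const line of (await res.text()).split("\n")) {
    const [candidate, count] = line.trim().split(":");
    if (candidate === suffix) return parseInt(count, 10);
  }
  return 0; // not found in any known breach
}

// A password that shows up here is exactly the kind of string that ends up
// in the dictionaries and password-spraying lists discussed above.
timesSeenInBreaches("password123").then((n) =>
  console.log(n > 0 ? `Seen ${n} times in breaches` : "Not found"),
);
```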

293 00:19:17,160 --> 00:19:17,600 So…

294 00:19:17,600 --> 00:19:18,100 I think…

295 00:19:18,480 --> 00:19:20,720 Your talk was really interesting.

296 00:19:20,840 --> 00:19:22,360 Because, I mean, we’re talking financial motives.

297 00:19:22,440 --> 00:19:26,540 We're talking about state-level financial motives.

298 00:19:27,120 --> 00:19:31,800 And a lot of the people down in the crowd are, of course, working every day with security issues.

299 00:19:31,800 --> 00:19:35,200 Trying to figure out how big of a threat is this and not.

300 00:19:35,980 --> 00:19:38,940 And trying to figure out the impact and so on.

301 00:19:39,440 --> 00:19:45,720 And I think your talk gave a good backdrop to all the sessions down here.

302 00:19:45,840 --> 00:19:48,420 So I appreciate you taking your time to come to Sweden.

303 00:19:48,480 --> 00:19:49,660 Thank you very much.

304 00:19:49,780 --> 00:19:53,700 Thank you very much for sharing more information with us on the Säkerhetspodcasten.

305 00:19:54,180 --> 00:19:57,520 On behalf of all the listeners, I would say a big thank you.

306 00:19:57,880 --> 00:19:58,300 Thank you.

307 00:19:58,420 --> 00:19:59,780 And welcome back next year.

308 00:19:59,900 --> 00:20:01,780 I hope to be back here with the sun.

309 00:20:01,900 --> 00:20:02,940 I’ll bring the sun with me.

310 00:20:03,100 --> 00:20:03,740 Bring the sun.

311 00:20:03,880 --> 00:20:04,560 Do it, do it.

312 00:20:04,680 --> 00:20:05,300 Please do.

313 00:20:05,660 --> 00:20:06,840 Thank you very much.

314 00:20:09,300 --> 00:20:15,580 So, we are here, just getting off stage at Security Fest 2019, day two.

315 00:20:15,580 --> 00:20:17,580 And the keynote address was…

316 00:20:18,480 --> 00:20:19,940 Made by Dave Lewis.

317 00:20:20,200 --> 00:20:22,640 And he’s sitting here right in front of me.

318 00:20:22,840 --> 00:20:24,440 It's like a dream come true.

319 00:20:24,540 --> 00:20:25,320 You’re a legend, right?

320 00:20:26,020 --> 00:20:27,880 I will never subscribe to that.

321 00:20:28,900 --> 00:20:32,920 But Dave, you’re coming, flying in from a hectic schedule.

322 00:20:33,100 --> 00:20:35,940 You just arrived late last evening.

323 00:20:36,240 --> 00:20:39,920 And you’re from Cisco.

324 00:20:40,340 --> 00:20:40,500 Yep.

325 00:20:40,880 --> 00:20:41,740 Acquired Duo.

326 00:20:41,740 --> 00:20:42,680 Yeah, Duo Security.

327 00:20:42,880 --> 00:20:45,680 We were acquired by Cisco back October 1st, 2018.

328 00:20:46,320 --> 00:20:48,180 And yeah, so it’s been quite a wild ride.

329 00:20:48,480 --> 00:20:50,180 I can imagine, I can imagine.

330 00:20:50,620 --> 00:20:55,980 And the layout of your talk, the elevator pitch, so to speak, would be?

331 00:20:56,420 --> 00:20:59,340 So, the way I look at it is, Zero Trust and the Flaming Sword of Justice is really comparing and contrasting what people are talking about

332 00:20:59,340 --> 00:21:02,840 is really comparing and contrasting what people are talking about

333 00:21:02,840 --> 00:21:04,640 as a zero trust conversation today

334 00:21:04,640 --> 00:21:07,600 versus the way security was historically done.

335 00:21:08,160 --> 00:21:11,340 And the whole idea here is that we can do a better job

336 00:21:11,340 --> 00:21:13,360 with the resources that we have

337 00:21:13,360 --> 00:21:17,300 in a much more succinct fashion than we have prior to now.

338 00:21:17,300 --> 00:21:17,680 Yeah.

339 00:21:18,480 --> 00:21:23,080 So, one of your slides says that we’re actually, like, redefining the perimeter.

340 00:21:23,900 --> 00:21:28,960 It’s one of the key issues here regarding the data and the trust in the data.

341 00:21:29,100 --> 00:21:30,140 So, zero trust.

342 00:21:31,020 --> 00:21:39,160 What would you say has driven this change in the last five, ten years?

343 00:21:39,420 --> 00:21:44,320 So, a lot of the discussion has, well, it’s basically taking all of the resources

344 00:21:44,320 --> 00:21:47,000 that we’ve had at our disposal for a very, very long time

345 00:21:47,000 --> 00:21:48,420 and doing a much better job.

346 00:21:48,480 --> 00:21:51,220 A much better job of implementing them and leveraging them

347 00:21:51,220 --> 00:21:54,200 so that we are making sure that people only have access

348 00:21:54,200 --> 00:21:56,060 to what they absolutely need to have access to.

349 00:21:56,200 --> 00:21:59,860 Only devices that are supposed to be accessing your network are accessing it.

350 00:22:00,220 --> 00:22:03,440 And only people that are supposed to be accessing the applications

351 00:22:03,440 --> 00:22:05,900 are the applications are then also being controlled

352 00:22:05,900 --> 00:22:09,260 so that, you know, random individuals from Canada and other places

353 00:22:09,260 --> 00:22:13,200 aren’t just nefariously logging into your resources in Sweden.

354 00:22:13,440 --> 00:22:13,660 Yeah.
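To make the contrast with the old perimeter model concrete, here is a deliberately simplified TypeScript sketch of a per-request trusted-access decision. This is our own illustration, not Duo's or Cisco's product logic: every field and policy shape below is hypothetical, but the principle of evaluating user, device, and application on every request is the one described above:

```typescript
// Illustrative sketch of a trusted-access decision (hypothetical shapes,
// not any vendor's API). The point: access is decided per request from
// user, device posture and application, never from network location.

interface AccessRequest {
  user: string;
  groups: string[];          // group memberships from the identity provider
  deviceManaged: boolean;    // is the device in the asset inventory?
  deviceEncrypted: boolean;  // a basic device-health posture signal
  application: string;       // which app is being accessed
  country: string;           // geolocation of the attempt
}

interface Policy {
  application: string;
  allowedGroups: string[];
  allowedCountries: string[];
  requireManagedDevice: boolean;
}

function decide(req: AccessRequest, policies: Policy[]): "allow" | "deny" {
  const policy = policies.find((p) => p.application === req.application);
  if (!policy) return "deny"; // default deny: unknown application
  const userOk = req.groups.some((g) => policy.allowedGroups.includes(g));
  const deviceOk =
    !policy.requireManagedDevice || (req.deviceManaged && req.deviceEncrypted);
  const geoOk = policy.allowedCountries.includes(req.country);
  return userOk && deviceOk && geoOk ? "allow" : "deny";
}

// A random individual from Canada hitting a Sweden-only app is denied even
// with valid credentials, because the decision is made on every request.
const policies: Policy[] = [{
  application: "payroll",
  allowedGroups: ["hr"],
  allowedCountries: ["SE"],
  requireManagedDevice: true,
}];
console.log(decide({
  user: "alice", groups: ["hr"], deviceManaged: true,
  deviceEncrypted: true, application: "payroll", country: "CA",
}, policies)); // "deny"
```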

355 00:22:13,660 --> 00:22:17,540 So, what would you say, how did you manage that ten years ago?

356 00:22:18,240 --> 00:22:18,380 Well.

357 00:22:18,480 --> 00:22:19,980 And ten years ago, that’s just it.

358 00:22:20,040 --> 00:22:21,160 We’ve had all these pieces.

359 00:22:21,300 --> 00:22:22,640 We’ve had network zone segmentation.

360 00:22:22,740 --> 00:22:24,040 We’ve had asset inventories.

361 00:22:24,160 --> 00:22:25,180 We’ve had user management.

362 00:22:25,580 --> 00:22:29,240 But we just haven’t had a clear drive to do it in a cohesive fashion.

363 00:22:29,240 --> 00:22:32,680 And over time, we tend to chase the next shiny thing.

364 00:22:33,260 --> 00:22:35,920 So, now, we’re looking at, you know, back in 2010,

365 00:22:36,040 --> 00:22:38,520 John Kindervag, who was an analyst at Forrester at the time,

366 00:22:38,780 --> 00:22:39,880 coined the term zero trust.

367 00:22:40,500 --> 00:22:43,580 And it has really gotten people’s attention.

368 00:22:43,800 --> 00:22:47,740 People are paying attention to something they should have been paying attention to all along,

369 00:22:47,840 --> 00:22:48,240 which is great.

370 00:22:48,240 --> 00:22:49,780 That is fantastic.

371 00:22:50,400 --> 00:22:52,520 And now, I think it’s time for us to evolve from that

372 00:22:52,520 --> 00:22:54,240 and start talking more about trusted access

373 00:22:54,240 --> 00:22:57,800 and put more of a positive connotation on it, as opposed to zero trust.

374 00:22:57,900 --> 00:23:00,200 Because when you’re talking zero trust within technical terms,

375 00:23:00,660 --> 00:23:01,680 people, they’ll get it.

376 00:23:01,740 --> 00:23:04,200 But if you’re talking to folks that are not technically savvy,

377 00:23:04,320 --> 00:23:05,600 they will tend to say,

378 00:23:05,920 --> 00:23:07,620 it sounds like a negative connotation.

379 00:23:07,960 --> 00:23:12,120 So, I prefer to move it, start moving it towards a trusted access discussion.

380 00:23:12,400 --> 00:23:12,520 Yeah.

381 00:23:12,620 --> 00:23:16,880 So, trusted access is more fine-grained now, I guess, then.

382 00:23:17,300 --> 00:23:17,960 Well, yeah.

383 00:23:17,960 --> 00:23:19,700 It’s basically the same thing,

384 00:23:19,780 --> 00:23:21,660 just calling it something a little bit more positive

385 00:23:21,660 --> 00:23:26,020 because you want it to resound or reverberate rather through the C-suite

386 00:23:26,020 --> 00:23:28,040 so that they see it as a positive

387 00:23:28,040 --> 00:23:30,020 and something that can help reduce costs

388 00:23:30,020 --> 00:23:33,080 and secure the environment in a more sane, cohesive fashion.

389 00:23:33,480 --> 00:23:37,400 But when you go from like the perimeter firewall era

390 00:23:37,400 --> 00:23:42,960 to the fine-grained asset-based trust stuff,

391 00:23:43,580 --> 00:23:45,080 you also get a lot of administration,

392 00:23:45,080 --> 00:23:47,840 or could end up with a lot of administration, I guess.

393 00:23:47,960 --> 00:23:49,300 Well, that is very true.

394 00:23:49,420 --> 00:23:50,220 There is that possibility.

395 00:23:50,280 --> 00:23:53,900 But there are tools out there that can help facilitate that in a sane fashion.

396 00:23:54,060 --> 00:23:56,100 I mean, yes, the old way of doing it, of having,

397 00:23:56,220 --> 00:23:57,940 it’s okay, everything’s fine, we have a firewall.

398 00:23:58,440 --> 00:24:01,380 That was a demonstrably broken notion at the time.

399 00:24:02,620 --> 00:24:06,620 During my talk, I make an analogy to the fall of Rome in 410 AD

400 00:24:06,620 --> 00:24:10,720 where the attackers, the Visigoths, were able to take over the city

401 00:24:10,720 --> 00:24:12,700 by using their own security against them.

402 00:24:12,800 --> 00:24:15,040 So, the idea of the traditional perimeter approach

403 00:24:15,040 --> 00:24:17,940 was proven demonstrably broken in 410 AD.

404 00:24:17,960 --> 00:24:20,140 And unfortunately, we see a lot of companies

405 00:24:20,140 --> 00:24:21,340 are still implementing that now.

406 00:24:21,480 --> 00:24:24,100 The reality is, the perimeter is now anywhere

407 00:24:24,100 --> 00:24:25,660 an access decision is being made.

408 00:24:25,860 --> 00:24:27,760 So, we have to pivot from that.

409 00:24:27,840 --> 00:24:30,820 And yes, there is some overhead to managing it, absolutely.

410 00:24:31,540 --> 00:24:34,180 And with anything new, there is going to be growing pains.

411 00:24:34,680 --> 00:24:37,440 So, we are really shifting the collective consciousness

412 00:24:37,440 --> 00:24:42,500 away from the perimeter model to a much better decentralized approach.

413 00:24:42,500 --> 00:24:47,520 Yeah, and I guess, I'm not sure how many Active Directories

414 00:24:47,520 --> 00:24:51,500 you've seen with a perfect match between the policies

415 00:24:51,500 --> 00:24:55,300 and reality, how it should be and how it is, right?

416 00:24:55,380 --> 00:24:56,700 Well, I can count that on no fingers.

417 00:24:56,880 --> 00:24:58,820 Yeah, and we’re out of fingers.

418 00:25:00,300 --> 00:25:01,340 That’s a tough one.

419 00:25:01,560 --> 00:25:04,780 So, I guess you need something that is correct, right?

420 00:25:04,940 --> 00:25:09,480 In order to accept that you have the internet in your own backyard.

421 00:25:10,000 --> 00:25:10,780 Well, yeah, exactly.

422 00:25:10,960 --> 00:25:12,920 And being able to have a clear understanding

423 00:25:12,920 --> 00:25:14,420 as to the assets in your environment

424 00:25:14,420 --> 00:25:15,920 as well as the users in your environment.

425 00:25:16,200 --> 00:25:16,920 That is a big…

426 00:25:17,520 --> 00:25:18,840 That is a battle that has been going on forever.

427 00:25:19,460 --> 00:25:21,480 After the talk today, I spoke to a couple of individuals

428 00:25:21,480 --> 00:25:23,680 that were telling me about how their asset inventories

429 00:25:23,680 --> 00:25:26,000 in their organizations are being run with Excel spreadsheets.

430 00:25:26,420 --> 00:25:28,040 And they’re not alone.

431 00:25:28,280 --> 00:25:30,480 There are many companies out there that still do this.

432 00:25:30,880 --> 00:25:32,680 And there are many resources out there

433 00:25:32,680 --> 00:25:34,960 that can help you facilitate this

434 00:25:34,960 --> 00:25:36,680 in a much more sane and cohesive fashion.

435 00:25:37,160 --> 00:25:38,420 Some of them might rhyme with Duo.

436 00:25:39,780 --> 00:25:40,620 Sorry, that was blatant.

437 00:25:41,820 --> 00:25:44,020 But yeah, there are all kinds of resources out there

438 00:25:44,020 --> 00:25:45,140 that can help you do a better job.

439 00:25:45,140 --> 00:25:47,500 Yeah, but then the C-level,

440 00:25:47,520 --> 00:25:49,440 needs to understand that there is a shift

441 00:25:49,440 --> 00:25:52,740 and that you need to invest in those kinds of tools, I guess.

442 00:25:52,880 --> 00:25:54,060 Well, and that’s the truth of it.

443 00:25:54,180 --> 00:25:55,620 And within the security circles,

444 00:25:55,720 --> 00:25:58,880 we have to evolve in order to be taken seriously.

445 00:25:59,420 --> 00:26:02,100 I mean, we are proverbially the dog that has caught the bumper.

446 00:26:02,580 --> 00:26:04,540 And now we have to try and figure out what to do with it.

447 00:26:04,600 --> 00:26:06,020 So the C-suite is now paying attention.

448 00:26:06,160 --> 00:26:08,100 We want to make sure that we’re articulating a message

449 00:26:08,100 --> 00:26:12,140 that is sane because if we start barking at them,

450 00:26:12,440 --> 00:26:13,400 they’re not going to listen.

451 00:26:13,620 --> 00:26:14,340 They’re going to go deaf.

452 00:26:14,400 --> 00:26:16,560 If we can speak to them in terms they understand,

453 00:26:16,560 --> 00:26:16,660 they’re going to understand.


460 00:26:17,280 --> 00:26:19,060 We’re talking about risk and things to that effect.

461 00:26:19,360 --> 00:26:22,360 And consequences of not having it correct.

462 00:26:22,360 --> 00:26:22,640 Exactly.

463 00:26:22,800 --> 00:26:24,280 And you want them to buy in.

464 00:26:24,400 --> 00:26:26,900 So you want to make sure that now that you have their attention

465 00:26:26,900 --> 00:26:27,500 that you keep it.

466 00:26:27,640 --> 00:26:27,860 Yeah.

467 00:26:28,100 --> 00:26:30,620 So you don’t jump up on the horse in the wrong direction

468 00:26:30,620 --> 00:26:33,480 and try to ride away, but you steer slowly.

469 00:26:34,140 --> 00:26:37,160 But is there a way to do like a gradual transition

470 00:26:37,160 --> 00:26:40,960 to trusted access, or to the access-based networking?

471 00:26:41,320 --> 00:26:42,600 Honestly, in order to do that, yes.

472 00:26:42,640 --> 00:26:45,860 You do have to look at it as an incremental approach.

473 00:26:45,980 --> 00:26:46,100 Yeah.

474 00:26:46,100 --> 00:26:46,980 Because if you try to do all of that,

475 00:26:46,980 --> 00:26:47,680 do all or nothing,

476 00:26:47,820 --> 00:26:49,600 you’re basically setting yourself up for failure.

477 00:26:49,740 --> 00:26:51,420 If you could do it in an incremental fashion,

478 00:26:51,820 --> 00:26:54,380 you can then celebrate each win as you go along

479 00:26:54,380 --> 00:26:56,840 because otherwise you could get yourself

480 00:26:56,840 --> 00:26:57,820 into a world of hurt.

481 00:26:58,400 --> 00:27:02,540 And so literally picking and prioritizing as you go along.

482 00:27:02,840 --> 00:27:04,560 Most companies, the easiest way to start

483 00:27:04,560 --> 00:27:05,880 is dealing with an asset inventory,

484 00:27:06,080 --> 00:27:07,480 then dealing with user management,

485 00:27:07,720 --> 00:27:09,600 and then incrementally growing from there.

486 00:27:10,800 --> 00:27:12,800 You really want to make sure

487 00:27:12,800 --> 00:27:14,500 that you don’t try to bite it off

488 00:27:14,500 --> 00:27:15,560 and do it all at the same time

489 00:27:15,560 --> 00:27:16,740 because you simply can’t.

490 00:27:16,740 --> 00:27:21,560 The "don't boil the ocean" idea was a well-received message

491 00:27:21,560 --> 00:27:23,020 in your talk as well, I guess.

492 00:27:23,720 --> 00:27:25,840 We tried to do that too many times.

493 00:27:26,000 --> 00:27:26,220 Yes.

494 00:27:27,020 --> 00:27:28,380 I’ve been guilty of that as well.

495 00:27:28,900 --> 00:27:31,340 So if we come to one of the quite

496 00:27:31,540 --> 00:27:33,000 recent news items,

497 00:27:33,460 --> 00:27:35,960 we had the issue with Salesforce

498 00:27:35,960 --> 00:27:38,420 being down for like a couple of days.

499 00:27:38,480 --> 00:27:38,660 Yes.

500 00:27:39,220 --> 00:27:41,560 So I’m pretty sure you heard about it.

501 00:27:42,020 --> 00:27:44,080 But what happened there was really

502 00:27:44,080 --> 00:27:46,340 that their access management broke down.

503 00:27:46,740 --> 00:27:49,480 They deployed a script into production

504 00:27:49,480 --> 00:27:52,780 that opened up access to a lot of files in some way.

505 00:27:52,880 --> 00:27:54,160 I didn’t see the details.

506 00:27:54,480 --> 00:28:00,900 But I mean, that's one kind of trusted access network,

507 00:28:01,180 --> 00:28:01,760 the Salesforce,

508 00:28:02,220 --> 00:28:04,960 where everyone from every company connects, right?

509 00:28:05,200 --> 00:28:05,300 Yeah.

510 00:28:06,880 --> 00:28:08,620 And then zero trust.

511 00:28:08,760 --> 00:28:10,560 I mean, it’s a lot of trust networking

512 00:28:10,560 --> 00:28:13,560 that you put stuff in the cloud in that way.

513 00:28:13,560 --> 00:28:13,960 Yeah.

514 00:28:14,180 --> 00:28:15,920 And that’s why I prefer to talk about trusted access.

515 00:28:16,120 --> 00:28:16,260 Yeah.

516 00:28:16,260 --> 00:28:16,720 Because it helps.

517 00:28:16,720 --> 00:28:19,400 It helps frame it in more of a positive light.

518 00:28:19,960 --> 00:28:23,260 And the Salesforce example is just pointing out

519 00:28:23,260 --> 00:28:24,580 that it can happen to anybody.

520 00:28:25,480 --> 00:28:27,160 And the thing is you want to make sure

521 00:28:27,160 --> 00:28:29,520 that you have a defined, repeatable process

522 00:28:29,520 --> 00:28:32,060 that you can handle when something like that does happen.

523 00:28:32,760 --> 00:28:34,740 I’ve been through various organizations

524 00:28:34,740 --> 00:28:37,400 where we all suffered oopses along the way

525 00:28:37,400 --> 00:28:39,420 or misconfigurations or whatever it happens to be.

526 00:28:39,700 --> 00:28:41,720 And you just have to be able to learn

527 00:28:42,320 --> 00:28:43,600 how to get ahead of the narrative.

528 00:28:43,720 --> 00:28:45,300 Otherwise, the narrative will get ahead of you.

529 00:28:45,300 --> 00:28:46,600 And that’s not going to serve

530 00:28:46,600 --> 00:28:48,300 anybody in any positive light.

531 00:28:48,820 --> 00:28:49,260 No, exactly.

532 00:28:49,400 --> 00:28:50,760 Because what I’m saying is that

533 00:28:50,760 --> 00:28:52,420 trust takes years to build,

534 00:28:52,560 --> 00:28:53,240 seconds to ruin,

535 00:28:53,300 --> 00:28:54,400 and forever to repair.

536 00:28:55,200 --> 00:28:56,500 Couldn’t have said that better myself.

537 00:28:57,500 --> 00:28:58,580 It’s not me saying that.

538 00:28:58,680 --> 00:28:59,540 It’s a proverb, I guess.

539 00:28:59,640 --> 00:29:01,640 It is, but it’s very apt.

540 00:29:02,660 --> 00:29:04,740 Then another thing you brought up was WebAuthn.

541 00:29:05,020 --> 00:29:05,240 Yes.

542 00:29:05,340 --> 00:29:07,200 The thing that you should bring home from the presentation.

543 00:29:07,460 --> 00:29:09,700 And the presentation will be live on YouTube

544 00:29:09,700 --> 00:29:11,280 to look at.

545 00:29:11,280 --> 00:29:15,080 So all the listeners that want to go into detail there

546 00:29:15,080 --> 00:29:16,060 can look at it.

547 00:29:16,060 --> 00:29:19,540 What would you say is the main disruptive thing

548 00:29:19,540 --> 00:29:20,360 about WebAuthn?

549 00:29:20,760 --> 00:29:22,660 So WebAuthn is allowing you to do

550 00:29:22,660 --> 00:29:23,700 passwordless authentication.

551 00:29:23,860 --> 00:29:24,720 So you’re able to use

552 00:29:24,720 --> 00:29:26,380 multi-factor authentication for websites.

553 00:29:26,500 --> 00:29:27,620 So if you’re going to internet banking,

554 00:29:27,800 --> 00:29:29,740 you’re able to have a push come to your device

555 00:29:29,740 --> 00:29:30,680 and you can say yes or no

556 00:29:30,680 --> 00:29:32,200 as to whether or not you want to gain access

557 00:29:32,200 --> 00:29:33,380 to that particular website

558 00:29:33,380 --> 00:29:36,020 as opposed to typing in a username and password.

559 00:29:36,980 --> 00:29:39,280 Because there’s too many phishing sites out there

560 00:29:39,280 --> 00:29:39,900 as an example.

561 00:29:40,500 --> 00:29:42,080 And there’s too many opportunities

562 00:29:42,080 --> 00:29:43,720 for static passwords to be purloined

563 00:29:43,720 --> 00:29:44,700 by negative actors.


565 00:29:46,060 --> 00:29:47,800 This is a better way to do it

566 00:29:47,800 --> 00:29:50,600 in such a way that it is far, far more difficult

567 00:29:50,600 --> 00:29:52,000 for an attacker

568 00:29:52,000 --> 00:29:54,960 to gain access to your resources.

569 00:29:56,580 --> 00:29:57,660 Nothing’s ever 100%

570 00:29:57,660 --> 00:29:59,780 but this is far better than what we currently have.

571 00:29:59,780 --> 00:30:01,660 Yeah, we have some kind of standard now

572 00:30:01,660 --> 00:30:03,500 to hold on to, so to speak.

573 00:30:03,600 --> 00:30:06,140 And it’s an open standard, which I’m all about.

574 00:30:06,940 --> 00:30:08,100 Pro that, pro that.

575 00:30:08,260 --> 00:30:08,640 Very much.
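For listeners who want to see what the passwordless flow looks like in code, here is a minimal browser-side sketch of a WebAuthn registration ceremony. The navigator.credentials API and its field names come from the W3C standard; the relying-party details are made up, and in a real deployment the challenge comes from your server rather than being generated locally:

```typescript
// Browser-side sketch of WebAuthn registration. The private key is created
// on and never leaves the authenticator; the site only stores a public key,
// so phishing pages and database leaks get nothing reusable.

async function registerPasskey(): Promise<void> {
  // Hypothetical values: in practice the server issues the challenge and user id.
  const challenge = crypto.getRandomValues(new Uint8Array(32));
  const userId = crypto.getRandomValues(new Uint8Array(16));

  const credential = await navigator.credentials.create({
    publicKey: {
      challenge,                          // signed by the authenticator
      rp: { name: "Example Bank" },       // the relying party (the website)
      user: { id: userId, name: "alice@example.com", displayName: "Alice" },
      // -7 is ES256 (ECDSA with SHA-256), the most widely supported algorithm
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],
      authenticatorSelection: { userVerification: "preferred" },
      timeout: 60_000,
    },
  });

  if (credential) {
    // The resulting attestation would be sent to the server for verification.
    console.log("Created credential:", (credential as PublicKeyCredential).id);
  }
}
```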

576 00:30:09,860 --> 00:30:13,060 Then you also brought up the supply chain issue, of course.

577 00:30:13,180 --> 00:30:14,080 It’s really hot

578 00:30:14,080 --> 00:30:15,180 and I mean it’s hitting

579 00:30:16,060 --> 00:30:17,580 companies where it hurts

580 00:30:17,580 --> 00:30:18,860 and where they’re soft, right?

581 00:30:18,920 --> 00:30:20,800 Because you trust your suppliers.

582 00:30:22,400 --> 00:30:24,280 How can you live in this world

583 00:30:24,280 --> 00:30:26,000 without trusting your suppliers?

584 00:30:26,340 --> 00:30:27,320 What are we saying?

585 00:30:28,280 --> 00:30:30,880 Is what's on the label actually what's in the box?

586 00:30:31,740 --> 00:30:31,860 Yeah.

587 00:30:32,160 --> 00:30:34,020 This is one of those things with supply chain

588 00:30:34,020 --> 00:30:35,720 and it’s really funny that it’s all hot now

589 00:30:35,720 --> 00:30:37,520 because I gave my first supply chain security talk

590 00:30:37,520 --> 00:30:38,240 back in 2013.

591 00:30:39,060 --> 00:30:41,880 And this is something I was tub-thumping for a very long time about.

592 00:30:42,220 --> 00:30:43,640 Because the attackers

593 00:30:43,640 --> 00:30:45,960 are shifting how they attack companies.

594 00:30:46,060 --> 00:30:47,440 How they attack organizations.

595 00:30:47,600 --> 00:30:48,380 How they attack governments.

596 00:30:48,880 --> 00:30:52,340 They have gone away from hitting you in the face

597 00:30:52,340 --> 00:30:53,940 and moved to trying to pick your pocket.

598 00:30:54,620 --> 00:30:56,420 So you really have to be able to

599 00:30:56,420 --> 00:30:58,720 be monitoring all these different aspects.

600 00:30:58,720 --> 00:31:01,040 Is code being introduced

601 00:31:01,040 --> 00:31:02,080 into your organization

602 00:31:02,080 --> 00:31:04,680 that either nefariously has

603 00:31:04,680 --> 00:31:06,440 negative instructions in it

604 00:31:06,440 --> 00:31:07,560 or accidentally?

605 00:31:08,280 --> 00:31:10,120 And this is just one example of many.

606 00:31:10,280 --> 00:31:12,060 There's great companies out there like Snyk

607 00:31:12,060 --> 00:31:14,360 that can monitor the libraries

608 00:31:14,360 --> 00:31:15,660 that you’re using in your organization.

609 00:31:16,060 --> 00:31:16,500 Look and say,

610 00:31:16,920 --> 00:31:18,500 okay, these individual applications,

611 00:31:18,740 --> 00:31:19,980 there’s a security problem with them.

612 00:31:20,220 --> 00:31:21,720 We need to update it in our own packages.

613 00:31:21,880 --> 00:31:23,400 So that’s just one example of many.

614 00:31:23,400 --> 00:31:25,800 So the DevOps way of developing

615 00:31:25,800 --> 00:31:27,760 needs to apply new tools

616 00:31:27,760 --> 00:31:28,640 as well then to

617 00:31:28,640 --> 00:31:31,680 be more resistant to these kinds of

618 00:31:31,680 --> 00:31:33,760 threats or risks.

619 00:31:34,000 --> 00:31:35,700 Yes, you want to build security into your pipeline.

620 00:31:35,820 --> 00:31:38,040 You want to make sure that it is part of the process

621 00:31:38,040 --> 00:31:39,000 as opposed to

622 00:31:39,000 --> 00:31:41,500 the good old fashioned flaming sword of justice.

623 00:31:41,760 --> 00:31:42,760 How do we get to know?

624 00:31:43,100 --> 00:31:44,460 That doesn’t serve anybody well.

625 00:31:44,600 --> 00:31:45,940 And in DevOps, you’ve got to move,

626 00:31:46,060 --> 00:31:46,340 fast,

627 00:31:46,680 --> 00:31:48,140 and you have to be agile.

628 00:31:48,720 --> 00:31:50,860 And when you’re dealing with that sort of environment,

629 00:31:51,000 --> 00:31:52,960 you want to make sure that security is baked in

630 00:31:52,960 --> 00:31:53,720 from the word go

631 00:31:53,720 --> 00:31:54,720 and you’re making sure

632 00:31:54,720 --> 00:31:56,560 that you’re not accidentally introducing

633 00:31:56,560 --> 00:31:58,700 vulnerabilities that could impact your customers,

634 00:31:58,940 --> 00:32:00,120 citizens, whatever it happens to be.

635 00:32:00,320 --> 00:32:01,160 Yeah, because I mean,

636 00:32:01,200 --> 00:32:03,700 there’s so many global services now

637 00:32:03,700 --> 00:32:05,200 so you could have a global impact

638 00:32:05,200 --> 00:32:07,740 if you’re putting anything in production.

639 00:32:08,360 --> 00:32:11,840 So just having a control of your impact level

640 00:32:11,840 --> 00:32:13,880 is a way of understanding

641 00:32:13,880 --> 00:32:15,880 how much effort you should put into this.

642 00:32:16,060 --> 00:32:16,600 Well, exactly.

643 00:32:16,780 --> 00:32:18,840 It’s like a simple script or a simple error

644 00:32:18,840 --> 00:32:21,420 can impact millions of people in the blink of an eye.

645 00:32:21,620 --> 00:32:22,520 Here’s a close-up.

646 00:32:22,840 --> 00:32:26,580 And the need to focus on this

647 00:32:26,580 --> 00:32:28,920 is definitely far higher than it’s ever been.

648 00:32:29,100 --> 00:32:30,400 Yeah, indeed.

649 00:32:30,920 --> 00:32:33,660 So Dave, I appreciate you taking your time

650 00:32:33,660 --> 00:32:35,480 to come to Säkerhetspodcasten

651 00:32:35,480 --> 00:32:37,040 and especially coming to Gothenburg

652 00:32:37,040 --> 00:32:38,240 in this beautiful day.

653 00:32:39,500 --> 00:32:40,540 Gothenburg weather day.

654 00:32:42,080 --> 00:32:43,260 It’s like Canadian weather.

655 00:32:43,260 --> 00:32:44,260 You’ve got to wait five minutes.

656 00:32:44,560 --> 00:32:45,600 Well, that’s okay.

657 00:32:45,600 --> 00:32:46,180 I’ll wait.

658 00:32:47,380 --> 00:32:48,840 But thank you very much for having me.

659 00:32:48,880 --> 00:32:49,120 Yeah.

660 00:32:49,440 --> 00:32:51,720 Okay, so on behalf of Säkerhetspodcasten's listeners,

661 00:32:51,720 --> 00:32:53,500 my name is Robin von Post,

662 00:32:53,740 --> 00:32:55,860 the flying reporter for the crew here,

663 00:32:55,860 --> 00:32:59,660 and I appreciate you all listening to this episode as well.

664 00:33:00,100 --> 00:33:02,420 Take care out there and have a nice day.