Jan. 17, 2025

Making the World a More Secure Place with Yasmin Abdi

Data and security breaches are a dime a dozen nowadays, and despite their frequency, they’re still just as dangerous. That’s where Yasmin Abdi, the CEO of noHack, comes in. Despite her relatively short career, she’s already worked for some of the giants of the tech industry, like Google and Snapchat. Along with Justin and Autumn, Yasmin breaks down real-world security challenges and solutions, offering a firsthand view into managing role-based access, phishing simulations for employee training, and the delicate balance between security and usability.


Show Highlights

(0:00) Intro

(0:32) Tremolo sponsor read

(2:04) How Yasmin built noHack

(3:15) Breaking down Yasmin's impressive resume

(4:17) What sparked Yasmin's interest in security?

(7:54) Yasmin's biggest challenge since starting noHack

(11:05) How Zero Trust has evolved over the past decade

(12:34) Balancing usability and security

(15:43) The problems with role-based access and how Yasmin's work addresses it

(19:31) Phishing schemes and AI's role in the future of security

(23:14) Tremolo sponsor read

(24:13) Yasmin's efforts to educate organizations on the dangers of phishing and poor security

(29:31) "Security theater" and the lack of serious education

(34:20) How to get people to take security seriously

(39:37) Yasmin's opinions on third-party scanning vendors

(43:17) How Yasmin would have handled the CrowdStrike attack

(46:52) Where you can find more from Yasmin

About Yasmin Abdi

Yasmin Abdi is the CEO and Founder of noHack, a cybersecurity company focused on delivering high-impact solutions for public- and private-sector clients, including startups and SMBs. Yasmin’s expertise spans enterprise security, secure software development, vulnerability and risk management, threat detection and intelligence, security assurance and education, and privacy best practices. She has shared her knowledge on major industry platforms, including Forbes, Cisco, and Voice of America, and has established herself as a leading voice in the cybersecurity space.


Before launching noHack, Yasmin led global security and privacy initiatives at tech giants like Google, Meta, and Snap. With over seven years of experience, she played a pivotal role as a founding member of Meemo, an AI-powered social finance app later acquired by Coinbase for $95M.

Links Referenced

Sponsor

Tremolo: http://fafo.fm/tremolo

Sponsor the FAFO Podcast!

http://fafo.fm/sponsor

Transcript

1
00:00:00,000 --> 00:00:03,469
I think something here that would be really important is like the principle of

2
00:00:03,470 --> 00:00:09,050
least privilege, really ensuring that users only have the needed permissions

3
00:00:09,139 --> 00:00:12,210
while they're doing their work that wouldn't disrupt their workflow.

4
00:00:18,110 --> 00:00:21,509
Welcome to Fork Around and Find Out, the podcast about

5
00:00:21,509 --> 00:00:24,600
building, running, and maintaining software and systems.

6
00:00:36,680 --> 00:00:37,769
Managing role based access control for

7
00:00:37,770 --> 00:00:41,310
Kubernetes isn't the easiest thing in the world.

8
00:00:41,400 --> 00:00:44,010
Especially as you have more clusters and more users

9
00:00:44,020 --> 00:00:47,170
and more services that want to use Kubernetes.

10
00:00:47,500 --> 00:00:50,360
OpenUnison helps solve those problems by bringing

11
00:00:50,360 --> 00:00:52,740
single sign on to your Kubernetes clusters.

12
00:00:52,889 --> 00:00:56,619
This extends Active Directory, Okta, Azure AD, and

13
00:00:56,669 --> 00:00:59,679
other sources as your centralized user management

14
00:00:59,775 --> 00:01:01,855
for your Kubernetes access control.

15
00:01:02,135 --> 00:01:05,254
You can forget managing all those YAML files to give someone access

16
00:01:05,254 --> 00:01:09,125
to the cluster and centrally manage all of their access in one place.

17
00:01:09,324 --> 00:01:11,664
This extends to services inside the cluster

18
00:01:11,685 --> 00:01:14,544
like Grafana, Argo CD, and Argo workflows.

19
00:01:14,545 --> 00:01:17,674
OpenUnison is a great open source project, but relying

20
00:01:17,675 --> 00:01:20,485
on open source without any support for something as

21
00:01:20,505 --> 00:01:23,990
critical as access management may not be the best option.

22
00:01:24,150 --> 00:01:25,990
Tremolo Security offers support for OpenUnison

23
00:01:26,680 --> 00:01:29,760
and other features around identity and security.

24
00:01:29,860 --> 00:01:33,350
Tremolo provides open source and commercial support for OpenUnison

25
00:01:33,459 --> 00:01:36,730
in all of your Kubernetes clusters, whether in the cloud or on prem.

26
00:01:36,970 --> 00:01:40,929
So check out Tremolo Security for your single sign on needs in Kubernetes.

27
00:01:41,239 --> 00:01:42,300
You can find them at fafo.

28
00:01:42,300 --> 00:01:45,759
fm slash Tremolo.

29
00:01:46,110 --> 00:01:48,990
That's T R E M O L O.

30
00:01:55,635 --> 00:01:58,155
Thank you for opening up your firewall ports in your

31
00:01:58,155 --> 00:02:01,195
head to listen to this episode with Yasmin Abdi.

32
00:02:01,205 --> 00:02:02,215
Welcome to the show, Yasmin.

33
00:02:02,695 --> 00:02:03,875
Thank you for having me.

34
00:02:04,345 --> 00:02:06,295
I wanted to talk a little bit about security.

35
00:02:06,295 --> 00:02:09,305
This is really cool seeing that you were saying that you built noHack,

36
00:02:10,175 --> 00:02:14,045
you're the founder and creator of noHack LLC, and you were doing security

37
00:02:14,045 --> 00:02:17,345
at Snap before that, but you built this on the side kind of as a project.

38
00:02:17,435 --> 00:02:18,655
And what drove you to do this?

39
00:02:18,655 --> 00:02:20,275
Why were you like, I want to do a side project

40
00:02:20,275 --> 00:02:21,825
and start doing security for other people?

41
00:02:22,260 --> 00:02:26,770
It initially started off with me just helping a few friends with their security

42
00:02:26,770 --> 00:02:30,499
and their digital presence, as well as small business owners and startups.

43
00:02:31,300 --> 00:02:33,979
I just saw a lot of vulnerabilities and how they

44
00:02:33,980 --> 00:02:37,080
were securing their data and most of them were not.

45
00:02:37,080 --> 00:02:42,200
And there's just like a lot of bad hygiene and bad practices.

46
00:02:42,200 --> 00:02:44,280
So it kind of just started off as a hobby,

47
00:02:44,300 --> 00:02:46,700
which helped people a few hours out of the week.

48
00:02:46,840 --> 00:02:50,500
A few years ago around when COVID started and everything became digital.

49
00:02:50,595 --> 00:02:53,235
And then it kind of just snowballed and grew from there.

50
00:02:53,235 --> 00:02:56,375
So while I was at Snap building, building, building, but never

51
00:02:56,375 --> 00:03:00,345
really took it to be like a formal like side company or a side

52
00:03:00,345 --> 00:03:02,775
hustle, just kind of something to help friends and family.

53
00:03:03,075 --> 00:03:06,934
And then over the past year, it just snowballed and kind of grew

54
00:03:06,935 --> 00:03:10,935
into a full functioning security services and solutions company.

55
00:03:10,935 --> 00:03:13,165
So decided to leave that, leave Snap back

56
00:03:13,165 --> 00:03:14,795
two months ago and focus full time on it.

57
00:03:15,845 --> 00:03:17,255
Can you tell us about your resume?

58
00:03:17,305 --> 00:03:18,665
Because it is very impressive.

59
00:03:18,715 --> 00:03:19,475
It's baller.

60
00:03:19,575 --> 00:03:20,455
Yeah, sure.

61
00:03:20,505 --> 00:03:24,165
Um, I've had the opportunity to work at some of the big name companies.

62
00:03:24,264 --> 00:03:26,444
I started my career off at Snapchat, where

63
00:03:26,445 --> 00:03:30,564
I interned twice back in 2017 and 2018.

64
00:03:30,565 --> 00:03:33,669
And then I went over to Google, was there for a few months.

65
00:03:33,670 --> 00:03:37,330
I was a software engineer on the Android team, learned a lot.

66
00:03:37,330 --> 00:03:39,750
I think that's every computer science and

67
00:03:39,750 --> 00:03:42,000
software engineer's dream is to work at Google.

68
00:03:42,000 --> 00:03:45,719
So had the, uh, had the opportunity to be there, learn from some of

69
00:03:45,740 --> 00:03:50,929
the smartest people, and then also had the opportunity to, uh, work

70
00:03:50,930 --> 00:03:54,850
at Facebook, as it was called back then, on an Instagram-specific

71
00:03:54,970 --> 00:03:55,420
team.

72
00:03:55,630 --> 00:03:57,560
And that was back in 2019.

73
00:03:58,050 --> 00:04:01,180
So yeah, I worked at Google, Meta, Snap.

74
00:04:01,240 --> 00:04:05,630
I also was a founding member of Meemo, which is, which was an

75
00:04:05,630 --> 00:04:08,919
AI fintech startup that a few of me and my friends started.

76
00:04:09,100 --> 00:04:12,800
And that was acquired by Coinbase back in 2021.

77
00:04:12,810 --> 00:04:17,130
So had the opportunity to do all of the fun things at these places.

78
00:04:17,150 --> 00:04:18,920
What did you do at Snap before you left?

79
00:04:18,920 --> 00:04:21,320
Like, can you run us through like your security career?

80
00:04:22,079 --> 00:04:22,530
I was like, how

81
00:04:22,530 --> 00:04:23,559
did you get into security?

82
00:04:23,589 --> 00:04:25,179
Like what, what started all of this?

83
00:04:25,639 --> 00:04:27,009
I've always had a passion.

84
00:04:27,039 --> 00:04:32,829
I tell myself that earlier on in my career, I always thought like a hacker.

85
00:04:32,869 --> 00:04:35,749
So there's like a way to do things the normal way, you

86
00:04:35,749 --> 00:04:39,829
know, like log into the wifi, here's your password and, and

87
00:04:39,840 --> 00:04:42,099
you know, you're logged in and everything is good to go.

88
00:04:42,279 --> 00:04:43,719
And then there's always a backdoor.

89
00:04:43,719 --> 00:04:45,829
There's always another way to get into something.

90
00:04:46,079 --> 00:04:47,489
There's another way to break into a system.

91
00:04:47,659 --> 00:04:51,029
Um, so I always thought, Hey, like, how can I manipulate the system?

92
00:04:51,029 --> 00:04:52,459
Like, how can I, I don't know if that's the

93
00:04:52,459 --> 00:04:55,619
best response or if that's even legal, but, um,

94
00:04:56,559 --> 00:04:57,319
totally valid.

95
00:04:57,319 --> 00:04:59,809
Everyone has some use case where they're like, I wanted to hack a game.

96
00:04:59,809 --> 00:05:00,159
Security

97
00:05:00,239 --> 00:05:03,579
people used to do like some other stuff and

98
00:05:03,579 --> 00:05:05,069
that's how they learned how to do security.

99
00:05:05,109 --> 00:05:06,399
That's just white hat hacking.

100
00:05:06,459 --> 00:05:07,669
That's just, you know, well,

101
00:05:07,669 --> 00:05:09,929
I mean, you have a personal bend on most,

102
00:05:09,929 --> 00:05:11,479
like I did war driving back in the day.

103
00:05:11,479 --> 00:05:13,089
Cause I wanted like free wifi, right?

104
00:05:13,089 --> 00:05:14,309
I was like, Oh, my neighbors have wifi.

105
00:05:14,319 --> 00:05:15,299
How do I get on that?

106
00:05:15,299 --> 00:05:15,349
Right.

107
00:05:15,899 --> 00:05:20,379
Yeah, I think for me also my parents would like shut down like internet access

108
00:05:20,379 --> 00:05:24,279
after a certain time, like put passwords on the like desktops in our houses

109
00:05:24,319 --> 00:05:27,599
and like just do things like that and I was like, no, like, there's no way

110
00:05:27,599 --> 00:05:30,359
you're going to turn off my internet after a certain like, there's no way

111
00:05:30,359 --> 00:05:33,829
you can block these certain TV shows and TV channels after a certain time.

112
00:05:33,829 --> 00:05:36,489
So I was kind of for my own personal preference.

113
00:05:36,724 --> 00:05:40,354
But at that time, I didn't know what I was like, I didn't know it was called

114
00:05:40,414 --> 00:05:43,514
hacking and throughout my college career, or even high school, as early

115
00:05:43,514 --> 00:05:47,264
as high school, I said, Hey, like, you know, I enjoyed doing those things.

116
00:05:47,454 --> 00:05:50,594
And then I found out job security, I found out this was something that

117
00:05:50,594 --> 00:05:53,534
was going to be long lasting and in, in the world that we live in.

118
00:05:53,534 --> 00:05:57,004
So studied at the University of Maryland, um, took my first

119
00:05:57,004 --> 00:06:00,194
formal cybersecurity software engineering course there.

120
00:06:00,264 --> 00:06:04,184
Graduated back in 2019 with a bachelor's in computer

121
00:06:04,184 --> 00:06:06,844
science and a focus in cybersecurity, and then started

122
00:06:06,844 --> 00:06:09,234
my full time career in cybersecurity at Snapchat.

123
00:06:09,354 --> 00:06:11,934
And at Snap, I was a software engineer, software security

124
00:06:11,944 --> 00:06:14,804
engineer, working on developing internal tooling for the

125
00:06:14,804 --> 00:06:17,894
security team, and then, um, moved my way up and then became a

126
00:06:17,904 --> 00:06:21,014
manager and kind of led the Insider Risk Program before I left.

127
00:06:21,664 --> 00:06:26,054
Y'all, she's like gorgeous and smart and got like a crazy career.

128
00:06:26,074 --> 00:06:29,044
How did you do all that since 2019?

129
00:06:29,084 --> 00:06:29,984
Like, did you sleep?

130
00:06:30,044 --> 00:06:30,504
Like, what?

131
00:06:30,604 --> 00:06:30,824
Like,

132
00:06:32,174 --> 00:06:34,234
to be honest, no, I didn't sleep.

133
00:06:34,254 --> 00:06:37,004
There was lots of late nights, but I think for

134
00:06:37,014 --> 00:06:39,984
me, I've always just had a passion to learn more.

135
00:06:39,984 --> 00:06:44,064
So even in times and days where I didn't think I was working, working like the

136
00:06:44,064 --> 00:06:48,154
startup that me and my friends started, I didn't really think of that as a job.

137
00:06:48,154 --> 00:06:52,009
I thought of that as more of a side hustle, like you

138
00:06:52,009 --> 00:06:54,029
and your friends are working on building something cool.

139
00:06:54,239 --> 00:06:56,509
And then, I mean, acquisition was the goal,

140
00:06:56,509 --> 00:06:58,389
but I didn't think it would happen that fast.

141
00:06:58,389 --> 00:07:00,849
I think that turnaround was like 18 months.

142
00:07:00,929 --> 00:07:03,739
I had a lot of very senior people on my team, I'll tell

143
00:07:03,739 --> 00:07:07,019
you that, like ex Google directors of search, et cetera.

144
00:07:07,049 --> 00:07:10,109
So like, it was, it was a heavy, heavy group of people working

145
00:07:10,109 --> 00:07:12,474
on that product, but still the turnaround time was, was amazing.

146
00:07:12,474 --> 00:07:13,204
Super fast.

147
00:07:13,334 --> 00:07:15,524
But yeah, I think like working on that project

148
00:07:15,544 --> 00:07:17,944
with, with my friends, like you lose track of time.

149
00:07:18,154 --> 00:07:19,444
You lose track of the days.

150
00:07:19,604 --> 00:07:22,634
And then also when I was at Snap, like working on a product that

151
00:07:22,634 --> 00:07:25,424
you care so much, you feel so much passion and you care about.

152
00:07:25,674 --> 00:07:29,374
And then with noHack, like I think I work seven days a week.

153
00:07:29,374 --> 00:07:32,334
Like I can't even tell the difference between like work and not work

154
00:07:32,334 --> 00:07:35,789
because sometimes I I work on stuff that's, like, super fun and engaging.

155
00:07:35,799 --> 00:07:37,119
Like, I do a lot of public speaking.

156
00:07:37,179 --> 00:07:37,989
I do a lot of panels.

157
00:07:37,989 --> 00:07:38,919
I do a lot of conferences.

158
00:07:38,919 --> 00:07:39,959
I travel the world now.

159
00:07:40,219 --> 00:07:42,509
And it's, like, for work, but it's, like, fun work.

160
00:07:42,509 --> 00:07:44,719
So, sometimes I get lost in the track of time

161
00:07:44,719 --> 00:07:46,849
with all the different moving pieces that I do.

162
00:07:47,019 --> 00:07:48,989
It seems like you really enjoy what you do and that

163
00:07:48,989 --> 00:07:51,359
you just are born with the hustle and curiosity.

164
00:07:51,859 --> 00:07:52,089
I

165
00:07:52,089 --> 00:07:53,039
would say I agree

166
00:07:53,039 --> 00:07:53,459
with that.

167
00:07:53,499 --> 00:07:53,969
Thank you.

168
00:07:54,619 --> 00:07:58,309
What's, uh, What's the biggest challenge you've faced so far starting

169
00:07:58,479 --> 00:08:01,049
Nohack or maybe what's the biggest thing that you didn't expect?

170
00:08:01,349 --> 00:08:06,169
So as a CEO, I'm learning a lot of not technical like skill sets.

171
00:08:06,499 --> 00:08:11,419
So um, when I was at Snap or when I was at Meemo or even Google and Meta, I

172
00:08:11,419 --> 00:08:14,319
was learning. It was very much software engineer or security engineer skill sets.

173
00:08:14,319 --> 00:08:17,459
So it was very much within the engineering realm of things.

174
00:08:17,689 --> 00:08:21,619
And now I'm learning about cap tables and investing and like how to

175
00:08:21,619 --> 00:08:24,949
pitch and how to sell yourself and how to sell your company and like

176
00:08:24,949 --> 00:08:27,979
how to form like partnership deals, which for me, I've always had

177
00:08:27,979 --> 00:08:31,109
like the, I feel like I've always been born with the opportunity to

178
00:08:31,119 --> 00:08:34,919
communicate, but I think it's like the sit, the selling aspect of

179
00:08:34,919 --> 00:08:38,089
noHack is something that I've, I've been learning and it's definitely fun.

180
00:08:38,089 --> 00:08:40,369
I feel like I'm back in school where I'm learning something.

181
00:08:40,504 --> 00:08:42,554
All over again from, from the ground up.

182
00:08:42,774 --> 00:08:46,014
I feel like with, with an engineering, there's, everything's changing, but

183
00:08:46,024 --> 00:08:48,764
there's, there's a certain way that things are changing that you can grasp on.

184
00:08:48,764 --> 00:08:51,724
But like with sales or with business or with marketing, like I, I

185
00:08:52,084 --> 00:08:55,164
tend to spend a lot of my time figuring out like all these social

186
00:08:55,164 --> 00:08:57,824
media strategies and things that I've just never been privy to.

187
00:08:57,824 --> 00:09:00,524
So I would say those are some of the things that, that

188
00:09:00,524 --> 00:09:02,304
have been challenging, but they're fun challenges.

189
00:09:02,304 --> 00:09:05,434
Cause I'm always learning and I feel like I'm a lifelong learner.

190
00:09:05,434 --> 00:09:07,654
So it's fun to learn new, new areas.

191
00:09:08,519 --> 00:09:10,839
What sort of software did you start building with?

192
00:09:10,859 --> 00:09:12,329
Like, cause you mentioned you initially,

193
00:09:12,419 --> 00:09:14,009
initially this was like friends and family.

194
00:09:14,009 --> 00:09:15,329
And a lot of that's just like, here's a

195
00:09:15,329 --> 00:09:16,969
webpage with a blog post or something, right?

196
00:09:16,969 --> 00:09:20,619
Like use a password manager, put on TFA if you can, something like that.

197
00:09:20,619 --> 00:09:22,749
And then at some point you have to like transition that into

198
00:09:22,749 --> 00:09:25,719
like, Oh, if I'm going to help a company do this, I need tools or

199
00:09:25,719 --> 00:09:28,379
I need some automation or I need some way to do this reporting.

200
00:09:28,379 --> 00:09:30,579
What sort of things did you focus on first to do?

201
00:09:30,579 --> 00:09:33,509
Like here's your public port scans or here's

202
00:09:33,509 --> 00:09:37,280
a CVE report or what are you doing first?

203
00:09:37,570 --> 00:09:40,570
So at noHack, we really started off with

204
00:09:40,780 --> 00:09:43,240
penetration testing and vulnerability scanning.

205
00:09:43,380 --> 00:09:47,510
So regular just assessing systems for weaknesses and vulnerabilities.

206
00:09:47,760 --> 00:09:49,560
Typically, that's that's how we would start.

207
00:09:49,609 --> 00:09:53,159
So, hey, like, you know, if you're a startup, primarily with like a

208
00:09:53,159 --> 00:09:56,319
digital footprint, we would scan your infrastructure, your

209
00:09:56,320 --> 00:09:58,650
systems, your architecture, your endpoints, like

210
00:09:58,650 --> 00:10:01,040
your APIs, all of the things that you have digitally

211
00:10:01,040 --> 00:10:03,630
connected to the world and scan them for vulnerabilities.

212
00:10:03,660 --> 00:10:05,800
And then we would also do some red teaming.

213
00:10:05,810 --> 00:10:08,980
So we would pen test and really see how we could maybe

214
00:10:08,980 --> 00:10:11,589
break into your system or find weaknesses in your system.

215
00:10:11,590 --> 00:10:12,910
So that's kind of how we started.

216
00:10:12,970 --> 00:10:16,250
And then we built a bunch of other like services and solutions from that.

217
00:10:16,570 --> 00:10:20,690
They can range anywhere between like AI threat detection and response.

218
00:10:20,690 --> 00:10:23,840
So really utilizing a lot of like machine learning and a

219
00:10:23,840 --> 00:10:27,095
lot of like the insider risks experience that I had to kind

220
00:10:27,095 --> 00:10:29,605
of understand, okay, like, where is the threat happening?

221
00:10:29,685 --> 00:10:30,875
Where are the pain points here?

222
00:10:30,875 --> 00:10:32,655
And then what responses can we build?

223
00:10:32,655 --> 00:10:35,115
So whether that's alerting, whether that's setting

224
00:10:35,115 --> 00:10:37,355
up continuous monitoring and things of that sort.

225
00:10:37,355 --> 00:10:40,145
So I would say we started with vulnerability assessment

226
00:10:40,355 --> 00:10:43,504
and scanning, moved over to threat detection response.

227
00:10:43,685 --> 00:10:46,925
And then now we really focus a lot on like zero trust architecture.

228
00:10:46,925 --> 00:10:52,564
So making sure that we are never trusting and always verifying every request

229
00:10:52,565 --> 00:10:57,215
that comes in, no matter if the user or device has access to any, any systems

230
00:10:57,215 --> 00:11:01,235
we always would verify and have very strict authorization and authentication.

231
00:11:01,245 --> 00:11:04,225
So I think those are probably the three biggest things that we focus on at

232
00:11:04,775 --> 00:11:05,005
noHack.
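
The "never trust, always verify" posture described here can be sketched as a check that re-authenticates the identity and re-validates the device on every single request, with no implicit trust from network location. This is a minimal illustration, not anything from noHack's actual tooling; the session store, device list, and function names are all hypothetical:

```python
# Minimal zero-trust sketch: every request re-verifies identity and device
# posture; nothing is trusted just because it is already "inside" the network.
# VALID_SESSIONS and TRUSTED_DEVICES are toy stand-ins for a real identity
# provider and a device-posture service.

VALID_SESSIONS = {"sess-1": "yasmin"}   # session token -> user
TRUSTED_DEVICES = {"laptop-42"}         # devices passing posture checks

def verify_request(session_id: str, device_id: str):
    """Return the verified user, or None if any check fails."""
    user = VALID_SESSIONS.get(session_id)
    if user is None:
        return None                     # authentication failed
    if device_id not in TRUSTED_DEVICES:
        return None                     # untrusted device
    return user

print(verify_request("sess-1", "laptop-42"))   # yasmin
print(verify_request("sess-1", "old-phone"))   # None
print(verify_request("expired", "laptop-42"))  # None
```

In a real deployment a check like this sits in middleware or an identity-aware proxy, so no endpoint can be reached without passing it.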

233
00:11:05,035 --> 00:11:07,194
How has Zero Trust evolved over time?

234
00:11:07,195 --> 00:11:10,685
Like when I remember back in the 20 teens or whatever, like Zero

235
00:11:10,685 --> 00:11:12,915
Trust was just like, Oh, at some level of your network, you need

236
00:11:12,915 --> 00:11:15,285
to be able to figure out if this device is trusted on the network.

237
00:11:15,285 --> 00:11:17,155
And you just do it with mutual certificates.

238
00:11:17,165 --> 00:11:18,515
You're like, Oh, you got a cert, I got a cert.

239
00:11:18,765 --> 00:11:20,545
We trust the signing authority.

240
00:11:20,545 --> 00:11:21,235
We're fine.

241
00:11:21,285 --> 00:11:22,195
Let's keep talking.

242
00:11:22,685 --> 00:11:23,765
But when you talk about

243
00:11:24,245 --> 00:11:25,835
Auth Z versus Auth N or whatever.

244
00:11:25,835 --> 00:11:27,715
Like you're like moving into the application layer.

245
00:11:27,715 --> 00:11:29,795
You're saying, Oh, you can do that at the network layer.

246
00:11:29,795 --> 00:11:31,675
You can do it at the request load balancing layer.

247
00:11:31,685 --> 00:11:33,104
You can do it at the application layer.

248
00:11:33,404 --> 00:11:37,545
How has that changed over time for something that is a zero trust mindset?

249
00:11:38,075 --> 00:11:42,185
I think it started off with adding like an additional layer of protection.

250
00:11:42,185 --> 00:11:45,155
So when you think about adding 2FA or MFA.

251
00:11:45,310 --> 00:11:47,260
And I think that's kind of like the early days.

252
00:11:47,260 --> 00:11:50,660
And then now it's kind of evolved into like a continuous monitoring

253
00:11:50,710 --> 00:11:53,980
approach where every single request that comes in, you're going to

254
00:11:54,050 --> 00:11:57,819
verify the identity before allowing any type of level of access.

255
00:11:57,880 --> 00:12:00,220
And then I also think that identity and access

256
00:12:00,220 --> 00:12:03,339
management has also been increasingly important.

257
00:12:03,560 --> 00:12:06,050
So always managing the user's identities and

258
00:12:06,050 --> 00:12:08,750
permissions to minimize any unauthorized risks.

259
00:12:08,750 --> 00:12:12,940
So there's frameworks like RBAC, so Role Based Access Controls, and

260
00:12:13,095 --> 00:12:15,305
access management systems to really ensure that

261
00:12:15,325 --> 00:12:18,605
employees, regardless of like their role, can only access

262
00:12:18,605 --> 00:12:20,725
data that's necessary and needed for their work.

263
00:12:20,735 --> 00:12:24,264
So I think it really started off with the MFAs and 2FAs

264
00:12:24,275 --> 00:12:26,995
moved over to a continuous monitoring approach with that

265
00:12:26,995 --> 00:12:30,514
identity and access management, managing users permissions,

266
00:12:30,514 --> 00:12:33,735
and very granular to the RBAC framework that I mentioned.
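
The RBAC framework mentioned here can be sketched as a mapping from each role to the minimal permission set that job needs, so an action is allowed only if it appears in the requester's role. The roles and permission strings below are invented purely for illustration:

```python
# RBAC sketch: each role carries only the permissions its job requires
# (least privilege). Role and permission names are illustrative.

ROLE_PERMISSIONS = {
    "support": {"tickets:read", "tickets:write"},
    "engineer": {"code:read", "code:write", "tickets:read"},
    "admin": {"users:manage", "code:read", "tickets:read"},
}

def can(role: str, permission: str) -> bool:
    """True only if the role's minimal permission set includes this action."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("support", "tickets:write"))  # True
print(can("support", "code:write"))     # False: outside the role's minimum
```

The point of the mapping is that granting access becomes assigning a role, and tightening access becomes editing one set rather than chasing per-user permissions.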

267
00:12:34,340 --> 00:12:39,255
How do you balance usability and security and kind of like educating people?

268
00:12:39,285 --> 00:12:42,010
Because I feel like That is the hardest part, kind of

269
00:12:42,040 --> 00:12:45,940
getting people to realize how important security is and why.

270
00:12:46,360 --> 00:12:48,020
Because people just be like, well, I don't want to get

271
00:12:48,020 --> 00:12:50,810
alerted all the time and I don't want to sign in twice and

272
00:12:50,850 --> 00:12:53,980
this is so much harder, but trying to really educate them.

273
00:12:54,400 --> 00:12:58,380
But also make it easy to use, but secure is always the best.

274
00:12:58,380 --> 00:12:58,720
I just

275
00:12:58,720 --> 00:13:01,950
moved one of my Google accounts to a passkey and I don't like it.

276
00:13:02,400 --> 00:13:03,620
I'm like, Oh, this is more secure.

277
00:13:03,620 --> 00:13:04,040
It's better.

278
00:13:04,080 --> 00:13:04,490
Blah, blah, blah.

279
00:13:04,490 --> 00:13:07,849
And like this, the sign in flow is just worse now compared

280
00:13:07,849 --> 00:13:10,549
to 1Password auto filling my username and password.

281
00:13:10,560 --> 00:13:11,910
And now I'm like, Oh, I got three.

282
00:13:11,970 --> 00:13:15,560
I hate remembering all the ridiculous passwords I make, but.

283
00:13:15,860 --> 00:13:20,000
Google, definitely, there is some sort of a bug in the passkey

284
00:13:20,020 --> 00:13:22,970
that sometimes it doesn't always work the way it's supposed to,

285
00:13:23,580 --> 00:13:26,810
but I really appreciate that Apple products use the same passkey.

286
00:13:26,830 --> 00:13:28,109
Like, you know, you can use it from one

287
00:13:28,110 --> 00:13:29,810
of your phones because it's all connected.

288
00:13:30,080 --> 00:13:33,060
I do appreciate that my face is my passkey and it doesn't require

289
00:13:33,060 --> 00:13:36,410
me to remember, like, passwords and constantly change them, so.

290
00:13:36,760 --> 00:13:39,860
Yeah, I mean, I think that that's a core challenge

291
00:13:39,860 --> 00:13:42,860
that's always being spoken about in security.

292
00:13:43,040 --> 00:13:49,150
I think the overcomplex and restrictive systems can lead to like frustration.

293
00:13:49,460 --> 00:13:53,739
I remember when I was at Snap, we had like four different layers of

294
00:13:53,800 --> 00:13:57,750
authentication that we needed to get through to get into our Google account.

295
00:13:57,790 --> 00:14:06,879
So it was definitely a lot.

296
00:14:06,880 --> 00:14:08,387
I think one thing

297
00:14:08,387 --> 00:14:14,379
that always comes up, and

298
00:14:14,379 --> 00:14:18,105
that is always needed, is password complexity.

299
00:14:18,385 --> 00:14:23,195
So if a system requires like a long, complex password, maybe not asking them.

300
00:14:23,285 --> 00:14:25,905
I think sometimes even these days that you require

301
00:14:25,905 --> 00:14:29,165
to get changed every 90 or 180 days, I saw some, some

302
00:14:29,174 --> 00:14:32,705
organizations and I was like, Hey, like that's a bit too much.

303
00:14:32,785 --> 00:14:34,325
That would be too annoying for me personally.

304
00:14:34,334 --> 00:14:36,355
If I had to create a new password every 180 days.

305
00:14:36,605 --> 00:14:38,295
And especially they're so long.

306
00:14:38,385 --> 00:14:38,915
Yes.

307
00:14:38,915 --> 00:14:42,535
I think there's like that battle where like you want

308
00:14:42,745 --> 00:14:45,415
it to be usable, because if not, they're going to go around it, just

309
00:14:45,415 --> 00:14:47,935
like you said how you went around your parents' stuff. Like, I talk about

310
00:14:47,935 --> 00:14:50,195
all the time, my kids are going to end up working for the NSA to get

311
00:14:50,195 --> 00:14:54,415
around all of the, like, you know. Like, so people will go around it and

312
00:14:54,475 --> 00:14:59,215
be super lazy and not use all the safeguards or try to get out of them.

313
00:14:59,485 --> 00:15:01,955
But you want it to also be safe, so it's like the struggle.

314
00:15:02,205 --> 00:15:03,635
Yeah, no, I agree.

315
00:15:03,635 --> 00:15:05,365
I think something here that would be really

316
00:15:05,365 --> 00:15:08,155
important is like the principle of least privilege.

317
00:15:08,405 --> 00:15:11,515
So again, with going back to like role based access control,

318
00:15:11,525 --> 00:15:16,014
really ensuring that users only have the needed permissions

319
00:15:16,570 --> 00:15:19,780
while they're doing their work that wouldn't disrupt their workflow.

320
00:15:20,070 --> 00:15:22,840
So the principle of least privilege, I think, is

321
00:15:22,850 --> 00:15:25,570
extremely important when trying to find that balance.

322
00:15:25,780 --> 00:15:30,529
And then I also think kind of creating like a human centric design, really

323
00:15:30,530 --> 00:15:34,514
designing these security measures that are intuitive, minimally disruptive

324
00:15:34,575 --> 00:15:35,375
to, to the workflow.

325
00:15:35,375 --> 00:15:38,635
So something like a single sign on could be helpful, but

326
00:15:38,665 --> 00:15:41,305
I think, yeah, it'd always be, it'll always be interesting

327
00:15:41,305 --> 00:15:43,424
to kind of find what that balance is going to be.

328
00:15:43,765 --> 00:15:47,924
One of my pain points of any role based access control system

329
00:15:47,965 --> 00:15:53,110
is it's so hard to define a person as a single role, right?

330
00:15:53,110 --> 00:15:56,680
Like most people, once they're, when they start their job, like your role is

331
00:15:56,680 --> 00:16:00,659
clearly defined and in larger corporations, it might be easier to fit you in.

332
00:16:00,660 --> 00:16:01,759
Like, this is your role.

333
00:16:01,759 --> 00:16:05,070
This is the only access you ever have access to, but once they move

334
00:16:05,089 --> 00:16:08,549
positions internally, they're now doing two, like a role and a half.

335
00:16:08,560 --> 00:16:10,680
Cause they're like, Oh, I still do some of that stuff for that old job.

336
00:16:10,680 --> 00:16:11,510
I have this new one.

337
00:16:11,510 --> 00:16:14,270
And they switch again, or the team moves or org

338
00:16:14,290 --> 00:16:17,190
charts move or the products move, whatever it is.

339
00:16:17,590 --> 00:16:20,520
All of those roles get really, really messy once we try to

340
00:16:20,520 --> 00:16:25,380
maintain them after six months or a year of real life experience.

341
00:16:25,910 --> 00:16:28,690
Not even just that, but like packages, like, you know, when you're

342
00:16:28,690 --> 00:16:31,920
like are responsible for certain code packages and having ownership

343
00:16:31,920 --> 00:16:35,240
of the testing and the pipelines and all of that stuff, like there's

344
00:16:35,240 --> 00:16:38,390
always some point you have to give somebody access to binaries or

345
00:16:38,390 --> 00:16:40,860
something, but then how long can you give them access to binaries?

346
00:16:40,860 --> 00:16:41,299
And like that

347
00:16:41,549 --> 00:16:43,700
you're, you have a temporary role in this case, right?

348
00:16:43,700 --> 00:16:45,590
Like here, you need this for a week or two.

349
00:16:45,630 --> 00:16:46,250
I don't know.

350
00:16:46,250 --> 00:16:47,780
And, and that just gets messy.

351
00:16:47,780 --> 00:16:51,090
And a lot of times it's just like, Oh, I have, I have root access because I got

352
00:16:51,090 --> 00:16:54,229
it three years ago and I still have root access and no one knew to take it away.

353
00:16:54,440 --> 00:16:57,720
And trying to do it fast, you know, like where, Hey, I need this.

354
00:16:57,720 --> 00:16:59,690
And you're like, uh, I know you need it.

355
00:17:00,540 --> 00:17:01,290
Exactly.

356
00:17:01,340 --> 00:17:02,710
So when you're in production, you're trying

357
00:17:02,710 --> 00:17:04,550
to fix something that makes it so complicated.

358
00:17:04,560 --> 00:17:04,790
Have you

359
00:17:04,790 --> 00:17:07,860
seen easier ways to manage that or to, to change

360
00:17:07,860 --> 00:17:12,039
that or, or just to make RBAC fit the real world?

361
00:17:12,719 --> 00:17:13,379
Yeah.

362
00:17:13,379 --> 00:17:15,630
So that's actually one of the services that

363
00:17:15,630 --> 00:17:17,640
I built when I was a software engineer.

364
00:17:17,640 --> 00:17:20,340
One of the, one of the internal services I built at Snap.

365
00:17:20,845 --> 00:17:22,255
It was so hard to get right.

366
00:17:22,285 --> 00:17:23,915
It was so difficult to get right.

367
00:17:23,915 --> 00:17:24,895
So I'll start with that.

368
00:17:24,965 --> 00:17:29,405
But we built a tool that allows us to know who has access to what.

369
00:17:29,415 --> 00:17:33,575
So given like an employee's email address, it would show us everything they had

370
00:17:33,584 --> 00:17:37,785
access to, when they got access, what role they had, what permission, et cetera.

371
00:17:37,785 --> 00:17:39,945
Like what if it was GitHub, like what repository.

372
00:17:39,955 --> 00:17:43,745
So it looked at internal services, it looked at external as well.

373
00:17:43,985 --> 00:17:46,680
And I think for us, like, first, visibility

374
00:17:46,680 --> 00:17:48,100
and awareness was the most important.

375
00:17:48,100 --> 00:17:50,920
So we can't revoke your access if we don't know what you had.

376
00:17:51,030 --> 00:17:53,010
So I think like kind of what you were saying around

377
00:17:53,010 --> 00:17:55,430
like the temporary access or if someone changes teams.

378
00:17:55,430 --> 00:17:58,930
I know I changed teams like two or three times at Snap and when I was building

379
00:17:58,930 --> 00:18:02,190
this tool I was like oh wow I still have access to the old teams that I

380
00:18:02,200 --> 00:18:06,490
had or I requested temporary access for this one project and I still

381
00:18:06,490 --> 00:18:09,970
have access from like an external partner that I don't even need access to.

382
00:18:10,270 --> 00:18:14,480
So I think like awareness was the first part and then we built this

383
00:18:14,480 --> 00:18:18,220
mechanism that had like, if you didn't use this access within, I think it

384
00:18:18,220 --> 00:18:21,769
was 90 days or 180 days, then you most likely won't need it moving forward.

385
00:18:21,769 --> 00:18:24,459
And then if you did, well, you'll just have to request it again.

386
00:18:24,469 --> 00:18:28,750
So I think that's a way that it was more applicable to like real world.

387
00:18:28,750 --> 00:18:31,180
Like I wasn't on that team for two years.

388
00:18:31,180 --> 00:18:32,929
I didn't need that access.

389
00:18:32,929 --> 00:18:34,329
So it was revoked.

390
00:18:34,330 --> 00:18:37,035
Like that worked pretty well, just because people wouldn't,

391
00:18:37,235 --> 00:18:39,255
and if they needed it, they would just re-request it.
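
The expiry mechanism Yasmin describes, flag access that hasn't been used within some window and revoke it, can be sketched roughly like this. The record shapes, names, and the 90-day threshold are illustrative, not Snap's actual implementation:

```python
from datetime import datetime, timedelta

# Illustrative grant records: who has access to what, and when it was last used.
GRANTS = [
    {"user": "alice@example.com", "resource": "github:payments-repo",
     "last_used": datetime(2024, 3, 1)},
    {"user": "alice@example.com", "resource": "aws:prod-db",
     "last_used": datetime(2024, 12, 20)},
]

def stale_grants(grants, now, max_idle_days=90):
    """Grants unused for longer than the idle window are revocation candidates."""
    cutoff = now - timedelta(days=max_idle_days)
    return [g for g in grants if g["last_used"] < cutoff]

# Anything flagged here gets revoked; the user can simply re-request it if needed.
for g in stale_grants(GRANTS, datetime(2025, 1, 17)):
    print(f"revoke {g['user']} -> {g['resource']}")
```

The point of the re-request path is that a false positive costs the user a few minutes, while a forgotten grant costs nothing, which is why the bias toward revoking works in practice.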

392
00:18:39,305 --> 00:18:41,815
It's always really challenging because of all the lateral

393
00:18:41,815 --> 00:18:44,875
movements within organizations, um, temporary access,

394
00:18:44,885 --> 00:18:47,775
time bound access, or if someone leaves the company.

395
00:18:47,894 --> 00:18:52,285
But what we did, what we did was we hooked it onto Workday APIs.

396
00:18:52,615 --> 00:18:56,565
So depending on like your role or depending on the org or the organization

397
00:18:56,565 --> 00:18:59,985
that you were in, you would get whatever access was applicable for that.

398
00:19:00,095 --> 00:19:03,855
But if you changed orgs, it would ideally drop or remove that access.
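
That HR-driven reconciliation can be sketched as a simple rule, assuming a hypothetical mapping from org to baseline permissions (the org names and grant strings below are invented for illustration; a real system would pull the org change from an HR API like Workday's):

```python
# Hypothetical org -> baseline permission sets, keyed off the HR system's org field.
ROLE_GRANTS = {
    "security-eng": {"github:security-tools", "pagerduty:oncall"},
    "data-platform": {"redshift:analytics", "airflow:prod"},
}

def reconcile_on_org_change(current_access, old_org, new_org):
    """Drop grants that belonged only to the old org, add the new org's baseline."""
    old = ROLE_GRANTS.get(old_org, set())
    new = ROLE_GRANTS.get(new_org, set())
    # Keep anything granted outside the org mapping (e.g., individually approved).
    return (current_access - (old - new)) | new
```

Note the design choice: individually approved grants survive the org move, which is exactly the gap the usage-based expiry above is meant to catch.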

399
00:19:03,855 --> 00:19:07,194
So it was early days of it, but it was

400
00:19:07,195 --> 00:19:09,404
working well when I left.

401
00:19:09,404 --> 00:19:09,545
So I

402
00:19:09,545 --> 00:19:10,735
hope, I hope it still is.

403
00:19:11,480 --> 00:19:15,120
Keeping those roles and org chart in sync is extremely difficult.

404
00:19:15,520 --> 00:19:18,380
Not just that, but when you move from different job families, like going

405
00:19:18,380 --> 00:19:21,930
from an SA where you touch code, but you don't touch production code.

406
00:19:21,990 --> 00:19:23,990
And then all of a sudden you're in production code.

407
00:19:25,110 --> 00:19:27,660
Like I had so many permission issues just because it still

408
00:19:27,660 --> 00:19:30,840
thought I was an SA when I was a dev and it was always confused.

409
00:19:31,349 --> 00:19:34,010
What role do you feel like automation and machine

410
00:19:34,010 --> 00:19:36,170
learning are going to play in the future of security?

411
00:19:36,370 --> 00:19:38,020
Because you said that you do, you did work

412
00:19:38,020 --> 00:19:41,185
on a, um, machine learning tool, right?

413
00:19:41,754 --> 00:19:47,225
The biggest one is around like being able to detect threats faster and smarter.

414
00:19:47,225 --> 00:19:51,634
So once you have like a vast amount of data and you can

415
00:19:51,634 --> 00:19:55,575
kind of like see similarities and identify anomalies.

416
00:19:55,624 --> 00:20:01,504
And in real time, I think AI will definitely help with faster,

417
00:20:01,534 --> 00:20:05,464
better, smarter, real time threat detection, responding to potential

418
00:20:05,464 --> 00:20:08,534
threats, like blocking access if it's unauthorized, if it looks

419
00:20:08,534 --> 00:20:11,204
malicious, or if you see incoming traffic in the network that

420
00:20:11,204 --> 00:20:14,924
looks, it looks suspicious, it could stop it before escalating.

421
00:20:14,924 --> 00:20:18,384
So I think that will be a high ticket area

422
00:20:18,384 --> 00:20:20,774
where AI and automation will help a lot.

423
00:20:21,054 --> 00:20:25,024
Do you feel like there's any areas where AI is going to make us more vulnerable?

424
00:20:25,034 --> 00:20:28,274
Yeah. In the future, with us giving it access to so many things.

425
00:20:28,839 --> 00:20:32,269
It will get better around social engineering and like phishing

426
00:20:32,299 --> 00:20:35,719
and that area and realm of things.

427
00:20:36,089 --> 00:20:38,849
I, even yesterday I was with a friend and they got like

428
00:20:38,849 --> 00:20:42,409
a credit card fraud email alert when we were in Colombia.

429
00:20:42,659 --> 00:20:45,589
And I was like, you don't even have, it was a Chase, Chase card.

430
00:20:45,589 --> 00:20:48,739
I was like, I've never even seen you use Chase over the past four days.

431
00:20:48,739 --> 00:20:50,089
Like, it's probably not real.

432
00:20:50,339 --> 00:20:52,279
And they were like, yeah, Yasmin, like, I think it's real.

433
00:20:52,539 --> 00:20:53,999
And I was like, Oh, okay, whatever.

434
00:20:54,029 --> 00:20:57,359
And then they kept doing their thing. 20, 30 minutes later, they

435
00:20:57,359 --> 00:21:00,589
were like, yeah, they even have the same like four digits of like my card.

436
00:21:00,589 --> 00:21:02,093
And I was like, I'm telling you, like, I don't

437
00:21:02,093 --> 00:21:04,219
even, I haven't even seen you pull out a Chase card.

438
00:21:04,219 --> 00:21:06,159
Like you shouldn't, this is phishing.

439
00:21:06,159 --> 00:21:07,909
The email looked so real.

440
00:21:08,109 --> 00:21:11,644
And then I think after, I think it was very fine tuned.

441
00:21:11,644 --> 00:21:13,334
I don't remember what the exact detail was.

442
00:21:13,574 --> 00:21:16,214
And he was like, Oh my gosh, this is actually phishing.

443
00:21:16,224 --> 00:21:17,614
And I was like, I told you from the jump.

444
00:21:17,614 --> 00:21:20,604
Like, I don't, like, I don't understand why you didn't listen to me, but I

445
00:21:20,614 --> 00:21:24,304
think it'll just get really smart and really good at all of social engineering

446
00:21:24,304 --> 00:21:27,544
and like phishing campaigns and spear phishing and all of those things.

447
00:21:28,099 --> 00:21:31,849
I don't know how and where they got, like, our trip location, the card

448
00:21:31,849 --> 00:21:36,149
form, the last four, all those details, like, the tea was exactly right.

449
00:21:36,359 --> 00:21:38,479
And then, you know, he almost fell victim to

450
00:21:38,479 --> 00:21:40,749
it, but, but thankfully, thankfully I was there.

451
00:21:40,829 --> 00:21:41,499
Saved the day.

452
00:21:41,599 --> 00:21:45,479
But I think it, uh, being your friend has to be a total flex.

453
00:21:45,480 --> 00:21:49,454
But like, isn't it crazy though, with all the information that we

454
00:21:49,514 --> 00:21:52,444
give out, like there's been so many times I've had to stop my friends

455
00:21:52,454 --> 00:21:55,794
and they're like, I'm gonna go like, do one of those like, surveys on

456
00:21:55,804 --> 00:21:58,534
Facebook and I'm like, you just gave eight people your passwords, but okay.

457
00:21:59,634 --> 00:22:02,194
Like, you're just like, there's so many different ways, like people

458
00:22:02,194 --> 00:22:05,054
are always giving their location on social media, then they're always

459
00:22:05,054 --> 00:22:08,169
talking about how they're not home and I'm just like, Can y'all just,

460
00:22:09,319 --> 00:22:11,879
yeah, I think you bring up a really good point too.

461
00:22:11,889 --> 00:22:16,599
Cause like for as long as I've been adjacent to security and interested in

462
00:22:16,599 --> 00:22:21,809
security, we've basically always told people like your instincts suck, right?

463
00:22:21,809 --> 00:22:23,139
Like your passwords suck.

464
00:22:23,139 --> 00:22:25,039
You are all of these things that you think are

465
00:22:25,039 --> 00:22:28,199
unique or random and computers can't hack into it.

466
00:22:28,199 --> 00:22:29,649
Like, nah, just don't trust any of that stuff.

467
00:22:29,649 --> 00:22:34,119
Hand off all that stuff to a password manager, certificates,

468
00:22:34,129 --> 00:22:35,849
all these other things that are external to you.

469
00:22:36,159 --> 00:22:37,279
But when it comes to like this.

470
00:22:37,739 --> 00:22:41,939
Phishing attacks and AI generation, none of them pass the vibe check.

471
00:22:42,189 --> 00:22:44,179
If you like have any experience, right?

472
00:22:44,179 --> 00:22:46,699
And like, immediately you're like, this vibe is off.

473
00:22:46,779 --> 00:22:47,859
Don't trust it.

474
00:22:48,009 --> 00:22:51,159
But they were like, no, no, the bank has told me I have to trust them.

475
00:22:51,159 --> 00:22:54,429
And so I'm externally mounted, like all of my, all these

476
00:22:54,439 --> 00:22:57,469
systems I have to go through to make sure I don't lose my money.

477
00:22:57,470 --> 00:22:58,179
Right.

478
00:22:58,179 --> 00:22:59,169
And that's like a big risk.

479
00:22:59,169 --> 00:23:00,229
And that's like, but.

480
00:23:00,499 --> 00:23:03,699
You know, that, that picture of that person has 18 fingers, right?

481
00:23:03,699 --> 00:23:04,449
Like don't trust it.

482
00:23:04,489 --> 00:23:06,289
Like there's some level here that you just

483
00:23:06,299 --> 00:23:07,709
have to be able to like trust yourself.

484
00:23:07,709 --> 00:23:10,229
But in security specifically, we've just

485
00:23:10,239 --> 00:23:11,899
always told people they're terrible at it.

486
00:23:11,939 --> 00:23:13,599
And now we're like reversing some of that.

487
00:23:17,559 --> 00:23:20,029
Running Kubernetes at scale is challenging.

488
00:23:20,429 --> 00:23:24,339
Running Kubernetes at scale securely is even more challenging.

489
00:23:24,559 --> 00:23:27,399
Access management and user management are some of the most

490
00:23:27,399 --> 00:23:30,639
important tools that we have today to be able to secure

491
00:23:30,789 --> 00:23:33,639
your Kubernetes cluster and protect your infrastructure.

492
00:23:33,839 --> 00:23:36,539
Using Tremolo Security with OpenUnison is the

493
00:23:36,659 --> 00:23:39,859
easiest way, whether it be on prem or in the cloud,

494
00:23:40,239 --> 00:23:42,839
to simplify access management to your cluster.

495
00:23:42,969 --> 00:23:47,089
It provides single sign-on and helps you with its robust security

496
00:23:47,089 --> 00:23:50,549
features to secure your cluster and automate your workflow.

497
00:23:50,549 --> 00:23:54,799
So check out Tremolo Security for your single sign-on needs in Kubernetes.

498
00:23:55,129 --> 00:23:57,339
You can find them at

499
00:23:57,340 --> 00:24:00,869
fafo.fm/tremolo.

500
00:24:00,869 --> 00:24:06,189
That's T R E M O L O.

501
00:24:13,659 --> 00:24:15,559
Yeah, and I think that's it.

502
00:24:15,619 --> 00:24:20,269
It's still, phishing is still the number one way that organizations get hacked.

503
00:24:20,269 --> 00:24:21,369
It's always through people.

504
00:24:21,369 --> 00:24:23,299
It's always through their lack of education.

505
00:24:23,309 --> 00:24:26,679
So, I always try to help organizations educate their

506
00:24:26,679 --> 00:24:30,279
employees through phishing, like mock phishing emails.

507
00:24:30,299 --> 00:24:31,889
I actually set a campaign up at Snapchat.

508
00:24:31,909 --> 00:24:36,319
Snap, where we would send mock phishing emails to employees just to

509
00:24:36,319 --> 00:24:39,939
see what the click through rate was, how many of them clicked the link,

510
00:24:39,939 --> 00:24:42,909
but then also entered their credentials in the link and then downloaded

511
00:24:42,909 --> 00:24:47,299
files, and then we had some very nice follow up calls from that.

512
00:24:47,549 --> 00:24:48,319
Like, look at this pie chart.

513
00:24:48,329 --> 00:24:49,849
You all opened a PDF.

514
00:24:50,169 --> 00:24:53,169
Wouldn't it be funny if, like, you sent an

515
00:24:53,179 --> 00:24:55,829
email to see what people would click on or whatever,

516
00:24:55,829 --> 00:24:58,419
and then, like, a big pop up came and it was like, you failed?

517
00:25:01,099 --> 00:25:01,179
That's

518
00:25:01,179 --> 00:25:04,759
actually, that's exactly what we did. So like, if they did click on the link

519
00:25:04,759 --> 00:25:08,469
or if they downloaded it, it would be like, boom, you have failed,

520
00:25:08,469 --> 00:25:12,069
like now you have a mandatory education training that you have to go to.

521
00:25:12,179 --> 00:25:15,849
So it wasn't just like a simulation for us to kind of see like how,

522
00:25:16,049 --> 00:25:18,889
like the posture and the health of the organization, but also like we

523
00:25:18,899 --> 00:25:22,889
very much so sent them to like a mandatory training, um, and awareness.
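
A campaign like that boils down to tracking a few outcomes per employee and routing failures into mandatory training. This is a toy sketch with made-up data, not the tooling Snap actually used:

```python
from collections import Counter

# Simulated per-employee outcomes from one mock phishing email (illustrative data).
EVENTS = [
    {"user": "a@corp", "clicked": True,  "entered_creds": False, "downloaded": False},
    {"user": "b@corp", "clicked": True,  "entered_creds": True,  "downloaded": True},
    {"user": "c@corp", "clicked": False, "entered_creds": False, "downloaded": False},
]

def campaign_report(events):
    """Return click/credential/download rates and who gets mandatory training."""
    counts = Counter()
    needs_training = []
    for e in events:
        for key in ("clicked", "entered_creds", "downloaded"):
            counts[key] += int(e[key])
        if e["clicked"] or e["downloaded"]:
            needs_training.append(e["user"])
    rates = {key: counts[key] / len(events) for key in counts}
    return rates, needs_training
```

The rates are what you roll up to leadership per org, and the failure list is what drives the follow-up training assignments.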

524
00:25:24,109 --> 00:25:25,819
We keep leaning back on that, like, we

525
00:25:25,819 --> 00:25:28,139
need to educate people to get beyond this.

526
00:25:28,139 --> 00:25:31,739
But at the other end, we're like, we want machine learning to do the vibe check.

527
00:25:32,169 --> 00:25:34,479
And at some point, like, I don't know that machines are going to

528
00:25:34,479 --> 00:25:37,679
get the vibes, but people aren't getting the education either.

529
00:25:37,689 --> 00:25:39,069
And so I don't know where that meets in

530
00:25:39,079 --> 00:25:41,199
the middle of like, both these sides suck.

531
00:25:41,389 --> 00:25:43,579
But not just that, but we also constantly

532
00:25:43,579 --> 00:25:45,829
talk about, like, least privilege, right?

533
00:25:46,099 --> 00:25:48,269
the principle of least privilege, but now we

534
00:25:48,269 --> 00:25:51,499
want to give machines access to everything.

535
00:25:51,779 --> 00:25:55,899
We've given like AI so much data, there's so many companies that are piping

536
00:25:55,899 --> 00:26:01,119
their own data back into their AI, and then they're giving it privileges

537
00:26:01,139 --> 00:26:04,819
to infrastructure, giving it privileges to data, giving privileges to

538
00:26:04,839 --> 00:26:08,509
like their code bases and to writing their code bases, and I'm just like

539
00:26:08,879 --> 00:26:12,519
I mean, I wish I knew more about it, but I'm like, how many safeguards

540
00:26:12,519 --> 00:26:15,879
are in the different, like, areas that these things are talking to?

541
00:26:15,879 --> 00:26:16,479
People gotta get their jobs done.

542
00:26:16,529 --> 00:26:17,519
And like, the AI

543
00:26:17,519 --> 00:26:19,189
systems are the new Jenkins, right?

544
00:26:19,189 --> 00:26:22,459
Because the CICD systems were the place that every hacker went to attack

545
00:26:22,469 --> 00:26:25,019
because it had all the credentials and all the access, all the automation.

546
00:26:25,239 --> 00:26:25,439
That's what I'm saying.

547
00:26:25,439 --> 00:26:29,289
Like, and like, just working in production, like, I think getting a degree

548
00:26:29,289 --> 00:26:32,109
that was about, like, secure software development, I actually went to the

549
00:26:32,109 --> 00:26:34,759
same school you went to, but the online, like, military version of it.

550
00:26:34,799 --> 00:26:38,039
And it's wild, like, what people do in real life production,

551
00:26:38,039 --> 00:26:40,979
because things don't always work the simplest ways.

552
00:26:40,989 --> 00:26:41,679
Like you know what I mean?

553
00:26:41,679 --> 00:26:45,219
Sometimes there is like a weird way that you have to give

554
00:26:45,219 --> 00:26:47,899
something permission to do that or make it so it's automated so

555
00:26:47,899 --> 00:26:50,259
you can release a bunch of versions at once or just something.

556
00:26:50,269 --> 00:26:53,689
And it's, you'd be surprised the amount of, like, I, I was

557
00:26:54,009 --> 00:26:57,559
on a business intelligence team, and they were testing on

558
00:26:57,559 --> 00:27:00,809
Redshift, like, clusters, and I was like, what are we doing?

559
00:27:01,449 --> 00:27:03,869
Like, you know, like, I was the most junior person,

560
00:27:03,889 --> 00:27:06,339
and I'm like, can we, this, this is a bad idea.

561
00:27:06,649 --> 00:27:10,319
There's so many different layers to what you can do in production, and sometimes

562
00:27:10,319 --> 00:27:14,719
you have to do something quickly, and I'm just like, It's this bad when we know

563
00:27:14,779 --> 00:27:18,359
the principle of least privilege and we're humans, and then we're going to

564
00:27:18,369 --> 00:27:23,729
give machines access to all these different levels of data at the same time.

565
00:27:24,479 --> 00:27:26,009
It's going to make it so much easier.

566
00:27:26,039 --> 00:27:29,169
You hack one thing and you get the keys to the candy store for everything.

567
00:27:29,389 --> 00:27:34,029
And I think that's why we set up like systems in place at Snap, where we

568
00:27:34,029 --> 00:27:37,124
would be able to see if you were putting anything into these AI systems.

569
00:27:37,454 --> 00:27:41,284
Like we would send alerts, like, for data exfiltration.

570
00:27:41,284 --> 00:27:45,384
So like copying any source code or copying any documentation, see the source.

571
00:27:45,394 --> 00:27:48,174
And then we would be able to flag it in, in ChatGPT

572
00:27:48,174 --> 00:27:50,474
or any of, any of these models, but I think you're right.
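
A crude version of that kind of flagging is just pattern-matching outbound text against code-like markers when the destination is a watched AI tool. The domains and patterns below are purely illustrative, not what Snap deployed:

```python
import re

AI_WATCHLIST = {"chat.openai.com", "gemini.google.com"}  # illustrative destinations
CODE_MARKERS = [
    re.compile(r"\bdef \w+\s*\("),                      # Python function definitions
    re.compile(r"\bclass \w+"),                         # class declarations
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),  # pasted private keys
]

def should_alert(destination_host, pasted_text):
    """Alert when code-looking text is headed to a watched AI destination."""
    if destination_host not in AI_WATCHLIST:
        return False
    return any(p.search(pasted_text) for p in CODE_MARKERS)
```

Real DLP products do much more (fingerprinting, classification), but the shape of the check, destination plus content heuristics, is the same.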

573
00:27:50,474 --> 00:27:53,234
If you hand them over your source code or anything

574
00:27:53,234 --> 00:27:56,104
like that, and God forbid, I mean, I hope that no one's

575
00:27:56,114 --> 00:27:59,094
storing keys or any, any credentials in code these days.

576
00:27:59,154 --> 00:27:59,724
But how many

577
00:27:59,724 --> 00:28:02,904
times do like, there's literally a bot that goes around Google

578
00:28:02,904 --> 00:28:05,824
tell, I mean, not Google, but GitHub, telling people you put your

579
00:28:05,824 --> 00:28:09,064
keys on the internet. Like, cause how often do we do it on accident?
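
Bots like that mostly run regexes for well-known key formats over committed text. A minimal sketch using two publicly documented patterns (real scanners cover hundreds of formats and add entropy checks):

```python
import re

# Two widely documented secret shapes: AWS access key IDs and PEM private key headers.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_block": re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
}

def find_secrets(text):
    """Names of secret patterns that appear anywhere in the given text."""
    return sorted(name for name, pat in SECRET_PATTERNS.items() if pat.search(text))
```
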

580
00:28:09,244 --> 00:28:12,264
I remember I was sitting at Google Next and they were like,

581
00:28:12,324 --> 00:28:14,534
we're going to, it's going to write your infrastructure.

582
00:28:14,544 --> 00:28:16,904
It's going to write your app and then it's going to make a database.

583
00:28:16,914 --> 00:28:18,924
And I'm just sitting there like, Oh no.

584
00:28:19,134 --> 00:28:22,304
And then they exposed the EC2 instance name.

585
00:28:22,304 --> 00:28:27,854
And I was like, Oh, like on the stage, like at their keynote.

586
00:28:27,854 --> 00:28:32,124
And I was just like, Like, my little security heart died.

587
00:28:33,084 --> 00:28:37,054
Like, I was like, y'all, this is like a 101 of what we

588
00:28:37,054 --> 00:28:38,194
should not do in public.

589
00:28:39,054 --> 00:28:42,224
And I think for me, it's like the biggest things, like the

590
00:28:42,224 --> 00:28:45,214
most easy to catch, or like the easiest, the most obvious

591
00:28:45,234 --> 00:28:48,994
vulnerabilities are right in front of you. Sometimes people just overlook them.

592
00:28:49,074 --> 00:28:51,744
And a lot of these, a lot of these vulnerabilities

593
00:28:51,754 --> 00:28:54,224
that happen are sometimes the most obvious.

594
00:28:54,284 --> 00:28:56,574
The biggest hacks are the ones that walked through the door.

595
00:28:56,594 --> 00:28:59,154
Like Target got taken down by a literal least

596
00:28:59,154 --> 00:29:01,764
privilege failure because they gave access to a contractor.

597
00:29:01,764 --> 00:29:04,374
Like it's, it's never something crazy.

598
00:29:04,374 --> 00:29:06,684
Like I think the only thing that we've really thought was really

599
00:29:06,684 --> 00:29:10,374
crazy was that guy who did the social engineering to make the

600
00:29:10,414 --> 00:29:13,214
maintainer really depressed, like, to get the binaries in

601
00:29:13,214 --> 00:29:16,404
and then, you know... Yes, that was the only one. Think about it.

602
00:29:16,414 --> 00:29:19,744
Out of all the news, that dude, like, look, he

603
00:29:19,744 --> 00:29:22,264
deserves to get like something named after him.

604
00:29:22,384 --> 00:29:24,874
Like I was like, I, like, I can't even be mad at you.

605
00:29:24,874 --> 00:29:26,444
That was the Trojan horse of 2024.

606
00:29:27,494 --> 00:29:27,804
Okay.

607
00:29:27,854 --> 00:29:30,924
Like, but most of the time they walk right in the door.

608
00:29:31,424 --> 00:29:33,174
What's something, Yasmin, that you think

609
00:29:33,629 --> 00:29:36,839
is, is commonly said, you should do this thing, but

610
00:29:36,839 --> 00:29:39,209
it's mostly just security theater and doesn't matter.

611
00:29:39,829 --> 00:29:40,009
Right.

612
00:29:40,019 --> 00:29:41,259
Is there something that they're like, Oh, this

613
00:29:41,259 --> 00:29:43,579
is the advice that the news will tell you.

614
00:29:43,579 --> 00:29:46,259
And you're like, actually just don't like, it doesn't matter.

615
00:29:46,259 --> 00:29:51,639
Or, or something that a company's like investing millions of dollars in a thing.

616
00:29:51,949 --> 00:29:53,479
And then you're like, you know, you're probably not going to

617
00:29:53,479 --> 00:29:56,599
get the security outcomes that you want by doing that process.

618
00:29:57,180 --> 00:29:59,740
You know, I think it just goes back to there's a lot of

619
00:29:59,740 --> 00:30:05,140
compliance rules and regulations around mandating data and

620
00:30:05,149 --> 00:30:08,340
maybe just, like, actual security education for companies.

621
00:30:08,550 --> 00:30:12,939
There are laws and regulations now that the government has

622
00:30:12,939 --> 00:30:16,139
passed that say, oh, like, you need to educate your employees.

623
00:30:16,160 --> 00:30:20,060
But sometimes the employees are just clicking through these docs and

623
00:30:16,160 --> 00:30:20,060
submitting okay, or fast-forwarding this video, not actually

624
00:30:20,060 --> 00:30:23,720
watching it. So I think there's a disconnect: we actually need

625
00:30:23,720 --> 00:30:27,209
to educate employees, but how we are doing it is not actually materializing

626
00:30:27,210 --> 00:30:31,510
into anything that's beneficial, because I've surveyed so many people,

627
00:30:31,520 --> 00:30:35,680
like, hey, did you actually watch this or read through this?

629
00:30:38,795 --> 00:30:41,754
Like no, just click to accept, acknowledge and move on.

630
00:30:41,965 --> 00:30:45,565
And I think it just highlights a lot of policies around like privacy

631
00:30:45,565 --> 00:30:48,555
or data, data usage, data deletion, data retention, all of those

632
00:30:48,555 --> 00:30:52,205
things that people just don't really, like they think that, Oh,

633
00:30:52,244 --> 00:30:54,864
my data is secure or like they're not using my data or they're

634
00:30:54,864 --> 00:30:58,115
not retaining it or anything like that when in actuality they are.

635
00:30:58,405 --> 00:31:00,105
There are a lot of fine lines that people are

636
00:31:00,105 --> 00:31:02,785
missing and misreading, or not even reading.

637
00:31:02,845 --> 00:31:05,995
So people think, oh, like, a company doesn't have access to my snaps.

638
00:31:05,995 --> 00:31:06,404
Like, do

639
00:31:06,404 --> 00:31:07,145
they really not?

640
00:31:07,679 --> 00:31:08,550
Are you sure they don't?

641
00:31:09,300 --> 00:31:11,449
I was in SA and we have all this training,

642
00:31:11,449 --> 00:31:13,540
but the training for SDEs was different.

643
00:31:14,010 --> 00:31:16,160
And I remember getting on an SDE team and they

644
00:31:16,160 --> 00:31:18,060
were like, Oh, this customer's having this issue.

645
00:31:18,090 --> 00:31:20,620
And then the other like SDE was like, I'll just log into their account.

646
00:31:20,620 --> 00:31:21,530
And I was like, you're going to do what?

647
00:31:21,569 --> 00:31:21,809
Like,

648
00:31:23,870 --> 00:31:27,500
no, you're not like, it's crazy that like, I mean, we all know,

649
00:31:27,500 --> 00:31:30,750
like, I love security and I think it's interesting, but I definitely

650
00:31:30,760 --> 00:31:34,030
have gone through, like, one of those, like, required trainings.

651
00:31:34,030 --> 00:31:36,449
And I'm just like, this is so boring, but

652
00:31:36,449 --> 00:31:38,310
how do we make better education though?

653
00:31:38,350 --> 00:31:38,749
You know?

654
00:31:39,080 --> 00:31:41,120
It's not only the fact that like, usually it's just

655
00:31:41,129 --> 00:31:42,769
dry content that no one's really interested in.

656
00:31:42,769 --> 00:31:43,489
That's what I'm saying.

657
00:31:43,499 --> 00:31:45,590
And it doesn't really give you the real use case.

658
00:31:45,629 --> 00:31:46,330
Like, you know what I mean?

659
00:31:46,330 --> 00:31:47,590
There, it doesn't really.

660
00:31:47,629 --> 00:31:48,300
I mean, like all the

661
00:31:48,300 --> 00:31:50,909
cartoons and the silly like situations they try to

662
00:31:50,909 --> 00:31:53,359
say, but like, most of the time, any security

663
00:31:53,540 --> 00:31:56,050
training I've been at usually is like, oh,

664
00:31:56,060 --> 00:31:58,580
fit this into your normal schedule, right?

665
00:31:58,679 --> 00:32:01,050
Here's the 37 meetings you have this week.

666
00:32:01,159 --> 00:32:02,839
Here's the things you have to get done for work.

667
00:32:02,870 --> 00:32:04,379
Oh, and there's all this training thing, right?

668
00:32:04,379 --> 00:32:05,949
So we're like, well, I'm going to have to do

669
00:32:05,949 --> 00:32:07,959
this, you know, like as I'm doing something else.

670
00:32:07,999 --> 00:32:09,969
And those are all the times that like, I would hack into

671
00:32:09,969 --> 00:32:12,204
like, or I'd look at the JavaScript and change the timestamp.

672
00:32:12,204 --> 00:32:13,939
I'm like, oh yeah, I watched this for 30 minutes.

673
00:32:13,959 --> 00:32:14,269
Yeah.

674
00:32:14,669 --> 00:32:16,560
I changed my system clock and we can fast forward.

675
00:32:16,560 --> 00:32:17,379
I didn't even think

676
00:32:17,379 --> 00:32:17,949
about that.

677
00:32:17,949 --> 00:32:18,379
Justin.

678
00:32:18,380 --> 00:32:20,759
They're all time based and you're just like, oh.

679
00:32:21,364 --> 00:32:23,024
A computer doesn't know what time is.

680
00:32:23,054 --> 00:32:23,495
I do.

681
00:32:23,514 --> 00:32:24,784
Let me skip past these parts.

682
00:32:27,844 --> 00:32:29,284
That's like the things I learned about it

683
00:32:29,284 --> 00:32:31,094
was like, Oh, you did client side validation.

684
00:32:31,104 --> 00:32:31,794
You're an idiot.

685
00:32:31,794 --> 00:32:32,004
Right?

686
00:32:32,004 --> 00:32:36,074
Like, that's like, we can bypass some of that stuff because again, it wasn't, it

687
00:32:36,074 --> 00:32:40,484
was a priority enough to get the checkboxes that people are trained, but not to give

688
00:32:40,484 --> 00:32:44,745
them time to learn something or give them a person to ask questions to, right?

689
00:32:44,745 --> 00:32:47,980
Like sit down with someone like, Pair programming is a thing

690
00:32:47,990 --> 00:32:50,760
because it's like, wow, we learned so much by just watching

691
00:32:50,760 --> 00:32:53,460
someone else, an expert in their field, do something, or even

692
00:32:53,460 --> 00:32:55,940
not even an expert, just someone else with a different approach.

693
00:32:56,349 --> 00:32:58,919
I don't even want to invest time into pair programming though.

694
00:32:59,009 --> 00:33:01,409
And like, look at all the studies that show how fast

695
00:33:01,420 --> 00:33:03,589
that helps people to ramp up and they're like, Oh no.

696
00:33:03,790 --> 00:33:03,899
I

697
00:33:03,899 --> 00:33:05,629
mean, pair debugging is like the best

698
00:33:05,629 --> 00:33:07,580
experience I've had in my engineering career.

699
00:33:07,580 --> 00:33:09,760
It was like watching someone else use 18 different tools

700
00:33:09,760 --> 00:33:11,889
to debug something like, what was that command you ran?

701
00:33:11,889 --> 00:33:12,769
I'm writing that down.

702
00:33:12,769 --> 00:33:13,939
I'm going to read the man page later.

703
00:33:13,939 --> 00:33:14,290
This is amazing.

704
00:33:14,290 --> 00:33:14,429
Which

705
00:33:14,429 --> 00:33:17,679
is wild because like everybody can steal code from somewhere,

706
00:33:17,679 --> 00:33:21,784
but debugging is like you will always have to debug something.

707
00:33:21,894 --> 00:33:23,874
Yeah, I was gonna agree.

708
00:33:23,874 --> 00:33:27,604
I think, to the first point, that's why we did the

709
00:33:27,614 --> 00:33:32,594
real live phishing mock simulations where it wasn't like a manual or like a

710
00:33:32,594 --> 00:33:36,404
video or like a document that said, Hey, I read this, but it was

711
00:33:36,404 --> 00:33:40,465
a real live simulation where, okay, like you actually read the email, you

712
00:33:40,465 --> 00:33:44,760
clicked on it and then boom, like now you're like, Oh. And then especially

713
00:33:44,760 --> 00:33:48,459
when you CC their managers or like leadership, and it's like, your org is

714
00:33:48,469 --> 00:33:51,909
in charge of 30 percent of this simulation that we ran, and this could lead

715
00:33:51,909 --> 00:33:55,549
to how many millions of dollars or how much user data could be exposed.

716
00:33:55,810 --> 00:33:57,510
So then leadership is like, okay,

717
00:33:57,510 --> 00:33:59,270
like we actually really have to invest.

718
00:33:59,270 --> 00:34:02,350
And it's like, okay, if you already got caught and your team and your org is

719
00:34:02,350 --> 00:34:06,530
like performing very poorly at this, it just becomes so much more impactful.

720
00:34:06,530 --> 00:34:09,069
So then people start to take it seriously.

721
00:34:09,069 --> 00:34:09,380
So.

722
00:34:09,710 --> 00:34:12,909
After you get caught in that kind of a pop up, you're probably going

723
00:34:12,909 --> 00:34:16,290
to pay attention to that class that you got sent to and you're never

724
00:34:16,290 --> 00:34:18,440
clicking on another email link that you don't know about again.

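The mechanics behind a live phishing simulation like the one Yasmin describes are simple: every recipient gets a unique tracking link, so a click can be attributed to exactly one person and rolled up per org. A minimal Python sketch; the landing URL and email copy here are invented placeholders, not noHack's actual platform:

```python
import uuid
from urllib.parse import urlencode

def build_simulation_email(recipient: str, landing_url: str) -> tuple[str, str]:
    """Build a mock-phishing email whose link carries a per-recipient token,
    so a click can be attributed to exactly one employee."""
    token = uuid.uuid4().hex
    link = f"{landing_url}?{urlencode({'t': token})}"
    body = (
        f"Hi {recipient},\n\n"
        "Your account password expires today. Click below to keep access:\n"
        f"{link}\n"
    )
    return token, body

# The URL is a placeholder for whatever the simulation platform serves.
token, body = build_simulation_email("alice", "https://sim.example.com/landing")
print(token in body)  # True
```

A real platform would also log the click timestamp and serve a teachable landing page rather than a scare screen.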
725
00:34:20,080 --> 00:34:20,930
Is there an offset for that?

726
00:34:20,930 --> 00:34:23,959
Like, cause you can't care about everything and you can't pay attention

727
00:34:23,959 --> 00:34:28,759
to everything, but there are a set of maybe this is more relevant now.

728
00:34:28,800 --> 00:34:30,449
And I've been subscribed to

729
00:34:30,955 --> 00:34:33,505
Have I Been Pwned for I don't know how long.

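Have I Been Pwned also exposes a k-anonymity API for checking passwords: only the first five hex characters of the password's SHA-1 ever leave your machine, and the matching suffixes are compared locally. A sketch of that split, assuming the documented SUFFIX:COUNT text format of the Pwned Passwords range endpoint:

```python
import hashlib

def hibp_range_query(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 into the 5-char prefix that is sent to the
    Pwned Passwords range API and the suffix that stays on your machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def is_pwned(password: str, range_response: str) -> bool:
    """Check the local suffix against the 'SUFFIX:COUNT' lines returned by
    GET https://api.pwnedpasswords.com/range/<prefix>."""
    _, suffix = hibp_range_query(password)
    return any(line.split(":")[0] == suffix for line in range_response.splitlines())

prefix, suffix = hibp_range_query("password")
print(prefix)  # 5BAA6
```

The server never learns which of the few hundred suffixes behind that prefix you were asking about.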
730
00:34:33,875 --> 00:34:36,165
And I've gotten so many emails after so many times.

731
00:34:36,444 --> 00:34:38,024
I don't read them anymore because I'm like,

732
00:34:38,024 --> 00:34:39,804
yeah, there's nothing I can do about this.

733
00:34:39,804 --> 00:34:41,165
My data got leaked somewhere.

734
00:34:41,455 --> 00:34:44,045
Someone else didn't secure it the right way, or

735
00:34:44,045 --> 00:34:45,814
someone got a phishing attack and they got in the door.

736
00:34:45,824 --> 00:34:48,374
I'm like, I can't do anything about this anymore.

737
00:34:48,755 --> 00:34:49,944
Now it's just noise.

738
00:34:49,975 --> 00:34:51,354
And at that point I stopped caring.

739
00:34:51,354 --> 00:34:53,255
Originally it was like, I really care about these things.

740
00:34:53,274 --> 00:34:55,715
Let me make sure every time I rotate my passwords, all that stuff.

741
00:34:55,715 --> 00:34:58,594
And now I'm just like, I just don't have the time to care.

742
00:34:58,825 --> 00:34:59,495
And I don't have the

743
00:34:59,660 --> 00:35:01,710
memory bandwidth to care anymore.

744
00:35:01,950 --> 00:35:04,809
How do we like eliminate or not eliminate, but

745
00:35:04,809 --> 00:35:07,240
just like reduce the fatigue and help people focus.

746
00:35:07,740 --> 00:35:09,149
Like you can't focus on everything.

747
00:35:09,700 --> 00:35:16,779
I would say, if you have like multi-factor authentication set up

748
00:35:16,779 --> 00:35:20,849
on your accounts and you are like not connecting to like public Wi

749
00:35:20,849 --> 00:35:24,539
Fis and you have like secure best practices, then you'll most likely

750
00:35:24,539 --> 00:35:27,970
be at less risk for these attacks or all the noise that you're

751
00:35:27,970 --> 00:35:30,729
saying that you get from these different applications and stuff.

752
00:35:31,020 --> 00:35:36,080
I would say, yeah, always enable 2FA,

753
00:35:36,080 --> 00:35:38,599
MFA, and secure best practices in your day-to-day workflow.

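The 2FA codes being recommended here are, in most authenticator apps, TOTP: an HMAC over the current 30-second window, truncated to six digits. A stdlib-only sketch of the standard RFC 4226/6238 construction, not any particular authenticator's code:

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def totp(base32_secret: str, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current 30-second time window."""
    secret = base64.b32decode(base32_secret, casefold=True)
    return hotp(secret, int(time.time()) // step, digits)

# RFC 4226 test vector: shared secret "12345678901234567890" at counter 0.
print(hotp(b"12345678901234567890", 0))  # 755224
```

The app and the server both run this computation from a shared secret; a code only matches while its 30-second window (plus whatever clock skew the server tolerates) lasts.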
754
00:35:38,620 --> 00:35:42,220
And, and I think you could take a lesser, lesser

755
00:35:42,220 --> 00:35:44,359
look on some of these, some of these notifications.

756
00:35:45,009 --> 00:35:47,349
And also regularly update your password.

757
00:35:47,399 --> 00:35:51,739
Not every 180 days, but definitely something frequent.

758
00:35:51,739 --> 00:35:54,269
And then I think also like, I mean, I'm not telling

759
00:35:54,279 --> 00:35:56,899
you guys, but maybe other listeners that are not aware,

760
00:35:56,899 --> 00:35:59,759
but don't just update it with like one extra character.

761
00:35:59,819 --> 00:36:02,979
I think that's the most obvious way for you to get hacked.

762
00:36:02,979 --> 00:36:05,799
And a lot of times, your emails have

763
00:36:05,799 --> 00:36:08,329
already been in databases where they've been compromised.

764
00:36:08,339 --> 00:36:10,959
So you adding one additional character is

765
00:36:10,959 --> 00:36:12,629
not really going to make it more secure.

766
00:36:12,705 --> 00:36:15,384
One of my first and favorite projects when I started at Disney

767
00:36:15,384 --> 00:36:19,115
Animation was they wanted to see like, Hey, can you use John the

768
00:36:19,115 --> 00:36:23,845
Ripper to look at whose passwords are easy to hack.

769
00:36:24,105 --> 00:36:25,285
And I'm like, sure.

770
00:36:25,314 --> 00:36:26,594
Could you give me the LDAP dump?

771
00:36:26,595 --> 00:36:28,974
And they're like, Oh yeah, literally, here's admin access.

772
00:36:29,055 --> 00:36:30,214
Go, go, go get the dump.

773
00:36:30,555 --> 00:36:33,444
And then get all the hashes from it and then see what

774
00:36:33,444 --> 00:36:36,234
John the Ripper could do, and we had a render farm and it was a

775
00:36:36,234 --> 00:36:38,564
Christmas break and we didn't have a lot of stuff to do.

776
00:36:38,564 --> 00:36:40,584
So I'm like, how much of the render farm can I use?

777
00:36:40,919 --> 00:36:42,309
To start this John the Ripper process.

778
00:36:42,319 --> 00:36:43,229
Like, you can have a rack.

779
00:36:43,249 --> 00:36:44,229
And I'm like, cool.

780
00:36:44,339 --> 00:36:45,709
That's, I get a bunch of machines.

781
00:36:45,719 --> 00:36:48,389
Let me just, and the amount of passwords that were just, like,

782
00:36:48,994 --> 00:36:51,054
very basic.

783
00:36:51,754 --> 00:36:54,174
I would expect this to be a password at Disney.

784
00:36:54,394 --> 00:36:56,254
And they'd just increment a number.

785
00:36:56,254 --> 00:36:58,624
So I'm like, Oh, these aren't secure at all.

786
00:36:58,664 --> 00:37:00,074
And that was 2014.

787
00:37:00,084 --> 00:37:04,024
And basically ever since then, I stopped knowing any of my passwords.

788
00:37:04,094 --> 00:37:06,414
I'm like, no, the password manager is generating everything.

789
00:37:06,664 --> 00:37:07,294
That's not.

790
00:37:07,684 --> 00:37:10,704
You know, like if I know the password, I have 2FA on it, right?

791
00:37:10,704 --> 00:37:12,204
Like we have to have some level of security.

792
00:37:12,204 --> 00:37:15,694
If I had to create this thing out of my head, it's not that random in there.

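"Not knowing your passwords" boils down to letting a CSPRNG pick them. The core of what a password manager's generator does, sketched with Python's stdlib (illustrative, not any specific manager's algorithm):

```python
import math
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Draw each character from the OS's CSPRNG, the way a manager's
    generator does, instead of from a human-memorable pattern."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Entropy is roughly log2(alphabet size) * length bits; 94 printable chars here.
print(round(math.log2(94) * 20))  # 131
```

Around 131 bits is far beyond what a pattern a human invents in their head provides, which is the point of the "it's not that random in there" remark.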
793
00:37:15,814 --> 00:37:17,904
And so, yeah, having that is like one of those things

794
00:37:17,964 --> 00:37:21,384
that "security best practices," like, that quote to me

795
00:37:21,514 --> 00:37:24,664
is always really hard because that always depends, right?

796
00:37:24,664 --> 00:37:26,224
Like it always depends on the context.

797
00:37:26,224 --> 00:37:28,614
That always depends on what the information

798
00:37:28,614 --> 00:37:30,194
is, what the actual system you're using.

799
00:37:30,194 --> 00:37:35,554
If this is an internal AI system at Snap, like I have different best practices

800
00:37:35,794 --> 00:37:40,084
compared to, you know, a forum login that is a throwaway that I don't care about.

801
00:37:40,384 --> 00:37:41,594
Yeah, I agree.

802
00:37:41,674 --> 00:37:42,954
I was just going to add to that.

803
00:37:42,954 --> 00:37:46,274
I mean, it definitely depends on what context you're speaking

804
00:37:46,274 --> 00:37:48,984
about, but password managers, like you mentioned, I think

805
00:37:48,984 --> 00:37:51,694
something that's always really important is endpoint protection.

806
00:37:51,864 --> 00:37:53,514
So always making sure

807
00:37:53,659 --> 00:37:55,379
Updates are in sync.

808
00:37:55,429 --> 00:37:57,849
You have security patches, firewalls,

809
00:37:57,909 --> 00:38:00,919
antiviruses, anything like that is super important.

810
00:38:00,919 --> 00:38:03,609
I know a lot of people are probably familiar with password

811
00:38:03,649 --> 00:38:06,739
managers, but not as much with, Hey, we're not just

812
00:38:06,739 --> 00:38:09,359
sending you these pop-ups because your device is not updated.

813
00:38:09,359 --> 00:38:11,359
It's probably some security patches

814
00:38:11,359 --> 00:38:12,839
that need to be updated in that as well.

815
00:38:12,839 --> 00:38:13,109
So.

816
00:38:13,409 --> 00:38:15,219
In my opinion, one of the best and worst

817
00:38:15,219 --> 00:38:17,439
things that Microsoft did for the security

818
00:38:17,994 --> 00:38:22,364
ecosystem is reliably releasing updates on the second Tuesday of the month.

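That predictability is easy to build tooling around; the second Tuesday is a short date computation:

```python
import datetime

def patch_tuesday(year: int, month: int) -> datetime.date:
    """Second Tuesday of a month, Microsoft's long-standing update day."""
    first = datetime.date(year, month, 1)
    # Monday is weekday 0, so Tuesday is 1: step to the first Tuesday, then add a week.
    first_tuesday = first + datetime.timedelta(days=(1 - first.weekday()) % 7)
    return first_tuesday + datetime.timedelta(weeks=1)

print(patch_tuesday(2025, 1))  # 2025-01-14
```

That is exactly the kind of date an admin can block out in advance for test-and-rollout windows.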
819
00:38:22,424 --> 00:38:25,124
And I was a Windows system admin when that was happening.

820
00:38:25,134 --> 00:38:26,454
It was always, Oh, second Tuesday's here.

821
00:38:26,454 --> 00:38:27,284
We got to go through tests.

822
00:38:27,294 --> 00:38:30,104
We would block out time because they were predictable.

823
00:38:30,274 --> 00:38:32,394
And then we can say, Oh, I can build predictability into

824
00:38:32,394 --> 00:38:34,334
my schedule for how I'm going to roll these out, where

825
00:38:34,334 --> 00:38:35,884
I'm going to roll them out, how I'm going to test them.

826
00:38:36,494 --> 00:38:39,684
But the downside of that is, they weren't

827
00:38:39,810 --> 00:38:43,864
equally prioritized, as far as like, sometimes there was

828
00:38:43,864 --> 00:38:46,664
a zero day that was actively exploited across the world.

829
00:38:46,894 --> 00:38:48,774
And it just came out normally on a Tuesday.

830
00:38:48,774 --> 00:38:49,554
That's just like, Oh yeah.

831
00:38:49,554 --> 00:38:52,254
Also Excel crashes once or twice, right?

832
00:38:52,414 --> 00:38:55,114
It's just like, Oh, this thing is critically important in this other thing.

833
00:38:55,124 --> 00:38:57,904
And I can't tell you how many times I've been in situations

834
00:38:57,904 --> 00:39:01,004
where the infrastructure wasn't kept up to date and that

835
00:39:01,344 --> 00:39:07,214
helped us not have a CVE because the CVE was in the recent four

836
00:39:07,214 --> 00:39:09,414
releases and we're like, Oh, we're, we're six versions old.

837
00:39:09,414 --> 00:39:09,964
We're good.

838
00:39:09,964 --> 00:39:10,134
Right?

839
00:39:10,134 --> 00:39:11,254
This wasn't introduced yet.

840
00:39:11,254 --> 00:39:16,364
That bug, that CVE, that security hack that was being critically exploited

841
00:39:16,364 --> 00:39:20,494
somewhere like now we don't have to update because we, we were never vulnerable.

842
00:39:20,494 --> 00:39:22,714
And I can't tell you how many times that has happened to me.

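The "we were never vulnerable" check is just a version-range comparison against the advisory's affected range. A naive sketch; real scanners use semver-aware matchers and advisory formats like OSV rather than bare tuple comparisons:

```python
def version_tuple(v: str) -> tuple[int, ...]:
    """Parse '1.2.3' into a comparable tuple of ints (naive on purpose)."""
    return tuple(int(part) for part in v.split("."))

def is_affected(installed: str, introduced: str, fixed: str) -> bool:
    """True when installed falls in [introduced, fixed), the usual shape
    of a CVE's affected-version range."""
    return version_tuple(introduced) <= version_tuple(installed) < version_tuple(fixed)

# Being several versions behind the introducing release means never vulnerable:
print(is_affected("1.8.2", introduced="2.4.0", fixed="2.4.4"))  # False
```

Versions older than the one that introduced the bug fall outside the range, which is the lagging-infrastructure escape hatch described above.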
843
00:39:22,844 --> 00:39:25,374
Giving it time to bake and letting somebody else find

844
00:39:25,869 --> 00:39:29,459
all the bugs. A lot of big things do that, though.

845
00:39:29,459 --> 00:39:32,409
Like they don't let you update right away.

846
00:39:32,469 --> 00:39:34,889
Like they will definitely let it bake

847
00:39:34,889 --> 00:39:36,729
and see if other people exploit it first.

848
00:39:37,189 --> 00:39:41,259
What do you think about making like third party scanning

849
00:39:41,259 --> 00:39:44,049
vendors better and not getting so many false positives?

850
00:39:44,049 --> 00:39:46,829
Cause it seems like the more automated we get, the more false positives we get.

851
00:39:48,489 --> 00:39:49,359
That's a good question.

852
00:39:49,359 --> 00:39:53,019
I just also wanted to add on to the previous point quickly.

853
00:39:53,179 --> 00:39:57,489
I think at Snap we actually would shut down access for you

854
00:39:57,489 --> 00:40:01,369
to log in if you didn't update within a certain time.

855
00:40:01,479 --> 00:40:04,169
I know IT was very, very, very big on, hey, like,

856
00:40:04,229 --> 00:40:07,729
There's this zero day happening like your computer, cause

857
00:40:07,729 --> 00:40:09,699
you know, it's all managed software from the company.

858
00:40:09,699 --> 00:40:11,869
So you will not be able to log into your computer unless

859
00:40:11,869 --> 00:40:14,069
you update it, or you won't be able to do anything on

860
00:40:14,069 --> 00:40:16,629
your computer until you update or unless you update.

861
00:40:16,639 --> 00:40:18,389
So that's, that's interesting that you said that.

862
00:40:18,529 --> 00:40:19,649
We used to get logged out.

863
00:40:19,649 --> 00:40:20,369
Cause we, like,

864
00:40:20,604 --> 00:40:23,374
closed our computers on a Friday and Monday.

865
00:40:23,384 --> 00:40:24,874
You're like, Oh, the amount of

866
00:40:24,884 --> 00:40:30,364
time I spent fighting Amazon's Acme system internally for updates because they

867
00:40:30,364 --> 00:40:34,644
were so aggressive on doing every piece of software update all of the time.

868
00:40:34,654 --> 00:40:36,544
And if you didn't do it after like three or four

869
00:40:36,544 --> 00:40:38,764
days, it's like, yeah, you can't get email now.

870
00:40:39,064 --> 00:40:40,674
Stop what you're doing and update.

871
00:40:40,674 --> 00:40:43,154
And I'm like, wow, this is on the extreme end.

872
00:40:43,154 --> 00:40:43,464
And

873
00:40:43,864 --> 00:40:45,544
I spent more time doing that than writing code.

874
00:40:45,719 --> 00:40:46,279
Yeah, exactly.

875
00:40:46,279 --> 00:40:49,659
I can't tell you how much time I was waiting for my system to update.

876
00:40:49,659 --> 00:40:52,419
And on chats, because it wouldn't work and things were, you know,

877
00:40:52,839 --> 00:40:55,829
Oh, look, these five new updates you rolled out don't work together.

878
00:40:55,989 --> 00:40:57,989
And I need that fourth one or whatever.

879
00:40:57,989 --> 00:41:01,919
And those were all things where it's such a hard balance to keep.

880
00:41:02,869 --> 00:41:03,989
We need to keep it secure.

881
00:41:03,989 --> 00:41:07,439
We need to keep it compatible and just giving people

882
00:41:07,439 --> 00:41:09,419
time back to where they don't have to think about it?

883
00:41:09,419 --> 00:41:12,759
And I do think beyond what Microsoft did with keeping it

884
00:41:12,759 --> 00:41:17,169
predictable, what Google did with Chrome and Chrome OS of

885
00:41:17,169 --> 00:41:21,419
making more immutable updates, of saying like, Hey, we're doing

886
00:41:21,429 --> 00:41:24,649
whole patches of systems that roll from one image to another.

887
00:41:24,649 --> 00:41:28,239
And if it fails, we can roll back. And you could never roll back with Windows.

888
00:41:28,239 --> 00:41:29,399
You can't roll back with a Mac.

889
00:41:29,409 --> 00:41:31,219
And those things make it really difficult.

890
00:41:31,419 --> 00:41:34,219
The downside is you have to reboot and you're like, no one wants to reboot.

891
00:41:34,519 --> 00:41:35,389
But the

892
00:41:35,479 --> 00:41:38,679
bonus of, oh, I know this is safe to try because

893
00:41:38,679 --> 00:41:40,529
if it doesn't work, I always have a fallback.

894
00:41:40,989 --> 00:41:44,579
Java did something similar, but not so much for rollbacks, but they

895
00:41:44,579 --> 00:41:48,949
made the release cadence shorter so people would no longer get stuck on

896
00:41:49,289 --> 00:41:50,279
giant updates.

897
00:41:50,309 --> 00:41:50,649
Yeah.

898
00:41:50,659 --> 00:41:54,559
So like after eight, we learned our lesson and they were like, okay.

899
00:41:55,914 --> 00:41:56,284
Really?

900
00:41:56,334 --> 00:41:56,634
Eight?

901
00:41:57,614 --> 00:41:57,924
Sorry.

902
00:41:58,174 --> 00:42:01,974
Like eight will die when the universe nukes itself, okay?

903
00:42:02,024 --> 00:42:04,834
Like that's when it'll die, but like it'll, the release cadence

904
00:42:04,834 --> 00:42:09,604
made it easier to release software more regularly,

905
00:42:09,604 --> 00:42:12,934
but it also made it where you're getting new LTSs, but they're

906
00:42:13,064 --> 00:42:15,954
long-living enough for people to want to switch to them.

907
00:42:16,444 --> 00:42:20,534
But at the same time, kind of getting to where the

908
00:42:20,624 --> 00:42:23,724
versions weren't so different that they were hard for you to.

909
00:42:24,699 --> 00:42:28,369
I think going back to the question around the

910
00:42:28,389 --> 00:42:31,509
third party scanning, it's super critical.

911
00:42:31,799 --> 00:42:35,179
It's a critical component for modern cybersecurity practices.

912
00:42:35,289 --> 00:42:40,169
But I think that there's also a lot of supply chain risk that's introduced.

913
00:42:40,269 --> 00:42:42,509
I'm not sure if you guys are familiar, if you heard of the

914
00:42:42,519 --> 00:42:46,049
SolarWinds attack that was back maybe a few years ago, but

915
00:42:46,059 --> 00:42:49,599
that originated from vulnerabilities in third party systems.

916
00:42:49,909 --> 00:42:54,549
So I think that having these third party scanning capabilities is super

917
00:42:54,549 --> 00:42:58,659
important, but we also have to remember that it increases the attack surface.

918
00:42:58,779 --> 00:43:01,779
So as you're integrating more third party solutions, those

919
00:43:01,799 --> 00:43:05,619
potential entry points for attackers increase significantly.

920
00:43:05,619 --> 00:43:08,379
So there's a shared responsibility. There are a

921
00:43:08,379 --> 00:43:11,789
lot of benefits, but there's a lot of increased

922
00:43:11,839 --> 00:43:14,069
vulnerabilities that happen when you just think

923
00:43:14,079 --> 00:43:16,729
about all the new entry points that attackers have.

924
00:43:17,049 --> 00:43:19,619
How would you have fixed or changed CrowdStrike?

925
00:43:19,710 --> 00:43:24,749
This is a good one, this is spicy, just a little spicy,

926
00:43:24,749 --> 00:43:27,959
but like, this is mild compared to

927
00:43:27,959 --> 00:43:30,759
his normal shade that he throws at cloud companies.

928
00:43:30,759 --> 00:43:30,849
I'm

929
00:43:31,109 --> 00:43:33,149
just, I would love to hear some insights on like

930
00:43:33,149 --> 00:43:35,619
what, what you think is something that could have been

931
00:43:35,619 --> 00:43:37,369
done different or should have been done different.

932
00:43:37,539 --> 00:43:42,849
Fundamentally, like, the testing could have been a lot better, but I also

933
00:43:42,879 --> 00:43:46,849
think that they should have had like a layered approach for monitoring,

934
00:43:47,099 --> 00:43:50,819
maybe combining some type of endpoint detection or some

935
00:43:50,819 --> 00:43:54,939
type of network traffic analysis or behavior analysis

936
00:43:55,074 --> 00:43:58,504
for ways to detect these anomalies in their system could have been a way.

937
00:43:58,594 --> 00:44:01,224
But I think that like, just going back to like, how

938
00:44:01,224 --> 00:44:03,754
could they miss something as fundamental as testing?

939
00:44:03,764 --> 00:44:08,874
Like, for a company that big, for them to be faulted at that level.

940
00:44:09,094 --> 00:44:10,004
Because it changed the

941
00:44:10,004 --> 00:44:10,874
behavior.

942
00:44:10,914 --> 00:44:11,524
You know what I mean?

943
00:44:11,524 --> 00:44:14,924
Like it changed such a behavior that like, you know, your

944
00:44:14,934 --> 00:44:16,834
product, you know, it's running in airports, you know, it's

945
00:44:16,834 --> 00:44:20,594
running to things that can't be rebooted, don't have keyboards.

946
00:44:20,614 --> 00:44:21,294
You know what I mean?

947
00:44:21,294 --> 00:44:22,314
So like, I'm just like.

948
00:44:22,519 --> 00:44:22,909
I don't know.

949
00:44:22,939 --> 00:44:25,159
I feel like we all have use cases and bugs

950
00:44:25,159 --> 00:44:27,659
that you can't account for every now and then.

951
00:44:27,669 --> 00:44:30,079
You know, it gets so out of the realm on how a user is

952
00:44:30,079 --> 00:44:32,499
going to use it that like, we all have our issues, but

953
00:44:32,509 --> 00:44:35,069
that wasn't even like a user using it in a weird way.

954
00:44:35,319 --> 00:44:36,299
Yeah, I agree.

955
00:44:36,299 --> 00:44:39,989
I think that they could also have had a better incident response approach.

956
00:44:40,139 --> 00:44:43,789
Maybe if they had some type of speed or clarity of response

957
00:44:43,789 --> 00:44:45,719
during that incident, that would have helped a lot.

958
00:44:45,759 --> 00:44:47,509
So yeah, I think there's a lot of ways in

959
00:44:47,509 --> 00:44:49,269
which that they could have made this better.

960
00:44:50,069 --> 00:44:52,929
Just to play a little devil's advocate here.

961
00:44:53,029 --> 00:44:58,489
The thing is, the bug they had crashed roughly 1 percent of Windows clients.

962
00:44:58,759 --> 00:45:01,699
I would never, I don't ever test my software enough that

963
00:45:01,699 --> 00:45:04,899
1 percent of my customers could not be affected, right?

964
00:45:04,899 --> 00:45:07,329
There's always this edge case of like, how

965
00:45:07,369 --> 00:45:10,099
thoroughly can I test something like security?

966
00:45:10,099 --> 00:45:13,879
And yeah, it's in all these places and all this stuff is, is obviously bad.

967
00:45:13,879 --> 00:45:17,379
And I think that the global deployment of the thing

968
00:45:17,799 --> 00:45:21,479
Uh, it was a YOLO moment for them of just like, here it goes,

969
00:45:21,479 --> 00:45:24,689
it's tested on my machine, it works on my machine, and then 1%, 8.

970
00:45:24,689 --> 00:45:27,939
5 million Windows devices crash from it.

971
00:45:28,399 --> 00:45:30,049
Which, again, like, it just seems like a

972
00:45:30,049 --> 00:45:32,099
really small edge case in a lot of ways.

973
00:45:32,100 --> 00:45:32,499
The way they

974
00:45:32,529 --> 00:45:36,619
dug into it though, it just seemed like there were so many opportunities.

975
00:45:36,899 --> 00:45:39,059
Anytime I'm looking from the outside on anything, I'm

976
00:45:39,059 --> 00:45:41,539
like, Oh, this is, this should have been easy, right?

977
00:45:41,539 --> 00:45:42,789
Like, Oh, I could have figured that out.

978
00:45:42,789 --> 00:45:42,969
Right.

979
00:45:42,970 --> 00:45:45,289
But like, when I really look at those 1

980
00:45:45,289 --> 00:45:48,039
percent edge cases, I don't know that it would matter.

981
00:45:48,189 --> 00:45:49,689
Yeah, I agree.

982
00:45:49,689 --> 00:45:53,119
I feel like there are so many ways and so many lessons that

983
00:45:53,119 --> 00:45:57,002
they could have learned, well, now they have. But yeah, that

984
00:45:57,002 --> 00:45:59,394
was, I think, one of the really interesting outcomes from

985
00:45:59,394 --> 00:46:02,344
this is the fact that Microsoft is giving the kernel hooks

986
00:46:02,404 --> 00:46:04,164
so that they don't have to run in kernel space, right?

987
00:46:04,164 --> 00:46:07,664
Like that was the thing, the API limits that Microsoft walled

988
00:46:07,664 --> 00:46:11,854
off in Windows Vista are now becoming open again so that the

989
00:46:11,864 --> 00:46:15,524
security vendors have the proper access to not run this highly

990
00:46:15,524 --> 00:46:19,844
privileged code that is sometimes untested and causes those things.

991
00:46:19,854 --> 00:46:22,579
So I think the actual, eventual outcome that's interesting

992
00:46:22,629 --> 00:46:25,769
is the Microsoft changes, not the CrowdStrike changes necessarily.

993
00:46:25,769 --> 00:46:28,429
Cause everyone's going to have 1 percent errors and everyone at

994
00:46:28,429 --> 00:46:30,809
some point is going to say, this has a fix that has to go

995
00:46:30,809 --> 00:46:34,979
out now. And how much access, or how critically, does that

996
00:46:34,979 --> 00:46:37,960
software run is the real kind of interesting learning thing to me.

997
00:46:38,369 --> 00:46:38,689
Yeah.

998
00:46:38,719 --> 00:46:40,789
Vendor accountability, super important.

999
00:46:41,284 --> 00:46:42,054
And partnerships, right?

1000
00:46:42,054 --> 00:46:44,244
Like, they build the thing for Microsoft

1001
00:46:44,244 --> 00:46:46,554
Windows and that's where it runs in the primary use case.

1002
00:46:46,584 --> 00:46:49,194
And that was what was affected, and

1003
00:46:49,504 --> 00:46:51,884
Microsoft never allowed the vendors to get in there.

1004
00:46:52,454 --> 00:46:53,364
Yasmin, this has been great.

1005
00:46:53,404 --> 00:46:54,784
Thank you so much for coming on the show.

1006
00:46:54,994 --> 00:46:57,494
Thank you for teaching us all about your career

1007
00:46:57,494 --> 00:46:59,674
path and different security aspects at different companies.

1008
00:46:59,854 --> 00:47:02,984
Where should people find you if they want to reach out online or get in contact?

1009
00:47:03,874 --> 00:47:04,864
Yeah, absolutely.

1010
00:47:04,864 --> 00:47:05,914
This was so much fun.

1011
00:47:05,914 --> 00:47:06,994
Thank you for having me.

1012
00:47:07,204 --> 00:47:12,964
My socials are Yasmin Abdi, so you can find me on LinkedIn at Yasmin Abdi.

1013
00:47:13,204 --> 00:47:19,384
Instagram at yazabdi, Y-A-Z-A-B-D-I, and also nohackllc.com.

1014
00:47:19,564 --> 00:47:20,584
Feel free to message us.

1015
00:47:20,584 --> 00:47:23,104
Feel free to reach out if you wanna learn more about

1016
00:47:23,104 --> 00:47:25,894
cybersecurity or you wanna partner or work together.

1017
00:47:25,954 --> 00:47:28,399
Yeah, this was super fun and I'm, I'm super glad that we did this.

1018
00:47:29,049 --> 00:47:31,029
I'm so excited to have met you.

1019
00:47:31,069 --> 00:47:33,529
I'm going to be rooting for you and like fangirling the whole time.

1020
00:47:33,609 --> 00:47:34,209
It's going to be great.

1021
00:47:34,409 --> 00:47:35,299
Right back at you.

1022
00:47:35,509 --> 00:47:37,139
Thank you so much and thank you everyone for listening.

1023
00:47:37,149 --> 00:47:38,289
We will see you again soon.

1024
00:47:53,439 --> 00:47:56,429
Thank you for listening to this episode of Fork Around and Find Out.

1025
00:47:56,759 --> 00:47:58,909
If you like this show, please consider sharing it with

1026
00:47:58,909 --> 00:48:02,080
a friend, a coworker, a family member, or even an enemy.

1027
00:48:02,199 --> 00:48:04,289
However we get the word out about this show

1028
00:48:04,499 --> 00:48:06,719
helps it to become sustainable for the long term.

1029
00:48:06,924 --> 00:48:10,674
If you want to sponsor this show, please go to fafo.

1030
00:48:10,714 --> 00:48:14,254
fm slash sponsor and reach out to us there about what

1031
00:48:14,254 --> 00:48:16,444
you're interested in sponsoring and how we can help.

1032
00:48:17,724 --> 00:48:20,924
We hope your systems stay available and your pagers stay quiet.

1033
00:48:21,424 --> 00:48:22,604
We'll see you again next time.