1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to dot net rocks with

2
00:00:03,040 --> 00:00:07,879
no ads? Easy? Become a patron for just five dollars

3
00:00:07,919 --> 00:00:10,800
a month. You get access to a private RSS feed

4
00:00:10,839 --> 00:00:14,279
where all the shows have no ads. Twenty dollars a month.

5
00:00:14,279 --> 00:00:16,879
We'll get you that and a special dot net Rocks

6
00:00:16,960 --> 00:00:21,000
patron mug. Sign up now at Patreon dot dotnetrocks

7
00:00:21,120 --> 00:00:37,079
dot com. Hey, it's dot net rocks. I'm Carl Franklin

8
00:00:37,159 --> 00:00:41,280
and I'm Richard Campbell. We're here again for the nineteen hundred

9
00:00:41,320 --> 00:00:42,840
and sixty third time.

10
00:00:44,039 --> 00:00:46,439
Speaker 2: It's almost like an addiction or a pattern or something.

11
00:00:46,479 --> 00:00:46,880
I don't know.

12
00:00:47,159 --> 00:00:49,000
Speaker 1: You think one of these days we'll figure it out?

13
00:00:49,119 --> 00:00:50,960
Speaker 2: I don't know. If we're going out, we would have

14
00:00:50,960 --> 00:00:51,520
by now, right?

15
00:00:51,640 --> 00:00:54,679
Speaker 1: Maybe we will talk about what happened in the year

16
00:00:54,759 --> 00:00:58,000
nineteen sixty three in just a minute. But first let's

17
00:00:58,119 --> 00:00:59,840
roll the music for Better Know a Framework.

18
00:01:07,480 --> 00:01:08,799
Speaker 2: All right, dude, what do you got?

19
00:01:08,840 --> 00:01:09,000
Speaker 3: Well?

20
00:01:09,040 --> 00:01:12,480
Speaker 1: This came from Simon Cropp, who is a font of

21
00:01:12,640 --> 00:01:14,840
all good things, especially in the

22
00:01:14,840 --> 00:01:16,519
Speaker 2: Area of tests at Bexter.

23
00:01:16,840 --> 00:01:21,799
Speaker 1: Yeah. He found this tool called Xunit.Combinatorial. Oh,

24
00:01:21,840 --> 00:01:24,439
which is a project from Andrew Arnott on the Visual

25
00:01:24,480 --> 00:01:28,239
Studio Platform team. But instead of pointing you to the docs,

26
00:01:28,280 --> 00:01:32,239
which he said are okay, he pointed to this particular blog post

27
00:01:32,280 --> 00:01:39,120
by Andrew Lock really talks about it in detail. So basically,

28
00:01:39,319 --> 00:01:45,719
it's... combinatorial is a weird name, a fancy word,

29
00:01:45,719 --> 00:01:47,719
but it's a bunch of features that can make it

30
00:01:47,760 --> 00:01:50,519
easier to generate the test data you need with

31
00:01:50,640 --> 00:01:54,680
xUnit. Nice, right? So you can also auto generate parameters,

32
00:01:54,799 --> 00:01:59,560
generate all parameter combinations, or randomly generate values and it's

33
00:01:59,560 --> 00:02:04,400
all with, you know, just simple code, you know,

34
00:02:04,640 --> 00:02:10,400
Theory tests, right? And you can use more attributes

35
00:02:11,000 --> 00:02:20,840
from the combination... combinatorial... CombinatorialData. Nice. Xunit.Combinatorial. Anyway,

36
00:02:20,840 --> 00:02:22,719
it's really good and there's a lot of examples on

37
00:02:22,719 --> 00:02:27,199
this blog post, so you know, like I said, it's

38
00:02:27,319 --> 00:02:32,240
very easy. Just auto generate permutations and parameters and

39
00:02:32,439 --> 00:02:38,280
custom defined values. Generating values for a single parameter, you

40
00:02:38,319 --> 00:02:42,360
can reduce the number of combinations. So there you go,

41
00:02:42,680 --> 00:02:45,479
know it, learn it, love it. Who's talking to us? Yeah?

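To make that concrete, here's a rough sketch of the kind of theory tests being described, based on Xunit.Combinatorial's documented attributes (CombinatorialData, CombinatorialValues, and PairwiseData for reducing combinations). The Pad method is a hypothetical stand-in, so treat this as an illustration, not code from the show or the blog post.

```csharp
using Xunit;

public class PaddingTests
{
    // [CombinatorialData] expands every combination of the parameter values:
    // two bools give 2 x 2 = 4 test cases, generated automatically.
    [Theory, CombinatorialData]
    public void Pads_Under_All_Flags(bool useSpaces, bool rightAlign)
    {
        string result = Pad("42", 8, useSpaces, rightAlign);
        Assert.Equal(8, result.Length);
    }

    // [CombinatorialValues] supplies custom values for a single parameter;
    // 3 widths x 2 bools = 6 cases. Swapping [CombinatorialData] for
    // [PairwiseData] would cover all value pairs with fewer runs,
    // which is the "reduce the number of combinations" idea.
    [Theory, CombinatorialData]
    public void Pads_To_Requested_Width(
        [CombinatorialValues(4, 8, 16)] int width,
        bool rightAlign)
    {
        Assert.Equal(width, Pad("x", width, true, rightAlign).Length);
    }

    // Hypothetical method under test, just so the sketch is self-contained.
    private static string Pad(string s, int width, bool useSpaces, bool rightAlign)
    {
        char fill = useSpaces ? ' ' : '0';
        return rightAlign ? s.PadLeft(width, fill) : s.PadRight(width, fill);
    }
}
```
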
42
00:02:45,520 --> 00:02:46,919
Speaker 2: What I like about that... do you know what I think

43
00:02:46,960 --> 00:02:49,319
about that particular situation? It's like, it takes one guy to

44
00:02:49,360 --> 00:02:51,719
write it, but another person to explain it.

45
00:02:51,759 --> 00:02:53,240
Speaker 1: Well enough that you know how you want to

46
00:02:53,319 --> 00:02:56,599
use it? Yeah, yeah, yeah, it happens. The two talents

47
00:02:56,759 --> 00:02:58,039
don't necessarily go together.

48
00:02:58,159 --> 00:03:01,120
Speaker 2: They don't necessarily go together. Absolutely true. So who's talking

49
00:03:01,120 --> 00:03:03,000
to us today? I grabbed a comment here for show

50
00:03:03,039 --> 00:03:05,840
eighteen seventy six, which is back in December twenty three,

51
00:03:05,879 --> 00:03:08,319
when we talked to our friend Laura Bell Main, who

52
00:03:08,560 --> 00:03:12,599
is a regular conversationalist about security and runs a security

53
00:03:12,599 --> 00:03:16,680
related company. And this was about agile application security, the

54
00:03:16,680 --> 00:03:21,599
idea that you can incorporate security into your workflow, sprint

55
00:03:21,599 --> 00:03:24,120
to sprint. It's not crazy. Don't just try and retrofit it

56
00:03:24,120 --> 00:03:26,319
at the end. And Rob had this great comment. He says,

57
00:03:26,360 --> 00:03:29,759
great show. I especially appreciated the discussion centered around making security

58
00:03:29,759 --> 00:03:32,680
easier for the average person and for that matter, the

59
00:03:32,680 --> 00:03:36,159
average developer, to implement. So often it is taken that

60
00:03:36,199 --> 00:03:39,479
the cost of security is inconvenience, but I think that

61
00:03:39,560 --> 00:03:42,159
turning this idea on its head is actually what

62
00:03:42,319 --> 00:03:44,879
needs to happen. In order to get people to do things.

63
00:03:44,879 --> 00:03:48,039
Security must make secure solutions the convenient solution. Convenience is

64
00:03:48,199 --> 00:03:52,319
the behavior modification tool we need to implement better security. I mean,

65
00:03:52,319 --> 00:03:54,360
there's nothing that convenient about a lock, but you don't

66
00:03:54,360 --> 00:03:56,960
want to give them up. What you would like is

67
00:03:56,960 --> 00:04:00,599
a lock that you can unlock reliably when you want to.

68
00:04:00,919 --> 00:04:05,680
So there's some balance there, I agree, Rob, But it's

69
00:04:05,719 --> 00:04:08,000
good to have the tooling in place. But also, and

70
00:04:08,080 --> 00:04:09,319
I think this is one of the things we talked

71
00:04:09,360 --> 00:04:11,439
about a lot with Laura, was just getting it into

72
00:04:11,479 --> 00:04:13,800
the workflow, so that security is never the afterthought.

73
00:04:13,960 --> 00:04:14,199
Speaker 3: Yeah.

74
00:04:14,240 --> 00:04:15,919
Speaker 2: So Rob, thank you so much for your comment. A

75
00:04:15,960 --> 00:04:17,399
copy of Music to Code By is on its way to you,

76
00:04:17,439 --> 00:04:19,040
And if you'd like a copy of Music to Code By,

77
00:04:19,240 --> 00:04:21,040
write a comment on the website at dotnetrocks

78
00:04:21,079 --> 00:04:23,319
dot com or on the Facebooks. We publish every show there,

79
00:04:23,360 --> 00:04:24,800
and if you comment there and I read it on the show,

80
00:04:24,959 --> 00:04:26,120
we'll send you a copy of Music to Code By.

81
00:04:26,199 --> 00:04:29,720
Speaker 1: All right, shall we talk about the year nineteen sixty three? Sure,

82
00:04:29,879 --> 00:04:33,199
Here's a few. Yeah, things that happened in sixty three.

83
00:04:34,759 --> 00:04:40,600
Kennedy assassination, yikes, Vietnam War, it's getting worse. Beatlemania, yeah,

84
00:04:40,680 --> 00:04:43,160
the antidote to the Vietnam War. I guess I remember

85
00:04:43,160 --> 00:04:46,959
Paul McCartney says, yeah, you know, when they were telling

86
00:04:47,079 --> 00:04:49,160
us how to behave in the press, and somebody says,

87
00:04:49,160 --> 00:04:51,560
bring up the Vietnam War, we'd just say, oh, bad war,

88
00:04:51,839 --> 00:04:52,360
bad war.

89
00:04:52,879 --> 00:04:59,360
Speaker 2: That's it. Not that there's any good wars. Yeah, right, bad war.

90
00:05:01,079 --> 00:05:05,199
Speaker 1: Some civil rights things were going on. In sixty three

91
00:05:05,480 --> 00:05:09,480
March on Washington, Martin Luther King delivered his famous I

92
00:05:09,600 --> 00:05:10,839
Have a Dream speech.

93
00:05:11,199 --> 00:05:15,279
Speaker 2: Good speech and then bad war, good speech, good speech.

94
00:05:16,079 --> 00:05:20,079
Speaker 1: On the bad side of civil rights. The Sixteenth Street

95
00:05:20,240 --> 00:05:24,639
Baptist Church bombing on September fifteenth, Birmingham, Alabama, killed four

96
00:05:24,720 --> 00:05:29,560
young girls absolutely intensified national outrage and support for the

97
00:05:29,600 --> 00:05:36,199
civil rights movement. Push button telephones, the touch-tone. And some

98
00:05:36,279 --> 00:05:40,759
things happened in the nuclear front, right, the Nuclear Test

99
00:05:40,759 --> 00:05:44,000
Ban Treaty on October twenty fourth, and the first commercial

100
00:05:44,079 --> 00:05:47,720
nuclear reactor began operation in the US. Anything you want

101
00:05:47,759 --> 00:05:48,279
to say about that, Richard?

102
00:05:48,319 --> 00:05:51,639
Speaker 2: Oh, there were so many reactors at that time.

103
00:05:52,160 --> 00:05:54,319
They go back to the fifties, so it depends on

104
00:05:54,360 --> 00:05:59,199
which one we're talking about. But yeah, no, there were

105
00:06:00,079 --> 00:06:02,959
a lot of military related reactors, but then you had

106
00:06:02,959 --> 00:06:08,000
the Westinghouses and the General Electrics building the commercial editions

107
00:06:08,079 --> 00:06:12,480
just to make them reliable. Those were two Gen

108
00:06:12,600 --> 00:06:15,600
one reactors and they were tricky to operate, and they

109
00:06:15,639 --> 00:06:18,040
had very skilled people to operate them, and for the most part

110
00:06:18,079 --> 00:06:20,439
it went well. It's when it grew out over the next

111
00:06:20,480 --> 00:06:22,040
decade that we started to have more problems with.

112
00:06:22,120 --> 00:06:24,600
Speaker 1: And was it last week we talked about the integrated circuits.

113
00:06:24,800 --> 00:06:25,720
That was sixty two.

114
00:06:26,040 --> 00:06:28,639
Speaker 2: Yeah, we talked a bit about integrated circuits there because

115
00:06:28,680 --> 00:06:31,199
they were struggling, still struggling, to make them. Then too,

116
00:06:31,680 --> 00:06:36,439
they were part of the equation on the computer front there.

117
00:06:36,519 --> 00:06:39,360
So there's a couple of interesting things. Last week we

118
00:06:39,399 --> 00:06:42,399
talked about the LINC computer, really the first personal computer,

119
00:06:42,720 --> 00:06:44,360
but just because you could take one home even though

120
00:06:44,360 --> 00:06:48,079
it was six foot tall. But the following year at DEC,

121
00:06:48,600 --> 00:06:52,480
inspired by that machine, they made the PDP five, which

122
00:06:52,560 --> 00:06:55,040
arguably is the first mass produced mini computer. There were

123
00:06:55,079 --> 00:06:58,000
only ever fifty LINCs made. So these were twelve bit

124
00:06:58,079 --> 00:07:00,680
words and you could get between one and up to

125
00:07:00,759 --> 00:07:04,319
thirty two K words of core memory. This is before

126
00:07:05,079 --> 00:07:08,000
you know integrated circuit memory, so you literally have the

127
00:07:08,040 --> 00:07:13,959
little ferrite cores wound in copper to store stuff.

128
00:07:14,399 --> 00:07:16,439
It came with an editor, assembler, FORTRAN compiler, and

129
00:07:16,480 --> 00:07:19,720
a debugger for about twenty seven thousand US dollars and

130
00:07:19,759 --> 00:07:22,879
it was a less expensive computer than the PDP four,

131
00:07:23,639 --> 00:07:26,279
which was also a less expensive computer than the PDP one.

132
00:07:26,360 --> 00:07:29,160
So the original machine made by DEC was very expensive,

133
00:07:29,160 --> 00:07:30,680
and so they made a cheaper one, and this was

134
00:07:30,720 --> 00:07:33,759
even cheaper than that, and this machine, the more advanced

135
00:07:33,800 --> 00:07:35,879
version of the PDP five, would become the very famous

136
00:07:35,920 --> 00:07:42,160
PDP eight. One piece of software from nineteen sixty three

137
00:07:42,240 --> 00:07:45,120
is Sketchpad. So this is Ivan Sutherland, who was

138
00:07:45,360 --> 00:07:49,120
writing his PhD thesis, and he was really the ancestor

139
00:07:49,160 --> 00:07:52,680
of CAD software, of computer aided design, and arguably the

140
00:07:52,759 --> 00:07:56,279
very first GUI. It ran on the TX two, which

141
00:07:56,360 --> 00:07:58,759
is a completely unique, one of a kind machine from

142
00:07:58,839 --> 00:08:01,439
nineteen fifty eight, and it only lasted about twenty

143
00:08:01,519 --> 00:08:05,480
years or fifteen years, which had sixty four k of

144
00:08:05,560 --> 00:08:08,000
thirty six bit words and you were able to draw

145
00:08:08,040 --> 00:08:10,879
directly on the screen with a light pen. Yeah there

146
00:08:10,920 --> 00:08:13,360
was no mouse yet. Nope, that was Xerox PARC. The

147
00:08:13,399 --> 00:08:17,560
mouse is still coming. Yeah yeah, but you know, these

148
00:08:17,560 --> 00:08:19,600
are all the beginning efforts with these early machines, but

149
00:08:19,839 --> 00:08:23,720
also totally bespoke. This was software that ran on exactly

150
00:08:23,759 --> 00:08:26,079
one computer in the world at the time.

151
00:08:26,399 --> 00:08:28,680
Speaker 1: Right, Wow, Yeah. Well, I can't wait to go through

152
00:08:28,720 --> 00:08:33,159
the years here leading up to the computer revolution, because

153
00:08:33,200 --> 00:08:35,799
this is this is where it gets exciting, you know.

154
00:08:36,039 --> 00:08:39,679
Speaker 2: Yeah. And we're pulling these bits and pieces together

155
00:08:39,720 --> 00:08:43,440
and just seeing like that PDP five it's all transistors.

156
00:08:43,440 --> 00:08:47,080
There's not a single IC in it. It's before that. Right? Wow,

157
00:08:47,720 --> 00:08:48,320
good stuff.

158
00:08:48,679 --> 00:08:51,240
Speaker 1: All right, stay tuned for more history lessons from Richard

159
00:08:51,240 --> 00:08:56,080
Campbell and myself. But I suppose it's time to talk

160
00:08:56,159 --> 00:09:01,000
to our guest, Michael Howard. So, Michael is a senior

161
00:09:01,039 --> 00:09:04,399
director in the Microsoft Red Team. If you don't know

162
00:09:04,399 --> 00:09:05,960
what a Red Team is, they're the ones that go

163
00:09:06,080 --> 00:09:09,000
and hack. He'll tell you all about it. And the

164
00:09:09,039 --> 00:09:13,519
hackers focused on improving security design and development across Microsoft

165
00:09:13,639 --> 00:09:17,519
based on RT findings. He has been at Microsoft for

166
00:09:17,559 --> 00:09:20,600
thirty three years, almost always in security except for at

167
00:09:20,600 --> 00:09:24,120
the start, where, at Microsoft in New Zealand, he supported

168
00:09:24,159 --> 00:09:28,320
Windows three point x, the Windows SDK, and the Microsoft

169
00:09:28,399 --> 00:09:32,799
C compiler. He currently lives in Austin, Texas, is an avid

170
00:09:32,960 --> 00:09:36,679
scuba diver and life is now much better because his

171
00:09:36,799 --> 00:09:42,799
wife dives now too. Fantastic. So she didn't scuba dive

172
00:09:42,879 --> 00:09:45,440
for the early years of your relationship, is that it?

173
00:09:45,480 --> 00:09:49,519
Speaker 3: Well, I've been diving since I was seven. Oh well, okay,

174
00:09:49,720 --> 00:09:54,200
so no, she was absolutely terrified of diving. She felt

175
00:09:54,240 --> 00:09:58,480
very claustrophobic and also like, I live in Austin, so

176
00:09:58,519 --> 00:10:00,960
the closest lake to us for diving is Lake Travis.

177
00:10:00,960 --> 00:10:02,879
And honestly, the only nice thing about Lake Travis is

178
00:10:02,919 --> 00:10:05,679
it makes every single diving destination look so much better.

179
00:10:08,559 --> 00:10:09,039
It's just a light.

180
00:10:09,159 --> 00:10:11,799
Speaker 2: Not that much diving in Texas, really? No.

181
00:10:11,840 --> 00:10:14,919
Speaker 3: Actually, there actually is. There's, yeah, Travis. Some

182
00:10:14,919 --> 00:10:18,200
of the lakes are really commonly dived. And there's also

183
00:10:18,279 --> 00:10:25,639
abandoned nuclear missile silo in West Texas called Valhalla, which

184
00:10:25,720 --> 00:10:27,519
I plan on diving this year as well. It's about

185
00:10:27,519 --> 00:10:31,600
one hundred and twenty feet down. How many rads? Hopefully zero.

186
00:10:33,559 --> 00:10:35,440
Speaker 4: This is not a game of Fallout, you know. You should

187
00:10:35,519 --> 00:10:37,559
check that out first, take it out.

188
00:10:37,440 --> 00:10:37,960
Speaker 1: That's right.

189
00:10:39,159 --> 00:10:42,320
Speaker 3: But yeah, it's funny. When my wife got her certification,

190
00:10:42,519 --> 00:10:44,360
she refused to do it in Lake Travis. So she

191
00:10:44,399 --> 00:10:47,639
flew with a girlfriend to the Virgin Islands. Oh nice,

192
00:10:47,840 --> 00:10:49,639
did her certification there instead.

193
00:10:49,799 --> 00:10:53,039
Speaker 2: You find some warm water to swim around in, right, Yeah.

194
00:10:52,759 --> 00:10:54,799
Speaker 3: Absolutely, she's a blue water princess.

195
00:10:54,960 --> 00:10:55,200
Speaker 2: Yeah.

196
00:10:55,360 --> 00:11:00,360
Speaker 1: So, a couple of things. RT is Red Team. So

197
00:11:00,480 --> 00:11:02,600
tell me about the Microsoft Red Team. I know what

198
00:11:02,639 --> 00:11:04,919
red teams are because I do a podcast with two

199
00:11:04,960 --> 00:11:08,159
guys on a Red Team about security, Security This Week.

200
00:11:08,519 --> 00:11:11,559
But tell us what the Microsoft Red Team does. Yeah.

201
00:11:11,600 --> 00:11:14,200
Speaker 3: We you know, we're basically treated as a real threat

202
00:11:14,240 --> 00:11:19,399
actor in every possible way you could consider that. Nothing's

203
00:11:19,399 --> 00:11:22,279
off the table except jail. That's good. That's actually

204
00:11:22,279 --> 00:11:25,240
a very good point. There's a story. There's a story

205
00:11:25,279 --> 00:11:27,919
I can probably tell you there. I don't want

206
00:11:27,960 --> 00:11:30,039
to get in trouble, so I'll be very careful what I say here,

207
00:11:30,039 --> 00:11:32,919
But I remember showing something in

208
00:11:32,960 --> 00:11:35,440
a session, and someone said, oh, you know,

209
00:11:35,480 --> 00:11:38,320
you can get fired because of that. And then this

210
00:11:38,440 --> 00:11:40,879
friend of mine who's also on the Red Team, colleague

211
00:11:40,919 --> 00:11:42,799
of mine, chimed in and said, well, actually, no, he's

212
00:11:42,799 --> 00:11:44,519
on the Red Team. He won't get fired, he'll get

213
00:11:44,519 --> 00:11:50,039
a raise. Yeah, the Red Team is really kind of interesting, right.

214
00:11:50,080 --> 00:11:53,120
So our job is ultimately we start off with an objective,

215
00:11:53,360 --> 00:11:57,240
whatever that objective is, and we go for it. And

216
00:11:57,279 --> 00:11:58,799
that's, I mean, whatever it takes to get to

217
00:11:58,799 --> 00:12:02,279
that objective. And a big part that I'm involved with

218
00:12:03,200 --> 00:12:07,879
is the readouts. I look at what actually happened,

219
00:12:07,919 --> 00:12:10,360
what was actually done, all the steps along the way,

220
00:12:11,360 --> 00:12:14,120
And so, okay, what was, for example, the beachhead,

221
00:12:14,240 --> 00:12:16,440
you know, what got the Red team into the environment

222
00:12:16,480 --> 00:12:18,120
and where did they go from there? And how did

223
00:12:18,120 --> 00:12:21,080
they get there? Was it a security vulnerability? Was it,

224
00:12:21,600 --> 00:12:23,919
you know, like from a code level perspective, or was

225
00:12:23,960 --> 00:12:27,440
it a weak permission on something? Was it insecure this,

226
00:12:27,559 --> 00:12:29,200
that and the other? Was it a combination of things?

227
00:12:29,240 --> 00:12:31,240
The whole point is for me to learn those things

228
00:12:31,679 --> 00:12:35,720
and then turn that into appropriate material for inside of Microsoft,

229
00:12:35,919 --> 00:12:38,200
and eventually we'll make a lot of that available outside

230
00:12:38,240 --> 00:12:40,559
Microsoft as well. But we're starting internally obviously.

231
00:12:40,840 --> 00:12:43,360
Speaker 1: Are you primarily focused on Azure because I know that

232
00:12:43,360 --> 00:12:45,919
that's kind of where you live. But what

233
00:12:46,039 --> 00:12:49,080
about internal offices and things like that?

234
00:12:49,320 --> 00:12:49,559
Speaker 3: Yes?

235
00:12:50,000 --> 00:12:51,960
Speaker 3: Both. Yes, I'll just leave it at that.

236
00:12:52,240 --> 00:12:54,440
Speaker 3: Nothing's off the table, nothing at all.

237
00:12:55,120 --> 00:12:58,679
What's kind of interesting is the Red Team now.

238
00:12:58,720 --> 00:13:01,240
The Microsoft Red Team is really a sort of conglomeration

239
00:13:01,399 --> 00:13:05,679
of multiple Red teams that existed across Microsoft, now reporting

240
00:13:05,759 --> 00:13:10,000
under one management chain, essentially under the Scott Gu tree.

241
00:13:10,080 --> 00:13:13,519
Ultimately that's, you know, primarily Azure, I mean

242
00:13:13,519 --> 00:13:15,200
obviously Windows rolls up there as well.

243
00:13:15,559 --> 00:13:18,759
Speaker 1: Right, Penetration tests are kind of expensive. Do you do

244
00:13:18,799 --> 00:13:19,799
them around the clock?

245
00:13:20,039 --> 00:13:22,440
Speaker 3: We do. So, pen testing is not red teaming, right?

246
00:13:22,480 --> 00:13:25,720
I mean, pen testing is, you almost say, hey, I

247
00:13:25,799 --> 00:13:27,679
want you to go and you know, kick the you

248
00:13:27,759 --> 00:13:30,840
know what out of this product, right. Red Team starts

249
00:13:30,840 --> 00:13:35,279
from an objective. An objective? Yeah. Oh yeah, we don't

250
00:13:35,279 --> 00:13:38,759
tell anybody. We don't tell anybody. You know, we want

251
00:13:38,799 --> 00:13:42,720
to create tokens or we want to air quotes, steal

252
00:13:42,799 --> 00:13:46,559
money from something whatever. Right, that's the objective. And whoever

253
00:13:46,600 --> 00:13:49,960
we trample along the way gets trampled along the way.

254
00:13:50,559 --> 00:13:53,279
Whereas pen testing is quite different. It's like, you know, hey,

255
00:13:53,440 --> 00:13:56,759
find bugs in SharePoint Online, find bugs in

256
00:13:57,120 --> 00:13:58,039
IIS, or find bugs in

257
00:13:58,120 --> 00:14:00,000
Speaker 1: Dot net, find vulnerabilities.

258
00:14:00,480 --> 00:14:01,679
Speaker 3: Yeah, quite different.

259
00:14:01,720 --> 00:14:04,080
Speaker 1: So the difference is a pentest they ask you to

260
00:14:04,159 --> 00:14:06,759
do X. In Red Team, they don't even know what

261
00:14:06,799 --> 00:14:09,159
you're up to. You're just hack hack hack hack.

262
00:14:09,200 --> 00:14:13,840
Speaker 3: Yeah, and you know, we'll even act just like attackers, right?

263
00:14:13,919 --> 00:14:18,279
So if we are detected, we will you know, change

264
00:14:18,360 --> 00:14:21,279
course and we will try to will try to obfuscate,

265
00:14:21,679 --> 00:14:24,039
if need be deleting logs or whatever. We'll do whatever

266
00:14:24,120 --> 00:14:27,159
needs to be done right, and you know, it's a

267
00:14:27,200 --> 00:14:27,840
beautiful thing.

268
00:14:27,919 --> 00:14:28,159
Speaker 2: Really.

269
00:14:28,320 --> 00:14:32,080
Speaker 1: I asked Duane Laflotte, who's the head of the Red

270
00:14:32,120 --> 00:14:35,320
Team at Pulsar Security, who I do the podcast with.

271
00:14:36,639 --> 00:14:38,919
I asked every once in a while, like, do you

272
00:14:38,960 --> 00:14:41,840
ever get asked, why aren't you a criminal? You know,

273
00:14:42,000 --> 00:14:44,240
you know all this stuff you can get, you know

274
00:14:44,240 --> 00:14:47,559
how to get it. He's always giving away criminal career advice,

275
00:14:48,159 --> 00:14:54,039
you know, and he says, because I'm a good human. Duh.

276
00:14:54,120 --> 00:14:55,879
Speaker 3: Yes, it's interesting. I mean, you know a lot of

277
00:14:55,919 --> 00:14:58,320
people that I work with, I mean it's the same thing.

278
00:14:58,440 --> 00:15:01,399
I mean, they're incredibly nice people, like truly nice people,

279
00:15:02,720 --> 00:15:07,039
but they just have this way of thinking about things

280
00:15:08,039 --> 00:15:11,759
that a lot of people don't. There's an analogy I

281
00:15:11,879 --> 00:15:14,120
like to give you. You've heard the story, right, where

282
00:15:14,120 --> 00:15:15,960
people said, you know, we'll only make things more secure

283
00:15:16,000 --> 00:15:18,639
when we get more people thinking like attackers. Well, my

284
00:15:18,759 --> 00:15:21,080
argument is you can't do that unless you are one.

285
00:15:22,240 --> 00:15:24,480
And the example I gave because I mentioned this to

286
00:15:24,519 --> 00:15:26,799
my wife, this is years ago. We were somewhere,

287
00:15:26,799 --> 00:15:28,720
I don't know where it was. We came up to

288
00:15:28,799 --> 00:15:32,360
this ATM machine, and on the screen

289
00:15:32,399 --> 00:15:34,360
it said, hey, if this tamper evident tape is

290
00:15:34,399 --> 00:15:37,480
tampered with, don't use the device. And I said to

291
00:15:37,519 --> 00:15:38,759
my wife, I said, do you think that's a good thing?

292
00:15:38,799 --> 00:15:40,879
She said, well, I don't know what it's even protecting against.

293
00:15:41,279 --> 00:15:42,919
And I said, well, so, you know, quick, I gave her

294
00:15:42,879 --> 00:15:47,279
a quick skimming one oh one, and it made a magnificent difference.

295
00:15:47,320 --> 00:15:49,639
And I said, that's stupid. She said, what do you mean? Well,

296
00:15:49,720 --> 00:15:51,279
don't you think the criminals can make their own tamper

297
00:15:51,360 --> 00:15:54,960
evident tape? That's number one. Number two, it's actually worse than that.

298
00:15:55,120 --> 00:15:57,919
I can knock out every single ATM in town. So

299
00:15:58,039 --> 00:15:59,279
what do you mean? I said, I just take my pocket

300
00:15:59,360 --> 00:16:01,120
knife and cut every single piece of

301
00:16:01,039 --> 00:16:02,240
Speaker 2: Type, start cutting the tape.

302
00:16:02,279 --> 00:16:04,679
Speaker 3: Yeah, so you're using the defense against itself. It's like,

303
00:16:04,919 --> 00:16:06,399
how did you even think of that?

304
00:16:06,720 --> 00:16:08,399
Speaker 2: Because it's easy.

305
00:16:09,759 --> 00:16:13,000
Speaker 1: But you have to have an evil mind.

306
00:16:13,120 --> 00:16:15,440
Speaker 3: You have to think. Oh, I would consider

307
00:16:15,480 --> 00:16:19,480
myself I don't know, probably chaotic good, Okay, Yeah.

308
00:16:19,919 --> 00:16:22,559
Speaker 2: It's a contrarian mind, just to think of the opposites

309
00:16:22,600 --> 00:16:24,600
of things too. I did a show on RunAs

310
00:16:25,440 --> 00:16:27,080
not that long ago, maybe I'll just go back in

311
00:16:27,120 --> 00:16:29,919
May with Yuri Diogenes, who did a book on how

312
00:16:29,919 --> 00:16:32,200
to get a career in cybersecurity, excellent book and the

313
00:16:32,240 --> 00:16:34,000
and he talked about the fact that, you know,

314
00:16:34,080 --> 00:16:37,159
most things in security are teachable. The thing that isn't teachable

315
00:16:37,360 --> 00:16:41,639
is this sort of insatiable curiosity to just keep wanting

316
00:16:41,639 --> 00:16:44,799
to poke away at things, try a different way, keep

317
00:16:44,840 --> 00:16:47,799
on pressing. If you don't have that, this isn't the

318
00:16:47,919 --> 00:16:48,480
job for you.

319
00:16:50,159 --> 00:16:52,000
Speaker 3: Whenever I'm interviewing people, one of the things I do

320
00:16:52,080 --> 00:16:55,080
look for is that fire in the belly. Right, It's

321
00:16:55,120 --> 00:16:59,279
that passion to learn, that passion to dig deeper. I

322
00:16:59,320 --> 00:17:01,000
don't know, you guys are probably the same as me.

323
00:17:01,039 --> 00:17:02,080
Right when I was a little kid, I was the

324
00:17:02,080 --> 00:17:05,680
guy in the corner reading encyclopedias, you know. I was.

325
00:17:05,799 --> 00:17:07,880
Speaker 1: I was taking apart my tape recorders and trying to

326
00:17:07,880 --> 00:17:09,680
put them back together unsuccessfully.

327
00:17:10,039 --> 00:17:12,359
Speaker 3: Right. But you learned a lot, right, getting

328
00:17:12,039 --> 00:17:14,279
Speaker 1: Whacked by my mother. M Yeah, I learned that I

329
00:17:14,279 --> 00:17:19,720
probably shouldn't take my tape recorder apart. Yeah, can I have another one?

330
00:17:22,519 --> 00:17:25,799
Speaker 2: I might have made my tape recorder explode. Actually that's

331
00:17:25,839 --> 00:17:26,359
probably worse.

332
00:17:27,240 --> 00:17:31,359
Speaker 4: Oh, the stuff I did. The thermite substitute for batteries.

333
00:17:32,359 --> 00:17:34,319
Speaker 3: So I actually got an explosion story there.

334
00:17:34,319 --> 00:17:35,400
When I was a kid, this

335
00:17:35,480 --> 00:17:37,160
is New Zealand. I went to the local chemist and

336
00:17:37,240 --> 00:17:39,559
of course I went, hey, can I get some you know,

337
00:17:40,160 --> 00:17:45,519
carbon, potassium nitrate, saltpeter, and sulfur. The guy's like, come on.

338
00:17:45,400 --> 00:17:47,000
Speaker 1: Making gunpowder?

339
00:17:47,119 --> 00:17:49,400
Speaker 3: He's like... but you know what he said, though?

340
00:17:49,480 --> 00:17:51,359
He said, I've got two options. I can send you

341
00:17:51,400 --> 00:17:53,519
on your way, but you're gonna...

342
00:17:53,240 --> 00:17:54,799
Speaker 1: Do it anyway.

343
00:17:54,960 --> 00:17:57,240
Speaker 3: Or I can show you how to do it safely.

344
00:17:58,039 --> 00:17:59,440
And he showed me how to do it safely.

345
00:17:59,680 --> 00:17:59,960
Speaker 2: Nice.

346
00:18:00,079 --> 00:18:04,119
Speaker 1: Wow, that's good. Yeah, don't ask about explosion stories. You still have your fingers?

347
00:18:04,920 --> 00:18:08,599
Speaker 2: You know, I told the thermiting the old server so

348
00:18:08,640 --> 00:18:10,720
that we ended up with funky jewelry and a memory

349
00:18:11,400 --> 00:18:16,160
story the other day. Yes. And I was a bad

350
00:18:16,240 --> 00:18:19,880
child, but it was a practical end for a server

351
00:18:19,960 --> 00:18:22,920
that everyone hated. Well, whether I was talking about my teenage

352
00:18:23,039 --> 00:18:25,680
years or the many other explosions, in this day

353
00:18:25,720 --> 00:18:28,480
and age I'd just be labeled as a terrorist, probably

354
00:18:29,200 --> 00:18:35,400
a Canadian polite terrorist. What I was was under-supervised, clearly. All

355
00:18:35,359 --> 00:18:38,400
Speaker 1: Right, So what did you come here and talk about?

356
00:18:38,599 --> 00:18:40,960
Because we could. We could talk about this stuff all day.

357
00:18:41,079 --> 00:18:43,319
Speaker 3: Mm hmm, yeah, I mean, I mean it's in my background.

358
00:18:44,359 --> 00:18:46,599
That's you know, sort of mentioned in the brief bio

359
00:18:46,680 --> 00:18:53,440
at the beginning, is essentially security. IIS, however, is an

360
00:18:53,440 --> 00:18:56,839
interesting topic, right because I originally started when I moved

361
00:18:56,839 --> 00:19:01,279
to Redmond from from New Zealand. I got a job

362
00:19:01,279 --> 00:19:04,720
in IIS, the web server, Internet Information

363
00:19:04,799 --> 00:19:07,960
Server as it was then, now Internet Information Services, but same difference,

364
00:19:10,640 --> 00:19:13,160
and I eventually took on the role as the security

365
00:19:13,160 --> 00:19:17,240
PM in IIS 3. I know, you guys are going

366
00:19:17,319 --> 00:19:19,200
to start giggling to yourself, and I sort of mentioned

367
00:19:19,240 --> 00:19:19,680
this stuff.

368
00:19:19,680 --> 00:19:22,240
Speaker 2: Now that made a lot of money off the challenge

369
00:19:22,880 --> 00:19:23,279
in there.

370
00:19:23,400 --> 00:19:25,039
Speaker 1: There's a challenge to keep that secure.

371
00:19:26,640 --> 00:19:28,599
Speaker 3: There's so many stories. When I retire, I think I'll

372
00:19:28,599 --> 00:19:29,400
write a book on it.

373
00:19:30,119 --> 00:19:33,279
Speaker 2: But you've already written a bunch of books there,

374
00:19:33,400 --> 00:19:33,799
I have.

375
00:19:33,880 --> 00:19:37,160
Speaker 3: I have, that's right, yeah. But IIS three, four, five

376
00:19:37,240 --> 00:19:39,720
were interesting, right because they were full of security features,

377
00:19:39,799 --> 00:19:42,920
especially five. Five was full of security features. We had

378
00:19:42,960 --> 00:19:45,960
Kerberos integration, we had Certificate Services. I think it may

379
00:19:46,000 --> 00:19:49,160
have been actually the first web server that did server-

380
00:19:49,160 --> 00:19:53,759
side certificate revocation of client certificates. I think it was

381
00:19:53,799 --> 00:19:57,680
the first one. We were really hardcore actually about that: if

382
00:19:57,720 --> 00:19:59,960
we couldn't reach the CRL distribution point. But that's a

383
00:20:00,000 --> 00:20:03,720
whole nother discussion. But you know, did they have security features? Yeah,

384
00:20:03,720 --> 00:20:07,119
it had a ton of security features. But were they secure features, right,

385
00:20:07,200 --> 00:20:09,160
and the answer is potentially no.

386
00:20:09,480 --> 00:20:11,839
Speaker 1: I remember path traversal was a problem.

387
00:20:13,000 --> 00:20:17,599
Speaker 3: Yeah, canonicalization, right. Canonicalization was a huge problem

388
00:20:17,759 --> 00:20:21,599
and we even saw it in the .NET era, right,

389
00:20:21,599 --> 00:20:25,079
there was an anti-cross-site-scripting library, right, and there ended

390
00:20:25,319 --> 00:20:28,359
up being so many ways around it that we just thought,

391
00:20:28,519 --> 00:20:32,680
we just can't maintain this, right, And so yeah, I could.

392
00:20:32,759 --> 00:20:37,559
I could talk forever about canonicalization problems. I'm a big, big,

393
00:20:37,599 --> 00:20:42,079
big fan of canonicalization. In fact, canonicalization of OAuth2 tokens

394
00:20:42,160 --> 00:20:44,400
kind of worries me. But that's a whole nother discussion,

395
00:20:44,400 --> 00:20:47,440
where you're basically doing string comparisons to make access decisions.

396
00:20:48,079 --> 00:20:50,160
Is there more than one way of representing something

397
00:20:50,200 --> 00:20:54,799
that's valid that you're not looking for? But so, yeah,

398
00:20:54,839 --> 00:20:57,519
I mean, so IIS three, four, five, especially five.

399
00:20:57,599 --> 00:21:00,599
Lots of security features, but not particularly secure features. As Carl said,
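The canonicalization trap Michael describes, making an access decision by comparing raw strings when the same resource has more than one valid spelling, can be sketched in a few lines of Python. The base directory and helper names here are illustrative, not from the show, and real code should also resolve symlinks (e.g. with os.path.realpath):

```python
import os.path

BASE = "/var/www/files"  # illustrative allowed directory

def is_allowed_naive(path: str) -> bool:
    # Naive string comparison: only inspects the raw spelling of the path.
    return path.startswith(BASE)

def is_allowed_canonical(path: str) -> bool:
    # Canonicalize first (collapse "..", ".", doubled slashes), then compare.
    resolved = os.path.normpath(path)
    return resolved == BASE or resolved.startswith(BASE + os.sep)

# More than one valid way to spell the same resource: this string starts
# with the allowed prefix but escapes it via "..".
evil = "/var/www/files/../../../etc/passwd"

print(is_allowed_naive(evil))      # True  -- the raw string check is fooled
print(is_allowed_canonical(evil))  # False -- it resolves to /etc/passwd
```

The same move applies in .NET: compare Path.GetFullPath results rather than the raw strings the client sent.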

400
00:21:00,960 --> 00:21:06,599
Canonicalization and path traversal were big ones, and so I ended

401
00:21:06,680 --> 00:21:08,519
up learning a lot from that, and at that point,

402
00:21:08,839 --> 00:21:13,359
you know, my career kind of pivoted from security

403
00:21:13,559 --> 00:21:17,079
features to securing features, and that's where I really got

404
00:21:17,079 --> 00:21:20,960
stuck into lower-level memory corruption issues in C and C

405
00:21:21,039 --> 00:21:27,279
plus plus, again canonicalization problems, cross-site scripting, SQL injection.

406
00:21:27,319 --> 00:21:31,799
I mean, SQL injection basically kind of happened around IIS

407
00:21:32,119 --> 00:21:36,960
three, four-ish. A guy, Rain Forest Puppy, RFP. Me

408
00:21:37,079 --> 00:21:39,119
getting an email from him saying, hey, look what I discovered,

409
00:21:39,559 --> 00:21:43,759
and that became SQL injection. In fact, I've spoken a

410
00:21:43,839 --> 00:21:46,480
couple of times to RFP since.

411
00:21:46,279 --> 00:21:48,440
Speaker 1: It's still like number one on the list, isn't

412
00:21:48,480 --> 00:21:49,240
it, SQL injection?

413
00:21:49,480 --> 00:21:52,599
Speaker 2: It's down to three, in fact, in the top ten.

414
00:21:52,759 --> 00:21:54,200
Speaker 1: It's still in the top ten.

415
00:21:54,200 --> 00:21:56,079
Speaker 2: It sat there for so long.

416
00:21:56,160 --> 00:21:58,119
Speaker 3: We know how to solve this. We know how to solve this.

417
00:21:58,200 --> 00:22:00,880
Speaker 2: Yeah, it's a totally solved problem, and it just doesn't

418
00:22:00,880 --> 00:22:01,400
go away.

419
00:22:01,559 --> 00:22:05,319
Speaker 3: Yeah. And so the question is why, right, why people

420
00:22:05,359 --> 00:22:08,279
keep making those mistakes, And honestly, it's because they just

421
00:22:08,559 --> 00:22:11,400
don't know. I mean, this is an interesting thing, right.

422
00:22:12,839 --> 00:22:15,559
One of the definitions of a secure system is a

423
00:22:15,559 --> 00:22:18,400
system that does what it's supposed to do with nothing else. Right. Well,

424
00:22:18,440 --> 00:22:22,200
if you build some code that does SQL stuff, right,

425
00:22:22,799 --> 00:22:27,240
and you do string concatenation for your SQL statements, is

426
00:22:27,319 --> 00:22:29,400
it going to solve the business problems? Yes, it is, right,

427
00:22:29,400 --> 00:22:32,599
it's just going to work. But it's the injection part

428
00:22:32,839 --> 00:22:36,240
that is the, you know, the something else. You know,

429
00:22:36,720 --> 00:22:38,440
A secure system is the one that does what it's

430
00:22:38,440 --> 00:22:40,720
supposed to do with nothing else. But is that something

431
00:22:40,720 --> 00:22:44,240
else that makes it insecure? It solves all your business problems,

432
00:22:44,279 --> 00:22:46,160
but it's insecure at the same time, and people don't

433
00:22:46,200 --> 00:22:49,599
realize it until they start, you know, running tools over

434
00:22:49,640 --> 00:22:52,119
it or someone you know, compromises the environment.
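The "solves the business problem but does something else too" point is easy to see with a toy table; here is a minimal sketch using Python's built-in sqlite3 module (the table and values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# String concatenation: the input is spliced into the SQL text itself,
# so the OR clause becomes part of the statement and returns every row.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(unsafe)   # [('alice', 's3cret')] -- the "something else"

# Parameterized query: the driver passes the value separately from the
# SQL text, so it can never change the statement's structure.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)     # [] -- no user is literally named "nobody' OR '1'='1"
```

Both versions "work" for honest input, which is exactly why the concatenated one survives code review until someone feeds it a hostile string.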

435
00:22:52,640 --> 00:22:55,079
Speaker 1: Richard and I used to do a talk together at

436
00:22:55,400 --> 00:22:58,920
various conferences about, you know, walking through sort of the

437
00:22:59,000 --> 00:23:03,519
history of Internet Information Services, and he made a good point

438
00:23:03,559 --> 00:23:07,279
about saying that IIS is like, what would you call it,

439
00:23:07,279 --> 00:23:09,839
like a Swiss army knife with all the blades out.

440
00:23:09,880 --> 00:23:12,880
The Swiss army knife with every blade out, yeah,

441
00:23:13,000 --> 00:23:13,839
whereas you know.

442
00:23:15,200 --> 00:23:17,920
Speaker 2: Well, Node was the ultimate opposite. Yeah, Node, there are

443
00:23:17,960 --> 00:23:20,279
no blades. You have to go get each blade right.

444
00:23:20,640 --> 00:23:22,279
Speaker 1: That's right, everything's off by default.

445
00:23:22,440 --> 00:23:25,799
Speaker 3: Right, yeah, but same with IIS 6 though. Yeah, IIS 6.

446
00:23:25,799 --> 00:23:27,559
So that's where we made a huge change, right, right,

447
00:23:27,599 --> 00:23:29,640
So its code name was Kevlar, and there was a

448
00:23:29,680 --> 00:23:34,119
reason for it being Kevlar, right. We rewrote huge swaths

449
00:23:34,119 --> 00:23:39,119
of code, especially string handling, and also canonicalization was

450
00:23:39,119 --> 00:23:41,880
all centralized. We also changed the design, and one of

451
00:23:41,920 --> 00:23:44,279
the biggest design changes was that IIS 5, like you say,

452
00:23:44,279 --> 00:23:45,880
it was a Swiss army knife with all the blades

453
00:23:45,920 --> 00:23:49,680
and screwdrivers and what have you sticking out. IIS 6

454
00:23:49,799 --> 00:23:52,960
was the opposite, right, basically ran essentially nothing and you

455
00:23:53,000 --> 00:23:55,119
had to opt in for most of the services and

456
00:23:55,240 --> 00:23:58,599
IIS 7, we took it even further, by opting in

457
00:23:58,680 --> 00:24:02,480
for specific HTTP verbs, especially the WebDAV stuff.

458
00:24:03,119 --> 00:24:05,039
Speaker 1: Right, yeah, that's cool. I remember it.

459
00:24:05,079 --> 00:24:05,279
Speaker 2: Well.

460
00:24:05,359 --> 00:24:09,200
Speaker 1: We learn, you know, from each other and evolve together.

461
00:24:09,319 --> 00:24:10,200
Speaker 3: It was a huge baptism.

462
00:24:10,400 --> 00:24:12,359
Speaker 2: Well, and in the early days of IIS it

463
00:24:12,400 --> 00:24:14,559
was about making it easier for developers who didn't really

464
00:24:14,559 --> 00:24:17,640
know that much about web development to make things work.

465
00:24:18,119 --> 00:24:20,440
Speaker 3: And that was J Allard's impact, right, I mean

466
00:24:20,559 --> 00:24:23,559
J ran the project and he was a big, you know,

467
00:24:23,680 --> 00:24:29,759
let's get everything onto the web, especially IIS. I know, look,

468
00:24:30,039 --> 00:24:33,319
I understand that. I totally understand that. But at the

469
00:24:33,319 --> 00:24:36,200
same time, you know, if you start turning stuff off

470
00:24:36,279 --> 00:24:39,640
by default, it is going to provide some kind of impedance.

471
00:24:39,839 --> 00:24:43,240
Speaker 1: Yeah, you're also not just interested in server stuff, but

472
00:24:43,359 --> 00:24:49,119
in code security in general, right? Like avoiding vulnerabilities

473
00:24:49,160 --> 00:24:51,839
like buffer overruns and things which admittedly aren't so much

474
00:24:51,880 --> 00:24:55,079
of a problem in a managed language like C sharp,

475
00:24:55,160 --> 00:25:01,000
but certainly in C C plus plus those are still issues.

476
00:25:00,880 --> 00:25:05,160
Speaker 3: Huge issue. In fact, I'm working on another book now

477
00:25:05,200 --> 00:25:08,200
with some friends, with some colleagues, and one of the

478
00:25:08,279 --> 00:25:11,200
chapters is going to be exactly that, like reconsidering the

479
00:25:11,279 --> 00:25:12,640
role of C and C plus plus.

480
00:25:13,480 --> 00:25:13,599
Speaker 1: Now.

481
00:25:13,640 --> 00:25:16,000
Speaker 2: The only reason, well, this is what Russinovich was

482
00:25:16,039 --> 00:25:18,960
talking about, like you have to justify writing it, exactly.

483
00:25:19,039 --> 00:25:21,920
Speaker 3: So Russinovich. Well, it's not just Russinovich, it's also Dave

484
00:25:21,960 --> 00:25:26,160
Weston, running Windows. So the three of us were talking

485
00:25:26,160 --> 00:25:32,799
at length about about this. And look, I have worked

486
00:25:32,920 --> 00:25:38,160
in C since I was sixteen. Wow, let's just say

487
00:25:38,160 --> 00:25:42,880
that was a few years ago. You know, as I mentioned,

488
00:25:43,079 --> 00:25:44,519
you know, in the bio at the beginning, right, one

489
00:25:44,519 --> 00:25:46,599
of my well, my first job at Microsoft was in

490
00:25:46,640 --> 00:25:49,880
part supporting the Microsoft C compiler. It was C version

491
00:25:49,960 --> 00:25:55,200
four back in the day. But what's interesting is I've

492
00:25:55,240 --> 00:25:57,039
you know, I've had a huge I have a huge

493
00:25:57,039 --> 00:25:58,920
love affair with C and C plus plus. I love

494
00:25:59,039 --> 00:26:03,440
both languages, absolutely adore both languages. C is probably the

495
00:26:03,559 --> 00:26:07,039
language that I can write without even thinking. It just

496
00:26:07,440 --> 00:26:12,119
spews out onto the keyboard. You know. The problem though,

497
00:26:12,799 --> 00:26:17,039
is that it's too easy to have undefined behavior in

498
00:26:17,039 --> 00:26:20,400
both languages. And remember from before, you know, the definition of

499
00:26:20,440 --> 00:26:22,440
a secure system is a system that does what it's

500
00:26:22,440 --> 00:26:26,440
supposed to do with nothing else. A more academic representation

501
00:26:26,559 --> 00:26:30,920
of that is a system that exhibits no undefined behavior. Well,

502
00:26:31,119 --> 00:26:33,559
C and C plus plus have got plenty of undefined

503
00:26:33,559 --> 00:26:37,559
behavior when it comes to memory corruption, memory safety. So

504
00:26:37,680 --> 00:26:39,799
Carl, you said buffer overruns, but that's only just one.

505
00:26:40,039 --> 00:26:40,960
Speaker 1: That's just one.

506
00:26:40,839 --> 00:26:45,119
Speaker 3: Yeah, that's one, right. The whole gamut is essentially memory safety.

507
00:26:45,920 --> 00:26:47,039
I know that there's a lot of work going on

508
00:26:47,079 --> 00:26:49,880
in the C plus plus committees right now around this

509
00:26:49,960 --> 00:26:53,279
thing called profiles, to basically say, here is the

510
00:26:53,359 --> 00:26:56,039
subset of the language that we will enforce.

511
00:26:57,160 --> 00:27:02,519
For example, let's say you have an array, maybe

512
00:27:02,519 --> 00:27:04,559
a vector as well. But let's say an array,

513
00:27:04,920 --> 00:27:07,000
by array I mean a, sorry, a C

514
00:27:07,119 --> 00:27:10,039
plus plus array, I don't mean a C array. But

515
00:27:10,119 --> 00:27:12,200
if you've got a C plus plus class that's an

516
00:27:12,319 --> 00:27:16,359
array and you use just the index operator, you know,

517
00:27:16,400 --> 00:27:21,519
square brackets i, that is not bounds checked. But if

518
00:27:21,519 --> 00:27:25,519
you do dot at i, then that is bounds checked.

519
00:27:25,960 --> 00:27:28,519
So you may have in the profile: we will only

520
00:27:28,559 --> 00:27:32,359
allow iterators to use dot at and not just the

521
00:27:32,440 --> 00:27:34,559
array and the normal array index that we used to

522
00:27:35,400 --> 00:27:38,200
use. So that works. But the problem is

523
00:27:38,440 --> 00:27:41,119
there is so much C plus plus code out there.

524
00:27:42,799 --> 00:27:45,160
Are you really going to re-engineer it and refactor it?

525
00:27:45,200 --> 00:27:46,039
I don't I don't know.

526
00:27:47,359 --> 00:27:49,079
Speaker 2: No, you're not. Well, if you are, you're not going

527
00:27:49,160 --> 00:27:50,839
to stay there, You're going to go elsewhere.

528
00:27:50,880 --> 00:27:52,559
Speaker 3: So the word, you know, so the word from on

529
00:27:52,640 --> 00:27:55,880
high, in Azure at least, and this is from Russinovich,

530
00:27:56,359 --> 00:28:00,000
is new code must be written using a memory safe language.

531
00:28:01,240 --> 00:28:04,880
That could be Rust, could be C sharp, could be Go.

532
00:28:05,519 --> 00:28:08,880
You know, we saw the TypeScript compiler written in Go.

533
00:28:09,039 --> 00:28:11,240
Speaker 2: I mean, there is managed C plus plus, but does

534
00:28:11,240 --> 00:28:12,119
anybody use it.

535
00:28:12,319 --> 00:28:13,680
Speaker 3: I don't know of anybody using.

536
00:28:13,480 --> 00:28:15,119
Speaker 2: It, or is it actually safe.

537
00:28:15,519 --> 00:28:18,799
Speaker 3: Honestly, I haven't really used it. I found the syntax

538
00:28:19,000 --> 00:28:20,200
very funky.

539
00:28:21,000 --> 00:28:23,880
Speaker 2: Yeah, and yeah, you know, this is what happened to

540
00:28:23,880 --> 00:28:26,119
me with VB dot net. I was a good VB programmer.

541
00:28:26,119 --> 00:28:28,640
I was very happy dot net comes along. VB dot

542
00:28:28,640 --> 00:28:32,039
net makes me mental because I have VB reflexes, and

543
00:28:32,119 --> 00:28:33,799
so I switched to C sharp because at least I

544
00:28:33,880 --> 00:28:35,000
know for sure it's not VB.

545
00:28:35,319 --> 00:28:35,880
Speaker 1: Yeah, good point.

546
00:28:36,480 --> 00:28:36,680
Speaker 3: Yeah.

547
00:28:36,759 --> 00:28:38,680
Speaker 1: So it took me a while, but I got over

548
00:28:38,759 --> 00:28:39,279
it. Yeah.

549
00:28:39,319 --> 00:28:41,880
Speaker 3: So, you know, per Russinovich, new stuff must be written in

550
00:28:42,119 --> 00:28:43,160
memory safe language.

551
00:28:43,359 --> 00:28:45,400
Speaker 2: And I know Dave Weston, I think he said it:

552
00:28:45,759 --> 00:28:48,400
You have to justify not writing in a memory safe language.

553
00:28:48,519 --> 00:28:51,359
Speaker 3: Yeah, correct. Yeah, that's the more accurate statement.

554
00:28:52,119 --> 00:28:55,519
Speaker 2: Yeah. Yeah, well he's pretty careful to not talk in absolutes.

555
00:28:55,599 --> 00:29:03,680
Speaker 3: Mister, what was the confidence for computing? But other than that, okay, no,

556
00:29:04,039 --> 00:29:08,319
it's look I again. For me, it's really difficult. I

557
00:29:08,359 --> 00:29:11,480
love C and C plus plus, I love both languages. Look,

558
00:29:11,599 --> 00:29:13,920
and I realize that C plus plus can get pretty obtuse sometimes.

559
00:29:13,960 --> 00:29:17,400
Everyone likes showing the you know, the crazy you know

560
00:29:17,480 --> 00:29:19,039
C plus plus code, you know, try and work out

561
00:29:19,039 --> 00:29:23,000
what the heck this thing is doing. I get that, but

562
00:29:23,039 --> 00:29:26,039
I love C sharp for, you know, my general

563
00:29:26,119 --> 00:29:27,880
rule of thumb is this, and this is my general

564
00:29:27,960 --> 00:29:32,200
rule of thumb is, if you're writing something new, C sharp. However,

565
00:29:33,200 --> 00:29:36,799
if you need something where you can't incur the cost

566
00:29:36,880 --> 00:29:41,000
of garbage collection or the unpredictability of the

567
00:29:41,000 --> 00:29:44,799
garbage collector, especially on a server, then Rust. I'm a

568
00:29:44,880 --> 00:29:46,960
huge fan of Rust. I love Rust. I love Rust

569
00:29:47,000 --> 00:29:47,599
to death.

570
00:29:47,799 --> 00:29:50,440
Speaker 2: And it does seem to strike that balance between I'm

571
00:29:50,559 --> 00:29:53,799
quite low level and deterministic in what I'm going to do,

572
00:29:54,319 --> 00:29:55,599
but I'm also memory safe.

573
00:29:55,640 --> 00:29:59,599
Speaker 3: Correct. The downside of Rust is learning it. It is

574
00:29:59,759 --> 00:30:04,720
a very difficult language to learn. Some years ago,

575
00:30:05,079 --> 00:30:07,160
about two years ago now, I gave a presentation inside

576
00:30:07,160 --> 00:30:10,599
of Microsoft called a lap around Rust, and it's basically

577
00:30:10,640 --> 00:30:13,039
a one hour session, just a quick hey, you never learned,

578
00:30:13,079 --> 00:30:16,039
never know anything, know nothing about Rust. Watch this, talking

579
00:30:16,079 --> 00:30:19,160
about visual studio code blah blah blah blah. One half

580
00:30:19,200 --> 00:30:24,359
of the presentation is just on the borrow checker in Rust,

581
00:30:24,920 --> 00:30:28,559
because that's what causes most of the pain. And the

582
00:30:28,599 --> 00:30:32,319
first slide of that is, you know, if you're writing

583
00:30:32,400 --> 00:30:34,599
Rust for the first two to four weeks, you're going

584
00:30:34,640 --> 00:30:37,519
to be you know, punching the screen because of the

585
00:30:37,519 --> 00:30:41,160
borrow checker, because of the way it works, and it

586
00:30:41,240 --> 00:30:43,880
drives people bonkers. But once you get over that learning hump,

587
00:30:43,960 --> 00:30:47,839
it's actually it's a beautiful language to learn, and you know,

588
00:30:47,960 --> 00:30:50,839
nice ecosystem and very good standard library.

589
00:30:50,960 --> 00:30:53,240
Speaker 1: Yeah, well, this seems like a good place to take

590
00:30:53,240 --> 00:30:55,240
a break. So we'll be right back after these very

591
00:30:55,240 --> 00:30:57,920
important messages. And as a reminder, if you don't want

592
00:30:57,920 --> 00:31:00,480
to hear these ads, you can pay five bucks a

593
00:31:00,480 --> 00:31:03,680
month to become a patron. Patreon dot dot net rocks

594
00:31:03,720 --> 00:31:06,000
dot com. You get an ad free feed. We'll be

595
00:31:06,079 --> 00:31:08,440
right back. Did you know there's a dot net on

596
00:31:08,519 --> 00:31:13,759
AWS community? Follow the social media, blogs, YouTube influencers and

597
00:31:13,839 --> 00:31:17,880
open source projects and add your own voice. Get plugged

598
00:31:17,880 --> 00:31:21,599
into the dot net on aws community at aws dot

599
00:31:21,640 --> 00:31:28,519
Amazon dot com, slash dot net. And we're back. It's

600
00:31:28,519 --> 00:31:31,559
dot net rocks. I'm Carl, that's my friend Richard Campbell, hey,

601
00:31:31,720 --> 00:31:35,319
and our friend Michael Howard, and we're talking code security

602
00:31:35,359 --> 00:31:38,759
and one thing that has become very evident to me,

603
00:31:39,160 --> 00:31:43,359
doing you know, a weekly podcast on vulnerabilities and hacks

604
00:31:43,359 --> 00:31:47,039
that have happened the week before, is that most of

605
00:31:47,079 --> 00:31:52,039
these aren't technical vulnerabilities. Most of them. Most of the

606
00:31:52,119 --> 00:31:57,920
attacks anyway, are from social engineering, and that is just

607
00:31:58,000 --> 00:32:01,720
a whole nother ballgame. But the first line

608
00:32:01,720 --> 00:32:05,640
of defense in security is the user. Don't click the

609
00:32:05,720 --> 00:32:08,440
freaking link in the email or the text message, you know.

610
00:32:08,720 --> 00:32:11,240
Speaker 3: Yeah, I mean I usually get emails from my father

611
00:32:11,319 --> 00:32:16,200
in law, yeah, saying you know, is this legit? My

612
00:32:16,319 --> 00:32:18,279
answer was always the same. If you're asking me the question,

613
00:32:18,400 --> 00:32:19,119
then the answer.

614
00:32:18,920 --> 00:32:22,000
Speaker 2: Is no, you already know the answer.

615
00:32:22,359 --> 00:32:25,200
Speaker 3: That's right. I get ones every once in a while

616
00:32:25,200 --> 00:32:27,359
from my wife as well, but I mean, yeah, the

617
00:32:27,400 --> 00:32:29,839
work that's going on, certainly Microsoft, I'm sure other companies

618
00:32:29,839 --> 00:32:32,640
as well, you know, in the various UIs to mitigate things.

619
00:32:33,440 --> 00:32:36,119
One of my favorites actually was was added to Office

620
00:32:36,240 --> 00:32:39,839
years ago, which was to run the you know, the

621
00:32:39,880 --> 00:32:46,279
process at low integrity level by default, which is really nice. Well,

622
00:32:46,519 --> 00:32:48,240
I don't know how technical you guys want to get,

623
00:32:48,279 --> 00:32:50,599
but from my perspective, it was a magnificent defense. It

624
00:32:51,359 --> 00:32:54,119
assumes that the document, whether it's a PowerPoint, Excel or

625
00:32:54,240 --> 00:32:58,160
doc file, whatever, it assumes it's malicious and runs it

626
00:32:58,200 --> 00:33:02,640
in this, essentially, a low-integrity sandbox. So

627
00:33:02,759 --> 00:33:05,799
it does limit the potential damage. I'm not saying

628
00:33:05,839 --> 00:33:10,000
it's completely mitigated, but it's certainly less. And you got

629
00:33:10,039 --> 00:33:11,559
to do things like that, right, You've got to assume

630
00:33:11,640 --> 00:33:14,680
that the user's going to click on things and open

631
00:33:14,759 --> 00:33:16,759
things and all that sort of stuff, And so how

632
00:33:16,799 --> 00:33:19,279
do you mitigate the damage.

633
00:33:19,480 --> 00:33:21,759
Speaker 2: Yeah, yeah, on the admin side, our line is

634
00:33:21,799 --> 00:33:26,079
always: be more careful next time is not a strategy, right? Yeah, right,

635
00:33:26,400 --> 00:33:28,839
it's not that the guy clicked on it, it's that

636
00:33:28,920 --> 00:33:30,079
anything happened when he did.

637
00:33:30,200 --> 00:33:33,079
Speaker 1: Yeah, and just that advice be careful, Well, what does

638
00:33:33,119 --> 00:33:36,319
that mean? Does that mean don't ever click on anything?

639
00:33:36,440 --> 00:33:38,680
Does that mean never go to websites? Does that mean

640
00:33:38,720 --> 00:33:41,000
cut off your hands and live in a box? Thanks

641
00:33:41,079 --> 00:33:43,519
Roy Blake for that one. You know, it's hard to tell.

642
00:33:43,920 --> 00:33:44,599
Just be careful.

643
00:33:44,960 --> 00:33:48,240
Speaker 3: That's terrible advice, to be honest, it's terrible. Yeah, I

644
00:33:48,279 --> 00:33:51,400
mean humans are going to do what humans are going to do, right,

645
00:33:51,440 --> 00:33:52,640
I mean they can click on that. I mean one

646
00:33:52,680 --> 00:33:55,039
of one of my favorites, and I gotta be careful what

647
00:33:55,119 --> 00:33:57,720
I say here, but I was doing some work

648
00:33:57,759 --> 00:34:02,480
with a defense contractor and they got hit with a

649
00:34:02,480 --> 00:34:08,719
pretty serious attack. And the attack actually came through a

650
00:34:08,840 --> 00:34:12,480
very senior person within the organization, and it's basically a

651
00:34:12,559 --> 00:34:16,480
zero day in Acrobat. But the reason why she opened

652
00:34:16,519 --> 00:34:22,519
it is because the message was about knitting, interestingly enough, and

653
00:34:22,599 --> 00:34:23,199
she was an.

654
00:34:23,039 --> 00:34:26,400
Speaker 2: Avid knitter, so very good targeting.

655
00:34:26,159 --> 00:34:28,400
Speaker 3: And so she just opened the doc, the PDF, and

656
00:34:28,440 --> 00:34:32,159
that was it. That was the breach. Yeah, there's

657
00:34:32,199 --> 00:34:34,079
always going to be something that you're gonna click on.

658
00:34:36,239 --> 00:34:39,000
We can pontificate and say how awesome we are, but

659
00:34:39,039 --> 00:34:41,159
I can guarantee all three of us there's something we

660
00:34:41,199 --> 00:34:41,760
will click on.

661
00:34:41,960 --> 00:34:44,119
Speaker 2: Yep, you're gonna make them. You are just gonna

662
00:34:44,119 --> 00:34:45,639
have a moment of weakness. You're gonna make a mistake

663
00:34:45,679 --> 00:34:46,159
at some point.

664
00:34:46,880 --> 00:34:49,280
Speaker 3: And that's why I'm a huge fan of these sort

665
00:34:49,320 --> 00:34:52,400
of mitigations around assuming that whatever you click on is

666
00:34:52,440 --> 00:34:55,000
going to blow up, and you know, what can you

667
00:34:55,039 --> 00:34:57,000
do to mitigate that? And we spend a lot of

668
00:34:57,000 --> 00:34:58,760
time in Windows especially on that.

669
00:34:58,920 --> 00:35:01,800
Speaker 1: There's some ideas that we've thrown around. One of them

670
00:35:02,039 --> 00:35:04,840
is to have a separate VM where all your email

671
00:35:04,960 --> 00:35:07,039
comes through and all your Internet goes in and out

672
00:35:07,039 --> 00:35:10,239
of. The problem then is, okay, if you do have

673
00:35:10,360 --> 00:35:12,880
to communicate with some data and some files and stuff,

674
00:35:12,880 --> 00:35:15,920
now you've got to get them around safely without infecting

675
00:35:16,000 --> 00:35:19,719
each system. But that's not an insurmountable problem.

676
00:35:19,760 --> 00:35:21,760
Speaker 3: You know, it's funny you bring that up.

677
00:35:21,800 --> 00:35:23,760
Speaker 1: You can do a little file Zilla, you know, a

678
00:35:23,760 --> 00:35:25,880
little FTP, you could do that.

679
00:35:26,079 --> 00:35:28,599
Speaker 3: It's funny you bring that up. That was actually thrown around.

680
00:35:28,599 --> 00:35:31,000
I want to say fifteen twenty years ago, I was

681
00:35:31,400 --> 00:35:34,639
in a meeting with Gates where Butler Lampson actually threw

682
00:35:34,679 --> 00:35:37,519
out this idea of, I think, you got a red-green,

683
00:35:38,119 --> 00:35:41,320
a red VM and a green VM. Yeah, that's exactly

684
00:35:41,360 --> 00:35:44,920
what it was. Yeah. Wow, But back then, I mean

685
00:35:45,000 --> 00:35:47,480
VMs were super duper heavy as well back

686
00:35:47,960 --> 00:35:50,199
then, and there were performance impacts. I think we're in a

687
00:35:50,280 --> 00:35:54,119
much better position today. But you know, at Microsoft,

688
00:35:54,119 --> 00:35:56,360
if you're doing administration in Azure, at the back end,

689
00:35:56,400 --> 00:35:59,440
you can't do it from your machine. You're not allowed, no, right,

690
00:35:59,440 --> 00:36:02,239
you run it from a secure access workstation. That is

691
00:36:02,360 --> 00:36:05,599
very limited in what it can do on purpose.

692
00:36:06,039 --> 00:36:08,599
Speaker 1: Yeah, do you remember, Richard, we had we interviewed a

693
00:36:08,639 --> 00:36:12,199
guy on dot net rocks. It was a while ago, at NDC,

694
00:36:12,480 --> 00:36:17,719
I think, and he had discovered how to hack dot

695
00:36:17,800 --> 00:36:21,039
net apps and this was dot net framework, Windows and stuff.

696
00:36:21,599 --> 00:36:24,840
He could get into memory and change things around, and

697
00:36:25,079 --> 00:36:28,480
he had developed a tool to do this, and he

698
00:36:28,519 --> 00:36:31,199
gave the tool away for free, and then he sold

699
00:36:31,880 --> 00:36:34,039
an anti-tool to protect you.

700
00:36:34,920 --> 00:36:36,800
Speaker 2: John McCoy is who you're talking about.

701
00:36:36,880 --> 00:36:39,079
Speaker 1: Yeah, to protect yourself from it. I just remember Richard

702
00:36:39,119 --> 00:36:42,239
and I were horrified. We didn't know anything about this

703
00:36:42,320 --> 00:36:45,599
before we started talking, and then it was like that's evil,

704
00:36:45,840 --> 00:36:46,079
you know.

705
00:36:46,400 --> 00:36:52,239
Speaker 3: But yeah, I remember this is some time ago and

706
00:36:52,320 --> 00:36:56,599
there was a company, who, well, I shouldn't say who, no,

707
00:36:56,639 --> 00:36:58,440
I won't, but they came up with the, they

708
00:36:58,440 --> 00:37:00,000
came up with a white paper that showed how to

709
00:37:00,039 --> 00:37:06,400
snaffle private keys out of memory in IIS, TLS certificates.

710
00:37:07,679 --> 00:37:10,480
But if you bought their hardware, then the key stayed

711
00:37:10,480 --> 00:37:13,280
in the hardware. This is before HSMs, hardware security

712
00:37:13,280 --> 00:37:14,440
modules, were a thing, and they had one of the

713
00:37:14,440 --> 00:37:15,000
first ones.

714
00:37:15,119 --> 00:37:17,440
Speaker 1: But meanwhile, we're going to teach you how to Yeah,

715
00:37:17,679 --> 00:37:18,079
we're going.

716
00:37:18,000 --> 00:37:19,599
Speaker 3: To give you if you buy a device for two

717
00:37:19,599 --> 00:37:22,280
thousand dollars. And then it is kind of funny that

718
00:37:22,320 --> 00:37:25,079
the tool they had was really really simple. It basically

719
00:37:25,079 --> 00:37:27,360
looked for entropy in memory. That's all it really did, right,

720
00:37:27,360 --> 00:37:31,559
because the private key is random, whereas most memory kind

721
00:37:31,559 --> 00:37:34,880
of isn't really, and so they would just look for entropy.

722
00:37:34,920 --> 00:37:36,320
And then what they would do is they were trying

723
00:37:36,320 --> 00:37:37,880
to match up the public key and the certificate with

724
00:37:37,880 --> 00:37:39,760
the private key by decrypting and encrypting. And so they

725
00:37:39,800 --> 00:37:42,280
got the correct answer. But they could do it within

726
00:37:43,000 --> 00:37:44,960
but they can do it in seconds. So now they

727
00:37:44,960 --> 00:37:48,280
know the private key. But hey, if you buy our

728
00:37:48,360 --> 00:37:51,239
hardware device, then the private key stays in the hardware.
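The entropy trick Michael describes is simple to reproduce; here is a rough Python sketch. The window size and threshold are illustrative, and the real tool went further by confirming each candidate against the certificate's public key, which this skips:

```python
import math
import os

def shannon_entropy(data: bytes) -> float:
    # Bits per byte: close to 8 for random key material, far lower for text.
    if not data:
        return 0.0
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = b"This is ordinary English prose sitting in process memory. " * 20
key = os.urandom(1024)  # stand-in for private key material in a dump

# Slide a window over the "memory dump" and flag high-entropy regions
# as candidate keys.
dump = text + key + text
window, threshold = 256, 7.0
hits = [i for i in range(0, len(dump) - window, window)
        if shannon_entropy(dump[i:i + window]) > threshold]

print(bool(hits))  # True -- the random key region stands out from the text
```

English text sits around 4 bits per byte while key material approaches 8, which is why this crude scan works well enough to shortlist candidates.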

729
00:37:51,280 --> 00:37:52,039
Speaker 1: And they probably thought that.

730
00:37:52,079 --> 00:37:53,920
Speaker 3: Don't get me wrong, I'm a huge fan of HSMs,

731
00:37:53,920 --> 00:37:54,320
I really am.

732
00:37:54,360 --> 00:37:55,679
Speaker 1: Probably thought that was a great idea.

733
00:37:55,760 --> 00:37:57,880
Speaker 2: Yeah, but there's also a way to be somebody

734
00:37:57,880 --> 00:38:01,280
who understands security and tries to fix things without you know,

735
00:38:01,360 --> 00:38:02,320
being part of the problem.

736
00:38:02,480 --> 00:38:02,679
Speaker 1: Right.

737
00:38:03,119 --> 00:38:05,440
Speaker 3: Yeah, well, let's back up a little bit. I mean,

738
00:38:05,760 --> 00:38:08,079
did this company do a service? I think so, explaining to

739
00:38:08,119 --> 00:38:10,840
people how easily, if you've got rogue software on the box,

740
00:38:10,880 --> 00:38:13,519
you could snaffle the private key. I think it's a

741
00:38:13,519 --> 00:38:15,440
good thing to know. You know, a lot of people

742
00:38:15,800 --> 00:38:19,039
may not realize right that you could easily get the

743
00:38:19,039 --> 00:38:22,000
private key, and that's why, you know, things like Azure

744
00:38:22,039 --> 00:38:24,639
Key Vault and Managed HSM exist. You know, the private

745
00:38:24,719 --> 00:38:27,320
keys stay in the hardware. They never leave unless you

746
00:38:27,360 --> 00:38:30,559
want to export them in a shrouded manner. Rather, we

747
00:38:30,639 --> 00:38:34,840
give decrypt and signing operations as REST

748
00:38:34,920 --> 00:38:37,440
endpoints instead, so the private key stays in the hardware.

749
00:38:38,079 --> 00:38:42,920
Speaker 1: Yeah, let's talk about GitHub, GitLab, all of them. You

750
00:38:42,960 --> 00:38:48,360
know that people are raising concerns like existential threats, you know,

751
00:38:48,599 --> 00:38:52,840
to our very existence because we're dependent on all these

752
00:38:53,199 --> 00:38:57,800
projects and nobody has a software bill of materials unless they're

753
00:38:57,800 --> 00:39:02,000
really hip, so they don't know where their dependencies are.

754
00:39:02,199 --> 00:39:02,480
Speaker 2: They go.

755
00:39:02,760 --> 00:39:05,360
Speaker 1: You've got Dependabot, which tells you, hey, you've got

756
00:39:05,360 --> 00:39:08,840
some dependencies that aren't safe. But it's kind of a

757
00:39:08,880 --> 00:39:11,079
full time job. It is, playing whack-a-mole with

758
00:39:11,159 --> 00:39:11,920
that stuff. It is.

759
00:39:12,039 --> 00:39:14,280
Speaker 3: Yeah. I mean I get notifications at least once

760
00:39:14,320 --> 00:39:17,599
every other day, not necessarily because of anything I have,

761
00:39:17,719 --> 00:39:20,480
but because of I'm sort of a co owner of

762
00:39:20,480 --> 00:39:24,239
other people's projects, especially one that's actually vulnerable on purpose.

763
00:39:24,880 --> 00:39:27,360
Actually the GitHub guys have one. I actually use it

764
00:39:27,360 --> 00:39:29,360
as a demo at Build. They're kind enough to give

765
00:39:29,400 --> 00:39:32,280
me access to it, but of course I get Dependabot

766
00:39:32,320 --> 00:39:35,559
warnings from that all the time. Yeah, I mean, the

767
00:39:35,599 --> 00:39:38,320
whole bill of materials thing is a really serious problem. Right.

768
00:39:38,320 --> 00:39:42,920
In fact, back in the day, my old boss Steve Lipner,

769
00:39:43,199 --> 00:39:47,320
who has since retired, came up with this term giblets,

770
00:39:47,639 --> 00:39:50,840
which is, you know, components that you depend on that

771
00:39:50,920 --> 00:39:54,960
you don't own. The problem is if they have a vulnerability, congratulations,

772
00:39:54,960 --> 00:39:56,159
you've got a vulnerability too.

773
00:39:56,679 --> 00:39:57,480
Speaker 2: You've got it too.

774
00:39:57,920 --> 00:40:00,599
Speaker 3: And the funny thing is that product that sort of

775
00:40:00,599 --> 00:40:03,679
woke us up to that was actually SQL Server with Slammer.

776
00:40:03,599 --> 00:40:10,000
Speaker 1: Yeah, tell us that story. I can't remember exactly what happened,

777
00:40:10,039 --> 00:40:11,000
like two thousand and one.

778
00:40:11,039 --> 00:40:13,800
Speaker 3: Slammer, yeah, two thousand and one. Yeah. It was a worm,

779
00:40:13,960 --> 00:40:17,360
a really nasty worm. It was a UDP worm. So

780
00:40:17,400 --> 00:40:20,599
it traveled like wildfire, right because you know, you didn't

781
00:40:20,599 --> 00:40:22,679
have to do any three way handshakes. You're just dropping

782
00:40:22,719 --> 00:40:24,360
a UDP packet on the network and off you go.

783
00:40:26,119 --> 00:40:28,599
So, you know, you end up creating huge network

784
00:40:28,639 --> 00:40:33,280
congestion from compromised machines. What was interesting though, oh yeah, okay,

785
00:40:33,280 --> 00:40:36,840
I remember now. But what was interesting though, is that

786
00:40:39,119 --> 00:40:44,320
SQL Server actually wasn't really impacted. What was impacted was

787
00:40:44,440 --> 00:40:48,960
MSDE, right, the embedded version of SQL Server that people

788
00:40:49,000 --> 00:40:51,360
had in their accounting apps, right. They needed a

789
00:40:51,440 --> 00:40:55,360
good quality database that wasn't Access, right, so they wanted

790
00:40:55,400 --> 00:40:57,400
SQL Server, but they didn't want to buy SQL Server,

791
00:40:57,440 --> 00:40:59,239
so they used MSDE, right, the embedded version. So you have

792
00:40:59,239 --> 00:41:03,199
your accounting software running this database back end, and

793
00:41:03,199 --> 00:41:05,400
people didn't realize they had it, so they weren't patching it.

794
00:41:05,400 --> 00:41:07,400
Because they weren't patching it, they got whacked by Slammer.

795
00:41:08,239 --> 00:41:10,199
So that woke us up to this idea.

796
00:41:10,320 --> 00:41:13,400
Steve called it giblets, which is components you depend on

797
00:41:14,000 --> 00:41:15,079
that you don't control.

798
00:41:15,320 --> 00:41:17,239
Speaker 1: So the idea is you're making gravy and you cut

799
00:41:17,320 --> 00:41:20,400
up these things that you that look like little weird parts,

800
00:41:20,440 --> 00:41:22,519
and one of them has like a virus or something,

801
00:41:22,559 --> 00:41:25,679
and is that the idea is that the metaphor, you

802
00:41:25,719 --> 00:41:25,960
know what?

803
00:41:26,000 --> 00:41:29,960
Speaker 3: I think the metaphor is the chicken is yours, but

804
00:41:30,000 --> 00:41:33,800
you've got these giblets inside. I don't I don't think

805
00:41:33,840 --> 00:41:36,599
there's actually a good this. This conversation isn't gonna end

806
00:41:36,639 --> 00:41:40,320
very well. I don't know body parts that's okay, but

807
00:41:40,320 --> 00:41:41,559
but yeah, I.

808
00:41:41,480 --> 00:41:44,599
Speaker 2: Thought the main impact of slimer was just burying networks.

809
00:41:46,519 --> 00:41:49,360
Speaker 3: It was UDP, it was a UDP payload,

810
00:41:49,719 --> 00:41:53,440
and the payload actually fitted in

811
00:41:53,519 --> 00:41:57,320
one UDP packet. So what would happen is, you know,

812
00:41:57,320 --> 00:41:59,800
you'd have a machine that would just generate random IP

813
00:42:00,000 --> 00:42:03,760
addresses and then just send that UDP packet to that endpoint.

814
00:42:04,280 --> 00:42:08,880
And they could do it so quickly because it was UDP, right,

815
00:42:08,920 --> 00:42:10,039
there's no three-way handshake.
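To make the "no handshake" point concrete, here's a minimal sketch of sending a single UDP datagram: one system call, no connection setup, no reply awaited. The payload and target are harmless placeholders, not the worm's actual bytes.

```python
import socket

def send_datagram(payload: bytes, host: str, port: int) -> None:
    """Fire one UDP datagram: no connect(), no three-way handshake, no reply awaited."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))  # one syscall per target, no state kept

# Slammer's entire payload fit in a single datagram of a few hundred bytes aimed
# at UDP 1434; a propagation loop is just this call against random addresses.
send_datagram(b"\x00" * 376, "127.0.0.1", 9)  # harmless placeholder bytes
```

Contrast with TCP, where every probe would first need a full three-way handshake before any payload could move, which is exactly why a UDP worm spreads so much faster.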

816
00:42:10,400 --> 00:42:13,440
Speaker 2: And it was already patched, as I recall, it was patched.

817
00:42:14,280 --> 00:42:16,119
There was a patch, and I think it had been

818
00:43:16,159 --> 00:43:17,639
out for a while too. All of the IT-run

819
00:43:17,719 --> 00:43:20,280
SQL Servers had done the patches. But it's all these

820
00:43:20,320 --> 00:43:23,840
app devs and these little applications that had MSDE embedded

821
00:43:23,920 --> 00:43:27,199
in them that hadn't been patched that became propagation machines.

822
00:42:27,199 --> 00:42:29,719
Speaker 3: That was the problem. There was one beautiful piece of

823
00:42:30,280 --> 00:42:33,840
the code was actually buggy. I could be wrong, and

824
00:42:34,639 --> 00:42:37,320
there's two ways I could go with this. It either

825
00:42:37,480 --> 00:42:40,559
didn't generate odd numbers or it didn't generate even numbers.

826
00:42:40,559 --> 00:42:42,440
I can't remember which. Let's just go with even numbers.

827
00:42:42,800 --> 00:42:46,360
But the random number generator for generating the four octets

828
00:42:46,360 --> 00:42:51,000
of the IP address couldn't generate even numbers, so it left

829
00:42:51,000 --> 00:42:53,239
out huge swaths of the Internet, which is a good thing.
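The parity bug described above is easy to reproduce with a hypothetical sketch: a sloppy multiplicative generator seeded with an odd value and an odd multiplier can never change parity, so half the address space is simply never produced. The constants below are made up for illustration; they are not Slammer's actual code.

```python
def bad_rng(seed: int):
    """Hypothetical buggy generator: odd seed times odd multiplier stays odd forever."""
    x = seed | 1  # force an odd start
    while True:
        x = (x * 214013) % 2**32  # odd * odd is odd: the low bit never clears
        yield x

# Derive the last octet of a candidate IP address from each draw: every value
# is odd, so all even final octets are unreachable by this "random" scan.
gen = bad_rng(12345)
octets = {next(gen) % 256 for _ in range(10000)}
print(all(o % 2 == 1 for o in octets))  # True
```

A generator with this defect skips every address whose targeted octet is even, which is why the buggy worm left large swaths of the Internet untouched.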

830
00:42:54,320 --> 00:42:57,079
But yeah, the other interesting part of that is so

831
00:42:57,119 --> 00:42:58,679
the guy that found the bug was a guy called

832
00:42:58,760 --> 00:43:02,880
David Litchfield who worked in the UK at the time.

833
00:43:02,960 --> 00:43:07,320
He worked for Apple. And in my

834
00:43:07,400 --> 00:43:10,840
last book, Designing and Developing Secure Azure Solutions, I actually

835
00:43:10,840 --> 00:43:13,039
spoke with him. We've been good friends for a

836
00:43:13,039 --> 00:43:15,320
long time, and I talked at length about how he

837
00:43:15,320 --> 00:43:17,360
found the bug and that sort of stuff. But that

838
00:43:17,480 --> 00:43:21,280
was where he put out the announcement about the bug.

839
00:43:21,639 --> 00:43:24,239
He included a proof of concept, right, and that was

840
00:43:24,280 --> 00:43:26,079
the last time he ever put out a proof of

841
00:43:26,119 --> 00:43:29,159
concept because of the damage that was caused by evil

842
00:43:29,159 --> 00:43:29,840
people using it.

843
00:43:29,920 --> 00:43:30,440
Speaker 2: People use it.

844
00:43:30,480 --> 00:43:32,800
Speaker 3: Imagine, just imagine if you

845
00:43:32,880 --> 00:43:35,159
ran the proof of concept in your own little environment

846
00:43:35,159 --> 00:43:36,320
and it escaped from the lab.

847
00:43:36,519 --> 00:43:39,920
Speaker 2: Yeah, you know, yeah, that's the thing with those propagators,

848
00:43:39,960 --> 00:43:42,960
like more often than not they did escape from the lab.

849
00:43:43,079 --> 00:43:45,719
Like that's how they actually got away back in the day,

850
00:43:45,760 --> 00:43:48,239
in the less black-hat days than exist today.

851
00:43:48,679 --> 00:43:51,639
Speaker 3: So for what it's worth, though, I used that Slammer

852
00:43:51,679 --> 00:43:53,559
code, or the bug in SQL Server. Actually, the

853
00:43:53,599 --> 00:43:56,119
bug wasn't actually in SQL Server itself.

854
00:43:56,480 --> 00:43:59,280
It wasn't on TCP one four three three. It

855
00:43:59,320 --> 00:44:02,239
was UDP one four three four, which is

856
00:44:02,280 --> 00:44:06,199
the management interface, which is not the database engine. It's

857
00:44:06,199 --> 00:44:10,400
all the management goop around it. So I actually I

858
00:44:10,440 --> 00:44:15,039
actually use that code for lots of examples about how

859
00:44:15,039 --> 00:44:18,280
we've made progress at Microsoft and in the industry in general.

860
00:44:18,960 --> 00:44:23,920
So for example, so Carl, you mentioned GitHub and you

861
00:44:23,960 --> 00:44:27,880
mentioned Dependabot. Well, another thing that GitHub has which is

862
00:44:27,960 --> 00:44:32,119
really cool is CodeQL, which is absolutely, I'm a

863
00:44:32,480 --> 00:44:35,360
huge fan of CodeQL. And so if you've got a

864
00:44:35,360 --> 00:44:38,119
public repo, or if you have a private repo and

865
00:44:38,159 --> 00:44:40,320
you have an enterprise agreement, but let's just go with

866
00:44:40,320 --> 00:44:43,360
a public repo. If you do a pull request, it will

867
00:44:43,440 --> 00:44:46,960
run CodeQL. I think they call it security scanning or something.

868
00:44:47,039 --> 00:44:49,920
It will run CodeQL over the pull request. And

869
00:44:49,920 --> 00:44:52,920
if there's a vulnerability, well, actually you can't merge the

870
00:44:52,960 --> 00:44:53,559
pull request.
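For context, wiring CodeQL scanning into pull requests is typically done with a workflow file along these lines. This is a sketch of the standard github/codeql-action setup; the action versions and the language list are assumptions to adapt for your repo.

```yaml
# .github/workflows/codeql.yml — illustrative sketch, adjust to your repository
name: CodeQL
on:
  pull_request:
jobs:
  analyze:
    runs-on: ubuntu-latest
    permissions:
      security-events: write   # lets the scan upload its findings
    steps:
      - uses: actions/checkout@v4
      - uses: github/codeql-action/init@v3
        with:
          languages: csharp
      - uses: github/codeql-action/analyze@v3
```

With branch protection requiring the analysis check, a pull request that introduces a flagged vulnerability can't be merged, which is the behavior described above.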

871
00:44:53,880 --> 00:44:55,159
Speaker 1: Yeah, nice, that's great.

872
00:44:54,960 --> 00:44:57,000
Speaker 3: Which is really nice. That's even better though, you can

873
00:44:57,199 --> 00:44:59,760
even opt in for an auto fix and it will

874
00:44:59,840 --> 00:45:04,519
use AI to generate a fix. Yeah, and it will

875
00:45:04,559 --> 00:45:07,000
provide that as an update, and you can say whether

876
00:45:07,039 --> 00:45:11,039
you want to take the take the merge or not. Yeah.

877
00:45:11,079 --> 00:45:11,440
Speaker 1: I guess so.

878
00:45:11,719 --> 00:45:14,360
Speaker 3: But it's really really I'm a huge, huge fan of

879
00:45:14,400 --> 00:45:16,400
CodeQL because you can write your own rules

880
00:45:16,440 --> 00:45:18,119
for it, which I really like.

881
00:45:18,400 --> 00:45:18,920
Speaker 1: That's cool.

882
00:45:19,239 --> 00:45:21,000
Speaker 2: I guess we got to talk about AI because it

883
00:45:21,079 --> 00:45:27,039
sounds like both hacking and and resisting being hacked are

884
00:45:27,079 --> 00:45:28,599
going to be affected by these tools.

885
00:45:29,079 --> 00:45:30,960
Speaker 3: Yeah. I mean it's more than that, right, I mean,

886
00:45:31,280 --> 00:45:33,559
I you know, as we mentioned air quotes in the

887
00:45:33,559 --> 00:45:39,639
green room, the impact AI is having in the software

888
00:45:39,679 --> 00:45:44,880
industry and security especially cannot be overstated. The work that

889
00:45:44,920 --> 00:45:47,960
we're doing at Microsoft, Like I think every single day

890
00:45:48,000 --> 00:45:51,320
I have at least two meetings on my calendar that

891
00:45:51,400 --> 00:45:55,440
have the letters A and I right in there about

892
00:45:55,519 --> 00:46:00,400
something that's going on in software development, security, security testing.

893
00:46:00,400 --> 00:46:02,000
One today, which I had just now, was on security

894
00:46:02,039 --> 00:46:06,440
testing with AI and you know producing you know test

895
00:46:06,440 --> 00:46:10,519
plans that test security. One I had the other day

896
00:46:10,760 --> 00:46:15,440
was some scanning, some really smart password scanning, or,

897
00:46:15,519 --> 00:46:19,519
you know, looking for rogue endpoints or unpatched endpoints using AI.

898
00:46:19,760 --> 00:46:23,519
And literally, these guys, the developers, were vibe coding,

899
00:46:23,800 --> 00:46:26,599
but they were vibe coding, and they're using Go because

900
00:46:26,599 --> 00:46:31,199
of the concurrency to be able to scan things quickly again,

901
00:46:31,599 --> 00:46:33,440
and they actually said that ninety eight percent

902
00:46:33,480 --> 00:46:35,440
of this was vibe coded. The other two percent was

903
00:46:35,480 --> 00:46:39,119
a human being. So you know, there's a demo that

904
00:46:39,159 --> 00:46:42,599
I gave at Build where I had some code that

905
00:46:42,599 --> 00:46:44,159
had a server-side request forgery. It was in C

906
00:46:44,280 --> 00:46:47,719
sharp, a REST endpoint, and I actually said in

907
00:46:47,880 --> 00:46:51,079
Visual Studio, you know, hey, do a security code review

908
00:46:51,119 --> 00:46:55,360
of this code, and, you know, fingers crossed because

909
00:46:55,360 --> 00:46:59,079
of the nondeterministic nature of LLMs. It came back with, hey,

910
00:46:59,159 --> 00:47:02,599
there's a server-side request forgery here, and actually came up

911
00:47:02,599 --> 00:47:05,320
with a couple more I hadn't even thought about. Wow, wow,

912
00:47:05,440 --> 00:47:08,199
So you know, do we even need static analysis tools

913
00:47:08,199 --> 00:47:11,079
at this point? Yes, we do. But you know, it's

914
00:47:11,079 --> 00:47:13,079
just an interesting thing that's built into the product that

915
00:47:13,119 --> 00:47:17,079
gives you not just a bug, but also an explanation

916
00:47:17,119 --> 00:47:19,679
of why the bug's bad. And if you want it,

917
00:47:19,960 --> 00:47:21,159
you know, you can tell it, Hey, we'll fix the

918
00:47:21,159 --> 00:47:23,239
bug for me. If you're in Visual Studio Code

919
00:47:23,280 --> 00:47:25,079
in agent mode or something. It's crazy.

920
00:47:25,239 --> 00:47:27,360
Speaker 2: It's power. It is powerful and at least gives you

921
00:47:27,480 --> 00:47:31,000
ideas. The question of whether or not it's comprehensive is, yeah,

922
00:47:31,039 --> 00:47:33,880
you know, a big one. Well, as well as,

923
00:47:33,920 --> 00:47:35,880
like I said, not deterministic. I asked it again and

924
00:47:35,920 --> 00:47:37,199
it found different vulnerabilities.

925
00:47:37,280 --> 00:47:39,760
Speaker 3: Right the other way looking at it is it's almost

926
00:47:39,760 --> 00:47:42,159
like learning on demand as well. Right, well, I didn't

927
00:47:42,199 --> 00:47:45,719
even know that was a problem. Well now you do, so, Yeah,

928
00:47:45,719 --> 00:47:48,039
it's I'm a huge fan of course in the defensive

929
00:47:48,039 --> 00:47:51,039
side of things. You know, we get so many signals

930
00:47:51,079 --> 00:47:54,320
at Microsoft every single day, many trillions of

931
00:47:54,400 --> 00:47:56,800
signals every single day. There's no way a human being can

932
00:47:56,840 --> 00:47:59,639
go through all of them, and so we use AI extensively

933
00:47:59,719 --> 00:48:02,159
to work out what's real and what's not before a

934
00:48:02,199 --> 00:48:03,320
human gets to look at things.

935
00:48:03,599 --> 00:48:05,719
Speaker 1: I want to shift gears before we end here. And

936
00:48:06,159 --> 00:48:08,679
I know that you know secure code is your thing.

937
00:48:09,400 --> 00:48:11,599
Is there anything that we haven't talked about

938
00:48:11,679 --> 00:48:15,280
that your average developer might fall into the trap of

939
00:48:16,519 --> 00:48:19,880
or needs to be aware of, where they might

940
00:50:19,880 --> 00:50:24,519
leave vulnerabilities in their code. I mean, obviously SQL

941
00:48:25,760 --> 00:48:28,400
injection is a huge one, but I think everybody that

942
00:48:28,480 --> 00:48:30,880
listens to our show and has listened to it knows

943
00:48:31,199 --> 00:48:33,880
how to use parameterized queries and all that stuff. But

944
00:48:33,920 --> 00:48:36,000
are there any other things that might not

945
00:48:36,079 --> 00:48:36,719
be so obvious.

946
00:48:36,880 --> 00:48:41,000
Speaker 3: Yeah, so we mentioned SQL injection. I touched on cross

947
00:48:41,000 --> 00:48:44,320
site scripting before. Remember, we talked about the ASP

948
00:48:44,400 --> 00:48:47,280
dot net class or whatever it was that detected cross

949
00:48:47,280 --> 00:48:49,920
site scripting vulnerabilities, or attacks I should.

950
00:48:49,639 --> 00:48:51,119
Speaker 1: Say anti forgery tokens.

951
00:48:51,239 --> 00:48:54,079
Speaker 3: That's the one. Yeah, that's another one, and it doesn't matter.

952
00:48:54,159 --> 00:48:55,599
But, you know, take server-side request forgery,

953
00:48:55,719 --> 00:48:57,960
SQL injection, cross site scripting, memory corruption, and so on.

954
00:48:58,000 --> 00:48:59,599
They all have one thing in common. They're all input

955
00:48:59,679 --> 00:49:05,440
validation problems, every single one of them. And that problem

956
00:49:05,760 --> 00:49:09,519
transcends vulnerabilities, like there'll be new vulnerabilities in the future

957
00:49:09,519 --> 00:49:11,679
that we don't even know about yet. Even LLMs, right,

958
00:49:12,400 --> 00:49:14,400
LLMs suffer from input validation problems.

959
00:49:14,880 --> 00:49:15,639
Speaker 2: Sure, and the.

960
00:49:15,599 --> 00:49:17,599
Speaker 3: Fact that you're merging the data plane with the control plane.

961
00:49:17,599 --> 00:49:21,559
But that's a whole other problem. So to this day,

962
00:49:21,559 --> 00:49:24,239
and I wrote about this in Writing Secure Code. You

963
00:49:24,280 --> 00:49:26,599
know the first edition of Writing Secure Code, which was

964
00:49:26,719 --> 00:49:29,559
twenty five years ago now. Congratulations.

965
00:49:29,679 --> 00:49:30,599
Speaker 2: It's a lot of time.

966
00:49:31,440 --> 00:49:34,000
Speaker 3: I'm not sure if that's good or just absolutely terrifying.

967
00:49:35,559 --> 00:49:37,920
But yeah, one of the chapters is all input is

968
00:49:37,960 --> 00:49:41,800
evil until proven otherwise. And that's the same today. And

969
00:49:41,840 --> 00:49:44,960
so I still think that if developers just learned that

970
00:49:45,039 --> 00:49:47,199
one thing, like, you know, is this data coming in

971
00:49:47,199 --> 00:49:54,159
through a REST endpoint, through whatever API, a port, a socket

972
00:49:54,480 --> 00:49:59,519
or a WebSocket or whatever, an RPC endpoint, gRPC data

973
00:49:59,519 --> 00:50:01,960
coming in, you know, is it correctly formed? Is it

974
00:50:02,000 --> 00:50:03,960
the correct stuff? Don't look for bad things because that

975
00:50:04,000 --> 00:50:06,880
assumes you know all the bad things. Is it correctly formed?

976
00:50:06,920 --> 00:50:11,079
Like if I'm expecting a zip code, is it numbers

977
00:50:11,079 --> 00:50:14,239
with a potential dash? Right, and that's it? And is

978
00:50:14,280 --> 00:50:17,480
it you know, ten digits long max or something like that.
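The allow-list idea above is tiny in practice: describe what a valid value looks like and reject everything else, rather than hunting for known-bad patterns. A minimal sketch for the US ZIP code example (the exact regex is my assumption, not a quoted rule):

```python
import re

# Allow-list: accept only what a US ZIP code can be — 5 digits,
# optionally a dash and 4 more. Everything else is rejected outright.
ZIP_RE = re.compile(r"\d{5}(-\d{4})?")

def is_valid_zip(text: str) -> bool:
    """Validate by correct form, not by searching for known-bad content."""
    return ZIP_RE.fullmatch(text) is not None

print(is_valid_zip("98052"))                          # True
print(is_valid_zip("98052-6399"))                     # True
print(is_valid_zip("98052'; DROP TABLE Orders; --"))  # False
```

Note the check never looks for quotes or SQL keywords; anything that isn't well-formed, malicious or not, simply never gets in.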

979
00:50:17,639 --> 00:50:19,559
Speaker 4: Ours have got to be alphanumeric because we have other

980
00:50:19,639 --> 00:50:21,920
content. That's right, yeah, exactly right. But even now for

981
00:50:22,039 --> 00:50:25,199
numeric like sorry, M well, well the UK is the same,

982
00:50:25,280 --> 00:50:29,239
right UK. So I used to live in L nine

983
00:50:29,280 --> 00:50:32,800
to ninety L when I was in England. But it's

984
00:50:32,800 --> 00:50:40,079
interesting that that issue is still the number one source

985
00:50:40,079 --> 00:50:42,480
of issues, and in fact there's a whole new

986
00:50:42,559 --> 00:50:44,360
set of issues opened up with LLMs.

987
00:50:45,960 --> 00:50:49,639
Speaker 3: And it's all input validation, that's all it is. I

988
00:50:49,679 --> 00:50:51,800
still say that's like the number one thing that developers

989
00:50:51,840 --> 00:50:54,280
have got to think about is my data correct.

990
00:50:54,360 --> 00:50:54,599
Speaker 2: Yeah.

991
00:50:54,639 --> 00:50:58,239
Speaker 1: I learned about a new attack from doing security this week,

992
00:50:58,280 --> 00:51:01,280
which was deserialization injection.

993
00:51:01,760 --> 00:51:02,039
Speaker 2: Yep.

994
00:51:02,679 --> 00:51:06,280
Speaker 1: So imagine, if you will, you have a comments page,

995
00:51:06,360 --> 00:51:09,079
you know, a comments thing, and there's no limit to it, right,

996
00:51:09,199 --> 00:51:13,239
and somebody just pastes in there some JSON that has

997
00:51:13,280 --> 00:51:17,000
somehow attached some code to it. This is

998
00:51:17,000 --> 00:51:20,360
what I don't understand. Maybe you deserialize that into

999
00:51:20,400 --> 00:51:24,320
an object that has code. Yeah, and that object has

1000
00:51:24,719 --> 00:51:31,119
you know, in its initializer, code

1001
00:51:31,119 --> 00:51:31,639
that runs.

1002
00:51:31,920 --> 00:51:32,119
Speaker 2: Right.

1003
00:51:32,360 --> 00:51:34,159
Speaker 3: Well, I'll give an example. I've got a proof of concept

1004
00:51:34,239 --> 00:51:36,119
on this, which is why we might sort

1005
00:51:36,159 --> 00:51:39,440
of deprecate the binary serializer in C sharp:

1006
00:51:39,519 --> 00:51:43,599
you can deserialize a C sharp object and have the

1007
00:51:43,639 --> 00:51:46,719
constructor run, right. You can.

1008
00:51:46,639 --> 00:51:50,199
Speaker 1: Have exec yeah, exec whatever.

1009
00:51:50,000 --> 00:51:51,920
Speaker 3: Right, and it'll just run, and it'll just run.

1010
00:51:52,320 --> 00:51:52,599
Speaker 1: Yeah.

1011
00:51:52,679 --> 00:51:57,239
Speaker 3: So there's nothing special or funky or magic about it.

1012
00:51:57,320 --> 00:51:59,920
You put whatever the heck you want in the constructor

1013
00:52:00,440 --> 00:52:02,679
and then when it's deserialized, that runs. And if

1014
00:52:02,679 --> 00:52:05,400
it's an exec, you know, I don't know what

1015
00:52:05,440 --> 00:52:07,320
it is in dot net, I'll be honest with you.

1016
00:52:07,480 --> 00:52:10,440
It's Process dot Execute or something, Process dot Run.
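The hosts are describing C# and the old binary serializer, but the same failure mode is easy to show in Python, where pickle will happily run a callable chosen by whoever produced the bytes. This is a deliberately harmless sketch: the gadget calls print, but the tuple could just as easily name a process-launching callable and a command line.

```python
import pickle

class Gadget:
    """A deserialization gadget: __reduce__ tells pickle what to call on load."""
    def __reduce__(self):
        # (callable, args): pickle.loads() invokes the callable with these args.
        return (print, ("this ran during deserialization",))

blob = pickle.dumps(Gadget())  # the bytes an attacker would hand to your endpoint
pickle.loads(blob)             # prints the message: code ran just by "loading data"
```

This is exactly why untrusted input should never reach pickle.loads, or a BinaryFormatter-style deserializer, at all.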

1017
00:52:10,800 --> 00:52:13,239
Speaker 1: Yeah, yeah, you can create a process, you can just

1018
00:52:13,320 --> 00:52:13,599
run it.

1019
00:52:14,280 --> 00:52:15,920
Speaker 3: Now you're running arbitrary code, and the demo I

1020
00:52:15,920 --> 00:52:17,519
give is just running notepad of course, you know.

1021
00:52:18,119 --> 00:52:18,360
Speaker 2: Yeah.

1022
00:52:18,400 --> 00:52:21,079
Speaker 1: But the thing is that that object has

1023
00:52:21,119 --> 00:52:24,039
to be defined on the system that's doing the deserializing.

1024
00:52:24,159 --> 00:52:26,320
So it's not like you can just put in a

1025
00:52:26,360 --> 00:52:28,559
bunch of JSON and it's going to create a class

1026
00:52:28,599 --> 00:52:31,920
that runs. Do you know what I mean? Yeah, yeah,

1027
00:52:32,000 --> 00:52:33,800
you can't deserialize code like that.

1028
00:52:34,599 --> 00:52:36,400
Speaker 3: Yeah, yeah. There's got to be

1029
00:52:36,400 --> 00:52:38,840
a class that, you know, exists. So something like

1030
00:52:39,199 --> 00:52:41,639
a person class, right, yeah. Then you would put it

1031
00:52:41,719 --> 00:52:43,400
in the person constructor. Yeah.

1032
00:52:43,599 --> 00:52:43,800
Speaker 1: Right.

1033
00:52:43,880 --> 00:52:47,760
Speaker 3: So there are actually other examples.

1034
00:52:47,760 --> 00:52:51,000
There's lots of examples, and in fact, in the red

1035
00:52:51,079 --> 00:52:56,159
team we have a person who is an absolute expert

1036
00:52:56,599 --> 00:53:02,960
in building malicious payloads for deserialization. Okay, so I guess

1037
00:53:03,400 --> 00:53:05,000
I'm all I can say is I'm glad he works

1038
00:53:05,000 --> 00:53:06,480
for Microsoft and nobody else.

1039
00:53:07,280 --> 00:53:10,159
Speaker 2: That's good. I've had a number of people

1040
00:53:10,159 --> 00:53:11,480
I've met over the years where you sort of look

1041
00:53:11,480 --> 00:53:13,039
at them later and go, you know, I'm really glad

1042
00:53:13,079 --> 00:53:13,760
you're a good guy.

1043
00:53:14,800 --> 00:53:15,800
Speaker 3: I know.

1044
00:53:16,039 --> 00:53:18,239
Speaker 1: One of the things we do, and you might like

1045
00:53:18,320 --> 00:53:22,039
this on our show, is we take the cv score

1046
00:53:22,119 --> 00:53:25,360
for a given threat, which is usually from one to ten,

1047
00:53:25,639 --> 00:53:29,079
you know, being CVSS. Yes, I'm sorry. The CVE is

1048
00:53:29,079 --> 00:53:32,559
the number that identifies the threat, the CVSS is the score, and

1049
00:53:33,239 --> 00:53:37,559
that if it's ten, basically that means, oh my god,

1050
00:53:37,599 --> 00:53:40,559
the world is going to end. You must patch. But

1051
00:53:41,000 --> 00:53:47,199
we also add a contagion score, which is how realistic

1052
00:53:47,360 --> 00:53:50,280
is it that you the average person is going to

1053
00:53:50,320 --> 00:53:52,119
be affected by this vulnerability?

1054
00:53:52,280 --> 00:53:52,519
Speaker 2: Right?

1055
00:53:53,000 --> 00:53:56,159
Speaker 1: And that to that equation is you know, is it

1056
00:53:56,559 --> 00:53:59,760
connected to the Internet? Do you have to be on

1057
00:53:59,760 --> 00:54:02,440
the local network already? Do you have to have hacked

1058
00:54:02,519 --> 00:54:05,960
to a certain point? So so what do you think

1059
00:54:06,000 --> 00:54:07,679
about that? Do you think that should be an official

1060
00:54:07,719 --> 00:54:10,400
thing like a contagion score or a reality score?

1061
00:54:10,639 --> 00:54:13,480
Speaker 3: I mean, I think that idea is

1062
00:54:13,519 --> 00:54:17,000
actually baked into the modern versions of CVSS, like accessibility,

1063
00:54:17,360 --> 00:54:21,519
does it require an authenticated connection? Does it require human interaction? So,

1064
00:54:21,519 --> 00:54:24,199
if you've got something that requires no authentication, that's remote,

1065
00:54:24,199 --> 00:54:28,440
that requires no human interaction... For example, I remember

1066
00:54:28,440 --> 00:54:30,320
a while ago, there was

1067
00:54:30,320 --> 00:54:33,559
a really nasty bug in iOS where someone could send

1068
00:54:33,719 --> 00:54:36,400
a text message, an SMS message.

1069
00:54:36,880 --> 00:54:38,920
Speaker 1: Yeah, I remember this too, right, you didn't even have

1070
00:54:39,000 --> 00:54:41,079
to open it and it would own your phone.

1071
00:54:41,159 --> 00:54:43,400
Speaker 3: Yeah, exactly, I mean, don't get I mean, you know.

1072
00:54:43,960 --> 00:54:46,119
But again, the nice thing about iOS, you know, most

1073
00:54:46,119 --> 00:54:48,800
mobile platforms, is they're very locked down and isolated, and

1074
00:54:48,880 --> 00:54:52,840
they assume this sort of stuff. But even but even so,

1075
00:54:53,000 --> 00:54:54,800
it was a really nasty bug.

1076
00:54:56,119 --> 00:54:59,480
Speaker 1: So we have seen scores that have been very high,

1077
00:54:59,559 --> 00:55:02,519
like nine or whatever, that we have adjusted down for

1078
00:55:02,599 --> 00:55:06,800
our listeners because because of those things, like you already

1079
00:55:06,800 --> 00:55:08,679
had to be on the network, you had to like

1080
00:55:08,719 --> 00:55:11,199
maybe there had to be a human interaction, like somebody

1081
00:55:11,199 --> 00:55:14,199
got up from their screen and went to the bathroom

1082
00:55:14,239 --> 00:55:16,639
and left themselves logged in, like, you know, a bunch

1083
00:55:16,679 --> 00:55:18,800
of things had to fall into line before you could

1084
00:55:18,840 --> 00:55:22,519
get that thing. So I'm not so sure they do

1085
00:55:22,599 --> 00:55:24,920
a good job of Yeah.

1086
00:55:24,639 --> 00:55:28,159
Speaker 3: But here's the problem. But here's the problem, though, Carl.

1087
00:55:28,320 --> 00:55:32,719
A bug by itself is interesting, a CVE is interesting.

1088
00:55:32,760 --> 00:55:34,920
You look at what the guys doing, the Red Team,

1089
00:55:35,320 --> 00:55:38,960
they rarely take a single CVE. They chain them together.

1090
00:55:39,119 --> 00:55:40,280
Speaker 1: Yeah, yeah, that's true.

1091
00:55:40,360 --> 00:55:44,440
Speaker 3: Right, So you may have a remote and unauthenticated thing

1092
00:55:44,480 --> 00:55:47,880
that does nothing exciting. Then you have another one that's

1093
00:55:47,920 --> 00:55:51,960
a local-only one that gives you some privilege elevation. Then

1094
00:55:52,000 --> 00:55:53,760
you have another one that requires a little bit of

1095
00:55:53,760 --> 00:55:57,679
privileged access, but now it gives you godlike access on the box. Right,

1096
00:55:57,719 --> 00:56:01,679
So he chained three together. So that's where it becomes, yeah,

1097
00:56:01,760 --> 00:56:03,800
that's where it becomes really dangerous. And that's one thing

1098
00:56:03,840 --> 00:56:05,559
I love about the Red Team, right, is they think

1099
00:56:05,599 --> 00:56:10,039
about these things. So John Lambert, corporate VP and security

1100
00:56:10,079 --> 00:56:14,920
fellow at Microsoft, you know, he says that attackers think in graphs,

1101
00:56:16,039 --> 00:56:18,440
that's what they're doing. They're chaining these things together. They

1102
00:56:18,480 --> 00:56:20,559
don't think about individual spot things.

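The chaining described here can be sketched as a tiny graph search. This is purely illustrative: the states and CVE names below are invented for the example, not real vulnerabilities.

```python
# Hypothetical attack graph: each edge is an invented CVE that moves an
# attacker from one level of access to the next. No single edge looks
# critical on its own; the path from "network access" to "root" does.
attack_graph = {
    "network access": [("CVE-A: unauthenticated remote foothold", "local user")],
    "local user":     [("CVE-B: local privilege elevation", "admin")],
    "admin":          [("CVE-C: privileged bug, full control of the box", "root")],
}

def find_chain(state, goal, path=()):
    """Depth-first search for a sequence of exploits reaching the goal."""
    if state == goal:
        return list(path)
    for exploit, next_state in attack_graph.get(state, []):
        chain = find_chain(next_state, goal, path + (exploit,))
        if chain is not None:
            return chain
    return None

# The "graph" view: three modest bugs compose into a full compromise.
print(find_chain("network access", "root"))
```

Scoring each edge in isolation, the defender's "list" view, misses exactly the path that the search above finds immediately.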
1103
00:56:20,800 --> 00:56:23,320
Speaker 1: First we'll use this exploit. Then we'll, you know, take

1104
00:56:23,320 --> 00:56:25,320
a look, use whatever is the lay of the land.

1105
00:56:25,360 --> 00:56:26,960
I guess that's what they call it, you know, living off

1106
00:56:26,960 --> 00:56:30,199
the land, living off the land. Yeah, see what's there? Yeah,

1107
00:56:30,239 --> 00:56:33,159
and then maybe use another exploit, another privesc, and then

1108
00:56:33,159 --> 00:56:33,920
you own the box.

1109
00:56:34,039 --> 00:56:37,280
Speaker 3: Yeah absolutely, And now next thing you know, you're broken

1110
00:56:37,320 --> 00:56:40,760
out of a VM you know, and you're on the

1111
00:56:40,800 --> 00:56:45,000
host, the ultimate path traversal, right. I mean, that's

1112
00:56:45,000 --> 00:56:48,000
pretty nasty, you know. And that's where security guarantees come

1113
00:56:48,000 --> 00:56:50,559
into play. Is what are the security guarantees around your

1114
00:56:50,599 --> 00:56:54,239
security mitigation? Is there a security guarantee around it? So,

1115
00:56:54,239 --> 00:56:56,639
for example, there are security guarantees around VMs. If you

1116
00:56:56,760 --> 00:57:00,360
violate that, then that's a serious problem that we will fix, right.

1117
00:57:00,480 --> 00:57:02,639
But other things, so you know, other things don't have

1118
00:57:02,679 --> 00:57:05,199
security guarantees. A really good example of this is code obfuscation.

1119
00:57:05,280 --> 00:57:08,760
There's no security guarantee. There are none because eventually the

1120
00:57:08,800 --> 00:57:12,280
CPU has to know what instructions to execute. Right.

1121
00:57:12,519 --> 00:57:14,559
Speaker 1: And if, you know, you're a hacker, you're on

1122
00:57:14,679 --> 00:57:18,360
the box and you know assembler or binary, you

1123
00:57:18,400 --> 00:57:20,920
can just go walk the memory and see what's going on.

1124
00:57:21,400 --> 00:57:24,039
Speaker 3: It's funny you bring that up. Someone made a comment

1125
00:57:24,159 --> 00:57:29,159
on X about some C# LINQ stuff or whatever

1126
00:57:29,199 --> 00:57:30,559
and said, which is fastest? And they gave three

1127
00:57:30,559 --> 00:57:34,079
different examples of LINQ, and I said, I don't know,

1128
00:57:34,239 --> 00:57:37,280
So let's look at the assembly language. Nice, you know,

1129
00:57:37,599 --> 00:57:39,840
let's look at what it's doing. And it was

1130
00:57:39,880 --> 00:57:42,199
really interesting. And this guy made a comment to me, said,

1131
00:57:42,199 --> 00:57:45,719
you read assembly language? I said, assembly language is the truth.

1132
00:57:47,639 --> 00:57:49,840
Speaker 2: That's what's landing on the processor.

1133
00:57:50,280 --> 00:57:52,000
Speaker 3: Yeah, I mean you can read whatever you want in

1134
00:57:52,039 --> 00:57:54,119
the you know, in the source code, but it's what's

1135
00:57:54,639 --> 00:57:57,360
generated by the compiler for the LINQ that matters.

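The same "look at what actually runs" exercise can be done one level up in any managed language. As a sketch (in Python rather than C#, since the exact LINQ snippets from the exchange aren't given), the standard dis module shows the bytecode the interpreter really executes for two versions that read very differently in source:

```python
import dis

def sum_squares_comprehension(xs):
    # Reads as one line of "what", not "how".
    return sum(x * x for x in xs)

def sum_squares_loop(xs):
    # Reads as explicit mechanism.
    total = 0
    for x in xs:
        total += x * x
    return total

# Same answer either way; the truth of what runs is in the disassembly.
assert sum_squares_comprehension([1, 2, 3]) == sum_squares_loop([1, 2, 3]) == 14
dis.dis(sum_squares_loop)
```

For C# specifically, tools such as SharpLab expose the JIT's actual assembly output, which is the level this part of the conversation is about.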
1136
00:57:58,039 --> 00:58:01,000
Speaker 1: But quite frankly, I would rather have you look at

1137
00:58:01,079 --> 00:58:03,920
the thing with the assembly language than learn assembly language

1138
00:58:04,039 --> 00:58:07,159
to the point where I could spot a problem. Thank

1139
00:58:07,199 --> 00:58:08,400
you very much, dude.

1140
00:58:08,559 --> 00:58:10,400
Speaker 3: I love assembly language. I'm not gonna lie. I love

1141
00:58:10,480 --> 00:58:14,039
assembly language. It's my happy place. Well, I mean

1142
00:58:14,039 --> 00:58:15,760
C is probably my happy place. I love C, but

1143
00:58:16,079 --> 00:58:18,480
I love assembly language. Yeah, I've been learning a

1144
00:58:18,519 --> 00:58:19,840
lot about ARM assembly language.

1145
00:58:19,840 --> 00:58:22,599
Speaker 2: Well, and C was created to be a more portable

1146
00:58:22,599 --> 00:58:25,920
assembly language that you could write once and generate down

1147
00:58:25,960 --> 00:58:27,719
to assembly for different processors.

1148
00:58:29,199 --> 00:58:31,280
Speaker 1: So what's next for you? What's in your inbox?

1149
00:58:31,480 --> 00:58:35,800
Speaker 3: What's in my inbox? A lot of MCP security these days,

1150
00:58:35,800 --> 00:58:39,639
believe it or not. On the Azure Security Podcast,

1151
00:58:39,800 --> 00:58:43,239
we interviewed a guy who works on MCP at Microsoft

1152
00:58:43,280 --> 00:58:46,159
and also on the working group for MCP.

1153
00:58:46,639 --> 00:58:50,960
Speaker 1: This is, let me get it, Master Control Protocol, a

1154
00:58:51,079 --> 00:58:55,199
little model, monitor model. Anyway, it's what's between your agent

1155
00:58:55,400 --> 00:58:58,800
and the services that you want the agent to access

1156
00:58:58,840 --> 00:59:02,039
on your behalf. Model Context Protocol, model context. But I'm

1157
00:59:02,039 --> 00:59:04,000
never going to remember this. Model Context Protocol.

1158
00:59:04,079 --> 00:59:06,719
Speaker 3: Yeah I I I'm you know what, I have problems

1159
00:59:06,760 --> 00:59:11,320
remembering as well. Yeah, I hear it every single day.

1160
00:59:09,960 --> 00:59:14,679
Speaker 2: Yes, because I keep picking Tron's Master Control Program a lot.

1161
00:59:14,760 --> 00:59:16,239
Speaker 1: That's what it is.

1162
00:59:17,239 --> 00:59:20,800
Speaker 3: But yeah, you know, MCP is being used everywhere,

1163
00:59:20,800 --> 00:59:21,960
and all of a sudden, you know, Azure SQL

1164
00:59:22,000 --> 00:59:25,320
Database has now got an MCP server. Visual Studio

1165
00:59:25,400 --> 00:59:26,960
Code is an MCP client.

1166
00:59:27,480 --> 00:59:30,039
Speaker 1: But last we knew it wasn't secure, right? Well.

1167
00:59:29,840 --> 00:59:34,119
Speaker 3: So actually it's interesting the MCP guys are trying as

1168
00:59:34,119 --> 00:59:37,880
hard as possible to offload, that's a terrible word, the

1169
00:59:37,960 --> 00:59:42,519
security to the underlying environment. Like, so for example,

1170
00:59:43,079 --> 00:59:47,480
authentication shouldn't really be an MCP thing. It really should be:

1171
00:59:47,559 --> 00:59:50,000
You know, if you use OAuth2 for authorization,

1172
00:59:50,360 --> 00:59:52,440
you want to use OpenID Connect for authentication, or you

1173
00:59:52,639 --> 00:59:55,719
use Windows Authentication for authentication, so be it.

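The delegation model being described, where the protocol layer stays auth-agnostic and the host environment supplies the actual check, can be sketched like this. This is a hypothetical illustration, not the real MCP specification or any SDK; make_handler and the validator are invented names for the example.

```python
# Hypothetical sketch: an MCP-style request handler that performs no
# authentication itself. The host environment plugs in whatever check it
# already uses (OAuth2, OpenID Connect, Windows auth, ...).
def make_handler(validate_token):
    """Wrap a tool-call handler with an externally supplied auth check."""
    def handle(request):
        token = request.get("headers", {}).get("Authorization", "")
        if not validate_token(token):
            return {"status": 401, "body": "unauthorized"}
        # Protocol work happens only after the host said yes.
        return {"status": 200, "body": "ran tool " + request["tool"]}
    return handle

# The host decides what "valid" means; the handler never needs to know.
handler = make_handler(lambda token: token == "Bearer example-secret")
print(handler({"headers": {"Authorization": "Bearer example-secret"}, "tool": "query_db"}))
print(handler({"headers": {}, "tool": "query_db"}))
```

The design point is the seam: swapping the lambda for a real OAuth2 or Windows token check changes nothing in the protocol-handling code.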
1174
00:59:55,599 --> 00:59:59,599
Speaker 1: And then your MCP is whatever is there, correct?

1175
01:00:00,119 --> 01:00:05,760
Speaker 3: Correct, which I agree with. There is

1176
01:00:05,800 --> 01:00:09,360
some work going on with authorization. They recognize that, but

1177
01:00:09,920 --> 01:00:15,039
I would not throw out the term "MCP is insecure."

1178
01:00:15,480 --> 01:00:20,360
I would think MCP doesn't do some things on purpose.

1179
01:00:20,920 --> 01:00:23,719
So it's not locked into AWS, it's not locked

1180
01:00:23,760 --> 01:00:26,159
into Azure, it's not locked into Windows or Linux or

1181
01:00:26,159 --> 01:00:29,000
whatever they talk about.

1182
01:00:29,199 --> 01:00:30,400
Speaker 1: Can it be exploited though?

1183
01:00:30,760 --> 01:00:32,880
Speaker 3: Well, actually, funny you should say that. This sort of

1184
01:00:32,880 --> 01:00:36,119
thing came across in my inbox: a

1185
01:00:36,199 --> 01:00:41,400
vulnerability in some MCP client because it wasn't authenticating the server.

1186
01:00:41,800 --> 01:00:44,920
Speaker 1: Well okay, okay, not MCP's fault.

1187
01:00:45,159 --> 01:00:48,119
Speaker 3: That's not an MCP exactly, exactly right, it's not an

1188
01:00:48,199 --> 01:00:51,239
MCP fault. But you know, this bug is being

1189
01:00:51,239 --> 01:00:54,760
billed as an MCP vulnerability, but it's not, you know,

1190
01:00:55,320 --> 01:01:00,440
so I understand, you know, how you're arriving

1191
01:01:00,440 --> 01:01:02,440
at that, but by the same token, you gotta understand

1192
01:01:02,480 --> 01:01:05,119
where the MCP guys draw the line, right, Okay, I

1193
01:01:05,119 --> 01:01:05,719
think that's fair.

1194
01:01:05,840 --> 01:01:07,519
Speaker 1: That's fair, that's fair. I mean I think it was

1195
01:01:07,559 --> 01:01:10,719
when we were talking to Scott Hunter and he had

1196
01:01:10,760 --> 01:01:13,639
some inkling about that, right, that might be where I

1197
01:01:13,679 --> 01:01:15,880
picked that up. Yeah, but it's good to know that

1198
01:01:15,920 --> 01:01:19,199
that it's evolving and all.

1199
01:01:19,079 --> 01:01:22,119
Speaker 3: That, apparently. Yeah, granular authorization is the next thing.

1200
01:01:22,599 --> 01:01:26,280
Speaker 1: Mm hmm. Okay, well, you'll have to Google that, folks,

1201
01:01:26,280 --> 01:01:29,559
because we're out of time. So Michael Howard, thank you

1202
01:01:29,719 --> 01:01:31,559
very much. It's been a pleasure talking to you and

1203
01:01:31,599 --> 01:01:33,159
a little bit scary, but in a good way.

1204
01:01:33,239 --> 01:01:35,039
Speaker 3: Oh, I'm not scary at all.

1205
01:01:35,800 --> 01:01:37,000
Speaker 1: No, No, you're not scary.

1206
01:01:37,039 --> 01:01:39,559
Speaker 2: Life is, but the world is very scary. You just

1207
01:01:39,599 --> 01:01:40,320
reminded us.

1208
01:01:42,559 --> 01:01:44,559
Speaker 1: All right, Thanks a lot, and we'll talk to you

1209
01:01:44,559 --> 01:02:07,599
next time on dot net rocks. Dot net Rocks is

1210
01:02:07,639 --> 01:02:11,320
brought to you by Franklins dot Net and produced by PWOP Studios,

1211
01:02:11,679 --> 01:02:15,719
a full service audio, video and post production facility located

1212
01:02:15,760 --> 01:02:18,679
physically in New London, Connecticut, and of course in the

1213
01:02:18,719 --> 01:02:23,800
cloud online at pwop dot com. Visit our website at

1214
01:02:23,880 --> 01:02:25,719
d O T N E t R O c k

1215
01:02:25,960 --> 01:02:30,760
S dot com for RSS feeds, downloads, mobile apps, comments,

1216
01:02:31,079 --> 01:02:33,599
and access to the full archives going back to show

1217
01:02:33,679 --> 01:02:36,239
number one, recorded in September two.

1218
01:02:36,159 --> 01:02:36,760
Speaker 2: Thousand and two.

1219
01:02:37,360 --> 01:02:39,719
Speaker 1: And make sure you check out our sponsors. They keep

1220
01:02:39,800 --> 01:02:42,960
us in business. Now go write some code, see you

1221
01:02:43,000 --> 01:02:43,440
next time.


