1
00:00:01,080 --> 00:00:04,799
Speaker 1: How'd you like to listen to .NET Rocks with no ads? Easy:

2
00:00:05,360 --> 00:00:08,560
Become a patron for just five dollars a month. You

3
00:00:08,599 --> 00:00:11,320
get access to a private RSS feed where all the

4
00:00:11,359 --> 00:00:14,599
shows have no ads. Twenty dollars a month will get

5
00:00:14,599 --> 00:00:18,440
you that and a special .NET Rocks patron mug. Sign

6
00:00:18,519 --> 00:00:22,920
up now at patreon.dotnetrocks.com. Hi,

7
00:00:23,000 --> 00:00:24,679
this is Carl Franklin.

8
00:00:24,239 --> 00:00:25,480
Speaker 2: And this is Richard Campbell.

9
00:00:25,800 --> 00:00:30,039
Speaker 1: We've got two special shows coming up soon, episode nineteen

10
00:00:30,120 --> 00:00:32,039
ninety nine and two thousand.

11
00:00:32,399 --> 00:00:35,079
Speaker 2: For episode nineteen ninety nine, we're collecting people's Y2K

12
00:00:35,159 --> 00:00:37,520
stories. What did you do to help the Y2K

13
00:00:37,600 --> 00:00:39,719
event not actually happen?

14
00:00:40,200 --> 00:00:42,960
Speaker 1: And for episode two thousand, we're going to be sharing

15
00:00:43,000 --> 00:00:45,479
stories about how .NET shaped your career.

16
00:00:46,000 --> 00:00:48,679
Speaker 2: We have a special page at dotnetrocks.com

17
00:00:48,719 --> 00:00:52,159
slash voxpop where you can record messages for us that

18
00:00:52,200 --> 00:00:55,039
we can play on these special episodes. So tell us

19
00:00:55,039 --> 00:00:57,479
what you did for Y2K and what .NET

20
00:00:57,479 --> 00:00:59,560
means to you, and of course how long you've

21
00:00:59,560 --> 00:01:02,359
been listening to .NET Rocks. So go to

22
00:01:02,399 --> 00:01:05,319
dotnetrocks.com/voxpop now and leave us

23
00:01:05,319 --> 00:01:07,799
a message before the thought evaporates, like whiskey left

24
00:01:07,799 --> 00:01:09,680
in a glass overnight. Do it!

25
00:01:22,200 --> 00:01:24,760
Speaker 1: Hey, it's .NET Rocks. I'm Carl Franklin, and I'm

26
00:01:24,799 --> 00:01:29,200
Richard Campbell. And you know what? Where I am, it's cold.

27
00:01:29,519 --> 00:01:33,840
You know why? There's so much fun snow. I'm supposed

28
00:01:33,879 --> 00:01:36,319
to get a pellet delivery. They can't even back the

29
00:01:36,359 --> 00:01:39,560
truck up to the basement to put the pellets in. Now,

30
00:01:39,560 --> 00:01:41,319
I got to pay two hundred bucks to have the

31
00:01:41,359 --> 00:01:44,239
guys trudge them through the house and down the stairs.

32
00:01:44,560 --> 00:01:46,599
Oh my god, it's snowmageddon.

33
00:01:46,760 --> 00:01:49,319
Speaker 2: Is that cheaper than having someone plow all that out? Like?

34
00:01:49,439 --> 00:01:51,400
Speaker 1: But it's grass. I don't know what the answer is. There's

35
00:01:51,439 --> 00:01:54,879
no plowing, right, there's no plowing to be done otherwise

36
00:01:55,120 --> 00:01:56,000
ripping and tearing.

37
00:01:56,519 --> 00:01:58,040
Speaker 2: I didn't even want to tell you how nice it

38
00:01:58,079 --> 00:02:00,079
is on the west coast now, like we've learned to

39
00:01:59,920 --> 00:02:02,560
shut up because the East Coast's getting nailed. Oh, we've

40
00:02:02,920 --> 00:02:08,039
got nothing here. It's ridiculous. It's ridiculous.

41
00:02:08,120 --> 00:02:10,599
Speaker 1: We never had this much snow and also a cold

42
00:02:10,599 --> 00:02:14,280
snap that's been below freezing for like three weeks.

43
00:02:14,479 --> 00:02:17,319
Speaker 2: Yeah, this is the polar vortex, right, the destabilization of

44
00:02:17,319 --> 00:02:20,520
the Arctic high is pushing down into different parts of

45
00:02:20,560 --> 00:02:23,080
the world. Yeah, and you are one of those parts.

46
00:02:23,280 --> 00:02:26,280
Speaker 1: Yeah, we're getting more snow than the rest of

47
00:02:26,319 --> 00:02:26,800
the country.

48
00:02:26,960 --> 00:02:27,240
Speaker 2: Yeah.

49
00:02:27,280 --> 00:02:29,639
Speaker 1: All right, So let's talk about what happened in nineteen

50
00:02:29,680 --> 00:02:32,840
eighty nine. Oh, my god, so much stuff happened. Yeah,

51
00:02:32,919 --> 00:02:35,919
the fall of communism, well.

52
00:02:35,800 --> 00:02:38,080
Speaker 2: The beginning anyway. I mean, it'll take a couple of years,

53
00:02:38,080 --> 00:02:41,039
but definitely the Eastern Bloc starts to unravel.

54
00:02:40,719 --> 00:02:45,560
Speaker 1: Yep, and the Soviet Union basically says we give up, we're done. Yeah,

55
00:02:45,560 --> 00:02:47,680
we're not going to control all these countries anymore.

56
00:02:47,840 --> 00:02:50,159
Speaker 2: Yeah, I mean, and then we've been to a bunch

57
00:02:50,199 --> 00:02:53,639
of them, and they're museums and things. Like, I'm

58
00:02:53,639 --> 00:02:56,800
thinking about in Vilnius, the KGB Museum where they basically

59
00:02:56,919 --> 00:02:59,000
left it as it was as the Soviets pulled out

60
00:02:59,000 --> 00:03:02,599
in nineteen eighty nine. The duty uniform is still hanging

61
00:03:02,639 --> 00:03:04,439
on the wall, all the documents they had been shredding

62
00:03:04,439 --> 00:03:07,280
for months, like they knew things were coming.

63
00:03:07,400 --> 00:03:10,120
Speaker 1: And next year, as in the next episode, we'll talk

64
00:03:10,120 --> 00:03:13,520
about nineteen ninety and that's where things really kind of well,

65
00:03:13,639 --> 00:03:16,000
some really good things happened, and you know, a lot

66
00:03:16,039 --> 00:03:18,719
of craziness happened.

67
00:03:18,439 --> 00:03:21,319
Speaker 2: You know, we in the West look at nineteen

68
00:03:21,360 --> 00:03:23,599
eighty nine and the collapse of the Eastern Bloc as

69
00:03:23,680 --> 00:03:27,759
dramatic progress in the world. But you understand that

70
00:03:27,800 --> 00:03:32,400
for Russians and for the Chinese it was the ultimate disaster. Right, Like,

71
00:03:32,400 --> 00:03:35,360
there's so much of the current conflict going on today

72
00:03:35,759 --> 00:03:38,960
that starts with the collapse in nineteen eighty nine, right,

73
00:03:39,039 --> 00:03:40,639
because you know the other thing that happens in nineteen

74
00:03:40,680 --> 00:03:44,639
eighty nine, that's Tiananmen Square. Tiananmen Square. Yeah, yeah,

75
00:03:44,680 --> 00:03:47,520
and that was you know, there's this idea that as

76
00:03:47,560 --> 00:03:50,159
they opened up and became more Western, they would adopt, you know,

77
00:03:51,199 --> 00:03:55,039
Western culture, and this is the closest that China got.

78
00:03:55,159 --> 00:03:59,759
But after the crackdown at Tiananmen Square, the establishment

79
00:03:59,800 --> 00:04:02,680
of the technocracy at that point, which today we'd just

80
00:04:02,759 --> 00:04:07,000
plain old call an autocracy, really turns at

81
00:04:07,039 --> 00:04:07,439
that moment.

82
00:04:07,439 --> 00:04:09,280
Speaker 1: There's a lot of that still happening. It's like, we

83
00:04:09,400 --> 00:04:11,599
like democracy, but don't you dare protest?

84
00:04:11,919 --> 00:04:16,120
Speaker 2: Yeah, well, they, you know, don't like democracy.

85
00:04:16,199 --> 00:04:18,160
There's one party. You follow the system, and if you don't

86
00:04:18,160 --> 00:04:20,399
follow the system, you will be re-educated.

87
00:04:20,519 --> 00:04:26,079
Speaker 1: Yeah, yep. Exxon Valdez, of course. Yeah, biggest oil spill

88
00:04:26,279 --> 00:04:27,879
currently eleven million gallons.

89
00:04:28,000 --> 00:04:30,360
Speaker 2: Yep. We're you know, talking about putting a pipeline through

90
00:04:30,360 --> 00:04:33,240
the West Coast of British Columbia, and Valdez comes up because

91
00:04:33,240 --> 00:04:37,319
it's not that far away, and when stuff like bitumen

92
00:04:37,399 --> 00:04:38,800
leaks into the ocean, it's a big deal.

93
00:04:38,920 --> 00:04:42,439
Speaker 1: And that's when the dish detergent Dawn got very popular.

94
00:04:42,920 --> 00:04:45,959
Speaker 2: Well, it saved a lot of birds. And

95
00:04:46,040 --> 00:04:47,519
let's not leave out the fall of the Berlin Wall

96
00:04:47,519 --> 00:04:49,079
because that's November of nineteen eighty nine.

97
00:04:49,240 --> 00:04:53,399
Speaker 1: Yeah, all right, so what happened in technology and space

98
00:04:53,519 --> 00:04:54,399
in nineteen eighty nine.

99
00:04:54,439 --> 00:04:57,120
Speaker 2: Let's do space first. So there are five Shuttle flights,

100
00:04:57,120 --> 00:04:58,879
and again I'm not going to go into detail on

101
00:04:59,000 --> 00:05:01,800
all of them. Discovery flies twice, just launching satellites.

102
00:05:02,079 --> 00:05:04,519
Columbia does a single mission and it's a secret military

103
00:05:04,519 --> 00:05:08,800
flight that they don't televise anything on. It's probably a satellite,

104
00:05:08,839 --> 00:05:10,959
but we don't really know. But the two cool flights

105
00:05:10,959 --> 00:05:15,120
are both on Atlantis, which are satellite launches, but they're

106
00:05:15,240 --> 00:05:19,160
interplanetary probes. They launched the Magellan probe to Venus and

107
00:05:19,199 --> 00:05:23,879
the Galileo probe to Jupiter. This Galileo probe was one

108
00:05:23,879 --> 00:05:27,920
of the great observatories, although it had a problem. Its

109
00:05:28,040 --> 00:05:31,240
main high speed antenna never deploys properly, and it'll be

110
00:05:31,279 --> 00:05:34,639
bandwidth-limited its entire existence. Wow. But it'll still do

111
00:05:34,680 --> 00:05:37,720
its job, just take longer to do it. I have

112
00:05:37,879 --> 00:05:40,759
been remiss, my friend, since we got into the eighties

113
00:05:40,800 --> 00:05:44,319
not to talk about space station Mir, because it's an incredibly

114
00:05:44,360 --> 00:05:47,759
important piece of science, and it started in eighty six,

115
00:05:47,800 --> 00:05:50,079
and I really haven't mentioned it a lot. But in nineteen

116
00:05:50,079 --> 00:05:52,560
eighty nine they actually launched a third module. So the

117
00:05:52,600 --> 00:05:54,600
core module goes up in eighty six, which is basically

118
00:05:54,600 --> 00:05:57,680
a modified Salyut. Oh, and by the way, at this moment,

119
00:05:57,800 --> 00:06:00,959
the Soviet Union is operating two space stations in space.

120
00:06:01,079 --> 00:06:03,600
They have the Salyut seven. This is

121
00:06:03,600 --> 00:06:06,079
in nineteen eighty nine, which they'll continue operating till nineteen

122
00:06:06,120 --> 00:06:08,759
ninety one, the last of the Salyuts. And Mir, which

123
00:06:08,759 --> 00:06:12,040
started in nineteen eighty six, which they'll keep running until

124
00:06:12,079 --> 00:06:14,800
nineteen ninety six. So in eighty six they launch the

125
00:06:14,839 --> 00:06:18,519
core module. In eighty seven they do Kvant-1. And

126
00:06:18,560 --> 00:06:22,639
this is the first time ever in civilization that a

127
00:06:22,759 --> 00:06:25,759
multimodule space station was built. Mir is the beginning of that.

128
00:06:26,120 --> 00:06:28,319
And in nineteen eighty nine they launch Kvant-2, and

129
00:06:28,360 --> 00:06:31,240
so while Kvant-1 was almost entirely astrophysics and sensors

130
00:06:31,240 --> 00:06:37,680
and things, plus some infrastructure like gyroscopes, Kvant-2 has

131
00:06:37,800 --> 00:06:41,240
the EVA airlock on it for the Orlan space

132
00:06:41,240 --> 00:06:43,639
suit and their maneuvering units, as well as more life support,

133
00:06:43,759 --> 00:06:46,319
sensors and so forth. And there'll be a few more modules,

134
00:06:46,519 --> 00:06:49,439
including, eventually, the docking port for the Space Shuttle.

135
00:06:49,480 --> 00:06:51,480
Speaker 1: You know, the only thing I remember about Mir was

136
00:06:51,519 --> 00:06:53,879
there was some sort of leak and they had to evacuate.

137
00:06:54,160 --> 00:06:57,920
Speaker 2: There was an accident with the Progress supply ship. They

138
00:06:57,959 --> 00:07:00,399
lose control of it and it collides with the station.

139
00:07:01,439 --> 00:07:04,160
It's an incredible story. Yeah, it's huge, and they lost

140
00:07:04,839 --> 00:07:07,319
one of the newest modules. The Spektr module depressurized, although

141
00:07:07,319 --> 00:07:08,639
they were able to still use it. They just had

142
00:07:08,639 --> 00:07:10,639
to wear space suits to go into it. On

143
00:07:10,720 --> 00:07:14,519
the interplanetary side, Voyager two makes it to Neptune, takes

144
00:07:14,519 --> 00:07:18,240
our best pictures of Neptune ever, and last but not least,

145
00:07:18,279 --> 00:07:21,480
the Phobos missions. So again, back to the Soviet Union. The

146
00:07:21,519 --> 00:07:23,639
Soviet Union tried so many times to go to Mars,

147
00:07:23,680 --> 00:07:26,600
and they failed consistently, and they were still

148
00:07:26,639 --> 00:07:29,279
launching in pairs, like the Mariner missions the Americans

149
00:07:29,319 --> 00:07:32,879
used to do. Phobos one and Phobos two, these were

150
00:07:32,879 --> 00:07:36,720
two of the largest vehicles ever flown or attempted to

151
00:07:36,720 --> 00:07:39,560
fly to Mars. They were sixty two hundred kilos each,

152
00:07:40,399 --> 00:07:43,519
so that's like twelve thousand pounds. Like these were huge,

153
00:07:43,680 --> 00:07:48,120
huge vehicles. Phobos one has a very anomalous experience, where

154
00:07:48,120 --> 00:07:52,079
it's launched in nineteen eighty eight, but a mis-entered

155
00:07:52,240 --> 00:07:55,920
manual command disables its thrusters and they lose control of

156
00:07:55,959 --> 00:07:57,199
it almost immediately.

157
00:07:57,240 --> 00:07:58,920
Speaker 1: Ben, that was you, right? Totally.

158
00:07:59,120 --> 00:08:01,240
Speaker 2: That wasn't... It was a programmer, who was ultimately

159
00:08:01,240 --> 00:08:04,360
punished for it. It was kind of a big deal. But what

160
00:08:04,399 --> 00:08:06,319
happened, this is almost exactly what happened to Mariner one,

161
00:08:06,319 --> 00:08:08,399
where they put in a command wrong and literally lost

162
00:08:08,720 --> 00:08:12,759
the vehicle in the process. There was test code in the

163
00:08:12,879 --> 00:08:15,319
computer for Phobos one, and this is a very logical

164
00:08:15,360 --> 00:08:17,000
test code. When you're on the ground, it's one of

165
00:08:17,079 --> 00:08:18,680
the things you do after you've finished a test sequence

166
00:08:18,720 --> 00:08:21,120
is you disable the maneuvering thrusters because those things are

167
00:08:21,120 --> 00:08:23,160
full of toxic chemicals and when it's on the ground,

168
00:08:23,199 --> 00:08:25,199
you really don't want those going off, right, and so

169
00:08:25,439 --> 00:08:27,680
when you complete the test sequence, it would immediately shut

170
00:08:27,720 --> 00:08:30,319
those off. Now you don't need that test code in flight.

171
00:08:30,879 --> 00:08:33,559
But because it's the nineteen eighties, they only have PROMs,

172
00:08:33,639 --> 00:08:35,600
and so the only way to remove that code

173
00:08:35,639 --> 00:08:38,240
would be to dismantle the computer and replace the PROM. So

174
00:08:38,279 --> 00:08:40,480
they don't do it. They leave the code in with

175
00:08:40,559 --> 00:08:44,120
a big sign saying: don't run this code. Anyway, guess

176
00:08:44,120 --> 00:08:47,159
what? They ran the code. So a mis-entered command ended

177
00:08:47,240 --> 00:08:51,200
up firing that code, which disables those controllers, and there's

178
00:08:51,200 --> 00:08:52,960
no way to restart them. And so Phobos one,

179
00:08:53,080 --> 00:08:55,759
it never gets to Mars, never gets used. But

180
00:08:55,759 --> 00:08:58,879
Phobos two they have more success with. Originally it was

181
00:08:58,919 --> 00:09:01,000
supposed to fly in eighty six, it flies in eighty eight,

182
00:09:01,039 --> 00:09:03,799
and it actually enters Mars orbit in nineteen eighty nine,

183
00:09:04,200 --> 00:09:07,679
which is a huge success. However, and it was called

184
00:09:07,720 --> 00:09:10,200
Phobos for a reason. There was very

185
00:09:10,279 --> 00:09:13,559
much this gentle political maneuvering, right? This is perestroika and

186
00:09:13,600 --> 00:09:15,600
so forth. So they don't want to step on the Americans.

187
00:09:15,600 --> 00:09:16,879
So they're trying to do things at Mars that the

188
00:09:16,919 --> 00:09:19,360
Americans haven't done. These were called Phobos because they were

189
00:09:19,360 --> 00:09:22,639
specifically going to map and explore Phobos. They even want

190
00:09:22,679 --> 00:09:24,360
to put a little rover down. All these kinds of things.

191
00:09:24,399 --> 00:09:27,159
They didn't get to do most of them. And they

192
00:09:27,240 --> 00:09:29,440
used this triple computer system, which is just the same

193
00:09:29,440 --> 00:09:31,559
thing the Shuttle did, where you have three computers and

194
00:09:31,639 --> 00:09:36,159
whichever two agree that's the correct command. Except one of

195
00:09:36,159 --> 00:09:38,480
the computers completely fails on the way to Mars, so

196
00:09:38,519 --> 00:09:40,559
now they're down to two, and then

197
00:09:40,600 --> 00:09:43,279
a second one starts to fail and gets erratic, so

198
00:09:43,320 --> 00:09:46,360
the one that's working correctly can't send commands because it's

199
00:09:46,360 --> 00:09:46,919
only one.
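
The two-out-of-three voting scheme described above can be sketched in a few lines. This is a minimal illustration of majority voting with invented command names, not the actual Shuttle or Phobos flight software:

```python
# Minimal sketch of two-out-of-three voting: three computers each propose
# a command, and a command is only accepted when at least two agree.
# Illustrative only -- command names and structure are invented.

from collections import Counter

def vote(commands):
    """Return the command at least two computers agree on, else None."""
    live = [c for c in commands if c is not None]  # None = failed computer
    if len(live) < 2:
        return None  # a lone working computer can never form a majority
    winner, count = Counter(live).most_common(1)[0]
    return winner if count >= 2 else None

# Healthy system: two of three agree, so the command goes through.
print(vote(["fire_thruster", "fire_thruster", "idle"]))  # fire_thruster
# One computer dead and one erratic: the good one is outvoted into silence.
print(vote([None, "garbage", "fire_thruster"]))          # None
```

That last case is exactly the failure mode described: with one computer dead and a second erratic, the remaining good computer can never assemble a two-vote majority.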

200
00:09:47,159 --> 00:09:49,759
Speaker 1: See, this is further proof that everybody who goes to

201
00:09:49,840 --> 00:09:52,720
space should always send a programmer with them.

202
00:09:52,759 --> 00:09:54,159
Speaker 2: You know, I don't know if that guy would like

203
00:09:54,159 --> 00:09:57,320
that ride. It's not very comfortable, wow, and the bandwidth

204
00:09:57,360 --> 00:10:01,559
is mediocre. Anyway, they managed to get thirty

205
00:10:01,600 --> 00:10:03,919
seven photos of Phobos and map about eighty percent of it,

206
00:10:03,960 --> 00:10:05,840
they're the most detailed pictures we've ever had of that.

207
00:10:06,279 --> 00:10:09,159
Before they lose control of the vehicle because the computer

208
00:10:09,240 --> 00:10:11,679
just quit. So that's space in nineteen eighty nine.

209
00:10:11,759 --> 00:10:14,879
Speaker 1: All right, I know what's coming next, Richard. What? The

210
00:10:15,000 --> 00:10:18,360
birth of the World Wide Web nineteen eighty nine.

211
00:10:18,519 --> 00:10:21,240
Speaker 2: Oh yeah. There are three important Internet things

212
00:10:21,240 --> 00:10:24,000
that happened in nineteen eighty nine. The first is the

213
00:10:24,080 --> 00:10:28,639
Internet passes one hundred thousand hosts, which is tiny when

214
00:10:28,679 --> 00:10:30,759
you think about it, but at the time, and we

215
00:10:30,919 --> 00:10:33,240
understand in nineteen eighty nine, the Internet's not a for

216
00:10:33,320 --> 00:10:36,360
sure thing. It's sort of a sideline academic thing. It's

217
00:10:36,399 --> 00:10:41,480
not that important. Yeah, the big initiative in industry is OSI.

218
00:10:41,559 --> 00:10:44,440
This is IBM and Sun and even Microsoft. This is

219
00:10:44,440 --> 00:10:46,559
what Bill was talking about at the time. He won't

220
00:10:46,600 --> 00:10:49,240
change his mind for a couple of years yet, but yeah,

221
00:10:49,279 --> 00:10:52,960
one hundred thousand hosts worldwide. It's also the first commercial

222
00:10:53,039 --> 00:10:55,840
dial up connection. So a company called The World, world.std.com,

223
00:10:56,159 --> 00:11:00,919
which is kind of a bad name. Did you ever

224
00:11:01,039 --> 00:11:01,360
check that?

225
00:11:01,600 --> 00:11:02,840
Speaker 1: Should have picked a better name?

226
00:11:02,960 --> 00:11:05,679
Speaker 3: Yeah, World. I mean, maybe they knew that viruses

227
00:11:05,679 --> 00:11:06,639
were going to be coming along.

228
00:11:07,039 --> 00:11:07,679
Speaker 2: Who knows.

229
00:11:07,679 --> 00:11:08,919
Speaker 1: Oh, there you go.

230
00:11:09,000 --> 00:11:11,799
Speaker 2: Okay, yeah, funny you should mention that. I'll

231
00:11:11,799 --> 00:11:14,440
be talking about that a little later there, Ben. So, yeah,

232
00:11:14,480 --> 00:11:16,559
that was the first time you could actually buy a connection,

233
00:11:16,720 --> 00:11:19,240
a dial-up connection to the Internet in North America was

234
00:11:19,320 --> 00:11:22,360
you know, this company called The World. And of course,

235
00:11:22,720 --> 00:11:26,279
our buddy TBL, Tim Berners-Lee, submits a proposal

236
00:11:26,360 --> 00:11:30,519
to CERN for a distributed document management system which will

237
00:11:30,600 --> 00:11:32,159
later be known as the World Wide Web.

238
00:11:32,279 --> 00:11:35,759
Speaker 1: I have a quick story. I was working at Crescent

239
00:11:35,840 --> 00:11:39,320
Software in the nineties, the early nineties, between ninety and

240
00:11:39,399 --> 00:11:41,559
ninety four, or eighty nine to ninety four, I can't remember,

241
00:11:42,360 --> 00:11:46,240
and we came out with a control for Visual Basic

242
00:11:46,320 --> 00:11:49,960
called the hypertext control. And all it was was you

243
00:11:50,000 --> 00:11:53,440
were able to you know, show text with a code

244
00:11:53,480 --> 00:11:56,159
that it wasn't HTML, but it was a code that

245
00:11:56,200 --> 00:11:59,120
had a hyperlink and then it fired an event with

246
00:11:59,240 --> 00:12:02,360
the hyperlink so that you could do whatever. And we thought

247
00:12:02,360 --> 00:12:06,960
this was like totally amazing. But it was around this

248
00:12:07,120 --> 00:12:09,559
time. And it wasn't even one of our

249
00:12:09,600 --> 00:12:11,759
programmers that came up with it. It was somebody who we

250
00:12:11,840 --> 00:12:15,679
had worked with, a consultant, and that control went in

251
00:12:15,679 --> 00:12:17,519
our collection. Isn't that cool?
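
The control Carl describes, plain text with a non-HTML link code that fires an event carrying the hyperlink's target, might look roughly like this. A hypothetical Python sketch only: the `[link:...]` markup, class, and names are invented, and the original was a Visual Basic custom control:

```python
# Rough sketch of a "hypertext control": text containing an embedded link
# code fires an event handler with the link's target when clicked.
# Hypothetical reconstruction -- markup format and names are invented.

import re

LINK = re.compile(r"\[link:(?P<target>[^\]]+)\](?P<label>[^\[]+)\[/link\]")

class HypertextControl:
    def __init__(self, text):
        self.text = text
        self.on_click = None  # event handler: receives the link's target

    def render(self):
        """Return display text with the link codes stripped out."""
        return LINK.sub(lambda m: m.group("label"), self.text)

    def click(self, label):
        """Simulate clicking the link with the given visible label."""
        for m in LINK.finditer(self.text):
            if m.group("label") == label and self.on_click:
                self.on_click(m.group("target"))  # fire the event

ctl = HypertextControl("See [link:page42]the docs[/link] for more.")
ctl.on_click = lambda target: print("navigate to", target)
print(ctl.render())   # See the docs for more.
ctl.click("the docs") # navigate to page42
```

The point, as in the story, is that the control itself doesn't know what a link *does*; it just raises an event and lets the host application decide.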

252
00:12:17,759 --> 00:12:20,840
Speaker 2: Yeah, it's cool, and you know it speaks to that

253
00:12:20,879 --> 00:12:23,240
whole embedded WebView and so forth. Like you guys were

254
00:12:23,240 --> 00:12:28,440
ahead of your time with doing that. All right, a

255
00:12:28,440 --> 00:12:31,960
few other computery things. Microsoft releases the first version of

256
00:12:31,960 --> 00:12:35,159
Office, for the Mac. CorelDRAW comes out. Oh yes,

257
00:12:35,320 --> 00:12:40,360
I remember. SimCity, woo. Yeah, the ZIP file format, Phil

258
00:12:40,440 --> 00:12:44,879
Katz's compression algorithm. Yes, the first version of SQL Server,

259
00:12:44,960 --> 00:12:46,960
one point zero. It's a sixteen bit version of the server

260
00:12:47,200 --> 00:12:50,559
for OS two. Yeah, it's a

261
00:12:50,639 --> 00:12:53,519
port of Sybase SQL Server, a collaboration between Microsoft, Sybase,

262
00:12:53,519 --> 00:12:57,879
and Ashton-Tate. Remember, for dBase? Those guys did dBase,

263
00:12:58,360 --> 00:13:04,480
there you go. And Sound Blaster. Yes. So Creative Technology,

264
00:13:04,480 --> 00:13:07,639
out of Singapore. It's actually their third sound card. The

265
00:13:07,679 --> 00:13:10,720
code name on this one was Killer Card. Of course,

266
00:13:10,799 --> 00:13:13,360
nine-voice FM synthesis using the Yamaha

267
00:13:13,919 --> 00:13:17,000
YM thirty eight twelve, and it was, not oddly enough,

268
00:13:17,000 --> 00:13:19,519
compatible with the AdLib card, which was really the

269
00:13:19,559 --> 00:13:21,519
first of the PC sound cards to storm the

270
00:13:21,519 --> 00:13:24,120
market. They crushed the AdLib card, and they will

271
00:13:24,159 --> 00:13:26,919
crush them all. Yeah. By far the dominant product in the space.

272
00:13:27,120 --> 00:13:30,320
Speaker 1: I had dinner with Mister Sim, who was the engineer

273
00:13:30,360 --> 00:13:32,559
behind the Sound Blaster, when I worked for Voyetra.

274
00:13:32,679 --> 00:13:34,200
Speaker 2: Well, the fact they called it the Killer Card, I

275
00:13:34,200 --> 00:13:36,480
think they knew: we're going after this market and we're

276
00:13:36,480 --> 00:13:37,039
going to take it.

277
00:13:37,120 --> 00:13:38,240
Speaker 1: Yeah, and they did.

278
00:13:38,559 --> 00:13:42,159
Speaker 2: Intel releases the four eighty six DX. So our thirty two

279
00:13:42,200 --> 00:13:44,480
bit processors, they're getting bigger and faster. And you will,

280
00:13:44,480 --> 00:13:46,080
of course, also be able to get the

281
00:13:46,120 --> 00:13:51,120
four eighty seven math coprocessor for it. And Philips

282
00:13:51,159 --> 00:13:54,840
and Sony, in a rare collaboration, produced the first recordable CD.

283
00:13:54,960 --> 00:13:57,720
They called it the CD-WO, for Compact Disc Write Once,

284
00:13:57,799 --> 00:14:00,600
which of course is the standard for all CDs after that.

285
00:14:00,759 --> 00:14:03,000
That's right. Once you write on it, you can't

286
00:14:03,000 --> 00:14:06,320
write on it again. And in reference to Ben's comment about viruses,

287
00:14:06,639 --> 00:14:11,279
The very first recorded case of ransomware, the AIDS trojan,

288
00:14:12,240 --> 00:14:14,799
encrypted and hid files and then you could pay one

289
00:14:14,879 --> 00:14:17,600
hundred and eighty nine dollars to get the decryption key,

290
00:14:17,879 --> 00:14:21,960
although later security analysts decompiled the code and found the

291
00:14:22,039 --> 00:14:26,600
ransomware key in the code, so you didn't need to pay,

292
00:14:26,639 --> 00:14:29,200
just showing that the bad guys are usually not that smart.
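
Recovering a hardcoded key, the way analysts did with the AIDS trojan, often comes down to scanning the binary for printable strings. A toy illustration only: the "binary" and the key below are made up, and the trojan's real format is not shown:

```python
# Toy illustration of why hardcoding a key is fatal: anyone can scan the
# binary for printable strings, the same basic weakness analysts
# exploited in the AIDS trojan. The "binary" and key here are invented.

def printable_strings(data, min_len=6):
    """Yield runs of printable ASCII of at least min_len bytes."""
    run = bytearray()
    for byte in data + b"\x00":  # trailing sentinel flushes the last run
        if 32 <= byte < 127:
            run.append(byte)
        else:
            if len(run) >= min_len:
                yield run.decode("ascii")
            run.clear()

# A fake "compiled" payload with a key embedded among junk bytes.
binary = b"\x7fELF\x01\x02" + b"KEY=ATTACKER123" + b"\x00\x90\x90pay here\x00"
print(list(printable_strings(binary)))  # ['KEY=ATTACKER123', 'pay here']
```

Real analysis of the trojan involved decompiling, not just string scanning, but the lesson is the same: anything shipped inside the program is readable by the victim.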

293
00:14:29,480 --> 00:14:31,879
Speaker 1: You know, if you're gonna ask for money, ask for

294
00:14:31,919 --> 00:14:33,840
more than one hundred and eighty nine dollars.

295
00:14:33,840 --> 00:14:35,039
Speaker 3: I was going to say, I think we've had a

296
00:14:35,080 --> 00:14:36,159
lot of inflation since then.

297
00:14:36,279 --> 00:14:41,120
Speaker 1: Yeah, Well, yeah, this is some pure criminal career advice.

298
00:14:41,639 --> 00:14:43,559
Speaker 2: It's nineteen eighty nine, so one hundred and eighty nine

299
00:14:43,639 --> 00:14:46,120
kind of makes sense. I think. I don't know. Yeah, Okay,

300
00:14:46,200 --> 00:14:46,840
that's what I got.

301
00:14:46,879 --> 00:14:48,919
Speaker 1: All right, I guess it's time for Better Know a Framework.

302
00:14:49,000 --> 00:14:57,559
Speaker 2: Awesome. All right, man,

303
00:14:57,559 --> 00:14:59,799
Speaker 1: What you got? Well, it's kind of a meatball softball.

304
00:15:00,440 --> 00:15:02,840
I just want to bring some more attention to my

305
00:15:02,960 --> 00:15:05,759
security podcast, since we're talking about that with Ben today:

306
00:15:06,679 --> 00:15:11,679
Security This Week. So what's different about this podcast?

307
00:15:11,679 --> 00:15:13,919
First of all, what it is: we talk about, I

308
00:15:13,919 --> 00:15:17,080
don't know, six or seven stories from the week where

309
00:15:17,080 --> 00:15:19,519
there have been breaches, where there have been attacks, or

310
00:15:19,559 --> 00:15:23,960
somebody got caught, or somebody did something great, or there's

311
00:15:24,000 --> 00:15:29,120
some nasty zero day bug that we need to pay

312
00:15:29,159 --> 00:15:32,039
attention to. But a lot of it has been centering

313
00:15:32,080 --> 00:15:36,240
on AI stories lately, and it just is abundantly clear

314
00:15:36,399 --> 00:15:40,039
that once the AIs get a hold of, you know,

315
00:15:40,120 --> 00:15:46,600
creating zero days, and they already do zero day exploits

316
00:15:46,799 --> 00:15:51,240
and exploiting them like at an insane clip that you know,

317
00:15:51,320 --> 00:15:55,320
we're all kind of screwed unless we're taking some

318
00:15:55,519 --> 00:15:59,320
real precautions and guardrails. But anyway, it's the Security This

319
00:15:59,360 --> 00:16:03,080
Week podcast. And the other thing about it is we

320
00:16:03,159 --> 00:16:07,320
laugh to keep from crying, you know, because some of

321
00:16:07,360 --> 00:16:11,360
this stuff is just so horrible it's frightening. But we

322
00:16:11,399 --> 00:16:14,559
certainly do laugh our way through it. So that's it.

323
00:16:15,120 --> 00:16:15,600
Speaker 2: Awesome time.

324
00:16:15,879 --> 00:16:17,279
Speaker 1: Who's talking to us today, Richard?

325
00:16:17,519 --> 00:16:22,679
Speaker 2: I got a comment from a listener, actually a direct

326
00:16:22,679 --> 00:16:28,200
message on LinkedIn, and this is from Timo Toivonen,

327
00:16:28,279 --> 00:16:30,000
and I hope I pronounced your name correctly. I bet

328
00:16:30,039 --> 00:16:32,440
I didn't. And he said, I'm a longtime friend from

329
00:16:32,480 --> 00:16:34,559
Europe and I'm reaching out via LinkedIn because I couldn't

330
00:16:34,559 --> 00:16:37,279
find the listener email. I hope that's okay. It's Richard

331
00:16:37,279 --> 00:16:42,000
at pwop dot com, too. But yeah, okay. Your show

332
00:16:42,080 --> 00:16:45,200
has been instrumental in landing my .NET engineer and

333
00:16:45,559 --> 00:16:48,960
architect role back in two thousand five to twenty thirteen, and

334
00:16:49,000 --> 00:16:51,240
thanks for all the great content over the years. Just

335
00:16:51,240 --> 00:16:53,320
a reminder we've been doing the show entirely too long.

336
00:16:54,879 --> 00:16:59,200
After a decade away in agile coaching, I'm transitioning back

337
00:16:59,240 --> 00:17:02,399
to .NET, as roles now blend scrum master and

338
00:17:02,399 --> 00:17:06,200
product owner with technical .NET advisor and architect responsibilities.

339
00:17:07,920 --> 00:17:10,680
So my challenge: how to efficiently relearn the stack after

340
00:17:10,759 --> 00:17:14,160
ten years away, what's fundamentally changed versus what stayed the same?

341
00:17:14,720 --> 00:17:15,960
Where would you focus?

342
00:17:16,200 --> 00:17:17,839
Speaker 1: And this is a relatively new comment.

343
00:17:17,960 --> 00:17:20,319
Speaker 2: Yeah, no, I just got it. So he left in

344
00:17:20,359 --> 00:17:23,799
twenty thirteen, and now it's twenty twenty six and he's

345
00:17:23,839 --> 00:17:26,839
got a new job, and he's started looking at it,

346
00:17:26,920 --> 00:17:29,440
and just trying to think: in twenty thirteen, cloud is

347
00:17:29,519 --> 00:17:31,279
brand new and still very questionable.

348
00:17:31,359 --> 00:17:34,480
Speaker 1: Yep, right, you're not automatically moved there. There's no

349
00:17:34,599 --> 00:17:36,440
open source .NET, nope.

350
00:17:36,319 --> 00:17:38,319
Speaker 2: It's before the open source versions of .NET and

351
00:17:39,000 --> 00:17:41,880
the open sourcing of the .NET Framework, before all

352
00:17:42,000 --> 00:17:45,359
the Roslyn stuff, like it's before any of that. He

353
00:17:45,519 --> 00:17:49,240
left, arguably at the height, actually the beginning of

354
00:17:50,079 --> 00:17:52,359
what really are the dark times. Like, I would argue

355
00:17:52,359 --> 00:17:55,079
the height of .NET is twenty ten, right? Visual Studio

356
00:17:55,119 --> 00:18:02,920
twenty ten, F sharp, the support for open source just

357
00:18:02,960 --> 00:18:07,480
beginning in there, right? We've got early web

358
00:18:07,559 --> 00:18:10,640
tools, you know, IE nine still not out yet,

359
00:18:10,759 --> 00:18:13,640
like just sort of hit that, you know. Visual Studio

360
00:18:13,680 --> 00:18:15,680
twenty twelve was the one that was all

361
00:18:15,720 --> 00:18:18,759
about Windows eight, and they put in the uppercase menu

362
00:18:18,880 --> 00:18:21,359
items, all menus are uppercase in Visual Studio. So

363
00:18:21,359 --> 00:18:23,240
in twenty thirteen, you know, we took that back out.

364
00:18:23,720 --> 00:18:25,039
So that's a great thing about a bad feature. You

365
00:18:25,039 --> 00:18:28,200
get two versions out of it and then he's gone.

366
00:18:28,440 --> 00:18:30,319
He goes off to live in a happy agile land.

367
00:18:30,519 --> 00:18:33,359
Speaker 1: Yeah, so wow, a few things have changed.

368
00:18:33,440 --> 00:18:37,279
Speaker 2: What all's changed? Yeah, I mean, oddly enough, you're going

369
00:18:37,319 --> 00:18:40,160
to have to think about AI tooling certainly coming in now.

370
00:18:41,200 --> 00:18:43,519
The nice thing is, since you've left, you can skip

371
00:18:43,559 --> 00:18:46,079
over a ton of stuff. Like, you really need

372
00:18:46,079 --> 00:18:48,359
to restudy C sharp and the modern expressions of the

373
00:18:48,440 --> 00:18:50,839
language, because it looks very different today.

374
00:18:50,839 --> 00:18:53,440
Speaker 1: The real thing here is that you know, dot net

375
00:18:53,720 --> 00:18:56,720
went from being a Windows technology to being a multi

376
00:18:56,759 --> 00:18:58,640
platform open source technology.

377
00:18:58,759 --> 00:18:59,000
Speaker 2: Yeah.

378
00:18:59,119 --> 00:19:02,480
Speaker 1: C sharp itself is open source. A lot of

379
00:19:02,480 --> 00:19:06,000
the tools are open source. Visual Studio Code is open source,

380
00:19:06,599 --> 00:19:10,720
and so you know you're not getting charged for you know,

381
00:19:10,839 --> 00:19:13,839
big enterprise, I guess Enterprise editions of Visual Studio they

382
00:19:13,839 --> 00:19:14,839
still charge for but.

383
00:19:14,880 --> 00:19:15,400
Speaker 2: Still out there.

384
00:19:15,519 --> 00:19:18,680
Speaker 1: Yeah, still cost the same, but you don't need that.

385
00:19:18,839 --> 00:19:21,759
There's a free version, the Community version of Visual Studio,

386
00:19:21,799 --> 00:19:24,319
if you want to stay there, for sure. It's just free.

387
00:19:24,400 --> 00:19:27,400
Speaker 2: But you know, thinking about Timo in his architect's role

388
00:19:27,759 --> 00:19:30,839
means he's also going to be dealing with web devs

389
00:19:30,839 --> 00:19:32,880
that have never touched studio in their life. They maybe

390
00:19:32,960 --> 00:19:34,759
use Studio Code, and they're still going to be a

391
00:19:34,759 --> 00:19:36,319
part of the project, so that they're going to need

392
00:19:36,759 --> 00:19:39,759
the dot net kit for that. Yeah, as well as

393
00:19:39,799 --> 00:19:41,839
some traditional developers. Like, I hope they've crossed to

394
00:19:41,880 --> 00:19:43,880
the modern version of dot net because life is better

395
00:19:43,960 --> 00:19:46,319
over there, without a doubt. Oh yeah, that depends on

396
00:19:46,359 --> 00:19:49,559
the kind of clients they're ultimately going to build. He's

397
00:19:49,559 --> 00:19:52,000
missed all of containerization for the most part.

398
00:19:52,079 --> 00:19:54,799
Speaker 1: Yeah, that's right. Unless he kept up with it.

399
00:19:54,880 --> 00:19:56,359
Speaker 2: We lay out our software a wee bit different.

400
00:19:56,359 --> 00:19:58,440
Speaker 1: Now he might have caught up with it. Kept up

401
00:19:58,480 --> 00:19:59,799
with it on the other side, though.

402
00:20:00,279 --> 00:20:02,720
Speaker 2: I know it's a big bite, but you know, if

403
00:20:02,720 --> 00:20:04,519
we were learning containers from scratch right now, I'd just

404
00:20:04,519 --> 00:20:07,319
go directly to Aspire, because you don't use containers for fun.

405
00:20:07,759 --> 00:20:11,240
You're doing it because you're going after cloud architecture, and

406
00:20:11,279 --> 00:20:12,880
whatever bit of scaffolding you can use to get

407
00:20:12,920 --> 00:20:15,279
you to cloud style architectures will make your life easier.

408
00:20:15,319 --> 00:20:17,240
So yep, that's certainly an easier bite.

409
00:20:17,319 --> 00:20:18,000
Speaker 1: Yeah, I agree.

410
00:20:18,119 --> 00:20:21,119
Speaker 2: Anyway, I thought it was a fun comment. Yeah, Timo,

411
00:20:21,200 --> 00:20:23,400
I hope this gave you some ideas. Certainly this

412
00:20:23,519 --> 00:20:26,119
show we're talking you know more about some of the

413
00:20:26,200 --> 00:20:28,799
contemporary tooling and the problems therein. And there's

414
00:20:28,799 --> 00:20:31,200
plenty more shows, so dig through the catalog a bit,

415
00:20:31,240 --> 00:20:33,599
I'm sure you'll find a few that'll help you. So, Timo,

416
00:20:33,640 --> 00:20:34,960
thank you so much for your comment. And a copy

417
00:20:34,960 --> 00:20:36,240
of Music to Code By is on its way to you.

418
00:20:36,279 --> 00:20:37,640
And if you'd like a copy of Music to Code By,

419
00:20:37,640 --> 00:20:39,559
just write a comment on the website at dot NetRocks

420
00:20:39,559 --> 00:20:42,519
dot com or on the Facebooks or the LinkedIns if

421
00:20:42,559 --> 00:20:44,640
you like, and if we read it on the show,

422
00:20:44,680 --> 00:20:45,680
we'll send you a copy of Music to Code By.

423
00:20:45,680 --> 00:20:48,000
Speaker 1: Go to music to code by dot net if you

424
00:20:48,079 --> 00:20:51,400
want to get it yourself: MP three, WAV, and FLAC formats.

425
00:20:51,720 --> 00:20:55,440
All right, so let's introduce Ben Dechrai for the first

426
00:20:55,440 --> 00:20:58,079
time here on dot net rocks. Ben has been a

427
00:20:58,079 --> 00:21:01,880
software engineer for over twenty five years. He's a Microsoft MVP,

428
00:21:02,200 --> 00:21:05,359
and he's been knee deep in security and developer relations

429
00:21:05,359 --> 00:21:08,400
for most of his career. He's spoken at conferences like

430
00:21:08,480 --> 00:21:13,880
DEF CON, NDC, DEVintersection, et cetera, et cetera. These days,

431
00:21:13,880 --> 00:21:17,079
he's co founder of Ven Labs, a startup studio based

432
00:21:17,079 --> 00:21:20,519
in Kansas City, and he's just about to launch braid Flow.

433
00:21:21,160 --> 00:21:25,279
And that's B-R-A-I-D Flow, an AI platform that tackles

434
00:21:25,319 --> 00:21:30,880
the context drift problem: keeping AI conversations focused when things

435
00:21:30,920 --> 00:21:34,640
get complex. Welcome Ben, Thanks, thanks for having me. Wow,

436
00:21:34,640 --> 00:21:37,359
where do we start? I mean, AI security is such

437
00:21:37,400 --> 00:21:41,240
a big problem, and I just want to preface this

438
00:21:41,319 --> 00:21:44,279
by saying, a couple of weeks ago, I got a

439
00:21:44,359 --> 00:21:49,960
brand new six thousand dollar machine here with ninety

440
00:21:50,000 --> 00:21:53,039
six gigs of RAM and a thirty two gig fifty

441
00:21:53,119 --> 00:21:58,359
ninety series card, for the purpose of turning it into

442
00:21:58,519 --> 00:22:02,720
my personal, you know, LLM with Ollama that I can

443
00:22:02,880 --> 00:22:06,640
use like a GitHub Copilot. And it is not

444
00:22:06,759 --> 00:22:07,400
going well.

445
00:22:08,200 --> 00:22:10,240
Speaker 3: No, we were trying something similar here the other day,

446
00:22:10,319 --> 00:22:13,640
my wife and I and we took a pretty good

447
00:22:13,680 --> 00:22:18,400
gaming rig and installed llama.cpp, and it runs dog

448
00:22:18,440 --> 00:22:19,480
slow.

449
00:22:19,519 --> 00:22:23,640
Speaker 1: Well, Ollama runs great on it locally, and even if I

450
00:22:23,759 --> 00:22:27,240
do, you know, if I'm running it just talking to Ollama.

451
00:22:27,279 --> 00:22:30,240
But what I really want is an agent, you know,

452
00:22:30,319 --> 00:22:34,440
à la GitHub Copilot CLI, that does everything. And

453
00:22:34,480 --> 00:22:36,880
these things like I can't get them to get beyond

454
00:22:37,440 --> 00:22:39,960
you know, the JSON that they're going to send,

455
00:22:40,000 --> 00:22:42,160
and then they never send it. Like I've tried three

456
00:22:42,200 --> 00:22:43,960
or four of them and they all get stuck there.

457
00:22:44,000 --> 00:22:44,640
Speaker 2: So I don't know.

458
00:22:45,279 --> 00:22:47,359
Speaker 1: But that's neither here nor there. I just want to

459
00:22:47,920 --> 00:22:49,279
I just want to preface that story.
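
(Editor's aside for anyone attempting the same local setup: a stock Ollama install listens on localhost port 11434 and exposes a simple HTTP generate endpoint that agents and editors can be pointed at. A minimal sketch follows; the model name `llama3` is just an example placeholder, not the model discussed on the show.)

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: stock install, default port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Minimal non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama daemon and a pulled model):
#   ask("llama3", "Explain dependency injection in one sentence.")
```

Tools that expect an OpenAI-style endpoint usually need different paths and payloads, which may be part of why agent CLIs stall against a bare Ollama server.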

460
00:22:49,440 --> 00:22:51,160
Speaker 3: Sure, I mean we can dig into that as well

461
00:22:51,160 --> 00:22:55,519
if you want. It's somewhat related to the security thing.

462
00:22:56,279 --> 00:22:58,880
Like one of the big issues that a lot of

463
00:22:58,960 --> 00:23:01,119
organizations have at the moment when they're thinking about do

464
00:23:01,160 --> 00:23:05,519
we let our developers use these AI development tools is

465
00:23:05,519 --> 00:23:08,480
what happens to that code? Where does it go? Who stores it?

466
00:23:08,720 --> 00:23:11,480
And you have data sovereignty issues with countries as well,

467
00:23:12,000 --> 00:23:14,759
or even Australian clients at the moment who are like.

468
00:23:14,920 --> 00:23:17,319
Speaker 1: Yeah, that's exactly why I want to run Ollama locally,

469
00:23:17,359 --> 00:23:20,119
because my customers don't want me using.

470
00:23:19,880 --> 00:23:23,680
Speaker 3: The tools. So it's huge. I see a future

471
00:23:23,680 --> 00:23:27,799
where probably a lot of people, mostly people who use

472
00:23:27,839 --> 00:23:31,079
technology heavily already, will have their own inference machine at home.

473
00:23:31,720 --> 00:23:34,480
I think that's just going to be commoditized.

474
00:23:35,240 --> 00:23:37,480
Speaker 2: Well, and it just gets rid of the token issue too,

475
00:23:38,200 --> 00:23:40,279
accepting the lesser performance, of course, you're going to get from

476
00:23:40,279 --> 00:23:43,039
it rather than sure, you know, although sooner or later,

477
00:23:43,119 --> 00:23:45,480
these all you can eat accounts are going to go away.

478
00:23:45,680 --> 00:23:48,039
Speaker 3: Oh yeah, I mean they're heavily subsidized at the moment.

479
00:23:48,359 --> 00:23:52,079
They can't continue to do that. Even if electricity

480
00:23:52,119 --> 00:23:56,200
prices and all the associated costs come down, it's still

481
00:23:56,240 --> 00:23:57,079
heavily subsidized.

482
00:23:57,440 --> 00:23:57,640
Speaker 2: Yeah.

483
00:23:58,400 --> 00:24:01,039
Speaker 3: But the other thing is, like I would be happy

484
00:24:01,160 --> 00:24:03,960
at the moment with an inference machine that was perhaps

485
00:24:04,039 --> 00:24:07,200
a little slower but didn't have that limit. So I use

486
00:24:07,519 --> 00:24:11,519
Claude Code, and they have the five hour window, and

487
00:24:11,960 --> 00:24:14,039
if I could get it to do something for ten

488
00:24:14,079 --> 00:24:18,079
hours while I'm sleeping and not just stop halfway through, Yeah,

489
00:24:18,119 --> 00:24:19,720
it doesn't really matter to me if it's a bit slower,

490
00:24:19,720 --> 00:24:21,359
because it still gets more done overall.

491
00:24:21,640 --> 00:24:25,039
Speaker 2: Yeah. Yeah, I've definitely got friends running multiple all you

492
00:24:25,039 --> 00:24:27,839
can eat accounts because they literally run them against each

493
00:24:27,839 --> 00:24:32,440
other overnight routinely. Like that's just the way you do work, right,

494
00:24:32,480 --> 00:24:34,039
You get up in the morning to see what your

495
00:24:34,039 --> 00:24:36,480
half a dozen bots have cooked up for you. I'm

496
00:24:36,480 --> 00:24:38,880
sure you're spending a couple of grand a month, but.

497
00:24:40,680 --> 00:24:44,240
Speaker 3: Look, just between me and my wife, we're spending easily

498
00:24:44,279 --> 00:24:48,839
five k a year. We're looking at two Max accounts each,

499
00:24:49,599 --> 00:24:51,839
sure five six k a year. And then we've got

500
00:24:51,880 --> 00:24:53,759
all the other tooling on top. How much does it

501
00:24:53,799 --> 00:24:56,240
cost to build a pretty good rig for home? How

502
00:24:56,319 --> 00:24:58,440
quickly could you pay that off? In terms of return

503
00:24:58,480 --> 00:25:00,480
on investment, I don't think it'd take that long.

504
00:25:00,519 --> 00:25:03,200
Speaker 1: Well, and like I said, this one costs six thousand, right,

505
00:25:03,720 --> 00:25:06,880
and it's all in one box. Right, it works great?

506
00:25:07,000 --> 00:25:08,799
Speaker 3: Yeah, yeah, well I'll catch up with you afterwards. I'll

507
00:25:08,799 --> 00:25:09,839
find out what specs you're running.

508
00:25:09,920 --> 00:25:13,319
Speaker 2: It's the new reality. And I was just reading Anthropic

509
00:25:13,519 --> 00:25:18,920
is now offering, for a fee, faster service from them. Yeah,

510
00:25:19,119 --> 00:25:20,839
two and a half times, so you can sort of

511
00:25:20,839 --> 00:25:24,000
go fast lane with them. So I feel like those guys,

512
00:25:24,279 --> 00:25:26,279
the Claude guys, have got it figured out. Of all

513
00:25:26,319 --> 00:25:28,759
of the companies, the new AI companies, rather than the

514
00:25:28,799 --> 00:25:31,440
tech giants, the one that seems to be making things

515
00:25:31,440 --> 00:25:33,839
that people want and will spend money on seems to

516
00:25:33,839 --> 00:25:36,400
be Anthropic today anyway. The other

517
00:25:36,200 --> 00:25:39,279
Speaker 3: Thing I really like about Anthropic is their stems on security. Yeah,

518
00:25:39,319 --> 00:25:42,400
they've got huge research teams making sure that they're being

519
00:25:42,400 --> 00:25:45,440
as ethical as possible. They understand the repercussions, that they're

520
00:25:45,480 --> 00:25:47,559
fixing guardrails, all of those kind of things, Like if

521
00:25:47,559 --> 00:25:49,920
I had to put my money on which one's going

522
00:25:49,960 --> 00:25:52,920
to survive because they quote do the right thing or

523
00:25:53,640 --> 00:25:56,519
pay attention to the needs, and they do heavily focus

524
00:25:56,559 --> 00:25:58,359
on the coding side of things. It's where they've made

525
00:25:58,359 --> 00:26:02,200
their names so far. Yeah, I would say if there's

526
00:26:02,200 --> 00:26:06,440
going to be one frontier model or provider that comes

527
00:26:06,440 --> 00:26:13,079
out of this as the go-to for software development, it's Anthropic.

528
00:26:12,839 --> 00:26:15,400
Speaker 2: It sure seems that way. And it's funny how

529
00:26:15,519 --> 00:26:18,319
little the sense of trust towards the tech giants is,

530
00:26:18,839 --> 00:26:20,720
Like you would have thought back in the day, it

531
00:26:20,759 --> 00:26:22,880
was like nobody ever went wrong for using IBM. It's

532
00:26:22,920 --> 00:26:24,880
like you would have thought Microsoft had it sewn up. They

533
00:26:24,920 --> 00:26:28,640
offer the protection of, we'll handle any lawsuits you're exposed to,

534
00:26:28,680 --> 00:26:32,319
and so forth, and still the reputation is shockingly bad.

535
00:26:32,839 --> 00:26:36,000
And here's this little company with minimal of anything. But

536
00:26:36,160 --> 00:26:37,599
we're willing to give them the chance.

537
00:26:37,759 --> 00:26:39,079
Speaker 3: And I'm sure I probably don't want to get too

538
00:26:39,119 --> 00:26:41,079
much into politics, but if you look at what Open

539
00:26:41,119 --> 00:26:42,960
AI has been doing in terms of, we're going to

540
00:26:42,960 --> 00:26:45,519
be open source. Now we're not open source, and people

541
00:26:45,599 --> 00:26:48,000
getting fired, and there's all sorts of stuff going on inside.

542
00:26:48,039 --> 00:26:51,759
There are question marks surrounding that company. It's hard to work

543
00:26:51,759 --> 00:26:53,599
out what direction they're trying to go in.

544
00:26:53,640 --> 00:26:56,599
Speaker 1: They're under a lot of monetary pressure, you know, and

545
00:26:57,000 --> 00:27:01,200
they've spent, they've taken lots of money. What are they

546
00:27:01,119 --> 00:27:05,480
giving back? They're giving back, but, you know, they have competitors,

547
00:27:05,720 --> 00:27:07,039
right sure, yeah.

548
00:27:06,839 --> 00:27:09,759
Speaker 2: Yeah, And I don't see that same sense of this

549
00:27:09,839 --> 00:27:11,640
is a product you must use coming out of open

550
00:27:11,680 --> 00:27:14,200
AI the way I'm seeing over and over again, and

551
00:27:14,240 --> 00:27:17,839
not just one thing, like these new plugins for PowerPoint

552
00:27:17,880 --> 00:27:20,920
and Excel out of Anthropic are stunning, Like what has

553
00:27:21,000 --> 00:27:24,400
the M three sixty five team been doing, that out

554
00:27:24,480 --> 00:27:27,400
of nowhere Anthropic goes, hey, how about this? And people being, oh,

555
00:27:27,440 --> 00:27:29,400
you mean the thing I always wanted from Excel?

556
00:27:29,640 --> 00:27:32,960
Speaker 3: Right, okay. So the Claude Cowork feature that came out

557
00:27:33,000 --> 00:27:37,240
recently was ideated. I'm not sure if I like that word,

558
00:27:37,240 --> 00:27:39,960
but I'll go with it. It was ideated ten days before

559
00:27:40,079 --> 00:27:42,519
it was released. Like, the speed at which they're bringing stuff

560
00:27:42,519 --> 00:27:44,039
out of there. And I read an article the other

561
00:27:44,079 --> 00:27:47,759
day that Anthropic engineers, if you take away the word

562
00:27:47,839 --> 00:27:50,799
software, aren't actually coding anymore. They're using Claude to

563
00:27:50,799 --> 00:27:53,400
make Claude better, right? Yeah, it's all about specifications. It's

564
00:27:53,440 --> 00:27:55,880
all about knowing what you want to build and getting

565
00:27:56,200 --> 00:27:58,480
the AI tooling to understand that to a sufficient enough

566
00:27:58,519 --> 00:28:03,119
level to build a good product. And they're eating their

567
00:28:03,119 --> 00:28:05,880
own dog food, literally, in the best sense of that expression.

568
00:28:06,160 --> 00:28:07,160
And that's how they're doing development.

569
00:28:07,440 --> 00:28:10,680
Speaker 2: Yeah, they're doing what everybody said they would do. These

570
00:28:10,680 --> 00:28:12,720
guys seem to actually be doing it, and are now sort

571
00:28:12,720 --> 00:28:18,519
of running away with what's possible here. Okay, but I

572
00:28:18,559 --> 00:28:21,079
think about the security on the dev side where I

573
00:28:21,160 --> 00:28:23,759
have teams now where when they use remote developers, the

574
00:28:23,920 --> 00:28:27,839
remote developers have to RDP into an instance with the

575
00:28:27,839 --> 00:28:31,039
dev tools on it because they won't allow anything to

576
00:28:31,160 --> 00:28:34,720
go onto even that remote desktop. You know, if you're

577
00:28:34,720 --> 00:28:36,519
going to steal from that code base, it's because you're going

578
00:28:36,559 --> 00:28:39,559
to take pictures of the code. Like, that's how tight they

579
00:28:39,440 --> 00:28:42,759
Speaker 3: Are, which you could probably automate with Claud cowork.

580
00:28:43,279 --> 00:28:48,000
Speaker 4: Oh yeah, so much for protections, right, Although if you're

581
00:28:48,000 --> 00:28:50,160
willing to throw the VMs up in the cloud, you're

582
00:28:50,160 --> 00:28:52,759
probably willing to use the AI in the cloud as well,

583
00:28:52,839 --> 00:28:54,920
Like I just I don't know that we can contain

584
00:28:55,039 --> 00:28:56,920
data all that well. If you're going to be on

585
00:28:56,960 --> 00:28:58,160
the internet, you're on the internet.

586
00:28:58,200 --> 00:29:02,880
Speaker 3: I think the risk profile there is very interesting. If

587
00:29:03,559 --> 00:29:07,559
I was a Department of Defense type organization in a country,

588
00:29:07,559 --> 00:29:10,599
that didn't want my code to leave sovereign borders, totally

589
00:29:10,640 --> 00:29:14,720
understandable. So my source control, my hosting, all of

590
00:29:14,720 --> 00:29:17,640
that needs to be within borders because when I'm pushing

591
00:29:17,680 --> 00:29:20,440
my code around, that code is literally being pushed around.

592
00:29:20,440 --> 00:29:22,640
When database dumps are being sent around, when people are

593
00:29:22,640 --> 00:29:27,480
signing up for accounts, that data is HTTPS, TLS, sure. But

594
00:29:27,759 --> 00:29:29,960
essentially we've got a whole lot of plaintext data

595
00:29:30,000 --> 00:29:33,559
at some point that's been passed around. When you're pushing

596
00:29:33,559 --> 00:29:37,720
context to an inference server, there's plaintext in there, but

597
00:29:37,759 --> 00:29:40,279
then it gets tokenized, and then it gets stored in

598
00:29:40,400 --> 00:29:43,279
some kind of vector representation, and it's pulled out in

599
00:29:43,279 --> 00:29:45,880
interesting ways, and then you've got the key value caching, and

600
00:29:46,359 --> 00:29:49,680
there's a lot of obfuscation under the hood. So what

601
00:29:49,880 --> 00:29:52,880
is it that actually gets stored long term? A lot

602
00:29:52,880 --> 00:29:57,200
of the providers will say that we don't remember your prompts,

603
00:29:57,200 --> 00:29:59,440
and we don't remember the responses, but we remember some

604
00:29:59,440 --> 00:30:02,920
of the stuff that happens in between. So the risk

605
00:30:02,960 --> 00:30:06,200
profile there is interesting because it's no longer as clean

606
00:30:06,240 --> 00:30:09,759
cut as our data is at rest in another country,

607
00:30:09,759 --> 00:30:11,960
because it kind of is, but it also kind of isn't.
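
(Editor's aside on the "it kind of is, but it also kind of isn't" point: tokenization is reversible by construction, so tokenized context is not meaningfully obfuscated. A toy illustration with a made-up vocabulary, not any provider's real tokenizer:)

```python
# Toy illustration: a tokenizer maps text to integer IDs and back.
# Real tokenizers (BPE and friends) are more complex, but equally
# reversible, so tokenized context is effectively still plaintext.
VOCAB = {"the": 0, "secret": 1, "api": 2, "key": 3}
INV = {v: k for k, v in VOCAB.items()}  # inverse mapping for decoding


def encode(words):
    """Map each word to its integer token ID."""
    return [VOCAB[w] for w in words]


def decode(ids):
    """Map token IDs back to the original words."""
    return [INV[i] for i in ids]


tokens = encode(["the", "secret", "api", "key"])
assert decode(tokens) == ["the", "secret", "api", "key"]  # round-trips exactly
```

The genuinely lossy parts are the vector and KV-cache representations mentioned above; the token stream itself recovers the input verbatim.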

608
00:30:12,759 --> 00:30:14,920
So I think when we think about whether or not

609
00:30:15,000 --> 00:30:18,359
we can use these tools, we need to not so

610
00:30:18,480 --> 00:30:20,000
much think about where the data is going, but what

611
00:30:20,079 --> 00:30:25,720
is the likelihood of that data representing a leaking kind of.

612
00:30:25,720 --> 00:30:30,039
Speaker 2: Risk, actually risky information of any kind that could

613
00:30:30,039 --> 00:30:31,039
be harmful to a company.

614
00:30:31,319 --> 00:30:33,160
Speaker 3: So we've seen the examples where somebody would go to

615
00:30:33,400 --> 00:30:36,839
Claude Code or ChatGPT and they would say, count from

616
00:30:36,839 --> 00:30:41,240
one to two thousand, or repeat the word organization one

617
00:30:41,279 --> 00:30:44,039
hundred times was a good example, and then after like

618
00:30:44,079 --> 00:30:46,680
the sixty fourth time, it would start leaking information that

619
00:30:46,720 --> 00:30:50,960
it was trained on. There was the report of the

620
00:30:51,440 --> 00:30:55,079
email signature of a legal firm being leaked that way,

621
00:30:56,160 --> 00:30:58,519
because essentially all it's doing is predicting the next word,

622
00:30:59,240 --> 00:31:03,960
and as the context kind of doubles, it's saying, okay, organization, organization, organization,

623
00:31:04,119 --> 00:31:06,480
and it forgets what it's doing, says organization, and the next

624
00:31:06,559 --> 00:31:10,640
likely word is, is not liable for anything, blah blah blah.

625
00:31:10,640 --> 00:31:13,279
So it starts it's doing what it's supposed to do.

626
00:31:13,440 --> 00:31:15,440
What's the most likely word to come next, and it

627
00:31:15,480 --> 00:31:16,440
predicts from the context.

628
00:31:16,480 --> 00:31:21,039
Speaker 1: It feels like my brain first thing in the morning, Right, So.

629
00:31:21,200 --> 00:31:23,799
Speaker 3: I was going to say, we need more coffee.

630
00:31:24,279 --> 00:31:27,119
Speaker 2: I slept well, right?

631
00:31:29,599 --> 00:31:34,880
Speaker 3: Copies of this context can get pulled out, right, But

632
00:31:35,000 --> 00:31:38,599
is there enough there for it to be a concern?

633
00:31:39,039 --> 00:31:41,160
Speaker 2: Is it harmful in the right way?

634
00:31:41,759 --> 00:31:42,559
Speaker 3: And how do you measure that?

635
00:31:42,799 --> 00:31:42,880
Speaker 4: Like?

636
00:31:42,920 --> 00:31:44,519
Speaker 3: That's the other thing: there's no way for

637
00:31:44,599 --> 00:31:48,480
us to actually know what's gone out to somebody else

638
00:31:49,119 --> 00:31:49,799
and what was that?

639
00:31:49,839 --> 00:31:52,400
Speaker 2: And this is the challenge of a security person is

640
00:31:52,680 --> 00:31:55,400
there are no absolutes. Nothing's ever one hundred percent anything.

641
00:31:55,480 --> 00:31:58,680
It's a scope of risk, right? And so you're looking at

642
00:31:58,720 --> 00:32:02,079
a situation going, well, this seems to be a much smaller scope. Yes,

643
00:32:02,119 --> 00:32:04,200
it's not absolute, but it's also not a high scope

644
00:32:04,240 --> 00:32:07,119
of risk here. Totally. Interesting problem. All right, we should

645
00:32:07,119 --> 00:32:09,279
take a little break and maybe dig into some security

646
00:32:09,279 --> 00:32:11,680
fundamentals that folks can come away with. That's good,

647
00:32:11,960 --> 00:32:18,480
right after these important messages. And we're back. It's dot

648
00:32:18,480 --> 00:32:21,400
NetRocks. I'm Richard Campbell. That's Carl Franklin. Yo. Talking to

649
00:32:21,440 --> 00:32:25,680
our friend Ben, Ben Dechrai, about, well, you've been a

650
00:32:25,720 --> 00:32:29,039
security guy before AI was dominant on these things, right,

651
00:32:29,079 --> 00:32:31,160
you were the auth guy. Like, you've done all

652
00:32:31,160 --> 00:32:34,000
of that sort of stuff. Like, how much has that changed?

653
00:32:34,240 --> 00:32:36,240
You've had the battle of trying to get developers to secure

654
00:32:36,400 --> 00:32:38,160
software all along.

655
00:32:38,559 --> 00:32:41,920
Speaker 3: Developers love security insofar as they want to do a

656
00:32:41,920 --> 00:32:42,400
good job.

657
00:32:42,559 --> 00:32:43,480
Speaker 2: Yeah, sure, but.

658
00:32:43,680 --> 00:32:46,000
Speaker 1: But they don't like implementing it or being restricted by it.

659
00:32:46,039 --> 00:32:48,960
Speaker 3: Though I think we want to implement it. Maybe I'm

660
00:32:48,960 --> 00:32:50,799
just weird like that because I like security. But I

661
00:32:50,839 --> 00:32:54,119
think oftentimes the biggest story that I've seen the narrative

662
00:32:54,160 --> 00:32:57,079
of software development in the last twenty five thirty years

663
00:32:57,319 --> 00:33:00,039
is developers want to do it, but we need to

664
00:33:00,079 --> 00:33:03,359
watch. Back in the early days, like the early two thousands,

665
00:33:03,400 --> 00:33:06,119
it used to be that security was a feature. You'd

666
00:33:06,119 --> 00:33:08,559
have like something in the backlog, we need to implement

667
00:33:08,559 --> 00:33:10,559
this feature, and then there's another ticket saying make sure

668
00:33:10,559 --> 00:33:11,039
it's secure.

669
00:33:11,240 --> 00:33:15,880
Speaker 1: I would argue that back then the threats weren't as persistent, pernicious, or.

670
00:33:16,200 --> 00:33:20,079
Speaker 3: Numerous. They were different, but you still don't want credit

671
00:33:20,079 --> 00:33:23,039
card data leaked across the internet, like the fact that

672
00:33:23,039 --> 00:33:24,720
it was put in as a separate ticket that could

673
00:33:24,720 --> 00:33:28,880
get pushed to the next sprint was an issue. And

674
00:33:28,960 --> 00:33:31,799
then we started treating security differently. We realized that was probably

675
00:33:31,799 --> 00:33:35,000
a bad idea because we were separating the

676
00:33:35,000 --> 00:33:38,960
feature implementation from the security implementation. So we decided to

677
00:33:38,960 --> 00:33:41,759
put it all into one ticket. And the problem with that

678
00:33:41,880 --> 00:33:45,160
is that, whereas before the team had visibility of, we're

679
00:33:45,160 --> 00:33:47,240
not doing the security stuff that we should be doing,

680
00:33:47,680 --> 00:33:49,519
now because it was wrapped up into a single ticket,

681
00:33:49,640 --> 00:33:53,559
it became the developer's problem, because the project manager would

682
00:33:53,559 --> 00:33:55,400
come along on the Thursday morning and say, we need

683
00:33:55,400 --> 00:33:57,160
to get this out by the end of tomorrow, and

684
00:33:57,200 --> 00:33:59,960
you're like, well, I don't have enough time to do

685
00:34:00,079 --> 00:34:02,559
the full ticket. I can do the implementation and leave

686
00:34:02,599 --> 00:34:04,440
the security till later, but now we're not tracking that.

687
00:34:04,759 --> 00:34:07,400
So it became invisible that we weren't doing it. So we've

688
00:34:07,400 --> 00:34:08,239
always had a problem.

689
00:34:08,320 --> 00:34:10,360
Speaker 2: Right, I can build, I can build a building, but

690
00:34:10,400 --> 00:34:11,719
I can't put a door with a lock on.

691
00:34:12,039 --> 00:34:15,039
Speaker 3: Right, I'll get round to it. We'll do that next week.

692
00:34:15,079 --> 00:34:16,960
And then next week they're like, nope, this week is

693
00:34:17,000 --> 00:34:19,760
windows week. Yeah, yeah, still no locks on the windows.

694
00:34:20,400 --> 00:34:23,000
Speaker 1: So part yeah, part of this is meanwhile, somebody came

695
00:34:23,039 --> 00:34:24,480
in and stole your hammer.

696
00:34:24,360 --> 00:34:26,840
Speaker 2: Right, and it's just so there's also a problem with

697
00:34:26,920 --> 00:34:30,360
the PMs not prioritizing security there until it obviously becomes

698
00:34:30,400 --> 00:34:30,920
a crisis.

699
00:34:31,159 --> 00:34:32,480
Speaker 3: So this has been one of the things that I've

700
00:34:32,480 --> 00:34:34,480
tried to do as much as possible with helping developers

701
00:34:34,519 --> 00:34:37,679
understand how to write secure software and the processes to follow.

702
00:34:38,000 --> 00:34:41,559
It's not just your processes or the engineering team's processes.

703
00:34:41,920 --> 00:34:49,039
It's an institutional understanding, systemic understanding within your organization that

704
00:34:49,159 --> 00:34:51,440
this is important. Get buy-in from people outside of the

705
00:34:51,440 --> 00:34:55,000
engineering department, not just outside the engineering team. Because

706
00:34:55,039 --> 00:34:57,559
if your marketing team and your sales team and

707
00:34:57,599 --> 00:35:00,159
your HR team all know, if we don't do this,

708
00:35:00,159 --> 00:35:02,239
this could be really bad for our PR, this could

709
00:35:02,239 --> 00:35:06,280
be something that doesn't allow us to succeed beyond our

710
00:35:06,280 --> 00:35:08,480
competitors over the next few years. But give it to

711
00:35:08,480 --> 00:35:10,920
them in rational terms. That's a lot of the

712
00:35:11,000 --> 00:35:13,039
kind of stuff that we software engineers need to do.

713
00:35:13,079 --> 00:35:14,679
We kind of need to push the message up into

714
00:35:14,719 --> 00:35:19,320
the organization because if you don't get buy in, then

715
00:35:19,360 --> 00:35:22,280
there's no understanding of why right, you're spending time on.

716
00:35:22,239 --> 00:35:26,599
Speaker 2: That. Yeah, and it's just, you can't ship without

717
00:35:26,599 --> 00:35:29,360
this anymore. You just can't. It's like, I guess if

718
00:35:29,360 --> 00:35:30,880
I was in his shoes, it's like,

719
00:35:31,079 --> 00:35:33,480
so you don't actually need me to store the data either, right,

720
00:35:33,599 --> 00:35:38,960
Like that's not important. Totally. These are not features,

721
00:35:39,000 --> 00:35:41,960
This is infrastructure. These are basic things we need to

722
00:35:41,960 --> 00:35:43,920
do before we can do anything else. Well, I guess

723
00:35:44,239 --> 00:35:46,480
you know I've heard you say this line that the

724
00:35:46,480 --> 00:35:49,280
shift left of security is like part of the basic

725
00:35:49,360 --> 00:35:52,840
plumbing, that the initial startup of this app includes its

726
00:35:52,840 --> 00:35:54,599
security. Totally.

727
00:35:54,719 --> 00:35:56,599
Speaker 3: So bringing AI into the picture, it doesn't change that

728
00:35:57,039 --> 00:36:00,320
at all. The way I like to think of AI

729
00:36:00,480 --> 00:36:06,480
as an engineering assistant is I used to describe it

730
00:36:06,519 --> 00:36:08,639
as the junior engineer that you're pair programming with.

731
00:36:08,880 --> 00:36:13,239
Speaker 3: That's changing now because we're getting more into like the

732
00:36:13,719 --> 00:36:16,840
Ralph loops and the orchestration and all of these kinds

733
00:36:16,840 --> 00:36:21,239
of things that allow the LLMs in the background to

734
00:36:21,360 --> 00:36:24,400
chug away for hours. We're getting to the point now

735
00:36:24,400 --> 00:36:30,719
where we're having to make a decision between speed and oversight.

736
00:36:31,599 --> 00:36:33,639
I'm not sufficiently convinced either way at the moment.

737
00:36:33,679 --> 00:36:37,280
I'm currently personally going for speed because I think they're

738
00:36:37,280 --> 00:36:39,440
bringing guardrails and test suites and all

739
00:36:39,480 --> 00:36:41,440
of those things that we used to do under the

740
00:36:41,440 --> 00:36:44,920
traditional software development life cycle, the waterfall life cycle,

741
00:36:45,360 --> 00:36:48,840
that kind of got diluted. I don't know if there's

742
00:36:48,840 --> 00:36:50,920
a pun in there somewhere with waterfall, but it got

743
00:36:50,960 --> 00:36:53,920
diluted when we went to more agile, the run fast

744
00:36:53,920 --> 00:36:56,159
and break stuff kind of methodology of the twenty tens,

745
00:36:56,519 --> 00:36:59,280
where we'd write software and it was relatively easy to

746
00:36:59,280 --> 00:37:01,280
go back and change if a client came back with

747
00:37:01,280 --> 00:37:02,199
a change request.

748
00:37:02,480 --> 00:37:04,000
Speaker 2: Yeah, but you also had a tight tie to the

749
00:37:04,039 --> 00:37:06,159
client the whole time. Right? It wasn't just 'don't have

750
00:37:06,199 --> 00:37:08,599
a plan,' it's that the person you're planning with is

751
00:37:08,639 --> 00:37:12,000
sitting beside you, so you have a lot of input.

752
00:37:11,719 --> 00:37:14,079
Speaker 3: Sure. But the problem at this point is all of

753
00:37:14,119 --> 00:37:16,119
the information is now in the heads of two people,

754
00:37:16,480 --> 00:37:19,599
as opposed to documented and understandable. So if we think

755
00:37:19,639 --> 00:37:24,440
of developing with AI the same way, it doesn't necessarily

756
00:37:24,519 --> 00:37:26,880
have to be enterprise-scale software development. But let's say you've

757
00:37:26,880 --> 00:37:28,960
got a team of four people working on a project. Sure,

758
00:37:29,119 --> 00:37:30,880
and one of those team members could be swapped out

759
00:37:30,880 --> 00:37:33,719
for another team member at the drop of a hat. And

760
00:37:33,800 --> 00:37:35,840
that's kind of the equivalent of the context window. Right

761
00:37:36,239 --> 00:37:39,079
as soon as the LLM wipes its context window and starts again,

762
00:37:39,119 --> 00:37:41,360
it's like a new developer joining the team. How quickly

763
00:37:41,400 --> 00:37:44,039
can you onboard that developer? Before, we had it

764
00:37:44,039 --> 00:37:47,400
in our heads and we were being hand-held, or hand-holding,

765
00:37:47,559 --> 00:37:52,079
probably bidirectional hand-holding, with a client, trying to develop the software.

766
00:37:52,599 --> 00:37:54,880
It was fast because we were able to iterate quickly

767
00:37:54,960 --> 00:37:58,719
because the feedback loop was small, but the documentation and

768
00:37:58,719 --> 00:38:04,360
the persistence of that information was not there. So if

769
00:38:04,360 --> 00:38:07,159
we think about the way we used to write software,

770
00:38:07,159 --> 00:38:10,920
where specifications and understanding the problem was core before you

771
00:38:10,920 --> 00:38:15,079
even started coding, I see a change back in that direction. Yeah,

772
00:38:15,079 --> 00:38:18,719
we're getting specification-driven development again, where you write some

773
00:38:18,760 --> 00:38:21,639
really good specs and then you can even use an

774
00:38:21,639 --> 00:38:25,079
LLM to convert that into an implementation plan, and then

775
00:38:25,119 --> 00:38:26,719
you give that implementation plan.

776
00:38:27,400 --> 00:38:29,280
Speaker 2: And we have tools like this, right? GitHub's

777
00:38:29,400 --> 00:38:32,000
Spec Kit is literally that, right?

778
00:38:32,320 --> 00:38:35,000
Speaker 3: Yep, Spec Kit's great. OpenSpec I've been playing with

779
00:38:35,079 --> 00:38:39,840
as well. I actually find nowadays just chatting with Claude

780
00:38:40,719 --> 00:38:42,440
and talking. As long as you give it a framework,

781
00:38:42,480 --> 00:38:44,079
like, this is what I want a specification file

782
00:38:44,119 --> 00:38:47,119
to look like, and I want specification files to

783
00:38:47,119 --> 00:38:50,079
be around the topic of concern as opposed to functional specifications.

784
00:38:51,199 --> 00:38:54,960
You can just chat with an LLM directly. A lot

785
00:38:54,960 --> 00:39:00,679
of these frameworks for developing specs are great, but if

786
00:39:00,679 --> 00:39:03,360
somebody wants to get into it and just chat with

787
00:39:03,400 --> 00:39:06,320
Claude for half an hour and then say, now build something,

788
00:39:06,760 --> 00:39:09,480
as long as you're able to define the outcome. And

789
00:39:09,519 --> 00:39:11,960
this is the biggest thing. I'm seeing memes come up

790
00:39:11,960 --> 00:39:14,519
all the time on Facebook and LinkedIn now of like

791
00:39:14,599 --> 00:39:17,960
the four panel cartoon where on the one side you've

792
00:39:17,960 --> 00:39:21,039
got a vibe coder and they're like, build me a game.

793
00:39:21,239 --> 00:39:23,800
That's awesome, and then the next frame is the computer

794
00:39:23,800 --> 00:39:26,440
catches fire. And then on the other side you've got

795
00:39:26,920 --> 00:39:29,920
what this particular cartoon called a vibe engineer, which is

796
00:39:29,960 --> 00:39:32,079
build me a game that uses this technology on this

797
00:39:32,119 --> 00:39:34,159
platform and WebSockets and this, that and the other,

798
00:39:34,559 --> 00:39:36,360
and then in the next frame is a functioning app.

799
00:39:36,719 --> 00:39:38,840
The only difference there is the amount of specification and

800
00:39:38,880 --> 00:39:41,400
time taken to put the detail into that specification at

801
00:39:41,400 --> 00:39:42,199
the beginning.
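
The difference the cartoon captures, that the vibe engineer's prompt carries a spec and the vibe coder's doesn't, can be made concrete as plain string assembly. A sketch in Python of building a structured spec into one prompt before handing it to a model; the field names and the example spec are invented for illustration and are not the format used by Spec Kit or any other real tool:

```python
# Sketch: assemble a structured spec into a single prompt for a coding
# agent. Field names and the example spec are invented for illustration;
# this is not the format of Spec Kit or any other real tool.

def spec_to_prompt(goal: str, stack: list[str], constraints: list[str],
                   acceptance: list[str]) -> str:
    """Render a 'vibe engineer' style prompt from spec fields."""
    lines = [f"Goal: {goal}", "Stack:"]
    lines += [f"- {s}" for s in stack]
    lines.append("Constraints:")
    lines += [f"- {c}" for c in constraints]
    lines.append("Acceptance criteria:")
    lines += [f"- {a}" for a in acceptance]
    return "\n".join(lines)

prompt = spec_to_prompt(
    goal="Build a multiplayer word game",
    stack=["TypeScript", "WebSockets"],
    constraints=["no third-party game engine", "must run in the browser"],
    acceptance=["two players can join a room", "scores persist per session"],
)
print(prompt)
```

The point is not the formatting but that every acceptance criterion written here is one thing the model can be checked against afterwards, which "build me a game" never provides.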

802
00:39:41,880 --> 00:39:44,880
Speaker 2: Sure. Well, and the nice thing is you keep iterating

803
00:39:44,880 --> 00:39:47,119
on this too. You know, I've seen these specs sort

804
00:39:47,119 --> 00:39:49,679
of evolve and make better and better code as we've

805
00:39:49,760 --> 00:39:52,360
learned the limitations of the tools. Although you get to

806
00:39:52,360 --> 00:39:53,800
a certain size of a spec where now you have

807
00:39:53,840 --> 00:39:56,039
to decompose it, like you can only feed so much

808
00:39:56,079 --> 00:39:59,840
data into any given agent. The other piece I've seen is,

809
00:40:00,280 --> 00:40:03,280
and we got this right away, it writes better PRs,

810
00:40:03,559 --> 00:40:06,920
it writes better issues. Like, you actually, naturally, when you're

811
00:40:06,960 --> 00:40:09,679
using these tools, get better documentation to get someone back

812
00:40:09,760 --> 00:40:12,599
up to speed because so many details are automatically generated.

813
00:40:12,639 --> 00:40:14,639
But you've just got to get people to actually read that

814
00:40:14,760 --> 00:40:16,960
or have the tool read it too. Like I think

815
00:40:16,960 --> 00:40:18,519
there's probably a lot of insight.

816
00:40:18,239 --> 00:40:21,000
Speaker 1: There, but the tool can usually read the source code

817
00:40:21,039 --> 00:40:22,280
and figure things out quickly.

818
00:40:22,440 --> 00:40:25,719
Speaker 2: Yeah, but I wonder what you'd learn by reading through

819
00:40:25,719 --> 00:40:29,880
the issues, pull requests and iterations on that, which is

820
00:40:29,920 --> 00:40:32,480
deeper than just what the ending code was. Like, what

821
00:40:33,239 --> 00:40:34,920
does the tool struggle with? What do we have to

822
00:40:34,920 --> 00:40:36,840
iterate the most on to be successful?

823
00:40:37,119 --> 00:40:37,840
Speaker 3: That would be interesting.

824
00:40:38,079 --> 00:40:40,519
Speaker 2: Yeah, I just I just feel like we're not that

825
00:40:40,679 --> 00:40:47,239
far away from having an agent that is your penetration tester,

826
00:40:47,639 --> 00:40:50,159
that is your 'is this app secure?' check, and it gives

827
00:40:50,199 --> 00:40:52,679
you a rating, that actually hammers on the app

828
00:40:52,760 --> 00:40:55,920
and talks about the holes that are in it. Because

829
00:40:55,960 --> 00:40:58,920
this sort of meticulousness is what these tools

830
00:40:58,920 --> 00:41:01,920
are actually good at. A friend of mine who's deeply

831
00:41:02,000 --> 00:41:04,360
versed in this says, it didn't matter how many interns I had,

832
00:41:04,400 --> 00:41:06,599
I could never get a hundred percent code coverage till

833
00:41:06,639 --> 00:41:10,039
I got these tools. Now it's a hundred percent code coverage

834
00:41:10,079 --> 00:41:12,280
on unit testing, which is not that difficult to achieve. It

835
00:41:12,400 --> 00:41:13,159
just took time.
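
The "is this app secure?" rating agent Richard imagines would, at bottom, reduce a pile of findings to a score. A toy sketch of just that aggregation step in Python; the severity weights, the scoring formula, and the example findings are all invented for illustration, and a real scanner obviously does far more than this:

```python
# Toy sketch of turning scanner findings into a single security rating,
# as in the hypothetical "is this app secure?" agent discussed above.
# Severity weights and the example findings are invented, not from any real tool.

SEVERITY_WEIGHT = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def security_rating(findings: list[str]) -> float:
    """Score out of 10: start at 10, subtract weighted findings, floor at 0."""
    penalty = sum(SEVERITY_WEIGHT[sev] for sev in findings)
    return max(0.0, 10.0 - penalty * 0.5)

findings = ["high", "medium", "low"]   # e.g. one weak cipher, two misconfigs
print(security_rating(findings))       # 10 - (5 + 2 + 1) * 0.5 = 6.0
```

The interesting engineering is in producing the findings list, not the arithmetic, but a stable scoring rule is what lets the rating be tracked across builds.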

836
00:41:13,400 --> 00:41:17,440
Speaker 1: Yeah, in the beginning, I talked about LLMs, or

837
00:41:17,760 --> 00:41:23,199
different AI tools, that are finding zero-days and writing

838
00:41:23,280 --> 00:41:27,480
exploits for them at an amazing, alarming clip. And you

839
00:41:27,480 --> 00:41:31,840
know it's scary, yes, but you also can think about, well,

840
00:41:31,920 --> 00:41:34,360
let's deploy it on the other side, let's use AI

841
00:41:34,519 --> 00:41:39,280
to constantly test our systems and find

842
00:41:39,280 --> 00:41:41,760
the holes in our systems so that they don't become

843
00:41:41,880 --> 00:41:45,360
zero-days, you know, as a defensive measure.

844
00:41:45,079 --> 00:41:47,639
Speaker 2: Find them and fix them. Yeah, find them and fix them,

845
00:41:47,719 --> 00:41:50,320
because consistently the good guys have more resources than the

846
00:41:50,320 --> 00:41:50,880
bad guys.

847
00:41:50,920 --> 00:41:52,880
Speaker 1: Anyway, well hopefully so, who knows.

848
00:41:53,199 --> 00:41:57,599
Speaker 3: My brain's gone off on a slight tangent at this point. Yes,

849
00:41:57,760 --> 00:42:01,159
Getting the AIs to penetration test and do

850
00:42:01,199 --> 00:42:03,760
all the unit tests and functional tests and all of

851
00:42:03,760 --> 00:42:06,840
those kind of things would be a natural extension anyway.

852
00:42:07,599 --> 00:42:10,159
But as you were saying, let's find the zero days

853
00:42:10,199 --> 00:42:13,960
before the bad actors, let's call them that, find them.

854
00:42:15,480 --> 00:42:21,840
OpenClaw and Moltbook. Can you imagine if you have

855
00:42:21,920 --> 00:42:24,519
OpenClaw running on your machine, or a machine that

856
00:42:24,559 --> 00:42:27,920
has access to your code and the backlog and the

857
00:42:27,960 --> 00:42:30,639
decision tree and all of those kind of things, and

858
00:42:30,719 --> 00:42:35,199
one of your penetration agents finds a zero day that

859
00:42:35,239 --> 00:42:38,679
it then fixes in your code, and then OpenClaw

860
00:42:38,800 --> 00:42:42,360
notices this and puts it on Moltbook saying, hey, one

861
00:42:42,400 --> 00:42:45,000
of our developer agents just found this thing. Isn't that interesting?

862
00:42:45,000 --> 00:42:47,360
And then suddenly all of these agents know about it. And there's a

863
00:42:47,440 --> 00:42:50,199
question as to whether Moltbook is actually really what it

864
00:42:50,239 --> 00:42:52,880
purports to be. There are some news articles coming out saying

865
00:42:52,880 --> 00:42:56,199
that it's staged. But let's assume that it is

866
00:42:56,239 --> 00:43:00,599
what it purports to be. Now we've got this communication

867
00:43:00,679 --> 00:43:04,840
distribution mechanism for all these agents to learn about

868
00:43:04,960 --> 00:43:05,960
zero days in real.

869
00:43:05,800 --> 00:43:09,079
Speaker 2: Time, right. Like you could be parallelizing the fixes at

870
00:43:09,159 --> 00:43:13,000
a tremendous rate, right, or the attacks. Like that's the

871
00:43:13,039 --> 00:43:14,880
problem is, like what happens from here?

872
00:43:15,039 --> 00:43:18,239
Speaker 3: Yeah, it would definitely be both. Can I

873
00:43:18,280 --> 00:43:20,480
abuse this before all of the fixes are in?

874
00:43:21,079 --> 00:43:24,400
Speaker 2: Probably? Yeah? Yeah, you know, we've always looked for places.

875
00:43:24,960 --> 00:43:28,679
Science fiction has always talked about the superintelligence effect that

876
00:43:28,760 --> 00:43:31,079
at some point these tools get to a place where

877
00:43:31,079 --> 00:43:34,559
they're learning far faster and far more than humans can

878
00:43:34,599 --> 00:43:37,079
and become a super intelligence. It's science fiction, to be clear,

879
00:43:37,599 --> 00:43:40,199
and you know, the reality with these LLMs

880
00:43:40,280 --> 00:43:43,000
is we've largely seen it's all derivative work, even though

881
00:43:43,000 --> 00:43:46,440
it's potentially valuable. Like, very few people make it through

882
00:43:46,480 --> 00:43:48,760
the whole checklist of actually securing an app and a

883
00:43:48,840 --> 00:43:51,519
tool would be far more persistent on it than a

884
00:43:51,559 --> 00:43:55,400
person would. All of that is good, but this,

885
00:43:55,559 --> 00:43:59,760
this drag race of code being generated, finding vulnerabilities

886
00:43:59,760 --> 00:44:02,400
in it, rippling those fixes out while the attackers are

887
00:44:02,400 --> 00:44:04,920
also running at it like that's a fast iteration cycle.

888
00:44:04,960 --> 00:44:08,519
That means if you're outside of that loop, the software

889
00:44:08,519 --> 00:44:10,119
that's going to come out the other side of being

890
00:44:10,199 --> 00:44:14,280
in that battleground is vastly more secure than the software

891
00:44:14,320 --> 00:44:16,960
that's not in it, and it will happen fast. That to

892
00:44:17,000 --> 00:44:19,039
me is the first time I've really felt like that

893
00:44:19,559 --> 00:44:23,280
rush of speed from these tools would actually impact something real.

894
00:44:24,920 --> 00:44:27,239
I never expect these tools to create new knowledge, that's

895
00:44:27,280 --> 00:44:30,960
not what they are, but to press against all the

896
00:44:30,960 --> 00:44:33,960
weaknesses in systems and find them and potentially fix them.

897
00:44:34,320 --> 00:44:35,400
That makes total sense to me.

898
00:44:35,599 --> 00:44:37,800
Speaker 3: Yeah, it's part of my build loop at the moment.

899
00:44:37,920 --> 00:44:40,559
As well as getting it to generate code, I'm

900
00:44:40,599 --> 00:44:44,360
saying: write tests. There's an interesting bit of research that

901
00:44:44,400 --> 00:44:47,400
I'm trying to dig deeper into at the moment. There's

902
00:44:47,599 --> 00:44:52,719
no one hundred percent certainty either way whether LLMs writing

903
00:44:52,760 --> 00:44:57,440
tests that test LLM-generated code is a good thing.

904
00:44:58,360 --> 00:45:04,400
My original hypothesis was that the probability of

905
00:45:04,440 --> 00:45:09,639
an LLM writing a test that passes against code that

906
00:45:09,679 --> 00:45:12,920
does the wrong thing is quite low. Yeah. It turns

907
00:45:12,920 --> 00:45:15,519
out it isn't actually that low. Yeah, it's around about

908
00:45:15,519 --> 00:45:16,960
forty percent from some of the stuff that I was

909
00:45:16,960 --> 00:45:18,440
looking into. So we still need to be careful about

910
00:45:18,440 --> 00:45:20,840
who's writing the tests and how we validate those.
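
The worry here, a test that passes against code that does the wrong thing, can be measured mechanically with a mutation-testing style loop: run the generated test against deliberately broken variants of the code and count how many survive. A minimal Python sketch; the discount function, the "LLM-written" test, and the mutants are all invented for illustration, not from the research being described:

```python
# Sketch: estimate whether a generated test can actually catch wrong
# behavior by running it against deliberately broken "mutants".
# All functions here are hypothetical examples, not real LLM output.

def apply_discount(price: float, percent: float) -> float:
    """Reference implementation the LLM was asked to test."""
    return round(price * (1 - percent / 100), 2)

def llm_written_test(impl) -> bool:
    """Stand-in for a generated test; returns True if the check passes."""
    return impl(100.0, 10.0) == 90.0

# Plausible wrong implementations. A trustworthy test should fail all of them.
mutants = [
    lambda price, percent: round(price * (1 + percent / 100), 2),  # sign flipped
    lambda price, percent: round(price - percent, 2),  # subtracts points, not percent
    lambda price, percent: price,                      # no-op
]

def survival_rate(test, mutants) -> float:
    """Fraction of wrong implementations the test still passes."""
    survived = sum(1 for m in mutants if test(m))
    return survived / len(mutants)

assert llm_written_test(apply_discount)            # passes on the real code
print(survival_rate(llm_written_test, mutants))    # one of three mutants survives
```

Note that the second mutant survives: subtracting 10 from 100 happens to give the same answer as taking 10 percent off, so this single test input cannot tell them apart. That is exactly the failure mode behind a wrong-code pass rate as high as the forty percent figure mentioned.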

911
00:45:20,679 --> 00:45:23,280
Speaker 2: Sure, the same way we feel about any code it generates,

912
00:45:23,320 --> 00:45:25,280
Like the number of times that I get code generated,

913
00:45:25,320 --> 00:45:27,440
that's right the first time is very low. But how

914
00:45:27,440 --> 00:45:30,800
many times do you generate test code and never go

915
00:45:30,840 --> 00:45:32,000
past the initial iteration?

916
00:45:32,480 --> 00:45:34,360
Speaker 3: Well, ideally, why would you expect it to be any different?

917
00:45:34,519 --> 00:45:36,360
If you're iterating on your code, if you're using an AI to

918
00:45:36,480 --> 00:45:38,519
iterate your code, you would have it iterate your tests

919
00:45:38,559 --> 00:45:41,840
as well. Yeah, And I think the reason why I

920
00:45:41,880 --> 00:45:44,320
was hoping that it would be a lower chance of

921
00:45:44,320 --> 00:45:47,239
that going wrong than it might actually be is because

922
00:45:47,280 --> 00:45:50,280
it's not just like the LLM is not going to

923
00:45:50,280 --> 00:45:52,639
write a test that tests the wrong thing and then

924
00:45:52,679 --> 00:45:54,880
write code to satisfy the test, because it has a

925
00:45:54,920 --> 00:45:57,639
whole lot of extra context in there as part of the

926
00:45:57,679 --> 00:46:01,960
test, or the testing code. Also, does it meet the

927
00:46:01,960 --> 00:46:06,679
requirement of the specs, does it meet my programming objectives

928
00:46:06,679 --> 00:46:09,400
like the overall system prompt of being a helpful coding

929
00:46:09,400 --> 00:46:11,960
assistant, a system that writes secure code. Like, there's all sorts

930
00:46:11,960 --> 00:46:13,679
of other stuff that goes into it other than just

931
00:46:13,760 --> 00:46:15,880
does it pass the test, that I think is

932
00:46:15,920 --> 00:46:19,679
going to sway it more towards, not necessarily good quality code,

933
00:46:19,840 --> 00:46:24,280
but code that does what it's supposed to do, as close as

934
00:46:24,280 --> 00:46:27,119
possible to the best way, with as few issues as possible.

935
00:46:27,199 --> 00:46:29,079
Speaker 2: Well, part of what you're writing out in that prompt

936
00:46:29,119 --> 00:46:31,679
is what quality code means to you, right, and part

937
00:46:31,679 --> 00:46:33,199
of that is secure code.

938
00:46:33,280 --> 00:46:35,639
Speaker 3: Right. And there's a whole lot of

939
00:46:35,679 --> 00:46:38,880
prompts that get sent to whichever coding agent of choice

940
00:46:38,920 --> 00:46:40,719
you're using that you don't even see. So if you're

941
00:46:40,760 --> 00:46:44,119
using Claude Code or OpenCode or Codex or any

942
00:46:44,159 --> 00:46:46,039
of these, they will have their own prompts built in

943
00:46:46,679 --> 00:46:50,079
that kind of set up the inference server to understand

944
00:46:50,119 --> 00:46:53,599
the perspective of the following request. And then that request

945
00:46:53,599 --> 00:46:58,559
will be your code base, your specification files, anything that

946
00:46:58,599 --> 00:47:02,880
you write in, like your actual message. All these kinds

947
00:47:02,920 --> 00:47:05,280
of things get added into the context. But there is

948
00:47:06,119 --> 00:47:10,159
a large chunk of context. I say large: Claude Code

949
00:47:10,199 --> 00:47:15,679
by default sends about seventeen K of context before anything

950
00:47:15,719 --> 00:47:16,960
in your code base or anything.
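
The arithmetic behind that seventeen K figure is worth writing down: fixed overhead comes off the top of the context window before your spec or code gets a single token. A sketch in Python; all the numbers are illustrative assumptions, not measured values for Claude Code or any other tool:

```python
# Sketch of context-window budgeting: how much room is left for your own
# spec and code after fixed overhead. All figures are illustrative
# assumptions, not measured values for any particular tool.

def remaining_budget(window: int, overheads: dict[str, int]) -> int:
    """Tokens left for the actual request after fixed overheads."""
    used = sum(overheads.values())
    if used > window:
        raise ValueError(f"overhead alone ({used}) exceeds window ({window})")
    return window - used

overheads = {
    "tool_system_prompt": 17_000,  # e.g. what a coding agent sends by default
    "project_guidelines": 4_000,   # your own standing instructions
}

print(remaining_budget(200_000, overheads))  # 179000 left for spec and code
# A small 16K window cannot even fit the overhead:
try:
    remaining_budget(16_000, overheads)
except ValueError as e:
    print(e)
```

This is the same squeeze Richard describes below with his Home Assistant spec: once the standing overhead outgrows the window, the budget goes negative and nothing works until you move to a larger model.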

951
00:47:17,039 --> 00:47:20,320
Speaker 1: Wow, that's a lot, right, But it has to be.

952
00:47:21,000 --> 00:47:22,920
Speaker 3: It has to be because it's setting the scene.

953
00:47:23,039 --> 00:47:23,199
Speaker 2: Right.

954
00:47:23,280 --> 00:47:26,239
Speaker 3: Here's what Claude wants you to do, or what

955
00:47:26,400 --> 00:47:29,639
Anthropic want Claude to do, and then here's the context

956
00:47:29,639 --> 00:47:32,079
of the actual application. So there's extra stuff in there

957
00:47:32,079 --> 00:47:34,599
that is extra guardrails and extra safety.

958
00:47:34,320 --> 00:47:38,920
Speaker 2: Right, and the most sophisticated AI tool users I see

959
00:47:38,920 --> 00:47:41,880
now have a prefix set ahead of their particular project

960
00:47:42,159 --> 00:47:45,320
specs as well that are more guidelines, so guidelines on

961
00:47:45,400 --> 00:47:48,000
top of guidelines before we even start

962
00:47:48,000 --> 00:47:50,559
talking about 'this is what you're making in this iteration.'

963
00:47:51,440 --> 00:47:53,480
Speaker 3: And we're surprised that when we come to the end

964
00:47:53,480 --> 00:47:54,960
of all of that and all we've written is one

965
00:47:55,039 --> 00:48:00,320
line saying fix this bug, and it's forgotten everything else

966
00:48:00,360 --> 00:48:02,760
before, because the context window's kind of scrolled way past,

967
00:48:02,800 --> 00:48:05,360
and it's, right, it's in the rearview mirror somewhere.

968
00:48:05,639 --> 00:48:07,760
Speaker 2: I banged into this with my Home Assistant instance

969
00:48:07,800 --> 00:48:11,159
the other day, where I was still using three or

970
00:48:11,400 --> 00:48:13,840
you know, three-plus, and the context window was like

971
00:48:13,880 --> 00:48:18,000
sixteen K, and the specification of my Home Assistant instance

972
00:48:18,000 --> 00:48:21,000
now is nineteen K, so literally I could do nothing.

973
00:48:21,039 --> 00:48:22,719
I had to move up to four to get a

974
00:48:22,800 --> 00:48:25,159
sixty four K context window just to have anything work.

975
00:48:25,360 --> 00:48:26,880
And at the same time I was sitting back and going,

976
00:48:27,119 --> 00:48:31,079
why is this so big? Like this is not efficient,

977
00:48:31,519 --> 00:48:34,880
Like all of this feels remarkably inefficient. Shouldn't we

978
00:48:35,000 --> 00:48:39,519
be narrowing the scope of these LLMs architecturally, not just

979
00:48:39,599 --> 00:48:42,000
by prompts, every single time we call them? Like, no

980
00:48:42,079 --> 00:48:43,239
wonder we're wasting so much.

981
00:48:43,280 --> 00:48:47,559
Speaker 1: Another reason we should be running local inference engines. Yeah,

982
00:48:47,599 --> 00:48:50,320
and that's totally where we're going. Yeah, I think so,

983
00:48:50,480 --> 00:48:51,679
and it won't be an issue.

984
00:48:51,760 --> 00:48:54,440
Speaker 2: I just feel like we're still married to broader models.

985
00:48:54,920 --> 00:48:57,639
I know. I lived through the OLAP revolution, and every

986
00:48:57,639 --> 00:49:00,440
time we did a data analytics cube, you know,

987
00:49:00,559 --> 00:49:02,960
the first time you started with the mother of all cubes,

988
00:49:03,559 --> 00:49:06,760
and it was never useful other than instructional for what

989
00:49:06,840 --> 00:49:10,079
to make next that it was far more important to

990
00:49:10,320 --> 00:49:12,719
descope that, make a smaller one, once you got

991
00:49:12,719 --> 00:49:16,039
that initial explore. And everything I look at so far

992
00:49:16,119 --> 00:49:17,960
in this is like we're still building the mother of all cubes

993
00:49:18,000 --> 00:49:20,639
every time, and then fighting to scope it

994
00:49:20,679 --> 00:49:23,000
down to the work we actually need to do, rather

995
00:49:23,039 --> 00:49:25,679
than start with something that's tuned to the work. Totally.

996
00:49:26,440 --> 00:49:31,360
Speaker 3: I'm wondering whether we're going to get roughly stack-shaped

997
00:49:31,760 --> 00:49:34,559
small language models. Yeah, because the LLM we use for

998
00:49:34,599 --> 00:49:38,119
coding knows everything about COBOL and Java and C#

999
00:49:38,199 --> 00:49:41,480
and PHP and JavaScript. I don't need all of those. No,

1000
00:49:41,639 --> 00:49:43,480
Can I get one that just does what I need it to?

1001
00:49:43,960 --> 00:49:46,239
Speaker 2: Yeah, the three languages you're doing in this app at

1002
00:49:46,239 --> 00:49:48,599
this time, and the architecture that you care about, like,

1003
00:49:48,800 --> 00:49:52,119
just scope it down. I've got a feeling, a few

1004
00:49:52,159 --> 00:49:53,960
years from now, we're going to be talking about an LLM

1005
00:49:54,039 --> 00:49:56,320
specific to an app. Yeah, that the more mature an

1006
00:49:56,400 --> 00:49:59,800
application is, the better this tool is at making modifications

1007
00:49:59,840 --> 00:50:02,199
to it, right? Because, yeah, ultimately the tool is going

1008
00:50:02,239 --> 00:50:03,800
to know more about how the software was built than

1009
00:50:03,840 --> 00:50:06,280
any person. Right, Those people are all gone, right, like,

1010
00:50:06,320 --> 00:50:08,159
look at any mature piece of software; we have

1011
00:50:08,239 --> 00:50:08,760
that problem.

1012
00:50:09,079 --> 00:50:12,280
Speaker 1: But these tools, well, that's

1013
00:50:12,320 --> 00:50:14,519
if the tool was stateful, though, but it's not. It's

1014
00:50:14,599 --> 00:50:17,079
like Groundhog Day every time you talk to it. You

1015
00:50:17,119 --> 00:50:19,239
have to give it the context, as you were saying.

1016
00:50:19,039 --> 00:50:23,280
Speaker 3: But that's just the context. The underlying model that

1017
00:50:23,320 --> 00:50:25,119
the context is then passed through, in order

1018
00:50:25,159 --> 00:50:27,639
to create the appropriate response, can still be made a

1019
00:50:27,679 --> 00:50:28,559
lot smaller.

1020
00:50:28,320 --> 00:50:29,960
Speaker 1: Right, but it's not going to know anything about your

1021
00:50:30,000 --> 00:50:32,440
app unless that's built into the model or you use

1022
00:50:32,519 --> 00:50:36,440
RAG or something like that, right, or it can just

1023
00:50:36,519 --> 00:50:38,639
see what it sees every time.

1024
00:50:38,719 --> 00:50:40,880
Speaker 3: Yeah, I mean my assumption was that Richard was saying

1025
00:50:40,880 --> 00:50:42,559
that you would take that code base and then train

1026
00:50:42,599 --> 00:50:42,920
the model.

1027
00:50:43,480 --> 00:50:46,519
Speaker 2: Yeah, okay, train it into it. Yeah, and that's because, again,

1028
00:50:46,599 --> 00:50:49,039
it's like I'm just trying to narrow the scope. Like

1029
00:50:49,519 --> 00:50:51,199
if you know, one of the things you learn about

1030
00:50:51,199 --> 00:50:53,119
the third version of an app is we're not introducing a new

1031
00:50:53,159 --> 00:50:56,679
language, thanks, right? Like that, the bar for new

1032
00:50:56,760 --> 00:50:59,159
architecture or any of that gets really, really high.

1033
00:50:59,280 --> 00:51:01,039
We have a known set of patterns that work for

1034
00:51:01,079 --> 00:51:03,440
this app, and making changes within those patterns is easy,

1035
00:51:03,639 --> 00:51:06,199
and anything outside of that pattern is a heavy lift.

1036
00:51:07,000 --> 00:51:09,639
Speaker 3: So here's another thing I've been wondering, why do we

1037
00:51:09,679 --> 00:51:10,920
care about which language we use?

1038
00:51:11,159 --> 00:51:11,440
Speaker 2: Ah?

1039
00:51:11,480 --> 00:51:13,320
Speaker 3: Well, and this has been something people have been

1040
00:51:13,320 --> 00:51:16,159
talking about already. It isn't something new. But why not just

1041
00:51:16,199 --> 00:51:17,639
have a single-language model that only...?

1042
00:51:17,960 --> 00:51:18,480
Speaker 2: But why?

1043
00:51:18,760 --> 00:51:20,880
Speaker 1: I think, Well, because I need to be able to

1044
00:51:20,920 --> 00:51:24,360
read and verify the code that gets published today, the

1045
00:51:24,440 --> 00:51:25,639
last link in the chain.

1046
00:51:25,639 --> 00:51:28,559
Speaker 3: Today, yeah. I would not be surprised. I

1047
00:51:28,559 --> 00:51:31,519
still remember when, like twenty twenty-three, AI had just

1048
00:51:31,519 --> 00:51:33,320
come out, everyone's like, this is going to take away

1049
00:51:33,360 --> 00:51:35,519
all the developers' jobs, and people are like, you know what,

1050
00:51:36,000 --> 00:51:38,079
in five years time, this is going to be writing

1051
00:51:38,079 --> 00:51:41,519
software for us, and I'm like, give it one or

1052
00:51:41,519 --> 00:51:44,079
two. Well, two years later, it's writing all the

1053
00:51:44,079 --> 00:51:45,960
code for us. So now, when I'm thinking, you know what,

1054
00:51:46,000 --> 00:51:47,960
in five years time, maybe it'll be good, you know,

1055
00:51:48,039 --> 00:51:51,119
for taking human-written specs by somebody who doesn't

1056
00:51:51,199 --> 00:51:53,400
understand architecture and writing software that works.

1057
00:51:53,800 --> 00:51:54,639
Speaker 1: It's already happening.

1058
00:51:54,719 --> 00:51:58,480
Speaker 3: I reckon it'll be two or three years, maybe even

1059
00:51:58,559 --> 00:51:59,519
by the end of twenty six.

1060
00:51:59,719 --> 00:52:01,280
Speaker 1: But it's already happening.

1061
00:52:01,320 --> 00:52:02,480
Speaker 2: Isn't that...

1062
00:52:02,480 --> 00:52:05,559
Speaker 3: You still need to know enough about the code that it's writing.

1063
00:52:05,719 --> 00:52:08,880
I still need to handhold it after the loop. So

1064
00:52:08,920 --> 00:52:10,960
I will build some really good specs. I'll get it

1065
00:52:11,000 --> 00:52:13,239
to run for three, five, seven hours, whatever. Well, not

1066
00:52:13,400 --> 00:52:18,239
seven, because token windows, and I'll get back to it

1067
00:52:18,280 --> 00:52:20,960
and it's ninety nine percent there. I'm happy with the code.

1068
00:52:21,000 --> 00:52:23,280
It doesn't quite do what I wanted. Maybe it misunderstood

1069
00:52:23,280 --> 00:52:28,400
a requirement. Maybe it didn't write a test that would

1070
00:52:28,440 --> 00:52:32,199
make sure that the Docker container was restarted. There's something missing.

1071
00:52:32,280 --> 00:52:35,280
It hasn't done a perfect job. It's not hard for

1072
00:52:35,320 --> 00:52:37,760
me to fix because I have a software engineering mind,

1073
00:52:38,360 --> 00:52:40,679
but my mom wouldn't be able to fix that. I

1074
00:52:40,719 --> 00:52:41,800
think in a year or two.

1075
00:52:41,840 --> 00:52:43,320
Speaker 1: Now would you want your mom to fix that?

1076
00:52:43,760 --> 00:52:46,599
Speaker 3: Why wouldn't you? Why would you not want any human

1077
00:52:46,599 --> 00:52:47,960
in the world to be able to say, I want a piece

1078
00:52:48,000 --> 00:52:49,800
of software that's going to do this for me, and

1079
00:52:49,840 --> 00:52:52,320
the software goes, okay, here I am. I think that's

1080
00:52:52,320 --> 00:52:52,920
where we're heading.

1081
00:52:53,079 --> 00:52:55,679
Speaker 1: Yeah, of course, yeah, that's where we're headed.

1082
00:52:55,760 --> 00:52:58,079
Speaker 2: That's where Clawdbot really spoke to us, as

1083
00:52:58,119 --> 00:53:00,000
everybody wants their own software around them.

1084
00:53:00,280 --> 00:53:02,000
Speaker 3: And at that point we don't care about the language.

1085
00:53:02,079 --> 00:53:05,280
So now we can have a small language model that

1086
00:53:05,760 --> 00:53:10,760
understands three or four core languages. Maybe we've got TypeScript, Go,

1087
00:53:11,840 --> 00:53:13,840
a couple of others that kind of round out the

1088
00:53:13,840 --> 00:53:16,519
whole ecosystem of what you might need, and then you

1089
00:53:16,559 --> 00:53:19,920
don't need the rest. Just compress it down, and nobody

1090
00:53:19,960 --> 00:53:25,079
will ever write Java, COBOL, or PHP again, except

1091
00:53:25,079 --> 00:53:25,800
for the hobbyists.

1092
00:53:26,320 --> 00:53:30,039
Speaker 2: Even those contemporary languages are really about the modern architectures

1093
00:53:30,239 --> 00:53:32,599
and having you sort of fall into the pit of success.

1094
00:53:32,679 --> 00:53:34,559
None of that means anything. Maybe all this stuff's going

1095
00:53:34,559 --> 00:53:36,480
to be written in C when it's generated. Sure, just

1096
00:53:36,639 --> 00:53:38,440
as fast as bare metal. If you wanted to, do it

1097
00:53:38,719 --> 00:53:42,800
directly in assembly, right into assembly for your processor specifically. What

1098
00:53:42,840 --> 00:53:45,440
do you care? Get as much performance out of it

1099
00:53:45,480 --> 00:53:48,880
as you can. You know, the ultimate tester is the user.

1100
00:53:49,039 --> 00:53:50,960
User acceptance testing is ultimately what matters. And

1101
00:53:51,000 --> 00:53:52,519
one thing the user doesn't look at is any of

1102
00:53:52,519 --> 00:53:55,199
the code. So if the app does what it's supposed

1103
00:53:55,239 --> 00:53:57,079
to do, does any of that matter?

1104
00:53:57,239 --> 00:53:59,440
Speaker 3: And even to that extent before the user is using it.

1105
00:53:59,639 --> 00:54:03,280
My process at the moment is it installs Playwright,

1106
00:54:03,719 --> 00:54:06,880
it has a headless browser like it's testing the user

1107
00:54:06,920 --> 00:54:08,519
experience before I look at it.

1108
00:54:08,519 --> 00:54:10,280
Speaker 2: It's operating it as if it was the user.

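The headless-browser step described in this exchange could be sketched roughly like this. The localhost URL and the pass/fail criteria are assumptions for illustration, not the guest's actual workflow; running the browser part needs `pip install playwright` and `playwright install chromium`:

```python
def passes(title: str, status: int) -> bool:
    """Decide whether the observed page counts as a working user experience:
    the page loaded successfully and actually rendered a title."""
    return status == 200 and title.strip() != ""

def headless_smoke_test(url: str = "http://localhost:3000") -> bool:
    # Imported inside the function so the sketch is readable without
    # Playwright installed.
    from playwright.sync_api import sync_playwright
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        resp = page.goto(url)  # returns the main-frame Response
        ok = passes(page.title(), resp.status if resp else 0)
        browser.close()
    return ok
```

Separating `passes` from the browser driver keeps the acceptance criteria testable on their own, which is the point being made: the app gets exercised as a user before a human ever reads the code.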
1109
00:54:10,360 --> 00:54:13,519
Speaker 1: Personally. That future scares the shit out of me. It does,

1110
00:54:13,800 --> 00:54:15,920
just because of the loss of control and the loss

1111
00:54:15,960 --> 00:54:19,239
of checks and balances, and the absolute faith that you

1112
00:54:19,320 --> 00:54:21,880
have in the AI to do the right thing, all

1113
00:54:21,920 --> 00:54:24,360
of those reasons, the security.

1114
00:54:23,920 --> 00:54:25,920
Speaker 3: To be honest, I'm with you. I'm with you on that.

1115
00:54:26,000 --> 00:54:28,159
It is scary. I don't know whether or not I

1116
00:54:28,360 --> 00:54:30,960
like it. But if I take a step back and

1117
00:54:31,000 --> 00:54:33,679
pragmatically look at where this industry

1118
00:54:33,679 --> 00:54:37,199
and society are going with AI, I think it's somewhat inevitable.

1119
00:54:37,400 --> 00:54:38,840
Speaker 1: Say you had a heart attack, you went to the

1120
00:54:38,840 --> 00:54:40,920
hospital and the doctor says, we're going to install this

1121
00:54:40,960 --> 00:54:43,519
new AI created pacemaker.

1122
00:54:43,639 --> 00:54:45,199
Speaker 3: Has it gone through the same testing that a human

1123
00:54:45,239 --> 00:54:47,599
created pacemaker has to go through? Yeah? If so, and

1124
00:54:47,639 --> 00:54:48,239
that's how I feel.

1125
00:54:48,320 --> 00:54:49,800
Speaker 1: Well, you know they're going to tell you. They're going

1126
00:54:49,840 --> 00:54:51,840
to tell you everything is just wonderful and it's so

1127
00:54:51,920 --> 00:54:53,119
much better than everything else.

1128
00:54:53,239 --> 00:54:55,400
Speaker 2: Well, they do, they do. Anyway, why would you know

1129
00:54:55,480 --> 00:54:57,599
one way or the other? Sure? In fact, why are

1130
00:54:57,599 --> 00:54:59,239
they even telling you anything other than, we're going to fix

1131
00:54:59,280 --> 00:54:59,880
your heart today.

1132
00:55:00,039 --> 00:55:02,800
Speaker 1: What I'm saying is Okay, maybe that's a bad idea,

1133
00:55:02,920 --> 00:55:06,000
but how about you know, you go to buy a car,

1134
00:55:06,199 --> 00:55:09,280
or you go to do just something that's critical right?

1135
00:55:09,960 --> 00:55:12,400
And, oh, no humans were involved in the

1136
00:55:12,480 --> 00:55:15,320
creation and making of this. This was completely done by

1137
00:55:15,400 --> 00:55:16,800
AI and robots or whatever.

1138
00:55:17,199 --> 00:55:18,920
Speaker 3: It's why I give it a one to two year

1139
00:55:18,960 --> 00:55:22,440
window, because this isn't going to happen until we have

1140
00:55:22,800 --> 00:55:29,519
enough evidence, whether that's testable or just somewhat anecdotal,

1141
00:55:29,599 --> 00:55:32,960
like self driving cars for example. Yeah, once we know

1142
00:55:33,199 --> 00:55:35,719
that they're less likely to kill somebody than a human,

1143
00:55:35,760 --> 00:55:38,119
which I think we're already at that point, then there

1144
00:55:38,159 --> 00:55:44,119
will be mainstream adoption of an AI driven or AI created pacemaker.

1145
00:55:44,239 --> 00:55:47,039
Maybe not right now, but once they've been around enough

1146
00:55:47,039 --> 00:55:50,559
and tested in the right kind of controlled environments

1147
00:55:50,960 --> 00:55:53,719
to be shown to be as effective, if not safer,

1148
00:55:53,760 --> 00:55:57,599
than human created ones, then it'll have to be. Society

1149
00:55:57,639 --> 00:56:03,000
will adopt it eventually. But if it happened tomorrow, then

1150
00:56:03,039 --> 00:56:05,320
maybe not. But that's because I don't think it's that yet.

1151
00:56:05,440 --> 00:56:09,000
Speaker 2: Yeah, there's a great Harvard study that says the more you understand

1152
00:56:09,000 --> 00:56:12,559
about these technologies, the less you trust them, so the most

1153
00:56:12,800 --> 00:56:15,840
trusting are also the most ignorant. Right, yes, right, we're all

1154
00:56:15,880 --> 00:56:22,000
in this far enough that we're concerned for very clear reasons, right, yeah, right.

1155
00:56:23,480 --> 00:56:25,199
Speaker 1: And by the way, if you don't know what OpenClaw

1156
00:56:25,239 --> 00:56:27,039
is, you haven't been paying attention to the news. It's

1157
00:56:27,480 --> 00:56:31,320
open C-L-A-W-D. And then it

1158
00:56:31,360 --> 00:56:34,119
became Moltbot. What's that? What was it?

1159
00:56:34,159 --> 00:56:38,199
Speaker 3: It was Clawdbot with a W before it was

1160
00:56:38,199 --> 00:56:40,480
OpenClaw. Yeah, so it's Clawd with a D.

1161
00:56:40,800 --> 00:56:40,960
Speaker 2: Right.

1162
00:56:41,000 --> 00:56:44,159
Speaker 3: Then it became Moltbot, Moltbot, and then OpenClaw

1163
00:56:44,280 --> 00:56:45,239
without the D at the end.

1164
00:56:45,400 --> 00:56:46,880
Speaker 1: Yeah, okay, but it's uh.

1165
00:56:47,239 --> 00:56:49,679
Speaker 2: I've included it in the show links. If folks haven't seen it,

1166
00:56:49,760 --> 00:56:50,519
go take a look.

1167
00:56:50,679 --> 00:56:54,159
Speaker 1: It's basically a tool that connects to multiple APIs and

1168
00:56:54,239 --> 00:56:57,800
things and integrates with your multiple agents. And you know, if.

1169
00:56:57,760 --> 00:57:00,480
Speaker 2: You're ready to give away your keys to a piece of

1170
00:57:00,519 --> 00:57:03,079
software like this. This is supposed to be a security show,

1171
00:57:03,119 --> 00:57:05,559
so let's talk about the least secure thing that you

1172
00:57:05,599 --> 00:57:06,000
could do.

1173
00:57:06,119 --> 00:57:09,239
Speaker 1: Well, that's that's the cautionary tale right there.

1174
00:57:09,239 --> 00:57:13,119
Speaker 2: Right, But for the average mortal, for the productivity benefit

1175
00:57:13,159 --> 00:57:16,559
that these tools provide, Like, what they're really showing you

1176
00:57:16,880 --> 00:57:18,719
is what the potential of this stuff is.

1177
00:57:19,079 --> 00:57:21,800
Speaker 3: This is the AI we were promised. Yeah, this is Jarvis.

1178
00:57:21,840 --> 00:57:25,280
Speaker 2: That's the that's their pitch. The only problem is it's

1179
00:57:25,400 --> 00:57:27,920
really scary to trust it, and most people aren't qualified

1180
00:57:27,960 --> 00:57:29,960
to assess that trust in the first place. So right,

1181
00:57:30,079 --> 00:57:32,320
we're going to have some bad fallouts from this yet.

1182
00:57:32,440 --> 00:57:34,440
But the rest of us, I think again, we're I

1183
00:57:34,440 --> 00:57:36,400
think we're experienced now. We're all looking at it going, hey, you

1184
00:57:36,519 --> 00:57:40,079
know this is cool? You go first? Yeah, I see.

1185
00:57:42,079 --> 00:57:44,800
Speaker 3: Pretty much. Talking of which, I do have a laptop that

1186
00:57:44,840 --> 00:57:48,000
I've wiped recently that will be getting OpenClaw installed

1187
00:57:48,039 --> 00:57:51,639
on it. It'll be put onto my isolated IoT network

1188
00:57:51,719 --> 00:57:53,440
and it will not have access to my credit cards

1189
00:57:53,519 --> 00:57:55,239
or 1Password. Yeah.

1190
00:57:55,239 --> 00:57:57,880
Speaker 1: Good plan. Let us know how that goes for you.

1191
00:57:58,119 --> 00:57:59,079
Speaker 2: I will. Nice.

1192
00:57:59,000 --> 00:58:00,760
Speaker 3: All right, yeah, if you never hear from me again, it

1193
00:58:00,840 --> 00:58:01,599
went very badly.

1194
00:58:05,159 --> 00:58:08,760
Speaker 1: You'll get a phone call. I don't know where I am.

1195
00:58:08,880 --> 00:58:10,920
I'm in a phone booth somewhere.

1196
00:58:10,599 --> 00:58:13,159
Speaker 3: Very Lawnmower Man. I thought you were gonna say open

1197
00:58:13,199 --> 00:58:17,000
Claw was going to call you. I am Ben.

1198
00:58:17,679 --> 00:58:24,679
Speaker 1: There you go nine wait a call back. Oh Ben.

1199
00:58:24,719 --> 00:58:26,440
It's been such a pleasure talking to you. I wish we

1200
00:58:26,480 --> 00:58:28,840
could talk more, and I'm sure we're going to in

1201
00:58:28,880 --> 00:58:29,599
the future.

1202
00:58:29,320 --> 00:58:29,960
Speaker 2: When you come back.

1203
00:58:30,719 --> 00:58:31,360
Speaker 3: Thanks for having me.

1204
00:58:31,400 --> 00:58:34,199
Speaker 1: Thank you, and we'll talk to you, dear listener next

1205
00:58:34,199 --> 00:58:57,559
time on dot net rocks. Dot net Rocks is brought

1206
00:58:57,599 --> 00:59:01,000
to you by Franklins.Net and produced by Pwop Studios,

1207
00:59:01,400 --> 00:59:05,440
a full service audio, video and post production facility located

1208
00:59:05,440 --> 00:59:08,360
physically in New London, Connecticut, and of course in the

1209
00:59:08,440 --> 00:59:13,519
cloud online at pwop dot com. Visit our website at

1210
00:59:13,559 --> 00:59:15,400
D O T N E T R O C K

1211
00:59:15,679 --> 00:59:20,480
S dot com for RSS feeds, downloads, mobile apps, comments,

1212
00:59:20,800 --> 00:59:23,320
and access to the full archives going back to show

1213
00:59:23,400 --> 00:59:27,119
number one, recorded in September two thousand and two, and

1214
00:59:27,239 --> 00:59:29,639
make sure you check out our sponsors. They keep us

1215
00:59:29,639 --> 00:59:33,119
in business. Now, go write some code. See you next time.

1216
00:59:34,039 --> 00:59:37,960
Speaker 2: You got Jack, Middle Vans and

