1
00:00:01,080 --> 00:00:04,799
Speaker 1: How'd you like to listen to dot NetRocks with no ads? Easy?

2
00:00:05,360 --> 00:00:08,560
Become a patron for just five dollars a month. You

3
00:00:08,599 --> 00:00:11,320
get access to a private RSS feed where all the

4
00:00:11,359 --> 00:00:14,599
shows have no ads. Twenty dollars a month will get

5
00:00:14,599 --> 00:00:18,440
you that and a special dot NetRocks patron mug. Sign

6
00:00:18,519 --> 00:00:22,920
up now at Patreon dot dot NetRocks dot com. Hi,

7
00:00:23,000 --> 00:00:24,679
this is Carl Franklin.

8
00:00:24,239 --> 00:00:25,480
Speaker 2: And this is Richard Campbell.

9
00:00:25,839 --> 00:00:30,000
Speaker 1: We've got two special shows coming up soon, episode nineteen

10
00:00:30,120 --> 00:00:32,039
ninety nine and two thousand.

11
00:00:32,399 --> 00:00:35,079
Speaker 2: For episode nineteen ninety nine, we're collecting people's Y two

12
00:00:35,159 --> 00:00:37,560
K stories. What did you do to help the Y

13
00:00:37,600 --> 00:00:39,719
two k event not actually happen?

14
00:00:40,200 --> 00:00:42,960
Speaker 1: And for episode two thousand, we're going to be sharing

15
00:00:43,000 --> 00:00:45,479
stories about how dot net shaped your career.

16
00:00:46,000 --> 00:00:48,719
Speaker 2: We have a special page at dot NetRocks dot com

17
00:00:48,719 --> 00:00:52,159
slash voxpop where you can record messages for us that

18
00:00:52,200 --> 00:00:54,000
we can play on these special episodes.

19
00:00:54,439 --> 00:00:56,479
Speaker 1: So tell us what you did for Y two k

20
00:00:56,759 --> 00:00:59,039
and what dot net means to you, and of course

21
00:00:59,079 --> 00:01:01,000
how long you've been listening to dot NetRocks.

22
00:01:01,640 --> 00:01:04,439
Speaker 2: So go to dot NetRocks dot com slash vox pop

23
00:01:04,560 --> 00:01:06,480
now and leave us a message before the thought

24
00:01:06,519 --> 00:01:08,760
evaporates like whiskey left in a glass overnight.

25
00:01:09,159 --> 00:01:23,959
Speaker 1: Do it! Hey, guess what? It's dot NetRocks all

26
00:01:24,079 --> 00:01:27,200
over again. I'm Carl Franklin and I'm Richard Campbell. We're

27
00:01:27,200 --> 00:01:29,359
here again again.

28
00:01:29,599 --> 00:01:33,200
Speaker 3: Keep doing it, here we go again, nineteen ninety one times,

29
00:01:33,319 --> 00:01:36,120
we keep going and going, and going and going.

30
00:01:37,439 --> 00:01:41,239
You go now, Okay, Yeah, we're only nine shows away

31
00:01:41,280 --> 00:01:44,000
from two thousand, man, like this, and that's gonna be

32
00:01:44,280 --> 00:01:46,120
hellaciously cool. It's gonna be fun.

33
00:01:46,200 --> 00:01:49,280
Speaker 1: If nothing else, it's gonna be a party. Yeah literally,

34
00:01:49,560 --> 00:01:52,000
a party, literally. Party with Palermo, Party with Palermo at

35
00:01:52,000 --> 00:01:54,920
the MVP summit. Okay, well, this is gonna be a

36
00:01:54,920 --> 00:01:58,560
good show. Andrew Murphy is here. We are talking to

37
00:01:58,680 --> 00:02:02,200
him in a few minutes. And before we get to him, though,

38
00:02:02,280 --> 00:02:04,920
let's do a few things, starting with what happened in

39
00:02:05,040 --> 00:02:08,120
nineteen ninety one. And Richard, I'm going to let you

40
00:02:08,159 --> 00:02:11,159
go first because I'm really interested in what you have

41
00:02:11,240 --> 00:02:13,240
to say about the dissolution of the Soviet Union.

42
00:02:13,400 --> 00:02:13,599
Speaker 4: Yeah.

43
00:02:13,719 --> 00:02:15,840
Speaker 3: Well, obviously this has been coming, you know, things have

44
00:02:15,840 --> 00:02:20,520
been unraveling since the late eighties. Yeah, and so there's

45
00:02:20,560 --> 00:02:24,840
a referendum in the Soviet Union, a national referendum. Six

46
00:02:24,879 --> 00:02:28,319
of the republics refused to participate, but the majority that

47
00:02:28,360 --> 00:02:31,240
do participate are eighty percent in favor, and by the

48
00:02:31,319 --> 00:02:33,840
end of nineteen ninety one, the USSR will be no more.

49
00:02:34,599 --> 00:02:37,360
There'll be an attempted coup on Gorbachev. There's a whole

50
00:02:37,639 --> 00:02:42,360
bunch of things that happened once Estonia, Latvia, Lithuania who

51
00:02:42,479 --> 00:02:46,199
had already declared independence in nineteen ninety and are fighting

52
00:02:46,360 --> 00:02:50,520
with the Soviets at the time, there's conflict.

53
00:02:50,520 --> 00:02:53,120
It's like the old crackdowns from the fifties and sixties.

54
00:02:53,639 --> 00:02:58,960
So they were first, but it goes on from there,

55
00:03:00,280 --> 00:03:04,599
Moldova, which is east of Romania, also declared its independence in

56
00:03:04,639 --> 00:03:09,759
ninety one, all of the stans: Kyrgyzstan, Uzbekistan, Tajikistan, Kazakhstan, Turkmenistan,

57
00:03:10,840 --> 00:03:13,919
all five of them declared. That's in the eastern Central

58
00:03:13,919 --> 00:03:18,240
Asia, and then down in the Caucasus, so that's Georgia, Azerbaijan,

59
00:03:18,800 --> 00:03:20,560
Armenia and of course Ukraine.

60
00:03:21,039 --> 00:03:23,879
Speaker 1: These were all regions of the Soviet Union before right,

61
00:03:23,919 --> 00:03:26,599
they did have borders. Well, these were all originally these

62
00:03:26,599 --> 00:03:28,680
were countries. They were originally countries.

63
00:03:29,080 --> 00:03:31,960
Speaker 3: These had always been independent countries, but in the era

64
00:03:32,039 --> 00:03:34,960
of Stalinism at the end of World War II, as

65
00:03:35,000 --> 00:03:37,759
the Soviet Union, well, the Soviet Union was already formed,

66
00:03:37,800 --> 00:03:39,680
they were controlled by the Soviets, and they were expanded

67
00:03:39,680 --> 00:03:41,960
as part of, or expanded as, the Warsaw

68
00:03:42,000 --> 00:03:45,120
Pact right into the West that also included Romania and

69
00:03:45,199 --> 00:03:45,639
so forth.

70
00:03:45,680 --> 00:03:48,800
Speaker 4: It was the end of World War two and Germany

71
00:03:49,360 --> 00:03:52,439
basically took those countries over. They were independent countries before

72
00:03:52,479 --> 00:03:55,280
World War Two. Germany took them over as part of that,

73
00:03:55,680 --> 00:03:58,919
and then it ended up them merging as part of

74
00:03:58,960 --> 00:03:59,759
the Soviet Union.

75
00:04:00,080 --> 00:04:01,479
Speaker 1: So they've been kicked around a bit.

76
00:04:01,560 --> 00:04:03,759
Speaker 4: As as most of Eastern Europe.

77
00:04:04,360 --> 00:04:08,000
Speaker 3: Yeah, yeah, and as Moscow loses control in the late eighties,

78
00:04:08,000 --> 00:04:10,520
they're you know, Poland's one of the first and all

79
00:04:10,680 --> 00:04:13,080
the Warsaw Pact, like the Warsaw Pact actually ends in

80
00:04:13,120 --> 00:04:15,599
nineteen ninety one, but it had already fallen apart at

81
00:04:15,599 --> 00:04:16,079
this point.

82
00:04:16,240 --> 00:04:18,720
Speaker 1: What I know about Poland is that the geography of

83
00:04:18,759 --> 00:04:21,759
Poland makes it very susceptible to invasion, and it's been

84
00:04:21,759 --> 00:04:25,399
invaded many times by many other countries because of that. Yeah,

85
00:04:25,800 --> 00:04:27,160
and you know, they're still.

86
00:04:26,959 --> 00:04:31,240
Speaker 3: Here, yeah, and under control in lots of different ways. The

87
00:04:32,240 --> 00:04:35,040
Central Asian states, the stans as you refer to them,

88
00:04:35,160 --> 00:04:37,319
are obviously very important, and had been part of it for

89
00:04:37,360 --> 00:04:40,800
some time, and so they do become part of the

90
00:04:40,879 --> 00:04:45,800
Commonwealth of Independent States. That referendum is really

91
00:04:45,839 --> 00:04:50,279
about should we have more independence? The current Soviet system

92
00:04:50,279 --> 00:04:52,959
will end, but something will replace it. This is also

93
00:04:53,000 --> 00:04:56,279
the period where Boris Yeltsin is elected as the president

94
00:04:56,800 --> 00:05:00,439
of the new Russian Soviet Federative Socialist Republic.

95
00:05:00,600 --> 00:05:03,360
Speaker 1: So they come up with a union in other words Russia.

96
00:05:03,680 --> 00:05:07,360
Speaker 3: Yeah, yeah, Russian Federation and sort of a collective. So

97
00:05:07,560 --> 00:05:10,480
it is it is an unraveling and it will affect

98
00:05:10,560 --> 00:05:13,079
space as well as we get down into that portion

99
00:05:13,160 --> 00:05:16,439
of the story. And obviously depending on where you are

100
00:05:16,480 --> 00:05:19,160
in the world, this was a disaster or a huge success.

101
00:05:19,199 --> 00:05:22,800
I mean, this is also when Yugoslavia starts to break

102
00:05:22,879 --> 00:05:27,399
up and goes immediately into conflict. Yeah, Serbs and Croats

103
00:05:27,439 --> 00:05:30,600
and the Macedonians like it turns into what will become

104
00:05:30,680 --> 00:05:35,000
quite a nasty, nasty, nasty civil war for several years. Yeah.

105
00:05:35,079 --> 00:05:37,680
But on top of that, this is when the Gulf

106
00:05:37,720 --> 00:05:40,480
War begins. Yep, that's right, you know, the invasion of

107
00:05:41,160 --> 00:05:44,800
Kuwait had already happened, and George Bush is president, so it's

108
00:05:44,839 --> 00:05:46,959
crazy to think about with all of that going on.

109
00:05:47,879 --> 00:05:51,120
Then also the air strikes begin for the First Gulf War.

110
00:05:51,680 --> 00:05:54,439
It's also when apartheid ends in South Africa. Like, yeah,

111
00:05:54,560 --> 00:05:55,439
it's a lot, isn't it.

112
00:05:55,879 --> 00:05:57,839
Speaker 1: It's just a lot of stuff, a lot of history

113
00:05:57,879 --> 00:06:00,279
going on, a lot happened simultaneously there. I remember

114
00:06:00,399 --> 00:06:04,240
David Crosby somewhere on the news or something in

115
00:06:04,319 --> 00:06:07,879
nineteen eighty nine, he says, you think the sixties were wild?

116
00:06:08,160 --> 00:06:10,040
Just wait till you see the nineties.

117
00:06:10,360 --> 00:06:15,920
Speaker 3: Yeah? Yeah, So it's a very intense time and a

118
00:06:16,000 --> 00:06:17,360
lot of the things that have been coming, you know,

119
00:06:17,360 --> 00:06:20,360
are coming to fruition. Like I said, the USSR

120
00:06:21,199 --> 00:06:23,040
will be over by the end of the year.

121
00:06:23,319 --> 00:06:25,439
Speaker 1: Andrew. And that voice you love, the voice you heard

122
00:06:25,519 --> 00:06:29,040
was Andrew. Do you want to say something about the chain,

123
00:06:29,839 --> 00:06:30,759
the Baltic Chain?

124
00:06:31,360 --> 00:06:34,160
Speaker 4: Yeah, So we were talking about this in the pre show,

125
00:06:34,199 --> 00:06:36,839
and this is I'm just old enough to remember this

126
00:06:37,560 --> 00:06:40,879
kind of transition away from the USSR, and I have

127
00:06:40,920 --> 00:06:45,560
a very vivid memory of watching the TV of something

128
00:06:45,600 --> 00:06:48,600
called the Baltic Chain, which wasn't actually in nineteen ninety one.

129
00:06:48,639 --> 00:06:51,600
It was a couple of years earlier, in nineteen eighty nine,

130
00:06:51,920 --> 00:06:55,720
and it was that two million people across Eastern Europe

131
00:06:55,839 --> 00:07:00,439
held hands in an unbroken chain over four hundred miles long,

132
00:07:00,519 --> 00:07:03,600
to kind of protest the USSR. I just have this

133
00:07:03,759 --> 00:07:07,040
vivid memory of the news shows about it. And if

134
00:07:07,040 --> 00:07:10,240
you're in Europe, you probably know about the Black Ribbon Day,

135
00:07:10,240 --> 00:07:13,720
which is the twenty third of August every year, and

136
00:07:13,759 --> 00:07:17,959
that's kind of the remembrance of anti-Stalinism and Nazism.

137
00:07:18,160 --> 00:07:20,199
And the reason it's the twenty third of August is

138
00:07:20,240 --> 00:07:23,000
because that's when the Baltic Chain or the Baltic Way

139
00:07:23,199 --> 00:07:23,959
was. Wow.

140
00:07:24,560 --> 00:07:27,920
Speaker 1: Cool stuff. All right, let me get to some other

141
00:07:28,000 --> 00:07:32,040
things that happened. You already talked about all of that stuff.

142
00:07:32,319 --> 00:07:36,160
There was a military coup that overthrew Haiti's first

143
00:07:36,279 --> 00:07:40,759
democratically elected president. So, Haiti. You're going to talk about

144
00:07:40,879 --> 00:07:43,000
the World Wide Web, of course, so I'll leave that

145
00:07:43,079 --> 00:07:43,519
for you.

146
00:07:44,279 --> 00:07:44,560
Speaker 4: Last.

147
00:07:45,360 --> 00:07:49,040
Speaker 1: What I want to talk about is some cultural stuff.

148
00:07:49,120 --> 00:07:53,920
Nintendo released the Super Nintendo Entertainment System, which was amazing.

149
00:07:55,360 --> 00:07:59,319
There were some good movies. Terminator two was there, Silence

150
00:07:59,319 --> 00:08:01,839
of the Lambs there. But I want to talk about

151
00:08:01,839 --> 00:08:07,439
the music. So the top ten albums of nineteen ninety one,

152
00:08:08,399 --> 00:08:11,600
And you know, for me, nineteen ninety was sort of

153
00:08:11,639 --> 00:08:16,319
the comeback of people who played their own instruments. You know,

154
00:08:16,720 --> 00:08:19,160
in the eighties, it was all synthesizers and midi and

155
00:08:19,160 --> 00:08:22,279
all that stuff, and it just wore on my soul.

156
00:08:22,399 --> 00:08:24,759
You know, I grew up listening to sixties and seventies,

157
00:08:25,279 --> 00:08:28,319
and you know, I wanted to play like Peter Frampton

158
00:08:28,439 --> 00:08:31,920
and Brian May and Eric Clapton, right, people who really

159
00:08:32,240 --> 00:08:37,440
were masters of an instrument and practiced their craft on an instrument.

160
00:08:37,799 --> 00:08:40,200
And so in the nineties it sort of came around again.

161
00:08:41,840 --> 00:08:45,159
So in ninety one, let's talk about Matthew Sweet. Girlfriend

162
00:08:45,399 --> 00:08:48,840
was number ten, and I remember I was working at

163
00:08:48,960 --> 00:08:53,679
Voyetra Technologies and the head engineer there was friends with

164
00:08:53,759 --> 00:08:56,919
Matthew Sweet, so I got to hear his stuff before

165
00:08:56,919 --> 00:09:02,080
it went live. The Pixies, number nine, Trompe le Monde, a

166
00:09:02,159 --> 00:09:07,519
good album. Soundgarden, Badmotorfinger. De La Soul, De La Soul

167
00:09:07,639 --> 00:09:11,879
Is Dead. R.E.M., Out of Time, number six. Number five,

168
00:09:12,000 --> 00:09:17,679
Pearl Jam, Ten. Number four, U2, Achtung Baby. Hello,

169
00:09:18,159 --> 00:09:22,480
like this is amazing stuff. And number three, A Tribe

170
00:09:22,519 --> 00:09:26,559
Called Quest, The Low End Theory. Number two, My Bloody

171
00:09:26,639 --> 00:09:31,480
Valentine, Loveless. And number one, Nirvana, Nevermind. Nice. And

172
00:09:31,519 --> 00:09:35,000
that really started the whole grunge movement. Yeah, but it's

173
00:09:35,039 --> 00:09:37,480
interesting that Neil Young had a great album on the

174
00:09:37,559 --> 00:09:40,799
top ten in nineteen ninety and he was called the

175
00:09:40,840 --> 00:09:44,799
grandfather of grunge, right, because sure, yeah, he was sort

176
00:09:44,840 --> 00:09:47,639
of the inspiration for all of that stuff. And Neil

177
00:09:47,759 --> 00:09:50,799
never stopped rocking all throughout the seventies and eighties, and.

178
00:09:50,759 --> 00:09:52,879
Speaker 3: It's certainly a man who always played his own instruments too.

179
00:09:53,320 --> 00:09:55,720
Absolutely come on. And you know there was some great

180
00:09:55,759 --> 00:09:58,320
hip hop albums and stuff too, but for a white

181
00:09:58,399 --> 00:10:01,919
kid who grew up listening to primarily white bands, but

182
00:10:02,000 --> 00:10:05,679
some black bands too, of course black musicians. But you know,

183
00:10:05,759 --> 00:10:09,360
I was always just into real music, and you know,

184
00:10:09,879 --> 00:10:14,039
not that rap and hip hop and all of that

185
00:10:14,159 --> 00:10:18,240
isn't real music, it really is. But the electronics stuff

186
00:10:18,480 --> 00:10:21,200
really just drove me crazy. And so in the nineties

187
00:10:21,320 --> 00:10:23,440
it felt like a sigh of relief, Oh my god,

188
00:10:23,559 --> 00:10:26,720
this is great. Bands are back. So yeah, we'll talk

189
00:10:26,759 --> 00:10:30,080
about more of that later on. Anyway, all right. The

190
00:10:30,080 --> 00:10:32,440
space side's fairly short. There are five Shuttle flights in

191
00:10:32,559 --> 00:10:35,000
nineteen ninety one. Two of them are military. There's

192
00:10:35,039 --> 00:10:37,559
not much to talk about. Columbia flies another Spacelab mission,

193
00:10:37,559 --> 00:10:40,759
which is awesome to me, the most interesting mission of

194
00:10:40,759 --> 00:10:45,639
the bunch. Then Discovery and Atlantis both launch observatory satellites.

195
00:10:46,120 --> 00:10:48,679
Discovery does the Upper Atmosphere Research satellite, which is about

196
00:10:48,720 --> 00:10:53,200
detecting carbon dioxide. Yeah, and Atlantis flew the Compton Gamma

197
00:10:53,279 --> 00:10:55,759
Ray Observatory, one of the great observatories, which is an

198
00:10:55,759 --> 00:10:59,399
awesome massive machine. It, like the Hubble, was designed to

199
00:10:59,399 --> 00:11:01,919
be serviced, although it never was. Ultimately it worked as

200
00:11:01,919 --> 00:11:04,679
it was and there was no reason to ever upgrade it. That

201
00:11:04,799 --> 00:11:07,480
was the whole idea behind this collection of things with

202
00:11:07,519 --> 00:11:09,840
the Shuttle, to do maintenance on them. Ultimately, the

203
00:11:09,960 --> 00:11:12,360
only one serviced was Hubble. Of course, Hubble's

204
00:11:12,360 --> 00:11:15,279
already launched, it just doesn't work because they've misground the mirror.

205
00:11:15,360 --> 00:11:17,159
So they're still at this point trying to figure out

206
00:11:17,159 --> 00:11:22,039
how to fix that. But there's one Russian flight, Soviet flight,

207
00:11:22,080 --> 00:11:26,080
actually the important one, Soyuz TM thirteen. This

208
00:11:26,240 --> 00:11:28,799
was, so this was a Soyuz mission to the

209
00:11:28,919 --> 00:11:31,919
Mir space station, which is of course operating at this time,

210
00:11:32,600 --> 00:11:37,320
and the two Russians are on Mir space station and

211
00:11:37,360 --> 00:11:43,080
they go up there as Soviet Union members in October

212
00:11:43,120 --> 00:11:45,960
of nineteen ninety one. But when they return in March

213
00:11:45,960 --> 00:11:48,519
of nineteen ninety two, they will be the last Soviets

214
00:11:48,720 --> 00:11:51,960
ever to return. There's no more Soviet Union, so it's

215
00:11:52,200 --> 00:11:58,000
Alexander Volkov and Sergei Krikalev will be, you know,

216
00:11:58,080 --> 00:12:02,480
somehow impacted by the huge change, right, Okay, yeah, but

217
00:12:02,759 --> 00:12:05,399
they have no passports. The other part of this, of

218
00:12:05,399 --> 00:12:08,440
course, is Kazakhstan. You know, all of those flights go

219
00:12:08,440 --> 00:12:10,879
out of Kazakhstan, which is now an independent nation too.

220
00:12:10,919 --> 00:12:13,080
So one of the things they did on

221
00:12:13,120 --> 00:12:17,559
TM thirteen was they flew a Kazakh astronaut or cosmonaut.

222
00:12:18,039 --> 00:12:20,360
So they're trying to keep the Kazakhs engaged as they

223
00:12:20,399 --> 00:12:23,000
see the changes that are going on here. And so

224
00:12:23,120 --> 00:12:27,440
the Kazakhs get their first astronaut as part of TM thirteen. Wow,

225
00:12:27,840 --> 00:12:30,200
to go up to Mir. That's cool. And it maintained the

226
00:12:30,240 --> 00:12:32,679
agreement, which still exists to this day, to host

227
00:12:32,799 --> 00:12:36,360
the space flights of the Russian space program. Okay, computing,

228
00:12:36,879 --> 00:12:40,039
I mean we have to lead with this. Tim Berners

229
00:12:40,120 --> 00:12:43,000
Lee announces the World Wide Web project in a newsgroup and

230
00:12:43,799 --> 00:12:47,120
provides a link to the very first website in the world,

231
00:12:47,679 --> 00:12:52,399
info dot cern dot ch because it's in Switzerland. Of course,

232
00:12:52,440 --> 00:12:54,840
the only graphical browser that exists at that time is

233
00:12:54,840 --> 00:12:57,200
on the NeXT because TBL wrote it. So unless you

234
00:12:57,240 --> 00:13:00,440
have a NeXT, you can't go there anyway, or you

235
00:13:00,440 --> 00:13:02,080
can try to use a text one. But they tried

236
00:13:02,080 --> 00:13:05,559
to address that by releasing the WWW common library and

237
00:13:05,639 --> 00:13:08,240
sort of encouraged people: you need to build browsers, and

238
00:13:08,279 --> 00:13:10,879
a bunch of folks volunteer to start making the first

239
00:13:10,919 --> 00:13:11,759
generation of browsers.

240
00:13:11,799 --> 00:13:14,720
Speaker 1: One of them will be Marc Andreessen. The first browser

241
00:13:14,759 --> 00:13:16,120
I used was called Cello.

242
00:13:16,480 --> 00:13:16,919
Speaker 3: There you go.

243
00:13:17,080 --> 00:13:19,960
Speaker 1: Do you remember Cello? It was before Netscape. Yeah, yeah,

244
00:13:20,080 --> 00:13:22,759
so before all that. Actually it was Mosaic before Netscape.

245
00:13:22,840 --> 00:13:25,279
Speaker 3: Yeah, and yeah, that's the one that Andreessen will be

246
00:13:25,320 --> 00:13:28,080
involved in, out of the Supercomputing Center in Urbana. Yeah, yeah,

247
00:13:28,159 --> 00:13:31,399
the National Science Foundation which now owns that network because

248
00:13:31,399 --> 00:13:34,440
the military's ARPA has moved on, it's now the NSF,

249
00:13:34,480 --> 00:13:38,200
but this is the year that NSF announced they were

250
00:13:38,559 --> 00:13:41,559
removing the restrictions on commercial utilization of the Internet. So

251
00:13:41,679 --> 00:13:45,559
arguably this is the catalyst, although things won't go crazy

252
00:13:45,639 --> 00:13:49,159
until Netscape comes along. But these are the elements of

253
00:13:49,200 --> 00:13:50,799
making what we now know as the Internet.

254
00:13:50,799 --> 00:13:53,519
Speaker 1: I remember there was a lot of people complaining about

255
00:13:53,559 --> 00:13:56,000
the commercialization of the Internet, you know, on the news

256
00:13:56,039 --> 00:13:58,080
groups and all. And, well yeah, no, even on

257
00:13:58,200 --> 00:14:01,440
CompuServe before that, you know, they were really worried

258
00:14:01,480 --> 00:14:06,240
about the commercial thing. However, porn was totally cool, right,

259
00:14:06,759 --> 00:14:08,720
that's fine. Porn drove the Internet.

260
00:14:08,799 --> 00:14:12,039
Speaker 3: This is also the year that one young programmer, named

261
00:14:12,080 --> 00:14:17,000
Linus Torvalds, releases a version of Unix that he calls

262
00:14:17,080 --> 00:14:20,240
Linux, the kernel, on Usenet, and a whole bunch of

263
00:14:20,240 --> 00:14:23,039
people volunteer to improve it. They're very excited about having

264
00:14:23,440 --> 00:14:28,159
an independent version of Unix. Yeah. Microsoft releases a product

265
00:14:28,240 --> 00:14:30,000
called Visual Basic.

266
00:14:30,519 --> 00:14:35,120
Speaker 1: That's really funny because that's when I started working with it.

267
00:14:35,200 --> 00:14:37,240
And Andrew, you were a VB programmer too.

268
00:14:37,279 --> 00:14:38,960
Speaker 4: Wonder how they got that past your quality control?

269
00:14:39,519 --> 00:14:41,919
Speaker 1: Were you in there in the beginning one point zero?

270
00:14:42,360 --> 00:14:46,000
Speaker 4: No, no, I have a few, a few fewer

271
00:14:46,039 --> 00:14:51,480
gray hairs than you. Mine still looks brown to me.

272
00:14:51,759 --> 00:14:53,480
Speaker 1: Mine was practically white.

273
00:14:54,200 --> 00:14:56,639
Speaker 4: Sorry for the segue. I was putting sun

274
00:14:56,720 --> 00:14:59,320
cream on my face and I thought I hadn't rubbed in

275
00:14:59,360 --> 00:15:01,840
all the sun cream, and then I realized my beard

276
00:15:01,960 --> 00:15:02,639
is just turning white.

277
00:15:02,639 --> 00:15:04,039
Speaker 3: It's a bit grey on the sides there.

278
00:15:04,240 --> 00:15:10,639
Speaker 4: Yeah. Yeah, I was a VB four developer. That was

279
00:15:10,679 --> 00:15:14,000
my well, my my first ever programming language was Basic

280
00:15:14,080 --> 00:15:17,360
on a ZX spectrum, and then I progressed from that

281
00:15:17,440 --> 00:15:18,279
to VB four.

282
00:15:18,519 --> 00:15:18,799
Speaker 1: Nice.

283
00:15:19,159 --> 00:15:22,200
Speaker 3: Yeah, I remember VB one. I was a happy Clipper programmer,

284
00:15:22,279 --> 00:15:25,000
making lots of money. Business was good, but people kept

285
00:15:25,000 --> 00:15:27,080
wanting Windows stuff. I didn't understand why. I thought taking

286
00:15:27,080 --> 00:15:31,320
your hand off the keyboard was stupid. But with Windows

287
00:15:31,320 --> 00:15:35,960
three and device contexts, that was the thing. Yeah, DCs,

288
00:15:36,080 --> 00:15:40,639
but right, trying to write MFC apps and crashing Windows

289
00:15:40,679 --> 00:15:43,919
all the time sucked. So VB looked to be a

290
00:15:44,000 --> 00:15:45,919
window through there. Although I didn't have a lot of

291
00:15:45,960 --> 00:15:48,240
love for one. One was pretty primitive. You couldn't do a

292
00:15:48,240 --> 00:15:48,759
whole bunch.

293
00:15:48,840 --> 00:15:49,279
Speaker 1: I got it.

294
00:15:49,559 --> 00:15:51,519
Speaker 3: Two would be the version that I'm like, okay, this

295
00:15:51,639 --> 00:15:52,080
is the way.

296
00:15:52,200 --> 00:15:54,720
Speaker 1: I liked one just because it was the proof of concept,

297
00:15:54,799 --> 00:15:57,279
you know, and it yeah didn't it couldn't. You couldn't

298
00:15:57,279 --> 00:15:59,320
really do anything with it. It was, you know, No,

299
00:15:59,399 --> 00:16:00,840
it was just meant to be the beginning.

300
00:16:01,480 --> 00:16:04,159
Speaker 3: This is the year that Microsoft releases DOS five point zero,

301
00:16:04,840 --> 00:16:07,840
and also the separation from IBM is complete and they

302
00:16:07,879 --> 00:16:11,000
stop calling their Microsoft OS two. They start calling it

303
00:16:11,080 --> 00:16:11,759
Windows NT.

304
00:16:12,440 --> 00:16:13,039
Speaker 1: Dave Cutler.

305
00:16:13,480 --> 00:16:17,279
Speaker 3: What else? The PowerPC processor, a joint development between IBM,

306
00:16:17,320 --> 00:16:20,399
Motorola and Apple for the new PowerMac is released.

307
00:16:20,559 --> 00:16:22,559
Speaker 1: I don't know whether to feel a kinship with you.

308
00:16:22,639 --> 00:16:25,080
guys, or just sad at how old I am.

309
00:16:25,559 --> 00:16:28,799
Speaker 4: You know, this video call is like a live demo

310
00:16:28,840 --> 00:16:34,679
of entropy, isn't it beautiful?

311
00:16:35,519 --> 00:16:40,759
Speaker 3: All right, you can stay. Creative Labs releases the Multimedia

312
00:16:40,879 --> 00:16:43,120
Kit, remember this? So, you know, not only does it

313
00:16:43,240 --> 00:16:45,600
come with your Sound Blaster Pro, there's also this

314
00:16:45,639 --> 00:16:49,120
thing called a CD-ROM drive. So this is the

315
00:16:49,120 --> 00:16:51,600
beginning of multimedia on your computer.

316
00:16:51,879 --> 00:16:52,639
Speaker 1: Yeah, that's right.

317
00:16:53,159 --> 00:16:55,679
Speaker 3: One that I particularly love I remember at the time

318
00:16:55,799 --> 00:16:57,679
was a new game for the Sega Genesis called

319
00:16:57,840 --> 00:17:01,799
Zero Wing, and it is the origin of the phrase

320
00:17:02,360 --> 00:17:04,240
all your base are belong to us.

321
00:17:04,480 --> 00:17:07,839
Speaker 1: Oh, all your base are belong to us!

322
00:17:08,039 --> 00:17:10,079
Speaker 3: Yeah, This is where it comes from. It's this terrible

323
00:17:10,200 --> 00:17:12,880
Japanese translation of this story of.

324
00:17:13,319 --> 00:17:15,759
Speaker 1: A somebody made. I think we did it on dot

325
00:17:15,880 --> 00:17:18,799
NetRocks a long time ago. Jeff Manzilick maybe played that.

326
00:17:18,799 --> 00:17:21,480
There was like a music loop that was done with

327
00:17:21,640 --> 00:17:23,920
all your base are belong to us. It was like

328
00:17:23,960 --> 00:17:25,039
a techno kind of thing.

329
00:17:25,160 --> 00:17:27,400
Speaker 3: It was a techno thing. Yeah, that's where it comes from.

330
00:17:27,480 --> 00:17:29,680
Nineteen ninety one. This is also the year that id

331
00:17:29,799 --> 00:17:32,839
Software is formed, that's Carmack, and their first game is

332
00:17:32,839 --> 00:17:36,000
called Commander Keen. But you know about them for Doom

333
00:17:36,079 --> 00:17:38,160
and Quake and all of those games, and a company

334
00:17:38,160 --> 00:17:41,240
called Silicon & Synapse, which now we know as Blizzard.

335
00:17:41,319 --> 00:17:41,640
Speaker 4: There you go.

336
00:17:41,720 --> 00:17:43,799
Speaker 3: You care about World of Warcraft or those kinds

337
00:17:43,839 --> 00:17:45,640
of games. All right, that's what I got.

338
00:17:45,680 --> 00:17:48,119
Speaker 1: All right, let's move on to Better Know a Framework.

339
00:17:48,200 --> 00:17:48,839
Roll the music.

340
00:17:49,000 --> 00:17:59,039
Speaker 3: Awesome. All right, man, here we go. All right.

341
00:17:59,119 --> 00:18:02,359
Speaker 1: So I just recorded an episode of Coded with

342
00:18:02,440 --> 00:18:06,640
AI today with Jeff Fritz and he pulled out this

343
00:18:06,680 --> 00:18:11,240
thing called Squad and Squad is a tool by Brady Gaster.

344
00:18:12,039 --> 00:18:13,279
Speaker 3: Let me read it from your repo.

345
00:18:13,720 --> 00:18:18,480
Speaker 1: Yep. Squad gives you an AI development team through GitHub Copilot.

346
00:18:18,720 --> 00:18:22,920
Describe what you're building, get a team of specialists: front end,

347
00:18:23,160 --> 00:18:26,720
back end, testers, that live in your repo as files.

348
00:18:27,279 --> 00:18:30,839
They persist across sessions. One of them, their job is

349
00:18:31,000 --> 00:18:35,759
just to remember things and write them down. They persist

350
00:18:35,799 --> 00:18:39,400
across sessions, learn your codebase, share decisions, and get

351
00:18:39,519 --> 00:18:42,799
better the more you use them. It's not a chatbot

352
00:18:42,799 --> 00:18:46,079
wearing hats. Each team member runs in its own context,

353
00:18:46,480 --> 00:18:49,839
reads only its own knowledge, and writes back what it learned.

354
00:18:50,440 --> 00:18:55,640
Get this, It will write skills. It's like really smart.

355
00:18:56,039 --> 00:19:00,279
And I saw a demo of that with Jeff Fritz today.

356
00:19:00,680 --> 00:19:03,799
It's Coded with AI episode eighteen and we'll provide a

357
00:19:03,799 --> 00:19:07,480
link to that website. And it just blew my mind.

358
00:19:07,640 --> 00:19:10,240
Here's the other thing that blew my mind. I said,

359
00:19:10,400 --> 00:19:13,319
isn't this, with all these multiple agents going to rack

360
00:19:13,440 --> 00:19:16,119
up a lot of you know, money, a lot of tokens.

361
00:19:16,160 --> 00:19:18,880
He says, yes. But my Copilot license is not

362
00:19:19,400 --> 00:19:24,119
by the token, it's by the request. Interesting, and so yeah,

363
00:19:24,240 --> 00:19:27,079
he turned out. It turned out that he did this

364
00:19:27,160 --> 00:19:31,599
demo of adding Oh what did he add? He added

365
00:19:31,640 --> 00:19:36,119
a testing framework. He added a sequel expert to analyze

366
00:19:36,119 --> 00:19:38,359
the code that was generating SQL. This is for AV

367
00:19:38,480 --> 00:19:42,240
and Data Genie. He added a GitHub

368
00:19:44,319 --> 00:19:50,559
CI/CD pipeline. All of this stuff only took eighteen requests,

369
00:19:50,680 --> 00:19:53,839
and really it was only six requests,

370
00:19:53,839 --> 00:19:59,440
but it was a three times license essentially. The thing

371
00:19:59,480 --> 00:20:03,720
that he was using was a three-x agent, essentially.

372
00:20:04,279 --> 00:20:07,240
So yeah, and it only took eighteen requests. We did the

373
00:20:07,279 --> 00:20:08,960
math and it turned out to be about a buck.

374
00:20:09,119 --> 00:20:09,519
Speaker 3: Wow.

375
00:20:09,720 --> 00:20:14,160
Speaker 1: So it's very cost efficient if you use it the right way. Okay, yeah,

376
00:20:14,200 --> 00:20:15,960
I'm very impressed. Now it sounds like there should be

377
00:20:15,960 --> 00:20:17,799
a dot NetRocks about it at some point. I

378
00:20:17,839 --> 00:20:19,400
think there will be as a matter of fact.

379
00:20:19,480 --> 00:20:21,960
Speaker 3: Yeah, we just had Brady on the show too, so

380
00:20:22,079 --> 00:20:23,160
but you know, we'll figure it out.

381
00:20:23,240 --> 00:20:26,079
Speaker 1: Very cool. So anyway, that's what I got. Who's talking

382
00:20:26,079 --> 00:20:26,359
to us?

383
00:20:26,440 --> 00:20:30,000
Speaker 3: Richard grabbed a comment off a show nineteen eighty nine.

384
00:20:30,079 --> 00:20:32,559
So that's the one we just did with Ben when

385
00:20:32,559 --> 00:20:34,640
we talked about the role of AI. It lit off a

386
00:20:34,680 --> 00:20:37,319
whole bunch of comments. I thought we should read this one

387
00:20:37,359 --> 00:20:40,240
from long-time listener John Suda, who has been listening

388
00:20:40,240 --> 00:20:41,519
to the show for a very long time, so I

389
00:20:41,519 --> 00:20:43,839
grabbed his comment. He said, Hey, not so long ago,

390
00:20:43,880 --> 00:20:46,920
you guys mocked the concept of vibe coding. Sorry, dude,

391
00:20:46,960 --> 00:20:49,839
I still do if you're talking about the original concept,

392
00:20:49,839 --> 00:20:51,880
which is the one, you know, Andrej Karpathy talked about, where you

393
00:20:51,920 --> 00:20:54,079
don't dig into your own code, you just keep feeding

394
00:20:54,119 --> 00:20:56,400
it back and so forth. But now it's down to

395
00:20:56,519 --> 00:20:59,480
and I'm paraphrasing, soon only the hobbyists will read, let

396
00:20:59,480 --> 00:21:00,480
alone write code.

397
00:21:00,799 --> 00:21:01,240
Speaker 4: Wow.

398
00:21:01,559 --> 00:21:02,200
Speaker 1: Who said that?

399
00:21:02,559 --> 00:21:06,000
Speaker 3: Well, I'm not sure, but this is from that conversation

400
00:21:06,119 --> 00:21:10,279
of you now managing a bunch of agents generating code.

401
00:21:10,400 --> 00:21:12,920
Speaker 1: Right, although I think, well that I didn't say that.

402
00:21:13,440 --> 00:21:15,599
Speaker 3: I think maybe, no, you didn't, I don't think you did. But

403
00:21:15,880 --> 00:21:18,680
there is that point of which you do an agent management,

404
00:21:18,720 --> 00:21:20,960
and then there's also this point of there's some things

405
00:21:21,000 --> 00:21:23,400
that the agents aren't good at, at least for now.

406
00:21:23,680 --> 00:21:26,519
Speaker 1: One thing I said in this conversation with Jeff Fritz

407
00:21:26,640 --> 00:21:32,440
on Squad is that you know, you have to be

408
00:21:32,480 --> 00:21:35,759
more vigilant than ever because we have this now that

409
00:21:35,799 --> 00:21:38,720
we're doing. We're in this role of you know, just

410
00:21:38,799 --> 00:21:42,599
barking out commands and letting things happen. We

411
00:21:42,640 --> 00:21:46,519
have a moral obligation to double check, triple check that

412
00:21:46,720 --> 00:21:50,559
code because our name's on it, you know what I mean.

413
00:21:50,920 --> 00:21:53,839
And if the company comes back and they're looking

414
00:21:53,839 --> 00:21:56,720
for people to blame, you can't say, oh, the AI

415
00:21:56,880 --> 00:22:00,839
did that. You can't. No, you're responsible.

416
00:22:01,079 --> 00:22:02,319
Speaker 3: They're just not going to take it.

417
00:22:02,400 --> 00:22:05,119
Speaker 1: You're responsible for it. So and people could get hurt.

418
00:22:05,440 --> 00:22:07,559
That's the thing about it, Like you really have to

419
00:22:07,559 --> 00:22:10,640
think about this as a moral issue, I think anyway.

420
00:22:10,400 --> 00:22:12,839
Speaker 3: Yeah, I don't disagree, And this is basically what John's saying.

421
00:22:12,880 --> 00:22:15,119
It's like, the pace at which things are happening scares me

422
00:22:15,160 --> 00:22:17,920
at times. We almost fatalistically accept it as inevitable. But

423
00:22:18,000 --> 00:22:21,079
is it always an unmitigated good? No? Yeah, And we're

424
00:22:21,119 --> 00:22:24,359
not fatalistically accepting it. We're testing it and trying it

425
00:22:24,400 --> 00:22:26,599
and trying to figure out if it's valuable and understanding

426
00:22:26,640 --> 00:22:29,160
the issues around it. Right, like, these things are

427
00:22:29,160 --> 00:22:31,440
in front of us. And my first reaction was show

428
00:22:31,480 --> 00:22:34,039
me the money, like does this work? And how do

429
00:22:34,079 --> 00:22:35,319
you use it responsibly?

430
00:22:35,480 --> 00:22:37,559
Speaker 4: And the genie is not going back in the bottle

431
00:22:37,759 --> 00:22:40,799
like you know, so it's only ever going to get better,

432
00:22:40,839 --> 00:22:43,359
it's not going to get worse. And you know, we

433
00:22:43,440 --> 00:22:45,799
just can't undo what's being done. We have to find

434
00:22:45,799 --> 00:22:46,480
a way to do with it.

435
00:22:46,519 --> 00:22:49,000
Speaker 3: Although the scope of badness is still yet to

436
00:22:49,000 --> 00:22:51,920
be fully explored. There's more harm to be done without

437
00:22:51,920 --> 00:22:55,160
a doubt. And he goes, Honestly, change is good,

438
00:22:55,200 --> 00:22:56,759
There's no two ways about it. But sometimes I wonder

439
00:22:56,880 --> 00:22:58,880
can there be too much of a good thing?

440
00:22:59,039 --> 00:23:02,400
Absolutely, too much of a good thing, right? And that's

441
00:23:02,480 --> 00:23:05,079
part of why we keep digging into these things is

442
00:23:05,119 --> 00:23:08,359
because I am concerned and we do have real goals

443
00:23:08,359 --> 00:23:10,519
to make. We're not just writing software for fun.

444
00:23:11,039 --> 00:23:13,880
Are you delivering the quality that the customer needs? And

445
00:23:14,759 --> 00:23:16,960
is your product good enough? And do you understand the

446
00:23:17,000 --> 00:23:19,880
scope of what's been created? I don't find this all that

447
00:23:19,960 --> 00:23:23,000
different from when we started outsourcing development into other countries

448
00:23:23,000 --> 00:23:24,960
because the bandwidth was cheap and the labor was cheap,

449
00:23:24,960 --> 00:23:27,880
and we had similar kinds of problems. Yes, you know,

450
00:23:28,440 --> 00:23:32,559
with quality and understanding and so forth, so communication, yeah,

451
00:23:32,759 --> 00:23:34,839
the whole thing. But you know, I'm not going to

452
00:23:34,920 --> 00:23:37,440
argue with Andrew. The genie's definitely out of the bottle, and

453
00:23:37,480 --> 00:23:39,160
we just keep on looking at it. I mean,

454
00:23:39,680 --> 00:23:41,720
last year was the year we were often tired of

455
00:23:41,759 --> 00:23:44,799
talking about AI. This year is the year we seem

456
00:23:44,799 --> 00:23:46,920
to have a stack of results sufficient to say, Okay,

457
00:23:46,920 --> 00:23:48,880
what have we got here and what are the right

458
00:23:48,920 --> 00:23:51,680
things to do, while at the same time feeling like

459
00:23:52,000 --> 00:23:54,680
I really don't need the hype anymore. I just need

460
00:23:54,720 --> 00:23:55,480
to get to work.

461
00:23:55,880 --> 00:23:58,519
Speaker 4: It's finding the signal in the noise. That's what it is.

462
00:23:58,559 --> 00:23:58,640
Speaker 1: Like.

463
00:23:58,640 --> 00:24:02,799
Speaker 4: There's so much stuff out there, some of which is

464
00:24:02,920 --> 00:24:08,640
absolute and complete hogwash. Yeah, yeah, but you

465
00:24:08,680 --> 00:24:11,960
know some of it is meaningful and different, and you know,

466
00:24:12,039 --> 00:24:14,400
we need to find that signal in the noise. Yeah.

467
00:24:14,440 --> 00:24:16,079
Speaker 3: So maybe that's the thing. Last year we were trying

468
00:24:16,119 --> 00:24:18,039
to figure out there was signal. Now we're pretty darn

469
00:24:18,119 --> 00:24:21,039
sure and we're just scraping away the noise to focus

470
00:24:21,079 --> 00:24:25,720
on what actually is real here, for better or worse. Yeah, right. John,

471
00:24:25,759 --> 00:24:27,680
I'm pretty sure you have a copy of Music to Code By,

472
00:24:27,720 --> 00:24:30,480
but you're always welcome to another. Just send me an

473
00:24:30,519 --> 00:24:33,160
email and we'll figure it out. So thank you so

474
00:24:33,240 --> 00:24:35,480
much for your comment. And if you'd like a copy of

475
00:24:35,559 --> 00:24:37,839
Music to Code By, write a comment on the website at dot Net

476
00:24:38,000 --> 00:24:40,440
Rocks dot com or on the Facebooks. We publish every show

477
00:24:40,480 --> 00:24:42,039
there, and if your comment is read on the show,

478
00:24:42,119 --> 00:24:43,759
we'll send you a copy of Music to Code By.

479
00:24:43,480 --> 00:24:45,039
Speaker 1: And if you want to just go get Music to

480
00:24:45,079 --> 00:24:47,680
Code By. We have twenty two tracks. They're twenty five

481
00:24:47,680 --> 00:24:50,559
minutes long to keep you in a state of flow

482
00:24:50,759 --> 00:24:53,680
and focus while you're writing code. And that's at music

483
00:24:53,720 --> 00:24:56,839
to code by dot net, and we have them in MP three,

484
00:24:57,079 --> 00:25:01,559
FLAC and WAV formats. Awesome. Okay, should we finally get

485
00:25:01,599 --> 00:25:03,400
to our guest. I feel embarrassed here.

486
00:25:03,440 --> 00:25:04,960
Speaker 3: It's like that. Oh my goodness, Yes, this.

487
00:25:05,000 --> 00:25:07,519
Speaker 1: Show has taken a real turn for you know. That's

488
00:25:07,599 --> 00:25:09,839
twenty what is it twenty five minutes so far?

489
00:25:10,079 --> 00:25:10,680
Speaker 3: Something like that.

490
00:25:10,799 --> 00:25:12,480
Speaker 4: I'm here for it. I'm here for it.

491
00:25:12,680 --> 00:25:15,119
Speaker 1: All right, Well we're gonna cram it in. Andrew Murphy

492
00:25:15,160 --> 00:25:18,119
started his career as a VB programmer, but after almost

493
00:25:18,119 --> 00:25:21,319
two decades in technology leadership, he decided to focus on

494
00:25:21,440 --> 00:25:25,240
teaching the skills that he learned the hard way. When

495
00:25:25,279 --> 00:25:28,519
he moved into leadership, there was no support, so he

496
00:25:28,599 --> 00:25:31,160
had to make all the mistakes, and a lot of them,

497
00:25:31,519 --> 00:25:34,279
and learn from them. His goal is now to make

498
00:25:34,319 --> 00:25:36,880
sure that other tech leaders don't have to do things

499
00:25:36,920 --> 00:25:39,839
the hard way and to make them happy, confident and

500
00:25:39,880 --> 00:25:43,680
effective leaders. All right, a formal welcome to dot net

501
00:25:43,799 --> 00:25:45,039
Rocks, Andrew.

502
00:25:44,880 --> 00:25:46,680
Speaker 4: Thank you. Great to be here.

503
00:25:46,519 --> 00:25:46,839
Speaker 3: Great to have you.

504
00:25:47,079 --> 00:25:49,640
Speaker 1: So what are you thinking besides all this AI stuff

505
00:25:49,799 --> 00:25:52,279
just you know, flooding your.

506
00:25:52,160 --> 00:25:54,039
Speaker 3: Brain, overwhelming the conversation.

507
00:25:54,279 --> 00:25:58,000
Speaker 4: Yeah, so that's kind of what I wanted to talk about,

508
00:25:58,160 --> 00:26:02,599
is, you know, I run this meetup event

509
00:26:02,759 --> 00:26:07,000
brains trust type thing in Australia called CTO School, and

510
00:26:07,559 --> 00:26:10,160
I see it as my job as host to kind of,

511
00:26:10,240 --> 00:26:14,039
you know, curate the topics of conversation and you know,

512
00:26:14,160 --> 00:26:16,759
just kind of keep things really interesting. And for the

513
00:26:16,839 --> 00:26:20,359
past year, I have tried every single month to not

514
00:26:20,559 --> 00:26:24,920
have AI dominate the agenda, and I have failed every

515
00:26:24,960 --> 00:26:28,200
single month, and I've just come to an acceptance in

516
00:26:28,240 --> 00:26:31,240
the past kind of I don't know quarter or so

517
00:26:31,519 --> 00:26:33,880
that it is just going to be an important part

518
00:26:33,920 --> 00:26:37,440
of what we do as professionals and especially as leaders.

519
00:26:37,559 --> 00:26:40,359
And so, you know, I was doing a bunch of

520
00:26:40,359 --> 00:26:43,720
research on it and I came across this blog post

521
00:26:43,799 --> 00:26:46,039
by a guy called Nolan Lawson. I don't know if

522
00:26:46,279 --> 00:26:48,799
either of you have read it. It's called We Mourn our

523
00:26:48,880 --> 00:26:56,039
Craft and it's, oh yeah, it's incredible writing. It's absolutely beautiful.

524
00:26:56,359 --> 00:27:00,160
It's gorgeous. Honest. It's probably the best thing that I've

525
00:27:00,200 --> 00:27:03,119
read about what it actually feels like to be a

526
00:27:03,160 --> 00:27:08,640
software engineer right now in the industry. And it talks

527
00:27:08,680 --> 00:27:11,240
about a whole bunch of stuff. But I just want

528
00:27:11,279 --> 00:27:14,880
to quote one line from it because I think it

529
00:27:15,000 --> 00:27:19,000
kind of sums up exactly the feeling in the industry.

530
00:27:19,039 --> 00:27:21,279
So this is him talking about acceptance of where we

531
00:27:21,319 --> 00:27:24,680
are with AI. I don't celebrate the new world, but

532
00:27:24,720 --> 00:27:28,519
I also don't resist it. The sun rises, the sun sets, and

533
00:27:28,559 --> 00:27:31,960
I orbit helplessly around it. My protests can't stop it.

534
00:27:31,960 --> 00:27:36,079
It doesn't care. It continues its arc across the sky regardless,

535
00:27:36,400 --> 00:27:44,200
moving but unmoved. Beautiful poetry, Absolutely beautiful poetry. But I

536
00:27:44,799 --> 00:27:49,519
disagree with Nolan. I think there's a whole bunch of

537
00:27:49,640 --> 00:27:52,839
changes that we've had in our industry, and this is

538
00:27:53,599 --> 00:27:56,799
one of them. It's going to be a fundamental shift

539
00:27:56,880 --> 00:28:00,960
in the way we write code. So were a whole

540
00:28:01,000 --> 00:28:03,279
bunch of other things that happened in the industry, right

541
00:28:03,839 --> 00:28:08,160
you know, all three of us remember the days

542
00:28:08,200 --> 00:28:11,759
before IntelliSense, you know, when you actually had to remember

543
00:28:11,759 --> 00:28:13,279
the names of methods that's right.

544
00:28:13,359 --> 00:28:16,799
Speaker 3: When the cloud emerged, I got emails on the

545
00:28:16,880 --> 00:28:22,519
RunAs side from folks mourning spinning screwdrivers, racking and stacking. Yeah,

546
00:28:22,559 --> 00:28:24,519
you know stuff I used to do too, and we

547
00:28:24,559 --> 00:28:26,720
really took a lot from it. Building out a forty

548
00:28:26,720 --> 00:28:29,839
two was a great gig. It was, you know, it

549
00:28:29,880 --> 00:28:32,720
was a day of work, and it was fun. And

550
00:28:32,799 --> 00:28:33,240
it's over.

551
00:28:33,519 --> 00:28:37,119
Speaker 4: Yeah. Yeah, And I think there's so much of this

552
00:28:37,799 --> 00:28:39,880
that has happened in our industry that you know, this

553
00:28:40,039 --> 00:28:44,279
could be a fundamental shift, but it's not going to

554
00:28:44,359 --> 00:28:46,640
get rid of, I think, the core of what it

555
00:28:46,720 --> 00:28:49,640
means to be a software engineer. And so that's why

556
00:28:49,680 --> 00:28:53,640
I wrote my article in response to Nolan's around, you know,

557
00:28:53,720 --> 00:28:58,039
looking at that grief that Nolan identified through the common

558
00:28:58,079 --> 00:29:00,160
model of grief everybody's heard of, which is the five

559
00:29:00,200 --> 00:29:03,440
stages of grief. And I called it the five stages

560
00:29:03,480 --> 00:29:06,000
of losing our craft because it is it is grief.

561
00:29:06,039 --> 00:29:09,039
We are losing a part of what it means to

562
00:29:09,079 --> 00:29:13,000
be a software engineer where our identity is being challenged.

563
00:29:13,519 --> 00:29:16,319
But you know the reason why the five stages of

564
00:29:16,400 --> 00:29:19,480
grief exist as a model is because you don't just

565
00:29:20,000 --> 00:29:23,440
give up, you know. If you've lost a

566
00:29:23,480 --> 00:29:27,519
loved one, wallowing in that grief is not productive.

567
00:29:27,519 --> 00:29:30,079
It's it's moving through it and looking at what it means.

568
00:29:30,240 --> 00:29:31,960
And I wanted to kind of look at the grief

569
00:29:32,039 --> 00:29:36,119
that Nolan identified through the angle of the five Stages

570
00:29:36,160 --> 00:29:36,559
of grief.

571
00:29:36,920 --> 00:29:37,200
Speaker 3: Sure.

572
00:29:37,359 --> 00:29:39,920
Speaker 1: Yeah, we all have to move on. That's what it's

573
00:29:39,960 --> 00:29:43,920
all about. And if you don't, you know, then go

574
00:29:44,000 --> 00:29:46,559
work at Walmart. Yeah.

575
00:29:46,599 --> 00:29:49,519
Speaker 4: I mean it's a valid option, is it.

576
00:29:49,640 --> 00:29:55,720
Speaker 1: I don't know that it is, but okay, well, the

577
00:29:56,079 --> 00:30:00,920
collective, and I say collective personally, the knowledge that I've

578
00:30:00,960 --> 00:30:04,440
collected about systems and about customers and about what they want,

579
00:30:04,480 --> 00:30:08,559
what they need and users and stuff is all really valuable.

580
00:30:08,839 --> 00:30:10,960
And an AI just doesn't have that.

581
00:30:10,960 --> 00:30:11,400
Speaker 4: That's it.

582
00:30:11,599 --> 00:30:15,160
Speaker 1: And especially in the domains that I've worked in,

583
00:30:15,240 --> 00:30:17,680
which tend to be the domains that I keep working in.

584
00:30:19,240 --> 00:30:22,640
You know, people will hire you because you have that

585
00:30:22,759 --> 00:30:24,079
experience and that knowledge.

586
00:30:24,240 --> 00:30:29,000
Speaker 4: Yeah, and you know, it's it's not just about building

587
00:30:29,000 --> 00:30:33,240
great system architecture. It's also about knowing when not to

588
00:30:33,319 --> 00:30:35,559
do that. Like one of the things that I've I

589
00:30:35,599 --> 00:30:39,440
feel like I've learned and honed over my career is

590
00:30:39,519 --> 00:30:42,759
knowing when the hacky solution is the right solution and

591
00:30:42,799 --> 00:30:46,359
when it's worthwhile investing in building a great solution.

592
00:30:46,720 --> 00:30:50,880
And that's something that I have historically just seen AI

593
00:30:51,400 --> 00:30:54,960
fail and fall flat on its face on deciding:

594
00:30:55,000 --> 00:30:58,200
should this be the big architected solution or should this

595
00:30:58,240 --> 00:31:00,440
be something I just hack together. I think

596
00:31:00,480 --> 00:31:04,000
those decisions are something that you can only only learn

597
00:31:04,279 --> 00:31:08,400
and sharpen from doing this. And it'll probably get

598
00:31:08,440 --> 00:31:11,920
there in ten, twenty, thirty years. But you know, the

599
00:31:11,960 --> 00:31:15,160
difference between the people who have jobs in twenty years

600
00:31:15,440 --> 00:31:19,039
and those who don't will be the people who understand that

601
00:31:19,039 --> 00:31:22,160
that's the value that they add. Kent Beck wrote

602
00:31:22,160 --> 00:31:26,240
an article on this on his sub stack where he said,

603
00:31:26,880 --> 00:31:31,440
I've just realized that the value of ninety percent of

604
00:31:31,480 --> 00:31:34,640
my skills has gone to zero and the value of

605
00:31:34,640 --> 00:31:38,759
the remaining ten percent has gone up a thousand x. The job

606
00:31:39,000 --> 00:31:41,680
I have now is working out what that ten percent is.

607
00:31:42,000 --> 00:31:43,319
Speaker 1: Yeah, right, that's good.

608
00:31:43,720 --> 00:31:43,920
Speaker 4: Yeah.

609
00:31:43,960 --> 00:31:46,839
Speaker 1: You're talking about Kent Beck, the father of extreme programming

610
00:31:46,839 --> 00:31:47,240
and all that.

611
00:31:47,440 --> 00:31:51,079
Speaker 4: Yes, yeah, yeah, yeah. He came to YOW! in Australia

612
00:31:51,119 --> 00:31:54,680
and did the keynote, or the locknote,

613
00:31:54,680 --> 00:31:57,000
there and he talked about it, and it's yeah, it's

614
00:31:57,039 --> 00:31:59,480
it's I think it's really powerful because you're you're right,

615
00:31:59,519 --> 00:32:03,079
Carl, all of those things around understanding the customer. This

616
00:32:03,279 --> 00:32:06,359
is one of the things early in my career. I

617
00:32:06,440 --> 00:32:09,359
was really really lucky in that I worked for a company.

618
00:32:09,599 --> 00:32:13,200
They manufactured fireplaces. It doesn't necessarily matter what they did,

619
00:32:13,640 --> 00:32:16,920
but I had a great boss at the time, and

620
00:32:16,960 --> 00:32:19,680
what he would do is every time I had to

621
00:32:19,759 --> 00:32:24,319
write code to help somebody with their job, he made

622
00:32:24,319 --> 00:32:27,200
me go do their job for a day or so.

623
00:32:27,200 --> 00:32:31,880
So if I was writing the warehouse management software,

624
00:32:32,079 --> 00:32:34,880
I would be in my overalls in the warehouse doing

625
00:32:34,880 --> 00:32:35,519
a stocktake.

626
00:32:35,559 --> 00:32:37,799
Speaker 1: Yeah. You had to go out and feel the pain.

627
00:32:37,920 --> 00:32:42,720
Speaker 4: Exactly and empathize and understand the users. And AI is

628
00:32:42,759 --> 00:32:46,039
never gonna do that, you know, it can't, at least

629
00:32:46,039 --> 00:32:49,319
anytime soon. And I think those are the pivotal

630
00:32:49,440 --> 00:32:52,720
moments in people's careers that help you really understand that

631
00:32:52,759 --> 00:32:56,559
the job was never about producing the code. The job

632
00:32:56,599 --> 00:32:59,799
is about understanding the problems that need to be solved.

633
00:33:00,240 --> 00:33:02,480
Software can help solve those problems.

634
00:33:02,680 --> 00:33:06,559
Speaker 3: Yeah, sure, yeah, you're right. We've been through this before. Yeah,

635
00:33:06,599 --> 00:33:08,480
exactly. Like, should we be writing our own garbage

636
00:33:08,480 --> 00:33:11,759
collector or encryption libraries? You know, the

637
00:33:11,839 --> 00:33:13,799
pieces of our job have been peeled off for a while.

638
00:33:13,839 --> 00:33:15,119
This just seems like a big one.

639
00:33:15,200 --> 00:33:18,839
Speaker 1: How does this manifest itself in the leadership role, something

640
00:33:18,880 --> 00:33:19,839
that you're really into.

641
00:33:20,240 --> 00:33:22,720
Speaker 4: That's a really interesting one because you know, if you've

642
00:33:22,720 --> 00:33:27,519
got a senior engineer who is resisting this transition, then

643
00:33:27,559 --> 00:33:31,160
it's it's you know, a problem for them. If you've

644
00:33:31,200 --> 00:33:35,480
got a CTO who's resisting this transition, then it's a

645
00:33:35,519 --> 00:33:39,240
problem for the whole company. Yeah, and it becomes existential

646
00:33:39,720 --> 00:33:43,400
for whichever organization you're in. And you know, that's the

647
00:33:43,519 --> 00:33:45,599
thing that I'm worried the most about is you know,

648
00:33:45,640 --> 00:33:48,200
if you don't have those leaders who can kind of

649
00:33:48,359 --> 00:33:51,559
champion this change. And I'm not talking about it. You know,

650
00:33:51,640 --> 00:33:55,799
I'd say I'm an AI moderate in that

651
00:33:55,880 --> 00:33:57,759
you know, I'm not an AI optimist and I'm not

652
00:33:57,799 --> 00:34:00,640
an AI pessimist. I'm more of an AI realist. And

653
00:34:00,720 --> 00:34:04,279
you know, I'm not banging the drum of, you know,

654
00:34:04,359 --> 00:34:06,160
this is the best thing that ever happened, and you've

655
00:34:06,160 --> 00:34:08,480
got to jump on the bandwagon. Or you know, you're

656
00:34:08,480 --> 00:34:11,119
going to lose your jobs. But like we said earlier,

657
00:34:11,159 --> 00:34:13,079
the genie is out of the bottle, and it's it's

658
00:34:13,159 --> 00:34:16,960
less about, you know, what's the existential threat to

659
00:34:17,000 --> 00:34:19,239
the industry and more about what can you do in

660
00:34:19,280 --> 00:34:21,760
the next few years to make sure you're you know,

661
00:34:22,400 --> 00:34:24,880
keeping up with what's happening and dealing with the industry.

662
00:34:24,920 --> 00:34:28,280
Speaker 1: How can we use it smartly, exactly right, and not

663
00:34:28,599 --> 00:34:29,800
just go off the deep end.

664
00:34:30,039 --> 00:34:32,360
Speaker 4: Yeah, And you know, one of the mistakes I've seen

665
00:34:32,800 --> 00:34:35,559
some leaders make in this area is like, okay, well,

666
00:34:35,599 --> 00:34:38,239
let's do a research project. Let's you know, let's spend

667
00:34:38,280 --> 00:34:40,840
six months finding out where, you know, what we can

668
00:34:40,960 --> 00:34:43,199
use it for and how we can use it. And

669
00:34:43,239 --> 00:34:44,760
then they get to the end of those six months

670
00:34:44,760 --> 00:34:47,559
and they decided what their AI policy is, and then

671
00:34:47,599 --> 00:34:49,920
they realized that they were making decisions based on

672
00:34:49,960 --> 00:34:51,679
where AI was six months ago.

673
00:34:52,159 --> 00:34:55,760
Speaker 3: Right, Yeah, So yeah, I mean I saw folks do

674
00:34:55,840 --> 00:34:58,280
that with mobile back in the day as well, right,

675
00:34:58,360 --> 00:35:00,760
And whatever tools were in front of you

676
00:35:00,800 --> 00:35:02,880
at the time, especially if you think about two thousand

677
00:35:02,920 --> 00:35:05,840
and nine, twenty ten, the early days of mobile, you know,

678
00:35:05,880 --> 00:35:09,159
the iPhone, if you took those six months, by the

679
00:35:09,159 --> 00:35:10,519
time you come out the other side of it, everything

680
00:35:10,519 --> 00:35:12,000
you're doing is a waste of time. The tools are

681
00:35:12,000 --> 00:35:12,639
totally changed.

682
00:35:13,039 --> 00:35:15,679
Speaker 4: Yeah, yeah, yeah, I think the way you have to

683
00:35:15,719 --> 00:35:19,119
think about it is kind of what we said earlier,

684
00:35:19,159 --> 00:35:22,320
which is you're responsible for the code that ends up

685
00:35:22,360 --> 00:35:24,400
in the code base. Sure, you know, it doesn't matter

686
00:35:24,400 --> 00:35:26,840
how that code ended up in the code base. You

687
00:35:26,880 --> 00:35:31,440
could have handcrafted it while drinking a Zinfandel, or you

688
00:35:31,480 --> 00:35:34,920
could have copied it from stack overflow, or an AI

689
00:35:35,079 --> 00:35:37,519
could have generated it, or you asked your you know,

690
00:35:37,599 --> 00:35:38,920
your junior engineer.

691
00:35:38,519 --> 00:35:39,000
Speaker 3: To write it.

692
00:35:39,199 --> 00:35:42,039
Speaker 4: Whatever the reason is, the code ended up in your

693
00:35:42,079 --> 00:35:45,519
code base, you're still responsible and accountable for it. Yeah,

694
00:35:45,559 --> 00:35:47,800
and you know, I think focusing on that side of

695
00:35:47,840 --> 00:35:52,400
things and building quality controls into you know, into how

696
00:35:52,440 --> 00:35:55,199
you generate the code and what ends up in it. That's

697
00:35:55,239 --> 00:35:58,760
the answer. Not these big research projects that you know,

698
00:35:58,800 --> 00:36:00,360
were just rapidly out of date.

699
00:36:00,840 --> 00:36:04,360
Speaker 3: Are you finding the team, like the junior people, are okay,

700
00:36:04,360 --> 00:36:07,280
they're embracing the newer tools fine? Like we've always

701
00:36:07,280 --> 00:36:09,159
been talking about this is gonna wipe out juniors.

702
00:36:09,280 --> 00:36:13,440
Speaker 4: Yeah. So I think there's an interesting, interesting debate in

703
00:36:13,480 --> 00:36:17,280
this and I don't know where I kind of fall

704
00:36:17,360 --> 00:36:20,159
down on this line because there's there's two there's two

705
00:36:20,159 --> 00:36:22,119
ways you could look at that problem. You could look

706
00:36:22,159 --> 00:36:25,119
at the problem and go, okay, well, you know,

707
00:36:25,559 --> 00:36:29,320
juniors are irrelevant because the AI is just the junior.

708
00:36:30,000 --> 00:36:31,440
Or you could look at it from

709
00:36:31,440 --> 00:36:33,840
the other lens and you could go, well, actually, we

710
00:36:33,880 --> 00:36:37,079
don't need senior engineers anymore because a junior with AI

711
00:36:37,519 --> 00:36:40,320
is a senior engineer. And I think, I think it'll

712
00:36:40,360 --> 00:36:42,880
be interesting to see where we end up there. But

713
00:36:43,400 --> 00:36:47,159
I'm I'm seeing a lot of juniors just jumping on this,

714
00:36:47,480 --> 00:36:50,000
you know, with two hands and just just grasping it

715
00:36:50,039 --> 00:36:54,559
and running circles around the seniors. In terms of code output.

716
00:36:55,079 --> 00:36:59,280
Now again, you can ask the question, yes, exactly, exactly,

717
00:36:59,519 --> 00:37:02,519
and and I think that's that's the you know, that's

718
00:37:02,559 --> 00:37:05,760
the thing that is the differentiator. Is a senior that

719
00:37:05,880 --> 00:37:08,239
knows how to use these tools is going to produce

720
00:37:08,400 --> 00:37:12,119
so much better outcomes, while a junior, you know, is

721
00:37:12,159 --> 00:37:14,039
probably going to produce a lot more code.

722
00:37:14,800 --> 00:37:16,880
Speaker 1: Well, this seems like a good place to take a break,

723
00:37:16,960 --> 00:37:19,519
So we'll be right back after these very important messages.

724
00:37:21,719 --> 00:37:25,079
Hey Carl here. You probably know text Control is a

725
00:37:25,119 --> 00:37:29,639
powerful library for document editing and PDF generation, but did

726
00:37:29,679 --> 00:37:32,840
you know they're also a strong supporter of the developer community.

727
00:37:33,280 --> 00:37:35,719
It's part of their mission to build and support a

728
00:37:35,760 --> 00:37:40,199
strong developer community by being present, listening to users, and

729
00:37:40,280 --> 00:37:43,960
sharing knowledge at conferences across Europe and the United States.

730
00:37:44,360 --> 00:37:47,519
So if you're heading to a conference soon, check if

731
00:37:47,559 --> 00:37:50,599
text Control will be there and stop by to say hi.

732
00:37:50,920 --> 00:37:55,079
You can find their full conference calendar at dubdubdub dot

733
00:37:55,199 --> 00:37:58,400
textcontrol dot com and make sure you thank them for

734
00:37:58,440 --> 00:38:07,719
supporting dot NetRocks. And we're back. It's dot net rocks.

735
00:38:07,760 --> 00:38:10,639
I'm Carl Franklin. That's my friend Richard Campbell, and that's

736
00:38:10,679 --> 00:38:13,800
our friend Andrew Murphy. And we're talking about AI leadership

737
00:38:13,840 --> 00:38:16,840
and all that goes with it, the challenges, all the

738
00:38:16,880 --> 00:38:21,559
fun things, the thrill of victory and the agony of defeat.

739
00:38:22,840 --> 00:38:25,039
Speaker 3: Well, I was also from there's plenty of senior people

740
00:38:25,039 --> 00:38:27,079
that have changed tools enough time they're like, oh, we're

741
00:38:27,119 --> 00:38:29,360
changing tools again, and they just dive in. Yeah. But

742
00:38:29,400 --> 00:38:31,159
there's a certain group of people that are resisting, and

743
00:38:31,199 --> 00:38:32,559
I'm trying to figure out who they are.

744
00:38:33,280 --> 00:38:39,199
Speaker 4: I genuinely think that they're the people like me who

745
00:38:39,760 --> 00:38:42,760
didn't make a transition into leadership. And what I mean

746
00:38:42,800 --> 00:38:45,360
by that is like, I'm a nerd and a geek.

747
00:38:45,639 --> 00:38:48,199
I learned to code on a ZX Spectrum. My dad

748
00:38:48,360 --> 00:38:52,679
taught me how to code. I've been coding since I

749
00:38:52,719 --> 00:38:55,480
was eight years old, and I just love it so much.

750
00:38:56,280 --> 00:38:58,480
But you know, I was kind of thrust into this

751
00:38:58,880 --> 00:39:02,599
leadership role without wanting it. If I hadn't done that,

752
00:39:03,079 --> 00:39:06,159
I could see myself being one of those people because

753
00:39:06,199 --> 00:39:09,480
I just love coding so much. Like I get a

754
00:39:09,559 --> 00:39:12,559
huge amount of joy in the creation aspect of it

755
00:39:12,639 --> 00:39:16,480
and the craft aspect of it. So I have I

756
00:39:16,559 --> 00:39:20,480
have empathy for those people because you know, in another life,

757
00:39:20,559 --> 00:39:23,360
I would have been them. But you know, it's it's

758
00:39:23,440 --> 00:39:28,840
kind of similar to other skills that have have disappeared,

759
00:39:29,159 --> 00:39:33,719
you know throughout the years. Weavers. You know, we know,

760
00:39:33,960 --> 00:39:37,360
not many people weave by hand anymore. We use looms

761
00:39:37,400 --> 00:39:40,440
to do it, and then you know, we use automated looms.

762
00:39:40,519 --> 00:39:43,880
Like people still do that stuff for fun because it

763
00:39:44,000 --> 00:39:48,679
is insanely enjoyable, but you know it's not it's you

764
00:39:48,719 --> 00:39:51,519
can't build an industry off doing it by hand anymore,

765
00:39:51,840 --> 00:39:55,039
you know, making a living probably is tough, yeah, exactly,

766
00:39:55,239 --> 00:39:57,960
And if you do, it's going to be some hugely

767
00:39:58,039 --> 00:40:02,199
specific niche thing where people are willing to pay for

768
00:40:02,920 --> 00:40:05,119
you know, the fact that this has had human hands

769
00:40:05,119 --> 00:40:07,880
on it. And there might be something like that in

770
00:40:07,880 --> 00:40:11,039
our industry. Like I could see, you know, a certain

771
00:40:11,039 --> 00:40:14,039
group of people going, let's handcraft this thing for fun,

772
00:40:14,440 --> 00:40:16,920
just like people go, let's assemble a car for fun.

773
00:40:17,000 --> 00:40:19,719
Like it's it's you know, it's a hobby.

774
00:40:19,800 --> 00:40:20,840
Speaker 3: It's going to be a niche.

775
00:40:20,960 --> 00:40:21,199
Speaker 4: Yeah.

776
00:40:21,440 --> 00:40:24,239
Speaker 3: I wonder if digital, hand crafted digital is going to

777
00:40:24,280 --> 00:40:27,480
be a thing as opposed to physical, Like that's certainly

778
00:40:27,519 --> 00:40:27,840
a thing.

779
00:40:28,000 --> 00:40:33,280
Speaker 4: Yeah, yeah, yeah, hand crafted digital? And then how do

780
00:40:33,280 --> 00:40:35,559
you prove it? Like that's the interesting thing, like how

781
00:40:35,599 --> 00:40:38,280
do you how do you prove that it wasn't generated

782
00:40:38,320 --> 00:40:40,679
by an AI? Yeah? Yeah, you have to make mistakes,

783
00:40:40,679 --> 00:40:42,920
you have to make typos. There has to not be

784
00:40:43,000 --> 00:40:44,159
comments in some areas.

785
00:40:44,519 --> 00:40:46,519
Speaker 3: There you go, well, your whole pull request can't be

786
00:40:46,519 --> 00:40:47,159
too coherent.

787
00:40:47,280 --> 00:40:49,079
Speaker 1: I don't know when, but I'm going to make a

788
00:40:49,119 --> 00:40:51,320
prediction and I think I have said this before that

789
00:40:52,000 --> 00:40:54,199
I don't know if it's five years, ten years, but

790
00:40:54,719 --> 00:40:59,519
pretty soon everybody's going to be sick of AI generated

791
00:40:59,559 --> 00:41:02,480
content and talking to bots and all that stuff, and

792
00:41:02,519 --> 00:41:06,719
there's going to be a revolution in human to human

793
00:41:06,880 --> 00:41:12,320
contact activity. The arts, going to see real people perform,

794
00:41:12,800 --> 00:41:17,320
whether it's dance or music or you know whatever. And

795
00:41:17,400 --> 00:41:19,920
I think that that's that's going to be revolutionary. I

796
00:41:19,920 --> 00:41:21,880
don't know when it will happen, but I think that's

797
00:41:21,920 --> 00:41:26,480
the next turnover event that will happen after all this.

798
00:41:26,639 --> 00:41:30,360
Speaker 4: I've been thinking of something similar about SaaS as a

799
00:41:30,360 --> 00:41:34,000
as an industry. You know, the when when the cost

800
00:41:34,119 --> 00:41:38,159
to produce SaaS reduces almost to zero, SaaS as an

801
00:41:38,239 --> 00:41:41,559
industry is going to really struggle. You know, when you've

802
00:41:41,559 --> 00:41:44,360
got companies that are paying thousands, tens of thousands of

803
00:41:44,360 --> 00:41:47,800
dollars a month for Trello, and you know, you give

804
00:41:47,840 --> 00:41:50,440
an engineer a weekend and they can basically build Trello.

805
00:41:51,360 --> 00:41:54,519
What's going to be the differentiator? I think it's going

806
00:41:54,599 --> 00:41:57,800
to be taste. Taste is is going to be the

807
00:41:57,840 --> 00:42:00,559
new features. You don't You don't choose a piece of

808
00:42:00,599 --> 00:42:04,119
software because it can do something. You choose a piece

809
00:42:04,119 --> 00:42:07,239
of software because it feels a certain way. It acts

810
00:42:07,280 --> 00:42:10,360
a certain way and it kind of fits your brain better.

811
00:42:10,559 --> 00:42:13,039
This is kind of similar to the proliferation of like

812
00:42:13,119 --> 00:42:15,599
to do managers and that kind of stuff. You know,

813
00:42:15,639 --> 00:42:18,320
there's a thousand to do lists, but the one you

814
00:42:18,400 --> 00:42:20,639
choose is not the one that has the features you want.

815
00:42:20,679 --> 00:42:22,400
It's the one that kind of fits the way your

816
00:42:22,400 --> 00:42:23,039
brain works.

817
00:42:23,239 --> 00:42:25,920
Speaker 3: Yeah, well, any rate, not everybody has a strong opinion

818
00:42:25,920 --> 00:42:28,880
about all things. Like we're seeing with tools like Clawdbot

819
00:42:28,960 --> 00:42:31,800
and the like, the emergence of this idea of surrounding

820
00:42:31,800 --> 00:42:35,400
yourself with custom software that there is no there is

821
00:42:35,440 --> 00:42:38,679
no definitive software per se. Everything will be unique to you.

822
00:42:38,719 --> 00:42:40,960
But most people don't have that strong opinions on that

823
00:42:41,000 --> 00:42:44,599
many things. So, you know, the same way in theory,

824
00:42:44,639 --> 00:42:47,679
we could all make our own clothes or customize

825
00:42:47,719 --> 00:42:50,079
all our clothes, but people, most people don't care enough.

826
00:42:50,400 --> 00:42:54,480
We look to these influencers and these thought leaders in

827
00:42:54,519 --> 00:42:56,400
that space. I hate that phrase, but you know what

828
00:42:56,400 --> 00:43:00,519
I mean. Yeah, there will be a taste element to this.

829
00:43:00,679 --> 00:43:01,920
Speaker 1: Did you guys see the movie Her?

830
00:43:02,119 --> 00:43:02,400
Speaker 3: Sure?

831
00:43:02,519 --> 00:43:05,239
Speaker 1: Yes, So what did you think the main message of

832
00:43:05,239 --> 00:43:08,400
that movie was? Was there a message or was it

833
00:43:08,480 --> 00:43:10,840
just a kind of enjoyable romp? I mean, it was

834
00:43:10,880 --> 00:43:15,440
a great exploration of a digital assistant and they you know,

835
00:43:15,480 --> 00:43:19,880
the you know they what's the term. Now, it's just

836
00:43:19,960 --> 00:43:23,199
jumped out of my head that that social stratum of

837
00:43:23,960 --> 00:43:27,880
you know, you projected your feelings onto a piece of technology, right,

838
00:43:28,000 --> 00:43:30,079
And it was kind of a it's not the same.

839
00:43:30,199 --> 00:43:32,760
It was kind of a cautionary tale, wasn't it? Yeah,

840
00:43:32,800 --> 00:43:36,320
And we didn't really have these people getting emotionally involved

841
00:43:36,320 --> 00:43:39,920
in chatbots. Maybe a little bit before that, but I

842
00:43:39,960 --> 00:43:43,440
mean the author of that movie, in that book really

843
00:43:44,000 --> 00:43:46,760
saw the future that wow, this is a problem and

844
00:43:46,880 --> 00:43:51,360
you know, this could could really hurt people. So I

845
00:43:51,400 --> 00:43:54,000
saw I'm just watching I don't know when I was

846
00:43:54,039 --> 00:43:57,119
watching the super Bowl or something or the Olympics, and

847
00:43:57,159 --> 00:44:00,199
there was an ad where this woman is woken up

848
00:44:00,400 --> 00:44:06,119
by her digital assistant, and she has this jovial kind

849
00:44:06,159 --> 00:44:11,039
of you know, rapport with it that you would have

850
00:44:11,199 --> 00:44:14,599
with your partner, and she's waking up alone, right, and

851
00:44:15,000 --> 00:44:20,119
it's celebrating that the AI as your you know, your

852
00:44:20,159 --> 00:44:24,880
your emotional and you know your emotional partner. Really and

853
00:44:24,920 --> 00:44:27,599
I just thought, wow, how far we've come, Like, you know,

854
00:44:27,679 --> 00:44:30,119
this is something that people are really looking forward to

855
00:44:30,920 --> 00:44:31,599
in there.

856
00:44:31,599 --> 00:44:33,400
Speaker 3: Is it? Or is it just the person who made that

857
00:44:33,519 --> 00:44:37,480
ad? Well, they wouldn't make it if people didn't respond

858
00:44:37,480 --> 00:44:41,079
to it, right, I not necessarily, Yeah, got a lot

859
00:44:41,119 --> 00:44:41,559
of money.

860
00:44:42,719 --> 00:44:45,559
Speaker 4: Do you know the last time so when you look

861
00:44:45,559 --> 00:44:48,199
at all the ads of the Super Bowl, AI dominated

862
00:44:48,239 --> 00:44:50,760
it a huge Yeah. Do you know the last time

863
00:44:51,119 --> 00:44:54,119
when technology dominated the ads of the Super Bowl

864
00:44:54,280 --> 00:45:00,599
was two thousand, right just before the dot-com boom. So.

865
00:45:00,639 --> 00:45:02,800
I think that that maybe tells you something about you know,

866
00:45:02,840 --> 00:45:06,440
they're just spending money trying to grab market share. I

867
00:45:06,480 --> 00:45:08,639
think you know that there's a whole heap. I don't

868
00:45:08,639 --> 00:45:10,599
want to get too much in the industry, it's being

869
00:45:10,599 --> 00:45:14,199
done to death. But there's so much in the where

870
00:45:14,360 --> 00:45:16,960
in where we are right now with the investment in AI,

871
00:45:17,440 --> 00:45:20,719
where it is one hundred percent a solution trying to

872
00:45:20,760 --> 00:45:24,159
find the problem. Like they just have this big hammer

873
00:45:24,199 --> 00:45:26,480
which is LLMs, and they're just hitting it everywhere.

874
00:45:26,760 --> 00:45:29,360
And that doesn't mean that there isn't a huge value

875
00:45:29,360 --> 00:45:32,119
in this stuff. There is And you know, I think

876
00:45:32,159 --> 00:45:36,000
one of the big problems people have in this kind

877
00:45:36,039 --> 00:45:39,559
of mental transition of how they utilize LLMs is they

878
00:45:39,639 --> 00:45:42,199
kind of fixate on the things it does badly rather

879
00:45:42,239 --> 00:45:44,960
than the things it does well. Because there's so many

880
00:45:45,000 --> 00:45:48,679
companies that are just using this LLM hammer in places

881
00:45:48,679 --> 00:45:49,760
it shouldn't be used.

882
00:45:49,639 --> 00:45:52,599
Speaker 3: Right, Yeah, everyone's desperate to care. You know. There's the

883
00:45:52,639 --> 00:45:54,679
line I use the other day when I was a

884
00:45:54,719 --> 00:45:57,159
guest on a show where there's only so many chances

885
00:45:57,719 --> 00:46:01,719
in your lifetime to become a billionaire, and this is

886
00:46:01,760 --> 00:46:04,639
one of them right now. And and so for a

887
00:46:04,639 --> 00:46:07,440
certain group of people, like this is your golden ring moment,

888
00:46:07,480 --> 00:46:11,960
and you will put aside arguably all ethics and morality

889
00:46:12,239 --> 00:46:15,559
for this chance, and you're seeing the consequences of it.

890
00:46:15,960 --> 00:46:18,519
I do want to get back to your post and this.

891
00:46:18,840 --> 00:46:22,519
You know, you're leading a team in the midst of

892
00:46:22,559 --> 00:46:27,119
this change. So what are I mean? Do you let

893
00:46:27,159 --> 00:46:28,800
go of the folks that aren't willing to embrace this

894
00:46:28,920 --> 00:46:30,800
like I gotta. I think there's teams that are doing that.

895
00:46:30,840 --> 00:46:33,039
It's like we're going this way and those are unhappy.

896
00:46:33,079 --> 00:46:35,639
There's the door. Yeah, I've just told there's a better way.

897
00:46:35,840 --> 00:46:38,199
Speaker 1: The guy in the back going, it'll never work

898
00:46:38,559 --> 00:46:40,599
and it'll never get off the ground.

899
00:46:41,159 --> 00:46:43,519
Speaker 3: Well, you've also got the folks who are just pointing

900
00:46:43,519 --> 00:46:46,559
out all the problems, which sounds like a kind of denial.

901
00:46:46,800 --> 00:46:47,960
Speaker 4: That's exactly what it is.

902
00:46:48,039 --> 00:46:50,440
Speaker 1: And you know, and but you want those people around,

903
00:46:50,440 --> 00:46:52,440
don't you think.

904
00:46:52,519 --> 00:46:56,199
Speaker 4: I think, you know, a certain amount of skepticism is valuable. Yeah,

905
00:46:57,280 --> 00:47:01,039
you want those people who are saying, maybe we shouldn't

906
00:47:01,159 --> 00:47:05,159
let the AI wholly run, you know, our entire code stack,

907
00:47:05,239 --> 00:47:07,480
and you know we're never going to manually review things

908
00:47:07,519 --> 00:47:10,199
like you want those people who are pushing back on it. Great,

909
00:47:10,880 --> 00:47:14,320
But you know, I think the issue becomes when people

910
00:47:15,119 --> 00:47:19,239
deny the trajectory of where it's heading. So I think,

911
00:47:19,320 --> 00:47:22,679
you know, talking about what it's good at and what

912
00:47:22,679 --> 00:47:25,599
it's bad at right now is exactly the conversations we

913
00:47:25,639 --> 00:47:29,280
should be having. But anybody that looks at the trajectory

914
00:47:29,360 --> 00:47:33,239
we've had over the last twelve months and goes, Okay, well,

915
00:47:33,280 --> 00:47:36,320
you know that's going to stop on you know, whatever

916
00:47:36,320 --> 00:47:38,360
today's date is, and it's never going to get better

917
00:47:38,440 --> 00:47:41,119
or it's going to get worse. I think those are

918
00:47:40,159 --> 00:47:44,960
the risky kind of opinions to have. And you know,

919
00:47:45,320 --> 00:47:47,239
I think it comes from the fact that it is

920
00:47:47,280 --> 00:47:49,119
bad at a whole bunch of stuff, and I think

921
00:47:49,159 --> 00:47:53,480
it comes from the fact that the models, the models

922
00:47:53,559 --> 00:47:57,280
have incrementally got better. So you know, we've we've had

923
00:47:57,280 --> 00:48:00,519
a bit of a diminishing of returns in new model versions.

924
00:48:00,760 --> 00:48:05,480
A GPT three versus two was just game changing, GPT

925
00:48:05,800 --> 00:48:10,480
four versus three was really you know, really massive. Yeah,

926
00:48:10,559 --> 00:48:15,920
but GPT five versus four is more. Yeah, it's incremental.

927
00:48:16,039 --> 00:48:18,719
It's it's it's meaningful. Sure, don't get me wrong, but

928
00:48:18,800 --> 00:48:19,840
it's not exponential.

929
00:48:19,920 --> 00:48:21,920
Speaker 3: But it's also a point where you realize, like the

930
00:48:22,039 --> 00:48:26,320
Moore's law doesn't apply here. There isn't an exponential additional

931
00:48:26,400 --> 00:48:30,119
data set available. We've kind of indexed everything, including things

932
00:48:30,119 --> 00:48:31,559
we probably shouldn't have indexed.

933
00:48:31,639 --> 00:48:34,760
Speaker 4: Yeah, and there is an exponential compute, right, you know,

934
00:48:35,079 --> 00:48:38,719
but what has got better has been the tooling around

935
00:48:38,800 --> 00:48:41,440
those models. Like this is this is something we're really

936
00:48:41,480 --> 00:48:44,880
good at as engineers. It's taking something that is pretty

937
00:48:45,039 --> 00:48:48,880
crappy and you know, putting guardrails and tooling around it

938
00:48:48,920 --> 00:48:52,840
to make it good. You know, we tricked rocks into thinking,

939
00:48:53,639 --> 00:48:56,400
and now we kind of make those rocks do slightly

940
00:48:56,440 --> 00:48:59,239
smarter things. I think that's that's what we're good at

941
00:48:59,239 --> 00:49:03,519
as engineers. We talked about it earlier of where we

942
00:49:03,639 --> 00:49:07,119
got to with IntelliSense and those toolings around it.

943
00:49:07,920 --> 00:49:12,079
Programming languages haven't got meaningfully better in the past

944
00:49:12,119 --> 00:49:15,519
twenty years. They have a little bit, you know, there's

945
00:49:15,639 --> 00:49:16,079
there's not.

946
00:49:16,400 --> 00:49:19,119
Speaker 3: I would resist that, Like I think the emergence of

947
00:49:19,159 --> 00:49:22,760
cloud-specific languages like Go, fair point, and even

948
00:49:22,800 --> 00:49:25,039
the emergence of Rust, like there is definitely a new

949
00:49:25,079 --> 00:49:28,119
generation of languages that are more tailored to the next generation

950
00:49:28,199 --> 00:49:29,039
of problems.

951
00:49:30,079 --> 00:49:32,000
Speaker 4: That yeah, fair point, fair point.

952
00:49:32,119 --> 00:49:34,119
Speaker 3: Yeah, but you know, but the reality is lots of

953
00:49:34,159 --> 00:49:36,719
people also haven't embraced those languages. Like adoption of a

954
00:49:36,760 --> 00:49:41,920
new language is slow. Yeah, yeah, C Sharp has continued to improve,

955
00:49:41,960 --> 00:49:44,719
but it's also a twenty six year old language.

956
00:49:46,039 --> 00:49:51,760
Speaker 4: Yeah. Though, that ability to iterate and improve on something

957
00:49:51,800 --> 00:49:54,840
that isn't isn't perfect is what we do as engineers,

958
00:49:55,119 --> 00:49:56,800
and that's what we're going to see over the next

959
00:49:56,800 --> 00:49:58,679
few years. Like I don't think we're going to see

960
00:49:59,119 --> 00:50:02,039
a landmark improvement in LLMs. But what we

961
00:50:02,079 --> 00:50:04,599
are going to do is see things like OpenClaw

962
00:50:04,760 --> 00:50:08,519
and you know, those those ways of using the models

963
00:50:08,599 --> 00:50:11,079
to be meaningfully better. And so I think that that

964
00:50:11,239 --> 00:50:15,559
denial of the trajectory is the problem, not not the

965
00:50:15,679 --> 00:50:19,599
kind of realism of where we are, and that those

966
00:50:20,119 --> 00:50:24,760
those people, it is, you know, on us as leaders to

967
00:50:24,840 --> 00:50:27,199
invest in those people and help them see and help

968
00:50:27,280 --> 00:50:30,440
them understand what's happening, which takes a lot of time.

969
00:50:30,719 --> 00:50:35,119
But those those people often tend to be sorry, I

970
00:50:35,159 --> 00:50:37,039
know you don't like that word, but those people tend

971
00:50:37,039 --> 00:50:41,440
to be the thought leaders in the team. Like, yeah,

972
00:50:40,440 --> 00:50:44,880
I joke about this in the workshops that I run

973
00:50:45,000 --> 00:50:47,440
in that you know, it's really easy as engineers for

974
00:50:47,519 --> 00:50:49,880
us to work out who the real leader of a

975
00:50:49,960 --> 00:50:52,840
team is. There's a really simple way to work out

976
00:50:52,840 --> 00:50:54,400
who the leaders in your team are. Do you know

977
00:50:54,440 --> 00:50:58,880
what it is? Break prod. Who does everybody turn their

978
00:50:58,920 --> 00:51:01,800
chairs around to look at when you break prod? Interesting. Those

979
00:51:01,840 --> 00:51:04,599
are the real leaders of the team, not the person

980
00:51:04,639 --> 00:51:07,679
with lead or manager in the title. Yeah, and those

981
00:51:07,719 --> 00:51:09,719
are the people that are often stuck in this place,

982
00:51:09,800 --> 00:51:12,880
and so they are worthwhile investing in.

983
00:51:13,079 --> 00:51:16,800
Speaker 3: Yeah, that's great. I appreciate that. And yeah,

984
00:51:17,519 --> 00:51:19,559
I also think we're nowhere near the end of this.

985
00:51:19,840 --> 00:51:21,840
Like I do feel like we're in this bubble and

986
00:51:21,880 --> 00:51:23,880
it's going to end. And a lot of folks that

987
00:51:23,880 --> 00:51:27,400
have been resisting are going to see, when the

988
00:51:27,480 --> 00:51:29,800
money strips away. Yeah, but I also feel like we're

989
00:51:29,840 --> 00:51:31,880
about to get more efficient too, because when the money

990
00:51:31,920 --> 00:51:34,840
goes away, people learn to be more efficient. They don't

991
00:51:34,960 --> 00:51:36,480
need to right now, so they're.

992
00:51:36,320 --> 00:51:39,159
Speaker 4: Not. And the hardware is going to get more efficient,

993
00:51:39,239 --> 00:51:40,800
you know, I think I think that's a big thing.

994
00:51:40,840 --> 00:51:46,599
Even if the models don't get appreciably, you know, exponentially better,

995
00:51:46,840 --> 00:51:50,480
what will happen is the ability to run those models

996
00:51:50,480 --> 00:51:54,159
on local hardware will become you know, just what we do.

997
00:51:54,239 --> 00:51:58,199
Speaker 1: That's where that's that's that's where my money is currently

998
00:51:58,480 --> 00:52:03,639
literally a new machine that I've set up to run local stuff.

999
00:52:03,719 --> 00:52:07,039
And I'm betting on the fact that local models in

1000
00:52:07,079 --> 00:52:09,480
the way that we can interact with them with our

1001
00:52:09,679 --> 00:52:13,920
software tools are really going to change be a game changer.

1002
00:52:14,119 --> 00:52:16,840
Speaker 4: Yeah, yeah, yeah, I think you're absolutely right. I think

1003
00:52:16,880 --> 00:52:19,039
that's going to be you know, the change of the

1004
00:52:19,079 --> 00:52:21,880
next five years is going to be the stuff that

1005
00:52:21,920 --> 00:52:24,760
we have to pay hundreds of dollars a month for. Like

1006
00:52:25,119 --> 00:52:28,519
my bill to Anthropic right now is about two thousand

1007
00:52:28,679 --> 00:52:31,239
US a month. Wow, you know, I could I could

1008
00:52:31,280 --> 00:52:34,400
build a piece of a piece of hardware that that

1009
00:52:34,400 --> 00:52:37,039
would, well, not in a year. Yeah, you know, so,

1010
00:52:37,320 --> 00:52:39,280
like I think we're going to get to the point

1011
00:52:39,360 --> 00:52:42,039
when you know, we're just we're just running this stuff

1012
00:52:42,440 --> 00:52:43,280
on machines.

1013
00:52:43,360 --> 00:52:47,280
Speaker 3: The ninety-six gig VRAM, that's the problem. The

1014
00:52:47,320 --> 00:52:51,199
fifty ninety RTX is about ten grand US. So in

1015
00:52:51,320 --> 00:52:54,159
less than six months of an anthropic bill, you've got

1016
00:52:54,280 --> 00:52:55,880
the mother of all cards.

1017
00:52:56,039 --> 00:52:58,599
Speaker 1: Well that's just now. I mean, if there's a bubble

1018
00:52:58,639 --> 00:53:02,360
that bursts in the you know, the AI providers, in

1019
00:53:02,400 --> 00:53:06,639
the LLM providers, the you know the you know what

1020
00:53:06,760 --> 00:53:11,360
is it, what did you call it, Richard? The all

1021
00:53:11,440 --> 00:53:14,559
you can eat model? Yes, yeah, the all you can

1022
00:53:14,559 --> 00:53:16,760
eat model for a certain number of dollars a month.

1023
00:53:16,760 --> 00:53:20,400
That isn't paying for its costs. Like we have not

1024
00:53:20,519 --> 00:53:23,719
paid the real cost yet. No, no, And when that happens,

1025
00:53:24,000 --> 00:53:27,639
you know, all this all these graphics cards are going

1026
00:53:27,679 --> 00:53:29,320
to be for sale on Craigslist.

1027
00:53:31,400 --> 00:53:33,079
Speaker 4: I have the I have the real evidence of that.

1028
00:53:33,199 --> 00:53:35,519
So that two thousand dollars that I spend a month,

1029
00:53:35,760 --> 00:53:38,599
so I for my own use, I have a well whatever.

1030
00:53:38,639 --> 00:53:42,239
The Claude Pro Max twenty X subscription is basically the

1031
00:53:42,239 --> 00:53:45,840
biggest one that Claude offered, and that's about three hundred

1032
00:53:45,960 --> 00:53:49,480
US a month. And then one of my engineers he

1033
00:53:49,599 --> 00:53:53,280
uses Zed as his editor, and Zed doesn't allow you

1034
00:53:53,360 --> 00:53:56,360
to use it at all, or Anthropic doesn't like it, well,

1035
00:53:56,400 --> 00:54:00,360
whatever it is, basically you cannot use the subscription with Zed.

1036
00:54:00,800 --> 00:54:03,719
You have to use a Claude API key, and so the

1037
00:54:03,760 --> 00:54:09,320
other seventeen hundred a month is his tokens through the raw API. Yeah,

1038
00:54:09,400 --> 00:54:12,679
so that's the difference between the two. It's literally a

1039
00:54:12,800 --> 00:54:14,239
five x difference.

1040
00:54:14,760 --> 00:54:15,000
Speaker 1: Yeah.

1041
00:54:15,480 --> 00:54:18,599
Speaker 3: And you know the thing is a grand a month.

1042
00:54:19,119 --> 00:54:23,360
That's still cheaper than an intern developer, exactly. It's

1043
00:54:23,400 --> 00:54:26,760
not actually that much money if you're producing real value,

1044
00:54:26,880 --> 00:54:28,320
producing real results from.

1045
00:54:28,119 --> 00:54:30,440
Speaker 4: It, exactly, And that's that's the way to think about it.

1046
00:54:30,519 --> 00:54:32,400
Speaker 1: Yeah, it's six months. You could build your own machine

1047
00:54:32,400 --> 00:54:34,760
with that. That would be pretty rock and sweet.

1048
00:54:34,440 --> 00:54:36,639
Speaker 4: Possibly, but you couldn't. So you know where we are

1049
00:54:36,719 --> 00:54:39,599
right now with the models. You couldn't run Claude Opus

1050
00:54:39,639 --> 00:54:43,199
four point six on that hardware. Like, to run that

1051
00:54:43,320 --> 00:54:46,639
you need multiple you know, a six hundreds or whatever

1052
00:54:46,639 --> 00:54:49,639
they are, like, you can't two hundreds. Yeah, yeah, but.

1053
00:54:49,639 --> 00:54:51,599
Speaker 1: You can run DeepSeek and you can run Ollama,

1054
00:54:51,639 --> 00:54:53,239
and there are things that you can run, and then

1055
00:54:53,280 --> 00:54:56,960
there's RAG, and you could teach it stuff, and LoRA.

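[Editor's note] For listeners who want to try what Carl and the guests are describing here, a minimal sketch of talking to a locally running model through Ollama's REST API. The endpoint and the `stream` flag are Ollama's documented defaults; the model name and prompt are just placeholders, and this assumes you have already run `ollama serve` and pulled a model.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: a stock `ollama serve` install).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON reply instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to a locally running Ollama server and return its text reply."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (needs a model pulled first, e.g. `ollama pull llama3.2`):
#   print(ask_local_model("llama3.2", "Summarize RAG in one sentence."))
```

The same pattern works for any model Ollama can serve, which is what makes the "local stuff" bet cheap to experiment with.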
1056
00:54:57,280 --> 00:54:59,360
Speaker 3: Well, and that's the other side of this is when

1057
00:54:59,400 --> 00:55:02,360
the when the engineers actually focus on efficiency and we

1058
00:55:02,400 --> 00:55:05,320
start especially we're talking about software development, like narrowing the

1059
00:55:05,400 --> 00:55:08,719
scope of these models down to the job that we're

1060
00:55:08,719 --> 00:55:11,679
doing rather than the generalized models.

1061
00:55:11,880 --> 00:55:13,400
Speaker 4: Yeah, yeah, yeah.

1062
00:55:13,719 --> 00:55:17,639
Speaker 1: That's why these these AI agents and the stuff like

1063
00:55:18,039 --> 00:55:21,400
squad is great because each of these team members can

1064
00:55:21,480 --> 00:55:24,000
use their own models for the stuff that they're good at,

1065
00:55:25,000 --> 00:55:27,320
you know, their own meaning a different model.

1066
00:55:27,280 --> 00:55:29,679
Speaker 4: I do before. You know, I'm aware of where we

1067
00:55:29,719 --> 00:55:31,719
are in the recording of this, and I want to

1068
00:55:31,719 --> 00:55:36,119
make sure we touch on one thing that's really important

1069
00:55:36,159 --> 00:55:38,960
to me about this, this kind of stages of grief

1070
00:55:39,079 --> 00:55:44,320
model that I discussed, which is the depression stage, because

1071
00:55:44,599 --> 00:55:47,639
I think it's it's really real and we don't talk

1072
00:55:47,639 --> 00:55:51,440
about it enough. Like you know that that senior engineer

1073
00:55:51,519 --> 00:55:54,960
which we kind of you know, joked about earlier, they

1074
00:55:54,960 --> 00:56:00,840
probably are going through some kind of existential depression around

1075
00:56:00,920 --> 00:56:03,599
what it means to be an engineer, and you know,

1076
00:56:03,639 --> 00:56:07,239
they don't talk about it at stand up, they don't

1077
00:56:07,320 --> 00:56:10,559
put it in their slack status, you know, mourning my career,

1078
00:56:10,719 --> 00:56:14,719
like they don't, they don't discuss this stuff, but I

1079
00:56:14,760 --> 00:56:18,719
think it is important, especially as leaders, to talk about

1080
00:56:18,719 --> 00:56:22,760
it with them because I'm feeling I'm feeling the same way.

1081
00:56:22,800 --> 00:56:25,599
I'm sure lots of people are, but we're just you know,

1082
00:56:25,800 --> 00:56:28,480
we get so excited or at least I do, about

1083
00:56:28,480 --> 00:56:31,519
the possibilities of this that we kind of push down

1084
00:56:31,559 --> 00:56:36,320
that feeling of you know, depression and grief and mourning

1085
00:56:36,519 --> 00:56:38,800
and don't acknowledge that it's there.

1086
00:56:39,079 --> 00:56:42,639
Speaker 3: That's fair. Yeah, yeah, I mean nobody has ten years

1087
00:56:42,719 --> 00:56:47,800
experience with Claude. Yeah, as you know has happened repeatedly.

1088
00:56:48,960 --> 00:56:52,039
We get new tools or we get new libraries, and

1089
00:56:52,079 --> 00:56:54,320
we're all beginners at it, and so part of that

1090
00:56:54,480 --> 00:56:56,239
is just like can you get up to speed? Is

1091
00:56:56,280 --> 00:56:59,440
it worthwhile for you? But I do, you know, get

1092
00:56:59,519 --> 00:57:01,159
the point, and you wrote about that as well.

1093
00:57:01,199 --> 00:57:03,199
It's like there was a thing you used to do

1094
00:57:03,960 --> 00:57:07,039
that the shape of these new tools means it really

1095
00:57:07,079 --> 00:57:08,920
doesn't make a lot of sense to do that anymore.

1096
00:57:09,639 --> 00:57:12,079
And that's and that's what you're missing. That's that's the

1097
00:57:12,159 --> 00:57:13,400
main thing that makes you sad.

1098
00:57:13,400 --> 00:57:16,920
Speaker 4: And if you're feeling that, it's normal, you know, you're

1099
00:57:17,000 --> 00:57:21,639
you're not broken. Your career isn't over, you know, it's

1100
00:57:21,360 --> 00:57:25,079
it is just a transition and a phase, and there

1101
00:57:25,199 --> 00:57:27,280
is light at the other side. There is a career

1102
00:57:27,760 --> 00:57:29,679
at the other side. It's going to be a career

1103
00:57:29,719 --> 00:57:33,159
that shapes differently. The things that you spend your day

1104
00:57:33,280 --> 00:57:36,159
doing are going to be different. But there is there

1105
00:57:36,239 --> 00:57:38,280
is a career there. Like we we've we've seen this

1106
00:57:38,519 --> 00:57:43,679
with you know, rapid application development stuff. We've seen it

1107
00:57:43,719 --> 00:57:46,519
with Excel, we've seen it with, you know, low code,

1108
00:57:46,559 --> 00:57:47,679
no code stuff.

1109
00:57:47,840 --> 00:57:51,079
Speaker 1: I'm reminded of the book Who Moved My Cheese? You

1110
00:57:51,119 --> 00:57:51,760
remember that book?

1111
00:57:51,800 --> 00:57:52,880
Speaker 4: I've not read that. What's that?

1112
00:57:52,920 --> 00:57:55,039
Speaker 1: It's a little booklet. It's just about you know, you're

1113
00:57:55,079 --> 00:57:57,199
going along and the mouse is eating the cheese that

1114
00:57:57,320 --> 00:57:58,920
shows up in the same place every day, and then

1115
00:57:58,920 --> 00:58:02,800
they move the cheese. Oh my god, somebody moved the cheese, right,

1116
00:58:03,639 --> 00:58:05,719
So you know, it's about having to find your way

1117
00:58:05,760 --> 00:58:06,639
to the next cheese.

1118
00:58:06,960 --> 00:58:09,280
Speaker 4: Yeah, yeah, yeah. And I think you know, those of

1119
00:58:09,360 --> 00:58:12,239
us who have been in the industry a while know

1120
00:58:12,440 --> 00:58:14,480
that it is it is just part of what you do.

1121
00:58:14,519 --> 00:58:14,719
Speaker 3: You know.

1122
00:58:14,760 --> 00:58:19,159
Speaker 4: I don't write VB six anymore. I don't write ColdFusion anymore.

1123
00:58:19,280 --> 00:58:21,719
I I you know, I have I have moved on.

1124
00:58:22,000 --> 00:58:26,199
Who would have anticipated that we'd be writing C Sharp

1125
00:58:26,320 --> 00:58:30,079
that actually runs in the browser when C Sharp came

1126
00:58:30,119 --> 00:58:32,960
out twenty six years ago. We really never would have envisioned it.

1127
00:58:33,280 --> 00:58:37,320
Speaker 1: Yeah, well, what's next for you? Andrew? What's in your inbox?

1128
00:58:37,880 --> 00:58:40,559
Speaker 4: So I had I had another baby ten months ago,

1129
00:58:40,599 --> 00:58:43,360
so not a lot. Congrats! Thank you! Wow!

1130
00:58:43,440 --> 00:58:44,400
Speaker 3: No, your plate's full, friend!

1131
00:58:44,480 --> 00:58:50,440
Speaker 4: Yeah, yeah, yeah. So the family dial is definitely

1132
00:58:50,519 --> 00:58:52,559
turned up to eleven right now, which is kind of

1133
00:58:52,840 --> 00:58:55,199
you know, taking everything out of other places. But no,

1134
00:58:55,280 --> 00:58:59,000
I'm I'm trying to write a bit more. I'm trying

1135
00:58:59,039 --> 00:59:01,320
to, I'm trying to do blog posts like I did.

1136
00:59:01,360 --> 00:59:03,360
I did this blog post. I did one a couple

1137
00:59:03,440 --> 00:59:07,840
of months ago called Embrace the Suck, which is basically

1138
00:59:08,519 --> 00:59:11,800
all about that transition to leadership and how you know,

1139
00:59:11,880 --> 00:59:15,239
a lot of being a leader sucks, and you know

1140
00:59:15,400 --> 00:59:17,960
if you if you try and not do the sucky things,

1141
00:59:18,039 --> 00:59:20,320
then you're not being the leader you should be and

1142
00:59:20,719 --> 00:59:22,519
need to be for your team. So I'm trying to

1143
00:59:22,519 --> 00:59:25,119
do more of that kind of stuff, which is more

1144
00:59:25,239 --> 00:59:28,039
authentic to my tone of voice and the way that

1145
00:59:28,079 --> 00:59:30,320
I think about things which is hitting you in the

1146
00:59:30,360 --> 00:59:32,719
face with the truth and then you know, helping you

1147
00:59:32,760 --> 00:59:35,079
see why it's not as bad as it as it seems.

1148
00:59:35,320 --> 00:59:37,679
Speaker 1: That's great. Well, Andrew, thank you for spending this hour

1149
00:59:37,719 --> 00:59:40,159
with us. It's great to hear your ideas.

1150
00:59:40,280 --> 00:59:41,480
It's been a great conversation.

1151
00:59:41,599 --> 00:59:42,480
Speaker 3: We should keep it going.

1152
00:59:42,880 --> 00:59:44,320
Speaker 4: Thank you both. I had a lot of fun.

1153
00:59:44,679 --> 01:00:06,840
Speaker 5: All right, we'll talk to you next time on dot NetRocks.

1154
01:00:08,119 --> 01:00:10,800
Speaker 1: Dot NetRocks is brought to you by Franklin's Net

1155
01:00:10,920 --> 01:00:14,880
and produced by PWOP Studios, a full service audio, video

1156
01:00:14,960 --> 01:00:19,039
and post production facility located physically in New London, Connecticut,

1157
01:00:19,280 --> 01:00:24,079
and of course in the cloud online at pwop dot com.

1158
01:00:24,280 --> 01:00:26,400
Visit our website at D O T N E T

1159
01:00:26,639 --> 01:00:30,679
R O C K S dot com for RSS feeds, downloads,

1160
01:00:30,800 --> 01:00:34,480
mobile apps, comments, and access to the full archives going

1161
01:00:34,519 --> 01:00:37,920
back to show number one, recorded in September two thousand

1162
01:00:37,960 --> 01:00:40,599
and two. And make sure you check out our sponsors.

1163
01:00:40,760 --> 01:00:43,719
They keep us in business. Now go write some code,

1164
01:00:44,119 --> 01:00:44,880
see you next time.

1165
01:00:45,800 --> 01:01:00,960
Speaker 4: You got Javans and the

