1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to dot net rocks with

2
00:00:03,040 --> 00:00:07,879
no ads? Easy? Become a patron for just five dollars

3
00:00:07,919 --> 00:00:10,800
a month. You get access to a private RSS feed

4
00:00:10,839 --> 00:00:14,279
where all the shows have no ads. Twenty dollars a month.

5
00:00:14,279 --> 00:00:16,920
We'll get you that and a special dot net Rocks

6
00:00:16,960 --> 00:00:21,000
patron mug. Sign up now at Patreon dot dotnetrocks

7
00:00:21,120 --> 00:00:36,719
dot com. Hey you welcome back to dot net rocks.

8
00:00:36,719 --> 00:00:41,280
Carl Franklin and I'm Richard Campbell. Yeah, so I said

9
00:00:41,320 --> 00:00:44,479
it for me because I just like messing up the pronouns.

10
00:00:44,600 --> 00:00:45,240
Speaker 2: I don't know why.

11
00:00:46,640 --> 00:00:49,399
Speaker 1: Yeah. So we're talking about AI. Mark Seemann is here.

12
00:00:49,560 --> 00:00:52,119
We'll get to him in a minute, but first I

13
00:00:52,200 --> 00:00:55,000
have a related better know a framework.

14
00:00:55,159 --> 00:01:04,359
Speaker 2: Awesome. All right, what do you got?

15
00:01:05,120 --> 00:01:10,359
Speaker 1: Ezra Klein is a New York Times columnist. He does

16
00:01:10,400 --> 00:01:14,840
a great podcast called The Ezra Klein Show. Yeah, I highly recommend it.

17
00:01:14,959 --> 00:01:16,719
Speaker 2: I don't know about the naming strategy, but I'll tell you

18
00:01:16,799 --> 00:01:18,840
what, his theme song is on point.

19
00:01:19,879 --> 00:01:24,200
Speaker 1: Yeah. Well, anyway, the article, or the podcast, that

20
00:01:24,280 --> 00:01:28,000
I listened to this week was how the attention economy

21
00:01:28,159 --> 00:01:32,599
is devouring gen Z and the rest of us. Right,

22
00:01:32,760 --> 00:01:36,840
and uh, we're going to get into the weeds

23
00:01:36,879 --> 00:01:39,920
here with Mark in a bit. But I just

24
00:01:39,959 --> 00:01:44,200
wanted to point this out as absolutely

25
00:01:44,239 --> 00:01:50,400
necessary, required listening slash reading. Even if you don't have

26
00:01:50,560 --> 00:01:53,840
gen Z sons and daughters, or you know someone or

27
00:01:53,879 --> 00:01:57,959
you are gen Z, this is a really good perspective

28
00:01:58,000 --> 00:02:02,760
piece about you know, well in general, he's saying that

29
00:02:02,799 --> 00:02:06,799
gen Z came of age during COVID, Right, Yeah, I

30
00:02:06,920 --> 00:02:11,719
have a gen Z daughter. Yeah, she was graduating

31
00:02:11,800 --> 00:02:14,759
high school during COVID. She was robbed of her high

32
00:02:14,759 --> 00:02:19,360
school senior year. She did not have any social activity

33
00:02:19,400 --> 00:02:23,240
that whole year. Then she went off to college. Everything

34
00:02:23,319 --> 00:02:27,840
is on Zoom. All the information that she learned school-

35
00:02:27,840 --> 00:02:32,919
wise is on Zoom, like anything of importance. And so

36
00:02:33,159 --> 00:02:37,120
that kind of shapes the way the gen Zers

37
00:02:37,159 --> 00:02:43,719
think about things, and in particular the nihilistic attitude of

38
00:02:43,840 --> 00:02:47,840
why should I, why should I try? Why should I

39
00:02:47,960 --> 00:02:51,560
go to college? Why should I better myself? Because the

40
00:02:51,639 --> 00:02:54,400
AI is going to take my lunch. There's no entry

41
00:02:54,479 --> 00:02:57,439
level jobs anymore. Like it's a very kind of a

42
00:02:57,520 --> 00:03:00,319
dark place that the gen Zers are in because of

43
00:03:00,360 --> 00:03:03,360
their experience and because of what has happened in the

44
00:03:03,400 --> 00:03:06,919
last few years, and I think, yeah, Mark is nodding

45
00:03:06,919 --> 00:03:09,960
his head, we're going to get into this. This is

46
00:03:10,000 --> 00:03:13,280
a good point for me to mention that I have

47
00:03:13,360 --> 00:03:17,080
a TikTok carl at atfenex dot com, and one of

48
00:03:17,120 --> 00:03:20,800
the first things that I've done is a video that

49
00:03:21,000 --> 00:03:26,439
basically says, AI is no excuse. It's no excuse to

50
00:03:26,719 --> 00:03:30,280
give up, to stop learning, to stop trying to be

51
00:03:31,000 --> 00:03:32,800
the person that you wanted to be when you were

52
00:03:32,800 --> 00:03:35,360
a kid, whatever it is. You know your hopes and dreams.

53
00:03:35,360 --> 00:03:38,479
Don't give up. You know, we don't know what's going

54
00:03:38,560 --> 00:03:41,400
to happen in the AI future. We don't know

55
00:03:41,520 --> 00:03:43,439
what jobs are going to be available,

56
00:03:43,520 --> 00:03:46,599
but we do know this, if you just try to

57
00:03:46,599 --> 00:03:50,199
be the best whatever that you can be, you have

58
00:03:50,319 --> 00:03:52,719
a better chance of surviving no matter what the AI

59
00:03:52,879 --> 00:03:57,960
landscape is. Here's an example. You want to go into

60
00:03:58,039 --> 00:04:03,479
a trade, let's say carpentry, because that's an AI-proof

61
00:04:03,800 --> 00:04:08,240
profession, so you think. But then you think, oh, well,

62
00:04:08,280 --> 00:04:10,319
you know the robots are going to just start building

63
00:04:10,319 --> 00:04:13,120
houses and all that stuff, so why bother. Well, here's

64
00:04:13,159 --> 00:04:16,600
why you should be the best carpenter that you can

65
00:04:16,639 --> 00:04:19,600
possibly be. So that when the robots do come and

66
00:04:19,600 --> 00:04:23,480
they're affordable, and construction people, general contractors, are hiring them.

67
00:04:23,560 --> 00:04:27,800
You're the boss because you're awesome, and that's how you

68
00:04:27,879 --> 00:04:30,120
got to be the boss, and then you can hire

69
00:04:30,160 --> 00:04:32,680
the robots to work for you. Do you know what

70
00:04:32,720 --> 00:04:34,959
I'm saying. I mean, it's a weird example and it

71
00:04:35,000 --> 00:04:38,600
probably isn't going to be true anytime soon, but it's

72
00:04:38,639 --> 00:04:41,439
no excuse to stop trying and to stop learning and

73
00:04:41,519 --> 00:04:44,079
to just put the brakes on your life and resign yourself to,

74
00:04:44,879 --> 00:04:46,759
you know, flipping burgers for the rest of your life,

75
00:04:47,000 --> 00:04:48,600
you know, unless that's what you want to do.

76
00:04:49,040 --> 00:04:51,279
Speaker 2: So anyway, although let's face it, flipping burgers is far

77
00:04:51,279 --> 00:04:52,079
more automatable.

78
00:04:52,199 --> 00:04:56,000
Speaker 1: Yes, yes, yes. So anyway, I highly recommend this,

79
00:04:56,360 --> 00:04:58,759
and by the way, check out my TikTok because I

80
00:04:58,839 --> 00:05:01,839
have a bit to say about this that will continue

81
00:05:01,839 --> 00:05:05,759
this conversation, I'm sure. So that's what I've got. Richard,

82
00:05:05,759 --> 00:05:07,000
who's talking to us today?

83
00:05:07,199 --> 00:05:10,199
Speaker 2: Well, knowing we were going to get philosophical today, when

84
00:05:10,199 --> 00:05:13,079
I was poking around for a comment, I grabbed one

85
00:05:13,120 --> 00:05:16,279
off of the artificial intelligence geek out that we did

86
00:05:16,879 --> 00:05:21,800
back in twenty fifteen. Whoa, ten years ago. Well, now,

87
00:05:22,079 --> 00:05:24,720
you know, the other thing is to realize why did

88
00:05:24,759 --> 00:05:27,480
we do that geek out then? Yeah, that was the

89
00:05:27,639 --> 00:05:32,519
time when Bill Gates and Elon Musk and Stephen Hawking

90
00:05:32,600 --> 00:05:35,759
were all going on about AI emerging and then like,

91
00:05:35,839 --> 00:05:39,079
we have to be careful now. Then the subtext of

92
00:05:39,120 --> 00:05:43,079
this is that Google had successfully hired many of the

93
00:05:43,120 --> 00:05:47,000
best minds, including guys like Geoff Hinton and so forth,

94
00:05:47,480 --> 00:05:50,759
and they were doing some extraordinary things in a group

95
00:05:50,800 --> 00:05:54,199
called Google Brain even then, and so really this was

96
00:05:54,240 --> 00:05:56,439
a push for we've got to get those scientists out

97
00:05:56,519 --> 00:05:59,319
of Google. What you were seeing was the setup that

98
00:05:59,360 --> 00:06:02,240
would become open AI, but we didn't know that at

99
00:06:02,240 --> 00:06:05,279
the time. It would emerge another year or two later,

100
00:06:05,720 --> 00:06:08,079
and with all of the problems that that had attached

101
00:06:08,120 --> 00:06:11,079
and continues to have attached to it. Again, nobody expected

102
00:06:11,480 --> 00:06:12,959
any of the things that have happened here.

103
00:06:13,000 --> 00:06:17,639
Speaker 1: Well, and Hinton famously started warning the public against general

104
00:06:17,639 --> 00:06:21,160
AI and the things that I'm sure Mark is going

105
00:06:21,240 --> 00:06:22,279
to be talking about too.

106
00:06:22,360 --> 00:06:26,759
Speaker 2: But yeah, once his shares in Google were fully vested, right, Yeah,

107
00:06:26,839 --> 00:06:29,199
so you know it should be clear.

108
00:06:29,519 --> 00:06:31,680
Speaker 1: Not that you're cynical about that or anything.

109
00:06:31,639 --> 00:06:36,240
Speaker 2: Just watching for people's motivations here. That's right, follow the money.

110
00:06:36,920 --> 00:06:38,480
So there were a lot of comments on that show

111
00:06:38,519 --> 00:06:40,360
because we were pretty far ahead of our time when

112
00:06:40,360 --> 00:06:43,079
we were at that particular point. You know, we were reading

113
00:06:43,120 --> 00:06:45,199
a lot of the tea leaves, much more science

114
00:06:45,240 --> 00:06:47,920
fiction based, so it makes ten years all the more extraordinary, right?

115
00:06:48,000 --> 00:06:51,160
And I grabbed this comment one of literally dozens, and

116
00:06:51,199 --> 00:06:53,560
one of them was Mark Seemann's comment too. And Mark

117
00:06:53,600 --> 00:06:55,959
probably doesn't remember either because he writes lots of comments

118
00:06:55,959 --> 00:06:58,839
on lots of shows. But you referenced a book called

119
00:06:58,959 --> 00:07:03,519
Blindsight, which is a very interesting study in consciousness,

120
00:07:03,759 --> 00:07:07,360
because we did go down that path twenty fifteen about

121
00:07:07,800 --> 00:07:11,680
what's consciousness, what is sentience, and what is intelligence, that

122
00:07:11,759 --> 00:07:15,639
kind of thing. So Tom Kirkhoff's comment, another past guest

123
00:07:15,720 --> 00:07:19,800
of the show, he says, as you mentioned, it depends,

124
00:07:20,040 --> 00:07:22,920
So what is artificial intelligence? People such as Bill Gates

125
00:07:23,240 --> 00:07:25,600
are cautious with AI and tell us we should not

126
00:07:26,199 --> 00:07:28,319
do it. But we have, and we entered the era

127
00:07:28,600 --> 00:07:32,000
where AI is here. Apple has Siri and

128
00:07:32,040 --> 00:07:35,920
Microsoft has Cortana as personal assistants that are more

129
00:07:35,959 --> 00:07:37,879
and more integrated in all our toys. So where do

130
00:07:37,920 --> 00:07:39,920
we draw the line? Isn't it cool to think that

131
00:07:39,959 --> 00:07:44,800
in twenty fifteen we had Cortana? Yeah. And when I, Robot

132
00:07:44,920 --> 00:07:50,959
came out, and that's the Will Smith movie version of Isaac

133
00:07:51,000 --> 00:07:54,800
Asimov's story, we saw robots helping us as humans in

134
00:07:54,839 --> 00:07:56,560
our day to day work, which you know, the funny

135
00:07:56,600 --> 00:07:59,439
part is here we are with some interesting software, but

136
00:07:59,600 --> 00:08:02,720
still can't build a robot that humans can be around. Is

137
00:08:02,759 --> 00:08:06,439
that artificial intelligence? Because we can get this today in

138
00:08:06,519 --> 00:08:09,160
some sort, I don't think it remotely qualifies. We have

139
00:08:09,279 --> 00:08:11,600
robots that are capable of walking like animals. We have

140
00:08:11,680 --> 00:08:15,240
sensors such as Kinect, remember Kinect, that can

141
00:08:15,279 --> 00:08:20,279
detect walls, open doors, and well, plus we have Cortana

142
00:08:20,480 --> 00:08:22,800
and with her knowing our schedule and helping us to

143
00:08:22,839 --> 00:08:25,199
remember stuff and look stuff up. If we combine this together,

144
00:08:25,240 --> 00:08:28,199
are we getting close to those robots? Also, where are

145
00:08:28,279 --> 00:08:31,199
we with laws supporting this, like getting self-driving cars

146
00:08:31,759 --> 00:08:35,039
and personal assistants and stuff? How will we protect ourselves

147
00:08:35,080 --> 00:08:37,799
from human hackers or AI going wrong? These are all

148
00:08:37,840 --> 00:08:38,600
interesting talking points.

149
00:08:38,759 --> 00:08:41,519
Speaker 1: Well, you'd be happy to know nothing has happened in

150
00:08:41,639 --> 00:08:45,600
law or government to protect us from anything because they don't

151
00:08:45,639 --> 00:08:47,320
even know what the heck is going on.

152
00:08:47,679 --> 00:08:51,320
Speaker 2: Well, that's not true. The EU has passed an interesting set of laws.

153
00:08:51,360 --> 00:08:54,360
Speaker 1: I was talking about our government, Richard. Well, my government,

154
00:08:54,919 --> 00:08:57,399
they have no clue about AI or what to do

155
00:08:57,480 --> 00:09:00,519
about it. So, the other civilized civilizations, we have a

156
00:09:00,559 --> 00:09:01,360
little bit more to do.

157
00:09:01,519 --> 00:09:03,679
Speaker 2: It is. I mean, we were thinking about these same

158
00:09:03,720 --> 00:09:07,639
problems ten years ago, but with obviously some gaps, right, Like, Yeah,

159
00:09:08,320 --> 00:09:11,840
the voice assistants of ten years ago actually worked better

160
00:09:11,879 --> 00:09:14,039
than they did in just the past couple of years

161
00:09:14,080 --> 00:09:17,080
before the LLMs showed up, because they never made money. Yeah,

162
00:09:17,279 --> 00:09:19,480
And as they didn't make money, their budgets got squeezed

163
00:09:19,480 --> 00:09:21,840
tighter and tighter, and less compute resources were used on them,

164
00:09:21,879 --> 00:09:26,639
and they degraded. And eventually, just before the LLM breakout,

165
00:09:26,639 --> 00:09:30,279
before ChatGPT, both Google and Amazon came out and said, hey,

166
00:09:30,279 --> 00:09:33,279
we're cutting these groups back because they're just not doing

167
00:09:33,279 --> 00:09:36,279
what they were intended to do, with the intent not being helping people,

168
00:09:36,519 --> 00:09:39,720
but making the company money. And then of course

169
00:09:40,000 --> 00:09:42,480
ChatGPT lands and the whole thing's up in the

170
00:09:42,559 --> 00:09:46,120
air and they're all scrambling. So Tom, thank you so

171
00:09:46,200 --> 00:09:48,159
much for your comment. Great to hear from you, friend,

172
00:09:48,360 --> 00:09:51,559
nine years ago, and a copy of Music to Code By is

173
00:09:51,559 --> 00:09:52,559
on its way to you, and if you'd like a

174
00:09:52,600 --> 00:09:54,600
copy of Music to Code By, write a comment on the website

175
00:09:54,600 --> 00:09:56,399
at dotnetrocks dot com or on the Facebooks.

176
00:09:56,399 --> 00:09:58,000
We publish every show there, and if you comment there

177
00:09:58,000 --> 00:09:59,679
and I read it on the show, we'll send you a copy

178
00:10:00,080 --> 00:10:00,960
of Music to Code By.

179
00:10:00,720 --> 00:10:03,159
Speaker 1: Music to Code By, still going strong. Thank you, Mark Seemann,

180
00:10:03,279 --> 00:10:05,879
for that idea that you gave me lo these many

181
00:10:05,919 --> 00:10:08,720
years ago. Oh, that turned into Music to Code By,

182
00:10:08,840 --> 00:10:10,799
twenty-two tracks now, and you can get them in

183
00:10:10,840 --> 00:10:16,159
MP3, WAV, or FLAC, twenty-five-minute compositions,

184
00:10:16,279 --> 00:10:18,039
at musictocodeby dot net.

185
00:10:18,120 --> 00:10:19,759
Speaker 3: So that's the end of the analysis of the music.

186
00:10:20,000 --> 00:10:23,480
Speaker 1: Yeah, pretty much. Wow. Yeah, yeah. And it's designed to

187
00:10:23,559 --> 00:10:27,799
be in that beat per minute range that was cited

188
00:10:27,840 --> 00:10:31,559
in the study with the baroque music, the children that

189
00:10:31,600 --> 00:10:34,559
were doing math problems. It was between sixty

190
00:10:34,600 --> 00:10:36,679
five and seventy-two beats per minute, I think it is.

191
00:10:37,480 --> 00:10:41,879
And it's neither too distracting nor is it too boring,

192
00:10:42,320 --> 00:10:44,679
Like you're not going to lose your mind listening to it.

193
00:10:44,720 --> 00:10:47,200
There is some variation in there, but nothing's going to

194
00:10:47,279 --> 00:10:50,360
jump out and scream at you. So it works, and

195
00:10:50,440 --> 00:10:54,600
it works for a ton of happy customers, including me.

196
00:10:56,120 --> 00:11:01,440
All right, well, let's formally introduce Mark. Mark Seemann.

197
00:11:01,960 --> 00:11:03,679
Speaker 2: Hmmm, we got to do nineteen sixty.

198
00:11:03,879 --> 00:11:06,919
Speaker 1: Oh yeah, we do. Why do I always forget that, Richard?

199
00:11:07,120 --> 00:11:07,600
I don't know.

200
00:11:08,120 --> 00:11:09,639
Speaker 2: Maybe we should let this go at some point, but

201
00:11:09,639 --> 00:11:10,799
I kind of want to run until we get to

202
00:11:10,799 --> 00:11:12,279
two thousand and two and we have inception.

203
00:11:12,480 --> 00:11:15,799
Speaker 1: I do too, Yeah, yeah, yeah. So significant events in

204
00:11:15,879 --> 00:11:21,240
nineteen sixty included the independence of seventeen African nations, the

205
00:11:21,279 --> 00:11:24,360
Greensboro sit ins for civil rights in the US, and

206
00:11:24,519 --> 00:11:28,039
the first televised presidential debate between John F. Kennedy and

207
00:11:28,120 --> 00:11:29,120
Richard Nixon.

208
00:11:28,919 --> 00:11:30,399
Speaker 2: MM, which went well.

209
00:11:30,480 --> 00:11:34,399
Speaker 1: Kennedy was very telegenic; he wore a dark suit,

210
00:11:34,519 --> 00:11:37,679
and Richard Nixon blended in with the background. I remember that.

211
00:11:37,559 --> 00:11:39,360
Speaker 2: All these things you didn't need to think about until

212
00:11:39,440 --> 00:11:40,360
television came along.

213
00:11:40,679 --> 00:11:43,000
Speaker 1: It was also marked by the U-2 incident, where

214
00:11:43,039 --> 00:11:46,440
an American spy plane flown by Gary Powers was shot down over Soviet airspace,

215
00:11:46,799 --> 00:11:51,080
escalating Cold War tensions.

216
00:11:51,639 --> 00:11:54,000
Speaker 2: They believed that plane flew too high to be shot down,

217
00:11:54,000 --> 00:11:58,000
and they were wrong. So what's on your list, Richard?

218
00:11:58,600 --> 00:12:02,039
The first laser is rendered operational. A guy named Theodore

219
00:12:02,080 --> 00:12:05,240
Maiman, working at Hughes Research, used a synthetic

220
00:12:05,360 --> 00:12:07,840
ruby with flash lamps, based on a bunch of science from

221
00:12:08,320 --> 00:12:10,279
a group of other smart folks over the past few years.

222
00:12:10,279 --> 00:12:13,799
But he's the guy who actually implemented coherent light. Wow

223
00:12:13,840 --> 00:12:15,639
your DVD thanks you. Wow.

224
00:12:15,759 --> 00:12:19,039
Speaker 1: And shortly after that, Star Trek came online and suddenly

225
00:12:19,080 --> 00:12:22,200
they're using phasers because they couldn't say lasers.

226
00:12:22,679 --> 00:12:25,679
Speaker 2: And the very first weather satellite ever, TIROS one, the

227
00:12:25,840 --> 00:12:31,200
US satellite TIROS, short for Television Infrared Observation Satellite, launched

228
00:12:31,200 --> 00:12:34,320
by a Thor-Able rocket. It had solar panels on it

229
00:12:34,320 --> 00:12:37,600
in nineteen sixty. Solar panels! Wow. Wide- and narrow-

230
00:12:37,639 --> 00:12:41,080
angle infrared cameras and it took about twenty three thousand

231
00:12:41,120 --> 00:12:45,080
pictures before an electrical failure after ten weeks knocked it out,

232
00:12:45,559 --> 00:12:48,480
beginning this idea of being able to look at weather

233
00:12:48,480 --> 00:12:52,480
at a macroscale from orbit, which is incredibly important. It's

234
00:12:52,559 --> 00:12:54,559
amazing to think that it's only been sixty

235
00:12:54,639 --> 00:12:56,679
years of being able to do that. Yeah, it's still

236
00:12:56,759 --> 00:13:00,360
actually in orbit. There was this electrical failure in the

237
00:13:00,360 --> 00:13:03,080
battery system that knocked it out early. It could have lasted longer,

238
00:13:03,080 --> 00:13:05,240
and it was followed up by many more. But

239
00:13:05,240 --> 00:13:06,360
that began in nineteen sixty.

240
00:13:06,360 --> 00:13:09,200
Speaker 1: I shall also mention the pill was approved in nineteen sixty,

241
00:13:09,360 --> 00:13:13,159
so that ushered in a whole era of women's reproductive

242
00:13:13,240 --> 00:13:14,080
rights and all of that.

243
00:13:14,159 --> 00:13:16,039
Speaker 2: Next, we're going to talk about the age of Aquarius.

244
00:13:16,399 --> 00:13:19,279
Speaker 1: No, no, no, come on, man, come on.

245
00:13:20,559 --> 00:13:21,639
Speaker 2: It all works together.

246
00:13:21,720 --> 00:13:23,200
Speaker 1: Where would we be without the pill?

247
00:13:23,360 --> 00:13:23,559
Speaker 3: There?

248
00:13:23,559 --> 00:13:26,080
Speaker 1: Yeah, seriously, I wouldn't have gotten laid in high school.

249
00:13:26,080 --> 00:13:28,480
I don't know about you guys. But all right, So

250
00:13:29,200 --> 00:13:33,600
are there any other computer-oriented events or computers that

251
00:13:33,679 --> 00:13:37,000
were breakthroughs in nineteen sixty that you can think of?

252
00:13:36,960 --> 00:13:39,679
Speaker 2: No, but nineteen sixty one is a big one. So

253
00:13:40,080 --> 00:13:41,960
hang in there, will you. All right, we'll talk a lot

254
00:13:42,000 --> 00:13:44,320
about the integrated circuit, all right, next week.

255
00:13:44,360 --> 00:13:47,159
Speaker 3: Did they start with Fortran back then? Or is

256
00:13:47,519 --> 00:13:49,879
that around that time, maybe a little bit earlier?

257
00:13:50,039 --> 00:13:53,519
Speaker 2: Yeah, yeah, yeah, no, Fortran's already around by then.

258
00:13:53,799 --> 00:13:58,039
Oh yeah, okay, you know, not on a you know,

259
00:13:58,080 --> 00:14:02,519
we're talking kind of pre... this is before we actually

260
00:14:02,559 --> 00:14:06,279
have digital computers per se, right? They're largely electromechanical,

261
00:14:06,600 --> 00:14:10,320
like the, uh... We have transistors, but we haven't

262
00:14:10,320 --> 00:14:12,440
really got an integrated circuit, so the compute power is

263
00:14:12,480 --> 00:14:13,240
not the same at all.

264
00:14:13,600 --> 00:14:16,120
Speaker 1: All right, So the bio that I'm going to read

265
00:14:16,200 --> 00:14:18,600
was not written by me. It was written by Mark himself.

266
00:14:19,080 --> 00:14:23,159
Mark Seemann is a bad economist who's found a second

267
00:14:23,200 --> 00:14:27,519
career as a programmer. He has worked as a web

268
00:14:27,600 --> 00:14:30,360
and enterprise developer since the late nineteen nineties, and he

269
00:14:30,440 --> 00:14:34,759
blogs regularly at blog dot ploeh dot dk. That's p

270
00:14:35,080 --> 00:14:37,759
l o e h. Did I pronounce that right? Or is

271
00:14:37,799 --> 00:14:40,200
it more like 'plur'? 'Plur', that's right.

272
00:14:40,360 --> 00:14:43,200
Speaker 3: Okay, you got that the second time around. Yeah, that's

273
00:14:43,639 --> 00:14:44,759
pretty good. That's pretty good.

274
00:14:44,799 --> 00:14:47,799
Speaker 1: Well, welcome back, and uh, thank you. I just had

275
00:14:47,799 --> 00:14:50,399
to formally introduce you there, even though we've been

276
00:14:50,440 --> 00:14:53,200
talking to you for ten fourteen minutes.

277
00:14:53,360 --> 00:14:53,879
Speaker 3: Yes we have.

278
00:14:54,120 --> 00:14:56,679
Speaker 1: Yeah, all right, So what are your thoughts? Are you

279
00:14:56,759 --> 00:14:57,639
fan of Ezra Klein?

280
00:14:57,720 --> 00:15:02,720
Speaker 3: First of all, I usually, I used to

281
00:15:02,759 --> 00:15:05,000
listen to a podcast by Sam Harris and these two

282
00:15:05,240 --> 00:15:10,559
are sort of enemies, if you will. So I haven't

283
00:15:10,559 --> 00:15:12,919
really listened to Ezra Klein. But on the

284
00:15:12,960 --> 00:15:15,840
other hand, I think it was F. Scott Fitzgerald who

285
00:15:15,840 --> 00:15:18,600
said something like, you know, the sign of intelligence

286
00:15:18,639 --> 00:15:22,039
is being able to hold two opposing thoughts in your

287
00:15:22,080 --> 00:15:24,080
mind at the same time and not go insane. So

288
00:15:24,159 --> 00:15:28,039
maybe I should. I mean, it also sounds like

289
00:15:28,120 --> 00:15:30,519
he's been doing... Ezra Klein

290
00:15:30,559 --> 00:15:33,120
has been on some sort of journey where he's starting

291
00:15:33,159 --> 00:15:35,960
to realize that, you know, some of the problems that

292
00:15:36,039 --> 00:15:39,639
you just talked about here are actually really important. So yeah,

293
00:15:39,679 --> 00:15:41,519
so maybe I should. So I haven't really been a

294
00:15:41,559 --> 00:15:44,720
fan there, but you know, I have no beef with

295
00:15:44,799 --> 00:15:47,759
him personally, so maybe maybe I should give it a listen.

296
00:15:47,960 --> 00:15:50,159
Speaker 1: Well, what about the idea of gen Z being sort

297
00:15:50,200 --> 00:15:52,759
of caught in this vortex of impossibility?

298
00:15:53,279 --> 00:15:57,240
Speaker 3: That absolutely rings true. I have two gen

299
00:15:57,320 --> 00:16:01,200
Z kids, and, well, the older one is old

300
00:16:01,279 --> 00:16:03,879
enough so she's almost not a gen Z, so she's

301
00:16:04,080 --> 00:16:06,600
she's sort of got you know, through most of this

302
00:16:06,799 --> 00:16:10,399
stuff without, you know, too much impact.

303
00:16:10,559 --> 00:16:13,440
But the other one, he's eighteen now, and he's really

304
00:16:13,720 --> 00:16:17,440
he's really you know, grabbed by TikTok and phones and

305
00:16:17,519 --> 00:16:20,159
so on. So that's yeah, that's that's a bit of

306
00:16:20,200 --> 00:16:20,679
a problem.

307
00:16:20,840 --> 00:16:23,440
Speaker 2: Yeah, yeah, I mean my kids are just that bit

308
00:16:23,559 --> 00:16:26,399
much older that maybe they slipped past this to some degree,

309
00:16:26,480 --> 00:16:29,639
but the debate here, and you brought it up

310
00:16:29,679 --> 00:16:31,879
right at the top there, Carl, is how much of

311
00:16:31,879 --> 00:16:34,879
this is just the attention economy in general? And how

312
00:16:34,919 --> 00:16:37,480
much of it is the impacts of the pandemic of

313
00:16:37,519 --> 00:16:43,120
those two years of psychosis, just this crazy time.

314
00:16:43,279 --> 00:16:47,240
Speaker 1: It was really psychosis, absolutely crazy time.

315
00:16:47,440 --> 00:16:50,559
Speaker 3: I think we were seeing signs of this already before.

316
00:16:50,639 --> 00:16:54,080
I mean, was it Shoshana Zuboff who wrote

317
00:16:54,080 --> 00:16:57,320
this book about the attention economy? I think that predates

318
00:16:57,399 --> 00:17:02,000
the pandemic, as I remember. So there were definitely people

319
00:17:02,039 --> 00:17:05,640
talking about this, you know, even in the in the

320
00:17:05,680 --> 00:17:09,640
twenty tens. But that's not really what we're here to

321
00:17:09,640 --> 00:17:10,119
talk about.

322
00:17:10,200 --> 00:17:14,440
Speaker 1: No, no, no, it isn't. We're just getting started here.

323
00:17:14,640 --> 00:17:16,880
Speaker 3: Yeah, I know, but maybe I should start with

324
00:17:17,440 --> 00:17:20,559
another experience I had with a young person. So

325
00:17:20,599 --> 00:17:25,599
I was following a university course on something computer science

326
00:17:25,599 --> 00:17:29,599
I don't exactly remember. And because I was doing that,

327
00:17:29,680 --> 00:17:32,720
they, you know, had us do some group exercises

328
00:17:32,720 --> 00:17:34,519
as well. So I was doing a little paper with

329
00:17:34,559 --> 00:17:38,279
some young people and we were having a discussion about

330
00:17:38,279 --> 00:17:41,119
how to interpret a certain algorithm, and you know, whether

331
00:17:41,160 --> 00:17:45,440
we were in one regime or another and we couldn't

332
00:17:45,480 --> 00:17:47,559
really agree. And then the other one he was just

333
00:17:47,599 --> 00:17:53,200
writing on some DM, what's it called? I

334
00:17:53,240 --> 00:17:56,160
can't remember. Anyway, so we were DMing back and forth, and

335
00:17:56,519 --> 00:17:58,880
he writes to me, well, but I just asked Chat

336
00:17:58,960 --> 00:18:02,359
GPT and it says blah blah blah. So I'm right, ah,

337
00:18:02,559 --> 00:18:04,920
and I'm sort of like, I don't care. I don't

338
00:18:04,920 --> 00:18:07,519
care what ChatGPT says, I wrote back. And he

339
00:18:07,640 --> 00:18:09,759
was like, oh my god, you don't care what Chat

340
00:18:09,839 --> 00:18:12,799
GPT says? How can you? I mean, there

341
00:18:12,920 --> 00:18:17,759
was very much a generational divide there, and every time

342
00:18:17,799 --> 00:18:20,160
we came back to that, he's sort of like, oh, yeah, Mark,

343
00:18:20,319 --> 00:18:23,759
is this weird person who doesn't believe everything that

344
00:18:23,839 --> 00:18:25,160
ChatGPT says.

345
00:18:25,200 --> 00:18:28,319
Speaker 1: But you're a Luddite, you don't know anything about technology?

346
00:18:31,240 --> 00:18:34,359
Speaker 3: Yeah. And I should probably preface this: I'm going to say

347
00:18:34,359 --> 00:18:36,519
a lot of critical things about AI, but it's not

348
00:18:36,599 --> 00:18:39,279
that I'm a complete Luddite. I actually do see

349
00:18:39,440 --> 00:18:42,079
that there are some, you know, benefits to be gained as well,

350
00:18:42,119 --> 00:18:44,160
but that's not what we're here to talk about. So

351
00:18:44,559 --> 00:18:46,519
if the listener gets the impression that I'm just a

352
00:18:46,799 --> 00:18:49,839
grumpy old man shouting at the cloud, it's not the

353
00:18:50,000 --> 00:18:51,960
entire picture, but let's just pretend.

354
00:18:51,960 --> 00:18:53,519
Speaker 1: Well, that's besides the point.

355
00:18:53,960 --> 00:18:55,480
Speaker 3: Let's just pretend that that's the case.

356
00:18:55,519 --> 00:19:00,039
Speaker 2: Anyway, Yeah, that cloud didn't need shouting.

357
00:19:01,640 --> 00:19:05,559
Speaker 3: Indeed, indeed. But I keep running into this thing where

358
00:19:05,599 --> 00:19:08,880
people are backing up their claims by saying, well, I

359
00:19:09,079 --> 00:19:12,119
just asked, you know, some ChatGPT or some other

360
00:19:12,200 --> 00:19:16,000
AI online system, large language model, whatever you want

361
00:19:16,000 --> 00:19:18,759
to call it, and then they're using that as their

362
00:19:18,920 --> 00:19:22,960
appeal to authority and saying, well, it's true because

363
00:19:23,000 --> 00:19:27,440
it says so. And it's really hard to argue against that,

364
00:19:27,480 --> 00:19:31,240
because if people are actually in that mindset where they

365
00:19:31,279 --> 00:19:34,440
think of it as an authority that they can trust, it's

366
00:19:34,519 --> 00:19:36,119
hard to get them out of that mindset.

367
00:19:36,160 --> 00:19:39,240
Speaker 2: But is it actually new? This is not new to LLMs,

368
00:19:39,559 --> 00:19:43,200
or ChatGPT. People have been saying the computer says,

369
00:19:43,559 --> 00:19:46,680
uh huh, yeah, since we put computers in front of people.

370
00:19:46,559 --> 00:19:49,319
Speaker 3: Right, Well, that's a fair argument, but I think we

371
00:19:49,440 --> 00:19:52,880
we've sort of reached a new level there,

372
00:19:52,960 --> 00:19:55,640
because usually, you know, in the old days, when the

373
00:19:55,640 --> 00:19:59,880
computer said something, it was usually correct under the context,

374
00:20:00,039 --> 00:20:04,519
you know, in which it would say something.

375
00:20:04,480 --> 00:20:07,400
Speaker 1: Right, data came out of a database somewhere.

376
00:20:07,160 --> 00:20:09,359
Speaker 3: Yeah, you would ask it about something in the database,

377
00:20:09,400 --> 00:20:11,720
And of course you could have wrong data

378
00:20:11,960 --> 00:20:14,440
inside that database, or you could have a bug in

379
00:20:14,480 --> 00:20:17,240
the program and so on. But in general, if you

380
00:20:17,319 --> 00:20:20,720
understood the context in which the you know, the computer

381
00:20:20,799 --> 00:20:24,079
and the program and the software would actually be giving

382
00:20:24,119 --> 00:20:28,039
you answers, there would be some sort of knowledge to

383
00:20:28,119 --> 00:20:30,640
be gained. And that's not really where we are with

384
00:20:30,720 --> 00:20:34,319
those, you know, new systems. And it's not, you know,

385
00:20:34,559 --> 00:20:36,839
one thing is the system itself, but it's how people

386
00:20:36,880 --> 00:20:41,000
are interacting with these systems that concerns me a bit. Yeah, yeah.

387
00:20:41,039 --> 00:20:43,519
But also the thing is that they tend to

388
00:20:43,559 --> 00:20:45,559
see them as oracles, that you can go and

389
00:20:45,599 --> 00:20:48,559
ask them about anything, and then a lot of

390
00:20:48,599 --> 00:20:52,200
people seem to just blindly trust them. That's

391
00:20:52,200 --> 00:20:54,240
really what concerns me here, because...

392
00:20:54,160 --> 00:20:58,880
Speaker 1: Yeah, I did an experiment, Mark. I asked for a

393
00:20:58,960 --> 00:21:02,960
recommendation of a product on Amazon based on my parameters,

394
00:21:04,039 --> 00:21:07,400
and it recommended something. It was

395
00:21:07,440 --> 00:21:10,400
a piece of electronic gear. I went on Amazon,

396
00:21:10,440 --> 00:21:13,839
I looked at all the reviews, and there were

397
00:21:13,880 --> 00:21:17,319
many one-star reviews saying this thing overheats and then

398
00:21:17,559 --> 00:21:22,039
goes to crap, don't buy it. So then I brought

399
00:21:22,039 --> 00:21:24,559
that up to ChatGPT. It said, you're right, let me

400
00:21:24,599 --> 00:21:27,960
look for another one. Sound familiar? Let me look for

401
00:21:28,000 --> 00:21:30,160
another one that doesn't overheat. Here's the one

402
00:21:30,200 --> 00:21:30,559
Speaker 2: You want.

403
00:21:31,359 --> 00:21:34,240
Speaker 1: This is because it's going to satisfy this condition, that condition,

404
00:21:34,319 --> 00:21:37,960
And I said, okay, and you know it's fairly well reviewed.

405
00:21:38,000 --> 00:21:41,559
So I bought it and it didn't work. It didn't

406
00:21:41,559 --> 00:21:45,720
do some of the things that I asked for with

407
00:21:46,400 --> 00:21:48,599
ChatGPT, and it was vague, because when I went

408
00:21:48,640 --> 00:21:52,000
back and looked at the description, it didn't explicitly say this

409
00:21:52,039 --> 00:21:54,440
thing that I needed. I just assumed that it would

410
00:21:54,680 --> 00:21:58,839
do it because most things like this did it. So

411
00:21:58,880 --> 00:22:01,720
I ended up returning it and getting something else. But it's

412
00:22:01,920 --> 00:22:04,599
a cautionary tale. So yeah, it was an experiment. I wanted to

413
00:22:04,640 --> 00:22:09,240
see if I could rather than going through the tedious

414
00:22:09,279 --> 00:22:14,440
task of searching on Amazon and then sorting by most

415
00:22:14,480 --> 00:22:18,000
favorable reviews and reading them. Rather than doing that, I

416
00:22:18,119 --> 00:22:20,279
just asked ChatGPT to do my bidding. And it

417
00:22:20,319 --> 00:22:20,880
didn't work.

418
00:22:21,400 --> 00:22:24,200
Speaker 3: It lies. But indeed, in this case, you

419
00:22:24,279 --> 00:22:26,920
were still in a scenario where you were able to

420
00:22:27,880 --> 00:22:30,920
verify, or in

421
00:22:30,920 --> 00:22:35,680
this case actually falsify, the claim that was made

422
00:22:35,720 --> 00:22:38,920
by the LLM. So of course, because you were ordering,

423
00:22:39,240 --> 00:22:41,640
I assume a physical product, it took some time to

424
00:22:41,680 --> 00:22:44,799
actually get that verification or falsification in place, but you

425
00:22:44,799 --> 00:22:48,759
could still do that. And that's not even, you know,

426
00:22:48,880 --> 00:22:52,640
I'm not too concerned about people using LLMs in that

427
00:22:52,640 --> 00:22:54,559
way because I actually use them like that as well.

428
00:22:54,680 --> 00:22:56,359
You know, if I have a problem where

429
00:22:56,359 --> 00:22:59,759
I, you know, don't know exactly

430
00:23:00,119 --> 00:23:01,799
what the answer is going to be, but if I

431
00:23:01,839 --> 00:23:04,720
get the answer, I can do a verification check and

432
00:23:04,759 --> 00:23:07,279
then I can see if that solves my problem or not.

433
00:23:08,160 --> 00:23:11,440
I've had very nice, you know, experiences with

434
00:23:11,480 --> 00:23:13,960
the LLMs that do that for me and save me

435
00:23:14,000 --> 00:23:16,599
a ton of time. So I don't really have a

436
00:23:16,599 --> 00:23:19,279
problem with that because you know, if you can get

437
00:23:19,319 --> 00:23:21,200
an answer and then you can verify whether or not

438
00:23:21,240 --> 00:23:25,880
it works, you're still on solid ground in terms of epistemology.

439
00:23:26,000 --> 00:23:28,319
So okay, so now we said the big word here,

440
00:23:28,359 --> 00:23:31,279
but it basically means the theory of knowledge. So how

441
00:23:31,319 --> 00:23:33,559
do we know that we know things, why do we

442
00:23:33,599 --> 00:23:35,359
think that we know some things?

443
00:23:35,680 --> 00:23:40,519
Speaker 1: I know that there are epistemological studies that

444
00:23:40,920 --> 00:23:45,480
are basically just tabulating answers from people, but you don't

445
00:23:45,480 --> 00:23:51,160
know whether or not they lied, right? The entomological studies,

446
00:23:51,680 --> 00:23:54,799
there's another... All right, I'm mixing up my words.

447
00:23:54,839 --> 00:23:57,519
Speaker 3: Here, go ahead. Yeah, so where were we? So, yeah.

448
00:23:57,559 --> 00:23:57,599
Speaker 2: So?

449
00:23:57,640 --> 00:23:59,119
Speaker 3: But that's the one thing. So if you can ask,

450
00:23:59,240 --> 00:24:01,039
you know, a system and then get it to

451
00:24:01,200 --> 00:24:03,400
give you an answer that you can then later verify,

452
00:24:03,480 --> 00:24:05,880
I think that's sound. I don't really have a

453
00:24:05,880 --> 00:24:08,519
problem with that. My problem is really when you are

454
00:24:09,039 --> 00:24:11,799
asking a system to do something and you have no

455
00:24:11,920 --> 00:24:14,880
way of verifying whether or not it actually, you know,

456
00:24:15,079 --> 00:24:17,079
does what it is that you wanted it to do. Then

457
00:24:17,279 --> 00:24:21,240
I think now I'm getting concerned. And since we are

458
00:24:21,519 --> 00:24:24,240
on a podcast where we usually talk about software development,

459
00:24:24,279 --> 00:24:25,920
you know, one of the things that really concerns me

460
00:24:26,000 --> 00:24:28,279
is when people ask you know, these systems to write

461
00:24:28,279 --> 00:24:32,200
code for them. But then again, you know, if

462
00:24:32,279 --> 00:24:35,400
you do that, well, if you can actually

463
00:24:35,680 --> 00:24:37,960
read through the code, then you have an

464
00:24:37,960 --> 00:24:41,000
idea of what you're looking at, well, that might actually work,

465
00:24:41,079 --> 00:24:43,440
but often you hear people... So yeah, I know you

466
00:24:43,519 --> 00:24:47,079
talked about vibe coding already, and for me it

467
00:24:47,160 --> 00:24:49,920
is a pejorative. I think it sounds like a

468
00:24:50,079 --> 00:24:53,039
really really bad idea because if you don't know, if

469
00:24:53,079 --> 00:24:54,960
you don't know how to code, or if

470
00:24:54,960 --> 00:24:57,880
you ask this system to write code in a language

471
00:24:57,920 --> 00:25:01,559
that you don't really understand, then how do you know it works? Right?

472
00:25:02,200 --> 00:25:05,799
Speaker 4: Well, the compiler has a say, if you are writing

473
00:25:05,839 --> 00:25:08,759
in a language that actually does compile, and people are

474
00:25:08,880 --> 00:25:11,519
using a lot, and often they get it to write

475
00:25:11,640 --> 00:25:14,359
JavaScript or Python or something like that for them, and

476
00:25:14,400 --> 00:25:16,640
those languages don't even compile.

477
00:25:16,920 --> 00:25:19,559
Speaker 1: The point is it's something that they don't know. Yeah, right.

478
00:25:19,960 --> 00:25:23,880
Why would I ask an LLM, or whatever,

479
00:25:24,160 --> 00:25:27,319
an agent to write me an assembly program because I

480
00:25:27,359 --> 00:25:29,720
think it's going to be faster When I don't read

481
00:25:29,720 --> 00:25:32,039
assembly and I can't verify it, and I can't step

482
00:25:32,079 --> 00:25:33,440
through the code, and I don't know what the heck

483
00:25:33,519 --> 00:25:36,400
that thing does. It might look like it works, but

484
00:25:36,640 --> 00:25:39,599
I ain't going to run that thing, right, I'm gonna

485
00:25:39,799 --> 00:25:43,720
If you ask an agent to write code in a

486
00:25:43,799 --> 00:25:47,680
language that you don't know how to verify, you get

487
00:25:48,319 --> 00:25:50,599
you know, you get what you get, right, you get

488
00:25:50,599 --> 00:25:54,400
what you pay for, basically. You deserve it.

489
00:25:54,960 --> 00:25:59,480
Speaker 2: That being said, I'm now having experiences with very experienced

490
00:25:59,519 --> 00:26:03,759
software developers, where we spent an entire day working through

491
00:26:04,440 --> 00:26:07,599
a sprint of code that we estimated would have

492
00:26:07,640 --> 00:26:10,279
been six weeks worth of work, and then knocked it

493
00:26:10,279 --> 00:26:13,240
out in a weekend using these tools. Yep, yeah, right,

494
00:26:13,359 --> 00:26:16,799
Like but in the hands of skilled people who understand

495
00:26:16,839 --> 00:26:20,240
what they're doing and are working hard with these tools,

496
00:26:20,880 --> 00:26:25,160
you can get extraordinary results. Not incremental results,

497
00:26:25,200 --> 00:26:27,759
but literally weeks of work in days.

498
00:26:27,880 --> 00:26:31,039
Speaker 1: We just heard and I don't remember if this was

499
00:26:31,079 --> 00:26:32,880
talking to you, Richard or somebody else, might have been

500
00:26:32,920 --> 00:26:36,400
Brian McKay that he had a guy that was in

501
00:26:36,480 --> 00:26:41,480
a meeting, a two hour meeting about a spec and

502
00:26:41,680 --> 00:26:43,559
about building a prototype, and by the end of the

503
00:26:43,599 --> 00:26:44,480
meeting he had it done.

504
00:26:44,720 --> 00:26:49,319
Speaker 2: Right. That seems to be becoming more common. Again, known problem space.

505
00:26:49,640 --> 00:26:52,799
You know, these were forms-over-data problems, so

506
00:26:52,839 --> 00:26:56,960
they were pretty automatable anyway, and with someone who knew

507
00:26:57,119 --> 00:26:59,839
the tools and the language well, and they have a

508
00:27:00,079 --> 00:27:03,200
put-together assembly, and their productivity is astonishing.

509
00:27:03,319 --> 00:27:06,799
Speaker 3: Yeah, but again, how do you measure productivity in software development?

510
00:27:06,839 --> 00:27:10,880
Because it seems to me that we are forgetting that

511
00:27:11,960 --> 00:27:14,920
lines of code is not a measurement of productivity, you know.

512
00:27:14,920 --> 00:27:17,279
Speaker 2: Know, Yeah, this is delivering features to customers.

513
00:27:17,400 --> 00:27:20,240
Speaker 3: Yeah, and that makes a lot of sense, of course,

514
00:27:20,359 --> 00:27:24,200
if you can measure that. But that's a whole different discussion.

515
00:27:24,240 --> 00:27:27,720
Whether that's because one feature is not necessarily equivalent to

516
00:27:27,759 --> 00:27:30,400
another feature. You know, some features are big and some are small.

517
00:27:30,440 --> 00:27:32,680
But that's probably a different discussion.

518
00:27:33,079 --> 00:27:34,960
Speaker 2: No, but I think it's a really valid one that

519
00:27:35,160 --> 00:27:37,640
there's a bar here that these tools seem to be

520
00:27:37,680 --> 00:27:40,440
able to handle up to, and above that bar

521
00:27:40,559 --> 00:27:41,200
they cannot.

522
00:27:41,839 --> 00:27:43,920
Speaker 1: Yeah, above that bar you have to sort of break

523
00:27:43,960 --> 00:27:47,079
it down into, you know, bite-sized pieces for them. So,

524
00:27:47,160 --> 00:27:51,279
but that's how I like to work anyway, you know, Yeah, yeah, yeah.

525
00:27:51,119 --> 00:27:54,160
Speaker 3: Of course. But I'm still wondering whether we

526
00:27:54,240 --> 00:27:57,240
can trust these things even if we look at them.

527
00:27:57,279 --> 00:28:00,160
Because, regarding the story that you told, I

528
00:28:00,160 --> 00:28:02,119
don't know exactly the details of it and so on.

529
00:28:02,240 --> 00:28:04,480
But one of the questions I would like to

530
00:28:04,519 --> 00:28:06,680
ask when people do something like that is

531
00:28:06,839 --> 00:28:10,079
how do you actually know that the software works? How?

532
00:28:10,480 --> 00:28:13,480
How did you decide that that software

533
00:28:13,519 --> 00:28:18,799
worked in that particular case? What were the decision criteria there?

534
00:28:18,920 --> 00:28:21,079
Speaker 2: Oh, I mean, again, they'd also built a set of

535
00:28:21,119 --> 00:28:24,920
test suites. Yeah, you know, he saw that these features

536
00:28:24,960 --> 00:28:26,720
need to be tested this way and measured, you know.

537
00:28:26,920 --> 00:28:30,680
he did the complete coding solution, including the security evaluation,

538
00:28:30,880 --> 00:28:32,960
Like all of the different pieces. He didn't just

539
00:28:33,039 --> 00:28:35,720
splat it out. This was not vibe coding. No, this was

540
00:28:35,880 --> 00:28:38,279
a thoroughly thought-out architectural solution.

541
00:28:38,599 --> 00:28:40,000
Speaker 3: But who wrote the tests?

542
00:28:40,000 --> 00:28:41,240
Speaker 2: Though? With the tools?

543
00:28:41,640 --> 00:28:43,839
Speaker 3: Why do you trust those, then?

544
00:28:44,119 --> 00:28:47,440
Speaker 2: Well, because you could see the code, right, there's no

545
00:28:47,519 --> 00:28:49,960
secrets here. Tests are pretty straightforward to understand.

546
00:28:50,119 --> 00:28:52,200
Speaker 1: Yeah, I guess the thing that we can agree on

547
00:28:52,359 --> 00:28:54,680
is if you let it get away from you, right,

548
00:28:54,759 --> 00:28:57,440
and you don't follow up on every change your AI

549
00:28:57,559 --> 00:29:00,319
is making for you, and test it,

550
00:29:00,400 --> 00:29:04,799
and you know, observe it, and you just let it

551
00:29:04,880 --> 00:29:07,720
go wild, you're going to lose control. And so

552
00:29:07,839 --> 00:29:10,039
staying in control, I think this is the key.

553
00:29:10,200 --> 00:29:12,359
Speaker 2: The question you keep asking is why do you trust it?

554
00:29:12,359 --> 00:29:16,319
It's like, don't trust it. Yeah, exactly. Yeah. Look,

555
00:29:16,359 --> 00:29:19,279
I already do distributed development. I have people contributing to

556
00:29:19,319 --> 00:29:22,559
my projects that I never meet, that I only interact with,

557
00:29:22,759 --> 00:29:26,440
you know, through issues on GitHub. You don't trust them either? No,

558
00:29:26,640 --> 00:29:28,319
but you evaluate the code.

559
00:29:28,519 --> 00:29:29,519
Speaker 3: You review the code.

560
00:29:29,640 --> 00:29:33,000
Speaker 2: Yeah, yeah, that's the job. But the reality is it's

561
00:29:33,039 --> 00:29:36,599
still a force multiplier to have multiple people contributing to

562
00:29:36,640 --> 00:29:38,720
a project. It takes less time to review code than

563
00:29:38,720 --> 00:29:39,480
it takes to write it.

564
00:29:39,799 --> 00:29:43,400
Speaker 3: And I do not disagree with that. That's reasonable enough,

565
00:29:43,440 --> 00:29:47,559
but my concern is still that, you know,

566
00:29:47,599 --> 00:29:51,480
if we have an output of code that is multiplied,

567
00:29:51,640 --> 00:29:56,680
you know, tenfold, one hundredfold in comparison to what we

568
00:29:56,720 --> 00:30:00,279
had a couple of years ago, then we should also

569
00:30:00,079 --> 00:30:04,039
spend that much

570
00:30:04,079 --> 00:30:07,599
more energy on actually reviewing the things that are being produced.

571
00:30:07,599 --> 00:30:10,799
And I'm not really getting the impression that that's the case.

572
00:30:12,640 --> 00:30:15,759
Speaker 1: So it's the case at my house, I can tell

573
00:30:15,799 --> 00:30:15,960
you that.

574
00:30:16,480 --> 00:30:16,680
Speaker 3: Yeah.

575
00:30:18,359 --> 00:30:20,599
Speaker 2: Well, but again, you know,

576
00:30:20,720 --> 00:30:23,599
this is also somewhat self-fulfilling. Those who trust

577
00:30:23,680 --> 00:30:26,279
these tools, right, will get burned. Absolutely.

578
00:30:26,519 --> 00:30:29,359
Speaker 3: Yeah, that's also what I'm concerned about.

579
00:30:29,680 --> 00:30:31,720
And we can just hope that it's just some simple

580
00:30:31,759 --> 00:30:35,480
forms over data, and they're probably only

581
00:30:35,559 --> 00:30:39,279
hurting the company that actually owns that software. But what if,

582
00:30:39,480 --> 00:30:42,920
what if, actually, we're beginning to see people, you know,

583
00:30:43,079 --> 00:30:48,880
writing, you know, uh, operating systems or systems for

584
00:30:49,319 --> 00:30:53,119
controlling hardware or elevators and medical systems and so on.

585
00:30:53,240 --> 00:30:55,960
And then I'm getting a little bit concerned here. That's

586
00:30:55,960 --> 00:30:59,480
probably not going to happen this year. But well, in

587
00:30:59,519 --> 00:31:02,359
a couple of years we'll see, we'll see those people

588
00:31:02,359 --> 00:31:05,839
who do use those systems at the moment, some

589
00:31:05,920 --> 00:31:08,759
of them will graduate to writing those kinds of systems,

590
00:31:09,200 --> 00:31:11,319
and I'm just concerned that they're probably going to take

591
00:31:11,319 --> 00:31:12,640
some of their bad habits with them.

592
00:31:12,880 --> 00:31:16,480
Speaker 2: Without a doubt, I think, yeah, let's do the break

593
00:31:16,519 --> 00:31:18,680
and then I want to dig into the next tier

594
00:31:18,759 --> 00:31:20,920
of this problem, which I think is the junior developer.

595
00:31:21,039 --> 00:31:23,759
Speaker 1: Yeah, okay, and we'll be right back after these very

596
00:31:23,759 --> 00:31:28,000
important messages. Did you know that you can work with

597
00:31:28,160 --> 00:31:34,000
AWS directly from your IDE? AWS provides toolkits for Visual Studio,

598
00:31:34,319 --> 00:31:38,319
Visual Studio Code, and JetBrains Rider. Learn more at

599
00:31:38,319 --> 00:31:47,200
AWS dot Amazon dot com, slash net, slash tools. And

600
00:31:47,279 --> 00:31:49,680
we're back. It's dot net rocks. I'm Carl Franklin and

601
00:31:49,759 --> 00:31:52,519
I'm Richard Campbell, and that is Mark Seemann, and we're

602
00:31:52,559 --> 00:31:56,160
talking about AI concerns. And just as a reminder, if

603
00:31:56,160 --> 00:31:57,759
you don't want to hear these ads, you can pay

604
00:31:57,799 --> 00:32:01,359
five bucks a month and become a patron at Patreon dot dotnetrocks

605
00:32:01,400 --> 00:32:04,440
dot com. You'll get an ad-free feed.

606
00:32:04,599 --> 00:32:05,440
Take it away, Richard.

607
00:32:05,799 --> 00:32:07,839
Speaker 2: The folks that I'm seeing be successful with these

608
00:32:07,839 --> 00:32:11,240
tools are very experienced developers. Yeah, you know, really they've

609
00:32:11,240 --> 00:32:13,319
spent most... These days, they don't even write a lot

610
00:32:13,359 --> 00:32:15,480
of code, and maybe they do some spikes and things,

611
00:32:15,519 --> 00:32:18,960
but they're mostly supervising a group of developers. They are

612
00:32:19,039 --> 00:32:21,359
the architects, you know, they run at a high level

613
00:32:21,400 --> 00:32:25,279
of responsibility, and they're starting to see these tools act

614
00:32:25,519 --> 00:32:31,400
as inexperienced developers under fairly strict guidance with constant code reviews,

615
00:32:31,400 --> 00:32:35,960
but ultimately productive. And it begs the question: where

616
00:32:35,960 --> 00:32:37,680
does the junior developer go now?

617
00:32:38,119 --> 00:32:42,079
Speaker 1: Right? Are we the last generation of people who came

618
00:32:42,200 --> 00:32:43,519
up as junior developers?

619
00:32:43,759 --> 00:32:44,319
Speaker 2: Yeah?

620
00:32:44,599 --> 00:32:48,079
Speaker 3: Yeah, that's my concern too, because well, I think you

621
00:32:48,119 --> 00:32:51,519
said it pretty well, Richard. I'm not sure that I

622
00:32:51,519 --> 00:32:53,480
have a lot of stuff to add to that.

623
00:32:53,559 --> 00:32:58,039
Speaker 2: Actually, I mean, I am meeting young developers right now

624
00:32:58,079 --> 00:33:00,799
that are pretty freaked out, and I wonder if it's

625
00:33:00,799 --> 00:33:03,279
because we trained them poorly. At this point, like, here

626
00:33:03,279 --> 00:33:05,079
we are at this inflection point where things are changing. And

627
00:33:05,079 --> 00:33:07,160
the funny part is when I have a conversation with

628
00:33:07,200 --> 00:33:10,039
them about solutions, and I'm thinking back to the show

629
00:33:10,759 --> 00:33:13,319
that we did together, Carl, with the Imagine Cup folks.

630
00:33:13,599 --> 00:33:15,359
Speaker 1: Wow, what an inspirational group.

631
00:33:15,519 --> 00:33:19,279
Speaker 2: Phenomenal. But you know what? They didn't care about tool stacks. Yeah,

632
00:33:19,839 --> 00:33:21,519
do you remember, one of the ladies asked us,

633
00:33:21,559 --> 00:33:24,319
like you make a podcast about dot net? Like why

634
00:33:24,480 --> 00:33:25,400
why would you do that?

635
00:33:25,960 --> 00:33:26,319
Speaker 1: Right?

636
00:33:27,200 --> 00:33:31,079
Speaker 2: Right? And I realized, like we've got old thinking. You know,

637
00:33:31,119 --> 00:33:33,559
when it was a nine to twelve month commit to

638
00:33:33,599 --> 00:33:36,240
get to an MVP of a piece of software, you

639
00:33:36,279 --> 00:33:38,400
could spend a couple of weeks arguing over what stack

640
00:33:38,480 --> 00:33:40,960
to use. Right, but with the productivity level that we're

641
00:33:41,000 --> 00:33:44,279
talking about right now, who cares? Just take the tool

642
00:33:44,319 --> 00:33:46,640
out for a spin. You know, there's so many different

643
00:33:47,079 --> 00:33:50,680
there's... Even before the LLMs showed up, it was so

644
00:33:50,799 --> 00:33:53,160
much easier to learn a new programming environment. It was

645
00:33:53,200 --> 00:33:58,599
so much easier to experiment that those times are getting

646
00:33:58,599 --> 00:34:00,200
shorter and shorter, and the stacks are just not that

647
00:34:00,440 --> 00:34:03,519
different from each other. You know, fundamentally, they still draw

648
00:34:03,559 --> 00:34:05,839
on screens and they still communicate over the Internet. Like

649
00:34:05,880 --> 00:34:08,159
a lot of this stuff is the same, and if

650
00:34:08,159 --> 00:34:09,679
you focus on the solution, you're fine.

651
00:34:09,719 --> 00:34:10,280
Speaker 1: Like I don't.

652
00:34:10,360 --> 00:34:14,639
Speaker 2: I wonder if we're not actually growing the right generation,

653
00:34:15,000 --> 00:34:19,199
next generation of software developers, because they are not hung

654
00:34:19,280 --> 00:34:21,440
up on the stuff that we're hung up on. Well,

655
00:34:21,480 --> 00:34:24,000
at the same time, maybe they should be. I mean,

656
00:34:24,199 --> 00:34:26,800
so here's a scenario, and this came from a

657
00:34:26,800 --> 00:34:31,519
real story that I heard from somebody. Somebody is a

658
00:34:31,719 --> 00:34:34,159
back end dot net developer, and a full stack dot

659
00:34:34,199 --> 00:34:36,480
net developer does the front end, does Blazor and all

660
00:34:36,519 --> 00:34:40,320
that stuff. Somebody comes to them and says, hey, we

661
00:34:40,400 --> 00:34:43,119
want to use React for the front end, but still

662
00:34:43,159 --> 00:34:45,320
keep the ASP dot net Core back end.

663
00:34:45,639 --> 00:34:48,679
Speaker 1: Can you do that? And they think, hey, I've got

664
00:34:48,760 --> 00:34:52,440
chat GPT or I've got the agent and they say, yes,

665
00:34:52,760 --> 00:34:57,400
yes I can. They know nothing about React, right, but

666
00:34:57,800 --> 00:35:02,000
they generate all this code and it works. Would you

667
00:35:02,079 --> 00:35:05,960
say yes? Would you say yes, I can do that,

668
00:35:06,199 --> 00:35:08,199
or would you say, no, I think you better get

669
00:35:08,199 --> 00:35:10,840
a React programmer to run the tool.

670
00:35:11,199 --> 00:35:11,960
Speaker 3: Yeah, I wouldn't.

671
00:35:11,960 --> 00:35:13,559
Speaker 2: You wouldn't do it? Yeah? Yeah?

672
00:35:13,639 --> 00:35:16,559
Speaker 1: And if I did it that way, would you accept

673
00:35:16,559 --> 00:35:19,679
the code if I didn't know anything about React?

674
00:35:20,000 --> 00:35:22,599
Speaker 3: Yeah, that's the other problem. So this reminds me

675
00:35:22,639 --> 00:35:24,960
of an experience I had many years ago. So

676
00:35:25,039 --> 00:35:28,800
obviously this is from long before the LLMs.

677
00:35:29,079 --> 00:35:31,559
But I was working with a customer of mine,

678
00:35:31,679 --> 00:35:34,280
trying to teach them to move in small increments and

679
00:35:34,320 --> 00:35:36,400
do test driven development and all these things that I

680
00:35:36,480 --> 00:35:41,079
usually do, and it's actually working pretty well, you know,

681
00:35:41,119 --> 00:35:43,440
trying to also give them an idea about how to

682
00:35:43,440 --> 00:35:45,800
do pull requests and work in this sort of like

683
00:35:46,519 --> 00:35:50,800
quasi open source way of working with the small, small

684
00:35:50,840 --> 00:35:54,440
iterations and all of that. And they hadn't told me

685
00:35:54,519 --> 00:35:58,880
that they actually had an offsite group sitting in another country.

686
00:35:59,239 --> 00:36:02,880
And you know, three weeks into my engagement with this customer,

687
00:36:02,960 --> 00:36:06,840
I get this pull request from hell from this you know,

688
00:36:07,079 --> 00:36:09,599
team sitting in another country because no one had told

689
00:36:09,639 --> 00:36:12,159
them that I was actually now trying to you know,

690
00:36:12,280 --> 00:36:15,639
change the things around, and they hadn't told me about

691
00:36:15,639 --> 00:36:18,159
that system as well. So I get this pull request

692
00:36:18,199 --> 00:36:20,480
and it's just like a you know, fifty thousand lines

693
00:36:20,480 --> 00:36:23,400
of code or something like that, and I'm trying to

694
00:36:23,480 --> 00:36:26,400
get the people that I was working with, you know,

695
00:36:26,480 --> 00:36:29,360
going through and saying well, okay, if you write the code,

696
00:36:29,559 --> 00:36:32,559
we need someone else to review it, and well you

697
00:36:32,559 --> 00:36:34,199
can do it with pair programming, or we can do

698
00:36:34,239 --> 00:36:35,039
it with pull requests.

699
00:36:35,079 --> 00:36:35,719
Speaker 2: I don't really care.

700
00:36:35,719 --> 00:36:38,480
Speaker 3: I just want to have more than one person actually

701
00:36:38,760 --> 00:36:40,480
looking at this code. And then I get this thing

702
00:36:40,519 --> 00:36:43,280
in from the from the outside, and I'm sort of like, okay,

703
00:36:43,280 --> 00:36:45,360
what do I do with this now, because you know,

704
00:36:45,400 --> 00:36:49,639
the usual reaction to something like that is to say, well,

705
00:36:50,320 --> 00:36:52,760
looks good to me, because that's what you always

706
00:36:52,760 --> 00:36:55,480
do with those big, big pull requests.

707
00:36:55,079 --> 00:36:57,400
Speaker 2: The classic sign of I have not read this.

708
00:36:57,159 --> 00:37:01,559
Speaker 3: Exactly, exactly. And now, fortunately, I was actually, you know,

709
00:37:01,599 --> 00:37:04,159
engaged by the CEO of the company, so I know

710
00:37:04,239 --> 00:37:08,760
that I had pretty free, you know, range

711
00:37:08,800 --> 00:37:11,880
of deciding what to do. So I wrote back and said, well, okay,

712
00:37:11,920 --> 00:37:14,199
so I'm really sorry that you weren't in on what

713
00:37:14,320 --> 00:37:15,920
it is that we're doing at the moment,

714
00:37:16,159 --> 00:37:19,119
but I'm actually going to politely decline this pull request

715
00:37:19,119 --> 00:37:21,360
because it's just too much. And I don't know whether

716
00:37:21,360 --> 00:37:23,519
it works. And it's not that I don't trust you

717
00:37:23,559 --> 00:37:25,480
in the sense that I think you have you know,

718
00:37:25,840 --> 00:37:29,079
ill intent, but I don't even trust myself to write

719
00:37:29,280 --> 00:37:33,039
you know, flawless code. So that's why we need someone

720
00:37:33,119 --> 00:37:35,400
else to actually review it, because it's part of

721
00:37:35,440 --> 00:37:39,519
this whole you know, process of figuring out does the code,

722
00:37:39,559 --> 00:37:42,039
does the software actually work as intended? Does the code

723
00:37:42,079 --> 00:37:43,960
do what it is that we wanted to do, And

724
00:37:44,000 --> 00:37:46,079
we can't do that if you just give me, you know,

725
00:37:46,159 --> 00:37:49,119
all of that in one go. So I said, well,

726
00:37:49,159 --> 00:37:50,639
I'm not going to take this one. But on the

727
00:37:50,639 --> 00:37:52,719
other hand, you still have all the code, so I'll

728
00:37:52,840 --> 00:37:54,519
work with you and try to break it down into

729
00:37:54,519 --> 00:37:55,960
smaller pieces and we can get.

730
00:37:55,800 --> 00:37:56,400
Speaker 2: It in that way.

731
00:37:56,440 --> 00:38:00,199
Speaker 3: So we sort of made that work. But my point

732
00:38:00,239 --> 00:38:02,760
here is that we're sort of in that situation now

733
00:38:02,840 --> 00:38:07,440
where we do get you know, something that reminds us

734
00:38:07,559 --> 00:38:09,559
of you know, the pull request from Hell. But it's

735
00:38:09,599 --> 00:38:12,320
just like it's not written by a person anymore. It's

736
00:38:12,360 --> 00:38:17,159
just now it's written by some statistical system. And

737
00:38:17,199 --> 00:38:19,519
then if you just get all of that code in

738
00:38:19,559 --> 00:38:22,480
one big, you know, chunk, you don't really... you can't

739
00:38:22,519 --> 00:38:26,480
really fit it in your head, and then you.

740
00:38:26,559 --> 00:38:31,079
Speaker 2: You just have it submit a series of smaller requests, exactly.

741
00:38:31,559 --> 00:38:33,519
Speaker 3: And if you can get these systems to

742
00:38:33,599 --> 00:38:37,119
work in that way, which I suppose you could, then yeah,

743
00:38:37,199 --> 00:38:40,880
that would probably help. So

744
00:38:41,000 --> 00:38:44,639
that's actually... that may be

745
00:38:44,719 --> 00:38:48,920
showing us a way out: trying to work

746
00:38:48,960 --> 00:38:52,719
with LLMs as though they were, you know, contributors on

747
00:38:53,000 --> 00:38:55,159
an open source project and then try to tease them

748
00:38:55,159 --> 00:38:57,159
to do small increments that you can review.

749
00:38:57,280 --> 00:38:59,159
Speaker 2: Well, that's certainly the way I look at it, because

750
00:38:59,159 --> 00:39:01,400
it's yeah, you know, more and more we're in a

751
00:39:01,400 --> 00:39:03,400
situation where all you can do is see the code.

752
00:39:03,480 --> 00:39:06,039
You don't really see the person, and you certainly have

753
00:39:06,079 --> 00:39:08,559
no way to measure the qualifications. Let's face it, we've

754
00:39:08,679 --> 00:39:11,440
almost never had a way to measure qualifications in software

755
00:39:11,760 --> 00:39:14,079
that was meaningful in any way. In the end, the

756
00:39:14,119 --> 00:39:17,079
code had to speak, and through it, you

757
00:39:17,159 --> 00:39:21,760
have to engage with the person. Yeah, you know, there's

758
00:39:21,800 --> 00:39:23,920
an argument here that looks just like, oh, you don't

759
00:39:23,960 --> 00:39:26,840
have a PhD in comp sci, you can't contribute to our

760
00:39:26,760 --> 00:39:28,199
Speaker 1: Project, right, which is silly.

761
00:39:28,719 --> 00:39:31,440
Speaker 2: In the end, let the code speak, and if you

762
00:39:31,440 --> 00:39:34,000
can insist that the tool delivers the code in a

763
00:39:34,039 --> 00:39:37,119
form that is viable for you to validate, because in

764
00:39:37,159 --> 00:39:40,480
the end, it's your butt on the line, right, you

765
00:39:40,519 --> 00:39:43,039
are the professional engineer. You're going to sign off on this,

766
00:39:43,679 --> 00:39:45,599
then you have a chance of being able to use

767
00:39:45,599 --> 00:39:49,360
these tools. And I you know, this seems like the

768
00:39:49,400 --> 00:39:52,639
most solvable problem in the LLM space compared to what

769
00:39:52,679 --> 00:39:55,599
people are talking about or playing with doing outside

770
00:39:55,599 --> 00:39:59,239
of software. Like, at least software has pretty good tools

771
00:39:59,239 --> 00:40:04,000
and governance. We already have a method of doing distributed programming.

772
00:40:04,079 --> 00:40:06,559
Oh yeah, oh yeah, oh yeah, name other industries that

773
00:40:06,599 --> 00:40:07,920
are even close to this ability.

774
00:40:08,039 --> 00:40:11,800
Speaker 1: Yeah, before we leave software, just there's another gotcha for

775
00:40:11,880 --> 00:40:15,719
junior programmers that more experienced programmers won't necessarily have. And

776
00:40:15,760 --> 00:40:18,559
I've talked about this on dot Net Rocks before, which

777
00:40:18,639 --> 00:40:22,960
is a junior developer will ask a question that they

778
00:40:23,000 --> 00:40:24,920
think is the right question to ask when it might

779
00:40:24,960 --> 00:40:27,679
not be. So they'll ask a question, you know, well,

780
00:40:27,719 --> 00:40:31,840
they'll say something like, please make me a you know,

781
00:40:31,920 --> 00:40:37,360
a thread safe list component or list control, like a

782
00:40:37,400 --> 00:40:40,519
thread safe list class, right, that I can use,

783
00:40:40,519 --> 00:40:43,199
that's completely thread safe, with locking and all that stuff.

784
00:40:44,199 --> 00:40:47,039
And that's the wrong question to ask, because in dot net anyway,

785
00:40:47,119 --> 00:40:50,679
there is one in the framework. So the first question

786
00:40:50,960 --> 00:40:53,599
that should be asked is, hey, is there a way

787
00:40:53,880 --> 00:40:56,119
that I can use a list in a thread safe manner?

788
00:40:56,559 --> 00:40:58,920
And then you know, if the thing is worth anything,

789
00:40:58,960 --> 00:41:01,480
it will say, yeah, well, there's the thread safe collection, right,

790
00:41:02,800 --> 00:41:07,119
But instead a junior programmer might go down the very

791
00:41:07,159 --> 00:41:11,320
difficult path of doing one themselves because they don't know

792
00:41:11,400 --> 00:41:15,679
what else is available. So whereas an experienced developer would

793
00:41:15,880 --> 00:41:19,519
know and not ask that question, and a junior

794
00:41:19,519 --> 00:41:22,639
developer would not, most likely. The question is, isn't this

795
00:41:22,760 --> 00:41:23,480
very teachable?

796
00:41:23,639 --> 00:41:23,840
Speaker 3: Yeah?

797
00:41:23,880 --> 00:41:26,320
Speaker 1: Sure it is, but how many hours is the

798
00:41:26,360 --> 00:41:28,960
junior developer going to waste working on something that when

799
00:41:29,000 --> 00:41:30,840
they you know, check it in you say, hey, you

800
00:41:30,880 --> 00:41:32,280
know that there is something like this?

801
00:41:32,599 --> 00:41:35,719
Speaker 2: Yeah, again, there's a teachable moment around. Make sure you

802
00:41:35,760 --> 00:41:39,159
ask the question what already exists and you know, yeah, we.

803
00:41:39,199 --> 00:41:42,679
Speaker 3: I still imagine a future version of large language models will

804
00:41:42,719 --> 00:41:45,840
probably be able to you know, loop in that kind

805
00:41:45,880 --> 00:41:50,199
of questioning saying, oh you're asking about this, have you

806
00:41:50,679 --> 00:41:54,360
do you really want, you know, a ground-up implementation

807
00:41:54,480 --> 00:41:56,599
here or can you use the one that already exists?

808
00:41:56,599 --> 00:41:59,920
Have you looked into the framework and seen whether there

809
00:42:00,039 --> 00:42:03,039
is a reusable component? I mean, I could probably imagine

810
00:42:03,079 --> 00:42:05,119
that even if they don't do that right now, they

811
00:42:05,119 --> 00:42:08,840
could. They actually probably do. Yeah, yeah, yeah,

812
00:42:08,880 --> 00:42:12,000
because it's a fairly common question to ask

813
00:42:12,079 --> 00:42:15,360
if you're a senior developer anyway, so... Yeah.

814
00:42:15,079 --> 00:42:18,480
Speaker 1: You may need your own implementation because it may need

815
00:42:18,519 --> 00:42:21,800
features that the base class doesn't have or isn't extendable to.

816
00:42:22,480 --> 00:42:23,639
Speaker 3: Sure, but usually not.

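To make Carl's example concrete, here's a minimal sketch of the two paths in dot net. The ThreadSafeList wrapper below is hypothetical, written only to show the lock-juggling a junior developer takes on by not first asking what already exists; System.Collections.Concurrent ships that answer in the framework.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

// The question a senior developer asks first: what already exists?
// ConcurrentBag<T> (or ConcurrentQueue<T>, ConcurrentDictionary<K,V>)
// handles all the locking internally.
var bag = new ConcurrentBag<int>();
Parallel.For(0, 1000, i => bag.Add(i));
Console.WriteLine(bag.Count); // 1000, with no hand-written locks

// The detour: a hypothetical hand-rolled wrapper a junior developer
// might spend hours writing, and still get subtly wrong at the edges.
public class ThreadSafeList<T>
{
    private readonly List<T> _items = new();
    private readonly object _gate = new();

    // Every operation has to remember to take the same lock.
    public void Add(T item) { lock (_gate) _items.Add(item); }

    // Enumeration needs a snapshot, or callers race with writers.
    public T[] Snapshot() { lock (_gate) return _items.ToArray(); }
}
```
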
817
00:42:23,880 --> 00:42:27,639
Speaker 2: But maybe now... you're already seeing in

818
00:42:27,679 --> 00:42:31,440
these tools that you can put in pre-prompts,

819
00:42:31,480 --> 00:42:34,679
like this should be included with every prompt, right so

820
00:42:34,679 --> 00:42:37,960
that you could see the idea of an enterprise group

821
00:42:38,559 --> 00:42:41,840
setting a set of rules that any prompted code generation

822
00:42:42,239 --> 00:42:44,519
has to follow. But they don't always follow

823
00:42:44,519 --> 00:42:45,599
the rules. That's the problem.

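As a sketch of the enterprise pre-prompt idea Richard describes, the snippet below pins a fixed rule set as the system message on every code-generation request. The rule text and model name are assumptions, and the endpoint is the standard OpenAI-style chat completions API; as Carl points out next, the model can still ignore the rules, so this constrains prompts, it doesn't guarantee output.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

// Hypothetical enterprise-wide rules, prepended to every request so
// individual developers can't forget (or skip) them.
const string Rules =
    "Prefer types that already exist in the .NET base class library " +
    "(e.g. System.Collections.Concurrent) over custom implementations. " +
    "Keep every change small enough to review as a single pull request.";

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Bearer", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// Standard chat-completions payload: the fixed rules ride along as the
// system message; only the user message varies per request.
var payload = JsonSerializer.Serialize(new
{
    model = "gpt-4o", // assumed model name
    messages = new object[]
    {
        new { role = "system", content = Rules },
        new { role = "user", content = "Make me a thread safe list class." }
    }
});

var response = await http.PostAsync(
    "https://api.openai.com/v1/chat/completions",
    new StringContent(payload, Encoding.UTF8, "application/json"));

Console.WriteLine(await response.Content.ReadAsStringAsync());
```
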
824
00:42:45,920 --> 00:42:48,480
Speaker 1: I don't know if you've noticed this, but in my

825
00:42:48,559 --> 00:42:51,079
system prompts and in my user prompt if I say

826
00:42:51,159 --> 00:42:55,079
don't do this, sometimes it will anyway. And that's just

827
00:42:55,320 --> 00:42:56,840
that's just the way it goes.

828
00:42:57,119 --> 00:42:59,719
Speaker 2: Yeah, and again, it'll often skip things as well if

829
00:42:59,760 --> 00:43:03,400
it gets too complex, so you still see some development

830
00:43:03,440 --> 00:43:06,440
needs to be done, more work on validating

831
00:43:06,639 --> 00:43:07,559
the output.

832
00:43:07,719 --> 00:43:11,519
Speaker 1: Right. Yeah, all right, well, I'm ready for hara-kiri,

833
00:43:11,559 --> 00:43:12,119
are you guys?

834
00:43:13,559 --> 00:43:16,960
Speaker 2: I'm actually really excited about all this because it does

835
00:43:17,000 --> 00:43:21,280
seem to empower more people to build software as these

836
00:43:21,280 --> 00:43:27,039
tools mature and they get more reliable. They say they're

837
00:43:27,079 --> 00:43:28,960
as bad as they're going to be right now. I

838
00:43:29,000 --> 00:43:32,559
don't see an exponential growth here. You know, we've basically

839
00:43:32,559 --> 00:43:34,840
indexed all the Internet into these models. As it is,

840
00:43:34,920 --> 00:43:38,320
there is no more data to consume, and so far,

841
00:43:38,960 --> 00:43:42,920
training against stuff generated by these tools is degenerative. It

842
00:43:43,000 --> 00:43:44,320
makes it worse, not better.

843
00:43:44,480 --> 00:43:46,639
Speaker 1: Yeah, and most of the time that's not going to happen.

844
00:43:46,880 --> 00:43:49,599
Like I've got confirmation that if I have a private

845
00:43:49,639 --> 00:43:54,840
repo and I use the GitHub Copilot agent to generate code,

846
00:43:55,320 --> 00:43:58,360
it's not going to train their models. They're not going

847
00:43:58,400 --> 00:44:00,400
to train their models on the code that it generates.

848
00:44:00,480 --> 00:44:03,400
In other words, my code isn't going to leak out

849
00:44:03,480 --> 00:44:08,639
into the ether and you know somebody else is using it.

850
00:44:09,119 --> 00:44:12,559
That's just a promise by GitHub. I don't know if it's true,

851
00:44:12,639 --> 00:44:13,679
but that's a promise.

852
00:44:13,760 --> 00:44:16,239
Speaker 3: But even if it's true now, you don't know what's going

853
00:44:16,239 --> 00:44:17,159
to happen in the future.

854
00:44:17,800 --> 00:44:18,760
Speaker 2: Yeah, well but.

855
00:44:19,079 --> 00:44:24,679
Speaker 1: You mean when Oracle buys GitHub? Uh huh.

856
00:44:23,519 --> 00:44:26,320
Speaker 3: Pretty sure it's not for sale. You know, on a completely

857
00:44:26,400 --> 00:44:28,599
unrelated, you know, note, there were people who

858
00:44:28,639 --> 00:44:31,320
were submitting their DNA samples to, you know,

859
00:44:31,400 --> 00:44:34,039
23andMe, and I'm so happy I never

860
00:44:34,079 --> 00:44:36,320
did that because even back, you know, ten years ago,

861
00:44:36,360 --> 00:44:40,039
I thought, that's not data that I want, you know,

862
00:44:40,400 --> 00:44:44,760
sitting in someone else's, you know, repository that I can't control.

863
00:44:44,800 --> 00:44:47,239
Speaker 1: And yeah, too late, I've already cloned you.

864
00:44:48,760 --> 00:44:49,119
Speaker 2: Indeed.

865
00:44:49,719 --> 00:44:52,199
Speaker 3: Yeah, so I think we should be, you know, a

866
00:44:52,239 --> 00:44:56,079
little bit careful with trusting these things, you know, because

867
00:44:56,079 --> 00:44:58,280
things change, you know, so even if you try to

868
00:44:58,280 --> 00:45:01,719
trust an entity like Microsoft or GitHub, you might

869
00:45:01,760 --> 00:45:03,480
not want to trust it forever.

870
00:45:03,679 --> 00:45:05,719
Speaker 1: But then again, Mark, you know, how long will it

871
00:45:05,760 --> 00:45:08,719
be before your iPhone twenty four will be able to

872
00:45:08,760 --> 00:45:12,559
sequence your genome just by taking a picture of a

873
00:45:12,800 --> 00:45:15,280
hair follicle? Yeah?

874
00:45:15,519 --> 00:45:18,400
Speaker 3: Yeah, maybe you should go back... you probably have a

875
00:45:18,480 --> 00:45:21,519
Nokia here somewhere lying around that still works. You should

876
00:45:21,519 --> 00:45:24,440
go back to those. That's where we'll all kind of end up.

877
00:45:25,840 --> 00:45:27,840
Speaker 2: Yeah, I saw a modern flip phone the other day.

878
00:45:27,880 --> 00:45:30,440
It wasn't like... it was still an LCD, right? Like,

879
00:45:30,480 --> 00:45:32,599
it wasn't the... I was very tempted.

880
00:45:32,760 --> 00:45:37,360
Speaker 1: Yeah, the Samsung Z Flip. Well, there's an Android phone

881
00:45:37,360 --> 00:45:39,079
that flips up and down and I have one of those.

882
00:45:39,159 --> 00:45:41,360
Speaker 2: Yeah yeah, but those are all smartphones. I'm

883
00:45:41,360 --> 00:45:45,239
talking about a full retro flip phone. Oh wow, yeah, wow,

884
00:45:45,880 --> 00:45:48,880
it was exciting. You do have those retro urges without

885
00:45:48,880 --> 00:45:52,119
a doubt. Sure. Again, I feel like

886
00:45:52,159 --> 00:45:54,440
the programming situation is the best case scenario. I think

887
00:45:54,480 --> 00:45:58,119
we're more cautious or more familiar with these models of

888
00:45:58,360 --> 00:46:01,280
needing to validate and so forth. The concern space is

889
00:46:01,320 --> 00:46:04,119
pretty much everything else happening in LLMs. Yeah, like you

890
00:46:04,159 --> 00:46:09,679
get back to the computer-says-so stuff. Yeah yeah. But

891
00:46:09,719 --> 00:46:12,960
I also think we've gone through that. We used to

892
00:46:13,000 --> 00:46:16,519
believe everything Google said too, Like this is just yet

893
00:46:16,559 --> 00:46:21,039
another learning pattern that you have to go through the

894
00:46:21,119 --> 00:46:24,840
experience of realizing these tools are based on knowledge we

895
00:46:24,960 --> 00:46:27,920
have and a lot of that knowledge is inaccurate and

896
00:46:28,000 --> 00:46:31,199
so when you you know, quote it verbatim, you are

897
00:46:31,239 --> 00:46:31,880
often wrong.

898
00:46:32,039 --> 00:46:37,440
Speaker 1: So maybe Mark your critique is more of the population

899
00:46:37,920 --> 00:46:39,559
than of the AI tools.

900
00:46:39,639 --> 00:46:41,320
Speaker 3: Right, Oh, absolutely, you know.

901
00:46:41,360 --> 00:46:43,960
Speaker 1: Can we trust people to do the right thing with these?

902
00:46:44,119 --> 00:46:48,880
And yeah, my answer is people are driven by incentives,

903
00:46:49,039 --> 00:46:53,880
and if the economic incentives outweigh the moral incentives or

904
00:46:53,920 --> 00:46:57,880
the ethical incentives, guess which one wins. It's just that simple.

905
00:46:58,119 --> 00:47:02,119
Speaker 3: Yeah, yeah, that's a bleak view. But I think

906
00:47:02,159 --> 00:47:04,719
I'm probably agreeing with you there.

907
00:47:04,960 --> 00:47:08,159
Speaker 1: Unfortunately, Well, it comes down to the developer who says

908
00:47:08,239 --> 00:47:12,079
yes to developing something with an AI in a language

909
00:47:12,119 --> 00:47:12,920
they don't understand.

910
00:47:13,000 --> 00:47:15,639
Speaker 3: Oh yeah. And it's not that it's a moral

911
00:47:15,760 --> 00:47:19,280
excuse or anything. But usually what actually does happen is

912
00:47:19,320 --> 00:47:22,119
if you say no, someone else will say yes. So

913
00:47:22,199 --> 00:47:25,880
it's going to happen anyway. And again,

914
00:47:25,920 --> 00:47:28,719
it's not an excuse for doing something that's unethical. But

915
00:47:28,719 --> 00:47:32,639
but still from a you know, a high level, you know,

916
00:47:33,039 --> 00:47:36,760
just looking at society overall, that's still the mechanism that's

917
00:47:36,760 --> 00:47:38,480
going to happen. You know, someone will do it.

918
00:47:38,800 --> 00:47:42,079
Speaker 1: Yeah, I wouldn't say yes, just because I'd code myself

919
00:47:42,079 --> 00:47:42,920
into a corner, you know.

920
00:47:43,039 --> 00:47:45,480
Speaker 3: Yeah, but you also, you're senior, just

921
00:47:45,599 --> 00:47:48,559
like, you know, Richard is, and what Richard talked

922
00:47:48,559 --> 00:47:51,000
about you know. You can actually have success with these

923
00:47:51,039 --> 00:47:52,920
things if you, you know, if you know

924
00:47:52,960 --> 00:47:56,440
enough about programming, and even if you've never seen

925
00:47:56,440 --> 00:48:01,400
a language before, you've seen other languages that are similar

926
00:48:01,519 --> 00:48:04,119
enough that you can probably still get a good sense

927
00:48:04,119 --> 00:48:06,639
of it, if you ask it to write... you know, if

928
00:48:06,639 --> 00:48:08,960
you don't know Go. I don't know Go. If I

929
00:48:09,000 --> 00:48:12,039
ask, you know, an LLM to write me something in Go,

930
00:48:12,079 --> 00:48:14,320
I would probably have a fairly good idea about what

931
00:48:14,400 --> 00:48:16,840
it is that it produced, and I'd have to look

932
00:48:16,920 --> 00:48:19,239
up a few things and so on. But again the

933
00:48:19,280 --> 00:48:22,119
problem is if you're not even a programmer from the beginning,

934
00:48:23,440 --> 00:48:25,440
or if you're a junior, as we talked about, then

935
00:48:25,519 --> 00:48:26,920
that gets a lot harder.

936
00:48:27,079 --> 00:48:30,360
Speaker 1: But still, if it was, you know, a language like React or,

937
00:48:30,480 --> 00:48:33,840
you know, JavaScript... you know, if React is going

938
00:48:33,920 --> 00:48:36,199
to do something, how do I know that it did

939
00:48:36,199 --> 00:48:39,119
it the right way or the most efficient way, or

940
00:48:39,480 --> 00:48:40,880
you know that there isn't a better way?

941
00:48:41,239 --> 00:48:42,239
Speaker 2: I don't know. Yeah.

942
00:48:42,320 --> 00:48:44,360
Speaker 1: Yeah, So what I would do is I would hire

943
00:48:44,400 --> 00:48:47,880
a subcontractor that does React, and I would encourage them

944
00:48:47,920 --> 00:48:53,639
to use the LLM because of the agents, because they'll

945
00:48:53,679 --> 00:48:56,039
be more productive, sure, especially if they're charging me by

946
00:48:56,079 --> 00:48:59,199
the hour. That's another question. How ethical is it to

947
00:48:59,320 --> 00:49:01,679
charge by the project versus by the hour?

948
00:49:01,800 --> 00:49:04,840
Speaker 3: Now fair enough, but still it comes down to accountability

949
00:49:04,880 --> 00:49:08,079
because if you hire a subcontractor, you need to

950
00:49:08,119 --> 00:49:12,079
trust that subcontractor to actually do the right thing,

951
00:49:12,920 --> 00:49:16,480
and, absolutely, and that's another

952
00:49:16,519 --> 00:49:19,199
problem that we tend to have with AIs in general

953
00:49:19,280 --> 00:49:22,960
is they're not really accountable right now. We don't have

954
00:49:23,000 --> 00:49:26,920
any laws that govern, you know, who has responsibility

955
00:49:26,960 --> 00:49:29,559
for the output of them. And as Richard said, well,

956
00:49:29,760 --> 00:49:32,360
we do know how to deal with software development,

957
00:49:32,559 --> 00:49:35,039
but if we're looking at the broader picture of just

958
00:49:35,880 --> 00:49:37,840
you know, asking it to do all sorts of other

959
00:49:37,920 --> 00:49:40,159
tasks for us in the rest of the world,

960
00:49:40,239 --> 00:49:43,039
you know, outside of software, you know who's accountable then,

961
00:49:43,119 --> 00:49:46,920
and we don't know. They're not. So we're

962
00:49:47,159 --> 00:49:49,000
sort of stuck with, you know whatever.

963
00:49:50,719 --> 00:49:54,920
Speaker 1: We saw companies using their AI bots as an excuse,

964
00:49:56,159 --> 00:49:59,400
uh, when the bot gave them bad advice. Remember that one, Richard?

965
00:49:59,400 --> 00:50:02,039
I think it was an airline thing, with a refund or something.

966
00:50:02,320 --> 00:50:05,880
Speaker 2: Yeah, that was the Air Canada incident. Yeah, where

967
00:50:06,079 --> 00:50:09,199
a bot told a customer, oh, you'll be able to

968
00:50:09,199 --> 00:50:11,199
get a refund on that. So they went ahead and

969
00:50:11,239 --> 00:50:12,519
did the thing. When they went to get the refund,

970
00:50:12,519 --> 00:50:15,159
they were refused and ultimately ended up in front of

971
00:50:15,159 --> 00:50:17,679
a judge and the judge said, use the bot as

972
00:50:17,679 --> 00:50:19,480
if it was an employee. If an employee said that,

973
00:50:19,519 --> 00:50:22,320
you'd have to make it true. So the bot qualifies. Yeah,

974
00:50:23,239 --> 00:50:25,559
and they had to issue the refund and it just

975
00:50:25,800 --> 00:50:29,559
was, you know... The interesting part was that

976
00:50:29,639 --> 00:50:34,320
Air Canada had made a publicly accessible tool that early on. Yeah,

977
00:50:34,400 --> 00:50:37,079
and at least in Canada now has set a piece

978
00:50:37,079 --> 00:50:40,199
of case law in place. Not a bad thing. I'm

979
00:50:40,199 --> 00:50:43,559
not unhappy with that outcome. No, that's I don't want them. Yeah,

980
00:50:43,920 --> 00:50:47,559
cautionary tale to other companies, right, test and be prepared

981
00:50:47,599 --> 00:50:49,480
to pay for the consequences exactly.

982
00:50:49,599 --> 00:50:52,079
Speaker 3: So that also means that if you can sort of

983
00:50:52,159 --> 00:50:54,679
sense as a customer that you have

984
00:50:54,679 --> 00:50:57,239
an LLM at the other end, you just keep asking it,

985
00:50:57,719 --> 00:51:00,599
you know, variations of the same questions until you get

986
00:51:00,639 --> 00:51:04,960
something you like and then you pounce on that. Oh yeah,

987
00:51:05,000 --> 00:51:06,199
I'll take that deal, thank you.

988
00:51:07,639 --> 00:51:10,519
Speaker 1: Yeah. I remember asking one, are you a bot? And

989
00:51:10,519 --> 00:51:13,719
it said no, my name is whatever from blah blah blah.

990
00:51:13,840 --> 00:51:15,639
Speaker 2: Yeah, sure, yeah.

991
00:51:15,559 --> 00:51:17,840
Speaker 3: I think we should. I think there should be a

992
00:51:17,880 --> 00:51:21,920
law that says, well, they are not allowed to impersonate humans, but.

993
00:51:22,159 --> 00:51:22,719
Speaker 1: You should know.

994
00:51:22,920 --> 00:51:24,639
Speaker 2: I have to wonder if we had done a better

995
00:51:24,719 --> 00:51:26,880
job on privacy in the first place, if we wouldn't

996
00:51:26,880 --> 00:51:29,559
be dealing with quite as many issues as we've got. Yeah,

997
00:51:29,679 --> 00:51:32,239
that doesn't mean we shouldn't continue to try. Like, when

998
00:51:33,119 --> 00:51:35,039
we talk about the challenges of the gen Z and

999
00:51:35,079 --> 00:51:39,960
younger generations, it's this using despair as an excuse not to try. Yeah. Hey,

1000
00:51:40,000 --> 00:51:44,280
it's not an excuse. You have to continue to be

1001
00:51:44,400 --> 00:51:48,800
the best person you can be. Be the best programmer, dancer, electrician,

1002
00:51:48,960 --> 00:51:51,119
whatever it is you are. Be the best you can

1003
00:51:51,159 --> 00:51:53,760
possibly be, and use the tools to your advantage. Don't

1004
00:51:53,800 --> 00:51:59,199
become a tool yourself. All right. All in all,

1005
00:51:59,199 --> 00:52:00,000
I'm pretty optimistic.

1006
00:52:01,000 --> 00:52:02,679
Speaker 1: Yeah, I actually am too.

1007
00:52:03,079 --> 00:52:09,119
Speaker 3: Well, I'm not, but that's just my disposition.

1008
00:52:08,639 --> 00:52:10,920
Speaker 2: I see some patterns happening again. It's like, oh boy,

1009
00:52:10,960 --> 00:52:13,119
we get to learn this problem again. Oh you've been

1010
00:52:13,119 --> 00:52:15,679
trusting software. Huh, All right, here we go.

1011
00:52:15,840 --> 00:52:17,760
Speaker 1: We're gonna have to trust but you know what, the

1012
00:52:17,840 --> 00:52:20,199
chickens will come home to roost, like we've said, Richard,

1013
00:52:20,360 --> 00:52:22,840
and you know the rest of the people will wake

1014
00:52:22,920 --> 00:52:24,679
up and say, oh we need to do this, we

1015
00:52:24,719 --> 00:52:27,079
need to do that. We can't just rely on these things.

1016
00:52:27,679 --> 00:52:30,599
So yeah, there will be some pain for sure. Yeah,

1017
00:52:30,719 --> 00:52:34,960
but ultimately it'll come down to people and the decisions they make. Yeah, yeah. Okay,

1018
00:52:35,360 --> 00:52:36,000
is that a show?

1019
00:52:36,360 --> 00:52:37,159
Speaker 2: I think it's a show?

1020
00:52:37,320 --> 00:52:41,199
Speaker 1: All right, Mark, thank you. It's always awesome talking

1021
00:52:41,239 --> 00:52:41,519
to you.

1022
00:52:41,840 --> 00:52:44,039
Speaker 3: It's a pleasure, all right. Thank you for having me.

1023
00:52:44,199 --> 00:52:46,480
Speaker 1: You bet, and we'll see you next time on dot

1024
00:52:46,519 --> 00:53:09,039
net rocks. Dot net Rocks is brought to you by

1025
00:53:09,119 --> 00:53:13,800
Franklin's Net and produced by Pwop Studios, a full service audio,

1026
00:53:13,920 --> 00:53:18,360
video and post production facility located physically in New London, Connecticut,

1027
00:53:18,599 --> 00:53:23,400
and of course in the cloud online at pwop dot com.

1028
00:53:23,599 --> 00:53:25,760
Speaker 5: Visit our website at d O T N E T

1029
00:53:25,960 --> 00:53:30,000
R O c k S dot com for RSS feeds, downloads,

1030
00:53:30,159 --> 00:53:33,840
mobile apps, comments, and access to the full archives going

1031
00:53:33,880 --> 00:53:37,239
back to show number one, recorded in September two thousand

1032
00:53:37,280 --> 00:53:37,519
and two.

1033
00:53:38,159 --> 00:53:40,480
Speaker 1: And make sure you check out our sponsors. They keep

1034
00:53:40,559 --> 00:53:43,760
us in business. Now, go write some code, see you

1035
00:53:43,800 --> 00:53:44,199
next time.

1036
00:53:45,119 --> 00:53:46,960
Speaker 3: You got jam Vans

1037
00:53:49,000 --> 00:54:01,519
Speaker 2: And for the pass

