1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to dot net rocks with

2
00:00:03,040 --> 00:00:03,799
no ads?

3
00:00:04,440 --> 00:00:04,839
Speaker 2: Easy?

4
00:00:05,360 --> 00:00:08,560
Speaker 1: Become a patron for just five dollars a month. You

5
00:00:08,599 --> 00:00:11,320
get access to a private RSS feed where all the

6
00:00:11,359 --> 00:00:14,599
shows have no ads. Twenty dollars a month, we'll get

7
00:00:14,599 --> 00:00:17,679
you that and a special dot net Rocks patron mug.

8
00:00:18,160 --> 00:00:34,479
Sign up now at Patreon dot dotnetrocks dot com.

9
00:00:34,479 --> 00:00:39,759
Happy Chrismahanakwanzaka! It's dot net Rocks. I'm Carl Franklin, and.

10
00:00:39,679 --> 00:00:42,039
Speaker 2: I'm Richard Campbell, and we timeshift. So this is actually being

11
00:00:42,039 --> 00:00:43,039
published in January.

12
00:00:43,159 --> 00:00:45,679
Speaker 1: That's right, January ninth. So we hope you had a

13
00:00:45,679 --> 00:00:46,799
happy holiday anyway.

14
00:00:46,960 --> 00:00:48,000
Speaker 2: Yeah, I'm sure it was great.

15
00:00:48,119 --> 00:00:51,200
Speaker 1: Yeah, we haven't had ours yet as of this recording,

16
00:00:51,439 --> 00:00:55,600
It's now December nineteenth, but you know it's going to be

17
00:00:55,600 --> 00:00:57,960
a good show. Our old friend Thomas Betts is with us.

18
00:00:58,439 --> 00:01:01,479
And before I get into better know framework, I just

19
00:01:01,520 --> 00:01:03,399
want to check in with you, Richard Campbell and see

20
00:01:04,000 --> 00:01:06,799
how your holiday is shaping up.

21
00:01:06,920 --> 00:01:10,000
Speaker 2: Oh, we're gonna... shortly after this, like this weekend,

22
00:01:10,000 --> 00:01:12,879
we'll go to the city for the week. Lots of

23
00:01:12,879 --> 00:01:15,079
friends to visit, a few different parties, going to spend

24
00:01:15,120 --> 00:01:18,200
time with the girls. It's probably leaked out

25
00:01:18,239 --> 00:01:22,159
by now. The youngest is pregnant, so she's like, hey,

26
00:01:22,159 --> 00:01:24,799
can I borrow one of your old winter coats because

27
00:01:24,920 --> 00:01:26,400
mine don't fit anymore.

28
00:01:27,680 --> 00:01:35,359
Speaker 1: Yeah. Kelly's daughter had a daughter and she turned one

29
00:01:35,599 --> 00:01:38,480
last week. Oh yeah, right, so we had, you know,

30
00:01:38,599 --> 00:01:43,400
the obligatory smash the cupcake in your face birthday party.

31
00:01:43,519 --> 00:01:44,079
Speaker 2: That's the thing.

32
00:01:44,599 --> 00:01:47,239
Speaker 1: It was pretty awesome. Yeah, all right, well, we've got

33
00:01:47,280 --> 00:01:49,079
a lot of stuff to cover in better know framework.

34
00:01:49,120 --> 00:01:59,040
It's history day here, so roll the music. Aw man,

35
00:01:59,040 --> 00:02:00,760
what do you got? So, I hadn't done this in

36
00:02:00,760 --> 00:02:03,200
a while, but you know, nineteen thirty two, which is

37
00:02:03,239 --> 00:02:07,599
our episode number, was such a jam packed year that

38
00:02:07,719 --> 00:02:11,280
I want to talk about some things that happened. And Thomas,

39
00:02:11,319 --> 00:02:12,960
I know you're there if you have anything you can

40
00:02:13,039 --> 00:02:13,479
jump in.

41
00:02:14,800 --> 00:02:15,120
Speaker 2: You too.

42
00:02:15,280 --> 00:02:20,159
Speaker 1: Richard, of course you are the history guy. So January fourth,

43
00:02:20,159 --> 00:02:24,759
British colonials in India arrested Mahatma Gandhi and put him

44
00:02:24,759 --> 00:02:30,680
in prison. January twelfth, US elected their first female senator,

45
00:02:30,960 --> 00:02:37,039
Hattie Wyatt Caraway, who represented Arkansas. February seventh, NASA astronaut

46
00:02:37,080 --> 00:02:41,400
Alfred M. Worden was born in Jackson, Michigan. He went

47
00:02:41,520 --> 00:02:45,199
on the Apollo fifteen mission, which saw the

48
00:02:45,240 --> 00:02:48,840
first use of a lunar roving vehicle, a little Buggy,

49
00:02:48,879 --> 00:02:49,759
Little Buggy.

50
00:02:50,080 --> 00:02:50,759
Speaker 2: March first.

51
00:02:51,080 --> 00:02:55,919
Speaker 1: Lindbergh plays a big role in nineteen thirty two. The

52
00:02:55,960 --> 00:02:59,000
Lindbergh kidnapping occurred, where the twenty-month-old son of

53
00:02:59,039 --> 00:03:02,680
famous aviator Charles Lindbergh was kidnapped

54
00:03:03,719 --> 00:03:07,879
from their home in East Amwell, New Jersey,

55
00:03:08,520 --> 00:03:11,840
March fourteenth. George Eastman, the inventor of the Kodak camera,

56
00:03:11,960 --> 00:03:18,080
shot himself in the heart, aged seventy seven. March seventeenth,

57
00:03:18,159 --> 00:03:23,759
the German police raided Hitler's Nazi headquarters. April second, famous aviator

58
00:03:23,840 --> 00:03:26,680
Charles Lindbergh paid fifty thousand dollars as a ransom for

59
00:03:26,719 --> 00:03:31,759
his kidnapped son. April nineteenth, President of the United

60
00:03:31,800 --> 00:03:34,800
States Herbert Hoover first suggested the five day work week.

61
00:03:35,759 --> 00:03:36,400
Speaker 2: So you can.

62
00:03:36,280 --> 00:03:40,520
Speaker 1: Blame him for that. What was it before seven days?

63
00:03:40,759 --> 00:03:43,479
Speaker 2: Seven? It was... it was six actually, because he

64
00:03:43,520 --> 00:03:45,840
was supposed to rest on Sundays, right, okay.

65
00:03:46,280 --> 00:03:50,360
Speaker 1: May fourth American gangster Al Capone entered the Atlanta Prison

66
00:03:50,520 --> 00:03:54,240
convicted of income tax evasion. May fifth, Japan and China

67
00:03:54,360 --> 00:03:57,719
signed a peace treaty. May twelfth, the body of famous

68
00:03:57,759 --> 00:04:01,319
aviator Charles Lindbergh's kidnapped son was found in New Jersey.

69
00:04:02,159 --> 00:04:06,599
Ransom didn't do any good, apparently. May twenty first, Amelia

70
00:04:06,639 --> 00:04:10,759
Earhart became the first woman to complete the transatlantic solo flight,

71
00:04:10,840 --> 00:04:15,400
having flown for seventeen hours from Newfoundland, Canada, to Londonderry,

72
00:04:15,439 --> 00:04:20,040
Northern Ireland. June thirteenth, Great Britain and France signed a

73
00:04:20,040 --> 00:04:25,680
peace treaty. July twenty eighth, Douglas MacArthur, acting against US

74
00:04:25,759 --> 00:04:29,800
President Hoover's orders, commanded several attacks on the Bonus Army,

75
00:04:30,279 --> 00:04:33,480
which was World War One veterans and their families, attempting

76
00:04:33,560 --> 00:04:36,759
to evict them from their encampment. At least two veterans

77
00:04:36,800 --> 00:04:41,360
died in the attacks, with fifty five injured. July thirty first,

78
00:04:41,439 --> 00:04:46,360
Nazis gained thirty seven percent in the Reichstag elections in Germany,

79
00:04:46,399 --> 00:04:49,000
becoming the largest party in parliament by a large margin.

80
00:04:50,399 --> 00:04:54,399
August second, American physicist Carl David Anderson discovered and photographed

81
00:04:54,399 --> 00:05:00,079
a positron, the first known antiparticle. And Thomas,

82
00:05:00,160 --> 00:05:03,439
you have some other news about physics, don't you?

83
00:05:03,560 --> 00:05:07,360
Speaker 3: Well, I saw that the nineteen thirty two Nobel in

84
00:05:07,399 --> 00:05:10,800
Physics went to Werner Heisenberg. Ah, he got it a

85
00:05:10,879 --> 00:05:17,000
year before Schrödinger. So all your quantum mechanics and atomic theory.

86
00:05:16,720 --> 00:05:18,560
Speaker 1: A lot of stuff happened in thirty two, all right.

87
00:05:18,600 --> 00:05:21,439
So August sixth the Venice Film Festival, the world's oldest

88
00:05:21,519 --> 00:05:26,319
film festival, opened for the first time. August thirteenth, President

89
00:05:26,480 --> 00:05:29,680
von Hindenburg refused Adolf Hitler when he asked to be

90
00:05:29,720 --> 00:05:34,000
appointed as chancellor, and instead offered Hitler the position of

91
00:05:34,079 --> 00:05:37,399
vice Chancellor of Germany. Hitler refused a position and announced

92
00:05:37,399 --> 00:05:40,959
he would oppose every government not headed by himself until

93
00:05:41,000 --> 00:05:47,240
he was chancellor. August sixteenth. John Lindberg, Charles Lindberg's second son,

94
00:05:47,360 --> 00:05:49,720
was born just five months after the kidnapping and death

95
00:05:49,759 --> 00:05:53,319
of his older brother, Charles Lindberg Junior. Almost done. Here

96
00:05:53,360 --> 00:05:56,439
is September twentieth: in his cell at

97
00:05:56,519 --> 00:06:00,279
Yerwada Jail in Pune, India, Gandhi began a hunger strike against

98
00:06:00,319 --> 00:06:05,720
the treatment of India's lowest classes, known as Untouchables. October two,

99
00:06:05,879 --> 00:06:09,560
the New York Yankees won their twelfth consecutive World Series game.

100
00:06:10,839 --> 00:06:11,120
Speaker 2: Yay.

101
00:06:12,319 --> 00:06:14,759
Speaker 1: As a Red Sox fan, I can still appreciate that.

102
00:06:15,439 --> 00:06:19,399
October third, Iraq gained full independence from Britain and joined

103
00:06:19,439 --> 00:06:24,560
the League of Nations. October ten, the largest hydroelectric power station,

104
00:06:25,319 --> 00:06:33,120
the Dnieper Dam, was first put

105
00:06:33,160 --> 00:06:37,000
into operation in the USSR. It's actually Ukraine, but well

106
00:06:37,199 --> 00:06:40,319
it is now, right? Always was. Was it USSR?

107
00:06:40,360 --> 00:06:42,680
Speaker 2: Then? Well, okay, it was always in Ukraine. Okay.

108
00:06:43,319 --> 00:06:46,800
Speaker 1: November sixth, the Prime Minister of Italy, Benito Mussolini, introduced

109
00:06:46,800 --> 00:06:51,600
an amnesty decree freeing thousands of convicts. December fifth, Albert

110
00:06:51,639 --> 00:06:54,240
Einstein was granted a visa to enter the United States,

111
00:06:54,279 --> 00:06:58,519
and December twenty seventh, Radio City Music Hall first opened

112
00:06:58,600 --> 00:06:59,399
in New York.

113
00:06:59,680 --> 00:06:59,839
Speaker 3: Sit.

114
00:07:01,360 --> 00:07:03,759
Speaker 1: And that's just a little bit of what happened in

115
00:07:03,839 --> 00:07:07,160
nineteen thirty two. A lot of very famous people that

116
00:07:07,199 --> 00:07:10,120
you know were born in nineteen thirty two. But I'm

117
00:07:10,160 --> 00:07:12,439
not going to go down there. It's sort of the threshold, right,

118
00:07:12,560 --> 00:07:13,839
like that's a long time ago now.

119
00:07:13,920 --> 00:07:16,240
Speaker 2: Yeah. Yeah, I'm working on the space geek out, which

120
00:07:16,240 --> 00:07:18,600
will have already been published by the time you heard this,

121
00:07:19,199 --> 00:07:22,439
and I'm still writing the script. And nineteen thirty two

122
00:07:22,480 --> 00:07:27,519
is also when Karl Jansky created the first radio telescope,

123
00:07:28,000 --> 00:07:30,439
not because he was an astronomer, but because he was

124
00:07:30,480 --> 00:07:34,720
trying to build a directional antenna and he kept having

125
00:07:34,720 --> 00:07:36,920
this hiss no matter where he pointed the antenna, so

126
00:07:36,959 --> 00:07:39,279
the directional antenna could tell which way, like, thunderstorms and

127
00:07:39,279 --> 00:07:42,800
stuff were, because lightning strikes create a radio wave

128
00:07:42,839 --> 00:07:44,920
that he could say, okay, well, the lightning's coming from there,

129
00:07:45,199 --> 00:07:47,800
but there was this hiss that was coming from everywhere. Wow.

130
00:07:47,839 --> 00:07:49,639
And it took a while to finally figure out that

131
00:07:49,639 --> 00:07:52,120
it was the hiss of the universe. Yeah, and it's

132
00:07:52,120 --> 00:07:55,399
actually the cosmic background radiation from the Big Bang. Wow.

133
00:07:55,439 --> 00:07:59,000
And that he ended up building the first radio telescope

134
00:07:59,040 --> 00:08:00,519
just trying to figure out what was going on, an

135
00:08:00,680 --> 00:08:02,759
entirely new class of astronomy. So cool.

136
00:08:02,920 --> 00:08:05,079
Speaker 1: Yeah, you got to have an imagination to do that

137
00:08:05,160 --> 00:08:06,199
kind of stuff, can't you?

138
00:08:06,279 --> 00:08:08,519
Speaker 2: Just like, he had a problem. Yeah, the problem is

139
00:08:08,560 --> 00:08:12,800
what's that noise? What is that? Okay, Richard, who's talking

140
00:08:12,800 --> 00:08:14,720
to us? You got a comment? Yeah, grabbed a comment

141
00:08:14,720 --> 00:08:16,399
off of show eighteen fifty eight, the one we

142
00:08:16,439 --> 00:08:19,680
did with Thomas back in August of twenty three, so

143
00:08:19,800 --> 00:08:21,720
a little over a year ago, and that was the

144
00:08:21,879 --> 00:08:24,920
leveling up your architecture game conversation. Seem to have a

145
00:08:24,920 --> 00:08:27,240
theme with your shows, Thomas. And there was a bunch

146
00:08:27,279 --> 00:08:28,480
of good comments on the show, and this one comes

147
00:08:28,480 --> 00:08:30,399
from Laslow. He says, the more shows I listen to

148
00:08:30,480 --> 00:08:33,840
about software architecture, the less I'm sure about what it's about.

149
00:08:34,639 --> 00:08:37,360
I've seen great examples of software architecture and designs, but

150
00:08:37,399 --> 00:08:40,399
I've never seen them evolving from an architect's point of view.

151
00:08:40,840 --> 00:08:43,279
A big ball of mud never starts at the beginning

152
00:08:43,320 --> 00:08:45,480
of a software's lifetime, but a couple of years later,

153
00:08:45,480 --> 00:08:48,200
when the architect has left the project, new developers are

154
00:08:48,240 --> 00:08:50,360
not fully onboarded, and the clients realize they can push

155
00:08:50,360 --> 00:08:53,840
through requests that were previously, quote, too hard or too

156
00:08:53,879 --> 00:08:57,960
expensive to implement. And sometimes junior developers cross the

157
00:08:57,960 --> 00:09:00,360
lines of previous design rules to keep clients happy, making

158
00:09:00,360 --> 00:09:04,960
the application architecture rot. I'd like to see an architecture

159
00:09:04,960 --> 00:09:08,440
that stands the test of time. Hm hmm. You know,

160
00:09:09,919 --> 00:09:13,600
architecture isn't a thing right. It's a set of ideas

161
00:09:13,600 --> 00:09:15,360
that have to be implemented by people, and as long

162
00:09:15,360 --> 00:09:18,200
as people don't implement them, you can take any building

163
00:09:18,279 --> 00:09:21,360
and stick something horrible onto it and damage its architecture.

164
00:09:21,519 --> 00:09:24,120
Same with any piece of software. The only way architecture

165
00:09:24,120 --> 00:09:26,039
stands the test of time is that people choose to

166
00:09:26,200 --> 00:09:29,320
care for it and pretty much leave it alone. Well, no,

167
00:09:29,519 --> 00:09:32,639
I mean, challenge it, improve it. It makes

168
00:09:32,679 --> 00:09:34,480
sense when you have a feature that's a problem in

169
00:09:34,519 --> 00:09:38,120
the current architecture, say, is the architecture wrong, Like, is

170
00:09:38,159 --> 00:09:40,240
there a way to press against the architecture to still

171
00:09:40,279 --> 00:09:43,720
get some of the benefits of it, of the extreme design? Well,

172
00:09:44,279 --> 00:09:47,519
we do that all the time anyway, and some of

173
00:09:47,639 --> 00:09:50,240
us just admit to it as compromise, right? No, you know,

174
00:09:50,639 --> 00:09:54,440
no ideal architecture can ever be built. It's always an expression

175
00:09:54,480 --> 00:09:55,240
of less than ideal.

176
00:09:55,960 --> 00:09:58,080
Speaker 3: You always end up with an architecture. It's whether you

177
00:09:58,159 --> 00:10:01,840
made explicit decisions or implicitly accepted the decisions along

178
00:10:01,840 --> 00:10:04,440
the way. And if you just keep letting it rot,

179
00:10:04,600 --> 00:10:07,320
as they said in the comment, then yeah, you'll end

180
00:10:07,360 --> 00:10:08,879
up with the big ball of mud and you won't

181
00:10:08,919 --> 00:10:12,799
have a nice, clean architecture, but you'll still have an architecture.

182
00:10:12,879 --> 00:10:15,960
Speaker 2: Unfortunately. Yeah, nobody makes a ball of mud, it just emerges.

183
00:10:18,080 --> 00:10:19,720
So, Laslow, thank you so much for your comment.

184
00:10:19,759 --> 00:10:21,120
And a copy of Music to Code By is on its

185
00:10:21,120 --> 00:10:22,279
way to you. And if you'd like a copy of

186
00:10:22,320 --> 00:10:24,039
Music to Code By, write a comment on the website

187
00:10:24,080 --> 00:10:26,559
at dot net rocks dot com or on the facebooks

188
00:10:26,559 --> 00:10:28,159
we publish every show. If you comment there and

189
00:10:28,200 --> 00:10:29,360
I read it on the show, we'll send you a

190
00:10:29,399 --> 00:10:30,279
copy of Music to Code By.

191
00:10:31,080 --> 00:10:34,480
Speaker 1: And we're also on all the social media's we're on.

192
00:10:34,639 --> 00:10:36,720
We've been on x Twitter for a long time, at

193
00:10:36,720 --> 00:10:41,279
Carl Franklin and at Rich Campbell, and we're also on Bluesky.

194
00:10:41,559 --> 00:10:45,200
I'm at Carl Franklin at bsky dot app, and I'm Rich Campbell

195
00:10:45,240 --> 00:10:48,919
at bsky dot app. Or... is it app? Is

196
00:10:48,960 --> 00:10:50,639
it app or is it social I don't even know

197
00:10:51,039 --> 00:10:52,200
whatever blue sky.

198
00:10:52,159 --> 00:10:54,559
Speaker 3: Dot, bsky dot net or dot whatever.

199
00:10:54,639 --> 00:10:57,000
Speaker 2: Yeah, bsky dot app. It is bsky dot app.

200
00:10:57,159 --> 00:11:00,519
Speaker 1: Okay, good because you know, I just pull it up

201
00:11:00,559 --> 00:11:01,480
in the browser and it's there.

202
00:11:01,480 --> 00:11:03,600
Speaker 2: I don't really pay attention much anymore of the URL.

203
00:11:03,639 --> 00:11:05,759
But anyway, doing all that stuff through Buffer these days,

204
00:11:05,799 --> 00:11:08,759
I don't actually see the regular sites much. What's

205
00:11:08,799 --> 00:11:12,320
Buffer, pray tell? It's a tool for looking across all

206
00:11:12,360 --> 00:11:16,240
the different social media's that you're dealing with, and that's

207
00:11:16,279 --> 00:11:17,759
how we get all the posts out for all the

208
00:11:17,799 --> 00:11:20,759
shows. I need that! It's a good little product. Yeah,

209
00:11:20,799 --> 00:11:22,200
nothing bad to say about it. I switched to it

210
00:11:22,240 --> 00:11:24,120
a little while ago, and I've been very happy, and it

211
00:11:24,159 --> 00:11:24,840
works with them all.

212
00:11:25,000 --> 00:11:28,440
Speaker 1: And we're also on Mastodon. I'm Carl Franklin at tech

213
00:11:28,519 --> 00:11:29,679
Hub dot Social.

214
00:11:29,519 --> 00:11:31,919
Speaker 2: And I'm Rich Campbell at Mastodon dot social. So,

215
00:11:31,960 --> 00:11:34,240
Speaker 1: You know, get in touch with us, ask us questions

216
00:11:34,279 --> 00:11:37,360
and you may get a free copy of music to

217
00:11:37,399 --> 00:11:40,600
Code by as well. All right, Well, that last voice

218
00:11:40,639 --> 00:11:45,159
you heard there before Richard was Thomas Betts and he

219
00:11:45,480 --> 00:11:52,440
is a Laureate Software Architect at Blackbaud, the

220
00:11:52,519 --> 00:11:56,240
leading software provider for social impact. In his spare time,

221
00:11:56,279 --> 00:11:59,919
he contributes to InfoQ dot com and helps organize Q

222
00:12:00,080 --> 00:12:02,799
Con software development conferences.

223
00:12:02,840 --> 00:12:03,600
Speaker 2: This is interesting.

224
00:12:03,600 --> 00:12:06,320
Speaker 1: He credits dot NetRocks for inspiring him to give back to

225
00:12:06,360 --> 00:12:11,600
the software community as a writer, podcast host, and international speaker.

226
00:12:11,759 --> 00:12:14,000
Well that's nice. How you doing doing well?

227
00:12:14,080 --> 00:12:16,879
Speaker 3: Yeah? I realized that we don't talk about the stuff

228
00:12:16,919 --> 00:12:20,159
I do with InfoQ, but it's kind of my side project.

229
00:12:20,240 --> 00:12:22,440
That's the job I've had the longest now for like

230
00:12:22,480 --> 00:12:23,720
eight or almost nine years.

231
00:12:24,039 --> 00:12:24,320
Speaker 2: Wow.

232
00:12:24,919 --> 00:12:27,840
Speaker 3: So yeah, being able to speak locally at a developer

233
00:12:27,879 --> 00:12:32,720
conference, and now I've spoken internationally and organized a conference.

234
00:12:32,879 --> 00:12:36,159
So kind of following you guys' example. You know,

235
00:12:36,159 --> 00:12:38,360
it started with writing in the comments, I think, like, I

236
00:12:38,399 --> 00:12:41,240
don't know, two thousand and seven. Yeah, just to join

237
00:12:41,279 --> 00:12:43,279
the conversation and give back to the community.

238
00:12:43,440 --> 00:12:43,960
Speaker 2: Yeah.

239
00:12:44,159 --> 00:12:45,759
Speaker 1: Yeah, we've been friends for a long time.

240
00:12:46,000 --> 00:12:46,440
Speaker 2: Yeah.

241
00:12:46,480 --> 00:12:51,159
Speaker 1: And so your topic these days is architectural intelligence,

242
00:12:51,200 --> 00:12:54,799
which you say is the next AI, meaning the next

243
00:12:54,840 --> 00:12:56,759
AI acronym, right, exactly.

244
00:12:56,799 --> 00:12:58,720
Speaker 2: So let's define what that is.

245
00:12:58,879 --> 00:13:01,159
Speaker 3: Well, I want to you know, right now, we've got

246
00:13:01,240 --> 00:13:05,159
everyone's wanting to put AI on everything. But I think

247
00:13:05,200 --> 00:13:07,799
we're in that Arthur C. Clarke moment where any sufficiently

248
00:13:07,840 --> 00:13:11,159
advanced technology is indistinguishable from magic. Oh yeah, and that

249
00:13:11,639 --> 00:13:14,519
right now, any software we don't understand, we just call AI.

250
00:13:15,279 --> 00:13:15,480
Speaker 2: Right.

251
00:13:15,840 --> 00:13:19,159
Speaker 3: We don't know what it is, but everyone's asking for it.

252
00:13:19,200 --> 00:13:21,600
The CEOs are asking for it. The product owners are saying,

253
00:13:21,639 --> 00:13:23,639
I need AI in my product so we look innovative.

254
00:13:24,399 --> 00:13:27,200
customers aren't asking for it, but give it a couple

255
00:13:27,240 --> 00:13:29,360
of years and the expectation will be there. Like, why

256
00:13:29,360 --> 00:13:31,279
doesn't this have AI? It must not be modern.

257
00:13:31,360 --> 00:13:34,440
Speaker 1: So Thomas real quick. I play in a band and

258
00:13:34,480 --> 00:13:37,480
it's a ten piece band and it's awesome, and you know,

259
00:13:37,600 --> 00:13:41,000
after we do a particularly great tune, I will say,

260
00:13:41,840 --> 00:13:44,360
this band does not use AI.

261
00:13:47,279 --> 00:13:47,519
Speaker 2: Yeah.

262
00:13:47,840 --> 00:13:51,200
Speaker 1: It's kind of like when you get rice cakes and

263
00:13:51,240 --> 00:13:56,200
they say fat free. You know, it's like, well, duh.

264
00:13:55,759 --> 00:13:59,480
But everybody thinks that anything like any kind of pedal

265
00:13:59,519 --> 00:14:02,639
that we're using to change our voice or any kind

266
00:14:02,639 --> 00:14:07,720
of sound. You know, software is AI just because it's

267
00:14:07,759 --> 00:14:11,240
good and it might use digital signal processing or whatever,

268
00:14:11,240 --> 00:14:12,080
but it's not AI.

269
00:14:12,320 --> 00:14:12,519
Speaker 2: Right.

270
00:14:12,600 --> 00:14:15,600
Speaker 3: I think we call stuff AI until we have something better,

271
00:14:15,639 --> 00:14:17,000
and then we call it computer science.

272
00:14:17,399 --> 00:14:17,519
Speaker 2: Ye.

273
00:14:17,720 --> 00:14:20,120
Speaker 3: Right, we have a product and a tool and a name,

274
00:14:20,639 --> 00:14:23,759
and if we go back to you know, like AI.

275
00:14:24,840 --> 00:14:27,000
When we talk about AI, mostly what we mean is

276
00:14:27,639 --> 00:14:32,240
generative AI, not general AI, not artificial general intelligence. That's

277
00:14:32,759 --> 00:14:35,960
Data and the Terminator and other characters from science fiction.

278
00:14:36,080 --> 00:14:38,120
Speaker 2: I don't think the average person even thinks that far.

279
00:14:38,159 --> 00:14:40,440
They're just looking at large language models right right.

280
00:14:40,480 --> 00:14:42,799
Speaker 3: Right, right, and gen AI is large language...

281
00:14:42,440 --> 00:14:44,879
Speaker 2: Model is one of them. But you're doing gen ai

282
00:14:45,000 --> 00:14:48,600
a disservice if you, you know, keep it scoped to LLMs.

283
00:14:49,159 --> 00:14:51,159
It's a bunch of other things. It's been you know,

284
00:14:51,200 --> 00:14:53,679
that's been going on for a decade. LLM has only

285
00:14:53,759 --> 00:14:56,440
kind of detonated with ChatGPT in twenty two. Like

286
00:14:56,519 --> 00:14:59,600
this is all. This current storm is pretty recent. And

287
00:15:00,080 --> 00:15:02,039
anything I've learned from the shows we've done recently is

288
00:15:02,039 --> 00:15:06,080
that the actual smart machine learning people in this space

289
00:15:06,279 --> 00:15:10,480
are pretty offended by it, because a

290
00:15:10,480 --> 00:15:13,279
lot of this stuff is sloppy machine learning.

291
00:15:13,360 --> 00:15:16,399
Speaker 1: Yeah, and I'll just talk about gen AI in

292
00:15:16,480 --> 00:15:20,039
terms of images. I can spot a ChatGPT generated

293
00:15:20,080 --> 00:15:21,399
image a mile away now.

294
00:15:21,399 --> 00:15:22,720
Speaker 2: It just has a look to it.

295
00:15:23,480 --> 00:15:29,279
Speaker 1: And I'm I'm offended when I'm scrolling through Facebook and

296
00:15:29,320 --> 00:15:34,480
I see a picture of this idyllic house, you know,

297
00:15:34,559 --> 00:15:38,480
with perfect lighting with waterfalls going through it, and you know,

298
00:15:38,559 --> 00:15:42,759
there's no comment. It just says ah or something like that,

299
00:15:42,840 --> 00:15:45,559
you know, and and there's a million views and a

300
00:15:45,639 --> 00:15:51,559
million likes, and it's clearly generated. It doesn't exist, and

301
00:15:51,600 --> 00:15:54,600
there's no there's no place, there's no date. It's just

302
00:15:54,799 --> 00:15:57,720
like a bucolic setting, right, and.

303
00:15:58,039 --> 00:16:00,879
Speaker 3: Half of those million likes are bots that are liking

304
00:16:00,919 --> 00:16:02,639
the thing that the other bot created, so.

305
00:16:03,000 --> 00:16:06,120
Speaker 1: And people that use them to take pictures of themselves

306
00:16:06,120 --> 00:16:07,519
and turn them into AI pictures.

307
00:16:07,960 --> 00:16:09,240
Speaker 2: No, stop that.

308
00:16:10,240 --> 00:16:13,200
Speaker 3: I like that you brought up like traditional machine learning,

309
00:16:13,360 --> 00:16:16,960
like we used to call what's now just established machine

310
00:16:17,000 --> 00:16:19,559
learning and an ML model. Like, for a while that

311
00:16:19,759 --> 00:16:22,440
sounded like AI, and then we moved the AI a

312
00:16:22,480 --> 00:16:24,360
little further. It's that marketing term that just kind of

313
00:16:24,399 --> 00:16:24,960
is the umbrella.

314
00:16:25,399 --> 00:16:28,840
Speaker 2: Yeah, I've always said it, coined it: artificial intelligence

315
00:16:28,879 --> 00:16:30,519
is what you call it when it doesn't work. Yeah,

316
00:16:30,600 --> 00:16:32,200
so once it does work, it'll get a new name.

317
00:16:32,279 --> 00:16:34,799
Speaker 3: But I think there is that correlation between a large

318
00:16:34,879 --> 00:16:39,639
language model and other machine learning models, Like the difference

319
00:16:39,720 --> 00:16:42,399
is in the algorithm inside. And if you're not a

320
00:16:42,480 --> 00:16:45,720
data scientist, you probably don't understand. But I still, because

321
00:16:45,720 --> 00:16:48,159
I don't understand the inside, I treat it as you know,

322
00:16:48,200 --> 00:16:50,600
a function box. I put in an input, I

323
00:16:50,600 --> 00:16:54,159
get an output. So if I'm doing image recognition, I

324
00:16:54,200 --> 00:16:56,879
send it to my image recognition machine learning model, and

325
00:16:56,919 --> 00:16:58,840
it says this is a cat, this is a dog,

326
00:16:59,399 --> 00:17:01,519
or comments on a blog. Well...

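[The "function box" view of a machine learning model described above, input in, label out, internals opaque, can be sketched in a few lines. This is a hedged toy illustration: `TinyClassifier` is made up for the sketch and is not any real library's API, and a real model would run learned weights rather than this trivial rule.]

```python
# Toy illustration of an ML model as a "function box":
# you hand it an input, it hands back a label, and you
# never need to understand what happens inside.

class TinyClassifier:
    """Stand-in for an image-recognition model (hypothetical, not a real API)."""

    def __init__(self, labels):
        self.labels = labels

    def predict(self, features):
        # A real model would apply learned weights; this toy just
        # returns the label whose index matches the largest feature.
        best = max(range(len(features)), key=lambda i: features[i])
        return self.labels[best % len(self.labels)]

model = TinyClassifier(["cat", "dog"])
print(model.predict([0.9, 0.1]))  # -> cat
print(model.predict([0.2, 0.8]))  # -> dog
```

[The point of the sketch is the contract, not the internals: the caller treats `predict` exactly like the cryptographic-key API mentioned next, opaque but usable.]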
327
00:17:01,399 --> 00:17:02,919
Speaker 2: And that's the whole point. I have an API. I

328
00:17:02,960 --> 00:17:04,799
don't want to know. I don't know how to create

329
00:17:04,839 --> 00:17:06,720
a cryptographic key, but I do know how to call

330
00:17:06,759 --> 00:17:08,759
an API that gives it to me exactly.

331
00:17:08,799 --> 00:17:10,720
Speaker 3: And so these are the things that software engineers know

332
00:17:10,759 --> 00:17:13,720
how to use, Like this is some little function box

333
00:17:14,240 --> 00:17:16,720
and I can just call it, and here's my input

334
00:17:16,720 --> 00:17:19,960
and what's my expected output. I think what you need

335
00:17:19,960 --> 00:17:22,720
to understand with large language models is you give it

336
00:17:22,759 --> 00:17:26,240
a series of tokens, a bunch of words. We call

337
00:17:26,279 --> 00:17:29,000
them tokens, but a token might be part of a word.

338
00:17:29,200 --> 00:17:30,960
That's some of the semantics you need to learn.

339
00:17:31,039 --> 00:17:33,000
Speaker 2: But I think it's a you know, the joke is

340
00:17:33,039 --> 00:17:35,799
tokenization. I think it's the clever hack here, right,

341
00:17:36,000 --> 00:17:39,440
Like, the stochastic parrot spitting language back is not that impressive,

342
00:17:39,799 --> 00:17:41,920
But the fact that we've come up with a strategy

343
00:17:42,039 --> 00:17:46,359
for converting language, virtually any language, into a set of

344
00:17:46,480 --> 00:17:49,839
numeric symbols that, by the way, cross between each other,

345
00:17:50,000 --> 00:17:55,519
like, it's the Babel fish, man. Like, you've almost cracked universal translation.

346
00:17:55,440 --> 00:17:58,480
Speaker 3: Right, because that's what tokenization is. Like,

347
00:17:58,480 --> 00:18:01,480
I want to take this string of characters and

348
00:18:01,559 --> 00:18:04,079
turn it into a bunch of floats, and it's not

349
00:18:04,400 --> 00:18:06,880
the ASCII number of, like, this is an A and

350
00:18:06,920 --> 00:18:09,400
this is a B. Like this word or part of

351
00:18:09,440 --> 00:18:13,119
a word has this representation in a multi dimensional array,

352
00:18:13,799 --> 00:18:16,359
and that's all it is. It's a lot of math, right,

353
00:18:16,400 --> 00:18:19,920
it's all statistics. And when you feed these into a large

354
00:18:19,960 --> 00:18:23,079
language model, here is my series of tokens, here's my

355
00:18:23,440 --> 00:18:27,480
words and my sentence, and all it gives back is one.

356
00:18:27,839 --> 00:18:31,000
It just predicts the single next word. And I think

357
00:18:31,079 --> 00:18:34,559
that's the mystery people don't recognize, is it takes that

358
00:18:34,599 --> 00:18:37,559
one like, well, that's not useful. And you watch chat

359
00:18:37,599 --> 00:18:40,640
GPT and especially the early versions, you watched it type

360
00:18:40,640 --> 00:18:43,000
out very slowly. What it's doing is it's feeding that

361
00:18:43,079 --> 00:18:46,599
one token back in, adding to it, and that keeps

362
00:18:46,599 --> 00:18:50,880
building up the context. That autoregression eventually produces a

363
00:18:51,000 --> 00:18:53,640
series of words that say, oh, the next part of

364
00:18:53,839 --> 00:18:56,519
the sentence is likely to be this, yeah, and it

365
00:18:56,519 --> 00:18:58,039
looks like magic yeah.
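
The loop described here, predict one token, append it, feed the longer context back in, can be sketched with a toy stand-in for the model. Everything below is invented for illustration: a bigram lookup table plays the role of the LLM, which in reality predicts from the entire context over a huge vocabulary.

```python
# Toy autoregressive loop: predict the single next "token", append it,
# and feed the longer context back in. A bigram lookup table stands in
# for the language model here; a real LLM predicts from the whole context.
TOY_MODEL = {
    "the": "next",
    "next": "word",
    "word": "is",
    "is": "predicted",
    "predicted": None,  # stand-in for an end-of-sequence marker
}

def generate(prompt: list[str]) -> list[str]:
    context = list(prompt)
    while True:
        next_token = TOY_MODEL.get(context[-1])  # predict one next word
        if next_token is None:
            return context
        context.append(next_token)               # feed it back in, growing the context

print(" ".join(generate(["the"])))  # the next word is predicted
```

The slow "typing" of early ChatGPT is exactly this loop running once per token.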

366
00:18:57,799 --> 00:19:01,440
Speaker 2: Well, and importantly, it can be interpreted as intelligence

367
00:19:01,480 --> 00:19:03,640
where none exists. Yeah.

368
00:19:03,680 --> 00:19:07,039
Speaker 3: And this goes to if we look at the you know,

369
00:19:07,200 --> 00:19:10,519
the learning aspect of it, like we're fine calling it

370
00:19:10,640 --> 00:19:13,960
machine learning because we start with training data. For any

371
00:19:14,000 --> 00:19:15,920
machine learning model, you give it a set of training data.

372
00:19:15,920 --> 00:19:19,279
Maybe that's your sales data for the last you know, quarter,

373
00:19:19,640 --> 00:19:22,079
or it's pictures of dogs and cats. In our case,

374
00:19:22,079 --> 00:19:25,559
it's just words. Have this thing read everything you can

375
00:19:25,599 --> 00:19:27,960
find and train it on that. So you give a

376
00:19:27,960 --> 00:19:30,039
lot of training data that's words, and then it's really

377
00:19:30,079 --> 00:19:34,400
good at understanding words, but it's still just predicting the

378
00:19:34,400 --> 00:19:38,039
next word. And when we see something that looks like

379
00:19:38,519 --> 00:19:41,279
a human probably did that, and we don't understand how

380
00:19:41,319 --> 00:19:45,079
a computer could do that, we think, I don't understand it, it

381
00:19:45,079 --> 00:19:47,519
must be intelligent, and we're applying that where it just

382
00:19:47,559 --> 00:19:48,119
doesn't exist.

383
00:19:48,160 --> 00:19:49,880
Speaker 2: Well. Plus, and humans are prone to that sort of

384
00:19:49,920 --> 00:19:53,440
thing anyway, right? Yeah, heck, we think our dogs understand us. Yeah,

385
00:19:54,079 --> 00:19:58,599
and we talk to our cars, which is especially weird. This

386
00:19:58,680 --> 00:20:03,119
tendency to anthropomorphize. It's like, oh, it's necessary. I mean,

387
00:20:04,359 --> 00:20:06,240
is it necessary or is it a weakness?

388
00:20:06,559 --> 00:20:06,759
Speaker 1: Oh?

389
00:20:06,799 --> 00:20:07,319
Speaker 2: I think so.

390
00:20:08,039 --> 00:20:11,200
Speaker 1: I think it's necessary for us to be able to

391
00:20:11,240 --> 00:20:15,640
have some sort of relationship with the thing that we're anthropomorphizing.

392
00:20:16,039 --> 00:20:21,319
It's easier for us if we give it human attributes.

393
00:20:21,559 --> 00:20:22,880
I think it makes it easier.

394
00:20:23,599 --> 00:20:26,279
Speaker 3: Architects are good at coming up with metaphors to describe

395
00:20:26,279 --> 00:20:28,799
stuff to people. How is someone going to relate to

396
00:20:28,839 --> 00:20:32,240
the software? How do I translate this complex idea into

397
00:20:32,319 --> 00:20:34,680
a design that my engineers are going to be able

398
00:20:34,720 --> 00:20:37,480
to implement or the users are going to understand? And

399
00:20:37,799 --> 00:20:40,480
things like the desktop on my computer, well that used

400
00:20:40,480 --> 00:20:43,240
to be analogous to the desktop where I'm setting the

401
00:20:43,240 --> 00:20:44,759
computer down on and I can have a pile of

402
00:20:44,799 --> 00:20:45,400
papers here.

403
00:20:45,640 --> 00:20:45,839
Speaker 2: Right.

404
00:20:46,480 --> 00:20:49,440
Speaker 3: We have these ideas and then those just become things,

405
00:20:49,480 --> 00:20:51,880
and then a floppy disk icon sticks around for twenty

406
00:20:52,000 --> 00:20:54,000
years after we've gotten rid of floppy disks yep.

407
00:20:54,119 --> 00:20:56,119
Speaker 2: Yeah, and now it is better known as a

408
00:20:56,119 --> 00:20:58,920
save icon than it is as a physical thing, right. Yeah.

409
00:20:58,920 --> 00:21:01,160
People think it's weird that I 3D printed a

410
00:21:01,200 --> 00:21:03,680
save icon, like what's wrong with you? Why would you

411
00:21:03,759 --> 00:21:06,319
do that? But everybody recognizes it and that's why we

412
00:21:06,400 --> 00:21:07,359
still use it. Yeah.

413
00:21:07,440 --> 00:21:10,400
Speaker 3: So let's get to the architectural intelligence part of this.

414
00:21:10,519 --> 00:21:12,200
I think it comes down to two questions. If we

415
00:21:12,279 --> 00:21:17,119
have gen AI, or really just LLMs, the two questions

416
00:21:17,119 --> 00:21:21,079
are is this appropriate for my software like the scenario

417
00:21:21,200 --> 00:21:23,640
I have? And then if I decide it is how

418
00:21:23,640 --> 00:21:27,359
do I optimize it? And I think you can look

419
00:21:27,400 --> 00:21:29,960
at some of the examples of when it makes sense

420
00:21:30,039 --> 00:21:33,640
to use it in your software, because it is a language model.

421
00:21:33,759 --> 00:21:37,640
It's good at doing things that are language based, right, right,

422
00:21:37,680 --> 00:21:40,200
like you want to have a natural language interface. These

423
00:21:40,279 --> 00:21:42,119
used to be things that people worked really hard on

424
00:21:42,200 --> 00:21:45,279
and turns out you can just throw stuff at it

425
00:21:45,319 --> 00:21:47,319
and it can translate it into something else. Like you

426
00:21:47,319 --> 00:21:50,359
said that universal translator. I can't write a good search query,

427
00:21:50,400 --> 00:21:52,920
but now I can just talk to it and transcribe

428
00:21:53,319 --> 00:21:54,240
and it just works.

429
00:21:54,960 --> 00:22:01,839
Speaker 1: Yeah, and you might think of replacing certain complex patterns

430
00:22:01,960 --> 00:22:06,680
whereas you know, picking something from a series of drop

431
00:22:06,759 --> 00:22:09,759
downs and then maybe, you know, in a grid

432
00:22:09,920 --> 00:22:12,759
setting some things where you could just pop up a

433
00:22:12,759 --> 00:22:15,799
box and ask the user what they want, you know,

434
00:22:15,920 --> 00:22:17,720
what they want to see or what they want to do.

435
00:22:17,920 --> 00:22:18,119
Speaker 2: Yeah.

436
00:22:18,160 --> 00:22:20,519
Speaker 3: My company has a hackathon once a year and one

437
00:22:20,519 --> 00:22:22,759
of the teams that won actually people I worked with.

438
00:22:23,480 --> 00:22:26,559
They were looking at our custom ad hoc report builder, right,

439
00:22:26,720 --> 00:22:30,000
very customizable. You can choose all these things, and they

440
00:22:30,039 --> 00:22:32,559
realized that all you're doing in the UI is sending

441
00:22:32,599 --> 00:22:35,240
requests to the API to say, create a report with

442
00:22:35,319 --> 00:22:38,920
these characteristics. Yeah, so they, you know, basically wrote

443
00:22:38,960 --> 00:22:42,039
a prompt that says, here's the API, and you could

444
00:22:42,039 --> 00:22:44,519
ask it please create a report that runs every Monday

445
00:22:44,519 --> 00:22:48,599
morning and the week over week totals and gives these values,

446
00:22:49,359 --> 00:22:52,119
and it knew what to do and you simplified the

447
00:22:52,160 --> 00:22:54,680
complex part of the system, but you didn't replace the

448
00:22:54,720 --> 00:22:57,920
actual report generation, right right, just that interface.
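
The hackathon pattern here, the LLM only translates a natural-language request into an API call and the existing report engine does the real work, implies validating the model's output before it reaches the API. A hypothetical sketch: the schema, field names, and the canned model output below are all invented for illustration.

```python
import json

# The LLM's only job is turning "run this every Monday with week-over-week
# totals" into a structured request; the report engine is unchanged.
# Validate the model's output against the API schema before trusting it.
REQUIRED = {"schedule", "metrics", "format"}
ALLOWED_SCHEDULES = {"daily", "weekly", "monthly"}

def validate_report_request(llm_output: str) -> dict:
    payload = json.loads(llm_output)           # reject non-JSON outright
    missing = REQUIRED - payload.keys()
    if missing:
        raise ValueError(f"LLM omitted required fields: {missing}")
    if payload["schedule"] not in ALLOWED_SCHEDULES:
        raise ValueError(f"unknown schedule: {payload['schedule']}")
    return payload                             # safe to hand to the report API

# Pretend this string came back from the model:
canned = '{"schedule": "weekly", "metrics": ["week_over_week"], "format": "csv"}'
print(validate_report_request(canned)["schedule"])  # weekly
```

Anything that fails validation goes back to the user instead of into the deterministic system, which keeps the UX change a UX change.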

449
00:22:57,640 --> 00:23:00,440
Speaker 2: It's literally a UX change, a UX change. Yeah.

450
00:23:00,640 --> 00:23:03,839
Speaker 3: Yeah, And again it's that language level of the UX change.

451
00:23:03,880 --> 00:23:08,880
The language model is good at understanding language. Where it

452
00:23:08,960 --> 00:23:11,440
starts to slide into the maybe we should or maybe

453
00:23:11,440 --> 00:23:14,240
we shouldn't. Like if you're asking for in product help

454
00:23:14,279 --> 00:23:16,279
and you want to find stuff like maybe that's a

455
00:23:16,319 --> 00:23:18,960
little better than searching your help, and maybe it can summarize

456
00:23:18,960 --> 00:23:21,000
the results. We've seen stuff like that. This is where

457
00:23:21,119 --> 00:23:25,720
RAG comes in and does different things. But you know,

458
00:23:25,799 --> 00:23:27,920
some of the examples I've seen like, oh, we have

459
00:23:27,960 --> 00:23:31,039
a rules engine, because we have all this complex logic,

460
00:23:31,079 --> 00:23:33,640
we added a rules engine, and now we have a

461
00:23:33,680 --> 00:23:36,480
rules engine to manage, and now we have difficulty figuring

462
00:23:36,480 --> 00:23:39,359
out like what are the rules that are applied? What

463
00:23:39,440 --> 00:23:41,440
if we just replaced the whole thing with an AI

464
00:23:41,559 --> 00:23:43,880
and it can just do the logic the rules engine.

465
00:23:43,920 --> 00:23:45,680
This I think goes to the example you gave of

466
00:23:46,039 --> 00:23:49,640
Air Canada, that basically said someone's eligible for a discount. Yeah,

467
00:23:49,640 --> 00:23:53,039
because the AI said so, But there's no actual rule there.

468
00:23:53,079 --> 00:23:56,599
It just said the most likely answer, the most likely

469
00:23:56,640 --> 00:23:59,720
next word when you ask, am I eligible for a discount? Yes,

470
00:24:00,559 --> 00:24:03,039
it's just predicting a word, but it's not actually based

471
00:24:03,039 --> 00:24:03,440
on logic.
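
The failure mode described here, the model predicting "yes" with no rule behind it, is avoided by keeping the actual eligibility decision in deterministic code and letting a model, at most, phrase the reply. A minimal sketch; the policy fields and thresholds below are invented, not Air Canada's actual rules.

```python
from dataclasses import dataclass

# Keep the policy as plain deterministic code: same inputs, same answer,
# every time. An LLM may word the response, but it never decides eligibility.
@dataclass
class Claim:
    fare_class: str
    days_since_booking: int

def bereavement_discount_eligible(claim: Claim) -> bool:
    # Invented rule for illustration: refundable fares booked
    # within the last 90 days qualify.
    return claim.fare_class == "refundable" and claim.days_since_booking <= 90

print(bereavement_discount_eligible(Claim("refundable", 30)))  # True
print(bereavement_discount_eligible(Claim("basic", 30)))       # False
```

The chatbot then answers from this function's result rather than from next-word statistics, so the answer is auditable and repeatable.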

472
00:24:03,559 --> 00:24:05,359
Speaker 2: Yeah. And more importantly, then that went all the

473
00:24:05,359 --> 00:24:07,319
way to court because it's like, oh, our software failed,

474
00:24:07,319 --> 00:24:09,720
we're not liable, It's like, no, you presented that as

475
00:24:09,759 --> 00:24:12,839
a replacement for a human agent, and if a human agent

476
00:24:12,880 --> 00:24:16,480
had said the wrong thing, you would be liable for it.

477
00:24:16,519 --> 00:24:20,039
And so you're liable for it, right, which is good.

478
00:24:20,079 --> 00:24:22,200
You know, let's get some case law in place there,

479
00:24:22,240 --> 00:24:26,880
and also pressing against employers to say, if you're doing

480
00:24:27,079 --> 00:24:30,640
this kind of utilization of this software, it comes with

481
00:24:30,680 --> 00:24:33,839
a price, so you know, test carefully, right. I think

482
00:24:33,880 --> 00:24:36,519
that's the bigger issue I have is a you know,

483
00:24:37,240 --> 00:24:40,200
I mentioned the API for creating cryptographic keys, but what

484
00:24:40,240 --> 00:24:42,559
if one in one hundred cryptographic keys were just invalid

485
00:24:42,759 --> 00:24:44,759
but you had no way of knowing like this is

486
00:24:44,799 --> 00:24:46,559
the problem I had with LLMs is you're putting them

487
00:24:46,599 --> 00:24:49,799
in a critical workflow and you haven't really tested them.

488
00:24:49,839 --> 00:24:51,480
You don't know what the failure modes look like.

489
00:24:51,640 --> 00:24:53,759
Speaker 3: Right. And I think anywhere you're going to use an LLM,

490
00:24:54,440 --> 00:24:57,400
take the LLM out and put a person in that place.

491
00:24:57,519 --> 00:24:57,920
Speaker 2: Yeah. Right.

492
00:24:57,960 --> 00:24:59,839
Speaker 3: And if you asked a person to do something, would

493
00:24:59,839 --> 00:25:03,279
you trust them or do you have other checks and balances?

494
00:25:03,759 --> 00:25:06,319
Does something have to be reviewed by a supervisor, or

495
00:25:06,359 --> 00:25:09,000
do you just let that person have full autonomy and

496
00:25:09,039 --> 00:25:12,400
they can do things or you have later auditing to say, hey,

497
00:25:12,400 --> 00:25:14,799
they messed up and we can fix it. But you

498
00:25:14,839 --> 00:25:18,200
can't just magically think that this LLM can do everything

499
00:25:18,319 --> 00:25:20,200
because it can write poetry.

500
00:25:20,319 --> 00:25:24,799
Speaker 1: So, guys, as if United Healthcare wasn't already in the

501
00:25:24,839 --> 00:25:29,400
news enough, they are facing a class action lawsuit alleging

502
00:25:29,440 --> 00:25:36,160
that the company misused AI to deny specific insurance claims,

503
00:25:36,160 --> 00:25:42,079
and especially on elderly people. So there's you know, there's

504
00:25:42,119 --> 00:25:45,240
a I'm going to link to a news story about this,

505
00:25:45,359 --> 00:25:48,920
and there's an interview with people who were denied and

506
00:25:49,400 --> 00:25:54,240
clearly shouldn't have been denied an insurance claim, and

507
00:25:54,279 --> 00:25:57,079
they basically sued, and they said, yeah, your AI basically

508
00:25:57,079 --> 00:25:58,079
made this determination.

509
00:25:58,240 --> 00:26:00,160
Speaker 2: Well, I think that, so, yeah, but they also

510
00:26:00,240 --> 00:26:02,799
configured it to do that, right, It's no different than

511
00:26:02,799 --> 00:26:04,519
having a person say deny all claims.

512
00:26:04,640 --> 00:26:06,920
Speaker 3: But I think that's also where we're seeing AI is

513
00:26:07,000 --> 00:26:10,640
thrown on as a label on what was probably just machine learning. Right,

514
00:26:10,720 --> 00:26:14,559
they have all this, it could be historic claims data, and

515
00:26:14,920 --> 00:26:17,079
there are plenty of examples of it.

516
00:26:17,200 --> 00:26:20,680
Speaker 1: Actually it was predictive analytics that they used because the

517
00:26:20,759 --> 00:26:24,720
AIs are predicting that if we allow this, you know,

518
00:26:24,759 --> 00:26:27,799
the chances are that they're going to I don't know,

519
00:26:27,839 --> 00:26:28,839
abuse it or whatever.

520
00:26:29,039 --> 00:26:33,319
Speaker 3: Yeah, those those models are trained with biases because it's

521
00:26:33,359 --> 00:26:35,559
the data you gave them, and if the only data

522
00:26:35,559 --> 00:26:38,240
you gave them. I think the crime ones are horrible, Right,

523
00:26:38,279 --> 00:26:41,039
It's like, oh, there's all this crime because there were

524
00:26:41,039 --> 00:26:43,400
these arrests in these places. Therefore we're going to send

525
00:26:43,440 --> 00:26:46,160
more people, which then leads to more arrests because that's

526
00:26:46,160 --> 00:26:48,839
where they sent the cops. It didn't actually solve the

527
00:26:48,920 --> 00:26:52,359
problem of why, why is crime high in

528
00:26:52,359 --> 00:26:52,839
these areas?

529
00:26:52,920 --> 00:26:53,039
Speaker 2: Right?

530
00:26:53,160 --> 00:26:55,799
Speaker 1: Or we found that where there are house fires, there

531
00:26:55,839 --> 00:26:58,440
tend to be a lot of firemen, So let's get

532
00:26:58,519 --> 00:26:59,319
rid of the firemen.

533
00:27:01,000 --> 00:27:03,680
Speaker 3: So I think we're getting into my third category. We

534
00:27:03,720 --> 00:27:06,480
went from good to maybe to the really questionable uses

535
00:27:06,519 --> 00:27:10,119
of AI. This is where people think I should just

536
00:27:10,880 --> 00:27:15,559
you know, replace that report generation. If you need specific

537
00:27:15,759 --> 00:27:20,279
mathematics, like adding up your regulatory and compliance scenarios, your

538
00:27:20,359 --> 00:27:24,480
quarterly reports, you really really really shouldn't do that because

539
00:27:24,519 --> 00:27:27,440
it will come up with numbers. They're not going to

540
00:27:27,440 --> 00:27:30,519
be the right numbers. It's based on statistics, based on

541
00:27:30,680 --> 00:27:33,519
words and language, not this is my accounting model. We've

542
00:27:33,519 --> 00:27:36,240
had accounting standards for like four or five hundred years

543
00:27:36,799 --> 00:27:38,920
for good reason, Like we know how to do math.

544
00:27:39,559 --> 00:27:40,960
Don't put it in those places.

545
00:27:41,119 --> 00:27:43,279
Speaker 2: What if it can help you generate the query.

546
00:27:43,680 --> 00:27:46,039
Speaker 3: Again, that goes back to the where does it fit?

547
00:27:46,319 --> 00:27:49,319
Put it in some place. That's the language aspect. That's

548
00:27:49,319 --> 00:27:52,319
what it's good for language aspect. Yeah, But I think

549
00:27:52,640 --> 00:27:56,440
what people need to understand is this is non deterministic software. Right,

550
00:27:57,000 --> 00:27:59,519
we are used to software as a series of if

551
00:27:59,559 --> 00:28:01,000
then else statements.

552
00:28:00,759 --> 00:28:01,839
Speaker 2: Right, very deterministic.

553
00:28:01,920 --> 00:28:05,279
Speaker 3: Yeah, AI is very non deterministic. It might give the

554
00:28:05,319 --> 00:28:07,720
same thing, but it might give something different. And this

555
00:28:07,799 --> 00:28:10,000
is where you can set temperatures and all the different

556
00:28:10,000 --> 00:28:11,960
things and move the sliders around. You can get more

557
00:28:12,039 --> 00:28:14,920
creative answers or less creative answers, or more specific, and

558
00:28:14,920 --> 00:28:16,759
you can get it to repeat the same answer. But

559
00:28:17,720 --> 00:28:20,200
it's always still based on a prediction model.
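
The temperature slider mentioned here is just a rescaling of the model's scores before sampling: low temperature sharpens the distribution toward the single most likely token (repeatable answers), high temperature flattens it ("creative" answers). A self-contained sketch; the token scores below are made up for illustration.

```python
import math
import random

def sample(logits: dict[str, float], temperature: float, rng: random.Random) -> str:
    # Divide scores by temperature, then softmax. As temperature -> 0 this
    # approaches always picking the top token; large values approach uniform.
    scaled = {tok: score / temperature for tok, score in logits.items()}
    m = max(scaled.values())
    weights = {tok: math.exp(s - m) for tok, s in scaled.items()}  # stable softmax
    r = rng.random() * sum(weights.values())
    for tok, w in weights.items():
        r -= w
        if r <= 0:
            return tok
    return tok  # numerical edge case: fall back to the last token

logits = {"yes": 2.0, "maybe": 1.0, "no": 0.5}   # invented next-token scores
rng = random.Random(0)
print(sample(logits, temperature=0.1, rng=rng))  # near-deterministic: yes
print({sample(logits, temperature=5.0, rng=rng) for _ in range(20)})
```

Even at the coldest settings it is still the same prediction model underneath; the slider only changes how the prediction is sampled.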

560
00:28:20,240 --> 00:28:21,880
Speaker 2: The fact that you can get different answers to the

561
00:28:21,880 --> 00:28:24,440
same question is just a clear indication. It's like, it's

562
00:28:24,440 --> 00:28:26,799
not that deterministic. Yes, a deterministic model will give you

563
00:28:26,799 --> 00:28:28,240
the same answer for the same question.

564
00:28:28,400 --> 00:28:30,480
Speaker 1: Yeah, I got to admit that I have used Chat

565
00:28:30,559 --> 00:28:35,680
GPT to generate stored procedures or SQL queries where, you know,

566
00:28:35,720 --> 00:28:38,799
I gave it the data that I needed, and

567
00:28:39,039 --> 00:28:41,440
you know, I'm just, I'm not the SQL guru that

568
00:28:41,519 --> 00:28:44,279
Richard is, and I don't know, you know, my GROUP

569
00:28:44,319 --> 00:28:48,759
BYs completely befuddle me. And it worked, you know, it

570
00:28:48,799 --> 00:28:51,000
turned out to return the right stuff.

571
00:28:51,240 --> 00:28:53,160
Speaker 3: So you just said, you know, you gave it a

572
00:28:53,200 --> 00:28:54,960
sample of your data, like here's what it looks like.

573
00:28:55,160 --> 00:28:58,359
The more specific you can get, then the better the

574
00:28:58,440 --> 00:29:00,880
answer is. It will always give an answer. It will

575
00:29:00,920 --> 00:29:03,279
never come back and say I have a question, can

576
00:29:03,359 --> 00:29:06,079
you provide me more information? If you said, please generate

577
00:29:06,839 --> 00:29:10,359
a stored procedure, it would write one, it'd be completely

578
00:29:10,440 --> 00:29:13,160
useless for you. But if you said I need a

579
00:29:13,160 --> 00:29:16,640
stored procedure that gets this data, that prompt is giving

580
00:29:16,680 --> 00:29:20,279
it more information and that narrows the context. Like AI

581
00:29:20,400 --> 00:29:23,519
is really good at broad general statements. It knows all

582
00:29:23,559 --> 00:29:26,480
this generic knowledge because it read the Internet. It doesn't

583
00:29:26,480 --> 00:29:29,279
know your specific scenario. And that's where architects come in.

584
00:29:29,400 --> 00:29:32,319
Is like architects take these design patterns we have and

585
00:29:32,319 --> 00:29:35,640
we figure out what is useful in this scenario and

586
00:29:35,799 --> 00:29:38,039
is AI one of those things that might be useful

587
00:29:38,079 --> 00:29:41,200
in my design at this time or does it not apply?

588
00:29:41,319 --> 00:29:45,480
Speaker 1: But in this case, it knows the rules of T-SQL

589
00:29:45,799 --> 00:29:48,480
and you gave it everything that it needed in order

590
00:29:48,519 --> 00:29:52,759
to create the right select statement and it worked. Because

591
00:29:52,799 --> 00:29:56,039
of that narrow scope. You also have a testability aspect

592
00:29:56,039 --> 00:29:58,920
there too. You were able to try it and evaluate the

593
00:29:59,000 --> 00:30:02,640
results and decide if that you know was correct.

594
00:30:03,200 --> 00:30:06,559
Speaker 2: The compiler gets to have its say too, right,

595
00:30:06,720 --> 00:30:09,759
like if the database didn't like it, it would

596
00:30:09,759 --> 00:30:10,519
have spat it back.

597
00:30:10,720 --> 00:30:14,200
Speaker 1: I also like the ability to if I'm asking it

598
00:30:14,240 --> 00:30:17,079
to generate a method in c sharp that does X,

599
00:30:17,640 --> 00:30:20,440
and it smells a little funky, and I would say

600
00:30:20,440 --> 00:30:24,960
to it, can you try this again, but with less verbosity,

601
00:30:25,039 --> 00:30:28,000
can you maybe use LINQ or do something like that

602
00:30:28,119 --> 00:30:31,359
to you know? And it'll say sure, and it'll try

603
00:30:31,400 --> 00:30:34,799
it and it'll work. On the other hand, I've had

604
00:30:36,640 --> 00:30:39,400
methods that use LINQ and a lot of complex LINQ

605
00:30:39,559 --> 00:30:43,079
and then I will say, hey, can you expand this

606
00:30:43,319 --> 00:30:48,000
to use loops and if-then statements, and then I

607
00:30:48,039 --> 00:30:50,880
will take that and I will I'll test it, make

608
00:30:50,880 --> 00:30:54,440
sure it works, and I'll comment that for somebody who's

609
00:30:54,480 --> 00:30:57,960
reading it who really doesn't understand LINQ, say this is

610
00:30:57,960 --> 00:30:59,319
what this LINQ statement does.

611
00:30:59,599 --> 00:30:59,880
Speaker 2: Yeah.

612
00:31:00,039 --> 00:31:02,680
Speaker 3: I just had to give a demo of GitHub

613
00:31:02,720 --> 00:31:06,039
Copilot within my company and it was a five minute

614
00:31:06,119 --> 00:31:10,119
lightning talk. I said, we had this new feature we're developing.

615
00:31:10,640 --> 00:31:12,440
It was pretty complex, so we spent a little bit

616
00:31:12,480 --> 00:31:15,960
more time than usual doing an upfront design. Just wrote

617
00:31:15,960 --> 00:31:19,319
out some markdown, a few mermaid diagrams, here's some classes,

618
00:31:19,359 --> 00:31:22,279
and here's the API end points. And I'm like, what

619
00:31:22,279 --> 00:31:24,319
would happen if I just gave this to GitHub

620
00:31:24,359 --> 00:31:28,160
Copilot. And the first time I did it, it wasn't great.

621
00:31:28,640 --> 00:31:31,799
It worked, but it didn't fit

622
00:31:31,920 --> 00:31:34,960
the style of the coding that we had in our project,

623
00:31:35,000 --> 00:31:37,759
things like we used file-scoped namespaces because who needs

624
00:31:37,759 --> 00:31:40,759
extra curly braces. And then I changed my prompt and

625
00:31:40,799 --> 00:31:43,960
I said things like, please use file-scoped namespaces, and

626
00:31:44,519 --> 00:31:47,200
please create the interface for each of the classes and

627
00:31:47,240 --> 00:31:50,400
inject it into the controllers, like things that a developer would

628
00:31:50,400 --> 00:31:53,160
have known to do based on the design document, because

629
00:31:53,160 --> 00:31:55,079
they said, Okay, you only need to provide this level

630
00:31:55,079 --> 00:31:58,960
of detail. I can figure out the rest with Copilot.

631
00:31:59,279 --> 00:32:02,319
It could do better once I told it, please follow

632
00:32:02,559 --> 00:32:04,119
our guidelines, but I had to tell it what our

633
00:32:04,119 --> 00:32:04,759
guidelines were.
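
The fix described here, telling the model your conventions explicitly, amounts to prepending a guidelines block to the design document before it ever reaches the model. A hypothetical sketch of assembling that prompt; the guideline text and design doc are invented, and no real Copilot API is involved.

```python
# Assemble a code-generation prompt: team conventions first, then the design.
# The model can only follow guidelines it has actually been shown.
TEAM_GUIDELINES = [
    "Use file-scoped namespaces.",
    "Create an interface for each class and inject it into the controllers.",
]

def build_prompt(design_doc_markdown: str) -> str:
    rules = "\n".join(f"- {g}" for g in TEAM_GUIDELINES)
    return (
        "Follow these team coding guidelines:\n"
        f"{rules}\n\n"
        "Implement the following design:\n"
        f"{design_doc_markdown}"
    )

doc = "## Orders API\n- POST /orders creates an order"  # stand-in design doc
print(build_prompt(doc))
```

In practice the guidelines list would live in the repository (for example as an instructions file) so every prompt picks up the same conventions, just as a new developer would.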

634
00:32:04,920 --> 00:32:08,160
Speaker 2: Yep, just like a developer, same thing. Yeah, give better

635
00:32:08,160 --> 00:32:10,640
and better instructions, get more precise results.

636
00:32:10,839 --> 00:32:11,440
Speaker 3: Yeah.

637
00:32:11,480 --> 00:32:13,359
Speaker 1: Hey, I think it's time for break. So we'll be

638
00:32:13,519 --> 00:32:16,440
right back after these very important messages. And if you

639
00:32:16,839 --> 00:32:19,039
by chance do not want to hear these messages in

640
00:32:19,079 --> 00:32:23,440
the future. You can get an ad free feed by

641
00:32:23,480 --> 00:32:25,960
becoming a five dollars a month patron at Patreon dot

642
00:32:26,039 --> 00:32:30,119
dot NetRocks dot com. We'll be right back. Do you

643
00:32:30,160 --> 00:32:32,880
have a complex dot net monolith you'd like to refactor

644
00:32:32,920 --> 00:32:37,640
to a microservices architecture? The microservice extractor for dot Net

645
00:32:37,680 --> 00:32:41,920
tool visualizes your app and helps progressively extract code into

646
00:32:42,000 --> 00:32:46,200
microservices. Learn more at aws dot Amazon dot com,

647
00:32:46,279 --> 00:32:53,319
slash modernize, and we're back. It's dot NetRocks. I'm Carl Franklin,

648
00:32:53,319 --> 00:32:56,079
that's Richard Campbell, hey, and that's our friend Thomas Betts

649
00:32:56,079 --> 00:33:00,519
and we're talking about architectural intelligence. Should I shouldn't I?

650
00:33:00,720 --> 00:33:04,440
And if I should, where and how much? And why

651
00:33:04,759 --> 00:33:05,920
do we even need to do this?

652
00:33:06,359 --> 00:33:08,839
Speaker 3: Yeah, I think we were leaving it off with AI

653
00:33:09,000 --> 00:33:12,920
being nondeterministic software, and I want to get across the idea

654
00:33:12,960 --> 00:33:15,000
that that's a feature, not a bug.

655
00:33:15,720 --> 00:33:16,000
Speaker 2: Right.

656
00:33:17,119 --> 00:33:20,440
Speaker 3: It gives these good enough answers, like that's why it

657
00:33:20,480 --> 00:33:23,480
seems intelligent, Like it did a really good job. And

658
00:33:23,880 --> 00:33:26,160
there are times when the really good job is flat

659
00:33:26,160 --> 00:33:28,759
out wrong, but there are times when it's going to

660
00:33:28,799 --> 00:33:33,519
be okay. And find those places in your applications where

661
00:33:33,519 --> 00:33:37,279
it's like I can tolerate the good enough answers, Like

662
00:33:37,319 --> 00:33:40,599
if someone doesn't find all the help references, but I

663
00:33:40,640 --> 00:33:42,720
found enough and they got their job done, that's okay.

664
00:33:42,839 --> 00:33:48,119
Speaker 2: Yeah. I like your reporting API wrapper scenario because you're experimenting,

665
00:33:48,119 --> 00:33:50,400
You're trying to come up with a way to look

666
00:33:50,440 --> 00:33:53,680
at the company's data in a way that'll presumably allow

667
00:33:53,720 --> 00:33:56,160
you to take an action, so you don't know exactly

668
00:33:56,200 --> 00:33:59,319
what you're asking for. The most frustrating thing I see

669
00:33:59,359 --> 00:34:01,440
with most people playing with some kind of report builder.

670
00:34:01,440 --> 00:34:05,400
They either get everything or nothing, right. They always have

671
00:34:05,440 --> 00:34:08,119
scoping problems and so forth. So a tool that allows

672
00:34:08,159 --> 00:34:10,239
them to get to maybe do a little more iterative

673
00:34:10,320 --> 00:34:13,159
and improve an expression on that that might be an

674
00:34:13,159 --> 00:34:13,920
easier way to go.

675
00:34:14,239 --> 00:34:20,159
Speaker 1: The anthropomorphizing problem is a big one, and Richard's been

676
00:34:20,159 --> 00:34:22,199
banging this drum for a long long time. Don't fall

677
00:34:22,239 --> 00:34:24,639
into that trap, or try not to. But it's kind

678
00:34:24,639 --> 00:34:30,599
of like having an educated uncle who sounds very smart

679
00:34:31,199 --> 00:34:36,000
and uses big words and never says um or like

680
00:34:36,360 --> 00:34:40,440
or you know uh, and you know you ask them

681
00:34:40,480 --> 00:34:45,280
the question, they give you a very intellectual sounding answer,

682
00:34:45,760 --> 00:34:49,760
and it may be completely wrong. They may have brain damage.

683
00:34:50,000 --> 00:34:54,320
You know, your uncle may be educated at Harvard. However,

684
00:34:55,079 --> 00:34:56,840
he got in a car accident a couple of years

685
00:34:56,840 --> 00:34:59,880
ago and hasn't been the same since, but he's still

686
00:35:00,119 --> 00:35:01,239
sounds very smart.

687
00:35:03,400 --> 00:35:07,519
Speaker 3: I like, I've taken issue with some of the terminology

688
00:35:07,639 --> 00:35:12,800
like GitHub Copilot, great product name, it's not a copilot.

689
00:35:13,000 --> 00:35:15,360
Like we talked about this as being your AI assistant.

690
00:35:15,920 --> 00:35:18,519
It's going to help me do my job. I've flown

691
00:35:18,559 --> 00:35:21,039
on enough commercial flights they all have co pilots, and

692
00:35:21,039 --> 00:35:23,519
I'm pretty sure those people are fully qualified to fly

693
00:35:23,639 --> 00:35:26,760
the plane and are probably doing it. Also, on a

694
00:35:26,800 --> 00:35:29,159
long flight, they aren't flying the plane. They flip on

695
00:35:29,199 --> 00:35:32,599
the autopilot, So at some point you are trusting the

696
00:35:32,599 --> 00:35:34,760
computer to do the thing with monitoring.

697
00:35:34,840 --> 00:35:38,559
Speaker 2: I think GitHub Copilot did a disservice to real copilots. Yes,

698
00:35:38,639 --> 00:35:41,679
although I did appreciate the name at least implied to you, Hey,

699
00:35:42,199 --> 00:35:44,639
you're still the pilot. It's still your fault.

700
00:35:46,519 --> 00:35:50,480
Speaker 3: I think AI agents has been, like, changed to

701
00:35:50,599 --> 00:35:54,199
agentic AI, which is better but harder to say. So

702
00:35:54,239 --> 00:35:54,960
no one says it.

703
00:35:55,760 --> 00:35:57,119
Speaker 2: But that's the idea of these.

704
00:35:57,400 --> 00:36:01,000
Speaker 3: The AI gets to make these decisions and we're not

705
00:36:01,119 --> 00:36:03,400
there yet. No, Like we shouldn't just let them run

706
00:36:03,400 --> 00:36:03,719
amok.

707
00:36:03,920 --> 00:36:05,679
Speaker 2: I don't want to. I don't want to get there.

708
00:36:05,800 --> 00:36:07,199
We're going to get there. Yeah, it's going to have

709
00:36:07,360 --> 00:36:09,880
This is the thing they're pitching now, and there are certain workflows.

710
00:36:09,880 --> 00:36:12,719
I mean, it's really not that different from any sort

711
00:36:12,719 --> 00:36:17,800
of stream based bit of software that has the ability

712
00:36:17,880 --> 00:36:21,519
to to act in some way, right, Like, I've played

713
00:36:21,519 --> 00:36:27,440
with plenty of prescriptive analytic models for email prompting. You know,

714
00:36:27,480 --> 00:36:29,559
you don't necessarily want to be in the

715
00:36:29,599 --> 00:36:33,079
workflow of this person's been to the site, they put

716
00:36:33,119 --> 00:36:34,760
in some stuff in the cart. If they didn't buy,

717
00:36:35,119 --> 00:36:37,159
we send them a little email about the things they

718
00:36:37,159 --> 00:36:39,199
put in their cart, right, Like, all of that is

719
00:36:39,320 --> 00:36:42,719
automated now, and the fact that it's using a machine

720
00:36:42,719 --> 00:36:45,320
model to determine when to send that email, like, don't

721
00:36:45,320 --> 00:36:48,320
send it right away, that's creepy, right, it's a few

722
00:36:48,360 --> 00:36:52,440
hours later, and they're actually using all of the relative

723
00:36:52,480 --> 00:36:55,559
response data to feed back into the model to adjust the time.

724
00:36:55,719 --> 00:36:55,800
Speaker 3: Yeah.

725
00:36:56,039 --> 00:36:58,920
Speaker 2: Now that these agentic models are going to play on

726
00:36:58,960 --> 00:37:02,559
that presumably take it further and maybe make it easier

727
00:37:02,559 --> 00:37:05,599
to build, because good prescriptive modeling is hard, Like, that's

728
00:37:05,639 --> 00:37:07,679
a tough thing to build, So maybe we're lowering the

729
00:37:07,719 --> 00:37:09,880
bar for how to make this stuff.

730
00:37:09,880 --> 00:37:14,519
Speaker 1: You also have to beware of AIs or AI agents

731
00:37:14,599 --> 00:37:19,000
rewriting things that you wrote and making sure that if

732
00:37:19,039 --> 00:37:24,679
you do choose an augmented version that it's accurate, so

733
00:37:24,920 --> 00:37:28,519
that means you have to check it, right. Let's say, let's

734
00:37:28,519 --> 00:37:30,480
say you're on Facebook and you want to do an

735
00:37:30,519 --> 00:37:33,639
ad or something, so you write the ad copy and

736
00:37:33,679 --> 00:37:36,639
then it gives you three other options. Hey how about this,

737
00:37:36,719 --> 00:37:39,679
which sounds... one of them sounds more exciting and stuff,

738
00:37:39,719 --> 00:37:44,280
but it left out details. It left out links, for example,

739
00:37:45,239 --> 00:37:49,599
it left out references to other Facebook pages, so it makes

740
00:37:49,639 --> 00:37:50,320
me skeptical.

741
00:37:50,880 --> 00:37:54,280
Speaker 3: Yeah, one of the other. There's actually a product that's

742
00:37:54,480 --> 00:37:57,639
out there now for my company because we do software

743
00:37:57,639 --> 00:37:59,960
for nonprofits, a lot of fundraising, a lot of donation,

744
00:38:00,760 --> 00:38:03,239
and the people who are good at running a nonprofit

745
00:38:03,320 --> 00:38:05,960
and doing the behind the scenes work might not be

746
00:38:06,039 --> 00:38:09,039
the best people at writing the message that little, you know,

747
00:38:09,199 --> 00:38:10,760
short text that's going to go out and say, hey,

748
00:38:10,840 --> 00:38:14,920
please donate for this cause. Or we have JustGiving,

749
00:38:14,960 --> 00:38:16,760
which is kind of like GoFundMe. You have a

750
00:38:16,800 --> 00:38:18,559
small little thing. How do you put that little blurb

751
00:38:18,599 --> 00:38:21,719
out there that's not too long that gets people's attention.

752
00:38:21,960 --> 00:38:24,400
And so we have tools that allow you to, you know,

753
00:38:24,599 --> 00:38:27,079
click these buttons and change the tone and it'll help

754
00:38:27,159 --> 00:38:30,440
you write the message. Because again, it's just language. It's

755
00:38:30,480 --> 00:38:33,559
great at writing language that convinces people, and we've seen

756
00:38:33,920 --> 00:38:36,519
people donate more to causes that are using that product.

757
00:38:36,639 --> 00:38:40,800
So you know, there's a very clear correlation to well,

758
00:38:40,800 --> 00:38:43,480
it's good for our products, good for the nonprofits that

759
00:38:43,519 --> 00:38:46,480
are using it. So again, find the right places for it.

760
00:38:47,800 --> 00:38:50,840
I think the agents they're going to get better as

761
00:38:50,880 --> 00:38:53,280
we get them to be more specialized. And this is

762
00:38:53,320 --> 00:38:56,000
one of those weird things that the bigger is not

763
00:38:56,039 --> 00:38:58,719
always better. We've talked about these LLMs, right, that have

764
00:38:58,800 --> 00:39:02,960
grown to billions and billions of parameters. Right, I can't

765
00:39:03,000 --> 00:39:06,039
remember what GPT three to GPT four was, like double the

766
00:39:06,119 --> 00:39:09,199
number of parameters, basically how big the model is and

767
00:39:09,239 --> 00:39:10,239
how many things it knows.

768
00:39:11,679 --> 00:39:12,960
Speaker 2: It was more it was like one hundred and seventy

769
00:39:12,960 --> 00:39:16,159
five billion to a trillion. Yeah, yeah, you know, five

770
00:39:16,239 --> 00:39:18,519
times, and the context size grew.

771
00:39:18,599 --> 00:39:18,760
Speaker 3: Right.

772
00:39:18,840 --> 00:39:22,360
Speaker 2: Yeah, Well, there's an argument there's no path forward because

773
00:39:22,400 --> 00:39:24,840
there's not more data, Like there's not four trillion parameters

774
00:39:24,840 --> 00:39:25,400
to be had.

775
00:39:26,079 --> 00:39:30,320
Speaker 3: Yeah. Yeah, they can't train it on anything else because

776
00:39:30,320 --> 00:39:32,639
it'll start training itself on what it knows and then

777
00:39:32,639 --> 00:39:36,480
it'll be cyclical. It'll just, like, go down the spiral.

778
00:39:35,920 --> 00:39:37,719
Speaker 2: You know, in science fiction when we talked about a

779
00:39:37,760 --> 00:39:41,840
superintelligence, the idea is that it would start self-learning and getting better.

780
00:39:42,400 --> 00:39:46,079
So far, this software, when self-learning, gets worse. Like,

781
00:39:46,159 --> 00:39:48,440
if you if you put it back out to train

782
00:39:48,480 --> 00:39:52,079
against its own data, it gets less effective, like it's

783
00:39:52,079 --> 00:39:53,559
a photocopy of a photocopy.

784
00:39:53,639 --> 00:39:55,559
Speaker 1: But for the average user, they can build up a

785
00:39:55,599 --> 00:40:00,360
context over time so that it quote unquote knows, or

786
00:40:00,440 --> 00:40:03,000
remembers things that you've talked about in the past. So

787
00:40:03,039 --> 00:40:05,360
you can just say, hey, remember that application I was

788
00:40:05,400 --> 00:40:07,599
telling you about called blah blah blah. Yeah, I have a

789
00:40:07,679 --> 00:40:11,559
question about that, and it'll know, well, quote unquote know, yeah,

790
00:40:11,599 --> 00:40:12,960
it'll pull up that context.

791
00:40:13,000 --> 00:40:18,320
Speaker 3: And that's the most effective way to use any LLM

792
00:40:18,559 --> 00:40:23,599
is good prompt engineering, And that's like we talked about

793
00:40:23,679 --> 00:40:25,960
RAG: go and find this data. But you've got to

794
00:40:26,199 --> 00:40:28,079
have someone who knows how to do that effectively, because

795
00:40:28,119 --> 00:40:30,639
if you do it poorly, you get worse results. Here's how

796
00:40:30,679 --> 00:40:33,400
to index my data. But if you can stick everything

797
00:40:33,440 --> 00:40:35,920
that you could possibly want to know about this question

798
00:40:36,039 --> 00:40:39,400
into the prompt, you're gonna get great, great results. The

799
00:40:39,400 --> 00:40:42,280
problem is the corpus of knowledge that I need you

800
00:40:42,320 --> 00:40:45,800
to know about for my product won't fit into forty

801
00:40:45,840 --> 00:40:47,400
thousand characters or tokens.

802
00:40:47,519 --> 00:40:52,599
Speaker 2: Have either of you run into Windows Recall yet?

803
00:40:52,760 --> 00:40:53,280
not yet.

804
00:40:53,400 --> 00:40:55,159
Speaker 3: We have a new laptop that has it, but I

805
00:40:55,159 --> 00:40:56,159
haven't played with it yet.

806
00:40:56,320 --> 00:40:58,719
Speaker 2: Yeah, I mean there's been a lot of... This was

807
00:40:58,760 --> 00:41:01,599
announced back in... I think Microsoft did a terrible

808
00:41:01,679 --> 00:41:05,239
job of announcing it was only for the new laptops,

809
00:41:05,280 --> 00:41:11,320
the Copilot Plus, the Qualcomm-based Copilot Plus PC laptops. But

810
00:41:11,400 --> 00:41:13,719
it's basically taking a snapshot of everything you're doing the

811
00:41:13,719 --> 00:41:16,159
whole time. Like you talk about a way to generate

812
00:41:17,360 --> 00:41:21,079
knowledge about you for a system, I could see. You know,

813
00:41:21,159 --> 00:41:23,000
at the moment, it looks like it's largely just a

814
00:41:23,000 --> 00:41:25,960
search engine. Hey what was that pair of pants I

815
00:41:26,000 --> 00:41:28,519
was looking at in the past? Right? And the fact

816
00:41:28,559 --> 00:41:30,159
that it can sort through all of that because it

817
00:41:30,159 --> 00:41:33,199
has a copy of everything you've done stored on your

818
00:41:33,199 --> 00:41:37,119
machine is interesting. But there is that larger idea that

819
00:41:37,280 --> 00:41:43,960
over years, you'd gradually be building a remarkable augmented retrieval

820
00:41:44,039 --> 00:41:47,239
set about yourself just because it knows everything you've done,

821
00:41:47,320 --> 00:41:50,119
given that you were only working from one machine. Because

822
00:41:50,119 --> 00:41:53,519
it's only stored on that machine, so

823
00:41:53,559 --> 00:41:55,599
you're within the constraints of what that computer can do

824
00:41:56,000 --> 00:41:57,840
in the name of security, but if you did anything

825
00:41:57,840 --> 00:41:59,039
on your phone, it's not going to know about it.

826
00:41:59,039 --> 00:42:00,800
If you have more than one... and who would have

827
00:42:00,840 --> 00:42:03,239
more than one computer? Well, Gmail is

828
00:42:03,880 --> 00:42:05,320
that experience for me now?

829
00:42:05,440 --> 00:42:07,280
Speaker 1: Yeah, I mean, and it has been for a long time.

830
00:42:07,400 --> 00:42:10,239
It's your store of intelligence, it's my store of intelligence.

831
00:42:10,239 --> 00:42:12,119
But the problem with it, of course, is when you

832
00:42:12,159 --> 00:42:16,079
want to find something, you're going to hit on all

833
00:42:16,159 --> 00:42:20,719
of the spam that you've gotten first, and because there's

834
00:42:20,760 --> 00:42:24,000
more of it now, you have to really get creative

835
00:42:24,039 --> 00:42:27,960
with your search, your advanced search, and you know, that's

836
00:42:28,039 --> 00:42:31,760
kind of the problem. Whereas this recall thing sounds like

837
00:42:31,800 --> 00:42:33,239
it's a little bit more focused, and...

838
00:42:33,159 --> 00:42:35,679
Speaker 2: There's lots of people freaking out about the security around it, like, oh,

839
00:42:35,719 --> 00:42:38,639
of course, and they were as soon as it was introduced.

840
00:42:38,880 --> 00:42:42,599
Speaker 3: Yeah, I think anyone who's taking this seriously understands

841
00:42:42,599 --> 00:42:45,719
there's a security aspect. I think you're going to see

842
00:42:45,760 --> 00:42:49,599
small language models become more popular because it's

843
00:42:49,599 --> 00:42:51,679
not going over the wire. The model is

844
00:42:51,719 --> 00:42:54,119
small enough. I can host it on my laptop, or

845
00:42:54,159 --> 00:42:56,840
I can host it on my servers. I don't need

846
00:42:56,880 --> 00:43:01,960
to be calling out to ChatGPT, the OpenAI APIs, right,

847
00:43:02,000 --> 00:43:04,000
and so I don't have the concern of sending the

848
00:43:04,039 --> 00:43:06,360
data over the wire if I can self host it.

849
00:43:06,400 --> 00:43:09,039
Now I'm paying the hosting costs of hosting a model,

850
00:43:09,559 --> 00:43:11,920
but maybe I get better results because I don't have

851
00:43:11,960 --> 00:43:14,440
to sanitize my data. So I'm able to ask better

852
00:43:14,519 --> 00:43:16,880
questions because I know that the data is not leaving

853
00:43:16,920 --> 00:43:17,440
my domain.

854
00:43:17,519 --> 00:43:19,920
Speaker 2: I don't think the customer cares at all, right, like

855
00:43:19,960 --> 00:43:22,679
they're happily using ChatGPT inside of companies even though

856
00:43:22,679 --> 00:43:26,360
they're specifically forbidden from doing so. You know, the

857
00:43:26,400 --> 00:43:29,280
administrators, the cyber folks, are battling it. Give them a path forward to

858
00:43:29,360 --> 00:43:32,400
use actually secure approaches. Right? If people cared about security, we'd

859
00:43:32,400 --> 00:43:34,119
have a lot fewer problems. There's nothing you can do

860
00:43:34,159 --> 00:43:38,519
to make people care about security. Cost, there is something, right?

861
00:43:39,000 --> 00:43:41,760
Speaker 1: Yeah, if you can if you can equate security to dollars,

862
00:43:42,599 --> 00:43:44,880
like if we don't take these secure initiatives, we're going

863
00:43:44,960 --> 00:43:46,840
to lose x amount of dollars, then you'll get some

864
00:43:46,880 --> 00:43:47,440
of their attention.

865
00:43:47,719 --> 00:43:50,519
Speaker 2: Right. If you run this in the cloud, it costs you,

866
00:43:50,440 --> 00:43:52,079
you know, X dollars a month. And if you run

867
00:43:52,119 --> 00:43:55,079
it locally, it doesn't, right, you might actually win some

868
00:43:55,119 --> 00:43:56,119
folks over to Yeah.

869
00:43:56,159 --> 00:43:59,519
Speaker 3: I think that's why Microsoft Copilot... Again, the Copilot name

870
00:43:59,559 --> 00:44:02,480
is now on, like, many different products. The biggest number

871
00:44:02,679 --> 00:44:06,719
I've heard internally is over two hundred. Wow. But

872
00:44:06,800 --> 00:44:09,639
it's the "I want to search our SharePoint and One

873
00:44:09,719 --> 00:44:13,239
Drive and Teams and everything else." But it's like, here's

874
00:44:13,280 --> 00:44:15,960
the thing that we provide, like well, all the other

875
00:44:16,000 --> 00:44:20,079
stuff that we trust Microsoft to secure, that's going to

876
00:44:20,119 --> 00:44:21,840
take care of it. I'm sure Gemini is going to

877
00:44:21,880 --> 00:44:24,800
do the same thing if you're a Google-based company, right? You're...

878
00:44:24,639 --> 00:44:27,000
Speaker 2: Doing Google workspaces, you're doing the same sort of thing

879
00:44:27,159 --> 00:44:30,000
more or less. Yeah, and I think you're

880
00:44:30,000 --> 00:44:32,159
talking about M365 Copilot in that case.

881
00:44:32,199 --> 00:44:34,320
But it looks like there's at least several flavors of

882
00:44:34,440 --> 00:44:36,559
M365 Copilot. The thing that'll give you

883
00:44:36,599 --> 00:44:39,360
hints to make better PowerPoint slides is different from

884
00:44:39,400 --> 00:44:42,280
the thing that'll find stuff in SharePoint for you, but it's

885
00:44:42,320 --> 00:44:44,440
all under the M365 Copilot moniker.

886
00:44:44,679 --> 00:44:48,719
Speaker 1: So I got bills for Microsoft Copilot three sixty

887
00:44:48,719 --> 00:44:51,920
five and I never signed up for it. Did they

888
00:44:52,159 --> 00:44:57,880
automatically convert your Office three sixty five subscriptions to Copilot

889
00:44:57,920 --> 00:44:59,960
and tack on a couple hundred dollars for some reason.

890
00:45:00,320 --> 00:45:01,280
Speaker 2: I don't know the answer to that.

891
00:45:01,599 --> 00:45:05,599
Speaker 3: Yeah, anyway, you need an AI to understand Microsoft billing.

892
00:45:05,800 --> 00:45:08,719
Speaker 2: Yeah, I guess. Or a lawyer.

893
00:45:08,840 --> 00:45:10,400
Speaker 1: One thing I want to mention to you, Richard is

894
00:45:11,079 --> 00:45:14,639
we should get Tess Ferrandez on the show to talk about RAG.

895
00:45:14,760 --> 00:45:16,719
She has some different ideas than mainstream.

896
00:45:16,840 --> 00:45:19,719
Speaker 2: Yeah, there's good tech in that space, no, no doubt. And

897
00:45:19,760 --> 00:45:24,400
it is the sort of balance between consumer level tools

898
00:45:24,440 --> 00:45:27,920
for this and this lane that we play in as developers

899
00:45:27,920 --> 00:45:31,920
where we're building code for our companies and they have

900
00:45:32,159 --> 00:45:36,599
other requirements, but it's also utilizing the data of the company.

901
00:45:36,800 --> 00:45:40,119
I mean, I would be highly resistant to wanting to

902
00:45:40,119 --> 00:45:42,159
build apps in this space with things like M three

903
00:45:42,199 --> 00:45:45,440
sixty five Copilot around. Like when it comes to navigating

904
00:45:45,480 --> 00:45:48,199
through the data of the company, that seems to be

905
00:45:48,280 --> 00:45:50,360
the tool, given your company is in M

906
00:45:50,400 --> 00:45:54,719
three sixty five. Yeah, right. But, I mean, I

907
00:45:54,719 --> 00:45:56,320
wonder if it's just a UX feature when it comes

908
00:45:56,320 --> 00:45:58,559
to LLMs, it's really just a UX feature. It's

909
00:45:58,599 --> 00:46:00,679
a different way to communicate with a piece of software to

910
00:46:00,679 --> 00:46:01,519
get results you want.

911
00:46:01,840 --> 00:46:05,760
Speaker 3: Where we're at right now, that's the best use case.

912
00:46:06,119 --> 00:46:10,239
Everything below that starts getting into the questionable: should we

913
00:46:10,280 --> 00:46:11,360
do it?

914
00:46:11,480 --> 00:46:13,440
Speaker 2: Or is it good enough? And I think we did

915
00:46:13,440 --> 00:46:15,079
a show with Vishwas on that long ago where he's

916
00:46:15,079 --> 00:46:17,639
getting involved with a startup where it's trying to write.

917
00:46:18,039 --> 00:46:20,039
It's trying to use the tool to write proposals more

918
00:46:20,119 --> 00:46:23,880
quickly than humans can, except the company then has to

919
00:46:23,920 --> 00:46:28,079
submit that proposal and comply with that proposal. So making

920
00:46:28,119 --> 00:46:31,119
sure that proposal is accurate is not a trivial problem.

921
00:46:31,639 --> 00:46:34,559
In the end, writing the software to generate the proposal is the easy part.

922
00:46:34,840 --> 00:46:37,639
Testing to make sure the proposal is correct, and that you're

923
00:46:39,199 --> 00:46:42,079
capable of executing on it, that's

924
00:46:42,280 --> 00:46:45,599
much harder. And what takes longer: for somebody smart to

925
00:46:45,599 --> 00:46:48,880
write the proposal himself or for somebody not quite as

926
00:46:48,880 --> 00:46:52,079
smart to generate the proposal and then have the smart

927
00:46:52,079 --> 00:46:55,239
people read it and make sure it's right. And the

928
00:46:55,239 --> 00:46:58,639
consequences of being wrong are money, right? You're going

929
00:46:58,719 --> 00:47:00,840
to submit a proposal for something you're not going to

930
00:47:00,880 --> 00:47:04,119
be capable of doing, or if it's grossly underpriced, or

931
00:47:04,280 --> 00:47:07,880
you know, you missed a requirement. Like, validating these

932
00:47:07,880 --> 00:47:10,199
things is not a trivial problem. I look at it

933
00:47:10,239 --> 00:47:14,079
almost like writing contracts. I've dealt with

934
00:47:14,079 --> 00:47:16,719
companies where we had service level agreements with other companies

935
00:47:16,760 --> 00:47:18,639
but had no way to measure whether we were compliant.

936
00:47:19,119 --> 00:47:22,039
Speaker 3: And the problem is the level of effort to write

937
00:47:22,199 --> 00:47:25,440
a good contract and a bad contract is exactly the same,

938
00:47:25,679 --> 00:47:26,280
more or less.

939
00:47:26,360 --> 00:47:27,159
Speaker 2: Yeah, if you're.

940
00:47:27,000 --> 00:47:30,320
Speaker 3: Pressing the button for putting this in and the AI

941
00:47:30,440 --> 00:47:33,119
generates it, you don't know if it's a good or

942
00:47:33,159 --> 00:47:35,320
a bad one until you have an expert review that.

943
00:47:35,440 --> 00:47:38,199
And so you get the perception that, oh, we've saved

944
00:47:38,199 --> 00:47:40,119
this effort because look at how easy it was to

945
00:47:40,480 --> 00:47:42,800
press the button. But if you still have to have

946
00:47:42,880 --> 00:47:46,559
someone do a thorough analysis, and maybe it made it

947
00:47:46,599 --> 00:47:48,400
easier for them to do the review than to be

948
00:47:48,679 --> 00:47:50,679
writing it the whole time, and maybe they make a mistake,

949
00:47:50,719 --> 00:47:52,840
and so you still need a second editor to review it.

950
00:47:53,280 --> 00:47:56,280
Maybe you only need one qualified expert instead of two.

951
00:47:56,559 --> 00:48:01,360
But there's that perception that, oh, it's good, and

952
00:48:01,400 --> 00:48:04,199
you don't realize that it could also be just as bad. Yeah,

953
00:48:04,199 --> 00:48:05,679
And if you don't know how to tell that this

954
00:48:05,800 --> 00:48:07,760
was good and this is bad, then that's not a

955
00:48:07,800 --> 00:48:08,519
good implementation.

956
00:48:08,639 --> 00:48:10,320
Speaker 2: It gets back to this: you really can't use

957
00:48:10,320 --> 00:48:12,199
this tool unless you're qualified to have done this work

958
00:48:12,239 --> 00:48:15,079
without the tool, right, because you need to evaluate its output.

959
00:48:15,159 --> 00:48:17,679
Speaker 3: Yeah, Like I said, if you take the LLM out

960
00:48:17,719 --> 00:48:19,960
and put a human in there, what would you do

961
00:48:20,000 --> 00:48:21,760
to make sure they did it right? And we tend

962
00:48:21,760 --> 00:48:24,480
to do that because we don't trust people for really

963
00:48:24,519 --> 00:48:27,719
important stuff like I'm going to write this contract, Richard's

964
00:48:27,719 --> 00:48:29,400
going to review it to make sure that I dotted

965
00:48:29,440 --> 00:48:30,599
all my i's and crossed my t's.

966
00:48:30,719 --> 00:48:32,920
Speaker 2: I would argue the LLM's better in the checking role

967
00:48:33,440 --> 00:48:35,239
that you get the contract and then you run it

968
00:48:35,239 --> 00:48:36,920
through the LLM to say what's been missed.

969
00:48:37,719 --> 00:48:41,039
Speaker 3: Yeah, and that's like having Copilot, GitHub Copilot, like,

970
00:48:41,159 --> 00:48:41,960
write the tests.

971
00:48:42,079 --> 00:48:43,000
Speaker 2: Yeah for my code?

972
00:48:43,239 --> 00:48:45,679
Speaker 3: Should you have it write the code and write the tests?

973
00:48:45,840 --> 00:48:46,199
Speaker 2: Well?

974
00:48:46,360 --> 00:48:49,199
Speaker 3: Maybe maybe not. I have the developers do that as well,

975
00:48:49,360 --> 00:48:50,960
so we kind of accept it.

976
00:48:51,000 --> 00:48:54,239
Speaker 2: So that's a little unnerving, because it's like garbage in, garbage

977
00:48:54,239 --> 00:48:56,159
out, and it's like, I can't write good code, but

978
00:48:56,159 --> 00:48:58,639
I can't write good tests either, So it passed. Well.

979
00:48:58,679 --> 00:49:02,159
Speaker 1: There are people out there who can read and understand code,

980
00:49:02,199 --> 00:49:04,480
but aren't so good at writing it, especially when it

981
00:49:04,519 --> 00:49:08,719
comes to architectural decisions, right, so you know that may

982
00:49:08,760 --> 00:49:10,119
be a good role for them.

983
00:49:10,360 --> 00:49:13,079
Speaker 3: And that's where the context matters, Like the architect is

984
00:49:13,079 --> 00:49:16,599
always going to say, it depends. So how do I

985
00:49:16,639 --> 00:49:19,519
know that this code is good code in this application?

986
00:49:19,639 --> 00:49:22,920
In this instance? So I can have it write the code,

987
00:49:23,280 --> 00:49:25,480
but is it the right code? Does it follow our patterns?

988
00:49:25,519 --> 00:49:27,199
And that gets back to the idea that I learned:

989
00:49:27,599 --> 00:49:29,760
If I give it a better prompt and say here

990
00:49:29,840 --> 00:49:32,599
is how we write our code. Follow these standards, but

991
00:49:32,679 --> 00:49:35,559
that's still just basic stuff like how to write async

992
00:49:35,599 --> 00:49:38,159
and where to put your curly braces, it's not follow

993
00:49:38,199 --> 00:49:39,239
these design patterns.

994
00:49:39,400 --> 00:49:44,880
Speaker 2: This also reminds me of how outsourcing actually worked, or didn't,

995
00:49:44,920 --> 00:49:47,400
because if you can't describe the problem well when outsourcing, it's

996
00:49:47,400 --> 00:49:49,039
not going to work any better than if you did it locally.

997
00:49:49,239 --> 00:49:51,679
But as we got better at describing problems for out

998
00:49:51,679 --> 00:49:54,639
sourced work, we actually got good results. If you get better

999
00:49:54,679 --> 00:49:59,039
at describing in the prompt what I need you

1000
00:49:59,079 --> 00:50:02,800
to do here, you know through that tool you're going

1001
00:50:02,840 --> 00:50:06,440
to get usable results. I do like the idea of

1002
00:50:07,599 --> 00:50:10,039
these tools being good at going down the checklist,

1003
00:50:10,280 --> 00:50:12,880
like taking an architectural design and feeding it to an

1004
00:50:13,000 --> 00:50:15,719
LLM with a good prompt about what our expectations are

1005
00:50:15,760 --> 00:50:18,280
around an architectural design and saying what have we missed?

1006
00:50:18,559 --> 00:50:19,880
Like what would you correct?

1007
00:50:20,480 --> 00:50:23,960
Speaker 1: There's one thing I really hate about being the recipient

1008
00:50:24,159 --> 00:50:28,800
of LLM-generated content, and that is being lied to.

1009
00:50:30,159 --> 00:50:33,599
And I get an email that says, you know, hey,

1010
00:50:33,639 --> 00:50:36,199
we looked at your podcast. We think it's amazing. We

1011
00:50:36,239 --> 00:50:38,719
want to talk to you about blah blah blah, and

1012
00:50:38,760 --> 00:50:41,480
I really love this episode with yada yada.

1013
00:50:41,320 --> 00:50:43,639
Speaker 2: And you know it's just all generated.

1014
00:50:44,239 --> 00:50:48,400
Speaker 1: And or here's another one I got recently. We'd really

1015
00:50:48,480 --> 00:50:51,480
like to buy your company. We'd really like to buy

1016
00:50:51,480 --> 00:50:54,239
your company. We think it's interesting and it's whatever. So

1017
00:50:54,639 --> 00:50:57,480
I ignore it because it's obviously a bot. Yeah, and

1018
00:50:57,519 --> 00:51:00,960
they send another email another week: Hey, are you, are

1019
00:51:01,000 --> 00:51:04,559
you really... are you interested in selling your company? I ignore

1020
00:51:04,639 --> 00:51:07,840
that. Another one: I'm circling back, you know, blah blah blah.

1021
00:51:08,000 --> 00:51:11,440
You know the kind. And so I wrote back and I said,

1022
00:51:11,800 --> 00:51:14,280
why don't you tell me what you know about my company?

1023
00:51:14,400 --> 00:51:17,760
Speaker 2: Crickets. Of course not. The whole goal was

1024
00:51:17,800 --> 00:51:19,320
to get you to respond at all, and then you

1025
00:51:19,400 --> 00:51:21,519
just go on a list. Now that person will respond.

1026
00:51:21,599 --> 00:51:25,639
Speaker 3: They ignored it. What you were saying about asking

1027
00:51:25,679 --> 00:51:28,159
it to analyze your design documents. I think that's something

1028
00:51:28,199 --> 00:51:31,639
that the architects can really benefit from, Like we can

1029
00:51:31,760 --> 00:51:34,559
use these tools to our benefit. It's another pair of eyes. Yeah,

1030
00:51:34,760 --> 00:51:37,159
you know, metaphorically speaking, use it as your rubber duck.

1031
00:51:37,159 --> 00:51:39,639
I don't always have someone around. I've got a whiteboard here,

1032
00:51:39,679 --> 00:51:42,719
but that's you know, for me to like draw stuff out,

1033
00:51:42,840 --> 00:51:44,679
and I want to know, is this good? If I

1034
00:51:44,679 --> 00:51:47,199
write up in an ADR, an architecture decision record,

1035
00:51:47,800 --> 00:51:49,840
what did I miss? And I can ask it to

1036
00:51:50,119 --> 00:51:51,599
you know, analyze that and find things I missed.

1037
00:51:51,639 --> 00:51:54,119
Speaker 2: Another thing the tool is good for is documenting

1038
00:51:54,559 --> 00:51:58,480
what we did, right, Yeah, recording the meeting where you

1039
00:51:58,679 --> 00:52:01,199
generated that ADR in the end and then having the

1040
00:52:01,199 --> 00:52:03,679
tool summarize that as part of why this ADR exists.

1041
00:52:03,719 --> 00:52:05,000
There was a meeting on this date. These are the

1042
00:52:05,039 --> 00:52:07,639
people who were there, these were the key talking points. This

1043
00:52:07,800 --> 00:52:10,800
was the decision. ADR? Fire the secretary. You're supposed to

1044
00:52:10,800 --> 00:52:12,639
write that up. We just don't.

1045
00:52:13,039 --> 00:52:15,679
Speaker 3: Yes, yeah, we had the meeting, we made a decision. Yeah,

1046
00:52:15,679 --> 00:52:19,320
but the why behind the decision is in the discussion. Yeah,

1047
00:52:19,360 --> 00:52:21,800
and so if they can capture that. And then there's

1048
00:52:21,800 --> 00:52:24,400
also the I need to communicate with different people. This

1049
00:52:24,440 --> 00:52:26,760
is the architect elevator idea. I need to talk to

1050
00:52:26,800 --> 00:52:29,599
the CEOs and the CTOs all the way down to

1051
00:52:29,639 --> 00:52:32,679
the engineers in the basement doing the work. Those are

1052
00:52:32,760 --> 00:52:35,719
different audiences. I might have that same design document that

1053
00:52:35,760 --> 00:52:38,760
I need to convey in different ways. I need to

1054
00:52:38,800 --> 00:52:41,159
get more detail to the engineers, but I need to

1055
00:52:41,199 --> 00:52:42,440
summarize it really quickly.

1056
00:52:42,760 --> 00:52:44,679
Speaker 2: I could see as an architect, you'd build up a

1057
00:52:44,679 --> 00:52:46,760
body of prompts. It's like, I'm about to take this

1058
00:52:46,840 --> 00:52:50,159
ADR to the CFO. Here's the CFO prompt. Yeah, they

1059
00:52:50,199 --> 00:52:53,639
care about return on investment, they care about initial capital costs,

1060
00:52:53,679 --> 00:52:56,599
like include these numbers, like that kind of thing, so

1061
00:52:56,639 --> 00:53:00,280
that the tool would spit out a fairly well shaped thing,

1062
00:53:01,239 --> 00:53:03,360
and you'd reuse that over and over again. Yep.

1063
00:53:03,559 --> 00:53:06,199
Speaker 3: Yeah, I think we're going to see the I have

1064
00:53:06,280 --> 00:53:08,280
my toolkit of here's the things I have, and I

1065
00:53:08,719 --> 00:53:11,039
used to write macros, am I going to write those

1066
00:53:11,199 --> 00:53:13,360
prompts that I reuse? That stuff that I had

1067
00:53:13,400 --> 00:53:17,000
to do that one time and said, please use file-scoped namespaces.

1068
00:53:17,360 --> 00:53:19,000
I shouldn't have to tell it every time, but it

1069
00:53:19,039 --> 00:53:21,079
doesn't remember, so I have to tell it every time.

1070
00:53:21,320 --> 00:53:23,199
But if I can have a macro that starts that,

1071
00:53:23,400 --> 00:53:26,760
like start our code with this and it's pre injected,

1072
00:53:27,360 --> 00:53:28,039
that's useful.

1073
00:53:28,519 --> 00:53:33,679
Speaker 2: Yep. Yeah, you're language-scoping now. You know, language

1074
00:53:33,719 --> 00:53:37,599
of the Executive Committee, language of the security group, you

1075
00:53:37,639 --> 00:53:41,599
know, language of the vendor. Yeah, you know, I

1076
00:53:41,639 --> 00:53:45,360
have gone back and reread previous interactions with

1077
00:53:45,400 --> 00:53:47,679
a given vendor, like an ISP that I was working

1078
00:53:47,719 --> 00:53:50,440
with on a project, to say, you know, what

1079
00:53:50,480 --> 00:53:53,480
were the ones that worked? Essentially when we talked this way,

1080
00:53:53,519 --> 00:53:55,239
we got better results, and you were almost cutting and

1081
00:53:55,280 --> 00:53:57,519
pasting from previous ones, like in a way building up.

1082
00:53:57,519 --> 00:54:00,719
That prompt would be that same example, maybe a little

1083
00:54:00,800 --> 00:54:03,760
quicker and a little clearer. I would also say, as

1084
00:54:03,760 --> 00:54:06,480
someone who organizes conferences, I can tell when you use

1085
00:54:06,519 --> 00:54:10,960
ChatGPT to write your abstract. Yep. So we've

1086
00:54:11,039 --> 00:54:13,400
got seven hundred submissions in, and three hundred of them have the

1087
00:54:13,760 --> 00:54:15,159
same opening sentence.

1088
00:54:15,440 --> 00:54:18,199
Speaker 3: I think this is one of those things where

1089
00:54:18,719 --> 00:54:22,119
QCon stands apart: we don't have a call for papers,

1090
00:54:22,400 --> 00:54:25,159
right? It's one hundred percent human-curated content. We get

1091
00:54:25,199 --> 00:54:26,960
together like six months in advance. I was on the

1092
00:54:28,039 --> 00:54:31,159
program committee for QCon San Francisco, and we say, what

1093
00:54:31,239 --> 00:54:33,159
are the topics right now that I want to learn

1094
00:54:33,159 --> 00:54:35,159
about, that I think other engineers want to learn about?

1095
00:54:35,199 --> 00:54:38,400
That gave us our twelve or fifteen tracks. We find track

1096
00:54:38,440 --> 00:54:41,159
hosts for each of those, and then they

1097
00:54:41,159 --> 00:54:43,679
reach out to their network and find five people to

1098
00:54:43,719 --> 00:54:45,119
talk about this topic.

1099
00:54:45,639 --> 00:54:46,440
Speaker 2: And that's great.

1100
00:54:46,679 --> 00:54:49,639
Speaker 3: That pull model means you're using the intelligence of the

1101
00:54:49,679 --> 00:54:53,239
people who know what's relevant right now. It's not a

1102
00:54:53,400 --> 00:54:54,679
push of "please accept my talk."

1103
00:54:54,840 --> 00:54:57,280
Speaker 2: Yeah, yeah, doesn't mean they didn't also write the abstract

1104
00:54:57,280 --> 00:54:58,039
with the machine.

1105
00:54:58,239 --> 00:55:00,599
Speaker 3: Well, I mean yes. I expect everyone to be using

1106
00:55:00,639 --> 00:55:03,239
whatever tools are at their disposal. And if an

1107
00:55:03,239 --> 00:55:06,679
AI, an LLM, allows you to make a better presentation,

1108
00:55:06,800 --> 00:55:09,760
that's great. I like the presentations that say I used

1109
00:55:10,119 --> 00:55:13,320
Claude to help write this presentation. Here's where it sucked,

1110
00:55:13,519 --> 00:55:16,679
because that's useful information to me. Like, it just can't write.

1111
00:55:16,800 --> 00:55:18,360
It can write a presentation, but it can't write a

1112
00:55:18,360 --> 00:55:18,639
great one.

1113
00:55:18,719 --> 00:55:22,559
Speaker 2: My concern is that catalog. When a potential attendee

1114
00:55:22,639 --> 00:55:25,199
looks and sees the same sets of words over and

1115
00:55:25,239 --> 00:55:28,719
over in every session, they're going to get the creeps, right? Like,

1116
00:55:28,760 --> 00:55:31,360
that's not good for business. So you definitely have to

1117
00:55:31,360 --> 00:55:33,960
push on: Hey, I like your topic, I like you.

1118
00:55:34,039 --> 00:55:36,519
I want you to speak. This is the topic you're

1119
00:55:36,519 --> 00:55:38,880
talking on. But you used ChatGPT to write this thing,

1120
00:55:38,920 --> 00:55:41,159
so it looks like every other abstract. Like you have

1121
00:55:41,239 --> 00:55:43,920
to write a better prompt or at least edit this

1122
00:55:44,119 --> 00:55:46,679
into something that looks like it's you rather than a

1123
00:55:46,679 --> 00:55:47,400
piece of software.

1124
00:55:47,480 --> 00:55:51,280
Speaker 1: Yeah, so especially here's here's a tell if the first

1125
00:55:51,320 --> 00:55:52,440
three words are "did you..."

1126
00:55:52,480 --> 00:55:55,880
Speaker 2: Know they I've got a lot of in the in

1127
00:55:56,400 --> 00:56:01,000
this fast based movie based technological world in a world

1128
00:56:01,719 --> 00:56:04,039
right the clerk's in opening. Yeah.

1129
00:56:04,079 --> 00:56:07,760
Speaker 3: Yeah, we've started having to use one, because InfoQ has article

1130
00:56:07,800 --> 00:56:11,360
submissions and we feed it through: was this AI-generated?

1131
00:56:11,440 --> 00:56:14,360
Like my son's in college and all through high school

1132
00:56:14,360 --> 00:56:16,440
and everything else, they've had plagiarism checkers and now they

1133
00:56:16,440 --> 00:56:20,679
have AI checkers. Wow, LLM checkers.

1134
00:56:20,679 --> 00:56:22,599
Speaker 2: I'm doing it. I'm doing my best to discourage people

1135
00:56:22,639 --> 00:56:24,519
from using AI because AI just tells me you don't

1136
00:56:24,559 --> 00:56:25,039
know what it is.

1137
00:56:25,719 --> 00:56:28,559
Speaker 3: That's the problem. It's such an easy thing to say.

1138
00:56:28,440 --> 00:56:30,840
Speaker 1: Yeah, I'm going to start using NS, which is natural

1139
00:56:30,880 --> 00:56:34,119
stupidity because it's kind of the same thing, just in reverse.

1140
00:56:34,239 --> 00:56:36,400
Speaker 3: But I mean again, I go back to it has

1141
00:56:36,559 --> 00:56:39,800
value and when it's wrong, that's a feature, not a bug.

1142
00:56:40,039 --> 00:56:42,880
Speaker 2: Yeah. Well, now I've been pressing against folks. Is like

1143
00:56:42,920 --> 00:56:45,960
when you say AI, what do you actually mean? Can

1144
00:56:46,000 --> 00:56:49,440
you articulate it? Yeah, so that it becomes a narrower term,

1145
00:56:49,880 --> 00:56:52,119
a term that needs to be qualified. Right, Why don't

1146
00:56:52,159 --> 00:56:54,800
you use the qualified term? Then maybe we can get going,

1147
00:56:54,880 --> 00:56:58,880
And I think otherwise we're all talking magic, right, yeah, right,

1148
00:56:59,480 --> 00:57:02,400
if you just say "AI," AI equals magic. Sorry, no magic allowed.

1149
00:57:02,599 --> 00:57:05,239
Oh, you're using a large language model? Well, that's not magic. Fine.

1150
00:57:06,039 --> 00:57:09,079
Speaker 3: Yeah. And that's what architectural intelligence is: figuring

1151
00:57:09,079 --> 00:57:12,719
out when to use those actual AI elements. What are

1152
00:57:12,719 --> 00:57:14,920
the real world things we can do? And that just

1153
00:57:14,960 --> 00:57:20,159
comes down to good traditional trade-off analysis. We make trade-offs.

1154
00:57:20,239 --> 00:57:21,800
Is this the right thing or the wrong thing? Oh,

1155
00:57:21,840 --> 00:57:24,239
it doesn't fit here. For all these decisions, I can

1156
00:57:24,280 --> 00:57:26,400
put it in my ADR and say I considered an

1157
00:57:26,679 --> 00:57:29,480
LLM and we decided to go with a traditional write

1158
00:57:29,519 --> 00:57:31,760
the code approach. But you've got to get past the

1159
00:57:31,880 --> 00:57:35,199
hype of hey, AI can do everything. Well, AI doesn't

1160
00:57:35,199 --> 00:57:37,760
actually mean anything. It's just a marketing term. What is

1161
00:57:37,800 --> 00:57:39,719
the tool and how can you use that tool?

1162
00:57:39,960 --> 00:57:42,159
Speaker 2: And you know the person doesn't actually know what they're

1163
00:57:42,159 --> 00:57:43,639
saying when they say that, because when you ask a

1164
00:57:43,760 --> 00:57:47,199
question like that, their reply tails off. Yep. Well, you know,

1165
00:57:47,719 --> 00:57:51,440
you know, sorry, we don't have Jarvis. Jarvis isn't the thing.

1166
00:57:52,119 --> 00:57:54,000
So what do you actually got here?

1167
00:57:54,760 --> 00:57:57,960
Speaker 1: Thomas? Is there anything we missed that you want to mention?

1168
00:57:58,679 --> 00:58:00,639
Speaker 3: I just wrap up. So I started with the quote

1169
00:58:00,679 --> 00:58:02,920
from Arthur C. Clarke. I think it's his third law.

1170
00:58:03,840 --> 00:58:05,719
The thing I want to wrap up with is his

1171
00:58:05,960 --> 00:58:09,400
second law that the only way of discovering the limits

1172
00:58:09,719 --> 00:58:12,079
of the possible is to venture a little ways past

1173
00:58:12,159 --> 00:58:14,840
them into the impossible. I think there's a lot of

1174
00:58:14,880 --> 00:58:18,400
this hype around AI and LLMs and what can they do?

1175
00:58:19,159 --> 00:58:21,480
But if we don't push the boundaries, we won't find what

1176
00:58:21,519 --> 00:58:24,199
those limits are. So sometimes you have to basically believe

1177
00:58:24,239 --> 00:58:26,800
the hype and go into that impossible and then we'll

1178
00:58:26,800 --> 00:58:28,480
figure out where we can actually get to.

1179
00:58:28,639 --> 00:58:30,119
Speaker 2: That's good, Well, thanks, Thomas.

1180
00:58:30,199 --> 00:58:32,639
Speaker 1: It's been enlightening to say the least, and it's always

1181
00:58:32,639 --> 00:58:35,199
good to talk to you, so thanks again.

1182
00:58:35,239 --> 00:58:36,320
Speaker 3: Always great to see you guys.

1183
00:58:36,519 --> 00:58:39,440
Speaker 2: All right, we'll see you next time on dot net.

1184
00:58:39,360 --> 00:59:02,280
Speaker 1: Rocks dot net Rocks is brought to you by Franklin's

1185
00:59:02,320 --> 00:59:06,360
Net and produced by Pop Studios, a full service audio,

1186
00:59:06,480 --> 00:59:10,920
video and post production facility located physically in New London, Connecticut,

1187
00:59:11,159 --> 00:59:15,960
and of course in the cloud online at pwop dot com.

1188
00:59:16,159 --> 00:59:18,280
Visit our website at d O T N E t

1189
00:59:18,519 --> 00:59:22,559
R O c k S dot com for RSS feeds, downloads,

1190
00:59:22,679 --> 00:59:26,360
mobile apps, comments, and access to the full archives going

1191
00:59:26,400 --> 00:59:29,599
back to show number one, recorded in September two.

1192
00:59:29,480 --> 00:59:30,039
Speaker 2: Thousand and two.

1193
00:59:30,719 --> 00:59:33,039
Speaker 1: And make sure you check out our sponsors. They keep

1194
00:59:33,119 --> 00:59:36,280
us in business. Now go write some code, See you

1195
00:59:36,320 --> 00:59:36,760
next time.

1196
00:59:37,679 --> 00:59:39,519
Speaker 3: You got jam, Vans

1197
00:59:41,559 --> 00:59:41,599
Speaker 1: And

