1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to .NET Rocks with

2
00:00:03,040 --> 00:00:07,879
no ads? Easy! Become a patron for just five dollars

3
00:00:07,919 --> 00:00:10,800
a month. You get access to a private RSS feed

4
00:00:10,839 --> 00:00:14,279
where all the shows have no ads. Twenty dollars a month

5
00:00:14,279 --> 00:00:17,679
will get you that and a special .NET Rocks patron mug.

6
00:00:18,160 --> 00:00:34,000
Sign up now at patreon.dotnetrocks.com.

7
00:00:34,119 --> 00:00:38,000
Speaker 2: Hey, guess what? It's .NET Rocks, Steve Sanderson Edition.

8
00:00:38,159 --> 00:00:41,240
I'm Carl Franklin, that's Richard Campbell. And uh, what can

9
00:00:41,280 --> 00:00:42,799
we say? Richard, Happy New Year?

10
00:00:43,039 --> 00:00:46,840
Speaker 3: We're making a show and checking it twice? Oh wait, no,

11
00:00:47,039 --> 00:00:48,920
Christmas is over? Yeah?

12
00:00:49,320 --> 00:00:52,159
Speaker 2: Uh all right, Well we got a lot of sort

13
00:00:52,200 --> 00:00:55,439
of pre-show ramble to do, so let's get right

14
00:00:55,479 --> 00:01:05,879
into it. Better Know a Framework? Awesome. Well, I don't have

15
00:01:05,920 --> 00:01:09,040
a framework piece or anything per se, but I do

16
00:01:09,159 --> 00:01:13,400
have some news about BlazorTrain and the Blazor Puzzle. So

17
00:01:13,879 --> 00:01:17,920
BlazorTrain I have done sparsely since we started the

18
00:01:17,959 --> 00:01:23,680
Blazor Puzzle, just focusing on redoing old things. But I've

19
00:01:23,719 --> 00:01:26,480
started a series on BlazorTrain now that's going to

20
00:01:26,560 --> 00:01:30,159
be weekly or every other week on what's new in

21
00:01:30,200 --> 00:01:33,920
.NET 9. So the first one is published maybe

22
00:01:34,079 --> 00:01:35,879
even the second one by the time this comes out,

23
00:01:36,480 --> 00:01:38,879
and the Blazor Puzzle is going to be a little

24
00:01:38,920 --> 00:01:42,400
different than usual. Typically, what we've done with the Blazor

25
00:01:42,439 --> 00:01:45,680
Puzzle is we give a problem and we say, okay, take

26
00:01:45,719 --> 00:01:48,319
the week or a few days and find a solution

27
00:01:48,400 --> 00:01:50,439
and email us the solution. Then we pick a winner

28
00:01:50,480 --> 00:01:54,159
from all the right answers. And we're not doing that anymore.

29
00:01:54,680 --> 00:01:57,040
We're going to give you a problem and then we're

30
00:01:57,079 --> 00:02:00,799
going to take a little pause, give people a chance

31
00:02:00,840 --> 00:02:04,480
to duck out or press the pause button on the video,

32
00:02:04,920 --> 00:02:07,239
and then we're going to give the solution. So no

33
00:02:07,319 --> 00:02:11,560
more winners, no more mugs, no more don't post this

34
00:02:11,639 --> 00:02:14,080
on social media. We want you to engage with us more.

35
00:02:14,400 --> 00:02:16,719
So it's kind of fun. It's more fun.

36
00:02:17,000 --> 00:02:17,680
Speaker 3: That's cool, man.

37
00:02:17,800 --> 00:02:20,599
Speaker 2: Yeah, So that's what I got. Just some updates from

38
00:02:20,639 --> 00:02:27,360
my Blazor videos. And also we should probably just briefly

39
00:02:27,400 --> 00:02:29,800
go over some things that happened in nineteen thirty six.

40
00:02:30,000 --> 00:02:31,680
Speaker 3: This is episode nineteen thirty six.

41
00:02:32,120 --> 00:02:34,639
Speaker 2: Yeah, since it's episode nineteen thirty six and I don't want

42
00:02:34,639 --> 00:02:37,599
to belabor stuff, and we don't want to take

43
00:02:37,599 --> 00:02:39,719
too much time. But there are a few important things

44
00:02:39,759 --> 00:02:44,000
that happened. Of course, you know, Nazi Germany just kind

45
00:02:44,039 --> 00:02:48,599
of ratcheted up in general. So there's a whole bunch

46
00:02:48,680 --> 00:02:53,319
of things. But on February fourth, John Maynard Keynes

47
00:02:53,400 --> 00:02:58,360
published The General Theory of Employment, Interest and Money, a

48
00:02:58,479 --> 00:03:04,879
revolutionary economic work that fundamentally transformed modern macroeconomic thought

49
00:03:05,759 --> 00:03:10,319
and policymaking. Everybody knows about Keynesian economics now. On

50
00:03:10,400 --> 00:03:15,360
February nineteenth, Marian Anderson performed at the White House, significant

51
00:03:16,199 --> 00:03:19,000
in the context of racial segregation. It would later be remembered

52
00:03:19,000 --> 00:03:22,759
as a pivotal moment in the civil rights movement.

53
00:03:23,960 --> 00:03:27,800
March first, the first B-17 bomber was delivered. On

54
00:03:27,919 --> 00:03:34,360
March seventeenth, the Saint Patrick's Day Flood hit Pittsburgh. Terrible, terrible flood.

55
00:03:36,199 --> 00:03:41,520
April thirteenth. Hitler appears on Time magazine cover. May seventh,

56
00:03:41,560 --> 00:03:47,520
Italian annexation of Ethiopia, and they continued to do a

57
00:03:47,520 --> 00:03:54,120
lot of damage in Ethiopia, the Italian military. On

58
00:03:54,159 --> 00:03:59,479
May twenty eighth, Alan Turing submitted his groundbreaking paper on computability.

59
00:04:00,919 --> 00:04:05,199
And he's a hero among us developers, of course, and

60
00:04:05,240 --> 00:04:09,960
mathematicians everywhere. And you know, do you have anything else

61
00:04:10,000 --> 00:04:11,719
you want to add? I mean, there's tons of things

62
00:04:11,719 --> 00:04:14,199
that happened. But sure, well, you know, I'll stick to

63
00:04:14,240 --> 00:04:17,120
the thylacine stuff. So nineteen thirty six was the

64
00:04:17,240 --> 00:04:21,800
year that the last thylacine died, also known as the

65
00:04:21,839 --> 00:04:25,399
Tasmanian tiger, although it was more wolf-like than anything else.

66
00:04:25,839 --> 00:04:29,240
It had been hunted for bounty because they

67
00:04:29,279 --> 00:04:31,399
believed it was killing sheep.

68
00:04:31,439 --> 00:04:34,360
Turned out to not be true. The last one died

69
00:04:34,439 --> 00:04:38,439
of neglect in the Tasmanian zoo. This is strictly a

70
00:04:38,480 --> 00:04:43,399
Tasmanian animal, and coincidentally, there is an organization working hard

71
00:04:43,480 --> 00:04:48,800
to de-extinct the thylacine. Wow, like Jurassic Park style or...

72
00:04:48,920 --> 00:04:52,319
Speaker 3: Well, no, Jurassic Park is fiction.

73
00:04:53,759 --> 00:04:55,279
Speaker 2: No, no, of course, but you know what I mean

74
00:04:55,399 --> 00:04:56,360
like, cloning DNA.

75
00:04:56,480 --> 00:05:00,920
Speaker 3: But Jurassic Park's approach was to find DNA of

76
00:05:00,959 --> 00:05:04,600
the dinosaurs and then fill in the gaps with amphibian DNA,

77
00:05:04,639 --> 00:05:07,439
which is, you know, again, kind of fictional. What

78
00:05:07,480 --> 00:05:10,560
these folks are actually doing is they are

79
00:05:10,720 --> 00:05:15,160
finding the closest living genetic relative because they have good

80
00:05:15,199 --> 00:05:19,120
genetic material for the thylacine, and then they are changing

81
00:05:19,199 --> 00:05:22,600
that DNA to match the thylacine. Okay, but they're also

82
00:05:22,680 --> 00:05:25,000
working closely with Tasmania itself, like, if they're going to...

83
00:05:25,079 --> 00:05:26,839
does it make sense to reintroduce it, all that sort

84
00:05:26,879 --> 00:05:29,439
of thing, and there's a strong will to bring that

85
00:05:29,480 --> 00:05:32,360
animal back. It'll actually help stabilize the ecosystem in Tasmania.

86
00:05:32,439 --> 00:05:35,439
Now that the thylacine has been gone for some ninety years,

87
00:05:36,399 --> 00:05:38,319
Tasmania has been suffering ill effects from not having a

88
00:05:38,319 --> 00:05:41,600
top tier predator, and so there's an idea that bringing

89
00:05:41,639 --> 00:05:43,040
back the thylacine would help.

90
00:05:43,199 --> 00:05:45,360
Speaker 2: See, you get a mini geek-out in every

91
00:05:45,439 --> 00:05:46,240
.NET Rocks show.

92
00:05:46,360 --> 00:05:48,439
Speaker 3: There you go. I mean, I've been keeping notes on

93
00:05:48,519 --> 00:05:50,600
de-extinction for the mammoth and a few others and

94
00:05:50,639 --> 00:05:52,000
so forth. The thylacine has been one of them, and

95
00:05:52,000 --> 00:05:53,759
it just happens to be that nineteen thirty six was

96
00:05:53,759 --> 00:05:55,360
the year that the last thylacine died.

97
00:05:55,560 --> 00:05:57,480
Speaker 2: Wow, well, who's talking to us today?

98
00:05:57,560 --> 00:06:00,720
Speaker 3: I grabbed a comment off of show nineteen twenty two, the

99
00:06:00,759 --> 00:06:02,920
one we did with our friend Dan Roth and the

100
00:06:02,959 --> 00:06:06,160
fall of twenty four, talking about Blazor and .NET 9.

101
00:06:06,879 --> 00:06:09,759
And this comment seemed particularly relevant to our conversation today

102
00:06:09,800 --> 00:06:11,800
because it's from Brent from Atlanta, who said: It was

103
00:06:11,800 --> 00:06:13,360
great to hear from Dan. He reminds me of

104
00:06:13,399 --> 00:06:15,439
a brilliant co worker of mine who just retired from

105
00:06:15,480 --> 00:06:20,519
MongoDB. I'd love to know if coding LLMs running

106
00:06:20,560 --> 00:06:23,600
locally, like Qwen 2.5 Coder, will be

107
00:06:23,639 --> 00:06:26,480
easily integrated into Visual Studio. I'm looking to enhance my Blazor

108
00:06:26,519 --> 00:06:28,800
coding with such tools without having to pay for Copilot,

109
00:06:29,120 --> 00:06:31,480
which is rumored to be inferior to Qwen. Anyway,

110
00:06:32,120 --> 00:06:38,000
I love the show. Well, hey, GitHub Copilot is now free,

111
00:06:38,480 --> 00:06:41,399
so so much for paying for that. And all the

112
00:06:41,399 --> 00:06:44,879
copilots now have options for different back ends you can

113
00:06:44,959 --> 00:06:47,399
run on. So, you know, the default is the Open

114
00:06:47,439 --> 00:06:49,759
AI one, but if you want to switch to a

115
00:06:49,879 --> 00:06:52,920
Llama or Claude or anything like that, you can do that. Yeah.

116
00:06:52,959 --> 00:06:54,360
Absolutely so there.

117
00:06:54,399 --> 00:06:56,360
Speaker 2: Just as an aside, I was talking to the

118
00:06:56,360 --> 00:06:59,160
guys at DevExpress, and they have a control now

119
00:06:59,639 --> 00:07:02,519
that abstracts all of that stuff away and just gives

120
00:07:02,560 --> 00:07:07,120
you like an input box and you can connect it

121
00:07:07,120 --> 00:07:10,160
to wherever you want without any kind of nasty pain.

122
00:07:10,439 --> 00:07:13,199
Speaker 3: Nice. Yeah, so yeah, all good things happening, good things,

123
00:07:13,240 --> 00:07:15,480
and so Brent, thank you so much for your comment.

124
00:07:15,639 --> 00:07:17,040
A copy of Music to Code By is on its way to you.

125
00:07:17,079 --> 00:07:18,680
And if you'd like a copy of Music to Code By, write

126
00:07:18,680 --> 00:07:20,560
a comment on the website at dotnetrocks dot

127
00:07:20,560 --> 00:07:22,639
com or on the Facebooks. We publish every show there,

128
00:07:22,639 --> 00:07:24,079
and if you comment there and we read it on the show,

129
00:07:24,279 --> 00:07:25,560
we'll send you a copy of Music to Code By.

130
00:07:25,680 --> 00:07:27,680
Speaker 2: And you can follow us on other social media. We've

131
00:07:27,680 --> 00:07:30,879
been on X slash Twitter forever. We're on Mastodon and on

132
00:07:30,959 --> 00:07:36,160
Bluesky, and I believe Richard's also on Threads. I

133
00:07:36,199 --> 00:07:38,639
have a Threads account, but I don't watch it, and

134
00:07:38,720 --> 00:07:40,639
you know, but you can find us out there.

135
00:07:40,879 --> 00:07:42,720
Speaker 3: Yep, we're out there. Bluesky seems to be the

136
00:07:42,720 --> 00:07:43,600
fun one these days.

137
00:07:43,680 --> 00:07:47,360
Speaker 2: Yeah, just some variation of at Carl Franklin and at Rich Campbell.

138
00:07:48,399 --> 00:07:53,680
So let's bring back our friend Steve Sanderson. This is

139
00:07:53,720 --> 00:07:57,079
a very humble bio that Steve wrote for us. He's

140
00:07:57,079 --> 00:07:59,439
a developer on the .NET team at Microsoft. He's

141
00:07:59,519 --> 00:08:02,680
focused on making .NET better for application developers

142
00:08:02,680 --> 00:08:05,720
with a focus on AI and the web. Yeah, he's

143
00:08:05,759 --> 00:08:07,680
done a few things, a couple of things here and there,

144
00:08:07,800 --> 00:08:09,560
not really mentioned in the bio.

145
00:08:10,319 --> 00:08:13,439
Speaker 4: Welcome back, Steve, Hello, thank you for bringing me back.

146
00:08:13,879 --> 00:08:14,600
Speaker 2: Good to have you.

147
00:08:14,800 --> 00:08:17,720
Speaker 3: Yeah, absolutely, man, And you know I always lead this

148
00:08:17,759 --> 00:08:19,560
off the same way. What are you working on?

149
00:08:20,680 --> 00:08:21,680
Speaker 4: What am I working on?

150
00:08:23,199 --> 00:08:23,480
Speaker 3: Well?

151
00:08:23,560 --> 00:08:26,000
Speaker 4: I think, like quite a lot of us, I've started

152
00:08:26,040 --> 00:08:29,800
to focus on the AI side of software.

153
00:08:29,879 --> 00:08:30,920
Speaker 2: Oh wow, recently.

154
00:08:32,240 --> 00:08:35,600
Speaker 4: Yeah. I think it's a really big opportunity, and there's

155
00:08:35,600 --> 00:08:37,440
many different ways you can think about what it is.

156
00:08:37,519 --> 00:08:39,720
Like on one level, it's like a tool that can

157
00:08:39,759 --> 00:08:41,799
help you write your code, or it's just like a

158
00:08:41,879 --> 00:08:45,000
chat interface or things like that. I think all those

159
00:08:45,039 --> 00:08:47,679
things are fine, but that's not what I'm personally drawn

160
00:08:47,799 --> 00:08:51,399
to as like an area where we can exploit opportunities.

161
00:08:52,000 --> 00:08:55,000
Maybe it's my background, but I feel really drawn to

162
00:08:55,080 --> 00:08:59,600
doing things around applications and doing things for application developers,

163
00:08:59,600 --> 00:09:02,120
and I think about the you know, the many many

164
00:09:02,120 --> 00:09:05,480
people I know who work in companies that are producing software

165
00:09:05,919 --> 00:09:09,679
for their customers or for their employees, and things like

166
00:09:10,320 --> 00:09:13,240
how do you actually use AI within that to make

167
00:09:13,279 --> 00:09:15,720
better software? How can you make your software have better

168
00:09:15,759 --> 00:09:18,200
features than it would have had before?

169
00:09:18,279 --> 00:09:21,639
Speaker 3: Well, and when you were thinking about that, like I

170
00:09:21,679 --> 00:09:23,720
remember you thinking about trying to make it easier to

171
00:09:23,720 --> 00:09:27,279
put data on web pages, and that made Knockout. Yeah. So,

172
00:09:28,320 --> 00:09:30,320
and then you were thinking about how to make it

173
00:09:30,360 --> 00:09:32,279
easier to use C# on web pages, and that

174
00:09:32,360 --> 00:09:34,120
made Blazor. So I kind of like it when you're

175
00:09:34,120 --> 00:09:35,919
thinking about things, Steve, good things happening.

176
00:09:37,000 --> 00:09:40,279
Speaker 2: I remember last year you did a demo at one

177
00:09:40,279 --> 00:09:42,720
of the .NET Confs or something like that. You did

178
00:09:42,720 --> 00:09:46,840
a demo of some AI, an AI chat wired up

179
00:09:46,840 --> 00:09:51,360
to data. Yeah, it was in Blazor and it was

180
00:09:51,480 --> 00:09:54,440
just mind blowing. So you've been doing it for a while.

181
00:09:54,639 --> 00:09:57,960
Speaker 4: Yeah, that was our eShopSupport demo. I'm sure

182
00:09:57,960 --> 00:10:01,200
we can add a link to that. Yeah. That was

183
00:10:01,879 --> 00:10:05,919
some exploration around this of what you get when you

184
00:10:05,919 --> 00:10:10,799
put Blazor and AI and typical business app scenarios together.

185
00:10:11,200 --> 00:10:14,039
And that's an exploration of like a customer support scenario

186
00:10:14,080 --> 00:10:17,720
where people are submitting inquiries and there's some AI that

187
00:10:17,799 --> 00:10:22,480
automatically classifies them as different types of thing, and then

188
00:10:22,600 --> 00:10:26,399
the staff who's trying to deal with customer support gets

189
00:10:27,000 --> 00:10:30,559
like a chat assistant that's able to look through corporate

190
00:10:30,639 --> 00:10:34,480
or enterprise data to provide suggested answers, fills it out,

191
00:10:34,720 --> 00:10:38,559
you know, rates the customer's satisfaction automatically, all that kind

192
00:10:38,600 --> 00:10:40,960
of thing, and it just represents a whole range

193
00:10:40,960 --> 00:10:43,559
of scenarios that we think people would probably actually have.

194
00:10:43,759 --> 00:10:45,720
Speaker 2: Yeah, I'm looking forward to the day when I can

195
00:10:45,759 --> 00:10:52,279
replace my report criteria selection UI with just a chat box. Yeah,

196
00:10:52,320 --> 00:10:55,240
and you probably can do that already. And I can, actually.

197
00:10:56,200 --> 00:10:59,039
but you know, it's going to take a little bit

198
00:10:59,080 --> 00:11:01,639
of a paradigm shift for users to want to go

199
00:11:01,720 --> 00:11:04,759
for that perhaps, But you know, nobody's saying you can't

200
00:11:04,799 --> 00:11:05,639
offer it as an option.

201
00:11:05,799 --> 00:11:08,720
Speaker 4: Yeah, totally. And you mentioned chat as an interface there.

202
00:11:08,759 --> 00:11:12,639
I think chat is clearly something that people strongly associate

203
00:11:12,679 --> 00:11:15,879
with AI, and it's a thing that we didn't really

204
00:11:15,919 --> 00:11:18,879
have as a form of UI before AI came along,

205
00:11:18,919 --> 00:11:21,120
and so that's kind of exciting and cool. But I

206
00:11:21,159 --> 00:11:23,519
also whenever I talk about this, I really try to

207
00:11:24,159 --> 00:11:27,039
emphasize the fact that chat is not the only way

208
00:11:27,200 --> 00:11:30,799
to benefit from AI, And personally, in a lot of cases,

209
00:11:30,840 --> 00:11:34,000
I don't actually like typing into chat boxes. Like,

210
00:11:34,240 --> 00:11:38,159
When I'm using my computer to do something productive, my

211
00:11:38,240 --> 00:11:40,080
mind is focused on the task and I don't really

212
00:11:40,120 --> 00:11:43,440
want to be like explaining stuff in English to the computer. Yeah,

213
00:11:43,720 --> 00:11:46,559
and so I think a lot of the time when

214
00:11:46,559 --> 00:11:48,559
we think about how we can add AI to our

215
00:11:48,600 --> 00:11:53,159
applications and produce genuinely more productive apps, it's about using

216
00:11:53,200 --> 00:11:56,480
AI on the back end just sort of like almost

217
00:11:56,559 --> 00:11:59,879
like secretly, without involving the user. It's just predicting what

218
00:12:00,039 --> 00:12:02,159
they want and doing parts of the job for them

219
00:12:02,200 --> 00:12:05,120
ahead of time so they don't have to do it.

220
00:12:05,120 --> 00:12:07,519
Speaker 2: It does bring to mind a couple of really good

221
00:12:07,759 --> 00:12:11,399
text interfaces where you're writing and then things are helping you.

222
00:12:12,399 --> 00:12:15,960
Obviously GitHub Copilot, and go back to IntelliSense and

223
00:12:16,000 --> 00:12:20,159
statement completion in C#, or even writing queries in

224
00:12:20,519 --> 00:12:23,720
SQL Management Studio, you know, when you say select star

225
00:12:23,840 --> 00:12:26,639
from and you get a dropdown. You know, you could

226
00:12:26,679 --> 00:12:29,600
implement that in a chatbot where you could say I

227
00:12:29,639 --> 00:12:32,919
want to see all the reports of type and then

228
00:12:33,679 --> 00:12:39,600
you know, transaction type, whatever type of reports. You know,

229
00:12:39,679 --> 00:12:43,559
those things will probably come in the AI

230
00:12:43,840 --> 00:12:45,600
front end chat boxes.

231
00:12:45,720 --> 00:12:48,879
Speaker 4: I'm sure they will. Yeah. And another good example of

232
00:12:48,879 --> 00:12:53,080
that is, of course, as you already mentioned, Visual Studio Copilot.

233
00:12:53,840 --> 00:12:56,200
And one thing that I really love about that is

234
00:12:57,320 --> 00:12:59,720
the fact that I don't have to explain to it

235
00:13:00,080 --> 00:13:02,080
what I want it to do, Like it's there while

236
00:13:02,120 --> 00:13:06,480
I'm writing code and producing sensible suggestions for me without

237
00:13:06,480 --> 00:13:09,080
me stopping to say I need to do X. It's

238
00:13:09,159 --> 00:13:11,679
just you know, inferring that from what I've already typed

239
00:13:11,720 --> 00:13:12,799
in code, which is...

240
00:13:12,600 --> 00:13:14,720
Speaker 3: I've seen this happen in Excel. I don't know if you've

241
00:13:14,720 --> 00:13:16,960
played with Excel lately if you have that stuff turned on.

242
00:13:17,080 --> 00:13:20,279
But if if you're doing a pattern of entry into

243
00:13:21,080 --> 00:13:24,000
a column of cells, it starts

244
00:13:24,120 --> 00:13:26,399
prepopulating it with the next one it thinks you want,

245
00:13:26,799 --> 00:13:29,159
and, I mean, the easy one is like one, two, three,

246
00:13:29,279 --> 00:13:32,799
but you know, first of each month, it's figured that

247
00:13:32,840 --> 00:13:37,399
one out. Like, you're exactly right. It's not doing it

248
00:13:37,399 --> 00:13:39,320
for you. You're not telling it what to do. It's

249
00:13:39,360 --> 00:13:42,440
just looking at what you're doing and then shortcutting instead

250
00:13:42,440 --> 00:13:43,799
of me having to type it out. Now I'm just

251
00:13:43,879 --> 00:13:47,480
like tab, yep, I'll keep that, tab that, and so forth.

252
00:13:47,519 --> 00:13:49,960
Like that's really subtle and clever.

253
00:13:49,919 --> 00:13:52,840
Speaker 2: And it doesn't have a lobotomy like Clippy. But it's

254
00:13:52,879 --> 00:13:56,080
you know, it's trying to solve the

255
00:13:56,159 --> 00:13:59,960
same problem, really. I mean, it's the whole idea of Clippy,

256
00:14:00,519 --> 00:14:01,759
you know, without a lobotomy.

257
00:14:03,440 --> 00:14:06,759
Speaker 3: Well, we've had smart populating in forms and things,

258
00:14:06,840 --> 00:14:09,440
like our browsers know our addresses, and so if it

259
00:14:09,639 --> 00:14:11,759
looks like you're starting to type an address, it finds

260
00:14:11,799 --> 00:14:13,320
the rest of the boxes and fills them all in

261
00:14:13,360 --> 00:14:17,919
for you. Like that kind of thing, and actually address

262
00:14:17,960 --> 00:14:19,320
fill in is one of those ones I think people

263
00:14:19,320 --> 00:14:20,720
have worked really hard on where it's like, hey, when

264
00:14:20,799 --> 00:14:23,080
you key in your postcode, okay, well now I know your

265
00:14:23,080 --> 00:14:24,320
city and state, right? Like.

266
00:14:24,440 --> 00:14:26,879
Speaker 2: The country and postcode, sort of in reverse.

267
00:14:26,559 --> 00:14:29,840
Speaker 3: Right. But now I start thinking about forms

268
00:14:29,879 --> 00:14:33,240
over data, Like I'm really interested in the machine learning

269
00:14:33,279 --> 00:14:36,679
models that study a user's workflow and start to tailor

270
00:14:36,759 --> 00:14:40,000
their behavior to that user. Like if you do the

271
00:14:40,080 --> 00:14:42,480
same thing every day in this app, how do I

272
00:14:42,600 --> 00:14:44,000
make it quicker for you?

273
00:14:44,279 --> 00:14:48,000
Speaker 4: Yeah, that's an interesting area. People have started speculating like

274
00:14:48,240 --> 00:14:51,720
about really far-off futuristic things, which are interesting. I

275
00:14:52,039 --> 00:14:55,240
don't know what the sort of short term thing is here,

276
00:14:55,399 --> 00:15:00,399
Like there's already products and demos out there of AI

277
00:15:00,480 --> 00:15:04,879
systems that generate a UI on demand, so you know,

278
00:15:04,919 --> 00:15:07,519
while a user is doing some task, there isn't a

279
00:15:07,519 --> 00:15:10,559
predefined set of text boxes, a predefined set of

280
00:15:10,559 --> 00:15:14,759
wizard steps or whatever. But some AI is being asked

281
00:15:14,279 --> 00:15:16,840
to make up a UI on the fly based on

282
00:15:16,879 --> 00:15:19,759
what the user appears to be doing. That's tailored towards

283
00:15:19,799 --> 00:15:22,559
that task, And I think that's pretty much still a

284
00:15:22,759 --> 00:15:26,000
sort of proof of concept level right now. I doubt

285
00:15:26,000 --> 00:15:29,159
that anyone's completely relying on that to ship an app with,

286
00:15:29,360 --> 00:15:32,360
but it's conceivable that, you know, five years from now,

287
00:15:32,960 --> 00:15:36,519
the way we interact with computers will be much more

288
00:15:36,559 --> 00:15:39,360
sort of generated on demand for us rather than predefined.

289
00:15:39,559 --> 00:15:41,480
Speaker 2: Is that one of a good example of what you

290
00:15:41,559 --> 00:15:45,200
mean by AI working on the back end without you know,

291
00:15:45,480 --> 00:15:48,000
you really noticing it all that much, or were you

292
00:15:48,039 --> 00:15:49,080
thinking of something else.

293
00:15:49,320 --> 00:15:52,559
Speaker 4: I was thinking a much simpler thing. So the dynamically

294
00:15:52,559 --> 00:15:55,480
make up a UI as the user is working is

295
00:15:55,519 --> 00:15:58,600
like super sci-fi futuristic stuff, by which I

296
00:15:58,639 --> 00:16:00,639
mean like it's not going to be several months away.

297
00:16:00,720 --> 00:16:02,919
Speaker 3: But like, you know, with the pace of AI...

298
00:16:04,720 --> 00:16:09,399
Speaker 4: But no, what I meant was something like really straightforward

299
00:16:09,440 --> 00:16:11,799
stuff that you can code up in ten minutes.

300
00:16:12,759 --> 00:16:16,799
So the kind of like core features that you can

301
00:16:16,840 --> 00:16:20,840
do with AI, stuff like being able to automatically classify text,

302
00:16:21,000 --> 00:16:25,639
automatically extract structured data from fuzzy inputs, whether that's plain

303
00:16:25,679 --> 00:16:31,279
text or images, the ability to summarize or translate, the ability to detect

304
00:16:31,519 --> 00:16:35,759
anomalies in data, the ability to score user sentiment. All

305
00:16:35,799 --> 00:16:38,559
these kinds of things you can do programmatically on the

306
00:16:38,559 --> 00:16:41,399
back end of your application and then use that to

307
00:16:42,559 --> 00:16:46,440
you know, trigger workflows automatically. Like some message comes in

308
00:16:46,480 --> 00:16:49,720
and it appears to be a complaint. Well, you can automatically,

309
00:16:50,200 --> 00:16:52,480
you know, send a message to the person who deals

310
00:16:52,519 --> 00:16:55,720
with complaints or something like that. Or you get something

311
00:16:55,759 --> 00:17:00,360
that you know, you get a large amount of text

312
00:17:00,559 --> 00:17:03,720
in from something and you can summarize it down to

313
00:17:03,759 --> 00:17:07,599
one sentence to display as like the heading within a

314
00:17:07,599 --> 00:17:09,880
table or something like that. You know, there's so many

315
00:17:09,920 --> 00:17:14,799
ways that you can use AI systems programmatically to make

316
00:17:14,799 --> 00:17:17,160
your application just feel that bit more intelligent.
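
The back-end pattern described in this stretch of the conversation, where a model quietly classifies incoming text and ordinary code triggers a workflow from the label, can be sketched in a few lines. This is a rough illustration rather than anything from the show: the classifier is a hypothetical keyword stub standing in for a real language-model call, and the workflow names are made up.

```python
def fake_classify(text):
    """Stand-in for asking a language model to pick one label
    from a fixed set (complaint / question / other)."""
    lowered = text.lower()
    if "refund" in lowered or "broken" in lowered:
        return "complaint"
    if "how do i" in lowered:
        return "question"
    return "other"

# Hypothetical workflow names; a real app would map labels to real handlers.
ROUTES = {
    "complaint": "notify-support-lead",
    "question": "suggest-faq-article",
    "other": "file-in-inbox",
}

def route_message(text, classify=fake_classify):
    """Classify a message and return (label, workflow to trigger)."""
    label = classify(text)
    return label, ROUTES[label]

print(route_message("My order arrived broken and I want a refund"))
# ('complaint', 'notify-support-lead')
```

Swapping the stub for a real model call is the only AI-specific part; the routing around it stays plain application code, which is what keeps this "secret" from the user.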

317
00:17:17,400 --> 00:17:19,559
Speaker 2: It sounds like pattern matching on steroids.

318
00:17:19,680 --> 00:17:20,799
Speaker 4: Yeah, in some ways.

319
00:17:20,799 --> 00:17:22,680
Speaker 2: Know the things that we loved pattern matching for.

320
00:17:23,160 --> 00:17:26,839
Speaker 4: Well, arguably at the lowest level, that's what language models are.

321
00:17:27,000 --> 00:17:31,039
Speaker 3: Yeah. Yeah, it is all pattern matching, matching tokens, effectively, but...

322
00:17:31,000 --> 00:17:33,240
Speaker 2: It's not stuff that you necessarily have to do in

323
00:17:33,319 --> 00:17:36,759
raw code, you know. Absolutely, the language has a pattern

324
00:17:36,759 --> 00:17:38,160
matching feature and you're using that.

325
00:17:38,640 --> 00:17:42,039
Speaker 4: Yeah. Yeah, there are tons of application features that people

326
00:17:42,440 --> 00:17:46,240
could be using quite easily and cheaply, especially in terms

327
00:17:46,279 --> 00:17:49,359
of the amount of developer effort. Something like semantic search,

328
00:17:49,400 --> 00:17:52,359
for example, Like the naming makes it sound kind of

329
00:17:52,720 --> 00:17:55,079
fancy and advanced or something like that, but it's really not.

330
00:17:55,559 --> 00:17:57,720
You know, you can take any search feature

331
00:17:57,759 --> 00:18:01,000
that you've got today and upgrade it to be semantic search,

332
00:18:01,359 --> 00:18:04,200
like, within an afternoon, at least at a sort

333
00:18:04,240 --> 00:18:06,400
of prototype level, and see whether or not that's going

334
00:18:06,440 --> 00:18:09,440
to benefit your application, so that users don't need to

335
00:18:09,480 --> 00:18:11,920
type things correctly and they don't need to phrase things

336
00:18:11,960 --> 00:18:14,240
in a particular way, but it just sort of smartly

337
00:18:14,279 --> 00:18:16,240
works out what they're trying to search for.

338
00:18:16,400 --> 00:18:18,960
Speaker 3: Well. Semantic search to me seems like one of those

339
00:18:19,119 --> 00:18:23,000
buzzword phrases of a promise of a better future, right,

340
00:18:23,039 --> 00:18:26,599
I mean, we've been arguing about semantic search since the

341
00:18:26,680 --> 00:18:29,799
web was invented, right? Even Tim Berners-Lee was like,

342
00:18:29,839 --> 00:18:31,519
this is what this is all about. It's just a

343
00:18:31,599 --> 00:18:34,079
question of whether we've finally got

344
00:18:34,079 --> 00:18:37,759
a set of tools that can do it consistently: take

345
00:18:37,799 --> 00:18:40,880
the semantic information around it and make a

346
00:18:40,960 --> 00:18:42,119
more informed search.

347
00:18:42,359 --> 00:18:44,640
Speaker 4: Well, I personally think that when it comes to just

348
00:18:45,319 --> 00:18:47,720
using embedding models, which are one of the two main

349
00:18:47,799 --> 00:18:53,519
types of language models that people use today, to take

350
00:18:53,680 --> 00:18:57,839
things like document titles or text from documents, convert them

351
00:18:57,839 --> 00:19:00,519
into embeddings and then be able to search through those

352
00:19:01,640 --> 00:19:05,160
in a way that matches on meaning rather than exact

353
00:19:05,279 --> 00:19:09,000
string matches, is a very straightforward thing to achieve. Like,

354
00:19:09,039 --> 00:19:11,720
it's a really well-trodden path by now. We've got

355
00:19:12,279 --> 00:19:15,960
so many cheap ways of doing it that I think

356
00:19:16,000 --> 00:19:18,680
that most people who have got any kind of application

357
00:19:19,359 --> 00:19:21,039
that does any kind of search feature ought to

358
00:19:21,039 --> 00:19:22,000
be thinking about doing that.
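
The mechanics being described, matching on meaning rather than exact strings, reduce to ranking documents by cosine similarity between embedding vectors. A minimal sketch of that ranking step; the three-dimensional vectors here are invented purely for illustration, where a real embedding model would return hundreds of dimensions per string.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Toy "embeddings": made-up vectors standing in for real model output.
DOCS = {
    "How to reset your password":    [0.9, 0.1, 0.0],
    "Quarterly sales report":        [0.0, 0.2, 0.9],
    "Account login troubleshooting": [0.8, 0.3, 0.1],
}

def semantic_search(query_vec, top_k=2):
    """Rank document titles by similarity of meaning to the query vector."""
    return sorted(DOCS, key=lambda title: cosine(query_vec, DOCS[title]),
                  reverse=True)[:top_k]

# A query like "I can't sign in" would embed near the password/login docs:
print(semantic_search([0.85, 0.2, 0.05]))
```

A query vector close to the password and login vectors ranks those documents first even though no words match, which is the whole upgrade over keyword search.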

359
00:19:22,119 --> 00:19:24,400
Speaker 2: Think about it this way: like, you make a

360
00:19:24,480 --> 00:19:29,599
PDF document of rules that your software must follow, right,

361
00:19:29,640 --> 00:19:34,119
and then just set the AI on it and say, Okay,

362
00:19:34,640 --> 00:19:37,599
don't let me break the rules in my software. Okay,

363
00:19:39,400 --> 00:19:41,400
there's a sci-fi for you.

364
00:19:42,039 --> 00:19:44,759
Speaker 4: Yeah, it's it's a scary thing. So, I mean, AI

365
00:19:45,079 --> 00:19:48,200
has so much promise, right, But at the same time,

366
00:19:48,559 --> 00:19:53,200
it also kind of violates many of the assumptions that

367
00:19:53,240 --> 00:19:56,039
we've had about how software should work, like, particularly when

368
00:19:56,119 --> 00:19:59,519
you talk about rules that can't be violated or anything

369
00:19:59,519 --> 00:20:01,680
to do with security. You know, as soon as you

370
00:20:01,720 --> 00:20:04,960
bring AI into it and have a language model making

371
00:20:05,000 --> 00:20:07,880
decisions about what a user can and can't do, everything

372
00:20:07,880 --> 00:20:10,920
becomes so fuzzy. Like you tell the language model a

373
00:20:10,920 --> 00:20:13,200
certain user should or shouldn't be allowed to do something,

374
00:20:13,839 --> 00:20:15,680
but if that user says that they want to do

375
00:20:15,720 --> 00:20:17,640
it persuasively enough, then it might just let them do

376
00:20:17,720 --> 00:20:18,240
it anyway.

377
00:20:18,680 --> 00:20:22,000
Speaker 3: Persuading software, I said in air quotes.

378
00:20:22,039 --> 00:20:24,440
Speaker 2: But what I would do in that situation is I

379
00:20:24,440 --> 00:20:27,519
would want a tool that says, Okay, here's the document

380
00:20:27,559 --> 00:20:31,039
of rules and business rules. Right, not necessarily you know,

381
00:20:31,160 --> 00:20:34,759
thou shalt not or whatever, and then say,

382
00:20:34,839 --> 00:20:39,119
here's my code and here's my rules. Now generate some

383
00:20:39,240 --> 00:20:41,559
code for me where I can plug

384
00:20:41,599 --> 00:20:43,720
the holes. Yeah, okay, you know, and then I can

385
00:20:44,000 --> 00:20:46,200
look at it. I can do that once, and then I

386
00:20:46,240 --> 00:20:48,839
can look at that code and see if

387
00:20:48,880 --> 00:20:51,720
that's a good idea to implement them. But

388
00:20:52,240 --> 00:20:53,960
you know, that's how I want to use AI. I

389
00:20:54,000 --> 00:20:56,960
want it to either generate some code or give me some

390
00:20:57,000 --> 00:21:00,119
suggestions and then let me, you know,

391
00:21:00,240 --> 00:21:02,720
do the final cut and paste or whatever and testing.

392
00:21:02,799 --> 00:21:05,599
Speaker 4: Yeah, I think that's reasonable. So that's using gen AI as

393
00:21:05,599 --> 00:21:09,119
a coding assistant, which is definitely a very very valuable

394
00:21:09,319 --> 00:21:11,680
part of the whole thing. And a lot of people

395
00:21:11,680 --> 00:21:15,319
are doing that and that's great. There are limitations to that,

396
00:21:15,599 --> 00:21:20,000
you know, one of them is ultimately it might still

397
00:21:20,039 --> 00:21:22,359
get it wrong and you may or may not actually

398
00:21:22,440 --> 00:21:24,200
spot the fact that it's got it wrong. Like, there

399
00:21:24,200 --> 00:21:28,440
could still be bugs, especially subtle security bugs. It's not magic.

400
00:21:29,640 --> 00:21:35,400
And secondly, if you're constrained to only using gen AI at

401
00:21:35,400 --> 00:21:38,599
development time and not at run time, then that constrains

402
00:21:38,599 --> 00:21:41,960
the sorts of features that you can ship. So I

403
00:21:41,960 --> 00:21:44,240
don't think it's like the ultimate or only solution, but

404
00:21:44,279 --> 00:21:47,480
it's definitely a valuable and valid way of thinking about

405
00:21:47,839 --> 00:21:49,279
things in a lot of cases.

406
00:21:49,279 --> 00:21:51,880
Speaker 2: Right, And I you know, just as an example, what

407
00:21:51,960 --> 00:21:54,519
I talked about is just an evolution of what we're

408
00:21:54,519 --> 00:21:58,680
already doing now with you know, Copilot and all of

409
00:21:58,720 --> 00:22:02,119
that stuff. Yeah, so I can see those things happening,

410
00:22:02,200 --> 00:22:06,000
you know, more discrete tools for helping us shore up

411
00:22:06,079 --> 00:22:10,799
our code. Static code analysis is another great way to use AI.

412
00:22:11,880 --> 00:22:15,920
But yeah, I would be skeptical too.

413
00:22:16,000 --> 00:22:19,440
Here's something that Jeff Fritz and I were thinking about,

414
00:22:20,960 --> 00:22:25,279
and it's not necessarily AI, but it's calling translation services

415
00:22:25,279 --> 00:22:28,200
on the fly so that somebody you don't have to

416
00:22:28,279 --> 00:22:34,000
actually do different language versions of your application. You could

417
00:22:34,039 --> 00:22:37,279
just you know, you're in Japan, you bring up everything

418
00:22:37,279 --> 00:22:39,799
in Japanese and it goes and does a translation. But

419
00:22:39,880 --> 00:22:44,359
now you're relying on that translation to be accurate in

420
00:22:44,400 --> 00:22:47,599
real time. Not only that, but you're racking up all

421
00:22:47,680 --> 00:22:51,599
the API calls that you're doing. So another idea would

422
00:22:51,640 --> 00:22:56,240
be to just take a tool that would generate all

423
00:22:56,279 --> 00:23:01,319
the text and all the resources in every language and

424
00:23:01,359 --> 00:23:04,400
then boom, now you've got some Now you can actually

425
00:23:04,440 --> 00:23:05,039
test it.
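The pre-generate approach Carl describes, producing every resource in every language once so it can be reviewed and tested, might be sketched like this in C#. `TranslateAsync` here is a hypothetical stand-in for whatever translation service you actually call, not a real API:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class LocalizationSketch
{
    // Hypothetical stand-in for a real translation service call;
    // a real implementation would hit a cloud translation API.
    static Task<string> TranslateAsync(string text, string language) =>
        Task.FromResult($"[{language}] {text}");

    static async Task Main()
    {
        var resources = new Dictionary<string, string>
        {
            ["Greeting"] = "Hello",
            ["Farewell"] = "Goodbye",
        };
        string[] languages = { "ja", "de", "fr" };

        // Generate every resource string in every language up front,
        // at build time, instead of translating on the fly at run time.
        foreach (var lang in languages)
            foreach (var (key, text) in resources)
                Console.WriteLine($"{lang}/{key} = {await TranslateAsync(text, lang)}");
    }
}
```

The trade-off discussed above is exactly this: one batch of API calls at build time, and output you can inspect and test, instead of per-request translation costs and unverifiable real-time results.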

426
00:23:04,960 --> 00:23:06,480
Speaker 4: And I'm sure a lot of people are doing

427
00:23:06,519 --> 00:23:07,480
that absolutely.

428
00:23:08,200 --> 00:23:11,599
Speaker 3: Yeah. Yeah. The language tokenisation strategy that LLMs brought to

429
00:23:11,640 --> 00:23:13,519
the table made translation.

430
00:23:13,319 --> 00:23:15,519
Speaker 2: A lot easier, yeah, and more accurate.

431
00:23:15,680 --> 00:23:17,559
Speaker 3: And it does seem to be more accurate as well.

432
00:23:17,799 --> 00:23:22,000
Speaker 4: Yeah. Well, modern large language models entirely come out of

433
00:23:22,440 --> 00:23:25,240
machine translation. That's where this whole thing started, you know,

434
00:23:25,319 --> 00:23:30,920
building ML models that could translate between human languages, and

435
00:23:31,000 --> 00:23:34,599
the way that that was very successful through transformers became

436
00:23:35,240 --> 00:23:36,519
you know what we have today.

437
00:23:36,839 --> 00:23:40,319
Speaker 3: Sort of the key ingredient to what began the path

438
00:23:40,319 --> 00:23:44,119
towards GPT was the translation problem first, and it's

439
00:23:44,160 --> 00:23:46,759
just yeah, it's one of its superpowers. And what I

440
00:23:46,799 --> 00:23:48,680
like about it is, for the most part, we haven't noticed.

441
00:23:48,759 --> 00:23:51,200
It's just that all translations seem to be better

442
00:23:51,480 --> 00:23:54,920
and cheaper and more prevalent, just showing up in more places.

443
00:23:55,039 --> 00:23:55,200
Speaker 1: Yeah.

444
00:23:55,200 --> 00:23:55,559
Speaker 3: Absolutely.

445
00:23:55,559 --> 00:23:57,160
Speaker 4: In fact, it was just today that I don't know

446
00:23:57,160 --> 00:24:00,200
if you saw, but VLC the media player

447
00:24:00,240 --> 00:24:04,759
shipped a feature this morning, which

448
00:24:04,880 --> 00:24:10,359
adds automatic subtitles to what you're watching. It's controversial because

449
00:24:10,359 --> 00:24:12,079
people are not sure if the quality is good enough.

450
00:24:12,119 --> 00:24:14,480
But yeah, you're right, it is showing up all over

451
00:24:14,480 --> 00:24:14,839
the place.

452
00:24:15,079 --> 00:24:20,519
Speaker 2: So how likely are you to use a chat bot

453
00:24:20,599 --> 00:24:24,920
on a website that pretends to be a representative of

454
00:24:24,960 --> 00:24:25,839
the company. Oh?

455
00:24:25,920 --> 00:24:30,240
Speaker 4: I mean, usually my general feeling when encountering

456
00:24:30,279 --> 00:24:33,000
something like that is oh no, what have they done?

457
00:24:33,480 --> 00:24:38,920
But I did have one example of a time quite

458
00:24:38,960 --> 00:24:40,680
recently where I had to use a chat bot and

459
00:24:40,680 --> 00:24:43,640
it actually worked, and I was amazed, Wow, I had

460
00:24:44,880 --> 00:24:47,160
made a booking with like Expedia or something. I think

461
00:24:47,160 --> 00:24:50,279
it was Expedia, and I couldn't find my booking number anywhere.

462
00:24:50,319 --> 00:24:52,200
Like I went to every single email, there was no

463
00:24:52,240 --> 00:24:55,119
booking number in anything I could find. I tried to

464
00:24:55,119 --> 00:24:57,640
contact customer services. It was AI. I thought, oh, this

465
00:24:57,720 --> 00:24:59,640
is stupid, this is never going to work. And I

466
00:24:59,640 --> 00:25:01,960
asked it the question and it said, like what's your

467
00:25:02,039 --> 00:25:03,480
name and where you're going and stuff, and then it

468
00:25:03,920 --> 00:25:06,279
just immediately came back and said, this is your booking number,

469
00:25:06,799 --> 00:25:10,160
and it was awesome and it was correct. Yeah of course,

470
00:25:10,640 --> 00:25:11,240
yeah wow.

471
00:25:11,400 --> 00:25:13,839
Speaker 3: And from there, like you then searched on that or

472
00:25:13,960 --> 00:25:16,400
you know, opened something up like validated it elsewhere.

473
00:25:16,640 --> 00:25:18,240
Speaker 4: Yeah, well, it gave me a link to

474
00:25:18,759 --> 00:25:22,079
something that I hadn't been able to find previously.

475
00:25:22,240 --> 00:25:25,400
Speaker 2: Do you think that chat bots should be

476
00:25:25,440 --> 00:25:28,240
required to tell you that they are a bot and

477
00:25:28,279 --> 00:25:30,160
you're not actually speaking to a real person.

478
00:25:30,319 --> 00:25:32,519
Speaker 4: I don't know. I mean within context, it can be

479
00:25:32,559 --> 00:25:34,759
really obvious. Like, if it's literally a chat UI

480
00:25:34,839 --> 00:25:37,279
and there's a little icon of a robot and

481
00:25:37,400 --> 00:25:39,599
it says, like, Assistant Bot is its name, then you're

482
00:25:39,599 --> 00:25:41,440
not going to think it's a person. Yeah, it would

483
00:25:41,440 --> 00:25:42,920
be a terrible person.

484
00:25:43,240 --> 00:25:46,680
Speaker 2: I've asked bots before and they have said, no, I'm

485
00:25:46,680 --> 00:25:48,920
not a bot, but I knew it was.

486
00:25:50,799 --> 00:25:53,279
Speaker 3: And you asked it to respond to you in iambic pentameter.

487
00:25:53,839 --> 00:25:57,559
Speaker 4: Yeah, and then you do the Voight-Kampff test or something.

488
00:25:57,640 --> 00:26:00,759
Speaker 2: Okay, well, I'm interested in that, but we

489
00:26:00,799 --> 00:26:02,359
probably shouldn't go down that rabbit hole.

490
00:26:05,480 --> 00:26:10,880
Speaker 3: Yeah, definitely trouble. I am thinking in terms of how

491
00:26:10,920 --> 00:26:15,519
we use generative AI models to understand the user's interaction

492
00:26:15,599 --> 00:26:19,400
with the app. I've been doing some work with some

493
00:26:19,480 --> 00:26:22,240
neuroscience companies and they do a lot of additional instrumentation

494
00:26:22,319 --> 00:26:23,920
on people. And one of the things they've said is like,

495
00:26:24,559 --> 00:26:27,519
facial expressions are not a good way to measure someone's

496
00:26:27,559 --> 00:26:30,759
response to things, and they're very cultural, they're very personal

497
00:26:30,799 --> 00:26:34,599
individual, and so forth. But if you take that camera

498
00:26:34,680 --> 00:26:37,160
and zoom in on a point of skin close enough,

499
00:26:37,640 --> 00:26:42,960
you can actually see heart rate, and heart rate is neutral.

500
00:26:43,200 --> 00:26:46,240
Like the fact that now you can measure that person's

501
00:26:46,240 --> 00:26:48,279
heart rate relative to one step to another inside of

502
00:26:48,319 --> 00:26:50,440
a piece of software and say is a heart rate

503
00:26:50,480 --> 00:26:53,880
increasing or decreasing and that is a better indicator of

504
00:26:54,559 --> 00:26:56,359
frustration or not.

505
00:26:56,440 --> 00:26:58,920
Speaker 2: You have to have a baseline though you have to

506
00:26:58,960 --> 00:27:01,039
know it when they're not excited.

507
00:27:01,799 --> 00:27:04,359
Speaker 3: Yeah, yeah, I think it's very personal, right that you

508
00:27:04,839 --> 00:27:08,680
could put someone through some paces. In their case, they

509
00:27:08,759 --> 00:27:12,720
do initial testing, a kind of surveying, and so

510
00:27:12,759 --> 00:27:14,480
they sort of look at what your measurements look like

511
00:27:14,519 --> 00:27:16,319
on that and then go into more important ones and

512
00:27:16,359 --> 00:27:17,279
see their reaction.

513
00:27:17,480 --> 00:27:20,319
Speaker 4: Yeah, well that's interesting. It possibly gets us into some

514
00:27:20,559 --> 00:27:24,559
ethical and maybe even legal issues. Like so, you know

515
00:27:24,640 --> 00:27:30,480
a lot of recruitment takes place through AI these days,

516
00:27:30,480 --> 00:27:33,799
and people are often, when applying for jobs, asked to

517
00:27:33,799 --> 00:27:37,079
do video interviews, kind of like an

518
00:37:37,680 --> 00:37:41,440
AI-assessed video interview. You know, it would

519
00:27:41,480 --> 00:27:44,440
be really straightforward then to be using a technique like

520
00:27:44,480 --> 00:27:46,960
that to try and assess do we think this person's

521
00:27:47,000 --> 00:27:49,799
really being honest about what they're saying on that? But

522
00:27:50,599 --> 00:27:54,480
if you're going to make decisions about someone on that basis,

523
00:27:55,279 --> 00:27:58,440
then you know, as a software developer, engineer and manager

524
00:27:58,960 --> 00:28:01,240
thinking about implementing a feature like that, you've got to think

525
00:28:01,359 --> 00:28:04,839
very carefully about how reliable you can make this and

526
00:28:04,880 --> 00:28:07,440
what the ethics of doing that really are.

527
00:28:07,480 --> 00:28:10,599
Speaker 2: Because humans have trigger words, right that may not have

528
00:28:10,640 --> 00:28:13,039
anything to do with what you're talking about. But you

529
00:28:13,160 --> 00:28:16,279
use a word or something and some of you know,

530
00:28:16,599 --> 00:28:18,839
the heart rate will go up and they'll start sweating.

531
00:28:19,319 --> 00:28:21,000
But it has nothing to do with the context.

532
00:28:21,160 --> 00:28:24,640
Speaker 3: Well, we've all interviewed developers. It turns out that

533
00:28:25,079 --> 00:28:28,160
being interviewed as a developer increases your heart rate. Yeah,

534
00:28:29,480 --> 00:28:32,319
they're not comfortable, the vast majority of the market. It's

535
00:28:32,319 --> 00:28:34,400
almost strange to find someone who is comfortable.

536
00:28:34,720 --> 00:28:36,319
Speaker 4: Yeah, that's even more suspicious.

537
00:28:36,480 --> 00:28:39,359
Speaker 3: Yeah, what's wrong with that, psychopath? Why are you happy

538
00:28:39,519 --> 00:28:41,640
being interviewed? There's something wrong here.

539
00:28:43,799 --> 00:28:47,880
Speaker 2: There's no laughing in development, no laughing in interviews.

540
00:28:48,000 --> 00:28:50,079
Speaker 3: Nothing's fun. This is not fun.

541
00:28:51,319 --> 00:28:54,200
Speaker 2: We'll be right back with more with Steve Sanderson after this.

542
00:28:55,519 --> 00:28:58,240
Did you know that you can work with AWS directly

543
00:28:58,279 --> 00:29:04,240
from your IDE? AWS provides toolkits for Visual Studio, Visual Studio Code,

544
00:29:04,519 --> 00:29:08,599
and JetBrains Rider. Learn more at AWS dot Amazon

545
00:29:08,640 --> 00:29:15,839
dot com, slash net slash tools. And we're back. It's

546
00:29:15,839 --> 00:29:18,359
dot net Rocks. I'm Carl Franklin, that's Richard Campbell hey,

547
00:29:18,519 --> 00:29:22,839
and that is mister Blazer himself, Steve Sanderson. We're talking

548
00:29:22,839 --> 00:29:29,759
about AI, his latest passion. We kind of stopped and uh,

549
00:29:30,279 --> 00:29:32,440
let's open up the floor for a new topic. What's

550
00:29:32,480 --> 00:29:35,680
something else that you're thinking about, AI-wise?

551
00:29:35,519 --> 00:29:38,839
Speaker 4: Well, I guess the main area is what do

552
00:29:38,880 --> 00:29:42,720
we actually do practically in order to give better tools

553
00:29:43,000 --> 00:29:45,359
and capabilities to dot net developers. You know, I'm on

554
00:29:45,359 --> 00:29:48,880
the dot net team. That's what my job is. And

555
00:29:48,960 --> 00:29:50,799
so the main thing that we're looking at doing at

556
00:29:50,799 --> 00:29:55,799
the moment is producing a new set of standard abstractions

557
00:29:55,839 --> 00:29:58,960
for AI features in dot Net. This is a pattern

558
00:29:59,079 --> 00:30:01,240
that you know we've we've used in all sorts of

559
00:30:01,240 --> 00:30:04,319
other things. So you know, we've got packages like Microsoft

560
00:30:04,400 --> 00:30:09,160
Extensions Dependency Injection. You know, we've got Microsoft Extensions Logging

561
00:30:09,279 --> 00:30:11,799
and all sorts of other Microsoft Extensions things that contain

562
00:30:12,640 --> 00:30:17,119
standard representations of common features. So we're following that same pattern.

563
00:30:17,240 --> 00:30:19,920
We've already shipped a preview of a

564
00:30:19,960 --> 00:30:24,279
package called Microsoft Extensions AI, and that contains some new

565
00:30:24,359 --> 00:30:28,519
standard representations for things like language models which we call

566
00:30:28,519 --> 00:30:32,759
IChatClient, and embedding models, or IEmbeddingGenerator

567
00:30:32,799 --> 00:30:35,240
as we call it, and all kinds of other standard

568
00:30:35,319 --> 00:30:38,640
types and helpers that you can use when you want

569
00:30:38,680 --> 00:30:41,559
to add some AI features into your application. And then

570
00:30:41,559 --> 00:30:44,680
there are implementations for all of these with many different

571
00:30:45,680 --> 00:30:48,680
implementations provided. So we've got the obvious ones like open

572
00:30:48,720 --> 00:30:52,119
AI and Azure OpenAI. We've got ones for Ollama,

573
00:30:52,359 --> 00:30:57,359
there's ones for Gemini, ones for Anthropic. You know, all

574
00:30:57,440 --> 00:31:00,480
kinds of different implementations.
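The Microsoft.Extensions.AI abstractions Steve describes can be sketched roughly like this. The method names below are from the preview package at the time of recording and may have changed since, so treat this as an approximation rather than the definitive API:

```csharp
// Sketch of using the Microsoft.Extensions.AI preview abstractions.
// Method names are from the preview and may differ in later releases.
using Microsoft.Extensions.AI;

// IChatClient is the standard language-model abstraction; any provider
// (OpenAI, Azure OpenAI, Ollama, Gemini, Anthropic, ...) can supply one.
IChatClient chatClient = /* provider-specific implementation goes here */ null!;
var completion = await chatClient.CompleteAsync("Summarize this release note.");

// IEmbeddingGenerator is the standard embedding-model abstraction,
// producing vectors you can use for semantic search.
IEmbeddingGenerator<string, Embedding<float>> embedder =
    /* provider-specific implementation goes here */ null!;
var embeddings = await embedder.GenerateAsync(new[] { "document title" });
```

The payoff described above is that application code is written against these two interfaces once, and the concrete provider behind them can be swapped without touching that code.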

575
00:31:00,279 --> 00:31:03,839
Speaker 2: So both cloudy and local implementations of LLMs.

576
00:31:03,400 --> 00:31:07,200
Speaker 4: Yep, absolutely, and not all from us as well. The

577
00:31:07,240 --> 00:31:11,400
intention is that this is a shared community effort, so

578
00:31:11,440 --> 00:31:14,759
we've been working with external projects like the OllamaSharp

579
00:31:14,799 --> 00:31:17,880
project and others to make sure that they've all got

580
00:31:17,960 --> 00:31:21,200
implementations of these same interfaces. And the payoff for all

581
00:31:21,279 --> 00:31:24,640
this for dot net developers is that you have one

582
00:31:24,839 --> 00:31:28,759
common programming model that you can use with whichever AI back

583
00:31:28,880 --> 00:31:31,480
end you want to work with. And it's hopefully quite

584
00:31:31,480 --> 00:31:34,480
a well thought out one because you know, we've had

585
00:31:34,599 --> 00:31:37,359
people dedicated to thinking about this for months and it's

586
00:31:37,440 --> 00:31:40,200
very flexible and you can plug in your own behaviors

587
00:31:40,240 --> 00:31:43,240
at different stages in the pipeline and combine things and

588
00:31:43,279 --> 00:31:45,119
you know, do all the stuff that you will want

589
00:31:45,119 --> 00:31:47,279
to do, and yeah, we want to be able to

590
00:31:47,279 --> 00:31:48,599
build an ecosystem around that.

591
00:31:48,680 --> 00:31:52,559
Speaker 2: Are all these providers committed to adhering to the interfaces

592
00:31:52,559 --> 00:31:55,160
that you have or is it up to you doing

593
00:31:55,160 --> 00:31:58,119
the implementation to adjust if their APIs change.

594
00:31:58,480 --> 00:32:00,920
Speaker 4: So yeah, let's take an example. So we'll say

595
00:32:01,359 --> 00:32:04,279
the Ollama one, for example. If people don't know, Ollama

596
00:32:04,400 --> 00:32:06,319
is some software that you can run locally on your

597
00:32:06,559 --> 00:32:09,400
developer machine as a way of using things like language

598
00:32:09,400 --> 00:32:11,720
models locally. You probably wouldn't use it in production, but

599
00:32:11,759 --> 00:32:15,720
it's great for development, Okay. So there are there are

600
00:32:15,720 --> 00:32:18,240
a bunch of different client packages for that for dot net.

601
00:32:18,480 --> 00:32:21,359
So OllamaSharp is the most well-known one, and

602
00:32:21,480 --> 00:32:24,279
you can use its APIs directly, just its straight

603
00:32:24,359 --> 00:32:27,559
concrete APIs, new Ollama client or whatever it is,

604
00:32:27,880 --> 00:32:30,240
and use it directly. But then that code's not interoperable

605
00:32:30,240 --> 00:32:33,640
with anything else. So what you can do is use

606
00:32:33,759 --> 00:32:36,519
the IChatClient implementation that it provides. So you would

607
00:32:36,559 --> 00:32:40,160
say something like new Ollama client dot dot dot, and then at

608
00:32:40,160 --> 00:32:42,279
the end you put dot as chat client and that

609
00:32:42,359 --> 00:32:45,160
gives you back an object that implements this

610
00:32:45,240 --> 00:32:47,799
IChatClient interface. And now you can use that in

611
00:32:47,839 --> 00:32:50,240
the same way as every other AI service.
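The adapter step Steve is describing looks roughly like the snippet below. The class name and constructor are illustrative (the exact shape depends on which Ollama client package you use and its current version), so check the OllamaSharp or Microsoft.Extensions.AI.Ollama documentation before copying it:

```csharp
// Sketch: start from an Ollama-specific client, end up holding the
// standard IChatClient interface. Names here are approximate.
using Microsoft.Extensions.AI;

IChatClient client =
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1");

// From here on, the calling code is identical no matter which
// backend (OpenAI, Azure OpenAI, Ollama, ...) sits behind the interface.
var answer = await client.CompleteAsync("Why is the sky blue?");
```

This is the same trick as assigning a Serilog logger to an `ILogger` variable: the concrete type knows how to talk to its service, and the rest of the app only ever sees the standard interface.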

612
00:32:50,440 --> 00:32:55,839
Speaker 2: So it's really Ollama's responsibility to implement, yeah, IChatClient,

613
00:32:55,920 --> 00:32:57,960
it is totally, which is the dot Net interface.

614
00:32:58,079 --> 00:33:00,240
Speaker 4: Yeah, that's right, in the same way that Serilog

615
00:33:00,240 --> 00:33:03,960
implements ILogger. And you know, that's what the whole

616
00:33:04,000 --> 00:33:06,839
point of having these standard interfaces is that everyone can

617
00:33:06,920 --> 00:33:09,079
make them work in whatever way makes sense to them.

618
00:33:09,240 --> 00:33:11,799
Speaker 2: Yeah, and I'm wondering, like, how would that evolve.

619
00:33:12,799 --> 00:33:16,400
Speaker 4: Well, yeah, so it evolves in that the industry itself

620
00:33:16,480 --> 00:33:19,720
is evolving and we're getting new types of AI services

621
00:33:19,759 --> 00:33:21,960
coming out. A good example of that at the moment

622
00:33:22,039 --> 00:33:24,960
are the real time APIs. So at the moment, most

623
00:33:25,000 --> 00:33:27,200
of our interactions with language models are a sort of

624
00:33:27,599 --> 00:33:29,920
request response pattern where we send a bunch of text

625
00:33:29,920 --> 00:33:31,880
to them and they come back with a bunch of text.

626
00:33:31,960 --> 00:33:36,519
But there's a different paradigm that's emerging called real time

627
00:33:36,559 --> 00:33:40,079
at the moment, where it's a bidirectional communication channel a

628
00:33:40,119 --> 00:33:42,799
bit like a web socket or you know, just straight

629
00:33:42,880 --> 00:33:46,000
TCP socket connection where you can just sort of fire

630
00:33:46,200 --> 00:33:49,319
a load of bytes across the wire to the AI

631
00:33:49,400 --> 00:33:51,599
system and it can send bytes back to you at

632
00:33:51,640 --> 00:33:54,920
any time in any order, like overlapping or whatever, and

633
00:33:54,960 --> 00:33:57,319
those bytes could represent text, or they could be audio

634
00:33:57,400 --> 00:34:00,759
data or images or whatever in either direction. So that's

635
00:34:00,839 --> 00:34:06,079
the real time API, and our standard abstraction,

636
00:34:06,079 --> 00:34:09,519
IChatClient, doesn't represent that today. It represents the request-response,

637
00:34:09,599 --> 00:34:13,639
text-based pattern. So, as you alluded to, we have

638
00:34:13,679 --> 00:34:16,360
to evolve as time goes on. We will add further

639
00:34:16,400 --> 00:34:20,519
abstractions to represent other AI patterns when they become common

640
00:34:20,760 --> 00:34:22,039
across multiple providers.

641
00:34:22,519 --> 00:34:26,679
Speaker 2: And these providers keep context too, right? Well, they can

642
00:34:26,760 --> 00:34:28,079
do so that they can.

643
00:34:28,760 --> 00:34:32,159
Speaker 4: The classic way of interacting with the language model is stateless.

644
00:34:32,320 --> 00:34:35,079
So if you're having a conversation in a chatbot where

645
00:34:35,119 --> 00:34:36,840
you say A and it says B, and you say

646
00:34:36,920 --> 00:34:39,920
C and it says D. When you want to

647
00:34:39,960 --> 00:34:41,880
send the message E, you don't just send E to it.

648
00:34:41,920 --> 00:34:46,000
You also send A, B, C, D, because it hasn't remembered

649
00:34:46,000 --> 00:34:49,119
any of the conversation, so you literally have to repeat

650
00:34:49,159 --> 00:34:52,159
the entire conversation. Now, the app developer doesn't think about

651
00:34:52,199 --> 00:34:54,800
that now, because that's like inside the libraries. But that's

652
00:34:54,840 --> 00:34:55,840
what's happening on the wire.
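The stateless pattern Steve describes, replaying the whole conversation on every request, can be illustrated with a few lines of C#. This is a self-contained sketch with no real model behind it; the point is only what the wire payload contains:

```csharp
// Sketch: a language model remembers nothing between calls, so the
// client keeps the history and resends all of it with each message.
using System;
using System.Collections.Generic;

record ChatTurn(string Role, string Text);

class ConversationSketch
{
    static void Main()
    {
        var history = new List<ChatTurn>();

        // Each round trip appends to the history; the *entire* list is
        // what goes over the wire on the next request.
        history.Add(new ChatTurn("user", "A"));
        history.Add(new ChatTurn("assistant", "B"));
        history.Add(new ChatTurn("user", "C"));
        history.Add(new ChatTurn("assistant", "D"));
        history.Add(new ChatTurn("user", "E"));

        // The payload carrying message "E" includes A, B, C, and D too.
        Console.WriteLine($"Messages sent with 'E': {history.Count}"); // 5
    }
}
```

Client libraries and chat UIs hide this bookkeeping, which is why, as noted above, app developers rarely think about it, but it also explains why long conversations get more expensive with every turn.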

653
00:34:55,960 --> 00:34:59,400
Speaker 2: Okay. So we've been spoiled by ChatGPT. In other words,

654
00:34:59,559 --> 00:35:02,559
it's doing that context stuff under the hood and

655
00:35:02,599 --> 00:35:04,320
we don't have to worry about it.

656
00:35:04,400 --> 00:35:07,000
Speaker 4: Yeah, the WebUI does that and to be honest, they

657
00:35:07,000 --> 00:35:10,760
probably do some server-side caching of the conversations as well,

658
00:35:10,800 --> 00:35:13,079
but that's you know, that would be specific to them.

659
00:35:13,199 --> 00:35:16,920
Speaker 3: Yeah, okay, interesting I've been I've got an open AI

660
00:35:17,039 --> 00:35:21,559
interface into my Home Assistant implementation, and essentially what it

661
00:35:21,599 --> 00:35:26,119
does is prefix whatever my request is with the model

662
00:35:26,199 --> 00:35:29,360
of the house as part of the prompt. So it's

663
00:35:29,480 --> 00:35:32,559
essentially saying, here's what's in the house, now, what would

664
00:35:32,599 --> 00:35:35,360
you do with this sentence? And then it comes back

665
00:35:35,400 --> 00:35:37,239
with it, you know, so I can say things like

666
00:35:38,159 --> 00:35:40,599
turn on all the outdoor lights, turn off the lights

667
00:35:40,599 --> 00:35:42,519
in the garage,

668
00:35:42,920 --> 00:35:45,639
and it will change those statements. It really translates that

669
00:35:45,800 --> 00:35:48,760
into this switch and that switch, which it knows does all

670
00:35:48,760 --> 00:35:54,239
the things. Yeah, but it is literally manufacturing the context

671
00:35:54,400 --> 00:35:56,119
each time you make a request.
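The context-manufacturing Richard describes is just string assembly on every call. A minimal C# sketch, with made-up device names standing in for whatever the Home Assistant integration actually sends:

```csharp
// Sketch of the Home Assistant prompt pattern described above: prefix
// every request with a model of the house, then append the user's
// sentence. Device names are invented for illustration.
using System;

class HousePromptSketch
{
    static void Main()
    {
        string houseModel =
            "Devices: switch.outdoor_lights, switch.garage_lights, light.kitchen";
        string userRequest = "Turn on all the outdoor lights";

        // The context is rebuilt and sent with every single request;
        // nothing is remembered between calls.
        string prompt =
            $"Here is what's in the house:\n{houseModel}\n\n" +
            $"Now, what would you do with this sentence?\n{userRequest}";

        Console.WriteLine(prompt); // this full text goes to the model each time
    }
}
```

The model's job is then translation: mapping the free-form sentence onto the specific switches and lights listed in the prefix.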

672
00:35:56,719 --> 00:35:57,800
Speaker 4: Yeah, that is right.

673
00:35:57,960 --> 00:35:58,199
Speaker 3: Yeah.

674
00:35:58,239 --> 00:35:59,800
Speaker 2: When's Alexa going to catch up to this?

675
00:36:00,079 --> 00:36:00,400
Speaker 1: Hmmm?

676
00:36:01,159 --> 00:36:04,519
Speaker 3: Yeah. I don't envy those folks. The LLMs have sort

677
00:36:04,559 --> 00:36:07,119
of caught them with their pants down, both the Google

678
00:36:07,159 --> 00:36:11,519
Home folks and the Amazon folks. They've got

679
00:36:11,599 --> 00:36:12,880
to respond, right?

680
00:36:12,719 --> 00:36:14,920
Speaker 2: But didn't they want to kill the product?

681
00:36:14,960 --> 00:36:17,760
Speaker 3: Well, the product was losing money, losing lots of money,

682
00:36:17,920 --> 00:36:21,159
like the you know, the purpose behind those the Amazon

683
00:36:21,159 --> 00:36:24,519
device was to sell stuff on Amazon Sales tool, except

684
00:36:24,519 --> 00:36:27,280
that nobody did that right because it correctly became a

685
00:36:27,360 --> 00:36:30,119
joke about walking into the room and ordering you know,

686
00:36:30,639 --> 00:36:33,840
a ton of soap on someone, right like or having

687
00:36:33,840 --> 00:36:36,119
your children order stuff because they saw you do it.

688
00:36:36,679 --> 00:36:39,159
So nobody did that part. And they sold all that

689
00:36:39,239 --> 00:36:41,320
gear at cost. So a few years ago

690
00:36:41,360 --> 00:36:43,159
they said, hey, we're cutting these teams way back, like

691
00:36:43,239 --> 00:36:44,400
we spent billions on them.

692
00:36:44,440 --> 00:36:47,400
Speaker 2: Has anybody heard about anything that they're working on in

693
00:36:47,440 --> 00:36:48,000
the back end?

694
00:36:48,239 --> 00:36:51,599
Speaker 3: They clearly are. Yeah, I mean that they haven't.

695
00:36:51,639 --> 00:36:52,239
Speaker 2: They have to be.

696
00:36:52,320 --> 00:36:55,400
Speaker 3: They have to be. At the same time, they had just

697
00:36:55,440 --> 00:36:58,159
cut their teams and within months of Amazon saying that

698
00:36:58,199 --> 00:37:01,119
Google did the same thing for their devices as well.

699
00:37:01,480 --> 00:37:04,119
Speaker 2: Well. That that brings me to the next question, Steve,

700
00:37:04,159 --> 00:37:06,320
do you have an iPhone? I do, yep. And what

701
00:37:06,360 --> 00:37:07,920
do you think of Apple Intelligence?

702
00:37:08,239 --> 00:37:10,559
Speaker 4: Oh, I don't have one that's that new.

703
00:37:10,480 --> 00:37:14,559
Speaker 2: I do, and I don't think it's very intelligent. I

704
00:37:14,599 --> 00:37:17,440
thought I was going to get an upgrade to Siri. Okay,

705
00:37:18,039 --> 00:37:21,400
that hasn't happened. And I have, you know, my Apple

706
00:37:21,440 --> 00:37:24,760
CarPlay, so anytime I ask it something, that's Siri, yeah,

707
00:37:25,039 --> 00:37:27,199
and I'll ask it stuff and it's just as dumb

708
00:37:27,239 --> 00:37:30,239
as it was before. Yeah, so I don't see Apple Intelligence.

709
00:37:30,239 --> 00:37:32,239
Speaker 4: I mean, I'd be happy with even if the basic

710
00:37:32,320 --> 00:37:35,360
Siri worked. Like, I tell my phone to, like, set

711
00:37:35,360 --> 00:37:37,079
a reminder or something and it just gives me a

712
00:37:37,119 --> 00:37:38,519
spinner and that's it.

713
00:37:38,599 --> 00:37:39,000
Speaker 2: Wow.

714
00:37:39,119 --> 00:37:44,840
Speaker 3: So yeah, this is very unusual of Apple. But the

715
00:37:44,880 --> 00:37:48,840
best guess that the Apple you know, Digerati have said

716
00:37:48,960 --> 00:37:51,719
is that Apple was afraid last year that if they didn't

717
00:37:51,719 --> 00:37:55,119
say something about AI at WWDC because they only do

718
00:37:55,159 --> 00:37:57,320
one show a year, and if they waited till this year it would

719
00:37:57,320 --> 00:38:00,920
be too late, so they basically pre-announced something. Like, normally,

720
00:38:00,960 --> 00:38:03,360
Apple already has all this stuff worked out, talks about

721
00:38:03,360 --> 00:38:05,760
it at the show and releases it on schedule. But

722
00:38:05,800 --> 00:38:08,760
that is not what happened with Apple Intelligence. They came

723
00:38:08,840 --> 00:38:11,440
up with a term, they pitched it at the show,

724
00:38:11,719 --> 00:38:13,360
and then they immediately pushed it back.

725
00:38:13,599 --> 00:38:15,239
Speaker 2: Yeah, No, that was weird.

726
00:38:15,679 --> 00:38:17,559
Speaker 4: I don't even know if it's shipped globally. Yeah, I

727
00:38:17,559 --> 00:38:19,320
don't know if it's available in the UK. I haven't

728
00:38:19,360 --> 00:38:20,440
heard anyone talk about it.

729
00:38:20,679 --> 00:38:25,079
Speaker 3: Yeah, it is. I think you're seeing the closest

730
00:38:25,079 --> 00:38:28,000
thing to Apple being scared that they're missing the boat

731
00:38:28,079 --> 00:38:31,440
on this thing and they're behaving oddly because we would

732
00:38:31,519 --> 00:38:34,199
expect them. And again, I don't use Apple products, but

733
00:38:34,280 --> 00:38:37,559
I've lived alongside them long enough to know if Apple

734
00:38:37,599 --> 00:38:39,679
says they're going to do something, they generally just do

735
00:38:39,760 --> 00:38:44,280
it and have a good plan. And that's not happening here. Yeah,

736
00:38:44,360 --> 00:38:46,800
it's an odd time, but I do think we're all

737
00:38:46,840 --> 00:38:49,320
feeling around for the right thing to do in this space.

738
00:38:49,360 --> 00:38:52,159
Like I'm excited that you guys are building tools because

739
00:38:52,159 --> 00:38:54,519
that gives me a standard way to interact with it

740
00:38:54,559 --> 00:38:56,400
in my apps. No, like for me as a dot

741
00:38:56,400 --> 00:39:01,400
net developer. Things like the extensions, that I understand. Right,

742
00:39:01,639 --> 00:39:03,559
when I need this, I go, I go to NuGet,

743
00:39:03,599 --> 00:39:06,239
I bring it in and off I go. So

744
00:39:06,360 --> 00:39:08,880
it's a way to sort of create a common approach

745
00:39:08,920 --> 00:39:13,280
to integrating stuff into your apps. Yeah, absolutely. So I just

746
00:39:13,320 --> 00:39:15,599
grab the blog post that I'll put in the show

747
00:39:15,639 --> 00:39:17,320
notes because it does give you sort of a walk

748
00:39:17,360 --> 00:39:19,679
through of this. And it's just that you talked about

749
00:39:19,719 --> 00:39:21,920
OpenAI. You talked about, was it...

750
00:39:22,000 --> 00:39:24,480
Speaker 2: Llama? Ollama, Ollama, Ollama. And...

751
00:39:24,480 --> 00:39:26,519
Speaker 3: Then there's the inference engine.

752
00:39:26,800 --> 00:39:28,719
Speaker 4: The inference I'm not sure.

753
00:39:29,079 --> 00:39:33,000
Speaker 3: Azure, Azure AI Inference? Yeah.

754
00:39:32,679 --> 00:39:35,440
Speaker 4: Oh right, yeah, sorry, Azure AI Inference. Yeah, that's right.

755
00:39:35,480 --> 00:39:38,679
That's almost like the cloud version of Ollama. That's

756
00:39:38,719 --> 00:39:40,679
a very simple mental model for it. It's a way

757
00:39:40,719 --> 00:39:43,599
of running any kind of model that you want to

758
00:39:43,639 --> 00:39:46,599
provide yourself, or ones from a standard catalog, in

759
00:39:47,440 --> 00:39:51,320
a hosted service, as opposed to, say, Azure

760
00:39:51,360 --> 00:39:54,199
OpenAI, which is literally just OpenAI models and

761
00:39:54,320 --> 00:39:57,400
doesn't include other things like the Llama models.
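Since OpenAI, Ollama, and Azure AI Inference keep coming up as interchangeable backends, here's a minimal sketch of the "one abstraction, many providers" idea that libraries like Microsoft.Extensions.AI implement for .NET. This is illustrative Python, and every name in it is invented for the sketch; it is not any real SDK's API.

```python
from dataclasses import dataclass
from typing import Protocol

class ChatClient(Protocol):
    """The shared abstraction: any provider that can complete a prompt."""
    def complete(self, prompt: str) -> str: ...

@dataclass
class EchoProvider:
    """Hypothetical stand-in for a hosted endpoint (OpenAI, Ollama, Azure AI
    Inference); a real provider would make a network call here."""
    name: str

    def complete(self, prompt: str) -> str:
        return f"[{self.name}] reply to: {prompt}"

def summarize(client: ChatClient, text: str) -> str:
    # Application code depends only on the abstraction, so swapping the
    # backing model or host is a one-line change at the call site.
    return client.complete(f"Summarize: {text}")

print(summarize(EchoProvider("ollama"), "hello"))
print(summarize(EchoProvider("azure-ai-inference"), "hello"))
```

The point is the shape, not the providers: app code talks to one interface, and the concrete backend is chosen once, at construction time.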

762
00:39:57,280 --> 00:40:01,360
Speaker 3: Because at the same time, Azure AI has opened

763
00:40:01,400 --> 00:40:03,440
up to many more models, not just OpenAI. You

764
00:40:03,440 --> 00:40:04,519
have all these series.

765
00:40:04,639 --> 00:40:05,639
Speaker 4: That's correct.

766
00:40:05,559 --> 00:40:07,360
Speaker 3: As I mentioned to the listener at the beginning of the show.

767
00:40:07,519 --> 00:40:09,239
Speaker 4: Yeah, yeah, I think in a lot of cases that's

768
00:40:09,280 --> 00:40:11,800
going to be a much easier way of running a

769
00:40:11,800 --> 00:40:14,800
wider range of models in production than you know, setting

770
00:40:14,840 --> 00:40:21,280
up your own like containers and serving model output from that.

771
00:40:21,199 --> 00:40:24,960
That's quite a hard business to do because, you know, these

772
00:40:25,039 --> 00:40:26,760
things only work well if they're running on the right

773
00:40:26,840 --> 00:40:30,519
kind of GPU hardware, and that can be expensive, and

774
00:40:30,840 --> 00:40:33,360
you know, you really don't want to be managing all

775
00:40:33,400 --> 00:40:33,840
of that stuff.

776
00:40:33,840 --> 00:40:36,199
Speaker 3: If you can avoid it. That stuff is plumbing, right. You

777
00:40:36,400 --> 00:40:39,239
just want to be able to send your thing

778
00:40:39,239 --> 00:40:42,480
out over an API and get a useful response back. Yeah, done,

779
00:40:42,480 --> 00:40:46,239
it's just another API. No magic. Please just let my API,

780
00:40:46,639 --> 00:40:50,480
my API call, work. Exactly. You're creating a cloud dependency.

781
00:40:50,519 --> 00:40:52,360
But that's not that weird. Like there seems to be

782
00:40:52,360 --> 00:40:57,840
a big push to run more of these models on prem.

783
00:40:58,320 --> 00:41:00,840
While we're recording this, I think CES is still going

784
00:41:00,880 --> 00:41:03,519
on, the Consumer Electronics Show, and Nvidia made this announcement

785
00:41:03,559 --> 00:41:07,800
about this three-thousand-dollar supercomputer running what they called

786
00:41:08,719 --> 00:41:14,119
the Grace Blackwell chipset. But they're saying, for three

787
00:41:14,119 --> 00:41:18,280
thousand bucks, you can run a two hundred billion parameter

788
00:41:18,800 --> 00:41:19,639
model on prem.

789
00:41:19,920 --> 00:41:22,360
Speaker 4: Oh it's funny, isn't it the way this competition works.

790
00:41:22,400 --> 00:41:25,920
So yeah, the current status quo is that you use

791
00:41:25,960 --> 00:41:30,920
a combination of Nvidia-provided hardware and then the software

792
00:41:30,960 --> 00:41:35,119
from OpenAI or another tech giant, Microsoft or Google.

793
00:41:35,639 --> 00:41:39,320
And so we've got Nvidia trying to commoditize

794
00:41:39,719 --> 00:41:43,079
the other tech giants, the model providers, by saying, oh no,

795
00:41:43,159 --> 00:41:45,199
you just need our hardware. You don't need any cloud.

796
00:41:45,440 --> 00:41:47,800
And then you've got the tech giants trying to commoditize

797
00:41:47,880 --> 00:41:50,079
Nvidia by saying, oh, we've got our own custom silicon,

798
00:41:50,320 --> 00:41:52,800
you don't need Nvidia. Yeah.

799
00:41:53,679 --> 00:41:55,840
Speaker 3: Fun. While happily buying each other's products too.

800
00:41:56,039 --> 00:41:58,679
Speaker 4: Yeah, we'll see who's going to win.

801
00:41:58,920 --> 00:42:02,320
Speaker 3: This is coopetition. But what did impress me, like,

802
00:42:03,199 --> 00:42:05,639
GPT-3 is one hundred and seventy five billion parameters.

803
00:42:05,639 --> 00:42:08,400
So in a matter of four years, we've gone from

804
00:42:09,159 --> 00:42:13,280
a cloud problem that Mark Russinovich talked about, building the

805
00:42:13,280 --> 00:42:16,119
fifth largest supercomputer in the world to build that model

806
00:42:16,159 --> 00:42:18,840
back in the day, to something for three grand I

807
00:42:18,840 --> 00:42:21,159
could sit on my desk. In theory, that is. It was CES,

808
00:42:21,239 --> 00:42:23,039
which is basically a concept show.

809
00:42:23,000 --> 00:42:26,519
Speaker 2: Right, and the stock continues to go up for them. Yeah.

810
00:42:27,280 --> 00:42:29,480
Speaker 4: Yeah, it's amazing how fast this grows. In fact, just

811
00:42:29,559 --> 00:42:34,559
yesterday I was experimenting with what's the biggest language model

812
00:42:34,559 --> 00:42:37,199
that I can train on my laptop in like one minute?

813
00:42:37,920 --> 00:42:39,719
I thought it was going to be nothing. Like, I thought,

814
00:42:40,119 --> 00:42:42,840
You know, it takes months of GPU time to make

815
00:42:42,880 --> 00:42:45,719
a sensible language model. Like, in one minute, it's not

816
00:42:45,760 --> 00:42:48,360
even going to produce words that you can read. Sure,

817
00:42:49,559 --> 00:42:53,480
it turns out you can produce something, sort of. If

818
00:42:53,480 --> 00:42:57,920
you take a GPT, sorry, GPT-2, an implementation

819
00:42:58,039 --> 00:43:01,000
of that on a single GPU on my laptop, I

820
00:43:01,039 --> 00:43:06,280
can get something that produces pretty intelligible sentences within one

821
00:43:06,320 --> 00:43:07,199
minute of training.

822
00:43:07,639 --> 00:43:07,800
Speaker 3: Whoa.

823
00:43:08,440 --> 00:43:12,159
Speaker 4: So yeah, it's crazy how quickly we've come on

824
00:43:12,280 --> 00:43:15,760
from, you know, it being an industry-defining

825
00:43:15,800 --> 00:43:21,039
problem to produce anything like the English language to being

826
00:43:21,079 --> 00:43:22,039
really quite straightforward.
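For a sense of scale on the "small model, short training" point: the snippet below is not the GPT-2 experiment described above, just a character-bigram toy, but it "trains" in milliseconds in pure Python and already emits word-shaped text, which is the spirit of the observation.

```python
import random
from collections import Counter, defaultdict

def train_bigram(text):
    """Count character-to-character transitions: a tiny 'language model'."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample one character at a time from the learned transition counts."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length:
        nxt = counts.get(out[-1])
        if not nxt:
            break  # dead end: the last character was never followed by anything
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "the cat sat on the mat and the dog sat on the log " * 20
model = train_bigram(corpus)
print(generate(model, "t", 40))
```

A real GPT-2-style run learns vastly longer-range structure, of course; the toy only shows why the cost floor for "something sort of readable" is now tiny.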

827
00:43:23,159 --> 00:43:25,760
Speaker 3: You know, there used to be an XKCD graphic about

828
00:43:25,800 --> 00:43:28,679
identifying birds in a picture needing a team and

829
00:43:28,800 --> 00:43:31,920
years of work, and now we expect our phone when

830
00:43:31,960 --> 00:43:36,719
we take a photo, like I've got the new Android phone,

831
00:43:36,760 --> 00:43:39,960
the Pixel nine, and one of its features in Gemini

832
00:43:40,039 --> 00:43:41,880
is when you take a photo, it writes a caption

833
00:43:41,920 --> 00:43:46,559
of what's in the photo, and that's included. Like, just beautiful.

834
00:43:47,119 --> 00:43:48,400
Speaker 4: Is that on device, then?

835
00:43:48,639 --> 00:43:50,639
Speaker 3: Apparently it's on device. I suspect there's a round

836
00:43:50,679 --> 00:43:53,000
trip to the cloud involved, I'm pretty sure.

837
00:43:53,079 --> 00:43:54,800
Speaker 2: What I want to do, and I keep

838
00:43:54,840 --> 00:43:56,480
threatening my wife I'm going to do this, is to

839
00:43:56,559 --> 00:44:00,320
take the ten thousand photos that have accumulated since I

840
00:44:00,360 --> 00:44:03,800
started saving them and upload them to a cloud service

841
00:44:03,880 --> 00:44:07,039
and have it identify what's in the photos, and then

842
00:44:07,239 --> 00:44:10,159
create a database that I can search so that I

843
00:44:10,199 --> 00:44:13,800
can say, you know, a picture of Richard Campbell with

844
00:44:13,880 --> 00:44:17,360
a glass of whiskey, and nine thousand of those ten

845
00:44:17,400 --> 00:44:20,079
thousand photos come up and.

846
00:44:20,280 --> 00:44:23,840
Speaker 3: At least. Yeah, it's just instant for me.
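The photo-search idea Carl describes decomposes into two steps: get a caption per image (from some vision model or cloud service), then make the captions searchable. A hedged sketch of the second step, with hard-coded captions standing in for real model output:

```python
import sqlite3

# Hypothetical captions; in practice these would come from running an
# image-captioning model or cloud vision API over the photo library.
captions = [
    ("img_0001.jpg", "Richard Campbell with a glass of whiskey"),
    ("img_0002.jpg", "a dog running on a beach at sunset"),
    ("img_0003.jpg", "two friends toasting with whiskey glasses"),
]

db = sqlite3.connect(":memory:")
# FTS5 gives full-text search over the captions with no extra infrastructure.
db.execute("CREATE VIRTUAL TABLE photos USING fts5(filename, caption)")
db.executemany("INSERT INTO photos VALUES (?, ?)", captions)

hits = [f for (f,) in db.execute(
    "SELECT filename FROM photos WHERE photos MATCH ?", ("whiskey",))]
print(hits)
```

Scaled up to ten thousand photos, the only change is where the captions come from; the index and the query stay this simple.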

847
00:44:24,800 --> 00:44:27,800
Speaker 2: Yeah. I think that's another little science-fiction-y thing too.

848
00:44:27,840 --> 00:44:30,119
But I mean you can sort of do that now.

849
00:44:30,280 --> 00:44:32,079
Speaker 4: But oh you can totally do that now.

850
00:44:32,159 --> 00:44:33,719
Speaker 2: Yeah, maybe not.

851
00:44:34,400 --> 00:44:37,159
Speaker 4: You could even just have it generate those pictures even

852
00:44:37,159 --> 00:44:38,000
if they didn't exist.

853
00:44:38,079 --> 00:44:38,840
Speaker 2: Well, there you go.

854
00:44:39,199 --> 00:44:41,079
Speaker 3: There is that. Yeah, you don't have enough pictures of me

855
00:44:41,119 --> 00:44:43,039
holding a glass of whiskey. You can make them.

856
00:44:44,119 --> 00:44:45,800
Speaker 2: By the way, I still haven't been able to send

857
00:44:45,840 --> 00:44:49,920
you a Christmas present because your mail system is borked, Richard.

858
00:44:50,000 --> 00:44:52,599
Speaker 3: Well, we had a six week strike, yeah, and it

859
00:44:52,760 --> 00:44:54,440
kind of made a mess of absolutely everything.

860
00:44:54,760 --> 00:44:57,079
Speaker 2: Is it back in business now or yeah.

861
00:44:56,920 --> 00:45:00,239
Speaker 3: The strike has been stopped till the spring, so there's a

862
00:45:00,280 --> 00:45:04,280
window now. Okay. But it's not done. They never resolved anything.

863
00:45:04,280 --> 00:45:06,039
They just kicked it down the road to the spring.

864
00:45:06,320 --> 00:45:09,199
Speaker 2: Steve, do you have your hands in Blazor much these days?

865
00:45:09,400 --> 00:45:09,639
Speaker 3: Yeah?

866
00:45:09,679 --> 00:45:12,719
Speaker 4: So definitely not as much as before, because you know,

867
00:45:12,719 --> 00:45:17,719
my focus has broadened and spread out. But yes,

868
00:45:17,719 --> 00:45:20,800
I'm still very involved in the Blazor team. You know,

869
00:45:20,840 --> 00:45:25,280
I've probably got like ten Blazor meetings a week, and yeah,

870
00:45:25,320 --> 00:45:29,400
so we're working towards clarifying a dot net ten plan.

871
00:45:29,559 --> 00:45:31,400
At the moment for Blazor, we've got a whole bunch

872
00:45:31,480 --> 00:45:33,880
of meetings going on around that.

873
00:45:34,199 --> 00:45:36,559
Speaker 3: It's crazy to think about. Dot net ten just gives

874
00:45:36,559 --> 00:45:39,079
me chills, man. Yeah, I guess it's only a couple

875
00:45:39,159 --> 00:45:41,239
months ago you shipped dot net nine, so I guess

876
00:45:41,239 --> 00:45:42,840
it's only fair there's going to be another one.

877
00:45:42,960 --> 00:45:44,599
Speaker 4: Yeah, I know what you mean. It's a bit like

878
00:45:44,639 --> 00:45:47,679
browser versions, like you know, Chrome one hundred and fifty

879
00:45:47,760 --> 00:45:52,440
five or whatever by now. Yeah, time goes on. So yeah,

880
00:45:52,440 --> 00:45:55,079
the Blazor team, we're actually all gathering in person

881
00:45:55,400 --> 00:45:57,480
next week, which is going to be fun to see everyone,

882
00:45:57,719 --> 00:46:00,559
and we will try and dig in to a lot

883
00:46:00,559 --> 00:46:03,719
of this planning and working out what we want. I

884
00:46:03,719 --> 00:46:05,960
think one of the big themes that we'll focus on

885
00:46:06,079 --> 00:46:10,079
in dot net ten is the most core scenario of

886
00:46:10,119 --> 00:46:15,599
Blazor usage, which is Blazor Server, and what we can

887
00:46:15,679 --> 00:46:19,840
do to make that as bulletproof as possible. Like one

888
00:46:19,880 --> 00:46:21,960
of the key things that people have struggled with in Blazor

889
00:46:21,960 --> 00:46:24,239
Server from the beginning is the fact that you have

890
00:46:24,280 --> 00:46:26,920
to have this constant connection. And yes, we've always had

891
00:46:26,920 --> 00:46:29,800
a sort of reconnection mechanism, but it's always been a

892
00:46:29,800 --> 00:46:33,480
little bit, you know, suboptimal, and like, does it really work,

893
00:46:33,519 --> 00:46:35,559
And even when it works, it's a little bit like

894
00:46:35,679 --> 00:46:38,079
jarring to the end user and can result in them

895
00:46:38,119 --> 00:46:41,519
losing some state. So what can we do to make

896
00:46:41,599 --> 00:46:45,239
it as practical as possible for a Blazor app developer

897
00:46:45,559 --> 00:46:49,280
to have a really bulletproof experience for the user so

898
00:46:49,280 --> 00:46:51,159
that whether or not the connection is lost, the user

899
00:46:51,199 --> 00:46:52,559
does not lose state.

900
00:46:52,639 --> 00:46:55,039
Speaker 2: Well, you certainly laid the foundation really well in dot

901
00:46:55,199 --> 00:46:57,679
net nine. I mean even right out of the box,

902
00:46:57,760 --> 00:47:01,599
the experience is, yeah, your code kind of freezes for

903
00:47:01,639 --> 00:47:04,320
a bit, Yeah, but it usually comes back.

904
00:47:04,920 --> 00:47:06,880
Speaker 4: Yeah, Yeah, that's right. We did improve it a lot

905
00:47:06,880 --> 00:47:09,519
in nine. Prior to nine, it used to be

906
00:47:09,559 --> 00:47:12,239
a pretty awful user experience, if I'm honest, where it

907
00:47:12,280 --> 00:47:14,679
would just sort of make it the user's problem to

908
00:47:15,400 --> 00:47:19,119
work out whether to reload. You know, it wasn't great.

909
00:47:19,519 --> 00:47:22,440
And in nine it's a lot more prone to like

910
00:47:22,480 --> 00:47:25,079
figure out the problem and solve it itself, but that's

911
00:47:25,119 --> 00:47:30,320
still not bulletproof. You can still lose data because it's

912
00:47:30,320 --> 00:47:32,199
stored in the server memory, and if the server's gone,

913
00:47:32,239 --> 00:47:34,199
well it's gone, Like what else do you expect to happen?

914
00:47:34,719 --> 00:47:39,440
So we want to create mechanisms by which you can

915
00:47:40,639 --> 00:47:44,920
persist that state in a way that's independent of maintaining

916
00:47:44,920 --> 00:47:45,840
the connection to the server.
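The goal Steve describes, state that survives a lost circuit, comes down to keeping the state in a store keyed by a token the client holds, rather than only in one server process's memory. A toy sketch of that pattern (not Blazor's actual mechanism or API):

```python
import json
import uuid

class StateStore:
    """Keeps serialized UI state keyed by a token the client retains.

    In a real deployment this would be Redis, a database, or similar,
    so that any server instance can restore the state after a reconnect."""

    def __init__(self):
        self._items = {}

    def save(self, state) -> str:
        token = str(uuid.uuid4())
        self._items[token] = json.dumps(state)  # serialize; don't hold live objects
        return token

    def restore(self, token):
        raw = self._items.get(token)
        return json.loads(raw) if raw is not None else None

store = StateStore()
token = store.save({"cart": ["apples", "pears"], "page": 3})
# ...the connection drops; later the client reconnects and presents its token...
print(store.restore(token))
```

Because the state round-trips through serialization under a client-held token, losing the connection, or even the original server process, no longer means losing what the user was doing.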

917
00:47:46,280 --> 00:47:50,679
Speaker 2: In nine, there are also Blazor improvements for auto mode.

918
00:47:51,280 --> 00:47:53,840
In general, it didn't really work all that well for

919
00:47:53,960 --> 00:47:56,599
me in dot net eight and before, and in

920
00:47:56,760 --> 00:47:58,880
nine it seems to really work well.

921
00:47:59,079 --> 00:48:01,679
Speaker 4: Yeah, so when eight first shipped, we unfortunately had

922
00:48:01,760 --> 00:48:04,199
quite a bad bug with the auto mode that basically meant

923
00:48:04,239 --> 00:48:07,119
it was never going to work, which was terrible, and

924
00:48:07,159 --> 00:48:08,920
we patched it almost straight away, but still a

925
00:48:08,920 --> 00:48:12,840
lot of people got this very bad first impression of that,

926
00:48:14,280 --> 00:48:17,239
and even then it, you know, still had constraints

927
00:48:17,280 --> 00:48:20,400
and limitations, and so yeah, we really wanted to improve

928
00:48:20,440 --> 00:48:22,199
that a lot in nine, make sure that not only

929
00:48:22,239 --> 00:48:24,280
did it really work this time, but it actually works

930
00:48:24,320 --> 00:48:26,400
well in practical situations.

931
00:48:25,840 --> 00:48:28,639
Speaker 2: And that seems to be, along with some state

932
00:48:28,719 --> 00:48:33,000
management and you know, shifting state from server to client

933
00:48:33,039 --> 00:48:34,960
when we go to web assembly. That kind of stuff

934
00:48:35,480 --> 00:48:38,320
seems to be a really good pattern. I think now

935
00:48:38,960 --> 00:48:41,639
that I would actually recommend it, you know, because you

936
00:48:41,679 --> 00:48:45,960
get the startup snappiness of server, but then you get

937
00:48:46,000 --> 00:48:50,039
the you know, the independence of web assembly.

938
00:48:50,199 --> 00:48:53,840
Speaker 4: I absolutely agree with you that if your application is

939
00:48:53,880 --> 00:48:55,519
built in the right way and the developer knows what

940
00:48:55,559 --> 00:48:57,920
they're doing, then you can make it work really great.

941
00:48:57,960 --> 00:49:00,719
And that's excellent. But we kind of want to

942
00:49:00,800 --> 00:49:02,920
raise the bar from that for dot net ten. We want

943
00:49:02,920 --> 00:49:04,760
to make it so that you don't have to be

944
00:49:04,800 --> 00:49:08,760
someone who writes your code in a sort of elaborate

945
00:49:08,800 --> 00:49:11,239
and carefully thought out manner. You don't have to be

946
00:49:11,239 --> 00:49:13,719
an expert, but rather just the natural way of building

947
00:49:13,760 --> 00:49:17,079
your code in a simple, obvious manner will just have

948
00:49:17,119 --> 00:49:21,239
all those benefits just innately. That's really how we want

949
00:49:21,239 --> 00:49:22,960
to raise the bar and improve things.

950
00:49:22,920 --> 00:49:25,760
Speaker 2: Right, so we can still use server, because using server

951
00:49:25,880 --> 00:49:28,440
is just by itself a huge benefit because you don't

952
00:49:28,480 --> 00:49:30,159
have to have an API layer and you don't have

953
00:49:30,239 --> 00:49:33,880
to take extra security precautions and things like that.

954
00:49:34,280 --> 00:49:37,440
Speaker 4: Yeah, it's very, very straightforward. I really enjoy whenever I

955
00:49:37,440 --> 00:49:39,440
get to write a bit of Blazor Server code because

956
00:49:39,480 --> 00:49:43,440
it just bypasses so many of the boring boilerplate tasks

957
00:49:43,480 --> 00:49:45,960
that you have in web development and just means you

958
00:49:46,039 --> 00:49:48,599
just get something running very quickly and it just works.

959
00:49:48,760 --> 00:49:51,360
Speaker 2: Well, I'm excited. Let me know when you'll have bits.

960
00:49:51,119 --> 00:49:54,559
Speaker 3: A long way from those experiments you showed in twenty sixteen, friend,

961
00:49:54,639 --> 00:49:55,199
my goodness.

962
00:49:55,360 --> 00:49:58,960
Speaker 4: Oh yeah, yeah, yes, indeed.

963
00:49:59,679 --> 00:50:00,840
Speaker 3: Yeah.

964
00:50:01,199 --> 00:50:02,519
Speaker 2: Is that the best talk you ever gave?

965
00:50:02,559 --> 00:50:06,000
Speaker 4: Well, it was definitely fun. It was really cool.

966
00:50:06,039 --> 00:50:09,159
Speaker 3: Actually, doing that, you hurt David Fowler's head, and

967
00:50:09,159 --> 00:50:11,679
that's not easy to do. Yeah, yeah, that's true.

968
00:50:12,000 --> 00:50:12,920
Speaker 2: Yeah, very cool.

969
00:50:13,039 --> 00:50:15,360
Speaker 3: It was a great time. It's good. It was cool

970
00:50:15,400 --> 00:50:18,239
to be in the room. Definitely a moment, definitely.

971
00:50:18,280 --> 00:50:20,199
Speaker 2: Well, I'm looking forward to it, and I guess you

972
00:50:20,239 --> 00:50:22,079
know the repos are in the open, so you can just

973
00:50:22,119 --> 00:50:25,440
go check the progress and are the early bits out

974
00:50:25,480 --> 00:50:26,119
there yet for.

975
00:50:26,320 --> 00:50:28,880
Speaker 4: For dot net ten? No, not at all, because we haven't even

976
00:50:28,880 --> 00:50:32,760
decided what's going to go into it. Sometimes we start

977
00:50:32,800 --> 00:50:36,440
our planning a little bit earlier, but we were intentional

978
00:50:36,760 --> 00:50:39,119
as a team to wait a little bit longer and

979
00:50:39,199 --> 00:50:42,159
let things settle a little bit more, be more thoughtful

980
00:50:42,159 --> 00:50:47,559
about collecting feedback and about like creating clear consensus about

981
00:50:47,280 --> 00:50:51,440
what would be wanted in this particular release. So yeah,

982
00:50:51,440 --> 00:50:53,920
that's why it's brought us into January before we've

983
00:50:54,840 --> 00:50:55,840
got a nailed down plan.

984
00:50:56,440 --> 00:50:59,360
Speaker 2: Excellent, awesome, Well we're coming to the end here. Is

985
00:50:59,360 --> 00:51:00,960
there anything else you want to talk about before we

986
00:51:00,960 --> 00:51:01,480
wrap it up?

987
00:51:01,639 --> 00:51:03,280
Speaker 4: Not specifically for me, all right?

988
00:51:03,480 --> 00:51:05,400
Speaker 2: Best place in your neighborhood for fish and chips?

989
00:51:05,639 --> 00:51:09,199
Speaker 4: Fish and chips? Wow, I don't know. Probably got a

990
00:51:09,239 --> 00:51:12,960
couple of local chippies, but yeah, it's not really my thing,

991
00:51:14,159 --> 00:51:14,559
not really.

992
00:51:14,639 --> 00:51:14,960
Speaker 2: Okay.

993
00:51:15,239 --> 00:51:17,800
Speaker 3: You're in the South of England. Is that fish and

994
00:51:17,880 --> 00:51:19,679
chip country?

995
00:51:19,719 --> 00:51:20,119
Speaker 1: It is?

996
00:51:20,280 --> 00:51:22,159
Speaker 3: It is. You don't eat it?

997
00:51:22,679 --> 00:51:25,679
Speaker 2: You must be going for a chicken tikka masala then

998
00:51:25,760 --> 00:51:26,480
or something.

999
00:51:27,199 --> 00:51:29,199
Speaker 3: Better. More for you, yeah, for sure.

1000
00:51:29,239 --> 00:51:31,719
Speaker 2: And where's your where's your favorite place in Bristol?

1001
00:51:32,280 --> 00:51:35,280
Speaker 4: Oh, in Bristol, We've got a really good one called

1002
00:51:35,400 --> 00:51:37,800
Urban Tandoor, just a good tip for the

1003
00:51:37,840 --> 00:51:42,320
the locals. Yeah, that's sort of a classic in my area.

1004
00:51:42,400 --> 00:51:44,440
Speaker 2: Well, Steve, it's always great to talk to you, of course,

1005
00:51:44,639 --> 00:51:47,360
and we'll be speaking to you more, you know, when

1006
00:51:47,800 --> 00:51:52,559
things around dot net ten get going. Awesome. Yeah, okay, thanks again

1007
00:51:52,639 --> 00:52:14,119
and we'll talk to you next time on dot net rocks.

1008
00:52:16,480 --> 00:52:19,159
Dot net Rocks is brought to you by Franklin's Net

1009
00:52:19,280 --> 00:52:23,239
and produced by Pop Studios, a full service audio, video

1010
00:52:23,320 --> 00:52:27,400
and post production facility located physically in New London, Connecticut,

1011
00:52:27,639 --> 00:52:32,440
and of course in the cloud online at pwop dot com.

1012
00:52:32,639 --> 00:52:34,760
Speaker 5: Visit our website at d O T N E t

1013
00:52:35,000 --> 00:52:39,039
R O c k S dot com for RSS feeds, downloads,

1014
00:52:39,199 --> 00:52:42,880
mobile apps, comments, and access to the full archives going

1015
00:52:42,920 --> 00:52:46,159
back to show number one, recorded in September two.

1016
00:52:45,960 --> 00:52:48,960
Speaker 2: Thousand and two. And make sure you check out our sponsors.

1017
00:52:49,119 --> 00:52:52,119
They keep us in business. Now, go write some code.

1018
00:52:52,480 --> 00:52:53,239
See you next time.

1019
00:52:54,159 --> 00:52:59,440
Speaker 5: You got javans

1020
00:53:01,119 --> 00:53:04,960
Speaker 2: That means home, then my Texas in line d

