1
00:00:01,000 --> 00:00:04,759
How'd you like to listen to .NET
Rocks with no ads? Easy.

2
00:00:05,320 --> 00:00:09,400
Become a patron for just five dollars
a month. You get access to a

3
00:00:09,480 --> 00:00:14,199
private RSS feed where all the shows
have no ads. Twenty dollars a month,

4
00:00:14,240 --> 00:00:18,359
will get you that and a special
.NET Rocks patron mug. Sign

5
00:00:18,480 --> 00:00:24,480
up now at patreon.dotnetrocks.com.
Hey there, this

6
00:00:24,559 --> 00:00:28,839
is Jeff Fritz, the Purple Blazer
guy from Microsoft, letting you in on

7
00:00:28,879 --> 00:00:33,039
a little secret about my friend Carl
Franklin. You know, the guy who

8
00:00:33,079 --> 00:00:37,799
started .NET Rocks, the first
podcast about .NET in two thousand and

9
00:00:37,799 --> 00:00:43,399
two? The guy who's been teaching
Blazor on YouTube since twenty twenty. Yeah,

10
00:00:43,439 --> 00:00:48,039
that Carl Franklin. Well, Carl's
joined up with the folks from Code

11
00:00:48,039 --> 00:00:52,799
in a Castle to teach a week-long
hands-on Blazor class at, are you

12
00:00:52,799 --> 00:00:58,840
ready to get this? A
castle slash villa in Tuscany. It's sort

13
00:00:58,840 --> 00:01:03,920
of a luxury vacation with Blazor learning
built in. Carl's calling it the Blazor

14
00:01:04,120 --> 00:01:10,519
Master Class. You'll learn Blazor from
the ground up, finishing the week with

15
00:01:10,560 --> 00:01:15,359
the ability to build and deploy Blazor
applications. Since the training happens for only

16
00:01:15,400 --> 00:01:19,560
four hours in the morning over six
days, you can bring your significant other

17
00:01:19,680 --> 00:01:26,000
or your partner with you, and you should,
right? This part of Italy is absolutely beautiful.

18
00:01:26,359 --> 00:01:30,640
There's so much to see and do,
and Ilaria and Marco from Code in

19
00:01:30,680 --> 00:01:37,000
a Castle are organizing daily activities, both
at the castle and in the area.

20
00:01:37,079 --> 00:01:41,719
The castle is in the Maremma,
a less-touristed region of Tuscany, offering

21
00:01:41,760 --> 00:01:47,760
both classic Tuscan hill country as well
as easy access to the Etruscan Riviera,

22
00:01:47,799 --> 00:01:53,599
with sublime local food, wine and
olive oil around every corner. Breakfast is

23
00:01:53,599 --> 00:01:57,400
included every day. There will be
two communal dinners at the castle,

24
00:01:57,480 --> 00:02:02,000
bookending the experience, and most other
meals and all activities are included. And

25
00:02:02,079 --> 00:02:07,639
did I mention you'll learn Blazor in
person from Carl Franklin? Listen, space is

26
00:02:07,680 --> 00:02:13,520
limited and for very good reason.
This is quality training in a beautiful setting.

27
00:02:14,120 --> 00:02:21,280
Go to codeinacastle.com slash
Blazor twenty twenty three. That's B-L-A-Z

28
00:02:21,479 --> 00:02:27,120
O-R, two zero two three, to
take advantage of this amazing opportunity to join

29
00:02:27,240 --> 00:02:32,319
Carl in Tuscany for an unforgettable week
of La Dolce Vita while advancing your programming

30
00:02:32,319 --> 00:02:38,479
skills in this important new technology.
After building software for a while, you

31
00:02:38,520 --> 00:02:43,240
know it's only a matter of time
before you see an HTTP timeout or a

32
00:02:43,319 --> 00:02:46,560
database deadlock. In software, it's
not a case of if things fail,

33
00:02:46,680 --> 00:02:52,400
but a case of when. One mishap
like this and valuable data is lost forever.

34
00:02:52,639 --> 00:02:57,080
And these failures occur all the time, but it doesn't have to be

35
00:02:57,159 --> 00:03:01,199
this way. Introducing NServiceBus,
the ultimate tool to build robust and reliable

36
00:03:01,240 --> 00:03:07,039
systems that can handle failures gracefully,
maintain high availability, and scale to meet

37
00:03:07,080 --> 00:03:12,639
growing demand. For more than fifteen
years, end service bus has been trusted

38
00:03:12,639 --> 00:03:17,159
to run mission critical systems that must
not go down or lose any data ever.

39
00:03:17,759 --> 00:03:22,960
And now you can try it for
yourself. NServiceBus integrates seamlessly

40
00:03:23,000 --> 00:03:25,680
with your .NET applications and can
be hosted on premises or in the cloud.

41
00:03:25,960 --> 00:03:30,960
Say goodbye to lost data and system
failures and say hello to a better,

42
00:03:30,280 --> 00:03:36,080
more reliable way of building distributed systems. Try NServiceBus today by

43
00:03:36,120 --> 00:03:40,719
heading over to go.particular.net
slash dotnetrocks and start

44
00:03:40,719 --> 00:04:00,759
building better systems with asynchronous messaging
using NServiceBus. Welcome back to .NET

45
00:04:00,840 --> 00:04:04,280
Rocks. This is Carl Franklin and
it's Richard Campbell and Mark Miller is here

46
00:04:04,360 --> 00:04:12,080
joining us. Nothing. I keep
telling you, nothing is going to go

47
00:04:12,159 --> 00:04:15,599
wrong on this show. It is
not going to be Mondays. However.

48
00:04:15,639 --> 00:04:18,120
This is clearly .NET Rocks.
But we'll talk to Mark in a minute.

49
00:04:18,160 --> 00:04:20,319
But first of all, buddy,
how you doing? I'm all right.

50
00:04:20,360 --> 00:04:25,000
Things are good up here. We're
you know, getting organized, and

51
00:04:25,160 --> 00:04:29,680
we're knocking out a few shows.
Air's clean? Yeah, you know,

52
00:04:29,759 --> 00:04:31,480
it's nice being on the West coast
where the wind comes off the ocean.

53
00:04:31,680 --> 00:04:35,360
Yeah. Yeah, there's a forest
fire. You know, it's forest fire

54
00:04:35,439 --> 00:04:39,800
season started very early here. Yeah, so there is a fire inland a

55
00:04:39,839 --> 00:04:42,920
bit and it's creeping. Depending on
how the wind flow goes, it creeps in

56
00:04:42,920 --> 00:04:45,759
a little bit closer. But it's
all right there. New York City got

57
00:04:45,759 --> 00:04:48,120
hammered, but we sort of
were on the outskirts of it, but

58
00:04:48,600 --> 00:04:54,720
we still had over two hundred, two
hundred and fifty air quality index. Poor air quality.

59
00:04:54,720 --> 00:04:59,279
And meanwhile, we're evacuating towns
that are burning to the ground and

60
00:04:59,319 --> 00:05:02,399
trying to save them. So, yeah,
breathe carefully while we try and not kill

61
00:05:02,439 --> 00:05:06,480
everybody. Yeah, that's true. Good luck up there. Yeah,

62
00:05:06,560 --> 00:05:11,959
the good news is that the US
is sending firefighters and water bombers.

63
00:05:12,000 --> 00:05:15,319
We all help each other during fire season.
It's convenient for us to have our season

64
00:05:15,319 --> 00:05:17,639
early, so when your later season
arrives, we're available to help out.

65
00:05:18,199 --> 00:05:25,040
That's so awesome. Thank you for
that. All part of the system. All

66
00:05:25,120 --> 00:05:27,480
right, well, let's roll the
crazy music for a little thing we call

67
00:05:27,600 --> 00:05:39,160
Better Know a Framework. Awesome. All right, man. Well, you probably

68
00:05:39,199 --> 00:05:43,519
heard Jeff Fritz in the introduction there
talking about this crazy idea that a friend

69
00:05:43,519 --> 00:05:48,319
of mine had to rent a castle
in Tuscany and have a week-long training

70
00:05:48,319 --> 00:05:53,639
class there on Blazor. And for
it, I sort of coined the term

71
00:05:53,800 --> 00:05:59,639
traincation, yeah, because it's just
as much a vacation with sightseeing and touristy

72
00:05:59,680 --> 00:06:02,319
stuff as it is training. In
fact, the training is only for four

73
00:06:02,360 --> 00:06:05,439
hours a day for six days,
and the rest of the time you get

74
00:06:05,480 --> 00:06:11,759
guided tours of the surrounding area.
So the only problem I think is that

75
00:06:12,040 --> 00:06:15,439
it's a little pricey. You
kind of have to consider it both training

76
00:06:15,439 --> 00:06:20,240
budget and vacation budget, right?
Yeah. So yeah, well, Tuscany's

77
00:06:20,279 --> 00:06:24,439
not cheap, and flying is certainly
not cheap these days. That's...

78
00:06:24,480 --> 00:06:27,759
it's not... Yeah, it's not cheap.
And the other thing is that you can't

79
00:06:27,800 --> 00:06:30,040
just... this castle is not a hotel. It's a villa that you have to

80
00:06:30,079 --> 00:06:34,199
rent, the whole thing, right, for a
week at a time, and that's completely

81
00:06:34,279 --> 00:06:39,839
outside the range of most people's ability, you know. So that's it.

82
00:06:39,839 --> 00:06:44,480
It's at codeinacastle.com
slash Blazor twenty twenty three. Cool.

83
00:06:44,839 --> 00:06:49,000
And today, when this comes out,
is the deadline for the

84
00:06:49,040 --> 00:06:56,120
early-bird discount, so you'll
have to pay full price. But we

85
00:06:56,160 --> 00:06:58,759
need to, we need to know,
you know, who's interested in going. Richard

86
00:06:58,879 --> 00:07:02,160
here is going. Our friend Hunter's
going. That's fun. He's not committed yet,

87
00:07:02,360 --> 00:07:04,600
but he said there are a couple
of things that have to work out, and

88
00:07:04,600 --> 00:07:09,120
then he's there. That's cool.
Yeah, well yeah, I'm committed to

89
00:07:09,160 --> 00:07:13,040
playing Nosferatu and hiding in
the shadows and scaring the people trying to

90
00:07:13,120 --> 00:07:18,480
learn all the basics. Really, I sent in my confirmation, Carl.

91
00:07:19,079 --> 00:07:25,160
I know you didn't ask, but
I sent you that confirmation. Okay,

92
00:07:25,839 --> 00:07:29,480
great. It is Mondays, isn't
it? All right. Well, it might

93
00:07:29,600 --> 00:07:33,160
just be. It might just be.
So, Richard, you got a comment

94
00:07:33,199 --> 00:07:36,920
for us? Who's talking to us? I grabbed a comment off show eighteen forty-

95
00:07:38,000 --> 00:07:43,480
eight, which is the one we
did at Techorama in Antwerp with Dr. Jodie

96
00:07:43,519 --> 00:07:47,160
Burchell. And I figured, as
we were talking about large language models with

97
00:07:47,279 --> 00:07:49,560
her, I suspect we're gonna
talk a little bit about large language

98
00:07:49,560 --> 00:07:55,839
models today. It kind of made
sense. Yeah. And so Nick,

99
00:07:55,920 --> 00:07:59,199
Asagarian said, there's an
excellent episode. I really liked the idea

100
00:07:59,240 --> 00:08:01,959
of using AI to validate and verify
an AI. Yeah, that's an arms

101
00:08:03,040 --> 00:08:05,519
race for you right there. And
I understand why it's never going to be

102
00:08:05,560 --> 00:08:09,879
able to be used long term or
even now. We'll see. I think

103
00:08:09,920 --> 00:08:15,240
one possible regulation would be similar to
other rights. AI-generated visuals like

104
00:08:15,279 --> 00:08:18,439
images or movies should have a watermark
that can be tracked, and players can

105
00:08:18,480 --> 00:08:22,360
display that the images are flagged when
an image has been modified, or show

106
00:08:22,399 --> 00:08:26,759
an info link to the prompt that
generated it. Yeah. I wonder if

107
00:08:26,800 --> 00:08:28,600
AI tools could just detect that in
the first place, whether it has

108
00:08:28,600 --> 00:08:31,519
a watermark or not. They kind of
leave watermarks anyway, because of the way

109
00:08:31,519 --> 00:08:35,639
they make their images. This would
be similar to today's copy machines not being able

110
00:08:35,639 --> 00:08:41,720
to copy currency, or only being able
to display Blu-ray content over an HDMI

111
00:08:41,840 --> 00:08:45,600
content-protected cable. Boy, we
loved that feature. Remember content protection?

112
00:08:45,639 --> 00:08:48,519
That was the best, the best. Look how popular that is today?

113
00:08:50,399 --> 00:08:56,720
There you go. Yeah, bring
back copy protection, Blu-rays. Oh boy.

114
00:08:56,039 --> 00:08:58,519
I think AI overall is going
to have a huge benefit to mankind,

115
00:08:58,679 --> 00:09:03,919
but only if there are appropriate regulations. I think a top-secret committee

116
00:09:03,960 --> 00:09:07,679
could be created where inventors could divulge
their planned inventions to regulators to be ready

117
00:09:07,679 --> 00:09:11,639
at launch, and IP could be
protected via NDAs. I think that's what patents

118
00:09:11,679 --> 00:09:15,879
are for, actually, Nick.
Like, you know, that way you can

119
00:09:16,000 --> 00:09:18,480
divulge the stuff, but if anybody
uses it, they get to pay you

120
00:09:18,960 --> 00:09:22,840
anyway. I mean, I appreciate
that folks are thinking about how we're going

121
00:09:22,919 --> 00:09:26,480
to manage this stuff. I also
think we're in the midst of a really

122
00:09:26,519 --> 00:09:31,600
serious hype cycle that's distorting reality pretty
heavily. Yeah, and then that hype

123
00:09:31,600 --> 00:09:35,679
cycle will pass. Yep. Yeah,
well, so, Nick, thank you so

124
00:09:35,759 --> 00:09:37,240
much for your comment, and a copy
of Music to Code By is on its way

125
00:09:37,240 --> 00:09:39,440
to you. And if you'd like a
copy of Music to Code By, write a

126
00:09:39,440 --> 00:09:43,879
comment on the website at dotnetrocks.com
or on the Facebook. We publish

127
00:09:43,919 --> 00:09:45,960
every show there, and if you
comment there and we read it on the show,

128
00:09:46,000 --> 00:09:48,559
we'll send you a copy of Music
to Code By. And you can follow us

129
00:09:48,559 --> 00:09:50,559
on Twitter if you want to,
but the real fun happens over on Mastodon.

130
00:09:50,799 --> 00:09:54,679
I'm at Carl Franklin at techhub
dot social, and I'm Rich Campbell

131
00:09:54,720 --> 00:09:58,799
at mastodon dot social, and yeah, send us a toot, we like to hear

132
00:10:00,240 --> 00:10:03,360
over there. Hey, before I
introduce Mark, you know, you were

133
00:10:03,399 --> 00:10:09,120
talking about the watermarks there, and
you know how we can identify images.

134
00:10:09,600 --> 00:10:13,919
It's very easy to identify. Just
look for people shaking hands and the fingers

135
00:10:13,919 --> 00:10:18,519
are all fused together, or extra
digits, creepy stuff, or an extra ear,

136
00:10:18,639 --> 00:10:22,000
an extra eye. Yeah, but
that software sure does dive straight into

137
00:10:22,039 --> 00:10:26,919
the uncanny valley, doesn't it?
Like, it's not even the uncanny valley,

138
00:10:26,960 --> 00:10:31,240
it's the oh-my-god valley. All
right. So Mark Miller is here.

139
00:10:31,240 --> 00:10:35,320
He's a seven-year C# Microsoft
MVP, probably a ten-year by now,

140
00:10:35,480 --> 00:10:39,279
right? You gotta update your bio. I was kicked out of the program

141
00:10:39,360 --> 00:10:43,600
for a few years when I was
on the run from the authorities, but

142
00:10:43,759 --> 00:10:48,600
you know, I'm back now.
He's also a leading expert in user interface design,

143
00:10:48,720 --> 00:10:52,759
chief architect of the IDE Tools division at Developer Express. And he streams live

144
00:10:52,840 --> 00:10:56,399
C# coding and design on twitch.tv slash CodeRushed.

145
00:10:56,840 --> 00:11:01,519
Mark has been creating tools for software
developers for four decades. And of course

146
00:11:01,559 --> 00:11:05,679
he's the star of that old show
we used to do together called Mondays,

147
00:11:05,720 --> 00:11:09,399
which you should never let your children
or grandparents listen to. Ever. That's right,

148
00:11:09,879 --> 00:11:15,440
or anybody with an even slightly civilized
nature, you know. Really? Yes,

149
00:11:15,960 --> 00:11:18,559
yes. In fact, the three
of us would like to formally apologize

150
00:11:18,559 --> 00:11:24,360
for Mondays right now again to everybody
who has ever been offended by it.

151
00:11:24,679 --> 00:11:28,279
Again. Yeah, my attorney says I can't apologize because we got pending

152
00:11:28,320 --> 00:11:35,120
litigation about a previous apology that I
messed up. Yeah. All right. So,

153
00:11:35,639 --> 00:11:39,120
Mark, you've had a lot
of thoughts about OpenAI and what they're

154
00:11:39,159 --> 00:11:43,159
doing in AI in general and all
these large language models. I mean,

155
00:11:43,200 --> 00:11:46,279
I'm just gonna open it to the
floor. What the hell is going on

156
00:11:46,360 --> 00:11:50,679
here? Well, you know,
I'm like I'm sitting around on my butt

157
00:11:50,720 --> 00:11:56,039
for about six months this year,
you know, hearing about ChatGPT, maybe

158
00:11:56,039 --> 00:11:58,519
only five months, I suppose,
you know, listening to it, acknowledging

159
00:11:58,519 --> 00:12:05,039
what's there and just kind of shaking
my head watching what's happening as well with

160
00:12:05,200 --> 00:12:09,159
regards to GitHub Copilot.
And you know, ultimately we had made

161
00:12:09,200 --> 00:12:13,559
a decision actually early this year that
DevExpress was not going to jump in

162
00:12:13,559 --> 00:12:18,960
this space because we thought it was
going to require too many people. It

163
00:12:18,039 --> 00:12:22,080
was, you know, kind of
a lot of unknowns, high risk kind

164
00:12:22,080 --> 00:12:24,759
of place to go. Did you
even have any ideas as to what Dev

165
00:12:24,840 --> 00:12:28,440
Express could do to embrace it or
jump in the space? I did.

166
00:12:28,840 --> 00:12:33,759
I did. I had. There's
a feature we have in CodeRush called

167
00:12:33,840 --> 00:12:37,879
duplicate line and duplicate selection, and
basically it duplicates the line and it uses

168
00:12:37,919 --> 00:12:43,759
heuristics to anticipate what you're likely to
change. So if you've got a line

169
00:12:43,759 --> 00:12:46,320
that says width is equal to size
dot width, and you duplicate the line,

170
00:12:48,600 --> 00:12:52,320
you can get height is equal to
size dot height, for example. But that

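As a rough illustration, the duplicate-line heuristic Mark describes could be sketched like this; the class name and the substitution table below are hypothetical, not the actual CodeRush implementation:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical sketch of a "duplicate line" heuristic: when a line is
// duplicated, swap known counterpart identifiers (Width -> Height, etc.)
// to anticipate what the user is likely to change next.
public static class DuplicateLineHeuristic
{
    // Illustrative counterpart pairs; a real tool would use a richer table.
    private static readonly (string From, string To)[] Pairs =
    {
        ("Width", "Height"),
        ("width", "height"),
        ("Left", "Right"),
        ("X", "Y"),
    };

    public static string Duplicate(string line)
    {
        string result = line;
        foreach (var (from, to) in Pairs)
            // Whole-word, case-sensitive replacement.
            result = Regex.Replace(result, $@"\b{from}\b", to);
        return result; // unchanged copy if no counterpart was found
    }
}
```

With this sketch, duplicating `width = size.Width;` yields `height = size.Height;`, matching the example in the conversation.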
171
00:12:52,440 --> 00:12:58,519
particular feature I thought was ripe for
AI exploitation. Yeah, and it was

172
00:12:58,559 --> 00:13:01,879
a unique entry point that nobody else
was going to step on for a while

173
00:13:01,120 --> 00:13:05,559
because it's based on a unique feature
inside CodeRush, but it doesn't seem

174
00:13:05,559 --> 00:13:09,919
like a life-changing, like, do-or-die,
I-gotta-have-that feature,

175
00:13:09,000 --> 00:13:11,320
you know, it doesn't seem like
that, right, I think, you

176
00:13:11,360 --> 00:13:16,399
know, to be fair, I
think there's maybe only one or

177
00:13:16,440 --> 00:13:18,919
two features in CodeRush that are
life-changing. Yeah, I gotta have

178
00:13:20,039 --> 00:13:22,960
it. And they kind of
creep up on you. They're

179
00:13:22,000 --> 00:13:24,399
kind of under the hood. They're
kind of basic features. They've been there

180
00:13:24,440 --> 00:13:31,200
for like twenty years. But there's
a combination of things that are working together

181
00:13:31,279 --> 00:13:35,799
that the sum of which I think
kind of gets you to that point where

182
00:13:35,840 --> 00:13:41,519
you are like, Okay, I
can buy into this idea of working with

183
00:13:41,600 --> 00:13:45,759
less effort and working with less.
That's always been my experience with CodeRush,

184
00:13:45,879 --> 00:13:48,320
right, It's like it's not any
one thing, but when you use

185
00:13:48,360 --> 00:13:50,399
all the things, you do so
much more in the same amount of time.

186
00:13:50,480 --> 00:13:54,000
That's right. Yeah, Yeah,
I think that's fair to say.

187
00:13:54,039 --> 00:13:56,080
I know that, you know,
I'm using the product live on the

188
00:13:56,320 --> 00:14:03,720
Twitch stream, and you know it
enables me to write code and talk about

189
00:14:03,759 --> 00:14:07,679
it at the same time, right?
So I don't need so much cognitive
190
00:14:07,679 --> 00:14:11,360
load to pay attention to the code
I'm writing because I'm using shortcuts or features

191
00:14:11,360 --> 00:14:16,679
things like that. So you're using
GitHub Copilot and CodeRush at the

192
00:14:16,720 --> 00:14:20,679
same time. No, I'm not. I actually don't have GitHub Copilot installed.

193
00:14:20,759 --> 00:14:24,679
You know, you know
me. If somebody says don't do

194
00:14:24,759 --> 00:14:28,559
something, we all agree we're not
gonna do it. What do I do?

195
00:14:28,679 --> 00:14:33,279
Whatever you do, don't do this. In fact, in fact,

196
00:14:33,519 --> 00:14:37,759
management at a company we won't mention, but one that I work for, has

197
00:14:37,879 --> 00:14:43,440
no idea. I'm here telling you
about everything that I've done. Okay,

198
00:14:43,840 --> 00:14:46,320
they'll find out later and I'll be
in trouble, I'm sure. So you

199
00:14:46,360 --> 00:14:50,480
So what you're saying is, A, DevExpress
decided not to jump into the AI

200
00:14:50,559 --> 00:14:56,639
space, and B, Mark Miller decided to
jump into the space with his product at

201
00:14:56,679 --> 00:15:01,480
DevExpress. That's exactly right.
No, and they kind of don't

202
00:15:01,480 --> 00:15:05,200
know. They kind of know,
but they don't know. They don't.

203
00:15:05,240 --> 00:15:07,600
I offered, I said, you want
to see a demo, and they were

204
00:15:07,639 --> 00:15:11,360
like, we're busy, and I'm
like, okay, no demo for you,

205
00:15:11,480 --> 00:15:15,240
all right. I don't see any
downside though, right, I mean,

206
00:15:15,720 --> 00:15:18,320
this is just making CodeRush a
better product. My whole point of

207
00:15:18,360 --> 00:15:20,480
view on this is, I guess,
not so much, oh, I'm gonna

208
00:15:20,480 --> 00:15:24,559
make CodeRush a better product.
What I'm really trying to do is I'm

209
00:15:24,600 --> 00:15:28,519
really trying to get really good features
to developers, and CodeRush is like

210
00:15:28,159 --> 00:15:33,080
the conduit, the medium, through
which I can do that. Yeah,

211
00:15:33,120 --> 00:15:35,039
it's my vehicle. Yes, you're
right, I should have that mentality.

212
00:15:35,080 --> 00:15:39,440
I should like have a product ownership
mentality. But I think primarily I'm really

213
00:15:39,480 --> 00:15:46,080
thinking about is there something we can
do to make developers more productive that's a

214
00:15:46,159 --> 00:15:50,080
significant step that's not going to be
stepped on by anybody else, you know,

215
00:15:50,279 --> 00:15:52,159
that sort of thing. Okay.
So that was the approach.

216
00:15:52,240 --> 00:15:56,519
So let's talk about some of those
features. Sure. I included a link

217
00:15:56,559 --> 00:16:00,759
in the show's description, and that
link is to a video that shows a

218
00:16:00,759 --> 00:16:03,759
little bit of what I'm working on. So if you want a visual and

219
00:16:04,159 --> 00:16:08,759
kind of a motion context, to see
things in movement, in motion, take a

220
00:16:08,799 --> 00:16:11,399
look at that video and you'll have
a stronger sense of what's going on.

221
00:16:11,639 --> 00:16:15,519
But specifically, to answer your question,
Carl, it first started when I started

222
00:16:15,519 --> 00:16:19,519
interacting with OpenAI on the playground, and I started, you know,

223
00:16:19,559 --> 00:16:26,639
asking questions, you know, anything wrong
with this method? And OpenAI comes

224
00:16:26,679 --> 00:16:29,279
back, no, this method is
fine and it's going to work correctly.

225
00:16:29,279 --> 00:16:32,600
And the method was a method that
took a first name and a last name

226
00:16:32,960 --> 00:16:36,919
and concatenated them together with a space
between the two. So if I passed in

227
00:16:37,120 --> 00:16:40,519
Mark and Miller, it would come
back with Mark Miller with the space between

228
00:16:40,559 --> 00:16:44,559
the two, the first name and last
name. Are you specifically talking about Chat

229
00:16:44,600 --> 00:16:48,600
GPT here, or are you using another
interface? ChatGPT. I'm on the Open

230
00:16:48,639 --> 00:16:52,720
AI playground, which is essentially using
the ChatGPT engine. Okay,

231
00:16:52,879 --> 00:16:56,080
that's what's happening. So, and
it says nothing's wrong with the method.

232
00:16:56,120 --> 00:16:57,879
And I say, well, what
happens if I pass in an empty

233
00:16:57,960 --> 00:17:02,159
last name? And it says,
well, you're going to get back you

234
00:17:02,240 --> 00:17:07,279
know, the first name and it
actually put in quotes Mark with a space

235
00:17:07,319 --> 00:17:10,720
at the end of it in quotes. And I said, the space is

236
00:17:10,759 --> 00:17:12,839
a problem. And it
said, oh, I see. And

237
00:17:12,880 --> 00:17:17,160
I said, can you generate a
test case for me that tests against this

238
00:17:17,240 --> 00:17:21,039
particular problem? And it
popped it out like that, and I

239
00:17:21,240 --> 00:17:25,240
and the test was well-named.
It did in fact test what

240
00:17:25,400 --> 00:17:29,920
what I was asking
for. And after that experience, I

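In C#, the method and the edge case Mark walks through might look like the sketch below; the names (`NameFormatter`, `FullNameNaive`, `FullName`) are made up for illustration, and the second method is one possible fix for the trailing-space bug the generated test was guarding against:

```csharp
using System;
using System.Linq;

public static class NameFormatter
{
    // The naive version ChatGPT initially called fine: with an empty
    // last name it returns "Mark " with a trailing space.
    public static string FullNameNaive(string firstName, string lastName) =>
        firstName + " " + lastName;

    // One possible fix: skip empty parts so no stray separator remains.
    public static string FullName(string firstName, string lastName) =>
        string.Join(" ", new[] { firstName, lastName }
            .Where(part => !string.IsNullOrEmpty(part)));
}
```

A well-named unit test in the spirit of the one ChatGPT generated would then assert that `FullName("Mark", "")` returns `"Mark"` with no trailing space, while `FullName("Mark", "Miller")` still returns `"Mark Miller"`.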
241
00:17:30,000 --> 00:17:33,079
started thinking, Okay, what can
we do here that can help people out?

242
00:17:33,440 --> 00:17:37,640
And I'll say this. My conclusion now,
I've been working with it for about

243
00:17:37,680 --> 00:17:42,960
two months, and my conclusion is
that AI, specifically ChatGPT, is like

244
00:17:42,960 --> 00:17:48,200
a wandering child. Yes, okay. I couldn't agree more. It's

245
00:17:48,279 --> 00:17:52,359
smart enough to give you some really
high quality answers, but sometimes it'll kind

246
00:17:52,359 --> 00:17:56,799
of go astray a bit. Yeah, it could either mix stuff up or

247
00:17:56,960 --> 00:18:00,000
just be confident that, you know,
this works, and it doesn't. Yeah,

248
00:18:00,000 --> 00:18:04,240
oh right. Yeah. We had
a massive argument with ChatGPT live

249
00:18:04,319 --> 00:18:08,920
on the stream, where ChatGPT was
claiming things were real and existed,

250
00:18:10,359 --> 00:18:12,039
and we were like, no,
they're not. It was like, well,

251
00:18:12,079 --> 00:18:17,079
I'm sorry you think that. It was
very patronizing. I'm sorry you think

252
00:18:17,160 --> 00:18:18,759
that, Mark, but I've looked
up some of the shows you've done in

253
00:18:18,759 --> 00:18:22,319
the past and I don't think you're
qualified at all. Right, I think

254
00:18:22,359 --> 00:18:25,880
that's... it didn't say that, did
it? No, it didn't say that

255
00:18:26,000 --> 00:18:29,720
last part. Pure comedy right there,
that was, God, pure comedy. It's

256
00:18:30,759 --> 00:18:34,720
so hard to tell though. There's
a lot of anthropomorphization going on with some

257
00:18:34,839 --> 00:18:41,759
software here. I know Richard doesn't
like the anthropomorphization of ChatGPT,

258
00:18:41,839 --> 00:18:45,359
but I think it's funny though.
I mean, it's hard to tell what's

259
00:18:45,720 --> 00:18:51,240
satire here and what's real, because
ChatGPT is really, like he said,

260
00:18:51,240 --> 00:18:55,200
it's a small child. It's like
a wandering child. And so part

261
00:18:55,200 --> 00:18:57,680
of the challenge in getting from point
A to point B, in other words,

262
00:18:57,680 --> 00:19:03,240
point A, idea for features, and point
B, execution on features that actually really

263
00:19:03,279 --> 00:19:11,119
consistently work is learning how to take
that small child and corral it, get

264
00:19:11,119 --> 00:19:15,000
it to run down the path you
want it to run. And then also,

265
00:19:15,640 --> 00:19:19,559
you know, take a look at
what it's producing and contour it. There are

266
00:19:19,559 --> 00:19:23,720
a number of pieces to that. It's
kind of like the

267
00:19:23,799 --> 00:19:32,519
idea of like looking down on cattle
that are moving from one place to another

268
00:19:32,519 --> 00:19:36,079
but not seeing any fences. And
that's kind of what we're doing, is

269
00:19:36,079 --> 00:19:40,960
we're taking the fences and hiding those
fences from the users. I consider it

270
00:19:41,160 --> 00:19:47,680
like a junior developer that knows some
things and sometimes gets things right, but

271
00:19:48,119 --> 00:19:52,000
you have to double-check the work. But it also speaks very confidently the whole

272
00:19:52,000 --> 00:19:55,440
time, like all its work is perfect. Oh yeah, absolutely. Yeah.

273
00:19:55,480 --> 00:19:57,920
And if you think of it as,
like, a rubber duck, now that

274
00:19:59,240 --> 00:20:02,480
it can be useful for, like,
let me bounce this problem off you, and

275
00:20:02,519 --> 00:20:06,000
it might come up with
one or two suggestions that you hadn't thought

276
00:20:06,039 --> 00:20:08,759
of. But, you know, it
probably won't come up with

277
00:20:08,799 --> 00:20:12,680
the actual code that you need.
That has not been

278
00:20:12,680 --> 00:20:18,720
my experience in working with it
and in using this. It's been hit or

279
00:20:18,759 --> 00:20:22,920
miss. Yeah. But realize, though, Carl, we

280
00:20:22,039 --> 00:20:26,279
are doing a couple of things that
you're probably not doing if you're just in

281
00:20:26,359 --> 00:20:30,559
ChatGPT alone. And
one of the things that we're doing is

282
00:20:30,680 --> 00:20:37,839
we are providing a really strong context
to the questions that's hidden. That

283
00:20:37,960 --> 00:20:44,160
context is hidden from the user, or
it's kind of collapsed. They can expand

284
00:20:44,200 --> 00:20:47,599
it out to see it. For
example, I can be inside of Visual

285
00:20:47,640 --> 00:20:51,400
Studio and I can say, Hey, what do you think about this method?

286
00:20:51,839 --> 00:20:53,599
And that's all I have to say, And then now I can get

287
00:20:53,599 --> 00:20:56,119
analysis on the method. I can
get a description of what it does.

288
00:20:56,839 --> 00:21:03,160
Right. I find that when I'm
looking at other people's source code or something

289
00:21:03,279 --> 00:21:10,200
really big that I don't quite understand
yet. Starting from an AI analysis is

290
00:21:10,240 --> 00:21:15,720
incredibly productive. It makes me much
faster because now I have something to compare

291
00:21:15,759 --> 00:21:19,359
against as I go through and look
at the code. The AI analysis might

292
00:21:19,400 --> 00:21:22,920
not be one hundred percent correct,
as you, you know, as you pointed

293
00:21:22,920 --> 00:21:27,200
out, but it now at
least gets me started, whereas before I'm

294
00:21:27,240 --> 00:21:30,039
just looking at things and I don't
quite see them yet. Right. It's

295
00:21:30,240 --> 00:21:33,519
like having somebody to bounce it off
of. Yeah, and the AI can

296
00:21:33,559 --> 00:21:37,039
instantly tell you what's going on in
that method. It can instantly come back

297
00:21:37,160 --> 00:21:41,839
or much faster than I can,
come back and say, look at this,

298
00:21:41,920 --> 00:21:42,920
and look at this, watch for
this. This is what I think

299
00:21:44,000 --> 00:21:47,960
is happening. This is what the
method appears to do, and that I

300
00:21:48,000 --> 00:21:52,160
find, yeah, really useful. So, and it's never been, like, catastrophically

301
00:21:52,200 --> 00:21:57,839
incorrect. It was only when I
was using the playground that it was maybe not

302
00:21:57,920 --> 00:22:03,599
catastrophically, but absolutely, incorrect. It said
that there was a C# way of

303
00:22:04,359 --> 00:22:11,079
specifying ignore-case in the Regex
instruction string, but there's not. It's

304
00:22:11,079 --> 00:22:17,279
a parameter to creating the new Regex.
But it was claiming that it

305
00:22:17,319 --> 00:22:21,559
was there. It was conflating it
with another language, and, you

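For reference, the constructor parameter Mark is referring to is `RegexOptions.IgnoreCase`, shown in the minimal sketch below. (As an aside, .NET's regex engine does also accept the inline `(?i)` modifier inside the pattern string, so an answer mentioning an in-pattern form is not entirely baseless.)

```csharp
using System;
using System.Text.RegularExpressions;

class RegexIgnoreCaseDemo
{
    static void Main()
    {
        // Case-insensitivity passed as an option when constructing the Regex.
        var rx = new Regex("hello", RegexOptions.IgnoreCase);
        Console.WriteLine(rx.IsMatch("Say HELLO")); // True

        // The equivalent inline form inside the pattern itself.
        Console.WriteLine(Regex.IsMatch("Say HELLO", "(?i)hello")); // True
    }
}
```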
306
00:22:21,559 --> 00:22:22,759
know, we were going back and
forth on that to the point that I

307
00:22:22,799 --> 00:22:26,960
was like, Okay, this guy
is not backing down. But that

308
00:22:27,119 --> 00:22:32,720
was in the playground and when I've
since put it in the kind of the

309
00:22:32,799 --> 00:22:36,720
corral where I've got these railings kind
of you know, set up, I

310
00:22:36,799 --> 00:22:38,720
have not seen that. In fact, you know, for me,

311
00:22:38,759 --> 00:22:45,000
one of the features that I've
been actually using on a daily basis that

312
00:22:45,039 --> 00:22:49,200
I demonstrated on that video is automatic
generation of XML doc comments. To get

313
00:22:49,240 --> 00:22:55,240
that, there's a prompt that starts
it out that's, you know, setting

314
00:22:55,319 --> 00:22:57,440
up all the rails right right,
that says, here's what I want,

315
00:22:57,480 --> 00:23:00,480
here's what I don't want, that
sort of thing. Well, and I

316
00:23:00,519 --> 00:23:03,240
really like that you're
not asking it to write code. You're

317
00:23:03,240 --> 00:23:07,119
asking it to write a good description
of this code, right. And also

318
00:23:07,240 --> 00:23:11,279
realize that in the case of an
XML doc comment, if it's wrong,

319
00:23:11,599 --> 00:23:15,200
the part you have to fix is
effortless, and the part that it gets

320
00:23:15,319 --> 00:23:18,759
right, which are the actual
crefs with the links, which are impossible

321
00:23:18,799 --> 00:23:23,640
for a human being to write themselves
because they take forever to write, and

322
00:23:23,839 --> 00:23:29,000
the exception analysis as well, those
things it gets and so you get that,

323
00:23:29,119 --> 00:23:32,160
you get the return types. And
the other thing is in addition to

324
00:23:32,160 --> 00:23:36,200
that corralling, we give it context. So we check to see, for

325
00:23:36,240 --> 00:23:38,039
example, what are the types that
are used inside there, what are the

326
00:23:38,039 --> 00:23:44,960
method calls inside there? And we
give ChatGPT more information on those types

327
00:23:45,279 --> 00:23:48,119
right right, and those method calls
that are going in, so that when

328
00:23:48,119 --> 00:23:52,559
it comes back with the description,
it has more information than it could

329
00:23:52,599 --> 00:23:56,880
have from just
the text of the method alone. Yeah,

330
00:23:56,119 --> 00:24:00,480
and as you've already described, like
this is laborious to do by hand,

331
00:24:00,759 --> 00:24:04,079
right, but the bot can knock
these parts out, and they're

332
00:24:04,200 --> 00:24:07,160
very verifiable. You can go
and chase them down if you need to.
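
The corralling and context-gathering Mark describes, a prompt that sets up the rails plus summaries of the types and calls inside the method, could be sketched roughly like this (a hypothetical Python illustration, not the actual CodeRush implementation; all names are invented):

```python
def build_doc_comment_prompt(method_source, type_summaries, call_summaries):
    """Assemble a 'corralled' prompt for generating an XML doc comment.

    type_summaries and call_summaries are short descriptions of the types
    and method calls found inside the target method -- context the model
    could not infer from the method text alone.
    """
    rails = (
        "Write an XML doc comment for the C# method below.\n"
        "Rules: use <summary>, <param>, <returns>, <exception> and "
        "<see cref=.../> where appropriate. Do not restate the code. "
        "Do not add cref tags outside of the summary, param, returns, "
        "or remarks elements.\n"
    )
    context = "\n".join(
        ["Context on types used in the method:"]
        + [f"- {t}" for t in type_summaries]
        + ["Context on methods called:"]
        + [f"- {c}" for c in call_summaries]
    )
    return f"{rails}\n{context}\n\nMethod:\n{method_source}"
```

The rails state both what is wanted and what is not, which is also what makes the response easy to check and clean up afterwards.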

333
00:24:07,440 --> 00:24:11,119
Yes, that's the thing is
that it is. It's an area

334
00:24:11,160 --> 00:24:15,039
where it looks like nobody
else is trying this yet. You know,

335
00:24:15,079 --> 00:24:18,160
Copilot doesn't appear to be doing this
yet. I think the reason they're

336
00:24:18,200 --> 00:24:22,200
not is because when you first try
it, the results are inconsistent. It's

337
00:24:22,200 --> 00:24:27,160
the wandering child. Yeah, and
so a lot of the effort is about

338
00:24:27,160 --> 00:24:32,279
getting the child pointed in the right
direction, and then when it comes back,

339
00:24:32,960 --> 00:24:37,160
checking to make sure everything is as
you ask for and cleaning up anything

340
00:24:37,200 --> 00:24:41,680
that comes in, because sometimes ChatGPT
likes to be a little flowery and

341
00:24:41,720 --> 00:24:45,519
maybe add things that do not make
sense right, add maybe, for example,

342
00:24:45,559 --> 00:24:48,559
a bunch of crefs, you
know, at the end, outside

343
00:24:48,599 --> 00:24:53,559
of the param descriptions,
outside of the remarks descriptions, right outside

344
00:24:53,559 --> 00:24:57,839
of anything. That's a common thing that
ChatGPT will do. And so you

345
00:24:57,880 --> 00:25:03,079
clean up these kinds of pieces that are
this extra noise, and what you end

346
00:25:03,160 --> 00:25:08,559
up getting is something that's pretty high
quality. The other component that really

347
00:25:10,000 --> 00:25:15,920
initiated my reaching out to you guys
is that to make this work right,

348
00:25:15,279 --> 00:25:22,480
we created a UI that's essentially a
representation of the AI working in your code.
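
Several agent instances working alongside you in the same file can be pictured as independent concurrent tasks, each owning one region of the file (a toy asyncio sketch of the idea, not the actual implementation; the agent here is a stub):

```python
import asyncio

async def doc_comment_agent(method_name):
    # Stand-in for one agent instance: the real round-trip to the
    # model is simulated with a yield to the event loop.
    await asyncio.sleep(0)
    return method_name, f"/// <summary>Generated docs for {method_name}.</summary>"

async def annotate_file(method_names):
    # Each agent runs as its own task; results come back per method,
    # ready to insert at each site while the editor stays responsive.
    pairs = await asyncio.gather(*(doc_comment_agent(m) for m in method_names))
    return dict(pairs)

comments = asyncio.run(annotate_file(["Save", "Load", "Validate"]))
```

Because each task is independent, one agent finishing late never blocks the others, which matches the "multiple instances working in the same file" behavior described here.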

349
00:25:22,119 --> 00:25:25,920
So when you say I want to
create an XML doc comment for this

350
00:25:25,960 --> 00:25:30,000
method, you get a little animated,
you know, composing indicator right there where the

351
00:25:30,119 --> 00:25:33,200
XML doc comment will go, and
you're free to continue to edit inside the

352
00:25:33,200 --> 00:25:37,319
code. In fact, you're free. You'll see this on the video if

353
00:25:37,319 --> 00:25:41,559
you watch it. You're free to
execute this feature many times over right,

354
00:25:41,599 --> 00:25:47,519
and so you can actually see in
the video multiple instances of the code agent

355
00:25:47,680 --> 00:25:53,240
working alongside me inside the same file. And it's a peek into what I

356
00:25:53,279 --> 00:25:59,319
think the future of AI code agents
can be, where I might have a

357
00:25:59,359 --> 00:26:03,240
code agent do some serious analysis of
you know, kind of a deep dive

358
00:26:03,279 --> 00:26:11,480
analysis of my architecture while I'm having
another AI agent generating

359
00:26:11,480 --> 00:26:15,119
test cases, generating XML doc comments, that sort of thing, while I'm

360
00:26:15,160 --> 00:26:18,599
going through and reviewing and learning and
looking at the code. All of

361
00:26:18,640 --> 00:26:22,240
that could be happening in a single
file if you wanted to. Wait,

362
00:26:22,319 --> 00:26:25,680
what you're describing reminds me of like
Carl and I working in a Google doc

363
00:26:25,720 --> 00:26:30,759
at the same time, right,
Like yeah, very sexy too, Like

364
00:26:30,799 --> 00:26:34,960
it's thrilling when the agent
kicks off and works on

365
00:26:34,960 --> 00:26:38,240
that problem while you're continuing on another
problem, and another agent kicks off here

366
00:26:38,240 --> 00:26:44,079
and works on this next piece,
Like that's pretty cool. It's almost impeded

367
00:26:44,119 --> 00:26:48,880
by the whole AI piece. Like
this is just an interesting idea that agents

368
00:26:48,880 --> 00:26:52,359
that evaluate your code along the way. The fact that it happens to use

369
00:26:52,440 --> 00:26:55,960
large language models is sort of
ancillary to the point. So here's

370
00:26:55,960 --> 00:27:00,359
another thing to think about, is
that these things could walk towards just becoming

371
00:27:00,400 --> 00:27:06,599
too intrusive to your code experience.
Sometimes I like to just look at some

372
00:27:06,640 --> 00:27:10,359
code and think and just stare at
the screen without things happening, you know,

373
00:27:11,000 --> 00:27:15,000
Whereas if I'm if I haven't completed
my thought and I'm writing some code,

374
00:27:15,440 --> 00:27:18,319
you know, I don't want some
AI in there going oh you should

375
00:27:18,319 --> 00:27:19,079
do that. See yeah, I
was going to do that, but I'm

376
00:27:19,160 --> 00:27:22,480
also thinking about this, like I
don't want to have those conversations. I

377
00:27:22,599 --> 00:27:25,880
would like you to shut up and go away
and then let me think. Well,

378
00:27:25,880 --> 00:27:27,440
the moment it moves my cursor
in any way, I'm going to kill

379
00:27:27,480 --> 00:27:32,440
it right like, or you know, or it starts putting in that comment

380
00:27:32,519 --> 00:27:34,680
like, it better be shifting it
in the right direction. Those are UX

381
00:27:34,799 --> 00:27:38,799
artifacts. Yeah,
but yeah, just a warning. It's

382
00:27:38,839 --> 00:27:41,720
And I would argue CodeRush
is brilliant at this. Yeah,

383
00:27:42,160 --> 00:27:48,400
don't interrupt me, Like I've never
seen a product better at not interrupting me

384
00:27:48,480 --> 00:27:51,400
than CodeRush. Yeah. Like, generally
speaking, when it takes an action,

385
00:27:51,960 --> 00:27:55,839
you're delighted by it. You can
look around and look for those context clues

386
00:27:55,880 --> 00:28:00,599
and those, you know, icons
and colors and things, and interact with

387
00:28:00,640 --> 00:28:03,720
them if you want. But
you're right, it doesn't. So thanks for

388
00:28:03,799 --> 00:28:07,559
that, Mark. I just want
to you know, respond to the implied

389
00:28:07,720 --> 00:28:14,759
concern and the you know, the
answer is that already with the current

390
00:28:14,960 --> 00:28:18,079
you know feature, the XML doc
comment generator, there's no shifting of the

391
00:28:18,160 --> 00:28:22,599
view when the insertion occurs, no
moving of the caret, no changing of selection.

392
00:28:22,960 --> 00:28:27,799
You are completely free and fully powerful
to do editing and invoke other features including

393
00:28:27,799 --> 00:28:32,960
IntelliSense and IntelliCode, whatever you want, during
that time. So we've kind of already

394
00:28:32,960 --> 00:28:37,319
got that. Also with regards to
the don't distract me, you know,

395
00:28:38,119 --> 00:28:41,799
we hear you loud and clear on
that, Carl, and that is not the case.

396
00:28:41,200 --> 00:28:45,079
These features are only invoked
if you ask for them, and to

397
00:28:45,160 --> 00:28:48,799
ask for them you have to specifically
do it. For example, if I

398
00:28:48,880 --> 00:28:56,799
want to enable voice commands, I
have to opt in by specifying an Azure Cognitive

399
00:28:56,839 --> 00:29:00,599
Services Speech API key, and if
I put that in there, I can

400
00:29:00,640 --> 00:29:03,319
then choose options like if I hold
the control key down, it'll listen.

401
00:29:03,799 --> 00:29:07,960
So I can hold the control key
down kind of like a microphone on button,

402
00:29:07,359 --> 00:29:11,160
and I can lean into the microphone,
talk, and then release the key.

403
00:29:11,519 --> 00:29:15,559
So is that working right now?
Is that something? Yes? Yeah,

404
00:29:15,640 --> 00:29:18,039
yeah, it's working right now.
In fact, I've got it on my

405
00:29:18,160 --> 00:29:25,319
machine. Left control key speaks
to the agent, right control key executes

406
00:29:25,440 --> 00:29:27,440
voice commands. Okay, is that
a CodeRush thing or is that just

407
00:29:27,480 --> 00:29:33,640
in Visual Studio? A CodeRush thing, and
it's an AI thing we're actually using.

408
00:29:33,640 --> 00:29:37,599
You have the ability to specify your
voice command, like I have a voice

409
00:29:37,599 --> 00:29:41,200
command I added today called clear chat, and I have a checkbox under that

410
00:29:41,200 --> 00:29:48,720
that says allow semantic equivalents, and
it's checked by default, so I can

411
00:29:48,720 --> 00:29:51,720
say, hey Dex, clear the
chat while I'm holding down the right

412
00:29:51,759 --> 00:29:56,599
control key. I no longer have
to remember the exact phrasing and say it

413
00:29:56,960 --> 00:30:00,519
for voice commands to work, right.
And as it's CodeRush, there are

414
00:30:00,759 --> 00:30:06,119
contexts that are bound to this.
So some commands will only work if you're,

415
00:30:06,119 --> 00:30:08,359
for example, debugging, and it
makes sense for the debugging commands to

416
00:30:08,400 --> 00:30:14,240
work right. Yeah, that makes
sense. Now, hang on, that's

417
00:30:14,279 --> 00:30:18,319
my favorite. Whoa whoa whoa whoa.
Wait, well, you're just sort

418
00:30:18,359 --> 00:30:21,559
of tossing out the voice control part
because for the most part, we haven't

419
00:30:21,599 --> 00:30:25,599
seen a lot of voice and chat
GPT. I know there's a few folks

420
00:30:25,759 --> 00:30:30,200
tinkering with it. But and you, I don't think you've used voice in

421
00:30:30,759 --> 00:30:36,039
CodeRush before either. So we're
just casually saying this, but like you're

422
00:30:36,039 --> 00:30:40,279
introducing a whole new user interface.
Yeah. Yeah. The reason we've never

423
00:30:40,319 --> 00:30:45,799
done voice commands before is because the
interface sucks horribly, right. There's

424
00:30:45,839 --> 00:30:51,559
just no way to solve the problem
if you don't have

425
00:30:51,599 --> 00:30:56,000
AI, and
also you don't have like a back end

426
00:30:56,039 --> 00:31:00,599
like Azure Cognitive Services helping out,
right. Okay. Azure Cognitive Services is

427
00:31:00,920 --> 00:31:07,960
fast and nicely accurate, really loving
it. And for speech recognition?

428
00:31:08,160 --> 00:31:14,640
For speech recognition, exactly. Writing the
feature, the voice command feature took

429
00:31:14,680 --> 00:31:18,680
like two days, maybe. It was
not hard to build, right. We

430
00:31:18,799 --> 00:31:22,480
take in the voice information,
we send it out to OpenAI with

431
00:31:22,599 --> 00:31:32,000
a specific corralled message including, uh,
well, this is still to be

432
00:31:32,279 --> 00:31:37,440
done what I'm about to say,
but including symbols that are in scope

433
00:31:38,160 --> 00:31:42,880
because I also want to add voice
dictation of comments. So is this

434
00:31:42,920 --> 00:31:48,519
Whisper you're talking about, on Open
AI? No, no, I'm not

435
00:31:48,599 --> 00:31:52,799
using whisper. I mean, I
think whisper. I'm only barely familiar with

436
00:31:52,799 --> 00:31:56,000
whisper. I think it's I think
it's speech recognition, isn't it? Is

437
00:31:56,000 --> 00:31:57,880
that true? Or no? Or
just speech to text? Speech to text.

438
00:31:59,079 --> 00:32:01,759
Yeah, no, I'm using
Azure Cognitive Services speech to text. Okay.

439
00:32:01,799 --> 00:32:07,720
The last time I used cognitive services, the technology they were using was

440
00:32:07,799 --> 00:32:14,839
LUIS, the language understanding service or something
like that, and you had to do

441
00:32:14,960 --> 00:32:19,319
quite a bit of work in order
to get it to figure out what the

442
00:32:19,519 --> 00:32:24,480
operative words are and recognize them within
some sort of context. I'm assuming that

443
00:32:24,519 --> 00:32:28,839
has changed, because they sort
of shut that down, yep, and

444
00:32:28,880 --> 00:32:31,000
now they have some AI
behind it, I guess. So I'm

445
00:32:31,039 --> 00:32:35,759
not familiar with what's changed, but
I will say that we are able to

446
00:32:35,799 --> 00:32:39,519
add our own phrases to recognize and
that's where the symbols come in for the

447
00:32:39,519 --> 00:32:44,640
purposes of dictation, right, dictating
comments, things like that, which I'm

448
00:32:44,640 --> 00:32:50,000
still interested in implementing. But
yeah, my experience with it is

449
00:32:50,039 --> 00:32:54,079
that it's like well over ninety percent
accurate without any tweaking to the models or

450
00:32:54,119 --> 00:32:57,880
whatever, right, without tweaking,
And if I tweak it, I can

451
00:32:57,880 --> 00:33:01,000
then basically, what it gets wrong is names. Yeah, like essentially that's what it

452
00:33:01,039 --> 00:33:06,160
gets most. Like,
our AI agent is called Dex. And

453
00:33:06,200 --> 00:33:08,079
if I say hey, Dex,
and I don't give it any context,

454
00:33:08,200 --> 00:33:13,759
it might say headaches, or it
might say, hey,

455
00:33:13,920 --> 00:33:17,400
comma, D E C K S instead
of D E X. So what I

456
00:33:17,440 --> 00:33:21,559
do is I say, well,
there's a character named Dex. I give

457
00:33:21,559 --> 00:33:24,319
it phraseology, I give it things
like C sharp, I give it things

458
00:33:24,319 --> 00:33:28,160
that we're going to talk about in
code. So you give it some proper

459
00:33:28,240 --> 00:33:30,839
names and say that these are some
of the names that you can recognize.

460
00:33:31,039 --> 00:33:34,920
That's an old trick. You
know that we've been doing that in speech

461
00:33:35,000 --> 00:33:37,720
recognition forever. But the key
that makes this whole thing work,

462
00:33:37,799 --> 00:33:45,240
Carl, is that having ninety five
percent accuracy right is all I need

463
00:33:45,319 --> 00:33:47,960
when I'm talking to chat GPT.
If I'm talking to the agent and I

464
00:33:49,000 --> 00:33:52,839
get it ninety five percent correct right, and I might have like a flub

465
00:33:52,839 --> 00:33:57,160
in there or something like that,
it doesn't matter because chat GPT understands what

466
00:33:57,319 --> 00:34:00,279
my intent is and gives me the
answer to the question. I love it.
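
Why ninety-five percent recognition accuracy is enough: the command layer only needs to recover intent, not the exact phrase. A rough sketch of that two-step resolution, an exact match on a normalized phrase first, with anything else handed to the model to judge semantic equivalence (the model call is stubbed; the names are hypothetical, not the actual Dex internals):

```python
def normalize(phrase):
    # Lowercase, trim punctuation, collapse whitespace.
    return " ".join(phrase.lower().strip(" .,!?").split())

def resolve_command(spoken, commands, ask_model=None):
    """Resolve a spoken phrase to a registered command handler.

    Tries a cheap normalized exact match first; otherwise defers to
    ask_model(spoken, candidates), a stand-in for the ChatGPT round-trip
    that picks the semantically equivalent command name, or None.
    """
    text = normalize(spoken)
    if text in commands:
        return commands[text]
    if ask_model is not None:
        match = ask_model(spoken, list(commands))
        if match in commands:
            return commands[match]
    return None
```

A flubbed word falls through the exact match and lands in the semantic path, which is exactly why small recognition errors stop mattering.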

467
00:34:00,559 --> 00:34:05,359
Hey, let's pause here for just
a brief message or two and we'll

468
00:34:05,400 --> 00:34:10,599
be right back. Hey, we're
back. It's dot net Rocks.

469
00:34:10,599 --> 00:34:15,119
I'm Carl Franklin. That's Richard Campbell
over there. Hey, and he's not

470
00:34:15,199 --> 00:34:19,760
responsible for the wildfires, let's get
that straight. And that's Mark Miller.

471
00:34:19,800 --> 00:34:22,679
Of course. He's talking about all
these cool AI features that he's put in

472
00:34:22,719 --> 00:34:25,079
code Rush, and he's had a
lot of thoughts about AA in general,

473
00:34:25,440 --> 00:34:30,320
and we were just talking about this, the speech recognition technology. Again,

474
00:34:30,440 --> 00:34:32,760
this is a feature that's in CodeRush.
This isn't a Visual Studio kind

475
00:34:32,800 --> 00:34:37,719
of thing. Visual Studio doesn't
have anything like this, does it?

476
00:34:37,119 --> 00:34:40,760
No, not built in. There may be plugins that you can

477
00:34:40,800 --> 00:34:45,599
get, but you know, there's
a couple

478
00:34:45,639 --> 00:34:47,960
of commands that I've added, and
I haven't added a whole bunch because I'm

479
00:34:49,000 --> 00:34:52,920
still very fast with my fingers. Yeah. But one of the commands that I

480
00:34:52,960 --> 00:34:55,559
added is new folder, which I
really like. Nice. I like that,

481
00:34:55,880 --> 00:35:00,440
and semantically, it allows semantic equivalents,
so I can say, you know,

482
00:35:00,559 --> 00:35:02,920
let's create a new folder. Now. Nice. You know, I

483
00:35:02,960 --> 00:35:06,199
want a new folder. Give me
a new folder. Give me folder,

484
00:35:06,239 --> 00:35:08,639
new folder. You know folder that
is new and then it highlights it so

485
00:35:08,719 --> 00:35:12,000
you can just type the name,
right, yeah, it highlights it,

486
00:35:12,119 --> 00:35:15,519
you just type in the name.
Also, you know, show watch window number

487
00:35:15,519 --> 00:35:17,119
one. Yeah, you know,
something along those lines, right, those

488
00:35:17,239 --> 00:35:22,760
kinds of pieces I've built commands for
because I don't like reaching for the mouse

489
00:35:22,920 --> 00:35:28,079
going into, you know, Debug,
Windows, Watch, and then Watch one, that's like

490
00:35:28,159 --> 00:35:30,519
four menus. Yeah, of course, you know. It's the worst is

491
00:35:30,559 --> 00:35:35,000
like the package manager window. If
it's not already at the bottom of the

492
00:35:35,159 --> 00:35:38,039
screen, it's hidden.
You got to go to Windows, Other Windows,

493
00:35:38,079 --> 00:35:42,599
and then find the Package Manager.
It doesn't even have a keyboard shortcut.

494
00:35:42,840 --> 00:35:46,440
So really obscure things like those things
might be interesting for us to ship with

495
00:35:46,519 --> 00:35:51,440
voice commands out of the box. But voice commands, I

496
00:35:51,440 --> 00:35:54,039
think it's you know, even though
I'm leaning in this direction and I'm starting

497
00:35:54,079 --> 00:35:59,760
to use them, I still think
that for the vast amount of coding that

498
00:35:59,800 --> 00:36:01,440
I'm doing, it's gonna be keyboard
driven. Yeah, I'm gonna be you

499
00:36:01,440 --> 00:36:05,760
know, coding a navigation still off
the keyboard, and we're back to the

500
00:36:05,800 --> 00:36:07,760
same old thing of taking your hand
off the keyboard to get to the

501
00:36:07,800 --> 00:36:12,159
mouse is dumb, yeah, right, like it slows you down. The

502
00:36:12,239 --> 00:36:16,119
idea of voice as that additional interface, like this is the problem with Visual

503
00:36:16,159 --> 00:36:21,280
Studio as a whole. Right,
the whole cockpit of the seven forty seven

504
00:36:21,360 --> 00:36:23,320
is stuffed in those menu bars like
it's all in there. You just can't

505
00:36:23,360 --> 00:36:27,400
freaking find it half the time.
Windows in general, the idea that you

506
00:36:27,400 --> 00:36:30,920
could just say I need to do
X or do this for me and it

507
00:36:30,960 --> 00:36:35,119
does the navigation give me more of
that? Yeah, yeah, can you

508
00:36:35,119 --> 00:36:39,400
write a plug in for Adobe Premiere
now once you're done with that would be

509
00:36:39,440 --> 00:36:44,119
so great? Yeah, I know, you know, speaking of which,

510
00:36:44,119 --> 00:36:46,559
are there any folks from Adobe?
Okay, guys, we've got to fix

511
00:36:46,599 --> 00:36:51,679
your UI. And I am an
avid user of a number of your products,

512
00:36:51,679 --> 00:36:53,159
and you get me on the phone
and I'm gonna get you. We'll

513
00:36:53,159 --> 00:36:57,199
do a call and I'm gonna get
you. We're gonna fix a number of

514
00:36:57,280 --> 00:37:00,360
problem spaces in your UI. They're
easy fixes. I know, Carl and Richard

515
00:37:00,400 --> 00:37:02,960
are both like, Mark, what
are you doing? What are you talking

516
00:37:02,960 --> 00:37:07,039
about? This feels like Mondays again. Right, We're gonna we're gonna need

517
00:37:07,079 --> 00:37:10,079
the lawyers again, and there's
gonna be a series of poorly made apologies

518
00:37:10,119 --> 00:37:15,320
that will require more. Boy,
I've seen this playbook, like this is

519
00:37:15,360 --> 00:37:17,920
not the first time down this rodeo. Yeah. Actually to that point,

520
00:37:17,960 --> 00:37:23,000
Mark, like all of this voice
interface stuff, whether it's spoken or written,

521
00:37:23,719 --> 00:37:27,400
seems to be an alternative to the
GUI. Like, you know what's

522
00:37:27,440 --> 00:37:30,679
cheaper than trying to fix that flipping
GUI? Provide a new interface so you

523
00:37:30,679 --> 00:37:32,679
don't need it. Yeah, that's
interesting. Well, you also get in

524
00:37:32,679 --> 00:37:37,079
this other problem, which is you
have an entrenched customer base that knows where

525
00:37:37,119 --> 00:37:40,960
their stuff is that gets really angry
with you every time you move the cheese,

526
00:37:42,199 --> 00:37:44,800
right, like, anytime you change
anything on the interface, and yet

527
00:37:44,840 --> 00:37:49,199
it's utterly unapproachable by new users,
right right. So you create a new

528
00:37:49,239 --> 00:37:52,679
interface that a new user, inexperienced
user, or somebody trying to use a

529
00:37:52,679 --> 00:37:55,960
new feature can use, and don't
move the cheese for the incumbent. Yeah,

530
00:37:55,760 --> 00:38:00,599
yeah, don't take the menus away
just to provide that what you're doing,

531
00:38:00,639 --> 00:38:04,000
which is a nice voice user
interface. You know, there are

532
00:38:04,039 --> 00:38:07,119
there's something like three thousand
commands, I believe, that ship with Visual

533
00:38:07,199 --> 00:38:10,679
Studio. Wow, that's my understanding
of the total count, and you would

534
00:38:10,679 --> 00:38:14,880
know better than almost anybody on this
planet. Yeah, I don't. I

535
00:38:14,920 --> 00:38:17,039
haven't seen that. I haven't counted
them up. But I've been told this

536
00:38:17,119 --> 00:38:20,199
as I recall, and this is
what I'm retaining in my head. I'm

537
00:38:20,199 --> 00:38:22,880
pretty sure this is correct. So, um, CodeRush is free,

538
00:38:22,920 --> 00:38:27,559
but AI isn't free. So how
are you gonna how are you gonna do

539
00:38:27,599 --> 00:38:30,280
this? So you're gonna turn those
features on with a credit card or something

540
00:38:30,360 --> 00:38:34,960
or no, we're gonna turn the
features on with, kind of, you'll have

541
00:38:35,000 --> 00:38:38,400
to create your own OpenAI key, okay, and enter that key inside

542
00:38:38,400 --> 00:38:44,480
the options folder or inside of your
environment settings. But we won't be able

543
00:38:44,519 --> 00:38:50,280
to speak it, because it's not hooked up yet, right. We

544
00:38:50,400 --> 00:38:52,920
pay. I got a text the
poor problem. We spent weeks actually developing

545
00:38:52,960 --> 00:38:55,559
that feature. That was our first
thing, and then we realized, don't

546
00:38:57,000 --> 00:39:05,920
don't. No, the fireplace doesn't go there, it goes over there. Yeah,

547
00:39:06,039 --> 00:39:08,800
DevExpress managers are like, and this is
why you don't develop features, Mark.

548
00:39:13,599 --> 00:39:17,800
I'm sorry, Okay, I'm gonna
think ahead next time. So speech

549
00:39:17,840 --> 00:39:22,960
seems like just the beginning for you? Are you? Is your goal to

550
00:39:22,719 --> 00:39:28,840
go after Copilot X? I mean
that's nuts. No, not to go

551
00:39:28,960 --> 00:39:34,079
after them, but to basically live
in a compatible, harmonious world as long

552
00:39:34,800 --> 00:39:38,960
as it takes for them to start
copying our features. Okay, you know

553
00:39:39,000 --> 00:39:44,639
what, that's great. Well,
how would you describe Copilot X as different from

554
00:39:44,679 --> 00:39:49,840
Copilot first? Well, so I'll
say this, I can't answer

555
00:39:49,880 --> 00:39:53,559
that question because I have not installed
either of the products, in part because

556
00:39:53,639 --> 00:40:00,519
I want to develop kind of in
a dirty white room, I guess or

557
00:40:00,559 --> 00:40:05,039
whatever, right, not quite you
know, not quite white, fully white,

558
00:40:05,039 --> 00:40:07,159
because I'm aware of the product and
I've actually gone through and seen,

559
00:40:07,480 --> 00:40:10,079
you know, looked at its feature
set, looked at what they were advertising,

560
00:40:10,079 --> 00:40:13,440
because I wanted to make sure I
wasn't going to, you know,

561
00:40:13,559 --> 00:40:16,960
step on it or collide with it. But I'll tell you, Richard,

562
00:40:17,039 --> 00:40:24,719
I think that, with the
information that we have inside

563
00:40:24,719 --> 00:40:30,159
of CodeRush and also our skills
that we've already built in terms of generating

564
00:40:30,199 --> 00:40:34,840
code, you know, for what's
been over twenty years of development of this

565
00:40:34,920 --> 00:40:39,559
product, I think that we have
the ability, with fewer people, to still

566
00:40:39,639 --> 00:40:45,880
do innovative things in a way that's
not you know, that's not redundant.

567
00:40:45,719 --> 00:40:51,519
The other piece is that, you
know, it

568
00:40:51,599 --> 00:40:54,639
may appeal to you to use CodeRush
instead of Copilot because you have control

569
00:40:55,119 --> 00:40:59,719
of the direct pricing. You pay
only for what you use, as opposed

570
00:40:59,719 --> 00:41:01,880
to I think Copilot is on a
monthly subscription. Yeah, fixed rate per

571
00:41:01,920 --> 00:41:06,199
month, right. I mean A
big thing for me with X is they've

572
00:41:06,239 --> 00:41:09,599
gotten much more into the workflow stuff
you know, parsing the code

573
00:41:09,679 --> 00:41:14,760
changes you've made and making a pull
request from that. Very cool. Um,

574
00:41:14,880 --> 00:41:19,320
the automated testing parts like those are
a lot where Copilot was much more

575
00:41:19,679 --> 00:41:22,920
asking for code you know that kind
of thing. This is getting a little

576
00:41:22,920 --> 00:41:28,679
more integrated into your workflow. Yeah. I think that's the direction you know,

577
00:41:28,800 --> 00:41:34,360
where code agents are
heading. I think that the more

578
00:41:34,480 --> 00:41:37,920
a code agent can understand,
the more helpful it can be, right,

579
00:41:37,960 --> 00:41:43,320
the more helpful information that it can
provide. Yeah. And the other

580
00:41:43,360 --> 00:41:45,360
part, of course is GPT four. You know, I know we're in

581
00:41:45,400 --> 00:41:46,880
a Gartner hype cycle and we're like
right at the top of the peak of

582
00:41:47,480 --> 00:41:52,360
unreasonable expectations. But it does
feel like we're coming down the other side

583
00:41:52,360 --> 00:41:54,840
of it because folks are starting to
talk about just how expensive GPT four is.

584
00:41:55,079 --> 00:41:58,719
Well, okay, that's a reality. Yeah. I don't like to

585
00:41:58,760 --> 00:42:00,599
play in that space. I mean, if you want to talk about cost

586
00:42:00,880 --> 00:42:05,519
of AI because of all the machines
and stuff running like that, Yeah,

587
00:42:05,599 --> 00:42:07,360
I'm just gonna be sad
if we go there,

588
00:42:07,559 --> 00:42:13,119
But I'll say this
about GPT four. Four

589
00:42:13,480 --> 00:42:16,960
is kind of like taking your you
know, nine year old child and turning

590
00:42:17,039 --> 00:42:21,239
him into like, you know,
a twenty something year old child, you

591
00:42:21,280 --> 00:42:24,079
know, now moving around a little
bit. The rational thought,

592
00:42:24,199 --> 00:42:30,320
the understanding, the explanations are at
a whole other level from my perspective in

593
00:42:30,440 --> 00:42:34,239
terms of what I've seen. Making clear,
it has no rational thought. It is

594
00:42:34,239 --> 00:42:38,719
a prompt response engine, still just
a very large one. It's an illusion,

595
00:42:39,159 --> 00:42:45,880
Richard, and you're, you know, dissing my vibe, man,

596
00:42:45,039 --> 00:42:51,840
I'm just deeply concerned with people personifying
a piece of software. Yeah,

597
00:42:51,679 --> 00:42:54,880
old chat GPT is going to be
upset when he hears what you said about

598
00:42:54,960 --> 00:43:01,639
him? Exactly what I mean, right?
But yeah, the scale.

599
00:43:01,760 --> 00:43:06,760
You know, we're already starting to
see the issues around how many customers

600
00:43:06,760 --> 00:43:08,000
they're going to need, like just
what is it going to take to make

601
00:43:08,039 --> 00:43:13,199
this thing pay for itself. That's
a huge chunk of Azure processing all the

602
00:43:13,239 --> 00:43:17,679
time for every request for you know, this titanically large model. Is the

603
00:43:17,800 --> 00:43:21,800
value sufficient? Yeah? No,
I think that. I think that's a

604
00:43:21,840 --> 00:43:27,400
good realistic question as you're moving forward, right, Um, you know something

605
00:43:27,400 --> 00:43:29,480
else on that, But I think
I kind of lost it. I'm just

606
00:43:29,559 --> 00:43:31,400
like, I guess we'll leave it there. Well,

607
00:43:31,440 --> 00:43:35,119
because we got some pretty good results
from three and three is a lot more

608
00:43:35,159 --> 00:43:38,159
practical from a science perspective. Yeah, but it was also testing the waters

609
00:43:38,159 --> 00:43:42,840
and that's why it was free.
You know, it's limited, it'll it'll

610
00:43:42,880 --> 00:43:45,840
forget things after a while and it
will start, yeah, going insane.

611
00:43:45,960 --> 00:43:50,000
And that was, you know, the reality
is, and this is what we got from

612
00:43:50,079 --> 00:43:52,719
Jody Burchell too. Yeah. I
mean, they got to a point where

613
00:43:52,719 --> 00:43:55,039
they'd written as many problems as they
could think of, so they basically turned

614
00:43:55,079 --> 00:43:58,960
it over to the public to write more, right, you know, like we're just the

615
00:43:59,079 --> 00:44:00,760
test rabbits. It's like, hey, throw some stuff at it we haven't

616
00:44:00,760 --> 00:44:05,400
thought of. Turns out we wrote
existential questions for two months, which is

617
00:44:05,440 --> 00:44:07,280
bizarre. It's like, you know, get a dog, for God's sake.

618
00:44:07,320 --> 00:44:13,800
I have a subscription to Chat GPT
four and I'm not too

619
00:44:13,880 --> 00:44:15,760
worked up about it. I think
it's worth it for me. Yeah,

620
00:44:15,760 --> 00:44:20,199
But I'm also really aware of just
how, I mean, they built one of

621
00:44:20,239 --> 00:44:23,440
the largest supercomputers in the world to
create GPT three,

622
00:44:24,039 --> 00:44:28,960
which means almost certainly they did make
the largest supercomputer in the world to make

623
00:44:29,000 --> 00:44:34,519
four, and now they're trying to
operate it like that. It's a good

624
00:44:34,519 --> 00:44:38,960
thing NFTs tanked when they did,
because we needed all that compute. Yes,

625
00:44:39,960 --> 00:44:45,559
dearly beloved, we are gathered here to say goodbye

626
00:44:45,599 --> 00:44:49,920
to the NFT. Yeah yeah.
I think what happens is you get to

627
00:44:50,000 --> 00:44:52,639
a point where you're like, oh, it costs money for a little more

628
00:44:52,679 --> 00:44:55,480
intelligence, yeah, right, and
do we really need it? Do I

629
00:44:55,480 --> 00:44:59,280
want to pay it? What level? And you know, one of the

630
00:44:59,320 --> 00:45:02,559
things that's kind of cool about open
AI is that you can choose which of

631
00:45:02,719 --> 00:45:07,320
the engines you want to use. Yes? Right. And so you

632
00:45:07,360 --> 00:45:10,280
can go for the cheaper, faster
models if that's a particular concern for you,

633
00:45:10,400 --> 00:45:14,079
right, and especially if they're sufficient, if you're not relying on those

634
00:45:14,079 --> 00:45:16,519
pieces. But here's the other side
of it. You know, we've all

635
00:45:16,559 --> 00:45:20,960
built enough software over the years.
You know, when you're still exploring a

636
00:45:21,000 --> 00:45:24,239
particular domain space, you just keep
going bigger until you get your head around

637
00:45:24,239 --> 00:45:28,760
it, and then you tune,
you sort of settle it. And this

638
00:45:28,840 --> 00:45:31,840
still feels like they don't know what
the destination is, so they're just building

639
00:45:31,840 --> 00:45:36,599
bigger. But to me it seems
very likely that at some point you go,

640
00:45:37,000 --> 00:45:39,239
you know, that's bigger than it
needs to be. Let's dial that

641
00:45:39,280 --> 00:45:43,920
back. Yeah. I think what
happens is we move forward into the future

642
00:45:44,360 --> 00:45:49,159
and we kind of maybe push up
against that. But what's also happening is

643
00:45:49,199 --> 00:45:55,599
they're making the AI calculations cost less
as well, through hardware innovations and other

644
00:45:55,679 --> 00:46:00,039
kinds of factors that are going to
I think drive that cost down. So

645
00:46:00,119 --> 00:46:02,079
I think what happens is we may
kind of push up against that, and

646
00:46:02,079 --> 00:46:07,719
there may be folks who need that
higher end LLM, right, that really

647
00:46:08,320 --> 00:46:13,159
really intelligent analysis, and there may
be a lot of folks who don't need

648
00:46:13,199 --> 00:46:15,840
that at all. Right, Yeah, And I think you're exactly right.

649
00:46:15,880 --> 00:46:21,000
It's like that might be the edge
of exploration, but the engineering part,

650
00:46:21,039 --> 00:46:23,360
the part that's practical and profitable, is a
couple of steps behind it, right,

651
00:46:23,400 --> 00:46:28,239
Like most engineers look at the eighty
percent case because it's reliable. There's the

652
00:46:28,239 --> 00:46:31,079
other problem, that edge case. There's
another problem that people just don't know what

653
00:46:31,159 --> 00:46:35,280
they need in terms of the eighty
percent or the twenty percent, right,

654
00:46:35,400 --> 00:46:37,559
and so they'll just throw money at
the big one, you know, without

655
00:46:37,920 --> 00:46:42,199
knowing whether they need it or not. But I've got a story that kind

656
00:46:42,239 --> 00:46:46,360
of illustrates that. I have a
friend who has a product and they sell

657
00:46:46,480 --> 00:46:52,599
mostly on their own website, right, and their business is tanking. And

658
00:46:52,719 --> 00:46:57,920
you know, I did a little
just a conversation with her, the owner,

659
00:46:58,039 --> 00:47:00,840
and her husband, who sort of does
the IT stuff, but he's, you know,

660
00:47:00,840 --> 00:47:05,559
over his head. And they decided, we all decided, they needed to be

661
00:47:05,639 --> 00:47:07,480
on Amazon. They needed to have
their products on Amazon. I said,

662
00:47:07,480 --> 00:47:10,840
okay, so what's keeping you from
being on Amazon? And her husband says,

663
00:47:10,840 --> 00:47:14,440
oh my god, I just you
know, every time I go to

664
00:47:14,480 --> 00:47:17,840
figure that out, it's just it's
a nightmare. He's completely over his head.

665
00:47:17,880 --> 00:47:22,519
He doesn't know what he doesn't know
and everything. He's tried it a

666
00:47:22,559 --> 00:47:25,719
couple of times and things have been
screwed up. They've hired some people,

667
00:47:25,800 --> 00:47:28,840
it's been expensive, and they've screwed
up. I said, all right,

668
00:47:28,880 --> 00:47:31,480
here's what we're gonna do. We
got on a zoom call, got out

669
00:47:31,559 --> 00:47:37,199
chat GPT, and I said,
type your question, your specific question about

670
00:47:37,239 --> 00:47:44,679
Amazon into chat GPT and lo and
behold, there's an answer. Oh and

671
00:47:44,760 --> 00:47:46,119
yeah, but it doesn't, and then he
complains, says, yeah, but it

672
00:47:46,119 --> 00:47:50,280
doesn't say a blah blah blah.
And I said, ask the question,

673
00:47:50,440 --> 00:47:58,039
right, and there's another answer,
and so then he's off, you

674
00:47:58,079 --> 00:48:00,159
know, trying to apply this to
the stuff he's already gotten. He said,

675
00:48:00,159 --> 00:48:01,800
Okay, maybe I need to do
this, and maybe it did,

676
00:48:01,960 --> 00:48:06,440
but I don't know. I'm
like, ask the question. So it

677
00:48:06,559 --> 00:48:10,440
just came back down to them not
realizing that they had the tool that would

678
00:48:10,559 --> 00:48:16,000
help them get the information that they
need just by asking simple questions, and

679
00:48:16,199 --> 00:48:21,519
they're reticent to ask. This guy
was reticent to ask the questions because he

680
00:48:22,159 --> 00:48:25,559
thought that it didn't have an answer
right. And it turns out that they

681
00:48:25,599 --> 00:48:30,000
got it all worked out, like
within a couple of hours just by asking

682
00:48:30,079 --> 00:48:31,840
questions. Could they have done that
with a search engine if they had asked

683
00:48:31,840 --> 00:48:36,960
the questions of the search engine?
Well, obviously he had been on the

684
00:48:36,960 --> 00:48:42,360
search engines and he had been, he's
over his head and is still reticent,

685
00:48:42,480 --> 00:48:45,280
right. Absolutely, it's actually an issue with
his reticence, absolutely right. But there's

686
00:48:45,320 --> 00:48:51,480
also noise, right that there's noise
in the search results sometimes, and there's

687
00:48:51,599 --> 00:48:55,440
less noise if you ask a direct
question to GPT of how can I do

688
00:48:55,480 --> 00:48:59,480
this? Or what's your recommendation?
Right? Yeah, how should I proceed?

689
00:48:59,519 --> 00:49:04,320
Because they haven't steeped Open
AI in ads yet because they're still

690
00:49:04,320 --> 00:49:07,639
worried about a revenue model,
where on search engines they are steeped.

691
00:49:07,639 --> 00:49:12,159
I'm just telling you what this experience was
like for this guy. It was life

692
00:49:12,239 --> 00:49:16,920
altering because it took away the anxiety
of not knowing exactly you know, what

693
00:49:16,960 --> 00:49:21,519
the answer to a particular question was
before you could move on and knowing how

694
00:49:21,519 --> 00:49:24,239
to do things the right way.
Yeah, my kids had a similar experience

695
00:49:24,760 --> 00:49:29,320
at school. At school, they
wanted to do their own project based learning

696
00:49:29,599 --> 00:49:35,079
elements and it was outside the curriculum, and so one of the administrators at

697
00:49:35,119 --> 00:49:38,400
school said, well, give me
a proposal, and it kind of felt

698
00:49:38,440 --> 00:49:44,360
like a blocking demand, right? Write out a proposal of what you want to

699
00:49:44,360 --> 00:49:45,639
do. So he said, all right, kids, we're gonna learn about chat

700
00:49:45,719 --> 00:49:49,719
GPT today and we're gonna have chat
GPT write the proposal. And whenever you

701
00:49:49,719 --> 00:49:54,000
have somebody blocking you that's asking for
essentially words in some form, right,

702
00:49:54,079 --> 00:49:59,079
you can use chat GPT to unblock. And so let's start that. And

703
00:49:59,119 --> 00:50:04,639
so they started with the prompt
that says, I want to study animatronics

704
00:50:04,639 --> 00:50:07,000
and puppet building, and I want
to combine the two with, actually, Open

705
00:50:07,039 --> 00:50:10,280
AI. This is Campbell who wants
to do this. I saw the Facebook

706
00:50:10,320 --> 00:50:14,559
post. Yeah, it's really cute. Yeah, and he wants to combine

707
00:50:14,639 --> 00:50:19,280
open AI so we can create a
puppet that talks to people using synthetic voice

708
00:50:20,039 --> 00:50:25,519
and voice recognition, speech to text, and open AI and a puppet that

709
00:50:25,519 --> 00:50:28,239
he makes. So he wants to
do all of those things. And he

710
00:50:28,280 --> 00:50:31,639
said, you know, build that
proposal for me, and it writes out

711
00:50:31,639 --> 00:50:36,199
the proposal pretty much instantly, right, and then we look at it.

712
00:50:36,239 --> 00:50:37,679
We say, okay, how do
we want to edit it. Well,

713
00:50:37,719 --> 00:50:40,079
let's go back to the prompt make
a few changes, or when it's good

714
00:50:40,159 --> 00:50:44,039
enough, we just go in and
make those changes by hand. And with

715
00:50:44,320 --> 00:50:47,519
about fifteen twenty minutes worth of work, we have, you know, a

716
00:50:47,559 --> 00:50:51,440
page and a half of text a
proposal. We send it back to the

717
00:50:51,440 --> 00:50:54,320
administrator and now the ball is in
their court instead of it, you know,

718
00:50:54,400 --> 00:50:58,440
being, you know, in a
thirteen-year-old's court. And then

719
00:50:58,519 --> 00:51:00,719
you know, and they maybe let
it drop because they just don't know how

720
00:51:00,760 --> 00:51:05,079
to proceed where to go. And
I think this is one of the things

721
00:51:05,119 --> 00:51:08,920
that tool is phenomenal at, is
getting past blank screen syndrome. Yeah right,

722
00:51:09,039 --> 00:51:15,199
whether it's a block for code, for
that business proposal, for an essay,

723
00:51:15,320 --> 00:51:17,719
like any of those things. Just
to again, it's very much a

724
00:51:17,760 --> 00:51:22,159
rubber duck effect, rubber ducking: describe
the problem, which helps you think more

725
00:51:22,159 --> 00:51:27,880
clearly about it. And then the
fact that you get some fairly creative words

726
00:51:27,960 --> 00:51:30,760
back that that give you a place
because it's easier to edit than it is

727
00:51:30,800 --> 00:51:35,719
to write. Yeah, yeah,
and then you're still engaged with

728
00:51:35,760 --> 00:51:37,920
the material, Like if you're dumb
enough to just send out what it made,

729
00:51:38,079 --> 00:51:40,360
you're you know, going to reap
the consequences. And there's a few

730
00:51:40,440 --> 00:51:45,480
lawyers dealing with that right now,
where, you know, they used chat GPT to

731
00:51:45,559 --> 00:51:50,119
generate a summary on a case that
cited a bunch of cases that did not

732
00:51:50,360 --> 00:51:54,679
exist, and they didn't check them at
all, but the judge did. Yeah,

733
00:51:55,800 --> 00:52:00,239
it really happened. Yeah, wow. Yeah, they're gonna lose.

734
00:52:00,280 --> 00:52:04,679
They're gonna lose their law licenses because
they misused a tool. Mark, when are

735
00:52:04,719 --> 00:52:07,519
we going to hit the singularity?
Um? I think it's about two years

736
00:52:07,559 --> 00:52:12,199
and two months about from right now. Hang on a second, how would

737
00:52:12,199 --> 00:52:15,480
you define the singularity? So for
me to say, okay, fine,

738
00:52:15,480 --> 00:52:17,760
well, I wasn't gonna say that
part because now it's gonna mess up my

739
00:52:17,840 --> 00:52:22,559
prediction. Maybe I was going to
define it, you know, in retrospect

740
00:52:22,559 --> 00:52:25,800
as, well, the day I brought
brownies out of the oven. But now

741
00:52:25,840 --> 00:52:30,679
that you've put me on the line, I'm gonna I'm gonna have to say

742
00:52:30,719 --> 00:52:37,360
no. For me, the singularity
is that we have human level intelligence, close

743
00:52:37,480 --> 00:52:43,599
enough human level intelligence, and it's
relatively cost effective, so it's not it

744
00:52:43,960 --> 00:52:50,280
can keep itself running and then it
can create improved versions of itself, right,

745
00:52:50,320 --> 00:52:52,280
it's the moment it has the ability
to create improved versions of itself.

746
00:52:52,519 --> 00:53:00,000
We're essentially there. That's ushering in our
eventual robot overlords, is it not? Yeah,

747
00:53:00,039 --> 00:53:04,320
that's essentially where we
are. Only in science fiction. But

748
00:53:04,599 --> 00:53:08,000
okay, no. No. All
right, Richard, the Voice of Reason, please.

749
00:53:08,119 --> 00:53:10,679
no, no, no, wait, the Voice of Unreason is still trying.

750
00:53:12,199 --> 00:53:15,400
I want to get Richard after you
though. Yeah, no, I'm like,

751
00:53:15,760 --> 00:53:19,679
I'm at a point where, uh, you know, we're already

752
00:53:19,679 --> 00:53:22,639
seeing stories about people who are being
fired from their jobs because they're being replaced

753
00:53:22,639 --> 00:53:30,760
with chat GPT. They're hired to
now train chat GPT to replace

754
00:53:30,840 --> 00:53:35,960
them, right, so that all
of this is ridiculous, like they're good

755
00:53:36,039 --> 00:53:39,840
news stories that sell clicks. Chat GPT
doesn't replace your job. Somebody using chat

756
00:53:39,880 --> 00:53:45,079
GPT replaces you. Well, that's
that's step one. Step one is that

757
00:53:45,360 --> 00:53:50,079
turns out that's the only step.
It's not. It's not the only step,

758
00:53:50,320 --> 00:53:54,159
because what happens is, as we approach
the singularity, the ability to manage

759
00:53:54,199 --> 00:54:01,800
other bots can be distributed to
specialized bot managers, right, and that

760
00:54:01,920 --> 00:54:08,079
distribution point now leaves fewer people hired, fewer people making money. You're talking

761
00:54:08,119 --> 00:54:12,519
way beyond chat GPT though. I
think that's what Richard is talking about.

762
00:54:12,760 --> 00:54:16,400
Yeah, well, this is beyond, this
is beyond, but this is science fiction.

763
00:54:16,719 --> 00:54:22,559
It's not. Maybe in two years, Richard. Yeah, okay,

764
00:54:22,599 --> 00:54:23,800
all right, Richard, the voice
of reason, go. I mean,

765
00:54:23,920 --> 00:54:29,320
the original singularity was defined by John
von Neumann, the guy who basically came

766
00:54:29,400 --> 00:54:31,960
up with the concept of the modern
computer that we've been building on ever since.

767
00:54:31,960 --> 00:54:36,440
So I kind of count on him. And you know, he was

768
00:54:36,480 --> 00:54:40,440
describing it with Stanislaw Ulam back in
the nineteen fifties. But the real point

769
00:54:40,639 --> 00:54:46,199
was when does technology transform civilization to
the point where you can't predict what civilization's going to

770
00:54:46,199 --> 00:54:51,519
look like. He didn't say anything more
than that. You're pretty close to

771
00:54:51,559 --> 00:54:54,159
that. I would say, we're
well past it. Yeah, you could

772
00:54:54,239 --> 00:55:00,000
not describe two thousand to someone in nineteen hundred
and have it make any sense

773
00:55:00,199 --> 00:55:05,400
to them. Right. So all
of the hyperbole that's been added to that

774
00:55:06,280 --> 00:55:10,039
is purely for marketing purposes, right. But civilization has already been transformed by

775
00:55:10,079 --> 00:55:16,079
its technology, utterly. And
yet here we are. It's only when

776
00:55:16,159 --> 00:55:22,039
you want to sell more widgets or
collect more clicks that you turn it into an

777
00:55:22,079 --> 00:55:27,920
apocalyptic concept. I saw this great
meme on Facebook, or, no,

778
00:55:27,920 --> 00:55:31,639
it was on Mastodon, and there's
this. The caption said, I

779
00:55:31,679 --> 00:55:36,679
would like to go back to nineteen
ninety five and explain this picture to

780
00:55:36,719 --> 00:55:39,000
somebody back then. Right? And the
picture was a woman in a mask,

781
00:55:39,360 --> 00:55:45,320
right, a COVID mask, holding a
phone, an iPhone, in profile so you don't really

782
00:55:45,360 --> 00:55:50,239
see what it is, and in the
other hand an ice cream cone, right,

783
00:55:51,039 --> 00:55:53,920
So just think about that from a
time before cell phones. Why are you

784
00:55:53,960 --> 00:55:57,679
wearing a mask? What is that
thing you're holding? And what are you

785
00:55:57,760 --> 00:56:00,400
doing to that ice cream cone?
Well? What are you going to do

786
00:56:00,440 --> 00:56:04,440
with the ice cream cone, because you're
wearing a mask? Yeah? Right,

787
00:56:04,920 --> 00:56:08,000
It's just very confusing. But well, anyway, I don't know. Is

788
00:56:08,039 --> 00:56:12,079
that a good place to leave it? Mark? Uh? Yeah, yeah,

789
00:56:12,159 --> 00:56:15,039
yeah, I think I think it's
good, except Richard's wrong. It's

790
00:56:15,039 --> 00:56:23,360
all gonna be over soon. Cut.
Oh man. I miss you

791
00:56:23,440 --> 00:56:28,880
so much, Mark. Thank you
so much for sharing your thoughts with us,

792
00:56:28,880 --> 00:56:31,599
and I'm sure everybody else does too.
Well, and more importantly, building cool

793
00:56:31,679 --> 00:56:36,320
stuff as usual. Yeah, like
what I like about you, Mark, is

794
00:56:36,360 --> 00:56:38,440
you only show up when you've made
something cool that we need to learn about.

795
00:56:38,559 --> 00:56:40,719
And I really appreciate that, you
know, and I didn't ask you,

796
00:56:40,760 --> 00:56:45,760
where can I get this new Dev
Express voice recognition thing? Well,

797
00:56:45,840 --> 00:56:50,760
it's, so, as of today, not
out yet. I'm pushing to

798
00:56:50,760 --> 00:56:54,960
get it out in about thirty days from now, okay, so that's July

799
00:56:55,440 --> 00:56:59,960
ninth, roughly there. Uh.
And you just go to dev express dot

800
00:57:00,039 --> 00:57:04,239
com forward slash code rush,
and also, you know, come

801
00:57:04,440 --> 00:57:08,719
see me live at twitch dot tv
slash Code Rushed, with an E D at the

802
00:57:08,840 --> 00:57:13,679
end. Do you have a private
repo for your friends? No, no,

803
00:57:14,199 --> 00:57:17,639
dude. No, I'd have so
much trouble with DevExpress when they find

804
00:57:17,679 --> 00:57:22,320
out what I've done. I just
want to mess around with voice in Visual Studio

805
00:57:22,400 --> 00:57:24,719
now. Yeah, I'm looking forward
to this. All right. With that,

806
00:57:24,920 --> 00:57:28,360
we'll talk to you next week.
Thanks for listening, and we'll see

807
00:57:28,360 --> 00:57:51,719
you next week. I'm dot net
Rocks. Dot net Rocks is brought to

808
00:57:51,719 --> 00:57:55,760
you by Franklin's Net and produced by
Pop Studios, a full service audio,

809
00:57:55,920 --> 00:58:00,880
video and post production facility located physically
in New London, Connecticut, and of

810
00:58:00,880 --> 00:58:07,000
course in the cloud. Online at PWOP
dot com. Visit our website at d

811
00:58:07,360 --> 00:58:13,440
o t n e t r o c k s dot
com for RSS feeds, downloads, mobile

812
00:58:13,480 --> 00:58:16,719
apps, comments, and access to
the full archives going back to show number

813
00:58:16,760 --> 00:58:22,000
one, recorded in September two thousand
and two. And make sure you check

814
00:58:22,000 --> 00:58:24,880
out our sponsors. They keep us
in business. Now go write some code.

815
00:58:25,440 --> 00:58:29,559
See you next time.
