1
00:00:01,120 --> 00:00:05,679
How'd you like to listen to .NET
Rocks with no ads? Easy! Become

2
00:00:05,719 --> 00:00:09,880
a patron. For just five dollars a
month, you get access to a private

3
00:00:10,039 --> 00:00:14,439
RSS feed where all the shows have
no ads. Twenty dollars a month will

4
00:00:14,480 --> 00:00:19,039
get you that and a special .NET
Rocks patron mug. Sign up now at

5
00:00:19,079 --> 00:00:24,160
patreon dot dotnetrocks dot com.
Hey, Carl and Richard here with your twenty

6
00:00:24,199 --> 00:00:29,359
twenty four NDC schedule. We'll be
at as many NDC conferences as possible this

7
00:00:29,440 --> 00:00:33,600
year, and you should consider attending
no matter what. NDC Oslo is happening in

8
00:00:33,679 --> 00:00:37,880
June tenth through the fourteenth. Get
your tickets at ndcoslo dot com. The

9
00:00:37,920 --> 00:00:44,000
Copenhagen Developers Festival happens August twenty sixth
through the thirtieth. Early bird discount ends

10
00:00:44,079 --> 00:00:51,719
April twenty sixth. Tickets at Cphdevfest
dot com. NDC Porto is happening October fourteenth

11
00:00:51,759 --> 00:00:56,880
through the eighteenth. The early bird
discount ends June fourteenth. Tickets at Ndcporto

12
00:00:57,079 --> 00:01:12,200
dot com. We'll see you there, we hope. Hey, guess what

13
00:01:12,560 --> 00:01:17,519
it's .NET Rocks! I'm Carl
Franklin. And I'm Richard Campbell. Richard, where

14
00:01:17,519 --> 00:01:19,439
are you in the world now?
I am in Las Vegas, where I'm at

15
00:01:19,480 --> 00:01:26,359
the Microsoft Fabric conference. This is
their new data analytics stack. And so

16
00:01:26,760 --> 00:01:30,040
we're putting on a show with me
and four thousand of my closest friends.

17
00:01:30,200 --> 00:01:37,040
You know what's crazy is that Kelly
has a friend who we don't know what

18
00:01:37,120 --> 00:01:40,079
she does, but we know she's
in IT. And she said she was

19
00:01:40,120 --> 00:01:42,840
going to Vegas this week to go
to some Fabric conference and I had no

20
00:01:42,920 --> 00:01:47,480
idea what it was. And now
I know she's hanging out with me.

21
00:01:48,159 --> 00:01:51,200
She might be and we'll talk afterwards. You can hook up with her,

22
00:01:51,719 --> 00:01:55,280
you know what I mean. I'm
not saying hook up. I'm just saying

23
00:01:55,599 --> 00:02:00,439
maybe meet for coffee or tea.
Because you're a Brit, you know. We're

24
00:02:00,439 --> 00:02:02,359
going to be back here in May
for DEVintersection. Yeah. It's gonna

25
00:02:02,359 --> 00:02:05,680
be fantastic. And you're signed up
for that one. Yeah, I'm really

26
00:02:05,719 --> 00:02:07,599
looking forward to it. I'm going
to be doing a Blazor workshop, I

27
00:02:07,639 --> 00:02:12,680
think mm hmmm, yeah. I
think it's called Carl Franklin's Blazor Workshop.

28
00:02:12,960 --> 00:02:16,080
I believe it is. So therefore
I'm the only one that can teach it.

29
00:02:16,639 --> 00:02:20,919
I guess it's your workshop. That'd
be devintersection dot com. Yeah,

30
00:02:20,960 --> 00:02:23,280
I hope we'll see you there.
Restrictions apply, void where prohibited

31
00:02:23,280 --> 00:02:27,319
by law. Okay, order now,
get a free set of Ginsu knives.

32
00:02:29,080 --> 00:02:32,199
See, it's five o'clock here, I
have wine. Michelle is in Australia.

33
00:02:32,240 --> 00:02:36,240
It's eight o'clock in the morning.
She just had brekkie. Eight o'clock in the

34
00:02:36,240 --> 00:02:39,800
morning. Tomorrow. She's in the
future. Yeah, it's great. I love being

35
00:02:39,800 --> 00:02:43,759
in the future. I wasn't going
to have a bowl of cereal, but

36
00:02:43,800 --> 00:02:46,199
then I sat here and I was
starving, so I was like, I

37
00:02:46,280 --> 00:02:49,680
bet by eight o'clock, I'm not
gonna last. So your podcast, you're

38
00:02:49,719 --> 00:02:52,759
not gonna make it. We make
a long show. We need to shine.

39
00:02:52,840 --> 00:02:55,319
Yeah, exactly. You know,
our hardcore listeners know that whenever we

40
00:02:55,400 --> 00:03:00,680
do shows later in the day like
now, that we tend to get little

41
00:03:00,719 --> 00:03:04,240
punchy. So like, you know, I'll have a glass of wine. You

42
00:03:04,240 --> 00:03:07,360
don't believe me. Michelle's such a serious person. So, right, I don't know how

43
00:03:07,360 --> 00:03:13,159
this is gonna work. She's serious. All right, well, let's get on

44
00:03:13,240 --> 00:03:23,400
with Better Know a Framework. All right, dude, what do you got?

45
00:03:23,159 --> 00:03:27,840
I got something that's interesting and a
little scary. I like it. So

46
00:03:28,599 --> 00:03:32,840
this is a paper that came out
from Microsoft on March thirteenth. This being

47
00:03:32,879 --> 00:03:37,120
show eighteen ninety two. You can
go to one eight nine two dot

48
00:03:37,240 --> 00:03:46,000
pwop dot tv. It's a Microsoft
paper entitled AutoDev: Automated AI-Driven Development,

49
00:03:46,599 --> 00:03:53,000
and I'll just take the most interesting
and succinct paragraph from it. We present

50
00:03:53,120 --> 00:04:00,840
AutoDev, a fully automated AI-driven
software development framework designed for autonomous planning and

51
00:04:00,919 --> 00:04:10,479
execution of intricate software engineering tasks.
AutoDev enables users to define complex software engineering

52
00:04:10,520 --> 00:04:16,079
objectives, which are assigned to AutoDev's
autonomous AI agents to achieve. These AI

53
00:04:16,240 --> 00:04:23,160
agents can perform diverse operations on a
codebase, including file editing, retrieval,

54
00:04:23,240 --> 00:04:29,199
build processes, execution, testing,
and git operations. They also have access

55
00:04:29,240 --> 00:04:33,000
to files, compiler output, build
and testing logs, static analysis tools,

56
00:04:33,040 --> 00:04:39,480
and more. This enables the AI
agents to execute tasks in a fully automated

57
00:04:39,560 --> 00:04:44,800
manner with a comprehensive understanding of the
contextual information required. You know, Richard,
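
The workflow that quoted paragraph describes, agents issuing file-edit, retrieval, build, test, and git operations against a codebase, comes down to a tool-dispatch loop. Here's a minimal sketch of that idea; the tool names, the allow-list, and the in-memory "repo" are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of an agent tool-dispatch loop: the agent emits operations
# and a gated executor runs them. All names here are hypothetical.

ALLOWED = {"edit", "retrieve", "build", "test", "git"}

def execute(command: str, arg: str, repo: dict) -> str:
    """Run one agent-requested operation; refuse anything not on the allow-list."""
    if command not in ALLOWED:
        return f"denied: {command}"
    if command == "edit":
        path, _, content = arg.partition(":")
        repo[path] = content                 # write the file
        return f"wrote {path}"
    if command == "retrieve":
        return repo.get(arg, "<not found>")  # read the file back
    return f"{command} ok"                   # build/test/git stubbed out here

repo = {}
log = [execute(c, a, repo) for c, a in [
    ("edit", "app.py:print('hi')"),
    ("retrieve", "app.py"),
    ("test", "all"),
    ("rm -rf", "/"),                         # an unsafe request the gate rejects
]]
```

The gate is the point Carl and Richard circle around: "fully automated" still means some executor decides which operations the agents are allowed to perform.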

58
00:04:46,000 --> 00:04:53,560
anytime I hear AI and fully automated
in a sentence or a paragraph,

59
00:04:53,839 --> 00:05:00,319
I get the willies. Well, that's
just your crap detector going off, going

60
00:05:00,319 --> 00:05:04,399
what? Sure. Yeah, well,
I mean fully automated kind of means that

61
00:05:05,000 --> 00:05:11,439
it has, you know, it
has unfettered access to things. It's building

62
00:05:11,519 --> 00:05:14,319
software. Right, Well, maybe
it will, maybe it won't. I

63
00:05:14,319 --> 00:05:16,759
think I think a person wrote that
line and hasn't made it true at all.

64
00:05:16,920 --> 00:05:20,360
It's a paper, not a product. Yeah, but eventually, eventually

65
00:05:20,399 --> 00:05:24,120
this is going to happen, right, I mean, And the whole idea

66
00:05:24,160 --> 00:05:29,839
is that software developers may, in
fact, in the near

67
00:05:29,959 --> 00:05:35,199
or distant future, become more supervisors
than developers. Well, one would argue

68
00:05:35,199 --> 00:05:40,439
they already are. They mostly supervise
the utilization of libraries and glue them together.

69
00:05:41,839 --> 00:05:45,519
Boy, you're just a curmudgeon,
aren't you. You're like the AI curmudgeon,

70
00:05:45,959 --> 00:05:49,040
going, it'll never work. I'm just
more of a Hey, you know

71
00:05:49,079 --> 00:05:54,920
what, I've seen this play out a
bunch of times. Someone still has to

72
00:05:54,920 --> 00:05:59,040
do requirements gathering, someone still has
to describe the problem space, someone still

73
00:05:59,040 --> 00:06:01,959
has to validate it. You know. The code writing was the simplest part.

74
00:06:02,240 --> 00:06:06,279
Yeah, oh, did I mention this? It only writes Python. Lovely.

75
00:06:10,199 --> 00:06:13,240
Well, that means it's all C
under the hood anyway. So there we

76
00:06:13,319 --> 00:06:17,600
are. Yeah, no, no
offense to our Python programming listeners, because

77
00:06:17,639 --> 00:06:21,600
Python is probably the most popular language
in the world right now. It's yeah,

78
00:06:21,600 --> 00:06:30,040
absolutely, and it's awesome. Yeah. Or is the first JavaScript?

79
00:06:30,040 --> 00:06:32,199
You're probably right? Yeah, yeah, I thought you were gonna say,

80
00:06:32,240 --> 00:06:35,920
JSON, and I was going to
be sad because it's not a language.

81
00:06:35,920 --> 00:06:42,399
It's a format. Well,
anyway,

82
00:06:43,639 --> 00:06:48,360
we're certainly creeping toward this. And I remember having this conversation

83
00:06:48,439 --> 00:06:53,040
with you on .NET Rocks in
the two thousands, Richard. Yep. And

84
00:06:53,120 --> 00:06:55,519
do you remember? I've got to
look up what show it was and maybe

85
00:06:55,560 --> 00:06:59,079
an alert listener can remind us.
But I said, you know, there's

86
00:06:59,120 --> 00:07:02,079
going to come a day where a
business person can walk up to a machine

87
00:07:02,160 --> 00:07:06,920
and have a conversation with it,
you know, either typing or speaking English or

88
00:07:08,160 --> 00:07:12,560
another language, and tell it what
the requirements are and it would spit out

89
00:07:12,600 --> 00:07:19,759
a fully ready to go qualified application. And you said that will never happen.

90
00:07:19,959 --> 00:07:25,199
Yep. And it still hasn't.
Yeah, it still hasn't happened yet.

91
00:07:25,360 --> 00:07:28,720
No, but, you know,
we've seen papers like

92
00:07:28,720 --> 00:07:31,839
this before; actually making it happen is
something else entirely, and I would point

93
00:07:31,839 --> 00:07:35,680
you at Copilot and Power Platform.
Yeah, although you'll find the average business

94
00:07:35,680 --> 00:07:40,680
person cannot describe a business process adequately
to explain it to another human, much

95
00:07:40,720 --> 00:07:44,519
less to a piece of software.
There is that problem. Yes, yeah,

96
00:07:44,560 --> 00:07:47,279
Actually breaking down workflows is hard.
It takes skill, and it takes

97
00:07:47,399 --> 00:07:53,199
time. It does, and it
also takes some kind of uh, some

98
00:07:53,439 --> 00:07:58,279
kind of thing that we don't think
machines have yet. Software machines don't have

99
00:07:58,439 --> 00:08:01,160
anything. It's just software. Yeah, programs. I guess. All right,

100
00:08:01,199 --> 00:08:07,319
Well enough of the philosophical bs.
Let's hear who's talking to us today.

101
00:08:07,399 --> 00:08:09,519
Richard? Grabbed a comment out of show eighteen fourteen, the one we did with one

102
00:08:09,600 --> 00:08:15,720
Michelle Mannering, you know, back
when she was still Michelle Mannering at NDC

103
00:08:15,920 --> 00:08:20,399
Oslo. We did it in person
talking about this product called GitHub Copilot.

104
00:08:20,399 --> 00:08:22,560
It's the thing that's going to replace
all developers. As I recall,

105
00:08:22,759 --> 00:08:31,639
absolutely. And Trond said, I've
been kind of slow with GitHub Copilot and

106
00:08:31,680 --> 00:08:35,120
admittedly this is now a year ago, but I started last week. This

107
00:08:35,240 --> 00:08:39,639
week I got repaid in full.
I cannot understand why I hadn't tried this

108
00:08:39,679 --> 00:08:43,440
earlier. The suggestions from Visual Studio are
good, but this is brilliant. My

109
00:08:43,559 --> 00:08:48,240
current project is a C# Blazor
application using Entity Framework for creating and seeding

110
00:08:48,279 --> 00:08:54,120
test data to a database with repeating
patterns. After the first three or four

111
00:08:54,240 --> 00:08:58,080
entities, GitHub Copilot got a
grip and wrote almost entirely correct classes in

112
00:08:58,279 --> 00:09:03,080
just a few tries. That's because
it was trained on BlazorTrain repos.

113
00:09:03,120 --> 00:09:05,679
Come on, there you go,
that's what it's all about. Yeah,

114
00:09:05,720 --> 00:09:09,399
it's all about me. However,
the differences between the entities made it

115
00:09:09,480 --> 00:09:13,159
hard to just copy the whole class
and change the names. Copilot did that

116
00:09:13,200 --> 00:09:16,360
the first few times, but after that
it embraced the pattern and created the unique code

117
00:09:16,360 --> 00:09:20,639
I needed. I guess the result
was about ninety percent. It required some

118
00:09:20,799 --> 00:09:22,840
fixes, but all in all,
I saved so much time on this single

119
00:09:22,840 --> 00:09:26,840
project that I guess GitHub Copilot
counts as free for the next few

120
00:09:26,919 --> 00:09:33,120
years for me. I'm seriously impressed. You know where GitHub Copilot really

121
00:09:33,159 --> 00:09:37,159
really works well for me is if
I have things that repeat, like I've

122
00:09:37,200 --> 00:09:43,039
got five or six lines that you
know, I'm working with data in a

123
00:09:43,039 --> 00:09:46,879
grid and there's controls in each cell, right, and it understands that I

124
00:09:48,000 --> 00:09:52,080
have, you know, what the data and the
headers are in the grid, and I

125
00:09:52,200 --> 00:09:56,720
make one of these blocks of six
lines and then it just starts spitting out

126
00:09:56,759 --> 00:09:58,720
the rest of them and it's like
tab-enter or tab-enter or tab-enter or tab-enter.

127
00:10:00,320 --> 00:10:03,600
What I don't like about Copilot
and Michelle will probably jump in here and

128
00:10:05,440 --> 00:10:07,679
either agree or disagree with me.
But what I don't like about Copilot is

129
00:10:07,720 --> 00:10:13,960
when it tries to read my mind
and it's almost like like, okay,

130
00:10:13,039 --> 00:10:18,840
here's a story. I have a
customer in the studio who wants me to

131
00:10:18,759 --> 00:10:26,120
work on mastering this solo piano CD
right, and whenever I say, you

132
00:10:26,120 --> 00:10:28,039
know what I'm gonna do, I'm
gonna... and he goes, are you

133
00:10:28,039 --> 00:10:31,000
going to turn down the volume there? Or are you going to bring up

134
00:10:31,000 --> 00:10:33,919
the five K there? Or are
you going to... And it

135
00:10:33,960 --> 00:10:35,679
goes on like four or five times, and then I just look at him

136
00:10:35,720 --> 00:10:39,360
like, will you shut up and
let me say what I'm going to do?

137
00:10:41,200 --> 00:10:46,320
This would go faster. I have
two answers to that. Go

138
00:10:46,320 --> 00:10:50,360
ahead. So the first one is
GitHub Copilot can't actually read your

139
00:10:50,360 --> 00:10:54,879
mind, as much as it feels like it might be
able to read your mind. All it's

140
00:10:54,919 --> 00:11:00,600
doing is taking context from the file
you have opened. Any other open files

141
00:11:01,320 --> 00:11:03,919
It sounds like you use Visual Studio; if
you have other open tabs, it's taking

142
00:11:03,919 --> 00:11:09,919
the context from that. So sometimes...
Exactly. So sometimes it does feel like you're

143
00:11:09,960 --> 00:11:13,159
like, oh, it is trying
to read my mind. You're like,

144
00:11:13,480 --> 00:11:16,320
just wait, like, I am
going to... So, a couple of things.

145
00:11:16,320 --> 00:11:20,080
You know, you can just ignore
it and keep typing over it, or

146
00:11:20,159 --> 00:11:24,399
you can turn inline suggestions off
and mostly use the GitHub Copilot chat

147
00:11:24,440 --> 00:11:30,240
feature. Yeah. What I like
is if if you make a comment about

148
00:11:30,320 --> 00:11:33,240
exactly what you want to do,
and then it's really, really good about

149
00:11:33,639 --> 00:11:39,440
you know... Like Richard was
saying, if you're able to articulate very

150
00:11:39,519 --> 00:11:43,039
clearly what you want, you're going
to get a response that is much more

151
00:11:43,279 --> 00:11:48,399
or much closer to what you actually
need, as opposed to it just trying

152
00:11:48,399 --> 00:11:52,120
to read your mind. Because as
much as it feels like all these AIs

153
00:11:52,159 --> 00:11:54,639
can read our mind, they really
can't. As I mentioned before, it's

154
00:11:54,679 --> 00:11:58,240
a bunch of software. It can't
read your mind. It can only take

155
00:11:58,279 --> 00:12:03,120
in the context you're providing it,
and the training data set that it has

156
00:12:03,159 --> 00:12:07,399
been already trained on. It's a
good suggestion to turn off that inline stuff,

157
00:12:07,480 --> 00:12:13,600
especially when it starts spitting out a
hundred-line code solution and you

158
00:12:13,720 --> 00:12:20,720
suddenly your screen fills up with gray
bloody blah. God help you if you're

159
00:12:20,720 --> 00:12:26,759
presenting; you need it to go.
It's a great technical term, gray bloody blah.

160
00:12:26,039 --> 00:12:31,320
I like that. Let's say that today. Hey, Trond,

161
00:12:31,399 --> 00:12:33,159
Of course, you kicked off some
great conversations. Thanks so much for your

162
00:12:33,159 --> 00:12:37,080
comment. A copy of Music to Code By
is on its way to you. If you'd

163
00:12:37,080 --> 00:12:39,399
like a copy of Music to Code By,
write a comment on the website at

164
00:12:39,440 --> 00:12:41,440
dotnetrocks dot com or on the
Facebooks. We publish every show there,

165
00:12:41,600 --> 00:12:45,799
and if you comment on the show
and I read it, we'll send you

166
00:12:45,840 --> 00:12:48,039
a copy of Music to Code By. And you can
definitely follow us on Twitter if you like.

167
00:12:48,159 --> 00:12:50,480
We've been on Twitter for a long
time, or X, or whatever the hell

168
00:12:50,519 --> 00:12:54,200
they call it these days. But
the cool kids are hanging out on

169
00:12:54,240 --> 00:12:58,440
Mastodon. I'm at Carl Franklin
at TechHub dot social, and I'm Rich

170
00:12:58,440 --> 00:13:01,480
Campbell at mastodon dot social. Send us
a toot, and of course, if

171
00:13:01,519 --> 00:13:05,159
you're looking for more ways to get
in touch with us, or me particularly,

172
00:13:05,159 --> 00:13:09,919
you can go to carlfranklin dot
com. Okay, let's bring in Michelle.

173
00:13:09,919 --> 00:13:16,600
That person who was interjecting her brilliance
there: Michelle "Mish Manners" Duke, no longer

174
00:13:16,639 --> 00:13:22,639
Mannering. Congratulations on your marriage, Mish.
Yeah. Michelle Duke is a multi-talented

175
00:13:22,639 --> 00:13:28,679
personality in the tech and gaming communities. As a developer advocate, she gets

176
00:13:28,679 --> 00:13:33,600
to create awesome experiences and engage with
the vibrant GitHub developer community. Michelle has

177
00:13:33,600 --> 00:13:37,919
spoken at over two hundred and fifty
events on topics like AI, the future

178
00:13:37,919 --> 00:13:43,240
of work, communication, teamwork,
and has given technical demos. She is

179
00:13:43,240 --> 00:13:48,960
a respected leader in the hackathon community, having won, organized, and mentored at over one

180
00:13:50,039 --> 00:13:54,559
hundred hackathons. Michelle has founded several
tech companies, including an AI company,

181
00:13:56,000 --> 00:14:00,399
an e-scooter business, and as
a result sits at the forefront of the Melbourne

182
00:14:00,559 --> 00:14:05,039
science, tech, esports, and startup
scenes. In her spare time, in air

183
00:14:05,120 --> 00:14:11,120
quotes, Michelle is a streamer and journalist
and is always working on something exciting.

184
00:14:11,279 --> 00:14:18,799
Catch her at an event or streaming on
Twitch, or on .NET Rocks. Yeah. I

185
00:14:18,840 --> 00:14:24,360
actually haven't streamed for a little while. My computer broke recently, so I'm

186
00:14:24,399 --> 00:14:28,440
reduced to not gaming and streaming
at the same time. It can no

187
00:14:28,480 --> 00:14:35,200
longer handle that. It's really hard
on a machine to run OBS

188
00:14:35,240 --> 00:14:37,519
and the game and you know,
and all the filters and stuff. Yeah,

189
00:14:37,600 --> 00:14:43,080
so I'm usually running OBS, the
game, a chat feature, which is

190
00:14:43,080 --> 00:14:46,720
a JavaScript program that is scraping
my chat, my Twitch chat, so

191
00:14:46,759 --> 00:14:50,360
I get to keep that, and recording, and it's just got to the

192
00:14:50,360 --> 00:14:54,120
point where I'm like, how old
is my computer now? Okay,

193
00:14:54,159 --> 00:14:56,879
it's five years old. I definitely
need a new one, so I haven't

194
00:14:56,080 --> 00:15:01,840
Why do I smell smoke? The
benefit is putting your tea on the

195
00:15:03,080 --> 00:15:05,559
CPU, right? Yeah. No, it's one of those things

196
00:15:05,559 --> 00:15:09,120
where like you turn the computer on
and it just like loudly beeps at you

197
00:15:09,240 --> 00:15:13,360
because there's something wrong and won't turn
on, and you're like, no,

198
00:15:13,399 --> 00:15:18,120
I really, Or when you're trying
to write a presentation for someone and you

199
00:15:18,240 --> 00:15:24,000
realize it takes a full twenty-four
hours to render the demos within that presentation

200
00:15:24,240 --> 00:15:26,600
that are like a couple of minutes
long, and then you're like that's not

201
00:15:26,720 --> 00:15:31,600
right, sounds broken. So there's
a new computer on its way? Until

202
00:15:31,639 --> 00:15:35,279
then. Yeah, so it's ordered, it's ordered. It's actually one of

203
00:15:35,360 --> 00:15:39,279
those, like, fancy custom-built ones. I've got my graphics and branding on

204
00:15:39,320 --> 00:15:43,399
it, and it looks
really awesome, like we just came back

205
00:15:43,399 --> 00:15:48,879
from our honeymoon in Japan as well, so it's all Japanese themed. And, oh,

206
00:15:48,960 --> 00:15:52,320
I see. Yeah, timing matters. You were very Japanese-centric when you were

207
00:15:52,320 --> 00:15:56,679
giving the design. So tell me
it's fancy on the inside as well.

208
00:15:56,519 --> 00:16:02,799
Yeah, yeah, a
forty eighty, like, something crazy. We

209
00:16:02,840 --> 00:16:06,879
didn't quite go for the forty ninety. They're a little bit pricey still.

210
00:16:07,000 --> 00:16:10,919
But yeah, we're up there.
We're in like the forty eighty series.

211
00:16:10,919 --> 00:16:12,399
So yeah, there you go, that's right up there. Yeah,

212
00:16:12,480 --> 00:16:18,159
exactly, thank you. I built
a PC during the pandemic, and it's an

213
00:16:18,200 --> 00:16:25,720
i9, and my fan is just
about as high as the case

214
00:16:25,799 --> 00:16:30,279
is deep, and it just looks like one
big radiator on a seventy

215
00:16:30,320 --> 00:16:33,879
four Corvette. You know. That's
literally what they are, though, like

216
00:16:34,360 --> 00:16:38,360
the radiators. So the one I
had on my old machine was like this big,

217
00:16:38,879 --> 00:16:44,360
which is about thirty centimeters, sitting on top of a tiny little

218
00:16:44,440 --> 00:16:48,200
chip that's like a couple
of centimeters wide. And you're like,

219
00:16:48,679 --> 00:16:53,919
why, but I've never had a
quieter computer in my life. When

220
00:16:53,960 --> 00:16:56,320
I built the old studio, Richard, do you remember, I put a

221
00:16:56,320 --> 00:17:00,960
hole in the wall so I could
put the computer behind the wall, in

222
00:17:00,000 --> 00:17:03,640
a different room. I don't have to
do that now. Yeah, yeah, I got

223
00:17:03,680 --> 00:17:07,799
water cooling. Water cooling, yeah. I got water cooling

224
00:17:07,839 --> 00:17:11,519
in the new one. That's
how it goes. And I also did

225
00:17:11,559 --> 00:17:15,880
it like with one of those things
where you're just like, that's an unnecessary

226
00:17:15,920 --> 00:17:19,200
amount of turns you've put in the
water cooling, and it's awful. It looks like

227
00:17:21,039 --> 00:17:26,000
Oh, sure. Yeah, I got rid
of the phase change. Coolers are good

228
00:17:26,079 --> 00:17:29,720
enough. They're quiet enough. Use
the big fans, the one forty millimeters,

229
00:17:30,720 --> 00:17:33,960
the fourteen-centimeter ones are quiet,
like you can get a machine right

230
00:17:33,000 --> 00:17:36,559
down. Yeah, yeah, I
mean, no reason to go hard.

231
00:17:36,680 --> 00:17:40,000
I have water cooling, but then
I also have ten fans in the computer

232
00:17:40,079 --> 00:17:45,240
as well. So, I mean...
Nothing succeeds like excess, right,

233
00:17:45,440 --> 00:17:51,519
that's my line. Fully RGB everything.
I'm going to also share some photos when

234
00:17:51,559 --> 00:17:56,440
it's done. It looks pretty cool. So, speaking of good Copilot

235
00:17:56,519 --> 00:18:00,279
stuff, are you really going to
adopt the... what did I say,

236
00:18:00,480 --> 00:18:07,960
gray bloody blah? The gray bloody
blah. I use GitHub Copilot a

237
00:18:07,000 --> 00:18:12,519
lot. I do. I do
leave the inline suggestions on because I like

238
00:18:12,599 --> 00:18:18,519
it, but I also use the
chat feature quite a lot. It's just

239
00:18:18,480 --> 00:18:23,119
the things you can do these days
are just incredible. Like we heard from

240
00:18:23,160 --> 00:18:29,119
the quote that Richard read out that
it's saving lots of time. I've literally

241
00:18:29,160 --> 00:18:33,000
just read like four or five articles
just this morning on various companies and how

242
00:18:33,039 --> 00:18:37,000
their productivity has gone through the roof
using GitHub Copilot. Even

243
00:18:37,759 --> 00:18:41,119
you mentioned we got married recently,
so, like, being a good developer, when you

244
00:18:41,119 --> 00:18:45,759
get married, you turn to GitHub. So I used GitHub Copilot to

245
00:18:45,799 --> 00:18:49,279
build the website for our wedding, and instead of taking like two to four

246
00:18:49,319 --> 00:18:52,480
weeks to build it, which is probably
what it would have taken me, it

247
00:18:52,519 --> 00:18:57,440
took like eight hours. And the
fact that it was able to suggest to

248
00:18:57,559 --> 00:19:03,279
me, like, things I probably
wouldn't have quite thought of.

249
00:19:03,400 --> 00:19:06,920
As I was going through it,
it realized what I was doing, which

250
00:19:06,960 --> 00:19:11,079
is building a website for my wedding, and it started suggesting things like,

251
00:19:11,359 --> 00:19:14,720
oh, would you like a countdown
timer on your website? How about

252
00:19:14,720 --> 00:19:18,799
a lightbox to show some lovely
photos? The text version of Clippy,

253
00:19:18,920 --> 00:19:22,359
isn't it basically like when you think
about like, you know, all the

254
00:19:22,400 --> 00:19:26,119
memes popping up about, you know,
Clippy: can I help you with that?

255
00:19:26,960 --> 00:19:30,200
That is literally what GitHub Copilot is
doing with your code. It's like, I

256
00:19:30,279 --> 00:19:33,559
see you're digging a grave? Is it
a business grave or a personal grave?

257
00:19:33,920 --> 00:19:41,920
No, it's for you, Clippy.
For Clippy. Have we really defined

258
00:19:41,960 --> 00:19:47,200
the difference between... because GitHub Copilot
Chat came later, right? The original

259
00:19:47,279 --> 00:19:51,519
Copilot didn't have the chat?
No. So the original Copilot was

260
00:19:51,559 --> 00:19:56,559
the inline functionality. So the new
GitHub Copilot has the two types of

261
00:19:56,559 --> 00:20:00,640
functionality. It has that inline suggestion, so as you're coding, it will

262
00:20:00,680 --> 00:20:06,359
show that grayed-out text that
you can accept or reject. And then

263
00:20:06,400 --> 00:20:08,480
there's the chat feature, which is
a little bit more like some of the

264
00:20:08,519 --> 00:20:12,279
other ais that you may have interacted
with, where you go in and you

265
00:20:12,599 --> 00:20:17,960
type out either a question or something
that you want done, and Copilot

266
00:20:18,039 --> 00:20:21,960
can not just provide you with the
code snippet for that, but a small

267
00:20:22,039 --> 00:20:26,799
explanation of what it does, and
maybe if there's any elements within that code

268
00:20:26,799 --> 00:20:30,039
snippet that might need to be changed
before it gets inserted into your code,

269
00:20:30,359 --> 00:20:34,680
and then you can just insert that
straight into your code block, and it's the same

270
00:20:34,720 --> 00:20:38,079
as the inline code. It is
built into the editor you already use.

271
00:20:38,240 --> 00:20:44,920
So GitHub Copilot Chat for Visual
Studio came out at the end of last year,

272
00:20:44,960 --> 00:20:48,880
so people have been able to use
it since then. We just launched

273
00:20:48,960 --> 00:20:53,799
I believe, let me just double
check, GitHub Copilot Chat for JetBrains.

274
00:20:53,839 --> 00:20:57,160
It was on a wait list,
I believe, and it is

275
00:20:57,240 --> 00:21:03,240
now generally available as of this month, so like only a couple of

276
00:21:03,240 --> 00:21:07,240
weeks ago. So now you can
use it in Visual Studio and JetBrains

277
00:21:07,279 --> 00:21:11,720
IDEs as well. So you have
that GitHub Copilot Chat functionality now.

278
00:21:11,920 --> 00:21:17,519
So that's... yes. Now, I've used
ChatGPT to help me with

279
00:21:17,640 --> 00:21:22,559
programming things, and I've used GitHub
Copilot Chat, and the biggest difference that I

280
00:21:22,599 --> 00:21:27,519
can see is that GitHub Copilot Chat
knows more about your application right off the

281
00:21:27,559 --> 00:21:30,599
bat. You don't have to explain
yourself. You don't have to paste your

282
00:21:30,720 --> 00:21:37,000
entire project into ChatGPT like you
do with... So the two main advantages of

283
00:21:37,079 --> 00:21:41,079
GitHub Copilot Chat over, say, something
like ChatGPT, because a lot of people are

284
00:21:41,079 --> 00:21:44,319
saying, well, ChatGPT
is good. It does provide me with

285
00:21:44,400 --> 00:21:48,000
code suggestions. It's like, well, think about it. ChatGPT is being

286
00:21:48,039 --> 00:21:51,880
trained on everything available on the Internet, and there's code on the internet.

287
00:21:52,119 --> 00:21:56,799
You think Stack Overflow, even. So
ChatGPT has that context behind it.

288
00:21:56,839 --> 00:22:02,559
But GitHub Copilot Chat, because it's built
into your editor, it has all that

289
00:22:02,599 --> 00:22:07,400
context as I was kind of hinting
at before, which is it has the

290
00:22:07,440 --> 00:22:11,920
context of it being not just trained
on everything on the Internet, but having

291
00:22:11,000 --> 00:22:15,839
it trained specifically on code. So
it understands code a lot

292
00:22:15,880 --> 00:22:19,799
better and is optimized better for code. So that's the first point in that

293
00:22:19,839 --> 00:22:25,279
its training model is much more geared
towards understanding code. And secondly,

294
00:22:25,279 --> 00:22:30,519
it has so much more context.
So if you're using Visual Studio, any

295
00:22:30,559 --> 00:22:33,680
other things that you've written in the
file that you've already got opened, that's

296
00:22:33,759 --> 00:22:37,720
context for GitHub Copilot,
any other tabs that you have open,

297
00:22:37,759 --> 00:22:44,440
so other files that you have open
either in one of your JetBrains IDEs

298
00:22:44,599 --> 00:22:48,440
or in Visual Studio or VS Code. If you have other files open in

299
00:22:48,519 --> 00:22:53,319
tabs, GitHub Copilot can see
all of that. And that is all

300
00:22:53,359 --> 00:22:56,960
context, which is why when you're
writing things, you might be like,

301
00:22:56,000 --> 00:23:00,119
how does it know what a function is
called in another file? How does it

302
00:23:00,279 --> 00:23:04,920
know what that class is called?
It's because it has that context. And

303
00:23:04,960 --> 00:23:10,039
the other piece of context it has
too, is your entire project structure.

304
00:23:10,440 --> 00:23:14,440
So while it can't see what's in
each individual file, it can see your

305
00:23:14,440 --> 00:23:17,519
folder structure. So that means it
can do things like, you know,

306
00:23:17,839 --> 00:23:21,240
if you're writing a website, say, for example, it can point and

307
00:23:21,319 --> 00:23:25,279
have the you know, the proper
location pointer to your CSS file for example,

308
00:23:25,319 --> 00:23:27,599
because it knows where you've stored it, it knows what it's called,

309
00:23:27,680 --> 00:23:33,119
and all those kind of things.
And because it then has that folder structure,

310
00:23:33,799 --> 00:23:37,079
it can then see the extensions of
all the files, which means it

311
00:23:37,200 --> 00:23:40,799
knows what languages you're trying to write
your project in. So it has a

312
00:23:40,839 --> 00:23:45,839
lot more context. The only
context ChatGPT has is its training data,

313
00:23:47,000 --> 00:23:48,720
just everything on the Internet and the
prompt that you give it. That's it.

314
00:23:48,960 --> 00:23:52,319
Yeah. One of the real problems
I had with ChatGPT, and I'll

315
00:23:52,319 --> 00:23:57,119
tell you a little story just happened
to me. I have this little tool

316
00:23:57,200 --> 00:24:02,759
that I use to manage all of
our YouTube videos, and by ours, I

317
00:24:02,799 --> 00:24:06,599
mean, you know, the .NET
Rocks ones. Every show is published to

318
00:24:06,640 --> 00:24:15,920
YouTube, and the speaker publishing system
that we use doesn't set the flag that

319
00:24:17,000 --> 00:24:22,920
says it's not made for kids when
they publish a show to YouTube, and

320
00:24:22,960 --> 00:24:25,759
so you have to go in.
And the reason why that's important is because

321
00:24:25,799 --> 00:24:27,759
there are no comments. If it's
made for kids, comments are turned off,

322
00:24:27,799 --> 00:24:33,279
so people can't comment unless you set
that value. And it has to

323
00:24:33,319 --> 00:24:37,359
be set when you're creating the video, when you're uploading the video. The

324
00:24:37,400 --> 00:24:41,720
reason and the reason I know that
and I found it out is because I

325
00:24:41,759 --> 00:24:45,680
had this conversation with ChatGPT about
how to set that after the fact,

326
00:24:47,880 --> 00:24:51,440
and you know, it gave me
the solution and I said, here's my

327
00:24:51,640 --> 00:24:56,680
code boom, you know, and
it said, okay, well, if

328
00:24:56,680 --> 00:24:59,200
this is the case, then here
are the things that you can check.

329
00:25:00,000 --> 00:25:03,319
It could see my code, but it
still told me that I should do things

330
00:25:03,359 --> 00:25:07,240
that I was already doing in my
code, right, and it's like you

331
00:25:07,279 --> 00:25:11,599
can see this. You see that
I'm doing that. Why are you telling

332
00:25:11,640 --> 00:25:15,759
me this? And the reason is
because it didn't have an answer. The

333
00:25:15,799 --> 00:25:18,799
answer, which it didn't know, was
that it's a read-only value.

334
00:25:18,920 --> 00:25:23,960
You can only set it when you
upload a video in the API. You

335
00:25:23,960 --> 00:25:27,480
can only set that flag. It's available, and you can try to set it.

336
00:25:27,799 --> 00:25:32,119
But it didn't know that. Yeah. So ChatGPT is smart but doesn't
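Carl's finding maps to the YouTube Data API v3: the self-declared made-for-kids flag lives in the video's `status`, and, as he describes, it has to be supplied with the `videos.insert` upload call rather than toggled afterwards. Here is a minimal sketch of just the request body; the function name and title value are ours, not part of the API.

```python
def build_upload_request_body(title: str, made_for_kids: bool) -> dict:
    """Body for youtube.videos().insert(part='snippet,status', body=..., media_body=...)."""
    return {
        "snippet": {"title": title},
        "status": {
            "privacyStatus": "public",
            # Declared at upload time; with this set to False, comments
            # stay enabled on the published video.
            "selfDeclaredMadeForKids": made_for_kids,
        },
    }
```

Trying to flip the flag later, as Carl did, has no effect, which is the answer ChatGPT couldn't give him.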

337
00:25:32,160 --> 00:25:36,400
know everything, right? Yeah.
And there's a really, really great keynote at

338
00:25:36,400 --> 00:25:41,759
one of the NDCs recently on
the different ways that ChatGPT and

339
00:25:41,799 --> 00:25:45,480
all the AIs can get things wrong. So one of them is it just

340
00:25:45,519 --> 00:25:51,319
doesn't have the facts, so it
makes up something. Or the other thing

341
00:25:51,400 --> 00:25:57,200
is it doesn't have enough information and
while it's providing you with information, that

342
00:25:57,400 --> 00:26:03,480
is what they call grounded. So
it might be true for something else,

343
00:26:03,599 --> 00:26:06,640
but it's not actually true for the
context you're talking about. It might

344
00:26:06,680 --> 00:26:10,279
have already been true in the past
but not anymore, yeah, Or or

345
00:26:10,319 --> 00:26:12,319
it's true for something else. So
for example, when I asked ChatGPT

346
00:26:12,480 --> 00:26:15,440
to write a bio about me.
It gets some things right, and then

347
00:26:15,440 --> 00:26:21,079
it says that I have a PhD
in like chemistry, which is incorrect.

348
00:26:21,079 --> 00:26:22,440
I didn't know. You're so clever, I know, right, which is

349
00:26:22,519 --> 00:26:25,920
I mean, I don't have a PhD. I didn't do chemistry, I don't

350
00:26:25,960 --> 00:26:30,559
have a PhD. But that information is
grounded, because there's another Michelle Mannering out there

351
00:26:30,559 --> 00:26:33,960
who lives in New York who potentially
has a PhD. Right. So the information

352
00:26:34,359 --> 00:26:38,519
is grounded, so it has a
base in fact, but it is not

353
00:26:38,880 --> 00:26:44,519
actually correct for me, and ChatGPT
can't distinguish between what we're trying

354
00:26:44,519 --> 00:26:48,079
to ask about. The other example
she gave was, you know, write

355
00:26:48,119 --> 00:26:53,720
me a or give me some information
about this airline. I'm going to use

356
00:26:53,799 --> 00:26:59,920
Duke Airlines, and exactly, you know,
tell me about Duke Airlines. And ChatGPT

357
00:27:00,119 --> 00:27:03,200
would be going on about how
it was an amazing airline, it was

358
00:27:03,240 --> 00:27:08,720
always on time, top customer satisfaction. It flew like lots of routes around

359
00:27:08,759 --> 00:27:11,960
the world and had like a fantastic
and it doesn't exist, and you're like,

360
00:27:12,279 --> 00:27:15,559
wow, this sounds really good.
Doesn't exist, right, So that's

361
00:27:15,599 --> 00:27:21,000
the whole AI hallucination thing. The thing is,

362
00:27:21,039 --> 00:27:23,359
when it comes to it, when it
doesn't know the answer, it will just

363
00:27:23,400 --> 00:27:26,680
say, well, it should
work, right.

364
00:27:27,440 --> 00:27:30,599
It does. Sometimes it's like I
can't tell you any further, and then

365
00:27:30,640 --> 00:27:34,240
other times when it shouldn't, it doesn't
say that; it just makes stuff up.

366
00:27:34,400 --> 00:27:37,839
Yeah, it just makes stuff up. Or it says, well,

367
00:27:37,880 --> 00:27:41,079
clearly there's something wrong and you should
check all these things that you're already doing.

368
00:27:41,440 --> 00:27:44,720
But you know, a human would
figure out a way to go look

369
00:27:44,759 --> 00:27:47,359
for the answer. I was going
to say to you. What I think

370
00:27:47,440 --> 00:27:51,319
is interesting is that this is again
a personal opinion. But what I think

371
00:27:51,400 --> 00:27:55,640
is interesting is AI does mimic humans. Because you're saying a human will go

372
00:27:55,680 --> 00:27:57,839
and look up the information, you'll
find it out, not necessarily. I

373
00:27:57,839 --> 00:28:03,279
mean most people I know, if they don't know the answer,

374
00:28:03,319 --> 00:28:07,000
they'll make it up on the spot,
and because they are incredibly well worded, people

375
00:28:07,039 --> 00:28:11,759
believe them, so they're off,
you know, providing incorrect information as well.

376
00:28:12,440 --> 00:28:15,319
I mean I'm definitely one of those
people. I'm like, look,

377
00:28:15,359 --> 00:28:18,920
if I don't know the answer,
I will go and find out for you

378
00:28:18,960 --> 00:28:22,160
because I usually know where to
get it from. It's the best of

379
00:28:22,359 --> 00:28:25,680
being human and the worst of being
human at the same time. And that's

380
00:28:25,720 --> 00:28:27,960
what I think you're getting with ChatGPT,
right? It's trained on everything

381
00:28:27,960 --> 00:28:30,799
available on the internet, and you know, not everything on the internet is good.

382
00:28:32,440 --> 00:28:36,839
You do splash. So yeah,
I think you're going to get those times

383
00:28:37,200 --> 00:28:38,960
I'd just like to see, you know, at one point,

384
00:28:40,000 --> 00:28:41,759
you know, when it doesn't know
the answer, say well, aliens must

385
00:28:41,799 --> 00:28:45,880
have something to do with this,
right, you know, just like start

386
00:28:45,000 --> 00:28:49,519
spouting conspiracy theories. At least tell it
that this is how I want you

387
00:28:49,559 --> 00:28:53,599
to respond when you can't answer,
I suppose. So, yeah, you

388
00:28:53,640 --> 00:28:59,799
know, just making software do even
dumber things. At the same time,

389
00:28:59,839 --> 00:29:03,680
I've been talking to some PMs, and
I mean, it's not small productivity

390
00:29:03,720 --> 00:29:08,559
improvements. It's thirty percent or more
code in less time, higher quality,

391
00:29:08,880 --> 00:29:12,920
less remediation. Like these are big
numbers. Yeah, yeah, I've got

392
00:29:15,200 --> 00:29:18,400
some stats here, actually, that I just
pulled up. I can't remember exactly what company,

393
00:29:18,480 --> 00:29:26,279
But they said that their developers reported
ninety percent increase in writing better code,

394
00:29:26,799 --> 00:29:32,319
which I just think is insane.
It's kind of a subjective statement. It

395
00:29:32,440 --> 00:29:38,000
is a little bit. But fifty
percent more builds, which is measurable,

396
00:29:38,319 --> 00:29:44,799
and an eighty-four percent
increase in successful builds. I think

397
00:29:44,799 --> 00:29:48,079
it's really cool. That's a pretty big
number. It's a pretty big number. Again,

398
00:29:48,119 --> 00:29:51,599
a little bit more subjective, like
ninety percent of them feeling like

399
00:29:51,720 --> 00:29:55,720
more satisfied and fulfilled in their jobs. Well, and that's the interesting part

400
00:29:55,720 --> 00:30:00,119
about this, right, is that
here's this tool arguably doing some of your

401
00:30:00,200 --> 00:30:03,200
work for you, but you love
it. You love it, and what

402
00:30:03,279 --> 00:30:06,400
I think is really good, And
this is something that came out from some

403
00:30:06,400 --> 00:30:10,799
of the original research that we did
when GitHub Copilot was first introduced,

404
00:30:10,799 --> 00:30:15,000
and this is before all the chat
stuff too, is that developers reported that

405
00:30:15,039 --> 00:30:18,200
they, yes, GitHub Copilot
was writing a lot of code for them

406
00:30:18,440 --> 00:30:22,279
and you know, doing that,
but they actually had to think less when

407
00:30:22,319 --> 00:30:23,599
writing code, which, I was like, is not a bad thing. But

408
00:30:23,680 --> 00:30:27,720
they follow up that statement with I
have to think less, but when I

409
00:30:27,920 --> 00:30:33,079
do have to think, it's about
the fun stuff that sets off that spark

410
00:30:33,160 --> 00:30:36,480
that makes coding fun again. And
I think that's what I think is really

411
00:30:36,480 --> 00:30:40,839
great about a lot of these AIs. It takes away, you know, those blockers.

412
00:30:40,920 --> 00:30:44,279
You know, we often get writer's
block or coder's block, So it

413
00:30:44,400 --> 00:30:47,559
removes those blockers, so we're not
sitting there going, where do we start

414
00:30:47,599 --> 00:30:51,839
again, So it removes that side
of it. It writes a lot of

415
00:30:51,880 --> 00:30:55,480
that repetitive, mundane code for you
so that you're actually able to sit there

416
00:30:55,480 --> 00:30:59,920
as a developer and do the thing
that you were trained best to do,

417
00:31:00,119 --> 00:31:03,240
which is solve the problem, like
what is the best way to do something?

418
00:31:03,319 --> 00:31:10,759
If I give a problem or a
task or project to fifty developers,

419
00:31:11,240 --> 00:31:15,160
I'll probably get fifty different answers. Some of
them might be similar, but there's going

420
00:31:15,200 --> 00:31:18,440
to be different ways of writing things. We all know as developers there's multiple

421
00:31:18,440 --> 00:31:21,359
ways of doing something, and that's
what, you know, GitHub Copilot does

422
00:31:21,400 --> 00:31:25,920
as well. It will often give
you multiple suggestions or if you rephrase a

423
00:31:26,000 --> 00:31:29,240
question, you'll get a different response. So I think the best thing about

424
00:31:29,279 --> 00:31:33,720
being a developer is actually looking at what
we're doing and going what is the best

425
00:31:33,720 --> 00:31:38,599
way to actually solve this and what
is the best way to take this on?

426
00:31:38,759 --> 00:31:42,480
I think what's really exciting. This
is another personal opinion. We got

427
00:31:42,519 --> 00:31:47,160
to take a break, but we'll
be right back after these very important messages,

428
00:31:51,400 --> 00:31:53,480
and we're back. It's .NET Rocks.
I'm Carl Franklin, that's Richard

429
00:31:53,480 --> 00:32:00,680
Campbell, and that is Michelle Mannering, and we're talking about GitHub

430
00:32:00,759 --> 00:32:04,599
Copilot and AI and all those cool
things. And you were just saying that

431
00:32:05,319 --> 00:32:09,200
you think it's really great for removing
those blocks, those things that you were

432
00:32:09,240 --> 00:32:12,559
working on that you may have said, you know what, I'm going to

433
00:32:12,640 --> 00:32:15,680
sleep on it or whatever. Now
you can get there quicker. And I

434
00:32:15,759 --> 00:32:21,400
was just going to add that it's
also good for you to take inventory of

435
00:32:21,920 --> 00:32:27,319
write down the top ten things that
you haven't learned because you were afraid,

436
00:32:27,319 --> 00:32:30,839
too busy, or you thought it
was over your head, you could never

437
00:32:30,880 --> 00:32:34,960
figure that out. Whatever. Take
that list, go through it one by

438
00:32:35,000 --> 00:32:38,359
one with co pilot or you know, your favorite AI, and you'd be

439
00:32:38,400 --> 00:32:42,799
surprised at how fast you learn these
things. Yeah, and so yeah,

440
00:32:42,839 --> 00:32:46,759
what I was going to blow everyone's
mind with is my personal idea. So what I

441
00:32:46,880 --> 00:32:50,680
think a lot of these AIs and
things are going to do. Like if

442
00:32:50,680 --> 00:32:53,920
we look back through history, we
see where a lot of new technologies were

443
00:32:53,920 --> 00:32:59,759
coming in and people being really upset or
skeptical. And one I like to look at

444
00:32:59,839 --> 00:33:05,720
is writing. So when writing was
first like widely invented and adopted, people

445
00:33:05,880 --> 00:33:09,680
got really worried that we were all going
to become dumb because we no longer needed

446
00:33:09,680 --> 00:33:13,960
to remember the things we need to
remember. So if I needed to note

447
00:33:14,000 --> 00:33:15,880
something back then, I was there. You were there, you were there,

448
00:33:16,000 --> 00:33:20,400
Yeah, I was there. If
I needed to know something I had

449
00:33:20,400 --> 00:33:22,359
to remember, I could just write
it down. I mean even today,

450
00:33:22,480 --> 00:33:25,119
like I don't have to remember.
I mean I still remember because I've just

451
00:33:25,160 --> 00:33:29,599
got stupid photographic memory. But I
don't have to remember. You're absolutely right

452
00:33:29,640 --> 00:33:31,519
about that. Like I don't have
to remember my husband's phone number, my

453
00:33:31,559 --> 00:33:35,640
family's phone number, my work phone
number. I mean, I remember all

454
00:33:35,680 --> 00:33:37,119
of them because again, photographic memory. But we don't have to. We've

455
00:33:37,119 --> 00:33:40,000
got a phone that does that for
us. We were talking to Richard,

456
00:33:40,039 --> 00:33:47,720
we were talking to Grant Barrett from
A Way with Words, and he mentioned that

457
00:33:49,559 --> 00:33:55,519
in Aristotle's time, when you know, when people were writing handwriting, you

458
00:33:55,519 --> 00:34:00,640
know, these guys were against
it because they thought, well,

459
00:34:00,640 --> 00:34:02,799
that's cheating. If we write
it down, we can refer to it

460
00:34:02,880 --> 00:34:07,839
later. We should keep everything in
our mind. Exactly. But what actually

461
00:34:07,880 --> 00:34:13,239
happened, instead of everyone becoming dumb,
is this massive wave of innovation and invention,

462
00:34:13,400 --> 00:34:16,239
because all of a sudden, people's
minds were freed up from having to remember

463
00:34:16,280 --> 00:34:21,159
stuff, so that they could actually think creatively. And I think that's what's going to

464
00:34:21,199 --> 00:34:23,360
happen. Thinking about bigger problems
at a higher level, exactly. So I

465
00:34:23,440 --> 00:34:27,360
actually think that's what's going to happen
over the next five to ten years,

466
00:34:27,440 --> 00:34:31,079
is we're actually going to solve some
of the world's biggest problems because all of

467
00:34:31,119 --> 00:34:35,079
a sudden, people's time is freed
up, and you know, it's the fun

468
00:34:35,079 --> 00:34:37,239
stuff. We don't have to sit
there, you know, doing a lot

469
00:34:37,280 --> 00:34:40,519
of that mundane repetitive stuff. We
can do what those original developers were saying

470
00:34:40,519 --> 00:34:45,400
when we did the research at the start
of GitHub Copilot, which is it

471
00:34:45,440 --> 00:34:47,599
sets off this spark which makes things
fun again, right, so we can

472
00:34:47,679 --> 00:34:52,199
now spend more time actually, you
know, thinking about how to solve these

473
00:34:52,239 --> 00:34:57,119
problems. I think we're going to
get this big wave of just cool stuff

474
00:34:57,159 --> 00:34:59,480
happening. Again, this is all
personal opinion, but I think we're going

475
00:34:59,519 --> 00:35:01,119
to see a lot. We've already had
a wave. And Richard, I'd be interested

476
00:35:01,159 --> 00:35:04,920
in what you have to say about
this. But we've already had a big

477
00:35:04,960 --> 00:35:12,599
wave of people using it for nefarious
purposes, you know, and it's

478
00:35:12,639 --> 00:35:15,760
a game of cat and mouse and
catch-up with, you know, the

479
00:35:15,800 --> 00:35:20,440
authorities in the security business. But
not with GitHub Copilot. No, no,

480
00:35:20,440 --> 00:35:22,880
no, no no, I'm just
talking AI in general. Yeah,

481
00:35:22,079 --> 00:35:24,920
but you know, I was thinking
more in terms of the productivity boosts we

482
00:35:24,960 --> 00:35:28,599
got when we got the dot net
framework. Oh yeah, and you already

483
00:35:28,639 --> 00:35:32,199
had an encryption library. I already
had sockets and like you didn't have to

484
00:35:32,199 --> 00:35:36,320
write all that stuff. And all
the C plus plus programmers were like,

485
00:35:36,480 --> 00:35:39,639
Ah, I don't want that. That's so
lazy. Yeah, real programmers allocate and

486
00:35:39,679 --> 00:35:44,400
deallocate their memory manually. Come on. Like, I just think it's a

487
00:35:44,440 --> 00:35:46,599
really exciting time. Like, I know
a lot of people do get worried about

488
00:35:46,599 --> 00:35:50,320
the whole you know, what if
it takes my jobs and what if it

489
00:35:50,360 --> 00:35:53,880
does this? And you know what, I again, personal opinion, there

490
00:35:53,880 --> 00:35:58,800
are going to be some jobs that
get lost. That's just the reality

491
00:35:58,880 --> 00:36:01,079
of every industrial revolution; it's moved hard
over us, you know, since

492
00:36:01,079 --> 00:36:05,079
the dawn of time. But I
think with this one, what's going to

493
00:36:05,079 --> 00:36:07,960
be really interesting is the majority of
people aren't going to lose their jobs to

494
00:36:07,000 --> 00:36:10,199
AI. They're going to lose their
job to another human who's using AI.

495
00:36:10,960 --> 00:36:17,360
So the people who can understand
how these things work, so understand AI,

496
00:36:17,639 --> 00:36:22,719
understand the capabilities of AI, but
more importantly understand the limitations of AI,

497
00:36:22,920 --> 00:36:25,519
and therefore, yeah, therefore understand where they can add value as

498
00:36:25,519 --> 00:36:30,480
a human, they're the people who
are going to have jobs. It's exactly

499
00:36:30,519 --> 00:36:35,760
my position. Yeah, again, it's all personal
opinion. I may play the role of

500
00:36:35,760 --> 00:36:40,480
a Luddite for, you know,
for comedic effect, but I'm absolutely with

501
00:36:40,679 --> 00:36:45,079
you on that that you know this
is not something to be feared. You

502
00:36:45,119 --> 00:36:49,440
have to, you have to jump on
the horse and ride it. So if

503
00:36:49,480 --> 00:36:52,559
anyone is listening of like, okay, that sounds good, what do I

504
00:36:52,639 --> 00:36:54,599
do, like, actually go and
start using AI. If you're not using

505
00:36:54,679 --> 00:37:00,840
AI, you need to start using
it so you really understand what it can

506
00:37:00,880 --> 00:37:04,519
actually do, and then what it can't
do. And it's hit me that

507
00:37:05,199 --> 00:37:08,679
developers are uniquely suited for these kinds of
tools. Is that the reason

508
00:37:08,760 --> 00:37:13,039
GitHub Copilot took off first?
Yes, because we get it. It's

509
00:37:13,119 --> 00:37:16,079
just another kind of coding, to describe
a task to the tool. But

510
00:37:16,119 --> 00:37:20,320
you still have to evaluate it.
We're used to criticizing other people's code and

511
00:37:20,440 --> 00:37:23,639
our own, and the compiler always
does the same. And you're right,

512
00:37:23,760 --> 00:37:25,920
Like, you know, a lot
of people just think, well, you

513
00:37:25,960 --> 00:37:29,239
know, is it going to be
this age where GitHub Copilot just

514
00:37:29,280 --> 00:37:30,840
writes code and ships it off.
It's like, no, that's that's really

515
00:37:30,840 --> 00:37:34,360
stupid. Like, no one would do
that. Would you

516
00:37:34,679 --> 00:37:38,519
give a project to an intern and
ship it off before you know, before

517
00:37:38,599 --> 00:37:42,320
checking it? Right, it's kind
of the same thing with GitHub Copilot.

518
00:37:42,360 --> 00:37:44,400
In a way, we've kind of
got to treat it a little bit

519
00:37:44,440 --> 00:37:47,519
like an intern in that sense that
you know, we as developers are responsible

520
00:37:47,559 --> 00:37:50,960
for the code we ship at the
end of the day, regardless of how

521
00:37:51,039 --> 00:37:53,079
much of it was written by an
AI, which means as developers, we

522
00:37:53,199 --> 00:37:58,559
have to be responsible for checking the
code, checking the security of it,

523
00:37:58,639 --> 00:38:01,159
you know, doing all those kinds
of normal things that we would do before.

524
00:38:01,480 --> 00:38:05,960
And I like your analogy, Michelle, because I'm sure people are going

525
00:38:06,039 --> 00:38:10,320
to do that. They're just going
to get well, you know, back

526
00:38:10,360 --> 00:38:15,480
in the early days of ChatGPT
when a lawyer asked for citations from

527
00:38:15,519 --> 00:38:20,960
ChatGPT. Yeah, got fictional ones
and a fictional case. The software wasn't

528
00:38:20,960 --> 00:38:23,719
in trouble, the lawyer was,
the court doesn't care. But there was

529
00:38:23,760 --> 00:38:30,159
an article recently about someone who used
ChatGPT, and they didn't even bother changing

530
00:38:30,320 --> 00:38:32,800
like you know, it was like
hi, insert name here, like they

531
00:38:32,800 --> 00:38:37,360
didn't change any of that stuff.
And you're like, why? Like, what

532
00:38:37,559 --> 00:38:43,840
is wrong with you? How lazy are
you? Yeah, exactly. I have

533
00:38:43,920 --> 00:38:50,880
to bring up something here. My
friend and colleague Brian sent

534
00:38:50,920 --> 00:38:59,239
me this thing called Suno,
app dot suno dot ai, and it's

535
00:38:59,280 --> 00:39:06,880
apparently this thing that can generate music
and it sounds like anything that you could

536
00:39:06,880 --> 00:39:12,639
hear in a genre-specific radio station, right, complete with singing

537
00:39:12,719 --> 00:39:15,199
and lyrics and everything. And he
thought this was the coolest thing, and

538
00:39:15,239 --> 00:39:17,840
he said, so what do you think?
I let it go for about a week.

539
00:39:17,880 --> 00:39:21,960
He says, what, no comment?
And I'm like, okay. From a

540
00:39:21,960 --> 00:39:28,440
technology perspective, it's pretty impressive,
but from a music perspective, it's soulless

541
00:39:28,480 --> 00:39:35,559
and soul draining and just like crushing. Yeah, because because what it does

542
00:39:35,719 --> 00:39:39,960
is you know, if somebody actually
puts out a hit record, you know

543
00:39:40,000 --> 00:39:46,320
that's completely generated by AI. It
takes away all the integrity of real artists

544
00:39:46,320 --> 00:39:50,920
that are doing real things. And
it's you know, for the musicians out

545
00:39:50,960 --> 00:39:53,360
there, get busy, you've got
to tour. Yeah, you've got to

546
00:39:53,400 --> 00:39:57,159
let people know that you're doing it
for real. It's the same, right,

547
00:39:57,280 --> 00:39:59,239
right. You know, I mentioned
in my bio I'm a journalist,

548
00:39:59,360 --> 00:40:01,639
right, so naturally a lot
of people ask me, you know,

549
00:40:01,719 --> 00:40:06,559
are we using ChatGPT? And it's like, well, we're not. We know

550
00:40:06,639 --> 00:40:10,719
a lot of publications who are using
ChatGPT. We're basically told, like,

551
00:40:12,280 --> 00:40:15,840
in our sphere that, like, if
you use ChatGPT, you're gonna get fired,

552
00:40:16,280 --> 00:40:22,079
because, you're right, like,
a lot of it is generic. It's

553
00:40:22,119 --> 00:40:24,320
soulless. It's not like I mean
they said, like, you know,

554
00:40:24,400 --> 00:40:28,760
sure, go use it to like
get some you know, get some like

555
00:40:28,880 --> 00:40:30,719
a base of something, somewhere to start, but actually write something yourself, because

556
00:40:30,760 --> 00:40:36,360
otherwise it is still plagiarism, right? Or rather, you're plagiarizing something

557
00:40:36,400 --> 00:40:38,360
that just hadn't been written yet.
But that's why I think a lot of

558
00:40:39,000 --> 00:40:44,280
you know, with our news site. So I write for Upcoming. It's

559
00:40:44,280 --> 00:40:47,639
a gaming and like esports kind of
publication. So we kind of moved away

560
00:40:47,679 --> 00:40:51,760
a lot from the like the generic
news stuff, and so we write a

561
00:40:51,760 --> 00:40:57,719
lot of, you know, guides and
interview-type pieces, because it's much more.

562
00:40:58,199 --> 00:41:01,000
It's different content, but it
also is, like, that more heartfelt

563
00:41:01,000 --> 00:41:07,440
and thought provoking stuff, and like
writing a guide and exactly, and so

564
00:41:07,599 --> 00:41:12,079
writing a guide like I write a
guide, and this is why I love

565
00:41:12,079 --> 00:41:14,800
writing my technology guides too, which
you can read a lot of them on

566
00:41:14,840 --> 00:41:20,360
dev dot to, or like on DEV, is, I write a guide as I'm doing

567
00:41:20,400 --> 00:41:25,480
it, so, like, I've got
no more knowledge than what I would have had

568
00:41:25,519 --> 00:41:29,400
to do in the first place.
That's why I love writing beginner content as

569
00:41:29,400 --> 00:41:34,519
well. And AI can't do
that. It can't necessarily put itself there, and

570
00:41:34,760 --> 00:41:37,599
you could tell it to go,
hey, write this on the perspective of

571
00:41:37,639 --> 00:41:40,639
someone who's never done this, but
it's really hard for an AI to do

572
00:41:40,719 --> 00:41:45,159
that, because it doesn't quite understand that
human component. It's like when you watch

573
00:41:45,480 --> 00:41:51,079
a video or see some artwork.
I mean, there's so much AI generated

574
00:41:51,159 --> 00:41:54,760
artwork out there, and I can
probably bet that majority of people, regardless

575
00:41:54,800 --> 00:42:00,760
of whether they're working in technology or
not, can pick up AI content like,

576
00:42:02,119 --> 00:42:06,559
right. I've been able to pick
up DALL-E generated content because it has

577
00:42:06,559 --> 00:42:08,480
a look about it. Yeah,
exactly right. Do you notice this?

578
00:42:08,639 --> 00:42:12,239
Yeah? Yeah, there's a specific
look. Like I'm looking at this stuff

579
00:42:12,320 --> 00:42:15,639
now, going, that's a really cool
picture of, like, a Pikachu. But

580
00:42:15,920 --> 00:42:19,519
I know that's been AI generated.
And then you look into it, and it has been.

581
00:42:19,679 --> 00:42:21,800
and everyone's like, how do you
know? And you're like, you just

582
00:42:21,920 --> 00:42:25,280
know. Like, you just have this
look and a feeling about

583
00:42:25,320 --> 00:42:29,960
it where you're just like, it's
been AI generated. However, okay,

584
00:42:30,039 --> 00:42:34,079
However, the thing is is that
we keep complaining about this stuff not being

585
00:42:34,599 --> 00:42:37,840
as good as humans, and then
it keeps raising the bar and raising the

586
00:42:37,880 --> 00:42:40,840
bar, and you know, twenty
years from now, who knows, right,

587
00:42:42,239 --> 00:42:45,119
exactly right, Eventually it's going to
catch up with our complaints. And

588
00:42:45,559 --> 00:42:50,599
I am one of those people.
I fall into that category. Again, more personal preference

589
00:42:50,760 --> 00:42:54,920
and personal opinion. I love the
AI generated stuff because I think it's really

590
00:42:55,000 --> 00:43:01,920
good for people who don't necessarily have
that creative side. I want to have something different

591
00:43:02,000 --> 00:43:05,760
Like I'm thinking here, too. Like, I use a lot of it

592
00:43:05,800 --> 00:43:09,119
for my presentations, right because I
want something that is a little bit unique,

593
00:43:09,159 --> 00:43:14,000
a little bit different, and I
can't find exactly what I'm looking for.

594
00:43:14,119 --> 00:43:19,000
So being able to use some sort
of AI generated thing to create pretty

595
00:43:19,039 --> 00:43:21,320
much what I want is really good. Now, a lot of people say

596
00:43:21,360 --> 00:43:22,840
to me, oh, but that's
taking work away from artists. I'm like,

597
00:43:23,079 --> 00:43:25,519
well, on the flip side,
let's look at how much work I

598
00:43:25,559 --> 00:43:30,480
have on my computer and within my
office that is created by an actual artist

599
00:43:30,519 --> 00:43:35,280
that I commission work for. I
do a lot of that kind of stuff

600
00:43:35,280 --> 00:43:37,599
because those are the kind of, like,
stickers I have, and on my Twitch

601
00:43:37,679 --> 00:43:42,039
branding all that kind of stuff.
But I'm not going to go and commission

602
00:43:42,039 --> 00:43:45,679
an artist for every single PowerPoint slide
that I want done, Like that's really

603
00:43:45,719 --> 00:43:50,559
costly, and it's actually not a good
use of the artist's time either. Like,

604
00:43:50,599 --> 00:43:54,280
I've got an artist who is actually doing
a PowerPoint slide for me at the moment,

605
00:43:54,280 --> 00:43:58,400
but it's a reusable slide, like
I'll be using it for every presentation.

606
00:43:58,639 --> 00:44:00,440
Yeah, you're not taking work away. It's work that wouldn't otherwise

607
00:44:00,440 --> 00:44:02,679
be done. It's the work that
she wants to be doing, because she's telling

608
00:44:02,679 --> 00:44:06,199
me, she's like, I love
doing this stuff, but it takes a

609
00:44:06,199 --> 00:44:08,039
long time. Like I think she'd
been working on it for about three months

610
00:44:08,039 --> 00:44:12,840
now, like on a slide.
Like it's you know, like if we

611
00:44:13,440 --> 00:44:15,840
all wanted to go and commission it, like we just do way too much

612
00:44:15,920 --> 00:44:20,679
work. I just think it's
a bit insane. So I think having

613
00:44:21,320 --> 00:44:25,039
that good balance of like being able
to create things that are quick and easy

614
00:44:25,079 --> 00:44:30,199
that we wouldn't have gotten artists to
do, and then still having artists for

615
00:44:30,639 --> 00:44:35,000
the really high quality, good work
that we always want to do. Folks

616
00:44:35,000 --> 00:44:37,960
who have been using DALL-E to generate the
rough of what they want and then taking

617
00:44:37,960 --> 00:44:43,119
that to the artist saying I want
this but different, you know, in

618
00:44:43,159 --> 00:44:45,840
this style. It's way good for
a sketch. All right, So,

619
00:44:45,960 --> 00:44:47,800
Michelle, I have a hypothetical situation
for you. Let's say you have a

620
00:44:47,800 --> 00:44:52,880
little girl, and you know she's
five or six years old, about the

621
00:44:52,880 --> 00:44:58,239
time that you might want her to
start taking piano lessons and learning an instrument

622
00:44:58,320 --> 00:45:00,719
is a really good thing for a
child because you know, it can be

623
00:45:00,760 --> 00:45:06,320
a comfort in times of struggle. It expands your horizons, like it's

624
00:45:06,440 --> 00:45:09,760
just good for people to learn an
instrument, whether you plan on being a

625
00:45:09,800 --> 00:45:15,320
performer or not. And then the
kid is you know, hip with the

626
00:45:15,360 --> 00:45:17,599
tech and the YouTube and AIs, that
comes to you and says, but mom,

627
00:45:19,440 --> 00:45:22,760
I don't need to learn piano.
I have a program that does that.

628
00:45:22,119 --> 00:45:24,800
Okay. So I got two responses
for you. The first one is

629
00:45:24,800 --> 00:45:30,880
I have four nieces, so this
is very much a hot topic of

630
00:45:30,920 --> 00:45:35,760
conversation amongst my siblings. And second
thing, I actually had someone come to

631
00:45:35,800 --> 00:45:39,199
me when I was at State of
Open Con, which was less than a

632
00:45:39,199 --> 00:45:45,079
month ago. Young girl, she's
in year ten or eleven, and she

633
00:45:45,320 --> 00:45:50,360
literally said to me, why
should I study computer science and software

634
00:45:50,400 --> 00:45:52,320
engineering? I don't want to be
a developer because AIs are going to do

635
00:45:52,360 --> 00:45:58,320
everything for us, Like literally said
that. This isn't even a hypothetical anymore.

636
00:45:58,400 --> 00:46:02,079
This is young kids in high school
saying I don't want to study software

637
00:46:02,159 --> 00:46:06,000
engineering because an AI is going
to do it all for me. And

638
00:46:06,079 --> 00:46:09,559
my response, yeah, exactly.
And so my response to her was very

639
00:46:09,599 --> 00:46:14,000
similar to what we're already telling you on
this podcast is that like AI can't do

640
00:46:14,119 --> 00:46:19,000
everything. And she was like, oh, but it is just growing exponentially.

641
00:46:19,000 --> 00:46:22,239
I'm like, yeah, but if
you look at an exponential plot, it

642
00:46:22,320 --> 00:46:25,079
does go like this and plateaus out
right. We're still in the you know,

643
00:46:25,159 --> 00:46:29,320
the upward phase. But I think
and this is, again, personal opinion,

644
00:46:29,679 --> 00:46:32,679
I think we're going to hit a
point where we can't go any further because

645
00:46:32,719 --> 00:46:37,000
of it. Whether yeah, we're
basically there, but whether it's because of

646
00:46:37,320 --> 00:46:42,760
limitation. There isn't another order of
magnitude of data to be collected off the

647
00:46:42,760 --> 00:46:46,599
internet exactly. So it's like whether
we're going to be limited by, you know,

648
00:46:46,920 --> 00:46:52,519
literal physical data points, the speed
of electricity, whatever it's going to

649
00:46:52,559 --> 00:46:54,719
be. There's going to be a
limit point, right. What I think

650
00:46:54,960 --> 00:46:58,360
is that there will, Yeah,
there will be that limit. But the

651
00:46:58,440 --> 00:47:05,960
limit that I think is more important
is the public's rejection of AI,

652
00:47:06,159 --> 00:47:10,000
because it's going to produce so many
charlatans and so much phony stuff

653
00:47:10,719 --> 00:47:15,039
that people are going to want to
they're going to yearn

654
00:47:15,840 --> 00:47:22,679
for human to human communication, contact
sharing of arts and culture. Again,

655
00:47:22,880 --> 00:47:27,519
it's funny they said the same thing
about the telephone. It was going to

656
00:47:27,599 --> 00:47:30,159
limit human communication. I
think we're not there, Richard. I'm

657
00:47:30,159 --> 00:47:35,559
saying, like, in the future, I think we're going to have a

658
00:47:35,639 --> 00:47:39,599
situation where there'll be a rejection of
artificial anything. I agree to a certain degree,

659
00:47:39,639 --> 00:47:43,519
But my response to this girl was
very much what we're saying here, like a

660
00:47:43,639 --> 00:47:45,920
counter to everything: you as the developer are
still in charge, like we still need

661
00:47:45,960 --> 00:47:49,800
to direct all those kind of things. So after a while she was like,

662
00:47:49,840 --> 00:47:52,039
I actually want to study computer science
now. So she got really excited

663
00:47:52,400 --> 00:47:58,480
about the prospect of being able to
then use AI as opposed to being wary

664
00:47:58,519 --> 00:48:02,079
of it. And to your point
about like, you know, the public

665
00:48:02,159 --> 00:48:07,920
rejection, I don't necessarily think it's
going to be like a public rejection over

666
00:48:07,960 --> 00:48:09,519
all of AI, because there's so
many people in tech that are just like,

667
00:48:09,519 --> 00:48:13,000
oh my gosh, this stuff is so
cool. What I think we're going

668
00:48:13,079 --> 00:48:16,440
to get again more personal opinion,
is we're going to have the Ikea model,

669
00:48:16,920 --> 00:48:21,119
which is there's gonna be a lot
of stuff out there that's built by

670
00:48:21,159 --> 00:48:22,880
AI, and you
can tell it's been built by AI.

671
00:48:23,239 --> 00:48:28,880
But then there's going to be certain
pieces where you're going to go that was

672
00:48:28,920 --> 00:48:32,199
built by a human and therefore it
is bespoke, it is custom, it

673
00:48:32,280 --> 00:48:37,239
is more expensive. So same with
the Ikea furniture thing, that's why I

674
00:48:37,320 --> 00:48:39,519
use Ikea. So many people have
Ikea furniture in the house, right,

675
00:48:39,599 --> 00:48:44,280
it does the job. But if
there are no more furniture makers because they

676
00:48:44,320 --> 00:48:46,039
all think, well why should I
learn how to make furniture if I can

677
00:48:46,079 --> 00:48:50,039
just buy it here? And then
I think, because there will be a

678
00:48:50,119 --> 00:48:53,239
lack of those people, the ones
that are left who are still

679
00:48:53,239 --> 00:48:57,960
able to do that will be sought
out. So it's like a renaissance,

680
00:48:58,079 --> 00:49:01,119
right, exactly, it happened because of the press
and all of that. I think there

681
00:49:01,119 --> 00:49:06,519
may be another renaissance, but in
this situation, not a rejection

682
00:49:06,599 --> 00:49:10,599
of technology, but just a recognition
of human value. Yeah, basically,

683
00:49:10,679 --> 00:49:14,440
And so like you might have lots
of Ikea furniture in your house,

684
00:49:14,440 --> 00:49:16,599
but there might be one or two
pieces that aren't Ikea that you spend a

685
00:49:16,639 --> 00:49:20,320
lot of money on. Now, as
people, we usually can't, you know,

686
00:49:20,599 --> 00:49:23,880
generally, and especially with the cost of living, we can't afford to have every piece

687
00:49:23,960 --> 00:49:28,960
in our home built by you know, a beautiful furniture maker. In the

688
00:49:29,000 --> 00:49:31,480
same way we can't have every piece
of artwork you know, in our house

689
00:49:32,079 --> 00:49:36,840
because it's too expensive, but there
will be those, you know, really

690
00:49:36,840 --> 00:49:40,760
bespoke pieces. And I think you've
hinted at something really important, which is that those

691
00:49:40,880 --> 00:49:46,159
people who really understand that in their
craft will get better at it so that

692
00:49:46,320 --> 00:49:51,920
it has something different that AI doesn't. And I use that example because my

693
00:49:51,960 --> 00:49:55,599
brother is a photographer and a videographer
and he loves the AI stuff. He

694
00:49:55,760 --> 00:49:59,280
was like, you know, it
makes my job a lot, you know,

695
00:49:59,360 --> 00:50:00,960
a lot of the things automated a
lot easier. He's like, but

696
00:50:01,000 --> 00:50:06,559
it's also made me a better photographer
and a better videographer because now I'm learning

697
00:50:06,599 --> 00:50:12,519
techniques and ways of shooting that's completely
different that AI cannot do. And so

698
00:50:12,599 --> 00:50:17,239
I think, you know, the
smart people and the people who you know,

699
00:50:17,400 --> 00:50:22,360
have that more optimistic viewpoint are
going to look at their craft and

700
00:50:22,400 --> 00:50:27,400
say, how do I add a
bespoke human element that people are going to

701
00:50:27,440 --> 00:50:30,920
covet, and that people are
going to be willing to

702
00:50:30,000 --> 00:50:34,199
pay the extra money for it.
And you can only see the horizon if

703
00:50:34,239 --> 00:50:36,920
you get on the horse and ride. That's exactly right, And I think

704
00:50:36,920 --> 00:50:40,800
it doesn't really matter, I think,
what kind of profession you're in. I

705
00:50:40,840 --> 00:50:45,280
think that's going to be across the
board, whether you're a developer writing,

706
00:50:45,360 --> 00:50:49,519
you know, building a website,
whether you're an artist, you know,

707
00:50:50,400 --> 00:50:53,599
creating artwork, whether again you're you
know, even physical things like furniture and

708
00:50:53,599 --> 00:50:57,000
stuff like that, where, I mean, we're already seeing that,

709
00:50:57,119 --> 00:51:00,559
because you know, there's robots that
can manufacture furniture, like, at ridiculous

710
00:51:00,559 --> 00:51:06,239
speeds. But I still want a
nice bespoke custom one. You want a story

711
00:51:06,360 --> 00:51:10,119
rather. Yeah, AI has no
story. Oh I generated this on AI.

712
00:51:10,239 --> 00:51:14,360
Okay, but what's the story behind
it? Yeah? The story.

713
00:51:14,719 --> 00:51:19,760
The story is the prompt you wrote. Yeah, a lot of things like

714
00:51:19,960 --> 00:51:23,559
special occasions and things like that.
People want that human touch. Weddings,

715
00:51:23,679 --> 00:51:29,199
anniversaries, birthdays, those kind of
things where people are going to want that

716
00:51:29,280 --> 00:51:34,880
extra that extra quality and that extra
human touch and knowing that that it's been

717
00:51:35,199 --> 00:51:39,159
human touched. Yeah, yeah,
very very good. I agree. Oh

718
00:51:39,199 --> 00:51:45,159
my god, what else should we
talk about, Richard? I don't know.

719
00:51:45,280 --> 00:51:46,800
I think there was some copilot in
here somewhere. There was some

720
00:51:46,840 --> 00:51:52,119
copilot. I know, we got off
on that tangent. It's good because I

721
00:51:52,159 --> 00:51:55,320
think it does prompt a broader discussion
around everything. You know, I go

722
00:51:55,400 --> 00:51:58,960
into the office, we need to
keep that discussion going. Yeah, And

723
00:51:59,000 --> 00:52:01,159
I go into the startup office
now and people come up to me and

724
00:52:01,199 --> 00:52:04,719
go, hey, you're into this
AI stuff. What do you think about

725
00:52:04,719 --> 00:52:07,599
this, this, and this, nothing
to do with, you know, GitHub

726
00:52:07,639 --> 00:52:13,159
Copilot. But because it's just
so ingrained in every conversation now, I

727
00:52:13,239 --> 00:52:16,119
just think it's, you know,
it's touching everyone. That is why

728
00:52:16,199 --> 00:52:22,320
we're getting this real just uproar in
society about like everything happening, because it's

729
00:52:22,639 --> 00:52:25,760
it's all, it's everywhere. You
cannot turn away, and you can't.

730
00:52:25,840 --> 00:52:29,119
You can't get away from it,
essentially unless you want to go live on

731
00:52:29,119 --> 00:52:31,840
a deserted island by yourself. Yeah. I know, a lot of us

732
00:52:31,880 --> 00:52:36,000
who are too immersed in it right
now. I think we are immersed in it

733
00:52:36,039 --> 00:52:37,719
in the industry. I do think
it's just another set of tools. The

734
00:52:37,719 --> 00:52:43,880
biggest problem that it has is the
name. We've spent sixty years making crazy

735
00:52:44,039 --> 00:52:47,079
movies where the AI tries to kill
everybody. So gee, I wonder why

736
00:52:47,119 --> 00:52:51,280
people are jumpy about this name.
Well, it was like, when I

737
00:52:51,280 --> 00:52:53,199
did my PubConf talk last year. I was like, didn't we learn

738
00:52:53,239 --> 00:53:00,480
anything from like Terminator one or two
or three or four or five? You

739
00:53:00,480 --> 00:53:05,440
know, you're right, like,
but the technology in those movies has nothing

740
00:53:05,440 --> 00:53:07,440
to do with the technology we're making
here except for the name. It's well

741
00:53:07,480 --> 00:53:13,079
said, Richard. Yeah exactly. And
I think what's interesting is I love looking

742
00:53:13,119 --> 00:53:19,519
at some science fiction movies, because
it shows what people's minds were actually thinking

743
00:53:19,599 --> 00:53:23,760
would potentially be capable of in, you know, twenty or thirty years' time. And

744
00:53:23,800 --> 00:53:27,880
I think, like, on some of the
capabilities, people always come to me and say,

745
00:53:27,880 --> 00:53:30,119
oh, how far are we away
from getting like Iron Man's you know,

746
00:53:30,360 --> 00:53:34,800
Jarvis? And my response there is, like, you know, that kind of

747
00:53:34,880 --> 00:53:39,079
capability is already available. While it
might not be commercially available in the sense

748
00:53:39,079 --> 00:53:44,000
that you and I can't just go
out and buy our own Jarvis, a

749
00:53:44,039 --> 00:53:47,360
lot of those functionalities are already there. And you know OpenAI has already

750
00:53:47,400 --> 00:53:52,920
put ChatGPT. I think it
was five they put into a humanoid robot.

751
00:53:52,320 --> 00:53:55,800
I don't know if you saw the
new NVIDIA press video from their big

752
00:53:55,840 --> 00:54:00,159
conference, whoever was there. But
you know, robots like they're just it's

753
00:54:00,159 --> 00:54:04,039
like we created humanoid robots and,
you know, stuffed them with all the

754
00:54:04,800 --> 00:54:07,679
stuff from the Internet and you're like, okay, I can't see anything going

755
00:54:07,719 --> 00:54:13,320
wrong at all. Yeah, all
right, So ride the horse, get

756
00:54:13,320 --> 00:54:16,679
GitHub Copilot, get GitHub Copilot
Chat, use it. It's good

757
00:54:17,320 --> 00:54:22,480
and, you know, go and find your
way in, you know, in the

758
00:54:22,639 --> 00:54:25,440
in the wave of the future.
Find your spot on that wave and ride

759
00:54:25,519 --> 00:54:34,159
it. That's it, and
stop arguing with scenes from movies because those

760
00:54:34,159 --> 00:54:38,840
aren't real. Well, and you
know, anthropomorphizing software is never good.

761
00:54:39,039 --> 00:54:42,519
No, it's not. You're
right, Richard. But it is easy,

762
00:54:42,719 --> 00:54:45,480
right, just like it's easy for
us to oh, I won't get

763
00:54:45,519 --> 00:54:51,559
into religion, but it's easy
for us to anthropomorphize everything that is a

764
00:54:51,599 --> 00:54:55,239
force that we don't understand. Humans
prefer to, most of the time. My

765
00:54:55,360 --> 00:54:59,920
concern is we're supposed to be the
professionals and the regular mortals are being confused.

766
00:55:00,159 --> 00:55:06,599
It's the regular mortals who ask you
those questions, the muggles do. The

767
00:55:06,679 --> 00:55:09,000
muggles come up to you and ask
questions, Richard. That's a good analogy,

768
00:55:09,519 --> 00:55:16,800
like Quidditch. Yeah, and a
wizard. Well, we're the wizards.

769
00:55:17,239 --> 00:55:22,599
We're having a lot of fun and
it's useful for us to be responsible absolutely

770
00:55:23,079 --> 00:55:28,159
and to help these, these ordinary people
who are freaked out because they've watched too

771
00:55:28,159 --> 00:55:30,800
many movies. Yeah, and they think we're
playing with the same thing, which we

772
00:55:30,880 --> 00:55:35,360
know we're not. We're not,
but they don't know. Yeah. And

773
00:55:35,440 --> 00:55:37,599
on that note, Uh, Michelle, where are you going to be next?

774
00:55:37,679 --> 00:55:42,039
Or what's in your inbox? Oh? So many things coming up,

775
00:55:42,079 --> 00:55:45,760
but it's really exciting. We have
kind of like our GitHub road show happening

776
00:55:45,760 --> 00:55:51,360
at the moment. So GitHub
Galaxy is really exciting, a bit more

777
00:55:51,480 --> 00:55:54,599
geared towards enterprise. So if you
go on to if you just google GitHub

778
00:55:54,639 --> 00:56:01,039
galaxy, but it's galaxy dot GitHub
dot com. And there's some really exciting

779
00:56:01,119 --> 00:56:07,199
stuff that we're doing all around the
world. We've got galaxies happening all over

780
00:56:07,280 --> 00:56:10,000
APAC, all of the Americas,
and all over Europe. So I'll be

781
00:56:10,119 --> 00:56:15,119
doing the Melbourne one, which is
happening in May. I'll be in the

782
00:56:15,159 --> 00:56:20,760
Hong Kong one which is in April, and I'll be in the Singapore and Thailand ones

783
00:56:20,880 --> 00:56:24,119
that are also happening towards the end
of May. Then there's a couple of

784
00:56:24,239 --> 00:56:30,480
NDC's coming up soon which is really
exciting. We'll be there. Serverless Days.

785
00:56:30,559 --> 00:56:37,239
I'm doing Serverless Days in Sydney and
Auckland for those people over down under

786
00:56:37,280 --> 00:56:42,440
with me, so you come see
me there, and I think they're kind

787
00:56:42,440 --> 00:56:45,599
of the main ones that are coming
up. And then I'll be at NDC

788
00:56:45,760 --> 00:56:49,079
also again, which would be really
all right. We'll be there, yeah,

789
00:56:49,199 --> 00:56:53,440
I'll be looking. Yeah. Well, Michelle, thank you. This

790
00:56:53,480 --> 00:56:57,400
has been a great conversation. And
I know we didn't stick to GitHub

791
00:56:57,519 --> 00:57:00,039
Copilot, but I think our
listeners like that about this show. We

792
00:57:00,239 --> 00:57:06,199
tangentize, don't we? Yes, that's
not a word. Okay, sometimes we

793
00:57:06,280 --> 00:57:10,199
go places. Tangenturize. This is
all right, Michelle, thanks a lot,

794
00:57:10,280 --> 00:57:14,440
and for you, dear listener.
We'll see you next time on dot

795
00:57:14,519 --> 00:57:38,360
net rocks. Dot net Rocks is
brought to you by Franklins.Net and produced

796
00:57:38,360 --> 00:57:45,000
by Pop Studios, a full service
audio, video and post production facility located

797
00:57:45,000 --> 00:57:49,639
physically in New London, Connecticut,
and of course in the cloud online at

798
00:57:49,679 --> 00:57:53,480
pwop dot com. Visit our website
at D O T N E T R

799
00:57:53,519 --> 00:57:59,320
O C K S dot com for
RSS feeds, downloads, mobile apps,

800
00:57:59,480 --> 00:58:02,440
comments, and access to the full
archives going back to show number one,

801
00:58:04,000 --> 00:58:07,800
recorded in September two thousand and two. And make sure you check out our

802
00:58:07,840 --> 00:58:12,079
sponsors. They keep us in business. Now go write some code, See

803
00:58:12,119 --> 00:58:24,639
you next time.
