1
00:00:01,080 --> 00:00:05,679
How'd you like to listen to .NET
Rocks with no ads? Easy! Become

2
00:00:05,679 --> 00:00:09,839
a patron. For just five dollars a
month, you get access to a private

3
00:00:10,000 --> 00:00:14,439
RSS feed where all the shows have
no ads. Twenty dollars a month will

4
00:00:14,439 --> 00:00:19,039
get you that and a special .NET
Rocks patron mug. Sign up now at

5
00:00:19,039 --> 00:00:23,839
patreon dot dotnetrocks dot com.
Hey, Carl and Richard here with your

6
00:00:23,960 --> 00:00:28,839
twenty twenty four NDC schedule. We'll
be at as many NDC conferences as possible

7
00:00:28,839 --> 00:00:33,000
this year, and you should consider
attending no matter what. NDC Oslo is

8
00:00:33,000 --> 00:00:38,479
happening June tenth through the fourteenth.
Get your tickets at ndcoslo dot com.

9
00:00:38,520 --> 00:00:43,359
The Copenhagen Developers Festival happens August twenty
sixth through the thirtieth. Early bird discount

10
00:00:43,439 --> 00:00:51,039
ends April twenty sixth. Tickets at
cphdevfest dot com. NDC Porto is happening October

11
00:00:51,039 --> 00:00:56,640
fourteenth through the eighteenth. The early
bird discount ends June fourteenth. Tickets at

12
00:00:56,759 --> 00:01:12,719
ndcporto dot com. And we'll see
you there, we hope. Hey,

13
00:01:12,719 --> 00:01:17,400
guess what, it's .NET Rocks!
I'm Carl Franklin. And I'm Richard Campbell. And

14
00:01:17,879 --> 00:01:19,640
Man. Next week we're going to
be at Build. Yes we will.

15
00:01:19,760 --> 00:01:23,239
Yeah. Actually, I think this show
comes out while we're at Build. Okay,

16
00:01:23,319 --> 00:01:26,280
so we're at Build now. Time shifting
is hard. Yeah, how's your

17
00:01:26,280 --> 00:01:30,760
Build going? It's great! Gosh, I love it. I had real problems with

18
00:01:30,920 --> 00:01:34,959
travel. Yeah, recording amazing web
shows. Yeah, everything's fine. And

19
00:01:37,280 --> 00:01:42,359
while we're here at Build. Next
week, we're going to be interviewing Scott

20
00:01:42,359 --> 00:01:47,599
Hanselman for our nineteen hundredth episode.
Right, can you believe it? What

21
00:01:47,640 --> 00:01:51,319
could go wrong? It's gonna be
awesome. It's going to be like the

22
00:01:51,400 --> 00:01:55,079
last one was Shawn Wildermuth. Three
old white guys trying to talk about code.

23
00:01:55,200 --> 00:01:57,319
Who knows! Well, he's a VP
now, so the conversation will be

24
00:01:57,359 --> 00:02:00,480
a little different. Well, he's
got, you know, get off my

25
00:02:00,599 --> 00:02:04,120
lawn stories. I'm sure, so
we're probably gonna hear it. Oh oh

26
00:02:04,200 --> 00:02:08,560
yeah, maybe a few. All
right, Well, anyway, let's get

27
00:02:08,599 --> 00:02:20,199
right into Better Know a Framework.
What do you got? Well, if

28
00:02:20,240 --> 00:02:23,800
you remember the comment you read last
week, the person commenting used the word

29
00:02:23,840 --> 00:02:30,039
Gubbins. Gubbins. First time I
heard Gubbins was interviewing Jon Skeet for DNR

30
00:02:30,120 --> 00:02:36,319
TV, and it turns out a
tweet by Skeet is my Better Know a Framework

31
00:02:36,360 --> 00:02:40,439
today. Awesome! So he says,
I've been bitten by the Raspberry Pi bug.

32
00:02:40,560 --> 00:02:46,199
Now, the photo below is an
X-Touch Mini, which is a

33
00:02:46,199 --> 00:02:51,840
Behringer USB mixer. It's about
one hundred bucks, with a Raspberry Pi Zero

34
00:02:51,919 --> 00:02:55,280
2 W hacked into the case.
Oh man, he opened the case and

35
00:02:55,280 --> 00:03:00,439
he put this little thing inside. Just
supply power and it controls whatever digital mixer

36
00:03:00,879 --> 00:03:07,520
is configured, all over Wi-Fi,
no separate device required. So he has

37
00:03:07,560 --> 00:03:12,960
basically modified this thing, giving it
some smarts, and he put into

38
00:03:13,199 --> 00:03:16,520
it an app that he wrote called
DigiMixer onto the Raspberry Pi. So

39
00:03:17,039 --> 00:03:21,039
if you look in the links,
so we have a link to his post

40
00:03:21,199 --> 00:03:27,240
that's the Better Know a Framework eighteen ninety
nine dot pop dot E. And then

41
00:03:27,280 --> 00:03:31,120
we also have a link to the
Behringer X-Touch, right? Ninety nine bucks,

42
00:03:31,280 --> 00:03:36,400
and that has a USB interface that
he's now just looping into the Raspberry Pi

43
00:03:36,439 --> 00:03:39,240
that's in the case. He's powering
the Raspberry Pi with that, and the Raspberry

44
00:03:39,240 --> 00:03:44,599
Pi. If you look at the
link on Amazon, it's twenty bucks.

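For the curious: Jon Skeet's actual DigiMixer is a C# project, but the shape of the hack, reading the X-Touch Mini's knobs as USB MIDI and relaying them to a digital mixer over the network, can be sketched in a few lines of Python. This is a hedged illustration, not Skeet's code: the mido library is real, but the port name, mixer address, and the X Air-style OSC fader path are all assumptions.

    import socket
    import struct

    import mido  # pip install mido python-rtmidi

    MIXER_ADDR = ("192.168.1.50", 10024)  # hypothetical mixer IP and OSC port

    def osc_message(address: str, value: float) -> bytes:
        # Minimal OSC encoding: null-padded address, ",f" type tag, big-endian float32.
        def pad(b: bytes) -> bytes:
            return b + b"\x00" * (4 - len(b) % 4)
        return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    # The X-Touch Mini shows up as an ordinary USB MIDI device; its knobs arrive
    # as control-change messages (control = which knob, value = 0..127).
    with mido.open_input("X-TOUCH MINI") as port:  # port name is an assumption
        for msg in port:
            if msg.type == "control_change":
                level = msg.value / 127.0                  # normalize to 0.0..1.0
                path = "/ch/%02d/mix/fader" % msg.control  # assumed OSC path
                sock.sendto(osc_message(path, level), MIXER_ADDR)
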
45
00:03:44,840 --> 00:03:47,479
Yeah, and it's a little board. Those little Pi Zeros. Yeah,

46
00:03:47,560 --> 00:03:51,199
yeah, they're impressive. I haven't
done anything with the Raspberry Pi in a

47
00:03:51,199 --> 00:03:54,719
long time. And wow, they're
getting powerful and small. They got a

48
00:03:54,759 --> 00:04:00,000
lot of horsepower. Yeah. I
run a pair of them as my Pi-holes,

49
00:04:00,560 --> 00:04:04,599
as my DNS sink for all things
ads. Just sounds wrong, but let me

50
00:04:04,639 --> 00:04:06,960
tell you, Like, it's one
thing to run an ad blocker on your

51
00:04:08,000 --> 00:04:11,000
browser, Yeah, it's another thing
to run a Pi-hole in your house

52
00:04:11,400 --> 00:04:15,400
because that means that the stuff that
my television gets gets blocked. All that

53
00:04:15,439 --> 00:04:20,120
stuff gets blocked automatically. Like you
have so much more control over what data

54
00:04:20,279 --> 00:04:27,399
is going out from your home.
This is really circling back to the very

55
00:04:27,439 --> 00:04:31,240
first interview I did with you,
Richard Campbell, on .NET Rocks show sixty-nine

56
00:04:31,279 --> 00:04:35,120
before you were co-host and you
were talking. We were talking about the

57
00:04:35,360 --> 00:04:41,000
TiVo and you say, yeah,
TiVo doesn't work, and ReplayTV, in Canada because

58
00:04:41,040 --> 00:04:44,639
we don't get metadata up here.
Yeah, and apparently that's still a problem.

59
00:04:44,839 --> 00:04:47,600
Well no, I mean, so
what did I do? I ran

60
00:04:47,639 --> 00:04:54,040
an instance of Linux as a DNS
server to lie to the ReplayTV to

61
00:04:54,879 --> 00:05:00,680
run my loader of metadata, so
that I had the ReplayTV DB populated with

62
00:05:00,680 --> 00:05:02,319
the right stuff because it's just some
digital recorder. But you know, this

63
00:05:02,399 --> 00:05:06,959
is the joys of being a programmer
for yourself. Yep, totally. Yeah.

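A minimal sketch of the DNS trick in both stories, the Pi-hole and lying to the ReplayTV: a tiny UDP resolver that answers blocklisted names with 0.0.0.0 and forwards everything else to a real resolver. The blocklist entries are hypothetical, and a real Pi-hole does far more; this just shows why a Pi is plenty of machine for the job.

    import socket

    BLOCKLIST = {"ads.example.com", "tracking.example.net"}  # hypothetical
    UPSTREAM = ("8.8.8.8", 53)

    def qname(packet: bytes) -> str:
        # The queried name starts at byte 12 as length-prefixed labels.
        labels, i = [], 12
        while packet[i] != 0:
            n = packet[i]
            labels.append(packet[i + 1:i + 1 + n].decode())
            i += n + 1
        return ".".join(labels)

    def blocked_reply(packet: bytes) -> bytes:
        # Copy the transaction id, flag it as a response, answer with A 0.0.0.0.
        header = packet[:2] + b"\x81\x80\x00\x01\x00\x01\x00\x00\x00\x00"
        question = packet[12:packet.index(b"\x00", 12) + 5]  # name + type + class
        answer = (b"\xc0\x0c"                # compression pointer back to the name
                  + b"\x00\x01\x00\x01"      # type A, class IN
                  + b"\x00\x00\x00\x3c"      # TTL 60 seconds
                  + b"\x00\x04" + bytes(4))  # 4-byte address: 0.0.0.0
        return header + question + answer

    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("0.0.0.0", 53))  # needs root; point the house's DNS here
    while True:
        packet, client = server.recvfrom(512)
        if qname(packet) in BLOCKLIST:
            server.sendto(blocked_reply(packet), client)
        else:  # pass legitimate lookups through to a real resolver
            relay = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            relay.sendto(packet, UPSTREAM)
            server.sendto(relay.recvfrom(512)[0], client)
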
64
00:05:08,000 --> 00:05:10,959
Well, ah, that doesn't work. Let
me write a little app for that.

65
00:05:11,040 --> 00:05:12,759
I'll just make a thing with the
stuff, and then we're done.

66
00:05:12,800 --> 00:05:15,959
Make a thing with this stuff and
you're done. I you know, I'll

67
00:05:15,959 --> 00:05:18,560
tell you another story just as an
aside before we get into this, because

68
00:05:18,600 --> 00:05:20,920
of course, now that we've moved
up to the coast, I'm moving like

69
00:05:20,959 --> 00:05:26,160
all of my services. So I
met my new optometrist this week, okay,

70
00:05:26,240 --> 00:05:30,360
and I'm chatting with the front desk
lady, the lady actually runs

71
00:05:30,399 --> 00:05:36,519
the show, and we end up
in a conversation about her husband's hearing problems

72
00:05:38,319 --> 00:05:41,639
and how he's on the list for
a cochlear implant, which you know takes

73
00:05:41,639 --> 00:05:43,720
a little time to get sorted
out. It's like, this

74
00:05:43,839 --> 00:05:46,199
is not hearing-aid bad, this
is you cannot hear. You can't hear.

75
00:05:46,319 --> 00:05:51,759
And I showed her Live Transcribe on
Android because she just didn't know.

76
00:05:53,360 --> 00:05:56,480
Wow, right, it's like right
now she's like, I'm frustrated trying to

77
00:05:56,480 --> 00:05:58,240
talk to him because I have to
shout at him, and it says,

78
00:05:58,240 --> 00:06:01,439
well, that automatically changes your mood, and you know you're frustrated. And

79
00:06:01,480 --> 00:06:03,879
I literally just fired it up on
the phone and put it in front of

80
00:06:03,879 --> 00:06:08,600
her and we just kept talking,
like look down, everything we're saying is

81
00:06:08,639 --> 00:06:11,800
there. Like, let me put
this on your phone. When you go home

82
00:06:11,839 --> 00:06:15,079
tonight. Just put it in front
of him and start talking and see if

83
00:06:15,079 --> 00:06:17,759
you don't figure it out right away. That's so cool. So I got

84
00:06:17,800 --> 00:06:21,560
a call from her yesterday. It's
like you've changed everything. That is awesome.

85
00:06:21,639 --> 00:06:26,279
I'm also really aware like cochlear implants
are only so good, Like they're

86
00:06:26,319 --> 00:06:30,680
expectations for what an implant's going
to be. And you know me,

87
00:06:30,879 --> 00:06:33,600
you see a path ahead of someone
where it's going to be months of sadness.

88
00:06:33,680 --> 00:06:36,360
It's like, look, there are
other ways to address this. Yeah,

89
00:06:36,360 --> 00:06:40,680
Like if there's anything we've got going
for us living in this technology world,

90
00:06:40,759 --> 00:06:44,199
it's to take that knowledge to folks who
don't have it. You're so right,

91
00:06:44,279 --> 00:06:46,800
she owned the phone, she didn't
know the product existed, and it's

92
00:06:47,079 --> 00:06:50,879
free. Yeah, amazing, that's
a great story, Richard. Yeah,

93
00:06:51,120 --> 00:06:55,160
well, anyway, that's our Better
Know a Framework. Who's talking to us today?

94
00:06:55,199 --> 00:06:58,319
Mister Campbell grabbed the comment of the
show eighteen ninety two. The one we

95
00:06:58,319 --> 00:07:01,480
did with Michelle Duke, who once was Michelle
Mannering, but then, you know, got married,

96
00:07:01,560 --> 00:07:03,959
changed her name, all that good
stuff. She works for GitHub, and

97
00:07:03,959 --> 00:07:06,759
we talked about GitHub Copilot, which
we'd done before with her, so it

98
00:07:06,800 --> 00:07:12,040
was fun to get an update on
that. And Joshua Hillarup, who's been

99
00:07:12,079 --> 00:07:15,199
on the show as well, you
know, guest and regular commenter, said,

100
00:07:15,279 --> 00:07:17,399
I'm with Richard on the future of
AI. I can't say what will happen

101
00:07:17,480 --> 00:07:20,759
hundreds of years from now in this
space, but I think before we

102
00:07:20,800 --> 00:07:25,319
would get a business person giving all
the requirements to an app-generating program,

103
00:07:25,399 --> 00:07:30,959
we would get AI largely replacing that
business person. That's a good line. I

104
00:07:30,000 --> 00:07:34,720
mean, let's be clear, how
much time do you spend getting business requirements

105
00:07:34,160 --> 00:07:39,800
from a domain expert like It takes
a while, so that's not a trivial

106
00:07:39,839 --> 00:07:42,879
thing. But yeah, there's many
changes coming in this space. And I know

107
00:07:42,920 --> 00:07:46,000
we're going to talk to Aaron today
and he's certainly working with companies that have

108
00:07:46,079 --> 00:07:50,680
a role to play in generative AI, so I thought I'd grab that

109
00:07:50,680 --> 00:07:57,040
little story and think, you know, don't take the

110
00:07:57,120 --> 00:08:01,480
obvious answers here. Every time we
have a new automation tool in our

111
00:08:01,519 --> 00:08:07,519
industry or in any industry, that
industry evolves and generally results in more work,

112
00:08:07,720 --> 00:08:13,279
not less. Now, Joshua,
I know you already have a copy

113
00:08:13,279 --> 00:08:16,480
of Music to Code By, but I
read the comment anyway because I really appreciate

114
00:08:16,519 --> 00:08:18,160
your insights, and you can reach out
to me anytime if you like.

115
00:08:18,160 --> 00:08:20,079
If you'd like a copy of Music
to Code By, write a comment on

116
00:08:20,120 --> 00:08:24,000
the website at dotnetrocks dot
com or on the Facebooks. We publish

117
00:08:24,000 --> 00:08:26,319
every show there, and if you
comment there and I read it on the show,

118
00:08:26,319 --> 00:08:28,120
I'll send you a copy of Music
to Code By. And you can follow us on Twitter

119
00:08:28,240 --> 00:08:31,720
or x or y or z or
whatever the hell it is. Today.

120
00:08:31,759 --> 00:08:37,559
But the cool kids are hanging out
on Mastodon. I'm Carl Franklin at techhub

121
00:08:37,559 --> 00:08:41,720
dot social, and I'm Rich Campbell
at mastodon dot social. Aaron Erickson, the

122
00:08:41,840 --> 00:08:46,840
last time he was here was two
thousand and nine. He leads a team

123
00:08:46,919 --> 00:08:52,120
building autonomous agents at Nvidia, a
little company you might have heard of. As

124
00:08:52,120 --> 00:08:56,720
I said, his first book,
The Nomadic Developer, dropped in two thousand and

125
00:08:56,759 --> 00:08:58,879
nine, and that's the last time
we talked to him. And since then,

126
00:09:00,399 --> 00:09:05,039
he spent nine years at ThoughtWorks, moved to San Francisco, built

127
00:09:05,120 --> 00:09:09,559
internal developer platforms at Salesforce, became
a VPE at New Relic, then lost

128
00:09:09,559 --> 00:09:13,279
his mind and decided to become a
startup founder in San

129
00:09:13,320 --> 00:09:18,840
Francisco, where he attempted to, I
just want to know, were mushrooms involved?

130
00:09:18,159 --> 00:09:24,639
Uh. Yeah, So he attempted
to build a company there at the startup

131
00:09:24,639 --> 00:09:30,600
that used AI to automate your company
reorg. Wow. After that startup didn't

132
00:09:30,639 --> 00:09:35,159
work, he joined Nvidia, where
he now builds AI agents that optimize the

133
00:09:35,240 --> 00:09:41,600
usage of GPU resources across the GPU
fleet for Nvidia. So basically he's a

134
00:09:41,639 --> 00:09:45,039
slacker. Yeah, just slacking. Dude, no wonder we haven't

135
00:09:45,039 --> 00:09:46,759
talked to him in fifteen years.
Didn't have anything, he wasn't doing anything.

136
00:09:46,840 --> 00:09:52,679
Oh my god, what kind of
bio is that? How are you

137
00:09:52,720 --> 00:09:56,039
doing? Aaron? And welcome back? I feel fantastic. I mean,

138
00:09:56,159 --> 00:10:01,039
you know, what kind of charmed
life do you live? Where

139
00:10:01,039 --> 00:10:03,240
you fail at a startup, and
they say, hey, why don't you

140
00:10:03,360 --> 00:10:09,960
just do even crazier stuff with AI
than automating reorgs, which already sounds,

141
00:10:11,240 --> 00:10:15,159
I don't know, a little bit
audacious. You could say, yeah,

142
00:10:15,200 --> 00:10:18,080
that's a nice word. Yeah,
I don't know, dude. That seems

143
00:10:18,159 --> 00:10:24,240
like we already have a problem with
too much remoteness as it is, right,

144
00:10:24,320 --> 00:10:28,480
Like nobody wants to be laid off
by text message. It's just the

145
00:10:28,519 --> 00:10:31,600
phrase that gives me chills. I'm
sure we could humanize it. Certainly.

146
00:10:31,879 --> 00:10:35,000
The biggest problem I have with reorgs
is often people don't go through all of

147
00:10:35,039 --> 00:10:39,759
the steps to take care of folks
properly. And so if I had a

148
00:10:39,840 --> 00:10:43,679
digital checklist that's making sure I'm doing
the right things, maybe I can actually

149
00:10:43,720 --> 00:10:48,159
do it better. But that's a
tough elevator pitch, it is. Yeah,

150
00:10:48,679 --> 00:10:56,159
Funny enough, you
know where we landed with this?

151
00:10:56,799 --> 00:11:00,559
So we're working on a problem which
was I think a little bit more

152
00:11:00,639 --> 00:11:05,159
virtuous when we started, which was
how do we help engineering leaders reorganize our

153
00:11:05,159 --> 00:11:09,440
teams in a better way, in
a way that kind of helps multiple engineering

154
00:11:09,480 --> 00:11:13,639
teams. I think it started out
that way, and then, as what

155
00:11:13,720 --> 00:11:16,600
happened to a lot of startups in
late twenty twenty-two or twenty twenty-three,

156
00:11:16,360 --> 00:11:20,200
ChatGPT comes out and everybody's like, well, how the hell do

157
00:11:20,279 --> 00:11:24,360
I raise another round? And how
do I do it if I don't have

158
00:11:24,440 --> 00:11:28,200
this? Yeah, how do I
plant dot ai at the end of whatever

159
00:11:28,240 --> 00:11:31,840
I'm doing and make it moderately relevant? And so our dumb idea was,

160
00:11:31,879 --> 00:11:35,720
well, we already have this engine
that allows you to kind of do your

161
00:11:35,039 --> 00:11:39,080
change your organization in a draft mode
and then have your friends approve it or

162
00:11:39,080 --> 00:11:43,120
in your org and then go and
do it. Why don't we just put

163
00:11:43,159 --> 00:11:46,320
a chech tee wrapper around it and
say we're going to write that reorger email

164
00:11:46,320 --> 00:11:50,600
that everybody hates writing It already sounds
robotic. I mean, who has ever

165
00:11:50,639 --> 00:11:54,519
gotten an email from HR that didn't
look like it came from chet gipt in

166
00:11:54,519 --> 00:12:00,559
the first place. It certainly infused
with HR legal ees, yes and double

167
00:12:00,600 --> 00:12:03,480
speak. I mean, it's to
me, it's the classical case. I

168
00:12:03,519 --> 00:12:05,639
mean, there's a lot of things
you do not want Jenner Todai to do,

169
00:12:05,759 --> 00:12:09,279
you know. I look at the
music that you know, Carl that

170
00:12:09,320 --> 00:12:11,159
you're into, and I look at
a lot of really creative pursuits, and

171
00:12:11,159 --> 00:12:15,120
I think creativity is going to be
you know, despite all the noise,

172
00:12:15,200 --> 00:12:18,159
right, it's going to be one
of the last bestiges where jenner Ai is

173
00:12:18,159 --> 00:12:22,279
really useful robotic things from HR.
I think it is extraordinarily useful in that

174
00:12:22,320 --> 00:12:26,399
because at least we can get rid
of the pretense that anybody really cares about

175
00:12:26,440 --> 00:12:31,120
that communication, right, because mostly
they care about it for risk management.

176
00:12:31,159 --> 00:12:33,879
They don't like, how do we
make it sound as anodyne as possible?

177
00:12:33,919 --> 00:12:37,960
That that's the whole point. So
why not have can we soften the blow?

178
00:12:37,120 --> 00:12:41,399
Ye? Yeah, well and not
expose ourselves to legal issues, right,

179
00:12:41,600 --> 00:12:45,279
you know, I mean that therein
lies the bigger thing. Like I

180
00:12:45,320 --> 00:12:48,360
know, I know folks in HR
who are remarkably compassionate people. They're just

181
00:12:48,399 --> 00:12:54,600
not allowed to be most of the
time because you slam into labor laws and

182
00:12:54,600 --> 00:13:00,559
and anything you say can be used. You know, you're almost automatic living

183
00:13:00,600 --> 00:13:05,360
in Miranda rights land, just by
talking about HR related issues. Yeah,

184
00:13:05,559 --> 00:13:09,080
I mean I used to do this
thing at conferences where I'd say I'd write

185
00:13:09,120 --> 00:13:11,360
the memo, but then you could
put at the end and write it in

186
00:13:11,399 --> 00:13:16,639
the form of a Homrian epic or
sometimes a limerick. Yes, and you

187
00:13:16,720 --> 00:13:20,000
have been let go. But you
know what, it's kind of dumb.

188
00:13:20,000 --> 00:13:22,519
But it was like I would really
appreciate that if I was being let go

189
00:13:22,639 --> 00:13:26,080
to actually read it, you know, in Iambic pentaminter for example, here

190
00:13:26,159 --> 00:13:30,320
you go rhyming couples. You know, at least it would make me laugh.

191
00:13:33,960 --> 00:13:37,720
Man. I mean, it's kind
of crazy that it's been fifteen years

192
00:13:37,720 --> 00:13:43,080
since The Nomadic Developer. I mean,
spending nine years at ThoughtWorks, is

193
00:13:43,080 --> 00:13:46,559
that really nomadding You kind of put
down roots, I mean, except for

194
00:13:46,600 --> 00:13:50,759
the fact that you're going from company
to company to company and with a banner

195
00:13:50,799 --> 00:13:54,559
over your head. Yeah. Yeah, and my personal stuff, I mean

196
00:13:54,559 --> 00:13:58,919
I was, I went from engineering to a
bunch of other roles, you know.

197
00:13:58,960 --> 00:14:01,440
I think it was like, hey, be a product manager, be you

198
00:14:01,519 --> 00:14:03,240
this, be you that, And
you know, at the end of the

199
00:14:03,320 --> 00:14:07,039
day, it's like, decide what
you want to do for a living.

200
00:14:07,159 --> 00:14:11,399
Maybe you are you an engineer?
Are you not? I'm still thinking about

201
00:14:11,399 --> 00:14:13,440
it. I don't know. Yeah, yeah, I don't know what.

202
00:14:13,559 --> 00:14:16,480
I don't know what I want to
be when I grow up. That same

203
00:14:16,559 --> 00:14:22,240
thing. But again, it still
is the nomading mindset, the idea of

204
00:14:22,240 --> 00:14:24,519
what's always the next thing. And
you know what most people just say was

205
00:14:24,559 --> 00:14:28,600
a well-managed career.
You kind of put the nomad spin on

206
00:14:28,720 --> 00:14:31,840
at least that part of it.
I think a lot of it for me,

207
00:14:33,440 --> 00:14:35,480
A lot of it, despite the
randomness of it, was intentional.

208
00:14:37,320 --> 00:14:39,279
And it was intentional because I wanted
to, you know, if you want

209
00:14:39,279 --> 00:14:41,720
to be, you know, somebody
leading something. I think it's good to

210
00:14:41,759 --> 00:14:45,399
know a little bit of sales.
It's good to know a little bit of

211
00:14:45,440 --> 00:14:48,279
product. It makes you a better
engineer. If you know a little bit

212
00:14:48,320 --> 00:14:50,039
of product. You know a little
bit about why, why are we building

213
00:14:50,039 --> 00:14:54,799
this? Who does it matter to?
Have you done customer research? You know,

214
00:14:54,919 --> 00:14:58,080
you become a better engineer if you
ride along. Totally, right. Something we

215
00:14:58,080 --> 00:15:03,879
did. Strangely, we started rotating
devs in. So, we were building network

216
00:15:03,919 --> 00:15:07,879
appliances, and so it turned out
that the lynchpin of the whole company were

217
00:15:07,919 --> 00:15:11,720
our installers, our system integrators.
Because you're touching, you're getting, you're

218
00:15:11,720 --> 00:15:15,120
putting stuff into people's networks. Everybody's
network has its ugliness. There's the

219
00:15:15,200 --> 00:15:18,799
question of, you know, where it
is, right? And the SIs, they are

220
00:15:18,879 --> 00:15:22,360
all geniuses, like just brilliant,
brilliant people inevitably had to fix something in

221
00:15:22,399 --> 00:15:26,480
a network to have the appliance work. Like, there was never

222
00:15:26,960 --> 00:15:31,840
not that. And when we started
sticking devs with them, you do a

223
00:15:31,919 --> 00:15:37,759
six-week rotation with the SIs as
a dev, just be on the call, see

224
00:15:37,759 --> 00:15:41,000
some of the things. Every single
time they came back, they're like,

225
00:15:41,440 --> 00:15:46,840
holy man, those guys' job is
hard and our product got better for it

226
00:15:46,120 --> 00:15:50,399
every time, because they just saw,
how do I provide visibility into the

227
00:15:50,440 --> 00:15:54,879
behavior? How do I show you
know this kind of flow? How do

228
00:15:54,000 --> 00:15:58,480
How do I give them switches
to turn features off and on quickly so

229
00:15:58,679 --> 00:16:03,320
they can figure out what's going on
with that customer? This cross-discipline

230
00:16:03,320 --> 00:16:07,399
thing is insanely valuable. I
think you need a certain amount of depth

231
00:16:07,399 --> 00:16:10,320
to pull it off. But boy, I don't know how you become good

232
00:16:10,320 --> 00:16:12,320
without doing it. Yeah, I
mean, there's lots of people

233
00:16:12,440 --> 00:16:15,720
great at their niches. I wouldn't, you know, downplay the person that's

234
00:16:15,759 --> 00:16:18,960
been doing engineering in a very specific
thing in a very specific way. There's

235
00:16:18,080 --> 00:16:22,399
very good people doing that. But
I would say, if you aspire to

236
00:16:22,440 --> 00:16:25,919
run a company someday, or if
you aspire to be a really truly great

237
00:16:25,919 --> 00:16:30,440
engineering leader, having those perspectives will
be a layup on average, anyway,

238
00:16:30,480 --> 00:16:33,320
I would say.
Yeah, yeah. And every time you

239
00:16:33,320 --> 00:16:37,519
can get a chance to dip into
the other spaces, see how decisions are

240
00:16:37,519 --> 00:16:41,440
made, Learn that language, you
know how those folks speak and think,

241
00:16:42,360 --> 00:16:47,159
just makes you more capable of helping
more people and being better at your own

242
00:16:47,200 --> 00:16:49,639
job. I find it really interesting
that you went to Nvidia. I mean,

243
00:16:51,200 --> 00:16:56,159
Nvidia makes the infrastructure that makes
AI possible. And obviously you can

244
00:16:56,200 --> 00:17:00,799
see the success of Nvidia by looking
at their stock price over the last few

245
00:17:00,840 --> 00:17:06,079
years. It has gone through the
roof. The trillion-dollar club. Oh my

246
00:17:06,200 --> 00:17:14,000
god. And so what I'm really
interested in is you know where Nvidia lies

247
00:17:14,039 --> 00:17:18,440
in the software side of AI,
which is what you're doing there. How

248
00:17:18,480 --> 00:17:23,240
does that happen? Well, the
real advantage of Nvidia is in fact software,

249
00:17:23,279 --> 00:17:27,200
It's not just hardware, right.
CUDA is the moat, or one of

250
00:17:27,200 --> 00:17:30,759
the main moats, of why Nvidia
is successful. It's kind of like how

251
00:17:32,279 --> 00:17:37,440
Windows helped make Microsoft successful because they
had backwards compatibility for many years. Well,

252
00:17:37,440 --> 00:17:41,759
it turns out if you wrote your
software for CUDA in two thousand and

253
00:17:41,759 --> 00:17:45,000
seven, it still works on every
Nvidia device you buy, on every Nvidia

254
00:17:45,119 --> 00:17:49,880
GPU. So explain CUDA for those
who don't know. CUDA is, it's

255
00:17:49,920 --> 00:17:55,480
the low-level layer that allows
you to write code in C, right? Or,

256
00:17:55,559 --> 00:17:57,359
and there's Python libraries to do it
too. But if you need to write

257
00:17:57,400 --> 00:18:03,319
software, it's basically kind of like the
assembly layer plus the C layer for every

258
00:18:03,319 --> 00:18:07,640
GPU, if you're trying to
do multi parallel processing of, you

259
00:18:07,640 --> 00:18:11,279
know, primarily graphics. That's why
it's called the GPU. But it turned

260
00:18:11,319 --> 00:18:14,319
out, and this is almost kind
of they put it out there, and

261
00:18:14,640 --> 00:18:17,680
Jensen Huang, our CEO, tells the story
a lot better than I could possibly tell

262
00:18:17,720 --> 00:18:22,640
it. But you think about how
AI researchers found out. It also seems

263
00:18:22,640 --> 00:18:26,799
to do matrix multiplication
very, very well, which is core

264
00:18:26,920 --> 00:18:33,440
to how generative AI works, and
most AI works in general. So it turns

265
00:18:33,440 --> 00:18:37,839
out that that was, in
the sense of accelerated computing, which is

266
00:18:37,880 --> 00:18:41,359
what we talk about now, and
AI is one type of accelerated computing.

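The matrix-multiplication point is easy to check for yourself. A hedged sketch using PyTorch, assuming a CUDA-capable GPU is installed; the matrix size is arbitrary and the timings are illustrative only:

    import time

    import torch  # pip install torch

    n = 4096
    a = torch.randn(n, n)
    b = torch.randn(n, n)

    # CPU baseline: one big matrix multiply.
    t0 = time.perf_counter()
    torch.matmul(a, b)
    cpu_s = time.perf_counter() - t0

    # The same multiply on the GPU, the workload GPUs excel at.
    if torch.cuda.is_available():
        a_gpu, b_gpu = a.cuda(), b.cuda()
        torch.cuda.synchronize()  # don't let async kernel launch skew the timing
        t0 = time.perf_counter()
        torch.matmul(a_gpu, b_gpu)
        torch.cuda.synchronize()
        gpu_s = time.perf_counter() - t0
        print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
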
267
00:18:41,119 --> 00:18:45,279
We don't have Moore's law anymore quite
the same way we did, right,

268
00:18:45,680 --> 00:18:48,319
you know, I mean, it's replaced
by the new Moore's law. I mean,

269
00:18:48,359 --> 00:18:52,799
it's this kind of massive rise of
the ability for multiple parallelism for a

270
00:18:52,920 --> 00:19:00,319
great, you know, scale to accelerated
computing. So that's kind of the story

271
00:19:00,599 --> 00:19:03,680
of how a thirty year company that
kind of you know, most companies don't

272
00:19:03,680 --> 00:19:07,559
become two trillion dollar companies in their
thirtieth year. They've become that maybe in

273
00:19:07,559 --> 00:19:11,799
their tenth year, and it wasn't. Yeah. Well, I mean,

274
00:19:12,000 --> 00:19:15,720
and I remember I have a friend
who's one of those low level developer types

275
00:19:15,720 --> 00:19:19,519
who's moved from job to job,
staying in that And one of the gigs

276
00:19:19,519 --> 00:19:26,759
he did was in the mid-aughts, and he was working for an astronomical

277
00:19:26,799 --> 00:19:33,279
society and he was building software to
utilize GPUs to do analytics for identifying asteroids.

278
00:19:33,799 --> 00:19:37,200
So you're literally looking at two pictures
and saying what moved, kind of thing.

279
00:19:37,240 --> 00:19:41,720
Right? Complicated stuff. And he was
hand-coding all this stuff because Andrew's that

280
00:19:41,839 --> 00:19:48,079
kind of guy. And CUDA came
out and he literally spent a weekend re

281
00:19:48,200 --> 00:19:51,640
implementing it using CUDA. And went,
I've been wasting my time. This is

282
00:19:51,720 --> 00:19:56,359
just true. And we're talking the v
one stuff of CUDA. Like, it's

283
00:19:56,359 --> 00:19:59,240
not like they were thinking about generative
AI in two thousand and seven.

284
00:20:00,039 --> 00:20:03,559
That's not what they were thinking about, right? It's just astonishing. It's

285
00:20:03,599 --> 00:20:08,279
astonishing to think that that library has
continued to evolve to become one of the

286
00:20:08,279 --> 00:20:15,759
great underpinnings of this multi parallel processing
space we're now running in as hardware people.

287
00:20:15,839 --> 00:20:22,119
Well, I remember talking to Steve
Sanderson at NDC in twenty thirteen,

288
00:20:22,240 --> 00:20:30,680
I think about his project using WebGL
to take advantage of GPU processing inside

289
00:20:30,720 --> 00:20:36,240
a browser in twenty thirteen. Yeah, that's the HTML five revolution, right,

290
00:20:36,319 --> 00:20:41,240
the canvas, compute, and all of that
sort of thing. I've certainly been

291
00:20:41,279 --> 00:20:48,680
banging on the drum about Moore's law
ending for some time now. With TSMC

292
00:20:48,839 --> 00:20:52,359
talking about the three nanometer process like, folks, you're running out of atoms

293
00:20:52,640 --> 00:20:57,880
just cannot get much smaller than that, right? Sure. Well, and also

294
00:20:59,480 --> 00:21:03,440
that it's not just that we went
parallel, it's that we're also doing specialized

295
00:21:03,519 --> 00:21:07,720
compute. So I mean, for the
longest, we, you know, go back a

296
00:21:07,759 --> 00:21:14,559
few years and we had math coprocessors
for our CPUs, and then but when

297
00:21:14,599 --> 00:21:18,240
you move up the stack. Obviously, the GPU was the first real alternative

298
00:21:18,240 --> 00:21:23,640
to the CPU for major workloads.
Now they're talking about MPUs, right,

299
00:21:23,680 --> 00:21:29,839
the neural processing unit,
although I personally don't, I mean, I

300
00:21:29,920 --> 00:21:32,759
haven't dug far enough into it to
say how different is an NPU from a

301
00:21:32,839 --> 00:21:37,079
GPU. Really, I think if
you really kind of break it down,

302
00:21:37,000 --> 00:21:41,799
there's going to be a lot of
different kinds of, we'll just call it asterisk-

303
00:21:41,079 --> 00:21:45,759
PU, right, you know.
And as we've gotten more advanced with

304
00:21:45,799 --> 00:21:47,920
technology. I mean, I don't
know if it's widely known, but we

305
00:21:48,000 --> 00:21:52,160
use a lot of AI to develop
chips. Sure, itself, right,

306
00:21:52,440 --> 00:21:55,960
and a lot of that's going to
come up with lots of different designs.

307
00:21:56,400 --> 00:21:59,400
Like if you can imagine if you
can compile your program down to something that

308
00:21:59,440 --> 00:22:03,200
actually runs on hardware where you know
your normal instruction set is XOR and, yeah,

309
00:22:03,279 --> 00:22:06,839
you know your normal kind of assembly
language instructions you might have learned early

310
00:22:06,839 --> 00:22:10,000
in your career. What if it's
the software, right, What if you

311
00:22:10,039 --> 00:22:11,839
had a special chip for every kind
of little bit of software that you were

312
00:22:11,839 --> 00:22:15,000
going to run be it generated bay, I be it whatever kind of future

313
00:22:15,079 --> 00:22:22,039
architecture around neural processing or other things, because the cost to build specialized chips

314
00:22:22,079 --> 00:22:23,279
is going down as well, right? So much so, there's, like,

315
00:22:23,440 --> 00:22:26,319
I think in New York there's a,
you know, somebody has

316
00:22:26,319 --> 00:22:29,680
like a food cart, but instead
of a food cart, it will make

317
00:22:29,720 --> 00:22:33,519
a die. You can actually literally
change the software on the die and make

318
00:22:33,599 --> 00:22:37,519
it build like a string copy function
that runs purely in hardware or something.

319
00:22:37,599 --> 00:22:41,119
Right, you know, I like
going out for chips, but this is

320
00:22:41,160 --> 00:22:45,599
a different kind of chip. Do
not get this one in mayonnaise. That

321
00:22:45,720 --> 00:22:52,039
is wrong. My first album with
my brother, Franklin Brothers, nineteen ninety-nine,

322
00:22:52,480 --> 00:22:56,319
we programmed, mostly me, programmed drums, you know, drum samples and stuff

323
00:22:56,799 --> 00:23:02,079
for all the drums, and we
gave it album credits: Chip Franklin,

324
00:23:03,759 --> 00:23:07,680
I didn't know you had a third
brother. I don't. I don't.

325
00:23:07,720 --> 00:23:11,359
Well, depends on your definition of
brother. I guess. You know the

326
00:23:11,440 --> 00:23:15,960
count? AI, you know, it's maybe
a family member, I guess. But

327
00:23:15,279 --> 00:23:19,119
it's a little wild, certainly.
I mean, I think GitHub nailed it

328
00:23:19,160 --> 00:23:22,680
with the term copilot, just
like it's an assistant. Right, it's

329
00:23:22,720 --> 00:23:27,839
an automation tool. You're still the
pilot. It's your fault when things go

330
00:23:27,920 --> 00:23:33,000
wrong. And that copilot has
become a buzzword for any kind of AI.

331
00:23:33,200 --> 00:23:36,200
I've stopped trying to drink every time
they say it because I'd be out of the

332
00:23:36,240 --> 00:23:38,160
keynote in five minutes, right?
Like, you'd be done. I have a provocative

333
00:23:38,240 --> 00:23:44,160
question: what if you become the
copilot? Yeah? I don't think

334
00:23:44,200 --> 00:23:48,319
it's all that provocative, because I
haven't seen a piece of software

335
00:23:48,400 --> 00:23:52,039
that has direction yet. Yeah,
but you know, so like a lot

336
00:23:52,039 --> 00:23:55,960
of the things that we're doing now
these days are, so, I have a project.

337
00:23:55,960 --> 00:23:57,640
So we talked about autonomous observability agents. You want to go there?

338
00:23:59,160 --> 00:24:03,839
Yeah, let's go there. All right, what
does that do? Because a lot of

339
00:24:03,880 --> 00:24:07,160
people are like that sounds like a
cool title. What does that actually mean?

340
00:24:07,720 --> 00:24:11,000
Well, one of the things we
discovered and so in my day job,

341
00:24:11,039 --> 00:24:14,759
which is we do GPU allocations.
So we got to figure out all

342
00:24:14,799 --> 00:24:18,960
the internal and external groups that are going
to use GPUs and take the scarce resource

343
00:24:18,000 --> 00:24:21,559
and allocate them in a manner that
is not just good for Nvidia but

344
00:24:21,559 --> 00:24:22,920
good for our customer base. All
that stuff right, all the good things.

345
00:24:25,039 --> 00:24:26,839
It turns out we have more data
sources than we can possibly, just kind

346
00:24:26,839 --> 00:24:32,880
of, automatically engineer, or, like, have humans
do data engineering on, to understand as quickly as

347
00:24:32,920 --> 00:24:36,039
we need to. And so it
came to us, what if we built

348
00:24:36,039 --> 00:24:41,319
a system that could do text-to-query, where
you can say words in English

349
00:24:41,359 --> 00:24:45,720
or whatever human language you want,
convert that into, you know, Druid

350
00:24:45,799 --> 00:24:51,480
SQL or Elasticsearch queries or any
other thing, right? So we're not

351
00:24:51,480 --> 00:24:53,200
trying to be Devin, the software
developer, per se. We're not trying to

352
00:24:53,440 --> 00:24:57,839
write entire programs. It
turns out AI is extraordinarily good at writing

353
00:24:57,960 --> 00:25:02,880
short programs that follow well-known
patterns. And there's a lot of training

354
00:25:02,960 --> 00:25:07,920
data on how to write SQL
in the corpus of models today, and

355
00:25:07,960 --> 00:25:11,039
you can do a little bit of
tuning to make it write, you know,

356
00:25:11,079 --> 00:25:14,599
code that runs against REST endpoints,
code that runs against well known things

357
00:25:14,640 --> 00:25:18,279
like Elasticsearch. You know, we
have, in fact, you know, Jensen

358
00:25:18,319 --> 00:25:22,319
talked about it in his latest keynote: doing text-to-ABAP in SAP.

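The text-to-query pattern Aaron describes has a simple shape, sketched below in Python. This is a hedged illustration, not Nvidia's system: ask_llm is a stand-in for whatever model endpoint you use, and the gpu_telemetry schema is invented for the example.

    import sqlite3

    SCHEMA = "CREATE TABLE gpu_telemetry (node TEXT, ts TEXT, utilization REAL, temp_c REAL)"

    def ask_llm(prompt: str) -> str:
        # Stand-in for a call to whatever LLM endpoint you use.
        raise NotImplementedError

    def text_to_sql(question: str) -> str:
        prompt = ("You translate questions into SQLite SQL.\n"
                  f"Schema: {SCHEMA}\n"
                  f"Question: {question}\n"
                  "Return only the SQL.")
        return ask_llm(prompt)

    def answer(question: str, db: sqlite3.Connection):
        sql = text_to_sql(question)
        # Guardrail: this agent only reads; reject anything that isn't a SELECT.
        if not sql.lstrip().lower().startswith("select"):
            raise ValueError(f"refusing to run non-SELECT statement: {sql!r}")
        return db.execute(sql).fetchall()

    # e.g. answer("Which nodes ran hotter than 85C last night?", conn)
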
359
00:25:22,559 --> 00:25:26,839
So if you want to just ask
questions of your database, ask questions

360
00:25:26,880 --> 00:25:33,440
of your large ERP system, right?
Being able to do that. What if

361
00:25:33,440 --> 00:25:37,759
I can ask questions about my GPU
telemetry? What if I can get answers

362
00:25:37,839 --> 00:25:41,400
back about my GPU telemetry? Wouldn't
it be great? And I had this,

363
00:25:41,799 --> 00:25:44,680
the story I can tell is,
you know, one of my frustrations

364
00:25:44,680 --> 00:25:47,640
when I was, you know,
VP of engineering at New Relic was I'd

365
00:25:47,680 --> 00:25:51,039
be asked to do things like,
hey, what happened with that incident last

366
00:25:51,119 --> 00:25:53,759
night? A lot of times it
wouldn't be as accurate as I would like

367
00:25:53,880 --> 00:25:56,960
because I had to go at eight
am, I had to call up a

368
00:25:57,000 --> 00:26:00,640
bunch of directors and ask, hey, what have last night? And they're

369
00:26:00,680 --> 00:26:06,480
tired, right, they're they're they're
exhausted. They they spent the night fighting

370
00:26:06,799 --> 00:26:10,559
the fire, right, And so
you know, you think about AI's a

371
00:26:10,599 --> 00:26:14,039
hallucinating Well, nobody hallucinates like somebody
that's not gotten very much sleep, right,

372
00:26:14,160 --> 00:26:19,000
So so you've hit you have this
dynamic. And so I walk into

373
00:26:19,039 --> 00:26:23,119
these meetings half the time right,
half the time completely off base. What

374
00:26:23,160 --> 00:26:29,400
I wanted was a system where I
can ask questions, what the F happened

375
00:26:29,480 --> 00:26:32,559
last night? Right, and add
out the F if you want, no,

376
00:26:33,440 --> 00:26:36,799
No, I don't think and then
I get an answer back, and

377
00:26:36,920 --> 00:26:40,440
I can then say, oh great, that took three seconds or even took

378
00:26:40,440 --> 00:26:42,880
thirty seconds because it thought a lot
about the question. We allowed it to

379
00:26:42,920 --> 00:26:47,799
ruminate a little bit and confer with
other experts in the system about the answer,

380
00:26:48,960 --> 00:26:52,160
and then it gives me a
considered answer, and then is able to

381
00:26:52,200 --> 00:26:55,400
help me ask the next question.
You see this in Perplexity today. By

382
00:26:55,440 --> 00:26:59,240
the way, you see in Perplexity
you have this capability where you ask a

383
00:26:59,319 --> 00:27:02,039
question and it gives you the next five
good questions as a consequence of that,

384
00:27:02,640 --> 00:27:06,039
I want that against observability backends.
If I could go back in time,

385
00:27:06,440 --> 00:27:10,119
right, so I could spend that
two hours interrogating the system rather than

386
00:27:10,160 --> 00:27:14,319
waiting for everybody to call with bad
information. Right, I mean, I

387
00:27:14,359 --> 00:27:17,680
mean part of me wonders, you
know, in my time in

388
00:27:17,680 --> 00:27:21,759
that role, it's like, you had
a root cause analysis problem. These folks were

389
00:27:21,799 --> 00:27:25,839
fighting the fire and they weren't getting
to the summary phase before you needed to

390
00:27:25,880 --> 00:27:30,119
talk to leadership about it, like
you needed to finish root cause

391
00:27:30,119 --> 00:27:33,160
analysis and you didn't have time, or,
you know, there needed to be

392
00:27:33,160 --> 00:27:41,519
more automation around it. The number of
times that the IRC log at Strangeloop

393
00:27:41,519 --> 00:27:45,559
saved my bacon. You know,
the firefight actually happened

394
00:27:45,599 --> 00:27:48,279
over IRC because we were
all remote. And you could, you

395
00:27:48,319 --> 00:27:52,240
want to know what happened? Read
it. Read how we spent four hours

396
00:27:52,359 --> 00:27:57,440
getting to know what went wrong and
fifteen minutes fixing it. Back in the day,

397
00:27:57,480 --> 00:28:00,240
the first thing I
would do when I woke up every morning

398
00:28:02,160 --> 00:28:06,519
was look at the backscroll in Slack
in any of the incident management rooms, and

399
00:28:06,759 --> 00:28:08,839
you'd get some
ideas about it, but it still wasn't

400
00:28:08,880 --> 00:28:12,160
like great. Sometimes you didn't resolve
to root cause, you just resolved to how

401
00:28:12,200 --> 00:28:15,359
do we fix the thing? You've
got to be up again. And then

402
00:28:15,359 --> 00:28:17,680
you went to bed. Yeah,
you didn't actually get to root cause.

403
00:28:17,960 --> 00:28:22,359
But and I think these gen AI
tools are really good at summarizing those long

404
00:28:22,440 --> 00:28:25,960
logs. Like, that seems to be one
of their strengths, although I would

405
00:28:25,960 --> 00:28:30,960
also argue it's very hard to test
if they're bad at it. Yeah,

406
00:28:30,119 --> 00:28:34,799
Like you ask it to summarize forty
pages of notes and it gives you a

407
00:28:34,839 --> 00:28:37,400
paragraph back. How do you know
if it's a good paragraph or a bad

408
00:28:37,440 --> 00:28:41,640
paragraph. Hm, good point,
right, because you didn't read the forty

409
00:28:41,680 --> 00:28:45,599
pages that was the point. So
how would you know if it was bad?

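One concrete answer to that question, and the one Aaron gives next, is an eval set: golden question-and-answer pairs you score the system against. A minimal hedged harness, with hypothetical questions and expected rows:

    # Hypothetical golden set: question -> expected result rows.
    EVAL_SET = [
        ("How many nodes reported over 85C?", [(3,)]),
        ("What was peak utilization on node a17?", [(0.97,)]),
    ]

    def run_evals(system) -> float:
        # Score a text-to-query system against the golden set; returns accuracy.
        passed = 0
        for question, expected in EVAL_SET:
            try:
                got = system(question)
            except Exception as exc:  # a crash counts as a failure, not a skip
                print(f"FAIL {question!r}: {exc}")
                continue
            if got == expected:
                passed += 1
            else:
                print(f"FAIL {question!r}: expected {expected}, got {got}")
        return passed / len(EVAL_SET)

    # accuracy = run_evals(lambda q: answer(q, conn))  # reusing the earlier sketch
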
410
00:28:45,640 --> 00:28:48,359
And to me, the biggest battle
I have with all of the neural-net

411
00:28:48,400 --> 00:28:52,799
stuff right now is testability. How
do you know it's correct or what to

412
00:28:52,880 --> 00:28:57,000
what degree it is correct? Well, this is why you build a robust

413
00:28:57,039 --> 00:29:03,440
eval set in any of these models
that you're doing. And all

414
00:29:03,440 --> 00:29:06,880
the big generative AI companies, I
mean, there's leaderboards. Like, how much

415
00:29:06,880 --> 00:29:11,599
test work goes into the evals. This
gets really tricky, and this is a very

416
00:29:11,920 --> 00:29:17,640
new area, right because I think
there's maybe a small number of people in

417
00:29:17,640 --> 00:29:21,079
the world right now that are even
attempting to do text to SQL or text

418
00:29:21,119 --> 00:29:26,440
to other computer languages, where you're having
to do the same kind of summarization, but

419
00:29:26,119 --> 00:29:30,680
instead of doing kind of traditional RAG
or having it in the model, you're

420
00:29:30,680 --> 00:29:33,559
pulling new data in every time,
right, You're asking what happened last night

421
00:29:34,079 --> 00:29:37,680
from systems of record that would be
able to tell you from the telemetry what's

422
00:29:37,680 --> 00:29:42,240
actually happening. And so it's
a new area. We're developing it

423
00:29:42,240 --> 00:29:44,839
as we go, but a lot
of what we do, a lot of

424
00:29:44,880 --> 00:29:48,480
our work on our team is
less programming, more finding a good eval set

425
00:29:48,480 --> 00:29:52,160
and validating, so we have confidence
in the systems that are telling us what

426
00:29:52,200 --> 00:29:55,559
we think is happening. And I
think that's one of the things that folks

427
00:29:55,559 --> 00:29:59,720
who are really struggling, from the
development perspective, is recognizing how data-driven

428
00:29:59,759 --> 00:30:03,119
all this actually is. A good
eval set is a set of data you

429
00:30:03,119 --> 00:30:07,559
can test effectively against. Without
that, no amount of code is going

430
00:30:07,559 --> 00:30:10,960
to save you. And gentlemen,
I need to interrupt for one moment for

431
00:30:11,079 --> 00:30:18,880
this very important message. And we're back! It's .NET Rocks! I'm Richard

432
00:30:18,920 --> 00:30:21,519
Campbell. That's Carl Franklin. Hey, hey, hey, talking to our

433
00:30:21,519 --> 00:30:26,359
friend Aaron Erickson, who has been
nomading and has found himself landed at

434
00:30:26,640 --> 00:30:30,039
Nvidia, which is very cool,
dude, Like, congratulations, what an

435
00:30:30,079 --> 00:30:33,279
amazing company. And it was
cool before gen AI took off

436
00:30:33,319 --> 00:30:37,240
and you had to run the whole
place. But why are you folks building

437
00:30:37,400 --> 00:30:45,079
autonomous agents? Like aren't you happy
selling shovels during the gold rush? Because

438
00:30:45,039 --> 00:30:52,200
because we can! No, we, so we
have a need to increase

439
00:30:52,960 --> 00:30:56,359
utilization of our GPU fleet. We
have a need to be able to monitor

440
00:30:56,400 --> 00:31:00,920
our own GPU environment. The reason
I got into doing this is because

441
00:31:00,920 --> 00:31:03,799
we had a need in that category
and it turned out, you know,

442
00:31:03,799 --> 00:31:07,880
if you work for an AI company
and you find AI to be useful in

443
00:31:07,920 --> 00:31:11,400
solving a problem, you should probably
do that. And so we are doing

444
00:31:11,440 --> 00:31:15,160
that well. And I mean you
had that little lead-in before the break.

445
00:31:15,200 --> 00:31:18,200
It's like, what if you
were the copilot? Like,

446
00:31:18,599 --> 00:31:22,039
to me, the autonomous agent is
still the copilot. It's just got a

447
00:31:22,079 --> 00:31:26,960
better instruction set. It's
not a gofer, you know. It's

448
00:31:26,000 --> 00:31:29,680
not ask a question, get a
response. Ask a question, get a

449
00:31:29,680 --> 00:31:33,319
response. It's more of a here's
a problem space that needs to work on,

450
00:31:33,440 --> 00:31:37,599
report back routinely. Like it's almost
a if you were equating it to

451
00:31:37,759 --> 00:31:41,759
a junior employee. And I'm loath
to do that because I do everything I

452
00:31:41,759 --> 00:31:47,680
can to avoid anthropomorphizing software.
Now that it's become a real problem,

453
00:31:48,079 --> 00:31:49,960
that's the only way we can think
of it, though, Yeah, except

454
00:31:51,000 --> 00:31:53,960
that it makes us wrong, right? Like, it makes us think, it makes

455
00:31:55,039 --> 00:31:59,079
us think that this thing has intent. As soon as you

456
00:31:59,119 --> 00:32:05,920
give it agency, then its
response has more credibility than it

457
00:32:06,039 --> 00:32:09,240
deserves. But do
you think that us being the copilot means

458
00:32:09,240 --> 00:32:14,640
the AI is driving the train?
Well, is that the implication? That's

459
00:32:14,920 --> 00:32:21,279
the idea. So, and
you assist the AI when it needs help

460
00:32:21,519 --> 00:32:23,880
from somebody when it doesn't have the
answer. Is that what you're talking about?

461
00:32:24,480 --> 00:32:30,359
Correct? Correct? So what you
do, and I stated this publicly,

462
00:32:30,559 --> 00:32:36,440
you create an org structure of agents. So there is a director agent.

463
00:32:36,559 --> 00:32:39,359
The director agent has a goal.
It has a goal of a metric,

464
00:32:39,440 --> 00:32:45,759
almost like giving it an OKR in
an organization structure, and its job

465
00:32:45,839 --> 00:32:52,000
is to analyze data, work with
analyst agents that might understand a domain. Might

466
00:32:52,039 --> 00:32:55,920
be, I understand GPUs need to run
in this temperature range; I understand financial

467
00:32:55,960 --> 00:32:59,880
transactions need to operate in a certain
manner. You know, you can apply

468
00:32:59,880 --> 00:33:01,480
it to a lot of different domains.
And this is a pattern. I'm not

469
00:33:01,480 --> 00:33:07,559
really talking about anything secret here.
And then you have worker agents. Their

470
00:33:07,640 --> 00:33:12,079
job is to take human language and
convert it into database queries or REST calls

471
00:33:12,200 --> 00:33:16,319
or whatever else, and they confer
among themselves. Right? Maybe, human-assisted,

472
00:33:16,359 --> 00:33:20,839
it might call you back to say,
can you clarify this, yes or no?

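The director/analyst/worker pattern Aaron outlines can be sketched as plain objects. A hedged skeleton of the pattern only, not Nvidia's implementation; every class name and string here is illustrative:

    from dataclasses import dataclass, field

    @dataclass
    class WorkerAgent:
        # Turns a natural-language request into a query against one backend.
        backend: str
        def run(self, request: str) -> str:
            return f"[{self.backend}] results for: {request}"  # stand-in for text-to-query

    @dataclass
    class AnalystAgent:
        # Holds domain rules, e.g. "GPUs need to run in this temperature range".
        domain: str
        workers: list = field(default_factory=list)
        def investigate(self, request: str) -> list:
            return [w.run(request) for w in self.workers]

    @dataclass
    class DirectorAgent:
        # Owns a goal (an OKR-like metric) and farms questions out to analysts.
        goal: str
        analysts: list = field(default_factory=list)
        def pursue(self, question: str) -> dict:
            findings = {a.domain: a.investigate(question) for a in self.analysts}
            # A real system would have a model weigh findings into an action plan,
            # e.g. "reallocate these GPUs", possibly pausing to ask a human.
            return {"goal": self.goal, "findings": findings}

    director = DirectorAgent(
        goal="raise GPU fleet utilization",
        analysts=[AnalystAgent("thermals", [WorkerAgent("druid"), WorkerAgent("elasticsearch")])],
    )
    print(director.pursue("what happened last night?"))
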
473
00:33:21,039 --> 00:33:23,920
Is this true? Right? It
might actually ask you questions and then figure

474
00:33:23,920 --> 00:33:28,480
out an action plan as a result
of the analysis. And so the action

475
00:33:28,559 --> 00:33:32,880
plan might be, reallocate these GPUs from
here to here. It might be, hey,

476
00:33:34,119 --> 00:33:37,240
human, go take a look at this
data. Tell me what you think

477
00:33:37,279 --> 00:33:39,240
of it. Right, there's all
sorts of ways you can think about this,

478
00:33:39,400 --> 00:33:44,119
But what it starts to do is
once you reach a certain threshold of

479
00:33:44,200 --> 00:33:47,839
capability, it becomes more that it's
driving things. It's driving the details and

480
00:33:47,880 --> 00:33:52,240
you're kind of monitoring it, right? With
the real data about how is this actually

481
00:33:52,279 --> 00:33:58,759
improving the system? Now you've said,
like, you've given it a mission, and it's

482
00:33:59,079 --> 00:34:01,559
essentially going down the checklist of the
right way to execute on that mission and

483
00:34:01,640 --> 00:34:05,960
responding to the impediments that it gets
back from it. And I

484
00:34:06,000 --> 00:34:07,760
would hope. You said monitoring is the
key, right? I mean, there's

485
00:34:07,800 --> 00:34:13,039
got to be a CEO that's watching
it and not allowing it to make decisions

486
00:34:13,119 --> 00:34:19,800
until they're understood and signed off by
a human. So have you been to

487
00:34:19,840 --> 00:34:24,639
San Francisco lately? No, but
I suspect you have. So I have

488
00:34:24,719 --> 00:34:29,239
an app on my phone. I
can call a Waymo self-driving car

489
00:34:29,400 --> 00:34:32,639
with no driver in it. Yeah. Yeah, And occasionally, if it

490
00:34:32,679 --> 00:34:38,159
gets into a jam, it will
notify somebody, a real human:

491
00:34:38,239 --> 00:34:42,800
Hey, I'm in a traffic jam
or I'm stuck in some unusual condition and

492
00:34:42,840 --> 00:34:45,840
they take over. Yes, somebody
put a traffic cone on my hood.

493
00:34:46,280 --> 00:34:50,760
Yes. Yes. And so if
you've seen these videos, I'm sure you

494
00:34:50,800 --> 00:34:52,320
probably have. Not everybody has.
I mean, there's people that have never

495
00:34:52,320 --> 00:34:55,320
seen this and haven't heard this is
actually happening. But it's this kind of

496
00:34:55,320 --> 00:34:59,239
scaling of self driving cars. Like
we started out with cruise control, we

497
00:34:59,400 --> 00:35:04,760
moved to smarter cruise control, we
move to kind of full self driving.

498
00:35:04,800 --> 00:35:07,880
is like the biggest manifestation of this. But there's this kind of phase shift

499
00:35:07,880 --> 00:35:13,760
that happens between, hey, it helps
me, to, now, in San Francisco,

500
00:35:13,880 --> 00:35:16,239
it's I help it when it calls
me. And it's just kind of an

501
00:35:16,280 --> 00:35:20,639
evolution of how these things happen.
And so it's a lot of investment,

502
00:35:20,920 --> 00:35:22,719
a lot of evals, a lot
of monitoring on the way there. But

503
00:35:22,760 --> 00:35:28,719
once you get to that threshold where
you're monitoring the thing and it's right ninety

504
00:35:28,800 --> 00:35:32,679
nine percent of the time, especially
in domains where it being wrong isn't catastrophic.

505
00:35:32,840 --> 00:35:36,679
Like you may not do this with
financial transactions tomorrow, but you might

506
00:35:36,760 --> 00:35:42,559
do this with certain other maybe kind
of lower risk things, right, And

507
00:35:42,599 --> 00:35:44,920
this is where it starts, and
then you start to get more and more

508
00:35:45,039 --> 00:35:50,840
of this stuff. Yeah, I
don't know if we're building on a house

509
00:35:50,880 --> 00:35:57,960
of cards here, but I certainly
think that there are routine tasks in throughout

510
00:35:58,119 --> 00:36:01,360
business that can be off greated from
a set of rules. You know,

511
00:36:01,480 --> 00:36:07,280
we've made fact box long ago,
uh Tier one tech support. More and

512
00:36:07,360 --> 00:36:10,519
more automation happening on that because there
is this set of did you turn it

513
00:36:10,559 --> 00:36:13,599
off and on again? Did you
check this? That? You know,

514
00:36:13,800 --> 00:36:17,119
like it's a checklist of things as
soon as you've got a person flipping through

515
00:36:17,159 --> 00:36:22,519
a book, you know, going
back and forth on the standard set of

516
00:36:22,599 --> 00:36:25,079
questions like that could be software.
It just seems like layers of automation.

517
00:36:25,559 --> 00:36:31,840
It just always concerns me when
we get a sense of ascribing more intent

518
00:36:32,039 --> 00:36:37,639
than that that it is still a
set of instructions, although often those sets

519
00:36:37,639 --> 00:36:44,960
of instructions are based on a parametric
response to data coming in. You know,

520
00:36:45,000 --> 00:36:46,960
there's a reason why we turn
the lights red when it's

521
00:36:47,000 --> 00:36:52,440
past a certain threshold, right?
Right. But there's, so there's a good

522
00:36:52,519 --> 00:36:55,079
book called, forgive me, I'm going to quote a book name.

523
00:36:55,639 --> 00:37:00,519
It's got a curse word, but
it's called Bullshit Jobs, the book.

524
00:37:01,199 --> 00:37:07,360
And it turns out there's a lot
of roles that I think maybe aren't

525
00:37:07,360 --> 00:37:13,559
that intellectually challenging, but for whatever
reason, we still keep around where you

526
00:37:13,559 --> 00:37:17,960
know, I call these kind of
like low level intellectual robots. And if

527
00:37:19,000 --> 00:37:22,199
you can kind of automate a lot
of the low level intellectual roboticness out of

528
00:37:22,280 --> 00:37:24,679
jobs, and you think of like
what happens at the DMV, I don't

529
00:37:24,719 --> 00:37:28,119
know that there's a lot of creativity
going on at the DMV. And I

530
00:37:28,119 --> 00:37:30,440
don't know that you want a lot
of creativity going on at the DMV.

531
00:37:30,920 --> 00:37:32,599
I don't know that you want a
lot of creativity in your finance department.

532
00:37:32,639 --> 00:37:37,519
In some domains, there's some kinds
of software development that aren't terribly creative.

533
00:37:37,559 --> 00:37:42,400
I don't ever want to write another
GET, you know,

534
00:37:42,480 --> 00:37:45,880
REST endpoint again that gets data from
a simple thing. I know that can

535
00:37:45,920 --> 00:37:49,679
be automated for the most part.
So we use Copilot for that today.

536
00:37:49,760 --> 00:37:52,679
Right, We're going to find more
and more things that we just use Copilot

537
00:37:52,679 --> 00:37:55,400
for a lot. That's low-level
intellectual labor, right? And

538
00:37:55,440 --> 00:38:00,320
we will start to automate that
more. Well, and, well, every

539
00:38:00,320 --> 00:38:05,000
time we've ever done this, it
frees up resources for doing more creative work.

540
00:38:05,599 --> 00:38:09,920
Right. The reality, a lot
of travel agents lost their jobs as

541
00:38:09,960 --> 00:38:17,559
the airline industry and the hospitality industry
moved online, but travel exploded far more

542
00:38:17,599 --> 00:38:22,679
people traveled, all those industries got
bigger, and a net more jobs were

543
00:38:22,719 --> 00:38:27,719
made than were lost from travel agents. You know, that pattern persists.

544
00:38:27,920 --> 00:38:31,599
It happens over and over again.
So the theory being these you know,

545
00:38:31,760 --> 00:38:37,199
menial jobs, these menial tasks that
can be automated, should be automated because

546
00:38:37,239 --> 00:38:39,519
that person is capable of more.
Yeah, it's the only way you can

547
00:38:39,559 --> 00:38:43,400
look at it, because you can
be mad that this is happening, and

548
00:38:43,920 --> 00:38:47,880
lots of people are, and I
get that. But the reality is,

549
00:38:47,960 --> 00:38:52,719
it's just, how we manage the disruption is
the important part of this, right?

550
00:38:52,840 --> 00:38:57,480
Like, yeah, we should be
able to manage this disruption compassionately. But

551
00:38:58,159 --> 00:39:00,800
it doesn't mean, you know,
that the alternative of not having disruption

552
00:39:00,920 --> 00:39:06,400
is there. I referenced the
Luddite movement. Yeah, you know,

553
00:39:06,679 --> 00:39:09,039
and for those who don't know, we've
all heard the term. But originally, when

554
00:39:09,079 --> 00:39:17,039
the first steam-powered
looms were deployed in Scotland, the folks

555
00:39:17,039 --> 00:39:22,880
that were used to weaving by hand
freaked out and smashed them. And here's

556
00:39:22,920 --> 00:39:27,440
something that people don't know about that
movement: the Luddites weren't anti-technology.

557
00:39:27,960 --> 00:39:30,320
In fact, they were the technologists
of the day, right. They

558
00:39:30,320 --> 00:39:34,719
were the high tech. They were
using high tech machinery. They understood it,

559
00:39:35,440 --> 00:39:39,239
but it was really about,
you know, jobs, right?

560
00:39:39,360 --> 00:39:45,559
Yeah. And the reality, of
course, was that it was a conversation between the

561
00:39:45,559 --> 00:39:50,840
employer and employees. The result was
that they ultimately rebuilt the machines and had all

562
00:39:50,880 --> 00:39:54,039
those people running them. Right.
The bigger thing that happened as they automated

563
00:39:54,079 --> 00:40:00,239
the weaving of cloth is that people
bought more clothes because the price of them

564
00:40:00,280 --> 00:40:04,480
fell dramatically. And not only did
all of those people keep their jobs,

565
00:40:04,519 --> 00:40:07,119
but a whole lot more people
got employed, because they made a lot more

566
00:40:07,199 --> 00:40:13,800
cloth. And you know, the
industry exploded. The idea of owning different

567
00:40:13,800 --> 00:40:15,800
garments for every day, not just
a good set of clothes and a working

568
00:40:15,800 --> 00:40:20,519
set of clothes, came out of
being able to automate cloth weaving. So

569
00:40:20,639 --> 00:40:22,519
you want to know one of the
most ironic things about what's happening right now,

570
00:40:22,599 --> 00:40:32,559
I'm sure you do, and that
is so now engineers are automating engineering.

571
00:40:32,599 --> 00:40:37,079
And you see this with some of
the responses to Devin, the AI software developer.

572
00:40:37,079 --> 00:40:44,440
I found that fascinating because who here
in software development hasn't automated the living

573
00:40:44,480 --> 00:40:46,639
crap out of somebody else's jobs,
a lot of other people's jobs. We've

574
00:40:46,679 --> 00:40:52,679
been doing that for the entire time. What
do you think CI/CD pipelines are, right?

575
00:40:52,400 --> 00:40:55,920
We've been doing this for the entire
history of computing. Okay,

576
00:40:55,960 --> 00:41:00,320
I'm sorry, this
might make some people mad. I'm getting

577
00:41:00,320 --> 00:41:04,360
older, so I kind of don't
care, as we all are, right,

578
00:41:04,719 --> 00:41:07,880
we all age. But the
reality is, if you're mad that,

579
00:41:08,000 --> 00:41:13,280
like, programmers are automating programmers' jobs,
well, sorry, I guess that's just

580
00:41:13,400 --> 00:41:15,079
kind of like what we've been doing, and I guess it's coming back and

581
00:41:15,079 --> 00:41:17,719
it's kind of funny. But like
when people get mad at this, like

582
00:41:17,719 --> 00:41:21,400
they're like, oh my god,
Devin, it's like, yeah, sorry,

583
00:41:21,840 --> 00:41:24,320
we did it to ourselves. By
the way,

584
00:41:24,679 --> 00:41:30,679
you have so much opportunity now if
you can learn how to construct these systems

585
00:41:30,719 --> 00:41:34,239
together and use this. So you
know, we built this thing, the NVIDIA

586
00:41:34,320 --> 00:41:37,880
inference microservice. It's just a pattern; it doesn't have to
be NVIDIA, it could be any inference

587
00:41:37,960 --> 00:41:40,840
microservice. So now, instead of building
a microservice that knows one domain well

588
00:41:40,880 --> 00:41:45,800
and building it in code,
you build an LLM that knows one domain well,

589
00:41:45,119 --> 00:41:52,000
that takes in requests in human language and
generates data for you in whatever form

590
00:41:52,039 --> 00:41:54,199
that you need. But it's the
same idea, right, it's taking some

591
00:41:54,239 --> 00:41:59,440
of these same architectural patterns that we've
known as engineers and using them to build

592
00:41:59,480 --> 00:42:04,039
bigger, smarter, more interesting systems
that then let us be more creative.
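
A sketch of that pattern: the same shape as a classic domain microservice, except the business logic behind the route is a model call. The query_domain_llm helper and the /billing/ask route are placeholders for whatever model-serving API you actually use (an NVIDIA NIM endpoint, OpenAI, a local model); none of these names come from the episode:

# Inference-microservice pattern: natural-language request in,
# structured data out, behind an ordinary HTTP route.
from flask import Flask, request, jsonify

app = Flask(__name__)

def query_domain_llm(question: str) -> dict:
    # Placeholder: call your model server here and parse its reply
    # into structured data. Stubbed so the sketch runs as-is.
    return {"question": question, "answer": "stubbed model reply"}

@app.post("/billing/ask")
def ask_billing():
    """Accept a natural-language request, return structured JSON."""
    question = request.get_json(force=True).get("question", "")
    return jsonify(query_domain_llm(question))

if __name__ == "__main__":
    app.run()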

593
00:42:04,239 --> 00:42:07,639
Yeah, why would you think that
the methods for using machine learning tools weren't

594
00:42:07,679 --> 00:42:12,440
going to improve? Right? Really, come on, like, that's of

595
00:42:12,480 --> 00:42:16,440
course they are. Especially in a
young technology like this, it evolves

596
00:42:16,800 --> 00:42:21,880
even faster than the more mature technologies, where a lot of the stuff's one

597
00:42:21,920 --> 00:42:24,679
point zero, and just getting to two point
zero is a big jump. Maybe three is

598
00:42:24,840 --> 00:42:29,000
kind of where we're talking about these
first few versions of how we build this

599
00:42:29,039 --> 00:42:31,960
stuff. Of course it's going to
be massively transformed. Well, but it's

600
00:42:32,280 --> 00:42:35,920
funny the way it's happened in AI, where it's kind of like it's very

601
00:42:35,960 --> 00:42:37,840
quiet for periods of time, right, you know, you have researchers doing

602
00:42:37,880 --> 00:42:40,440
things. It kind of feels like
it's quiet, it's quiet, and

603
00:42:40,480 --> 00:42:46,039
then boom, something like ChatGPT happens
and then massive, massive change. I

604
00:42:46,039 --> 00:42:49,159
think Kent Beck had a quote saying, I don't know how this is going

605
00:42:49,239 --> 00:42:51,639
to change things, but I know it's
going to change a lot about, yeah,

606
00:42:51,679 --> 00:42:55,320
how I write
code. And I think the best response

607
00:42:55,360 --> 00:42:59,599
to a lot of this is wonder.
Yeah, you know, and just

608
00:42:59,679 --> 00:43:02,320
kind of being even as you get
into your later years in your career,

609
00:43:02,400 --> 00:43:05,400
you know, I'm kind of rounding
third base in my career, I think

610
00:43:05,400 --> 00:43:07,760
at this point, and I still
every day I wake up, I'm like,

611
00:43:07,800 --> 00:43:09,880
what can we do with this stuff? How can we build

612
00:43:09,920 --> 00:43:14,960
with these tools that we have to
do things that are incredible for our customers,

613
00:43:14,960 --> 00:43:19,280
that are incredible. Like, I
look at what NVIDIA does. I'm

614
00:43:19,280 --> 00:43:22,000
sorry if I sound like
a fanboy; I am, you know,

615
00:43:22,320 --> 00:43:24,760
guilty as charged. The things
we're doing around protein discovery,

616
00:43:25,079 --> 00:43:29,199
things that our customers are doing around
protein discovery, that were never even

617
00:43:29,280 --> 00:43:34,599
possible pre-transformers, pre the transformer
model. There are going to be diseases

618
00:43:34,639 --> 00:43:39,719
cured, there are going to be technologies built
that make humanity better in every way possible.

619
00:43:39,760 --> 00:43:45,480
There are also going to be, let's
be real, technologies that make things terrible

620
00:43:45,599 --> 00:43:47,679
in a lot of ways. And,
you know, the thing

621
00:43:47,719 --> 00:43:51,400
is, you can't stop it though. You know, there's no world where

622
00:43:51,519 --> 00:43:53,440
you can suddenly say, oh, GPUs,
we've got to throw them away. They're here.

623
00:43:53,559 --> 00:43:57,440
And even if technology doesn't move
at all from what we have now,

624
00:43:57,719 --> 00:44:00,679
we've only started to use these tools
in various ways. And you open

625
00:44:00,719 --> 00:44:04,239
the door to philosophy. So let's
go here. Yeah. One of the

626
00:44:04,360 --> 00:44:07,039
questions that is always on my mind
is, Okay, there are a lot

627
00:44:07,079 --> 00:44:13,800
of skilled laborers in the workforce who
can make the adjustment. There are way

628
00:44:13,840 --> 00:44:19,000
more unskilled laborers that just need a
job, and so how do you see

629
00:44:19,000 --> 00:44:22,719
the future playing out for them?
I wish I knew the answer to that question,

630
00:44:22,800 --> 00:44:25,719
but I don't know the answer to that
question, if I'm honest. I

631
00:44:25,719 --> 00:44:30,880
can speculate. I can speculate that,
I think, you know, the answer is,

632
00:44:30,920 --> 00:44:35,239
everybody's going to talk about UBI, universal basic income. That sounds kind of terrible,

633
00:44:35,440 --> 00:44:37,440
even if you had a good UBI. Like,

634
00:44:37,440 --> 00:44:40,599
okay, maybe people just don't do anything. I think we have to make the

635
00:44:40,599 --> 00:44:45,280
world more prosperous so that we can
give people the opportunities to be creative.

636
00:44:45,599 --> 00:44:47,719
Yeah, would be the way.
If I could be the dictator of the

637
00:44:47,760 --> 00:44:52,119
world, right, how I would
arrange that. Yeah, that's the goal.

638
00:44:52,760 --> 00:44:54,239
Yeah, for Aaron to be the
dictator of the world. Sure,

639
00:44:54,360 --> 00:44:58,000
yes, that's exactly what I meant. Richard, Thank you. I have

640
00:44:58,039 --> 00:45:00,599
a better idea. Why don't we
just have a super smart AI be the

641
00:45:00,679 --> 00:45:04,480
president. I know it sounds a
little ridiculous, but you know, so

642
00:45:04,559 --> 00:45:13,400
are the presidents, regardless. You're more
ridiculous. Yeah. Ridiculousness as a

643
00:45:13,440 --> 00:45:19,440
service. We admittedly are at a
dark phase in some of this stuff,

644
00:45:19,480 --> 00:45:22,719
but I'm pretty sure I still
want people to be the compassionate ones

645
00:45:23,239 --> 00:45:29,039
that are helping set our values and
are setting the rules for the software to

646
00:45:29,480 --> 00:45:37,639
do its thing. Yeah, I
know, I'm such a party pooper. Fine,

647
00:45:38,960 --> 00:45:44,639
I can't have President GPT. I
mean, you know, I would

648
00:45:44,679 --> 00:45:49,480
I would love somebody to do
an experiment at least. In fact,

649
00:45:49,480 --> 00:45:52,239
that was one of the first things
I asked when auto GPT came out,

650
00:45:52,239 --> 00:45:54,360
which is like one of the first
agent systems that somebody built. Almost

651
00:45:54,400 --> 00:45:58,920
immediately when GPT four was live,
they're like, what if we put this

652
00:45:58,920 --> 00:46:00,800
in a loop, and what if.
So I asked it, how do you

653
00:46:00,800 --> 00:46:06,119
elect yourself mayor of San Francisco?
Because you know, it just kind of

654
00:46:06,159 --> 00:46:09,039
sucks here in that regard, and
it immediately knew, oh, I'm going

655
00:46:09,119 --> 00:46:13,840
to have to get a straw candidate,
because AIs can't get elected. But if

656
00:46:13,880 --> 00:46:17,199
I can control a candidate and they
just trust me with the decisions, I

657
00:46:17,199 --> 00:46:21,760
don't know, it still sounds dystopian. I mean, maybe we should stop watching

658
00:46:22,519 --> 00:46:25,199
this stuff and being like, wouldn't
these be great products? Yeah, here's

659
00:46:25,239 --> 00:46:29,599
the real issue, Aaron, is
that we decided to attach the term artificial

660
00:46:29,639 --> 00:46:35,960
intelligence to this technology. Yeah,
because that has had seventy years of science

661
00:46:36,000 --> 00:46:38,760
fiction, you know, working
against it. You know, the first

662
00:46:38,760 --> 00:46:43,280
time the public hears the phrase artificial
intelligence is in Two Thousand and One: A

663
00:46:43,360 --> 00:46:47,159
Space Odyssey, and HAL tries to kill
everybody. Like, we have set a path,

664
00:46:47,199 --> 00:46:52,239
and now you want to hang that
moniker on this software. Yeah,

665
00:46:52,360 --> 00:46:57,679
It was a mistake,
because it's hard enough to introduce new automation

666
00:46:57,840 --> 00:47:04,719
into society, but to introduce new
automation to society with decades of telling me

667
00:47:04,800 --> 00:47:07,039
it's going to kill everybody, like, it's kind of a dumb thing to

668
00:47:07,119 --> 00:47:10,000
have done. I'm going to take
it up with the branding department of AI

669
00:47:10,159 --> 00:47:15,239
and see if we can come up
with a better moniker. Good luck.

670
00:47:15,320 --> 00:47:17,519
Unfortunately, it's probably stuck. I
think. I think once it's like out

671
00:47:17,559 --> 00:47:21,079
there, it's kind of just gonna
be a thing. But Genie's out of

672
00:47:21,119 --> 00:47:24,000
the bottle. It's a little
late now, right? And the joke,

673
00:47:24,079 --> 00:47:29,199
of course, is that Minsky came
up with the term to convince the

674
00:47:29,320 --> 00:47:34,679
US Army to give him money.
It has always been a marketing term to

675
00:47:34,920 --> 00:47:37,760
raise money. That's the sort of
reality of it. Now, every so

676
00:47:37,840 --> 00:47:42,719
often you build some software, and it's
happened several times. You can call them

677
00:47:42,840 --> 00:47:46,159
AI winters if you want, but
the iteration is the same. Science works

678
00:47:46,199 --> 00:47:49,880
on the problem for a while until
it gets to a certain level where the

679
00:47:49,960 --> 00:47:53,639
engineers take over and apply it,
and once the applications are done, it's

680
00:47:53,679 --> 00:47:58,119
just software, right? Right.
The joke term I've used

681
00:47:58,119 --> 00:48:00,800
for this has been artificial intelligence is
what you call a piece of technology when

682
00:48:00,840 --> 00:48:05,760
it doesn't work, right, Because
as soon as it does work, it

683
00:48:05,920 --> 00:48:12,519
gets a new name. It becomes
planning and optimization software, it becomes vision

684
00:48:12,559 --> 00:48:16,000
image recognition, it becomes a large language
model. Right. You

685
00:48:16,000 --> 00:48:20,159
know, I do a lot of
investing. If you call it AI technology,

686
00:48:20,199 --> 00:48:22,599
you just told me you have stuff
that doesn't work. When you call

687
00:48:22,639 --> 00:48:28,280
it something else, then okay,
let's take a look at it. That's

688
00:48:28,320 --> 00:48:30,639
a great point because I think a
lot of people think they just got to

689
00:48:30,679 --> 00:48:32,400
slap AI, you know, like
peanut butter, on my product and it gets

690
00:48:32,440 --> 00:48:36,119
better. Well, I have a
failed startup that says it does not,

691
00:48:36,639 --> 00:48:39,480
yeah, work that way. And
I think the best products that use AI

692
00:48:39,599 --> 00:48:43,559
don't mention AI at all. They
just solve a problem. No, because

693
00:48:44,239 --> 00:48:45,760
I think it's plumbing. I think
it's just going to be part of the

694
00:48:45,800 --> 00:48:52,199
software. The ones that do,
it's questionable whether what we think of as

695
00:48:52,199 --> 00:48:57,239
AI is involved at
all. It's just algorithmic programming, right?

696
00:48:57,320 --> 00:49:00,079
I mean, they all want
to slap the AI moniker on it

697
00:49:00,119 --> 00:49:06,480
when there may not be any,
you know, machine learning, any of

698
00:49:07,559 --> 00:49:08,960
that kind of thing. It could
be a series of if statements. But listen,

699
00:49:09,320 --> 00:49:15,840
without a doubt, there is a
software technology using probabilistic language analytics that

700
00:49:15,039 --> 00:49:22,320
is getting effective results. And
that's it, right? Like now, there's

701
00:49:22,480 --> 00:49:27,840
an interesting area to think about:
how we revise UXes and how we can

702
00:49:27,920 --> 00:49:32,960
put software in more places. Like, this
feeds into the other stories of ubiquitous computing,

703
00:49:34,599 --> 00:49:37,159
and you know the fact that we're
making them in a small form factor

704
00:49:37,199 --> 00:49:38,119
we want to put on our shirt
and so forth. It speaks to, you know,

705
00:49:38,440 --> 00:49:42,880
I want an even more personal compute
device than the slab of black glass

706
00:49:42,880 --> 00:49:45,920
that Apple gave me. You know, we are ripe for change in this

707
00:49:45,960 --> 00:49:52,800
particular space. But I think part
of the problem we're having right now is

708
00:49:53,000 --> 00:50:01,880
this Ultron versus Jarvis kind of science
fiction crap that's infected our thinking around an

709
00:50:01,960 --> 00:50:07,519
interesting piece of software that we can
do some good with well, and the

710
00:50:07,559 --> 00:50:09,400
thing that ChatGPT nailed wasn't the
fact that it's AI. Yeah, it's

711
00:50:09,400 --> 00:50:14,960
great. What ChatGPT nailed was, what
people want is to ask questions

712
00:50:14,960 --> 00:50:20,159
to a box and get moderately or
ideally perfectly accurate answers to my questions,

713
00:50:20,679 --> 00:50:23,119
regardless of what they are. I'd
debate the accurate part. They

714
00:50:23,119 --> 00:50:30,599
wanted answers. They wanted responses that
were spoken confidently, right? And

715
00:50:30,639 --> 00:50:34,480
apparently many people are happy with the
answers even if they are not right.

716
00:50:34,679 --> 00:50:38,920
That was sufficient. Now, when
I'm doing these AI talks, I bring

717
00:50:39,000 --> 00:50:44,320
up AlphaFold two, because I think
it's one of the most important pieces of

718
00:50:44,519 --> 00:50:47,000
machine learning software that almost nobody knows
about. Is this the Wolfram thing?

719
00:50:47,400 --> 00:50:51,360
No, this is AlphaFold. It
comes from DeepMind. These are the

720
00:50:51,360 --> 00:50:55,480
guys who solved Go, okay, but
then they reapplied the technology to protein folding.

721
00:50:57,000 --> 00:50:59,679
And again it's like, this is
pretty esoteric stuff. But there's a

722
00:50:59,679 --> 00:51:04,039
few things here that I find
important. One, it's doing protein folding.

723
00:51:04,159 --> 00:51:08,599
It's computing protein-fold results that are
taking years to even validate. Like,

724
00:51:08,800 --> 00:51:12,400
we're not even sure if it's
all right yet. But everything that we've been

725
00:51:12,440 --> 00:51:15,599
able to prove so far, it's
getting far better results than anything else we've

726
00:51:15,639 --> 00:51:20,760
ever done. But also like the
competition, the CASP competition, which is

727
00:51:20,880 --> 00:51:24,480
very much like the ImageNet competition that
kicked off vision recognition in the twenty tens,

728
00:51:25,239 --> 00:51:30,519
The CASP competitions had been going on for
a few years. Then, what, four years

729
00:51:30,559 --> 00:51:32,960
ago was the first time AlphaFold
jumped in on it, and they open

730
00:51:34,039 --> 00:51:37,360
sourced the whole frickin' thing. The
next year it was like half

731
00:51:37,440 --> 00:51:42,480
AlphaFold, but this year's
CASP event, it's all AlphaFold.

732
00:51:43,239 --> 00:51:46,880
Everybody's using it, Like every scientist
looked at this and said this is a

733
00:51:46,880 --> 00:51:51,280
better way to work on protein problems, and they've all adopted it. Like

734
00:51:52,440 --> 00:51:55,920
this is where we're seeing automation and
new technology at its best. There's nobody

735
00:51:57,000 --> 00:52:00,719
trying to make a billion dollars off
of this yet, right. When products start

736
00:52:00,760 --> 00:52:04,000
to appear from it, then things
might get ugly. But right now,

737
00:52:04,079 --> 00:52:10,960
in the research phase, everything has
been shared, so this non-deterministic algorithm for

738
00:52:12,119 --> 00:52:19,880
computing behavior at the very low level
of biology is open to every scientist who's

739
00:52:19,920 --> 00:52:22,840
interested in this space, and they're
all testing each other, they're all competing

740
00:52:22,880 --> 00:52:28,440
with each other to make it better. We went from, remember, what you

741
00:52:28,440 --> 00:52:30,440
know, it was the XKCD where it's
like, identify a bird in this picture? It's

742
00:52:30,440 --> 00:52:34,000
like, I need a huge research
budget. Now it's a trivial thing

743
00:52:34,119 --> 00:52:37,840
you can do on your phone.
We may be up against that for understanding

744
00:52:37,880 --> 00:52:43,679
the fundamentals of biology and complex biological
organisms with alpha fold Like, it's so

745
00:52:43,840 --> 00:52:46,760
profound. And I love
that it's not being celebrated, in some respects,

746
00:52:47,519 --> 00:52:52,880
because people are too busy working with it.
You know, this current detonation around LLMs

747
00:52:53,119 --> 00:52:58,360
is about the scientists moving on, right? The Hintons and Sutskevers of the world are

748
00:52:58,440 --> 00:53:01,119
largely saying, hey, gen AI
has gone as far as it can go, and

749
00:53:01,159 --> 00:53:05,760
that's like, okay, we figured
out concrete. Now you engineers can play

750
00:53:05,920 --> 00:53:09,159
with it, and we as engineers
now get to change the world with it.

751
00:53:09,280 --> 00:53:13,159
Sam Altman, in a recent talk I
literally just listened to this morning,

752
00:53:13,239 --> 00:53:15,119
but it was a really good talk. I think he was at MIT or

753
00:53:15,119 --> 00:53:19,760
something. He talks about how great
GPT five is going to be. But

754
00:53:19,800 --> 00:53:22,840
what we want to be able to
distill is the ability to figure out the

755
00:53:22,840 --> 00:53:29,039
reasoning part separate from the architecture that
does the, kind of, large language, more

756
00:53:29,159 --> 00:53:31,440
data part. Somehow, we
added more data and we got something

757
00:53:31,440 --> 00:53:37,239
that looks like reasoning, and we're
not entirely sure it's real reasoning or what

758
00:53:37,400 --> 00:53:40,599
it really is. But if we
start to understand the architecture of reasoning in

759
00:53:40,639 --> 00:53:45,840
a much more nuanced way, I
think that's where the next big breakthrough is

760
00:53:45,840 --> 00:53:47,519
going to come from. At least
that's based on what I see today. I mean,

761
00:53:47,519 --> 00:53:52,039
it can change tomorrow with whatever papers come out. But once we

762
00:53:52,079 --> 00:53:55,440
get better kind of synthetic reasoning and
we understand that architecture, that's going to

763
00:53:55,440 --> 00:54:00,599
probably be the next big thing. I certainly look
at it from the smell perspective and say, are

764
00:54:00,639 --> 00:54:02,760
we still just making it bigger
because we don't actually understand it? Or,

765
00:54:02,840 --> 00:54:07,960
I'm hoping that GPT
five is much more reasoned, that it's

766
00:54:07,280 --> 00:54:10,599
not automatically bigger, that we're
more focused in areas where we're starting to

767
00:54:10,639 --> 00:54:17,119
do more discrimination, you know,
effectively understanding and testing your language in deeper layers

768
00:54:17,119 --> 00:54:22,159
before you present it. I don't
know if we're there yet. There is

769
00:54:22,199 --> 00:54:24,840
a set of code smells that you
see in every rapidly evolving piece of technology,

770
00:54:24,880 --> 00:54:29,079
and one of them is right sizing, and you are starting to see

771
00:54:29,480 --> 00:54:32,679
movement towards right sizing. But what
you're describing then is, rather than wait

772
00:54:32,719 --> 00:54:39,320
for the next research breakthrough from the
scientists, that engineers iterate and find incremental

773
00:54:39,360 --> 00:54:45,360
improvements in iteration that occasionally can lead
to another breakthrough. Well, and a

774
00:54:45,360 --> 00:54:49,400
lot of this. You talked about
anthropomorphizing earlier in the episode, and I

775
00:54:49,480 --> 00:54:54,280
want to call out there's a great
researcher, Ethan Mollick, I think a professor at Wharton

776
00:54:54,360 --> 00:54:58,280
or something like that, and he
wrote a great book called Co-Intelligence,

777
00:54:58,320 --> 00:55:02,199
and he talks about actually the virtuousness
of using a limited form of anthropomorphism, that

778
00:55:02,440 --> 00:55:08,119
I can't even say the word right, but talking in human language and optimizing your

779
00:55:08,119 --> 00:55:12,639
prompts that way. So we saw
that even just asking it to think step

780
00:55:12,639 --> 00:55:15,440
by step, if you do a
ChatGPT question and say, think step

781
00:55:15,440 --> 00:55:19,559
by step, people have demonstrated you
get better results.
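
A sketch of that prompt tweak, with a placeholder complete() function standing in for whatever chat-completion API you call; the question is the classic bat-and-ball puzzle, chosen only as an example of a multi-step problem:

# Chain-of-thought nudge: same question, with and without the
# "think step by step" instruction appended to the prompt.
QUESTION = ("A bat and a ball cost $1.10 in total. The bat costs "
            "$1.00 more than the ball. How much does the ball cost?")

def complete(prompt: str) -> str:
    # Placeholder for a real chat-completion call.
    return "(model reply here)"

plain = complete(QUESTION)
stepwise = complete(QUESTION + "\nLet's think step by step.")

# In published evaluations, the second form tends to give more
# accurate answers on multi-step reasoning problems.
print(plain)
print(stepwise)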

782
00:55:19,599 --> 00:55:24,800
What I really look forward to is the application of industrial
organizational psychology to using LLMs, and seeing if

783
00:55:24,920 --> 00:55:30,679
different techniques such as how people talk
to each other, how organizations work,

784
00:55:30,719 --> 00:55:35,880
and whether organizations of LLMs, when
you have multiple agents working with each other,

785
00:55:36,440 --> 00:55:39,280
share some of those same properties,
which on a conjecture basis, we

786
00:55:39,280 --> 00:55:45,039
haven't proven this, but hypothetically they
should, because LLMs are trained on people,

787
00:55:45,079 --> 00:55:49,400
and they're also trained on people's interactions, and that's the important part. It's

788
00:55:49,400 --> 00:55:53,559
not that speaking to it more like
a human makes it more human. It

789
00:55:53,599 --> 00:55:58,800
was trained on human data. Yeah, so obviously it's going to work more

790
00:55:58,800 --> 00:56:02,119
effectively if you work with the data
set as it is. And again I'm

791
00:56:02,159 --> 00:56:08,599
always balancing this: what came from us, and is the reason that things

792
00:56:08,599 --> 00:56:14,320
work the way they do, versus
imbuing any of that capability in the software

793
00:56:14,360 --> 00:56:17,000
itself. It's software. It's a
reflection.

794
00:56:17,320 --> 00:56:21,880
This is why you always thank
ChatGPT and talk to it nicely,

795
00:56:22,039 --> 00:56:24,960
because studies have shown when you ask
somebody nicely for something, you get

796
00:56:25,000 --> 00:56:30,000
a better result than being a jerk
to it. And that's the ultimate anthropomorphizing.

797
00:56:30,119 --> 00:56:36,039
Does this mean my wife should stop
telling Alexa to shut up instead

798
00:56:36,039 --> 00:56:39,239
of saying stop? She does that.
And I don't think there's an LLM in

799
00:56:39,320 --> 00:56:44,719
Alexa yet, so you have
nothing to worry about. I forgot what Alexa

800
00:56:44,800 --> 00:56:46,360
was. I thought you were actually
talking about a child. Okay, now

801
00:56:46,440 --> 00:56:50,599
no, Alexa. Yeah, no,
no, no. I'm starting

802
00:56:50,639 --> 00:56:52,960
a new meme where if you put
'schm' in front of anything that you don't like,

803
00:56:53,239 --> 00:56:58,760
that's a diminutive. So, you know,
this schmoffee. Oh my god.

804
00:56:58,840 --> 00:57:01,440
Well, plus not saying the A-word,
because folks are sometimes playing this in open

805
00:57:01,480 --> 00:57:06,159
air and it triggers their devices, which makes
them sad. And that's been a running

806
00:57:06,199 --> 00:57:08,559
joke in dot NetRocks too. So
we've got a lot of mileage out of that.

807
00:57:08,760 --> 00:57:12,599
Sounds like you're working on some amazing
stuff, Aaron. I can't wait

808
00:57:12,639 --> 00:57:15,440
to see some of it in the
field. Yeah, like the emergence of

809
00:57:15,519 --> 00:57:20,679
autonomous agents. I hear the noise
in a lot of different places. Not

810
00:57:20,760 --> 00:57:23,239
the product yet, but obviously it's
a goal. And I would categorize the

811
00:57:23,320 --> 00:57:25,719
show as a geek out. I
mean, we really went to a lot of

812
00:57:25,719 --> 00:57:30,480
places that weren't, you know, real practical day-to-day programming, and

813
00:57:31,000 --> 00:57:35,199
these are great shows to do from
time to time. So thanks, Aaron.

814
00:57:35,519 --> 00:57:38,159
I'm so happy to be here.
It's been fifteen years. I feel like

815
00:57:38,199 --> 00:57:42,480
maybe I've got a few more wrinkles. Yeah, maybe, y'all do too,

816
00:57:42,639 --> 00:57:45,400
but I feel like maybe a little
bit more wisdom as well, or

817
00:57:45,440 --> 00:57:49,719
maybe not, but you know,
I'm excited to see y'all. It's great to

818
00:57:49,719 --> 00:57:52,639
be on the show. And good
luck in the future, and that means everybody,

819
00:57:53,000 --> 00:57:57,400
and thank you for listening to dot
NetRocks, and we'll see you next time.

820
00:57:58,039 --> 00:58:22,440
Dot NetRocks is brought to
you by Franklin's Net and produced by

821
00:58:22,519 --> 00:58:28,480
Pwop Studios, a full service audio, video and post-production facility located physically

822
00:58:28,519 --> 00:58:32,440
in New London, Connecticut, and
of course in the cloud online at pwop

823
00:58:32,719 --> 00:58:37,440
dot com. Visit our website at
d O T N E, t R

824
00:58:37,480 --> 00:58:42,280
O c k S dot com for
RSS feeds, downloads, mobile apps,

825
00:58:42,440 --> 00:58:45,719
comments, and access to the full
archives going back to show number one,

826
00:58:45,920 --> 00:58:50,760
recorded in September two thousand and two. And make sure you check out our

827
00:58:50,760 --> 00:58:54,039
sponsors. They keep us in business. Now go write some code. See

828
00:58:54,079 --> 00:59:05,280
you next time.

