1
00:00:01,000 --> 00:00:04,759
How'd you like to listen to .NET
Rocks with no ads? Easy!

2
00:00:05,320 --> 00:00:09,880
Become a patron! For just five dollars
a month you get access to a private

3
00:00:10,000 --> 00:00:14,359
RSS feed where all the shows have
no ads. Twenty dollars a month will

4
00:00:14,400 --> 00:00:18,800
get you that and a special .NET
Rocks patron mug. Sign up now

5
00:00:18,839 --> 00:00:24,199
at patreon dot dotnetrocks dot
com. Hey, Carl and Richard here. As

6
00:00:24,239 --> 00:00:29,519
you may have heard, NDC is
back offering their incredible in person conferences around

7
00:00:29,559 --> 00:00:33,960
the world, and we'd like to
tell you about them. NDC Copenhagen is

8
00:00:34,000 --> 00:00:39,600
happening August twenty seventh through the thirty
first. Go to NDC Copenhagen dot com

9
00:00:39,799 --> 00:00:45,039
for more information. NDC Porto is
happening October sixteenth through the twentieth. Go

10
00:00:45,159 --> 00:00:50,119
to NDCPorto dot com to register and
check out the full lineup of conferences at

11
00:00:50,200 --> 00:00:54,840
NDC Conferences dot com. Hey there, this is Jeff Fritz, the purple

12
00:00:54,880 --> 00:00:59,280
blazer guy from Microsoft, letting you
in on a little secret about my friend

13
00:00:59,439 --> 00:01:03,560
Carl Franklin. You know, the
guy who started .NET Rocks, the

14
00:01:03,640 --> 00:01:07,439
first podcast about .NET in two
thousand and two. The guy who's been

15
00:01:07,480 --> 00:01:14,599
teaching Blazor on YouTube since twenty twenty. Yeah, that Carl Franklin. Well,

16
00:01:14,959 --> 00:01:18,640
Carl's joined up with the folks from
Code in a Castle to teach a week

17
00:01:18,760 --> 00:01:23,120
long, hands-on Blazor class at... are
you ready for this? At a

18
00:01:23,239 --> 00:01:30,200
castle slash villa in Tuscany. It's
sort of a luxury vacation with Blazor learning

19
00:01:30,280 --> 00:01:37,359
built in. Carl's calling it the
Blazor Master Class. You'll learn Blazor from

20
00:01:37,359 --> 00:01:41,680
the ground up, finishing the week
with the ability to build and deploy Blazor

21
00:01:41,719 --> 00:01:47,120
applications. Since the training happens for
only four hours in the morning over six

22
00:01:47,200 --> 00:01:51,680
days, you can bring your significant
other, your partner with you and you

23
00:01:51,719 --> 00:01:57,319
should, right? This part of Italy is
absolutely beautiful. There's so much to see

24
00:01:57,400 --> 00:02:02,159
and do, and Larry and Marco
from Code in a Castle are organizing daily activities

25
00:02:02,200 --> 00:02:07,760
both at the castle and in the
area. The castle is in the Maremma,

26
00:02:07,840 --> 00:02:13,319
a less touristed region of Tuscany,
offering both classic Tuscan hill country as

27
00:02:13,319 --> 00:02:17,759
well as easy access to the Etruscan
Riviera, with sublime local food, wine

28
00:02:19,039 --> 00:02:23,120
and olive oil around every corner.
Breakfast is included every day. There will

29
00:02:23,120 --> 00:02:28,919
be two communal dinners at the castle
bookending the experience, and most other meals

30
00:02:29,080 --> 00:02:34,439
and all activities are included. And
did I mention you'll learn Blazor in person

31
00:02:34,520 --> 00:02:38,159
from Carl Franklin? Listen, space
is limited, and for very good reason.

32
00:02:38,439 --> 00:02:45,000
This is quality training in a beautiful
setting. Go to codeinacastle dot

33
00:02:45,000 --> 00:02:53,039
com slash blazor twenty twenty three, that's
B-L-A-Z-O-R two zero two three,

34
00:02:53,080 --> 00:02:58,599
to take advantage of this amazing opportunity
to join Carl in Tuscany for an unforgettable

35
00:02:58,639 --> 00:03:13,680
week of la dolce vita while advancing
your programming skills in this important new technology.

36
00:03:17,120 --> 00:03:21,199
Welcome back to .NET Rocks.
I'm Carl Franklin, and this is

37
00:03:21,280 --> 00:03:25,639
Richard Campbell. What's going on over
on your side of the continent, my

38
00:03:25,639 --> 00:03:30,080
friend? Ah, packing boxes and hauling them
up to the coast. Like, after

39
00:03:30,080 --> 00:03:34,280
we finish the show today, I'm
literally loading the truck and taking a run

40
00:03:34,400 --> 00:03:38,719
up. We're trying not to end
up with a storage locker, seeing how

41
00:03:38,719 --> 00:03:40,840
we're moving. We have a very
big house that we're moving out of, with a

42
00:03:40,840 --> 00:03:46,120
lot of stuff, into a smaller house
that's already furnished. Yeah, so you

43
00:03:46,159 --> 00:03:50,680
know, it's just a juggling of
a lot of stuff. But you're going

44
00:03:50,719 --> 00:03:53,560
to have a party. Oh,
there's a few parties. There's been a

45
00:03:53,560 --> 00:03:55,319
few parties. Well, let me
know, I might just make the trek.

46
00:03:55,439 --> 00:03:59,360
It's a long way out, but
yeah, we'll figure something out.

47
00:03:59,400 --> 00:04:00,879
Certainly. When we're settled in up
there, we'll we'll do something fun.

48
00:04:00,919 --> 00:04:05,560
But it's good to be married
to a woman who tends to argue

49
00:04:05,599 --> 00:04:10,360
with a spreadsheet anyway, because it didn't
take very long to figure out how

50
00:04:10,439 --> 00:04:12,719
much stuff the garage holds. Said, okay,
that's how much stuff we're allowed

51
00:04:12,719 --> 00:04:15,639
to keep, Yeah, and then
condensed accordingly. Love it. Yeah,

52
00:04:15,920 --> 00:04:19,680
it's organized. Well, that's good
to hear, Richard. Let's get started

53
00:04:19,720 --> 00:04:30,279
with Better Know a Framework. Awesome. All right man, what do you got?

54
00:04:30,519 --> 00:04:33,319
Well, this being show eighteen fifty
seven, you can go to eighteen

55
00:04:33,439 --> 00:04:40,920
fifty seven dot twamp dot me,
and that's a link to an article at

56
00:04:40,959 --> 00:04:45,279
Digital Trends, which I think is
kind of important. Maybe it's important.

57
00:04:45,319 --> 00:04:48,399
We'll find out, but anyway,
it's called "Here's why people think GPT-4

58
00:04:48,600 --> 00:04:55,160
might be getting dumber over time."
As impressive as GPT-4 was at launch,

59
00:04:55,240 --> 00:04:59,800
some onlookers have observed that it has
lost some of its accuracy and power.

60
00:05:00,560 --> 00:05:02,279
First of all, was it ever
accurate? I don't know, but

61
00:05:03,759 --> 00:05:11,439
sometimes. I mean, the underlying
Harvard study is pretty great. Yeah,

62
00:05:11,639 --> 00:05:14,959
and they are modeling it well.
I thought the bigger message from this thing

63
00:05:15,120 --> 00:05:18,319
was they're doing updates and not telling
anyone. Yes, right, like this

64
00:05:18,399 --> 00:05:23,920
is all guesswork from everybody else.
It's that they're constantly refreshing this data set,

65
00:05:23,959 --> 00:05:27,319
it seems, and they're not really
talking about where their sources are or

66
00:05:27,360 --> 00:05:31,680
anything like that. No, we
don't, We don't really know. Yeah,

67
00:05:31,720 --> 00:05:35,319
Well, anyway. I've also
noticed that... don't ask it to

68
00:05:35,319 --> 00:05:42,360
do math. That's not its thing. It doesn't. It will lie and

69
00:05:42,439 --> 00:05:46,160
be very adamant that it is
correct. Yeah... yeah, in

70
00:05:46,199 --> 00:05:50,439
the end, when you realize it's
basically just text driven and you're hoping that

71
00:05:50,560 --> 00:05:54,920
the text on the Internet was accurate, which is a hilarious statement all by

72
00:05:54,959 --> 00:05:58,560
itself. It is absolutely hilarious.
Well, yeah, I'm sure Amber's got

73
00:05:58,600 --> 00:06:01,519
some opinions about what we're talking about, but we'll bring her on in a

74
00:06:01,519 --> 00:06:05,000
minute. First of all, who
is talking to us today, Richard Campbell?

75
00:06:05,240 --> 00:06:08,720
I grabbed a comment off of show
sixteen oh five. So that goes

76
00:06:08,759 --> 00:06:12,720
back a bit, to December of
twenty eighteen, and not really about large

77
00:06:12,759 --> 00:06:15,639
language models, because who was talking
about large language models in twenty eighteen?

78
00:06:15,680 --> 00:06:17,360
But this was a show we did
with Jared Rhodes, who's an MVP,

79
00:06:18,600 --> 00:06:24,040
and we were talking about IoT and
edge computing at the time, and mostly

80
00:06:24,040 --> 00:06:28,480
what we were talking about was this
continuous data collection mechanism, like just our

81
00:06:28,519 --> 00:06:31,120
ability to collect a lot of data
about people, especially with so much compute

82
00:06:31,120 --> 00:06:34,959
in so many places, with so
much ability to gather and stream that information

83
00:06:35,040 --> 00:06:40,399
back. And we ended up in
an ethical conversation around data collection. And

84
00:06:40,439 --> 00:06:43,800
so Mike had this comment on the
show shortly after it was published. He

85
00:06:43,839 --> 00:06:46,519
says, this is a great ethical
discussion about collecting data. I always revert

86
00:06:46,600 --> 00:06:49,800
back to Kranzberg's first law of tech,
which is: tech is neither good nor

87
00:06:49,879 --> 00:06:55,199
bad, nor is it neutral,
which you know, I actually went and

88
00:06:55,240 --> 00:06:59,839
looked it up. It doesn't have
to be, right? It's like, again,

89
00:07:00,040 --> 00:07:04,879
we are personifying or anthropomorphizing technology.
It's the people using the tool that

90
00:07:05,079 --> 00:07:09,600
is the issue. So it just
is. Yeah, well, yeah, no,

91
00:07:09,680 --> 00:07:12,040
the thing is when we attribute it
to the technology rather than the people who

92
00:07:12,120 --> 00:07:15,600
operate it, right? Right, we're
giving agency where agency doesn't exist. The

93
00:07:15,720 --> 00:07:19,160
mentioned store's data gathering is troubling. So this is one of the devices

94
00:07:19,199 --> 00:07:24,439
we were talking about. You obviously
opt in by seeing the monitor and continuing

95
00:07:24,480 --> 00:07:27,319
to enter. But is that enough? Shouldn't the customer know to what extent

96
00:07:27,360 --> 00:07:30,079
their behavior will be analyzed and for
what purpose? What about the patrons who,

97
00:07:30,120 --> 00:07:32,959
beyond their own control, only have
access to that store? Are you

98
00:07:33,000 --> 00:07:36,399
now alienating their business and taking advantage
of the situation to gather data? So

99
00:07:36,439 --> 00:07:42,920
this is the store monitoring system where
they're figuring out what people's buying behavior is

100
00:07:43,160 --> 00:07:46,959
just by having these sensors, right?
Yeah. Programming ethics have always fascinated me.

101
00:07:47,160 --> 00:07:49,920
Recent protests... of course, this is
twenty eighteen, if you can think

102
00:07:49,959 --> 00:07:55,920
back to then... recent protests by
Google employees over developing censoring products for the

103
00:07:56,000 --> 00:08:01,040
Chinese government, and military machine learning for
the US government, highlight the backlash these companies

104
00:08:01,079 --> 00:08:05,079
face when adopting technologies for different applications. But it always boils down to this.

105
00:08:05,439 --> 00:08:07,319
You will always be able to find
a developer to do what you want

106
00:08:07,360 --> 00:08:11,560
to have done. Right? There
will always be someone with the ethics that

107
00:08:11,639 --> 00:08:13,720
align with yours or who's desperate enough
to do that work for you. This

108
00:08:13,879 --> 00:08:18,279
isn't unique to programming. However,
programming has the unique advantage of being cheap

109
00:08:18,360 --> 00:08:22,959
to scale, so one developer can
have a profound impact on humanity versus

110
00:08:22,959 --> 00:08:26,360
one unethical police officer or one unethical
investor. Had Bernie Madoff known how

111
00:08:26,360 --> 00:08:31,800
to make an ASP.NET site, yeah, he could have offered his

112
00:08:31,879 --> 00:08:35,679
services anonymously to many more people.
So it's always a great discussion on what

113
00:08:35,720 --> 00:08:39,600
to do responsibly. How do companies
talk about this? My experience is most

114
00:08:39,799 --> 00:08:43,639
don't, but we will eventually need
to. So it just seemed to be on

115
00:08:43,759 --> 00:08:48,679
point with what we were talking about
today, and certainly what I love is

116
00:08:48,759 --> 00:08:52,720
dragging out a conversation from five years
ago that's still pretty relevant, kind

117
00:08:52,720 --> 00:08:56,440
of on point, kind of there. Hey, Mike, thanks so much

118
00:08:56,440 --> 00:08:58,000
for your comment; a copy of
Music to Code By is on its way

119
00:08:58,000 --> 00:09:01,000
to you. And if you'd like
a copy of Music to Code By, write a

120
00:09:01,039 --> 00:09:03,279
comment on the website at dot net
rocks dot com or on the Facebooks.

121
00:09:03,320 --> 00:09:07,080
We publish every show there, and if
you comment there and we read it on the show, we'll

122
00:09:07,080 --> 00:09:09,960
send you a copy of Music to Code By.
And you can definitely follow us on Twitter

123
00:09:09,000 --> 00:09:13,480
if you want, but we're on
Mastodon. I'm on Mastodon a lot more

124
00:09:13,480 --> 00:09:15,720
these days. I don't know
about you, Richard, but I'm here

125
00:09:15,720 --> 00:09:16,759
and there, you know. Yeah, if you want to follow me on

126
00:09:16,840 --> 00:09:22,320
Mastodon, it's Carl Franklin at TechHub
dot social, and I'm Rich Campbell at

127
00:09:22,360 --> 00:09:26,759
mastodon dot social. Send us a
toot. A toot toot? A boot scoot?

128
00:09:28,840 --> 00:09:33,440
You're back there, are you?
Okay. Toot scoot boogie. Okay,

129
00:09:33,559 --> 00:09:39,639
let's bring in Amber. Amber McKenzie
is head of data science and analytics at

130
00:09:39,799 --> 00:09:46,000
HR tech startup Fama. Her
background is in computer science and linguistics,

131
00:09:46,120 --> 00:09:48,919
and she's been doing data science,
machine learning, and natural language processing for

132
00:09:48,960 --> 00:09:52,519
almost fifteen years now. Welcome back, Amber. Yeah, thanks for having

133
00:09:52,559 --> 00:09:56,039
me. Welcome, thanks for being
here. And we had the ethics and

134
00:09:56,279 --> 00:10:01,639
AI conversation with you like four years
ago. But I think when I wrote

135
00:10:01,639 --> 00:10:03,679
the email to you, it was like,
I don't know if you've noticed, but

136
00:10:03,759 --> 00:10:07,480
some stuff has happened, and you pretty
much picked up that thread. You're like,

137
00:10:07,600 --> 00:10:11,600
yeah, a few things have happened, little stuff here and there Yeah,

138
00:10:11,600 --> 00:10:15,720
it's an interesting time. Are we
now in a place where we're really

139
00:10:15,799 --> 00:10:20,240
rechallenging that debate? Like, four years
ago we were talking about sort of

140
00:10:20,559 --> 00:10:24,279
the bias in the machine learning models
and that kind of thing. Is that

141
00:10:24,759 --> 00:10:28,720
is ChatGPT that writ
large? You know, it's interesting to

142
00:10:28,759 --> 00:10:33,639
me and I've had this conversation a
lot with people lately. There is sort

143
00:10:33,679 --> 00:10:41,480
of this idea that ChatGPT is
like something new, way different, next

144
00:10:41,559 --> 00:10:46,519
level, and to some degree it
is. But in my mind, it's

145
00:10:46,559 --> 00:10:50,720
not any different than some of the
innovations we've had over the years. I

146
00:10:50,799 --> 00:10:56,279
mean, you know, we went
from always sending letters to sending email.

147
00:10:56,360 --> 00:11:00,240
That was a huge, big, different
thing, right? And, you

148
00:11:00,240 --> 00:11:01,840
know, when Watson came out,
everybody was like, Oh, that's going

149
00:11:01,919 --> 00:11:05,600
to change the game. That's going
to be the new thing, right,

150
00:11:05,679 --> 00:11:11,279
and then it was Google, and now
it's ChatGPT. And what essentially is

151
00:11:11,279 --> 00:11:16,200
going to happen is that the
opposite side, the people who are trying

152
00:11:16,200 --> 00:11:22,159
to regulate it, trying to create
things to combat you know, bias or

153
00:11:22,320 --> 00:11:26,200
combat, you know, deep fakes, that type of thing, is just going

154
00:11:26,279 --> 00:11:30,399
to catch up, right? It's
always kind of a game of cat and mouse.

155
00:11:30,799 --> 00:11:35,120
Yeah, yeah, it always is. Yeah, but at least we

156
00:11:35,200 --> 00:11:39,559
know what we're racing against, like
I would hope, Yeah, accuracy is

157
00:11:39,559 --> 00:11:46,000
actually important, it is. I
mean I don't know. I look at

158
00:11:46,519 --> 00:11:50,600
Yeah, I liken it a lot
to Google. We look to Google,

159
00:11:50,639 --> 00:11:54,840
and Google doesn't have all of the
right things. And when it first came

160
00:11:54,840 --> 00:11:58,600
out, we were like, you
know, oh, it's all magic information.

161
00:11:58,639 --> 00:12:01,399
And then everybody's like, well,
you gotta check the sources. I

162
00:12:01,399 --> 00:12:05,120
would argue that when we used to
all look up everything in encyclopedias, that

163
00:12:05,240 --> 00:12:11,480
wasn't all exactly right either. So,
clearly though, the way people

164
00:12:11,480 --> 00:12:15,120
are perceiving this change is different than
all of these other things. I mean,

165
00:12:15,200 --> 00:12:22,480
I don't remember that much kerfuffle between
governments and agencies getting together and saying,

166
00:12:22,559 --> 00:12:24,879
what are we going to do about
this? You know, we need

167
00:12:26,000 --> 00:12:28,720
regulation around this. I mean I
guess they did with Google and Microsoft to

168
00:12:28,759 --> 00:12:35,120
some extent, but you know,
other AI things haven't been, I guess, as hitting

169
00:12:35,159 --> 00:12:41,120
you over the head as this is. I'm glad to see it, though,

170
00:12:41,679 --> 00:12:45,480
I mean, it took... I remember,
I worked at Sandia in an

171
00:12:45,480 --> 00:12:50,679
internship and one of the problems they
had at the time is all the national

172
00:12:50,759 --> 00:12:56,919
labs and all the agencies weren't even
working together for you know, cybersecurity purposes.

173
00:12:58,320 --> 00:13:01,759
So, you know, they would come
up with... one person would come

174
00:13:01,840 --> 00:13:05,039
up with a solution for a zero
day attack, and then they wouldn't share

175
00:13:05,080 --> 00:13:09,639
it. And that's what they were
working on at the time, was just

176
00:13:09,759 --> 00:13:13,840
getting to a point where they would
even share information. So the fact that

177
00:13:13,879 --> 00:13:16,960
we're already like, hey, we've
got this new thing out there, we

178
00:13:18,000 --> 00:13:22,240
need to come together and put the
guardrails in place, is for me kind

179
00:13:22,279 --> 00:13:26,159
of a refreshing take on things. And it's like, oh yeah,

180
00:13:26,200 --> 00:13:31,720
people recognize now that it's not just
to put it out there, let it

181
00:13:31,799 --> 00:13:35,759
run wild thing. Let's figure out
the best way to you know, especially

182
00:13:37,159 --> 00:13:43,799
for the sort of non-technically inclined
people who don't you know, have some

183
00:13:43,840 --> 00:13:46,919
of the understanding that is, oh, I've got to take this with a

184
00:13:46,919 --> 00:13:50,240
grain of salt. I need to,
you know, check this or check that.

185
00:13:50,519 --> 00:13:54,039
So saying, hey, let's try
and put, you know, the bowling

186
00:13:54,120 --> 00:13:58,919
bumpers on and make this thing,
you know, go down the lane a

187
00:13:58,000 --> 00:14:01,919
little bit easier. Yeah, the
thing that worries me about government regulation

188
00:14:03,120 --> 00:14:07,279
isn't the regulation itself. I think
that's great, but it's that government takes

189
00:14:07,360 --> 00:14:13,519
so long to do things and to
regulate, and this thing is evolving faster

190
00:14:13,720 --> 00:14:18,559
than anything we've seen before in tech, at a much more rapid pace.

191
00:14:18,080 --> 00:14:22,759
For example, I just read an article
in The Verge that ChatGPT can

192
00:14:22,799 --> 00:14:28,919
now remember who you are and what
you want, with new custom instructions.

193
00:14:28,960 --> 00:14:31,279
You might not have to tell the
chat about your life story every time you

194
00:14:31,320 --> 00:14:35,159
have a question, but do you
know what you want the bot to know?

195
00:14:35,360 --> 00:14:39,399
I'll put a link to that.
That doesn't surprise me. And I

196
00:14:39,799 --> 00:14:43,200
think in those instances, the private
sector is going to be the one.

197
00:14:43,320 --> 00:14:50,639
I mean, essentially
the market opens up, yeah,

198
00:14:50,679 --> 00:14:54,120
for industry to say, hey,
let's find a tech that combats that,

199
00:14:54,240 --> 00:15:00,879
and there'll be a market for that, people looking to fill that

200
00:15:00,919 --> 00:15:05,879
gap. But I agree with you, the big cogs are slow. Yeah.

201
00:15:05,919 --> 00:15:09,559
I hope the government does big picture
regulation rather than you know, picking

202
00:15:09,559 --> 00:15:13,639
on these features and those features which
can easily change from under the regulations and

203
00:15:13,720 --> 00:15:18,320
screw everything up. That's where I
like the data source one you brought up

204
00:15:18,320 --> 00:15:20,480
before. It's like, just at least
publish where you got your data from.

205
00:15:20,639 --> 00:15:24,480
Publish where that came from. Yeah, I don't know if you see that

206
00:15:24,519 --> 00:15:28,159
as reasonable, Amber? Oh,
for sure. I mean, in terms

207
00:15:28,159 --> 00:15:33,480
of that's what's interesting about what's happened
recently. For me, it is not

208
00:15:33,679 --> 00:15:37,759
the development of these algorithms. It's...
I mean, that's been happening. They're

209
00:15:37,799 --> 00:15:45,600
just taking it to a new level. It was the mass marketing of

210
00:15:45,679 --> 00:15:50,120
it. They put marketing resources around
it, they packaged it, they made

211
00:15:50,120 --> 00:15:56,120
it suddenly visible, and that changes
the stakes of the game from something that

212
00:15:56,279 --> 00:16:00,600
is research-based. You put papers
out, people have to review the papers,

213
00:16:00,639 --> 00:16:06,360
they need to have reproducibility, you
know. Now it is a product,

214
00:16:06,399 --> 00:16:10,039
a commodity, which puts it in
a different space. Which, by

215
00:16:10,080 --> 00:16:12,799
the way, Microsoft just got a
nice, big old fat stock price bump

216
00:16:12,919 --> 00:16:18,279
on, like, the marketing alone, and
they're killing it in the stock market. We

217
00:16:18,399 --> 00:16:22,360
talked about this on Windows Weekly.
It's like, they put out at Inspire that

218
00:16:22,399 --> 00:16:26,320
it's thirty dollars a month per person,
and you're like, holy man, that's

219
00:16:26,320 --> 00:16:30,480
a lot. An E3 subscription's only
thirty-six dollars. But if you want

220
00:16:30,480 --> 00:16:34,360
the Copilot, now it's an extra thirty
bucks. Stock price goes up five percent.

221
00:16:34,639 --> 00:16:40,080
Like, if I'm Satya Nadella,
I'm laughing all the way to

222
00:16:40,120 --> 00:16:42,840
the bank. Like, that's a
lot of money. What's your opinion, Amber,

223
00:16:44,000 --> 00:16:48,840
on the general public's acceptance of this
stuff? I mean, it's been my

224
00:16:48,480 --> 00:16:52,639
experience that people are kind of like, oh, that's cool, you know,

225
00:16:52,720 --> 00:16:56,320
and they use it when they can
and they don't think too much about

226
00:16:56,320 --> 00:16:59,440
it. But I guess these new
regulations, some of the stuff that they're

227
00:16:59,480 --> 00:17:04,880
floating around sounds like: we want to
have some sort of watermark on anything that

228
00:17:04,920 --> 00:17:11,640
has been generated by AI, whether
it's text or video or, you know,

229
00:17:11,759 --> 00:17:15,039
whatever. But that of
course puts it in the hands of

230
00:17:15,079 --> 00:17:21,279
the content creators, and so you'll
have the problem where everybody

231
00:17:21,319 --> 00:17:25,240
who's following the law and everything will
use the watermark, and then those who

232
00:17:25,319 --> 00:17:29,200
want to publish stuff for nefarious purposes
won't use it. Honestly, that's the

233
00:17:29,279 --> 00:17:34,200
case with everything, right? Yeah,
you know, for me, it's

234
00:17:34,240 --> 00:17:40,200
one more thing that we have to
teach about. I mean, right now,

235
00:17:40,240 --> 00:17:45,400
even in my kids' English classes,
they are taught about primary sources.

236
00:17:45,480 --> 00:17:48,319
They are taught about not just finding
something on the web and taking it at

237
00:17:48,359 --> 00:17:52,400
face value, and that they can't
just take that information and roll it into an

238
00:17:52,519 --> 00:17:56,440
essay and their teachers are going to
take it. That's going to be just

239
00:17:56,559 --> 00:18:03,720
one more thing in a realm that
has to be taught. But I don't

240
00:18:03,759 --> 00:18:07,880
see... I mean, you see degrees. It's almost like the Kevin Bacon thing.

241
00:18:07,920 --> 00:18:11,759
You see, the degrees of separation. It's like, you know,

242
00:18:11,000 --> 00:18:15,720
we're in tech, we are uniquely
aware of it. And then you've got

243
00:18:15,799 --> 00:18:18,279
people who are pretty tech savvy who
are like, yeah, this is cool,

244
00:18:18,640 --> 00:18:21,880
maybe you know, you know,
like I have a friend who had

245
00:18:21,920 --> 00:18:25,400
it write his resignation letter, you
know, stuff like that. And then

246
00:18:25,440 --> 00:18:29,200
you get, you know, to
my parents who probably don't even know that

247
00:18:29,240 --> 00:18:33,839
it's happening right now. And it's
like that degree of separation requires just more

248
00:18:33,839 --> 00:18:41,279
and more education about it. And
I think that's what I hope the government

249
00:18:41,519 --> 00:18:42,759
sort of, I mean, even
if it takes a little while, the

250
00:18:44,119 --> 00:18:48,519
sort of the guardrails, if you
will, will essentially help those people who

251
00:18:48,440 --> 00:18:53,480
you know, aren't going to
go through the diligence of understanding it and

252
00:18:53,599 --> 00:18:57,680
understanding how to use it and
what to do with it, and try and

253
00:18:59,079 --> 00:19:03,799
mitigate, you know,
the damage that can be done there,

254
00:19:03,240 --> 00:19:07,680
because again, it's the same thing. I liken it to Google. I

255
00:19:07,720 --> 00:19:11,240
mean, I have grandparents who go
and Google something and they're like, oh,

256
00:19:11,319 --> 00:19:12,240
but this, this, this,
And I'm like, right, no,

257
00:19:12,920 --> 00:19:17,680
that's just not accurate. That's just
not the case, right, And

258
00:19:17,920 --> 00:19:18,920
it's the same, you know,
it's the same type of thing. And

259
00:19:18,920 --> 00:19:22,640
then the watermark stuff. I mean, I think we're going to come out

260
00:19:22,720 --> 00:19:30,079
that the software is going to come
out to try and spot plagiarism, fakes,

261
00:19:30,160 --> 00:19:33,359
that type of thing. It's just
going to have to catch up, right?

262
00:19:33,400 --> 00:19:38,759
It's going to get ugly. Well,
I don't know that it's going to

263
00:19:38,839 --> 00:19:44,160
be any uglier than it already is. It's just becoming more obvious that it

264
00:19:44,240 --> 00:19:48,440
is ugly. Like, I think
your language there, Amber, about "they Googled this,

265
00:19:48,519 --> 00:19:52,000
so it must be true," right? There's just no thought to the

266
00:19:52,079 --> 00:19:56,039
idea of where that
data came from. What Google surfaced for you

267
00:19:56,440 --> 00:20:00,559
is not a consideration for most people, right? Yep, or at least for

268
00:20:00,640 --> 00:20:03,960
some people anyway. Certainly, then
we get into the confirmation bias thing where

269
00:20:04,000 --> 00:20:07,519
it's like, I look for the link
that sounds most like what I want it to

270
00:20:07,519 --> 00:20:11,599
sound like, and that's the one
I use. Absolutely. And in that respect,

271
00:20:11,640 --> 00:20:15,119
nothing has changed. Again, it's as
you talked about earlier, it's the

272
00:20:15,160 --> 00:20:19,359
scalability issue. It just makes it
more accessible. In the olden days,

273
00:20:19,400 --> 00:20:22,839
we just had you know, snake
oil people, right, and they were

274
00:20:22,920 --> 00:20:26,000
still doing that, right? Like they're
going around in their wagons and they're telling

275
00:20:26,039 --> 00:20:30,200
people stuff and selling stuff, and
it was harder for them to get around

276
00:20:30,279 --> 00:20:34,640
and spread that message, but it
was still happening. It's just now it's

277
00:20:34,680 --> 00:20:38,319
easy, it's in your face,
it's accessible. It just gets further and

278
00:20:38,440 --> 00:20:44,799
further that way. What if watermarks
become required and then there are penalties

279
00:20:44,880 --> 00:20:51,640
for those who don't use them.
I can just see that turning into a

280
00:20:51,759 --> 00:20:56,440
shite show. I mean, how... I
mean, where are we going to put

281
00:20:56,440 --> 00:21:00,200
it? You know, if I write
something to tweet and ChatGPT

282
00:21:00,359 --> 00:21:03,839
helped me write it. Do I
have to put that on my tweet?

283
00:21:03,160 --> 00:21:07,400
And now does that mean that somebody's
going to look at that and say,

284
00:21:07,440 --> 00:21:11,200
oh, that didn't come from my
brain, you know what I mean?

285
00:21:11,359 --> 00:21:14,480
And then if I don't do it, when am I going to pay a

286
00:21:14,559 --> 00:21:15,960
fine? And how much is the
fine going to be? Is it going

287
00:21:17,000 --> 00:21:18,559
to be more than Elon Musk charges
me a month to use Twitter?

288
00:21:18,559 --> 00:21:22,200
I can't imagine it'd be a fine.
I think more likely, my view of it is that

289
00:21:22,319 --> 00:21:26,519
I'll have the
option to say I only want stuff that

290
00:21:26,559 --> 00:21:30,079
has a watermark, and so
your stuff just won't be visible if it

291
00:21:30,079 --> 00:21:33,640
doesn't pass that quality gate. Yeah, the market's going to dictate that.

292
00:21:33,799 --> 00:21:37,400
I mean, to some degree you're
already seeing some of it, where there have

293
00:21:37,519 --> 00:21:42,279
been ramifications for people that have used
ChatGPT in their jobs, right, or on

294
00:21:42,319 --> 00:21:47,799
their resumes or something like that,
and you've already seen instances where it's

295
00:21:47,799 --> 00:21:52,839
come to kind of bite them.
Those things are going to happen.

296
00:21:52,680 --> 00:21:56,640
People are still going to do
some of the bad stuff and get away

297
00:21:56,640 --> 00:21:59,759
with it, as they always do.
And then, in my mind, we're gonna

298
00:21:59,839 --> 00:22:03,640
end up in some middle ground. I
don't think it's ever going to be as

299
00:22:03,680 --> 00:22:08,200
far as like super regulated. That
just doesn't ever really happen. They're going

300
00:22:08,279 --> 00:22:11,880
to try, and there's going to
be some backlash and then there's you know,

301
00:22:11,880 --> 00:22:15,279
we're going to cut out a bulk
of the easy stuff that people are

302
00:22:15,279 --> 00:22:18,799
trying to do, and people will
find that it's not worth it to lose

303
00:22:18,839 --> 00:22:22,599
their job or whatever. And right, we'll end up in a space where

304
00:22:22,640 --> 00:22:26,680
some people use it for good and
some people use it for bad. And

305
00:22:26,359 --> 00:22:30,079
well, this is that Gartner hype
cycle right where I think we've come off

306
00:22:30,119 --> 00:22:36,279
the peak of inflated expectations and we're
headed down into the trough of disillusionment right

307
00:22:36,319 --> 00:22:40,720
now, and then you sort of
climb your way back out into sort of

308
00:22:40,839 --> 00:22:45,200
reasonable expectations of what it can and can't
do. I mean, I've certainly been

309
00:22:45,240 --> 00:22:49,119
encouraging people: don't use
this software to try and discover facts,

310
00:22:49,640 --> 00:22:55,160
right, use it to discover ideas. It's a good creative tool. But

311
00:22:55,400 --> 00:22:59,200
any fact you want that it spits
out, you need to validate, and

312
00:22:59,240 --> 00:23:03,640
you'll find a stunning number of them
are incorrect. Yes, that's right.

313
00:23:04,160 --> 00:23:08,119
Absolutely, Yeah, it's the classic
one. The one that I first

314
00:23:08,119 --> 00:23:11,720
said, okay, well here we
go down the trough was the lawyers that

315
00:23:11,920 --> 00:23:17,200
used it to write summaries to their
cases that cited cases that simply did not

316
00:23:17,319 --> 00:23:21,240
exist. Didn't exist, yep,
and they didn't check, yep. But

317
00:23:21,359 --> 00:23:26,559
you know who did check? The judge.
And then they were in big trouble because

318
00:23:26,599 --> 00:23:30,519
it was their name on it.
Yeah, you can't use the "software wrote

319
00:23:30,559 --> 00:23:33,960
it" excuse. Like, it's not
a thing. You submitted this, so

320
00:23:34,599 --> 00:23:40,839
you're on the hook for submitting inaccurate
information. Yeah. And then you

321
00:23:40,880 --> 00:23:44,240
know, the industry at that
point will see it and

322
00:23:44,319 --> 00:23:45,519
be like, oh, you know
what, that's not lucrative for me to

323
00:23:45,559 --> 00:23:49,279
do. It might save
me time, but the

324
00:23:49,440 --> 00:23:52,519
risk is not worth the reward.
And I gotta think that's just like a

325
00:23:52,559 --> 00:23:56,799
cautionary tale for lawyers, full stop.
Like, how many lawyers are like, okay,

326
00:23:56,839 --> 00:23:59,680
well, we're not using this now because
it's a risk. Although at the

327
00:23:59,759 --> 00:24:02,680
same time, I think that's an
overreaction too. It's like check your facts,

328
00:24:03,160 --> 00:24:07,119
right? That's always a good idea.
By the way, you set down guidelines.

329
00:24:07,279 --> 00:24:11,319
I mean, like, my company sent
out... they encouraged us to explore

330
00:24:11,559 --> 00:24:15,839
ChatGPT, find ways that it could
kind of help us do things, but

331
00:24:15,880 --> 00:24:21,720
they also sent out, you know, essentially a regulatory doc that we need

332
00:24:21,720 --> 00:24:23,759
to sign that's like, hey,
these are the boundaries, right, like,

333
00:24:23,839 --> 00:24:26,759
these are the ways to use it
and the ways not to use it.

334
00:24:26,799 --> 00:24:30,559
And I imagine a lot of companies
will do that, right, and

335
00:24:30,599 --> 00:24:34,559
it's like, hey, this is
how we're okay with you guys using this

336
00:24:34,799 --> 00:24:38,039
tool. This is going to become
a part of the employee handbook. Yep.

337
00:24:38,319 --> 00:24:44,599
Yeah, especially when you see Microsoft
and presumably Google productizing it. Yeah.

338
00:24:44,640 --> 00:24:48,160
Absolutely. Now, if the company is
paying for it for you, there

339
00:24:48,200 --> 00:24:52,839
better be guidelines. Yeah, you
would hope. I think so.

340
00:24:53,079 --> 00:24:56,079
I mean, of course, we're
still premature here. This is coming out

341
00:24:56,079 --> 00:25:00,880
in early August. We haven't actually
laid hands on M365 Copilot,

342
00:25:00,920 --> 00:25:04,200
even though they've announced a price,
like you can't just buy it yet,

343
00:25:04,880 --> 00:25:10,400
so we still don't actually know what
it's going to do for us,

344
00:25:10,440 --> 00:25:12,880
or whether or not it's worth
thirty dollars a month per person.

345
00:25:14,200 --> 00:25:18,839
Wow, And that's where the you
know, the market dictates, right,

346
00:25:18,960 --> 00:25:21,839
Yeah, exactly. I mean, just as with any other software, you've

347
00:25:21,880 --> 00:25:26,119
got to come out with the value
to warrant the price. And if they

348
00:25:26,160 --> 00:25:29,279
don't, yeah, then people won't
pick it up. Right. Well,

349
00:25:29,359 --> 00:25:30,519
I think this is the other side
of this, which is that we're still

350
00:25:30,519 --> 00:25:33,480
trying to figure out if this is
a product people are actually willing to pay

351
00:25:33,480 --> 00:25:37,559
for. Yeah, and because the
other thing we don't really know is the

352
00:25:37,559 --> 00:25:41,359
cost of operations. Like, we know
what they're willing to charge for it, but

353
00:25:41,480 --> 00:25:45,319
I'd be fascinated to know how much it's
costing Microsoft to run those models on our

354
00:25:45,400 --> 00:25:49,440
behalf essentially for free. You know, I don't know how many folks have

355
00:25:49,559 --> 00:25:55,920
signed up for ChatGPT Plus, and
much less what it's costing ChatGPT on

356
00:25:55,960 --> 00:25:59,799
the back end. I did.
I have ChatGPT Plus, and then I went

357
00:26:00,000 --> 00:26:03,839
to use the API, and
I found that, oh, that's a

358
00:26:03,880 --> 00:26:07,759
separate charge, the GPT-4 API. And not only that, but you

359
00:26:07,799 --> 00:26:11,000
have to put your name on a
waiting list, right, So yeah,

360
00:26:11,039 --> 00:26:14,119
if you want to use the API, even to use stuff like the playground,

361
00:26:14,319 --> 00:26:18,079
which I've been using on the AI
Bot Show with Brian McKay, you

362
00:26:18,119 --> 00:26:22,839
have to be validated and pay

363
00:26:23,720 --> 00:26:27,079
extra for that, because it does more
than ChatGPT.
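For context, here is a minimal, hedged sketch of what calling that separate GPT-4 API looked like with the 2023-era openai Python package. The key, model access, and prompt are assumptions; at the time, GPT-4 API access required waitlist approval and was billed per token, separately from a ChatGPT Plus subscription.

import os
import openai

# Billed per token, separately from ChatGPT Plus (per the discussion above;
# check current OpenAI pricing before relying on this).
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-4",  # only usable once your account clears the waiting list
    messages=[{"role": "user", "content": "Explain Kranzberg's first law of technology."}],
)
print(response["choices"][0]["message"]["content"])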

364
00:26:27,119 --> 00:26:32,440
I mean, I do feel a little bit good about
this gatekeeping. Like, it's not like this

365
00:26:32,480 --> 00:26:34,680
thing is sitting out in open source and
anybody can set one of these LLMs up

366
00:26:34,720 --> 00:26:38,599
and run it however they want.
Like, the fact that you have to be

367
00:26:38,640 --> 00:26:44,799
an authenticated developer so they kind of
know who you are, and then you're

368
00:26:44,799 --> 00:26:48,880
working against their instance like that at
least gives me some sense of governance,

369
00:26:49,200 --> 00:26:52,279
right, Yeah, and I just
don't I mean at this time, you

370
00:26:52,319 --> 00:26:56,920
know, it takes an enormous amount
of resources to pre-train one of

371
00:26:56,960 --> 00:27:00,519
these, right, and then a
lot of what they're doing over

372
00:27:00,559 --> 00:27:08,640
time is fine-tuning, right,
tweaking it. And I'm seeing,

373
00:27:08,759 --> 00:27:14,319
you know, the job listings are
out there now right to work on these

374
00:27:14,359 --> 00:27:17,599
things, to do some of the
fine-tuning and that sort of stuff.

375
00:27:17,640 --> 00:27:19,839
And that too, kind of
what you were talking about in the beginning

376
00:27:21,240 --> 00:27:26,759
about whether it's degrading or not. And... I think it does so

377
00:27:26,799 --> 00:27:32,359
many things that trying to test a
handful of things that it does

378
00:27:32,519 --> 00:27:37,880
does not really test the entirety of
it and what it's capable of, that

379
00:27:37,960 --> 00:27:41,640
sort of thing. But that's going
to take, I

380
00:27:41,680 --> 00:27:48,720
think, a good bit to try and
manage that process. Right, they're fine-

381
00:27:48,759 --> 00:27:52,000
tuning it and basically having to, because
they've put it out to the public,

382
00:27:52,119 --> 00:27:57,400
look at how that's received, just like
a software product. Right? You know,

383
00:27:57,720 --> 00:28:00,599
do you put out an update and
people get all mad about it,

384
00:28:00,720 --> 00:28:03,839
right, and then you have to
kind of go back and tweak it.

385
00:28:03,839 --> 00:28:08,240
It's going to take them a while
to sort out that management. I

386
00:28:08,279 --> 00:28:11,880
think... I got
a sense that when they put ChatGPT

387
00:28:11,000 --> 00:28:15,559
out in November of twenty two,
it's because they had run all the tests

388
00:28:15,599 --> 00:28:18,160
they could think of, and so
now it was, well, instead of continuing

389
00:28:18,160 --> 00:28:25,000
to pay for people to write prompts
to validate this behavior, we can

390
00:28:25,000 --> 00:28:27,759
get it for free by putting it
out in public. Yeah right, you

391
00:28:27,799 --> 00:28:32,839
know, except for that whole hundred
million users signing up in sixty days thing,

392
00:28:32,920 --> 00:28:36,880
which was weird. You know, I don't think anybody planned on

393
00:28:36,920 --> 00:28:41,720
that. I mean, their reaction
to that was clearly intentional, But I

394
00:28:41,759 --> 00:28:45,160
don't think anybody started in November of
twenty two saying, you know what,

395
00:28:45,200 --> 00:28:47,359
one hundred million people are going to
sign up for this in two months.

396
00:28:47,480 --> 00:28:49,599
Well, that's a new you know, in the last few years, that's

397
00:28:49,599 --> 00:28:53,119
a new space for us. It
didn't used to be that things like that

398
00:28:53,200 --> 00:28:57,599
traveled that quickly, right? Even
though, right, we're all digital and that

399
00:28:57,640 --> 00:29:02,640
sort of thing. But now,
I mean, my kids know about new

400
00:29:02,680 --> 00:29:07,599
things before I do, because it
gets blasted out on TikTok and goes all

401
00:29:07,640 --> 00:29:11,799
over the place, right? And
that's new, especially for us older

402
00:29:11,799 --> 00:29:15,720
people, right? It's like, how does everybody in my neighborhood now

403
00:29:15,759 --> 00:29:19,000
know about this thing? Oh,
about this thing? Because TikTok, you know,

404
00:29:21,079 --> 00:29:25,039
craziness. Yeah, no, it's
an interesting aspect to that. But

405
00:29:25,880 --> 00:29:27,240
you know, I'm not a big
fan of the folks saying, hey,

406
00:29:27,240 --> 00:29:30,960
we should put a pause on this, you know, shouldn't be productized.

407
00:29:32,920 --> 00:29:36,920
My instinct is against that. And
yet when we talk this way, it's

408
00:29:36,960 --> 00:29:40,960
like, it's pretty clear you probably
productized too soon, like you have not

409
00:29:41,319 --> 00:29:45,160
tested the edges of this product.
At the same time, I also see

410
00:29:45,200 --> 00:29:51,960
Microsoft looking at it going they're probably
spending millions a month in Azure resources operating

411
00:29:52,000 --> 00:29:56,440
this thing. If you don't get customers
signed up for it soon, you might

412
00:29:56,480 --> 00:29:59,680
have to turn it off. Like
that's just a lot of money. So

413
00:29:59,720 --> 00:30:04,039
that's where I'd say it's sitting. I mean, and it's, you know, it's a

414
00:30:04,039 --> 00:30:10,039
product essentially that has gone viral that
you really didn't you know, I imagine

415
00:30:10,079 --> 00:30:14,119
they expected some, but I'm with you, I imagine they didn't expect, sort of,

416
00:30:14,119 --> 00:30:17,559
the degree of it. And it's
like, how do you

417
00:30:17,599 --> 00:30:19,519
manage that with any sort of product? Right, all of a sudden,

418
00:30:21,119 --> 00:30:25,359
everybody has it and they all have
opinions about it, and you know,

419
00:30:26,079 --> 00:30:29,440
how do you scale that up?
And do you pull it back or do

420
00:30:29,519 --> 00:30:33,720
you try and tweak it? And
how do you manage expectations and backlash and all

421
00:30:33,759 --> 00:30:38,119
of that stuff. I mean...
there's a reason I work at a

422
00:30:38,160 --> 00:30:42,400
smaller company because there's just a lot
of stuff there that I'm just not interested

423
00:30:42,440 --> 00:30:47,519
in managing. Because at the same
time, you've got Google with Bard saying,

424
00:30:47,559 --> 00:30:48,480
hey, this isn't ready yet.
We're going to hold it back.

425
00:30:48,519 --> 00:30:51,960
And I think there's a sense of
folks saying, oh, did you

426
00:30:52,000 --> 00:30:53,240
just say your thing's not working yet? It's like, I'm pretty sure the

427
00:30:53,279 --> 00:30:57,599
other thing isn't working yet either.
But you know, there's all these degrees

428
00:30:57,599 --> 00:31:02,079
of working. They're playing a game,
right, and there's always a thing

429
00:31:02,119 --> 00:31:07,559
like, you always have to take
that gamble. When you do something, when's

430
00:31:07,599 --> 00:31:10,960
the right time? Am I, you
know, is it going to pay off

431
00:31:11,000 --> 00:31:14,400
for me to wait and put out
something later, you know, maybe

432
00:31:14,440 --> 00:31:18,759
more mature and received better, or
maybe they're missing out

433
00:31:18,759 --> 00:31:23,039
on this initial fervor. Yeah.
Again, that's not stuff that I am

434
00:31:23,119 --> 00:31:26,759
very good at. So also a
reason I will never run a company.

435
00:31:26,880 --> 00:31:30,039
It's a great startup story.
It's like, do you want to be

436
00:31:30,079 --> 00:31:33,000
the first mover, because first mover has
advantages? Or do you recognize... or do

437
00:31:33,039 --> 00:31:37,319
you look at it as: the pioneer's
the one with the arrows in his back?

438
00:31:37,119 --> 00:31:42,279
Right? Absolutely? Yeah, both
can be true. It makes you

439
00:31:42,279 --> 00:31:45,720
think of that. Have you seen
that F one show on Netflix? I

440
00:31:45,759 --> 00:31:48,720
wasn't really into F one until I
saw it. It's really good. I

441
00:31:49,000 --> 00:31:53,480
highly recommend it. But the
racing cars, yeah, yeah, and they

442
00:31:53,599 --> 00:31:59,119
show a lot of like the behind
the scenes like decisions about you know,

443
00:31:59,160 --> 00:32:02,640
if it's raining, do you decide
to put your rain tires on now or

444
00:32:02,759 --> 00:32:06,799
later? Or try and extend the
stint, or do you... right? You

445
00:32:06,839 --> 00:32:09,240
know, when do you maneuver,
and all that stuff. There's just so

446
00:32:09,319 --> 00:32:15,319
much you know, intricacy that goes
into that. And it's the same type

447
00:32:15,359 --> 00:32:17,240
of thing. It's like when do
I put my car out or when do

448
00:32:17,319 --> 00:32:22,200
I hold it back? Could we
then surge ahead? And some of it pans

449
00:32:22,200 --> 00:32:24,319
out and some of it doesn't.
Yeah, is getting a three-month jump

450
00:32:24,400 --> 00:32:29,440
on a market going to give me more
advantage than doing three more months of research?

451
00:32:30,119 --> 00:32:34,680
Right. At least it's a great
question, and obviously we're seeing it

452
00:32:34,799 --> 00:32:37,599
play out right now in two different
positions. I think, to a large degree,

453
00:32:37,640 --> 00:32:42,160
I think, oh, it'll be
the source of books in a couple

454
00:32:42,200 --> 00:32:45,039
of years, right, There'll be
so many books about it and what worked

455
00:32:45,039 --> 00:32:49,720
and what didn't and what you can
learn from it for your own company in

456
00:32:49,759 --> 00:32:52,359
the future. Absolutely, the race for
AI. No, I see it,

457
00:32:52,400 --> 00:32:55,599
absolutely. And Amber, I'm gonna
interrupt for one moment for a very important

458
00:32:55,640 --> 00:33:02,440
message. And we're back. It's .NET
Rocks. I'm Richard Campbell. That's

459
00:33:02,480 --> 00:33:07,680
Carl Franklin. We're talking to our
friend Amber McKenzie a little bit about... well,

460
00:33:07,720 --> 00:33:09,920
I just called this the ethics of
large language models. But I think

461
00:33:10,359 --> 00:33:15,559
there is some ethical conversation going on
here as we try and figure out the

462
00:33:15,640 --> 00:33:19,519
consequences of what's being built and being
basically played out in front of us.

463
00:33:19,759 --> 00:33:22,359
I wonder how much of this also
is just the culture of modern software development,

464
00:33:22,359 --> 00:33:25,920
where it's like, just stick it in front
of the customers. We have telemetry,

465
00:33:27,279 --> 00:33:30,480
and we'll fix it via the internet
going forward. Yeah, you don't.

466
00:33:30,519 --> 00:33:34,720
It's almost like you don't have to
be responsible anymore because it's all fixable.

467
00:33:34,880 --> 00:33:38,440
It's true, but also like you
put this thing out and oh you don't

468
00:33:38,440 --> 00:33:40,720
like how it works. Do you
think it's getting worse? Well, I've

469
00:33:40,759 --> 00:33:45,400
got all these other people lined up
to use it, you know that sort

470
00:33:45,400 --> 00:33:49,200
of thing. It's like, well, we'll get the feedback. The feedback

471
00:33:49,279 --> 00:33:54,519
is oftentimes more valuable than anything else, right, Yeah, well that's certainly

472
00:33:54,559 --> 00:34:00,000
where ChatGPT started, was just
a feedback mechanism for more prompts and more

473
00:34:00,319 --> 00:34:02,880
data on how it was behaving and
how people were reacting to it. So

474
00:34:04,920 --> 00:34:07,920
it's just, you know, this
sudden flip to make it a

475
00:34:07,960 --> 00:34:12,679
product. It'll be interesting to see if
it's going to be good and what that

476
00:34:12,760 --> 00:34:19,440
even looks like. I'm still not
convinced that this isn't a dead end because

477
00:34:19,480 --> 00:34:22,280
my experience with software has generally been,
if the only way you know how

478
00:34:22,320 --> 00:34:24,960
to make it better is to make
it bigger, it's probably not that good.

479
00:34:25,960 --> 00:34:31,519
Yeah, I've been saying that about
LLMs lately. They're just... I'm one

480
00:34:31,519 --> 00:34:37,079
of those, uh, bandwagon
non-jumpers is what I call myself. So

481
00:34:37,199 --> 00:34:40,840
I tend to wait on a lot
of things, like if my company rolls

482
00:34:40,880 --> 00:34:44,719
out new software that we're going
to be using, I'm like, are

483
00:34:44,719 --> 00:34:46,119
we... I'm not going to dive all
into it. I'm going to wait to

484
00:34:46,159 --> 00:34:50,079
see if we're really going to use
that, or in a couple of months

485
00:34:50,079 --> 00:34:52,079
they're going to be like, ah, no, we made the wrong decision,

486
00:34:52,440 --> 00:34:58,119
right. And so I actually have
not been as all in on it,

487
00:34:59,320 --> 00:35:00,440
you know. I think that there's
probably... yeah, I could

488
00:35:00,480 --> 00:35:05,119
spend some time and really find ways
to help my productivity or that sort of

489
00:35:05,159 --> 00:35:07,719
thing. Yeah, but it hasn't
been this thing that I've been like,

490
00:35:07,800 --> 00:35:13,000
oh my gosh, that makes all
the difference. I've got to use it,

491
00:35:13,480 --> 00:35:15,679
you know, and I'm waiting to
see where it goes. I mean,

492
00:35:15,719 --> 00:35:20,440
a lot of times that's true too, that if you're the second mover,

493
00:35:21,199 --> 00:35:23,199
you don't get the arrows in the
back and you get the advantages of their

494
00:35:23,320 --> 00:35:25,559
hits to say, Okay, this
is a better way to use it.

495
00:35:27,000 --> 00:35:30,079
Yeah, and maybe some better tooling, Like there's a lot of second mover

496
00:35:30,199 --> 00:35:35,639
advantage. Is English the only
language that ChatGPT and GPT-4 use,

497
00:35:36,320 --> 00:35:39,119
and is that an issue? So,
I don't know the answer. I

498
00:35:39,159 --> 00:35:45,280
would assume not. I mean,
the machine translation space is pretty solid and

499
00:35:45,320 --> 00:35:47,960
we have a lot of data in
other languages, but I actually don't know,

500
00:35:50,039 --> 00:35:52,440
right, But is it translation at
that point? I mean, because

501
00:35:52,800 --> 00:35:57,400
language models are just that, right? It's
trained on language. It's... yeah,

502
00:35:57,440 --> 00:36:00,440
you know, actually, by the
way, ChatGPT is available in fifty languages.

503
00:36:00,559 --> 00:36:06,239
Okay, that's first and foremost, right. But there have been some great papers

504
00:36:06,360 --> 00:36:13,280
about the tokenization strategy and how it
doesn't map symmetrically to all languages because different

505
00:36:13,360 --> 00:36:16,320
languages have different architectural properties, and
because it was trained on English, wasn't

506
00:36:16,320 --> 00:36:22,800
it? Well, fundamentally the tokenization technique
was built by English speakers. Yeah,

507
00:36:22,800 --> 00:36:25,599
like, I wouldn't even say it's
just the training set per se,

508
00:36:25,880 --> 00:36:30,280
in that the idiomatic structure of
the tokens is based on people who were

509
00:36:30,280 --> 00:36:36,199
thinking in English. And so there
was a really great piece I read about

510
00:36:36,280 --> 00:36:39,639
its mistakes in Urdu, not that
I speak Urdu, and I know

511
00:36:39,679 --> 00:36:45,199
it's spoken in Pakistan; it's a very
old language, but an architecturally, fundamentally different language,

512
00:36:45,199 --> 00:36:47,639
and the token system just didn't work
as well for that one, apparently.

513
00:36:47,719 --> 00:36:51,199
Yeah, and it just begs the
question, like, if Grant were here, he

514
00:36:51,239 --> 00:36:53,239
would be saying, what's language anyway? You know, it changes all the

515
00:36:53,280 --> 00:37:00,400
time, so what's formal English language
today or Urdu language today will change

516
00:37:00,440 --> 00:37:05,800
in ten years. I'm reminded of
that awesome show Knight Rider. Do you

517
00:37:05,840 --> 00:37:09,639
remember that? Knight Rider, with KITT,
KITT the car, the AI car,

518
00:37:09,760 --> 00:37:14,440
right, And I thought one of
the funniest things I ever heard about that.

519
00:37:14,440 --> 00:37:16,960
One of the funniest bits was from
Eddie Murphy who's like, you know,

520
00:37:17,679 --> 00:37:21,360
"the door is ajar." He
says, you know, if that

521
00:37:21,440 --> 00:37:24,360
were in the hood, it'd be like,
"yo, man, someone stole your

522
00:37:24,360 --> 00:37:32,360
battery!" It's true, though.
Yeah, that's actually a big... I mean,

523
00:37:32,480 --> 00:37:38,159
that speaks back to the data bias
situation, where, right, the amount

524
00:37:38,199 --> 00:37:43,320
of data that you have in different
languages or different dialects or whatever, it

525
00:37:43,480 --> 00:37:49,400
really does dictate sort of the outcome
that you're going to get from any of

526
00:37:49,400 --> 00:37:54,719
these models. And most of it
is, you know, more English-oriented, or

527
00:37:54,760 --> 00:38:00,320
white-oriented English, right? Yeah, white English, yep. And part of

528
00:38:00,320 --> 00:38:04,599
this also is now this...
it seems to be the philosophy that a

529
00:38:04,679 --> 00:38:07,719
larger data set is better. Yeah, and so, like, I look at

530
00:38:07,719 --> 00:38:09,559
it and go... I go over
and look at Wikipedia and say, there's

531
00:38:09,559 --> 00:38:14,360
far more English articles than anything else. Are they more accurate because there's more

532
00:38:14,519 --> 00:38:17,800
and more participation or are there other
languages with fewer articles that are actually higher

533
00:38:17,880 --> 00:38:23,280
quality? Well, the argument for
more data is that it has nuances,

534
00:38:23,400 --> 00:38:28,480
Right? So if you are doing
a particular thing, having more data

535
00:38:28,639 --> 00:38:31,159
that speaks to the thing that you
need to do... yes, that is

536
00:38:31,199 --> 00:38:37,400
accurate. But say you're doing
classification: you've got five classes. Having

537
00:38:37,519 --> 00:38:40,559
more data in the one class
that you have the most data for

538
00:38:40,599 --> 00:38:45,199
actually degrades the model for the
other ones. So having more data in

539
00:38:45,239 --> 00:38:52,480
that first bucket, you know,
say white English, having more data there

540
00:38:52,519 --> 00:38:57,679
does actually not help. It actually
will bias the model towards those things.

541
00:38:57,840 --> 00:39:01,719
Right. So you need more data
in a diverse area. You need more

542
00:39:01,800 --> 00:39:06,360
data in the places that you don't
have a lot of data, or if

543
00:39:06,360 --> 00:39:08,639
you don't have data about a certain
subject, or if you don't have,

544
00:39:08,960 --> 00:39:13,159
you know, data on a certain
thing you're trying to do. That helps,

545
00:39:13,239 --> 00:39:15,639
but not across the board. Yeah, I mean, I guess you

546
00:39:15,719 --> 00:39:21,039
have to, when tokenizing, you have
to determine, okay, does this apply

547
00:39:21,159 --> 00:39:25,360
to every culture, every language or
is this a language or culture specific thing?

548
00:39:25,679 --> 00:39:29,119
Right? Yeah, I don't know
the answer. I'm not smart enough

549
00:39:29,119 --> 00:39:31,960
to know that. Well, and it's twofold. It's the

550
00:39:32,159 --> 00:39:37,599
linguistic part, which is sort of the
translation, or the, how do I, you know,

551
00:39:37,920 --> 00:39:42,000
when I'm setting up questions, when
I'm setting up prompts and I'm translating

552
00:39:42,000 --> 00:39:45,519
those into other languages, or even
if I have a model that will take

553
00:39:45,559 --> 00:39:47,639
it from other languages, if I
don't have enough data to support that,

554
00:39:47,679 --> 00:39:51,960
then it's not going to be as
good. But then the cultural differences,

555
00:39:52,320 --> 00:40:00,760
So even if it's the
same English, if other cultures ask questions in

556
00:40:00,800 --> 00:40:02,840
a different way, if they speak
in a different way, if they provide

557
00:40:02,840 --> 00:40:07,000
their thoughts in a different way,
then it's not going to be as effective.

558
00:40:07,199 --> 00:40:09,119
Yeah, yeah, yeah, And
I think this is where like you

559
00:40:09,159 --> 00:40:15,320
can go to the platform tokenizer on
OpenAI and actually see how it tokenizes

560
00:40:15,360 --> 00:40:22,159
English with GPT three, just to get an
idea of what they're talking about, and

561
00:40:22,199 --> 00:40:29,039
then recognize how English centric that actually
is. Like, again, I don't know

562
00:40:29,159 --> 00:40:31,679
enough languages to say. Now, if I
was fluent in, and I would prefer not

563
00:40:31,719 --> 00:40:37,800
even a Romance language, something very
different, do the tokens actually make sense?

564
00:40:37,400 --> 00:40:39,599
I don't know, really, how much work has been done

565
00:40:39,599 --> 00:40:44,440
in that area. Richard, when
you say you can see how it's tokenized,

566
00:40:44,559 --> 00:40:49,400
is it tokenizing your questions or the
data that it's trained on? Well,

567
00:40:49,559 --> 00:40:52,480
you can literally just write some text
and it'll show you. What about the data

568
00:40:52,519 --> 00:40:55,719
that it's trained on? That's, that's
my... No, it doesn't do that.

569
00:40:55,840 --> 00:40:59,599
But I'm just saying, you know, folks are asking about tokenization, and

570
00:40:59,599 --> 00:41:02,800
it's, look, I'll include the link
in the show notes, but you can literally

571
00:41:04,159 --> 00:41:07,119
take text, plug it into that, and it'll show you how the tokenizer represents it.

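A small sketch of the kind of experiment being described: count the tokens for the same question in two languages. This uses OpenAI's open-source tiktoken library; "r50k_base" is the GPT-3-era encoding, and the Urdu sample sentence is an illustrative assumption, not from the show.

```python
# Sketch: comparing how the GPT-3-era BPE tokenizer fragments
# different languages. Uses OpenAI's open-source tiktoken library.
import tiktoken

enc = tiktoken.get_encoding("r50k_base")  # GPT-3-era encoding

samples = {
    "English": "What is the weather like today?",
    "Urdu": "آج موسم کیسا ہے؟",  # roughly the same question, illustrative
}

for language, text in samples.items():
    tokens = enc.encode(text)
    # Non-Latin scripts tend to fall back to many byte-level tokens,
    # so the same sentence costs far more tokens than in English.
    print(f"{language}: {len(text)} chars -> {len(tokens)} tokens")
```

Running something like this typically shows the non-English text costing several times more tokens per sentence, which is one concrete way to see how English-centric the vocabulary is.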
572
00:41:07,440 --> 00:41:12,320
Well, and this ends
up being a problem. Actually, the

573
00:41:13,000 --> 00:41:19,800
productization of it becomes a problem there
because products are oriented towards their users.

574
00:41:20,880 --> 00:41:23,559
So you know, if you're making
it a product and you're trying to sell

575
00:41:23,599 --> 00:41:28,920
it, you're going to make it
work best for the majority of the people

576
00:41:29,079 --> 00:41:34,320
who want to buy it, and
that oftentimes is not the marginalized communities.

577
00:41:34,920 --> 00:41:38,039
Well, yeah, and you
get into, now you have all the market

578
00:41:38,079 --> 00:41:43,000
forces, which is, what's the appropriate
demographic for me to optimize return, especially in

579
00:41:43,079 --> 00:41:45,719
the early days of a product like
this where they're desperate to get some returns

580
00:41:45,760 --> 00:41:50,400
quickly. Right, They're not looking
at diversity or any of those things.

581
00:41:50,440 --> 00:41:52,559
They're like, who's got the money
that we know how to sell to now?

582
00:41:52,719 --> 00:41:55,320
Right? Absolutely. I mean,
and again I go back to M

583
00:41:55,360 --> 00:41:59,519
three sixty five Copilot,
which seems to be the first really

584
00:41:59,559 --> 00:42:07,559
real product since GitHub Copilot, where they're
targeting large American corporations. Yep, because

585
00:42:07,679 --> 00:42:12,039
because they're already customers, they're going to use it en masse,

586
00:42:12,039 --> 00:42:15,039
like there's a lot of reasons.
I totally get why they would do that,

587
00:42:15,239 --> 00:42:19,360
but it has nothing to do with any
kind of sense of diversity in the

588
00:42:19,440 --> 00:42:22,760
data. That's probably a bigger concern
in my mind than you know, some

589
00:42:22,800 --> 00:42:28,000
of the misinformation stuff. The misinformation
has happened, and it will happen,

590
00:42:28,119 --> 00:42:31,519
and people will make things to combat
that, but there isn't a market to

591
00:42:31,840 --> 00:42:37,719
combat the lack of diversity and bias. There are smaller groups who work towards

592
00:42:37,719 --> 00:42:42,360
it. But again, if it's
a product and somebody keeps it,

593
00:42:42,840 --> 00:42:45,599
you know, privately held and that
sort of thing, there isn't an impetus

594
00:42:45,679 --> 00:42:51,519
for them to fix that,
and that widens the tech gap,

595
00:42:52,039 --> 00:42:55,639
widens the gap for people having you
know, access to things that other

596
00:42:55,679 --> 00:43:00,320
people have access to. It sort
of heads us further in a

597
00:43:00,360 --> 00:43:05,760
direction that we have been going. And
that's where you really have the argument about

598
00:43:05,800 --> 00:43:10,280
making a product in this particular scenario
is that because of the need to show

599
00:43:10,360 --> 00:43:16,119
returns quickly, you're going to
bias towards where those returns are likely to

600
00:43:16,239 --> 00:43:20,360
come from. Yeah, although
in our past conversation you said, like,

601
00:43:20,480 --> 00:43:22,000
step one of dealing with any bias is knowing it's there. Yeah,

602
00:43:22,000 --> 00:43:25,599
what's step two? Well, in
this case, it's to care about

603
00:43:25,639 --> 00:43:30,880
it. Yeah, but I think
we always did, or we always

604
00:43:30,880 --> 00:43:34,880
were supposed to. But that's again where
it comes into the product side, is

605
00:43:34,920 --> 00:43:37,800
like, yeah, when you take
it out of research. Right,

606
00:43:38,559 --> 00:43:44,320
the research arena oftentimes will penalize for
bias, for things like that, because you

607
00:43:44,400 --> 00:43:47,119
have a diverse group of researchers who
were like, hey, did you think

608
00:43:47,119 --> 00:43:51,039
about this thing? And we care
about that. We're not going to, you

609
00:43:51,079 --> 00:43:53,039
know, approve your paper or whatever
it is, if you haven't done your

610
00:43:53,079 --> 00:43:57,440
due diligence. But when you take it
out of that realm and put it into,

611
00:43:58,000 --> 00:44:01,440
you know, a commercial product.
What do they get for increasing the

612
00:44:01,559 --> 00:44:06,679
diversity there? What benefit does it
bring them? In fact, it's probably

613
00:44:06,920 --> 00:44:09,559
worse for them because they have to
put more time and resources into something that

614
00:44:09,599 --> 00:44:15,039
they're not going to get a return
from. So yeah, it's to me,

615
00:44:15,320 --> 00:44:17,639
you know, and again I've done
enough product work to be pretty comfortable

616
00:44:17,639 --> 00:44:22,480
with these terminologies. It is like
you go get your pioneering group and you

617
00:44:22,599 --> 00:44:24,400
kind of want them to all be
the same because you've only got so many

618
00:44:24,480 --> 00:44:32,119
resources. That diversity quotient only comes
in when we're trying to broaden the customer

619
00:44:32,159 --> 00:44:38,840
base. It's literally the second half.
Before that, it never becomes a question. Absolutely,

620
00:44:38,840 --> 00:44:43,320
where it's like we are now limiting
our market by not being broad enough.

621
00:44:43,559 --> 00:44:46,360
And if you think about the people
who are going to pay for ChatGPT,

622
00:44:46,519 --> 00:44:51,320
it's often not going to be,
you know, a diverse group.

623
00:44:51,360 --> 00:44:52,960
There's a whole bunch of this world
that is not going to be able to

624
00:44:53,000 --> 00:44:59,039
pay for ChatGPT. It's already
priced at the monthly income of a truly

625
00:44:59,119 --> 00:45:02,039
non-trivial chunk of this world. And that's

626
00:45:02,039 --> 00:45:06,039
a whole other can of worms,
right, is that you know we already

627
00:45:06,039 --> 00:45:09,719
have multinationals where pricing varies country
by country, and we could get

628
00:45:09,800 --> 00:45:14,039
there. I don't know that that's
true. And I would also say we're

629
00:45:14,079 --> 00:45:17,119
arguing over a product that I think
has yet to show profound value. Yeah,

630
00:45:17,159 --> 00:45:22,960
it's certainly created profound hype, but
show me, you know, the

631
00:45:23,079 --> 00:45:29,079
company with a huge competitive advantage because
they've been using this technology. Lots of

632
00:45:29,079 --> 00:45:32,239
people are talking about it. They're
certainly using excuses to lay people off over

633
00:45:32,320 --> 00:45:36,400
it. But you know, I
want to see two quarters after that and

634
00:45:36,519 --> 00:45:38,719
say, did you actually get better? Were your customers happier?

635
00:45:38,880 --> 00:45:43,079
Like, I don't know that any
of us know the answer to those

636
00:45:43,119 --> 00:45:45,639
things yet. Yeah, I think
there's going to be opportunities, especially in

637
00:45:45,679 --> 00:45:51,440
the sort of data science and machine
learning consulting realm, to really speed up

638
00:45:52,000 --> 00:45:58,440
those consulting opportunities, right, to
take them from something that you know you

639
00:45:58,440 --> 00:46:01,639
you have to kind of train a
new thing every time and tailor it to

640
00:46:01,679 --> 00:46:05,760
your client. But then to get
ahead of some of that competition, say

641
00:46:05,800 --> 00:46:07,760
hey, we can significantly cut down
on that time. But for all the

642
00:46:07,760 --> 00:46:13,719
companies that don't necessarily have in house
data science, machine learning, that sort

643
00:46:13,719 --> 00:46:16,840
of thing, you know, are
they suddenly going to be able to pick

644
00:46:16,920 --> 00:46:22,320
up ChatGPT and have it make
a big difference in that area. Yeah,

645
00:46:22,440 --> 00:46:25,360
I don't know about that. Yeah, I think it's a great question.

646
00:46:25,440 --> 00:46:30,039
And the funny thing is, like,
this is what Microsoft has been known

647
00:46:30,159 --> 00:46:35,599
for, is the commoditization of certain
technologies, sure, making it very easy

648
00:46:35,679 --> 00:46:40,360
to get into things that typically
needed experts. You could use these tools

649
00:46:40,639 --> 00:46:45,800
without the experts, and you wouldn't
get a superior experience. Like arguably we'd

650
00:46:45,800 --> 00:46:51,239
constantly get an inferior experience, but
perhaps good enough that it was worthwhile.

651
00:46:51,639 --> 00:46:58,320
Yeah. I think it goes back
to them not anticipating how big it would

652
00:46:58,320 --> 00:47:02,960
get immediately, right. Yeah.
I don't want to present the idea that

653
00:47:04,039 --> 00:47:07,480
anyone had a plan here because I
don't think it's true. I think they're

654
00:47:07,519 --> 00:47:12,880
all making it up as they go right now. The hundred million

655
00:47:13,039 --> 00:47:17,760
users wowed them, so they pointed
it at Bing of all places, and Bing

656
00:47:17,920 --> 00:47:22,480
got a hundred million users, you
know, like this is one hundred million

657
00:47:22,599 --> 00:47:27,960
user machine. Let's go. Yeah, it's an interesting space right now.

658
00:47:28,159 --> 00:47:32,239
Well, and I'm excited to talk
to you because you come from the science

659
00:47:32,280 --> 00:47:37,519
side of this and the academic knowledge
side of this. And I can see

660
00:47:37,000 --> 00:47:42,760
because we can see each other while
we're recording this, like, the productization of

661
00:47:42,800 --> 00:47:46,519
this stuff is the worst, I think,
for someone from your space, like, you're

662
00:47:46,559 --> 00:47:51,800
doing what with my thing? Where? How? And literally it's like that

663
00:47:51,960 --> 00:47:54,039
sort of grotesque. And then they
took the sledgehammer to it to get it

664
00:47:54,079 --> 00:47:59,800
to fit into the square, like
just hammering it through a box to get

665
00:47:59,800 --> 00:48:05,039
it to customers. There have been
some musical artists who are demanding that, or

666
00:48:05,159 --> 00:48:10,119
who are offering, um, royalties
or something like that, or demanding royalties.

667
00:48:10,119 --> 00:48:14,920
What's the story there, Richard?
I'm confused about this, but I remember

668
00:48:14,960 --> 00:48:17,679
talking to you about it. I
mean, there's certainly a question about

669
00:48:19,760 --> 00:48:22,480
uh yeah, rights to data,
rights to the data used in the training

670
00:48:22,519 --> 00:48:29,400
models. And you can see the
thought that as a developer running these models,

671
00:48:30,679 --> 00:48:32,920
because, you believe, they're
all behind the scenes, so it

672
00:48:32,960 --> 00:48:37,320
doesn't really matter. I'm not taking
a copy of your thing, I'm not

673
00:48:37,480 --> 00:48:42,360
using your thing per se. It's
just being munged in a big neural net,

674
00:48:42,760 --> 00:48:46,039
like, why should I care? And
everything was fine until DALL-E started spitting

675
00:48:46,039 --> 00:48:51,079
out Getty logos. Then it wasn't
fine. Yes, right, this is

676
00:48:51,199 --> 00:48:53,000
There was no intent here,
there was no plan here. I think

677
00:48:53,000 --> 00:48:58,039
devs have always been doing
what devs were doing, until the logo,

678
00:48:58,400 --> 00:49:01,400
until the logo started appearing in the
pictures, and then it's like,

679
00:49:02,039 --> 00:49:08,320
huh. And I love that it
was Getty because Getty is the greatest pursuer

680
00:49:08,400 --> 00:49:14,599
of copyright you've ever seen. Heaven
help you if you use a Getty picture

681
00:49:14,599 --> 00:49:17,199
without permission. Ah, it's all
a big, I mean, it just opened

682
00:49:17,280 --> 00:49:21,039
up a lot of stuff that I
think people have been talking about for a

683
00:49:21,079 --> 00:49:27,079
while, but the need, like,
wasn't so great that people had to really

684
00:49:27,239 --> 00:49:30,320
pursue, you know, some of
these I mean, we had to do

685
00:49:30,360 --> 00:49:34,559
some of that, right, Napster
days and then there's you know, there's

686
00:49:35,519 --> 00:49:39,199
Spotify days and all of that stuff. Everything's always changed and it's just one

687
00:49:39,239 --> 00:49:45,960
more time that those things are gonna
have to change. But just the

688
00:49:45,159 --> 00:49:50,400
speed with which it came out,
it's going to take a little bit for

689
00:49:51,039 --> 00:49:55,039
the other things to catch up
and be like, all right, we

690
00:49:55,119 --> 00:50:00,199
have been having this conversation for years, including the three of us. Yeah,

691
00:50:00,360 --> 00:50:05,199
Like, in some ways this is
a useful forcing function. Okay,

692
00:50:05,199 --> 00:50:08,920
I've now finally made a thing that's
showing up in the public gestalt. Now

693
00:50:08,960 --> 00:50:16,119
we've been talking about treating data maturely, respecting copyright, considering bias, like

694
00:50:16,239 --> 00:50:20,960
all of those things. Now you
gotta, because we're now starting to stick it

695
00:50:20,960 --> 00:50:25,440
in front of regular people who don't
know how problematic these data collections and utilization

696
00:50:25,480 --> 00:50:30,199
methods have been. Yeah, I
mean, that's what normally drives regulation, right,

697
00:50:30,440 --> 00:50:35,800
every time: innovation. There were
no driver's licenses until cars started killing

698
00:50:35,880 --> 00:50:40,159
people. Yeah, true, right. Just hopefully we can

699
00:50:40,199 --> 00:50:44,760
get to a point of some moderation
with this before folks die. I would

700
00:50:44,840 --> 00:50:49,800
rather not have folks die over this
crazy bit of software. Yeah, crazy is

701
00:50:49,880 --> 00:50:52,199
right, man, people are going nuts. It's true. I hope the

702
00:50:52,199 --> 00:51:00,000
same thing comes to fruition on the
environmental impact side, because these models are

703
00:51:00,000 --> 00:51:05,119
very expensive and very impactful to
train. Yeah, and now that they

704
00:51:05,119 --> 00:51:08,039
are out there and people are using
them, that likely means that more of

705
00:51:08,079 --> 00:51:10,360
them are going to be trained,
and more of them are going to be

706
00:51:10,400 --> 00:51:14,039
deployed, and it's like, okay, how are we going to then also

707
00:51:14,159 --> 00:51:16,639
mitigate that impact as well? Well, the one good thing that comes out

708
00:51:16,639 --> 00:51:20,760
of it is now people won't be
complaining so much about how much water it

709
00:51:20,760 --> 00:51:25,719
takes to grow a steak. That's
true. I was thinking more in terms

710
00:51:25,760 --> 00:51:30,840
of, you know, just when crypto
was winding down and giving back those resources,

711
00:51:30,840 --> 00:51:34,320
we found a new place to put
them. Yeah, exactly, absolutely.

712
00:51:34,800 --> 00:51:37,639
Yeah. And if you want to
make an AI person angry in a

713
00:51:37,679 --> 00:51:40,719
big hurry, just call it the
new Crypto. That's a good way to

714
00:51:40,760 --> 00:51:45,239
do it. Oh yeah, we
will not get a Christmas card from that

715
00:51:45,280 --> 00:51:49,519
person. That's not a happy thing
to say. That's awesome. And I

716
00:51:49,559 --> 00:51:52,519
also think it's not true, like
it doesn't seem to be trying to separate

717
00:51:52,519 --> 00:51:57,559
people from their money quite as vociferously. But the bills, you know,

718
00:51:57,559 --> 00:52:00,559
the rates, the prices are starting
to come in, man. I guess we'll

719
00:52:00,559 --> 00:52:04,239
see what happens. Yeah, it's
true. It'll be an interesting back half

720
00:52:04,280 --> 00:52:07,199
of the year. Yeah. So
Amber, what's next for you? What's

721
00:52:07,199 --> 00:52:13,280
in your inbox? Ah, well, I am doing good things at Kauma.

722
00:52:13,440 --> 00:52:17,679
I'm happy to be there. And
for me, I'm finally just in

723
00:52:19,320 --> 00:52:24,880
a steady state, which is great, and looking forward to kind of just

724
00:52:24,920 --> 00:52:29,480
seeing where things go. I've been
throwing a lot of energy into sort of

725
00:52:29,519 --> 00:52:34,199
mentoring and growing the next data science
leaders. I've noticed that there's a lot

726
00:52:34,239 --> 00:52:38,199
of, they teach a lot about the
data science and then they find people who

727
00:52:38,199 --> 00:52:42,159
are like, oh, well,
you look like you could lead, We're

728
00:52:42,159 --> 00:52:44,920
going to throw you into leadership,
but they don't do a lot of teaching

729
00:52:45,079 --> 00:52:52,679
about data science for business value or
data science you know, in the bigger

730
00:52:52,719 --> 00:52:57,760
picture, and how to lead a
team and that sort of thing. I've

731
00:52:57,760 --> 00:53:00,960
been doing a lot of mentoring of some
of the younger data scientists, which I've

732
00:53:00,960 --> 00:53:04,360
been really enjoying. Wow, that's
great. Well yeah, yeah, you're

733
00:53:04,400 --> 00:53:07,400
doing, you're doing the people's work for
sure, because, goodness knows, we

734
00:53:07,519 --> 00:53:12,000
need more knowledgeable people in this space. Lots of people using data, not

735
00:53:12,159 --> 00:53:15,840
enough data scientists. Yep, that's
true. Yeah, well, Amber,

736
00:53:15,000 --> 00:53:19,480
thank you so much for coming on
the show and talking about this. It's

737
00:53:19,559 --> 00:53:22,719
not a topic that's going away soon, and I'm sure it's going to change

738
00:53:22,719 --> 00:53:27,960
by the next time we have you
on. Absolutely. Thanks for having me.

739
00:53:28,320 --> 00:53:30,000
Great to have you, Amber. Yeah, you
bet, and we'll talk to you,

740
00:53:30,159 --> 00:53:55,519
dear listener next time on dot net
rocks. Dot net Rocks is brought

741
00:53:55,519 --> 00:54:00,679
to you by Franklin's Net and produced
by Pwop Studios, a full service audio,

742
00:54:00,800 --> 00:54:06,519
video and post production facility located physically
in New London, Connecticut, and

743
00:54:06,639 --> 00:54:12,480
of course in the cloud online at
pwop dot com. Visit our website at

744
00:54:12,559 --> 00:54:17,719
d o t n e t r o c k s dot com
for RSS feeds, downloads, mobile apps,

745
00:54:17,880 --> 00:54:22,000
comments, and access to the full
archives going back to show number one,

746
00:54:22,360 --> 00:54:27,079
recorded in September two thousand and two. And make sure you check out

747
00:54:27,079 --> 00:54:30,400
our sponsors. They keep us in
business. Now go write some code. See

748
00:54:30,599 --> 00:54:32,199
you next time.
