1
00:00:01,000 --> 00:00:04,759
How'd you like to listen to .NET
Rocks with no ads? Easy!

2
00:00:05,320 --> 00:00:09,400
Become a patron. For just five dollars
a month, you get access to a

3
00:00:09,480 --> 00:00:14,199
private RSS feed where all the shows
have no ads. Twenty dollars a month

4
00:00:14,240 --> 00:00:18,600
will get you that and a special
.NET Rocks patron mug. Sign up

5
00:00:18,640 --> 00:00:23,760
now at patreon dot dotnetrocks
dot com. Hey, Carl and Richard here.

6
00:00:24,079 --> 00:00:29,280
As you may have heard, NDC
is back offering their incredible in person conferences

7
00:00:29,320 --> 00:00:33,240
around the world, and we'd like
to tell you about them. NDC Oslo

8
00:00:33,240 --> 00:00:37,039
will be May twenty-first through the
twenty-fifth. Go to NDC Oslo dot

9
00:00:37,119 --> 00:00:42,759
com to register. NDC Copenhagen is
happening August twenty seventh through the thirty first.

10
00:00:43,320 --> 00:00:48,960
Go to NDC Copenhagen dot com for
more information. NDC Porto is happening

11
00:00:49,000 --> 00:00:53,280
October sixteenth through the twentieth. The
early bird discount for NDC Porto ends July

12
00:00:53,560 --> 00:00:58,280
twenty-first. Go to NDC Porto dot
com to register, and check out the full

13
00:00:58,320 --> 00:01:15,239
lineup of conferences at NDC Conferences dot
com. Hey, welcome back to dot

14
00:01:15,280 --> 00:01:18,760
net Rocks. This is Carl Franklin, and
this is Richard Campbell. I'm very excited,

15
00:01:18,879 --> 00:01:23,079
Richard, because one of my
heroes of podcasting, Grant Barrett,

16
00:01:23,159 --> 00:01:26,239
is going to be with us in
just a few minutes. Um. But

17
00:01:26,480 --> 00:01:30,680
first let's do Better Know a Framework
and get it out of the way. Awesome.

18
00:01:38,400 --> 00:01:41,040
Man, what have you got? Believe it or
not, I started a new podcast.

19
00:01:41,359 --> 00:01:45,959
Are you kidding? Am I stupid?
I don't know. We both put out several

20
00:01:46,000 --> 00:01:49,680
hours of podcasts a week, so
it's not surprising. Yeah. So this

21
00:01:49,719 --> 00:01:53,480
one is a personal more like a
personal journal. Oh yeah, I'm just

22
00:01:53,640 --> 00:01:57,879
finding people that I've met in my
travels and I've worked with that I find

23
00:01:57,920 --> 00:02:01,079
really interesting, and I think their
stories would also be very interesting to my

24
00:02:01,200 --> 00:02:05,840
listeners. And so it's called No
Excuse for Boredom. I love it.

25
00:02:06,760 --> 00:02:09,240
It's at noexcuseforboredom dot
com. I can't believe you got that

26
00:02:09,360 --> 00:02:15,719
domain name. I have a way
with getting domain names. I don't know.

27
00:02:15,879 --> 00:02:19,759
Yeah, yep, noexcuseforboredom
dot com. So there's only

28
00:02:19,759 --> 00:02:23,080
two episodes up there. Um.
The first one is Dan Grief. He's

29
00:02:23,120 --> 00:02:27,000
a guy from England who I know, you don't know him, but I

30
00:02:27,080 --> 00:02:31,680
basically talked to him about the difference
between New England and Old England in terms

31
00:02:31,719 --> 00:02:36,360
of language and slang, and Grant, you'd probably like this too. I

32
00:02:36,360 --> 00:02:40,360
asked him to define a whole bunch
of British slang words and phrases that confused

33
00:02:40,360 --> 00:02:45,639
me when I first heard them,
like oh uh, okay, are you

34
00:02:45,639 --> 00:02:50,120
trying to take the piss out of
me? The classic. I never... The

35
00:02:50,159 --> 00:02:53,199
only time I ever heard that was
like on a Gordon Ramsay show, right,

36
00:02:53,599 --> 00:02:57,240
and I'm like, what the heck
are they talking about? I kind

37
00:02:57,280 --> 00:03:00,000
of got from the context that it
means like are you kidding with me?

38
00:03:00,120 --> 00:03:04,199
Or are you are you just trying
to pull a fast one or whatever,

39
00:03:04,680 --> 00:03:07,800
but I'd never heard it before.
So there's just a long list. Yeah,

40
00:03:07,800 --> 00:03:14,039
there's something very catheter-esque about
that. It really put in a

41
00:03:14,159 --> 00:03:19,400
stent, exactly. I just didn't get
it. It's just funny.

42
00:03:19,439 --> 00:03:23,039
And then the second one is with
Randy Judkins. Oh man, I remember

43
00:03:23,120 --> 00:03:25,319
him. Yeah, my friend from
the early days, my friend from

44
00:03:25,319 --> 00:03:29,759
Maine who did that first 'What is
.NET' video. And he's

45
00:03:29,800 --> 00:03:34,159
a juggling magician and comedian. He
was so funny too. Oh, he had me

46
00:03:34,280 --> 00:03:36,400
rolling, in stitches. All right,
Well, enough of that. That's what

47
00:03:36,520 --> 00:03:38,800
I got. Who's talking to us
today, Richard? Grabbed a comment off of

48
00:03:38,840 --> 00:03:43,159
show sixteen oh five, the one
we did back in twenty eighteen, so

49
00:03:43,199 --> 00:03:46,240
that's a while ago. And it
was actually about IoT and edge computing.

50
00:03:46,240 --> 00:03:49,360
But we were talking, and this is
Jared Rhodes we were talking to. I

51
00:03:49,400 --> 00:03:52,439
don't even remember this because of course
you remember every show. It was only

52
00:03:52,560 --> 00:03:58,400
five years ago. Wait, oh
no, that's me. But the latter

53
00:03:58,439 --> 00:04:01,360
half of the show we were talking
about data collection a lot because, you

54
00:04:01,400 --> 00:04:03,680
know, we were talking about how
small the hardware has gotten and you can put

55
00:04:03,680 --> 00:04:08,319
it just about anywhere, and it's
always uploading to the cloud. You

56
00:04:08,360 --> 00:04:12,560
know, there are consequences to all of
that. And Mike commented on the show,

57
00:04:12,599 --> 00:04:15,000
and admittedly he commented, you know, within a month of it being

58
00:04:15,000 --> 00:04:16,879
published several years ago, he said, hey, this is a great ethical

59
00:04:16,879 --> 00:04:21,560
discussion about collecting data. I always
revert back to Kranzberg's first law: technology

60
00:04:21,720 --> 00:04:26,839
is neither good nor bad, nor
is it neutral. I think that's actually

61
00:04:26,920 --> 00:04:30,319
his quote, but okay, I
don't disagree with the quote. Monitoring a

62
00:04:30,439 --> 00:04:34,240
store gathering data is troubling. You
obviously opt in. And this is when

63
00:04:34,240 --> 00:04:38,519
we were talking about some of the
projects I've been involved with where they're actually

64
00:04:38,560 --> 00:04:41,920
instrumenting shelves to see what you look
at, what you pick up, what

65
00:04:42,040 --> 00:04:45,360
you put down, all of those
kinds of things. And so the point

66
00:04:45,399 --> 00:04:47,199
of, you obviously opt in by seeing
the monitor and continuing to enter. But is

67
00:04:47,199 --> 00:04:51,800
that enough? Shouldn't the customer know
to what extent their behavior will be analyzed?

68
00:04:51,800 --> 00:04:56,800
And for what purpose? What about
the patrons who, beyond their control, only

69
00:04:56,839 --> 00:04:59,720
have access to that store? Are they,
you know, ceding their business and taking

70
00:04:59,720 --> 00:05:03,040
a backseat to the system gathering
data? Programming ethics have always fascinated me.

71
00:05:03,120 --> 00:05:08,439
Recent protests by Google employees, now
again, this is twenty eighteen. Yeah, and that

72
00:05:08,519 --> 00:05:13,439
was when Google was going into China
and they were demanding the Great Firewall stuff

73
00:05:13,480 --> 00:05:17,879
and censoring and so forth. Recent
protests by Google employees over developing censoring products

74
00:05:17,879 --> 00:05:21,920
for the Chinese government and the military
machine learning for the US government highlight the

75
00:05:21,959 --> 00:05:27,800
backlash these companies face when adopting technologies
for different applications. And I'm like, man,

76
00:05:28,000 --> 00:05:32,959
five years later and there are more protests
than ever. But it always boils down

77
00:05:33,000 --> 00:05:36,399
to this: you will be able
to find a developer to do what you

78
00:05:36,480 --> 00:05:41,160
want to have done. There will
always be someone with ethics that align

79
00:05:41,199 --> 00:05:44,480
to yours, or who's desperate enough to
do that work. This isn't unique to

80
00:05:44,519 --> 00:05:47,040
programming. However, programming has a
unique advantage of being cheap to scale,

81
00:05:47,360 --> 00:05:51,720
so one developer can have a profound
impact on humanity versus one unethical police officer

82
00:05:51,959 --> 00:05:57,040
or one unethical investor. Had Bernie
Madoff... I love this reference.

83
00:05:57,079 --> 00:06:00,240
Had Bernie Madoff known how to
make an ASP website, he could have

84
00:06:00,319 --> 00:06:04,600
offered his services anonymously to many more
people. It was a great discussion on

85
00:06:04,720 --> 00:06:09,839
what to do about this responsibility.
How do companies talk about this? My

86
00:06:09,959 --> 00:06:15,000
experience is most don't. They don't, yeah, but we will eventually need to. Yeah.

87
00:06:15,040 --> 00:06:18,079
Wow. Thank you. Thanks, Mike, you killed it. I was

88
00:06:18,079 --> 00:06:21,560
so glad to find this comment as
I dug through knowing where we were going

89
00:06:21,639 --> 00:06:27,079
to go today, and so thanks
thanks for this. It is absolutely the

90
00:06:27,079 --> 00:06:30,720
conversations we need to have and are
having. And a copy of Music to Code By

91
00:06:30,839 --> 00:06:33,199
is on its way to you. If
you'd like a copy of Music to Code By,

92
00:06:33,480 --> 00:06:36,920
write a comment on the website at
dotnetrocks dot com or on the Facebooks.

93
00:06:36,959 --> 00:06:40,600
We publish every show there, and
if you comment there and we read it on the

94
00:06:40,600 --> 00:06:44,120
show, we'll send you a copy
of Music to Code By. And definitely follow us

95
00:06:44,160 --> 00:06:47,600
on Mastodon. I'm at Carl Franklin
at techhub dot social, and I'm

96
00:06:47,800 --> 00:06:53,040
Rich Campbell at mastodon dot social.
Send us a toot. Rooty toot toot.

97
00:06:53,040 --> 00:06:56,759
It's still funny, Yeah, it's
still funny. Okay, we're still in

98
00:06:56,879 --> 00:07:00,480
grade seven. I know I'm always
going to be that way. This is

99
00:07:00,519 --> 00:07:05,199
a really exciting time to have Grant
Barrett on the show because I think Grant

100
00:07:05,680 --> 00:07:15,720
exists at the intersection of language,
ethics, and tech. So let me read

101
00:07:15,759 --> 00:07:19,319
his bio for you. A Way with
Words (waywordradio dot org) co-host

102
00:07:19,360 --> 00:07:26,160
and co-producer, Grant Barrett is
an American lexicographer and linguist specializing in slang

103
00:07:26,240 --> 00:07:29,959
and new words. The public radio
show is heard from coast to coast in

104
00:07:29,959 --> 00:07:34,360
the United States. It's at waywordradio
dot org slash radio, and around the

105
00:07:34,399 --> 00:07:40,759
world by podcast at waywordradio dot org
slash podcasts. He has helped produce dozens

106
00:07:40,759 --> 00:07:45,920
of dictionaries and has authored three books
of his own. He's a voracious reader,

107
00:07:46,279 --> 00:07:48,519
reads and speaks a bit of French
and Spanish, and spent years working

108
00:07:48,560 --> 00:07:54,199
in the jargon-rich jungles of information
technology. Though born and raised in Missouri

109
00:07:54,279 --> 00:07:57,639
and having been a long time resident
of New York City, Grant now lives

110
00:07:57,639 --> 00:08:01,399
in San Diego, California, with
his wife, also a linguist and lexicographer,

111
00:08:01,439 --> 00:08:07,439
and their young son. His personal
website is grantbarrett, with two Ts, dot

112
00:08:07,439 --> 00:08:11,560
com. Welcome, Grant. Hello,
how are you doing? Hey, even the bio includes

113
00:08:11,680 --> 00:08:15,600
the URLs. Yeah, absolutely. First of all, I just got

114
00:08:15,600 --> 00:08:18,519
to say, your podcast is a staple
in my house. You know, it's

115
00:08:18,519 --> 00:08:20,560
a radio show too, heard coast
to coast in the United States as you

116
00:08:20,600 --> 00:08:26,319
just read. Yes, you're right. And also independent, ferociously independent,

117
00:08:26,519 --> 00:08:28,240
right yeah, yeah, not a
part of NPR, but heard on a

118
00:08:28,279 --> 00:08:31,759
lot of NPR stations. It's hard
to explain that to people, but it's

119
00:08:31,799 --> 00:08:37,480
a... it's one tiny little
nonprofit. We just... the radio stations are

120
00:08:37,559 --> 00:08:39,879
clients of ours. So cool. I
thought it was the radio

121
00:08:39,879 --> 00:08:43,879
stations, because you have an NPR sound. Well, you could easily be NPR.

122
00:08:45,000 --> 00:08:48,080
Yeah, we are on NPR stations, so we take the NPR sound.

123
00:08:48,080 --> 00:08:50,720
But we were just added to a
bunch of new stations Miami and Tucson.

124
00:08:50,919 --> 00:08:58,799
And actually we're now heard in Roswell, New Mexico. Wow. Aliens are

125
00:08:58,840 --> 00:09:03,480
listening. Excellent. Expect a caller
from there. Let me tell you where that

126
00:09:03,519 --> 00:09:09,200
comes from, you know, Beetlejuice.
Yeah, that's a great

127
00:09:09,200 --> 00:09:11,720
movie, all right. So the
reason that I got in touch with you

128
00:09:11,840 --> 00:09:16,919
is because I was listening to your
show and Martha read a story or a

129
00:09:16,960 --> 00:09:24,639
comment or an anecdote about... She went
to ChatGPT and told it to give

130
00:09:24,720 --> 00:09:31,759
her a one-paragraph description of A
Way with Words, and it came up

131
00:09:31,759 --> 00:09:35,759
with what she thought was a
completely unique paragraph. She said she

132
00:09:35,840 --> 00:09:39,519
googled each one of them, each
one of the sentences, and they could

133
00:09:39,559 --> 00:09:46,720
not be found elsewhere on the Internet. And that started this moral discussion about

134
00:09:46,799 --> 00:09:50,440
you know, there are people who are
really afraid of this, all right,

135
00:09:50,519 --> 00:09:54,519
of chat GPT, and of language
evolving in general. And I think one

136
00:09:54,559 --> 00:09:58,679
of you, maybe it was you, said that, well, back in

137
00:09:58,720 --> 00:10:05,240
the days of what was it Socrates
or ancient Greece, they were concerned that

138
00:10:05,360 --> 00:10:09,679
if they taught, if they educated
people, and books, you know,

139
00:10:09,720 --> 00:10:13,039
people were writing books. Obviously they
weren't, you know, printing books,

140
00:10:13,039 --> 00:10:16,360
but if they were writing, that
was going to make people stupid,

141
00:10:16,440 --> 00:10:18,799
or because they wouldn't have to remember
so much. Yeah, at the time

142
00:10:20,159 --> 00:10:26,120
of the ancient Greeks, you were
expected to memorize and recite aloud, and

143
00:10:26,519 --> 00:10:31,919
once writing became more standard, you
might put it on a sheepskin, vellum

144
00:10:31,480 --> 00:10:37,000
or papyrus. People are like,
wait a second, you're not memorizing this,

145
00:10:37,080 --> 00:10:41,279
You're just gonna go and read it. It's not in your head.

146
00:10:41,559 --> 00:10:45,879
How do you actually know anything?
Right, If you just know where to

147
00:10:45,919 --> 00:10:48,399
get it, that's not knowledge.
It's stored outside of your brain. You

148
00:10:48,480 --> 00:10:52,600
got to think that was a technological
advance, that better writing implements had come

149
00:10:52,639 --> 00:10:56,639
along. You weren't chiseling in stone
or pressing in clay. And well,

150
00:10:56,639 --> 00:11:01,679
the evidence is every technological advance
has a resisting force; there's

151
00:11:01,679 --> 00:11:05,759
always opposition. This was the same
screaming match that happened when Gutenberg made the

152
00:11:05,799 --> 00:11:11,919
press. Well, everything, all
technological advances, get pushback. There's

153
00:11:11,960 --> 00:11:18,440
not a one. I mean,
somebody probably argued against fire. It burns

154
00:11:16,120 --> 00:11:20,159
somebody, somebody, nothing good will
come of it. Somebody argued against the

155
00:11:20,159 --> 00:11:24,720
wheel. But wait, it doesn't
have a stopping point. It can just

156
00:11:24,799 --> 00:11:26,639
keep going. How will I
know that when I go to sleep,

157
00:11:26,679 --> 00:11:30,679
it'll stop? It'll still be there
in the morning. Yeah, right,

158
00:11:30,840 --> 00:11:33,480
roll away. Yeah, yeah,
it's circular, that's satanic. I don't

159
00:11:33,519 --> 00:11:37,799
I don't know. We can't trust
people with new technology. I think that's

160
00:11:37,799 --> 00:11:41,159
what it comes down to, right, and that's why people. But on

161
00:11:41,240 --> 00:11:43,519
a larger point, what's funny about
this is, you know, since we've

162
00:11:43,519 --> 00:11:48,759
booked this interview for the three of
us to talk, the larger conversation has

163
00:11:48,759 --> 00:11:56,080
continued, as fast as all the GPT
outgrowths and companies. And I mean,

164
00:11:56,279 --> 00:12:01,480
how many companies were founded
in Delaware? And so much of

165
00:12:01,559 --> 00:12:05,919
this conversation has moved on since we
just booked this interview. Martha and I

166
00:12:05,919 --> 00:12:11,080
talked about it on the air.
I think the way to discuss this isn't

167
00:12:11,080 --> 00:12:16,840
necessarily from an ethical standpoint so much
as it is from a personal standpoint.

168
00:12:16,039 --> 00:12:22,080
Martha's approach was
really interesting, as a former magazine writer.

169
00:12:24,039 --> 00:12:28,720
You know, she wrote for Glamour and
for the Washington Post and some other things.

170
00:12:28,759 --> 00:12:31,279
She just said, well, if
I was still in the trenches,

171
00:12:33,960 --> 00:12:37,399
what's to stop my competitor from taking
all my gigs and using this to write

172
00:12:37,480 --> 00:12:41,200
rough drafts and then tuning them up
a little bit and turning them in and

173
00:12:41,600 --> 00:12:45,720
beating me to every deadline, taking
all of the greatest gigs, and nobody

174
00:12:45,720 --> 00:12:48,200
being the wiser. Now we know
that there are tools that can help detect

175
00:12:48,200 --> 00:12:54,080
the GPT-derived, you know, articles.
Yeah, but that's today. Yeah,

176
00:12:54,080 --> 00:12:56,279
she's got a good point. Yeah, except the quality is really not that

177
00:12:56,320 --> 00:13:00,159
good. Like, is it ultimately
saving you time? I don't even I'm

178
00:13:00,200 --> 00:13:01,960
not even convinced. Well, that's
the question you still have, Like I

179
00:13:01,960 --> 00:13:05,279
said, you have to tune it
still, right? You still? I

180
00:13:05,279 --> 00:13:09,320
mean what if you what if your
machine learning was based on your own writing

181
00:13:11,000 --> 00:13:13,200
and not the larger world. Yeah, it's totally going to be as smart

182
00:13:13,240 --> 00:13:18,480
as you, and invariably not, right? What if it was a

183
00:13:18,639 --> 00:13:22,919
two-step process? The first draft
was just from the larger ChatGPT, and

184
00:13:22,960 --> 00:13:26,960
the second draft said, now run
it through the corpus of my own writing.

185
00:13:28,080 --> 00:13:30,960
So it sounds like me. I
read an interesting opinion I think it

186
00:13:31,000 --> 00:13:35,840
was in the Washington Post, from a
teacher who said teachers need not worry about

187
00:13:35,960 --> 00:13:43,360
AI, and the pushback was,
there's a simple solution, just make tests

188
00:13:43,080 --> 00:13:48,960
in person, or even oral
exams. Yeah. Handwriting. Yeah,

189
00:13:48,960 --> 00:13:54,279
handwriting or you know, Oh you
mentioned that things have progressed. I've noticed

190
00:13:54,320 --> 00:14:00,840
that there's a WordPress ChatGPT
plugin now. Oh, because

191
00:14:00,879 --> 00:14:05,399
that's going to work. Well,
all the SEO animals love this crap,

192
00:14:05,519 --> 00:14:07,360
right. How many crappy SEO blogs
are we going to see with this?

193
00:14:07,720 --> 00:14:11,120
All littered up
with Google ads. Yeah, well they

194
00:14:11,120 --> 00:14:15,399
were doing that anyway. I know, but now it's even

195
00:14:15,559 --> 00:14:20,559
easier. Yeah. I would wonder
if it'll actually be better. I don't

196
00:14:20,559 --> 00:14:22,519
know, because it might actually be
worse. So I want to talk about

197
00:14:24,080 --> 00:14:26,360
So we're at a bridging point here,
where there's a before and an after,

198
00:14:26,639 --> 00:14:30,879
but I wanted to talk about and
compare this to another bridging point. I

199
00:14:30,919 --> 00:14:33,759
went back to school as an older
student, in my twenties, and I went

200
00:14:33,759 --> 00:14:37,799
to Columbia University to get a degree. And I was right there at the

201
00:14:37,879 --> 00:14:43,000
point where universities hadn't really decided whether
or not laptops should be allowed in all

202
00:14:43,039 --> 00:14:48,039
classrooms. This was around two thousand,
and I was at Columbia University,

203
00:14:48,080 --> 00:14:52,240
and I have horrific handwriting. So
I had one professor who allowed me to

204
00:14:52,279 --> 00:14:56,720
take my exams on a laptop,
and a number of the other students were

205
00:14:56,759 --> 00:15:00,360
kind of resentful of this. But
most students didn't even have a laptop in class

206
00:15:00,360 --> 00:15:03,720
to take notes. But now it
is really standard to have a laptop in

207
00:15:03,759 --> 00:15:07,120
class to take notes. And the
difficulty is you don't know what a student

208
00:15:07,200 --> 00:15:11,879
is doing unless you can see their
screen. Yeah, and now the difficulties

209
00:15:11,879 --> 00:15:15,600
that we saw a lot of this
during the pandemic. The difficulty is if

210
00:15:15,639 --> 00:15:18,360
you're taking an exam remotely, you
have to lock it down. There's all this

211
00:15:18,480 --> 00:15:22,879
software and all these businesses that are
designed to track eye movement and mouse movement,

212
00:15:24,000 --> 00:15:26,559
and they'll even pan the room with
your camera to know what else and

213
00:15:26,600 --> 00:15:31,759
who else is in the room.
So we've kind of leaped past this whole thing

214
00:15:31,159 --> 00:15:35,600
where it's just assumed you have a
laptop, and now they're worried about the

215
00:15:35,639 --> 00:15:37,720
laptop doing too much, rather than
it shouldn't even be in the room.

216
00:15:39,120 --> 00:15:43,080
I have friends who are professors who
say, you have no idea how rampant

217
00:15:43,120 --> 00:15:46,600
cheating is. Like, it's unbelievable.
No doubt. And you know, it's

218
00:15:46,600 --> 00:15:50,639
funny because there's another bridge point that
I want to talk about when we talk

219
00:15:50,679 --> 00:15:54,279
about technology kind of going from, oh
my god, this is a problem, to

220
00:15:54,399 --> 00:15:58,080
Of course it's here. And I
think GPT and things like this,

221
00:15:58,159 --> 00:16:00,519
all this machine learning will go of
course it's here, or we'll just at

222
00:16:00,519 --> 00:16:04,159
some point wonder and laugh about this
panic. I remember, as an IT

223
00:16:04,440 --> 00:16:08,919
guy, I was largely on the
support side for ad agencies and publishing companies.

224
00:16:10,240 --> 00:16:15,799
I remember teaching people to
use the internet, right, where we

225
00:16:15,799 --> 00:16:19,919
went from a completely unnetworked office with
standalone computers on desks and the most networking

226
00:16:19,960 --> 00:16:25,799
we had was maybe to a printer, to hooking up email for the first

227
00:16:25,840 --> 00:16:26,720
time, and people going, what
am I going to use this for?

228
00:16:29,000 --> 00:16:32,399
What does this do? I don't
know anybody outside the office, what is

229
00:16:32,399 --> 00:16:34,240
it? What's an email address?
Why is it there's this at symbol? Kind

230
00:16:34,240 --> 00:16:38,240
of put another symbol there instead?
I don't like the at symbol, you

231
00:16:38,279 --> 00:16:42,120
know. Lots of stuff like
that. One woman was printing out all of

232
00:16:42,120 --> 00:16:47,039
her emails and filing them. She's
like, oh, you know,

233
00:16:47,240 --> 00:16:52,799
things like that. And now email
is actually almost passé. Email is

234
00:16:52,799 --> 00:16:55,159
like, kind of like, people are like,
oh, I don't want email. Who uses

235
00:16:55,159 --> 00:16:59,000
email anymore? I don't
know. I make notes in my contacts.

236
00:16:59,039 --> 00:17:02,240
Now: what's the preferred medium this
person likes? Yeah, how am I

237
00:17:02,240 --> 00:17:03,279
going to reach them? God help us. I wish we had email as the

238
00:17:03,279 --> 00:17:07,880
primary one still, because now I
have fifteen different ways to reach people.

239
00:17:07,920 --> 00:17:10,759
I don't know which one, but
anyway, I just want to talk

240
00:17:10,799 --> 00:17:15,160
about this instead of getting into this moral
panic, an ethical panic. Really,

241
00:17:15,240 --> 00:17:18,680
let's talk about this as an inflection
point. And we've seen these, whether

242
00:17:18,720 --> 00:17:26,279
it's moving from speaking or memorizing our
knowledge versus writing it down, whether it's

243
00:17:26,759 --> 00:17:30,000
should we have laptops in the classroom,
versus expecting them in

244
00:17:30,039 --> 00:17:36,240
the classroom, to should we have
email, to expecting to have email. There are

245
00:17:36,240 --> 00:17:40,480
all these different times in history where
technology has gone from oh my god,

246
00:17:41,119 --> 00:17:42,720
this is scary, to, of course,
we have this. This is scary,

247
00:17:42,880 --> 00:17:45,880
right, And I think that's where
we are with this. So I know

248
00:17:45,960 --> 00:17:52,079
that one of your things
on A Way with Words is that you are

249
00:17:52,200 --> 00:17:57,160
very much always trying to talk people
off the ledge of being like, you

250
00:17:57,200 --> 00:18:03,599
know, grammar Nazis, usage Nazis, and you're like, like, language

251
00:18:03,680 --> 00:18:06,680
is changing, you need to change
with it. You know, it's not

252
00:18:06,720 --> 00:18:08,240
the world isn't going to stop and
change for you. But that does not

253
00:18:08,359 --> 00:18:14,680
excuse 'figuratively' and 'literally.' Like, I'm
not okay with that. Calm yourself.

254
00:18:15,119 --> 00:18:19,119
Yeah, see see this is exactly
why. Right. So yeah, you're

255
00:18:19,160 --> 00:18:25,960
like, it's changing. And people
wonder how these things worked back in

256
00:18:26,079 --> 00:18:30,440
time? You know, how did
something turn into something else? And you

257
00:18:30,480 --> 00:18:33,480
guys do the research. You you're
like, well, it's because well maybe

258
00:18:33,480 --> 00:18:37,039
you can give me a better example. But it happens, it changes,

259
00:18:37,119 --> 00:18:42,000
and it's often based on misunderstandings,
right, uh. Yeah. Sometimes sometimes

260
00:18:42,000 --> 00:18:48,400
it's just natural, where one
category that was broad becomes specific. For

261
00:18:48,440 --> 00:18:52,039
example, girl used to mean boy. Right? How weird is that?

262
00:18:52,319 --> 00:18:55,799
I mean yeah, well, because
it was more general, it could mean

263
00:18:56,039 --> 00:19:00,920
a young person in general, you
know. Or sometimes a specific term goes broad.

264
00:19:00,599 --> 00:19:04,920
One of my favorite ones to talk
about is how the very specific psychotherapy term

265
00:19:06,039 --> 00:19:10,319
of anal-retentive now just means, oh, they're particular. So it left the

266
00:19:10,400 --> 00:19:15,240
specific domain, where it's very precise in
that profession, to be used in everyday language

267
00:19:15,240 --> 00:19:18,880
by people who are not you know, they're not professional at all. Do

268
00:19:18,680 --> 00:19:25,559
you find an author who used that
phrase in some kind of popular media that

269
00:19:25,640 --> 00:19:29,799
then it became part of the regular
lexicon? It usually wasn't one. As many, you

270
00:19:29,839 --> 00:19:34,039
know, things often leave their
specific domain. Police jargon, you know, often

271
00:19:34,119 --> 00:19:37,839
leaves, for example, the specific
language of the police and shows

272
00:19:37,880 --> 00:19:41,039
up in hip hop and then it
becomes slang in the street. Right.

273
00:19:41,119 --> 00:19:45,160
So so a lot of times the
bridge there is some kind of contact,

274
00:19:45,160 --> 00:19:48,640
but it usually requires repeated contact and
not one point of contact. So I

275
00:19:48,640 --> 00:19:53,559
guess, given all that history of
you and Martha, are you a little

276
00:19:53,559 --> 00:19:57,839
more scared of AI than the rest
of us? Or are you

277
00:19:57,920 --> 00:20:02,960
like, does this just fall into,
this is just another inflection point, and don't

278
00:20:03,000 --> 00:20:04,839
worry, everybody. Yeah, I
think it's another inflection point. And you

279
00:20:04,880 --> 00:20:07,519
know, I think on the segment
we aired on the show, I can't

280
00:20:07,519 --> 00:20:11,079
remember how it ended up when we
edited it. I think I specifically said

281
00:20:11,079 --> 00:20:14,480
I want to avoid the term AI. I don't believe this is AI.

282
00:20:14,920 --> 00:20:17,839
I think it's just machine learning,
right, and it is, And to

283
00:20:17,920 --> 00:20:21,720
call it AI is overreach. A
lot of what we've been calling AI

284
00:20:21,799 --> 00:20:26,279
for decades isn't AI. It's
just sophisticated conditionals. Yeah. The way

285
00:20:26,319 --> 00:20:29,920
I've been describing it is, artificial intelligence
is what you call something when it doesn't

286
00:20:29,960 --> 00:20:33,400
work. As soon as it does
work, it'll get a new name.

287
00:20:33,559 --> 00:20:36,000
Yeah. Yeah, Well, although
I think that some of the stupid things

288
00:20:36,000 --> 00:20:41,440
it's producing are just as stupid as
humans. Potentially stupider, Yeah, potentially

289
00:20:41,480 --> 00:20:45,960
stupider. The opposite of artificial intelligence
is natural stupidity, right, Yeah,

290
00:20:45,960 --> 00:20:48,240
although, you know, it's a ton
of operator error, from what I'm seeing,

291
00:20:48,240 --> 00:20:51,880
a lot of those stupid results are
just because the operators don't know what the

292
00:20:51,880 --> 00:20:55,559
hell they're doing. Why are you
having existential debates with a search engine?

293
00:20:55,720 --> 00:20:57,240
Right? We all saw the one
we were all laughing at, like the guy

294
00:20:57,279 --> 00:21:00,480
who wrote this very ponderous piece at
the Washington Post. We're like, yeah,

295
00:21:00,559 --> 00:21:06,559
dude, you're like completely treating it
like an entity. It's literally every single

296
00:21:06,599 --> 00:21:10,240
time you ask it a question,
it's only uh, I know, it

297
00:21:10,519 --> 00:21:12,839
doesn't have a brain. You're acting
like it has a brain. It's software.

298
00:21:14,240 --> 00:21:18,200
But here's the thing, though.
Companies are going to get dinged

299
00:21:18,279 --> 00:21:22,359
for these things representing them in ways
they don't want to be represented. Yeah.

300
00:21:22,599 --> 00:21:26,799
This has just happened with
Bing, right? Microsoft, they

301
00:21:26,880 --> 00:21:34,240
basically allowed open-ended discussions with their
AI, and what they've noticed is,

302
00:21:34,279 --> 00:21:40,640
after maybe twenty iterations back and forth
on the same subject, the bot starts

303
00:21:40,680 --> 00:21:44,599
to lose its... You know what,
I now know why. I actually was

304
00:21:44,640 --> 00:21:47,359
able to talk to some folks on
the inside. Yeah, because it's a

305
00:21:47,400 --> 00:21:52,720
software problem. It's a bug. Interesting,
like a bug bug? Or like it's

306
00:21:52,839 --> 00:21:56,160
it's, it is solvable.
But they solved it by not hitting the

307
00:21:56,240 --> 00:22:00,559
limits. Look, this whole mechanism about
context, right, yeah. Yeah,

308
00:22:00,640 --> 00:22:06,079
So your parser reads what you've written
and it tokenizes it, so it knows

309
00:22:06,079 --> 00:22:07,839
how to respond to it, and
it puts it into cache. So the

310
00:22:07,920 --> 00:22:11,759
next thing, then it writes a
response, which is also tokenized. Then

311
00:22:11,799 --> 00:22:15,559
you write your next response. In order
to maintain context, it tokenizes that,

312
00:22:15,599 --> 00:22:18,839
then it reads them all together and
creates a response. Eventually you overflow the

313
00:22:18,880 --> 00:22:22,799
cache. Oh, so you've got too
many layers of recursive tokenizing. That's right,

314
00:22:23,160 --> 00:22:26,440
you just run out of room.
They've only allocated so much space for

315
00:22:26,480 --> 00:22:30,039
tokens. This is why you get
it. Well, the oldest tokens fall

316
00:22:30,079 --> 00:22:33,000
off the back, and the oldest
tokens are typically the ones shaping things.

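The cache-overflow mechanism described here can be sketched in a few lines of Python. This is an assumption-level illustration, not the actual implementation of any of these chatbots: the word-level tokenizer, the tiny token budget, and the class are all invented for the sketch.

```python
# Sketch of a rolling context cache: every turn is tokenized and appended,
# and when the token budget overflows, the oldest tokens fall off the back,
# including the instructions that were shaping the conversation.

MAX_TOKENS = 12  # deliberately tiny budget so the overflow is visible


def tokenize(text):
    # Stand-in tokenizer: one token per whitespace-separated word.
    return text.split()


class ContextCache:
    def __init__(self, max_tokens=MAX_TOKENS):
        self.max_tokens = max_tokens
        self.tokens = []

    def add_turn(self, text):
        self.tokens.extend(tokenize(text))
        # Overflow: drop the oldest tokens first.
        overflow = len(self.tokens) - self.max_tokens
        if overflow > 0:
            self.tokens = self.tokens[overflow:]


cache = ContextCache()
cache.add_turn("SYSTEM: only reply in iambic pentameter")
cache.add_turn("USER: tell me about whales")
cache.add_turn("ASSISTANT: upon the sea the mighty whales do roam")

# The opening instruction has been pushed out of the window.
print("iambic" in cache.tokens)  # prints False
```

After enough back-and-forth, the "umbrella" instruction from the first turn is simply no longer in the window, which is why putting key instructions late (or repeating them) keeps the bot on track longer.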
317
00:22:33,079 --> 00:22:36,720
Yeah, because you need the umbrella
thought. Because a lot of times your

318
00:22:36,720 --> 00:22:41,440
first question sets your umbrella, your
umbrella. Yeah. The best demonstration I've

319
00:22:41,480 --> 00:22:42,759
done of this is I fired up
chat GPT and said, only talk

320
00:22:42,799 --> 00:22:48,240
to me in iambic pentameter, and
then just chat away with it, and

321
00:22:48,319 --> 00:22:52,240
at some point it forgets. I
love when you give it rules and it's

322
00:22:52,799 --> 00:22:56,720
like a four year old. Right. Yeah, I told you not to

323
00:22:56,799 --> 00:22:59,160
talk to me, but you want
to know what the limits of the cache

324
00:22:59,160 --> 00:23:02,200
are. Put key instructions at the
beginning of the chat. But to get

325
00:23:02,279 --> 00:23:06,400
back to the Bing thing, this
chatbot kept insisting to New York Times reporter

326
00:23:06,599 --> 00:23:10,400
Kevin Roose, I guess, that he didn't
actually love his wife and said that it

327
00:23:10,440 --> 00:23:14,480
would like to steal nuclear secrets, and
yeah, okay, but it's a bot.

328
00:23:14,640 --> 00:23:18,079
It's not going to steal. There
was another situation where somebody in one

329
00:23:18,079 --> 00:23:22,720
of our RDs actually convinced the bot
that, you know, fifteen plus three

330
00:23:22,839 --> 00:23:26,160
equals twenty one, and the bot
said, yes, I'm sorry, you

331
00:23:26,200 --> 00:23:30,680
are right, fifteen plus three equals
twenty one. And then I to test

332
00:23:30,759 --> 00:23:33,920
it, went back and asked it, what's fifteen plus three, and it said eighteen.

333
00:23:34,119 --> 00:23:37,359
Yeah, because it didn't remember that
previous conversation, because that's not how

334
00:23:37,359 --> 00:23:41,400
it's spelled. It doesn't have the
context from all the other conversations it's had

335
00:23:41,720 --> 00:23:45,839
with all these other people. It
only has its context from you, because

336
00:23:45,839 --> 00:23:48,920
when it does combine all the contexts, you end up with Tay. Okay,

337
00:23:48,960 --> 00:23:53,200
you got to explain that. The original
Tay was the original Microsoft bot.

338
00:23:53,200 --> 00:23:57,799
Going back to twenty eighteen,
which within ten hours had turned into a

339
00:23:57,880 --> 00:24:02,839
racist psychopath, because it was combining
everything that people were throwing at it,

340
00:24:02,920 --> 00:24:04,200
right, Yeah, and they were, and they were testing its limits.

341
00:24:04,200 --> 00:24:07,680
There are tons of edge cases.
It wasn't long before they took it down.

342
00:24:07,799 --> 00:24:10,920
And yeah, I'm still not convinced
they're not going to take this down.

343
00:24:11,160 --> 00:24:14,119
I'm not. I'm not convinced either, and I don't. I think it's

344
00:24:14,119 --> 00:24:17,920
a wonderful experiment. I think it's
an experiment on the people using it as

345
00:24:18,039 --> 00:24:22,160
much as on the software itself. I
wonder how much of this is almost another

346
00:24:22,240 --> 00:24:26,319
kind of Stanford experiment. Yeah,
maybe how lonely are people that they're having

347
00:24:26,359 --> 00:24:30,279
these conversations, you know, I
want to I want to tackle this from

348
00:24:30,279 --> 00:24:33,680
another angle as well. You know
what, This also reminds me of a

349
00:24:33,720 --> 00:24:40,000
lot of the complaints about Mechanical Turk, when people were talking about what happens

350
00:24:40,039 --> 00:24:45,640
when you put all this effort out
to people to automate stuff that you know,

351
00:24:47,240 --> 00:24:49,240
pennies on the task, right,
And I've done that, and actually,

352
00:24:49,279 --> 00:24:52,400
you know, in one of my other
lives, one of the reasons I

353
00:24:52,440 --> 00:24:56,079
do the radio show, was
as a dictionary editor. So I actually

354
00:24:56,160 --> 00:25:00,400
did a paper for the Dictionary Society
of North America and presented it: what happens when

355
00:25:00,400 --> 00:25:03,559
you ask people on Mechanical Turk to
write dictionary entries? And so I found

356
00:25:03,599 --> 00:25:08,160
a bunch of fairly common words that
weren't in any major dictionary, and I

357
00:25:08,599 --> 00:25:12,599
created a bunch of tasks. And
first I had people find sample sentences using

358
00:25:12,680 --> 00:25:15,920
these words, and then I had
another whole bunch of people use those sample

359
00:25:15,960 --> 00:25:19,519
sentences to write an approximate dictionary-style entry.
And then I had another batch of people

360
00:25:21,279 --> 00:25:25,480
trim them down and make them look
like actual dictionary entries with parts of speech.

361
00:25:25,599 --> 00:25:29,559
And it actually worked pretty well.
It was more expensive than having an

362
00:25:29,559 --> 00:25:33,440
actual dictionary editor do it.
But now I have a paper proposal

363
00:25:33,519 --> 00:25:38,480
before that same Dictionary Society to see
what chat GPT does. And I'm also

364
00:25:38,519 --> 00:25:45,759
going to use Stable Diffusion or
some other AI image generator to

365
00:25:45,799 --> 00:25:53,720
create dictionary style illustrations for those definitions. See if I can't scare the lexicographers,

366
00:25:53,880 --> 00:26:00,119
Nice. You're playing up against the plagiarism
aspect too. I really hope that Martha

367
00:26:00,200 --> 00:26:04,440
did that work to check all the
sentences. Just because you scrape

368
00:26:04,480 --> 00:26:07,599
something off the web and run it
through a synonym engine, you know,

369
00:26:07,680 --> 00:26:11,559
doesn't make it original work, right, it just doesn't search well, yeah,

370
00:26:11,559 --> 00:26:15,799
because, even though it's not
good text that it generates, it's plausible, right?

371
00:26:15,880 --> 00:26:18,400
Yes, Now it's definitely a Dunning
Kruger amplifier. If you know nothing

372
00:26:18,440 --> 00:26:22,799
about a subject, all this stuff
looks great. Well, that's Wikipedia,

373
00:26:23,119 --> 00:26:30,440
indeed. Yeah. I love going
to Wikipedia and finding like something that's been

374
00:26:30,440 --> 00:26:33,839
in there for years and I know
it is fundamentally wrong. Yeah. I

375
00:26:33,160 --> 00:26:38,319
go to Wikipedia for the footnotes.
It gives me interesting places to go look

376
00:26:38,359 --> 00:26:42,160
for information. It's not information itself. Well, that's the researcher's attitude.

377
00:26:42,200 --> 00:26:45,240
I go to I go find a
paper and I look at its bibliography and

378
00:26:45,400 --> 00:26:49,400
suddenly I know the path to
become an expert on the thing. Sure

379
00:26:49,799 --> 00:26:53,880
you at least know what those authors
thought was important, what they found, or

380
00:26:53,920 --> 00:27:00,400
at least you know what other papers
they pillaged from. Right. So another

381
00:27:00,400 --> 00:27:03,920
point of inflection that you guys talked
about on the show and still talk about

382
00:27:03,000 --> 00:27:10,440
quite a bit is texting language and
how you know people are afraid because kids

383
00:27:10,440 --> 00:27:12,319
have their phones and their you know, their heads down in their phones.

384
00:27:12,359 --> 00:27:17,480
And then we don't talk about it
that much anymore. But you you talked

385
00:27:17,480 --> 00:27:19,880
about that book and I went and
read the book, Because Internet. Is that it?

386
00:27:21,440 --> 00:27:23,559
Yeah. Yeah, by Gretchen McCulloch. It's a fantastic book. It's

387
00:27:23,559 --> 00:27:27,759
one of the best books written on
internet language ever. It's being used in

388
00:27:27,759 --> 00:27:30,440
classrooms now, that's how good it
is. It's awesome. And the

389
00:27:30,440 --> 00:27:36,000
basic tenet is, look, you
know, this is definitely changing the language.

390
00:27:36,599 --> 00:27:40,200
You know, LOL is a word
now, not just Land of Lakes.

391
00:27:40,720 --> 00:27:42,799
Yeah, it's in context. Yeah, and it doesn't mean lots of

392
00:27:42,839 --> 00:27:47,519
love. It depends on the context. But yeah, it's context sensitive.

393
00:27:47,799 --> 00:27:52,319
The reason I said we don't
talk about texting language is, for some people,

394
00:27:52,880 --> 00:27:57,480
in their mind it's just these very
specific things like BRB or LOL

395
00:27:57,839 --> 00:28:02,680
or OMG. But there's so much
more to it. And Gretchen gets into

396
00:28:02,720 --> 00:28:10,880
that about all these different ways that
we express emphasis and laughter and sarcasm or

397
00:28:11,240 --> 00:28:17,799
community or insiderness or outsiderness and hesitation. There's so much more to it; there's so

398
00:28:17,880 --> 00:28:22,160
many more layers, and what you
think of as an error may

399
00:28:22,200 --> 00:28:26,519
have a message attached to it.
Lack of capitalization or punctuation may have a

400
00:28:26,640 --> 00:28:32,160
point, and you need to understand
that. But what's really important is that

401
00:28:32,240 --> 00:28:37,240
again the moral panic about what does
it all mean when people don't capitalize and

402
00:28:37,279 --> 00:28:42,279
punctuate their text messages has vanished.
It's mostly gone. And what we discovered

403
00:28:42,400 --> 00:28:47,880
was that people really do care about
their speech. And then when we all

404
00:28:47,960 --> 00:28:52,599
moved from those bar phones where
we only had T9 entry for our

405
00:28:52,759 --> 00:28:59,440
text messages and went to touch phones
with haptic response, with full keyboards,

406
00:29:00,119 --> 00:29:03,279
we could finally type whole sentences.
We did, right, most of us

407
00:29:03,359 --> 00:29:07,079
do. Most of us type the
best that we can. But then

408
00:29:07,200 --> 00:29:10,960
we were also brought in
a chart of emojis, of ever-

409
00:29:11,000 --> 00:29:15,240
expanding emojis. Yeah, and those
actually have expanded our ability to communicate.

410
00:29:15,400 --> 00:29:19,599
They're the new hieroglyphs. Well,
but they're necessary. There's something I talk

411
00:29:19,680 --> 00:29:23,559
about, and I haven't given this
presentation in a while, but the version

412
00:29:23,599 --> 00:29:30,279
of it I'll give to you is,
there's something called paralinguistic restitution. The vocal

413
00:29:30,440 --> 00:29:36,400
version of speech is speech. The
written version is only a pale imitation of

414
00:29:36,400 --> 00:29:41,279
what we speak aloud, and it'll
never catch up with the spoken version ever,

415
00:29:41,720 --> 00:29:45,200
And so we're constantly trying to make
the written version accommodate all the things

416
00:29:45,240 --> 00:29:49,359
we can do with our spoken language, and it really never catches up.

417
00:29:49,400 --> 00:29:56,000
But paralinguistic restitution is what we
do when we make things like emoji,

418
00:29:56,400 --> 00:30:00,000
or when we make things like emoticons, or when we do clever little things

419
00:30:00,039 --> 00:30:03,960
with punctuation. It's when we're adding
back in gestures by making a smiley face,

420
00:30:04,440 --> 00:30:08,400
or we're adding back in a grimace
by putting in the right emoji. Still,

421
00:30:08,440 --> 00:30:11,640
I never felt older than when my
teenager came up to me and said,

422
00:30:12,119 --> 00:30:18,920
hashtag I'm hungry. I don't know
about old, but I've always thought,

423
00:30:18,000 --> 00:30:21,839
you don't have to say hashtag.
You don't have to say hashtag. Yeah,

424
00:30:21,880 --> 00:30:25,680
okay. But also, if there's no
word following it, when it's just the

425
00:30:25,720 --> 00:30:27,400
hash symbol, it's not a hashtag. It has to have a word following

426
00:30:27,400 --> 00:30:32,240
it, so stop that.
Yeah, it's, you can call it a

427
00:30:32,279 --> 00:30:34,759
hash. You can. Remember, there
are some things that really bug Grant Barrett,

428
00:30:34,799 --> 00:30:40,000
It's good to know a few. What are some of the others?

429
00:30:40,759 --> 00:30:44,839
I'm not telling. You have to listen
to all five hundred episodes. Well,

430
00:30:44,880 --> 00:30:47,599
I have. I appreciate that.
I just don't remember them. I feel

431
00:30:47,599 --> 00:30:48,720
sorry for you. Who made you
do that? Now? Me? I

432
00:30:48,759 --> 00:30:52,359
did it on my own. Like
when he was in prison, they didn't

433
00:30:52,160 --> 00:30:55,880
play, like, Guns N' Roses at full
blast, and they played our show. Nice.

434
00:30:56,359 --> 00:31:00,759
guy's made two thousand plus podcasts.
He does not recommend going through the whole back

435
00:31:00,799 --> 00:31:04,400
catalog, but he does it himself
anyway. Oh, it's its own kind

436
00:31:04,400 --> 00:31:07,640
of OCD. And so that's those
are those are prison bars I see in

437
00:31:07,680 --> 00:31:12,039
the background there, absolutely. Okay, and
I've got to interrupt for one moment for this

438
00:31:12,160 --> 00:31:18,039
very important message. You know,
Amazon AWS is a great home for your

439
00:31:18,079 --> 00:31:22,720
dot net applications. Whether you're looking
to run your apps as serverless functions,

440
00:31:23,160 --> 00:31:29,599
have them running in containers, or you
require the complete flexibility of virtualized hardware,

441
00:31:30,119 --> 00:31:33,400
you can do it all on AWS. Dot net toolkits and templates guide you

442
00:31:33,440 --> 00:31:37,559
through build and deployment, or you
can use your favorite infrastructure as code and

443
00:31:37,640 --> 00:31:45,519
CICD tools including Azure DevOps, GitHub
Actions, and GitLab's CICD. Loving Visual

444
00:31:45,559 --> 00:31:48,920
Studio, VS Code, or Rider? There's
an AWS toolkit for each of them.

445
00:31:49,359 --> 00:31:55,160
Want to use familiar Microsoft SQL Server
databases? You can with the Amazon Relational

446
00:31:55,240 --> 00:32:00,319
Database Service, which provides fully managed
SQL Server databases that offer automatic

447
00:32:00,400 --> 00:32:05,359
scaling, failover, and snapshots. Once your
dot net application is deployed, it can

448
00:32:05,440 --> 00:32:10,200
use any of the over two hundred
AWS services to extend its functionality. Interested

449
00:32:10,240 --> 00:32:15,680
in AI, IoT, machine learning, video processing, or transcription? Your

450
00:32:15,720 --> 00:32:21,119
dot Net on AWS application can do
it all. To learn more, go

451
00:32:21,160 --> 00:32:29,960
to AWS dot Amazon dot Com,
slash net and we're back. It's dot

452
00:32:29,960 --> 00:32:32,119
net Rocks. I'm Richard Campbell.
That's Carl Franklin. Yo. Talking to

453
00:32:32,160 --> 00:32:37,960
our fellow podcaster Grant Barrett, who's
also a lexographer. And I didn't really

454
00:32:37,960 --> 00:32:43,000
think that lexographers were that much fun. It's lexicographer, a lexicographer. Yeah,

455
00:32:43,000 --> 00:32:49,359
well, you're, you're okay with us
languages? Yeah, you're in good

456
00:32:49,359 --> 00:32:52,720
company though. The very first radio
interview I ever did was with an Australian

457
00:32:52,920 --> 00:32:55,960
station promoting my first book, and
the guy called me a lexiographer the whole

458
00:32:55,960 --> 00:33:00,079
time, a lexiographer. Yeah, when
you're doing a live radio interview, you

459
00:33:00,160 --> 00:33:04,039
just don't correct people. It's like
correcting your father at the dinner table.

460
00:33:04,039 --> 00:33:07,039
You just don't do it. All
right, So we're not going to learn

461
00:33:07,200 --> 00:33:12,480
the other pet peeves unless you listen
to A Way with Words. But certainly

462
00:33:12,559 --> 00:33:15,920
this sort of inflection point, as
you call it, is really interesting,

463
00:33:16,039 --> 00:33:22,839
especially for programmers because this, you
know, for years, other people have

464
00:33:22,960 --> 00:33:28,079
been looking at us as programmers,
as people that are replacing their jobs.

465
00:33:28,359 --> 00:33:34,559
Right, we automate. That's our
job: to write automation software to make the people

466
00:33:34,559 --> 00:33:38,400
who do those jobs either less brain
dependent, you know, push the button,

467
00:33:39,079 --> 00:33:43,160
or eliminate them altogether. I
mean that's the perception. I'm not

468
00:33:43,200 --> 00:33:49,160
saying it's true. I would say
that software has increased the number of available

469
00:33:49,240 --> 00:33:53,960
jobs and changed those jobs from sort
of menial tasks to things that have meaning.

470
00:33:54,519 --> 00:33:59,039
But they're new jobs, and it's on people
to retrain. And if you don't

471
00:33:59,039 --> 00:34:01,440
want to retrain, that's on them. But now the shoe's on the other

472
00:34:01,480 --> 00:34:05,720
foot. Now programmers are like,
oh my god, chat GPT, you

473
00:34:05,720 --> 00:34:07,159
can tell it to write a program
that does X and it will just spit

474
00:34:07,199 --> 00:34:12,880
it out. Might not be right, but I've actually used it, and

475
00:34:13,159 --> 00:34:15,639
you know, how do how do
I do that little thing again? And

476
00:34:15,679 --> 00:34:19,920
I asked it and for the most
part it works, but sometimes it doesn't.

477
00:34:19,639 --> 00:34:22,199
Like I totally do that because I'm
the worst. The most programming I

478
00:34:22,280 --> 00:34:24,920
know is a little bit of Perl
script doing a little bit of a little

479
00:34:24,960 --> 00:34:29,800
bit of Python, and mainly I
just like, yeah, but you ran

480
00:34:29,840 --> 00:34:34,320
an Exchange server. All of these crap
scripts that I do my little work with,

481
00:34:34,440 --> 00:34:37,599
but you ran Exchange, and this
is where you and Richard have something in common,

482
00:34:37,679 --> 00:34:40,880
but we both know pain. Oh
god, I did pain. But also

483
00:34:40,960 --> 00:34:45,360
I got to work with I actually
loved like things like Skip or Cisco servers,

484
00:34:45,400 --> 00:34:51,440
you know, blocking blocking everything coming
in at us, the original IOS.

485
00:34:52,199 --> 00:34:54,000
It was just nice. It was
just nice just to see all that

486
00:34:54,039 --> 00:34:59,079
stuff bounced back and never entering your
network. Just watch those logs and you're

487
00:34:59,079 --> 00:35:01,840
like, they did it. I
did it. All that intrusion not showing

488
00:35:01,880 --> 00:35:07,599
up in my network. Sweet. It
is interesting that programmers now are feeling that

489
00:35:07,719 --> 00:35:14,280
existential dread of you know, Chat
GPT kind of taking their jobs. You

490
00:35:14,320 --> 00:35:17,639
know, you know, it's not
just an inflection point, it's I want

491
00:35:17,679 --> 00:35:22,039
to emphasize that point about it being
a bridge. I've had the good fortune

492
00:35:22,159 --> 00:35:27,639
to be on a lot of different
bridge points in history. For example,

493
00:35:27,679 --> 00:35:32,000
when I worked in newspapers, I
bridged the print paste-up era, where you

494
00:35:32,039 --> 00:35:37,880
actually pasted up the paper and
shot it with a camera. But that's

495
00:35:37,880 --> 00:35:40,079
how you went to press to the
digital era, where you did it all

496
00:35:40,199 --> 00:35:44,960
on a you know, a Mac
SE with a floppy disk and PageMaker.

497
00:35:45,920 --> 00:35:49,480
And I remember going to the press
and there was a guy working in the

498
00:35:49,519 --> 00:35:53,079
press room who was very disgruntled that
I was taking his job because we were

499
00:35:53,119 --> 00:35:55,559
doing it on a Mac and we
weren't doing it the old fashioned way.

500
00:35:55,559 --> 00:36:00,840
We used a Linotype machine
to output the pages. There was this

501
00:36:00,239 --> 00:36:06,639
giant thing that involved chemicals and this
weird custom keyboard, and he was very

502
00:36:06,639 --> 00:36:08,719
angry about it that his job was
gone. You know, there's all of

503
00:36:08,880 --> 00:36:14,880
these technological innovations. Somebody's job is going to
be shifted, but if they can't

504
00:36:14,960 --> 00:36:17,400
keep up, it's on them.
That's how I feel. Yeah, I

505
00:36:17,440 --> 00:36:20,679
mean, I want to have some
compassion for them. Yeah, you're going to need

506
00:36:20,679 --> 00:36:23,360
to be retrained. Yeah, but
I also believe in the growth model here.

507
00:36:23,679 --> 00:36:28,599
When we brought the web into travel, we eliminated a lot of travel

508
00:36:28,639 --> 00:36:32,400
agents, but we actually grew the
overall travel business so much there's many

509
00:36:32,480 --> 00:36:37,440
more jobs than the ones that were
eliminated. I just don't think that getting

510
00:36:37,519 --> 00:36:40,800
ChatGPT to find some code for
you to use is that different from you

511
00:36:40,880 --> 00:36:45,320
googling it or Stack, Stack Overflow. It's the same gig. It's just

512
00:36:45,360 --> 00:36:47,599
like now you're doing it in a different context. And I think, I think

513
00:36:47,639 --> 00:36:52,440
it shifts the low end of the
business. It shifts the entry level business,

514
00:36:52,480 --> 00:36:57,880
and maybe that's the larger problem.
It shifts the entry point for the

515
00:36:57,920 --> 00:37:01,360
beginning programmer. How do I enter
as a brand new programmer? How do

516
00:37:01,440 --> 00:37:07,760
I get my name recognition out there
on the different freelance marketplaces. How do

517
00:37:07,840 --> 00:37:10,960
I start, say, in a
small country, to make a few bucks?

518
00:37:12,280 --> 00:37:15,719
If GPT is going to do this
for everyone, you know, how

519
00:37:15,760 --> 00:37:19,239
do I make a little bit of
extra money on the side. I don't

520
00:37:19,239 --> 00:37:21,920
know. There are some people who
are going to be cut out and retraining

521
00:37:21,920 --> 00:37:24,239
isn't going to solve it for them. Well, and folks have talked about

522
00:37:24,280 --> 00:37:30,599
you'll use ChatGPT, and your ability
to craft requests of that engine

523
00:37:30,000 --> 00:37:32,719
turns out to be your actual skill, and in the end you have to

524
00:37:32,719 --> 00:37:37,400
decide whether or not what they gave
you is good or not. Back in

525
00:37:37,440 --> 00:37:42,400
the early days of programming, I
sort of glibly said, you know,

526
00:37:42,440 --> 00:37:45,360
maybe the only job left in the
world someday will be programmers. And Scott

527
00:37:45,400 --> 00:37:51,559
Stanfield said, yeah, and battery
makers. Programmers and battery makers. That's

528
00:37:51,599 --> 00:37:52,880
those would be the only jobs
left. Now I'm thinking, you know,

529
00:37:53,159 --> 00:37:58,880
proofreaders might be the only job left
someday in the literary world. Oh

530
00:37:58,960 --> 00:38:01,440
dude, they're being cut in the
newspaper business, and in the publishing

531
00:38:01,440 --> 00:38:05,239
business. I wouldn't even count on
that. Really. Oh yeah, yeah,

532
00:38:05,280 --> 00:38:08,039
it's it's they're one of the first
jobs to be cut in publishing.

533
00:38:08,320 --> 00:38:12,480
Yeah. Well it shows, um, it does, it really does.

534
00:38:12,719 --> 00:38:15,039
I can't tell you how many errors
I see. And my wife, Kelly,

535
00:38:15,119 --> 00:38:21,519
Kelly, my wife is very um
attentive when it comes to details,

536
00:38:21,679 --> 00:38:24,239
right, and so she can spot, she says. She also says she

537
00:38:24,239 --> 00:38:29,280
should be a continuity person on a
movie set. M H, same thing,

538
00:38:29,400 --> 00:38:32,239
details, details. Your coffee cup is
in your left hand in that last

539
00:38:32,239 --> 00:38:35,719
shot. You moved it to your
right hand. Putting your arms up, not

540
00:38:35,880 --> 00:38:38,840
down. Yeah. Right. But
I wonder if we're on a I wonder

541
00:38:39,159 --> 00:38:43,559
if we are, have been,
and are now accelerating down a path of

542
00:38:43,800 --> 00:38:50,719
like a version of yellow journalism,
this idea of the quality. Eventually, you

543
00:38:50,719 --> 00:38:55,039
know, yellow journalism was the
time when folks were sensationalizing news back in

544
00:38:55,079 --> 00:38:59,920
the in the broadsheet era, and
eventually people stopped reading it because it was

545
00:39:00,039 --> 00:39:04,000
garbage and the industry had to change. And I wonder if we're not on

546
00:39:04,000 --> 00:39:07,480
that exact path where it's like,
hey, quality matters, and eventually people

547
00:39:07,480 --> 00:39:13,079
will ignore it because it's garbage.
Well, I've sat in on a lot

548
00:39:13,239 --> 00:39:16,079
of the future of news panels.
There's nothing news people love to talk about

549
00:39:16,119 --> 00:39:21,159
more than the future of their business. And this is a this has been

550
00:39:21,199 --> 00:39:22,840
a topic for as long as I've
been involved in the news business, which

551
00:39:22,880 --> 00:39:30,039
is like thirty years exactly this question. And generally the consensus is, and

552
00:39:30,079 --> 00:39:35,519
it has always been, there's not
enough division between opinion and news, and

553
00:39:35,679 --> 00:39:38,960
there never will be because there's too
much money in opinion, and the public doesn't

554
00:39:38,960 --> 00:39:43,079
know how to separate the two either. When you put opinion at the top

555
00:39:43,119 --> 00:39:45,519
of the page or on the chyron
at the bottom of the screen, people

556
00:39:45,519 --> 00:39:50,719
still think it's news. But we're
talking about an inflection point. This may

557
00:39:50,760 --> 00:39:53,679
be part of that inflection point is
to say that that we do need to

558
00:39:53,719 --> 00:39:59,119
discriminate on these things. It has
happened before. I cite the yellow journalism

559
00:39:59,159 --> 00:40:01,800
case as one of the few
cases where it did happen. Yeah,

560
00:40:01,800 --> 00:40:06,159
but what I'm saying. What I'm
saying is in the last thirty years of

561
00:40:06,679 --> 00:40:08,920
all we've seen is the rise of
the twenty four hour news channel, which

562
00:40:08,960 --> 00:40:15,320
is what eighty percent opinion and twenty
percent hard news, eighty percent advertising,

563
00:40:15,480 --> 00:40:21,119
ten percent opinion and ten percent news. Yeah, well, and lots of

564
00:40:21,159 --> 00:40:23,760
repeating over and over and over again. It's inexpensive. What do you think

565
00:40:23,760 --> 00:40:30,119
of Twitter? Well, God help
me. Yeah, we kind of feeling

566
00:40:30,159 --> 00:40:34,000
that dread too. I mean,
it was such a great language research tool

567
00:40:34,119 --> 00:40:37,360
until these new fees. I guess
they're not underway quite yet, but these

568
00:40:37,400 --> 00:40:43,559
new fees are now threatening a great
language research tool. Yeah, where the

569
00:40:43,960 --> 00:40:52,039
geocoding on tweets meant that you could
actually do really sophisticated time based, region

570
00:40:52,119 --> 00:40:58,760
based database searches, and say, this
term is more common here and

571
00:40:58,840 --> 00:41:02,039
less common there, and has been
over the last ten years or however long

572
00:41:02,079 --> 00:41:08,000
Twitter has been around. Thirteen years?
How long has Twitter been around? Fifteen? Maybe fifteen.

573
00:41:08,480 --> 00:41:14,480
And so now we have longitudinal data
and it's fantastic for that, And now

574
00:41:14,519 --> 00:41:16,599
that's going to be just thrown in
the crapper. Yeah, I'm not.

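The kind of geocoded, time-based term query Grant describes can be sketched roughly like this. The posts, regions, and terms below are invented sample data, not real tweets; a real study would pull geotagged posts from a platform API.

```python
# Sketch: count how often a term appears per region, optionally within a
# time window, over a set of geocoded posts.
from collections import Counter
from datetime import date

posts = [
    {"text": "that party was hella fun",  "region": "California",  "when": date(2015, 6, 1)},
    {"text": "hella good tacos today",    "region": "California",  "when": date(2019, 3, 2)},
    {"text": "wicked good chowder",       "region": "New England", "when": date(2016, 7, 4)},
    {"text": "that storm was wicked bad", "region": "New England", "when": date(2020, 1, 9)},
]


def term_by_region(posts, term, since=None):
    # Tally regions where the term occurs, skipping posts before `since`.
    counts = Counter()
    for p in posts:
        if since and p["when"] < since:
            continue
        if term in p["text"].split():
            counts[p["region"]] += 1
    return counts


print(term_by_region(posts, "hella"))
print(term_by_region(posts, "wicked", since=date(2018, 1, 1)))
```

Run over a decade of geocoded posts, a query like this gives exactly the longitudinal, region-by-region picture of a term's spread that made Twitter valuable as a research corpus.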
575
00:41:16,639 --> 00:41:20,960
I mean, I'm not convinced.
I don't think Twitter is all that important

576
00:41:20,960 --> 00:41:22,320
to Elon. Actually, I think
he's mostly just trying to get out of

577
00:41:22,320 --> 00:41:25,039
the situation he's in, Like,
I wonder if we're not going to get

578
00:41:25,039 --> 00:41:29,400
back to something, but he blew
up the revenue model, so he's just

579
00:41:29,400 --> 00:41:32,679
trying to scrounge around for revenue.
Ding dong. Yeah, I agree, what

580
00:41:32,800 --> 00:41:37,639
a terrible, terrible waste of time. Yes, it was a good product.

581
00:41:37,360 --> 00:41:39,480
Maybe it'll get back there. I
don't Yeah, maybe you're right,

582
00:41:39,519 --> 00:41:43,880
Maybe it'll get back there,
because ultimately it doesn't bat at the same

583
00:41:43,960 --> 00:41:46,760
level as getting people to Mars or
inventing the electric car, Like, why

584
00:41:46,760 --> 00:41:50,360
are you wasting time on that? Yes, I just wish he would spend more

585
00:41:50,360 --> 00:41:52,400
time with his family. Yeah,
yeah, that that dude needs a friend.

586
00:41:52,440 --> 00:41:54,920
Like, in the worst way. I'm
just saying, like out of the

587
00:41:54,920 --> 00:41:59,800
public eye, he just he needs
attention, and he's got all those people

588
00:42:00,079 --> 00:42:02,239
who could get him all the attention
he ever wanted. Yeah, I think

589
00:42:02,239 --> 00:42:07,480
he went off the rails when Grimes
left. Like the Elon that was kind

590
00:42:07,480 --> 00:42:09,079
of fun. You're the guy who
said let's fly my sports car into space

591
00:42:09,159 --> 00:42:13,480
like that. Yeah, that was
a great elon. And that's when he

592
00:42:13,559 --> 00:42:15,800
had he had a significant other around
him. That was, you know,

593
00:42:15,880 --> 00:42:20,039
at least something to reflect against. Since
she's been gone, he's just gotten

594
00:42:20,039 --> 00:42:22,440
progressively crazier. Yeah. I don't
I don't know enough about his personal life.

595
00:42:22,480 --> 00:42:25,199
I just I just know how you
know. I'm just trying to focus

596
00:42:25,360 --> 00:42:30,360
on the utility of the tool rather
than the nature of the man. And

597
00:42:30,400 --> 00:42:35,639
the utility of the tool is in
decline, especially for a guy like you,

598
00:42:35,920 --> 00:42:38,840
like you were saying, a great
research tool for language usage. It

599
00:42:38,920 --> 00:42:43,559
really is. Look at the work
by Jack Grieve if you just type in

600
00:42:43,639 --> 00:42:46,719
Jack Grieve, G-R-I-E-V-E,
just type in Jack Grieve linguist. He's

601
00:42:46,760 --> 00:42:54,639
written some phenomenal papers using Twitter and
showing African American English online and British English

602
00:42:54,639 --> 00:43:00,920
and just different ways you can track
regionality using Twitter. It's brilliant stuff,

603
00:43:00,079 --> 00:43:06,639
just brilliant stuff. Well, and
that only works if Twitter is serving as

604
00:43:06,679 --> 00:43:12,079
that sort of Piazza model where you
have a significant portion of the population routinely

605
00:43:12,119 --> 00:43:21,280
communicating, right, a broad spectrum
across all ethnic backgrounds and economic backgrounds and

606
00:43:21,360 --> 00:43:23,559
political backgrounds. Right. But if
it's going to shift very far right,

607
00:43:23,599 --> 00:43:28,519
we're going to lose that. Are
there any other tools out there that can

608
00:43:29,000 --> 00:43:31,400
that have potential in your mind to
be used in the same way someday?

609
00:43:31,599 --> 00:43:36,239
No, No, no, nothing
else geocodes like that. It's not online.

610
00:43:36,239 --> 00:43:38,039
As far as I know, I've
not thought of Twitter as all that

611
00:43:38,159 --> 00:43:42,679
special until I hear what you do
with it. Yeah, I'm like, Okay,

612
00:43:42,960 --> 00:43:45,360
it's because it's not the product of
Twitter. It's a byproduct, but

613
00:43:45,400 --> 00:43:49,199
it's an important byproduct. Yeah,
that's right. That A lot of the

614
00:43:49,239 --> 00:43:53,719
work that we do as linguists and
lexicographers is based on byproduct. For example,

615
00:43:53,920 --> 00:44:00,079
I learned years ago to use Google
alerts to track new words because writers,

616
00:44:00,199 --> 00:44:05,079
authors, journalists will often tag words
that are new to the vocabulary with

617
00:44:05,119 --> 00:44:10,280
phrases like also known as, or known
to police as, or referred to

618
00:44:10,320 --> 00:44:15,719
by chemists as. So
I have like three hundred

619
00:44:15,840 --> 00:44:20,039
or so expressions, and then following
those expressions is a word. Is it a

620
00:44:20,039 --> 00:44:22,880
new word? Possibly a
new word. Yeah. I was gonna

621
00:44:22,880 --> 00:44:24,159
say, how do you how do
you filter for new words? Yeah,

622
00:44:24,280 --> 00:44:28,800
but no, it's when people
are trying to relate that content with the

623
00:44:28,880 --> 00:44:31,880
relational phrases, yeah, and then
they'll sometimes define them right, and so

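The phrase-tagging trick is easy to sketch: scan incoming alert text for the framing phrases and grab what follows. A hypothetical illustration; the pattern list and sample sentence are made up, not Grant's actual three hundred expressions:

```python
import re

# A few framing phrases of the kind described; writers use these to
# flag words that may be new to the general vocabulary.
PATTERNS = [
    r"also known as\s+['\"]?([\w-]+)",
    r"known to police as\s+['\"]?([\w-]+)",
    r"referred to by \w+ as\s+['\"]?([\w-]+)",
]

def candidate_words(text):
    """Return the words that follow any of the framing phrases."""
    found = []
    for pat in PATTERNS:
        found.extend(re.findall(pat, text, flags=re.IGNORECASE))
    return found

alert = "The drug, known to police as flakka, is also known as gravel."
print(candidate_words(alert))
```

Whatever the scan surfaces is only a candidate; as Grant says, a human still has to decide whether the word is actually new.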
624
00:44:32,280 --> 00:44:37,000
you know, it's not very successful. I mean, I could probably automate

625
00:44:37,039 --> 00:44:40,159
that if I wanted to hire somebody, But you know, for a while

626
00:44:40,159 --> 00:44:44,079
there, I was just reading those
alerts and marking down the words that were

627
00:44:44,079 --> 00:44:46,719
new to me also, and it
was pretty successful. I found thousands and

628
00:44:46,760 --> 00:44:51,480
thousands of words that way. Yeah. I think you also used Google books

629
00:44:51,719 --> 00:44:55,159
right, and other tools
that you have at your disposal, to

630
00:44:55,239 --> 00:45:00,480
find the first time a word has been
seen or a phrase has been seen in

631
00:45:00,559 --> 00:45:04,840
print. Yeah, you know,
it's always like, as far as we

632
00:45:04,880 --> 00:45:07,440
know, you always have to add
that caveat, as far as we

633
00:45:07,480 --> 00:45:09,599
know. Yep. These days,
I more often use archive dot org.

634
00:45:10,079 --> 00:45:15,119
Yeah, because they have
a great new search engine by

635
00:45:15,159 --> 00:45:20,039
the way, that allows date range
searches. And although the OCR is kind

636
00:45:20,079 --> 00:45:23,880
of iffy because it uses I think
a modified version of Tesseract, which

637
00:45:23,960 --> 00:45:28,400
is way better than it used to
be, but it's still not perfect and

638
00:45:28,440 --> 00:45:31,920
it often depends upon script recognition,
which isn't that great, so sometimes it

639
00:45:32,000 --> 00:45:37,320
will misrecognize, for example,
Cyrillic as another language. But it's pretty

640
00:45:37,320 --> 00:45:42,159
good. I just found an
old website that I used to run

641
00:45:42,159 --> 00:45:45,639
that started in nineteen ninety four,
Carl and Gary's VB Home Page. It's where I

642
00:45:45,679 --> 00:45:49,440
first ran across you back in the
day. Yeah, but what I'm talking

643
00:45:49,440 --> 00:45:52,760
about is the books portion of archive
dot org, and that's where the gold

644
00:45:52,920 --> 00:45:57,360
is. Not the web portion; the
books portion of Archive, that's where the

645
00:45:57,360 --> 00:46:00,239
gold is. Yeah. If you're
going to donate money to an Internet resource, donate

646
00:46:00,239 --> 00:46:02,760
it to archive dot org. Yeah, I do. I give. It's

647
00:46:02,800 --> 00:46:05,840
not a lot, but I give
them like twenty five a month. Wow.

648
00:46:06,000 --> 00:46:07,679
That's nice because I make such a
huge use of this site. And

649
00:46:07,719 --> 00:46:12,920
then there's a few software projects, like,
since I use OCRmyPDF pretty

650
00:46:12,920 --> 00:46:15,920
regularly, so sometimes I'll download something
that I don't think was OCR'd very well

651
00:46:15,960 --> 00:46:21,400
there, and then I'll run OCRmyPDF
on it, which is a great

652
00:46:21,480 --> 00:46:25,800
script, just wonderful. It's got Tesseract
on the back end. It's Python

653
00:46:25,880 --> 00:46:30,639
based. You throw a PDF at
it, tell it which languages it

654
00:46:30,719 --> 00:46:35,960
should OCR with, and it does
wonderful things like deskewing and a

655
00:46:36,000 --> 00:46:42,400
little bit of contrast correction. Just
fantastic, Just kind of a nice front

656
00:46:42,480 --> 00:46:45,119
end for Tesseract to handle some
of the real common cases. So it

657
00:46:45,119 --> 00:46:51,239
converts some PDF to PDF. Just
fantastic, beautiful. Yeah. Why do

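For the curious, that OCRmyPDF workflow boils down to one command. A sketch of how the invocation is assembled; the -l and --deskew flags are real OCRmyPDF options, while the file names are placeholders:

```python
import shlex

def ocrmypdf_command(src, dst, languages=("eng",), deskew=True):
    """Build an ocrmypdf command line: Tesseract language packs are
    joined with '+', and --deskew straightens crooked scans."""
    cmd = ["ocrmypdf", "-l", "+".join(languages)]
    if deskew:
        cmd.append("--deskew")
    cmd += [src, dst]
    return cmd

print(shlex.join(ocrmypdf_command("scan.pdf", "scan-ocr.pdf", ("eng", "rus"))))
```

The output PDF keeps the original page images with a searchable text layer underneath, which is what makes a bad scan usable for date-range searching.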
658
00:46:51,360 --> 00:46:52,519
new words emerge? I mean,
I can think of a few reasons,

659
00:46:52,519 --> 00:46:55,440
but I like to hear your version
of that. Well, in your business.

660
00:46:55,480 --> 00:46:58,639
You know why they emerge? In
the tech business, You get a

661
00:46:58,639 --> 00:47:01,239
new product, or there's a
new problem, and that new problem has

662
00:47:01,239 --> 00:47:06,320
a name, right, It's often
a silly name like zing boing or something.

663
00:47:06,480 --> 00:47:09,760
Well, what I'm talking about, for
example... oh, speaking

664
00:47:09,760 --> 00:47:14,519
of Elon Musk, what did he
create upstairs in the sky? He created

665
00:47:14,760 --> 00:47:20,760
um, Starlink. What is Starlink? When you see all those satellites together,

666
00:47:20,920 --> 00:47:23,400
it's a mega constellation. Right,
So that's a new word because he

667
00:47:23,440 --> 00:47:29,360
created something that didn't exist before a
mega constellation, and so you're you're creating

668
00:47:29,400 --> 00:47:32,599
new terms to describe new things.
Yeah, often it is, and it's

669
00:47:32,599 --> 00:47:37,559
easier with nouns, and it's easier
when you're not creating new senses for verbs,

670
00:47:37,559 --> 00:47:42,119
those are much harder, or new
senses for words, those are much

671
00:47:42,119 --> 00:47:46,320
harder to come across and to notice. But generally, it's about the

672
00:47:46,360 --> 00:47:52,119
advancement of ideas, culture, people. It's about things changing, and that's

673
00:47:52,119 --> 00:47:54,400
when the language has to change too. I learned so many great terms from

674
00:47:54,400 --> 00:47:59,880
listening to your show, like voluntold is a great one,

675
00:48:00,400 --> 00:48:02,880
has been voluntold. Anybody who's
worked in the military knows that one.

676
00:48:04,719 --> 00:48:08,480
That's a great word and it doesn't
require explanation. You know. So many

677
00:48:08,519 --> 00:48:15,760
of our three letter acronyms TLAs are
just so baffling to people who

678
00:48:15,800 --> 00:48:21,559
aren't in the business, that we
have to constantly stop and define what what

679
00:48:21,679 --> 00:48:23,480
we're talking about. Yeah, we
put a show in the can this week

680
00:48:23,800 --> 00:48:29,000
that you'll hear in a couple weeks. Martha and I talked with a caller

681
00:48:29,039 --> 00:48:32,440
who was telling us about working with
Aussies, people from Australia, and he was

682
00:48:32,480 --> 00:48:38,880
talking about how mysterious their language was
at first. And so sometimes the language

683
00:48:38,960 --> 00:48:45,360
changes just because you have this natural
friction with other groups of people, so

684
00:48:45,440 --> 00:48:50,039
you you pick up how they speak, you know, and it becomes part

685
00:48:50,039 --> 00:48:57,840
of your speech. So language change
is about osmosis, you know, growth

686
00:48:57,920 --> 00:49:01,639
between different groups of people having to
accommodate each other. Accommodation is a real

687
00:49:02,079 --> 00:49:07,880
big part of new language, an
adoption of other people's speech or adoption of

688
00:49:07,920 --> 00:49:12,920
their ways of living. So,
for example, on the border between Brazil

689
00:49:13,039 --> 00:49:16,480
and all the Spanish speaking countries around
it, there are varieties of language known

690
00:49:16,559 --> 00:49:22,039
as Portuñol, which is a mix
of Portuguese and Spanish, but they're different

691
00:49:22,119 --> 00:49:28,639
depending on which border you're at.
Each one of them has kind of

692
00:49:28,840 --> 00:49:31,760
created a new creole, you know, just like the varieties of Spanglish

693
00:49:31,920 --> 00:49:37,280
spoken along the Mexico US border are
a little different. So Miami Spanglish is

694
00:49:37,320 --> 00:49:45,320
different from Texas Spanglish. Yeah,
just because the English is brushing up against different kinds

695
00:49:45,320 --> 00:49:49,599
of Spanish. They're just varied
when you're banging against different cultures. Like

696
00:49:49,639 --> 00:49:52,559
if you think about the countries
on the northern side of Brazil, like

697
00:49:52,639 --> 00:49:57,880
Suriname and Guyana. They've got some
strong French influences too, exactly. Yeah,

698
00:49:58,000 --> 00:50:00,760
it's the same reason that Québécois
French is different from say,

699
00:50:01,000 --> 00:50:06,400
uh, I don't know Martinique French, and they're going to Caribbean for sure.

700
00:50:07,159 --> 00:50:08,880
The other thing I wanted to say
is like in the tech world,

701
00:50:09,320 --> 00:50:15,559
so the jargon of say, people
who work in cloud services versus people who

702
00:50:15,679 --> 00:50:20,840
do frontline support. They're both
in tech, but they're different flavors of

703
00:50:20,880 --> 00:50:22,760
tech, so they're going to have
some overlap, but they're still gonna have

704
00:50:22,800 --> 00:50:25,800
different lingo. They might have different
lingo for the same stuff. You know,

705
00:50:25,840 --> 00:50:29,880
how they talk about customers for example, or how they talk about service

706
00:50:29,920 --> 00:50:34,719
calls. Yeah. Yeah, I'm
still not convinced that what's happening with the

707
00:50:34,800 --> 00:50:37,199
large language models is going well enough
that it's going to continue. We

708
00:50:37,599 --> 00:50:43,400
may be thrown into a winter,
like where I think the whole thing with

709
00:50:43,559 --> 00:50:47,639
Bing was a PR play. I
think that the most important part of the

710
00:50:47,679 --> 00:50:51,880
case is that what stimulated all this
was one hundred million users signing up for

711
00:50:51,960 --> 00:50:54,400
ChatGPT. You think it's a
money grab, Yeah, well, I

712
00:50:54,400 --> 00:50:57,760
think I always think it's a money
grab. I think it's a pr play.

713
00:50:57,840 --> 00:51:01,079
I think Bing was in the single
digits of use as a search engine,

714
00:51:01,360 --> 00:51:06,679
and then ChatGPT comes along and
we now have evidence, you know,

715
00:51:06,719 --> 00:51:10,159
they've made posts talking about how
they had been tinkering with a language

716
00:51:10,199 --> 00:51:15,320
model attached to a search engine for
a couple of years. They weren't happy

717
00:51:15,360 --> 00:51:17,719
with it. It wasn't good enough. And then ChatGPT comes out,

718
00:51:17,840 --> 00:51:22,159
signs up one hundred million users in
two months. They're like, that's more

719
00:51:22,199 --> 00:51:25,320
people than are using Bing. Let's
put it out there. And then Google

720
00:51:25,320 --> 00:51:29,000
freaks out and takes the bait.
I wonder how long it's going to be

721
00:51:29,000 --> 00:51:36,679
before Amazon Alexa has that sort of
contextual memory that it does not really currently

722
00:51:36,719 --> 00:51:39,039
have. Now, I don't know. Alexa has turned out to be a

723
00:51:39,119 --> 00:51:44,599
real money sink for Amazon. It's
something like ten billion dollars put into it,

724
00:51:44,639 --> 00:51:46,800
and it looks like a failing portion
of their business. Well all of

725
00:51:49,199 --> 00:51:52,920
those voice bots are in the same boat. I mean, it's coming into twenty

726
00:51:52,920 --> 00:51:55,199
twenty three where we have a threat
of an economic downturn. I don't know

727
00:51:55,280 --> 00:51:59,119
we actually have one. We have a
threat of one. It's very stylish

728
00:51:59,199 --> 00:52:02,440
for tech companies to lay people off
and so forth. They're clearly scrutinizing their

729
00:52:02,480 --> 00:52:07,199
businesses and signaling they're going
to cut things that aren't working. And

730
00:52:07,360 --> 00:52:14,360
both Google Home and their voice device
and Amazon Alexa both were supposed to make

731
00:52:14,440 --> 00:52:17,400
money and didn't to the tune of
billions, and so they're making

732
00:52:17,440 --> 00:52:21,679
it noisy. I don't know if
they've actually done anything, but they've made

733
00:52:21,719 --> 00:52:24,360
noises about winding it down. Yeah, that's exactly what I've been reading, because

734
00:52:24,519 --> 00:52:29,800
from my perspective as somebody who hosts
a radio show and podcasts, I'm always

735
00:52:29,840 --> 00:52:35,480
interested in how many listeners can tune
into my show on those devices. And

736
00:52:35,519 --> 00:52:39,360
it's incredibly hard to make it happen. Yeah, because, unfortunately, my

737
00:52:39,440 --> 00:52:43,880
show is called A Way with Words,
and there are a zillion songs, and

738
00:52:44,039 --> 00:52:50,159
other podcasts either have episodes or portions
of their programming called A Way with Words.

739
00:52:50,320 --> 00:52:53,159
Yeah, and the wrong thing comes
up. Dude, we have a podcast that starts

740
00:52:53,159 --> 00:53:00,519
with a period. Oh God.
It's really hard. Anyway, it's

741
00:53:00,519 --> 00:53:05,639
surprising the number of people who create new
shows constantly called A Way with Words without checking

742
00:53:05,800 --> 00:53:08,159
to see if there's another show already
called that. People just do not

743
00:53:08,280 --> 00:53:12,480
think yeah, yeah, it's not
good for them either. I mean,

744
00:53:12,840 --> 00:53:15,199
so no, I find it.
You have to be very explicit with them,

745
00:53:15,239 --> 00:53:19,960
even today. You have
to say, play the latest episode of

746
00:53:19,960 --> 00:53:23,159
the podcast A Way with Words with Grant
Barrett and Martha Barnette, and then it

747
00:53:23,199 --> 00:53:27,840
will. But yeah, people don't do that.
No, they don't do that, because speech

748
00:53:27,880 --> 00:53:30,719
is supposed to be easy. I
would say it's supposed to be efficient.

749
00:53:30,840 --> 00:53:34,800
I'm supposed to be able to say
very little and, you know, you

750
00:53:35,239 --> 00:53:37,559
should understand what I'm talking about. How far do you think we are

751
00:53:37,719 --> 00:53:43,519
from the way it works in movies
or Star Trek to speaking to the computer

752
00:53:43,559 --> 00:53:45,920
and having it all be understood.
I think we can't even see our way

753
00:53:45,920 --> 00:53:52,960
there right now because we don't have
real understanding coming from the machine of intent

754
00:53:52,159 --> 00:53:55,679
or any of those things. I
think it's almost like we're in the same

755
00:53:55,719 --> 00:54:00,960
space that Tesla is with the Autopilot. You've made something that feels like it

756
00:54:00,960 --> 00:54:05,079
should exist. It has science fiction
context, so people want to believe it exists,

757
00:54:05,159 --> 00:54:08,559
and so you ignore how very broken
it actually is. Right, That's

758
00:54:08,559 --> 00:54:13,440
exactly it. You put into words
what I was trying to say.

759
00:54:13,599 --> 00:54:15,920
that's exactly how I feel about it, And I wonder if it won't

760
00:54:15,920 --> 00:54:20,679
be thrown into a winter because it
deserves to be. I like that

761
00:54:20,760 --> 00:54:23,400
expression too. Thrown into a winter
where it just everything chills for a while.

762
00:54:23,480 --> 00:54:25,880
Yeah, put it on ice.
It's kind of like what happened with

763
00:54:25,920 --> 00:54:30,960
electric cars. Electric cars were big
at the beginning of the automobile

764
00:54:30,000 --> 00:54:34,559
era, and how long did they
lay in the deep freeze? Yeah, for

765
00:54:35,079 --> 00:54:37,280
decades and then they you know,
a little a little poke up in the

766
00:54:37,320 --> 00:54:40,880
eighties, in the seventies when with
the oil crisis, and then they went

767
00:54:40,880 --> 00:54:44,800
away again, and a little poke
up in the nineties, and then it

768
00:54:44,840 --> 00:54:47,679
went away again, until the
crazy billionaire showed up. People gonna yell

769
00:54:47,679 --> 00:54:51,079
at me for saying lay instead of
lie. Just so you know, here

770
00:54:51,079 --> 00:54:57,320
you go. But, Grant, you
know, it's contextual to us because of

771
00:54:57,360 --> 00:55:01,199
the artificial intelligence winters, right? Artificial intelligence first comes up in the fifties

772
00:55:01,199 --> 00:55:06,760
with Marvin Minsky working for the military. They get some stuff to work,

773
00:55:06,760 --> 00:55:10,599
but their promises are way out of
scope and so winter. Then it reappears

774
00:55:10,639 --> 00:55:15,639
as the expert systems in the seventies
and they build these thirty thousand dollar computers

775
00:55:15,639 --> 00:55:20,639
and sold several thousand of them.
But that's about it. Winter then in

776
00:55:20,679 --> 00:55:22,840
the nineties with the decision tree systems
and deep Blue and so forth, and

777
00:55:22,920 --> 00:55:28,519
playing chess, and it can't get
further than that winter. And then it

778
00:55:28,559 --> 00:55:31,599
comes up again around twenty eleven
with Geoffrey Hinton's model and ImageNet

779
00:55:31,599 --> 00:55:36,639
and vision recognition systems, and that's
the tail we're in right now.

780
00:55:36,719 --> 00:55:39,679
Hinton's now saying, here comes another
winter. But you know the big

781
00:55:39,760 --> 00:55:43,280
change now is we have the cloud. So with this huge amount of compute

782
00:55:43,320 --> 00:55:49,159
hanging around and companies highly motivated to
find work for their clouds. And the

783
00:55:49,159 --> 00:55:52,639
kicker to all this
OpenAI stuff and ChatGPT is that these things

784
00:55:52,800 --> 00:55:58,360
all consume cloud. I have a
story for you, Grant. Do you

785
00:55:58,400 --> 00:56:04,719
remember a program back when we were
all young called Eliza. Yes, I

786
00:56:04,760 --> 00:56:07,840
played with Eliza. I had the
Mac version. Yeah, yeah, the

787
00:56:07,960 --> 00:56:13,320
Rogerian psychotherapist. You basically would ask
it a question and it would answer with

788
00:56:13,400 --> 00:56:20,360
a question that contained parts or was
triggered by your question. Yes. Yeah,

789
00:56:20,360 --> 00:56:22,480
you can still find it online.
Yeah, I'll include a link to

790
00:56:22,519 --> 00:56:27,159
an online one, so you would
say a sentence that included my brother,

791
00:56:27,480 --> 00:56:30,719
right, and it picked up on
those keywords and said, tell me more

792
00:56:30,760 --> 00:56:34,519
about your family. And my first
experience of this was, oh, my

793
00:56:34,639 --> 00:56:37,760
god, it's brilliant. It's a
genius. But here's the thing. The

794
00:56:37,840 --> 00:56:43,960
guy who wrote that originally wrote it
as a goof against the term artificial intelligence

795
00:56:43,960 --> 00:56:47,719
to prove that it wasn't intelligent.
It was simply doing a lookup of keywords

796
00:56:47,800 --> 00:56:54,239
and responding with responses appropriate for those
keywords. And everybody, kind of including

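The keyword-lookup trick Carl describes fits in a dozen lines. A toy sketch, not Weizenbaum's actual rule set (the original used ranked keywords and pronoun-swapping transforms), with an invented rule table:

```python
# Keyword -> canned response, in the spirit of ELIZA's Rogerian script.
RULES = {
    "brother": "Tell me more about your family.",
    "mother": "Tell me more about your family.",
    "always": "Can you think of a specific example?",
}
DEFAULT = "Please go on."

def eliza_reply(sentence):
    """Scan the sentence for the first known keyword; no understanding,
    just a lookup that feels conversational."""
    for word in sentence.lower().split():
        if word.strip(".,?!") in RULES:
            return RULES[word.strip(".,?!")]
    return DEFAULT

print(eliza_reply("My brother hates me."))  # Tell me more about your family.
```

The whole illusion rests on that table: there is no model of the conversation at all, which was exactly Weizenbaum's point.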
797
00:56:54,280 --> 00:56:59,119
me as a kid, I was
blown away by this, and I actually

798
00:56:59,159 --> 00:57:02,360
rewrote it in visual Basic a long
time ago, and I think I think

799
00:57:02,400 --> 00:57:05,840
I might have even done an online
version of it. But but I was

800
00:57:05,880 --> 00:57:08,519
just fascinated by it. You know, from a consumer standpoint, it looked

801
00:57:08,559 --> 00:57:13,079
like magic to me. Yeah,
people want to believe these things exist,

802
00:57:13,239 --> 00:57:22,880
right between pareidolia and anthropomorphization, Like
we're looking for faces and we cast humanity

803
00:57:22,920 --> 00:57:25,360
on things that aren't human. I'm
not the first to say this, but

804
00:57:25,400 --> 00:57:30,440
we are terribly bad at pattern matching, and we just think we're fantastic.

805
00:57:30,519 --> 00:57:37,119
Yeah right, the pattern we match is, I'm a genius. But I am,

806
00:57:37,239 --> 00:57:39,400
Yeah, but in your case,
we're making an exception. But no,

807
00:57:39,440 --> 00:57:43,880
I'm not. I always say that
like I'm somewhere in the middle,

808
00:57:43,960 --> 00:57:46,840
just a little bit above average.
Yeah. And

809
00:57:46,840 --> 00:57:51,559
the other thing we know about humans
is when they don't understand something or there's

810
00:57:51,639 --> 00:57:54,639
ambiguity, we make stuff up.
Yeah, and we're scared and we're scared.

811
00:57:54,840 --> 00:57:59,000
Yeah, yeah, we make stuff
up and it always blames anyone but

812
00:57:59,199 --> 00:58:02,320
us. Right, So go read
a book or a bunch of books.

813
00:58:02,480 --> 00:58:06,800
Yeah, let me ask you guys
a question while I have here. What

814
00:58:06,880 --> 00:58:07,960
do you what are you working on? What do you what do you do

815
00:58:08,079 --> 00:58:12,000
besides this podcast? Well I could
tell you. Well, first of all,

816
00:58:12,079 --> 00:58:15,800
Richard's working on a book about the
history of dot net. Okay,

817
00:58:15,000 --> 00:58:19,159
Microsoft's. You probably know what dot net
is. Yeah, yeah, yeah,

818
00:58:19,239 --> 00:58:22,920
Well I only have the like open
source version of it installed. But yeah,

819
00:58:22,079 --> 00:58:25,360
right, so it goes back to
being a Windows only product back in

820
00:58:25,440 --> 00:58:31,199
the two thousands and went all the
way through tablets and phones and open source.

821
00:58:32,360 --> 00:58:35,800
And he can tell you more about
that. But I'm basically, um

822
00:58:36,119 --> 00:58:42,920
a web developer, and I've got
a couple of YouTube video series, one on

823
00:58:43,280 --> 00:58:49,280
Blazor, which is their state
of the art web UI technology, and

824
00:58:49,400 --> 00:58:54,239
another one on MAUI, which is
their mobile technology. So my job

825
00:58:54,400 --> 00:59:00,000
is essentially to come up with good
examples of how to use this stuff every

826
00:58:59,840 --> 00:59:01,760
week, do some research, write
some code, and then do a video

827
00:59:01,760 --> 00:59:06,360
about it. So this is
my wheelhouse. I love it. Oh

828
00:59:06,400 --> 00:59:09,119
that's fantastic. I'm working on a
book too. It's on the

829
00:59:09,159 --> 00:59:15,320
word suck, S U C K, suck. Is there a whole book there?

830
00:59:15,639 --> 00:59:17,000
Yeah. Yeah, well it'll be
a short one, a mere thirty thousand words,

831
00:59:17,079 --> 00:59:21,159
but yeah, it
will be, like, on the

832
00:59:21,199 --> 00:59:22,920
figurative uses of it, like you
suck or that sucks. No, no,

833
00:59:23,000 --> 00:59:28,039
we have a version
of it that's, um, a noun.

834
00:59:28,880 --> 00:59:30,360
Right, the big
suck, right? Well, we

835
00:59:30,400 --> 00:59:35,079
turn down the suck and we turn
up the awesome in our production, in

836
00:59:35,079 --> 00:59:37,320
our production that's yeah, yeah,
exactly, Yeah, Yeah, we have

837
00:59:37,360 --> 00:59:39,800
the suck now. Yeah, I
will be addressing that now, Yes,

838
00:59:39,920 --> 00:59:45,960
turn it down. I appreciate that. Grant. It's been an absolute pleasure

839
00:59:45,960 --> 00:59:50,480
to talk to you and keep in
touch, will you? Yeah, absolutely,

840
00:59:50,480 --> 00:59:52,000
thanks for having me on. This
was a great conversation. It's nice to

841
00:59:52,559 --> 00:59:57,400
flex the tech brain a little bit. Yeah, and again, you know,

842
00:59:57,559 --> 01:00:00,599
it was a really good intersection of
all these things and at the right

843
01:00:00,599 --> 01:00:02,719
time, I think. So thanks
again and we'll see you on A Way

844
01:00:02,760 --> 01:00:06,559
with Words. All right, Thank
you, take care. Well, you guys too.

845
01:00:06,599 --> 01:00:08,400
All right, bye-bye. We'll talk
to you, dear listener next time on

846
01:00:08,599 --> 01:00:34,719
dot net rocks. Dot net Rocks
is brought to you by Franklins dot Net and

847
01:00:34,880 --> 01:00:39,199
produced by PWOP Studios, a full
service audio, video and post production facility

848
01:00:39,480 --> 01:00:45,440
located physically in New London, Connecticut, and of course in the cloud online

849
01:00:45,440 --> 01:00:51,559
at pwop dot com. Visit our
website at dot net rocks dot

850
01:00:51,599 --> 01:00:55,960
com for RSS feeds, downloads,
mobile apps, comments, and access to

851
01:00:57,000 --> 01:01:00,679
the full archives going back to show
number one, recorded in September two thousand

852
01:01:00,719 --> 01:01:05,159
and two. And make sure you
check out our sponsors. They keep us

853
01:01:05,159 --> 01:01:08,599
in business. Now go write some
code. See you next time.