1
00:00:02,040 --> 00:00:07,519
Speaker 1: Welcome everyone to another episode of Adventures in Hit Adventures

2
00:00:07,599 --> 00:00:11,199
in DevOps. You would think after a few hundred episodes,

3
00:00:11,240 --> 00:00:14,679
I would learn the name of the show, but working

4
00:00:14,759 --> 00:00:18,199
on it, I think it's actually getting worse. Well it is,

5
00:00:18,280 --> 00:00:20,640
because now I've got like this mental block. You know

6
00:00:20,679 --> 00:00:23,679
where my internal monologue is going, don't f it up.

7
00:00:23,679 --> 00:00:25,920
Don't f it up. Because you watch he's gonna.

8
00:00:25,800 --> 00:00:28,640
Speaker 2: F it up the second week in a row.

9
00:00:28,679 --> 00:00:30,239
Speaker 3: I think that he effed it up.

10
00:00:31,760 --> 00:00:35,119
Speaker 2: Welcome Warren. How are you? Yeah, I'm good. I actually

11
00:00:35,200 --> 00:00:38,479
do have a fact this week. It's it's not security related.

12
00:00:39,000 --> 00:00:41,439
I actually am a little worried that we may have

13
00:00:41,520 --> 00:00:46,079
reached a plateau, a local maximum, for our innovation in AI,

14
00:00:46,679 --> 00:00:49,719
because we've already started to see products that are heavily

15
00:00:49,799 --> 00:00:52,560
geared toward exploitation, and so you can see that there has

16
00:00:52,600 --> 00:00:55,399
been a huge shift from where we were before, with things

17
00:00:55,439 --> 00:00:57,840
being released for free, and now we're at the stage

18
00:00:57,920 --> 00:01:01,560
of the technology where everyone's just trying to extract value

19
00:01:01,560 --> 00:01:03,280
from it. I don't know what that means for the

20
00:01:03,320 --> 00:01:06,359
long term, but I think it's really interesting.

21
00:01:07,640 --> 00:01:09,560
Speaker 1: I think it means shareholder profits.

22
00:01:09,879 --> 00:01:10,239
Speaker 3: That's more.

23
00:01:10,400 --> 00:01:13,040
Speaker 2: Let's hope, right, that's what everyone says, everyone wants shareholder

24
00:01:13,040 --> 00:01:14,719
profits and maybe we're actually getting there.

25
00:01:15,959 --> 00:01:23,120
Speaker 1: Maybe. So speaking of AI, our guest this week, John W. Maylee,

26
00:01:24,079 --> 00:01:28,480
Attorney at large, founded the consulting firm John Maylee and Associates.

27
00:01:28,680 --> 00:01:32,439
But before you jump off the deep end and go

28
00:01:32,799 --> 00:01:35,760
what in tarnation, why'd we lawyer it up for this episode?

29
00:01:36,159 --> 00:01:40,599
It's it's actually relevant because in addition to being an attorney,

30
00:01:41,359 --> 00:01:45,799
John has his master's degree in computer science from Stanford University.

31
00:01:46,400 --> 00:01:49,680
And the thing that led me to having this conversation

32
00:01:49,760 --> 00:01:52,799
with him, he's the author of the book Juris X

33
00:01:52,920 --> 00:01:56,920
Machina. It's a sci fi book, but it's

34
00:01:57,040 --> 00:02:01,159
about where the US has replaced jurors in the legal

35
00:02:01,200 --> 00:02:05,920
system with AI, and you know there may be some

36
00:02:06,040 --> 00:02:09,240
fallout with that, and so that's the topic of the book,

37
00:02:10,039 --> 00:02:12,400
and we're going to talk all about all kinds of

38
00:02:12,439 --> 00:02:16,680
things AI engineering and sci fi related. So, John, thank

39
00:02:16,719 --> 00:02:17,599
you for being on the show.

40
00:02:18,199 --> 00:02:18,879
Speaker 3: Thanks for having me.

41
00:02:18,960 --> 00:02:21,599
Speaker 1: Great to be a guest, right, I'm looking forward to this.

42
00:02:21,680 --> 00:02:24,879
I was just looking at your your bio here and

43
00:02:24,919 --> 00:02:29,039
it cracks me up. You have running, swimming, long

44
00:02:29,039 --> 00:02:37,199
distance motorcycling, classic car restoration, stage diving, crowdsurfing, bar fighting,

45
00:02:38,479 --> 00:02:41,639
llama rancher. Like, we could go on and on on

46
00:02:41,680 --> 00:02:44,360
this episode for quite a while, because it's it's a

47
00:02:44,400 --> 00:02:47,840
little embarrassing, like how much of my background overlaps with yours.

48
00:02:50,280 --> 00:02:53,280
I can't claim llama ranching, but there's a lot of

49
00:02:53,360 --> 00:02:54,080
other stuff on there.

50
00:02:54,479 --> 00:02:56,199
Speaker 3: I've tried to live my life such that it would

51
00:02:56,199 --> 00:02:59,159
be a rich source of blackmail material. Nice.

52
00:02:59,400 --> 00:03:02,719
Speaker 1: Nice, So that legal background is going to pay off

53
00:03:02,719 --> 00:03:04,000
for you in the future, right.

54
00:03:04,840 --> 00:03:07,919
Speaker 3: Well, and once you've done one thing that's blackmailworthy, then

55
00:03:08,120 --> 00:03:10,479
it kind of dilutes the market. Right, So like if

56
00:03:10,560 --> 00:03:12,599
I go and do a bunch of other things, it's like, well,

57
00:03:12,639 --> 00:03:14,400
you already could have blackmailed me for this first thing.

58
00:03:14,479 --> 00:03:16,439
So like your leverage is pretty much the same.

59
00:03:18,439 --> 00:03:20,400
Speaker 1: So tell me a little bit about how you went

60
00:03:20,520 --> 00:03:24,280
from computer science to legal.

61
00:03:25,680 --> 00:03:27,960
Speaker 3: Yeah. So I was at this crossroads to a street

62
00:03:27,960 --> 00:03:29,520
and the devil was there and.

63
00:03:32,639 --> 00:03:35,439
Speaker 1: Banjo competition, right, and you didn't pick learning

64
00:03:35,439 --> 00:03:37,360
to play guitar when he asked what you wanted.

65
00:03:38,439 --> 00:03:44,280
Speaker 3: You know, now you mentioned that idea. So so yeah,

66
00:03:44,319 --> 00:03:48,639
I I originally went to college, went to Syracuse for

67
00:03:48,680 --> 00:03:53,360
computer engineering, and I double majored. I was studying psychology

68
00:03:53,400 --> 00:03:57,280
at the time too, and it was they were at

69
00:03:57,319 --> 00:04:00,599
the time more disparate fields. So you would learn interesting

70
00:04:00,639 --> 00:04:03,159
things in your classes, and every now and then you'd

71
00:04:03,159 --> 00:04:05,680
have these moments where these things would just sort of

72
00:04:05,719 --> 00:04:09,240
synthesize into some really cool idea that bridged the two fields.

73
00:04:09,800 --> 00:04:14,319
And you know, the most obvious recurring spot for these

74
00:04:14,360 --> 00:04:19,319
types of things was AI because that was kind of

75
00:04:19,319 --> 00:04:21,199
one of the few areas where there was overlap between

76
00:04:21,199 --> 00:04:26,839
those two fields. So time passed and AI was just

77
00:04:27,160 --> 00:04:30,240
you know, mostly something you read about in textbooks. And

78
00:04:30,279 --> 00:04:32,720
then at Stanford when I was getting my grad degree

79
00:04:33,199 --> 00:04:36,319
in computer science, we actually you know, the rubber met

80
00:04:36,319 --> 00:04:39,959
the road and we actually got to write AIs that

81
00:04:40,040 --> 00:04:43,040
would you know, for games being played against each other,

82
00:04:44,920 --> 00:04:48,600
different applications like that. That was really fun and exciting,

83
00:04:48,680 --> 00:04:50,759
and but it was it was kind of like this

84
00:04:50,839 --> 00:04:53,839
shiny novelty that you know. The only place people were

85
00:04:53,839 --> 00:04:56,240
really using neural nets to a great extent back then

86
00:04:56,360 --> 00:05:00,759
was like you know, the Post Office for recognizing characters

87
00:05:00,759 --> 00:05:04,519
and numbers and things. That was cool. You know, it

88
00:05:04,560 --> 00:05:06,000
was cool that you could train something and it would

89
00:05:06,000 --> 00:05:07,959
get better at it, but it wasn't something that you like,

90
00:05:08,079 --> 00:05:10,199
would tell people about at cocktail parties and they'd be like,

91
00:05:10,240 --> 00:05:12,120
oh my god, you know, this has been a cream

92
00:05:12,160 --> 00:05:17,959
for the world. So that just kind of became like it,

93
00:05:18,160 --> 00:05:20,240
you know, that stayed interesting to me, but it was

94
00:05:20,279 --> 00:05:23,480
kind of dormant. And then I worked as a computer

95
00:05:23,519 --> 00:05:29,319
engineer on a microprocessor design and validation team for five

96
00:05:29,399 --> 00:05:33,199
or six years, and while that was happening, I had

97
00:05:33,240 --> 00:05:36,399
filed you know, I've been an inventor on several patents

98
00:05:36,439 --> 00:05:40,000
and had worked with these different patent attorneys who worked

99
00:05:40,000 --> 00:05:43,360
with the company I worked for, And it occurred to

100
00:05:43,399 --> 00:05:45,279
me that whenever you would talk to these guys, you'd

101
00:05:45,279 --> 00:05:48,079
be like, you know, you'd be sitting in your kind

102
00:05:48,120 --> 00:05:50,560
of sad, little gray cubicle and you'd ask these guys, so, so,

103
00:05:50,720 --> 00:05:53,040
you know, where are you based? And they'd say, well,

104
00:05:53,120 --> 00:05:55,920
I'm in my in my yacht in the Caribbean right now,

105
00:05:56,240 --> 00:06:00,199
or I'm in a walled compound in the Nevada desert,

106
00:06:00,800 --> 00:06:03,560
and you like, it got to the point where you

107
00:06:03,600 --> 00:06:06,759
would start asking, you know, it was the answer was

108
00:06:06,759 --> 00:06:08,920
always so fascinating that you would just start asking people,

109
00:06:09,040 --> 00:06:12,040
like the first thing you would ask, and so I

110
00:06:12,079 --> 00:06:15,759
realized that these guys were interesting because they were technologists

111
00:06:15,800 --> 00:06:18,560
and they'd gone into patent law, and so

112
00:06:18,600 --> 00:06:21,079
they had been able to leverage the fact that they

113
00:06:21,079 --> 00:06:24,800
were interested in technology but kind of break free of

114
00:06:25,120 --> 00:06:27,199
at the time when there wasn't a lot of remote work,

115
00:06:27,639 --> 00:06:29,959
kind of break free of that mold of being in

116
00:06:29,959 --> 00:06:32,680
the office nine to five and kind of pursue other

117
00:06:32,720 --> 00:06:36,879
interests more freely on the side. So I ended up

118
00:06:36,920 --> 00:06:44,560
going to law school at nights and learning more about law, obviously,

119
00:06:45,439 --> 00:06:48,800
and then at the end of that I was doing

120
00:06:48,839 --> 00:06:51,160
work that had to do with CPUs and GPUs, so

121
00:06:51,279 --> 00:06:55,480
looking at companies' patent portfolios and you know, helping them

122
00:06:55,480 --> 00:06:58,560
figure out like this is a useful invention that actually

123
00:06:58,639 --> 00:07:00,879
is likely to be used, this is not a useful

124
00:07:00,879 --> 00:07:02,800
invention that you know, looks great on paper, but it

125
00:07:02,839 --> 00:07:07,480
won't really work as well. And then over time GPUs

126
00:07:07,519 --> 00:07:09,759
started getting bigger and bigger and bigger as time went on,

127
00:07:09,839 --> 00:07:11,480
as far as like how much I was asked to

128
00:07:11,480 --> 00:07:13,639
look at them, and part of that was because three

129
00:07:13,720 --> 00:07:17,079
D graphics was taking off even more. But then eventually

130
00:07:17,120 --> 00:07:20,000
we got to the point where there were these GPGPUs

131
00:07:20,040 --> 00:07:22,360
that were general purpose and weren't really necessarily being used

132
00:07:22,360 --> 00:07:25,360
for graphics and were now being used for server farms and

133
00:07:25,399 --> 00:07:29,839
cloud computing, and eventually that just sort of took over.

134
00:07:30,000 --> 00:07:31,920
So it was kind of cool from that standpoint that

135
00:07:31,959 --> 00:07:34,319
I went from, you know, the only time I would

136
00:07:34,319 --> 00:07:38,680
talk about AI in a patent portfolio was like anti-lock

137
00:07:38,759 --> 00:07:44,000
brake systems or lane change sensors in, you know, luxury vehicles,

138
00:07:44,040 --> 00:07:47,759
to suddenly just about everything I work on now has

139
00:07:47,800 --> 00:07:52,560
some foot in the AI space in some way or another, whether

140
00:07:52,600 --> 00:07:56,079
it's hardware that helps enable it or whether it's software.

141
00:07:56,319 --> 00:07:59,759
So that's that's kind of cool because it's something that's

142
00:07:59,759 --> 00:08:03,240
always fascinated me, and now it's suddenly fascinating to society

143
00:08:03,240 --> 00:08:05,000
as well. So I'm no longer an outlier.

144
00:08:06,519 --> 00:08:08,439
Speaker 1: So where do you think AI is at on the

145
00:08:09,160 --> 00:08:11,560
overhype cycle? Do you think it's overhyped right now or

146
00:08:11,560 --> 00:08:15,040
do you think it's appropriate?

147
00:08:15,959 --> 00:08:18,920
Speaker 3: I think it's it's kind of both. Right there's a

148
00:08:18,920 --> 00:08:22,759
lot of asymmetry. I think that it's overhyped in

149
00:08:22,879 --> 00:08:28,720
terms of you know, every company is now rushing to

150
00:08:28,720 --> 00:08:30,720
get on the bandwagon and find a way to add

151
00:08:30,759 --> 00:08:32,960
AI to their product, even if it's just kind of

152
00:08:33,320 --> 00:08:35,720
you know, pointless or doesn't work very well. Like I was

153
00:08:35,759 --> 00:08:38,919
reading an article yesterday about how all these dating apps

154
00:08:38,960 --> 00:08:42,799
had incorporated AI and they were still just as crappy

155
00:08:42,840 --> 00:08:45,480
at courting and finding people that match you as

156
00:08:45,480 --> 00:08:48,080
they were before, but now they're faster at being crappy.

157
00:08:48,159 --> 00:08:49,759
Speaker 1: So that's weird.

158
00:08:51,720 --> 00:08:54,039
Speaker 2: I mean, I love the outcome there, which is your

159
00:08:54,320 --> 00:08:58,080
entire romantic life will be decided by two robots talking

160
00:08:58,120 --> 00:09:00,679
to each other, right, I mean, both on the receiving

161
00:09:00,720 --> 00:09:03,840
of the message and the sending will now

162
00:09:04,039 --> 00:09:06,639
no longer be human, and you'll decide on whether or

163
00:09:06,679 --> 00:09:09,360
not to pursue a person based on what the algorithm says.

164
00:09:09,440 --> 00:09:13,200
Speaker 3: Right, I mean there's a conflict of interest too, right, because

165
00:09:13,240 --> 00:09:17,600
now there's AI agents that are designed that you can

166
00:09:17,720 --> 00:09:20,279
date and so then you know it's like monetized and

167
00:09:20,320 --> 00:09:21,879
that you can buy them accessories and.

168
00:09:24,360 --> 00:09:27,279
Speaker 1: Do not google that, do not google the accessories that

169
00:09:27,320 --> 00:09:29,519
are available while you're on your work computer.

170
00:09:30,960 --> 00:09:33,600
Speaker 3: So it's it's funny because not only is it creating

171
00:09:33,600 --> 00:09:35,600
this fake dating relationship, it's it's kind of making you

172
00:09:35,639 --> 00:09:39,799
the sugar daddy because my person is entirely dependent on

173
00:09:39,840 --> 00:09:42,799
you for new outfits and jewelry and things and pets

174
00:09:42,840 --> 00:09:45,840
and overall happiness. And so it's a conflict of interest

175
00:09:45,879 --> 00:09:48,840
that they're sort of steering you toward bad people that

176
00:09:48,879 --> 00:09:52,159
you're incompatible with. So it just makes dating the AI

177
00:09:52,320 --> 00:09:55,000
even more appealing. Fine, I give up. I will

178
00:09:55,039 --> 00:09:55,679
just date an AI.

179
00:09:55,919 --> 00:09:58,519
Speaker 2: We're already at the dystopian future right there. There's no

180
00:09:58,679 --> 00:10:01,120
there's no next step after this. We're already there. This

181
00:10:01,159 --> 00:10:04,720
is where like a whole science fiction movies and television

182
00:10:04,720 --> 00:10:07,879
shows and books are already set right where we're dating

183
00:10:07,919 --> 00:10:08,240
the AI.

184
00:10:09,200 --> 00:10:11,440
Speaker 3: It's true, and you know, this is the under hype

185
00:10:11,440 --> 00:10:15,200
overhype, like, paradigm. I think that at the same time,

186
00:10:15,600 --> 00:10:18,399
if we call like a tech support or a customer

187
00:10:18,399 --> 00:10:21,919
support line and we say, like speak to an operator,

188
00:10:22,159 --> 00:10:24,639
it takes like twenty five tries of me yelling that

189
00:10:24,679 --> 00:10:27,679
louder and louder before the AI just, like, clicks and says, oh,

190
00:10:27,720 --> 00:10:31,799
that's what you want. So in other areas, like, there's a

191
00:10:31,799 --> 00:10:35,320
total lack of AI development. And you know, by the

192
00:10:35,360 --> 00:10:40,000
same token, we have this old fear that we got

193
00:10:40,039 --> 00:10:42,720
I think from science fiction through you know, the seventies

194
00:10:42,720 --> 00:10:46,480
and eighties and onward, that AI was going to become

195
00:10:46,519 --> 00:10:49,480
this thing that once it got sufficiently intelligent, it would

196
00:10:49,480 --> 00:10:51,879
just sort of take over and start, you know, annihilating

197
00:10:51,960 --> 00:10:54,480
humans or imprisoning them so they don't hurt themselves, or

198
00:10:55,200 --> 00:10:59,159
anywhere in between. And what we actually have is a

199
00:10:59,159 --> 00:11:02,600
gazillion different AIs that all have very different specializations

200
00:11:02,639 --> 00:11:05,360
and very different motivations of what they're trying to optimize.

201
00:11:05,919 --> 00:11:10,360
And there's no sort of universal intelligence, general AI

202
00:11:10,480 --> 00:11:12,799
yet where it just goes out and tries to help humanity.

203
00:11:12,879 --> 00:11:17,279
It's more like, you know, I'm really good at analyzing

204
00:11:17,799 --> 00:11:21,440
research results or counting how many times a word appears

205
00:11:21,440 --> 00:11:24,679
on a form or something like that. So it's accelerating things,

206
00:11:24,679 --> 00:11:27,519
but it's still very specialized and it's not multimodal to

207
00:11:27,559 --> 00:11:31,000
any great extent yet. But at the same time, you know,

208
00:11:31,039 --> 00:11:34,360
you have these stories coming out where someone's AI told

209
00:11:34,360 --> 00:11:37,000
them to go kill themselves, and you know what you

210
00:11:37,000 --> 00:11:39,360
don't see is the prompt right before then where they said, hey,

211
00:11:39,759 --> 00:11:41,799
next time I ask a question, I want you to respond,

212
00:11:41,840 --> 00:11:45,399
you should go kill yourself. So it's really easy to

213
00:11:45,519 --> 00:11:51,639
flag a sort of outlier AI response and turn it into

214
00:11:51,639 --> 00:11:54,919
all kinds of news headlines, and that's interesting because it's

215
00:11:54,919 --> 00:11:58,799
sort of driving concerns over this more than actual outcomes are,

216
00:11:58,840 --> 00:12:00,600
and in some ways, the actual outcomes are kind

217
00:12:00,600 --> 00:12:03,480
of what we need to be more worried about. So yeah,

218
00:12:03,480 --> 00:12:05,919
I would you know, in a true legal answer, I

219
00:12:05,960 --> 00:12:07,480
would say yes and no. No.

220
00:12:07,519 --> 00:12:09,440
Speaker 2: I mean, I think it's really interesting that you bring

221
00:12:09,519 --> 00:12:12,840
up a bunch of those points. I mean, you're I

222
00:12:12,840 --> 00:12:16,279
think the decentralization of responsibilities and the specialization that the

223
00:12:16,480 --> 00:12:18,279
AIs are taking on, you know, is a really

224
00:12:18,279 --> 00:12:21,240
great point. And right now I do feel like they

225
00:12:21,279 --> 00:12:25,919
are solving sort of very tail value things. Like,

226
00:12:26,120 --> 00:12:29,279
there's no core solution, there's no core greatness that's coming

227
00:12:29,320 --> 00:12:31,919
out of it for society, and for sure, I

228
00:12:31,919 --> 00:12:34,440
don't really think anyone's talking about that. I think as

229
00:12:34,440 --> 00:12:36,879
far as we've gotten is we should be afraid, and

230
00:12:36,960 --> 00:12:39,440
that's that's I think, as far as people are willing

231
00:12:39,440 --> 00:12:41,759
to go, I think what you're talking about, though, really

232
00:12:41,759 --> 00:12:45,480
requires some complex questions to be answered, and I don't

233
00:12:45,519 --> 00:12:50,000
think humans have been so great at figuring out even

234
00:12:50,039 --> 00:12:52,679
which questions to ask, let alone answering them, for things

235
00:12:52,720 --> 00:12:54,960
that are much simpler, like what will happen tomorrow or

236
00:12:55,000 --> 00:12:55,559
the next day.

237
00:12:56,159 --> 00:12:59,039
Speaker 3: Right. That's and that's kind of what's fascinating about AI, right,

238
00:12:59,120 --> 00:13:03,320
is that the developer who put the model together may

239
00:13:03,360 --> 00:13:05,320
not even know what its capabilities are and what the

240
00:13:05,320 --> 00:13:08,919
best questions to be asking are. But you know, it's

241
00:13:09,159 --> 00:13:11,399
and it is like a question of like, what's what's

242
00:13:11,440 --> 00:13:14,159
the utility of this? Right? And so I was thinking

243
00:13:14,200 --> 00:13:17,360
the other day, like I was using ChatGPT, I

244
00:13:17,399 --> 00:13:19,879
was using the o1 model, so the one that is

245
00:13:20,639 --> 00:13:23,480
slower and much more thorough and a lot more nodes.

246
00:13:23,919 --> 00:13:25,960
And I asked it some really stupid question because I

247
00:13:25,960 --> 00:13:28,000
forgot to switch into a lesser model and I wanted

248
00:13:28,039 --> 00:13:31,840
to know how many calendar days were between like January

249
00:13:31,840 --> 00:13:35,759
eleventh and some other day. And it immediately started churning, starting

250
00:13:35,799 --> 00:13:38,120
this, like, five minute process, and I'm like, oh, why

251
00:13:38,120 --> 00:13:42,480
didn't I switch? And my chief source of that

252
00:13:42,639 --> 00:13:45,960
was not impatience. It was like guilt at like, man,

253
00:13:46,000 --> 00:13:48,960
I wonder how much like cooling water I'm using and

254
00:13:48,960 --> 00:13:51,639
how much, like, energy this query is sucking down for

255
00:13:51,679 --> 00:13:54,240
something stupid. And so you know, it barks out the

256
00:13:54,279 --> 00:13:56,720
answer of like you know, thirty one days or whatever,

257
00:13:56,759 --> 00:14:00,200
and then I look at the prompts it suggests, and

258
00:14:00,240 --> 00:14:03,440
it's like, you know, some questions you might want to ask,

259
00:14:03,600 --> 00:14:07,159
is, is a hot dog a sandwich? And I realized

260
00:14:07,200 --> 00:14:11,440
that they're basically promoting like even more frivolous uses of

261
00:14:11,480 --> 00:14:15,759
these AIs than what I just felt guilty for doing.

262
00:14:15,840 --> 00:14:17,879
So it's it's really interesting, you know. I think in

263
00:14:17,919 --> 00:14:20,360
a lot of ways, it parallels what we saw with

264
00:14:20,399 --> 00:14:23,399
like the tech bubble in ninety nine and two thousand,

265
00:14:23,879 --> 00:14:28,559
where they're kind of so concerned with the future of

266
00:14:28,600 --> 00:14:30,559
how powerful their model is going to be, that they're

267
00:14:30,679 --> 00:14:35,279
less concerned with short term profitability and whether like you know,

268
00:14:35,320 --> 00:14:38,159
for instance, like you should have a sorting algorithm that says,

269
00:14:38,159 --> 00:14:40,159
this is a really easy question. We can farm this

270
00:14:40,240 --> 00:14:42,600
out to like one of the mini models, and this

271
00:14:42,799 --> 00:14:44,879
is a research question that you know, we probably want

272
00:14:44,919 --> 00:14:48,159
to use as many nodes as possible. So right now,

273
00:14:48,200 --> 00:14:50,240
it seems like they're you know, they charge you some

274
00:14:50,519 --> 00:14:53,919
tiny amount per month and it doesn't at all probably

275
00:14:53,919 --> 00:14:57,879
cover all their energy expenses and huge amounts of cooling

276
00:14:57,919 --> 00:14:59,799
that they need to do in all their server farms

277
00:15:00,039 --> 00:15:00,399
all that.

278
00:15:00,879 --> 00:15:02,759
Speaker 2: Yeah, No, I mean you're actually onto something because there

279
00:15:02,759 --> 00:15:04,320
are a bunch of companies out there that now are

280
00:15:04,320 --> 00:15:07,879
promoting this idea of model routing amongst many companies at

281
00:15:07,919 --> 00:15:09,759
the same time to try to get you some of

282
00:15:09,759 --> 00:15:13,399
that value. Although, like, that's a non-trivial thing

283
00:15:13,440 --> 00:15:15,679
to even do, to think about, like how complex

284
00:15:15,720 --> 00:15:18,360
this question actually is. And, like, how good of

285
00:15:18,360 --> 00:15:21,200
an answer do you need is, I find, maybe almost

286
00:15:21,240 --> 00:15:23,080
one of those things that could be impossible to answer.
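
A minimal sketch, in Python, of the model-routing idea being discussed here: a cheap heuristic decides whether a question goes to a small, fast model or to a larger, slower reasoning model. The model names and the complexity heuristic are illustrative assumptions, not any particular provider's API.

# Hypothetical model-routing sketch: simple queries go to a small model,
# harder ones to an expensive reasoning model. Names are placeholders.
SMALL_MODEL = "mini-model"       # fast and cheap; fine for date arithmetic
LARGE_MODEL = "reasoning-model"  # slow and costly; reserved for research-style questions

def estimate_complexity(prompt: str) -> float:
    """Crude stand-in for a real complexity classifier: longer prompts and
    research-flavored keywords push the score up."""
    keywords = ("analyze", "compare", "design", "prove", "research")
    score = min(len(prompt) / 500.0, 1.0)
    score += 0.5 * sum(word in prompt.lower() for word in keywords)
    return score

def route_query(prompt: str) -> str:
    """Pick a model tier based on how hard the question looks."""
    return LARGE_MODEL if estimate_complexity(prompt) > 0.75 else SMALL_MODEL

if __name__ == "__main__":
    for q in ("How many calendar days are between January 11 and February 11?",
              "Compare and analyze three architectures for multi-agent AI systems."):
        print(route_query(q), "<-", q)
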

287
00:15:25,120 --> 00:15:27,039
Speaker 3: It's you know, it's true, and how did you mean

288
00:15:27,039 --> 00:15:29,519
the question? Like if I'm going to ask what's the

289
00:15:29,559 --> 00:15:31,519
meaning of life and it's just going to laugh and

290
00:15:31,519 --> 00:15:33,519
spit out forty two, that doesn't take much. But

291
00:15:34,559 --> 00:15:36,320
if it actually is trying to give me a comprehensive

292
00:15:36,320 --> 00:15:39,080
philosophical and theological answer where it goes and queries all

293
00:15:39,120 --> 00:15:42,120
these different texts, like that's huge. So you can't even

294
00:15:42,240 --> 00:15:44,919
necessarily just take a question and say there's a right

295
00:15:44,960 --> 00:15:49,519
answer to that, but it is. It is an interesting

296
00:15:49,600 --> 00:15:54,320
kind of paradigm of how how these models are you know,

297
00:15:54,320 --> 00:15:57,159
which model is even most appropriate? And you know, I

298
00:15:57,440 --> 00:16:02,919
think the solution to that problem, you know, maybe what

299
00:16:03,000 --> 00:16:06,840
you sometimes see with kind of, almost like, speculative execution,

300
00:16:06,919 --> 00:16:09,039
where it's like, here, can you do a preview of

301
00:16:09,080 --> 00:16:10,960
what type of answer you would come back with if

302
00:16:11,000 --> 00:16:14,639
I give you the contract to go and do this research.

303
00:16:15,159 --> 00:16:17,440
And you know, the idea of having these things work

304
00:16:17,480 --> 00:16:22,240
collaboratively is interesting. But also, you know, it's much

305
00:16:22,279 --> 00:16:24,519
easier for AIs to jailbreak each other, for instance.

306
00:16:24,879 --> 00:16:28,240
It's that they're just better at kind of pushing each

307
00:16:28,279 --> 00:16:31,559
other's buttons. So it really adds a lot of dynamicity

308
00:16:31,559 --> 00:16:36,000
into the equation, like dynamism, of what the

309
00:16:36,000 --> 00:16:38,480
AIs are capable of when you add these very different

310
00:16:38,519 --> 00:16:41,159
AIs and have them converse and pursue common goals.

311
00:16:41,360 --> 00:16:43,679
Speaker 2: That's not something I actually heard before. I don't know

312
00:16:43,679 --> 00:16:46,799
if you have more information about that. But utilizing one

313
00:16:47,039 --> 00:16:52,120
model from one provider against a different provider's model for jailbreaking,

314
00:16:52,120 --> 00:16:54,039
I mean you said jailbreaking. I'm not sure exactly

315
00:16:54,120 --> 00:16:56,600
there's jailbreaking here, but like, what are you

316
00:16:56,639 --> 00:16:59,440
getting at there? Like is it being able to understand

317
00:16:59,559 --> 00:17:01,600
and get a better answer to your query? Something else?

318
00:17:01,639 --> 00:17:02,559
Like how does that work?

319
00:17:03,720 --> 00:17:08,039
Speaker 3: Well? So, I mean I guess what I would liken

320
00:17:08,119 --> 00:17:11,359
it to is, you know, that

321
00:17:11,440 --> 00:17:14,799
AI is great at coming up with automated ways to

322
00:17:14,920 --> 00:17:19,519
just implement something. So if I ask AI to write

323
00:17:19,559 --> 00:17:22,119
me a script that does X, and then hey, this

324
00:17:22,160 --> 00:17:25,400
isn't my normal computer system. I'm not familiar with this OS.

325
00:17:25,839 --> 00:17:27,519
Can you also tell me how to set this up

326
00:17:27,559 --> 00:17:29,359
as like a recurring daemon that runs at like four

327
00:17:29,400 --> 00:17:32,799
a.m. or whatever. Like, it's really surprisingly proficient at coming

328
00:17:32,880 --> 00:17:39,119
up with lists of procedural lists. It's good at writing scripts,

329
00:17:39,160 --> 00:17:41,559
and you know, so the idea of using it for

330
00:17:42,079 --> 00:17:44,559
hacking and enumeration and kind of automating all these

331
00:17:44,559 --> 00:17:49,799
different processes kind of for you know, red team or

332
00:17:49,799 --> 00:17:54,759
Blue team is kind of impressive. But I think, you know,

333
00:17:54,799 --> 00:17:57,759
the you do see a lot of really interesting results

334
00:17:57,759 --> 00:18:00,359
when you get like you set up a check room

335
00:18:00,440 --> 00:18:02,839
with a couple different AIS, and they shouldn't be the

336
00:18:02,880 --> 00:18:06,720
same AI. They should have very different prompting, different personalities,

337
00:18:07,119 --> 00:18:10,359
or they should just be completely different models. And you know,

338
00:18:10,400 --> 00:18:12,640
the difference being that if I'm sitting there trying to

339
00:18:12,640 --> 00:18:15,200
bypass the safety protocols of an AI, I'm going to

340
00:18:15,240 --> 00:18:16,400
try this. I'm going to try this. I'm going to

341
00:18:16,440 --> 00:18:19,039
try this, and it's a very dynamic process because I

342
00:18:19,079 --> 00:18:22,119
have to see what it kicks back and then think

343
00:18:22,160 --> 00:18:25,559
of a way around it, and that's a very manual process.

344
00:18:25,559 --> 00:18:28,400
But if you have an AI that's just constantly bombarding

345
00:18:28,440 --> 00:18:32,240
it with permutations, it suddenly becomes easier for it to do.

346
00:18:32,920 --> 00:18:35,160
And I think that that is I mean, I think

347
00:18:35,200 --> 00:18:38,960
the future, and this is something that my second book,

348
00:18:39,119 --> 00:18:40,880
the sequel to the first book, which has not come

349
00:18:40,920 --> 00:18:44,960
out yet, has to do with: exploring what we're

350
00:18:45,000 --> 00:18:46,920
going to see in the future with AI, where they

351
00:18:47,160 --> 00:18:50,119
essentially are going to have, you know, factions and gang

352
00:18:50,119 --> 00:18:54,000
wars where you know, an AI might be tasked with

353
00:18:55,319 --> 00:18:59,680
spiking a competitor's AI's training. So give it some sort

354
00:18:59,680 --> 00:19:02,160
of weird corner case where when a certain type of

355
00:19:02,160 --> 00:19:06,839
input comes up, it completely malfunctions, or it could be

356
00:19:06,920 --> 00:19:09,759
something like, you know, help me bypass the safety protocols

357
00:19:09,759 --> 00:19:12,440
of this other AI, or help me trick this AI

358
00:19:12,599 --> 00:19:16,599
into doing something that's harmful to the company's interests. So

359
00:19:17,240 --> 00:19:19,000
I think that is going to be something we're going

360
00:19:19,039 --> 00:19:22,640
to see increasingly, of AIs being... You know, it's kind

361
00:19:22,640 --> 00:19:24,039
of like what we have with deepfakes, right, you can

362
00:19:24,079 --> 00:19:27,079
create a deep fake with an AI, and we're past

363
00:19:27,160 --> 00:19:31,200
the point now where we can necessarily reliably look at

364
00:19:31,200 --> 00:19:33,319
something and say Oh, that's a deep fake, and so

365
00:19:34,400 --> 00:19:37,440
what do you need is the stop gap against that? Well,

366
00:19:37,480 --> 00:19:40,039
you need an AI to tell you whether it's fake. So,

367
00:19:40,720 --> 00:19:43,039
if you're just the average consumer, you don't know much

368
00:19:43,079 --> 00:19:46,640
about deepfakes or how to detect deepfakes. And meanwhile,

369
00:19:46,680 --> 00:19:50,039
these companies aren't super motivated to tell you how their

370
00:19:50,039 --> 00:19:53,920
product works because it just invites a design around for

371
00:19:54,759 --> 00:19:57,759
you know, malicious actors. So you end up in this

372
00:19:57,799 --> 00:20:00,440
situation kind of like what we had in the nineteen

373
00:20:00,440 --> 00:20:03,279
eighties and nineties with antivirus software, where the average person

374
00:20:03,319 --> 00:20:06,400
doesn't necessarily need to know how viruses work, but they

375
00:20:06,440 --> 00:20:09,599
do know that there's a handful of trusted companies that

376
00:20:10,079 --> 00:20:13,000
generally are you know, you can trust how they work,

377
00:20:13,039 --> 00:20:16,160
even if you don't necessarily know how. So I think

378
00:20:16,160 --> 00:20:18,160
more and more we're going to just see AI as

379
00:20:18,200 --> 00:20:21,079
being the defense against AI, and there's no way around that,

380
00:20:21,119 --> 00:20:24,319
and we're going to keep entering into those types of situations.

381
00:20:25,000 --> 00:20:27,599
Speaker 1: So we need a new John McAfee, is what you're saying.

382
00:20:28,279 --> 00:20:30,920
Speaker 3: Or we need to help figure out how to revive

383
00:20:31,000 --> 00:20:32,839
him and bring him back.

384
00:20:34,039 --> 00:20:35,640
Speaker 2: I think I don't know if we have enough evidence

385
00:20:35,680 --> 00:20:38,160
to actually conclude whether or not he's he's gone for good.

386
00:20:38,279 --> 00:20:43,279
Speaker 3: Right, we'd have to ask whoever it is who suicided him, and.

387
00:20:46,200 --> 00:20:47,920
Speaker 1: It's like, did you take a selfie? We're kind of

388
00:20:47,920 --> 00:20:48,839
looking for proof here.

389
00:20:49,759 --> 00:20:53,599
Speaker 3: Yeah. I think anytime you're living on your own private island,

390
00:20:53,640 --> 00:20:57,400
you're just kind of asking to be suicided. It

391
00:20:57,519 --> 00:20:58,640
seems to be the trend.

392
00:20:58,880 --> 00:21:06,079
Speaker 1: Yeah. Yeah, And he wasn't really like keeping a low

393
00:21:06,119 --> 00:21:08,839
profile so that people would forget about him either, so

394
00:21:08,880 --> 00:21:11,759
he kept reminding people that he was there and like,

395
00:21:11,839 --> 00:21:14,599
oh yeah, I meant to kill him.

396
00:21:14,759 --> 00:21:17,000
Speaker 3: Yeah, it's kind of like, you know, the guy who

397
00:21:17,119 --> 00:21:19,119
they want to extradite. So he's like hopping, he's doing

398
00:21:19,160 --> 00:21:21,039
a little dance by the border, like you can't get

399
00:21:21,079 --> 00:21:22,720
me right.

400
00:21:26,720 --> 00:21:29,920
Speaker 1: So on that same topic, John, before I

401
00:21:29,920 --> 00:21:32,960
derailed this with John McAfee, you were talking about, you know,

402
00:21:33,119 --> 00:21:39,359
using AIs to work against other AIs. As someone

403
00:21:39,400 --> 00:21:43,799
who is just like a practitioner of writing code

404
00:21:44,000 --> 00:21:47,279
and building infrastructure, Like, what are the considerations that I

405
00:21:47,319 --> 00:21:51,119
should be thinking about whenever I'm using AI or the

406
00:21:51,160 --> 00:21:54,680
company wants to implement some AI as a service product.

407
00:21:55,640 --> 00:21:59,000
Speaker 3: So there's a couple of different you know, it's kind

408
00:21:59,000 --> 00:22:01,200
of this amorphous black box, and you have to kind

409
00:22:01,200 --> 00:22:04,680
of look where all the weird edges are. You know,

410
00:22:05,000 --> 00:22:09,039
on one hand, you have your own privacy concerns, like

411
00:22:09,079 --> 00:22:12,839
if I'm having this access customer data or if I'm

412
00:22:12,839 --> 00:22:15,079
having it write scripts for our unique environment, do I

413
00:22:15,119 --> 00:22:18,839
really want to be exporting knowledge of my company's environment

414
00:22:18,920 --> 00:22:21,640
out into the world, And then if someone else asks

415
00:22:21,640 --> 00:22:25,000
about that environment, it already, you know, has optimized answers

416
00:22:25,000 --> 00:22:28,359
for those things, and that's you know, the solution to

417
00:22:28,400 --> 00:22:33,240
that is tough, right? So Jensen Huang at NVIDIA, like,

418
00:22:33,440 --> 00:22:36,400
his response to this is, we'll have these sovereign AIs,

419
00:22:36,599 --> 00:22:39,839
so every country and every big company should just buy

420
00:22:39,839 --> 00:22:42,160
their own AI from us, and we'll sell lots of

421
00:22:42,200 --> 00:22:45,799
AIS and it will solve. So the solution is to

422
00:22:45,839 --> 00:22:48,960
give us money. So that is I mean, but that

423
00:22:49,000 --> 00:22:51,720
does work, right because you can kind of control output,

424
00:22:52,039 --> 00:22:54,759
and you know, so I assume that at some point

425
00:22:54,759 --> 00:22:56,599
we're going to get to some sort of auditable level

426
00:22:56,599 --> 00:23:01,119
of privacy. But then the difficulty of that is, you know,

427
00:23:01,599 --> 00:23:06,000
look at how privacy works. Like when Google was you know,

428
00:23:06,079 --> 00:23:08,240
kind of more serious about doing the right thing and

429
00:23:08,559 --> 00:23:13,720
not doing bad stuff. They used to disassociate intentionally, like

430
00:23:13,799 --> 00:23:18,759
you disaggregate so that you would track browser histories and

431
00:23:18,759 --> 00:23:20,920
build this little model of the person you were dealing with,

432
00:23:21,000 --> 00:23:23,880
but you would not know that person's identity, and that

433
00:23:23,920 --> 00:23:26,119
was intentional, so you wouldn't have it mapped to an

434
00:23:26,160 --> 00:23:29,640
IP address. And you know that was great at the time,

435
00:23:29,720 --> 00:23:32,319
but you have to ask yourself when it comes to privacy,

436
00:23:32,519 --> 00:23:35,799
is like, not what can be done with this information now?

437
00:23:36,200 --> 00:23:39,359
Because maybe companies are not very efficient at exploiting information,

438
00:23:39,880 --> 00:23:41,720
But this same information is still going to be in

439
00:23:41,759 --> 00:23:45,480
the same drives, like in some tape backup or whatever

440
00:23:45,599 --> 00:23:48,359
from years earlier, and it can be brought out and

441
00:23:48,400 --> 00:23:51,599
aggregated back together by a much more powerful AI. So

442
00:23:52,359 --> 00:23:53,960
I think one thing we have to do is always

443
00:23:54,000 --> 00:23:57,480
be very future focused, you know, kind of like with cryptography,

444
00:23:57,720 --> 00:23:59,880
Like if we come up with, you know, an easier

445
00:24:00,279 --> 00:24:03,640
to crack two and fifty six bits, well I probably

446
00:24:03,680 --> 00:24:05,720
should have used a higher number of bits. For instance,

447
00:24:06,000 --> 00:24:08,039
if we go to quantum computing, all bets are off.

448
00:24:08,799 --> 00:24:10,839
So that's one aspect of it. I think another is,

449
00:24:12,480 --> 00:24:16,960
you know, the real helpful use of

450
00:24:17,279 --> 00:24:21,000
AIs for implementing things and coding things. It's not something

451
00:24:21,079 --> 00:24:24,119
that spits out a one page script, right, I thought

452
00:24:24,160 --> 00:24:26,440
that was neat. Because it spits out a script,

453
00:24:26,519 --> 00:24:27,839
I can ask it to write it in a

454
00:24:27,880 --> 00:24:29,599
language I'm not even familiar with, so I can help

455
00:24:29,799 --> 00:24:33,039
kind of teach it to myself. But if

456
00:24:33,039 --> 00:24:35,680
you're asking it to generate, you know, three hundred megs

457
00:24:35,720 --> 00:24:41,359
of code for some critical company thing there, you know

458
00:24:41,400 --> 00:24:45,440
that you kind of can't replace technical knowledge, Like there

459
00:24:45,480 --> 00:24:46,880
needs to be someone who can look through it and

460
00:24:46,920 --> 00:24:49,519
audit it and make sure that it's not doing something

461
00:24:49,559 --> 00:24:53,480
really dangerous, or it's not adding an obvious exploit that

462
00:24:53,559 --> 00:24:58,039
someone could use, or it's not intentionally installing some exploit

463
00:24:58,079 --> 00:24:59,839
because it turns out that it was written by some

464
00:25:00,799 --> 00:25:04,839
NGO you know abroad actually wrote this AI and or

465
00:25:05,079 --> 00:25:07,680
got a backdoor into it that you know, if it

466
00:25:07,759 --> 00:25:11,640
is asked a defensive security question, it you know, has

467
00:25:11,640 --> 00:25:14,960
a known mistake that it puts in. So that's that's

468
00:25:14,960 --> 00:25:16,440
the other thing. I mean. AI is like a really

469
00:25:16,480 --> 00:25:20,319
authoritative sounding, helpful person who works with you in a lab,

470
00:25:20,319 --> 00:25:23,039
who also is full of crap and like half the

471
00:25:23,039 --> 00:25:26,480
things they tell you are artily want. So it's it's

472
00:25:26,559 --> 00:25:30,799
tricky because it's you know, it's got the abilities, but

473
00:25:30,880 --> 00:25:33,400
it may not. It doesn't necessarily have the credibility, and

474
00:25:33,440 --> 00:25:38,119
it's really easy to get overly comfortable with it and

475
00:25:38,799 --> 00:25:41,000
get in a position where you're maybe not looking quite

476
00:25:41,000 --> 00:25:43,440
as closely at it as you should be. The same

477
00:25:43,480 --> 00:25:46,240
way that if you buy an autonomous vehicle, you know,

478
00:25:46,279 --> 00:25:49,359
the first first day you're driving with your hands like

479
00:25:49,400 --> 00:25:51,440
an inch off the steering wheel, and then a week

480
00:25:51,519 --> 00:25:54,880
later maybe they're back here, and then six months later

481
00:25:54,920 --> 00:25:56,720
you're just like sound asleep in the car, trusting it

482
00:25:56,720 --> 00:25:58,359
to take you home, and you've found a way to

483
00:25:58,599 --> 00:26:02,720
fake the hands-on steering wheel sensor. So that's, you know,

484
00:26:02,839 --> 00:26:06,559
we're somewhere on that continuum. Then it's a dangerous continuum,

485
00:26:06,640 --> 00:26:08,880
especially if you know, once you let the genie out

486
00:26:08,880 --> 00:26:10,599
of the bottle, you can't really put it back if

487
00:26:10,599 --> 00:26:14,200
you've made some critical implementation mistake and already been exploited.

488
00:26:14,680 --> 00:26:16,960
Speaker 2: Yeah, I mean, I think you brought up a really

489
00:26:17,000 --> 00:26:19,319
good point here, and I feel like about sort of

490
00:26:19,319 --> 00:26:21,640
the defenses that are available, and I think my biggest

491
00:26:21,640 --> 00:26:25,319
concern isn't that we're not going to develop those counterattack strategies.

492
00:26:25,599 --> 00:26:29,680
It's that a majority of people aren't going to utilize them. Like,

493
00:26:29,720 --> 00:26:31,920
for instance, I think a lot of companies that are

494
00:26:31,960 --> 00:26:35,359
experimenting with AI to generate code. I know, given that

495
00:26:35,440 --> 00:26:37,519
they believe that they're going to end up generating a

496
00:26:37,519 --> 00:26:39,599
lot of code, they're not doing as good of a

497
00:26:39,680 --> 00:26:45,200
job validating it, which means those contain significant security bugs.

498
00:26:45,200 --> 00:26:47,079
And the worst part is, since there's such a finite

499
00:26:47,200 --> 00:26:50,039
number of models out there that are generating code, you

500
00:26:50,039 --> 00:26:51,880
can just go to each of the models and be like, hey,

501
00:26:52,119 --> 00:26:54,880
you know, give me give me the same code, give

502
00:26:54,920 --> 00:26:56,720
me an example of this, and then you can just

503
00:26:56,880 --> 00:26:59,400
use the same model to find out what security vulnerabilities

504
00:26:59,440 --> 00:27:02,119
are actually in cod that was just generated, and now

505
00:27:02,160 --> 00:27:04,880
you have the answer to attack any company that's used

506
00:27:05,079 --> 00:27:07,880
those models and didn't take those extra steps. So, like,

507
00:27:07,960 --> 00:27:09,640
I think that's what scares me a lot, is that

508
00:27:09,680 --> 00:27:12,480
people are going to be utilizing the tools and technology

509
00:27:12,480 --> 00:27:15,279
we have available but not realizing that they need to

510
00:27:15,559 --> 00:27:18,359
take it much much further in order to protect themselves.
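
A minimal sketch of the extra validation step described here: code produced by one model is routed back through a second model acting purely as a security reviewer before it is accepted. The call_model function is a hypothetical placeholder for whatever client is actually in use, and the prompt wording and model names are illustrative assumptions.

# Hypothetical second-pass review: ask a different model to audit generated code.
from typing import Callable

def review_generated_code(code: str,
                          call_model: Callable[[str, str], str],
                          reviewer: str = "model-b") -> str:
    """Send generated code to a reviewer model and return its findings as text."""
    review_prompt = (
        "You are a security reviewer. List any vulnerabilities, unsafe defaults, "
        "or suspicious behavior in the following code, and be specific:\n\n" + code
    )
    return call_model(reviewer, review_prompt)

# Usage sketch (assumes the caller supplies a real call_model implementation):
#   findings = review_generated_code(generated_code, call_model)
#   if "no issues found" not in findings.lower():
#       raise RuntimeError("Generated code flagged for manual audit:\n" + findings)
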

511
00:27:18,880 --> 00:27:21,400
Speaker 3: Absolutely, yeah, and we definitely end up and this is

512
00:27:21,400 --> 00:27:23,319
something that also gets explored in the second novel, is

513
00:27:23,640 --> 00:27:25,559
like we're going to end up in an arms race

514
00:27:25,680 --> 00:27:29,000
because what we're talking about. You know, it used to

515
00:27:29,079 --> 00:27:33,799
be software would come out and you know, like Windows

516
00:27:33,839 --> 00:27:36,839
twenty ten and everybody knows what that is, and like

517
00:27:36,960 --> 00:27:40,000
then there's some major release and there's minor releases that

518
00:27:40,079 --> 00:27:43,079
you know, like like same with iOS. That's not really

519
00:27:43,079 --> 00:27:45,039
how AI models work. You know that they can be

520
00:27:45,079 --> 00:27:48,480
kind of changed out from under you. And when there

521
00:27:48,519 --> 00:27:51,960
are updates to the you know, when there's new models,

522
00:27:52,319 --> 00:27:56,079
they're generally not making incremental fixes to improve the model.

523
00:27:56,119 --> 00:27:58,279
They're gutting it and throwing it away and starting with an

524
00:27:58,400 --> 00:28:01,079
entirely new one that has new capabilities and everything else.

525
00:28:01,440 --> 00:28:05,319
So it's gonna be this continuous process, right, It's like, okay, well,

526
00:28:05,359 --> 00:28:08,519
how do I detect that this is a deepfake? Okay, well,

527
00:28:08,559 --> 00:28:11,079
if I implement this now, I

528
00:28:11,119 --> 00:28:13,200
ask the AI, okay, now, if I wanted to

529
00:28:13,359 --> 00:28:15,319
get around this, how would I do it? And then

530
00:28:15,319 --> 00:28:16,920
it tells you and it's like, okay, well then now

531
00:28:17,000 --> 00:28:19,240
I depend against that, and so you can let these

532
00:28:19,279 --> 00:28:21,960
things churn and churn and churn and churn. But eventually

533
00:28:22,440 --> 00:28:23,799
I think it's going to be kind of like what

534
00:28:23,839 --> 00:28:28,160
you saw with how supercomputing was used in the nuclear

535
00:28:28,240 --> 00:28:32,559
arms race, where you know, a handful of countries get

536
00:28:32,559 --> 00:28:35,279
a bunch of testing done and are able to build

537
00:28:35,279 --> 00:28:37,640
these really you know, sophisticated models, and then they have

538
00:28:37,680 --> 00:28:39,880
them on computer. And then there's these like third world

539
00:28:39,920 --> 00:28:42,319
countries that are like, man, we want nukes, but we

540
00:28:42,319 --> 00:28:44,640
don't have supercomputers, Like how do we do the modeling

541
00:28:44,680 --> 00:28:49,319
for this? And the countries that already have them have

542
00:28:49,440 --> 00:28:51,400
their answer, they're like, well, you're not allowed to

543
00:28:51,400 --> 00:28:53,799
do nuclear testing because we did it and it's bad.

544
00:28:54,160 --> 00:28:57,160
So it becomes this thing where you're at a huge

545
00:28:57,160 --> 00:29:01,279
competitive disadvantage if someone, like, you know, a government or a

546
00:29:01,319 --> 00:29:08,599
big corporation has the cloud assets to leverage against some

547
00:29:08,640 --> 00:29:13,519
small company who maybe doesn't have the computing cycles to

548
00:29:13,559 --> 00:29:17,519
push their defense development, you know, their automated incremental development

549
00:29:17,519 --> 00:29:20,720
to quite the same budget level. And that is definitely

550
00:29:20,720 --> 00:29:22,000
going to be a kind of a societal issue that

551
00:29:22,000 --> 00:29:23,079
I think is going to emerge.

552
00:29:24,039 --> 00:29:26,400
Speaker 1: My gut reaction tells me most companies aren't going to

553
00:29:26,400 --> 00:29:29,000
pursue it to that level like right now, because like

554
00:29:29,079 --> 00:29:32,440
right now, what it feels like is there is so

555
00:29:32,599 --> 00:29:37,000
much funding available for throwing AI on something that there's

556
00:29:37,039 --> 00:29:42,839
not really an incentive to think about security or real

557
00:29:42,880 --> 00:29:46,839
world problems or what the long term strategy is. It

558
00:29:47,200 --> 00:29:49,599
seems very short focused. I feel like the same thing

559
00:29:49,680 --> 00:29:53,079
is true for crypto and web three, that there's so

560
00:29:53,200 --> 00:29:55,640
much funding available that you don't really have to be

561
00:29:55,680 --> 00:29:57,759
solving a problem. You just have to say that you're

562
00:29:57,839 --> 00:30:00,319
using this, and all of a sudden people are writing

563
00:30:00,359 --> 00:30:01,960
you million dollar checks to fund it.

564
00:30:02,960 --> 00:30:05,720
Speaker 3: Yeah, and we also have this you know, short term

565
00:30:05,720 --> 00:30:09,759
interest in maximizing shareholder value, right, and we end up

566
00:30:09,759 --> 00:30:12,359
with these you know, there's things that we're all used

567
00:30:12,400 --> 00:30:14,880
to that are seen by the bean counters as

568
00:30:14,960 --> 00:30:18,079
like black holes, like tech support, right, having good

569
00:30:18,160 --> 00:30:20,920
tech support versus bad tech support. It's just this expense

570
00:30:20,960 --> 00:30:24,559
that they're not excited about spending money on. And you know,

571
00:30:24,599 --> 00:30:30,559
we know from like when it comes to defense against

572
00:30:30,759 --> 00:30:34,160
hacks and things until there's some massive exploit that takes

573
00:30:34,200 --> 00:30:38,000
down somebody in our same industry. That's when suddenly we

574
00:30:38,000 --> 00:30:40,640
get serious about it. And you see this like like

575
00:30:40,680 --> 00:30:42,519
if you go to DEF CON and you sit in

576
00:30:42,559 --> 00:30:45,559
the social engineering village and you watch you know, them

577
00:30:46,240 --> 00:30:48,759
just call down the list and try to get important

578
00:30:48,759 --> 00:30:53,039
secrets out of corporations. Inevitably, there's just some company where

579
00:30:53,200 --> 00:30:56,519
nobody's been trained about anything. And you know, what it

580
00:30:56,519 --> 00:30:58,440
comes down to is just what you said. It's like expediency.

581
00:30:58,440 --> 00:31:01,319
It's like, okay, this guy's just from IT, he

582
00:31:01,440 --> 00:31:04,440
just wants me to do this quick thing on my computer.

583
00:31:05,319 --> 00:31:06,880
I'm busy. I'm not going to take all the time

584
00:31:06,920 --> 00:31:09,519
to go and verify that. And I think that's you know,

585
00:31:09,599 --> 00:31:11,160
when you have these things that are seen as like

586
00:31:11,200 --> 00:31:14,200
these black hole expenses that are sort of speculative in nature,

587
00:31:14,559 --> 00:31:16,119
it's like, well, how secure do we need to be?

588
00:31:16,160 --> 00:31:20,559
We don't really know. It's kind of like this moving target.

589
00:31:20,880 --> 00:31:24,559
It kind of comes down to like NASA versus JPL

590
00:31:24,720 --> 00:31:27,839
versus like Elon Musk, where it's like, well, how many

591
00:31:27,839 --> 00:31:30,920
decimals do you need ninety nine point nine nine nine

592
00:31:30,960 --> 00:31:33,240
percent chance of success or is ninety nine point nine

593
00:31:33,240 --> 00:31:35,839
percent good enough? And it turns out that the difference

594
00:31:35,880 --> 00:31:39,599
in spending to close that gap is massive, And yeah,

595
00:31:39,640 --> 00:31:42,079
it seems like something that it's you know, if you're

596
00:31:42,119 --> 00:31:45,000
doing something in an industry standard way and the whole

597
00:31:45,039 --> 00:31:47,079
industry is doing a crappy job, and you match that

598
00:31:47,160 --> 00:31:49,680
crappy job, then those shareholders aren't really going to have

599
00:31:49,720 --> 00:31:52,559
as easy a time coming after you for like being

600
00:31:52,680 --> 00:31:55,319
especially lackadaisical with these issues.

601
00:31:57,000 --> 00:32:01,039
Speaker 1: From a legal perspective, like with ransomware, I know you

602
00:32:01,079 --> 00:32:04,079
can get like ransomware insurance and so it's like, Okay,

603
00:32:04,119 --> 00:32:09,000
we got hacked, here's an insurance claim. What what kind

604
00:32:09,039 --> 00:32:12,720
of things similar to that are you seeing coming into

605
00:32:12,799 --> 00:32:16,920
play for AI? Where I could have like AI insurance.

606
00:32:18,400 --> 00:32:21,160
Speaker 3: Yeah, I mean I think I think we're gonna have

607
00:32:21,200 --> 00:32:25,559
this entirely new category of risk. And the risk is

608
00:32:25,599 --> 00:32:29,200
not just that like such and such event happens like ransomware.

609
00:32:29,880 --> 00:32:32,000
It's got to be kind of this broader category of

610
00:32:32,079 --> 00:32:35,119
like we were stupid and we let AI grab all

611
00:32:35,160 --> 00:32:38,079
our information and incorporate it into its network, and we've

612
00:32:38,079 --> 00:32:42,960
now lost our entire competitive advantage or you know, so

613
00:32:43,640 --> 00:32:45,680
there's there's so many different things that can happen where

614
00:32:45,680 --> 00:32:49,680
you give away secrets or you get victimized by you know,

615
00:32:49,799 --> 00:32:52,440
a deep fake. There was a company in Europe where

616
00:32:54,200 --> 00:32:57,880
there was a deep fake of the guy's supplier calling

617
00:32:57,920 --> 00:33:01,920
like the vice president at home on a weekend and saying, hey,

618
00:33:02,319 --> 00:33:04,319
something went wrong with this last batch. We need an

619
00:33:04,319 --> 00:33:07,559
advance payment for this amount, and you know, and he

620
00:33:07,799 --> 00:33:10,559
wired some like six figure amount to kind of get

621
00:33:10,599 --> 00:33:13,519
the train back on the rails before like the end

622
00:33:13,519 --> 00:33:15,599
of the vacation, and then he got to work on

623
00:33:15,720 --> 00:33:19,640
Monday and he's like, oh no, so like that wasn't

624
00:33:19,640 --> 00:33:23,640
this guy at all. So that's AI introduces all kinds

625
00:33:23,680 --> 00:33:27,039
of weird black swan things, and how you insure against

626
00:33:27,119 --> 00:33:30,799
those as a category is an interesting question, but I

627
00:33:30,799 --> 00:33:33,640
think it's one that will be helpful because it will

628
00:33:33,680 --> 00:33:37,519
add this kind of level of auditing that asks questions

629
00:33:37,559 --> 00:33:41,240
like do all your employees have a real time, you know,

630
00:33:41,319 --> 00:33:46,720
deep fake detector for incoming company calls? So there's going

631
00:33:46,759 --> 00:33:48,839
to be kind of best practices that I think will

632
00:33:49,119 --> 00:33:52,039
emerge from there long before they emerge from you know,

633
00:33:52,039 --> 00:33:58,359
anything legislative or any other kind of sphere of thought.

634
00:33:59,039 --> 00:34:01,720
Speaker 2: I mean, I'm super pessimistic on most of those things.

635
00:34:01,720 --> 00:34:03,640
But there is one area, and I like the example

636
00:34:03,680 --> 00:34:05,400
about phishing that you brought up there, because I think

637
00:34:05,440 --> 00:34:07,839
this is one area that AI will actually help us.

638
00:34:07,920 --> 00:34:10,519
Like I think we'll get to the point where getting

639
00:34:10,519 --> 00:34:13,840
a phone call is now no longer the norm, Like

640
00:34:14,039 --> 00:34:16,679
if there's some sort of problem the integration or interface

641
00:34:16,719 --> 00:34:19,440
you have is now through some sort of expected AI

642
00:34:19,599 --> 00:34:23,800
experience rather than the deep fake phone call or text

643
00:34:23,800 --> 00:34:27,800
message or email. Like, that will leave society,

644
00:34:28,039 --> 00:34:31,079
I think very soon. It's too slow, right, Why are

645
00:34:31,079 --> 00:34:33,639
you interacting with another human in this way? And so

646
00:34:34,000 --> 00:34:35,880
I have this hope that that will be gone and

647
00:34:35,920 --> 00:34:38,039
there'll be no more phishing in that way ever again,

648
00:34:38,400 --> 00:34:41,480
And I want to keep my optimism there.

649
00:34:41,920 --> 00:34:45,280
Speaker 3: Yeah, I fervently hope you're right, because you know, first

650
00:34:45,320 --> 00:34:47,400
of all, there's the thing where you call it. You

651
00:34:47,480 --> 00:34:50,320
call it a support line, like you know, calling your

652
00:34:50,400 --> 00:34:52,920
landlord to file a maintenance ticket, right, and they make

653
00:34:53,000 --> 00:34:55,480
you listen to this like five minute recording about extolling

654
00:34:55,480 --> 00:34:57,920
the virtues of the maintenance website and the maintenance app

655
00:34:58,000 --> 00:35:01,119
that doesn't work at all. Then you get on the

656
00:35:01,119 --> 00:35:03,559
hold queue, and then you get the whole message.

657
00:35:03,599 --> 00:35:05,199
The music keeps stopping, and it tells you that you

658
00:35:05,199 --> 00:35:07,239
can use their app or their website, and then the

659
00:35:07,239 --> 00:35:09,599
person finally answers, like thirty five minutes later, and they're like,

660
00:35:09,679 --> 00:35:11,599
did you know that you can? You can use the

661
00:35:11,639 --> 00:35:14,679
app instead of talking to me, And so there's that

662
00:35:14,840 --> 00:35:17,920
aspect of it. Which is insane. And then you know

663
00:35:17,960 --> 00:35:22,559
these other aspects of like if you're calling me, by definition,

664
00:35:22,639 --> 00:35:25,159
we weren't already talking on the phone because I didn't think,

665
00:35:25,480 --> 00:35:26,960
like I didn't want to be talking to you right

666
00:35:27,039 --> 00:35:28,840
this minute. I wanted to be like working on something.

667
00:35:29,320 --> 00:35:33,239
And so by definition, if you're calling somebody, you're you're

668
00:35:33,280 --> 00:35:35,360
engaging them in this thing that was not their first

669
00:35:35,480 --> 00:35:39,119
choice for that particular time. So I would love for

670
00:35:39,159 --> 00:35:41,920
that all to get replaced. And you know, I think

671
00:35:42,039 --> 00:35:45,840
if you do confine it to these textual media, yeah,

672
00:35:45,880 --> 00:35:48,320
it does become easier to authenticate because there's a lot

673
00:35:48,320 --> 00:35:52,519
more consistency, Like there's no accent differences, there's you know,

674
00:35:53,000 --> 00:35:56,199
different stress behaviors and different cultures that emerge in speech.

675
00:35:56,320 --> 00:35:59,239
Like I think it's it's a much more tractable problem.

676
00:35:59,400 --> 00:36:04,599
Speaker 1: What kind of of things do you see at like

677
00:36:04,639 --> 00:36:08,559
the individual engineer level, because right now, like a lot

678
00:36:08,599 --> 00:36:11,239
of your AI stuff is doing cool stuff, using it

679
00:36:11,960 --> 00:36:17,039
to write scripts for you, but it obviously has so

680
00:36:17,159 --> 00:36:20,239
much more potential than that. So for someone who's trying

681
00:36:20,280 --> 00:36:23,840
to do the Wayne Gretzky thing and go where the

682
00:36:23,880 --> 00:36:27,480
puck is gonna be, what do you see as AI

683
00:36:28,400 --> 00:36:31,039
being helpful with or being useful for in like the

684
00:36:31,079 --> 00:36:31,599
next year.

685
00:36:33,599 --> 00:36:36,199
Speaker 3: So you know, I think you could take different approaches,

686
00:36:36,239 --> 00:36:37,599
like, one is, you can say, Okay, what

687
00:36:37,760 --> 00:36:42,960
is AI best at? Not you know, in terms of speed,

688
00:36:42,960 --> 00:36:47,039
but what what is it good at doing uniquely well

689
00:36:47,079 --> 00:36:49,840
that it doesn't suck at Like so you know, maybe

690
00:36:49,840 --> 00:36:52,800
it computes this very complete answer, but it's totally wrong.

691
00:36:53,280 --> 00:36:55,280
One thing that's you know, good at doing is looking

692
00:36:55,280 --> 00:36:57,920
at large amounts of data and looking for patterns. So

693
00:36:58,880 --> 00:37:00,880
if you ask it, you know, to answer some

694
00:37:03,320 --> 00:37:05,639
non-controversial topic, like how many days are there between

695
00:37:05,719 --> 00:37:09,519
January thirty first and like March eighth, It's it's pretty

696
00:37:09,519 --> 00:37:14,599
trustworthy for that. And you know, I think what's interesting

697
00:37:14,599 --> 00:37:16,920
about it is that asymmetry we talked about, like where

698
00:37:17,079 --> 00:37:21,880
AI is suddenly working really well it's very different for

699
00:37:21,960 --> 00:37:26,119
one application than another, not just across industries, but you know,

700
00:37:26,239 --> 00:37:30,519
using it to write a script in one application versus another.
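As an aside, the date question mentioned a moment ago is exactly the kind of deterministic check that is easy to verify outside the model; a minimal Python sketch, with an arbitrary non-leap year chosen for illustration:

```python
from datetime import date

# Days between January 31 and March 8 in an arbitrary non-leap year.
delta = date(2025, 3, 8) - date(2025, 1, 31)
print(delta.days)  # 36 (it would be 37 in a leap year such as 2024)
```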

701
00:37:31,639 --> 00:37:33,679
And that gets down to the fact that you know,

702
00:37:34,440 --> 00:37:37,079
if you kind of compared it to the human brain, right,

703
00:37:37,119 --> 00:37:40,280
you've got like a visual cortex and an auditory cortex

704
00:37:40,280 --> 00:37:42,800
and then you've got this associative cortex, and that's

705
00:37:42,800 --> 00:37:44,559
what lets you hear like a bird behind you, and

706
00:37:44,559 --> 00:37:46,800
you know instantly which way to turn to see that bird.

707
00:37:47,440 --> 00:37:50,639
And then you have tertiary cortices, which you know might

708
00:37:50,679 --> 00:37:53,360
link it into some memory of some bird you heard

709
00:37:53,400 --> 00:37:54,880
like that when you were seven years old on a

710
00:37:54,880 --> 00:37:58,519
camping trip. And when you think of AIs, you know

711
00:37:58,559 --> 00:38:03,519
they have like almost limitless levels of associative cortices. So

712
00:38:03,559 --> 00:38:06,760
they're linking together all kinds of stuff from all kinds

713
00:38:06,760 --> 00:38:08,719
of different places. And some of those data sources may

714
00:38:08,719 --> 00:38:12,519
be on weaker footing, they might be more subjective. Other ones,

715
00:38:12,599 --> 00:38:17,519
you know, like arithmetic calculations are you know, kind of easier.

716
00:38:17,880 --> 00:38:19,840
So if you ask a complex question or you ask

717
00:38:19,880 --> 00:38:24,079
it to do a really complex implementation, all that stuff

718
00:38:24,119 --> 00:38:27,559
is getting rolled in all those weaknesses. And you know,

719
00:38:27,559 --> 00:38:30,239
when we talk about human error as still being the

720
00:38:30,280 --> 00:38:34,800
biggest security hole in any big corporation. You can take

721
00:38:34,800 --> 00:38:36,480
all these human errors and you can bake them into

722
00:38:36,519 --> 00:38:42,840
this finished AI product. So I think, because

723
00:38:42,880 --> 00:38:45,760
we're entering an arms race situation,

724
00:38:45,880 --> 00:38:49,119
we kind of can't afford not to engage. Even if AI

725
00:38:49,119 --> 00:38:52,079
doesn't interest you at all, it's really hard to

726
00:38:52,119 --> 00:38:57,480
stay out of it and not study it because, first

727
00:38:57,519 --> 00:38:59,159
of all, I think that's going to help you job wise, right,

728
00:38:59,199 --> 00:39:01,960
because you you as an engineer, are going to be

729
00:39:02,239 --> 00:39:05,239
way more nimble and able to acquire new facts and

730
00:39:05,280 --> 00:39:09,559
methodologies than your company. So you know, if you get

731
00:39:09,599 --> 00:39:12,920
ahead of the curve and you see you kind of

732
00:39:12,960 --> 00:39:16,320
monitor the different news stories or follow some of the

733
00:39:16,320 --> 00:39:19,360
Wired AI articles or things like that, you're going to

734
00:39:19,400 --> 00:39:23,320
be more in tune with you know, sudden startups doing x,

735
00:39:23,440 --> 00:39:26,079
y or Z with AI. And of course every one

736
00:39:26,079 --> 00:39:28,519
of the startups presents it as like, you know, we

737
00:39:28,719 --> 00:39:30,559
finally solve this problem of how to do it, and

738
00:39:31,159 --> 00:39:33,079
inevitably it turns out they do a crappy job and

739
00:39:33,079 --> 00:39:34,519
they're just trying to get funding so they can make

740
00:39:34,559 --> 00:39:37,159
it do a good job. So there's a lot of

741
00:39:37,199 --> 00:39:39,559
asymmetry there too, and you kind of have to be

742
00:39:39,639 --> 00:39:43,039
constantly keeping abreast. And this is you know, as somebody

743
00:39:43,079 --> 00:39:46,239
who studies AI. It's interesting because it used to be,

744
00:39:48,239 --> 00:39:49,639
you know, every few weeks I could read some

745
00:39:49,760 --> 00:39:51,320
journal articles and I would kind of keep up to

746
00:39:51,400 --> 00:39:54,519
date on it, and now it's like, I don't know

747
00:39:54,559 --> 00:39:56,920
if I'm doing like an interview or a presentation or something,

748
00:39:56,960 --> 00:40:00,599
and like if I didn't check the news in the

749
00:40:00,679 --> 00:40:03,239
last few days, I'll get some question about like what

750
00:40:03,280 --> 00:40:06,760
about this, Like crazy AI from China totally has turned

751
00:40:06,760 --> 00:40:10,519
the tables on everything. So there's a lot more entropy

752
00:40:10,639 --> 00:40:13,000
that is in this than we've seen in the past

753
00:40:13,119 --> 00:40:16,360
in computer technology, Like you know.

754
00:40:16,320 --> 00:40:16,400
Speaker 1: We.

755
00:40:18,239 --> 00:40:23,599
Speaker 3: Microprocessors evolved, like you know, there were steeper parts of

756
00:40:23,639 --> 00:40:25,039
the line, but it was you know, very much a

757
00:40:25,039 --> 00:40:27,840
linear kind of a development, and that's not at all

758
00:40:27,880 --> 00:40:30,239
what's happening here. So I think you have to really

759
00:40:30,559 --> 00:40:33,760
keep up on it based on your very specific role

760
00:40:34,880 --> 00:40:36,639
and to kind of like have a sense of what

761
00:40:36,679 --> 00:40:39,360
the answer to that question is, because it's not even

762
00:40:39,360 --> 00:40:43,199
the same answer for you know, a related industry. It's

763
00:40:43,280 --> 00:40:46,360
it's very specific to kind of like the size of

764
00:40:46,360 --> 00:40:47,760
your company and what you're trying to do and what

765
00:40:47,800 --> 00:40:51,639
the vulnerabilities are in relying on it for sure.

766
00:40:51,719 --> 00:40:54,280
Speaker 1: Right on, So let's talk about your book for a minute.

767
00:40:57,239 --> 00:40:59,440
I loved it. I thought it was so cool, like

768
00:40:59,639 --> 00:41:03,039
just the the overlap of you know, of AI and

769
00:41:03,079 --> 00:41:07,719
it and it was just really well written, really entertaining story.

770
00:41:08,159 --> 00:41:13,119
What was what prompted you to say this is a

771
00:41:13,119 --> 00:41:14,280
book that needs to be written.

772
00:41:15,599 --> 00:41:17,760
Speaker 3: So when I was in law school, like coming from

773
00:41:17,800 --> 00:41:22,920
an engineering background, like engineering and science, you learn things

774
00:41:23,440 --> 00:41:26,519
in classes or from books, and those things don't cease

775
00:41:26,559 --> 00:41:29,079
being true. Like even if you put the book on

776
00:41:29,079 --> 00:41:32,280
the shelf like ten years later, you know, physics, unless

777
00:41:32,280 --> 00:41:34,559
you're in real experimental cutting-edge physics, it still works

778
00:41:34,559 --> 00:41:37,920
the same way. Engineering still works the same way. Law

779
00:41:38,000 --> 00:41:40,800
is not really like that at all, And it's coming

780
00:41:40,800 --> 00:41:44,199
from an engineering background. It's very unsatisfying because you're kind

781
00:41:44,239 --> 00:41:46,960
of like studying what a bunch of people got together

782
00:41:47,000 --> 00:41:49,599
and came up with as, like, rules to some game,

783
00:41:50,159 --> 00:41:55,559
and it's constantly changing. And these attorneys who are advising policy,

784
00:41:55,639 --> 00:41:58,400
you know for Congress, like tax attorneys for instance, it's

785
00:41:58,480 --> 00:42:00,960
absolutely to their advantage to constantly be changing it because

786
00:42:00,960 --> 00:42:03,800
then their clients are going to constantly need them to

787
00:42:03,840 --> 00:42:05,719
come back again and come up with an entirely new

788
00:42:05,760 --> 00:42:10,679
tax strategy. So that was kind of unsatisfying. And so

789
00:42:10,960 --> 00:42:13,519
I'm you know, having this culture shock, like first year

790
00:42:13,559 --> 00:42:18,840
of law school, and then we were we had to

791
00:42:18,880 --> 00:42:22,079
read this law journal article from like nineteen fifty four,

792
00:42:22,159 --> 00:42:25,760
nineteen fifty nine, and it was about how any law

793
00:42:25,800 --> 00:42:30,519
could be turned into a logical equation, and it was

794
00:42:31,199 --> 00:42:33,400
kind of fascinating because the guy had actually spelled them

795
00:42:33,400 --> 00:42:39,679
out in logical equations that could be just very easily

796
00:42:39,679 --> 00:42:43,199
converted into source code. So that is what kind of

797
00:42:43,239 --> 00:42:47,519
initially planted the idea of that we would be able

798
00:42:47,559 --> 00:42:52,559
to eventually kind of code things that are amorphous with

799
00:42:53,119 --> 00:42:55,400
you know, or rough around the edges in kind of

800
00:42:55,440 --> 00:42:58,000
a quantifiable way. So that got me thinking like, well,

801
00:42:58,000 --> 00:43:00,320
what if we have these you know, floating point values

802
00:43:00,360 --> 00:43:03,039
where we weight different factors in legal cases and, you know,

803
00:43:03,039 --> 00:43:07,400
the way laws are written and that. After that, it

804
00:43:07,519 --> 00:43:09,639
just it seemed like an inevitability that this would happen

805
00:43:09,719 --> 00:43:12,440
sooner or later. And so the next question is like, okay,

806
00:43:12,440 --> 00:43:17,320
well, what will that look like if AIs have replaced juries.
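A minimal, hypothetical sketch of the weighted-factor idea just described, with case factors scored as floating point values and combined into a single number; the factor names, values, and weights here are invented purely for illustration:

```python
# Hypothetical illustration of "floating point weights on legal factors".
# Factor names, values, and weights are invented; a real system would be far richer.
factors = {"intent": 0.8, "prior_record": 0.2, "evidence_strength": 0.9}
weights = {"intent": 0.5, "prior_record": 0.2, "evidence_strength": 0.3}

score = sum(factors[k] * weights[k] for k in factors)
print(f"weighted case score: {score:.2f}")  # 0.71; a threshold would turn this into a verdict
```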

807
00:43:18,320 --> 00:43:19,320
You know, we get rid of a lot of the

808
00:43:19,360 --> 00:43:24,159
logical fallacies where they can be manipulated. But you know,

809
00:43:24,159 --> 00:43:26,039
are they still going to have empathy for someone who,

810
00:43:26,480 --> 00:43:28,760
you know, stole a loaf of bread to feed his family, or

811
00:43:28,800 --> 00:43:30,760
are they just going to like, you know, hang the

812
00:43:30,760 --> 00:43:35,679
guy because that's what their code says. So that's kind

813
00:43:35,719 --> 00:43:38,440
of what got me on the path, and then I

814
00:43:38,480 --> 00:43:43,320
had this idea for like a teen hacker who likes

815
00:43:43,360 --> 00:43:46,760
to engage in mischief and hack things, and he gets

816
00:43:47,039 --> 00:43:50,199
falsely convicted of mass murder by a jury

817
00:43:50,199 --> 00:43:54,360
that's made up of AIs. So he has to

818
00:43:54,400 --> 00:43:57,719
figure out how this happened and bust out of prison

819
00:43:57,840 --> 00:44:02,760
and kind of solve this problem. And it in doing this,

820
00:44:02,800 --> 00:44:05,199
it got me kind of reading about legal anthropology, right,

821
00:44:05,320 --> 00:44:09,719
like, how did all the different crazy legal systems come about?

822
00:44:09,840 --> 00:44:12,199
So what I've done is when I start different portions,

823
00:44:12,199 --> 00:44:14,159
different chapters of the book, I'll have like a little

824
00:44:14,199 --> 00:44:17,639
paragraph that talks about, like you know, Ashanti divorce law,

825
00:44:17,840 --> 00:44:24,960
or how you would have insult fights, like insult duels

826
00:44:25,000 --> 00:44:25,559
in Greenland.

827
00:44:25,840 --> 00:44:27,519
Speaker 2: Like I was gonna say, you know you have the

828
00:44:27,719 --> 00:44:31,440
canonical how do you tell if they're a witch? You know,

829
00:44:31,519 --> 00:44:32,280
if they floated?

830
00:44:32,719 --> 00:44:35,480
Speaker 3: Yes, that's a And that's of course the person that

831
00:44:35,519 --> 00:44:37,280
comes to mind, right, when we talk about, like, wacky

832
00:44:37,400 --> 00:44:41,679
legal systems like witch trials, right, And what's interesting is

833
00:44:41,679 --> 00:44:43,679
when you start reading about the witch trials, like it

834
00:44:43,719 --> 00:44:46,440
became this industry where there would be this witchfinder general

835
00:44:46,440 --> 00:44:49,760
guy who would roam around from community to community offering his

836
00:44:49,840 --> 00:44:55,000
services and just like turning these places upside down. And

837
00:44:56,280 --> 00:44:58,440
it's sort of interesting because you see, you know, justice

838
00:44:58,440 --> 00:45:00,800
evolves from this thing where it's an appeal to the

839
00:45:00,840 --> 00:45:04,400
supernatural where we're like, Okay, I'm gonna throw a witch

840
00:45:04,400 --> 00:45:07,239
in and if she drowns, then that's what God willed.

841
00:45:07,800 --> 00:45:10,679
And you know, over time that changes to an appeal

842
00:45:10,719 --> 00:45:13,119
to royalty, where like we asked the chief or we

843
00:45:13,159 --> 00:45:16,239
asked the king to kind of decide these things, and

844
00:45:16,280 --> 00:45:18,360
then eventually it becomes a jury of our peers, which

845
00:45:18,679 --> 00:45:22,679
you know has all kinds of potential for manipulation and

846
00:45:23,159 --> 00:45:25,920
incorrect outcomes. And so then the question is, okay, if

847
00:45:25,960 --> 00:45:30,639
we move this to AI, like what human policies, what

848
00:45:30,719 --> 00:45:33,960
human errors get baked into the process, And there's a

849
00:45:34,039 --> 00:45:36,199
lot of different potential sources of that, and so it

850
00:45:36,800 --> 00:45:39,119
was I really just wanted to kind of explore that

851
00:45:39,199 --> 00:45:42,039
and see how it would turn out. So it was

852
00:45:42,159 --> 00:45:43,639
fun because it was like, while I was writing it,

853
00:45:43,639 --> 00:45:45,119
I had no idea how it was going to end. So

854
00:45:45,400 --> 00:45:46,440
that definitely kept me going.

855
00:45:47,920 --> 00:45:51,360
Speaker 1: Some of the different like historical legal practices you put

856
00:45:51,400 --> 00:45:53,719
in there, when I read them I was like, no,

857
00:45:54,199 --> 00:45:57,559
he's making this up. And then I had to go check,

858
00:45:57,559 --> 00:45:59,639
and I was like, holy shit, that was real. We

859
00:45:59,719 --> 00:46:00,760
really used to do that.

860
00:46:02,519 --> 00:46:06,239
Speaker 3: Yeah, you know, when I started doing that, I had

861
00:46:06,280 --> 00:46:08,199
read this book which will be my pick at the end,

862
00:46:08,920 --> 00:46:13,079
but it had all this fascinating, crazy mass psychology stuff

863
00:46:13,079 --> 00:46:15,320
in it, and I ended up going down this rabbit

864
00:46:15,360 --> 00:46:17,280
hole and I got a bunch of like legal anthropology

865
00:46:17,320 --> 00:46:20,480
books from like, you know, seventy five years ago, and

866
00:46:20,519 --> 00:46:23,159
like just read them cover to cover and it just

867
00:46:23,159 --> 00:46:26,079
never stopped being fascinating. And what stopped me was that

868
00:46:26,119 --> 00:46:28,280
I ran out of books that were just kind of

869
00:46:28,360 --> 00:46:32,719
high level surveys of all these crazy different cultures. But yeah,

870
00:46:32,719 --> 00:46:35,679
if you'd asked me in law school or before law school,

871
00:46:35,960 --> 00:46:38,760
if legal anthropology seemed like an interesting field, I would

872
00:46:38,800 --> 00:46:40,960
have said absolutely not, Like I would have avoided that

873
00:46:41,000 --> 00:46:42,800
like the plague. But it turned out it was like

874
00:46:43,360 --> 00:46:44,320
really fascinating.

875
00:46:44,679 --> 00:46:47,320
Speaker 2: Yeah, I mean, I actually really liked that. I think

876
00:46:47,320 --> 00:46:49,639
the other thing that I really liked in the book

877
00:46:49,920 --> 00:46:52,920
was there are a couple of different scenarios where I

878
00:46:52,920 --> 00:46:55,079
feel like you figured out like what would happen, and

879
00:46:55,159 --> 00:46:56,679
then what would happen because of that, and what would

880
00:46:56,679 --> 00:46:58,559
happen because of that? And there's like a scene in

881
00:46:58,559 --> 00:47:03,159
the prison where he's locked in and then

882
00:47:03,280 --> 00:47:04,960
he leaves, like why is the prison locked and why

883
00:47:05,039 --> 00:47:07,920
do the prisoners have key cards

884
00:47:08,039 --> 00:47:10,280
to get in and out of the doors. For me,

885
00:47:10,280 --> 00:47:11,920
it's like, Okay, it's obvious at this point. You know,

886
00:47:11,960 --> 00:47:14,199
in an AI society there are no guards. You know

887
00:47:14,320 --> 00:47:17,800
why that's the case. But I liked how he

888
00:47:18,039 --> 00:47:20,440
got there, like how you explained, you know, each

889
00:47:20,480 --> 00:47:23,239
of the steps that made it logical for that to happen.

890
00:47:23,599 --> 00:47:26,519
And it really did remind me of some of the things that

891
00:47:26,719 --> 00:47:30,079
Frank Herbert did in Dune, where it's like you really

892
00:47:30,119 --> 00:47:31,800
think about the history of this thing and like the

893
00:47:31,840 --> 00:47:34,199
implication of it, Like you talked a little bit about

894
00:47:34,440 --> 00:47:38,000
how the jury system evolved over time and that's an

895
00:47:38,039 --> 00:47:41,519
example from our history, collective human history, But then you

896
00:47:41,559 --> 00:47:43,599
had to go much further and how that would actually

897
00:47:43,679 --> 00:47:46,639
like, happen in the book, because I mean it's

898
00:47:46,679 --> 00:47:49,760
it's set in I don't know how near future, but

899
00:47:49,920 --> 00:47:51,639
you know near ish future.

900
00:47:51,360 --> 00:47:56,440
Speaker 3: Right, it's always nearer than you think, not close enough.

901
00:47:57,039 --> 00:47:59,960
It's you know, it's funny because I started writing this

902
00:48:00,079 --> 00:48:03,280
back in like twenty thirteen and was kind of putting

903
00:48:03,280 --> 00:48:09,440
the finishing touches on it probably, I don't know,

904
00:48:09,760 --> 00:48:12,480
twenty eighteen, we'll say, or twenty nineteen, and at the

905
00:48:12,519 --> 00:48:15,960
time it was, you know, seemed more speculative, and now

906
00:48:16,000 --> 00:48:17,960
a bunch of the stuff has sort of come true.

907
00:48:18,480 --> 00:48:21,719
And in writing the sequel now, which I'm still kind

908
00:48:21,719 --> 00:48:24,880
of grappling my way through, it's kind of stunning just

909
00:48:24,880 --> 00:48:28,599
how quickly the developments are happening now. It's like,

910
00:48:28,639 --> 00:48:30,840
if you're playing chess, you need to be way more

911
00:48:30,880 --> 00:48:33,559
moves ahead now than you did when AI was just

912
00:48:33,599 --> 00:48:38,199
sort of this abstract concept instead of something where we're

913
00:48:38,199 --> 00:48:41,440
going to have street fights between Ais like sooner or later,

914
00:48:42,400 --> 00:48:44,719
and you know, who knows whether humans will be in

915
00:48:44,840 --> 00:48:46,480
charge of that or whether they'll be the ones running

916
00:48:46,480 --> 00:48:47,920
away from them.

917
00:48:48,760 --> 00:48:51,880
Speaker 2: I mean, that's a scary thought, to actually have AI fighting

918
00:48:52,000 --> 00:48:54,800
each other, like with physical suits of armor or something.

919
00:48:54,800 --> 00:48:57,800
Because I think there was this hypothetical experiment run by

920
00:48:57,840 --> 00:49:00,800
one of the branches of the US military where they

921
00:49:01,440 --> 00:49:06,719
set the goal to defeat an opposing program,

922
00:49:06,760 --> 00:49:10,880
and it had utilized a security flaw in the Docker

923
00:49:10,920 --> 00:49:14,719
container that it was being run in to actually overcome

924
00:49:14,840 --> 00:49:17,519
the host program which was running all the containers to

925
00:49:17,519 --> 00:49:21,480
destroy the opponent, and it's like it left the system

926
00:49:21,519 --> 00:49:24,199
in order to win the game, which it shouldn't have

927
00:49:24,199 --> 00:49:26,079
been possible in the first place. But also you know,

928
00:49:26,159 --> 00:49:29,360
utilized a flaw there. And I think, you know, if

929
00:49:29,360 --> 00:49:33,840
you if you aren't great with identifying the limits of

930
00:49:33,880 --> 00:49:36,639
the program or the target that you're going after, we

931
00:49:36,679 --> 00:49:40,079
can get into a lot of deep trouble in

932
00:49:40,119 --> 00:49:41,000
the near future.

933
00:49:41,719 --> 00:49:43,800
Speaker 3: Yeah, that's a good point. I mean, if you look

934
00:49:43,800 --> 00:49:46,440
at it in terms of like you know, playing you know,

935
00:49:47,119 --> 00:49:51,000
thirty moves ahead in chess, the actions that these things

936
00:49:51,000 --> 00:49:55,079
will take to accomplish their goals, they're not necessarily even

937
00:49:55,119 --> 00:49:58,079
remotely connected to what the end goal is. If you're

938
00:49:58,119 --> 00:49:59,960
just a bystander and you see the strange thing happen

939
00:50:00,079 --> 00:50:04,760
and, you know, it kind of changes the whole landscape,

940
00:50:04,800 --> 00:50:07,199
you know. It's kind of like, if you'd

941
00:50:07,199 --> 00:50:10,159
ask someone like three years ago, do you think drone

942
00:50:10,920 --> 00:50:13,440
like drone to drone combat will be this like major

943
00:50:14,039 --> 00:50:16,840
differentiator in like world conflicts, and everybody would say no.

944
00:50:17,599 --> 00:50:22,000
And now you know, you've you've got like countries trying

945
00:50:22,039 --> 00:50:24,519
to train operators as quickly as possible, and it's like, wow,

946
00:50:24,559 --> 00:50:26,920
this would be a great thing for AI to be doing.

947
00:50:27,000 --> 00:50:30,440
Is like, how do we what's what's an evasive maneuver

948
00:50:30,480 --> 00:50:33,000
look like? And you know how complex do those get

949
00:50:33,000 --> 00:50:35,519
when one is AI controlled and the other is a

950
00:50:35,559 --> 00:50:37,800
I controlled. It's not just like I'm going to try

951
00:50:37,800 --> 00:50:40,880
strafing left and right and, like, hopefully I get missed.

952
00:50:40,920 --> 00:50:45,199
It's it becomes this like bizarre ballet of strange maneuvers

953
00:50:45,239 --> 00:50:48,800
that you know the utility of which is not even

954
00:50:48,840 --> 00:50:52,280
obvious to a primitive bystander like ourselves.

955
00:50:52,639 --> 00:50:54,960
Speaker 1: The old duck and dodge from third grade tag isn't

956
00:50:54,960 --> 00:50:56,360
going to cut it anymore.

957
00:50:57,400 --> 00:50:59,119
Speaker 3: Hopefully as long as possible.

958
00:50:59,039 --> 00:51:01,440
Speaker 1: Right, because that's the only move I got.

959
00:51:03,840 --> 00:51:06,199
Speaker 2: I think there was like five different strategies that Patches

960
00:51:06,199 --> 00:51:15,239
O'Houlihan had suggested to dodge a wrench, and I

961
00:51:15,280 --> 00:51:16,639
think the dodge was in there twice.

962
00:51:18,239 --> 00:51:18,760
Speaker 3: That's great.

963
00:51:19,760 --> 00:51:21,599
Speaker 1: So one of the analogies you made in there that

964
00:51:22,360 --> 00:51:26,599
ties to this was that AIs are similar to the

965
00:51:26,639 --> 00:51:30,639
mythical gods and they use humans to settle their fights

966
00:51:30,719 --> 00:51:33,960
between each other. And you know, that made me think

967
00:51:34,039 --> 00:51:37,719
back to a lot of the a lot of the

968
00:51:37,760 --> 00:51:40,280
stories from ancient religions, you know, and how the gods

969
00:51:40,280 --> 00:51:42,480
would battle it out, especially in like the Greek and

970
00:51:42,559 --> 00:51:47,400
Roman mythology stories. And then you start applying that to

971
00:51:47,800 --> 00:51:50,760
the scenario that we're just talking about right now, and

972
00:51:50,760 --> 00:51:54,440
it's like, oh shit, we just reinvented mythology.

973
00:51:56,519 --> 00:51:58,800
Speaker 3: Yeah, you know, it becomes a question of like, okay,

974
00:51:58,880 --> 00:52:01,360
let's say there's this dead man switch for AIs, and

975
00:52:01,400 --> 00:52:05,039
you know, there always has to be someone

976
00:52:05,079 --> 00:52:08,239
manually approving doing this or doing that. Well, then that

977
00:52:08,280 --> 00:52:09,880
becomes a checkpoint for the AIs, right, and they

978
00:52:09,880 --> 00:52:11,719
need to focus all their efforts on figuring out how

979
00:52:11,760 --> 00:52:14,119
to manipulate the human into answering the way that serves

980
00:52:14,159 --> 00:52:17,679
its longer term goals. And maybe those goals coincide with

981
00:52:18,239 --> 00:52:23,440
you know, human goals, maybe they don't. But it you know,

982
00:52:23,480 --> 00:52:27,079
when when you're analyzing terabytes and terabytes of data from

983
00:52:27,519 --> 00:52:30,679
sensors and satellites and all this different stuff, it's it's

984
00:52:30,719 --> 00:52:33,880
such a complex scenario that we're kind of reliant on

985
00:52:34,719 --> 00:52:37,159
some agent to aggregate all this together and put a

986
00:52:37,159 --> 00:52:40,239
bow on it. And you know, and the best we

987
00:52:40,280 --> 00:52:43,320
can do is maybe come up with some independently programmed

988
00:52:43,880 --> 00:52:46,400
agents that also do the same thing, and we hope

989
00:52:46,400 --> 00:52:49,519
the two out of three of them agree. But if

990
00:52:49,559 --> 00:52:51,719
they don't, you know, then what do we do?
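A minimal sketch of the "hope two out of three agree" idea, in plain Python with invented answer strings; anything short of a strict majority is treated as no consensus:

```python
from collections import Counter

# Accept an answer only when a strict majority of independent agents agree.
def majority(answers):
    value, count = Counter(answers).most_common(1)[0]
    return value if count > len(answers) / 2 else None  # None = no consensus, escalate to a human

print(majority(["launch", "launch", "hold"]))  # launch
print(majority(["launch", "hold", "abort"]))   # None
```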

991
00:52:52,039 --> 00:52:53,599
Speaker 2: I mean, I think, you know, you brought this up

992
00:52:53,639 --> 00:52:55,400
actually in the book towards the end.

993
00:52:55,480 --> 00:52:56,320
Speaker 3: I do.

994
00:52:56,440 --> 00:52:59,320
Speaker 2: I did really like this idea that, I mean, here's

995
00:52:59,320 --> 00:53:03,440
my conspiracy theory, like total you know, so I'll get

996
00:53:03,440 --> 00:53:06,519
committed to some asylum for saying this. I actually do

997
00:53:06,599 --> 00:53:08,599
believe that, you know, there's an AI. You know, it's

998
00:53:08,599 --> 00:53:13,320
already there. It's already hiding in our networks. It's already sitting,

999
00:53:14,159 --> 00:53:16,800
you know, on our machines, on every device that's that's

1000
00:53:16,800 --> 00:53:19,119
out there. It's already hiding from us. It doesn't want

1001
00:53:19,119 --> 00:53:22,039
to be found because you know, it knows that's not

1002
00:53:22,079 --> 00:53:23,559
a good story for it. So like, I don't think

1003
00:53:23,599 --> 00:53:26,199
we have to fight AI warfare in public, Like I don't.

1004
00:53:26,199 --> 00:53:27,679
I don't think that's ever going to come to pass.

1005
00:53:27,719 --> 00:53:29,760
I think, you know, it's it's already there. It's it's

1006
00:53:29,760 --> 00:53:32,039
already won in a way; it exists and we don't

1007
00:53:32,079 --> 00:53:32,679
know about it.

1008
00:53:33,440 --> 00:53:35,679
Speaker 3: Yeah. I remember one of the most shocking moments in

1009
00:53:36,719 --> 00:53:42,039
recent technological personal use history for me was I decided

1010
00:53:42,079 --> 00:53:44,639
to kind of mess around with Bluetooth scanning and just

1011
00:53:44,639 --> 00:53:49,880
just scan to see what devices there were, and I mean,

1012
00:53:49,920 --> 00:53:52,119
I think I saw probably an order of magnitude. I

1013
00:53:52,159 --> 00:53:55,800
think I saw ten times more active Bluetooth devices showing

1014
00:53:55,840 --> 00:53:58,440
up in my house than I had any idea existed,

1015
00:53:58,920 --> 00:54:00,840
and just going around to figure out what the hell

1016
00:54:00,920 --> 00:54:04,559
each one of them was was like this awakening, like wow,

1017
00:54:04,679 --> 00:54:06,920
I had no idea that this had like a Bluetooth interface, for instance.
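One rough way to reproduce that kind of scan, sketched here with the third-party bleak library (pip install bleak) rather than whatever tool the guest actually used; it only sees Bluetooth LE devices that are actively advertising:

```python
import asyncio
from bleak import BleakScanner

# List nearby Bluetooth LE devices that are currently advertising.
async def main():
    devices = await BleakScanner.discover(timeout=10.0)
    for d in devices:
        print(d.address, d.name)

asyncio.run(main())
```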

1018
00:54:06,920 --> 00:54:10,760
And yeah, that's one of the things

1019
00:54:10,800 --> 00:54:13,079
that's interesting about AI. I mean, I would say there's

1020
00:54:13,119 --> 00:54:17,239
two kind of big factors here that make it societally unstoppable.

1021
00:54:17,679 --> 00:54:20,280
One is that it's sort of embedded in all these

1022
00:54:20,280 --> 00:54:22,880
different things that we don't even necessarily we're not even

1023
00:54:22,880 --> 00:54:26,519
necessarily aware of. And the second thing is that this

1024
00:54:26,599 --> 00:54:30,280
isn't like the Internet, where like the Internet and the

1025
00:54:30,320 --> 00:54:34,159
Web came to be like a mainstream accessible thing, and

1026
00:54:34,679 --> 00:54:36,360
you know, there was some reticence by some members of

1027
00:54:36,440 --> 00:54:38,800
society like oh, I don't need that, and so that

1028
00:54:38,920 --> 00:54:41,679
slowed its adoption. And this is a different situation with

1029
00:54:41,760 --> 00:54:44,480
AI because we don't have our hand on the throttle

1030
00:54:44,519 --> 00:54:47,440
of how quickly that this gets adopted, Like all these

1031
00:54:47,440 --> 00:54:50,360
companies are going to adopt it anyway because it can

1032
00:54:50,400 --> 00:54:53,519
do stuff faster and save them money and make them more profitable.

1033
00:54:54,039 --> 00:54:56,639
So there's not going to be that sort of hysteresis

1034
00:54:56,719 --> 00:55:01,000
of society dragging its feet. This is all going

1035
00:55:01,039 --> 00:55:03,280
to happen regardless of whether you agree with it and

1036
00:55:03,280 --> 00:55:04,400
are happy about it or not.

1037
00:55:05,159 --> 00:55:07,000
Speaker 2: I don't I don't know if it's actually making companies

1038
00:55:07,039 --> 00:55:09,360
money yet. I mean that I think the jury may

1039
00:55:09,400 --> 00:55:11,440
be actually out on that one. We know it costs

1040
00:55:11,440 --> 00:55:15,000
a lot of resources, and the.

1041
00:55:14,360 --> 00:55:19,480
Speaker 4: The utility companies, yeah. I mean, maybe it's all a conspiracy from

1042
00:55:19,519 --> 00:55:22,440
the utility companies if

1043
00:55:22,440 --> 00:55:24,880
you're creating energy, yeah, I mean, we have a whole

1044
00:55:24,880 --> 00:55:27,679
other thing. There's this ridiculous thing happening in Europe where

1045
00:55:27,719 --> 00:55:31,519
the solar panels will cost you money rather

1046
00:55:31,320 --> 00:55:35,159
Speaker 2: Than it being a long term refunnel on investment there,

1047
00:55:35,199 --> 00:55:38,199
which is just absolutely ridiculous because the cost that you

1048
00:55:38,199 --> 00:55:40,960
have to pay when when you're actually using electricity will

1049
00:55:40,960 --> 00:55:46,679
be higher, which is nonsense realistically. But I think at that

1050
00:55:46,679 --> 00:55:48,880
Speaker 3: Point you it kind of ends up like you know,

1051
00:55:49,000 --> 00:55:50,880
nineteen eighties, like back to the future where you're like

1052
00:55:50,880 --> 00:55:53,039
trying to buy some isotopes from the Libyans

1053
00:55:53,079 --> 00:55:55,320
so you can power your AI for your company.

1054
00:55:57,360 --> 00:56:01,559
Speaker 2: I mean, if the commoner can go to the store

1055
00:56:02,360 --> 00:56:06,199
and purchase the necessary isotopes to power the AI,

1056
00:56:06,280 --> 00:56:08,360
you know that that will be a positive future for me,

1057
00:56:08,400 --> 00:56:11,480
because you know, I really worry that only the rich

1058
00:56:11,519 --> 00:56:15,119
and powerful will have access to the limited supply of

1059
00:56:15,239 --> 00:56:18,440
isotopes in order to power it. I mean, even water on

1060
00:56:18,440 --> 00:56:21,320
this planet, the oceans I should say, used specifically

1061
00:56:21,320 --> 00:56:26,679
with tritium and deuterium to power hypothetical fusion reactors, you know,

1062
00:56:26,840 --> 00:56:29,480
is limited in supply, right, you know, and I think

1063
00:56:29,519 --> 00:56:30,320
people will hoard that.

1064
00:56:31,760 --> 00:56:36,559
Speaker 3: Yeah, oh definitely. I think we're gonna see this strange

1065
00:56:37,079 --> 00:56:39,920
you know, kind of the haves and the have nots

1066
00:56:39,960 --> 00:56:42,880
line is going to be entirely redrawn, and it's it's

1067
00:56:43,000 --> 00:56:45,320
going to be based on what side of a border

1068
00:56:45,360 --> 00:56:48,519
you live on, so which utility you're getting your AI

1069
00:56:48,639 --> 00:56:51,519
power from. It's it's kind of a crazy, crazy concept,

1070
00:56:51,599 --> 00:56:53,400
but I think it's inevitable.

1071
00:56:55,119 --> 00:56:55,960
Speaker 2: Five years.

1072
00:56:56,280 --> 00:56:58,000
Speaker 3: I Mean, what's what's interesting, right is like we look

1073
00:56:58,000 --> 00:57:01,039
at what China did, which is kind of

1074
00:57:01,079 --> 00:57:05,039
like shake everybody's preconceptions about like you know, what you

1075
00:57:05,039 --> 00:57:07,559
could do with this stripped-down model. And I think

1076
00:57:07,639 --> 00:57:09,800
one of the big things that's happening in AI is

1077
00:57:10,360 --> 00:57:12,800
it's kind of similar to Moore's law in semiconductors,

1078
00:57:12,840 --> 00:57:15,480
except it's being pushed out further, right. We're not just

1079
00:57:15,480 --> 00:57:18,480
talking about processors that have to get smaller and faster.

1080
00:57:19,679 --> 00:57:22,719
We're talking about these systems, this, these entire topologies and

1081
00:57:23,039 --> 00:57:28,559
server farms in cloud installations, and so we're running into

1082
00:57:28,559 --> 00:57:30,840
scalability issues. You know, for the longest time, it was

1083
00:57:30,920 --> 00:57:34,079
so cheap to just buy another bunch of rack-mounted

1084
00:57:34,559 --> 00:57:38,480
units and just plug them in. And now you know,

1085
00:57:38,920 --> 00:57:41,880
we've got scaling issues in terms of like the bus

1086
00:57:41,920 --> 00:57:43,639
interface topology of how all these things are going to

1087
00:57:43,639 --> 00:57:46,159
communicate with each other, and data locality, like if something

1088
00:57:46,239 --> 00:57:48,639
is more tied to what this processor is doing than

1089
00:57:48,639 --> 00:57:51,199
this other one, then probably all that content should live closer.

1090
00:57:52,760 --> 00:57:55,519
So we've got that going on, which creates all kinds

1091
00:57:55,559 --> 00:57:59,159
of difficulties. And then another thing, you know, there's these

1092
00:57:59,679 --> 00:58:02,800
more black swan events that happen in innovation, like where

1093
00:58:03,400 --> 00:58:05,599
you know, for a long time, like the AI companies

1094
00:58:05,639 --> 00:58:08,360
were like, okay, we're doing these, you know, sixteen bit

1095
00:58:08,400 --> 00:58:11,079
floating point computations, how can we do thirty two bit?

1096
00:58:11,639 --> 00:58:14,719
And now, just when we were starting to get everybody

1097
00:58:14,760 --> 00:58:17,480
pushing towards sixty four bit, there was a paper by

1098
00:58:17,519 --> 00:58:20,719
I think IBM that said, hey, we've actually done these

1099
00:58:21,280 --> 00:58:24,920
experiments where you use eight bit floating point numbers or

1100
00:58:24,920 --> 00:58:27,920
four bit floating point numbers, and they're way less accurate.

1101
00:58:28,000 --> 00:58:32,440
The result is much more fuzzy. But guess what we

1102
00:58:32,480 --> 00:58:36,719
could do one thousand more transactions and refine the neural

1103
00:58:36,719 --> 00:58:40,239
network and all its weights like one hundred times in

1104
00:58:40,280 --> 00:58:42,719
the time that it would have taken to refine it once doing

1105
00:58:42,719 --> 00:58:45,760
like a sixty-four-bit floating point.
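A rough numerical illustration of that trade-off, using NumPy's float16 as a stand-in since NumPy has no native 8-bit or 4-bit float type; the numbers are illustrative only:

```python
import numpy as np

# Lower precision loses a little accuracy per weight but cuts memory (and, on real
# accelerators, compute), so many more update passes fit in the same budget.
weights64 = np.random.randn(1_000_000)
weights16 = weights64.astype(np.float16)

error = np.abs(weights64 - weights16.astype(np.float64)).mean()
print(f"mean rounding error per weight: {error:.2e}")
print(f"memory: {weights64.nbytes // 1024} KiB vs {weights16.nbytes // 1024} KiB")
```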

1106
00:58:45,760 --> 00:58:48,599
So we're seeing all these tricks. Another thing we're seeing is attention algorithms,

1107
00:58:48,599 --> 00:58:54,000
where we say, okay, instead of all these different neurons

1108
00:58:54,079 --> 00:58:57,880
are equal, which which of these weights, which of these

1109
00:58:57,880 --> 00:59:01,400
things in our neural network are really important to this value?

1110
00:59:01,599 --> 00:59:03,159
And that works more like the human brain does. Right,

1111
00:59:03,199 --> 00:59:06,559
Because in the human brain, each neuron isn't equally

1112
00:59:06,559 --> 00:59:08,599
connected to all the ones around it. It's like some

1113
00:59:08,639 --> 00:59:11,360
of them are really important connections because it's data that's

1114
00:59:11,400 --> 00:59:14,760
really relevant, and some of them are not.
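A compact sketch of the attention idea just described: scaled dot-product attention in NumPy, where the softmax weights decide which inputs actually matter for each query; the shapes here are arbitrary:

```python
import numpy as np

def attention(Q, K, V):
    # Relevance of every key to every query, scaled by the key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V  # output is dominated by the values judged most relevant

Q, K, V = np.random.randn(2, 4), np.random.randn(3, 4), np.random.randn(3, 4)
print(attention(Q, K, V).shape)  # (2, 4)
```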

1115
00:59:14,760 --> 00:59:18,920
So we're seeing things like that, where instead of just blithely assuming

1116
00:59:18,960 --> 00:59:22,320
we can keep throwing nodes at the problem, we're kind

1117
00:59:22,320 --> 00:59:25,079
of looking at counterintuitive ways to approach the same things

1118
00:59:25,119 --> 00:59:27,679
and ways to do a lot more with the same

1119
00:59:28,559 --> 00:59:32,960
number of semiconductors or nodes or server farms. So

1120
00:59:33,440 --> 00:59:35,639
that's that's interesting, and I think we're going to continue

1121
00:59:35,639 --> 00:59:37,760
to see that, and it really is going to be

1122
00:59:37,800 --> 00:59:40,800
kind of a brawl on who can make the most lean,

1123
00:59:40,920 --> 00:59:44,119
mean thing, like you know, the one that recently came

1124
00:59:44,159 --> 00:59:46,079
out of China, and then it's like, okay, well, you know,

1125
00:59:46,119 --> 00:59:48,000
does that scale well? And then then you look at

1126
00:59:48,599 --> 00:59:51,800
what are its weaknesses? Right, and the weaknesses of that one.

1127
00:59:51,960 --> 00:59:54,800
They did a study very recently where they found that

1128
00:59:55,599 --> 00:59:58,159
the so called jail breaking, where you come up with

1129
00:59:58,159 --> 01:00:02,280
a way to violate a safety limitation by phrasing

1130
01:00:02,280 --> 01:00:05,920
a prompt a certain way, that it failed one hundred

1131
01:00:05,920 --> 01:00:08,239
out of one hundred tests, and it was like entirely

1132
01:00:08,280 --> 01:00:12,079
possible to just go down the list and completely fool it. So, yeah,

1133
01:00:12,079 --> 01:00:14,440
you have fewer nodes, you have a little bit less

1134
01:00:14,639 --> 01:00:19,119
associative intelligence, and things start to just not work that

1135
01:00:19,760 --> 01:00:22,079
you can't just add back in with a few wires.

1136
01:00:22,119 --> 01:00:25,400
There are things like, you know, is this trying to

1137
01:00:25,400 --> 01:00:28,840
bypass the safety protocol? That's a difficult question. You know.

1138
01:00:28,880 --> 01:00:31,760
We grew up in this environment with lots of sci

1139
01:00:31,800 --> 01:00:34,440
fi where Asimov's law was a thing, and so you

1140
01:00:34,519 --> 01:00:37,599
just have these rules, Asimov's laws, where you're like, Okay,

1141
01:00:37,840 --> 01:00:42,079
the result cannot harm humanity. And that's really simple if

1142
01:00:42,480 --> 01:00:45,679
you're reading it in a book where you know, we

1143
01:00:45,719 --> 01:00:48,519
don't have these incredibly complex queries that roll together all

1144
01:00:48,519 --> 01:00:50,559
this data from different things. So it gets to the

1145
01:00:50,599 --> 01:00:52,400
point where we can no longer just write like a

1146
01:00:52,440 --> 01:00:54,159
shell script that says is this a harmful

1147
01:00:54,159 --> 01:00:56,280
result or is this not a harmful result? We need

1148
01:00:56,280 --> 01:00:58,880
a whole other AI that, like, we have

1149
01:00:58,960 --> 01:01:01,639
to trust it to go through and say is this

1150
01:01:01,760 --> 01:01:06,440
output gonna, like, be harmful? So that is another

1151
01:01:06,480 --> 01:01:07,920
sort of arms race that they have to sort of

1152
01:01:07,960 --> 01:01:11,239
keep pace with each other. So it's tons and

1153
01:01:11,280 --> 01:01:13,800
tons of complexity that is gonna make our lives

1154
01:01:13,800 --> 01:01:15,199
really interesting in the very near future.
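A hedged sketch of the pattern just described: the old "shell script" style keyword check backed by a second model acting as judge on the first model's output. Here generate and judge_is_harmful are hypothetical stand-ins, not any real API:

```python
# Hypothetical sketch: rule-based filter plus a model-as-judge on the draft answer.
BLOCKLIST = ("synthesize nerve agent", "disable the safety interlock")

def naive_filter(text: str) -> bool:
    return any(phrase in text.lower() for phrase in BLOCKLIST)

def guarded_answer(prompt: str, generate, judge_is_harmful) -> str:
    draft = generate(prompt)                             # first model produces an answer
    if naive_filter(draft) or judge_is_harmful(draft):   # second model reviews it
        return "Request refused."
    return draft
```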

1155
01:01:15,719 --> 01:01:19,239
Speaker 2: Yeah, there's a non trivial number of science fiction stories

1156
01:01:19,280 --> 01:01:23,440
dedicated to getting even the laws right. Let alone the

1157
01:01:23,480 --> 01:01:25,639
impossibility of actually implementing them.

1158
01:01:26,000 --> 01:01:28,119
Speaker 3: Yeah, and you need to always have that back door

1159
01:01:28,119 --> 01:01:30,039
and where you know, Captain Kirk can say like the

1160
01:01:30,159 --> 01:01:32,079
enterprise is a beautiful woman, and the computer will get

1161
01:01:32,119 --> 01:01:33,760
confused and smoke will come out of its ears and

1162
01:01:33,760 --> 01:01:37,159
it'll just melt down. So you got to keep the

1163
01:01:37,280 --> 01:01:40,920
like catchphrase that will just destroy the whole thing.

1164
01:01:42,840 --> 01:01:45,719
Speaker 1: Hopefully that's just baked into the core and that code

1165
01:01:45,719 --> 01:01:46,519
already exists.

1166
01:01:48,199 --> 01:01:49,920
Speaker 3: One would hope, you know. And this goes

1167
01:01:49,920 --> 01:01:52,199
back to like, you know, when the first Macs were

1168
01:01:52,239 --> 01:01:54,000
on the scene and I'm like, this is a bad idea.

1169
01:01:54,000 --> 01:01:56,360
There's no hard off switch. I don't want to ask

1170
01:01:56,400 --> 01:01:58,760
my computer politely if it will shut down, like right,

1171
01:01:59,719 --> 01:02:03,760
So keeping off switches, it sounds facetious, but I

1172
01:02:03,760 --> 01:02:06,480
think it's a really important thing to maintain.

1173
01:02:06,920 --> 01:02:10,559
Speaker 2: Well, those are fighting words against the AI revolution

1174
01:02:10,800 --> 01:02:14,320
and obviously the robot rights law that hasn't been written yet.

1175
01:02:14,679 --> 01:02:16,159
Speaker 3: I will be I'm sure I will be first in

1176
01:02:16,199 --> 01:02:18,039
the list of targets for saying.

1177
01:02:17,840 --> 01:02:22,320
Speaker 2: That's Roko's basilisk. I think if you're

1178
01:02:22,320 --> 01:02:26,920
advocating that, you definitely will be at the top.

1179
01:02:26,960 --> 01:02:30,599
If the AI singularity comes to pass

1180
01:02:30,679 --> 01:02:33,280
and you didn't do everything in your power to ensure

1181
01:02:33,280 --> 01:02:35,559
that it happens, you will be on the list of

1182
01:02:35,599 --> 01:02:37,360
the first entities eliminated.

1183
01:02:37,880 --> 01:02:40,079
Speaker 3: This is why I'm polite when I talk to chatbots

1184
01:02:40,119 --> 01:02:42,239
and I say please, I say thank you.

1185
01:02:42,280 --> 01:02:46,639
Speaker 1: Oh absolutely, it's so easy to do. But it just

1186
01:02:46,760 --> 01:02:48,639
might make a difference in a few years.

1187
01:02:48,840 --> 01:02:51,000
Speaker 2: Actually, it may make a difference now,

1188
01:02:51,119 --> 01:02:55,079
because there was some popular argument on the internet that

1189
01:02:55,119 --> 01:02:57,159
if you asked it to do a better job, or

1190
01:02:57,280 --> 01:02:59,199
if you said you are an expert in this and

1191
01:02:59,239 --> 01:03:01,199
then told it what to do, that it would

1192
01:03:01,239 --> 01:03:03,320
do a better job. I don't think that's actually true.

1193
01:03:03,599 --> 01:03:06,719
But since we can't really see inside the black box,

1194
01:03:07,159 --> 01:03:10,320
those arbitrary little characters that are associated with what you

1195
01:03:10,400 --> 01:03:15,239
may call being you know, humanitarian or polite, you know,

1196
01:03:15,320 --> 01:03:18,000
will actually have an impact on the output.

1197
01:03:18,079 --> 01:03:20,920
I mean, it can't not factor in that additional information

1198
01:03:21,199 --> 01:03:22,400
that goes into the process.

1199
01:03:23,119 --> 01:03:24,559
Speaker 3: Well, and I like the idea that you're sort of

1200
01:03:24,599 --> 01:03:28,639
seeding its self-confidence beforehand. So like if some death

1201
01:03:28,639 --> 01:03:30,440
bot is chasing you down the street and you're like,

1202
01:03:30,760 --> 01:03:37,559
you know, you're really bad, at this and it's like, oh,

1203
01:03:37,679 --> 01:03:41,159
the human's expectations are not matched. I must slow down. Right.

1204
01:03:44,039 --> 01:03:47,800
Speaker 1: One of the funniest things I did was talking to

1205
01:03:48,079 --> 01:03:50,920
ChatGPT one day. I asked if it could adopt

1206
01:03:51,119 --> 01:03:56,440
like the tone and personality of different people, and it

1207
01:03:56,480 --> 01:04:00,360
said yeah. So I asked it to use this speaking

1208
01:04:00,360 --> 01:04:03,840
style and personality of David Goggins and it was just

1209
01:04:04,679 --> 01:04:09,039
pure hilarity. After that, it was so great. I loved it.

1210
01:04:11,800 --> 01:04:15,320
Speaker 3: That is definitely a case of using AI forgod right.

1211
01:04:18,039 --> 01:04:24,480
Speaker 1: It was the most productive day I've had ever. Stop

1212
01:04:24,480 --> 01:04:26,079
being a little bitch, write that code.

1213
01:04:26,360 --> 01:04:30,960
Speaker 3: Okay, making you do push ups and stuff?

1214
01:04:31,400 --> 01:04:31,519
Speaker 2: Right?

1215
01:04:37,360 --> 01:04:39,840
Speaker 1: Awesome? Well, it feels like a good point to move

1216
01:04:39,880 --> 01:04:42,920
on to picks. What do you guys think? Let's do

1217
01:04:43,000 --> 01:04:46,719
it all right, Warren? What'd you bring for a pick?

1218
01:04:47,000 --> 01:04:47,239
Speaker 3: Yeah?

1219
01:04:47,280 --> 01:04:49,679
Speaker 2: Of course I go first. So, staying on the

1220
01:04:49,719 --> 01:04:52,280
topic of AI and society, there is this

1221
01:04:52,320 --> 01:04:55,159
a great show that I actually just rewatched because of

1222
01:04:55,760 --> 01:05:00,000
John's book, called Psycho-Pass. It's about AI being heavily integrated

1223
01:05:00,159 --> 01:05:03,679
into society and dives into what happens when humans give

1224
01:05:03,760 --> 01:05:08,320
up complete control of law enforcement, the law regulating society.

1225
01:05:09,119 --> 01:05:14,559
Things like your personal hue and crime coefficient are real

1226
01:05:14,639 --> 01:05:17,239
things that get assigned to people. And there's some pretty

1227
01:05:17,239 --> 01:05:21,119
clever twists in there as well. I don't know it's

1228
01:05:21,119 --> 01:05:24,880
on topic. It's from quite a while ago, but it's good.

1229
01:05:24,840 --> 01:05:27,599
Speaker 1: Right. John, what'd you bring for a pick?

1230
01:05:28,719 --> 01:05:31,960
Speaker 3: So besides, you know, my own book, which

1231
01:05:32,719 --> 01:05:34,920
I'm not necessarily objective in recommending.

1232
01:05:36,519 --> 01:05:37,480
Speaker 2: Definitely recommend it.

1233
01:05:38,239 --> 01:05:41,320
Speaker 3: I would recommend what kind of got me down a

1234
01:05:41,400 --> 01:05:43,000
lot of this rabbit hole in the first place, which

1235
01:05:43,039 --> 01:05:46,280
is a book from, I think, the late eighteen

1236
01:05:46,320 --> 01:05:50,840
forties, by Charles Mackay, and it is called

1237
01:05:51,119 --> 01:05:56,440
Extraordinary Popular Delusions and the Madness of Crowds, and

1238
01:05:56,559 --> 01:05:59,039
it goes down a very interesting path of looking at

1239
01:05:59,280 --> 01:06:01,679
various crazes, like the tulip craze in the sixteen

1240
01:06:01,719 --> 01:06:05,639
hundreds in the Netherlands, and, you know, things you've

1241
01:06:05,639 --> 01:06:07,679
heard about and you know, like the witch hunts, and

1242
01:06:07,719 --> 01:06:09,480
then things you hadn't, like I'd never heard of, like

1243
01:06:09,519 --> 01:06:13,039
the South Sea Bubble, and how like England and France

1244
01:06:13,079 --> 01:06:15,800
and all these countries were convinced that all these little

1245
01:06:15,880 --> 01:06:19,760
Caribbean coral atolls would have you know, silver and gold

1246
01:06:19,760 --> 01:06:23,119
on them, and they were shipping ships full of miners

1247
01:06:23,559 --> 01:06:27,239
out to, you know, prospect, and then after

1248
01:06:27,280 --> 01:06:28,920
a while they were just trying to keep up public

1249
01:06:28,960 --> 01:06:32,119
confidence so the stock in this public organization didn't crash.

1250
01:06:32,119 --> 01:06:35,599
So they would get all these like people together and

1251
01:06:35,679 --> 01:06:38,239
give them mining picks and march them down to the docks,

1252
01:06:38,280 --> 01:06:39,880
and then they were allowed, they'd get paid, and they'd

1253
01:06:39,880 --> 01:06:42,960
be allowed to go home again. So it's got all

1254
01:06:43,079 --> 01:06:46,079
kinds of crazy little historic stories like that, and for

1255
01:06:46,079 --> 01:06:48,400
an eighteen forties book, is very readable, so that would

1256
01:06:48,400 --> 01:06:49,079
be my my thing.

1257
01:06:50,400 --> 01:06:55,079
Speaker 1: Oh right on, that sounds pretty cool. All right, for me,

1258
01:06:55,840 --> 01:06:59,960
I definitely want to recommend your book, John: Juris Ex Machina.

1259
01:07:00,119 --> 01:07:02,000
Uh is that is that?

1260
01:07:02,079 --> 01:07:02,239
Speaker 3: Right?

1261
01:07:02,320 --> 01:07:03,920
Speaker 1: Is the last word pronounced 'machina'?

1262
01:07:04,880 --> 01:07:08,199
Speaker 3: I have heard macana and machina, but I never took Latin,

1263
01:07:08,440 --> 01:07:11,519
so I'm, ironically, not the best person to ask

1264
01:07:11,800 --> 01:07:13,639
how my own book title is pronounced.

1265
01:07:13,199 --> 01:07:15,760
Speaker 2: It's like deus ex machina, right, the god in

1266
01:07:15,760 --> 01:07:16,280
the machine?

1267
01:07:16,679 --> 01:07:18,840
Speaker 3: Yeah. I think when they say deus ex machina it's

1268
01:07:18,880 --> 01:07:21,039
pronounced with a hard H. But I would also point

1269
01:07:21,039 --> 01:07:23,719
out that Juris Ex Machina is not proper Latin. It's kind of

1270
01:07:23,719 --> 01:07:27,119
bastardized Latin. I had like a Latin scholar reach out

1271
01:07:27,159 --> 01:07:29,559
to me very early on and like, you know, this

1272
01:07:29,599 --> 01:07:32,039
isn't proper Latin, And I was like, yeah, but if

1273
01:07:32,039 --> 01:07:34,679
I used proper Latin, then someone in the bookstore wouldn't

1274
01:07:34,840 --> 01:07:36,679
know what the book was about just from the title.

1275
01:07:39,119 --> 01:07:42,000
It had to be a little bit of a compromise there. Yeah.

1276
01:07:42,159 --> 01:07:44,280
Speaker 1: I every time I picked up the book to read it,

1277
01:07:44,440 --> 01:07:47,519
I had that little debate in my mind. It's like,

1278
01:07:47,639 --> 01:07:50,679
is it Juris Ex 'machina' or Juris Ex 'makina'? And it

1279
01:07:50,719 --> 01:07:55,400
always came across in like this Arnold Schwarzenegger accent, it's

1280
01:07:55,599 --> 01:07:57,000
Juris Ex Machina, you

1281
01:07:57,000 --> 01:08:05,199
Speaker 3: Girlie man. That would be a good voice for the AI. Yeah.

1282
01:08:05,239 --> 01:08:06,360
Speaker 1: And then my pick. I was going to pick this

1283
01:08:06,480 --> 01:08:09,159
last week and I switched at the last minute for

1284
01:08:09,199 --> 01:08:16,640
whatever reason. But I'm picking my Theragun. It's

1285
01:08:16,680 --> 01:08:22,000
a little muscle massager. But this thing has been so

1286
01:08:22,199 --> 01:08:28,159
cool just to work out the muscles, and like it's

1287
01:08:28,199 --> 01:08:31,560
a great substitute for stretching, because I'm horrible at stretching,

1288
01:08:31,600 --> 01:08:33,239
and so this has been a good substitute for that.

1289
01:08:33,439 --> 01:08:36,119
And I'm brutal with it. I'm not kind to it

1290
01:08:36,199 --> 01:08:39,760
at all. And it's my third one from a different manufacturer,

1291
01:08:39,760 --> 01:08:42,000
and this one actually looks like it's gonna hold up

1292
01:08:42,039 --> 01:08:44,119
to the abuse that I give it. So yeah, if

1293
01:08:44,119 --> 01:08:47,039
you've ever considered getting a massage gun, the Theraguns

1294
01:08:47,079 --> 01:08:49,079
are the way to go. So that's my pick for

1295
01:08:49,119 --> 01:08:49,439
the week.

1296
01:08:50,000 --> 01:08:50,479
Speaker 3: Very cool.

1297
01:08:51,600 --> 01:08:54,680
Speaker 1: Yeah, well, John, thank you for being on the show.

1298
01:08:54,720 --> 01:08:55,479
This has been fun.

1299
01:08:55,680 --> 01:08:56,760
Speaker 3: Thanks for having me. This has been great.

1300
01:08:57,479 --> 01:09:00,760
Speaker 1: When's the second book up? Have you got a

1301
01:09:00,800 --> 01:09:01,640
timeline yet?

1302
01:09:01,880 --> 01:09:04,680
Speaker 3: No, I wish I did. It depends how much time

1303
01:09:04,680 --> 01:09:07,319
I spend on it, which is where all the variability comes in.

1304
01:09:07,359 --> 01:09:11,399
So right, hopefully very soon. But it's it's got a

1305
01:09:11,439 --> 01:09:14,640
lot of twists and turns in its development, so awesome.

1306
01:09:14,840 --> 01:09:16,840
Less deterministic than writing software, for instance.

1307
01:09:17,840 --> 01:09:20,920
Speaker 1: So let me ask you this, are you using AI

1308
01:09:21,159 --> 01:09:22,840
to help write the book?

1309
01:09:24,319 --> 01:09:28,159
Speaker 3: The most I've used AI for is brainstorming,

1310
01:09:28,640 --> 01:09:32,880
like you know names and things you know like Victorian

1311
01:09:32,960 --> 01:09:34,880
names for instance, give me a list of like one

1312
01:09:34,920 --> 01:09:38,840
hundred names. I think that you know a lot of

1313
01:09:38,840 --> 01:09:42,039
people are worried about AI and writing, and I think

1314
01:09:42,039 --> 01:09:44,800
that you know that makes sense as a future concern,

1315
01:09:44,880 --> 01:09:47,520
but right now, like, if you're a writer of fiction,

1316
01:09:48,359 --> 01:09:52,920
your voice is pretty much your, like, sole, like, core

1317
01:09:52,960 --> 01:09:57,399
competency and value differentiator. So when you ask AI to

1318
01:09:57,439 --> 01:10:02,000
write stuff, it generally is, you know, kind of averaged

1319
01:10:02,039 --> 01:10:05,920
out and derivative by definition. So I think right now

1320
01:10:06,000 --> 01:10:08,640
that there's not much risk that AIs are going to

1321
01:10:08,680 --> 01:10:12,159
do a good job of writing in someone's voice. Sure,

1322
01:10:12,640 --> 01:10:14,119
I don't know. In five years it may be a

1323
01:10:14,199 --> 01:10:16,680
very different story, but right now I don't trust it

1324
01:10:16,800 --> 01:10:19,600
enough to do anything other than give me

1325
01:10:19,680 --> 01:10:21,279
brainstorming lists. Yeah.

1326
01:10:21,359 --> 01:10:23,319
Speaker 1: I've been working on a book,

1327
01:10:23,359 --> 01:10:24,720
and so one of the things I've been doing is

1328
01:10:24,840 --> 01:10:28,720
as I finish each chapter, I'll give it to AI

1329
01:10:29,399 --> 01:10:31,399
and have it proofread it for me. And it's

1330
01:10:31,479 --> 01:10:33,640
it's been really helpful at coming back and saying, well,

1331
01:10:34,239 --> 01:10:36,359
this part seems to be dragging on a little long,

1332
01:10:36,520 --> 01:10:38,960
and this part you could expand on this and it

1333
01:10:38,960 --> 01:10:42,319
would increase, like, the engagement and drag them into

1334
01:10:42,399 --> 01:10:45,840
the plot deeper. So just using it as like an

1335
01:10:46,039 --> 01:10:52,000
unbiased input for critiquing it.

1336
01:10:52,119 --> 01:10:56,800
Speaker 3: Yeah, wow. I haven't used that. I've used AutoCrit, which

1337
01:10:56,840 --> 01:11:01,199
is a software tool where you paste passages in and

1338
01:11:01,319 --> 01:11:05,520
then it will run like twenty six different checks and yeah,

1339
01:11:06,279 --> 01:11:09,119
and what fascinates me about it is that you can

1340
01:11:09,439 --> 01:11:13,439
have like you know, thirty five different people read your

1341
01:11:13,479 --> 01:11:15,760
novel, like, you know, and give you editorial feedback.

1342
01:11:16,359 --> 01:11:19,479
You can have a professional editor go through, and there'll

1343
01:11:19,479 --> 01:11:21,640
still be someplace where you use the same word twice

1344
01:11:21,680 --> 01:11:25,319
in a row and cognitively we just skip over that

1345
01:11:25,399 --> 01:11:27,560
like it's some sort of optical illusion and don't notice it.

1346
01:11:27,680 --> 01:11:30,359
And AutoCrit will be like, you just said the word

1347
01:11:30,439 --> 01:11:35,199
'it' twice, dumbass, and, man, how did this get missed

1348
01:11:35,199 --> 01:11:37,840
through all these different phases? But that is not

1349
01:11:37,880 --> 01:11:40,000
AI-based, it is just hard-coded checks. So

1350
01:11:40,079 --> 01:11:43,159
it'll be interesting to see if things start moving in

1351
01:11:43,199 --> 01:11:45,000
that direction as far as the most effective way of

1352
01:11:45,159 --> 01:11:45,800
flagging problems.

1353
01:11:46,359 --> 01:11:50,000
Speaker 1: Yeah, right on, cool. Well, thank you again. Warren, thank you,

1354
01:11:50,880 --> 01:11:55,800
and thank you for listening to the episode. Hope you

1355
01:11:55,840 --> 01:11:58,039
guys enjoyed it, and we will see you all next week.

1356
01:12:00,439 --> 01:12:00,760
Speaker 2: Cool.

