1
00:00:05,000 --> 00:00:06,599
Speaker 1: To like already click the button.

2
00:00:09,720 --> 00:00:11,359
Speaker 2: And that's how we're starting the episode.

3
00:00:13,240 --> 00:00:15,960
Speaker 1: Oh for those of you just getting started, actually you

4
00:00:16,000 --> 00:00:17,800
are just getting started because I just clicked on the

5
00:00:17,920 --> 00:00:23,480
record button. But hey, welcome, thanks for joining the podcast, Warren.

6
00:00:23,519 --> 00:00:24,000
How are you?

7
00:00:24,679 --> 00:00:26,640
Speaker 2: I'm great. You know, I feel like, you know,

8
00:00:26,679 --> 00:00:29,239
we should go back to: this is Adventures in DevOps.

9
00:00:29,280 --> 00:00:31,440
You know, for the first time listeners, if this is

10
00:00:31,519 --> 00:00:35,079
just the one episode they happen to click on, then

11
00:00:35,359 --> 00:00:37,399
they're in for a real treat today because I'm really

12
00:00:37,439 --> 00:00:38,840
interested in today's topic.

13
00:00:39,920 --> 00:00:45,520
Speaker 1: Yeah, and today's topic, we have Gorkem Ercan with us

14
00:00:45,600 --> 00:00:49,799
and we're gonna be talking about how to

15
00:00:49,840 --> 00:00:56,479
integrate AI and ML into teams using tooling that's a

16
00:00:56,520 --> 00:00:59,359
little bit more aligned with how we do DevOps.

17
00:00:59,479 --> 00:01:00,520
Speaker 2: Does that sound about right?

18
00:01:01,679 --> 00:01:02,880
Speaker 3: Yeah, that sounds right.

19
00:01:03,200 --> 00:01:03,640
Speaker 1: All right?

20
00:01:03,960 --> 00:01:04,280
Speaker 3: Cool?

21
00:01:05,280 --> 00:01:08,200
Speaker 1: Before we dig into the details of that, can you

22
00:01:08,239 --> 00:01:11,319
share with our listeners a little bit about your background?

23
00:01:12,799 --> 00:01:18,079
Speaker 3: Sure. I am the CTO of Jozu. Jozu

24
00:01:18,200 --> 00:01:22,079
is a company that is trying to make it easier

25
00:01:22,079 --> 00:01:30,000
for enterprises to adopt AI and ML into their applications.

26
00:01:30,040 --> 00:01:34,359
And prior to Jozu I was with Red Hat for many,

27
00:01:34,400 --> 00:01:37,120
many years. I was a distinguished engineer with Red

28
00:01:37,159 --> 00:01:46,040
Hat, working on developer tools, which at Red Hat

29
00:01:46,319 --> 00:01:49,840
covers anything from IDEs to CI/CD

30
00:01:50,000 --> 00:01:54,920
pipelines to DevOps. So it's everything

31
00:01:55,480 --> 00:01:59,480
that you can think of that is open source and

32
00:01:59,560 --> 00:02:04,200
that we'll tool. My group kind of got involved with

33
00:02:04,239 --> 00:02:10,560
those as part of that. I did Java tools early

34
00:02:10,560 --> 00:02:18,479
in my career for something called J2EE, if

35
00:02:18,520 --> 00:02:25,120
anyone remembers that. And then I actually had the

36
00:02:25,240 --> 00:02:32,080
great idea of integrating Java tooling into VS Code. The

37
00:02:32,439 --> 00:02:37,120
current Java tooling is something that I have done in

38
00:02:37,479 --> 00:02:40,400
a month or so and that has grown to what

39
00:02:40,479 --> 00:02:47,000
it is today. My group got involved with things like

40
00:02:47,159 --> 00:02:50,000
Tekton, the CD Foundation, and so on and so forth. So

41
00:02:50,000 --> 00:02:53,879
I've been doing DevOps for the last probably five years,

42
00:02:54,199 --> 00:02:58,159
or tools for DevOps for five years, and tools in general.

43
00:02:58,360 --> 00:03:05,599
Before that, before joining Red Hat, I was working for Nokia,

44
00:03:06,280 --> 00:03:09,879
the phone company, so I was in the middle of

45
00:03:09,919 --> 00:03:21,360
the mobile revolution and again mostly doing open source projects

46
00:03:21,680 --> 00:03:23,400
at Nokia at the time.

47
00:03:23,639 --> 00:03:27,240
Speaker 2: When you say open source and Java and Red Hat,

48
00:03:27,280 --> 00:03:31,400
my mind immediately goes to Jenkins and I don't have

49
00:03:31,479 --> 00:03:34,919
a particular soft spot in my heart for that product,

50
00:03:34,960 --> 00:03:36,599
but I can imagine you've had to spend a lot

51
00:03:36,639 --> 00:03:38,199
of time with it. Is that what they were using?

52
00:03:39,960 --> 00:03:43,840
Speaker 3: So yeah, we did start with Jenkins. We did support,

53
00:03:43,960 --> 00:03:47,800
or Red Hat still supports, Jenkins to

54
00:03:48,000 --> 00:03:53,080
some degree. But at the time when we tried to

55
00:03:53,159 --> 00:03:57,520
do a project, called OpenShift.io, and

56
00:03:57,639 --> 00:03:59,919
one of the things that we tried to do at

57
00:04:00,080 --> 00:04:06,159
the time was to run Jenkins in a manner that

58
00:04:06,199 --> 00:04:09,520
it was not supposed to be run in at the time.

59
00:04:09,960 --> 00:04:13,560
I think Jenkins is doing better now, but at the time,

60
00:04:13,879 --> 00:04:16,839
Jenkins was created as a server, so it was meant

61
00:04:16,879 --> 00:04:19,240
to run as a server. But then when you're on

62
00:04:19,279 --> 00:04:23,399
the cloud trying to run Jenkins, one thing that you

63
00:04:23,439 --> 00:04:25,360
want to do is you want to try to run

64
00:04:25,399 --> 00:04:30,360
it serverless. Like, if I need to go and build something,

65
00:04:30,879 --> 00:04:35,680
especially if you're doing this as a SAS service, you

66
00:04:35,759 --> 00:04:39,160
want to just on demand start a Jenkins instance, do

67
00:04:39,319 --> 00:04:41,560
your builds, and then shut it down so that you're

68
00:04:41,560 --> 00:04:47,000
not spending too much on the cloud resources. That didn't

69
00:04:47,040 --> 00:04:51,040
work well for us. So one of the things that

70
00:04:51,240 --> 00:04:55,560
happened at the time was the Knative project. If you're

71
00:04:55,600 --> 00:04:59,800
familiar with it. And as part of the Knative project's

72
00:05:00,040 --> 00:05:06,240
first code drop, before it even got public, it

73
00:05:06,279 --> 00:05:09,279
was open to some of the Red Hatters, including myself, and

74
00:05:09,319 --> 00:05:11,839
one of the things that we noticed was there was

75
00:05:11,879 --> 00:05:15,279
a build piece in the Knative code which was

76
00:05:15,319 --> 00:05:21,120
actually doing serverless builds, essentially. So the whole idea of, oh,

77
00:05:21,360 --> 00:05:25,439
can we actually take this and turn it into something?

78
00:05:26,879 --> 00:05:31,240
That turned out to be the Tekton project later, and

79
00:05:31,279 --> 00:05:35,920
then it became the Tekton CI/CD system

80
00:05:36,560 --> 00:05:40,839
that we know today. So we actually got involved

81
00:05:40,839 --> 00:05:46,879
with that because of Jenkins because we couldn't actually make

82
00:05:46,920 --> 00:05:50,879
it cloud native at the time, so it felt like, oh,

83
00:05:50,920 --> 00:05:54,600
we should actually, you know, do something cloud native instead.

84
00:05:56,120 --> 00:06:01,680
So that's how the involvement with Jenkins went. But on the

85
00:06:01,720 --> 00:06:05,800
other hand, I would still bet that a big chunk

86
00:06:05,839 --> 00:06:11,279
of workloads on Kubernetes in the world is actually Jenkins.

87
00:06:16,160 --> 00:06:16,959
Speaker 1: Is there a survey on it?

88
00:06:17,240 --> 00:06:19,680
Speaker 3: I mean, I don't know, I don't have one for sure.

89
00:06:19,720 --> 00:06:23,079
That would be my guess. It's

90
00:06:23,120 --> 00:06:25,240
in double digits, I'm pretty sure.

91
00:06:25,680 --> 00:06:28,279
Speaker 2: I feel like this is where ignorance

92
00:06:28,399 --> 00:06:30,680
is bliss. Like, I'm going to pretend that's

93
00:06:30,720 --> 00:06:34,000
not happening and just continue living my life in whatever

94
00:06:34,279 --> 00:06:36,000
delusion I have currently going on.

95
00:06:37,079 --> 00:06:41,199
Speaker 3: Unfortunately, it is true. It's there, it's in many enterprises.

96
00:06:41,240 --> 00:06:47,839
You can't ignore Jenkins. It's hundreds of instances for sure. Yeah.

97
00:06:48,040 --> 00:06:53,399
Speaker 1: I have a love hate relationship with Jenkins. I hate

98
00:06:53,480 --> 00:06:56,000
working with it, but I love how it can just

99
00:06:56,079 --> 00:06:58,519
do absolutely anything you need it to do.

100
00:07:00,519 --> 00:07:07,120
Speaker 3: And the ecosystem, Yeah, like the ecosystem behind Jenkins. You know,

101
00:07:07,360 --> 00:07:11,759
I think nowadays GitHub Actions is maybe at that level,

102
00:07:11,839 --> 00:07:19,399
but the ecosystem behind Jenkins, it's pretty big.

103
00:07:19,560 --> 00:07:22,360
It's very important. You can find a plugin for

104
00:07:22,399 --> 00:07:26,600
anything essentially, yeah, or anything that you care about, right

105
00:07:27,920 --> 00:07:28,639
for sure.

106
00:07:29,959 --> 00:07:34,839
Speaker 1: So let's talk a little bit about AI/ML, since

107
00:07:34,879 --> 00:07:38,720
that's our topic for today. And, you know,

108
00:07:38,959 --> 00:07:43,879
whenever I think about this, the most common scenarios

109
00:07:43,920 --> 00:07:47,120
that I see coming up are people using ChatGPT

110
00:07:47,759 --> 00:07:50,560
or tools like that, And it seems like a very

111
00:07:53,240 --> 00:07:59,680
early stage product. So to get that enterprise ready, like,

112
00:07:59,720 --> 00:08:02,360
what are the challenges that people are facing?

113
00:08:03,879 --> 00:08:07,639
Speaker 3: Yeah, so I think ChatGPT, like, what we are

114
00:08:07,720 --> 00:08:10,920
going through right now is that in every enterprise

115
00:08:11,000 --> 00:08:14,920
there are one or two, let's call them experiments, that

116
00:08:15,040 --> 00:08:18,079
are going on. And usually when you're starting the experiment,

117
00:08:18,560 --> 00:08:20,560
the first thing that you do is like what is

118
00:08:20,600 --> 00:08:27,680
the path of least resistance, and that is to

119
00:08:28,360 --> 00:08:31,319
use the ChatGPT APIs, right, the OpenAI API,

120
00:08:31,600 --> 00:08:37,320
or Microsoft has the Azure APIs as well. So use those

121
00:08:37,759 --> 00:08:40,679
and do whatever you need to do and come up

122
00:08:40,720 --> 00:08:44,120
with your first early experiments. And when you look at

123
00:08:44,159 --> 00:08:49,600
those experiments in many of the organizations, what you see

124
00:08:49,720 --> 00:08:52,759
is there is a lot of success coming in.

125
00:08:53,120 --> 00:08:55,639
There's a lot of learning that they are going through,

126
00:08:56,159 --> 00:09:00,679
but there are also successful stories coming out of them.

127
00:09:01,240 --> 00:09:04,360
The one thing about AI is it's non-deterministic, right? It's

128
00:09:04,399 --> 00:09:07,960
like the first time that you do something with AI,

129
00:09:08,200 --> 00:09:11,720
you're going to fail, and then the second time you're going

130
00:09:11,759 --> 00:09:16,120
to do another iteration, because you're going to understand

131
00:09:16,200 --> 00:09:19,519
that behavior, and then you're going to do another iteration.

132
00:09:19,639 --> 00:09:23,240
I think ChatGPT and all the other APIs out there,

133
00:09:23,360 --> 00:09:30,519
the OpenAI API and similar AI APIs out there,

134
00:09:30,600 --> 00:09:36,639
I think they provide this opportunity to do the experimentation.

135
00:09:37,360 --> 00:09:39,720
But the way I see it is, this

136
00:09:39,759 --> 00:09:43,639
is temporary, right? This is just dipping your toes into

137
00:09:43,679 --> 00:09:48,879
the water. When you actually want to start doing something

138
00:09:50,519 --> 00:09:53,799
that will change your business and try to get some

139
00:09:54,000 --> 00:09:58,240
financial benefits to your company. That means integrating with many

140
00:09:58,320 --> 00:10:01,759
main systems. It's like we have seen

141
00:10:01,799 --> 00:10:04,159
it with cloud, we have seen it with mobile. Right,

142
00:10:04,200 --> 00:10:07,240
It's not a, oh, here's one application that

143
00:10:07,399 --> 00:10:11,679
is mobile, or here's one application that runs on the cloud.

144
00:10:12,159 --> 00:10:15,600
You're starting to adopt it to your organization or if

145
00:10:15,639 --> 00:10:18,440
you are even older, there was a time when

146
00:10:18,559 --> 00:10:23,200
we talked about the paperless office, right? So it's

147
00:10:23,240 --> 00:10:26,360
like it takes a little bit of time, it's a process,

148
00:10:26,440 --> 00:10:30,159
and you actually need to internalize it into your organization.

149
00:10:30,240 --> 00:10:33,840
I think the next step for many of these organizations

150
00:10:34,360 --> 00:10:37,960
is to start to internalize it. And when you're internalizing it,

151
00:10:38,000 --> 00:10:43,639
you cannot really depend on a third party service. And

152
00:10:43,759 --> 00:10:48,559
some of the industries cannot even share their data give

153
00:10:48,600 --> 00:10:54,320
their data to these services, where there are concerns about

154
00:10:54,440 --> 00:10:58,279
privacy and security and so on and so forth. So

155
00:10:58,320 --> 00:11:06,600
they need to start running these essentially base models internally

156
00:11:07,000 --> 00:11:11,919
in their infrastructure or in their cloud infrastructure so that

157
00:11:12,759 --> 00:11:16,000
they can get the benefits. And the one thing that

158
00:11:16,480 --> 00:11:21,480
I see is when you go to your data scientists

159
00:11:21,639 --> 00:11:33,720
or ML engineers, they are separate

160
00:11:34,480 --> 00:11:38,960
from the DevOps engineers in the company. They are not

161
00:11:39,080 --> 00:11:43,600
well integrated. They have a very different tool set. It's

162
00:11:43,679 --> 00:11:47,279
just because of the way the AI has been developed

163
00:11:47,320 --> 00:11:51,080
and coming into the world. And one thing that you

164
00:11:51,200 --> 00:11:54,600
notice is there is no shortage of open source projects

165
00:11:54,600 --> 00:11:57,639
and tools in the AI world, but there is a

166
00:11:57,679 --> 00:12:03,720
shortage of projects and open source tools in the AI

167
00:12:03,840 --> 00:12:08,799
world that are standards. Just think of

168
00:12:08,919 --> 00:12:12,240
any subject and you can find ten alternatives to do it.

169
00:12:13,000 --> 00:12:16,799
None of them talk to each other. Nothing is standard.

170
00:12:17,840 --> 00:12:21,559
So this is one of the things that we have noticed.

171
00:12:21,559 --> 00:12:26,200
We actually were trying to solve this for ourselves as well,

172
00:12:26,600 --> 00:12:29,840
and one of the things we decided was, okay,

173
00:12:29,919 --> 00:12:34,919
So I have so many ways of storing a model

174
00:12:35,919 --> 00:12:39,039
and moving that to production. I have so many ways

175
00:12:39,080 --> 00:12:43,759
of storing a data set and moving that to inference

176
00:12:43,919 --> 00:12:46,720
or training and testing and so on and so forth,

177
00:12:47,120 --> 00:12:50,519
but none of it is standard, like this should be

178
00:12:50,639 --> 00:12:53,559
easier than what it is right now. So what we

179
00:12:53,600 --> 00:12:57,480
did is we came up with an open source project

180
00:12:57,519 --> 00:13:02,200
called KitOps (kitops.ml). And what KitOps

181
00:13:02,360 --> 00:13:06,519
provides is what we call ModelKits. A ModelKit

182
00:13:06,679 --> 00:13:12,840
is essentially an OCI artifact, and an OCI artifact is like

183
00:13:13,039 --> 00:13:16,120
a container image. It's not a container image itself, but

184
00:13:16,200 --> 00:13:19,399
it's an OCI artifact. The advantage of being an OCI

185
00:13:19,639 --> 00:13:25,279
artifact is it can be stored in any OCI registry

186
00:13:25,600 --> 00:13:32,120
like Docker Hub, ECR, Azure, wherever your image registry is,

187
00:13:32,720 --> 00:13:36,960
and using the information that is stored in the model kit,

188
00:13:37,519 --> 00:13:44,759
you can actually move your models as part of your

189
00:13:44,840 --> 00:13:49,240
CI/CD pipelines much more efficiently, because your

190
00:13:49,480 --> 00:13:53,120
CI/CD pipelines usually know how

191
00:13:53,120 --> 00:13:57,919
to talk to OCI registries. For instance, today we

192
00:13:58,080 --> 00:14:04,120
have a registry that we are running ourselves at

193
00:14:04,279 --> 00:14:09,919
jozu.ml, and you will see many signed Model

194
00:14:10,000 --> 00:14:13,759
Kits in there. The tool that we are using to

195
00:14:13,840 --> 00:14:17,279
sign ModelKits is Cosign, which is what you would

196
00:14:17,440 --> 00:14:25,519
use for signing a regular container image. So

197
00:14:26,720 --> 00:14:33,320
the idea behind having the OCI artifacts was to introduce

198
00:14:33,399 --> 00:14:38,879
a standard way of storing and standard way of sharing

199
00:14:39,799 --> 00:14:48,879
your AI artifacts between data scientists and ML engineers and

200
00:14:49,240 --> 00:14:50,600
DevOps engineers.

201
00:14:51,240 --> 00:14:55,519
Speaker 2: You said something really interesting in there, and I

202
00:14:55,519 --> 00:14:57,960
haven't heard it put this way before: that your

203
00:14:58,080 --> 00:15:02,000
first foray into using any sort of AI or mL

204
00:15:02,120 --> 00:15:05,080
models in your company will be an experiment that ends

205
00:15:05,120 --> 00:15:09,200
in failure. And I, you know, I think there's something

206
00:15:09,639 --> 00:15:12,200
really to that, because I can imagine if you like

207
00:15:12,320 --> 00:15:15,600
walked around and put like a hammer on every engineer's

208
00:15:15,639 --> 00:15:20,039
desk and said, you will now start to develop with

209
00:15:20,120 --> 00:15:23,759
this hammer instead of what you were using before. I

210
00:15:23,799 --> 00:15:26,159
feel like it's a ridiculous analogy, but I think there's

211
00:15:26,200 --> 00:15:28,080
a lot of sense there, that that for sure

212
00:15:28,159 --> 00:15:31,000
will not end in success. Like, however you thought

213
00:15:31,840 --> 00:15:35,919
you could utilize it internally is probably not going to

214
00:15:35,960 --> 00:15:39,399
be super effective. And that obviously extends into the area

215
00:15:39,519 --> 00:15:43,080
of if you are building AI into your product, not

216
00:15:43,159 --> 00:15:46,679
just using it as a tool for development,

217
00:15:46,840 --> 00:15:49,320
and you're really talking about this next level which isn't

218
00:15:49,360 --> 00:15:53,039
just using it but actually creating it to be effective

219
00:15:53,039 --> 00:15:54,840
as part of the product offering.

220
00:15:56,279 --> 00:15:59,840
Speaker 3: Yeah. And the reason for that is it's really non-deterministic,

221
00:15:59,879 --> 00:16:04,120
right, with the AI itself. So you may

222
00:16:04,360 --> 00:16:09,559
actually think that, like, when we are integrating applications together,

223
00:16:10,960 --> 00:16:15,120
the rule-based applications, the outcome is very deterministic.

224
00:16:15,240 --> 00:16:18,159
We know that these are the inputs and these will

225
00:16:18,159 --> 00:16:21,759
be the outputs. In the case of AI, that's not

226
00:16:21,799 --> 00:16:26,679
the case, right, So when you go through your first

227
00:16:26,799 --> 00:16:29,759
initial cycle and, you know, you receive

228
00:16:29,799 --> 00:16:32,440
your inputs, and then you start to get your outputs,

229
00:16:32,679 --> 00:16:35,960
you start to see outputs that you hadn't expected before.

230
00:16:36,559 --> 00:16:39,840
And usually, like, for instance, it's very likely that

231
00:16:39,919 --> 00:16:43,879
you're missing actually some data that you're feeding into the model,

232
00:16:44,440 --> 00:16:48,840
and you're not getting the outputs that you're expecting. Because

233
00:16:49,240 --> 00:16:55,120
let's say that you're giving it the customer's ten data points, but

234
00:16:55,159 --> 00:16:58,440
then when you give the eleventh data point, the AI

235
00:16:58,639 --> 00:17:01,080
is now able to make the connection,

236
00:17:01,200 --> 00:17:04,440
or make the analysis better and give a better result.

237
00:17:04,759 --> 00:17:07,559
So you will start to notice things like that, and

238
00:17:07,640 --> 00:17:10,680
it is not very easy to notice this at the

239
00:17:10,759 --> 00:17:16,079
design phase. It's not even possible to notice these things

240
00:17:16,079 --> 00:17:19,200
at the design phase. So that's why I'm saying that

241
00:17:19,480 --> 00:17:22,799
your first experiment is going to fail, and you're gonna

242
00:17:22,839 --> 00:17:26,759
do another one with more data or less data or

243
00:17:27,240 --> 00:17:30,440
changed data, and then you will get to a state

244
00:17:30,519 --> 00:17:33,319
where you're happy with the outputs.

245
00:17:33,559 --> 00:17:35,960
Speaker 2: I have a fear here now that, like,

246
00:17:36,039 --> 00:17:38,799
you know, if I'm incorporating it into one of our products.

247
00:17:39,039 --> 00:17:41,680
You know, we have a couple and we have tried

248
00:17:41,799 --> 00:17:44,440
going down the route of adding some AI into

249
00:17:44,480 --> 00:17:47,799
one of them. There is this, too: when we say

250
00:17:47,839 --> 00:17:50,880
non-deterministic, we mean like literally the same inputs give

251
00:17:50,920 --> 00:17:55,119
us different outputs. And exactly, I wonder if we're training

252
00:17:55,200 --> 00:17:58,039
our customers to believe that they should just you know,

253
00:17:59,440 --> 00:18:01,720
turn it off and turn it on again in a sense, right,

254
00:18:01,759 --> 00:18:03,839
Like you know, you have to keep repeating the same

255
00:18:03,920 --> 00:18:06,880
question over and over again to get the answer you want.

256
00:18:06,920 --> 00:18:09,799
And that just feels like a dangerous thing to

257
00:18:09,799 --> 00:18:13,319
push that expectation onto the users of our products.

258
00:18:15,400 --> 00:18:19,519
Speaker 3: Yeah, I see what you're saying,

259
00:18:19,559 --> 00:18:23,920
and I think that's the part

260
00:18:23,960 --> 00:18:27,759
of AI that still needs to be enhanced

261
00:18:28,000 --> 00:18:33,480
a little bit. You know, the whole subject of

262
00:18:34,039 --> 00:18:42,519
guardrails on AI, it feels a little bit early.

263
00:18:42,559 --> 00:18:44,640
Oh, I'm totally with you.

264
00:18:44,640 --> 00:18:47,279
Speaker 2: No, there was actually a study that was released just recently,

265
00:18:47,319 --> 00:18:50,240
and I think we'll kind of appreciate this: it's

266
00:18:50,279 --> 00:18:54,720
not being abused. Like, prompt engineering isn't being used for malicious purposes.

267
00:18:54,799 --> 00:18:58,279
It's generally being used by power users to actually get

268
00:18:58,359 --> 00:19:01,400
value out of the system more than anything, which I

269
00:19:01,400 --> 00:19:04,599
think is really surprising that these systems aren't being abused

270
00:19:04,640 --> 00:19:09,000
in that way realistically yet, And so adding in guardrails

271
00:19:09,039 --> 00:19:12,240
of anything is just preventing those early adopters from being

272
00:19:12,240 --> 00:19:15,240
able to utilize your product more effectively, rather than preventing

273
00:19:15,279 --> 00:19:16,240
malicious attackers.

274
00:19:16,319 --> 00:19:19,640
Speaker 3: Right. And actually, since you said prompt engineering,

275
00:19:19,720 --> 00:19:26,039
we actually talked with an enterprise recently and one of

276
00:19:26,119 --> 00:19:33,279
their needs was to share their prompts. Like, think

277
00:19:33,319 --> 00:19:35,559
of it this way. You have a database of all

278
00:19:35,599 --> 00:19:40,519
your enterprise's data, all your customer data, and

279
00:19:40,559 --> 00:19:43,279
so on and so forth. In the past, the SQL

280
00:19:43,519 --> 00:19:47,680
statements that you used for finance were very different

281
00:19:47,720 --> 00:19:51,000
from the SQL statements that you used for sales,

282
00:19:51,240 --> 00:19:56,640
so you'd never really care to share the SQL statements.

283
00:19:57,079 --> 00:19:59,440
But now what they're seeing is, as you said, there

284
00:19:59,440 --> 00:20:03,079
are these users who are using these prompts and getting

285
00:20:03,119 --> 00:20:07,039
really good results out of those prompts that are querying

286
00:20:07,079 --> 00:20:10,640
or that are connected to their data, right?

287
00:20:11,079 --> 00:20:13,000
So what they want to be able to do is

288
00:20:13,039 --> 00:20:16,519
to be able to share these prompts, like, between

289
00:20:16,680 --> 00:20:21,079
finance and HR and sales and

290
00:20:21,119 --> 00:20:23,920
so on and so forth, so that they don't have

291
00:20:24,039 --> 00:20:29,440
to reinvent the wheel every time. But I think that

292
00:20:29,680 --> 00:20:32,720
we're starting to see a different picture now in the

293
00:20:33,079 --> 00:20:36,200
in the AI world with these kinds of things.

294
00:20:36,240 --> 00:20:42,160
But when you are opening your data through an

295
00:20:42,160 --> 00:20:48,119
AI model directly to your customers, I think there's still

296
00:20:48,359 --> 00:20:55,359
work that needs to be done on the guardrails. Yes, yeah,

297
00:20:55,400 --> 00:20:55,720
for sure.

298
00:20:55,759 --> 00:20:59,759
Speaker 1: It reminds me a lot of, there are two things here

299
00:20:59,799 --> 00:21:03,000
that seem like lessons we can learn here

300
00:21:03,039 --> 00:21:08,559
from other engineering or other industries. Like whenever I work

301
00:21:08,599 --> 00:21:12,960
with early stage startups, I always say that like the

302
00:21:13,000 --> 00:21:19,319
goal of your first product as a startup is to launch,

303
00:21:20,599 --> 00:21:24,480
Let me rephrase that: to become a successful startup,

304
00:21:24,519 --> 00:21:27,319
you have to identify the difference between the product that

305
00:21:27,400 --> 00:21:30,759
you built and the product that your customers thought you

306
00:21:30,880 --> 00:21:34,000
built and close that gap. And it feels like this

307
00:21:34,240 --> 00:21:40,440
is very similar to that. You know, because AI will

308
00:21:40,519 --> 00:21:43,920
answer whatever question you ask it, and then you have

309
00:21:44,039 --> 00:21:49,279
to figure out what assumptions it made that you didn't

310
00:21:49,359 --> 00:21:53,599
want it to make. And that's where I think really

311
00:21:53,640 --> 00:21:57,240
sharing those prompt models can be helpful. For

312
00:21:57,279 --> 00:21:59,759
a while there on X there was a trend where

313
00:21:59,759 --> 00:22:03,799
people were, you know, sharing what they were doing with

314
00:22:03,839 --> 00:22:06,640
different AI models and then also sharing the prompts that

315
00:22:06,640 --> 00:22:09,319
they used too, And I found that really interesting to

316
00:22:09,400 --> 00:22:13,200
see the different caveats that they would give in the

317
00:22:13,279 --> 00:22:16,880
prompt to keep it from wandering off some other path

318
00:22:17,079 --> 00:22:19,960
that gave you an answer that looked right but

319
00:22:20,279 --> 00:22:21,920
wasn't actually the answer that you needed.

320
00:22:22,440 --> 00:22:24,559
Speaker 2: No, you say that, and it's actually really ridiculous.

321
00:22:25,160 --> 00:22:27,519
Like, I'm going through this right now. I'm making a presentation

322
00:22:27,720 --> 00:22:32,480
for the European AWS re:Invent that I'm speaking at, and

323
00:22:33,799 --> 00:22:38,400
I want some pictures, and I'm using

324
00:22:38,799 --> 00:22:40,400
one of the tools, and I will ask the same

325
00:22:40,440 --> 00:22:42,079
question multiple times. But when I get a picture that

326
00:22:42,119 --> 00:22:43,799
I actually like, I will take the prompt that I

327
00:22:43,960 --> 00:22:45,880
used for it, and I will save it as the

328
00:22:45,920 --> 00:22:49,000
file name of the picture. Like,

329
00:22:49,039 --> 00:22:51,400
I mean, it's ridiculous, but that's the name for me,

330
00:22:51,519 --> 00:22:53,400
that really is the thing that identifies it, so that

331
00:22:53,440 --> 00:22:55,720
I can, you know, potentially go back later and be like,

332
00:22:55,759 --> 00:22:58,960
what did I use to actually generate that image?

333
00:23:00,279 --> 00:23:04,039
Speaker 3: So let me complicate that problem a little bit. So

334
00:23:04,160 --> 00:23:08,480
you have, you're using ChatGPT, for instance.

335
00:23:09,680 --> 00:23:11,960
Speaker 2: It's not the one from OpenAI, but yeah, for sure,

336
00:23:11,960 --> 00:23:13,559
it's an LLM I'm using.

337
00:23:14,359 --> 00:23:18,559
Speaker 3: You're using it, and it's actually an AI-as-a-service.

338
00:23:19,240 --> 00:23:24,599
So you don't really know the version number. You

339
00:23:24,599 --> 00:23:27,720
don't really get to know the data sets that have

340
00:23:27,799 --> 00:23:30,599
been used and retrained and so on and so forth. So

341
00:23:30,839 --> 00:23:33,279
for some of the organizations that's not going to work.

342
00:23:33,720 --> 00:23:36,559
So some organizations are going to say, hey, you know what,

343
00:23:37,160 --> 00:23:41,119
I need to know exactly the model, the data set

344
00:23:41,200 --> 00:23:44,240
that it was trained on, and I need to know

345
00:23:44,960 --> 00:23:50,480
where my fine-tuning is coming from, where my RAG

346
00:23:50,680 --> 00:23:55,000
is coming from. So I need all the lineage of

347
00:23:55,000 --> 00:24:02,480
this when this prompt was issued, right?

348
00:24:02,880 --> 00:24:10,359
So you need systems that are capable of telling precisely

349
00:24:11,000 --> 00:24:14,599
what was the data set, what is the model, what

350
00:24:14,759 --> 00:24:18,599
is the fine-tuning data set, right? And,

351
00:24:18,839 --> 00:24:21,319
if you have been using RAG, what is

352
00:24:22,119 --> 00:24:26,799
the version of the vector database or the lineage of

353
00:24:26,839 --> 00:24:33,440
the vectors. All of this stored and available somewhere because

354
00:24:35,000 --> 00:24:38,240
something may happen at any point saying that, oh, you

355
00:24:38,319 --> 00:24:41,640
need to go back and figure out what went wrong

356
00:24:41,920 --> 00:24:47,079
with this answer, what went wrong with this today. Like,

357
00:24:47,319 --> 00:24:51,559
if you're using an API, you can't really say

358
00:24:51,559 --> 00:24:59,559
what went wrong. And so if you think about it

359
00:24:59,559 --> 00:25:03,440
this way, we actually have a system that was used

360
00:25:03,480 --> 00:25:07,319
for this purpose. We used SBOMs to be able

361
00:25:07,359 --> 00:25:12,880
to say exactly what we have used in production for

362
00:25:13,000 --> 00:25:18,200
our applications. Right, so can we actually utilize the same

363
00:25:18,319 --> 00:25:24,000
system with AI as well? Like, oh, I'm using the

364
00:25:24,039 --> 00:25:28,279
application XYZ, it's built out of these dependencies,

365
00:25:28,359 --> 00:25:30,920
these libraries and so on and so forth. An AI

366
00:25:31,079 --> 00:25:36,559
application is not too different. It's basically saying that, oh,

367
00:25:36,680 --> 00:25:41,559
I'm using the same application, the same dependencies. I'm using

368
00:25:41,960 --> 00:25:45,279
the inference engine XYZ, and then I'm using

369
00:25:45,279 --> 00:25:49,000
the model weights that are coming from here. I'm using

370
00:25:49,000 --> 00:25:52,960
a data set that is coming

371
00:25:53,039 --> 00:25:57,720
from here. And these are all in an SBOM,

372
00:25:57,799 --> 00:26:03,799
which existing systems know how to parse and get results from.

373
00:26:04,200 --> 00:26:06,839
And we know how to sign SBOMs, we know

374
00:26:06,960 --> 00:26:11,799
how to store SBOMs in OCI artifacts. So

375
00:26:13,079 --> 00:26:16,680
I think that's kind of, you know, the missing

376
00:26:16,720 --> 00:26:21,200
link nowadays where we are doing these things in our enterprises,

377
00:26:22,799 --> 00:26:28,680
adopting base models from somewhere, and none of the

378
00:26:28,799 --> 00:26:31,440
data sets that we are using is enough. So

379
00:26:32,160 --> 00:26:37,000
one of our engineers just goes to Google and searches

380
00:26:37,000 --> 00:26:40,200
for a data set and uses that to you know,

381
00:26:40,319 --> 00:26:43,400
do some of the training. You don't know what is

382
00:26:43,440 --> 00:26:45,720
in that data set. So it's a little bit

383
00:26:45,759 --> 00:26:50,559
Wild West at the moment with AI adoption. So

384
00:26:51,200 --> 00:26:56,680
And when it was the Wild West for the regular applications,

385
00:26:57,200 --> 00:27:02,359
you know, we ended up with things like,

386
00:27:02,480 --> 00:27:05,920
what was it,

387
00:27:06,359 --> 00:27:13,599
Log4j, right? So how many days

388
00:27:14,119 --> 00:27:17,000
did many Java developers spend trying to figure out if

389
00:27:17,000 --> 00:27:19,440
they had used the wrong version of Log4j

390
00:27:20,039 --> 00:27:25,920
when that came out, right? So things happened.

391
00:27:26,759 --> 00:27:30,720
And I think we need to be prepared for it.

392
00:27:30,799 --> 00:27:33,240
And it's a little bit Wild West at the

393
00:27:33,279 --> 00:27:47,519
moment with AI.
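
A minimal sketch of what such an AI-aware SBOM could look like. This uses CycloneDX 1.5, which added machine-learning-model and data component types; all names and versions below are hypothetical.

    {
      "bomFormat": "CycloneDX",
      "specVersion": "1.5",
      "components": [
        { "type": "application",            "name": "support-assistant",    "version": "2.3.1" },
        { "type": "library",                "name": "inference-engine-xyz", "version": "0.8.0" },
        { "type": "machine-learning-model", "name": "base-model-finetuned", "version": "2024-05-01" },
        { "type": "data",                   "name": "fine-tuning-dataset",  "version": "v7" }
      ]
    }
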

394
00:27:41,319 --> 00:27:43,960
Speaker 1: You know, as we talk about this more, it

395
00:27:44,079 --> 00:27:48,039
kind of feels like it's almost like a

396
00:27:48,119 --> 00:27:52,720
communication problem. One of the things you said was knowing

397
00:27:52,759 --> 00:27:55,720
the lineage of the data, and when it comes

398
00:27:55,799 --> 00:27:59,759
to these AI models, we have a very human feeling

399
00:28:00,119 --> 00:28:04,240
interface to it, and so I think we naturally

400
00:28:05,319 --> 00:28:09,799
try to leverage that. But the difference being, if

401
00:28:09,839 --> 00:28:14,400
I ask either of you who was the last president,

402
00:28:15,599 --> 00:28:18,839
you probably are going to assume that I'm talking about

403
00:28:18,920 --> 00:28:21,839
current time. But if I'm talking to a data set

404
00:28:21,880 --> 00:28:24,039
and that data set is twenty years old and I

405
00:28:24,079 --> 00:28:26,240
ask it who the last president is, I'm going to

406
00:28:26,279 --> 00:28:29,720
get the wrong answer. And I feel like it's those

407
00:28:29,799 --> 00:28:33,960
hidden assumptions that we have when we communicate that make

408
00:28:34,079 --> 00:28:37,039
us have to be overly verbose with these tools.

409
00:28:37,319 --> 00:28:40,279
Speaker 2: See, I don't think AI is special here.

410
00:28:40,400 --> 00:28:44,240
I think that our own culture and values propagate much

411
00:28:44,319 --> 00:28:49,160
too far into it, as well as our neurodiversity.

412
00:28:49,319 --> 00:28:52,200
Like I'm on some spectrum somewhere and you say, who

413
00:28:52,240 --> 00:28:55,160
is the last president? You know, what I heard

414
00:28:55,359 --> 00:28:59,839
was, like, the last president there will ever be, as

415
00:28:59,680 --> 00:29:01,680
Speaker 4: if, you know, you're talking about the end of

416
00:29:01,839 --> 00:29:03,920
history. And like, you know, if I look at

417
00:29:05,319 --> 00:29:08,039
today's, you know, political climate,

418
00:29:08,119 --> 00:29:09,759
you know, that is, you know, how I am.

419
00:29:09,799 --> 00:29:12,079
Speaker 2: I'm evaluating that based off you know, the model that

420
00:29:12,119 --> 00:29:14,480
I've trained in myself, and I can also imagine like

421
00:29:14,519 --> 00:29:17,960
for other cultures out there, depending on who you're talking to,

422
00:29:18,359 --> 00:29:20,279
you know, I feel like in the West or in

423
00:29:20,319 --> 00:29:22,799
the United States, it may be much more clear. But

424
00:29:22,839 --> 00:29:25,480
you know, depending on who you're talking to and where

425
00:29:25,519 --> 00:29:28,519
they've come from, they're gonna have much different answers

426
00:29:28,559 --> 00:29:31,240
on that. And I feel like we may be more

427
00:29:31,279 --> 00:29:36,680
closely able to understand what some of those implicit inputs

428
00:29:36,720 --> 00:29:40,039
into that human model there are based off of what

429
00:29:40,079 --> 00:29:42,359
we know about them or the current world. And I

430
00:29:42,400 --> 00:29:44,960
think we definitely do throw a lot of that out

431
00:29:45,000 --> 00:29:48,880
the window when we are utilizing a prompt sitting on

432
00:29:49,000 --> 00:29:50,039
a website somewhere.

433
00:29:52,240 --> 00:29:53,720
Speaker 3: Yeah, for sure.

434
00:29:55,400 --> 00:30:04,640
Speaker 1: So what are the big challenges that

435
00:30:04,640 --> 00:30:08,640
we're facing with this from a DevOps perspective? Like,

436
00:30:08,680 --> 00:30:11,839
how do we set the framework and

437
00:30:11,839 --> 00:30:15,680
guardrails to get that tool from the AI developers into

438
00:30:15,720 --> 00:30:17,160
a production-grade capacity?

439
00:30:18,319 --> 00:30:21,640
Speaker 3: So I guess the first one is, you know, have

440
00:30:21,720 --> 00:30:24,920
you packaged this, have you stored it in a manner that can be shared?

441
00:30:25,039 --> 00:30:28,240
And you know, we have one solution with KitOps

442
00:30:28,440 --> 00:30:32,359
for it. The second challenge that we have noticed and

443
00:30:32,559 --> 00:30:39,519
started working on is, AI is more non-deterministic and it's

444
00:30:39,559 --> 00:30:43,279
more complex. So when you think about a workflow that

445
00:30:43,319 --> 00:30:49,319
you use in DevOps, it's more linear, and there is

446
00:30:49,359 --> 00:30:53,160
a starting point and there's an endpoint, and there's an

447
00:30:53,359 --> 00:30:57,039
artifact that you can record and send to production

448
00:30:57,240 --> 00:30:59,759
and so on and so forth. With AI, it works

449
00:30:59,759 --> 00:31:02,799
a little bit differently, because you send it to production,

450
00:31:02,960 --> 00:31:10,720
but after that, you keep on watching, because of drift:

451
00:31:10,759 --> 00:31:15,480
the data changes, the model changes, you know, twenty years pass,

452
00:31:15,559 --> 00:31:19,000
the president changes and so on and so forth. So

453
00:31:21,240 --> 00:31:23,559
you start to notice these drifts and you need to

454
00:31:23,599 --> 00:31:28,599
start to constantly observe the models and then react to

455
00:31:28,680 --> 00:31:32,720
their changes, and not just the models themselves.

456
00:31:32,839 --> 00:31:36,599
Before, you know, the training needs to be, like, even

457
00:31:36,640 --> 00:31:40,880
if you're doing fine-tuning, whether your fine-tuned model

458
00:31:41,240 --> 00:31:45,960
is adequate or not, that's actually a bunch of tests

459
00:31:46,160 --> 00:31:49,720
that need to be run and then compared, and so

460
00:31:49,759 --> 00:31:52,160
on and so forth. So the one thing that we

461
00:31:52,279 --> 00:31:55,680
have noticed is this is almost like there's a lot

462
00:31:55,720 --> 00:32:00,440
of signals that are coming in from these DevOps systems

463
00:32:00,480 --> 00:32:01,119
that we're going to

464
00:32:01,240 --> 00:32:06,839
Speaker 5: Use for training, inference, production and observability of the AI,

465
00:32:07,400 --> 00:32:11,599
but there isn't a real solution out there that reacts

466
00:32:11,680 --> 00:32:15,079
to it, that brings everything together so that you can

467
00:32:15,079 --> 00:32:21,079
Speaker 3: Use these things together in a more orderly fashion or easier,

468
00:32:21,640 --> 00:32:30,000
integrate them more easily, or use them in a

469
00:32:30,039 --> 00:32:35,680
different way. So, I don't know if this

470
00:32:35,759 --> 00:32:40,519
is the right language to use, but what we said was, oh,

471
00:32:40,799 --> 00:32:44,000
let's build a control plane for this so that it

472
00:32:44,160 --> 00:32:49,480
receives all these signals from all your AI/ML applications,

473
00:32:49,519 --> 00:32:54,079
including everything that is coming in from observability, from your

474
00:32:54,119 --> 00:32:59,160
CI/CD pipelines, your training pipelines, your dataset changes,

475
00:32:59,240 --> 00:33:04,799
your data extractions and so forth. And in the control

476
00:33:04,839 --> 00:33:09,000
plane we define what needs to be and then the

477
00:33:09,000 --> 00:33:13,039
control plane reacts to these signals and runs the data

478
00:33:13,119 --> 00:33:16,920
plane components, like the CI/CD and so forth, or

479
00:33:16,960 --> 00:33:22,839
your deployments in such a way that the system tries

480
00:33:22,880 --> 00:33:27,799
to get back to the state that you want it

481
00:33:27,839 --> 00:33:31,119
to be in. So we started working on a control plane

482
00:33:31,240 --> 00:33:37,240
for AI and ML applications so that some of the

483
00:33:39,319 --> 00:33:43,759
some of the challenges of running and integrating AI and

484
00:33:43,880 --> 00:33:49,039
ML into an enterprise are boxed into a control plane, essentially.
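
A rough sketch of the reconciliation idea behind such a control plane, in Python. Every function, threshold, and signal name here is a hypothetical placeholder; a real system would consume signals from observability and pipeline systems rather than a stub.

    import time

    DRIFT_THRESHOLD = 0.2  # hypothetical acceptable drift score

    def read_signals():
        # Stub: in practice, gather drift metrics, eval results, and
        # dataset versions from observability and CI/CD systems.
        return {"drift_score": 0.35, "eval_passed": True, "dataset_version": "v8"}

    def trigger_pipeline(name, **params):
        # Stub: in practice, kick off a data-plane component,
        # such as a training or deployment pipeline.
        print(f"triggering data-plane pipeline: {name} {params}")

    def reconcile():
        # Compare observed signals against the desired state and react.
        signals = read_signals()
        if signals["drift_score"] > DRIFT_THRESHOLD:
            trigger_pipeline("retrain", dataset=signals["dataset_version"])
        elif not signals["eval_passed"]:
            trigger_pipeline("rollback")

    if __name__ == "__main__":
        while True:
            reconcile()
            time.sleep(60)  # re-evaluate periodically
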

485
00:33:50,400 --> 00:33:53,960
Speaker 2: Maybe this is a controversial statement to make. I get

486
00:33:53,960 --> 00:33:59,559
the sense that our current capabilities for generating new models

487
00:34:00,240 --> 00:34:02,880
restrict those models in a way which makes them not

488
00:34:03,240 --> 00:34:08,960
very good at reasoning or analysis. And if they're not

489
00:34:09,000 --> 00:34:12,079
good at that, then what we're left with is pretty

490
00:34:12,159 --> 00:34:17,440
much data replay, reconstruction, content creation. But we know that

491
00:34:17,519 --> 00:34:20,039
the models that we're creating, they're not good at that

492
00:34:20,360 --> 00:34:26,000
because they suffer from hallucinations from making up stuff by

493
00:34:26,159 --> 00:34:29,880
combining different pieces. I think I read somewhere that the

494
00:34:29,920 --> 00:34:34,400
summarization strategy that a lot of models have taken in

495
00:34:34,440 --> 00:34:37,920
their creation is to look at the headlines of articles

496
00:34:37,960 --> 00:34:41,800
that have been written and then consume the content of

497
00:34:41,840 --> 00:34:44,280
the article and index that and then do a reverse

498
00:34:44,320 --> 00:34:46,639
index lookup. So, you know, hey, here's my article,

499
00:34:46,679 --> 00:34:48,880
what summarizes it for me? It will spit back

500
00:34:48,880 --> 00:34:51,119
out the title. And if we look at the content

501
00:34:51,159 --> 00:34:54,719
that humans have created so far, titles are all clickbaity,

502
00:34:54,960 --> 00:34:57,039
you know, have nothing to do with the conclusion. And

503
00:34:57,119 --> 00:35:00,239
so even if they were right, you know, it's

504
00:35:00,239 --> 00:35:03,840
still not very accurate. And so, like, I'm

505
00:35:03,840 --> 00:35:06,400
wondering if we are not still a bit too early

506
00:35:06,599 --> 00:35:09,519
in trying to automate or improve some of these things

507
00:35:09,559 --> 00:35:13,320
when the base thing we're getting out still is fundamentally

508
00:35:13,360 --> 00:35:14,679
limited in capability.

509
00:35:16,599 --> 00:35:22,400
Speaker 3: I don't think so. Like, I think I need

510
00:35:22,440 --> 00:35:25,119
to be very careful about this because

511
00:35:25,199 --> 00:35:29,000
I don't want to add to the hype. Yeah, because

512
00:35:29,559 --> 00:35:32,440
you know, I think the AI is overhyped

513
00:35:32,760 --> 00:35:33,400
at the moment.

514
00:35:33,559 --> 00:35:35,920
Speaker 2: No, No, definitely not.

515
00:35:37,760 --> 00:35:47,920
Speaker 3: Well, yep, I know, it's a shock to me too,

516
00:35:51,719 --> 00:35:55,239
So it is a little bit hyped, let's call it.

517
00:35:56,639 --> 00:36:01,199
But on the other hand, I don't think so. With the

518
00:36:01,239 --> 00:36:05,760
capabilities... when we talk about the hype

519
00:36:05,800 --> 00:36:08,280
of AI, we are usually talking about

520
00:36:08,320 --> 00:36:10,719
gen AI, right, the LLMs and

521
00:36:10,840 --> 00:36:15,119
generative AI. But in reality there are also other

522
00:36:15,280 --> 00:36:20,079
kinds of AI and ML as well, that

523
00:36:20,159 --> 00:36:23,119
we don't usually talk about, but that are actually getting a

524
00:36:23,119 --> 00:36:27,360
lot of benefit from this hype, essentially. So when I

525
00:36:27,440 --> 00:36:34,679
think about that whole space as

526
00:36:32,920 --> 00:36:40,039
one piece of technology area, I

527
00:36:40,079 --> 00:36:43,639
think that

528
00:36:43,679 --> 00:36:47,320
we can start adopting AI and

529
00:36:47,599 --> 00:36:52,599
ML projects into our enterprises. Even with generative AI,

530
00:36:52,719 --> 00:36:57,360
which is what you were referring to, essentially, you can

531
00:36:57,400 --> 00:37:00,239
get really good results out of it. Like, there was

532
00:37:00,280 --> 00:37:04,320
one article, I think it was not even an article.

533
00:37:04,320 --> 00:37:09,079
It was Walmart reporting their quarterly results. They basically said

534
00:37:09,480 --> 00:37:15,159
you know, we are more efficient with our online business

535
00:37:15,320 --> 00:37:19,639
because of generative AI. I think they were generating

536
00:37:21,079 --> 00:37:30,519
product descriptions more efficiently or something, so they actually

537
00:37:30,599 --> 00:37:36,280
started doing better financially. So there are pockets of even

538
00:37:36,559 --> 00:37:39,679
direct financial benefit that we are starting to see

539
00:37:39,840 --> 00:37:44,400
from generative AI and from AI/ML and other kinds

540
00:37:44,400 --> 00:37:48,039
of AI. In the

541
00:37:48,280 --> 00:37:51,480
last five years, there have always been projects that were

542
00:37:51,519 --> 00:37:56,519
bringing benefits to the business. What is changing with the

543
00:37:56,639 --> 00:38:00,679
hype is now there are more eyeballs looking into this

544
00:38:00,840 --> 00:38:07,119
area so that there is an accelerated opportunity for adoption.

545
00:38:08,480 --> 00:38:11,320
You know, in the past you had a chance to

546
00:38:12,199 --> 00:38:16,199
get one base model for something for categorizations, let's say,

547
00:38:16,559 --> 00:38:20,079
but now you have ten, and you can actually test

548
00:38:20,119 --> 00:38:24,840
the ten categorization base models and then get a better

549
00:38:25,280 --> 00:38:29,079
fit for your data. So there are things like that

550
00:38:29,079 --> 00:38:33,559
that are actually happening that are going to, I think, help

551
00:38:33,639 --> 00:38:40,960
the enterprises and the businesses get more value out

552
00:38:40,960 --> 00:38:47,719
of AI and ML. So I guess generative AI is

553
00:38:48,039 --> 00:38:51,679
a little bit too much hyped up, but that is

554
00:38:51,679 --> 00:38:54,159
not a bad thing for the rest of the AI world.

555
00:38:54,320 --> 00:38:56,679
Speaker 2: I mean, you say that, and now you have me

556
00:38:56,760 --> 00:39:01,760
thinking, in maybe a more philosophical perspective: are we

557
00:39:01,800 --> 00:39:05,280
just lacking a number of people paying more attention and

558
00:39:05,320 --> 00:39:10,599
investing their resources in actually researching or building up the

559
00:39:10,639 --> 00:39:11,840
technology more.

560
00:39:12,960 --> 00:39:13,719
Speaker 3: It is, it is.

561
00:39:13,800 --> 00:39:17,360
Speaker 2: Are the limitations that we're facing something fundamentally within the

562
00:39:17,360 --> 00:39:21,960
capability of human society to overcome? Or is reaching

563
00:39:22,960 --> 00:39:34,079
general artificial intelligence actually a completely made-up dream? Yeah. And

564
00:39:34,280 --> 00:39:36,000
for the record, this is not a question I expect

565
00:39:36,039 --> 00:39:37,480
you to have an accurate answer to.

566
00:39:37,440 --> 00:39:41,880
Speaker 3: Well, I don't have an accurate answer. Of course,

567
00:39:41,960 --> 00:39:46,880
I don't think anyone has, though they may have an

568
00:39:46,920 --> 00:39:51,920
interpretation or expectation for it. But to be honest,

569
00:39:53,239 --> 00:39:58,039
the amount of compute that we have been using to

570
00:39:58,159 --> 00:40:06,719
train models, even the small ones, is enormous. For

571
00:40:06,800 --> 00:40:14,159
us to get to a place where we have more

572
00:40:14,239 --> 00:40:19,400
capable models, I think it will take some kind of

573
00:40:19,440 --> 00:40:25,920
a leap; something needs to happen. The amount of

574
00:40:26,039 --> 00:40:29,360
compute that we need for that is just enormous

575
00:40:29,400 --> 00:40:32,679
at the moment. And the other thing that I'm actually

576
00:40:32,719 --> 00:40:36,840
worried about is without actually getting adoption and value, how

577
00:40:36,960 --> 00:40:43,199
much can we continue to spend on training more and

578
00:40:43,440 --> 00:40:46,000
more capable AI models at some point?

579
00:40:46,119 --> 00:40:49,400
Speaker 2: I don't know if it's actually a monetary problem.

580
00:40:49,760 --> 00:40:53,199
I understand that we're actually resource constrained, and I'm wondering

581
00:40:53,239 --> 00:40:56,360
if those chips are coming from Taiwan, which has

582
00:40:56,360 --> 00:40:59,119
its own sort of issues, and we don't have another

583
00:40:59,360 --> 00:41:03,119
opportunity to actually produce what is necessary in order to

584
00:41:03,320 --> 00:41:07,159
accelerate further. It's like a fundamental limitation: we're limited

585
00:41:07,239 --> 00:41:10,360
not by our ability or by the technology's ability, but

586
00:41:10,840 --> 00:41:15,199
realistically by our actual capability of resource allocation.

587
00:41:14,920 --> 00:41:19,000
Speaker 3: Produce fast enough. Yeah. And also not to forget the data,

588
00:41:19,079 --> 00:41:23,280
because the way that the AIs are trained today, it's

589
00:41:24,039 --> 00:41:28,519
throwing large amounts of data at them, and we've thrown

590
00:41:28,679 --> 00:41:31,039
pretty much everything that we can so far.

591
00:41:31,199 --> 00:41:34,119
Speaker 2: I mean, that's its own sort of problem,

592
00:41:34,159 --> 00:41:36,840
because the Internet is sort of over as far as

593
00:41:37,000 --> 00:41:40,039
what it used to be: a free, public sharing

594
00:41:40,119 --> 00:41:43,039
of knowledge. That's no longer the thing, and so we're

595
00:41:43,079 --> 00:41:46,320
only going to have access to a smaller pool of

596
00:41:46,360 --> 00:41:50,519
information going forward for future models. That data now

597
00:41:50,599 --> 00:41:54,679
finally has a price on it, and realistically, as

598
00:41:54,679 --> 00:41:58,000
you point out, it's not that much, and it's not

599
00:41:58,119 --> 00:41:59,719
even that good. If this is the best we can

600
00:41:59,760 --> 00:42:01,800
come out with at this point. So it

601
00:42:01,840 --> 00:42:05,159
does seem like we're at some sort of obstacle to

602
00:42:05,440 --> 00:42:09,519
get further, both from a data creation standpoint and a

603
00:42:09,559 --> 00:42:10,920
resource creation standpoint.

604
00:42:11,639 --> 00:42:17,159
Speaker 3: Right. But to be honest, like, general AI is

605
00:42:17,239 --> 00:42:24,280
a very good ambition to have, but

606
00:42:24,519 --> 00:42:29,880
most enterprises and businesses do not need it.

607
00:42:30,000 --> 00:42:35,599
What we have today with their data is plenty

608
00:42:35,639 --> 00:42:40,960
for them to actually get financial benefits and efficiencies and

609
00:42:41,159 --> 00:42:45,440
value out of AI. And, like, I

610
00:42:45,440 --> 00:42:48,840
think there are two facets to this. There is, like, oh,

611
00:42:48,880 --> 00:42:54,119
there's these ambitions which we should actually try to pursue.

612
00:42:54,639 --> 00:42:57,920
But then there is oh, can we actually start adopting

613
00:42:58,320 --> 00:43:02,719
the technology that we have generated over these years so

614
00:43:02,760 --> 00:43:08,039
that we start creating value for our businesses and communities, right?

615
00:43:08,559 --> 00:43:15,480
So I don't think the two are as connected

616
00:43:15,519 --> 00:43:17,880
as we think. We don't need to wait for the

617
00:43:18,000 --> 00:43:19,360
first to do the second?

618
00:43:19,480 --> 00:43:22,920
Speaker 2: Yeah, yeah, for sure. Where is AI adoption

619
00:43:23,079 --> 00:43:25,920
on the Will technology-adoption scale?

620
00:43:29,000 --> 00:43:35,440
Speaker 1: Not really a priority for me just

621
00:43:35,519 --> 00:43:38,800
yet, but I think, just because of the industry

622
00:43:38,840 --> 00:43:43,880
that I'm in, for me, the most value that it

623
00:43:44,000 --> 00:43:48,679
has is like a personal value of using it to

624
00:43:50,239 --> 00:43:52,679
remember the things that I've forgotten, you know, like how

625
00:43:52,719 --> 00:43:55,719
do I, you know, how do I do an async

626
00:43:55,880 --> 00:43:58,360
await in JavaScript, or things like that. From a

627
00:43:58,400 --> 00:44:01,559
business perspective, I don't personally have any use cases

628
00:44:02,239 --> 00:44:07,360
that I'm actively involved with, though I think it has

629
00:44:07,400 --> 00:44:09,239
a lot of power and potential. It's not that I've

630
00:44:09,320 --> 00:44:13,280
ruled it out. It's just that it's not a priority

631
00:44:13,280 --> 00:44:14,199
for me right at the moment.

632
00:44:14,519 --> 00:44:18,039
Speaker 2: No, I totally get it right, like it always optimizes

633
00:44:18,119 --> 00:44:19,920
for like the norm, right, and so as long as

634
00:44:19,960 --> 00:44:22,079
the norm is better than where you are, it's a

635
00:44:22,119 --> 00:44:24,119
good thing to use it because you'll get to that norm.

636
00:44:24,320 --> 00:44:27,400
But Will, you're such a great engineer that what it's

637
00:44:27,400 --> 00:44:28,920
going to suggest to you is going to be worse

638
00:44:29,039 --> 00:44:31,239
than what you would create by default.

639
00:44:33,320 --> 00:44:35,920
Speaker 1: Which comes back into I think that comes back into

640
00:44:36,000 --> 00:44:38,800
the prompt engineering topic that we were having, you know,

641
00:44:39,599 --> 00:44:43,800
to get useful information out of it, I have to

642
00:44:43,840 --> 00:44:47,960
give it all of the internal mental models that I

643
00:44:48,000 --> 00:44:52,159
would use to get a similar or better answer.

644
00:44:52,559 --> 00:44:54,880
Speaker 2: And that's really where all the effort goes, like,

645
00:44:54,920 --> 00:44:56,960
how do I even convey it correctly? It's like, I have an

646
00:44:56,960 --> 00:44:59,159
intuition about how to think about this, Now I need

647
00:44:59,159 --> 00:45:04,679
to convey that intuition to another being using natural language,

648
00:45:04,719 --> 00:45:07,000
when I've never thought about how I'm even thinking

649
00:45:07,000 --> 00:45:10,119
about this problem before. I totally agree that's where all

650
00:45:10,199 --> 00:45:13,159
of the challenge really comes in to get valuable output

651
00:45:13,280 --> 00:45:14,480
from these models.

652
00:45:15,000 --> 00:45:18,039
Speaker 3: So, it's one of the things that I have tested,

653
00:45:18,119 --> 00:45:21,800
and I'm sure that there is a scientific explanation to

654
00:45:21,840 --> 00:45:25,559
it which I haven't really looked up. But when

655
00:45:25,639 --> 00:45:31,039
you actually give a working example of what you're trying

656
00:45:31,079 --> 00:45:37,760
to achieve to the model, you usually get much much

657
00:45:37,760 --> 00:45:42,320
better results from the model. Like if you teach it,

658
00:45:42,719 --> 00:45:48,519
you know, this is how it should work, once, then

659
00:45:48,840 --> 00:45:52,320
the rest goes much easier for you.
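
For readers who want to try the few-shot idea Gorkem describes here, below is a minimal sketch assuming an OpenAI-style chat client; the model name, the CSV-to-JSON task, and the messages are invented for illustration, not anything from the episode.

```python
# Few-shot sketch: show the model one worked example ("this is how it
# should work") before the real request. Model name and task are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [
    {"role": "system", "content": "You convert CSV rows to JSON objects."},
    # One worked example pins down the expected output format.
    {"role": "user", "content": "id,name\n1,Ada"},
    {"role": "assistant", "content": '{"id": 1, "name": "Ada"}'},
    # The actual request now benefits from the demonstration above.
    {"role": "user", "content": "id,name\n2,Grace"},
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)  # expected: {"id": 2, "name": "Grace"}
```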

660
00:45:52,599 --> 00:45:54,599
Speaker 2: I mean, and we know that must be true because

661
00:45:54,639 --> 00:45:58,519
like the LLMs are designed by humans, and humans

662
00:45:58,519 --> 00:46:02,840
operate in this like six-level thinking hierarchy, where

663
00:46:02,880 --> 00:46:06,119
like the bottom level is like rote repetition, and that

664
00:46:06,239 --> 00:46:09,679
means you saw the answer in some format before and

665
00:46:09,719 --> 00:46:12,519
you can repeat it again. And I feel like that's

666
00:46:12,559 --> 00:46:14,920
the first level. If you want an answer that looks

667
00:46:14,960 --> 00:46:17,800
like something you like, even from another human, right, I need

668
00:46:17,800 --> 00:46:21,039
to show you what success looks like before I expect

669
00:46:21,119 --> 00:46:24,280
you to output it, you know, especially inexperienced engineers.

670
00:46:24,480 --> 00:46:26,880
But then there's like five more levels of this where

671
00:46:27,440 --> 00:46:30,199
I know for sure that it's not going to get

672
00:46:30,239 --> 00:46:35,400
to without also deep conscious thinking and deliberate input on

673
00:46:35,519 --> 00:46:38,159
those levels in order, not just one. Like, say

674
00:46:38,800 --> 00:46:41,519
I'm going to ask it to create unit tests

675
00:46:41,559 --> 00:46:43,400
for me. I love this example because I actually think

676
00:46:43,440 --> 00:46:45,920
this is something that really works well, especially going from

677
00:46:45,920 --> 00:46:46,840
one language to another.

678
00:46:46,880 --> 00:46:47,079
Speaker 3: One.

679
00:46:47,119 --> 00:46:49,320
Speaker 2: The steps are, we were just doing this at

680
00:46:49,320 --> 00:46:51,360
Authress as we were converting some of our code from

681
00:46:51,440 --> 00:46:54,719
JavaScript to Rust. We say, okay, you know, we have

682
00:46:55,079 --> 00:46:58,559
this code in JavaScript, these tests in JavaScript; write tests

683
00:46:58,559 --> 00:47:00,840
that look like this in Rust, write the actual code,

684
00:47:00,840 --> 00:47:03,519
and then run the Rust unit tests against the code,

685
00:47:03,519 --> 00:47:05,760
so we have this sort of validation. And the first

686
00:47:05,840 --> 00:47:07,800
level is really the unit test. But we then need

687
00:47:07,840 --> 00:47:11,159
to say things like oh, but also make sure to

688
00:47:11,159 --> 00:47:14,280
handle these weird edge cases and like specifically list out

689
00:47:14,320 --> 00:47:16,679
some edge cases so someone has to think about those.

690
00:47:16,960 --> 00:47:18,960
And then there's like, okay, now I need to generate

691
00:47:19,119 --> 00:47:20,960
like ways in which this could fail, which is like

692
00:47:21,000 --> 00:47:23,760
another level on top of that, and you know, propose

693
00:47:23,880 --> 00:47:27,199
different ways in which we could re-architect this whole

694
00:47:27,519 --> 00:47:29,880
aspect and I can't just say that, right, I can't

695
00:47:29,920 --> 00:47:32,000
just say, oh, yeah, you know, refactor this code for

696
00:47:32,039 --> 00:47:33,800
me in an optimal way. I need to actually like

697
00:47:33,840 --> 00:47:35,519
how do I know what to refactor? How do I

698
00:47:35,559 --> 00:47:37,639
know how to do that? It's just such a challenge

699
00:47:37,639 --> 00:47:40,639
to articulate, even from one person to another. And

700
00:47:40,639 --> 00:47:43,719
I almost want to, like, draw a diagram. How do

701
00:47:43,760 --> 00:47:45,599
I feed that in? Though, like, that would need a different

702
00:47:45,679 --> 00:47:50,599
LLM, right, exactly. You know, it does create this whole

703
00:47:50,599 --> 00:47:54,639
other challenge in order to really get the value out

704
00:47:54,719 --> 00:47:57,400
to a high enough level that we really wanted

705
00:47:57,440 --> 00:47:57,760
to have.
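
As a rough sketch of the test-first porting loop Warren walks through, here is one way it could be wired up; `ask_llm` is a hypothetical stand-in for any chat-completion call, and the file paths and edge cases are invented for the example. Only the `cargo test` step is concrete.

```python
# Sketch of the JavaScript-to-Rust porting loop: port the tests first,
# then the implementation, then validate by running the Rust test suite.
import subprocess
from pathlib import Path

def ask_llm(prompt: str) -> str:
    """Hypothetical helper; wire up your preferred LLM client here."""
    raise NotImplementedError

js_code = Path("src/handler.js").read_text()
js_tests = Path("test/handler.test.js").read_text()

# Step 1: port the tests, so expected behavior is pinned down first.
rust_tests = ask_llm(
    "Here is JavaScript code and its tests.\n"
    f"CODE:\n{js_code}\nTESTS:\n{js_tests}\n"
    "Write equivalent Rust unit tests (#[test] functions)."
)

# Step 2: port the implementation, naming edge cases explicitly rather
# than hoping the model infers them.
rust_code = ask_llm(
    f"Write the Rust implementation these tests exercise:\n{rust_tests}\n"
    "Handle these edge cases explicitly: empty input, non-UTF-8 bytes."
)

Path("rust_port/src/lib.rs").write_text(rust_code + "\n" + rust_tests)

# Step 3: the actual validation is just running the ported test suite.
result = subprocess.run(["cargo", "test"], cwd="rust_port",
                        capture_output=True, text=True)
print(result.stdout)
```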

706
00:47:58,760 --> 00:48:05,519
Speaker 3: Yeah, I mean it's something that we have recently done

707
00:48:05,639 --> 00:48:12,559
with a, you know, it's language in, language out, right? LLMs

708
00:48:12,559 --> 00:48:14,719
are really good at those. It's like we have

709
00:48:14,840 --> 00:48:21,360
done a similar thing. We actually had to implement a

710
00:48:21,360 --> 00:48:26,360
piece of code that was implemented in Rust, but we

711
00:48:26,440 --> 00:48:29,079
had to run it in a different environment which didn't

712
00:48:29,119 --> 00:48:33,360
allow us to run Rust at the time. So we said, okay,

713
00:48:33,360 --> 00:48:37,039
here's some Rust code that implements this algorithm and this thing,

714
00:48:37,119 --> 00:48:41,559
and then, just, can you convert that to TypeScript?

715
00:48:42,039 --> 00:48:46,519
And at that time, it did, like, it did,

716
00:48:47,559 --> 00:48:52,280
and it runs, it runs just as well, like,

717
00:48:52,440 --> 00:48:55,760
it converts all the unit tests and as you said,

718
00:48:55,800 --> 00:49:00,320
it's like and it works perfectly, except that you don't

719
00:49:00,360 --> 00:49:05,719
get the performance. Like, there is no way that

720
00:49:05,760 --> 00:49:07,760
you can get the performance, and all of a sudden

721
00:49:07,760 --> 00:49:11,679
you're faced with the fact that, oh, okay, this code

722
00:49:11,760 --> 00:49:20,360
is maybe equivalent, but not equal. So there's also that

723
00:49:20,480 --> 00:49:25,119
fact as well. But the good thing about it was

724
00:49:25,199 --> 00:49:28,039
we actually spent, what, forty-five minutes to

725
00:49:28,199 --> 00:49:31,199
just get that code converted and learn that, oh, this

726
00:49:31,360 --> 00:49:34,800
will never perform the same, so that we had to

727
00:49:34,840 --> 00:49:38,559
go a different way. So I don't know how much

728
00:49:38,599 --> 00:49:41,960
time we saved by just doing that, but that was

729
00:49:42,039 --> 00:49:50,880
worth it. So it's not always about actually getting the

730
00:49:50,920 --> 00:49:54,800
result that you want. It's also sometimes about getting the

731
00:49:54,840 --> 00:49:56,719
wrong answers out quicker as well.

732
00:49:57,239 --> 00:49:58,360
Speaker 1: Oh true story.

733
00:49:58,480 --> 00:50:04,320
Speaker 2: Yeah, yeah, go ahead.

734
00:50:04,360 --> 00:50:08,000
Speaker 1: Well I was just gonna reinforce that because a lot

735
00:50:08,039 --> 00:50:11,960
of times that's the best learning that you can get.

736
00:50:12,920 --> 00:50:15,119
You know, if you get the right answer the first time,

737
00:50:16,519 --> 00:50:18,760
you don't learn as much as you do when you

738
00:50:19,559 --> 00:50:21,079
think you were going to get the right answer and

739
00:50:21,079 --> 00:50:25,639
get something completely opposite, you know, Like it's like when

740
00:50:26,400 --> 00:50:31,400
tackling a new project or a new problem, one of

741
00:50:31,480 --> 00:50:34,920
the most important things to learn is what questions should

742
00:50:34,960 --> 00:50:37,239
I be asking that I don't know I should be asking,

743
00:50:37,719 --> 00:50:40,760
and getting the wrong answer helps you along that path.

744
00:50:41,880 --> 00:50:46,199
Speaker 2: I think there's a huge challenge there when you almost

745
00:50:46,320 --> 00:50:48,760
know from intuition that the response is wrong. I mean,

746
00:50:48,760 --> 00:50:50,760
it's wrong, obviously, it came from an LLM, but

747
00:50:50,840 --> 00:50:55,159
I mean it's like there's something special about it that

748
00:50:55,280 --> 00:50:58,039
you know, you just know. You're like, oh, that that

749
00:50:58,079 --> 00:51:00,559
can't really be the best way to do this, and

750
00:51:00,599 --> 00:51:03,440
you're like, do it differently, do it differently, And like

751
00:51:03,440 --> 00:51:06,559
I'm running out of words to basically say, try again,

752
00:51:06,639 --> 00:51:09,159
but don't use any of those constructs you used this time.

753
00:51:09,719 --> 00:51:13,159
And it's just every time it's like just subtly wrong

754
00:51:13,320 --> 00:51:15,840
in a different way. And so there is certainly a

755
00:51:15,920 --> 00:51:18,559
learning there of, you know, all these ways in which

756
00:51:18,639 --> 00:51:21,320
not to do it. I would just, you know, one day,

757
00:51:21,360 --> 00:51:23,559
I would like it to actually give me a right answer.

758
00:51:24,840 --> 00:51:29,880
Speaker 3: Yeah, it's, the LLMs are sneaky. So they're gonna give

759
00:51:29,960 --> 00:51:34,440
you an answer that is ninety nine percent correct, but

760
00:51:34,519 --> 00:51:37,000
then in that one percent they are going to hide a

761
00:51:37,079 --> 00:51:40,719
bug that you're gonna spend the next two days trying

762
00:51:40,760 --> 00:51:43,559
to find in production.

763
00:51:43,880 --> 00:51:47,199
Speaker 2: Yeah, I mean you're lucky if it happens. You're definitely

764
00:51:47,280 --> 00:51:49,639
lucky if it's short term turnaround like that right where

765
00:51:49,679 --> 00:51:52,039
you know it happens quickly and not like you know,

766
00:51:52,159 --> 00:51:54,039
months or years down the road when you may not

767
00:51:54,119 --> 00:51:55,960
even be part of the team anymore. And now you've

768
00:51:56,000 --> 00:51:58,119
got that, which is where a lot of the you know,

769
00:51:58,159 --> 00:52:01,679
bugs come up there. I'm interested, though, if we flash

770
00:52:01,719 --> 00:52:04,519
back to something you were talking about before about where

771
00:52:04,559 --> 00:52:06,840
some of the real value is. We were talking

772
00:52:06,880 --> 00:52:11,480
about Walmart and the automatic creation of content descriptions for the

773
00:52:11,519 --> 00:52:14,119
products that they have. I wonder how many of those

774
00:52:14,199 --> 00:52:15,480
are out there, because I mean, I think a lot

775
00:52:15,480 --> 00:52:18,519
of the hype right now is focused on this. I'm

776
00:52:18,559 --> 00:52:22,599
gonna say, lie, that it increases the productivity of software engineers.

777
00:52:24,000 --> 00:52:25,880
A lot of companies are repeating this. I'm waiting for

778
00:52:25,920 --> 00:52:29,639
some actual real data on it. But I'm really curious

779
00:52:29,639 --> 00:52:33,840
about the ways in which the enterprises, the larger companies,

780
00:52:33,840 --> 00:52:36,239
you know, are finding opportunities that aren't just like a

781
00:52:36,360 --> 00:52:38,360
chatbot on a website.

782
00:52:38,519 --> 00:52:42,239
Speaker 3: Right, I mean, that's I think that's table stakes at

783
00:52:42,239 --> 00:52:46,880
this point. The software engineering side? Yeah, there

784
00:52:46,880 --> 00:52:50,960
are benefits to it, there are no benefits to it,

785
00:52:50,960 --> 00:52:54,360
it's controversial. But then there are other use cases.

786
00:52:54,440 --> 00:52:57,239
For instance, one of the usages that I have

787
00:52:57,719 --> 00:53:07,239
seen is, the company, they're running a support

788
00:53:07,320 --> 00:53:11,639
service, a phone support service, and what they were doing was

789
00:53:12,119 --> 00:53:18,400
doing sentiment analysis of the calls so that

790
00:53:18,880 --> 00:53:25,639
they can do better callbacks to customers

791
00:53:26,400 --> 00:53:30,000
whose sentiment was not as good as

792
00:53:30,079 --> 00:53:33,719
they wanted it to be. So I think that's one

793
00:53:33,760 --> 00:53:39,440
way of doing, you know, customer satisfaction and

794
00:53:39,440 --> 00:53:46,400
keeping your brand recognition, and so on

795
00:53:46,400 --> 00:53:49,239
and so forth. And one of the

796
00:53:49,239 --> 00:53:52,239
things that they actually had was, they first tried

797
00:53:52,559 --> 00:53:57,960
the sentiment analysis, but they had to actually feed additional

798
00:53:58,079 --> 00:54:02,559
data to the model to get the sentiment analysis correct.

799
00:54:03,840 --> 00:54:06,320
It's one of those examples where you have, oh you

800
00:54:06,400 --> 00:54:08,679
may not get it right from the first run, but

801
00:54:08,760 --> 00:54:12,480
then you need to like adjust for it and then

802
00:54:12,599 --> 00:54:17,599
try again. I wish I could give their name,

803
00:54:17,599 --> 00:54:24,920
it's a very exciting name, but what they

804
00:54:24,960 --> 00:54:31,440
did was really, you know, they were

805
00:54:31,480 --> 00:54:35,280
really happy with it. Up to that point,

806
00:54:35,280 --> 00:54:39,159
they were always doing callbacks to the customers, but the

807
00:54:39,679 --> 00:54:44,320
selection process was not something that they were able to

808
00:54:44,400 --> 00:54:47,760
streamline to the level that, you know, they get

809
00:54:47,800 --> 00:54:51,599
better and better results out of it.

810
00:54:52,440 --> 00:54:57,920
Now they're able to target the right customers because they

811
00:54:58,320 --> 00:55:00,800
are able to do the sentiment analysis now on all of

812
00:55:00,880 --> 00:55:01,840
those calls.
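
For illustration of the callback-selection idea described above, here is a minimal sketch that scores call transcripts and queues the most negative ones first; the Hugging Face sentiment pipeline is a generic stand-in, not the unnamed company's actual stack, and the transcripts are invented.

```python
# Sketch: rank support calls for callbacks by sentiment, most negative first.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default English model

calls = [
    {"customer": "A", "transcript": "Third outage this month. I'm done."},
    {"customer": "B", "transcript": "Thanks, the agent fixed it quickly."},
    {"customer": "C", "transcript": "Still waiting on my refund after two weeks."},
]

for call in calls:
    result = sentiment(call["transcript"])[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.99}
    # Negative confidence becomes callback priority; positive calls rank last.
    call["priority"] = result["score"] if result["label"] == "NEGATIVE" else 0.0

# Call back the unhappiest customers first.
for call in sorted(calls, key=lambda c: c["priority"], reverse=True):
    print(call["customer"], round(call["priority"], 2))
```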

813
00:55:01,639 --> 00:55:04,760
Speaker 2: Sentiment analysis is an interesting one. We actually have a product,

814
00:55:04,800 --> 00:55:08,079
it's like the number one standup bot in Slack,

815
00:55:09,079 --> 00:55:14,679
and we tried to run some AI models on our

816
00:55:15,119 --> 00:55:17,920
responses for stand ups, and like it was very clear

817
00:55:17,960 --> 00:55:19,519
whether it was like a one or a five on

818
00:55:19,559 --> 00:55:21,559
a five point scale, but like you know, you get

819
00:55:21,559 --> 00:55:23,440
into and those are obvious, but the ones in the

820
00:55:23,480 --> 00:55:26,639
middle were quite the challenging spot there, and we ended

821
00:55:26,679 --> 00:55:30,079
up discarding the notion entirely because it is the sort

822
00:55:30,079 --> 00:55:31,920
of thing that I feel like, you really want to

823
00:55:32,039 --> 00:55:35,639
get right and not be in the danger area. But

824
00:55:35,760 --> 00:55:37,599
your example made me think of something that I had

825
00:55:37,599 --> 00:55:43,719
read recently, actually, where a call center realized that

826
00:55:43,880 --> 00:55:49,760
the psychological safety of their staff actually being super important

827
00:55:49,840 --> 00:55:53,519
for the success of the organization, was being threatened by

828
00:55:53,679 --> 00:55:56,400
irate callers that would call up and yell and scream.

829
00:55:56,480 --> 00:56:00,400
Maybe obscenities at them, because, you know, obviously support calls are

830
00:56:00,440 --> 00:56:03,159
being made by people who are usually not happy. I mean,

831
00:56:03,239 --> 00:56:07,920
maybe that's a controversial statement, but I believe that's true.

832
00:56:07,960 --> 00:56:10,079
You know, generally speaking, you're calling up the support line

833
00:56:10,079 --> 00:56:13,239
because you're not thrilled with the service you're getting, and

834
00:56:13,920 --> 00:56:15,599
it was challenging for them. And what I heard that

835
00:56:15,639 --> 00:56:20,360
they were doing was actually using AI to alter

836
00:56:20,760 --> 00:56:26,079
the vocal expression of the calls they were getting so

837
00:56:26,159 --> 00:56:29,960
that they seemed more demure in content. Like they weren't screaming,

838
00:56:30,000 --> 00:56:33,960
they weren't swearing, there weren't, like, high-volume pitches, automatically

839
00:56:34,039 --> 00:56:36,639
changing what the staff were hearing so that they wouldn't

840
00:56:36,679 --> 00:56:42,039
be susceptible or subjected to an environment that would be

841
00:56:42,039 --> 00:56:43,840
worse for them and the company.

842
00:56:44,719 --> 00:56:48,400
Speaker 3: Yeah, I mean that is that is you know, that

843
00:56:48,559 --> 00:56:53,159
is a very novel way of using it, actually, right.

844
00:56:53,360 --> 00:57:01,519
It's, yeah, it always hurts when you're on

845
00:57:01,639 --> 00:57:08,239
the front line and the customer is not happy, so

846
00:57:10,199 --> 00:57:12,159
it helps, I'm pretty sure.

847
00:57:14,079 --> 00:57:18,480
Speaker 1: And then there's the people where the more swear words

848
00:57:18,519 --> 00:57:24,159
they use is actually an indicator that they are happy.

849
00:57:27,960 --> 00:57:30,840
We may be able to identify someone who fits in

850
00:57:30,880 --> 00:57:37,159
that category. Sort of tangential to this, that's one of

851
00:57:37,199 --> 00:57:41,639
the things I did in using ChatGPT is I

852
00:57:41,800 --> 00:57:44,199
told it, I don't know if you all are familiar with

853
00:57:44,280 --> 00:57:49,320
who David Goggins is, but he's a very vocal and

854
00:57:50,639 --> 00:57:54,599
blunt person. I think we can just say that. But

855
00:57:54,679 --> 00:57:58,599
I told ChatGPT that it could adopt that person's

856
00:57:58,639 --> 00:58:03,440
speaking style and personality, and like instantly I was more

857
00:58:03,480 --> 00:58:07,199
productive with it because I could drop F bombs and

858
00:58:07,320 --> 00:58:10,960
it would respond with hell, yeah, let's go, And like

859
00:58:11,000 --> 00:58:12,960
all of a sudden, I was communicating with someone who

860
00:58:13,760 --> 00:58:15,719
spoke the way that I did.

861
00:58:16,840 --> 00:58:19,920
Speaker 2: So I just assumed, and this is obviously not really

862
00:58:19,920 --> 00:58:25,159
knowing your history that if you've seen Full Metal Jacket

863
00:58:25,440 --> 00:58:28,239
and the drill sergeant, like I just assume that this is

864
00:58:28,280 --> 00:58:31,000
the life you've gone through, Will, and your path being

865
00:58:31,039 --> 00:58:34,480
in the Navy, that you know there is something just

866
00:58:34,559 --> 00:58:37,719
for you special about being yelled at in this way

867
00:58:37,760 --> 00:58:39,280
which really gets you into gear.

868
00:58:40,039 --> 00:58:44,320
Speaker 1: Right, Screaming F bombs is my love language.

869
00:58:48,639 --> 00:58:57,840
Speaker 3: Okay, we're doing this podcast completely wrong, right?

870
00:58:58,599 --> 00:59:00,280
Speaker 2: I mean, I guess we could release two versions of

871
00:59:00,320 --> 00:59:02,280
the stream, you know, the natural one and the one

872
00:59:02,280 --> 00:59:05,079
that's attenuated using AI that strips out all of the

873
00:59:05,119 --> 00:59:07,599
obscenities that are coming out of Will's mouth. So if

874
00:59:07,639 --> 00:59:10,800
you're not hearing him, yeah, I mean, if you're not

875
00:59:10,840 --> 00:59:14,159
hearing Will swear every other word. That's what's actually happening

876
00:59:14,239 --> 00:59:14,679
right now.

877
00:59:16,079 --> 00:59:17,800
Speaker 3: The Will special, right.

878
00:59:19,880 --> 00:59:21,480
Speaker 1: There's so many ways that could go wrong.

879
00:59:25,360 --> 00:59:35,320
Speaker 3: Yeah, but yeah, there's there's always that. I tried to

880
00:59:35,360 --> 00:59:38,719
do the same with ChatGPT back in the

881
00:59:38,800 --> 00:59:40,800
day as well. That didn't go well for me.

882
00:59:43,280 --> 00:59:48,800
Speaker 1: So one thing I'm curious about, like, are there common

883
00:59:50,400 --> 00:59:54,400
use cases you're seeing from specific types of customers,

884
00:59:54,440 --> 00:59:57,559
like enough where you could say, like this industry or

885
00:59:57,599 --> 01:00:01,440
this segment is betting really heavily on AI and

886
01:00:01,679 --> 01:00:04,480
ML and they're actually making progress.

887
01:00:04,920 --> 01:00:05,880
Speaker 2: That's a good question.

888
01:00:08,800 --> 01:00:20,400
Speaker 3: Finance, definitely. Then there's insurance companies, both on the customer

889
01:00:20,440 --> 01:00:24,639
support side and also on the risk analysis and

890
01:00:24,679 --> 01:00:27,400
so on and so forth. Essentially the risk analysis they have

891
01:00:27,480 --> 01:00:32,199
been doing for many years, but as I said,

892
01:00:32,239 --> 01:00:36,760
the hype, the result of the hype is they are

893
01:00:36,800 --> 01:00:41,280
now able to adopt things that are more advanced, easier,

894
01:00:41,440 --> 01:00:47,199
and in some cases cheaper than they were able to

895
01:00:47,199 --> 01:00:50,719
do in the past. So those are the two that

896
01:00:50,760 --> 01:00:53,519
we are seeing. The one thing that I was surprised

897
01:00:53,559 --> 01:01:00,400
about is the consulting companies. Maybe because we're

898
01:01:00,400 --> 01:01:04,760
providing a tool, perhaps that's why, but we get

899
01:01:05,000 --> 01:01:12,000
a lot of interest from consulting companies, which are essentially

900
01:01:12,039 --> 01:01:17,920
some of them are actually building DevOps pipelines for other

901
01:01:18,039 --> 01:01:28,440
companies that are very interested in adopting our tools. So

902
01:01:28,480 --> 01:01:31,360
that was, you know, that's something that I wasn't expecting

903
01:01:32,440 --> 01:01:38,119
it was unexpected for me. So I do not really know what

904
01:01:38,199 --> 01:01:41,920
kind of industries these consulting companies are working for, and

905
01:01:42,000 --> 01:01:45,039
some of them are really large. So I don't know

906
01:01:45,119 --> 01:01:46,039
if you.

907
01:01:45,960 --> 01:01:49,159
Speaker 2: Know, if you can tell you no, I mean you

908
01:01:49,199 --> 01:01:51,199
say consulting, and I had two things come to mind,

909
01:01:51,280 --> 01:01:55,840
like like management consulting, of which we knew all along

910
01:01:56,079 --> 01:01:58,159
that what the words coming out of their mouth could

911
01:01:58,199 --> 01:02:02,440
easily be puppeted by an LLM. And maybe third-party

912
01:02:02,519 --> 01:02:06,440
contractors who you're hiring to outsource some work. I

913
01:02:06,440 --> 01:02:10,199
could imagine they're trying to sneakily, basically, use LLMs instead

914
01:02:10,199 --> 01:02:12,679
because, you know, supposedly it's even

915
01:02:12,760 --> 01:02:18,000
cheaper labor than outsourcing, to deliver the work items.

916
01:02:18,079 --> 01:02:22,719
Speaker 3: No, in this case, no, it was no. In this case,

917
01:02:22,760 --> 01:02:27,480
they are actually hired as experts and are building a pipeline

918
01:02:27,519 --> 01:02:31,079
for them. For instance, one of the cases that we're

919
01:02:31,079 --> 01:02:37,440
helping with is they're building a pipeline for consuming base

920
01:02:37,599 --> 01:02:43,519
models and retraining them or fine-tuning them. So, like,

921
01:02:43,800 --> 01:02:47,599
it's not that kind of work. It's not like adapting

922
01:02:47,639 --> 01:02:51,599
the LLM for doing something. It's more about, you know,

923
01:02:51,679 --> 01:02:57,000
taking base models, making sure the lineage is there, SBOMs

924
01:02:57,039 --> 01:03:02,239
are there, and so on and so forth. What it

925
01:03:02,360 --> 01:03:06,280
essentially is, is they are building CI/CD pipelines, very complex

926
01:03:06,599 --> 01:03:12,079
CI/CD pipelines for these companies, and as far as

927
01:03:12,079 --> 01:03:17,519
I understand, they are working on multiple projects at the

928
01:03:17,559 --> 01:03:22,280
same time. So there is a lot of consultancy activity

929
01:03:22,360 --> 01:03:25,320
that we are seeing. I'm sure that there is more. It's

930
01:03:25,360 --> 01:03:29,519
just, we're seeing a few examples of them adopting our tools,

931
01:03:29,920 --> 01:03:32,800
but there's a lot of that going on. So that

932
01:03:33,199 --> 01:03:36,239
only tells me that it's like there is a bit

933
01:03:36,280 --> 01:03:45,199
of activity happening in industries that are known to use consultancy.

934
01:03:46,280 --> 01:03:50,360
So, and to be honest, I am impressed by the

935
01:03:50,400 --> 01:03:54,199
fact that they are actually doing the right thing. It's

936
01:03:54,199 --> 01:03:56,599
like you don't at this point of the hype, you

937
01:03:56,639 --> 01:04:01,320
don't usually get the people doing the right thing and

938
01:04:02,039 --> 01:04:07,320
you know, building provenance attestations, SBOMs and so on

939
01:04:07,320 --> 01:04:11,440
and so forth. So when you see that, you're like, hmm,

940
01:04:11,679 --> 01:04:12,400
I'm impressed.
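
As a loose illustration of the lineage, attestation, and SBOM point, here is a minimal sketch of a pipeline step that records provenance for a fine-tuned model artifact; the record layout, paths, and names are invented for the example and are not a formal attestation standard.

```python
# Sketch: record basic provenance for a model artifact so a deployed model
# can later be traced back to its base model and training data.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file so large model weights don't need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

model_path = Path("artifacts/model.bin")                 # hypothetical artifact
record = {
    "artifact": str(model_path),
    "sha256": sha256_of(model_path),
    "base_model": "example-base-7b",                     # placeholder lineage
    "training_data": "s3://example-bucket/dataset-v3",   # placeholder reference
    "created_at": datetime.now(timezone.utc).isoformat(),
}
Path("artifacts/provenance.json").write_text(json.dumps(record, indent=2))
```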

941
01:04:12,480 --> 01:04:15,159
Speaker 2: No, I totally get it. I mean different field. But

942
01:04:15,559 --> 01:04:18,159
you know, we offer login and access control as a

943
01:04:18,159 --> 01:04:21,400
SaaS product, and we're talking with our customers and they

944
01:04:21,400 --> 01:04:24,000
say things like, well, you know, because of these reasons,

945
01:04:24,000 --> 01:04:26,440
we decided to go with your product, and it makes

946
01:04:26,480 --> 01:04:28,440
me feel like really happy that at least there are

947
01:04:28,440 --> 01:04:30,280
some people out there that are that are thinking through

948
01:04:30,320 --> 01:04:32,719
this effectively, like whether or not they decided to use

949
01:04:32,719 --> 01:04:34,320
what we have or not. It's like I can actually

950
01:04:34,320 --> 01:04:36,599
see how they're thinking about it, and it's so much

951
01:04:36,639 --> 01:04:39,760
better than like the rest of what I see, which

952
01:04:39,800 --> 01:04:42,320
is like, oh, no, you know, we decided to hack

953
01:04:42,400 --> 01:04:44,599
this together, or you know, we're not tracking what we're

954
01:04:44,639 --> 01:04:45,880
doing or auditing.

955
01:04:45,599 --> 01:04:46,440
Speaker 3: It in any way.

956
01:04:46,679 --> 01:04:49,000
Speaker 2: You know, it's you know, I hate to say yoloing

957
01:04:49,039 --> 01:04:52,440
it out there, because, you know, it's not

958
01:04:52,480 --> 01:04:55,719
the most critically important thing for them as they see it.

959
01:04:55,800 --> 01:05:00,119
Even though, we know, longer term, tracking the models that

960
01:05:00,119 --> 01:05:01,920
are being used and how you're fine-tuning them, like

961
01:05:01,920 --> 01:05:04,360
actually being able to trace it back matters. It's like going

962
01:05:04,360 --> 01:05:07,000
out there and, as Will loves to say, right to

963
01:05:07,079 --> 01:05:10,280
production using vi to edit those production files. You know,

964
01:05:10,599 --> 01:05:12,760
it's great in the moment, but you know you got

965
01:05:12,800 --> 01:05:15,280
to trace that back to the git commit if you

966
01:05:15,320 --> 01:05:18,800
care about your production reliability.

967
01:05:19,000 --> 01:05:24,880
Speaker 3: I mean, we actually encountered a case where the

968
01:05:25,000 --> 01:05:29,559
model that was deployed was forgotten completely and it drifted,

969
01:05:30,400 --> 01:05:34,639
but it was still part of the application flow. It

970
01:05:34,679 --> 01:05:39,760
was still producing data to the application, and the application

971
01:05:40,039 --> 01:05:52,280
was putting that into databases, essentially. So,

972
01:05:52,599 --> 01:05:55,360
in the end, it wasn't something critical that they were doing,

973
01:05:55,400 --> 01:05:58,920
but it took them like they couldn't understand at first,

974
01:05:59,079 --> 01:06:03,880
Like that's what they told us. They couldn't understand at

975
01:06:03,880 --> 01:06:08,320
first how that happened, and they weren't even aware that

976
01:06:08,440 --> 01:06:13,559
there was a model there at that point that was

977
01:06:13,639 --> 01:06:17,599
that was generating the data that they were seeing

978
01:06:19,000 --> 01:06:24,840
was drifting. I mean, it was doing a simple categorization,

979
01:06:25,079 --> 01:06:27,440
but then it drifted so much that it was always

980
01:06:27,440 --> 01:06:31,760
putting things into irrelevant categories that it wasn't supposed to

981
01:06:32,719 --> 01:06:38,480
put them into. And that's what happens with AI.

982
01:06:38,599 --> 01:06:42,079
That's why you need to be really careful with the pipeline.

983
01:06:42,119 --> 01:06:45,719
Like the vi example that you have given.

984
01:06:46,079 --> 01:06:50,840
It's like, you can get away with editing something

985
01:06:51,400 --> 01:06:58,119
with vi in production if

986
01:06:58,239 --> 01:07:02,840
that deployment changes one time in fifteen years,

987
01:07:02,920 --> 01:07:07,599
yeah right, so, but you know, if you have a

988
01:07:07,679 --> 01:07:13,440
deployment that changes only once in fifteen years, you may

989
01:07:13,480 --> 01:07:18,719
get away with that because your input and output never changes, right,

990
01:07:18,760 --> 01:07:21,320
It's like, that thing, if it is working, it

991
01:07:21,400 --> 01:07:25,280
is working, unless you have a bug in it, and software never

992
01:07:25,360 --> 01:07:30,639
has bugs. But then with AI, it is going

993
01:07:30,719 --> 01:07:34,760
to change, whether that happens in six months or that happens

994
01:07:35,000 --> 01:07:37,760
in a year. You need to monitor that. You need

995
01:07:37,800 --> 01:07:42,280
to have a pipeline that will retrain and get rid

996
01:07:42,280 --> 01:07:44,679
of the drift and so on and so forth and

997
01:07:44,719 --> 01:07:50,599
all that. So I think that's where the main differences

998
01:07:51,199 --> 01:07:55,039
start to happen between DevOps and maybe MLOps.
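
To make the drift-monitoring point concrete, here is a small sketch that compares the category distribution a model is producing now against a healthy baseline using the population stability index; the categories, counts, and alert threshold are invented for illustration.

```python
# Sketch: flag output drift for a categorization model with the population
# stability index (PSI). A common rule of thumb treats PSI > 0.2 as a
# significant shift worth investigating or retraining on.
import math

def psi(baseline: dict, recent: dict, eps: float = 1e-6) -> float:
    total_b, total_r = sum(baseline.values()), sum(recent.values())
    score = 0.0
    for category in set(baseline) | set(recent):
        b = baseline.get(category, 0) / total_b + eps
        r = recent.get(category, 0) / total_r + eps
        score += (r - b) * math.log(r / b)
    return score

# Category counts when the model was known to be healthy...
baseline_counts = {"billing": 500, "shipping": 300, "other": 200}
# ...versus what it is emitting now.
recent_counts = {"billing": 120, "shipping": 80, "other": 800}

drift = psi(baseline_counts, recent_counts)
if drift > 0.2:
    print(f"PSI={drift:.2f}: output has drifted, trigger retraining")
```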

999
01:07:55,159 --> 01:07:58,400
Speaker 2: That's a really good point, actually. I think you worded

1000
01:07:58,440 --> 01:08:02,719
that really well. You are only going to use ML

1001
01:08:02,760 --> 01:08:06,159
in situations that are already so complex or change so

1002
01:08:06,360 --> 01:08:09,519
frequently to get the value out that you have to

1003
01:08:09,559 --> 01:08:12,280
imagine that you built such a complex, like, pretend you

1004
01:08:12,320 --> 01:08:14,880
didn't use ML. Think about how complex of a system

1005
01:08:14,920 --> 01:08:17,600
you would have had in its place. There must have

1006
01:08:17,680 --> 01:08:20,399
been so much testing around that, so much reliability, so

1007
01:08:20,479 --> 01:08:23,279
much concern and fine tuning of that to get it right.

1008
01:08:23,520 --> 01:08:26,359
You can't just throw away all those extra pieces because

1009
01:08:26,399 --> 01:08:29,000
you're using some sort of model now. You still need

1010
01:08:29,039 --> 01:08:31,279
all of them in place otherwise, you know, just think

1011
01:08:31,319 --> 01:08:34,039
about that legacy system that's thirty years old and has

1012
01:08:34,880 --> 01:08:36,880
you know, maybe one hundred million lines of code, Like

1013
01:08:36,920 --> 01:08:39,920
that's just absolutely ridiculous to be running as a critical

1014
01:08:39,960 --> 01:08:42,520
piece of software, right, I.

1015
01:08:42,479 --> 01:08:46,399
Speaker 3: Mean, there are problems like for instance, again this is

1016
01:08:46,600 --> 01:08:52,960
a real one, let's say that you're producing shirts, yeah,

1017
01:08:53,000 --> 01:08:57,079
and you're distributing these shirts to all over the world.

1018
01:08:58,319 --> 01:09:01,920
How do you know what sizes to produce and

1019
01:09:01,960 --> 01:09:07,319
where to distribute them? Someone needs to make that decision.

1020
01:09:07,439 --> 01:09:12,399
You cannot send the same amount of larges to every country,

1021
01:09:13,199 --> 01:09:16,119
and you cannot send the same amount of XX-larges

1022
01:09:16,159 --> 01:09:17,039
to every country.

1023
01:09:18,359 --> 01:09:20,439
Speaker 2: There was actually, I think there was a good paper

1024
01:09:20,479 --> 01:09:22,239
that was released. I think it was the US Army

1025
01:09:22,279 --> 01:09:26,000
where they were trying to devise standard kits for sizes,

1026
01:09:26,560 --> 01:09:30,319
and they performed a number of physical measurements of everyone,

1027
01:09:30,399 --> 01:09:33,920
everyone drafted, to decide, okay, you know, we're gonna have

1028
01:09:33,920 --> 01:09:35,920
a small, medium, and large kit, how big should those

1029
01:09:35,960 --> 01:09:39,760
things be? And after measuring and calculating the norms, they

1030
01:09:39,800 --> 01:09:42,239
realized that there was not one person

1031
01:09:42,279 --> 01:09:44,399
in the set of like a thousand people that were

1032
01:09:44,439 --> 01:09:48,560
able to fit into a standard distribution of one of

1033
01:09:48,560 --> 01:09:51,520
those kits. It was like, it's really

1034
01:09:51,600 --> 01:09:54,279
ridiculous and a good reminder that there is no such

1035
01:09:54,319 --> 01:09:57,319
thing as following the norm and getting it actually right,

1036
01:09:58,039 --> 01:09:59,600
and so there's no reason to believe that it would

1037
01:09:59,600 --> 01:10:00,760
work in the AI world either.

1038
01:10:01,720 --> 01:10:05,359
Speaker 3: Yeah. But on the other hand, given enough data, you

1039
01:10:05,439 --> 01:10:08,600
can actually predict, you know, how many, how many

1040
01:10:08,720 --> 01:10:12,960
larges that you need in China, and how many larges

1041
01:10:13,079 --> 01:10:16,640
you need in the US, and how many larges you need

1042
01:10:16,720 --> 01:10:21,800
in Canada because you have the historical data as well

1043
01:10:21,880 --> 01:10:27,000
as other demographics and so on and so forth.

1044
01:10:27,079 --> 01:10:29,159
So when you combine all of that, and this is

1045
01:10:29,199 --> 01:10:32,560
actually used somewhere, when you combine all of that, you

1046
01:10:32,600 --> 01:10:35,199
can actually come up with the data to say that, oh,

1047
01:10:35,600 --> 01:10:41,319
you know what, I need this many XLs in Belgium, right? So,

1048
01:10:42,079 --> 01:10:45,479
and that is very important because you know, sending the

1049
01:10:45,520 --> 01:10:49,279
wrong shirt sizes halfway around the world where you won't

1050
01:10:49,279 --> 01:10:52,399
be able to sell is a lot of cost.
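
A toy sketch of the size-allocation idea just described: project next season's per-country size mix from historical sales. All numbers are invented, and a real system would fold in demographics and an actual demand-forecasting model rather than a fixed forecast.

```python
# Toy sketch: allocate shirt sizes per country from historical sales mix.
historical_sales = {
    "Belgium": {"M": 400, "L": 350, "XL": 250},
    "Canada":  {"M": 300, "L": 400, "XL": 300},
}
forecast_units = {"Belgium": 1200, "Canada": 1500}  # assumed next-season demand

for country, sizes in historical_sales.items():
    total = sum(sizes.values())
    # Scale last season's size proportions to the forecast volume.
    plan = {size: round(forecast_units[country] * count / total)
            for size, count in sizes.items()}
    print(country, plan)  # Belgium -> {'M': 480, 'L': 420, 'XL': 300}
```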

1051
01:10:53,520 --> 01:10:57,399
Speaker 2: I worked for a long time in manufacturing and manufacturing logistics,

1052
01:10:57,439 --> 01:11:02,000
so I am well aware of the challenges there. So

1053
01:11:02,600 --> 01:11:06,239
I appreciate the example. We're getting pretty up on time here,

1054
01:11:06,319 --> 01:11:09,800
so I'm actually wondering if it may be the moment

1055
01:11:09,840 --> 01:11:12,319
that we switch to doing some picks.

1056
01:11:12,880 --> 01:11:17,840
Speaker 3: Let's do some picks. That's exciting, Yeah, do it?

1057
01:11:19,880 --> 01:11:21,560
Speaker 1: So, Warren, you want to kick us off?

1058
01:11:21,760 --> 01:11:23,920
Speaker 2: Yeah, you know, I always go first. I think that's

1059
01:11:24,039 --> 01:11:25,520
that's the secret format here.

1060
01:11:25,640 --> 01:11:27,319
Speaker 1: You know, feel free to call me out and I'll

1061
01:11:27,359 --> 01:11:28,479
go first at any point.

1062
01:11:28,800 --> 01:11:31,479
Speaker 2: No, No, I think I go first. I think that's

1063
01:11:31,600 --> 01:11:35,640
the uh, that's the rule here. All our listeners will

1064
01:11:35,640 --> 01:11:39,359
know that by now. So my pick I sort of

1065
01:11:39,359 --> 01:11:42,319
alluded to this earlier, about psychological safety on teams.

1066
01:11:42,359 --> 01:11:45,720
There's actually a paper that was released by Google in

1067
01:11:45,760 --> 01:11:50,479
twenty sixteen about trying to figure out what makes a

1068
01:11:50,520 --> 01:11:52,960
great team, and the actual article was released in The

1069
01:11:52,960 --> 01:11:56,439
New York Times in twenty sixteen: What Google Learned From Its

1070
01:11:56,520 --> 01:11:59,520
Quest to Build the Perfect Team. And I feel like

1071
01:11:59,520 --> 01:12:02,079
in twenty twenty-two, twenty twenty-four, this shouldn't be a shocker anymore.

1072
01:12:02,119 --> 01:12:05,800
But it's psychological safety and it's sort of ridiculous how

1073
01:12:05,840 --> 01:12:09,960
important that is, is really outlined in the paper. And

1074
01:12:10,439 --> 01:12:13,199
yet I still talk with companies today, my colleagues at

1075
01:12:13,199 --> 01:12:17,039
other companies or ones that I consult for, and it's

1076
01:12:17,119 --> 01:12:20,199
not their priority. And it's like, you look at data

1077
01:12:20,279 --> 01:12:22,199
like this and the only conclusion you can come to is

1078
01:12:22,239 --> 01:12:25,079
it's the number one thing that everyone should be doing,

1079
01:12:25,159 --> 01:12:29,159
because it's the guaranteed way of making the most successful

1080
01:12:29,159 --> 01:12:32,680
company with the most revenue, the happiest customers and the

1081
01:12:32,720 --> 01:12:35,239
best employees. And yet people still aren't doing it.

1082
01:12:35,199 --> 01:12:37,479
Speaker 3: Fair enough.

1083
01:12:38,039 --> 01:12:40,399
Speaker 1: All right, Gorkem, what'd you bring for a pick today?

1084
01:12:41,479 --> 01:12:44,239
Speaker 3: Yeah, so it was a little bit last minute. So

1085
01:12:44,359 --> 01:12:48,239
I'm gonna pitch a coffee machine.

1086
01:12:49,640 --> 01:12:50,680
Speaker 1: I'm super interested.

1087
01:12:50,760 --> 01:12:51,840
Speaker 2: You have my full attention.

1088
01:12:53,239 --> 01:12:56,359
Speaker 3: Okay, So a little bit of background. There is something

1089
01:12:56,439 --> 01:13:00,520
called a golden ratio in the coffee world. If you're

1090
01:13:00,560 --> 01:13:05,159
doing drip coffee, right, there is something

1091
01:13:05,199 --> 01:13:09,039
called the golden ratio, which is the amount of time that

1092
01:13:09,399 --> 01:13:13,399
your beans are gonna spend with the boiled

1093
01:13:13,439 --> 01:13:17,960
water. And there is an institution, I can't remember which,

1094
01:13:18,439 --> 01:13:26,359
that measures these drip coffee

1095
01:13:26,359 --> 01:13:35,560
machines and issues the golden ratio certificate. So, yeah,

1096
01:13:36,319 --> 01:13:39,319
my first introduction to this machine actually goes

1097
01:13:39,359 --> 01:13:44,000
all the way back to Nokia, where we had these

1098
01:13:44,239 --> 01:13:48,520
kitchens on every floor in Nokia, and in

1099
01:13:48,640 --> 01:13:52,600
every kitchen there was one or two coffee machines

1100
01:13:52,840 --> 01:13:56,159
where you would go and make your own coffee, and

1101
01:13:57,039 --> 01:14:01,680
it was so ridiculous that the coffee would run

1102
01:14:01,720 --> 01:14:05,720
out and someone would have to make a new pot.

1103
01:14:05,800 --> 01:14:08,159
And one thing that we have done is we actually

1104
01:14:08,199 --> 01:14:11,399
put cameras on top of them so that we would

1105
01:14:11,479 --> 01:14:15,359
know if there is coffee in there or not before

1106
01:14:15,399 --> 01:14:21,800
we leave our spot. So all of these machines in Nokia

1107
01:14:21,880 --> 01:14:27,159
buildings were Moccamasters. And the thing about Moccamasters

1108
01:14:27,560 --> 01:14:30,960
is they are the simplest machines that you can think of.

1109
01:14:31,479 --> 01:14:34,399
Like all their parts, you can just take them

1110
01:14:35,439 --> 01:14:40,479
apart and then put them back in. Probably the most expensive

1111
01:14:40,520 --> 01:14:44,199
piece in it is the copper wire which boils the water.

1112
01:14:44,439 --> 01:14:47,439
And that's actually the secret of it, because it boils

1113
01:14:47,439 --> 01:14:52,600
the water to the correct degree at the correct time

1114
01:14:53,159 --> 01:14:58,039
and lets the water go through the beans, your coffee,

1115
01:14:58,079 --> 01:15:02,880
with the correct ratio, and therefore it's considered one of

1116
01:15:02,920 --> 01:15:06,880
the better coffee drip machines that you can get, and

1117
01:15:06,920 --> 01:15:11,840
they're expensive. Right.

1118
01:15:11,920 --> 01:15:18,680
Speaker 1: And on that note, I just learned last week that

1119
01:15:18,760 --> 01:15:22,880
the webcam was invented and I can't remember who it was.

1120
01:15:22,960 --> 01:15:25,079
I want to say it was at Nokia. The webcam

1121
01:15:25,239 --> 01:15:28,239
was invented because the engineers were tired of going to

1122
01:15:28,319 --> 01:15:32,399
the break room finding the coffee machine empty, so they

1123
01:15:32,399 --> 01:15:35,920
figured out how to hook a camera up to their

1124
01:15:35,960 --> 01:15:40,640
network and invented the first webcam.

1125
01:15:40,960 --> 01:15:45,000
Speaker 3: Uh well, by the time I was there it was already invented,

1126
01:15:46,640 --> 01:15:49,680
but it was a very common practice in Nokia.

1127
01:15:50,439 --> 01:15:55,520
Speaker 1: Yeah, rightly so, rightly so. All right. So my pick

1128
01:15:56,520 --> 01:15:59,600
is a little bit related to what we've been talking

1129
01:15:59,600 --> 01:16:02,840
about today, just with the early stages of AI/ML, you know,

1130
01:16:02,880 --> 01:16:05,920
and how do we make this production ready. And so

1131
01:16:06,199 --> 01:16:10,840
my pick is actually Voyager one and Voyager two, the

1132
01:16:10,960 --> 01:16:16,279
spacecraft that are outside of our solar system now. And

1133
01:16:16,319 --> 01:16:18,520
the reason I pick those is because just go and

1134
01:16:18,600 --> 01:16:22,279
watch some YouTube videos about this or read articles, whatever

1135
01:16:22,319 --> 01:16:25,800
your preferred format is. The level of engineering done on

1136
01:16:25,840 --> 01:16:31,039
these things is so amazing. The data storage on these

1137
01:16:31,039 --> 01:16:34,439
things is an eight-track tape drive that's still working

1138
01:16:34,640 --> 01:16:40,279
forty-seven years later, and you know, it's gone so

1139
01:16:40,399 --> 01:16:42,359
much further. Both of them have gone so much further

1140
01:16:42,439 --> 01:16:45,720
than what they were ever planned to do. But when

1141
01:16:45,720 --> 01:16:48,199
the engineers were designing it, they kind of hoped that

1142
01:16:48,279 --> 01:16:50,720
this would be the scenario, and so they built for

1143
01:16:50,760 --> 01:16:54,560
this and here we are utilizing it today and crazy

1144
01:16:54,640 --> 01:16:57,680
things like the backups that they have on there. The

1145
01:16:57,720 --> 01:17:02,560
primary thrusters worked for a long time and then they

1146
01:17:03,399 --> 01:17:08,079
rightly so, they failed, and so after thirty-seven years

1147
01:17:08,119 --> 01:17:12,000
of sitting dormant, they fired up the backup thrusters and

1148
01:17:12,039 --> 01:17:15,439
they worked flawlessly. And so just the level of engineering

1149
01:17:15,520 --> 01:17:19,399
in this, in the Voyager spacecraft, I think, is worth

1150
01:17:20,000 --> 01:17:22,880
spending a little bit of time to just acknowledge and

1151
01:17:23,039 --> 01:17:28,000
admire and then reflect on how that same engineering philosophy

1152
01:17:28,520 --> 01:17:33,399
can be applied to your common everyday tasks. So Voyager

1153
01:17:33,439 --> 01:17:35,439
one and Voyager two are my picks for the week.

1154
01:17:37,520 --> 01:17:38,960
Speaker 2: You got to upstage everyone there?

1155
01:17:39,039 --> 01:17:43,319
Speaker 1: Oh well, now, I just it just it was on

1156
01:17:43,319 --> 01:17:45,600
my YouTube feed last night. It wasn't planned at all,

1157
01:17:45,640 --> 01:17:48,960
but after watching it, that was like, wow, this relates

1158
01:17:49,000 --> 01:17:52,000
to what I do on so many levels.

1159
01:17:52,600 --> 01:17:55,159
Speaker 2: There's an inspirational YouTube video sitting out there that we

1160
01:17:55,199 --> 01:17:59,720
should get in the link section of this episode.

1161
01:18:00,039 --> 01:18:03,319
Speaker 1: I'll pull, I'll pull up my YouTube history and get the

1162
01:18:03,359 --> 01:18:05,119
link and make sure it gets into the show notes.

1163
01:18:05,520 --> 01:18:08,279
Sounds good, Yeah, because it was a cool video, super

1164
01:18:08,319 --> 01:18:10,399
cool and it's only like twenty minutes. It's not one

1165
01:18:10,439 --> 01:18:17,399
of those two-hour ones. And on that note, Gorkem,

1166
01:18:17,439 --> 01:18:19,319
thank you so much for joining us today. This has

1167
01:18:19,359 --> 01:18:22,279
been a cool conversation. I'm looking forward to seeing how

1168
01:18:22,279 --> 01:18:25,439
it comes out. And be sure to keep in touch,

1169
01:18:25,520 --> 01:18:28,600
let us know what kind of things you're working on, and

1170
01:18:28,640 --> 01:18:31,560
when you come across something cool, I'd love to have

1171
01:18:31,600 --> 01:18:32,920
you back on to talk about it.

1172
01:18:32,720 --> 01:18:33,880
Speaker 2: And figure out how it works.

1173
01:18:35,199 --> 01:18:38,159
Speaker 1: Awesome, And to all the listeners, thank you so much

1174
01:18:38,199 --> 01:18:42,079
for listening. Be sure and reach out to us if

1175
01:18:42,119 --> 01:18:47,159
you have questions, comments, episode ideas, or someone that you

1176
01:18:47,239 --> 01:18:51,920
want us to have on the show. We'd love your input. Warren,

1177
01:18:52,399 --> 01:18:55,600
thanks for joining me today. Yeah, as always, all right,

1178
01:18:55,680 --> 01:18:56,920
and we'll see y'all next week.

