1
00:00:01,919 --> 00:00:05,799
Speaker 1: Welcome everyone to another episode of Adventures in DevOps. Joining

2
00:00:05,799 --> 00:00:08,560
us in the studio today, Warren Parad. Warren, how are you?

3
00:00:08,519 --> 00:00:11,240
Speaker 2: Thanks for having me back. You know, I

4
00:00:11,240 --> 00:00:14,119
actually have a good fact for today that I thought

5
00:00:14,279 --> 00:00:17,079
was really interesting to share. There's malware out there

6
00:00:17,120 --> 00:00:21,719
called OtterCookie, and I know the economy for engineers

7
00:00:21,879 --> 00:00:24,879
is not so great at the moment. However, a lot

8
00:00:24,920 --> 00:00:28,160
of advertisements out there for job reqs may be from

9
00:00:28,199 --> 00:00:31,559
malicious attackers who are trying to get you to run

10
00:00:32,359 --> 00:00:35,280
GitHub repos or download packages from the internet to pass

11
00:00:35,320 --> 00:00:38,520
the interview, and those things will get installed on your

12
00:00:38,520 --> 00:00:41,679
machine and either try to steal your local crypto wallets

13
00:00:41,759 --> 00:00:45,759
or worse, be used for attacking whichever company you do

14
00:00:45,799 --> 00:00:49,000
get hired by. So I know it's a real struggle

15
00:00:49,039 --> 00:00:52,039
that you want to complete whatever take home assignment or

16
00:00:52,079 --> 00:00:53,759
you know, get the next job. But you really got

17
00:00:53,799 --> 00:00:56,840
to be careful in today's economy because this tech is

18
00:00:56,880 --> 00:00:59,920
out there that's just waiting to capitalize on a simple mistake.

19
00:01:01,200 --> 00:01:05,319
Speaker 1: That's just nuts, Like let's just kick someone while they're

20
00:01:05,359 --> 00:01:09,439
down right, Yeah, Like all of that is nuts.

21
00:01:09,519 --> 00:01:12,840
Speaker 3: I think getting homework from an interview is nuts. I

22
00:01:12,879 --> 00:01:17,319
think you know, potentially installing something on your computer that's

23
00:01:17,359 --> 00:01:19,879
going to make your computer go wild. It's not like

24
00:01:20,040 --> 00:01:22,400
it's just it's multiple levels of crazy.

25
00:01:23,920 --> 00:01:25,920
Speaker 1: Speaking of which, hi, Jillian, welcome to the show.

26
00:01:27,560 --> 00:01:28,640
Speaker 3: Thanks for having me back.

27
00:01:29,760 --> 00:01:31,719
Speaker 1: You guys are making me feel guilty. Thanks for having

28
00:01:31,719 --> 00:01:34,680
me back. Like it's conditional at this point, just show up.

29
00:01:36,239 --> 00:01:37,719
Speaker 3: Oh I don't know. I'm not sure that it is

30
00:01:37,760 --> 00:01:39,560
for me, but no, thank you.

31
00:01:40,120 --> 00:01:42,359
Speaker 2: I'm still appreciative for being here. I guess that is

32
00:01:42,400 --> 00:01:43,040
what I'll say.

33
00:01:43,359 --> 00:01:45,359
Speaker 1: Well, I'm happy to have you both here. You guys

34
00:01:45,359 --> 00:01:48,519
make my job a lot more fun and entertaining. And

35
00:01:48,560 --> 00:01:52,120
speaking of fun and entertainment, I'm looking forward to this episode.

36
00:01:52,560 --> 00:01:55,560
We have Alex Kearns joining us in the studio today,

37
00:01:56,200 --> 00:02:00,000
principal solutions architect from... dude, you just told me how

38
00:02:00,079 --> 00:02:02,280
to say this and my mind already went blank.

39
00:02:04,719 --> 00:02:08,400
Speaker 4: I've heard every possible permutation of how to pronounce

40
00:02:08,400 --> 00:02:11,639
it, and you said my name correctly, which a lot of people don't do.

41
00:02:13,319 --> 00:02:15,639
Speaker 1: Welcome to the show, man, I'm happy to have you here.

42
00:02:16,159 --> 00:02:16,800
Speaker 5: Great to be here.

43
00:02:16,800 --> 00:02:17,120
Speaker 4: Thank you.

44
00:02:17,560 --> 00:02:19,479
Speaker 1: Cool. So give us a little bit about your background

45
00:02:19,680 --> 00:02:23,039
now that we know how to pronounce it, and your work.

46
00:02:23,919 --> 00:02:28,960
Speaker 4: Yeah, great, I mean, I come from a software engineering background,

47
00:02:28,960 --> 00:02:31,479
if you kind of go way back to the post

48
00:02:31,599 --> 00:02:37,039
university career, and then made the move into cloud, both

49
00:02:37,319 --> 00:02:42,639
kind of internal consultancy and platform-type work, and

50
00:02:42,680 --> 00:02:49,560
then consultancy in terms of the traditional customer external facing consultancy.

51
00:02:49,599 --> 00:02:54,520
But I'm still very technically driven. I like getting my hands dirty.

52
00:02:54,840 --> 00:02:59,599
That for me is what's the most exciting part.

53
00:02:59,680 --> 00:03:05,680
It's building things, breaking things, learning from it. Yeah, paper

54
00:03:05,759 --> 00:03:09,919
architecture is not my, not my thing.

55
00:03:10,719 --> 00:03:13,520
Speaker 1: I hear you there. There's like a certain I think

56
00:03:13,560 --> 00:03:16,800
for people who succeed in this industry for a long time,

57
00:03:16,800 --> 00:03:19,759
there's a certain amount of entertainment value that you get

58
00:03:19,840 --> 00:03:20,639
from your job.

59
00:03:22,240 --> 00:03:27,439
Speaker 2: I'm definitely doing it wrong then, I mean I I

60
00:03:27,800 --> 00:03:31,000
my new belief is that when I retire, I'm just

61
00:03:31,039 --> 00:03:34,560
gonna go back into drawing boxes and lines. Like that's

62
00:03:34,639 --> 00:03:36,280
just like the best part of my job. Like when

63
00:03:36,280 --> 00:03:38,639
I can get on a piece of paper or whiteboard

64
00:03:38,960 --> 00:03:42,159
and draw boxes and lines, you know, not even any words necessarily,

65
00:03:42,240 --> 00:03:45,159
Like that's prime enjoyment right there.

66
00:03:45,360 --> 00:03:47,199
Speaker 1: I'm gonna get my typewriter and I'm gonna go to

67
00:03:47,199 --> 00:03:49,400
my cabin in Montana. Screw all you guys.

68
00:03:51,120 --> 00:03:52,960
Speaker 3: So so speaking of just like I'm going to do

69
00:03:53,000 --> 00:03:55,240
something really simple and turn my brain off, I bought

70
00:03:55,280 --> 00:03:57,680
like a paint-by-numbers kit, and that has been...

71
00:03:58,120 --> 00:03:59,919
It just reminded me a lot of what you said,

72
00:04:00,039 --> 00:04:01,879
because I just sit there and I just paint in

73
00:04:01,919 --> 00:04:05,199
the numbers and I don't care that I'm nearly forty

74
00:04:05,240 --> 00:04:08,719
and doing, as an adult, an activity for children. It's great,

75
00:04:08,800 --> 00:04:11,000
and it's just, you just turn your brain off. It's great.

76
00:04:11,039 --> 00:04:11,400
It's great.

77
00:04:11,439 --> 00:04:14,960
Speaker 1: Anyways, I've actually heard quite a few people comment how

78
00:04:15,120 --> 00:04:17,360
like therapeutic and relaxing that is.

79
00:04:17,879 --> 00:04:20,519
Speaker 3: They really are. They're very relaxing. It's great.

80
00:04:21,680 --> 00:04:26,079
Speaker 4: I think with so many things in the tech industry, brains

81
00:04:26,079 --> 00:04:29,399
are so switched on all the time. There has to

82
00:04:29,399 --> 00:04:34,160
be a way to switch off, otherwise work-life balance

83
00:04:34,800 --> 00:04:35,600
is pretty nil.

84
00:04:36,120 --> 00:04:38,600
Speaker 2: Well, it's interesting you bring that up, because actually, very

85
00:04:38,639 --> 00:04:42,680
commonly we keep being in an always-production mode,

86
00:04:42,720 --> 00:04:48,160
like everything we do is happening at a critical level

87
00:04:48,199 --> 00:04:50,600
and we have to pass that test. Like whatever work

88
00:04:50,600 --> 00:04:53,879
we're doing, there's no practice involved. It's always run time

89
00:04:53,920 --> 00:04:56,600
for us. And there's a lot that there's a bunch

90
00:04:56,600 --> 00:04:58,439
of research out there that says like we have to

91
00:04:58,519 --> 00:05:01,879
go into practice mode where mistakes can be made, failures

92
00:05:01,879 --> 00:05:04,680
can be had, and we can actually learn from it intentionally,

93
00:05:05,160 --> 00:05:07,160
and without that we will like that's actually one of

94
00:05:07,160 --> 00:05:09,720
the biggest causes of burnout. So you know, if it's

95
00:05:09,759 --> 00:05:13,079
going home and doing watercolor painting, you know, whatever

96
00:05:13,120 --> 00:05:16,680
it takes there. If that somehow helps you recharge realistically,

97
00:05:17,000 --> 00:05:17,759
definitely do it.

98
00:05:18,319 --> 00:05:21,920
Speaker 1: There's also a lot of evidence showing that taking on

99
00:05:21,959 --> 00:05:25,480
those kind of activities, while you're not consciously thinking about

100
00:05:25,519 --> 00:05:28,839
the problem, your subconscious is continuing to work on it,

101
00:05:28,839 --> 00:05:31,480
and that's when some of the big insights and big

102
00:05:31,519 --> 00:05:34,360
breakthroughs for you occur. Like I know, there's a really

103
00:05:34,360 --> 00:05:36,560
common anecdote of like when you're in the shower and

104
00:05:36,600 --> 00:05:40,000
you have this great idea. That's like a really good

105
00:05:40,040 --> 00:05:42,040
example of that whole process in action.

106
00:05:44,759 --> 00:05:48,759
Speaker 3: True. I don't do important jobs anymore. Like I'm just, I'm

107
00:05:48,800 --> 00:05:52,120
just not doing them, but like I mean, like they matter,

108
00:05:52,199 --> 00:05:54,639
but not like, not on like a huge, 'I'm always

109
00:05:54,680 --> 00:05:58,480
in production, everything has to be perfect' level. Like it's fine,

110
00:05:58,600 --> 00:06:00,600
it'll be fine if it's not done until a couple

111
00:06:00,600 --> 00:06:02,879
of days later. I used to do important stuff though,

112
00:06:03,079 --> 00:06:06,839
and I don't want to anymore. So that's I suppose

113
00:06:06,879 --> 00:06:08,639
that's your lesson: this is where I'm at, doing

114
00:06:08,639 --> 00:06:10,720
my paint-by-numbers, and things don't really matter.

115
00:06:13,639 --> 00:06:15,439
Speaker 1: Well, one of the things we were talking about before

116
00:06:15,519 --> 00:06:22,000
we started recording the episode was leveraging gen AI, and

117
00:06:22,040 --> 00:06:24,759
Alex you've got some experience with that, specifically some experience

118
00:06:24,800 --> 00:06:29,600
of like real world examples where you've done that, and

119
00:06:29,639 --> 00:06:32,439
I think that's one of the big I think that's

120
00:06:32,439 --> 00:06:34,519
one of the cool things about AI. You know, it

121
00:06:34,560 --> 00:06:38,240
goes through this this buzz cycle, but people who are

122
00:06:38,240 --> 00:06:40,959
actually putting it to real world use. I'm interested to

123
00:06:41,000 --> 00:06:42,360
hear your take on that.

124
00:06:43,600 --> 00:06:45,959
Speaker 4: Yeah, I think it's a it's a really it's a

125
00:06:46,000 --> 00:06:50,439
really interesting topic. It's it's something where you go back

126
00:06:51,600 --> 00:06:58,399
eighteen months, maybe, ChatGPT was kind of just about

127
00:06:58,439 --> 00:07:03,879
starting to be established as almost a household name. People aren't

128
00:07:03,920 --> 00:07:09,120
necessarily using it actively, but tools like that are becoming

129
00:07:09,680 --> 00:07:14,399
more and more common. And then I think with with

130
00:07:14,519 --> 00:07:18,319
any technology, it's it's when it gets kind of democratized,

131
00:07:18,360 --> 00:07:21,959
when it gets put in the hands of people that

132
00:07:22,000 --> 00:07:27,839
aren't having to spend millions on hardware and and do

133
00:07:27,920 --> 00:07:31,360
those kind of things that it actually really starts to

134
00:07:31,399 --> 00:07:36,160
become an awful lot more prevalent. So I think as

135
00:07:36,560 --> 00:07:40,560
we saw with any technology, So you think back to

136
00:07:42,160 --> 00:07:46,040
kind of mid twenty tens, I suppose, where things like

137
00:07:46,079 --> 00:07:50,639
AWS Lambda came out, so kind of serverless technologies, and

138
00:07:50,680 --> 00:07:55,920
then the years after that where every SaaS company that

139
00:07:56,120 --> 00:07:59,600
existed was going for a 'we now have a serverless

140
00:07:59,639 --> 00:08:03,839
offering,' and it's like, what is it? Serverless? Is

141
00:08:03,879 --> 00:08:06,399
it just a managed service?

142
00:08:06,519 --> 00:08:07,839
Speaker 5: Is it?

143
00:08:07,879 --> 00:08:11,000
Speaker 4: What is your, what is your definition of serverless?

144
00:08:11,319 --> 00:08:11,560
Speaker 1: Right?

145
00:08:12,959 --> 00:08:15,519
Speaker 4: So you see the buzz around that. You see buzz

146
00:08:15,639 --> 00:08:21,079
around even cloud, which is, I mean public

147
00:08:21,079 --> 00:08:23,720
cloud is twenty years old if you go back

148
00:08:23,720 --> 00:08:28,079
to AWS's first service, so it's not, it's not new,

149
00:08:28,160 --> 00:08:32,879
it's not shiny anymore. And AI, I think is going

150
00:08:32,879 --> 00:08:35,919
through the same thing, but just at a much much

151
00:08:35,960 --> 00:08:36,600
faster pace.

152
00:08:37,120 --> 00:08:40,759
Speaker 2: Well, that's a really interesting comparison though. Like I just

153
00:08:40,840 --> 00:08:42,440
I want to stop here for a second there because

154
00:08:43,879 --> 00:08:46,639
I feel like there's a sort of a weird duality

155
00:08:46,679 --> 00:08:50,080
where serverless made it easier for people to get into

156
00:08:50,480 --> 00:08:53,799
building stuff and releasing applications because it didn't require you

157
00:08:53,879 --> 00:08:57,559
to purchase or allocate huge data center capacity in order

158
00:08:57,559 --> 00:09:00,360
to make that happen. I feel like where

159
00:09:00,399 --> 00:09:04,000
the point AI is currently at is, it actually

160
00:09:04,000 --> 00:09:09,080
does require only the most expensive access to hardware or

161
00:09:09,240 --> 00:09:11,120
service providers to be able to get that. So I

162
00:09:11,120 --> 00:09:14,000
don't think like I don't know if it's been democratized yet.

163
00:09:14,320 --> 00:09:15,960
I mean, there's a lot of services out there that

164
00:09:16,240 --> 00:09:20,039
claim to get you access to some facet of AI,

165
00:09:20,120 --> 00:09:23,159
and I know there's like ChatGPTs and LLMs out there

166
00:09:23,200 --> 00:09:26,919
where it's questionable how much value they're returning to you.

167
00:09:27,480 --> 00:09:29,799
But I think that like the real core aspect of

168
00:09:30,080 --> 00:09:33,600
being able to provide the underlying resources or technology to people,

169
00:09:33,639 --> 00:09:35,799
I think it's still much too far away.

170
00:09:36,639 --> 00:09:38,399
Speaker 4: I think the way you described it is great. It's

171
00:09:39,279 --> 00:09:44,360
giving people access to a facet of AI. I mean,

172
00:09:44,399 --> 00:09:49,440
if we think about AI as a general topic, artificial

173
00:09:49,480 --> 00:09:55,200
intelligence more broadly has been around for decades. It's only

174
00:09:55,279 --> 00:09:58,960
really that when you sort of start breaking it further

175
00:09:59,039 --> 00:10:04,120
down into machine learning and deep learning and now generative

176
00:10:04,120 --> 00:10:08,000
AI is a kind of subset of that that the

177
00:10:08,360 --> 00:10:13,720
generative AI and the AI terminologies are now almost interchangeable.

178
00:10:14,759 --> 00:10:19,320
Certainly from an industry perspective, I think it very much

179
00:10:19,360 --> 00:10:22,279
depends on what what people are wanting to do with

180
00:10:22,399 --> 00:10:28,000
AI and how how specific their use case is. So

181
00:10:28,919 --> 00:10:34,360
if we think about things like like chat GPT, that's

182
00:10:34,360 --> 00:10:37,679
obviously a very specific use case. It gives you very

183
00:10:37,960 --> 00:10:42,480
generic responses to things. It hasn't got access to your

184
00:10:43,919 --> 00:10:49,559
your specific business data. But it's free, and as with

185
00:10:49,600 --> 00:10:52,879
any free product, you are normally the product.

186
00:10:54,799 --> 00:10:56,840
Of course you can you can opt out of things,

187
00:10:56,879 --> 00:11:01,120
but by default it's it's collecting that chat history to

188
00:11:01,200 --> 00:11:05,480
improve the service for everyone. You've then got things like

189
00:11:07,120 --> 00:11:14,279
Amazon Bedrock. So Bedrock is AWS's generative AI offering that

190
00:11:14,320 --> 00:11:20,440
was announced at their conference in twenty twenty-three.

191
00:11:20,960 --> 00:11:26,039
So Bedrock offers kind of two different modalities I suppose

192
00:11:26,240 --> 00:11:28,720
in how you can use it. One is on demand

193
00:11:30,240 --> 00:11:34,480
where you pay per thousand tokens. So that's the way

194
00:11:34,519 --> 00:11:37,480
where you can you can go and build something. Again,

195
00:11:37,519 --> 00:11:42,039
it's similar to ChatGPT in the sense of its

196
00:11:42,240 --> 00:11:45,679
generic knowledge. It's whatever the large language model has been

197
00:11:45,679 --> 00:11:50,679
trained on. But because as with any kind of cloud

198
00:11:50,720 --> 00:11:55,039
hosted managed service, they can take advantage of economies of

199
00:11:55,039 --> 00:11:59,000
scale and give you pay-as-you-go pricing. The

200
00:11:59,000 --> 00:12:02,559
moment that you want to fine tune that model or

201
00:12:03,200 --> 00:12:06,480
train a different model with your specific data. You're going

202
00:12:06,480 --> 00:12:09,480
from paying kind of fractions of a cent per thousand

203
00:12:09,559 --> 00:12:14,360
tokens to having to commit to thirty thousand for three

204
00:12:14,360 --> 00:12:17,480
months because you are now the one that's bearing the

205
00:12:17,480 --> 00:12:21,000
cost of all of that hosting rather than AWS

206
00:12:21,080 --> 00:12:25,240
making it available. And of course there are ways that

207
00:12:25,279 --> 00:12:29,200
you can augment the use of large language models with

208
00:12:29,240 --> 00:12:33,120
your own data without going to that extent. So even

209
00:12:33,240 --> 00:12:38,080
just including examples of your specific data in a prompt

210
00:12:39,159 --> 00:12:44,399
or retrieval augmented generation, where you can load your

211
00:12:44,440 --> 00:12:48,840
own documents into a vector database and have it retrieve

212
00:12:48,879 --> 00:12:52,919
that data from there, there's lots of ways you can

213
00:12:53,000 --> 00:12:57,279
kind of get quite a long way without spending huge

214
00:12:57,279 --> 00:13:01,080
amounts of money. The moment you want to get

215
00:13:01,120 --> 00:13:05,399
to the 'I have complete control of my model, I

216
00:13:05,480 --> 00:13:10,159
train it with my specific data,' then you'd hope that

217
00:13:10,159 --> 00:13:12,919
that's when you start getting to the the type of

218
00:13:12,960 --> 00:13:16,399
customers who can afford to spend that kind of money.
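
A minimal sketch of the on-demand, pay-per-token Bedrock usage Alex describes, using boto3. The region, model ID, and prompt are illustrative placeholders rather than details from the conversation, and the request body shape assumes an Anthropic Claude model hosted on Bedrock.

import json
import boto3

# Bedrock runtime client; the region is a placeholder.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# On-demand invocation: you pay per input/output token, with no capacity commitment.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 300,
        "messages": [
            {"role": "user", "content": "Summarise what a vector database is in two sentences."}
        ],
    }),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])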

219
00:13:16,919 --> 00:13:20,080
Speaker 2: So, in your capacity at your current job, where you're

220
00:13:20,120 --> 00:13:23,320
interfacing with clients and whatnot, do you find there is

221
00:13:23,360 --> 00:13:26,440
one particular provider or one set of tools that you're

222
00:13:26,440 --> 00:13:30,840
constantly going to, or a whole breadth of tools for

223
00:13:30,960 --> 00:13:34,720
different types of tasks to help assess?

224
00:13:34,840 --> 00:13:39,320
Speaker 4: Yes. We are an AWS consultancy, so everything tends

225
00:13:39,320 --> 00:13:43,960
to center around Amazon tools. But of course Azure

226
00:13:44,120 --> 00:13:47,399
and Google both have AI offerings

227
00:13:47,080 --> 00:13:48,000
Speaker 5: Now as well.

228
00:13:49,639 --> 00:13:54,440
Speaker 4: And Microsoft, with their investment into OpenAI, are

229
00:13:54,519 --> 00:13:57,679
the only provider that offers the OpenAI models

230
00:13:58,039 --> 00:14:00,919
on the public cloud, and I think that will stay

231
00:14:00,960 --> 00:14:04,960
the same for for a number of years. In terms

232
00:14:05,039 --> 00:14:08,440
of the tools that we reach for, there's definitely a

233
00:14:08,440 --> 00:14:13,559
combination of kind of vendor specific but also open source tools.

234
00:14:13,919 --> 00:14:19,600
So in terms of hosting large language models, the kind

235
00:14:19,600 --> 00:14:23,039
of the most frictionless way to access them is through

236
00:14:23,240 --> 00:14:29,200
Amazon Bedrock, really really straightforward API, easy to write scripts

237
00:14:29,240 --> 00:14:35,200
to interact with that, either synchronous, asynchronous, chat, however you

238
00:14:35,240 --> 00:14:38,159
need to. But then you can start to bring in

239
00:14:38,639 --> 00:14:42,799
open source tools, so things like LangChain, which is a

240
00:14:42,840 --> 00:14:49,320
really popular open source framework where you can use Bedrock,

241
00:14:49,399 --> 00:14:54,679
you can use Microsoft's hosted models, Google, OpenAI. However

242
00:14:54,720 --> 00:14:58,360
you need to interact with your your large language models

243
00:14:58,720 --> 00:15:02,919
and then bring in those other parts like retrieval augmented

244
00:15:03,360 --> 00:15:07,039
generation where you can say I've got a database full

245
00:15:07,080 --> 00:15:10,200
of documents: these are my business documents, my

246
00:15:10,360 --> 00:15:15,360
sales reports, my financial reports, whatever they need to be.

247
00:15:15,919 --> 00:15:20,200
And then when your large language model takes your prompt,

248
00:15:20,559 --> 00:15:24,080
it can then use that data that you've provided without

249
00:15:24,080 --> 00:15:28,080
having to specifically train the model to augment the generation

250
00:15:28,120 --> 00:15:32,120
of its response. And there's lots of open source tools

251
00:15:32,159 --> 00:15:36,679
that can do things like that. I think what will

252
00:15:36,720 --> 00:15:41,039
be really interesting as these kind of I want to

253
00:15:41,080 --> 00:15:43,759
say years, but I think it's going to be months

254
00:15:43,440 --> 00:15:46,240
with how things are going at the moment. As the

255
00:15:46,279 --> 00:15:50,799
next few months go by, so many open source packages

256
00:15:50,799 --> 00:15:55,039
are popping up, but it's how these open source packages

257
00:15:55,840 --> 00:15:57,320
stay around long term.

258
00:15:57,480 --> 00:15:58,799
Speaker 5: So it's.

259
00:16:00,120 --> 00:16:04,600
Speaker 4: Unless they are backed by a big business, what

260
00:16:04,679 --> 00:16:12,519
makes them commercially sustainable? I think we've seen frameworks

261
00:16:12,639 --> 00:16:17,240
like CrewAI, which is a framework for building AI

262
00:16:17,639 --> 00:16:21,759
agents and multiple agents and kind of orchestrating like what

263
00:16:21,919 --> 00:16:24,840
agent would get called for a particular type of task.

264
00:16:25,720 --> 00:16:30,039
They've now introduced a commercial model where they can

265
00:16:30,080 --> 00:16:35,600
take on some of the management and observability around those agents,

266
00:16:35,720 --> 00:16:38,240
or you can just use the framework open source.
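
A rough, illustrative sketch of the retrieval augmented generation pattern Alex outlines: embed your own documents, retrieve the closest one for a question, and include it in the prompt instead of fine-tuning the model. The embedding and chat model IDs, the region, and the in-memory "index" standing in for a real vector database are all assumptions for illustration, not anything stated on the show.

import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

def embed(text: str) -> list[float]:
    # Placeholder embedding model ID; swap in whichever embedding model you actually use.
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(resp["body"].read())["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

# In a real system these documents would live in a vector database; a small list stands in here.
documents = [
    "Q3 sales grew 12% in the EMEA region.",
    "The refund policy allows returns within 30 days.",
]
index = [(doc, embed(doc)) for doc in documents]

question = "How did EMEA sales do last quarter?"
q_vec = embed(question)
best_doc = max(index, key=lambda pair: cosine(q_vec, pair[1]))[0]

# Augment the prompt with the retrieved context rather than training on it.
prompt = f"Answer using only this context:\n{best_doc}\n\nQuestion: {question}"
resp = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 200,
        "messages": [{"role": "user", "content": prompt}],
    }),
)
print(json.loads(resp["body"].read())["content"][0]["text"])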

267
00:16:39,039 --> 00:16:42,159
Speaker 2: So I feel like I want to ask about that.

268
00:16:42,440 --> 00:16:46,480
Do you you're supporting your customers in utilizing AI within

269
00:16:46,519 --> 00:16:50,720
their businesses. Have you seen significant change? Say over I mean,

270
00:16:50,960 --> 00:16:54,519
I think things are changing very frequently in a month period,

271
00:16:54,600 --> 00:16:58,360
so you know, compared to you know, early twenty twenty

272
00:16:58,360 --> 00:17:02,960
three to now, Like what's the next thing, Like what

273
00:17:03,000 --> 00:17:07,200
are customers now most interested in utilizing? Is it one

274
00:17:07,200 --> 00:17:10,119
of the particular providers more so than others. Is it

275
00:17:10,240 --> 00:17:13,720
just a smattering of everything or do you really see

276
00:17:14,000 --> 00:17:17,359
something taking off specifically in the businesses that you're working with.

277
00:17:18,160 --> 00:17:21,160
Speaker 4: So I think it's worth making it kind

278
00:17:21,160 --> 00:17:24,680
of provider-agnostic and thinking more about use cases and

279
00:17:24,839 --> 00:17:30,279
drivers for use of AI. So thinking back twelve months,

280
00:17:31,440 --> 00:17:37,279
even twenty four months, there was a lot I think

281
00:17:37,319 --> 00:17:42,480
there was a lot more of the people wanting to

282
00:17:42,599 --> 00:17:47,119
use AI for the sake of using AI. So we've

283
00:17:47,119 --> 00:17:51,960
had conversations with customers who have said, my board member

284
00:17:52,319 --> 00:17:54,680
has said, as a business, we've got to be using

285
00:17:54,680 --> 00:17:58,160
AI, because investors want to hear it, the public needs to see it.

286
00:17:59,279 --> 00:18:02,039
Can you help us use AI? It's like, well, of course,

287
00:18:02,160 --> 00:18:05,160
but let's let's take that step back. Let's try and

288
00:18:05,160 --> 00:18:08,519
work out where there is a genuine use case. I

289
00:18:08,519 --> 00:18:12,880
think we've been kind of getting through. If we use

290
00:18:12,920 --> 00:18:17,200
the Gartner hype cycle as a framework, I suppose, here

291
00:18:17,319 --> 00:18:22,119
where I think we've we've definitely passed that peak of

292
00:18:23,839 --> 00:18:27,319
inflated expectations, that kind of top of the top of

293
00:18:27,319 --> 00:18:31,519
the hype hype cycle where everyone is using AI for

294
00:18:31,559 --> 00:18:34,880
the sake of using AI. People want to do way

295
00:18:34,920 --> 00:18:40,720
more with it than is really feasible and ethical, sustainable everything,

296
00:18:41,799 --> 00:18:46,279
And we now I think we're starting to quite rapidly

297
00:18:46,319 --> 00:18:50,119
get into that point of what the hype cycle terms

298
00:18:50,119 --> 00:18:56,119
as the trough of disillusionment, where people are thinking, I'm

299
00:18:56,160 --> 00:19:02,240
seeing AI so much every service, every tool, every news article.

300
00:19:02,440 --> 00:19:06,000
I mean even in the in the UK today our

301
00:19:06,039 --> 00:19:08,880
Prime Minister came out and announced a big plan for

302
00:19:08,880 --> 00:19:11,880
for rolling out AI and growth programs across the country.

303
00:19:12,440 --> 00:19:16,079
So it's it's gone from just the those big tech

304
00:19:16,160 --> 00:19:22,119
providers to government, politics, to everything. It dominates everywhere,

305
00:19:24,279 --> 00:19:26,640
and I think it's almost sort of starting to see

306
00:19:26,640 --> 00:19:29,839
a little bit of fatigue in companies where it's a

307
00:19:31,440 --> 00:19:34,519
it is a case of every vendor is talking to

308
00:19:34,559 --> 00:19:40,480
us about AI or their latest AI powered offering, and

309
00:19:40,599 --> 00:19:44,000
that next stage, which I don't think we're far

310
00:19:44,039 --> 00:19:49,960
away from now, is working out, and really making visible,

311
00:19:50,400 --> 00:19:54,720
the use cases for AI that are here to stick around.

312
00:19:55,039 --> 00:19:58,279
Speaker 2: So the one yeah, no, no, I totally get it.

313
00:19:58,400 --> 00:20:01,599
I mean it is quite, it's everywhere, as

314
00:20:01,599 --> 00:20:03,839
you said, and I'm I sort of want to ask

315
00:20:03,839 --> 00:20:08,279
our resident ML expert here, you know, what she's seen,

316
00:20:08,480 --> 00:20:11,039
you know, comparative to with what you brought up. I

317
00:20:11,079 --> 00:20:12,319
know she loves to talk about it.

318
00:20:13,079 --> 00:20:16,039
Speaker 3: I love AI. I think I think AI is very cool.

319
00:20:16,079 --> 00:20:19,680
I'm still really seeing people on the upward side of

320
00:20:19,720 --> 00:20:21,960
the hype cycle. Like I have a small AI service

321
00:20:21,960 --> 00:20:24,200
that I offer. I had to stop offering it like

322
00:20:24,279 --> 00:20:26,960
publicly because people were just coming in with these very

323
00:20:27,000 --> 00:20:29,599
outsized expectations. And now I'm like, okay, we have to

324
00:20:29,880 --> 00:20:31,759
we have to schedule like a ten to fifteen minute

325
00:20:31,759 --> 00:20:34,559
talk first so that I can like, you know, adjust

326
00:20:34,599 --> 00:20:39,000
some of these expectations and things. But besides that, I

327
00:20:39,000 --> 00:20:42,160
think it's very It's great if you're using it kind

328
00:20:42,200 --> 00:20:45,240
of for what it's good at, and then it's terrible

329
00:20:45,440 --> 00:20:48,480
if you're not. Like you know, I think probably a

330
00:20:48,519 --> 00:20:50,640
lot of the public policy stuff might be a little

331
00:20:50,640 --> 00:20:54,119
bit maybe a little bit like outsized in terms of

332
00:20:54,119 --> 00:20:55,920
what it can do. But maybe you know, but maybe

333
00:20:55,960 --> 00:20:58,119
people making these kind of requests and just being like, well,

334
00:20:58,160 --> 00:21:00,319
shouldn't it be doing this? That's probably what's going to

335
00:21:00,359 --> 00:21:03,119
drive innovation forward. So I have kind of I have

336
00:21:03,240 --> 00:21:04,599
kind of like mixed feelings about it.

337
00:21:04,599 --> 00:21:10,200
Speaker 1: I guess, well, I think that's there's like I want

338
00:21:10,200 --> 00:21:12,200
to drill in on that for a second, like you're

339
00:21:12,240 --> 00:21:15,680
having the right expectations for it. Alex and Jillian,

340
00:21:15,720 --> 00:21:20,200
you've both brought that up. What are some of the

341
00:21:20,240 --> 00:21:26,200
good use cases where you've seen AI really make a difference, Alex?

342
00:21:27,799 --> 00:21:30,319
Speaker 4: So I can talk about one one kind of specific

343
00:21:30,359 --> 00:21:36,160
customer use case where as part of a migration, there

344
00:21:36,279 --> 00:21:43,039
was, we had two hundred and fifty PHP cron jobs

345
00:21:43,119 --> 00:21:45,440
that were running on an on-premises server.

346
00:21:46,240 --> 00:21:50,519
Speaker 5: Now, these were some of these were.

347
00:21:52,920 --> 00:21:57,519
Speaker 4: Been years old, kind of dating back to

348
00:21:58,240 --> 00:22:02,039
early PHP five and PHP four with some of them

349
00:22:03,000 --> 00:22:06,559
and some of them were they were little scripts. Some

350
00:22:06,599 --> 00:22:09,240
of them were fifty lines, some of them were four

351
00:22:09,279 --> 00:22:11,920
or five hundred lines. And then you have the ones

352
00:22:11,960 --> 00:22:17,079
that import from different files. And we kind of took

353
00:22:17,119 --> 00:22:20,000
the view of, actually, is there a way we can

354
00:22:20,400 --> 00:22:23,359
can do something here to speed up the analysis of

355
00:22:24,039 --> 00:22:26,759
these scripts. There are a few things that we need

356
00:22:26,799 --> 00:22:30,000
to know, we need to know what databases does this

357
00:22:30,039 --> 00:22:34,400
script talk to? Does it interact with any other services

358
00:22:34,480 --> 00:22:40,680
so APIs, does it interact with things like in the

359
00:22:40,720 --> 00:22:43,759
case of the on-premises servers, things like the

360
00:22:43,799 --> 00:22:50,279
sendmail binary on a server, because this particular customer for

361
00:22:50,319 --> 00:22:54,480
the rest of their business used a managed SMTP service

362
00:22:54,519 --> 00:22:59,119
now and not just sending emails from on premises servers.
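
A hypothetical sketch of how a script-analysis prompt along the lines described here could be structured, asking a model to report the databases, external APIs, and sendmail usage it finds as JSON. The prompt wording, model ID, and region are assumptions for illustration, not the team's actual implementation.

import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

ANALYSIS_PROMPT = """You are reviewing a legacy PHP cron job before a cloud migration.
Return a JSON object with three keys:
  "databases": database names or connection targets the script talks to,
  "external_apis": external services or API endpoints it calls,
  "uses_sendmail": true if it shells out to the local sendmail binary.
Only report what is actually present in the code.

Script:
{script}"""

def analyse_script(php_source: str) -> dict:
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 500,
            "messages": [{"role": "user", "content": ANALYSIS_PROMPT.format(script=php_source)}],
        }),
    )
    text = json.loads(resp["body"].read())["content"][0]["text"]
    return json.loads(text)  # assumes the model returned plain JSON, which should be verified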

363
00:23:00,920 --> 00:23:03,400
So we did a bit of a proof of concept

364
00:23:03,440 --> 00:23:07,640
around well, primarily just the way to speed up our

365
00:23:07,680 --> 00:23:13,240
own analysis, because yeah, I mean, nobody wants to read through.

366
00:23:14,920 --> 00:23:15,720
Speaker 5: Line by line.

367
00:23:16,359 --> 00:23:18,119
Speaker 1: No, say, it's not so.

368
00:23:18,279 --> 00:23:20,480
Speaker 2: Well, I mean, if you're looking at just as an example,

369
00:23:20,559 --> 00:23:23,680
you know, you said accessing a particular database, right? Like if

370
00:23:23,720 --> 00:23:26,880
you have calls out to some on prem or third

371
00:23:26,880 --> 00:23:30,440
party data provider and they're migrating to AWS,

372
00:23:30,519 --> 00:23:32,839
they may be going to either a NoSQL option

373
00:23:33,000 --> 00:23:35,440
or RDS, and so you want to make sure

374
00:23:35,480 --> 00:23:38,240
that those get converted, assuming you just port the scripts,

375
00:23:38,319 --> 00:23:41,119
you know, lift and shift into EC2s. So I

376
00:23:41,160 --> 00:23:43,960
mean a question that I might have is where do

377
00:23:44,000 --> 00:23:47,440
you find yourself on the risk scale, Like what happens

378
00:23:47,480 --> 00:23:49,640
if the LLM, which is not going to be perfect?

379
00:23:49,960 --> 00:23:52,759
I like, I mean, obviously it makes up tables and

380
00:23:52,759 --> 00:23:56,400
whatnot that aren't being actually used. It's that's fine, but

381
00:23:56,519 --> 00:23:59,039
I think the false negatives would be more of a problem,

382
00:23:59,119 --> 00:24:01,559
Like what happens if it misses a table? Would that have

383
00:24:01,599 --> 00:24:04,079
caused an issue during the migration And how did you

384
00:24:04,200 --> 00:24:07,559
potentially think about mitigating those sorts of risks.

385
00:24:08,160 --> 00:24:12,119
Speaker 4: Yes, so what we did was rather than kind of

386
00:24:12,160 --> 00:24:16,200
invest the time upfront to build a fully working solution,

387
00:24:16,279 --> 00:24:20,160
it was let's do a test on a few scripts first.

388
00:24:20,200 --> 00:24:23,160
Let's let's try over three or four scripts, scripts that

389
00:24:23,200 --> 00:24:25,640
we have done the analysis by hand on so we

390
00:24:25,720 --> 00:24:29,720
know we've got a golden answer that there is a

391
00:24:30,720 --> 00:24:36,039
ground truth that we're comparing against. When we did

392
00:24:36,160 --> 00:24:40,519
run it across the full data set, then we did

393
00:24:40,519 --> 00:24:45,920
dip sampling to make sure that a reasonable portion of

394
00:24:45,960 --> 00:24:49,960
them were accurate. The other thing that we did with

395
00:24:50,039 --> 00:24:55,480
these was, when migrating those scripts, made use of different environments.

396
00:24:55,680 --> 00:25:00,880
So these were going to a Kubernetes cluster in AWS, and

397
00:25:01,680 --> 00:25:04,880
because we had a dev and a staging environment,

398
00:25:04,880 --> 00:25:08,039
we knew that we could run these scripts in a

399
00:25:08,480 --> 00:25:12,920
sandbox pre production environment. If they failed kind of so

400
00:25:13,039 --> 00:25:16,240
be it. It's not going to bring the business down.

401
00:25:16,279 --> 00:25:19,680
It's not going to send customers emails because we can

402
00:25:20,079 --> 00:25:25,079
we can trap emails, we can make sure that any

403
00:25:25,160 --> 00:25:29,799
calls outside of a particular network are monitored and blocked,

404
00:25:31,119 --> 00:25:34,720
so we can quite easily see actually we haven't got

405
00:25:34,720 --> 00:25:40,000
to just cut over straight to production. I think the

406
00:25:40,480 --> 00:25:46,680
point around the risk and the trust that we often

407
00:25:46,759 --> 00:25:50,000
put in, or people are increasingly putting in large language

408
00:25:50,000 --> 00:25:54,720
models is really interesting because as you start to see

409
00:25:54,720 --> 00:25:59,599
it used in more in more regulated industries, there's an

410
00:26:00,079 --> 00:26:05,160
incredible amount of, I suppose, power that large language

411
00:26:05,160 --> 00:26:12,039
models are being given and when we get to that

412
00:26:12,119 --> 00:26:15,759
point of a model being able to be the only

413
00:26:16,559 --> 00:26:21,160
process and the only tool in a loop, I don't know,

414
00:26:21,319 --> 00:26:25,519
but I don't think we're there yet. Even when you

415
00:26:25,519 --> 00:26:31,319
think about more traditional machine learning use cases, like one

416
00:26:31,319 --> 00:26:33,680
of my favorites is the kind of fraud detection

417
00:26:34,200 --> 00:26:37,319
in banking, where you make a purchase on a credit card,

418
00:26:38,519 --> 00:26:40,920
if it doesn't look like something that you would typically do,

419
00:26:41,160 --> 00:26:43,200
or it's in a country that you haven't been to

420
00:26:43,279 --> 00:26:49,240
before, an anomalous amount, then models should pick that up and say, yeah,

421
00:26:49,240 --> 00:26:52,079
it's not a, not a transaction we're going to allow

422
00:26:52,160 --> 00:26:56,079
to be processed because we think it's not you. But

423
00:26:57,319 --> 00:27:05,880
that's been honed and refined over probably decades. Are large

424
00:27:05,960 --> 00:27:10,240
language models going to be accelerated quickly enough for us

425
00:27:10,279 --> 00:27:12,799
to be able to make use of that level of

426
00:27:13,119 --> 00:27:14,480
power and that level of trust?
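
A small sketch of the validation approach just described: compare the model's analysis against hand-written "golden" answers for a random sample of scripts before trusting it across the full set. The filenames and data shapes are invented purely for illustration.

import random

# Hand-written golden answers for a handful of scripts (hypothetical shape).
golden = {
    "billing_export.php": {"databases": ["billing"], "uses_sendmail": True},
    "cleanup_tmp.php": {"databases": [], "uses_sendmail": False},
}

# LLM-produced analysis for every script, keyed by filename (hypothetical values).
llm_results = {
    "billing_export.php": {"databases": ["billing"], "uses_sendmail": True},
    "cleanup_tmp.php": {"databases": ["tmp_meta"], "uses_sendmail": False},  # disagrees with golden
}

def dip_sample(llm_results, golden, sample_size=25, seed=42):
    """Spot-check a random sample and report scripts where the LLM disagrees with the ground truth."""
    random.seed(seed)
    checkable = [name for name in llm_results if name in golden]
    sample = random.sample(checkable, min(sample_size, len(checkable)))
    mismatches = [name for name in sample if llm_results[name] != golden[name]]
    accuracy = 1 - len(mismatches) / len(sample)
    return accuracy, mismatches

accuracy, mismatches = dip_sample(llm_results, golden, sample_size=2)
print(f"sample accuracy: {accuracy:.0%}, needs review: {mismatches}")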

427
00:27:15,000 --> 00:27:17,559
Speaker 2: I mean that's a good point. I mean these scripts

428
00:27:17,599 --> 00:27:20,400
that you were migrating, I mean, worst case scenario is

429
00:27:20,440 --> 00:27:24,599
they just didn't run and they didn't necessarily impact user activity,

430
00:27:24,599 --> 00:27:27,400
and if they crash, you get the logs and then

431
00:27:27,440 --> 00:27:30,039
you can go investigate. So using an LLM in this

432
00:27:30,119 --> 00:27:33,880
area was inherently unrisky. It just helps speed up the

433
00:27:33,920 --> 00:27:37,240
initial analysis. But at the end of the day, it

434
00:27:37,240 --> 00:27:39,920
didn't really matter, right, you know, the whole amount of work.

435
00:27:39,920 --> 00:27:41,960
It's like, well, human could have made a mistake there too,

436
00:27:42,000 --> 00:27:44,519
and it wouldn't have had that big of an implication.

437
00:27:45,039 --> 00:27:49,759
But how are you starting to see customers utilizing AI

438
00:27:49,960 --> 00:27:55,519
in, well, like ones that should be risk

439
00:27:55,559 --> 00:27:58,519
averse, but are leaning into it more than they should?

440
00:27:58,960 --> 00:28:02,359
And how do you even evaluate that or how do

441
00:28:02,400 --> 00:28:04,839
you better avoid that? And I think you did sort

442
00:28:04,880 --> 00:28:11,200
of lean on doing sampling and verifying the outputs, but

443
00:28:11,200 --> 00:28:13,720
maybe there's something holistic because I feel like with the

444
00:28:13,759 --> 00:28:17,480
adoption of AI coming, more and more companies will do

445
00:28:17,519 --> 00:28:21,400
the wrong thing, right? Engineers will either, accidentally via negligence

446
00:28:21,519 --> 00:28:24,359
or just you know, laziness or whatever it is. You know,

447
00:28:24,359 --> 00:28:26,240
get in a state where like this is a great

448
00:28:26,279 --> 00:28:29,559
way to absolve myself of the challenge of doing all

449
00:28:29,640 --> 00:28:31,839
this work. How do we counteract that?

450
00:28:33,200 --> 00:28:40,720
Speaker 4: So I think with customers we're quite upfront. So I

451
00:28:40,720 --> 00:28:47,359
think probably different consultancies, different companies may have an

452
00:28:47,400 --> 00:28:54,920
alternative view. But speaking from, yeah, my employer's official hat,

453
00:28:57,599 --> 00:29:01,000
if a customer is trying to do something that doesn't

454
00:29:01,039 --> 00:29:05,000
make sense and we don't think there's going to be

455
00:29:05,200 --> 00:29:10,559
kind of significant business value in it, then we will

456
00:29:10,640 --> 00:29:15,279
be aware, be open about that, and make the customer

457
00:29:15,279 --> 00:29:19,680
aware that actually what they're doing with it isn't likely

458
00:29:19,759 --> 00:29:27,039
to be successful. Obviously, if there are ethical or legal

459
00:29:27,119 --> 00:29:32,039
concerns around what they're doing, then kind of we're a

460
00:29:32,400 --> 00:29:36,960
technical consultancy. We can raise concerns, make customers aware, but

461
00:29:37,640 --> 00:29:42,359
ultimately due diligence is on the customer's side.

462
00:29:43,000 --> 00:29:45,599
Speaker 2: It's coming though, right? Like I really would like to

463
00:29:45,640 --> 00:29:49,119
see some concrete accountability. I mean, we don't have anything

464
00:29:49,200 --> 00:29:51,720
quite the level of the Knight Capital disaster where they

465
00:29:51,799 --> 00:29:53,640
lost like I want to say, like four hundred and

466
00:29:53,680 --> 00:29:56,519
sixty million dollars and that didn't involve AI at all.

467
00:29:56,559 --> 00:30:00,000
That was just pure automation and legacy systems causing a mistake.

468
00:30:00,559 --> 00:30:05,079
And now we definitely have automated car companies that I

469
00:30:05,119 --> 00:30:07,079
believe there was an incident where there was a death

470
00:30:07,119 --> 00:30:09,920
in Arizona or New Mexico a lot of years ago

471
00:30:09,960 --> 00:30:13,039
and the company didn't get sued or anything like that. So,

472
00:30:13,200 --> 00:30:16,119
I mean, does seem like accountability is something that's going

473
00:30:16,160 --> 00:30:18,279
to come up more and more And I don't really

474
00:30:18,319 --> 00:30:23,640
see anyone working on adequate safeguards here. I mean there's like, oh,

475
00:30:23,839 --> 00:30:26,559
we're afraid of AI, but it's we're not really talking

476
00:30:26,599 --> 00:30:28,839
about the companies that are utilizing it, I feel like.

477
00:30:28,960 --> 00:30:31,599
Speaker 4: So, I think there's two parts.

478
00:30:31,640 --> 00:30:38,000
There's there's accountability and there's explainability as well. So, again

479
00:30:38,119 --> 00:30:43,839
thinking more about about traditional machine learning, some some algorithms

480
00:30:43,839 --> 00:30:49,039
are significantly more explainable than others. So there are a

481
00:30:49,039 --> 00:30:52,400
lot of algorithms and increasingly so as we get into

482
00:30:52,400 --> 00:30:55,200
the generative AI and large language model space

483
00:30:55,240 --> 00:30:59,400
where they are a bit of a black box. They

484
00:30:59,440 --> 00:31:05,000
have a load of data go in, and it's very hard to

485
00:31:05,039 --> 00:31:09,480
explain why a particular result has come out. And again,

486
00:31:09,519 --> 00:31:12,920
models aren't deterministic, so you can do as much testing

487
00:31:12,960 --> 00:31:15,920
as you want, but you can never be truly one

488
00:31:16,000 --> 00:31:19,599
hundred percent certain. You can be very, very confident, but

489
00:31:20,519 --> 00:31:23,519
if anybody can give a one hundred percent guarantee, then

490
00:31:24,240 --> 00:31:27,680
it probably isn't machine learning that's making the final call.

491
00:31:28,359 --> 00:31:30,359
Speaker 2: But you can ask the model if it's correct.

492
00:31:30,519 --> 00:31:37,440
Speaker 3: Right, it's right. It's obviously right. Haven't you ever helped

493
00:31:37,519 --> 00:31:39,480
the kid with their math homework before? Same thing?

494
00:31:41,279 --> 00:31:46,200
Speaker 4: There's some interesting stuff that AWS are working on.

495
00:31:46,359 --> 00:31:51,519
So for a while now they've had Bedrock Guardrails, which

496
00:31:52,079 --> 00:31:56,079
is there to try and prevent a large language model

497
00:31:56,119 --> 00:31:59,640
from responding about certain topics. But of course you have

498
00:31:59,680 --> 00:32:01,880
to give it a you have to give it a

499
00:32:01,960 --> 00:32:04,359
list to start with of topics to not talk about.

500
00:32:04,480 --> 00:32:09,400
And if you haven't thought of the topic, then yeah, again,

501
00:32:09,440 --> 00:32:12,720
you are reliant upon a human to have

502
00:32:12,759 --> 00:32:18,440
that extensive list before you can prevent the model. Another

503
00:32:18,480 --> 00:32:21,119
one that's come out very, very recently, in the

504
00:32:21,200 --> 00:32:28,559
last month or so. AWS make

505
00:32:28,759 --> 00:32:34,720
use of automated reasoning in all places across their cloud.

506
00:32:34,839 --> 00:32:38,759
So when it comes to kind of cryptographic operations and

507
00:32:39,440 --> 00:32:43,480
making sure that encryption is doing what it should be

508
00:32:43,519 --> 00:32:48,319
doing, and making sure there's mathematical proof behind some

509
00:32:48,359 --> 00:32:53,759
of these operations, that's something that yeah, today is being

510
00:32:53,880 --> 00:32:57,759
used across all of AWS, but only very recently has

511
00:32:57,799 --> 00:33:00,440
now come to Bedrock, and I think the feature is

512
00:33:00,480 --> 00:33:04,839
called Bedrock Automated Reasoning, where you can build rules up

513
00:33:04,880 --> 00:33:10,359
to say I want mathematical proof the response given is

514
00:33:10,960 --> 00:33:13,880
in accordance with these rules, which is quite cool and

515
00:33:13,880 --> 00:33:17,279
I'm yet to play with that, but it looks very promising.
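
A hedged sketch of what attaching a pre-built guardrail to a Bedrock call can look like via the Converse API, in the spirit of the Guardrails feature mentioned above. The guardrail itself (denied topics, blocked messaging) is defined separately; the guardrail ID, version, model ID, and region here are placeholders, and the field names follow recent boto3 documentation rather than anything stated on the show.

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

# Reference an existing guardrail by its identifier and version; topics to block
# are configured on the guardrail itself, not in this request.
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    messages=[{"role": "user", "content": [{"text": "Tell me about your refund policy."}]}],
    guardrailConfig={
        "guardrailIdentifier": "gr-1234567890",  # placeholder guardrail ID
        "guardrailVersion": "1",
    },
)
print(response["output"]["message"]["content"][0]["text"])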

516
00:33:19,559 --> 00:33:24,480
And AWS are generally fairly good on the research and that

517
00:33:24,720 --> 00:33:30,000
side of things. They're certainly not perfect in terms of

518
00:33:32,079 --> 00:33:34,880
some of the decisions I think around releasing services that

519
00:33:36,440 --> 00:33:42,000
are maybe like just the wrong side of MVP, which we

520
00:33:42,079 --> 00:33:45,720
have seen a few times, but that is I think

521
00:33:45,720 --> 00:33:49,519
a side effect of being customer obsessed.

522
00:33:49,119 --> 00:33:49,440
Speaker 1: In the.

523
00:33:50,880 --> 00:33:53,960
Speaker 4: Perhaps there's a customer that needs a particular set

524
00:33:54,000 --> 00:33:55,799
of features and the easiest way is to build a

525
00:33:55,839 --> 00:34:00,279
service for it. But maybe it's not suitable for every

526
00:34:00,359 --> 00:34:04,160
use case and every customer just yet. But yeah, I

527
00:34:04,160 --> 00:34:08,920
think there's some really interesting stuff going on around automated

528
00:34:08,920 --> 00:34:16,239
reasoning. For AWS's first-party models, so the Amazon

529
00:34:16,599 --> 00:34:20,679
Nova family of models, they've got the AI service cards,

530
00:34:20,719 --> 00:34:23,360
which go into quite a lot of detail about how

531
00:34:23,400 --> 00:34:27,679
the models are built and are fairly transparent around the

532
00:34:27,719 --> 00:34:34,079
way they work. I think that's something that end

533
00:34:34,199 --> 00:34:37,159
users of models need to be more

534
00:34:37,199 --> 00:34:41,199
aware of. I think at the moment, it's very very

535
00:34:41,239 --> 00:34:44,280
easy to log on to, kind of, chat.com, which

536
00:34:44,320 --> 00:34:48,679
I think is the domain that OpenAI spent quite

537
00:34:48,679 --> 00:34:54,039
a lot of money on and yeah, type something, But

538
00:34:54,119 --> 00:34:57,119
you don't know where that data has come from to

539
00:34:57,480 --> 00:35:01,519
build that response. You don't know the reasoning behind that

540
00:35:01,599 --> 00:35:06,039
response being given. I think as as these systems get

541
00:35:06,079 --> 00:35:10,519
more and more embedded in people's processes and workflows in business,

542
00:35:10,559 --> 00:35:16,679
it's needing to understand the why, and to ask the

543
00:35:16,760 --> 00:35:21,360
questions, and have the ability

544
00:35:21,400 --> 00:35:26,639
to give a 'this has happened because the model

545
00:35:26,719 --> 00:35:27,199
is doing this.'

546
00:35:27,880 --> 00:35:28,440
Speaker 1: I don't know.

547
00:35:29,199 --> 00:35:33,039
Speaker 2: I worry that if we're relying on education to make

548
00:35:33,119 --> 00:35:37,199
people be able to utilize our AI and technology better,

549
00:35:37,960 --> 00:35:40,039
I feel like we're not going in the right direction.

550
00:35:40,239 --> 00:35:45,159
Like I say this because very frequently in the security domain,

551
00:35:46,159 --> 00:35:48,719
we have the same sort of mantra, where you are

552
00:35:48,840 --> 00:35:55,159
going to stop security events, prevent attackers, remove vulnerabilities through

553
00:35:55,840 --> 00:35:58,320
straight education. I mean, there's only so much that you

554
00:35:58,320 --> 00:36:02,239
can achieve there. And if your your security strategy is

555
00:36:02,639 --> 00:36:05,679
educate the users, I mean you might as well have

556
00:36:05,719 --> 00:36:08,480
given up already if that's your... And I worry if

557
00:36:08,519 --> 00:36:10,800
that's the direction that we're going. Where. Oh, in order

558
00:36:10,800 --> 00:36:13,039
to use the models effectively, in order to understand what's

559
00:36:13,079 --> 00:36:15,400
going on, you have to be an expert still, which

560
00:36:15,440 --> 00:36:17,280
I feel like has been the case for a lot

561
00:36:17,320 --> 00:36:21,079
of years now, going back twenty thirty years in our

562
00:36:21,360 --> 00:36:24,199
ML development, it's always sort of been the case. And

563
00:36:24,559 --> 00:36:27,000
until we can break through that, I feel like, you know,

564
00:36:27,079 --> 00:36:30,920
real adoption really isn't going to amount to good things.

565
00:36:30,920 --> 00:36:33,280
And we're really close to utilizing it in more and

566
00:36:33,360 --> 00:36:37,119
more concerning situations where security is involved, or human safety

567
00:36:37,159 --> 00:36:41,119
is involved, or whatever Jillian is doing with automatic creation

568
00:36:41,360 --> 00:36:44,559
of you know, protein folding. You know, I honestly can't remember,

569
00:36:45,280 --> 00:36:47,280
but I'm curious that.

570
00:36:47,480 --> 00:36:51,639
Speaker 3: The human in the loop like that. If we're doom spiraling,

571
00:36:51,840 --> 00:36:56,880
it's the greed. Oh man, why is it echoing? What happened?

572
00:36:57,159 --> 00:37:01,039
What did I do? Okay, carry on for a few minutes

573
00:37:01,079 --> 00:37:03,239
here while I figure out what happened to my camera.

574
00:37:03,920 --> 00:37:09,039
Speaker 2: Well, what Jillian's saying is she is worried that she's, but.

575
00:37:09,039 --> 00:37:11,079
Speaker 3: I'm the greed. I could very well be the greed

576
00:37:11,199 --> 00:37:12,679
in this situation. It could happen.

577
00:37:16,079 --> 00:37:25,519
Speaker 4: I think there's there's some really interesting and it's some

578
00:37:25,519 --> 00:37:30,000
really interesting ethical bits as well around it. So the

579
00:37:30,039 --> 00:37:32,400
topic of kind of self driving cars has come up

580
00:37:32,400 --> 00:37:36,679
a few times as we've been talking, and the things

581
00:37:36,679 --> 00:37:41,320
that as a as a human driving a car, you

582
00:37:41,400 --> 00:37:45,559
might instinctively make certain decisions. So if you've got a

583
00:37:45,599 --> 00:37:50,599
self-driving car and there is a, a certain type of

584
00:37:50,599 --> 00:37:58,000
a crash where one scenario means five people, one scenario

585
00:37:58,280 --> 00:38:02,679
means two people, but they are young people, then.

586
00:38:02,519 --> 00:38:02,920
Speaker 5: It's like.

587
00:38:04,400 --> 00:38:07,280
Speaker 4: What does the car choose? But at that point there's

588
00:38:07,320 --> 00:38:12,719
no there's no human emotion, there's no nothing in it.

589
00:38:12,519 --> 00:38:16,119
It's a someone has to program a model to say

590
00:38:16,360 --> 00:38:20,119
this life is worth more than this life and how

591
00:38:20,119 --> 00:38:20,599
do you do that?

592
00:38:20,679 --> 00:38:24,480
Speaker 2: Right, So there was actually an interesting and so this

593
00:38:24,559 --> 00:38:27,280
is the trolley problem, right, and it's sort of the

594
00:38:27,280 --> 00:38:30,400
moral dilemma more than an ethical one. It was actually released,

595
00:38:30,440 --> 00:38:32,360
I think it was a quick study by Stanford where

596
00:38:32,360 --> 00:38:37,480
it like gauged a random sampling of people on which

597
00:38:37,480 --> 00:38:40,519
they would pick like where where should the car actually

598
00:38:40,559 --> 00:38:42,599
go to? And I think there was like at the

599
00:38:42,679 --> 00:38:44,960
end of the study there was like a clear hierarchy

600
00:38:45,519 --> 00:38:48,519
of what humans actually prefer. And I feel like it

601
00:38:48,559 --> 00:38:52,119
was something like you should kill cats first, and then

602
00:38:53,199 --> 00:38:56,719
old men, and then old women and then and then

603
00:38:56,840 --> 00:38:58,719
dogs or something like that. And I think it had

604
00:38:58,719 --> 00:38:59,960
to do with the fact that cats will just get

605
00:39:00,079 --> 00:39:01,880
out of the way, Like that was the expectation that

606
00:39:02,400 --> 00:39:04,199
these people will either get out of the way or

607
00:39:04,440 --> 00:39:07,119
you know, in worst case scenario, this is what they

608
00:39:07,159 --> 00:39:09,519
would pay. And it was really interesting that that they

609
00:39:09,519 --> 00:39:12,679
had done this, and there was of course some you know,

610
00:39:12,960 --> 00:39:15,480
not nice things said about the fact they had gone

611
00:39:15,559 --> 00:39:17,719
through with the study, But you know, I think humans

612
00:39:17,760 --> 00:39:21,800
will sort of adapt to preferential picks for what they

613
00:39:21,840 --> 00:39:22,480
are okay with.

614
00:39:24,480 --> 00:39:25,920
Speaker 5: The thing is.

615
00:39:27,199 --> 00:39:34,880
Speaker 4: Industry-wise, is it going to be optimal? I often see

616
00:39:35,079 --> 00:39:40,440
people saying it's going to have as much of an impact

617
00:39:41,079 --> 00:39:50,280
as cloud, and it is having that much of an impact.

618
00:39:50,360 --> 00:39:58,559
But with cloud, there weren't really that many drawbacks

619
00:39:58,599 --> 00:40:01,840
that I can think of. There were, there were concerns,

620
00:40:01,920 --> 00:40:07,039
there was uncertainty around who owns my data, who

621
00:40:07,039 --> 00:40:09,320
controls my data? Is my data secure if it's in

622
00:40:10,000 --> 00:40:13,639
Amazon's data center versus a server in my own cupboard.

623
00:40:15,440 --> 00:40:21,280
But with generative AI, there's there's almost that kind of

624
00:40:21,519 --> 00:40:24,159
rabbit in the headlights approach that a lot of people

625
00:40:24,199 --> 00:40:28,719
are taking at the moment where I think there is

626
00:40:28,760 --> 00:40:32,239
a real, a real danger without as we talked about,

627
00:40:32,239 --> 00:40:38,239
without the control, without either education or enforced guardrails

628
00:40:40,400 --> 00:40:43,920
to be able to use this effectively. I mean I

629
00:40:43,920 --> 00:40:47,880
remember seeing an article, I think it was the tail

630
00:40:47,960 --> 00:40:52,400
end of last year, and it was being talked about

631
00:40:52,480 --> 00:40:59,039
in the US, and there were talks of developers of models

632
00:40:59,440 --> 00:41:05,159
being legally required to report on things like whether their

633
00:41:05,159 --> 00:41:09,519
models could be used for purposes that would have a

634
00:41:09,599 --> 00:41:16,920
national security implication, which I think is absolutely absolutely right

635
00:41:17,039 --> 00:41:24,679
from a again from a moral and security perspective. But there's

636
00:41:24,719 --> 00:41:29,840
always that kind of fine balance of any emerging technology,

637
00:41:31,360 --> 00:41:36,639
how far do you regulate it? If you regulate it

638
00:41:36,679 --> 00:41:42,440
too much, does it stifle or prevent innovation? But then

639
00:41:42,440 --> 00:41:45,199
the flip side is if something terrible happens because somebody

640
00:41:45,239 --> 00:41:49,639
has used a large language model to teach themselves how

641
00:41:49,679 --> 00:41:54,119
to carry out an attack. Then the argument is where

642
00:41:54,119 --> 00:41:55,280
there wasn't enough regulation.

643
00:41:59,199 --> 00:42:01,360
Speaker 1: I think, going back to a self driving car example,

644
00:42:01,440 --> 00:42:05,039
the solution there is just go into the settings of

645
00:42:05,119 --> 00:42:07,760
the car and you get a little order of preference,

646
00:42:07,920 --> 00:42:09,800
like I'm okay with hitting this, I'm not okay with

647
00:42:09,880 --> 00:42:12,480
hitting this, and then problem solved.

648
00:42:12,559 --> 00:42:18,159
Speaker 5: Right, it's so tough.

649
00:42:18,239 --> 00:42:21,920
Speaker 3: I mean, we joke, but like this is what we

650
00:42:22,000 --> 00:42:25,239
unfortunately have to do as humans sometimes. Like that's how

651
00:42:25,280 --> 00:42:28,239
all of healthcare works, right, Like when there's a shortage

652
00:42:28,280 --> 00:42:30,760
like during COVID there was the shortage of the ventilators.

653
00:42:31,119 --> 00:42:33,519
You think the doctors didn't have to like make decisions

654
00:42:33,559 --> 00:42:37,079
Like it's very unpleasant and nobody wants to think about

655
00:42:37,079 --> 00:42:40,079
them or talk about them. But the reality is we

656
00:42:40,119 --> 00:42:43,400
do anyways, and it has to happen. And I'd imagine

657
00:42:43,400 --> 00:42:45,679
that it also has to be a part of like

658
00:42:45,719 --> 00:42:50,159
self driving cars, because cars are terrible, terrible death machines.

659
00:42:50,480 --> 00:42:52,760
I hate driving. Have I mentioned on the show yet how

660
00:42:52,840 --> 00:42:54,480
much I hate driving and having to have a car.

661
00:42:56,199 --> 00:42:59,960
Speaker 2: I think it's because we're in this intermediary state,

662
00:43:01,079 --> 00:43:04,079
because once we get to the point where

663
00:43:04,519 --> 00:43:08,239
there's automation all around us, and we have adapted to

664
00:43:08,320 --> 00:43:11,119
that fact. It's no longer as big of a problem

665
00:43:11,119 --> 00:43:13,360
because it's like, whose fault is it if you step in

666
00:43:13,360 --> 00:43:15,599
front of a train. It's like, oh, well, the train

667
00:43:15,719 --> 00:43:18,039
was supposed to stop. It was you know, supposed to know.

668
00:43:18,480 --> 00:43:20,639
And some of them do have safety protections in place,

669
00:43:20,840 --> 00:43:23,000
but realistically, you don't go on the tracks when the

670
00:43:23,000 --> 00:43:26,360
train is coming unless you have some reason you really

671
00:43:26,400 --> 00:43:29,079
want to be there. And I think realistically the same

672
00:43:29,119 --> 00:43:31,840
thing will happen if we're in the automated car space

673
00:43:31,880 --> 00:43:35,760
where the autonomous cars are driving around, and realistically,

674
00:43:35,960 --> 00:43:38,239
you know, don't step into the street. I mean, why

675
00:43:38,239 --> 00:43:40,400
would you go there? And I think with the

676
00:43:40,440 --> 00:43:46,119
AI cars, we will want to really redesign egress

677
00:43:46,199 --> 00:43:48,400
and flow for traffic and we will be able to

678
00:43:48,440 --> 00:43:51,000
do that effectively once everything has been automated.

679
00:43:53,440 --> 00:44:00,679
Speaker 4: I think there's a really interesting change

680
00:44:00,719 --> 00:44:05,719
coming in terms of I guess similar to what we

681
00:44:05,719 --> 00:44:08,599
would have seen with kind of

682
00:44:09,199 --> 00:44:15,199
processes being automated. Anywhere they are going

683
00:44:15,280 --> 00:44:21,719
to have a step change improvement in efficiency,

684
00:44:21,760 --> 00:44:25,159
is there going to be a resistance from within

685
00:44:25,199 --> 00:44:29,840
businesses to work on this generative AI, because they think it's

686
00:44:29,880 --> 00:44:30,920
going to put themselves.

687
00:44:30,639 --> 00:44:31,159
Speaker 2: Out of the job.

688
00:44:33,559 --> 00:44:37,239
Speaker 5: I think it's

689
00:44:37,039 --> 00:44:42,880
Speaker 4: A really challenging space to try and tread

690
00:44:42,880 --> 00:44:50,800
the line of improving efficiency, making redundancies. I think going

691
00:44:50,880 --> 00:44:55,599
back many, many years and you think about things like

692
00:44:55,639 --> 00:45:02,360
the Industrial Revolution, and inevitably change means that some

693
00:45:02,559 --> 00:45:07,719
jobs are no longer needed. But those people, they

694
00:45:07,880 --> 00:45:14,719
find different roles. And if this kind of continues at

695
00:45:14,719 --> 00:45:19,400
the pace that it's going and disrupts industry at the

696
00:45:19,440 --> 00:45:24,840
pace it's predicted to, then people need to kind of

697
00:45:24,920 --> 00:45:29,119
change with it, because, yeah, very quickly you're

698
00:45:29,159 --> 00:45:34,599
already seeing job adverts that list experience in AI as essential.

699
00:45:35,360 --> 00:45:39,679
It's no longer a bonus, it's that you must

700
00:45:39,840 --> 00:45:43,559
be able to work with copilot tools and effectively know

701
00:45:43,639 --> 00:45:47,400
how to use AI and do DevOps or migrations or

702
00:45:47,440 --> 00:45:53,000
anything like that, because companies are expecting it, it's expected.

703
00:45:53,559 --> 00:45:57,159
Speaker 1: It's a really interesting change to see how the job

704
00:45:57,239 --> 00:45:59,480
landscape has changed over sort of the last twelve months

705
00:45:59,519 --> 00:46:01,159
with the introduction of AI. You know, it felt

706
00:46:01,199 --> 00:46:04,400
like twelve months ago. Using Copilot and tools like that

707
00:46:04,599 --> 00:46:07,719
was seen as cheating, but now it's just seen as

708
00:46:07,760 --> 00:46:10,440
like a part of the job, and I'm interested to

709
00:46:10,519 --> 00:46:16,599
hear how it's being, like, how are people

710
00:46:16,840 --> 00:46:20,559
testing or qualifying your skills for that in the

711
00:46:20,599 --> 00:46:21,760
employment space.

712
00:46:22,519 --> 00:46:30,159
Speaker 5: I think I'm okay with the use of tools.

713
00:46:31,639 --> 00:46:37,159
Speaker 4: I want people to use them, but it

714
00:46:38,280 --> 00:46:39,159
comes with challenges.

715
00:46:40,679 --> 00:46:45,440
Speaker 5: If something that's missing

716
00:46:46,960 --> 00:46:52,639
Speaker 4: is that foundational understanding, then I don't think AI can come

717
00:46:52,719 --> 00:46:59,599
into it, because if you're using generated code but don't understand it,

718
00:47:01,880 --> 00:47:06,920
you might be building solutions that

719
00:47:06,960 --> 00:47:11,039
are going to work, but there's also a pretty good chance

720
00:47:11,119 --> 00:47:14,920
that they're not, or you've left a security hole in it,

721
00:47:15,039 --> 00:47:18,639
or it's going to cost ten times as much because

722
00:47:18,639 --> 00:47:23,039
there's one best practice that you've not realized

723
00:47:23,079 --> 00:47:30,519
because these models were trained on open code. So from

724
00:47:30,559 --> 00:47:34,400
my perspective, I mean, I've tried a few

725
00:47:34,440 --> 00:47:39,239
different copilot tools. GitHub Copilot came

726
00:47:40,000 --> 00:47:45,239
kind of fairly early doors, and I've given Amazon Q a go

727
00:47:45,320 --> 00:47:49,880
as well. At the moment, I'm trying out the Windsurf

728
00:47:50,679 --> 00:47:53,800
editor and Cursor as a sort of more integrated

729
00:47:53,920 --> 00:48:01,519
IDE experience, and both of them are fairly good. There's

730
00:48:01,559 --> 00:48:04,920
some really cool stuff being able to like if you

731
00:48:04,920 --> 00:48:08,119
want to just hack about on a project. I do

732
00:48:08,199 --> 00:48:11,320
quite a lot with the Streamlit Python package, which

733
00:48:11,360 --> 00:48:14,760
is a great way to build data and AI

734
00:48:15,400 --> 00:48:20,280
apps with an acceptable user interface without having to know

735
00:48:20,360 --> 00:48:25,199
how to write good front end code and being able

736
00:48:25,239 --> 00:48:28,840
to just say, like, create a project using Streamlit.

737
00:48:29,800 --> 00:48:32,760
It needs to be able to interact with Amazon Bedrock

738
00:48:33,159 --> 00:48:35,679
like stub out the methods for me and get something

739
00:48:35,800 --> 00:48:41,079
very quickly. It is good for that. I think the

740
00:48:41,119 --> 00:48:44,079
only way that these tools will excel will really be

741
00:48:46,119 --> 00:48:50,000
with true understanding and context of what would be

742
00:48:50,000 --> 00:48:53,960
in a human brain. It's getting into the flow

743
00:48:54,000 --> 00:48:57,800
state of I understand this whole repository, I see how

744
00:48:57,840 --> 00:49:03,199
different pieces kind of connect together. But also if you've

745
00:49:03,199 --> 00:49:07,960
got micro service architectures and actually the context you need

746
00:49:08,000 --> 00:49:11,360
is in a different repository, then it also kind of

747
00:49:11,400 --> 00:49:16,000
needs that as well. So of course there

748
00:49:16,000 --> 00:49:16,840
are limitations.

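[Editor's note: a minimal sketch of the kind of Streamlit-plus-Bedrock stub described a few lines above. The region, model ID, and widget layout are assumptions for illustration, not anything specified in the conversation.]

```python
# A sketch, not a complete app: requires `pip install streamlit boto3`,
# AWS credentials, and Bedrock model access. Region and model ID are assumed.
import boto3
import streamlit as st

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # placeholder model ID

def ask_bedrock(prompt: str) -> str:
    """Send a single-turn prompt to Bedrock and return the text reply."""
    response = bedrock.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]

st.title("Bedrock demo")
prompt = st.text_area("Ask something")
if st.button("Send") and prompt:
    st.write(ask_bedrock(prompt))
```

Run it with `streamlit run app.py` once the dependencies and credentials are in place.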
749
00:49:16,880 --> 00:49:17,360
Speaker 5: There are.

750
00:49:19,840 --> 00:49:24,400
Speaker 4: There's only so far these things will go. But

751
00:49:25,760 --> 00:49:29,239
I think, from my stance, if somebody kind of turned

752
00:49:29,320 --> 00:49:32,800
up at an interview and they wanted to use an

753
00:49:32,800 --> 00:49:37,079
AI tool as part of a tech exercise, for example,

754
00:49:37,239 --> 00:49:41,239
then it wouldn't necessarily be an issue. It would be hypocritical

755
00:49:41,280 --> 00:49:44,960
of me, right, to say you can't do that. But

756
00:49:45,800 --> 00:49:52,039
I think I'd certainly be more aware

757
00:49:52,079 --> 00:49:55,960
and more yeah, be a bit more careful about asking

758
00:49:56,000 --> 00:50:00,000
the right questions about the code that has been generated, because.

759
00:50:00,000 --> 00:50:02,239
Speaker 2: I think that's sort of like one of the really

760
00:50:02,280 --> 00:50:05,800
important parts here is that most companies don't spend enough

761
00:50:05,840 --> 00:50:09,360
time evaluating their interview process for what the right questions are,

762
00:50:09,719 --> 00:50:12,360
and now they're starting to realize that AI is getting

763
00:50:12,360 --> 00:50:15,880
in the way of them quote unquote effectively evaluating the candidates.

764
00:50:16,039 --> 00:50:17,480
And I think that really goes to the fact that

765
00:50:17,519 --> 00:50:20,559
the question didn't make sense and their evaluation strategy didn't

766
00:50:20,599 --> 00:50:23,199
make sense, and that there are tools that can easily

767
00:50:23,199 --> 00:50:24,840
solve that, and if they can solve it during an

768
00:50:24,920 --> 00:50:29,079
interview or during a take-home interview test, then they

769
00:50:29,079 --> 00:50:32,400
could potentially or likely be using that tool during their job.

770
00:50:32,760 --> 00:50:34,880
And I think we're already starting to see some companies

771
00:50:35,320 --> 00:50:41,400
intentionally telling candidates to use AI LLMs specifically to solve

772
00:50:41,440 --> 00:50:44,400
the problem because it is something that other engineers on

773
00:50:44,440 --> 00:50:48,199
their team are utilizing and that they would expect someone

774
00:50:48,239 --> 00:50:50,239
who comes into the team to also understand how

775
00:50:50,239 --> 00:50:55,360
to utilize those tools, because things like configuration or linting, etc.

776
00:50:56,400 --> 00:51:01,719
will be changed fundamentally by AI. And in that way,

777
00:51:01,960 --> 00:51:03,960
if someone comes into your team that you're hiring

778
00:51:04,079 --> 00:51:09,440
into and doesn't have experience utilizing LLMs to help them

779
00:51:09,480 --> 00:51:13,840
sufficiently or handling their weaknesses whatever they are, then they're

780
00:51:13,840 --> 00:51:16,400
going to not be as effective a team member when

781
00:51:16,400 --> 00:51:20,280
the rest of the team is expecting someone who is

782
00:51:20,320 --> 00:51:21,000
able to do that.

783
00:51:21,800 --> 00:51:30,159
Speaker 4: I mean, absolutely, and I think with how much

784
00:51:30,360 --> 00:51:36,119
of the industry and how much of businesses it genuinely

785
00:51:36,119 --> 00:51:42,519
has the potential and promise to reach, there are almost prerequisites

786
00:51:42,639 --> 00:51:47,119
that a lot of people aren't really thinking about

787
00:51:47,480 --> 00:51:50,199
at the moment, because they're going to get

788
00:51:50,199 --> 00:51:53,320
blindsided by the AI is great. AI is going to

789
00:51:53,360 --> 00:51:57,639
solve all the problems, when in reality there

790
00:51:57,679 --> 00:52:01,880
are certain things that are foundational to successful use of AI,

791
00:52:02,280 --> 00:52:06,639
like okay, AI is software. How do you monitor your software?

792
00:52:06,960 --> 00:52:10,079
How do you make sure the responses from your AI

793
00:52:10,159 --> 00:52:15,760
powered applications are what you expect them to be? If you

794
00:52:15,800 --> 00:52:20,599
were deploying an API, you'd spend time thinking about, okay,

795
00:52:20,639 --> 00:52:24,519
what are the useful metrics. Is CPU a useful metric?

796
00:52:24,679 --> 00:52:30,239
Or is latency a useful metric? What are the things

797
00:52:30,280 --> 00:52:32,280
that actually have an impact on the end user of

798
00:52:32,320 --> 00:52:36,639
this? And AI should be no different. You should have

799
00:52:37,280 --> 00:52:39,920
that kind of production wrapper, you should have monitoring, you

800
00:52:39,960 --> 00:52:47,760
should have security concerns that you're proactively protecting against. But

801
00:52:47,800 --> 00:52:51,719
then also it's that foundation of data.

802
00:52:51,760 --> 00:52:56,119
If you're an organization that has lots of data

803
00:52:56,159 --> 00:52:58,719
in lots of different places and you want to use

804
00:52:58,760 --> 00:53:03,679
it in AI, you need to have a good data platform.

805
00:53:04,559 --> 00:53:08,960
And you have conversations with people around productionizing some of

806
00:53:08,960 --> 00:53:13,719
the AI kind of proof of concepts or very early

807
00:53:13,760 --> 00:53:17,239
stage experiments they've done, and you will say, okay, well

808
00:53:17,280 --> 00:53:20,840
you've managed to get these little samples of data

809
00:53:20,880 --> 00:53:24,000
from various data stores to prove this as a

810
00:53:24,000 --> 00:53:27,599
concept could work and is worth putting into

811
00:53:27,639 --> 00:53:29,000
production at a wider scale.

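[Editor's note: a small illustration of the production wrapper idea above, treating the model call like any other piece of software and recording the signals and checks that matter to the end user. The function names, thresholds, and stubbed call are placeholders, not anything the speakers specified.]

```python
# Illustrative only: call_model is a stand-in for whatever client you use.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("llm")

def call_model(prompt: str) -> str:
    return "stubbed response"  # placeholder for a real model call

def monitored_call(prompt: str) -> str:
    start = time.perf_counter()
    reply = call_model(prompt)
    latency_ms = (time.perf_counter() - start) * 1000

    # Log the signals that matter to the end user, not just host-level CPU.
    log.info("latency_ms=%.1f prompt_chars=%d reply_chars=%d",
             latency_ms, len(prompt), len(reply))

    # Cheap guardrail-style checks before the response reaches anyone.
    if not reply.strip():
        raise ValueError("empty model response")
    if len(reply) > 10_000:
        log.warning("unusually long response; possible runaway generation")
    return reply

print(monitored_call("Summarize last night's deploy"))
```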
812
00:53:29,679 --> 00:53:33,239
Speaker 2: So one of the problems is that with the pricing

813
00:53:33,320 --> 00:53:35,599
with a lot of the models today, if you're not

814
00:53:35,679 --> 00:53:38,719
running it yourself, you're pretty much paying for input and

815
00:53:38,760 --> 00:53:42,840
output tokens the amount of context you're adding, which you

816
00:53:43,000 --> 00:53:46,280
need a very high context to get an adequate answer,

817
00:53:46,679 --> 00:53:50,239
and then for whatever reason, these companies are charging you

818
00:53:50,320 --> 00:53:54,159
for garbage nonsense coming out of the models. Their decoding

819
00:53:54,199 --> 00:53:57,480
process to get back a readable answer has a lot

820
00:53:57,480 --> 00:53:59,480
of nonsense in it, and then you're paying for that.

821
00:53:59,559 --> 00:54:02,800
So I think the industry is being driven towards trying

822
00:54:02,840 --> 00:54:05,159
to optimize for these two things, which have nothing to

823
00:54:05,199 --> 00:54:07,559
do with the quality of the answer in the first place.

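[Editor's note: a toy calculation of the input/output token pricing described above. The per-1K-token prices are invented placeholders; real providers publish their own rates.]

```python
# Prices are invented placeholders to show the shape of the math only.
PRICE_PER_1K_INPUT = 0.003   # USD per 1,000 input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.015  # USD per 1,000 output tokens (assumed)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one call when you pay for context in and tokens out."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# A large pasted-in context dominates the bill even when the answer is short.
print(round(request_cost(input_tokens=12_000, output_tokens=400), 4))  # 0.042
```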
824
00:54:07,960 --> 00:54:11,239
And I know we'll ask the question about, you know,

825
00:54:11,280 --> 00:54:13,920
bringing AI into the interview process, and I feel like,

826
00:54:14,000 --> 00:54:17,119
you know, Will, you're now in a great position for

827
00:54:17,400 --> 00:54:19,840
me to ask you, do you feel like your interview

828
00:54:19,840 --> 00:54:25,880
process has been changing to respond to the increased usage

829
00:54:25,920 --> 00:54:29,960
both in the workplace as well as candidates using potentially

830
00:54:30,079 --> 00:54:32,960
using AI during the interview process itself.

831
00:54:34,119 --> 00:54:38,400
Speaker 1: For me, no, because my interview process is probably a

832
00:54:39,480 --> 00:54:44,199
lot more old school than most people. If I'm interviewing

833
00:54:44,239 --> 00:54:47,159
a candidate, we're going to have a straight up bullshit

834
00:54:47,199 --> 00:54:52,559
session because I work largely around infrastructure and just through

835
00:54:52,960 --> 00:54:57,400
a casual conversation, I feel like I can get a

836
00:54:57,440 --> 00:55:00,800
lot better feel of whether you know what you're talking about,

837
00:55:01,199 --> 00:55:04,760
or whether you have heard the terms but don't really

838
00:55:04,880 --> 00:55:08,119
understand what they mean. So a very small amount

839
00:55:08,159 --> 00:55:11,880
of my interview process has anything to do with like

840
00:55:12,079 --> 00:55:15,480
hands on the keyboard, technical coding.

841
00:55:17,239 --> 00:55:19,159
Speaker 2: Do you have some part of it that's like any

842
00:55:19,199 --> 00:55:23,960
sort of technical validation or technical systems design or anything

843
00:55:24,000 --> 00:55:32,679
like that which could be impacted by AI at all?

844
00:55:32,920 --> 00:55:37,679
Speaker 1: Potentially, Yeah, because we'll do like an exercise of you know,

845
00:55:41,199 --> 00:55:43,599
throw together a couple of micro services and explain to

846
00:55:43,639 --> 00:55:46,760
me the interaction between them. But then I'll spend a

847
00:55:46,800 --> 00:55:50,199
lot of time just talking about that, like, well, how

848
00:55:50,199 --> 00:55:52,039
does this part work, how does that part work? Tell

849
00:55:52,079 --> 00:55:56,519
me what happens if this does that? And I dig

850
00:55:56,559 --> 00:55:59,519
into a lot more of the operational stuff. I think,

851
00:55:59,599 --> 00:56:03,679
and if a candidate could pregame all of that and

852
00:56:03,800 --> 00:56:08,440
use AI, good for them. I just think it's unlikely

853
00:56:08,760 --> 00:56:13,000
given the dynamic nature of my interview process, So.

854
00:56:13,000 --> 00:56:15,079
Speaker 2: I don't want to spoil it. But there is a

855
00:56:15,119 --> 00:56:18,280
product out there where in a remote interview, the candidate

856
00:56:18,360 --> 00:56:20,800
will run it and it will listen to the audio

857
00:56:20,960 --> 00:56:24,719
that's coming across and watch the chat and then dynamically

858
00:56:24,760 --> 00:56:29,559
generate text response for the candidate to answer on the call. Now,

859
00:56:29,800 --> 00:56:32,599
I do think that there is a challenge here of

860
00:56:32,639 --> 00:56:36,079
being able to adequately understand what words are important in

861
00:56:36,119 --> 00:56:39,159
a sentence. If you have a thought and you're sharing

862
00:56:39,199 --> 00:56:41,639
that thought, you know what the point of that thought

863
00:56:41,719 --> 00:56:44,079
is and which nouns are more important than others. But

864
00:56:44,079 --> 00:56:46,960
if you're reading a response from something else, you might

865
00:56:47,000 --> 00:56:49,519
as well say it all monotone because there isn't

866
00:56:49,519 --> 00:56:52,159
any part of it that makes sense upfront to you.

867
00:56:52,159 --> 00:56:54,320
You almost need to read it first and then answer back.

868
00:56:54,519 --> 00:56:57,760
But that exists. So unless you're bringing people into the office,

869
00:56:57,800 --> 00:57:01,400
and obviously we want to optimize for more remote working environments.

870
00:57:01,440 --> 00:57:04,400
You know, our company is one hundred percent remote.

871
00:57:04,559 --> 00:57:06,679
I know a part of yours is, Will. I don't

872
00:57:06,679 --> 00:57:08,360
want to swear to that, but you do have different.

873
00:57:09,119 --> 00:57:10,639
Speaker 1: Yeah, we're one hundred percent remote as well.

874
00:57:11,159 --> 00:57:14,199
Speaker 2: Yeah, So I mean there's only so much you can

875
00:57:14,199 --> 00:57:16,119
do there. I mean, you're not gonna meet in person

876
00:57:16,280 --> 00:57:19,480
every single candidate at a coffee shop or something and

877
00:57:19,519 --> 00:57:22,599
go through uh sort of validation that they're not doing that.

878
00:57:22,639 --> 00:57:25,719
I mean, you have to use your other skills to

879
00:57:25,960 --> 00:57:29,719
sort of figure out whether or not, you know,

880
00:57:29,840 --> 00:57:31,000
they believe what they're saying.

881
00:57:31,840 --> 00:57:33,800
Speaker 1: Yeah, and I think, in that scenario,

882
00:57:33,840 --> 00:57:38,079
I just rely on like our sixty day window,

883
00:57:38,199 --> 00:57:40,480
like every candidate we bring on has like a sixty

884
00:57:40,559 --> 00:57:45,559
day trial period and if if the expectations didn't line up,

885
00:57:45,599 --> 00:57:47,920
you know, we have sixty days to resolve that. And

886
00:57:47,920 --> 00:57:49,719
if not, cut ties.

887
00:57:50,320 --> 00:57:53,679
Speaker 2: At least for us, we're really transparent about that. Like,

888
00:57:53,880 --> 00:57:57,159
if you're cheating during the interview process, that hurts you

889
00:57:57,280 --> 00:57:58,840
because we're just going to fire you a couple of

890
00:57:59,079 --> 00:58:02,119
months later. Is that what you want? I mean,

891
00:58:02,679 --> 00:58:05,000
you're risking it by coming and joining us, just like

892
00:58:05,000 --> 00:58:07,719
we're risking it on you, and both people have

893
00:58:07,800 --> 00:58:11,039
the capability of ending that relationship. So you know, if

894
00:58:11,039 --> 00:58:13,960
you managed to get through our interview process faking every

895
00:58:13,960 --> 00:58:15,800
step of the way, and then you also managed to

896
00:58:15,840 --> 00:58:18,360
fake the next couple of years successfully, I mean I

897
00:58:18,360 --> 00:58:19,800
actually think that was a pretty good hire.

898
00:58:21,039 --> 00:58:24,559
Speaker 1: Yeah, agreed. Like, if you're faking it because this

899
00:58:24,679 --> 00:58:27,480
is really where you want to be and I pick

900
00:58:27,559 --> 00:58:30,519
up on that, I'll do everything I can to help

901
00:58:30,559 --> 00:58:32,679
you get there because that's how I got here. I

902
00:58:32,800 --> 00:58:34,559
lied through my teeth on job interviews.

903
00:58:34,679 --> 00:58:36,840
Speaker 3: Oh, that's how I got here too. I was just like, hey,

904
00:58:36,920 --> 00:58:39,360
I had a baby, that baby needed some food and

905
00:58:39,440 --> 00:58:41,599
I was like, all right, I don't know what we're

906
00:58:41,639 --> 00:58:44,079
talking about, but I have Google. Let's go figure this out.

907
00:58:44,480 --> 00:58:47,079
Speaker 1: Yeah, I mean I read all five hundred pages of

908
00:58:47,119 --> 00:58:50,320
the Microsoft SQL Server six book because that was the

909
00:58:50,440 --> 00:58:51,239
job I wanted.

910
00:58:52,800 --> 00:58:55,280
Speaker 4: This is where we can almost flip it on its

911
00:58:55,400 --> 00:59:01,119
head a little bit, because AI post-interview

912
00:59:01,920 --> 00:59:07,960
can then be advantageous, because if your interview process is cultural

913
00:59:08,159 --> 00:59:15,000
and is assessing primarily a person's fit to a business

914
00:59:15,119 --> 00:59:18,840
and their ability, how they learn, how they interact

915
00:59:18,880 --> 00:59:22,159
with other people, then if you can make a great hire

916
00:59:22,440 --> 00:59:25,840
who has the ability to very quickly learn, land on

917
00:59:25,920 --> 00:59:30,559
their feet, is a great team player, then AI is

918
00:59:30,599 --> 00:59:33,840
a great tool to be able to assist in their

919
00:59:33,960 --> 00:59:37,840
upskilling or things they don't know technically once they have

920
00:59:37,880 --> 00:59:41,199
got in the door. So there's two sides to

921
00:59:41,239 --> 00:59:42,199
it, I suppose.

922
00:59:42,239 --> 00:59:44,199
Speaker 1: Ah, that's a really cool idea. I hadn't thought about

923
00:59:44,239 --> 00:59:48,039
that before. So instead of yeah, so you just kind

924
00:59:48,079 --> 00:59:51,199
of flip it around and say, hey, AI, here's this dude,

925
00:59:52,199 --> 00:59:54,079
where should I be helping them?

926
00:59:54,280 --> 00:59:56,360
Speaker 2: I think the part of the trouble with that is

927
00:59:56,760 --> 01:00:00,519
you would need to have only verbal, for the most part,

928
01:00:00,559 --> 01:00:03,039
interaction with the candidate during the interview and have to

929
01:00:03,039 --> 01:00:04,639
go through the process of like, Okay, you know, we

930
01:00:04,679 --> 01:00:07,639
want to record this session so that you know, later

931
01:00:07,719 --> 01:00:10,000
we can feed it back through and make it available.

932
01:00:10,039 --> 01:00:12,760
And I know that you know, there's no reason why

933
01:00:12,800 --> 01:00:15,000
this should be a problem, but you know, every single

934
01:00:15,000 --> 01:00:18,679
additional obstacle you add, there is another risk for potentially

935
01:00:18,719 --> 01:00:20,559
losing out on the candidate. I feel like, you know, hey,

936
01:00:20,559 --> 01:00:22,760
can we record this session and have this as recordings

937
01:00:22,800 --> 01:00:24,320
so we can share with the rest of the team.

938
01:00:24,800 --> 01:00:26,320
It's just another one of those things. So I mean,

939
01:00:26,360 --> 01:00:29,480
if you're getting the value out, you know, great. I

940
01:00:29,480 --> 01:00:32,039
think that's where there's a I would love to see

941
01:00:32,039 --> 01:00:33,199
the tool that actually helps there.

942
01:00:33,719 --> 01:00:36,760
Speaker 5: I'm thinking of it from an individual perspective.

943
01:00:37,679 --> 01:00:42,039
Speaker 4: If you are hiring for a role and you're hiring

944
01:00:42,119 --> 01:00:45,679
someone that is like ninety percent of the way there,

945
01:00:46,320 --> 01:00:49,280
but they are one hundred and twenty percent in terms

946
01:00:49,280 --> 01:00:52,599
of their ability to learn, you know that they will

947
01:00:52,639 --> 01:00:55,599
know the stuff given the opportunity to learn it, because

948
01:00:55,639 --> 01:01:00,400
they've demonstrated they've they've picked up on these technologies in

949
01:01:00,440 --> 01:01:05,039
every job they've changed, they've come in relatively fresh. That's

950
01:01:05,079 --> 01:01:09,639
where they as an individual are able to potentially use

951
01:01:10,559 --> 01:01:17,760
AI to augment their learning, whether it is through co

952
01:01:17,880 --> 01:01:22,199
pilot-type tools, whether it is ChatGPT, those kinds

953
01:01:22,199 --> 01:01:27,920
of things, where previously you would reach for Stack Overflow, which,

954
01:01:27,960 --> 01:01:30,159
again, looking back on it, you think, well, yeah, Stack

955
01:01:30,199 --> 01:01:33,079
Overflow's great because you would get loads of answers to things,

956
01:01:33,880 --> 01:01:37,519
but you would still have to understand the answer because

957
01:01:38,920 --> 01:01:42,880
you don't know who that person is that's written that answer,

958
01:01:43,199 --> 01:01:46,400
much like you have no guarantee of what the model

959
01:01:46,519 --> 01:01:51,920
is outputting in its response, you can rely upon, well,

960
01:01:52,000 --> 01:01:55,440
you can take an indication from the amount of kind of

961
01:01:55,480 --> 01:02:01,119
crowdsourced endorsement, I suppose, of a particular answer. And that's

962
01:02:01,119 --> 01:02:06,599
I guess where responsibility maybe goes more onto model providers

963
01:02:06,639 --> 01:02:11,639
to actually say are the answers you are providing accurate?

964
01:02:13,360 --> 01:02:18,639
Maybe large language models are just too general for some things.

965
01:02:18,760 --> 01:02:22,880
Maybe this is where the more niche models that

966
01:02:22,960 --> 01:02:29,039
are specifically focused on Python coding, for example, are better

967
01:02:29,119 --> 01:02:34,280
fit because they have been trained on vetted best practice

968
01:02:34,559 --> 01:02:41,559
Python code. Yeah, so many angles that you can explore

969
01:02:41,599 --> 01:02:45,519
with this. I feel like we could talk for hours.

970
01:02:47,000 --> 01:02:50,920
Speaker 1: One of the things I'm doing, we're doing our annual

971
01:02:51,559 --> 01:02:56,519
performance reviews, and it consists of each employee gets a

972
01:02:56,519 --> 01:02:59,159
peer review, they do a self review, and then I

973
01:02:59,239 --> 01:03:02,920
write their review. One of the things I've been doing,

974
01:03:03,239 --> 01:03:05,320
and it's taking a huge amount of time, but I feel

975
01:03:05,320 --> 01:03:09,719
like it's still worth it, is I'm giving AI the

976
01:03:09,719 --> 01:03:14,599
peer reviews and their self review and my review, along

977
01:03:14,719 --> 01:03:19,559
with the responsibilities for their current level and the next level,

978
01:03:19,639 --> 01:03:22,840
and then asking AI, how can I help this person

979
01:03:23,280 --> 01:03:26,920
better meet the expectations of their current role and meet

980
01:03:27,079 --> 01:03:33,800
and start growing towards their next level within the company.

981
01:03:34,480 --> 01:03:36,599
And it's been insightful because it's picking up on

982
01:03:36,679 --> 01:03:40,239
things that I'd overlooked when reading through the reviews myself.

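[Editor's note: a rough sketch of the review workflow described above, assembling the peer, self, and manager reviews plus level expectations into a single prompt. The function names, sample inputs, and the stubbed model call are illustrative assumptions.]

```python
# ask_model is a stub; swap in whichever assistant you actually use.
def build_review_prompt(peer_reviews: list[str], self_review: str,
                        manager_review: str, current_level: str,
                        next_level: str) -> str:
    sections = [
        "Peer reviews:\n" + "\n---\n".join(peer_reviews),
        "Self review:\n" + self_review,
        "Manager review:\n" + manager_review,
        "Current level expectations:\n" + current_level,
        "Next level expectations:\n" + next_level,
        "How can I help this person better meet the expectations of their "
        "current role and start growing toward the next level?",
    ]
    return "\n\n".join(sections)

def ask_model(prompt: str) -> str:
    return "stubbed suggestions"  # placeholder for a real model call

prompt = build_review_prompt(["Great collaborator."], "I shipped project X.",
                             "Strong quarter overall.",
                             "Senior engineer expectations",
                             "Staff engineer expectations")
print(ask_model(prompt))
```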
983
01:03:41,880 --> 01:03:44,760
Speaker 4: I think that's a great, great use case. But the

984
01:03:44,920 --> 01:03:48,920
key bit in that is it's grounded by human effort

985
01:03:49,079 --> 01:03:51,760
that somebody has put in to start with. It's grounded by

986
01:03:52,639 --> 01:03:58,480
truth and actual knowledge. If you took I mean, if

987
01:03:58,760 --> 01:04:02,400
you took the part where you write your review

988
01:04:02,400 --> 01:04:06,920
of that person away and said, Okay, well, they've written

989
01:04:06,960 --> 01:04:10,679
a self review, they've had a peer review from somebody else.

990
01:04:11,320 --> 01:04:16,880
Now let's use AI to write the employer review. Make

991
01:04:16,960 --> 01:04:20,920
it this tone of voice, here's the metrics about this person.

992
01:04:21,440 --> 01:04:24,199
How many lines of code they've written this year, those

993
01:04:24,280 --> 01:04:27,880
kinds of things where you could quite easily use AI

994
01:04:27,960 --> 01:04:30,719
for that. But that's where it then turns into the

995
01:04:31,880 --> 01:04:32,840
this is a little bit.

996
01:04:34,440 --> 01:04:38,679
Speaker 1: Too far, Yeah, for sure, because at that point I personally,

997
01:04:38,679 --> 01:04:41,079
I would feel like, well, I'm not really adding any

998
01:04:41,159 --> 01:04:43,960
value here. Their review came from AI. At that point,

999
01:04:44,000 --> 01:04:46,320
I feel like I've still got to put some skin

1000
01:04:46,360 --> 01:04:49,440
in the game and do my job to help them.

1001
01:04:50,760 --> 01:04:52,440
Speaker 3: Well, I think with that said, like a lot of

1002
01:04:53,320 --> 01:04:56,960
industries are going to be creating verification processes that are

1003
01:04:57,000 --> 01:05:00,119
specific to the problem. So this whole idea that the

1004
01:05:00,159 --> 01:05:03,360
AI is going to be running amok, it's like, well, no,

1005
01:05:03,480 --> 01:05:05,880
we don't really do that. That's not like how things

1006
01:05:05,960 --> 01:05:08,320
in the real world work. So, for example, in biotech,

1007
01:05:09,079 --> 01:05:10,800
I think there's going to be a ton of AI

1008
01:05:11,320 --> 01:05:14,559
generated drugs, but they still have to go through the

1009
01:05:14,559 --> 01:05:17,920
same verification process as all the other drugs, which take years.

1010
01:05:17,960 --> 01:05:20,159
You have to be able to create it, Like just

1011
01:05:20,159 --> 01:05:22,440
just being able to create the thing, just because the

1012
01:05:22,440 --> 01:05:24,960
computer says that it's a valid like it's a valid

1013
01:05:25,039 --> 01:05:27,000
drug and all that, doesn't mean that it is.

1014
01:05:27,039 --> 01:05:29,320
It still has to go through clinical trials and still

1015
01:05:29,320 --> 01:05:32,360
has to go through peer review, and I feel like

1016
01:05:32,400 --> 01:05:37,039
every industry is going to have something similar, right,

1017
01:05:37,239 --> 01:05:39,920
Like I don't know, so I don't worry

1018
01:05:39,920 --> 01:05:41,800
about that one quite as much, except I worry a

1019
01:05:41,800 --> 01:05:45,800
lot about greed in the loop. Like, that's

1020
01:05:45,840 --> 01:05:48,199
the one. That's the one that I worry about. Like, oh,

1021
01:05:48,320 --> 01:05:51,079
look, now we can make all these biosimilars to

1022
01:05:51,320 --> 01:05:54,519
you know, this drug that the patent expired for, and

1023
01:05:54,599 --> 01:05:56,440
we can just be pumping these out and then, like

1024
01:05:57,039 --> 01:05:59,119
you know, if there's not enough like regulation and things,

1025
01:05:59,239 --> 01:06:01,360
or if somebody manages to get pushed through any of

1026
01:06:01,400 --> 01:06:05,239
these kinds of verification steps, then I

1027
01:06:05,280 --> 01:06:08,760
could see that going very, very sideways. So I'm just

1028
01:06:08,760 --> 01:06:12,440
gonna hope that doesn't happen, and if it does, I'm

1029
01:06:12,760 --> 01:06:14,679
going to move off to the woods and there's going

1030
01:06:14,719 --> 01:06:16,119
to be no more computers in my life.

1031
01:06:16,159 --> 01:06:18,519
Speaker 2: I think you hit on

1032
01:06:18,559 --> 01:06:22,800
something that's quite ingenious here, actually, Jillian. If you just

1033
01:06:22,880 --> 01:06:27,320
go through previous patents for drugs and then you ask

1034
01:06:27,639 --> 01:06:31,119
an LLM to generate a new drug that has the

1035
01:06:31,159 --> 01:06:35,239
same bonding you know, activation sites, the same you know,

1036
01:06:35,639 --> 01:06:40,000
interactions with other molecules, but is fundamentally different enough that

1037
01:06:40,079 --> 01:06:42,320
it could be classified as a new drug that could

1038
01:06:42,320 --> 01:06:47,000
be patented. Then these companies will start losing a lot

1039
01:06:47,039 --> 01:06:49,599
of money because their patents won't mean a

1040
01:06:49,599 --> 01:06:53,159
lot anymore, and we'll have a lot cheaper medication in

1041
01:06:53,199 --> 01:06:54,039
the world.

1042
01:06:55,000 --> 01:06:58,039
Speaker 3: That's the hope. That's not actually how things have been going.

1043
01:07:00,480 --> 01:07:03,559
I mean, like, I really, I really appreciate your optimism there,

1044
01:07:03,679 --> 01:07:07,760
but I'm not I don't know, I'm not sure. Like

1045
01:07:07,800 --> 01:07:10,280
if you look at like biologics, right, like biologics are

1046
01:07:10,280 --> 01:07:12,440
probably I think are like one of the biggest medical

1047
01:07:12,559 --> 01:07:16,719
innovations you know, in like decades, and they're so expensive.

1048
01:07:16,800 --> 01:07:19,960
They cost, like, I don't know, I think Humira costs

1049
01:07:20,000 --> 01:07:22,639
like a couple grand a month without insurance or something

1050
01:07:22,679 --> 01:07:24,360
like that. So then yeah, hopefully we do get this

1051
01:07:24,440 --> 01:07:26,800
next wave where we're creating all the biosimilar drugs and

1052
01:07:26,880 --> 01:07:29,360
so on and so forth. But I mean, when when

1053
01:07:29,360 --> 01:07:31,800
the biologics first came out, they had their patents and

1054
01:07:31,840 --> 01:07:34,719
they had an absolute lock on the market. Legally speaking,

1055
01:07:34,760 --> 01:07:36,960
you could not create a drug that was you know,

1056
01:07:37,159 --> 01:07:40,400
slightly similar. They're called biosimilars. You couldn't do that

1057
01:07:40,480 --> 01:07:45,000
because there's legal red tape and stuff. But yeah, I hope,

1058
01:07:45,039 --> 01:07:46,719
So that would be great if you know, if like

1059
01:07:46,960 --> 01:07:50,159
just producing these drugs got cheap enough that the patents

1060
01:07:50,159 --> 01:07:52,840
were no longer even worth it, then that would be

1061
01:07:52,840 --> 01:07:55,159
like a pretty huge disruptor to the medical industry.

1062
01:07:55,679 --> 01:08:00,239
Speaker 2: I mean, they could make it much worse because the

1063
01:08:00,239 --> 01:08:03,800
company that produced the drug initially, if they had used

1064
01:08:03,920 --> 01:08:06,400
LLMs anywhere in their process, technically they can't patent it

1065
01:08:06,480 --> 01:08:09,559
in the first place. So I think we're very close

1066
01:08:09,599 --> 01:08:12,760
to the point where, uh, there will not be any

1067
01:08:12,880 --> 01:08:18,720
patentable artifacts that exist unless the fundamental laws are fundamentally changed.

1068
01:08:19,359 --> 01:08:22,199
Speaker 3: I think we're already there though, and

1069
01:08:22,279 --> 01:08:24,760
patents are still around. So I think it's less about

1070
01:08:24,800 --> 01:08:28,760
the computery stuff, you know. So like I work

1071
01:08:28,800 --> 01:08:30,399
with a lot of companies and they're like, oh, can

1072
01:08:30,439 --> 01:08:34,960
I patent this process, like the software process, And it's like, well, no,

1073
01:08:35,079 --> 01:08:37,039
you shouldn't even really bother with that. Go patent the

1074
01:08:37,079 --> 01:08:39,359
process that you use in the lab to actually create

1075
01:08:39,359 --> 01:08:42,000
the drugs, And so that's where everybody's at. The actual

1076
01:08:42,000 --> 01:08:45,399
like data generation is like a throwaway kind of, yeah,

1077
01:08:46,720 --> 01:08:49,199
you know, like yeah, it's just it's just throw away

1078
01:08:49,199 --> 01:08:50,920
because a lot of that has to be open anyways,

1079
01:08:50,960 --> 01:08:53,039
like the data generation that you used to actually get

1080
01:08:53,039 --> 01:08:54,640
to your drug, because it has to it has to

1081
01:08:54,680 --> 01:08:56,359
be like peer reviewed and all that kind of thing.

1082
01:08:56,920 --> 01:08:59,000
But everything that goes on in the lab can still

1083
01:08:59,039 --> 01:09:02,199
be a lot more, I don't know, it can still

1084
01:09:02,239 --> 01:09:03,920
have a patent. See, this is why greed in

1085
01:09:03,960 --> 01:09:06,039
the loop is such a problem because then there's always

1086
01:09:06,079 --> 01:09:09,520
like there's always ways around the things, and then people

1087
01:09:09,560 --> 01:09:11,000
want to be making money, which I get.

1088
01:09:11,079 --> 01:09:14,239
Speaker 2: I feel like that's that's its own episode in itself.

1089
01:09:14,319 --> 01:09:19,920
I think greed in software and tech companies, and yeah.

1090
01:09:19,279 --> 01:09:21,680
Speaker 3: It could be a little bit depressing though, it's not like

1091
01:09:21,720 --> 01:09:22,399
a fun topic.

1092
01:09:22,920 --> 01:09:23,279
Speaker 5: Okay.

1093
01:09:23,359 --> 01:09:26,399
Speaker 2: So Jillian's like, I have tons of optimistic topics that

1094
01:09:26,560 --> 01:09:29,039
we should talk about. Let's pick one of those, especially

1095
01:09:29,079 --> 01:09:32,119
regarding any sort of AI or mL Okay, I'm all

1096
01:09:32,159 --> 01:09:32,560
for it.

1097
01:09:32,920 --> 01:09:34,840
Speaker 3: Yeah, let's just talk about the cool stuff and not

1098
01:09:34,920 --> 01:09:38,520
talk about, you know, potentially people flooding the market with

1099
01:09:39,279 --> 01:09:41,840
crazy patents and then nobody getting their drugs is how

1100
01:09:41,880 --> 01:09:42,640
that works.

1101
01:09:43,920 --> 01:09:46,119
Speaker 1: So, speaking of cool topics, Alex, you work with a

1102
01:09:46,159 --> 01:09:50,680
lot of companies implementing AI into their business processes. A

1103
01:09:50,720 --> 01:09:54,039
lot of our listeners are in the DevOps field or

1104
01:09:54,119 --> 01:09:58,319
deal with software engineering and infrastructure. What are the key

1105
01:09:59,239 --> 01:10:04,239
pieces of advice you would have for them to continue

1106
01:10:04,279 --> 01:10:06,159
their career and be ready for the next evolution.

1107
01:10:07,479 --> 01:10:08,840
Speaker 5: That's a great question.

1108
01:10:11,199 --> 01:10:19,039
Speaker 4: I think it's about embracing it, but also being super

1109
01:10:19,079 --> 01:10:23,640
critical of the solutions and the tools that are available. So

1110
01:10:25,359 --> 01:10:31,239
it's very, very easy to feel overwhelmed. I think by

1111
01:10:32,039 --> 01:10:35,000
the number of AI solutions that are.

1112
01:10:35,279 --> 01:10:37,159
Speaker 5: Available in any space.

1113
01:10:40,279 --> 01:10:44,800
Speaker 4: I think in DevOps, particularly if we're including kind of

1114
01:10:44,800 --> 01:10:48,560
the developer side of that tools in there, we're just

1115
01:10:48,600 --> 01:10:51,319
going to see more and more come out.

1116
01:10:51,359 --> 01:10:51,960
Speaker 5: I mean, there's.

1117
01:10:53,920 --> 01:11:01,840
Speaker 4: a company I came across whose niche is co

1118
01:11:02,000 --> 01:11:05,600
pilot tools, but they only offer models trained on your

1119
01:11:05,600 --> 01:11:08,920
company's data, so they haven't got like a public offering.

1120
01:11:09,039 --> 01:11:12,039
They're aiming at an enterprise. So the idea is they

1121
01:11:12,960 --> 01:11:16,960
they train a model based on your organization's code bases,

1122
01:11:18,520 --> 01:11:22,760
and that is your private copilot. So I think there's

1123
01:11:23,880 --> 01:11:29,159
there's lots and lots to come in this area operationally.

1124
01:11:29,239 --> 01:11:33,960
I think the whole I guess principle of DevOps is

1125
01:11:34,680 --> 01:11:37,720
trying to break down that sort of metaphorical wall

1126
01:11:38,760 --> 01:11:44,239
between the two sides and empower developers to do the operational tasks.

1127
01:11:45,279 --> 01:11:48,560
I think we're seeing quite a lot come out around

1128
01:11:51,920 --> 01:11:56,000
explaining operational events using generative AI, and that kind

1129
01:11:56,000 --> 01:12:01,000
of almost like trace analysis of well, this happened. We've

1130
01:12:01,000 --> 01:12:04,399
got ten different data points here, and how do we

1131
01:12:04,439 --> 01:12:08,159
correlate them, How do we say this happened because this happened,

1132
01:12:08,159 --> 01:12:11,600
and this happened, this happened, and the chain reaction was this.

1133
01:12:13,560 --> 01:12:14,279
Speaker 5: Being able to.

1134
01:12:16,319 --> 01:12:19,840
Speaker 4: Even as simple as putting those things together in

1135
01:12:21,039 --> 01:12:23,159
some sort of structured data and

1136
01:12:23,039 --> 01:12:25,279
Speaker 5: Then using a large language model.

1137
01:12:25,079 --> 01:12:28,239
Speaker 4: To summarize it into a Slack message to say this

1138
01:12:28,319 --> 01:12:34,239
has failed, and this is why. I think the

1139
01:12:34,279 --> 01:12:38,079
one piece of advice I'd give people if they're looking

1140
01:12:38,119 --> 01:12:45,399
to start experimenting with AI is to solve

1141
01:12:45,399 --> 01:12:49,439
your own problems, find things in your processes, in your

1142
01:12:49,439 --> 01:12:55,520
workflows that take the most time, and use AI almost

1143
01:12:55,560 --> 01:12:58,760
as like your shadow. I suppose it would be

1144
01:12:58,760 --> 01:13:01,399
a good way to describe it. So if you haven't

1145
01:13:01,399 --> 01:13:05,800
got confidence in it straight away, tell it to do the same

1146
01:13:05,880 --> 01:13:09,479
things that you would do, but build it so that it

1147
01:13:09,520 --> 01:13:12,079
does it as a dry run. Make sure it is

1148
01:13:12,119 --> 01:13:14,439
going to execute the same steps, right?

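[Editor's note: a hedged sketch of the pattern described above: collect related operational events as structured data, have a model summarize the chain into a short alert, and keep any follow-up action behind a dry-run flag until you trust it. The event data and function names are invented for illustration; the summarize function stubs the model call.]

```python
# Everything here is invented for illustration.
import json

def summarize(events: list[dict]) -> str:
    prompt = ("Summarize this chain of operational events into a short "
              "alert explaining what failed and why:\n"
              + json.dumps(events, indent=2))
    # A real implementation would send `prompt` to your model of choice here.
    return f"[stubbed summary of {len(events)} events]"

def remediate(action: str, dry_run: bool = True) -> None:
    """Shadow mode first: print what would happen until you trust the steps."""
    if dry_run:
        print(f"DRY RUN: would execute -> {action}")
    else:
        print(f"executing -> {action}")

events = [
    {"time": "12:01", "source": "deploy", "detail": "rolled out v42"},
    {"time": "12:03", "source": "alb", "detail": "5xx rate jumped to 8%"},
    {"time": "12:04", "source": "pods", "detail": "CrashLoopBackOff on api"},
]
print(summarize(events))
remediate("roll back api to v41")  # defaults to a dry run
```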
1149
01:13:15,800 --> 01:13:20,479
Speaker 1: Yeah, right, very cool. And that feels like a good

1150
01:13:22,199 --> 01:13:23,840
segue into some picks.

1151
01:13:24,479 --> 01:13:28,199
Speaker 5: Some picks, so.

1152
01:13:30,880 --> 01:13:33,680
Speaker 4: I would say, do your picks have

1153
01:13:33,720 --> 01:13:35,920
to be physical or can they be software?

1154
01:13:36,920 --> 01:13:37,239
Speaker 1: Anything?

1155
01:13:37,399 --> 01:13:43,920
Speaker 4: Anything goes, anything goes okay. So I'm going to go

1156
01:13:44,039 --> 01:13:46,800
for some that are related, some

1157
01:13:46,800 --> 01:13:52,880
that are not. So my first one is

1158
01:13:53,920 --> 01:14:01,000
AWS have a free-to-the-public generative AI experimentation

1159
01:14:01,359 --> 01:14:08,119
website called PartyRock. This was born from, it's

1160
01:14:08,119 --> 01:14:11,520
actually an AWS engineer that built it internally as a

1161
01:14:11,560 --> 01:14:14,960
way to experiment with large language models, and then it

1162
01:14:15,039 --> 01:14:19,800
got adopted by AWS as an organization. So if you

1163
01:14:19,840 --> 01:14:22,680
go to, I think the URL is partyrock

1164
01:14:22,720 --> 01:14:27,840
dot aws. There's no credit card or anything required.

1165
01:14:28,079 --> 01:14:28,399
Speaker 5: Go on.

1166
01:14:28,640 --> 01:14:33,840
Speaker 4: You get a free amount of usage each month, and

1167
01:14:33,920 --> 01:14:39,119
you can build these generative AI apps. One caveat

1168
01:14:39,199 --> 01:14:43,239
is it's free, it's public. Don't go and upload like

1169
01:14:43,279 --> 01:14:48,800
your company's financial records to it, or personal data. It's

1170
01:14:49,199 --> 01:14:51,760
if you're going to use that kind of data, just

1171
01:14:52,159 --> 01:15:11,560
anonymize it first. Then what else? I'm going to

1172
01:15:11,640 --> 01:15:12,439
go for

1173
01:15:15,279 --> 01:15:15,600
Speaker 5: Something.

1174
01:15:15,640 --> 01:15:20,800
Speaker 4: A lot of developers probably spend some money on, but

1175
01:15:21,720 --> 01:15:23,560
I don't think you can spend enough money on it.

1176
01:15:25,199 --> 01:15:30,319
Which is keyboard and mouse. I kit myself out with

1177
01:15:30,680 --> 01:15:36,159
a comfortable mouse and a comfortable keyboard, the one that

1178
01:15:36,199 --> 01:15:39,600
comes with your Mac or comes with your PC. It's functional, right,

1179
01:15:39,680 --> 01:15:41,560
but after a while.

1180
01:15:41,479 --> 01:15:44,760
Speaker 5: It's going to hurt your wrists. So what do you got?

1181
01:15:45,600 --> 01:15:52,920
Speaker 4: My little Logitech MX Master three. It has far too

1182
01:15:52,960 --> 01:15:55,199
many buttons on it to know what to do with,

1183
01:15:55,359 --> 01:16:00,800
but it's comfortable. And then I have a Keychron

1184
01:16:01,279 --> 01:16:08,560
K2 mechanical wireless keyboard, really slim for a mechanical keyboard,

1185
01:16:08,680 --> 01:16:15,319
but super nice to type on as well. And you said

1186
01:16:15,359 --> 01:16:19,560
socks were cool. That was my

1187
01:16:19,640 --> 01:16:24,600
prep for this was socks. So I have some really

1188
01:16:24,600 --> 01:16:29,000
cool socks, but they've all come from conferences. I can't

1189
01:16:29,000 --> 01:16:31,319
give people links, so maybe like.

1190
01:16:31,399 --> 01:16:34,880
Speaker 2: Who which which vendor it gives out the best sucks?

1191
01:16:35,560 --> 01:16:35,800
Speaker 1: Right?

1192
01:16:38,000 --> 01:16:43,880
Speaker 4: There were some I got from, uh, InfluxDB last

1193
01:16:44,079 --> 01:16:47,880
year or the year before last, and they were really cool.

1194
01:16:47,920 --> 01:16:53,359
They were like every color under the sun, super stripey

1195
01:16:53,600 --> 01:17:00,520
but really really comfortable socks. Or the, like, holy grail

1196
01:17:00,680 --> 01:17:04,319
of conference swag, which is the Red Hat red hat,

1197
01:17:06,680 --> 01:17:09,319
which you might be able to see like up there

1198
01:17:09,319 --> 01:17:10,359
on top of the bookcase.

1199
01:17:12,560 --> 01:17:17,640
Speaker 5: Yeah, like swag is, yeah, a bit

1200
01:17:17,600 --> 01:17:23,479
Speaker 4: Of a a debatable topic, but you can normally go

1201
01:17:23,560 --> 01:17:27,039
to a conference with significantly fewer clothes than you need.

1202
01:17:29,720 --> 01:17:32,520
Speaker 5: On day one for sure.

1203
01:17:33,680 --> 01:17:36,319
Speaker 1: All right, Gillian, what'd you bring for picks?

1204
01:17:37,840 --> 01:17:39,960
Speaker 3: I actually have a tech pick this week. I was

1205
01:17:40,000 --> 01:17:43,359
looking for some type of UI to build out my

1206
01:17:43,479 --> 01:17:46,720
terraform code, mostly because of this AI product that I

1207
01:17:46,760 --> 01:17:49,199
was talking about where I deploy it like on the

1208
01:17:49,239 --> 01:17:51,560
client site and it has to have you know, like

1209
01:17:51,600 --> 01:17:53,960
that database and S3 bucket, a couple of Lambda

1210
01:17:53,960 --> 01:17:57,000
functions and then EC2 instances. And I

1211
01:17:57,039 --> 01:17:58,680
was like, wouldn't it be nice if there was just

1212
01:17:58,800 --> 01:18:01,479
like a parameterized GUI, then I could just

1213
01:18:01,520 --> 01:18:04,199
go and type and click a couple buttons because I'm

1214
01:18:04,239 --> 01:18:05,960
really in my I-don't-want-to-be-typing era

1215
01:18:06,159 --> 01:18:06,840
of my life.

1216
01:18:07,359 --> 01:18:07,479
Speaker 5: Uh.

1217
01:18:07,560 --> 01:18:10,399
Speaker 3: And I found Resourcely and it is very, very cool.

1218
01:18:10,520 --> 01:18:12,319
I would like to point out there's no way that

1219
01:18:12,359 --> 01:18:16,479
I can afford their like their plan that's actually very useful.

1220
01:18:16,520 --> 01:18:19,319
So this is part pick and part me e-begging. You know,

1221
01:18:19,399 --> 01:18:23,600
if the guys at Resourcely are listening, I could be the voice

1222
01:18:23,680 --> 01:18:26,960
of your tool on the podcast and like, I'm sure

1223
01:18:26,960 --> 01:18:28,960
that would just be amazing, So there you go. But

1224
01:18:29,000 --> 01:18:30,600
it is. It's really neat, and I like that the

1225
01:18:30,600 --> 01:18:33,039
back end is all just run by Terraform and Cookie

1226
01:18:33,039 --> 01:18:35,319
Cutter, because those are just

1227
01:18:35,359 --> 01:18:37,840
my two favorite tools of all time. It's like half

1228
01:18:37,840 --> 01:18:41,039
my life is run with Terraform, Cookiecutter, and Makefiles,

1229
01:18:41,640 --> 01:18:43,359
and then when you throw in the Makefiles, it's like

1230
01:18:43,439 --> 01:18:44,680
ninety percent of my life.

1231
01:18:45,279 --> 01:18:48,600
Speaker 2: So we definitely have like a full sponsor section

1232
01:18:48,920 --> 01:18:52,800
on Adventures in DevOps dot com where if someone

1233
01:18:52,800 --> 01:18:54,680
wants to be a sponsor this podcast, they can go

1234
01:18:54,760 --> 01:18:57,279
there and read what we have and then decide if

1235
01:18:57,279 --> 01:18:59,319
it's for them. Uh, you know, Jillian, what I found

1236
01:18:59,319 --> 01:19:03,039
that works is ask your customer, you know, how

1237
01:19:03,079 --> 01:19:06,000
incidentals work, and whether or not the usage of

1238
01:19:06,119 --> 01:19:09,399
third party tools to help cut down the amount of

1239
01:19:09,439 --> 01:19:12,079
time that you would have to charge them for would

1240
01:19:12,119 --> 01:19:14,479
be included under their contract. A lot of times the

1241
01:19:14,520 --> 01:19:16,760
contracts, when I was doing consulting, you would include

1242
01:19:16,760 --> 01:19:19,439
those in there, and of course you would charge that

1243
01:19:19,479 --> 01:19:22,760
to the customer to optimize what they're actually getting

1244
01:19:22,800 --> 01:19:24,399
out of the value that you're providing.

1245
01:19:25,279 --> 01:19:27,520
Speaker 3: So that's always really tricky for me, just because like

1246
01:19:27,800 --> 01:19:31,319
the companies that I work for, they're not creating technology

1247
01:19:31,600 --> 01:19:34,399
as their product. Like if they could get rid of me,

1248
01:19:34,760 --> 01:19:36,920
they would, okay, like if they could just be like

1249
01:19:37,039 --> 01:19:39,600
you just go away, we just want to work on

1250
01:19:39,640 --> 01:19:42,840
our laptops with Excel, like they absolutely would, So that one,

1251
01:19:43,159 --> 01:19:44,520
that one is always a little bit of a tough

1252
01:19:44,520 --> 01:19:47,600
sell for me, so instead I just start emailing people and

1253
01:19:47,640 --> 01:19:51,399
try to get stuff for free, which is probably questionable

1254
01:19:51,560 --> 01:19:54,039
like in terms of ethics.

1255
01:19:53,600 --> 01:19:56,800
Speaker 2: But I get emailed all the time people asking stuff

1256
01:19:56,800 --> 01:19:59,720
for free, So I mean, I don't think you're doing

1257
01:19:59,720 --> 01:20:01,359
anything especially wrong there.

1258
01:20:01,880 --> 01:20:04,960
Speaker 3: All right, well, thank you for the

1259
01:20:04,960 --> 01:20:07,600
ethical vote anyways, thank you, I appreciate that. Sometimes I am

1260
01:20:07,600 --> 01:20:09,479
a little bit like, hmm, maybe I'm a little bit

1261
01:20:09,479 --> 01:20:13,479
too much on the side of the e-begging, but I

1262
01:20:13,560 --> 01:20:17,880
do like money, so here we are. But anyways,

1263
01:20:17,880 --> 01:20:19,920
it is a really great tool. It actually does generate

1264
01:20:19,960 --> 01:20:23,279
you, like, this really nice UI you can use. It

1265
01:20:23,279 --> 01:20:26,960
has the sort of parameterization and multi-tenancy built

1266
01:20:27,000 --> 01:20:28,640
in that I really like because I find a lot

1267
01:20:28,680 --> 01:20:31,520
of tools they just I don't know, they just they

1268
01:20:31,560 --> 01:20:33,800
just don't have that, and that tends to immediately not

1269
01:20:33,880 --> 01:20:36,319
work for me because I'm so rarely working on my

1270
01:20:36,399 --> 01:20:39,159
own AWS account, right? Like, my AWS account is as

1271
01:20:39,159 --> 01:20:41,760
bare bones as it can possibly be, just for you know,

1272
01:20:41,840 --> 01:20:43,840
dev for whatever it is that I'm working on, and

1273
01:20:43,840 --> 01:20:47,399
then everything else is deployed on client sites. So it did.

1274
01:20:47,920 --> 01:20:50,159
It did genuinely look like a really nice tool that

1275
01:20:50,279 --> 01:20:52,239
has like everything that I want. And I think that

1276
01:20:52,279 --> 01:20:54,800
I can even make the free plan like mostly work

1277
01:20:54,840 --> 01:20:56,319
for me. But we'll see.

1278
01:20:57,840 --> 01:20:59,800
Speaker 1: Right on, Warren, what'd you bring for a pick?

1279
01:21:00,359 --> 01:21:03,920
Speaker 2: Yeah, so I've got something really interesting. It's actually an

1280
01:21:04,640 --> 01:21:08,560
old research paper from Yale in twenty ten, and

1281
01:21:08,600 --> 01:21:11,760
the name of the paper is comparing genomes to computer

1282
01:21:11,840 --> 01:21:15,520
operating systems in terms of the topology and evolution of

1283
01:21:15,520 --> 01:21:21,359
their regulatory control networks. In short,

1284
01:21:21,680 --> 01:21:25,680
it compares the Linux operating system to the E. coli bacterium.

1285
01:21:26,000 --> 01:21:29,000
And I find this really interesting from an architecture standpoint

1286
01:21:29,039 --> 01:21:33,279
of how what we build in technology is so wrong.

1287
01:21:33,479 --> 01:21:38,000
If we look at the evolution of biology for millions

1288
01:21:38,039 --> 01:21:40,960
of years, you look at the evolution of E. Coli

1289
01:21:41,119 --> 01:21:44,840
and you see what's currently there, and it's only a

1290
01:21:44,880 --> 01:21:47,760
six page paper. It's very short, and it really gives

1291
01:21:47,760 --> 01:21:49,680
a lot of insight into the sorts of things we're

1292
01:21:49,680 --> 01:21:52,119
building and whether or not we're building them effectively, and

1293
01:21:52,199 --> 01:21:57,199
being in the infrastructure systems design space having new insights

1294
01:21:57,239 --> 01:22:00,439
for how to build things or what actually is really important.

1295
01:22:00,439 --> 01:22:04,520
I always find really interesting. Is that from Wade Schultz?

1296
01:22:06,119 --> 01:22:10,079
I don't think so, okay, but I could be totally wrong,

1297
01:22:10,119 --> 01:22:11,800
and so I don't want to swear to it, and

1298
01:22:11,920 --> 01:22:14,960
I will have to confirm for you after the episode

1299
01:22:15,000 --> 01:22:15,279
is over.

1300
01:22:15,560 --> 01:22:17,680
Speaker 1: Right on, because Wade's a really good friend of mine,

1301
01:22:17,680 --> 01:22:22,800
and he's the head of computational health over at Yale,

1302
01:22:22,960 --> 01:22:25,359
and that sounds exactly like something he would offer.

1303
01:22:30,079 --> 01:22:30,319
Speaker 3: Right on.

1304
01:22:30,880 --> 01:22:33,039
Speaker 1: So my pick for the week, I'm picking a book

1305
01:22:33,039 --> 01:22:38,319
this week, a sci-fi fiction book called Juris

1306
01:22:38,359 --> 01:22:41,680
Ex Machina, I think that's how it's pronounced. It's from

1307
01:22:42,760 --> 01:22:45,359
John W. Maly. It's a really cool book that has

1308
01:22:45,439 --> 01:22:47,239
a lot of tie into the episode we've talked about

1309
01:22:47,279 --> 01:22:51,159
here today. It's a future Earth where the legal system

1310
01:22:51,159 --> 01:22:58,439
has been largely replaced by AI and the main hero

1311
01:22:58,520 --> 01:23:01,159
of the story has been wrongly convicted and goes to prison,

1312
01:23:01,640 --> 01:23:03,720
but it's a really well written book. It's got a

1313
01:23:03,760 --> 01:23:07,439
lot of super cool, nerdy tie ins into it, and

1314
01:23:07,520 --> 01:23:10,680
the writing is well done. It's fast paced, so you

1315
01:23:10,720 --> 01:23:14,680
get sucked into it immediately. And on top of that,

1316
01:23:14,880 --> 01:23:17,960
in about a month we're going to have John on

1317
01:23:18,000 --> 01:23:20,279
the show to talk about the book and AI. So

1318
01:23:20,319 --> 01:23:25,000
I'm looking forward to that episode. And so that's my pick.

1319
01:23:24,800 --> 01:23:28,359
Speaker 5: For the week. Awesome.

1320
01:23:28,359 --> 01:23:29,760
Speaker 2: I'm gonna have to read this in preparation.

1321
01:23:30,800 --> 01:23:32,880
Speaker 1: Yeah, it's been a really cool book. Like I

1322
01:23:32,920 --> 01:23:35,199
struggle to get into fiction books, but this one

1323
01:23:35,279 --> 01:23:41,479
just like slurped me on in. Right on. Alex, thank

1324
01:23:41,479 --> 01:23:43,319
you so much for joining us on the episode. Was

1325
01:23:43,359 --> 01:23:44,399
a pleasure talking with you.

1326
01:23:44,960 --> 01:23:46,520
Speaker 5: Thank you for having me. It's been good fun.

1327
01:23:47,039 --> 01:23:50,439
Speaker 1: Right on, and to all our listeners, thank you for listening.

1328
01:23:50,439 --> 01:23:54,000
Appreciate your support. Jillian, Warren, thank you for joining me

1329
01:23:54,039 --> 01:23:57,680
and co-hosting with me, and we'll see everyone next week.

