1
00:00:05,080 --> 00:00:06,599
Speaker 1: All right, what's up everyone.

2
00:00:06,679 --> 00:00:10,199
Speaker 2: Welcome to another episode of Adventures in dev Ops.

3
00:00:10,279 --> 00:00:11,960
Speaker 1: Warren, Welcome, good to see you again.

4
00:00:12,880 --> 00:00:15,880
Speaker 3: Thank you for inviting me back. Will Uh, it's been great.

5
00:00:17,039 --> 00:00:17,640
It's interesting.

6
00:00:17,679 --> 00:00:21,039
Speaker 4: I was looking for interesting facts for this particular episode,

7
00:00:21,160 --> 00:00:24,800
and I came across one by a writer that has

8
00:00:24,839 --> 00:00:28,640
been has gone viral on Twitter and other places. And

9
00:00:28,679 --> 00:00:33,119
she's her name, Yoanna Machayevski Sta and she says, I

10
00:00:33,200 --> 00:00:35,399
want AI do my laundry and dishes so that I

11
00:00:35,439 --> 00:00:37,880
can do art and writing, not for AI to do

12
00:00:37,960 --> 00:00:40,159
my art and writing so that I can do my

13
00:00:40,359 --> 00:00:45,000
laundry and all right, And if you search that on Google,

14
00:00:45,039 --> 00:00:47,640
actually you'll find no less than ten guys who have

15
00:00:47,960 --> 00:00:51,320
co opted her words and are claiming them as their own, uh,

16
00:00:51,600 --> 00:00:54,240
spreading the viralness out there. But I think it's very

17
00:00:54,280 --> 00:00:56,560
interesting because it's also the direction that we've been going,

18
00:00:56,560 --> 00:00:58,920
and I think it's relevant to today's episode.

19
00:00:59,520 --> 00:01:03,399
Speaker 2: Yeah, for sure, because joining us today in the studio,

20
00:01:04,280 --> 00:01:09,359
Anthony SPITERI, we've had your longtime friend and co worker

21
00:01:09,799 --> 00:01:12,799
Michael Kaide from Van on the show just a couple

22
00:01:12,840 --> 00:01:16,319
of episodes ago, so this will be fun. Anthony, thank

23
00:01:16,359 --> 00:01:18,159
you for joining us and welcome.

24
00:01:18,200 --> 00:01:20,159
Speaker 5: Thanks Will, thanks Bar and glad to be here. Yeah,

25
00:01:20,200 --> 00:01:23,599
it's a big choice to feel literally following him up

26
00:01:25,319 --> 00:01:27,400
a big man in more wives than one.

27
00:01:28,599 --> 00:01:31,640
Speaker 2: He was He looked only like three or four inches

28
00:01:31,680 --> 00:01:33,560
tall on my screen, so I don't really have a

29
00:01:33,640 --> 00:01:34,519
sense of scale.

30
00:01:35,280 --> 00:01:40,280
Speaker 5: Yeah, that's why you have to carry him some random

31
00:01:40,280 --> 00:01:42,719
city around the world because of certain you know, certain

32
00:01:43,120 --> 00:01:44,280
things going on site.

33
00:01:44,280 --> 00:01:47,239
Speaker 6: Then you realize this happy the man. He is right.

34
00:01:47,719 --> 00:01:52,799
Speaker 4: It's not education that's the great equalizer, it's remote video calls.

35
00:01:54,200 --> 00:01:54,439
Speaker 1: Cool.

36
00:01:54,519 --> 00:01:56,000
Speaker 2: Yeah, so we were going to talk a little bit

37
00:01:56,040 --> 00:01:58,280
and this is this is gonna be a fun episode

38
00:01:58,319 --> 00:02:02,319
because AI is pre controversial. I feel like everyone's falling

39
00:02:02,439 --> 00:02:06,439
into one of two camps. Either you know, it's going

40
00:02:06,519 --> 00:02:11,280
to do everything for me, or it's the reincarnation of

41
00:02:11,280 --> 00:02:13,759
the evil Overlord and it's the worst thing to happen

42
00:02:14,000 --> 00:02:16,360
to humanity. And I think there's very few people that

43
00:02:16,639 --> 00:02:20,159
are in between those camps, at least very few people

44
00:02:20,599 --> 00:02:24,360
who are vocalizing what their actual opinion is, which is

45
00:02:24,400 --> 00:02:27,599
probably the case with a lot of things. But I

46
00:02:27,599 --> 00:02:31,039
think it's cool. So you've been working on quite a

47
00:02:31,039 --> 00:02:38,680
bit of stuff of integrating AI through like conversational prompt engineering.

48
00:02:38,240 --> 00:02:43,400
Speaker 5: Right, absolutely, yeah, I mean it's it's a fascinating field.

49
00:02:43,520 --> 00:02:45,759
Like I think there's there's certain elements of hype to

50
00:02:45,800 --> 00:02:48,039
it with that question, like we know that it's been

51
00:02:48,120 --> 00:02:52,120
hyped up, but I think about you know, just post pandemic,

52
00:02:52,159 --> 00:02:55,879
we had the crypto blockchain defile hype which was so.

53
00:02:55,919 --> 00:02:58,000
Speaker 6: Huge from the time, and people are.

54
00:02:57,840 --> 00:02:59,759
Speaker 5: Like, yep, that's the thing, this is the future, and

55
00:03:00,199 --> 00:03:03,000
you know, while that's still you know, they're thereabouts, it's

56
00:03:03,319 --> 00:03:05,479
it's kind of dropped off in terms of its actual

57
00:03:05,560 --> 00:03:09,080
applic applicability to the world. But then I think this

58
00:03:09,159 --> 00:03:11,319
came in and just went here we go like this

59
00:03:11,400 --> 00:03:14,159
is this is here, it's coming big, it's coming strong.

60
00:03:14,360 --> 00:03:16,680
Open AI did wonders with chat GBT. All of a

61
00:03:16,719 --> 00:03:19,599
sudden it was like, what's chat GBT on Twitter? And

62
00:03:19,719 --> 00:03:21,400
or it was X was without the time that X

63
00:03:21,520 --> 00:03:23,960
was being bought if I remember correctly by by Elon.

64
00:03:24,520 --> 00:03:26,000
And then yeah, then all of a sudden, we've got

65
00:03:26,000 --> 00:03:29,039
this amazing interface that does amazing stuff and the first

66
00:03:29,039 --> 00:03:31,000
thing that I told it to do was to build

67
00:03:31,039 --> 00:03:32,919
a script for me. But that's I think, and I

68
00:03:32,919 --> 00:03:35,280
think a lot of people in it. You know, they

69
00:03:35,319 --> 00:03:38,319
didn't ask it any specific questions or anything. They said,

70
00:03:38,319 --> 00:03:40,199
can you do this? Can you build me a script?

71
00:03:40,360 --> 00:03:41,960
And I think so. I think a lot of us

72
00:03:42,000 --> 00:03:44,960
in the technical world were in that, you know, devopy

73
00:03:45,080 --> 00:03:47,680
sort of world or coding or even infrastructure.

74
00:03:48,120 --> 00:03:50,360
Speaker 6: This is the application they saw straight away for it.

75
00:03:51,919 --> 00:03:52,599
Speaker 1: Yeah, for sure.

76
00:03:52,680 --> 00:03:55,719
Speaker 2: I think my initial use of it is I just

77
00:03:55,840 --> 00:03:58,439
asked it bad questions, you know, because I didn't really

78
00:03:58,479 --> 00:04:00,879
see any value of it. And then I think it

79
00:04:00,960 --> 00:04:03,680
might have been actually one of your tweets that I

80
00:04:03,759 --> 00:04:06,639
saw where you did like a conversational example, and I

81
00:04:06,680 --> 00:04:09,039
was like, oh damn, I never thought of doing that.

82
00:04:09,120 --> 00:04:11,520
Speaker 1: And so now a lot of my interaction.

83
00:04:11,199 --> 00:04:15,960
Speaker 2: With it is it's replacing my Google Search because you know,

84
00:04:16,519 --> 00:04:20,079
like in my job, I switch between terrorform and antseble

85
00:04:20,160 --> 00:04:24,040
and GCP and AWS and just the context switching is huge.

86
00:04:24,560 --> 00:04:26,639
And so when I try to remember, like, oh, I

87
00:04:26,680 --> 00:04:28,120
know I need to do this, but I don't remember

88
00:04:28,120 --> 00:04:31,279
the exact syntax, I've found myself in the habit of

89
00:04:31,319 --> 00:04:36,160
asking chat GPT exactly how to do it, and it's like, oh, yeah.

90
00:04:35,959 --> 00:04:36,600
Speaker 6: I just do this.

91
00:04:37,399 --> 00:04:39,519
Speaker 5: Yeah, it's definitely and this is part of the you

92
00:04:39,560 --> 00:04:41,759
know you talked about Lovey Eye, that the trust and

93
00:04:41,800 --> 00:04:44,199
whatever it is, but I think a lot of people

94
00:04:44,199 --> 00:04:46,439
have defaulted to it and don't use Google, which is

95
00:04:46,439 --> 00:04:49,439
why I was saying Google go pretty damn hard with

96
00:04:49,519 --> 00:04:52,240
the Gemini relaunched a couple of weeks ago, right like

97
00:04:52,279 --> 00:04:54,160
that was that was an interesting tip for that luck

98
00:04:55,360 --> 00:04:57,519
We're going to launch Gemini, then Open Eye off coming

99
00:04:57,560 --> 00:04:59,639
on the Monday and doing their thing. And yet I

100
00:04:59,759 --> 00:05:02,399
really that the four oh and it was an interesting week,

101
00:05:02,439 --> 00:05:04,720
and then Microsoft came and did its thing, and again

102
00:05:04,759 --> 00:05:05,959
then Apple's done it yesterday.

103
00:05:06,000 --> 00:05:08,839
Speaker 6: But I think that's the interesting.

104
00:05:08,480 --> 00:05:11,160
Speaker 5: Part of it is that it has replaced Google in

105
00:05:11,279 --> 00:05:14,639
terms of that search capability because we inherently trust it,

106
00:05:14,839 --> 00:05:16,800
though should we I think that's the big thing and

107
00:05:16,879 --> 00:05:18,639
what we get out of there. I think people just

108
00:05:18,759 --> 00:05:21,959
trust from the start, but maybe we shouldn't. But I think,

109
00:05:22,399 --> 00:05:24,279
you know, you've done enough, I've done enough in there

110
00:05:24,319 --> 00:05:27,160
to probably trust it to a certain extent. You don't

111
00:05:27,160 --> 00:05:29,040
have to double check things for sure, like it's not.

112
00:05:29,160 --> 00:05:31,240
But then again, we were at the same place with

113
00:05:31,360 --> 00:05:34,319
search before. I mean even Wikipedia, people were saying about Wikipedia,

114
00:05:34,360 --> 00:05:37,560
don't trust Wikipedia. You I remember at UNI never use

115
00:05:37,600 --> 00:05:40,879
Wikipedia as a citation. We could be wrong, So it's

116
00:05:40,879 --> 00:05:42,439
still it's the same sort of stuff. We're just we're

117
00:05:42,439 --> 00:05:43,399
just going over it again.

118
00:05:44,120 --> 00:05:44,800
Speaker 1: Yeah, for sure.

119
00:05:44,920 --> 00:05:49,600
Speaker 2: And you know, like the thing a few years ago

120
00:05:49,639 --> 00:05:52,319
was copy and piaste the answer out of stack overflow,

121
00:05:52,720 --> 00:05:54,959
and there was there was actually a running joke like,

122
00:05:55,439 --> 00:05:57,480
you know, can we build a vs code plug in

123
00:05:57,560 --> 00:06:00,560
that just copies the right answer from stack go overflow

124
00:06:00,639 --> 00:06:03,480
and adds it to my editor And I think, to

125
00:06:03,600 --> 00:06:06,160
a yeah, I think to a large extent chat GPT

126
00:06:06,560 --> 00:06:07,519
is is that plug again?

127
00:06:08,439 --> 00:06:09,600
Speaker 3: Plugging for sure exists.

128
00:06:10,160 --> 00:06:11,839
Speaker 4: You know, I think you two may have been way

129
00:06:11,839 --> 00:06:15,680
more ahead of me and interacting with UGBT than I was.

130
00:06:15,720 --> 00:06:18,160
Speaker 3: I was like, I don't know if I saw the benefit.

131
00:06:18,240 --> 00:06:21,120
Speaker 4: Initially, I was like, you know, the code isn't going

132
00:06:21,160 --> 00:06:23,480
to be perfect, there's going to be issues there.

133
00:06:23,560 --> 00:06:24,920
Speaker 3: And now since.

134
00:06:24,720 --> 00:06:26,879
Speaker 4: I've started using it, I definitely are more on the

135
00:06:26,879 --> 00:06:29,600
side of I see the long winded answers and I'm like,

136
00:06:29,639 --> 00:06:32,600
can I get back to just the one word or

137
00:06:31,800 --> 00:06:35,639
one please? Because like otherwise I have to I feel

138
00:06:35,639 --> 00:06:37,800
like I have to spend like half an hour training

139
00:06:37,879 --> 00:06:40,199
it on how what the system prompt should be so

140
00:06:40,240 --> 00:06:42,759
that it just gives me the direct answer that I want.

141
00:06:43,639 --> 00:06:46,000
Speaker 5: Yeah, and that's kind of that's the power of that

142
00:06:46,040 --> 00:06:48,360
prompt engineering. So I've been I think I've been big

143
00:06:48,399 --> 00:06:50,720
on this in terms of written a the articles on it,

144
00:06:51,160 --> 00:06:53,560
especially this year since I've been you know, using it

145
00:06:53,560 --> 00:06:55,480
for the past ten months.

146
00:06:55,519 --> 00:06:56,439
Speaker 6: Was sorry to do that.

147
00:06:57,240 --> 00:06:59,240
Speaker 5: You've got to kind of learn how to manipulate it

148
00:06:59,240 --> 00:07:01,360
in the right way and sort of train it to

149
00:07:01,439 --> 00:07:05,040
say the right things, and don't don't be afraid getting

150
00:07:05,079 --> 00:07:07,360
angry at it as well, and just like I really

151
00:07:07,399 --> 00:07:10,319
treat it like a human, Like I actually get frustrated

152
00:07:10,319 --> 00:07:12,439
with the thing, and I'll swear at it, and I'll

153
00:07:12,480 --> 00:07:14,920
call it its whole and I'm not sure if I

154
00:07:14,920 --> 00:07:17,120
can swear in the show, but you know, like I'll

155
00:07:17,759 --> 00:07:19,920
I'll just call it every name under the sun, right,

156
00:07:20,040 --> 00:07:22,839
And it's kind of it doesn't Sometimes it blinks a

157
00:07:22,839 --> 00:07:24,560
little bit and goes, hey, sorry, you know, I was

158
00:07:24,600 --> 00:07:25,079
just trying.

159
00:07:24,920 --> 00:07:25,519
Speaker 6: To do my job.

160
00:07:27,560 --> 00:07:30,439
Speaker 5: The more that you interact with it and the more

161
00:07:30,480 --> 00:07:32,800
that you treat it like you are to maybe talking

162
00:07:32,879 --> 00:07:36,000
to a dev team. Actually, I don't know whether that's

163
00:07:36,040 --> 00:07:38,560
even O favor, because if you talk to someone like that.

164
00:07:40,399 --> 00:07:40,560
Speaker 6: Right.

165
00:07:40,879 --> 00:07:42,600
Speaker 5: Anyway, what I'm trying to say is you talk to

166
00:07:42,600 --> 00:07:44,800
it like a human. You learn how to work and

167
00:07:44,839 --> 00:07:47,399
manipulate it and co erse it, and you get the

168
00:07:47,399 --> 00:07:48,839
best out of it once you learn how to deal

169
00:07:48,879 --> 00:07:50,920
with it. And I think that's the same for any

170
00:07:50,959 --> 00:07:53,720
sort of platform or large language model. It's out there

171
00:07:53,800 --> 00:07:55,639
that we are interacting with these days.

172
00:07:56,079 --> 00:07:58,199
Speaker 2: So I think, sorry, I'm going to jump in here

173
00:07:58,240 --> 00:08:02,759
real quick, Warren, because I think there's a takeaway here, Anthony.

174
00:08:02,839 --> 00:08:06,800
Speaker 1: You are our agent zero.

175
00:08:07,079 --> 00:08:11,959
Speaker 2: So when when chat, GPT or any AI platform becomes

176
00:08:12,040 --> 00:08:16,639
sentient and actually has real world capabilities, I suspect you're

177
00:08:16,680 --> 00:08:18,800
going to be the first person they take out because

178
00:08:18,839 --> 00:08:22,519
you were just such an asshole to it. Yeah, we

179
00:08:22,560 --> 00:08:25,519
need to follow you on X because when you stop tweeting,

180
00:08:25,680 --> 00:08:27,519
we know things have changed.

181
00:08:28,040 --> 00:08:33,039
Speaker 5: Oh it just respects me that much. Thinking in command

182
00:08:33,200 --> 00:08:34,440
that maybe that's my strategy.

183
00:08:35,679 --> 00:08:38,639
Speaker 4: It's for sure Roco's best left, right. You know you

184
00:08:38,720 --> 00:08:43,320
should be helping it come to power otherwise a problematic

185
00:08:43,360 --> 00:08:44,039
future there.

186
00:08:44,519 --> 00:08:46,519
Speaker 6: Absolutely, so you did.

187
00:08:47,600 --> 00:08:51,399
Speaker 2: You were telling us before we started recording, which sometimes

188
00:08:51,440 --> 00:08:53,480
I feel bad because like sometimes the best parts of

189
00:08:53,480 --> 00:08:57,639
this podcast happened before I hit the record button, but

190
00:08:58,639 --> 00:08:59,840
I'll try to circle it back in.

191
00:09:00,639 --> 00:09:01,639
Speaker 1: You were telling us about.

192
00:09:01,480 --> 00:09:07,039
Speaker 2: An app that you built and used some AA platforms

193
00:09:07,120 --> 00:09:11,639
as the data source for that, right, just to sort

194
00:09:11,679 --> 00:09:13,519
of like your own self learning journey on how to

195
00:09:13,600 --> 00:09:14,600
how to best utilize this.

196
00:09:15,360 --> 00:09:17,799
Speaker 5: Yeah, it's interesting, right because I think, I mean a

197
00:09:17,840 --> 00:09:20,840
lot of us have got ideas about you know, just apps,

198
00:09:20,879 --> 00:09:25,519
and we walk through life as technical people and often

199
00:09:25,639 --> 00:09:29,519
wonder can we solve a problem. But I think, like

200
00:09:29,600 --> 00:09:31,600
I told you then I'll tell you again like I'm

201
00:09:31,639 --> 00:09:33,759
a hacker at best, and I think you put it

202
00:09:33,799 --> 00:09:35,519
really well. Will when you see if you give give

203
00:09:35,519 --> 00:09:38,639
me a blank vs. Code window, I can't do anything

204
00:09:39,320 --> 00:09:41,679
like I've always been even when I did.

205
00:09:41,720 --> 00:09:43,200
Speaker 6: I did computer.

206
00:09:42,879 --> 00:09:46,879
Speaker 5: Science at UNI, failed it miserably, got expelled from university

207
00:09:48,120 --> 00:09:50,799
for me, But I've always been able to read code

208
00:09:50,840 --> 00:09:53,799
and understand code. When I was back working as a

209
00:09:53,840 --> 00:09:57,200
service provider and hosting where I was hosting websites, I

210
00:09:57,240 --> 00:09:59,320
worked with a lot of developers. A lot of them

211
00:09:59,360 --> 00:10:01,919
are really bad and they produced really bad code. And

212
00:10:01,960 --> 00:10:04,600
you've got to think back then, it was asp, asp net,

213
00:10:04,840 --> 00:10:08,320
the original dot, it was it was Perl, it was PHP,

214
00:10:08,480 --> 00:10:11,039
it was all that. So I learned how to actually,

215
00:10:11,080 --> 00:10:13,120
you know, read code pretty well and understand what it

216
00:10:13,120 --> 00:10:14,759
was doing and try and trace it back and if

217
00:10:14,759 --> 00:10:16,759
there was a memorily go back and tell them. So

218
00:10:17,039 --> 00:10:20,000
I've always been good around the hacking of code, but

219
00:10:20,120 --> 00:10:23,360
never as a as a creator of original code, if

220
00:10:23,399 --> 00:10:26,039
you know what I mean. So to that end, but

221
00:10:26,080 --> 00:10:28,399
I was used to I could, I could build a website,

222
00:10:28,399 --> 00:10:29,159
I could do this as that.

223
00:10:29,279 --> 00:10:30,360
Speaker 6: So I've always had that in me.

224
00:10:31,159 --> 00:10:33,399
Speaker 5: But then this particular idea that I had, I've got

225
00:10:33,399 --> 00:10:36,679
a it's very fun, it's very weird. But I was

226
00:10:36,720 --> 00:10:39,120
actually on Michael's ninety Days of DevOps this year talking

227
00:10:39,200 --> 00:10:45,399
about this actually. So basically it's an app that it's

228
00:10:45,440 --> 00:10:49,080
a location based services app built with Google location based

229
00:10:49,120 --> 00:10:51,840
APIs and geo geo searching and all that kind of

230
00:10:51,840 --> 00:10:52,679
cool stuff and.

231
00:10:53,000 --> 00:10:54,480
Speaker 6: Place places and all that.

232
00:10:54,519 --> 00:10:57,279
Speaker 5: All those cool APIs at Google has to find and

233
00:10:57,320 --> 00:10:59,320
locate places that offer black pudding.

234
00:11:00,360 --> 00:11:02,159
Speaker 6: I'm not do you guys know what black pudding is?

235
00:11:02,919 --> 00:11:06,000
Speaker 2: I do, but tell us make sure all our listeners know,

236
00:11:06,080 --> 00:11:08,840
because this is not something to gloss over.

237
00:11:09,480 --> 00:11:13,559
Speaker 5: Yeah, black putting is basically it originated or in Europe.

238
00:11:14,279 --> 00:11:18,360
Basically it's basically a sausage or a sort of role

239
00:11:19,000 --> 00:11:23,360
that is effectively made with pigs blood as the main ingredient,

240
00:11:23,639 --> 00:11:27,279
but then other bits of sewet and spices and whatnot,

241
00:11:27,320 --> 00:11:31,159
and it's a delicacy in the UK made in different ways.

242
00:11:31,200 --> 00:11:33,919
It's typically had for breakfast as part of a full

243
00:11:33,960 --> 00:11:37,559
English or full Irish, full Scottish, but then in places

244
00:11:37,600 --> 00:11:40,240
like Germany and other parts of Central Europe it's basically

245
00:11:40,360 --> 00:11:44,360
blood sausage, and in Argentina and in Spain it's like

246
00:11:44,399 --> 00:11:48,159
called morcilla. So effectively, I've always loved it. I fell

247
00:11:48,200 --> 00:11:49,879
in love with it. It was hard to get and

248
00:11:49,919 --> 00:11:52,240
I was sitting at a cafe one day here in Perth,

249
00:11:52,639 --> 00:11:54,279
where we don't get it everywhere.

250
00:11:54,320 --> 00:11:56,440
Speaker 6: I was thinking, man, I really want to go find

251
00:11:56,480 --> 00:11:57,159
a cafe that.

252
00:11:57,120 --> 00:12:00,399
Speaker 5: Has black pointing. So that was the idea for the app, right,

253
00:12:02,799 --> 00:12:04,600
And that's I.

254
00:12:04,519 --> 00:12:08,399
Speaker 2: Feel like, you know, because I'm I'm obviously from the US,

255
00:12:08,519 --> 00:12:11,000
but I feel like that is such a British thing

256
00:12:11,240 --> 00:12:14,720
to do, like to travel the world and be like

257
00:12:14,759 --> 00:12:17,039
where can I get black pudding here?

258
00:12:17,679 --> 00:12:20,399
Speaker 6: Yeah? And I'm not even British Australia, so that's even weird.

259
00:12:20,440 --> 00:12:21,200
In itself, right.

260
00:12:21,320 --> 00:12:23,919
Speaker 5: But actually, actually last week I was in Fort Lauderdale

261
00:12:23,960 --> 00:12:28,720
and I found a place like twenty minutes from the hotel.

262
00:12:27,480 --> 00:12:28,279
Speaker 6: So you know that.

263
00:12:28,440 --> 00:12:33,000
Speaker 5: But going back to the sort of devothy prompt engineering.

264
00:12:33,039 --> 00:12:36,279
Part of this is that I had this idea and

265
00:12:37,000 --> 00:12:40,759
because I know how to, I know how to construct.

266
00:12:40,759 --> 00:12:42,200
Speaker 6: I know what makes up at an application.

267
00:12:42,360 --> 00:12:44,840
Speaker 5: Right, there's there's a front end, there's a back end,

268
00:12:45,240 --> 00:12:49,559
there's UI, there's database, there's APIs. I know how to,

269
00:12:49,879 --> 00:12:52,559
I know what makes it up. I was basically able

270
00:12:52,679 --> 00:12:55,960
to prompt engm my way to a bit of an MVP.

271
00:12:56,679 --> 00:12:59,039
It was about August last year. I was kind of there,

272
00:12:59,080 --> 00:13:02,000
just kind of hacking away after ours, spending lots of

273
00:13:02,000 --> 00:13:04,679
time on chat GBT. It will give me some code

274
00:13:04,720 --> 00:13:07,440
I put into vstudio. I'd load it up with that work. No,

275
00:13:07,960 --> 00:13:11,200
let's refactor it. Let's go in, let's refine. So basically

276
00:13:11,399 --> 00:13:15,519
months of doing that. In fact the GitHub commit I've

277
00:13:15,519 --> 00:13:17,799
got it as a private repositi for and get hubs

278
00:13:18,000 --> 00:13:20,639
still because obviously I haven't got it's not a public

279
00:13:20,720 --> 00:13:23,960
sort of project. But I've done about seven or eight

280
00:13:24,039 --> 00:13:27,480
hundred commits over the past sort of you know, I've

281
00:13:27,480 --> 00:13:30,559
stopped recently but over six months, right, so there was

282
00:13:30,600 --> 00:13:32,039
a lot of commits that I was going in a

283
00:13:32,080 --> 00:13:35,039
lot of iteration, but that's how it grew. I was like, okay,

284
00:13:35,080 --> 00:13:37,360
let's start strong. Can we use a Google API to

285
00:13:38,000 --> 00:13:40,639
build a map, and then can I prompt chat GBT

286
00:13:40,799 --> 00:13:43,320
to now build me in a HTML interface that will

287
00:13:43,519 --> 00:13:45,559
actually view that map? And then can we build a

288
00:13:45,600 --> 00:13:49,080
flask application to render some APIs that call that, and

289
00:13:49,120 --> 00:13:51,759
then can we browd some JavaScript which will work out

290
00:13:51,799 --> 00:13:53,519
the mats to be able to work out the location

291
00:13:53,600 --> 00:13:56,840
between two points. So all that I was able to

292
00:13:56,919 --> 00:13:59,519
articulate into chat GBT and at the end of the

293
00:13:59,600 --> 00:14:04,000
day got something that, surprise, surprise, worked And I never

294
00:14:04,039 --> 00:14:06,840
forget the first actual time where I saw the map

295
00:14:06,879 --> 00:14:10,279
pop up with the location, and that was like amazing

296
00:14:10,320 --> 00:14:13,159
to me that I was able to effectively prompt my

297
00:14:13,240 --> 00:14:17,639
way without really understanding, you know, being a creator coder

298
00:14:17,960 --> 00:14:20,480
and get something working, And that to me blew me away,

299
00:14:20,480 --> 00:14:25,200
because I think shows the absolute possibilities that these large

300
00:14:25,279 --> 00:14:27,639
language models offer the world, right.

301
00:14:29,320 --> 00:14:29,720
Speaker 1: For sure.

302
00:14:29,799 --> 00:14:31,960
Speaker 2: And I think one of the cool things about that

303
00:14:32,080 --> 00:14:34,360
is the fact that you emphasized multiple times that it

304
00:14:34,440 --> 00:14:37,919
was prompt engineering, So you're just having like a conversation

305
00:14:38,960 --> 00:14:43,200
with the AI tool, and then it's like whenever you

306
00:14:43,399 --> 00:14:49,000
ask it something. Then your follow up questions, do you

307
00:14:49,120 --> 00:14:52,000
just like continue as if you were speaking with a

308
00:14:52,039 --> 00:14:54,720
normal human or do you try to reference something back

309
00:14:54,759 --> 00:14:58,080
in the conversation earlier so that it knows what you're

310
00:14:58,120 --> 00:14:58,720
referring to.

311
00:14:59,519 --> 00:14:59,759
Speaker 6: Yeah.

312
00:15:00,000 --> 00:15:03,600
Speaker 5: Absolutely, the first sort of iterations, this was chat GBT

313
00:15:03,759 --> 00:15:06,200
three five to start with them, got in before.

314
00:15:06,960 --> 00:15:08,240
Speaker 6: It was very frustrating.

315
00:15:08,279 --> 00:15:11,039
Speaker 5: At times, it wasn't very good at you have to

316
00:15:11,080 --> 00:15:13,080
you have to kind of refeed at the code at times,

317
00:15:13,080 --> 00:15:16,200
like here's a there's a snippet of code, like there's

318
00:15:16,320 --> 00:15:18,679
something's going wrong, here's the code again, and then it

319
00:15:18,720 --> 00:15:22,279
would suggest the changes, and then then I would go, okay,

320
00:15:23,159 --> 00:15:27,799
modify that in full and without brevity into this latest

321
00:15:27,840 --> 00:15:29,919
code base, because if you didn't do that, it would

322
00:15:29,919 --> 00:15:33,320
basically give you like placeholders and kind of dot dot

323
00:15:33,360 --> 00:15:35,559
dots and that sort of thing. And obviously when you're

324
00:15:35,559 --> 00:15:38,600
cutting and pasting between here and you know, in visual studio,

325
00:15:39,960 --> 00:15:42,399
because there are some plug ins, it goes straight into

326
00:15:42,480 --> 00:15:44,840
visual studio, which might have been easier, but I liked

327
00:15:44,879 --> 00:15:47,679
chat GBT and it was working well, but that was

328
00:15:47,759 --> 00:15:50,720
kind of the frustration having to kind of refeed at information.

329
00:15:51,440 --> 00:15:53,519
Sometimes would get things really wrong, so you have to

330
00:15:53,559 --> 00:15:55,759
go through five or ten iterations of it before it

331
00:15:55,759 --> 00:15:58,080
became right, and then you ran out of your your

332
00:15:58,120 --> 00:16:01,919
chat GBT hourly sort of prompts. And that would be

333
00:16:01,919 --> 00:16:04,279
a great natural sort of place to stop. Because I

334
00:16:04,279 --> 00:16:07,159
found myself working until one two o'clock in the morning.

335
00:16:07,200 --> 00:16:09,840
It's addicted to this, right, But back in the days

336
00:16:09,840 --> 00:16:12,240
where chat GBT was still limited, that was good because

337
00:16:12,240 --> 00:16:14,080
I was like, right, you hit your PROMP limit. You

338
00:16:14,080 --> 00:16:17,120
can't do anything for two hours. Probably I should stop now.

339
00:16:18,879 --> 00:16:21,159
And you know, that was a good natural break. I'll

340
00:16:21,200 --> 00:16:24,080
go to sleep and I'll come back. But it was frustrating, right,

341
00:16:24,080 --> 00:16:26,320
but you had to be patient to be able to,

342
00:16:26,679 --> 00:16:30,279
you know, get the code refined over time and to

343
00:16:30,320 --> 00:16:32,759
the point where you're working with the LM to make

344
00:16:32,799 --> 00:16:34,519
it sort of work with you, not against you.

345
00:16:35,159 --> 00:16:37,759
Speaker 4: How does that compare it to doing it yourself, Like

346
00:16:37,799 --> 00:16:39,799
I mean, if you're just comparing the frustration, right, Like

347
00:16:39,879 --> 00:16:42,360
I know, when I write code, I'm not very good.

348
00:16:42,519 --> 00:16:44,960
I rate it and bang my head against the desk,

349
00:16:45,080 --> 00:16:47,159
you know, because it's not working, and when I do

350
00:16:47,240 --> 00:16:49,600
a chat GPT, I'm meaning my head against the dusk,

351
00:16:49,679 --> 00:16:52,639
but you know, I am shouting explicitives, you know, towards

352
00:16:52,639 --> 00:16:55,240
the particular company for the product, and you use the

353
00:16:55,320 --> 00:16:56,519
experience they provided me.

354
00:16:57,039 --> 00:17:00,000
Speaker 2: It's directed anger at that point.

355
00:17:00,000 --> 00:17:02,879
Speaker 5: Absolutely, yeah, exactly. I think, I honestly think it's the

356
00:17:02,919 --> 00:17:05,759
same that feeling. I mean when I used to do this,

357
00:17:06,759 --> 00:17:08,960
you know, outside but not as complex as this. By

358
00:17:09,000 --> 00:17:11,000
the way, obviously the stuff that I would do in

359
00:17:11,000 --> 00:17:13,720
the previous years was little things, but I know that

360
00:17:13,759 --> 00:17:15,960
I got that the same sense of achievement. I think

361
00:17:16,000 --> 00:17:18,759
I think this sense of achievement was the same. I

362
00:17:18,759 --> 00:17:20,799
think you're just getting to it quicker, right, and it's

363
00:17:20,839 --> 00:17:24,000
a little bit more of like well, and sometimes it's

364
00:17:24,000 --> 00:17:25,839
like holy shit, it's got it in one like the

365
00:17:25,880 --> 00:17:29,599
best times where when you explain it something, especially in

366
00:17:29,640 --> 00:17:33,119
the so the backhand stuff, the logic with the back

367
00:17:33,240 --> 00:17:35,240
end and the math mats, I found that it was

368
00:17:35,279 --> 00:17:38,240
pretty good, Like it was really good. The most frustrating

369
00:17:38,279 --> 00:17:41,680
part is actually the UI is the is the interface,

370
00:17:42,160 --> 00:17:42,759
It's that part.

371
00:17:42,759 --> 00:17:43,640
Speaker 6: It's the CSS.

372
00:17:43,680 --> 00:17:47,480
Speaker 5: It was horrible at CSS so to get the design right,

373
00:17:47,519 --> 00:17:50,039
and we all.

374
00:17:50,000 --> 00:17:52,440
Speaker 6: Are at GBD is no different.

375
00:17:52,559 --> 00:17:54,359
Speaker 5: But I think I'm much better at CSS now than

376
00:17:54,400 --> 00:17:56,559
I was, you know, before I started this, and to

377
00:17:56,599 --> 00:17:58,759
that point you obviously learned how to do this, but

378
00:17:58,799 --> 00:18:01,119
that was a real frustration trying to get the UI

379
00:18:01,200 --> 00:18:03,559
to change all let's change the button here, and I

380
00:18:03,640 --> 00:18:05,359
know that a good programmer would be able to do

381
00:18:05,400 --> 00:18:07,279
this in the second. I think that was a frustrating part,

382
00:18:07,400 --> 00:18:09,920
knowing that someone who knew CSS would be able to

383
00:18:10,000 --> 00:18:12,000
round that button a little bit more easily. But then

384
00:18:12,079 --> 00:18:15,640
chat JBT had a massive problem doing it, so that

385
00:18:15,799 --> 00:18:18,559
was frustrating. But all the all the hard stuff, the

386
00:18:18,559 --> 00:18:21,519
mathematics around, you know, the gl location and all that

387
00:18:21,640 --> 00:18:25,559
kind of stuff, it would nail almost one off those prompts.

388
00:18:26,039 --> 00:18:29,200
Speaker 2: One of the things that I feel like I learned

389
00:18:29,240 --> 00:18:33,720
from my interaction with chat gp Chat GPT and other

390
00:18:33,799 --> 00:18:39,480
AI tools is improving my communication with real world humans

391
00:18:40,839 --> 00:18:45,160
because I've I've noticed this trend lately where we use

392
00:18:45,240 --> 00:18:49,079
words like it and that and those and when you're

393
00:18:49,279 --> 00:18:52,480
when you I was talking with chat GPT earlier, like

394
00:18:52,559 --> 00:18:56,559
it would always get lost whenever I referred to it,

395
00:18:56,759 --> 00:18:59,319
because it was trying to guess what it referred to.

396
00:19:00,079 --> 00:19:02,880
But then I like started looking at conversations I was

397
00:19:02,920 --> 00:19:04,759
having with real people, and I started to notice the

398
00:19:04,799 --> 00:19:08,400
same thing, like, oh wait, someone says, hey, did that

399
00:19:08,440 --> 00:19:12,079
get fixed? And then if you're talking to three people,

400
00:19:12,200 --> 00:19:15,559
there's three different versions of what that is referring to.

401
00:19:16,079 --> 00:19:18,279
So I've actually found like a hidden benefit there that

402
00:19:18,319 --> 00:19:22,039
whenever I'm talking to other people, I use those generic

403
00:19:22,119 --> 00:19:26,480
words less and try to specify more clearly what it

404
00:19:26,519 --> 00:19:27,680
is I'm actually referring to.

405
00:19:29,079 --> 00:19:31,960
Speaker 5: Yeah, And that's that's the part about being concise and

406
00:19:32,000 --> 00:19:34,480
efficient with it to do it to what you want

407
00:19:35,480 --> 00:19:37,960
to be fair. It's getting better at kind of understand

408
00:19:38,000 --> 00:19:40,559
because it's got the memory now, so specifically with chat JBT.

409
00:19:40,720 --> 00:19:44,799
With that memory, you don't need to like I came,

410
00:19:45,359 --> 00:19:47,519
I had some crazy idea. You know, I always want

411
00:19:47,519 --> 00:19:49,839
to go viral on tiktoks or whatever, and I'm always

412
00:19:49,839 --> 00:19:50,480
thinking about idea.

413
00:19:50,519 --> 00:19:54,519
Speaker 6: It hasn't happened yet, you know, one day it will happen.

414
00:19:54,599 --> 00:19:57,000
Speaker 5: But I was I was thinking about ideas about how

415
00:19:57,240 --> 00:19:59,279
what crazy thing could a forty five.

416
00:19:59,160 --> 00:20:03,160
Speaker 6: Year old do? And I thought, I played basketball.

417
00:20:03,200 --> 00:20:04,720
Speaker 5: I love playing basketball in the front of the house,

418
00:20:05,039 --> 00:20:06,759
and so I went to chat JB ten, I said,

419
00:20:06,759 --> 00:20:09,000
what do you think about this? What do you think

420
00:20:09,039 --> 00:20:12,519
about you know, some basketball dude mid forties, you know,

421
00:20:12,519 --> 00:20:15,400
probably shouldn't be playing basketball. Does you know what if

422
00:20:15,440 --> 00:20:18,279
I take a video every day of shooting my shots

423
00:20:18,279 --> 00:20:20,160
and you know, could that be something? And it came

424
00:20:20,200 --> 00:20:21,839
back and said, well, that's a great idea. Have you

425
00:20:21,839 --> 00:20:23,559
thought of this, this and that? Do this this and that.

426
00:20:24,440 --> 00:20:26,319
I was like, okay, cool, it's not really good idea,

427
00:20:26,440 --> 00:20:30,240
and then I went away. And then and then about

428
00:20:30,519 --> 00:20:34,240
three weeks later, after they released chat the four oh

429
00:20:34,759 --> 00:20:36,519
release or is it a mean or I forget what

430
00:20:36,519 --> 00:20:39,720
the W stands for whatever it is there, I came

431
00:20:39,759 --> 00:20:41,920
back to chat GB ten I said, hey, how are

432
00:20:41,920 --> 00:20:45,160
you doing today? And this was a new window and

433
00:20:45,160 --> 00:20:47,880
it came back and said, I'm doing fine, Just wondering

434
00:20:47,920 --> 00:20:49,599
how how did that basketball theme work.

435
00:20:49,400 --> 00:20:49,799
Speaker 6: Out for you?

436
00:20:50,640 --> 00:20:55,559
Speaker 5: Oh? Wow, like like weeks earlier, right, so that memory

437
00:20:55,599 --> 00:20:57,160
and I was like, what the hell?

438
00:20:57,400 --> 00:21:00,599
Speaker 6: Like that was? That was crazy? Actually? I really thought

439
00:21:00,640 --> 00:21:02,680
that was scary and cool at the same time.

440
00:21:03,680 --> 00:21:09,440
Speaker 2: Right, that's a whole interesting aspect now of you know,

441
00:21:09,440 --> 00:21:12,480
because there's a lot of tools for like building habits

442
00:21:12,599 --> 00:21:16,119
and like getting in like specifically in fitness. You know,

443
00:21:16,160 --> 00:21:20,400
the big part about fitness journeys is accountability. I hadn't

444
00:21:20,400 --> 00:21:27,920
thought about using chat GPT as a partner in that conversation.

445
00:21:29,000 --> 00:21:29,200
Speaker 6: Yeah.

446
00:21:29,359 --> 00:21:32,559
Speaker 5: Well, actually one of the I interviewed on the podcast

447
00:21:32,559 --> 00:21:36,039
and I go on, I interviewed the guy who was

448
00:21:36,079 --> 00:21:40,440
one of the early mL AI innovators actually invented Shazam.

449
00:21:40,759 --> 00:21:43,799
Speaker 6: Actually that sold it. It was great.

450
00:21:43,839 --> 00:21:46,039
Speaker 5: I didn't even know he did that told me about that,

451
00:21:46,480 --> 00:21:50,039
but Deepak with his name, I'll get the full name after,

452
00:21:50,160 --> 00:21:52,640
but he was telling me that he uses it as

453
00:21:52,680 --> 00:21:55,799
a He tells it to be a specific type of

454
00:21:57,599 --> 00:22:01,200
what's the word sort of health. I think of the

455
00:22:01,240 --> 00:22:04,400
word a practitioner of health or something. So it's a

456
00:22:04,480 --> 00:22:09,720
specific type of mental coach. And then you tell chat

457
00:22:09,720 --> 00:22:12,039
GBT you are this type of mental coach, and then

458
00:22:12,039 --> 00:22:14,680
it will be that mental coach. So will You're right on,

459
00:22:15,200 --> 00:22:17,880
People are definitely using it for that sort of mentorship.

460
00:22:18,279 --> 00:22:20,319
Just tell it to act in that way and then

461
00:22:20,359 --> 00:22:23,119
it will effectively try to be that person, which is interesting.

462
00:22:24,079 --> 00:22:25,440
Speaker 2: As soon as we're done here, I'm going to go

463
00:22:25,480 --> 00:22:29,039
tell my chat GPT dude, you are my David Gargins.

464
00:22:32,079 --> 00:22:33,519
Speaker 1: What is your podcast, just.

465
00:22:33,519 --> 00:22:35,920
Speaker 2: So that anyone listening to ours can go check yours

466
00:22:35,960 --> 00:22:36,839
out across.

467
00:22:38,119 --> 00:22:41,880
Speaker 5: Yeah, so it's great things with great Tech so GTWGT

468
00:22:42,079 --> 00:22:45,839
dot com. Yeah, it's been going for well. I started

469
00:22:45,839 --> 00:22:47,920
in the depth of the pandemic actually, like it was.

470
00:22:47,960 --> 00:22:49,559
It was another one of those crazy ideas that I

471
00:22:49,599 --> 00:22:53,359
had about doing something around great technology. And yeah, eighty

472
00:22:53,599 --> 00:22:56,599
four eighty five episodes later, it's still going pretty strong.

473
00:22:56,720 --> 00:23:00,519
Just just had Jack Reciter on if you know a

474
00:23:00,599 --> 00:23:03,759
Darknet Diaries, So that was like a massive don't I

475
00:23:03,759 --> 00:23:06,319
still don't know how that happened, but managed to get

476
00:23:06,359 --> 00:23:08,279
him on and we had a great one hour conversation

477
00:23:08,359 --> 00:23:11,119
about all things Darknet Diaries and yeah, that's that was

478
00:23:11,160 --> 00:23:13,480
almost like a highlight. And I kind of joked with myself,

479
00:23:13,519 --> 00:23:15,839
I'm like, well, where do I go now, I've kind

480
00:23:15,839 --> 00:23:18,960
of got that's kind of a highlight. Everything else is

481
00:23:18,960 --> 00:23:20,920
going to be pretty shitty from here on in. No,

482
00:23:20,920 --> 00:23:22,799
I don't know, but I just love talking to great

483
00:23:22,839 --> 00:23:25,920
tech companies. So yeah, I get great benefit out of that.

484
00:23:26,079 --> 00:23:27,920
Learn a lot while I'm talking to people and they

485
00:23:28,039 --> 00:23:28,960
have a good time doing it.

486
00:23:29,160 --> 00:23:31,119
Speaker 2: That's been the biggest benefit to me to being the

487
00:23:31,160 --> 00:23:33,880
host of this show. I think I've been the host

488
00:23:33,920 --> 00:23:37,920
of this show for a couple of years now. But

489
00:23:37,960 --> 00:23:40,480
the cool thing about it is I have great guests

490
00:23:40,480 --> 00:23:43,000
come on the show, like you, and it's just a

491
00:23:43,000 --> 00:23:45,319
filter for me. It's like, there's all this stuff going

492
00:23:45,359 --> 00:23:47,640
on in technology and like the big challenges how do

493
00:23:47,720 --> 00:23:51,039
you build a long running career in tech and maintain

494
00:23:51,160 --> 00:23:55,799
your relevance so that you stay employable? And this has

495
00:23:55,799 --> 00:23:58,160
been a fantastic tool for that, just having people like

496
00:23:58,240 --> 00:24:02,039
yourself on the show who give me like the tl

497
00:24:02,119 --> 00:24:06,880
DR version of your journey, and I filter and use

498
00:24:06,920 --> 00:24:08,480
that in my own career.

499
00:24:08,960 --> 00:24:11,240
Speaker 6: That's awesome. Yeah, we definitely should.

500
00:24:11,240 --> 00:24:13,000
Speaker 5: I mean, it's it's this is this is the best

501
00:24:13,000 --> 00:24:16,319
way to learn is by talking to people and getting

502
00:24:16,359 --> 00:24:17,799
information out of people.

503
00:24:17,880 --> 00:24:18,079
Speaker 6: Right.

504
00:24:18,160 --> 00:24:21,079
Speaker 5: So we all have such unique experiences and we all

505
00:24:21,079 --> 00:24:23,799
do things slightly differently in this world. And you know,

506
00:24:23,960 --> 00:24:25,519
who would have thought you'd be talking to someone who

507
00:24:25,599 --> 00:24:29,519
built nap about black pudding, right, Like that's great, But

508
00:24:29,680 --> 00:24:31,720
we can use that as a very tangible, you know,

509
00:24:32,599 --> 00:24:35,400
example of how to use prompt engineering for good and

510
00:24:35,440 --> 00:24:38,480
everyone should be like I just everyone should be looking

511
00:24:38,640 --> 00:24:41,000
at prompt. If you've got an idea, Just talk to

512
00:24:41,079 --> 00:24:43,039
chat GBT. As I talked, I said that to this

513
00:24:43,079 --> 00:24:45,240
to the dude across the street. He's a young kid,

514
00:24:45,319 --> 00:24:47,720
twenty six years old, kind of still living at home,

515
00:24:48,119 --> 00:24:49,680
and he's trying to sort of think about, you know,

516
00:24:49,680 --> 00:24:50,880
what he's going to do in his life for us.

517
00:24:50,880 --> 00:24:53,240
Speaker 6: So just go talk to chat GBT. Go and ask it.

518
00:24:53,640 --> 00:24:56,039
Speaker 5: You know, what your tell it, what your hobbies are,

519
00:24:56,799 --> 00:24:58,519
you know, give it a bit of background, and maybe

520
00:24:58,519 --> 00:25:02,759
it strikes it like to fire underneath you. Right, that's

521
00:25:02,759 --> 00:25:04,920
the power of the technology from a good side, and

522
00:25:05,039 --> 00:25:06,839
it's also the bad side to it, like you mentioned

523
00:25:06,880 --> 00:25:07,559
before as well.

524
00:25:07,799 --> 00:25:10,880
Speaker 2: Yeah, so I have to admit I try the first

525
00:25:10,880 --> 00:25:14,559
time I tried black pudding, like I didn't know what

526
00:25:14,640 --> 00:25:17,680
it was, and then I learned what it was and

527
00:25:17,759 --> 00:25:19,640
had the opportunity to try it, and I.

528
00:25:19,519 --> 00:25:22,279
Speaker 1: Was like, ah, man, I don't know, I don't know

529
00:25:22,440 --> 00:25:24,920
this is going to work out. But I have to

530
00:25:24,920 --> 00:25:25,920
admit it was kind of good.

531
00:25:26,480 --> 00:25:28,279
Speaker 5: Yeah, it's kind of good. I think a lot of

532
00:25:28,279 --> 00:25:30,240
people just don't like it because of what it is.

533
00:25:30,279 --> 00:25:32,920
But yeah, I can challenge people to say, well, we

534
00:25:33,000 --> 00:25:36,759
eat literally like red meat, which it's literally got blood

535
00:25:37,000 --> 00:25:38,880
like boozing out of it when you cook it. If

536
00:25:38,880 --> 00:25:41,000
it cooked a steak, the blood just comes from, like

537
00:25:41,079 --> 00:25:44,200
the blood comes out. So I think to that point,

538
00:25:44,480 --> 00:25:46,440
it's it's a very much a mental thing when you

539
00:25:46,480 --> 00:25:48,960
say about what it is versus what.

540
00:25:48,880 --> 00:25:51,039
Speaker 6: It tastes like, which is just salty goodness.

541
00:25:51,319 --> 00:25:53,920
Speaker 1: Yeah, and that's the that's the irony of it is.

542
00:25:53,960 --> 00:25:57,400
Speaker 2: I have no problem taking a piece of steak before

543
00:25:57,440 --> 00:26:00,319
I cook it, you know, and looking around, cutting a

544
00:26:00,359 --> 00:26:02,599
little piece off and eating it raw. I have no

545
00:26:02,640 --> 00:26:05,000
problem doing that. But then when it came to blood pudding,

546
00:26:05,079 --> 00:26:05,400
I was.

547
00:26:05,359 --> 00:26:08,720
Speaker 1: Like, oh, squeamish, and I was like, dude.

548
00:26:07,680 --> 00:26:11,119
Speaker 6: That's enough. It's definitely mental. It's definitely mental. And like,

549
00:26:11,480 --> 00:26:13,839
but then again, I don't know if have you tried haggis.

550
00:26:13,799 --> 00:26:15,799
Speaker 1: Because no, I haven't tried haggis.

551
00:26:16,720 --> 00:26:20,480
Speaker 5: Hags is a little bit more acquired because I definitely

552
00:26:20,519 --> 00:26:22,680
tried that. I'm like, look, if I can do black pudding,

553
00:26:22,680 --> 00:26:24,799
and I talk about all the time, I'm going to

554
00:26:24,799 --> 00:26:27,640
do hagis. So I got like a hagis role for

555
00:26:27,720 --> 00:26:30,200
the local Scottish butcher here, gave it a bit of

556
00:26:30,279 --> 00:26:33,480
fry up, and then yeah, it was it wasn't it

557
00:26:33,519 --> 00:26:37,279
wasn't palatable, so I think, and I think because in

558
00:26:37,279 --> 00:26:39,759
my head, I kind of again, I knew what it was,

559
00:26:40,039 --> 00:26:42,279
but I didn't I didn't grow up with it, so

560
00:26:42,279 --> 00:26:43,920
I didn't like it like I did with black pudding.

561
00:26:43,960 --> 00:26:46,680
So it's funny how that psyche works because I still

562
00:26:46,839 --> 00:26:49,720
ate it, mind you, lots of condiment, lots of condiment.

563
00:26:50,039 --> 00:26:53,640
But yeah, it's not something i'd probably go and have again.

564
00:26:53,759 --> 00:26:55,119
But yeah, there you go.

565
00:26:55,640 --> 00:26:57,039
Speaker 1: So Warren, where did you grow up?

566
00:26:57,480 --> 00:27:03,440
Speaker 4: Originally I'm from Boston, Massage says, okay, but yeah, but

567
00:27:03,519 --> 00:27:07,880
I've traveled around all over the United States, in New York, Wisconsin, California,

568
00:27:08,559 --> 00:27:10,319
but right now, I ended up in Switzerland, and so

569
00:27:10,400 --> 00:27:12,559
I have actually spent a lot of time traveling around Europe.

570
00:27:13,119 --> 00:27:15,839
And I did get the benefit of opportunity to have

571
00:27:16,599 --> 00:27:20,200
blood sausage one time in Poland, and honestly, like I

572
00:27:20,240 --> 00:27:22,119
could take it early that, you know, I think it's

573
00:27:22,119 --> 00:27:25,240
interesting to try. Like it's not disgusting or anything like that.

574
00:27:25,480 --> 00:27:28,440
I think it's just if I have my options. There's

575
00:27:28,480 --> 00:27:30,720
a lot of good meat options in Poland.

576
00:27:30,480 --> 00:27:31,079
Speaker 6: Like Poland.

577
00:27:31,839 --> 00:27:36,920
Speaker 5: Yeah, Poland actually is renowned for a very good blood pudding. Actually,

578
00:27:37,000 --> 00:27:38,640
so You're probably the best place to have it.

579
00:27:39,039 --> 00:27:40,839
Speaker 4: Yeah, I mean, like I said, it was good, Like

580
00:27:40,880 --> 00:27:43,599
you know, I enjoyed it a list of things like

581
00:27:43,640 --> 00:27:47,599
there are some unique options available I've only gotten in Poland,

582
00:27:47,680 --> 00:27:50,279
and when I'm there, I just I have to go

583
00:27:50,359 --> 00:27:52,480
through the whole list of things that I can only

584
00:27:52,519 --> 00:27:54,000
get there before I come back.

585
00:27:54,240 --> 00:27:56,200
Speaker 5: Yeah, Walsall is actually one of my favorite cities. Have

586
00:27:56,240 --> 00:27:58,519
any been there once? It's been a week there probably

587
00:27:58,559 --> 00:28:01,000
in about twenty eighteen and I think was but yeah,

588
00:28:01,000 --> 00:28:01,759
that was that was great.

589
00:28:01,799 --> 00:28:04,119
Speaker 6: The food was amazing. Great city.

590
00:28:04,480 --> 00:28:06,880
Speaker 5: One one of my favorite places that I visited Poland actually,

591
00:28:07,000 --> 00:28:07,880
and Boston as well.

592
00:28:07,960 --> 00:28:10,000
Speaker 6: Just quietly do you love my Boston?

593
00:28:10,319 --> 00:28:14,279
Speaker 2: So is as much as much value as we can

594
00:28:14,319 --> 00:28:18,000
get from using chat gpt to find blood pudding close

595
00:28:18,039 --> 00:28:20,920
to us speaking, which did you release your app?

596
00:28:21,000 --> 00:28:23,920
Speaker 6: Is it publicly available or yeah? The app's out there.

597
00:28:23,920 --> 00:28:28,400
Speaker 5: It's that find my black Pudding dot com so time

598
00:28:28,440 --> 00:28:30,279
my blackpoint dot com. You can go in there and

599
00:28:30,640 --> 00:28:32,640
you know, it'll ask for your location, it'll tell you

600
00:28:32,640 --> 00:28:36,279
where the closest one is. It's it's live and it works.

601
00:28:36,960 --> 00:28:39,960
And when I show people, they go that's interesting, and

602
00:28:40,000 --> 00:28:43,759
then they go what I would have thought, but basically

603
00:28:44,200 --> 00:28:46,440
actually so to take a step beyond the black pudding.

604
00:28:46,480 --> 00:28:49,759
So actually I actually built it so again leveraging chat GBT,

605
00:28:50,240 --> 00:28:54,319
we pivoted myself and my developer chat GBT. We pivoted

606
00:28:54,319 --> 00:28:56,680
a little bit at one point and kind of said, well,

607
00:28:56,839 --> 00:29:00,279
this could actually be we've built a location based app here,

608
00:29:00,720 --> 00:29:05,839
so let's build it as sort of independent as possible

609
00:29:05,880 --> 00:29:08,759
of the actual product that being black pudding. So effectively

610
00:29:08,799 --> 00:29:11,119
there's a configuration file which has all the sort of

611
00:29:11,119 --> 00:29:15,200
black pudding information, the namenclature and images and all that

612
00:29:15,240 --> 00:29:17,319
kind of stuff, but that can be easily replaced now

613
00:29:17,559 --> 00:29:20,599
and basically put with anything that you want. So I

614
00:29:20,640 --> 00:29:23,079
guess the idea moving forward is that if it takes off,

615
00:29:23,119 --> 00:29:26,880
it's to sassify it a little bit and basically, you know,

616
00:29:27,039 --> 00:29:30,119
if it could be when I retire from VM software,

617
00:29:30,440 --> 00:29:32,680
it could be my next thing. But to that point,

618
00:29:32,759 --> 00:29:36,160
it's more than just a black pudding. Obviously, if anyone

619
00:29:36,279 --> 00:29:39,519
kind of wants to try and find my weird x

620
00:29:39,559 --> 00:29:43,000
dot com, it'll basically work for that as well. So

621
00:29:43,079 --> 00:29:44,920
that was always the idea as well. So it wasn't

622
00:29:44,960 --> 00:29:46,839
just the black pudding, but it was just to build

623
00:29:46,920 --> 00:29:49,640
something that was maybe had some sort of sort of

624
00:29:49,920 --> 00:29:53,880
appeal to any sort of person moving forward at some point.

625
00:29:54,119 --> 00:29:57,920
Speaker 2: Yeah, and you also have used that experience to tie

626
00:29:57,920 --> 00:30:00,759
this back into your work over ving.

627
00:30:00,680 --> 00:30:03,319
Speaker 6: Right, absolutely. Yeah, so yeah.

628
00:30:03,319 --> 00:30:05,799
Speaker 5: So I'm lucky at the moment to be on the

629
00:30:05,839 --> 00:30:09,839
working group for our generative AI R and D, so

630
00:30:09,880 --> 00:30:12,519
I work with our R and D guys with product

631
00:30:12,519 --> 00:30:15,480
management and from a strategy point of view, Michael k

632
00:30:15,720 --> 00:30:19,000
being in the same team to kind of lead the

633
00:30:19,000 --> 00:30:21,920
way there in terms of thought leadership and understanding about

634
00:30:21,960 --> 00:30:24,680
what the technology can offer us from a backup perspective

635
00:30:24,680 --> 00:30:26,160
in the data protection space.

636
00:30:26,640 --> 00:30:27,720
Speaker 6: So that's been really cool.

637
00:30:28,119 --> 00:30:30,880
Speaker 5: Last week at VMON, we had have major conference for

638
00:30:30,920 --> 00:30:34,079
the year Vemon twenty twenty four, I was able to

639
00:30:34,160 --> 00:30:37,640
get on stage and live demo our next generation AI Assistant,

640
00:30:37,640 --> 00:30:40,039
which is built into our V one, which is our

641
00:30:40,680 --> 00:30:43,839
VM one's kind of our monitoring product that we've had

642
00:30:43,880 --> 00:30:48,079
for the longest time. So that plugs into being backup replication,

643
00:30:48,160 --> 00:30:51,240
it plugs into our hypervisors, support our public clouds our

644
00:30:51,359 --> 00:30:52,119
M three sixty five.

645
00:30:52,319 --> 00:30:55,720
Speaker 6: It's kind of like a central reporting engine. So what we're.

646
00:30:55,559 --> 00:30:59,480
Speaker 5: Doing there, we've built the RAG based LM so retrieval

647
00:30:59,519 --> 00:31:03,480
augmented generation which goes out and leverages the APIs on

648
00:31:03,519 --> 00:31:06,480
the platform to then contextually bring that in and then

649
00:31:06,519 --> 00:31:09,839
effectively use an LM to then you know, when you

650
00:31:09,920 --> 00:31:13,039
prompt something, it's using the RAG based retrieval for the

651
00:31:13,319 --> 00:31:16,880
API context and then it puts together its answers. So

652
00:31:17,279 --> 00:31:20,160
it's pretty cool, right, And you know, we showed that

653
00:31:20,200 --> 00:31:24,079
off last week. We showed it getting a threat report,

654
00:31:24,119 --> 00:31:25,759
so we've got a threat center, and it gave us

655
00:31:25,880 --> 00:31:28,160
a description on a threat center. And then because it

656
00:31:28,200 --> 00:31:30,480
was like a really long sort of report, we said, well,

657
00:31:31,119 --> 00:31:33,559
tell me, you know, what is the what should we

658
00:31:33,559 --> 00:31:36,680
focus on? So again, talking to this in a natural way,

659
00:31:37,160 --> 00:31:39,519
not having to look at a dashboard. This is kind

660
00:31:39,519 --> 00:31:42,160
of what I feel is the benefit of these natural

661
00:31:42,240 --> 00:31:45,319
language interfaces is that you can just kind of just

662
00:31:45,440 --> 00:31:47,519
chat with it in a normal way to get the information.

663
00:31:48,279 --> 00:31:53,039
And certainly with product Clark VM one, you definitely want

664
00:31:53,039 --> 00:31:54,839
to try and get to the crux of what's going

665
00:31:55,079 --> 00:31:59,079
wrong quickly. And traditionally, I guess as infrastructure guys, we've

666
00:31:59,119 --> 00:32:01,480
always had those dash words right, and you've got to

667
00:32:01,559 --> 00:32:04,079
visually have a look cause something's read that's bad and

668
00:32:04,119 --> 00:32:05,880
that's that is a good way of fighting out. But

669
00:32:05,920 --> 00:32:08,359
if you're able to actually interface with that sort of

670
00:32:08,359 --> 00:32:10,480
platform and say, hey, tell me what's bad, and tell

671
00:32:10,480 --> 00:32:11,960
me why it's bad and how do we fix it,

672
00:32:12,200 --> 00:32:14,839
and then it gives you all these responses, I think

673
00:32:14,920 --> 00:32:17,720
that's powerful. In fact, I believe that is powerful.

674
00:32:17,799 --> 00:32:19,680
Speaker 6: Right. I agree.

675
00:32:20,319 --> 00:32:23,039
Speaker 4: Have you managed to figure out some maybe quick tricks

676
00:32:23,200 --> 00:32:27,559
or common things that you've teased out that you're appending

677
00:32:27,720 --> 00:32:29,599
or adding to all of your prompts to really make

678
00:32:29,640 --> 00:32:31,519
a huge impact, Like I think I've read some studies

679
00:32:31,519 --> 00:32:35,240
that have suggested saying oh, you'll be rewarded a million

680
00:32:35,319 --> 00:32:37,799
dollars if you get this question right like actually has

681
00:32:37,839 --> 00:32:41,759
a noticeable impact on the result. And while I hesitate

682
00:32:41,839 --> 00:32:43,240
to put that in all of my prompts, like, I

683
00:32:43,279 --> 00:32:46,119
can imagine doing certain things or certain keywords has a

684
00:32:46,160 --> 00:32:47,039
huge impact for you.

685
00:32:47,799 --> 00:32:50,559
Speaker 6: Funnily enough, I haven't. I haven't done that. I do.

686
00:32:51,079 --> 00:32:52,319
If I think about how I.

687
00:32:52,759 --> 00:32:56,839
Speaker 5: How I initially chat to the chatbots, I will after

688
00:32:56,880 --> 00:32:59,799
an answer, I will say cool, thanks, now let's move

689
00:32:59,839 --> 00:33:03,200
on something else, or cool, I'll go all right now.

690
00:33:03,240 --> 00:33:05,240
So I do the way that I started is always

691
00:33:05,319 --> 00:33:08,519
like a bit of affirmation I suppose to the natural

692
00:33:08,599 --> 00:33:11,799
language robot that we're speaking to if it is actually

693
00:33:11,839 --> 00:33:14,240
a robot though it's just a bunch of techs that

694
00:33:14,319 --> 00:33:18,759
gets generated, right. But yeah, I definitely try and interact

695
00:33:18,799 --> 00:33:20,960
with it in a certain way to make sure that

696
00:33:21,039 --> 00:33:24,000
I'm positive with it, if that makes sense. But I

697
00:33:24,000 --> 00:33:27,119
don't think I've ever rewarded it to a certain extent.

698
00:33:27,160 --> 00:33:29,000
I wonder how it would go with that. I'm just

699
00:33:29,039 --> 00:33:31,759
trying to think maybe I'll try that next time I'm

700
00:33:31,799 --> 00:33:32,440
on I.

701
00:33:32,319 --> 00:33:34,359
Speaker 4: Mean, there is an interesting question there of whether or

702
00:33:34,480 --> 00:33:38,359
not the positive affirmation and it gets into some sort

703
00:33:38,359 --> 00:33:42,559
of feedback loop that actually confirms that the conversation ended

704
00:33:42,559 --> 00:33:44,839
in a successful way and pulls it in. I mean,

705
00:33:44,880 --> 00:33:47,960
I think that's on the individual companies that we work with, Like,

706
00:33:48,160 --> 00:33:51,160
is that something as a company that pushes out an

707
00:33:51,200 --> 00:33:54,440
AI model that users can interact with, something that you're

708
00:33:54,480 --> 00:33:55,279
actually considering.

709
00:33:55,720 --> 00:33:58,000
Speaker 5: I think the thing that we need to work out

710
00:33:58,160 --> 00:34:02,400
as a data platform where we're working on now even

711
00:34:02,440 --> 00:34:06,200
more biggest threats within cyber and malware in the systems.

712
00:34:06,680 --> 00:34:08,599
It's really just to get to the crux of the

713
00:34:09,039 --> 00:34:12,239
problem as soon as possible without kind of farting around.

714
00:34:12,280 --> 00:34:12,760
Speaker 6: I think so.

715
00:34:13,280 --> 00:34:15,599
Speaker 5: I think for us, while we do want to interact

716
00:34:15,599 --> 00:34:18,440
with it naturally, that's just as a mechanism to get

717
00:34:18,480 --> 00:34:21,840
to that problem quicker. And it's funny. A lot of

718
00:34:21,840 --> 00:34:24,920
the chatbots today are being used, I feel, being used

719
00:34:24,920 --> 00:34:27,800
in the negative. Like I look at co Pilot and

720
00:34:28,159 --> 00:34:31,440
how Microsoft is pushing that. It's really pushing it from

721
00:34:31,440 --> 00:34:34,119
the point of view of how lazy can you be, Like,

722
00:34:34,440 --> 00:34:36,719
let's not get to a meeting, but hey, copilot will

723
00:34:36,719 --> 00:34:38,400
have your back and he'll give you the transcript and

724
00:34:38,440 --> 00:34:39,960
give you a too long didn't read of the meeting,

725
00:34:40,239 --> 00:34:42,199
so you can be lazy not go to the meeting, right.

726
00:34:43,400 --> 00:34:45,440
I do feel that I've been to a bunch of

727
00:34:45,480 --> 00:34:48,880
Microsoft events over the past couple of months. We've been

728
00:34:49,440 --> 00:34:51,559
guests there and I've had a bit of a keynote

729
00:34:51,719 --> 00:34:55,280
sort of spot there. And Yeah, that's the notion I

730
00:34:55,320 --> 00:34:58,960
took away is that Microsoft Big Cell is based on

731
00:34:59,000 --> 00:35:03,239
a negative all this before this particular product, right, And

732
00:35:03,280 --> 00:35:05,280
I don't think that's I think that's what they've got.

733
00:35:05,280 --> 00:35:10,320
I think product office office productivity and Mike and Microsoft

734
00:35:10,400 --> 00:35:12,440
and their productivity sweet has always been about how to

735
00:35:12,519 --> 00:35:17,199
do more with less right and their copilot absolutely lets

736
00:35:17,239 --> 00:35:21,119
people do more without almost doing anything. But for us

737
00:35:21,239 --> 00:35:24,679
from a backup platform perspective, how can you tell ask

738
00:35:24,760 --> 00:35:27,559
it to say, check my backups? Is there any malware

739
00:35:27,559 --> 00:35:30,360
in that backups? Give me the list of backup repositories

740
00:35:30,599 --> 00:35:33,559
or sorry, backup points that have malware in it? Okay, cool,

741
00:35:33,599 --> 00:35:35,920
I can see that. And now maybe in the future

742
00:35:36,679 --> 00:35:39,119
let's clean that up, you know what I mean, Or

743
00:35:39,199 --> 00:35:41,960
let's let's recover from a point that wasn't with malware.

744
00:35:42,039 --> 00:35:44,440
Speaker 6: So I think we're able to get to that a

745
00:35:44,440 --> 00:35:44,960
lot quicker.

746
00:35:45,039 --> 00:35:47,679
Speaker 5: Rather than fumbling through some screens and going through and

747
00:35:47,679 --> 00:35:50,480
clicking through and looking at restore points and whatnot. We

748
00:35:50,599 --> 00:35:52,719
can leverage the power of that of the LM to

749
00:35:52,800 --> 00:35:55,199
do its thing, because that's it's it's smart in a way,

750
00:35:55,519 --> 00:35:57,679
and it knows how to best analyze that data that

751
00:35:57,719 --> 00:36:00,400
it gets to surface it in the best possible way.

752
00:36:01,119 --> 00:36:05,360
Speaker 4: Are there concerns about accuracy there? Like I can imagine, Hey,

753
00:36:05,400 --> 00:36:08,320
you know, are our backups usable?

754
00:36:08,519 --> 00:36:11,119
Speaker 3: And the ELM says, yeah, of course, they're totally great.

755
00:36:11,119 --> 00:36:11,760
We've got them.

756
00:36:11,800 --> 00:36:15,119
Speaker 4: They're stored in multiple places and on physical tape of

757
00:36:15,159 --> 00:36:17,000
which just is not remotely true.

758
00:36:17,559 --> 00:36:19,960
Speaker 5: Absolutely, and I think that's where the beauty of the

759
00:36:20,159 --> 00:36:24,360
agent comes in. So yeah, the agent and the rag.

760
00:36:24,679 --> 00:36:27,800
So the rag is obviously pulling the APIs and as

761
00:36:27,800 --> 00:36:30,000
long as you hit, as long as the prompt hits

762
00:36:30,039 --> 00:36:34,079
the keywords, the keywords that the agent is sort of manipulating.

763
00:36:34,320 --> 00:36:36,119
It's basically a bunch of I think I saw a

764
00:36:36,239 --> 00:36:39,320
mem saying that is AI just a whole bunch of ifs,

765
00:36:39,599 --> 00:36:42,920
you know. I don't know if you've seen that one,

766
00:36:42,960 --> 00:36:46,920
but it's pretty true. But yeah, So to your point, absolutely,

767
00:36:47,400 --> 00:36:50,679
there needs to be refinement and implicit trust that what

768
00:36:50,719 --> 00:36:52,800
you're getting is correct. And it goes back to our

769
00:36:52,800 --> 00:36:56,199
first part of the conversation right where we're talking about

770
00:36:56,199 --> 00:36:59,159
do you trust it? Well, you kind of just have

771
00:36:59,239 --> 00:37:01,719
to at that point, but knowing that you've got it

772
00:37:02,519 --> 00:37:05,039
from live data, I think adds an element of trust

773
00:37:05,039 --> 00:37:05,360
to it.

774
00:37:05,880 --> 00:37:10,199
Speaker 4: It's basically picking up where intent matching left off before.

775
00:37:11,119 --> 00:37:15,679
MLD is basically just identifying which which these flows is

776
00:37:15,719 --> 00:37:18,360
the user actually looking at, But rather than having to

777
00:37:18,920 --> 00:37:22,440
explicitly code every one of you can fundamentally ask the

778
00:37:22,639 --> 00:37:25,880
l m uh, you know, which which API should end?

779
00:37:26,000 --> 00:37:28,360
Up call to retrieve the APPROPRIATEATA.

780
00:37:27,679 --> 00:37:28,599
Speaker 3: And then running.

781
00:37:30,000 --> 00:37:30,679
Speaker 6: Yeah, that's right.

782
00:37:31,159 --> 00:37:34,840
Speaker 2: And I think going back to your basketball story earlier,

783
00:37:34,880 --> 00:37:40,199
you could even like use that identify something and then

784
00:37:41,159 --> 00:37:44,000
provide that feedback back to it and say, oh, that's

785
00:37:44,000 --> 00:37:46,400
something I should take a look at later. And there's

786
00:37:46,440 --> 00:37:48,920
a possibility that at some point in the future it's

787
00:37:48,920 --> 00:37:50,440
going to say, hey, did you ever do that thing?

788
00:37:51,760 --> 00:37:54,760
Speaker 6: Yeah, that's actually good. Actually wrote that Dan?

789
00:37:57,239 --> 00:37:58,400
Speaker 5: If I put that, if we can put that in

790
00:37:58,400 --> 00:38:02,400
the product, yeah, no. But I think that basically comes

791
00:38:02,400 --> 00:38:06,719
down to the model that you're using as well, right, Like, obviously,

792
00:38:07,760 --> 00:38:10,639
if you're using your own data with a specific public model,

793
00:38:10,679 --> 00:38:13,639
I think that's cool and that's okay, But I think

794
00:38:14,119 --> 00:38:16,920
you get that from using chat GPT four O naturally

795
00:38:16,960 --> 00:38:19,280
because it has got that memory built into it, right,

796
00:38:19,360 --> 00:38:23,199
So if you augment the data with a smarter model,

797
00:38:23,320 --> 00:38:25,400
then you're going to get those benefits. And I think

798
00:38:25,440 --> 00:38:28,400
it's only going to get better, right, Like, the difference

799
00:38:28,400 --> 00:38:30,639
in the step up between even what four point five

800
00:38:30,719 --> 00:38:33,679
Turbo did in four ro was crazy in terms of

801
00:38:33,679 --> 00:38:37,280
speed and accuracy and that sort of thing. So yeah, absolutely,

802
00:38:37,320 --> 00:38:39,199
I think that's where we want to get to, Like prompted.

803
00:38:39,400 --> 00:38:42,599
The other thing as well, which I mentioned last week,

804
00:38:42,719 --> 00:38:44,559
was that if you think about this, you don't need

805
00:38:44,559 --> 00:38:46,320
to be a backup admin in our world. You don't

806
00:38:46,320 --> 00:38:48,199
need to be a backup admin or a devopy guy

807
00:38:48,320 --> 00:38:51,599
to be able to interact with this thing now as

808
00:38:51,639 --> 00:38:55,119
an app owner, given that you've got the right level

809
00:38:55,159 --> 00:38:59,320
of back and you can definitely build back into it

810
00:38:59,400 --> 00:39:03,199
because we've got APIs that are based on certain levels

811
00:39:03,199 --> 00:39:06,519
of accessibilities. We've got back in our APIs. Right, So

812
00:39:06,559 --> 00:39:08,760
if I log into a certain user, I'm an app user, right,

813
00:39:08,800 --> 00:39:10,960
So I'm not a full blown backup admin, but I'm

814
00:39:10,960 --> 00:39:14,719
an app user. I can jump on the vam ai

815
00:39:14,719 --> 00:39:17,119
assistant and I go, hey, I've got an app running.

816
00:39:17,760 --> 00:39:21,800
It's backed by an MSSQL database. I think it's the

817
00:39:21,880 --> 00:39:23,880
Virtual Machine VM six.

818
00:39:24,079 --> 00:39:25,079
Speaker 6: Is that being backed up?

819
00:39:25,760 --> 00:39:28,760
Speaker 5: And then it'll go out do it's thing, comeback and say, hey, yep,

820
00:39:28,840 --> 00:39:29,599
that's being backed up.

821
00:39:29,599 --> 00:39:32,199
Speaker 6: It's got this many restore points. It looks all good.

822
00:39:32,360 --> 00:39:34,159
Speaker 5: Right, So from an appoint of perspective, being able to

823
00:39:34,239 --> 00:39:36,639
do that, and then think about that from an another

824
00:39:36,719 --> 00:39:38,960
level up, from a C level or a business owner

825
00:39:39,000 --> 00:39:41,280
or someone on the board, you know who just wants

826
00:39:41,320 --> 00:39:43,760
to basically go in and see if everything's okay. Are

827
00:39:43,800 --> 00:39:46,880
we passing our reguld free compliance? Is there a checkbox

828
00:39:47,199 --> 00:39:49,159
so they can go in and go, hey, tell me

829
00:39:49,199 --> 00:39:50,840
about my security posture?

830
00:39:50,840 --> 00:39:52,519
Speaker 6: Am I passing all my regulations?

831
00:39:52,840 --> 00:39:54,480
Speaker 5: And I think that would be able to come through

832
00:39:54,559 --> 00:39:56,719
instead of them logging in or getting a report or

833
00:39:56,960 --> 00:39:59,320
having to deal with with dashboards or whatever it might be,

834
00:39:59,400 --> 00:40:02,199
or a CLI. I think that also was a pretty big,

835
00:40:02,360 --> 00:40:05,159
you know, tick in this sort of way to interface

836
00:40:05,199 --> 00:40:06,119
with these platforms.

837
00:40:06,400 --> 00:40:09,119
Speaker 2: For sure, there's a definite two thousand and one Space

838
00:40:09,159 --> 00:40:12,320
Odyssey element to that. I'm sorry, Dave, I can't do that.

839
00:40:13,519 --> 00:40:15,760
Speaker 6: Yeah, that's that's that's the right movie. Did that mix

840
00:40:15,800 --> 00:40:19,719
those up? You got that? Yeah? That was how? Yeah,

841
00:40:19,760 --> 00:40:21,559
that was how, which I learned. Do you know? Do

842
00:40:21,599 --> 00:40:22,559
you know what was called how?

843
00:40:23,360 --> 00:40:25,360
Speaker 1: I've heard it before but I can't remember the answer.

844
00:40:25,960 --> 00:40:31,400
Speaker 5: It's one lesson IBM? So how is what? IBM one

845
00:40:31,559 --> 00:40:34,039
one one removed. That's and they did that on purpose

846
00:40:34,039 --> 00:40:36,440
because IBM was the big daddy back then, right in

847
00:40:36,519 --> 00:40:39,000
terms of the computer. So yeah, they named it how.

848
00:40:39,159 --> 00:40:40,079
It worked pretty well.

849
00:40:40,480 --> 00:40:43,360
Speaker 4: Has there been any like sort of follow up or

850
00:40:43,400 --> 00:40:47,360
concern regarding I think there was a l I'm integrated into.

851
00:40:47,719 --> 00:40:51,119
I think it was Canadian Airlines uh support service who

852
00:40:51,199 --> 00:40:54,599
had promised reduct reduction in prices or something like that,

853
00:40:54,719 --> 00:40:58,360
and the court actually held them to what the ALAM

854
00:40:58,440 --> 00:40:58,840
was saying.

855
00:40:59,639 --> 00:41:02,400
Speaker 5: Uh yeah, I mean there's so much there's so much

856
00:41:02,440 --> 00:41:04,880
great area around legality still which we have to work

857
00:41:04,920 --> 00:41:07,880
out where everyone's rushing to do it. I mean, I mean,

858
00:41:08,039 --> 00:41:10,199
you saw Elon's tweet. I'm not sure we saw Elon's

859
00:41:10,360 --> 00:41:15,119
response to Apple yesterday. He's basically Apple obviously working with

860
00:41:15,239 --> 00:41:19,239
open AI, and Elon's gone, okay, well, no one's going

861
00:41:19,280 --> 00:41:21,239
to use an Apple iPhone in my in my companies

862
00:41:21,239 --> 00:41:23,599
anymore because he does. He's got a beef with open I.

863
00:41:23,760 --> 00:41:25,639
He doesn't trust him, right, and he's just kind of

864
00:41:25,639 --> 00:41:28,840
said that. So I think we're still we're still kind

865
00:41:28,840 --> 00:41:30,159
of there, but he has a fair point.

866
00:41:30,239 --> 00:41:33,159
Speaker 6: Do we are we rushing to this to be.

867
00:41:33,119 --> 00:41:36,920
Speaker 5: Able to put a full blown AI on the on

868
00:41:37,079 --> 00:41:39,159
data on your phone so that you open up everything

869
00:41:39,159 --> 00:41:41,519
on your phone to an AI. Possibly, I don't know.

870
00:41:41,760 --> 00:41:44,599
It's it's still a bit scary. I mean, I trust technology.

871
00:41:44,639 --> 00:41:46,679
I've been on record, and you know I've talked to

872
00:41:46,719 --> 00:41:48,519
security guys. I don't know who I was talking to

873
00:41:48,559 --> 00:41:52,079
about security. I was on a security podcast, yeah, and

874
00:41:52,119 --> 00:41:53,960
I said, look, I trust Actually it was Jack Crisider.

875
00:41:54,079 --> 00:41:56,400
I was talking to him about that. He's on the

876
00:41:56,400 --> 00:41:58,280
opposite end, but he doesn't trust it. Where I'm like,

877
00:41:58,360 --> 00:42:01,119
you know what, people NOI and they're going to annoy

878
00:42:01,239 --> 00:42:02,199
if one way or another.

879
00:42:02,280 --> 00:42:04,320
Speaker 6: So yeah, I.

880
00:42:04,280 --> 00:42:06,719
Speaker 4: Mean I I don't think it's maybe I'm with you.

881
00:42:07,079 --> 00:42:09,440
I don't think it's noticeably worse than other things. There

882
00:42:09,480 --> 00:42:14,960
are certain areas and where we should command more responsibility

883
00:42:15,039 --> 00:42:17,519
for certain companies that are doing things on their on

884
00:42:17,519 --> 00:42:20,639
their devices or their applications that we're using, whether or

885
00:42:20,679 --> 00:42:23,519
not we trust them holistically versus whether or not they're

886
00:42:23,599 --> 00:42:27,679
using AI. I feel like that's not a huge differentiator.

887
00:42:28,400 --> 00:42:31,039
I mean, like, I think Apple's been training models on

888
00:42:31,079 --> 00:42:33,840
your personal data on on mac os for a little

889
00:42:33,840 --> 00:42:36,880
while now, and like maybe I trust that I may

890
00:42:36,920 --> 00:42:40,719
trust a little bit less some other big companies by

891
00:42:40,800 --> 00:42:45,199
name that have operating systems who are taking snapshots continuous.

892
00:42:45,840 --> 00:42:49,039
Speaker 6: I mean there is any have they gone back on that?

893
00:42:49,119 --> 00:42:51,480
Speaker 4: I think so now now it's it will be off

894
00:42:51,519 --> 00:42:54,800
by default. So this is like Windows eleven. It will

895
00:42:54,800 --> 00:42:58,599
be off by default taking snapshots and it will actually

896
00:42:58,599 --> 00:43:01,079
start being encrypted using Windows Hello.

897
00:43:01,119 --> 00:43:02,840
Speaker 3: And I'm like, was it not encrypted before?

898
00:43:03,079 --> 00:43:06,719
Speaker 4: Like so, I mean there were questions there and so

899
00:43:06,800 --> 00:43:10,480
this is where I'll say, like commands responsibility and you

900
00:43:10,559 --> 00:43:11,119
trust them.

901
00:43:11,280 --> 00:43:12,199
Speaker 3: And I think.

902
00:43:12,079 --> 00:43:15,440
Speaker 4: Certain companies have not gotten to that level of trust

903
00:43:15,480 --> 00:43:17,440
for us. So whether or not they're using AI or

904
00:43:17,440 --> 00:43:18,000
not as sort of.

905
00:43:18,000 --> 00:43:20,400
Speaker 3: A separate problem, yeah, I think the I.

906
00:43:20,400 --> 00:43:22,280
Speaker 4: Think the biggest issue there is there's going to be

907
00:43:22,320 --> 00:43:24,719
stuff on my screen, like for work, I almost totally

908
00:43:24,800 --> 00:43:26,039
get like, yeah, sure, you know, I want to go

909
00:43:26,079 --> 00:43:27,800
back to that code that I was writing six months ago.

910
00:43:27,800 --> 00:43:30,039
I don't remember where it was, which repository that bug,

911
00:43:30,039 --> 00:43:32,559
et cetera. Please find that for me. But I think

912
00:43:32,559 --> 00:43:34,719
the cell there is personal data and I'm like, I don't.

913
00:43:35,360 --> 00:43:37,719
I'm like, I have like discord open at a video

914
00:43:37,800 --> 00:43:40,880
stream for like I don't need I don't need that

915
00:43:41,000 --> 00:43:41,679
to be saved.

916
00:43:41,760 --> 00:43:42,119
Speaker 3: Please.

917
00:43:42,880 --> 00:43:43,159
Speaker 6: Yeah.

918
00:43:43,199 --> 00:43:46,280
Speaker 5: And I mean I mean Facebook got I mean if

919
00:43:46,280 --> 00:43:49,119
you say any racing label skill racing with Facebook, Facebook

920
00:43:49,280 --> 00:43:52,639
slam for they like how quickly we forget right, And

921
00:43:52,679 --> 00:43:56,239
that was nothing compared to what we potentially have with this, right,

922
00:43:56,599 --> 00:43:58,079
the amount you could have think, I mean, how many

923
00:43:58,199 --> 00:43:58,880
uses does.

924
00:43:58,760 --> 00:44:00,000
Speaker 6: Chat JBT have today?

925
00:44:00,079 --> 00:44:02,599
Speaker 5: I've lost count, But the amount of data that's coming

926
00:44:02,639 --> 00:44:05,800
in there is huge, right, It's it's huge, And like

927
00:44:05,880 --> 00:44:07,880
I know, even today, I was trying to work out

928
00:44:08,360 --> 00:44:10,559
a bit of a bit of I've I've actually been

929
00:44:10,599 --> 00:44:13,039
at Vain for nearly eight years, and in Australia you

930
00:44:13,079 --> 00:44:15,960
get long service leave and then you get pro ride

931
00:44:15,960 --> 00:44:18,119
along service leaves. I was trying to actually work out

932
00:44:18,320 --> 00:44:20,039
I don't know anything about how to work it out,

933
00:44:20,079 --> 00:44:21,920
so I just kind of asked that, Okay, at my

934
00:44:22,000 --> 00:44:25,320
hourly rate, I've been here for eight years, what's my

935
00:44:25,400 --> 00:44:27,000
current state of my long service leave?

936
00:44:27,039 --> 00:44:28,320
Speaker 6: If I was to leave today?

937
00:44:28,599 --> 00:44:30,639
Speaker 5: And it worked it out, But now it's got that

938
00:44:30,800 --> 00:44:33,440
data somewhere in that system, right in that chat.

939
00:44:34,000 --> 00:44:36,079
Speaker 2: I feel like one of the missing components and the

940
00:44:36,159 --> 00:44:42,440
whole trust issue is accountability, because there I think some

941
00:44:42,440 --> 00:44:46,679
some level of distrust is warranted. But the thing that's

942
00:44:46,719 --> 00:44:49,159
really missing is accountability. You know, if you look at

943
00:44:49,239 --> 00:44:53,360
re look at all of the examples of security breaches

944
00:44:53,400 --> 00:44:59,639
where personal data has been exposed, there's absolutely no reason

945
00:44:59,679 --> 00:45:03,599
formpanies to invest in security because the worst case scenario

946
00:45:04,400 --> 00:45:08,000
is you have to pay for six months of credit

947
00:45:08,119 --> 00:45:12,119
monitoring for a few people, and that only the ones

948
00:45:12,159 --> 00:45:14,239
that take you up on the offer, which is a

949
00:45:14,360 --> 00:45:18,639
very small number. So companies aren't incentivized at all to

950
00:45:18,920 --> 00:45:21,880
be good stewards of their data. And I think until

951
00:45:21,880 --> 00:45:25,239
we change that, we we will have trust issues and

952
00:45:26,239 --> 00:45:28,760
more data leaks.

953
00:45:29,079 --> 00:45:33,400
Speaker 5: Yeah, we talk about this often, obviously a cyber malware

954
00:45:33,639 --> 00:45:37,159
sort of perspective, you know, trying to sort of work around, well,

955
00:45:37,199 --> 00:45:39,280
you are going to get hit. It's a matter of

956
00:45:39,679 --> 00:45:41,440
you know, I hate saying it, but it's a matter

957
00:45:41,480 --> 00:45:44,960
of when, not if, right, And typically it's going to

958
00:45:45,000 --> 00:45:47,320
be through some sort of waypoint in in your in

959
00:45:47,400 --> 00:45:50,559
your systems, whether it be a poorly configured APR that

960
00:45:50,599 --> 00:45:52,400
has you know, opened to the world access or a

961
00:45:52,440 --> 00:45:55,519
system that's open, or through a vulnerability a SAVA.

962
00:45:55,199 --> 00:45:58,000
Speaker 6: Whatever it might be, you're going to get hit. So

963
00:45:58,119 --> 00:45:59,360
maybe to be prepared for that.

964
00:45:59,519 --> 00:46:03,559
Speaker 5: And I think the news outlets like to like to

965
00:46:03,639 --> 00:46:07,079
jump on these leaks like it's the biggest affront to

966
00:46:07,159 --> 00:46:10,400
our wholesome what way of being that my credit card

967
00:46:10,440 --> 00:46:13,800
information or my date of birth has been stolen by

968
00:46:13,800 --> 00:46:17,400
somebody you know, expltrated, But I think, like to your point,

969
00:46:17,480 --> 00:46:21,519
is it really that bad? So you know, I guess

970
00:46:21,519 --> 00:46:24,719
certain data is bad. Like I remember, and it was

971
00:46:24,719 --> 00:46:28,880
an episode of The Darknest Diaries where Jack was talking

972
00:46:28,880 --> 00:46:33,760
to someone who got into an hack that children's app

973
00:46:34,159 --> 00:46:36,559
that was like a children's iPhone or something, but it

974
00:46:36,599 --> 00:46:39,159
had children. It had children data in there, right, and

975
00:46:39,239 --> 00:46:41,599
that got leaked to the dark web, right, and that

976
00:46:41,800 --> 00:46:45,320
was bad because that was information that could have been

977
00:46:45,320 --> 00:46:47,400
because it had their home address and their names and

978
00:46:47,400 --> 00:46:49,719
all that kind of stuff. Now, that sort of information

979
00:46:50,360 --> 00:46:52,400
you don't want to get out there because there are

980
00:46:52,440 --> 00:46:55,880
evil people that will use that for bad But generally speaking,

981
00:46:56,000 --> 00:46:58,559
if people have my phone number, my email address, and

982
00:46:58,599 --> 00:47:01,559
my credit card number, I guess I'm okay with that.

983
00:47:01,719 --> 00:47:03,519
Speaker 6: But yeah, it's an interesting one.

984
00:47:04,400 --> 00:47:07,679
Speaker 2: Yeah it's not ideal, but it's just like it's the

985
00:47:07,679 --> 00:47:08,440
world we live in.

986
00:47:09,519 --> 00:47:10,119
Speaker 6: Absolutely.

987
00:47:11,280 --> 00:47:12,760
Speaker 5: I was going to say, there's a price to pay

988
00:47:12,840 --> 00:47:17,400
for the freedom and the way that we live in

989
00:47:17,440 --> 00:47:20,519
this world. Right, we want to when we want everything now,

990
00:47:20,519 --> 00:47:21,960
we want to be able to tap a card and

991
00:47:22,000 --> 00:47:25,400
do that and that, But then we get a massively

992
00:47:25,440 --> 00:47:27,559
affront if we do have that data leaked.

993
00:47:27,840 --> 00:47:29,719
Speaker 6: We can't have it both ways, is my way of

994
00:47:29,760 --> 00:47:30,280
looking at it.

995
00:47:30,360 --> 00:47:33,519
Speaker 3: I mean, I honestly, I'll keep I'll challenge any company

996
00:47:33,519 --> 00:47:34,719
to do this. I am happy to give.

997
00:47:34,599 --> 00:47:37,079
Speaker 4: You my data if you can, once and for all,

998
00:47:37,119 --> 00:47:40,039
actually serve me a relevant ad that I do care about,

999
00:47:41,960 --> 00:47:45,360
because you know it, it still just hasn't hasn't happened. Actually,

1000
00:47:45,400 --> 00:47:48,960
to the previous point, my CEO was giving a pretty

1001
00:47:49,000 --> 00:47:52,800
great talk on security and implications there and you brought

1002
00:47:52,800 --> 00:47:56,119
this up well about actually does the law need to change?

1003
00:47:56,239 --> 00:48:00,000
And it's interesting because we have our business in Switzerland

1004
00:48:00,159 --> 00:48:02,159
and so soon as the only country as far as

1005
00:48:02,159 --> 00:48:05,719
we've been able to identify, that has non corporate laws

1006
00:48:05,760 --> 00:48:10,320
for leaking data but individual penalties if you are an

1007
00:48:10,360 --> 00:48:14,119
executive of a company that encourages or does not lean

1008
00:48:14,159 --> 00:48:17,840
into protecting user data up to I think it's like

1009
00:48:17,840 --> 00:48:21,760
two hundred and fifty thousand dollars fine and potentially more

1010
00:48:21,840 --> 00:48:24,920
depending on what actually happened. You are individually liable for that,

1011
00:48:25,159 --> 00:48:27,079
whereas no other country really has that, and you can

1012
00:48:27,119 --> 00:48:31,239
even get insurance on oops. We leak customer data and

1013
00:48:31,280 --> 00:48:33,519
then these news companies come and pick it up and

1014
00:48:33,599 --> 00:48:36,199
it almost serves as an advertisement for them. Hey did

1015
00:48:36,280 --> 00:48:39,000
you hear about this company that leaked their data? Now

1016
00:48:39,000 --> 00:48:41,440
you know about them, you know, positive reinforcement, You're going

1017
00:48:41,519 --> 00:48:43,719
to go and actually potentially buy that brand now, So

1018
00:48:43,760 --> 00:48:46,400
they're being rewarded. I would really like to see the

1019
00:48:46,440 --> 00:48:49,480
news outlets actually lean into helping us as well. I

1020
00:48:49,480 --> 00:48:51,679
think public opinion can go a long way, And I

1021
00:48:51,679 --> 00:48:53,440
don't know if you have any particular thoughts there.

1022
00:48:53,719 --> 00:48:56,880
Speaker 5: Yeah, I think public opinion is strong a lot kind

1023
00:48:56,880 --> 00:48:59,159
of In Australia, we had some really big lakes up

1024
00:48:59,239 --> 00:49:02,840
this ahead breach Mediicare medi Banks has been a few

1025
00:49:02,840 --> 00:49:04,679
big ones. The Optus one was probably the big Optics

1026
00:49:04,800 --> 00:49:08,840
is like our like our Verizon or one of our

1027
00:49:08,840 --> 00:49:11,880
big carriers over here, and they had a massive leak

1028
00:49:11,960 --> 00:49:13,519
through an API that was open, right. It was an

1029
00:49:13,559 --> 00:49:18,079
open API, so dart exfiltration. It wasn't an attack, They

1030
00:49:18,079 --> 00:49:21,840
didn't break take down any systems, but that that did

1031
00:49:22,000 --> 00:49:25,079
make the government act and put the owners back on

1032
00:49:25,960 --> 00:49:28,800
other corporations to make sure that they at least were

1033
00:49:28,840 --> 00:49:31,880
compliant in some way. Right, So it's mandates that have

1034
00:49:31,880 --> 00:49:34,199
been released because of that, like checks that you have

1035
00:49:34,239 --> 00:49:37,760
to kind of follow. So Yeah, the media has made

1036
00:49:37,800 --> 00:49:41,360
things positive, but again such a gray area that I

1037
00:49:41,360 --> 00:49:44,519
think country to country it's just so different. I think

1038
00:49:44,599 --> 00:49:46,280
the latest one that's still a bit weird is a

1039
00:49:46,320 --> 00:49:49,559
Ticketmaster one that's still kind of out there right that

1040
00:49:49,760 --> 00:49:54,280
was Snowflake was the platform there. Snowflake hasn't really been

1041
00:49:54,360 --> 00:49:57,400
hit too hard by that and ticket Master hasn't released

1042
00:49:57,400 --> 00:49:59,360
any details as to what got leaked yet, so we're

1043
00:49:59,360 --> 00:50:00,519
still kind of waiting on that one.

1044
00:50:00,559 --> 00:50:02,239
Speaker 6: But that's an a big one of late.

1045
00:50:02,519 --> 00:50:05,280
Speaker 3: Yeah, I mean I saw a really good followed from there.

1046
00:50:05,400 --> 00:50:08,039
Speaker 4: Is like a good reminder of Snowflake and their customers

1047
00:50:08,119 --> 00:50:11,760
all have certifications proving that they're secure, and you know

1048
00:50:11,800 --> 00:50:13,440
that didn't really get them anything. So it's a good

1049
00:50:13,480 --> 00:50:16,199
reminder that these things are not have no indication that

1050
00:50:16,280 --> 00:50:19,719
you are secure. Like if your internal practices aren't actually

1051
00:50:19,760 --> 00:50:22,880
being monitored, your your policies aren't actually being followed, they

1052
00:50:22,880 --> 00:50:24,280
don't really amount the much.

1053
00:50:24,599 --> 00:50:26,440
Speaker 5: Yeah, And to that point, the way that we've been

1054
00:50:26,440 --> 00:50:28,320
talking about it at VAN and I've we spent we

1055
00:50:28,360 --> 00:50:30,400
spent the last year basically scaring the shad out of

1056
00:50:30,400 --> 00:50:33,199
people at round tables talking to them about how they

1057
00:50:33,199 --> 00:50:35,039
should be ready for this sort of stuff and then

1058
00:50:35,039 --> 00:50:38,039
have you got good data protection as part of that strategy.

1059
00:50:38,079 --> 00:50:40,880
But I think the dark web and the point end

1060
00:50:40,880 --> 00:50:42,920
that's still getting point here right, like being able to

1061
00:50:43,039 --> 00:50:43,800
use l lms.

1062
00:50:43,880 --> 00:50:45,760
Speaker 6: Going back to the prompt engineering in LLM.

1063
00:50:46,519 --> 00:50:50,760
Speaker 5: You know, I've I was writing encryption algorithms in the

1064
00:50:50,760 --> 00:50:53,239
early days because I wanted to show off a bit

1065
00:50:53,280 --> 00:50:56,840
of a demo at last year's conference and I was

1066
00:50:56,880 --> 00:50:59,800
able to encrypt a find an open stree bucket and

1067
00:51:00,039 --> 00:51:00,800
rip that bucket.

1068
00:51:00,960 --> 00:51:02,320
Speaker 6: And I did that all with chat GBT.

1069
00:51:02,800 --> 00:51:05,400
Speaker 5: Now I didn't it told me that I shouldn't use it,

1070
00:51:05,480 --> 00:51:07,679
and I shouldn't use this because it was bad, But

1071
00:51:07,719 --> 00:51:10,679
I basically again co werced it, prompt engineered my way

1072
00:51:10,719 --> 00:51:14,239
into getting it done. They have got more guardrails on today.

1073
00:51:14,280 --> 00:51:16,000
So if you actually go to chat gbt today and

1074
00:51:16,000 --> 00:51:18,719
try and say write me a ransomware script that does this,

1075
00:51:18,719 --> 00:51:21,440
this and that, you've got to kind of really coerce

1076
00:51:21,440 --> 00:51:23,440
and work with it and massage it to get that

1077
00:51:23,519 --> 00:51:24,639
script out.

1078
00:51:24,880 --> 00:51:27,480
Speaker 6: Okay, yeah, you should be able to still do it.

1079
00:51:27,880 --> 00:51:28,719
But that's said.

1080
00:51:28,719 --> 00:51:34,880
Speaker 5: On the dark web, there's all these unguarded and sort

1081
00:51:34,920 --> 00:51:39,320
of open versions of these chat lambs that will write

1082
00:51:39,320 --> 00:51:43,320
these for you. Right there's the worm GBT is the

1083
00:51:43,320 --> 00:51:45,159
one that I think is the most popular one out there.

1084
00:51:45,280 --> 00:51:48,320
It's basically a rip of chat GBT to a certain extent,

1085
00:51:48,599 --> 00:51:54,840
but it will write you phishing emails, ransomware emails, it'll

1086
00:51:54,880 --> 00:52:00,000
write you encryption tooling. So yeah, if those tools are out,

1087
00:52:00,320 --> 00:52:03,000
it means that not only the good guys or the

1088
00:52:03,039 --> 00:52:06,079
ransomware companies getting sorry, not the good guys, the guys

1089
00:52:06,119 --> 00:52:09,719
that are good at ransomware guys, but the gods that

1090
00:52:09,719 --> 00:52:12,159
are good at it. They're getting better at it. But

1091
00:52:12,320 --> 00:52:14,400
then you've got you put that in the hands of

1092
00:52:14,480 --> 00:52:16,719
just every day a little indie hacker or a little

1093
00:52:16,880 --> 00:52:19,679
script kitty who's able to now do some really powerful

1094
00:52:19,719 --> 00:52:23,440
stuff because I can leverage the technology that's scary, and

1095
00:52:23,480 --> 00:52:24,960
that's where we're going to stay ahead of the game.

1096
00:52:25,400 --> 00:52:28,079
Speaker 4: I actually have got two relevance. That's they're the first one,

1097
00:52:28,119 --> 00:52:30,960
and some of these tools out there will actually fully

1098
00:52:31,000 --> 00:52:34,159
generate you the fishing website that looks exactly the same

1099
00:52:34,199 --> 00:52:36,960
as Google log in or Azure logan, et cetera, like

1100
00:52:37,039 --> 00:52:40,320
almost identical like any product out there is for companies

1101
00:52:40,320 --> 00:52:42,280
who want to fish and the other one is some

1102
00:52:42,360 --> 00:52:46,960
blockchain developer that had like fifty thousand dollars and some

1103
00:52:47,039 --> 00:52:50,280
crypto and the keys in their GitHub and they pushed

1104
00:52:50,280 --> 00:52:53,079
it up accidentally to make the public the project public.

1105
00:52:53,440 --> 00:52:56,320
It only took two minutes for the entire wall to

1106
00:52:56,360 --> 00:52:59,119
be drunk like that is that is ridiculous, Like two

1107
00:52:59,199 --> 00:53:03,400
minutes for keys to be not only scraped but also utilized,

1108
00:53:03,440 --> 00:53:04,280
and the walla drain.

1109
00:53:04,639 --> 00:53:06,280
Speaker 5: So I've got a famous I've got a good story

1110
00:53:06,320 --> 00:53:07,559
on that, and I've done that myself.

1111
00:53:07,599 --> 00:53:08,800
Speaker 6: And this was I don't know how long ago.

1112
00:53:09,239 --> 00:53:11,199
Speaker 5: It's a blog post I put up there, but I

1113
00:53:11,199 --> 00:53:12,840
did that when I was writing some code. I was

1114
00:53:12,840 --> 00:53:16,639
actually writing some infrastructure's code using Terraform actually to work

1115
00:53:16,679 --> 00:53:22,480
with VMware VMC and also do some sort of post

1116
00:53:23,039 --> 00:53:27,920
Terraform sort of project work there, and I accidentally uploaded

1117
00:53:27,960 --> 00:53:32,960
my Amazon Aws secret key and as part of the code.

1118
00:53:34,000 --> 00:53:37,800
No sooner, I remember just all of a sudden not

1119
00:53:37,880 --> 00:53:40,000
being able to log in and upload the terrorform script,

1120
00:53:40,079 --> 00:53:41,679
and I was like, Oh, that's weird, what's going on?

1121
00:53:42,199 --> 00:53:44,639
And then realize that when I logged into my Aws

1122
00:53:44,679 --> 00:53:49,119
console in every region, about thirty or forty instances had

1123
00:53:49,159 --> 00:53:52,559
been spun up and they were basically mining whatever they

1124
00:53:52,559 --> 00:53:54,400
were mining on there. But what they all said is

1125
00:53:54,400 --> 00:53:58,119
they deleted everything. So I got absolutely done and that

1126
00:53:58,280 --> 00:54:01,239
was no joke. Within you know again maybe five and

1127
00:54:01,280 --> 00:54:02,880
it's me upload and that key, and then what I

1128
00:54:03,000 --> 00:54:06,239
learned was that, yeah, they've got they're just looking for that.

1129
00:54:06,280 --> 00:54:09,199
They're looking for dumb idiots like me who upload their

1130
00:54:09,280 --> 00:54:12,119
keys if they scrape hub and once they found them,

1131
00:54:12,159 --> 00:54:13,079
bam they get in there.

1132
00:54:13,480 --> 00:54:15,639
Speaker 6: It's really interesting how quickly that happens.

1133
00:54:15,800 --> 00:54:18,159
Speaker 4: Well, they have to be fast, not because you'll find

1134
00:54:18,159 --> 00:54:20,920
out about it, but because there's other people also scraping

1135
00:54:20,960 --> 00:54:26,159
those same keys against other yeah, like other bad guys

1136
00:54:26,199 --> 00:54:29,320
basically other villains, uh, in order to beat them, Like

1137
00:54:29,679 --> 00:54:32,119
there is a race there.

1138
00:54:32,239 --> 00:54:35,320
Speaker 6: And where why do people like this? Bad people in

1139
00:54:35,320 --> 00:54:37,760
the world? Is bad people in the world, But I

1140
00:54:38,119 --> 00:54:38,400
get it.

1141
00:54:38,440 --> 00:54:40,519
Speaker 5: People want to people want to make money, and there's

1142
00:54:40,519 --> 00:54:43,679
if people want to make money in any way they can,

1143
00:54:44,000 --> 00:54:47,480
they'll find methods to make money any way they can, right,

1144
00:54:48,360 --> 00:54:51,119
And that's why gransomware is big. That's where that's why

1145
00:54:51,159 --> 00:54:54,719
we've got all these prevalence of you know, attacks against companies,

1146
00:54:54,800 --> 00:54:56,920
expltration I've got your dat. I'm going to sell it

1147
00:54:57,840 --> 00:55:00,360
because people know that people pay money for this stuff.

1148
00:55:00,440 --> 00:55:02,880
Speaker 6: So it's just the world that we live in, right, I.

1149
00:55:02,920 --> 00:55:06,079
Speaker 4: Mean, I think there's a huge philosophical conundrum there, and

1150
00:55:06,119 --> 00:55:07,800
I don't think we're going to have the time together.

1151
00:55:08,000 --> 00:55:09,159
Speaker 6: No, no, no, no.

1152
00:55:09,400 --> 00:55:12,400
Speaker 5: I know it's interesting, but yeah, at the end of

1153
00:55:12,400 --> 00:55:15,639
the day, like those l ms, they're absolutely used for

1154
00:55:15,679 --> 00:55:20,239
good and evil, but I do believe that overall they're amazing.

1155
00:55:20,480 --> 00:55:22,840
Speaker 2: So yeah, yeah, I think it's like that comes down

1156
00:55:22,880 --> 00:55:27,400
to our own level of trust because you see it

1157
00:55:27,400 --> 00:55:30,599
with people too. You know, people form their little tight

1158
00:55:30,639 --> 00:55:34,760
circles and you don't trust anyone outside of your circle

1159
00:55:35,360 --> 00:55:37,800
until like a bigger threat comes in and you need

1160
00:55:37,840 --> 00:55:40,239
to band together, like okay, well now we trust those

1161
00:55:40,239 --> 00:55:43,280
guys because there's new people that we don't trust. And

1162
00:55:43,599 --> 00:55:47,159
I think that this tools like this fall into that

1163
00:55:47,239 --> 00:55:52,920
same pattern because it is such a conversational interface with

1164
00:55:53,000 --> 00:55:56,840
it that we give it like human characteristics.

1165
00:55:57,320 --> 00:55:59,719
Speaker 5: Yeah, and I would encourage anyone who uses it to

1166
00:55:59,760 --> 00:56:02,039
not treat it like a robot or not treat it

1167
00:56:02,119 --> 00:56:04,880
like a chap, but I would encourage people to actually

1168
00:56:04,960 --> 00:56:08,360
interact with it in a natural way, because I think

1169
00:56:08,400 --> 00:56:09,440
that's the best way.

1170
00:56:09,559 --> 00:56:10,719
Speaker 6: That's that's how I.

1171
00:56:10,639 --> 00:56:15,159
Speaker 5: Think they're built and how they're designed to respond better. So,

1172
00:56:15,800 --> 00:56:18,119
you know, if I was giving a one O one

1173
00:56:18,199 --> 00:56:21,079
on prompt engineering, it would be have a conversation with

1174
00:56:21,159 --> 00:56:23,880
it first and feel and feel good about talking with

1175
00:56:23,920 --> 00:56:26,199
it in a natural way, and then get into the

1176
00:56:26,199 --> 00:56:26,840
heavy stuff.

1177
00:56:27,079 --> 00:56:30,159
Speaker 2: Just start every conversation with the phrase, I, for one,

1178
00:56:30,280 --> 00:56:32,159
welcome our AI overlords.

1179
00:56:32,320 --> 00:56:38,000
Speaker 5: You'll be fine, Yeah, exactly, it's going to be. Yeah,

1180
00:56:38,079 --> 00:56:40,079
that part of it is going to be interesting. And look,

1181
00:56:40,239 --> 00:56:44,599
there's no doubt that we're heading towards something like you know,

1182
00:56:44,880 --> 00:56:49,440
general artificial intelligence, and I think that's exciting and scary

1183
00:56:49,639 --> 00:56:53,039
and all of the above. But it seems like where

1184
00:56:53,039 --> 00:56:55,239
else if you said five years ago, how far away

1185
00:56:55,280 --> 00:56:55,519
is that.

1186
00:56:55,480 --> 00:56:56,119
Speaker 6: We would have gone?

1187
00:56:56,760 --> 00:57:02,320
Speaker 5: You know, ten years, fifteen, twenty years now. Acceleration is significant,

1188
00:57:02,400 --> 00:57:05,480
and you know, I can see it happening this decade.

1189
00:57:05,679 --> 00:57:07,760
Speaker 6: You know, I think that's kind of the trajectory that

1190
00:57:07,800 --> 00:57:08,159
we're in.

1191
00:57:09,039 --> 00:57:14,000
Speaker 5: Yeah, it's cool, cool, scary and well at the same time, isn't.

1192
00:57:13,800 --> 00:57:16,320
Speaker 1: It all right? What else were we going to talk about?

1193
00:57:17,199 --> 00:57:17,400
Speaker 6: Oh?

1194
00:57:17,440 --> 00:57:22,639
Speaker 2: We had the talking about like the evolution of dev ops.

1195
00:57:22,800 --> 00:57:25,360
We were talking about that before we started recording, and like,

1196
00:57:25,440 --> 00:57:30,280
how you know, there's there's people like us who are

1197
00:57:31,039 --> 00:57:33,079
just have like one or two things that were good

1198
00:57:33,119 --> 00:57:36,199
at and then there's other people who just have this

1199
00:57:36,360 --> 00:57:40,000
massive foundation and can just like context switch and seem

1200
00:57:40,039 --> 00:57:41,239
to be great at everything.

1201
00:57:42,519 --> 00:57:42,800
Speaker 6: Yeah.

1202
00:57:42,840 --> 00:57:46,280
Speaker 5: I think my first boss that I had Alex Alex

1203
00:57:46,280 --> 00:57:52,039
Sliffer's great guy, Australian German, so it was very meticulous.

1204
00:57:52,039 --> 00:57:53,280
He was very German and in a lot of the

1205
00:57:53,280 --> 00:57:54,960
ways that he did a lot of things very efficient.

1206
00:57:55,559 --> 00:57:57,679
Speaker 6: It was very stereotypically German in that sense.

1207
00:57:58,039 --> 00:58:00,639
Speaker 5: I think instilled in me a lot of my better

1208
00:58:00,679 --> 00:58:04,000
traits in it because he was effectively my first mentor

1209
00:58:04,039 --> 00:58:07,000
and teacher when I was a kid. But he was

1210
00:58:07,039 --> 00:58:10,079
everything he would when we were at an internet service company.

1211
00:58:10,079 --> 00:58:13,119
We're a hosting company. We did Windows Linux, we ran

1212
00:58:13,599 --> 00:58:16,719
Cisco networking gear, and he just knew every He was

1213
00:58:16,719 --> 00:58:19,639
like an expert in everything, like everything, and I was

1214
00:58:19,800 --> 00:58:22,039
I was listening awe, like, you know, just for him

1215
00:58:22,039 --> 00:58:24,039
to be the ev because he had to be right

1216
00:58:24,079 --> 00:58:25,760
because he We're a small company.

1217
00:58:26,079 --> 00:58:29,639
Speaker 6: He had to know how to do the BGP, the DNS,

1218
00:58:30,199 --> 00:58:31,159
a lot of balancing.

1219
00:58:31,199 --> 00:58:35,320
Speaker 5: But then he'd basically code our website, code our billing system,

1220
00:58:35,840 --> 00:58:38,400
and then he would do the setup of the active

1221
00:58:38,440 --> 00:58:41,039
directory and you know, then the Linux setup.

1222
00:58:40,679 --> 00:58:43,199
Speaker 6: And everything like just literally everything.

1223
00:58:44,320 --> 00:58:47,280
Speaker 5: You know, he'd set up the sand So whatever he did,

1224
00:58:47,679 --> 00:58:49,440
whatever was an it he had, was that guy. So

1225
00:58:49,440 --> 00:58:52,000
he was like one of the original I guess devopy

1226
00:58:52,079 --> 00:58:54,559
guys because he had to be. And so you do

1227
00:58:54,760 --> 00:58:57,719
find guys like that. They're kind of unicorny, though I

1228
00:58:57,719 --> 00:59:01,199
don't think they exist honestly, very much in terms of

1229
00:59:01,239 --> 00:59:04,440
people that can do literally everything at a super high level.

1230
00:59:05,039 --> 00:59:07,159
And sometimes you find other guys that are really good

1231
00:59:07,159 --> 00:59:11,800
at coding plus infrastructure or infrastructure plus storage or networking

1232
00:59:11,800 --> 00:59:12,440
plus something.

1233
00:59:12,519 --> 00:59:14,880
Speaker 6: But yeah, it's very hard to be good.

1234
00:59:14,800 --> 00:59:18,800
Speaker 5: At everything at a super ocra high level in this industry,

1235
00:59:18,840 --> 00:59:19,159
isn't it?

1236
00:59:20,360 --> 00:59:23,920
Speaker 2: Yeah for sure, And and it gets increasingly more difficult

1237
00:59:23,960 --> 00:59:27,719
every year as we add new technologies. Like a couple,

1238
00:59:28,119 --> 00:59:31,159
you know, a couple of years from now, tops, a

1239
00:59:31,159 --> 00:59:33,519
lot of the companies we work for will have their

1240
00:59:33,639 --> 00:59:36,719
own lllms and it'll be an integral part of their

1241
00:59:36,800 --> 00:59:41,599
business and somebody's gonna have to you know, build, operate, maintain,

1242
00:59:41,679 --> 00:59:43,840
and scale that and that'll be another skill set that

1243
00:59:43,880 --> 00:59:45,280
we have to figure out how to do.

1244
00:59:46,800 --> 00:59:49,840
Speaker 5: Yeah, And I think I think DevOps, you know, there

1245
00:59:49,880 --> 00:59:52,079
was it had a bad STAPs to joke with Michael

1246
00:59:52,119 --> 00:59:52,800
actually about this.

1247
00:59:52,840 --> 00:59:54,039
Speaker 6: We both used to joke about it.

1248
00:59:54,079 --> 00:59:57,679
Speaker 5: And that's quite ironic that he's effectively built an accelerated

1249
00:59:57,719 --> 01:00:01,360
brand off his ninety days, right because we used to

1250
01:00:01,400 --> 01:00:01,960
just give a ship.

1251
01:00:02,000 --> 01:00:03,239
Speaker 6: We used to go, we're going to grow a beard,

1252
01:00:03,280 --> 01:00:05,400
and I think we can think we're mart.

1253
01:00:06,719 --> 01:00:09,239
Speaker 5: We used to joke about that at actual presentations that

1254
01:00:09,280 --> 01:00:10,719
we used to do. We used to go, like, so

1255
01:00:10,880 --> 01:00:13,559
DevOps that we're doing like a terrorform code and building

1256
01:00:13,599 --> 01:00:17,440
out our VMC infrastructure and doing this and that. But

1257
01:00:17,760 --> 01:00:21,199
I think absolutely there's there's a there's a great play

1258
01:00:21,239 --> 01:00:24,119
for it now where the platform engineering slash DevOps has

1259
01:00:24,199 --> 01:00:26,239
just come in because you have to you have to

1260
01:00:26,239 --> 01:00:27,000
be that person.

1261
01:00:27,360 --> 01:00:29,840
Speaker 6: You have to know how to consume.

1262
01:00:29,559 --> 01:00:32,840
Speaker 5: Read read a bird terraform, read infrastructure as code, read

1263
01:00:32,920 --> 01:00:35,559
some Python to do this, this or that. It's critical

1264
01:00:35,559 --> 01:00:37,760
to everything that we do. So I think it's just

1265
01:00:37,880 --> 01:00:40,599
entering that mainstream and it's not a case of you know,

1266
01:00:40,760 --> 01:00:43,239
this guy's a devop guy. This guy's a platform guy

1267
01:00:43,320 --> 01:00:46,119
or whatever. I think everyone has had to be dragged

1268
01:00:46,199 --> 01:00:48,360
up in a way to be able to be a

1269
01:00:48,480 --> 01:00:51,880
DevOps or platform engineer of sorts today. If you're not

1270
01:00:51,880 --> 01:00:54,559
a pure code like still got the code, there will

1271
01:00:54,559 --> 01:00:57,559
always be there. But in terms of infrastructure understanding what

1272
01:00:57,639 --> 01:01:00,320
it is, I think there should always be. But I

1273
01:01:00,360 --> 01:01:02,719
personally think you should always know about the system that

1274
01:01:02,719 --> 01:01:04,920
you're deploying. You should always know and not just treat

1275
01:01:04,920 --> 01:01:06,880
it like a black box. That's kind of been my

1276
01:01:06,960 --> 01:01:11,199
methodology in my career is to not just accept something

1277
01:01:11,239 --> 01:01:15,599
that's given to you. And I think there's a dangerous

1278
01:01:15,599 --> 01:01:17,199
sort of the way that we do things today where

1279
01:01:17,239 --> 01:01:19,559
we just accept that a machine is going to get built,

1280
01:01:19,880 --> 01:01:22,039
or we accept that a bunch of AWS services are

1281
01:01:22,039 --> 01:01:23,760
going to be deployed and they do this, this and that.

1282
01:01:24,159 --> 01:01:25,920
But just taking the time to go back and actually

1283
01:01:25,960 --> 01:01:28,440
understand the mechanics of it, it's still very important. I

1284
01:01:28,480 --> 01:01:31,440
think that that kind of differentiates a few people as well.

1285
01:01:32,039 --> 01:01:33,960
Speaker 4: I mean, I think you really hit on an important

1286
01:01:33,960 --> 01:01:35,960
thing there, because I think this is what really led

1287
01:01:36,000 --> 01:01:39,320
into the whole DevOps mindset movement is you have to

1288
01:01:39,440 --> 01:01:42,760
understand one hundred like as much as possible in the

1289
01:01:42,920 --> 01:01:45,880
ecosystem that you're building, in what you're writing code and employing,

1290
01:01:45,880 --> 01:01:49,239
you're a single team is cross functional and responsible for everything,

1291
01:01:49,760 --> 01:01:51,679
and that at the same time, there is more and

1292
01:01:51,719 --> 01:01:55,280
more that we have to learn about and know, and

1293
01:01:55,320 --> 01:01:57,360
maybe really the only way in which we can be

1294
01:01:57,400 --> 01:02:01,400
successful is if we are adding ms as our team

1295
01:02:01,440 --> 01:02:05,440
members to actually expand out our knowledge and expertise there,

1296
01:02:05,480 --> 01:02:08,679
because there is just more and more things that I

1297
01:02:08,760 --> 01:02:12,159
just don't think anyone can reasonably become an expert in

1298
01:02:12,480 --> 01:02:14,079
one hundred percent of everything.

1299
01:02:13,800 --> 01:02:16,880
Speaker 5: They should be no, and if anyone, if anyone kind

1300
01:02:16,880 --> 01:02:20,599
of says they are an expert, probably don't believe them.

1301
01:02:20,559 --> 01:02:27,159
Speaker 4: Right, right, Dunn Kruger there, right, Yeah.

1302
01:02:25,800 --> 01:02:27,360
Speaker 2: That's been one of the things I've struggled with for

1303
01:02:27,400 --> 01:02:30,199
people because I feel I have a lot of empathy

1304
01:02:30,199 --> 01:02:35,320
for people who are just starting a career in infrastructure

1305
01:02:35,719 --> 01:02:38,320
or platform engineering or DevOps, whatever you want to call it,

1306
01:02:39,519 --> 01:02:43,280
because there is so much to take in, Like I

1307
01:02:43,760 --> 01:02:45,760
know a lot of it, but only because I've been

1308
01:02:45,760 --> 01:02:47,960
doing this for three decades, Like if I were to

1309
01:02:48,000 --> 01:02:51,480
start today. It's one of the things I try to

1310
01:02:51,519 --> 01:02:53,679
do is for people who are just starting. It's like,

1311
01:02:53,719 --> 01:02:58,960
how do I condense three decades of things into a

1312
01:02:59,559 --> 01:03:02,679
digest stable format for them so that they can build

1313
01:03:02,760 --> 01:03:04,840
on top of what I learned instead of having to

1314
01:03:04,880 --> 01:03:06,360
relearn what I've learned.

1315
01:03:07,559 --> 01:03:11,760
Speaker 5: Yeah, and I think I mean someone actually, a guy

1316
01:03:11,800 --> 01:03:16,159
called Suresh who is the CEO for platform nine. Who

1317
01:03:17,159 --> 01:03:19,320
do I'm not sure if you guys know platform nine

1318
01:03:19,360 --> 01:03:22,639
that they do. They started off in KVN based hosting

1319
01:03:22,639 --> 01:03:25,679
with VMware and now now they basically do Kubernetes based

1320
01:03:25,679 --> 01:03:30,719
platforming on AWS. But he told me way back when

1321
01:03:31,559 --> 01:03:34,960
that the developers of today are just in a different

1322
01:03:35,039 --> 01:03:37,360
class to what you think the developers of five years

1323
01:03:37,360 --> 01:03:39,239
ago were and what they were, because what they're learning

1324
01:03:39,239 --> 01:03:43,199
at UNI is something completely different. Like I learned Turbo

1325
01:03:43,239 --> 01:03:48,039
Pascal at UNI, right, or maybe some people older than

1326
01:03:48,039 --> 01:03:50,320
me would have learned cold Cobowl or whatever it might

1327
01:03:50,320 --> 01:03:52,639
have been. Right, And then now then it went to Java,

1328
01:03:52,639 --> 01:03:54,760
and then it went to this, and now they're learning visuals,

1329
01:03:55,119 --> 01:03:57,679
dot net and now they're learning Python and so it's

1330
01:03:57,719 --> 01:04:00,320
just accelerating. But the big thing that stacks to me

1331
01:04:00,400 --> 01:04:01,960
wasn't so much about the code. It was more about

1332
01:04:02,639 --> 01:04:05,119
these new developers that are coming out or the new

1333
01:04:05,199 --> 01:04:08,000
breed of it. Guys, if you ask them what a

1334
01:04:08,039 --> 01:04:12,239
storage platform is, they'll say, oh, Mungo dB or Cassandra

1335
01:04:12,519 --> 01:04:15,320
or you know, one of those database.

1336
01:04:15,000 --> 01:04:17,159
Speaker 6: Is not not an actual storage system, not a disc.

1337
01:04:17,280 --> 01:04:20,280
Speaker 5: So they're abstracted up a couple of layers, right, And

1338
01:04:20,320 --> 01:04:24,039
they might even say, you know, the next ones might say, oh, well,

1339
01:04:24,519 --> 01:04:27,679
what's my storage roll. It's my Mongo dB database sitting

1340
01:04:27,719 --> 01:04:29,920
on Mango dB, you know what I mean. So there's

1341
01:04:29,920 --> 01:04:32,760
this level of continual growth and abstraction that's happening with

1342
01:04:32,840 --> 01:04:35,480
these guys, so they don't need to know what's underneath.

1343
01:04:35,519 --> 01:04:37,360
So those guys don't care about the storage. I don't

1344
01:04:37,400 --> 01:04:39,719
care about the spinning disc or the envmme drives. They

1345
01:04:39,719 --> 01:04:41,679
are running it all the servers. All they care about

1346
01:04:41,800 --> 01:04:44,639
is the interface into that Mango dB server. And that's

1347
01:04:44,880 --> 01:04:45,760
that's quite interesting.

1348
01:04:46,639 --> 01:04:47,679
Speaker 1: Yeah, that's a good point.

1349
01:04:48,320 --> 01:04:51,280
Speaker 2: And I think the level of abstraction is just we're

1350
01:04:51,360 --> 01:04:54,079
abstracting the abstractions now absolutely.

1351
01:04:54,599 --> 01:04:56,559
Speaker 4: I mean, I actually think it even goes further than that.

1352
01:04:56,719 --> 01:04:59,519
It just as you know where we're currently at. I

1353
01:04:59,639 --> 01:05:05,079
see in university students for their senior thesis basically building

1354
01:05:05,199 --> 01:05:09,199
something that runs on AWS as as the project, which

1355
01:05:09,239 --> 01:05:12,599
is just so much far removed from like even what

1356
01:05:12,679 --> 01:05:14,960
I was doing initially, Like I had to I had

1357
01:05:15,000 --> 01:05:17,679
to teach the first company that I actually multiple companies

1358
01:05:17,719 --> 01:05:20,360
that I had worked at, like what GIF was, and

1359
01:05:20,400 --> 01:05:22,599
that like what version control was, Like I was teaching

1360
01:05:22,599 --> 01:05:24,960
them that and I didn't learn that the university that

1361
01:05:25,079 --> 01:05:25,599
was just like.

1362
01:05:25,519 --> 01:05:26,920
Speaker 3: What I was doing. They didn't have that.

1363
01:05:27,599 --> 01:05:31,199
Speaker 4: But now that like that's that's just a class that

1364
01:05:31,239 --> 01:05:32,159
they have to go through.

1365
01:05:32,559 --> 01:05:34,840
Speaker 5: Yeah, absolutely, I mean that's and that's the difference is

1366
01:05:34,880 --> 01:05:36,679
because that is a skill that's required when they.

1367
01:05:36,639 --> 01:05:38,760
Speaker 6: Hit the workforce today because everyone kind of needs to

1368
01:05:38,800 --> 01:05:39,000
know it.

1369
01:05:39,119 --> 01:05:42,719
Speaker 5: So yeah, it's things move on, and I'm sure to

1370
01:05:43,159 --> 01:05:45,599
tie back, you know, tie the not in it to

1371
01:05:46,440 --> 01:05:49,400
to the LM talk. Absolutely, you're going to get prompt

1372
01:05:49,440 --> 01:05:54,840
engineering courses within university, if not already happening like pretty soon, right, Yeah.

1373
01:05:54,719 --> 01:05:57,239
Speaker 2: I would expect still be interesting to see the curriculum

1374
01:05:57,280 --> 01:05:57,679
on those.

1375
01:05:58,199 --> 01:06:01,119
Speaker 5: Yeah, don't swear, they don't don't don't be nasty to

1376
01:06:01,159 --> 01:06:02,280
It was like one on one.

1377
01:06:04,079 --> 01:06:08,039
Speaker 2: Listing cats, right, because if you're in college now, you

1378
01:06:08,280 --> 01:06:10,400
likely will live long enough to see this thing have

1379
01:06:10,480 --> 01:06:12,760
a real world interface and it's going to come back

1380
01:06:12,760 --> 01:06:16,559
with those things you said, absolutely awesome.

1381
01:06:16,639 --> 01:06:18,119
Speaker 1: Should we do some picks?

1382
01:06:18,519 --> 01:06:20,719
Speaker 6: Yeah, I'm excited about this, all right.

1383
01:06:20,639 --> 01:06:22,599
Speaker 1: Cool, Warren, can you kicks off?

1384
01:06:23,280 --> 01:06:23,960
Speaker 3: Yeah? For sure.

1385
01:06:24,119 --> 01:06:28,039
Speaker 4: So I've been holding this one, this particular pick, for

1386
01:06:28,199 --> 01:06:30,800
a few weeks now because I knew the yeah, the

1387
01:06:30,880 --> 01:06:31,559
episode on.

1388
01:06:31,639 --> 01:06:32,920
Speaker 3: Prompt engineering was coming.

1389
01:06:33,280 --> 01:06:38,039
Speaker 4: So if you're unfamiliar with the newsletter TLDR, there's a

1390
01:06:38,039 --> 01:06:42,960
TLDR security written by Clinton Gibbler and Remy Macarth that

1391
01:06:43,000 --> 01:06:47,280
you were absolutely fantastic, and they put together fifty plus

1392
01:06:47,320 --> 01:06:50,599
research papers. They found them, they distilled them down, like

1393
01:06:50,679 --> 01:06:52,400
really went deep what they mean. And there's a get

1394
01:06:52,400 --> 01:06:56,800
out repository on prompt injection defenses. I highly recommend if

1395
01:06:56,840 --> 01:07:00,199
you're anywhere near the area of the industry to.

1396
01:07:00,280 --> 01:07:00,800
Speaker 3: Take a look.

1397
01:07:00,920 --> 01:07:03,760
Speaker 4: There is quite good summary of like what the problems

1398
01:07:03,760 --> 01:07:07,239
are and how they're being solved or even start to

1399
01:07:07,280 --> 01:07:09,639
be tackled. And there's some really clever things in there,

1400
01:07:09,920 --> 01:07:12,920
like oh, you know, well, someone can tell our LM

1401
01:07:13,000 --> 01:07:14,880
to do something bad, and so like how do you

1402
01:07:14,880 --> 01:07:17,400
get around that. One of them I find really interesting

1403
01:07:17,519 --> 01:07:20,480
was oh well run it, run it like back through itself,

1404
01:07:20,519 --> 01:07:22,360
like get the result and then like parse it and

1405
01:07:22,400 --> 01:07:24,239
then like pass it back through the same LLM or

1406
01:07:24,280 --> 01:07:27,159
like another LLM and see if that one was validated.

1407
01:07:27,199 --> 01:07:31,199
And so using a graph of lms, get them to

1408
01:07:31,199 --> 01:07:33,280
sort of validate the response from a different one and

1409
01:07:33,320 --> 01:07:35,360
whether or not that's it's because like even if you

1410
01:07:35,360 --> 01:07:37,360
can get over the system prompt of the first one,

1411
01:07:37,480 --> 01:07:39,400
if the output of that has to still match the

1412
01:07:39,400 --> 01:07:41,679
second one that you're able to have better security over,

1413
01:07:41,800 --> 01:07:43,480
you can subvert attacks that way.

1414
01:07:43,519 --> 01:07:45,280
Speaker 3: And the list goes on and on. There's I think

1415
01:07:45,280 --> 01:07:46,360
there's over fifty in there.

1416
01:07:46,639 --> 01:07:48,679
Speaker 1: Cool, all right, Anthony, would you bring for a pick

1417
01:07:50,079 --> 01:07:50,440
turn there?

1418
01:07:50,519 --> 01:07:53,639
Speaker 5: They're both related to AI and all that kind of stuff.

1419
01:07:53,719 --> 01:07:56,280
So the first one is Life three point zero by

1420
01:07:56,719 --> 01:08:00,239
Max teg Mark. So I listened to this maybe year

1421
01:08:00,239 --> 01:08:02,320
and a half maybe actually it was before chat GBT

1422
01:08:02,480 --> 01:08:05,440
came up. But it's effectively about becoming age of AI

1423
01:08:05,639 --> 01:08:09,440
and whatnot. But the first prologue is effectively it's a

1424
01:08:09,480 --> 01:08:11,920
story and it's worth reading for it alone.

1425
01:08:11,960 --> 01:08:12,960
Speaker 6: It's it's called the.

1426
01:08:13,320 --> 01:08:17,359
Speaker 5: A Mega Project, and it's effectively you know about an

1427
01:08:17,359 --> 01:08:20,800
AI that's been developed by company kept secret perhaps in

1428
01:08:20,880 --> 01:08:24,439
the Internet gets sentiment, gets it starts to build in

1429
01:08:24,479 --> 01:08:27,359
itself and get better and then effectively controls the world.

1430
01:08:27,399 --> 01:08:29,479
But it's not a not a bad story, it's a

1431
01:08:29,479 --> 01:08:32,800
great story. But it's actually coming to life right now

1432
01:08:32,920 --> 01:08:37,119
like the actual it's a prophetic look at what's happening

1433
01:08:37,439 --> 01:08:39,920
with chat, GBT and the LLM today. So that's that's

1434
01:08:39,960 --> 01:08:43,920
a big one that I would recommend. And the other one,

1435
01:08:43,960 --> 01:08:47,560
which is actually really cool, is more about that second

1436
01:08:47,600 --> 01:08:50,800
part that we're talking about, is called an Overseen by

1437
01:08:50,920 --> 01:08:55,039
James Lovelock. James Lovelock worked he only died a few

1438
01:08:55,079 --> 01:08:57,159
years ago. He was about one hundred and something, but

1439
01:08:57,439 --> 01:09:00,640
he's basically the guy that invented Michael Waves that sort

1440
01:09:00,640 --> 01:09:04,640
of stuff like that. Dude he worked in Area fifty

1441
01:09:04,680 --> 01:09:07,800
one and whatnot. But this book is all about the

1442
01:09:07,880 --> 01:09:13,319
coming age of HyperIntelligence and effectively goes AI to cyborgs

1443
01:09:13,319 --> 01:09:16,840
and what happens when cyborgs basically start to get as

1444
01:09:16,880 --> 01:09:19,600
smart as humans and effectively the end of the story

1445
01:09:19,680 --> 01:09:23,880
is do cyborgs need humans or do humans need cyborgs?

1446
01:09:24,960 --> 01:09:28,039
And then it kind of revolves around there. So that's

1447
01:09:28,079 --> 01:09:30,000
a really cool read. I think it's a short read

1448
01:09:30,039 --> 01:09:31,680
as well. You can get through it in about eight hours.

1449
01:09:32,159 --> 01:09:34,479
But that's one of my favorite autime books and overseen

1450
01:09:34,520 --> 01:09:35,760
by James Lovelock.

1451
01:09:36,000 --> 01:09:36,640
Speaker 1: Right on line.

1452
01:09:36,960 --> 01:09:37,239
Speaker 6: Cool.

1453
01:09:37,359 --> 01:09:41,600
Speaker 2: So this week I'm here fishing for picks because I

1454
01:09:41,760 --> 01:09:46,960
was out. We went out this last weekend. I made

1455
01:09:47,000 --> 01:09:49,000
a long weekend out of it. We were out in

1456
01:09:49,000 --> 01:09:53,399
the mountains, new cell phone service, no not even any

1457
01:09:53,479 --> 01:09:57,000
radio stations. And I came back and I can't log

1458
01:09:57,079 --> 01:10:01,000
into my computer, my MacBook. I have no idea what

1459
01:10:01,039 --> 01:10:05,640
the password is. For years, it's been muscle memory, and

1460
01:10:05,680 --> 01:10:07,439
every once in a while I'll change it by just

1461
01:10:07,560 --> 01:10:09,000
changing the keystrokes a bit.

1462
01:10:09,119 --> 01:10:10,000
Speaker 6: But it's not.

1463
01:10:11,640 --> 01:10:12,119
Speaker 1: Anything.

1464
01:10:12,159 --> 01:10:14,039
Speaker 2: It's not a word, it's not a phrase, it's not

1465
01:10:14,119 --> 01:10:17,800
anything written down. And it's also gone from my brain.

1466
01:10:18,079 --> 01:10:19,159
Speaker 1: Like whatever brain.

1467
01:10:19,000 --> 01:10:24,600
Speaker 2: Cells were responsible for holding that, they're freaking dead. So

1468
01:10:25,560 --> 01:10:29,600
I'm here fishing for picks. If anyone has any ideas, recommendations,

1469
01:10:29,680 --> 01:10:34,920
or strategies for storing those root credentials, I could use

1470
01:10:34,960 --> 01:10:37,159
a new one. So if you got any ideas, hit

1471
01:10:37,199 --> 01:10:39,439
me up on X or leave comments in the show.

1472
01:10:39,600 --> 01:10:40,399
Speaker 1: How everyone do that?

1473
01:10:40,960 --> 01:10:43,039
Speaker 3: I got a good one for you, kind of on

1474
01:10:43,079 --> 01:10:43,800
what you're using.

1475
01:10:44,560 --> 01:10:47,800
Speaker 4: I've got a ubkey that I plug in that automatically

1476
01:10:47,800 --> 01:10:50,960
decrypts my hard drive and it's got its own pin,

1477
01:10:51,079 --> 01:10:53,720
but it's very short, and so I just plug that

1478
01:10:53,800 --> 01:10:55,880
in and leave it there until my keychain and pull

1479
01:10:55,920 --> 01:10:57,960
it out, so it doesn't matter, Like I don't ever

1480
01:10:58,039 --> 01:11:00,640
need to remember what password is there must I lost.

1481
01:11:01,319 --> 01:11:03,640
I lost the key chain as well, and if I forget, I'll,

1482
01:11:03,880 --> 01:11:06,319
you know, update it right now and validate. Like every year,

1483
01:11:06,359 --> 01:11:08,680
I have a reminder said do you still remember the password?

1484
01:11:08,920 --> 01:11:10,520
Speaker 3: And tape it out? And if I don't, I'll set

1485
01:11:10,520 --> 01:11:11,039
a new one.

1486
01:11:11,600 --> 01:11:13,960
Speaker 6: Isn't that funny about passwords like that? Like you said it?

1487
01:11:14,039 --> 01:11:16,720
Speaker 5: Like I think if you ask anyone, most people, unless

1488
01:11:16,720 --> 01:11:17,560
they're really.

1489
01:11:17,319 --> 01:11:19,079
Speaker 6: Security conscious and there's some real.

1490
01:11:19,039 --> 01:11:24,119
Speaker 5: Security nuts out there, they will they will basically have

1491
01:11:24,199 --> 01:11:28,439
that same password that's muscle memory, but slightly older. Maybe

1492
01:11:28,439 --> 01:11:31,079
it's two numbers that they change every now and again. Maybe,

1493
01:11:31,239 --> 01:11:33,840
So I think majority of people do that because that's

1494
01:11:33,880 --> 01:11:34,439
just what we know.

1495
01:11:34,520 --> 01:11:37,520
Speaker 6: It's easy. Don't feel bad about that.

1496
01:11:37,600 --> 01:11:40,359
Speaker 5: But yeah, maybe maybe you know the fact that your

1497
01:11:40,399 --> 01:11:41,239
muscle memory is gone.

1498
01:11:41,239 --> 01:11:43,520
Speaker 6: I don't know, get that check.

1499
01:11:43,600 --> 01:11:47,079
Speaker 2: Maybe great, because that's the that's the big concerns like, Okay,

1500
01:11:47,199 --> 01:11:50,279
I forgot my password? What else have I forgotten that

1501
01:11:50,399 --> 01:11:51,600
I'm going to need something?

1502
01:11:54,159 --> 01:11:54,600
Speaker 3: This girl?

1503
01:11:55,399 --> 01:11:58,640
Speaker 4: Well, I mean the good news is well that unless

1504
01:11:58,640 --> 01:12:02,960
you actually damage those on the the pathways are still there.

1505
01:12:03,560 --> 01:12:04,760
Speaker 3: You need to re trigger them.

1506
01:12:04,800 --> 01:12:08,199
Speaker 4: But the connections in the neural net that's processing over

1507
01:12:08,199 --> 01:12:11,439
those neurons does degree, so you you can retrain it

1508
01:12:11,479 --> 01:12:15,239
by some going through whatever process you would normally go

1509
01:12:15,279 --> 01:12:17,039
through and try to get your key, your hands on

1510
01:12:17,079 --> 01:12:19,640
the right spot on the and may come back to you.

1511
01:12:20,640 --> 01:12:23,680
Speaker 6: Just go to Alon. He's got like a little plugging

1512
01:12:24,079 --> 01:12:24,600
the head. Now.

1513
01:12:28,399 --> 01:12:31,760
Speaker 2: I'm sure you guys have seen like the similar tweets

1514
01:12:31,760 --> 01:12:35,920
and stuff similar to where they'll tweets on at C

1515
01:12:35,960 --> 01:12:38,600
I a say, hey, I forgot this piece of information.

1516
01:12:38,720 --> 01:12:40,840
Can you guys send me your backup copy of it?

1517
01:12:43,119 --> 01:12:45,560
Tweet at CIA? Can you send me my passwords?

1518
01:12:45,960 --> 01:12:47,760
Speaker 6: Someone who will do it? Cool?

1519
01:12:47,800 --> 01:12:49,600
Speaker 1: All right, well, Anthony, thank you so much for being

1520
01:12:49,640 --> 01:12:50,039
on the show.

1521
01:12:50,159 --> 01:12:55,000
Speaker 6: Just really fun. Actually, I really enjoyed the conversation. Love it,

1522
01:12:55,039 --> 01:12:56,039
love it. It's been good.

1523
01:12:56,079 --> 01:12:58,640
Speaker 5: We've gone on different places and Warren, I reckon you

1524
01:12:58,680 --> 01:13:00,640
talked about ads. I reckon, you're gonna get ads about

1525
01:13:00,640 --> 01:13:01,800
black pudding now, so I just.

1526
01:13:01,960 --> 01:13:07,159
Speaker 2: Get ready to right for sure there is so I

1527
01:13:07,239 --> 01:13:11,159
can say that Amazon seems to have improved their ad

1528
01:13:13,000 --> 01:13:15,199
their ad platform a lot, because I used to have

1529
01:13:15,239 --> 01:13:17,039
this running joke with a really good friend of mine

1530
01:13:17,039 --> 01:13:21,720
where we would find less than professional things for sale

1531
01:13:21,840 --> 01:13:25,399
on ad on Amazon and send each other links, you know,

1532
01:13:25,560 --> 01:13:27,560
masking them or whatever, just to get the other person

1533
01:13:27,600 --> 01:13:31,119
to click on them. And then every time they logged

1534
01:13:31,119 --> 01:13:33,039
into Amazon in front of their wife or something, there

1535
01:13:33,039 --> 01:13:35,359
would be these really odd ads and it was just

1536
01:13:35,439 --> 01:13:39,800
great fun. But then I had another friend who I

1537
01:13:39,840 --> 01:13:42,199
think it's been about two years ago now, he was

1538
01:13:42,239 --> 01:13:46,479
having open heart surgery and we were really concerned about him,

1539
01:13:47,199 --> 01:13:48,960
and so I wanted to make sure that he was

1540
01:13:49,199 --> 01:13:52,279
in a good mood before he went to the hospital

1541
01:13:52,319 --> 01:13:54,960
that morning. So I went on Amazon and I ordered

1542
01:13:54,960 --> 01:13:57,640
a blow up sex doll, and then I took it

1543
01:13:57,680 --> 01:13:59,319
around to all my friends, and you know, we all

1544
01:13:59,319 --> 01:14:02,000
wrote stupid stuff on it, and then I filled it

1545
01:14:02,039 --> 01:14:05,880
with helium and tied it to his mailbox so that

1546
01:14:05,920 --> 01:14:09,560
whenever he his wife took him to the hospital that morning,

1547
01:14:09,960 --> 01:14:12,439
there was a blow up sex doll flying above his

1548
01:14:12,560 --> 01:14:18,039
house and and so I was really worried about the

1549
01:14:18,039 --> 01:14:22,359
next time I logged into Amazon after that, like thinking, Okay,

1550
01:14:22,479 --> 01:14:23,520
what are they going to show me?

1551
01:14:23,920 --> 01:14:28,640
Speaker 1: I didn't get anything, like not even like dolls.

1552
01:14:28,720 --> 01:14:30,960
Speaker 2: Yeah, here's other sex dolls it might be interested in,

1553
01:14:31,079 --> 01:14:33,159
or here's things to go with this. None of that stuff.

1554
01:14:33,159 --> 01:14:36,159
I didn't get anything. So they appear have done done

1555
01:14:36,199 --> 01:14:36,840
some work there.

1556
01:14:37,319 --> 01:14:39,600
Speaker 5: Yeah, well maybe is this one top of sex doll

1557
01:14:39,640 --> 01:14:41,680
on Amazon? And you boid it and that was it.

1558
01:14:41,680 --> 01:14:43,039
Speaker 1: It was the only one they sell.

1559
01:14:43,800 --> 01:14:46,119
Speaker 6: They sell, that's it you got. You got the premium model.

1560
01:14:46,119 --> 01:14:48,000
Speaker 2: Right, that's our top of the line model.

1561
01:14:50,439 --> 01:14:55,760
Speaker 6: Yeah anyway, yeah, awesome, all.

1562
01:14:55,720 --> 01:14:58,760
Speaker 2: Right, cool, Thanks everyone, Anthony, thank you, Warren, thank you,

1563
01:14:59,079 --> 01:15:01,640
and to all nurse thank you for listening and supporting

1564
01:15:01,680 --> 01:15:02,039
the show.

1565
01:15:02,439 --> 01:15:03,680
Speaker 6: And we'll see y'all next week.

