1
00:00:02,160 --> 00:00:05,280
Speaker 1: Here we go. Warren, how are you? Just jumping right

2
00:00:05,360 --> 00:00:07,120
into those personal questions?

3
00:00:08,359 --> 00:00:10,839
Speaker 2: I know, right? Who... I can't remember who the guest was.

4
00:00:12,119 --> 00:00:15,480
Speaker 3: We had a guest on, it was earlier this year,

5
00:00:16,239 --> 00:00:18,760
and I asked her because we do like a little

6
00:00:18,760 --> 00:00:20,760
prep before we start recording, and I say, you know,

7
00:00:20,839 --> 00:00:25,160
there's no, like, stump-the-chump questions. Everything should be laid

8
00:00:25,199 --> 00:00:27,679
back and conversational. And I asked her how she was

9
00:00:27,679 --> 00:00:29,239
doing and she's like, I thought you said there would

10
00:00:29,239 --> 00:00:30,399
be no hard questions.

11
00:00:30,559 --> 00:00:32,799
Speaker 2: I was like, oh my bad.

12
00:00:32,920 --> 00:00:35,079
Speaker 1: I had the right answer here. Actually, I was planning

13
00:00:35,119 --> 00:00:37,079
for this the whole day and then you went and

14
00:00:37,079 --> 00:00:40,960
you asked me, and I totally forgot what it was

15
00:00:41,159 --> 00:00:43,320
that I was going to say here. So.

16
00:00:43,679 --> 00:00:44,399
Speaker 2: You know, thanks for that.

17
00:00:44,920 --> 00:00:50,560
Speaker 3: Yeah, yeah, that's, that's why I'm here. Joining us back

18
00:00:50,600 --> 00:00:55,119
after a long break. And she just dropped offline. I'll

19
00:00:55,159 --> 00:00:58,880
just foreshadow it: Jillian is back as a co-host

20
00:00:58,920 --> 00:01:01,320
on the show, but she's had audio issues, so she'll

21
00:01:01,359 --> 00:01:05,120
be joining us later in the episode, but then also

22
00:01:05,640 --> 00:01:09,680
joining us today as our special guest, Brian Vallelunga from Doppler.

23
00:01:09,799 --> 00:01:10,560
Speaker 2: Brian, how are you?

24
00:01:10,480 --> 00:01:13,560
Speaker 4: I'm doing well. Great to be here, dude.

25
00:01:13,319 --> 00:01:17,040
Speaker 3: I'm excited to have you. Looking forward to this conversation

26
00:01:17,079 --> 00:01:21,680
because we're going to be talking about secrets management, which, man,

27
00:01:21,719 --> 00:01:24,120
there's so many different ways to deal with that, so

28
00:01:24,200 --> 00:01:27,799
I'm excited to hear what your take on that is.

29
00:01:27,879 --> 00:01:30,159
But before we get into that, could you give our

30
00:01:30,200 --> 00:01:33,439
listeners a little bit about your background and how you

31
00:01:33,519 --> 00:01:34,760
got to this point in your life?

32
00:01:35,319 --> 00:01:35,680
Speaker 4: Yeah?

33
00:01:35,719 --> 00:01:38,159
Speaker 5: Sure, So, I mean, I've always been a founder at

34
00:01:38,159 --> 00:01:42,359
heart. Doppler is, I think, startup number eight or nine. It's

35
00:01:42,400 --> 00:01:45,319
been quite a journey, a lot of a lot of

36
00:01:45,920 --> 00:01:49,799
companies that now looking back probably feel more like projects.

37
00:01:51,280 --> 00:01:53,959
Speaker 4: But yeah, a lot of failures along the way. So

38
00:01:54,159 --> 00:01:55,439
I'd say, just don't give up.

39
00:01:56,719 --> 00:01:59,400
Speaker 5: But I think for me, where Doppler really got started

40
00:01:59,480 --> 00:02:02,879
was, I was working on a

41
00:02:02,920 --> 00:02:04,640
side project at the time while I was at Uber

42
00:02:05,000 --> 00:02:07,120
in my free time, and it was a crypto machine

43
00:02:07,159 --> 00:02:11,159
learning marketplace, kind of like all the buzzwords in one

44
00:02:11,439 --> 00:02:16,199
and I could not... Funding was guaranteed, right? You would think,

45
00:02:16,319 --> 00:02:19,000
and you would think that customers were guaranteed. But turns

46
00:02:19,039 --> 00:02:22,439
out I was just a little too early and none

47
00:02:22,439 --> 00:02:24,400
of that was guaranteed at all. If anything, it was

48
00:02:24,520 --> 00:02:26,879
really freaking hard to get it off the ground. I

49
00:02:26,919 --> 00:02:28,840
felt like pushing a boulder up a hill. You move

50
00:02:28,919 --> 00:02:31,159
one foot forward and slip five feet back from exhaustion.

51
00:02:31,520 --> 00:02:32,240
Speaker 4: It just sucked.

52
00:02:33,919 --> 00:02:36,199
Speaker 5: Which is funny because, like, now there's a company

53
00:02:36,240 --> 00:02:38,800
today that just like is enormously successful doing like ninety

54
00:02:38,840 --> 00:02:41,759
nine percent of what we did. But that's just how

55
00:02:41,759 --> 00:02:43,879
it works, Like you gotta time it correctly, and we didn't.

56
00:02:45,159 --> 00:02:46,960
But yeah, I was working on that for a long

57
00:02:47,000 --> 00:02:53,120
time and I was just getting frustrated, and so I

58
00:02:53,159 --> 00:02:55,080
realized at some point I was never going to be

59
00:02:55,080 --> 00:02:56,800
able to get this thing off the ground, and so

60
00:02:56,879 --> 00:03:00,199
I started looking at the problems I was facing while

61
00:03:00,280 --> 00:03:05,159
running it and managing environment variables, API keys, database URLs,

62
00:03:05,159 --> 00:03:09,680
and other sensitive information that typically is used to like

63
00:03:09,680 --> 00:03:12,199
configure an application. Like, I kept struggling with it. It

64
00:03:12,240 --> 00:03:14,240
felt really weird to share with a contractor. At one

65
00:03:14,280 --> 00:03:15,879
point in time, it's like, oh, are they gonna walk

66
00:03:15,879 --> 00:03:18,560
away with my crypto keys that I can't like really change,

67
00:03:18,800 --> 00:03:20,879
and then all the customers are relying on us to

68
00:03:20,960 --> 00:03:24,319
keep secure. And then, like, we dropped crypto at one point,

69
00:03:24,319 --> 00:03:26,240
and we were using like Stripe for payment processing,

70
00:03:26,560 --> 00:03:28,479
and we had the Stripe production key in staging and

71
00:03:28,479 --> 00:03:30,400
the staging one in prod, and it took us a month to

72
00:03:30,400 --> 00:03:33,840
realize why we weren't doing any transactions. So we were

73
00:03:33,919 --> 00:03:36,400
up and operational and could not do a transaction,

74
00:03:37,039 --> 00:03:38,599
and it took us so long to figure that out. It

75
00:03:38,680 --> 00:03:40,879
was just like so many of these face-

76
00:03:40,960 --> 00:03:43,280
palm moments where it's like, damn, that sucked.

77
00:03:44,439 --> 00:03:44,919
Speaker 4: And so I.

78
00:03:46,560 --> 00:03:49,680
Speaker 5: Go back to, I go back to this, in San Francisco,

79
00:03:49,759 --> 00:03:51,960
to this dinner that had a whole bunch of founders

80
00:03:51,960 --> 00:03:56,960
and developers, and I'm like, hey, am I a shitty engineer,

81
00:03:56,960 --> 00:03:58,520
or is the world broken? I just can't tell anymore.

82
00:03:58,560 --> 00:04:03,199
You tell me: real problem? And they were like, yeah,

83
00:04:03,280 --> 00:04:05,240
this is a real problem. We're all struggling with it too.

84
00:04:05,960 --> 00:04:07,960
And that's what kind of got us started. And after

85
00:04:08,000 --> 00:04:10,560
that we started building the MVP, which took a couple

86
00:04:10,560 --> 00:04:13,439
of weeks, and on the fourth week we got some

87
00:04:13,479 --> 00:04:15,479
set of customers, and then we were off to the races,

88
00:04:16,800 --> 00:04:17,079
right on.

89
00:04:17,199 --> 00:04:19,319
Speaker 3: I mean, that's a quick turnaround to have to be

90
00:04:19,399 --> 00:04:21,519
onboarding customers on your fourth week.

91
00:04:22,240 --> 00:04:24,959
Speaker 5: Oh yeah, we were. We were like, speed is the

92
00:04:25,000 --> 00:04:27,079
only thing that gives us permission to survive here.

93
00:04:27,360 --> 00:04:30,160
If not, we're going to get slaughtered. And so we

94
00:04:30,240 --> 00:04:35,199
just moved incredibly quickly, like there were many many days

95
00:04:35,199 --> 00:04:37,519
with no sleep. I mean, Doppler was, you know,

96
00:04:37,839 --> 00:04:40,480
we were the first customer of our own product. Before

97
00:04:40,519 --> 00:04:42,639
we even had a dashboard or a UI. We would

98
00:04:42,720 --> 00:04:45,360
just use it with the API, and

99
00:04:45,480 --> 00:04:47,680
whenever I needed to change secrets, I would literally go

100
00:04:47,680 --> 00:04:50,160
into the database and edit the secrets before the, uh,

101
00:04:50,279 --> 00:04:51,160
the encryption step.

102
00:04:51,439 --> 00:04:56,519
Speaker 2: So it's like, right on, that's awesome.

103
00:04:56,560 --> 00:04:58,879
Speaker 3: So just to put things in context, you kind of

104
00:04:58,959 --> 00:05:03,439
hinted at it, but let's clarify what we mean by a secret.

105
00:05:03,519 --> 00:05:09,279
You mentioned, you know, API keys, database URLs, passwords. Obviously,

106
00:05:09,560 --> 00:05:13,199
what kind of things fall into the scope of secrets management?

107
00:05:13,759 --> 00:05:14,959
Speaker 4: Yeah, this is a great question.

108
00:05:16,480 --> 00:05:20,480
Speaker 5: I generally recommend it's anything that configures the application, and

109
00:05:20,519 --> 00:05:22,040
I know that can go even into a little bit

110
00:05:22,040 --> 00:05:23,600
of an extreme. Like, some people on the call could be like,

111
00:05:23,680 --> 00:05:26,639
what about my port? Is my port really a secret?

112
00:05:27,199 --> 00:05:30,079
And my answer here would be yes. And the reason

113
00:05:30,120 --> 00:05:32,240
why is because if you have to start making a

114
00:05:32,319 --> 00:05:35,439
decision about what's a secret and what's not, then you're

115
00:05:36,319 --> 00:05:38,319
you could be wrong. And the one time you're wrong.

116
00:05:38,360 --> 00:05:40,399
That could lead to the company dying or some like

117
00:05:40,399 --> 00:05:43,759
crazy event happening. So instead, I say, my view is,

118
00:05:43,800 --> 00:05:47,680
anytime you're configuring an application with a variable of some type,

119
00:05:48,240 --> 00:05:50,680
an API key or a port to a logging statement, whatever

120
00:05:50,680 --> 00:05:53,240
it is, treat all of it the same and treat

121
00:05:53,240 --> 00:05:55,600
it all as the most sensitive thing possible. That way,

122
00:05:55,600 --> 00:05:59,079
you're kind of offloading this decision making and you're saying, Okay,

123
00:05:59,120 --> 00:06:01,000
we're just gonna do the most secure thing by default,

124
00:06:01,199 --> 00:06:03,240
and so we're reducing our chances of a bad event

125
00:06:03,279 --> 00:06:06,800
happening, versus every engineer now kind of like

126
00:06:07,040 --> 00:06:08,759
randomly deciding what's a secret and what's not.

127
00:06:10,480 --> 00:06:12,959
Speaker 3: That's a good point because I think that adds to

128
00:06:13,199 --> 00:06:16,160
a lot of the complexity I've had managing secrets in

129
00:06:16,160 --> 00:06:20,000
the past, is defining, okay, we do our config variables

130
00:06:20,040 --> 00:06:22,439
over here, but then our secrets are over here. And

131
00:06:22,680 --> 00:06:25,439
you've just, in hindsight now that you said that, I

132
00:06:25,519 --> 00:06:29,480
just doubled my workload for really no good reason.

133
00:06:29,839 --> 00:06:32,519
Speaker 5: And usually they end up being like two completely different systems,

134
00:06:32,839 --> 00:06:34,879
and so now it's even harder to keep track of, like,

135
00:06:35,000 --> 00:06:37,120
which one went in which system. It's not like

136
00:06:37,199 --> 00:06:39,199
there's, like, a branching point that you

137
00:06:39,240 --> 00:06:41,319
can like track that and have a review around it.

138
00:06:42,680 --> 00:06:44,920
Speaker 4: And so a lot of times now you're just hoping.

139
00:06:44,759 --> 00:06:46,560
Speaker 5: And praying every engineer in the company kind of just

140
00:06:46,560 --> 00:06:48,839
makes the right decision at all times, which, like we're

141
00:06:48,839 --> 00:06:51,199
all humans. Most likely, at some point in time,

142
00:06:51,279 --> 00:06:53,160
someone is going to make a mistake. The problem is

143
00:06:53,199 --> 00:06:55,519
that mistake could cost the company big time, and

144
00:06:55,439 --> 00:06:56,199
Speaker 4: All of its users.

145
00:06:56,839 --> 00:07:00,720
Speaker 1: Yeah, so a really important point there, I think is

146
00:07:01,360 --> 00:07:04,120
really worth diving into for a second.

147
00:07:05,160 --> 00:07:08,000
So I really like this topic because I actually have

148
00:07:08,120 --> 00:07:11,639
a conference talk that I've been doing for a couple

149
00:07:11,639 --> 00:07:15,240
of years now specifically about teaching about secrets management and

150
00:07:15,399 --> 00:07:17,600
all the different strategies out there, and there's like quite

151
00:07:17,920 --> 00:07:21,360
a few of them: secrets managers, the cloud providers have

152
00:07:21,800 --> 00:07:24,920
key management services, there's like open source vaults and whatnot.

153
00:07:25,319 --> 00:07:28,560
And the sad story is the only real strategy that

154
00:07:29,120 --> 00:07:32,399
works in theory is something like bring your own keys,

155
00:07:32,480 --> 00:07:35,639
where things are secured by default, and the number of

156
00:07:35,680 --> 00:07:37,959
providers out there that support that... Like, I know we

157
00:07:38,040 --> 00:07:40,319
do at ours, like, let our customers bring their own keys,

158
00:07:40,319 --> 00:07:42,519
like the number of customers out there that are working

159
00:07:42,560 --> 00:07:45,519
with a provider, even like something as important as Stripe

160
00:07:45,720 --> 00:07:47,879
or even some of the cloud providers, they don't support that,

161
00:07:47,920 --> 00:07:50,040
Like you can't bring your own keys to any of

162
00:07:50,079 --> 00:07:52,879
those critical systems. It's sort of ridiculous that this is

163
00:07:52,879 --> 00:07:55,759
the world we live in, where only bad options are available.

164
00:07:57,199 --> 00:07:57,879
Speaker 4: Yeah, it is.

165
00:07:58,040 --> 00:08:00,439
Speaker 5: I mean, I would say, like, where are we really now?

166
00:08:01,160 --> 00:08:04,120
I agree with you that we're really living in

167
00:08:04,160 --> 00:08:08,120
a world of bad options. Doppler tries to and obviously

168
00:08:08,160 --> 00:08:10,800
I'm biased because I founded Doppler, I'll just call that out,

169
00:08:10,879 --> 00:08:15,199
just so anyone knows. But

170
00:08:15,319 --> 00:08:16,920
I do think that, like you can get to a

171
00:08:16,959 --> 00:08:17,720
pretty good state.

172
00:08:18,560 --> 00:08:19,759
Speaker 4: And then I would say where the.

173
00:08:19,720 --> 00:08:21,360
Speaker 5: World's going in a second, I can talk about

174
00:08:21,639 --> 00:08:26,120
is, like, the ideal state. But at least at Doppler.

175
00:08:26,120 --> 00:08:28,160
We went through a lot of iterations, like we rebuilt

176
00:08:28,199 --> 00:08:31,839
the product from scratch like three times, and what we

177
00:08:32,000 --> 00:08:35,000
realized is that you're never gonna be able to get

178
00:08:35,039 --> 00:08:38,039
developer adoption unless you make vegetables taste like candy,

179
00:08:38,080 --> 00:08:40,440
and vegetables being the security, candy being the

180
00:08:40,440 --> 00:08:41,440
developer productivity.

181
00:08:41,679 --> 00:08:42,960
Speaker 4: Developers are never gonna use it.

182
00:08:43,360 --> 00:08:45,720
Speaker 6: Developers are children in this scenario, because it...

183
00:08:51,480 --> 00:08:56,440
Speaker 4: So, right. But yeah, you gotta, you gotta... It's

184
00:08:56,440 --> 00:08:59,879
really around, like, are you taking energy from them

185
00:09:00,200 --> 00:09:02,840
Speaker 5: Or giving them energy, in a way, right? Like, developers

186
00:09:03,000 --> 00:09:04,960
want to stay in flow state. They want to be

187
00:09:05,039 --> 00:09:08,080
super productive, and if you can help them be more productive,

188
00:09:08,080 --> 00:09:10,639
if you can give them like a tailwind, then they

189
00:09:10,639 --> 00:09:13,039
will love your tool and they'll use it regardless of

190
00:09:13,080 --> 00:09:16,279
the security benefits. But if you give them headwinds, well

191
00:09:16,320 --> 00:09:20,039
they're gonna run away from you because they don't like that, understandably.

192
00:09:20,399 --> 00:09:23,039
And so for us, that was our trick to making

193
00:09:23,120 --> 00:09:26,960
it a tool that developers truly love, while the company gets

194
00:09:26,960 --> 00:09:29,840
security benefits, is give them about two hours a week

195
00:09:29,960 --> 00:09:33,279
of productivity gains, and there's a whole host of ways

196
00:09:33,279 --> 00:09:34,320
we do that. We don't have to get into that

197
00:09:34,360 --> 00:09:37,159
on this call, but like that was the fundamental reason

198
00:09:38,080 --> 00:09:40,440
that I think Doppler has become somewhat successful in

199
00:09:40,480 --> 00:09:43,440
helping developers want to manage their secrets better. It's not

200
00:09:43,480 --> 00:09:45,720
about forcing them to, it's about getting them to want

201
00:09:45,759 --> 00:09:49,600
to do it. And then I think I mentioned before,

202
00:09:49,799 --> 00:09:53,120
like where it's going in the ideal state, we're moving

203
00:09:53,120 --> 00:09:58,000
to this world of, like, identity-based, like, identity-

204
00:09:58,000 --> 00:10:01,399
based authentication. And so the idea here is, like,

205
00:10:01,559 --> 00:10:04,559
we have something like passkeys, where, like, you

206
00:10:04,600 --> 00:10:06,840
can scan your fingerprint or scan your face and get

207
00:10:06,840 --> 00:10:09,600
into another system with just your email address and that

208
00:10:09,679 --> 00:10:12,320
like fingerprint or that Face ID, and that's called passkey.

209
00:10:12,320 --> 00:10:14,799
It's getting rolled out across like all these major sites today.

210
00:10:15,320 --> 00:10:18,279
That's gonna start happening with secrets like API keys. Instead

211
00:10:18,279 --> 00:10:21,080
of having this API key, you'll have like this identity

212
00:10:21,120 --> 00:10:23,879
and so the fact that you're on your laptop or

213
00:10:23,879 --> 00:10:27,440
in an AWS cluster, you'll automatically be authenticated to Stripe

214
00:10:27,440 --> 00:10:30,399
in the future. And that's like a really powerful thing.

215
00:10:30,720 --> 00:10:32,960
And my guess is it will be like a little

216
00:10:33,000 --> 00:10:35,039
bit difficult to set up without like a proper secrets

217
00:10:35,039 --> 00:10:37,360
manager like Doppler, But once it is set up, it'll

218
00:10:37,360 --> 00:10:39,480
be seamless, like you're just in the machine and it

219
00:10:39,519 --> 00:10:42,600
just works. You're automatically authenticated and there's nothing else you

220
00:10:42,639 --> 00:10:44,919
need to do, and there's now no worry about actually

221
00:10:44,960 --> 00:10:48,200
losing these secrets, which can cause a breach or anything

222
00:10:48,240 --> 00:10:48,519
like that.

223
00:10:48,879 --> 00:10:50,279
Speaker 4: It'll just be secured by default.

224
00:10:50,639 --> 00:10:54,480
Speaker 3: Yeah, for sure, So that scenario sounds to me a

225
00:10:54,519 --> 00:10:57,840
lot like, or very similar to, the OpenID

226
00:10:58,320 --> 00:11:03,320
Connect platform. We use that a lot with our GitHub integration.

227
00:11:03,480 --> 00:11:07,519
So the way that works for anyone who's not familiar

228
00:11:07,519 --> 00:11:12,159
with it is: when your GitHub Action runs, it

229
00:11:12,200 --> 00:11:17,440
sends an authenticated request to your AWS environment. AWS validates

230
00:11:17,480 --> 00:11:20,759
that request, and then issues back a temporary set of

231
00:11:20,799 --> 00:11:24,600
credentials tied to a specific role that has been set up.

232
00:11:24,679 --> 00:11:29,600
So now your GitHub action has permissions inside your AWS

233
00:11:29,639 --> 00:11:32,519
account to do only the things that are permitted by

234
00:11:32,600 --> 00:11:35,960
that role and only within the timeframe set by those

235
00:11:36,039 --> 00:11:38,279
keys that were issued to it. And it is a

236
00:11:38,320 --> 00:11:40,159
pain in the, pain in the heinie to set up,

237
00:11:40,440 --> 00:11:44,200
but like, once it's set up, it's so seamless and

238
00:11:44,279 --> 00:11:48,039
smooth and you never have to you never have to

239
00:11:48,080 --> 00:11:50,120
lie to your security guys and say, oh, yeah, we

240
00:11:50,240 --> 00:11:51,120
rotated the keys.

241
00:11:51,240 --> 00:11:51,919
Speaker 2: Absolutely.

242
00:11:52,799 --> 00:11:55,320
Speaker 1: I really liked that you brought this up, actually, because

243
00:11:55,759 --> 00:11:59,039
the canonical name that you called it was OpenID Connect,

244
00:11:59,159 --> 00:12:02,679
which isn't wrong. And, like, GitHub and AWS jumped

245
00:12:02,720 --> 00:12:05,360
on OIDC as the acronym for it, for making

246
00:12:05,360 --> 00:12:09,159
this work, and that's not what OpenID Connect is at all,

247
00:12:09,240 --> 00:12:11,440
but that's what all these providers are calling it. The

248
00:12:11,480 --> 00:12:15,440
canonical term, ridiculously, is actually known as workload identities, and

249
00:12:15,480 --> 00:12:19,399
they're sourced from some trusted location. So in this case,

250
00:12:20,360 --> 00:12:23,519
you're having your AWS account trust GitHub as a source,

251
00:12:23,720 --> 00:12:27,519
and then your workload is getting its own temporary identity

252
00:12:27,559 --> 00:12:30,799
which can authenticate to your cloud provider. And yeah, for sure,

253
00:12:31,279 --> 00:12:33,840
this is the dream come true of bring your own keys.

254
00:12:34,159 --> 00:12:37,000
Workload identities being able to be passed from wherever

255
00:12:37,039 --> 00:12:40,039
you're running to whatever the third party application is that

256
00:12:40,080 --> 00:12:42,799
you're integrating with without ever needing to do anything else

257
00:12:42,840 --> 00:12:46,639
other than potentially sharing a public key exchange, because I

258
00:12:46,639 --> 00:12:48,879
think that's still going to need to be a manual

259
00:12:48,919 --> 00:12:50,360
step by everyone involved.

260
00:12:51,399 --> 00:12:53,320
Speaker 2: How does that line up with the future as you

261
00:12:53,360 --> 00:12:54,240
see it, Brian?

262
00:12:54,759 --> 00:12:59,559
Speaker 4: It's very close. I think we need some middle

263
00:13:01,039 --> 00:13:03,840
Speaker 5: player, and I think Doppler is going to be that

264
00:13:03,879 --> 00:13:06,799
middle player that kind of acts as like the network

265
00:13:06,879 --> 00:13:10,960
highway for everyone. So I'll give you an example. Let's

266
00:13:10,960 --> 00:13:13,840
just say you're in AWS. We're gonna make it super

267
00:13:13,879 --> 00:13:17,720
seamless to be connected from AWS to Doppler, and so

268
00:13:17,759 --> 00:13:20,120
anytime you're in AWS, you're automatically authenticated to

269
00:13:20,120 --> 00:13:22,360
Doppler with all the right roles, permissions and all that stuff.

270
00:13:23,639 --> 00:13:26,240
And then you're gonna also want that, because you're in

271
00:13:26,320 --> 00:13:29,320
Doppler, to be connected automatically to Stripe, to Twilio, to

272
00:13:29,360 --> 00:13:32,639
your database, to the ten thousand other services you're using.

273
00:13:33,120 --> 00:13:35,519
And so now you're gonna get this beautiful world where

274
00:13:35,519 --> 00:13:39,759
you don't have to configure all these individual relationships painfully

275
00:13:39,799 --> 00:13:43,679
in AWS. Instead, you'll set up one relationship with Doppler,

276
00:13:43,759 --> 00:13:45,639
and then Doppler will kind of give you a seamless

277
00:13:45,639 --> 00:13:48,000
way to like manage the rest, and so you kind

278
00:13:48,000 --> 00:13:50,120
of get this like one to one and then one

279
00:13:50,120 --> 00:13:51,840
to many, right? So, the fact that you're in

280
00:13:51,919 --> 00:13:54,320
AWS, you're now authenticated to Doppler, which means you're authenticated

281
00:13:54,320 --> 00:13:56,000
to Stripe, and it all just kind of like works

282
00:13:56,000 --> 00:13:58,360
seamlessly in the background. Like, Doppler kind of fades

283
00:13:58,399 --> 00:14:00,799
away, and it's like just operating completely in the

284
00:14:00,840 --> 00:14:04,440
shadows and giving you that authentication to all the services

285
00:14:04,480 --> 00:14:06,799
that you care about. And that's kind of like the

286
00:14:06,840 --> 00:14:10,559
world we see ourselves playing in in the future.

287
00:14:12,159 --> 00:14:13,679
Speaker 2: Right on. What...

288
00:14:15,320 --> 00:14:17,440
Speaker 6: You know, did anybody like ever use party lines or

289
00:14:17,440 --> 00:14:19,000
anything when they were a kid and you'd have to

290
00:14:19,039 --> 00:14:22,679
call in and they would direct you to where you

291
00:14:22,720 --> 00:14:24,240
needed to go. And then in the background, you know,

292
00:14:24,279 --> 00:14:26,240
like the nosy neighbor would be listening in on just

293
00:14:26,279 --> 00:14:28,519
everybody's call and would know everything that was going on.

294
00:14:30,240 --> 00:14:34,360
Speaker 1: That nosy, that nosy neighbor was my grandma. She was always

295
00:14:34,399 --> 00:14:35,120
on there listening.

296
00:14:35,879 --> 00:14:38,200
Speaker 6: Yeah, I had, I had some aunts like that too.

297
00:14:38,720 --> 00:14:41,039
They always knew what was up. And I bet

298
00:14:41,200 --> 00:14:42,240
Doppler knows what's up too.

299
00:14:43,759 --> 00:14:45,840
Speaker 5: We try as hard as possible not to know what's

300
00:14:45,919 --> 00:14:52,159
up, if that's what you're looking for.

301
00:14:52,799 --> 00:14:55,000
Speaker 2: Right. Nice save, Brian. Nice.

302
00:14:58,120 --> 00:15:00,679
Speaker 5: I think in, like, reality, what we'll end up doing is

303
00:15:00,720 --> 00:15:04,759
like, we will, like, offload that, because

304
00:15:04,759 --> 00:15:06,960
the whole relationship with Stripe or any of these services,

305
00:15:07,279 --> 00:15:09,000
it'll all just get encrypted at the end of the day, and

306
00:15:09,039 --> 00:15:10,639
we'll just try to make sure that we set up

307
00:15:10,639 --> 00:15:13,559
the encryption in a way that we can't possibly like

308
00:15:13,720 --> 00:15:16,720
read the output of, because, like, for everyone's benefit,

309
00:15:16,720 --> 00:15:18,639
we don't want to be able to see that data.

310
00:15:18,919 --> 00:15:21,360
Speaker 1: Yeah, I mean, I think the biggest problem today is

311
00:15:21,399 --> 00:15:26,279
that those services, those products, those applications that you really

312
00:15:26,320 --> 00:15:30,039
want to have protected, have some of the worst security

313
00:15:30,200 --> 00:15:33,240
ever. Like, you used Stripe as an example,

314
00:15:33,240 --> 00:15:35,279
and I'm not it's not the worst one there is,

315
00:15:35,360 --> 00:15:38,039
but you can't even like I can't even get SSO

316
00:15:38,200 --> 00:15:40,960
to work with Stripe. Like, they don't support it anymore.

317
00:15:41,000 --> 00:15:44,279
Like literally Stripe does not support SSO, And I'm just

318
00:15:44,320 --> 00:15:47,279
like how, like why not? Like I want that to

319
00:15:47,320 --> 00:15:50,559
be integrated into my, into my SIEM system. I want

320
00:15:50,559 --> 00:15:52,799
to be able to authenticate there and have two factor

321
00:15:52,840 --> 00:15:56,480
auth through whatever provider I'm utilizing, and it doesn't support it,

322
00:15:56,519 --> 00:15:59,279
so like they don't even support that baseline. So I

323
00:15:59,320 --> 00:16:01,879
have very little faith that in the very near future

324
00:16:02,080 --> 00:16:05,080
Stripe will support workload identities to

325
00:16:05,000 --> 00:16:06,879
Speaker 7: Be able to prove that. But I mean, if we

326
00:16:06,919 --> 00:16:07,840
get there, that would be great.

327
00:16:08,000 --> 00:16:10,279
Speaker 1: So far, the number of providers that really support this

328
00:16:10,480 --> 00:16:12,559
is just absolutely minuscule.

329
00:16:13,039 --> 00:16:15,600
Speaker 5: So I have two thoughts around that, because we put

330
00:16:15,639 --> 00:16:17,320
a lot of time into thinking about like how do

331
00:16:17,360 --> 00:16:21,279
we get the community to... My gut read on what's

332
00:16:21,279 --> 00:16:23,799
going to happen is eventually there's gonna be an open

333
00:16:23,840 --> 00:16:25,159
protocol that everyone can

334
00:16:25,039 --> 00:16:27,799
Speaker 4: Like, adhere to, that's like pretty easy to use.

335
00:16:28,240 --> 00:16:31,200
Speaker 5: Probably they'll be like SDKs and all the popular languages,

336
00:16:31,639 --> 00:16:34,440
and then some company or a band of secrets management

337
00:16:34,440 --> 00:16:36,519
companies are going to put out a leader board and

338
00:16:36,559 --> 00:16:38,879
they're going to show all the companies that do support it.

339
00:16:38,879 --> 00:16:41,039
It will be kind of like the SSO tax website,

340
00:16:41,159 --> 00:16:42,600
and you kind of just like want to be on

341
00:16:42,639 --> 00:16:43,960
that list or don't want to be on that list,

342
00:16:44,000 --> 00:16:46,360
depending on how it's structured. And my hope is that

343
00:16:46,399 --> 00:16:48,919
like that will create enough peer pressure. And then on

344
00:16:49,000 --> 00:16:50,080
top of that, I'd

345
00:16:49,919 --> 00:16:52,840
Speaker 4: bet a lot of money that, like, as

346
00:16:53,240 --> 00:16:56,000
Speaker 5: AI gets more and more sophisticated and can go from

347
00:16:56,039 --> 00:16:58,519
like being a copilot to, like, a full-blown engineer,

348
00:16:59,039 --> 00:17:01,559
like AI engineers. And I'm not talking about humans that

349
00:17:01,559 --> 00:17:03,840
are using AI to engineer. I'm talking about an AI

350
00:17:03,960 --> 00:17:08,000
that is fully capable of being an engineer. Those those

351
00:17:08,039 --> 00:17:11,400
AI agents are going to want identity based authentication and

352
00:17:11,440 --> 00:17:15,720
they will push aggressively for it, and I bet that's gonna,

353
00:17:15,799 --> 00:17:19,039
like those two things together will probably lead over time

354
00:17:19,359 --> 00:17:21,920
to like all the big players eventually like doing this

355
00:17:22,200 --> 00:17:23,440
just because they have to.

356
00:17:24,200 --> 00:17:29,599
Speaker 1: I love your optimism. Like, it's just so great.

357
00:17:29,880 --> 00:17:32,680
Speaker 7: I mean, because, like, today we see so many

358
00:17:32,519 --> 00:17:35,759
Speaker 1: People putting credentials in plain text, let alone whatever else

359
00:17:35,799 --> 00:17:37,720
we're going to call a secret. So getting them to,

360
00:17:37,799 --> 00:17:39,880
you know, go down a different path is a huge

361
00:17:39,960 --> 00:17:42,599
challenge there. And the other problem is like there is

362
00:17:42,640 --> 00:17:46,480
a bit of a standard already around the providers where they

363
00:17:46,440 --> 00:17:46,799
Speaker 5: Can use it.

364
00:17:46,839 --> 00:17:49,960
Speaker 1: I think everyone could stand up any sort of open id,

365
00:17:50,559 --> 00:17:53,000
but it doesn't give them a lot of like, it

366
00:17:53,000 --> 00:17:55,359
doesn't give them money, like, it doesn't help the businesses

367
00:17:55,400 --> 00:17:57,759
grow to support that as an option, unless some bigger

368
00:17:57,839 --> 00:18:00,680
players say, yeah, if you want us to

369
00:18:00,920 --> 00:18:03,079
use you, you have to support this. And I think

370
00:18:03,119 --> 00:18:04,680
the great example of that is you brought up a

371
00:18:04,720 --> 00:18:07,839
SSO tax, which I think does the wrong thing. I

372
00:18:07,880 --> 00:18:11,559
feel like it's positive reinforcement for all these companies that

373
00:18:11,599 --> 00:18:13,480
do the wrong thing. I see this leaderboard of all

374
00:18:13,559 --> 00:18:17,000
these companies, and now I read that name. There's

375
00:18:17,039 --> 00:18:20,240
something known in psychology called preferential attachment. I

376
00:18:20,319 --> 00:18:24,559
see this company listed there, and I forget why, Like

377
00:18:24,599 --> 00:18:27,200
I don't remember the SSO tax, but you remember the name.

378
00:18:27,279 --> 00:18:29,759
And then months later I got a call from one

379
00:18:29,799 --> 00:18:33,400
of their terrible salespeople trying to harass me on my

380
00:18:33,519 --> 00:18:35,759
personal phone number, saying, hey, do you want to buy

381
00:18:35,759 --> 00:18:38,599
our product? But I heard that company name before, so

382
00:18:38,920 --> 00:18:41,519
now I'm tempted to actually buy their product. Even then,

383
00:18:41,559 --> 00:18:43,880
the reason I know about them is because they were

384
00:18:43,920 --> 00:18:47,039
listed on a site that says they did the wrong thing.

385
00:18:47,240 --> 00:18:51,359
We see the same thing happen with vulnerabilities and data leaks.

386
00:18:51,599 --> 00:18:53,960
You learn about companies because they had a data leak,

387
00:18:54,000 --> 00:18:56,920
and that just actually enforces how great they must be,

388
00:18:56,920 --> 00:18:58,839
because then they send out those emails saying, oh, we're

389
00:18:58,880 --> 00:19:01,920
committed to security, we're going to change things and uh,

390
00:19:02,079 --> 00:19:04,480
and that gives you. That gives you some confidence in

391
00:19:04,759 --> 00:19:07,880
this particular company. So I think there's a special message

392
00:19:07,880 --> 00:19:09,960
that has to be sent that actually does the right thing.

393
00:19:10,279 --> 00:19:13,119
Having a group of standards can work, But if the

394
00:19:13,119 --> 00:19:16,119
companies haven't taken it on themselves to actually implement these,

395
00:19:16,519 --> 00:19:18,480
I just don't see that it's magically going

396
00:19:18,559 --> 00:19:20,000
to happen in the next ten years.

397
00:19:20,359 --> 00:19:24,440
Speaker 5: Maybe another way to I mean I used to think

398
00:19:24,480 --> 00:19:26,440
the same thing about passkeys, but now they're getting rolled out.

399
00:19:27,640 --> 00:19:31,160
So but on the flip side, also, this could be

400
00:19:31,160 --> 00:19:33,440
a place where, like, more compliance comes in, like they

401
00:19:33,440 --> 00:19:36,279
add it to the compliance bodies, like if you issue

402
00:19:36,279 --> 00:19:40,119
out secrets, you have to support these methods.

403
00:19:43,720 --> 00:19:47,440
Speaker 3: So let's take a quick step back and talk about

404
00:19:47,519 --> 00:19:52,640
like what are some of the other common strategies that

405
00:19:52,720 --> 00:19:55,599
you've seen for managing secrets, just to help people put

406
00:19:55,680 --> 00:20:00,559
into perspective or like recognize themselves as owners of this

407
00:20:00,680 --> 00:20:03,359
problem even though they may not recognize themselves as that

408
00:20:03,400 --> 00:20:03,960
group yet.

409
00:20:04,880 --> 00:20:05,240
Speaker 4: Yeah.

410
00:20:05,319 --> 00:20:08,720
Speaker 5: So typically the way I see it is they're coming

411
00:20:08,720 --> 00:20:11,440
from a couple of different worlds. One is they're using

412
00:20:11,519 --> 00:20:14,079
something called an env file, or like a plain

413
00:20:14,119 --> 00:20:16,200
text file that has a whole bunch of secrets

414
00:20:16,200 --> 00:20:18,680
on it. That's bad for a number of reasons. One,

415
00:20:19,240 --> 00:20:21,720
those secrets are unencrypted on disk, so like they're very

416
00:20:21,759 --> 00:20:25,079
easy to like steal. Two, they are now shared in

417
00:20:25,079 --> 00:20:28,079
insecure ways like Slack and email, again another breach point

418
00:20:28,359 --> 00:20:32,039
that can happen. And three, they have to be manually coordinated.

419
00:20:32,079 --> 00:20:33,599
Like imagine if you have a Word doc and I

420
00:20:33,640 --> 00:20:35,880
have a Word doc and everyone else on the team

421
00:20:35,920 --> 00:20:38,359
has the same Word doc and we're all editing in real

422
00:20:38,400 --> 00:20:40,559
time before Google docs or Notion, and we have to

423
00:20:40,599 --> 00:20:41,440
all keep it in sync.

424
00:20:41,599 --> 00:20:44,079
Speaker 4: That's like quite hard.

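The plaintext env-file pattern described above can be sketched minimally. This is an illustrative, stdlib-only example with hypothetical keys and values; the point is that everything in such a file sits unencrypted on disk, readable by anyone with file access, with no permission model or audit trail.

```python
# Minimal sketch of the plaintext env-file pattern -- hypothetical
# keys and values, standard library only.

ENV_TEXT = """\
# typical plaintext env file: anyone who can read the file has the keys
DATABASE_URL=postgres://admin:hunter2@db.internal:5432/app
STRIPE_API_KEY=sk_live_example_not_real
"""

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        result[key] = value
    return result

secrets_on_disk = parse_env(ENV_TEXT)
# Every secret is recoverable with a plain file read -- no encryption,
# no access control, no record of who looked.
print(sorted(secrets_on_disk))
```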
425
00:20:45,079 --> 00:20:47,559
Speaker 5: So another thing we see is like they're only using

426
00:20:47,599 --> 00:20:50,319
cloud providers, which generally works for production, but then they're

427
00:20:50,400 --> 00:20:55,480
using like these env files for development, and like where

428
00:20:55,519 --> 00:20:58,160
the problem really gets bad is like they can't answer

429
00:20:58,200 --> 00:21:01,279
these like four fundamental questions. The first question is like

430
00:21:01,319 --> 00:21:02,880
do I know where all my secrets are?

431
00:21:03,039 --> 00:21:03,240
Speaker 4: These?

432
00:21:03,240 --> 00:21:06,519
Speaker 5: You cannot possibly protect your secrets unless you know where

433
00:21:06,559 --> 00:21:07,200
all of them are.

434
00:21:07,680 --> 00:21:09,400
Speaker 4: Do I have a point? Right?

435
00:21:10,440 --> 00:21:12,880
Speaker 5: Do I have the ability to like access control these

436
00:21:12,920 --> 00:21:16,759
secrets because if you can't put permissions around them, again,

437
00:21:16,799 --> 00:21:19,640
you can't protect them. Very hard to do that when

438
00:21:19,640 --> 00:21:21,680
everyone's just sharing files back and forth. There's no real

439
00:21:21,680 --> 00:21:22,599
permission model.

440
00:21:22,400 --> 00:21:23,440
Speaker 6: There, right?

441
00:21:24,039 --> 00:21:24,400
Speaker 4: Three?

442
00:21:24,440 --> 00:21:27,559
Speaker 5: Do I have logging and auditing around every action that's taken,

443
00:21:27,640 --> 00:21:28,519
including reads?

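The "log and audit every action, including reads" point can be sketched as a toy in-memory store that records who touched which secret and when. This is a hypothetical illustration, not how any particular product works; real secrets managers persist such records to a log pipeline.

```python
# Hypothetical sketch: a secret store that audits every action,
# including reads. Names and structure are illustrative only.
import datetime

class AuditedStore:
    def __init__(self):
        self._secrets = {}
        self.audit_log = []  # (timestamp, actor, action, secret name)

    def _record(self, actor, action, name):
        ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self.audit_log.append((ts, actor, action, name))

    def put(self, actor, name, value):
        self._record(actor, "write", name)
        self._secrets[name] = value

    def get(self, actor, name):
        self._record(actor, "read", name)  # reads are logged too
        return self._secrets[name]

store = AuditedStore()
store.put("alice", "DB_PASSWORD", "hunter2")
store.get("bob", "DB_PASSWORD")
# The log now answers "who has read this secret, and when?"
print([(actor, action, name) for _, actor, action, name in store.audit_log])
```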
444
00:21:29,119 --> 00:21:29,319
Speaker 2: Right?

445
00:21:29,599 --> 00:21:31,599
Speaker 5: Do I know every time someone has read a secret

446
00:21:31,640 --> 00:21:34,799
from development of production and everything in between? You'd be

447
00:21:34,839 --> 00:21:37,799
surprised how many times production credentials end up

448
00:21:37,799 --> 00:21:40,240
in a development environment. Right? So you need to track

449
00:21:40,240 --> 00:21:42,920
development too, Like you'll see this all the time where

450
00:21:42,920 --> 00:21:45,319
like developers are like, oh, production's having an outage, I'm just

451
00:21:45,319 --> 00:21:47,920
gonna like create myself a prod env file

452
00:21:48,160 --> 00:21:51,079
and test production locally real quick. Now you got prod

453
00:21:51,119 --> 00:21:52,680
secrets on your laptop, what are you gonna do?

454
00:21:52,960 --> 00:21:53,119
Speaker 4: Like?

455
00:21:53,720 --> 00:21:55,200
Speaker 5: And then the last one is when you do have

456
00:21:55,240 --> 00:21:58,039
a data breach because something went bad, can you stop

457
00:21:58,079 --> 00:22:01,599
it very quickly, like on the order of

458
00:22:01,640 --> 00:22:03,960
like minutes, Like if you think about how fast data

459
00:22:03,960 --> 00:22:06,119
can leave your system, right, like data can leave your

460
00:22:06,119 --> 00:22:09,200
system at like gigabits per second, right from AWS. Well,

461
00:22:09,400 --> 00:22:10,000
if that's the case,

462
00:22:10,119 --> 00:22:10,960
Speaker 4: A couple of minutes of

463
00:22:11,559 --> 00:22:13,160
Speaker 5: A data breach is actually a really really big deal

464
00:22:13,200 --> 00:22:16,759
at this point. Hours is like disastrous. And

465
00:22:16,799 --> 00:22:19,400
so if you can't answer those four questions confidently, not

466
00:22:19,480 --> 00:22:20,839
like oh yeah, I can run a piece of paper,

467
00:22:20,839 --> 00:22:22,720
but like I am extremely confident, I'd bet my job

468
00:22:22,720 --> 00:22:25,039
and my life on answering these questions confidently,

469
00:22:25,079 --> 00:22:27,039
you have a problem and now it's on you to

470
00:22:27,039 --> 00:22:27,400
fix it.

471
00:22:27,880 --> 00:22:30,119
Speaker 3: I like that, that's pretty clear cut. I will not

472
00:22:30,200 --> 00:22:32,480
go on record answering those questions.

473
00:22:34,160 --> 00:22:36,079
Speaker 6: Definitely not betting my life there.

474
00:22:36,039 --> 00:22:40,799
Speaker 1: I think that may be too much for me. I mean, since

475
00:22:41,000 --> 00:22:45,319
I guess that leaves me. And uh, I run a

476
00:22:45,319 --> 00:22:49,119
technology company that focuses on providing login and access

477
00:22:49,119 --> 00:22:53,720
control for applications for our customers, right, and so

478
00:22:53,759 --> 00:22:57,400
I feel like secrets is super important for both

479
00:22:57,519 --> 00:23:01,319
us and our customers there. And we're pretty extensively

480
00:23:01,440 --> 00:23:06,279
using complex crypto and things like Key Management Service, KMS,

481
00:23:06,279 --> 00:23:08,680
in AWS and some of the other cloud providers to

482
00:23:08,799 --> 00:23:12,759
do that application-side encryption there, and

483
00:23:12,839 --> 00:23:16,000
I definitely don't trust a lot of people to get

484
00:23:16,039 --> 00:23:17,319
that even close to correct.

485
00:23:17,880 --> 00:23:18,559
Speaker 4: Yeah, it's hard.

486
00:23:18,720 --> 00:23:22,759
Speaker 5: It's really really hard, especially if you're like building it. I

487
00:23:23,039 --> 00:23:25,079
found that, like, again, you gotta

488
00:23:25,119 --> 00:23:27,240
get the productivity part right, like you gotta make

489
00:23:27,279 --> 00:23:29,319
it super simple and easy to use, where there's no

490
00:23:29,319 --> 00:23:31,400
footgun moments, you can't shoot yourself in the foot. The

491
00:23:32,039 --> 00:23:34,359
default easy path has to be the right path. And

492
00:23:34,400 --> 00:23:36,480
that's that's probably like the hardest part that I found

493
00:23:36,519 --> 00:23:39,279
with building this. It's like, at the end of day,

494
00:23:40,359 --> 00:23:42,279
at least for us, it didn't feel like building the

495
00:23:42,319 --> 00:23:44,720
tool that was secure was the hard part. It was

496
00:23:45,200 --> 00:23:48,559
like at the foundational level, like the infrastructure layer,

497
00:23:48,920 --> 00:23:51,319
but building it so that the end user

498
00:23:51,359 --> 00:23:54,279
experience for the developer is secure by default, that part

499
00:23:54,319 --> 00:23:58,519
was actually really hard to get right, also because

500
00:23:58,519 --> 00:24:00,839
like every developer has like a different setup in mind, and

501
00:24:00,880 --> 00:24:03,000
so like now you have to like support like a

502
00:24:03,160 --> 00:24:06,880
very wide variety of different setups and they all have

503
00:24:06,960 --> 00:24:09,720
to be secured by default, and the most secure thing

504
00:24:09,759 --> 00:24:13,240
has to be the default thing, and so like talking

505
00:24:13,240 --> 00:24:15,200
a little bit about like what a solution would look like,

506
00:24:15,480 --> 00:24:16,960
even if you're not gonna use it, you don't have to

507
00:24:17,000 --> 00:24:19,039
use Doppler. That's not my job here is to like

508
00:24:19,079 --> 00:24:20,839
sell you guys on that. My job here is to

509
00:24:20,839 --> 00:24:22,279
sell you guys on the problem, and then you guys

510
00:24:22,279 --> 00:24:24,039
should go solve the problems so we don't keep having

511
00:24:24,119 --> 00:24:25,400
data breaches because the

512
00:24:25,559 --> 00:24:28,480
Speaker 4: Number is only rising. It's rising fast. It's like

513
00:24:28,519 --> 00:24:29,839
a high growth stock right now.

514
00:24:33,759 --> 00:24:36,319
Speaker 5: What you need is you need a service that centralizes

515
00:24:36,839 --> 00:24:39,400
or a system that centralizes all your secrets in one.

516
00:24:39,240 --> 00:24:42,200
Speaker 4: Place that's encrypted, super super important.

517
00:24:42,759 --> 00:24:44,799
Speaker 5: You need a system that does not encourage secrets to

518
00:24:44,799 --> 00:24:47,759
be on disk. So if you're reading

519
00:24:47,759 --> 00:24:49,559
from the environment, you immediately clean that up from the

520
00:24:49,599 --> 00:24:53,480
environment after your like frameworks and libraries have booted up.

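That "read it from the environment, then immediately clean it up" step can be sketched like this. The variable names are hypothetical; the pattern is simply pulling the secret into process memory at boot and then scrubbing the environment so later code, child processes, and diagnostics don't see it.

```python
# Hedged sketch of "read from the environment, then clean it up".
# The secret name and value here are illustrative stand-ins.
import os

# Stands in for a secret injected by a secrets manager or the platform.
os.environ["API_KEY"] = "example-not-real"

def load_secret(name):
    """Read a secret from the environment, then scrub the environment."""
    value = os.environ[name]
    # Remove it so child processes and later env dumps can't see it.
    del os.environ[name]
    return value

api_key = load_secret("API_KEY")  # held only in process memory now
```

A usage note: frameworks that re-read configuration from the environment after startup would break under this pattern, which is why the cleanup happens only after frameworks and libraries have booted.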
521
00:24:54,720 --> 00:24:57,200
You need a service that has access controls and auditing

522
00:24:57,200 --> 00:25:00,319
built in, with a really strong audit story there. Ideally

523
00:25:00,400 --> 00:25:03,880
can also push those logs into a centralized logging platform

524
00:25:03,920 --> 00:25:06,000
like Datadog, Splunk, and Sumo Logic so you can

525
00:25:06,079 --> 00:25:07,400
do more robust analysis on it.

526
00:25:07,480 --> 00:25:08,880
Speaker 2: Set up alerting.

527
00:25:08,880 --> 00:25:11,960
Speaker 5: Need the ability to automatically push the secrets when they

528
00:25:12,039 --> 00:25:14,880
change into your infrastructure, so there can't be a human

529
00:25:14,920 --> 00:25:16,920
step involved, or like, let me file a ticket and

530
00:25:16,920 --> 00:25:19,599
wait three weeks for this ticket to get closed, because

531
00:25:19,599 --> 00:25:21,880
your code's going to get rolled out automatically. So by default,

532
00:25:22,000 --> 00:25:24,720
even without the security problem, you have an outage guaranteed

533
00:25:24,720 --> 00:25:27,440
because you have this race condition of a human filing

534
00:25:28,000 --> 00:25:32,000
fulfilling a ticket versus GitHub automatically rolling that out to

535
00:25:32,039 --> 00:25:34,759
AWS or your infrastructure. Right, they're like one's going to

536
00:25:34,799 --> 00:25:36,440
move a lot faster than the other in the order

537
00:25:36,480 --> 00:25:40,680
of like seconds versus like days and weeks. So you

538
00:25:40,799 --> 00:25:42,680
have to have it automatically rolled out, just like your

539
00:25:42,680 --> 00:25:44,920
code is. And ideally it has to be rolled out

540
00:25:45,039 --> 00:25:47,359
before your code does, so your code has it to

541
00:25:47,400 --> 00:25:50,640
consume when it boots up. And the last thing is

542
00:25:50,680 --> 00:25:52,920
you've got to be able to rotate your credentials, either

543
00:25:53,000 --> 00:25:57,079
automatically or manually very quickly. If you have a data breach,

544
00:25:57,359 --> 00:25:59,799
you cannot learn how to firefight and firefight at the

545
00:25:59,839 --> 00:26:00,400
same time.

546
00:26:00,480 --> 00:26:03,839
Speaker 4: It's too slow. You have to have like your procedures down.

547
00:26:03,960 --> 00:26:06,079
Speaker 5: You gotta know what buttons you're gonna click, and you're

548
00:26:06,119 --> 00:26:08,000
gonna have to know how long it takes you. And

549
00:26:08,079 --> 00:26:10,720
if the data breach happens and you're going

550
00:26:10,799 --> 00:26:12,839
to take minutes to hours to solve this.

551
00:26:12,799 --> 00:26:16,680
Speaker 4: Problem, you are in a really really bad spot. So

552
00:26:17,160 --> 00:26:18,200
if you can do those things,

553
00:26:18,920 --> 00:26:21,759
Speaker 5: Whatever tool you use, you'll be in good shape. Or

554
00:26:21,759 --> 00:26:24,839
did I miss anything? Uh.

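The rotation requirement above, being able to swap a credential in minutes, can be sketched generically. This is a hypothetical in-memory model, not any vendor's API: mint the replacement, briefly accept old and new in parallel, re-point consumers, then revoke the old secret, so rotation is a rehearsed, scriptable action rather than a scramble.

```python
# Illustrative credential-rotation sketch against a hypothetical
# in-memory "service". Only the pattern matters, not the storage.
import secrets

valid_tokens = set()   # tokens the service currently accepts
current_token = None   # token clients are configured with

def rotate():
    """Mint a new credential, overlap briefly, then revoke the old one."""
    global current_token
    new = secrets.token_urlsafe(32)          # 1. mint the replacement
    valid_tokens.add(new)                    # 2. old and new valid in parallel
    old, current_token = current_token, new  # 3. re-point consumers
    if old is not None:
        valid_tokens.discard(old)            # 4. revoke the old secret
    return new

first = rotate()
second = rotate()  # after rotation, only the newest token is accepted
```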
555
00:26:24,599 --> 00:26:27,000
Speaker 1: Not at all, actually, but the thing that came to

556
00:26:27,039 --> 00:26:30,200
mind is like I'm wondering about. The corollary is like,

557
00:26:30,240 --> 00:26:32,160
you know what other solutions are there that are out

558
00:26:32,160 --> 00:26:33,960
there where you're playing in the same space, and I'm

559
00:26:33,960 --> 00:26:36,640
trying to understand like whether or not it's closer to

560
00:26:36,680 --> 00:26:39,319
say a secrets manager that one of the cloud providers offer,

561
00:26:39,559 --> 00:26:44,400
or OpenBao or HashiCorp's Vault are, say, you

562
00:26:44,400 --> 00:26:47,519
know your competitors or whether or not there's something specific here.

563
00:26:49,319 --> 00:26:53,039
Speaker 5: Yeah, So with the secrets managers from the cloud providers,

564
00:26:53,440 --> 00:26:56,519
those are actually partners of ours. So almost all of

565
00:26:56,519 --> 00:26:59,279
our customers are using AWS Secrets Manager and Doppler

566
00:26:59,319 --> 00:27:01,559
and kind of like the mental model there is

567
00:27:01,680 --> 00:27:04,680
kind of similar to GitHub or GitLab, where a developer

568
00:27:04,720 --> 00:27:08,680
pulls their code down, changes it on their laptop, then

569
00:27:08,680 --> 00:27:11,000
pushes it back up, and then when it's ready to

570
00:27:11,000 --> 00:27:14,200
get released to production, then GitHub will then

571
00:27:14,200 --> 00:27:17,039
push it to AWS, which has its own repository store, and

572
00:27:16,920 --> 00:27:19,839
Speaker 4: That gets deployed. Doppler does the same thing.

573
00:27:19,880 --> 00:27:22,480
Speaker 5: We'll like allow you to pull those secrets down, change them,

574
00:27:22,480 --> 00:27:24,319
and then when you're ready to push them up into production,

575
00:27:24,599 --> 00:27:26,319
we will push it to AWS Secrets Manager

576
00:27:26,599 --> 00:27:31,119
or its Parameter Store or whatever, or on Azure, GCP,

577
00:27:31,559 --> 00:27:33,799
and then that will then push it into your infrastructure.

578
00:27:33,960 --> 00:27:35,720
And so it's kind of like the same model there.

579
00:27:37,000 --> 00:27:41,160
So we are very closely partnered with the cloud providers

580
00:27:41,640 --> 00:27:45,519
and most infrastructure companies we support, like I think forty

581
00:27:45,559 --> 00:27:49,880
different infrastructure companies right now, and then on the like

582
00:27:49,920 --> 00:27:52,279
even Kubernetes we support, which isn't a company, it's an

583
00:27:53,240 --> 00:27:57,279
open source initiative, but we support that pretty extensively too,

584
00:27:58,400 --> 00:28:00,920
And then on OpenBao, I would say we're more

585
00:28:00,960 --> 00:28:06,240
closely competitive, but I would argue that it really solves

586
00:28:06,279 --> 00:28:10,680
different problems than we do, Like we're looking specifically to

587
00:28:10,759 --> 00:28:16,319
solve secrets regarding API keys, database credentials, and other things

588
00:28:16,319 --> 00:28:20,680
that configure your application. I would say OpenBao is

589
00:28:20,720 --> 00:28:23,720
more like a generic key store and like you can

590
00:28:23,759 --> 00:28:26,920
put anything sensitive in it, like a photo of a passport,

591
00:28:27,079 --> 00:28:30,079
or social Security number or some medical records, not things

592
00:28:30,079 --> 00:28:33,319
that are like deeply connected to your application and your infrastructure.

593
00:28:34,200 --> 00:28:36,359
I would say they're missing a lot of the layers

594
00:28:36,519 --> 00:28:38,519
going back to that, like make vegetables taste like candy.

595
00:28:38,640 --> 00:28:41,400
They squarely have the vegetables, but there is no candy involved,

596
00:28:41,759 --> 00:28:45,079
and without the candy, you're not really solving the problem.

597
00:28:45,240 --> 00:28:47,279
And so I would say it's kind of like Dropbox

598
00:28:47,359 --> 00:28:51,759
versus Amazon S3. They both do storage, but like

599
00:28:51,759 --> 00:28:53,920
one is like here's some APIs, go figure out

600
00:28:53,920 --> 00:28:55,119
what to do with it. The other is like a

601
00:28:55,160 --> 00:28:57,480
whole end-to-end experience. It makes it super easy to

602
00:28:57,519 --> 00:29:01,720
securely store and manage secrets. And that's where I think we

603
00:29:02,039 --> 00:29:04,720
fit in. So while they're competitive, I'd say we're almost

604
00:29:04,720 --> 00:29:08,039
operating in different markets, so like we don't really see

605
00:29:08,039 --> 00:29:09,000
it as hyper competitive.

606
00:29:09,319 --> 00:29:09,839
Speaker 2: That makes sense.

607
00:29:09,880 --> 00:29:11,519
Speaker 3: One of the things you keep coming back to that

608
00:29:11,599 --> 00:29:15,039
I really like, it really resonates with me, is making

609
00:29:15,039 --> 00:29:18,400
the vegetables taste like candy and making the right thing

610
00:29:18,559 --> 00:29:24,000
the default thing. I just think that works so well

611
00:29:24,000 --> 00:29:28,480
because like your target customers, not just yours for Doppler,

612
00:29:28,519 --> 00:29:31,559
but like mine at my organization. You know, I support

613
00:29:31,599 --> 00:29:36,160
the engineering teams, like I try to put in just

614
00:29:36,240 --> 00:29:39,440
guardrails so that they can do whatever they need to

615
00:29:39,480 --> 00:29:41,759
do to get their job done. But then I've got

616
00:29:41,799 --> 00:29:45,160
guardrails in place so they don't accidentally do something that

617
00:29:45,200 --> 00:29:47,720
they didn't want to do, and so they by default

618
00:29:47,720 --> 00:29:49,480
they end up doing the right thing even when they

619
00:29:49,519 --> 00:29:51,200
don't know what the right thing is.

620
00:29:51,640 --> 00:29:53,440
Speaker 2: And I think that's so important to.

621
00:29:53,440 --> 00:29:58,759
Speaker 3: Get to get that right to have adoption so that

622
00:29:59,279 --> 00:30:02,759
you don't have to add this extra layer of technical

623
00:30:03,319 --> 00:30:07,799
expertise and competency on your developers to do the job

624
00:30:07,880 --> 00:30:08,680
that you hired them for.

625
00:30:09,039 --> 00:30:12,160
Speaker 5: One hundred percent agree, Like I would assume like almost

626
00:30:12,200 --> 00:30:15,079
every developer you're going to hire is coming with positive intent.

627
00:30:15,400 --> 00:30:17,839
They want to do a great job. They want the

628
00:30:17,880 --> 00:30:20,079
company to be secure, they want it to be successful.

629
00:30:20,559 --> 00:30:22,680
But a lot of times if you force the question

630
00:30:22,759 --> 00:30:24,519
on the if you force them to answer the question

631
00:30:24,559 --> 00:30:26,400
of like, is it secure or is it not? What's the

632
00:30:26,400 --> 00:30:28,319
best way to secure this? They may not get to

633
00:30:28,359 --> 00:30:30,799
the right answer just because they don't have that expertise.

634
00:30:30,839 --> 00:30:32,920
And so it's important to kind of like abstract that

635
00:30:32,960 --> 00:30:35,160
all away. And developers love abstraction. They love SDKs and

636
00:30:35,200 --> 00:30:38,599
frameworks. This is just another one that just equals security. Right,

637
00:30:40,359 --> 00:30:41,839
So yeah, one hundred percent agree with you. I think

638
00:30:41,880 --> 00:30:43,759
one thing we haven't chatted about that is worth calling

639
00:30:43,799 --> 00:30:47,359
out is like why should we care? Like why is

640
00:30:47,400 --> 00:30:49,359
everyone on the call? Why should they care about keeping

641
00:30:49,400 --> 00:30:52,200
their secrets secure? And I know we've talked a

642
00:30:52,240 --> 00:30:54,640
lot about data breaches for companies, but a lot of times,

643
00:30:54,680 --> 00:30:57,119
for me at least, it feels like the company has

644
00:30:57,119 --> 00:30:59,480
a data breach and they get a slap on the wrist, right,

645
00:30:59,480 --> 00:31:01,160
they get some fine that they can pay with like

646
00:31:01,200 --> 00:31:03,559
one day worth of revenue. They maybe get a bad

647
00:31:03,559 --> 00:31:05,920
press article that turns into a good press article

648
00:31:06,000 --> 00:31:08,400
later in their minds, as Warren was talking about earlier,

649
00:31:09,119 --> 00:31:11,799
because now they remember the name. But like, what

650
00:31:11,960 --> 00:31:14,079
is the actual real impact here? Because that doesn't sound

651
00:31:14,119 --> 00:31:15,799
like a lot of negatives right there. That almost sounds

652
00:31:15,839 --> 00:31:19,759
like not that bad. And I think the real risk

653
00:31:19,880 --> 00:31:22,559
is actually to the users. Like it's it's so easy

654
00:31:22,599 --> 00:31:24,960
to forget that every time that we talk about like

655
00:31:24,960 --> 00:31:29,960
gigabytes or terabytes of data, we're talking about usually people's data,

656
00:31:30,039 --> 00:31:33,759
like real people, your customers, and they are trusting

657
00:31:33,920 --> 00:31:36,640
you to keep their data private, like to keep their

658
00:31:36,680 --> 00:31:40,720
private data private. And when that breach happens,

659
00:31:40,759 --> 00:31:44,039
you're breaching their trust because that private data becomes public

660
00:31:44,759 --> 00:31:46,359
and there's real consequences to that.

661
00:31:46,359 --> 00:31:48,359
Speaker 4: I actually have a real personal story that scared the

662
00:31:48,519 --> 00:31:49,119
shit out of me.

663
00:31:50,240 --> 00:31:52,960
Speaker 5: I just moved to Austin for anyone who cares, and

664
00:31:53,000 --> 00:31:55,640
I took my mom out to some barbecue and while

665
00:31:55,640 --> 00:31:57,920
we were there, I got this call. And the call

666
00:31:58,240 --> 00:32:01,640
it was from someone who claimed to be at the

667
00:32:01,720 --> 00:32:05,319
Texas Customs and Borders, and they called me and said, hey,

668
00:32:05,319 --> 00:32:08,799
you're being federally investigated. We found a package that has

669
00:32:08,880 --> 00:32:11,920
illegal drugs and money in it and you cannot hang

670
00:32:12,000 --> 00:32:14,640
up from this phone call. Everything's being recorded and it

671
00:32:14,680 --> 00:32:17,359
will be used against you in a court of law. And I

672
00:32:17,400 --> 00:32:19,319
was like, oh shit, this is the day my life

673
00:32:19,359 --> 00:32:24,000
just falls apart, Like like I legitimately thought I was

674
00:32:24,039 --> 00:32:24,720
fucked that day.

675
00:32:25,279 --> 00:32:27,119
Speaker 4: Am I allowed to say that? I hope you guys

676
00:32:27,200 --> 00:32:27,640
are gonna edit it?

677
00:32:28,960 --> 00:32:32,079
Speaker 3: And one of us in this group continuously drops F bombs.

678
00:32:32,079 --> 00:32:32,839
Speaker 2: I'm not going to call.

679
00:32:32,759 --> 00:32:35,680
Speaker 4: That person out. Yeah we never should.

680
00:32:35,839 --> 00:32:40,799
Speaker 5: Uh. But they started asking me all these questions about

681
00:32:40,880 --> 00:32:43,359
like where, like have you been at this address?

682
00:32:43,400 --> 00:32:44,440
Speaker 4: Have you bought these things?

683
00:32:44,440 --> 00:32:46,279
Speaker 5: Like that kind of stuff, like stuff that you'd feel

684
00:32:46,279 --> 00:32:48,640
like only the government would know, like really only the

685
00:32:48,640 --> 00:32:50,440
government would know. And like this went on for like

686
00:32:50,480 --> 00:32:52,799
forty minutes where they were just like asking me questions

687
00:32:52,799 --> 00:32:54,880
and I was validating myself, and I was with

688
00:32:54,920 --> 00:32:56,680
my mom. My mom and I were both freaking out,

689
00:32:57,039 --> 00:32:59,000
and at some point we bring on the company lawyers

690
00:32:59,039 --> 00:33:02,079
to like really make sure. I

691
00:33:02,079 --> 00:33:04,079
probably should have brought them on earlier, but I was

692
00:33:04,119 --> 00:33:05,559
scared and.

693
00:33:05,519 --> 00:33:06,599
Speaker 4: It just felt so real.

694
00:33:06,839 --> 00:33:08,599
Speaker 5: And it was only until like an hour and thirty

695
00:33:08,640 --> 00:33:11,240
minutes in when our lawyer when we all caught them

696
00:33:11,279 --> 00:33:14,279
slip up on something, and it was like my lawyer

697
00:33:14,319 --> 00:33:17,359
started pressing them pretty hard, and at some point they

698
00:33:17,400 --> 00:33:22,640
got frustrated and their accent kicked in, from a Texas accent to

699
00:33:22,799 --> 00:33:25,400
like some other accent that I that I was like,

700
00:33:25,400 --> 00:33:27,000
you are not from like America.

701
00:33:27,400 --> 00:33:29,640
Speaker 4: And at that point we realized we were getting scammed.

702
00:33:30,480 --> 00:33:32,720
Speaker 5: But they were only able to get to that point

703
00:33:32,799 --> 00:33:35,799
of trust because of all the data that had been

704
00:33:35,880 --> 00:33:38,640
breached in previous data breaches. My data has been in

705
00:33:38,680 --> 00:33:41,319
like ten data breaches or so, like including Equifax,

706
00:33:41,359 --> 00:33:43,400
and so they just had this rich history to lean

707
00:33:43,440 --> 00:33:46,319
on to validate who they are as like an authority,

708
00:33:47,279 --> 00:33:49,480
and I trusted that for like an hour and a half.

709
00:33:49,960 --> 00:33:53,519
And like I have security training. I run a cybersecurity company.

710
00:33:53,720 --> 00:33:56,680
I can imagine my mom, my, sister, anyone else who

711
00:33:56,680 --> 00:34:00,559
has not had security training, like actually go the entire

712
00:34:00,599 --> 00:34:03,440
ten miles and give them everything, and they wake up

713
00:34:03,440 --> 00:34:05,759
tomorrow with like a huge amount of Social Security problems

714
00:34:06,000 --> 00:34:09,039
and like their bank account's drained because they now know everything,

715
00:34:09,079 --> 00:34:11,400
and like that can happen to real people all the time.

716
00:34:11,679 --> 00:34:14,400
And that's the real cost that we pay when we

717
00:34:14,440 --> 00:34:16,039
mess up and we have a data breach.

718
00:34:18,039 --> 00:34:21,639
Speaker 3: That's a great point because yeah, like you mentioned, you know,

719
00:34:21,679 --> 00:34:27,119
from a corporate level, it's almost a net positive

720
00:34:27,440 --> 00:34:31,280
to get a data breach because the fine that you

721
00:34:31,440 --> 00:34:35,400
pay actually turns out to just be like really cheap

722
00:34:35,519 --> 00:34:39,199
marketing is the way it comes out. But like when

723
00:34:39,239 --> 00:34:42,119
you tie it back to and think about like your

724
00:34:42,159 --> 00:34:45,360
mom or your your sister or people that you know,

725
00:34:46,599 --> 00:34:49,039
and you can make it personal like that, that's actually

726
00:34:49,760 --> 00:34:53,719
like it feels like good incentive to take this seriously.

727
00:34:54,119 --> 00:34:56,039
Speaker 1: So I think there's a couple of things there that

728
00:34:56,400 --> 00:34:58,880
are definitely worth repeating. The first one.

729
00:34:59,000 --> 00:35:01,000
Speaker 2: The first one's Brian, what was your social Security number?

730
00:35:01,039 --> 00:35:06,280
Speaker 6: Again, to point out real quick, like the feds are

731
00:35:06,280 --> 00:35:09,880
going to come to your door, not call you, although

732
00:35:09,880 --> 00:35:11,320
that would freak me the hell out, and I don't

733
00:35:11,320 --> 00:35:12,920
I don't know that I would consider that on the

734
00:35:12,920 --> 00:35:14,840
phone if I just got a call like that, But

735
00:35:14,960 --> 00:35:17,880
for any anything else, you know, if this is a

736
00:35:17,920 --> 00:35:20,400
scam that's going on, I'm pretty sure they're just gonna come.

737
00:35:20,480 --> 00:35:21,360
They're gonna come get you.

738
00:35:21,840 --> 00:35:27,519
Speaker 1: I know that now, right, Yeah, yeah. Like, whatever

739
00:35:27,559 --> 00:35:30,280
the worst call is that you get,

740
00:35:30,360 --> 00:35:33,360
no matter what it is, just hang up first. That's

741
00:35:33,480 --> 00:35:36,000
that's the number one thing you've got to do. Uh,

742
00:35:36,159 --> 00:35:38,639
if you like, it doesn't matter what happens, You're not

743
00:35:38,639 --> 00:35:40,840
going to get penalized for hanging up the phone.

744
00:35:41,000 --> 00:35:42,039
Speaker 7: I mean, there's no way.

745
00:35:42,599 --> 00:35:44,360
Speaker 1: I And that's the whole point. They try to keep

746
00:35:44,360 --> 00:35:46,400
you on the line because as soon as you hang up,

747
00:35:46,440 --> 00:35:48,960
it's actually over for them. The number of times

748
00:35:48,960 --> 00:35:51,159
they have to call back will be a problem. So

749
00:35:51,480 --> 00:35:54,320
for sure, just just do that and uh, if they

750
00:35:54,320 --> 00:35:56,199
don't call back, then no problem.

751
00:35:56,239 --> 00:35:58,360
Speaker 7: And if they do, then you know, you can continue.

752
00:35:57,920 --> 00:36:00,239
Speaker 1: It from there. If you're so worried. If it's so

753
00:36:00,360 --> 00:36:03,239
critical for them to figure out what's going on, they

754
00:36:03,239 --> 00:36:06,119
for sure would call back. Yeah, so that's certainly one

755
00:36:06,159 --> 00:36:08,840
of them. But on the cost for companies, there is

756
00:36:08,880 --> 00:36:11,599
a huge aspect here that So my CEO actually did

757
00:36:11,599 --> 00:36:14,880
a talk, I think last year, about the ROI

758
00:36:14,920 --> 00:36:18,599
on security and she joked that the ROI actually stands

759
00:36:18,599 --> 00:36:21,719
for a return on incident because it's so beneficial to

760
00:36:21,760 --> 00:36:24,960
actually have this happen for you. And the thing is

761
00:36:25,000 --> 00:36:28,960
that it really depends on where you're at. So companies

762
00:36:28,960 --> 00:36:32,440
in the United States, in Russia, and China get attacked

763
00:36:32,480 --> 00:36:35,039
the most. It doesn't matter how big you are, how

764
00:36:35,039 --> 00:36:37,480
many customers you have, it doesn't matter your scale or

765
00:36:37,519 --> 00:36:40,639
your size. You will for sure get attacked. And without

766
00:36:40,760 --> 00:36:42,360
these practices in place.

767
00:36:42,360 --> 00:36:43,840
Speaker 7: You will for sure get breached.

768
00:36:44,079 --> 00:36:45,960
Speaker 1: The other thing that's really interesting is there's a lot

769
00:36:46,000 --> 00:36:48,760
of countries that just don't care about punishing you. So

770
00:36:50,519 --> 00:36:52,760
like the amount of fines that you get because of

771
00:36:52,800 --> 00:36:54,880
like say GDPR, you can actually go look it up.

772
00:36:55,039 --> 00:36:59,199
They're incredibly small, like sub one hundred thousand in most

773
00:36:59,239 --> 00:37:02,679
cases for massive incidents. There was one recently where

774
00:37:02,800 --> 00:37:03,920
the company did get.

775
00:37:03,760 --> 00:37:05,599
Speaker 7: Sued out of business though, So that's funny.

776
00:37:06,159 --> 00:37:08,639
Speaker 1: And the other thing is that so it's good, you know,

777
00:37:08,679 --> 00:37:10,719
I felt great that you know, someone did something wrong

778
00:37:10,760 --> 00:37:13,920
and they really took the brunt of the punishment. There.

779
00:37:14,119 --> 00:37:16,679
The really interesting thing here, being a company in

780
00:37:16,679 --> 00:37:23,320
Switzerland, is there's no punishment from the

781
00:37:23,639 --> 00:37:28,280
government for companies at the corporate level. But if we

782
00:37:28,320 --> 00:37:31,679
have a security breach that's a result of executive negligence,

783
00:37:32,239 --> 00:37:36,440
there are up to, like, personal fines the government will

784
00:37:36,639 --> 00:37:41,239
charge you with and there could be other legal repercussions

785
00:37:41,239 --> 00:37:44,559
as well. And that's just so ridiculous. And if you

786
00:37:44,599 --> 00:37:47,280
work in ITAR in the United States or in

787
00:37:47,639 --> 00:37:51,000
any NATO country, that's also a personal fine. It's up to

788
00:37:51,960 --> 00:37:55,079
ten million dollars and ten years in jail for having

789
00:37:55,119 --> 00:38:00,000
perpetrated sharing sensitive data that is regulated. So like it's

790
00:38:00,079 --> 00:38:02,320
not even like the company is potentially at fault there.

791
00:38:02,760 --> 00:38:04,960
They don't care. The company for sure doesn't care. But

792
00:38:05,000 --> 00:38:07,079
you as an individual, like there actually could be real

793
00:38:07,079 --> 00:38:08,000
repercussions for you.

794
00:38:08,639 --> 00:38:13,199
Speaker 5: Yep, absolutely. Like, we share this with our customers

795
00:38:13,199 --> 00:38:16,119
all the time of like, it's real negligence if you

796
00:38:16,159 --> 00:38:19,400
don't manage your secrets. You can't just like look away, right,

797
00:38:19,800 --> 00:38:22,800
because again, every single time someone signs up for your website

798
00:38:22,880 --> 00:38:26,039
or does something, they're trusting you, and that trust is

799
00:38:26,079 --> 00:38:28,320
basically broken if you don't actually manage the key to

800
00:38:28,360 --> 00:38:31,559
your digital kingdom. It's like, hey there, hackers, take the keys.

801
00:38:31,840 --> 00:38:33,639
That's what you're basically saying if you don't protect them.

802
00:38:33,960 --> 00:38:34,480
Speaker 2: Fair point.

803
00:38:35,559 --> 00:38:39,719
Speaker 3: I think another thing that

804
00:38:39,840 --> 00:38:43,920
probably is an additional challenge to this is the level

805
00:38:43,960 --> 00:38:48,079
of effort required to get from where I'm at today

806
00:38:48,400 --> 00:38:53,119
in secrets management to like the ideal world that we're

807
00:38:53,119 --> 00:38:57,199
talking about here, because like the question number one you

808
00:38:57,239 --> 00:38:59,199
asked is do you know where all your secrets are

809
00:39:00,480 --> 00:39:02,440
some of the places I've worked in the past, Like

810
00:39:02,840 --> 00:39:05,480
that could be a three to six month project just

811
00:39:05,559 --> 00:39:08,280
to answer that question. And so how do you get

812
00:39:09,679 --> 00:39:14,719
executive buy in to put time and resources towards that.

813
00:39:15,360 --> 00:39:15,840
Speaker 4: It's hard.

814
00:39:16,239 --> 00:39:17,920
Speaker 5: I think one of the first things you can do

815
00:39:18,000 --> 00:39:20,719
that's pretty easy is, like there's a bunch of open

816
00:39:20,760 --> 00:39:24,519
source secret scanners. TruffleHog is one. Doppler will have

817
00:39:24,559 --> 00:39:28,320
one at some point, I'm sure. GitGuardian's another, and

818
00:39:28,360 --> 00:39:31,000
you can just like put that in the codebase and

819
00:39:31,039 --> 00:39:32,639
it will. You can have it set up so nothing ever

820
00:39:32,679 --> 00:39:34,760
emits back out, so like you know that like whatever

821
00:39:34,760 --> 00:39:36,800
your findings are, whatever your code is, it will not

822
00:39:36,880 --> 00:39:39,159
go to TruffleHog or GitGuardian. But you can just

823
00:39:39,199 --> 00:39:40,960
get a report and just put that as a check

824
00:39:41,159 --> 00:39:43,840
in every repository as like a commit check. Right, So,

825
00:39:44,039 --> 00:39:45,960
before I can even push to GitHub, I'm going

826
00:39:46,039 --> 00:39:49,079
to do a check to scan my code for secrets.

827
00:39:49,280 --> 00:39:51,639
And all of a sudden, if you can't push to

828
00:39:51,880 --> 00:39:53,079
production anymore because of.

829
00:39:53,000 --> 00:39:55,920
Speaker 4: That check, you're probably gonna start pulling secrets.

830
00:39:55,639 --> 00:39:57,639
Speaker 5: Out of, out of code real quickly,

831
00:39:58,519 --> 00:40:00,000
and you're gonna always want to stay in this state

832
00:40:00,239 --> 00:40:02,639
of a green check mark, i.e. there's no secrets found.

833
00:40:03,000 --> 00:40:05,239
And that's like one first way I think to like.

834
00:40:05,199 --> 00:40:06,280
Speaker 4: Really move the needle there.

835
00:40:06,400 --> 00:40:09,599
Speaker 5: That is basically free and it takes a couple of

836
00:40:09,639 --> 00:40:10,760
lines of code to implement.

837
00:40:10,920 --> 00:40:12,239
Speaker 4: It's really really quick.

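The commit-check idea described above can be sketched in a few lines. To be clear, this is a toy illustration and not how TruffleHog or GitGuardian actually work; real scanners ship hundreds of detectors and verify candidate secrets, while the two regexes and the `commit_check` helper here are assumptions purely for showing the shape of a gate that fails when anything secret-looking appears in the code.

```python
import re

# Toy patterns for illustration only -- real scanners like TruffleHog or
# GitGuardian ship hundreds of detectors and verify findings before reporting.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)\bapi[_-]?key\s*[:=]\s*['\"][A-Za-z0-9_\-]{20,}['\"]"
    ),
}

def scan_text(text):
    """Return (pattern_name, matched_text) pairs for anything secret-shaped."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

def commit_check(files):
    """Pre-commit style gate over {path: contents}; True means safe to push."""
    clean = True
    for path, contents in files.items():
        for name, value in scan_text(contents):
            print(f"{path}: possible {name}: {value[:8]}...")
            clean = False
    return clean
```

Wired into a Git pre-commit or pre-push hook, a check like this blocks the push until the secret is pulled out of the code, which is exactly the "stay in the green check mark state" incentive described.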
838
00:40:12,480 --> 00:40:15,199
Speaker 3: Yeah, And I guess from there it's it comes down

839
00:40:15,239 --> 00:40:18,119
to like a campaigning issue where you start campaigning for

840
00:40:18,199 --> 00:40:21,559
support to get this as a prioritized project.

841
00:40:22,119 --> 00:40:22,360
Speaker 4: Yeah.

842
00:40:23,079 --> 00:40:26,760
Speaker 1: Yeah, I mean there's something interesting here that could be

843
00:40:26,880 --> 00:40:29,239
really helpful because getting it. So we talked about workload

844
00:40:29,280 --> 00:40:32,199
identities and there being a burden of responsibility on the

845
00:40:32,320 --> 00:40:35,400
API provider whoever they are, to make some changes. The

846
00:40:35,400 --> 00:40:37,840
one that actually helped us a lot at Authress was

847
00:40:37,960 --> 00:40:41,519
we automatically revoke found credentials, so if one of our

848
00:40:41,519 --> 00:40:45,880
customers leaks those credentials online, we will find those automatically

849
00:40:46,000 --> 00:40:49,760
and revoke them immediately. And you may be like, oh, well,

850
00:40:49,760 --> 00:40:53,199
you could break someone's production environment, and that for sure

851
00:40:53,239 --> 00:40:55,400
is true and has happened. And then we get a

852
00:40:55,400 --> 00:40:58,079
support ticket like, oh my god, our production environment is broken?

853
00:40:58,519 --> 00:41:01,400
Why did Authress do this to us? And the sad

854
00:41:01,440 --> 00:41:05,559
truth is yeah, I know, okay, it for sure happens.

855
00:41:05,800 --> 00:41:10,039
But the sad truth is that credentials that are found

856
00:41:10,239 --> 00:41:14,280
on GitHub take less than two minutes to become utilized

857
00:41:14,320 --> 00:41:17,760
by a malicious attacker. Yep, that's it. Two minutes. So like,

858
00:41:18,639 --> 00:41:20,239
which do you want? Do you want your production to

859
00:41:20,239 --> 00:41:22,199
be broken or do you want all your data to

860
00:41:22,199 --> 00:41:24,360
be compromised? And if you look at it through the

861
00:41:24,480 --> 00:41:26,559
lens of, like, well, you should be thanking us

862
00:41:26,599 --> 00:41:29,400
for you know, stepping in here and doing something. And

863
00:41:29,440 --> 00:41:31,000
the scanners are really great. They do this.

864
00:41:31,559 --> 00:41:33,039
Speaker 4: Yeah, do that too.

865
00:41:33,639 --> 00:41:35,840
Speaker 6: Sorry, I was just gonna say, AWS absolutely removes the

866
00:41:35,920 --> 00:41:39,519
credentials if you put them, say in GitHub, which I

867
00:41:39,559 --> 00:41:41,039
have no personal knowledge of.

868
00:41:41,079 --> 00:41:45,000
Speaker 3: Really, I just heard from red.

869
00:41:45,320 --> 00:41:45,800
Speaker 6: I mean.

870
00:41:48,880 --> 00:41:51,199
Speaker 1: It's almost it's almost funny though, because there were a

871
00:41:51,199 --> 00:41:53,400
couple of people who wanted to do some experiments with

872
00:41:53,599 --> 00:41:57,480
the AWS credentials, and they were struggling to do them

873
00:41:57,480 --> 00:42:00,400
because every time they intentionally leaked their credentials to different places,

874
00:42:00,440 --> 00:42:03,000
AWS kept on revoking them. And they're like, no, I

875
00:42:03,079 --> 00:42:04,920
want to set up a honey pot.

876
00:42:05,960 --> 00:42:08,800
Speaker 6: What are you doing here, AWS? I'm doing it intentionally.

877
00:42:08,719 --> 00:42:13,119
Speaker 1: Yeah, we actually do. So like in our emails,

878
00:42:13,199 --> 00:42:15,400
we actually say oh, yeah, you know, please let us

879
00:42:15,440 --> 00:42:17,320
know if this was like actually intentional, and then we

880
00:42:17,320 --> 00:42:19,119
can try to figure out how to somehow exclude this

881
00:42:19,159 --> 00:42:21,559
scenario because at least they know that there's like a

882
00:42:21,559 --> 00:42:25,159
solution path here rather than just like automatic credential revocation.

883
00:42:25,239 --> 00:42:27,519
And then be like, but I really want this to happen.

884
00:42:27,559 --> 00:42:29,519
What's going on. My credentials are invalid. Let me go

885
00:42:29,559 --> 00:42:32,199
create new ones and release them again to GitHub,

886
00:42:32,280 --> 00:42:34,519
right, like something happened there. They'll go through

887
00:42:34,519 --> 00:42:36,920
that five or six times before they learn their lesson.

888
00:42:37,159 --> 00:42:39,280
Speaker 5: We've done the exact same thing with Doppler. If you

889
00:42:39,480 --> 00:42:43,400
if there's a Doppler token found, we will automatically revoke it.

890
00:42:43,480 --> 00:42:45,400
I think one step we want to get you even closer, though,

891
00:42:45,480 --> 00:42:49,360
is being able to have some secure way of scanning

892
00:42:49,400 --> 00:42:52,519
your secrets that are in Doppler to see if they've

893
00:42:52,519 --> 00:42:55,360
been in a breach. We do something similar today with passwords.

894
00:42:55,519 --> 00:42:57,880
Were like, if you sign up for a Doppler account

895
00:42:57,920 --> 00:42:59,519
with the password that's been in a data breach, we

896
00:42:59,559 --> 00:43:02,280
actually stop you and we say you have to use a

897
00:43:02,320 --> 00:43:04,760
different password, and if you log in again and that

898
00:43:05,159 --> 00:43:08,119
in that time since creating your account, your password has

899
00:43:08,119 --> 00:43:09,880
been in a data breach, we will force.

900
00:43:09,679 --> 00:43:11,000
Speaker 4: You to change the password.

901
00:43:11,760 --> 00:43:12,519
Speaker 2: That's pretty cool.

902
00:43:13,000 --> 00:43:13,960
Speaker 4: Yeah, I forced.

903
00:43:13,760 --> 00:43:15,960
Speaker 6: People to change it because Google tells me that about

904
00:43:15,960 --> 00:43:19,239
my passwords all the time, and I'm like, oh, but

905
00:43:19,519 --> 00:43:22,360
I'm supposed to be secure. Instead of just rotating, you know,

906
00:43:22,519 --> 00:43:27,639
like various things that I rotate through for passwords, they

907
00:43:27,639 --> 00:43:29,719
don't make me change it, which is a

908
00:43:29,760 --> 00:43:30,599
downfall for me.

909
00:43:31,440 --> 00:43:34,599
Speaker 3: Password one, password two, password three, Now I've got to

910
00:43:34,599 --> 00:43:36,360
remember password four as well.

911
00:43:36,760 --> 00:43:38,400
Speaker 6: I know, right, what am I going to do?

912
00:43:39,559 --> 00:43:40,679
Speaker 4: Use a password manager?

913
00:43:46,800 --> 00:43:49,000
Speaker 5: You want like a fifty character long password that

914
00:43:49,039 --> 00:43:51,199
you could not possibly remember that's what you want.

915
00:43:51,239 --> 00:43:54,480
Speaker 1: So you got to be careful there, because I actually

916
00:43:54,519 --> 00:43:56,000
had to start keeping a list of all of the

917
00:43:56,039 --> 00:43:59,440
websites that told me that I could put a sixty

918
00:43:59,480 --> 00:44:02,920
three character password in when you sign up, but

919
00:44:02,920 --> 00:44:04,599
then when you go to enter it, it says it

920
00:44:04,599 --> 00:44:07,880
doesn't match because it's secretly truncating some part of it. Yes,

921
00:44:08,639 --> 00:44:12,320
So like it really starts to get annoying when you

922
00:44:12,400 --> 00:44:15,599
think that they support something to the level of security

923
00:44:15,639 --> 00:44:18,039
standard that you want and then secretly, under the hood

924
00:44:18,440 --> 00:44:20,760
they do something different. And I have a I have

925
00:44:22,119 --> 00:44:25,039
a list of those companies that I've paid very.

926
00:44:26,800 --> 00:44:28,320
Speaker 4: Have you published it?

927
00:44:29,199 --> 00:44:34,480
Speaker 1: No, that's still private for the moment. And, jeez,

928
00:44:35,519 --> 00:44:38,000
I got some I got some good stories here. So

929
00:44:38,079 --> 00:44:40,760
one of them would truncate your password down to twenty characters,

930
00:44:40,960 --> 00:44:43,199
but for the two factor off. What it would do

931
00:44:43,360 --> 00:44:45,840
is it would ask you for your TOTP.

932
00:44:45,559 --> 00:44:47,519
Speaker 7: Six character code and.

933
00:44:48,960 --> 00:44:50,880
Speaker 1: Append that to the end to make a twenty

934
00:44:50,920 --> 00:44:53,639
six character password and send that in the password field.

935
00:44:54,079 --> 00:44:57,960
So if your password was anywhere above twenty six characters,

936
00:44:58,239 --> 00:44:59,639
you would just get a wrong answer. But if it

937
00:44:59,639 --> 00:45:02,679
was somewhere between twenty one and twenty six, it would

938
00:45:02,679 --> 00:45:04,559
cut out part of the TOTP code.

939
00:45:04,360 --> 00:45:06,280
Speaker 7: And like it wouldn't work, and then you just wouldn't be.

940
00:45:06,239 --> 00:45:08,280
Speaker 1: Able to get in at all. And also what that

941
00:45:08,360 --> 00:45:10,880
does is it tricks your password manager into thinking the

942
00:45:10,880 --> 00:45:14,079
TOTP code is part of your password. And every single

943
00:45:14,119 --> 00:45:15,920
time I go to the site, it asks me if

944
00:45:15,920 --> 00:45:19,360
I want it to remember that new password update.

945
00:45:19,519 --> 00:45:21,440
Please don't do that. Like, if you're designing a system,

946
00:45:21,519 --> 00:45:25,039
please stop using passwords altogether, honestly, Like we have this

947
00:45:25,119 --> 00:45:26,960
off by default for all of our customers.

948
00:45:26,960 --> 00:45:29,360
Speaker 7: They can't create a password.

949
00:45:29,000 --> 00:45:31,800
Speaker 1: Database unless they come and contact us and justify, like

950
00:45:31,840 --> 00:45:33,679
who their audience is, why they need this, and what

951
00:45:33,719 --> 00:45:35,400
sort of rules are going to put in place, because

952
00:45:35,400 --> 00:45:37,840
it's just so ridiculous in the corporate space. Like,

953
00:45:37,920 --> 00:45:42,400
federated logins, OIDC, SAML, SSO, all of those are so

954
00:45:42,559 --> 00:45:45,800
much better. Please just use one of those and don't

955
00:45:45,880 --> 00:45:49,440
use passwords. Like, eighty six percent of all vulnerabilities

956
00:45:49,480 --> 00:45:54,519
are a result of password usage, like SIM card swapping, you know,

957
00:45:55,280 --> 00:45:57,599
found in breaches, et cetera, et cetera. Like just please,

958
00:45:57,639 --> 00:45:59,639
like passwords are gone, like they're done, they're over.

959
00:46:05,159 --> 00:46:07,079
Speaker 4: I think everyone on this can agree with that.

960
00:46:08,599 --> 00:46:12,679
Speaker 6: I don't know you're still using passwords.

961
00:46:12,559 --> 00:46:15,840
Speaker 3: Or and I don't know that you're like, are you

962
00:46:15,960 --> 00:46:16,960
are you serious about that?

963
00:46:20,000 --> 00:46:21,840
Speaker 4: No, I agree with Warren here. I think passwords should

964
00:46:21,840 --> 00:46:26,840
be done. Their time has come. I'm just agreeing with him.

965
00:46:26,880 --> 00:46:29,880
Speaker 1: So, uh, my CEO actually has another talk at

966
00:46:30,039 --> 00:46:33,880
QCon in November in San Francisco that everyone should go

967
00:46:33,920 --> 00:46:36,320
to if you're in the area, and she'll actually be

968
00:46:36,400 --> 00:46:38,719
talking about the fact you do need to have like

969
00:46:38,760 --> 00:46:41,880
somewhere between three and five actual passwords in your life. Unfortunately,

970
00:46:42,039 --> 00:46:45,559
like there's no way around that, and you probably should

971
00:46:45,639 --> 00:46:49,599
have ones that are the Randall Munroe correct horse battery staple

972
00:46:49,719 --> 00:46:52,000
or is it horse battery staple correct. I can never

973
00:46:52,000 --> 00:46:55,639
remember the order of the four words, that's the problem. But uh,

974
00:46:55,719 --> 00:46:57,559
you don't want to just use a set of plain,

975
00:46:57,719 --> 00:47:00,320
like, words as your password, because now all of the

976
00:47:00,400 --> 00:47:03,400
brute force attacks know that three to five words

977
00:47:03,400 --> 00:47:05,320
could be your password as well. So you got to

978
00:47:05,360 --> 00:47:07,320
you got to make some changes in there. And uh,

979
00:47:07,519 --> 00:47:10,800
Apple actually has a pretty good strategy for dynamically generating passwords,

980
00:47:10,840 --> 00:47:13,360
where they make a bunch of fake words that are

981
00:47:13,519 --> 00:47:16,119
consonant vowel sounds that are strung together so you can

982
00:47:16,159 --> 00:47:19,039
sort of remember it and use that instead.

983
00:47:19,159 --> 00:47:20,039
Speaker 7: So that's pretty clever.

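That consonant-vowel idea can be sketched like this. Apple's actual generator isn't public, so the syllable structure, hyphen grouping, and trailing digit here are assumptions purely for illustrating why the result is sayable and roughly memorable:

```python
import secrets
import string

CONSONANTS = "bcdfghjklmnpqrstvwxz"
VOWELS = "aeiou"

def pronounceable(syllables=6, group=3):
    """Build a password from random consonant-vowel syllables, split into
    hyphen-separated groups, with one digit appended for extra entropy.
    This is a guess at the general idea, not Apple's real algorithm."""
    word = "".join(
        secrets.choice(CONSONANTS) + secrets.choice(VOWELS)
        for _ in range(syllables)
    )
    # Split the 2*syllables letters into chunks of `group` syllables each.
    size = group * 2
    chunks = [word[i:i + size] for i in range(0, len(word), size)]
    return "-".join(chunks) + secrets.choice(string.digits)
```

Because every syllable alternates consonant and vowel, the output reads like fake words, which is the "you can sort of remember it" property mentioned above.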
984
00:47:20,679 --> 00:47:23,719
Speaker 1: But I highly recommend like all those things with symbols

985
00:47:23,760 --> 00:47:26,760
and whatnot. It's just, it's over, honestly,

986
00:47:26,920 --> 00:47:28,840
So three to five. If you have more than five passwords,

987
00:47:28,840 --> 00:47:31,599
you're probably doing something unsafe.

988
00:47:33,039 --> 00:47:34,360
Speaker 6: What about my pets' names?

989
00:47:35,800 --> 00:47:37,119
Speaker 2: Oh, those are totally in scope.

990
00:47:37,280 --> 00:47:38,280
Speaker 1: Yeah, fantastic.

991
00:47:38,400 --> 00:47:41,119
Speaker 7: I mean for some like now, you've got to tell

992
00:47:41,119 --> 00:47:42,400
them those, Jillian.

993
00:47:44,440 --> 00:47:45,079
Speaker 4: Which pets?

994
00:47:45,119 --> 00:47:48,559
Speaker 6: For those, Jillian, it could be any pet, it could

995
00:47:48,559 --> 00:47:50,840
be pets from like here to childhood. I think, isn't

996
00:47:50,840 --> 00:47:52,159
that a common security question?

997
00:47:52,360 --> 00:47:54,480
Speaker 2: Like yeah, what's your first dog's name.

998
00:47:54,559 --> 00:47:57,320
Speaker 6: Your first pet, first car, like all that kind of stuff.

999
00:47:57,360 --> 00:47:59,280
So I don't know if it's good enough for the

1000
00:47:59,280 --> 00:48:03,280
security question, it should be good enough for my password.

1001
00:48:03,480 --> 00:48:04,440
Speaker 7: It's pretty good you brought that up.

1002
00:48:04,440 --> 00:48:06,519
Speaker 1: I'm sure Brian has some opinions on this too, like

1003
00:48:06,639 --> 00:48:10,079
the NIST guidelines of the eight hundred.

1004
00:48:10,480 --> 00:48:12,239
Speaker 6: I do just come on the show and start trouble

1005
00:48:12,280 --> 00:48:12,559
like I have.

1006
00:48:13,880 --> 00:48:16,239
Speaker 1: NIST officially says now, actually, like,

1007
00:48:16,360 --> 00:48:19,239
you shall not force password rotation, like we know,

1008
00:48:19,639 --> 00:48:22,400
like that's wrong. For sure. Secrets are a little bit

1009
00:48:22,440 --> 00:48:24,400
in the gray area of whether or not force rotation.

1010
00:48:24,920 --> 00:48:29,280
Uh, is required outside of suspected data breaches, but uh yeah,

1011
00:48:29,320 --> 00:48:31,599
for sure, like if you are forcing password rotation now,

1011
00:48:31,639 --> 00:48:35,840
there's a government that is ahead of where you're

1013
00:48:35,880 --> 00:48:38,320
at as far as good password hygiene.

1014
00:48:38,840 --> 00:48:41,880
Speaker 5: Another thing that I'd strongly recommend is whenever they're asking

1015
00:48:41,960 --> 00:48:45,960
for there's like recovery questions of like what, uh, what's

1016
00:48:46,000 --> 00:48:48,239
your mother's maiden name or what's the name of your

1017
00:48:48,239 --> 00:48:51,960
first pet, don't actually put that. Go and generate a

1017
00:48:48,239 --> 00:48:51,960
new password and put that in. Like, in like

1018
00:48:52,000 --> 00:48:55,039
1Password you can have multiple fields outside of the user

1019
00:48:55,079 --> 00:48:57,159
name and password. Create another field, call it the name of

1021
00:49:00,239 --> 00:49:03,719
the question, and then have a type password there and

1022
00:49:03,800 --> 00:49:06,719
put in like a sixty character password in there.

1023
00:49:07,960 --> 00:49:09,320
Speaker 2: So create a fake family.

1024
00:49:09,800 --> 00:49:13,039
Speaker 4: Yeah, really long family.

1025
00:49:14,599 --> 00:49:15,719
Speaker 7: I do have a warning there.

1026
00:49:15,800 --> 00:49:17,199
Speaker 1: So a long time ago I thought this was a

1027
00:49:17,199 --> 00:49:19,719
great idea, and I generated this site to actually dynamically

1028
00:49:19,760 --> 00:49:23,599
generate SHA-256 encoded answers based off of

1029
00:49:23,599 --> 00:49:25,360
the questions. So I don't even need to remember the question.

1030
00:49:25,400 --> 00:49:28,840
I just copy and paste the question into this generator

1031
00:49:28,880 --> 00:49:32,320
and it will generate a SHA hash for me to

1032
00:49:32,360 --> 00:49:32,840
put there.

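A generator along those lines might look like the following sketch. The site mentioned is private, so this construction is an assumption; it also uses an HMAC with a personal secret rather than a bare SHA-256 of the question, so that the question text alone can't reproduce the answer:

```python
import hashlib
import hmac

def derived_answer(question, personal_secret, length=16):
    """Derive a deterministic 'answer' to a security question by keyed-hashing
    the question text: same question in, same gibberish out, nothing to memorize.
    Normalizing case and whitespace keeps minor rewordings from changing the result.
    (Illustrative reconstruction; the keyed-hash step is an added assumption.)"""
    normalized = " ".join(question.lower().split())
    digest = hmac.new(
        personal_secret.encode(), normalized.encode(), hashlib.sha256
    )
    return digest.hexdigest()[:length]
```

Note the caveat that follows in the conversation: a phone agent may accept "it's just random characters" as verification, so for anything verified by a human, a real word that isn't the true answer, stored in your password manager, can actually hold up better than a hash.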
1033
00:49:33,079 --> 00:49:35,840
Speaker 7: However, it broke me because I was talking with some.

1034
00:49:37,639 --> 00:49:40,719
Speaker 1: Customer service experts who would often be on these calls

1035
00:49:40,719 --> 00:49:43,599
that have to verify security questions, and they for sure

1036
00:49:43,599 --> 00:49:46,119
were like, yeah, we do get people with the list

1037
00:49:46,719 --> 00:49:49,800
and when they ask you to verify your question, some

1038
00:49:49,800 --> 00:49:51,559
people for sure just say, oh, yeah, it's just some

1039
00:49:51,760 --> 00:49:54,599
random characters, like I don't remember, like that's what I

1040
00:49:54,639 --> 00:49:56,800
put there. I put random, like it's not a word.

1041
00:49:57,000 --> 00:50:01,679
I put random characters there. And that gets by having

1042
00:50:01,719 --> 00:50:06,760
put like a password in the security question. So security

1043
00:50:06,800 --> 00:50:09,679
questions are over too. But I definitely recommend a real word.

1044
00:50:10,639 --> 00:50:12,960
Definitely a real word, but not the right answer, as

1045
00:50:13,000 --> 00:50:15,559
Brian said, and to get over this and just store

1046
00:50:15,559 --> 00:50:16,960
that in your password manager as well.

1047
00:50:17,679 --> 00:50:19,760
Speaker 5: Maybe should be one of those passwords that are like

1048
00:50:19,920 --> 00:50:22,440
a chain of words together that you can easily remember, like.

1049
00:50:24,079 --> 00:50:25,719
Speaker 4: I walked to the park or something like that.

1050
00:50:26,840 --> 00:50:28,679
Speaker 1: I got a list of those that don't accept that,

1051
00:50:28,960 --> 00:50:31,039
because some of them thought they would be really smart

1052
00:50:31,360 --> 00:50:33,559
and there would be a drop down menu of the

1053
00:50:33,639 --> 00:50:35,760
answers that you could pick, because you know, that's what

1054
00:50:35,840 --> 00:50:38,199
they wanted to go with and I'm like, first of all,

1055
00:50:38,239 --> 00:50:40,480
my right answer isn't even here, Like it's like what

1056
00:50:40,559 --> 00:50:42,960
was your, what was your first instrument you played?

1057
00:50:42,960 --> 00:50:45,119
And I'm like, I couldn't even put the right answer

1058
00:50:45,280 --> 00:50:46,440
because it's not listed there.

1059
00:50:46,480 --> 00:50:48,000
Speaker 7: So like I don't know what game you're playing.

1060
00:50:48,480 --> 00:50:50,920
Speaker 4: Yeah, I'm really curious what like ten years from now,

1061
00:50:50,960 --> 00:50:53,000
what this whole identity space is going to look like,

1062
00:50:53,320 --> 00:50:54,199
because it's changing.

1063
00:50:55,039 --> 00:51:00,000
Speaker 1: Yeah, the risk is that I would talk about that forever, because

1064
00:51:00,039 --> 00:51:01,360
that's pretty much what I spent all my.

1065
00:51:01,480 --> 00:51:02,679
Speaker 7: Day thinking about.

1066
00:51:02,840 --> 00:51:07,320
Speaker 1: And I'll say that there is this aspect of pass

1067
00:51:07,360 --> 00:51:12,079
keys via hardware tokens isn't the future, because there's still

1068
00:51:12,119 --> 00:51:15,599
too much churn and difficulty and complexity for most people

1069
00:51:15,639 --> 00:51:20,519
to get right. And something where the passkeys can

1070
00:51:20,519 --> 00:51:23,199
be shared between multiple devices is probably correct, but not

1071
00:51:23,360 --> 00:51:26,440
the Google or Apple approach, but something way more secure,

1072
00:51:26,719 --> 00:51:30,079
and it's coming. There's actually an open standard that allows

1073
00:51:30,119 --> 00:51:34,639
for passkey transmission between different devices in a secure way,

1074
00:51:34,880 --> 00:51:36,599
and I think that will be the first step that

1075
00:51:36,639 --> 00:51:39,000
really helps us there. The next one is the root

1076
00:51:39,000 --> 00:51:41,519
of trust, like our browsers working for us and making

1077
00:51:41,559 --> 00:51:45,519
sure that the security tokens, the JWTs, the session keys

1078
00:51:45,559 --> 00:51:50,719
that we have are device bound and can't be exfiltrated

1079
00:51:50,760 --> 00:51:53,920
to an attacker and then reused to generate new stuff

1080
00:51:53,960 --> 00:51:57,400
and will always be stuck in cross site scripting land.

1081
00:51:57,440 --> 00:52:00,719
But those actual tokens will now be worthless outside of

1082
00:52:00,760 --> 00:52:04,000
the original device because you'll need the device private key

1083
00:52:04,039 --> 00:52:07,079
in order to sign each request. And we're working on it.

1084
00:52:07,079 --> 00:52:09,760
It's happening, the right working groups are focused on it,

1085
00:52:09,800 --> 00:52:12,840
but you know, it's still a thing that end users

1086
00:52:12,880 --> 00:52:15,719
are not going to see a huge impact in, say,

1087
00:52:15,760 --> 00:52:16,320
the next ten.

1088
00:52:16,199 --> 00:52:18,960
Speaker 5: Years, yeah, and then once it does happen, you're gonna

1089
00:52:19,000 --> 00:52:20,679
have to like roll it out, which will also be

1090
00:52:20,719 --> 00:52:23,559
another wave of adoption as well, a slow one.

1091
00:52:23,360 --> 00:52:26,880
Speaker 1: It's hard to convince those companies that don't have a

1092
00:52:26,960 --> 00:52:29,760
financial stake in doing the right thing to make a

1093
00:52:29,840 --> 00:52:32,719
change fundamentally, And so like, I'm really looking forward to

1094
00:52:33,000 --> 00:52:35,920
whatever impact Doppler can make on the industry to actually

1095
00:52:35,920 --> 00:52:39,840
convince people to move to workload identity verification for every

1096
00:52:39,880 --> 00:52:43,519
request and not just because your customer happened to be

1097
00:52:43,599 --> 00:52:47,239
someone that was in Germany and had some ridiculous

1098
00:52:47,239 --> 00:52:51,119
security requirements there, or required ISO 27001 or

1099
00:52:51,440 --> 00:52:55,360
something, or FedRAMP High, in order to.

1100
00:52:55,039 --> 00:52:56,159
Speaker 2: Get those customers on board.

1101
00:52:56,840 --> 00:53:01,199
Speaker 5: We have some active projects around looking into improving the

1102
00:53:01,239 --> 00:53:03,920
compliance bodies so that they are more encompassing of like

1103
00:53:03,960 --> 00:53:08,880
secure secrets management, and I think when we make

1104
00:53:08,960 --> 00:53:12,880
some serious progress there, our goal is to like make

1105
00:53:12,920 --> 00:53:14,920
that more, because once you get into SOC 2, it's

1106
00:53:14,920 --> 00:53:16,239
a beautiful thing, like SOC 2.

1107
00:53:16,639 --> 00:53:18,960
Speaker 4: In a way, it's like if one company is SOC 2,

1108
00:53:19,079 --> 00:53:20,840
every company is SOC 2. That's kind of how it has

1109
00:53:20,880 --> 00:53:22,880
to work because.

1110
00:53:22,599 --> 00:53:25,440
Speaker 5: You have a requirement that all your vendors

1111
00:53:25,440 --> 00:53:27,800
that you use are SOC 2. So that kind of just

1112
00:53:27,800 --> 00:53:29,679
has like this ripple effect where if all the B2C

1113
00:53:29,840 --> 00:53:32,119
companies are SOC 2, then every B2B company

1114
00:53:32,199 --> 00:53:35,880
is SOC 2. And if we can get that into SOC 2,

1115
00:53:36,760 --> 00:53:39,079
I think that's a pretty effective way to

1116
00:53:39,159 --> 00:53:41,719
roll out what we're looking for. That's the dream at least,

1117
00:53:41,760 --> 00:53:43,519
but it's gonna take a long time and a lot

1118
00:53:43,519 --> 00:53:44,599
of work to make that happen.

1119
00:53:45,000 --> 00:53:46,159
Speaker 7: I love your optimism.

1120
00:53:46,159 --> 00:53:48,320
Speaker 1: I'm just gonna repeat that, like, I just think I

1121
00:53:48,360 --> 00:53:50,679
love the excitement you have there. It's just so

1122
00:53:50,800 --> 00:53:53,280
great to hear someone who thinks that we can,

1123
00:53:53,320 --> 00:53:55,480
you know, get down that path and get to

1124
00:53:55,519 --> 00:53:58,360
success that way. It's awesome.

1125
00:53:59,760 --> 00:54:01,519
Speaker 4: Maybe it's not purely optimism.

1126
00:54:01,599 --> 00:54:03,880
Speaker 5: Like, at some point, I'm just like this world

1127
00:54:04,039 --> 00:54:06,760
is a capitalist structure, and like, at some point, with

1128
00:54:06,920 --> 00:54:10,400
enough capital and enough resources, anything should be possible. And

1129
00:54:10,440 --> 00:54:12,360
I'm not saying you like bribe. I'm saying more like,

1130
00:54:12,400 --> 00:54:14,320
you just put enough lawyers in the room and you

1131
00:54:14,360 --> 00:54:17,199
try to navigate this like it's a political machine, and

1132
00:54:17,239 --> 00:54:20,480
eventually you'll get there. And then the question is can

1133
00:54:20,519 --> 00:54:22,920
we just raise enough money where we have the force

1134
00:54:23,000 --> 00:54:23,920
projection to do that?

1135
00:54:24,559 --> 00:54:27,679
Speaker 1: So SOC 2, for anyone that doesn't have it, ends

1136
00:54:27,760 --> 00:54:31,559
up being: write out some policies and prove that

1137
00:54:31,599 --> 00:54:36,000
you're following them, as compared to actually putting valid security

1138
00:54:36,039 --> 00:54:39,360
strategies in place. So it's always going to be a

1139
00:54:39,440 --> 00:54:41,360
race to the bottom here, and I think we're going

1140
00:54:41,440 --> 00:54:43,760
to see SoC two getting cheaper and less valued over

1141
00:54:43,840 --> 00:54:47,239
time as more people realize that it is a

1142
00:54:47,280 --> 00:54:51,400
compliance checkbox. It's checkbox-based security or insurance, it really is,

1143
00:54:52,559 --> 00:54:55,840
rather than actually providing a real value. And that's a

1144
00:54:55,920 --> 00:54:59,159
sort of a sad fact there, and there's really no alternative,

1145
00:54:59,159 --> 00:55:01,360
like maybe that's what we need. Maybe we need some

1146
00:55:01,400 --> 00:55:05,079
sort of non-regulatory body that can verify companies are

1147
00:55:05,079 --> 00:55:08,960
compliant in performing actual security. And that's sort of hard

1148
00:55:09,039 --> 00:55:11,480
because that's driven from like the mindset of the founders

1149
00:55:11,480 --> 00:55:15,400
and the executive team and the leadership, and money trumps

1150
00:55:15,920 --> 00:55:17,039
security every time.

1151
00:55:17,639 --> 00:55:17,880
Speaker 6: See.

1152
00:55:17,880 --> 00:55:20,000
Speaker 5: I think SOC 2 is actually pretty great from a

1153
00:55:20,039 --> 00:55:22,440
standpoint of like if you're SOC 2, all your

1154
00:55:22,519 --> 00:55:24,599
vendors have to be, because that just creates this beautiful effect

1155
00:55:24,599 --> 00:55:27,639
where the entire industry has some baseline of security. I

1156
00:55:27,639 --> 00:55:30,239
think there's just like one critical flaw of SOC 2,

1157
00:55:30,800 --> 00:55:34,360
and that's that it's like tax people are the auditors. The

1158
00:55:34,400 --> 00:55:37,719
auditors are like literal accountants. Those are the people that

1159
00:55:37,719 --> 00:55:41,119
are setting the policies. I just don't understand how accountants

1160
00:55:41,320 --> 00:55:45,360
are setting policies around security. It's like completely different fields, right,

1161
00:55:45,519 --> 00:55:47,840
Like it'd be like someone who does your taxes is

1162
00:55:47,880 --> 00:55:49,760
like the one who's saying you should be secure. I'm like,

1163
00:55:50,079 --> 00:55:52,400
maybe leave it to the security professionals who think about

1164
00:55:52,400 --> 00:55:55,719
this all day long. If that one change happened,

1165
00:55:55,760 --> 00:55:57,039
I would bet like the world would.

1166
00:55:56,840 --> 00:55:58,119
Speaker 4: Be a lot safer of a place.

1167
00:55:59,480 --> 00:56:02,400
Speaker 1: I mean, I'm with you. It evolved from SOC 1

1168
00:56:02,559 --> 00:56:06,000
being a CPA auditor's verification that they would take

1169
00:56:06,079 --> 00:56:09,119
actual liability if your finances weren't in check, and then

1170
00:56:09,159 --> 00:56:12,880
evolved to actually evaluating different parts of your business: reliability,

1171
00:56:12,880 --> 00:56:15,800
disaster recovery, business continuity in case of a problem, and

1172
00:56:15,840 --> 00:56:17,679
of course that falls into, well, how do your IT

1173
00:56:18,000 --> 00:56:21,360
systems actually work as well? So there's a lot there,

1174
00:56:21,760 --> 00:56:25,599
but I'm with you. So to put it in perspective,

1175
00:56:25,599 --> 00:56:30,840
we used not vitamins but pain pills or painkillers

1176
00:56:30,920 --> 00:56:34,599
as the analogy. I think governments would have to

1177
00:56:34,880 --> 00:56:38,079
pay companies who were secure. Like if you are secure

1178
00:56:38,280 --> 00:56:40,239
and you get a certification, I think that if you

1179
00:56:40,320 --> 00:56:42,159
got paid, that would be a really good driving

1180
00:56:42,159 --> 00:56:42,719
factor there.

1181
00:56:43,119 --> 00:56:44,920
Speaker 5: It would have to be enough to really move the needle,

1182
00:56:44,920 --> 00:56:47,639
because I could even imagine some companies are like, yeah,

1183
00:56:47,719 --> 00:56:49,159
even a million a year probably.

1184
00:56:48,880 --> 00:56:49,880
Speaker 4: Wouldn't move the needle for them.

1185
00:56:51,239 --> 00:56:55,519
Speaker 5: I don't know, like maybe it's a percentage. I really

1186
00:56:55,519 --> 00:56:56,960
do feel like we don't need to pay them though,

1187
00:56:57,000 --> 00:57:02,119
Like I think SOC 2 is like the requirement

1188
00:57:02,119 --> 00:57:03,679
that all your vendors are SOC 2, that requirement I

1189
00:57:03,719 --> 00:57:05,920
think is so powerful already that pretty much the entire

1190
00:57:05,920 --> 00:57:08,360
industry has adopted it, we just need different people at

1191
00:57:08,360 --> 00:57:08,639
the helm.

1192
00:57:08,719 --> 00:57:10,880
Speaker 4: Like and I know where it started.

1193
00:57:10,920 --> 00:57:13,039
Speaker 5: I completely align with you there, but I kind of see

1194
00:57:13,039 --> 00:57:14,960
it like Kubernetes, a little bit like that started within

1195
00:57:15,000 --> 00:57:17,920
Google and it got so big it got pulled out

1196
00:57:17,960 --> 00:57:20,920
of Google into its own body that then could manage

1197
00:57:20,920 --> 00:57:22,760
it from then on. And I'm saying, like, SOC 2 is

1198
00:57:22,760 --> 00:57:25,119
the same thing. They created something amazing with SOC 2,

1199
00:57:25,559 --> 00:57:28,480
Let's pull it out of the hands of the accountants and

1200
00:57:28,559 --> 00:57:31,719
let's have real security people run this thing. And like

1201
00:57:31,880 --> 00:57:34,559
let's give it a v2 or maybe a v3.

1202
00:57:34,840 --> 00:57:36,119
I think that would do a lot of good.

1203
00:57:37,239 --> 00:57:37,800
Speaker 2: I love.

1204
00:57:38,800 --> 00:57:41,679
Speaker 1: Well, it's like, oh, there's so much going on here.

1205
00:57:42,159 --> 00:57:44,639
Speaker 6: So what I hear a lot with security is very much

1206
00:57:44,719 --> 00:57:47,079
like that's somebody else's job. But you guys have like

1207
00:57:47,119 --> 00:57:49,480
the nice providers for that kind of thing, which I

1208
00:57:49,480 --> 00:57:52,719
guess, so I am the problem, I suppose, in this conversation,

1209
00:57:53,320 --> 00:57:54,400
like in general.

1210
00:57:54,679 --> 00:57:57,559
Speaker 7: Well maybe that's a good twist though, Like do.

1211
00:57:57,559 --> 00:58:00,840
Speaker 1: You feel, like Jillian, you have some comp here to

1212
00:58:02,199 --> 00:58:06,039
cause your company to do something different or you have

1213
00:58:06,119 --> 00:58:09,519
flexibility to pull in libraries or other providers to help

1214
00:58:09,559 --> 00:58:12,360
you support better security.

1215
00:58:12,719 --> 00:58:15,599
Speaker 6: Oh yeah, definitely. Like for the most part, they tend

1216
00:58:15,599 --> 00:58:17,719
to rely on the you know, like the providers and

1217
00:58:17,760 --> 00:58:19,519
things that they work with, because I work with primarily

1218
00:58:19,599 --> 00:58:26,599
data science companies, so they're not like technology companies where

1219
00:58:26,760 --> 00:58:29,400
they care about the technology in and of itself. So

1220
00:58:29,480 --> 00:58:31,679
pretty much as little computing as they can do, they

1221
00:58:31,719 --> 00:58:34,320
will do. Like these guys like if they could analyze

1222
00:58:34,320 --> 00:58:37,119
the data from their laptop, that's what they would be doing.

1223
00:58:37,320 --> 00:58:40,920
So it has to be, you know, like Brian was saying,

1224
00:58:40,960 --> 00:58:42,639
it has to be a really easy sell. It has

1225
00:58:42,719 --> 00:58:44,280
to you know, it has to be the candy and

1226
00:58:44,280 --> 00:58:46,719
not the vegetables. It has to be for the most part,

1227
00:58:46,840 --> 00:58:49,000
something they don't even notice happening. It just kind of

1228
00:58:49,039 --> 00:58:52,079
like gets bundled into the background. So pretty much anything

1229
00:58:52,239 --> 00:58:56,079
on, you know, that kind of front, it's not

1230
00:58:56,119 --> 00:58:58,079
even so much a question for me. I don't ask

1231
00:58:58,119 --> 00:59:00,079
them. Like I wouldn't ask my clients like, oh do

1232
00:59:00,119 --> 00:59:02,679
you want this? I would just go do this, right,

1233
00:59:02,679 --> 00:59:04,199
I wouldn't say like, oh, do you want for me

1234
00:59:04,239 --> 00:59:06,199
to keep your credentials off of GitHub? I would just

1235
00:59:06,239 --> 00:59:10,360
keep their credentials off of GitHub in that scenario. So

1236
00:59:10,559 --> 00:59:12,039
I do, yeah, I do think so. I do think

1237
00:59:12,119 --> 00:59:16,440
like smaller, you know, IT providers or consultants are, you know,

1238
00:59:16,559 --> 00:59:20,159
like just engineers in general, especially if you're working for

1239
00:59:20,199 --> 00:59:23,039
a company where technology is not their focus, you do

1240
00:59:23,119 --> 00:59:26,239
have a lot of say in the technology stacks that

1241
00:59:26,280 --> 00:59:29,119
are used. So we just got to sell this to

1242
00:59:29,119 --> 00:59:31,280
all the developers, right is that? Is that what we're

1243
00:59:31,320 --> 00:59:31,679
going for?
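Jillian's "just keep their credentials off of GitHub" approach can be as small as never hard-coding secrets and reading them from the environment at runtime instead. A minimal sketch (the EXAMPLE_DB_PASSWORD name is hypothetical; a manager like Doppler can then inject the real values when the process starts, e.g. via its `doppler run` command, without the application code changing):

```python
import os

def get_secret(name: str) -> str:
    """Fetch a secret from the environment instead of from source code.

    Hard-coded credentials end up in git history forever; values injected
    at runtime (by a secrets manager, CI, or a local shell) never touch
    the repository at all.
    """
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(
            f"Secret {name!r} is not set. Inject it at runtime through "
            "your secrets manager rather than committing it."
        )
    return value

# Stand-in for runtime injection -- EXAMPLE_DB_PASSWORD is a made-up name.
os.environ["EXAMPLE_DB_PASSWORD"] = "example-only"
print(get_secret("EXAMPLE_DB_PASSWORD"))  # prints example-only
```

The clients never see any of this; the secret simply stops living in the repository, which is exactly the invisible, bundled-into-the-background property she describes.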

1244
00:59:33,679 --> 00:59:35,960
Speaker 4: And then eventually we work away at the compliance bodies

1245
00:59:36,039 --> 00:59:37,079
and those committees.

1246
00:59:39,039 --> 00:59:41,360
Speaker 1: I do have a question about Doppler maybe, and that's like,

1247
00:59:41,400 --> 00:59:44,760
what is the minimum implementation you can have here?

1248
00:59:45,159 --> 00:59:45,599
Speaker 2: Is there?

1249
00:59:45,960 --> 00:59:49,280
Speaker 1: Is it just maybe replacing the cloud SDKs with

1250
00:59:49,519 --> 00:59:53,760
Doppler SDKs for interacting with the secrets manager? Is there something else

1251
00:59:53,800 --> 00:59:55,679
that has to be done in order to sort of

1252
00:59:55,679 --> 00:59:58,039
start getting the benefit right away? Or is there like

1253
00:59:58,079 --> 01:00:00,440
a large upfront integration that has to be done to

1254
01:00:00,599 --> 01:00:01,360
make it effective?

1255
01:00:02,079 --> 01:00:02,280
Speaker 4: Yeah?

1256
01:00:02,360 --> 01:00:06,440
Speaker 5: Great question. It usually depends on the complexity of the company

1257
01:00:06,480 --> 01:00:10,400
that we're working with from their infrastructure standpoint. But I'd

1258
01:00:10,400 --> 01:00:13,400
say for like a small startup, like Series B and below,

1259
01:00:13,519 --> 01:00:16,039
they usually get up and operational in like a couple hours.

1260
01:00:16,440 --> 01:00:19,119
So like you're usually using some cloud provider like AWS

1261
01:00:19,159 --> 01:00:22,559
as your secrets manager, we would directly set up a

1262
01:00:22,599 --> 01:00:25,280
sink or an integration and write directly into that, so

1263
01:00:25,320 --> 01:00:26,800
you don't need to change your SDKs at all.

1264
01:00:26,960 --> 01:00:29,440
It's literally just putting like a layer on top, which

1265
01:00:29,480 --> 01:00:31,880
is Doppler, and then they would download our CLI and

1266
01:00:31,960 --> 01:00:36,280
use that for local development. For much larger companies, it

1267
01:00:36,320 --> 01:00:38,880
can take on the order of weeks. We've had very

1268
01:00:38,920 --> 01:00:42,159
few go longer than a few weeks, even like extremely large entities,

1269
01:00:42,199 --> 01:00:45,239
like we have one customer that's like on a

1270
01:00:45,280 --> 01:00:48,800
three-year rollout to one hundred and twenty five

1271
01:00:48,800 --> 01:00:52,159
thousand engineers, and even them, it was like a couple

1272
01:00:52,199 --> 01:00:55,400
of weeks to get them like integrated in their infrastructure.

1273
01:00:55,639 --> 01:00:58,719
And usually it's like just connecting dots. It's okay, you're

1274
01:00:58,800 --> 01:01:01,599
using a cloud secrets manager, you're using GitHub workflows.

1275
01:01:01,599 --> 01:01:02,840
Speaker 4: Here, we're just gonna connect all.

1276
01:01:02,800 --> 01:01:04,840
Speaker 5: The dots into Doppler and then from there we're going

1277
01:01:04,880 --> 01:01:07,199
to add local development through like the VS Code extension

1278
01:01:07,199 --> 01:01:10,559
we have plus the Doppler CLI, and so it's a

1279
01:01:10,599 --> 01:01:14,000
pretty formulaic approach that's very quick, I would say.

1280
01:01:14,679 --> 01:01:16,760
And when you set up those integrations, we'll do bulk

1281
01:01:16,800 --> 01:01:19,639
imports as well, so we try to streamline a lot

1282
01:01:19,679 --> 01:01:21,440
of it. I'd say, where things get a little bit

1283
01:01:21,440 --> 01:01:23,760
more complex is if we have customers that like to

1284
01:01:23,800 --> 01:01:26,840
do everything the Terraform way, then we help them kind

1285
01:01:26,840 --> 01:01:28,400
of like set up all that stuff instead of in

1286
01:01:28,440 --> 01:01:30,920
the UI, but through Terraform. So you can actually manage

1287
01:01:30,920 --> 01:01:33,079
pretty much every part of Doppler, including all the

1288
01:01:33,159 --> 01:01:34,880
orchestration and workflows that.

1289
01:01:34,840 --> 01:01:36,440
Speaker 4: We have in Doppler through Terraform.

1290
01:01:37,280 --> 01:01:39,400
Speaker 5: And I would say that would slow them

1291
01:01:39,440 --> 01:01:42,119
down a little bit initially, but then speed them

1292
01:01:42,199 --> 01:01:44,039
up over the long haul when they are like

1293
01:01:44,079 --> 01:01:46,280
bringing on like tens of thousands of projects.

1294
01:01:46,719 --> 01:01:50,400
Speaker 1: I like that you really gave credence to that,

1295
01:01:50,400 --> 01:01:53,880
that using IaC there does actually slow you down

1296
01:01:54,000 --> 01:01:56,480
for development. I do think there's a lot of truth

1297
01:01:56,559 --> 01:01:58,800
there that people don't like to listen to.

1298
01:02:00,480 --> 01:02:02,800
Speaker 4: Yeah, it's just like a high upfront cost, but then

1299
01:02:03,039 --> 01:02:03,880
it pays dividends.

1300
01:02:04,199 --> 01:02:07,840
Speaker 1: There's something here where like the IaC for the

1301
01:02:07,880 --> 01:02:12,760
cloud providers and whatnot don't always optimize for easy interactions

1302
01:02:12,840 --> 01:02:17,199
right? That it's to have it be fixed and

1303
01:02:17,920 --> 01:02:20,480
maybe work and make it easy to make changes, but

1304
01:02:20,880 --> 01:02:25,920
not really quick and easy. Whereas your console, your UI,

1305
01:02:26,119 --> 01:02:29,960
your CLI are optimized for ease of use and getting

1306
01:02:29,960 --> 01:02:31,519
to the value as quick as possible.

1307
01:02:32,559 --> 01:02:34,960
Speaker 5: Yeah, I would say that IaC, it's like, I

1308
01:02:35,039 --> 01:02:38,199
almost see it as like stability insurance, right? Like

1309
01:02:38,280 --> 01:02:41,199
it's there to keep you stable and reliable and

1310
01:02:41,320 --> 01:02:44,119
very predictable, but it's not there for speed.

1311
01:02:45,920 --> 01:02:48,639
Speaker 4: Though you can, once you have

1312
01:02:48,800 --> 01:02:52,400
Speaker 5: Achieved stability, it's very easy to boot up new infrastructure

1313
01:02:52,400 --> 01:02:54,480
with it, which can give you some speed benefits. But

1314
01:02:54,920 --> 01:02:57,559
sure, that usually comes at the cost of a lot

1315
01:02:57,559 --> 01:03:01,559
of upfront investment, so it's like you're paying down

1316
01:03:01,599 --> 01:03:05,519
that, or drawing down all that initial investment over time.

1317
01:03:07,239 --> 01:03:10,000
Speaker 1: So I know we opened a huge topic by saying

1318
01:03:10,079 --> 01:03:12,639
we're going to talk about secrets management, and I think

1319
01:03:12,679 --> 01:03:16,559
we definitely dove into all the different tangents that

1320
01:03:16,599 --> 01:03:18,960
were related to it. Was there anything on this that

1321
01:03:19,000 --> 01:03:21,719
you feel like we missed that you think would have

1322
01:03:21,760 --> 01:03:26,559
a huge impact for the audience, something maybe some special

1323
01:03:26,559 --> 01:03:28,400
wisdom there, anything worth sharing?

1324
01:03:29,000 --> 01:03:31,199
Speaker 5: Yeah, I would say that the only thing that we

1325
01:03:31,280 --> 01:03:34,960
didn't touch on is that it's better to start early. I know,

1326
01:03:35,000 --> 01:03:38,719
when you're first starting out, you're building a company, you're

1327
01:03:38,760 --> 01:03:40,679
all about like I need to get to product market

1328
01:03:40,760 --> 01:03:43,320
fit, and then at some point you'll need to think

1329
01:03:43,360 --> 01:03:45,760
about like, how do I protect the company now that

1330
01:03:45,760 --> 01:03:46,920
I've achieved product market fit?

1331
01:03:47,360 --> 01:03:53,400
Speaker 4: And what we found is that adding security later

1332
01:03:53,440 --> 01:03:55,400
on, if it's not part of the DNA of the company,

1333
01:03:55,519 --> 01:03:57,079
just almost never works.

1334
01:03:57,119 --> 01:04:00,239
Speaker 5: Like you'll feel like you're taking

1335
01:04:00,239 --> 01:04:02,280
all the right actions, but it won't actually give you

1336
01:04:02,320 --> 01:04:05,440
the security benefit you're looking for, and thus you'll be vulnerable.

1337
01:04:05,840 --> 01:04:10,360
And so my strongest suggestion would be to use a

1338
01:04:10,440 --> 01:04:13,960
secrets manager, Doppler or something else, day one. It does not

1339
01:04:14,039 --> 01:04:16,239
have to be the best solution, but just use something,

1340
01:04:16,599 --> 01:04:18,360
get used to it, build it as part of the

1341
01:04:18,440 --> 01:04:22,760
DNA of your company and of your engineering philosophy, and

1342
01:04:22,800 --> 01:04:24,000
then grow with it over time.

1343
01:04:24,559 --> 01:04:26,320
Speaker 4: And at least there you're.

1344
01:04:26,000 --> 01:04:28,960
Speaker 5: Coming from a stable foundation versus one that's cracked as

1345
01:04:28,960 --> 01:04:30,840
hell that you're trying to layer a building on

1346
01:04:30,880 --> 01:04:31,199
top of.

1347
01:04:32,840 --> 01:04:34,239
Speaker 4: Because I think it'll save.

1348
01:04:34,079 --> 01:04:36,320
Speaker 5: You a lot of headaches and time down the line,

1349
01:04:37,199 --> 01:04:38,920
and also I think the right secrets manager, for what

1350
01:04:38,960 --> 01:04:41,719
it is worth, will make you feel more productive. So

1351
01:04:41,920 --> 01:04:43,679
just calling that out there, that's all I got.

1352
01:04:44,280 --> 01:04:47,679
Speaker 2: Seems fair enough. On that note, should we do some picks? Yeah,

1353
01:04:47,760 --> 01:04:49,480
let's do it. All right?

1354
01:04:49,679 --> 01:04:53,159
Speaker 3: Cool, Jillian, I'm throwing you under the bus because you've

1355
01:04:53,199 --> 01:04:55,119
been gone for a while. What did you bring for

1356
01:04:55,199 --> 01:04:57,320
a pick today? I mean, it's got to be something

1357
01:04:57,320 --> 01:05:01,599
cool since you've had plenty of time to figure something out.

1358
01:05:02,039 --> 01:05:03,760
Speaker 6: I mean, that's a lot of pressure for what I

1359
01:05:03,800 --> 01:05:06,880
was gonna pick. I was just gonna like,

1360
01:05:07,199 --> 01:05:10,400
I'm looking out my window, it's spooky season, it's autumn.

1361
01:05:10,639 --> 01:05:13,679
It's one of the few live action videos before like

1362
01:05:13,760 --> 01:05:16,239
cell phones, that my kids will actually watch. And so

1363
01:05:16,599 --> 01:05:19,719
that was it. Hocus Pocus. Maybe I'll throw

1364
01:05:19,760 --> 01:05:20,880
Halloween in there too.

1365
01:05:20,800 --> 01:05:24,119
Speaker 3: Just in general. Halloween the movie or the holiday?

1366
01:05:24,400 --> 01:05:28,760
Speaker 6: No, the holiday. I don't like any weird horror scary movies. Okay,

1367
01:05:29,760 --> 01:05:34,199
Like Hocus Pocus is as scary as it gets for me.

1368
01:05:34,519 --> 01:05:35,840
Speaker 1: I don't know why you're laughing, Will, like that.

1369
01:05:36,119 --> 01:05:38,840
Speaker 7: That Hocus Pocus movie for sure traumatized

1370
01:05:38,280 --> 01:05:41,760
Speaker 1: me as a child. Like, I didn't

1371
01:05:41,800 --> 01:05:45,000
appreciate that experience. That was very troubling, especially

1372
01:05:45,000 --> 01:05:48,119
with the cat. You know, there's an animal death in

1373
01:05:48,159 --> 01:05:50,199
the end, and that's always a struggle for me.

1374
01:05:51,360 --> 01:05:54,800
Speaker 6: Oh, that's true. I was okay with Hocus Pocus.

1375
01:05:54,800 --> 01:05:56,960
The NeverEnding Story for some reason is like the

1376
01:05:57,039 --> 01:05:59,760
kids movie that really freaked me out, so we can

1377
01:05:59,880 --> 01:06:03,039
like bond over our shared trauma of nineties movies. Like

1378
01:06:03,119 --> 01:06:03,599
another day.

1379
01:06:03,920 --> 01:06:06,400
Speaker 1: I actually liked The NeverEnding Story, you know,

1380
01:06:06,440 --> 01:06:09,079
a different world was, you know, sufficiently interesting.

1381
01:06:09,320 --> 01:06:11,639
Speaker 4: I liked, back in the day, Halloweentown, if you've

1382
01:06:11,679 --> 01:06:14,760
ever heard of it. It was like a Disney series,

1383
01:06:14,800 --> 01:06:16,719
a couple of movies. But I would say it's great

1384
01:06:16,719 --> 01:06:20,960
for kids. There's magic in it, a little scary, but

1385
01:06:21,000 --> 01:06:21,679
not too scary.

1386
01:06:23,039 --> 01:06:25,679
Speaker 6: I love all the kids Halloween movies and stuff like that.

1387
01:06:25,679 --> 01:06:26,519
They're so much fun.

1388
01:06:27,159 --> 01:06:30,440
Speaker 2: They are, all right, Warren, how about you? What'd you

1389
01:06:30,440 --> 01:06:31,000
bring for a pick?

1390
01:06:31,400 --> 01:06:32,679
Speaker 1: You know, I thought for sure you were

1391
01:06:32,679 --> 01:06:35,199
gonna tell me to go first again, which I you know,

1392
01:06:35,280 --> 01:06:36,280
I've just gotten so used to.

1393
01:06:36,320 --> 01:06:39,199
Speaker 7: I'm happy to just do it from here on out.

1394
01:06:39,480 --> 01:06:41,039
Speaker 2: You're gonna fall on your sword for us?

1395
01:06:42,239 --> 01:06:44,440
Speaker 1: Yeah, I'll just admit that. So when

1396
01:06:44,480 --> 01:06:46,039
you asked me how I was doing, I

1397
01:06:46,079 --> 01:06:48,360
had the whole episode to actually think about that,

1398
01:06:48,840 --> 01:06:51,599
and what I came to is, so I guess I

1399
01:06:51,760 --> 01:06:53,480
sort of have two picks today. The first one is

1400
01:06:53,920 --> 01:06:57,119
today is like one of the first days of

1401
01:06:57,159 --> 01:07:00,800
my short little intermittent fasting that I do every quarter

1402
01:07:00,920 --> 01:07:03,000
where I just like don't eat any food for three

1403
01:07:03,039 --> 01:07:06,840
to seven days and it just makes me feel so

1404
01:07:06,920 --> 01:07:09,480
much better. There's a bunch of health benefits associated with

1405
01:07:10,599 --> 01:07:13,920
taking a break from eating, and I feel better usually

1406
01:07:13,920 --> 01:07:15,639
by the time I'm done, so like sort of like

1407
01:07:15,639 --> 01:07:18,920
a full cleanse in a way. I highly recommend it. There's

1408
01:07:19,440 --> 01:07:22,840
no negative health impact at all from doing this. I mean,

1409
01:07:22,880 --> 01:07:25,639
it's sort of psychological. It's nice to think like, oh,

1410
01:07:25,800 --> 01:07:28,840
I don't have to eat like all the time. So

1411
01:07:29,360 --> 01:07:32,239
I really appreciate that. It's been so great since I started doing it.

1412
01:07:32,559 --> 01:07:34,360
And I think my real pick for today is going

1413
01:07:34,400 --> 01:07:36,360
to be a book, because I always pick a book

1414
01:07:36,400 --> 01:07:39,760
I guess called Drive by Daniel H. Pink that talks

1415
01:07:39,800 --> 01:07:43,519
about the truth of what motivates us and how we

1416
01:07:43,559 --> 01:07:46,880
as humans get motivated and what actually we use to

1417
01:07:47,000 --> 01:07:50,960
drive our desire, what we work on, what we find

1418
01:07:51,039 --> 01:07:55,800
enjoyment in. And the one really important learning that I

1419
01:07:55,920 --> 01:07:59,280
pulled out from the book is that you will always

1420
01:07:59,360 --> 01:08:04,519
lose happiness in something where you expect a reward for it.

1421
01:08:04,679 --> 01:08:06,599
So if you really like doing something, and then you

1422
01:08:06,639 --> 01:08:08,920
make that your job, and your job pays you to

1423
01:08:08,960 --> 01:08:12,639
do that thing, you will actually lose motivation to perform

1424
01:08:12,679 --> 01:08:15,199
that action. Which is sort of a sad story, but

1425
01:08:15,280 --> 01:08:17,560
I think there's a lot of truth in it. When

1426
01:08:17,600 --> 01:08:19,880
you start associating money with the thing that you're doing.

1427
01:08:20,920 --> 01:08:24,600
Speaker 3: Yeah, I've seen that happen in my own life, and

1428
01:08:23,720 --> 01:08:25,640
it's surprising.

1429
01:08:25,840 --> 01:08:28,720
Speaker 2: You're like, oh, damn, I used to enjoy this.

1430
01:08:28,840 --> 01:08:32,560
Speaker 3: What happened? In retrospect, you're like, oh, I turned it

1431
01:08:32,600 --> 01:08:33,319
into a job.

1432
01:08:33,600 --> 01:08:34,279
Speaker 2: That's my bad.

1433
01:08:34,640 --> 01:08:38,199
Speaker 6: You should not monetize your hobbies, right? Throw that out

1434
01:08:38,239 --> 01:08:42,319
there as life advice. Just sit down and crochet

1435
01:08:42,399 --> 01:08:46,119
your scarves or whatever. It doesn't have to make you money.

1436
01:08:46,880 --> 01:08:48,520
Speaker 1: Now, I feel like there's a story there that I'm

1437
01:08:48,560 --> 01:08:50,199
gonna want to have to hear at some point.

1438
01:08:50,560 --> 01:08:52,760
Speaker 3: You're gonna have to do like a non DevOps episode

1439
01:08:52,800 --> 01:08:55,119
at one point just to cover some of these topics

1440
01:08:55,119 --> 01:08:58,479
we keep bringing up. I think, so, Brian, what about

1441
01:08:58,520 --> 01:08:59,520
you would you bring for a pick?

1442
01:09:00,239 --> 01:09:05,720
Speaker 5: I got two. The first one is Perplexity. It's kind

1443
01:09:05,720 --> 01:09:09,439
of like the new version of Google, and like imagine

1444
01:09:09,439 --> 01:09:11,279
you're trying to like answer a question like how do

1445
01:09:11,359 --> 01:09:14,600
I cook steak with an air fryer? And like instead

1446
01:09:14,600 --> 01:09:16,159
of googling that where you can get a bunch of

1447
01:09:16,239 --> 01:09:17,840
links you have to go through, it kind of just

1448
01:09:17,920 --> 01:09:20,520
gives you the answer. So it's like ChatGPT but with

1449
01:09:20,720 --> 01:09:25,159
like search capabilities and advanced reasoning built in. There pretty

1450
01:09:25,199 --> 01:09:27,600
much hasn't been a question that I've thrown at it where

1451
01:09:27,600 --> 01:09:30,239
it failed. Like it's really good, including like

1452
01:09:30,279 --> 01:09:33,319
what happened yesterday in the news, like really really recent stuff.

1453
01:09:33,359 --> 01:09:35,760
It does a great job of analyzing and you can

1454
01:09:35,800 --> 01:09:38,560
ask it like very complex chain-of-thought questions where

1455
01:09:38,560 --> 01:09:40,800
like I find a lot of times I'm searching

1456
01:09:40,800 --> 01:09:42,720
for something and then I'll have a follow up question

1457
01:09:42,840 --> 01:09:45,680
and so now I need to like ask another query

1458
01:09:45,680 --> 01:09:47,680
into Google. And with this, it's like

1459
01:09:47,720 --> 01:09:50,520
almost like a thread of questions so you can get

1460
01:09:50,560 --> 01:09:52,800
like every follow up question has all the context of

1461
01:09:52,840 --> 01:09:54,720
the first question or the previous questions.

1462
01:09:55,560 --> 01:09:58,880
Speaker 4: So I really love Perplexity. I've completely shifted.

1463
01:09:58,520 --> 01:10:01,600
Speaker 5: Over to using Perplexity for almost any knowledge-based question

1464
01:10:01,640 --> 01:10:03,279
of like how do I do X? Or what do

1465
01:10:03,279 --> 01:10:04,920
I think about this? Or what happened in the stock market?

1466
01:10:04,960 --> 01:10:05,880
Why did this price go up?

1467
01:10:05,920 --> 01:10:06,479
Speaker 4: Whatever? It is.

1468
01:10:07,840 --> 01:10:12,960
Speaker 5: The second one is pickleball. I love pickleball, love it

1469
01:10:13,119 --> 01:10:15,319
so much. I try to play like five days a week,

1470
01:10:15,399 --> 01:10:19,319
two to three hours a day. It's hyper-addicting. I

1471
01:10:19,439 --> 01:10:23,399
burn like a thousand to eighteen hundred calories a session.

1472
01:10:23,560 --> 01:10:26,199
It's like a really high amount of calories. And it

1473
01:10:26,199 --> 01:10:28,319
doesn't feel like a workout at all. Like it's probably

1474
01:10:28,319 --> 01:10:31,680
the first workout sport that I've played that I genuinely

1475
01:10:31,760 --> 01:10:34,199
love doing, and I'm not just like

1476
01:10:34,239 --> 01:10:36,960
waiting for it to end. So if there's any people

1477
01:10:37,000 --> 01:10:39,840
like that out there who have, like, always wanted

1478
01:10:39,840 --> 01:10:41,279
to get into sports or always wanted to get in

1479
01:10:41,279 --> 01:10:43,039
the gym, but always felt like they never really connected

1480
01:10:43,079 --> 01:10:45,760
with them, try pickleball. There's something magical about it.

1481
01:10:45,880 --> 01:10:47,399
Almost everyone I show it to falls.

1482
01:10:47,239 --> 01:10:47,720
Speaker 6: In love with it.

1483
01:10:48,800 --> 01:10:51,760
Speaker 3: Yeah, I've never played, but I know people who do,

1484
01:10:52,159 --> 01:10:57,319
and like the level of passion amongst the pickleball

1485
01:10:57,359 --> 01:11:00,199
community is pretty wild.

1486
01:10:59,720 --> 01:11:00,760
Speaker 4: Isn't like?

1487
01:11:00,800 --> 01:11:03,960
Speaker 5: And it's very, uh, it transfers well, like you

1488
01:11:03,960 --> 01:11:06,039
catch that bug where you're like, I just got

1489
01:11:06,079 --> 01:11:06,680
to be playing today.

1490
01:11:06,720 --> 01:11:08,359
Speaker 4: I did not play at all, and I feel guilty.

1491
01:11:10,279 --> 01:11:10,720
I don't know.

1492
01:11:10,920 --> 01:11:14,279
Speaker 5: It's, I, the last time I felt this level

1493
01:11:14,319 --> 01:11:17,680
of passion and drive for something was like building a company.

1494
01:11:18,520 --> 01:11:20,920
So that's like a pretty high bar for me.

1495
01:11:20,960 --> 01:11:25,399
It's like, uh, so strongly recommend.

1496
01:11:24,760 --> 01:11:28,600
Speaker 3: Right on excellent, So I'm gonna pick because we were

1497
01:11:28,640 --> 01:11:32,880
talking about this before we started recording the episode. The

1498
01:11:32,920 --> 01:11:38,399
book by William Forstchen, One Second After. It's a science

1499
01:11:38,439 --> 01:11:40,680
fiction novel that talks.

1500
01:11:40,439 --> 01:11:42,359
Speaker 2: About the struggles of.

1501
01:11:42,279 --> 01:11:47,640
Speaker 3: A community after an EMP, or electromagnetic pulse bomb, was

1502
01:11:47,640 --> 01:11:51,199
detonated over the United States and so that wipes out

1503
01:11:51,319 --> 01:11:55,880
all of our modern infrastructure, and it's I just found

1504
01:11:55,880 --> 01:11:59,079
it to be a fascinating book because of the things

1505
01:11:59,119 --> 01:12:02,760
that happened. It's, you know, things you don't normally think about,

1506
01:12:03,960 --> 01:12:06,399
and it was a it was well written and very

1507
01:12:06,479 --> 01:12:09,640
engaging throughout. So if you're into that kind of stuff,

1508
01:12:10,439 --> 01:12:14,840
definitely a cool book to read. And I think that

1509
01:12:14,880 --> 01:12:17,479
brings us to the end of the episode. Brian, thank

1510
01:12:17,479 --> 01:12:20,039
you so much. It's been great having you on the show.

1511
01:12:20,600 --> 01:12:27,159
Very insightful and yeah, crap, now I got work to do, thanks.

1512
01:12:26,840 --> 01:12:31,800
Speaker 4: Man, right.

1513
01:12:33,039 --> 01:12:34,760
Speaker 6: Leaving this episode with a checklist?

1514
01:12:35,479 --> 01:12:37,399
Speaker 2: Yeah, no kidding for real.

1515
01:12:39,319 --> 01:12:41,359
Speaker 3: Jillian, it's nice to have you back on the show Warren,

1516
01:12:41,560 --> 01:12:45,640
as always, thank you for being here and Brian hope

1517
01:12:45,680 --> 01:12:46,439
to hear from you again.

1518
01:12:46,680 --> 01:12:50,880
Speaker 2: Best of luck with Doppler and keep in touch. Will do.

1519
01:12:51,119 --> 01:12:53,920
Speaker 3: Thank you all right, you bet, And to all our listeners,

1520
01:12:53,920 --> 01:12:55,680
thank you for listening and we'll see you all next

1521
01:12:55,720 --> 01:12:55,920
week.

