1
00:00:07,960 --> 00:00:12,199
Speaker 1: And welcome back to Adventures in DevOps. I almost can't

2
00:00:12,199 --> 00:00:15,359
believe it. Attribute has returned to sponsor the podcast for

3
00:00:15,400 --> 00:00:16,399
a second week.

4
00:00:16,199 --> 00:00:16,679
Speaker 2: In a row.

5
00:00:16,879 --> 00:00:19,760
Speaker 1: What they are doing in the FinOps space is genius. They call

6
00:00:19,800 --> 00:00:22,640
it FinOps Without Tagging, and it's the first runtime

7
00:00:22,640 --> 00:00:26,600
technology that analyzes your infrastructure instead of relying on billing reports,

8
00:00:26,679 --> 00:00:30,679
exports, and tagging. It's for architecture, ops, and platform teams

9
00:00:30,719 --> 00:00:34,479
that need visibility into product or customer attribution of spend, or

10
00:00:34,520 --> 00:00:37,840
insight into cost anomalies without wasting hours. They capture the

11
00:00:37,960 --> 00:00:42,079
cost based on actual application usage generated from Kubernetes, databases,

12
00:00:42,119 --> 00:00:45,520
storage and over thirty five multi cloud services. And they

13
00:00:45,520 --> 00:00:47,560
even break it down by microservice all the way

14
00:00:47,600 --> 00:00:49,840
to the database query level. And I don't think I've

15
00:00:49,880 --> 00:00:52,719
ever seen another tool that does quite that. So you

16
00:00:52,759 --> 00:00:55,119
want to check them out. That's Attribute and I'll drop

17
00:00:55,119 --> 00:00:57,439
a link in the episode description. And now back to

18
00:00:57,520 --> 00:01:00,640
the show and our guest today. They actually may have

19
00:01:00,640 --> 00:01:03,000
a lot to say about that, currently an EM of

20
00:01:03,039 --> 00:01:06,400
platform engineering at an undisclosed large company. The interesting thing

21
00:01:06,400 --> 00:01:09,920
about how long you've been in DevOps is you probably

22
00:01:09,959 --> 00:01:12,680
started with the mindset long before it was even coined

23
00:01:12,719 --> 00:01:15,200
as a term. And I'd like to introduce Adam Korga.

24
00:01:15,560 --> 00:01:17,840
Speaker 3: So, like you say, I did actually start in this

25
00:01:18,159 --> 00:01:21,719
before DevOps, before it became a term. For full disclosure,

26
00:01:21,799 --> 00:01:23,719
not all the time as DevOps. So I started as

27
00:01:23,719 --> 00:01:27,159
a backend engineer. Then at some point I figured out

28
00:01:27,200 --> 00:01:29,159
that it was called release.

29
00:01:28,920 --> 00:01:31,920
Speaker 4: Engineering back then, and it wasn't even on virtual machines

30
00:01:32,040 --> 00:01:33,480
I experienced stuff.

31
00:01:33,239 --> 00:01:35,359
Speaker 1: When did you figure out what you were doing was

32
00:01:35,439 --> 00:01:39,480
actually embodying the DevOps mindset and not just doing what

33
00:01:39,560 --> 00:01:40,519
someone else told you to do.

34
00:01:41,079 --> 00:01:45,480
Speaker 3: I always wanted to do things safer or better.

35
00:01:45,799 --> 00:01:50,079
That was one thing. And second, how we can avoid

36
00:01:50,400 --> 00:01:55,519
mistakes like accidentally dropping a production database. Spoiler alert:

37
00:01:55,680 --> 00:01:56,200
Speaker 4: I did that.

38
00:01:56,319 --> 00:01:58,319
Speaker 1: The first thing that came to mind was the incident that one

39
00:01:58,359 --> 00:02:03,519
of the quote unquote AI providers had where they dropped

40
00:02:03,599 --> 00:02:09,879
their customers' databases programmatically through their LLM, accidentally, because

41
00:02:09,919 --> 00:02:12,520
the LLM decided to go and do this. So I

42
00:02:12,520 --> 00:02:15,759
think we've come full circle: inexperienced engineers who didn't have

43
00:02:15,759 --> 00:02:18,639
a lot of infrastructure experience accidentally dropping databases. I think

44
00:02:18,639 --> 00:02:21,080
in twenty sixteen was the major event from one of

45
00:02:21,080 --> 00:02:25,479
the well-known Git servers, and now we've moved to having LLMs

46
00:02:25,479 --> 00:02:30,280
Speaker 3: Do it. It illustrates the famous joke that there are two

47
00:02:30,360 --> 00:02:33,800
kinds of people: people who do backups and people

48
00:02:33,879 --> 00:02:36,680
who will do backups. If you work in DevOps, you know

49
00:02:36,800 --> 00:02:40,159
that there is also a third category, people who test

50
00:02:40,159 --> 00:02:41,319
if their backups are working.

51
00:02:41,439 --> 00:02:44,120
Speaker 1: Thinking about GitLab, where they had an issue with

52
00:02:44,159 --> 00:02:48,199
their SQL database, and I think it was Postgres,

53
00:02:48,240 --> 00:02:51,319
it may have been MySQL, and they accidentally dropped

54
00:02:51,319 --> 00:02:54,280
the production database instead of the replica, because

55
00:02:54,360 --> 00:02:55,479
the replication was failing.

56
00:02:55,479 --> 00:02:57,560
Speaker 3: And this was like, that's a different story, because I

57
00:02:57,599 --> 00:03:03,199
thought about the GitHub situation where they ran rm -rf by accident.

58
00:03:03,439 --> 00:03:07,840
They had five backups. One wasn't working for a couple months,

59
00:03:08,759 --> 00:03:12,639
the second was on the same servers, so it was deleted

60
00:03:12,719 --> 00:03:14,560
as well, and so on and so on and so on.

61
00:03:15,400 --> 00:03:18,599
It was a fun day. Well, I say fun; probably for

62
00:03:18,680 --> 00:03:23,599
the people involved it wasn't so fun. Pixar almost went bankrupt because

63
00:03:23,599 --> 00:03:28,120
of an rm -rf, because during the production of Toy Story 2,

64
00:03:28,919 --> 00:03:34,000
they accidentally deleted the entire movie. The entire movie was saved because

65
00:03:34,240 --> 00:03:38,719
one of the directors was, due to maternity leave,

66
00:03:39,000 --> 00:03:42,680
was working often from home, and she had the files

67
00:03:42,719 --> 00:03:47,120
on her private computer, which of course violated the

68
00:03:47,120 --> 00:03:51,599
processes in the company, but it was the only surviving

69
00:03:51,680 --> 00:03:56,280
copy after this incident, so they moved her home

70
00:03:56,560 --> 00:03:58,599
PC to the office to restore the movie.

71
00:03:59,199 --> 00:04:01,240
Speaker 4: And that's why we have studied tool.

72
00:04:01,280 --> 00:04:03,199
Speaker 1: Actually, that's not the first time I've heard of that

73
00:04:03,439 --> 00:04:06,400
saving move. It was a while ago, I think it

74
00:04:06,400 --> 00:04:09,360
was almost fifteen years ago now where there was an

75
00:04:09,439 --> 00:04:12,960
issue with, I believe, KDE, the UI for Linux distributions,

76
00:04:13,159 --> 00:04:16,360
where there was some corruption that got into I think

77
00:04:16,360 --> 00:04:19,279
this was like before they were using Git, got into

78
00:04:19,279 --> 00:04:21,720
the source control and because of how they were replicating

79
00:04:21,879 --> 00:04:24,040
all of it to all the mirrors where all the

80
00:04:24,040 --> 00:04:27,040
source was, there was like an automatic process to clone from

81
00:04:27,040 --> 00:04:29,279
the source. Like as soon as the corruption got in there,

82
00:04:29,360 --> 00:04:32,720
every single copy of it in existence that

83
00:04:32,920 --> 00:04:34,959
was cloned got that corruption in it, and they

84
00:04:35,000 --> 00:04:37,120
were all worthless. And so they had to go around

85
00:04:37,240 --> 00:04:40,319
like individual engineers who like hadn't set up or specifically

86
00:04:40,360 --> 00:04:44,519
disabled the automatic replication, to rebuild the source code

87
00:04:44,519 --> 00:04:46,720
index there. Now, I really want to talk about post

88
00:04:46,759 --> 00:04:49,600
mortems with you and like how they actually reviewed the situation.

89
00:04:49,680 --> 00:04:51,800
But before we get to that, one of the reasons

90
00:04:51,800 --> 00:04:55,399
that we grabbed you for this episode is that you released

91
00:04:55,480 --> 00:04:59,240
quite the satirical book, The Guide to the Industry IT Dictionary,

92
00:04:59,399 --> 00:05:02,680
So this thing is absolutely ridiculous, this book. I mean

93
00:05:02,680 --> 00:05:05,720
it really is a dictionary. You've got a bunch of

94
00:05:05,839 --> 00:05:11,199
I'll say euphemisms with their implied real-world interpretation

95
00:05:11,360 --> 00:05:14,079
of them, and it just goes on and on. For

96
00:05:14,120 --> 00:05:16,519
these I feel like, you know, I had to pick

97
00:05:16,519 --> 00:05:18,480
it up and you know, read something then put it down.

98
00:05:18,360 --> 00:05:22,600
Speaker 3: Like every single entry is a separate punchline, so it could

99
00:05:22,680 --> 00:05:26,079
be repetitive and exhausting if you try to read

100
00:05:26,120 --> 00:05:29,639
it as a typical linear book. I don't recommend it.

101
00:05:29,800 --> 00:05:32,319
I won't judge where you hold this book. It could

102
00:05:32,360 --> 00:05:34,680
be on your desk, it could be in other places.

103
00:05:34,680 --> 00:05:36,959
And I'm fine with that as long as you have found

104
00:05:37,000 --> 00:05:41,920
it good. But there is a second layer behind that, because

105
00:05:42,360 --> 00:05:47,439
it is divided into five parts and each describes different

106
00:05:47,480 --> 00:05:53,920
areas, like quality, Agile rituals, corporations, startups.

107
00:05:53,439 --> 00:05:57,680
Speaker 4: And of course we have to mention AI nowadays.

108
00:05:57,120 --> 00:06:00,000
Speaker 1: There is something that reminds me of Randall Munroe's xkcd,

109
00:06:00,360 --> 00:06:02,399
Like if I just pull out some of the jokes,

110
00:06:02,920 --> 00:06:05,079
it's like, I mean, what he takes three or four

111
00:06:05,079 --> 00:06:08,720
panels to get across. You have like a single core

112
00:06:08,920 --> 00:06:11,040
message that wraps it up, like just as an example

113
00:06:11,079 --> 00:06:13,839
one like ticket types. One of your ticket types is

114
00:06:13,879 --> 00:06:16,879
like quick bug, easy to reproduce, and then like if

115
00:06:16,879 --> 00:06:19,639
you read further, the actual translation is: two weeks later you're knee-

116
00:06:19,639 --> 00:06:22,360
deep in system calls, questioning your life choices. I think

117
00:06:22,360 --> 00:06:25,319
a lot of us have probably seen ourselves there

118
00:06:25,319 --> 00:06:27,879
where you're trying to fix something. It's usually on our

119
00:06:27,920 --> 00:06:31,600
own computers, like not even from the company. And the

120
00:06:31,720 --> 00:06:34,480
longer you spend trying to fix the problem, the worse

121
00:06:34,519 --> 00:06:35,240
the issue gets.

122
00:06:35,439 --> 00:06:35,639
Speaker 4: Well.

123
00:06:35,680 --> 00:06:39,040
Speaker 3: I try to really show how it looks in real life.

124
00:06:39,160 --> 00:06:43,360
Sometimes I'm going also on the typical jargon that doesn't

125
00:06:43,399 --> 00:06:47,519
necessarily touch IT or the core of IT itself.

126
00:06:47,560 --> 00:06:48,959
Speaker 1: I mean, we should go through one of the sections.

127
00:06:48,959 --> 00:06:50,680
I feel like that would be the most interesting. So

128
00:06:51,079 --> 00:06:52,319
should we call it the agile one?

129
00:06:52,360 --> 00:06:52,680
Speaker 4: I don't know.

130
00:06:52,800 --> 00:06:57,240
Speaker 3: I have to admit that the introductions in the Agile part are

131
00:06:57,639 --> 00:06:58,560
my favorite ones.

132
00:06:59,240 --> 00:07:01,439
Speaker 1: So there's like this something for everyone here, no matter

133
00:07:01,439 --> 00:07:05,000
what your flavor of engineering is that you're working in, or.

134
00:07:04,959 --> 00:07:06,240
Speaker 2: Even if you're outside the field.

135
00:07:06,399 --> 00:07:10,040
Speaker 1: I see a lot of companies saying they do agile

136
00:07:10,279 --> 00:07:13,360
and not doing it. This one always hits quite close

137
00:07:13,399 --> 00:07:14,680
to home for me, so I just I want to

138
00:07:14,720 --> 00:07:16,800
read some of these out because I just think I

139
00:07:16,800 --> 00:07:20,040
think they're quite funny. So there's like the term we're

140
00:07:20,040 --> 00:07:23,600
doing agile, and what you have here it actually means

141
00:07:23,639 --> 00:07:26,279
is we have a daily panic meeting and call it

142
00:07:26,319 --> 00:07:31,199
a methodology. And then there's servant leadership: booking meetings and

143
00:07:31,240 --> 00:07:34,199
politely begging people to update JIRA. I mean obviously because

144
00:07:35,040 --> 00:07:37,079
you no longer have the authority to tell people

145
00:07:37,079 --> 00:07:39,399
what to do and you're just there to support them,

146
00:07:39,399 --> 00:07:42,639
but they're not performing the necessary activities. Anyway, Let's see

147
00:07:42,639 --> 00:07:45,879
if there's another particularly good one. Shielding the team: sitting

148
00:07:45,879 --> 00:07:47,879
in meetings they weren't invited to either.

149
00:07:48,519 --> 00:07:51,759
Speaker 3: There is even an entire subsection about

150
00:07:51,839 --> 00:07:56,800
the tools in agile, and my favorite there was Slack,

151
00:07:56,920 --> 00:07:58,519
which is distraction.

152
00:07:58,600 --> 00:07:59,600
Speaker 4: As I said, there are.

153
00:07:59,600 --> 00:08:03,800
Speaker 1: Some are actually good. For organizations, understanding what the tempo

154
00:08:03,800 --> 00:08:06,040
of the organization is by the Slack messages that you're

155
00:08:06,079 --> 00:08:09,040
getting and being attentive to problems that show up and

156
00:08:09,240 --> 00:08:12,399
responding to the team is incredibly valuable. But I find

157
00:08:12,439 --> 00:08:14,560
what the book really conveys here is that a lot of

158
00:08:14,600 --> 00:08:18,199
times the notifications represent just total noise.

159
00:08:19,199 --> 00:08:22,279
Speaker 3: Yeah, but that applies to everything. That's why the entire

160
00:08:22,319 --> 00:08:26,920
part is called cargo cult. Cargo cult is an actual term

161
00:08:26,959 --> 00:08:31,879
from World War Two which evolved into mimicking rituals without

162
00:08:31,959 --> 00:08:37,360
understanding the reasons. And it's not like Slack, Jira, Confluence or

163
00:08:37,399 --> 00:08:38,840
whatever is actually wrong.

164
00:08:38,919 --> 00:08:39,240
Speaker 4: It's not.

165
00:08:40,000 --> 00:08:43,559
Speaker 3: It's just about people using it, or abusing it,

166
00:08:43,720 --> 00:08:45,600
hoping that it will solve their problems.

167
00:08:45,799 --> 00:08:47,679
Speaker 1: I think this is actually really interesting to call out

168
00:08:47,679 --> 00:08:50,799
because the origins actually were quite interesting for me. So

169
00:08:51,120 --> 00:08:54,399
you're right, I believe it was Asia, and I mean realistically,

170
00:08:54,600 --> 00:08:58,759
just the shortcut is care packages and goods were dropped

171
00:08:58,799 --> 00:09:01,639
from like fighter jets on the ground, right, And so

172
00:09:01,759 --> 00:09:03,279
of course with that there's like a lot of other

173
00:09:03,320 --> 00:09:06,759
technology stack that you need to create there. And after

174
00:09:07,399 --> 00:09:10,720
World War Two, when the Allies left and stopped dropping

175
00:09:10,840 --> 00:09:13,759
care packages, they still wanted them to show up. So

176
00:09:14,360 --> 00:09:16,639
if you saw that a box appears when there

177
00:09:16,679 --> 00:09:22,480
are landing strips and airplanes and control towers and those

178
00:09:22,519 --> 00:09:24,799
are all gone now, well then you build the control

179
00:09:24,840 --> 00:09:28,120
towers and the airplanes and you hope that the care

180
00:09:28,159 --> 00:09:30,759
packages show up with it. Because you know, as you said,

181
00:09:30,799 --> 00:09:33,240
and I think is really important that you see what

182
00:09:33,360 --> 00:09:36,679
the practice is and you just replicate it without

183
00:09:36,799 --> 00:09:39,559
understanding what the reason was that was created or the

184
00:09:39,639 --> 00:09:43,399
goal in mind, then you're really just doing something which

185
00:09:43,440 --> 00:09:45,679
doesn't actually bring any value. And I think we see

186
00:09:45,720 --> 00:09:48,799
this through the industry a lot, and it gets replicated

187
00:09:48,799 --> 00:09:50,039
over and over again. And I think I'm going to

188
00:09:50,080 --> 00:09:51,879
get a lot of angry emails.

189
00:09:51,559 --> 00:09:54,120
Speaker 2: As a result of this. I think Kubernetes is a

190
00:09:54,159 --> 00:09:54,720
great one.

191
00:09:54,879 --> 00:09:58,200
Speaker 1: Like you see companies building the architecture and platform to

192
00:09:58,399 --> 00:10:01,519
support such a complicated technology and then they throw it

193
00:10:01,559 --> 00:10:04,360
in and I think they hope that they're going to

194
00:10:04,360 --> 00:10:05,519
be successful after that.

195
00:10:05,639 --> 00:10:08,120
Speaker 5: Well, that's actually worth picking up on.

196
00:10:08,240 --> 00:10:11,519
Speaker 3: In a startup, the product is not ready if it's not scalable,

197
00:10:12,080 --> 00:10:16,240
and as a result, companies invest a crazy amount of energy

198
00:10:16,279 --> 00:10:20,919
and money into scalability and microservice architecture, ensuring that it

199
00:10:20,960 --> 00:10:25,600
would scale to thousands, despite the fact the active user count

200
00:10:25,639 --> 00:10:28,279
is twelve, including QA.

201
00:10:28,519 --> 00:10:31,360
Speaker 1: Yeah, for sure. I think there's this comic

202
00:10:31,440 --> 00:10:34,080
that I'll end up linking after the episode for anyone

203
00:10:34,120 --> 00:10:39,080
that's on our website, Adventures in DevOps, where the canonical

204
00:10:39,240 --> 00:10:43,600
one is: some engineer goes to the platform architect: will this

205
00:10:43,840 --> 00:10:49,000
infrastructure support millions of users and requests per second? And

206
00:10:49,279 --> 00:10:52,600
the answer is another question: how many do you have right now?

207
00:10:52,879 --> 00:10:53,080
Speaker 4: Oh?

208
00:10:53,120 --> 00:10:53,840
Speaker 2: We have five?

209
00:10:54,200 --> 00:10:57,279
Speaker 1: Yes, that is the perfect architecture for what you're doing.

210
00:10:57,759 --> 00:11:01,000
Speaker 3: For sure, I have this problem. I had to learn

211
00:11:01,440 --> 00:11:05,399
to avoid this or notice this problem. But we tend

212
00:11:05,559 --> 00:11:07,639
to go with the best practices.

213
00:11:07,879 --> 00:11:10,919
Speaker 1: And I do want to ask what brought you to

214
00:11:10,960 --> 00:11:13,600
the point where you felt like you had to get

215
00:11:13,639 --> 00:11:15,440
this down in writing that you wanted to write the

216
00:11:15,480 --> 00:11:16,480
book in the first place.

217
00:11:16,600 --> 00:11:18,879
Speaker 3: Some time ago, I was dating some girl that was

218
00:11:18,919 --> 00:11:24,159
working in the legal department of an IT company, and as a

219
00:11:24,200 --> 00:11:28,879
small gift slash joke, I decided to give

220
00:11:28,960 --> 00:11:33,000
her a translation of terms that people in IT used

221
00:11:33,039 --> 00:11:36,879
to say that others are stupid, without saying that openly,

222
00:11:36,960 --> 00:11:40,679
like classics like Layer eight issue. Then I added some

223
00:11:40,799 --> 00:11:45,840
translations of terms or phrases that people tend to use

224
00:11:46,000 --> 00:11:49,480
to sound smart while hiding the fact

225
00:11:49,840 --> 00:11:52,519
that they have no idea what's going on. And over

226
00:11:52,679 --> 00:11:56,200
time it started growing. I realized that this also could

227
00:11:56,240 --> 00:11:56,879
be described.

228
00:11:56,919 --> 00:11:59,000
Speaker 4: This could also be described.

229
00:11:58,600 --> 00:12:02,159
Speaker 5: I continued adding, and then at some point I had my

230
00:12:05,519 --> 00:12:08,399
Speaker 3: Table of contents, which was pretty much just a draft

231
00:12:08,440 --> 00:12:11,720
of topics that I'd like to translate. Once I had that,

232
00:12:12,080 --> 00:12:15,600
I looked at it and I found this underlying structure

233
00:12:15,759 --> 00:12:19,320
that evolved into those parts. Somebody asked me if I

234
00:12:19,480 --> 00:12:22,639
was looking at him from behind his back at his desk.

235
00:12:22,720 --> 00:12:26,840
So it's lived experience, but I tried to make

236
00:12:26,879 --> 00:12:32,399
it accessible. Nevertheless, once you start writing, you realize that, well,

237
00:12:32,440 --> 00:12:35,360
I realized for sure probably others would have the same

238
00:12:35,360 --> 00:12:38,519
effect: that there's more to say, there's more there,

239
00:12:38,639 --> 00:12:42,960
and well, we already started talking about the topic

240
00:12:43,000 --> 00:12:44,440
of what comes next.

241
00:12:44,679 --> 00:12:45,440
Speaker 4: What comes next?

242
00:12:45,440 --> 00:12:48,320
Speaker 3: Because I already started writing the next book, which would

243
00:12:48,360 --> 00:12:53,639
be very friendly and not so easy to promote, the

244
00:12:53,679 --> 00:12:58,200
title Fakapalmanac, which will literally be a collection of disasters in

245
00:12:58,240 --> 00:13:01,679
IT and civil engineering. The focus is on post mortems, still

246
00:13:01,759 --> 00:13:03,480
very cynical in most cases.

247
00:13:03,600 --> 00:13:06,559
Speaker 1: I'm sure you got the forty five minute catastrophe of

248
00:13:06,639 --> 00:13:08,159
Night Capital in there.

249
00:13:08,480 --> 00:13:09,600
Speaker 4: Of course, that's.

250
00:13:09,399 --> 00:13:12,399
Speaker 1: The cornerstone of post mortems. That's the one that you

251
00:13:12,440 --> 00:13:13,720
see and you're like, you know what, I'm going to

252
00:13:13,840 --> 00:13:14,600
keep a link to that.

253
00:13:14,720 --> 00:13:19,519
Speaker 3: The Knight Capital one was when they wanted to automate trading

254
00:13:19,840 --> 00:13:23,919
and they deployed a new version. They deployed it on

255
00:13:24,039 --> 00:13:28,600
all the systems, except they didn't update one server that

256
00:13:28,799 --> 00:13:32,879
still had the test mode enabled, which means that it

257
00:13:33,080 --> 00:13:41,159
was creating an enormous number of trades with catastrophic consequences. And

258
00:13:41,240 --> 00:13:44,360
I'm not sure, but in forty five minutes.

259
00:13:44,399 --> 00:13:47,000
They lost something like eight hundred million dollars or something

260
00:13:47,039 --> 00:13:50,120
like that. But there was also an interesting story, and

261
00:13:50,279 --> 00:13:53,320
I think it was about Ethereum, and that's something I

262
00:13:53,320 --> 00:13:57,679
don't remember in detail, but basically there was a bug

263
00:13:57,840 --> 00:14:03,360
in the contract's implementation that allowed, with frequent enough trading,

264
00:14:03,440 --> 00:14:07,679
you could get paid and before this was deducted from

265
00:14:07,759 --> 00:14:09,759
the other account, run it again.

266
00:14:10,000 --> 00:14:13,200
Speaker 5: So if you did it properly, you were.

267
00:14:13,120 --> 00:14:15,399
Speaker 4: Pretty much creating this from scratch.

268
00:14:15,559 --> 00:14:18,600
Speaker 1: You're talking about the hard fork of the Ethereum blockchain.

269
00:14:18,639 --> 00:14:21,799
Before it was what it is today, it was Ethereum Classic.

270
00:14:22,080 --> 00:14:26,399
There was this idea of programmatic governance of organizations using

271
00:14:26,519 --> 00:14:30,120
smart contracts, so unlike Bitcoin, Ethereum supports programmatic execution,

272
00:14:30,440 --> 00:14:34,320
and someone abused that contract. Basically, the order of operations

273
00:14:34,320 --> 00:14:38,440
in the contract didn't ensure that someone wasn't withdrawing more

274
00:14:38,440 --> 00:14:42,000
money than another contract allowed, and you're absolutely right.

275
00:14:42,000 --> 00:14:43,519
Speaker 2: I mean, this is the thing with blockchain though.

276
00:14:43,600 --> 00:14:46,039
Speaker 1: I think the post mortem here is in a way

277
00:14:46,159 --> 00:14:49,240
like why did it happen? But also the path forward,

278
00:14:49,320 --> 00:14:51,480
So in some way the value was what can we

279
00:14:51,480 --> 00:14:54,639
do about it? And the thing with blockchain technologies, with

280
00:14:54,679 --> 00:14:57,759
the public ledger, is if enough people decide, consensus

281
00:14:57,840 --> 00:15:01,120
right that what happened isn't okay and it should be different.

282
00:15:01,440 --> 00:15:03,840
Then they go and they make it different, and that's

283
00:15:03,840 --> 00:15:06,840
how we ended up with the Ethereum that

284
00:15:06,879 --> 00:15:08,840
Speaker 2: We have today. Actually it's not even the Ethereum we have.

285
00:15:08,799 --> 00:15:12,440
Speaker 1: Today because that was still when the consensus algorithm was

286
00:15:12,480 --> 00:15:15,360
based off of proof of work, and now we have

287
00:15:15,600 --> 00:15:17,519
ethereum two point zero, which is based off a proof

288
00:15:17,519 --> 00:15:22,519
of stake, which penalizes people with monetary punishment. So there's

289
00:15:22,559 --> 00:15:25,240
been evolution here. So in some way, I guess my

290
00:15:25,320 --> 00:15:27,919
question is going to be: postmortems have value then, right?

291
00:15:28,519 --> 00:15:33,159
Speaker 3: Yes, huge. At least in my recent experience, that's another

292
00:15:33,240 --> 00:15:37,799
area where those cargo cults appear. For instance, a quite

293
00:15:37,799 --> 00:15:41,320
common root cause analysis technique, which is five whys.

294
00:15:41,799 --> 00:15:44,399
Speaker 5: I've seen multiple times when people were so.

295
00:15:44,519 --> 00:15:48,120
Speaker 4: Stuck on these five whys that as.

296
00:15:47,960 --> 00:15:53,320
Speaker 3: A result, some people were artificially bloating the reasoning and

297
00:15:53,399 --> 00:15:57,480
suddenly one single answer was split into three points just

298
00:15:57,519 --> 00:16:00,519
to hit the magic five at the end, or in

299
00:16:00,600 --> 00:16:06,200
other cases, somebody hit five and stopped analysis right when

300
00:16:06,240 --> 00:16:09,759
the analysis was starting to be interesting.

301
00:16:10,000 --> 00:16:12,600
Speaker 1: Why do you think that a lot of root cause

302
00:16:12,600 --> 00:16:16,200
analysis or postmortem meetings that happen after major events don't

303
00:16:16,240 --> 00:16:19,919
push through that investigative phase and are able to get

304
00:16:19,960 --> 00:16:21,120
to what the real insight is.

305
00:16:21,159 --> 00:16:23,480
Speaker 2: Why are they stopping right before that threshold?

306
00:16:23,960 --> 00:16:25,720
Speaker 5: If I had to guess, I would say there are

307
00:16:25,720 --> 00:16:26,519
two reasons.

308
00:16:26,879 --> 00:16:30,919
Speaker 3: One is, as soon as I can throw the ball

309
00:16:30,960 --> 00:16:35,200
outside of my garden, it's enough, changing the purpose from

310
00:16:35,600 --> 00:16:39,039
finding what happened and how we can avoid it, to proving

311
00:16:39,080 --> 00:16:40,120
that it's not my fault.

312
00:16:40,360 --> 00:16:43,720
Speaker 5: Second is just a typical lack of time.

313
00:16:44,159 --> 00:16:46,399
Speaker 1: I do think we see the pendulum swinging back and

314
00:16:46,440 --> 00:16:48,960
forth a lot in technology. And I don't know if

315
00:16:48,960 --> 00:16:53,639
this is just my experience or technology itself, or because

316
00:16:53,679 --> 00:16:56,639
I have a limited view and it happens other places

317
00:16:56,679 --> 00:16:59,279
as well, but it does really seem like a lot

318
00:16:59,279 --> 00:17:02,840
of organizations are somewhere they have a certain mindset, and

319
00:17:02,919 --> 00:17:06,960
when things don't work right, their strategy, and it almost seems

320
00:17:07,000 --> 00:17:09,480
like their first and only strategy is do the extreme

321
00:17:09,559 --> 00:17:12,880
opposite of whatever that thing is. And then five years

322
00:17:12,960 --> 00:17:15,440
later the organization is in a different place, and they

323
00:17:15,480 --> 00:17:17,960
see problems with that extreme, and so then they swing

324
00:17:18,079 --> 00:17:20,759
back to the other way, which by that point everyone

325
00:17:20,839 --> 00:17:24,200
who had left the first extreme is now gone and

326
00:17:24,279 --> 00:17:26,920
you're only left with new people who have the pain

327
00:17:27,000 --> 00:17:30,359
of the decision that was from the past and swinging

328
00:17:30,359 --> 00:17:33,240
the pendulum back and now are doing basically the wrong

329
00:17:33,279 --> 00:17:36,400
thing again or something else incredibly wrong in a different way.

330
00:17:36,440 --> 00:17:38,839
Speaker 2: I think you're really on point though there with.

331
00:17:39,079 --> 00:17:42,799
Speaker 4: the... It's not just in technology; the pendulum tendency to

332
00:17:42,880 --> 00:17:44,960
extremes is just in human nature.

333
00:17:45,200 --> 00:17:48,440
Speaker 1: I think you're absolutely right about why companies are stopping

334
00:17:48,640 --> 00:17:51,839
their post mortem process too early. It goes back into

335
00:17:52,039 --> 00:17:55,400
the fact that the incentives are aligned for short term

336
00:17:55,519 --> 00:17:59,920
deliverables or focus, and I don't see any reason why

337
00:18:00,039 --> 00:18:03,240
this wouldn't just continue to snowball and cause bigger problems. So

338
00:18:03,279 --> 00:18:05,359
maybe, you know, in your research that you've

339
00:18:05,359 --> 00:18:08,559
done here, has there been any callouts for identifying how

340
00:18:08,680 --> 00:18:13,119
companies or organizations can stay methodical and realize they need

341
00:18:13,160 --> 00:18:16,720
to complete the process and optimize for long term value as opposed

342
00:18:16,720 --> 00:18:19,880
to falling back on just delivering in the short term.

343
00:18:19,640 --> 00:18:23,160
Speaker 3: I can only say that it is a lot about

344
00:18:23,240 --> 00:18:27,720
common sense. One example of the value of a post mortem is

345
00:18:27,920 --> 00:18:33,720
why airplanes have oval or circular windows. So

346
00:18:33,839 --> 00:18:37,400
it wasn't always like that. So basically, in pre World

347
00:18:37,440 --> 00:18:43,359
War Two, airplanes had typical square or rectangular windows, and

348
00:18:43,440 --> 00:18:47,319
it was fine. The glass in the windows was

349
00:19:47,400 --> 00:19:50,799
sometimes cracking, but the planes were not

350
00:18:50,920 --> 00:18:56,279
flying that high. The cabin didn't have to be

351
00:18:56,599 --> 00:19:01,519
pressurized and stuff. So when the windows cracked, it was okay:

352
00:19:01,640 --> 00:19:04,960
call the glazier, change the window and move on, and

353
00:19:05,039 --> 00:19:11,480
nobody investigated. After World War Two, actually coincidentally, it's

354
00:19:11,519 --> 00:19:14,000
not that World War Two changed something, but in the nineteen

355
00:19:14,039 --> 00:19:19,160
fifties the first jet airplane came to civil aviation.

356
00:19:20,039 --> 00:19:24,519
Speaker 4: It was the de Havilland Comet, which

357
00:19:24,359 --> 00:19:28,160
Speaker 3: Was flying at the altitude of eleven or twelve kilometers,

358
00:19:28,440 --> 00:19:35,039
and there the cabins had to be pressurized, so the

359
00:19:35,079 --> 00:19:38,839
forces were much heavier and those tiny cracks that were

360
00:19:39,039 --> 00:19:43,279
visible before started to be much more critical, and within

361
00:19:43,400 --> 00:19:48,160
four or five years there were multiple catastrophes with casualties

362
00:19:48,160 --> 00:19:53,440
and stuff. And that's when they did the full-blown investigation.

363
00:19:53,720 --> 00:19:59,680
They built special tanks to investigate, to analyze the pressures

364
00:19:59,720 --> 00:20:00,000
and stuff.

365
00:20:00,599 --> 00:20:01,759
Speaker 4: And then finally they.

366
00:20:01,640 --> 00:20:04,920
Speaker 3: Found out that on the corners corners were pretty much

367
00:20:04,920 --> 00:20:09,839
concentration points for the stress. The stress levels were

368
00:20:09,920 --> 00:20:13,000
something like four times higher there than

369
00:20:14,039 --> 00:20:19,319
elsewhere, away from the corners. And that was

370
00:20:19,359 --> 00:20:22,960
around the corners of the windows and where the radio

371
00:20:23,000 --> 00:20:28,160
antennas and stuff were mounted. So the solution was brilliantly simple.

372
00:20:28,519 --> 00:20:31,440
If the corners have the problems, eliminate the corners. That's why

373
00:20:31,480 --> 00:20:36,480
the windows became round. Before the investigation, the signs were

374
00:20:36,519 --> 00:20:40,160
there, but they weren't leading to

375
00:20:40,119 --> 00:20:43,039
Speaker 4: Catastrophes yet, so they were ignored.

376
00:20:43,960 --> 00:20:47,599
Speaker 3: If you look at this from the IT perspective, it's

377
00:20:47,720 --> 00:20:50,359
kind of the same. As long as you have an

378
00:20:50,400 --> 00:20:53,920
issue when you deploy the new version and something crashes,

379
00:20:54,160 --> 00:20:57,599
it's enough to restart the pipeline or something like that. Okay,

380
00:20:57,680 --> 00:21:00,839
let's not investigate, we need to focus. But there could

381
00:21:00,920 --> 00:21:03,960
be an underlying issue that later could lead to a

382
00:21:04,039 --> 00:21:06,200
catastrophe like Knight Capital.

383
00:21:06,200 --> 00:21:09,160
Speaker 1: First of all, the airline industry is quite the shining

384
00:21:09,200 --> 00:21:12,640
example of diving into post mortems. I feel like, compared

385
00:21:12,640 --> 00:21:15,920
to other industries, the idea of preventing not just loss

386
00:21:15,920 --> 00:21:21,640
of life, but preventing all incidents now, current and future

387
00:21:22,240 --> 00:21:25,200
is something that's really focused on and like really getting

388
00:21:25,200 --> 00:21:28,240
to understand, like all the root causes, not just the

389
00:21:28,279 --> 00:21:30,839
first one that pops up, so that these events in

390
00:21:30,920 --> 00:21:33,039
any way can never happen again. And I feel like

391
00:21:33,200 --> 00:21:34,640
World War Two in a lot of ways was a

392
00:21:34,680 --> 00:21:38,279
turning point. I think there was this very specific interesting

393
00:21:38,880 --> 00:21:41,279
conclusion, an insight that was had, and I believe it was

394
00:21:41,319 --> 00:21:44,279
in the United States with the return of vehicles that

395
00:21:44,640 --> 00:21:46,160
were damaged in some way.

396
00:21:46,640 --> 00:21:49,759
Speaker 4: It was in the US. You are talking about survivorship.

397
00:21:49,319 --> 00:21:51,880
Speaker 1: Bias, yes, yes, of course, but.

398
00:21:52,240 --> 00:21:56,880
Speaker 3: The conclusion was from, I think, a Hungarian statistician.

399
00:21:57,119 --> 00:22:00,839
Speaker 1: Basically, the idea was that if you receive the vehicle back,

400
00:22:01,119 --> 00:22:02,319
then it survived its.

401
00:22:02,759 --> 00:22:08,279
Speaker 3: Right. The US Army was analyzing the bullet holes on the

402
00:22:08,400 --> 00:22:13,559
returning bombers and their initial intuition was that if that's

403
00:22:13,640 --> 00:22:17,480
where the planes were hit, they need to fortify those points.

404
00:22:17,599 --> 00:22:22,599
And this Hungarian mathematical statistician pointed out that hey, stop,

405
00:22:22,920 --> 00:22:25,400
if you see those holes, you don't see the holes

406
00:22:25,400 --> 00:22:26,519
on other parts.

407
00:22:26,799 --> 00:22:29,720
Speaker 5: That means that the planes that were hit there did

408
00:22:29,759 --> 00:22:30,319
not return.

409
00:22:30,440 --> 00:22:33,400
Speaker 3: And that's also a very strong case for post mortems

410
00:22:33,480 --> 00:22:37,839
because it's very easy to focus on the success stories,

411
00:22:38,200 --> 00:22:41,640
but the real learning really lies in the disasters.

412
00:22:41,680 --> 00:22:44,000
Speaker 1: It's very counterintuitive. I think it's the conclusion that a

413
00:22:44,079 --> 00:22:46,240
lot of times we come to here, and that it's

414
00:22:46,279 --> 00:22:49,400
almost like if you aren't coming to a counterintuitive conclusion,

415
00:22:49,519 --> 00:22:53,000
then maybe you're missing something critical about your post mortem process.

416
00:22:53,240 --> 00:22:55,400
And I think this is where survivorship bias comes

417
00:22:55,480 --> 00:22:58,720
in and plays you. You have a release or an incident,

418
00:22:58,839 --> 00:23:01,440
and you sit down in a meeting and you discuss things,

419
00:23:01,480 --> 00:23:05,119
and the conclusion invariably is like, we need more tests,

420
00:23:05,319 --> 00:23:08,799
and well, anyone could have told you to have more tests.

421
00:23:08,920 --> 00:23:10,799
Speaker 2: I think the clever trick.

422
00:23:10,680 --> 00:23:14,000
Speaker 1: Is knowing which tests you should have, because you know,

423
00:23:14,079 --> 00:23:16,279
if you just had the right test, you can avoid

424
00:23:16,440 --> 00:23:19,519
every single production incident that would ever happen.

425
00:23:20,480 --> 00:23:20,839
Speaker 4: For that.

426
00:23:20,880 --> 00:23:25,119
Speaker 3: I kind of like that approach of do not write

427
00:23:25,200 --> 00:23:27,720
more tests just because you want more tests.

428
00:23:27,759 --> 00:23:31,519
Speaker 1: That conclusion is often a struggle for organizations, like there's

429
00:23:31,559 --> 00:23:34,119
a belief that oh no, this software doesn't change

430
00:23:34,119 --> 00:23:37,240
that frequently, or we don't have the priority to actually

431
00:23:37,279 --> 00:23:38,920
be able to implement them. But yeah, I mean, if

432
00:23:38,960 --> 00:23:42,119
you're in the weeds and actually fixing something or changing

433
00:23:42,400 --> 00:23:44,720
the way something works, I think a good metric that

434
00:23:44,759 --> 00:23:48,359
at least my teams use is add the test before you.

435
00:23:48,359 --> 00:23:49,200
Speaker 2: Make the change.

436
00:23:49,519 --> 00:23:52,359
Speaker 1: Like, it doesn't matter if there are necessarily tests on code,

437
00:23:52,440 --> 00:23:54,200
but if you're going to change something, then you can

438
00:23:54,319 --> 00:23:56,640
have an opportunity to potentially add the test first to

439
00:23:56,759 --> 00:23:59,119
ensure that the thing you're changing doesn't break in a

440
00:23:59,119 --> 00:24:02,200
particular way. And I think the follow up here is

441
00:24:02,240 --> 00:24:06,119
that often things break for the new functionality.

442
00:24:05,599 --> 00:24:09,880
Speaker 3: But generally the most bugs, most mistakes, were when you

443
00:24:10,000 --> 00:24:10,960
introduce the change.

444
00:24:11,039 --> 00:24:11,799
Speaker 2: Yeah, yeah, for sure.
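The "add the test before you make the change" practice described here is essentially a characterization test. A minimal sketch in Python, where normalize_path is a hypothetical stand-in for whatever code you are about to touch:

```python
# Characterization test: pin down what the code does TODAY, before you
# change it, so a refactor that breaks relied-upon behavior fails loudly
# instead of in production. `normalize_path` is illustrative only.

def normalize_path(p: str) -> str:
    # Existing behavior we are about to change: collapse duplicate
    # slashes and strip a trailing slash.
    while "//" in p:
        p = p.replace("//", "/")
    return p.rstrip("/") or "/"

def test_normalize_path_current_behavior():
    # Written BEFORE the change: captures the current contract.
    assert normalize_path("/a//b/") == "/a/b"
    assert normalize_path("///") == "/"

test_normalize_path_current_behavior()
```

With this in place, the subsequent change is made against a safety net rather than against assumptions about what the code currently does.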

445
00:24:11,920 --> 00:24:14,960
Speaker 1: Realistically here there's like a whole bunch of failure modes

446
00:24:15,000 --> 00:24:17,839
where you're making assumptions about what the current thing is

447
00:24:17,880 --> 00:24:22,359
doing rather than actually validating it in some capacity. Here,

448
00:24:22,559 --> 00:24:24,400
there is one thing that I think is still worth

449
00:24:24,400 --> 00:24:27,480
bringing up in the area of post mortems, especially when

450
00:24:27,480 --> 00:24:30,759
it comes to the failure rate, and it's this idea

451
00:24:30,839 --> 00:24:36,119
where it is actually impossible to predict where

452
00:24:36,240 --> 00:24:38,880
the future errors will show up in some capacity,

453
00:24:39,440 --> 00:24:42,799
and the mistake that's made frequently is tracking the number

454
00:24:42,839 --> 00:24:45,039
of errors you have or the size of them in

455
00:24:45,079 --> 00:24:48,519
the past, because nothing tells us that the errors that

456
00:24:48,519 --> 00:24:51,920
we've encountered so far and potentially fixed or prevented, have

457
00:24:52,000 --> 00:24:54,960
any impact on the future errors that we're going to

458
00:24:55,200 --> 00:24:58,519
encounter in our organization, or on their criticality. Every

459
00:24:58,559 --> 00:25:02,519
company works until the moment it doesn't. Every software works

460
00:25:02,559 --> 00:25:04,480
until the moment it doesn't. And I think there is

461
00:25:04,480 --> 00:25:07,039
this idea where, oh, yeah, we can just fix every

462
00:25:07,039 --> 00:25:10,160
problem we've identified. We should track the number of problems

463
00:25:10,200 --> 00:25:12,720
over time and we'll just be good. And I think

464
00:25:13,240 --> 00:25:15,759
I'm hoping the history and the research that you've done

465
00:25:16,079 --> 00:25:18,400
leads, you know, anyone who reads the book to

466
00:25:18,599 --> 00:25:20,960
the opposite conclusion, which is like you need a better

467
00:25:21,039 --> 00:25:24,200
process or a mindset around how you're tackling what you're

468
00:25:24,200 --> 00:25:27,240
tackling rather than just what potentially comes up.

469
00:25:27,319 --> 00:25:31,440
Speaker 3: There was a technique of seeded errors. Microsoft was doing

470
00:25:32,039 --> 00:25:34,640
this with Microsoft Windows, but it was totally for a

471
00:25:34,640 --> 00:25:39,400
different purpose. But it was a technique to statistically evaluate

472
00:25:39,440 --> 00:25:42,799
how many bugs they didn't find. Quite interesting, not necessarily

473
00:25:42,880 --> 00:25:47,680
applicable to constantly evolving software nowadays. But the main lesson

474
00:25:47,960 --> 00:25:52,160
is to not ignore the errors. On one hand, own your

475
00:25:52,640 --> 00:25:58,160
mistakes and try to analyze them. Second, don't ignore other

476
00:25:58,200 --> 00:26:03,440
people's errors. That's free learning. Sometimes you can laugh, sometimes

477
00:26:03,440 --> 00:26:05,400
you can be amazed, but for sure there are some

478
00:26:05,519 --> 00:26:07,240
lessons there, mistakes that you can avoid.
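The seeded-errors technique mentioned here can be sketched as a simple capture-recapture estimate. This is a generic illustration with made-up numbers, not a description of Microsoft's actual procedure:

```python
# Defect seeding: plant a known number of artificial bugs, run the usual
# test effort, and use the fraction of seeded bugs caught to estimate how
# many real bugs remain undiscovered.

def estimate_remaining_bugs(seeded: int, seeded_found: int, real_found: int) -> float:
    """Capture-recapture style estimate: assumes seeded and real bugs are
    equally likely to be caught by the same testing effort."""
    if seeded_found == 0:
        raise ValueError("no seeded bugs found; detection rate unknown")
    # If testing caught seeded_found/seeded of the planted bugs, assume it
    # caught the same fraction of the real ones.
    estimated_total_real = real_found * seeded / seeded_found
    return estimated_total_real - real_found  # estimated bugs still hiding

# Example: 50 bugs seeded, 40 of them found, plus 32 real bugs found.
# Detection rate 80% -> ~40 real bugs total -> ~8 still undiscovered.
print(estimate_remaining_bugs(seeded=50, seeded_found=40, real_found=32))  # 8.0
```

As the speaker notes, the core assumption (a fixed pool of bugs probed by one testing pass) fits a frozen release better than constantly evolving software.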

479
00:26:07,400 --> 00:26:10,079
Speaker 1: Yeah, it's a very famous Russian submarine that I don't

480
00:26:10,079 --> 00:26:11,480
recall exactly.

481
00:26:11,480 --> 00:26:15,279
Speaker 3: That's what I was aiming at. It was Stanislav Petrov

482
00:26:15,319 --> 00:26:17,279
who was in Soviet air defense.

483
00:26:18,079 --> 00:26:19,599
Speaker 4: And indeed they.

484
00:26:19,519 --> 00:26:24,880
Speaker 3: Had one censorus reporting that US launched a missile, and

485
00:26:25,000 --> 00:26:30,519
he ignored it until they got backup information from

486
00:26:30,559 --> 00:26:33,880
different stations a couple of minutes later. According to the

487
00:26:34,279 --> 00:26:38,599
protocol, he should have immediately counterfired. He

488
00:26:38,640 --> 00:26:42,759
ignored it, and that was nineteen eighty three. He

489
00:26:42,839 --> 00:26:45,400
ignored that, and that's why World War Three did

490
00:26:45,440 --> 00:26:48,960
not start. If he shot back, we wouldn't be talking

491
00:26:49,039 --> 00:26:49,440
right now.

492
00:26:49,359 --> 00:26:49,680
Speaker 2: For sure.

493
00:26:49,799 --> 00:26:49,960
Speaker 4: Not.

494
00:26:50,680 --> 00:26:52,319
Speaker 2: I think this could be a good stopping point.

495
00:26:52,400 --> 00:26:55,720
Speaker 1: Let's move on to picks. My pick for today is

496
00:26:55,880 --> 00:26:59,359
the quitos quetason. I don't know how to pronounce it.

497
00:26:59,400 --> 00:27:02,359
Food storage containers. I really like these.

498
00:27:02,880 --> 00:27:05,680
Speaker 2: They're really expensive, but they're highly durable. They go in

499
00:27:05,720 --> 00:27:08,759
the oven, I don't know about the microwave. They go on

500
00:27:08,759 --> 00:27:09,160
the stove.

501
00:27:09,240 --> 00:27:11,720
Speaker 1: I guess they go in the freezer deep freeze minus

502
00:27:11,759 --> 00:27:13,039
twenty degrees celsius.

503
00:27:13,480 --> 00:27:14,839
Speaker 2: They are absolutely fantastic.

504
00:27:15,119 --> 00:27:16,839
Speaker 1: And what I like about this brand is they have

505
00:27:16,880 --> 00:27:19,480
a lot of different sizes, so you can measure exactly

506
00:27:19,519 --> 00:27:22,799
what fits in your refrigerators and your storage units and

507
00:27:23,079 --> 00:27:25,759
focus on stacking them the best. I probably have like

508
00:27:25,799 --> 00:27:29,359
fifteen or twenty of these, like almost this exact size.

509
00:27:28,880 --> 00:27:30,599
Speaker 2: And honestly, they've been great about it.

510
00:27:30,759 --> 00:27:34,440
Speaker 1: Five years ago I completely moved off of storing anything

511
00:27:34,599 --> 00:27:37,400
in plastics in any way, and now everything I want

512
00:27:37,480 --> 00:27:40,440
is in glass or stainless steel. Actually this is

513
00:27:40,440 --> 00:27:43,359
a composite and these have been the best so far.

514
00:27:43,759 --> 00:27:46,640
And I really like them because after cooking, I always

515
00:27:46,640 --> 00:27:47,680
have leftovers.

516
00:27:48,160 --> 00:27:51,799
Speaker 3: I need to check them out. And if you ask me what,

517
00:27:51,880 --> 00:27:56,440
I would say, well, I recently watched the TV series

518
00:27:56,559 --> 00:28:03,559
on Netflix called FUBAR. It's a typical, well, modern take

519
00:28:03,680 --> 00:28:08,519
on nineteen nineties, early two thousands action comedy movies with

520
00:28:08,799 --> 00:28:15,519
Arnold Schwarzenegger, and it's super full of fan service jokes,

521
00:28:15,559 --> 00:28:19,200
even quoting and making fun of the most famous scenes with

522
00:28:19,799 --> 00:28:25,359
Schwarzenegger in his best movies. So if you grew up

523
00:28:25,440 --> 00:28:28,000
in that time, it's very much enjoyable.

524
00:28:28,480 --> 00:28:29,920
Speaker 1: I think it has actually been on my

525
00:28:29,960 --> 00:28:32,519
watch list for a while, and I haven't. I haven't

526
00:28:32,519 --> 00:28:33,160
gotten around to it.

527
00:28:33,319 --> 00:28:38,519
Speaker 3: It's really, really good, with a really solid screenplay. Of course,

528
00:28:38,559 --> 00:28:43,599
it's a spy movie, so full of physical absurdities, but

529
00:28:44,119 --> 00:28:49,960
within the rules of the presented world, it stands on

530
00:28:50,799 --> 00:28:53,960
its own legs and you can really enjoy it.

531
00:28:54,240 --> 00:28:56,440
Speaker 1: Okay, Well, then I guess that's going on someone's watch

532
00:28:56,519 --> 00:28:59,559
list. That will be in the episode notes. So at this

533
00:28:59,599 --> 00:29:02,440
point I will just say thank you, Adam, for

534
00:29:02,519 --> 00:29:05,240
coming on to the show and talking about

535
00:29:05,240 --> 00:29:07,920
your book and your upcoming book. I feel

536
00:29:07,960 --> 00:29:10,720
like the postmortems will be really interesting and maybe we'll

537
00:29:10,720 --> 00:29:11,720
get you back on it.

538
00:29:11,759 --> 00:29:14,200
Speaker 3: At that point, well, thank you for having me, thank

539
00:29:14,240 --> 00:29:17,599
you for a very nice conversation, and I would be

540
00:29:17,640 --> 00:29:18,759
super happy to be back.

541
00:29:18,920 --> 00:29:21,039
Speaker 1: And with that, I'll say thank you to Attribute one

542
00:29:21,039 --> 00:29:24,400
more time for sponsoring today's episode, Thanks for everyone listening,

543
00:29:24,559 --> 00:29:26,240
and we'll be back next

