1
00:00:01,080 --> 00:00:05,679
How'd you like to listen to .NET
Rocks with no ads? Easy. Become

2
00:00:05,679 --> 00:00:09,839
a patron. For just five dollars a
month, you get access to a private

3
00:00:10,000 --> 00:00:14,439
RSS feed where all the shows have
no ads. Twenty dollars a month will

4
00:00:14,439 --> 00:00:19,039
get you that and a special .NET
Rocks patron mug. Sign up now at

5
00:00:19,039 --> 00:00:23,600
Patreon dot dotnetrocks dot com.
Hey, Carl and Richard, here with
Hey, Carl and Richard, here with

6
00:00:23,640 --> 00:00:28,719
your twenty twenty four NDC schedule.
We'll be at as many NDC conferences as

7
00:00:28,760 --> 00:00:33,240
possible this year, and you should
consider attending no matter what. NDC Oslo is

8
00:00:33,240 --> 00:00:37,719
happening June tenth through the fourteenth.
Get your tickets at ndcoslo dot com.

9
00:00:37,759 --> 00:00:43,600
The Copenhagen Developers Festival happens August twenty
sixth through the thirtieth. Early bird discount

10
00:00:43,759 --> 00:00:51,240
ends April twenty sixth. Tickets at
cphdevfest dot com. NDC Porto is happening October

11
00:00:51,280 --> 00:00:55,920
fourteenth through the eighteenth. The early
bird discount ends June fourteenth. Tickets at

12
00:00:56,039 --> 00:01:11,959
ndcporto dot com. We'll see you
there, we hope. Hey, guess

13
00:01:11,959 --> 00:01:15,920
what? It's .NET Rocks! Well
you knew that because you pushed the play

14
00:01:15,959 --> 00:01:19,239
button. I'm Carl Franklin and I'm
Richard Campbell, and we're gonna have a

15
00:01:19,239 --> 00:01:23,159
good show today. Barry O'Reilly is
here. We're going to be talking about

16
00:01:23,239 --> 00:01:27,200
fragility, or antifragility as he
talks about it. But how are you,

17
00:01:27,280 --> 00:01:32,319
sir, up on the coast back
home? I had a power outage

18
00:01:32,400 --> 00:01:34,400
yesterday for a couple hours on my
new UPS rig. You know, you

19
00:01:34,519 --> 00:01:38,480
find every mistake. Yeah, right, So the network stayed up perfectly,

20
00:01:38,680 --> 00:01:42,599
but it's like, oh, you
plugged that monitor into the surge-protected outlet,

21
00:01:42,799 --> 00:01:49,200
not the battery-backed-up one. So
over the course of the two hours,

22
00:01:49,200 --> 00:01:52,000
I had my head lamp on and
I was running around making fixes on

23
00:01:52,040 --> 00:01:53,799
stuff so that when it all came
back, everything would be better. That's

24
00:01:53,840 --> 00:01:59,719
so like you. It's so,
yes, it's like, oh, power's

25
00:01:59,719 --> 00:02:01,640
out, now I'm busy. I
was busy last week trying to come up

26
00:02:01,680 --> 00:02:07,680
with a solution to a problem that
I couldn't believe was a problem. But

27
00:02:07,039 --> 00:02:17,680
that's my Better Know a Framework. So
let's roll the music. Alright, Carl,

28
00:02:17,719 --> 00:02:23,000
what do you got? All right, so everybody does authentication and authorization in

29
00:02:23,039 --> 00:02:27,080
a myriad of ways in twenty twenty
four, not one right way. That's

30
00:02:27,080 --> 00:02:30,240
for sure. Oh no. But
you know, if you just select when

31
00:02:30,240 --> 00:02:35,319
you're creating a Blazor application, if you select individual accounts, you

32
00:02:35,360 --> 00:02:42,560
know you're basically buying into the Microsoft
ASP.NET Core Identity system, and

33
00:02:42,639 --> 00:02:47,520
it wants a database, right,
and you can create it. The template is really

34
00:02:47,520 --> 00:02:52,680
good now. And I got to
say thanks to Jeremy Likness. He's not

35
00:02:52,719 --> 00:02:55,240
working on that now, but he's
one of the guys behind all the new

36
00:02:55,360 --> 00:03:01,439
UI in the Blazor template. That's all
Razor Pages now, and it's all in there.

37
00:03:01,439 --> 00:03:06,680
You don't have to scaffold anything,
well most of it. But you

38
00:03:06,719 --> 00:03:12,439
know, roles are something that seems
like an antiquated thing, right? Because everybody's

39
00:03:12,520 --> 00:03:15,879
using claims and building their own claims
and we've got claims in servers and in

40
00:03:16,039 --> 00:03:20,520
tokens and all that stuff. But
there's still a case to be made for

41
00:03:20,639 --> 00:03:24,919
simple roles like I want to know
if you know, Joe is in the

42
00:03:25,000 --> 00:03:30,400
admin role so that he can change
things and somebody else isn't. Simple things like

43
00:03:30,439 --> 00:03:35,000
that. And it turns out
that those are just strings, they're just

44
00:03:35,080 --> 00:03:38,919
tags, and so it's very easy
for you to use. They've really turned into claims,

45
00:03:39,039 --> 00:03:43,840
effectively. A role and a claim are
not that different. Yeah, and

46
00:03:43,879 --> 00:03:46,360
a role is a type of claim. Right. But here's the deal.

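A role really is just a claim under the hood. As a minimal C# sketch of that point (the "Joe" and "admin" values here are illustrative, not from the show):

```csharp
using System;
using System.Security.Claims;

// A role travels as an ordinary claim of type ClaimTypes.Role --
// just a string, a tag, like any other claim.
var identity = new ClaimsIdentity(new[]
{
    new Claim(ClaimTypes.Name, "Joe"),
    new Claim(ClaimTypes.Role, "admin"),
}, authenticationType: "Example");

var user = new ClaimsPrincipal(identity);
Console.WriteLine(user.IsInRole("admin")); // True
```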
47
00:03:46,520 --> 00:03:50,840
If you're doing a Blazor Server application, you just add one line of code

48
00:03:50,879 --> 00:03:54,680
to that template in your Program.cs, you know, AddRoles of Identity

49
00:03:54,759 --> 00:04:00,159
Role, then you're fine. You
do that in a WebAssembly application, it's a

50
00:04:00,159 --> 00:04:03,639
little bit more complicated. Yeah,
now, but in the

51
00:04:03,680 --> 00:04:10,240
template, there's support for roles in
the, you know, in the

52
00:04:10,280 --> 00:04:13,080
server side and the client side,
things that talk to each other. It's

53
00:04:13,120 --> 00:04:18,240
just that, oops, they took that
out right before dot net eight went to

54
00:04:18,480 --> 00:04:21,839
you know, in October, right
before dot net eight went live, because

55
00:04:21,920 --> 00:04:26,279
there were situations where,
if you have multiple claims with the same

56
00:04:26,399 --> 00:04:30,360
name, they were interfering with each
other, and instead of fixing that,

57
00:04:30,439 --> 00:04:32,240
they punted on it. And it's
still an open issue. So essentially you

58
00:04:32,279 --> 00:04:36,399
don't have roles. Wow. So
basically I figured out a workaround for it.

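For reference, the "one line of code" Carl mentioned earlier looks roughly like this in the template's Program.cs. This is a hedged sketch, assuming the .NET 8 Blazor Web App template with Individual Accounts; ApplicationUser and ApplicationDbContext are the template's own generated types:

```csharp
// Identity registration from the Blazor template's Program.cs (sketch).
// AddRoles<IdentityRole>() is the single added line that turns on
// role support in ASP.NET Core Identity.
builder.Services.AddIdentityCore<ApplicationUser>(options =>
        options.SignIn.RequireConfirmedAccount = true)
    .AddRoles<IdentityRole>()               // <-- the one line of code
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddSignInManager()
    .AddDefaultTokenProviders();
```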
59
00:04:36,519 --> 00:04:42,519
Wrote a component that's free, and
it's an MIT-licensed component called Blazor

60
00:04:42,800 --> 00:04:47,600
AuthorizedRoleView. And it's a replacement for
the AuthorizeView component that you can use

61
00:04:47,680 --> 00:04:54,079
to you know, authorize markup based
on a role, except that it works

62
00:04:54,240 --> 00:04:58,040
in WebAssembly. Uh,
and it's just a workaround. They'll probably

63
00:04:58,120 --> 00:05:00,920
fix it or they'll come up with
some other solution. Something will change for

64
00:05:00,040 --> 00:05:06,160
dot net nine. Yeah, maybe, hopefully.
So that is it. This is show
So that is it. This is show

65
00:05:06,439 --> 00:05:10,439
eighteen ninety six. So if you
go to eighteen ninety six dot pop

66
00:05:10,600 --> 00:05:14,319
dot me, that will take you
to my repo. And there's also a

67
00:05:14,360 --> 00:05:18,040
Blazor Train about it. So that's
what I got. Excellent. Solve problems.

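For a sense of the markup involved: the built-in AuthorizeView that Carl's component replaces gates content on a role like this (the "admin" role name is illustrative); his AuthorizedRoleView is described as a drop-in replacement with the same shape that also works in WebAssembly:

```razor
@* Show the button only to users in the admin role. *@
<AuthorizeView Roles="admin">
    <Authorized>
        <button>Change things</button>
    </Authorized>
    <NotAuthorized>
        <p>Sorry, admins only.</p>
    </NotAuthorized>
</AuthorizeView>
```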
68
00:05:18,399 --> 00:05:23,240
Cool. Who's talking to us,
Richard? Grabbed a comment off of show eighteen ninety four,

69
00:05:23,279 --> 00:05:27,240
the one just recently with
Carl Geittz talking about programming and speech,

70
00:05:27,319 --> 00:05:30,120
and this comment comes from Rob who
says, Hey, I really appreciate your

71
00:05:30,120 --> 00:05:33,600
coverage of artificial intelligence. It's positive
and tamps down fear. As an instructor,

72
00:05:33,639 --> 00:05:36,759
I hear a lot of concerns from
other instructors about how AI will affect

73
00:05:36,800 --> 00:05:41,519
students, mostly from the perspective that
students are using AI to cheat,

74
00:05:41,920 --> 00:05:45,680
but from a programmer's perspective, to
me, it's just another tool, and

75
00:05:45,720 --> 00:05:48,600
it's a really cool tool, but
a tool nonetheless. And the sooner we as

76
00:05:48,600 --> 00:05:53,040
instructors know how to use
this new tool in the context of whatever

77
00:05:53,120 --> 00:05:56,639
job we are teaching students to
do, the better. So I loaded up

78
00:05:56,680 --> 00:05:59,920
GitHub Copilot. I used it,
and that has made me realize that,

79
00:06:00,160 --> 00:06:01,920
yes, it makes me faster,
and yes the student could still figure out

80
00:06:01,920 --> 00:06:06,800
syntax more easily, but what's wrong
with that? They still need to learn

81
00:06:06,839 --> 00:06:10,600
the syntax to complete the code.
They still need to learn how to read

82
00:06:10,639 --> 00:06:14,480
the code, and they still need
to understand how complex frameworks are structured and

83
00:06:14,519 --> 00:06:17,319
how to properly and effectively use them. And most importantly, they can still

84
00:06:17,360 --> 00:06:24,959
look forward to hours and hours of
WTF research to figure out how to solve

85
00:06:25,040 --> 00:06:29,879
complex problems that I give them.
He's our kind of instructor. It seems

86
00:06:29,920 --> 00:06:33,199
like most of my day is WTF. Yes, a lot of WTF research,

87
00:06:33,360 --> 00:06:38,199
right, just turn on the light. That's got to be a hashtag

88
00:06:38,240 --> 00:06:40,720
man, and AI is likely to
never be able to figure everything out.

89
00:06:40,839 --> 00:06:44,120
That's our job as human beings.
You know. I was recently on a

90
00:06:44,120 --> 00:06:46,800
different podcast. I was on Jeff
Palermo's podcast, and I said, listen,

91
00:06:47,319 --> 00:06:50,560
you know your job's not to write
code. Your job is to find

92
00:06:50,600 --> 00:06:55,879
solutions, right, and if there's
a tool that helps you get

93
00:06:55,879 --> 00:06:59,879
to that solution faster, you should
use it. That's the responsible thing to

94
00:07:00,120 --> 00:07:01,560
do. Exactly. So, Rob,
thank you so much for your comment,

95
00:07:01,600 --> 00:07:03,600
and a copy of Music to Code By is
on its way to you. And if

96
00:07:03,600 --> 00:07:06,160
you'd like a copy of Music to Code By, write a comment on the website

97
00:07:06,160 --> 00:07:10,319
at dotnetrocks dot com or
on the Facebooks. We publish every show there,

98
00:07:10,319 --> 00:07:12,680
and if you comment there and I read it
on the show, we'll send you a copy

99
00:07:12,680 --> 00:07:15,600
of Music to Code By. Music to
Code By is still going strong. Still in

100
00:07:15,759 --> 00:07:20,720
use by thousands, I would say
thousands of people based on my sales.

101
00:07:20,879 --> 00:07:25,079
Dude, it's in this room every
day, like don't don't be crazy,

102
00:07:25,199 --> 00:07:28,920
Like that's the truth. When I'm
home and I've got to work, mostly writing,

103
00:07:29,040 --> 00:07:32,160
to be honest. Yeah, that's
what's on. Cool. Blue. I love

104
00:07:32,199 --> 00:07:36,920
Blue. Blue puts
me in the zone. Like click,

105
00:07:38,399 --> 00:07:42,439
that's the first one. Yeah,
okay, Well, you can of course

106
00:07:42,439 --> 00:07:44,759
follow us on Twitter if you want. But the cool kids are hanging out.

107
00:07:44,800 --> 00:07:47,439
I'm on Mastodon these days. I'm
at Carl Franklin at TechHub dot

108
00:07:47,480 --> 00:07:53,040
social, and I'm Rich Campbell at
mastodon dot social. And you can find

109
00:07:53,079 --> 00:07:56,800
all the ways to get in touch
at Carl Franklin dot com. All

110
00:07:56,920 --> 00:08:01,279
right, let's get to our guest
who hasn't been on in six years. Barry,

111
00:08:01,439 --> 00:08:07,079
Six years. Barry O'Reilly is the
founder of Black Tulip Technology and creator of

112
00:08:07,480 --> 00:08:13,279
residuality theory. He has held chief
architect roles at Microsoft's Western Europe consultancy practice,

113
00:08:13,560 --> 00:08:20,959
among others. He's been Microsoft's IoT
TAP lead for Western Europe and worldwide

114
00:08:20,000 --> 00:08:28,040
lead for Microsoft's Solution Architecture Community.
He has also been a startup CTO and

115
00:08:28,199 --> 00:08:33,360
was founder of Sweden's Azure User Group. He's currently finishing up a PhD in

116
00:08:33,440 --> 00:08:39,279
complexity science and software design. I
really want to know about complexity science and

117
00:08:39,440 --> 00:08:48,720
in particular how it helps me on complexity.
Yeah, welcome, Barry. Welcome

118
00:08:48,759 --> 00:08:52,639
back. Yeah, cheers. It's
been a long time. A lot,

119
00:08:52,960 --> 00:08:58,240
a lot has happened. Last time
I was on, I was talking about

120
00:08:58,600 --> 00:09:03,799
the concept of antifragility in
software and how we can build software

121
00:09:03,879 --> 00:09:09,879
that appeared to survive on its own
even though it wasn't designed for those things,

122
00:09:09,039 --> 00:09:15,279
and that all of those ideas came
out of work I did at Microsoft trying

123
00:09:15,320 --> 00:09:18,440
to teach new architects. So how
do we turn good developers into great

124
00:09:18,519 --> 00:09:26,240
architects, kind of thing. And then
I wrote a paper, and that paper

125
00:09:26,480 --> 00:09:30,000
led to a lot of things,
a lot of attention, and I decided

126
00:09:30,039 --> 00:09:31,600
that I wanted to do things a
little bit differently. So I went back

127
00:09:31,600 --> 00:09:35,600
to school and the things that I
had noticed, the things that I'd written

128
00:09:35,600 --> 00:09:37,960
about, I said, you know, if you don't gather requirements, if

129
00:09:37,960 --> 00:09:41,519
you don't try to predict the future, if you just randomly mix stuff up

130
00:09:41,559 --> 00:09:46,960
and stress an application, you actually
end up with an architecture that's better than

131
00:09:48,039 --> 00:09:52,279
what we normally do. And that's
a very big statement, you know,

132
00:09:52,320 --> 00:09:56,360
and it's fun, and
I realized that this is the way I've

133
00:09:56,360 --> 00:10:01,000
been working for a very long time. I've given up on process and templates

134
00:10:01,000 --> 00:10:03,799
and all of these things, and
you know, we're all just

135
00:10:03,840 --> 00:10:07,279
making stuff up until it starts to
spin, until the wheel starts to spin,

136
00:10:07,440 --> 00:10:13,159
and the hamster looks happy, right? And I decided,

137
00:10:13,200 --> 00:10:16,559
you know, this is a wild
claim. And while it's entertaining for

138
00:10:16,600 --> 00:10:18,240
me to go to a conference,
get on a stage and talk to people about

139
00:10:18,240 --> 00:10:22,519
these wild claims, wouldn't it be
great if they actually had some depth?

140
00:10:22,679 --> 00:10:26,600
So I went back to university,
and I said, I want to

141
00:10:26,639 --> 00:10:30,080
I want to do a PhD.
I want to produce academic work that says

142
00:10:30,080 --> 00:10:33,080
this is real. This is a
scientific fact, not

143
00:10:33,360 --> 00:10:37,000
just someone, you know,
because if you get up on stage in

144
00:10:37,080 --> 00:10:41,000
a conference, an amazing thing
happens. People believe you. Yeah,

145
00:10:43,159 --> 00:10:46,879
And I wanted to, you
know, make it a little bit more.

146
00:10:46,240 --> 00:10:50,279
I wanted to answer the hard questions. And it turns out that

147
00:10:50,279 --> 00:10:54,840
that was actually an incredibly difficult thing
to do. And I'm approaching the end

148
00:10:54,879 --> 00:11:01,840
of that process now. I've just
released a little book on Lean pub that's

149
00:11:01,879 --> 00:11:05,080
going to be given away as a
gift at some conferences across Europe in the

150
00:11:05,120 --> 00:11:11,639
next couple of weeks that talks about
the ideas. And so for the last

151
00:11:11,679 --> 00:11:16,720
six years I've been burrowed away writing
frantically and rewriting and testing and teaching people

152
00:11:16,720 --> 00:11:22,440
how to do these things. And
the basic idea is that instead of using requirements,

153
00:11:22,440 --> 00:11:26,600
instead of trying to predict what's going
to change and using patterns and encapsulation

154
00:11:26,720 --> 00:11:33,399
and all these methods we've learned over
decades, you can simply start with an

155
00:11:33,480 --> 00:11:37,879
architecture that's really simple that you know
is wrong. One example I use is that

156
00:11:39,200 --> 00:11:43,679
I say to architects, look:
you're allowed one component and all of

157
00:11:43,720 --> 00:11:46,480
your data has to be handled in
memory. You're not allowed to do anything

158
00:11:46,519 --> 00:11:48,919
else. There's one way in,
there's one way out. This is how

159
00:11:48,919 --> 00:11:52,399
we're going to solve the problem.
And if they want to make a change

160
00:11:52,759 --> 00:11:56,360
to that architecture, they have to
suggest something that will cause it to fall

161
00:11:56,399 --> 00:12:00,480
apart. And they say, well, what happens if we restart and then

162
00:12:00,519 --> 00:12:03,000
we lose the in-memory data store? Okay, well you can have a

163
00:12:03,120 --> 00:12:07,080
data store. That's, you know, fine,
you deserve state, you've qualified. Yes,

164
00:12:07,159 --> 00:12:13,279
you've justified persistence. So let's do
that, and the same with

165
00:12:13,399 --> 00:12:16,360
queues and everything else. And you
build your architecture up in that way.

166
00:12:16,279 --> 00:12:22,000
And the funny thing is when you
do this, when you stress your way

167
00:12:22,039 --> 00:12:26,840
to an architecture, you eventually reach
a point, a magical point, where

168
00:12:26,360 --> 00:12:28,960
you can't seem to find a way
to break it anymore. And everything that

169
00:12:30,000 --> 00:12:35,039
you come up with is actually survived
by the architecture. And that's what I

170
00:12:35,080 --> 00:12:39,960
talked about last time. And so
I've gone away and read an entire

171
00:12:39,039 --> 00:12:43,799
library of books on complexity
science and philosophy and all sorts of weird

172
00:12:43,840 --> 00:12:48,120
and wonderful stuff. I've managed to
boil this down to a simple theory that

173
00:12:48,159 --> 00:12:52,320
says, this is why this works. This is why there's an actual scientific

174
00:12:52,399 --> 00:12:58,960
reason why you can produce an architecture
just by being incredibly negative. You know.

175
00:13:00,360 --> 00:13:03,799
I just want to point out that
writers, you know, writers of

176
00:13:03,879 --> 00:13:07,759
fiction do this all the time.
Okay, they just have to sit down

177
00:13:07,759 --> 00:13:09,559
and start writing, even nonfiction,
right, Richard. Yeah, sure,

178
00:13:09,720 --> 00:13:13,279
you sit down, you start writing, even if it sucks, even if

179
00:13:13,320 --> 00:13:15,960
you know it sucks, and you
throw lots of it away, throw lots

180
00:13:15,960 --> 00:13:18,440
of it away, but it gets
your brain going. You have at least

181
00:13:18,440 --> 00:13:20,639
somewhere to start. Yeah. And
if you've got somebody watching

182
00:13:20,679 --> 00:13:24,519
over your shoulder saying that's not right, you just want to say, shut

183
00:13:24,600 --> 00:13:26,919
up, go away. I'm
working it out here, man. Yeah.

184
00:13:28,240 --> 00:13:31,080
And that's one of the things
in these ideas that's a huge,

185
00:13:31,679 --> 00:13:35,799
a huge culture shock, I think,
for a lot of software developers. Because

186
00:13:35,840 --> 00:13:41,279
we come from computer science backgrounds and
mathematics and STEM. We're used to having a

187
00:13:41,320 --> 00:13:43,639
formula, We're used to being able
to say this is the correct way to

188
00:13:43,720 --> 00:13:48,200
do this, this is the structured
way, this is how you

189
00:13:48,320 --> 00:13:50,519
rationally arrive at an answer. And
this way of working is a

190
00:13:50,559 --> 00:13:56,240
little bit wild. You're wrong for
a very long time until suddenly you're right.

191
00:13:56,480 --> 00:14:00,799
And that's a very very different way
of working. And when

192
00:14:00,799 --> 00:14:05,639
I've shown these ideas, for example, to very senior architects, people who've

193
00:14:05,639 --> 00:14:09,200
built big stuff in Azure, and
they've said, yes, this is what

194
00:14:09,240 --> 00:14:15,159
we do. We don't have these
fancy words or this academic waffle, but

195
00:14:15,200 --> 00:14:18,879
we, this is how we think.
We're, you know, negative. And

196
00:14:18,919 --> 00:14:20,480
so if you're an architect and you've
ever been called negative, that's a

197
00:14:20,519 --> 00:14:28,679
very good thing. It means you're
doing it right. So,

198
00:14:28,720 --> 00:14:31,840
I've started to tour these
ideas. I've been at a lot of

199
00:14:31,960 --> 00:14:37,679
conferences. There's a lot of
views coming on YouTube, kind

200
00:14:37,679 --> 00:14:41,519
of growing interest in the ideas,
and so I've started to publish books on

201
00:14:41,840 --> 00:14:46,960
the subject, and so they're starting
to gather pace, and it's a

202
00:14:46,039 --> 00:14:50,360
it's a pretty exciting time for these
ideas. I mean, I think the

203
00:14:50,440 --> 00:14:56,799
key point you're making here is that
you're not afraid to constantly revise the architecture.

204
00:14:56,960 --> 00:14:58,919
But I also appreciate, you know, the insight here, which is,

205
00:15:00,320 --> 00:15:03,279
don't start with an architecture, like, go
minimal so that you're not tearing much

206
00:15:03,559 --> 00:15:09,039
down, you're only adding. Yeah, exactly, making changes. I tend to

207
00:15:09,039 --> 00:15:13,840
find I go into companies and help
them when they're having some troubles with their

208
00:15:13,919 --> 00:15:18,360
architecture. And sometimes you'll find people
locked in combat over, oh, this

209
00:15:18,399 --> 00:15:22,279
should be an event driven architecture,
or this should be an API driven architecture,

210
00:15:22,320 --> 00:15:26,399
and it's exhausting. I know. Yeah, how do you know?

211
00:15:26,840 --> 00:15:31,639
Yeah, it's a lot of gut
feeling, and not that there's anything

212
00:15:31,679 --> 00:15:35,440
wrong with gut feeling, it's exactly how most
of us do our jobs, but it's

213
00:15:37,320 --> 00:15:41,759
it doesn't become a very concrete discussion
when you're fighting about the differences between these

214
00:15:41,759 --> 00:15:43,960
two different patterns, and so boiling
it down to something simple and saying,

215
00:15:45,360 --> 00:15:48,440
tell me something in the
environment that's going to push this architecture, force

216
00:15:48,480 --> 00:15:54,840
it to behave in this particular way. And people can come up

217
00:15:54,840 --> 00:16:00,320
with stories about what might happen in
the environment. And what's happening is that

218
00:16:00,399 --> 00:16:04,799
the way that we traditionally build
architectures is we try to gather information from

219
00:16:04,840 --> 00:16:10,639
stakeholders by running around and saying,
tell me your requirements, and they run

220
00:16:10,679 --> 00:16:12,120
away from us. Most of the
time. They don't want to tell us,

221
00:16:12,360 --> 00:16:15,799
and they won't tell us because they
don't really know what the future looks

222
00:16:15,879 --> 00:16:19,279
like. And some people say that
you can capture risk and you can use

223
00:16:19,320 --> 00:16:22,679
probability, or that some people believe
they can predict what's going to change in

224
00:16:22,720 --> 00:16:26,559
an architecture and write all that down. That's not something I've seen in

225
00:16:26,600 --> 00:16:30,639
my career that's possible for us to
do in a complex kind of business environment.

226
00:16:32,159 --> 00:16:36,519
And so the solution to this problem
is to just randomly simulate the environment.

227
00:16:36,720 --> 00:16:41,320
And so even if you're just making
stuff up and you say, what's

228
00:16:41,360 --> 00:16:47,600
going to happen in
this system, and you make something up,

229
00:16:47,639 --> 00:16:51,960
you can still find a set of
conditions in which your architecture breaks,

230
00:16:52,080 --> 00:16:55,159
and once you know where it breaks,
you can start to make it better.

231
00:16:55,600 --> 00:17:00,720
And so I was doing an
example of this. I was talking to

232
00:17:00,799 --> 00:17:03,440
my professor yesterday and he said,
well, you know, you're calling it

233
00:17:03,480 --> 00:17:08,079
a random simulation. If I just
opened a dictionary and just picked a random

234
00:17:08,119 --> 00:17:15,559
word, can you use this to
find out something about the architecture? And

235
00:17:15,599 --> 00:17:18,680
I said, so, we were
talking about building a system for the National

236
00:17:18,720 --> 00:17:22,799
Health Service in Great Britain and he
opened the dictionary and he just pointed at a

237
00:17:22,880 --> 00:17:26,160
random word and he said the word
is shed. Can you use this to

238
00:17:26,200 --> 00:17:30,720
make this architecture better? And so
I said, okay, let's just say

239
00:17:30,759 --> 00:17:37,519
that some doctors and nurses start offering
health services in their garden sheds at the

240
00:17:37,519 --> 00:17:41,759
bottom of their gardens. That's absolutely
ridiculous, right, that's probably not going

241
00:17:41,799 --> 00:17:48,759
to happen in the near future.
I'd never say never. And I said,

242
00:17:49,039 --> 00:17:55,000
I'm talking about medical services. So
the question is what impact would that

243
00:17:55,039 --> 00:17:59,960
have on our architecture. Well,
suddenly we're distributed. Our workforce is distributed

244
00:18:00,200 --> 00:18:03,599
in a way that it hasn't been
distributed before, and that puts challenges on

245
00:18:03,640 --> 00:18:07,119
our architecture with updates, with sequences, with all of those things you know

246
00:18:07,119 --> 00:18:11,960
that get nasty when you distribute your
workforce. And I said, okay,

247
00:18:11,000 --> 00:18:15,839
and let's ask what's the path in
our architecture. Can we move to that?

248
00:18:15,039 --> 00:18:18,200
Or is the architecture that we've
decided on from the start. Is

249
00:18:18,240 --> 00:18:22,799
it so stuck in its
assumption that it can't move to this?

250
00:18:22,839 --> 00:18:25,839
Would it? Would we have to
scrap it and start again. And then

251
00:18:25,880 --> 00:18:29,920
as you talk about it, you
realize actually there is a movement in

252
00:18:29,960 --> 00:18:34,359
the United Kingdom where people believe that
I guess what you guys call the emergency

253
00:18:34,440 --> 00:18:37,119
room, what we would call A
and E. There are people who want

254
00:18:37,119 --> 00:18:41,839
to have lots and lots and lots
of small emergency rooms in town centers,

255
00:18:41,400 --> 00:18:45,039
you know, especially on a Saturday
night in the UK, where people like

256
00:18:45,079 --> 00:18:48,960
to beat each other up and they
need quick access

257
00:18:49,119 --> 00:19:00,119
to healthcare. And so,
while the idea that doctors and nurses will

258
00:19:00,119 --> 00:19:06,319
start practicing in their garden sheds is
clearly ridiculous. It's a random data point,

259
00:19:06,359 --> 00:19:10,680
it's a random story, there is
actually what we call in complexity science

260
00:19:10,720 --> 00:19:14,880
an attractor there, which is a distributed
workforce that you start to talk about.

261
00:19:15,599 --> 00:19:18,440
And if you solve for the shed, you solve for this potential change in

262
00:19:18,480 --> 00:19:22,680
policy. You don't have to implement
it in your architecture. Now, what

263
00:19:22,720 --> 00:19:26,039
you're looking for is an analogy that
can be triggered by any kind of word

264
00:19:26,200 --> 00:19:33,119
yeah, and any kind of random
noun. And so the theory that I've

265
00:19:33,119 --> 00:19:37,759
worked on is that doing this in
this random way on your architecture will actually

266
00:19:37,839 --> 00:19:41,880
give you a better architecture than asking
for requirements or risks, or saying what's

267
00:19:41,920 --> 00:19:45,799
going to change and what's not going
to change, and what I've done,

268
00:19:47,000 --> 00:19:51,960
then what I'll do when I release, if I pass my PhD after the

269
00:19:52,000 --> 00:19:57,359
summer, is show that this actually happens
in a statistically significant way. When

270
00:19:57,400 --> 00:20:03,839
you do, there's a way of measuring
the success of these programs,

271
00:20:03,160 --> 00:20:08,319
and you can show that this happens
almost every single time you use this method,

272
00:20:08,640 --> 00:20:14,720
you get an architecture that shows itself
to be able to survive in conditions

273
00:20:14,920 --> 00:20:17,720
which it hasn't been designed for.
I mean, I would argue that even

274
00:20:17,799 --> 00:20:22,200
when you do go in with an
architectural vision initially, there's a halfway point

275
00:20:22,240 --> 00:20:26,839
in almost every project where that gets
substantially revised, where you're just like, wait,

276
00:20:27,319 --> 00:20:32,039
we now understand this problem much more
deeply. We've done a bunch of

277
00:20:32,079 --> 00:20:33,480
testing. Maybe we've got to it
may even be a V one, but

278
00:20:33,519 --> 00:20:37,559
you're a few features in and you're like, we've thought about this wrong, and

279
00:20:37,599 --> 00:20:41,319
if you're lucky, you're able to
make enough changes to be able to deliver

280
00:20:41,559 --> 00:20:48,839
on what the customer actually needed.
Yeah, and that's one of the things

281
00:20:48,880 --> 00:20:55,640
that I talk about, is,
you know, that the way that

282
00:20:55,680 --> 00:20:59,839
software architecture is being done today at
large companies is in some kind

283
00:20:59,839 --> 00:21:03,799
of gray zone or twilight zone in
terms of what is architecture. And there's

284
00:21:03,799 --> 00:21:08,279
a younger generation of developers
who don't want architects in the room.

285
00:21:08,480 --> 00:21:12,319
You know, they'll light their
torches and they'll chase you out, and

286
00:21:12,319 --> 00:21:17,039
they'll say go away with your lines
and your boxes and your rules, and

287
00:21:17,759 --> 00:21:21,599
chase you back up into your ivory
tower. And especially within the Agile movement,

288
00:21:21,640 --> 00:21:26,039
there's a distrust of architecture
because people have been burned by these

289
00:21:26,039 --> 00:21:32,359
big upfront plans. And that
means that a lot of times

290
00:21:32,599 --> 00:21:37,599
architecture is seen as something that's just
going to emerge over time. And

291
00:21:37,599 --> 00:21:40,680
and like you said, as you
move through a project, you learn and

292
00:21:40,720 --> 00:21:44,160
you find things out, and suddenly
one day the team gets to a point

293
00:21:44,200 --> 00:21:48,160
and this can be very far down
the line where the architecture won't move anymore.

294
00:21:48,240 --> 00:21:52,039
So we've just discovered that this
is all wrong and we can't save

295
00:21:52,119 --> 00:21:55,680
it with a simple refactoring. We
have to tear it all down. And

296
00:21:55,720 --> 00:21:57,559
then project management will come in and
say, no, we're not going to

297
00:21:57,599 --> 00:22:02,359
tear it all down. We'll just
keep building on top of this very shaky

298
00:22:02,359 --> 00:22:07,880
platform and we'll call it technical debt. So it sounds good, and that's

299
00:22:07,920 --> 00:22:11,440
that's where we're going. And
what I'm trying to

300
00:22:11,599 --> 00:22:18,720
get architects to understand is that there
are three major challenges to software architecture that

301
00:22:18,720 --> 00:22:23,519
that we really don't know how to
handle yet. And those three things are

302
00:22:23,680 --> 00:22:27,720
time, because your architecture erodes,
and if you look at someone's architecture,

303
00:22:27,720 --> 00:22:32,759
it's a two-dimensional diagram with lines and
boxes, and that will not stay

304
00:22:32,759 --> 00:22:34,720
the same over time, and
there's an idea that it should do.

305
00:22:36,759 --> 00:22:40,640
And then there's change. Over time, things are going to change, and

306
00:22:40,680 --> 00:22:44,720
that's the big problem. That's why
time is nasty. And the problem with

307
00:22:44,759 --> 00:22:48,160
that change is that we don't know
what it is. You can't sit down

308
00:22:48,200 --> 00:22:52,759
and figure out what's going to change
because it's a complex business environment and everything

309
00:22:52,839 --> 00:22:56,240
is moving all at once. So
the market and the customers and the employees

310
00:22:56,279 --> 00:23:00,720
and your competitors, everyone's moving.
You don't know what's in the future.

311
00:23:00,880 --> 00:23:03,880
If you go
back to Parnas, you can definitely

312
00:23:03,880 --> 00:23:07,359
say the database connection strings, we
know that's going to change. We can

313
00:23:07,440 --> 00:23:11,160
encapsulate that, we'll be okay if we
do that. But you don't know what's

314
00:23:11,200 --> 00:23:15,599
going to happen in your market.
And so what I found was that when

315
00:23:15,599 --> 00:23:21,279
we do architecture, architects talk to
each other through two dimensional diagrams, but

316
00:23:21,400 --> 00:23:25,319
in their heads they carry around the
picture. Everyone has different pictures of many

317
00:23:25,359 --> 00:23:29,640
potential futures, many fears and doubts
and things that could go wrong. And

318
00:23:29,720 --> 00:23:33,400
so what I wanted to do is
to develop a way of designing architecture that

319
00:23:33,480 --> 00:23:40,079
actually made it easy to capture
that an architecture has a life cycle over

320
00:23:40,119 --> 00:23:44,640
time with many changes, and we
don't know what those are. And the

321
00:23:44,839 --> 00:23:48,920
concept then becomes that this random simulation
that we do where we randomly stress the

322
00:23:48,960 --> 00:23:57,200
thing from any perspective with the stressors
we talk about, then we can arrive at an

323
00:23:57,319 --> 00:24:04,039
architecture that has an expression of uncertainty
and time and change in it. And

324
00:24:04,200 --> 00:24:10,359
then we can show in a very
simple mathematical way that this architecture is more

325
00:24:10,480 --> 00:24:15,400
likely to survive in conditions which it
hasn't been built for. It's more likely

326
00:24:15,440 --> 00:24:18,680
to survive those requirements or new features
that pop up out of nowhere, those

327
00:24:18,799 --> 00:24:23,559
changes in the market. And there's
a body of evidence for this in the

328
00:24:23,559 --> 00:24:29,640
complexity sciences. So we go back
to the nineteen sixties. A biologist called

329
00:24:29,759 --> 00:24:34,759
Kauffman developed a model for how life
evolved out of a bunch of amino acids,

330
00:24:34,759 --> 00:24:37,920
how they connect to each other,
how they control each other, and

331
00:24:41,240 --> 00:24:44,519
how you can start with
these very simple amino acids, and at

332
00:24:44,519 --> 00:24:48,720
the end of the day you have
something insanely complex like humanity. And we

333
00:24:48,799 --> 00:24:52,640
think, oh, it must have
a master designer. Somebody, we say, must

334
00:24:52,680 --> 00:24:57,440
have architected all of this at the
outset, Yes, And herein we're engaging

335
00:24:57,480 --> 00:25:00,559
in these practices where, you see,
it is very difficult to do.

336
00:25:00,880 --> 00:25:06,640
Yes. But the key to
it is the same as the way life

337
00:25:06,680 --> 00:25:11,559
evolved is to randomly stress things until
they take on a structure. And if

338
00:25:11,599 --> 00:25:15,119
the structure is wrong, and you
keep randomly stressing your architecture, it will

339
00:25:15,200 --> 00:25:18,559
die. But luckily we
haven't built anything yet, so we can

340
00:25:18,640 --> 00:25:22,000
change it. And what we end
up with is we have this concept of

341
00:25:22,039 --> 00:25:30,440
a residue, and a residue is
an architecture in one particular timeline.

342
00:25:30,599 --> 00:25:33,799
This is our residue, and it's
whatever is left over in that timeline,

343
00:25:33,839 --> 00:25:38,039
and we change our original structure so
that in that timeline things look a little

344
00:25:38,039 --> 00:25:42,000
bit better for us, maybe not
entirely solving everything, but just

345
00:25:42,119 --> 00:25:47,319
just survivable. And what we end
up with is describing an architecture as

346
00:25:47,359 --> 00:25:49,839
you start off with your original naive
architecture, which is the first thing you

347
00:25:49,880 --> 00:25:52,920
came up with, which if we're
honest, you know we all do,

348
00:25:53,480 --> 00:25:59,920
and a whole bunch of these
residues that represent the architecture in different timelines.
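The stressing-and-residues loop Barry describes can be sketched in a few lines. This is a purely illustrative toy, not anything from the book: the component names, the stressors, and the "break one random component" rule are all invented assumptions.

```python
import random

# Toy version of the loop: apply random stressors to a naive
# architecture; each time a stressor breaks something, record the
# modified architecture (the "residue") that would survive that
# timeline. All names here are hypothetical.
random.seed(42)

naive_architecture = {"web", "orders", "billing", "inventory"}

stressors = [
    "payment provider changes its API",
    "traffic grows 100x",
    "regulator requires data residency",
    "company acquires a competitor",
]

def stress(architecture):
    """Pretend-check: a stressor breaks one random component."""
    return random.choice(sorted(architecture))

residues = []
for stressor in stressors:
    broken = stress(naive_architecture)
    # The residue is whatever survives, plus the small change we make
    # so that this timeline looks a little better for us.
    residue = (naive_architecture - {broken}) | {broken + " (isolated)"}
    residues.append((stressor, residue))

for stressor, residue in residues:
    print(stressor, "->", sorted(residue))
```

The residues then get integrated back into one component set, which is where the incidence matrices mentioned below come in.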

349
00:26:00,440 --> 00:26:04,000
And the trick then is to integrate
all of those different architectures into one

350
00:26:04,160 --> 00:26:11,039
single set of components. And we
do that using some tools that we steal

351
00:26:11,079 --> 00:26:17,119
from the complexity sciences called incidence matrices, and we use these matrices to find

352
00:26:17,160 --> 00:26:21,720
out where there is hidden coupling
in what I've done that's going to stop

353
00:26:21,759 --> 00:26:26,680
this architecture from moving through its different
timelines if it needs to. And those

354
00:26:26,720 --> 00:26:30,759
matrices. In the book, I
describe seven refactoring triggers that you

355
00:26:30,799 --> 00:26:33,920
can see just by drawing a simple
matrix and filling it in with ones

356
00:26:33,920 --> 00:26:40,400
and zeros, and it can actually
guide you to component boundaries. And we're

357
00:26:40,440 --> 00:26:45,279
way outside of object orientation or nouns
and verbs now. We're building

358
00:26:45,279 --> 00:26:51,759
our component structure, our service structure
based on the different timelines that might

359
00:26:51,799 --> 00:26:56,359
exist in the future of this architecture. And it's a completely different way of

360
00:26:56,400 --> 00:27:00,359
thinking. It sounds very, very alien compared to: we gather requirements,

361
00:27:00,400 --> 00:27:06,799
we measure all the risks, we
follow some object oriented principles or gang of

362
00:27:06,839 --> 00:27:10,759
four or something, and then we build
the system. This is a much wilder

363
00:27:10,839 --> 00:27:14,920
way of doing architecture, and it
takes people a few days of training to

364
00:27:14,960 --> 00:27:18,680
get into the idea and let go
of all the old stuff. And

365
00:27:18,680 --> 00:27:23,079
think about things in this very
different way. But it does, fascinatingly,

366
00:27:23,200 --> 00:27:32,799
produce really robust, resilient architectures,
and in a way that's empirically verifiable.
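The incidence-matrix step can be made concrete with a minimal, invented example. The component names, stressors, and the single trigger shown are assumptions for illustration; the seven triggers themselves are in the book, not reproduced here.

```python
# Rows are components, columns are stressors; a 1 means "this
# stressor forces this component to change". All names are invented.
components = ["orders", "billing", "invoicing", "search"]
stressors = ["new payment provider", "tax law change", "traffic spike"]

incidence = [
    [1, 0, 1],  # orders
    [1, 1, 0],  # billing
    [1, 1, 0],  # invoicing
    [0, 0, 1],  # search
]

# One plausible trigger: two components with identical rows change for
# exactly the same reasons -- hidden coupling, so merge candidates.
merge_candidates = [
    (components[i], components[j])
    for i in range(len(components))
    for j in range(i + 1, len(components))
    if incidence[i] == incidence[j]
]
print(merge_candidates)  # billing and invoicing always move together
```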

367
00:27:32,799 --> 00:27:37,599
You can show that I've done something
good with a little mathematical formula at

368
00:27:37,599 --> 00:27:41,640
the end of your work, and
that changes everything because as an architect,

369
00:27:41,720 --> 00:27:45,599
for the first time, you can
actually show I've done something useful, which

370
00:27:45,599 --> 00:27:49,960
is hard for architects. I can
see that if you approach the subject with

371
00:27:51,079 --> 00:27:56,000
a word like antifragility, yeah, that people's first thought might be,

372
00:27:56,240 --> 00:28:00,759
oh, that requires a whole bunch
of upfront thinking about, you know,

373
00:28:00,920 --> 00:28:07,359
predicting the future, when in fact
it's exactly the opposite, isn't it? It

374
00:28:07,440 --> 00:28:11,680
is, yes. We're letting
go of this false belief that we can

375
00:28:11,720 --> 00:28:17,000
predict the future. And I think
that's one of the things that enterprise architecture

376
00:28:17,039 --> 00:28:21,720
and waterfall approaches try to sweep under
the rug for generations that yeah, we

377
00:28:21,759 --> 00:28:26,559
can predict the future, we can
build a scenario analysis that's realistic, and

378
00:28:26,640 --> 00:28:30,759
in this methodology we give up all
of that. You've never been able to

379
00:28:30,799 --> 00:28:33,759
do it. We've never successfully done
it. If you're working in a system

380
00:28:33,759 --> 00:28:36,799
where you can predict the future,
then you probably don't need an architect.

381
00:28:36,960 --> 00:28:41,039
Everything's easy in that
kind of system. I've never really worked

382
00:28:41,079 --> 00:28:44,680
with any of those, but you
know, they may exist. I

383
00:28:44,680 --> 00:28:48,640
haven't seen one. Well, I
mean, I've certainly worked in environments where

384
00:28:48,359 --> 00:28:52,920
they're building the same app over and
over again, essentially, and so the

385
00:28:52,920 --> 00:28:56,920
template does make sense. But yeah, most people don't work like

386
00:28:56,960 --> 00:28:59,440
that generally. If you need to
build the same app over and over again,

387
00:28:59,480 --> 00:29:02,319
that app already exists and you shouldn't
need to build it at all.

388
00:29:02,759 --> 00:29:06,319
Yeah, if you're building the same
app over and over again, some SaaS

389
00:29:06,319 --> 00:29:10,920
company is going to come along and
steal your lunch. And so if you're

390
00:29:10,920 --> 00:29:14,680
not building that SaaS company, then
you're going to be in trouble. Eventually,

391
00:29:15,039 --> 00:29:21,359
you're going to get SaaS'd. Yeah, but yeah, I mean in

392
00:29:21,400 --> 00:29:23,000
the end, now, recognizing the
body of work that's in front of us

393
00:29:23,039 --> 00:29:27,200
when we're building new software, it's
because it doesn't exist, which means we

394
00:29:27,319 --> 00:29:32,960
really cannot predict the future. So
it makes far more sense for us to

395
00:29:33,160 --> 00:29:38,119
just explore the space over time and
come up with solutions as we go.

396
00:29:40,160 --> 00:29:42,839
It feels to me like the way
you're describing this now is that the technical

397
00:29:44,000 --> 00:29:49,160
debt we're talking about is assumptions made
early that built stuff we never needed.

398
00:29:51,440 --> 00:29:56,000
Yes, the reluctance to change it
once we do realize it because of the

399
00:29:56,039 --> 00:29:59,960
commitments we made to it. Yes, that's very, very dangerous. You'll

400
00:30:00,119 --> 00:30:03,319
find that one way architects have combated
this kind of thing in the past is

401
00:30:03,359 --> 00:30:07,200
that we'll pick a platform, we'll
pick a pattern, and we say this

402
00:30:07,279 --> 00:30:11,599
is the way that we're doing this, because we're very

403
00:30:11,599 --> 00:30:14,079
wise, and we've looked into the
future, and we've decided this is how

404
00:30:14,119 --> 00:30:17,359
you solve this problem, right.
And what happens is that as soon as

405
00:30:17,359 --> 00:30:19,839
you take that architecture and rub it
up against reality, it starts to break

406
00:30:19,920 --> 00:30:22,640
a little bit. And the way
we talk about that breaking is we call

407
00:30:22,680 --> 00:30:29,519
it edge cases, and we say
we can't possibly be wrong. So everything

408
00:30:29,559 --> 00:30:33,200
that doesn't work is just an edge
case. It's just a tiny little distraction

409
00:30:33,480 --> 00:30:37,519
on the edge of our
knowledge. And that's where you get technical

410
00:30:37,559 --> 00:30:40,799
debt because you start building and hacking
to cope with those edge cases, and you won't

411
00:30:40,839 --> 00:30:44,759
let go of the architecture. And
so these ideas give us a chance to

412
00:30:45,000 --> 00:30:48,480
question those assumptions very very early.
And the way that the method is set

413
00:30:48,559 --> 00:30:52,839
up, it gives everyone, developers, everyone on the team, a chance to

414
00:30:52,920 --> 00:30:56,119
come in and push the architecture around
and say, you know, is it

415
00:30:56,160 --> 00:31:00,599
going to hold in this timeline.
And one way that we've dealt with that

416
00:31:00,640 --> 00:31:03,880
in the past is we use probability
and we say that's never ever going to

417
00:31:03,920 --> 00:31:08,519
happen, right, But by analyzing
this from... I'm always afraid of that kind

418
00:31:08,559 --> 00:31:14,839
of certainty. Yeah, exactly.
But one of the things that we've discovered

419
00:31:15,119 --> 00:31:18,880
is that if you take a bunch
of things that are going to happen in

420
00:31:18,880 --> 00:31:22,119
the system, they
don't have to be accurate, because every happening

421
00:31:22,200 --> 00:31:26,519
is part of a class of happenings
that will push the entire system to the

422
00:31:26,640 --> 00:31:30,480
same kind of state. And if
you solve for that state, then you

423
00:31:30,519 --> 00:31:33,720
solve for many, many, many
things. And what we've discovered is that

424
00:31:33,720 --> 00:31:37,720
there's a simple mathematical leverage. If
you look at it from a complexity science

425
00:31:37,759 --> 00:31:44,519
perspective, a software application is never
actually complex. It's a complicated, constrained,

426
00:31:45,000 --> 00:31:48,599
very ordered thing. We tell it
exactly what to do, and it

427
00:31:48,680 --> 00:31:52,240
only does what we've told it to
do, and sometimes what we think we

428
00:31:52,319 --> 00:31:56,039
told it to do is a little
bit different, but it's very very ordered.

429
00:31:56,599 --> 00:32:00,440
The real complexity in the software system
is in the people and the markets

430
00:32:00,480 --> 00:32:05,799
wandering around changing their minds all the
time. We generally don't have, you know,

431
00:32:05,880 --> 00:32:13,039
class libraries that change their minds,
not yet anyway. And so

432
00:32:14,759 --> 00:32:21,799
this simple software
system has to live in this massively complex

433
00:32:21,839 --> 00:32:27,119
business environment, and there's such a
huge difference between them that there's a mathematical

434
00:32:27,200 --> 00:32:31,319
relationship between the number of possible states
in them, which means that you don't

435
00:32:31,359 --> 00:32:37,039
have to capture every little bit of
information about the complex state. And so

436
00:32:37,119 --> 00:32:42,640
those waterfall approaches to requirements were always
wrong on a scientific level, the wrong

437
00:32:42,680 --> 00:32:46,559
way to approach things. And so
you only need a sample of those

438
00:32:47,359 --> 00:32:52,599
to configure your architecture. And that's
why we use the random simulation to make

439
00:32:52,599 --> 00:32:58,359
that sample as broad as possible,
rather than narrowed by language or by

440
00:32:59,079 --> 00:33:02,319
engineering metaphors that we've imported. And
Barry, I'm gonna interrupt for one moment

441
00:33:02,359 --> 00:33:09,480
for this very important message, and
we're back. It's dot NetRocks.

442
00:33:09,480 --> 00:33:13,920
I'm Richard Campbell. That's Carl Franklin. Hey, talking to our friend Barry

443
00:33:13,960 --> 00:33:16,839
O'Reilly, who we see regularly at
conferences but don't talk to nearly enough.

444
00:33:17,079 --> 00:33:20,359
And I want to push back a
little on what you were saying in the

445
00:33:20,359 --> 00:33:23,319
first half too, because a lot
of that sort of random simulation is like

446
00:33:23,359 --> 00:33:27,640
it still feels like could we just
call that requirements gathering, you know,

447
00:33:27,799 --> 00:33:30,279
but maybe we're testing it with a
bit of code and so forth. This

448
00:33:30,400 --> 00:33:34,400
idea that we're going to catch all
this early, I just don't know that

449
00:33:34,440 --> 00:33:37,720
I buy it, Like, I
think you are going to find things later

450
00:33:37,839 --> 00:33:42,559
on, and the question is how
tolerant are you to being able to adapt

451
00:33:42,880 --> 00:33:47,680
later on? Yeah, And the
methodology builds on that. So one of

452
00:33:47,720 --> 00:33:52,960
the most important concepts is that we
don't know what's going to happen in the

453
00:33:52,960 --> 00:33:58,799
future. We are always going to
get surprised. And if you look at

454
00:33:59,079 --> 00:34:04,880
a business environment, it has
tens of thousands and hundreds of thousands of

455
00:34:04,920 --> 00:34:07,119
people in it. It has technology, it has markets, it has politics,

456
00:34:07,159 --> 00:34:09,920
all of these things and they're all
moving all at once. There's no

457
00:34:10,000 --> 00:34:15,039
way that you can predict the movement
of all of these things. And so

458
00:34:15,280 --> 00:34:21,840
the biologist Kauffman showed that when you
have many, many, an uncountable number of

459
00:34:21,880 --> 00:34:25,159
things connected to each other, you
don't actually have to predict every tiny little

460
00:34:25,159 --> 00:34:29,920
state of every tiny little piece of
a system. You just have to predict

461
00:34:29,960 --> 00:34:36,519
the presence of a certain number of
things called attractors, and so a large

462
00:34:36,559 --> 00:34:39,679
complex system. You don't learn a
large complex system by grabbing every individual and

463
00:34:39,800 --> 00:34:45,119
understanding their needs and their wants and
their habits and their peculiarities. You observe

464
00:34:45,159 --> 00:34:49,199
the system as a set of attractors
that these things all influence each other and

465
00:34:49,239 --> 00:34:52,000
push each other into And so the
way that you need to work as an

466
00:34:52,119 --> 00:34:58,480
architect is not to identify everything,
but to identify as many attractors as possible.

467
00:34:58,559 --> 00:35:04,119
The moving parts. Exactly, and once
you expose your architecture to those

468
00:35:04,159 --> 00:35:09,400
attractors, you start to reach something
which Kauffman, the biologist, called criticality. And

469
00:35:09,480 --> 00:35:15,639
criticality is the property of a system
where you start to see that it

470
00:35:15,679 --> 00:35:20,599
survives things that it hasn't been designed
for. And so a very good example

471
00:35:20,639 --> 00:35:23,320
is humanity or a human being.
We survive things all the time that we

472
00:35:23,360 --> 00:35:29,239
don't even know are there. Our
immune systems survive things that they haven't seen

473
00:35:29,320 --> 00:35:32,719
before because of the way they're configured. And Kauffman has this thing called the

474
00:35:32,880 --> 00:35:37,639
NK model that explains it very well. N is the number

475
00:35:37,679 --> 00:35:39,199
of components that you have in a
system, and K is the number of

476
00:35:39,239 --> 00:35:45,840
links, and by optimizing for N
and K in a system, you improve

477
00:35:45,039 --> 00:35:51,559
the system's ability to survive things that
it hasn't been designed for. In software

478
00:35:51,599 --> 00:35:54,039
engineering, we've known this for a
very very long time. We know that

479
00:35:54,079 --> 00:36:00,199
if you allow the number of components
in your architecture to explode, it's going

480
00:36:00,239 --> 00:36:05,199
to be harder to kill them all, but it's also going to be extremely

481
00:36:05,239 --> 00:36:07,159
hard to manage them all if you've
got lots and lots of them. And

482
00:36:07,239 --> 00:36:13,119
as you allow the number of connections
between those components to grow, it becomes

483
00:36:13,159 --> 00:36:16,440
incredibly difficult to maintain, because you
have to trace your messages as they move

484
00:36:16,519 --> 00:36:22,079
between thousands of different channels. And
Kauffman discovered that this exists in biological systems

485
00:36:22,079 --> 00:36:25,760
as well. He calls it the
NK model, and he noticed that as

486
00:36:25,800 --> 00:36:30,400
you tweak certain things the number of
components, the number of links, or

487
00:36:30,440 --> 00:36:32,880
the way they interact with each other, by constraining how often or when they

488
00:36:32,920 --> 00:36:37,719
interact with each other, then you
can control the number of attractors in a

489
00:36:37,800 --> 00:36:43,599
system. And he notes that criticality
is the property that you reach when a

490
00:36:43,639 --> 00:36:47,239
system arrives at the perfect balance of
these things which we call the edge of

491
00:36:47,320 --> 00:36:53,039
chaos in complexity theory. Because isn't
the corollary also true that if we have

492
00:36:53,239 --> 00:36:59,519
too few components, we also end up
with a lot of inertia, because it becomes

493
00:36:59,519 --> 00:37:01,880
difficult to change them. Yes,
and it becomes very easy to kill the

494
00:37:01,880 --> 00:37:05,760
system because there's only a few components, and if you take out one,

495
00:37:06,360 --> 00:37:10,039
the whole falls apart. And so
we've always known this, right, as

496
00:37:10,039 --> 00:37:15,000
software engineers. There's something in between
the monolith and the micro service, and

497
00:37:15,000 --> 00:37:19,039
it's never quite the same thing in
different projects. But you have to find

498
00:37:19,039 --> 00:37:22,199
that line, that magic balance,
and that magic balance can be found through

499
00:37:22,199 --> 00:37:28,360
this random simulation, through stressing the
system until you see that it stands up

500
00:37:28,400 --> 00:37:30,880
on its own. And so instead
of architecture being I'm going to design and

501
00:37:30,920 --> 00:37:36,360
build this beautiful cathedral, architecture becomes
a little more like Bambi on the ice.
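Kauffman's NK picture from a few exchanges back can be made concrete as a toy random Boolean network: N nodes, each reading K other nodes through a random Boolean rule. Walking all 2^N states and following the dynamics until they repeat counts the attractors. Everything here (N, K, the wiring, the seed) is an arbitrary illustration, not a claim about any real system.

```python
import itertools
import random

random.seed(1)
N, K = 5, 2  # N components, each wired to K inputs

# Random wiring and a random Boolean rule per node.
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [{bits: random.randint(0, 1)
           for bits in itertools.product((0, 1), repeat=K)}
          for _ in range(N)]

def step(state):
    """Advance every node one tick from its K inputs."""
    return tuple(tables[i][tuple(state[j] for j in inputs[i])]
                 for i in range(N))

attractors = set()
for start in itertools.product((0, 1), repeat=N):
    seen = []
    state = start
    while state not in seen:
        seen.append(state)
        state = step(state)
    # The states from the first repeat onward form the attractor cycle.
    attractors.add(frozenset(seen[seen.index(state):]))

print(len(attractors), "attractor(s) for N =", N, "K =", K)
```

Tweaking N and K changes how many attractors the network settles into, which is the knob being discussed here when tuning a system toward criticality.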

502
00:37:36,519 --> 00:37:38,960
Right, we're gonna slip and slide,
but eventually we're going to stand up

503
00:37:38,960 --> 00:37:43,599
and everything's going to be okay.
This is absolutely fascinating to me. I'm

504
00:37:43,639 --> 00:37:47,480
just drinking in everything that you're saying
here. And so do you

505
00:37:47,480 --> 00:37:51,119
have papers? You said you had
a book or do you have papers other

506
00:37:51,199 --> 00:37:53,519
than your book or is this all
in your book? So it's all in

507
00:37:53,559 --> 00:37:59,239
the book, which is on Leanpub.
I have a link to it in

508
00:37:59,280 --> 00:38:02,239
the show notes. Yeah, awesome. The book is based on a series

509
00:38:02,280 --> 00:38:08,159
of seven academic papers that I wrote,
since we last talked, over

510
00:38:08,239 --> 00:38:10,679
the last six years. And what
I found was I got a lot of

511
00:38:10,679 --> 00:38:15,719
feedback from academic people who said this
is really good, and from a lot

512
00:38:15,719 --> 00:38:19,000
of complexity science people who said this
is really good, and from a lot

513
00:38:19,039 --> 00:38:25,280
of software developers who said, this
makes no sense to us whatsoever. You're

514
00:38:25,280 --> 00:38:30,679
moving the cheese, man. Yeah,
and I've built a whole career around introducing

515
00:38:30,679 --> 00:38:34,719
complexity to the beginning of a project. Why are you ruining this for me

516
00:38:35,039 --> 00:38:38,000
Exactly. And so what
I did then. So I'm

517
00:38:38,039 --> 00:38:42,719
working with a conference. The conference
is called Domain-Driven Design Europe,

518
00:38:43,559 --> 00:38:47,639
and I've spoken there a few times
on these ideas, and they called me

519
00:38:47,719 --> 00:38:50,920
last year and said, hey,
we want to give a little book out

520
00:38:50,960 --> 00:38:53,599
as a present to all our attendees. So could you write us a

521
00:38:53,599 --> 00:38:57,480
little book? So I wrote the
book, and I sat down and said

522
00:38:57,519 --> 00:39:00,239
I have to make this something
that's, you know, more in the language

523
00:39:00,280 --> 00:39:06,239
that software engineers are going to grasp
rather than all these fancy complexity terms from

524
00:39:06,320 --> 00:39:10,079
biology and philosophy and all of these
things. And so that book has

525
00:39:10,159 --> 00:39:19,159
just been released on Leanpub and
it's much, much easier to understand.

526
00:39:19,800 --> 00:39:22,599
And there's a second book in the
works coming that's more on the philosophy.

527
00:39:22,639 --> 00:39:25,760
So I ended up, you know, digging, going all the way back

528
00:39:25,800 --> 00:39:29,480
to the Presocratics and saying, well, this is where we went wrong.

529
00:39:29,920 --> 00:39:34,719
A few thousand years ago in ancient Greece, a guy made a decision and we've

530
00:39:34,760 --> 00:39:39,239
all been suffering ever since. And
so there's a fair body of work out

531
00:39:39,239 --> 00:39:43,440
there, and there's going to be
you know, a PhD thesis published in

532
00:39:43,440 --> 00:39:45,880
the coming months, and there's going
to be, you know, eventually a

533
00:39:45,920 --> 00:39:52,159
bigger book. I mean, not
to dive straight into the academics of this,

534
00:39:52,320 --> 00:39:57,920
but part of what makes this methodology
work is that software isn't tangible,

535
00:39:58,239 --> 00:40:01,639
That rebuilding the temple is a button
press away, yes, you know,

536
00:40:01,880 --> 00:40:07,079
or accepting a pull request away.
We did a lot of planning when building

537
00:40:07,119 --> 00:40:10,719
in stone, because getting it wrong
takes a long time to clean up and

538
00:40:13,440 --> 00:40:19,599
replace. It's very difficult to go
from straw to wood to brick, whereas it's

539
00:40:19,639 --> 00:40:22,280
just not in software. Yeah,
the analogy that I use in the book

540
00:40:22,360 --> 00:40:27,199
is the manufacturing of cars. So
I say, you know, that's

541
00:40:27,199 --> 00:40:29,840
a very planned process, and it
has to be. It has to all work,

542
00:40:29,920 --> 00:40:34,760
everything has to fit together. And
that's a kind of engineering. And

543
00:40:34,960 --> 00:40:38,960
when software engineers back in the fifties
and sixties were looking for credibility,

544
00:40:39,000 --> 00:40:43,960
they wanted to say we're engineers too. So we borrowed a lot of the

545
00:40:44,039 --> 00:40:50,519
engineering metaphor from industries like car manufacturing. And what I say in the book

546
00:40:50,599 --> 00:40:52,800
is, imagine what it would look
like to be the designer of a car

547
00:40:53,440 --> 00:40:59,920
if gravity wasn't constant, if aluminium
constantly changed its properties, if rubber melted

548
00:41:00,239 --> 00:41:04,599
and reformed randomly. Because that's what
our stakeholders do, because we're building on

549
00:41:04,679 --> 00:41:09,679
their ideas, their opinions, and
so using the old school kind of engineering

550
00:41:09,719 --> 00:41:14,559
way of thinking about the world doesn't
work for us. Software is so connected

551
00:41:14,599 --> 00:41:17,920
and so malleable, so connected to
our ideas, and our ideas are fluid

552
00:41:17,920 --> 00:41:22,760
in a way that physical substances just
aren't there. We don't have the equivalent

553
00:41:22,800 --> 00:41:28,039
of the laws of physics to constrain
us. No, we can do anything.

554
00:41:28,199 --> 00:41:30,840
Anything can happen in a software project. And as soon as anything does happen,

555
00:41:30,880 --> 00:41:35,400
the rigid, sort of stiff software, the mechanical part of what we

556
00:41:35,480 --> 00:41:38,760
do falls apart, very very quickly. And that's what I describe as as

557
00:41:38,760 --> 00:41:46,079
the fundamental issue in software engineering,
that time, change, and uncertainty are

558
00:41:46,239 --> 00:41:50,679
driving what we do. And we've
been spending all our time trying to nail

559
00:41:50,719 --> 00:41:53,960
it down and make it certain right
when we design, and most of the

560
00:41:54,039 --> 00:41:57,719
time we've been lying. And one
of the things I reference in the book

561
00:41:57,760 --> 00:42:00,559
which most people hadn't discovered,
and people have been emailing me saying this

562
00:42:00,599 --> 00:42:09,079
is absolutely hilarious. But David Parnas
wrote a paper in nineteen eighty six called

563
00:42:09,760 --> 00:42:15,400
A Rational Design Process: How and Why
to Fake It. And so this is

564
00:42:15,440 --> 00:42:19,880
this is
one of the founding fathers of software engineering,

565
00:42:20,360 --> 00:42:23,599
and he says in this
paper, it's impossible to have a

566
00:42:23,679 --> 00:42:30,880
rational, structured way to design software. It just never happens the same way

567
00:42:30,920 --> 00:42:34,960
twice. We're making everything up as
we go along. It's pure mayhem,

568
00:42:35,559 --> 00:42:37,400
and he turns round in the paper
and says, but there's a lot of

569
00:42:37,480 --> 00:42:42,920
value in pretending to management that we
work in a rational and structured way.

570
00:42:43,320 --> 00:42:46,920
And so basically this article is a
call to all software engineers to just lie.

571
00:42:47,199 --> 00:42:53,480
Why? Just make it look like it's
engineering. And that's absolutely fascinating.

572
00:42:53,559 --> 00:42:58,239
Well, because it makes
the mortals comfortable. It does. Yeah,

573
00:42:58,320 --> 00:43:00,039
right, I mean that's really what
it comes down to. And so he

574
00:43:00,119 --> 00:43:05,360
knew this forty years ago, and
so we've been lying to management ever since

575
00:43:05,360 --> 00:43:10,400
about how unpredictable and unrepeatable our
methods and our processes are. And you'll

576
00:43:10,400 --> 00:43:14,599
find that in a lot of the
approaches that we use, they promise certainty,

577
00:43:14,639 --> 00:43:19,199
they promise stability, they promise a
repeatable process. If you look at

578
00:43:19,199 --> 00:43:22,400
things like TOGAF and things like SAFe
that we have nowadays, it's all

579
00:43:22,400 --> 00:43:27,760
about repeatability and everything being concrete and
measured and well understood. And our world

580
00:43:28,199 --> 00:43:31,599
as architects simply isn't like that.
Yeah, wow. Mind blown. Parnas is

581
00:43:31,639 --> 00:43:37,000
still alive. He's actually in Victoria, Like I'm almost looking across at where

582
00:43:37,000 --> 00:43:39,480
he lives. Yeah, he's in
his eighties. Like, yeah, man,

583
00:43:40,000 --> 00:43:45,559
I had some email contact with him
a few years ago, and yeah,

584
00:43:45,599 --> 00:43:50,719
we talked a little bit. So
I pointed out that maybe, you

585
00:43:50,719 --> 00:43:54,199
know, maybe the problem that we've
had is not that things change, but

586
00:43:54,239 --> 00:44:00,320
that we don't know what's going to
change, which... well, you know, I've

587
00:44:00,360 --> 00:44:05,320
been wondering the past few years that
the inertia of our systems has been pretty

588
00:44:05,360 --> 00:44:12,239
stable, and so it's felt more
and more like engineering recently. But at the

589
00:44:12,239 --> 00:44:15,559
same time, you know, I
spent a lot of time looking outward.

590
00:44:15,840 --> 00:44:20,400
We're ripe for disruption. Yeah,
you know, the smartphone's going to change,

591
00:44:20,639 --> 00:44:25,079
likely to augmented reality. These
non-deterministic language models represent a

592
00:44:25,079 --> 00:44:29,320
whole new UX paradigm. Like there's
a bunch of funny things that could happen

593
00:44:29,360 --> 00:44:31,440
here, and boy, oh boy, all your plans can be blown up

594
00:44:31,480 --> 00:44:36,000
pretty good. Yeah. I think
one of the things that's happening is we're

595
00:44:36,039 --> 00:44:39,519
getting better and better at what we
do at the software, the

596
00:44:39,639 --> 00:44:44,159
rigid, ordered part of what we do. We're getting better at delivering it.

597
00:44:44,239 --> 00:44:46,320
We know how to
test it, we know how to deploy

598
00:44:46,840 --> 00:44:50,840
much more often. We're getting better
at so many of these things. Cloud

599
00:44:50,880 --> 00:44:55,480
is this amazing collection of architectural patterns
that makes everyone's life easier. But at

600
00:44:55,480 --> 00:44:59,280
the same time, if you look
out into the real world, people who

601
00:44:59,280 --> 00:45:02,480
are doing business and executives and companies
and governments, they're finding the world that

602
00:45:02,519 --> 00:45:07,760
we live in harder and harder to
manage because they don't know what's coming down

603
00:45:07,840 --> 00:45:14,719
the pipeline or if anything's coming down
the pipeline, and that makes their world

604
00:45:14,880 --> 00:45:19,360
difficult. Management loves predictability. Yeah, at the same time, we've sold

605
00:45:19,480 --> 00:45:22,480
all of this stuff as predictable, when in
reality what it should be doing is allowing for

606
00:45:22,519 --> 00:45:28,760
extreme flexibility. Being able to pay
for your compute by the minute makes

607
00:45:28,760 --> 00:45:32,039
you more flexible. Being able to
automate the pipeline to deploy software means you can

608
00:45:32,079 --> 00:45:37,079
iterate more quickly. These are all
elements of getting to what you're describing,

609
00:45:37,119 --> 00:45:40,280
Barry, which is you don't have
to predict the future. You can keep

610
00:45:40,320 --> 00:45:45,199
adjusting to the present as it comes
at you because all of this infrastructure makes

611
00:45:45,239 --> 00:45:49,079
it very easy to do. So, yeah, it's the question, you

612
00:45:49,119 --> 00:45:52,159
know, when a customer asks you, do you know how to, you know,

613
00:45:52,239 --> 00:45:53,880
Can you do this? Not do
you know? But can you do

614
00:45:54,440 --> 00:45:58,920
X? And most of the time
my answer is yes, even if I've

615
00:45:58,960 --> 00:46:01,239
never done it before, because I
trust in my ability to be able to

616
00:46:01,239 --> 00:46:05,719
figure it out and get it done. And so that is the certainty that

617
00:46:05,760 --> 00:46:09,400
I have that I believe in myself. I believe in my team. Yes,

618
00:46:09,440 --> 00:46:13,960
and I know that we can do
this because we have the ability to

619
00:46:14,440 --> 00:46:16,320
test things out, to look things
up and see where the mistakes have been

620
00:46:16,360 --> 00:46:19,920
made and figure it out. Yes. And this is a big

621
00:46:19,960 --> 00:46:22,920
conversation I have with architects when I'm
teaching them. And a lot of times

622
00:46:22,960 --> 00:46:28,519
teaching architects is more like therapy than
actually teaching them anything. Sure well,

623
00:46:28,559 --> 00:46:30,320
one of the things I try to
get them to understand is to say,

624
00:46:30,360 --> 00:46:34,840
why were you successful in your last
project? And a lot of architects will

625
00:46:34,880 --> 00:46:37,159
lean back and say, because I
used this framework or this pattern or this

626
00:46:37,280 --> 00:46:40,719
tool. I was like, no, it's you. You are the success

627
00:46:40,760 --> 00:46:45,119
in your projects. The way that
you think, the flexibility of thought that's

628
00:46:45,159 --> 00:46:49,719
required to build good architectures is what
makes it succeed, and that exists

629
00:46:49,760 --> 00:46:52,360
only in you and the people around
you. It doesn't exist in a framework.

630
00:46:52,400 --> 00:46:58,599
A framework boils away that human essence
that makes it possible and that's why one

631
00:46:58,639 --> 00:47:00,760
of the best forms
of feedback that I get is when I

632
00:47:00,800 --> 00:47:05,920
teach these classes or I teach these
workshops, and I have some senior architects

633
00:47:05,920 --> 00:47:08,360
who've built mad stuff that I respect, who say, this is how we

634
00:47:08,519 --> 00:47:13,199
actually think, this is what we
do. That's how I know that I've

635
00:47:13,239 --> 00:47:15,239
hit the nail on the head, because
what I set

636
00:47:15,280 --> 00:47:19,159
out to do, was to answer
the question, how do I turn a

637
00:47:19,199 --> 00:47:22,400
developer into an architect? What do
we teach them? Because it's definitely not

638
00:47:22,480 --> 00:47:24,800
TOGAF, and it's definitely not
requirements, and it's definitely not lines and

639
00:47:24,840 --> 00:47:30,840
boxes. And what I've done is
I've turned that negativity of senior architects into

640
00:47:31,000 --> 00:47:35,599
a tool that we can use, so
that we're not dismissed anymore, and so

641
00:47:35,679 --> 00:47:37,480
it'll be, quiet, negative Nancy,
we just want to get this built,

642
00:47:37,960 --> 00:47:40,800
and we can say, well,
actually, this is the process. And

643
00:47:40,880 --> 00:47:45,960
it turns out that when you start
to work in this random way, it's

644
00:47:45,079 --> 00:47:49,079
so much more fun. And it's
one of the coolest things to watch at

645
00:47:49,079 --> 00:47:52,840
the workshops is when people start building
systems by stressing them, by making mad

646
00:47:52,840 --> 00:47:58,880
stuff up, by picking words out
of a dictionary. Then the fun that

647
00:47:58,880 --> 00:48:01,320
starts to happen is really,
really cool, and at the end of

648
00:48:01,360 --> 00:48:06,039
the day you get a verifiable architecture
out of the work. I also wonder,

649
00:48:06,159 --> 00:48:09,559
I mean, there's a psychological aspect
of this. By doing random things,

650
00:48:09,599 --> 00:48:13,639
by not, you know, doing
all this careful planning, you're not

651
00:48:13,960 --> 00:48:17,559
upset when your software breaks, like
you break it all the time. Yeah,

652
00:48:17,559 --> 00:48:22,320
So, I mean, you take
away that stigma, and so, you

653
00:48:22,360 --> 00:48:25,000
know, and you externalize it.
It's like, oh, we did this

654
00:48:25,000 --> 00:48:29,599
thing, we got this result.
This is not personal. This is not

655
00:48:29,760 --> 00:48:34,639
you've failed. This is we were
experimenting. We battered the software, the

656
00:48:34,719 --> 00:48:38,159
software got battered. Good on us. Now let's try something else exactly.

657
00:48:38,519 --> 00:48:45,400
This reminds me of a situation.
I was actually lucky enough to be hired

658
00:48:45,440 --> 00:48:50,679
as a session musician on an album
that was recorded at Levon Helm's studio in

658
00:48:45,440 --> 00:48:50,679
Woodstock, right, and so I
was hanging around most of the you know,

660
00:48:53,519 --> 00:48:58,159
all I had to do was play
one solo, right, and they

661
00:48:58,239 --> 00:49:00,039
finally got to me on the second
day. I hung around all day the

662
00:49:00,079 --> 00:49:05,239
first day, second day, about
three o'clock. All right, Carl you're

663
00:49:05,320 --> 00:49:08,880
up. And I told them right
at the outset, guys, it's going

664
00:49:08,960 --> 00:49:14,880
to take me about nine or
ten takes to really nail this solo.

665
00:49:15,039 --> 00:49:19,119
And they looked at me like I
was crazy, Like what you can't just

666
00:49:19,239 --> 00:49:22,159
like you know, and I said, no, I really have to play

667
00:49:22,199 --> 00:49:24,039
it a few times. I have
to feel it to get some ideas,

668
00:49:24,719 --> 00:49:28,280
and you know, by
the tenth take, I swear to God

669
00:49:28,320 --> 00:49:30,239
it's going to be magic. Turns
out I nailed it. It was like

670
00:49:30,440 --> 00:49:37,079
ninth take, boom, that was the
one, everybody said yes. And it's because

671
00:49:37,119 --> 00:49:43,320
I understood myself and I understood what
my you know, abilities are and how

672
00:49:43,360 --> 00:49:45,239
I can get to it. And
so at the end of the day,

673
00:49:45,239 --> 00:49:51,280
I wasn't sure if I should have
divulged that or I should have just done

674
00:49:51,280 --> 00:49:53,760
it. You know that if I
hadn't said that, maybe they would have

675
00:49:53,760 --> 00:49:58,000
been angry with that, like, you're wasting our time,

676
00:49:58,320 --> 00:50:00,599
you know, you should just come
in nail it. I don't know,

677
00:50:01,400 --> 00:50:04,960
But it just
goes back to that whole thing of having

678
00:50:04,960 --> 00:50:08,639
a belief in your abilities and your
ability to figure stuff out. Right.

679
00:50:09,079 --> 00:50:14,239
Don't look at it as an
unknown; look at it as, what usually happens

680
00:50:14,239 --> 00:50:16,079
when I don't know something? And
how long does it usually take me to

681
00:50:16,079 --> 00:50:21,159
figure it out? Yeah? And
I think we have an attitude in our

682
00:50:21,440 --> 00:50:25,679
industry which I call Seagalism. And
as you know, when you watch a

683
00:50:25,719 --> 00:50:30,719
Steven Seagal movie and he goes into
you know, he goes into a bar

684
00:50:30,880 --> 00:50:32,840
full of bad guys and there's like
fifty of them in there, and they

685
00:50:32,920 --> 00:50:37,519
all rush at him and he beats
everyone up and throws them out through windows

686
00:50:37,519 --> 00:50:40,119
and things. He comes back out of the bar without

687
00:50:40,119 --> 00:50:43,800
a scratch on him. He has
a tiny little bit of dust on his

688
00:50:43,880 --> 00:50:49,199
lapel, which he brushes off, and then
he drives off again. And

689
00:50:49,239 --> 00:50:52,079
you're thinking, in terms of plot, why would you go in there?

690
00:50:52,199 --> 00:50:57,039
That makes no sense. But
in tech we tend to think that,

691
00:50:57,119 --> 00:50:59,320
you know, we need to know
everything. I need to go in and

692
00:50:59,400 --> 00:51:01,000
nail this on the first take.
I need to be seen at a meeting.

693
00:51:01,000 --> 00:51:05,079
I need to be able to answer
all of these questions about how this

694
00:51:05,159 --> 00:51:07,599
particular Azure component works, and
you know, I have to know this

695
00:51:07,639 --> 00:51:13,719
stuff. And that leads to a
sort of entrenched behavior when we're gathering requirements

696
00:51:14,199 --> 00:51:17,719
both in us, who won't admit that
we don't know something, and it bleeds

697
00:51:17,719 --> 00:51:22,639
onto our stakeholders, and they're caught
like a rabbit in the headlights. They

698
00:51:22,679 --> 00:51:24,599
have to say something or else look
like they don't know what they're doing.

699
00:51:24,639 --> 00:51:29,039
And what they say then, in
those circumstances is very often they're just making

700
00:51:29,039 --> 00:51:31,239
it up to get us to go
away. Whereas this kind

701
00:51:31,239 --> 00:51:36,199
of analysis allows us to take what
they say and stress it and push it

702
00:51:36,239 --> 00:51:40,119
around and find things outside of it
that we haven't understood. And it takes

703
00:51:40,159 --> 00:51:45,079
all that pressure off us on the psychological
side of things. It takes a lot

704
00:51:45,119 --> 00:51:47,679
of pressure off stakeholders as well,
once they realize that, you know,

705
00:51:47,800 --> 00:51:51,880
so, the question I ask
a stakeholder isn't, what are your requirements,

706
00:51:51,920 --> 00:51:54,079
anymore. It's, what keeps you up
at night? What are you afraid

707
00:51:54,159 --> 00:51:58,400
of? What could go wrong here? And they'll happily talk about that because

708
00:51:58,400 --> 00:52:01,239
they're under no pressure to be precise.
And the worst thing we can

709
00:52:01,239 --> 00:52:06,119
do to our stakeholders is force them
to be precise in their language or precise

710
00:52:06,159 --> 00:52:09,360
in their description of an unknowable future. We've heard this before

711
00:52:09,400 --> 00:52:15,239
and I've certainly had
this experience, that you don't interrogate like you're

712
00:52:15,239 --> 00:52:20,679
a lawyer, like you're preparing for
the lawsuit of this failed project. Let

713
00:52:20,719 --> 00:52:24,800
me get statements from you that mean
I'm no longer liable for this thing

714
00:52:24,880 --> 00:52:30,039
failing. Yeah. I usually say
that, if you're a

715
00:52:30,079 --> 00:52:34,159
consultant or, you know, an architect at a
large company, we have a fantastic business

716
00:52:34,159 --> 00:52:37,079
model, and that is tell me
what you want, and I know it's

717
00:52:37,119 --> 00:52:40,280
wrong, and I'm going to build
it until we find out that it's wrong,

718
00:52:40,559 --> 00:52:44,119
at which point I'm going to blame
you because you didn't tell me,

719
00:52:44,159 --> 00:52:46,440
and you have to pay me
again to do the same thing. And

720
00:52:46,480 --> 00:52:52,199
it's this perfect business model where we
have no responsibility for what we build,

721
00:52:52,480 --> 00:52:54,599
and this is an attempt to break
out of that and change

722
00:52:54,599 --> 00:52:59,599
that mold. I'm also impressed
that there's nothing in the

723
00:52:59,679 --> 00:53:04,639
Agile Manifesto against any of this either. Arguably, this is what real agile

724
00:53:04,920 --> 00:53:12,920
was about: lots of communication, minimizing what you were writing, responding

725
00:53:13,000 --> 00:53:19,400
to change. It's just that,
well, folks pushed back on that sort

726
00:53:19,440 --> 00:53:23,000
of 'agile, but' mindset,
right, to keep waterfalling anyway.

727
00:53:23,480 --> 00:53:28,239
Yeah, I think that going back
to without getting too philosophical, but

728
00:53:28,239 --> 00:53:30,280
there was a split in our thinking
three thousand years ago in Greece,

729
00:53:30,719 --> 00:53:37,480
and we chose the path of order
and structure and abstraction, and we stayed

730
00:53:37,519 --> 00:53:39,199
on that path for thousands of years. And it was good that we did

731
00:53:39,239 --> 00:53:45,239
because that gave us computers, but
it also has some weaknesses that cause us

732
00:53:45,280 --> 00:53:50,280
to think in a particular way. When we look at social systems,

733
00:53:50,280 --> 00:53:54,320
we try to treat them like manageable
ordered machines. And the agile movement was

734
00:53:54,440 --> 00:54:00,719
actually a huge philosophical event from that
perspective because they said, hey, we

735
00:54:00,800 --> 00:54:04,119
don't know what's going on. We
can't actually write all this down ahead of time. That's

736
00:54:04,159 --> 00:54:07,800
a huge philosophical step. It's a
huge challenge to our industry. Well, and

737
00:54:07,119 --> 00:54:12,880
arguably they said you can't know, right?
Like, it is not possible. Why

738
00:54:12,960 --> 00:54:15,079
do you want us to keep lying? Right? It's sort of a pushback

739
00:54:15,079 --> 00:54:19,239
on Parnas to go, hey,
you know how we've been faking it all

740
00:54:19,239 --> 00:54:22,760
this time? Can we just admit
we've been faking it? Exactly, and that's

741
00:54:22,760 --> 00:54:28,199
why the agile movement then got suppressed, actually, and that huge message has

742
00:54:28,239 --> 00:54:31,199
been pushed into all kinds of frameworks
and things that go back to the goal

743
00:54:31,239 --> 00:54:35,800
of giving us certainty and structure and
order. Yeah, which is I think

744
00:54:35,800 --> 00:54:39,800
against the spirit of that original manifesto. I'm not sure how many people behind

745
00:54:39,800 --> 00:54:45,360
the manifesto actually realized that
they were taking a swing at something really,

746
00:54:45,400 --> 00:54:50,559
really big when they said that, something
big that actually swung back quite successfully.

747
00:54:51,320 --> 00:54:53,320
I remember, we talked about this
in one of the fusion Geek Outs.

748
00:54:53,400 --> 00:54:58,559
The ITER project, the gigantic project,
where the whole thing was, get the

749
00:54:58,599 --> 00:55:02,679
first billion dollars spent, because then they
can't cancel it. So we can't tell

750
00:55:04,079 --> 00:55:07,199
them the truth, so we lie until they're too
committed. Then we just work on what

751
00:55:07,320 --> 00:55:12,440
needs to be done. Yeah.
Yeah, And a number of mega projects

752
00:55:12,480 --> 00:55:15,079
that are like that, where it's
like, if you actually knew the full

753
00:55:15,119 --> 00:55:16,239
scope of this, you may not
do it, and we really want to

754
00:55:16,239 --> 00:55:20,679
do it and it will be beneficial, but we can't tell you the truth

755
00:55:20,679 --> 00:55:24,000
because then you won't do it.
Yeah, that's a great note there,

756
00:55:28,320 --> 00:55:32,000
Barry. This is so cool.
I mean, you've really made my day, and

757
00:55:34,280 --> 00:55:37,679
I'm loving this. Well, and congratulations
on what is clearly a very cool book,

758
00:55:37,719 --> 00:55:40,840
which you can get for free,
but do yourself a favor at least pay

759
00:55:40,880 --> 00:55:45,039
the minimum price for it. Absolutely. On Leanpub
you have a choice. Yeah,

760
00:55:45,119 --> 00:55:49,960
and it looks like a fun read. Yeah. The artwork has been done

761
00:55:50,280 --> 00:55:54,920
by my eleven-year-old, who's incredibly
proud of it. Right, I think

762
00:55:54,920 --> 00:55:59,079
that's one of the coolest things.
And it's our project. But what's their

763
00:55:59,159 --> 00:56:02,960
name? Let's call them out.
Alexander. All right, who did the artwork

764
00:56:04,000 --> 00:56:12,719
and is now demanding royalties and I
like it even more. Yeah, Well,

765
00:56:12,760 --> 00:56:15,840
this is great and you've changed the
world. So thank you for spending

766
00:56:15,920 --> 00:56:20,679
the time with us today. And
well, I'm sure we're going to talk

767
00:56:20,679 --> 00:56:24,159
to you again. Cheers, guys, see you soon at a conference somewhere,

768
00:56:24,679 --> 00:56:30,079
Okay, cheers, and we'll talk
to you next time on dot net

769
00:56:30,199 --> 00:56:54,000
rocks. Dot net Rocks is brought
to you by Franklins.Net and produced by

770
00:56:54,079 --> 00:57:00,000
Pwop Studios, a full service audio, video, and post-production facility located physically

771
00:57:00,039 --> 00:57:06,000
in New London, Connecticut, and
of course in the cloud online at pwop

772
00:57:06,239 --> 00:57:09,119
dot com. Visit our website at
D O T N E T R O

773
00:57:09,199 --> 00:57:15,480
C K S dot com for RSS
feeds, downloads, mobile apps, comments,

774
00:57:15,800 --> 00:57:19,920
and access to the full archives going
back to show number one, recorded

775
00:57:19,960 --> 00:57:22,880
in September two thousand and two.
And make sure you check out our sponsors.

776
00:57:23,039 --> 00:57:27,679
They keep us in business. Now
go write some code. See you

777
00:57:27,719 --> 00:57:37,679
next time.
