1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to dot net rocks with

2
00:00:03,040 --> 00:00:03,799
no ads?

3
00:00:04,440 --> 00:00:04,799
Speaker 2: Easy?

4
00:00:05,360 --> 00:00:08,560
Speaker 1: Become a patron for just five dollars a month. You

5
00:00:08,599 --> 00:00:11,320
get access to a private RSS feed where all the

6
00:00:11,359 --> 00:00:14,599
shows have no ads. Twenty dollars a month will get

7
00:00:14,599 --> 00:00:17,679
you that and a special dot net Rocks patron mug.

8
00:00:18,160 --> 00:00:34,320
Sign up now at patreon dot dotnetrocks dot com. Hey,

9
00:00:34,359 --> 00:00:37,079
get down, rock and roll. Build, we're here. It's dot

10
00:00:37,119 --> 00:00:37,520
net rocks.

11
00:00:37,520 --> 00:00:39,079
Speaker 2: I'm Carl Franklin, and Richard Campbell.

12
00:00:39,119 --> 00:00:42,679
Speaker 1: And the end of the second day of recording for us.

13
00:00:42,920 --> 00:00:45,960
So you know, it's been a good two days so far.

14
00:00:46,079 --> 00:00:48,320
Speaker 2: I got another one to go today. I've been knocking

15
00:00:48,359 --> 00:00:50,560
out RunAs Radio shows, having a good time doing it. Awesome.

16
00:00:51,600 --> 00:00:53,880
Speaker 1: I've already done a couple of Blazor Puzzles, and Jeff

17
00:00:53,920 --> 00:00:57,920
Fritz is chock full. I saw our friend Beth outside

18
00:00:58,119 --> 00:01:01,960
just now. Yeah, she who used to be Massi. And

19
00:01:01,960 --> 00:01:05,040
now I can't ever remember what her new last name is.

20
00:01:05,040 --> 00:01:07,480
Speaker 2: She didn't change it. Oh, okay, okay. You did the wedding?

21
00:01:08,239 --> 00:01:12,400
Speaker 1: Well, that's true, I was there and I don't remember.

22
00:01:12,959 --> 00:01:15,079
I don't even remember what I had for breakfast. That's

23
00:01:15,120 --> 00:01:16,480
good because I didn't have breakfast.

24
00:01:16,680 --> 00:01:17,280
Speaker 2: Okay, here we go.

25
00:01:18,480 --> 00:01:23,239
Speaker 1: Before we start, let's talk about nineteen fifty five. Episode

26
00:01:23,280 --> 00:01:25,840
nineteen fifty five. This episode nineteen fifty five. So we've

27
00:01:25,879 --> 00:01:28,599
been talking a little history about what happened. Okay, here,

28
00:01:28,680 --> 00:01:30,879
What do you like about nineteen fifty five? Well, I just know

29
00:01:31,120 --> 00:01:35,079
a couple of significant things, you know, cultural, political, and

30
00:01:35,120 --> 00:01:38,519
social events. The first McDonald's restaurant opened that year. It's definitely

31
00:01:38,560 --> 00:01:43,599
a cultural event. Yeah, yeah, fast food, right, Disneyland debuted,

32
00:01:44,400 --> 00:01:47,040
and The Mickey Mouse Club premiered on TV.

33
00:01:47,799 --> 00:01:50,319
Speaker 2: Coincidence, I don't think so. No.

34
00:01:51,400 --> 00:01:55,319
Speaker 1: Politically, the Warsaw Pact was formed and West Germany became

35
00:01:55,359 --> 00:01:59,120
a sovereign state. On the social front, Rosa Parks's arrest

36
00:01:59,159 --> 00:02:03,040
sparked the Montgomery Bus boycott, and the racially inspired murder

37
00:02:03,079 --> 00:02:09,000
of Emmett Till caused national outrage. There's a lot more,

38
00:02:09,120 --> 00:02:12,680
for sure. What are you thinking about computer tech wise

39
00:02:12,879 --> 00:02:13,879
or science wise?

40
00:02:14,000 --> 00:02:17,199
Speaker 2: The first atomic clock, which was measuring the frequency of caesium,

41
00:02:17,240 --> 00:02:19,639
so it was an incredibly precise clock. Wow, it was

42
00:02:19,639 --> 00:02:21,639
the beginning of the atomic age. Nineteen fifty five, right,

43
00:02:21,919 --> 00:02:25,960
But more importantly, Velcro. Now, Velcro and the Big Mac

44
00:02:26,360 --> 00:02:31,319
in the same year, and the first wireless television remote

45
00:02:32,080 --> 00:02:35,000
the Zenith Flash-Matic. You know, they called it the

46
00:02:35,080 --> 00:02:37,759
Flash-Matic because it used light. It used light. They

47
00:02:37,800 --> 00:02:39,639
put a sensor in each corner of the screen, and

48
00:02:39,680 --> 00:02:41,599
depending on what corner of the screen you pointed at,

49
00:02:41,639 --> 00:02:44,639
it either turned the channel up one or the channel down one,

50
00:02:44,960 --> 00:02:47,759
or the volume up one or the volume down one.

51
00:02:48,039 --> 00:02:48,639
Speaker 1: Isn't that cool?

52
00:02:48,719 --> 00:02:51,439
Speaker 2: It was simply cool. And before that there was you

53
00:02:51,479 --> 00:02:53,479
had a remote, but it was wired and people didn't

54
00:02:53,560 --> 00:02:55,840
like the wire. So this was the first wireless, just

55
00:02:55,879 --> 00:02:57,240
with a flashing light. Yeah.

56
00:02:57,360 --> 00:02:59,639
Speaker 1: I was the remote for my older brother. Yes, let's

57
00:02:59,639 --> 00:03:02,719
get up and change the channel, or I'll smack you, right?

58
00:03:03,680 --> 00:03:06,439
Were you the remote for your older siblings? Is that

59
00:03:06,479 --> 00:03:06,919
why you play?

60
00:03:07,039 --> 00:03:08,759
Speaker 3: I'm the oldest, but I was definitely the remote for

61
00:03:08,879 --> 00:03:09,159
my dad.

62
00:03:09,960 --> 00:03:14,919
Speaker 2: For your dad. Why else do we have children?

63
00:03:14,960 --> 00:03:16,840
Speaker 1: I tried to train the dog to do it, but

64
00:03:17,360 --> 00:03:20,039
never, never worked. It doesn't work out. All right, so

65
00:03:20,120 --> 00:03:22,400
I guess it's time now for Better Know a Framework.

66
00:03:22,400 --> 00:03:22,719
All right?

67
00:03:30,439 --> 00:03:31,759
Speaker 2: Okay, dude, what do you got? All right?

68
00:03:31,879 --> 00:03:35,159
Speaker 1: So I went looking at what's trending on GitHub

69
00:03:35,240 --> 00:03:36,439
again for repos and.

70
00:03:36,840 --> 00:03:40,080
Speaker 2: There's a lot of AI stuff trending, and here we are.

71
00:03:41,080 --> 00:03:45,360
Speaker 1: Yeah, it's an AI world, So this one is awesome.

72
00:03:45,520 --> 00:03:51,000
ChatGPT Prompts. This repo includes ChatGPT prompt curation

73
00:03:51,199 --> 00:03:55,039
to use ChatGPT and other LLM tools better. So

74
00:03:55,439 --> 00:04:00,560
it says ChatGPT, but these prompts work with other

75
00:04:00,639 --> 00:04:05,520
AI models: Claude, Gemini, Hugging Face, Llama, Mistral,

76
00:04:05,560 --> 00:04:08,960
and more. Thank you, thank you very much.

77
00:04:09,039 --> 00:04:10,599
Speaker 2: Yeah, that's not going to show up in the show.

78
00:04:10,879 --> 00:04:14,319
Speaker 1: No, probably not, but Brandon, if it does, I would

79
00:04:14,360 --> 00:04:15,039
like to say thank you.

80
00:04:15,240 --> 00:04:15,599
Speaker 2: Okay.

81
00:04:16,480 --> 00:04:18,560
Speaker 1: So yeah, so that's it. I mean, it's it's kind

82
00:04:18,560 --> 00:04:20,319
of a long readme because there's a lot of

83
00:04:20,360 --> 00:04:23,720
stuff there. Oh yeah, but here's just a couple. Act

84
00:04:23,800 --> 00:04:29,120
as an Ethereum developer, nice, act as a Linux terminal, act

85
00:04:29,199 --> 00:04:34,199
as an English translator and improver, act as a job interviewer,

86
00:04:34,240 --> 00:04:40,959
act as a JavaScript console, as an Excel sheet. So yeah,

87
00:04:41,000 --> 00:04:43,439
some very creative prompts there. I'm sure people are finding

88
00:04:43,480 --> 00:04:46,519
good use. So that's what I found. Who's talking to

89
00:04:46,600 --> 00:04:47,160
us today?

90
00:04:46,959 --> 00:04:50,040
Speaker 2: Richard grabbed a comment off show fourteen twenty five. So going back

91
00:04:50,079 --> 00:04:51,399
to New York a little bit. That was March

92
00:04:51,439 --> 00:04:54,279
of twenty seventeen. We talked to our friend Damian Brady

93
00:04:54,519 --> 00:04:59,079
about Brownfield DevOps. Yes, I happen to know Nicole has

94
00:04:59,120 --> 00:05:01,399
done a little bit of DevOps-related stuff over the years.

95
00:05:01,879 --> 00:05:04,439
And this particular, this is when we're sort of talking

96
00:05:04,480 --> 00:05:07,720
about bringing the DevOps practice of that sort of automation

97
00:05:07,839 --> 00:05:09,759
and controls over an existing application.

98
00:05:10,000 --> 00:05:12,560
Speaker 1: Right, was Damian with Octopus Deploy back then?

99
00:05:12,759 --> 00:05:15,360
Speaker 2: I think he might have been, actually. Yeah, of course

100
00:05:15,399 --> 00:05:18,279
he's a cloud advocate these days. And so our friend

101
00:05:18,360 --> 00:05:21,519
Hilton has now commented on that show. Yeah, and

102
00:05:21,560 --> 00:05:23,920
he said, I was listening to the Brownfield DevOps show

103
00:05:23,920 --> 00:05:27,439
and realized some things. One, a green field is actually

104
00:05:27,439 --> 00:05:29,800
when it's lush and growing. Well, I'm thinking, say, like

105
00:05:29,839 --> 00:05:33,000
a cornfield or something, it's actually about to bear fruit.

106
00:05:33,160 --> 00:05:35,879
In contrast, a brownfield is one that's just the start

107
00:05:35,920 --> 00:05:38,639
of the process. It's sort of a File, New Project. Right,

108
00:05:39,319 --> 00:05:41,920
So he's totally flipping the concept on its head. Right,

109
00:05:42,040 --> 00:05:45,720
he's not wrong, Yeah, you know. And then two if

110
00:05:45,720 --> 00:05:48,199
one thinks of financial services or wealth management firms,

111
00:05:48,199 --> 00:05:50,920
the idea of quote unquote building a legacy is a

112
00:05:50,959 --> 00:05:54,279
tremendous thing to be proud of. It's something valuable left behind.

113
00:05:54,279 --> 00:05:56,639
Whereas where software development is concerned, it's all something old,

114
00:05:56,720 --> 00:05:59,439
rotten and tarnished. It's definitely something you left behind because

115
00:05:59,480 --> 00:06:00,439
you changed jobs.

116
00:06:01,680 --> 00:06:05,240
Speaker 1: Yeah, so I'm thinking of ways that brown field could

117
00:06:05,279 --> 00:06:08,879
be a bad thing. You know, I'm not so sure

118
00:06:08,920 --> 00:06:11,040
I would want to walk through a green field, but not

119
00:06:11,079 --> 00:06:13,720
Speaker 2: Through a brown field, because, yeah, you get your shoes all muddy. Yeah,

120
00:06:13,800 --> 00:06:15,199
it's not a good thing. It's not a good thing.

121
00:06:15,360 --> 00:06:17,360
So I mean it is possible. As an industry, we've

122
00:06:17,360 --> 00:06:19,959
actually got these terms flipped around, and you should start

123
00:06:20,000 --> 00:06:22,639
fresh with brownfield projects and build a legacy system to

124
00:06:22,680 --> 00:06:23,160
be proud of.

125
00:06:23,279 --> 00:06:24,199
Speaker 1: It's really smart.

126
00:06:24,319 --> 00:06:27,040
Speaker 2: Oh, Hilton, you're so clever. But you know, the...

127
00:06:27,040 --> 00:06:29,279
Speaker 1: Planting grass, it's got to be brown before.

128
00:06:29,000 --> 00:06:31,519
Speaker 2: It becomes green, and then the birds are gonna eat

129
00:06:31,519 --> 00:06:32,920
the seeds. You're gonna have to plant it again. Ask

130
00:06:32,959 --> 00:06:36,160
me how I know, Hilton. Thank you so much for

131
00:06:36,199 --> 00:06:37,800
your comment, and a copy of Music to Code By is on

132
00:06:37,839 --> 00:06:39,079
its way to you. And if you'd like a copy

133
00:06:39,120 --> 00:06:40,839
of Music to Code By, write a comment on the website

134
00:06:40,839 --> 00:06:43,000
at dotnetrocks dot com, or on the Facebooks. We

135
00:06:43,040 --> 00:06:44,519
publish every show there, and if you comment there and

136
00:06:44,560 --> 00:06:46,079
it gets read on the show, we'll send you a copy

137
00:06:46,120 --> 00:06:46,720
of Music to Code By.

138
00:06:46,879 --> 00:06:49,600
Speaker 1: Music to Code By. The FLAC version is now the

139
00:06:49,639 --> 00:06:51,920
most popular downloaded version.

140
00:06:52,040 --> 00:06:55,000
Speaker 2: Yeah, because FLAC plays on everything, I presume? It does. Well...

141
00:06:55,079 --> 00:06:57,040
Speaker 1: But what most people do is they download them in

142
00:06:57,120 --> 00:06:59,639
FLAC and then they convert them to WAV, right, and

143
00:06:59,759 --> 00:07:02,040
they you know, you get the same quality, but it's

144
00:07:02,040 --> 00:07:05,439
half the size. It's like a zipped WAV. Oh, so

145
00:07:05,560 --> 00:07:06,519
the FLAC is smaller.

146
00:07:06,560 --> 00:07:07,240
Speaker 2: The WAV is big.

147
00:07:07,439 --> 00:07:10,720
Speaker 1: But yeah, yeah, FLAC is half the size, but it's lossless.

148
00:07:10,800 --> 00:07:14,040
Good. Anyway, Music to Code By dot net. There's twenty

149
00:07:14,040 --> 00:07:16,800
two tracks. Go get them there. They'll help you focus

150
00:07:16,839 --> 00:07:20,959
and they'll help you be a better programmer. Okay, let's

151
00:07:21,000 --> 00:07:25,319
bring back Nicole Forsgren. She's a DevOps and developer productivity

152
00:07:25,399 --> 00:07:30,720
expert who thrives on helping large organizations reshape their culture, processes,

153
00:07:30,800 --> 00:07:35,040
and technology to improve their business and enhance the developer experience.

154
00:07:35,639 --> 00:07:38,959
She combines AI tools with developer expertise to help teams

155
00:07:39,040 --> 00:07:43,720
unlock productivity and innovation. And I know this isn't one

156
00:07:43,720 --> 00:07:46,120
of those bios where you felt you had to sneak

157
00:07:46,160 --> 00:07:48,560
AI in there just to be relevant.

158
00:07:48,240 --> 00:07:51,079
Speaker 3: Because we know you really do that, I actually do.

159
00:07:51,399 --> 00:07:52,720
Speaker 1: You're an AI person.

160
00:07:52,879 --> 00:07:56,360
Speaker 2: Oh my goodness. Well, you were on dot net rocks

161
00:07:56,360 --> 00:07:58,839
back in twenty seventeen with our friend Jez Humble, back in

162
00:07:58,879 --> 00:08:01,600
the day, the Dora days, and a regular on RunAs

163
00:08:01,639 --> 00:08:05,000
Radio, you know, half a dozen times over the years, starting

164
00:08:05,000 --> 00:08:08,319
with that in-person interview at the Chef conference. I

165
00:08:08,360 --> 00:08:10,079
think it was. I think so it was a long

166
00:08:10,120 --> 00:08:12,639
time ago. Good to see you again, Good to see

167
00:08:12,680 --> 00:08:15,959
you. Having a good Build? I mean, love it.

168
00:08:16,120 --> 00:08:20,519
Speaker 1: Yeah, it's historic in my mind, this period right here.

169
00:08:20,800 --> 00:08:25,959
Speaker 2: Yeah, really crazy emerging technologies. But you've moved on from Dora.

170
00:08:26,160 --> 00:08:27,560
It got acquired by Google, didn't it?

171
00:08:27,560 --> 00:08:28,480
Speaker 3: We were acquired by Google.

172
00:08:28,839 --> 00:08:30,319
Speaker 2: Congratulations, thank you?

173
00:08:30,519 --> 00:08:33,159
Speaker 3: It was it was great. Yeah, the team that I

174
00:08:33,159 --> 00:08:36,440
got to work with there was wonderful. And the Dora

175
00:08:36,519 --> 00:08:37,360
team continues now.

176
00:08:37,600 --> 00:08:40,840
Speaker 2: Sure they still do the report every year. I read it. Yeah,

177
00:08:40,879 --> 00:08:41,720
and then ever.

178
00:08:41,720 --> 00:08:43,879
Speaker 1: Can you tell me just what is Dora and what

179
00:08:43,960 --> 00:08:45,080
was it? Or what is it?

180
00:08:45,200 --> 00:08:47,440
Speaker 2: Yeah? What's the shorthand for Dora? Yeah?

181
00:08:47,440 --> 00:08:50,720
Speaker 3: Absolutely. So, Dora was a multi-year, now at this point

182
00:08:50,799 --> 00:08:54,480
over a decade research project that looked into, you know,

183
00:08:54,559 --> 00:08:57,200
kind of investigating the types of things that can improve

184
00:08:57,799 --> 00:09:01,679
software delivery performance and organizational impact and output. So things

185
00:09:01,720 --> 00:09:05,440
that... Now, everyone, jump in the Wayback Machine. Imagine being in

186
00:09:05,440 --> 00:09:10,159
the late two thousands, early twenty tens, when we weren't sure.

187
00:09:10,320 --> 00:09:13,960
We were hearing stories that things like CI/CD were important,

188
00:09:14,279 --> 00:09:17,480
that things like continuous integration were important, you know, in

189
00:09:17,480 --> 00:09:24,799
particular automated testing cloud. So you know, discussions would come

190
00:09:24,840 --> 00:09:28,120
up with leaders in organizations or managers and they'd say, oh, well,

191
00:09:28,159 --> 00:09:30,399
you know that won't work here, or well that doesn't

192
00:09:30,440 --> 00:09:34,600
work like this, or we're a different organization, or even well,

193
00:09:34,639 --> 00:09:37,039
what do you mean by CI? What does CI even mean?

194
00:09:37,039 --> 00:09:39,159
Because you know at that point also everyone's just redefining

195
00:09:39,200 --> 00:09:43,759
it constantly, and so the research program kind of broke

196
00:09:43,799 --> 00:09:48,320
all of these pieces down and tested statistically across the

197
00:09:48,360 --> 00:09:51,679
whole industry. I mean, we had thousands, tens of thousands

198
00:09:51,720 --> 00:09:54,120
of responses, and so we had data points that

199
00:09:54,159 --> 00:09:56,559
we could really test on. So DORA was

200
00:09:56,720 --> 00:10:00,960
originally short for DevOps Research and Assessment and it was

201
00:10:00,960 --> 00:10:03,480
the State of DevOps reports. It originally started with the

202
00:10:03,480 --> 00:10:07,039
Puppet team and then broke out and did it through

203
00:10:07,080 --> 00:10:09,639
DORA. Well, and that led to the book Accelerate. So

204
00:10:09,679 --> 00:10:12,360
some folks know Accelerate. Accelerate is kind of a, yeah,

205
00:10:12,559 --> 00:10:15,879
synopsis and summarization of the first several years of work.

206
00:10:16,039 --> 00:10:17,919
Speaker 2: I always felt that you were the person who brought

207
00:10:17,919 --> 00:10:21,279
the real rigor to analyzing the results of these product

208
00:10:21,320 --> 00:10:24,240
teams to show that when they followed these practices, the

209
00:10:24,279 --> 00:10:29,360
results were substantial. They were orders of magnitude productivity increases.

210
00:10:29,440 --> 00:10:33,679
Like it seems so anecdotal until you started writing those reports,

211
00:10:33,679 --> 00:10:36,000
where now it's not anecdotal, it's completely scientific, like, here

212
00:10:36,000 --> 00:10:38,720
are the numbers, here's what you do, and you too

213
00:10:38,840 --> 00:10:41,279
can have this experience, your teams can perform like that.

214
00:10:42,279 --> 00:10:42,480
You know.

215
00:10:42,519 --> 00:10:44,799
Speaker 3: The other thing that I think was great, you know,

216
00:10:45,120 --> 00:10:47,879
I really value the partnerships that we had with you know,

217
00:10:49,039 --> 00:10:52,679
Jez Humble, Gene Kim, Puppet, Alanna Brown, Nigel Kersten.

218
00:10:53,519 --> 00:10:57,240
I was this cute little baby faculty member and I

219
00:10:57,279 --> 00:10:58,879
walked in and I'm like, I would like to help

220
00:10:58,879 --> 00:11:00,480
you with some of your data. Oh, by the way,

221
00:11:01,320 --> 00:11:04,240
let's also really rethink the research design here so we

222
00:11:04,279 --> 00:11:08,080
can test some of these relationships out. And so, you know,

223
00:11:08,399 --> 00:11:10,639
to their credit, they were like, yeah, let's go for it,

224
00:11:10,679 --> 00:11:13,320
and so everyone kind of jumped in, and you know,

225
00:11:13,559 --> 00:11:17,679
Puppet had been getting some really interesting results before, and

226
00:11:17,919 --> 00:11:20,320
Alanna Brown was the first person to really see that

227
00:11:20,919 --> 00:11:23,159
this was kind of an emerging field and deserved some

228
00:11:23,279 --> 00:11:26,000
kind of study. But by having this kind of research design,

229
00:11:26,080 --> 00:11:29,399
we were able to also do things like what pieces

230
00:11:29,399 --> 00:11:34,120
of continuous integration are important for performance and predictive of performance. Right,

231
00:11:34,159 --> 00:11:37,519
So for example, it's three things. That's when you check

232
00:11:37,519 --> 00:11:41,120
in code, it automatically kicks off a build, it

233
00:13:41,159 --> 00:13:44,240
automatically kicks off tests, and you have

234
00:11:44,279 --> 00:11:45,120
to keep the build green.

235
00:11:45,440 --> 00:11:45,600
Speaker 2: Right.

236
00:11:45,679 --> 00:11:48,919
Speaker 3: And for some people that was you know, like they'd

237
00:11:48,919 --> 00:11:51,840
have two of the three, because, you know... It's the same.

238
00:11:51,879 --> 00:11:53,279
I think it's the same thing with AI right now. Right,

239
00:11:53,320 --> 00:11:55,159
it's not all about the tech. Yes, it's about the tech,

240
00:11:55,399 --> 00:11:59,480
and the culture and what you prioritize and the decisions

241
00:11:59,480 --> 00:12:02,799
that the teams make, and the processes around that. So the

242
00:12:02,879 --> 00:12:05,240
nice thing is we had kind of the statistical significance,

243
00:12:05,279 --> 00:12:09,919
but also the research design helped provide direction regardless of

244
00:12:09,960 --> 00:12:11,960
whatever technology stack you were on.

245
00:12:12,559 --> 00:12:14,679
Speaker 2: Yeah, that was the thing that was so cool for

246
00:12:14,720 --> 00:12:17,159
me reading all this. It's like it doesn't matter, right,

247
00:12:17,320 --> 00:12:20,960
like you if you follow these practices and grow this

248
00:12:21,120 --> 00:12:23,679
culture mindset, it doesn't matter what tools you use. You

249
00:12:23,720 --> 00:12:26,320
can get these kinds of results. You know, there's obviously

250
00:12:26,320 --> 00:12:30,600
things you ultimately need to build, but there's no secret sauce

251
00:12:30,720 --> 00:12:31,919
other than making the effort.

252
00:12:32,080 --> 00:12:33,919
Speaker 1: Yeah, and now you don't even really need to make

253
00:12:33,960 --> 00:12:36,200
the effort. I mean, so many of these processes are

254
00:12:36,200 --> 00:12:38,679
built in, you know, GitHub Actions and things like that.

255
00:12:38,759 --> 00:12:40,000
I mean, you have to set them up and turn

256
00:12:40,039 --> 00:12:42,720
them on. But it's trivial compared to what it was.

257
00:12:43,080 --> 00:12:45,320
Speaker 3: Yeah, the world, the world's different. But I would also

258
00:12:45,399 --> 00:12:48,639
say that, you know, a bunch of this is table stakes.

259
00:12:48,639 --> 00:12:51,960
Now it's super important, and if anything, it's more important

260
00:12:51,960 --> 00:12:55,159
now that AI's here. Yeah, because now people are just

261
00:12:55,279 --> 00:12:59,000
making code and innovating, experimenting super super fast, and you

262
00:12:59,039 --> 00:13:03,000
know, it's just frustration and sadness in using computers and

263
00:13:03,039 --> 00:13:06,279
anger if you can do all of this incredible coding

264
00:13:06,320 --> 00:13:11,240
and prototyping and then you're stuck behind this giant wall

265
00:13:11,480 --> 00:13:15,559
of waiting for build and integration and slow feedback and

266
00:13:15,720 --> 00:13:17,720
flaky tests and nothing else works.

267
00:13:17,879 --> 00:13:21,799
Speaker 2: Yeah, yeah, waiting for IT to do the deploy. And

268
00:13:23,159 --> 00:13:27,679
remember build servers back in the nineties, build servers yeah yeah,

269
00:13:27,720 --> 00:13:29,000
well and the build.

270
00:13:28,799 --> 00:13:31,120
Speaker 3: Master yeah oh yeah, what's old is new again?

271
00:13:31,240 --> 00:13:34,080
Speaker 1: Yeah, and one person that was the only person allowed

272
00:13:34,080 --> 00:13:35,879
to do the build. It's like build, we did that

273
00:13:36,000 --> 00:13:38,960
on Fridays. But why we would do that on Fridays

274
00:13:38,960 --> 00:13:40,840
before going home for the weekend, I don't know.

275
00:13:40,919 --> 00:13:42,960
Speaker 3: Because sometimes it took forty eight hours to run.

276
00:13:43,120 --> 00:13:46,200
Speaker 2: Yeah, that's true. Yeah, no question about it.

277
00:13:46,240 --> 00:13:50,039
So now, a new book coming. Yes,

278
00:13:50,159 --> 00:13:52,399
And I saw on LinkedIn you had a vote for

279
00:13:52,480 --> 00:13:56,600
the title. Yeah, and I think Frictionless is such a

280
00:13:56,639 --> 00:13:58,960
good title, Like it just gives me chills.

281
00:13:59,279 --> 00:14:01,480
Speaker 3: That's the one we landed on. So shout out

282
00:14:01,519 --> 00:14:02,639
to LinkedIn folks. Thank you.

283
00:14:03,600 --> 00:14:05,240
Speaker 2: It's a good name. What are you talking about?

284
00:14:05,360 --> 00:14:09,159
Speaker 3: So the book is about developer experience, right, and how

285
00:14:09,159 --> 00:14:12,240
it's important, why it's important, how you can make the

286
00:14:12,240 --> 00:14:15,840
business case, but especially, and I think most importantly, how

287
00:14:15,879 --> 00:14:18,360
you can improve developer experience yourself.

288
00:14:19,000 --> 00:14:21,919
Speaker 2: Whether, you know, as a developer or as the leader?

289
00:14:22,559 --> 00:14:27,480
Speaker 3: Yes, right now, it's probably going to be easiest or

290
00:14:27,679 --> 00:14:30,799
most straightforward for someone who is the head of a

291
00:14:30,840 --> 00:14:33,639
developer productivity team or an infrastructure team or a DevEx

292
00:14:33,679 --> 00:14:37,120
team or you know, pick the word that your company loves.

293
00:14:38,000 --> 00:14:41,000
But there are still some really cool, good tools and

294
00:14:41,039 --> 00:14:44,399
tips and tricks that anyone could use. You know, it's

295
00:14:44,639 --> 00:14:47,000
almost like the old DevOps days, right, Sometimes you had

296
00:14:47,000 --> 00:14:50,720
exec support, sometimes it was uh grassroots.

297
00:14:50,759 --> 00:14:54,639
Speaker 2: Yeah, just you know, I remember we told the story

298
00:14:54,679 --> 00:14:57,320
of this one person walking around asking questions, just

299
00:14:57,360 --> 00:15:01,159
sort of surfacing areas of friction, yep. And

300
00:15:01,279 --> 00:15:03,679
as the conversations start, it's like, hey, you know I

301
00:15:03,679 --> 00:15:06,279
could build you a tool that would make that piece easier,

302
00:15:06,399 --> 00:15:08,799
or, you know, I could deal with this piece.

303
00:15:08,919 --> 00:15:10,840
You know, the folks in ops were able to do

304
00:15:10,879 --> 00:15:12,799
certain things that were tough for dev, and there were

305
00:15:12,799 --> 00:15:15,039
things that dev could build that made life easier for ops.

306
00:15:15,039 --> 00:15:17,720
And as soon as those conversations started bearing fruit, things started

307
00:15:17,720 --> 00:15:18,279
moving faster.

308
00:15:18,600 --> 00:15:18,919
Speaker 1: Yeah.

309
00:15:18,960 --> 00:15:20,960
Speaker 3: And you know, so I'm writing the book with Abi Noda,

310
00:15:21,080 --> 00:15:24,240
who is, you know, founder and CEO of DX,

311
00:15:24,799 --> 00:15:27,399
and he's also... You know, we're both, you know,

312
00:15:27,440 --> 00:15:31,080
having all of these conversations with technology leaders and developers

313
00:15:31,360 --> 00:15:36,559
and, you know, finding the best

314
00:15:36,559 --> 00:15:39,080
practices that we have right now around developer experience, what

315
00:15:39,120 --> 00:15:41,519
it means in traditional systems, how to improve it, what

316
00:15:41,600 --> 00:15:44,240
it means for AI right now. Like I said, if anything,

317
00:15:44,279 --> 00:15:45,759
it's even more important.

318
00:15:46,000 --> 00:15:48,159
Speaker 2: Yeah, yeah, I'm just thinking about how because again you

319
00:15:48,159 --> 00:15:51,039
and I've had this conversation many times. How different is

320
00:15:51,159 --> 00:15:54,480
DevOps today? Admittedly, as Carl brought up, the tooling

321
00:15:54,519 --> 00:15:58,799
is better, but the asks are different now too. Everything's

322
00:15:58,840 --> 00:16:02,679
going to the cloud. People expect you to iterate very quickly.

323
00:16:02,679 --> 00:16:05,000
There are almost no version numbers per se anymore. You

324
00:16:05,080 --> 00:16:06,159
just supposed to scale.

325
00:16:06,720 --> 00:16:10,679
Speaker 3: The scale and the speed and the size are really no,

326
00:16:11,240 --> 00:16:12,679
they're almost On the one hand, I want to say

327
00:16:12,679 --> 00:16:14,399
they're changing things, but really they're not. I think they're

328
00:16:14,440 --> 00:16:17,960
amplifying things. Right, The fundamentals and the principles are still there.

329
00:16:18,039 --> 00:16:21,600
And to be fair, I'll say that every you know,

330
00:16:21,200 --> 00:16:23,279
I used to say every year or two. At this point,

331
00:16:23,279 --> 00:16:24,759
every three or six months or so, I kind of

332
00:16:24,759 --> 00:16:28,320
sit back and I say, has anything fundamentally changed? Is

333
00:16:28,360 --> 00:16:31,600
something that used to be important no longer important? And

334
00:16:31,679 --> 00:16:33,840
so far the answer is no. Right, we still need

335
00:16:33,879 --> 00:16:38,200
to have fast builds, we still need to have good tests,

336
00:16:39,000 --> 00:16:41,279
we still need to have good disaster recovery. We still

337
00:16:41,279 --> 00:16:44,399
want to, you know, think about security. How that's implemented

338
00:16:44,480 --> 00:16:47,919
might look a little bit different. And again, right, it's

339
00:16:47,919 --> 00:16:49,840
at scale, and now we have AI not only are

340
00:16:49,840 --> 00:16:54,519
things happening faster, they're happening larger, right? Like our diffs

341
00:16:54,559 --> 00:16:56,720
are much much bigger. So how do we think about

342
00:16:56,759 --> 00:16:59,159
approaching that? So I think, you know, like I said,

343
00:16:59,159 --> 00:17:01,120
on the one hand, it hasn't changed at all because

344
00:17:01,120 --> 00:17:03,000
the fundamentals are there. And on the other hand, it's

345
00:17:03,120 --> 00:17:05,440
very much changed because we're kind of dealing with different

346
00:17:05,440 --> 00:17:09,799
technologies and new technologies and again different scale and size

347
00:17:09,799 --> 00:17:13,599
and reliability and security. I mean, yeah, the world has changed.

348
00:17:13,839 --> 00:17:16,359
Speaker 2: Yeah, yeah, without a doubt. And then yeah, I just

349
00:17:16,359 --> 00:17:19,839
think about you talked about the repository of prompts, Carl,

350
00:17:19,880 --> 00:17:22,759
and oh, yeah, like the prompts are becoming code.

351
00:17:22,960 --> 00:17:25,680
Speaker 1: Yeah, that's right. They're becoming sort of the lingua franca

352
00:17:25,880 --> 00:17:28,799
of you know, instead of I can imagine like a

353
00:17:28,799 --> 00:17:34,559
Substack where you're, you know, publishing prompts, yeah, or publishing...

354
00:17:34,880 --> 00:17:37,559
Speaker 3: So a colleague of mine at MSR, Ben Zorn, has

355
00:17:37,599 --> 00:17:40,400
been working on a project around you know, how prompts

356
00:17:40,400 --> 00:17:43,279
are code and you need to be thinking about saving

357
00:17:43,319 --> 00:17:45,880
them and versioning them because also as our models change,

358
00:17:46,279 --> 00:17:49,680
different prompts will be differently effective, right.

359
00:17:49,680 --> 00:17:53,519
Speaker 2: Yeah, yeah, they won't be. It's actually, from a historical perspective,

360
00:17:53,599 --> 00:17:55,920
interesting if they take this prompt and just run it

361
00:17:55,960 --> 00:17:58,240
on all these different models and see how they're evolving too.

362
00:17:58,640 --> 00:18:00,319
People keep telling me the new models are better, and

363
00:18:00,359 --> 00:18:02,839
I keep asking them, how do you know? Well?

364
00:18:02,880 --> 00:18:06,279
Speaker 3: And also define better? Yeah, it's like define quality exactly,

365
00:18:06,559 --> 00:18:08,599
so many different ways to think about it and define it.

366
00:18:08,559 --> 00:18:12,640
Speaker 1: And everything's getting a little meta. You know, it probably

367
00:18:12,680 --> 00:18:15,279
won't be long. They're probably doing it now, you know, people

368
00:18:15,359 --> 00:18:19,240
asking their AI to help them with their prompts for

369
00:18:19,519 --> 00:18:20,559
other AI.

370
00:18:20,960 --> 00:18:23,400
Speaker 3: Oh, it's happening, Yeah, it's happening.

371
00:18:23,480 --> 00:18:26,559
Speaker 1: Yeah, Like, what would the best prompt be for you

372
00:18:26,640 --> 00:18:27,880
to have this outcome?

373
00:18:27,920 --> 00:18:31,359
Speaker 3: And then, well, especially in agentic, right, because sometimes a

374
00:18:31,480 --> 00:18:34,920
role is to execute something, but a role can also

375
00:18:34,960 --> 00:18:37,799
be to write and craft and tune the prompt right.

376
00:18:38,680 --> 00:18:41,160
Speaker 1: Yeah, So why didn't you give me that in the

377
00:18:41,160 --> 00:18:42,400
first place? You didn't ask?

378
00:18:42,480 --> 00:18:47,559
Speaker 2: Yeah, yeah, yeah, we're talking about this like rapid iteration.

379
00:18:48,759 --> 00:18:50,440
It's going to be the iterating of prompts as much

380
00:18:50,440 --> 00:18:51,839
as it's going to be the iterating of any given

381
00:18:51,880 --> 00:18:55,920
piece of code. Yeah, and how that stuff ultimately gets generated.

382
00:18:56,119 --> 00:18:58,720
Speaker 3: Well, and then I think also not just iterating on

383
00:18:58,759 --> 00:19:04,240
the prompts, but understanding and tracking the quality of prompts. Right,

384
00:19:04,279 --> 00:19:08,720
what does testing look like across different prompts, across different models?

385
00:19:08,720 --> 00:19:13,559
Speaker 2: Sure? Yeah, well, does software behave the way the customers expect? Yep.

386
00:19:13,720 --> 00:19:16,119
That is a very broad principle. There's just a lot

387
00:19:16,160 --> 00:19:18,480
of ways to get to that point, to come

388
00:19:18,640 --> 00:19:21,720
close to a yes with some degree of certainty,

389
00:19:21,920 --> 00:19:27,119
just like my software is secure to some degree of certainty.

390
00:19:27,519 --> 00:19:32,519
Is this UX coherent? I mean, we usually don't

391
00:19:32,759 --> 00:19:37,240
directly articulate those sentiments. We work towards them, but now

392
00:19:37,279 --> 00:19:39,559
with prompt behavior, I think we tend to say them

393
00:19:39,599 --> 00:19:41,920
out loud because we have to feed them to

394
00:19:41,960 --> 00:19:45,000
the machine. Yeah, to get it to start moving down

395
00:19:45,000 --> 00:19:45,480
that path.

396
00:19:45,640 --> 00:19:49,440
Speaker 1: I had my first experience today with agent mode in

397
00:19:49,519 --> 00:19:50,400
Visual Studio.

398
00:19:50,480 --> 00:19:52,720
Speaker 2: Yeah, you've been busy, Not really.

399
00:19:52,839 --> 00:19:54,599
Speaker 1: I just took it for a spin and I asked

400
00:19:54,599 --> 00:19:56,920
it to do things that I knew how to do, and

401
00:19:56,960 --> 00:19:59,759
it did them, except it got hung up on one

402
00:20:00,599 --> 00:20:04,920
little thing. Okay. So here's the thing. Blazor has these

403
00:20:05,079 --> 00:20:08,000
edit forms, and there's an InputText that subclasses the

404
00:20:08,119 --> 00:20:12,720
HTML input so it can support validation rules and things

405
00:20:12,799 --> 00:20:16,799
like that. And I asked it to make a form

406
00:20:16,920 --> 00:20:18,960
and it did. And then I said, Okay, I want

407
00:20:19,000 --> 00:20:21,880
you to do the validation on every keystroke. And when

408
00:20:21,920 --> 00:20:24,079
you do that with an input tag, you use the

409
00:20:24,119 --> 00:20:28,960
oninput event, right, to handle the binding, the bind event.

410
00:20:29,759 --> 00:20:31,559
But you can't do that with an InputText. It

411
00:20:31,599 --> 00:20:35,039
doesn't support that out of the box. But the agent

412
00:20:35,119 --> 00:20:38,519
didn't know that, and so it tried twice and I said, no,

413
00:20:38,640 --> 00:20:39,240
that didn't work.

414
00:20:39,480 --> 00:20:42,160
Speaker 2: You're going to have to subclass that control. And it

415
00:20:42,240 --> 00:20:43,160
did well.

416
00:20:43,319 --> 00:20:46,759
Speaker 1: I was the one that knew what it should do,

417
00:20:47,200 --> 00:20:49,359
and it actually did it in a way that's different

418
00:20:49,400 --> 00:20:53,200
than I'd done it. So I actually learned something and

419
00:20:53,279 --> 00:20:54,759
it worked, and it worked.
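The subclassing fix described here follows a documented Blazor pattern: derive a component from InputText and rebind it to the oninput event. A minimal sketch, assuming a component named CustomInputText (the name is illustrative):

```razor
@* CustomInputText.razor: derives from Blazor's built-in InputText
   and switches the binding event from the default onchange to
   oninput, so validation runs on every keystroke. *@
@inherits InputText

<input @attributes="AdditionalAttributes"
       class="@CssClass"
       @bind="CurrentValueAsString"
       @bind:event="oninput" />
```

Inside an EditForm it would then be used like `<CustomInputText @bind-Value="model.Name" />`, and validation messages update as you type.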

420
00:20:54,839 --> 00:20:56,559
Speaker 2: But you still had to give it the hints. You

421
00:20:56,559 --> 00:20:58,279
still had to tell it. That just speaks to the role

422
00:20:58,319 --> 00:21:01,400
and expertise in all of this though, absolutely, yeah, yeah,

423
00:21:01,440 --> 00:21:03,920
that's why I told the story. Yeah, no, I appreciate that.

424
00:21:03,960 --> 00:21:07,079
And at the same time, it's like, it's not enough

425
00:21:07,119 --> 00:21:10,319
to write the prompt, it's also to evaluate the results.

426
00:21:10,599 --> 00:21:12,240
Speaker 3: Absolutely, you know, is.

427
00:21:12,240 --> 00:21:14,920
Speaker 2: This good enough? Should it be better? Like you'd almost

428
00:21:14,920 --> 00:21:17,720
have a default that says you should improve that, just

429
00:21:17,759 --> 00:21:19,359
to see if it does well.

430
00:21:19,400 --> 00:21:22,039
Speaker 3: And you know, it's interesting because now when you write code,

431
00:21:22,559 --> 00:21:26,759
you generate code and review the code as you go,

432
00:21:26,960 --> 00:21:31,640
So we're sort of introducing another early layer of review,

433
00:21:32,880 --> 00:21:33,319
code review.

434
00:21:34,759 --> 00:21:35,839
Speaker 2: Yeah.

435
00:21:35,920 --> 00:21:38,720
Speaker 1: Yeah, especially when you tell your agent to

436
00:21:39,599 --> 00:21:43,640
handle this issue on your GitHub repo, for example, and

437
00:21:43,720 --> 00:21:47,519
it creates a pull request and documents it.

438
00:21:47,039 --> 00:21:49,480
Speaker 2: Yeah, better than you. Better than you would.

439
00:21:49,200 --> 00:21:51,240
Speaker 3: Have, that's right, especially documentation.

440
00:21:52,640 --> 00:21:55,319
Speaker 1: But you know you're doing a code review of the

441
00:21:55,400 --> 00:21:59,559
agent and you have to figure it out. You're the

442
00:21:59,599 --> 00:22:01,400
one that approves it ultimately.

443
00:22:01,640 --> 00:22:03,799
Speaker 3: Yeah, it's going to give us some really good questions

444
00:22:03,799 --> 00:22:09,039
to think about because as we are evaluating, you know, reviewing,

445
00:22:09,160 --> 00:22:14,000
validating code and pull requests that are entirely generated from

446
00:22:14,000 --> 00:22:18,160
someone else, something else, right, we need to have a

447
00:22:18,200 --> 00:22:21,559
good mental model of the code base and the architecture

448
00:22:21,599 --> 00:22:23,519
in our head to understand how it fits in,

449
00:22:23,640 --> 00:22:26,880
how it interacts with everything else. And moving forward,

450
00:22:26,920 --> 00:22:28,720
I think that's going to be a really interesting question

451
00:22:28,759 --> 00:22:32,319
and challenge because now none of my code is operating

452
00:22:32,319 --> 00:22:34,759
in production right now, but it used to, and I know

453
00:22:34,839 --> 00:22:37,200
that that's how I built up my mental model, and

454
00:22:37,240 --> 00:22:40,400
I kind of got a feel of the system from doing

455
00:22:40,440 --> 00:22:43,480
that coding and writing that code, and you know, running

456
00:22:43,559 --> 00:22:46,720
these tests, and it'll be really interesting to see how

457
00:22:46,720 --> 00:22:50,839
we can help support the building of mental models, the

458
00:22:50,960 --> 00:22:53,599
evolution of mental models of how the back end code

459
00:22:53,599 --> 00:22:57,200
base works because agents can and then agents will probably

460
00:22:57,240 --> 00:22:59,519
also you know, evolve to be able to handle more

461
00:22:59,519 --> 00:23:02,599
of that. But at the end of the day, you know,

462
00:23:02,720 --> 00:23:08,000
developers are still deciding and approving and orchestrating all of

463
00:23:08,039 --> 00:23:08,599
these systems.

464
00:23:08,680 --> 00:23:10,279
Speaker 2: Yeah. Yeah, absolutely so.

465
00:23:10,279 --> 00:23:11,519
Speaker 3: Sorry that was real meta.

466
00:23:11,640 --> 00:23:14,799
Speaker 1: Yeah, but we can also end that with "for now,"

467
00:23:16,079 --> 00:23:17,039
right for now.

468
00:23:17,279 --> 00:23:20,759
Speaker 3: Wow, someone tried telling me something was going to be

469
00:23:20,880 --> 00:23:23,920
just impossible, and I'm like, hmm, is it though? Things

470
00:23:23,920 --> 00:23:26,279
are moving way too fast right now to say anything

471
00:23:26,279 --> 00:23:27,279
definitively Yeah.

472
00:23:27,359 --> 00:23:29,200
Speaker 2: Yeah. And at the same time, like, I

473
00:23:29,200 --> 00:23:33,440
don't think the innovation is particularly wildly new. It's not exponential,

474
00:23:33,920 --> 00:23:36,079
but we're very much in an engineering cycle where we're

475
00:23:36,079 --> 00:23:38,720
finding ways to slip these new tools in and starting

476
00:23:38,720 --> 00:23:41,000
to think about how they fit together and what they

477
00:23:41,039 --> 00:23:43,839
could do for us. And it does feel like we're

478
00:23:43,920 --> 00:23:48,039
really changing the development workflow, even though the principles are

479
00:23:48,039 --> 00:23:50,680
still the same. Yep. We're still trying to deliver value

480
00:23:50,759 --> 00:23:54,880
to customers and trying to iterate quickly, get those features

481
00:23:54,880 --> 00:23:57,680
out as soon as possible. Take feedback from their use

482
00:23:58,319 --> 00:24:01,000
and shape the next versions. I guess it's an

483
00:24:01,000 --> 00:24:02,240
area I haven't explored much yet.

484
00:24:02,319 --> 00:24:05,599
Speaker 3: Well, and version code. You know, before it was versioning

485
00:24:05,599 --> 00:24:08,039
code and understanding tests and you know, building out the

486
00:24:08,039 --> 00:24:11,559
build graph. Well, now when LLMs are at the heart

487
00:24:11,599 --> 00:24:13,559
of a lot of work, now we need to be

488
00:24:13,559 --> 00:24:17,119
thinking about data and data versioning and model and model versioning,

489
00:24:17,559 --> 00:24:18,680
and so you know.

490
00:24:18,720 --> 00:24:21,640
Speaker 2: How do you know they're getting better when you make

491
00:24:21,680 --> 00:24:22,359
the new version?

492
00:24:22,559 --> 00:24:26,200
Speaker 3: Yep, exactly, And then how would you roll back or

493
00:24:26,319 --> 00:24:29,039
shift to a model or a model version or to

494
00:24:28,519 --> 00:24:31,559
to a different data set, right? And so, I

495
00:24:32,039 --> 00:24:33,640
you know, that's another thing that I think comes to

496
00:24:33,680 --> 00:24:35,119
mind. Some folks are like, well, you know, we

497
00:24:35,160 --> 00:24:38,359
won't need developers anymore, there's no junior devs. And I'm like, no,

498
00:24:38,559 --> 00:24:42,839
we're creating an entirely new class of development. That's important.

499
00:24:42,839 --> 00:24:45,240
If we think about the earliest days of you know,

500
00:24:45,359 --> 00:24:48,880
before it was DevOps, we didn't really have infrastructure teams

501
00:24:48,920 --> 00:24:51,240
the way we do now, and we didn't

502
00:24:51,359 --> 00:24:54,720
understand, the whole industry didn't understand, how important it was

503
00:24:54,759 --> 00:24:57,839
from a leverage standpoint, just like right now, right for

504
00:24:58,359 --> 00:25:01,400
ML developers out there. You know, it's been kind of

505
00:25:01,400 --> 00:25:06,759
a challenge. And now that skill set is super important,

506
00:25:06,799 --> 00:25:09,960
and that skill set you know, plus plus plus plus plus.

507
00:25:10,079 --> 00:25:12,640
Speaker 2: Sure, well, and there isn't I don't feel like there's

508
00:25:12,680 --> 00:25:14,440
consistent patterns to making those things right now. It's

509
00:25:14,519 --> 00:25:15,799
very wild westy.

510
00:25:15,720 --> 00:25:17,480
Speaker 3: Oh, because we're all still figuring it out.

511
00:25:17,519 --> 00:25:21,200
Speaker 2: Yeah right, and so we don't. I really haven't seen

512
00:25:21,200 --> 00:25:23,400
a set of practices that make me feel like that's

513
00:25:23,400 --> 00:25:25,920
a way that's going to be successful. There are new tools

514
00:25:25,920 --> 00:25:29,599
every day, and they're you know, moving the goalposts all

515
00:25:29,599 --> 00:25:31,480
the time, so it's hard to try and come up

516
00:25:31,480 --> 00:25:32,960
with a set of practices that are really going to

517
00:25:33,000 --> 00:25:33,640
be consistent.

518
00:25:33,920 --> 00:25:35,680
Speaker 3: This is a fun time to be in tech.

519
00:25:35,839 --> 00:25:38,160
Speaker 2: Oh yeah, well, I'm kind of excited

520
00:25:38,160 --> 00:25:40,400
to imagine a junior developer just coming up now where

521
00:25:40,440 --> 00:25:43,160
you're always going to have a prompt like, how much

522
00:25:43,200 --> 00:25:46,200
of our time is now scraping off the cruft of

523
00:25:46,240 --> 00:25:48,720
our old coding practices because we have all these new

524
00:25:48,720 --> 00:25:50,880
possibilities in front of us, right well.

525
00:25:50,680 --> 00:25:53,519
Speaker 3: And how much time can you waste chasing down

526
00:25:54,279 --> 00:25:57,119
a problem that's just kind of silly and nonsense. I

527
00:25:57,240 --> 00:26:00,279
was. Right. Well, you know, TBD on this term. You know,

528
00:26:00,279 --> 00:26:02,440
some people love it, some hate it. I was vibe coding

529
00:26:02,480 --> 00:26:05,720
the other day and writing it up. I know, right? I

530
00:26:05,720 --> 00:26:11,000
don't know if you've heard this vibe coding thing or

531
00:26:11,039 --> 00:26:14,319
you know, I was just YOLOing it.

532
00:26:13,559 --> 00:26:14,960
Speaker 1: Is vibe coding a pejorative to you?

533
00:26:15,079 --> 00:26:17,279
Speaker 3: I think it is to some people. I joke when

534
00:26:17,319 --> 00:26:21,599
I chat with my team, I tell them I'm just YOLOing it, right? Yeah,

535
00:26:21,680 --> 00:26:23,839
I mean I still check the code sometimes. Every once

536
00:26:23,839 --> 00:26:25,599
in a while, I've done something just to see how

537
00:26:25,640 --> 00:26:27,119
far it can go. And I've been like, I've set

538
00:26:27,160 --> 00:26:30,440
up a condition where I'm not allowed to change the code.

539
00:26:30,920 --> 00:26:32,640
If I were someone who didn't understand any of this

540
00:26:32,680 --> 00:26:34,119
and I only had the prompt, what would I do?

541
00:26:34,680 --> 00:26:35,000
Speaker 2: Anyway?

542
00:26:35,079 --> 00:26:36,599
Speaker 3: I was, Yeah, I was working on this thing and

543
00:26:36,599 --> 00:26:39,240
it kept breaking, and I was like, okay, well,

544
00:26:39,880 --> 00:26:43,559
I'll, you know, relax that constraint. I'll go figure it out. Listen.

545
00:26:43,759 --> 00:26:46,839
I spent a couple hours like going through every forum

546
00:26:46,960 --> 00:26:50,079
and googling everything I could, and finally I was like,

547
00:26:50,440 --> 00:26:53,440
this is dumb. What does this error message mean?

548
00:26:53,640 --> 00:26:56,400
Please fix it for me? And I think probably because

549
00:26:56,400 --> 00:26:58,119
I've done this before, I will ask it first because

550
00:26:58,119 --> 00:27:02,079
it like loads into context what it needs to know.

551
00:27:02,119 --> 00:27:04,559
And then, listen, that was fixed in thirty seconds,

552
00:27:05,039 --> 00:27:08,720
and it was an obscure configuration error that

553
00:27:08,839 --> 00:27:11,400
like I wouldn't have found for another couple hours. And

554
00:27:11,400 --> 00:27:13,359
as a junior dev, those were the types of things

555
00:27:13,359 --> 00:27:16,720
that I hit, especially my first six months or a year.

556
00:27:16,799 --> 00:27:18,079
Speaker 2: Yeah, right, that were just showstoppers.

557
00:27:18,119 --> 00:27:22,680
Speaker 3: And to understand this, yeah, it would take hours because

558
00:27:22,680 --> 00:27:24,519
I didn't want to ask someone or I would and

559
00:27:24,559 --> 00:27:27,440
then like and then I'm interrupting someone. I think this

560
00:27:27,640 --> 00:27:30,440
is one of the great unlocks where junior developers are

561
00:27:30,440 --> 00:27:35,200
going to be so productive and super impactful, and honestly,

562
00:27:35,240 --> 00:27:38,799
I was a professor for years, you know, Richard, Sometimes

563
00:27:38,880 --> 00:27:41,279
junior folks have the best ideas because they're

564
00:27:41,279 --> 00:27:44,480
not constrained, they're not hardened by all of the broken.

565
00:27:44,279 --> 00:27:46,079
Speaker 2: Dev stuff that we've been doing.

566
00:27:46,640 --> 00:27:50,000
Speaker 3: They don't have the scars, and so now they can

567
00:27:50,039 --> 00:27:55,400
actually surface these ideas in like very visible productive ways.

568
00:27:55,599 --> 00:27:57,119
Speaker 2: This is what I'm thinking, is that juniors are going

569
00:27:57,200 --> 00:28:00,240
to come up whose reflex is to prompt again. Yeah, that

570
00:28:00,640 --> 00:28:05,000
reflex we don't have yet. And they think in those terms.

571
00:28:05,160 --> 00:28:07,400
How do I deal with this from a prompt perspective

572
00:28:07,480 --> 00:28:11,000
rather than our visceral reaction, which would be to get under the hood.

573
00:28:11,000 --> 00:28:12,640
Look at the code I've changed?

574
00:28:12,640 --> 00:28:15,400
Speaker 3: Well, both and both right, because sometimes we need to

575
00:28:15,440 --> 00:28:17,440
tell it. You have to break down a problem a

576
00:28:17,440 --> 00:28:17,920
couple of times.

577
00:28:17,920 --> 00:28:19,599
Speaker 2: So I think, just like you said, you need to

578
00:28:19,599 --> 00:28:20,240
subclass that.

579
00:28:20,759 --> 00:28:23,720
Speaker 1: Yeah, well yeah, I think in prompts now all the time.

580
00:28:24,240 --> 00:28:27,599
If I'm spending more than you know, thirty seconds scratching

581
00:28:27,599 --> 00:28:31,119
my head about something, it goes right into an agent

582
00:28:31,279 --> 00:28:34,359
or ChatGPT or something. Take a screenshot. What

583
00:28:34,400 --> 00:28:36,720
does this mean? If I don't know?

584
00:28:37,000 --> 00:28:37,480
Speaker 2: Yeah, you know?

585
00:28:38,000 --> 00:28:40,480
Speaker 1: Yeah, Well it's time to take a break. We'll be

586
00:28:40,559 --> 00:28:43,920
right back after these very very important messages don't go away.

587
00:28:46,240 --> 00:28:49,119
Did you know you can easily migrate ASP.NET web

588
00:28:49,160 --> 00:28:53,240
apps to Windows containers on AWS. Use the App to

589
00:28:53,359 --> 00:28:57,759
Container tool to containerize your IIS websites and deploy to

590
00:28:57,839 --> 00:29:03,319
AWS managed container services, with or without Kubernetes. Find

591
00:29:03,319 --> 00:29:06,960
out more about App to Container at AWS dot Amazon

592
00:29:07,039 --> 00:29:15,079
dot Com, slash, dot net, slash modernize, and we're back.

593
00:29:15,119 --> 00:29:17,640
It's dot net rocks. I'm Carl Franklin, that's Richard Campbell,

594
00:29:17,680 --> 00:29:22,039
and we are talking about AI with Nicole, and

595
00:29:22,079 --> 00:29:25,680
we're talking about DevOps and AI. We started on the

596
00:29:25,680 --> 00:29:28,720
DevOps thing, got sidetracked into AI. But how

597
00:29:28,720 --> 00:29:33,119
do AI tools make their way into the DevOps architecture and infrastructure?

598
00:29:33,400 --> 00:29:36,400
Speaker 3: I love this question right in part because I think

599
00:29:36,440 --> 00:29:38,400
it's something we're still discovering, like we were kind of,

600
00:29:38,599 --> 00:29:41,880
you know, chatting about, and also I think it's important

601
00:29:41,880 --> 00:29:45,200
because it's something that is not on everyone's radar.

602
00:29:45,480 --> 00:29:50,200
right now, everyone is super excited about Copilot and you know,

603
00:29:50,319 --> 00:29:52,119
all of these tools that you can use to write code.

604
00:29:52,359 --> 00:29:53,440
Speaker 2: Yeah, this is on.

605
00:29:53,480 --> 00:29:57,480
Speaker 3: The money. And you know, we all know this because

606
00:29:57,480 --> 00:29:59,960
we're developers, or, you know, were developers in my case,

607
00:30:01,920 --> 00:30:05,160
there's a lot more to software engineering and building great

608
00:30:05,240 --> 00:30:09,279
product and building great systems than just writing code. There's

609
00:30:10,319 --> 00:30:14,359
the review, there's PR, there's integration, there's release, there's so

610
00:30:14,480 --> 00:30:17,640
many things, and so I think there's just a world

611
00:30:17,680 --> 00:30:22,039
of possibility for AI and AI agents to be able

612
00:30:22,039 --> 00:30:24,559
to help out. Now that also comes with you know,

613
00:30:24,839 --> 00:30:27,880
but wait, right, what do we think about hallucinations or

614
00:30:27,960 --> 00:30:31,480
correctness or all of these other things, which also introduces sorry,

615
00:30:31,519 --> 00:30:33,240
I'm gonna harp on this because I have heard too

616
00:30:33,240 --> 00:30:36,680
many people say that the junior dev is dead. We've

617
00:30:36,880 --> 00:30:42,359
now created and introduced entirely new skill sets, right, how

618
00:30:42,400 --> 00:30:45,359
do we validate this? What does testing look like across

619
00:30:45,400 --> 00:30:49,400
all these different stages for non-deterministic systems?

620
00:30:49,359 --> 00:30:54,720
Speaker 1: You almost have to test a junior developer's creativity and their imagination, because,

621
00:30:55,599 --> 00:30:58,079
let's face it, that's what's holding the old guys back.

622
00:30:58,200 --> 00:30:59,839
Speaker 3: I was gonna say, I think juniors are some of

623
00:30:59,880 --> 00:31:03,119
the best equipped here because now seniors will have some

624
00:31:03,319 --> 00:31:07,920
great experience in architectural design limitations and trade offs involved,

625
00:31:09,160 --> 00:31:13,799
but also we're just so ingrained in the way things

626
00:31:13,880 --> 00:31:16,119
work and the way things look and how to build

627
00:31:16,119 --> 00:31:18,640
a thing and not okay, what if we flipped the

628
00:31:18,759 --> 00:31:20,759
entire thing on its side differently exactly?

629
00:31:20,799 --> 00:31:22,240
Speaker 2: Well, and then we're also at a time where nobody

630
00:31:22,279 --> 00:31:24,960
has a decade of experience with LLMs. Right, that's right,

631
00:31:25,000 --> 00:31:28,119
we're all juniors effectively. Yeah, this is a question of

632
00:31:28,119 --> 00:31:30,319
how much baggage did you bring to the table. But

633
00:31:31,000 --> 00:31:33,759
at the same time, you know, hopefully you've gone through

634
00:31:33,759 --> 00:31:35,680
a few iterations like this over the years, you have

635
00:31:35,759 --> 00:31:39,559
a bunch of sort of standards that you go by

636
00:31:39,599 --> 00:31:42,880
about quality and approaches to things and so forth that are

637
00:31:43,279 --> 00:31:48,160
beyond any given tool. Yeah, it helps me. I want

638
00:31:48,200 --> 00:31:50,480
to jump back onto the frictionless side of things because

639
00:31:50,920 --> 00:31:53,240
I think we've focused too much on the tools and

640
00:31:53,279 --> 00:31:54,440
not enough on the culture.

641
00:31:54,759 --> 00:31:58,000
Speaker 3: Absolutely right. So as we, you know, Abi and I

642
00:31:58,000 --> 00:32:02,599
have been going through the book, there's a not insignificant

643
00:32:02,839 --> 00:32:06,559
part of the book that isn't about tools at all, frankly, right,

644
00:32:07,240 --> 00:32:11,839
It's about understanding what DevEx is and why we need

645
00:32:11,880 --> 00:32:15,240
to remove friction and communicating that to others in the company.

646
00:32:15,279 --> 00:32:17,839
Whether that's, you know, making the business case so you can

647
00:32:17,920 --> 00:32:21,480
kick something off, or whether it's once you're starting to

648
00:32:21,559 --> 00:32:26,519
deploy a technical solution or something. Right that frankly needs

649
00:32:26,680 --> 00:32:29,400
a comms plan, because internally we need to communicate to

650
00:32:29,440 --> 00:32:32,480
several audiences. We need to identify stakeholders. We need to

651
00:32:32,519 --> 00:32:34,759
tell developers why it's valuable and why they should use it.

652
00:32:34,839 --> 00:32:37,200
We need to tell engineering managers why it's valuable and

653
00:32:37,279 --> 00:32:39,279
why they should care about it. We need to be

654
00:32:39,359 --> 00:32:44,759
telling executives and leaders across various disciplines, you know, our

655
00:32:44,880 --> 00:32:48,119
CIOs and our CTOs, but also HR and finance why

656
00:32:48,160 --> 00:32:50,519
this is important what they should care about and all

657
00:32:50,559 --> 00:32:52,640
of those messages are going to be different, and how

658
00:32:52,640 --> 00:32:55,400
do we remove blockers and barriers, and how do we

659
00:32:55,720 --> 00:33:00,880
reduce, when we can't remove, fear, because a lot is changing, right,

660
00:33:01,119 --> 00:33:03,319
and so I think, you know, the cultural and the

661
00:33:03,319 --> 00:33:07,440
people and the communication part is huge. And also process, right,

662
00:33:08,000 --> 00:33:08,759
we say.

663
00:33:08,559 --> 00:33:10,799
Speaker 2: That's the old people, process, tools.

664
00:33:10,640 --> 00:33:14,319
Speaker 3: Absolutely, you know, yeah, exactly, well, and we were chatting

665
00:33:14,359 --> 00:33:16,599
with some of our earliest you know, reviewers and interviewers,

666
00:33:16,640 --> 00:33:20,079
and you know, the point kept coming up that sometimes

667
00:33:20,160 --> 00:33:23,680
the best DevEx change you can make is to a process. Right,

668
00:33:23,799 --> 00:33:25,640
Sometimes you can just clean up a process and it has

669
00:33:26,759 --> 00:33:32,240
a huge significant impact across an entire organization. Tools weren't involved,

670
00:33:32,319 --> 00:33:34,519
or maybe like tools were minimally involved.

671
00:33:34,440 --> 00:33:36,119
Speaker 2: They weren't the deciding factor.

672
00:33:35,960 --> 00:33:36,799
Speaker 3: Yep, exactly.

673
00:33:37,200 --> 00:33:37,400
Speaker 2: Yeah.

674
00:33:37,440 --> 00:33:40,759
Speaker 3: And then we also go through, you know, how to

675
00:33:40,799 --> 00:33:44,599
collect early data, how to more rigorously collect data so

676
00:33:44,640 --> 00:33:48,640
that you can get good signal into what's happening to

677
00:33:48,759 --> 00:33:50,799
make you know, decisions for maybe some of the obvious

678
00:33:50,839 --> 00:33:53,359
problems and then communicate those across. But then also how

679
00:33:53,359 --> 00:33:58,000
to prioritize and choose next projects as you continue on this

680
00:33:58,039 --> 00:34:01,920
Speaker 2: Journey. Yeah, the feedback side. Yeah, exactly, the cycle. And

681
00:34:01,920 --> 00:34:03,720
I wonder if generative AI is going to play more

682
00:34:03,759 --> 00:34:05,880
of a role in that, doing a better job of

683
00:34:06,640 --> 00:34:09,800
trying to pull the signal out of the noise of

684
00:34:09,920 --> 00:34:13,559
telemetry to let us see that people are

685
00:34:13,599 --> 00:34:16,280
struggling with the app this way, or you know, these

686
00:34:16,320 --> 00:34:18,840
features aren't being utilized, Like all of that data is

687
00:34:18,920 --> 00:34:22,079
in the telemetry, but sometimes it's really hard to see.

688
00:34:22,360 --> 00:34:22,679
Speaker 1: Yeah.

689
00:34:22,719 --> 00:34:26,199
Speaker 3: Well, and I will say frankly, there are a lot

690
00:34:26,239 --> 00:34:28,440
of times where the data doesn't exist, right, and so

691
00:34:28,519 --> 00:34:30,920
I have Yeah, and so I think this, you know,

692
00:34:31,000 --> 00:34:33,960
this DevEx journey or the journey to improve developer experience

693
00:34:34,039 --> 00:34:37,480
right includes both identifying friction points and improving them and

694
00:34:37,559 --> 00:34:41,000
also identifying where you have gaps in the data and what

695
00:34:41,039 --> 00:34:43,519
you can do in the interim until you have something

696
00:34:43,519 --> 00:34:46,519
instrumented. Because, by the way, don't wait

697
00:34:46,559 --> 00:34:49,639
to start until you've instrumented your entire stack, because that's

698
00:34:49,679 --> 00:34:54,159
never happening. Part of it is not just using AI

699
00:34:54,400 --> 00:34:58,320
to kind of evaluate the telemetry, but it's also understanding

700
00:34:58,320 --> 00:35:01,039
what data and what telemetry we need. Right, many times

701
00:35:01,039 --> 00:35:02,840
we have gaps in our data, and so it's part

702
00:35:03,159 --> 00:35:06,360
improving tools and process and communicating and culture, and sometimes

703
00:35:06,400 --> 00:35:11,800
it's prioritizing investments in instrumenting systems. Right, so what does

704
00:35:11,840 --> 00:35:14,280
that look like? I will say one area that I

705
00:35:14,320 --> 00:35:17,440
think AI can make a huge impact, and it already is,

706
00:35:17,440 --> 00:35:18,960
is, you know, what I just talked about. We need

707
00:35:18,960 --> 00:35:21,679
to communicate to different stakeholders. Right now, we can pull

708
00:35:21,719 --> 00:35:24,480
together a bunch of data. I can have a very

709
00:35:24,599 --> 00:35:26,800
rough list of bullet points and findings, and I can say,

710
00:35:27,400 --> 00:35:29,079
help me rewrite that.

711
00:35:29,199 --> 00:35:31,039
Speaker 2: Can you tune this for the C suite?

712
00:35:31,079 --> 00:35:32,119
Speaker 3: Tune this for the C suite?

713
00:35:32,199 --> 00:35:32,360
Speaker 2: Right?

714
00:35:32,719 --> 00:35:34,800
Speaker 3: Also, you know, we now ask it politely. This is

715
00:35:34,800 --> 00:35:36,559
what I want you to do. Can you please help

716
00:35:36,559 --> 00:35:37,239
me write it this way?

717
00:35:37,280 --> 00:35:37,559
Speaker 1: Okay?

718
00:35:37,559 --> 00:35:40,159
Speaker 3: Now write it for engineering managers? What am I missing

719
00:35:40,360 --> 00:35:42,360
if I'm writing this for HR? Can you help me

720
00:35:43,719 --> 00:35:46,920
generate a list of you know, starter questions for conversation

721
00:35:47,119 --> 00:35:49,880
in a workshop I'm going to run. That has gotten

722
00:35:50,239 --> 00:35:51,039
so much faster.

723
00:35:51,360 --> 00:35:53,559
Speaker 2: That's really interesting, you know, I love it. I did

724
00:35:53,679 --> 00:35:56,119
very well as a product manager going through the logs

725
00:35:56,119 --> 00:35:58,199
and writing up a post-facto report, and a version

726
00:35:58,280 --> 00:36:00,760
like two weeks after, going through to see if

727
00:36:00,760 --> 00:36:02,920
we were about to tip over, and then we weren't

728
00:36:02,960 --> 00:36:04,599
just saying, hey, here's how a bunch of new features

729
00:36:04,599 --> 00:36:06,119
are being used and things like that. Like, it made a

730
00:36:06,159 --> 00:36:08,760
lot of people happy. What it didn't have was time

731
00:36:08,800 --> 00:36:11,480
to write it for all of those audiences. Yep, that's

732
00:36:11,480 --> 00:36:14,559
something an LLM is genuinely good at.

733
00:36:14,559 --> 00:36:16,360
Speaker 3: Least for a first draft, right, because then you can

734
00:36:16,400 --> 00:36:17,880
go through and you can check a couple of things.

735
00:36:17,880 --> 00:36:19,719
But I don't know about you. I am faster at

736
00:36:19,800 --> 00:36:21,840
editing than I am writing totally, and so if I

737
00:36:21,840 --> 00:36:25,400
can give it a bunch of information, a few cues on,

738
00:36:25,719 --> 00:36:28,119
you know, this leader really cares about these one or

739
00:36:28,119 --> 00:36:31,719
two things, how can we frame it that way?

740
00:36:32,119 --> 00:36:34,880
you know, I'll change a couple of sentences, I'll add

741
00:36:34,880 --> 00:36:37,039
a couple data points. But it's so much faster.

742
00:36:37,440 --> 00:36:37,639
Speaker 2: Yeah.

743
00:36:37,760 --> 00:36:39,800
Speaker 1: Do you see there's a place in the DevOps tool

744
00:36:39,840 --> 00:36:45,960
stack for agents, for AI agents to actually, you know,

745
00:36:46,480 --> 00:36:53,199
do things of their own volition with your various resources,

746
00:36:54,280 --> 00:37:01,000
even if it's notifying you or sending communications to stakeholders

747
00:37:01,199 --> 00:37:03,920
or you know, do you see that there's a place

748
00:37:03,960 --> 00:37:04,920
in the tool chain for that.

749
00:37:06,559 --> 00:37:08,199
Speaker 3: I think there will be a place. I think it's

750
00:37:08,239 --> 00:37:11,840
still, you know, so early in, you know, agentic

751
00:37:12,840 --> 00:37:16,079
development and use, that, like, we're still kind of seeing

752
00:37:16,400 --> 00:37:18,840
where to best deploy them, how to best use them.

753
00:37:19,199 --> 00:37:21,199
What does what does autonomous mean?

754
00:37:21,679 --> 00:37:21,880
Speaker 1: Right?

755
00:37:22,199 --> 00:37:24,239
Speaker 2: How autonomous is autonomous exactly?

756
00:37:24,360 --> 00:37:28,559
Speaker 1: Well, you know, you develop a prompt for your

757
00:37:28,639 --> 00:37:32,800
agent to say... you know, I don't even know.

758
00:37:33,320 --> 00:37:35,760
I'm just thinking. Well, so for example, you can tell

759
00:37:35,760 --> 00:37:38,920
it how to behave when a certain something arises, and

760
00:37:39,039 --> 00:37:39,440
it does.

761
00:37:39,719 --> 00:37:41,840
Speaker 3: Sure, but as an example, and I'm just going to

762
00:37:41,960 --> 00:37:44,119
use the reporting one because sure, I think it's one

763
00:37:44,119 --> 00:37:46,199
that's going to be obvious to everyone. I can have

764
00:37:46,199 --> 00:37:47,679
a whole bunch of data come in, I can have

765
00:37:47,760 --> 00:37:51,199
it generate a draft of a report for everyone. The reports

766
00:37:51,199 --> 00:37:54,559
could be perfect. I personally would never have an agent

767
00:37:55,079 --> 00:37:59,000
automatically send the executive report to the CEO or the board.

768
00:38:00,239 --> 00:38:02,400
That's not a risk I'd take right now.

769
00:38:02,400 --> 00:38:04,679
Speaker 1: Sort of like our GitHub pull request model, right.

770
00:38:04,719 --> 00:38:06,920
You know, it sends it to you. You can send

771
00:38:06,960 --> 00:38:09,320
it on after an edit or two or whatever.

772
00:38:09,400 --> 00:38:11,239
Speaker 3: But I also might have it go ahead and send

773
00:38:11,239 --> 00:38:14,000
it automatically to all of the developers because they may

774
00:38:14,000 --> 00:38:17,800
be expecting it on a monthly or a quarterly cadence. Now,

775
00:38:17,880 --> 00:38:20,159
executives might be expecting it on a six month cadence,

776
00:38:20,480 --> 00:38:22,400
but if I know they're in an off site or

777
00:38:22,599 --> 00:38:25,280
something just blew up, I'm not sending it that day.

778
00:38:25,320 --> 00:38:27,239
I will go ahead and wait forty eight hours, and

779
00:38:27,280 --> 00:38:29,800
I want it to come from me, right, I mean, yes,

780
00:38:29,840 --> 00:38:31,679
I know we can configure things so that it looks

781
00:38:31,719 --> 00:38:34,400
like it comes from someone, but that's you know, I

782
00:38:34,719 --> 00:38:36,840
do think there are cases where we're going to start

783
00:38:36,880 --> 00:38:39,280
feeling out what that looks like.

784
00:38:39,519 --> 00:38:41,639
Speaker 2: Right, I'm pretty comfortable with the idea that we're going to want

785
00:38:41,760 --> 00:38:43,840
identities for each of these bits of software, so it's

786
00:38:43,920 --> 00:38:48,280
very clear that this was generated by an LLM's analysis,

787
00:38:48,320 --> 00:38:49,320
and oh.

788
00:38:49,159 --> 00:38:54,079
Speaker 3: My gosh, yes, please. You know, there's also a big

789
00:38:54,119 --> 00:38:57,519
open question around how do we evaluate and think about

790
00:38:58,440 --> 00:39:02,800
the downstream outcomes of code, because if we think about it,

791
00:39:03,239 --> 00:39:07,719
very few systems currently have, let's say, keylogger-level

792
00:39:07,800 --> 00:39:12,519
tracking or per-character tracking, and so, you know, right now,

793
00:39:12,559 --> 00:39:14,559
the question is often did you use AI or did

794
00:39:14,599 --> 00:39:20,000
you not use AI? That is such a... yeah, exactly,

795
00:39:20,320 --> 00:39:23,000
does grammar check qualify, right? Or like did I use

796
00:39:23,039 --> 00:39:26,599
AI and it like helped kickstart my brain and then

797
00:39:26,599 --> 00:39:30,519
I deleted everything it suggested except for like some

798
00:39:30,599 --> 00:39:31,920
setup comment or something.

799
00:39:31,760 --> 00:39:34,199
Speaker 2: Right, it stimulated an idea and you ran with it.

800
00:39:34,760 --> 00:39:38,920
Speaker 3: But then like downstream, which portions of the code perform well,

801
00:39:38,960 --> 00:39:42,000
Which portions of the code are more readable and so

802
00:39:42,119 --> 00:39:45,079
they do well in code review, Which portions of the

803
00:39:45,119 --> 00:39:49,440
code are challenging in build or integration, Which portions of

804
00:39:49,440 --> 00:39:55,000
the code have long term viability and success in production?

805
00:39:55,559 --> 00:39:59,159
And right now we just I'll say, we generally just

806
00:39:59,360 --> 00:40:03,679
don't have that level of granularity. Again, that introduces new opportunities.

807
00:40:03,679 --> 00:40:07,800
Speaker 2: That brings up a memory for me of what code reviews well versus what

808
00:40:07,880 --> 00:40:10,800
performs well, because I've done a lot of web performance

809
00:40:10,840 --> 00:40:14,360
optimization over the years that always reviewed poorly because it's

810
00:40:14,440 --> 00:40:17,119
non-obvious code, and looking at it and going, why

811
00:40:17,159 --> 00:40:19,719
are you doing this? It's like, because in a multi-threaded,

812
00:40:19,800 --> 00:40:23,599
re-entrant, high velocity environment, this code is safe even

813
00:40:23,639 --> 00:40:25,440
though it's obtuse.

814
00:40:25,280 --> 00:40:28,440
Speaker 3: Right exactly well? And then there's even i think really

815
00:40:28,480 --> 00:40:31,840
interesting open questions. Sorry, this is like researcher hat. We

816
00:40:31,960 --> 00:40:37,280
know that many times reviewers will react differently to code

817
00:40:37,320 --> 00:40:39,840
written by different people. Sure: it's a junior developer, it's

818
00:40:39,840 --> 00:40:42,480
someone on another team, it's someone who probably isn't familiar

819
00:40:42,480 --> 00:40:44,360
with your codebase, or you think is not familiar with

820
00:40:44,360 --> 00:40:47,400
your code base. How are people

821
00:40:47,440 --> 00:40:52,199
going to react differently to code that is entirely written

822
00:40:52,559 --> 00:40:55,360
by AI? What conventions can we use so that it's

823
00:40:55,440 --> 00:40:58,960
more readable, so it's more understandable, so it's more trustable?

824
00:40:58,960 --> 00:41:01,360
And in which case you know, like in your example,

825
00:41:01,639 --> 00:41:03,960
does it just need to be hard to understand? But

826
00:41:04,000 --> 00:41:06,119
it's okay because we know that that's what's happening.

827
00:41:06,199 --> 00:41:08,760
Speaker 2: Yeah, and I used to write those comment blocks:

828
00:41:08,800 --> 00:41:12,400
does this code look stupid to you? Don't touch it.

829
00:41:12,400 --> 00:41:15,039
It is a hard-won, fought-over piece of code

830
00:41:15,039 --> 00:41:17,800
for a very difficult problem. Yes, I wish it would

831
00:41:17,840 --> 00:41:19,320
be simpler, but it cannot be.

832
00:41:20,039 --> 00:41:22,320
Speaker 1: Just to get back to the term AI and what

833
00:41:22,360 --> 00:41:27,000
does that mean? We go through these cycles where the

834
00:41:27,039 --> 00:41:31,480
public glomms onto words that they think mean something and

835
00:41:31,480 --> 00:41:34,400
that mean something else. Like your clock radio now has

836
00:41:34,519 --> 00:41:35,599
artificial intelligence.

837
00:41:35,960 --> 00:41:38,599
Speaker 3: Right, It's like, I mean, if you put that sticker

838
00:41:38,639 --> 00:41:40,079
on the box, it's going to sell for more money.

839
00:41:40,159 --> 00:41:44,039
Speaker 2: Yeah, of course. But, I'm sorry, a clock radio? A clock radio?

840
00:41:44,000 --> 00:41:46,760
Speaker 1: It's the algorithm sort of thing, right? Yeah, oh, that was an

841
00:41:46,800 --> 00:41:47,920
algorithm that did that.

842
00:41:48,000 --> 00:41:48,679
Speaker 2: What does that mean?

843
00:41:49,559 --> 00:41:53,559
Speaker 1: Or let's go to the world of food. Enzymes is

844
00:41:53,599 --> 00:41:55,960
a great one. An enzyme is a protein, but it sounds

845
00:41:56,000 --> 00:41:59,159
like it's doing something you know, enzyme. Well, there is

846
00:41:59,159 --> 00:42:03,360
an enzymatic act. Okay, yeah, okay, but, you know, yeah,

847
00:42:03,679 --> 00:42:05,239
people just throw these terms around.

848
00:42:05,280 --> 00:42:06,920
Speaker 2: Well, I've certainly been big on the fact that artificial

849
00:42:06,960 --> 00:42:11,880
intelligence is a term coined for raising money. It was

850
00:42:11,920 --> 00:42:15,320
a marketing strategy that then got hijacked by science fiction,

851
00:42:15,920 --> 00:42:18,239
and so of course everybody has a bad perception of it.

852
00:42:18,280 --> 00:42:22,880
We've had sixty years of abusing that term.

853
00:42:22,079 --> 00:42:26,519
Speaker 3: And it very legitimately means several different things.

854
00:42:26,599 --> 00:42:26,760
Speaker 2: Yeah.

855
00:42:26,840 --> 00:42:30,559
Speaker 3: Sure, given the context, given the discipline, given the background,

856
00:42:31,280 --> 00:42:34,519
it's a blanket term. They're all AI. And also, depending

857
00:42:34,519 --> 00:42:36,840
on what your assumption is, it's definitely not AI.

858
00:42:37,039 --> 00:42:38,840
Speaker 2: Well, I've always said, if you still call it

859
00:42:38,960 --> 00:42:40,960
AI, it's because it doesn't work. As soon as it does work,

860
00:42:41,000 --> 00:42:45,119
it gets a new name. It becomes image recognition or

861
00:42:45,280 --> 00:42:48,840
you know, sentiment analysis or large language model. But if

862
00:42:49,320 --> 00:42:50,880
you can only call it AI, it's because you haven't

863
00:42:50,880 --> 00:42:51,639
figured it out yet.

864
00:42:52,039 --> 00:42:55,039
Speaker 3: Yeah, I think it's also Yeah, I mean sometimes it's

865
00:42:55,039 --> 00:42:58,320
also the underlying technology, right, Like we've had AI for

866
00:42:58,480 --> 00:43:03,159
years and it's one thing. Now, LLMs use a different

867
00:43:03,199 --> 00:43:04,199
type of math.

868
00:43:04,360 --> 00:43:07,400
Speaker 2: Yeah, there's always been that schism between the decision tree

869
00:43:07,440 --> 00:43:11,159
systems and the neural net systems, the neats and the scruffies. Yep,

870
00:43:11,400 --> 00:43:14,239
going back to Minsky's terms, which is really old school,

871
00:43:15,360 --> 00:43:17,719
so fun. Even when you actually are talking about the

872
00:43:17,719 --> 00:43:21,320
technological side of this, there's been two totally different sets

873
00:43:21,320 --> 00:43:25,159
of philosophies. Yeah, but then throw HAL and Ultron on

874
00:43:25,199 --> 00:43:28,400
the top of that, and no wonder people are confused always. Yeah,

875
00:43:28,800 --> 00:43:33,079
I appreciate it, though, but I also appreciate that we're thinking,

876
00:43:33,119 --> 00:43:35,719
in terms of all of these technologies, about how the

877
00:43:35,760 --> 00:43:40,079
development experience could be better from this. Not only could.

878
00:43:39,840 --> 00:43:42,719
Speaker 3: Be, but I think it now has to be. I've always said it

879
00:43:42,760 --> 00:43:46,480
needs to be. But now, with a lot of teams

880
00:43:46,480 --> 00:43:48,880
and organizations that I talk to, we're really seeing that

881
00:43:48,960 --> 00:43:51,880
where there's a subpar, I won't even say bad, a subpar

882
00:43:52,039 --> 00:43:56,039
developer experience, the wheels are falling off. Yeah, things are

883
00:43:56,079 --> 00:43:58,960
just breaking because now we're trying to move so fast

884
00:43:59,000 --> 00:44:01,079
and generate so much, and so what does that mean

885
00:44:01,119 --> 00:44:03,280
for everything downstream? We have to figure out how to

886
00:44:03,280 --> 00:44:04,639
review it, we have to figure out how to test it,

887
00:44:04,679 --> 00:44:06,400
We have to figure out how to do building integration.

888
00:44:06,480 --> 00:44:08,800
We have to figure out how to release these huge

889
00:44:08,800 --> 00:44:14,760
amounts of code in compressed timeframes. And so you know, Richard,

890
00:44:14,800 --> 00:44:16,119
you know we've been saying for a while that, you know,

891
00:44:16,239 --> 00:44:18,320
DevOps or whatever you want to call it, Yeah, is

892
00:44:18,599 --> 00:44:22,079
table stakes. Oh listen, that's the bar.

893
00:44:22,159 --> 00:44:23,920
Speaker 2: There's a whole set of conversations you didn't get to

894
00:44:23,920 --> 00:44:26,000
have if you haven't figured this part out yet.

895
00:44:26,599 --> 00:44:28,840
Speaker 3: And now, for teams and organizations who have

896
00:44:28,880 --> 00:44:33,280
been winging it well enough, it's no longer enough. Yeah,

897
00:44:33,320 --> 00:44:36,440
we're going to see a lot of sprinting and burnout

898
00:44:36,599 --> 00:44:42,079
and/or, hopefully, improved systems that serve devs well.

899
00:44:42,320 --> 00:44:45,360
Speaker 2: Andreessen convinced us that software was eating the world. And

900
00:44:45,400 --> 00:44:47,760
now, you know, companies are more and more

901
00:44:47,800 --> 00:44:50,119
dependent on it. If you haven't solved these problems, you're

902
00:44:50,159 --> 00:44:51,880
not going to be able to build the software.

903
00:44:51,920 --> 00:44:52,559
Speaker 1: Or you will.

904
00:44:52,960 --> 00:44:56,840
Speaker 3: But now that things are moving so incredibly fast, you

905
00:44:56,880 --> 00:44:58,719
won't be able to keep up, right. I mean, it's

906
00:44:58,719 --> 00:45:01,800
always been a challenge of competitive advantage and who can

907
00:45:01,800 --> 00:45:04,360
build faster, And before it was maybe a difference of

908
00:45:04,400 --> 00:45:06,840
like six months and nine months or one year and

909
00:45:06,920 --> 00:45:08,840
one year and a half, which like isn't great but

910
00:45:08,960 --> 00:45:12,239
kind of fine. Well, now it's like weeks and if

911
00:45:12,239 --> 00:45:15,920
you're pulling in at a year-plus for, you know,

912
00:45:16,199 --> 00:45:19,719
a solid feature set or new product or something, and

913
00:45:19,800 --> 00:45:23,000
all of your competitors and brand new competitors because the

914
00:45:23,079 --> 00:45:25,920
industry is just yeah, it's like a bunch of weeds.

915
00:45:26,000 --> 00:45:28,800
Everyone's popping up everywhere. If they're coming out with something

916
00:45:29,360 --> 00:45:32,920
in six to eight weeks with a really solid tech

917
00:45:32,960 --> 00:45:34,039
preview or beta.

918
00:45:34,000 --> 00:45:34,880
Speaker 1: Yeah, you're toast.

919
00:45:35,000 --> 00:45:38,239
Speaker 2: Yeah, you have troubles. I remember in Accelerate, you were

920
00:45:38,239 --> 00:45:40,760
talking about the fact that version over a version you

921
00:45:40,800 --> 00:45:43,280
should be getting faster, but you found a lot of

922
00:45:43,360 --> 00:45:45,960
organizations that version over version were actually getting slower. Like

923
00:45:45,960 --> 00:45:49,239
you get mired in the cruft and can't ship.

924
00:45:49,000 --> 00:45:52,639
Speaker 3: The next version, exactly. Or, you know, sometimes the reflex is:

925
00:45:53,519 --> 00:45:56,199
we shipped and we had some outages. So now what

926
00:45:56,199 --> 00:45:57,800
we're going to do is we're going to slow down

927
00:45:57,920 --> 00:46:00,360
and we're going to do a more thorough review, which

928
00:46:00,400 --> 00:46:04,039
then means you have even more code to try to

929
00:46:04,079 --> 00:46:07,320
push and integrate into PROD, which you know now you've

930
00:46:07,360 --> 00:46:09,960
just had a huge blast radius. Yeah, so you have

931
00:46:10,039 --> 00:46:12,519
more outages and then sometimes the reflex is then to

932
00:46:13,119 --> 00:46:16,119
have a code freeze or wait longer and go even slower.

933
00:46:16,760 --> 00:46:20,159
Speaker 2: Yeah. Those are anti-patterns. Yeah, the opposite of limitless.

934
00:46:20,199 --> 00:46:21,119
It's very limited.

935
00:46:21,400 --> 00:46:26,000
Speaker 3: Yes, and it's fairly predictable. We see it very very often. Now,

936
00:46:26,440 --> 00:46:29,239
for any you know, haters who are listening, we are

937
00:46:29,239 --> 00:46:32,280
not saying to just go as fast as possible, throw

938
00:46:32,320 --> 00:46:35,199
everything at the wall and run. No, that's also not good, right.

939
00:46:35,199 --> 00:46:38,960
We're definitely saying that we need the automated processes and

940
00:46:38,960 --> 00:46:42,840
tools in place to ensure that your code is safe, secure, reliable,

941
00:46:42,920 --> 00:46:48,880
performant, correct. But without those things, without the automation in particular, yeah,

942
00:46:48,960 --> 00:46:49,760
it's real tough.

943
00:46:49,960 --> 00:46:52,400
Speaker 2: But the automation can't come first. You're still in this

944
00:46:52,480 --> 00:46:54,480
place of can we evaluate well enough to know this

945
00:46:54,519 --> 00:46:56,039
is where the bottleneck is, and then try and

946
00:46:56,079 --> 00:47:00,280
solve the bottleneck, exactly. Yeah, and then maybe a tool

947
00:47:00,360 --> 00:47:02,719
makes sense there, but only because you want to solve

948
00:47:02,719 --> 00:47:03,599
the problem. Yeah.

949
00:47:03,639 --> 00:47:05,199
Speaker 3: And you know something I learned kind of early on

950
00:47:05,199 --> 00:47:06,519
in my career, or someone said this to me. I

951
00:47:06,559 --> 00:47:10,039
wish I remembered who. But we can't automate something until

952
00:47:10,039 --> 00:47:12,960
we understand it. Sure, there really is value in doing

953
00:47:13,000 --> 00:47:17,559
something manually a handful of times. Yeah, so we understand

954
00:47:17,599 --> 00:47:18,840
it and then we can automate it. Because if we

955
00:47:18,840 --> 00:47:20,840
automate too fast, all we've done is sped up something

956
00:47:20,880 --> 00:47:22,800
that is likely wrong or incomplete.

957
00:47:22,840 --> 00:47:23,880
Speaker 2: Right, But now it goes quickly.

958
00:47:24,119 --> 00:47:25,320
Speaker 3: Yeah, but now go exactly.

959
00:47:25,440 --> 00:47:28,079
Speaker 2: You amplified stupidity instead of intelligence. Now it's faster.

960
00:47:28,480 --> 00:47:30,159
Speaker 3: Well, it can. Congratulations.

961
00:47:30,199 --> 00:47:32,159
Speaker 1: It could have been too slow, but the fact that

962
00:47:32,199 --> 00:47:35,800
you're automating it, you won't notice that because hey, I'm

963
00:47:35,840 --> 00:47:37,320
not doing it, so it's faster.

964
00:47:37,119 --> 00:47:39,239
Speaker 2: Than me doing it. Yeah, doing dumb faster.

965
00:47:39,440 --> 00:47:42,360
Speaker 3: Yeah, but the solution is not to continue doing it

966
00:47:42,400 --> 00:47:43,280
manually forever.

967
00:47:43,480 --> 00:47:45,920
Speaker 2: No, right, yeah, just because you do it dumb slow

968
00:47:45,920 --> 00:47:49,719
doesn't mean it's better. Right. But this is the consulting effect. Right.

969
00:47:49,960 --> 00:47:52,519
I never got hired to re engineer a business process.

970
00:47:52,559 --> 00:47:56,440
I got hired to automate. And my process of scrutinizing

971
00:47:56,480 --> 00:48:00,440
the workflow to make the automation re engineered the process.

972
00:48:01,239 --> 00:48:03,559
Like the two kind of go together that you have

973
00:48:03,639 --> 00:48:06,360
to you have to do what's acceptable to people as

974
00:48:06,400 --> 00:48:08,599
well to say I'm here to you know, make things better,

975
00:48:08,880 --> 00:48:11,840
make things better, and automation is safe. They're comfortable with that.

976
00:48:12,159 --> 00:48:15,440
But then in order to automate, you have to work

977
00:48:15,480 --> 00:48:18,000
through the manual process and understand it. And because you're

978
00:48:18,000 --> 00:48:19,360
the outsider, you also get a pass to go.

979
00:48:19,480 --> 00:48:23,000
Speaker 1: I mean, you're like Columbo. Oh yeah, you

980
00:48:23,039 --> 00:48:26,280
get just one more question about this process.

981
00:48:25,800 --> 00:48:27,800
Speaker 2: It's kind of inefficient.

982
00:48:27,840 --> 00:48:29,519
Speaker 1: Do you mind if I just take a look at that?

983
00:48:30,639 --> 00:48:34,320
Speaker 2: That's a good Columbo, thank you. Good Peter Falk, there you go.

984
00:48:35,519 --> 00:48:38,360
But yeah, when do we scrutinize? I mean, it doesn't

985
00:48:38,360 --> 00:48:41,280
have to be an external. Like, I've definitely seen folks

986
00:48:41,280 --> 00:48:43,599
who've embarked on that sort of DevOps mindset where they

987
00:48:43,679 --> 00:48:45,920
just start talking to more people and they ask a

988
00:48:45,920 --> 00:48:49,559
lot of whys and often run into it's the way

989
00:48:49,559 --> 00:48:52,480
we've always done it right, So yeah, you know, we

990
00:48:52,599 --> 00:48:56,000
believe that that was necessary.

991
00:48:54,119 --> 00:48:57,360
Speaker 3: Yeah, change. And we're seeing a lot of that now. Yeah,

992
00:48:57,400 --> 00:49:01,639
because everything's changing and we need to rethink or at

993
00:49:01,719 --> 00:49:05,079
least revisit how our systems are working.

994
00:49:05,360 --> 00:49:07,079
Speaker 2: Yeah, you'd think we would do that routinely, but we

995
00:49:07,159 --> 00:49:09,880
don't because there's so many other things to do. Anyway,

996
00:49:10,079 --> 00:49:12,039
that idea of just taking the step back and looking

997
00:49:12,039 --> 00:49:14,480
through the processes again and say is this the right process?

998
00:49:15,280 --> 00:49:19,159
Speaker 3: Well, and it's really hard to articulate the importance or

999
00:49:19,159 --> 00:49:22,159
the value of cleaning up systems and clearing up tech debt.

1000
00:49:22,239 --> 00:49:24,599
Speaker 2: Sure, it's better if you.

1001
00:49:24,559 --> 00:49:27,960
Speaker 3: Can talk about it and tie it to outcomes. You know,

1002
00:49:28,599 --> 00:49:31,199
we need to make sure that we can improve the

1003
00:49:31,239 --> 00:49:34,480
consistency and the predictability and the reliability of the software,

1004
00:49:35,079 --> 00:49:39,880
and so we yeah, we carve out a portion of

1005
00:49:39,880 --> 00:49:43,440
our time to ensure that is happening right and over

1006
00:49:43,480 --> 00:49:46,679
the past period of N months or years, we can

1007
00:49:46,719 --> 00:49:49,840
show that these investments are paying off. And the times

1008
00:49:49,880 --> 00:49:52,119
when we haven't because of whatever reason, you know, a

1009
00:49:52,199 --> 00:49:55,760
race to push something, we tend to see these you know,

1010
00:49:56,400 --> 00:50:00,360
negative outcomes. So like that can help, but it's it's

1011
00:50:00,400 --> 00:50:05,599
tough to try to explain or you know, calculate an

1012
00:50:05,679 --> 00:50:06,480
ROI for tech debt.

1013
00:50:06,519 --> 00:50:08,440
Speaker 2: That's not no, I mean I think a lot of

1014
00:50:08,559 --> 00:50:10,159
us just included it as like a ten percent rate

1015
00:50:10,239 --> 00:50:13,159
or something like that. You do want to knock it down, Oh, absolutely.

1016
00:50:13,320 --> 00:50:16,039
The argument has always been: if we don't do this, eventually

1017
00:50:16,039 --> 00:50:16,960
we can't ship anything.

1018
00:50:17,320 --> 00:50:17,679
Speaker 3: Exactly.

1019
00:50:17,760 --> 00:50:19,960
Speaker 2: We have to do an all-tech-debt sprint, or

1020
00:50:20,039 --> 00:50:22,920
multiple. Whereas if we just stay on

1021
00:50:22,960 --> 00:50:25,000
top of it, so that we're diminishing it, there's less

1022
00:50:25,000 --> 00:50:28,039
of it the next time than there was, we're better off.

1023
00:50:27,920 --> 00:50:32,719
Speaker 3: Now, keeping that, you know, goal of ten percent

1024
00:50:32,880 --> 00:50:35,639
or so, that's great. It's really challenging when I chat

1025
00:50:35,679 --> 00:50:40,119
with teams where they're booked up, their capacity is one hundred percent,

1026
00:50:40,159 --> 00:50:44,039
they're fully utilized, and then things start breaking. Yeah, and

1027
00:50:44,079 --> 00:50:47,599
it's because... and it's not that the developers, you know,

1028
00:50:48,039 --> 00:50:49,920
don't want to. I mean, tech debt isn't super fun, but

1029
00:50:49,920 --> 00:50:52,320
they understand, you know, when a codebase is... when it's

1030
00:50:52,360 --> 00:50:55,800
really hard to make a change, that's not fun, right,

1031
00:50:56,119 --> 00:51:00,920
But it's hard to explain why the way, finger quotes, right,

1032
00:51:01,000 --> 00:51:03,599
the way we've been working for the last N years

1033
00:51:04,159 --> 00:51:06,719
isn't going to work anymore because we were basically sprinting

1034
00:51:06,880 --> 00:51:09,760
and now we have to do things to harden our

1035
00:51:09,800 --> 00:51:12,880
systems or make them more scalable or more reliable or

1036
00:51:13,440 --> 00:51:14,880
easier to change and understand.

1037
00:51:15,360 --> 00:51:18,679
Speaker 2: Yeah, it's a different cycle, yep. The difference between getting

1038
00:51:18,679 --> 00:51:21,679
that initial version out the door, proving initial value, exactly,

1039
00:51:21,719 --> 00:51:22,840
versus what is sustainable value.

1040
00:51:23,000 --> 00:51:25,440
Speaker 3: Yeah, And you know, I'm going to pull us back

1041
00:51:25,480 --> 00:51:28,880
to that AI thing. I think this is also where

1042
00:51:29,679 --> 00:51:31,599
we're going to have some big questions come up because

1043
00:51:32,239 --> 00:51:34,400
you know your point when it's just about getting the

1044
00:51:34,440 --> 00:51:37,079
next version out of the door. Okay, right now, some

1045
00:51:37,119 --> 00:51:42,119
companies are releasing new products or new significant features every month, right,

1046
00:51:42,599 --> 00:51:47,199
So how can we be thinking about code hygiene and

1047
00:51:47,840 --> 00:51:52,400
system automation and hygiene and improvement when we are constantly

1048
00:51:52,440 --> 00:51:53,239
moving this fast?

1049
00:51:53,360 --> 00:51:56,159
Speaker 2: Yeah? Yeah, Well, and I would always look at the

1050
00:51:56,159 --> 00:51:58,920
delta trajectories there it's like, are we still able to

1051
00:51:58,920 --> 00:52:00,960
move every month? Or are we shipping a little less each month?

1052
00:52:01,079 --> 00:52:04,440
Like are we decaying or are we strengthening? Yeah? I

1053
00:52:04,440 --> 00:52:06,360
picked that number, ten percent, a little bit arbitrarily. Like

1054
00:52:06,519 --> 00:52:08,880
what I'm really looking for is the total amount of

1055
00:52:08,920 --> 00:52:13,079
tech debt less this sprint than last sprint? So if

1056
00:52:13,079 --> 00:52:15,199
we're accumulating more from the new things we're shipping, we

1057
00:52:15,239 --> 00:52:18,239
actually have to fix more too, yep. Otherwise we're just

1058
00:52:18,280 --> 00:52:19,320
delaying the inevitable.

1059
00:52:19,519 --> 00:52:22,199
Speaker 3: Now, I love that you pointed that out right, if

1060
00:52:22,199 --> 00:52:25,440
we look at the things we're accumulating from shipping. One

1061
00:52:25,519 --> 00:52:29,519
reason I'm super excited about how quickly we can develop

1062
00:52:29,519 --> 00:52:31,519
things now is we can do a lot of experiments.

1063
00:52:31,840 --> 00:52:34,639
So hopefully we are not in a world where every

1064
00:52:34,679 --> 00:52:36,519
single thing we build gets shipped.

1065
00:52:36,840 --> 00:52:40,039
Speaker 2: Yeah, hopefully. We should be way more comfortable throwing away

1066
00:52:40,079 --> 00:52:40,400
code, too.

1067
00:52:40,440 --> 00:52:42,840
Speaker 3: At least fifty percent of the things that we try

1068
00:52:43,280 --> 00:52:46,400
sure should fail, and it should be done so quickly

1069
00:52:46,440 --> 00:52:51,519
that, like, the business doesn't really... it's just almost without consequence, exactly. Yeah,

1070
00:52:51,559 --> 00:52:54,639
because you know, definitely to your point, code that doesn't

1071
00:52:54,679 --> 00:52:56,280
do much at least it doesn't hurt the system.

1072
00:52:56,480 --> 00:52:57,079
Speaker 2: Oh you have.

1073
00:52:57,079 --> 00:52:57,840
Speaker 3: Added to tech debt.

1074
00:52:57,880 --> 00:53:00,519
Speaker 2: Yeah, yeah, you've loaded it down. But I now feel like

1075
00:53:00,599 --> 00:53:03,920
we're going to make more branches to do experimental coding with

1076
00:53:04,239 --> 00:53:07,400
these new tools and then make an assessment and go, okay,

1077
00:53:07,440 --> 00:53:10,880
we've made a learning. Now dump the branch. Let's try

1078
00:53:10,960 --> 00:53:13,320
again with what we've now learned to see what we're

1079
00:53:13,320 --> 00:53:15,840
going to make in the end. Yep, and in relatively

1080
00:53:15,880 --> 00:53:17,840
short periods of time we could spit a lot more

1081
00:53:17,880 --> 00:53:20,440
code out. Now, the real question is: is it good? Yeah.

1082
00:53:20,920 --> 00:53:23,599
Speaker 3: Or, you know, even if it's not the best-designed code,

1083
00:53:23,679 --> 00:53:25,960
is the functionality there, is the user flow there, is

1084
00:53:26,000 --> 00:53:26,400
the...

1085
00:53:27,000 --> 00:53:28,800
Speaker 2: Did we get it into a thing that we want, some

1086
00:53:28,880 --> 00:53:30,280
kind of value? I appreciate that.

1087
00:53:30,360 --> 00:53:34,559
Speaker 1: Yeah, I'm just looking forward to things that I was

1088
00:53:34,840 --> 00:53:40,119
never looking forward to. Right, merge conflicts, I think those

1089
00:53:40,159 --> 00:53:43,840
will get easier to deal with having you know, an

1090
00:53:43,880 --> 00:53:48,320
AI in Visual Studio, for example. You know, just some

1091
00:53:48,400 --> 00:53:51,000
of those things that that are cringe worthy, like oh,

1092
00:53:51,039 --> 00:53:52,719
I don't want to go down that rabbit hole, but

1093
00:53:52,800 --> 00:53:55,559
then you know, what do you got to lose? It's

1094
00:53:55,599 --> 00:53:57,119
all about the prompt baby. Oh.

1095
00:53:57,199 --> 00:54:00,719
Speaker 3: I'll say, in some ways, I think it will technically get easier,

1096
00:54:01,199 --> 00:54:06,320
but I think personally it will be challenging because right

1097
00:54:06,320 --> 00:54:09,480
now merge conflicts, it's a person making them. Now many times

1098
00:54:09,480 --> 00:54:15,239
the system suggests the right answer. But what happens when

1099
00:54:15,239 --> 00:54:19,679
we have many, many more merge conflicts to evaluate? And

1100
00:54:19,760 --> 00:54:22,000
now you know the things I was talking about before?

1101
00:54:22,360 --> 00:54:25,440
Did it come from an agent? Did it come from

1102
00:54:25,480 --> 00:54:25,920
a person?

1103
00:54:26,880 --> 00:54:31,000
Speaker 2: I know, you have that issue list, clicking fire to Copilot,

1104
00:54:31,079 --> 00:54:33,079
each of which spins up a branch to build

1105
00:54:33,079 --> 00:54:35,519
that thing and sends you a PR, and ten of

1106
00:54:35,519 --> 00:54:38,559
them come to roost. I hope you get some time. Yeah.

1107
00:54:38,960 --> 00:54:41,840
Speaker 3: Yeah, Now, I will say I keep identifying like open

1108
00:54:41,920 --> 00:54:45,679
questions and hard problems. I hope people who maybe are

1109
00:54:45,760 --> 00:54:47,599
less familiar with me and my work can see that this

1110
00:54:47,760 --> 00:54:50,519
is a lot of excitement. There are a lot of

1111
00:54:50,599 --> 00:54:52,519
really cool things to figure out, and there are a

1112
00:54:52,519 --> 00:54:55,840
lot of cool things that don't have established best practice.

1113
00:54:55,880 --> 00:54:58,480
So for anyone who's in the field now, whether you're

1114
00:54:58,559 --> 00:55:01,159
junior or a senior, there is so much to do.

1115
00:55:01,599 --> 00:55:04,280
There's so much to do, and AI can't fix that,

1116
00:55:04,320 --> 00:55:06,559
and it can't solve it, because while it's pretty good at

1117
00:55:06,840 --> 00:55:08,880
responding to a prompt, it's not creative.

1118
00:55:09,400 --> 00:55:12,239
Speaker 1: But also these sort of meta prompts. You know, here's

1119
00:55:12,239 --> 00:55:14,360
a big problem I want to solve. What are the steps?

1120
00:55:14,760 --> 00:55:16,400
You don't have to do them for me, but what

1121
00:55:16,440 --> 00:55:19,039
are the steps I need to go through in order

1122
00:55:19,079 --> 00:55:22,920
to achieve this goal? You know, organizing your thoughts and

1123
00:55:23,039 --> 00:55:26,719
breaking big problems down into smaller problems. It's really good

1124
00:55:26,719 --> 00:55:28,199
for that. Oh, it's so good.

1125
00:55:28,320 --> 00:55:28,719
Speaker 2: Yeah.

1126
00:55:28,800 --> 00:55:31,280
Speaker 1: Yeah, And whether you use it all the way through,

1127
00:55:32,119 --> 00:55:36,519
that's another thing. But at least getting started, things that

1128
00:55:36,599 --> 00:55:43,440
seem insurmountable can be made easier. So what's next for you?

1129
00:55:43,480 --> 00:55:44,159
What's in your inbox?

1130
00:55:44,199 --> 00:55:45,360
Speaker 2: When are we going to see this book?

1131
00:55:46,000 --> 00:55:46,440
Speaker 3: End of year?

1132
00:55:46,519 --> 00:55:48,920
Speaker 2: Hopefully? I thought it already came out.

1133
00:55:49,679 --> 00:55:54,239
Speaker 3: We're working on it. So, continuing to work on the book,

1134
00:55:54,960 --> 00:55:58,960
doing some really exciting work at Microsoft around AI and

1135
00:55:59,039 --> 00:56:01,199
systems and how we can improve and you know, what

1136
00:56:01,239 --> 00:56:04,719
that looks like. I still occasionally find some time to

1137
00:56:04,719 --> 00:56:07,440
talk to folks across industry and I love to learn,

1138
00:56:08,320 --> 00:56:11,760
you know, from them and their experiences. So yeah, still,

1139
00:56:11,880 --> 00:56:13,679
you know, it feels like the early DevOps days,

1140
00:56:13,679 --> 00:56:16,559
where like we're kind of identifying and finding patterns and

1141
00:56:16,599 --> 00:56:21,000
sharing best practices and you know, connecting dots and connecting teams.

1142
00:56:21,039 --> 00:56:22,320
So it's super fun.

1143
00:56:22,480 --> 00:56:25,360
Speaker 2: Yeah, those are amazing times. So good to talk to you again,

1144
00:56:25,400 --> 00:56:26,480
and Bill, thanks, good to talk to you.

1145
00:56:26,519 --> 00:56:27,239
Speaker 3: Thanks for having me.

1146
00:56:27,320 --> 00:56:29,920
Speaker 1: Yeah, thanks for coming back, and we'll talk to you

1147
00:56:29,960 --> 00:56:54,360
next time on dot net rocks. Dot net Rocks is

1148
00:56:54,360 --> 00:56:58,079
brought to you by Franklins dot Net and produced by Pwop Studios,

1149
00:56:58,440 --> 00:57:02,480
a full service audio, video and post production facility located

1150
00:57:02,480 --> 00:57:05,440
physically in New London, Connecticut, and of course in the

1151
00:57:05,480 --> 00:57:10,559
cloud online at pwop dot com. Visit our website at

1152
00:57:10,599 --> 00:57:12,480
D O T N E T R O C K

1153
00:57:12,719 --> 00:57:17,519
S dot com for RSS feeds, downloads, mobile apps, comments,

1154
00:57:17,840 --> 00:57:20,360
and access to the full archives going back to show

1155
00:57:20,440 --> 00:57:24,159
number one, recorded in September two thousand and two. And

1156
00:57:24,280 --> 00:57:26,679
make sure you check out our sponsors. They keep us

1157
00:57:26,679 --> 00:57:29,960
in business. Now go write some code. See you next

1158
00:57:29,960 --> 00:57:32,920
time.


