1
00:00:01,840 --> 00:00:02,439
Speaker 1: What's going on?

2
00:00:02,480 --> 00:00:02,919
Speaker 2: Everybody?

3
00:00:02,919 --> 00:00:08,759
Speaker 1: Welcome to another episode of Adventures in DevOps. Warren joining

4
00:00:08,759 --> 00:00:11,320
me again. I keep making you feel like the new guy.

5
00:00:11,359 --> 00:00:12,599
But it's been like, what a year.

6
00:00:12,439 --> 00:00:16,839
Speaker 2: Now, almost that long, and I've got

7
00:00:16,839 --> 00:00:20,280
my pick prepared. It was a recent one. Well, I don't

8
00:00:20,280 --> 00:00:21,559
want to spoil my pick, so I'm not going to

9
00:00:21,600 --> 00:00:24,039
say what it is. But the conclusion is that AI

10
00:00:24,199 --> 00:00:27,079
may be making us stupid. The truth is that AI

11
00:00:27,719 --> 00:00:30,320
causes a huge decrease in our critical thinking, or in how

12
00:00:30,399 --> 00:00:36,079
much we're exercising it, so we're not necessarily training that skill,

13
00:00:36,640 --> 00:00:39,840
and this could be the beginning of the downfall of humanity.

14
00:00:39,920 --> 00:00:41,520
Speaker 3: And that's all I'm gonna say.

15
00:00:41,719 --> 00:00:44,520
Speaker 4: I don't know. I sort of take like issue with

16
00:00:44,600 --> 00:00:47,039
that because I remember hearing the same thing, like my

17
00:00:47,119 --> 00:00:49,119
teacher telling me all about spell check, like, oh,

18
00:00:49,119 --> 00:00:50,960
you're not going to have a computer in your pocket,

19
00:00:51,039 --> 00:00:53,840
you need to get over this dyslexia thing. And as

20
00:00:53,880 --> 00:00:55,799
it turns out, I do have a computer in my pocket,

21
00:00:55,840 --> 00:00:57,679
and no, I still do not know how to spell all that well.

22
00:00:59,119 --> 00:01:01,439
We're fine. The skill set evolved, but it's

23
00:01:01,439 --> 00:01:02,679
gonna be okay everybody.

24
00:01:03,039 --> 00:01:06,000
Speaker 2: The same thing happened with calculators as well. But I'll

25
00:01:06,000 --> 00:01:08,280
say more about that at the end of the episode.

26
00:01:09,040 --> 00:01:13,400
Speaker 1: Right on, Hi, Jillian, welcome. Hello, all right, this is

27
00:01:13,439 --> 00:01:16,120
going to be a cool conversation. Joining us today, we

28
00:01:16,239 --> 00:01:20,560
have the founder and CEO of Warp the Warp Terminal,

29
00:01:20,920 --> 00:01:24,239
Zach Lloyd. Zach, welcome, I'm excited to be here.

30
00:01:24,280 --> 00:01:25,079
Speaker 3: Thanks for having me.

31
00:01:25,599 --> 00:01:28,239
Speaker 1: I'm excited to have you on here. And just to

32
00:01:29,079 --> 00:01:33,000
pick your brain about this because I first saw the

33
00:01:33,040 --> 00:01:37,599
Warp Terminal. It's been several years now, so you've been

34
00:01:37,640 --> 00:01:41,000
working on this for a while, and it was just like,

35
00:01:41,879 --> 00:01:44,079
at first, it was so confusing to me because I

36
00:01:44,120 --> 00:01:46,719
was like, wait, this isn't what my terminal's supposed to do.

37
00:01:46,879 --> 00:01:50,280
It's it's like offering up stuff like how do I

38
00:01:50,359 --> 00:01:50,920
trust this?

39
00:01:51,079 --> 00:01:53,239
Speaker 2: So before we dig into that.

40
00:01:53,239 --> 00:01:56,799
Speaker 1: Tell us, tell our listeners, a little bit about Warp

41
00:01:56,879 --> 00:01:59,079
and what it does.

42
00:01:59,120 --> 00:02:05,519
Speaker 5: Yeah, so Warp is a reimagination of the terminal.

43
00:02:06,599 --> 00:02:09,000
You can use it like a regular terminal, so you

44
00:02:09,120 --> 00:02:11,639
drop it in and use it in place of I

45
00:02:11,639 --> 00:02:13,560
don't know, whatever you're currently using — if you're on a Mac,

46
00:02:13,639 --> 00:02:16,919
iTerm or just the stock Terminal app. The idea

47
00:02:17,000 --> 00:02:20,240
behind it is that it has a much more sort

48
00:02:20,280 --> 00:02:24,400
of user friendly user experience, so you know, basic stuff

49
00:02:24,439 --> 00:02:30,400
like the mouse works, for instance, but it's also increasingly

50
00:02:30,879 --> 00:02:35,240
it's about being intelligent. And so when you use Warp,

51
00:02:35,319 --> 00:02:37,680
the main distinguishing thing these days is that you don't

52
00:02:37,719 --> 00:02:40,879
have to enter commands, so you can just instruct the

53
00:02:40,960 --> 00:02:43,840
terminal in English, tell it what you want to do,

54
00:02:43,960 --> 00:02:46,520
and it will sort of solve your problem for you

55
00:02:46,560 --> 00:02:51,879
by translating your wishes into commands using AI, and it

56
00:02:51,960 --> 00:02:54,560
looks up whatever context it needs and kind of guides

57
00:02:54,560 --> 00:02:57,319
you through whatever task you're doing, whether it's a coding

58
00:02:57,360 --> 00:03:00,479
task or a DevOps task or setting up a new project.

59
00:03:00,599 --> 00:03:04,120
So it's a totally different way of using the command

60
00:03:04,159 --> 00:03:07,159
line that I think is pretty fun to use

61
00:03:07,240 --> 00:03:11,080
and definitely more powerful than your standard terminal. And like

62
00:03:11,400 --> 00:03:13,400
we're kind of having an internal debate at this point

63
00:03:13,439 --> 00:03:15,199
about whether or not it's even right to call it

64
00:03:15,240 --> 00:03:19,960
a terminal because it's so fundamentally different from what you

65
00:03:20,000 --> 00:03:22,240
know that people expect when they use a terminal, but

66
00:03:22,639 --> 00:03:25,120
it does work. It's like I think a really really

67
00:03:25,599 --> 00:03:27,039
nice to use terminal as well.

68
00:03:29,039 --> 00:03:32,120
Speaker 1: Yeah, for sure, Like the terminal features are definitely all

69
00:03:32,599 --> 00:03:35,039
right there and ready to go, and then it just keeps.

70
00:03:36,199 --> 00:03:39,240
I think a really cool way to get used

71
00:03:39,240 --> 00:03:41,520
to it is to just drop it in as your replacement terminal,

72
00:03:42,120 --> 00:03:44,240
and then you can start picking and choosing like all

73
00:03:44,240 --> 00:03:48,159
of these other things that it has as you, as

74
00:03:48,199 --> 00:03:49,719
you get comfortable with it.

75
00:03:50,960 --> 00:03:52,560
Speaker 4: I want to say I really like that it uses

76
00:03:52,599 --> 00:03:54,479
the mouse because I have like a bit of a

77
00:03:54,520 --> 00:03:56,879
horror story of trying to get somebody set up with Vim,

78
00:03:56,960 --> 00:03:58,960
and I felt like very proud of myself, like oh

79
00:03:58,960 --> 00:04:01,599
look, I got the scientist using Vim, and then they

80
00:04:01,599 --> 00:04:03,439
were like, great, how do I use a mouse? And

81
00:04:03,479 --> 00:04:06,560
I was like, oh no, So I think I think

82
00:04:06,599 --> 00:04:07,400
that's a nice feature.

83
00:04:07,879 --> 00:04:09,400
Speaker 5: The other thing that it will help you do is

84
00:04:09,520 --> 00:04:12,960
figure out how to quit Vim if you end up in Vim.

85
00:04:15,159 --> 00:04:16,879
Speaker 4: It's not what we're trying to do here.

86
00:04:17,160 --> 00:04:20,160
Speaker 5: Which is — it's one of our most popular features:

87
00:04:20,240 --> 00:04:22,000
you can ask the AI how to quit Vim.

88
00:04:22,040 --> 00:04:24,439
Speaker 3: It's very funny because people people do end up in

89
00:04:24,480 --> 00:04:26,759
there and they're like, what, oh.

90
00:04:26,680 --> 00:04:27,959
Speaker 4: You mean, like quit the application?

91
00:04:28,120 --> 00:04:30,000
Speaker 3: Yeah, like quit the application, not —

92
00:04:29,920 --> 00:04:34,600
Speaker 4: like quit the addiction. Okay. No, people love Vim.

93
00:04:34,759 --> 00:04:38,519
Speaker 1: Now there's a twelve-step program for that — Warp, it is.

94
00:04:39,120 --> 00:04:40,879
Speaker 4: They need a new one. They need twenty steps.

95
00:04:42,279 --> 00:04:46,920
Speaker 1: Cool. So, how long have you been building Warp?

96
00:04:46,959 --> 00:04:50,920
Speaker 5: We've been at it for a while, so started the

97
00:04:51,000 --> 00:04:54,399
company during COVID, so like the middle of

98
00:04:54,439 --> 00:05:01,480
twenty twenty, and we first launched something publicly in twenty

99
00:05:01,879 --> 00:05:07,480
twenty one. And it's just sort of evolved from something

100
00:05:07,480 --> 00:05:11,160
where the main value initially was, hey, let's make this

101
00:05:11,360 --> 00:05:14,800
tool a little bit easier to use and like fix

102
00:05:14,839 --> 00:05:18,560
some of the UX, into something that is much richer,

103
00:05:18,600 --> 00:05:21,959
especially when ChatGPT came out, and we

104
00:05:21,959 --> 00:05:23,600
were even doing some AI stuff before that.

105
00:05:23,680 --> 00:05:26,079
Speaker 3: But we've been working on it for a while now.

106
00:05:27,439 --> 00:05:34,560
Speaker 1: Right on. What's the thought process that goes

107
00:05:34,680 --> 00:05:39,800
into figuring out how to integrate AI into this?

108
00:05:42,360 --> 00:05:44,920
Speaker 3: Yeah, so we went through a bunch of different stages.

109
00:05:45,000 --> 00:05:48,079
Speaker 5: So the first sort of stage of AI

110
00:05:48,120 --> 00:05:53,519
in Warp was essentially translate English into a command,

111
00:05:53,920 --> 00:05:56,560
so you could bring up this little thing and it

112
00:05:56,600 --> 00:06:01,720
actually predated ChatGPT. It used something called Codex, which was,

113
00:06:01,920 --> 00:06:04,680
I think, an OpenAI coding API, and you

114
00:06:04,720 --> 00:06:08,800
could be like, you know, search my files for this

115
00:06:08,879 --> 00:06:11,759
specific term, and it might generate like a find command

116
00:06:12,040 --> 00:06:14,600
or a grep command, something like that, and it's very

117
00:06:14,680 --> 00:06:17,279
much a one-to-one English-to-command translation.
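
A minimal sketch of what that one-to-one translation looks like in practice — the request wording and the exact flags here are illustrative assumptions, not Warp's actual output:

    # English: "search my files for the term 'timeout'"
    # One plausible generated command (recursive, line numbers, case-insensitive):
    grep -rni "timeout" .
    # If the request were about file names rather than contents:
    find . -type f -name "*timeout*"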

118
00:06:18,040 --> 00:06:24,519
Speaker 3: The next thing that we did was, when ChatGPT

119
00:06:24,399 --> 00:06:25,839
Speaker 5: Came out, we did what I think a lot of

120
00:06:25,879 --> 00:06:27,839
apps did at that time, which was like put a

121
00:06:27,959 --> 00:06:30,399
chat panel into warp and so you could have a

122
00:06:30,439 --> 00:06:33,800
sort of chat panel on the side where you know,

123
00:06:33,879 --> 00:06:36,160
you could ask coding questions. You could be like, how

124
00:06:36,160 --> 00:06:38,800
do I set up you know, a new Python repo

125
00:06:38,920 --> 00:06:40,639
with these dependencies, and we give it to you as

126
00:06:40,639 --> 00:06:43,279
a chat and then it's sort of like a copy

127
00:06:43,279 --> 00:06:45,920
paste type experience where you would take what was in

128
00:06:46,000 --> 00:06:48,279
the chat and move it into the terminal. And that

129
00:06:48,439 --> 00:06:50,959
was cool but, I would say, kind of limited

130
00:06:51,399 --> 00:06:53,040
extra utility compared.

131
00:06:52,720 --> 00:06:54,600
Speaker 3: to just doing it in ChatGPT.

132
00:06:56,360 --> 00:07:01,199
Speaker 5: The biggest change that we made was basically the idea

133
00:07:01,279 --> 00:07:06,240
that the terminal input where people type commands also could

134
00:07:06,319 --> 00:07:10,839
be used directly as a conversational input to work with

135
00:07:10,920 --> 00:07:15,000
an AI, and that the AI itself would end up

136
00:07:15,040 --> 00:07:17,480
like, sort of interspersed in the terminal session.

137
00:07:17,480 --> 00:07:19,079
Speaker 3: And we call this agent mode.

138
00:07:19,439 --> 00:07:22,160
Speaker 5: And so in this world, it's not just that you

139
00:07:22,360 --> 00:07:25,199
chat with it, it's that you tell it what to do,

140
00:07:25,839 --> 00:07:29,560
and it's able on its own to invoke commands to

141
00:07:30,040 --> 00:07:33,199
kind of gather the context that it needs to help

142
00:07:33,240 --> 00:07:35,160
you do a thing. So, for instance, if I was like,

143
00:07:35,800 --> 00:07:37,720
go back to that same example, like help me set

144
00:07:37,759 --> 00:07:41,319
up a Python repo with these dependencies, instead of doing

145
00:07:41,319 --> 00:07:42,920
it in a chat panel, which we got rid of,

146
00:07:43,240 --> 00:07:46,120
you just type that into the terminal input and we.

147
00:07:46,160 --> 00:07:48,399
Speaker 3: Detect that you're typing English and not a command.

148
00:07:48,879 --> 00:07:52,279
Speaker 5: And when you hit enter, it follows up and says like, okay,

149
00:07:52,399 --> 00:07:54,319
like, what directory do you want this in? And you

150
00:07:54,399 --> 00:07:57,360
tell it what directory, and then it'll

151
00:07:57,439 --> 00:08:00,480
make the directory for you, it'll cd into it, create

152
00:08:00,519 --> 00:08:04,560
the git repo, it'll do all the pip installs, it will

153
00:08:04,959 --> 00:08:07,519
even generate the initial scaffolding of the code.

154
00:08:07,839 --> 00:08:10,120
Speaker 3: If it hits an error, it can debug its own error.

155
00:08:10,439 --> 00:08:13,519
Speaker 5: And all of this is happening within your terminal session.
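
A hedged sketch of the kind of command sequence an agent run like that might execute — the project name and dependencies below are made up for illustration, and the actual commands Warp generates will differ:

    mkdir my-project && cd my-project            # create and enter the directory
    git init                                     # create the git repo
    python3 -m venv .venv && source .venv/bin/activate   # isolated environment
    pip install requests pandas                  # hypothetical dependencies
    pip freeze > requirements.txt                # record them
    # ...plus initial scaffolding, e.g. a main.py and a README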

156
00:08:13,560 --> 00:08:15,720
And so you know, you get to a point where

157
00:08:15,720 --> 00:08:19,319
it's like you're actually driving the terminal a little bit

158
00:08:19,319 --> 00:08:22,560
more in English than you are in commands, and it's

159
00:08:22,639 --> 00:08:24,959
kind of crazy how it's changing how people use

160
00:08:24,959 --> 00:08:28,240
the terminal. Like I was just looking at this yesterday,

161
00:08:28,319 --> 00:08:31,480
like in Warp now, like a quarter of what is

162
00:08:31,519 --> 00:08:34,279
going on in the terminal sessions is actually just English

163
00:08:34,360 --> 00:08:36,200
and AI generating commands.

164
00:08:35,840 --> 00:08:38,919
Speaker 3: and not people typing cd and ls anymore.

165
00:08:39,039 --> 00:08:41,159
Speaker 5: So that was the sort of evolution, so from a

166
00:08:41,320 --> 00:08:43,799
very bolt on thing to something where it's like the

167
00:08:43,799 --> 00:08:47,960
actual fundamental experience of how you use the tools has

168
00:08:48,039 --> 00:08:48,720
changed a bunch.

169
00:08:50,840 --> 00:08:53,840
Speaker 1: Yeah, so you're completely changing the interaction there. Instead of

170
00:08:53,879 --> 00:08:56,440
saying "how do I," just saying "go do it."

171
00:08:57,000 --> 00:09:02,600
Speaker 5: Exactly, exactly. And that actually takes — like, developers don't necessarily

172
00:09:02,639 --> 00:09:05,919
think to do that. They're very much in the like, okay,

173
00:09:06,639 --> 00:09:08,679
let me google this, let me go to stack overflow

174
00:09:09,639 --> 00:09:12,960
type of mindset, and it's a totally new behavior if

175
00:09:12,960 --> 00:09:14,679
you're a developer to just be like I'm just going

176
00:09:14,720 --> 00:09:17,840
to tell the computer what to do. It's a little

177
00:09:17,879 --> 00:09:20,960
bit scary because like, what's your terminal and it's like

178
00:09:20,960 --> 00:09:23,120
now the computer is just like doing stuff in your terminal.

179
00:09:23,559 --> 00:09:29,639
But I do think that's the future of how development, DevOps,

180
00:09:29,720 --> 00:09:31,080
whatever you're doing as a developer.

181
00:09:31,120 --> 00:09:32,159
Speaker 3: It's going to move from this.

182
00:09:32,200 --> 00:09:34,480
Speaker 5: Like let me run a bunch of queries or let

183
00:09:34,519 --> 00:09:36,159
me open up a bunch of files and hand-

184
00:09:36,159 --> 00:09:38,000
edit things, to a world where you're just sort

185
00:09:38,039 --> 00:09:42,440
of like, hey, let me actually tell my smart AI

186
00:09:42,960 --> 00:09:45,639
whatever you want to call it, assistant agent whatever, to

187
00:09:45,759 --> 00:09:46,879
start me on this task.

188
00:09:47,080 --> 00:09:49,440
Speaker 3: And, you know, the

189
00:09:49,159 --> 00:09:55,320
Speaker 5: agent will loop me in, get more info, you know,

190
00:09:56,399 --> 00:09:58,840
leverage me when there's ambiguity to resolve. But it's

191
00:09:58,879 --> 00:10:01,200
like going to be an imperative, "I'm telling

192
00:10:01,200 --> 00:10:04,799
it what to do" way of working. And the

193
00:10:04,879 --> 00:10:08,080
cool thing about the terminal for doing that is like

194
00:10:08,840 --> 00:10:10,840
that's kind of what the terminal is set up for.

195
00:10:11,000 --> 00:10:12,879
If you think about it, it's like the terminal is

196
00:10:12,919 --> 00:10:15,519
set up for users to tell the computer what to do.

197
00:10:16,120 --> 00:10:18,960
It's just that we're like upping the level of abstraction

198
00:10:19,159 --> 00:10:22,120
from you telling it in terms of, like, grep and

199
00:10:22,200 --> 00:10:25,159
find and cd and ls, to telling it at the level of

200
00:10:25,200 --> 00:10:25,519
like a.

201
00:10:25,519 --> 00:10:28,200
Speaker 3: task, what you want it to do. And so that's the

202
00:10:28,240 --> 00:10:31,679
Speaker 5: vision that we're building towards.

203
00:10:31,799 --> 00:10:33,840
Speaker 1: Right on. I think it's a really great analogy, you know, because

204
00:10:33,840 --> 00:10:36,159
we've seen that in other areas of software development, where

205
00:10:36,200 --> 00:10:40,879
you just keep abstracting things away more and more, yep,

206
00:10:41,360 --> 00:10:44,799
and coding at a higher level. But this is one

207
00:10:44,840 --> 00:10:49,080
of the few projects where you're actually doing that outside

208
00:10:49,120 --> 00:10:52,799
of coding, doing it at, like, the task level rather

209
00:10:52,840 --> 00:10:53,960
than at the coding level.

210
00:10:54,519 --> 00:10:59,080
Speaker 5: Correct. And, like, we are — so you can

211
00:10:59,120 --> 00:11:02,559
code in Warp. I don't know — did you all

212
00:11:02,600 --> 00:11:05,240
see Claude Code? Have you played with that at all?

213
00:11:06,159 --> 00:11:09,080
Speaker 4: I have, a little. Yeah.

214
00:11:08,960 --> 00:11:12,519
Speaker 5: So Claude Code is super interesting from our perspective because it's, uh,

215
00:11:12,639 --> 00:11:15,519
it's all terminal based, and it's all this imperative like

216
00:11:16,120 --> 00:11:19,159
you run a terminal program, you tell

217
00:11:19,320 --> 00:11:22,200
Claude Code, like, hey, you know, make this change for me,

218
00:11:24,159 --> 00:11:28,159
and it skips the file editor and IDE entirely to

219
00:11:28,159 --> 00:11:31,600
do coding stuff. And so we also have a very

220
00:11:31,639 --> 00:11:35,000
similar feature in Warp, except that you

221
00:11:35,039 --> 00:11:37,360
don't run a program within the terminal, You just tell

222
00:11:37,440 --> 00:11:40,240
the terminal what to do. But I think it's interesting

223
00:11:40,519 --> 00:11:43,519
in terms of like the types of tasks that you

224
00:11:43,559 --> 00:11:47,960
can do and if you even look at like have

225
00:11:48,039 --> 00:11:50,639
you all used Cursor and Windsurf, those types of apps,

226
00:11:50,720 --> 00:11:51,360
to do any coding?

227
00:11:52,639 --> 00:11:53,399
Speaker 1: Yeah a little bit.

228
00:11:54,159 --> 00:11:57,559
Speaker 5: So yeah in those apps, Like the sort of initial

229
00:11:57,600 --> 00:11:59,440
feature that was like the magic feature and this is

230
00:11:59,440 --> 00:12:05,240
true for GitHub Copilot too, was that it will do great

231
00:12:05,279 --> 00:12:07,039
code completions for you. So it gives you this ghost

232
00:12:07,120 --> 00:12:09,840
text as you're typing and it sort of completes

233
00:12:09,879 --> 00:12:13,399
your thought. And the sort of thing that they're building

234
00:12:13,399 --> 00:12:16,399
out now is also it's much more like a chat

235
00:12:16,440 --> 00:12:21,000
panel within those apps, where you can tell the computer

236
00:12:21,080 --> 00:12:23,399
what to do and it generates code diffs, and they're

237
00:12:23,440 --> 00:12:28,039
creating something that looks an awful lot like a terminal interaction, but.

238
00:12:28,039 --> 00:12:28,960
Speaker 3: Within the code editor.

239
00:12:29,000 --> 00:12:32,000
Speaker 5: And so I do think there's this general shift that's

240
00:12:32,039 --> 00:12:35,159
going on for coding, and I think it's also going

241
00:12:35,200 --> 00:12:39,639
to really impact people who are doing production DevOps basically

242
00:12:39,679 --> 00:12:43,600
any type of interaction with systems where you just sort

243
00:12:43,600 --> 00:12:46,159
of start by telling the computer what to do somehow.

244
00:12:46,759 --> 00:12:49,159
So it's pretty neat, pretty neat to see.

245
00:12:50,639 --> 00:12:53,360
Speaker 4: So I really like this because I spend a lot

246
00:12:53,399 --> 00:12:56,840
of my days trying to convince biologists that, like, you

247
00:12:56,919 --> 00:12:58,600
need to be able to use the terminal at least

248
00:12:58,639 --> 00:13:00,679
a little bit, and it's always a tough sell

249
00:13:00,879 --> 00:13:02,720
because being like, well I'll go over here and take

250
00:13:02,759 --> 00:13:05,879
this Linux class is like not not what they want

251
00:13:05,919 --> 00:13:08,200
to be doing. Let's say, so just being able to

252
00:13:08,200 --> 00:13:10,720
say what you want just in English, and it will at

253
00:13:10,759 --> 00:13:13,080
least get you to the directory and install your Python

254
00:13:13,240 --> 00:13:16,080
environment and do this kind of stuff is just so

255
00:13:16,200 --> 00:13:19,159
much nicer than what I've been doing in the past,

256
00:13:19,440 --> 00:13:20,960
and I like this. This is great.

257
00:13:21,960 --> 00:13:24,639
Speaker 5: Yeah, I mean it's it's the other cool thing for

258
00:13:24,720 --> 00:13:28,879
people for whom it's not their natural environment, let's say,

259
00:13:28,919 --> 00:13:30,240
and like they have to use it.

260
00:13:30,279 --> 00:13:30,600
Speaker 3: Is that.

261
00:13:32,080 --> 00:13:34,360
Speaker 5: as you use Warp to do this stuff, it teaches you.

262
00:13:34,440 --> 00:13:37,960
So it doesn't just do it for you opaquely, at

263
00:13:38,039 --> 00:13:40,519
least for now. The way it does it is like

264
00:13:40,919 --> 00:13:42,799
you type in like, hey, I want to create this project,

265
00:13:42,799 --> 00:13:45,000
and it says something back to you like, Okay, here

266
00:13:45,000 --> 00:13:47,120
are the commands that need to be run in order

267
00:13:47,120 --> 00:13:48,919
to create this project. Are you cool if I run

268
00:13:48,960 --> 00:13:52,360
these commands? And so, Warren, to your earlier point,

269
00:13:52,360 --> 00:13:54,399
like is this just making this all like kind of

270
00:13:54,480 --> 00:13:56,919
dumber and not knowing how to do anything? It's possible,

271
00:13:57,240 --> 00:14:00,360
But there is also an aspect of like it's kind

272
00:14:00,360 --> 00:14:02,480
of like working with like the smart person on your

273
00:14:02,480 --> 00:14:04,440
team who can show you how to do things, and like,

274
00:14:05,080 --> 00:14:07,159
you know, hopefully you pick it up because it is

275
00:14:07,559 --> 00:14:10,240
it is in some ways faster if you know what

276
00:14:10,279 --> 00:14:13,879
you're doing, just type the commands. And I think in general,

277
00:14:13,960 --> 00:14:18,159
like I don't think it's a great outcome if everyone

278
00:14:18,159 --> 00:14:20,159
who's doing development or working in the terminal doesn't know

279
00:14:20,159 --> 00:14:22,799
what the hell is going on, because inevitably you're going

280
00:14:22,879 --> 00:14:25,159
to get to some point where you kind of need

281
00:14:25,200 --> 00:14:28,919
to know in order to fix something. And so you

282
00:14:28,960 --> 00:14:30,960
know that the hope is that this doesn't make people

283
00:14:31,000 --> 00:14:33,519
dumb, and that it makes people more proficient, but there is,

284
00:14:33,559 --> 00:14:35,200
I think, a risk for sure.

285
00:14:35,240 --> 00:14:37,559
Speaker 2: There's actually two things that this reminds me of

286
00:14:37,600 --> 00:14:39,919
a lot. And the first one is a long time ago,

287
00:14:39,960 --> 00:14:42,600
and I don't know how well it's maintained, but there

288
00:14:42,679 --> 00:14:45,200
was a program that you could install into your terminal

289
00:14:45,279 --> 00:14:46,200
called fuck.

290
00:14:46,840 --> 00:14:50,039
Speaker 3: Yeah no, No, we've partnered with that exactly.

291
00:14:50,600 --> 00:14:53,200
Speaker 2: You've never seen this before? Something that actually

292
00:14:53,279 --> 00:14:56,600
happens sort of often is that a command line program

293
00:14:56,600 --> 00:14:59,360
you run will tell you sort of what you did

294
00:14:59,399 --> 00:15:02,279
wrong in a way like did you mean this, And

295
00:15:02,360 --> 00:15:04,720
instead of having to like retype the command and fix

296
00:15:04,759 --> 00:15:07,120
the problem, you could just type fuck and it would

297
00:15:07,159 --> 00:15:10,960
read the output and then do that thing. And that's

298
00:15:11,000 --> 00:15:12,759
the first one. So if you haven't seen that, I

299
00:15:12,840 --> 00:15:14,840
highly recommend at least, you know, checking that out. And

300
00:15:14,879 --> 00:15:17,480
the other one is this thing that totally changed how

301
00:15:17,519 --> 00:15:22,360
I use the terminal for doing software development, for interacting

302
00:15:22,360 --> 00:15:24,840
with Git repositories: there's actually a Git configuration that

303
00:15:24,879 --> 00:15:29,360
you can set up to automatically fix typos. So if

304
00:15:29,360 --> 00:15:31,879
you type something wrong, it will swap the letters around

305
00:15:31,919 --> 00:15:35,000
and be like, oh, okay, you probably meant this with

306
00:15:35,039 --> 00:15:37,639
a ninety nine percent accuracy, and then just do that

307
00:15:37,679 --> 00:15:40,120
command anyway. And you can also set a time out,

308
00:15:40,159 --> 00:15:42,440
like you know, if you accidentally type something and it's

309
00:15:42,480 --> 00:15:45,159
gonna start deleting all of your code base, you can

310
00:15:45,279 --> 00:15:47,840
be like, oh, wait, no, I don't want you to

311
00:15:47,840 --> 00:15:50,080
do that. But that actually brings me to a question

312
00:15:50,120 --> 00:15:52,840
I want to ask, which is I see more and

313
00:15:52,919 --> 00:15:55,759
more of these pieces of software I'll call them agents

314
00:15:55,840 --> 00:15:58,960
that are interacting with your operating system directly, and for me,

315
00:15:59,080 --> 00:16:01,720
like, I'm super risk-averse. I want

316
00:16:01,720 --> 00:16:04,960
to keep every LLM, or non-thinking creature, in its

317
00:16:05,000 --> 00:16:08,279
own private box where it can't accidentally delete like my

318
00:16:08,559 --> 00:16:13,000
entire operating system, because that's what I thought I wanted

319
00:16:13,000 --> 00:16:13,240
to know.

320
00:16:14,720 --> 00:16:20,759
Speaker 5: It's just like, why trust the agent, right? Yeah,

321
00:16:20,799 --> 00:16:25,879
go ahead — good point, I think. Yeah.

322
00:16:24,840 --> 00:16:26,639
Speaker 3: Like, so how do you manage this? Is that

323
00:16:26,679 --> 00:16:27,080
the question?

324
00:16:27,240 --> 00:16:30,039
Speaker 2: Or yeah, I mean, it's just it's almost like I

325
00:16:30,080 --> 00:16:32,720
would want to run like two computers side by side

326
00:16:32,919 --> 00:16:35,799
one of them. I mean, I already am really concerned

327
00:16:35,799 --> 00:16:39,240
about running external software on my machine from, like,

328
00:16:39,240 --> 00:16:43,240
a malicious standpoint. Very rarely will it break my

329
00:16:43,279 --> 00:16:45,639
operating system. I don't remember the last time it happened.

330
00:16:45,639 --> 00:16:47,399
It was probably when I was using Windows, like over

331
00:16:47,440 --> 00:16:48,200
a decade ago.

332
00:16:48,840 --> 00:16:48,960
Speaker 5: Uh.

333
00:16:49,360 --> 00:16:52,279
Speaker 2: But when it comes to LLMs and things like that,

334
00:16:52,440 --> 00:16:55,679
I know from firsthand experience, sometimes it's like there's a

335
00:16:55,759 --> 00:16:58,200
non zero chance that it just figures out the wrong

336
00:16:58,240 --> 00:17:01,519
thing to do. And and like that's the sort of

337
00:17:01,559 --> 00:17:03,799
thing that I almost want to sandbox as much as possible,

338
00:17:03,840 --> 00:17:05,720
and I feel like we're not getting closer to that

339
00:17:05,759 --> 00:17:08,519
because our operating systems don't allow it as much.

340
00:17:09,599 --> 00:17:12,359
Speaker 5: So it's a great point. I mean, you have a

341
00:17:12,400 --> 00:17:14,960
couple of choices. Let's say you're using Warp. So,

342
00:17:15,079 --> 00:17:16,880
one you can just turn this stuff off, like if

343
00:17:16,880 --> 00:17:19,880
you're just like I don't trust that, I don't want it.

344
00:17:19,119 --> 00:17:19,960
Speaker 3: So that's fair.

345
00:17:20,039 --> 00:17:22,880
Speaker 5: There's a toggle that just says AI

346
00:17:23,079 --> 00:17:27,000
off and like that's it. You're back to, like you know,

347
00:17:27,039 --> 00:17:31,160
you're in control. There's also — you

348
00:17:31,480 --> 00:17:34,480
can control the level of autonomy it has. So

349
00:17:34,559 --> 00:17:37,160
one of the levels that you could have

350
00:17:37,319 --> 00:17:40,319
is just like it can't do anything on its own,

351
00:17:40,400 --> 00:17:46,079
so it can suggest commands, so you can then manually

352
00:17:46,079 --> 00:17:49,839
approve anything it suggests. There's a level up from that,

353
00:17:49,880 --> 00:17:51,480
which is like you can kind of provide like an

354
00:17:51,519 --> 00:17:53,720
allow list and a deny list. It could be like, oh,

355
00:17:53,799 --> 00:17:55,680
it's fine, it can run cat, it can run less, it

356
00:17:56,640 --> 00:17:59,799
can't run rm. You can go a level up

357
00:17:59,799 --> 00:18:01,880
from that, which is like, I want to be able to

358
00:18:01,920 --> 00:18:05,920
run read-only commands and let an LLM determine

359
00:18:05,920 --> 00:18:07,920
what it thinks is a read-only command, which it's

360
00:18:07,920 --> 00:18:10,440
pretty damn good at but not perfect. Like if you

361
00:18:10,480 --> 00:18:12,880
had some crazy piped thing or, like, a heredoc

362
00:18:13,000 --> 00:18:15,519
or something like that, it might it might get confused,

363
00:18:15,599 --> 00:18:17,000
but it's pretty good.

364
00:18:17,400 --> 00:18:21,119
Speaker 3: Or you could be like, you know, YOLO, like,

365
00:18:20,480 --> 00:18:23,400
Speaker 5: I just wanna — it's not that big of a

366
00:18:23,400 --> 00:18:25,920
deal if that messes up my git repo or whatever,

367
00:18:25,960 --> 00:18:28,839
and I'm gonna let it run. And then the other

368
00:18:28,839 --> 00:18:30,839
thing that we're working on that we don't have yet

369
00:18:30,880 --> 00:18:32,720
but I think is really important in this world of

370
00:18:32,720 --> 00:18:35,839
like more autonomy is: what's the fastest way to

371
00:18:35,880 --> 00:18:36,400
spin up

372
00:18:36,319 --> 00:18:37,839
Speaker 3: a sandbox where

373
00:18:39,400 --> 00:18:42,559
Speaker 5: you know, whatever state you want it working on

374
00:18:42,680 --> 00:18:44,720
is replicated and it can just go to work there

375
00:18:44,880 --> 00:18:49,039
without you losing any sleep that it's gonna do something irreparable.
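
As a rough illustration of the allow list / deny list idea described above — this is a generic sketch, not Warp's actual configuration format or behavior:

    # Hypothetical gate for an AI-suggested command
    ALLOW="cat less ls grep find"          # first words that may auto-run
    DENY="rm dd mkfs shutdown"             # first words that are always blocked
    suggested='grep -rn "TODO" src/'
    first_word=${suggested%% *}
    if echo "$DENY" | grep -qw "$first_word"; then
        echo "blocked: $suggested"
    elif echo "$ALLOW" | grep -qw "$first_word"; then
        eval "$suggested"                  # auto-approve
    else
        read -p "Run '$suggested'? [y/N] " ok && [ "$ok" = y ] && eval "$suggested"
    fi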

376
00:18:50,279 --> 00:18:53,000
I think an undo functionality is super interesting too.

377
00:18:53,079 --> 00:18:56,960
It's not like trivial to do that in the terminal.

378
00:18:57,000 --> 00:19:01,279
Like the terminal is a stateful place where you know,

379
00:19:01,920 --> 00:19:02,480
you can delete

380
00:19:02,480 --> 00:19:04,480
Speaker 3: Files and there's no like undo.

381
00:19:05,079 --> 00:19:07,400
Speaker 5: Uh so you kind of got to figure out, like

382
00:19:07,400 --> 00:19:09,440
like, a sandbox is sort of the safest. But we're

383
00:19:09,799 --> 00:19:14,359
aware of this issue, and it makes sense.

384
00:19:14,680 --> 00:19:16,839
A surprising number of people don't give a shit, I

385
00:19:16,880 --> 00:19:22,000
will say, like they're just like this thing is just magic,

386
00:19:22,079 --> 00:19:24,400
and like I just it makes me so much faster

387
00:19:24,519 --> 00:19:27,000
and makes my life so much more fun that

388
00:19:27,119 --> 00:19:27,920
I don't really care.

389
00:19:28,640 --> 00:19:30,759
Speaker 3: But it's a totally fair point. I would hope they're

390
00:19:30,759 --> 00:19:32,160
not using this in NASA.

391
00:19:31,920 --> 00:19:34,480
Speaker 2: And I'm like, you know, well, I think I think

392
00:19:34,519 --> 00:19:40,240
yeah, not yet, right? But probably. Honestly,

393
00:19:41,279 --> 00:19:43,160
I have some theories there, but I think if I

394
00:19:43,200 --> 00:19:49,519
say them I will definitely get canceled. So, uh, yeah, I

395
00:19:49,559 --> 00:19:51,519
think that's sort of the problem. And I think this

396
00:19:51,599 --> 00:19:53,759
is again I don't want to spoil my pick, but

397
00:19:54,640 --> 00:19:58,000
realistically it's that a large majority of the population falls

398
00:19:58,039 --> 00:20:02,559
into this area of, maybe they have concerns, but they're, uh,

399
00:20:02,599 --> 00:20:07,680
apathetic to actually turning off whatever the source of

400
00:20:07,680 --> 00:20:10,119
the potential problem is. There's not a good way to

401
00:20:10,240 --> 00:20:13,839
moderate AI, or LLMs, from outside the

402
00:20:14,359 --> 00:20:17,359
black box. It's really all or nothing in

403
00:20:17,400 --> 00:20:19,680
a lot of ways, and most people are not going

404
00:20:19,720 --> 00:20:22,640
to turn it off because they still perceive some huge

405
00:20:22,680 --> 00:20:26,720
amount of value from utilizing them. And so, you know,

406
00:20:27,039 --> 00:20:28,799
I'm not going to turn off the future. I'm just

407
00:20:28,799 --> 00:20:31,079
going to be really scared about what it's going to

408
00:20:31,119 --> 00:20:32,599
do when I'm not looking.

409
00:20:33,400 --> 00:20:37,839
Speaker 3: Yeah, yeah, I think that that's right.

410
00:20:37,920 --> 00:20:40,599
Speaker 5: And people obviously have a strong predisposition to do whatever

411
00:20:40,599 --> 00:20:41,640
you set the default to.

412
00:20:43,119 --> 00:20:44,880
Speaker 3: They might not even, like, know what the heck is

413
00:20:44,920 --> 00:20:49,680
going on, but I don't know.

414
00:20:49,759 --> 00:20:52,640
Speaker 5: Developers are maybe a little different, Like I feel like

415
00:20:53,240 --> 00:20:55,640
if anyone's gonna go tweak the knobs, it's gonna be

416
00:20:55,759 --> 00:20:58,440
like, you know, developers, except...

417
00:20:58,519 --> 00:21:01,559
Speaker 2: I don't think so. I think everyone has their

418
00:21:01,599 --> 00:21:05,680
depth where they feel comfortable controlling, and if

419
00:21:05,720 --> 00:21:08,680
they're comfortable pulling in an LLM to solve part of

420
00:21:08,680 --> 00:21:10,359
their job or part of what they're doing, it's probably

421
00:21:10,400 --> 00:21:12,759
in an area they don't care about, and so they're

422
00:21:12,759 --> 00:21:15,480
probably not going to. I think another aspect here is

423
00:21:15,720 --> 00:21:18,200
I have a very close friend that went away on

424
00:21:18,279 --> 00:21:21,839
vacation, and the person who was cat-sitting for

425
00:21:21,880 --> 00:21:26,440
them left some plastic on the stove which was induction

426
00:21:26,960 --> 00:21:29,160
and it was totally fine, it was off, but

427
00:21:29,200 --> 00:21:31,599
one of the cats managed to turn the stove on

428
00:21:32,400 --> 00:21:37,359
and actually melted the plastic. Yeah, And so this is

429
00:21:37,400 --> 00:21:40,200
really funny though, because there was no LLM in

430
00:21:40,519 --> 00:21:46,680
there, right? The cat was fine — the

431
00:21:46,720 --> 00:21:50,279
cats were fine. The thing is, I really

432
00:21:50,279 --> 00:21:51,880
do fear that at some point

433
00:21:51,880 --> 00:21:54,480
someone's gonna put an LLM in my stove.

434
00:21:54,519 --> 00:21:57,440
It's going to happen at some point, and I don't

435
00:21:57,440 --> 00:21:59,559
think we can avoid that future. And I do fear

436
00:21:59,640 --> 00:22:01,680
that it will just turn on one day when I'm

437
00:22:01,680 --> 00:22:04,119
not here and start doing things where like I have

438
00:22:04,160 --> 00:22:07,480
no need for that, and I'm

439
00:22:08,039 --> 00:22:10,160
not thrilled about this future, but it's coming.

440
00:22:10,960 --> 00:22:14,640
Speaker 5: Kelsey Hightower had this good tweet, which was, he

441
00:22:15,039 --> 00:22:17,680
was like, I'm actively at the point where I will

442
00:22:17,720 --> 00:22:20,720
pay more to not have a smart appliance. So I

443
00:22:20,920 --> 00:22:22,599
was pretty much like, I get it, Like I don't

444
00:22:22,599 --> 00:22:24,559
need like my refrigerator having Wi.

445
00:22:24,359 --> 00:22:30,599
Speaker 3: Fi or whatever. That makes sense. On the

446
00:22:30,720 --> 00:22:33,359
LLM side, though, if you're a developer,

447
00:22:33,599 --> 00:22:35,799
Speaker 5: this might not be a popular opinion, but I

448
00:22:36,160 --> 00:22:39,880
think you're not really going to have a choice as

449
00:22:39,920 --> 00:22:42,680
a developer if you want to continue being a productive

450
00:22:42,680 --> 00:22:44,000
developer on whether or not you.

451
00:22:44,000 --> 00:22:45,200
Speaker 3: Adopt this technology.

452
00:22:45,640 --> 00:22:48,000
Speaker 5: It's kind of like being like, oh, I only want

453
00:22:48,000 --> 00:22:49,920
to work in assembler. I'm not going to use like

454
00:22:49,960 --> 00:22:53,920
a high level language. Like that's not a viable choice

455
00:22:54,119 --> 00:22:57,480
going forward. I think what you're gonna have to

456
00:22:57,559 --> 00:23:00,160
do as a developer, if you want to be productive,

457
00:23:00,920 --> 00:23:02,640
is like, learn how to use all this stuff and

458
00:23:02,799 --> 00:23:04,799
learn how to use it in a safe and productive way.

459
00:23:05,559 --> 00:23:06,400
Speaker 3: Is that unpopular?

460
00:23:08,000 --> 00:23:10,359
Speaker 2: Let's have a fight. No, let's go around.

461
00:23:10,480 --> 00:23:13,119
You know, Jillian, what do you think? Agree, disagree?

462
00:23:13,480 --> 00:23:17,079
Speaker 4: I think so. Like, I'm pretty judgmental over

463
00:23:17,279 --> 00:23:19,680
developers that don't use a debugger, so like I can

464
00:23:19,720 --> 00:23:21,480
see this kind of being just the next

465
00:23:21,480 --> 00:23:26,319
sort of iteration in that process. Yeah, because I don't

466
00:23:26,319 --> 00:23:31,480
know developers are I think at some point, like everybody's

467
00:23:31,519 --> 00:23:33,960
kind of drawn to development because everybody has the "I like

468
00:23:34,000 --> 00:23:37,160
to learn new things" disease, and writing code is

469
00:23:37,240 --> 00:23:38,839
really good for that, and then at some point you

470
00:23:38,839 --> 00:23:41,640
get really tired of it, and so then AI is

471
00:23:41,680 --> 00:23:43,799
really good for that like process when you're like, all right,

472
00:23:43,799 --> 00:23:45,720
I'm sick of having to learn the new things. Just

473
00:23:45,920 --> 00:23:47,440
I just want the AI to tell me what

474
00:23:47,440 --> 00:23:51,839
to do and then there we are. So I'm gonna

475
00:23:51,839 --> 00:23:55,400
go with mostly yes, except that I feel like I

476
00:23:55,480 --> 00:23:57,799
might get some angry responses on the Internet for that,

477
00:23:57,880 --> 00:23:59,160
so I'll hedge a little bit.

478
00:24:01,559 --> 00:24:04,680
Speaker 5: Like there is a fear and understandable fear that developers

479
00:24:04,680 --> 00:24:06,000
have that this is going to replace them.

480
00:24:06,039 --> 00:24:08,119
Speaker 3: I don't think that's even remotely true.

481
00:24:08,880 --> 00:24:12,759
Speaker 5: There's also like a thing that I've noticed, which is

482
00:24:12,799 --> 00:24:16,359
that a lot of like the more experienced, really strong

483
00:24:16,519 --> 00:24:19,839
developers on our team and who I've worked with, like

484
00:24:19,920 --> 00:24:21,599
they kind of get the least value out of it

485
00:24:21,640 --> 00:24:24,119
initially and are most likely to be like, oh, this

486
00:24:24,200 --> 00:24:26,759
is a stupid suggestion from this thing like or it's

487
00:24:26,799 --> 00:24:28,119
like creating bad code.

488
00:24:28,160 --> 00:24:31,160
Speaker 3: And so they have like a kind of anti take

489
00:24:31,240 --> 00:24:31,759
on it, but.

490
00:24:33,799 --> 00:24:36,000
Speaker 5: Eventually people get to a sort of moment with it

491
00:24:36,039 --> 00:24:38,599
where they're like, oh shit, this actually makes my life

492
00:24:38,599 --> 00:24:41,359
a lot easier and does some of the stuff that

493
00:24:41,400 --> 00:24:45,079
I find super annoying. And I think the proper

494
00:24:45,119 --> 00:24:48,279
outlook to have towards it is like, this is like

495
00:24:48,319 --> 00:24:52,559
another tool that I can use, just like if I

496
00:24:52,680 --> 00:24:55,839
master, like, sed and grep, I'm awesome

497
00:24:55,839 --> 00:24:56,440
as a developer.

498
00:24:56,480 --> 00:24:59,119
Speaker 3: I think if you could figure out how to effectively

499
00:24:59,480 --> 00:25:02,240
use the LLM, I think it just makes you better.

500
00:25:02,599 --> 00:25:05,079
I think that's like the right for now, the right

501
00:25:05,119 --> 00:25:07,480
way to look at it. Warren, what do you think?

502
00:25:08,920 --> 00:25:12,000
Speaker 2: Well, I have the opposite controversial opinion, so you know,

503
00:25:12,079 --> 00:25:16,559
I was maybe thinking about keeping my mouth shut. So

504
00:25:16,680 --> 00:25:22,440
I have this perspective that it definitely replaces inexperienced engineers.

505
00:25:23,240 --> 00:25:25,680
And so the problem with that is, and I think

506
00:25:25,720 --> 00:25:28,000
this is where the fear comes from, is that LLMs

507
00:25:28,039 --> 00:25:31,400
do not replace inexperienced engineers. People think that LLMs will

508
00:25:31,440 --> 00:25:35,759
replace inexperienced engineers and do that anyway. And I

509
00:25:35,759 --> 00:25:37,759
think we're already starting to see that happening. And the

510
00:25:37,759 --> 00:25:41,839
problem with that is you're paying money for these tools

511
00:25:42,119 --> 00:25:46,079
and you're not training your organization's people on leveling up

512
00:25:46,079 --> 00:25:48,839
their skills in these areas, and they'll become more and

513
00:25:48,880 --> 00:25:52,119
more dependent on them and definitely move away from it.

514
00:25:52,200 --> 00:25:54,440
Now, on the productivity side, I still think

515
00:25:54,519 --> 00:25:56,119
it costs way too much. I think there has to

516
00:25:56,160 --> 00:26:01,880
be a magnitudes-of-cost reduction in generating answers before this

517
00:26:02,000 --> 00:26:04,920
becomes of high value.

518
00:26:05,799 --> 00:26:07,640
Speaker 4: You mean, like, monetary costs?

519
00:26:07,279 --> 00:26:11,519
Speaker 2: Like the monetary environmental et cetera. It's still like, uh,

520
00:26:12,119 --> 00:26:14,920
none of the AI companies are making money like the

521
00:26:14,960 --> 00:26:17,359
ones that are pumping out AI. You know, OpenAI,

522
00:26:18,359 --> 00:26:19,519
I'm sure whatever.

523
00:26:21,599 --> 00:26:21,799
Speaker 3: We know.

524
00:26:21,839 --> 00:26:25,599
Speaker 2: Anthropic's not making money. We know whatever they are, they're below zero —

525
00:26:25,759 --> 00:26:29,720
like, it's negative billions of dollars per year on this.

526
00:26:29,839 --> 00:26:33,680
So, you know, that's not a sustainable model from a societal standpoint.

527
00:26:34,559 --> 00:26:36,240
There's gonna have to be something that changes. Either these

528
00:26:36,279 --> 00:26:39,559
tools will completely go away or the costs will have

529
00:26:39,599 --> 00:26:42,519
to come down. I think the last thing is that

530
00:26:42,759 --> 00:26:45,759
we find from a productivity standpoint, at least for me

531
00:26:45,880 --> 00:26:48,079
myself, and the companies that I work with,

532
00:26:48,440 --> 00:26:52,880
is that the bottleneck isn't doing more work or specifically

533
00:26:52,920 --> 00:26:55,480
writing out code or pushing that out, so the tools

534
00:26:55,519 --> 00:26:58,200
don't solve the needs that we have. It's okay for

535
00:26:58,279 --> 00:26:59,799
us to still be slow in this way or not

536
00:26:59,839 --> 00:27:03,000
be productive in this way, because that's not where our

537
00:27:03,000 --> 00:27:03,720
bottleneck is.

538
00:27:08,319 --> 00:27:15,079
Speaker 5: I disagree with almost everything you just said, but I'm gonna.

539
00:27:14,880 --> 00:27:16,799
Speaker 3: say, like, it's interesting.

540
00:27:16,920 --> 00:27:21,000
Speaker 5: It's it's interesting to have this, uh, this discussion because

541
00:27:21,000 --> 00:27:25,920
I'm so in the like AI bubble of like like

542
00:27:26,160 --> 00:27:30,920
Silicon Valley people and like AI tech companies and like,

543
00:27:30,920 --> 00:27:34,519
like the main contention that I hear amongst these, like

544
00:27:34,559 --> 00:27:36,319
the people I talk to on the investing side and on

545
00:27:36,319 --> 00:27:38,960
the AI company side, is, like, how quickly are

546
00:27:38,960 --> 00:27:41,400
we getting to AGI? And Warren is coming

547
00:27:41,400 --> 00:27:43,480
in hot with being like these things are not even.

548
00:27:43,359 --> 00:27:46,519
Speaker 2: valuable. They're not — it's not even AI. Like, I hate

549
00:27:46,519 --> 00:27:49,119
this term, that these companies are

550
00:27:50,119 --> 00:27:52,480
lying to the masses of people saying we have AI.

551
00:27:52,599 --> 00:27:55,000
All we have is transformer architecture which is able to

552
00:27:55,079 --> 00:27:57,640
you know, create LLMs, and they will always

553
00:27:57,680 --> 00:28:00,720
hallucinate And that's the ridiculous thing. Like I'm waiting for

554
00:28:00,759 --> 00:28:03,759
someone to say how OpenAI is going to recoup

555
00:28:03,799 --> 00:28:05,880
the billions of dollars they are losing every single year?

556
00:28:06,160 --> 00:28:08,839
Like where does that? Where does that change? Because money

557
00:28:08,839 --> 00:28:12,680
will run out at some point?

558
00:28:13,799 --> 00:28:15,200
Speaker 3: Oh, well — do you want to go, or do you

559
00:28:15,200 --> 00:28:16,200
want me to take that?

560
00:28:17,960 --> 00:28:19,839
Speaker 1: I'm going to jump in real quick. Then we can

561
00:28:19,880 --> 00:28:23,200
come back to that. This is good. I think

562
00:28:23,240 --> 00:28:26,000
I tend to agree with you, Zach, that there's going

563
00:28:26,079 --> 00:28:28,960
to be people who are resistant to AI. And I

564
00:28:28,960 --> 00:28:32,319
think the primary place I've seen this is people who

565
00:28:32,359 --> 00:28:40,200
really are They're really passionate and invested in their chosen language.

566
00:28:40,279 --> 00:28:40,480
Speaker 4: You know.

567
00:28:40,519 --> 00:28:42,640
Speaker 1: I think if we look at the category of people

568
00:28:42,720 --> 00:28:47,519
who will argue Go versus Rust, and they've

569
00:28:47,559 --> 00:28:50,799
pinned their career on I'm a Rust developer or I'm

570
00:28:50,799 --> 00:28:55,240
a Go developer, and so they'll try something like

571
00:28:55,319 --> 00:29:00,400
AI or any of those related tools and say, oh,

572
00:29:00,440 --> 00:29:04,119
well, it got this wrong. That's clearly why I'm

573
00:29:04,519 --> 00:29:06,319
not going to rely on this thing because it got

574
00:29:06,319 --> 00:29:10,759
this one thing wrong. And you'll get a lot of

575
00:29:10,799 --> 00:29:12,559
resistance from those people.

576
00:29:15,440 --> 00:29:15,920
Speaker 3: I think the.

577
00:29:16,440 --> 00:29:19,200
Speaker 4: AI is like another tool. I mean, I guess more

578
00:29:19,240 --> 00:29:20,920
than what you're saying with like all the money being

579
00:29:20,960 --> 00:29:23,559
spent in the environmental cost That is very valid. But

580
00:29:23,599 --> 00:29:26,599
from the tool perspective, it's like I'm already so dependent

581
00:29:26,680 --> 00:29:30,160
upon tools — like, without dictation software, PyCharm, and Vim,

582
00:29:30,200 --> 00:29:33,839
I'm completely useless. I have like zero utility to anybody

583
00:29:33,880 --> 00:29:36,759
anywhere at any time, like in a professional context, anyways

584
00:29:39,359 --> 00:29:41,200
that I am, you know. I mean I do have

585
00:29:41,279 --> 00:29:44,319
kids occasionally, I'm useful in like a human context, but

586
00:29:44,480 --> 00:29:47,519
from a professional standpoint, if I don't have

587
00:29:47,640 --> 00:29:50,359
those things, I'm not going to get any work done.

588
00:29:50,400 --> 00:29:54,039
And so AI has just become like another tool for

589
00:29:54,119 --> 00:29:55,759
me to use. And so I just see it from

590
00:29:55,799 --> 00:29:59,240
that perspective. From the money perspective, like I don't know,

591
00:29:59,279 --> 00:30:02,200
but humanity spends a bunch of money on a bunch

592
00:30:02,200 --> 00:30:04,480
of things that we don't recoup an investment from. Like

593
00:30:04,519 --> 00:30:07,079
it's just the money never actually runs out. We don't

594
00:30:07,079 --> 00:30:11,960
have a gold standard anymore. Like, there's always more —

595
00:30:12,200 --> 00:30:15,000
Like it's an arbitrary concept. There's always money.

596
00:30:17,000 --> 00:30:19,400
Speaker 1: As long as the printer companies keep making printers that

597
00:30:19,480 --> 00:30:20,160
print the money.

598
00:30:21,400 --> 00:30:22,920
Speaker 4: I mean, isn't that kind of what we're doing at

599
00:30:22,920 --> 00:30:25,960
this point? Though? Like isn't that what the governments of

600
00:30:25,960 --> 00:30:27,880
the world have sort of decided to do?

601
00:30:27,880 --> 00:30:30,880
Speaker 2: There's a secondary problem here, actually, which is that the

602
00:30:31,079 --> 00:30:33,839
energy consumption is too high. Like, even setting aside the

603
00:30:34,000 --> 00:30:37,160
environmental impacts, the energy cost is so high that people

604
00:30:37,200 --> 00:30:41,000
are now starting to have their lives affected by having

605
00:30:41,079 --> 00:30:44,480
spotty energy flow into their own appliances and

606
00:30:44,519 --> 00:30:48,559
their house lights and stoves, ovens, whatever. And that's

607
00:30:48,559 --> 00:30:53,880
happening near data centers where increased energy usage is required

608
00:30:53,920 --> 00:30:56,799
to run LLMs. So I think that problem is likely

609
00:30:56,839 --> 00:30:58,640
to get worse even if the money doesn't run out.

610
00:30:59,000 --> 00:31:01,400
Speaker 1: But if you had a smart refrigerator, it could adjust

611
00:31:01,440 --> 00:31:04,960
for that exactly.

612
00:31:05,000 --> 00:31:07,440
Speaker 4: If the things are smart, you know, then what

613
00:31:07,440 --> 00:31:09,759
do you even need the energy for? Now we're fine.

614
00:31:10,160 --> 00:31:12,279
Speaker 2: I like the perspective. I mean, it is a tool,

615
00:31:12,319 --> 00:31:14,000
for sure, And I think the thing that I see

616
00:31:14,160 --> 00:31:16,400
is that it used to be the case that you could

617
00:31:16,440 --> 00:31:19,720
type into Google and get a website that helped you

618
00:31:19,759 --> 00:31:22,160
answer the question you have. And you can't even do

619
00:31:22,160 --> 00:31:24,920
that anymore because at least that search engine has become

620
00:31:25,079 --> 00:31:28,440
utterly worthless, and so you need a replacement for that.

621
00:31:28,599 --> 00:31:33,799
And I think it's worse from an accuracy standpoint than

622
00:31:33,920 --> 00:31:36,079
Google at its best, but it's for sure better than

623
00:31:36,119 --> 00:31:39,359
Google now, and I think that's a worthwhile trade-off.

624
00:31:39,440 --> 00:31:41,680
You have to change if you're still using Google,

625
00:31:41,759 --> 00:31:44,279
or still believe that your one true programming language

626
00:31:44,519 --> 00:31:46,920
is the only one for the future. I think that's

627
00:31:47,079 --> 00:31:48,799
just a mindset which doesn't make sense.

628
00:31:48,920 --> 00:31:52,279
Speaker 1: So, Zach, you wanted to come back and answer or

629
00:31:52,319 --> 00:31:55,880
respond to the money issue.

630
00:31:56,519 --> 00:31:59,000
Speaker 5: I can't speak to the energy stuff. I can speak

631
00:31:59,039 --> 00:32:06,640
to just, like, whether it's valuable. So, for developers paying

632
00:32:06,920 --> 00:32:12,200
twenty to forty bucks a month for AI in their

633
00:32:12,319 --> 00:32:17,279
core tools, if you just think of how much development

634
00:32:17,319 --> 00:32:23,680
time costs you have to save I don't know, twenty

635
00:32:23,759 --> 00:32:26,880
minutes or something for that to be a worthwhile thing.

636
00:32:27,279 --> 00:32:30,759
Speaker 3: And that threshold has

637
00:32:30,680 --> 00:32:33,480
Speaker 5: been crossed a long time ago, in my opinion, just

638
00:32:33,480 --> 00:32:38,119
from using these tools as a user, the

639
00:32:38,160 --> 00:32:43,200
amount of like time that they save me, it's like

640
00:32:43,279 --> 00:32:45,359
a no-brainer trade-off. I don't know if anyone

641
00:32:45,440 --> 00:32:48,720
on the back end of this is making money yet

642
00:32:48,799 --> 00:32:51,200
I do know Warp — we have a positive margin

643
00:32:51,240 --> 00:32:54,440
when people pay us for AI, and so it could

644
00:32:54,480 --> 00:32:57,559
be that the model companies or the hyperscalers are just

645
00:32:57,599 --> 00:33:02,000
taking a huge loss on Warp's profit. But, you know,

646
00:33:02,079 --> 00:33:05,640
from like just pure economics, people find the value, they

647
00:33:05,640 --> 00:33:08,200
pay for it, they stick with it — surprisingly,

648
00:33:08,240 --> 00:33:10,160
we don't have very high churn on it.

649
00:33:10,640 --> 00:33:13,799
And so I have to believe that just from that

650
00:33:13,960 --> 00:33:15,960
and from like actually using it, that there's a ton

651
00:33:16,000 --> 00:33:19,240
of value. It's certainly true that these things are not infallible.

652
00:33:19,400 --> 00:33:22,319
And like, I guess you could debate from like a

653
00:33:22,359 --> 00:33:26,599
philosophical perspective whether or not they're intelligent. I actually think

654
00:33:27,079 --> 00:33:30,960
they have some level of intelligence. Now, it's

655
00:33:31,000 --> 00:33:33,720
not like it doesn't quite work the same way that

656
00:33:33,799 --> 00:33:38,480
human intelligence works, but they're able to, like I don't know,

657
00:33:38,480 --> 00:33:41,960
they're able to do things that up until a couple

658
00:33:42,000 --> 00:33:43,839
of years ago you would only say a human could do.

659
00:33:44,039 --> 00:33:48,640
So I personally am super excited by

660
00:33:48,680 --> 00:33:51,319
the progress. Like, I studied a

661
00:33:51,319 --> 00:33:53,440
bunch of philosophy. I have a philosophy degree in addition

662
00:33:53,480 --> 00:33:56,599
to a CS background. I think it's like absolutely fascinating

663
00:33:56,599 --> 00:34:01,480
what it says about what intelligence means. It's not, like

664
00:34:01,519 --> 00:34:03,720
you said, it's not perfect human intelligence, but it's something

665
00:34:03,759 --> 00:34:05,880
and it's like I think it's a it's a pretty

666
00:34:05,920 --> 00:34:08,239
awesome technological advance.

667
00:34:08,559 --> 00:34:11,360
Speaker 3: So I'm more pro AI, more bullish.

668
00:34:11,400 --> 00:34:13,280
Speaker 5: I think Warren's a little bit more on the skeptic side.

669
00:34:13,320 --> 00:34:14,119
That's all.

670
00:34:14,920 --> 00:34:15,280
Speaker 3: I think.

671
00:34:15,400 --> 00:34:18,480
Speaker 2: I can't assign the word intelligence to it yet, and

672
00:34:18,760 --> 00:34:21,840
because of the architecture that it's utilizing, it's just

673
00:34:21,880 --> 00:34:25,760
a probabilistic word predictor. And I think we need a

674
00:34:25,760 --> 00:34:30,519
different architecture other than the Transformer architecture to actually reach

675
00:34:30,559 --> 00:34:34,480
anything that would be fair to call AI in any capacity.

676
00:34:34,639 --> 00:34:37,519
I do want to jump into how you're utilizing it

677
00:34:37,519 --> 00:34:41,039
though at Warp. Sure. Are you running your

678
00:34:41,079 --> 00:34:44,760
own foundational models or are you passing queries to something

679
00:34:44,840 --> 00:34:48,679
configurable, like I can put in an OpenAI API key

680
00:34:49,079 --> 00:34:51,360
or an Anthropic API key? What's going on there?

681
00:34:51,920 --> 00:34:52,880
Speaker 3: You can pick your model.

682
00:34:53,000 --> 00:34:58,079
Speaker 5: So we support the Anthropic models, the OpenAI models,

683
00:34:58,119 --> 00:35:01,519
Google's models. We support a US-hosted version of the

684
00:35:01,559 --> 00:35:02,280
DeepSeek models.

685
00:35:02,280 --> 00:35:03,960
Speaker 3: Even some of the open source models.

686
00:35:05,480 --> 00:35:09,880
Speaker 5: You you can't go directly to them because our server

687
00:35:10,039 --> 00:35:12,960
has like a whole bunch of logic on, like the

688
00:35:12,960 --> 00:35:15,639
prompt engineering and sort of different agents for different types

689
00:35:15,639 --> 00:35:18,960
of tasks, so there's like a logic layer between them.

690
00:35:19,000 --> 00:35:23,400
But the basic the basic intelligence underlying the AI and

691
00:35:23,440 --> 00:35:25,320
Warp currently is.

692
00:35:25,280 --> 00:35:26,400
Speaker 3: The foundation models.

693
00:35:26,719 --> 00:35:28,639
Speaker 5: There's a chance at some point that we'll get a

694
00:35:28,639 --> 00:35:32,840
little bit more into the like make a model to

695
00:35:32,880 --> 00:35:37,159
predict your command type of business, but currently we find

696
00:35:37,199 --> 00:35:40,199
that the best thing for our users is to sort

697
00:35:40,239 --> 00:35:44,280
of use the... like, we're not going to spend a billion

698
00:35:44,320 --> 00:35:48,079
dollars on, you know, GPUs or whatever and train models

699
00:35:48,119 --> 00:35:49,719
right now.
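As a rough illustration of the kind of logic layer Zach describes sitting between the terminal client and the foundation models, here is a minimal sketch in Python; the class names, prompts, and routing rules are hypothetical and not Warp's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of an application-layer "logic layer":
# the client never calls the model providers directly; the server
# picks an agent-specific system prompt, adds terminal context, and
# then forwards the enriched request to whichever foundation model
# the user selected (Anthropic, OpenAI, Google, hosted DeepSeek, ...).

@dataclass
class AgentRequest:
    user_query: str          # what the user typed or spoke
    terminal_context: str    # e.g. recent output, cwd, exit codes
    task_type: str           # "fix_error", "write_script", ...
    model: str               # user-selected foundation model id

# Per-task system prompts -- purely illustrative.
SYSTEM_PROMPTS = {
    "fix_error": "You are a terminal assistant. Diagnose the error and propose a safe fix.",
    "write_script": "You are a terminal assistant. Write a small, well-commented script.",
}

def build_provider_payload(req: AgentRequest) -> dict:
    """Assemble the prompt-engineered payload sent on to the chosen provider."""
    system = SYSTEM_PROMPTS.get(req.task_type, "You are a helpful terminal assistant.")
    return {
        "model": req.model,
        "system": system,
        "messages": [
            {"role": "user",
             "content": f"Terminal context:\n{req.terminal_context}\n\nRequest:\n{req.user_query}"},
        ],
    }

if __name__ == "__main__":
    req = AgentRequest(
        user_query="why did this command fail?",
        terminal_context="pip install foo -> ModuleNotFoundError",
        task_type="fix_error",
        model="some-foundation-model",  # placeholder, not a real model id
    )
    print(build_provider_payload(req))
```

The only point of the sketch is the shape of the flow: client, then server-side prompt and agent logic, then foundation model, which is why, as Zach says next, Warp sits at the application layer.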

700
00:35:49,559 --> 00:35:53,880
Speaker 1: That would probably change the profitability statement you just made earlier.

701
00:35:54,800 --> 00:35:58,320
Speaker 3: Yeah, well, so I would say, like we are at

702
00:35:58,320 --> 00:35:59,760
the application layer.

703
00:36:00,360 --> 00:36:02,639
Speaker 5: If you look at this as like application layer, model layer,

704
00:36:02,719 --> 00:36:04,880
hyperscaler, Warp is at the application layer.

705
00:36:04,480 --> 00:36:07,079
Speaker 2: No, that makes sense. I mean, but

706
00:36:07,280 --> 00:36:11,719
in that way, the model providers are definitely subsidizing the

707
00:36:11,800 --> 00:36:15,039
profitability because they're taking huge losses. I mean, I don't

708
00:36:15,039 --> 00:36:16,559
know who's making money from there.

709
00:36:17,320 --> 00:36:20,199
Speaker 3: It's just a question of like where's the value going

710
00:36:20,239 --> 00:36:23,679
in this whole thing. Like, you know, the other thing:

711
00:36:23,760 --> 00:36:30,280
Speaker 5: Like, so the model providers, I think, like the big

712
00:36:30,400 --> 00:36:31,480
question mark to me.

713
00:36:31,480 --> 00:36:34,760
Speaker 3: Is like open source models and like if you have open.

714
00:36:34,519 --> 00:36:39,039
Speaker 5: Source models, especially ones that are like comparable quality, if

715
00:36:39,079 --> 00:36:41,199
like the OpenAIs and Anthropics of the world can't

716
00:36:41,239 --> 00:36:44,800
maintain like a like a real lead in quality or

717
00:36:44,880 --> 00:36:48,159
latency or something like that. How does how does the

718
00:36:48,199 --> 00:36:50,760
world work in that? And so the open source alternative

719
00:36:50,760 --> 00:36:53,920
where you run it yourself, uh and you don't have

720
00:36:54,000 --> 00:36:58,119
to pay the sort of margin to OpenAI is.

721
00:36:58,039 --> 00:36:59,000
Speaker 3: Super interesting to me.

722
00:37:00,639 --> 00:37:02,800
Speaker 5: I think that the one place that there's definitely going

723
00:37:02,880 --> 00:37:04,280
to be someone's gonna make money.

724
00:37:04,320 --> 00:37:06,039
Speaker 3: It's just on like serving these models.

725
00:37:06,199 --> 00:37:08,639
Speaker 5: So I feel like, for better or worse, if you're

726
00:37:08,920 --> 00:37:12,239
Amazon AWS, you know, Google Cloud, Azure or whatever,

727
00:37:12,920 --> 00:37:15,960
they're gonna make money because someone needs to serve these models.

728
00:37:15,960 --> 00:37:19,239
And the local versions, which is I think another interesting

729
00:37:19,280 --> 00:37:22,440
thing to consider are at least currently they're not at

730
00:37:22,440 --> 00:37:25,480
the they're not at it's not really practical to like

731
00:37:26,280 --> 00:37:31,360
get the same level of power from, like, downloading, you know, Llama.

732
00:37:31,599 --> 00:37:34,239
But that's another thing I'm looking at is like maybe

733
00:37:34,239 --> 00:37:39,360
it's just local models that totally disintermediate the need for these.

734
00:37:39,199 --> 00:37:43,559
Speaker 3: Like huge API based cloud models.

735
00:37:43,679 --> 00:37:46,320
Speaker 2: Who knows. No, I mean, you're onto something real

736
00:37:46,320 --> 00:37:49,840
there, because it would cost you way

737
00:37:49,880 --> 00:37:52,519
more than the price that you would pay to the

738
00:37:52,559 --> 00:37:55,280
model providers to utilize their LLMs if you tried to

739
00:37:55,360 --> 00:37:59,159
run the open source models locally on hardware that you

740
00:37:59,159 --> 00:38:02,639
know is comparable and gets you speed and accuracy precision

741
00:38:02,719 --> 00:38:04,000
in order to utilize that.

742
00:38:04,159 --> 00:38:04,360
Speaker 3: Yeah.

743
00:38:05,800 --> 00:38:10,960
Speaker 4: Yeah, we should talk about WARP some more and like

744
00:38:11,000 --> 00:38:17,920
its features and whatever. I'd like to speak into the terminal

745
00:38:18,000 --> 00:38:19,280
my commands so that I don't have.

746
00:38:19,320 --> 00:38:23,760
Speaker 3: To do it. So we added this feature. It's super cool.

747
00:38:23,840 --> 00:38:24,840
If you're using WARP.

748
00:38:24,960 --> 00:38:28,920
Speaker 5: You can hold the function key or you can configure it, uh,

749
00:38:28,960 --> 00:38:30,239
and you can.

750
00:38:30,159 --> 00:38:31,960
Speaker 3: Talk to your terminal. It's magic.

751
00:38:32,280 --> 00:38:34,360
Speaker 5: You can you can just tell it what you want

752
00:38:34,400 --> 00:38:38,599
to do. We uh, it translates it into English and

753
00:38:38,599 --> 00:38:41,719
then it runs it, and so it's pretty Star

754
00:38:41,760 --> 00:38:45,480
Trek-y from, like, a user experience standpoint.

755
00:38:46,199 --> 00:38:47,719
Speaker 3: So yeah, that's that is something that we wanted.

756
00:38:49,119 --> 00:38:51,840
Speaker 4: Saving the people from the repetitive stress injuries like this

757
00:38:51,880 --> 00:38:52,280
is what I.

758
00:38:52,480 --> 00:38:54,199
Speaker 3: Why should people have to do anything?

759
00:38:54,559 --> 00:38:59,280
Speaker 2: Like, I know what Jillian's waiting for. She wants

760
00:38:59,320 --> 00:39:00,840
the brain interface device.

761
00:39:00,719 --> 00:39:02,679
Speaker 3: Exactly what I want.

762
00:39:04,760 --> 00:39:05,039
Speaker 4: I think.

763
00:39:05,199 --> 00:39:07,000
Speaker 3: I think it would be a really good WARP.

764
00:39:07,639 --> 00:39:10,039
Speaker 5: I'm getting the sense that Warp is actually very

765
00:39:10,079 --> 00:39:11,920
well suited to Jillian's workflows.

766
00:39:12,239 --> 00:39:14,559
Speaker 4: It really is, Like, especially since you just said the

767
00:39:14,639 --> 00:39:17,519
speech thing, because I'm getting older and I can't type

768
00:39:17,559 --> 00:39:20,559
so much so like I very specifically need the speech.

769
00:39:20,320 --> 00:39:22,239
Speaker 3: Thing, and why should you have to say it?

770
00:39:22,719 --> 00:39:26,519
Speaker 4: I know I shouldn't. That reminds me of like an

771
00:39:26,559 --> 00:39:28,199
episode kind of person here.

772
00:39:29,400 --> 00:39:31,679
Speaker 1: It reminds me of an episode of The Simpsons where

773
00:39:31,679 --> 00:39:34,280
Homer's in the hospital and the guy in the bed

774
00:39:34,320 --> 00:39:39,199
next to him is on a breathing machine and He's like, hey,

775
00:39:39,440 --> 00:39:41,400
how come that guy gets someone to breathe for him

776
00:39:41,440 --> 00:39:43,079
and I'm over here doing it by myself.

777
00:39:44,360 --> 00:39:45,880
Speaker 2: See, I thought you were going to bring up the

778
00:39:45,920 --> 00:39:48,599
episode where he tried to get to three hundred pounds

779
00:39:48,639 --> 00:39:53,400
so he could be classified with a disability and use a

780
00:39:53,440 --> 00:39:54,760
wand to dial.

781
00:39:54,920 --> 00:39:57,320
Speaker 4: That must be an old episode, like that must that

782
00:39:57,400 --> 00:39:58,639
must be a real old episode.

783
00:39:58,679 --> 00:39:59,920
Speaker 2: Yeah, that was when it was still good.

784
00:40:00,559 --> 00:40:01,960
Speaker 3: Yeah.

785
00:40:02,599 --> 00:40:06,599
Speaker 1: Doctor Nick's food philosophy was if you rub a newspaper

786
00:40:06,639 --> 00:40:09,199
on the food and the newspaper turns clear, it's good

787
00:40:09,239 --> 00:40:09,599
to eat.

788
00:40:11,880 --> 00:40:15,599
Speaker 5: I mean, one of my I'm pretty lazy, and like

789
00:40:15,679 --> 00:40:17,800
I'm not like ashamed to be lazy when especially when

790
00:40:17,840 --> 00:40:20,079
it comes to when it comes to development.

791
00:40:21,440 --> 00:40:22,639
Speaker 3: I don't want to have to.

792
00:40:22,519 --> 00:40:25,880
Speaker 5: Do more work than I have to do to ship

793
00:40:26,000 --> 00:40:29,119
something that's useful. So like my like what I care

794
00:40:29,159 --> 00:40:34,199
about as a developer. Like again, there's different kinds of developers,

795
00:40:34,199 --> 00:40:35,719
but to me, I'm like all in it for the

796
00:40:36,239 --> 00:40:37,400
I want to build something cool.

797
00:40:37,599 --> 00:40:39,000
Speaker 3: I want to ship it out to people.

798
00:40:39,639 --> 00:40:41,199
Speaker 5: I want to be proud that I built it that

799
00:40:41,360 --> 00:40:44,159
I want it to work really really well. And I

800
00:40:44,320 --> 00:40:47,519
want to do that with like the minimal possible effort

801
00:40:48,559 --> 00:40:50,800
and the extent that I have to put effort into it.

802
00:40:50,840 --> 00:40:53,800
I want it to be effort that goes towards thinking

803
00:40:53,840 --> 00:40:57,079
about how it ought to work. And I don't want

804
00:40:57,079 --> 00:40:59,679
to spend effort on like annoying shit in the terminal,

805
00:41:00,119 --> 00:41:02,760
Like that's like the last place that I want my

806
00:41:02,880 --> 00:41:05,480
limited brain cycles to go. I don't want to spend

807
00:41:05,480 --> 00:41:10,119
effort either on like changing function signatures in my files.

808
00:41:10,159 --> 00:41:11,719
I just I know what I want it to be,

809
00:41:11,800 --> 00:41:13,280
and I want like to get from A to B

810
00:41:13,400 --> 00:41:15,960
as quick as possible. And so yeah, to the extent

811
00:41:16,039 --> 00:41:18,199
that something like AI and I think WARP for the

812
00:41:18,320 --> 00:41:20,440
terminal especially like makes it so I can be like

813
00:41:20,480 --> 00:41:21,480
a little bit lazier.

814
00:41:21,719 --> 00:41:23,760
Speaker 3: Again, this isn't like the advertising I put on our

815
00:41:23,760 --> 00:41:27,480
home page or whatever, but I think it's like I

816
00:41:27,519 --> 00:41:28,039
should have a.

817
00:41:28,159 --> 00:41:30,840
Speaker 4: Valuable things advertising it's great.

818
00:41:32,119 --> 00:41:35,760
Speaker 5: And like, honestly, like a lot of the best developers

819
00:41:35,800 --> 00:41:38,840
I've worked with in my career are just kind of all

820
00:41:38,880 --> 00:41:42,039
about that, like, don't make me spend my brain cycles

821
00:41:42,079 --> 00:41:46,519
on like tedious shit and toil. And so I feel

822
00:41:46,559 --> 00:41:50,199
pretty good about trying to eliminate that stuff for developers

823
00:41:50,239 --> 00:41:52,199
so that you can do the more fun stuff, because

824
00:41:52,199 --> 00:41:53,320
the really fun stuff.

825
00:41:53,119 --> 00:41:55,400
Speaker 3: Is like it's like, to me, at least, it's like,

826
00:41:55,400 --> 00:41:56,480
how should the product work?

827
00:41:56,639 --> 00:41:59,679
Speaker 5: And then it's like, how do I architect this thing

828
00:42:00,199 --> 00:42:02,079
so that I can make the product work the way

829
00:42:02,079 --> 00:42:02,440
that I want?

830
00:42:02,440 --> 00:42:03,719
Speaker 3: And then the least fun thing is.

831
00:42:03,599 --> 00:42:07,719
Speaker 5: Like the like typing in the like words in the

832
00:42:07,800 --> 00:42:09,880
text editor or the terminal to do that.

833
00:42:10,559 --> 00:42:12,199
Speaker 3: I don't know if everyone feels the same way.

834
00:42:12,440 --> 00:42:14,719
Speaker 2: No, I think you're onto something.

835
00:42:14,880 --> 00:42:17,039
Speaker 1: I was going to say it, yeah. No, I was

836
00:42:17,079 --> 00:42:21,159
going to say, it's much more exciting to work on

837
00:42:21,679 --> 00:42:25,280
how the application works than how to center this fucking div.

838
00:42:25,760 --> 00:42:31,480
Speaker 3: Right that vertically.

839
00:42:32,800 --> 00:42:36,639
Speaker 2: Center it vertically on the page. That's the key. Vertically,

840
00:42:36,840 --> 00:42:42,480
you know, horizontally you just use flexbox, no problem. I think, well,

841
00:42:42,920 --> 00:42:44,480
you know, there's an interesting thing here, because I feel

842
00:42:44,519 --> 00:42:46,239
like if if we take this to the natural conclusion,

843
00:42:46,280 --> 00:42:52,199
it's probably like the managing directors who will then be

844
00:42:52,320 --> 00:42:57,320
responsible for building the product by communicating with the AI

845
00:42:57,440 --> 00:43:00,119
technology that we have available and not needing

846
00:43:00,440 --> 00:43:04,000
a so-called technology department in any of our companies anymore.

847
00:43:05,440 --> 00:43:08,559
Speaker 5: So that's like a horrible outcome to me. I think

848
00:43:08,599 --> 00:43:14,480
it's product managers making software. I mean, arguably that's what's happening. Yeah,

849
00:43:14,480 --> 00:43:16,960
I mean arguably that's what's happening right now. There's just

850
00:43:17,000 --> 00:43:19,159
a couple of you know, people in their way that

851
00:43:19,239 --> 00:43:21,280
are telling them that they can't have exactly what

852
00:43:21,320 --> 00:43:25,440
they want. That's interesting, That's well, I think that's that's

853
00:43:25,480 --> 00:43:28,960
not how it works at Warp. Like, it could be

854
00:43:28,960 --> 00:43:31,639
that way at some places. But so at Warp, for instance,

855
00:43:31,800 --> 00:43:34,639
we build something and we may be again we may

856
00:43:34,679 --> 00:43:36,119
be different than other places.

857
00:43:36,320 --> 00:43:39,199
Speaker 3: It's primarily engineers who are driving the product direction. Now

858
00:43:39,480 --> 00:43:42,679
we're working on a product that is used by us.

859
00:43:43,320 --> 00:43:45,440
Speaker 5: We're the customer, we're the audience, and so we have

860
00:43:45,519 --> 00:43:49,320
this awesome virtuous feedback loop of like we build it,

861
00:43:49,400 --> 00:43:51,800
we use it, we like something, we don't like something,

862
00:43:51,880 --> 00:43:53,800
and so we drive a lot of it.

863
00:43:55,639 --> 00:43:58,119
Speaker 3: I don't want to change that at all, Like, actually,

864
00:43:58,199 --> 00:43:59,880
I think that's not a good thing to change.

865
00:44:00,079 --> 00:44:00,400
Speaker 4: And so.

866
00:44:02,440 --> 00:44:05,039
Speaker 5: I also just, I don't think, as bullish as I

867
00:44:05,079 --> 00:44:08,719
am on AI. I don't think that we are close

868
00:44:08,800 --> 00:44:11,559
to the point where you can build something meaningful without

869
00:44:11,599 --> 00:44:16,280
having some technical knowledge. And if anything like again this

870
00:44:16,360 --> 00:44:18,320
is probably not the prevailing wisdom. But it's like

871
00:44:18,679 --> 00:44:22,079
I think you need to be more technical to be

872
00:44:22,159 --> 00:44:25,840
able to sort of guide and correct and be the

873
00:44:25,920 --> 00:44:29,679
like the tech lead for an AI. And it's if

874
00:44:29,719 --> 00:44:32,840
you are an aspiring developer these days, I would say,

875
00:44:32,840 --> 00:44:35,920
like learn the shit better, like learn the fundamentals, the

876
00:44:35,960 --> 00:44:40,599
CS, better, because if you want to effectively produce

877
00:44:40,639 --> 00:44:42,360
software in a world where you have someone who's like

878
00:44:42,440 --> 00:44:44,519
pretty smart but also kind of like a savant and

879
00:44:44,559 --> 00:44:46,840
like dumb in a bunch of ways, you need to know what.

880
00:44:46,760 --> 00:44:48,599
Speaker 3: The heck is going on for when you hit a wall.

881
00:44:49,119 --> 00:44:52,519
Speaker 5: And so I think, you know, I don't think we're

882
00:44:52,559 --> 00:44:55,320
close to a world where it's like MBAs are building

883
00:44:55,320 --> 00:44:56,840
all of our software. No offense to MBAs.

884
00:44:56,960 --> 00:44:59,719
Speaker 3: MBAs are great, but like I feel like you're

885
00:44:59,719 --> 00:45:00,760
gonna need people.

886
00:45:00,519 --> 00:45:03,800
Speaker 5: Who are experts in order to effectively use this tool to.

887
00:45:03,760 --> 00:45:04,920
Speaker 3: Get its peak capacity.

888
00:45:05,280 --> 00:45:07,239
Speaker 5: And I do think, like, Warren, as you pointed

889
00:45:07,280 --> 00:45:10,039
out, like if you're really junior and you don't learn,

890
00:45:10,599 --> 00:45:12,960
if all you do is say maybe like you've only

891
00:45:13,000 --> 00:45:14,559
learned how to build web apps, I do feel

892
00:45:14,559 --> 00:45:16,079
like you're like a little bit at risk. Like my

893
00:45:16,119 --> 00:45:18,840
advice to those people would be like up level your

894
00:45:18,880 --> 00:45:19,719
CS skills.

895
00:45:21,159 --> 00:45:23,320
Speaker 3: But I don't see a world anytime soon.

896
00:45:23,159 --> 00:45:27,199
Speaker 5: Where if you're in a professional software development setting, that

897
00:45:28,440 --> 00:45:29,719
developers are going away.

898
00:45:29,880 --> 00:45:31,000
Speaker 3: I sincerely hope not.

899
00:45:31,199 --> 00:45:34,159
Speaker 2: I mean, I'm screwed if so. I think it's the

900
00:45:34,239 --> 00:45:38,159
leap there that's problematic. It's that we know you need

901
00:45:38,199 --> 00:45:41,880
the skills in order to utilize LLMs effectively. Like you're

902
00:45:41,880 --> 00:45:44,480
not going to be able to just offload your

903
00:45:44,920 --> 00:45:47,519
entire brain to this vehicle and have it go at

904
00:45:47,559 --> 00:45:50,280
full speed without thinking. It really does require critical thinking

905
00:45:50,480 --> 00:45:53,320
to interact with it effectively. And then that's what you're saying.

906
00:45:53,360 --> 00:45:55,840
And I think the problem is, Yeah, I think part

907
00:45:55,840 --> 00:45:58,960
of the problem is that some companies believe that that's

908
00:45:59,400 --> 00:46:02,320
not necessarily the case, that you can delegate this out

909
00:46:02,360 --> 00:46:03,480
to an LLM and have it.

910
00:46:05,519 --> 00:46:07,400
Speaker 5: Some companies are just buying the hype that we don't

911
00:46:07,400 --> 00:46:08,480
need to hire developers anymore.

912
00:46:08,480 --> 00:46:11,000
Speaker 2: I and there are companies out there that are, like,

913
00:46:11,039 --> 00:46:14,440
you know, we are an agentic building thing. There is

914
00:46:14,519 --> 00:46:18,800
like the software developer Devin or whatever. Sure, yeah, so

915
00:46:19,159 --> 00:46:21,400
I mean, and I think what I'm saying is, I

916
00:46:21,440 --> 00:46:22,360
know those can't work.

917
00:46:22,840 --> 00:46:25,440
Speaker 5: But I think that's those companies will find out when

918
00:46:25,440 --> 00:46:27,800
they try to replace their developers with Devin.

919
00:46:28,079 --> 00:46:28,239
Speaker 6: Yeah.

920
00:46:28,239 --> 00:46:31,119
Speaker 2: I don't... like, is Devin building Devin? Because

921
00:46:31,159 --> 00:46:33,119
I don't. I don't think he is or they are

922
00:46:33,159 --> 00:46:36,719
doing that. Yeah. But I think the bigger problem

923
00:46:36,760 --> 00:46:39,599
is that the leap from hey, I'm someone who doesn't

924
00:46:39,639 --> 00:46:44,760
have technical capabilities to I want a job utilizing technical capabilities.

925
00:46:44,920 --> 00:46:48,440
That gap is growing and harder to get into it

926
00:46:48,679 --> 00:46:52,679
now, because the technology available for us to interact with is

927
00:46:53,079 --> 00:46:56,239
much more complicated than it was five years ago, ten

928
00:46:56,320 --> 00:46:59,800
years ago, twenty years ago, and the skills that you

929
00:46:59,800 --> 00:47:02,760
get from even training a little bit, like teaching yourself

930
00:47:02,800 --> 00:47:05,639
skilling up, even a little bit, is much further

931
00:47:05,719 --> 00:47:08,679
away from what companies are looking for. At least that's

932
00:47:08,760 --> 00:47:12,159
my perspective that I think I'm seeing. And I think

933
00:47:12,199 --> 00:47:14,079
the LLMs are contributing to that gap.

934
00:47:16,199 --> 00:47:17,159
Speaker 3: I'm sure like, Okay.

935
00:47:17,199 --> 00:47:20,360
Speaker 5: So say you're a company and you're spending hundreds

936
00:47:20,400 --> 00:47:23,280
of millions of dollars on software developers. I'm sure you're like, God,

937
00:47:23,280 --> 00:47:26,280
I would like to spend less money and have equal output.

938
00:47:26,320 --> 00:47:29,039
And you could be like, Okay, I'm going to hire

939
00:47:29,320 --> 00:47:33,320
AI software engineers, the Devin example. And I've tried Devin

940
00:47:33,440 --> 00:47:36,639
and it's a neat vision. Devin, I don't want to

941
00:47:37,079 --> 00:47:38,599
I'm not gonna shit on Devin. It didn't work that

942
00:47:38,639 --> 00:47:41,000
well for us. I know they're improving it, but it doesn't.

943
00:47:41,239 --> 00:47:45,559
It's like that model today does not work. Will that

944
00:47:45,599 --> 00:47:48,639
model work in five ten years?

945
00:47:48,800 --> 00:47:51,519
Speaker 3: I don't know. I'm still skeptical. I think any company

946
00:47:51,559 --> 00:47:55,880
that finds that they want to improve their.

947
00:47:57,239 --> 00:48:01,000
Speaker 5: Cost efficiency on the software side by replacing their developers

948
00:48:01,119 --> 00:48:04,840
is going to be in a I think it's just

949
00:48:04,880 --> 00:48:06,760
they're going to find that they don't get the ROI

950
00:48:06,840 --> 00:48:09,239
on that, and that the better ROI right now is

951
00:48:09,280 --> 00:48:13,039
to empower your developers and like give them tools that

952
00:48:13,199 --> 00:48:14,519
let them be more productive.

953
00:48:15,039 --> 00:48:16,840
Speaker 3: I'm saying this. I'm obviously super biased.

954
00:48:16,840 --> 00:48:19,119
Speaker 5: I run a developer tools company where I'm building something

955
00:48:19,119 --> 00:48:22,400
where the mission is to empower developers. But I truly believe

956
00:48:22,440 --> 00:48:25,000
that that's like the right way to approach this. And

957
00:48:25,679 --> 00:48:27,719
you know, it's like companies will try whatever they're going

958
00:48:27,800 --> 00:48:29,880
to try, but I think that they're going to stick

959
00:48:30,119 --> 00:48:34,079
with something that actually gives them the result. I don't

960
00:48:34,079 --> 00:48:37,199
think that they're like the economic incentives are such that

961
00:48:37,320 --> 00:48:40,400
like if JP Morgan replaced all their developers with AI

962
00:48:40,440 --> 00:48:44,559
software engineers and then like all their banking transactions.

963
00:48:43,760 --> 00:48:45,800
Speaker 3: Failed, they'd be like, this is not the right move.

964
00:48:45,880 --> 00:48:48,280
Speaker 5: And so I do think that there's like back pressure

965
00:48:48,920 --> 00:48:51,079
on doing something that actually works.

966
00:48:53,280 --> 00:48:56,159
Speaker 1: I think that's a great model, and I encourage them

967
00:48:56,199 --> 00:48:59,000
to do that, and then when it blows up, I

968
00:48:59,000 --> 00:49:01,159
want them to get over to my website where I

969
00:49:01,159 --> 00:49:03,280
have my consulting rates listed.

970
00:49:03,719 --> 00:49:08,159
Speaker 3: Exactly they're gonna need. They're going to need some smart people.

971
00:49:08,719 --> 00:49:11,840
Speaker 2: We do need smart people still, yeah. I mean

972
00:49:12,440 --> 00:49:14,599
for sure, for sure, I mean we actually went We

973
00:49:14,599 --> 00:49:17,199
actually did a deep dive in this area in our

974
00:49:17,280 --> 00:49:21,440
episode on the Develops report from from from Dora in

975
00:49:21,480 --> 00:49:25,199
twenty twenty four. Okay, where like the I don't know

976
00:49:25,199 --> 00:49:27,039
if you've read it, but the actual results was like

977
00:49:27,400 --> 00:49:33,119
the value that LLMs were providing to organizations was suspect,

978
00:49:33,280 --> 00:49:38,079
like it was. It wasn't significantly different than where they

979
00:49:38,159 --> 00:49:40,400
had been before. It was very difficult for organizations to

980
00:49:40,480 --> 00:49:43,360
justify that the value to the bottom line or the

981
00:49:43,440 --> 00:49:46,360
value to the products that were being delivered. I think

982
00:49:46,360 --> 00:49:48,079
the interesting thing was, the one thing it did say

983
00:49:48,119 --> 00:49:51,039
is that people were happier with using the LLMs, but

984
00:49:51,039 --> 00:49:55,639
it didn't actually reduce toil and it didn't reduce

985
00:49:56,559 --> 00:49:58,760
the amount of time spent doing things that they didn't like,

986
00:49:58,800 --> 00:50:01,360
which is interesting. I think it gives the most value

987
00:50:01,400 --> 00:50:05,119
to people who are positive optimistic about AI. So if

988
00:50:05,119 --> 00:50:06,800
you like AI, you should use this.

989
00:50:08,960 --> 00:50:10,320
Speaker 3: I can tell you our experience from Warp.

990
00:50:10,440 --> 00:50:13,960
Speaker 5: So so there's the way we think about users coming

991
00:50:13,960 --> 00:50:16,039
into WARP. There's some users who are coming into WARP

992
00:50:16,119 --> 00:50:18,199
because they're like I love AI.

993
00:50:18,639 --> 00:50:21,519
Speaker 3: They're like, I want I love this new technology.

994
00:50:21,679 --> 00:50:24,039
Speaker 5: I want to like use it in all my tools,

995
00:50:24,760 --> 00:50:27,800
And those are great users for us, Like they come

996
00:50:27,800 --> 00:50:30,519
in they're like, holy shit, I can I can use.

997
00:50:30,440 --> 00:50:32,719
Speaker 3: A terminal in this totally new way. That is not

998
00:50:32,880 --> 00:50:34,559
the majority of users.

999
00:50:34,599 --> 00:50:36,519
Speaker 5: So the majority user for us is what I would

1000
00:50:36,519 --> 00:50:41,840
call, like, an AI-neutral developer who might be like, Okay,

1001
00:50:41,880 --> 00:50:44,480
I'm open to this, but it's like there's a lot

1002
00:50:44,480 --> 00:50:47,400
of hype, I have a bunch of inherent skepticism. And

1003
00:50:47,480 --> 00:50:51,960
for those users, the challenge for us is to get

1004
00:50:52,039 --> 00:50:55,559
them to actually see the value of the AI and

1005
00:50:55,639 --> 00:50:58,440
like actually use it. And so the the way that

1006
00:50:58,480 --> 00:51:01,639
we've like figured out how to do that is that

1007
00:51:02,599 --> 00:51:04,880
it's very similar to that tool that you mentioned earlier,

1008
00:51:04,960 --> 00:51:08,440
thefuck. And so, like, when you have

1009
00:51:08,480 --> 00:51:11,920
an error in Warp and it's like, oh shit,

1010
00:51:12,039 --> 00:51:13,559
like I'm missing this. I don't know if I'm allowed

1011
00:51:13,559 --> 00:51:15,800
to swear on this podcast, but I'm missing this

1012
00:51:15,920 --> 00:51:18,400
Speaker 3: Uh, you know Python dependency.

1013
00:51:18,880 --> 00:51:20,679
Speaker 5: We show something where it's like, hey, we can fix

1014
00:51:20,719 --> 00:51:24,039
this for you, and like all you have to do

1015
00:51:24,079 --> 00:51:26,199
is hit command-enter and we fix it for you.

1016
00:51:26,280 --> 00:51:31,199
And then that's like a like a conversion moment. And

1017
00:51:31,239 --> 00:51:34,440
so like, I guess my point here is like kind

1018
00:51:34,440 --> 00:51:36,159
of piggybacking off your point. It's like there's some

1019
00:51:36,199 --> 00:51:38,639
people who are just like into this and like they're

1020
00:51:38,639 --> 00:51:40,719
gonna love it, and maybe they're they love it even

1021
00:51:40,760 --> 00:51:42,519
if it isn't really helping them and they're just like

1022
00:51:43,679 --> 00:51:46,639
messing around with LLMs all day. But I do think,

1023
00:51:46,800 --> 00:51:50,519
based on our experience converting people who don't inherently want

1024
00:51:50,559 --> 00:51:53,960
to use this technology that there there must be value

1025
00:51:54,039 --> 00:51:56,239
because we have, like I said, we have a lot

1026
00:51:56,280 --> 00:52:00,119
of people paying us for something, and

1027
00:52:00,159 --> 00:52:01,519
I don't think that people are just going to pay

1028
00:52:01,559 --> 00:52:02,840
us for something that they don't find valuable.

1029
00:52:03,159 --> 00:52:06,199
Speaker 3: Sure, and a lot of them were not AI enthusiasts

1030
00:52:06,239 --> 00:52:06,559
to start.

1031
00:52:06,639 --> 00:52:11,159
Speaker 5: There are people who tell us like, oh shit, like

1032
00:52:11,559 --> 00:52:14,519
this thing just saved me hours and I love that.

1033
00:52:14,639 --> 00:52:17,280
Speaker 3: So that's like the you know, my kind of counter

1034
00:52:17,280 --> 00:52:17,920
to what you're saying.

1035
00:52:18,239 --> 00:52:20,960
Speaker 2: Yeah, I'm really curious, you said, so the commands are

1036
00:52:20,960 --> 00:52:24,679
going through this proxy layer that you're you're hosting and

1037
00:52:24,840 --> 00:52:27,360
before interacting with the model provider. So I don't know

1038
00:52:27,400 --> 00:52:30,239
if if you can share, but maybe there's some interesting

1039
00:52:30,480 --> 00:52:32,719
metrics or data that you've been able to collect based

1040
00:52:32,719 --> 00:52:35,960
off of what people are looking for, what has been searched,

1041
00:52:36,519 --> 00:52:39,159
what sorts of problems are being fixed, anything in this area.

1042
00:52:39,880 --> 00:52:43,360
Speaker 5: Yeah, so we have a group of like alpha testers

1043
00:52:43,400 --> 00:52:49,400
who give us like data collection access essentially, and so

1044
00:52:49,559 --> 00:52:51,960
really common use cases where we're helping people are the

1045
00:52:52,159 --> 00:52:55,599
like install dependencies, the like.

1046
00:52:57,440 --> 00:53:01,039
Speaker 3: My git is messed up, like I did something.

1047
00:53:01,039 --> 00:53:02,519
I'm in some weird git state and I need to

1048
00:53:02,519 --> 00:53:04,519
get out of it.

1049
00:53:04,599 --> 00:53:08,239
Speaker 5: We are increasingly fixing compiler errors for people, so in terms

1050
00:53:08,239 --> 00:53:12,079
of like simple compiler errors, where the error log is in

1051
00:53:12,079 --> 00:53:12,960
the terminal, we fix it.

1052
00:53:13,679 --> 00:53:15,199
Speaker 3: We get people who.

1053
00:53:16,880 --> 00:53:22,480
Speaker 5: A lot of like Kubernetes, Docker Helm, like those types

1054
00:53:22,519 --> 00:53:25,719
of issues where there's very heavy command line usage and

1055
00:53:25,800 --> 00:53:30,760
kind of you know, pretty complex commands that you need

1056
00:53:30,800 --> 00:53:31,000
to do.

1057
00:53:31,119 --> 00:53:32,559
Speaker 3: Is another really popular area.

1058
00:53:33,079 --> 00:53:35,199
Speaker 5: We do things where we write scripts for people to

1059
00:53:35,239 --> 00:53:37,159
automate things that they're doing over and over again, and

1060
00:53:37,199 --> 00:53:39,599
so you know, it's it's a mix. I would say,

1061
00:53:39,639 --> 00:53:42,199
like though, really like prime use cases for us to

1062
00:53:42,239 --> 00:53:46,039
start are things that are pretty terminal oriented. And then

1063
00:53:46,679 --> 00:53:49,719
increasingly as people realize you can fix coding stuff in Warp,

1064
00:53:50,000 --> 00:53:52,679
and we guide them into that the coding stuff matters

1065
00:53:52,719 --> 00:53:53,400
a bunch too.

1066
00:53:53,320 --> 00:53:55,599
Speaker 3: Because, just like, developers spend a lot of time writing code.

1067
00:53:58,880 --> 00:54:01,880
Speaker 1: I think one of the things that that doesn't really

1068
00:54:01,880 --> 00:54:04,760
get highlighted enough is that there actually is a pretty

1069
00:54:04,800 --> 00:54:08,920
steep learning curve to using these AI tools. I think

1070
00:54:08,920 --> 00:54:12,760
there's an expectation that oh, it's AI, I just go

1071
00:54:12,920 --> 00:54:16,079
in and it's going to make my life magical, but

1072
00:54:16,239 --> 00:54:20,639
really my experience with it has been learning how bad

1073
00:54:20,679 --> 00:54:26,320
I actually suck at communication? And that was the first job. Yeah,

1074
00:54:26,360 --> 00:54:28,239
Like that was my first job, is to figure out

1075
00:54:28,280 --> 00:54:29,159
how to communicate.

1076
00:54:29,920 --> 00:54:30,480
Speaker 3: It's weird.

1077
00:54:30,519 --> 00:54:33,119
Speaker 5: It's turning every programmer into someone who needs to know

1078
00:54:33,119 --> 00:54:36,559
how to write, which is like kind of a crazy skill.

1079
00:54:36,639 --> 00:54:40,119
Speaker 3: But like, yeah, the quality of what you get.

1080
00:54:39,960 --> 00:54:43,679
Speaker 5: Out of these llms is highly dependent upon how good

1081
00:54:43,719 --> 00:54:46,039
you are at prompting them, how good you are at providing

1082
00:54:46,079 --> 00:54:47,159
them with the right context to.

1083
00:54:47,159 --> 00:54:52,599
Speaker 3: Answer your question. And if you Yeah, who would have thought.

1084
00:54:52,400 --> 00:54:54,360
Speaker 5: That, like being really good at like writing English would

1085
00:54:54,360 --> 00:54:55,360
have been like the core thing.

1086
00:54:55,440 --> 00:54:58,559
Speaker 3: But I guess, I guess like people engineers write design docs.

1087
00:54:58,559 --> 00:55:00,159
It's not that different from that skill.

1088
00:55:02,519 --> 00:55:05,039
Speaker 5: It's a real behavior change and it's a real skill,

1089
00:55:05,199 --> 00:55:07,280
and I think that I think it's a great observation.

1090
00:55:07,960 --> 00:55:08,199
Speaker 3: Agree.

1091
00:55:08,239 --> 00:55:11,719
Speaker 2: I mean, I know, I went to university specifically to

1092
00:55:11,800 --> 00:55:15,039
study engineering so that I wouldn't have to read and

1093
00:55:15,079 --> 00:55:19,360
write words. And now my life I pretty much just

1094
00:55:19,840 --> 00:55:23,559
write a lot of blog posts, knowledge-base articles, you know,

1095
00:55:23,679 --> 00:55:26,800
chat with LLMs, like it's every single day, like there

1096
00:55:26,840 --> 00:55:29,079
are it's just words. It's just words. That's that's my

1097
00:55:29,119 --> 00:55:29,800
whole life now.

1098
00:55:30,440 --> 00:55:34,440
Speaker 1: Yeah, yeah, I think it's worth elaborating on though, just

1099
00:55:34,440 --> 00:55:40,719
just like, that's one of the reasons I've been

1100
00:55:41,119 --> 00:55:43,880
pushing people more into AI. It's like, yeah, I know,

1101
00:55:43,960 --> 00:55:46,199
you get it. You tried it, it made a mistake,

1102
00:55:46,239 --> 00:55:47,719
and you're ready to write it off. But I really

1103
00:55:47,760 --> 00:55:50,239
need you to stick with this and learn how to

1104
00:55:50,400 --> 00:55:54,000
use it, because by putting that time and effort in now,

1105
00:55:54,920 --> 00:55:57,119
you're going to figure these skills out and learn how

1106
00:55:57,119 --> 00:56:01,599
to make it productive. And then as the technology itself improves,

1107
00:56:01,880 --> 00:56:05,559
you're going to start getting to take exponential benefits of that,

1108
00:56:05,800 --> 00:56:09,039
and so you and your career are going to be

1109
00:56:09,559 --> 00:56:13,119
way way ahead of everyone that you're sitting with now

1110
00:56:13,159 --> 00:56:16,000
who says, oh AI sucks five years from now.

1111
00:56:18,199 --> 00:56:21,119
Speaker 5: I believe I'm one hundred percent with you that that's

1112
00:56:21,159 --> 00:56:24,840
like that's the smart approach. Is like, yeah, I think

1113
00:56:24,880 --> 00:56:27,280
it's like the tool analogy is the right analogy right

1114
00:56:27,440 --> 00:56:30,440
right now, where it's like you can't get mad at

1115
00:56:30,639 --> 00:56:31,679
them if.

1116
00:56:31,480 --> 00:56:37,960
Speaker 4: You, like I didn't learn, I can't get mad.

1117
00:56:38,559 --> 00:56:40,000
Speaker 3: But it's like it's like counter productive.

1118
00:56:40,000 --> 00:56:43,440
Speaker 5: And I think if you remove the hype for a

1119
00:56:43,519 --> 00:56:45,719
second and just think of it as like it's a

1120
00:56:45,719 --> 00:56:49,559
computer program that you're using it's like, yeah, you got

1121
00:56:49,639 --> 00:56:52,360
to learn how to use it, and like, you know,

1122
00:56:52,360 --> 00:56:53,920
what is it, like RTFM? Like I

1123
00:56:54,280 --> 00:56:56,800
kind of hate that, but it's a it's like learn

1124
00:56:56,840 --> 00:56:58,519
how to learn how to use it. If you want

1125
00:56:58,519 --> 00:56:59,840
to get the most out of it, is one hundred

1126
00:56:59,840 --> 00:57:02,920
percent right and if you are if you think of

1127
00:57:03,000 --> 00:57:06,639
it instead as like a dumb coworker you don't want

1128
00:57:06,639 --> 00:57:11,559
to associate with. But that dumb coworker is like someone

1129
00:57:11,559 --> 00:57:14,440
who's on your well, I don't know where I'm going

1130
00:57:14,440 --> 00:57:15,679
with this.

1131
00:57:17,360 --> 00:57:18,639
Speaker 3: Think of it as a tool that you got to

1132
00:57:18,679 --> 00:57:19,360
get the most out of.

1133
00:57:19,639 --> 00:57:21,840
Speaker 2: I think I think you're onto something there really important

1134
00:57:22,000 --> 00:57:23,559
because I think one of the things that a lot

1135
00:57:23,679 --> 00:57:25,880
of the LLMs we see out there, and I think

1136
00:57:25,880 --> 00:57:28,079
this is where some of the value is definitely lost.

1137
00:57:28,239 --> 00:57:31,039
They don't do a great job of teaching you how

1138
00:57:31,119 --> 00:57:34,559
to be an effective prompt engineer, like how to actually

1139
00:57:34,599 --> 00:57:37,880
create communication with the tool, to Will's point, and I

1140
00:57:37,920 --> 00:57:40,159
think part of it is because those same companies have

1141
00:57:40,239 --> 00:57:42,480
no idea how their own thing works, so they can't

1142
00:57:42,519 --> 00:57:45,800
actually give good recommendations. But I think they do figure

1143
00:57:45,800 --> 00:57:47,960
it out over time, because there are communities that pop

1144
00:57:48,039 --> 00:57:49,760
up that are discussing this and then they bring that

1145
00:57:49,800 --> 00:57:52,440
knowledge back in where you know, we see examples where

1146
00:57:52,920 --> 00:57:57,360
like the DALL-E model that OpenAI has, is the

1147
00:57:57,440 --> 00:58:02,440
prompt is being mutated by their own model or whatever based

1148
00:58:02,440 --> 00:58:05,440
on what the user inputs, because it's just nonsensical and

1149
00:58:05,679 --> 00:58:08,679
needs to be mutated. And like those instructions, it would

1150
00:58:08,679 --> 00:58:10,400
be great to be exposed. And I just feel like

1151
00:58:10,400 --> 00:58:12,360
these tools don't do this good of a job. But

1152
00:58:12,440 --> 00:58:14,679
you work on the application layer, and so I feel

1153
00:58:14,719 --> 00:58:17,440
like you're providing a much better experience for teaching people

1154
00:58:17,639 --> 00:58:21,119
how to utilize the tool effectively because you have to

1155
00:58:21,360 --> 00:58:23,239
because you're actually selling a real.

1156
00:58:23,079 --> 00:58:27,119
Speaker 5: Product, right right, No, No, And it's like it's a thing

1157
00:58:27,119 --> 00:58:30,199
that we're constantly thinking through, Like we have a feature

1158
00:58:30,280 --> 00:58:36,719
that is suggested prompts essentially where you know, if there's

1159
00:58:36,760 --> 00:58:39,320
a like the most common use case again is like

1160
00:58:39,320 --> 00:58:42,679
an error error resolution, but well, based on the error

1161
00:58:42,679 --> 00:58:45,159
that we see, we will suggest a prompt. And the

1162
00:58:45,199 --> 00:58:47,679
prompt probably is a little bit more than just like

1163
00:58:47,800 --> 00:58:50,440
fix this, which is what a person might write. It

1164
00:58:50,519 --> 00:58:53,880
is probably like, fix this Rust error that is caused by

1165
00:58:54,440 --> 00:58:58,440
incorrect mutability, And like you provide, we do everything we

1166
00:58:58,480 --> 00:59:02,519
can to make it the minimum amount of work, and

1167
00:59:02,559 --> 00:59:05,400
also to show the user like, hey, here's what we're

1168
00:59:05,440 --> 00:59:07,320
actually telling the model, so that if you want to

1169
00:59:07,360 --> 00:59:10,519
do this on your own next time, without, like, Warp

1170
00:59:10,559 --> 00:59:11,880
doing it, you can do it.

1171
00:59:13,760 --> 00:59:16,679
Speaker 3: So that, that's a key skill, like you're totally

1172
00:59:16,760 --> 00:59:18,039
right about it. That's something that matters.
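To make the "suggested prompt" idea concrete, here is a minimal sketch of turning a detected terminal error into a richer prompt than a bare "fix this"; the patterns and wording are hypothetical, not Warp's actual rules.

```python
import re

# Hypothetical mapping from error patterns seen in terminal output to
# enriched prompt suggestions.  The idea: instead of the user typing
# "fix this", the tool proposes a prompt that already names the error
# class and includes the relevant output as context, and it shows the
# user the full text it will send to the model.
SUGGESTION_RULES = [
    (re.compile(r"cannot borrow .* as mutable"),
     "Fix this Rust error caused by incorrect mutability:"),
    (re.compile(r"ModuleNotFoundError: No module named '(\w+)'"),
     "Install the missing Python dependency and re-run the command:"),
]

def suggest_prompt(terminal_output: str) -> str:
    """Return a suggested prompt for the error in the output, plus the output itself."""
    for pattern, lead in SUGGESTION_RULES:
        if pattern.search(terminal_output):
            return f"{lead}\n\n{terminal_output}"
    return f"Explain and fix this error:\n\n{terminal_output}"

if __name__ == "__main__":
    print(suggest_prompt("error[E0596]: cannot borrow `x` as mutable"))
```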

1173
00:59:18,760 --> 00:59:20,840
Speaker 4: I think this kind of shows my bias because like

1174
00:59:20,960 --> 00:59:23,960
forcing the developers to have to communicate properly, I just

1175
00:59:24,000 --> 00:59:26,199
don't see that as a problem. I'm like, this is

1176
00:59:26,239 --> 00:59:28,360
a good thing. This should be a feature, not a bug.

1177
00:59:29,159 --> 00:59:31,920
Speaker 2: Well, okay, maybe I'll put this into perspective, Jillian in

1178
00:59:32,000 --> 00:59:38,199
a different way. Communicating correctly is a subjective perspective

1179
00:59:38,280 --> 00:59:41,320
based off of the people involved in that collaboration. When

1180
00:59:41,360 --> 00:59:44,559
you're communicating with a second person, you know there's a

1181
00:59:44,559 --> 00:59:48,159
culture involved there, your values involved, the definitions of words

1182
00:59:48,159 --> 00:59:49,920
that you grew up with, all these things in that

1183
00:59:50,119 --> 00:59:52,840
and when you're using an LLM, you don't, like, it's

1184
00:59:52,960 --> 00:59:55,760
challenging to figure out what its culture is, how it

1185
00:59:55,840 --> 00:59:58,880
responds to certain things, and so you have to learn

1186
00:59:59,000 --> 01:00:02,000
that tool. So I think there's a difference between like

1187
01:00:02,119 --> 01:00:04,440
you're not becoming a better communicator. You're becoming a better

1188
01:00:04,440 --> 01:00:09,880
communicator with that thing. And I think is it the

1189
01:00:10,119 --> 01:00:12,599
good thing to force people to do, I mean communicating

1190
01:00:12,599 --> 01:00:15,079
with other human beings that you work with, Yes, for sure.

1191
01:00:15,519 --> 01:00:18,400
Forcing them to learn how to use, you know, N

1192
01:00:18,440 --> 01:00:21,920
tools out there that all are slightly different, have individual

1193
01:00:21,960 --> 01:00:25,320
mindsets or cultures or whatever, the corpus of material, you

1194
01:00:25,320 --> 01:00:27,920
know that I think is open for challenge and debate.

1195
01:00:29,960 --> 01:00:32,440
Speaker 4: But I just see this as like a people living

1196
01:00:32,480 --> 01:00:35,079
in society kind of issue. Like when I was a kid,

1197
01:00:35,119 --> 01:00:37,159
my dad was like, you're going to take a typing

1198
01:00:37,159 --> 01:00:39,840
class because that wasn't just an automatic thing back then,

1199
01:00:39,920 --> 01:00:42,320
you guys all right, like this was this was a

1200
01:00:42,320 --> 01:00:45,440
while ago, And I kind of just see AI as

1201
01:00:45,480 --> 01:00:49,559
sort of like I think it's it's very like pivotal,

1202
01:00:49,559 --> 01:00:52,199
it's paradigm shifting, but it's it's another iteration of that.

1203
01:00:52,239 --> 01:00:54,440
It's another tool that we're adding on that people are

1204
01:00:54,480 --> 01:00:56,199
going to learn how to use that everybody's going to

1205
01:00:56,280 --> 01:00:58,840
have to use, just like I don't know, like now,

1206
01:00:58,880 --> 01:01:01,480
my kid's just I did not have the option to

1207
01:01:01,559 --> 01:01:03,639
sign them up for a typing class or not. It's

1208
01:01:03,719 --> 01:01:05,760
just part of their curriculum that they are just doing.

1209
01:01:06,639 --> 01:01:11,079
Speaker 1: I think you should put them in a typing class.

1210
01:01:13,000 --> 01:01:16,039
No no, no, no, like an old school with the old

1211
01:01:16,320 --> 01:01:19,559
with the old man, just to screw with them.

1212
01:01:19,840 --> 01:01:23,480
Speaker 2: See you know here here I have I have, I

1213
01:01:23,480 --> 01:01:27,559
have a good parable here, because when I was in the

1214
01:01:27,559 --> 01:01:29,960
fifth grade, I think I was in a typing class

1215
01:01:29,960 --> 01:01:32,159
in my school, it was provided, a public school, you got

1216
01:01:32,159 --> 01:01:35,119
a typing class, and I learned A S D

1217
01:01:35,360 --> 01:01:37,880
F J K L semicolon over and over again

1218
01:01:38,159 --> 01:01:42,119
for a year. And realistically I don't use QWERTY. Actually

1219
01:01:42,239 --> 01:01:46,960
I find it to be a lackluster, subpar keyboard layout.

1220
01:01:47,079 --> 01:01:49,119
And so I was taught, you know, something that took

1221
01:01:49,159 --> 01:01:52,280
me many months to unlearn so I could be more

1222
01:01:52,320 --> 01:01:59,480
effective in my keyboard use. I'm actually

1223
01:01:59,519 --> 01:02:02,679
a Programmer Dvorak fan. But I have used Linux

1224
01:02:02,719 --> 01:02:05,599
to, uh, configure like almost all the

1225
01:02:05,679 --> 01:02:09,159
keys so that the third level, not

1226
01:02:09,239 --> 01:02:12,599
Shift and Control but the special AltGr key, to

1227
01:02:12,840 --> 01:02:15,480
give me other things that are beneficial for programming and

1228
01:02:15,599 --> 01:02:18,880
German and Greek and Roman, and however I want to

1229
01:02:18,960 --> 01:02:21,559
utilize them.

1230
01:02:21,760 --> 01:02:22,800
Speaker 4: Sounds like a lot of work.

1231
01:02:23,760 --> 01:02:26,239
Speaker 2: Well, this is the thing is we're talking about productivity

1232
01:02:26,239 --> 01:02:29,199
and optimizing your flow. And I find that I type,

1233
01:02:29,360 --> 01:02:32,679
you know, u with an umlaut or a with an umlaut,

1234
01:02:32,840 --> 01:02:36,719
or a dollar sign or the euro sign frequently, and

1235
01:02:36,760 --> 01:02:38,880
so I want an easy way to type those. I

1236
01:02:38,880 --> 01:02:41,519
don't want to google euro sign and then copy and

1237
01:02:41,559 --> 01:02:43,960
paste that somewhere. You know, it's like on your on

1238
01:02:44,000 --> 01:02:46,159
your phone. Isn't there an emoji key where you can

1239
01:02:46,360 --> 01:02:48,960
hit emoji and then find the emoji you want? I mean,

1240
01:02:49,000 --> 01:02:51,239
I see the LLMs as a sort of similar tool

1241
01:02:51,280 --> 01:02:54,400
from that perspective. Right, you're hotkeying

1242
01:02:54,480 --> 01:02:57,400
over to your Warp, uh, terminal to, you know, type

1243
01:02:57,400 --> 01:02:59,000
those things out and get the answer rather than having

1244
01:02:59,039 --> 01:02:59,960
to search on the internet.

1245
01:03:02,440 --> 01:03:04,320
Speaker 4: Yeah, but if it's what you're doing, it's what you're doing,

1246
01:03:04,320 --> 01:03:06,639
and there should be like a productive way to, you know,

1247
01:03:06,719 --> 01:03:07,719
to accomplish the goal.

1248
01:03:08,559 --> 01:03:12,119
Speaker 2: Look, my keyboard layout is open source, it's available.

1249
01:03:12,119 --> 01:03:17,559
Speaker 1: And do you have blank key caps?

1250
01:03:17,599 --> 01:03:22,480
Speaker 2: To Warren, no, I am, so this is not the

1251
01:03:22,519 --> 01:03:24,159
episode where we talk about my keyboard.

1252
01:03:25,480 --> 01:03:29,400
Speaker 4: I think it's becoming that board.

1253
01:03:30,920 --> 01:03:33,599
Speaker 2: I took a QWERTY keyboard. It's the Logitech, I don't

1254
01:03:33,599 --> 01:03:35,519
even remember what number it is like K four hundred

1255
01:03:35,639 --> 01:03:37,280
or something. It says on here somewhere. I have no

1256
01:03:37,320 --> 01:03:39,880
idea what it is. It's their silent version, the one

1257
01:03:39,920 --> 01:03:42,440
that makes the least amount of sound possible, because I

1258
01:03:42,480 --> 01:03:45,320
care about noise more than anything else. And then I

1259
01:03:45,440 --> 01:03:48,320
just moved the keys everywhere I could. And this thing

1260
01:03:48,360 --> 01:03:50,440
you'll find out about keyboards that are not designed for this.

1261
01:03:50,719 --> 01:03:53,320
The F key and the J key have a

1262
01:03:53,360 --> 01:03:55,639
different form factor than all the other keys on the keyboard,

1263
01:03:55,639 --> 01:03:57,639
so you can't swap them around. I don't know why

1264
01:03:57,679 --> 01:03:59,960
they do this, just to piss you off, apparently,

1265
01:04:01,880 --> 01:04:03,440
you know, it's like these two keys are going to

1266
01:04:03,480 --> 01:04:06,559
be different. I don't know why, but they are. And

1267
01:04:07,000 --> 01:04:08,679
so all the keys on my keyboard are in different

1268
01:04:08,719 --> 01:04:11,480
spots except for the F and the J they're exactly

1269
01:04:11,599 --> 01:04:12,800
where they started on the QWERTY.

1270
01:04:13,360 --> 01:04:15,639
Speaker 5: I think it's because that's like home base, right, Like

1271
01:04:15,639 --> 01:04:19,880
they're like, you want like a tactile way of finding out

1272
01:04:19,880 --> 01:04:20,639
where those are.

1273
01:04:20,920 --> 01:04:24,239
Speaker 2: But it's the key cap form factor, not like the

1274
01:04:24,480 --> 01:04:27,360
key itself. So it's like, I don't know. The only

1275
01:04:27,400 --> 01:04:29,679
the only justification that I can figure out is that

1276
01:04:29,760 --> 01:04:31,239
if you took all the keys off the keyboard and

1277
01:04:31,280 --> 01:04:32,920
you're like, oh, where do I put them back? I

1278
01:04:32,960 --> 01:04:35,760
don't know, Oh, these two have a different form. Maybe

1279
01:04:35,800 --> 01:04:37,079
the F and the J go there, and then I

1280
01:04:37,119 --> 01:04:39,039
can figure out where the other ones go. And I'm like,

1281
01:04:39,079 --> 01:04:41,320
that's pretty suspect. But it's like every keyboard I've

1282
01:04:41,320 --> 01:04:43,119
seen has this problem.

1283
01:04:43,800 --> 01:04:47,079
Speaker 3: I got a mechanical keyboard once and.

1284
01:04:48,599 --> 01:04:52,360
Speaker 5: My wife made me stop using it. She's just like, that

1285
01:04:52,480 --> 01:04:57,800
is the most absolutely obnoxious, annoying sounding thing that you like,

1286
01:04:57,960 --> 01:04:59,840
put that away. I don't want to see that again.

1287
01:05:00,400 --> 01:05:02,159
I was like, no, that's cool, Like it's like I

1288
01:05:02,199 --> 01:05:04,000
love the feel of it. And she's like it's like,

1289
01:05:04,960 --> 01:05:07,239
you know, it's like really loud. Yeah.

1290
01:05:07,039 --> 01:05:11,559
Speaker 4: I've removed those from my kids' Christmas list. It's not

1291
01:05:11,599 --> 01:05:13,039
there anymore. I'm not doing this.

1292
01:05:13,719 --> 01:05:13,960
Speaker 3: See.

1293
01:05:14,000 --> 01:05:15,559
Speaker 2: I know that that would not work for me because

1294
01:05:15,559 --> 01:05:18,239
I'm a very angry typer sometimes, like my wife. My

1295
01:05:18,280 --> 01:05:20,760
wife can figure out like what application I'm using and

1296
01:05:20,800 --> 01:05:23,679
what I'm doing based off of how angrily I'm

1297
01:05:23,679 --> 01:05:26,599
typing on the keyboard, like when I'm typing a blog

1298
01:05:26,599 --> 01:05:29,280
post or writing a message in Slack somewhere or an email,

1299
01:05:29,320 --> 01:05:32,480
it sounds different to her, and so I think it's like,

1300
01:05:32,519 --> 01:05:33,800
how angry I am.

1301
01:05:34,039 --> 01:05:34,280
Speaker 4: You know?

1302
01:05:34,360 --> 01:05:35,239
Speaker 2: When I'm in an email.

1303
01:05:35,239 --> 01:05:36,840
Speaker 3: It's the exact same thing.

1304
01:05:37,039 --> 01:05:42,440
Speaker 5: My wife can be like, don't send that, take

1305
01:05:42,480 --> 01:05:45,280
a breath, don't send. Like, I'm like,

1306
01:05:46,199 --> 01:05:49,599
and she's like, no, take a breather, don't, don't.

1307
01:05:49,840 --> 01:05:53,000
And the thing actually is, like, as a manager,

1308
01:05:53,039 --> 01:05:56,000
I try to remind myself of, like,

1309
01:05:56,079 --> 01:05:58,360
no angry Slacks, no angry emails.

1310
01:05:58,519 --> 01:06:00,719
Speaker 1: Oh no, he's typing the manifesto. Get in the car,

1311
01:06:00,800 --> 01:06:01,719
kids, get in the car.

1312
01:06:02,800 --> 01:06:05,239
Speaker 4: So maybe, like you know, doesn't Google have like a

1313
01:06:05,360 --> 01:06:07,920
like a drunk email detection. Maybe what we need is

1314
01:06:07,920 --> 01:06:11,800
for the keyboard to have, like, an angry detection. No, what,

1315
01:06:11,840 --> 01:06:14,079
We're gonna wait, We're gonna wait fifteen minutes and then

1316
01:06:14,119 --> 01:06:16,079
we're gonna revisit this and see if you would still

1317
01:06:16,119 --> 01:06:16,599
like to send it.

1318
01:06:16,639 --> 01:06:18,960
Speaker 2: Look, I feel like, Jillian, you haven't tried searching hard enough.

1319
01:06:18,960 --> 01:06:21,599
I'm sure there's some extension out there for your browser

1320
01:06:21,679 --> 01:06:24,079
which runs some sort of LLM in the background and

1321
01:06:24,159 --> 01:06:26,840
determines whether your your email has some sort of angry

1322
01:06:26,880 --> 01:06:29,280
tone to it, and, well, we'll prevent you from

1323
01:06:29,400 --> 01:06:31,960
sending an email if it contains that.

1324
01:06:31,960 --> 01:06:34,079
Speaker 4: No, there is. If you use, like, ProWritingAid, it will detect

1325
01:06:34,119 --> 01:06:37,800
the tone of your email and maybe course correct you

1326
01:06:37,960 --> 01:06:41,239
a little bit. And I do have that. Yeah.

1327
01:06:43,400 --> 01:06:45,800
Speaker 1: Hit send and it comes back and says I didn't

1328
01:06:45,840 --> 01:06:47,440
send this. But I feel like it's a good time

1329
01:06:47,480 --> 01:06:49,719
to talk about your feelings. What's the source of this

1330
01:06:49,840 --> 01:06:50,519
anger for you?
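For anyone who wants to see the joke concretely, here is a toy sketch of the guard the hosts describe. In practice the tone check would be "some sort of LLM" as Warren says; the keyword heuristic, function names, and cooldown length below are stand-ins so the example runs on its own.

```python
# A toy sketch of the "angry email" guard being joked about here. In practice the
# tone check would be an LLM call; a crude keyword heuristic stands in so this runs
# on its own. All names and thresholds are made up for illustration.
import time

ANGRY_MARKERS = ("ridiculous", "unacceptable", "seriously?", "asap")

def looks_angry(draft: str) -> bool:
    """Very rough stand-in for an LLM tone classifier."""
    text = draft.lower()
    return draft.count("!") >= 3 or any(marker in text for marker in ANGRY_MARKERS)

def send_with_cooldown(draft: str, send, cooldown_minutes: int = 15) -> None:
    """Hold an angry-sounding draft for a cooling-off period, then reconfirm."""
    if looks_angry(draft):
        print(f"This reads as angry. Waiting {cooldown_minutes} minutes before revisiting.")
        time.sleep(cooldown_minutes * 60)
        if input("Still want to send it? [y/N] ").strip().lower() != "y":
            print("Draft discarded.")
            return
    send(draft)

# Example with a stand-in sender:
# send_with_cooldown("This is ridiculous!!! Fix it ASAP.", send=print)
```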

1331
01:06:52,280 --> 01:06:57,639
Speaker 4: Let's get to the bottom of these issues. Speaking of

1332
01:06:57,679 --> 01:06:59,360
which I think we need to get back to WARP

1333
01:06:59,400 --> 01:07:03,719
because I have specific questions and, like, more feature requests.

1334
01:07:04,559 --> 01:07:05,719
Speaker 1: Bring it on.

1335
01:07:06,480 --> 01:07:08,760
Speaker 4: The point of having the app people on the show

1336
01:07:08,840 --> 01:07:10,280
is that I can be like, if I use this,

1337
01:07:10,440 --> 01:07:12,280
I have things that I want. All right, Like that.

1338
01:07:13,039 --> 01:07:14,400
Speaker 3: Tell me what can I do for you?

1339
01:07:14,920 --> 01:07:18,599
Speaker 4: So I saw that there's like WARP workflows, and I'm wondering,

1340
01:07:19,559 --> 01:07:21,639
can I do those in reverse? Like I can? Can

1341
01:07:21,679 --> 01:07:23,760
I go through and figure something out and then be like,

1342
01:07:23,800 --> 01:07:26,400
all right, Warp, I'm stupid and I don't remember anything

1343
01:07:26,400 --> 01:07:28,159
that I just did, but I'm probably gonna have to

1344
01:07:28,239 --> 01:07:30,360
do this again. So I would like for you to

1345
01:07:30,400 --> 01:07:32,719
go through my history, figure out what I did, and

1346
01:07:32,880 --> 01:07:35,079
just go put it in like a markdown file or

1347
01:07:35,159 --> 01:07:39,840
some notes or something, as opposed to me having to do it proactively.

1348
01:07:41,840 --> 01:07:43,800
Speaker 3: It's a great idea. We don't quite have that.

1349
01:07:43,840 --> 01:07:46,000
Speaker 5: We have the ability to take a command that you've already

1350
01:07:46,039 --> 01:07:48,639
run and turn it into a workflow. Just so folks know what

1351
01:07:48,679 --> 01:07:51,599
a workflow is: a workflow is kind of

1352
01:07:51,599 --> 01:07:54,079
like an alias, but it's like a templated command. And

1353
01:07:54,199 --> 01:07:57,000
so if you have like a complicated thing you're doing

1354
01:07:57,079 --> 01:08:00,360
in Docker, or like what's your workflow for like cherry

1355
01:08:00,360 --> 01:08:02,760
picking something into a release, you can make it one

1356
01:08:02,800 --> 01:08:06,440
of these templated commands, and then we actually make it

1357
01:08:06,480 --> 01:08:08,400
so it's shareable, which I think is kind of like

1358
01:08:08,800 --> 01:08:10,519
the killer value of it. And so if you're working

1359
01:08:10,559 --> 01:08:14,360
on a development team, you can build up a library

1360
01:08:14,400 --> 01:08:15,159
of these things.

1361
01:08:14,960 --> 01:08:16,880
Speaker 3: That you can use in different situations. So if you're

1362
01:08:16,920 --> 01:08:17,640
like an.

1363
01:08:17,640 --> 01:08:19,920
Speaker 5: SRE team, it's like, okay, what are all the commands

1364
01:08:19,920 --> 01:08:21,239
that I need to be able to run in the middle

1365
01:08:21,239 --> 01:08:23,039
of a firefight. You can have that and they're all

1366
01:08:23,079 --> 01:08:25,960
sort of in a common library that you have directly

1367
01:08:26,000 --> 01:08:29,640
within warp. We don't have the feature yet of like

1368
01:08:30,479 --> 01:08:34,159
intelligently make these for me from a session, but that's

1369
01:08:34,159 --> 01:08:37,840
a super smart feature. We do have a thing that

1370
01:08:37,840 --> 01:08:40,840
we haven't launched but are experimenting with, which is like

1371
01:08:41,560 --> 01:08:45,279
essentially like run the output of your command through an

1372
01:08:45,399 --> 01:08:47,760
LLM and have it summarize it for you and pick out

1373
01:08:47,840 --> 01:08:49,239
the interesting and important parts.

1374
01:08:50,079 --> 01:08:53,159
Speaker 3: But I like your idea, Jillian, of like figure out what

1375
01:08:53,199 --> 01:08:54,920
I did, record it for me so I can do it

1376
01:08:54,960 --> 01:08:56,039
again. Smart.

1377
01:08:56,079 --> 01:08:59,159
Speaker 2: So I found that some people have sworn by this

1378
01:09:00,039 --> 01:09:04,039
during a chat context session: at the end, just tell

1379
01:09:04,079 --> 01:09:06,960
it to like echo back at you what you did,

1380
01:09:07,319 --> 01:09:09,640
like say what did I do? And then when it's done,

1381
01:09:09,720 --> 01:09:11,359
then say, okay, now I want you to take that

1382
01:09:11,439 --> 01:09:14,079
and write a document for me that includes that information

1383
01:09:14,239 --> 01:09:15,800
so that the next time I have this problem, I

1384
01:09:15,800 --> 01:09:17,760
can go and reference that. And with WARP you can say, okay,

1385
01:09:17,760 --> 01:09:19,279
now turn that into a templated command.

1386
01:09:19,359 --> 01:09:21,279
Speaker 5: You could totally do that today in

1387
01:09:21,319 --> 01:09:23,880
Warp. The one piece of it that's not, like, we

1388
01:09:23,920 --> 01:09:26,439
don't tie the loop of turning it into this specific

1389
01:09:27,119 --> 01:09:29,720
like executable thing that is a workflow.

1390
01:09:29,800 --> 01:09:32,399
Speaker 3: But you know, we also have like a notebook concept

1391
01:09:32,399 --> 01:09:32,880
in Warp.

1392
01:09:32,720 --> 01:09:36,880
Speaker 5: So you could be like, hey, LLM in Warp, summarize

1393
01:09:36,880 --> 01:09:39,760
everything I did, turn it into a notebook, extract the relevant

1394
01:09:39,760 --> 01:09:44,880
commands for me, but it's not quite as.

1395
01:09:44,279 --> 01:09:46,720
Speaker 3: Seamless I think as it could be for Jillian. It's

1396
01:09:46,720 --> 01:09:47,239
a good idea.
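To make the "templated command" idea concrete for listeners, here is a minimal sketch of the concept Zach describes: an alias with named placeholders you fill in at run time, which a team could collect into a shared library. This is only the idea in miniature, not Warp's actual workflow format or engine; the names and the cherry-pick command are illustrative.

```python
# A minimal sketch of the "templated command" idea behind a workflow: an alias
# with named placeholders you fill in at run time. Illustrative only; this is
# not Warp's workflow file format or implementation.
from string import Template

workflow = {
    "name": "Cherry-pick a commit into a release branch",
    "command": Template("git checkout $release_branch && git cherry-pick $commit_sha"),
}

def render(wf: dict, **args: str) -> str:
    """Fill the placeholders and return the shell command you would run."""
    return wf["command"].substitute(**args)

print(render(workflow, release_branch="release/1.42", commit_sha="abc1234"))
# -> git checkout release/1.42 && git cherry-pick abc1234
```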

1397
01:09:48,039 --> 01:09:50,640
Speaker 4: Yeah, I'd really like to be able to have different

1398
01:09:51,560 --> 01:09:53,640
I don't know if it's sessions or context, but I

1399
01:09:53,640 --> 01:09:57,199
suppose one of those where I can say, I don't know.

1400
01:09:57,239 --> 01:09:58,560
I mean, I suppose for me it would be like

1401
01:09:58,680 --> 01:10:01,720
client dependent or context dependent, or even like tell me

1402
01:10:01,760 --> 01:10:04,920
which environment I'm in, which version of Terraform I'm using,

1403
01:10:05,079 --> 01:10:09,000
like you know, all that kind of stuff, for sure, like

1404
01:10:09,079 --> 01:10:15,520
it's here, it's it's right here. Yeah, well, yeah, so

1405
01:10:16,359 --> 01:10:17,159
that's what I want.

1406
01:10:17,840 --> 01:10:21,479
Speaker 3: Yeah, we don't like. I think that's a super interesting idea.

1407
01:10:21,560 --> 01:10:24,800
Speaker 5: I mean, you can use Warp for, like, anything about

1408
01:10:24,800 --> 01:10:26,840
your environment today, so you could be like what toolchain

1409
01:10:26,880 --> 01:10:27,439
am I using?

1410
01:10:27,479 --> 01:10:34,199
Speaker 3: What are my environment variables? Like what? Uh? Anything?

1411
01:10:34,239 --> 01:10:36,319
Speaker 5: You can ask about your history and so you can

1412
01:10:36,319 --> 01:10:38,159
get some of that today, but you can't quite get,

1413
01:10:38,600 --> 01:10:40,479
we don't have it, like, packaged up so that when you start a

1414
01:10:40,520 --> 01:10:43,000
new session you can get all that stuff, which would

1415
01:10:43,000 --> 01:10:43,359
be cool.

1416
01:10:45,359 --> 01:10:48,640
Speaker 4: Well, if we're taking feature requests, I would

1417
01:10:48,640 --> 01:10:50,439
like that.

1418
01:10:51,920 --> 01:10:54,199
Speaker 3: I'm gonna I'm gonna force everyone on our team to.

1419
01:10:54,159 --> 01:10:59,600
Speaker 2: Listen to this, Well, well, you should probably wait until

1420
01:10:59,600 --> 01:11:02,680
the episode drops and then use uh you know, an

1421
01:11:02,720 --> 01:11:05,199
LLM to summarize the episode and extract the feature

1422
01:11:05,239 --> 01:11:07,159
requests from it, and we can do.

1423
01:11:07,119 --> 01:11:07,600
Speaker 3: It that way.

1424
01:11:08,039 --> 01:11:11,239
Speaker 5: Or I think there's been so much interesting discussion about

1425
01:11:11,279 --> 01:11:14,960
like philosophy and AI in here that I'll make them all listen

1426
01:11:14,960 --> 01:11:15,199
to it.

1427
01:11:16,520 --> 01:11:19,760
Speaker 3: I don't think. I don't think a distilled, summarized version

1428
01:11:19,840 --> 01:11:20,800
is going to do it justice.

1429
01:11:21,560 --> 01:11:24,319
Speaker 2: Oh, I totally agree we need human Warren.

1430
01:11:24,439 --> 01:11:26,319
Speaker 1: Yeah, I'm gonna put them in a dark room and

1431
01:11:26,359 --> 01:11:27,720
play it back at half speed.

1432
01:11:30,880 --> 01:11:33,720
Speaker 2: I don't listen to content any slower than, like, 2x

1433
01:11:34,199 --> 01:11:37,479
these days. Before we get on another tangent, I have

1434
01:11:37,520 --> 01:11:40,359
this feeling that we should move on to picks.

1435
01:11:40,560 --> 01:11:44,600
Speaker 1: For that, it's probably a good part, good point, good time,

1436
01:11:46,840 --> 01:11:49,640
good words. Look at me, workin' my words.

1437
01:11:50,039 --> 01:11:51,800
Speaker 2: Well then well why don't you go first?

1438
01:11:52,880 --> 01:11:57,920
Speaker 1: Right on. Okay, so I have a couple of

1439
01:11:58,000 --> 01:12:05,000
picks today. One I'm blaming you Warren and Matt from

1440
01:12:05,079 --> 01:12:10,520
last week because I got the book Dungeon Crawler Carl,

1441
01:12:10,720 --> 01:12:15,600
and I hate how much I like this book. It's

1442
01:12:15,760 --> 01:12:20,279
just it's dumb and it's funny and it's entertaining, and

1443
01:12:20,319 --> 01:12:23,720
it's engaging, and it's sucked way too much of my

1444
01:12:23,880 --> 01:12:28,239
time last week. So Dungeon Crawler Carl, I can't even

1445
01:12:28,279 --> 01:12:30,600
remember who the author is. Do you remember Warren?

1446
01:12:31,399 --> 01:12:31,439
Speaker 6: No?

1447
01:12:31,800 --> 01:12:32,640
Speaker 3: I didn't wrack it up.

1448
01:12:33,000 --> 01:12:37,520
Speaker 1: Yeah, just google Dungeon Crawler Carl. It's a stupidly fun book,

1449
01:12:38,119 --> 01:12:39,000
very entertaining.

1450
01:12:39,399 --> 01:12:41,560
Speaker 2: If you're listening to this episode, the link will be

1451
01:12:41,560 --> 01:12:44,680
included with the podcast, just you know, down below it,

1452
01:12:44,680 --> 01:12:46,039
so you don't even have to google.

1453
01:12:46,119 --> 01:12:47,720
Speaker 3: I just click the link right.

1454
01:12:49,319 --> 01:12:53,439
Speaker 1: And then the other pick I'm gonna recommend is if

1455
01:12:53,479 --> 01:12:57,760
you haven't Zach you mentioned this earlier, if you haven't

1456
01:12:57,800 --> 01:13:01,319
gone to your favorite AI tool and just started a

1457
01:13:01,439 --> 01:13:05,880
chat about philosophy with it, I highly recommend that. And

1458
01:13:05,960 --> 01:13:07,680
that's going to be my pick for the week because

1459
01:13:07,720 --> 01:13:13,720
it's just so much fun to do. And Warren,

1460
01:13:13,720 --> 01:13:16,239
I know you said that AI is not intelligent, but

1461
01:13:16,560 --> 01:13:18,439
neither are some of the people I hang out with.

1462
01:13:18,600 --> 01:13:22,079
So chatting with AI about philosophy seems to be working

1463
01:13:22,079 --> 01:13:24,760
out quite well because it's just a really cool perspective

1464
01:13:24,800 --> 01:13:27,239
of some of the stuff that it has and some

1465
01:13:27,319 --> 01:13:30,439
of the insights it has to offer, and I've used

1466
01:13:30,479 --> 01:13:36,239
it for setting goals as well and challenging

1467
01:13:36,279 --> 01:13:38,960
me on those goals, and it's been pretty insightful for that.

1468
01:13:39,159 --> 01:13:42,359
So I think that's one good way to start working

1469
01:13:42,359 --> 01:13:49,439
with AI. And those are my picks. So Jillian, what

1470
01:13:49,479 --> 01:13:50,840
about you? What'd you bring this week?

1471
01:13:51,720 --> 01:13:54,279
Speaker 4: I'm going to keep going with the self promotion until

1472
01:13:54,319 --> 01:13:56,960
I'm back up to the lifestyle to which I've become

1473
01:13:57,000 --> 01:14:00,479
accustomed. And if you go to my website, yeah, yeah,

1474
01:14:00,479 --> 01:14:04,159
that's right, uh, dabbleofdevops dot com slash AI. I have

1475
01:14:04,319 --> 01:14:08,640
a data discovery tool, mostly for data science companies.

1476
01:14:08,680 --> 01:14:10,760
If you're not a data science company like I don't,

1477
01:14:10,840 --> 01:14:12,520
I don't like even really know how to talk to you,

1478
01:14:12,560 --> 01:14:15,800
so maybe just ignore this portion. But the idea is

1479
01:14:15,920 --> 01:14:19,159
that you get your data, you load it into the LLM,

1480
01:14:19,319 --> 01:14:21,359
and then you can start asking it questions. It kind

1481
01:14:21,359 --> 01:14:23,800
of acts like a maybe like a junior grad student.

1482
01:14:23,800 --> 01:14:25,880
You don't want to like completely trust what it says,

1483
01:14:26,239 --> 01:14:29,920
but it gives you a very good first draft. I'm

1484
01:14:29,960 --> 01:14:33,399
adding the PubMed interface so you can go search medical

1485
01:14:33,439 --> 01:14:36,720
literature and say, like, okay, get me all the papers

1486
01:14:36,800 --> 01:14:40,119
back on this disease or this protein or this drug interaction,

1487
01:14:40,239 --> 01:14:42,920
you know, whatever the things are. Load that into the LLM.

1488
01:14:43,079 --> 01:14:45,680
Start asking it questions. I've got a couple different data

1489
01:14:45,680 --> 01:14:49,079
sets: Open Targets, a couple single cell data sets. I

1490
01:14:49,119 --> 01:14:51,680
want to add a couple of transcriptomics data sets, even

1491
01:14:51,720 --> 01:14:54,359
though those might be out of vogue, because they're still cool,

1492
01:14:54,439 --> 01:14:57,880
you guys, Okay, they're still cool. So anyways, cool things

1493
01:14:57,920 --> 01:15:01,119
are being added to the platform, if anybody wants it, mostly

1494
01:15:01,479 --> 01:15:04,319
in the biotech space. Again, if you're not biotech, I don't.

1495
01:15:04,520 --> 01:15:06,720
I don't really, I don't even know why you're listening

1496
01:15:06,800 --> 01:15:07,760
to me, Like, just tune me out.
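For listeners who want a concrete picture of the PubMed step described above, here is a rough sketch: search the literature for a disease, protein, or drug interaction, pull the abstracts, and assemble them into a prompt for whatever LLM you use. It relies on Biopython's Entrez wrapper around the public NCBI E-utilities; the query, email address, and the final LLM hand-off are placeholders, and none of this is the API of the tool being discussed.

```python
# A rough sketch of the PubMed-to-LLM pipeline described above, using Biopython's
# Entrez wrapper around NCBI E-utilities. The query, email, and the LLM hand-off
# are placeholders for illustration.
from Bio import Entrez

Entrez.email = "you@example.com"  # NCBI asks for a contact address on every request

def fetch_abstracts(query: str, retmax: int = 20) -> str:
    """Return plain-text PubMed abstracts matching the query."""
    search = Entrez.read(Entrez.esearch(db="pubmed", term=query, retmax=retmax))
    ids = search["IdList"]
    if not ids:
        return ""
    handle = Entrez.efetch(db="pubmed", id=",".join(ids), rettype="abstract", retmode="text")
    return handle.read()

abstracts = fetch_abstracts("CFTR modulator drug interactions")
prompt = (
    "Here are PubMed abstracts. Summarize the main findings and open questions:\n\n"
    + abstracts
)
# Hand `prompt` to your LLM of choice; treat the answer like a junior grad
# student's first draft and check it against the sources.
```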

1497
01:15:07,800 --> 01:15:11,439
Speaker 2: It's fine, don't. Don't reduce your you know, your TAM

1498
01:15:11,479 --> 01:15:13,960
your total addressable market here. You know, if you don't

1499
01:15:14,039 --> 01:15:15,720
understand what Jillian's saying, maybe you should go to the

1500
01:15:15,720 --> 01:15:17,560
website anyway and see if you can figure out a

1501
01:15:17,640 --> 01:15:18,560
use case for yourself.

1502
01:15:18,920 --> 01:15:21,399
Speaker 4: That's true, you could. I do have some companies that

1503
01:15:21,520 --> 01:15:24,119
use it just for meeting notes. They, there you go,

1504
01:15:24,359 --> 01:15:27,039
like, use Otter to record all of their meetings, and then Otter

1505
01:15:27,159 --> 01:15:29,800
kind of gives them, like, you know, the different summaries

1506
01:15:29,840 --> 01:15:32,199
and images and things like that. It's pretty cool. And

1507
01:15:32,239 --> 01:15:34,159
then you can feed that into the LLM and have

1508
01:15:34,239 --> 01:15:36,960
sort of like a just a history of meetings, so

1509
01:15:37,000 --> 01:15:39,279
then you don't have this, didn't we have a meeting

1510
01:15:39,319 --> 01:15:42,159
about this? Didn't somebody make a database like wasn't wasn't

1511
01:15:42,159 --> 01:15:44,119
there a thing? Wasn't there a person we can talk

1512
01:15:44,159 --> 01:15:46,439
to here? You could just go and query it and

1513
01:15:46,479 --> 01:15:49,199
then and then it will tell you. Sometimes it gives

1514
01:15:49,199 --> 01:15:51,079
you the answer you want, and sometimes it's like, no,

1515
01:15:51,199 --> 01:15:54,359
that conversation never happened. You're hallucinating now, but you know,

1516
01:15:54,439 --> 01:15:58,039
like it's it's either one, it's one or the other. Well,

1517
01:15:58,079 --> 01:15:58,439
there's a.

1518
01:15:58,359 --> 01:16:03,600
Speaker 1: Big overlap between biohackers and software engineers as well,

1519
01:16:03,880 --> 01:16:06,479
so they may find that interesting.

1520
01:16:07,680 --> 01:16:11,000
Speaker 4: Yeah, they could put all the literature and data in

1521
01:16:11,039 --> 01:16:15,560
there around biohacking that I'm not I'm not totally familiar with,

1522
01:16:15,720 --> 01:16:17,680
although I am very much looking forward to having like

1523
01:16:17,720 --> 01:16:19,960
bionic limbs. That would be great, Like that would that

1524
01:16:20,000 --> 01:16:20,399
would just be.

1525
01:16:20,439 --> 01:16:23,680
Speaker 2: Amazing for me, because you want it

1526
01:16:23,800 --> 01:16:27,039
so that you don't have to think about moving your limbs anymore.

1527
01:16:27,079 --> 01:16:29,239
You want something else to do it for you, right, No.

1528
01:16:29,319 --> 01:16:31,159
Speaker 4: I just want limbs that work at this point, Like

1529
01:16:31,199 --> 01:16:33,199
that would be nice that's that's like just on a

1530
01:16:33,239 --> 01:16:36,239
mechanical level, like that's what I need. And then you know,

1531
01:16:36,279 --> 01:16:37,880
and then on that note, we're kind of talking about

1532
01:16:37,920 --> 01:16:40,359
like philosophy of AI and so on, you know, and

1533
01:16:40,359 --> 01:16:42,560
we can kind of argue about the tools, but from

1534
01:16:42,560 --> 01:16:45,960
an accessibility viewpoint, AI is really great and doing some

1535
01:16:46,000 --> 01:16:48,319
really great things. You know, like I have some issues

1536
01:16:48,359 --> 01:16:52,079
with typing as I age out of this career field.

1537
01:16:53,079 --> 01:16:54,880
You know, I have some like low vision people in

1538
01:16:54,920 --> 01:16:57,560
the family that AI is very helpful for them being

1539
01:16:57,600 --> 01:16:59,720
able to dictate, being able. You know, there's like there

1540
01:16:59,800 --> 01:17:01,840
are a lot of cool accessibility things that can

1541
01:17:01,880 --> 01:17:03,880
be done with AI, and I do always kind of

1542
01:17:03,920 --> 01:17:05,239
like to give a little bit of a shout out

1543
01:17:05,239 --> 01:17:07,920
to that because I do think it's like all of

1544
01:17:07,960 --> 01:17:10,279
that is pretty great. You know, Like I have somebody

1545
01:17:10,319 --> 01:17:14,239
who's low vision who can now listen to audiobooks and

1546
01:17:14,319 --> 01:17:17,239
you know, basically like kind of still go through

1547
01:17:17,279 --> 01:17:20,239
the Internet just with voice, and I think that's pretty cool.

1548
01:17:20,479 --> 01:17:22,159
So I don't know, that's it.

1549
01:17:22,239 --> 01:17:25,800
Speaker 3: That's my picks, alrighty.

1550
01:17:25,840 --> 01:17:31,720
Speaker 2: Then, so for my pick this week, I primed it

1551
01:17:31,760 --> 01:17:35,119
at the beginning. It's this Microsoft backed research paper that

1552
01:17:35,239 --> 01:17:38,359
came out of Carnegie Mellon, The Impact of Generative AI

1553
01:17:38,479 --> 01:17:42,960
on Critical Thinking, and I think it's just an absolutely fantastic

1554
01:17:43,000 --> 01:17:48,720
paper about the correlations between utilizing AI tools and developing

1555
01:17:48,760 --> 01:17:52,359
critical thinking processes and expanding in usage of that sort

1556
01:17:52,399 --> 01:17:55,960
of brain muscle. And I think some people have misinterpreted

1557
01:17:56,000 --> 01:17:59,119
the paper as a Microsoft paper saying AI is making us stupid,

1558
01:17:59,680 --> 01:18:02,039
But I think the one thing that really does come

1559
01:18:02,079 --> 01:18:07,279
out of it is that if you have low confidence

1560
01:18:07,439 --> 01:18:10,680
in an LLM doing the right thing, you will be

1561
01:18:10,760 --> 01:18:13,600
able to get much better answers out than if you

1562
01:18:13,640 --> 01:18:16,159
have high confidence in the current tools that we have,

1563
01:18:16,279 --> 01:18:21,199
because the current tools are transformer networks that hallucinate, and

1564
01:18:21,359 --> 01:18:24,000
if you just assume that it gives you the right answer,

1565
01:18:24,039 --> 01:18:26,760
like your calculator, you are going to stop developing the

1566
01:18:26,800 --> 01:18:29,359
muscle of challenging where you got the information from and

1567
01:18:29,399 --> 01:18:33,000
trying to understand it. I will say that this leads

1568
01:18:33,039 --> 01:18:35,760
me to a great interview question. I know that interviewing

1569
01:18:36,359 --> 01:18:39,800
candidates today can be challenging because they may be using

1570
01:18:39,880 --> 01:18:43,399
LLMs to answer your questions, and for me, I think

1571
01:18:43,399 --> 01:18:46,239
that naturally you can just ask them, hey, are you

1572
01:18:46,840 --> 01:18:49,479
like how much confidence do you have in the LLMs

1573
01:18:49,520 --> 01:18:51,600
that you use to produce the right answers. The more

1574
01:18:51,640 --> 01:18:54,760
confident they are, the more likely you know they're not

1575
01:18:54,920 --> 01:18:57,800
using critical thinking to challenge what comes out of them

1576
01:18:57,840 --> 01:19:01,039
and could be a useful litmus test for what sort

1577
01:19:01,079 --> 01:19:03,600
of person you're hiring into your organization.

1578
01:19:06,079 --> 01:19:09,119
Speaker 1: Right, huh. And so, by phrasing the question

1579
01:19:09,359 --> 01:19:14,640
that way, are you just presuming that they are using AI to

1580
01:19:14,720 --> 01:19:18,000
make them more comfortable with admitting it if they are trying

1581
01:19:18,000 --> 01:19:18,399
to hide it?

1582
01:19:19,560 --> 01:19:23,880
Speaker 2: Well, I think realistically part of our interviews now should

1583
01:19:23,960 --> 01:19:28,520
be dedicated to solving problems that don't rely on using LLMs,

1584
01:19:29,359 --> 01:19:33,479
or problems that can use LLMs to be solved better,

1585
01:19:33,600 --> 01:19:36,600
and then asking them to use LLMs, and which LLMs

1586
01:19:36,600 --> 01:19:38,640
they're utilizing to solve the problem and how they're going

1587
01:19:38,680 --> 01:19:42,920
about it. Because I think if, you know, you're trying

1588
01:19:42,960 --> 01:19:46,239
to hide this perspective from yourself, you're lying to yourself

1589
01:19:46,319 --> 01:19:49,840
if you believe that you don't want to pull

1590
01:19:49,880 --> 01:19:53,119
these tools into your company to utilize in some fashion

1591
01:19:53,520 --> 01:19:57,159
and that people aren't utilizing them. Regardless, if you give

1592
01:19:57,199 --> 01:20:00,800
them a take home assignment for your company that takes

1593
01:20:00,800 --> 01:20:02,880
four hours or eight hours, some of them are going

1594
01:20:02,920 --> 01:20:06,600
to utilize tools, And I don't think it says a

1595
01:20:06,600 --> 01:20:09,359
lot about the type of person based on whether they

1596
01:20:09,399 --> 01:20:11,960
utilize the tools, but it does say something about them

1597
01:20:12,000 --> 01:20:14,479
about how they're utilizing them or what their expectations are

1598
01:20:14,479 --> 01:20:15,920
on how they utilize those tools.

1599
01:20:16,039 --> 01:20:17,520
Speaker 1: Cool, all right. Zach, what'd you bring for a

1600
01:20:17,600 --> 01:20:24,239
Speaker 3: pick? I have a tool that I like. Why not?

1601
01:20:25,119 --> 01:20:31,119
Speaker 5: So it's a tool called Granola, and it's a it's

1602
01:20:31,119 --> 01:20:31,760
a note taker.

1603
01:20:31,920 --> 01:20:34,279
Speaker 3: It's a meeting note taker, you could say.

1604
01:20:34,600 --> 01:20:37,520
Speaker 5: But the thing that I like about it compared to

1605
01:20:37,560 --> 01:20:40,279
all the other ones that I've tried using, is that

1606
01:20:41,239 --> 01:20:44,880
you don't end up with like a little like black

1607
01:20:44,960 --> 01:20:49,079
box in your zoom for the note taker. The note

1608
01:20:49,079 --> 01:20:53,199
taker works just off of your computer audio. Oh so

1609
01:20:54,079 --> 01:20:56,159
there's no like this is weird?

1610
01:20:56,239 --> 01:20:58,880
Speaker 3: Who is this? Like Zach's note taker thing? Joining the meeting?

1611
01:21:00,239 --> 01:21:03,800
And it takes notes, but it doesn't,

1612
01:21:04,000 --> 01:21:06,600
like the default way that it takes notes isn't by transcription.

1613
01:21:06,880 --> 01:21:07,560
It's by.

1614
01:21:08,960 --> 01:21:12,920
Speaker 5: Like semantically summarizing and giving you the key points.

1615
01:21:12,640 --> 01:21:13,640
Speaker 4: Of what happened in the meeting.

1616
01:21:13,960 --> 01:21:15,840
Speaker 6: So I don't like taking meeting notes. So this is

1617
01:21:15,840 --> 01:21:19,079
a cool thing. It's called Granola. That's one thing. A

1618
01:21:19,159 --> 01:21:24,199
second thing I'm reading a book. It's pretty nerdy. I

1619
01:21:24,199 --> 01:21:25,199
don't know why I'm reading it.

1620
01:21:25,199 --> 01:21:27,840
Speaker 5: It's called, it's like, A Travel Guide to the

1621
01:21:27,840 --> 01:21:31,439
Middle Ages, and it's all it's like a it's a

1622
01:21:31,479 --> 01:21:36,359
history book, and it's all about like from like you know,

1623
01:21:36,439 --> 01:21:39,800
the year like eleven hundred to fifteen hundred, how did

1624
01:21:39,800 --> 01:21:42,399
people travel, Like what was it like for them to

1625
01:21:42,439 --> 01:21:43,079
take a vacation.

1626
01:21:43,119 --> 01:21:44,319
Speaker 3: They weren't really taking vacations.

1627
01:21:44,319 --> 01:21:47,359
Speaker 5: They were like primarily going on pilgrimages or at least

1628
01:21:47,399 --> 01:21:50,640
that's like what the written record survives.

1629
01:21:50,680 --> 01:21:54,079
Speaker 3: And it takes you all over Europe.

1630
01:21:53,920 --> 01:21:57,239
Speaker 5: The Middle East and like the Near East, and like

1631
01:21:57,640 --> 01:22:00,560
I'm not through it yet, so I don't totally know where.

1632
01:22:00,600 --> 01:22:04,119
Speaker 3: But to me, it's what I like about it from a.

1633
01:22:04,119 --> 01:22:07,680
Speaker 5: History perspective is that it's just about like it's about

1634
01:22:07,720 --> 01:22:11,640
a like relatable experience, not about like a series of

1635
01:22:11,760 --> 01:22:12,640
historical events.

1636
01:22:12,640 --> 01:22:14,079
Speaker 3: It's not about historical leaders.

1637
01:22:14,079 --> 01:22:16,399
Speaker 5: It's about, like, say, you having to be living in

1638
01:22:16,439 --> 01:22:18,520
the year thirteen hundred, what the heck were you doing?

1639
01:22:19,039 --> 01:22:23,079
Speaker 3: How did you pack? How did you travel? Where did

1640
01:22:23,119 --> 01:22:25,760
you stay? Like what were the inns? Like? What were

1641
01:22:25,760 --> 01:22:29,119
you trying to go sightsee at? I don't know

1642
01:22:29,119 --> 01:22:30,840
why I like it so much, but it's it's like

1643
01:22:31,399 --> 01:22:32,079
I really like it.

1644
01:22:32,079 --> 01:22:35,199
Speaker 5: It's like a puts me in a very different mindset

1645
01:22:35,279 --> 01:22:37,920
from like how we're living today, So.

1646
01:22:38,119 --> 01:22:44,159
Speaker 1: That's super cool. It's like National Lampoon's Middle Ages vacation.

1647
01:22:44,680 --> 01:22:47,359
Speaker 5: Yeah, except I guess it didn't seem like very funny

1648
01:22:47,359 --> 01:22:48,159
to be traveling then.

1649
01:22:48,279 --> 01:22:51,119
Speaker 3: It was a lot of like very serious.

1650
01:22:51,199 --> 01:22:54,199
Speaker 5: You got to get to this religious site, like like

1651
01:22:54,880 --> 01:22:57,720
you got to see these relics, like people were really

1652
01:22:57,760 --> 01:23:02,000
wanting to see a bunch of, you know, historic relics,

1653
01:23:02,079 --> 01:23:05,680
or at least that's what the written record, uh survives,

1654
01:23:05,720 --> 01:23:06,920
and that's where the history comes from.

1655
01:23:06,960 --> 01:23:07,880
Speaker 3: So that's pretty cool.

1656
01:23:08,720 --> 01:23:11,760
Speaker 4: I used to really like all those, like the the

1657
01:23:11,840 --> 01:23:14,399
diary type books, like their fiction, but they're sort of

1658
01:23:14,399 --> 01:23:16,600
written as diaries of like the kids that would do

1659
01:23:16,640 --> 01:23:19,560
the Oregon Trail and travel across the United States, and

1660
01:23:19,560 --> 01:23:21,640
and they're they're from like other places as well too,

1661
01:23:21,680 --> 01:23:25,000
So you have people coming to Plymouth Rock and doing

1662
01:23:25,000 --> 01:23:28,560
the Oregon Trail and just the sort of Yeah, in general,

1663
01:23:28,600 --> 01:23:31,079
people go in different places, like across history. It used

1664
01:23:31,119 --> 01:23:32,680
to be a lot harder. You used to have to

1665
01:23:32,720 --> 01:23:35,319
worry about more things than like if the gas station

1666
01:23:35,520 --> 01:23:38,399
have your preferred chicken tenders or like whatever you know.

1667
01:23:39,920 --> 01:23:42,399
Speaker 2: Yeah, there's so many questions, Jillian.

1668
01:23:47,960 --> 01:23:50,439
Speaker 1: Awesome, Zach, thank you for joining us. This has been

1669
01:23:50,479 --> 01:23:52,600
a super entertaining episode.

1670
01:23:52,920 --> 01:23:53,840
Speaker 4: Yeah, this has been fun.

1671
01:23:54,359 --> 01:23:55,359
Speaker 3: Yeah that was great.

1672
01:23:55,439 --> 01:24:00,000
Speaker 5: It was a super interesting conversation and, like, uh, it was, yeah,

1673
01:24:00,000 --> 01:24:00,319
that's fine.

1674
01:24:00,399 --> 01:24:01,920
Speaker 3: Really appreciate you all having me on here.

1675
01:24:02,439 --> 01:24:06,640
Speaker 1: For sure. I'm gonna challenge Jillian to go download Warp

1676
01:24:06,720 --> 01:24:08,439
try it out, and then invite you back on the

1677
01:24:08,479 --> 01:24:10,680
show for a head to head rematch.

1678
01:24:11,000 --> 01:24:13,720
Speaker 4: Voice, like, that's the one thing I really want. So

1679
01:24:14,159 --> 01:24:14,680
there we go.

1680
01:24:14,800 --> 01:24:17,960
Speaker 5: It's, I don't know if it has that. It's at

1681
01:24:17,960 --> 01:24:20,399
warp dot dev is where you get it, and it's

1682
01:24:20,479 --> 01:24:26,399
now available on Mac, Linux, and Windows, so all platforms. Right on.

1683
01:24:27,119 --> 01:24:32,439
Speaker 1: Cool cool, Well, thanks everyone, Zach, thank you again, Warren, Jillian,

1684
01:24:32,520 --> 01:24:38,560
thank you, and we'll see everyone next week.

