1
00:00:01,080 --> 00:00:03,000
Speaker 1: How'd you like to listen to dot net rocks with

2
00:00:03,040 --> 00:00:07,879
no ads? Easy. Become a patron for just five dollars

3
00:00:07,919 --> 00:00:10,800
a month. You get access to a private RSS feed

4
00:00:10,839 --> 00:00:14,279
where all the shows have no ads. Twenty dollars a month,

5
00:00:14,279 --> 00:00:16,920
we'll get you that and a special dot net Rocks

6
00:00:16,960 --> 00:00:21,000
patron mug. Sign up now at patreon dot dotnetrocks

7
00:00:21,120 --> 00:00:36,399
dot com. Hey, welcome back to dot net rocks. I'm

8
00:00:36,479 --> 00:00:38,960
Carl Franklin and I'm Richard Campbell. You are here for your

9
00:00:39,039 --> 00:00:42,520
dot net geeking out listening pleasure and all that stuff,

10
00:00:42,640 --> 00:00:45,280
all the things, all the things. Anything you want to

11
00:00:45,280 --> 00:00:46,560
announce today, buddy?

12
00:00:46,520 --> 00:00:49,759
Speaker 2: No, Well let me I told you earlier. But yeah,

13
00:00:49,759 --> 00:00:52,479
you know, we got a puppy. Okay, because you know,

14
00:00:52,600 --> 00:00:55,640
having a grandchild wasn't enough, we also need a puppy. Yeah,

15
00:00:55,759 --> 00:00:59,960
so I've managed to time this while she's at yoga.

16
00:01:00,079 --> 00:01:01,880
Got the puppy asleep.

17
00:01:01,960 --> 00:01:04,959
Speaker 1: So wait a minute, I thought you were swearing off dogs

18
00:01:05,400 --> 00:01:06,200
after Zach.

19
00:01:06,640 --> 00:01:09,799
Speaker 2: Listen, I already had the perfect dog for seventeen years.

20
00:01:09,920 --> 00:01:11,799
Don't need another dog. I didn't need that dog,

21
00:01:11,920 --> 00:01:14,680
to be clear. But sometimes, sometimes you're outnumbered one

22
00:01:14,680 --> 00:01:23,719
to one, you know. Oh, that's great. It's really

23
00:01:23,760 --> 00:01:27,560
got to be her dog this time. So I'm doing

24
00:01:27,599 --> 00:01:29,439
the only logical thing I can do, which is go

25
00:01:29,439 --> 00:01:34,519
to Australia for three weeks. Okay, I'm leaving on Thursday,

26
00:01:34,599 --> 00:01:36,680
so I'm literally I'm gonna go away for three weeks

27
00:01:36,680 --> 00:01:37,840
and hopefully.

28
00:01:37,560 --> 00:01:38,840
Speaker 1: What kind of puppy is it?

29
00:01:38,840 --> 00:01:41,920
Speaker 2: It's a, it's a mix of Aussie Shepherd and border

30
00:01:42,000 --> 00:01:44,480
collie and some poodle and some Bernese, a little bit

31
00:01:44,480 --> 00:01:44,840
of everything.

32
00:01:44,840 --> 00:01:46,840
Speaker 1: Oh my gosh, it's a cute little.

33
00:01:46,719 --> 00:01:48,040
Speaker 2: Dog, no two ways about it.

34
00:01:48,079 --> 00:01:50,120
Speaker 1: But you know, does it have an identity complex?

35
00:01:50,200 --> 00:01:52,599
Speaker 2: It's got springs in its legs when it's going.

36
00:01:52,680 --> 00:01:53,040
Speaker 1: Wow.

37
00:01:53,560 --> 00:01:56,120
Speaker 2: Okay, man, we're in the early stage: needle teeth and

38
00:01:56,159 --> 00:01:58,760
pee on the floor and bit by bit figuring it out.

39
00:01:58,920 --> 00:02:00,599
I already took a tick off of her because she

40
00:02:00,640 --> 00:02:01,359
was out in the lawn.

41
00:02:01,599 --> 00:02:07,120
Speaker 1: What's her name? Jojo, short for Josephine Lily. Oh, she

42
00:02:07,159 --> 00:02:09,439
doesn't look like a deep fried potato wedge then.

43
00:02:09,439 --> 00:02:13,400
Speaker 2: No, not like some dogs, you know. Oh, she's a

44
00:02:13,400 --> 00:02:16,319
cute little dog but cool. And she's got that French

45
00:02:16,360 --> 00:02:19,120
twist to her, so that fits the crazy name, right? Anyway,

46
00:02:19,240 --> 00:02:21,840
I got, I got worse problems like, yeah, it makes her.

47
00:02:21,919 --> 00:02:23,960
If it makes she who must be obeyed happy, then

48
00:02:24,080 --> 00:02:25,759
I'm happy to do it. Fantastic.

49
00:02:25,800 --> 00:02:28,360
Speaker 1: All right, Well, let's jump into it with a better

50
00:02:28,400 --> 00:02:28,960
Know a Framework.

51
00:02:29,039 --> 00:02:37,759
Speaker 2: All right, what do you got?

52
00:02:37,840 --> 00:02:41,479
Speaker 1: All right? Well, in honor of our AI guest today, Vishwas Lele,

53
00:02:41,759 --> 00:02:44,840
who's been on the show several times before, many times,

54
00:02:45,439 --> 00:02:48,919
I found a trending repo on GitHub. It's called Krillin

55
00:02:49,080 --> 00:02:53,759
AI, K R I L L I N, AI or

56
00:02:54,000 --> 00:02:57,960
A-one, depending on your point of reference. It's an

57
00:02:58,000 --> 00:03:02,039
AI audio and video translation and dubbing tool. So let

58
00:03:02,039 --> 00:03:05,280
me just read the paragraph here. Now, just as a caveat,

59
00:03:05,319 --> 00:03:07,719
I have not downloaded it. I don't know if it's

60
00:03:07,719 --> 00:03:09,919
any good, but it is trending, so that tells me

61
00:03:10,000 --> 00:03:13,240
that people kind of like it. Krillin AI is an

62
00:03:13,240 --> 00:03:17,479
all in one solution for effortless video localization and enhancement.

63
00:03:18,120 --> 00:03:22,759
This minimalist yet powerful tool handles everything from translation, dubbing

64
00:03:22,840 --> 00:03:27,039
to voice cloning. I see there's no the commas don't

65
00:03:27,080 --> 00:03:31,800
work here. Translation comma dubbing to voice cloning comma. I

66
00:03:31,800 --> 00:03:36,000
would say, everything from translation and dubbing to voice cloning.

67
00:03:36,080 --> 00:03:40,960
Formatting, seamlessly converting videos between landscape and portrait modes for

68
00:03:41,080 --> 00:03:47,439
optimal display across all content platforms: YouTube, TikTok, Bilibili, Douyin,

69
00:03:47,719 --> 00:03:52,400
WeChat Channels, RedNote, Kuaishou. With its end

70
00:03:52,439 --> 00:03:56,400
to end workflow, Krillin AI transforms raw footage into polished,

71
00:03:56,439 --> 00:04:01,520
platform ready content in just a few clicks. So that's interesting.

72
00:04:01,560 --> 00:04:06,159
You know, translation, or localization anyway, is one thing in

73
00:04:06,240 --> 00:04:11,280
the world of software and web apps that we know

74
00:04:11,840 --> 00:04:18,360
very well. But dubbing and you know, translating video that

75
00:04:18,519 --> 00:04:19,519
has audio in it.

76
00:04:19,639 --> 00:04:21,680
Speaker 2: Well, the voice cloning one is the craziest one where

77
00:04:21,680 --> 00:04:25,600
I take the voice of the person speaking and then

78
00:04:25,800 --> 00:04:27,480
change their language, but it still.

79
00:04:27,279 --> 00:04:28,920
Speaker 1: Sounds like them change the language.

80
00:04:29,120 --> 00:04:31,560
Speaker 2: Yeah, that's nuts, but it's very gen AI.

81
00:04:31,680 --> 00:04:34,079
Speaker 1: Yeah, it's nuts, but you know that's the stuff that

82
00:04:34,360 --> 00:04:36,959
we get to see. This is the world we live

83
00:04:37,000 --> 00:04:38,759
in now, so world we live in. So that's what

84
00:04:38,839 --> 00:04:40,600
I got. Richard, who's talking to us.

85
00:04:40,480 --> 00:04:43,879
Speaker 2: Today? I grabbed a comment off of show nineteen forty four

86
00:04:44,199 --> 00:04:46,199
just a few weeks ago with our friend Jodie Burchell,

87
00:04:46,279 --> 00:04:50,240
Doctor Jodie Burchell. Yeah, we've got a few comments on the show,

88
00:04:50,240 --> 00:04:52,519
and this one's from Joshua Hillerup, one of our regulars.

89
00:04:52,560 --> 00:04:54,079
I'm pretty sure he already has a copy of Music to

90
00:04:54,120 --> 00:04:57,560
Code By. But you know, Joshua, thank you. And he said,

91
00:04:57,759 --> 00:05:00,800
I'm thinking about the non-deterministic aspects of LLMs, and

92
00:05:00,839 --> 00:05:03,199
what's frustrating is that there's been many decades of computer

93
00:05:03,240 --> 00:05:06,959
science research on software correctness. It just never seems to

94
00:05:07,000 --> 00:05:10,000
get used in the industry because it's too hard to learn.

95
00:05:10,079 --> 00:05:12,439
He did that in air quotes, too hard to learn.

96
00:05:13,279 --> 00:05:16,079
LLMs work really well at covering that hard work, and

97
00:05:16,120 --> 00:05:17,759
I don't think it's as hard as people claim, at

98
00:05:17,839 --> 00:05:20,759
least in the beginning. My hope is enough money is

99
00:05:20,759 --> 00:05:23,120
put into developing these languages and tools to make

100
00:05:23,199 --> 00:05:25,240
use of that research. Now there's such an obvious use

101
00:05:25,279 --> 00:05:27,279
case for it. And we probably should have talked about

102
00:05:27,319 --> 00:05:31,959
this more with Jodie too, Joshua, because she does come

103
00:05:32,040 --> 00:05:35,120
from that I hate to say old school, but because

104
00:05:35,120 --> 00:05:38,199
that would be like five years ago form of machine

105
00:05:38,279 --> 00:05:42,600
learning where quality and accuracy, you know, validation was a

106
00:05:42,720 --> 00:05:44,079
huge part of the job.

107
00:05:44,279 --> 00:05:47,000
Speaker 1: That's right. I remember talking to Seth Juarez and he said,

108
00:05:47,079 --> 00:05:49,360
ninety percent of my time is cleaning the data.

109
00:05:49,480 --> 00:05:53,839
Speaker 2: Yeah, to get to a meaningful quality. So there is

110
00:05:53,839 --> 00:05:56,560
certainly some pressure on that, but we're we're bringing a

111
00:05:56,560 --> 00:06:00,519
lot of people in without a lot of experience in the space,

112
00:06:00,800 --> 00:06:03,120
and they are not concerned about these things and they

113
00:06:03,120 --> 00:06:03,439
should be.

114
00:06:03,600 --> 00:06:03,879
Speaker 3: Mm-hmm.

115
00:06:04,480 --> 00:06:06,439
Speaker 2: So as usual, Joshua, thank you so much for your

116
00:06:06,439 --> 00:06:08,560
great comment, and a copy of Music to Code By is on

117
00:06:08,600 --> 00:06:10,160
its way to you. And if you'd like a copy of Music

118
00:06:10,160 --> 00:06:11,759
to Code By, write a comment on the website at dot

119
00:06:11,800 --> 00:06:14,279
net rocks dot com or on the facebooks. We publish every

120
00:06:14,279 --> 00:06:15,560
show there, and if you comment there and it's read

121
00:06:15,680 --> 00:06:17,439
on the show, we'll send you a copy of Music to

122
00:06:17,519 --> 00:06:18,199
Code By. Music to

123
00:06:18,240 --> 00:06:20,959
Speaker 1: Code By dot net. That is how you can find it.

124
00:06:20,959 --> 00:06:23,839
It's still going strong. Track twenty two is available now

125
00:06:23,920 --> 00:06:25,920
and of course you get the whole collection MP three

126
00:06:26,079 --> 00:06:32,720
FLAC, or WAV. This is episode nineteen forty seven, and

127
00:06:32,839 --> 00:06:34,879
so I can just give a little brief summary here

128
00:06:34,879 --> 00:06:37,439
about what happened that year. It was a year of

129
00:06:37,480 --> 00:06:42,839
significant global transformations. Of course, India and Pakistan gained independence

130
00:06:42,839 --> 00:06:46,279
from British rule, and the Marshall Plan was initiated to

131
00:06:46,319 --> 00:06:50,879
help rebuild war torn Europe. In the United States, Jackie

132
00:06:50,959 --> 00:06:55,639
Robinson broke baseball's color barrier, and Chuck Yeager broke the

133
00:06:55,639 --> 00:06:59,959
sound barrier. Also, the Truman Doctrine, which pledged US support

134
00:07:00,120 --> 00:07:05,000
for nations threatened by communism, was established, which signaled the

135
00:07:05,040 --> 00:07:08,040
beginning of the Cold War. You got anything to add?

136
00:07:08,839 --> 00:07:12,199
Speaker 2: Nineteen forty seven is the year of the invention of the transistor.

137
00:07:12,360 --> 00:07:16,639
That's Bardeen, Brattain, and Shockley with the first point-contact

138
00:07:16,639 --> 00:07:19,120
transistor. It had been talked about for years, but semiconductor

139
00:07:19,360 --> 00:07:22,240
making, pure material semiconductors, was hard. A lot of

140
00:07:22,279 --> 00:07:25,439
that came out of the war. And so what's funny

141
00:07:25,480 --> 00:07:27,079
is reading some of the documentation at the time and

142
00:07:27,079 --> 00:07:28,279
they weren't sure if it was actually going to be

143
00:07:28,319 --> 00:07:30,800
all that useful. Interesting, you know, because vacuum tubes did

144
00:07:30,839 --> 00:07:33,120
such a good job and this thing did not have

145
00:07:33,160 --> 00:07:36,120
the potential the vacuum tube did because the materials weren't

146
00:07:36,160 --> 00:07:38,120
quite good enough yet. They had no idea what was

147
00:07:38,160 --> 00:07:40,600
about to happen. Huh No, but they you know, Shockley

148
00:07:40,600 --> 00:07:44,360
would go on with Fairchild Semiconductor to help make the

149
00:07:44,360 --> 00:07:46,800
original ICs with Robert Noyce and all those. Like, all

150
00:07:46,839 --> 00:07:49,560
of that comes from there. But there's the beginning and

151
00:07:49,879 --> 00:07:53,759
one other fallout of the war: Raytheon's Radarange, the

152
00:07:53,839 --> 00:07:55,040
first microwave oven.

153
00:07:55,279 --> 00:07:58,399
Speaker 1: Wow, yeah, radar range, that's what they called it.

154
00:07:58,480 --> 00:08:00,759
Speaker 2: They called it. They called it the radar range literally

155
00:08:00,800 --> 00:08:05,240
because some of the early earlier technicians working on radar

156
00:08:05,360 --> 00:08:07,920
noticed that the chocolate bars in their pockets would melt

157
00:08:07,920 --> 00:08:10,199
when they got too close to the magnetrons.

158
00:08:10,399 --> 00:08:13,279
Speaker 1: Okay, so how safe was this thing? It doesn't sound

159
00:08:13,319 --> 00:08:14,360
like it was very safe.

160
00:08:14,439 --> 00:08:18,879
Speaker 2: It can be. It is superheating the water, so it

161
00:08:18,920 --> 00:08:21,040
will you will get water burns if you get too

162
00:08:21,079 --> 00:08:21,560
close to it.

163
00:08:21,720 --> 00:08:25,519
Speaker 1: Yeah, right. But radiation poisoning, were they worried about that?

164
00:08:25,639 --> 00:08:28,759
Speaker 2: No, no. Still, there's no, I, you know, ionizing radiation

165
00:08:28,839 --> 00:08:31,399
is more complicated than this. This is just microwaves, just

166
00:08:31,519 --> 00:08:34,960
like a bright light, and it can burn you because

167
00:08:35,000 --> 00:08:39,279
it vibrates the water molecules, and those water molecules, once

168
00:08:39,320 --> 00:08:41,039
they get to a certain temperature, are going to damage

169
00:08:41,039 --> 00:08:41,759
the tissues they're in.

170
00:08:41,919 --> 00:08:44,799
Speaker 1: Okay, all right, let's bring Vishwas back on dot net

171
00:08:44,879 --> 00:08:46,159
rocks for the umpteenth time.

172
00:08:46,519 --> 00:08:49,360
Speaker 2: It's number seventeen, by the way. I counted. Hey,

173
00:08:49,480 --> 00:08:52,519
well, that's umpteen, isn't it? That's umpteenth. Yeah, we're in

174
00:08:52,759 --> 00:08:53,720
the umpteenth category.

175
00:08:54,639 --> 00:08:58,000
Speaker 1: Vishwas Lele serves as co-founder and CEO at pWin

176
00:08:58,039 --> 00:09:01,919
dot ai. A noted industry speaker and author, Mister

177
00:09:02,000 --> 00:09:07,600
Lele is the Microsoft Regional Director for Washington, DC. Welcome back, Vishwas.

178
00:09:07,720 --> 00:09:10,799
So what's new in the world of AI in Vishwas Land?

179
00:09:11,080 --> 00:09:15,399
Speaker 3: Well, I've been, we talked to you last year, very

180
00:09:15,440 --> 00:09:21,440
focused still on trying to grow this GenAI startup idea

181
00:09:21,480 --> 00:09:22,480
that we started working on.

182
00:09:22,519 --> 00:09:26,200
Speaker 2: Yeah, you left your longtime company to do this startup.

183
00:09:27,000 --> 00:09:30,600
Speaker 3: Yes, I left. I was there for thirty years at the

184
00:09:30,639 --> 00:09:33,960
previous company, the last sixteen years as the CTO. How's it going? Well,

185
00:09:34,000 --> 00:09:36,519
it's going. It's going well. I mean, as you can

186
00:09:36,559 --> 00:09:39,559
expect, the startup world is all about good days

187
00:09:39,600 --> 00:09:42,200
and bad days. But it has been very exciting a

188
00:09:42,200 --> 00:09:44,000
lot of learning and we are growing.

189
00:09:44,159 --> 00:09:44,600
Speaker 1: That's cool.

190
00:09:44,639 --> 00:09:46,919
Speaker 2: Now. If I recall when we talked last time, you

191
00:09:46,960 --> 00:09:52,120
were focused on generating responses to requests for proposals,

192
00:09:52,159 --> 00:09:55,679
the RFPs. That's right, right, which in government land can

193
00:09:55,720 --> 00:09:58,320
be absolutely massive, like hundreds of pages.

194
00:09:58,399 --> 00:10:01,440
Speaker 3: That's right, that's right, and that's what we have been

195
00:10:01,480 --> 00:10:07,279
focused on. We call ourselves a thoughtful copilot. And I'll

196
00:10:07,480 --> 00:10:10,480
explain the word thoughtful in a second. We call ourselves

197
00:10:10,559 --> 00:10:15,039
thoughtful copilot for proposal writers. So what that means is

198
00:10:15,480 --> 00:10:18,759
we will take your information. You, you know, in the

199
00:10:18,879 --> 00:10:23,960
years past you have responded to RFP documents that you've done.

200
00:10:24,320 --> 00:10:28,440
You have competency documents, you have some fancy documents like

201
00:10:28,519 --> 00:10:32,600
CPARS, and for those who don't know this already,

202
00:10:32,679 --> 00:10:35,960
CPARS are documents that government agencies give you to validate

203
00:10:36,039 --> 00:10:39,519
your experience. So, needless to say, you have a collection

204
00:10:39,639 --> 00:10:45,320
of documents and now a new RFP comes along. And

205
00:10:45,399 --> 00:10:48,159
these RFP documents can be, as you said, Richard, can

206
00:10:48,200 --> 00:10:51,279
be really complex, can be fifteen hundred pages long with

207
00:10:51,320 --> 00:10:54,919
different sections in them. So for example, there might be

208
00:10:54,960 --> 00:10:59,080
a section which describes how you should respond to the RFP,

209
00:11:00,120 --> 00:11:04,519
maybe a section which talks about what would be the

210
00:11:04,559 --> 00:11:08,600
evaluation criteria for somebody looking at your RFP. And then

211
00:11:09,120 --> 00:11:11,440
there may be another section which talks about all the

212
00:11:11,480 --> 00:11:12,559
things that you must do.

213
00:11:12,879 --> 00:11:14,559
Speaker 1: So that's all the boring stuff.

214
00:11:14,799 --> 00:11:16,000
Speaker 3: Our job is all the boring stuff.

215
00:11:16,039 --> 00:11:18,720
Speaker 1: Usually the stuff that you want to read last,

216
00:11:19,600 --> 00:11:20,639
you have to read it first.

217
00:11:20,799 --> 00:11:24,919
Speaker 3: Yes, yes, and all this is part of something called

218
00:11:24,919 --> 00:11:28,519
the Federal Acquisition Regulation, which came out many, many

219
00:11:28,559 --> 00:11:33,440
years ago, has different sections related to different parts of solicitation.

220
00:11:34,039 --> 00:11:38,679
So what we specialize in is, now that we have

221
00:11:38,759 --> 00:11:44,039
your knowledge repository built out, we deeply understand the solicitation

222
00:11:44,080 --> 00:11:47,519
that comes in, and then we help you write a

223
00:11:47,639 --> 00:11:54,639
draft quickly solely based on your information, so that we

224
00:11:54,679 --> 00:11:56,759
can give you this first draft quickly and then you

225
00:11:56,799 --> 00:11:59,720
can build on it. And there's a very interesting saying

226
00:11:59,759 --> 00:12:03,519
in the proposal world that proposals are often won at

227
00:12:03,559 --> 00:12:07,279
the end, Richard. That, you know, if you leave yourself enough

228
00:12:07,360 --> 00:12:10,879
time to do some creative thinking, then you get ahead of

229
00:12:10,879 --> 00:12:16,080
your competition, you do some innovative thinking. But the sad

230
00:12:16,120 --> 00:12:18,559
reality of proposal writing is you never reach the end.

231
00:12:18,600 --> 00:12:20,840
You spend all your time in the middle, sure, just

232
00:12:20,919 --> 00:12:24,639
trying to get a good proposal out. So our value

233
00:12:24,639 --> 00:12:28,399
proposition is very simple. Can we accelerate the initial stages

234
00:12:28,480 --> 00:12:31,559
so that we give you more polishing and thinking time.

235
00:12:31,879 --> 00:12:36,840
And that's ultimately what improves your probability of winning,

236
00:12:36,879 --> 00:12:40,240
which is the name of our company, pWin dot ai.

237
00:12:40,279 --> 00:12:43,279
Speaker 1: The probability of winning. Mm hmm, that's really good.

238
00:12:43,720 --> 00:12:46,440
Speaker 2: Well, part of this, again, recalling the previous show,

239
00:12:46,639 --> 00:12:49,240
was being able to respond to more RFPs, so you

240
00:12:49,320 --> 00:12:53,080
have more chances at work opportunities. The thing, of course,

241
00:12:53,080 --> 00:12:57,279
that scares me is you're committing to stuff inside this

242
00:12:57,480 --> 00:13:00,399
RFP. Yep. And so if the tool has already written the

243
00:13:00,480 --> 00:13:03,080
text for it, and it creates a commitment for you, you

244
00:13:03,039 --> 00:13:05,440
can't deliver on, like, you could be in serious trouble.

245
00:13:05,519 --> 00:13:09,399
Speaker 1: I think that's why you said it creates a rough draft, right? Yes,

246
00:13:09,559 --> 00:13:11,879
you can't just like say, okay, here it is, send

247
00:13:11,960 --> 00:13:12,320
it off.

248
00:13:12,960 --> 00:13:15,960
Speaker 3: See. Yeah, And that's that's the thing.

249
00:13:16,279 --> 00:13:18,639
Speaker 1: Yeah, you certainly have to know what you're saying.

250
00:13:18,879 --> 00:13:21,559
Speaker 3: Yeah, no, Carl, that's a very good, good catch there.

251
00:13:21,600 --> 00:13:23,919
That's why I said draft. We're trying to improve the

252
00:13:24,000 --> 00:13:26,159
quality of drafts so that you know, you can focus

253
00:13:26,200 --> 00:13:28,639
on other things. But at the end of the day,

254
00:13:28,720 --> 00:13:31,879
make no mistake, you're putting your name down on the proposal.

255
00:13:32,000 --> 00:13:33,519
Speaker 2: Sure, so you should read it.

256
00:13:34,240 --> 00:13:34,960
Speaker 3: You should read it.

257
00:13:35,240 --> 00:13:38,000
Speaker 1: So it's kind of like having a you know, a

258
00:13:38,200 --> 00:13:41,279
manager under you or a secretary or somebody who knows

259
00:13:41,320 --> 00:13:43,639
your business, where you can just say, hey, look through

260
00:13:43,679 --> 00:13:47,320
this RFP and give me a rough draft and it

261
00:13:47,639 --> 00:13:51,480
might have all the accurate information from your side, but

262
00:13:51,879 --> 00:13:54,080
you know, you're the one that's got to commit to

263
00:13:54,360 --> 00:13:58,840
what you're agreeing to. And so I guess my question

264
00:13:58,879 --> 00:14:00,559
here is, can you go back? Can I have a

265
00:14:00,600 --> 00:14:05,799
conversation with this bot if you will, and say, you know,

266
00:14:05,919 --> 00:14:10,679
why did you why did you commit me to doing X?

267
00:14:11,519 --> 00:14:13,480
You know, and have it give you an answer from

268
00:14:13,519 --> 00:14:14,080
your data.

269
00:14:14,440 --> 00:14:19,200
Speaker 3: That's a that's a good point. We have an intermediate stage,

270
00:14:19,279 --> 00:14:24,200
and this is the patent, I should say, that we filed.

271
00:14:24,279 --> 00:14:30,279
We have a provisional patent. Yeah. So, it's not

272
00:14:30,519 --> 00:14:35,320
like the bot is just or the copilot in this

273
00:14:35,360 --> 00:14:37,519
case of our proposal, copilot is just going to look

274
00:14:37,519 --> 00:14:40,799
at your documents and just spit out one hundred page

275
00:14:40,840 --> 00:14:44,200
document that it thinks that you should be doing. That's

276
00:14:44,240 --> 00:14:47,039
a recipe for disaster because it'll go pick up things

277
00:14:47,320 --> 00:14:49,600
that you don't want it to do. And there's a

278
00:14:49,720 --> 00:14:53,279
very important threshold where you really have to save time

279
00:14:53,320 --> 00:14:55,200
for people. If people say, hey, I have to first

280
00:14:55,240 --> 00:14:59,639
read your silly forty-page document that you generated and then

281
00:14:59,679 --> 00:15:01,720
I have to rewrite it. Thank you, I'm not going

282
00:15:01,720 --> 00:15:03,879
to use this tool. So there's a very important threshold

283
00:15:03,879 --> 00:15:07,639
that you have to cross. So, in our model, the

284
00:15:08,200 --> 00:15:11,240
patent that we have filed is called object based writing.

285
00:15:11,799 --> 00:15:16,840
Oh wow, okay, and what

286
00:15:16,879 --> 00:15:21,200
that means is that once, let's say, the RFP

287
00:15:21,279 --> 00:15:25,000
arrives and we have gone in and tried to analyze it,

288
00:15:25,840 --> 00:15:29,200
and we create something called the flight plan. And why

289
00:15:29,240 --> 00:15:32,440
are we calling it a flight plan? Because we are the

290
00:15:32,480 --> 00:15:36,679
copilot for proposal writers and in the aviation industry, obviously

291
00:15:36,759 --> 00:15:39,240
there's a very important concept called flight plan that is

292
00:15:39,360 --> 00:15:42,799
used for communication. So what we do is we try

293
00:15:42,840 --> 00:15:46,600
to deeply using AI models, try to deeply understand the RFP,

294
00:15:46,879 --> 00:15:53,759
understand the evaluation criteria, understand the instructions, and we manifest

295
00:15:53,879 --> 00:15:58,039
that into a UI pattern called the flight plan. So

296
00:15:58,159 --> 00:16:01,000
you go in and then you interact with the flight plan,

297
00:16:01,440 --> 00:16:05,799
and this is where you go and say that I

298
00:16:05,879 --> 00:16:10,919
want you to use this architecture. And let's just take

299
00:16:10,960 --> 00:16:14,000
a concrete example that our listeners can follow along. Let's

300
00:16:14,000 --> 00:16:17,720
say the RFP is about a records management solution, right,

301
00:16:18,399 --> 00:16:20,440
and you may have done ten different types of records

302
00:16:20,440 --> 00:16:24,279
management solutions, maybe used SharePoint for that, maybe used some

303
00:16:24,679 --> 00:16:28,759
purpose-built records management solutions for that.

304
00:16:29,000 --> 00:16:33,440
Speaker 1: By the way, I'm sorry, yep, yep.

305
00:16:33,639 --> 00:16:36,240
Speaker 3: So if you had to do that. So once the

306
00:16:36,279 --> 00:16:39,519
flight plan comes in, what we do is we say, hey,

307
00:16:40,759 --> 00:16:42,879
we think that you have used these three or

308
00:16:42,879 --> 00:16:45,960
four architectural patterns to go after this kind of work.

309
00:16:47,519 --> 00:16:51,559
Here's a suggestion, but please you need to update that

310
00:16:52,279 --> 00:16:56,440
with, maybe, like you said, Carl, SharePoint is not

311
00:16:56,480 --> 00:16:58,639
the right solution. You want to do something else, so

312
00:16:58,720 --> 00:17:01,879
replace that, And in that flight plan you tell us

313
00:17:01,919 --> 00:17:06,400
that you know, I'm wanting to undertake this task using

314
00:17:06,519 --> 00:17:10,319
this architectural pattern. And then you also in the flight

315
00:17:10,359 --> 00:17:14,880
plan you get to tell people do you really understand

316
00:17:15,200 --> 00:17:17,680
the pain points? Do you really understand the motivation of

317
00:17:17,720 --> 00:17:22,000
the buyer? Why did they issue the RFP? And do

318
00:17:22,079 --> 00:17:24,720
you why do you why do you think you're differentiated

319
00:17:24,759 --> 00:17:28,079
from other competitors? If all your differentiation is, we

320
00:17:28,160 --> 00:17:31,359
have all the certifications in place, well, every other competitor

321
00:17:31,400 --> 00:17:33,799
is going to say that. So we force you to

322
00:17:33,920 --> 00:17:39,960
think critically about the response at a higher level.

323
00:17:40,039 --> 00:17:43,000
Speaker 1: So is it kind of like an outline that you

324
00:17:43,039 --> 00:17:46,880
can interact with? It seems like it is. You you

325
00:17:46,920 --> 00:17:50,480
have a model for a conversation with the AI. Yes,

326
00:17:50,920 --> 00:17:53,880
so yes to this, no to this, what about that?

327
00:17:54,160 --> 00:17:57,079
Speaker 3: Yes? And you know, so you're setting think of this

328
00:17:57,119 --> 00:18:00,200
as that you're setting the vision for the response that

329
00:18:00,240 --> 00:18:03,880
you want to generate. Right, you're talking about your competitors,

330
00:18:03,880 --> 00:18:06,000
you're talking about your win themes, you're talking about your

331
00:18:06,039 --> 00:18:10,000
pain points, you're talking about your solutioning strategy, and on

332
00:18:10,039 --> 00:18:12,079
and on and on. There are lots of data structures that

333
00:18:12,119 --> 00:18:16,720
we capture, and then once we have that information, then

334
00:18:16,799 --> 00:18:20,559
we try, in a thoughtful manner, to infuse

335
00:18:21,279 --> 00:18:23,680
what you gave us as a direction. Right, if your

336
00:18:23,720 --> 00:18:25,880
capture team, if your salespeople have done a good job

337
00:18:26,200 --> 00:18:28,680
and they're worth their salary, they probably on the golf

338
00:18:28,680 --> 00:18:32,200
course realized that the key motivation for this RFP was

339
00:18:32,880 --> 00:18:35,240
the pain point this organization is going through is the

340
00:18:35,319 --> 00:18:38,559
last three upgrades were not there, and that key point

341
00:18:39,079 --> 00:18:41,599
AI does not know about or would never know about.

342
00:18:42,000 --> 00:18:45,680
And this will come from your capture team. And then you

343
00:18:45,720 --> 00:18:49,079
take that information, you have all of the data structures,

344
00:18:49,799 --> 00:18:52,880
you have your knowledge repository, and I'll talk more about it.

345
00:18:54,480 --> 00:18:57,279
We do a lot of work, and I very much

346
00:18:57,680 --> 00:19:00,599
agree with what you were saying, Carl, earlier about Seth's comment.

347
00:19:01,039 --> 00:19:06,319
We do a lot of collation, classification, tagging of content

348
00:19:06,400 --> 00:19:09,440
and cleaning of content. So now we have your content,

349
00:19:09,559 --> 00:19:12,599
We have your vision for how you want to respond,

350
00:19:13,400 --> 00:19:16,720
and then we start drafting your content in a thoughtful manner.

351
00:19:17,319 --> 00:19:21,079
And that drafting is done using best practices. And this

352
00:19:21,160 --> 00:19:24,359
is where we are really fortunate to have Shipley. I

353
00:19:24,559 --> 00:19:27,079
talked about this last time. Shipley is a fifty year

354
00:19:27,119 --> 00:19:30,960
old company which taught people how to write

355
00:19:31,119 --> 00:19:34,480
persuasive proposals. That's what they do. They teach classes, they

356
00:19:34,519 --> 00:19:38,519
generate content. So now we have this information, we use

357
00:19:38,599 --> 00:19:42,880
their best practices. And one example would be when you're

358
00:19:42,920 --> 00:19:46,799
responding to an RFP, be sure about customer centricity.

359
00:19:47,119 --> 00:19:50,599
Talk about the customer's problems, but speak to those problems

360
00:19:50,640 --> 00:19:53,640
maybe in your terms without sort of taking ownership of

361
00:19:53,640 --> 00:19:56,480
the problem. And as if you're teaching them, right? About

362
00:19:56,480 --> 00:20:01,160
active voice, passive voice, and then very important those are

363
00:20:01,200 --> 00:20:05,440
all very important. And then you start with the problem understanding.

364
00:20:05,599 --> 00:20:08,000
Then you say where you have done this kind of work,

365
00:20:08,240 --> 00:20:10,799
where you have really good past performance on this.

366
00:20:11,440 --> 00:20:14,119
So there are best practices of writing that come in.

367
00:20:14,559 --> 00:20:21,359
And this is why it takes us minutes and hours

368
00:20:21,400 --> 00:20:25,440
to generate the draft, right because you know, I was

369
00:20:25,480 --> 00:20:27,480
joking with you last year when I was on the show

370
00:20:27,920 --> 00:20:29,960
that if you went to ChatGPT and said, ChatGPT,

371
00:20:30,079 --> 00:20:32,279
this is a very important question. I'm going to ask you,

372
00:20:32,319 --> 00:20:34,920
please think about it for five minutes before you respond.

373
00:20:35,759 --> 00:20:37,839
ChatGPT does not know what to do with the

374
00:20:38,000 --> 00:20:40,920
four minutes and fifty nine seconds, right, because it just

375
00:20:41,480 --> 00:20:44,240
gave you the answer right away, right? That's how

376
00:20:44,680 --> 00:20:46,319
it works. In our case, what we are

377
00:20:46,319 --> 00:20:48,920
doing is, Oh, you have asked us to draft this information.

378
00:20:49,720 --> 00:20:53,440
Let us see. Oh, you told us this here, okay,

379
00:20:53,720 --> 00:20:58,039
and your knowledge repository is saying this. And one other thing

380
00:20:58,680 --> 00:21:01,240
is the comment that you read just a moment

381
00:21:01,279 --> 00:21:04,720
ago, Richard, that if you tell the language model that

382
00:21:05,400 --> 00:21:08,000
please take these ten things that I'm telling you and

383
00:21:08,119 --> 00:21:13,640
infuse them and generate some content, the language model may listen

384
00:21:13,680 --> 00:21:15,920
to the first or second requirement and maybe the last

385
00:21:16,000 --> 00:21:17,960
requirement and then forget everything in the middle.

386
00:21:18,359 --> 00:21:18,599
Speaker 2: Right.

387
00:21:19,640 --> 00:21:22,680
Speaker 3: So this is where the thoughtful copilot comes in. We

388
00:21:22,799 --> 00:21:25,960
go in a step-by-step manner to generate the content.

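The step-by-step generation described here, addressing requirements one at a time instead of stuffing all ten into a single prompt where the middle items get forgotten, can be sketched roughly like this. The `draft_section` helper and knowledge dictionary are illustrative assumptions, not the actual Rohirrim implementation:

```python
def draft_section(requirement: str, context: str) -> str:
    """Stand-in for one focused LLM call; a real system would call a model here."""
    return f"Response to '{requirement}' using: {context}"

def step_by_step_draft(requirements: list[str], knowledge: dict[str, str]) -> list[str]:
    """Address each requirement in its own focused call, so none are lost
    the way the middle items of one long prompt can be."""
    drafts = []
    for req in requirements:
        # Pull only the context relevant to this one requirement.
        context = knowledge.get(req, "no matching past performance")
        drafts.append(draft_section(req, context))
    return drafts

reqs = ["security plan", "staffing"]
kb = {"security plan": "our FedRAMP work"}
for d in step_by_step_draft(reqs, kb):
    print(d)
```

The design choice is simply many small, scoped calls over one giant one; each call's context window holds only what that section needs.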
389
00:21:26,160 --> 00:21:27,799
Speaker 2: I love it, and I appreciate that. This is, I

390
00:21:27,839 --> 00:21:32,480
got to imagine, when you're an experienced RFP person, you're

391
00:21:32,559 --> 00:21:36,240
probably comparing the new RFP to past ones you've done.

392
00:21:36,680 --> 00:21:38,079
There's got to be a certain amount of cut and

393
00:21:38,119 --> 00:21:41,839
paste there at least referencing material, which means you tend

394
00:21:41,920 --> 00:21:44,440
to try and solve the same problem over and over again,

395
00:21:45,440 --> 00:21:49,279
rather than this customer centric like do you understand this

396
00:21:49,400 --> 00:21:52,640
particular problem this time, which may or may not be

397
00:21:52,720 --> 00:21:57,000
explicitly stated, and then make sure you're tuning the material

398
00:21:57,519 --> 00:21:59,880
to that, even if it is from past material. This

399
00:22:00,200 --> 00:22:04,319
is where the language model actually is super advantageous, because

400
00:22:04,359 --> 00:22:07,200
you can train it on the past data and it

401
00:22:07,319 --> 00:22:12,359
will generate original text, yep, presumably with the impetus of

402
00:22:12,440 --> 00:22:15,720
the new customer problem attached to it. This is stuff

403
00:22:15,720 --> 00:22:17,920
that's hard for people to do, but I think the

404
00:22:18,079 --> 00:22:19,480
software would be pretty good at it.

405
00:22:19,559 --> 00:22:22,720
Speaker 3: That's right. This is a very important point. In

406
00:22:22,759 --> 00:22:25,440
my previous role, which we talked about, I was there for

407
00:22:25,480 --> 00:22:28,839
a long time and I didn't write proposals, but I

408
00:22:28,880 --> 00:22:32,160
often served as a technical SME for a certain section. Right? Hey,

409
00:22:32,160 --> 00:22:34,519
this is the section we need to write about. How

410
00:22:34,559 --> 00:22:38,440
should we write about it? And what would happen there is,

411
00:22:38,680 --> 00:22:41,440
I would tend to talk about the projects that I

412
00:22:41,559 --> 00:22:43,880
focused on, you know, even if they were not the

413
00:22:43,960 --> 00:22:50,279
best fit. Right, because organizational memory influences how you respond

414
00:22:50,640 --> 00:22:53,599
versus This is where the superpower of the language model

415
00:22:53,759 --> 00:22:57,920
is that you have one hundred different past performances to

416
00:22:58,039 --> 00:23:01,519
look for, and maybe the ones I picked because

417
00:23:01,559 --> 00:23:03,400
of my bias because that's what I've worked on and

418
00:23:03,400 --> 00:23:05,480
I'm very proud of those. Maybe those are not the

419
00:23:05,480 --> 00:23:08,640
best ones. Maybe you should put something else that has

420
00:23:09,119 --> 00:23:12,759
better engagement to the problem at hand.

421
00:23:13,160 --> 00:23:13,799
Speaker 2: Right, So.

422
00:23:15,599 --> 00:23:19,599
Speaker 1: I can see the advantage of this because it's very specific.

423
00:23:19,640 --> 00:23:22,160
But then again, you have to feed it your data.

424
00:23:22,279 --> 00:23:25,200
What about the stuff that's built into Office? You know,

425
00:23:25,279 --> 00:23:28,160
that knows, like, all of your documents and your

426
00:23:28,160 --> 00:23:31,960
spreadsheets and your files and your PDFs and all

427
00:23:32,000 --> 00:23:36,400
that stuff. Is it too broad? Is that knowledge too

428
00:23:36,440 --> 00:23:40,440
broad to feed an RFP into and say come out

429
00:23:40,480 --> 00:23:41,400
with something meaningful.

430
00:23:41,519 --> 00:23:45,599
Speaker 3: Yes, that's a great question. We don't

431
00:23:45,640 --> 00:23:49,799
get that question much anymore. But previously

432
00:23:49,839 --> 00:23:51,440
we used to get this question, why should I use

433
00:23:51,480 --> 00:23:54,519
your copilot? We have the M365 Copilot, right?

434
00:23:55,279 --> 00:23:57,079
And the answer to that question is, if I can

435
00:23:57,200 --> 00:23:59,839
even take a step back, you have general purpose

436
00:24:00,319 --> 00:24:01,720
ChatGPT, Claude,

437
00:24:01,440 --> 00:24:01,920
Speaker 1: What have you?

438
00:24:02,319 --> 00:24:06,240
Speaker 3: Right, and think of them as outside your firewall. Great tools.

439
00:24:06,319 --> 00:24:08,200
I love them. I'm sure all of you have two

440
00:24:08,279 --> 00:24:10,920
or three or more subscriptions because they are great at

441
00:24:11,440 --> 00:24:17,160
synthesizing information, summarizing information. If you're trying to write something

442
00:24:17,400 --> 00:24:22,640
about a topic, it's great to collaborate with to generate some content.

443
00:24:22,720 --> 00:24:25,079
Of course, you have to think, ultimately you own

444
00:24:25,160 --> 00:24:29,279
the content, and there can be inaccuracies, of course. So

445
00:24:29,359 --> 00:24:33,599
that's outside the firewall tools. And I say that with

446
00:24:33,720 --> 00:24:38,039
some hesitation, because only two weeks ago or three weeks ago,

447
00:24:38,400 --> 00:24:41,319
ChatGPT announced that, hey, in addition to outside stuff,

448
00:24:41,559 --> 00:24:43,720
you can also point to a Google drive or something

449
00:24:43,799 --> 00:24:47,160
like that and we'll start ingesting your content. So let's

450
00:24:47,160 --> 00:24:49,799
just keep that aside for a moment. Let's just talk

451
00:24:49,839 --> 00:24:56,400
about Claude and ChatGPT. Great tools, but outside the

452
00:24:56,400 --> 00:24:59,000
firewall tools. And then you have the office productivity tools

453
00:24:59,039 --> 00:25:02,440
like M365 Copilot and equivalents that are

454
00:25:02,440 --> 00:25:07,640
built into your enterprise. And they're great, right?

455
00:25:08,000 --> 00:25:10,880
They know who you talk to a lot on email,

456
00:25:11,039 --> 00:25:13,400
they know who the subject matter experts are. They have

457
00:25:13,400 --> 00:25:16,720
full access to your OneDrive, what have you.

458
00:25:17,279 --> 00:25:21,799
So they're great. But again they are general purpose, and

459
00:25:22,680 --> 00:25:26,200
two important differences right. You can use them to write

460
00:25:26,200 --> 00:25:29,000
an email to a customer who's irate. You can use

461
00:25:29,039 --> 00:25:31,359
it to plan a party, a going away party for

462
00:25:31,480 --> 00:25:34,039
your coworker. They do well; they will get all

463
00:25:34,079 --> 00:25:36,799
these things right. You don't have to worry about copying

464
00:25:36,839 --> 00:25:40,039
and pasting information because they have access to all the data.

465
00:25:40,559 --> 00:25:43,559
There is security built in. They only get to information

466
00:25:43,640 --> 00:25:46,480
that you have access to. All great things with

467
00:25:46,559 --> 00:25:50,759
those productivity tools. But when we talk about a domain

468
00:25:50,799 --> 00:25:54,440
specific copilot like ours, there are more things that

469
00:25:54,519 --> 00:25:58,880
need to be done. So I talked about having very

470
00:25:58,880 --> 00:26:01,880
specific documents. We go to a customer and say, can

471
00:26:01,920 --> 00:26:04,799
you please give us your past RFP response documents, and

472
00:26:04,839 --> 00:26:08,119
maybe they have two thousand of those documents. We look

473
00:26:08,160 --> 00:26:12,720
at those documents and we break them up into logical boundaries,

474
00:26:12,799 --> 00:26:16,240
We classify them, we auto tag them, things like that.

475
00:26:16,400 --> 00:26:20,599
We are very opinionated about what we are trying to

476
00:26:20,640 --> 00:26:23,720
decipher from your proposal library content.

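The opinionated ingestion described above, breaking past responses at logical section boundaries and auto-tagging them rather than chunking at fixed word counts, reduces to roughly this shape. The heading regex and the keyword-based tag vocabulary are invented for illustration; a production pipeline would use real document structure and a classifier:

```python
import re

# Hypothetical heading pattern for RFP responses, e.g. "3.2 Past Performance".
SECTION_RE = re.compile(r"^\d+(?:\.\d+)*\s+.+$", re.MULTILINE)

# Illustrative tag vocabulary; a real system would use a classifier or an LLM.
TAGS = {"past performance": "past-performance",
        "migration": "cloud-migration",
        "security": "security"}

def auto_tag(text: str) -> list[str]:
    """Naive keyword tagging, standing in for real auto-classification."""
    lower = text.lower()
    return [tag for key, tag in TAGS.items() if key in lower]

def chunk_by_sections(doc: str) -> list[dict]:
    """Split at logical section headings instead of fixed word counts,
    so each retrieved chunk is a coherent unit of the proposal."""
    starts = [m.start() for m in SECTION_RE.finditer(doc)] or [0]
    bounds = starts + [len(doc)]
    chunks = []
    for a, b in zip(bounds, bounds[1:]):
        text = doc[a:b].strip()
        if text:
            chunks.append({"text": text, "tags": auto_tag(text)})
    return chunks

doc = "1 Introduction\nWho we are.\n2 Past Performance\nOur cloud migration work."
for c in chunk_by_sections(doc):
    print(c["tags"], c["text"].splitlines()[0])
```

The point of the sketch is the boundary choice: chunks follow the document's own logical sections, so the context handed to the model later is precise rather than an arbitrary page or word window.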
477
00:26:24,119 --> 00:26:28,079
Speaker 1: It's a specific data set rather than everything in your

478
00:26:28,200 --> 00:26:31,279
enterprise, exactly. You know, you might not

479
00:26:31,359 --> 00:26:34,000
want that email to your wife about the new dog.

480
00:26:35,000 --> 00:26:37,599
In your response to the RFP.

481
00:26:37,559 --> 00:26:41,640
Speaker 3: You don't want that, no. Carl, that's absolutely right. And

482
00:26:41,680 --> 00:26:45,440
there's one other sort of technical issue. Right, the general

483
00:26:45,440 --> 00:26:50,079
purpose tools don't know exactly how to chunk these documents,

484
00:26:50,119 --> 00:26:52,759
and often they chunk it at some page boundary or

485
00:26:52,799 --> 00:26:57,000
some fixed word boundary, and we cannot do that. We

486
00:26:57,119 --> 00:27:00,880
chunk them based on the logical sections the structure of

487
00:27:00,920 --> 00:27:03,799
an RFP document, so that when we provide that as

488
00:27:03,839 --> 00:27:07,519
a context to the language model, we provide the most

489
00:27:07,559 --> 00:27:11,000
precise context. And it's all about context, right. Language models

490
00:27:11,000 --> 00:27:14,200
can get confused easily. So that's one

491
00:27:14,240 --> 00:27:17,200
big difference, right? All of the upfront work. If you

492
00:27:17,240 --> 00:27:20,519
ask me where we've done our most R and D.

493
00:27:21,480 --> 00:27:24,960
We are fifty people strong now, so you're growing quite

494
00:27:24,960 --> 00:27:27,160
a bit. So fifty to fifty-two or fifty-three

495
00:27:27,200 --> 00:27:34,920
people, many data scientists, an awesome team of engineers, MLOps, DevOps, whatnot,

496
00:27:35,000 --> 00:27:37,880
all running on Azure, and of course data science engineers.

497
00:27:38,039 --> 00:27:41,440
So one part of research is ingestion. How can we

498
00:27:41,519 --> 00:27:47,640
do opinionated, optimized ingestion? That's one part. And then also

499
00:27:47,759 --> 00:27:50,839
how do you glean some metadata from these documents? That's

500
00:27:50,880 --> 00:27:55,160
also important, right, because semantic search can break down if

501
00:27:55,160 --> 00:27:57,720
you're trying to find very factual information. So how do

502
00:27:57,799 --> 00:28:03,200
you extract that metadata, things like that. That's one part

503
00:28:03,240 --> 00:28:05,039
of R and D. The other part of R and

504
00:28:05,119 --> 00:28:09,000
D is we talked about the flight plan. We talked

505
00:28:09,000 --> 00:28:11,160
about how you get so much data from the user

506
00:28:11,240 --> 00:28:14,359
about win themes and pain points and solutioning and all

507
00:28:14,400 --> 00:28:20,720
of that. Right, how do you effectively generate prompts dynamically

508
00:28:21,519 --> 00:28:25,000
so that you can generate quality content that adheres to

509
00:28:25,039 --> 00:28:30,160
proposal writing best practices. So those two things together are

510
00:28:30,200 --> 00:28:33,200
why there is a domain specific copilot, and people try

511
00:28:33,279 --> 00:28:35,359
to do that initially. Hey, let me just upload the

512
00:28:35,440 --> 00:28:39,279
RFP to the M365 Copilot

513
00:28:39,319 --> 00:28:42,240
and say can you generate a response? And then in

514
00:28:42,279 --> 00:28:45,079
some cases the response is limited. We are generating a

515
00:28:45,160 --> 00:28:47,039
response which can be one hundred to one hundred and

516
00:28:47,039 --> 00:28:49,640
fifty pages long. So, first of all, you can't do

517
00:28:49,759 --> 00:28:53,839
that with a general purpose copilot. And secondly, how do

518
00:28:53,920 --> 00:28:56,799
you do it without having to write all the prompts yourself? In

519
00:28:56,839 --> 00:28:59,359
our system, the users never have to write a prompt

520
00:29:00,799 --> 00:29:04,799
because the prompts are changing. The best practices for prompts

521
00:29:04,880 --> 00:29:08,920
before the reasoning models came along, and the best practices

522
00:29:08,960 --> 00:29:11,519
for prompting after reasoning models are very different, and you

523
00:29:11,519 --> 00:29:15,119
can't expect your end users to keep learning new prompting techniques,

524
00:29:15,480 --> 00:29:20,119
so we dynamically generate the prompts based on what is

525
00:29:20,160 --> 00:29:24,480
in your knowledge repository. Right, So those are the two things.

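Dynamic prompt generation, where the end user never types a prompt and the prompting style can change when the underlying model family changes (pre- versus post-reasoning models), might be sketched like this. The templates, the model-family split, and the flight-plan fields are assumptions for illustration only:

```python
# Illustrative templates; real best-practice prompts would be far richer.
TEMPLATES = {
    "classic": "You are a proposal writer. Think step by step.\n{body}",
    "reasoning": "{body}",  # reasoning models often need less scaffolding
}

def build_prompt(model_family: str, win_themes: list[str], section: str) -> str:
    """Assemble a prompt from the user's flight-plan data; the user never types one."""
    body = (f"Draft the '{section}' section.\n"
            + "".join(f"- Win theme: {t}\n" for t in win_themes)
            + "Use active voice and stay customer-centric.")
    # Pick a prompting style per model family, so users never relearn techniques.
    style = "reasoning" if model_family in {"o1", "o3"} else "classic"
    return TEMPLATES[style].format(body=body)

print(build_prompt("o3", ["proven migrations"], "Past Performance"))
```

Swapping model families then means changing the template table, not retraining users; the flight-plan data flows into the prompt the same way either side of the switch.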
526
00:29:24,519 --> 00:29:26,319
Sorry for the long answer, but those are the two

527
00:29:26,359 --> 00:29:27,279
things that are different.

528
00:29:27,640 --> 00:29:30,400
Speaker 1: No, that's okay. Yeah, that's really great. And this seems

529
00:29:30,440 --> 00:29:32,759
like a good place to take a break, so we'll

530
00:29:32,759 --> 00:29:36,640
be right back after these very important messages. Stay tuned.

531
00:29:38,359 --> 00:29:40,799
Did you know you can lift and shift your dot

532
00:29:40,880 --> 00:29:45,000
net framework apps to virtual machines in the cloud. Use

533
00:29:45,039 --> 00:29:49,359
the Elastic Beanstalk service to easily migrate to Amazon EC

534
00:29:49,519 --> 00:29:54,200
two with little or no changes. Find out more at aws

535
00:29:54,359 --> 00:30:02,759
dot Amazon dot com slash Elastic Beanstalk, and we're back.

536
00:30:02,839 --> 00:30:05,960
It's dot net Rocks. I'm Carl Franklin, that's Richard Campbell, hey,

537
00:30:06,119 --> 00:30:10,920
and that's Vishwas Lele, our friend. And just as a reminder,

538
00:30:10,960 --> 00:30:13,200
if you don't want to hear these ads before, during,

539
00:30:13,359 --> 00:30:17,480
and after the show, you can become a patron on

540
00:30:17,519 --> 00:30:20,680
Patreon for five dollars a month. Go to Patreon dot

541
00:30:20,680 --> 00:30:22,599
dot NetRocks dot com. We'll give you a feed that

542
00:30:22,599 --> 00:30:25,640
has no ads. You still have to hear me apologizing

543
00:30:25,759 --> 00:30:30,640
for them. Okay, well, you know the thing, as you

544
00:30:30,640 --> 00:30:33,599
were talking, it just kind of occurs to me that

545
00:30:34,240 --> 00:30:38,039
what you're doing is you're creating a company around a

546
00:30:38,160 --> 00:30:43,240
specific domain, right that where you know how to ingest

547
00:30:43,519 --> 00:30:45,640
the documents, you know how to query, and you fine

548
00:30:45,680 --> 00:30:49,000
tuned it. It's not like somebody is going to come

549
00:30:49,039 --> 00:30:51,039
to you and say... maybe they have, but has anybody

550
00:30:51,039 --> 00:30:52,640
come to you and said, hey, you did such a

551
00:30:52,640 --> 00:30:56,319
great job for this domain, can you make me a

552
00:30:56,400 --> 00:31:01,000
company around this domain? And that's essentially what you've done

553
00:31:01,440 --> 00:31:04,400
just starting with the idea of their domain.

554
00:31:05,000 --> 00:31:07,799
And I imagine that's what a lot of our listeners

555
00:31:07,880 --> 00:31:10,039
are trying to do with their own businesses and their

556
00:31:10,039 --> 00:31:11,200
own domains, isn't it.

557
00:31:11,359 --> 00:31:15,000
Speaker 3: So that's an excellent question, Carl, That's an excellent question.

558
00:31:15,119 --> 00:31:19,359
Let me see, my thinking has evolved on this question. I

559
00:31:19,799 --> 00:31:21,839
think you had asked a similar question last year when

560
00:31:21,839 --> 00:31:25,000
we talked, and my thinking has evolved. So, Number one,

561
00:31:25,880 --> 00:31:28,200
I think the patterns that we have figured out in

562
00:31:28,279 --> 00:31:33,039
terms of breaking down these documents in a domain specific manner,

563
00:31:33,720 --> 00:31:38,160
generating these prompts dynamically, those patterns generally are applicable to

564
00:31:38,160 --> 00:31:41,519
any business problem. So that's right. But what I've also

565
00:31:41,559 --> 00:31:45,400
come to realize is in order to get true value

566
00:31:45,480 --> 00:31:49,240
from GenAI for any business process, any complex one, and

567
00:31:49,279 --> 00:31:51,920
proposal writing is a complex business process. There are hundreds

568
00:31:51,960 --> 00:31:57,279
and thousands of business processes. You really have to do

569
00:31:57,400 --> 00:32:00,599
a lot of domain specific work in order to provide

570
00:32:00,680 --> 00:32:03,599
value to the customers. It's not like you could take

571
00:32:03,640 --> 00:32:06,000
these patterns. Let's say you went

572
00:32:06,039 --> 00:32:08,279
to the financial sector, you went to the insurance sector,

573
00:32:08,680 --> 00:32:11,160
You'd have to do a lot of domain analysis to

574
00:32:11,200 --> 00:32:14,839
figure out what are the key entities, what should be

575
00:32:14,920 --> 00:32:18,480
the metadata. You'd have to do a lot of work

576
00:32:19,000 --> 00:32:25,359
to figure out the best practices, think about the prompt

577
00:32:25,519 --> 00:32:28,559
generation engine. So it's just not a matter of taking

578
00:32:28,640 --> 00:32:31,480
these patterns. Maybe the patterns apply at a higher level,

579
00:32:32,160 --> 00:32:35,200
but it is the work that has gone into realizing

580
00:32:35,279 --> 00:32:38,160
these patterns that is where we have spent the last year.

581
00:32:38,279 --> 00:32:40,240
Speaker 1: Right, because it seems like there's a lot of people

582
00:32:40,319 --> 00:32:43,519
that are using RAG and it's kind of like what

583
00:32:43,519 --> 00:32:47,440
you're doing here. But I don't think they are, from what

584
00:32:47,559 --> 00:32:50,920
I understand, look at it, like as Richard likes to say,

585
00:32:50,920 --> 00:32:53,359
it's in a squirt bottle. You know, we just want

586
00:32:53,440 --> 00:32:57,599
to spray some RAG on our documents and be able

587
00:32:57,640 --> 00:33:00,000
to query them and get what we want out of them.

588
00:33:00,640 --> 00:33:04,720
And what you're talking about here is a lot more involved,

589
00:33:04,759 --> 00:33:07,759
Like it's not just a RAG tool that you're using.

590
00:33:08,559 --> 00:33:12,039
It's everything from the original document, how to break them down,

591
00:33:12,079 --> 00:33:13,400
how to chunk them, how to put them in there,

592
00:33:13,480 --> 00:33:16,559
the things to look for, the way to respond like

593
00:33:17,440 --> 00:33:22,200
it's... And so I imagine we don't really

594
00:33:22,240 --> 00:33:25,480
look at it in terms of accuracy, because you're not saying, you

595
00:33:25,480 --> 00:33:28,599
know how many widgets are in BIN forty five right now,

596
00:33:30,240 --> 00:33:35,440
You're not. You're really just generating a text response which

597
00:33:35,480 --> 00:33:37,839
you ultimately have control over.

598
00:33:38,039 --> 00:33:42,279
Speaker 3: But accuracy is a very important part for us. By the way, Carl,

599
00:33:42,599 --> 00:33:45,400
responsible AI is an important tenet for us, and it

600
00:33:45,559 --> 00:33:48,960
manifests for us in many ways. And I'm really glad

601
00:33:49,000 --> 00:33:53,599
that we tried to adhere or architect to these tenets

602
00:33:53,599 --> 00:33:55,319
from the very beginning. And let me give you a

603
00:33:55,359 --> 00:33:59,480
couple of examples. Right, so, every time we generate a

604
00:33:59,519 --> 00:34:04,200
piece of text for you, the responsible AI tenet says

605
00:34:04,319 --> 00:34:07,119
that you should try to be transparent with the user.

606
00:34:07,720 --> 00:34:10,079
So you have to tell them exactly which document, which

607
00:34:10,159 --> 00:34:13,719
paragraph did you source the content from. That's one. And

608
00:34:13,760 --> 00:34:18,719
then ultimately we said, it's a shared responsibility model. It's

609
00:34:18,719 --> 00:34:22,559
about the complementarity between the AI models and the human beings.

610
00:34:23,199 --> 00:34:27,679
So how do you help the proposal writers get you

611
00:34:27,719 --> 00:34:30,840
an accurate response sooner? So we generate something called the

612
00:34:30,880 --> 00:34:33,559
hallucination report every time we generate a piece of text.

613
00:34:34,400 --> 00:34:38,079
And what the hallucination report is, is that every time

614
00:34:38,159 --> 00:34:43,039
we write out some text, we try to extract any

615
00:34:43,159 --> 00:34:48,920
assertions that you've made in that text or the AI

616
00:34:48,960 --> 00:34:51,960
has made on your behalf, based on your knowledge repository.

617
00:34:53,000 --> 00:34:56,000
And we said, you know, you're saying that you've done

618
00:34:56,440 --> 00:35:01,280
these five thousand, you know, cloud migrations, but we really can't

619
00:35:01,320 --> 00:35:05,280
find anything like this in our knowledge repository, right, which is.

620
00:35:05,280 --> 00:35:07,360
Speaker 2: A great thing for it to say when it can't

621
00:35:07,360 --> 00:35:09,559
find something, rather than make something up.

622
00:35:09,639 --> 00:35:12,199
Speaker 3: Yeah, that's right. And we know that the sales team

623
00:35:12,760 --> 00:35:15,360
tends to sometimes take a creative license with some things.

624
00:35:15,360 --> 00:35:19,719
Maybe they combined two projects, right? So it's not saying

625
00:35:19,719 --> 00:35:23,480
this is wrong. It is saying that it

626
00:35:23,519 --> 00:35:26,920
will serve you well to take this statement. We can't

627
00:35:26,960 --> 00:35:30,719
find an explicit reference to the statement in our repository.

628
00:35:31,079 --> 00:35:33,800
Can you come back and sort of take a look,

629
00:35:33,840 --> 00:35:36,199
and maybe you can approve that risk and say, no,

630
00:35:36,480 --> 00:35:41,519
we are taking these two projects in and we're saying

631
00:35:41,599 --> 00:35:44,800
that overall we have done these kinds of migrations for

632
00:35:44,960 --> 00:35:49,639
this aggregated agency, and that's how we are making this assertion.

633
00:35:49,880 --> 00:35:51,760
That's fine, you can approve that risk.

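The hallucination report described in this exchange, extract each assertion from the generated text, look for support in the knowledge repository, and flag anything unsupported for a human to approve or correct, reduces to roughly this shape. Sentence-level splitting and a substring "support" check are deliberate simplifications; a real system would use semantic matching:

```python
def extract_assertions(text: str) -> list[str]:
    """Naive assertion extraction: treat each sentence as one claim."""
    return [s.strip() for s in text.split(".") if s.strip()]

def hallucination_report(draft: str, repository: list[str]) -> list[dict]:
    """Flag assertions with no supporting passage; a human then approves
    the risk (e.g. an aggregated claim) or corrects the draft."""
    report = []
    for claim in extract_assertions(draft):
        supported = any(claim.lower() in doc.lower() for doc in repository)
        report.append({"claim": claim, "supported": supported})
    return report

repo = ["We completed 40 cloud migrations for federal agencies."]
draft = ("We completed 40 cloud migrations for federal agencies. "
         "We have 5000 cloud migrations.")
for row in hallucination_report(draft, repo):
    print(row)
```

Note the report does not say a flagged claim is wrong, matching the conversation above; it only says no explicit reference was found, leaving the accept-or-fix decision to the writer.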
634
00:35:51,880 --> 00:35:54,199
Speaker 1: Yeah, it may be an accurate inference or it may

635
00:35:54,199 --> 00:35:55,960
be completely wacky.

636
00:35:55,599 --> 00:35:57,159
Speaker 2: But at least if you have the references, you can

637
00:35:57,159 --> 00:35:59,119
go evaluate it yourself if you want. Yeah.

638
00:35:58,920 --> 00:36:00,559
Speaker 3: You can. You can go back and read. And the

639
00:36:00,639 --> 00:36:02,159
third thing.

640
00:36:02,119 --> 00:36:04,159
Speaker 2: And argue the interpretation.

641
00:36:04,199 --> 00:36:06,280
Speaker 3: Third thing that we also do is something called the

642
00:36:06,320 --> 00:36:10,480
completion report. Right, So the RFP document was forty to

643
00:36:10,480 --> 00:36:15,360
fifty pages long. It had instructions, it had evaluation criteria,

644
00:36:15,400 --> 00:36:19,360
it had many many requirements, and it's a very important

645
00:36:19,599 --> 00:36:22,599
aspect of responding that you're compliant. You're answering all the

646
00:36:22,679 --> 00:36:23,960
questions that you were asked to answer.

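A compliance matrix in this spirit, every requirement pulled from the RFP mapped to where (or whether) the response addresses it, with the human still validating the result, might look like the sketch below. The `R-1`-style requirement-ID convention is an invented example:

```python
import re

def build_compliance_matrix(rfp: str, response_sections: dict[str, str]) -> list[dict]:
    """Map each RFP requirement ID (e.g. 'R-1') to the response sections that
    mention it. Unmatched requirements are the ones a human must chase down."""
    req_ids = re.findall(r"\bR-\d+\b", rfp)
    matrix = []
    for rid in req_ids:
        hits = [name for name, text in response_sections.items() if rid in text]
        matrix.append({"requirement": rid, "addressed_in": hits, "compliant": bool(hits)})
    return matrix

rfp = "R-1 provide a staffing plan. R-2 describe security controls."
sections = {"Staffing": "Per R-1 we staff 12 engineers.", "Intro": "Overview."}
for row in build_compliance_matrix(rfp, sections):
    print(row)
```

As the conversation stresses, the matrix is assistive: it points the reviewer at likely gaps (here, `R-2`), and the human verifies, the same way the radiologist checks the flagged spot.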
647
00:36:24,079 --> 00:36:26,119
Speaker 2: Yeah, you didn't miss any you didn't miss any.

648
00:36:26,480 --> 00:36:30,159
Speaker 3: So we again try to generate a compliance matrix for

649
00:36:30,199 --> 00:36:33,559
you and say, look, these are the things that were

650
00:36:33,599 --> 00:36:36,960
asked in the RFP. But again, this is a very

651
00:36:37,000 --> 00:36:41,320
important part you have to go and validate that yourself,

652
00:36:41,360 --> 00:36:45,280
because language models can be inaccurate at times. So it's

653
00:36:45,320 --> 00:36:48,719
very much an assistive technology, and we are not

654
00:36:48,800 --> 00:36:52,760
quite getting to autonomous anytime soon, that press a

655
00:36:52,800 --> 00:36:55,719
button and the proposal comes out ready to submit.

656
00:36:55,400 --> 00:36:57,159
Speaker 2: But it also occurs to me that again this is

657
00:36:57,199 --> 00:36:59,320
something it's good at that it could lead you to

658
00:37:00,119 --> 00:37:04,320
observe responses or a lack of response to a requirement, the

659
00:37:04,360 --> 00:37:09,000
same way that a generative AI image recognizer may

660
00:37:09,159 --> 00:37:12,119
point a radiologist to a particular spot on a picture

661
00:37:12,159 --> 00:37:15,239
and say this looks anomalous yep, and encourage them to

662
00:37:15,239 --> 00:37:18,800
look in the right corners. Given all the guidance you're

663
00:37:18,800 --> 00:37:22,280
providing here, which I really appreciate, right that the flight

664
00:37:22,360 --> 00:37:26,840
plan approach, the prompt generation, to make sure it's accurate

665
00:37:26,880 --> 00:37:30,519
and focus on the right material with the references. Does

666
00:37:30,559 --> 00:37:33,599
the language model actually matter? Like you talked about open

667
00:37:33,679 --> 00:37:37,559
AI last year. Now there have been many more models released.

668
00:37:37,599 --> 00:37:40,199
I mean, Claude was always around, but DeepSeek has appeared,

669
00:37:40,280 --> 00:37:44,039
like, do any of those make a difference given this

670
00:37:44,400 --> 00:37:45,559
strict set of guidance.

671
00:37:46,280 --> 00:37:51,960
Speaker 3: That's a great question, So let me answer that using

672
00:37:52,639 --> 00:37:56,199
two important points. Right? It turns out that these RFPs

673
00:37:56,280 --> 00:37:59,599
can also have sensitive information, and you need to have

674
00:38:00,760 --> 00:38:05,199
things like CMMC 2 standards where you are explicitly telling

675
00:38:05,239 --> 00:38:09,119
the government that this information is not just public RFP.

676
00:38:09,280 --> 00:38:13,880
So think about you're going after some RFP that talks

677
00:38:13,880 --> 00:38:17,159
about some kind of weapons system, and even the

678
00:38:17,280 --> 00:38:22,199
RFPs can have what is known as controlled unclassified information.

679
00:38:22,920 --> 00:38:28,280
So in order to support and show that you're handling

680
00:38:28,320 --> 00:38:34,639
that information correctly, you need to show to the auditors

681
00:38:35,599 --> 00:38:40,079
where your packets have traversed in that system. And then

682
00:38:40,199 --> 00:38:44,880
ultimately, where did you send this data? And let's

683
00:38:44,880 --> 00:38:47,320
say you sent it to an external system like a language model.

684
00:38:47,840 --> 00:38:52,199
Did that language model itself have a security classification and

685
00:38:52,239 --> 00:38:55,559
it is giving you guarantees about not logging that data

686
00:38:55,599 --> 00:38:57,719
beyond the metadata and things like that.

687
00:38:57,559 --> 00:39:01,039
Speaker 2: Right, yeah, and the sovereignty: where did that data actually go?

688
00:39:01,559 --> 00:39:01,920
Speaker 3: Yes?

689
00:39:02,320 --> 00:39:04,400
Speaker 2: Now, I mean this makes a strong case for running

690
00:39:04,360 --> 00:39:05,239
as a local model.

691
00:39:05,800 --> 00:39:11,079
Speaker 3: Well, that's true, but you know, at the same time,

692
00:39:11,119 --> 00:39:13,280
these RFPs are very complex, so we want the best

693
00:39:13,320 --> 00:39:14,840
comprehension that is available.

694
00:39:15,239 --> 00:39:19,039
Speaker 2: You want a trillion parameters, you want the trillions.

695
00:39:18,400 --> 00:39:21,639
Speaker 3: I need a trillion parameters, right, So what we do

696
00:39:21,719 --> 00:39:26,159
is, platforms like Azure OpenAI give you that guarantee

697
00:39:26,199 --> 00:39:29,320
and you can actually show to people how your network,

698
00:39:30,760 --> 00:39:33,480
how your packets, all the way from the incoming

699
00:39:33,599 --> 00:39:38,199
RFP to the response generation, traversed a certain network

700
00:39:38,320 --> 00:39:42,679
path that the auditors can say, yep, we certify, we're

701
00:39:42,719 --> 00:39:45,119
okay with that, that these packets are fine. So that's one.

702
00:39:45,159 --> 00:39:47,679
So it's not just a matter of hey, let me

703
00:39:47,800 --> 00:39:51,519
just go send this data packet to somewhere else unless

704
00:39:51,519 --> 00:39:55,000
they have a classification. So that's one important consideration

705
00:39:55,079 --> 00:39:55,400
for us.

706
00:39:55,440 --> 00:39:58,320
Speaker 2: So that just narrows your choices of models based on

707
00:39:58,519 --> 00:40:00,920
whether or not they're going to clear CMMC 2.

708
00:40:01,119 --> 00:40:05,000
Speaker 3: Yes, that's one. And secondly, Richard, the other thing that

709
00:40:05,320 --> 00:40:10,320
we've realized is that these models have a personality. So

710
00:40:10,679 --> 00:40:13,760
just because they speak English doesn't mean you can just swap.

711
00:40:13,440 --> 00:40:17,440
Speaker 2: How anthropomorphic of you, Vishwas. Goodness. Yeah, you've pushed

712
00:40:17,199 --> 00:40:19,599
Speaker 1: Richard's answer promorphic button there.

713
00:40:19,760 --> 00:40:23,719
Speaker 3: Yes, so these models have a personality, so to say.

714
00:40:24,239 --> 00:40:27,239
And so I find this interesting when people say, hey,

715
00:40:27,280 --> 00:40:29,480
these models are all English-based, so you can

716
00:40:29,559 --> 00:40:32,239
just switch one and see which one works. When you

717
00:40:32,280 --> 00:40:36,000
are generating hundreds and hundreds of pages of prompts. As

718
00:40:36,079 --> 00:40:39,039
we do, it is really important for us to understand

719
00:40:39,079 --> 00:40:43,079
the personality of the model, and we can change the model,

720
00:40:43,119 --> 00:40:44,880
and in fact, we change the models all the time.

721
00:40:44,880 --> 00:40:48,239
When we went from four to four turbo and so on,

722
00:40:48,320 --> 00:40:51,280
and now we're using o1 and o3. Every time

723
00:40:51,280 --> 00:40:53,480
we change a model, we have to do extensive testing

724
00:40:53,920 --> 00:40:56,639
to make sure our prompting layer works well. But at

725
00:40:56,719 --> 00:40:59,920
least these models are part of a certain family,

726
00:41:00,159 --> 00:41:01,800
versus just switching to a different model.

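The re-testing Vishwas describes whenever the model changes could be sketched as a tiny regression harness over the prompting layer. This is an illustrative sketch, not the actual product code; `fake_new_model` and the check functions are invented stand-ins for a real SDK call and real acceptance criteria.

```python
# Illustrative sketch: a tiny regression harness for re-validating a
# prompting layer after swapping the underlying model.

def run_prompt_regression(call_model, cases):
    """Run each prompt through the model and apply its checks.

    cases: list of (prompt, [check functions]); each check takes the
    model output and returns True if the output is acceptable.
    Returns the list of prompts whose output failed any check.
    """
    failures = []
    for prompt, checks in cases:
        output = call_model(prompt)
        if not all(check(output) for check in checks):
            failures.append(prompt)
    return failures

# A fake "new model" that ignores length instructions, to show a failure.
def fake_new_model(prompt):
    return "Yes. " + "filler " * 50

cases = [
    ("Summarize our past performance in under 20 words.",
     [lambda out: len(out.split()) <= 20]),
    ("Answer yes or no: do we hold the certification?",
     [lambda out: "yes" in out.lower() or "no" in out.lower()]),
]

failed = run_prompt_regression(fake_new_model, cases)
print(failed)  # the length-limited prompt fails under the new model
```

In practice each case would encode one section's prompt plus the structural and factual checks the team cares about, so a model swap surfaces personality differences as concrete failures rather than vibes.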
727
00:41:01,519 --> 00:41:03,760
Speaker 2: To be clear, you're only talking about OpenAI models

728
00:41:03,760 --> 00:41:04,199
there so far.

729
00:41:04,280 --> 00:41:09,159
Speaker 3: Yes, we're talking about OpenAI models, so that's the other part.

730
00:41:10,679 --> 00:41:15,159
So those two important considerations have gone into deciding

731
00:41:15,280 --> 00:41:18,960
which models we go against. We're thankful to OpenAI

732
00:41:19,119 --> 00:41:21,960
and Azure OpenAI that their models have generally kept up.

733
00:41:22,360 --> 00:41:25,239
I mean, they're the ones who came

734
00:41:25,239 --> 00:41:27,679
out with the reasoning models. Of course, reasoning models are

735
00:41:27,679 --> 00:41:30,760
available on other platforms as well now, but they've kept

736
00:41:30,920 --> 00:41:35,119
up with the advancements that are happening. So we're pretty

737
00:41:35,119 --> 00:41:39,440
happy with where we are. But we constantly run tests

738
00:41:40,159 --> 00:41:43,039
to see how other models would do.

739
00:41:43,400 --> 00:41:46,920
Speaker 2: Now and you mentioned already the difference between using a

740
00:41:46,960 --> 00:41:49,320
model that doesn't have a reasoning feature, and I'm doing

741
00:41:49,360 --> 00:41:52,760
air quotes yep, because I read the Anthropic paper where

742
00:41:52,760 --> 00:41:55,840
they talked about the fact that the whole reasoning behavior

743
00:41:56,400 --> 00:42:00,159
is a post hoc behavior: the result to

744
00:42:00,239 --> 00:42:04,119
the prompt is generated and then analyzed after the fact by

745
00:42:04,199 --> 00:42:08,960
the tool to generate the quote unquote reasoning. Right.

746
00:42:09,039 --> 00:42:14,440
Speaker 3: So the idea here is that I'm trying to generate

747
00:42:14,440 --> 00:42:17,920
a response to a certain section. Sure, and either I

748
00:42:17,960 --> 00:42:21,119
can just... in a non-reasoning model, I would just say, hey,

749
00:42:21,159 --> 00:42:24,000
generate an answer, and we'll just go through a certain

750
00:42:24,119 --> 00:42:29,280
chain and just start generating a response immediately. But the

751
00:42:29,320 --> 00:42:33,679
reasoning model, it just tries to develop a plan and say, Okay,

752
00:42:34,280 --> 00:42:37,280
you're asking to generate this response. I could use this

753
00:42:37,440 --> 00:42:41,480
example versus this example. Let's try with this example, go

754
00:42:41,559 --> 00:42:44,599
halfway through and realize that there's not enough meat in

755
00:42:44,639 --> 00:42:47,079
this. Let's just go back and switch the example. That's

756
00:42:47,119 --> 00:42:48,480
what reasoning is all about, right.

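The plan-try-backtrack behavior Vishwas describes can be illustrated with the model reduced to plain functions. Everything here (the function names, the "enough meat" check, the example records) is invented for illustration; real reasoning models do this internally over tokens, not Python objects.

```python
# Illustrative sketch of plan-and-backtrack generation: try one
# supporting example, abandon it if it runs thin, switch to another.

def draft_with_example(section, example):
    """Pretend to write a response using a supporting example."""
    return f"{section}: as shown by {example['name']}..."

def has_enough_meat(example):
    """Crude sufficiency check: does the example carry enough detail?"""
    return len(example["details"]) >= 3

def generate_with_backtracking(section, candidate_examples):
    """Try examples in order; drop one mid-draft if it lacks substance."""
    for example in candidate_examples:
        draft = draft_with_example(section, example)
        if has_enough_meat(example):
            return draft
    return f"{section}: (no sufficiently detailed example found)"

examples = [
    {"name": "Project A", "details": ["dates"]},                   # too thin
    {"name": "Project B", "details": ["dates", "scope", "KPIs"]},  # enough
]
print(generate_with_backtracking("Past performance", examples))
# -> Past performance: as shown by Project B...
```

The point of the toy loop is the control flow: the draft for Project A is started and then discarded, which is exactly the mid-course correction a non-reasoning model never attempts.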
757
00:42:48,519 --> 00:42:51,079
Speaker 2: Yeah, that's what it implies. But what the Anthropic paper

758
00:42:51,119 --> 00:42:53,119
says is that they're actually doing that after the fact.

759
00:42:53,280 --> 00:42:55,679
They already have the statement they want to say, and

760
00:42:55,719 --> 00:42:59,159
then they take their statement apart and fill in the

761
00:42:59,159 --> 00:43:01,639
pieces to show how they quote reasoned it.

762
00:43:01,639 --> 00:43:03,760
Speaker 1: Yes, more like justification than reasoning.

763
00:43:04,239 --> 00:43:07,719
Speaker 2: You showed me the equation. I knew the answer, and

764
00:43:07,760 --> 00:43:11,400
now I write out quote my work even though I've

765
00:43:11,400 --> 00:43:13,320
already got the answer. I'm just trying to fill it

766
00:43:13,320 --> 00:43:14,679
in to make you feel better because you asked me

767
00:43:14,679 --> 00:43:15,280
to show your work.

768
00:43:15,480 --> 00:43:15,679
Speaker 1: Yeah.

769
00:43:15,719 --> 00:43:18,760
Speaker 3: No, that's the anthropic paper, no question about it. But

770
00:43:19,599 --> 00:43:24,159
we are adding on top of that reasoning layer. We say, hey,

771
00:43:24,400 --> 00:43:26,920
which past performance do you want to use? How should

772
00:43:26,920 --> 00:43:29,199
you be writing about it? Things like that.

773
00:43:29,599 --> 00:43:31,840
Speaker 2: Yeah, and I'm not going to say anything bad about

774
00:43:31,840 --> 00:43:34,280
the fact that it goes and pulls references, because you

775
00:43:34,400 --> 00:43:37,039
need those. That's all the quality stuff right there.

776
00:43:37,800 --> 00:43:41,599
Speaker 1: Yep, that's it. Vishwas, last time we talked, we got on

777
00:43:41,639 --> 00:43:45,880
the subject of how you think Gen AI and AI

778
00:43:45,960 --> 00:43:49,599
in general is going to affect the lives of average

779
00:43:49,639 --> 00:43:54,079
developers like us. Have your thoughts on that changed in

780
00:43:54,119 --> 00:43:54,760
the last year.

781
00:43:55,920 --> 00:44:02,480
Speaker 3: My thoughts have not changed. I think developers will

782
00:44:02,519 --> 00:44:05,519
greatly benefit from tools like that. It is not a panacea.

783
00:44:05,639 --> 00:44:07,880
You can't just say hey, I'm going to give you

784
00:44:07,920 --> 00:44:13,039
these things and go code me something. I think it

785
00:44:13,119 --> 00:44:16,960
puts developers who have been thoughtful, who have worked on

786
00:44:17,000 --> 00:44:21,159
the craft, who understand the edge cases as well, who

787
00:44:21,199 --> 00:44:25,000
are able to critically look at the code that the

788
00:44:25,599 --> 00:44:29,679
LLMs are generating. They're in a very good position because

789
00:44:29,679 --> 00:44:32,400
they can really get that acceleration at the same time

790
00:44:32,440 --> 00:44:36,119
they're critically analyzing what's being generated for them. I think

791
00:44:36,159 --> 00:44:39,000
it's a real challenge for people who are beginning because

792
00:44:39,719 --> 00:44:42,199
it can very superficially tell you that it is working,

793
00:44:42,599 --> 00:44:44,880
but it may not actually be working exactly the way

794
00:44:44,880 --> 00:44:45,760
you want it to, or.

795
00:44:47,119 --> 00:44:50,920
Speaker 1: The AI might give you a solution which is more complex,

796
00:44:51,000 --> 00:44:53,079
and you're building and rolling your own thing when there's

797
00:44:53,119 --> 00:44:55,880
already something available that does it, and it's not going

798
00:44:55,960 --> 00:44:58,639
to tell you that. And I've had this experience and

799
00:44:58,679 --> 00:45:00,880
you go back and you say, hey, can I just

800
00:45:01,000 --> 00:45:04,639
use X? And it says, oh, yes, absolutely.

801
00:45:04,159 --> 00:45:04,679
Speaker 3: Yes, Yeah.

802
00:45:04,719 --> 00:45:07,320
Speaker 1: Well, why didn't you tell me that before, you idiot? Right?

803
00:45:07,440 --> 00:45:07,719
Speaker 2: Yeah?

804
00:45:07,760 --> 00:45:11,559
Speaker 1: So to your point, somebody who doesn't know about you know,

805
00:45:11,719 --> 00:45:14,920
tool X or tool Y isn't going to find out

806
00:45:15,000 --> 00:45:18,000
that it gave you the answer that you wanted, which

807
00:45:18,039 --> 00:45:21,639
is how do I build this? You

808
00:45:21,639 --> 00:45:25,079
didn't say, how do I solve this problem? And if

809
00:45:25,119 --> 00:45:27,679
you did that, you might have gotten a different answer.

810
00:45:27,760 --> 00:45:30,719
Speaker 3: Well, that's that's very important. It's also important to do

811
00:45:30,800 --> 00:45:34,440
it incrementally and iteratively so that, yeah, you don't just

812
00:45:34,480 --> 00:45:36,360
get two hundred lines of code that you have no idea about,

813
00:45:36,360 --> 00:45:38,199
and you run a test and you think it's fine.

814
00:45:38,719 --> 00:45:42,159
But if you do it incrementally, all of the basics

815
00:45:42,159 --> 00:45:45,559
of getting this right. Can I refactor this? Can

816
00:45:45,639 --> 00:45:49,639
this piece be optimized? Using this? Yeah,

817
00:45:49,719 --> 00:45:53,599
it's accelerating that immensely, and so.

818
00:45:53,719 --> 00:45:56,400
Speaker 2: You know, it's funny you're describing exactly what you say

819
00:45:56,480 --> 00:46:02,079
pWin does for RFPs. Experienced RFP response

820
00:46:02,119 --> 00:46:05,480
builders can use this tool to accelerate the behavior. But

821
00:46:05,920 --> 00:46:10,360
it is an interaction. The tool's pulling resources from past

822
00:46:10,480 --> 00:46:16,800
work and suggesting options that then you could iterate on incrementally,

823
00:46:16,960 --> 00:46:20,000
piece by piece, so that you know, when we talked

824
00:46:20,000 --> 00:46:23,360
a year ago this was about making RFPs faster. Now

825
00:46:23,400 --> 00:46:27,599
you're really talking about higher quality and better tuned to

826
00:46:27,760 --> 00:46:29,280
the demands of the new RFP.

827
00:46:29,920 --> 00:46:33,079
Speaker 3: Definitely, definitely, and the quality. We are reaching a point

828
00:46:33,079 --> 00:46:35,519
to where people say, I can spot ChatGPT

829
00:46:35,840 --> 00:46:38,360
generated text from a mile away, right? People say that. I

830
00:46:38,360 --> 00:46:39,440
don't know if there's.

831
00:46:39,119 --> 00:46:42,679
Speaker 2: Well AI slop is now in the vernacular for a reason.

832
00:46:42,760 --> 00:46:45,920
There's a lot of bad generative AI slop out there.

833
00:46:46,039 --> 00:46:48,840
Speaker 1: I can definitely tell when images are AI created because

834
00:46:48,840 --> 00:46:53,039
they're so slick, you know, and people have made.

835
00:46:52,880 --> 00:46:54,039
Speaker 2: And have eleven fingers.

836
00:46:54,559 --> 00:46:59,119
Speaker 1: Yeah, people have made, I don't know, a hobby or

837
00:46:59,159 --> 00:47:04,599
maybe a career out of generating. You know, the most peaceful,

838
00:47:04,719 --> 00:47:10,280
sublime picturesque spot with a house by a brook and

839
00:47:10,400 --> 00:47:13,079
you know this and that, and they just post it

840
00:47:13,119 --> 00:47:19,000
on Facebook and say, a beautiful place, right? With no context,

841
00:47:19,039 --> 00:47:21,760
with no you know, people are like, wow, where is that? Well,

842
00:47:21,760 --> 00:47:25,920
it's amazing, right, Yeah, it just doesn't exist. And I

843
00:47:25,920 --> 00:47:27,519
could spot those pictures a mile away.

844
00:47:27,719 --> 00:47:30,360
Speaker 2: Let me tell you, as a conference organizer, I know

845
00:47:30,400 --> 00:47:33,519
when you use ChatGPT to write your abstract. Abstract

846
00:47:34,119 --> 00:47:37,320
after abstract, because they are all the same.

847
00:47:37,480 --> 00:47:37,719
Speaker 3: Yeah.

848
00:47:37,760 --> 00:47:41,679
Speaker 2: I got a thousand abstracts in and four hundred of

849
00:47:41,760 --> 00:47:44,960
them start with in a world. Yeah right, like come on.

850
00:47:46,320 --> 00:47:51,320
Speaker 3: Yeah. And because we're so focused on the quality

851
00:47:51,360 --> 00:47:58,360
of writing, we explicitly look for these things where language

852
00:47:58,400 --> 00:48:03,639
models are trying to quote unquote write the perfect text

853
00:48:03,679 --> 00:48:05,719
because you know, you've given them ten things to write

854
00:48:05,760 --> 00:48:09,599
and they find the best way possible to pack that

855
00:48:09,679 --> 00:48:12,679
information and sort of present it. The problem is it

856
00:48:12,760 --> 00:48:15,599
is very onerous on a reader when you're packing in

857
00:48:15,679 --> 00:48:20,719
so much information right, and then overuse of cliches, making

858
00:48:20,760 --> 00:48:24,159
statements without sort of providing background information. These are all

859
00:48:24,239 --> 00:48:32,320
common signs of LLM-generated writing, and we take explicit

860
00:48:32,400 --> 00:48:36,159
steps in our engine to either correct this mistake or

861
00:48:36,239 --> 00:48:40,000
constantly remind the model to slow it down to make

862
00:48:40,039 --> 00:48:43,159
it readable, to improve the quality. These are all things

863
00:48:43,320 --> 00:48:45,800
that are works in progress. A lot more work to

864
00:48:45,840 --> 00:48:48,159
be done there, but those are the kinds of things

865
00:48:48,199 --> 00:48:51,880
that we're thinking about. Goes back to all of these

866
00:48:51,920 --> 00:48:57,079
things are needed to have a real impact on solving

867
00:48:57,079 --> 00:49:00,199
a business problem. When you think about this, it's not

868
00:49:00,280 --> 00:49:02,000
just a matter of taking this and saying, Okay, I'm

869
00:49:02,039 --> 00:49:05,320
going to go start generating financial reports tomorrow. There'll be

870
00:49:05,360 --> 00:49:07,280
a lot of work that needs to go in.

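A crude illustration of the kind of checks Vishwas mentions, looking for common signs of LLM-generated writing such as cliché openers and overpacked sentences. The phrase list, threshold, and function names are invented for this sketch; a production engine would be far more sophisticated than string matching.

```python
# Illustrative heuristic (not the actual engine) for flagging two common
# tells of LLM writing: cliche openers and information-packed sentences.

CLICHE_OPENERS = ("in a world", "in today's fast-paced", "delve into")

def flag_llm_tells(text, max_words_per_sentence=35):
    """Return a list of human-readable flags raised by simple checks."""
    flags = []
    lowered = text.lower().strip()
    if lowered.startswith(CLICHE_OPENERS):  # startswith accepts a tuple
        flags.append("cliche opener")
    for sentence in text.split("."):
        if len(sentence.split()) > max_words_per_sentence:
            flags.append("overpacked sentence")
            break
    return flags

print(flag_llm_tells("In a world of rapid change, our solution delivers."))
# -> ['cliche opener']
```

As the conversation suggests, checks like these can run either as a post-generation filter or as reminders folded back into the prompt that tells the model to slow down.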
871
00:49:07,440 --> 00:49:09,920
Speaker 1: As a programmer, you kind of think of yourself as

872
00:49:09,960 --> 00:49:13,400
a senior programmer slash manager, somebody who can write code

873
00:49:14,000 --> 00:49:17,880
but chooses to offload the boring stuff to a junior programmer.

874
00:49:18,679 --> 00:49:23,400
And it's all about the prompts, you know, It's all

875
00:49:23,400 --> 00:49:26,119
about what you ask it to do. And I've had

876
00:49:26,159 --> 00:49:28,719
the experience with ChatGPT anyway, where I said, you know,

877
00:49:28,719 --> 00:49:31,119
I'm trying to do this, and it might be about programming,

878
00:49:31,159 --> 00:49:34,360
it might not, and it will say, oh, well there's

879
00:49:34,480 --> 00:49:37,320
currently no way to do this and blah blah blah,

880
00:49:37,360 --> 00:49:41,679
this is not possible, and or maybe it gives you

881
00:49:41,719 --> 00:49:44,519
some suggestions that are just stupid, like is it turned on?

882
00:49:44,760 --> 00:49:48,960
You know, that kind of stuff, And then you can say, well,

883
00:49:49,199 --> 00:49:51,400
can you check the internet to see if

884
00:49:51,440 --> 00:49:54,199
anyone else is having this problem? And that is the

885
00:49:54,239 --> 00:49:58,119
most powerful prompt you know. If you're not satisfied with

886
00:49:58,119 --> 00:50:00,880
the answer it gives you, just say, hey, google

887
00:50:00,960 --> 00:50:04,119
it or whatever you know, and it will come back

888
00:50:04,199 --> 00:50:08,039
sometimes with, yes, somebody's having this problem.

889
00:50:08,079 --> 00:50:10,039
It seems like this is a common problem and this

890
00:50:10,159 --> 00:50:14,239
is what these people did to overcome it, right? Absolutely.

891
00:50:14,519 --> 00:50:16,639
So it's all about the prompts and it's all

892
00:50:16,639 --> 00:50:20,559
about thinking in that sort of manager senior programmer to

893
00:50:20,639 --> 00:50:22,599
junior programmer mindset.

894
00:50:22,719 --> 00:50:26,159
Speaker 2: Yep. But back to, you know, the pWin tool.

895
00:50:26,400 --> 00:50:30,840
You're mostly writing prompts with the tool to then run

896
00:50:30,920 --> 00:50:33,880
them to get results for the operator, that's right, that's right,

897
00:50:33,880 --> 00:50:37,199
And because there is a repetitive piece to this of

898
00:50:37,719 --> 00:50:42,000
setting the restrictions and making sure the context is correct.

899
00:50:42,079 --> 00:50:44,280
Like I got to imagine there's a couple of big

900
00:50:44,360 --> 00:50:48,119
paragraphs at the top of every prompt just to make

901
00:50:48,159 --> 00:50:51,960
sure it stays in the constraint, and then specifics for

902
00:50:52,480 --> 00:50:55,199
the section you're in in the RFP.

903
00:50:55,320 --> 00:50:58,119
Speaker 3: Yep, there's some system prompts that go in every

904
00:50:58,199 --> 00:51:00,760
time to sort of keep.

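The prompt shape Richard guesses at, a fixed preamble restating the constraints plus section-specific context, might look like this in chat-message form. The preamble text and field names are invented for illustration, not taken from the actual product.

```python
# Sketch of assembling one RFP-section prompt: a constant system
# preamble that restates the constraints every time, then the
# section-specific context, then the actual request.

SYSTEM_PREAMBLE = (
    "You are drafting RFP responses. Stay within the provided context. "
    "Do not invent certifications, dates, or past-performance claims."
)

def build_messages(section_name, section_context, question):
    """Assemble a chat-style message list for one RFP section."""
    return [
        {"role": "system", "content": SYSTEM_PREAMBLE},
        {"role": "system",
         "content": f"Section: {section_name}\n{section_context}"},
        {"role": "user", "content": question},
    ]

msgs = build_messages("Past Performance",
                      "Project B: 2022-2024, cloud migration, on budget.",
                      "Describe relevant past performance.")
print(len(msgs), msgs[0]["role"])  # 3 system
```

Keeping the preamble constant is what makes the behavior repeatable across hundreds of pages of prompts; only the middle message changes per section.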
905
00:51:00,559 --> 00:51:03,920
Speaker 2: And it occurred to me, you know, thinking about this

906
00:51:04,000 --> 00:51:06,519
problem space and thinking about our reactions to AI

907
00:51:06,639 --> 00:51:08,679
slop and, you know, detection techniques and so forth. It's like,

908
00:51:09,000 --> 00:51:12,039
if you think this pattern matching software is awesome, try

909
00:51:12,079 --> 00:51:17,639
a human because humans are really good at finding patterns, especially,

910
00:51:17,679 --> 00:51:20,679
when that's what they see. There's certain patterns to what

911
00:51:20,760 --> 00:51:24,280
a lot of low-quality LLM tools generate that

912
00:51:24,679 --> 00:51:27,800
you immediately perceive. And it's now starting to shift our

913
00:51:27,880 --> 00:51:31,599
minds where it's now repulsive. Yeah, and we're in a

914
00:51:31,599 --> 00:51:34,960
funny place. But there's some good advice in here,

915
00:51:35,000 --> 00:51:37,039
Vishwas, so I really appreciate it. Like a year later,

916
00:51:37,519 --> 00:51:41,400
you're speaking very differently about the product you're making and

917
00:51:41,480 --> 00:51:43,960
the tool and the way to use these tools like

918
00:51:44,480 --> 00:51:47,440
it shows the evolution that's going on here.

919
00:51:47,519 --> 00:51:50,239
Speaker 3: Yeah, no, Richard and Carl, that's certainly true.

920
00:51:50,440 --> 00:51:54,800
Learned a lot in this process. It's awesome technology. There's

921
00:51:54,840 --> 00:51:57,280
no question about it. I think it's the

922
00:51:57,320 --> 00:52:01,079
next supercycle, as Gartner likes to call it. They said,

923
00:52:01,199 --> 00:52:04,519
you know, every tech supercycle is twenty years. The last

924
00:52:04,519 --> 00:52:07,239
one started in two thousand, with digital and the Internet,

925
00:52:07,719 --> 00:52:10,039
and we're at the intersection of

926
00:52:10,079 --> 00:52:12,360
the previous one and launching into the Gen AI one.

927
00:52:12,840 --> 00:52:15,239
Speaker 2: Yeah. Yeah, this has been I felt like this is

928
00:52:15,239 --> 00:52:16,760
a fundamental shift in UX.

929
00:52:16,960 --> 00:52:18,320
Speaker 3: It's a fundamental shift.

930
00:52:18,480 --> 00:52:20,800
Speaker 1: The way we interact with our equipment is about to change.

931
00:52:20,840 --> 00:52:23,320
But I think this generation isn't going to be twenty years.

932
00:52:23,320 --> 00:52:25,880
It's probably gonna be more like five or six or ten.

933
00:52:26,239 --> 00:52:28,519
Speaker 3: True, that's right.

934
00:52:28,559 --> 00:52:30,280
Speaker 1: Much has happened in the last.

935
00:52:30,239 --> 00:52:32,159
Speaker 2: A lot happened in the first few years of the

936
00:52:32,159 --> 00:52:34,559
twenty first century as Internet came out of the dot

937
00:52:34,599 --> 00:52:37,360
com boom, and you know, we shifted to mobile.

938
00:52:37,400 --> 00:52:39,719
Speaker 1: It's not like we've been going slow so far, no,

939
00:52:39,840 --> 00:52:41,480
but it does seem to be accelerating.

940
00:52:42,760 --> 00:52:45,519
Speaker 3: I think, Yeah, I know, it seems to be accelerating.

941
00:52:45,559 --> 00:52:48,480
But also I would say that there is some plateauing,

942
00:52:48,719 --> 00:52:51,599
which you may see as counterintuitive. When I say

943
00:52:51,599 --> 00:52:56,159
plateauing: GPT-5 was not released. They were going

944
00:52:56,199 --> 00:52:59,440
to release GPT-5, right? Just a bigger and bigger

945
00:52:59,480 --> 00:53:02,599
model with more data is not leading to better results.

946
00:53:02,840 --> 00:53:06,159
Speaker 2: Yeah, the last few years we've pretty much consumed the

947
00:53:06,199 --> 00:53:10,119
whole Internet into tokenization. There isn't more Internet to grab,

948
00:53:10,559 --> 00:53:13,920
so that's not an exponential function. Like, we're kind

949
00:53:13,920 --> 00:53:17,639
of addicted to the exponential function, but it only existed

950
00:53:17,679 --> 00:53:21,239
because a small group of people worked extremely hard to

951
00:53:21,280 --> 00:53:25,480
increase the density of silicon substrates. That's the only thing

952
00:53:26,280 --> 00:53:29,400
they did. You know, most everything else doesn't work that way,

953
00:53:29,519 --> 00:53:32,599
and this doesn't because there's only so much data. And

954
00:53:32,639 --> 00:53:36,639
it's very clear that bringing in more data generated by

955
00:53:36,719 --> 00:53:39,519
AI for this is like taking a photocopy of a photocopy.

956
00:53:39,960 --> 00:53:42,320
It's degenerative, it's not useful data.

957
00:53:43,320 --> 00:53:44,960
Speaker 1: Well, I can't wait to talk to you next year

958
00:53:44,960 --> 00:53:47,760
to see what's new and see what else you've learned. Yeah,

959
00:53:47,800 --> 00:53:51,199
it's fantastic. Thank you, Vishwas. It's always great talking to you.

960
00:53:51,360 --> 00:53:53,880
Speaker 3: Always a pleasure to be on this show. Thank you

961
00:53:53,960 --> 00:53:55,159
for inviting me again.

962
00:53:55,679 --> 00:53:57,679
Speaker 1: Well, thank you and thanks for listening, and we'll talk

963
00:53:57,679 --> 00:54:20,960
to you next time on dot net rocks. Dot net

964
00:54:21,079 --> 00:54:24,000
Rocks is brought to you by Franklins.Net and produced

965
00:54:24,000 --> 00:54:27,840
by Pwop Studios, a full-service audio, video and post

966
00:54:27,840 --> 00:54:32,000
production facility located physically in New London, Connecticut, and of

967
00:54:32,039 --> 00:54:36,519
course in the cloud online at pwop dot com.

968
00:54:36,719 --> 00:54:38,880
Speaker 4: Visit our website at d O T N E t

969
00:54:39,079 --> 00:54:43,119
R O c k S dot com for RSS feeds, downloads,

970
00:54:43,280 --> 00:54:46,960
mobile apps, comments, and access to the full archives going

971
00:54:47,000 --> 00:54:50,400
back to show number one, recorded in September two thousand

972
00:54:50,400 --> 00:54:50,679
and two.

973
00:54:51,280 --> 00:54:53,639
Speaker 1: And make sure you check out our sponsors. They keep

974
00:54:53,719 --> 00:54:56,880
us in business. Now go write some code. See you

975
00:54:56,920 --> 00:55:03,920
next time.


