1
00:00:00,320 --> 00:00:04,679
Hey, Carl and Richard here with
your twenty twenty four NDC schedule. We'll

2
00:00:04,719 --> 00:00:09,160
be at as many NDC conferences as
possible this year, and you should consider

3
00:00:09,199 --> 00:00:14,279
attending no matter what. The Copenhagen
Developers Festival happens August twenty sixth through the

4
00:00:14,359 --> 00:00:21,519
thirtieth. Tickets at Cphdevfest dot com. NDC Porto is happening October fourteenth through

5
00:00:21,519 --> 00:00:27,399
the eighteenth. The early discount ends
June fourteenth. Tickets at Ndcporto dot com.

6
00:00:27,440 --> 00:00:43,320
We'll see you there, we hope. Hey, guess what? It's dot

7
00:00:43,399 --> 00:00:47,960
net rocks. I'm Carl Franklin and
that's Richard Campbell. Vishwas Lele is here with us

8
00:00:48,000 --> 00:00:51,000
and we're excited to talk to him. How are you doing, my friend,

9
00:00:51,359 --> 00:00:54,159
seen any otters this year? Oh yeah, well, the otters

10
00:00:54,159 --> 00:00:57,560
are everywhere right. It's been summer
up on the coast, so it's hard

11
00:00:57,560 --> 00:01:00,920
to be unhappy up here.
But we've got the kayak out, cleaned

12
00:01:00,960 --> 00:01:03,039
them up, did some repairs,
all the things you need to do,

13
00:01:03,079 --> 00:01:04,239
and then you get to go,
you know, meet the otters where they

14
00:01:04,239 --> 00:01:07,159
are. How's that? Do they
taste good? Well, they stink. No,

15
00:01:07,319 --> 00:01:11,400
they don't. They smell and they
smell terrible. But no, don't

16
00:01:11,400 --> 00:01:14,879
eat otter. That's not
good eating. Carnivores in general, not a

17
00:01:14,879 --> 00:01:18,120
good idea. Never a good idea. But no, they've got a stink

18
00:01:18,159 --> 00:01:19,840
all their own, man. Like,
it's not hard to find an otter den

19
00:01:19,920 --> 00:01:25,760
you'll smell it first. Wow.
Yeah, okay, I have no such

20
00:01:25,840 --> 00:01:32,000
things to report. But summer in
your place means barbecue, right? Like the

21
00:01:32,040 --> 00:01:34,760
grills are running. I just bought
two new twelve hundred watt subwoofers for

22
00:01:34,840 --> 00:01:38,680
the band. Well, he's that
kind of guy. You guys have been

23
00:01:38,680 --> 00:01:41,359
playing up a storm these days.
I mean I only see it from your

24
00:01:41,400 --> 00:01:44,920
calendar. I keep seeing every
every Friday, every Saturday, blocked out

25
00:01:44,959 --> 00:01:49,040
through the summer. One of the
guys that saw us at Ocean Beach just

26
00:01:49,120 --> 00:01:55,000
sent me like fifteen videos that he
took of us playing out there. Wow.

27
00:01:55,560 --> 00:01:57,920
I don't know what to do with
them. They're little clips. You know,

28
00:01:57,120 --> 00:02:00,599
you put them on TikTok, dude, that's what the hip kids do.

29
00:02:00,879 --> 00:02:06,359
I guess. Yeah, while it
still exists. Yeah, well,

30
00:02:06,359 --> 00:02:07,840
look, it's not going away. It's just a question of who's going

31
00:02:07,919 --> 00:02:12,080
to own it. Yeah, maybe
that's the reality. All right, well,

32
00:02:12,159 --> 00:02:15,759
I have something really funny and very
cool from our friend Simon Cropp for

33
00:02:15,919 --> 00:02:27,080
Better Know a Framework. Roll the music. Alright, man, what do you

34
00:02:27,120 --> 00:02:30,439
got? I love Simon. Simon's an
Aussie and he comes up with really great

35
00:02:30,479 --> 00:02:36,520
stuff. Every once in a while
he crosses the line into comedy, which

36
00:02:36,919 --> 00:02:43,719
is also always helpful. But he's
got this tool called WaffleGenerator for the

37
00:02:43,719 --> 00:02:49,360
big waffles. Well, in a
way, it produces text which on first glance looks

38
00:02:49,439 --> 00:02:55,000
like real ponderous prose replete with clichés. So this is like, you know, your

39
00:02:55,120 --> 00:03:00,000
lorem ipsum if you just want text
to fill up the screen instead of using

40
00:03:00,360 --> 00:03:05,919
Latin, random Latin? Is this lorem
ipsum in the large language model era?

41
00:03:06,120 --> 00:03:08,319
Is that what this is? Oh? No, yeah, so this is

42
00:03:08,400 --> 00:03:13,120
great. Look at the example content
on the GitHub repo. This stuff,

43
00:03:13,159 --> 00:03:20,560
I might say, the aesthetic of
economico-social disposition. Quote: in this regard the

44
00:03:20,639 --> 00:03:24,439
underlying surrealism of the take home message
should not divert attention from the aesthetic of

45
00:03:24,680 --> 00:03:31,840
economico-social disposition. Humphrey Yokomoto in
the Journal of the Total Entative Item.

46
00:03:32,360 --> 00:03:38,719
And then there's some more. On any rational
basis, a particular factor such as the functional

47
00:03:38,719 --> 00:03:44,159
baseline, the analogy of object,
the strategic requirements, or the principle overriding

48
00:03:44,199 --> 00:03:50,840
programming, provides an interesting insight into
the complementary functional derivation. This trend may

49
00:03:50,919 --> 00:03:58,319
dissipate due to the measurable proficiency. Complete
and utter nonsense, but it sounds good.

50
00:04:00,280 --> 00:04:03,759
This is LLM generation, you know, tuned for a purpose. It's

51
00:04:03,800 --> 00:04:09,840
so awesome. So it's hilarious.
Uh yeah, so basically, get this.

52
00:04:09,960 --> 00:04:13,199
He has a Blazor wrapper around it, of course. That's what it's

53
00:04:13,240 --> 00:04:21,000
called: Blazing Waffles. Blazing Waffles.
Oh my god. Oh,

54
00:04:21,120 --> 00:04:26,399
Simon, so awesome, brilliantly awesome. If you need to make text.

55
00:04:26,720 --> 00:04:30,920
Yeah, he's also got an extension
for Bogus to use WaffleGenerator, which

56
00:04:30,000 --> 00:04:34,079
is great. Okay, so good. He must be stopped though. Sometimes

57
00:04:34,439 --> 00:04:41,240
I've got to ask him about this. What? I'm usually the one who says

58
00:04:41,240 --> 00:04:46,560
stuff like that, he must be
stopped. What do you think of that?

59
00:04:46,680 --> 00:04:54,800
Vishwas? Yeah, it's great, it's
good. You know, every

60
00:04:54,839 --> 00:04:58,680
time we get something that's both a
functional tool that we're going to use and

61
00:04:58,839 --> 00:05:01,319
makes us laugh, it's like the
perfect Better Know a Framework. It's a good

62
00:05:01,399 --> 00:05:04,240
day. It's a good day,
every day, no question. All right,

63
00:05:04,279 --> 00:05:08,680
So who's talking to us? Richard
grabbed a comment off show nineteen oh

64
00:05:08,720 --> 00:05:11,720
four, the one we shot back
at Build with Mark Brown. We were

65
00:05:11,720 --> 00:05:15,120
talking about Cosmos DB and its role in
large language models like ChatGPT and

66
00:05:15,240 --> 00:05:18,240
others, you know the role of
AI and Cosmos DB in general, and

67
00:05:18,279 --> 00:05:23,120
our friend Brent vander Mead had this
comment. He said, oh yeah,

68
00:05:23,160 --> 00:05:29,079
thanks for the post traumatic event.
At minute thirty two, Carl talks about

69
00:05:29,120 --> 00:05:31,560
trying to put together training on artificial
intelligence and being patient for the tools coming

70
00:05:31,560 --> 00:05:33,639
out. Oh yeah, And Carl
says, you know, this stuff is

71
00:05:33,680 --> 00:05:36,759
changing so fast, this is not
the time. We just got to wait,

72
00:05:36,800 --> 00:05:40,319
wait it out and see what the
major tools shake out like. And

73
00:05:40,399 --> 00:05:44,240
I said, it only gets easier. Carl said, you could imagine going

74
00:05:44,279 --> 00:05:46,120
into training class like right before Windows
came out, and someone teaching you to

75
00:05:46,120 --> 00:05:50,079
write your own windowing system. YEP, exactly what I was thinking. And

76
00:05:50,120 --> 00:05:54,920
Brent, that was my point.
That's why I said we shouldn't be doing

77
00:05:55,000 --> 00:05:58,160
this, just hang on. Yeah, and Brent comes back with I immediately

78
00:05:58,199 --> 00:06:01,920
was thrown back into setting up a
three-tier Team Foundation Server on-premises

79
00:06:02,079 --> 00:06:06,920
cluster. If only the rapid changes
were occurring back then, and how people

80
00:06:06,959 --> 00:06:13,519
were willing to suffer setting up Jenkins
pipelines for automation into AWS. Oh man,

81
00:06:13,759 --> 00:06:18,519
am I happy with AKS, Helm, Terraform,
Azure DevOps and GitHub Actions.

82
00:06:18,879 --> 00:06:24,160
Thanks for the great show. I
mean it's a great example. Think

83
00:06:24,199 --> 00:06:29,240
about how many shows we did talking
about continuous integration back in the day,

84
00:06:29,319 --> 00:06:34,560
sort of building your own tool set
every time, this bespoke pipeline, and today it's a

85
00:06:34,639 --> 00:06:39,759
set of services, right? You turn
them on right and you've got it set

86
00:06:39,800 --> 00:06:43,839
up even as IaC, as infrastructure
as code, so you're able to just

87
00:06:44,120 --> 00:06:46,079
really run a script. It's like,
here's my pipeline, let's go. We

88
00:06:46,120 --> 00:06:50,480
set up CI for many people in
many companies back in the day. Yeah,

89
00:06:50,560 --> 00:06:55,639
it was a whole business. You
know, you're right, the folks that

90
00:06:55,680 --> 00:06:58,879
would specialize in just getting that right
for you so you didn't have to figure

91
00:06:58,879 --> 00:07:00,759
it out. I'm pretty sure you've
got a copy of Music to Code By,

92
00:07:00,879 --> 00:07:03,600
but we'll hook you up with something
cool. Give me a ring at Richard

93
00:07:03,639 --> 00:07:06,240
at pwop dot com. Thank you so
much for your comment and a copy of

94
00:07:06,319 --> 00:07:09,399
Music to Code By is on its way to
you, and if you'd like a copy

95
00:07:09,399 --> 00:07:12,000
of Music to Code By, write
a comment on the website at dotnetrocks

96
00:07:12,040 --> 00:07:14,560
dot com or on the facebooks.
We publish every show there, and if

97
00:07:14,560 --> 00:07:15,680
you comment there and I read it on
the show, we'll send you a copy of

98
00:07:15,720 --> 00:07:19,000
Music to Code By. Very good. And
you know you can always follow us on

99
00:07:19,160 --> 00:07:23,720
X, Twitter as I call it.
But we've been there for many years.

100
00:07:23,759 --> 00:07:27,920
But the cool kids are hanging out on Mastodon. I'm at Carl Franklin

101
00:07:27,959 --> 00:07:30,639
at tech Hub dot social, and
I'm Rich Campbell at mastodon

102
00:07:30,680 --> 00:07:34,079
dot social. Send us a toot.
Rooty-toot-toot, rooty-toot-toot, fresh and

103
00:07:34,120 --> 00:07:42,360
fruity. And, uh, weird. The last
time Vishwas Lele was on, we had

104
00:07:42,519 --> 00:07:47,000
such a great conversation and I was
under the impression... no, I said

105
00:07:47,079 --> 00:07:49,360
to him, you were on just
like a month or two ago. It

106
00:07:49,439 --> 00:07:54,319
was nine months ago. Yeah,
yeah, I don't feel nine months older.

107
00:07:54,879 --> 00:07:58,600
But obviously I am. It's been pretty crazy. And his first show was all the

108
00:07:58,639 --> 00:08:01,079
way back in two thousand and seven. I think he's been... Oh yeah,

109
00:08:01,079 --> 00:08:05,000
he's an OG. Pretty sure we owe
him a sub sandwich at this point or something.

110
00:08:05,079 --> 00:08:09,879
Yeah, so his title has changed. He is now co-founder and

111
00:08:11,040 --> 00:08:16,040
CEO at pWin dot ai. He's
also a noted industry speaker and author,

112
00:08:16,240 --> 00:08:22,160
and a Microsoft Regional Director for Washington DC. Welcome back, Vishwas. Thank you

113
00:08:22,240 --> 00:08:26,360
both. Glad to be back. A
new job! Dude, what are you

114
00:08:26,519 --> 00:08:31,279
doing? You've been AIS's CTO for
ten, fifteen years? Yeah, almost sixteen

115
00:08:31,360 --> 00:08:39,799
years. I've worked at AIS for
almost thirty years, So quite a big

116
00:08:39,919 --> 00:08:43,960
change here since then. But it
does seem to me like you're not alone

117
00:08:43,159 --> 00:08:48,240
either. Some really great tech people
I know are jumping into AI startups like

118
00:08:48,360 --> 00:08:52,759
it's the hip thing to do right
now. It's such a massive opportunity.

119
00:08:52,000 --> 00:08:56,159
Yeah, so, Richard, I
was not going to get started on this

120
00:08:56,279 --> 00:09:00,960
idea. I was not envisioning it.
If you asked me like nine months ago,

121
00:09:01,120 --> 00:09:07,840
what was your plan. The plan
was whenever a new technology like Azure,

122
00:09:07,879 --> 00:09:11,840
OpenAI or OpenAI comes along, we've always had this culture of

123
00:09:11,200 --> 00:09:15,879
let's see if we can find a few people to
jump into it, try to understand it,

124
00:09:15,879 --> 00:09:20,039
try to build something so that we
can understand it better and take it

125
00:09:20,080 --> 00:09:22,240
to our customers. I mean,
that's what we're planning to do. And

126
00:09:22,960 --> 00:09:26,879
with Azure OpenAI, we went
through a bunch of use cases and then

127
00:09:26,960 --> 00:09:33,000
said, hey, can we work
on a use case which can help

128
00:09:33,159 --> 00:09:37,159
our internal folks. We have a
pretty sizable proposal team, so can

129
00:09:37,200 --> 00:09:41,120
we build something that can help them? So we built something, and then

130
00:09:41,200 --> 00:09:43,399
we took it out to other people
and one thing led to another and here

131
00:09:43,440 --> 00:09:48,480
we are. So, really good, because you
were making things that people wanted. It's

132
00:09:48,519 --> 00:09:52,279
like this is actually a product and
we should do more of it. Yeah.

133
00:09:52,320 --> 00:09:54,639
So what we did, Richard,
was, first we built an MVP

134
00:09:54,799 --> 00:10:01,240
for our internal teams and got some
feedback from them, some good, some

135
00:10:01,440 --> 00:10:03,799
bad, all of it. And
then we said, okay, let's do

136
00:10:03,919 --> 00:10:11,480
this. Can we before we build
any more functionality into this, can we

137
00:10:11,480 --> 00:10:13,759
go to eight or ten companies?
Ended up going to a dozen or so

138
00:10:13,879 --> 00:10:18,840
companies and said, would you allow
us to generate some content for you and

139
00:10:18,080 --> 00:10:22,120
then we'll talk about this later.
But this is a copilot for proposal

140
00:10:22,159 --> 00:10:26,279
writers. So we went to a
bunch of companies and said, can we

141
00:10:26,320 --> 00:10:30,440
generate the first draft of a proposal
for you? Maybe eight or ten companies.

142
00:10:30,919 --> 00:10:35,559
It's going to be called the Aesthetic
of Economico-Social Disposition. You're going

143
00:10:35,600 --> 00:10:45,120
to love it. So sorry.
That's great. Waffle, waffle, waffle.

144
00:10:46,879 --> 00:10:52,039
So we said, okay,
we took it to eight or ten companies

145
00:10:52,080 --> 00:10:56,279
and we got some feedback. And
what I learned from it, Richard,

146
00:10:56,360 --> 00:10:58,000
to your point, A lot of
people are jumping in, and as they

147
00:10:58,039 --> 00:11:03,039
should. This is definitely a disruptive
technology. I could feel it from

148
00:11:03,399 --> 00:11:07,159
early on. In the SharePoint days,
you remember the two thousand and seven show, getting

149
00:11:07,200 --> 00:11:11,240
into SharePoint, or remember the
cloud show early days. This seems like

150
00:11:11,399 --> 00:11:16,039
much much bigger than all of those
things combined. And there's this idea that

151
00:11:16,320 --> 00:11:22,240
you have technology that can reason on
your behalf. Forget the content generation pieces

152
00:11:22,320 --> 00:11:24,559
right, forget that it can summarize text
for you, but the fact that it

153
00:11:24,559 --> 00:11:28,639
can reason. Given some things,
it can reason and give you an output

154
00:11:28,679 --> 00:11:33,679
back. Think about its applicability
all across what we do in IT.

155
00:11:33,279 --> 00:11:39,360
But we took one use case and
maybe in hindsight, if I think about

156
00:11:39,399 --> 00:11:43,799
it, you can do a lot
with generative AI. But the generative AI

157
00:11:43,919 --> 00:11:48,879
models are hard to sort of control.
They're good, but how do you get them

158
00:11:48,919 --> 00:11:56,879
to focus in and generate a twenty,
thirty, forty page document? And that has an

159
00:11:56,960 --> 00:12:01,360
ROI associated with it, because, you know,
proposals sure are time consuming to build?

160
00:12:01,639 --> 00:12:03,519
Yes, they are time consuming to
build. So I think, in

161
00:12:03,639 --> 00:12:11,440
hindsight, picking one use case which
aligned well with the superpower of generative AI,

162
00:12:11,519 --> 00:12:16,639
which is to generate content, but
then to solve a problem where people

163
00:12:16,639 --> 00:12:22,799
were struggling writing this content and
then generating the first draft. And it

164
00:12:22,879 --> 00:12:26,120
is important to note that it is
a copilot, very much using the term

165
00:12:26,200 --> 00:12:30,600
copilot. Yeah, so you're still
the pilot. It's your fault. It's

166
00:12:30,600 --> 00:12:35,600
still your fault. Yes, we're
just getting you to this first draft.

167
00:12:35,679 --> 00:12:39,960
And later on in the show,
I can talk about as a CTO,

168
00:12:39,000 --> 00:12:43,639
as an architect, I have supported
proposal writing for at least two decades,

169
00:12:45,000 --> 00:12:48,720
right, and I can talk about
all of the trials and tribulations of

170
00:12:48,919 --> 00:12:54,919
supporting that, and then how that
product helps with all of those challenges that

171
00:12:54,960 --> 00:12:58,600
the organizations face. So let us
go off. I mean, I'm all

172
00:12:58,639 --> 00:13:01,840
about the summarization of existing content,
like I get that from a large language

173
00:13:01,840 --> 00:13:07,720
model. But you talked about reasoning. Yes, where is the reasoning needed

174
00:13:07,879 --> 00:13:13,639
and how does the tool do it?
Yeah, so the reasoning is needed because

175
00:13:16,600 --> 00:13:20,399
let's see... okay, I'll
take a step back and sort of describe

176
00:13:20,440 --> 00:13:24,360
the problem and then we can talk
about how we solved it. Right.

177
00:13:24,879 --> 00:13:30,240
So, I remember nine months ago
as you were ending that show. At

178
00:13:30,240 --> 00:13:33,000
that time M three sixty five Copilot
had not come out or it was a

179
00:13:33,080 --> 00:13:37,039
yeah, it was being talked about. I think it was available in a limited

180
00:13:37,120 --> 00:13:41,759
preview or something like that. And
Richard, I remember vividly that you asked

181
00:13:41,799 --> 00:13:48,440
a question that if you're generating the
content based on your past responses, right,

182
00:13:50,080 --> 00:13:54,919
and let's say you've submitted twenty thirty
forty RFP submissions in the past,

183
00:13:56,279 --> 00:14:00,159
you stored them into a SharePoint site, perhaps. Now your M three sixty

184
00:14:00,159 --> 00:14:07,799
five Copilot has now scanned
those documents, and you're now able to

185
00:14:07,879 --> 00:14:09,600
go and say, hey, generate
me an answer to this question. And

186
00:14:11,159 --> 00:14:15,759
you at that point said, why
would I not use M three sixty five

187
00:14:16,080 --> 00:14:20,840
Copilot for that? Right?
Why do you need this special purpose co

188
00:14:20,960 --> 00:14:24,559
pilot to do what M three sixty
five Copilot seems to be doing out of

189
00:14:24,559 --> 00:14:28,759
the box? That was the question
you posed. Yeah, nobody really

190
00:14:28,799 --> 00:14:33,360
knew at that point, right,
And the answer to that question is that

191
00:14:33,559 --> 00:14:39,440
M three sixty five Copilot is a
very good general purpose chatbot. You give it

192
00:14:39,519 --> 00:14:43,559
tons and tons of content, could
be any content in your SharePoint, One

193
00:14:43,720 --> 00:14:46,600
Drive, Teams, what have you.
And now you can use the graph connector

194
00:14:46,600 --> 00:14:50,440
to pull in content from anywhere else
you want to. Pulling data from Salesforce,

195
00:14:50,440 --> 00:14:54,279
you can do that. But the
fact is that it is a general purpose

196
00:14:56,679 --> 00:15:01,039
chat bot where the content is broken
up. And let's say there's some sort

197
00:15:01,039 --> 00:15:05,120
of a chunking strategy. You chunk
it by the page numbers, or you

198
00:15:05,200 --> 00:15:09,120
chunk it by the discussion, or
you chunk it by sections. You chunk

199
00:15:09,159 --> 00:15:13,039
it some way. You store it
in a vector database, and there are

200
00:15:13,080 --> 00:15:16,519
lots and lots of them available.
But then you are asking a question that

201
00:15:16,679 --> 00:15:22,840
Hey, based on the information you
have, can you answer this question right?

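The chunk, embed, store, and retrieve flow described here can be sketched in a few lines. This is a minimal illustration only: the bag-of-words embed() and the in-memory list are stand-ins for a real embedding model and a real vector database, and the sample text is invented, not anything from an actual product.

```python
# Minimal sketch of the chunk -> embed -> store -> retrieve flow described
# above. The bag-of-words "embedding" and the in-memory list are stand-ins
# for a real embedding model and a real vector database.
import math
from collections import Counter

def embed(text):
    # Toy embedding: word counts. A real system would call an embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def chunk(document, size=12):
    # One possible chunking strategy: fixed-size word windows.
    words = document.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

past_proposal = (
    "We modernized a legacy records management system on SharePoint. "
    "We migrated the data platform to the cloud using infrastructure as code. "
    "Our team also built dashboards for the finance department."
)

# "Ingest": chunk the past content and store (embedding, chunk) pairs.
store = [(embed(c), c) for c in chunk(past_proposal)]

# "Retrieve": rank stored chunks against the question and keep the top
# matches; these become the context handed to the language model.
question = "Have you done data platform modernization work before?"
q = embed(question)
top = sorted(store, key=lambda pair: cosine(q, pair[0]), reverse=True)[:2]
context = "\n".join(c for _, c in top)
print(context)
```

Whatever the chunking strategy, the retrieved chunks are only as good as this similarity ranking, which is the point being made: a general purpose chatbot stops here, while a domain-specific one reasons further about the results.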
202
00:15:22,240 --> 00:15:26,440
And it goes about answering that question. But keep in mind that chat

203
00:15:26,480 --> 00:15:31,799
bots, including ChatGPT, are System One
chatbots. And what do I mean by

204
00:15:31,840 --> 00:15:37,399
system one. I'm really talking about
the famous book from Daniel Kahneman, System

205
00:15:37,399 --> 00:15:43,320
One, System Two? Exactly what I'm talking about, right. Can you

206
00:15:43,320 --> 00:15:48,240
elaborate on that, Vishwas, a little bit? Yeah. So there's a famous book

207
00:15:48,600 --> 00:15:56,080
by Nobel Laureate Daniel Kahneman called
Thinking, Fast and Slow. Highly recommend folks

208
00:15:56,120 --> 00:16:00,279
read it. Very good book. It really
talks about two things. And this book

209
00:16:00,360 --> 00:16:06,080
came out many many years ago,
well before ChatGPT was a thing. But

210
00:16:06,720 --> 00:16:10,320
in this book they talk about System
One and System Two. System One, think

211
00:16:10,320 --> 00:16:17,000
of that as the way your brain
works that allows you to avoid a ditch.

212
00:16:17,440 --> 00:16:19,080
Right, You don't have to think
too much, You instinctively take a

213
00:16:19,320 --> 00:16:26,519
reaction. System two is when you're
thinking about it. Reptile brain, exactly. So chat

214
00:16:26,559 --> 00:16:32,320
GPT and other chatbots today are system
one. Right, you can go up

215
00:16:32,360 --> 00:16:34,759
to ChatGPT and say, I'm
going to ask you a question, but

216
00:16:34,879 --> 00:16:37,960
this is a really detailed question.
I want you to think about it for

217
00:16:38,000 --> 00:16:41,759
thirty minutes and then only respond.
There's no way to do that. It

218
00:16:41,799 --> 00:16:48,279
starts responding in three seconds or whatever
the latency is. And so when we

219
00:16:48,360 --> 00:16:55,440
talk about a special purpose chatbot in
this case, in the case of proposals,

220
00:16:55,919 --> 00:16:59,840
you really have to think about the
proposal domain. Think about the important

221
00:17:00,120 --> 00:17:04,279
entities in the proposal domain. As
you look through past documents, you need

222
00:17:04,359 --> 00:17:10,839
to collate them into certain entities
which are very domain specific. And then

223
00:17:11,279 --> 00:17:15,839
generating a response to a question.
There are some best practices that are available

224
00:17:15,880 --> 00:17:22,759
out there in terms of how do
you write persuasive content right and how do

225
00:17:22,880 --> 00:17:29,160
you reason about it. So these questions can
be long-form questions, open-ended questions

226
00:17:29,160 --> 00:17:34,960
in RFPs, like we have this
problem about modernization and this system needs to

227
00:17:36,000 --> 00:17:38,799
be modernized, and these are the
challenges we are running into. It could

228
00:17:38,799 --> 00:17:41,759
be anything, right, I'm just
giving you an IT example, could be

229
00:17:41,720 --> 00:17:45,960
a healthcare example, could be something
else. So these are questions that you

230
00:17:47,000 --> 00:17:49,640
have to respond to. They're very
broad-based questions. But then there are

231
00:17:49,720 --> 00:17:56,000
best practices in terms of how you
write persuasive text in responding to that question.

232
00:17:56,119 --> 00:18:06,279
And that's where the reasoning comes in, where our domain-specific copilot

233
00:18:07,200 --> 00:18:12,960
understands some of these best practices.
When it ingests the content, it automatically

234
00:18:14,119 --> 00:18:19,920
organizes them based on the entities, things
like that. I can make an

235
00:18:19,920 --> 00:18:26,799
analogy here: if you go to Chat
GPT to ask it about a problem and

236
00:18:26,880 --> 00:18:32,880
how to configure this middleware or whatever. Right? You're taking a chance,

237
00:18:32,960 --> 00:18:37,079
whether it's going to assume that you're
using Startup.cs or Program.cs, right,

238
00:18:37,160 --> 00:18:41,480
a dot net six and earlier version
of ASP.NET Core, or, you know,

239
00:18:41,960 --> 00:18:45,400
or the new, modern
stuff. And if it understood this,

240
00:18:45,559 --> 00:18:48,920
there would be another layer before it
started spitting this stuff out that said,

241
00:18:49,000 --> 00:18:53,839
oh, we have this context that
says it should know we're using

242
00:18:55,119 --> 00:18:59,079
dot net eight and we're using the
latest version of C sharp, and we

243
00:18:59,160 --> 00:19:03,359
want to you know, frame the
answer in that context. Right. So

244
00:19:03,440 --> 00:19:08,039
that's setting up a system prompt and
making sure that ChatGPT or any

245
00:19:08,079 --> 00:19:15,079
language model has the right context so
that it can give you an answer that

246
00:19:15,200 --> 00:19:19,680
is very much aligned with your background
and your needs.

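The analogy here boils down to building a system prompt from the project's known context. A minimal sketch, using the common system/user chat-message shape; the stack values are invented for illustration, and the model call itself is omitted because it varies by provider.

```python
# Sketch of the idea above: put the project's context into a system prompt
# so the model frames its answer for your actual stack instead of guessing
# between Startup.cs-era and Program.cs-era ASP.NET Core. The stack values
# below are hypothetical; the client call is omitted on purpose.
project_context = {
    "runtime": ".NET 8",
    "language": "latest C#",
    "hosting": "minimal APIs in Program.cs (no Startup.cs)",
}

system_prompt = (
    "You are helping on a project with this stack: "
    + "; ".join(f"{k}: {v}" for k, v in project_context.items())
    + ". Frame every answer for this stack, and say so if a suggestion "
    "does not apply to it."
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "How do I configure this middleware?"},
]
print(messages[0]["content"])
```

The point made in the conversation is that a domain-specific copilot generates this kind of context automatically, rather than relying on the user to type it every time.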
247
00:19:21,119 --> 00:19:26,319
I'm also talking about a
scenario where in some cases you're trying to

248
00:19:26,359 --> 00:19:30,920
answer a question. So let's just
take that IT example, and I

249
00:19:30,920 --> 00:19:33,559
think that'll appeal to the listeners here
as well. So there's a question about

250
00:19:34,000 --> 00:19:41,200
somebody, some government agency or some
enterprise, issued an RFP with a question about

251
00:19:41,400 --> 00:19:45,519
modernization of some data platform, right, and they talked about, hey,

252
00:19:47,720 --> 00:19:52,279
this platform needs to be modernized.
How do you go about it? Have you done

253
00:19:52,279 --> 00:19:55,599
this kind of work before? How
do you go about solving this problem?

254
00:19:55,759 --> 00:20:00,160
Again, I'm over simplifying this,
but let's say that's the start, right. So

255
00:20:00,200 --> 00:20:03,799
in this case, we would have
to go find out if you have done

256
00:20:03,880 --> 00:20:07,599
this kind of work as an organization, right, and then we will have

257
00:20:07,640 --> 00:20:15,599
to search through your past RFP responses, your core competencies. Maybe you've written

258
00:20:15,599 --> 00:20:21,720
some white papers, you've written blogs, you have transcriptions of podcasts and things

259
00:20:21,799 --> 00:20:25,920
like that. We go look through
that, and that's a common RAG pattern

260
00:20:25,960 --> 00:20:27,839
that I'm talking about that people are
familiar with. But we have to go

261
00:20:27,880 --> 00:20:32,160
find a couple of examples, or
maybe three examples of where we have done

262
00:20:32,279 --> 00:20:38,799
this kind of data modernization. So
you get that text which says, well,

263
00:20:38,799 --> 00:20:44,440
these three examples match somewhat the requirement
of the RFP. But then the

264
00:20:44,480 --> 00:20:48,400
reasoning comes in, how good a
match is this? Right? Should we

265
00:20:48,519 --> 00:20:52,680
use this as an example? If
it is not a good match, should

266
00:20:52,680 --> 00:20:56,799
we be changing our prompt and looking
for a different kind of an answer.

267
00:20:56,079 --> 00:21:02,960
So all of that is happening dynamically
at run time, and we are not

268
00:21:03,119 --> 00:21:07,079
asking our users to write any of
the prompts, right. We are essentially

269
00:21:08,559 --> 00:21:14,880
generating all of the prompts for them
based on the best practices that I talked

270
00:21:14,920 --> 00:21:18,400
about. Right, And there's an
interesting inverted ratio that we have learned about.

271
00:21:18,519 --> 00:21:22,200
So if we're generating a twenty page
document, in many cases we end

272
00:21:22,279 --> 00:21:27,359
up generating two hundred pages. That's
the inverted ratio of prompts, which includes

273
00:21:27,400 --> 00:21:30,400
all of the prompts and the context
that you're feeding into the language model.

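The retrieve, judge, and retry loop described in this passage can be sketched as follows. Both search() and judge_match() are toy stand-ins invented for illustration: a real system would use a vector search and a language-model call to score match quality, and its actual prompts are not shown in the episode.

```python
# Sketch of the retrieve -> judge -> retry loop described above. search()
# and judge_match() are toy stand-ins: a real system would use a vector
# search and a language-model call to score how well each past example
# matches the RFP requirement.

def search(query, corpus):
    # Stand-in retrieval: rank documents by keyword overlap with the query.
    terms = set(query.lower().split())
    scored = [(len(terms & set(doc.lower().split())), doc) for doc in corpus]
    return [doc for score, doc in sorted(scored, reverse=True) if score > 0]

def judge_match(requirement, example):
    # Stand-in for asking the model "how good a match is this, 0 to 1?"
    req = set(requirement.lower().split())
    return len(req & set(example.lower().split())) / len(req)

def find_examples(requirement, corpus, threshold=0.3, max_retries=2):
    query = requirement
    for _ in range(max_retries + 1):
        candidates = search(query, corpus)
        good = [c for c in candidates if judge_match(requirement, c) >= threshold]
        if good:
            return good[:3]  # two or three strong examples is enough
        # No good match: revise the query (here, crudely broaden it) and retry.
        query = " ".join(query.split()[:3])
    return []

corpus = [
    "modernized a legacy data platform for a federal agency",
    "built a mobile app for field inspections",
]
examples = find_examples("data platform modernization experience", corpus)
print(examples)
```

Running the loop several times per section, with all the generated prompts and retrieved context included, is what produces the inverted ratio mentioned above: far more prompt text goes in than document text comes out.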
274
00:21:30,519 --> 00:21:34,680
I mean, that's not that weird
when it comes to a good search string.

275
00:21:34,799 --> 00:21:37,759
Even like when you focus in on
a specific thing, you need to

276
00:21:37,799 --> 00:21:44,000
know, you have to be very descriptive
to get the answer. That's right.

277
00:21:44,079 --> 00:21:47,240
That's a good analogy. Yeah,
I mean, I think it's useful for

278
00:21:47,240 --> 00:21:49,319
folks, too, to understand. You're
in Washington, DC, like you guys

279
00:21:49,319 --> 00:21:52,880
are in the business of writing RFPs
for government, which tend to be lengthy

280
00:21:53,559 --> 00:21:59,680
and have very specific rules, like
it's not a trivial thing, it is

281
00:21:59,680 --> 00:22:04,400
not a trivial thing. There is
a multi hundred page document called the Federal

282
00:22:04,480 --> 00:22:11,519
Acquisition Regulation which talks about all of
the details of an RFP. What would

283
00:22:11,599 --> 00:22:15,519
go in section M, what would
go in section L, Section C,

284
00:22:15,240 --> 00:22:18,839
things like that, so the rules
around that. And I mean, I

285
00:22:18,839 --> 00:22:23,000
think a language model would be good
at: given this rule set and this goal

286
00:22:23,279 --> 00:22:29,680
and these capabilities, write me the
paragraphs, right. And what we are finding

287
00:22:29,759 --> 00:22:33,799
is as good as these language models
are. Just like human beings, if

288
00:22:33,839 --> 00:22:36,799
you tell them twelve things to do. Yeah, they will do the first

289
00:22:36,880 --> 00:22:40,279
and second, and then they will do
the eleventh and twelfth and forget everything

290
00:22:40,319 --> 00:22:44,440
in between. Isn't
that interesting? It's very interesting. But

291
00:22:44,480 --> 00:22:48,680
it does sound like, in building
this RFP generator, you literally want to

292
00:22:48,720 --> 00:22:55,000
go section to section and write a
separate prompt for each one. That's effectively it.

293
00:22:55,680 --> 00:22:57,799
Again, because the rule set's so
consistent, you should be able to

294
00:22:57,880 --> 00:23:02,880
generate almost all of this, right? Right. So let me actually take

295
00:23:02,920 --> 00:23:07,720
you back into the day in the life
of a technical architect or an engineer like

296
00:23:07,799 --> 00:23:11,359
me who is supporting RFPs, and
then we'll come back to this point.

297
00:23:11,400 --> 00:23:15,559
So I'll try to be very quick
here. So let's say an RFP arrives.

298
00:23:15,599 --> 00:23:18,839
It's a fifty, sixty, or one hundred
page PDF which is nontrivial to read.

299
00:23:18,920 --> 00:23:22,039
Somebody has to read it. This
is the request for you to propose

300
00:23:22,079 --> 00:23:27,119
to them. This is the request. Your proposal will be larger, although

301
00:23:27,160 --> 00:23:30,599
in some cases they want you to
write it. They have very strict page

302
00:23:30,640 --> 00:23:33,440
limits on how much you can write, right? Oh no. They can

303
00:23:33,480 --> 00:23:37,759
send you as many pages as they
want, but you can only send them.

304
00:23:37,960 --> 00:23:41,480
You can only send so much. And then, you know, all the tricks about, you know,

305
00:23:41,640 --> 00:23:45,799
creating really dense diagrams to sort of
save on words come into play.

306
00:23:45,839 --> 00:23:51,200
But let's just say you've got a
fifty page complicated RFP. Now some human

307
00:23:51,240 --> 00:23:56,119
being has to read that and understand
what is happening in this RFP, and

308
00:23:56,160 --> 00:23:59,240
then let's say they spend a day
or two to figure it out. And then

309
00:23:59,279 --> 00:24:03,640
they organize a meeting where they
invite different people. There's a proposal manager,

310
00:24:04,119 --> 00:24:11,079
there's a technical SME, there is the account
team. Right, there is no

311
00:24:11,240 --> 00:24:18,279
point in responding to an RFP if
you don't know anything about that customer.

312
00:24:18,440 --> 00:24:21,400
That is a general rule. That's
not just in government. That's a general

313
00:24:21,480 --> 00:24:23,839
rule. Right. Why waste it? You
know, there's no prize for the second

314
00:24:23,839 --> 00:24:30,359
and third and fourth place based on
your response, So you have to read

315
00:24:30,400 --> 00:24:37,279
that. You organize a meeting and
they present what is requested in the RFP,

316
00:24:37,400 --> 00:24:41,480
and then somebody like me, an
engineer, would join that meeting and

317
00:24:41,519 --> 00:24:45,279
say, yes, we can do
this. This is SharePoint related, this

318
00:24:45,400 --> 00:24:48,279
is records management. Oh, we
will use this tool and we will architect

319
00:24:48,359 --> 00:24:53,559
these pieces. And somebody will say
fine, you go off, and the

320
00:24:53,599 --> 00:24:59,359
other architects go off and start writing
the first draft. Now I don't want

321
00:24:59,440 --> 00:25:04,240
to write anything because I'm already busy
with my day job. And they give

322
00:25:04,240 --> 00:25:07,880
me two weeks. I go off
and start writing, and it turns out that

323
00:25:07,920 --> 00:25:11,519
I don't know about my other colleagues, but I'm not a good writer. So

324
00:25:11,559 --> 00:25:17,240
two weeks go by, and then
we're expected to show up, come back

325
00:25:17,279 --> 00:25:21,279
to this meeting with what we have
written. And that's when the yelling begins,

326
00:25:21,319 --> 00:25:25,880
because you know they don't like what
we have written. Everybody has written

327
00:25:25,920 --> 00:25:30,000
it differently, and you know,
of course I am overemphasizing all

328
00:25:30,039 --> 00:25:33,960
of the projects that I've worked on, and the same is true with my

329
00:25:34,039 --> 00:25:40,000
colleagues, and there's a rich
library of experience, but I'm only focused

330
00:25:40,039 --> 00:25:41,480
on the ones I've worked on,
right, because that's what I know best.

331
00:25:42,039 --> 00:25:48,319
So now two weeks have gone by, and really a draft is there,

332
00:25:48,359 --> 00:25:52,599
which is a poor quality draft
that reads like ten people have written

333
00:25:52,640 --> 00:25:56,359
it. And now you've only two
more weeks left to the final submission.

334
00:25:56,799 --> 00:26:03,319
Right. And it is often said
that proposals are won either at the beginning

335
00:26:03,400 --> 00:26:06,480
or at the end. What they mean is
either at the beginning, you understand that

336
00:26:06,519 --> 00:26:10,039
you're going to win, you put
the right team in place, or at

337
00:26:10,119 --> 00:26:14,559
the end, if you have enough
time left, you want to take a

338
00:26:14,559 --> 00:26:19,000
step back and think about what should
be a creative pricing strategy. Who else

339
00:26:19,039 --> 00:26:23,039
would be bidding this right? How
can I ghost my competition? If you

340
00:26:23,079 --> 00:26:26,799
do those things, your chances of
winning go up. But the sad reality

341
00:26:26,839 --> 00:26:32,440
of proposal writing is you spend most
of the time in the middle, chasing

342
00:26:32,480 --> 00:26:37,079
the first draft, chasing the second
draft. So the value proposition for the

343
00:26:37,119 --> 00:26:41,319
tool, especially designed for people
like me: you have taken all of

344
00:26:41,359 --> 00:26:45,559
your past performances, stored them in
a library, and then when the RFP

345
00:26:45,720 --> 00:26:49,960
arrives, our engine, the pWin engine, will take a look at the RFP,

346
00:26:52,400 --> 00:26:56,799
parse it down into what is exactly
being asked, and then rather than

347
00:26:56,839 --> 00:27:03,359
having to write ten pages of prose
and return it, you're really dealing at

348
00:27:03,400 --> 00:27:07,519
a higher level. You're providing hints
to our model. We call it flight

349
00:27:07,559 --> 00:27:12,720
plan. And very creative, right? If
we're calling ourselves a copilot, a flight planner

350
00:27:12,799 --> 00:27:19,079
seems to be logical. So there, so you set up a flight plan,

351
00:27:19,119 --> 00:27:23,359
which is really, and this is Richard
and Carl, this is where I think

352
00:27:23,400 --> 00:27:26,759
this is a very key concept in
my mind. Again, looking back,

353
00:27:26,799 --> 00:27:32,319
it seems trivial, but this is
one of the concepts that we landed on

354
00:27:32,400 --> 00:27:36,079
earlier on that has served us well. So if you think of calling yourself

355
00:27:36,079 --> 00:27:38,880
a co pilot, which means we
are not here to replace the proposal writers.

356
00:27:40,000 --> 00:27:44,319
You know, these people have gained
experience and skill over many, many years.

357
00:27:45,039 --> 00:27:47,960
We are here to augment them.
And how do they work with the

358
00:27:48,039 --> 00:27:52,920
model without becoming prompt experts themselves
or without becoming experts in the RAG pattern.

359
00:27:53,880 --> 00:27:56,400
What we ask them to do is
we ask them to fill out a

360
00:27:56,400 --> 00:28:03,240
flight plan, essentially tell us in
great detail how we should be responding. So

361
00:28:03,440 --> 00:28:07,960
these are the solutions we want to
incorporate. Here are the win themes.

362
00:28:07,480 --> 00:28:11,480
We think we are differentiated because we
are the best Microsoft partner in the world.

363
00:28:12,279 --> 00:28:15,559
What are the pain points? If
you really know the customer, you

364
00:28:15,599 --> 00:28:22,559
should know their motivation for issuing this
RFP, So you provide that vision in

365
00:28:22,599 --> 00:28:26,279
the form of a flight plan,
and then our engine will take that flight

366
00:28:26,400 --> 00:28:33,160
plan and then generate these prompts dynamically
as we talked about, and then generate

367
00:28:33,000 --> 00:28:38,200
a twenty thirty forty page response.
And why twenty thirty forty pages, because

368
00:28:38,200 --> 00:28:42,119
that's what's been asked, And that
was the other innovation. If you go

369
00:28:42,200 --> 00:28:48,200
to any other co pilot, you'll
end up getting one or two pages.
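The flight plan described a moment ago can be sketched as a small structured spec that the writer fills in and the engine expands into prompts. The field names below are invented for illustration; the real schema isn't described in the episode:

```python
from dataclasses import dataclass

@dataclass
class FlightPlan:
    # Hypothetical fields, mirroring what the conversation mentions:
    solutions: list    # solutions to incorporate
    win_themes: list   # why we think we are differentiated
    pain_points: list  # the customer's motivation for issuing the RFP

def expand(plan: FlightPlan, section: str) -> str:
    """Turn the writer's high-level intent into a generation prompt
    for one section, so the writer never touches raw prompts."""
    return (
        f"Draft the '{section}' section. "
        f"Incorporate: {', '.join(plan.solutions)}. "
        f"Win themes: {', '.join(plan.win_themes)}. "
        f"Customer pain points: {', '.join(plan.pain_points)}."
    )
```

The point of the design is that regenerating after editing the flight plan is cheap: change one field, re-expand, rerun.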

370
00:28:48,200 --> 00:28:49,720
Even ChatGPT, if you ask
a complex question, it will say,

371
00:28:49,759 --> 00:28:52,519
hey, this is a complex question. Let me just tell you how you

372
00:28:52,559 --> 00:28:56,759
can get to this answer. In
our case, we can't tell our users

373
00:28:56,920 --> 00:29:00,559
how you can get to an answer. They want to see twenty pages.

374
00:29:00,039 --> 00:29:04,680
So we have to. Like I
was saying earlier, these language models,

375
00:29:04,680 --> 00:29:08,319
as good as they are, they tend
to forget things in the middle. So

376
00:29:08,359 --> 00:29:12,279
it's our job to take two requirements, give it to the language model,

377
00:29:15,079 --> 00:29:19,160
get them to generate something, and
evaluate it at the end of that and

378
00:29:19,200 --> 00:29:22,440
see if they really answered the question. If not, send it back again,

379
00:29:22,519 --> 00:29:26,359
send it back again. Now you
have section one, you do that

380
00:29:26,400 --> 00:29:30,319
for section two, section three;
you do that repeatedly. And that's

381
00:29:30,359 --> 00:29:33,799
why people ask me, why is
it taking you guys four or five hours

382
00:29:33,839 --> 00:29:38,519
to generate a twenty page document?
And when I ask a question of chat

383
00:29:38,559 --> 00:29:41,279
GPT, it gives me an answer
in seconds. Why is it taking you

384
00:29:41,319 --> 00:29:47,000
four or five hours? Well,
the answer is that we're having to iterate

385
00:29:47,160 --> 00:29:51,880
over the content that's being developed,
and at every step of the way we

386
00:29:51,920 --> 00:29:55,519
are evaluating it for the sake of
completeness. And by the way, we

387
00:29:55,599 --> 00:30:00,960
also do hallucination checks as the content
is being generated, right, you have

388
00:30:00,039 --> 00:30:07,680
to. Without that human to check things
out. Customers can't just expect you to

389
00:30:07,720 --> 00:30:12,000
plug stuff in and copy and paste
out of a chat bot, you know,

390
00:30:12,279 --> 00:30:17,480
right, right. So I agree,
hallucination checks. That's why they're very

391
00:30:17,480 --> 00:30:21,279
important. And you know, going
back to my example, Carl, the

392
00:30:21,400 --> 00:30:26,319
RFP is asking for companies who have
done a five thousand or fifty thousand

393
00:30:26,400 --> 00:30:33,359
Notes, sorry, migration to the cloud. And now the

394
00:30:33,400 --> 00:30:37,640
text that was generated says you have
done a five thousand Notes migration. Yeah,

395
00:30:37,799 --> 00:30:41,000
we can't find a reference to a
five thousand Notes migration in your entire past

396
00:30:41,079 --> 00:30:45,319
history. Say, hey, you
may want to go double check this.
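A minimal sketch of the loop just described, with stand-in `generate` and `evaluate` callables (the real engine's interfaces are not public): each section is generated, checked for unanswered requirements, sent back if incomplete, and concrete claims are flagged when they can't be traced to the past-performance library.

```python
def generate_section(requirements, generate, evaluate, max_rounds=4):
    """Generate, then check whether every requirement was really answered;
    if not, send it back again. Doing this per section for a 20-page
    document is why a run takes hours, not seconds."""
    draft = generate(requirements, feedback=None)
    for _ in range(max_rounds):
        missing = evaluate(draft, requirements)  # requirements not yet addressed
        if not missing:
            break
        draft = generate(requirements, feedback=missing)
    return draft

def flag_unsupported(claims, library):
    """Hallucination check: flag any claim with no trace in the
    past-performance library (naive substring match for illustration)."""
    return [f"No reference found for {c!r}; you may want to double-check this."
            for c in claims
            if not any(c.lower() in doc.lower() for doc in library)]
```

A production system would presumably use semantic matching rather than substrings, but the shape of the loop is the same.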

397
00:30:45,160 --> 00:30:52,119
Well, how effective has it been
in generating proposals? So well, you

398
00:30:52,119 --> 00:30:57,200
should ask our customers that. But
one... Well, it sounds like they're

399
00:30:57,359 --> 00:31:00,759
happy with it. Yeah, I
think a lot more work needs to be

400
00:31:00,799 --> 00:31:06,839
done, obviously. Language models... Richard, building a production grade application

401
00:31:06,920 --> 00:31:11,960
with language models is hard because they
have their own personality. When Microsoft goes

402
00:31:11,000 --> 00:31:17,359
from four to four turbo to four
omni, some personality aspect of these

403
00:31:17,440 --> 00:31:19,519
language models can change, and we
have to account for that, especially if

404
00:31:19,519 --> 00:31:25,000
you're generating a multi layer prompt,
you have to account for that. So

405
00:31:25,160 --> 00:31:30,759
It takes some additional time. But so
far we have had customers who said that

406
00:31:30,799 --> 00:31:34,359
we've accelerated their timelines, given them
more time. You said you reached

407
00:31:34,400 --> 00:31:38,200
out to a dozen customers, were
they all pretty happy with what they got?

408
00:31:38,559 --> 00:31:44,039
So I didn't complete the story.
We built an MVP. We went

409
00:31:44,079 --> 00:31:47,559
to a dozen customers and said,
we just want you to try our product.

410
00:31:47,680 --> 00:31:52,400
We didn't even have a UI at
that point, right, And because

411
00:31:52,480 --> 00:31:56,000
what's the point of the UI if
the content is not of the quality where

412
00:31:56,039 --> 00:32:00,519
it is helping you with saving time
and improving the outcomes. Right, So

413
00:32:00,559 --> 00:32:05,839
we went to them, got some
really good feedback, added more capabilities,

414
00:32:06,160 --> 00:32:08,599
got the UI, and of course
our UI is SharePoint because it turns out

415
00:32:08,599 --> 00:32:14,039
that eighty to ninety percent of proposal
writers live in SharePoint. So why

416
00:32:14,079 --> 00:32:21,000
Why should we build something outside
an environment that they're already in? Why blow against

417
00:32:21,039 --> 00:32:24,319
the wind? Yeah? So then, Richard, what we did was back

418
00:32:24,359 --> 00:32:29,440
in February or March. This has
been a very fast moving thing, less than

419
00:32:29,559 --> 00:32:34,359
nine months, we said, okay, we're going to offer this in a

420
00:32:34,400 --> 00:32:38,119
commercial manner as a paid service,
and we went to many of those original

421
00:32:38,160 --> 00:32:43,160
beta customers. Many of them signed
up, so we were at this point

422
00:32:43,279 --> 00:32:50,200
running several production customers who are paying
customers of ours. And I'm really proud

423
00:32:50,240 --> 00:32:54,880
to say that Microsoft's own proposal Center
of Excellence. Microsoft has this group called

424
00:32:54,880 --> 00:33:01,240
the Global Proposal Center of Excellence.
We went through an evaluation with them and

425
00:33:01,279 --> 00:33:07,920
they just onboarded us. Wow. Congratulations, congrats. Yeah,

426
00:33:07,960 --> 00:33:12,160
that's great, and gentlemen, we
should pause for one moment for these

427
00:33:12,240 --> 00:33:16,000
very important messages. Now we're back. It's .NET Rocks. I'm Richard

428
00:33:16,039 --> 00:33:22,720
Campbell. That's Carl Franklin. We're
here with our friend Vishwas Lele, now the

429
00:33:22,799 --> 00:33:27,519
lead on this new pWin company.
And obviously you know your MVP has gone

430
00:33:27,519 --> 00:33:30,559
well, you've done some initial testing. You know, I'll throw my sysadmin

431
00:33:30,680 --> 00:33:35,279
hat on because I've talked to a
lot of CTO CIO types who are looking

432
00:33:35,359 --> 00:33:38,480
at these technologies and they keep being
told by Microsoft, you should have your

433
00:33:38,559 --> 00:33:45,079
quote data estate in order, like
what I'm afraid of with the generalized tools

434
00:33:45,119 --> 00:33:49,480
like this, is that your data
estate is not in order, and so

435
00:33:49,920 --> 00:33:53,400
data is going to be pulled into
the model and then subsequently revealed in prompts

436
00:33:53,680 --> 00:33:58,839
that everyone's going to be very unhappy
about. And there is no it's not

437
00:33:58,880 --> 00:34:01,359
like a sign pops out of a
server somewhere when your data estate is in

438
00:34:01,440 --> 00:34:06,480
order. You don't know, it's
like your system should be secure. It's

439
00:34:06,640 --> 00:34:09,719
just as impossible. Like, one
of the things that appeals to me about

440
00:34:10,199 --> 00:34:16,920
your approach to this is you're encapsulating
the data set and arguably you're taking that

441
00:34:17,000 --> 00:34:22,199
responsibility, saying we're the folks who will
make sure that the data that appears in

442
00:34:22,239 --> 00:34:25,760
the model is appropriate. I mean, I'm wondering. I'm just wondering.

443
00:34:25,760 --> 00:34:30,880
How much work is that for
you? Yeah, that's a huge amount

444
00:34:30,920 --> 00:34:36,800
of work. If I had to
look back, we've spent thousands of hours

445
00:34:37,119 --> 00:34:42,519
data science hours in building the multi
layer prompt engine. I would say an

446
00:34:42,559 --> 00:34:46,440
equal amount of time in worrying about
ingestion of the data. And now,

447
00:34:46,639 --> 00:34:54,239
fortunately for us, this is why
a domain specific copilot made sense, because

448
00:34:54,280 --> 00:34:58,320
you know you're limiting the data set, right. Some of the things that

449
00:34:58,360 --> 00:35:05,079
you're talking about is you might have
hundreds of thousands of documents in SharePoint and

450
00:35:05,119 --> 00:35:08,480
stored somewhere on OneDrive, and
that one document which nobody has found until

451
00:35:08,519 --> 00:35:13,000
now. You can be sure that
M365 Copilot would find it.

452
00:35:13,039 --> 00:35:16,840
And somebody asks the question, that
question will get answered inappropriately because that

453
00:35:16,920 --> 00:35:22,719
document was available, but nobody knew
about that document. In our case,

454
00:35:22,840 --> 00:35:29,880
it helped that we limit the kind
of data that goes into the engine.

455
00:35:30,199 --> 00:35:36,039
Right And as I mentioned, your
past RFP submissions, your white papers,

456
00:35:36,639 --> 00:35:43,400
and things like CPARS, which is
a government-issued document which says you've worked

457
00:35:43,440 --> 00:35:49,599
on these technologies well structured. So
we limit ourselves to those documents. We

458
00:35:49,679 --> 00:35:54,800
do things like many of these proposals
include resumes of people, and we exclude

459
00:35:54,840 --> 00:36:01,519
those resumes, because now your experience from
a previous employer might pollute your current employer's

460
00:36:01,599 --> 00:36:07,880
data, so we exclude that.
So as part of the ingestion Richard,

461
00:36:07,920 --> 00:36:13,360
we break the documents down into logical
sections. We use the language models for

462
00:36:13,440 --> 00:36:17,119
some classification even. So we don't
have human taggers to say this is a

463
00:36:17,159 --> 00:36:21,559
management plan, this is an executive
summary. We're using language models to do

464
00:36:21,639 --> 00:36:27,519
some tagging of the documents themselves,
and I'm seeing this now inside of Purview,

465
00:36:27,639 --> 00:36:32,159
Microsoft's, and again, this is the IT conversation, where
it's getting really good at calling out PII.
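A sketch of that ingestion step with a stubbed-out model call. The label taxonomy and prompt are invented for illustration: the language model, not a human tagger, labels each chunk, and resume sections are dropped so a previous employer's experience can't leak into the knowledge base.

```python
# Hypothetical label taxonomy; the real one isn't described in the episode.
LABELS = ["management plan", "executive summary", "technical approach",
          "past performance", "resume"]

def classify(text, llm):
    """Ask a language model, rather than a human tagger, to label a section."""
    answer = llm(f"Classify this proposal section as one of {LABELS}:\n{text}")
    answer = answer.strip().lower()
    return answer if answer in LABELS else "unknown"

def ingest(sections, llm):
    """Tag each logical section, excluding resumes so a previous
    employer's experience can't pollute the current knowledge base."""
    tagged = {name: classify(text, llm) for name, text in sections.items()}
    return {name: label for name, label in tagged.items() if label != "resume"}
```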

466
00:36:34,039 --> 00:36:37,519
Even before you get around to tagging, it knows the shape of identifiable

467
00:36:37,559 --> 00:36:44,000
information. So starting with a limited
data set helped us. And you know

468
00:36:44,199 --> 00:36:46,920
that may not be applicable to other
people here, and frankly, that's the

469
00:36:47,039 --> 00:36:52,119
challenge that Microsoft had with M365
Copilot, because you have everything,

470
00:36:52,599 --> 00:36:57,119
and of course they've added capabilities like
hey, don't include these documents in

471
00:36:57,159 --> 00:37:01,920
my index or leave this library out
right. So obviously those governance controls have

472
00:37:02,079 --> 00:37:06,000
come into play. In our case, we started with a small data set,

473
00:37:06,039 --> 00:37:08,360
but even then it was really important
to parse it. I'll give you an

474
00:37:08,400 --> 00:37:15,840
example. You might partner with a
company when you're applying for a certain piece

475
00:37:15,880 --> 00:37:22,400
of work. That company may end
up teaming with a competitor of yours in a

476
00:37:22,440 --> 00:37:28,119
subsequent RFP. Now if you've taken
their content and included it in your knowledge base,

477
00:37:28,960 --> 00:37:30,920
chances are that that content will show
up, so you have to be careful

478
00:37:30,960 --> 00:37:38,440
about that in terms of how you're
using, maybe, a partner's data. So all of

479
00:37:38,480 --> 00:37:45,400
those things, Richard, apply, and we're
breaking that content down. Again, I talked

480
00:37:45,400 --> 00:37:49,880
about entities and then we use those
for our content generation. I got a

481
00:37:49,960 --> 00:37:52,920
question, did you, and you
may have said this early on, I

482
00:37:52,960 --> 00:37:59,280
didn't hear it, but are you using
Microsoft Copilot Studio to create yours? We're not

483
00:37:59,320 --> 00:38:04,039
making use of Copilot Studio. Have
you used it? I've used it,

484
00:38:04,559 --> 00:38:13,639
But we started our development well
before Copilot Studio. And secondly, our

485
00:38:13,800 --> 00:38:19,119
response engine at this point is so
complicated that we needed to go directly build

486
00:38:19,199 --> 00:38:23,039
things in Python and Azure Machine Learning
notebooks and things like that. Gotcha.

487
00:38:23,159 --> 00:38:30,119
I mean what I like about this
is you clearly brought your own meta understanding

488
00:38:30,320 --> 00:38:37,320
of RFP writing to the tool set.
That's right. To then make into a product

489
00:38:37,480 --> 00:38:42,199
that works well for individual companies against
their data. But you know, there's

490
00:38:42,239 --> 00:38:46,039
three pieces here. There is the
language model as it stands. There is

491
00:38:46,239 --> 00:38:52,960
the understanding of the rules of writing
RFPs effectively and applying those. And then

492
00:38:52,000 --> 00:38:58,239
there is the data set that is
the company's, for what their capabilities are and

493
00:38:58,519 --> 00:39:01,719
how they want to respond to RFPs
like those. You need all of those

494
00:39:01,760 --> 00:39:07,840
things. It's not there's no magic
wand here. Smart people had to work

495
00:39:07,920 --> 00:39:12,239
really hard to make this work.
But when it works, the fact that

496
00:39:12,280 --> 00:39:16,039
you're going to be able to respond
to more RFPs and to write better responses

497
00:39:16,199 --> 00:39:22,039
like this is money in the bank
if done right. If done right? And

498
00:39:22,360 --> 00:39:27,440
I should add one other thing which
I should have mentioned earlier. I believe

499
00:39:27,679 --> 00:39:34,480
that in order to develop any effective
AI solution, you need two things:

500
00:39:34,480 --> 00:39:37,239
good ML engineering. Of course that's
a given, but then you have to

501
00:39:37,239 --> 00:39:43,119
bring domain expertise. And you alluded
to that that we brought some domain expertise,

502
00:39:43,320 --> 00:39:46,119
yes, but we are not experts
in proposal writing right. So what

503
00:39:46,199 --> 00:39:53,519
we did, Richard, early on is we
went to a company called Shipley. Shipley, and

504
00:39:53,960 --> 00:39:58,719
you may not be familiar with them
if you're not in the proposal space.

505
00:39:58,760 --> 00:40:05,679
But Steven Shipley fifty years ago wrote
the book on proposal writing right, and

506
00:40:06,360 --> 00:40:09,760
really an influential book in terms of
how should you be thinking about your RFP

507
00:40:09,880 --> 00:40:15,639
responses? And that translated into a
company. People can go to Shipley wins

508
00:40:15,679 --> 00:40:21,119
dot com, a fifty year old company, and
their job is to teach classes about

509
00:40:21,119 --> 00:40:25,840
better proposal writing, teach nuances of
writing compliant proposals and when you need a

510
00:40:25,920 --> 00:40:30,039
surge capacity, they make their consultants
available. Wow, okay, yeah,

511
00:40:30,079 --> 00:40:32,960
so there's literally a business around it. That's a business around it. So

512
00:40:32,960 --> 00:40:36,559
we went to them and we said, you know, we are a tech

513
00:40:36,599 --> 00:40:39,199
company. We of course write our
own proposals, but we are nowhere near

514
00:40:39,239 --> 00:40:44,360
the subject matter expertise and the name
that you have, and we would like

515
00:40:44,400 --> 00:40:49,199
to partner with you to be co-creators
of this tool for us. Right and

516
00:40:50,239 --> 00:40:52,960
Shipley had never partnered with a company, and I was clear that I'm not

517
00:40:53,039 --> 00:40:57,760
just looking for a referral partnership that
hey recommend us. I want you to

518
00:40:57,840 --> 00:41:02,280
be joining the team so that you're
constantly helping us improve the prompts. You're

519
00:41:02,320 --> 00:41:08,000
bringing that domain expertise. And they
went through their own evaluation, Richard,

520
00:41:08,079 --> 00:41:13,559
for a period of time, they
looked at other tools, and I'm very proud

521
00:41:13,639 --> 00:41:15,639
to say that they decided to join
in as an exclusive partner, as a

522
00:41:15,639 --> 00:41:23,199
co-creation partner. Right. And having them
on our engineering scrums and saying, generate

523
00:41:23,320 --> 00:41:29,199
this content in this manner, because
what is very interesting about proposal writing is

524
00:41:30,880 --> 00:41:35,800
if you don't write it in a
certain manner, right? Here, you are restating

525
00:41:35,840 --> 00:41:39,519
the problem. You're saying you've done
these things in the past, You're reinforcing

526
00:41:39,559 --> 00:41:45,360
the point. There's a certain format, and proposal writers tend to be very

527
00:41:45,400 --> 00:41:51,159
persnickety about that, right? If
you don't follow that format, all

528
00:41:51,199 --> 00:41:54,079
bets are off, right. Yeah,
and to the three of us, if

529
00:41:54,079 --> 00:42:00,119
you read one page of highly curated
proposal writing ready stuff and then some other

530
00:42:00,159 --> 00:42:04,400
stuff, you would say, hey, reasonable people can argue about it.

531
00:42:04,400 --> 00:42:07,800
Oh yeah, your sentence is here, your reference is here. But they

532
00:42:07,800 --> 00:42:10,920
don't like that, right, They
want the format to be exact. So

533
00:42:12,800 --> 00:42:16,719
how do you write content that is
persuasive, that follows the structure. And

534
00:42:16,760 --> 00:42:22,360
having a domain expert like them,
we're so fortunate to have them. They've

535
00:42:22,400 --> 00:42:25,760
taught people the process, the methodology, the best practices. I'm betting they're

536
00:42:25,960 --> 00:42:30,000
testing like that. They're evaluating the
output. Absolutely. We did a show on Run

537
00:42:30,039 --> 00:42:32,760
As with Lynn Langit not long
ago. We call that the hard part of machine

538
00:42:32,800 --> 00:42:37,199
learning. It's like test, test,
test. Like, you've got to evaluate these

539
00:42:37,199 --> 00:42:42,400
outputs, right, and it takes
experts to even know what's good and what's

540
00:42:42,480 --> 00:42:46,000
bad. It takes experts and you
cannot use just ChatGPT to say which

541
00:42:46,000 --> 00:42:52,239
section is better, right because it
requires experience, skill, all of that.

542
00:42:52,760 --> 00:42:55,760
So they're helping us with testing,
they're helping us write this. So

543
00:42:55,840 --> 00:43:00,239
that was an important partnership. And
that's maybe one tip to the listeners

544
00:43:00,239 --> 00:43:04,719
out there that in order for an
AI solution to be effective, you can't

545
00:43:04,800 --> 00:43:07,719
just have five smart engineers go to
any domain and say I have a solution

546
00:43:07,840 --> 00:43:12,920
for your problem because I know Azure
ML and Python better than you.

547
00:43:13,039 --> 00:43:15,360
Well, that does not matter.
You don't know my problem set. Well,

548
00:43:15,400 --> 00:43:22,400
there's plenty of smart developers that have
built quality software that was wrong because

549
00:43:22,440 --> 00:43:27,519
they didn't have the domain experts involved
to make the right thing. This is

550
00:43:27,559 --> 00:43:30,719
the same rules. You sit in a
different place in some respects. You're not

551
00:43:30,800 --> 00:43:35,639
qualified to evaluate this output. You
need an expert to evaluate this output.

552
00:43:35,679 --> 00:43:38,800
So having that, I was just
finishing that other point that you succinctly made,

553
00:43:38,840 --> 00:43:42,960
which is it's really important to have
domain expertise. And I was

554
00:43:43,039 --> 00:43:47,760
just adding that our expertise was not
enough alone and we needed to partner with

555
00:43:47,800 --> 00:43:51,840
the best in the business to do
that, and we're fortunate to have them.

556
00:43:51,880 --> 00:43:53,840
I mean, I've got to think
there's somebody who's made their career out of

557
00:43:53,920 --> 00:44:01,360
writing great RFPs. This is still
pretty threatening to them, at least. But

558
00:44:01,440 --> 00:44:05,119
admittedly, you presented it the right way. It's like it's only going to give you
It's like it's only going to give you

559
00:44:05,119 --> 00:44:08,760
a draft. We're going to need
your help right every time, at least

560
00:44:08,840 --> 00:44:14,679
for it to still be read. Yeah,
Richard, that's a good point. And at

561
00:44:14,679 --> 00:44:19,239
some level we are all concerned about
the societal impact of generative AI and all

562
00:44:19,280 --> 00:44:23,320
of the things that we do.
But the way I present it: two important points.

563
00:44:24,199 --> 00:44:28,920
One is, remember the notion of
flight plan that I talked about.

564
00:44:29,559 --> 00:44:32,880
An expert proposal writer can quickly fill
out the flight plan because think about that,

565
00:44:32,920 --> 00:44:37,920
you're setting the vision, or you're
setting the intellectual momentum for you to

566
00:44:37,039 --> 00:44:42,199
win. Without a vision, nobody
is winning any RFPs, right, And

567
00:44:42,320 --> 00:44:45,079
this is where the proposal directors or
proposal writers come in and they fill out

568
00:44:45,119 --> 00:44:50,039
the flight plan. And we have
been talking to them about how their job

569
00:44:50,199 --> 00:44:54,480
changes with generative AI, where they
are operating at the object layer. We

570
00:44:54,559 --> 00:44:59,119
talk about them, We talk to
them about objects, and then we are

571
00:44:59,280 --> 00:45:01,800
dealing at the words level.
Right? Yeah, they're changing the objects

572
00:45:01,800 --> 00:45:08,079
and attributes and methods maybe, but
then we are generating all the code for

573
00:45:08,159 --> 00:45:12,400
them. In other words, that
we are generating all the words for them.

574
00:45:12,480 --> 00:45:16,280
So that's one, and it is
a model driven architecture in the sense

575
00:45:16,360 --> 00:45:21,599
that you change the flight plan,
you look at the content and you feel

576
00:45:21,639 --> 00:45:23,559
like, hey, this came out
pretty flat. Let me go back and

577
00:45:23,639 --> 00:45:29,079
change the flight plan and regenerate and
regenerate and regenerate. Right, So it's

578
00:45:29,079 --> 00:45:32,679
a model driven architecture and the better
they get with that setup, they

579
00:45:32,679 --> 00:45:37,039
can make these quick changes. That's
one. The second important point is we

580
00:45:37,079 --> 00:45:44,320
are not claiming authorship. Right.
I get this request from some people that

581
00:45:44,440 --> 00:45:50,400
hey, can we use your tool? Our knowledge repository is pretty sparse.

582
00:45:50,719 --> 00:45:53,320
We have not done this type of
a work. Can you help us generate

583
00:45:53,400 --> 00:45:58,920
good content? And well, this
is not cheating as a service, where you

584
00:45:58,960 --> 00:46:02,119
can just push a button and you
get a proposal response. Right. If

585
00:46:02,119 --> 00:46:07,000
you don't, if you don't have
good past performances, if you don't have

586
00:46:07,159 --> 00:46:13,639
good background, if you don't have
creative architects who can come up with good

587
00:46:13,679 --> 00:46:17,239
creative solutions to the problem, chances
are that what we generate is going to

588
00:46:17,280 --> 00:46:22,400
look very flat to you. So
we're not claiming authorship. Those things have

589
00:46:22,480 --> 00:46:25,639
to be in place for an organization
to be able to use our tool.

590
00:46:27,119 --> 00:46:31,199
So do you, do you ask your
customers, when you give a proposal,

591
00:46:31,280 --> 00:46:37,000
do you mind if we use this
and feed it back into the machine,

592
00:46:37,800 --> 00:46:42,480
as training data? So,
Carl, one important thing to note is

593
00:46:43,000 --> 00:46:46,000
we are not training the models.
And this is a question that I get

594
00:46:46,039 --> 00:46:52,639
asked often. We're not training the
model at all, because it's not like

595
00:46:52,760 --> 00:46:55,400
your data is going to be trained
and then used for someone else.

596
00:46:55,760 --> 00:47:00,880
But you do, you do use those?
Yes, we use them

597
00:47:00,920 --> 00:47:05,440
We use them in a RAG
manner, right? And so I

598
00:47:05,519 --> 00:47:07,599
was just about to say this is
retrieval-augmented generation. The model is

599
00:47:07,639 --> 00:47:14,559
the model, I would say,
you still have to ask permission. Yeah,

600
00:47:14,639 --> 00:47:17,239
So, so just to be clear, if somebody acquires our tool,

601
00:47:19,400 --> 00:47:22,559
they go through an onboarding stage where
we look at the documents and we tell

602
00:47:22,599 --> 00:47:28,559
them some best practices and then we
ingest those documents. Of course, that's

603
00:47:29,360 --> 00:47:32,480
their asset and that data belongs to
them. And whenever the engine runs, it

604
00:47:32,559 --> 00:47:37,199
dynamically makes a call into this database
to look for right pieces of information.

605
00:47:38,559 --> 00:47:44,079
So of course we are using their
data, but then we dynamically call into

606
00:47:44,119 --> 00:47:49,159
this engine. If there are no
results to be returned because they have not

607
00:47:49,280 --> 00:47:52,239
done this kind of work, they
don't have any white papers on that technology,

608
00:47:52,760 --> 00:47:57,880
then it's going to return pretty flat
results and we can't infuse that content

609
00:47:57,960 --> 00:48:02,400
back into Language Models and create
the content appropriately. Okay, so, Vishwas,

610
00:48:02,440 --> 00:48:07,000
what's on your to-do list?
What's next? Well, the next is

611
00:48:07,039 --> 00:48:15,679
to continue to improve the product.
That's what I'm focused on for the foreseeable

612
00:48:15,760 --> 00:48:21,920
future. Take advantage of all of
the amazing enhancements that are coming to the

613
00:48:22,000 --> 00:48:25,960
Language Model space. What I love
about this is this allows me to tap

614
00:48:27,000 --> 00:48:30,440
into all of the Azure work that
I've done in the last ten years.

615
00:48:30,639 --> 00:48:36,559
Because ultimately the Language Model is just one
service. There are thirty eight other Azure

616
00:48:36,639 --> 00:48:39,760
services that are powering this solution.
So a lot of things that we've learned

617
00:48:40,360 --> 00:48:45,119
over the last ten years with the
community are coming into play. So continue

618
00:48:45,480 --> 00:48:49,559
to see how far we can take
this, how we can support our customers

619
00:48:49,599 --> 00:48:54,440
better. So that's what's next for
me. Fantastic. Hey, it's always

620
00:48:54,519 --> 00:48:58,440
great to talk to you. We
always learn something, we always get a

621
00:48:58,480 --> 00:49:02,320
new perspective on AI when we
talk to you. Thanks, thank

622
00:49:02,360 --> 00:49:06,800
you, all right, and we'll
talk to you next time on dot

623
00:49:06,840 --> 00:49:30,559
Net Rocks. Dot Net Rocks is brought
to you by Franklin's Net and produced by

624
00:49:30,639 --> 00:49:37,559
Pwop Studios, a full-service audio, video, and post-production facility located physically

625
00:49:37,599 --> 00:49:42,559
in New London, Connecticut, and
of course in the cloud online at pwop

626
00:49:42,800 --> 00:49:45,679
dot com. Visit our website at
D O T N E T R O

627
00:49:45,760 --> 00:49:52,039
C K S dot com for RSS
feeds, downloads, mobile apps, comments,

628
00:49:52,400 --> 00:49:55,480
and access to the full archives going
back to show number one, recorded

629
00:49:55,519 --> 00:49:59,960
in September two thousand and two.
And make sure you check out our

630
00:50:00,000 --> 00:50:02,400
sponsors. They keep us in business. Now, go write some code.

631
00:50:04,000 --> 00:50:13,719
See you next time. [outro music]

632
00:50:14,199 --> 00:50:15,360
[outro music]
