[00:15] (15.36s)
right, I need everybody to take a deep
[00:17] (17.68s)
breath here because um I'm about to
[00:20] (20.00s)
stress you out and uh but hopefully at
[00:22] (22.48s)
the end I'll uh relieve that stress a
[00:24] (24.32s)
little bit with some ideas and solutions
[00:25] (25.76s)
for you. So, I need everybody to just
[00:27] (27.76s)
think for a second, reflect on the past
[00:29] (29.92s)
45 days and think about all the possible
[00:33] (33.12s)
things that have gone on in our industry
[00:34] (34.72s)
and all the product launches. Let me
[00:36] (36.56s)
highlight just a few for you. Notion
[00:38] (38.80s)
launched Granola, Glean, and ChatGPT
[00:41] (41.28s)
competitors. Figma launched Canva,
[00:43] (43.04s)
Framer, Illustrator, and Lovable
[00:44] (44.48s)
competitors. Atlassian launched a
[00:46] (46.24s)
Granola and Glean competitor, plus cloud
[00:48] (48.08s)
integrations. Anthropic launches a Glean
[00:50] (50.24s)
competitor with Claude integrations.
[00:51] (51.76s)
Google launched Codex and Lovable
[00:53] (53.68s)
competitors, and many more. OpenAI bought
[00:56] (56.32s)
a Cursor competitor, launched Codex, and
[00:59] (59.20s)
a lot more. Right? This is just one
[01:02] (62.16s)
little microcosm of the entire tech
[01:05] (65.04s)
industry. But if you look around at all
[01:07] (67.12s)
the different categories of software
[01:08] (68.80s)
right now, the same exact thing is
[01:11] (71.44s)
happening. And I haven't even mentioned
[01:13] (73.68s)
the horde of well-funded
[01:16] (76.08s)
startups that are getting funded in
[01:18] (78.32s)
every single one of these spaces as
[01:20] (80.00s)
well. And among all of this chaos, we
[01:23] (83.12s)
have companies that are essentially
[01:24] (84.80s)
collapsing in months rather than years.
[01:27] (87.44s)
Chegg was one of the first ones to go,
[01:29] (89.04s)
declining over 90% in a matter
[01:31] (91.20s)
of months. And of course, Stack Overflow
[01:33] (93.36s)
was one of the early victims as well
[01:35] (95.12s)
when ChatGPT launched. So this gets to
[01:38] (98.48s)
the number one question that we all need
[01:40] (100.48s)
to be answering, right? A lot of people
[01:42] (102.48s)
at this conference are talking about how
[01:44] (104.72s)
product is doing more engineering,
[01:46] (106.16s)
engineering is doing more product work,
[01:47] (107.76s)
design's doing more product work, all
[01:49] (109.28s)
the tactical, all the technical, all of
[01:51] (111.20s)
that infrastructure. But none
[01:53] (113.12s)
of that matters. None of it matters
[01:55] (115.04s)
unless you answer this question. What do
[01:57] (117.36s)
I build and why will it win? And the
[02:00] (120.00s)
interesting thing about this is this was
[02:02] (122.16s)
always the job of product. It just
[02:04] (124.48s)
happens to be that over the years it got
[02:06] (126.64s)
mired in all of this project
[02:09] (129.44s)
management, agile process, all of this
[02:11] (131.92s)
type of stuff. But this is what always
[02:14] (134.08s)
separated great product managers from
[02:16] (136.80s)
good product managers and product
[02:18] (138.64s)
leaders. This is Shaun Clowes. He's the
[02:20] (140.72s)
chief product officer at Confluent. He
[02:22] (142.56s)
was formerly chief product officer at
[02:24] (144.08s)
MuleSoft. He was the first head of
[02:25] (145.76s)
growth at Atlassian as well. And I
[02:27] (147.92s)
thought he encapsulated it well. He said,
[02:29] (149.36s)
"You're constantly trying to get ahead.
[02:31] (151.12s)
You're trying to find the angle, the
[02:32] (152.72s)
question that has not yet been asked
[02:34] (154.08s)
that gives you an insight that is not
[02:35] (155.60s)
being actioned by other people. It
[02:37] (157.44s)
doesn't just have to be an insight. It
[02:39] (159.20s)
has to be an insight that others are not
[02:40] (160.96s)
actioning because if you find that
[02:42] (162.40s)
insight and others aren't actioning it,
[02:44] (164.40s)
that is your competitive advantage." Now,
[02:48] (168.16s)
the problem is that this question has
[02:50] (170.64s)
gotten 10x harder. This is a rough map
[02:53] (173.44s)
of uh Gettysburg, and I thought it was a
[02:55] (175.36s)
good analogy because this was one of the
[02:56] (176.88s)
bloodiest battles in the Civil War. And
[02:59] (179.36s)
this kind of represents the map that we
[03:01] (181.52s)
are all playing in in the competitive
[03:02] (182.96s)
environment right now. We have huge,
[03:05] (185.84s)
fast-moving incumbents like Microsoft, Google,
[03:08] (188.08s)
and Meta. There are these new huge
[03:10] (190.40s)
horizontal platforms like ChatGPT and
[03:12] (192.64s)
Anthropic that are eating up major use
[03:14] (194.64s)
cases. We have foundational shifts in
[03:17] (197.28s)
the technology landscape, not on a yearly
[03:19] (199.52s)
basis but on a monthly basis. And there are
[03:22] (202.00s)
hordes and hordes of startups being
[03:24] (204.32s)
funded, including five or six by YC in every
[03:26] (206.88s)
single category that has traction,
[03:28] (208.88s)
every single cohort. This is you
[03:33] (213.04s)
sitting in the middle of all of this,
[03:35] (215.36s)
right? And the question is how in the
[03:38] (218.08s)
world do you find a seam among all of
[03:42] (222.00s)
these players to potentially find some
[03:44] (224.96s)
traction and win? That's the question we
[03:46] (226.96s)
have to answer before any of the other
[03:49] (229.04s)
stuff like technology, infrastructure,
[03:51] (231.60s)
or even what our roles are in the
[03:53] (233.68s)
organization. I'm Brian. I'm founder and
[03:56] (236.00s)
CEO of Reforge. And if
[03:58] (238.56s)
you notice, I have a little bit more
[03:59] (239.60s)
gray hair and wrinkles from this picture
[04:01] (241.52s)
because I've been around in tech for
[04:02] (242.88s)
about 25 years, been doing startups the
[04:04] (244.88s)
whole time. I played in some pretty
[04:06] (246.48s)
competitive environments. I helped
[04:08] (248.08s)
HubSpot launch their CRM almost a decade
[04:11] (251.84s)
ago. And at that time, that was a crazy
[04:15] (255.12s)
competitive category. People thought we
[04:17] (257.04s)
were bonkers. My guess is, if I took a
[04:19] (259.12s)
show of hands, probably over 50% of your
[04:21] (261.12s)
companies are using that CRM today. Now,
[04:23] (263.36s)
that was a competitive environment. But
[04:24] (264.96s)
what I'm experiencing now and what
[04:26] (266.56s)
we're all experiencing is probably 10x
[04:28] (268.80s)
that. And so, a little history about
[04:31] (271.92s)
Reforge is that we've been around for
[04:33] (273.68s)
about 10 years. We've helped thousands
[04:35] (275.28s)
of product teams, including all the ones
[04:36] (276.72s)
you see here, over 100,000
[04:38] (278.32s)
professionals. I hope some of you have
[04:40] (280.00s)
been part of Reforge in the past. And
[04:41] (281.84s)
the way that we've done it is that we've
[04:43] (283.12s)
built a community of over 400 experts on
[04:45] (285.52s)
the front lines to decode all of their
[04:47] (287.52s)
best practices. We started by doing that
[04:50] (290.00s)
with 40 plus expertled courses,
[04:52] (292.00s)
including our AI courses. But a couple
[04:54] (294.24s)
years ago, we made a shift
[04:55] (295.92s)
and started to encode all of this
[04:57] (297.28s)
knowledge into AI agents. Our first one,
[04:59] (299.52s)
Reforge Insights, which acts like your
[05:01] (301.36s)
AI product researcher. Our second one
[05:04] (304.00s)
called Compass is your project manager
[05:06] (306.40s)
that takes care of all of those
[05:07] (307.84s)
low-level, low-value tasks involved in
[05:10] (310.32s)
product management, automated for you.
[05:12] (312.16s)
We have two more coming later this year.
[05:14] (314.24s)
But back to this question, how do you
[05:15] (315.92s)
win in the most intense environment in the
[05:18] (318.80s)
history of technology? I spent a few
[05:21] (321.20s)
months with Ravi Mehta thinking about
[05:22] (322.72s)
this exact question. He created our AI
[05:25] (325.12s)
strategy course. He was the former chief
[05:26] (326.64s)
product officer at Tinder. He also was a
[05:29] (329.44s)
product leader at Facebook, Microsoft,
[05:30] (330.96s)
Tripadvisor, and a bunch more. And the
[05:33] (333.52s)
way that we start to answer this
[05:34] (334.88s)
question is actually we need to think
[05:36] (336.80s)
about the traps. And the two most common
[05:38] (338.56s)
traps are, of course: one,
[05:41] (341.20s)
reinventing the AI wheel. You do not
[05:43] (343.52s)
need to build custom models and
[05:46] (346.08s)
infrastructure in order to answer this
[05:48] (348.56s)
question. And on the opposite side is
[05:50] (350.56s)
the other trap, which
[05:52] (352.56s)
is just copying and
[05:54] (354.40s)
pasting basic AI features like chatbots
[05:57] (357.20s)
into your product. The answer actually
[05:59] (359.44s)
lies in the middle which is treating AI
[06:02] (362.24s)
like a series of Lego blocks where you
[06:05] (365.12s)
assemble differentiated AI features and
[06:08] (368.24s)
products by integrating the best
[06:10] (370.16s)
available AI capabilities with your
[06:12] (372.32s)
product's data and functionality. Your
[06:14] (374.96s)
competitive advantage will come from
[06:16] (376.64s)
what is uniquely yours. These three
[06:19] (379.60s)
things, your data, your functionality,
[06:22] (382.32s)
and your understanding of unmet customer
[06:24] (384.56s)
needs, not the AI itself. So, let's
[06:28] (388.00s)
think about the anatomy of a winning AI
[06:30] (390.24s)
product. What are the major building
[06:31] (391.92s)
blocks? What are the major Lego pieces?
[06:34] (394.24s)
And how do you stack them together,
[06:35] (395.92s)
connect them to create something
[06:37] (397.76s)
differentiated? Well, we can start to
[06:40] (400.16s)
talk about this, the AI capabilities,
[06:42] (402.32s)
because there's a ton of Lego pieces
[06:44] (404.48s)
that are emerging every year. Whether
[06:46] (406.40s)
it's the pre-trained AI models or the
[06:48] (408.64s)
abilities to perform tasks, audio
[06:50] (410.96s)
processing, image processing, all of
[06:52] (412.96s)
these new capabilities that feel magical
[06:55] (415.20s)
now that we couldn't do before. But the
[06:57] (417.68s)
thing about all of these Lego blocks is
[07:00] (420.08s)
you don't just have access to them.
[07:02] (422.08s)
Everybody else has access to them as
[07:03] (423.84s)
well. So even though AI products and
[07:06] (426.80s)
features of course use one of these
[07:09] (429.28s)
Legos as their core Lego blocks, this is
[07:11] (431.36s)
not where differentiation and
[07:13] (433.20s)
competitive advantage comes from. That
[07:15] (435.60s)
starts with one of these pieces, your
[07:17] (437.60s)
data. Because your data is what provides
[07:20] (440.40s)
context to an AI model to generate a
[07:23] (443.36s)
unique output. The more unique your data
[07:25] (445.84s)
is, the more unique output you can
[07:27] (447.84s)
generate for customers. And there's a
[07:30] (450.24s)
bunch of different types of data.
[07:32] (452.00s)
There's real-time data that the models
[07:33] (453.84s)
might not have incorporated into their
[07:35] (455.36s)
training set. There's user-specific
[07:37] (457.28s)
data. There's domain-specific data like
[07:39] (459.44s)
we've seen emerging in legal and in
[07:42] (462.00s)
healthcare. There's human judgment data
[07:44] (464.40s)
around curation as well as reinforcement
[07:46] (466.80s)
data. Now the question about data is how
[07:49] (469.20s)
do you actually combine multiple
[07:50] (470.96s)
categories of data together to form some
[07:53] (473.52s)
uniqueness. And it's not about the
[07:56] (476.24s)
quantity of your data. It's about the
[07:58] (478.24s)
marginal value of your data over
[08:00] (480.88s)
everybody else's, especially the big
[08:03] (483.92s)
models. So how much additional value
[08:06] (486.72s)
does your data add over what is already
[08:09] (489.60s)
trained in the models? The third piece
[08:12] (492.16s)
is your functionality because this
[08:13] (493.76s)
determines how the AI behaves and it
[08:16] (496.08s)
gives your AI product superpowers.
[08:18] (498.32s)
There's multiple types of Lego blocks
[08:19] (499.92s)
around your functionality. Specialized
[08:21] (501.84s)
workflows, unique algorithms, business
[08:23] (503.68s)
rules, integrations, whatever it is
[08:25] (505.52s)
that's baked into your product. Now, the
[08:27] (507.84s)
key about assembling all these pieces is
[08:29] (509.60s)
that they work like a system and you
[08:32] (512.00s)
have to connect the system in order to
[08:34] (514.80s)
build that competitive differentiation.
[08:37] (517.20s)
Let's start with this. Your data is what
[08:39] (519.84s)
provides and informs the AI's
[08:41] (521.76s)
understanding. It's what
[08:43] (523.76s)
helps the AI generate a unique output.
[08:46] (526.80s)
And that unique output as a result is
[08:49] (529.84s)
what helps you build an additional
[08:51] (531.92s)
repository of unique data so that this
[08:55] (535.04s)
continues to flow in a flywheel. On
[08:58] (538.40s)
the other side of the spectrum is your
[09:00] (540.48s)
functionality. Your functionality in
[09:02] (542.56s)
your product is how your product
[09:04] (544.64s)
controls the AI's actions, how it
[09:06] (546.40s)
interacts with AI when it calls it to
[09:08] (548.40s)
create a delightful user experience. And
[09:10] (550.80s)
in addition, AI is increasingly
[09:14] (554.64s)
able to call tools in the functionality
[09:17] (557.92s)
of your product itself. And those two
[09:20] (560.40s)
things work together as a system as
[09:22] (562.00s)
well. So let's take all of this theory
[09:24] (564.80s)
and let's put it into practice. Let's
[09:26] (566.96s)
talk about a product: Granola. Just by a
[09:29] (569.20s)
raise of hands, how many people have
[09:30] (570.48s)
either tried or used Granola today?
[09:32] (572.80s)
Okay, pretty decent amount. That's
[09:34] (574.16s)
probably like 40% of the room. A year
[09:36] (576.32s)
ago, that would have been zero. And I
[09:39] (579.04s)
think this is an interesting case
[09:40] (580.32s)
because they entered a space that
[09:43] (583.28s)
already had a horde of other AI
[09:46] (586.72s)
note-takers, whether that was Fathom,
[09:48] (588.96s)
Otter, Fireflies, there was a ton of
[09:51] (591.20s)
them. But somehow they found a seam
[09:54] (594.56s)
and they've garnered 40% of your
[09:56] (596.96s)
attention in this room and about 50
[09:58] (598.72s)
million in funding. So, let's go back to
[10:00] (600.80s)
those three fundamental questions in
[10:02] (602.32s)
those Lego bricks. What was uniquely
[10:04] (604.40s)
theirs, their data, their functionality,
[10:05] (605.92s)
and their understanding of their unmet
[10:07] (607.60s)
customer need. I'm going to start with
[10:09] (609.20s)
the last one. So, at the time when they
[10:11] (611.44s)
entered the market space, this
[10:13] (613.92s)
is just a sample of people who are
[10:15] (615.60s)
already in market, including all of the
[10:18] (618.00s)
incumbents like Zoom and Meet that have
[10:19] (619.60s)
AI-native note-taking capabilities, but
[10:22] (622.72s)
they were all approaching it from the
[10:24] (624.64s)
perspective of the product is going to
[10:27] (627.04s)
do something for the user. It's going to
[10:29] (629.12s)
replace the full job. I want somebody
[10:31] (631.12s)
else to take my meeting notes. What they
[10:34] (634.00s)
realize is actually there's a whole
[10:36] (636.00s)
other set of customer needs that have
[10:38] (638.08s)
been unmet, which is I don't want you to
[10:40] (640.08s)
take all of my notes. I just want you to
[10:42] (642.16s)
help me take better notes. Empower me
[10:44] (644.88s)
around this specific task and use case. And
[10:46] (646.96s)
that's what they built the product
[10:48] (648.32s)
around. Now, in order to start, they
[10:51] (651.20s)
used off-the-shelf capabilities. No
[10:53] (653.36s)
unique models, no custom training,
[10:55] (655.28s)
nothing. They used Deepgram for
[10:56] (656.56s)
transcription. They used Anthropic and
[10:58] (658.40s)
OpenAI for some of their other
[10:59] (659.76s)
functionality, but the uniqueness came
[11:02] (662.32s)
in how they assembled the Lego blocks.
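To make that assembly concrete, here is a minimal, hypothetical sketch, not Granola's actual implementation, with every name illustrative: an off-the-shelf transcription result (the commodity Lego block) is combined with the product's own data (the user's rough notes) and functionality (calendar metadata) into the context an LLM sees.

```python
# Hypothetical "Lego block" assembly. The model calls are commodities;
# the differentiation lives in the context this code assembles.
from dataclasses import dataclass

@dataclass
class MeetingContext:
    attendees: list[str]   # product functionality: calendar metadata
    rough_notes: str       # unique data: what the user actually typed
    transcript: str        # commodity capability: off-the-shelf transcription

def build_enhancement_prompt(ctx: MeetingContext) -> str:
    """Assemble the unique context the LLM sees."""
    return (
        "Rewrite these rough meeting notes into clear, complete notes.\n"
        f"Attendees: {', '.join(ctx.attendees)}\n"
        f"Rough notes:\n{ctx.rough_notes}\n"
        f"Transcript:\n{ctx.transcript}\n"
    )

def enhance_notes(ctx: MeetingContext, llm_call) -> str:
    """llm_call stands in for any hosted model API (e.g. OpenAI or
    Anthropic); swapping vendors doesn't change the product."""
    return llm_call(build_enhancement_prompt(ctx))

# Toy usage: the prompt interleaves the user's notes with the transcript.
ctx = MeetingContext(
    attendees=["Ana", "Raj"],
    rough_notes="- pricing?? follow up w/ Raj",
    transcript="Raj: we should revisit pricing next quarter...",
)
prompt = build_enhancement_prompt(ctx)
```

The point of the sketch is that none of the pieces is proprietary; only the assembly, and the data it accumulates over time, is.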
[11:05] (665.12s)
Starting on the left-hand side
[11:06] (666.80s)
with granola's data, right? Their
[11:08] (668.88s)
context includes both the notes that you
[11:11] (671.36s)
take as well as the transcription that
[11:13] (673.60s)
they generate. They use the AI Lego
[11:16] (676.16s)
block to generate a unique output, which
[11:18] (678.32s)
is enhanced, better notes. Those
[11:20] (680.64s)
notes over time form a repository that
[11:24] (684.16s)
starts to enable all sorts of other
[11:25] (685.76s)
features that they've layered on like
[11:27] (687.20s)
chatting across meetings, their
[11:29] (689.44s)
project workspaces, all their downstream
[11:31] (691.76s)
actions. So they have this nice flywheel
[11:34] (694.00s)
of unique context and data that's
[11:36] (696.64s)
starting to spin that was partially
[11:38] (698.64s)
enabled by the right-hand side of the
[11:40] (700.16s)
Lego blocks, their functionality. They
[11:42] (702.32s)
used a Mac app so that they could detect
[11:44] (704.08s)
when meetings started and access the
[11:46] (706.40s)
system sound for transcription, being
[11:48] (708.80s)
right there at the
[11:51] (711.20s)
moment the user needed it, to
[11:52] (712.80s)
enable the AI to do those things. And
[11:55] (715.60s)
they've also plugged into other tools
[11:57] (717.52s)
and integrations like the calendar to
[11:59] (719.28s)
get metadata about the meetings such as
[12:01] (721.76s)
attendees. So they assembled these Lego
[12:04] (724.32s)
blocks in that unique way to
[12:06] (726.88s)
meet that unique customer need. Now, the
[12:09] (729.60s)
question is, is Granola going to
[12:11] (731.36s)
survive? I've got no idea, right? It's
[12:14] (734.48s)
an incredibly competitive landscape
[12:16] (736.64s)
because the realization is that
[12:18] (738.48s)
you can't stop here. You can't stop by
[12:20] (740.40s)
just assembling your initial set of Lego
[12:22] (742.80s)
bricks. You have to sequence over and
[12:25] (745.04s)
over again. You have to take those first
[12:27] (747.04s)
three Lego bricks, leverage them
[12:29] (749.60s)
into another unique set that you
[12:31] (751.44s)
assemble. And you see Granola doing
[12:33] (753.60s)
this. Now that they've enabled this,
[12:35] (755.52s)
they've started to create project and
[12:37] (757.44s)
team workspaces and started to enable a
[12:40] (760.72s)
new set of unique use cases off of the
[12:43] (763.20s)
initial layer. They've
[12:45] (765.20s)
started to integrate downstream actions
[12:47] (767.12s)
like connecting to your CRM and
[12:49] (769.44s)
HubSpot. I just saw them the
[12:51] (771.52s)
other day experimenting with a company
[12:54] (774.08s)
wiki that auto-updates itself. So they
[12:56] (776.72s)
continue to sequence these things into a
[12:58] (778.64s)
unique set of building blocks. The
[13:00] (780.48s)
question is, will they keep up? I
[13:02] (782.56s)
don't know. Jamin Ball, a partner at
[13:04] (784.64s)
Altimeter Capital, recently wrote a
[13:06] (786.24s)
newsletter and he said, "The real moat
[13:08] (788.00s)
is just a sequence of smaller moats
[13:09] (789.76s)
stacked together. Each one buys time.
[13:12] (792.24s)
What you do with that time, how fast
[13:14] (794.64s)
you execute, how quickly you evolve
[13:16] (796.64s)
determines whether you stay ahead. If
[13:18] (798.48s)
the moat used to be 6 to 12 months,
[13:20] (800.80s)
today it's 2 to 3 weeks."
[13:24] (804.16s)
So to recap, to win in AI, besides being
[13:27] (807.60s)
stressed out, right, is to answer: what
[13:30] (810.64s)
are your unmet customer problems?
[13:32] (812.32s)
That's always been a part of product,
[13:33] (813.84s)
right? The second is what AI
[13:35] (815.52s)
capabilities can solve those problems in
[13:37] (817.36s)
novel ways. What proprietary data can
[13:39] (819.84s)
power those solutions? And then what
[13:41] (821.76s)
superpowers can our product give to AI?
[13:44] (824.56s)
How do you assemble those three
[13:46] (826.64s)
foundational Lego blocks? All right.
[13:49] (829.76s)
Thank you. If you're an AI
[13:52] (832.00s)
engineer, we are hiring. Our team will
[13:54] (834.24s)
be outside. You get to play with products
[13:55] (835.92s)
with instant distribution to 300,000
[13:57] (837.84s)
people. And if you need help with
[13:59] (839.68s)
anything else, just check out
[14:00] (840.56s)
reforge.com.
[14:02] (842.08s)
Good luck.
[14:03] (843.79s)
[Music]