[00:00] (0.16s)
Think about the area under the curve of
[00:02] (2.24s)
utility that you could contribute to
[00:03] (3.92s)
society and everything else is
[00:06] (6.48s)
simulacrum. It is not real. When you
[00:09] (9.12s)
think about SBF, when you think about
[00:10] (10.96s)
Theranos, when you think about the
[00:12] (12.32s)
things that truly disgrace us as people
[00:16] (16.16s)
who create technology, when you peel
[00:18] (18.56s)
back a little bit, you realize there's
[00:20] (20.72s)
nothing. It was a lie. I don't
[00:23] (23.04s)
want that for us. people outside of this
[00:24] (24.96s)
room, the world at large looks at tech
[00:28] (28.08s)
and they hate us sometimes because those
[00:30] (30.80s)
are the people who represent us. And I
[00:32] (32.96s)
say, "Not for me. They don't represent
[00:36] (36.24s)
[Applause]
[00:43] (43.12s)
Welcome to another episode of the Light
[00:44] (44.80s)
Cone. This time we're doing it live.
[00:47] (47.04s)
We're not used to doing it in front of a
[00:48] (48.96s)
studio audience. So, we thought we would
[00:51] (51.20s)
uh start off with a controversial topic.
[00:53] (53.60s)
This is something that uh a bunch of
[00:55] (55.44s)
people who are at this conference uh
[00:58] (58.40s)
have been I don't know just talking
[01:00] (60.24s)
about coming to us to talk about. Uh is
[01:03] (63.04s)
this the last window to get rich? Are
[01:05] (65.76s)
you worried about this? Are you guys
[01:07] (67.12s)
worried about this? Is this the end of
[01:09] (69.20s)
capitalism? What's what's happening? Be
[01:11] (71.20s)
like, no, money is going to stop existing with
[01:13] (73.68s)
AGI. They won't admit to
[01:16] (76.24s)
it but in private conversations this is
[01:18] (78.56s)
one of the topics that certainly we've
[01:20] (80.24s)
been debating. Yeah. Well, you know, why
[01:22] (82.80s)
is this coming up? Actually, seems like
[01:25] (85.20s)
at least when we speak to people who are
[01:26] (86.88s)
applying to YC who are kind of like
[01:29] (89.28s)
members of the audience, there's a real
[01:31] (91.52s)
sense of uncertainty created by AI right
[01:33] (93.92s)
now, right? Like the thing is like the
[01:35] (95.52s)
sense of will the jobs that we thought
[01:37] (97.92s)
would be there be available and if we're
[01:40] (100.32s)
not um if they're not like kind of what
[01:43] (103.28s)
do we do? And if we're not sort of if we
[01:46] (106.56s)
don't have real ownership in something
[01:48] (108.32s)
that's like valuable and growing like
[01:50] (110.16s)
what will we be left with? That seems to
[01:52] (112.08s)
be the thing that comes up a lot. I had
[01:53] (113.92s)
dinner with some undergrads who are here
[01:55] (115.92s)
last night and they were saying that
[01:57] (117.52s)
this is one of the things that people
[01:59] (119.04s)
are talking about a lot on college
[02:00] (120.56s)
campuses is like, "Hey, the AI's gotten
[02:02] (122.80s)
really good at programming now." Um,
[02:06] (126.08s)
what's going to happen to all the
[02:07] (127.12s)
programming jobs? Like it used to be the
[02:09] (129.12s)
case that if you were a CS major, there
[02:11] (131.60s)
is a very clear path to like a very
[02:13] (133.44s)
stable like upper middle class
[02:16] (136.08s)
background where you get like a good
[02:17] (137.52s)
stable job as a as a programmer. Um but
[02:20] (140.16s)
like are those jobs still going to be
[02:22] (142.08s)
here in 10 years? Like Yeah. Yeah. Like
[02:24] (144.56s)
my my parents were really proud when I
[02:27] (147.52s)
uh you know graduated and, you know, got my
[02:29] (149.60s)
degree and then I got my job at
[02:32] (152.00s)
Microsoft and I was a level 59 PM uh you
[02:35] (155.84s)
know lowest of the low but I had health
[02:37] (157.60s)
insurance and my parents were really
[02:39] (159.20s)
really proud of me. And you know, one of
[02:41] (161.68s)
the fears, frankly, like that we're
[02:43] (163.76s)
hearing uh and it's sort of, you know,
[02:46] (166.32s)
coming out in the numbers is that will
[02:48] (168.64s)
there actually be jobs? You know, I think it's
[02:51] (171.84s)
a tricky thing right now with the advent
[02:53] (173.84s)
of intelligence. You know, some of the
[02:55] (175.44s)
simplest things that people rely on
[02:57] (177.68s)
entry level people right out of college
[02:59] (179.44s)
for, uh they're not hiring as many of
[03:01] (181.92s)
them anymore. And you know the craziest
[03:04] (184.56s)
stat I think this came out of uh uh the
[03:06] (186.88s)
New York Fed in February of this year.
[03:10] (190.08s)
Um computer science majors uh you know
[03:14] (194.24s)
obviously this is not the people in this
[03:16] (196.00s)
room. This is just like out of like you
[03:18] (198.56s)
know a normal distribution of all
[03:20] (200.24s)
computer science majors: 6.1%
[03:23] (203.60s)
unemployment in February of this year.
[03:26] (206.32s)
Art history in contrast was only 3.0%.
[03:29] (209.44s)
Wait, you're saying that the
[03:31] (211.04s)
unemployment rate of art history majors
[03:32] (212.72s)
is lower than the unemployment rate of
[03:34] (214.64s)
CS majors? Unbelievably, but that's what
[03:36] (216.88s)
the stat indicates.
[03:44] (224.08s)
We're talking about, you know, the
[03:46] (226.08s)
median, which you know, you guys are so
[03:48] (228.08s)
so many standard deviations above. Don't
[03:50] (230.00s)
worry. That's concerning, right? Yeah.
[03:53] (233.04s)
Yeah. But like but like this this role
[03:55] (235.12s)
of like like level 59, you know,
[03:57] (237.52s)
engineer at Microsoft used to be this
[03:59] (239.92s)
like super stable job. If you do that
[04:01] (241.92s)
job, all the adults in your life will be
[04:03] (243.60s)
like good job. Like you make the you
[04:05] (245.36s)
made the safe choice, the prudent
[04:06] (246.72s)
choice. But like one of the things I've
[04:09] (249.04s)
been I've been noodling a lot. It's like
[04:11] (251.12s)
is that actually the safe choice? Like
[04:12] (252.88s)
is it possible that the world has become
[04:15] (255.60s)
inverted and like the career path that
[04:17] (257.92s)
seemed to be like the lowest risk, most
[04:19] (259.84s)
safe path might not be anymore?
[04:23] (263.04s)
Yeah, I think like one thing that's
[04:24] (264.48s)
going to be interesting with this
[04:25] (265.36s)
audience is that there's one theory
[04:26] (266.96s)
there's um Bryan Caplan has this theory
[04:29] (269.52s)
on education. I think it's Bryan Caplan
[04:30] (270.88s)
at least. It's like where um it's
[04:32] (272.64s)
basically all about credentialing, but
[04:34] (274.72s)
it's actually a very specific thing
[04:36] (276.64s)
that's being credentialed. It's like
[04:38] (278.56s)
what colleges are credentialing to
[04:40] (280.56s)
employers is that um these people
[04:44] (284.24s)
graduate our program which means that
[04:46] (286.16s)
they can like show up to a place on time
[04:49] (289.12s)
and like perform a series of
[04:50] (290.72s)
instructions and you know not do too
[04:52] (292.72s)
many drugs and like kind of like make it
[04:55] (295.36s)
through which is like the kind of people
[04:56] (296.96s)
you want to hire like they'll turn up
[04:58] (298.56s)
they'll do the job and if if you're a
[05:00] (300.24s)
Microsoft Yeah. Yeah. like FAANG, like I
[05:02] (302.88s)
think like any company at scale starts
[05:04] (304.64s)
like that's actually what they are
[05:05] (305.92s)
hiring is like you went to a good
[05:07] (307.20s)
college which means that you can like do
[05:09] (309.04s)
things reliably and follow instructions
[05:10] (310.72s)
well it's pretty clear now in the AI
[05:12] (312.88s)
world that like the AI is very good at
[05:15] (315.44s)
following instructions and it's probably
[05:17] (317.36s)
going to be hard for humans to compete
[05:18] (318.72s)
with the AI on just like following
[05:20] (320.56s)
instructions reliably in which case
[05:23] (323.12s)
people here need to think about what are
[05:26] (326.08s)
they going to get out of their college
[05:27] (327.52s)
experience that goes beyond just kind of
[05:30] (330.96s)
showing up, passing the test, following
[05:32] (332.72s)
the instructions really, really well.
[05:34] (334.16s)
Like it's going to require how do you
[05:36] (336.56s)
know to do things yourself and how do
[05:38] (338.96s)
you have like agency and independence?
[05:41] (341.04s)
Um cuz that's actually the stuff that's
[05:42] (342.88s)
going to matter in like I think a post
[05:44] (344.56s)
AI world. And I think the thing that
[05:46] (346.40s)
happened is uh Dar and I went on this
[05:48] (348.16s)
college tour as well and what's
[05:50] (350.72s)
happening is that a lot of the CS
[05:53] (353.36s)
curriculum is actually quite outdated.
[05:56] (356.24s)
Like how many of you in the audience if
[05:58] (358.24s)
you're still in in in college do your
[06:00] (360.88s)
courses even allow use of Cursor?
[06:05] (365.20s)
Yeah. How how many like forbid you to
[06:07] (367.20s)
use Cursor and like vibe coding tools in
[06:09] (369.44s)
your CS classes? Oh yeah. Way more
[06:10] (370.80s)
hands. Yeah. Yeah. And this is the
[06:14] (374.24s)
future and those are the kinds of skills
[06:15] (375.92s)
that are now they're quite literally
[06:19] (379.04s)
prohibiting the students from learning
[06:21] (381.20s)
the tools that they are going to need in
[06:24] (384.00s)
the future. It's crazy. It's like Google
[06:26] (386.24s)
when the internet first came out um a
[06:27] (387.92s)
lot of teachers would say you're not
[06:29] (389.76s)
allowed to use Google totally which is
[06:32] (392.40s)
unfathomable today and I think to Harj's
[06:34] (394.96s)
point a lot of the most cracked students
[06:37] (397.60s)
as we were meeting them and all these we
[06:40] (400.24s)
had we had some some events that were
[06:42] (402.48s)
hosted they had this um the sense of
[06:46] (406.72s)
talking on the side and working on a lot
[06:48] (408.32s)
of side projects, to your point, Harj, of
[06:50] (410.16s)
having a lot of agency you learn a lot
[06:52] (412.72s)
more in the process of building a lot of
[06:54] (414.64s)
projects on the side rather than at
[06:56] (416.96s)
school. How many of you had learned way
[06:58] (418.40s)
more on independent projects than at
[07:00] (420.00s)
school? Yeah. All right. We picked the
[07:02] (422.56s)
right people. Sweet.
[07:05] (425.36s)
What do you guys think is the answer to
[07:06] (426.48s)
Gary's question? Is this the last window
[07:08] (428.00s)
to get rich? Being intellectually
[07:09] (429.68s)
honest. One of our sort of colleagues,
[07:11] (431.52s)
Paul Buchheit, pointed this out where
[07:13] (433.12s)
it's there's probably a flaw in the
[07:14] (434.48s)
logic potentially. like if this is
[07:15] (435.76s)
actually the last window to make money
[07:18] (438.16s)
and get rich, then you're basically
[07:19] (439.92s)
implying that, you know, like the we're
[07:21] (441.68s)
going to get some definition of AGI or
[07:23] (443.52s)
ASI or whatever you want to call it. Um
[07:25] (445.52s)
that's like a necessary condition for
[07:26] (446.88s)
that to be true. Like in which case
[07:28] (448.96s)
we're probably going to have like
[07:31] (451.52s)
bigger like there's going to be a lot
[07:32] (452.96s)
more going on than just figuring out
[07:34] (454.40s)
like how to make this like human money.
[07:36] (456.00s)
I think that's a concept that Paul talks
[07:37] (457.60s)
a lot about is that in a world where the
[07:39] (459.44s)
machines can do everything that's better
[07:41] (461.28s)
than humans, like what value will there
[07:43] (463.44s)
even be in human money in which case why
[07:46] (466.24s)
does it matter that you're going to race
[07:47] (467.52s)
to accumulate like the human money now
[07:50] (470.40s)
the game itself might change the you
[07:52] (472.88s)
know you sort of you grow up you go to
[07:55] (475.28s)
college you graduate you go work you get
[07:57] (477.68s)
a job you buy a house you have a
[07:59] (479.36s)
mortgage all this stuff and then um you
[08:02] (482.96s)
know one of the weirder things that uh I
[08:05] (485.52s)
see people critique San Francisco about
[08:07] (487.60s)
is somehow this belief that San
[08:09] (489.84s)
Francisco itself is about like
[08:11] (491.36s)
credential maxing which um I don't I
[08:14] (494.88s)
mean, the part of San Francisco I
[08:17] (497.44s)
want to spend time with is like not
[08:19] (499.04s)
really about that but I can see the
[08:21] (501.12s)
critique in general I don't think people
[08:23] (503.68s)
do their best work out of fear like you
[08:26] (506.08s)
do it out of like more positive
[08:27] (507.44s)
motivations cuz you're excited about
[08:29] (509.28s)
stuff and so I don't think my advice to
[08:31] (511.84s)
anyone here would be you should drop out
[08:34] (514.80s)
of college and work on an AI startup
[08:37] (517.12s)
because it's going to be your last
[08:38] (518.32s)
chance to make money before I don't know
[08:40] (520.88s)
like the event horizon hits us. Um I do
[08:44] (524.24s)
think something that's interesting to
[08:45] (525.68s)
note is just like the how quickly like
[08:49] (529.28s)
AI startups can grow is definitely
[08:51] (531.28s)
something we've talked about but if you
[08:52] (532.64s)
think about um something I've been
[08:55] (535.20s)
thinking about recently like I all of us
[08:57] (537.52s)
actually when we were in college like um
[08:59] (539.36s)
we would you'd always have speakers come
[09:01] (541.68s)
back like startup founders who were like
[09:03] (543.60s)
a year or two out of college come back
[09:05] (545.20s)
and speak and you hear from them and I
[09:07] (547.04s)
kind of remember that the milestone to
[09:09] (549.28s)
hit when you were like a year or two out
[09:10] (550.88s)
of college having worked on your startup
[09:12] (552.72s)
was like raising a series A round of
[09:14] (554.64s)
funding and we'd have like the Dropbox
[09:16] (556.32s)
founder come back and say like yeah I
[09:18] (558.08s)
raised like a series A and it's like
[09:19] (559.52s)
really really cool and then it sort of
[09:21] (561.76s)
became okay well actually like maybe a
[09:24] (564.00s)
couple years out of college you could
[09:25] (565.20s)
raise like a series B or a series C. If
[09:27] (567.92s)
you fast forward today, it's like you've
[09:29] (569.44s)
got the Cursor founder a couple of years
[09:31] (571.12s)
out of college coming back with a 10
[09:32] (572.64s)
billion dollar company. Like it's like
[09:34] (574.32s)
the pace, the order of magnitude. Yeah.
[09:37] (577.20s)
It's like the orders of magnitude
[09:38] (578.56s)
difference, right? So I actually think a
[09:40] (580.08s)
far more exciting reason to think about
[09:43] (583.28s)
um you know should I start a company or
[09:45] (585.44s)
should I join like one of these fast
[09:46] (586.72s)
growing companies is like the time like
[09:49] (589.44s)
how much you can get done a year or two
[09:51] (591.76s)
like out of college is orders of
[09:54] (594.00s)
magnitudes higher than it was even a few
[09:55] (595.68s)
years ago. And I think that's like for a
[09:58] (598.08s)
certain type of person a very like
[10:00] (600.00s)
exciting motivating factor. I like that
[10:03] (603.20s)
a lot hard that I think that's why Sam
[10:05] (605.12s)
said at the beginning of the of the
[10:06] (606.72s)
event yesterday, this is the best
[10:07] (607.84s)
time in history to start a
[10:09] (609.28s)
company. Yeah. Well, the interesting
[10:11] (611.52s)
thing about credential maxing and/or
[10:13] (613.60s)
what's happening now is that raising a
[10:15] (615.76s)
series A is a credential that kind of
[10:17] (617.76s)
gets bestowed by a fancy VC uh you know
[10:22] (622.40s)
driving a Ferrari down Sand Hill Road or
[10:24] (624.80s)
something, right? like that's something
[10:26] (626.32s)
that's external to outcomes and often
[10:28] (628.88s)
it's you know really like the shooting
[10:31] (631.20s)
of a starting gun as opposed to
[10:33] (633.84s)
like something to celebrate in and of
[10:35] (635.36s)
itself. The really big difference today
[10:37] (637.52s)
is that the very best companies that we
[10:39] (639.68s)
get to see day-to-day, they're like, I
[10:42] (642.00s)
don't know, five people, 10 people. Uh I
[10:44] (644.64s)
think all you know, each of us on stage
[10:46] (646.24s)
and all of the YC partners are sort of
[10:47] (647.92s)
collecting uh incredible startups that
[10:50] (650.96s)
we get to work with that went from zero
[10:52] (652.96s)
to 10 million, 12 million a year
[10:55] (655.20s)
revenue. Like that's net revenue like it
[10:57] (657.92s)
just goes in the bank. So you basically
[11:00] (660.24s)
get the equivalent of an entire series A
[11:03] (663.68s)
and instead of this fake credential
[11:06] (666.32s)
thing where some fancy person on Twitter
[11:09] (669.20s)
with lots of followers, you know,
[11:11] (671.68s)
blesses you and suddenly like all these
[11:13] (673.76s)
people, you know, TechCrunch says like
[11:15] (675.92s)
the new hottest founder and you know
[11:17] (677.68s)
what, like those are all external things
[11:19] (679.92s)
that are not actually connected to real
[11:21] (681.92s)
business or having an impact on anyone.
[11:24] (684.72s)
It's fake, right? It's the fake
[11:26] (686.24s)
credential. And then the cool thing now
[11:28] (688.64s)
is that that is actually very directly
[11:31] (691.20s)
being replaced by people making things
[11:34] (694.24s)
that you know people not only really
[11:36] (696.64s)
need but they're willing to pay a lot of
[11:38] (698.64s)
money for. This is like a very good
[11:40] (700.72s)
point sort of instead of uh taking the
[11:43] (703.76s)
leap out of fear that this is the only
[11:46] (706.72s)
time, taking it from a place of
[11:49] (709.12s)
approaching and really going after
[11:51] (711.76s)
something where this is really an
[11:53] (713.68s)
exciting time to be a builder. We've
[11:56] (716.56s)
seen crazy growth unlike anything only
[12:00] (720.24s)
possible right now with AI, like all
[12:02] (722.64s)
these companies that we work with zero
[12:04] (724.16s)
to 12 million in 12 months. The Cursor
[12:07] (727.12s)
story where they went zero to one in one
[12:08] (728.80s)
year the next one to 100. This is like
[12:12] (732.16s)
unprecedented in tech history for B2B
[12:14] (734.80s)
SaaS companies. It didn't use to be the
[12:16] (736.16s)
case that B2B SaaS companies were the
[12:17] (737.68s)
ones that had hypergrowth. Like there
[12:19] (739.44s)
were some like consumer social companies
[12:21] (741.04s)
that got hypergrowth but like B2B SaaS
[12:23] (743.28s)
used to be this like you know plodding,
[12:26] (746.16s)
slow growing like kind of thing. Now
[12:29] (749.76s)
there's this weird inversion. It's the
[12:31] (751.20s)
B2B SaaS companies that are, like, the
[12:32] (752.96s)
hypergrowth ones. I think what we're
[12:35] (755.12s)
seeing is, a lot of times, uh, founders
[12:37] (757.92s)
who are living in the future at the
[12:40] (760.08s)
cutting edge who are winning here
[12:42] (762.24s)
because you have to sort of build the
[12:43] (763.60s)
taste to build something good and you
[12:45] (765.52s)
don't get taught some of those things in
[12:47] (767.12s)
school. I know that like on that front
[12:48] (768.88s)
like something very specifically we're
[12:50] (770.40s)
seeing is that to build any products you
[12:53] (773.12s)
you always need some combination of like
[12:54] (774.88s)
domain expertise which is really just
[12:56] (776.72s)
like understanding your customer really
[12:58] (778.24s)
well and understanding the space you're
[12:59] (779.60s)
building in and understanding the market
[13:01] (781.68s)
really well and then technical expertise
[13:03] (783.44s)
to actually build the product and it
[13:05] (785.12s)
feels like pre-AI things shifted where um
[13:09] (789.60s)
sort of the technical expertise wasn't
[13:11] (791.68s)
that important because it was most of
[13:13] (793.68s)
the software was like web software and
[13:15] (795.20s)
it became fairly straightforward to
[13:16] (796.80s)
build web software and actually all the
[13:18] (798.88s)
value was in how much like domain
[13:20] (800.32s)
expertise do you have? Like do you have
[13:22] (802.16s)
relationships with the customers you're
[13:23] (803.52s)
going after? Um do you have some edge on
[13:25] (805.68s)
how to sell to them because everyone
[13:27] (807.28s)
you're selling to is already got like 10
[13:29] (809.52s)
roughly equivalent products being sold
[13:31] (811.44s)
to them. Uh, and that actually made it
[13:34] (814.00s)
quite hard, I think, for college
[13:35] (815.36s)
students to be able to like go and
[13:37] (817.92s)
compete. Like, you can't compete
[13:40] (820.08s)
with Salesforce for like a
[13:41] (821.76s)
CRM or go build like the best
[13:44] (824.56s)
appointment booking software for
[13:46] (826.56s)
healthcare practices. Like all of these
[13:48] (828.40s)
things were just very saturated. And now
[13:50] (830.72s)
I think what we're seeing is with AI,
[13:52] (832.16s)
there's this promise of hey, like this
[13:53] (833.44s)
is more than software. Like this can do
[13:54] (834.88s)
like the work of people. It's like
[13:57] (837.20s)
magic. But like it's actually quite hard
[13:59] (839.44s)
to do that reliably. And so there's been
[14:01] (841.44s)
this flip of where the technical
[14:03] (843.60s)
expertise is now actually really like
[14:05] (845.44s)
the missing piece for a lot of these
[14:07] (847.44s)
things. Um, and we consistently see at
[14:09] (849.84s)
least in YC that college students are
[14:12] (852.32s)
actually at the forefront of this stuff.
[14:13] (853.76s)
Like actually understanding how to use
[14:15] (855.60s)
the models and how to squeeze the
[14:17] (857.68s)
performance consistently out of the
[14:19] (859.12s)
models is something that even like you
[14:21] (861.36s)
know PhDs and people are really
[14:23] (863.04s)
experienced don't get. I think maybe
[14:24] (864.72s)
that's why Elon had that sort of look
[14:26] (866.16s)
yesterday when he was talking about
[14:27] (867.28s)
researchers versus engineers. It's like
[14:30] (870.00s)
it's actually in the engineering and
[14:31] (871.76s)
it's like working on the projects and
[14:33] (873.52s)
like building real things is where you
[14:36] (876.24s)
get the expertise. Yeah. I had a lot of
[14:38] (878.56s)
college students ask me over the last
[14:40] (880.40s)
two days like hey I don't have domain
[14:43] (883.04s)
expertise in any particular area cuz
[14:44] (884.80s)
like I haven't worked in industry that
[14:46] (886.48s)
much like what idea should I work on and
[14:49] (889.20s)
like how do I basically like how do how
[14:51] (891.92s)
do I get enough domain expertise to like
[14:53] (893.44s)
do something interesting. All right.
[14:54] (894.96s)
Well, what advice would you have for
[14:56] (896.32s)
folks in that position, Harj, based
[14:58] (898.40s)
on that insight? I think Gary's got like
[15:00] (900.32s)
a great point on this. Um, it's
[15:02] (902.72s)
basically like become like a forward
[15:04] (904.24s)
deployed engineer, right? Yeah. Just I
[15:06] (906.64s)
mean go undercover, I guess, like go go
[15:10] (910.24s)
and figure out what people actually need
[15:12] (912.32s)
and um yeah, there are just too many
[15:15] (915.20s)
examples of billion-dollar uh startups
[15:17] (917.36s)
that we got to see. I mean, I always
[15:18] (918.96s)
think about Flexport. you know, here's
[15:20] (920.64s)
this guy who literally became one of the
[15:23] (923.36s)
top importers of medical hot tubs. Like,
[15:26] (926.08s)
I don't think anyone wakes up, you know,
[15:28] (928.32s)
and graduates and decides like, hey, I
[15:30] (930.56s)
really need to become one of the
[15:31] (931.76s)
foremost, you know, import exporters of
[15:34] (934.56s)
uh of medical hot tubs. But, you know,
[15:36] (936.32s)
he did it. They also, um,
[15:38] (938.56s)
I think, were one of the first, the
[15:40] (940.16s)
biggest e-bike importer. But then you
[15:42] (942.48s)
know basically being in weird parts of
[15:44] (944.80s)
the economy um caused them to understand
[15:48] (948.24s)
just things that that uh the the other
[15:50] (950.64s)
person, you know, the sort of
[15:52] (952.88s)
10,000 other people who want to start
[15:54] (954.72s)
startups like they didn't have that
[15:56] (956.32s)
knowledge and so sort of your ability
[15:58] (958.80s)
your you know if you're here like your
[16:00] (960.48s)
inherent ability already is like one
[16:02] (962.96s)
part of the Venn diagram and then the
[16:05] (965.60s)
other part is just something weird. It's
[16:08] (968.72s)
literally just like where does your
[16:10] (970.56s)
interest come from? I'm like I'm really
[16:12] (972.16s)
taken by to what degree both OpenAI
[16:14] (974.64s)
and SpaceX for instance were uh you know
[16:18] (978.24s)
the genesis came from like interest and
[16:21] (981.68s)
a hunch and just like not really any
[16:25] (985.12s)
commercial intent and yet you know
[16:27] (987.44s)
coming out the other side uh that was
[16:29] (989.92s)
enough to attract the smartest people in
[16:31] (991.92s)
the world attract capital and then
[16:34] (994.56s)
really create you know the most enduring
[16:37] (997.36s)
businesses in the world. Yeah. And the
[16:39] (999.44s)
other thing that I've seen that's pretty
[16:40] (1000.56s)
cool is just I've just seen a lot of
[16:42] (1002.48s)
college students go from having like no
[16:44] (1004.48s)
domain expertise in an area to being
[16:47] (1007.20s)
like total experts in like a month or
[16:49] (1009.52s)
two at YC. And I think people maybe
[16:52] (1012.24s)
don't give themselves enough credit for
[16:54] (1014.24s)
how quickly you can become an expert in
[16:56] (1016.24s)
something if you're just smart and you
[16:57] (1017.84s)
learn fast and you just make like a
[16:59] (1019.28s)
concerted effort. I think the door is
[17:01] (1021.04s)
more open now than ever. Like you kind
[17:02] (1022.72s)
of go back to Yeah. in a world where
[17:05] (1025.52s)
like um any domain let's make like you
[17:07] (1027.92s)
know if you're trying to build software
[17:09] (1029.60s)
for dentists as a random example, pre-AI
[17:12] (1032.32s)
it was just like people were being
[17:13] (1033.92s)
pitched with so many different software
[17:15] (1035.28s)
products that they weren't actually that
[17:17] (1037.12s)
receptive to like some college students
[17:19] (1039.20s)
promising some software and wanted to
[17:20] (1040.80s)
come like learn and like work in the
[17:22] (1042.48s)
office and understand like how it works
[17:24] (1044.32s)
like, I've got like 20 software vendors all
[17:26] (1046.96s)
like um telling me the same thing but
[17:29] (1049.52s)
now because like AI has captured
[17:32] (1052.00s)
like the imagination of everybody
[17:33] (1053.44s)
everyone like wants to know what's
[17:35] (1055.04s)
possible and are consistently
[17:37] (1057.20s)
underwhelmed by what like the
[17:39] (1059.28s)
established software companies can offer
[17:41] (1061.36s)
them, but they're open to like college
[17:43] (1063.28s)
students just coming in and like well
[17:45] (1065.12s)
because the college students are selling
[17:46] (1066.32s)
them pure magic. So I had three founders
[17:48] (1068.56s)
in the last batch actually that are
[17:49] (1069.84s)
building quite literally like AI agents
[17:51] (1071.36s)
for dentists. None of them like I think
[17:53] (1073.52s)
their only experience with dentists is
[17:54] (1074.88s)
they went to a dentist
[17:58] (1078.56s)
and but it was exactly what you said
[18:00] (1080.64s)
Harj like they're literally selling
[18:02] (1082.64s)
these dentists like magic in a bottle
[18:04] (1084.88s)
and so like of course the dentists will
[18:06] (1086.56s)
spend their time because if it works
[18:08] (1088.24s)
it's like just incredible for their
[18:09] (1089.76s)
business which kind of just comes back
[18:11] (1091.04s)
to the agency thing cuz it's like the
[18:12] (1092.96s)
thing like in order to build these
[18:14] (1094.96s)
products in order to go out and like
[18:16] (1096.72s)
build like the future big companies you
[18:18] (1098.56s)
kind of just have to have the agency to
[18:20] (1100.08s)
be like ah yeah like I'm actually going
[18:21] (1101.36s)
to go do the like undercover agent or um
[18:25] (1105.28s)
forward deployed engineer and I'm just
[18:26] (1106.88s)
going to go like camp out in like
[18:29] (1109.12s)
someone's office and just see how they
[18:30] (1110.64s)
do their jobs and learn how to do it and
[18:32] (1112.40s)
learn how to like build it with AI. What
[18:34] (1114.56s)
about some um pitfalls like things that
[18:36] (1116.56s)
would prevent people from exercising
[18:38] (1118.88s)
their agency or exposing themselves to
[18:42] (1122.32s)
you know the real economy? I think one
[18:45] (1125.60s)
of the things that keeps coming back to
[18:47] (1127.04s)
my mind is having a lot of these
[18:49] (1129.12s)
conversations with uh recent grads or
[18:51] (1131.92s)
college students. I think there's this
[18:54] (1134.00s)
arc of um a lot of you trying to figure
[18:56] (1136.48s)
out what to do with your life and
[18:58] (1138.96s)
through most of your life you've been
[19:00] (1140.48s)
conditioned to kind of just pass tests,
[19:03] (1143.76s)
study for the exam, do the homework and
[19:06] (1146.64s)
it's sort of like all these uh very
[19:08] (1148.72s)
constrained boxes that you have to check
[19:11] (1151.44s)
and then you treat startups or your next
[19:13] (1153.68s)
jobs sort of like another test or exam
[19:16] (1156.80s)
that a lot of the rules are
[19:18] (1158.24s)
predetermined and you just have to go
[19:20] (1160.08s)
check the boxes. But that's the complete
[19:22] (1162.88s)
wrong mental model for it. Because the
[19:25] (1165.76s)
problem is that when you go after
[19:28] (1168.96s)
building and tackling a big problem, it
[19:32] (1172.24s)
is a wide open space. There are no rules.
[19:34] (1174.88s)
You get to create it. I mean the good
[19:36] (1176.48s)
thing about startups is plus and
[19:38] (1178.56s)
minuses. You have agency to decide what
[19:42] (1182.40s)
you're going to go after. Set your goals
[19:44] (1184.48s)
instead of like some authority figure telling
[19:46] (1186.56s)
you, oh, you need to do this, this, and
[19:47] (1187.92s)
that. And we get asked questions like,
[19:49] (1189.60s)
"Oh, what should I look like in order to
[19:52] (1192.00s)
raise money?" That is such a student
[19:54] (1194.00s)
question. Sort of like there's some sort
[19:55] (1195.84s)
of bar like by some higher power. Guess
[19:58] (1198.64s)
what? There are no adults in the room. It's
[20:00] (1200.88s)
you. You're in control and you get to
[20:03] (1203.44s)
design those rules and you can go as
[20:05] (1205.36s)
fast as possible. You don't have to have
[20:07] (1207.44s)
like, oh, we have to do this, this, and
[20:09] (1209.44s)
this and check the marks and get there.
[20:11] (1211.84s)
Really, you design it. You're in
[20:14] (1214.40s)
control. I think there are two very
[20:16] (1216.48s)
dangerous uh forms of like credentialism
[20:19] (1219.84s)
that you create for yourself that we see
[20:22] (1222.40s)
that actually like we'd really like to
[20:24] (1224.00s)
warn you guys about. Uh one is I mean I
[20:26] (1226.48s)
think we already talked about like
[20:27] (1227.92s)
making raising money from investors like
[20:30] (1230.56s)
somehow the biggest goal, I mean
[20:32] (1232.32s)
including us by the way. It's like, you
[20:34] (1234.80s)
know, that we're just like people to
[20:37] (1237.04s)
help you and we think we can help you a
[20:38] (1238.80s)
lot, but like once you turn that into
[20:41] (1241.60s)
like sort of uh the idol that you have
[20:44] (1244.72s)
to achieve, then that's just missing the
[20:47] (1247.36s)
whole point. And I, you know, I think
[20:49] (1249.04s)
that that's quite dangerous. Um, the
[20:51] (1251.60s)
other thing that we we're kind of
[20:52] (1252.96s)
concerned about is there are like
[20:54] (1254.24s)
entrepreneurship programs at some of
[20:56] (1256.00s)
your campuses. Uh, some of them might
[20:58] (1258.00s)
take you to wild exotic places for
[21:00] (1260.48s)
retreats. We're not going to name them,
[21:02] (1262.72s)
but like in full transparency, I'm very
[21:05] (1265.20s)
worried about them because what we're
[21:07] (1267.20s)
coming we're coming to understand is
[21:09] (1269.28s)
they are teaching you to lie. And that
[21:12] (1272.16s)
is at a moment when literally all of
[21:15] (1275.68s)
software is changing and that software
[21:17] (1277.84s)
is the most empowering thing in the
[21:19] (1279.52s)
world. Why do you have to lie? I
[21:21] (1281.68s)
understand in a world of like
[21:24] (1284.16s)
contracting capability, in a world where
[21:26] (1286.80s)
there's less money, where there's, you
[21:28] (1288.80s)
know, fewer and fewer jobs, I kind of
[21:30] (1290.48s)
get it. It's very zero sum. We're at the
[21:32] (1292.64s)
most open like sort of abundance-oriented
[21:36] (1296.40s)
like mindset thing that is happening
[21:38] (1298.16s)
right now. Like literally everyone here
[21:40] (1300.32s)
is hyper-empowered. You don't have
[21:42] (1302.64s)
to play by those old rules anymore. You
[21:45] (1305.04s)
don't have to lie to investors. You
[21:46] (1306.72s)
don't have to like fake it till you make
[21:48] (1308.32s)
it. Like you know I worry that some of
[21:50] (1310.40s)
these programs are just literally trying
[21:51] (1311.92s)
to teach people to become more uh you
[21:55] (1315.44s)
know SBFs and Theranoses and that's
[21:57] (1317.84s)
like you know that's a waste of time.
[21:59] (1319.76s)
like and you're gonna go to jail.
[22:02] (1322.48s)
That's fine. Um also a lot of these
[22:05] (1325.60s)
entrepreneurship programs they do what
[22:06] (1326.88s)
Diana said which is like
[22:08] (1328.16s)
entrepreneurship programs especially
[22:09] (1329.36s)
ones that are not started not run by
[22:11] (1331.36s)
founders. You know, all of us were
[22:12] (1332.96s)
startup founders. They
[22:15] (1335.04s)
basically teach entrepreneurship like it
[22:17] (1337.84s)
was a course like like it was just a
[22:20] (1340.16s)
series of tests to pass a series of
[22:21] (1341.92s)
checkboxes to check. Um, anytime you
[22:25] (1345.12s)
try to bottle up entrepreneurship and
[22:26] (1346.64s)
like teach it as a college course,
[22:28] (1348.08s)
that's kind of what you end up with is
[22:29] (1349.60s)
like basically like a a sort of like
[22:32] (1352.40s)
cheap facsimile of entrepreneurship where
[22:34] (1354.56s)
like they teach you to like, you know,
[22:36] (1356.80s)
follow a particular method or a
[22:38] (1358.48s)
particular practice and that's just not
[22:39] (1359.76s)
what startups are actually like. I just
[22:41] (1361.36s)
think about that Jay-Z line is like
[22:42] (1362.88s)
everybody want to tell you how to do it,
[22:44] (1364.32s)
they never did it. True. A riff on that
[22:47] (1367.36s)
I'm curious to get people's opinions on.
[22:48] (1368.72s)
Uh, maybe especially Gary actually. Um,
[22:51] (1371.52s)
something that is clearly different I
[22:53] (1373.28s)
feel about the age we live in today
[22:55] (1375.20s)
versus say 10 years ago is just social
[22:57] (1377.28s)
media and using social media as a way to
[23:00] (1380.48s)
um, like amplify your message and your
[23:03] (1383.28s)
brand. This is actually something that came
[23:04] (1384.40s)
up at dinner last night: how much in
[23:06] (1386.96s)
the early stages when you're building a
[23:08] (1388.56s)
product like should you focus in on kind
[23:10] (1390.72s)
of like building the product and going
[23:12] (1392.64s)
one by one to get users all of the kind
[23:14] (1394.88s)
of like traditional startup advice
[23:17] (1397.12s)
versus trying to cultivate sort of like
[23:20] (1400.32s)
a following or a brand um or attention
[23:23] (1403.76s)
like online and like you know 10 years
[23:25] (1405.92s)
ago spend thousands of dollars on a
[23:27] (1407.52s)
video. Yeah. Yeah. just like higher
[23:29] (1409.28s)
production launch videos and like lots
[23:31] (1411.28s)
of following like lots of followers on
[23:32] (1412.72s)
Twitter or X and um I certainly I think
[23:36] (1416.56s)
it's more confusing now because that
[23:38] (1418.24s)
wasn't even an option before and you
[23:40] (1420.24s)
definitely see people succeeding at the
[23:43] (1423.20s)
getting the online like attention and
[23:45] (1425.28s)
people talking about like the company
[23:47] (1427.52s)
what do they call it? Aura farming, is
[23:51] (1431.36s)
it? We got lost. Maybe that's the phrase.
[23:53] (1433.20s)
Yeah, it is, like, aura farming, I guess.
[23:55] (1435.20s)
Yeah, I'm curious what you think, Gary. All
[23:57] (1437.20s)
I care about is what's real and what you
[23:59] (1439.68s)
can, you know, touch and see and feel
[24:01] (1441.68s)
and, you know, think about the area
[24:03] (1443.28s)
under the curve of utility that you
[24:05] (1445.36s)
could contribute to society. And you can
[24:07] (1447.52s)
always just look at that as ground truth
[24:09] (1449.52s)
and everything else is simulacra. It is
[24:12] (1452.96s)
not real. It is like media. It is fake.
[24:15] (1455.92s)
It is a credential. It is a thing that
[24:17] (1457.84s)
represents something. And yet like if
[24:20] (1460.16s)
you look deeper into it, it's nothing.
[24:22] (1462.56s)
Like there's nothing. When you think
[24:24] (1464.00s)
about SBF, when you think about
[24:25] (1465.60s)
Theranos, when you think about the
[24:27] (1467.04s)
things that truly disgrace us as people
[24:30] (1470.88s)
who create technology, when you when you
[24:33] (1473.84s)
peel back a little bit, you realize
[24:36] (1476.00s)
there's nothing. This was just
[24:37] (1477.44s)
simulacrum. It was a lie. I don't
[24:40] (1480.08s)
want that for us. Like, you know, people
[24:42] (1482.16s)
outside of this room, the world at large
[24:44] (1484.88s)
looks at tech and they hate us sometimes
[24:48] (1488.00s)
because those are the people who
[24:49] (1489.60s)
represent us. And I say, "Not for me.
[24:52] (1492.00s)
They don't represent us.
[24:59] (1499.04s)
So, I think that's a no on social media.
[25:01] (1501.84s)
I mean, I think social media is really
[25:03] (1503.28s)
great. I mean, I'm clearly extremely
[25:05] (1505.04s)
addicted to it and it's done some really
[25:06] (1506.96s)
great things for me, some terrible
[25:08] (1508.32s)
things, too. But, um, I, you know, I
[25:11] (1511.12s)
think that, you know, you do have to
[25:12] (1512.96s)
tell your story. Like, I think one of
[25:15] (1515.04s)
the more important things that is the
[25:16] (1516.56s)
gift is that you can tell your own
[25:18] (1518.88s)
story. Like, in fact, you have to.
[25:21] (1521.36s)
like the second you rely on someone else
[25:23] (1523.44s)
to tell your story it's going to be
[25:25] (1525.28s)
great great great and then when you
[25:26] (1526.80s)
don't have that voice like someone's
[25:28] (1528.64s)
going to take that you know there's the
[25:30] (1530.40s)
world wants to you know the only thing
[25:32] (1532.56s)
it loves more than like you know a uh a
[25:35] (1535.52s)
story of like of becoming is one of
[25:38] (1538.00s)
unbecoming and uh if you don't have that
[25:40] (1540.48s)
voice and you can't go direct um you
[25:43] (1543.12s)
know they're going to do that to you and
[25:44] (1544.80s)
so better to start right like you know I
[25:47] (1547.60s)
I think working backwards, the thing that was
[25:49] (1549.68s)
most helpful for me, you know, obviously
[25:51] (1551.68s)
for my startup, but we try to encourage
[25:53] (1553.44s)
all of our startups to do this is you
[25:55] (1555.60s)
can sort of work backwards from the
[25:57] (1557.60s)
outcome that you want. Like uh I
[26:00] (1560.56s)
actually think Apple does this really
[26:02] (1562.32s)
really well. Like they don't commit to
[26:04] (1564.24s)
doing a feature until they have, you
[26:06] (1566.40s)
know, a product manager who says this is
[26:08] (1568.56s)
who it's for and this is the the problem
[26:10] (1570.64s)
it's going to solve, right? And you can
[26:12] (1572.48s)
actually turn that into going direct.
[26:14] (1574.64s)
So, you know, let's say you have a one
[26:16] (1576.48s)
week or two-week sprint. It's all too easy
[26:18] (1578.64s)
to just say, look, like here's my bug
[26:21] (1581.20s)
list and here's my backlog and I'm going
[26:23] (1583.12s)
to fix these 10 bugs. But a much more
[26:25] (1585.44s)
powerful version of this is let's work
[26:27] (1587.68s)
backwards from what I want to put on X.
[26:30] (1590.00s)
I'm going to make a very simple Loom
[26:32] (1592.08s)
video showing off a feat of strength, a
[26:34] (1594.72s)
thing that I really want to share that I
[26:36] (1596.80s)
know our team can do. And then working
[26:39] (1599.28s)
backwards from that, the next two weeks,
[26:41] (1601.12s)
that's all we're going to do. Like you
[26:42] (1602.64s)
could storyboard it. It's like it's
[26:44] (1604.16s)
going to do this, right? you can work
[26:46] (1606.24s)
like basically at that point media and
[26:48] (1608.80s)
PM and design can be all the same thing
[26:51] (1611.92s)
you know it's connected to users it's
[26:53] (1613.84s)
connected to communication it's
[26:55] (1615.52s)
connected to what your product will do
[26:57] (1617.68s)
for people and then you build it and you
[26:59] (1619.76s)
can just basically rinse and repeat like
[27:01] (1621.36s)
that if you can do two week sprints of
[27:03] (1623.52s)
working backwards from a really powerful
[27:06] (1626.08s)
not flashy uh Loom video of what you did
[27:09] (1629.36s)
in the last two weeks you can do this
[27:11] (1631.04s)
all for each other and we can create a
[27:12] (1632.88s)
culture that is not about flash but about
[27:15] (1635.60s)
substance.
[27:20] (1640.32s)
We got a question over here. I'm a big
[27:23] (1643.36s)
fan of Gary and Jared. So Jared actually
[27:26] (1646.08s)
inspired me for my for my startup, but I
[27:31] (1651.28s)
just wanted some advice because I'm kind
[27:33] (1653.12s)
of going through a dilemma right now. So
[27:35] (1655.60s)
I've been working on a startup for the
[27:36] (1656.96s)
past month and I'm I'm going to my third
[27:38] (1658.80s)
year of uni um for context. But um so
[27:42] (1662.96s)
yesterday at one of the afterparties
[27:44] (1664.88s)
which I'm not going to disclose the name
[27:46] (1666.24s)
cuz you guys are probably going to apply
[27:47] (1667.60s)
to it, but um I pretty much
[27:50] (1670.32s)
pitched my idea to one of the founders
[27:52] (1672.32s)
and he basically said like you know drop
[27:54] (1674.96s)
out of school and come work for me, uh,
[27:56] (1676.80s)
move to San Francisco. So, um I'm really
[28:00] (1680.64s)
stuck between like what do I want to do?
[28:02] (1682.64s)
Like what's the right choice? Like do I
[28:04] (1684.56s)
continue university and then go to uh
[28:07] (1687.20s)
San Francisco and then you know grind
[28:08] (1688.64s)
the startup life or do I drop out now
[28:11] (1691.60s)
cuz I'm already halfway done. Like you
[28:13] (1693.20s)
know I'm not like almost finished with
[28:14] (1694.64s)
college or anything. I'm already halfway
[28:15] (1695.68s)
there. So do I drop out and just work
[28:18] (1698.24s)
and then just move on from there? I mean
[28:20] (1700.64s)
I think the most important thing is do
[28:22] (1702.40s)
you trust them and is it actually a good
[28:24] (1704.72s)
startup? So, which is uh kind of a hard
[28:27] (1707.84s)
question to answer like like this, but
[28:30] (1710.80s)
like what you know, if it's one of ours,
[28:32] (1712.56s)
it must be a good startup, right? Is it
[28:34] (1714.24s)
a YC startup? Yeah. Oh, you should
[28:36] (1716.48s)
probably do it. No, I mean more
[28:38] (1718.88s)
seriously, I mean I don't know. How
[28:40] (1720.48s)
would you like when you think about like
[28:43] (1723.60s)
where someone should go? Like what would
[28:45] (1725.44s)
you say? Good answer. I'd add a
[28:48] (1728.16s)
third criterion, Gary. Uh, I dropped out
[28:50] (1730.48s)
of college to do Y Combinator and do a
[28:52] (1732.32s)
startup. I think in addition to the two
[28:54] (1734.64s)
ones that Gary said, the third one for
[28:56] (1736.24s)
you to consider is like do you really
[28:58] (1738.72s)
like being in college, which is like
[29:00] (1740.56s)
obvious, but like actually Yeah. Okay.
[29:03] (1743.20s)
Like like I I think I think like like
[29:05] (1745.68s)
Harj was saying earlier when college
[29:07] (1747.92s)
students are thinking about dropping out
[29:09] (1749.44s)
and they're making it out of fear, they're
[29:11] (1751.36s)
making a fear-based
[29:13] (1753.28s)
decision. They have FOMO about startups
[29:15] (1755.28s)
happening in San Francisco. Maybe their
[29:17] (1757.36s)
friend dropped out and they're like
[29:18] (1758.96s)
worried that they're going to miss out.
[29:20] (1760.24s)
I don't know that those are the best
[29:21] (1761.84s)
decisions. Like some of those people
[29:23] (1763.36s)
actually do end up regretting it versus
[29:25] (1765.44s)
like when I dropped out of college. I
[29:27] (1767.36s)
dropped out of college because I was bored
[29:28] (1768.64s)
of college. Like I'd done three years of
[29:30] (1770.16s)
college. I felt like I had gotten out of
[29:31] (1771.84s)
it what I wanted from the experience.
[29:33] (1773.52s)
And I was just a lot more excited to
[29:35] (1775.20s)
like build real technology for real
[29:37] (1777.04s)
people. And I felt that regardless of
[29:39] (1779.60s)
whether the startup that I was working
[29:41] (1781.28s)
on succeeded or not, I wouldn't regret
[29:43] (1783.44s)
leaving college because I was just kind
[29:45] (1785.20s)
of ready to move on to the next stage of
[29:46] (1786.88s)
my life. So I think if you can if you if
[29:48] (1788.88s)
you can honestly feel that way then
[29:52] (1792.00s)
maybe it does make sense for you to drop
[29:53] (1793.28s)
out. I think if you feel you're done
[29:55] (1795.68s)
exploring living alternative life paths
[29:59] (1799.12s)
what I mean by that like you tried an
[30:01] (1801.52s)
internship working at a big tech company
[30:03] (1803.68s)
you tried an internship at a startup.
[30:05] (1805.36s)
You tried another internship to start a
[30:07] (1807.28s)
company or you tried another one doing
[30:08] (1808.56s)
research. If you feel you've filled in the
[30:11] (1811.04s)
chessboard and the landscape of what your
[30:14] (1814.08s)
life could be and you explore
[30:15] (1815.44s)
everything, I think it's fine. But if
[30:18] (1818.00s)
you still have a bit of a inkling, it's
[30:19] (1819.68s)
like, oh, maybe I want to try what doing
[30:22] (1822.24s)
re research is like, I think maybe not
[30:24] (1824.72s)
yet. But if you're super sure you want
[30:26] (1826.88s)
to have a career in tech or startups,
[30:30] (1830.16s)
then maybe it's fine. To Jared's point,
[30:32] (1832.16s)
he's like you already did that life
[30:35] (1835.68s)
alternate life path. I feel like I got
[30:38] (1838.00s)
lucky and then because I ended up at
[30:40] (1840.40s)
Palantir and these things that ended up
[30:42] (1842.00s)
being super successful but in the moment
[30:43] (1843.84s)
like you know I could have just got you
[30:45] (1845.76s)
know Palantir could have been a bad
[30:47] (1847.28s)
startup actually and uh I didn't even
[30:50] (1850.08s)
think about it. So like thinking back on
[30:51] (1851.92s)
what I should have been thinking of when
[30:53] (1853.44s)
I was you know 22 23 it's actually
[30:56] (1856.48s)
really important to try to be at the
[30:59] (1859.28s)
most dominant places actually. I mean
[31:01] (1861.84s)
the power law for startups is so intense
[31:04] (1864.08s)
that if you're going to go work at a
[31:05] (1865.68s)
startup I do actually think you should
[31:07] (1867.44s)
try to go work at at a really good
[31:09] (1869.04s)
startup really good startup totally like
[31:11] (1871.12s)
objectively you should make literally a
[31:12] (1872.88s)
spreadsheet you should you know go down
[31:15] (1875.28s)
and like evaluate it the way uh an
[31:18] (1878.32s)
investor would and then the difference
[31:20] (1880.00s)
is the investor has a portfolio and you
[31:21] (1881.92s)
just have one life and likewise when you
[31:24] (1884.48s)
start a startup you should not try to
[31:26] (1886.16s)
start a startup just to be the median
[31:27] (1887.92s)
startup the median startup is dead like
[31:30] (1890.16s)
you actually if If you're going to do
[31:31] (1891.52s)
it, you need to be like the, you know,
[31:33] (1893.28s)
you need to work at superlative places
[31:35] (1895.28s)
with superlative people. And then that's
[31:37] (1897.68s)
the only way that like good things
[31:39] (1899.60s)
happen. And you I I got lucky. I feel
[31:42] (1902.40s)
like I, you know, later in my life, I
[31:44] (1904.48s)
became much more of a heat-seeking
[31:45] (1905.92s)
missile. Like, you know, I think that
[31:48] (1908.00s)
that's why I was drawn to YC itself is
[31:50] (1910.40s)
like this is a place that has an energy
[31:52] (1912.40s)
that I've never experienced even at
[31:54] (1914.56s)
Stanford or even at Palantir that I
[31:57] (1917.04s)
just wanted to be here. And I became
[31:58] (1918.72s)
much more of a heat-seeking
[32:00] (1920.24s)
missile for like uh like that type of
[32:02] (1922.80s)
like this is going to be huge. But um I
[32:05] (1925.20s)
wish you know at 22 I lucked out you
[32:07] (1927.28s)
know like my friends I went to college
[32:09] (1929.52s)
with, they started a company with
[32:11] (1931.76s)
Peter Thiel. So that was just very lucky.
[32:14] (1934.72s)
That's Diana's thing is uh you know she
[32:16] (1936.88s)
likes to fund lucky startups. So just
[32:19] (1939.68s)
get lucky. And I think you get a lot
[32:21] (1941.12s)
more lucky by being in San Francisco.
[32:22] (1942.72s)
Yeah. And working around really smart
[32:24] (1944.40s)
people. Um, I want to ask Gary, you
[32:26] (1946.56s)
talked about like that level 59 job at
[32:28] (1948.56s)
Microsoft and like your parents are
[32:30] (1950.16s)
proud, you have health insurance. Um,
[32:31] (1951.84s)
and at some point you have to decide,
[32:33] (1953.20s)
okay, I'm done with this. I want to go
[32:34] (1954.64s)
start a company. And like you work at
[32:36] (1956.24s)
something, uh, maybe at night, you know,
[32:38] (1958.00s)
after work, you can't really like go out
[32:39] (1959.52s)
on LinkedIn and advertise it cuz like
[32:41] (1961.04s)
your boss would see it. But at what
[32:42] (1962.48s)
point do you say, I have enough here. I
[32:44] (1964.24s)
can really go quit my job, you know,
[32:46] (1966.00s)
start spending down my savings and go go
[32:48] (1968.40s)
do something. You're much more
[32:50] (1970.00s)
responsible than I am. like I had, you
[32:52] (1972.00s)
know, $50,000 in credit card debt and I
[32:54] (1974.72s)
had the nicest apartment in Queen Anne
[32:56] (1976.72s)
and I bought a brand new Honda and it
[32:59] (1979.52s)
was very stupid and so I had to go get a
[33:01] (1981.84s)
job and like you know I I couldn't start
[33:04] (1984.16s)
a startup. I, you know, waited.
[33:06] (1986.08s)
I needed my friends to pull me out of
[33:07] (1987.84s)
that situation. So I mean I think you
[33:10] (1990.88s)
want I don't know at least six maybe
[33:12] (1992.96s)
nine months minimum of like just you
[33:15] (1995.68s)
know being able to live on ramen in the
[33:17] (1997.44s)
cheapest possible way. Uh and then at
[33:19] (1999.68s)
that point like the money in your bank
[33:21] (2001.20s)
is like just capital that you think of.
[33:23] (2003.28s)
And so that's probably what I would
[33:25] (2005.20s)
want. And then the other thing is uh I
[33:27] (2007.68s)
would try to bring on I would want to
[33:29] (2009.28s)
work with the smartest possible people.
[33:31] (2011.44s)
Like I know this is a big internet
[33:33] (2013.44s)
debate. It's like do you need a
[33:34] (2014.88s)
co-founder? Honestly like if it's your
[33:36] (2016.72s)
first startup
[33:38] (2018.80s)
I wouldn't. Now, my
[33:41] (2021.28s)
second or third startup? Sure, I could do
[33:43] (2023.28s)
it like alone. I you know have
[33:44] (2024.88s)
connections. I know who to hire like all
[33:46] (2026.88s)
this stuff. If it were my first, I would
[33:48] (2028.88s)
not try to do it alone because there's
[33:50] (2030.64s)
just too much going on. There's like the
[33:52] (2032.80s)
the gradient of things that you need to
[33:54] (2034.72s)
learn is too wide and you need to go you
[33:57] (2037.60s)
need to go together. Yeah. And in my
[33:59] (2039.68s)
experience for people who are your age
[34:00] (2040.96s)
who have like already graduated college
[34:02] (2042.64s)
or working at some company, that's the
[34:04] (2044.40s)
biggest limiting factor in practice for
[34:06] (2046.64s)
them actually doing a company is like
[34:08] (2048.72s)
they and their co-founder both need to
[34:10] (2050.96s)
be willing to quit their jobs and go in
[34:12] (2052.88s)
on a startup at the same time. It's it's
[34:14] (2054.88s)
just like a timing problem and like
[34:17] (2057.04s)
co-founders are hard to find and it's
[34:19] (2059.20s)
hard to make that timing line up. So my
[34:21] (2061.12s)
advice would be like if you h if you and
[34:22] (2062.88s)
your co- if that does line up for you
[34:24] (2064.48s)
and you and your co-founder are both in
[34:26] (2066.24s)
a point in your life where you're like
[34:28] (2068.08s)
able to quit your jobs and go all in,
[34:30] (2070.16s)
you should probably just do it because
[34:31] (2071.76s)
it literally might not ever happen
[34:33] (2073.12s)
again. It's actually that hard to do.
[34:34] (2074.80s)
Hey guys. Well, thank first of all thank
[34:36] (2076.64s)
you for for the event. My voice is a bit
[34:38] (2078.56s)
cracked because of talking too much over
[34:41] (2081.04s)
the over the past few days. Um, we're
[34:43] (2083.36s)
actually talking to the CEO of Strava uh
[34:45] (2085.36s)
yesterday at one of the afterparties and
[34:47] (2087.20s)
he mentioned uh that they started being
[34:49] (2089.36s)
a very niche startup. Um, and I've taken
[34:51] (2091.76s)
a look at all the Y Combinator startups
[34:53] (2093.52s)
over the years and it looks like uh
[34:54] (2094.80s)
they're getting increasingly niche. Uh,
[34:56] (2096.40s)
so I was just wondering like what do you
[34:58] (2098.00s)
think like what's your take on on on
[35:00] (2100.00s)
being super niche at the beginning and
[35:02] (2102.24s)
then expanding and how do you know like
[35:04] (2104.96s)
how niche you have to be um in the
[35:07] (2107.04s)
beginning? Being niche at the start has
[35:08] (2108.96s)
actually always been like the recipe to
[35:11] (2111.92s)
succeed. Like even in the YC world kind
[35:14] (2114.56s)
like the current biggest company by
[35:16] (2116.96s)
market cap is Airbnb. Um and Airbnb was
[35:20] (2120.16s)
like the definition of niche when it
[35:21] (2121.76s)
started. Like it was literally airbeds
[35:24] (2124.56s)
in people's living rooms, right? During
[35:26] (2126.48s)
conferences. Yeah. During conferences.
[35:28] (2128.32s)
So like I'm not sure it can get more
[35:30] (2130.32s)
niche than that. The Democratic convention.
[35:32] (2132.24s)
Yeah. The Democratic convention. Yeah. like
[35:35] (2135.12s)
um and obviously it turns out that that
[35:37] (2137.04s)
expanded into just being this like
[35:38] (2138.80s)
monster company that's taking over all
[35:40] (2140.32s)
of travel. Um but there's even less
[35:42] (2142.24s)
obvious examples of this where people
[35:43] (2143.92s)
don't realize that things that seem huge
[35:46] (2146.32s)
now were niche at the beginning. Stripe
[35:48] (2148.72s)
actually in a sense obviously it's
[35:50] (2150.80s)
payments of course as a big market but
[35:52] (2152.80s)
actually when they first started it was
[35:54] (2154.80s)
like um an API for developers and the
[35:58] (2158.48s)
only thing that differentiated it from
[35:59] (2159.84s)
Braintree was that you could take
[36:01] (2161.60s)
payments instantly. Um, and people
[36:03] (2163.68s)
actually didn't think that that was much
[36:04] (2164.88s)
of a wedge. It was like, oh, okay, sure,
[36:06] (2166.40s)
if I'm working on a weekend project,
[36:07] (2167.84s)
I'll care a lot about that. But like big
[36:10] (2170.16s)
businesses aren't going to care about
[36:11] (2171.28s)
that. They're fine to wait two weeks to
[36:12] (2172.96s)
get like their merchant account. And so,
[36:15] (2175.52s)
it's always been the case that niches
[36:17] (2177.44s)
have been the right way to start
[36:19] (2179.04s)
actually and to dominate a niche and
[36:21] (2181.44s)
find ways to like expand into like
[36:23] (2183.84s)
adjacent markets and grow into a big
[36:26] (2186.16s)
company, I think, is like the recipe.
[36:28] (2188.32s)
And Brian Chesky often quotes some of
[36:31] (2191.52s)
the best advice he got from PG during
[36:33] (2193.84s)
the batch was to really hone in to find
[36:37] (2197.36s)
10 people that love your product much
[36:39] (2199.92s)
better than I don't know 100 randos. And
[36:43] (2203.52s)
a lot of companies start like that. You
[36:46] (2206.08s)
want to find those maximalist users that
[36:49] (2209.36s)
really obsess with you and you iterate
[36:52] (2212.32s)
on those. I mean Coinbase was also very
[36:54] (2214.80s)
niche. Yeah, Coinbase was classic niche
[36:56] (2216.64s)
because um crypto itself was small and
[36:59] (2219.68s)
fringe and even within that they were
[37:01] (2221.68s)
building for what people thought was a
[37:03] (2223.04s)
non-existent market essentially. It was
[37:04] (2224.88s)
just like regular people who wanted a
[37:07] (2227.20s)
nice user interface to like buy and hold
[37:09] (2229.76s)
Bitcoin, and it was aimed at regular people.
[37:12] (2232.88s)
Exactly. It was like based Yeah. The
[37:14] (2234.48s)
conventional wisdom was it's just people
[37:15] (2235.68s)
who want to use it to like launder money
[37:17] (2237.20s)
and buy drugs and it's like okay no
[37:18] (2238.80s)
there's like other um use cases for it.
[37:21] (2241.28s)
Um, but with AI more than ever, by the
[37:24] (2244.08s)
way, like I think that niche is the way
[37:26] (2246.08s)
to go because no one really knows how
[37:28] (2248.00s)
big the markets are. And it does seem
[37:30] (2250.56s)
like things that seem like they were
[37:32] (2252.08s)
niche before AI, you can get people to
[37:34] (2254.24s)
pay you a lot lot more money for because
[37:36] (2256.00s)
they're not just buying software,
[37:37] (2257.20s)
they're buying like work from you. And
[37:39] (2259.60s)
so find the niche you're really
[37:41] (2261.28s)
interested in, optimize for your like
[37:43] (2263.92s)
passion and interest in it. And just
[37:45] (2265.36s)
pull on that thread. This is like
[37:47] (2267.12s)
actually a really powerful moment
[37:48] (2268.48s)
because literally you have 130, you
[37:50] (2270.96s)
know, I think of uh o3 as basically
[37:53] (2273.28s)
about 130 IQ. Maybe o3 Pro can be even
[37:56] (2276.32s)
smarter than that. Um when I really
[37:58] (2278.48s)
think about that, it's like, oh yeah,
[38:00] (2280.00s)
like a lot of the people who I've ever
[38:01] (2281.44s)
hired in my lifetime are like, yeah, o3
[38:04] (2284.72s)
is smarter than that person now. So, and
[38:06] (2286.88s)
then you can basically take that and um
[38:10] (2290.64s)
you know connect it to the proprietary
[38:13] (2293.20s)
data systems of almost any niche. And
[38:16] (2296.08s)
the more weird and unlikely for someone
[38:18] (2298.96s)
like someone in this room to know about
[38:20] (2300.56s)
it, the more likely that will be a
[38:22] (2302.32s)
durable enough moat that you can get a
[38:24] (2304.80s)
foot, you know, you can basically get
[38:26] (2306.32s)
you can wedge yourself in there and
[38:29] (2309.28s)
then basically all you need is a wedge
[38:31] (2311.68s)
and then you just basically expand that
[38:33] (2313.52s)
wedge until you have the pie. Thank you
[38:35] (2315.84s)
guys so much for coming out. Awesome
[38:37] (2317.92s)
stuff.
[38:40] (2320.41s)
[Applause]
[38:43] (2323.98s)
[Music]