[00:00] (0.08s)
I have a very funny story to tell you.
[00:02] (2.48s)
where have you been? I've been trying to
[00:03] (3.76s)
text you. You've been offline. What's
[00:05] (5.36s)
going on? Where have you been?
[00:06] (6.64s)
I've been working feverishly, but
[00:09] (9.12s)
yesterday I had to
[00:12] (12.40s)
go to prepare for some meetings that I
[00:14] (14.32s)
have on Sunday, which I can't tell you
[00:16] (16.00s)
about, but
[00:17] (17.52s)
Nat and I,
[00:18] (18.88s)
Nat and I went to Passalacqua,
[00:21] (21.68s)
which is on Lake Como, which is an... I
[00:24] (24.16s)
mean, it's stunning. The grounds are
[00:25] (25.84s)
stunning. The hotel is stunning. If you
[00:28] (28.00s)
have a chance to go to Lake Como.
[00:30] (30.56s)
Anyways, this is us at Passalacqua.
[00:32] (32.80s)
Who's the beautiful woman there? Is that
[00:34] (34.48s)
the woman who owns it or something? Is
[00:36] (36.32s)
that the queen?
[00:36] (36.88s)
That's not But the best part is we had
[00:39] (39.28s)
such a good time. You know how they have
[00:40] (40.96s)
like a registry book to leave a message?
[00:43] (43.84s)
So, I left a message.
[00:46] (46.72s)
Here we go. What a truly magnificent
[00:50] (50.32s)
place. Above and beyond any expectation
[00:53] (53.60s)
Go below. Go below. That's not for me.
[00:56] (56.16s)
Thank you. We took everything. The free
[01:00] (60.00s)
We took everything. The free bird. Oh my
[01:02] (62.64s)
god, Jason. The hangers.
[01:06] (66.32s)
The laundry bag, toothpaste, the robes,
[01:08] (68.96s)
the slippers, everything.
[01:10] (70.16s)
Absolutely fantastic.
[01:11] (71.44s)
Listen, you're going to have to send a
[01:12] (72.40s)
bill to the free birds.
[01:13] (73.84s)
Absolutely.
[01:17] (77.12s)
We'll let your winners ride.
[01:24] (84.24s)
And that's it. We open sourced it to the
[01:26] (86.08s)
fans and they've just gone crazy with
[01:32] (92.48s)
All right, listen. We've got a great
[01:33] (93.68s)
panel this week. It's the summer. Things
[01:35] (95.52s)
are slow. Some people are busy. I think
[01:38] (98.00s)
uh our prince of panic attacks, our dear
[01:41] (101.28s)
Sultan of science is uh he's at the beach.
[01:44] (104.88s)
Sacks is busy. Couldn't make it this week.
[01:47] (107.52s)
In his place, another brilliant PayPal
[01:50] (110.64s)
alumni and uh dare I say GOP supporter,
[01:55] (115.28s)
Keith Rabois. How are you, sir?
[01:56] (116.80s)
Pleasure to be with you again.
[01:58] (118.56s)
Nice to see you. And I'm assuming you're
[02:00] (120.64s)
in gorgeous Florida or somewhere in
[02:03] (123.44s)
I'm actually in New York.
[02:05] (125.04s)
Oh, my hometown. Is it safe? Is it okay?
[02:08] (128.32s)
Mamdani uh chasing you down the
[02:10] (130.40s)
street.
[02:11] (131.12s)
Not yet, but it's safe.
[02:13] (133.44s)
Seize your assets. It's safe. Yeah, it's
[02:16] (136.16s)
safe right now. We'll see you on
[02:17] (137.52s)
November 4th. You know, as you probably
[02:18] (138.96s)
heard, on July 4th was the first time in
[02:21] (141.12s)
recorded history that there were no
[02:22] (142.48s)
shootings or no murders in New York on
[02:24] (144.80s)
that day. So, right now, things are in
[02:26] (146.80s)
pretty good shape, but we may be maybe
[02:28] (148.80s)
leaving New York quickly.
[02:30] (150.72s)
Yeah, you're going to probably want to
[02:31] (151.84s)
sell that place if you got one there
[02:33] (153.28s)
because Mamdani is going to seize it and
[02:35] (155.44s)
turn it into a drugstore for you. Yes,
[02:37] (157.44s)
it's going to be
[02:39] (159.36s)
drug stores. Travis Kalanick is back
[02:41] (161.68s)
with us. How you doing, bestie?
[02:43] (163.28s)
Uh, pretty good. Pretty good. Yeah.
[02:45] (165.52s)
Second appearance here on the round
[02:47] (167.04s)
table and uh third time on the show. Of
[02:49] (169.60s)
course, you spoke at the summit. You've
[02:51] (171.52s)
been busy with CloudKitchens. Yeah.
[02:53] (173.04s)
Lots of exciting things going on.
[02:54] (174.32s)
Oh, lots of stuff. Lots of stuff. The
[02:56] (176.32s)
robots the robots are taking over. We're
[02:58] (178.16s)
we're rolling out. We're rolling out
[03:00] (180.16s)
robots. Yeah.
[03:01] (181.20s)
TK, can you tell us what you're doing
[03:03] (183.12s)
with this Pony AI thing or not? That's
[03:04] (184.88s)
speculation.
[03:05] (185.76s)
Uh look, you know, obviously is autonomy
[03:10] (190.72s)
as we, you know, in the US we have of
[03:13] (193.92s)
course. Wait, do you want to just frame
[03:15] (195.12s)
for people that don't that may not be up
[03:16] (196.80s)
to speed what was announced or at least
[03:18] (198.40s)
Why don't you frame it? So,
[03:19] (199.92s)
Pony AI is um an autonomous company doing
[03:23] (203.76s)
self-driving. It's one of the few uh
[03:25] (205.68s)
players that actually have cars on the
[03:27] (207.68s)
road. They're based in China. They've
[03:29] (209.44s)
got a lot of operations in the Middle
[03:31] (211.60s)
East. They've got a deal with um a
[03:33] (213.84s)
delivery company called Uber, which you
[03:36] (216.00s)
might be familiar with.
[03:38] (218.64s)
Uh okay. So
[03:41] (221.92s)
well the deal was basically that you
[03:44] (224.48s)
would partner with Uber, license in the
[03:46] (226.96s)
Pony technology and essentially start a
[03:50] (230.88s)
competitor I guess to Waymo and Tesla.
[03:53] (233.76s)
Let me work on this one. Okay. So So in
[03:57] (237.12s)
the US we have Waymo. We see the Waymos
[04:00] (240.32s)
in San Francisco, Los Angeles, Boston,
[04:04] (244.40s)
coming soon to Miami, coming soon to
[04:06] (246.72s)
Atlanta, coming soon to DC. They're even
[04:09] (249.20s)
talking about New York.
[04:11] (251.92s)
Tesla's sort of like the, you know,
[04:14] (254.16s)
they're doing it the hard way, you know,
[04:16] (256.16s)
classic Elon style, like let's
[04:19] (259.28s)
let's do this sort of in a fundamental
[04:22] (262.24s)
holy [ __ ] let's go all the way kind of
[04:25] (265.28s)
kind of approach. Uh, and it's unclear
[04:27] (267.68s)
when it gets over the line. Of course,
[04:29] (269.12s)
he he launched sort of a a semi
[04:32] (272.56s)
semi-pilot of sorts in Austin recently,
[04:36] (276.32s)
but there's no other alternatives. So
[04:38] (278.16s)
what happens is is some of the folks who
[04:40] (280.40s)
are interested in making sure there are
[04:42] (282.32s)
alternatives
[04:44] (284.32s)
have reached out they they've reached
[04:46] (286.16s)
out to me and there are different
[04:48] (288.32s)
discussions that get going because
[04:49] (289.76s)
they're like Travis you did autonomy way
[04:52] (292.08s)
back in the day got the Uber autonomous
[04:55] (295.44s)
stuff going in 2014
[04:59] (299.76s)
maybe there's something to do here to
[05:02] (302.00s)
create optionality now maybe like I'm of
[05:04] (304.40s)
course very interested on the food side
[05:06] (306.32s)
I talk about autonomous burritos
[05:08] (308.24s)
being a big deal because if you can
[05:10] (310.40s)
automate the kitchen, the production of
[05:12] (312.96s)
food and then you can automate the sort
[05:15] (315.84s)
of logistics around food, you take a huge
[05:19] (319.28s)
amount of cost out of the food, out of
[05:21] (321.76s)
what's going on in food and that's of
[05:23] (323.28s)
course near and dear to my heart.
[05:25] (325.44s)
There's folks of course that want to see
[05:28] (328.00s)
autonomy and mobility. That's a real
[05:30] (330.24s)
thing. It it may be that or I would say
[05:34] (334.32s)
if you get the autonomy problem right,
[05:37] (337.68s)
you can use it to apply to both
[05:40] (340.32s)
problems. So there's a lot of folks
[05:42] (342.40s)
interested in
[05:44] (344.48s)
moving things, moving food, moving
[05:46] (346.72s)
people. And if there is some kind of
[05:50] (350.40s)
autonomous
[05:51] (351.92s)
technology that maybe I get involved in,
[05:55] (355.44s)
it might apply to a bunch of different
[05:57] (357.52s)
things. And so I've got some inbound.
[06:00] (360.16s)
Let's just put it that way. There's no
[06:01] (361.44s)
there's no real deal right now, but
[06:02] (362.96s)
there is definitely some inbound. And I
[06:04] (364.40s)
think there is some news about some of
[06:06] (366.48s)
that inbound that may or may not be
[06:07] (367.92s)
occurring. That's probably the best way
[06:09] (369.52s)
to put it. That is long-winded. I'll try
[06:11] (371.36s)
to tighten that up next time.
[06:13] (373.04s)
No, no, I think it's great to get the
[06:15] (375.36s)
overview here first on uh
[06:17] (377.84s)
allin. Thank you for sharing it with us.
[06:19] (379.36s)
And everybody knows you have
[06:21] (381.04s)
been doing a bowl builder, Lab37 I think
[06:24] (384.64s)
it's called. We throw it up on the
[06:25] (385.92s)
screen. Not sure what the status of it
[06:27] (387.60s)
is. And then I'll let you go, Chamath,
[06:28] (388.96s)
with your follow-up question. But
[06:30] (390.48s)
I I think there's a pretty interesting
[06:32] (392.32s)
concept here of the bowl getting built
[06:34] (394.72s)
and then put into a self-driving car.
[06:36] (396.56s)
Now, that machine looks huge, but it's
[06:38] (398.24s)
actually 60 square feet.
[06:39] (399.84s)
That picture makes it look monstrous.
[06:41] (401.92s)
It's a 60 foot machine like uh imagine
[06:45] (405.04s)
running like a Sweetgreen-like brand or
[06:47] (407.76s)
a Chipotle-like brand and just making it
[06:50] (410.24s)
so it comes to life for people who
[06:52] (412.48s)
who you know are like, "Hey, what is
[06:54] (414.40s)
this thing?" Imagine you just order
[06:56] (416.40s)
online exactly the kind of bowl you want
[06:58] (418.40s)
and actually this machine could run like
[07:00] (420.32s)
many brands at the same time and and
[07:02] (422.08s)
does you build the bowl you want
[07:04] (424.08s)
whatever ingredients uh it sort of it if
[07:07] (427.44s)
you look at that bottom you see those
[07:09] (429.44s)
little white bricks at the bottom that's
[07:11] (431.84s)
what carries the bowl underneath
[07:13] (433.36s)
dispensers. It fills up the bowl, it
[07:16] (436.16s)
sauces the bowl, then it puts a
[07:19] (439.12s)
lid on it, takes the bowl, puts it in a
[07:22] (442.16s)
bag uh puts utensils in the bag seals
[07:25] (445.20s)
the bag and the bag goes down a conveyor
[07:28] (448.32s)
belt where then another machine what we
[07:31] (451.76s)
would call an AGV takes the bowl to the
[07:34] (454.64s)
front of house. The bowl gets put into a
[07:36] (456.96s)
locker. The courier, be it a DoorDash or
[07:40] (460.00s)
Uber Eats courier, will wave their app in
[07:43] (463.44s)
front of a camera and it will open up
[07:45] (465.12s)
the locker that has the food that
[07:46] (466.24s)
they're supposed to pick up. So it just
[07:48] (468.56s)
it takes out a lot of what we would call
[07:51] (471.04s)
the cost of assembly, um, which
[07:54] (474.96s)
reduces mistakes, right? I mean a
[07:57] (477.28s)
mistake. Yeah,
[07:57] (477.84s)
we know exactly how many grams of every
[07:59] (479.60s)
ingredient are put in. That's exactly
[08:01] (481.36s)
what you're supposed to get
[08:04] (484.24s)
and so you get a higher quality product.
[08:07] (487.36s)
It takes a lot of the cost out. You
[08:08] (488.96s)
imagine ultimately that's going to be
[08:11] (491.12s)
they're going to be couriers with that
[08:12] (492.48s)
as well that you know I like to say
[08:14] (494.96s)
autonomous burritos like is a Waymo
[08:16] (496.88s)
gonna carry a burrito or is Tesla going
[08:19] (499.44s)
to have a a machine that carries food or
[08:22] (502.80s)
you know is there another another
[08:24] (504.64s)
company that ends up doing you know sort
[08:26] (506.32s)
of the things, the
[08:28] (508.88s)
autonomous delivery of things and the
[08:31] (511.52s)
point is is well where we are right now
[08:33] (513.60s)
is we've got customers and so those
[08:35] (515.76s)
customers are starting to deploy
[08:38] (518.40s)
this quarter and it's pretty
[08:40] (520.80s)
interesting. I mean you can the in our
[08:43] (523.68s)
delivery kitchens
[08:45] (525.60s)
the cost of labor is about 30% of
[08:48] (528.08s)
revenue. That's with the successful guys,
[08:50] (530.64s)
let's say 30%, 35% of revenue. In a
[08:54] (534.80s)
brick-and-mortar, in brick-and-mortar
[08:56] (536.48s)
restaurants, it's even higher. Okay.
[08:59] (539.84s)
When they're running our machine it's
[09:01] (541.76s)
between seven and 10% of revenue.
[09:05] (545.76s)
Amazing. Then you take out the cost of
[09:07] (547.20s)
the delivery, you know, and now it's
[09:09] (549.12s)
becoming everybody can have a private
[09:10] (550.64s)
chef, which was your original vision for
[09:12] (552.24s)
Uber was people don't know the original
[09:14] (554.08s)
tagline, but it was your your p
[09:15] (555.44s)
everybody has a private driver.
[09:16] (556.80s)
Everyone's private driver was the
[09:18] (558.16s)
original for Uber. Basically, the
[09:20] (560.56s)
infrastructure was already there. And I
[09:21] (561.84s)
said this on, you know, one of your
[09:23] (563.68s)
recent, I think it was at the All-In
[09:26] (566.24s)
Summit, Jason, but like um
[09:30] (570.00s)
in the mobility, cars, you know, I
[09:33] (573.84s)
transport uh space, the roads were
[09:37] (577.12s)
already there. The cars were already
[09:39] (579.12s)
built. People weren't using their cars
[09:41] (581.92s)
98% of the day. So the infrastructure is
[09:44] (584.08s)
already there to get people around
[09:46] (586.88s)
to do this as a service and do it very
[09:48] (588.88s)
efficiently and conveniently
[09:50] (590.72s)
with food. The infrastructure is not
[09:52] (592.56s)
there. Like yes, restaurants have excess
[09:54] (594.72s)
capacity. That's what Uber Eats
[09:56] (596.32s)
utilizes. But to go and say like let's
[10:00] (600.08s)
make 30% of all meals in a in a city uh
[10:03] (603.76s)
sort of prepared and delivered by a
[10:06] (606.96s)
service, the infrastructure is not
[10:08] (608.88s)
there. So you have to build it. So our
[10:10] (610.72s)
company that the mission is uh
[10:14] (614.00s)
infrastructure for better food. So
[10:15] (615.92s)
that's real estate, that's software and
[10:18] (618.24s)
robotics for the production and delivery
[10:21] (621.60s)
of food in this super efficient way.
[10:24] (624.72s)
All right. Uh Keith, what are your
[10:26] (626.48s)
thoughts? Any questions for
[10:28] (628.08s)
Well, he's not here, but isn't this what
[10:29] (629.92s)
David Friedberg tried to do a few years
[10:32] (632.08s)
Yeah. This came up on the last Allin.
[10:34] (634.00s)
Yeah. Or the last one I was at. Yeah.
[10:35] (635.76s)
Yeah. Pizza. Pizza. The problem was I
[10:38] (638.00s)
told Friedberg, "People don't want to eat
[10:40] (640.00s)
quinoa.
[10:40] (640.96s)
You got to put a a little steak in
[10:42] (642.88s)
there, maybe a piece of salmon." But he
[10:45] (645.04s)
was kind of rel I think eventually he
[10:46] (646.56s)
relented and let people have a little
[10:48] (648.32s)
bit of protein. Uh but yeah, it's such a
[10:51] (651.04s)
great vision. And
[10:51] (651.76s)
wait, he he died as a vegan martyr.
[10:55] (655.12s)
I think the business died as a martyr.
[10:57] (657.76s)
Well, that was the hill he was led to.
[11:00] (660.08s)
There's a lot of people have died on
[11:01] (661.36s)
that hill. But the bottom line is if
[11:03] (663.36s)
you're going to get into automation, you
[11:05] (665.60s)
have to, it has to be end-to-end
[11:07] (667.44s)
automation. And what I mean by that is
[11:09] (669.12s)
like there are pizza there are pizza
[11:11] (671.84s)
companies that have come and gone
[11:13] (673.92s)
automated pizza companies where it's
[11:15] (675.36s)
like we have a pizza machine and
[11:17] (677.12s)
everybody's like yeah this is amazing
[11:19] (679.28s)
and you have a guy you have a
[11:20] (680.72s)
million-dollar pizza machine and then on
[11:22] (682.24s)
the left you have a guy feeding
[11:23] (683.76s)
ingredients into the pizza machine and
[11:25] (685.68s)
on the right you have a guy taking the
[11:27] (687.36s)
pizza out and then putting it in a box
[11:29] (689.68s)
and doing all this. So instead of one
[11:31] (691.28s)
guy making pizzas, I have a
[11:33] (693.52s)
million-dollar machine and two guys
[11:35] (695.36s)
making pizza. And so when you look at
[11:38] (698.56s)
these auton like robotic food production
[11:42] (702.96s)
machines or food assembly machines, you
[11:46] (706.56s)
have to look at the full stack and say
[11:49] (709.36s)
does it work with the ecosystem that
[11:51] (711.60s)
exists in a restaurant and does it go
[11:54] (714.00s)
full stack from you know like like we
[11:57] (717.04s)
have this thing that machine we saw
[11:58] (718.96s)
earlier. The staff preps the food, they
[12:02] (722.32s)
put the food in the machine and then
[12:03] (723.92s)
they leave.
[12:05] (725.52s)
They're gone. This restaurant runs
[12:07] (727.04s)
itself for many hours without anybody
[12:12] (732.00s)
But this could be McDonald's, Burger
[12:14] (734.16s)
King, and Taco Bell. Nobody would know
[12:17] (737.76s)
that right there. That machine is a it's
[12:20] (740.08s)
an assembly machine, right? The food is
[12:22] (742.16s)
prepped by humans and then assembled by
[12:24] (744.00s)
this machine. For a Chipotle or a
[12:26] (746.48s)
Sweetgreen, this is like a majority of
[12:28] (748.80s)
their labor, right? You go up to a
[12:30] (750.64s)
Chipotle, there's like 10 guys at lunch
[12:32] (752.64s)
and you're still in line. That machine
[12:34] (754.88s)
right there does 300 bowls an hour,
[12:38] (758.16s)
right? And so you go, okay, that's the
[12:41] (761.76s)
this is what's called um like the
[12:44] (764.48s)
assembly line. It's just that front line
[12:46] (766.88s)
where you basically assemble things. I
[12:49] (769.04s)
think sometimes I call it the make line.
[12:51] (771.28s)
What will happen over time is you'll
[12:52] (772.64s)
have perpendicular lines going into it
[12:55] (775.28s)
where you're producing food,
[12:58] (778.08s)
right? So you'll have a production or
[13:00] (780.72s)
bake line going into an assembly line
[13:03] (783.04s)
here and then you go, "Oh, wow." So you
[13:07] (787.60s)
have it something that dispenses burgers
[13:09] (789.52s)
on buns. That's the dispenser. That's
[13:12] (792.00s)
the assembly,
[13:14] (794.00s)
But Factorio on steroids basically.
[13:16] (796.48s)
Yeah. And then it's like how do you cook
[13:18] (798.08s)
that burger? That's what I call that's
[13:20] (800.40s)
what we call state change. So state
[13:22] (802.72s)
change is the is the cooking of the
[13:24] (804.88s)
food. assembly is the like how do I put
[13:28] (808.00s)
it together and plate it.
[13:29] (809.20s)
Doesn't this collapse like for example
[13:31] (811.36s)
if you have a yield of 300 per hour you
[13:34] (814.48s)
said out of that one machine?
[13:36] (816.88s)
Very quickly you can impute the value
[13:39] (819.68s)
of having a smaller footprint store with
[13:42] (822.80s)
five of these things in a faceless
[13:44] (824.96s)
warehouse with drone delivery or cars.
[13:47] (827.68s)
You don't need the physical
[13:48] (828.88s)
infrastructure.
[13:50] (830.24s)
So then don't you create a wasteland of
[13:52] (832.32s)
real estate or how do you repurpose all
[13:54] (834.00s)
the real estate? Well, the way to think
[13:55] (835.52s)
about it is like 90% of well, it's
[13:57] (837.92s)
probably a little lower than that right
[13:59] (839.28s)
now. Let's say 85% of all meals in the
[14:01] (841.52s)
US are at home. They just are.
[14:05] (845.60s)
And a vast majority of those meals are
[14:08] (848.72s)
cooked at home. So, you know, like Uber
[14:12] (852.48s)
Eats and Door Dash, they represent like
[14:14] (854.08s)
1.8% or 2% of all meals right now. It's
[14:18] (858.16s)
very tiny, right? So what you're doing
[14:21] (861.76s)
is you're using real estate to and
[14:24] (864.56s)
infrastructure to prepare and deliver
[14:27] (867.28s)
meals to people at their homes. And so
[14:30] (870.80s)
it's not the restaurants still exist.
[14:33] (873.12s)
We're still going to want to go to
[14:34] (874.24s)
restaurants. We're still going to want
[14:35] (875.44s)
to go outside. We learned that during COVID,
[14:37] (877.60s)
we knew it before. We definitely know it
[14:39] (879.76s)
after. Um and so I don't it's not really
[14:43] (883.12s)
like a decimating real estate situation.
[14:46] (886.00s)
It's taking a thing we used to do for
[14:48] (888.16s)
ourselves and creating a service that
[14:50] (890.64s)
does it higher quality. You know, sort
[14:52] (892.88s)
of I like to say you don't have to be
[14:54] (894.56s)
wealthy to be healthy
[14:57] (897.04s)
and just infrastructure to get that cost
[15:00] (900.72s)
And so you're doing something as a
[15:02] (902.32s)
service that we used to do at home.
[15:04] (904.32s)
I think in the super long run you're
[15:06] (906.56s)
like what where's the story on grocery
[15:09] (909.04s)
stores? If you go to like in in 20
[15:12] (912.40s)
years, I think everybody agrees,
[15:16] (916.24s)
you you will have machines making very
[15:18] (918.72s)
high quality, very personalized meals
[15:21] (921.36s)
for everybody.
[15:22] (922.88s)
This will be good for Keith because he
[15:24] (924.40s)
measures stuff down to like five
[15:26] (926.72s)
calories based on his Instagram.
[15:29] (929.12s)
What's your What's your What's your body
[15:30] (930.40s)
fat? Like 7%.
[15:33] (933.20s)
It's like
[15:34] (934.00s)
Just open his Instagram. He posted four
[15:37] (937.20s)
times today about his body fat. like so
[15:39] (939.04s)
disgusted with himself at 10%.
[15:40] (940.88s)
It's like bad at 10. But um I actually
[15:42] (942.96s)
think the vision of this actually the
[15:44] (944.80s)
natural implication and maybe the home
[15:46] (946.56s)
run version of this is everybody has a
[15:48] (948.40s)
private chef in their house, a robot in
[15:50] (950.96s)
their house that actually does this
[15:52] (952.48s)
personalized because people do want to
[15:55] (955.04s)
cook at home, but they don't have the
[15:59] (959.12s)
Yeah. Or space uh and infrastructure.
[16:02] (962.24s)
But man, these delivery services are
[16:04] (964.40s)
charging. Rich people do this all the
[16:06] (966.16s)
time, right? They do these crazy meal
[16:07] (967.76s)
delivery services for 200 bucks a day
[16:10] (970.08s)
and this is just going to abstract it
[16:11] (971.60s)
down to everybody and man people get
[16:14] (974.32s)
creative when there's that empty space
[16:15] (975.60s)
to your point Chamath about what happens
[16:17] (977.52s)
to all this space. When I lived in New
[16:18] (978.96s)
York in the 80s and 90s it was common to
[16:21] (981.92s)
in Tribeca in West Chelsea where I lived
[16:24] (984.80s)
to take storefronts put your little
[16:26] (986.64s)
architect's office in the front and live
[16:28] (988.16s)
in the back. And many people were
[16:30] (990.48s)
hacking real estate. We still need five,
[16:32] (992.72s)
10 million homes in this country. And
[16:34] (994.24s)
they're already doing this with malls. I
[16:36] (996.40s)
I I keep seeing malls being turned into
[16:38] (998.72s)
colleges and creative spaces. One of
[16:41] (1001.84s)
them in Boston, they turned like the
[16:43] (1003.52s)
second and third floor into studio
[16:45] (1005.76s)
apartments for artists. So, you know,
[16:48] (1008.08s)
where there's a will, there's a way we
[16:49] (1009.28s)
could use the space. I mean,
[16:50] (1010.24s)
yeah. Where this goes, what Chamath is
[16:51] (1011.68s)
saying and where the real estate goes is
[16:53] (1013.84s)
we call it the internet food court
[16:56] (1016.24s)
where, you know, you're on Amazon,
[16:58] (1018.72s)
right? It's the everything store. Now
[17:00] (1020.24s)
imagine that for food and then imagine
[17:03] (1023.28s)
you have an 8,000 foot facility where
[17:06] (1026.48s)
basically anything can be made.
[17:08] (1028.08s)
Anything can be made
[17:08] (1028.88s)
because that machine you saw
[17:11] (1031.28s)
has 18 sort of dispensers for food, 10
[17:14] (1034.96s)
different sauces. You get the idea. Now
[17:16] (1036.96s)
what what about when it's 50 or 100
[17:18] (1038.96s)
dispensers for food? What if you have
[17:21] (1041.44s)
multiple machines with a 100 dispensers
[17:23] (1043.60s)
for food?
[17:24] (1044.40s)
That's crazy. You can... the combinatorial
[17:26] (1046.48s)
math in terms of what's possible, what
[17:28] (1048.32s)
can be made sort of, you know, goes
[17:31] (1051.12s)
exponential and so
[17:34] (1054.48s)
the internet food court is sort of the
[17:36] (1056.32s)
vision for where this all goes.
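To put rough numbers on that combinatorial point (my own back-of-the-envelope, not Travis's figures): if a bowl is just a choice of ingredients pulled from the dispensers, the menu space explodes as dispensers are added. A minimal sketch, with the bowl sizes assumed:

```python
from math import comb

# Back-of-the-envelope: treat a bowl as any choice of k ingredients out of
# n dispensers (ignoring sauces and portion sizes). The dispenser counts
# come from the conversation; the bowl sizes are assumptions.
for n in (18, 50, 100):
    for k in (5, 8):
        print(f"{n} dispensers, {k}-ingredient bowls: {comb(n, k):,} combinations")
```

With 18 dispensers and 5-ingredient bowls that is about 8,568 combinations; at 100 dispensers it is over 75 million.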
[17:38] (1058.56s)
Another example of the bitter lesson.
[17:41] (1061.44s)
The bitter Yeah, we're going to get to
[17:43] (1063.04s)
that, I guess, today
[17:45] (1065.20s)
in a very full docket. Before we get to
[17:47] (1067.52s)
that, just a little bit of housekeeping
[17:51] (1071.76s)
September 7th, 8th, 9th in Los Angeles,
[17:54] (1074.48s)
the All-In Summit again, all-in.com yada
[17:57] (1077.04s)
yada yada. The lineup is stacked and
[18:00] (1080.00s)
we're gonna start announcing the
[18:01] (1081.44s)
speakers. People have been begging us to
[18:03] (1083.12s)
announce the speakers.
[18:04] (1084.64s)
I don't know.
[18:05] (1085.92s)
You got to hold some back. Careful,
[18:07] (1087.28s)
careful.
[18:07] (1087.76s)
Hold a couple back, but we got some
[18:09] (1089.44s)
really nice speakers lined up. It is
[18:11] (1091.36s)
going to be extraordinary.
[18:12] (1092.88s)
It is. It is the best one yet.
[18:15] (1095.60s)
I mean, every year we have this done.
[18:18] (1098.16s)
Yeah. Yeah. Every year we have this
[18:19] (1099.44s)
little bit of panic like um you know are we
[18:21] (1101.68s)
going to get great speakers and man they
[18:23] (1103.12s)
started flowing in this week it's going
[18:24] (1104.40s)
to be extraordinary
[18:26] (1106.56s)
almost as extraordinary as this
[18:28] (1108.00s)
delicious tequila behind my head here.
[18:29] (1109.84s)
Get the allin tequila at
[18:31] (1111.12s)
tequila.allin.com.
[18:32] (1112.96s)
Deliveries begin late summer.
[18:34] (1114.72s)
He's moving to the side. You can't even
[18:39] (1119.28s)
right there.
[18:40] (1120.16s)
All right. Listen.
[18:40] (1120.88s)
Oh wow.
[18:41] (1121.44s)
Lots to discuss this week. Obviously AI
[18:43] (1123.44s)
is continuing to be the big story in our
[18:47] (1127.36s)
industry and for good reason. Our bestie
[18:50] (1130.32s)
uh Elon released Grok 4 Wednesday night.
[18:55] (1135.20s)
Two versions, base model and a heavy
[18:57] (1137.04s)
model. 30 bucks a month for the base.
[18:59] (1139.60s)
$300 a month for this heavy model which
[19:04] (1144.16s)
has a very unique feature. You can have
[19:05] (1145.84s)
a multi-agent feature. I got to see this
[19:08] (1148.00s)
actually when I visited XAI a couple of
[19:09] (1149.60s)
weeks ago where multiple agents work on
[19:12] (1152.00s)
the same problem then they and they do
[19:14] (1154.16s)
that simultaneously obviously and then
[19:15] (1155.60s)
compare each other's work and it gives
[19:17] (1157.84s)
you, kind of like a study group, the best
[19:17] (1157.84s)
answer uh by consensus. Really interesting.
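A minimal sketch of that "study group" idea as described here, not xAI's actual implementation; ask_model is a hypothetical stand-in for a single agent's call.

```python
from collections import Counter

def ask_model(prompt: str) -> str:
    # Hypothetical stand-in for one agent's answer (one LLM call in practice).
    return "placeholder answer"

def consensus_answer(prompt: str, n_agents: int = 4) -> str:
    # Several agents tackle the same problem independently, then the most
    # common answer wins, like a study group comparing work.
    answers = [ask_model(prompt) for _ in range(n_agents)]
    best, _count = Counter(answers).most_common(1)[0]
    return best
```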
[19:21] (1161.68s)
According to Artificial
[19:24] (1164.24s)
Analysis benchmarks, you can pull that up,
[19:26] (1166.16s)
Nick. Grok 4 base model has surpassed
[19:29] (1169.44s)
OpenAI's o3 Pro and Google Gemini 2.5 Pro
[19:33] (1173.12s)
as the most intelligent model
[19:36] (1176.96s)
It includes like seven different industry
[19:39] (1179.44s)
standard evaluation tests. You can look
[19:41] (1181.52s)
it up, but reasoning, math, coding, all
[19:43] (1183.36s)
that kind of stuff. This is, you know,
[19:46] (1186.16s)
book smarts, not necessarily street
[19:48] (1188.32s)
smart. So, it doesn't mean that these
[19:49] (1189.84s)
things can reason. And obviously there
[19:51] (1191.76s)
was a little um there was a little
[19:54] (1194.16s)
kerfuffle on um X formerly known as
[19:57] (1197.68s)
Twitter where XAI got a little frisky
[19:59] (1199.92s)
and was saying all kinds of crazy stuff
[20:01] (1201.44s)
and needed to uh maybe be red-teamed a
[20:04] (1204.64s)
little bit more decisively.
[20:08] (1208.00s)
Many of you know Grok 4 was trained on
[20:09] (1209.84s)
Colossus. That's that giant data center
[20:12] (1212.24s)
that Elon's been building. And we showed
[20:14] (1214.96s)
the chart here. Chamath,
[20:17] (1217.12s)
you sent us a link to the bitter lesson
[20:18] (1218.96s)
by uh Rich Sutton in the group chat.
[20:20] (1220.96s)
That's the 2019 blog post. We'll pull it
[20:23] (1223.04s)
up here for people to take a look at and
[20:24] (1224.56s)
we'll put it in the show notes. Maybe
[20:26] (1226.80s)
just generally,
[20:28] (1228.88s)
your reaction to both how quickly Elon
[20:33] (1233.12s)
has, and that chart showed it, how
[20:34] (1234.64s)
quickly Elon has caught up
[20:36] (1236.64s)
and I don't think people expected him to
[20:38] (1238.24s)
take the lead, but here we are. Before
[20:40] (1240.00s)
we start, Nick, can you please show
[20:42] (1242.40s)
Elon's tweet about how they did on the
[20:44] (1244.80s)
AGI benchmark? It's absolutely
[20:47] (1247.76s)
incredible.
[20:50] (1250.56s)
Two things. One is how quickly starting
[20:55] (1255.84s)
in March of 2023, so we're talking about
[21:00] (1260.16s)
less than 2 and 1/2 years, what this
[21:03] (1263.04s)
team has accomplished
[21:05] (1265.52s)
and how far ahead they are of everybody
[21:08] (1268.48s)
else. That's demonstrated by this. But
[21:10] (1270.96s)
the second is a fundamental
[21:12] (1272.96s)
architectural decision that Elon made
[21:16] (1276.32s)
which I think we didn't fully appreciate
[21:19] (1279.52s)
until now. And it maps to an
[21:22] (1282.00s)
architectural decision he made at Tesla
[21:24] (1284.72s)
as well. And for all we know, we'll
[21:26] (1286.40s)
figure out that he made an equivalent
[21:27] (1287.92s)
decision at SpaceX. And that decision is
[21:31] (1291.20s)
really well encapsulated by this essay,
[21:34] (1294.48s)
the bitter lesson by Rich Sutton. And
[21:37] (1297.60s)
Nick, if you can just throw this up
[21:39] (1299.04s)
there, but just to summarize what this
[21:41] (1301.04s)
says, it basically says in a nutshell
[21:44] (1304.08s)
that you're always better off when
[21:47] (1307.20s)
you're trying to solve an AI problem
[21:49] (1309.44s)
taking a general learning approach that
[21:51] (1311.76s)
can scale with computation because it
[21:54] (1314.16s)
ultimately proves to be the most
[21:55] (1315.68s)
effective. And the alternative would be
[21:59] (1319.04s)
something that's much more human labored
[22:01] (1321.12s)
and human involved that requires human
[22:03] (1323.52s)
knowledge. And so the first method, what
[22:07] (1327.20s)
it essentially allows you to do is view
[22:10] (1330.56s)
any problem as an endless scalable
[22:14] (1334.80s)
search or learning task. And as it's
[22:18] (1338.24s)
turned out, whether it's chess or go or
[22:21] (1341.52s)
speech recognition or computer vision,
[22:24] (1344.40s)
whenever there was two competing
[22:25] (1345.76s)
approaches, one that used general
[22:27] (1347.52s)
computation and one that used human
[22:29] (1349.76s)
knowledge, the general computation
[22:32] (1352.56s)
approach always won. And so it creates
[22:36] (1356.48s)
this bitter lesson for humans that want
[22:38] (1358.56s)
to think that we are at the center of
[22:41] (1361.04s)
all of this critical learning and all of
[22:42] (1362.80s)
these leaps. In more AI specific
[22:45] (1365.28s)
language, what it means is that a lot of
[22:47] (1367.52s)
these systems create these embeddings
[22:49] (1369.20s)
that are just not understandable by
[22:50] (1370.96s)
humans at all, but it yields incredible
[22:53] (1373.28s)
results.
[22:55] (1375.12s)
So why is this crazy? Well, he made this
[22:57] (1377.68s)
huge bet on this 100,000 GPU cluster.
[23:00] (1380.56s)
People thought, "Wow, that's a lot. Is
[23:02] (1382.08s)
it going to bear fruit?" Then he said,
[23:03] (1383.84s)
"No, actually, I'm scaling it up to
[23:05] (1385.52s)
250,000." Then he said, "It's going to
[23:07] (1387.36s)
scale up to a million." And what these
[23:09] (1389.60s)
results show is a general computational
[23:13] (1393.04s)
approach that doesn't require as much
[23:15] (1395.12s)
human labeling can actually get to the
[23:18] (1398.32s)
answer and better answers faster. That
[23:20] (1400.72s)
has huge implications because if you
[23:22] (1402.56s)
think about all these other companies,
[23:25] (1405.84s)
what has Llama been doing? They just
[23:28] (1408.24s)
spent $15 billion to buy 49% of Scale AI.
[23:32] (1412.00s)
That's exactly a bet on human knowledge.
[23:34] (1414.40s)
What is Gemini doing? What is OpenAI
[23:36] (1416.40s)
doing? What is Anthropic doing? So all
[23:38] (1418.48s)
these things come into question. And
[23:40] (1420.08s)
then the last thing I'll say is if you
[23:41] (1421.76s)
look back, he made this bet once before,
[23:45] (1425.60s)
which was Tesla FSD versus Waymo. And
[23:48] (1428.64s)
Tesla FSD only had cameras. It didn't
[23:51] (1431.84s)
have lidar. But the bet was, I'll just
[23:54] (1434.56s)
collect billions and billions of driving
[23:57] (1437.84s)
miles before anybody else does and apply
[24:01] (1441.36s)
general compute and it'll get to
[24:03] (1443.84s)
autonomy faster than the other more
[24:06] (1446.32s)
laborious and very expensive approach.
[24:09] (1449.20s)
So I just think it's an incredible
[24:11] (1451.28s)
moment in technology where we see so
[24:13] (1453.52s)
many examples. Travis is another one
[24:15] (1455.44s)
what he's just talked about. You know,
[24:17] (1457.44s)
the bitter lesson is you could believe
[24:20] (1460.24s)
that, you know, food is this immutable
[24:22] (1462.96s)
thing that's made meticulously by hand
[24:24] (1464.96s)
by these individuals, or you can take
[24:26] (1466.96s)
this general purpose computer approach,
[24:28] (1468.48s)
which is what he took, waited for these
[24:30] (1470.40s)
cost curves to come into play, and now
[24:32] (1472.56s)
you can scale food to every human on
[24:34] (1474.32s)
Earth. I I just think it's a it's so
[24:36] (1476.88s)
profoundly important.
[24:38] (1478.40s)
One thing I'll throw out there, Chamath,
[24:41] (1481.60s)
the Tesla approach for autonomy is
[24:44] (1484.32s)
taking human knowledge. In fact, the
[24:46] (1486.32s)
whole idea is to approximate human
[24:50] (1490.16s)
driving, right? That is the whole damn
[24:53] (1493.36s)
thing. Now, depending on your approach
[24:55] (1495.84s)
in the technology, you can do like
[24:57] (1497.12s)
what's called an end-to-end approach or
[24:59] (1499.12s)
you can look at okay perception,
[25:01] (1501.52s)
prediction, planning, uh, and control,
[25:04] (1504.40s)
which are like these four modules that
[25:06] (1506.16s)
sort of you you you sort of engineer if
[25:09] (1509.52s)
that makes sense. But it's approximating
[25:12] (1512.72s)
human driving to do it. The difference
[25:15] (1515.12s)
is that,
[25:19] (1519.04s)
you know, I I think Elon's taken a a
[25:21] (1521.52s)
almost a more human approach, which is
[25:23] (1523.36s)
like, I've got two eyes. Why can't my
[25:26] (1526.00s)
car Why can't my car do it like a human?
[25:28] (1528.48s)
Like, I don't have any lidar spinning
[25:30] (1530.48s)
around on my head as a human. Why can't
[25:32] (1532.72s)
my car? So, it's kind of interesting.
[25:34] (1534.56s)
He's sort of taking what you're saying
[25:36] (1536.40s)
Chamath on the computation side because
[25:38] (1538.96s)
hardware 5 is coming out on Tesla
[25:41] (1541.44s)
probably next year
[25:42] (1542.88s)
which is going to make a big difference
[25:44] (1544.48s)
in what FSD can do.
[25:47] (1547.68s)
That's the compute side you're talking
[25:48] (1548.96s)
about. But then he is approximating
[25:51] (1551.04s)
human.
[25:52] (1552.08s)
Yeah. I just meant that, you know, other
[25:53] (1553.60s)
than the first versions of FSD, which I
[25:55] (1555.84s)
think Andrej talked about, Andrej Karpathy
[25:57] (1557.68s)
talked about,
[25:58] (1558.64s)
you know, they're not really so
[26:01] (1561.76s)
reliant anymore on human labeling, per
[26:04] (1564.16s)
se, right? So that that Yeah, that that
[26:06] (1566.88s)
inference. And then
[26:08] (1568.40s)
the other crazy thing that he said
[26:10] (1570.32s)
subsequent versions of Grok
[26:13] (1573.36s)
are not going to be trained on any
[26:15] (1575.68s)
traditional data set that exists in the
[26:17] (1577.52s)
wild. The cumulative sum of human
[26:19] (1579.84s)
knowledge has been exhausted in AI
[26:21] (1581.60s)
training. That happened basically last
[26:23] (1583.36s)
last year. And so the only way to then
[26:26] (1586.08s)
supplement that is with synthetic data
[26:28] (1588.08s)
where the AI creates it'll sort of write
[26:30] (1590.72s)
an essay or it'll come up with with a
[26:32] (1592.40s)
thesis and then it will grade itself um
[26:34] (1594.96s)
and and and sort of go through this
[26:36] (1596.80s)
process of self-learning with synthetic
[26:38] (1598.64s)
data. He said that he's going to have
[26:40] (1600.40s)
agents creating synthetic data from
[26:43] (1603.44s)
scratch that then drive all the training
[26:46] (1606.08s)
which I just think is it's crazy.
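A hedged sketch of the generate-and-grade loop being described, assuming the model both proposes an answer and scores it; generate and grade are placeholders, not any real API.

```python
def generate(prompt: str) -> str:
    # Placeholder: the model writes an essay or proposes a thesis.
    return f"draft answer to: {prompt}"

def grade(prompt: str, draft: str) -> float:
    # Placeholder: the model (or a separate verifier) scores its own output, 0..1.
    return 0.5

def make_synthetic_dataset(prompts, threshold=0.8):
    # Keep only self-generated examples that grade highly; those become new
    # training data, so later training no longer depends on human-written text.
    kept = []
    for p in prompts:
        draft = generate(p)
        if grade(p, draft) >= threshold:
            kept.append((p, draft))
    return kept
```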
[26:48] (1608.48s)
Just explain this concept one more time
[26:50] (1610.16s)
with the bitter lesson. Hand coding
[26:52] (1612.16s)
heuristics into the computer and saying,
[26:54] (1614.80s)
"Hey, here are specific openings in
[26:56] (1616.56s)
chess."
[26:57] (1617.28s)
Use Yeah, use chess, right?
[26:59] (1619.28s)
You're hand coding specific examples of
[27:01] (1621.20s)
openings in there, end games, etc.
[27:03] (1623.28s)
versus just saying, "Play every possible
[27:05] (1625.44s)
game, and here's every game we have."
[27:07] (1627.44s)
So, here's
[27:08] (1628.64s)
Yeah. So, the two approaches would be,
[27:10] (1630.40s)
let's say, like Travis and I were
[27:11] (1631.76s)
building competing versions of a chess
[27:14] (1634.16s)
solver, and Travis's approach would say,
[27:17] (1637.44s)
I'm just going to define the chessboard.
[27:20] (1640.08s)
I'm gonna give the players certain
[27:23] (1643.12s)
boundaries in which they can move,
[27:25] (1645.12s)
right? So the bishop can only move
[27:27] (1647.12s)
diagonally and there's a couple of
[27:28] (1648.80s)
boundary conditions and I'm going to
[27:31] (1651.20s)
create a reward function and I'm just
[27:33] (1653.52s)
going to let the thing self-learn and
[27:35] (1655.12s)
self-play. That's his version.
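A toy illustration of that first approach, under the assumption that tic-tac-toe can stand in for chess: define the rules and a win reward, then estimate the value of each opening move purely from self-play, with no hand-coded openings. A real system would use search plus a learned value function rather than random playouts.

```python
import random

# The rules of the game are the only human knowledge we encode.
WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in WINS:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

def playout(board, player):
    # Finish the game with random legal moves; return "X", "O", or None (draw).
    while True:
        w = winner(board)
        if w or all(board):
            return w
        move = random.choice([i for i, cell in enumerate(board) if not cell])
        board[move] = player
        player = "O" if player == "X" else "X"

def first_move_values(n_games=2000):
    # Reward = did X end up winning? Averaged over self-play games, this gives
    # a learned value for each opening move, no opening book required.
    values = {}
    for move in range(9):
        wins = 0
        for _ in range(n_games):
            board = [""] * 9
            board[move] = "X"  # X opens here
            if playout(board, "O") == "X":
                wins += 1
        values[move] = wins / n_games
    return values

print(first_move_values())
```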
[27:38] (1658.24s)
And then what happens is when you map
[27:40] (1660.72s)
out every single permutation
[27:45] (1665.04s)
when you go and play Keith, who's the
[27:46] (1666.80s)
best chess player in the world, what
[27:49] (1669.12s)
you're doing at that point is saying,
[27:50] (1670.48s)
"Okay, Keith made this move." So you
[27:53] (1673.12s)
search for what Keith's move is, and you
[27:56] (1676.00s)
have a distribution of the best moves
[27:58] (1678.48s)
that you could make in response or vice
[28:00] (1680.88s)
versa. That was the cutting edge
[28:03] (1683.36s)
approach. The different approach which
[28:06] (1686.24s)
is more, you know, what people would
[28:08] (1688.00s)
think is more quote unquote elegant and
[28:09] (1689.68s)
less brute force would be Jason for you
[28:11] (1691.92s)
and I to sit there and say, "Okay, if
[28:13] (1693.84s)
Keith moves here, we should do this. We
[28:15] (1695.68s)
should do this specific variation of the
[28:17] (1697.92s)
Sicilian defense." And
[28:19] (1699.68s)
and it's too much human knowledge. And I
[28:21] (1701.60s)
think what what it turned out was there
[28:23] (1703.04s)
was a psychological need for humans to
[28:25] (1705.04s)
believe we were part of the answer.
[28:27] (1707.60s)
But what this is showing is because of
[28:29] (1709.20s)
Moore's law and because of general
[28:30] (1710.56s)
computation, it's just not necessary.
[28:33] (1713.04s)
You just have to let go, give up
[28:34] (1714.64s)
control. And that's very hard for some
[28:37] (1717.20s)
people and for others.
[28:38] (1718.96s)
It's also very hard in some
[28:40] (1720.08s)
circumstances where a car is driving
[28:41] (1721.52s)
down the road and it's learning in that
[28:43] (1723.92s)
process, which is why you need a safety
[28:45] (1725.44s)
driver. And and I think Elon made the
[28:47] (1727.04s)
right decision to put one in there.
[28:48] (1728.56s)
Keith, your thoughts?
[28:49] (1729.20s)
Yeah, a a couple points. It's it's not
[28:50] (1730.80s)
quite that binary. Chamath, I generally
[28:52] (1732.64s)
agree with your arc, but like if you
[28:54] (1734.16s)
think about LLMs being the most
[28:55] (1735.76s)
important unlock in AI, LLMs are all
[29:00] (1740.32s)
trained on human writing. So someone
[29:03] (1743.20s)
wrote every piece of data that every LLM
[29:05] (1745.52s)
uses; a human wrote it at some point in
[29:07] (1747.28s)
history. So yes, it's true that they've
[29:10] (1750.32s)
shocked everybody including OpenAI's,
[29:12] (1752.96s)
you know, original team on the
[29:14] (1754.96s)
implications, the broad implications,
[29:16] (1756.48s)
the general applicability to almost
[29:18] (1758.64s)
every problem, but it's not like there
[29:20] (1760.80s)
was uh some tablets floating in space
[29:22] (1762.96s)
that weren't drafted by humans that
[29:24] (1764.80s)
we've trained on. As you get into non-LLM
[29:27] (1767.44s)
based uh models, you may be totally
[29:29] (1769.76s)
right, but almost no one's really using
[29:31] (1771.76s)
non-LLM based models at scale
[29:34] (1774.40s)
on driving specifically. Travis is
[29:36] (1776.32s)
totally right that humans are actually
[29:38] (1778.40s)
really good drivers except when they get
[29:40] (1780.00s)
distracted. They get distracted by drugs
[29:41] (1781.92s)
or alcohol. They get distracted by being
[29:43] (1783.76s)
tired. They get distracted by turning
[29:45] (1785.60s)
the radio. They get distracted by
[29:47] (1787.04s)
chatting with their passenger. So
[29:48] (1788.96s)
training against human behaviors actually
[29:51] (1791.04s)
turned out to be a great decision
[29:52] (1792.88s)
because what for whatever sort of
[29:54] (1794.40s)
Darwinistic reasons humans are pretty
[29:56] (1796.08s)
ideal drivers and so you don't have to
[29:58] (1798.24s)
reason from first principles this is a
[30:00] (1800.08s)
much better path and I think again there
[30:02] (1802.16s)
may be a a broad u sort of lesson there.
[30:06] (1806.24s)
most important thing I think as a VC
[30:08] (1808.16s)
that you said uh is we've been debating
[30:10] (1810.16s)
for years should we invest in companies
[30:12] (1812.24s)
like Scale or Mercor or any of these,
[30:14] (1814.32s)
Surge, the truth is I think there's a
[30:16] (1816.80s)
very short half-life
[30:18] (1818.88s)
on human label data and so everybody
[30:22] (1822.08s)
who's investing in these companies just
[30:24] (1824.64s)
looking at revenue traction
[30:26] (1826.88s)
really didn't understand that there may
[30:28] (1828.56s)
be a year, two years, three years max when
[30:32] (1832.32s)
anybody uses human label data for maybe
[30:34] (1834.48s)
anything
[30:36] (1836.08s)
because we hit the end of human
[30:38] (1838.48s)
knowledge or just the collection of it.
[30:42] (1842.08s)
99% done
[30:43] (1843.36s)
or you train on you train on it so well
[30:46] (1846.80s)
that you don't need to label anymore.
[30:48] (1848.80s)
Like the the machines know how to label
[30:51] (1851.76s)
as good or better than a human. And so
[30:54] (1854.00s)
like we're seeing this in the
[30:55] (1855.12s)
self-driving space: labeling was huge,
[30:58] (1858.72s)
right? You would have a
[30:59] (1859.92s)
three-dimensional sort of scene that's
[31:02] (1862.64s)
created by video plus lidar. Let's say,
[31:06] (1866.00s)
okay, I have to label all of these
[31:09] (1869.04s)
essentially what become boxes like I've
[31:11] (1871.20s)
identified objects. You're you're some
[31:14] (1874.00s)
of the players in the in the autonomous
[31:16] (1876.08s)
software space, autonomous vehicle
[31:17] (1877.92s)
software space are no longer doing any
[31:20] (1880.24s)
labeling because the machines are doing
[31:21] (1881.84s)
it all.
[31:23] (1883.12s)
Just broadly,
[31:24] (1884.16s)
it'll just be built into the chipset
[31:25] (1885.92s)
that this is a stop sign. Like it's like
[31:27] (1887.84s)
we know what a stop sign is. We don't
[31:29] (1889.92s)
need the millionth time for somebody
[31:32] (1892.56s)
captchas like you're like find the stop
[31:34] (1894.32s)
sign or what's the traffic light and
[31:36] (1896.72s)
eventually the machines are just way
[31:38] (1898.88s)
better than humans at identifying these
[31:40] (1900.80s)
things
[31:41] (1901.52s)
or, to be very practical, when you see
[31:43] (1903.68s)
a stop sign you don't have to identify
[31:45] (1905.76s)
that it's a stop sign. You just see that
[31:48] (1908.48s)
every human when they encounter a stop
[31:50] (1910.80s)
sign 99.9% of the time they hit the brake
[31:53] (1913.68s)
and they never act. So nobody actually
[31:55] (1915.52s)
knows it's a stop sign. It's just that
[31:57] (1917.60s)
hit the brake when you see something that
[31:59] (1919.12s)
looks like this object.
[32:00] (1920.16s)
It's just a vibe.
[32:02] (1922.24s)
it's a vibe.
[32:03] (1923.68s)
I would just say that that's like
[32:04] (1924.96s)
intuitive knowledge versus like the
[32:06] (1926.96s)
expressly labeled human knowledge. The
[32:08] (1928.80s)
question for me is if everybody was so
[32:11] (1931.76s)
reliant on human labeling initially, if
[32:15] (1935.12s)
you're an investor now, when you see
[32:17] (1937.84s)
these Grok 4 results,
[32:21] (1941.20s)
how do you make an investment decision
[32:22] (1942.88s)
that's not purely levered to just
[32:24] (1944.72s)
computation?
[32:26] (1946.72s)
So if you look at these results, does it
[32:29] (1949.68s)
mean that the, you know, there's 300 to
[32:33] (1953.28s)
1,000 basis points of lag
[32:37] (1957.60s)
between just letting the computers vibe
[32:39] (1959.76s)
itself to the answer versus interjecting
[32:42] (1962.48s)
ourselves? If interjecting ourselves
[32:44] (1964.40s)
slows us down by 300 to 1,000 basis
[32:46] (1966.48s)
points per successive iteration, then
[32:49] (1969.92s)
over two or three iterations, you've
[32:51] (1971.36s)
totally lost. So, what does it mean for
[32:54] (1974.00s)
everybody that's not Grok when they
[32:56] (1976.88s)
wake up today and they have to decide
[32:58] (1978.72s)
how do I change my strategy or double
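To make that arithmetic concrete (my numbers, not Chamath's): 1,000 basis points is a 10% slowdown per iteration, and the gap compounds across successive model generations. A small sketch:

```python
# 300 to 1,000 bps = 3% to 10% slower per iteration; compounded over
# generations, the human-in-the-loop approach falls well behind.
for bps in (300, 1000):
    per_iter = 1 - bps / 10_000
    for n in (2, 3, 5):
        print(f"{bps} bps lag, {n} iterations -> {per_iter ** n:.2f}x the pace of pure compute")
```

At 1,000 bps over three iterations you are already running at roughly 73% of the pure-compute lab's pace.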
[33:01] (1981.68s)
I think look, I'm I'm not in the
[33:04] (1984.16s)
investment game, but if I were, it would
[33:06] (1986.96s)
be all about scientific breakthrough.
[33:09] (1989.36s)
So, I sometimes get in this place where
[33:11] (1991.68s)
I'm looking I'm going down a path. I,
[33:14] (1994.64s)
you know, I'll be up at 4:00 or 5 in the
[33:17] (1997.04s)
morning. Uh, my day hasn't quite
[33:19] (1999.20s)
started, but I'm not sleeping anymore.
[33:21] (2001.60s)
And I'll start go like I'll be on Quora
[33:23] (2003.44s)
and see some cool quantum physics
[33:25] (2005.44s)
question or something else I'm looking
[33:27] (2007.76s)
into and I'll go down this thread with
[33:30] (2010.32s)
GPT or Grok and I'll start to get to
[33:34] (2014.16s)
the edge of what's known in quantum
[33:36] (2016.72s)
physics and then I'm doing the
[33:39] (2019.52s)
equivalent of vibe coding except it's
[33:43] (2023.52s)
vibe physics
[33:45] (2025.60s)
and we're approaching what's known and
[33:47] (2027.84s)
I'm trying to poke and see if there's
[33:49] (2029.60s)
breakthroughs to be had and I've gotten
[33:51] (2031.92s)
pretty damn close to some interesting
[33:53] (2033.68s)
breakthroughs just doing that.
[33:56] (2036.00s)
And I, you know, I pinged uh I pinged
[33:58] (2038.80s)
Elon at some point. I'm just like, dude,
[34:01] (2041.36s)
if I'm if I'm doing this and I'm super
[34:04] (2044.56s)
amateur hour physics enthusiast like
[34:08] (2048.88s)
what about all those PhD students and
[34:11] (2051.36s)
postdocs that are super legit using this
[34:14] (2054.24s)
tool and this is pre Grok 4 now with
[34:16] (2056.80s)
Grok 4 like like there's a lot of
[34:18] (2058.88s)
mistakes I was seeing Grok make that
[34:22] (2062.00s)
then I would correct and we would talk
[34:23] (2063.84s)
about it. Grok could be this place
[34:26] (2066.48s)
where breakthroughs are actually
[34:28] (2068.64s)
happening, new breakthroughs. So if I'm
[34:30] (2070.96s)
investing in this space, I would be
[34:34] (2074.00s)
who's got the edge on scientific
[34:36] (2076.64s)
breakthroughs and and the application
[34:39] (2079.44s)
layer on top of these foundational
[34:41] (2081.12s)
models that orients that direction? Is
[34:43] (2083.76s)
your perception that the LLMs are
[34:46] (2086.64s)
actually starting to get to the
[34:48] (2088.40s)
reasoning level that they'll come up
[34:50] (2090.64s)
with a novel concept theory and have
[34:53] (2093.04s)
that breakthrough or that we're kind of
[34:55] (2095.44s)
reading into it and it's just trying
[34:57] (2097.44s)
random stuff at the at the margins.
[34:59] (2099.60s)
It's uh
[35:00] (2100.48s)
or maybe it doesn't happen.
[35:01] (2101.44s)
No, no, no. So, what I what I've seen
[35:02] (2102.96s)
and again I haven't used Grok 4, I
[35:04] (2104.56s)
tried to use it early this morning but
[35:06] (2106.24s)
for some reason I couldn't do it on my
[35:07] (2107.84s)
on my app. But
[35:11] (2111.76s)
so let's say we're talking Grok 3 and
[35:13] (2113.92s)
existing ChatGPT as it is. No, it
[35:16] (2116.32s)
cannot come up with the new idea. These
[35:18] (2118.48s)
things are so wedded to what is known
[35:21] (2121.52s)
and they're so like even when I come up
[35:23] (2123.44s)
with a new idea, I have to really it's
[35:26] (2126.48s)
like pulling a donkey sort of. You see,
[35:29] (2129.04s)
you're pulling it
[35:30] (2130.24s)
because it doesn't want to break
[35:32] (2132.56s)
conventional wisdom. It's like really
[35:35] (2135.68s)
adhering to conventional wisdom. You keep
[35:37] (2137.60s)
pulling it out and then eventually it goes,
[35:39] (2139.76s)
"Oh [Β __Β ] you got something."
[35:41] (2141.84s)
But then when it says that, when it says
[35:44] (2144.00s)
that, then you you have to you have to
[35:45] (2145.92s)
go, "Okay, it said that, but I'm not
[35:48] (2148.24s)
sure." Like you have to double and
[35:49] (2149.44s)
triple check to make sure that you
[35:51] (2151.92s)
really got something. To your point,
[35:53] (2153.28s)
when these models are fully divorced
[35:55] (2155.84s)
from having to learn on the known world
[35:58] (2158.64s)
and instead can just learn
[36:00] (2160.08s)
synthetically,
[36:01] (2161.36s)
then everything gets flipped upside down
[36:03] (2163.20s)
to what is the best hypothesis you have
[36:06] (2166.00s)
or what is the best question? You could
[36:07] (2167.84s)
just give it some problem and it would
[36:09] (2169.52s)
just figure it out.
[36:11] (2171.68s)
So, where I go on this one, guys, is
[36:13] (2173.36s)
it's all about scientific method,
[36:16] (2176.32s)
right? If you get if you have an LLM or
[36:19] (2179.68s)
foundational model of some kind that is
[36:21] (2181.76s)
the best in the world at the scientific
[36:23] (2183.76s)
method,
[36:25] (2185.44s)
game the f over.
[36:28] (2188.16s)
You basically you just light up more
[36:30] (2190.32s)
GPUs and you just got like a thousand
[36:32] (2192.72s)
more PhD students working for you.
[36:36] (2196.40s)
Keith, you're nodding your head here.
[36:39] (2199.20s)
I I agree with that. I think that's
[36:40] (2200.56s)
fantastic because the scientific method
[36:42] (2202.80s)
also the faster it is the more you when
[36:46] (2206.08s)
you have a hypothesis the faster you get
[36:47] (2207.60s)
a response you're more likely to dive in
[36:49] (2209.76s)
and dive in and dive in recursively and
[36:51] (2211.36s)
recursively and every lag every
[36:53] (2213.28s)
millisecond lag causes you to like lose
[36:56] (2216.08s)
your train of thought sort of so to
[36:57] (2217.36s)
speak. So you get the benefits that
[36:59] (2219.68s)
Travis is alluding to plus speed and you go
[37:01] (2221.92s)
places you never guess. This happens all
[37:03] (2223.28s)
the time when you run a company and
[37:04] (2224.40s)
you're doing like analytics and you have
[37:06] (2226.48s)
a tool that allows you to constantly
[37:07] (2227.92s)
query quickly quickly quickly double
[37:09] (2229.52s)
click triple click you get to answers
[37:11] (2231.44s)
that you never get to if there's even a
[37:13] (2233.20s)
second or two second or 3 second delay
[37:14] (2234.80s)
let alone sending it to a human.
[37:16] (2236.88s)
Secondly where you actually see this
[37:18] (2238.64s)
today it's already happening. If you
[37:20] (2240.24s)
look at foundational models that just
[37:21] (2241.68s)
apply to science, there's lots of things
[37:24] (2244.24s)
about the human body, let's say in
[37:25] (2245.52s)
health biology, that we humans don't
[37:27] (2247.28s)
actually understand all the connections.
[37:28] (2248.96s)
Like why do we do X? Why do some people
[37:31] (2251.04s)
get cancer? Why other people not get
[37:32] (2252.24s)
cancer? Why does the brain work this
[37:33] (2253.36s)
way? Models trained solely on science
[37:37] (2257.44s)
tend to expose connections that no human
[37:40] (2260.00s)
has ever had before.
[37:42] (2262.08s)
And that's because like the raw
[37:43] (2263.92s)
materials there and we only have a
[37:46] (2266.16s)
conscious awareness of, call it, 1 to 10%. But
[37:48] (2268.96s)
when you apply it to other human domains
[37:50] (2270.80s)
where they're training on human sort of
[37:53] (2273.52s)
data, human produced data, human
[37:55] (2275.20s)
produced output, they're limited to that
[37:57] (2277.44s)
output. So I think you just take the
[37:59] (2279.36s)
science and apply it at large and you
[38:01] (2281.52s)
you're going to wind up finding things
[38:02] (2282.64s)
that no human has ever thought before.
[38:04] (2284.88s)
And it's that the thing about science
[38:06] (2286.72s)
though is that it's the hypothesis that
[38:08] (2288.72s)
you then have to test in the physical
[38:10] (2290.64s)
world. So the you're like okay you've
[38:13] (2293.92s)
got this hive mind this like you know
[38:18] (2298.48s)
this this
[38:20] (2300.80s)
computation engine this brain of sorts
[38:24] (2304.08s)
you wanted to say consciousness but you
[38:25] (2305.60s)
start I was like how do I describe
[38:28] (2308.88s)
the big C word consciousness
[38:30] (2310.64s)
but but you need to be able to test in
[38:33] (2313.20s)
the physical world so you could imagine
[38:35] (2315.36s)
a a physical lab connected to one of
[38:39] (2319.44s)
these systems
[38:40] (2320.88s)
where then you could say, okay, like if
[38:42] (2322.96s)
it was a chemistry experiment, you could
[38:44] (2324.88s)
do chemistry experiments or physics. You
[38:47] (2327.44s)
get the idea.
[38:48] (2328.32s)
What could go wrong?
[38:49] (2329.92s)
It would be it's yeah, no big deal. It's
[38:51] (2331.60s)
going to be fine. Okay. So, but but this
[38:54] (2334.08s)
is where it goes because if you have a
[38:55] (2335.84s)
scientific method machine, you still
[38:58] (2338.40s)
have to be able to test your hypothesis.
[39:00] (2340.48s)
You have to go through the scientific
[39:01] (2341.84s)
and the verification. Yeah. Exactly.
[39:04] (2344.48s)
Wow. It's kind of mind-blowing. Reminds
[39:06] (2346.64s)
really mindblowing
[39:07] (2347.76s)
if you remember I don't know if you guys
[39:09] (2349.36s)
remember dark matter and like the
[39:10] (2350.80s)
discovery of it and everything and as
[39:12] (2352.24s)
explained to me by Lisa Randall you know
[39:14] (2354.24s)
the the discovery was made not by
[39:16] (2356.64s)
knowing there was dark matter there
[39:18] (2358.40s)
and observing it but observing there was
[39:21] (2361.28s)
something you know gravitational forces
[39:23] (2363.76s)
around this other matter and then they
[39:26] (2366.00s)
said wait what's causing that and that's
[39:27] (2367.84s)
why they found dark matter so these
[39:29] (2369.44s)
ideas you know the idea that LLM could
[39:32] (2372.24s)
actually do that
[39:34] (2374.00s)
come up with something so novel is it
[39:36] (2376.32s)
doesn't it feels like we might be right
[39:37] (2377.76s)
there, right? Like we're kind of on the
[39:38] (2378.88s)
cusp of it.
[39:39] (2379.92s)
One of the seven most difficult problems
[39:41] (2381.76s)
in math or the most important problems
[39:43] (2383.36s)
in math is proving a general solution to
[39:45] (2385.60s)
this thing called Navier-Stokes, which
[39:47] (2387.20s)
is basically like viscous fluid dynamics
[39:49] (2389.20s)
and conservation of mass. We use it
[39:51] (2391.36s)
every day in the design of everything.
[39:53] (2393.12s)
You know what? It hasn't been proved.
[39:55] (2395.44s)
Isn't that the craziest thing where
[39:56] (2396.88s)
you're just like, how is this even
[39:58] (2398.00s)
possible? We use it to design airplanes,
[39:59] (2399.68s)
to design everything. It hasn't been
[40:00] (2400.96s)
proved. And so you could just point a
[40:03] (2403.20s)
computer at this thing and you would
[40:04] (2404.32s)
unlock all these incredible mysteries of
[40:07] (2407.68s)
the universe and we would probably find
[40:10] (2410.32s)
completely different propulsion systems.
[40:12] (2412.64s)
We could probably do things that we
[40:14] (2414.16s)
didn't think were possible.
[40:15] (2415.60s)
Teleportation I mean who knows what's
[40:17] (2417.92s)
possible.
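For reference, the open Millennium Prize question is whether smooth solutions to the incompressible Navier-Stokes equations always exist in three dimensions; the equations themselves are just a momentum balance plus conservation of mass:

```latex
% Incompressible Navier-Stokes: u = velocity, p = pressure,
% rho = density, nu = kinematic viscosity, f = body forces.
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  &= -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f}
  && \text{(momentum)} \\
\nabla\cdot\mathbf{u} &= 0 && \text{(conservation of mass)}
\end{aligned}
```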
[40:18] (2418.48s)
But remember, you know how Elon
[40:22] (2422.00s)
talks about Grok and about AI
[40:24] (2424.32s)
generally is about why are we here? What
[40:28] (2428.08s)
is the purpose?
[40:29] (2429.52s)
Meaning of the universe. Yeah. What is
[40:30] (2430.80s)
the meaning of the universe? How does it
[40:33] (2433.28s)
And a sort of fierce truth-seeking
[40:35] (2435.12s)
mechanism there.
[40:36] (2436.32s)
Let me ask you a question, Keith,
[40:38] (2438.08s)
Travis, Jason. If you guys were running
[40:41] (2441.44s)
Grok 4,
[40:43] (2443.36s)
that'd be so much fun.
[40:46] (2446.16s)
How do you judo flip OpenAI? Because they
[40:52] (2452.64s)
are marching steadfastly towards a
[40:55] (2455.12s)
billion MAU, then a billion DAU.
[40:59] (2459.36s)
It's a juggernaut. So, how do you use
[41:02] (2462.16s)
the better product in a moment to judo
[41:07] (2467.36s)
the less better product?
[41:09] (2469.76s)
Look. Yeah. I mean, here's the thing,
[41:11] (2471.36s)
right? So, you do the Elon way. So you
[41:14] (2474.40s)
get a bunch of missionary, like,
[41:17] (2477.76s)
full-on missionary engineers that work
[41:20] (2480.80s)
twice as hard and you have a culture
[41:23] (2483.84s)
that is ultra fierce truth-seeking
[41:27] (2487.92s)
and you don't get caught up in
[41:31] (2491.20s)
politics, bureaucracy, BS
[41:35] (2495.20s)
and you just go for it and I
[41:37] (2497.84s)
think you know that's where you know and
[41:40] (2500.56s)
then you go wow scientific breakthrough
[41:42] (2502.56s)
scientific method, like you start
[41:44] (2504.64s)
winning on truth and that will start I
[41:48] (2508.08s)
believe that will start to give the
[41:50] (2510.96s)
product awesomeness
[41:53] (2513.04s)
of OpenAI a run for its money
[41:56] (2516.48s)
but like the product of OpenAI, the
[41:59] (2519.04s)
product department those guys are
[42:00] (2520.96s)
crushing
[42:02] (2522.32s)
they're really good they're not only
[42:04] (2524.48s)
ahead of the game but they feel like
[42:06] (2526.40s)
they're just leading in a lot of
[42:08] (2528.32s)
different ways but if you are better at
[42:10] (2530.56s)
truth you'll
[42:12] (2532.72s)
eventually have an AI product manager.
[42:14] (2534.88s)
Yeah. And on a technical basis too,
[42:16] (2536.88s)
people forget how good Elon is at
[42:19] (2539.20s)
factories and physical real world
[42:22] (2542.08s)
things.
[42:23] (2543.12s)
Uh what he did standing up Colossus made
[42:26] (2546.00s)
like Jensen Huang was like how is this
[42:28] (2548.80s)
possible that you did this right? So
[42:30] (2550.72s)
impressive. His ability to build
[42:32] (2552.56s)
factories and he said many times like
[42:34] (2554.88s)
the factory is the product of Tesla.
[42:36] (2556.96s)
It's not the cars that come out of the
[42:38] (2558.56s)
factory or the batteries. It's the
[42:40] (2560.16s)
factory itself. So if he can keep
[42:42] (2562.80s)
solving the energy problem with solar on
[42:45] (2565.04s)
one side and batteries and standing up,
[42:49] (2569.28s)
you know, Colossus 2, three, four, five,
[42:51] (2571.68s)
he's going to have a massive advantage
[42:53] (2573.20s)
there on top of Travis, you know, the
[42:55] (2575.44s)
missionary
[42:57] (2577.12s)
individuals, which by the way was what
[42:59] (2579.12s)
he backed before Sam Altman corrupted the
[43:02] (2582.08s)
original missionary basis of OpenAI and
[43:04] (2584.24s)
made it closed AI and a,
[43:06] (2586.00s)
you know, this is nothing derogatory
[43:07] (2587.44s)
towards him, but he did hoodwink and
[43:09] (2589.76s)
stab Elon in the back. Nothing
[43:11] (2591.76s)
personal. I mean, he just screwed him
[43:13] (2593.44s)
over. And
[43:13] (2593.84s)
would you say he bamboozled him?
[43:15] (2595.52s)
He bamboozled him, screwed him,
[43:18] (2598.32s)
hoodwinked him, you know, but pick
[43:20] (2600.56s)
your term here. But, uh, he did him
[43:23] (2603.12s)
dirty. The original mission was
[43:25] (2605.20s)
to be missionary and open source all
[43:27] (2607.20s)
this content. That's the other piece I
[43:29] (2609.12s)
think is a wild card. And then
[43:31] (2611.52s)
I'll second Keith's position, but
[43:33] (2613.68s)
open sourcing some of this could have
[43:36] (2616.24s)
profound ramifications. I think open
[43:37] (2617.92s)
sourcing the self-driving data could
[43:40] (2620.72s)
have a really profound impact.
[43:44] (2624.08s)
Elon wanted to do something really
[43:45] (2625.28s)
disruptive like he open sourced his
[43:46] (2626.64s)
patents for you know um charging. If he
[43:49] (2629.12s)
open sources the data set in
[43:50] (2630.48s)
self-driving does anybody have the
[43:52] (2632.32s)
ability to produce robo taxis at the
[43:53] (2633.92s)
scale he can do it? I don't think so.
[43:56] (2636.80s)
If Travis's hypothesis is true then yeah
[43:58] (2638.88s)
everybody will. Well, what? Sorry.
[44:01] (2641.92s)
Everybody will what? Chamath,
[44:03] (2643.84s)
if you have access to the money that
[44:05] (2645.44s)
buys the compute, everyone could solve
[44:07] (2647.28s)
that problem.
[44:08] (2648.24s)
What's the hardware piece I'm talking
[44:09] (2649.60s)
about?
[44:10] (2650.24s)
He said if he published
[44:12] (2652.08s)
all the FSD data, could somebody build
[44:14] (2654.96s)
an autonomous vehicle?
[44:17] (2657.04s)
Well, yes, but could somebody produce
[44:19] (2659.20s)
100 million robo taxis from a factory
[44:21] (2661.36s)
with batteries in them?
[44:22] (2662.88s)
Okay. No, that's a
[44:24] (2664.16s)
different question.
[44:24] (2664.56s)
That's the thing I'm saying. And not
[44:25] (2665.84s)
really because last time I was a guest
[44:27] (2667.68s)
on, you know, we talked about vertical
[44:29] (2669.92s)
integration.
[44:31] (2671.36s)
Uh products really require vertical
[44:32] (2672.88s)
integration. So ultimately you have a
[44:36] (2676.72s)
self-driving something that is
[44:39] (2679.20s)
custom-built for knowing it's going to be
[44:41] (2681.52s)
self-driving and it interacts
[44:43] (2683.28s)
differently. The cost structure is
[44:44] (2684.88s)
different. The controls are different.
[44:46] (2686.32s)
The seating's different. Everything. You
[44:48] (2688.08s)
build a product taking advantage of
[44:49] (2689.84s)
where in the stack you have the most
[44:51] (2691.12s)
competitive advantage, but then you
[44:52] (2692.80s)
leverage that and it reinforces. It's
[44:54] (2694.96s)
still why Apple, despite missing the
[44:56] (2696.64s)
AI wave, is still a pretty good company from
[44:58] (2698.80s)
any empirical standpoint. I mean like
[45:01] (2701.04s)
the performance is absolutely miserable
[45:02] (2702.80s)
on the most important technology
[45:04] (2704.64s)
breakthrough of the last 70 years, but
[45:06] (2706.96s)
the company's still alive and still
[45:08] (2708.48s)
worth trillions of dollars because it's
[45:09] (2709.84s)
vertically integrated. OpenAI, to your
[45:12] (2712.32s)
point, they do have a good product team
[45:14] (2714.32s)
and they need to stay ahead on the
[45:16] (2716.32s)
product level because they can't compete
[45:18] (2718.00s)
on the factory level. The way to stay
[45:21] (2721.12s)
ahead on the product level is shipping a
[45:22] (2722.72s)
device. They got to ship the device.
[45:24] (2724.32s)
It's got to be good. It's got to be
[45:25] (2725.52s)
right. It's got to be the right form
[45:26] (2726.56s)
factor. It's got to do things for humans
[45:28] (2728.24s)
that are unexpected. But then if they do
[45:30] (2730.08s)
that, they're like Apple plus AI.
[45:32] (2732.88s)
Chamath, what's the paper you were
[45:34] (2734.32s)
talking about before? What was the name
[45:35] (2735.52s)
of it again?
[45:36] (2736.80s)
The Bitter Lesson.
[45:38] (2738.08s)
Yeah. How it could apply to autonomous driving
[45:40] (2740.48s)
is right now it's still like hey how do
[45:42] (2742.80s)
I drive like a human we talked about
[45:44] (2744.24s)
that but the leapfrog moment here could
[45:46] (2746.72s)
be like hey drive a car make sure it's
[45:49] (2749.04s)
efficient don't hit anybody and just
[45:52] (2752.56s)
simulate that a quadrillion times and
[45:55] (2755.84s)
it's all good right but right now we're
[45:58] (2758.32s)
still trying to drive like humans
[45:59] (2759.76s)
because we don't have enough data and
[46:02] (2762.32s)
therefore can't do enough compute
[46:04] (2764.16s)
that's the global lesson by the way
[46:06] (2766.16s)
you're totally right, conceptually the
[46:08] (2768.24s)
blog post is right. But that's only true
[46:10] (2770.40s)
when you have enough data. And depending
[46:12] (2772.08s)
upon the use case, the level of data you
[46:14] (2774.08s)
need may not be possible for years,
[46:16] (2776.48s)
decades, and you may need to hack your
[46:18] (2778.24s)
way there through human interactions.
[46:20] (2780.00s)
Yeah, physical world AI is lacking in
[46:24] (2784.40s)
data and so you just try to approximate
[46:26] (2786.40s)
humans.
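A toy sketch of the difference being described: instead of imitating human driving data, you score simulated rollouts against a simple objective (don't crash, be efficient) and search for the policy that does best. Every name and number below is an illustrative placeholder, not anyone's actual stack.

```python
# Toy sketch of objective-driven simulation for driving, as opposed to
# imitating human trajectories. All values are illustrative placeholders.
import random

def rollout(aggression, steps=1_000):
    """Simulate one drive; faster driving saves time but raises crash risk."""
    crash_prob = 0.0001 * aggression
    crashed = any(random.random() < crash_prob for _ in range(steps))
    time_taken = steps / aggression
    return crashed, time_taken

def score(aggression, n_rollouts=200):
    """Objective: heavily penalize crashes, mildly penalize wasted time."""
    total = 0.0
    for _ in range(n_rollouts):
        crashed, t = rollout(aggression)
        total += (-10_000.0 if crashed else 0.0) - t
    return total / n_rollouts

# Crude policy search; a real system would do RL over far richer policies
# and vastly more simulated miles.
best_score, best_policy = max((score(a), a) for a in [0.5, 1.0, 1.5, 2.0, 2.5])
print(f"best aggression setting: {best_policy} (avg score {best_score:.1f})")
```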
[46:27] (2787.12s)
I don't know if you guys have seen this.
[46:29] (2789.04s)
In related news, OpenAI and uh
[46:31] (2791.92s)
Perplexity are going after the browser.
[46:33] (2793.92s)
Perplexity launched Comet for their $200 a
[46:37] (2797.52s)
month tier. I actually downloaded it. I'll
[46:39] (2799.28s)
show it to you in a second. But this is
[46:41] (2801.60s)
um a really interesting category. It's
[46:44] (2804.48s)
something developers can do already and
[46:46] (2806.32s)
they do it all the time, you know, but
[46:48] (2808.40s)
having your browser uh connected to
[46:51] (2811.92s)
agents lets you do really interesting
[46:54] (2814.00s)
things. I'll show you an example here
[46:55] (2815.36s)
that I just fired off while we're
[46:56] (2816.88s)
talking. So, I just asked it, hey, give
[46:58] (2818.48s)
me the best flights from um United
[47:01] (2821.04s)
Airlines and uh business class from New
[47:04] (2824.64s)
York City, from San Francisco to New
[47:06] (2826.88s)
York City. It does some searches, but
[47:08] (2828.32s)
what you see here is it's popped up a
[47:10] (2830.24s)
browser window and it's actually doing
[47:12] (2832.32s)
that work and you can see the steps it's
[47:14] (2834.56s)
using and then I can actually open that
[47:16] (2836.56s)
browser window and watch it do that.
[47:18] (2838.72s)
This is just a screenshot of it and it
[47:20] (2840.56s)
will open multiple of these. So you
[47:22] (2842.08s)
could... I was doing a search the other day
[47:23] (2843.44s)
saying like, "Hey, tell me all the
[47:24] (2844.56s)
autobiographies I haven't bought on
[47:26] (2846.32s)
Amazon. Put them into my, you know,
[47:29] (2849.28s)
shopping cart and summarize each of them
[47:31] (2851.28s)
cuz I like biographies and I like doing
[47:34] (2854.00s)
it here." And when it did this last
[47:36] (2856.00s)
time, it put my flight into
[47:40] (2860.48s)
like uh and I was logged in under my
[47:42] (2862.40s)
account and it basically put it into my
[47:44] (2864.48s)
account in the checkout.
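For context, the scripted version of this that developers already write by hand looks roughly like the sketch below, using Playwright as one common browser-automation library; the agentic products essentially wrap a model around this loop and run it in your own logged-in browser. The site URL and selectors here are placeholders, not Comet's internals.

```python
# Rough sketch of hand-rolled browser automation, the kind of thing an agentic
# browser does for you. Playwright is one common choice; the URL and selectors
# below are illustrative placeholders.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    # headless=False so you can watch each step, like the pop-up browser window
    browser = p.chromium.launch(headless=False)
    page = browser.new_page()
    page.goto("https://example.com/flights")   # placeholder flight-search site
    page.fill("#origin", "SFO")                # placeholder selectors
    page.fill("#destination", "JFK")
    page.click("text=Search")
    page.wait_for_selector(".result")
    print(page.locator(".result").all_text_contents()[:5])
    browser.close()
```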
[47:47] (2867.36s)
So again, this isn't like if you're a
[47:49] (2869.68s)
developer, you do this all day long, but
[47:52] (2872.72s)
this really seems to be a new product
[47:55] (2875.60s)
category. I'm curious if you guys have
[47:57] (2877.20s)
played with it yet. And then what your
[47:59] (2879.52s)
thoughts are on having an agentic
[48:01] (2881.60s)
browser like this available to you to be
[48:04] (2884.00s)
doing these tasks
[48:06] (2886.40s)
in real time. You can also connect
[48:08] (2888.00s)
obviously your Gmail, your calendar to
[48:09] (2889.84s)
it. So I did a search, tell me every
[48:12] (2892.96s)
restaurant I've been to and then put it
[48:14] (2894.56s)
by city. And then I was going to open my
[48:16] (2896.80s)
OpenTable and then pull that data as
[48:18] (2898.64s)
well. What's interesting about this,
[48:22] (2902.32s)
Keith, and I know you're a product guy
[48:23] (2903.92s)
and done a lot of product work. I'm
[48:25] (2905.92s)
curious your thoughts on it. You don't
[48:29] (2909.04s)
have to do this in the cloud. You're
[48:31] (2911.52s)
authenticated already into a lot of your
[48:33] (2913.84s)
accounts, nor do you have to worry about
[48:36] (2916.00s)
being blocked by these services because
[48:38] (2918.16s)
it doesn't look like a scraper or a bot.
[48:40] (2920.32s)
It's just your browser doing the
[48:42] (2922.08s)
work. Your thoughts on this? Have you
[48:43] (2923.20s)
played with it at all? Yep. I think it's a
[48:44] (2924.88s)
great hail mary attempt by Perplexity. I
[48:48] (2928.32s)
think up since something like this,
[48:49] (2929.76s)
perplexity is toast. Like for the stat
[48:52] (2932.00s)
about chat GBT is going to a billion
[48:53] (2933.92s)
users like it's becoming the verb, you
[48:56] (2936.16s)
know, that the way you describe using AI
[48:58] (2938.24s)
for a normal consumer. There's nothing
[49:00] (2940.08s)
left of perplexity if they can't pull
[49:01] (2941.84s)
this off. So, it's a great idea because
[49:04] (2944.24s)
like the history of like consumer
[49:06] (2946.64s)
technology companies is whoever's up has
[49:08] (2948.88s)
the high ground like in a military sense,
[49:11] (2951.12s)
whoever is first has a lot of control.
[49:12] (2952.80s)
This is actually what Google should be
[49:14] (2954.00s)
doing truthfully like I think Google's
[49:15] (2955.92s)
also... Google search is toast
[49:19] (2959.20s)
and since they have Chrome and they
[49:22] (2962.00s)
theoretically have a quality team in
[49:23] (2963.92s)
Gemini they should be putting these two
[49:25] (2965.68s)
things together and hoping to compete
[49:26] (2966.96s)
with ChatGPT. They're going to lose the
[49:29] (2969.12s)
search game like the assets that are
[49:30] (2970.80s)
best at Google right now have nothing to
[49:32] (2972.88s)
do with search. It's every other product.
[49:34] (2974.40s)
That's the only thing that's going to save
[49:35] (2975.52s)
that company, if they can figure out
[49:36] (2976.88s)
how to use them.
[49:40] (2980.32s)
Travis your thoughts on this category
[49:42] (2982.64s)
anything come to mind for you in terms of
[49:46] (2986.32s)
you know, feature sets that would be
[49:48] (2988.64s)
extraordinary here? I know you like
[49:50] (2990.96s)
to think about products and the consumer
[49:52] (2992.48s)
experience.
[49:54] (2994.72s)
It's really interesting. So, you know,
[49:56] (2996.88s)
I've been spending, as you guys know,
[49:58] (2998.40s)
I've been spending my time on real
[50:00] (3000.48s)
estate and construction and robotics.
[50:02] (3002.88s)
And so, I've been out of this
[50:04] (3004.88s)
kind of consumer software game for a
[50:06] (3006.88s)
long time. But super interesting over
[50:08] (3008.64s)
the last six months there have been a
[50:12] (3012.00s)
number of consumer software CEOs
[50:16] (3016.16s)
like when I hang out with them or
[50:17] (3017.44s)
whatever they're like
[50:19] (3019.60s)
yo, how are we going
[50:21] (3021.36s)
to keep doing what we do when the agents
[50:23] (3023.52s)
take over.
[50:24] (3024.80s)
Yeah. The paradigm shift is so profound
[50:28] (3028.16s)
that the idea that you would visit a web
[50:29] (3029.84s)
page goes away and you're just in a chat
[50:31] (3031.52s)
dialogue. You have an agent that's just
[50:33] (3033.28s)
taking care of your flights for you. So,
[50:36] (3036.24s)
I kind of think there's a leapfrog
[50:39] (3039.12s)
over that. I think
[50:41] (3041.44s)
it's just like you tell something, yo, I
[50:43] (3043.68s)
want to go to New York. Can you you
[50:45] (3045.68s)
know, I'm sort of looking at this time
[50:47] (3047.12s)
range. Can you just go find something
[50:49] (3049.28s)
I'm probably going to like and give me a
[50:50] (3050.72s)
couple options?
[50:52] (3052.40s)
Yeah. And it's just a whole you have an
[50:54] (3054.64s)
interface and then you know, is Perplexity,
[50:59] (3059.12s)
is this thing that you just showed,
[51:00] (3060.32s)
is that the interface or do I
[51:03] (3063.28s)
just have an agent that just goes and
[51:04] (3064.64s)
does everything for me and is this the
[51:07] (3067.36s)
start of that? I you know I just haven't
[51:09] (3069.60s)
spent enough time. I do know that
[51:12] (3072.32s)
every consumer
[51:14] (3074.32s)
software CEO
[51:16] (3076.80s)
that has an app in the app store is
[51:18] (3078.72s)
tripping. They're tripping right now.
[51:21] (3081.04s)
And I mean big boys. I meet guys with
[51:23] (3083.20s)
real stuff and sometimes I'm doing
[51:25] (3085.76s)
like almost like therapy sessions with
[51:27] (3087.84s)
them. I'm like, "It's going to be fine.
[51:29] (3089.52s)
You actually have stuff.
[51:32] (3092.32s)
You have a moat. You have real stuff
[51:33] (3093.92s)
that's of value. They can't replace it
[51:35] (3095.92s)
with an agent."
[51:36] (3096.88s)
And they're like,
[51:37] (3097.28s)
"So, you're lying to them. You're doing
[51:38] (3098.80s)
hospice care and you're telling them
[51:40] (3100.08s)
everything's going to be okay, but the
[51:41] (3101.84s)
patient
[51:43] (3103.44s)
options on Robinhood while he's like,
[51:44] (3104.88s)
"Yeah, yeah, tell me more. Tell me
[51:46] (3106.24s)
more."
[51:47] (3107.68s)
All these things.
[51:48] (3108.48s)
There's certain things that are
[51:49] (3109.76s)
protected and there's certain things
[51:51] (3111.04s)
that aren't. That's all.
[51:52] (3112.08s)
Well, let's talk about that because
[51:53] (3113.84s)
you and I are old enough to remember uh
[51:56] (3116.24s)
General Magic. This vision was out there
[51:58] (3118.48s)
a long time ago with personal digital
[52:00] (3120.88s)
assistants and you would just talk to an
[52:03] (3123.12s)
agent. It would go do this for you. This
[52:05] (3125.20s)
feels like a step to that where it does
[52:08] (3128.16s)
all the work for you, presents you the
[52:10] (3130.56s)
final moment and says approve.
[52:13] (3133.68s)
Like a concierge or a butler. Yeah,
[52:15] (3135.60s)
I think what you're describing is what
[52:17] (3137.76s)
we want. But I think more specifically
[52:19] (3139.92s)
for today, Keith and Travis totally nail
[52:24] (3144.16s)
it. Look, I think building a browser is
[52:26] (3146.96s)
an absolutely stupid capital allocation
[52:29] (3149.20s)
decision. Just totally stupid and
[52:31] (3151.28s)
unjustifiable in 2025. Specifically for
[52:34] (3154.40s)
Perplexity,
[52:36] (3156.16s)
I think their path to building a legacy
[52:38] (3158.96s)
business is to replace Bloomberg.
[52:41] (3161.68s)
Everything that they've done in
[52:43] (3163.76s)
financial information and financial data
[52:46] (3166.56s)
in going beyond the model has been
[52:49] (3169.20s)
excellent. As somebody who's paid
[52:50] (3170.88s)
$25,000 to Bloomberg for many years,
[52:54] (3174.56s)
the terminal is atrocious. It's
[52:57] (3177.84s)
terrible. It's not very good. It's very
[53:00] (3180.56s)
limited.
[53:02] (3182.16s)
and anybody that could build a better
[53:04] (3184.88s)
product would take over a hundred
[53:09] (3189.28s)
billion dollar enterprise because I
[53:10] (3190.88s)
think it's there for the taking. I wish
[53:12] (3192.24s)
that Perplexity would double and triple
[53:14] (3194.48s)
down on that. And so when you see this
[53:15] (3195.92s)
kind of
[53:16] (3196.88s)
random sprawl,
[53:17] (3197.92s)
let's do it, Chamath. Let's just go do
[53:20] (3200.32s)
When you do the random sprawl, I think
[53:21] (3201.60s)
it doesn't work. But I just want to say
[53:23] (3203.04s)
like a browser is like the dumbest thing
[53:25] (3205.84s)
to build in 2025 because in a world of
[53:28] (3208.64s)
agents, what is a browser? It's a
[53:31] (3211.36s)
glorified markup reader. It's like
[53:33] (3213.60s)
handling HTML. It's handling CSS and
[53:36] (3216.00s)
JavaScript. It's doing some networking.
[53:38] (3218.56s)
It's doing some security. It's doing
[53:40] (3220.40s)
some rendering. But it's like this is
[53:42] (3222.88s)
all under the water type stuff. I get it
[53:46] (3226.16s)
that we had to deal with all that
[53:47] (3227.68s)
nonsense in
[53:51] (3231.68s)
to try LOS or Google for the first time.
[53:54] (3234.32s)
But in 2025,
[53:56] (3236.80s)
there's something that you just speak to
[54:00] (3240.24s)
and eventually there's probably
[54:01] (3241.44s)
something that's in your brain which you
[54:02] (3242.96s)
just think and it just does it. You're
[54:04] (3244.96s)
thinking
[54:06] (3246.48s)
I need a flight to JFK or at the maximum
[54:09] (3249.84s)
today in a very elegant beautiful search
[54:12] (3252.24s)
bar you type in get me a flight and it
[54:15] (3255.20s)
already knows what to do. Keith, in some
[54:16] (3256.96s)
ways this is a step towards that
[54:19] (3259.36s)
ultimate vision. So you'd think it's
[54:21] (3261.20s)
worth it, you know, for Perplexity
[54:21] (3261.20s)
to make this waypoint perhaps if you
[54:25] (3265.84s)
look at it as a waypoint between the
[54:27] (3267.68s)
ultimate vision which is a command line
[54:29] (3269.76s)
and an earpiece.
[54:31] (3271.28s)
How do you get distribution Jason for
[54:32] (3272.80s)
the 19th web browser in 2025? Well,
[54:35] (3275.36s)
yeah, that is a challenge and I think
[54:37] (3277.04s)
most people are speculating Apple, which
[54:40] (3280.24s)
has a lot of users, might buy Perplexity
[54:44] (3284.24s)
or do a deal with Perplexity and give
[54:46] (3286.08s)
them that distribution because of the
[54:48] (3288.64s)
Justice Department case against Google.
[54:50] (3290.96s)
So, there's been a lot of speculation
[54:52] (3292.08s)
about that. But Keith, what do you
[54:53] (3293.12s)
think?
[54:53] (3293.76s)
Well, I don't think they'd buy anything
[54:55] (3295.28s)
worth it. Like, what is Apple
[54:56] (3296.64s)
going to get if you continue this failed
[54:58] (3298.32s)
strategy of Apple,
[54:59] (3299.60s)
right? Apple has missed every possible
[55:01] (3301.92s)
window on AI and continues to miss it
[55:04] (3304.16s)
and it has cultural... I think the CEO has
[55:07] (3307.28s)
challenges I think culturally they have
[55:08] (3308.72s)
challenges I think they have
[55:09] (3309.44s)
infrastructure challenges so it's
[55:12] (3312.00s)
not an easy fix but buying Perplexity is
[55:14] (3314.24s)
not going to help. Like, the strategy is
[55:15] (3315.76s)
actually a pretty coherent one for
[55:17] (3317.28s)
Perplexity qua Perplexity, uh, so I think
[55:20] (3320.08s)
that it's not
[55:20] (3320.96s)
pick a vertical and own it strategy
[55:23] (3323.60s)
Not a bad idea, um, especially because
[55:25] (3325.76s)
you need unique data sources some of
[55:27] (3327.28s)
those data sources may or may not
[55:29] (3329.36s)
license their data to OpenAI. So, you
[55:32] (3332.16s)
can do some clever things there, but um
[55:34] (3334.32s)
I don't think there's any residual value
[55:36] (3336.32s)
that Apple would get out of Perplexity
[55:38] (3338.00s)
except there's some product taste, but
[55:39] (3339.76s)
what are you going to spend like a
[55:40] (3340.80s)
billion dollars for product taste? I
[55:42] (3342.32s)
mean, Mark's spending hundreds of
[55:43] (3343.92s)
millions of dollars, hundreds of billions
[55:45] (3345.12s)
of dollars or whatever he's spending
[55:46] (3346.32s)
these days. And you know, Grok, if
[55:48] (3348.08s)
anything, Grok 4 shows that Mark
[55:50] (3350.32s)
really doesn't need to just spend money
[55:51] (3351.92s)
to build a whole new team because
[55:53] (3353.12s)
everything they've done in AI has also
[55:54] (3354.64s)
missed the boat. Well, I mean, Keith,
[55:56] (3356.88s)
the way you phrase it there almost makes
[55:58] (3358.40s)
it worth it for Apple to throw a Hail
[56:00] (3360.80s)
Mary, have a team with some taste
[56:03] (3363.04s)
because that's how they tend to do
[56:04] (3364.40s)
things: something that is elegant. And
[56:06] (3366.72s)
why not just throw your search to it,
[56:08] (3368.56s)
throw 10 billion at
[56:11] (3371.28s)
a bunch of... What's elegant would be if
[56:13] (3373.12s)
there's a bunch of agents and just a
[56:14] (3374.64s)
chat box.
[56:15] (3375.68s)
Seeing a bunch of visual diarrhea is not
[56:17] (3377.68s)
elegant.
[56:18] (3378.80s)
It's lazy.
[56:20] (3380.72s)
On our little Bloomberg clone,
[56:22] (3382.80s)
I'll give you naming rights. So you can
[56:24] (3384.96s)
call it that.
[56:25] (3385.52s)
You like polyhapatia. So
[56:28] (3388.40s)
hey, can somebody uh bring
[56:30] (3390.48s)
up the polyhapatia?
[56:32] (3392.88s)
You know what's so funny?
[56:34] (3394.48s)
Rolls right off your tongue.
[56:36] (3396.08s)
TK, listen. We were trying to do a
[56:38] (3398.16s)
screen
[56:39] (3399.92s)
of companies and it maxes out at five
[56:43] (3403.52s)
companies on a specific type of screen
[56:45] (3405.84s)
where you're like you're trying to
[56:46] (3406.88s)
compare stock price to EBITDA and you're
[56:48] (3408.88s)
like, okay, I can only choose five, I
[56:50] (3410.96s)
guess. So which five should I choose?
[56:52] (3412.40s)
Lefont was on right like two episodes
[56:54] (3414.08s)
ago. He was like I can't pull this up.
[56:55] (3415.84s)
It's limited to six companies.
[56:57] (3417.68s)
Dude, it's... So what do people use
[56:59] (3419.92s)
Bloomberg?
[57:01] (3421.12s)
They use it for the messaging. Now like
[57:03] (3423.04s)
my team has traded huge positions via
[57:05] (3425.36s)
text message on Bloomberg. So there is
[57:07] (3427.04s)
something very valuable there.
[57:09] (3429.12s)
But the core usability and the core UI
[57:11] (3431.92s)
of that company has not evolved.
[57:13] (3433.76s)
I have my contribution
[57:15] (3435.28s)
and Perplexity is very good at that by the
[57:17] (3437.60s)
way. They do a very good job.
[57:20] (3440.00s)
I got a new domain name Travis. Let this
[57:21] (3441.84s)
one just sink in here. This is my way to
[57:23] (3443.60s)
weasel my way into the deal.
[57:25] (3445.44s)
begin.com.
[57:26] (3446.80s)
Begin.com.
[57:27] (3447.76s)
You own that, don't you?
[57:29] (3449.20s)
I do. I'm just a little I sniped some
[57:31] (3451.76s)
good ones once in a while. I got
[57:32] (3452.96s)
begin.com and I got annotated.com. Those
[57:35] (3455.12s)
are my two little domains.
[57:36] (3456.24s)
You're like one of these old
[57:37] (3457.52s)
people that show up at those show and
[57:41] (3461.92s)
show and you're like, "Oh, I have this
[57:43] (3463.76s)
thing that I bought in 1845."
[57:46] (3466.16s)
Guys, Jason is the daddy
[57:49] (3469.12s)
in GoDaddy. Okay. I'm your dad. I'm
[57:51] (3471.84s)
your daddy.
[57:52] (3472.24s)
That's what it is.
[57:52] (3472.96s)
Who's your daddy? Hey, speaking of
[57:55] (3475.36s)
daddy.
[57:56] (3476.56s)
Let's go on to our next story.
[57:58] (3478.80s)
Come on.
[58:01] (3481.36s)
Is now the right time for a third party?
[58:03] (3483.76s)
Elon seems to think so. Last week, he
[58:06] (3486.96s)
announced that he would be creating a
[58:09] (3489.28s)
new political party. I'll let you decide
[58:10] (3490.72s)
who daddy is in this one. uh he said
[58:14] (3494.56s)
quote when it comes to bankrupting our
[58:16] (3496.88s)
country with waste and graft we live in
[58:18] (3498.80s)
a one party system not a democracy he's
[58:21] (3501.60s)
not yet outlined, uh, a platform for the
[58:25] (3505.04s)
American Party. We talked about it here
[58:26] (3506.48s)
last week I listed four core values
[58:28] (3508.64s)
which seem to get a good reaction on X
[58:31] (3511.76s)
fiscal responsibility and DOGE, sustainable
[58:34] (3514.16s)
energy and dominance in
[58:36] (3516.08s)
manufacturing in the US which Elon has
[58:38] (3518.00s)
done uh single-handedly here pronatalism
[58:41] (3521.36s)
which I think is a passion project for
[58:43] (3523.60s)
him and Chamath you punched it out with
[58:45] (3525.36s)
the fifth technological excellence
[58:47] (3527.92s)
according to Polymarket, 55% chance that
[58:50] (3530.48s)
Elon registers the American Party by the
[58:53] (3533.12s)
end of the year and you know one thing I
[58:56] (3536.24s)
was trying to figure out is just how
[58:57] (3537.68s)
unpopular are these candidates
[59:00] (3540.56s)
and uh these political parties this is a
[59:03] (3543.12s)
very interesting chart that I think we
[59:04] (3544.56s)
could have a great conversation around
[59:06] (3546.64s)
it turns out we used to love our
[59:08] (3548.40s)
presidents if you look here from Kennedy
[59:10] (3550.40s)
at 83%, his highest approval rating.
[59:13] (3553.44s)
His lowest was 56%.
[59:16] (3556.00s)
That was his lowest approval rating. So
[59:17] (3557.92s)
he operated in a very high band. Look at
[59:20] (3560.32s)
Bush 2 after 9/11. 92% was his
[59:24] (3564.72s)
peak. His lowest was 19, right? Wartime
[59:27] (3567.36s)
president. But then you get to Trump
[59:29] (3569.52s)
one, Biden, and Trump 2. Historically
[59:33] (3573.04s)
low high approval. Their high watermark.
[59:36] (3576.08s)
49 for Trump 1, 63 for Biden,
[59:39] (3579.84s)
and then 47 for Trump 2, and their
[59:43] (3583.68s)
lowest: 29, 31, and 40. So maybe it is time for
[59:48] (3588.48s)
a third party candidate. Let's discuss
[59:51] (3591.04s)
it boys.
[59:51] (3591.84s)
I have no idea how to read this graph. I
[59:54] (3594.56s)
have zero idea. The worst.
[59:56] (3596.16s)
I'm like what is happening here?
[59:58] (3598.24s)
This is the worst formatted chart. This
[60:00] (3600.64s)
is a confusing chart. But well, the
[60:03] (3603.20s)
reason I'm putting it up is for debate.
[60:04] (3604.88s)
So you should be saying thank you.
[60:06] (3606.08s)
Yeah, we're debating it. It's creating
[60:07] (3607.76s)
debate. Why did you put it up? Here's
[60:09] (3609.36s)
another one. Gallup poll. Americans
[60:11] (3611.20s)
desire for a viable third party. 63% in
[60:14] (3614.16s)
2023. So, it's bumping along an
[60:16] (3616.80s)
all-time high.
[60:17] (3617.60s)
Okay. I'm really concentrating on this
[60:20] (3620.16s)
Okay. Anyway, I'm going to stop there.
[60:22] (3622.00s)
What's the gray?
[60:23] (3623.60s)
I'm going to let you.
[60:25] (3625.04s)
Okay. Okay. Got
[60:26] (3626.16s)
the different presidents during that time
[60:28] (3628.64s)
period and how popular parties were.
[60:30] (3630.64s)
Let's stop here. This is
[60:32] (3632.24s)
a good place to stop.
[60:34] (3634.56s)
Yeah. A couple points.
[60:36] (3636.16s)
Yes. The idea of Elon creating a third
[60:38] (3638.24s)
party is for any other human being like
[60:40] (3640.88s)
absolutely absurd and ridiculous. Elon
[60:42] (3642.64s)
has obviously done incredible things. So
[60:44] (3644.80s)
dismissing anything he's touching is a
[60:46] (3646.56s)
bad idea. However, I think the best
[60:48] (3648.80s)
metaphor I've seen is it's a little bit
[60:50] (3650.64s)
like Michael Jordan tried to play
[60:51] (3651.92s)
baseball
[60:52] (3652.72s)
and he became a replacement level
[60:54] (3654.40s)
baseball player, which is actually really
[60:55] (3655.60s)
hard to do. By the way, Elon is probably
[60:58] (3658.24s)
a replacement level politician. Um, he's
[61:00] (3660.80s)
Michael Jordan for entrepreneurial
[61:02] (3662.64s)
stuff, but the third party stuff is not
[61:05] (3665.52s)
going to work. First of all, um there
[61:08] (3668.88s)
that chart is misleading. It's a flaw of
[61:10] (3670.80s)
averages. Well, it's both badly designed
[61:12] (3672.32s)
and it's a flaw of averages. Politically,
[61:14] (3674.00s)
Trump is incredibly popular among
[61:15] (3675.92s)
Republicans. He actually has the highest
[61:17] (3677.44s)
approval rate of any Republican ever
[61:19] (3679.60s)
measured in recorded history. It's 95%.
[61:22] (3682.16s)
Reagan peaked out at 93%. It's just
[61:24] (3684.88s)
Democrats don't like him, which is
[61:26] (3686.56s)
perfectly fine. Being polarizing is
[61:29] (3689.28s)
an ingredient to being successful,
[61:30] (3690.80s)
including with people on the show. Like
[61:33] (3693.04s)
the point of accomplishing things in the
[61:35] (3695.04s)
world is you don't really care what half
[61:36] (3696.96s)
the world thinks. You need to make sure
[61:38] (3698.40s)
that there's a lot of people who like
[61:39] (3699.84s)
you and really approve and are
[61:41] (3701.36s)
enthusiastic about what you do. And
[61:42] (3702.88s)
Trump is about as popular with his party
[61:46] (3706.16s)
as anybody's ever been ever. Period.
[61:48] (3708.96s)
No exceptions. Secondly, um, MAGA
[61:52] (3712.08s)
has kind of already uh changed the
[61:54] (3714.72s)
Republican party. Trump is sort of like
[61:56] (3716.88s)
a third party takeover of the Republican
[61:59] (3719.36s)
party. And so it's kind of already
[62:01] (3721.20s)
happened and maybe you can do this every
[62:03] (3723.52s)
20 years or 30 years. I don't think you
[62:06] (3726.16s)
can have like this kind of
[62:07] (3727.52s)
transformation on one party within a too
[62:10] (3730.32s)
compressed period of time for a lot of
[62:12] (3732.16s)
reasons. Third is um really smart
[62:14] (3734.96s)
parties absorb the lesson of political
[62:16] (3736.64s)
science. Unfortunately I studied
[62:18] (3738.32s)
political science. I wasted kind of my
[62:19] (3739.84s)
college years instead of studying CS
[62:22] (3742.64s)
and you know maybe then I'd be coding
[62:24] (3744.08s)
stuff and doing physics like Travis. But
[62:26] (3746.16s)
one thing I did learn is smart parties
[62:28] (3748.96s)
absorb the best ideas of third parties.
[62:31] (3751.92s)
So the oxygen is usually not there
[62:34] (3754.72s)
because there's a Darwinistic evolution
[62:36] (3756.48s)
of if you get traction on an idea, it's
[62:38] (3758.88s)
really easy to conscript some of those
[62:42] (3762.32s)
ideas and take away the momentum. No
[62:44] (3764.96s)
third party candidate that's a true like
[62:47] (3767.28s)
third party has won a Senate seat since
[62:50] (3770.88s)
And that's actually Bill Buckley's
[62:53] (3773.28s)
brother. So he had some name ID. The
[62:55] (3775.60s)
other thing Elon I think is missing and
[62:57] (3777.28s)
the proponents of what he's doing is
[62:59] (3779.28s)
people vote not just for ideas, they
[63:01] (3781.12s)
vote for people. It's a combination. The
[63:03] (3783.76s)
product is what do you
[63:05] (3785.68s)
believe and who are you? And you can't
[63:08] (3788.40s)
divorce the two. Trump is a person and
[63:11] (3791.52s)
that generates a lot of enthusiasm and
[63:13] (3793.36s)
it's one of the reasons why he has
[63:14] (3794.72s)
challenges in midterms because he's not
[63:16] (3796.40s)
on the ballot. His ideas may be on the
[63:18] (3798.16s)
ballot but he is not specifically on the
[63:19] (3799.92s)
ballot. So unless because Elon can't be
[63:22] (3802.80s)
the figurehead of the party, he
[63:24] (3804.48s)
literally can't constitutionally.
[63:26] (3806.88s)
You need a face that's a person, Obama,
[63:28] (3808.96s)
a Clinton, like there's reasons why
[63:30] (3810.96s)
people resonate.
[63:32] (3812.48s)
Reagan
[63:34] (3814.24s)
without that personality. Specific ideas
[63:37] (3817.92s)
just are not going to galvanize the
[63:39] (3819.36s)
American people.
[63:40] (3820.72s)
Okay. So the counter to that and what
[63:43] (3823.92s)
people believe he's going to try to do
[63:45] (3825.52s)
is win a couple of seats in the House,
[63:47] (3827.36s)
Travis, win maybe one or two Senate
[63:49] (3829.84s)
seats if you were to do that. Those
[63:51] (3831.76s)
things are pretty affordable to back.
[63:55] (3835.04s)
Couple of million dollars for a House
[63:56] (3836.80s)
race. Senate maybe $25 million. If Elon
[64:02] (3842.24s)
puts, I don't know 250 million to work
[64:04] (3844.64s)
every two years, and I think he put
[64:06] (3846.56s)
280 million to work on the last one. He
[64:09] (3849.28s)
could kind of create the Joe Manchin
[64:11] (3851.84s)
moment and uh he could build a caucus, a
[64:15] (3855.68s)
platform.
[64:17] (3857.20s)
A Grover Norquist kind of pledge along
[64:19] (3859.92s)
these lines. So, what do you think of
[64:21] (3861.20s)
that? If he's not gonna create a viable
[64:24] (3864.16s)
third party presidential candidate,
[64:26] (3866.00s)
could he Travis pick off a couple of
[64:28] (3868.32s)
Senate seats, pick off a couple of
[64:30] (3870.00s)
congressional seats?
[64:31] (3871.04s)
Okay. So, first I have this axiom that
[64:33] (3873.12s)
I'm making up right now.
[64:35] (3875.20s)
Okay. It's called Elon is almost always
[64:37] (3877.44s)
right.
[64:38] (3878.64s)
Okay. All right.
[64:40] (3880.24s)
Elon was right about everything.
[64:42] (3882.08s)
Seriously, let's just be real. And like
[64:45] (3885.12s)
honestly the things he's upset about and
[64:46] (3886.88s)
that he's riled up about especially when
[64:49] (3889.04s)
you look at the deficit like man I am
[64:52] (3892.08s)
right on board that train. Part one.
[64:55] (3895.20s)
Part two.
[64:58] (3898.16s)
We've never had somebody with this kind
[65:00] (3900.48s)
of capital that can be a quote unquote
[65:04] (3904.96s)
party boss outside of the system. Right.
[65:12] (3912.24s)
There's a lot of people that agree with
[65:14] (3914.24s)
the types of things he's saying and he
[65:17] (3917.36s)
knows how to draw, you know, Elon in
[65:20] (3920.48s)
his own right kind of has a populist
[65:22] (3922.08s)
vibe like he does his thing and he's
[65:25] (3925.20s)
turned X into what it is and he's a
[65:28] (3928.96s)
big part of X. And so I
[65:32] (3932.80s)
think it's great and honestly there's
[65:34] (3934.88s)
the moves you can make on Senate
[65:36] (3936.72s)
and House and just having a few folks
[65:38] (3938.80s)
and then them being levers to
[65:41] (3941.36s)
get the things you want done. That's
[65:43] (3943.12s)
part one. And then part two of that is
[65:45] (3945.52s)
the threat of that happening can make
[65:47] (3947.76s)
good things happen separately even if it
[65:50] (3950.00s)
doesn't go all the way.
[65:51] (3951.20s)
I just love it. I'm on the train.
[65:54] (3954.16s)
Yeah. I'm in love with this role
[65:56] (3956.00s)
for Elon more than picking a party
[65:58] (3958.16s)
because he's picking a very specific
[65:59] (3959.84s)
platform that I think resonates with
[66:02] (3962.00s)
folks which is just balance the budget.
[66:04] (3964.56s)
Don't put us in so much debt and let's
[66:06] (3966.72s)
have some sustainable energy, you know,
[66:09] (3969.04s)
job done. Great job.
[66:10] (3970.96s)
The problem with that is like he's
[66:12] (3972.96s)
actually wrong about the reason why we
[66:15] (3975.28s)
have a deficit or debt.
[66:17] (3977.36s)
It's not because we're undertaxed. It's
[66:19] (3979.52s)
that we're massively overspending. If we just
[66:21] (3981.84s)
No, I think he believes we're
[66:22] (3982.96s)
overspending. They should have been
[66:24] (3984.72s)
supporting the last, you know, beautiful
[66:27] (3987.28s)
bill because if you just held federal
[66:30] (3990.00s)
spending to 2019 levels, so 2019 is not
[66:33] (3993.04s)
like, you know, decades ago,
[66:35] (3995.04s)
literally with our current tax revenues,
[66:36] (3996.96s)
we would be in a surplus,
[66:39] (3999.04s)
500 billion.
[66:40] (4000.32s)
Yeah. So all we need to do is cut
[66:42] (4002.64s)
spending. Now I admit that
[66:43] (4003.92s)
why didn't that happen with the big
[66:45] (4005.12s)
beautiful bill?
[66:46] (4006.08s)
So this is where details do matter. I
[66:50] (4010.40s)
think there is a willingness and a you
[66:52] (4012.24s)
know discipline problem on both parties
[66:54] (4014.08s)
and I think maybe he can help fix that.
[66:55] (4015.92s)
The second thing is that we have these
[66:57] (4017.04s)
arcane rules particularly in the Senate
[66:58] (4018.64s)
that you need 60 votes in many ways to
[67:01] (4021.44s)
cut things except through very hacky
[67:04] (4024.56s)
methods and that's a reality. So the
[67:07] (4027.60s)
best thing truthfully he could do is
[67:09] (4029.28s)
help get a Republican party to 60 votes
[67:11] (4031.44s)
and then in theory he could be
[67:14] (4034.00s)
absolutely furious if you didn't cut
[67:16] (4036.40s)
back to 2019 levels. But it's very
[67:19] (4039.52s)
tricky. Or you can just overrule like
[67:21] (4041.52s)
this. The filibuster is an artifact of
[67:23] (4043.76s)
history and at some point some majority
[67:26] (4046.88s)
leader is just going to say we're done
[67:28] (4048.16s)
with the filibuster and just steamroll
[67:30] (4050.32s)
through all the cuts at 50 or 51 votes
[67:32] (4052.88s)
which you can do. There's no
[67:34] (4054.32s)
constitutional right to a filibuster. It
[67:36] (4056.56s)
is an artifact of centuries of American
[67:38] (4058.64s)
history and at some point it's going to
[67:40] (4060.24s)
go away. So maybe the time is now. Maybe
[67:42] (4062.64s)
we should just fix everything now.
[67:44] (4064.08s)
I think you're exactly right. I think
[67:45] (4065.68s)
that the filibuster it's just a matter
[67:47] (4067.52s)
of time. I think it's on borrowed time.
[67:49] (4069.76s)
And I think in a world where it is on
[67:52] (4072.32s)
borrowed time, Jason, I think your path
[67:54] (4074.48s)
is probably the one that gives the
[67:57] (4077.12s)
American Party, if it does come into
[67:58] (4078.80s)
existence, the most leverage, which is
[68:01] (4081.28s)
if you control three to five independent
[68:04] (4084.48s)
candidates, you gain substantial
[68:06] (4086.16s)
leverage. I just want to take a step
[68:08] (4088.32s)
back and just note something. I don't
[68:10] (4090.48s)
know if you guys know this, but the only
[68:12] (4092.32s)
reason we're even having this
[68:14] (4094.48s)
conversation or this is even possible is
[68:17] (4097.28s)
because in 2023,
[68:19] (4099.76s)
the FEC, the Federal Election Commission,
[68:22] (4102.64s)
they actually released guidance and they
[68:25] (4105.92s)
changed a bunch of rules. And the big
[68:28] (4108.64s)
change that they made then was it
[68:30] (4110.40s)
allowed super PACs to do a lot more than
[68:32] (4112.64s)
just run ads. Up until that point, all
[68:35] (4115.44s)
you could do if you were a super PAC is
[68:37] (4117.36s)
just basically run advertising,
[68:38] (4118.72s)
television, and radio,
[68:41] (4121.04s)
I guess, online as well. But what they
[68:44] (4124.00s)
were allowed to do starting in 23 was
[68:46] (4126.72s)
they were allowed to fund ground
[68:48] (4128.16s)
operations. They were allowed to do
[68:50] (4130.16s)
things like door knocking, phone
[68:51] (4131.68s)
banking, you know, get out the vote. So,
[68:54] (4134.24s)
in other words, what happened was a
[68:56] (4136.08s)
super PAC became more like a full
[68:58] (4138.88s)
campaign machine. And Trump showed the
[69:02] (4142.16s)
blueprint of using a super PAC,
[69:05] (4145.84s)
specifically his, to win the
[69:07] (4147.28s)
presidential election.
[69:09] (4149.92s)
So he was able to fund this massive
[69:12] (4152.16s)
ground game. He built infrastructure
[69:13] (4153.84s)
across the swing states. He was
[69:15] (4155.36s)
obviously incredibly effective. And now
[69:18] (4158.24s)
that playbook can actually be used by
[69:20] (4160.56s)
other folks. And so to the extent that
[69:22] (4162.96s)
Elon decides to use those changed FEC
[69:25] (4165.92s)
rules, Jason, I think what you said is
[69:28] (4168.00s)
the only path. But I just
[69:30] (4170.32s)
wanted to double click on Keith's point
[69:31] (4171.76s)
because it's so important. I do think
[69:33] (4173.68s)
the filibuster is going to go away and
[69:35] (4175.68s)
it is because the arcaneness of these
[69:38] (4178.72s)
rules having to do a reconciliation bill
[69:41] (4181.52s)
and you know needing a supermajority
[69:43] (4183.52s)
a veto-proof supermajority in one or the
[69:45] (4185.20s)
other case it just means that nothing
[69:47] (4187.44s)
gets done and I think somebody will
[69:49] (4189.28s)
eventually get impatient and just
[69:50] (4190.80s)
steamroll this thing. We've never had so
[69:52] (4192.88s)
many people say they feel politically
[69:54] (4194.64s)
homeless as we did the last two cycles.
[69:57] (4197.52s)
And that includes many people on this
[69:59] (4199.20s)
podcast, people in our friend circle.
[70:01] (4201.12s)
And I think just the idea that Elon
[70:03] (4203.60s)
could create a platform that people
[70:05] (4205.92s)
could opt into and support, just the
[70:08] (4208.72s)
existence of that would make the other
[70:11] (4211.44s)
two parties get their act together.
[70:13] (4213.36s)
By the way, I think that's what we need
[70:14] (4214.72s)
is a little bit of a stick there and a
[70:16] (4216.32s)
carrot. Hey, if you don't control
[70:18] (4218.56s)
spending, there's this third option. And
[70:20] (4220.88s)
if Travis and I are in it, and Keith, I
[70:23] (4223.12s)
know you'll never leave the Republican
[70:24] (4224.40s)
party, but Chamath, you know, you're
[70:26] (4226.08s)
probably set where you want
[70:28] (4228.00s)
to be right now. But I can tell you, we
[70:29] (4229.60s)
go through our top 10 or 20 list,
[70:32] (4232.40s)
out of those 50% will join Elon's party.
[70:35] (4235.52s)
Well, look, the other thing,
[70:37] (4237.20s)
Jason, that Keith said, which I
[70:38] (4238.96s)
think is really important, is
[70:42] (4242.00s)
if he were to run people, I think they
[70:45] (4245.52s)
have to transcend politics and policy.
[70:48] (4248.80s)
And I think they need to be straight up
[70:50] (4250.96s)
bosses. People that have enormous name
[70:53] (4253.92s)
recognition so that effectively what
[70:56] (4256.40s)
you're voting for is a name and not an
[70:58] (4258.00s)
agenda. Equivalent to, I think, what
[71:00] (4260.24s)
happened to Schwarzenegger when he ran.
[71:02] (4262.88s)
He ran on an enormous amount of name
[71:05] (4265.04s)
recognition in the Gray Davis recall.
[71:07] (4267.04s)
He didn't run on the platform. I don't
[71:08] (4268.88s)
think anybody should mention that.
[71:10] (4270.32s)
JD Vance had this great book, captured
[71:12] (4272.64s)
people's imagination. He's an incredible
[71:14] (4274.48s)
speaker. He pisses off a third or
[71:17] (4277.28s)
two-thirds of the country depending on
[71:18] (4278.80s)
where you are in the country, but you
[71:20] (4280.56s)
can't ignore him. I think Elon can find
[71:23] (4283.36s)
10 JD Vance type characters and back
[71:26] (4286.00s)
them fairly easily. He is a magnet for
[71:28] (4288.56s)
talent. People will line up. I have been
[71:31] (4291.44s)
contacted by high-profile people: "I was
[71:33] (4293.60s)
actually thinking of running. Can you
[71:35] (4295.52s)
put me in touch with Elon?"
[71:36] (4296.96s)
I was thinking more like actors and
[71:39] (4299.04s)
sports stars, meaning
[71:40] (4300.88s)
where they just come with their own
[71:42] (4302.48s)
in-built distribution. Like I think you
[71:44] (4304.08s)
almost have to rank X followers and
[71:47] (4307.36s)
Instagram followers and do a join and
[71:49] (4309.52s)
say, "Okay, these are do you know what I
[71:51] (4311.28s)
mean?" Like I think it's like totally
[71:52] (4312.64s)
different.
[71:53] (4313.44s)
It's painful, guys. It's painful. Like
[71:55] (4315.76s)
let's not get more celebrities as
[71:58] (4318.32s)
politicians. Like let's get like people
[72:00] (4320.64s)
who've led large efforts, large
[72:03] (4323.76s)
initiatives, complex things, you know,
[72:06] (4326.00s)
ideally, but they still have to
[72:07] (4327.36s)
communicate, right, Keith? They have to
[72:08] (4328.80s)
be able to communicate on a podcast.
[72:10] (4330.64s)
That's the new platform.
[72:12] (4332.56s)
If they can't spend two hours, three
[72:14] (4334.40s)
hours chopping it up on a podcast like
[72:16] (4336.08s)
this or Joe Rogan,
[72:18] (4338.16s)
you know, that's Kamala. The reason she
[72:20] (4340.40s)
couldn't even contend was because she
[72:22] (4342.16s)
couldn't hang for two hours in an
[72:23] (4343.76s)
intellectual discussion. If you can't
[72:25] (4345.52s)
hang, you're out in today's political
[72:27] (4347.84s)
arena.
[72:28] (4348.80s)
It'll be interesting to see if he can
[72:30] (4350.40s)
tune his algorithm for talent, which is
[72:33] (4353.44s)
to tune for politics because it's a
[72:36] (4356.00s)
slightly different audience, but if you
[72:37] (4357.60s)
can tune the algorithm and quality, that
[72:40] (4360.00s)
might work. I think you can win a few
[72:41] (4361.76s)
House races. I think that's doable. I
[72:43] (4363.92s)
don't think you can win a Senate race.
[72:46] (4366.64s)
Well, there it is. Elon, Keith doesn't
[72:48] (4368.64s)
think you can win a Senate race, but he
[72:49] (4369.92s)
thinks you win a couple congressional
[72:50] (4370.96s)
ones. Thanks for giving him the
[72:51] (4371.92s)
motivation, Keith. I appreciate it.
[72:53] (4373.60s)
I'm sure he's the biggest mistake you've
[72:55] (4375.68s)
ever made. He's not going to win, too.
[72:58] (4378.48s)
People in the Republican party right now
[72:59] (4379.92s)
are going, "Oh, no. Don't poke the
[73:01] (4381.76s)
tiger." Uh, listen. Speaking of
[73:03] (4383.44s)
That's how Trump got into politics, so I
[73:05] (4385.04s)
don't want to be Obama here. Okay.
[73:06] (4386.32s)
You just Obama'd Elon, right? Yeah.
[73:09] (4389.60s)
Congratulations.
[73:11] (4391.28s)
All right. Listen, SCOTUS made a big
[73:14] (4394.56s)
decision here. This is a really
[73:15] (4395.92s)
important uh decision. Uh they've sided
[73:18] (4398.72s)
with Trump for plans for federal
[73:22] (4402.32s)
workforce RIFs, reductions in force,
[73:24] (4404.40s)
for those of you who don't know.
[73:26] (4406.16s)
As you know, Elon, Trump, they wanted
[73:29] (4409.04s)
to, you know, downsize the three million
[73:31] (4411.12s)
people who are federal employees. This
[73:34] (4414.88s)
is just federal employees we're talking
[73:37] (4417.20s)
about. We're not talking about military,
[73:38] (4418.88s)
and we're not talking about state and
[73:40] (4420.56s)
city. That's tens of millions of
[73:42] (4422.00s)
additional people. If you remember,
[73:43] (4423.84s)
Trump issued this executive order back
[73:46] (4426.00s)
in February when he got into office
[73:47] (4427.36s)
implementing the president's DOGE
[73:48] (4428.80s)
Workforce Optimization Initiative. And
[73:51] (4431.84s)
he asked all the federal agencies, hey,
[73:53] (4433.76s)
just prepare a RIF for their
[73:55] (4435.20s)
departments consistent with applicable
[73:56] (4436.72s)
laws was part of this EO. Okay. In
[73:59] (4439.68s)
April, the American Federation of
[74:01] (4441.04s)
Government Employees, AFGE, sued the
[74:03] (4443.12s)
Trump administration, saying the
[74:04] (4444.64s)
president must consult Congress on
[74:06] (4446.72s)
large-scale workforce changes. This is a
[74:09] (4449.84s)
key debate because the Congress, as you
[74:11] (4451.92s)
know, has power of the purse. They set
[74:13] (4453.68s)
up the money, but the president and the
[74:15] (4455.60s)
executive branch, they have to execute
[74:17] (4457.68s)
on that. And that's what the key is
[74:19] (4459.52s)
here. So they accuse Trump of violating
[74:22] (4462.08s)
the separation of powers under the
[74:23] (4463.44s)
Constitution. AFGE has 820,000
[74:27] (4467.60s)
members. In May, a San Francisco based
[74:30] (4470.08s)
federal judge sided with the unions,
[74:31] (4471.84s)
blocking the executive order. The judge,
[74:35] (4475.20s)
who was appointed by Clinton, said any
[74:37] (4477.12s)
reduction in the federal workforce must
[74:38] (4478.96s)
be authorized by Congress. This is a key
[74:41] (4481.28s)
issue. And the White House submitted an
[74:44] (4484.16s)
emergency appeal, yada yada. Eight of
[74:46] (4486.32s)
nine Supreme Court justices sided with
[74:47] (4487.92s)
the White House in overturning this
[74:49] (4489.68s)
block. And so the reasoning, it's very
[74:51] (4491.52s)
likely the White House will win the
[74:52] (4492.80s)
argument of the executive order. They
[74:54] (4494.32s)
have the right to prepare a RIF. The
[74:56] (4496.64s)
question is, can they actually execute
[74:58] (4498.88s)
on that RIF? And who has that power?
[75:01] (4501.60s)
Chamath, does the power reside with the
[75:04] (4504.16s)
president to make large-scale, you
[75:06] (4506.48s)
know, RIFs or do they have to consult
[75:08] (4508.64s)
Congress first? Your thoughts on this
[75:10] (4510.72s)
issue?
[75:11] (4511.52s)
It's an incredibly important ruling.
[75:14] (4514.80s)
Incredibly right.
[75:17] (4517.04s)
I think President Trump should have
[75:18] (4518.64s)
absolute leeway to decide how the people
[75:22] (4522.64s)
that report to him act and do their job.
[75:26] (4526.08s)
If you take a step back, Jason, there
[75:28] (4528.16s)
are more than 2,000 federal agencies.
[75:32] (4532.40s)
Employees plus contractors,
[75:34] (4534.96s)
I think, number 3 million people. If you
[75:38] (4538.80s)
put three million people into 2,000
[75:41] (4541.76s)
agencies
[75:43] (4543.52s)
and then you give them
[75:46] (4546.00s)
very poor and outdated technology, which
[75:48] (4548.32s)
unfortunately most of the government
[75:50] (4550.40s)
operates on, what are you going to get?
[75:53] (4553.92s)
You're going to get incredibly slow
[75:56] (4556.88s)
processes.
[75:58] (4558.72s)
You're going to get
[76:01] (4561.12s)
a lot of checking and double-checking
[76:03] (4563.68s)
and you're going to ultimately just get
[76:05] (4565.76s)
a lot of regulations because they're
[76:07] (4567.84s)
trying to do what they think is the
[76:09] (4569.76s)
right job. So since 1993, what have we
[76:13] (4573.36s)
seen? Regulations have gotten out of
[76:15] (4575.52s)
control. It's like a 100,000 new rules
[76:17] (4577.68s)
per some number of months. Like it's
[76:19] (4579.84s)
just crazy. So eventually we all succumb
[76:23] (4583.36s)
to an infinite number of rules that we
[76:25] (4585.60s)
all end up violating and not even know it.
[76:29] (4589.28s)
So if the CEO of the United States,
[76:32] (4592.32s)
President Trump, isn't allowed to fire
[76:34] (4594.64s)
people, then all of that stuff just
[76:36] (4596.40s)
compounds. So I think that this is a
[76:39] (4599.84s)
really important
[76:41] (4601.92s)
thing that just happened. It allows us
[76:43] (4603.60s)
to now level set how big should the
[76:45] (4605.44s)
government be? But more importantly, the
[76:47] (4607.84s)
number of people
[76:49] (4609.92s)
in the government are also the ones that
[76:51] (4611.60s)
then direct downstream spend that make
[76:54] (4614.48s)
net new rules. And if you can slow the
[76:57] (4617.44s)
growth of that down, you're actually
[76:59] (4619.12s)
doing a lot. In many ways,
[77:02] (4622.96s)
I wish Elon had come in and created DOGE
[77:07] (4627.36s)
Like, could you imagine if DOGE was
[77:09] (4629.20s)
created the day after this Supreme Court
[77:11] (4631.52s)
ruling? It would have been a totally
[77:14] (4634.08s)
different outcome, I think, because with
[77:16] (4636.56s)
that Supreme Court ruling in hand, these
[77:19] (4639.36s)
guys probably would have been like a hot
[77:21] (4641.28s)
knife through butter,
[77:23] (4643.12s)
Travis.
[77:23] (4643.52s)
So, I I think it's a big deal.
[77:25] (4645.12s)
Except that ruling doesn't happen
[77:26] (4646.32s)
without DOGE. DOGE caused that
[77:28] (4648.40s)
ruling to occur.
[77:29] (4649.20s)
True. Well, the EO did it. You could have
[77:31] (4651.04s)
passed all that; it was all DOGE-style,
[77:33] (4653.76s)
though. You know what I'm saying? It was
[77:35] (4655.44s)
If they weren't firing people, yeah, they
[77:37] (4657.76s)
probably wouldn't have felt the need, to your
[77:39] (4659.52s)
point, Travis, to actually file this.
[77:41] (4661.92s)
But Travis, if you are living in the age
[77:44] (4664.16s)
of AI efficiency right now, operations
[77:46] (4666.64s)
of companies are changing dramatically.
[77:48] (4668.72s)
Can you imagine telling somebody you
[77:51] (4671.12s)
can be CEO, but you can't change
[77:52] (4672.80s)
personnel. That's the job. You get to be
[77:55] (4675.20s)
CEO, but you just can't change the
[77:57] (4677.36s)
players on the team. You can buy the
[77:58] (4678.88s)
Knicks, but you can't change the coach.
[78:01] (4681.04s)
No, you can grow. You just can't shrink
[78:03] (4683.36s)
It's like running a
[78:04] (4684.72s)
unionized company, which actually does
[78:06] (4686.48s)
exist. There are large unionized companies where you
[78:08] (4688.80s)
can't do any of these things,
[78:10] (4690.24s)
right? Do they still exist or are
[78:12] (4692.08s)
they all gone?
[78:13] (4693.52s)
They're going quickly.
[78:14] (4694.80s)
Yeah, probably.
[78:16] (4696.16s)
I think this just gets back to what
[78:18] (4698.64s)
is actually Congress authorizing when a
[78:20] (4700.88s)
bill occurs. And there's certain things
[78:24] (4704.32s)
that are specific and certain things
[78:25] (4705.68s)
that aren't. And I'm not sure
[78:28] (4708.48s)
that in a lot of these bills, it's
[78:30] (4710.48s)
very specific about exactly how many
[78:32] (4712.96s)
people must be hired. And so, I'm
[78:38] (4718.56s)
just doing the common man's sort of
[78:40] (4720.48s)
approach to this, which is like: if
[78:42] (4722.88s)
the law says you have to hire x number
[78:44] (4724.72s)
of people, then that is what it is. If
[78:46] (4726.80s)
the law says here's some money to
[78:49] (4729.44s)
spend, here are the ways in which to
[78:50] (4730.80s)
spend it, but it's not specific about
[78:52] (4732.56s)
how many people you hire, then that is
[78:54] (4734.40s)
different.
[78:55] (4735.52s)
Yeah. It should be outcome based. Hey,
[78:57] (4737.04s)
here's the goal. Here's the key
[78:59] (4739.12s)
objectives, right?
[79:00] (4740.72s)
Travis is totally right. Like
[79:02] (4742.40s)
there's a variety of different
[79:03] (4743.76s)
laws, some with incredible
[79:05] (4745.44s)
specificity, some with very broad
[79:07] (4747.12s)
mandates. The Constitution clearly says
[79:09] (4749.68s)
that all executive power resides in the
[79:11] (4751.36s)
president of the United States. Period.
[79:12] (4752.64s)
There's no exceptions there. However,
[79:14] (4754.88s)
Congress does appropriate money and post
[79:17] (4757.52s)
Watergate,
[79:19] (4759.44s)
many people think Congress has the power
[79:22] (4762.24s)
to force the president to spend the
[79:24] (4764.16s)
money. And you can debate that. You can
[79:25] (4765.92s)
debate it on a per statute basis. And
[79:28] (4768.40s)
that will be more nuanced and that's
[79:30] (4770.00s)
going to get litigated whether the
[79:31] (4771.36s)
president can refuse to spend money that
[79:33] (4773.76s)
Congress explicitly instructed him to
[79:36] (4776.40s)
spend, sometimes called impoundment.
[79:38] (4778.56s)
That's a very interesting intellectual
[79:40] (4780.40s)
debate. This one's a little bit easier.
[79:42] (4782.08s)
It'll get more complicated again like
[79:44] (4784.16s)
this EO is only approved to allow for
[79:47] (4787.04s)
the planning. I think the vote might be
[79:49] (4789.60s)
closer. I think there's still a majority
[79:51] (4791.04s)
on the Supreme Court for the actual
[79:52] (4792.32s)
implementation, but it may not be 8-1
[79:55] (4795.20s)
when there's a specific plan that
[79:57] (4797.36s)
possibly navigates its way through the
[79:58] (4798.80s)
courts again.
[80:00] (4800.24s)
Yeah, it's super fascinating.
[80:03] (4803.52s)
Yeah. I wonder if they're going to get
[80:04] (4804.72s)
to the point where they're going to say
[80:05] (4805.76s)
in every bill, you need to hire this
[80:07] (4807.76s)
number of people to hit
[80:09] (4809.28s)
I don't know if they can. Like
[80:10] (4810.96s)
that's where it gets borderline
[80:12] (4812.16s)
unconstitutional, like where you
[80:13] (4813.92s)
actually prescribe that the president in
[80:16] (4816.56s)
the exercise of his constitutional
[80:18] (4818.80s)
duties has to hire a certain number of
[80:20] (4820.96s)
people
[80:22] (4822.32s)
that feels pretty precarious.
[80:24] (4824.24s)
Well, I'm not sure, Keith. That's
[80:26] (4826.96s)
just like they prescribe a whole bunch
[80:28] (4828.80s)
of other things, like you must
[80:32] (4832.00s)
appropriate money to this specific
[80:35] (4835.60s)
institution to do this specific work.
[80:38] (4838.48s)
But that's not an executive function.
[80:40] (4840.16s)
Like if you said like the secretary of
[80:41] (4841.84s)
state has to have x number of employees
[80:45] (4845.84s)
doing something, the secretary of state
[80:47] (4847.76s)
is your personal representative to
[80:49] (4849.60s)
conduct foreign affairs on behalf of the
[80:51] (4851.36s)
president of the United States. It gets
[80:53] (4853.44s)
a little bit more messy as you translate
[80:56] (4856.08s)
it to people, um, that the president
[80:59] (4859.12s)
should... I mean, yes, Congress does set, you
[81:02] (4862.24s)
know which people are subject to Senate
[81:04] (4864.24s)
confirmation, what their salaries and
[81:06] (4866.32s)
compensation bands are. So it's
[81:08] (4868.64s)
never going to be fully binary where the
[81:10] (4870.72s)
president can do whatever he wants and
[81:12] (4872.00s)
it's never gonna I don't think it'll be
[81:13] (4873.44s)
constitutional for Congress to mandate
[81:14] (4874.88s)
and put all kinds of handcuffs on the
[81:16] (4876.40s)
president.
[81:17] (4877.68s)
Well, then you you also have performance
[81:19] (4879.44s)
that comes in here. What if you look at
[81:21] (4881.92s)
the Department of Education? You say
[81:23] (4883.20s)
scores have gone down. We've spent this
[81:25] (4885.28s)
money. We're not getting the results.
[81:27] (4887.36s)
Therefore, these people are incompetent.
[81:29] (4889.28s)
Therefore, I'm firing them for cause and
[81:31] (4891.68s)
I'm going to hire new people. How are
[81:33] (4893.44s)
you going to stop the executive from
[81:34] (4894.96s)
doing that? There's been a bunch of
[81:36] (4896.48s)
litigation, you know, in parallel to
[81:38] (4898.56s)
this litigation about the president's
[81:40] (4900.48s)
ability to fire people. And for the most
[81:42] (4902.32s)
part, the Supreme Court's basically,
[81:44] (4904.88s)
with maybe the exception of the Federal
[81:46] (4906.40s)
Reserve chair, said that the president
[81:49] (4909.04s)
can fire pretty much anybody he wants.
[81:51] (4911.92s)
I mean, that's the way to go. Like, I
[81:53] (4913.60s)
mean, I hate to be cut, but if the
[81:55] (4915.76s)
results aren't there,
[81:57] (4917.12s)
I think if they're presidential... Yeah. If
[81:58] (4918.96s)
they're a presidential appointee, the
[82:00] (4920.32s)
president should be able to fire you at
[82:01] (4921.60s)
will. Just like if you were a VP at one
[82:04] (4924.24s)
of our companies, the CEO should be able
[82:06] (4926.48s)
to fire you at will.
[82:07] (4927.44s)
But what about, Keith, if the whole
[82:08] (4928.72s)
department sucks? Hey, you guys were
[82:10] (4930.40s)
responsible for early education. You had
[82:12] (4932.64s)
to put together a plan. The plan failed.
[82:15] (4935.20s)
Everybody's fired. We're starting over.
[82:17] (4937.60s)
Like, you should be allowed to do that.
[82:19] (4939.12s)
How are we going to have an efficient
[82:20] (4940.24s)
government?
[82:20] (4940.88s)
Some of these departments were created
[82:22] (4942.16s)
by congressional statute, like the
[82:24] (4944.40s)
Department of Education in 1979. And
[82:26] (4946.48s)
you're right, every single educational
[82:28] (4948.64s)
stat has got worse in the United States
[82:30] (4950.48s)
since the department was created. But
[82:32] (4952.56s)
[ __ ] there is a law on the books that
[82:35] (4955.04s)
says there shall be a department of
[82:36] (4956.48s)
education. So you may have to repeal
[82:41] (4961.04s)
All right, listen. We're at an hour and
[82:42] (4962.72s)
a half, gentlemen. Do you want to do the
[82:44] (4964.56s)
FICO story or should we just wrap, Chamath?
[82:47] (4967.68s)
And we got plenty of show here. It's a
[82:49] (4969.28s)
great episode. Anything else you want?
[82:50] (4970.96s)
I don't really have much to say on the
[82:52] (4972.08s)
FICO story. I thought these other topics
[82:53] (4973.68s)
were really good though.
[82:54] (4974.64s)
Oh, we did great today. This is a great
[82:56] (4976.16s)
panel. I'm so excited you guys are here.
[82:58] (4978.16s)
Let me just ask you guys, um, any off-duty
[83:01] (4981.28s)
stuff that you can share with us and
[83:02] (4982.88s)
the audience. Any recommendations?
[83:04] (4984.96s)
Restaurants, hotels, trips,
[83:08] (4988.16s)
movies you watch, books you read? Keith,
[83:10] (4990.00s)
I know that you are an active guy.
[83:12] (4992.96s)
What's on your agenda this summer?
[83:14] (4994.40s)
Anything interesting you can share with
[83:15] (4995.44s)
the audience that you're consuming,
[83:17] (4997.68s)
conspicuous or otherwise?
[83:19] (4999.76s)
Well, I don't want to share any good
[83:21] (5001.04s)
restaurants or hotels because
[83:22] (5002.40s)
Oh, you're gatekeeping. You're gatekeeping.
[83:25] (5005.92s)
Come on, man. Give us your
[83:28] (5008.24s)
babysitter. It's like, if you have
[83:30] (5010.00s)
a babysitter, you're not going to tell
[83:31] (5011.28s)
everybody who your babysitter is.
[83:32] (5012.56s)
Yes. Can I get your nanny's email,
[83:35] (5015.84s)
but there are things that are,
[83:37] (5017.76s)
what do you call it? No marginal cost
[83:39] (5019.44s)
consumption like Netflix. So, for
[83:40] (5020.96s)
example, um you know, this documentary
[83:43] (5023.76s)
on Osama bin Laden is phenomenal. Like I
[83:46] (5026.64s)
don't know if any of you have seen it.
[83:48] (5028.16s)
It's brand new
[83:49] (5029.44s)
and you know, I'm a student of this
[83:51] (5031.76s)
stuff, and I thought, you know, I knew
[83:53] (5033.28s)
the whole story, etc. Watch episode
[83:56] (5036.00s)
one. Just start with episode one and it
[83:57] (5037.60s)
just blew me away with new information,
[83:59] (5039.92s)
new footage, just absolutely incredible
[84:02] (5042.08s)
stuff. So highly highly recommend it.
[84:04] (5044.16s)
What uh what was the big takeaway for
[84:05] (5045.68s)
you so far? I don't know if there's any
[84:07] (5047.20s)
like specific takeaway, but just like so
[84:10] (5050.72s)
many parts of the story are
[84:11] (5051.92s)
misunderstood and not really understood
[84:13] (5053.76s)
and how various confluences of somewhat
[84:16] (5056.96s)
random things lead to a very
[84:18] (5058.96s)
catastrophic result, but it's
[84:22] (5062.40s)
like as um dramatic as the best movie,
[84:25] (5065.60s)
but it's a full documentary and you will
[84:27] (5067.52s)
learn things and absorb things. I just...
[84:30] (5070.64s)
I've been
[84:31] (5071.84s)
recommending it to friends, and for a
[84:33] (5073.92s)
story you think you know, it's incredibly
[84:36] (5076.72s)
revealing.
[84:37] (5077.76s)
Okay, Travis, anything you got on your
[84:39] (5079.76s)
plate there that you're enjoying? A
[84:41] (5081.44s)
restaurant, a dish?
[84:42] (5082.80s)
I mean look, you know, I mean Jason, you
[84:44] (5084.80s)
know, I go to Austin a lot.
[84:48] (5088.80s)
Like basically from March till October,
[84:52] (5092.08s)
I do about 15 weekends in Austin. I have
[84:55] (5095.60s)
a lakehouse.
[84:57] (5097.04s)
Jason's hung out a couple times.
[85:00] (5100.08s)
So I love water skiing. That's my
[85:02] (5102.00s)
whole thing. That's my like I just love
[85:05] (5105.04s)
it. It's just my thing.
[85:06] (5106.56s)
Very zen. Very zen.
[85:07] (5107.60s)
Yeah. And it's... I call it lake
[85:09] (5109.36s)
life. So
[85:10] (5110.72s)
that's a thing. And then I recently this
[85:13] (5113.44s)
is a little bit of like a side quest.
[85:16] (5116.08s)
I recently purchased
[85:18] (5118.96s)
the preeminent backgammon engine.
[85:23] (5123.60s)
XG. That's right.
[85:26] (5126.16s)
Its, uh, acronym... it's eXtreme Gammon,
[85:29] (5129.84s)
and it's the preeminent engine, so all the
[85:32] (5132.00s)
pros rate themselves based on this
[85:35] (5135.20s)
It was built by this amazing
[85:37] (5137.68s)
entrepreneur, this guy Xavier, who is just
[85:41] (5141.76s)
a full-on sort of
[85:45] (5145.12s)
ultra, ultra... I mean, just, what's the word
[85:47] (5147.60s)
I'm looking for? It's not a...
[85:49] (5149.84s)
like a savant, essentially,
[85:52] (5152.80s)
but hasn't worked on it for many years
[85:54] (5154.72s)
so I'm getting back into it and
[85:56] (5156.96s)
love it
[85:57] (5157.60s)
and making it like taking modern machine
[86:01] (5161.12s)
learning sort of deep learning
[86:02] (5162.72s)
techniques and like big compute and
[86:06] (5166.08s)
saying, can we push the game of
[86:08] (5168.16s)
backgammon forward? So super exciting, and
[86:10] (5170.72s)
ultra training apps to get people up to
[86:12] (5172.72s)
speed quickly. I played in my first
[86:15] (5175.84s)
backgammon tournament and cashed,
[86:18] (5178.40s)
so that was pretty cool.
[86:19] (5179.84s)
No, wait. Yeah. Okay.
[86:22] (5182.16s)
All due respect, you know, the founder of
[86:23] (5183.60s)
Uber. You're very high profile. You go
[86:25] (5185.52s)
to this backgammon thing. Is this like held at the
[86:27] (5187.36s)
Motel 8
[86:28] (5188.72s)
in like a conference room in the back.
[86:31] (5191.60s)
It was amazing. They set the... It was at
[86:33] (5193.28s)
the... It was like a month ago or so.
[86:36] (5196.32s)
There's like a big tournament and it was
[86:38] (5198.96s)
uh, so the United States Backgammon
[86:41] (5201.12s)
Federation had this big tour. It was I
[86:42] (5202.64s)
guess it was um at the Los Angeles
[86:46] (5206.96s)
LAX at the LAX Hilton
[86:49] (5209.60s)
and it was in
[86:50] (5210.72s)
it was in the basement of the Hilton.
[86:53] (5213.12s)
Great.
[86:53] (5213.84s)
And it was like
[86:55] (5215.92s)
next to the Dungeons and Dragons
[86:57] (5217.52s)
convention.
[86:58] (5218.40s)
It It had those kinds of legit vibes.
[87:01] (5221.44s)
I love it.
[87:01] (5221.84s)
And like people would... So I went in super
[87:04] (5224.80s)
low pro, just did my thing, but
[87:07] (5227.12s)
eventually was recognized, but I was not
[87:09] (5229.20s)
recognized as the founder of Uber.
[87:11] (5231.44s)
I was recognized as the owner of XG.
[87:14] (5234.64s)
The owner of XG.
[87:15] (5235.36s)
And then there was like a full-on
[87:18] (5238.08s)
melee that basically occurred. They're
[87:20] (5240.00s)
like, "Oh, the owner XG. Travis is here.
[87:22] (5242.64s)
Chamath, I feel like we've got a window
[87:25] (5245.12s)
here to do the all-in backgammon
[87:27] (5247.92s)
high-end tournament.
[87:29] (5249.52s)
We got to lock this down now. We got to
[87:31] (5251.44s)
lock down the all-in backgammon set.
[87:33] (5253.36s)
I get the co-branding rights on this.
[87:35] (5255.76s)
Absolutely.
[87:36] (5256.24s)
XG XG.
[87:37] (5257.52s)
Yeah. Well, no. The all-in XG,
[87:39] (5259.68s)
you know, like because I love a great
[87:40] (5260.96s)
backgammon set. If we could make like a
[87:42] (5262.64s)
$10,000 one, Chamath, we could kill
[87:45] (5265.12s)
turtles or white rhinos, all the animals
[87:47] (5267.84s)
that, you know, um Freeberg's trying to
[87:50] (5270.16s)
protect. Oh my god.
[87:51] (5271.20s)
We could murder them and then make
[87:52] (5272.72s)
that would be so great.
[87:54] (5274.40s)
Yes. Like maybe the white could be, you
[87:57] (5277.12s)
know, rhinos and then you could take
[87:59] (5279.12s)
something else, elephant skin,
[88:00] (5280.48s)
something, you know, just really tragic
[88:02] (5282.56s)
and then eat the meat and make the
[88:05] (5285.12s)
backgammon set for you.
[88:06] (5286.72s)
I love backgammon, and honestly, like, if I
[88:09] (5289.44s)
wasn't attempting to be, like, an expert
[88:11] (5291.84s)
poker player,
[88:13] (5293.68s)
that is the game. I mean, if you're
[88:15] (5295.28s)
talking about a Pandora's box where once
[88:17] (5297.76s)
you open it, oh my god, you can go down
[88:19] (5299.44s)
the rabbit hole.
[84:20] (5300.24s)
Let's go, dude. Let's do that.
[88:21] (5301.92s)
Backgammon is a beautiful,
[88:24] (5304.16s)
beautiful, beautiful game.
[88:25] (5305.92s)
I love the vibes of sitting. Travis and
[88:28] (5308.72s)
I sat. I got some cigars out. You know,
[88:30] (5310.72s)
we pour a little of the all-in tequila.
[88:32] (5312.80s)
Tequila.
[88:34] (5314.56s)
Uh, we get that going.
[88:36] (5316.32s)
A couple of uh the all-in cigars and
[88:38] (5318.16s)
then we have the all-in backgammon. It's a
[88:39] (5319.84s)
wonderful hang.
[88:41] (5321.28s)
Yeah. Keith, would you consider giving
[88:42] (5322.80s)
us some of your money playing backgammon?
[88:45] (5325.44s)
Absolutely. Absolutely.
[88:46] (5326.48s)
We gotta get some of that
[88:49] (5329.28s)
money on the table because you don't
[88:50] (5330.48s)
play poker with us.
[88:51] (5331.52s)
I don't play poker, but backgammon? Yeah, that
[88:53] (5333.44s)
sounds great. And I'll bring
[88:54] (5334.96s)
better tequila.
[88:56] (5336.00s)
I have better tequila. Well, like we're
[88:57] (5337.52s)
going to upgrade.
[88:58] (5338.08s)
We'll do a little taste-off. Yeah. So,
[88:59] (5339.76s)
you've insulted now Elon with the Senate
[89:01] (5341.68s)
seats and with his uh tequila.
[89:03] (5343.92s)
My tequila is much better. Trust me.
[89:05] (5345.84s)
Okay. Who is left in the PayPal
[89:07] (5347.76s)
mafia you'd like to insult?
[89:10] (5350.24s)
Reid Hoffman
[89:12] (5352.56s)
or Peter? Anything about Peter?
[89:15] (5355.04s)
Reid could join Elon's party. He's
[89:16] (5356.64s)
collecting a bunch of misfits, so we
[89:18] (5358.48s)
might as well take Reid, too.
[89:19] (5359.84s)
All right, listen. This has been another
[89:21] (5361.60s)
amazing episode of the number one
[89:23] (5363.60s)
podcast in the world, the all-in podcast
[89:26] (5366.56s)
for your Sultan of science, who couldn't
[89:28] (5368.48s)
make it today. He's at the beep
[89:30] (5370.16s)
conference we don't mention. And, uh,
[89:33] (5373.04s)
David Sacks, who is out, uh, making America
[89:36] (5376.56s)
safe in AI and crypto.
[89:39] (5379.60s)
Chamath Palihapitiya, world's greatest moderator.
[89:42] (5382.32s)
Travis, Keith, thank you for coming.
[89:44] (5384.48s)
Thanks. Appreciate you. You guys were
[89:46] (5386.32s)
great today. What a panel. See you all
[89:48] (5388.40s)
next time
[89:49] (5389.28s)
Bye-bye.
[89:51] (5391.60s)
let your winners ride
[89:58] (5398.88s)
said we open sourced it to the fans and
[90:00] (5400.88s)
they've just gone crazy. Right there.
[90:02] (5402.56s)
Love you. Queen of
[90:07] (5407.71s)
[Music]
[90:11] (5411.44s)
besties are
[90:14] (5414.40s)
my dog taking a notice in your driveway.
[90:19] (5419.20s)
Oh man, my dasher will meet up at You
[90:22] (5422.08s)
should all just get a room and just have
[90:23] (5423.60s)
one big huge orgy cuz they're all just
[90:25] (5425.20s)
useless. It's like this like sexual
[90:26] (5426.88s)
tension that we just need to release
[90:28] (5428.32s)
somehow.
[90:31] (5431.84s)
wet your feet.
[90:35] (5435.20s)
We need to get merch.
[90:39] (5439.75s)
[Music]
[90:44] (5444.64s)
I'm going all in.