[00:00] (0.56s)
I got to work on whatever I wanted for
[00:02] (2.24s)
my entire career.
[00:03] (3.68s)
This is Carey Nachenberg. He's a cyber
[00:06] (6.00s)
security expert who is a fellow at
[00:07] (7.52s)
Symantec, which is four levels higher
[00:09] (9.28s)
than staff. I look for things with big
[00:11] (11.52s)
business impact. I look where there were
[00:13] (13.76s)
From there, he joined Google X as a
[00:16] (16.00s)
principal engineer and fought imposter
[00:17] (17.84s)
syndrome. And what if like I'm not good
[00:19] (19.84s)
enough for Google or Meta or something.
[00:22] (22.40s)
As a professor at UCLA, he's seeing
[00:24] (24.16s)
firsthand how AI affects the classroom.
[00:26] (26.48s)
I allowed students to use LLMs. In
[00:29] (29.04s)
retrospect, you know, that was a bad
[00:30] (30.64s)
idea because I think they were using it
[00:32] (32.16s)
in ways that hindered learning.
[00:34] (34.08s)
How do you even tell if they're using
[00:36] (36.00s)
LLMs, though?
[00:37] (37.12s)
Well, the one way you can tell is
[00:39] (39.12s)
I learned a lot from his career stories
[00:40] (40.72s)
and hope you do, too.
[00:42] (42.16s)
I get this call, a frantic call from
[00:44] (44.48s)
UCLA. Can you still teach? Because our
[00:47] (47.44s)
lecturer bailed on us. I have PTSD from
[00:50] (50.00s)
that experience, I have to say.
[00:54] (54.64s)
Before we get into all the juicy parts
[00:57] (57.60s)
of your career like working at Google X
[01:00] (60.08s)
or autonomous vehicles with Lyft, I want
[01:02] (62.48s)
to start by kind of laying the high
[01:04] (64.64s)
level groundwork. So you worked at
[01:07] (67.04s)
Symantec for a long time and became
[01:09] (69.52s)
their senior-most, you know, similar to
[01:11] (71.68s)
a chief scientist type of role. Kind of
[01:14] (74.48s)
want to go over that and all the lessons
[01:16] (76.24s)
you might have learned there and you
[01:17] (77.84s)
know, anything interesting that came up
[01:19] (79.28s)
there. So could you share the high-level
[01:22] (82.24s)
story arc of you working at Symantec and
[01:26] (86.32s)
how you grew to fellow?
[01:27] (87.92s)
Totally. So I actually started back in I
[01:31] (91.84s)
think 1992 as an intern at Peter Norton
[01:35] (95.68s)
Group. And so Peter Norton Group was
[01:37] (97.76s)
later acquired by Symantec, but
[01:40] (100.00s)
some of your viewers are going to
[01:41] (101.84s)
remember Norton antivirus from Norton
[01:44] (104.16s)
and Norton Utilities and so on. And so
[01:47] (107.20s)
um I think I was the first intern at
[01:49] (109.68s)
Norton, at Peter Norton Computing, and
[01:52] (112.64s)
they didn't even have a desk for me. So,
[01:54] (114.16s)
I literally worked in a QA lab with the
[01:57] (117.20s)
QA lab manager who was a guy that used
[01:59] (119.76s)
to sell knives for a living because like
[02:01] (121.52s)
back then in 1992 they weren't
[02:03] (123.84s)
professionally trained software people,
[02:05] (125.20s)
right? So, basically they would just get
[02:06] (126.48s)
whoever knew something about computers
[02:08] (128.88s)
and this guy knew something about
[02:10] (130.08s)
computers to run the lab and I would
[02:11] (131.20s)
work on one of the test computers. Um,
[02:13] (133.36s)
and so that was my first, you know,
[02:15] (135.20s)
summer internship. uh and then fast
[02:18] (138.40s)
forward, uh, when I left in 2016 I had
[02:22] (142.72s)
become the senior-most engineer of the
[02:24] (144.16s)
company. So from the junior most person
[02:26] (146.40s)
the first intern to the senior most
[02:28] (148.08s)
person at Symantec,
[02:29] (149.60s)
which had acquired Norton. So it
[02:31] (151.68s)
was a long ride but that that's the high
[02:33] (153.84s)
level. Was cyber security something that
[02:37] (157.36s)
you were really focused on and really
[02:39] (159.52s)
wanted a job in cyber security, or was it
[02:42] (162.16s)
just by chance?
[02:43] (163.28s)
Yeah. So at the time when I was working
[02:45] (165.20s)
at Peter Norton Group, I wasn't working
[02:47] (167.76s)
in cyber security. I was working on
[02:49] (169.12s)
something called Norton Commander, which
[02:50] (170.72s)
was like a file manager utility. So I
[02:53] (173.52s)
didn't actually work on cyber security
[02:54] (174.96s)
until my third year, my third year of
[02:57] (177.28s)
internships, uh, at this point at Symantec,
[02:59] (179.92s)
where they had acquired a product
[03:02] (182.16s)
called I don't remember what it was, but
[03:03] (183.76s)
it was an antivirus product that they
[03:05] (185.52s)
renamed Norton antivirus. So that
[03:07] (187.52s)
product was uh acquired and rebranded uh
[03:10] (190.88s)
and they had a team of people analyzing
[03:13] (193.12s)
computer viruses. So my third year of
[03:15] (195.52s)
internship that's when I got into cyber
[03:17] (197.36s)
security. I had no experience. They just
[03:19] (199.68s)
needed an intern and they threw me on
[03:21] (201.68s)
Like, what does the career ladder look
[03:24] (204.08s)
like? Because I think Google and those
[03:25] (205.84s)
companies came in at some point and
[03:27] (207.20s)
said, "Hey, this is L3. This is L4. This
[03:29] (209.92s)
is L5." And then kind of a lot of
[03:32] (212.00s)
companies copied that. At Symantec, what
[03:34] (214.56s)
did career progression look like? I
[03:36] (216.56s)
would say the levels generally track to
[03:40] (220.80s)
companies like Google and Facebook or
[03:42] (222.88s)
Meta or going all the way from like a
[03:44] (224.88s)
junior you know software engineer
[03:46] (226.48s)
through software engineer senior
[03:48] (228.32s)
software engineer staff and so on. So um
[03:50] (230.88s)
the levels are very similar. The
[03:52] (232.40s)
difference was for me at the very high
[03:55] (235.92s)
levels of distinguished engineer and
[03:58] (238.40s)
fellow at Symantec. You know, when I went
[04:01] (241.52s)
to Google they basically said you know
[04:04] (244.16s)
we can't just hire fellows you know we
[04:07] (247.76s)
haven't experienced you, we don't
[04:09] (249.52s)
know if you're going to do well here so
[04:11] (251.92s)
effectively I was downgraded to
[04:14] (254.32s)
a principal engineer, level eight, from
[04:16] (256.64s)
what would have been probably a level 10
[04:18] (258.16s)
at Symantec. It was a vice president role
[04:20] (260.48s)
at Symantec.
[04:21] (261.92s)
What does the hiring process even look
[04:23] (263.92s)
like for people so high level? Um, was
[04:27] (267.04s)
that a tailored bespoke process? When I
[04:30] (270.00s)
went to uh Google, basically the former
[04:33] (273.84s)
chief operating officer of Symantec, a guy
[04:36] (276.48s)
named Stephen Gillett, had recently
[04:38] (278.08s)
transferred into Google X and they were
[04:40] (280.56s)
starting a stealth project in Google X,
[04:42] (282.96s)
which we could talk about. He had known
[04:45] (285.04s)
me from my time at Symantec and
[04:46] (286.96s)
basically you know said hey why don't
[04:48] (288.80s)
you talk to the team and see if there's
[04:50] (290.56s)
a good fit and so I didn't have to apply
[04:53] (293.44s)
I was just, you know, brought in, and
[04:55] (295.60s)
uh, we had a meet and greet with the
[04:58] (298.88s)
team's founders, uh, in Venice,
[05:02] (302.08s)
and then we had an interview, a day's
[05:04] (304.32s)
worth of interviews probably about eight
[05:05] (305.84s)
interviews and you know six weeks later
[05:09] (309.12s)
I had my offer letter so
[05:10] (310.64s)
you know for a lot of software engineers
[05:12] (312.56s)
that are at the lower levels, the interviews are
[05:15] (315.76s)
LeetCode or system design, and
[05:18] (318.00s)
there's you know some generic behavioral
[05:20] (320.40s)
interview. But at the high levels, what
[05:22] (322.72s)
does that loop even look like? Is
[05:24] (324.40s)
it mostly behavioral? Or are they
[05:26] (326.64s)
asking you LeetCode stuff too?
[05:28] (328.64s)
So I don't believe that I had any coding
[05:33] (333.20s)
problems during that interview. If I
[05:34] (334.96s)
recall correctly, there
[05:36] (336.56s)
was one design problem. Um, but
[05:39] (339.04s)
mostly these were leadership-type
[05:40] (340.56s)
questions like you know how would you
[05:42] (342.08s)
solve a hard problem, or how did you
[05:44] (344.56s)
solve a hard problem in your career um
[05:46] (346.96s)
how do you deal with conflicts um you
[05:51] (351.20s)
know tell me your thoughts on you know
[05:53] (353.60s)
where the field is going you know Google
[05:55] (355.36s)
X is a forward-looking or X rather is a
[05:57] (357.76s)
forward-looking organization and so uh
[06:00] (360.72s)
some of the discussions were focused on
[06:03] (363.04s)
that. Um, some of the discussions were
[06:05] (365.52s)
literally sales pitches. Like, I didn't
[06:07] (367.04s)
really have to say much; they went and talked
[06:08] (368.72s)
to me about where they thought the world
[06:10] (370.32s)
was going to try to get me excited,
[06:13] (373.36s)
I think it probably depends on the
[06:15] (375.68s)
person, but in my case, I think they
[06:18] (378.40s)
thought there was a good fit
[06:19] (379.92s)
and it was really about sort of going
[06:21] (381.84s)
through the motions of, you know, having
[06:24] (384.64s)
interviews and selling me on it.
[06:26] (386.96s)
Okay. So, then going back to Symantec, I
[06:29] (389.28s)
mean, you know, getting promoted to
[06:31] (391.60s)
fellow, what do those highest level
[06:33] (393.44s)
promos look like? At least at Symantec,
[06:35] (395.84s)
most promotions up through what was
[06:38] (398.40s)
called technical director or senior
[06:40] (400.32s)
technical director um which would be the
[06:42] (402.64s)
equivalent of let's say staff engineer
[06:45] (405.28s)
or senior staff engineer probably at
[06:47] (407.60s)
other companies. Most of that was done
[06:50] (410.16s)
just within the organization. So as long
[06:52] (412.00s)
as a vice president or senior vice
[06:54] (414.08s)
president was okay with a promotion, it
[06:56] (416.80s)
was allowed. Uh, at higher levels,
[07:00] (420.08s)
for distinguished engineer and fellow,
[07:02] (422.24s)
there was basically, um, a core
[07:05] (425.92s)
group of technologists led by the CTO of
[07:08] (428.56s)
the company. We would meet once a
[07:10] (430.00s)
quarter. Um we would get applications
[07:12] (432.40s)
from those people and we would then
[07:14] (434.32s)
review their applications and actually
[07:15] (435.84s)
have them come talk to us about their
[07:17] (437.36s)
work, uh, and then make a decision based on
[07:20] (440.40s)
that. And so the reason that we did it
[07:22] (442.48s)
that way is that we'd have consistency
[07:25] (445.04s)
across all divisions, all teams, uh, of
[07:28] (448.24s)
what it meant to be a distinguished
[07:30] (450.08s)
engineer or a fellow for the
[07:32] (452.00s)
organization.
[07:33] (453.04s)
Um, so that's the process we went
[07:35] (455.12s)
through and it wasn't just per division,
[07:37] (457.52s)
it was for the company.
[07:39] (459.28s)
So when you got promoted to fellow, was
[07:41] (461.60s)
that one of the happiest moments of your
[07:43] (463.28s)
career or was it something you expected,
[07:45] (465.36s)
not a big deal?
[07:46] (466.16s)
You know, it was very interesting. So I
[07:49] (469.76s)
did not have to go through that process
[07:51] (471.20s)
to get promoted to fellow. So I was a
[07:52] (472.96s)
distinguished engineer at Symantec. Um
[07:55] (475.60s)
and they had that process. I wasn't part
[07:58] (478.56s)
of it because I wasn't a distinguished
[08:00] (480.16s)
engineer at the time. The team that
[08:01] (481.92s)
does promotions is
[08:04] (484.56s)
made up of distinguished engineers. At
[08:06] (486.72s)
the time, we had acquired a company
[08:08] (488.32s)
called Veritas, and Veritas had a
[08:10] (490.80s)
notion of a fellow, which Symantec
[08:13] (493.04s)
didn't. So basically after the merger
[08:16] (496.32s)
between the two companies, they said
[08:18] (498.48s)
okay, we need to level-set the role
[08:21] (501.20s)
because the people at Semantic have
[08:22] (502.64s)
never had this kind of role: who would
[08:24] (504.88s)
belong there. And I was nominated without
[08:27] (507.36s)
my knowledge and it just happened and I
[08:30] (510.16s)
got a congratulations or maybe my boss
[08:31] (511.76s)
filled me in and said, hey, there's some
[08:33] (513.20s)
discussions going on, you know, and
[08:35] (515.04s)
then you're a fellow. So they basically
[08:37] (517.68s)
took my portfolio of stuff and gave
[08:40] (520.40s)
it to the team that did the
[08:42] (522.32s)
evaluations, and that was the end of it.
[08:44] (524.08s)
I mean, with your growth to fellow,
[08:47] (527.04s)
something about the way you work sets
[08:49] (529.12s)
you apart from other engineers. What do
[08:51] (531.44s)
you think are those things for you?
[08:54] (534.16s)
I think what helped me get to fellow was
[08:58] (538.24s)
working on really impactful projects for
[09:01] (541.36s)
the business. Not necessarily always the
[09:04] (544.32s)
most technically difficult projects
[09:06] (546.00s)
although many of them were but more
[09:08] (548.08s)
impactful like they move the needle for
[09:10] (550.32s)
the company. I had so many of them
[09:12] (552.96s)
under my belt at that time that they
[09:15] (555.44s)
just said you know the criteria are this
[09:18] (558.56s)
many projects should be done of
[09:20] (560.24s)
this scope and I had plenty more and so
[09:23] (563.20s)
they basically said, you know,
[09:24] (564.32s)
you're above the bar. So
[09:27] (567.52s)
what was the key insight to doing those
[09:29] (569.68s)
projects? It was finding things where
[09:31] (571.44s)
there were gaps, you know,
[09:34] (574.08s)
things the company needed where people
[09:35] (575.68s)
weren't stepping up to do them and you
[09:38] (578.80s)
know, I did them. Maybe a quick step back:
[09:42] (582.24s)
at Symantec it was sort of like the wild
[09:44] (584.40s)
west. There wasn't an engineering culture
[09:46] (586.56s)
per se, there were just people working on
[09:48] (588.96s)
stuff and like projects would often run
[09:51] (591.92s)
really late because you know it was a
[09:54] (594.24s)
little bit more of a wild west. We didn't
[09:55] (595.92s)
have an agile process. Um, we didn't
[09:58] (598.96s)
really do our own unit and integration
[10:02] (602.00s)
testing. Like, we'd throw our code over to QA
[10:04] (604.08s)
and they would worry about it and we'd
[10:05] (605.36s)
work on the next thing. And in fact, I
[10:07] (607.36s)
had very unusual circumstances. I didn't
[10:09] (609.68s)
rise to a path where I was given a small
[10:12] (612.08s)
project and somebody said, "Okay, do
[10:14] (614.80s)
this. Here's a box around the
[10:16] (616.56s)
project." They basically said, "Carrie,
[10:18] (618.72s)
you already sort of know the technology
[10:20] (620.48s)
because you've been working on it as an
[10:21] (621.92s)
intern. Go figure out what to do and do
[10:24] (624.16s)
it." which was amazing because here I am
[10:26] (626.32s)
like a junior software engineer and I
[10:27] (627.76s)
got to work on whatever I wanted for my
[10:29] (629.28s)
entire career at Symantec, never assigned
[10:31] (631.52s)
work to do. Yeah, it was crazy, which
[10:33] (633.76s)
was great. And I, you know, and so I got
[10:36] (636.08s)
to just pick projects that I thought
[10:37] (637.68s)
would be impactful. And I picked well. I
[10:40] (640.64s)
guess I picked well because, you
[10:42] (642.40s)
know, project after project landed. They
[10:44] (644.64s)
were integrated. We did tech transfers
[10:46] (646.16s)
into the product, they shipped. So
[10:48] (648.48s)
that's how I got there. It wasn't like,
[10:50] (650.24s)
oh, I did increasingly bigger scope
[10:52] (652.72s)
projects that somebody gave me.
[10:54] (654.08s)
So how'd you train that project
[10:56] (656.48s)
taste? Because impact typically leads to
[10:59] (659.28s)
career growth everywhere. So
[11:01] (661.28s)
Yeah. Yeah. And it's a good question.
[11:03] (663.04s)
I look for things with big business
[11:05] (665.36s)
impact. I look where there were gaps. So
[11:07] (667.44s)
like way back when you know this is like
[11:09] (669.92s)
old news now but it's sort of
[11:11] (671.20s)
interesting like we had um these
[11:13] (673.92s)
computer viruses which were emerging,
[11:15] (675.84s)
which were called polymorphic viruses.
[11:17] (677.84s)
They were self-mutating malware and they
[11:20] (680.88s)
were self-mutating in a way that
[11:22] (682.40s)
there could be like literally
[11:23] (683.52s)
quadrillions of variants and the way
[11:26] (686.64s)
that the teams were working on these
[11:28] (688.16s)
things but back when I was an intern
[11:30] (690.00s)
even is they would basically write uh
[11:32] (692.40s)
handwritten assembly language to go look
[11:35] (695.12s)
for telltale signs of a variant you know
[11:38] (698.88s)
in the mutation. Okay. And the problem
[11:42] (702.16s)
with that is it worked really great
[11:44] (704.40s)
except it took six months because we
[11:46] (706.00s)
shipped a new product every six months.
[11:47] (707.36s)
And so it took six months to
[11:49] (709.28s)
handle a virus and then the next day
[11:51] (711.28s)
somebody released three new viruses
[11:53] (713.04s)
which were, you know, slight
[11:54] (714.72s)
variants, you know,
[11:56] (716.24s)
different viruses but mostly the same.
[11:58] (718.16s)
And then all of that work would not work
[12:00] (720.40s)
anymore. And so, you know, I would look
[12:02] (722.72s)
at that problem. I'd say, "Oh, there's a
[12:04] (724.56s)
need to be able to move more rapidly in
[12:06] (726.80s)
covering new malware. And by the way,
[12:09] (729.28s)
detecting these selfmutating threats is
[12:11] (731.20s)
not like using a regex. You can't just,
[12:13] (733.60s)
you know, search for a string to find
[12:15] (735.04s)
these things." So, I'm like, "Oh, that
[12:16] (736.64s)
seems like a really interesting hard
[12:17] (737.84s)
problem." So, I picked it and then I
[12:20] (740.24s)
started working on it. That was my
[12:21] (741.44s)
master's thesis and then eventually
[12:23] (743.20s)
transferred to the product.
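A minimal sketch of the problem he's describing, in Python, with invented byte values; this is not Symantec's actual engine. A fixed byte signature misses mutated variants, while normalizing away the mutable junk, a crude stand-in for the emulation-based techniques used against real polymorphic malware, still catches the invariant core.

```python
# Toy model: a polymorphic virus keeps an invariant "core" but wraps it
# in random junk bytes, so a fixed signature stops matching.
import random
import re

CORE = b"\xb8\x01\x00\xcd\x21"        # hypothetical invariant payload bytes
JUNK = (0x90, 0x40, 0x48)             # pretend junk/NOP-style padding values
SIGNATURE = re.compile(re.escape(b"\x00\x00" + CORE))  # naive fixed signature

def mutate(core: bytes) -> bytes:
    """Emit a 'variant': same core, different surrounding bytes."""
    pad = bytes(random.choice(JUNK) for _ in range(random.randint(2, 9)))
    return pad + core

def detect(sample: bytes) -> bool:
    """Strip the mutable junk, then look for the invariant core."""
    normalized = bytes(b for b in sample if b not in JUNK)
    return CORE in normalized

variants = [mutate(CORE) for _ in range(5)]
print([bool(SIGNATURE.search(v)) for v in variants])  # all False: signature misses
print([detect(v) for v in variants])                  # all True: normalization catches
```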
[12:25] (745.44s)
If you were to think about the things that you took
[12:26] (746.80s)
on, they were a series of I guess side
[12:29] (749.84s)
projects or or you know, whatever you
[12:31] (751.76s)
wanted to take on, where you'd take
[12:33] (753.76s)
on this new thing, maybe something else
[12:35] (755.76s)
would come in, you take that on. Is that
[12:37] (757.84s)
kind of how you were working?
[12:39] (759.20s)
I would say there are probably six or
[12:40] (760.72s)
seven times in my career where I'm like,
[12:42] (762.48s)
"Oh, the company needs this type of
[12:44] (764.56s)
thing. Let me go spend six weeks, two
[12:46] (766.64s)
months, five months figuring out what
[12:49] (769.52s)
that looks like, building prototypes,
[12:51] (771.44s)
talking to engineers, and figuring out
[12:52] (772.80s)
what they need. And then, you know,
[12:54] (774.96s)
building that and there were other times
[12:56] (776.88s)
which is probably 80% of my career where
[12:59] (779.36s)
I was just sort of tweaking those things
[13:00] (780.96s)
In other words, we built them, we were
[13:03] (783.76s)
trying to either tech transfer it so I
[13:05] (785.28s)
was helping with that fixing bugs
[13:07] (787.28s)
improving those were sort of incremental
[13:09] (789.60s)
improvements on those systems. But
[13:11] (791.52s)
it's one of those two things,
[13:13] (793.12s)
generally
[13:13] (793.92s)
So you mentioned a little bit about
[13:15] (795.84s)
viruses and when I was doing some
[13:17] (797.76s)
research I saw that you had done some
[13:20] (800.48s)
storytelling on top of Stuxnet and kind
[13:22] (802.88s)
of compiled
[13:23] (803.84s)
I think that's such an interesting
[13:25] (805.92s)
story. Uh can you tell me a little bit
[13:28] (808.24s)
about Stuxnet? Maybe we can go into
[13:30] (810.48s)
Sure. Stuxnet was, um, at the time, just
[13:35] (815.68s)
unfathomable. It was just a very complex
[13:38] (818.96s)
piece of malware which was
[13:40] (820.16s)
multiplatform. So it didn't just infect
[13:42] (822.00s)
like Mac machines or Windows machines.
[13:44] (824.32s)
It infected I think you know Windows
[13:46] (826.32s)
machines but also like microcontrollers
[13:49] (829.12s)
that would actually run like centrifuges
[13:50] (830.80s)
and so on. Um, and so it was probably
[13:53] (833.60s)
the first multi-platform piece of
[13:55] (835.36s)
malware we had discovered. Uh, it used zero
[13:58] (838.40s)
days in order to break into systems,
[14:00] (840.40s)
you know, basically
[14:01] (841.52s)
exploiting
[14:02] (842.56s)
vulnerabilities that hadn't been patched
[14:04] (844.64s)
because they weren't even known about.
[14:06] (846.08s)
And it didn't use just one of those or
[14:07] (847.68s)
two of those. I think it used like six
[14:09] (849.52s)
different vulnerabilities to spread,
[14:10] (850.96s)
many of which were zero days.
[14:12] (852.40s)
My god.
[14:12] (852.88s)
Um, it would literally stealth itself.
[14:15] (855.52s)
So on your computer, if you were to look
[14:17] (857.92s)
at a thumb drive which had
[14:20] (860.96s)
Stuxnet on it and look in
[14:23] (863.28s)
your Finder application or your Windows,
[14:25] (865.20s)
you know, file system application, you
[14:27] (867.44s)
would see, you know, nothing there, but
[14:29] (869.68s)
it was there. You'd stick that in your
[14:31] (871.44s)
computer, it would auto launch. It
[14:33] (873.36s)
actually had a payload to auto-launch.
[14:35] (875.76s)
If you were to look at the logic that
[14:38] (878.32s)
was running on a centrifuge or rather
[14:40] (880.48s)
the controller that ran the frequency
[14:42] (882.00s)
converters, you would not see any of
[14:44] (884.16s)
Stuxnet in that logic. It was in that
[14:46] (886.88s)
controller. But if you downloaded the
[14:50] (890.00s)
the logic from that controller onto a
[14:51] (891.92s)
Windows machine, it would stealth and
[14:54] (894.16s)
remove Stuxnet's logic from it as it
[14:56] (896.16s)
pulled it off. And then if you updated
[14:57] (897.92s)
that logic, for instance, it would
[14:59] (899.52s)
reinsert itself into that logic to
[15:01] (901.36s)
reinfect it as it went back. So it
[15:03] (903.68s)
would actually sort of piggyback back
[15:05] (905.76s)
and forth, stealthing itself. Um, it was just
[15:08] (908.64s)
amazing. And then of course how it
[15:09] (909.68s)
disrupted the centrifuges is super
[15:11] (911.12s)
interesting as well.
[15:12] (912.16s)
Yeah, it's so complicated and
[15:15] (915.92s)
sophisticated that it makes me wonder
[15:19] (919.12s)
who wrote it and I saw something like it
[15:21] (921.04s)
was, you know, 50 times bigger than the
[15:23] (923.20s)
average virus. Incredibly complicated
[15:25] (925.76s)
software. And I was reading into
[15:27] (927.44s)
Wikipedia a little bit before we kind of
[15:29] (929.36s)
it said no one has claimed credit for
[15:31] (931.68s)
who wrote this thing. Who do you think
[15:33] (933.92s)
wrote this thing? It's
[15:34] (934.96s)
I think it's pretty good. You'd be
[15:37] (937.12s)
pretty safe to say it was the Israelis
[15:38] (938.64s)
and the American government. You know,
[15:40] (940.32s)
my understanding or recollection is that
[15:42] (942.16s)
there are, not watermarks, but sort
[15:44] (944.96s)
of, you know, coding styles or things in
[15:47] (947.04s)
there that sort of implicate both uh
[15:49] (949.92s)
governments.
[15:50] (950.56s)
Have you ever looked at the source code
[15:52] (952.00s)
or played?
[15:52] (952.88s)
I have not. I didn't do any analysis on
[15:54] (954.72s)
Stuxnet. Um, my career was focused
[15:57] (957.44s)
early on analyzing malware like
[15:59] (959.12s)
literally looking at the machine
[16:00] (960.56s)
language and disassembling and so on.
[16:02] (962.48s)
But later on in my career it was mostly
[16:04] (964.08s)
about detection, like how could
[16:05] (965.92s)
I build algorithms to detect that
[16:08] (968.40s)
malware rather than hands-on analyzing
[16:10] (970.40s)
the malware myself. So I'd never looked
[16:12] (972.40s)
at Stuxnet. Yeah.
[16:13] (973.44s)
You mentioned a little bit about uh
[16:15] (975.12s)
assembly code. Did you ever write
[16:17] (977.04s)
assembly code when you were working at
[16:18] (978.80s)
Symantec?
[16:19] (979.68s)
I did. Yeah. I wrote assembly code as an
[16:21] (981.60s)
intern. Um and uh although back in those
[16:25] (985.44s)
days it was mostly C.
[16:27] (987.12s)
Um but some assembly as well. And I
[16:29] (989.28s)
remember the first antivirus engines
[16:30] (990.72s)
were written in assembly for speed. And
[16:32] (992.64s)
one of my first tasks as I joined
[16:34] (994.40s)
full-time was I said, you know, this
[16:36] (996.08s)
really needs to be in C so it's more
[16:37] (997.28s)
maintainable. So we ported the thing to
[16:38] (998.72s)
C and actually made it faster because
[16:40] (1000.56s)
the people back then didn't know
[16:42] (1002.32s)
algorithms. They didn't understand what
[16:44] (1004.24s)
big O was, you know,
[16:46] (1006.32s)
they would do linear searches. And so we
[16:48] (1008.64s)
were able to go and take something in
[16:49] (1009.76s)
assembly language, move it over to C,
[16:52] (1012.08s)
have less code, um, and it would be, you
[16:54] (1014.64s)
know, five times faster. So
[16:56] (1016.64s)
I see. So the speedups moving from
[16:59] (1019.12s)
assembly to C was due to better
[17:02] (1022.00s)
algorithms and things like that. It
[17:03] (1023.68s)
wasn't because of a compiler or
[17:05] (1025.36s)
something.
[17:06] (1026.08s)
No, the compilers weren't that great
[17:07] (1027.36s)
back then. But even without an
[17:08] (1028.48s)
optimizing compiler, if you use a hash
[17:11] (1031.12s)
table or binary search versus a linear
[17:13] (1033.12s)
search over 60,000 signatures, you know,
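A rough sketch of the algorithmic win he's pointing at, with invented signature strings: membership tests against 60,000 signatures go from O(n) per probe with a linear scan to O(1) average with a hash table, no optimizing compiler required.

```python
# Compare a linear scan of 60,000 signatures against a hash-table lookup.
import random
import time

signatures = [f"sig-{i:05d}" for i in range(60_000)]
sig_list = signatures        # list membership: O(n) per probe
sig_set = set(signatures)    # hash table: O(1) average per probe

probes = [f"sig-{random.randrange(60_000):05d}" for _ in range(2_000)]

t0 = time.perf_counter()
hits_list = sum(p in sig_list for p in probes)
t1 = time.perf_counter()
hits_set = sum(p in sig_set for p in probes)
t2 = time.perf_counter()

assert hits_list == hits_set
print(f"linear scan: {t1 - t0:.3f}s   hash table: {t2 - t1:.5f}s")
```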
[17:16] (1036.16s)
I saw that you worked at Symantec for
[17:18] (1038.32s)
a long time and you know, I think in the
[17:20] (1040.48s)
tech industry, it's common for people to
[17:24] (1044.00s)
move around here and there. What do you
[17:26] (1046.48s)
think kept you at Symantec as long as
[17:28] (1048.56s)
you were?
[17:29] (1049.28s)
You know, that's a great question. Um,
[17:31] (1051.68s)
if I have to be perfectly honest, I
[17:33] (1053.36s)
would say imposter syndrome.
[17:36] (1056.40s)
Really? So well yes and no. So at
[17:39] (1059.20s)
Symantec I didn't really have imposter
[17:41] (1061.04s)
syndrome because I had done a lot of
[17:43] (1063.04s)
stuff and I was well regarded you know I
[17:45] (1065.68s)
was known in the company and so I had a
[17:48] (1068.24s)
good safe place but I always worried
[17:50] (1070.80s)
what if it's just because I'm at
[17:52] (1072.96s)
Symantec and I grew up here and I
[17:54] (1074.64s)
learned the stuff here. What if I went
[17:55] (1075.84s)
somewhere else and I wouldn't be able to
[17:57] (1077.28s)
learn the stuff or what if people had
[17:59] (1079.52s)
different standards and what if like I'm
[18:01] (1081.36s)
not good enough for Google or Meta or
[18:04] (1084.00s)
something and so I stayed because it was
[18:06] (1086.80s)
comfortable and I complained I
[18:08] (1088.24s)
complained all the time I wasn't happy
[18:09] (1089.76s)
later on in my career I have to be
[18:11] (1091.28s)
honest with you I wasn't doing things
[18:13] (1093.52s)
that made me happy anymore. When you get
[18:15] (1095.76s)
more senior you do a lot more BS right
[18:17] (1097.92s)
and
[18:20] (1100.72s)
you also have the opportunity not to do
[18:22] (1102.64s)
as much BS but you have to push yourself
[18:25] (1105.20s)
not to do it because it's very easy to,
[18:27] (1107.04s)
you know, go to meetings and, you know,
[18:29] (1109.12s)
have broad discussions and it's not
[18:30] (1110.96s)
really that necessarily fun,
[18:32] (1112.88s)
right?
[18:33] (1113.20s)
Um, and so I wasn't happy near the end
[18:36] (1116.24s)
of my tenure at Symantec, but I was
[18:38] (1118.00s)
afraid that I wouldn't be able to do
[18:39] (1119.68s)
well or I'd fail the interview process.
[18:41] (1121.28s)
And so I just stayed and it was
[18:43] (1123.04s)
comfortable. Throughout your career
[18:45] (1125.04s)
there were so many promotions and you
[18:47] (1127.36s)
had so much impact for someone like you
[18:49] (1129.92s)
to have imposter syndrome you know I
[18:52] (1132.16s)
feel like that shows that a lot of
[18:53] (1133.60s)
people you know it's it's a very natural
[18:56] (1136.08s)
feeling for a lot of people did you
[18:59] (1139.60s)
eventually, you did leave Symantec, so
[19:01] (1141.60s)
was there anything that helped you uh
[19:03] (1143.52s)
overcome imposter syndrome
[19:05] (1145.36s)
you know what the thing that that helped
[19:08] (1148.00s)
me was that somebody said hey we want to
[19:10] (1150.32s)
interview you we think you'd be a good
[19:11] (1151.92s)
fit. And so I said, you know, well, I'm probably
[19:14] (1154.24s)
going to fail this interview. I'm sure
[19:15] (1155.60s)
I'm not good enough, but I'm going to do
[19:17] (1157.20s)
it. And so, I just did it. And so, that,
[19:19] (1159.12s)
you know, I needed an external pull or
[19:21] (1161.60s)
push, I don't know what you would call
[19:22] (1162.56s)
it, but in order to get me to to take
[19:24] (1164.32s)
the chance, and then it worked out. But
[19:26] (1166.40s)
for me, like in my head, I was, you
[19:29] (1169.36s)
know, I wasn't competent to do that job.
[19:32] (1172.24s)
You know,
[19:32] (1172.80s)
You mentioned, uh, also that at the
[19:35] (1175.20s)
highest levels, there's uh, you know, a
[19:37] (1177.52s)
lot of BS and, you know, I guess it
[19:39] (1179.44s)
sounds like meetings and things like
[19:40] (1180.72s)
that. Do you have any uh I guess tips on
[19:44] (1184.00s)
how to be less involved in the BS
[19:46] (1186.80s)
because I think that's a natural pull
[19:48] (1188.72s)
for anyone.
[19:50] (1190.16s)
Yeah, it's sort of natural. Definitely it
[19:52] (1192.80s)
depends what you're doing. I mean some
[19:54] (1194.24s)
senior technical directors and
[19:56] (1196.24s)
distinguished engineers even fellows
[19:57] (1197.76s)
were working day-to-day and building code
[20:00] (1200.08s)
and working with their teams. It
[20:01] (1201.76s)
just depended. Uh, I was an individual
[20:04] (1204.24s)
contributor vice president. So I was an
[20:05] (1205.68s)
IC through my entire time at Symantec.
[20:07] (1207.76s)
Other people would actually manage teams
[20:09] (1209.28s)
and work uh more closely on projects.
[20:12] (1212.40s)
You know, it's just it's inevitable,
[20:13] (1213.92s)
right? In other words, you're having
[20:15] (1215.20s)
more strategic meetings and then the
[20:17] (1217.20s)
problem is you're having a
[20:18] (1218.80s)
strategic meeting with a bunch of
[20:20] (1220.32s)
people, many of whom don't
[20:23] (1223.04s)
necessarily know that much, but they
[20:25] (1225.28s)
have an opinion because everybody has an
[20:26] (1226.64s)
opinion. Um, and there's a lot of
[20:29] (1229.12s)
debating and a lot of arguing and a lot
[20:30] (1230.80s)
of like, you know, posturing for, you
[20:34] (1234.00s)
know, for power. And, you know, it's
[20:37] (1237.12s)
just there's a there's a lot of garbage
[20:39] (1239.36s)
that comes with being more senior,
[20:40] (1240.56s)
unfortunately. Like, there was some some
[20:42] (1242.00s)
joy, especially for me when I got to
[20:44] (1244.00s)
pick my own projects to be able to just
[20:45] (1245.28s)
sit down and literally go two months
[20:46] (1246.80s)
with nobody asking me, what are you
[20:48] (1248.32s)
doing? And, you know, I'm just like
[20:50] (1250.48s)
cranking and trying things. That doesn't
[20:52] (1252.32s)
work, but that does. And super exciting.
[20:54] (1254.24s)
Right.
[20:54] (1254.72s)
Right. Then you get into a room with
[20:57] (1257.28s)
seven people and you're like, "We've
[20:59] (1259.04s)
agreed that this is our new company
[21:00] (1260.40s)
strategy." One of my last rule uh things
[21:02] (1262.32s)
I did at the company was actually define
[21:03] (1263.84s)
the company technology strategy for the
[21:05] (1265.68s)
whole company. And everybody had agreed
[21:07] (1267.12s)
to it. The CEO had agreed to it. And
[21:08] (1268.40s)
then we get in a room and everybody
[21:09] (1269.52s)
would say, "Oh, sure. But you know, we
[21:12] (1272.08s)
have to make money on our projects or
[21:13] (1273.84s)
products." And so, you know, adding
[21:15] (1275.12s)
those features to align with technology
[21:17] (1277.04s)
strategy that's gonna set us back. And
[21:20] (1280.00s)
we've been told we have to make, you
[21:21] (1281.52s)
know, this much topline revenue. And so,
[21:23] (1283.44s)
you know, you end up having debates and
[21:25] (1285.68s)
discussions and it's like very very
[21:27] (1287.84s)
draining.
[21:28] (1288.88s)
So, you said you were pulled into Google
[21:31] (1291.60s)
X and you ended up taking the interview
[21:34] (1294.16s)
and and doing well. I'm curious, what
[21:37] (1297.04s)
was it like entering, you know, Google
[21:39] (1299.44s)
or this, like, FAANG-style big tech? And
[21:41] (1301.92s)
were there any cultural differences that
[21:43] (1303.44s)
stood out to you?
[21:44] (1304.64s)
You know, fewer than you would think. I
[21:47] (1307.28s)
would say the biggest difference that I
[21:49] (1309.84s)
saw was that there were really, really,
[21:52] (1312.16s)
really smart people. Like, Symantec had
[21:54] (1314.72s)
some smart people but again it didn't
[21:56] (1316.24s)
have an engineering culture even when I
[21:57] (1317.60s)
left in 2016 it was starting to develop
[21:59] (1319.44s)
one but it was really you know it was
[22:01] (1321.20s)
a little more loosey-goosey than a
[22:02] (1322.88s)
Google for sure um but the quality of
[22:05] (1325.60s)
the people in Google X and X were really
[22:08] (1328.72s)
very high quality in terms of
[22:10] (1330.08s)
intelligence. Now, what seemed about
[22:12] (1332.56s)
the same was that many people in X, as
[22:16] (1336.00s)
there were at Symantec,
[22:17] (1337.84s)
didn't have good taste, research taste
[22:19] (1339.92s)
or project taste. And so a lot of
[22:22] (1342.72s)
people were really smart, but it wasn't
[22:25] (1345.60s)
clear that they were picking projects
[22:26] (1346.88s)
that would land, or, you
[22:28] (1348.96s)
know, that were feasible, you know,
[22:32] (1352.08s)
at least in my opinion. So I think
[22:34] (1354.24s)
that is an attribute of
[22:36] (1356.40s)
engineers no matter what company, no
[22:38] (1358.24s)
matter how intelligent um people are.
[22:41] (1361.44s)
Um, but it was, uh, you know,
[22:43] (1363.28s)
startling how much
[22:44] (1364.96s)
brilliance there was. And I do remember
[22:46] (1366.88s)
like there was one guy who was clearly
[22:48] (1368.88s)
like over a 200 IQ. The guy was just you
[22:52] (1372.00s)
talked to him and he was just
[22:54] (1374.00s)
astoundingly brilliant and he was still
[22:56] (1376.32s)
at L4.
[22:57] (1377.68s)
Why was he at L4? Because, you know,
[22:59] (1379.84s)
he lacked communication skills. You
[23:02] (1382.88s)
know, worked on really interesting stuff
[23:04] (1384.32s)
that was interesting to him but not
[23:05] (1385.68s)
necessarily had business impact. Didn't
[23:08] (1388.00s)
collaborate well, apparently. You know,
[23:09] (1389.52s)
like there were things, whatever it was.
[23:11] (1391.28s)
And it didn't matter that he was
[23:12] (1392.88s)
brilliant. Like he was twice as smart as
[23:15] (1395.36s)
I was, but you know, just because you
[23:17] (1397.68s)
have intelligence doesn't mean you're
[23:18] (1398.80s)
going to be successful. And so that was,
[23:20] (1400.24s)
you know, saw the same thing there.
[23:22] (1402.00s)
If I'm understanding correctly, if
[23:23] (1403.76s)
you're very ambitious and you really
[23:26] (1406.24s)
want career growth, intelligence is not
[23:29] (1409.84s)
that important. It sounds like there are
[23:31] (1411.60s)
some things that are much more
[23:32] (1412.64s)
important. You cited uh communication,
[23:34] (1414.96s)
soft skills, project taste, picking
[23:37] (1417.12s)
things that actually matter.
[23:38] (1418.56s)
Yeah. Is there anything else that you
[23:40] (1420.16s)
that comes to mind?
[23:41] (1421.44s)
There are definitely people who are less
[23:42] (1422.80s)
intelligent. At
[23:44] (1424.16s)
Google we didn't have too many of those
[23:45] (1425.52s)
people. Like, there were people that were
[23:46] (1426.72s)
really, really... you know, most people were
[23:48] (1428.80s)
really quite smart. So I would say a
[23:50] (1430.96s)
baseline is you need to have a baseline
[23:52] (1432.64s)
level of intelligence. But I, for instance,
[23:54] (1434.00s)
don't think I'm a really intelligent
[23:55] (1435.12s)
person. I take forever to learn new
[23:57] (1437.12s)
things. I have like this ramp which is
[23:58] (1438.80s)
like this you know at least internally
[24:00] (1440.40s)
that's how I feel.
[24:01] (1441.52s)
You don't need that much intelligence to be
[24:03] (1443.20s)
successful but enough. But um so
[24:06] (1446.56s)
communication skills um collaboration
[24:09] (1449.44s)
skills are really important like knowing
[24:11] (1451.36s)
how to work with somebody and not just
[24:12] (1452.88s)
piss them off because you're saying
[24:15] (1455.28s)
you're wrong but figuring out how to you
[24:17] (1457.44s)
know, how to give them
[24:19] (1459.60s)
what they need in order to get what you
[24:21] (1461.36s)
want which by the way I haven't really
[24:22] (1462.64s)
mastered yet. I've screwed that up a
[24:24] (1464.16s)
bunch of times too. But that's one like
[24:26] (1466.16s)
we talked about business outcomes. I'll
[24:27] (1467.52s)
generalize that. I would think something
[24:29] (1469.52s)
that's really important to move up is
[24:30] (1470.96s)
focusing on outcomes. So this is like a
[24:33] (1473.68s)
really important thing and I probably
[24:36] (1476.40s)
did some of it subconsciously or
[24:38] (1478.48s)
unconsciously and some of it after I
[24:40] (1480.48s)
learned about it more consciously. It's
[24:42] (1482.56s)
very easy for people to focus on their
[24:44] (1484.48s)
own outcomes. In other words, they know
[24:46] (1486.08s)
what they'd like to do. They know about
[24:47] (1487.44s)
the technology they want to build. They
[24:49] (1489.04s)
know that they want to make it 10%
[24:50] (1490.80s)
faster or whatever it is. But often
[24:53] (1493.04s)
the outcomes of the company or the
[24:54] (1494.80s)
outcomes that somebody else is trying to
[24:56] (1496.72s)
meet are different than your outcomes.
[24:59] (1499.04s)
And if you don't project yourself into
[25:00] (1500.80s)
their shoes or the company's shoes and
[25:02] (1502.40s)
identify what the company's outcomes or
[25:04] (1504.88s)
divisional outcomes are or the other
[25:06] (1506.48s)
team's outcomes are, people are not
[25:08] (1508.48s)
going to be interested in what you have
[25:09] (1509.52s)
to do, even if it's really complex and
[25:11] (1511.68s)
hard and interesting for you. And so I
[25:15] (1515.28s)
think what really helps far more than
[25:17] (1517.84s)
intelligence is focusing on what
[25:20] (1520.64s)
outcomes need to be solved for a project
[25:24] (1524.08s)
or for the company. What metrics matter
[25:26] (1526.80s)
for those outcomes? Because often there
[25:29] (1529.04s)
are things that don't really matter that
[25:30] (1530.48s)
much like you know there are like
[25:32] (1532.24s)
requirements that are unimportant and
[25:33] (1533.84s)
requirements that are super important.
[25:35] (1535.20s)
So focusing on the most important
[25:36] (1536.40s)
requirements and doing that work and
[25:40] (1540.00s)
focusing on the most important outcomes
[25:41] (1541.84s)
for your division or company and then
[25:43] (1543.76s)
focusing on only the most important
[25:45] (1545.36s)
requirements is going to get you much
[25:48] (1548.08s)
farther than being intelligent.
[25:50] (1550.72s)
I think you have an interesting
[25:52] (1552.80s)
perspective because you're in academia
[25:55] (1555.92s)
because you're lecturing at UCLA, but
[25:58] (1558.00s)
you've also had uh a lot of success um
[26:00] (1560.56s)
in the in the industry as well. When I
[26:03] (1563.20s)
was in school, everything was very
[26:06] (1566.08s)
obviously uh intelligence-based. Maybe
[26:09] (1569.28s)
unless there's a uh well, aside from
[26:12] (1572.40s)
hard work. But, you know, there's
[26:14] (1574.40s)
a test. You either get it right or
[26:16] (1576.32s)
not, aside from group projects and
[26:18] (1578.32s)
things like that obviously coming from
[26:20] (1580.72s)
that place you know intelligence feels
[26:23] (1583.28s)
like everything and then you get to
[26:25] (1585.04s)
industry and I agree with you 100%
[26:27] (1587.60s)
intelligence is not everything in
[26:29] (1589.28s)
industry but because in in college it's
[26:32] (1592.72s)
very obviously meritocratic whereas in
[26:35] (1595.52s)
industry there's other things too like
[26:37] (1597.60s)
do people like you or uh you know other
[26:40] (1600.72s)
things like that would you say that
[26:42] (1602.80s)
career growth is meritocratic in the
[26:45] (1605.28s)
industry
[26:46] (1606.08s)
In my experience, there
[26:49] (1609.20s)
were cases where people were being
[26:50] (1610.56s)
promoted because a vice president
[26:52] (1612.48s)
basically pushed really hard, or an SVP
[26:55] (1615.28s)
pushed really hard and said, you know,
[26:57] (1617.44s)
they need to be promoted, period.
[26:59] (1619.68s)
Otherwise, we're not going to be able to
[27:00] (1620.64s)
keep them. And that's never a good
[27:02] (1622.00s)
reason to promote somebody, right?
[27:03] (1623.84s)
Because you don't uphold standards
[27:05] (1625.28s)
for everybody else to look at and then
[27:06] (1626.48s)
you end up with a bunch of people that
[27:07] (1627.92s)
are not great and everybody's like,
[27:09] (1629.20s)
well, why shouldn't I be promoted
[27:10] (1630.40s)
because that clown is promoted, right?
[27:12] (1632.48s)
Um but by and large I would say it was
[27:15] (1635.60s)
meritocratic. You know I remember when I
[27:18] (1638.16s)
was on these committees we would look at
[27:19] (1639.60s)
the accomplishments and the complexity
[27:21] (1641.04s)
of the accomplishments the impact that
[27:22] (1642.64s)
they made, uh, for the company. We looked at
[27:25] (1645.28s)
the communication skills. We looked at
[27:27] (1647.04s)
for instance patent portfolios. Were
[27:28] (1648.48s)
they helping the company with
[27:29] (1649.36s)
intellectual property um which was
[27:31] (1651.52s)
important back then, I don't know how
[27:33] (1653.28s)
important it is now. But, um, it was
[27:36] (1656.64s)
generally pretty fair. And I saw that at Google
[27:38] (1658.48s)
too. It was very fair. There were, you
[27:40] (1660.00s)
know, very reasoned discussions about
[27:42] (1662.00s)
each person. So, I think it is. I think
[27:44] (1664.72s)
it is. It's just it's not just, you
[27:46] (1666.40s)
know, you ever see those charts where
[27:48] (1668.16s)
they have like a circle and then they
[27:49] (1669.68s)
have, like, a sort of little
[27:51] (1671.04s)
polygon inside, right? Right.
[27:52] (1672.56s)
And it shows like how your intelligence
[27:54] (1674.16s)
is versus your technical work,
[27:57] (1677.20s)
and it had to be pretty, you know,
[27:58] (1678.72s)
pretty well-rounded for the senior levels
[28:00] (1680.80s)
or or at least be really good in some
[28:02] (1682.88s)
areas. So then at Google, you went to
[28:04] (1684.88s)
Google X working uh in a new cyber
[28:07] (1687.20s)
security division and then a company was
[28:10] (1690.08s)
spun out of that, right? Chronicle if I
[28:12] (1692.08s)
recall correctly, but it's still under
[28:14] (1694.24s)
the Alphabet umbrella of companies.
[28:17] (1697.20s)
That's right.
[28:18] (1698.00s)
Which was then reacquired by Google
[28:20] (1700.64s)
Cloud. Could you share a little bit more
[28:23] (1703.36s)
about that story?
[28:24] (1704.48s)
Yeah, sure. So we started as a stealth
[28:26] (1706.24s)
project. Nobody knew we were doing cyber
[28:27] (1707.76s)
security initially. The project name was
[28:29] (1709.28s)
Project Lantern and this is inside of
[28:30] (1710.88s)
X. Um, and we literally started from
[28:33] (1713.60s)
zero. We didn't know what we wanted to
[28:35] (1715.04s)
build. There were lots of debates and we
[28:36] (1716.88s)
knew generally what we wanted to do. But
[28:38] (1718.64s)
we spent, I think, a good six months
[28:40] (1720.24s)
trying to just figure out what we were
[28:42] (1722.80s)
going to build. We then basically
[28:45] (1725.04s)
converged on an idea. We started hiring
[28:47] (1727.28s)
a bigger team, started working on, you
[28:49] (1729.36s)
know, building prototypes of the product
[28:51] (1731.52s)
out. Finally got to an MVP, started, you
[28:54] (1734.08s)
know, shipping, working with partners,
[28:55] (1735.76s)
um, which was great. You know, actually
[28:56] (1736.96s)
seeing real customers use it was super
[28:58] (1738.56s)
useful. And we
[29:00] (1740.64s)
would actually go into customer sites
[29:02] (1742.32s)
before we had the product and just watch
[29:03] (1743.76s)
how they did their work and saw where
[29:05] (1745.04s)
they struggled which was super useful.
[29:07] (1747.04s)
Um really interesting actually like
[29:08] (1748.96s)
watching cyber security teams work. Some
[29:10] (1750.96s)
of the people were like stoned you know
[29:13] (1753.20s)
they're clearly out of it you know like
[29:15] (1755.36s)
these are the people that are using your
[29:16] (1756.96s)
product you know for better or worse.
[29:18] (1758.80s)
So the product was a product that cyber
[29:21] (1761.20s)
security engineers would use.
[29:22] (1762.96s)
Yeah. So best way to think about in a
[29:26] (1766.40s)
nutshell about Chronicle's product which
[29:28] (1768.16s)
was called Backstory: basically, cyber
[29:31] (1771.20s)
security today is a big data game. Okay.
[29:34] (1774.24s)
And what you want to do,
[29:35] (1775.92s)
especially if you're trying to discover
[29:37] (1777.36s)
attacks in your environment or
[29:38] (1778.56s)
investigate attacks, which is a big part
[29:40] (1780.32s)
of cyber security, some of it's
[29:41] (1781.92s)
proactive, some of it's blocking the
[29:43] (1783.12s)
attacks before they come in, but a lot
[29:44] (1784.40s)
of it is they're going to get in and we
[29:46] (1786.24s)
have to know where they are, when they
[29:47] (1787.76s)
got in, what assets they access, and so
[29:49] (1789.84s)
on. And so, as it turns out, virtually
[29:52] (1792.40s)
all software and hardware that's used in
[29:54] (1794.32s)
corporations today generates huge
[29:56] (1796.88s)
amounts of logs. A firewall will have
[29:59] (1799.44s)
every connection, the source IP, the
[30:01] (1801.28s)
target IP, what protocol. Web proxies
[30:03] (1803.92s)
will tell you what websites were
[30:05] (1805.12s)
visited, uh, and again what machine
[30:07] (1807.44s)
visited them. You have telemetry like
[30:09] (1809.12s)
DHCP that tells you, like, what
[30:11] (1811.20s)
IP is associated with what
[30:13] (1813.52s)
MAC address, associated with the machine
[30:15] (1815.44s)
name. You have email logs. You have
[30:17] (1817.92s)
client logs, what software was
[30:19] (1819.36s)
installed, right? All that data is super
[30:22] (1822.24s)
valuable for identifying attacks. Okay?
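A hedged example of why that DHCP telemetry matters when correlating logs; the log shapes here are invented. Firewall logs record only IPs, and IPs get reassigned, so attributing a connection to a machine means joining against the DHCP lease that was active at that timestamp.

```python
# Join a firewall event (ip, time) against DHCP leases to recover the machine.
dhcp_leases = [
    # (lease_start, lease_end, ip, mac, hostname) -- invented format
    (100, 200, "10.0.0.5", "aa:bb:cc:01", "alice-laptop"),
    (250, 400, "10.0.0.5", "aa:bb:cc:02", "bob-laptop"),  # same IP, new owner
]

def attribute(ip: str, ts: int) -> str | None:
    """Which machine held this IP at this moment?"""
    for start, end, lease_ip, _mac, host in dhcp_leases:
        if lease_ip == ip and start <= ts <= end:
            return host
    return None

# A firewall log says 10.0.0.5 talked to a known-bad IP at t=150 and t=300:
print(attribute("10.0.0.5", 150))  # alice-laptop
print(attribute("10.0.0.5", 300))  # bob-laptop: same IP, different machine
```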
[30:25] (1825.12s)
But there's such high volume of that
[30:27] (1827.28s)
data that people couldn't really process
[30:29] (1829.12s)
it. And so the customers that we were
[30:31] (1831.36s)
starting to work with would use a
[30:33] (1833.76s)
competing product which I won't name but
[30:36] (1836.16s)
they would literally go they would
[30:37] (1837.60s)
ingest a certain fraction of that data
[30:39] (1839.84s)
very little of it because it was so
[30:41] (1841.28s)
expensive to maintain it and they would
[30:43] (1843.28s)
go for coffee for 30 minutes while
[30:45] (1845.60s)
waiting for a query to finish to look up
[30:47] (1847.52s)
just one piece of information in that
[30:49] (1849.12s)
data. And so we said, look, we have
[30:51] (1851.60s)
Google planet-sized, uh, compute, you
[30:54] (1854.72s)
know, and storage. How could we totally
[30:57] (1857.04s)
turn this around? And what we did was we
[30:58] (1858.96s)
built a product that would ingest all
[31:00] (1860.64s)
that data, petabytes of data, like
[31:02] (1862.56s)
literally, for some companies, a, you
[31:04] (1864.48s)
know, petabyte a week or a petabyte a month,
[31:06] (1866.48s)
a huge amount of data of, you know,
[31:08] (1868.88s)
every device, every connection, every
[31:11] (1871.68s)
file installation, every settings
[31:13] (1873.28s)
change, like all that kind of stuff. And
[31:15] (1875.12s)
then we indexed it. That's what I worked
[31:16] (1876.88s)
on. That was my sort of you know
[31:18] (1878.80s)
addition like so that it would be more
[31:21] (1881.12s)
like uh at the speed of a Google search
[31:23] (1883.76s)
than a 30 minute let's go get some
[31:25] (1885.36s)
coffee. What would be a use case? Let's
[31:27] (1887.84s)
imagine that you discovered a piece of
[31:30] (1890.40s)
malware on a computer. Uh you might have
[31:32] (1892.24s)
a hash for that malware. You might know
[31:33] (1893.68s)
the IP address where it came down from
[31:35] (1895.12s)
or was downloaded. You might have the
[31:36] (1896.80s)
file name. You might know the directory
[31:38] (1898.80s)
it was installed in. Our product would allow
[31:40] (1900.80s)
you to take any of those artifacts and
[31:43] (1903.52s)
plug it in and then we would instantly
[31:45] (1905.44s)
tell you which devices also had that
[31:48] (1908.96s)
artifact on them, what related artifacts
[31:51] (1911.44s)
there were to that, you know. Um, so you
[31:54] (1914.24s)
could say, "Oh, well, this file had a
[31:55] (1915.92s)
different name even though it had the
[31:57] (1917.28s)
same hash, let's say." And so we know
[31:59] (1919.20s)
better check for this name because this
[32:00] (1920.24s)
might be on some other computer. So you
[32:01] (1921.36s)
could pivot. Um, we could tell you how
[32:04] (1924.00s)
many devices were impacted in your
[32:05] (1925.12s)
environment, whose devices they were,
[32:07] (1927.12s)
when was the first infiltration, when's
[32:09] (1929.12s)
the last infiltration, is it still
[32:10] (1930.72s)
active, and do that in like 2 seconds,
[32:12] (1932.40s)
you know, that kind of those kind of use
[32:13] (1933.76s)
cases.
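A minimal sketch of the kind of artifact index that makes those pivots instant; the field names and the in-memory dict are illustrative assumptions, not Chronicle's actual design. Every sighting of an artifact (hash, IP, filename) is indexed up front, so a query is a lookup rather than a 30-minute scan.

```python
# Inverted index: artifact -> every (device, time, related-artifacts) sighting.
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Sighting:
    device: str
    timestamp: int            # epoch seconds
    related: tuple            # other artifacts seen in the same event

index: dict[str, list[Sighting]] = defaultdict(list)

def ingest(artifact: str, sighting: Sighting) -> None:
    index[artifact].append(sighting)

def pivot(artifact: str) -> dict | None:
    """Which devices had this artifact, first/last seen, and what to pivot to."""
    sightings = index.get(artifact)
    if not sightings:
        return None
    return {
        "devices": sorted({s.device for s in sightings}),
        "first_seen": min(s.timestamp for s in sightings),
        "last_seen": max(s.timestamp for s in sightings),
        "pivots": sorted({r for s in sightings for r in s.related}),
    }

# The same malware hash seen on two machines under two different file names:
ingest("sha256:abcd", Sighting("laptop-17", 1_700_000_000, ("evil.exe",)))
ingest("sha256:abcd", Sighting("db-server-3", 1_700_086_400, ("update.exe",)))
print(pivot("sha256:abcd"))
```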
[32:14] (1934.64s)
As opposed to 30 minutes where, you
[32:16] (1936.24s)
know, um, hopefully it gives you an
[32:18] (1938.56s)
You know, when it comes to Chronicle,
[32:20] (1940.32s)
even just doing the research, I got a
[32:21] (1941.84s)
little confused. Sounds like it spun
[32:23] (1943.68s)
out and then it came back in. What's
[32:26] (1946.40s)
What is the benefit of doing that
[32:29] (1949.28s)
stuff? Why not just do it, you know,
[32:31] (1951.44s)
within Google? And that's kind of that.
[32:33] (1953.44s)
That's a great question. So, I think X's
[32:36] (1956.00s)
initial goal was to create viable
[32:40] (1960.64s)
businesses, you know, that could spin
[32:42] (1962.72s)
out and, you know, be world changing.
[32:46] (1966.00s)
Okay. And so, we were following that
[32:49] (1969.60s)
playbook, which was, hey, let's incubate
[32:51] (1971.52s)
it. Um, let's get really good people to
[32:53] (1973.44s)
work on it. We had really smart people.
[32:55] (1975.44s)
um, let's not encumber the team as
[32:58] (1978.72s)
much as we might a normal Google team.
[33:01] (1981.28s)
Let them work fast. Let them do what
[33:03] (1983.44s)
they need to do, and then let's spin it
[33:05] (1985.76s)
out. You know, I can't really talk about
[33:07] (1987.84s)
why they reacquired it. Partly because I
[33:10] (1990.48s)
only have hypotheses, I don't know, but
[33:12] (1992.88s)
let's just say that it was a tight fit
[33:16] (1996.96s)
with Google Cloud. In other words,
[33:18] (1998.56s)
Google Cloud offers services to
[33:20] (2000.00s)
customers. This was a cloud hosted
[33:21] (2001.52s)
service. It used a lot of storage and a
[33:24] (2004.08s)
lot of compute which by the way Google
[33:26] (2006.48s)
Cloud had and Google Cloud could bill
[33:28] (2008.40s)
for, right? And so I think that, for
[33:32] (2012.72s)
various reasons which I can't talk about
[33:34] (2014.56s)
after we spun out, we were still an
[33:35] (2015.92s)
Alphabet company, like Waymo, okay? So we
[33:37] (2017.76s)
were still, like, under the Alphabet
[33:40] (2020.40s)
umbrella. We were the C in Alphabet, for
[33:42] (2022.64s)
Chronicle. Um, but I think they thought
[33:45] (2025.12s)
you know they had competing products
[33:46] (2026.80s)
they wanted to integrate them; there were
[33:47] (2027.84s)
a bunch of reasons that they brought it
[33:49] (2029.04s)
back in and sort of integrated it so we
[33:51] (2031.20s)
were for a time not part of Google. You
[33:54] (2034.32s)
know, for instance, Google, I don't know if
[33:56] (2036.08s)
you ever heard, but Google performance reviews
[33:58] (2038.64s)
are notoriously lengthy and time-consuming,
[34:01] (2041.52s)
and you have to write pages of stuff
[34:02] (2042.88s)
about yourself and what you did and PRs
[34:05] (2045.68s)
that you did and all the stuff,
[34:07] (2047.28s)
right? At Chronicle, we're like, we
[34:09] (2049.20s)
don't want to waste time on that. We
[34:10] (2050.24s)
want to build a project. So, as soon as
[34:11] (2051.44s)
we spun out, we had one page performance
[34:13] (2053.76s)
reviews and I think they were in a
[34:15] (2055.36s)
Google slide. It wasn't even like a big
[34:17] (2057.44s)
page of written stuff. So, so we could
[34:20] (2060.40s)
move more quickly, you know, and so that
[34:22] (2062.64s)
was great while it lasted.
[34:24] (2064.16s)
Why is it that you eventually left
[34:26] (2066.48s)
Google?
[34:27] (2067.20s)
That's another one which has to do in
[34:29] (2069.04s)
part with imposter syndrome, a
[34:30] (2070.96s)
regular thing in my career. And in part
[34:33] (2073.04s)
it has to do with wanting to work at a
[34:35] (2075.60s)
startup as opposed to in a a bigger
[34:37] (2077.84s)
stodgier sort of Google environment
[34:40] (2080.48s)
where things are slower moving
[34:42] (2082.00s)
and there's more regulations and things
[34:43] (2083.36s)
you have to deal with. Not that we
[34:44] (2084.40s)
didn't have to deal with those things in
[34:45] (2085.36s)
Chronicle, but more so. Um, part of me
[34:49] (2089.52s)
didn't feel like it. Like, there were
[34:51] (2091.28s)
interesting problems that I could have
[34:52] (2092.16s)
done. For instance, the storage
[34:53] (2093.28s)
architecture that we came up with, part
[34:55] (2095.68s)
of which I built and I think my code's
[34:57] (2097.28s)
still in there. I feel very proud of
[34:58] (2098.48s)
that. Um, you know, six, seven years
[35:00] (2100.40s)
later, whatever it is. Part of that was
[35:03] (2103.76s)
an architecture which was very expensive
[35:05] (2105.60s)
and there were ways to basically move
[35:07] (2107.76s)
that into different file formats and get
[35:10] (2110.48s)
off of things like Spanner which was you
[35:12] (2112.08s)
know very heavy weight where I could
[35:14] (2114.32s)
have dived in and tried to solve those
[35:16] (2116.40s)
problems and they'd be nice big juicy
[35:18] (2118.00s)
problems. Um, I didn't have the
[35:19] (2119.76s)
confidence in myself to do that and I
[35:21] (2121.52s)
you know I was afraid especially when
[35:22] (2122.64s)
you get to senior levels there are very
[35:24] (2124.64s)
high expectations for you right and so
[35:26] (2126.96s)
like if you go off and try to do
[35:28] (2128.88s)
something and then people are like well
[35:30] (2130.64s)
what have you been doing the last couple
[35:31] (2131.68s)
months, and it didn't land, and like
[35:33] (2133.28s)
you're like, oh, I tried, you know. I didn't
[35:35] (2135.68s)
feel the confidence in our leadership
[35:37] (2137.36s)
that I could go off and take those risks
[35:39] (2139.92s)
and have them have my back. At Symantec, I
[35:42] (2142.40s)
did. Like, I knew my bosses for years, and
[35:45] (2145.28s)
so they they just knew that if I were
[35:46] (2146.88s)
going to go do something I'd either be
[35:48] (2148.72s)
successful or if I failed it was for a
[35:50] (2150.00s)
good reason and they'd give me the rope.
[35:51] (2151.92s)
Okay. But at Google, I didn't quite feel
[35:54] (2154.40s)
like I had that. And they offered me
[35:55] (2155.60s)
other roles, too. They said, "Oh, you
[35:57] (2157.04s)
want to work on secure databases." Um,
[35:59] (2159.84s)
there were some really interesting
[36:00] (2160.72s)
things there of like how do you compute
[36:02] (2162.24s)
and do database queries entirely in
[36:03] (2163.92s)
encrypted space rather than decrypting
[36:07] (2167.20s)
and doing it, you know, basically taking
[36:09] (2169.04s)
private data and exposing it where
[36:11] (2171.36s)
malware or other attacks get to it or
[36:13] (2173.20s)
interesting things.
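To make that encrypted-computation idea concrete, here is a minimal sketch, assuming a Paillier-style additively homomorphic scheme with toy parameters; it is an illustration of the concept, not anything Google actually built. Multiplying two ciphertexts adds the hidden plaintexts, so a server could combine encrypted values without ever decrypting them.

```python
# Toy Paillier-style additively homomorphic encryption.
# Illustrative only: tiny primes, no padding, not production crypto.
import math
import random

def keygen(p=61, q=53):                       # tiny primes for demo
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)                # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 12), encrypt(pub, 30)
# Multiplying ciphertexts adds the underlying plaintexts: 12 + 30 = 42.
assert decrypt(pub, priv, (c1 * c2) % (pub[0] ** 2)) == 42
```

Real encrypted-query systems layer schemes like this (or fully homomorphic ones) under SQL-style operators, but the core trick is the same: compute on ciphertexts, decrypt only the final answer.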
[36:14] (2174.72s)
I really wanted to try something new, sort of beyond cyber
[36:16] (2176.40s)
security, and while I was at X I was
[36:20] (2180.08s)
talking to all the Waymo guys, especially
[36:22] (2182.00s)
before Waymo even became Waymo, and I was
[36:24] (2184.00s)
learning all about self-driving cars and
[36:25] (2185.68s)
so that was what I really wanted to do
[36:27] (2187.52s)
and that's why I decided to leave.
[36:29] (2189.76s)
Yeah. Before we get into the
[36:30] (2190.96s)
self-driving cars, I'm curious cuz I
[36:33] (2193.28s)
talked to someone who felt uh the the
[36:37] (2197.12s)
high expectations of the highest levels
[36:39] (2199.52s)
was a little bit limiting or kind of put
[36:43] (2203.76s)
too much pressure on them and they
[36:46] (2206.64s)
requested a demotion. Did you ever
[36:49] (2209.12s)
consider something like that?
[36:51] (2211.12s)
Not at Symantec. At Symantec, I could do
[36:52] (2212.88s)
whatever I wanted to and it didn't
[36:53] (2213.92s)
really matter like you know.
[36:55] (2215.76s)
Yeah. But at Google I had thought
[36:58] (2218.00s)
about those types of things. Actually, it
[36:59] (2219.68s)
absolutely came to mind. The problem is
[37:02] (2222.88s)
that even at lower levels, there's
[37:05] (2225.52s)
certain expectations
[37:07] (2227.60s)
that I probably wouldn't have met if I
[37:09] (2229.52s)
wanted to go off for a couple months and
[37:10] (2230.96s)
just think about something, which is
[37:12] (2232.40s)
the way I've been most successful in my
[37:13] (2233.84s)
career.
[37:14] (2234.72s)
And so, you know, I got to be honest
[37:16] (2236.96s)
with you, um I'm a loosey goosey kind of
[37:19] (2239.12s)
guy. Like, I can produce prototypes
[37:23] (2243.28s)
and optimize algorithms and do, you know,
[37:25] (2245.44s)
relatively interesting stuff. But when it
[37:26] (2246.96s)
comes to dotting every i, crossing
[37:28] (2248.56s)
every t, making sure I test every edge
[37:30] (2250.48s)
condition in my unit tests, that's not my
[37:32] (2252.64s)
thing. I don't enjoy that. But at Google,
[37:34] (2254.48s)
that's just what you do, and
[37:36] (2256.48s)
so for me that wouldn't
[37:38] (2258.32s)
have necessarily made a difference,
[37:39] (2259.44s)
because I would have had to do the
[37:40] (2260.56s)
things I didn't like to do in order to
[37:42] (2262.48s)
do the things that I wanted to do
[37:44] (2264.00s)
Okay, going into autonomous vehicles. So
[37:46] (2266.48s)
you went to Lyft; it sounds like that was
[37:48] (2268.48s)
mostly because of personal interest in
[37:50] (2270.56s)
the space. Can you talk about how you
[37:52] (2272.80s)
were hired and the story behind that?
[37:54] (2274.80s)
Sure. So, at Lyft, um I actually had a
[37:57] (2277.68s)
former student from UCLA that was
[37:59] (2279.52s)
working at Lyft and he said, "Oh, you
[38:01] (2281.52s)
should come work here. It's really
[38:02] (2282.56s)
interesting." And I thought, "We'll
[38:04] (2284.16s)
never hire me because I actually tried
[38:05] (2285.44s)
to apply for Whimo when I was in Google
[38:07] (2287.68s)
in X." And you know, again, this is
[38:10] (2290.32s)
another problem with being very senior.
[38:11] (2291.60s)
when you're very senior and you don't
[38:13] (2293.52s)
have domain expertise in a new space,
[38:15] (2295.76s)
they're less likely to say, "Oh, we'll
[38:17] (2297.52s)
take a chance, we'll try you," because
[38:19] (2299.04s)
you're very expensive, right? And, you
[38:21] (2301.28s)
know, you might not have the skills that
[38:22] (2302.88s)
they need, and you know, they're paying
[38:24] (2304.64s)
somebody that they can't use. So,
[38:26] (2306.48s)
Waymo didn't want me inside of Google or
[38:29] (2309.52s)
Alphabet. So, I said, "Well, they're
[38:32] (2312.16s)
probably not going to want me because I
[38:33] (2313.20s)
don't have any experience here, but why
[38:35] (2315.04s)
not?" And so, uh, my former student
[38:37] (2317.44s)
arranged a meeting with their president.
[38:39] (2319.12s)
We had coffee, and then I went
[38:42] (2322.08s)
through a set of interviews; again, seven, eight
[38:44] (2324.00s)
interviews. Those did have some coding
[38:46] (2326.72s)
um and design problems. Um that went
[38:49] (2329.12s)
fine. Fortunately, there was no dynamic
[38:50] (2330.48s)
programming because I can't do that. I
[38:51] (2331.76s)
mean just can't like you know maybe
[38:53] (2333.36s)
certain cases I can but you know I'll
[38:55] (2335.60s)
fail every dynamic programming interview
[38:57] (2337.52s)
uh question. I was hired.
[38:59] (2339.36s)
What were the projects like or what was
[39:01] (2341.12s)
the thing you were most interested in
[39:02] (2342.64s)
working on that you did? So for me the
[39:05] (2345.04s)
big project that I sort of was proud of
[39:08] (2348.56s)
was transforming the architecture
[39:11] (2351.52s)
um from one which was a classic robotics
[39:15] (2355.44s)
architecture like you would have seen in
[39:17] (2357.04s)
the, you know, even in Waymo until
[39:19] (2359.20s)
probably the middle of the 2010s,
[39:22] (2362.08s)
which was largely handwritten algorithms
[39:26] (2366.16s)
and you know sort of not expert systems
[39:28] (2368.00s)
but handwritten algorithms that would
[39:29] (2369.36s)
make decisions like oh it's time to do a
[39:31] (2371.04s)
left turn, let's run the left-turn
[39:32] (2372.96s)
decision-making system and
[39:34] (2374.08s)
figure out is it safe? Do we initiate?
[39:36] (2376.08s)
Do we not initiate? Um those systems you
[39:38] (2378.96s)
know, in the mid-2010s, were using neural
[39:42] (2382.24s)
networks but they were only using it for
[39:44] (2384.80s)
vision. So in other words, you
[39:46] (2386.80s)
know, recognizing vehicles, pedestrians,
[39:48] (2388.72s)
and so on, maybe their, you
[39:51] (2391.76s)
know, angle and so on, and their
[39:54] (2394.72s)
velocity and acceleration. But at Lyft
[39:58] (2398.08s)
they were using that sort of earlier
[40:00] (2400.24s)
approach which I'm sure came up through
[40:02] (2402.00s)
places like Carnegie Mellon, where a lot of
[40:04] (2404.32s)
it was hand-coded. And to me, I looked at
[40:06] (2406.72s)
that especially during my time at X
[40:08] (2408.56s)
seeing neural networks, and at Symantec even
[40:10] (2410.72s)
seeing neural networks and said there's
[40:12] (2412.48s)
got to be a better way, because if you
[40:14] (2414.16s)
start hard-coding an algorithm to figure
[40:16] (2416.00s)
out how to do a lane change, and then
[40:18] (2418.24s)
somebody swerves in front of you, now
[40:20] (2420.96s)
you're doing an avoidance maneuver, right?
[40:22] (2422.64s)
Now, in an avoidance maneuver, do you switch out
[40:24] (2424.96s)
of your lane change algorithm in order
[40:26] (2426.64s)
to do an avoidance algorithm, or do
[40:28] (2428.40s)
you stay in the lane change and handle
[40:29] (2429.68s)
avoidance? It just made no sense. And so,
[40:32] (2432.56s)
you know, the
[40:35] (2435.68s)
stack was also hand-parameterized. So,
[40:37] (2437.36s)
literally, you know, they're tweaking: how
[40:39] (2439.60s)
close do we want to get to the
[40:40] (2440.96s)
curb, you know? Okay, how close are we
[40:43] (2443.52s)
willing to get to pedestrians? What about
[40:44] (2444.88s)
bicyclists? What about... okay,
[40:47] (2447.20s)
so if we get too close to a
[40:48] (2448.80s)
bicyclist today, let's tweak that and
[40:50] (2450.64s)
make that bigger. But wait a second, now
[40:52] (2452.40s)
it's gonna affect our curb distance.
[40:54] (2454.80s)
And so there were,
[40:56] (2456.24s)
you know, a hundred parameters that all
[40:57] (2457.76s)
had to be tweaked, and it was a really
[40:59] (2459.52s)
difficult problem.
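To see why that gets painful, here is a caricature in code of a hand-parameterized planner; the helper, names, and numbers are all invented for illustration, not Lyft's actual stack. Every clearance is a magic number, and the numbers fight each other:

```python
# Hypothetical hand-tuned clearances, in meters. Tweak one and you
# can silently break another constraint somewhere else.
CLEARANCES_M = {
    "curb": 0.30,        # lower this to hug the curb...
    "pedestrian": 1.50,
    "bicyclist": 1.00,   # ...but raising this pushes the car back
}                        # toward the curb on narrow roads.

def lateral_offset(lane_width_m: float, actors: list[str]) -> float:
    """Pick a lateral offset satisfying every clearance at once."""
    left_needed = max((CLEARANCES_M[a] for a in actors), default=0.0)
    right_needed = CLEARANCES_M["curb"]
    if left_needed + right_needed > lane_width_m:
        raise ValueError("no offset satisfies all clearances; re-tune")
    # Center the car between the two binding constraints.
    return (left_needed - right_needed) / 2

print(lateral_offset(3.0, ["bicyclist"]))      # fine today
# print(lateral_offset(1.7, ["pedestrian"]))   # tomorrow's re-tuning session
```

With three parameters this is manageable; with a hundred coupled ones, every fix becomes a game of whack-a-mole.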
[41:00] (2460.72s)
And by the way, companies like Tesla were doing this
[41:01] (2461.92s)
until recently too.
[41:03] (2463.04s)
That approach, and Waymo was doing
[41:05] (2465.36s)
that for a long time, is a dead end.
[41:08] (2468.32s)
It's a dead end. And so the approach
[41:11] (2471.12s)
that, you know, I think probably
[41:12] (2472.88s)
Waymo is using now, I don't know, but my
[41:14] (2474.64s)
guess is, and I know Tesla is now
[41:16] (2476.96s)
using it, they've announced it, is, you know,
[41:19] (2479.28s)
an end-to-end neural network-based
[41:23] (2483.20s)
perception and behavior planner system
[41:25] (2485.76s)
and prediction. You know, basically all of
[41:27] (2487.68s)
them. You know, there might be multiple
[41:29] (2489.04s)
heads on these neural networks, and
[41:30] (2490.48s)
multiple functions that they're
[41:32] (2492.80s)
performing, but effectively it's a
[41:34] (2494.96s)
single or small number of networks
[41:36] (2496.32s)
working together to figure out the
[41:38] (2498.16s)
movement.
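For readers unfamiliar with the term, "multiple heads" looks roughly like the sketch below: one shared trunk produces features, and small per-task output layers read them. The sizes and head names here are invented for illustration, not any company's real model.

```python
# Minimal multi-head network: shared trunk, task-specific heads.
import torch
import torch.nn as nn

class DrivingNet(nn.Module):
    def __init__(self, obs_dim=256, hidden=128):
        super().__init__()
        self.trunk = nn.Sequential(              # shared representation
            nn.Linear(obs_dim, hidden), nn.ReLU())
        self.perception = nn.Linear(hidden, 10)  # e.g. object classes
        self.prediction = nn.Linear(hidden, 6)   # e.g. actor motion
        self.planning = nn.Linear(hidden, 2)     # e.g. steer, accel

    def forward(self, obs):
        h = self.trunk(obs)
        return self.perception(h), self.prediction(h), self.planning(h)

net = DrivingNet()
objects, motion, controls = net(torch.randn(1, 256))
print(controls.shape)  # torch.Size([1, 2])
```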
[41:41] (2501.04s)
And so doing a lane change is not necessarily a
[41:42] (2502.72s)
lane change algorithm, although there
[41:44] (2504.08s)
may be a little bit of that, but it's
[41:45] (2505.36s)
mostly about, you know, we know we need
[41:47] (2507.60s)
to go there. We know there's a left turn
[41:49] (2509.76s)
lane. Let's, you know, let's start
[41:51] (2511.60s)
maneuvering into the left turn lane and
[41:52] (2512.96s)
turning on the signal. And so the
[41:55] (2515.76s)
project that I was most proud of there
[41:58] (2518.24s)
was basically designing, with the head
[42:01] (2521.68s)
roboticists of the team, who are old
[42:04] (2524.16s)
school by the way, an architecture which
[42:06] (2526.24s)
would accommodate basically an end-to-end
[42:09] (2529.04s)
neural network-based driver, but put guard
[42:11] (2531.68s)
rails on it. In other words have a
[42:13] (2533.12s)
safety layer that would ensure that if
[42:15] (2535.52s)
the network went off the rails and told
[42:17] (2537.12s)
it to go through a red light we would
[42:19] (2539.68s)
slam the brakes. The safety layer would
[42:21] (2541.36s)
take precedence over the system. But
[42:23] (2543.36s)
the safety layer was there only for
[42:25] (2545.52s)
basically ensuring you know that
[42:27] (2547.12s)
collisions never happened. Basically
[42:28] (2548.40s)
bringing you to a safe stop, or basically
[42:30] (2550.48s)
you know following legal rules that the
[42:33] (2553.20s)
neural network may not do perfectly.
[42:35] (2555.52s)
Does that make sense?
[42:36] (2556.56s)
Right. That makes sense.
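A minimal sketch of that guardrail pattern, with a stub standing in for the real network and thresholds invented for illustration: the learned driver proposes an action, and a small hand-written safety layer overrides it only to prevent a hard violation such as running a red light or an imminent collision.

```python
# Learned driver proposes; hand-written safety layer disposes.
from dataclasses import dataclass

@dataclass
class Action:
    accel_mps2: float             # negative means braking

def neural_driver(obs) -> Action:
    # Stand-in for the end-to-end network's planning output.
    return Action(accel_mps2=1.0)

def safety_layer(obs, proposed: Action) -> Action:
    """Override only when a hard rule is about to be violated."""
    if obs["light"] == "red" and obs["dist_to_stop_m"] < 30:
        return Action(accel_mps2=-6.0)    # slam the brakes
    if obs["time_to_collision_s"] < 1.5:
        return Action(accel_mps2=-8.0)    # emergency stop
    return proposed                        # otherwise trust the network

obs = {"light": "red", "dist_to_stop_m": 20, "time_to_collision_s": 9.0}
print(safety_layer(obs, neural_driver(obs)))  # Action(accel_mps2=-6.0)
```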
[42:37] (2557.76s)
And so that was... I was very
[42:39] (2559.36s)
proud of that project. Unfortunately
[42:40] (2560.80s)
this is like a bit painful for
[42:43] (2563.44s)
me. I didn't get as much credit for that
[42:44] (2564.80s)
as I would have liked, given the work I
[42:46] (2566.88s)
did which is like one of the things I
[42:48] (2568.56s)
learned is like you really have to toot
[42:50] (2570.48s)
your own horn. You have to really talk
[42:51] (2571.92s)
about what you're doing. Somebody else
[42:53] (2573.44s)
took credit for a lot of that. But the
[42:56] (2576.56s)
hard and really great part of that was
[42:58] (2578.56s)
this was no coding really. This was all
[43:01] (2581.36s)
about working with roboticists who did
[43:04] (2584.08s)
not want to change the approach and
[43:06] (2586.00s)
getting them to consider a new approach,
[43:09] (2589.12s)
working together to define the approach
[43:10] (2590.72s)
together rather than telling them how it
[43:12] (2592.32s)
should be because I didn't quite know.
[43:14] (2594.08s)
But by the way, I thought I had a
[43:15] (2595.52s)
better idea, but they didn't want to
[43:16] (2596.80s)
hear that because I had no degree in
[43:18] (2598.64s)
robotics. And then eventually coming up
[43:21] (2601.68s)
with a product that was a collaboration
[43:24] (2604.56s)
where they were able to buy in and push
[43:26] (2606.08s)
it themselves if that makes sense.
[43:27] (2607.84s)
Right. Right.
[43:28] (2608.72s)
And that was all, like, you know,
[43:30] (2610.72s)
influence, and
[43:33] (2613.20s)
influence not based on my skill,
[43:34] (2614.56s)
because I didn't have any
[43:36] (2616.08s)
autonomous vehicle skill.
[43:37] (2617.44s)
I think that's a common topic that
[43:39] (2619.44s)
people wonder about when they're doing shared
[43:41] (2621.60s)
projects is how do you make sure you get
[43:43] (2623.52s)
credit for what you worked on? And so
[43:45] (2625.44s)
maybe you could talk about that. I wish
[43:47] (2627.36s)
I had a good recipe for that. I, you
[43:50] (2630.64s)
know, in general when you're
[43:52] (2632.64s)
more junior, I think it really makes a
[43:54] (2634.96s)
lot of sense in the moment when you
[43:56] (2636.72s)
finish a project, or you're almost
[43:58] (2638.16s)
done with it to take notes on what you
[44:00] (2640.08s)
did and what the big accomplishments
[44:01] (2641.76s)
were and what the metrics were so that
[44:03] (2643.76s)
when it comes time to write up a promo
[44:05] (2645.68s)
packet or even, you know, your sort of
[44:07] (2647.68s)
yearly review packet, you have all those
[44:10] (2650.56s)
details which you're going to forget
[44:11] (2651.60s)
later on even like two years later when
[44:13] (2653.28s)
you're going up for more senior promo,
[44:14] (2654.88s)
right? So having that I think is useful.
[44:17] (2657.12s)
To be honest with you, I never did that.
[44:18] (2658.88s)
But, like, in retrospect, that would
[44:20] (2660.16s)
have been very useful for me, because I
[44:21] (2661.92s)
would do it out of my head and
[44:23] (2663.28s)
then ask people, like, what was that? How
[44:24] (2664.64s)
much faster was that? Like, what were we now
[44:27] (2667.36s)
able to detect that we couldn't have
[44:28] (2668.88s)
detected before? But I think that does
[44:30] (2670.80s)
help a lot and you have to toot your own
[44:32] (2672.48s)
horn, like, you know, when
[44:34] (2674.24s)
you're writing your performance review,
[44:36] (2676.40s)
without embellishing. I think
[44:38] (2678.48s)
embellishment is really bad actually.
[44:40] (2680.16s)
But, you know, really state what you did
[44:41] (2681.92s)
and the value added and the sections
[44:43] (2683.76s)
that you worked on. Don't claim credit
[44:45] (2685.92s)
for everything. Claim credit for the
[44:47] (2687.68s)
parts that you worked on because I
[44:49] (2689.44s)
guarantee you when a committee is
[44:51] (2691.04s)
reviewing your packet, they're going to
[44:53] (2693.20s)
be like, "Wait, why are they taking
[44:54] (2694.64s)
credit for all this when we know that so
[44:56] (2696.88s)
and so did all of this great work?" And
[44:59] (2699.44s)
then you lose credibility. So, um,
[45:01] (2701.92s)
because I've been on those committees,
[45:03] (2703.12s)
right? I've seen that. I've seen that
[45:04] (2704.24s)
happen. So, you know, be very specific
[45:05] (2705.76s)
and granular about what you did, what
[45:07] (2707.60s)
the benefits were, how you collaborated.
[45:10] (2710.24s)
Um, I think that helps a lot. When
[45:12] (2712.08s)
you're more senior, it's more difficult
[45:14] (2714.24s)
because it's it's a lot of soft power.
[45:16] (2716.24s)
It's a lot of, you know, influence. I
[45:18] (2718.48s)
was actually reluctant like I didn't
[45:20] (2720.64s)
even try to take credit because I didn't
[45:22] (2722.80s)
want to alienate my collaborators
[45:26] (2726.24s)
who also were part of this. And so, I
[45:28] (2728.72s)
didn't go around and start talking
[45:30] (2730.40s)
to the president saying, "We've come up
[45:31] (2731.60s)
with a new approach." I let them talk
[45:33] (2733.28s)
about it. I let their boss talk about it
[45:35] (2735.52s)
and that actually hurt me because I
[45:37] (2737.44s)
never got, you know, when it came to
[45:39] (2739.04s)
review time, they said, "Oh, their
[45:41] (2741.44s)
manager initiated this project." I'm
[45:43] (2743.60s)
like, "What?" Like,
[45:45] (2745.52s)
really? This is news to me.
[45:47] (2747.60s)
Oh, no. So,
[45:48] (2748.88s)
yeah. Yeah. So, that was very... I have
[45:50] (2750.56s)
PTSD from that experience, I have to say.
[45:52] (2752.72s)
Uh, before we leave Lyft, I'm kind of
[45:55] (2755.20s)
curious. Um, because that space is super
[45:58] (2758.16s)
competitive. There's like,
[45:59] (2759.84s)
I don't know, a billion different
[46:01] (2761.44s)
self-driving companies, especially back then.
[46:04] (2764.88s)
Um, what did it look like for Lyft to
[46:07] (2767.52s)
kind of win in that space?
[46:09] (2769.52s)
You know, it wasn't clear that we had a
[46:11] (2771.36s)
strategy to win in that space. We were,
[46:13] (2773.28s)
you know, definitely a nascent
[46:15] (2775.20s)
organization. I think they'd been around
[46:16] (2776.72s)
a couple years when I first started
[46:18] (2778.40s)
versus like 10 years for Google and,
[46:20] (2780.72s)
you know, X working on that technology
[46:22] (2782.48s)
and then Waymo. So, um, I didn't go in,
[46:26] (2786.56s)
for instance, thinking that I would be
[46:28] (2788.16s)
building the next generation of
[46:29] (2789.36s)
self-driving car that would actually
[46:31] (2791.20s)
overtake a Waymo. I went in thinking
[46:33] (2793.60s)
this is an opportunity to learn
[46:34] (2794.64s)
something new and work with really smart
[46:36] (2796.08s)
people. Um, and that was my outcome that
[46:38] (2798.40s)
I was trying to achieve. Um, so I don't
[46:40] (2800.88s)
know if there was a plan necessarily.
[46:43] (2803.68s)
And the division was eventually sold to
[46:45] (2805.60s)
a Toyota uh subsidiary. So, they got the
[46:48] (2808.88s)
IP, which is great for them. And you
[46:50] (2810.80s)
know, at that point, I said, "Okay, I'm
[46:52] (2812.32s)
retiring now."
[46:53] (2813.52s)
You know, I'm kind of curious because
[46:55] (2815.36s)
how did you get into actually becoming a
[46:59] (2819.04s)
professor at UCLA and lecturing there? I
[47:01] (2821.76s)
know you get a lot of joy out of it.
[47:03] (2823.60s)
What's the story behind going back and
[47:05] (2825.60s)
and lecturing? So, back when I was in my
[47:10] (2830.56s)
late teens, early 20s, I was teaching
[47:14] (2834.24s)
programming in a place called Learning
[47:15] (2835.76s)
Tree, which you probably have never
[47:17] (2837.04s)
heard of, but Learning Tree is a
[47:18] (2838.56s)
for-profit school where they will teach
[47:21] (2841.52s)
you gardening, guitar, knitting, and
[47:25] (2845.60s)
back then they started teaching
[47:26] (2846.96s)
programming. And it wasn't very easy to
[47:29] (2849.36s)
find people who could teach programming
[47:30] (2850.72s)
because it was early. That was probably
[47:32] (2852.88s)
in uh 1990 1991. So I applied because
[47:37] (2857.36s)
one of my friends was doing it and I
[47:39] (2859.52s)
really enjoyed it and I was teaching
[47:41] (2861.12s)
people from DeVry. You ever heard of DeVry?
[47:43] (2863.20s)
Oh, I've heard of that.
[47:43] (2863.92s)
They used to do those commercials, right?
[47:45] (2865.20s)
Well, they were trying to learn from me
[47:46] (2866.96s)
at Learning Tree so they could teach at
[47:48] (2868.40s)
DeVry back then, because they didn't even
[47:49] (2869.76s)
have a programming class there,
[47:51] (2871.52s)
right? Okay. So, I've been teaching
[47:52] (2872.56s)
people and really enjoyed it and it felt
[47:54] (2874.16s)
really good to, uh, get up in front of
[47:57] (2877.52s)
people and explain things and try to be
[47:58] (2878.96s)
really clear and uh so I had had some
[48:02] (2882.56s)
experience doing that. And at UCLA when
[48:04] (2884.08s)
I was in undergrad uh I would teach
[48:05] (2885.84s)
little classes. We found a room in the
[48:08] (2888.08s)
evening and I'd just invite people who had
[48:09] (2889.92s)
problems with some of the material and
[48:11] (2891.20s)
we'd just go over it on the whiteboard
[48:12] (2892.40s)
together or blackboard. Um so I enjoyed
[48:15] (2895.20s)
that kind of stuff. And when I was at
[48:16] (2896.56s)
Semantic, one of my colleagues was uh a
[48:19] (2899.28s)
guy who was a part-time lecturer at
[48:20] (2900.80s)
UCLA. We were having lunch and I said,
[48:22] (2902.88s)
"Oh, you know, I really enjoy teaching
[48:24] (2904.32s)
and he's like, "Oh, you should apply."
[48:25] (2905.92s)
And I'm like, "UCLA would never hire me.
[48:28] (2908.40s)
I don't have a PhD. It'll never happen."
[48:31] (2911.76s)
And I uh he said, "You know, let me
[48:34] (2914.08s)
bring you an application just in case."
[48:36] (2916.16s)
And I said, "Okay, whatever you want."
[48:38] (2918.24s)
And like a couple days later, he brings
[48:39] (2919.76s)
this application. He's like, "Here."
[48:41] (2921.52s)
Filled it out, gave it back to him.
[48:43] (2923.20s)
Didn't hear anything for months. I don't
[48:44] (2924.72s)
remember if it was 6 months, a year, I
[48:46] (2926.00s)
don't remember how long it was. And two
[48:47] (2927.92s)
weeks before winter quarter of 2001, so
[48:49] (2929.92s)
like December of 2000, I get this call,
[48:52] (2932.88s)
a frantic call from UCLA, can you still
[48:55] (2935.76s)
teach because our lecturer bailed on us?
[48:59] (2939.12s)
And I said, of course. So I had two
[49:01] (2941.36s)
weeks to plan a curriculum and basically
[49:04] (2944.08s)
teach an undergrad course at UCLA. And you
[49:07] (2947.12s)
know, it's 25 years later now. So
[49:09] (2949.12s)
I'm very glad that they did end up
[49:11] (2951.36s)
picking you because you are one of the
[49:13] (2953.68s)
best lecturers, I think, and a lot of
[49:16] (2956.72s)
my peers think so as well, and that's why
[49:19] (2959.20s)
I kind of want to ask you how do you
[49:20] (2960.80s)
make these CS lectures so engaging and
[49:24] (2964.48s)
interesting.
[49:25] (2965.28s)
Well first of all it's very kind of you
[49:26] (2966.72s)
to say that. Um so there are a couple
[49:29] (2969.76s)
things that go through my mind. So first
[49:30] (2970.96s)
thing is remember I told you I didn't
[49:32] (2972.08s)
think I was that smart. So I think
[49:34] (2974.48s)
whatever intelligence I have and
[49:35] (2975.76s)
whatever that is whatever level that is
[49:38] (2978.00s)
helps me write better lectures, because I
[49:40] (2980.24s)
feel like unless I can understand
[49:42] (2982.32s)
something myself, being, I think, sort of
[49:44] (2984.88s)
slow, other people can't understand it
[49:47] (2987.12s)
either. And so I try to design lectures for
[49:49] (2989.60s)
what I think is one of the lower common
[49:51] (2991.52s)
denominators which is myself and I don't
[49:54] (2994.24s)
try to teach to the top 5% of the class
[49:56] (2996.96s)
maybe that's a problem for some people
[49:58] (2998.48s)
but I try to teach for the maybe the
[50:00] (3000.88s)
30th percentile or 50th percentile and I
[50:03] (3003.04s)
think, like, what
[50:05] (3005.84s)
would I want to know if I were being
[50:07] (3007.04s)
taught this for the first time? I try to
[50:08] (3008.32s)
have empathy for the student. I think
[50:10] (3010.48s)
like where are they coming from? What
[50:11] (3011.76s)
have they learned about? Do
[50:13] (3013.20s)
they even know this concept? Should I
[50:14] (3014.48s)
introduce this first before I do that?
[50:16] (3016.64s)
So I think a lot about like I try to put
[50:19] (3019.52s)
myself in their shoes and ask what would
[50:21] (3021.36s)
they know? What concepts were they going
[50:23] (3023.20s)
to be fuzzy on versus concepts they'll
[50:25] (3025.28s)
have pretty solid, where I can
[50:26] (3026.56s)
just use the concept and explain it. And
[50:29] (3029.12s)
so that's a lot of what goes into my
[50:30] (3030.56s)
lectures. And so just for now, like, I'm
[50:32] (3032.48s)
trying to improve some slides. I'm
[50:34] (3034.96s)
literally spending days back and forth
[50:37] (3037.84s)
with ChatGPT o3 discussing concepts and
[50:42] (3042.00s)
trying to simplify it so much but still
[50:44] (3044.24s)
get the essence right and then I'll go
[50:46] (3046.32s)
to Gemini and verify, figure out
[50:48] (3048.24s)
where they have differences, and then...
[50:49] (3049.36s)
because there's not a lot of
[50:50] (3050.72s)
materials on the internet that are
[50:51] (3051.60s)
actually really good to be honest with
[50:53] (3053.04s)
you. Um and the textbooks all suck too.
[50:55] (3055.20s)
I have to say we talked about outcomes
[50:56] (3056.96s)
earlier and for me an important outcome
[50:59] (3059.12s)
is that students not only learn
[51:00] (3060.48s)
something but enjoy the process. They
[51:03] (3063.36s)
I want them to have a good time,
[51:05] (3065.52s)
and so I'm always thinking when I'm
[51:07] (3067.92s)
making slides, can I make them funny?
[51:09] (3069.76s)
Can I make some joke or something
[51:11] (3071.68s)
silly? Can I make them like sort of
[51:13] (3073.44s)
colorful or you know add like an emoji
[51:15] (3075.84s)
or something to make it a little bit
[51:18] (3078.00s)
more fun so that they there's just a
[51:20] (3080.08s)
little bit of surprise when they come to
[51:21] (3081.68s)
class. They never know what they're
[51:22] (3082.64s)
going to see. It might be a little
[51:23] (3083.44s)
inappropriate, you know, might be a
[51:25] (3085.44s)
little silly because I don't just want
[51:27] (3087.84s)
them to learn. I want them to learn and
[51:29] (3089.44s)
enjoy. They want to come to class.
[51:31] (3091.44s)
One thing I'm also curious about, because I
[51:33] (3093.28s)
think a lot of people are scared of
[51:35] (3095.20s)
public speaking, but, you know, you're
[51:37] (3097.36s)
you're very good at it. Do you have any
[51:39] (3099.52s)
tips on speaking well?
[51:41] (3101.92s)
Practice, just practice a lot. The more
[51:44] (3104.16s)
you do, the more fluent you're going to
[51:45] (3105.84s)
get. I even find like when I'm not
[51:47] (3107.92s)
teaching, because I'm part-time teaching
[51:49] (3109.28s)
right now, I'm retired, and I'm taking
[51:50] (3110.96s)
my dog for walks and working on side
[51:52] (3112.80s)
projects, playing with LLMs and stuff.
[51:55] (3115.28s)
Um, and I find that my speaking
[51:57] (3117.92s)
deteriorates over time when I'm not
[51:59] (3119.76s)
actively using it even over the course
[52:01] (3121.20s)
of a year. That might just be because I'm
[52:02] (3122.88s)
getting older or whatever, but in
[52:04] (3124.24s)
general, I found it too, like, earlier in
[52:06] (3126.00s)
my career. So, if you want to get better
[52:08] (3128.00s)
at presenting, if you want to get better
[52:09] (3129.44s)
at communicating, um, you have to
[52:11] (3131.68s)
practice a lot. And so that might mean
[52:13] (3133.60s)
getting a lunchroom and giving a talk
[52:15] (3135.12s)
about a project you're working on so
[52:17] (3137.12s)
that you can explain to other people
[52:18] (3138.40s)
what you're doing even if you don't need
[52:20] (3140.08s)
to. You're not doing it because it's
[52:22] (3142.08s)
required to transfer over your project
[52:23] (3143.68s)
or to integrate some technology just to
[52:25] (3145.68s)
do it. And people love that, by the way.
[52:28] (3148.00s)
And you'll probably screw it up the
[52:29] (3149.84s)
first couple times. You'll get better
[52:31] (3151.04s)
and better at it. And eventually you'll
[52:33] (3153.28s)
find that people will listen to you and
[52:35] (3155.68s)
take you more seriously if you're a
[52:37] (3157.92s)
great presenter, a great communicator.
[52:39] (3159.84s)
And in fact, a story really quickly, back
[52:42] (3162.24s)
to my time at Google X. Um, I remember I
[52:45] (3165.12s)
actually had a talk that I used to give
[52:46] (3166.32s)
at Symantec, and I got permission. It was
[52:49] (3169.60s)
on Stuxnet, I think, and maybe on
[52:49] (3169.60s)
malware detection. I had another one and
[52:52] (3172.40s)
I got permission from Symantec to give
[52:54] (3174.40s)
that talk privately inside of X. And
[52:57] (3177.36s)
I remember people coming up to me
[52:58] (3178.72s)
afterwards and saying, "Wow, you know,
[53:00] (3180.32s)
you're one of the smartest people I've
[53:01] (3181.68s)
ever met." I'm like, "Little, you really
[53:03] (3183.04s)
don't know." But people will think
[53:04] (3184.64s)
you're intelligent and they will give
[53:06] (3186.16s)
you more credit and they'll introduce
[53:07] (3187.92s)
you to other opportunities based on your
[53:10] (3190.24s)
ability to communicate because people
[53:11] (3191.76s)
associate that with intelligence if that
[53:13] (3193.68s)
makes sense. Um, and I can go on
[53:16] (3196.16s)
endlessly about times in my career where
[53:18] (3198.16s)
communicating effectively helped my
[53:20] (3200.24s)
career. So it's super important. But
[53:22] (3202.40s)
yeah, practice
[53:23] (3203.28s)
You know, with all the LLMs and, you know,
[53:25] (3205.44s)
AI coming in, are you seeing people cheat
[53:27] (3207.60s)
more often with LLMs? Are people
[53:29] (3209.92s)
learning less or more? Uh, last year in
[53:33] (3213.44s)
fall I allowed students to use LLMs not
[53:36] (3216.40s)
to write the whole project but for
[53:37] (3217.84s)
autocomplete or to write simple you know
[53:40] (3220.72s)
parts of the project which weren't
[53:42] (3222.24s)
really relevant to the material that we
[53:43] (3223.84s)
were covering and I think in retrospect
[53:46] (3226.80s)
you know that was a bad idea because I
[53:48] (3228.24s)
think people were autocompleting a lot
[53:49] (3229.76s)
more than just, like, a function to uppercase
[53:51] (3231.68s)
a string or something you know like that
[53:53] (3233.12s)
kind of thing. You know, they
[53:54] (3234.88s)
were using it, uh, in ways that hindered
[53:57] (3237.68s)
learning. I've recently been reading a
[53:59] (3239.60s)
bunch of papers about how it helps and
[54:01] (3241.28s)
hinders learning, and actually, I don't
[54:02] (3242.88s)
know if you saw this recently a study at
[54:04] (3244.32s)
MIT that like synaptic connections are
[54:08] (3248.00s)
down from like 70% to 50% or something.
[54:11] (3251.60s)
Uh I forget the numbers but like I just
[54:13] (3253.76s)
saw a blurb: when people use LLMs to solve
[54:16] (3256.08s)
a problem rather than working through it
[54:17] (3257.76s)
themselves. So I'm actually this coming
[54:20] (3260.56s)
fall I'm going to allow LLMs for
[54:23] (3263.12s)
learning, like clarifying concepts, asking,
[54:26] (3266.08s)
what does this mean? But I will
[54:28] (3268.40s)
not allow them ideally for projects
[54:30] (3270.96s)
because I believe that it did hurt
[54:33] (3273.04s)
understanding and we saw that in exams
[54:34] (3274.64s)
Like, if you look at exam
[54:35] (3275.60s)
understanding versus project scores,
[54:37] (3277.60s)
there's a big delta: perfect project
[54:39] (3279.76s)
scores but, uh, getting everything wrong.
[54:42] (3282.32s)
Although I have to say the projects were
[54:43] (3283.68s)
not solvable entirely with LLMs, but
[54:46] (3286.32s)
you know, with the latest models now you
[54:48] (3288.32s)
could probably get a 95% on
[54:51] (3291.20s)
them with really bad code. But we were
[54:53] (3293.04s)
evaluating correctness, not code. How
[54:55] (3295.12s)
do you even tell if they're using LLMs,
[54:57] (3297.44s)
though?
[54:58] (3298.16s)
Um, well, one way you can tell is that
[55:01] (3301.36s)
when they autocomplete, often these
[55:03] (3303.28s)
LLMs will generate
[55:05] (3305.76s)
error checking, right? The error
[55:08] (3308.00s)
checking will typically have an
[55:09] (3309.12s)
exception with an error, a message, and
[55:11] (3311.44s)
the messages will be very consistent
[55:13] (3313.20s)
across different implementations.
[55:15] (3315.60s)
And so, in fact, we have cheating
[55:17] (3317.28s)
software that does n-squared checks,
[55:18] (3318.80s)
you know, everybody's project
[55:20] (3320.16s)
against everyone else's project. And we
[55:22] (3322.00s)
will have flags where it's like 30% of
[55:24] (3324.56s)
this code is similar and a lot of it is
[55:27] (3327.12s)
like word for word similar error
[55:29] (3329.92s)
messages and similar variable names and
[55:32] (3332.64s)
you know, like, similar idioms. And so it's
[55:34] (3334.80s)
pretty obvious people are using these
[55:36] (3336.48s)
things extensively.
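A toy version of that kind of pairwise check, my sketch rather than UCLA's actual tool: score every pair of submissions, which is where the n-squared comes from, by shared token 5-grams. Word-for-word identical error messages are exactly the overlap that pushes a pair over the threshold.

```python
# Flag suspiciously similar submission pairs via token 5-gram overlap.
from itertools import combinations

def ngrams(code: str, n: int = 5) -> set[tuple[str, ...]]:
    toks = code.split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def similarity(a: str, b: str) -> float:
    ga, gb = ngrams(a), ngrams(b)
    return len(ga & gb) / max(1, len(ga | gb))   # Jaccard index

def flag_pairs(submissions: dict[str, str], threshold: float = 0.30):
    for (s1, c1), (s2, c2) in combinations(submissions.items(), 2):
        score = similarity(c1, c2)               # n*(n-1)/2 comparisons
        if score >= threshold:
            yield s1, s2, round(score, 2)

subs = {
    "alice": 'raise ValueError("Input list must not be empty")',
    "bob":   'raise ValueError("Input list must not be empty")  # check',
    "carol": 'assert xs, "list empty"',
}
print(list(flag_pairs(subs)))  # [('alice', 'bob', 0.6)]
```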
[55:37] (3337.60s)
You know a lot of people are worried
[55:38] (3338.80s)
about LLMs kind of automating software
[55:41] (3341.76s)
engineering and you know should they
[55:43] (3343.60s)
even get a software engineering degree
[55:45] (3345.28s)
anymore? You know what do you think
[55:46] (3346.72s)
about that? It's a great question and I
[55:49] (3349.36s)
and I think that it depends on what
[55:52] (3352.72s)
these models evolve into. Imagine you
[55:55] (3355.44s)
could literally give a project to an LLM,
[55:59] (3359.28s)
just like a junior engineer and it would
[56:01] (3361.60s)
produce a component which is largely
[56:03] (3363.68s)
correct, with tests, with security
[56:07] (3367.04s)
factored in, with proper modularity, DI,
[56:10] (3370.08s)
all the other good stuff you want. If we
[56:12] (3372.24s)
get to that point where bigger and
[56:14] (3374.72s)
bigger tasks of more complexity are
[56:16] (3376.88s)
solved correctly, with good style and
[56:19] (3379.20s)
no code smell and all the other good
[56:20] (3380.72s)
stuff, let's call that, like,
[56:23] (3383.12s)
AGI programming, for
[56:25] (3385.68s)
the time being. Contrast that with what
[56:27] (3387.44s)
we have today, which is models that can
[56:29] (3389.04s)
actually solve tightly specified sub-
[56:31] (3391.36s)
problems pretty well, build tests for
[56:33] (3393.12s)
them, but you still need some, you know,
[56:36] (3396.32s)
supervision. You know, did it use a
[56:38] (3398.72s)
good algorithm? Did it, like, do a deep
[56:40] (3400.72s)
copy when it should have done a shallow
[56:42] (3402.00s)
copy of a data structure? All this
[56:43] (3403.68s)
stuff, right?
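That deep-versus-shallow-copy slip is a classic; here is a self-contained example of the kind of subtle bug a human reviewer still has to catch:

```python
# Copying the outer list while silently sharing the inner ones.
import copy

grid = [[0, 0], [0, 0]]
shallow = copy.copy(grid)     # new outer list, same inner lists
deep = copy.deepcopy(grid)    # fully independent structure

shallow[0][0] = 9
print(grid[0][0])   # 9  -- the "copy" mutated the original
deep[1][1] = 7
print(grid[1][1])   # 0  -- the deep copy left the original alone
```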
[56:46] (3406.16s)
I think if we stay in a world where we get really good but
[56:48] (3408.16s)
not AGI good based on that definition I
[56:50] (3410.80s)
gave you earlier, I think software
[56:52] (3412.24s)
engineers... you know, software
[56:54] (3414.64s)
engineering will still be a great field
[56:56] (3416.80s)
to get into because someone is going to
[57:00] (3420.24s)
have to go and look at that code and
[57:02] (3422.64s)
understand the mission of the company
[57:03] (3423.92s)
and understand the standards and so on
[57:05] (3425.84s)
and then make sure that it's doing the
[57:07] (3427.84s)
right thing and that requires real
[57:09] (3429.76s)
thinking and introspection and
[57:11] (3431.76s)
corrections and you know and even if you
[57:14] (3434.24s)
have the model fix things you still have
[57:15] (3435.92s)
to know what to have it fix. Um, and so I do
[57:18] (3438.64s)
think that having that degree and having
[57:20] (3440.48s)
the skill of being able to write code and
[57:22] (3442.40s)
read code is super valuable. And we could
[57:25] (3445.20s)
talk about, if you want, like, what
[57:27] (3447.04s)
about all the jobs going down right now,
[57:28] (3448.64s)
you know, which may be caused
[57:30] (3450.32s)
by LLMs, probably in part. So
[57:33] (3453.60s)
that's one situation. The other situation:
[57:35] (3455.28s)
if you truly have an AGI where you can
[57:37] (3457.04s)
go and delegate something to it and it
[57:38] (3458.88s)
will do, like, an L5 job, it will do a
[57:41] (3461.44s)
really good job, it might need a
[57:43] (3463.52s)
couple tweaks, but they're minor
[57:45] (3465.68s)
tweaks, takes on projects that would take
[57:48] (3468.08s)
weeks, just gets them done. Um, I think the
[57:51] (3471.36s)
world's a lot different. And in
[57:53] (3473.76s)
that world, where literally all the
[57:56] (3476.64s)
software of whatever complexity can be
[57:58] (3478.56s)
written correctly, securely, with the right
[58:01] (3481.04s)
tests, uh, I think all bets are off, and I
[58:03] (3483.60s)
think it's a different set
[58:05] (3485.36s)
of skills that are necessary personally
[58:06] (3486.88s)
and I can tell you what I think those
[58:07] (3487.84s)
are but I think it's different.
[58:09] (3489.12s)
What are those skills? Project
[58:11] (3491.04s)
management, soft skills, those things.
[58:13] (3493.28s)
So in a world where software can be
[58:15] (3495.68s)
written like arbitrarily complex
[58:17] (3497.76s)
software can be written tested you know
[58:21] (3501.04s)
good style everything is you know good
[58:23] (3503.28s)
stuff like you'd expect of a senior
[58:24] (3504.64s)
engineer. To me, the world changes into
[58:27] (3507.68s)
a place where engineers
[58:31] (3511.04s)
are going to be focused, and maybe
[58:32] (3512.08s)
they're not even engineers anymore, on
[58:33] (3513.68s)
what we build, not how we build it.
[58:36] (3516.16s)
Okay? So, what problem are we solving?
[58:38] (3518.32s)
Why are we solving it? I think, in
[58:40] (3520.72s)
that world, the greatest engineers will
[58:42] (3522.88s)
be people who really understand a
[58:44] (3524.56s)
problem that they're trying to solve for
[58:45] (3525.92s)
a customer. That might be an internal
[58:47] (3527.76s)
customer, might be a customer
[58:49] (3529.12s)
like a consumer, might be a
[58:51] (3531.04s)
business, and you know exactly what
[58:53] (3533.28s)
pains they're facing and how they
[58:54] (3534.56s)
measure success and what gets them
[58:55] (3535.92s)
really pissed and what is hard for them
[58:57] (3537.52s)
to do now and you want to make easy. And
[58:59] (3539.52s)
then figuring out how to really clearly
[59:02] (3542.40s)
communicate to an LLM those requirements
[59:05] (3545.12s)
to get it to do all that hard
[59:06] (3546.80s)
programming work that you would have had
[59:08] (3548.64s)
to do over weeks and months. And that is
[59:10] (3550.96s)
a really hard problem in its own right.
[59:12] (3552.80s)
So, people who have really great
[59:15] (3555.12s)
clarification skills, really great sort
[59:17] (3557.60s)
of uh outcome analysis skills like what
[59:20] (3560.40s)
outcomes is the customer trying to
[59:21] (3561.84s)
achieve, what are the metrics by which
[59:23] (3563.36s)
the customer measures success, what's
[59:25] (3565.60s)
important to them, what's not important
[59:26] (3566.80s)
to them in those outcomes, how can we
[59:29] (3569.44s)
communicate to a model in a way that
[59:30] (3570.88s)
tells a model what we need it to do and
[59:34] (3574.00s)
to meet those requirements because
[59:35] (3575.28s)
models will get some of it wrong. Those
[59:37] (3577.12s)
are the people who are going to be
[59:38] (3578.00s)
successful. And in that world, I think
[59:40] (3580.32s)
you have big companies like Google and
[59:42] (3582.40s)
Meta and so on and Amazon. But you're
[59:44] (3584.48s)
going to have 10,000 smaller companies
[59:46] (3586.32s)
that are going to now be able to tackle
[59:48] (3588.08s)
problems like building software for pet
[59:50] (3590.00s)
sitters that never was, you know,
[59:52] (3592.32s)
tackled before, you know, or building
[59:54] (3594.48s)
software for like doggy daycare. I think
[59:56] (3596.48s)
about that because our dog goes to a
[59:57] (3597.68s)
doggy daycare and the software just
[59:58] (3598.88s)
sucks that they use, right? It's really
[60:00] (3600.80s)
bad. So if you have a bunch of domain
[60:03] (3603.52s)
experts who can use these tools, now you
[60:05] (3605.92s)
have a million small businesses each
[60:07] (3607.52s)
solving a problem in a way that's really
[60:10] (3610.64s)
perfect for those customers and not
[60:12] (3612.64s)
having to worry about the engineering.
[60:13] (3613.92s)
Does that make sense? So, it's a
[60:15] (3615.20s)
different world. Still a lot of software
[60:17] (3617.04s)
engineers, but different skills. What
[60:18] (3618.88s)
you said, it sounds like, and I don't
[60:20] (3620.32s)
know the product management function uh
[60:23] (3623.12s)
description too well, but it
[60:25] (3625.92s)
sounds like someone who's understanding
[60:27] (3627.52s)
the customer, the business,
[60:29] (3629.68s)
communicating well. It's almost like the
[60:31] (3631.68s)
LLM is like a software engineering team,
[60:34] (3634.56s)
but it's, you know, a little query engine.
[60:36] (3636.56s)
It would be like a competent product
[60:37] (3637.76s)
manager, and I gotta say, like, in my
[60:41] (3641.44s)
lifetime I've met very few
[60:42] (3642.96s)
competent product managers, but yes,
[60:44] (3644.80s)
product manager would be a
[60:47] (3647.12s)
good name for it. Yeah, if
[60:48] (3648.24s)
product managers were actually
[60:49] (3649.84s)
competent. Most of them are not, I got
[60:50] (3650.16s)
to say.
[60:50] (3650.16s)
What makes a competent product
[60:52] (3652.16s)
manager, by the way?
[60:53] (3653.44s)
You know, I think a competent product
[60:54] (3654.88s)
manager, a lot of them are good at
[60:56] (3656.72s)
communicating, although many are not,
[60:58] (3658.16s)
but a competent product manager in my
[60:59] (3659.76s)
mind is somebody who really understands
[61:01] (3661.60s)
again customer outcomes and customer
[61:03] (3663.52s)
metrics. So like for instance, if you've
[61:04] (3664.96s)
heard of jobs to be done or
[61:06] (3666.72s)
outcomedriven innovation, these are
[61:08] (3668.48s)
methodologies which are more
[61:10] (3670.64s)
deterministic, uh, than just
[61:14] (3674.08s)
touchy-feely. So they actually have
[61:16] (3676.64s)
methodologies where they say this is how
[61:18] (3678.16s)
you discover a customer's outcomes like
[61:19] (3679.84s)
what what jobs are they trying to get
[61:21] (3681.12s)
done during their day where are they
[61:22] (3682.80s)
struggling how do they measure success
[61:24] (3684.72s)
like what metrics are important to them
[61:26] (3686.72s)
like you know maybe you know like for
[61:29] (3689.20s)
instance when I'm brushing my teeth I
[61:31] (3691.04s)
drool. I don't know about you, do you
[61:32] (3692.24s)
drool when you brush your teeth
[61:33] (3693.60s)
uh sometimes yes
[61:34] (3694.72s)
yeah I drool a lot like I just like I
[61:36] (3696.48s)
got a lot of saliva okay yeah
[61:38] (3698.16s)
and for me like one of the really
[61:39] (3699.84s)
annoying things about brushing my teeth
[61:41] (3701.04s)
is, like,
[61:42] (3702.08s)
the drool's going all down my arm,
[61:43] (3703.92s)
and I have to rinse my arm after I brush,
[61:45] (3705.60s)
and it's, like, everywhere,
[61:47] (3707.04s)
and, like, that's a metric by
[61:48] (3708.80s)
which I judge whether my toothbrush is
[61:51] (3711.44s)
great. Of course, I haven't found a good
[61:52] (3712.56s)
one. Maybe I should design a toothbrush.
[61:54] (3714.00s)
But basically, um you really need to
[61:56] (3716.96s)
understand the pains that customers go
[61:58] (3718.24s)
through and what they care about and
[62:00] (3720.00s)
what's really not that important because
[62:01] (3721.36s)
a lot of things that you might think are
[62:02] (3722.72s)
important internally customer doesn't
[62:04] (3724.96s)
care about. And so I think most
[62:06] (3726.64s)
product managers are touchy-feely.
[62:07] (3727.92s)
They're like, "Well, I think I know what
[62:10] (3730.16s)
the customer wants, and I think I saw a
[62:12] (3732.40s)
really cool feature in this product
[62:13] (3733.36s)
here, so I'm going to go and do what
[62:14] (3734.56s)
they did, but do it a little bit better
[62:16] (3736.40s)
without really understanding what the
[62:18] (3738.24s)
customer is struggling with. Um, how
[62:20] (3740.80s)
often they struggle with that thing. Is
[62:22] (3742.32s)
it important to fix or maybe it's not
[62:24] (3744.16s)
like, you know, maybe it seems cool to
[62:25] (3745.76s)
you, and maybe they do the feature because they
[62:27] (3747.12s)
have some extra team members that can do
[62:28] (3748.40s)
it, but that's not necessarily the right
[62:29] (3749.84s)
thing." And so yes, it would be a very
[62:31] (3751.68s)
competent product manager who really
[62:33] (3753.28s)
understands the customer, maybe worked
[62:35] (3755.12s)
in that environment, knows the problems,
[62:37] (3757.36s)
has suffered through the problems, and
[62:38] (3758.72s)
can then basically tell LLM how to solve
[62:41] (3761.76s)
that problem.
[62:42] (3762.48s)
I see. So, competence, if I'm
[62:44] (3764.40s)
understanding correctly, is user
[62:47] (3767.04s)
empathy. It's knowing what actually
[62:49] (3769.28s)
matters uh for them and um building for
[62:53] (3773.28s)
that and communicating for that and
[62:54] (3774.80s)
measuring that, all those things. But a
[62:56] (3776.48s)
lot of people think that's a touchy-feely
[62:58] (3778.16s)
skill like it's something you just sort
[62:59] (3779.76s)
of develop and you sort of have
[63:01] (3781.12s)
intuition about the customer. I think it
[63:02] (3782.72s)
is a repeatable process done through
[63:05] (3785.44s)
interviews done through observing the
[63:07] (3787.04s)
customer. It's not something that you
[63:08] (3788.56s)
just sort of get better at by feeling
[63:10] (3790.96s)
it. It can be repeated is what I'm
[63:12] (3792.48s)
saying. And very few product managers
[63:14] (3794.40s)
will do that.
[63:15] (3795.20s)
I see.
[63:15] (3795.60s)
Most of them just say, "Oh, I know this
[63:17] (3797.12s)
product. I've been working on it for
[63:18] (3798.00s)
five years. I know the customer. I
[63:19] (3799.28s)
talked to Joe last week at you know
[63:20] (3800.88s)
customer name and you know this is what
[63:22] (3802.96s)
they really need." Do you really know
[63:24] (3804.40s)
that? Like you think you know that, but
[63:25] (3805.76s)
what are they trying to get
[63:26] (3806.48s)
accomplished?
[63:27] (3807.20s)
Right. Right.
[63:27] (3807.68s)
Product managers typically think in
[63:28] (3808.96s)
terms of features, not in terms of
[63:31] (3811.04s)
customer pain points and what they're
[63:33] (3813.20s)
trying to get done.
[63:34] (3814.08s)
I see.
[63:34] (3814.64s)
In my experience. I'm sure there
[63:36] (3816.64s)
are exceptions.
[63:37] (3817.44s)
Yeah. Yeah. Yeah. I'm sure there are.
[63:38] (3818.64s)
I just never met any.
[63:41] (3821.84s)
Okay. Um, you know, coming to the end of
[63:44] (3824.00s)
the interview, I always love reflecting
[63:46] (3826.40s)
back on the careers and you've had a
[63:49] (3829.12s)
full career at this point, more or less retired
[63:51] (3831.84s)
at this point. Um, so I'm curious like
[63:54] (3834.16s)
looking back on your career, is there
[63:56] (3836.00s)
anything that you regret or
[63:58] (3838.08s)
something that you wish you would have
[63:59] (3839.28s)
changed, that maybe others can learn from?
[64:01] (3841.60s)
Yeah, I mean a bunch of things I would
[64:02] (3842.96s)
say like first of all, I should have
[64:04] (3844.24s)
left my job at Symantec earlier. Um, I
[64:07] (3847.20s)
feel like you should stay in a job as
[64:09] (3849.68s)
long as you are learning new things and
[64:12] (3852.40s)
building new skills and as long as you
[64:14] (3854.64s)
feel empowered to grow and try things
[64:17] (3857.28s)
that might be uncomfortable for you and
[64:18] (3858.72s)
not have to worry about your, you know,
[64:21] (3861.20s)
what if I fail occasionally, like have
[64:22] (3862.80s)
the space to try things and get them
[64:24] (3864.88s)
wrong, but to be to be able to create
[64:26] (3866.56s)
great things. And I had that during a
[64:28] (3868.48s)
lot of my time at Symantec. But there
[64:30] (3870.24s)
was a point where I just wasn't growing
[64:32] (3872.48s)
anymore and I was stagnant and I didn't
[64:34] (3874.48s)
want to leave, not because, you know, it
[64:36] (3876.72s)
was interesting, but because I just
[64:38] (3878.08s)
didn't have the confidence to go
[64:39] (3879.28s)
somewhere else. And so, I would say
[64:40] (3880.80s)
there's a lot of value in staying in a
[64:42] (3882.32s)
job as long as you're growing. And that
[64:43] (3883.68s)
means maybe switching teams and staying
[64:45] (3885.68s)
in the same company. There's a lot of
[64:46] (3886.88s)
value of having that institutional
[64:48] (3888.40s)
knowledge of a platform that you're
[64:50] (3890.24s)
working on, or a set of systems, or your
[64:52] (3892.48s)
reputation. Reputation goes a long way.
[64:54] (3894.48s)
If you're really good in one
[64:56] (3896.24s)
area, you switch to another team, you
[64:58] (3898.08s)
can use that reputation to help you in
[64:59] (3899.68s)
the other area often. Um, and so I think
[65:02] (3902.48s)
staying in a job for five years, even 10
[65:04] (3904.72s)
years, as long as you're
[65:06] (3906.96s)
learning, can be really great. I don't
[65:09] (3909.20s)
advise people to switch every couple
[65:10] (3910.64s)
years necessarily. On the other hand,
[65:13] (3913.36s)
you know, staying too long, you
[65:15] (3915.68s)
know, you can get stale. Um, it gets
[65:18] (3918.08s)
easy to just say, you know, I'm
[65:19] (3919.28s)
comfortable. I'm not really having to be
[65:21] (3921.04s)
challenged. I can do my job. I can wake
[65:22] (3922.80s)
up and have some nice coffee in the
[65:24] (3924.08s)
morning and I do whatever I do. And it
[65:25] (3925.84s)
doesn't really, you know, when you get
[65:26] (3926.64s)
more senior, often you can just do that.
[65:29] (3929.44s)
And so when you get to that point, it's
[65:31] (3931.44s)
time to leave and and challenge yourself
[65:33] (3933.68s)
some more. And the problem with that is
[65:35] (3935.92s)
when you do leave, you do start over.
[65:37] (3937.76s)
Like I found that I had a pretty great
[65:40] (3940.24s)
career at Semantic. If you asked anybody
[65:41] (3941.60s)
at Symantec from the 21 years I was
[65:44] (3944.08s)
there who I was, people would know me.
[65:45] (3945.84s)
They'd say hi to me in the hallway. I
[65:47] (3947.76s)
even had people come up to me on the
[65:49] (3949.12s)
street because I would be on TV talking
[65:50] (3950.64s)
about Stuxnet and stuff like, you know,
[65:52] (3952.16s)
I was on Fox and, you know, MSNBC and
[65:55] (3955.92s)
Wall Street Journal, New York Times and
[65:58] (3958.16s)
all that stuff. and it was really
[65:59] (3959.20s)
exciting. But then you go to Google and
[66:01] (3961.52s)
they're like, "What have you done for
[66:02] (3962.64s)
us? We don't care that you did those
[66:04] (3964.56s)
things. It only matters what you've done
[66:06] (3966.24s)
for us." And so that requires a lot of
[66:08] (3968.56s)
rebuilding up trust and building up, you
[66:11] (3971.20s)
know, sort of a reputation and doing a
[66:13] (3973.28s)
lot of good work. And, you know, that's
[66:15] (3975.28s)
stressful and it takes a lot more work
[66:16] (3976.96s)
than staying where you are. So, if you
[66:19] (3979.84s)
do that, got to choose wisely. I've heard,
[66:22] (3982.16s)
and I'll probably misattribute who said it,
[66:23] (3983.84s)
but there's some quote about you should
[66:25] (3985.36s)
either be learning or you should be
[66:27] (3987.60s)
earning in your career. Um, what are
[66:30] (3990.00s)
your thoughts on golden handcuffs,
[66:32] (3992.88s)
somewhere you're no longer learning but
[66:34] (3994.56s)
you're earning a ton? Do you would you
[66:36] (3996.96s)
still advise that that person leaves?
[66:39] (3999.44s)
Um, I would
[66:41] (4001.68s)
never say leave if there's a good
[66:44] (4004.64s)
package, and you know, we have to optimize
[66:46] (4006.64s)
for multiple things in our life, right?
[66:47] (4007.92s)
Learning is definitely important, but
[66:49] (4009.44s)
also being financially stable and
[66:51] (4011.12s)
different people need different amounts
[66:52] (4012.40s)
of money to feel comfortable about being
[66:54] (4014.48s)
safe in their life and having enough to
[66:55] (4015.84s)
take care of their family or themselves.
[66:58] (4018.08s)
I think there's a place for both and it
[67:01] (4021.12s)
can be okay to be comfortable and making
[67:04] (4024.00s)
a lot of money for a while, but I think
[67:05] (4025.68s)
that if you're solving for enjoyment and
[67:08] (4028.96s)
fun and hard problem solving, which a
[67:10] (4030.88s)
lot of people are, that can be toxic
[67:13] (4033.12s)
over too much time.
[67:14] (4034.96s)
Definitely. It sounds like early in your
[67:17] (4037.04s)
career you were lucky to find what you
[67:20] (4040.16s)
enjoyed. A lot of those early problems
[67:21] (4041.92s)
were super interesting.
[67:23] (4043.68s)
Do you have any advice for people who um
[67:26] (4046.88s)
want to find what they enjoy in software
[67:28] (4048.88s)
engineering?
[67:29] (4049.76s)
Yeah, for people who are in college now,
[67:32] (4052.56s)
graduating soon. Um I would say like try
[67:36] (4056.56s)
lots of internships if you can and that
[67:38] (4058.48s)
might be difficult now given the way
[67:39] (4059.84s)
things are going with jobs right now.
[67:41] (4061.12s)
Everything's cyclical though, but right
[67:42] (4062.40s)
now jobs are tough. Um, I found cyber
[67:46] (4066.08s)
security entirely randomly. You know, it
[67:48] (4068.56s)
wasn't something I was like, I want to
[67:49] (4069.76s)
be in cyber security. It's like I had an
[67:51] (4071.36s)
internship. My first year was doing file
[67:53] (4073.04s)
management. My second year was doing
[67:54] (4074.64s)
blah blah blah. Third year, oh, viruses.
[67:56] (4076.96s)
The more you can explore, the more
[67:59] (4079.44s)
you're going to potentially discover a
[68:01] (4081.20s)
passion. And like I think there are many
[68:03] (4083.52s)
jobs where you would think it's going to
[68:04] (4084.80s)
be totally boring, but if you go and
[68:06] (4086.72s)
start working on the problem, you're
[68:08] (4088.32s)
going to realize there's really
[68:09] (4089.60s)
interesting problems to solve. And you
[68:10] (4090.96s)
might find a passion for a field that
[68:12] (4092.48s)
you would never expect that you would
[68:14] (4094.24s)
enjoy. And then you're like, "Wow, this
[68:16] (4096.40s)
is really interesting. There's really
[68:17] (4097.44s)
fun problems. I like the people I'm
[68:18] (4098.80s)
working with. Customers are
[68:20] (4100.56s)
interesting." Or maybe they're a pain in
[68:21] (4101.60s)
the ass, but that's interesting to deal
[68:22] (4102.88s)
with. Like I would say just try things.
[68:26] (4106.32s)
Don't try to wait and find the perfect
[68:28] (4108.00s)
job. Find a job where you have a
[68:30] (4110.00s)
good manager, um, there's some hard
[68:32] (4112.08s)
problems to solve, and you don't
[68:36] (4116.16s)
know anything about it and you'll learn.
[68:38] (4118.08s)
Like that, you might find your dream
[68:40] (4120.16s)
job, and that could last 25 years.
[68:42] (4122.64s)
So before you know what you enjoy,
[68:44] (4124.88s)
you're saying, you know, search with
[68:46] (4126.80s)
breadth, and then once you find it, then
[68:49] (4129.28s)
you can kind of go deeper.
[68:50] (4130.72s)
That's right.
[68:51] (4131.68s)
I mean, look, you might get lucky. If
[68:53] (4133.52s)
your first year internship is really
[68:55] (4135.12s)
interesting and and you're doing well, I
[68:57] (4137.28s)
would stick with that. You could try
[68:58] (4138.96s)
breadth there, but I would say like a
[69:01] (4141.36s)
greedy breadth. The last question I have
[69:03] (4143.44s)
for you is, if you were going to go back to
[69:06] (4146.48s)
yourself right when you were graduating
[69:08] (4148.80s)
from UCLA and give yourself some advice
[69:11] (4151.44s)
knowing what you know now, what would
[69:13] (4153.28s)
you say?
[69:13] (4153.92s)
It wouldn't be one thing. I mean, the
[69:15] (4155.52s)
biggest thing would be don't let fear of
[69:17] (4157.68s)
failure hold you back. You can probably
[69:20] (4160.32s)
do more than you think. I
[69:21] (4161.92s)
might have had a richer career had I
[69:23] (4163.36s)
listened to that. I would say focus on
[69:25] (4165.68s)
the outcome. Whenever you're working
[69:27] (4167.20s)
on a project, think about who's going to
[69:28] (4168.88s)
use it, what they care about, and
[69:31] (4171.28s)
how they measure success, and then
[69:33] (4173.20s)
just optimize for that. Don't
[69:34] (4174.96s)
try to make the perfect thing;
[69:36] (4176.64s)
don't try to make a perfectly
[69:38] (4178.96s)
round thing if all you need is a little
[69:40] (4180.64s)
square piece. Often problems
[69:43] (4183.20s)
can be solved without having to be
[69:45] (4185.36s)
perfect and still be really good. So
[69:48] (4188.16s)
focus on people's outcomes. A big
[69:50] (4190.64s)
one is presenting, for
[69:51] (4191.76s)
example; this is something I had
[69:53] (4193.84s)
to learn. I used to go into
[69:55] (4195.44s)
presentations and talk about the
[69:57] (4197.36s)
technology to senior leaders, and in
[70:00] (4200.00s)
retrospect, those senior leaders don't
[70:01] (4201.52s)
care how the algorithm works. They don't
[70:04] (4204.00s)
care. They just want to know: is it going
[70:06] (4206.40s)
to be faster? How much faster? Is it
[70:08] (4208.08s)
going to generate more revenue? How much
[70:09] (4209.44s)
revenue? Is it going to fix a
[70:11] (4211.92s)
problem that currently takes 20 people
[70:13] (4213.20s)
and make it take 10?
[70:15] (4215.36s)
And so you really need to get in
[70:16] (4216.64s)
people's heads and think about what
[70:17] (4217.84s)
they're solving for. Again, it's this whole
[70:19] (4219.20s)
idea of outcomes: speak to their
[70:21] (4221.44s)
needs, not your own. So that's a big
[70:24] (4224.00s)
one. I said find a good manager; I'll
[70:25] (4225.84s)
just say that again. A manager can make
[70:27] (4227.68s)
or break your career, your life, and
[70:30] (4230.00s)
your happiness. A
[70:33] (4233.92s)
good manager who trusts you and gives
[70:35] (4235.20s)
you some rope is super valuable.
[70:38] (4238.16s)
Learn how to collaborate with people.
[70:39] (4239.60s)
Don't assume that you're right
[70:41] (4241.68s)
and just tell people they're wrong. You
[70:43] (4243.20s)
have to really learn to work with people
[70:44] (4244.48s)
and understand their point of view. I
[70:46] (4246.80s)
still struggle with that, but
[70:48] (4248.88s)
it's something that's a work in
[70:50] (4250.40s)
progress.
[70:50] (4250.96s)
Those would be the big things.
[70:52] (4252.32s)
Yeah. Well, thanks so much,
[70:54] (4254.00s)
Carrie, for your time. I really
[70:55] (4255.68s)
appreciate it. I was really looking
[70:56] (4256.88s)
forward to this, and I think there's a lot
[70:58] (4258.16s)
of stuff in here that people are going to
[71:00] (4260.40s)
benefit from. So, thanks again for
[71:02] (4262.08s)
your time.
[71:02] (4262.56s)
That was my pleasure.
[71:04] (4264.00s)
Hey, thanks for watching the show. I
[71:05] (4265.60s)
don't sell anything or do sponsorships,
[71:07] (4267.44s)
but if you want to support the show, you can
[71:10] (4270.08s)
subscribe on YouTube or leave a
[71:12] (4272.08s)
review on Spotify. And I'm always
[71:14] (4274.08s)
looking for new guests to interview. So,
[71:15] (4275.68s)
if there's anyone whose career
[71:18] (4278.00s)
story you'd really like to hear,
[71:20] (4280.00s)
let me know and I'll try to reach
[71:22] (4282.00s)
out to them and get them on the show.
[71:23] (4283.60s)
Thanks for listening as always and I'll
[71:25] (4285.44s)
see you next time.