[00:00] (0.04s)
what does actual development look like
[00:01] (1.40s)
when you joined? Phabricator is our internal
[00:03] (3.40s)
code review tooling. Phabricator is well
[00:05] (5.20s)
integrated with Sandcastle, which is our
[00:07] (7.00s)
internal CI tooling which is integrated
[00:09] (9.00s)
with On Demand, which are internal dev
[00:10] (10.72s)
boxes, which is integrated with
[00:13] (13.24s)
Landcastle, which is how code makes its way
[00:14] (14.92s)
out to users and sort of the thing goes
[00:17] (17.20s)
from there and then it doesn't just stop
[00:19] (19.08s)
at code review it's integration across
[00:20] (20.84s)
the rest of the developer platform for
[00:22] (22.48s)
example Meta built its own task system
[00:24] (24.48s)
and so tasks and pull requests could be
[00:26] (26.92s)
very deeply integrated in a way that we
[00:28] (28.96s)
just don't see out here one tool that I
[00:31] (31.72s)
kind of experienced because of
[00:33] (33.32s)
Phabricator was called Herald. Herald, if I
[00:35] (35.60s)
recall correctly was the rules engine
[00:37] (37.24s)
which later got replaced by Butterfly
[00:39] (39.00s)
Bot, and the idea of both of them was the
[00:41] (41.00s)
same: it was to be able to match events that
[00:43] (43.40s)
might happen in the normal course of
[00:45] (45.24s)
code review I used to work on internal
[00:46] (46.92s)
developer tools there when we were
[00:48] (48.28s)
deprecating APIs you might go ahead and
[00:50] (50.20s)
do something like create a Herald rule
[00:51] (51.92s)
that would say something like if a user
[00:54] (54.00s)
calls this function or uses this API
[00:56] (56.00s)
post a comment letting them know hey
[00:57] (57.76s)
this is deprecated also the new syntax
[01:00] (60.56s)
is this and then as you got later and
[01:02] (62.28s)
later into the roll out you could start
[01:03] (63.44s)
to do things instead like hey notify me
[01:05] (65.96s)
if anyone's adding call sites of this
[01:07] (67.56s)
API so that I can go ahead and go leave
[01:10] (70.04s)
a comment saying hey like no we're
[01:12] (72.12s)
really trying to deprecate this. Meta is
[01:13] (73.64s)
famous for its custom internal tooling.
[01:15] (75.48s)
Devs at the company use the likes of
[01:16] (76.84s)
Phabricator, Sandcastle, or Butterfly Bot.
[01:19] (79.60s)
but what are these tools and why did
[01:21] (81.36s)
Meta not use a more standard stack to
[01:23] (83.08s)
build software we get answers from
[01:24] (84.92s)
Tomas Reimers, who was previously at Meta
[01:26] (86.80s)
working on developer tooling before
[01:28] (88.56s)
co-founding Graphite, a developer
[01:30] (90.20s)
productivity platform. Today we talk
[01:32] (92.32s)
about what stacked diffs are and how they
[01:34] (94.68s)
became the de facto way devs at Meta work
[01:36] (96.68s)
with pull requests, why Meta moved to a
[01:39] (99.12s)
monorepo, and why there seems to be an
[01:40] (100.76s)
industry-wide trend in companies moving
[01:42] (102.44s)
from polyrepos to monorepos, why AI
[01:45] (105.04s)
coding tools will make code reviews a
[01:46] (106.52s)
lot more important and more if you're
[01:48] (108.72s)
interested in how effective engineering
[01:50] (110.12s)
teams work these days and approaches
[01:52] (112.00s)
trending in this group this episode is
[01:53] (113.88s)
for you if you enjoy the show please
[01:55] (115.68s)
subscribe to the podcast and consider
[01:57] (117.20s)
leaving a rating, as this is a great help to the
[01:59] (119.04s)
show. And with this, let's jump in. Tomas,
[02:01] (121.88s)
it's great to have you on the podcast
[02:03] (123.80s)
great to be here so you before you
[02:07] (127.12s)
started Graphite you worked at Meta, and
[02:10] (130.12s)
Meta is known for a lot of their
[02:12] (132.16s)
in-house tooling can we talk about like
[02:15] (135.04s)
when you joined what was interesting
[02:17] (137.12s)
tools that you didn't see at other
[02:18] (138.72s)
companies before so funny it's a great
[02:21] (141.24s)
question too and yes certainly it's when
[02:24] (144.24s)
I joined I was a new grad so I just
[02:26] (146.96s)
graduated college, I joined Meta, I used a
[02:29] (149.00s)
lot of the tooling. My frame of
[02:31] (151.44s)
reference was personal projects I used
[02:33] (153.32s)
to do a lot of hackathons back in
[02:34] (154.68s)
college and internships and so I think
[02:37] (157.32s)
when I joined Meta, a lot
[02:39] (159.28s)
of it came in sort of uh eyes wide sort
[02:43] (163.24s)
of like uh I was about to say eyes wide
[02:45] (165.24s)
open, but I mean the opposite of that. I
[02:46] (166.72s)
I came in sort of just like blissfully
[02:48] (168.64s)
unaware that this was an industry
[02:50] (170.04s)
standard uh when I left was when I had
[02:52] (172.20s)
the bigger shock because I think I had
[02:53] (173.76s)
used tools like GitHub before I'd used
[02:56] (176.32s)
analytics tools before. I had left Meta
[02:59] (179.68s)
after having really gotten sort of
[03:01] (181.92s)
uh immersed in all of this wonderful
[03:03] (183.56s)
tooling and then when I had to start to
[03:07] (187.36s)
do all of these same activities myself I
[03:09] (189.24s)
found myself reaching for tools that
[03:10] (190.92s)
didn't exist and there weren't clear
[03:12] (192.28s)
analogs for. Yeah, it's usually the other
[03:14] (194.36s)
way around right like usually people who
[03:16] (196.52s)
work in Industry at a startup or even at
[03:19] (199.16s)
a larger company go to Meta, and unless
[03:21] (201.48s)
you've been in like Google where you
[03:23] (203.28s)
know it's all custom, it's just a shock, as
[03:25] (205.88s)
I think most people imagine like okay
[03:27] (207.88s)
you know, they must use GitHub or
[03:29] (209.96s)
something similar. But what
[03:32] (212.36s)
does actual development look like you
[03:33] (213.68s)
know when you joined right like you you
[03:35] (215.56s)
went into boot camp I assume what what
[03:37] (217.76s)
did it look like to like you know like
[03:39] (219.44s)
they didn't even push a pull request, but
[03:41] (221.24s)
that's not what they call it do they no
[03:43] (223.88s)
no, they call it a diff. Um, good
[03:47] (227.60s)
trivia, by the way. Um, I think that
[03:51] (231.28s)
though if I had to use one word to
[03:52] (232.84s)
describe Meta's internal tooling it would
[03:54] (234.68s)
be integrated everything everything
[03:56] (236.92s)
everything integrates with each other
[03:58] (238.44s)
and I I tell this to people and they
[04:00] (240.88s)
never really understand the extent of it
[04:02] (242.60s)
Meta builds its own internal calendar
[04:05] (245.00s)
right so meeting room scheduling can be
[04:07] (247.24s)
integrated in with the rest of the
[04:08] (248.84s)
tooling and that really sort of
[04:10] (250.48s)
permeates the whole thing right
[04:12] (252.24s)
Phabricator is our internal code review
[04:14] (254.08s)
tooling. Phabricator is well integrated
[04:16] (256.20s)
with Sandcastle, which is our internal
[04:17] (257.84s)
CI tooling, which is integrated with On
[04:19] (259.92s)
Demand, which are our internal dev boxes,
[04:21] (261.92s)
which is integrated with Landcastle,
[04:24] (264.24s)
which is how code makes its way out to
[04:25] (265.76s)
users, and sort of the thing goes
[04:28] (268.16s)
from there and then it doesn't just stop
[04:30] (270.04s)
at code review, it's integration across
[04:31] (271.76s)
the rest of sort of like the developer
[04:33] (273.16s)
platform. For example, Meta built its own
[04:35] (275.28s)
task system, and so tasks and pull
[04:37] (277.72s)
requests could be very deeply integrated
[04:39] (279.88s)
in a way that we just don't see out here
[04:42] (282.24s)
what was
[04:43] (283.32s)
Sandcastle? So Sandcastle was the internal
[04:46] (286.32s)
CI system so out here a lot of people
[04:48] (288.88s)
use, uh, GitHub Actions, they use
[04:51] (291.44s)
Buildkite, and Sandcastle was just our flavor
[04:54] (294.56s)
of that it allowed us to build versions
[04:56] (296.48s)
of our apps and uh websites before we
[04:59] (299.00s)
released
[05:00] (300.28s)
um and went ahead and tested them for us
[05:03] (303.52s)
and you mentioned this like integration
[05:06] (306.12s)
like you know when you say it it sounds
[05:07] (307.48s)
like it's just it's just a word but like
[05:08] (308.92s)
in practice what what did it mean you
[05:10] (310.84s)
know you did mention that for example
[05:12] (312.32s)
tasks which is I guess we can think of
[05:14] (314.28s)
it, it's the Jira or, you know, Linear,
[05:16] (316.80s)
or whatever that is that was integrated
[05:18] (318.56s)
with the other stuff and this was like
[05:19] (319.92s)
you know like 10 years ago right or
[05:22] (322.36s)
so um yeah actually um yeah I think
[05:26] (326.52s)
integration here I think one of the best
[05:29] (329.04s)
examples I have of integration was
[05:30] (330.60s)
actually a translation system so at
[05:33] (333.12s)
scale internationalization becomes a
[05:34] (334.72s)
real problem right you need to be able
[05:36] (336.28s)
to include internationalization as
[05:38] (338.28s)
product developers start to update the
[05:40] (340.08s)
website create new pages Etc there's
[05:42] (342.60s)
always this question or this tension
[05:44] (344.20s)
that happens of well how do we know that
[05:47] (347.04s)
the string that we're pushing has a
[05:48] (348.64s)
translated equivalent at Facebook that
[05:50] (350.96s)
could actually just be built into the
[05:52] (352.12s)
code review tooling right and so the
[05:53] (353.84s)
code review tooling can tell you
[05:55] (355.16s)
something like hey this diff is using a
[05:58] (358.00s)
string for which the translations don't
[06:00] (360.08s)
exist and because of that you can't
[06:01] (361.72s)
deploy it yet or what was more
[06:03] (363.68s)
interesting to me was when you looked at
[06:05] (365.08s)
a diff that had been merged that had
[06:06] (366.72s)
been landed um that was what we called
[06:09] (369.44s)
it you would see on that same page like
[06:12] (372.04s)
hey this diff has been rolled out to
[06:13] (373.88s)
employees this diff has been rolled out
[06:15] (375.64s)
to 1% of users this diff has been rolled
[06:17] (377.68s)
out to 10% of users also if this diff
[06:20] (380.32s)
included any feature Flags or sort of
[06:22] (382.28s)
like A/B tests, here are the results of
[06:24] (384.44s)
those A/B tests, and so you had this
[06:26] (386.40s)
wonderful ecosystem where everything was
[06:28] (388.24s)
tightly tied together and so you as a
[06:31] (391.00s)
developer could just go to one place and
[06:32] (392.92s)
be like I know what I'm looking for I'm
[06:34] (394.60s)
looking for this piece of like code that
[06:36] (396.88s)
I landed let's go see like how it's
[06:39] (399.60s)
changed statistics or how uh how it's
[06:42] (402.04s)
impacted users and you could see that
[06:43] (403.84s)
all in one place and then there was a
[06:45] (405.48s)
bunch of other stuff that started to
[06:46] (406.76s)
make its way into the public tool chain
[06:48] (408.12s)
now. If you wanted to revert a diff
[06:50] (410.24s)
you could just go ahead and do that
[06:51] (411.60s)
directly from, uh, from Phabricator. This
[06:54] (414.52s)
episode was brought to you by Swarmia, the
[06:56] (416.24s)
engineering intelligence platform for modern
[06:58] (418.04s)
software organizations. Swarmia gives
[07:00] (420.68s)
everyone in your organization the
[07:01] (421.92s)
visibility and tools they need to get
[07:03] (423.48s)
better at getting better. Engineering leaders
[07:06] (426.20s)
use Swarmia to balance the investment
[07:07] (427.88s)
between different types of work stay on
[07:09] (429.80s)
top of cross team initiatives and
[07:11] (431.64s)
automate the creation of cost
[07:12] (432.96s)
capitalization reports. Engineering managers
[07:15] (435.84s)
and team leads get access to a powerful
[07:17] (437.48s)
combination of research-backed engineering
[07:19] (439.08s)
metrics and developer experience surveys
[07:21] (441.44s)
to identify and eliminate process
[07:23] (443.80s)
bottlenecks. Software engineers speed up
[07:26] (446.04s)
their daily workflows with Swarmia's
[07:27] (447.56s)
two-way Slack notifications, working
[07:29] (449.72s)
agreements, and team focus insights. You
[07:32] (452.32s)
can learn more about how some of the
[07:33] (453.60s)
world's best software organizations
[07:35] (455.20s)
including Miro, Docker, and Webflow use
[07:37] (457.48s)
Swarmia to build better software, faster, at
[07:40] (460.12s)
swarmia.com/
[07:41] (461.76s)
pragmatic, that is s-w-a-r-m-i-
[07:45] (465.84s)
a.com/pragmatic. Phabricator was a really
[07:48] (468.88s)
central part of this I know because at
[07:50] (470.72s)
Uber we used Phabricator. It was an
[07:53] (473.28s)
open source tooling and I think Uber and
[07:55] (475.08s)
you mentioned Dropbox also took it on
[07:58] (478.12s)
maybe a few other companies but but not
[07:59] (479.80s)
not as many so for a while we kind of
[08:02] (482.00s)
used it. I saw some of this
[08:03] (483.96s)
integration but it it sounds a little
[08:07] (487.04s)
wild because I I don't really know any
[08:08] (488.96s)
other company maybe outside of Google
[08:10] (490.48s)
where you can have this thing that you
[08:12] (492.36s)
know, as you say, you deploy
[08:15] (495.20s)
code and you can see the experiments,
[08:17] (497.96s)
the localization, the rollout, all
[08:21] (501.16s)
from one UI usually it's different
[08:22] (502.44s)
systems. Even at Uber it was different
[08:23] (503.72s)
systems, we had a, you know, like, an
[08:25] (505.32s)
experimentation system it was called
[08:26] (506.68s)
Morpheus at the time, or Flipper, the
[08:28] (508.72s)
feature Flags it was you know and we had
[08:31] (511.36s)
I I had this list of like systems that I
[08:34] (514.16s)
you know like a cheat sheet of where I
[08:35] (515.72s)
would go to check the roll out the
[08:38] (518.24s)
internationalization etc etc so all of
[08:40] (520.44s)
this was really kind of like you one
[08:43] (523.44s)
away I think that's the way it exists
[08:45] (525.52s)
for a lot of people today you know I
[08:47] (527.20s)
think when we look at right now,
[08:49] (529.16s)
best-in-class Dev tooling is starting to
[08:51] (531.32s)
become more integrated I think around
[08:52] (532.96s)
like Linear: it's now starting to post
[08:54] (534.84s)
the Linear issue in the GitHub, in the
[08:56] (536.92s)
GitHub pull request, and you can see, and
[08:58] (538.96s)
it updates the status as the pull request moves
[09:00] (540.96s)
through review um you're also seeing the
[09:02] (542.80s)
same with Vercel, right: when you create a
[09:04] (544.60s)
PR, Vercel can go ahead and be like, hey,
[09:07] (547.52s)
went ahead and like made a preview
[09:09] (549.24s)
environment for you you can go ahead and
[09:10] (550.60s)
click that there I think what was so
[09:12] (552.56s)
magical about Facebook is that all of
[09:14] (554.32s)
that was native and it didn't appear to
[09:16] (556.40s)
be Integrations but just part of the
[09:18] (558.16s)
system and who built this tool right
[09:21] (561.68s)
like uh like the the developer tooling
[09:25] (565.12s)
you know, Phabricator, uh, the Sandcastle,
[09:28] (568.96s)
all of these things was it just like
[09:30] (570.32s)
devs like okay I'm I'm going to build
[09:32] (572.04s)
this obviously it was internal teams
[09:33] (573.48s)
right but was it like dedicated teams or
[09:35] (575.84s)
or people just kind of went and like
[09:37] (577.44s)
went and tweaked stuff? I think it really
[09:39] (579.36s)
depends on so it really depends on the
[09:41] (581.20s)
tool some internal Dev tools were just
[09:43] (583.00s)
built by developers themselves they saw
[09:45] (585.36s)
a problem, they're like, nothing's
[09:46] (586.56s)
addressing this I need to fix it some
[09:48] (588.76s)
came more from the comp... isn't it
[09:51] (591.72s)
a wonderful culture to be a part
[09:53] (593.32s)
of where you can solve your own problems
[09:55] (595.28s)
um but some of it came more from the
[09:57] (597.20s)
like, oh, we as a company have noticed that
[09:59] (599.84s)
there's an issue here we need to address
[10:02] (602.20s)
the issue let's spin up a team to do it
[10:04] (604.36s)
and then the place where all of these
[10:05] (605.88s)
teams, er, all of these tools,
[10:07] (607.52s)
ultimately found their home was under
[10:09] (609.20s)
the dev infrastructure team so developer
[10:11] (611.28s)
infrastructure at other companies called
[10:12] (612.76s)
developer platform and developer
[10:14] (614.08s)
velocity, really maintained the bulk of, uh,
[10:17] (617.40s)
Facebook's developer tooling yeah and I
[10:20] (620.44s)
guess like one tool that I kind of
[10:24] (624.44s)
experienced because of Phabricator was
[10:26] (626.28s)
called Herald, I'm not sure if you know
[10:29] (629.08s)
so so it's it was of course you could
[10:31] (631.24s)
set up special rules on the code base,
[10:34] (634.36s)
anything like, if, basically, on a
[10:36] (636.56s)
commit anything happens like weird rules
[10:39] (639.28s)
could run. Were there, like,
[10:41] (641.24s)
can you like do you remember some some
[10:43] (643.12s)
like there were some really creative
[10:44] (644.28s)
examples of like using Herald rules for
[10:46] (646.76s)
developers to start to create their own
[10:49] (649.04s)
rules or code ownership or that kind of
[10:50] (650.96s)
stuff? Totally. Um, so Herald was, Herald, if
[10:55] (655.32s)
I recall correctly was the rules engine
[10:57] (657.12s)
which later got replaced by Butterfly
[10:58] (658.84s)
Bot, and the idea of both of them was the
[11:00] (660.92s)
same: it was to be able to match events that
[11:03] (663.32s)
might happen in the normal course of code
[11:05] (665.60s)
review something like someone puts up a
[11:08] (668.08s)
PR that includes some line of code or
[11:10] (670.32s)
calls some function or CI fails
[11:13] (673.12s)
regarding something and then what it
[11:14] (674.68s)
would do is it could take a variety of
[11:16] (676.00s)
actions so some of the examples that I
[11:18] (678.16s)
remember was when I I used to work on
[11:20] (680.52s)
internal uh developer tools there uh
[11:22] (682.80s)
when we were deprecating APIs, you might
[11:24] (684.68s)
go ahead and do something like create a
[11:26] (686.08s)
Herald rule that would say something
[11:27] (687.88s)
like, if a user calls this function or
[11:30] (690.00s)
uses this API like post a comment
[11:32] (692.60s)
letting them know hey this is deprecated
[11:34] (694.88s)
also the new syntax is this um and then
[11:38] (698.08s)
as you got later and later into the roll
[11:39] (699.40s)
out you could start to do things instead
[11:40] (700.96s)
like hey notify me if anyone's adding
[11:43] (703.24s)
call sites of this API so that I can go
[11:45] (705.56s)
ahead and go leave a comment saying hey
[11:48] (708.08s)
like no we're really trying to deprecate
[11:49] (709.56s)
this like please stop um and then you
[11:52] (712.04s)
could take it from there other examples
[11:53] (713.72s)
were simpler um I remember there were
[11:55] (715.40s)
some cases where there were uh let's
[11:57] (717.52s)
call it privileged or important parts of
[11:58] (718.88s)
the code base and what you'd say is like
[12:00] (720.84s)
if someone goes ahead and creates code
[12:02] (722.48s)
that touches that part of the codebase
[12:03] (723.88s)
just make sure this team is added as a
[12:05] (725.32s)
reviewer or at the very least as a
[12:06] (726.80s)
subscriber. Subscribers were an idea in
[12:08] (728.84s)
Phabricator of being able to CC people;
[12:11] (731.72s)
it wasn't that you expected a review
[12:13] (733.60s)
more just that you wanted to keep their
[12:14] (734.84s)
eyes on it um and there were a handful
[12:17] (737.24s)
of rules like that.
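To make the kind of rule being described concrete, here is a hypothetical sketch in Python of an event-matching rules engine. This is not Herald's or Butterfly Bot's actual configuration format, and the deprecated function names and the privileged path below are invented for the example.

```python
# Hypothetical sketch of a Herald-style rule: match an event during code review,
# then take an action. Not Phabricator's real API; get_user(), fetch_user(), and
# the payments/ path are made-up examples.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Diff:
    author: str
    added_lines: list[str]      # lines added by the diff
    touched_paths: list[str]    # files the diff modifies

def warn_deprecated(diff: Diff) -> None:
    # A real rule would post this through the review tool's API.
    print(f"Comment on {diff.author}'s diff: get_user() is deprecated, use fetch_user() instead.")

def add_owning_team(diff: Diff) -> None:
    print(f"Adding payments-team as reviewer/subscriber on {diff.author}'s diff.")

@dataclass
class Rule:
    name: str
    matches: Callable[[Diff], bool]
    action: Callable[[Diff], None]

RULES = [
    # "If a user calls this function, post a comment letting them know it's deprecated."
    Rule("deprecated-api-warning",
         matches=lambda d: any("get_user(" in line for line in d.added_lines),
         action=warn_deprecated),
    # "If someone touches this privileged part of the codebase, add this team."
    Rule("privileged-path-review",
         matches=lambda d: any(p.startswith("payments/") for p in d.touched_paths),
         action=add_owning_team),
]

def on_diff_created(diff: Diff) -> None:
    for rule in RULES:
        if rule.matches(diff):
            rule.action(diff)

# Example event flowing through the rules:
on_diff_created(Diff(author="alice",
                     added_lines=["user = get_user(user_id)"],
                     touched_paths=["payments/api/charge.py"]))
```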
[12:19] (739.28s)
And do I remember correctly that code ownership was super
[12:21] (741.52s)
super important for Phabricator and for
[12:24] (744.24s)
Facebook and it was just really trivial
[12:25] (745.64s)
to say I own parts of the code and then
[12:28] (748.12s)
set up different rules around that no so
[12:31] (751.00s)
that's a great question so code
[12:32] (752.08s)
ownership actually became uh code
[12:34] (754.32s)
ownership went through a few stages at
[12:35] (755.76s)
Facebook there was a while where code
[12:37] (757.76s)
owners were required at Facebook and so
[12:40] (760.28s)
you could have like different parts of
[12:42] (762.00s)
the code base could have different uh
[12:43] (763.52s)
owners then the company took it away and
[12:46] (766.20s)
they said actually we don't believe that
[12:47] (767.76s)
we should have code owners we're a
[12:49] (769.12s)
collaborative environment we should go
[12:50] (770.84s)
ahead and get rid of that it was then
[12:53] (773.00s)
reintroduced and I think it was then
[12:55] (775.32s)
again taken away wow um with the idea of
[12:58] (778.28s)
like we're not sure the company
[12:59] (779.96s)
alternated between, oh, there are really
[13:01] (781.88s)
privileged parts of our monorepo, and
[13:04] (784.24s)
we can talk more around sort of how
[13:05] (785.44s)
Facebook structured their code that need
[13:07] (787.56s)
certain people to uh review it and well
[13:10] (790.72s)
actually we should trust our developers
[13:12] (792.96s)
developers can developers need to be
[13:15] (795.60s)
able to understand organizational
[13:17] (797.08s)
practice one thing that was always
[13:19] (799.60s)
existing within the platform though was
[13:21] (801.68s)
this idea of if you're in some part of
[13:23] (803.84s)
the codebase tag the correct reviewers
[13:25] (805.76s)
and so I think one of the things that
[13:27] (807.12s)
actually I most missed when I got out
[13:29] (809.28s)
here was the ability to structure
[13:30] (810.68s)
repository in a way where you could say
[13:32] (812.56s)
oh if you touch this part of it we
[13:33] (813.92s)
should tag these people in I think in
[13:36] (816.04s)
GitHub code owners is a very coarse
[13:37] (817.84s)
approximation of that where you have to
[13:39] (819.08s)
Define one file it's at the root of the
[13:41] (821.00s)
repository, it's difficult to use. A
[13:43] (823.08s)
person who I've heard has done this even
[13:44] (824.64s)
better or company uh is Google so in
[13:46] (826.96s)
Google uh code reviewers were assigned
[13:49] (829.12s)
hierarchically and so in every folder
[13:51] (831.44s)
you could basically say like hey if you
[13:53] (833.52s)
touch code in this folder like you need
[13:55] (835.00s)
to go talk to these people either as a
[13:57] (837.08s)
heads up or as a requirement and it
[13:59] (839.56s)
allowed people to get um there were two
[14:02] (842.56s)
uh really nice aspects of that one was
[14:04] (844.44s)
that it was really easy to Define
[14:05] (845.68s)
ownership and then the second was that
[14:07] (847.52s)
it was easy, as you were writing, uh, pull
[14:09] (849.72s)
requests there to be able to go ahead
[14:12] (852.28s)
and say like who does own this code who
[14:14] (854.20s)
can I go ask, and so if you're,
[14:16] (856.16s)
say, using an API that you
[14:17] (857.52s)
weren't certain of you need to figure
[14:18] (858.96s)
out like who can I go talk to about this
[14:20] (860.80s)
API what you could do is you could just
[14:23] (863.12s)
navigate the file tree find the team or
[14:25] (865.92s)
people in charge of it and then just go
[14:27] (867.96s)
DM them and so that became a really
[14:30] (870.00s)
powerful system there.
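To show the hierarchical ownership idea in code, here is a minimal sketch in Python. It assumes a convention of per-directory OWNERS files with one username per line; that convention and the example path are assumptions for illustration, not Google's or Meta's actual tooling.

```python
# Minimal sketch of hierarchical ownership lookup: walk from a file's directory
# up to the repository root, collecting owners from any OWNERS file on the way.
from pathlib import Path

def owners_for(file_path: str, repo_root: str = ".") -> list[str]:
    owners: list[str] = []
    root = Path(repo_root).resolve()
    directory = (root / file_path).parent
    while True:
        owners_file = directory / "OWNERS"
        if owners_file.is_file():
            owners += [line.strip() for line in owners_file.read_text().splitlines()
                       if line.strip() and not line.startswith("#")]
        if directory == root or directory == directory.parent:
            break   # reached the repo root (or filesystem root), stop climbing
        directory = directory.parent
    return owners

# Example: owners_for("payments/api/handler.py") merges whoever is listed in
# payments/api/OWNERS, payments/OWNERS, and the root OWNERS file, so an author
# can see exactly who to tag or DM about that part of the tree.
```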
[14:32] (872.04s)
Now, you've gone back and forth between, you know, at
[14:33] (873.48s)
Facebook there was code owners no code
[14:35] (875.20s)
owners and you're now working at a
[14:36] (876.92s)
startup you actually work with a lot of
[14:38] (878.40s)
startups and and larger companies what
[14:40] (880.72s)
is your kind of unfiltered view on code
[14:43] (883.16s)
ownership is it a good thing is it a bad
[14:45] (885.48s)
thing is it is it necessary only when
[14:47] (887.56s)
you're bigger or or when you're smaller
[14:50] (890.16s)
I'm sure you've got some opinions now I
[14:52] (892.56s)
work at a startup now, Graphite, and we
[14:54] (894.00s)
have the privilege of working with some
[14:55] (895.08s)
of the largest dev tools teams in the
[14:56] (896.32s)
world which is really cool at this point
[14:58] (898.44s)
I think I've heard all sort all sides of
[15:00] (900.16s)
the story I've heard the we are in a
[15:02] (902.96s)
financially regulated market and because
[15:04] (904.84s)
of that we have a legal requirement to
[15:06] (906.44s)
have code owners and so I've heard that
[15:08] (908.60s)
side of the world I've also heard the
[15:10] (910.36s)
side of the world of, oh, we couldn't
[15:12] (912.08s)
possibly do code owners here like it
[15:13] (913.56s)
would just slow us down too much and so
[15:15] (915.20s)
I think it really depends on the nature
[15:17] (917.84s)
of the business I once saw this Matrix
[15:19] (919.80s)
which I really liked which is if you
[15:21] (921.80s)
trust your PE if you trust people a lot
[15:24] (924.24s)
and are willing to tolerate like small
[15:26] (926.72s)
mistakes you should lean on culture not
[15:29] (929.24s)
process um as you start to get into a
[15:31] (931.96s)
place of uh oh actually maybe I trust
[15:35] (935.56s)
the people less or I can't tolerate
[15:37] (937.16s)
mistakes at all you move to Automation
[15:39] (939.92s)
and enforcement and so it really depends
[15:42] (942.24s)
sort of where your business falls on
[15:43] (943.88s)
that I also will say that I think that
[15:45] (945.44s)
as Facebook's grown um the way they
[15:48] (948.00s)
treat different parts of their code base
[15:49] (949.48s)
has changed whereas some parts of the
[15:51] (951.12s)
codebase absolutely cannot go down right
[15:53] (953.20s)
things that deal with privacy cannot go
[15:54] (954.88s)
down that's not a place where you can
[15:56] (956.08s)
move fast um some other things perhaps
[16:00] (960.16s)
uh maybe the way that you display
[16:02] (962.20s)
comments if it's uh not a big deal that
[16:04] (964.80s)
actually is a little bit easier or a new
[16:06] (966.52s)
product even um there you have a little
[16:08] (968.88s)
bit more leeway to move quickly and so
[16:10] (970.48s)
maybe code owners isn't the right match
[16:12] (972.36s)
yeah and you know one thing that i' I've
[16:13] (973.80s)
kind of like learned over time is maybe
[16:16] (976.16s)
inside a company as much as you'd like
[16:18] (978.24s)
to think that there's one engineering
[16:19] (979.36s)
culture maybe that's not the answer
[16:21] (981.20s)
maybe you know like a new a new startup
[16:23] (983.68s)
inside a company like an established
[16:25] (985.44s)
company like Meta, a new startup, maybe it should
[16:26] (986.96s)
operate differently cuz maybe they'll be
[16:28] (988.44s)
shut down in in six months if they don't
[16:30] (990.04s)
find product Market fit and you know
[16:31] (991.56s)
they should just throw away the things
[16:34] (994.16s)
it's it's a little bit uh at the same
[16:35] (995.76s)
time I think companies don't really like
[16:37] (997.20s)
to admit it but you know like the and
[16:40] (1000.52s)
and maybe even inside a small company
[16:42] (1002.20s)
right like if you start uh just a small
[16:45] (1005.56s)
prototype project see if it goes
[16:47] (1007.16s)
anywhere you're probably not going to
[16:48] (1008.76s)
have the same rigid thing as when you're
[16:50] (1010.20s)
deploying to you know like customers who
[16:52] (1012.56s)
actually count on it it needs to stay up
[16:55] (1015.48s)
I think that's a really wise point I
[16:56] (1016.96s)
think that ultimately and uh engineering
[17:00] (1020.48s)
anywhere exists within a system of
[17:01] (1021.84s)
constraints and I think those
[17:02] (1022.88s)
constraints within a company can be very
[17:04] (1024.36s)
different and so recognizing what are
[17:06] (1026.68s)
the constraints on the product I'm
[17:07] (1027.80s)
building right now and architecting your
[17:10] (1030.04s)
systems around it is an engineering
[17:11] (1031.48s)
problem like any other right sometimes
[17:13] (1033.44s)
you have constraints of uh I think the
[17:16] (1036.00s)
the place where this actually comes to
[17:17] (1037.20s)
me uh most clearly is when I deal with
[17:19] (1039.76s)
developer tools teams who release mobile
[17:22] (1042.00s)
apps versus web apps so web apps tend to
[17:24] (1044.08s)
release like on a like hourly basis um
[17:26] (1046.96s)
maybe maybe multiple times a day but not
[17:29] (1049.32s)
slower than that whereas mobile apps
[17:30] (1050.96s)
release on a weekly Cadence because of
[17:32] (1052.84s)
that just the um the shape of problems
[17:35] (1055.16s)
they deal with are so different even
[17:37] (1057.08s)
within a company so mobile app teams
[17:39] (1059.52s)
generally don't have to hold pagers
[17:41] (1061.12s)
because even if something goes down what
[17:42] (1062.84s)
are you going to do submit a fix to
[17:44] (1064.12s)
Apple great that's two days yeah um
[17:46] (1066.52s)
versus a web app team like paging is an
[17:48] (1068.92s)
important part of the culture right as
[17:50] (1070.52s)
you're shipping things really quickly
[17:52] (1072.40s)
things will break things will go down
[17:54] (1074.28s)
and you need to you're on the hook to
[17:55] (1075.60s)
fix them um and so watching the
[17:58] (1078.24s)
difference between those cultures I
[18:00] (1080.08s)
think is really is probably the like two
[18:02] (1082.12s)
extremes you see within a company and
[18:04] (1084.28s)
then again as you start to work in
[18:06] (1086.04s)
different parts if you work on Payment
[18:07] (1087.40s)
Processing versus uh abuse and fraud
[18:10] (1090.04s)
detection versus like the consumer front
[18:11] (1091.92s)
end versus authentication um you
[18:14] (1094.12s)
actually have different constraints
[18:15] (1095.24s)
which are subtly different and I think
[18:17] (1097.12s)
that as you're saying engineering
[18:18] (1098.76s)
organizations can develop different
[18:20] (1100.08s)
cultures around how they need to treat
[18:21] (1101.48s)
them I I I like this thinking of
[18:23] (1103.72s)
constraints and how, you know, it leads to
[18:25] (1105.80s)
different practices now one thing that
[18:27] (1107.64s)
really surprised me at Uber which came
[18:29] (1109.84s)
really from Meta, uh, from
[18:31] (1111.96s)
Phabricator, is we used this thing called
[18:33] (1113.68s)
stacked diffs, and we did a deep dive in
[18:35] (1115.92s)
The Pragmatic Engineer, uh, it will
[18:38] (1118.40s)
be linked in the notes below but it's
[18:40] (1120.56s)
something that it just like really
[18:42] (1122.24s)
clicked for me and also everyone around
[18:43] (1123.80s)
me, everyone I talked with at Meta, uh,
[18:46] (1126.00s)
you've now actually built a like a
[18:48] (1128.08s)
business on top of this but can we talk
[18:50] (1130.32s)
about what are stacked diffs, and you
[18:52] (1132.80s)
know how did you come across it and why
[18:55] (1135.32s)
why is it so you know like useful once
[18:57] (1137.64s)
you get a hang of it
[18:59] (1139.80s)
absolutely uh um so you're right we
[19:02] (1142.96s)
we've spent a lot of time talking to
[19:04] (1144.32s)
people about stacking, about stacked diffs,
[19:06] (1146.56s)
and so stacked PRs, as they're called on
[19:08] (1148.96s)
GitHub um and I think that we've uh
[19:12] (1152.40s)
we've had a lot of practice about how to
[19:13] (1153.84s)
talk about it I think the the way I like
[19:15] (1155.60s)
to introduce stacking is by talking
[19:17] (1157.20s)
around the problem that it solves and
[19:19] (1159.16s)
the problem that it solves is in any
[19:20] (1160.60s)
given engineering organization you have
[19:22] (1162.80s)
a bunch of Engineers who at any given
[19:24] (1164.28s)
moment might tell you they're blocked
[19:25] (1165.72s)
they're blocked on code review and I
[19:27] (1167.88s)
think the when you really sort of like
[19:29] (1169.84s)
start to ask well why are they blocked
[19:31] (1171.44s)
why is code review blocking them what it
[19:33] (1173.12s)
comes down to is in most engineering
[19:34] (1174.88s)
organizations the way development works
[19:37] (1177.16s)
is you have a main line so you have a
[19:39] (1179.52s)
main branch or a master Branch you go
[19:41] (1181.64s)
ahead you Fork off of that you go ahead
[19:43] (1183.84s)
and you create your feature branch and
[19:45] (1185.80s)
then you kind of get stuck because
[19:48] (1188.04s)
before you can continue to develop on
[19:49] (1189.64s)
top of that feature Branch it needs to
[19:51] (1191.48s)
merge merge back because exactly um you
[19:55] (1195.44s)
need to merge it back because you only
[19:57] (1197.40s)
create feature branches off of main.
[19:59] (1199.56s)
the downside of that though is that your
[20:01] (1201.88s)
development your personal development
[20:03] (1203.60s)
process now becomes blocked on review
[20:06] (1206.04s)
because you've created some PR you've
[20:08] (1208.00s)
created the perfect way to do uh I don't
[20:10] (1210.80s)
know, authentication login, and you now
[20:12] (1212.76s)
want to build forgot password and so
[20:14] (1214.88s)
your options here are you can either add
[20:16] (1216.64s)
forgot password back into that Branch
[20:19] (1219.08s)
but suddenly that branch is going to
[20:20] (1220.16s)
become really big and your reviewer is
[20:22] (1222.04s)
not going to be able to review it or if
[20:23] (1223.32s)
they review it they're going to give it
[20:24] (1224.36s)
a like rubber stamp looks good to me
[20:26] (1226.84s)
yeah yeah that kind
[20:29] (1229.76s)
there's no point in a code review when
[20:31] (1231.00s)
you do that exactly and so uh Google has
[20:35] (1235.12s)
a funny academic paper where they refer
[20:36] (1236.80s)
to it as reviewer frustration: as pull
[20:39] (1239.24s)
requests get longer, um, reviewers bow out
[20:41] (1241.96s)
of the process and end up just approving
[20:43] (1243.92s)
things rather than actually reading them
[20:46] (1246.32s)
uh and so the solution uh the solution
[20:49] (1249.48s)
which Meta, uh, stumbled upon is what they
[20:51] (1251.72s)
called stacking Google I've heard has
[20:53] (1253.48s)
stumbled on something similar uh there
[20:55] (1255.20s)
might be some cross-pollination with
[20:56] (1256.56s)
Meta there, but the idea is you create
[20:58] (1258.44s)
your feature Branch you put it up for
[21:00] (1260.08s)
review and rather than waiting for it to
[21:01] (1261.84s)
be reviewed for you to start building
[21:03] (1263.40s)
you just Branch off of that and keep
[21:05] (1265.28s)
going and you create another feature
[21:06] (1266.80s)
branch and then you put that up for
[21:08] (1268.24s)
review and then while you wait for that
[21:10] (1270.08s)
to be reviewed, you create another feature
[21:11] (1271.40s)
branch and you create that to be
[21:12] (1272.44s)
reviewed now people who have been using
[21:15] (1275.44s)
it long enough most of them have tried
[21:16] (1276.92s)
something like this and I think
[21:18] (1278.48s)
inevitably you end up with a few
[21:20] (1280.00s)
questions one is well what if like
[21:22] (1282.56s)
someone rightly calls out that that
[21:24] (1284.80s)
earlier PR needs to be updated um the
[21:27] (1287.72s)
general answer is that's actually fine
[21:29] (1289.68s)
um generally when you need to update a
[21:31] (1291.20s)
PR you're not changing the abstraction
[21:32] (1292.92s)
boundary between PRs. Um, an example of
[21:35] (1295.08s)
how you might break it out is if you're
[21:36] (1296.56s)
building a new feature I have server and
[21:38] (1298.40s)
then I have front end and so even if
[21:40] (1300.60s)
someone's like, hey, can we refactor
[21:42] (1302.20s)
This Server usually the endpoint uh
[21:44] (1304.76s)
abstraction is actually staying in place
[21:46] (1306.56s)
and so the next PR can continue to be
[21:48] (1308.24s)
stacked on top of it the other question
[21:51] (1311.00s)
is uh well uh how do I know what I'm
[21:54] (1314.56s)
supposed to break up uh and I think for
[21:56] (1316.96s)
that generally and this this is actually
[21:59] (1319.36s)
really interesting to me generally
[22:00] (1320.88s)
developers build uh their systems in
[22:03] (1323.36s)
order right you might start building the
[22:05] (1325.08s)
server and then building the front end I
[22:06] (1326.96s)
think stacking is just a way to
[22:08] (1328.20s)
communicate that same order to your uh
[22:10] (1330.80s)
to your reviewers a lot of people do
[22:12] (1332.48s)
this automatically using commits right
[22:14] (1334.44s)
they just say like oh I'm I'm going to
[22:16] (1336.24s)
build commits in the order I would build
[22:17] (1337.64s)
them and you should review them by
[22:18] (1338.76s)
commits. Stacking goes a step further and
[22:20] (1340.80s)
says well if the first three commits in
[22:23] (1343.04s)
our Branch are approved why not merge
[22:25] (1345.56s)
those while you wait for the latter ones
[22:26] (1346.96s)
to be approved and that has a few other
[22:28] (1348.80s)
benefits: you can run CI on them
[22:30] (1350.12s)
separately you can uh reduce merge
[22:32] (1352.28s)
conflicts, and you tend
[22:34] (1354.08s)
to be able to reduce your time to merge
[22:36] (1356.12s)
because if there are sticky things uh
[22:38] (1358.72s)
you get stuck there but can still merge
[22:40] (1360.12s)
a lot of other code the other advantages
[22:41] (1361.92s)
then happen sort of afterwards of in the
[22:44] (1364.52s)
case that there's an error or something
[22:45] (1365.80s)
like that you can go ahead quickly
[22:47] (1367.88s)
identify which PR was offending and if
[22:49] (1369.84s)
you tell me a 10-line PR is offending as
[22:51] (1371.72s)
opposed to like a 2000 line PR it's much
[22:54] (1374.12s)
easier for me to identify the issue.
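A minimal sketch of that workflow with plain git, driven from Python for illustration; the branch names and (empty) commits are hypothetical placeholders, and this is not Meta's or Graphite's actual tooling, just the shape of the idea.

```python
# Sketch of building a stack: each branch forks off the previous one, and each
# becomes its own small PR/diff. Assumes you are inside a git repo with a "main"
# branch; the empty commits stand in for real work.
import subprocess

def git(*args: str) -> None:
    subprocess.run(["git", *args], check=True)

git("checkout", "main")
git("checkout", "-b", "auth-server")       # PR 1: server-side auth endpoint
git("commit", "--allow-empty", "-m", "Add auth endpoint")

git("checkout", "-b", "auth-frontend")     # PR 2: stacked on PR 1
git("commit", "--allow-empty", "-m", "Add login form")

git("checkout", "-b", "forgot-password")   # PR 3: stacked on PR 2
git("commit", "--allow-empty", "-m", "Add forgot-password flow")

# Review proceeds bottom-up: once "auth-server" is approved and lands, the
# branches above it can keep moving through review without waiting.
```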
[22:56] (1376.16s)
so basically you get to work in just
[22:57] (1377.84s)
like smaller
[22:59] (1379.08s)
increments and it's almost like forking
[23:01] (1381.92s)
off of your your current Branch you know
[23:04] (1384.96s)
like smaller and smaller, have a bunch of
[23:06] (1386.84s)
small Forks except it's just a tooling
[23:09] (1389.20s)
right, that, like, so why do people
[23:12] (1392.64s)
not do this you know like just forget
[23:14] (1394.32s)
about, like, the tooling that Meta has, but
[23:16] (1396.12s)
you could just literally you know create
[23:18] (1398.00s)
a branch and then create yet another
[23:19] (1399.48s)
Branch but I guess it's just really
[23:20] (1400.60s)
complex to look at what's happening
[23:22] (1402.64s)
complex to merge it back. I think Git makes
[23:25] (1405.96s)
it scary and I think GitHub makes it
[23:27] (1407.76s)
scary so so in git the most basic way to
[23:30] (1410.48s)
do this would be to create branches off
[23:32] (1412.12s)
of branches but then if you need to
[23:33] (1413.36s)
update something you need to go ahead
[23:35] (1415.08s)
and rebase it now that's where I just
[23:36] (1416.56s)
use a really scary word for a lot of
[23:37] (1417.96s)
people: rebase. Oh, we hate rebasing. I
[23:40] (1420.96s)
I mean okay I can't talk for anyone but
[23:43] (1423.16s)
I'm going to admit it, it always freaked me out,
[23:45] (1425.64s)
having a rebase conflict and
[23:49] (1429.56s)
figuring out what I need to do
[23:51] (1431.28s)
yeah, it scares me, and I've been doing
[23:53] (1433.60s)
this for years. I actually have a Stack
[23:55] (1435.12s)
Overflow post bookmarked, which is the
[23:56] (1436.72s)
three types of rebase because depending
[23:58] (1438.84s)
on how many arguments you give it it
[24:00] (1440.40s)
behaves differently and so super
[24:03] (1443.28s)
frightening I think that's the place I'm
[24:05] (1445.80s)
very fortunate that I learned this at
[24:06] (1446.96s)
Meta, because I think what Meta has is,
[24:08] (1448.64s)
and what I've learned Google has and
[24:10] (1450.48s)
what I've learned, uh, places like Stripe
[24:12] (1452.40s)
and Uber have built are internal tools
[24:14] (1454.16s)
to handle this for you where you can
[24:15] (1455.64s)
basically say like I'm creating a stack
[24:17] (1457.80s)
now I now need to update a branch in my
[24:19] (1459.84s)
stack do that for me and under the hood
[24:22] (1462.48s)
it takes care of this intricate set of
[24:23] (1463.92s)
rebases that while you can do by hand
[24:25] (1465.88s)
are easy to mess up.
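For a sense of what such tools automate, here is a hand-rolled sketch of the restack step, again in Python over plain git. It continues the hypothetical auth-server / auth-frontend / forgot-password stack from the earlier sketch; the real internal tools track this kind of metadata for you, so treat this only as an illustration of the underlying rebases.

```python
# Sketch of restacking by hand: after rewriting the bottom branch of the stack,
# every branch above it is replayed onto the rewritten parent using
# `git rebase --onto`. This is the intricate part that is easy to get wrong.
import subprocess

def git(*args: str) -> None:
    subprocess.run(["git", *args], check=True)

def rev_parse(ref: str) -> str:
    out = subprocess.run(["git", "rev-parse", ref], check=True,
                         capture_output=True, text=True)
    return out.stdout.strip()

# Review feedback means the bottom branch must change.
old_parent_tip = rev_parse("auth-server")                 # remember the pre-rewrite tip
git("checkout", "auth-server")
git("commit", "--amend", "--allow-empty", "--no-edit")    # e.g. after applying the fix

# Replay each branch above onto its rewritten parent, bottom to top.
parent = "auth-server"
for branch in ["auth-frontend", "forgot-password"]:
    branch_old_tip = rev_parse(branch)                    # capture before this branch moves
    git("rebase", "--onto", parent, old_parent_tip, branch)
    parent, old_parent_tip = branch, branch_old_tip
```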
[24:29] (1469.12s)
Now, there's a lot of tooling that does
[24:30] (1470.84s)
that client-side portion. The other half
[24:32] (1472.92s)
of this is the, uh, pull request side:
[24:36] (1476.20s)
is if your host does not if your
[24:38] (1478.96s)
wherever you have your pull request does
[24:40] (1480.76s)
not have support for stacking I think
[24:42] (1482.72s)
Stacks can look very foreign because
[24:44] (1484.28s)
what you end up looking with is a lot of
[24:46] (1486.32s)
PRs that appear to be merging into, like,
[24:48] (1488.88s)
that appear to be recursively merging
[24:50] (1490.36s)
into different branches that they were
[24:51] (1491.68s)
all based off of and so it can be hard
[24:53] (1493.72s)
to understand what's going on and
[24:55] (1495.92s)
without any clear explanation or
[24:57] (1497.24s)
affordance in the UI your reviewer might
[24:59] (1499.20s)
become confused too so I'm kind of like
[25:02] (1502.08s)
wondering if if you have a theory why
[25:04] (1504.44s)
did you know like a few companies like
[25:07] (1507.20s)
uh, Meta and Google, invent this, and why
[25:11] (1511.32s)
why is why did why is it just not part
[25:13] (1513.88s)
of Git or GitHub? I'm sure you've
[25:15] (1515.92s)
speculated about this you must
[25:18] (1518.04s)
have um certainly so um why why I think
[25:21] (1521.96s)
some companies built this I think first
[25:24] (1524.00s)
and foremost the current tools weren't
[25:25] (1525.36s)
solving their own needs, right. Current tools
[25:27] (1527.08s)
being Git and, like, whatever they
[25:30] (1530.76s)
had at the time, right, um, at, uh, Mercurial.
[25:34] (1534.04s)
Mercurial, yep. Um, I think they were shipping
[25:36] (1536.96s)
a high volume I think they were really
[25:38] (1538.44s)
early on monorepos, and I think that
[25:40] (1540.40s)
they were getting to this point where
[25:42] (1542.76s)
okay I'm blocked on review progressively
[25:44] (1544.80s)
larger parts of the org are blocked on
[25:46] (1546.52s)
review and as you sort of like start to
[25:48] (1548.80s)
grow up um as a company you realize like
[25:51] (1551.96s)
oh like if developers are spending 20%
[25:54] (1554.84s)
of their time waiting that's actually a
[25:56] (1556.48s)
lot of inefficiency that's being created
[25:58] (1558.04s)
across the org, right. Most people that run an
[26:00] (1560.32s)
org of over 100 engineers would be like, I
[26:02] (1562.32s)
would love to have more engineers. Um, the second
[26:05] (1565.84s)
thing is that uh they saw the potential
[26:09] (1569.72s)
impact so I think for a lot of people
[26:11] (1571.96s)
they they sort of like saw that lift of
[26:14] (1574.20s)
like okay um Facebook for example really
[26:16] (1576.72s)
prioritized being able to improve their
[26:18] (1578.36s)
ship velocity for whatever reason it
[26:20] (1580.08s)
mattered to them a lot um Google I think
[26:22] (1582.36s)
it as well of well we're a web company
[26:24] (1584.88s)
if we can like continuously deploy we
[26:27] (1587.12s)
can continue to improve things
[26:29] (1589.00s)
um and thirdly I think they had near
[26:30] (1590.60s)
infinite resources and this is a thing
[26:32] (1592.28s)
that I think most of us don't have which
[26:33] (1593.96s)
is they saw the potential impact they
[26:36] (1596.00s)
saw a problem they had the resources and
[26:37] (1597.92s)
so they could build a solution I think
[26:39] (1599.84s)
it's much harder when you're a startup
[26:41] (1601.80s)
or even just a smaller scale company
[26:44] (1604.84s)
than one of the what three most valuable
[26:46] (1606.88s)
companies in the world uh to be able to
[26:49] (1609.24s)
say like I'm going to take a few
[26:50] (1610.36s)
engineers and I'm going to build this
[26:51] (1611.80s)
yeah, I mean, these are companies who are
[26:53] (1613.48s)
printing money despite having you know
[26:55] (1615.84s)
thousands of engineers, even in tens of
[26:57] (1617.36s)
thousands so so they I mean obviously
[26:59] (1619.64s)
they're going to like like be rational
[27:01] (1621.24s)
they're not just going to create a team
[27:02] (1622.60s)
out of nowhere, but this is very
[27:04] (1624.12s)
interesting because what you mentioned
[27:05] (1625.56s)
is, I remember I talked with Nicole of
[27:07] (1627.72s)
DORA, uh, with Nicole Forsgren, uh, about
[27:10] (1630.68s)
developer productivity and you know one
[27:12] (1632.80s)
of the things that came up when we
[27:14] (1634.44s)
discussed is like well you do want to
[27:16] (1636.16s)
like measure certain things just to get
[27:17] (1637.92s)
a sense and one of them is typically
[27:19] (1639.12s)
code review like how or how long does it
[27:21] (1641.16s)
take from writing the code to getting it
[27:22] (1642.84s)
in there. So I have a feeling that
[27:26] (1646.00s)
companies like Meta, Google, and some of
[27:27] (1647.64s)
these bigger ones who did have the
[27:28] (1648.60s)
resources they started to measure it
[27:29] (1649.88s)
earlier therefore they decided let's
[27:32] (1652.32s)
build better tooling and one interesting
[27:35] (1655.64s)
part that I've only learned recently I
[27:37] (1657.92s)
was talking with one of the three uh
[27:40] (1660.04s)
Linux fellows uh
[27:41] (1661.88s)
Greg Kroah-Hartman, uh,
[27:44] (1664.08s)
he's you know there there's and he told
[27:47] (1667.52s)
me something interesting that git was
[27:48] (1668.84s)
created by the Linux Community well
[27:50] (1670.92s)
specifically Linus Torvalds, but it was to
[27:52] (1672.48s)
solve their issue of Linux development
[27:55] (1675.16s)
where they had about three to four
[27:58] (1678.20s)
changes per I think per per hour or so
[28:01] (1681.40s)
and it was an email-based list, so they
[28:03] (1683.92s)
created Git to solve their own problem, and
[28:06] (1686.64s)
now I think they have something like 40
[28:08] (1688.56s)
50 changes per day or per hour,
[28:11] (1691.60s)
something like that I think it's per day
[28:13] (1693.32s)
actually that that's that's their rate
[28:15] (1695.88s)
and you they built it it's all for them
[28:17] (1697.64s)
they open sourced it but it was for them
[28:19] (1699.64s)
it was there's this Linux there's this
[28:21] (1701.60s)
big it's also a monolith uh the changes
[28:24] (1704.88s)
are pretty small for the most part
[28:27] (1707.36s)
there's not as much let's say you know
[28:28] (1708.92s)
like user-facing code, etc., but they solve
[28:31] (1711.88s)
for themselves so I wonder if they just
[28:33] (1713.60s)
didn't really have the problem that
[28:35] (1715.08s)
stacked diffs solve, because they don't
[28:37] (1717.24s)
have uh if you look at the average Linux
[28:40] (1720.16s)
change it's a lot smaller than let's say
[28:42] (1722.52s)
you know like a new feature being added
[28:44] (1724.08s)
to some of the code bases at, like, Meta or
[28:46] (1726.48s)
Uber I think uh you hit the nail on the
[28:49] (1729.24s)
head with it's really important to use
[28:50] (1730.60s)
tools that are built for your use case I
[28:52] (1732.76s)
think that open source development
[28:54] (1734.48s)
generally and Linux specifically uh
[28:57] (1737.12s)
stacking makes no sense for so people
[28:59] (1739.20s)
always ask me sort of like who is
[29:00] (1740.56s)
stacking not for my answer is open
[29:02] (1742.44s)
source and the reasoning is that in open
[29:04] (1744.52s)
source you're dealing with a lot of
[29:06] (1746.24s)
untrusted authors who come to you
[29:08] (1748.72s)
they're like hey I have this feature
[29:10] (1750.64s)
it's partially built I don't have tests
[29:12] (1752.60s)
for it yet I haven't fully tested it can
[29:14] (1754.96s)
I merge it and it's funny in that in a
[29:17] (1757.52s)
company if you're like I have this
[29:19] (1759.08s)
feature it's partially built maybe you
[29:20] (1760.72s)
even have some tests for it but the
[29:22] (1762.16s)
feature is not totally there you're like
[29:24] (1764.12s)
I know you're going to be here tomorrow
[29:25] (1765.88s)
right, and so, cool, thank you for doing your
[29:28] (1768.28s)
job I appreciate that you're
[29:29] (1769.24s)
contributing back to the codebase and
[29:30] (1770.68s)
that you're going to prevent me from
[29:31] (1771.72s)
having to resolve rebase conflicts later,
[29:33] (1773.64s)
merge conflicts I think that um in Linux
[29:37] (1777.12s)
you kind of see the opposite right and
[29:38] (1778.40s)
if you've ever spent time on those email
[29:39] (1779.76s)
lists you'll see people throw back and
[29:41] (1781.28s)
forth sort of like patches and what will
[29:43] (1783.68s)
happen is that they'll say like oh like
[29:46] (1786.52s)
this isn't complete you're not going to
[29:47] (1787.96s)
work on this Edge case right and rather
[29:49] (1789.68s)
than saying like you know what like we
[29:51] (1791.92s)
can put this behind a feature flag and
[29:53] (1793.36s)
merge it they're instead like go take
[29:55] (1795.24s)
this back, fix it, fully finish it, wrap it
[29:58] (1798.28s)
in tests and documentation and
[29:59] (1799.76s)
everything else and only then can you
[30:01] (1801.60s)
merge it um and that makes a ton of
[30:04] (1804.00s)
sense in open source right having worked
[30:05] (1805.88s)
on open source repos myself the biggest
[30:07] (1807.84s)
fear is that the contributor is just
[30:08] (1808.96s)
going to disappear it happens more often
[30:10] (1810.72s)
than we like to admit they get busy
[30:12] (1812.04s)
something happens and so in that case
[30:14] (1814.28s)
you don't want to land partial Stacks
[30:16] (1816.28s)
you want to make sure that the thing is
[30:17] (1817.76s)
fully complete before you even consider
[30:20] (1820.20s)
like reviewing it
[30:21] (1821.68s)
truthfully yeah and this is actually
[30:24] (1824.08s)
like when I had a discussion this trust
[30:26] (1826.04s)
came up so much with with open source so
[30:29] (1829.68s)
it's interesting how like you know as
[30:32] (1832.28s)
you said the context like inside a
[30:33] (1833.80s)
company where you have people you can
[30:35] (1835.16s)
actually trust them that they'll follow
[30:36] (1836.32s)
up or if they don't follow up it's not a
[30:37] (1837.80s)
big deal you know like it it might be
[30:40] (1840.12s)
deleted interesting how how this context
[30:42] (1842.56s)
like like makes such a big difference
[30:44] (1844.40s)
now another thing that is unique I guess
[30:47] (1847.52s)
maybe not just to to large companies but
[30:50] (1850.60s)
it kind of started there: monorepos.
[30:52] (1852.96s)
Yeah, so what was the, you know, Meta
[30:56] (1856.08s)
and Google were probably the first two
[30:57] (1857.44s)
companies to start doing monorepos. You've
[30:59] (1859.80s)
seen it at Meta, what was the
[31:01] (1861.48s)
history with uh with with with the
[31:03] (1863.52s)
monorepo story there, and the kind of
[31:05] (1865.24s)
pain points you
[31:06] (1866.28s)
saw yeah totally um I think you have to
[31:09] (1869.80s)
remember so first things first you have
[31:11] (1871.36s)
to remember that source control at Meta
[31:12] (1872.76s)
has a funny history when that company
[31:14] (1874.52s)
started and this is well before my time
[31:17] (1877.24s)
rumor has it the way they did Source
[31:18] (1878.68s)
control is emailing back and forth zip
[31:20] (1880.32s)
files, and so that's how it started
[31:24] (1884.36s)
yeah um I mean that wasn't so uncommon
[31:27] (1887.00s)
right, Git itself is a fairly new
[31:28] (1888.36s)
innovation, came around in 2007, 2008, um,
[31:31] (1891.80s)
and so I think that when Facebook
[31:35] (1895.36s)
started to do Source control uh they had
[31:39] (1899.36s)
this idea that well we want to be able
[31:42] (1902.52s)
to version everything together and I
[31:44] (1904.36s)
think there are a lot of advantages to
[31:45] (1905.72s)
that I think one is it makes
[31:47] (1907.00s)
collaboration a lot easier it makes
[31:48] (1908.88s)
Discovery a lot easier um I think it
[31:51] (1911.28s)
makes it easy to put dependencies
[31:53] (1913.16s)
between different services so I think
[31:54] (1914.76s)
all of us have had that case where it's
[31:56] (1916.24s)
like oh well I need to change but that
[31:58] (1918.44s)
also means I need to change the call
[31:59] (1919.84s)
sites or that also means I need to
[32:01] (1921.20s)
change this other thing. Um, monorepos
[32:04] (1924.16s)
allow you to do that right because what
[32:05] (1925.52s)
it's saying is there's one like unified
[32:07] (1927.64s)
state of the world that we can look at
[32:10] (1930.40s)
can make sense and everything's good we
[32:11] (1931.96s)
can test against. Even, uh, polyrepos, the
[32:15] (1935.72s)
opposite of that when an organization
[32:17] (1937.04s)
has many, many, uh, repositories, I think
[32:20] (1940.00s)
have different advantages um you can
[32:22] (1942.08s)
version them separately you can uh have
[32:24] (1944.08s)
different practices within them you can
[32:26] (1946.52s)
uh have different entirely different
[32:28] (1948.80s)
owners within them I actually think that
[32:30] (1950.40s)
GitHub, for many companies, pushes a
[32:32] (1952.32s)
culture of polyrepos because it's
[32:34] (1954.08s)
what's open it's what's common in open
[32:35] (1955.60s)
source right so again going back to this
[32:37] (1957.68s)
Narrative of sort of like oh open source
[32:40] (1960.04s)
is different than closed Source I think
[32:41] (1961.28s)
in open source you actually want things
[32:43] (1963.00s)
like I don't want to have to version at
[32:44] (1964.96s)
the same time that all of my
[32:46] (1966.32s)
dependencies are versioning on that
[32:47] (1967.56s)
doesn't make sense I'm not really
[32:49] (1969.04s)
collaborating with my dependencies yes I
[32:50] (1970.64s)
depend on other open source projects but
[32:52] (1972.48s)
perhaps I'm not contributing to them I
[32:54] (1974.60s)
think within a company those constraints
[32:56] (1976.12s)
change right um Within a company you're
[32:58] (1978.44s)
looking at constraints of like I want to
[33:00] (1980.28s)
collaborate with other people I want to
[33:01] (1981.92s)
be able to version things and so you get
[33:04] (1984.20s)
pushed towards sort of like one unified
[33:06] (1986.32s)
source of Truth for like what is the
[33:08] (1988.12s)
state of our code base now uh by the
[33:11] (1991.36s)
time I was there Facebook wasn't even
[33:13] (1993.76s)
one monorepo, it was multiple monorepos. I
[33:15] (1995.96s)
remember hearing someone refer to it as
[33:17] (1997.52s)
a polylith, which made me laugh a lot, uh,
[33:20] (2000.28s)
where there was the main web monorepo, uh,
[33:22] (2002.92s)
there was Instagram which was separate
[33:24] (2004.52s)
there were the mobile apps which were
[33:25] (2005.72s)
separate from that, there was the tooling,
[33:28] (2008.92s)
yeah which was completely separate and
[33:31] (2011.20s)
they were on a mission to try and unify
[33:32] (2012.72s)
these, all of them, into one
[33:35] (2015.28s)
massive monorepo, because the negatives
[33:37] (2017.60s)
of not having one monorepo were
[33:39] (2019.12s)
starting to affect the company some
[33:40] (2020.60s)
examples of that might be that if you
[33:41] (2021.88s)
wanted to do an end-to-end test on mobile, uh,
[33:43] (2023.80s)
apps I remember there was some test that
[33:45] (2025.68s)
would pull down a copy of the other
[33:48] (2028.04s)
repository um spin up a server from that
[33:50] (2030.84s)
repository and then spin up the mobile
[33:52] (2032.84s)
app such that they could test against
[33:54] (2034.36s)
that and that was that led to all sorts
[33:56] (2036.80s)
of conflicts right like you're it makes
[33:58] (2038.92s)
your CI flakier it makes it hard
[34:00] (2040.96s)
to version against. When we had
[34:02] (2042.36s)
dependencies between repositories that
[34:04] (2044.32s)
was a nightmare and so by bringing
[34:06] (2046.08s)
everything under one roof, it allowed you
[34:07] (2047.56s)
to really uh collaborate across the
[34:10] (2050.40s)
company more actively and uh it allowed
[34:13] (2053.52s)
you to collaborate across the company
[34:14] (2054.64s)
more actively and allowed you to share
[34:16] (2056.92s)
uh, dependencies. So, like, Facebook
[34:20] (2060.04s)
was being slowed down despite having you
[34:22] (2062.68s)
know a few Mega repos if you will or
[34:24] (2064.80s)
these large ones so so they saw the
[34:26] (2066.04s)
disadvantage in the end they they they
[34:27] (2067.96s)
moved to one unified monorepo between iOS,
[34:31] (2071.44s)
Android and even
[34:33] (2073.00s)
web when I was leaving they were on the
[34:35] (2075.48s)
journey to um I remember hearing
[34:37] (2077.76s)
recently that they're still on the
[34:38] (2078.92s)
journey to that that's really
[34:41] (2081.56s)
it's such a different codebase especially
[34:42] (2082.68s)
for Native mobile but
[34:45] (2085.12s)
yeah and but the advantage of it though
[34:48] (2088.20s)
and again going back to it is I think
[34:49] (2089.68s)
that it allows you to enforce a common
[34:51] (2091.12s)
set of engineering practices and uh requirements
[34:54] (2094.60s)
right like the idea of like oh every
[34:56] (2096.64s)
repository should have at least one
[34:58] (2098.52s)
reviewer right in a company yeah
[35:00] (2100.68s)
that probably is a companywide
[35:02] (2102.20s)
requirement uh but in open source like I
[35:05] (2105.84s)
can't be guaranteed that every open
[35:07] (2107.32s)
source project wants the same set of
[35:08] (2108.72s)
requirements and it's interesting
[35:10] (2110.32s)
because it's not just meta like other
[35:11] (2111.96s)
other companies when they grow a certain
[35:13] (2113.76s)
size they start to have the same
[35:15] (2115.24s)
Journey you know I've seen it at Uber as
[35:17] (2117.24s)
far as I understand you know Shopify
[35:19] (2119.28s)
is doing similar things I assume
[35:21] (2121.20s)
that as companies grow larger they'll
[35:23] (2123.72s)
have this journey it typically starts with like oh
[35:26] (2126.20s)
what about a vulnerability of a Library
[35:28] (2128.04s)
being updated would it not be nice if it
[35:29] (2129.76s)
was in one place now you've you've seen
[35:34] (2134.16s)
more of these through obviously through
[35:36] (2136.44s)
with with graphite when you're
[35:37] (2137.56s)
interacting with some of these
[35:39] (2139.20s)
companies do you see some trends of like
[35:42] (2142.08s)
when monorepos are starting to become
[35:43] (2143.72s)
important and and if if they are
[35:46] (2146.36s)
starting to become more important why do
[35:48] (2148.40s)
you think we're not seeing GitHub
[35:49] (2149.80s)
support too much of this
[35:52] (2152.44s)
still it's a great question so uh first
[35:55] (2155.60s)
and foremost I think that we're watching
[35:57] (2157.80s)
an industry-wide move towards monorepos
[35:59] (2159.92s)
most companies we're talking to are
[36:01] (2161.48s)
already in a monorepo or migrating to
[36:03] (2163.28s)
a monorepo at least for the people that we
[36:05] (2165.32s)
talk within Silicon Valley a lot of
[36:07] (2167.20s)
people have decided you know this is
[36:08] (2168.80s)
actually the future I see the advantage
[36:10] (2170.68s)
the tooling is there someone should be
[36:12] (2172.76s)
moving here um and so we're actually
[36:15] (2175.24s)
seeing a broader sort of like Trend
[36:17] (2177.08s)
towards monorepos why do I not think
[36:19] (2179.60s)
GitHub supports it I think it doesn't
[36:21] (2181.16s)
make sense for open source I think we're
[36:22] (2182.72s)
just starting to see the like uh Advent
[36:25] (2185.92s)
of monorepos in open source I think React is
[36:28] (2188.12s)
closer to a monorepo than not I think
[36:30] (2190.32s)
that Vercel follows the same pattern
[36:32] (2192.88s)
probably a lot of the same energy there
[36:35] (2195.04s)
um I think we're seeing a lot of repos
[36:37] (2197.48s)
say like well maybe multiple packages
[36:40] (2200.12s)
should be in one repo um but there are a
[36:42] (2202.48s)
lot of advantages to being able
[36:44] (2204.44s)
to put them all together um but I still
[36:46] (2206.56s)
think it's not the norm right
[36:48] (2208.28s)
when you look at like the JavaScript
[36:49] (2209.44s)
ecosystem or the rust ecosystem or the
[36:51] (2211.20s)
python ecosystem we still live in a
[36:53] (2213.16s)
world of many separate packages from many
[36:54] (2214.88s)
different authors with many different
[36:56] (2216.12s)
cultures and the way that we manage those
[36:58] (2218.24s)
is through poly repos yeah and then when
[37:00] (2220.32s)
you say you know you see a lot of
[37:01] (2221.68s)
companies doing this can you just give a
[37:03] (2223.04s)
sense of what types of companies in
[37:04] (2224.60s)
terms of the the size the the funding
[37:06] (2226.68s)
stage you know like I appreciate you
[37:08] (2228.68s)
might not be able to say specific names
[37:10] (2230.40s)
here who don't advertise it but it was
[37:12] (2232.96s)
pretty surprising for me to hear that a
[37:14] (2234.48s)
lot of the companies you work with a lot of
[37:16] (2236.64s)
them are moving no um I'd say that
[37:18] (2238.88s)
graphite tends to work with companies
[37:20] (2240.52s)
with Silicon Valley based companies that
[37:21] (2241.92s)
are somewhere between call it like 100
[37:25] (2245.12s)
and high thousands of people maybe
[37:27] (2247.88s)
10,000 people and so for us I think it's
[37:30] (2250.76s)
a lot of the uh forward thinkers who are
[37:33] (2253.24s)
really investing in developer velocity
[37:34] (2254.96s)
as you were talking about earlier that
[37:36] (2256.28s)
are starting to invest in the like how
[37:38] (2258.04s)
do I speed up my team and are very quickly
[37:40] (2260.40s)
realizing that the coordination overhead
[37:42] (2262.24s)
of poly repos um makes it harder to
[37:45] (2265.16s)
speed up your team it both slows your
[37:47] (2267.00s)
team down in code review in weird uh
[37:49] (2269.88s)
Insidious and second order effect ways
[37:52] (2272.44s)
um and it makes it hard to enforce a
[37:54] (2274.08s)
common culture right so when we talk to
[37:55] (2275.92s)
dev tools teams and they're like oh we
[37:57] (2277.96s)
want to uh migrate CI when you're in a
[38:00] (2280.44s)
monorepo that is a somewhat
[38:02] (2282.84s)
straightforward operation when you're in
[38:04] (2284.44s)
a poly repo that is a multi-year journey
[38:06] (2286.28s)
that you're on
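(To make the monorepo CI point concrete: a single pipeline can decide what to build by mapping the changed paths to packages, roughly like the TypeScript sketch below. The package names and layout are invented for illustration, and a real setup would also walk the dependency graph.)

```typescript
// Minimal sketch of monorepo CI change detection: given the files a pull
// request touched, decide which packages need a build. Paths and package
// names here are hypothetical.
const packageRoots: Record<string, string> = {
  "services/payments/": "payments",
  "services/rides/": "rides",
  "libs/logging/": "logging",
};

function affectedPackages(changedFiles: string[]): Set<string> {
  const affected = new Set<string>();
  for (const file of changedFiles) {
    for (const [root, pkg] of Object.entries(packageRoots)) {
      if (file.startsWith(root)) affected.add(pkg);
    }
  }
  return affected;
}

// A change touching payments code and a shared library triggers two builds.
console.log(affectedPackages([
  "services/payments/refund.ts",
  "libs/logging/logger.ts",
])); // Set(2) { 'payments', 'logging' }
```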
[38:09] (2289.32s)
oh yeah and I think I can empathize with this
[38:11] (2291.52s)
yeah and and I think what you said like
[38:13] (2293.08s)
there needs to be some investment like it
[38:14] (2294.48s)
it doesn't just happen like this there I
[38:16] (2296.04s)
mean there's better tools now but you
[38:17] (2297.72s)
need to have like dedicated people
[38:19] (2299.88s)
uh looking at vendors who are doing it
[38:21] (2301.48s)
Etc so probably makes sense that the the
[38:25] (2305.24s)
the teams who are investing in better
[38:26] (2306.76s)
tooling are probably the ones who are
[38:28] (2308.68s)
looking at these Alternatives now and
[38:31] (2311.44s)
that there's probably some selection
[38:32] (2312.80s)
effect in who we talk to yeah right of
[38:34] (2314.76s)
when teams when teams want to invest in
[38:36] (2316.40s)
tooling they start talking to us and at
[38:38] (2318.32s)
that point they've already started to
[38:39] (2319.40s)
consider the monorepo journey no well
[38:41] (2321.60s)
I mean you don't need to tell me because
[38:42] (2322.92s)
I've seen the benefits of a monorepo I
[38:44] (2324.40s)
mean I see the costs as well but it was
[38:46] (2326.28s)
so good at at Uber so many benefits and
[38:48] (2328.72s)
like I would you know if I had a magic
[38:50] (2330.84s)
wand I would say let's move to a mono
[38:52] (2332.28s)
repo obviously there's a cost so I I see
[38:55] (2335.36s)
this I'm just surprised in
[38:57] (2337.68s)
kind of an encouraging way that there's
[38:59] (2339.40s)
so much of this happening I think
[39:01] (2341.88s)
another place where um things got
[39:03] (2343.88s)
confused let's call it is that people
[39:05] (2345.84s)
associate microservices with many
[39:07] (2347.92s)
repos and I think again that's because
[39:10] (2350.24s)
that's a way that um I think GitHub very
[39:12] (2352.84s)
much encourages you to think of like oh
[39:14] (2354.24s)
you have abstractions between code you
[39:15] (2355.92s)
should have different repos and I think
[39:18] (2358.04s)
that I I have a friend um also from
[39:20] (2360.64s)
Facebook who likes to joke that uh the
[39:23] (2363.00s)
discovery of monorepos is really just
[39:24] (2364.36s)
the discovery of folder structure where
[39:26] (2366.28s)
it's like well yes you have multiple you
[39:28] (2368.04s)
have multiple uh different pieces of
[39:29] (2369.72s)
code but you can just put them in
[39:30] (2370.96s)
different folders that's okay
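(As a toy illustration of that folder-structure point, and of the earlier note that different directories in one repo can have entirely different owners: the TypeScript sketch below routes a changed path to its owning team, the same idea a CODEOWNERS-style file expresses. The teams and paths are made up.)

```typescript
// Toy sketch of per-directory ownership inside a single monorepo.
// Teams and directory names are invented for illustration.
const owners: Array<[prefix: string, team: string]> = [
  ["mobile/ios/", "ios-platform"],
  ["mobile/android/", "android-platform"],
  ["web/", "web-infra"],
  ["infra/ci/", "dev-tools"],
];

function ownerFor(path: string): string {
  // Longest matching prefix wins, so deeper folders can override broader ones.
  const match = owners
    .filter(([prefix]) => path.startsWith(prefix))
    .sort((a, b) => b[0].length - a[0].length)[0];
  return match ? match[1] : "unowned";
}

console.log(ownerFor("mobile/ios/Feed/FeedView.swift")); // "ios-platform"
console.log(ownerFor("infra/ci/pipelines/build.ts"));    // "dev-tools"
```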
[39:33] (2373.76s)
I think you know probably one of the
[39:35] (2375.08s)
best examples Uber is known for
[39:36] (2376.92s)
advertising how they have 5,000 plus
[39:38] (2378.64s)
microservices and they do have mono
[39:40] (2380.56s)
repos I think per language like four or
[39:42] (2382.68s)
five but yes you you can it's it's not
[39:45] (2385.16s)
dependent it's totally independent from
[39:46] (2386.44s)
one another exactly right now one other
[39:49] (2389.88s)
Trend that's happening now everywhere
[39:52] (2392.08s)
especially with the Forward Thinking
[39:53] (2393.28s)
companies is uh AI tools for code
[39:55] (2395.80s)
generation or for coding
[39:57] (2397.92s)
how do you see this changing how code is
[40:01] (2401.28s)
written uh both through the companies
[40:03] (2403.20s)
that you work at or maybe even at
[40:05] (2405.68s)
graphite it's incredible it's been a
[40:07] (2407.68s)
whole conversation at graphite and it's
[40:09] (2409.24s)
definitely been a whole whole
[40:10] (2410.56s)
conversation with a lot of the companies
[40:11] (2411.80s)
we work with I think the um most
[40:14] (2414.52s)
immediate impact of how will AI impact
[40:16] (2416.44s)
software engineering is there will be
[40:17] (2417.88s)
more software written right there is
[40:19] (2419.84s)
going to be more code written as the
[40:21] (2421.36s)
tools make it easier and easier for
[40:23] (2423.60s)
developers to ship code I think what's
[40:26] (2426.08s)
uh not as obvious is the second order
[40:27] (2427.76s)
effect right if you have more code
[40:29] (2429.40s)
written well what needs to happen well
[40:31] (2431.28s)
more code needs to be reviewed and if
[40:32] (2432.84s)
more code's reviewed more code needs to
[40:34] (2434.00s)
be tested and more code needs to be merged
[40:35] (2435.48s)
and more code needs to be deployed and I
[40:37] (2437.40s)
think that's suddenly the place where
[40:39] (2439.08s)
we're starting we're seeing people run
[40:40] (2440.72s)
into issues where they say okay I think
[40:43] (2443.80s)
that we're ready to adopt
[40:45] (2445.20s)
one of these AI tools that helps us
[40:46] (2446.68s)
generate code that helps our developers
[40:48] (2448.32s)
code faster um let's do that and then
[40:50] (2450.96s)
what immediately happens is we see the
[40:52] (2452.48s)
volume of pull requests go up we see the
[40:54] (2454.16s)
size of those pull requests go up we see
[40:56] (2456.08s)
the amount of bugs or broken builds also go up
[40:59] (2459.04s)
and I think getting ready for that
[41:00] (2460.68s)
transformation is a thing that the
[41:01] (2461.88s)
industry is just starting to Grapple
[41:03] (2463.32s)
with this episode was brought to you by
[41:05] (2465.44s)
Sentry buggy lines of code and long API
[41:08] (2468.12s)
calls are impossible to debug and random
[41:10] (2470.36s)
app crashes are things no software
[41:11] (2471.84s)
engineer is a fan of this is why over 4
[41:15] (2475.12s)
million developers use Sentry to fix
[41:17] (2477.00s)
errors and crashes and solve hidden or
[41:18] (2478.80s)
tricky performance
[41:20] (2480.32s)
issues Sentry cuts debugging time in half no
[41:23] (2483.92s)
more soul-crushing log sifting or vague
[41:26] (2486.00s)
user reports like it broke fix it get
[41:29] (2489.40s)
the context you need to know what
[41:31] (2491.08s)
happened when it happened and the impact
[41:33] (2493.24s)
down to the device browser and even a
[41:35] (2495.00s)
replay of what the user did before the
[41:36] (2496.84s)
error Sentry will alert the right dev on
[41:39] (2499.36s)
your team with the exact broken line of
[41:41] (2501.08s)
code so they can push a fix fast or let
[41:43] (2503.96s)
autofix handle the repetitive fixes so
[41:45] (2505.96s)
your team can focus on the real
[41:47] (2507.96s)
problems Sentry helped monday.com
[41:50] (2510.48s)
reduce their errors by 60% and sped up
[41:52] (2512.96s)
time to resolution for Nextdoor by 45
[41:55] (2515.04s)
minutes per Dev per issue get your whole
[41:57] (2517.84s)
team on Sentry in seconds by heading
[41:59] (2519.48s)
to sentry.io/pragmatic that is s-e-n-t-r-
[42:04] (2524.88s)
y.io/pragmatic or use the code pragmatic
[42:08] (2528.56s)
on sign up for three months on the team
[42:10] (2530.20s)
plan and 50,000 errors per month for
[42:12] (2532.24s)
free now you know this is speculative
[42:15] (2535.16s)
but it's it's good to speculate
[42:16] (2536.40s)
sometimes how do you think successful
[42:19] (2539.20s)
companies will cope with it and this is
[42:21] (2541.08s)
we we can just speculate right like we
[42:23] (2543.00s)
know there's going to be more code being
[42:24] (2544.64s)
turned out devs will be able to generate
[42:27] (2547.20s)
stuff faster build features faster and
[42:29] (2549.52s)
yes it will have all these a lot of them
[42:31] (2551.48s)
will just tab tab tab accept and they're
[42:33] (2553.80s)
going to not notice certain things being
[42:36] (2556.44s)
there we even have like attacks of like
[42:38] (2558.16s)
you know some companies trying to have
[42:39] (2559.80s)
malicious B like you know stack Overflow
[42:41] (2561.80s)
copy paste and you have the memory leak
[42:43] (2563.64s)
will have more of those what you know
[42:47] (2567.16s)
let's just do the thought experiment
[42:48] (2568.52s)
like how how
[42:51] (2571.00s)
will good teams deal with
[42:53] (2573.52s)
this I think AI necessarily is both the
[42:57] (2577.24s)
problem and the solution so I agree with
[42:59] (2579.28s)
I think that the fastest moving teams
[43:00] (2580.76s)
are going to adopt AI I think we've
[43:02] (2582.92s)
moved past the world where where you can
[43:05] (2585.16s)
say like ah this is a fad this will pass
[43:07] (2587.60s)
AI is going to mean a lasting change on
[43:09] (2589.60s)
the software development industry period
[43:12] (2592.24s)
I think that you're right it's going to
[43:13] (2593.72s)
create some problems right you have
[43:14] (2594.84s)
developers just clicking tab non-stop
[43:17] (2597.04s)
probably importing code that they
[43:18] (2598.40s)
neither we already have it right it's
[43:19] (2599.60s)
called Vibe coding
[43:21] (2601.96s)
yeah I keep I I love the Twitter memes
[43:24] (2604.76s)
about it but yeah vibe coding is a thing
[43:26] (2606.84s)
right you have developers at your
[43:28] (2608.04s)
company who are probably just talking to
[43:29] (2609.52s)
their computer telling it like yeah
[43:30] (2610.92s)
create this website no I'm not going to
[43:32] (2612.28s)
read that code and they're putting up
[43:33] (2613.76s)
for code review and I think that to me
[43:36] (2616.48s)
that just emphasizes the point of code
[43:37] (2617.88s)
review because if the author didn't read
[43:39] (2619.36s)
the code then someone else certainly
[43:40] (2620.76s)
should and the question is how are we
[43:42] (2622.72s)
going to make sure that happens and so I
[43:44] (2624.72s)
have two answers for you one is I think
[43:47] (2627.16s)
a lot of the practices that already
[43:48] (2628.60s)
existed at these major companies at the
[43:50] (2630.48s)
Google and Facebook and to a lesser
[43:51] (2631.92s)
extent um but still an extent Uber
[43:54] (2634.72s)
Twitter Snapchat um which are used to
[43:57] (2637.60s)
seeing a high volume of changes um written
[44:00] (2640.40s)
by a diverse employee pool um a lot of
[44:03] (2643.60s)
those practices are going to percolate
[44:05] (2645.44s)
down because those used to be the
[44:06] (2646.68s)
problems of like well you only run into
[44:08] (2648.64s)
this issue if you have 100 developers we
[44:10] (2650.80s)
see it on Twitter people are saying
[44:12] (2652.08s)
teams of 10 can start acting like teams
[44:13] (2653.80s)
of 100 developers and so that also comes
[44:16] (2656.32s)
with the problems of hundreds of developers
[44:18] (2658.72s)
the second solution is I think that AI
[44:20] (2660.20s)
is going to transform the way that we
[44:21] (2661.72s)
view code review I think the uh the
[44:24] (2664.32s)
largest change which has happened uh
[44:26] (2666.04s)
something which we are
[44:28] (2668.12s)
trying to trying to uh keep up with
[44:31] (2671.76s)
internally is that I think we're seeing
[44:34] (2674.12s)
the same Advent that happened when
[44:35] (2675.52s)
something like grammarly or the grammar
[44:37] (2677.40s)
checker in word happened where AI can
[44:39] (2679.76s)
actually check a lot of the minutia or
[44:41] (2681.28s)
mechanics of code review it can tell you
[44:42] (2682.88s)
that like yes this code works or yes
[44:44] (2684.88s)
this code does what it intended to do
[44:47] (2687.04s)
the question though of code review then
[44:48] (2688.80s)
isn't is this code literally bug free
[44:51] (2691.76s)
it's is this aligned with what we wanted
[44:53] (2693.92s)
it to do right is this I I originally we
[44:57] (2697.12s)
specced out the system it was supposed to
[44:58] (2698.68s)
work this way um or maybe we have built
[45:01] (2701.08s)
the system and it does work this way
[45:03] (2703.00s)
what are the other effects of that how
[45:04] (2704.40s)
does this interplay with everything else
[45:05] (2705.92s)
we do at this company what's going to
[45:07] (2707.52s)
happen there and I think that we're
[45:08] (2708.64s)
going to see AI tools built we're we're
[45:10] (2710.56s)
building one ourselves our AI reviewer
[45:12] (2712.84s)
that let developers um automatically
[45:15] (2715.56s)
check this code for sort of like the the
[45:18] (2718.08s)
simple stuff allowing the reviewers to
[45:19] (2719.80s)
focus on the more complicated stuff do
[45:22] (2722.16s)
you think testing will become more
[45:24] (2724.88s)
important in this in this world where we
[45:28] (2728.24s)
we do have like a lot more AI tools a
[45:29] (2729.88s)
lot more AI code a lot more you know
[45:31] (2731.88s)
pushing to
[45:33] (2733.12s)
production I think testing's going to
[45:35] (2735.04s)
change and so I I is it going to is it
[45:38] (2738.40s)
going to so is it going to become more
[45:41] (2741.16s)
important probably in some form
[45:43] (2743.64s)
definitely because as I said I think
[45:45] (2745.52s)
that if developers are not putting
[45:48] (2748.24s)
thought into every keystroke but instead
[45:50] (2750.00s)
we're just seeing like this code written
[45:51] (2751.76s)
quickly checking that code for
[45:53] (2753.68s)
correctness both the like literal like
[45:55] (2755.96s)
does the function work as intended and
[45:57] (2757.76s)
the end-to-end correctness of like are the
[45:59] (2759.88s)
buttons in the right place and is it
[46:01] (2761.68s)
usable are critical now I think what's
[46:04] (2764.68s)
really interesting or the trend that
[46:05] (2765.88s)
we're seeing is that it appears that AI
[46:07] (2767.68s)
can actually write tests for you in some
[46:10] (2770.00s)
limited uh cases and when we look at
[46:12] (2772.00s)
some of the more advanced AI agents they
[46:14] (2774.08s)
actually do this as part of their coding
[46:15] (2775.52s)
right if you ask an agent to like
[46:17] (2777.08s)
generate a generate a change and also
[46:19] (2779.36s)
test it it'll do a pretty good job um
[46:22] (2782.00s)
and I think that a lot of the previous
[46:23] (2783.84s)
types of tests that before were uh
[46:26] (2786.40s)
restricted to QA people or to humans of
[46:29] (2789.12s)
like look at this website and try to
[46:31] (2791.08s)
like click around it with the mouse
[46:32] (2792.72s)
where before we had to use uh we had to
[46:34] (2794.96s)
use an end-to-end testing tool it was hard
[46:36] (2796.84s)
we'd kind of be like oh go click at like
[46:38] (2798.80s)
XY location this you
[46:41] (2801.68s)
can we even hired you know like testers
[46:44] (2804.88s)
who you we call it sanity testing you
[46:47] (2807.16s)
describe what to do go there take a
[46:49] (2809.88s)
credit card and like that was cuz I was
[46:52] (2812.28s)
on payment team like we actually had
[46:53] (2813.80s)
some of the fake credit cards you'd
[46:55] (2815.48s)
take one of them enter it there and we did
[46:58] (2818.12s)
it because we couldn't really automate
[46:59] (2819.88s)
it for different reasons the payments is
[47:02] (2822.20s)
a bit tricky but yeah it was
[47:05] (2825.60s)
a pretty terrible job to do but someone
[47:08] (2828.80s)
had to do it and someone was being paid
[47:10] (2830.52s)
so exactly and it's uh I think that um
[47:15] (2835.72s)
again the the machines can start to do a
[47:17] (2837.64s)
lot of this work I think the question
[47:19] (2839.52s)
that we have to Grapple with then is so
[47:21] (2841.60s)
what is the purpose of this review step
[47:23] (2843.40s)
right why and I think that for a lot of
[47:25] (2845.68s)
people uh
[47:27] (2847.36s)
this is a question we used to talk about
[47:29] (2849.12s)
like three years ago and I think with
[47:31] (2851.32s)
time with the advantage of time we've
[45:33] (2853.32s)
had a lot more clarity of like
[47:35] (2855.20s)
the purpose of code review isn't just
[47:37] (2857.00s)
the mechanical it's not just hey can you
[47:38] (2858.64s)
proofread my essay for me because I want
[47:40] (2860.24s)
to make sure there are no grammar mistakes
[47:41] (2861.68s)
in it I think there's also a type of
[47:45] (2865.00s)
shared learning that happens of hey I
[47:47] (2867.44s)
want other people to code review this
[47:48] (2868.92s)
because I want to distribute my
[47:50] (2870.04s)
knowledge of how the system works and I
[47:52] (2872.08s)
think that there is an alignment
[47:53] (2873.32s)
checking that happens of hey there's uh
[47:56] (2876.04s)
this is how we build software here and
[47:57] (2877.60s)
there's tribal knowledge that maybe more
[47:59] (2879.08s)
senior staff level engineers at your
[48:00] (2880.64s)
company have that they disseminate
[48:02] (2882.80s)
through the process of code review and I
[48:04] (2884.60s)
think that as uh we take away these
[48:07] (2887.16s)
mechanical parts or reduce
[48:09] (2889.12s)
their need um you're going to see those
[48:11] (2891.08s)
latter two become a lot more emphasized
[48:13] (2893.24s)
I am a little worried though about there
[48:15] (2895.88s)
is a big we do see that AI tools are
[48:18] (2898.44s)
very capable of generating code and I
[48:21] (2901.16s)
think you know like experienced
[48:22] (2902.60s)
developers who've been been in the
[48:23] (2903.96s)
industry for 10 plus years and have seen
[48:26] (2906.00s)
for example when they work with the teams
[48:27] (2907.64s)
of Junior engineers and you see what
[48:29] (2909.60s)
happens you know it's a bit of a mess
[48:30] (2910.88s)
then you need to really clean it up like
[48:33] (2913.00s)
they I think you know they see that this
[48:35] (2915.32s)
has limitations right like it's it's a
[48:37] (2917.12s)
very powerful tool when you know what
[48:38] (2918.68s)
good output is when you can check it
[48:40] (2920.08s)
when you can step in and say this is BS
[48:41] (2921.80s)
stop or or or you just like stop
[48:44] (2924.04s)
prompting after a while because there's
[48:45] (2925.16s)
no going anywhere and you start take
[48:46] (2926.64s)
over and you just write whatever you
[48:47] (2927.96s)
need to do but I I do feel that people
[48:50] (2930.76s)
over index on how good AI generated code is
[48:52] (2932.88s)
and I have a feeling that a lot of
[48:55] (2935.40s)
companies are or especially you know
[48:57] (2937.24s)
like company leaders even CTOs they'll
[48:59] (2939.64s)
they'll kind of ignore this part of of
[49:01] (2941.48s)
learning cuz it's invisible right like
[49:03] (2943.44s)
you you don't really see the difference
[49:05] (2945.36s)
between like two teams one uh there's no
[49:07] (2947.60s)
code review or just AI code review
[49:09] (2949.16s)
everyone's doing their own thing and you
[49:11] (2951.00s)
know and they're they're both shipping
[49:12] (2952.16s)
for a while and at some point something
[49:13] (2953.56s)
breaks and this team where people are
[49:15] (2955.52s)
actually doing code reviews they're
[49:16] (2956.76s)
thoughtful they understand the code they
[49:19] (2959.00s)
can start you know nothing slows down
[49:20] (2960.96s)
they build the new features they
[49:22] (2962.40s)
innovate whereas this one kind of gets
[49:23] (2963.96s)
stuck there just bugs no one knows why
[49:26] (2966.48s)
but because it's not visible like I'm I
[49:30] (2970.48s)
have this feeling that there there's
[49:31] (2971.92s)
there might be I don't want to like you
[49:33] (2973.44s)
know be all negative about it but it's
[49:36] (2976.16s)
it's just so many steps removed from
[49:38] (2978.00s)
from the work that I I wonder if like
[49:40] (2980.56s)
few companies will appreciate it a bit
[49:42] (2982.36s)
like how few companies used to
[49:43] (2983.52s)
appreciate developer tools you know like
[49:45] (2985.08s)
it's only fashionable these days right
[49:47] (2987.08s)
like like if you think about it 10 or 15
[49:49] (2989.08s)
years ago like everyone was looking like
[49:51] (2991.24s)
why is Google having a dedicated team to
[49:53] (2993.88s)
build internal dev tools that's
[49:58] (2998.48s)
I think it became popular with uh I I'll
[50:00] (3000.60s)
give a lot of credit to Heroku I I think
[50:02] (3002.80s)
I remember just like growing up in that
[50:04] (3004.40s)
era there was uh there was a lot of love
[50:07] (3007.68s)
for like oh this is what good can look
[50:09] (3009.60s)
like you know and I think it really set
[50:11] (3011.24s)
a standard of like oh we don't have to
[50:13] (3013.24s)
accept worse we can actually we can have
[50:14] (3014.96s)
better tooling sort of like in the
[50:16] (3016.72s)
popular opinion and then a lot of Heroku
[50:18] (3018.44s)
developers went on to go to companies
[50:20] (3020.32s)
brought a lot of that energy and feeling
[50:22] (3022.12s)
with them and then we started to see it
[50:23] (3023.60s)
seated there um but yeah I remember I
[50:25] (3025.36s)
remember when Google uh there was a lot
[50:27] (3027.24s)
of sort of like question marks around
[50:29] (3029.00s)
like what is Google doing like why why
[50:30] (3030.96s)
would you invest so much in this and
[50:32] (3032.16s)
meta too um I think that where we where
[50:36] (3036.76s)
this will be different with AI is the
[50:39] (3039.00s)
fact that um at the end of the day code
[50:42] (3042.20s)
has really profound
[50:43] (3043.80s)
business decisions right like if you if
[50:46] (3046.88s)
you decide that you want to give a
[50:48] (3048.36s)
refund or you want to charge this or the
[50:49] (3049.96s)
user experience wants to look like this
[50:51] (3051.36s)
or this is the way the product works
[50:53] (3053.08s)
that's at the end of the day a business
[50:54] (3054.32s)
decision and I think that what leaders
[50:56] (3056.24s)
will agree with is that they don't want
[50:57] (3057.60s)
the machines making business decisions and so
[51:00] (3060.16s)
at that point at some level that means
[51:02] (3062.00s)
that a human needs to be reviewing it and
[51:04] (3064.20s)
now now the question is sort of like
[51:05] (3065.68s)
what is that level and what is that
[51:06] (3066.88s)
granularity that we review on but I
[51:08] (3068.92s)
think that we can we we can we can find
[51:11] (3071.88s)
a like we can find a common
[51:13] (3073.52s)
understanding and like okay humans will
[51:15] (3075.64s)
need to at least check for intent the
[51:17] (3077.04s)
same way humans prompt and and that's
[51:18] (3078.64s)
not to say that like prompting is bad or it
[51:20] (3080.68s)
doesn't work it's more that just I think
[51:22] (3082.60s)
fundamentally like human natural
[51:24] (3084.52s)
language communication is flawed right
[51:26] (3086.24s)
like you I have a conversation and you
[51:27] (3087.72s)
can understand something different than
[51:29] (3089.08s)
what I mean and so at a company the way
[51:30] (3090.68s)
we get around this is we then check on
[51:32] (3092.80s)
the other side right we align at the
[51:34] (3094.60s)
start and we check at the end uh and I
[51:37] (3097.36s)
don't think that's going to go away yeah
[51:39] (3099.64s)
I I like this thinking and also I wonder
[51:41] (3101.60s)
if we might have more companies discover
[51:44] (3104.36s)
again I'm just a little bit inspired by
[51:45] (3105.92s)
Linux as I recently talked with uh with
[51:48] (3108.08s)
Greg from there that the way Linux works
[51:50] (3110.96s)
you know what I when I talked about how
[51:52] (3112.36s)
the whole project works I was really
[51:53] (3113.92s)
surprised in how much it's about people
[51:55] (3115.76s)
and the core the core concept there is
[51:57] (3117.72s)
when you take a patch they call a PR a
[52:00] (3120.20s)
patch there you now own it and you are
[52:02] (3122.96s)
personally responsible for it and this
[52:05] (3125.64s)
actually you know if something goes and
[52:07] (3127.92s)
this is how there's a pyramid of like
[52:09] (3129.44s)
subsystem owners system owners and then
[52:11] (3131.44s)
in the end there's Linus Torvalds so every single
[52:13] (3133.32s)
release for Linux it goes through it
[52:15] (3135.12s)
still to the day it goes through him he
[52:17] (3137.28s)
owns it uh and you know he's he's very
[52:19] (3139.60s)
happy to take that responsibility but
[52:21] (3141.76s)
and you know Greg was telling me that
[52:23] (3143.08s)
because of this like if you think about
[52:24] (3144.28s)
it and it's very open the whole world
[52:26] (3146.08s)
sees your changes so because of this uh
[52:29] (3149.04s)
there you know people take pride in this
[52:31] (3151.12s)
they will both review but there's also
[52:32] (3152.88s)
trust levels right so if you're a system
[52:35] (3155.00s)
owner and there's a subsystem owner you
[52:36] (3156.64s)
work now for 10 years you know this
[52:38] (3158.60s)
person is solid you're not going to
[52:40] (3160.88s)
spend all the energy and I wonder if you
[52:43] (3163.92s)
know the the correct response to AI is
[52:46] (3166.64s)
like yeah just go wild use these tools
[52:48] (3168.68s)
but responsibility maybe at least goes
[52:50] (3170.92s)
back to developers like you know
[52:53] (3173.16s)
like so we we'll see
[52:56] (3176.56s)
I I suspect that is what we'll see um
[52:58] (3178.64s)
and and as I said that's not to say that
[53:00] (3180.04s)
AI is not going to solve real problems I
[53:01] (3181.36s)
think AI will actually free the humans
[53:03] (3183.68s)
to focus more on those bigger picture
[53:05] (3185.68s)
things I think AI is going to be able to
[53:07] (3187.32s)
like take away the like oh does this
[53:09] (3189.36s)
function say do what it intends to do
[53:11] (3191.48s)
because they can understand that it can
[53:12] (3192.88s)
tell us that but I think it's going to
[53:14] (3194.60s)
be a tool and then we're saying and so does
[53:16] (3196.72s)
this holistic change do what you wanted
[53:18] (3198.40s)
to do yeah so speaking of change you
[53:21] (3201.12s)
know one thing that I I was really
[53:23] (3203.20s)
surprised that when I worked at Uber I
[53:25] (3205.04s)
talked with other companies and stacking
[53:26] (3206.20s)
didn't take off anywhere else cuz
[53:28] (3208.04s)
fabricator was just not there what why
[53:30] (3210.16s)
do you think that didn't take off and
[53:31] (3211.44s)
how did you get the idea to start
[53:32] (3212.92s)
graphite around this idea of you know
[53:35] (3215.64s)
building this tooling to to make it
[53:38] (3218.04s)
accessible for a lot more people yeah of
[53:40] (3220.16s)
course well so I'll start by saying I
[53:42] (3222.68s)
actually don't think that stacking
[53:43] (3223.76s)
caught on inside meta at the start oh so
[53:46] (3226.68s)
um originally when it was built um it
[53:49] (3229.12s)
was built into Mercurial there were uh
[53:51] (3231.48s)
it was advertised around the company
[53:54] (3234.16s)
there were a few senior Engineers who
[53:56] (3236.48s)
saw the vision felt blocked and picked
[53:58] (3238.80s)
it up and I think that not in an
[54:01] (3241.68s)
exaggerating way they saw themselves get
[54:03] (3243.76s)
3x faster and what happened was those
[54:06] (3246.16s)
senior Engineers started to become
[54:07] (3247.80s)
evangelists I had one on my team who was
[54:09] (3249.68s)
like the stacking evangelist and they
[54:11] (3251.28s)
used to go throughout the organization
[54:13] (3253.12s)
and literally give PowerPoint
[54:14] (3254.44s)
presentations of like this is how
[54:16] (3256.40s)
stacking speeds you up like you need to
[54:18] (3258.56s)
believe me I've seen the other side and
[54:20] (3260.84s)
at the time I think you had a lot of
[54:22] (3262.04s)
developers who were very skeptical who
[54:23] (3263.40s)
were like like I've been doing it for so
[54:25] (3265.52s)
long like I I don't need you to tell me
[54:27] (3267.32s)
how to change my workflow why would I do
[54:28] (3268.92s)
that I I hear all the time from current
[54:30] (3270.88s)
you know people who you're about
[54:32] (3272.68s)
stacking yeah and then like what ended
[54:36] (3276.36s)
up happening was that as as these
[54:38] (3278.60s)
evangelists like made inroads and
[54:40] (3280.60s)
started to convince people like no no no
[54:42] (3282.00s)
it's actually speeding you up um people
[54:44] (3284.12s)
would adopt it see the value and and
[54:46] (3286.68s)
like become that evangelist themselves
[54:48] (3288.64s)
we like to joke that once you stack you
[54:50] (3290.08s)
don't go back um I I think why it worked
[54:54] (3294.00s)
so well is it did two things really it
[54:56] (3296.16s)
was like the perfect uh convergence of
[54:58] (3298.80s)
two forces one is that it gave
[55:00] (3300.44s)
individuals what they want so I think no
[55:01] (3301.88s)
developer likes being blocked right if
[55:03] (3303.40s)
you ask a developer like what is the
[55:05] (3305.20s)
feeling towards code review um or
[55:07] (3307.40s)
towards having to wait on code review
[55:09] (3309.72s)
it's almost always negative no one's
[55:11] (3311.36s)
like I love that feeling negative people
[55:14] (3314.20s)
are like it sucks it's the worst part
[55:16] (3316.20s)
and I think that um even if you have a
[55:18] (3318.48s)
more senior developer that's been doing
[55:20] (3320.28s)
that's had code review for 20 years if
[55:22] (3322.08s)
you ask them like why do you enjoy
[55:23] (3323.76s)
coding on personal projects as opposed
[55:25] (3325.48s)
to company project
[55:27] (3327.28s)
the like the overhead of collaboration
[55:29] (3329.48s)
becomes like quite clear there of like
[55:33] (3333.04s)
yeah cuz in like when I work with others
[55:35] (3335.40s)
things just slow down we need to review
[55:36] (3336.96s)
things we need to align on them they get
[55:38] (3338.92s)
a lot slower and I think what stacking
[55:41] (3341.32s)
did is it took away some of that
[55:43] (3343.68s)
overhead right where rather than having
[55:45] (3345.36s)
to wait on other people to review your
[55:46] (3346.96s)
code that you already thought was pretty
[55:48] (3348.40s)
good and you had already aligned on
[55:49] (3349.76s)
before you could just keep going and
[55:51] (3351.76s)
keep Building without needing to without
[55:54] (3354.40s)
needing to sort of wait for them
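(Mechanically, a stack is just branches built on top of one another, each carrying a small pull request. The TypeScript sketch below drives plain git through child_process to show the shape of it; it assumes you are inside a git repo with a main branch, and the branch names and feature "parts" are invented. Tools like Graphite largely exist to automate keeping such a stack rebased as the bottom PRs merge.)

```typescript
// Rough sketch of building a stack of branches with plain git, driven from
// TypeScript. Assumes an existing git repo with a "main" branch; branch names
// and the split into "parts" are hypothetical.
import { execSync } from "node:child_process";

const git = (args: string) => execSync(`git ${args}`, { stdio: "inherit" });

// Part 1: a small, reviewable change branched off main.
git("checkout main");
git("checkout -b feature-db-schema");
// ...commit part 1 here, then open PR #1 targeting main...

// Part 2: keeps building on top of part 1 instead of waiting for its review.
git("checkout -b feature-api-endpoint");
// ...commit part 2 here, then open PR #2 targeting feature-db-schema...

// Part 3: stacked on part 2 the same way.
git("checkout -b feature-ui");
// ...commit part 3 here, then open PR #3 targeting feature-api-endpoint...
```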
[55:55] (3355.96s)
the second side is I guess it was
[55:58] (3358.56s)
great for three parties the second side
[56:00] (3360.04s)
was the reviewer the reviewer started to
[56:01] (3361.80s)
get smaller PRs right and whether you
[56:04] (3364.44s)
whether you know it or not when I send
[56:06] (3366.40s)
you a 10line PR you are so much happier
[56:08] (3368.48s)
than if I send you a 2,000 line so so
[56:10] (3370.80s)
many companies eventually when they
[56:12] (3372.48s)
write down code review guidelines you're
[56:14] (3374.28s)
always going to have small PRs you
[56:15] (3375.56s)
know at Uber at some point we had like
[56:17] (3377.64s)
recommendations within the team for 50
[56:19] (3379.32s)
or 70 lines whatever that is but you
[56:21] (3381.88s)
realize I mean there's jokes going
[56:23] (3383.88s)
around and I think it was viral on
[56:25] (3385.32s)
social media of the you know the 5,000
[56:27] (3387.44s)
line PR looks good to me and then
[56:29] (3389.28s)
the 10 line which gets like 10
[56:31] (3391.56s)
comments exactly right and I think
[56:33] (3393.68s)
people as we were saying before people
[56:35] (3395.16s)
buy out of it and and the third is I
[56:36] (3396.60s)
think it really gave the company
[56:38] (3398.48s)
something they needed which is that they
[56:39] (3399.80s)
started to see their engineering
[56:41] (3401.20s)
organization speed up they started to
[56:43] (3403.08s)
see time to merge go down that's usually
[56:44] (3404.92s)
a metric that people measure and we can
[56:46] (3406.40s)
talk about engineering metrics separately
[56:48] (3408.60s)
um and uh in the case that something
[56:50] (3410.72s)
went wrong it made it much easier to
[56:52] (3412.28s)
correct things right in the case that QA
[56:54] (3414.48s)
or even users like found something that
[56:57] (3417.08s)
wasn't working the way it should be it
[56:58] (3418.72s)
was much easier to then go to history
[57:00] (3420.64s)
and be like which one of these changes
[57:02] (3422.36s)
did it than when the changes were like
[57:04] (3424.00s)
monolithic feature monolithic feature
[57:05] (3425.60s)
monolithic feature yeah now you know you
[57:07] (3427.92s)
built this tooling and you are still a
[57:10] (3430.28s)
pretty small startup how did you
[57:12] (3432.00s)
convince companies to to you know like
[57:14] (3434.72s)
use a pretty small startup pretty
[57:17] (3437.04s)
unknown methodology you know rumors are
[57:19] (3439.68s)
meta uses it a lot because I assume it's
[57:21] (3441.60s)
not a really easy sell is
[57:23] (3443.20s)
it no it's not an easy sell at all um
[57:25] (3445.96s)
well so back to the company's Origins we
[57:27] (3447.92s)
didn't so I don't know if you know this
[57:29] (3449.76s)
we didn't actually start as a code
[57:30] (3450.88s)
review company uh so we actually started
[57:33] (3453.08s)
by building tools for mobile releases
[57:35] (3455.04s)
that's the background of the founders
[57:36] (3456.76s)
that's the area we wanted to like build
[57:38] (3458.52s)
in um I uh I was fortunate enough to
[57:42] (3462.64s)
have two of my teammates from meta join
[57:44] (3464.00s)
the team and I think one of the things
[57:45] (3465.44s)
that the three of us felt was like oh
[57:47] (3467.08s)
like the tooling out here I think they
[57:49] (3469.24s)
were both new grads when they had joined
[57:50] (3470.68s)
meta had been there for a handful of
[57:52] (3472.64s)
years call it like four and then when
[57:55] (3475.20s)
they left they had the same experience I
[57:56] (3476.92s)
did which was like oh like the tooling out
[57:59] (3479.64s)
here is different and at the start we
[58:01] (3481.64s)
kind of were like well you know we came
[58:03] (3483.84s)
from we must be the weird ones and after
[58:06] (3486.08s)
about like a few months we were like huh no
[58:08] (3488.72s)
like there there's actually something
[58:09] (3489.88s)
missing here and I think the uh the
[58:12] (3492.92s)
context that I now have that I wish I'd
[58:15] (3495.40s)
had then was one most companies if you
[58:18] (3498.68s)
have more than 50 people you have at
[58:20] (3500.56s)
least one person who builds tooling on
[58:22] (3502.16s)
top of GitHub and I think that if you've
[58:23] (3503.72s)
worked at one of these companies you
[58:24] (3504.92s)
kind of just take that as a truth of
[58:26] (3506.44s)
like yeah I have the dev tools team
[58:28] (3508.00s)
on Jenkins but these days Jenkins is
[58:30] (3510.80s)
GitHub actions so same difference
[58:33] (3513.64s)
exactly you have someone building
[58:34] (3514.96s)
tooling there to make your developers
[58:36] (3516.44s)
faster the second thing is that these
[58:38] (3518.44s)
companies so like meta and Google so
[58:40] (3520.92s)
GitHub released their pull request page
[58:42] (3522.32s)
back in 2013 uh they haven't really made
[58:46] (3526.04s)
a ton of major edits to it since uh
[58:48] (3528.72s)
those companies uh had their changes
[58:51] (3531.96s)
then and have continued to iterate on
[58:53] (3533.92s)
their tooling for like the past decade
[58:55] (3535.76s)
and so when we say their tooling is like
[58:57] (3537.40s)
a decade more advanced that's a quite
[58:59] (3539.36s)
literal statement actually that's not a
[59:00] (3540.96s)
figurative one and so we realized that
[59:04] (3544.16s)
we were like hold on like there's
[59:07] (3547.00s)
missing tooling out here we tried a lot
[59:09] (3549.24s)
of the Alternatives as I said stacking
[59:11] (3551.28s)
was sort of that first pain point and so we
[59:12] (3552.92s)
tried a lot of the sort of like open
[59:14] (3554.96s)
source CLI first Alternatives realized
[59:17] (3557.08s)
it was like an
[59:19] (3559.68s)
approximation of what we wanted but it
[59:21] (3561.24s)
wasn't quite it um and then we built it
[59:24] (3564.76s)
uh and then what happened actually was
[59:26] (3566.60s)
uh a lot of our old co-workers many of
[59:28] (3568.08s)
whom had gone on to startups ended up
[59:30] (3570.08s)
asking us like hey like code review out
[59:32] (3572.52s)
here is is different like how do you all
[59:34] (3574.68s)
deal with it and we're like we we built
[59:36] (3576.16s)
that um let us know if you ever want us
[59:37] (3577.92s)
to tell you how to build it and they're
[59:39] (3579.44s)
like I I don't but like uh can I just
[59:42] (3582.92s)
use your thing and after we heard that
[59:44] (3584.64s)
for a few months we ended up pivoting
[59:46] (3586.52s)
back in I guess this would have been
[59:48] (3588.52s)
November of 2021 okay so quick pivot in
[59:52] (3592.28s)
the beginning so basically did you
[59:54] (3594.96s)
just feel a pull like okay like this is
[59:56] (3596.80s)
a good idea seems like there's a demand for
[59:59] (3599.16s)
it uh I don't think it was uh the the
[60:04] (3604.16s)
other company idea was working quite well
[60:06] (3606.24s)
I think what we felt was we had all of
[60:08] (3608.48s)
our friends asking us for this thing
[60:10] (3610.88s)
that we had built and it wasn't the
[60:12] (3612.68s)
thing that we were selling and you can
[60:14] (3614.92s)
only listen to that for so long before
[60:16] (3616.68s)
you decide to say maybe maybe we got
[60:19] (3619.24s)
this one wrong maybe this is the
[60:21] (3621.76s)
thing it's it's just fascinating I guess
[60:24] (3624.12s)
it's hard to appreciate if if you're not
[60:26] (3626.92s)
working at a startup cuz you know my my
[60:28] (3628.76s)
imagination of joining a startup I I
[60:30] (3630.68s)
have friends and my brother was working
[60:32] (3632.16s)
at a startup but I didn't join a super
[60:34] (3634.00s)
early stage startup ever is you would
[60:35] (3635.76s)
imagine you're joining this is what
[60:36] (3636.96s)
we're doing this is what we're going to
[60:38] (3638.04s)
do but you know sounds like you need to
[60:39] (3639.96s)
be flexible enough because you might do
[60:42] (3642.32s)
something different that actually works
[60:43] (3643.88s)
and there's a bigger demand for it yeah
[60:46] (3646.60s)
we were fortunate enough that at the
[60:47] (3647.76s)
time that we pivoted we were just six
[60:49] (3649.12s)
people and so it was six people and uh
[60:52] (3652.20s)
three of us had felt the pain from
[60:53] (3653.64s)
Facebook so it was us telling people
[60:55] (3655.48s)
like no no no we've heard it like our
[60:57] (3657.64s)
our co-workers have heard it we've gotten
[60:59] (3659.08s)
other people at the company to buy into
[61:00] (3660.48s)
this workflow at that point and yeah it
[61:03] (3663.32s)
was certainly uh stressful but it was
[61:06] (3666.16s)
also uh it was also a thing we just had
[61:08] (3668.40s)
enough conviction of we're like this is
[61:09] (3669.88s)
the way the world will build software
[61:11] (3671.44s)
it's more a question of who's going to
[61:12] (3672.56s)
bring it into the world not if so we you
[61:16] (3676.00s)
previously mentioned engineering metrics
[61:17] (3677.80s)
what are engineering metrics you see these
[61:20] (3680.00s)
fast moving companies measure and
[61:21] (3681.84s)
actually care about and and want to
[61:24] (3684.24s)
improve so I'll start with with uh I'll
[61:26] (3686.48s)
start with a statement that all
[61:27] (3687.80s)
engineering metrics are a proxy right so
[61:30] (3690.00s)
I think that measuring engineering
[61:32] (3692.32s)
velocity is an art not a science and I
[61:34] (3694.92s)
think that the value of engineering
[61:36] (3696.16s)
metrics is to uh create some holistic
[61:39] (3699.24s)
metrics so that you can see uh major
[61:41] (3701.36s)
outliers and be like what's going on
[61:43] (3703.28s)
here um the kinds of the kinds of
[61:46] (3706.96s)
metrics which I see most people measure
[61:49] (3709.12s)
are number of pull requests which is just a
[61:51] (3711.44s)
proxy for sort of like how is velocity
[61:53] (3713.52s)
doing uh time to merge of pull requests
[61:57] (3717.16s)
and then my favorite metric I've
[61:58] (3718.48s)
actually ever seen came out of uber um
[62:00] (3720.96s)
and so maybe you might have more context
[62:02] (3722.52s)
on this than I but it was basically uh
[62:05] (3725.20s)
time that a pull request is just spent
[62:06] (3726.68s)
waiting in review with no clear action
[62:09] (3729.36s)
and so the idea was like if an author
[62:11] (3731.76s)
puts up a pull request and it's just
[62:12] (3732.84s)
waiting on review you take that then the
[62:15] (3735.08s)
reviewer goes ahead and requests changes um
[62:18] (3738.16s)
and like it's uh that you then ignore
[62:21] (3741.60s)
that time uh you then put it back up
[62:24] (3744.40s)
for review you start summing that time
[62:26] (3746.24s)
again and you end up with this composite
[62:27] (3747.92s)
of like this is the amount of time this
[62:29] (3749.76s)
pull request just sat waiting on review
[62:31] (3751.92s)
without clear next steps
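(A sketch of how that composite could be computed, not Uber's actual implementation; the event names and shape below are invented. The idea is to sum only the intervals where the ball is in the reviewers' court.)

```typescript
// Sketch of the "time spent waiting on review with no clear next step" metric
// described above. Not the real implementation; the event shape is invented.
type ReviewEvent = {
  at: Date;
  kind: "submitted" | "changes_requested" | "re_requested" | "approved";
};

function waitingOnReviewMs(events: ReviewEvent[]): number {
  let total = 0;
  let waitingSince: Date | null = null;
  for (const e of events) {
    if (e.kind === "submitted" || e.kind === "re_requested") {
      waitingSince = e.at; // clock starts: the author is blocked on reviewers
    } else if (waitingSince) {
      total += e.at.getTime() - waitingSince.getTime();
      waitingSince = null; // reviewer acted: clock stops while the author responds
    }
  }
  return total;
}

// Example: 4 hours waiting, author addresses feedback, then 20 more hours waiting.
console.log(waitingOnReviewMs([
  { at: new Date("2024-01-01T09:00:00Z"), kind: "submitted" },
  { at: new Date("2024-01-01T13:00:00Z"), kind: "changes_requested" },
  { at: new Date("2024-01-01T15:00:00Z"), kind: "re_requested" },
  { at: new Date("2024-01-02T11:00:00Z"), kind: "approved" },
]) / 3_600_000); // 24
```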
[62:34] (3754.32s)
and so this metric was introduced after I left Uber
[62:36] (3756.52s)
I I think I might have covered it in one
[62:37] (3757.92s)
of the the Deep di on how Uber did it
[62:40] (3760.64s)
but like this was a really big problem
[62:42] (3762.56s)
at Uber because we were distributed
[62:44] (3764.20s)
sites and like my team specifically had
[62:45] (3765.84s)
this problem where we would have to wait
[62:48] (3768.04s)
almost 24 hours for some things to
[62:50] (3770.52s)
happen in the pull request cuz you know we
[62:52] (3772.28s)
had strong code
[62:54] (3774.40s)
ownership and I think Uber still might
[62:56] (3776.76s)
have it to this day I'm not when I was
[62:58] (3778.32s)
there we still had it so the the mobile
[63:00] (3780.48s)
platform team would own the platform
[63:02] (3782.16s)
code in San Francisco and Amsterdam you
[63:04] (3784.72s)
know at last thing last last thing at
[63:07] (3787.08s)
night you would make a change and you
[63:08] (3788.56s)
would have to head off and it would go
[63:10] (3790.88s)
to them they would reject it because of
[63:12] (3792.72s)
whatever reason you come back in the
[63:14] (3794.00s)
morning you know it's now night time in
[63:15] (3795.96s)
San Francisco you fix it
[63:18] (3798.96s)
quickly and then you have to wait
[63:21] (3801.24s)
another like eight hours and then stay
[63:23] (3803.20s)
late so like I'm glad that they started to
[63:25] (3805.84s)
measure it because and this is one of
[63:27] (3807.32s)
the things where it's not really a big
[63:28] (3808.68s)
deal if you're a startup if you're in
[63:30] (3810.88s)
the same time zone if you're in the same
[63:32] (3812.32s)
location but and then you know we came
[63:34] (3814.64s)
up with creative workarounds like we we
[63:36] (3816.32s)
started to get like we onboarded someone
[63:37] (3817.84s)
to be an owner of Etc you can do a lot
[63:40] (3820.24s)
of things but it was really killing us
[63:42] (3822.20s)
like and I knew it because I I was the
[63:43] (3823.72s)
manager of the team and this was the
[63:44] (3824.92s)
number one complaint so we actually you
[63:47] (3827.12s)
know we we took it into our hands and we
[63:48] (3828.52s)
solved it but so many teams were doing
[63:50] (3830.32s)
this independently and then it kind of
[63:52] (3832.44s)
goes back to like you know like as as a
[63:54] (3834.44s)
dev I didn't really like to hear too much
[63:55] (3835.88s)
about strategy whatnot but there is a
[63:57] (3837.24s)
thing of like you know who owns what
[63:59] (3839.00s)
where how do you place your teams it's
[64:00] (3840.76s)
kind of easier when you have teams
[64:02] (3842.08s)
working on similar stuff close to each
[64:03] (3843.76s)
other in terms of time zone so so yeah
[64:06] (3846.60s)
and you know one other thing that Uber
[64:07] (3847.88s)
measured which I'm not sure how
[64:09] (3849.52s)
successful it was but it was an
[64:10] (3850.84s)
interesting thing is it they tried to
[64:12] (3852.12s)
measure Focus time on how much
[64:14] (3854.36s)
uninterrupted focus blocks devs had uh I
[64:18] (3858.76s)
I think it's really Noble it's just
[64:19] (3859.92s)
really hard to measure but it's true
[64:21] (3861.68s)
that when like I was always happier like
[64:23] (3863.64s)
developing when I actually had time it
[64:25] (3865.28s)
wasn't fragmented with like you know the
[64:27] (3867.60s)
random meetings here here or there
[64:30] (3870.24s)
Etc yeah totally I I mean maker schedule
[64:33] (3873.12s)
manager schedule is super real I I think
[64:35] (3875.52s)
that as we were going back to the before
[64:37] (3877.48s)
though these problems which used to be
[64:39] (3879.16s)
more isolated to big companies I think
[64:41] (3881.12s)
as we start to see a higher volume of
[64:42] (3882.60s)
pull requests and more of the time is
[64:43] (3883.92s)
spent sort of like in code review not
[64:46] (3886.04s)
authoring pull requests uh they're going
[64:48] (3888.36s)
to become everyone's problems and I
[64:50] (3890.48s)
think the question is sort of like how
[64:51] (3891.68s)
ready are companies for that
[64:53] (3893.40s)
change yeah it
[64:56] (3896.24s)
I like this you know thinking because
[64:59] (3899.28s)
this this will happen so it's
[65:00] (3900.72s)
interesting how small companies might
[65:03] (3903.88s)
need to take some of the lessons from
[65:05] (3905.68s)
the the larger companies on how they
[65:07] (3907.20s)
solve that and prepare for it because
[65:08] (3908.88s)
it's it's going to come like I'm I'm
[65:10] (3910.04s)
kind of tempted to agree with you on
[65:11] (3911.60s)
this one it it'll be fascinating to see
[65:13] (3913.84s)
It'll be also fascinating to see if
[65:15] (3915.20s)
small companies speed up a lot more than
[65:16] (3916.96s)
let's say big companies where there's
[65:18] (3918.24s)
larger teams and and there's now more
[65:20] (3920.20s)
dependencies it'll be interesting one
[65:22] (3922.08s)
for us to I think small companies are
[65:24] (3924.20s)
certainly in a better uh position to
[65:25] (3925.80s)
embrace the change where I think big
[65:27] (3927.36s)
companies will be slow I think small
[65:28] (3928.92s)
companies we've already seen them adopt
[65:30] (3930.20s)
a lot of AI tools faster um I think some
[65:32] (3932.80s)
of the like most advanced big companies
[65:34] (3934.60s)
are starting are like just trying to
[65:36] (3936.84s)
adopt them at the same pace of these
[65:38] (3938.36s)
small companies but they're definitely
[65:39] (3939.84s)
not trying to outperform them uh and I
[65:42] (3942.96s)
think we've seen this actually in quite
[65:44] (3944.20s)
in quite a few areas right like every
[65:46] (3946.08s)
time that you have a new technology that
[65:47] (3947.52s)
makes something uh more accessible
[65:50] (3950.24s)
easier the same way I think AI is making
[65:51] (3951.88s)
coding more accessible easier I think
[65:53] (3953.84s)
what ends up happening is that like
[65:55] (3955.72s)
practices that were previously only
[65:57] (3957.68s)
meant for like the the tippity top of
[65:59] (3959.60s)
the market start to distill their way
[66:01] (3961.56s)
down to
[66:02] (3962.72s)
everyone so you you you're a co-founder
[66:06] (3966.24s)
of a dev tools company and I think this
[66:07] (3967.84s)
is a little bit of a dream of a lot of
[66:10] (3970.12s)
Engineers as you see this this pattern
[66:12] (3972.44s)
pretty much is you you work you join a
[66:14] (3974.12s)
company you work on a product if it's a
[66:16] (3976.08s)
big enough company there's a platform
[66:17] (3977.36s)
team a lot of Engineers gravitate to
[66:19] (3979.12s)
there because you can solve pretty cool
[66:20] (3980.40s)
problems in your case you actually did
[66:22] (3982.28s)
the you were on a dev tools team and
[66:24] (3984.20s)
then uh you
[66:26] (3986.12s)
you you find the courage or find the
[66:27] (3987.84s)
right people to actually start a startup
[66:29] (3989.88s)
and what better thing to do as a dev than
[66:32] (3992.60s)
what you know you you already know how
[66:34] (3994.44s)
to do this is built for other developers
[66:37] (3997.08s)
what is something that you've you've
[66:39] (3999.32s)
kind of learned the the hard way of
[66:41] (4001.28s)
starting a dev tools company because I I
[66:43] (4003.40s)
assume it's it's a bit trickier than
[66:46] (4006.36s)
than what you might what was different
[66:47] (4007.64s)
than than when you started out because
[66:49] (4009.04s)
it sounds like the dream right you kind
[66:50] (4010.40s)
of went out you actually managed to
[66:51] (4011.80s)
raise some funding and now you're going
[66:52] (4012.84s)
to build these cool tools for developers
[66:54] (4014.72s)
and you're a developer is great you know
[66:56] (4016.12s)
your customer is
[66:58] (4018.60s)
wonderful um I mean a ton of things uh I
[67:02] (4022.80s)
think that building a company can be
[67:04] (4024.12s)
very rewarding in a lot of ways I think
[67:05] (4025.88s)
there are a lot of challenges I think
[67:07] (4027.08s)
some of the things that you learn to
[67:08] (4028.44s)
deal with uh some of the things I think
[67:10] (4030.44s)
we've learned to deal with here are sort
[67:11] (4031.96s)
of the uh diversity of engineering
[67:14] (4034.36s)
workflows across even across uh Silicon
[67:17] (4037.32s)
Valley and non Silicon Valley style
[67:18] (4038.68s)
companies right of like what does code
[67:20] (4040.56s)
review mean to them what are they trying
[67:21] (4041.88s)
to get out of it what's the purpose of
[67:23] (4043.32s)
it and finding the uh
[67:26] (4046.80s)
the Middle Road to blend all of those uh
[67:30] (4050.64s)
to blend all of those uh I don't know
[67:33] (4053.52s)
needs desires Etc uh is difficult it's
[67:35] (4055.76s)
challenging it's something we build um I
[67:37] (4057.76s)
think one of the things that has most
[67:39] (4059.28s)
surprised me coming out of Facebook uh
[67:41] (4061.80s)
is is what we talked around earlier is
[67:43] (4063.68s)
is the tooling is the I think working
[67:46] (4066.40s)
within a company you probably have that
[67:48] (4068.00s)
team that is building really awesome
[67:49] (4069.60s)
tooling you know and like they really
[67:51] (4071.56s)
benefit from that when you come out here
[67:53] (4073.76s)
into the wild as I call it a lot less of
[67:55] (4075.96s)
that exists and you find yourself being
[67:58] (4078.04s)
like okay how can I set that up how can
[68:00] (4080.28s)
I get it back how can I get back to that
[68:02] (4082.44s)
um a company which I uh I'll give a
[68:05] (4085.32s)
shout out to is Statsig um so Facebook
[68:07] (4087.60s)
had really wonderful sort of like um had
[68:10] (4090.68s)
really wonderful uh experimentation
[68:12] (4092.56s)
tooling feature flagging tooling and
[68:14] (4094.24s)
when I left immediately not immediately
[68:16] (4096.32s)
but like 6 months in I was like okay
[68:17] (4097.76s)
cool like we probably have our first
[68:19] (4099.00s)
like feature flag like how do we go
[68:20] (4100.60s)
ahead and turn this on and I think you
[68:22] (4102.44s)
just mentioned your at Uber you had
[68:24] (4104.16s)
wonderful tooling there too
[68:26] (4106.04s)
uh there's all of this stuff that's set
[68:27] (4107.76s)
up for you that you just take for
[68:29] (4109.24s)
granted and once you come out and have
[68:31] (4111.36s)
to build it you have to slowly start to be
[68:33] (4113.16s)
like okay how do we do this from
[68:35] (4115.44s)
scratch again you know or who can do
[68:37] (4117.28s)
this for me awesome so with that let's
[68:40] (4120.04s)
wrap up with some rapid questions I'll
[68:41] (4121.52s)
ask a question and you just shoot
[68:42] (4122.72s)
whatever comes to your mind cool so
[68:44] (4124.96s)
first what is a tool that you saw inside
[68:48] (4128.40s)
Facebook that is now outside Facebook as
[68:50] (4130.72s)
well and that can be used and you think
[68:52] (4132.40s)
it's pretty cool I think Statsig is
[68:54] (4134.64s)
still the answer I'm going to give here
[68:56] (4136.12s)
it's just very cool having the
[68:57] (4137.72s)
feature flag system I'm used to and
[68:59] (4139.28s)
it's actually an amalgamation of a
[69:00] (4140.40s)
handful of different Facebook tools
[69:02] (4142.20s)
our feature flagging system Gatekeeper
[69:03] (4143.80s)
and then our experimentation tool
[69:05] (4145.52s)
Deltoid and so it's cool to see both of
[69:07] (4147.56s)
those and QE2 I guess it's kind of cool
[69:10] (4150.20s)
to see that sort of reborn
[69:12] (4152.48s)
outside of Facebook
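To make the Gatekeeper-style gating pattern concrete, here is a minimal TypeScript sketch of a percentage rollout: a named gate checked at the call site, with users bucketed by a stable hash so the same user always gets the same answer. The names here (FeatureGates, isEnabled, the stacked_pr_view gate) are hypothetical illustrations, not Statsig's, Gatekeeper's, or Deltoid's actual APIs.

```typescript
// Minimal sketch of a percentage-based feature gate; all names are hypothetical.
type GateUser = { id: string; email?: string };

class FeatureGates {
  // gate name -> predicate deciding whether the gate passes for a given user
  private gates = new Map<string, (user: GateUser) => boolean>();

  // Roll a gate out to `percent` of users, bucketed by a stable hash of the
  // user id so the same user always gets the same answer across requests.
  rollout(gateName: string, percent: number): void {
    this.gates.set(gateName, (user) => this.bucket(user.id, gateName) < percent);
  }

  isEnabled(gateName: string, user: GateUser): boolean {
    const predicate = this.gates.get(gateName);
    return predicate ? predicate(user) : false; // unknown gates default to off
  }

  // Stable pseudo-random bucket in [0, 100) derived from userId + gateName.
  private bucket(userId: string, gateName: string): number {
    const key = userId + ":" + gateName;
    let hash = 0;
    for (let i = 0; i < key.length; i++) {
      hash = (hash * 31 + key.charCodeAt(i)) >>> 0;
    }
    return hash % 100;
  }
}

// Usage: turn a hypothetical gate on for 10% of users, check it at the call site.
const gates = new FeatureGates();
gates.rollout("stacked_pr_view", 10);

const user: GateUser = { id: "user-42" };
if (gates.isEnabled("stacked_pr_view", user)) {
  console.log("new code path");
} else {
  console.log("existing behavior");
}
```

Production systems layer targeting rules, holdouts, and experiment analysis on top of this same basic check, which is roughly what tools like Gatekeeper and Deltoid combined provide.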
[69:14] (4154.36s)
that's awesome and this is the beauty of
[69:15] (4155.88s)
it I think you know there was
[69:17] (4157.68s)
this period of several years where dev tools
[69:20] (4160.56s)
startups got funded because I feel like a lot
[69:22] (4162.12s)
of the things that were hidden inside
[69:23] (4163.60s)
these companies are now outside as well
[69:26] (4166.96s)
totally um I think also
[69:28] (4168.80s)
having worked on dev
[69:30] (4170.28s)
tools within Facebook and outside of
[69:31] (4171.40s)
Facebook one of the things
[69:32] (4172.40s)
that's so funny to me is that in
[69:34] (4174.24s)
Facebook building Dev tools is a lot
[69:35] (4175.52s)
easier you don't need to worry about
[69:36] (4176.64s)
things like marketing you don't need to
[69:38] (4178.00s)
worry about things like hiring and I
[69:39] (4179.76s)
think it takes a lot of uh courage and
[69:42] (4182.24s)
it's a very difficult path to build a
[69:43] (4183.76s)
tool outside of one of these
[69:45] (4185.48s)
large companies and say no I'm going to
[69:46] (4186.84s)
figure out how to make this work for the
[69:48] (4188.08s)
market because again the engineering
[69:49] (4189.92s)
cultures are different the constraints
[69:51] (4191.76s)
are different the other tools
[69:54] (4194.40s)
you're integrating with are different
[69:55] (4195.76s)
and so you have to start to rebuild all
[69:57] (4197.28s)
that and so what is your favorite
[70:00] (4200.24s)
programming language always an
[70:01] (4201.96s)
interesting question by far I love TypeScript I
[70:04] (4204.72s)
think TypeScript is one of the most
[70:06] (4206.28s)
beautiful languages and you're talking
[70:07] (4207.80s)
to a person that learned JavaScript when
[70:10] (4210.04s)
he was like 10 um and the type
[70:12] (4212.96s)
system TypeScript puts on top of JavaScript I think
[70:14] (4214.88s)
is really clean and does a great job
[70:17] (4217.76s)
with the constraints they have it was
[70:19] (4219.92s)
I remember when TypeScript was created
[70:21] (4221.68s)
and I remember there was a bit of
[70:23] (4223.00s)
skepticism around (a) it was Microsoft and (b)
[70:25] (4225.72s)
do we even need this because you know
[70:27] (4227.28s)
the dynamic typing system is kind of
[70:29] (4229.08s)
like this wonderful thing that allows
[70:31] (4231.04s)
you to do so many crazy things and yeah
[70:34] (4234.08s)
like I think everyone has turned well not
[70:36] (4236.04s)
everyone but like most devs have
[70:38] (4238.60s)
slowly but surely seen the benefits of
[70:41] (4241.80s)
what we get with
[70:43] (4243.16s)
typing and with a nice language my
[70:46] (4246.16s)
controversial opinion is I'm glad
[70:47] (4247.76s)
Microsoft was the one to do it because I
[70:49] (4249.04s)
think that Microsoft has some really
[70:50] (4250.44s)
wonderful compiler knowledge locked
[70:52] (4252.24s)
inside of it um and I similarly had that
[70:55] (4255.48s)
skepticism I remember at the time thinking like
[70:56] (4256.76s)
we have Flow what's the purpose of
[70:58] (4258.92s)
this um and I think that the formal
[71:03] (4263.28s)
grammars and proofs that they've done
[71:04] (4264.68s)
around it are really really excellent
[71:06] (4266.88s)
like you can go really deep on group
[71:08] (4268.88s)
theory or category theory just by
[71:11] (4271.28s)
looking at TypeScript um yeah
[71:15] (4275.32s)
I frequently look at other untyped
[71:16] (4276.64s)
languages I'm looking at Ruby and I'm
[71:18] (4278.08s)
looking at Python and I don't think
[71:20] (4280.12s)
they've gotten close to what TypeScript
[71:21] (4281.48s)
has done to JavaScript yet
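As a concrete illustration of what a type system adds on top of plain JavaScript, here is a small TypeScript sketch using a discriminated union: the compiler narrows each branch and refuses to compile if a new variant is added but not handled. The ReviewEvent shape is hypothetical, invented for this example.

```typescript
// A discriminated union over code-review events; the "kind" field is the discriminant.
type ReviewEvent =
  | { kind: "comment"; author: string; body: string }
  | { kind: "approval"; author: string }
  | { kind: "request_changes"; author: string; reason: string };

function describe(event: ReviewEvent): string {
  switch (event.kind) {
    case "comment":
      // event is narrowed here, so .body is known to exist
      return `${event.author} commented: ${event.body}`;
    case "approval":
      return `${event.author} approved`;
    case "request_changes":
      return `${event.author} requested changes: ${event.reason}`;
    default: {
      // If a new variant is added to ReviewEvent and not handled above,
      // this assignment no longer type-checks and the build fails.
      const unreachable: never = event;
      return unreachable;
    }
  }
}

console.log(describe({ kind: "approval", author: "thomas" }));
```

In plain JavaScript the forgotten case would only surface at runtime; here the `never` assignment turns it into a compile-time error.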
[71:23] (4283.88s)
and then finally what's a book that you
[71:26] (4286.60s)
read and would recommend fiction or
[71:28] (4288.48s)
non-fiction one or two books so I am an
[71:31] (4291.56s)
avid mystery reader so the mystery book
[71:33] (4293.36s)
I always recommend people is The Last
[71:34] (4294.92s)
Days of Night it's a book around
[71:37] (4297.60s)
Tesla Edison and Westinghouse and the
[71:39] (4299.60s)
race to invent the light bulb it's
[71:41] (4301.40s)
written as historical fiction I think
[71:42] (4302.96s)
it's phenomenal as someone who currently
[71:45] (4305.20s)
works in technology in New York there
[71:47] (4307.12s)
isn't a better book I can recommend to
[71:48] (4308.60s)
anyone in that position um non-fiction
[71:52] (4312.04s)
um I read a lot of books these days I've
[71:53] (4313.80s)
been actually reading a lot more product
[71:55] (4315.56s)
books um The Timeless Way of Building is
[71:58] (4318.44s)
a really interesting book about
[71:59] (4319.64s)
architecture the beginning is um a
[72:02] (4322.80s)
little abstract to get through but once
[72:05] (4325.00s)
you get to the sort of latter half it
[72:07] (4327.08s)
talks around the importance of uh having
[72:10] (4330.08s)
the people who ultimately use uh tools
[72:12] (4332.44s)
or buildings be the ones who build those
[72:14] (4334.88s)
tools or buildings uh because that's the
[72:17] (4337.36s)
only way that you create a self-
[72:18] (4338.84s)
sustaining cycle I guess it kind of
[72:21] (4341.00s)
resonates with developers and developer
[72:23] (4343.48s)
tools yeah which seems kind of
[72:25] (4345.60s)
obvious cuz who else would program the
[72:27] (4347.28s)
tools that we use than developers it's
[72:29] (4349.08s)
one of these weird things but yeah
[72:32] (4352.04s)
there's an M.C. Escher painting to be made
[72:33] (4353.60s)
about that well this was
[72:36] (4356.00s)
really interesting I'm glad we were able
[72:37] (4357.84s)
to go a bit deeper into stacking than we
[72:40] (4360.28s)
previously touched on in that
[72:42] (4362.52s)
article as well and thank you for your
[72:44] (4364.40s)
time cool of course thank you this was
[72:47] (4367.08s)
very fun thanks a lot to Thomas
[72:48] (4368.64s)
for all the details on Meta's tooling
[72:50] (4370.12s)
stacked diffs and on tooling trends you can
[72:53] (4373.12s)
find Thomas on social media as listed in
[72:55] (4375.40s)
the show notes below for more deep dives
[72:57] (4377.64s)
on tooling at Meta and more details on
[72:59] (4379.40s)
stacked diffs check out The Pragmatic
[73:00] (4380.76s)
Engineer deep dives linked in the show
[73:02] (4382.52s)
notes below if you enjoy the podcast
[73:05] (4385.12s)
please do subscribe on your favorite
[73:06] (4386.48s)
podcast platform and on YouTube this
[73:08] (4388.76s)
helps more people discover the podcast
[73:10] (4390.88s)
special thank you if you leave a
[73:12] (4392.00s)
rating thanks and see you in the next