Episode Transcript
0:00
This podcast is supported by the New
0:02
York Stock Exchange. In the NYSE
0:04
video series, Future in Five, the
0:06
NYSE asked Magi Richani from Nobell Foods
0:09
her prediction for the future. One prediction
0:11
that I have about this industry is that advancements
0:14
in biology and plant genetics are really
0:16
going to allow us to produce more food
0:19
in a much more sustainable way. The NYSE
0:21
connects companies to the capital they need
0:23
to change the way we feed our families.
0:26
Kevin, it's very late here. Where are you going?
0:30
Well, I just wanted to see what you had planned for the weekend.
0:32
No, we are doing an emergency
0:34
podcast because Casey, some
0:37
very big news just happened. That's right. My
0:39
red Hard Fork phone that I keep next to
0:41
my bed is ringing and it says there's
0:43
big news. Today, just a few hours ago
0:46
in a surprise move, OpenAI
0:48
CEO Sam Altman was pushed out
0:51
of the company. He was effectively fired by the
0:53
company's board. This move
0:55
was a total shock to everyone
0:57
in the tech industry and was especially shocking
0:59
to us
1:00
because on Wednesday of this week, you
1:02
and I both sat down for a very
1:04
long, very wide-ranging interview
1:06
with none other than Sam Altman.
1:08
We did. And looking back, I wish I just asked him,
1:10
hey, do you think you're about to get fired? Yeah,
1:14
that was not one of the questions we asked him shockingly.
1:17
And to be frank, like it didn't seem
1:19
like he had any idea this was coming.
1:21
We were talking to him on what was essentially
1:23
the one-year anniversary of ChatGPT. We
1:26
asked him a lot of questions about OpenAI
1:28
and ChatGPT and the future he saw
1:31
for artificial intelligence. And at
1:33
no point during our interview did he indicate
1:35
that anything was amiss or that anything
1:38
like this was about to happen. So I have to wonder
1:40
whether he knew or whether this was a total shock
1:42
to him, too. Yeah. Well, like you said,
1:44
he definitely did not betray
1:47
any notion that this was about
1:49
to happen when we talked to him. So
1:51
today in this emergency podcast, we are going
1:53
to talk about this news, what happened
1:56
and what it means. We are not going
1:58
to share our interview
1:59
that we recorded with Sam Altman just yet. Candidly,
2:02
we're still trying to figure out what to do about it. But
2:04
today I just thought we should hop on the mics and
2:06
talk through what I think is probably the
2:09
most shocking tech story of the year
2:11
and maybe in several years. You know what
2:13
I would say, Kevin? You could train an entire large
2:15
language model just based on the text messages
2:17
that have been sent about Sam being fired today. It's
2:20
true. It's true. After
2:23
this news came out, my phone melted and
2:27
I knew it was one of those news days where my
2:30
phone's gonna run out of battery by noon. And
2:32
it did. I have charged
2:35
my phone twice today because it is just one of those
2:37
days. So let's just get the timeline
2:39
down because this all happened very, very
2:42
quickly and without much warning.
2:44
So on Friday, OpenAI
2:47
announced this firing in a
2:49
blog post. They said that Sam Altman
2:52
has been pushed out effective immediately
2:55
and that Mira Murati, the company's
2:58
chief technology officer, will be taking over
3:00
as CEO on an interim basis. They
3:03
did not give the exact reasons
3:05
for the firing, but
3:08
they said that his departure, quote, "follows
3:10
a deliberative review process by the board,
3:13
which concluded that he was not consistently
3:15
candid in his communications with the board,
3:18
hindering its ability to exercise its
3:20
responsibilities. The board no
3:22
longer has confidence in his ability to
3:24
continue leading OpenAI."
3:26
also said that Greg Brockman, who was the
3:28
president and co-founder of OpenAI, as
3:31
well as the chairman of its board, was
3:33
being pushed off the board. And
3:35
they sort of left it at that. Sam Altman
3:38
then shared what I would describe as a vague
3:40
post on X after the
3:42
announcement. He said, "I loved my
3:44
time at OpenAI. It was transformative
3:47
for me personally, and hopefully the world a little
3:49
bit. Most of all, I loved working with
3:51
such talented people. We'll have more to say
3:53
about what's next later." So,
3:55
you know, very sort of vague, no
3:58
details about what happened or how
4:00
he felt about it. And then just
4:03
a couple hours after that, Greg Brockman
4:06
announced to the employees of OpenAI
4:08
that he was quitting in some
4:12
sort of solidarity or what many people interpreted
4:14
as solidarity with Sam. And
4:16
then on Friday night, Greg Brockman released a
4:18
statement saying that he and Sam were both,
4:21
quote, shocked and saddened by what the board
4:23
did today. He gave a little rundown
4:25
of his side of the story, basically said that both
4:27
he and Sam were blindsided by these texts
4:30
from Ilya Sutskever, the
4:32
chief scientist of OpenAI, who
4:35
told them both sort of minutes before this
4:37
news went out that they were being removed
4:40
from their roles. And
4:42
he claimed that he was still trying to figure out exactly what
4:44
happened, but that he had been taken by
4:46
total surprise on this. So within
4:49
the span of several hours, OpenAI's
4:51
board has pushed out Sam Altman, the CEO,
4:54
Greg Brockman, the president, and
4:56
just left everyone at the company reeling
4:59
and wondering what happened. That's
5:01
right. And based on our reporting,
5:04
Kevin, we know that none of the rank and
5:06
file at OpenAI appeared
5:08
to know this before it was announced
5:10
to the public. So this sort of came as
5:12
a shock to them. Ina Fried at Axios
5:15
reported that Microsoft apparently got one
5:17
minute's notice before this
5:20
news was shared. Microsoft, of course, is one
5:22
of their biggest partners. That's right. And
5:25
I can tell you based on my reporting that last
5:27
week, Sam and Satya Nadella, the CEO of
5:29
Microsoft, were spending a lot of time together and
5:31
Satya did not get any indication, again,
5:34
whatsoever that anything was up and that this
5:36
might happen. Microsoft's stock
5:38
dropped after this news was shared, as
5:40
you might imagine, as people now worry about what this means
5:43
for the future of their AI efforts.
5:45
Nadella put out a statement later in the day, essentially
5:48
affirming his commitment to OpenAI,
5:50
saying they have a long-term deal in place and expressing
5:53
his confidence in Mira to lead the company. Yeah,
5:55
so Casey, what are you hearing from your
5:58
reporting? I assume you've been calling around all
6:00
day trying to figure out what happened here. Well, I have
6:02
to tell you, Kevin, I wish I'd done a better job reporting
6:05
because despite sending what I would
6:07
estimate to be 40,000 text messages,
6:09
I still don't have the answers that I want,
6:11
you know? What I don't want to do is to
6:13
speculate irresponsibly, but what I can tell
6:16
you based on having seen other CEOs
6:18
get fired in other circumstances is
6:21
that Greg Brockman quitting,
6:23
I think, tells us a lot about
6:25
what is going on here. What do you mean? When
6:28
CEOs resign due to personal scandals,
6:30
they do not typically have a bunch of people lining
6:33
up with them to follow them out the door, right?
6:36
And so when you have a co-founder of this company saying,
6:38
based on the fact that Sam was fired, I
6:41
am quitting this company, that is a signal
6:43
to me that the dispute that is at
6:45
the center of whatever this is, is
6:47
related to OpenAI, right? Now
6:50
we don't know if it's related to a business
6:52
decision, a product decision, an AI
6:55
safety decision. I can tell you all
6:57
of those rumors are swirling inside
7:00
the company and out, but
7:02
we don't know. I can also say that
7:05
the number of people who know the
7:08
real reason behind this right now
7:11
is extremely limited. I have been talking
7:13
to a number of people very close
7:15
to this situation who I would think would
7:18
know, and they're coming back and they're saying,
7:20
we are hearing crickets. So this
7:22
information was very, very tightly
7:24
held. I do think it will come out eventually
7:27
and maybe soon, but as of recording time,
7:29
we just don't have it. Yeah, I expect that
7:32
every tech reporter in America is
7:34
chasing this story right now. I've also been making
7:36
calls and texting people all day trying to
7:38
figure out what the heck happened here. And
7:41
I think you're right. I mean, this came as a total
7:43
surprise to almost
7:45
everyone at OpenAI. They learned about
7:47
it the same time the rest of the world did. I
7:51
heard that there was an all-hands meeting
7:54
in which many employees were
7:56
sort of shocked and dismayed.
7:58
Altman was a very
8:00
popular CEO among the
8:02
rank and file at OpenAI. So this was
8:04
not a situation where people kind of
8:07
had an easy explanation ready to go. Oh,
8:09
this was clearly this thing that
8:11
they could point to. They seem totally
8:14
baffled and totally shocked by this news.
8:17
And I heard it described as a hostile
8:19
takeover or a coup. Those
8:22
are the kinds of words that people inside
8:24
and close to OpenAI have been using to describe
8:27
what happened today. I think
8:29
it is important to underscore what you said,
8:31
Kevin, which is that Sam Altman was a very
8:33
popular leader at OpenAI. He's
8:36
a very popular leader in Silicon Valley
8:38
in general. Of course, before OpenAI, he
8:41
ran Y Combinator, the incubator
8:44
that is responsible for companies, including
8:46
Airbnb, Stripe, Dropbox,
8:48
a lot of other famous companies. So Sam
8:51
Altman is one of the best connected people in Silicon
8:53
Valley. And after the news broke, I just saw
8:56
boldface name after boldface name leaping
8:59
onto X to express their support for him. Brian
9:01
Chesky from Airbnb, Eric Schmidt,
9:03
the former CEO of Google and many,
9:05
many more. All of them were essentially saying,
9:08
we stand with Sam.
9:11
So, you know, it shows you the amount of support
9:13
that he has. And, you know, I have to
9:15
say, I think it raises real questions
9:18
about the future of OpenAI, because
9:21
part of being a great leader at OpenAI for
9:23
Sam meant being a great recruiter, which is one
9:25
of the real things that he did. There's
9:28
a limited amount of top tier talent
9:30
in AI world, right? And Sam, because
9:33
he is so well known from running Y Combinator, was
9:35
able to go out and get a lot of the best people
9:37
to come in and work at that company. So there were a lot of people
9:39
who went to work today because they wanted to work for Sam Altman. And
9:42
on Monday, they're going to go into work working
9:44
for someone else. And so I think the question is, are
9:47
they so tied to that mission that they are
9:49
going to stick with the company when inevitably
9:51
it comes out, whatever this dispute was? Yeah,
9:53
it's totally baffling to me. Something
9:56
just isn't adding up. This is a
9:58
very unusual circumstance. You
10:00
do not see boards firing
10:03
CEOs, very popular CEOs
10:05
of very popular companies, particularly
10:07
in this way. Usually if a CEO
10:10
is going to be pushed out, the board will sort
10:12
of do the nicety of saying,
10:15
this person is stepping aside to spend more time
10:17
with their families, sort of the cliche, or
10:19
to work on other projects. The
10:22
board of OpenAI really did
10:25
throw Sam under the bus in this statement, which
10:27
makes me think that whatever happened, it must have
10:29
been bigger than just a simple
10:32
disagreement. This, I actually
10:34
think, is a good point to talk about
10:36
the mess of OpenAI in general. And
10:39
we have talked about it on the show before. There's
10:41
basically no company that we talk about that has
10:43
a more complicated structure than this company.
10:46
There is a nonprofit board
10:49
that operates a for-profit
10:51
company. On that nonprofit
10:54
board, which voted to fire Sam
10:56
today, most of the members do not own
10:58
shares of the company. Sam
11:01
himself did not directly own
11:03
stock in the company. And so you
11:06
just have this weird situation where you have a for-profit company
11:08
owned by a nonprofit. That is just weird.
11:11
And if you were to say, does that scenario
11:13
make it likelier that we would see something like we saw today,
11:16
the answer is, yes, it actually does. Okay?
11:19
So I think that's part of what's going on here. The
11:21
second thing I would say is that OpenAI
11:24
has just been a messy company always, right?
11:27
It was famously co-founded by Elon Musk. Elon
11:29
Musk had a massive falling out with those
11:31
folks and wound up walking away.
11:34
A bunch of other co-founders at the company had
11:36
a big falling out with Sam and the
11:38
other folks there. And they went to start Anthropic,
11:41
whose CEO, Dario Amodei, we have had on the show.
11:43
He also was at OpenAI
11:45
before then. So in
11:47
the AI world, there are a lot
11:49
of disputes, and they end with people
11:52
slamming doors and often going to start
11:54
their own AI companies. Yeah, we should talk
11:56
just for a second about the board for one more bit,
11:59
because there's something you said that I want
12:01
to draw out, which is that the board
12:03
members of this company do not,
12:05
for the most part, own shares
12:08
in the company. That is unusual
12:10
for company boards. Usually
12:12
you'd have, like, a venture capitalist on there,
12:15
or, you know, maybe some early
12:18
executives, or just, you know, business
12:20
people that the CEO knows and is friends
12:22
with. This board, it's a
12:25
small board, first of all. Before
12:27
today it consisted of Greg Brockman,
12:30
Sam Altman, obviously, OpenAI's
12:32
chief scientist Ilya Sutskever, and
12:35
then there are these sort of independent,
12:38
non-employee directors, including
12:40
Adam D'Angelo, who's the CEO of Quora;
12:43
Tasha McCauley, who is a technology
12:46
entrepreneur; and Helen Toner,
12:48
who is a scholar at the Georgetown
12:51
Center for Security and Emerging Technology.
12:54
And of those board members, I think it's worth spending
12:56
a beat talking about the last two
12:59
that I mentioned, because this is sort
13:01
of one hint that a lot of people
13:03
I've been talking to today have been pointing me
13:05
toward to sort of try to read the tea
13:07
leaves about what happened during this transition
13:10
period. Which is that
13:12
Tasha McCauley and Helen Toner
13:14
are aligned with the
13:17
effective altruism movement. Now, we could talk
13:19
a lot more, and we have talked on the show, about the effective
13:21
altruism movement. It is behind
13:24
a lot of the AI safety research
13:26
that is going on in the AI industry, and
13:28
these are people who tend to be more worried about things
13:31
like existential risk from AI. Now,
13:34
we don't know who voted which
13:36
way on the board, but some folks that
13:38
I've been talking to today sort of said, well, it's
13:41
possible that this was some sort of, you know,
13:43
an EA-led coup against
13:45
Sam Altman for some reason having to do with
13:48
AI safety. But again, that is very vague
13:50
and not confirmed, and so we probably shouldn't speculate
13:53
about it too much. So, Casey, what do you think
13:55
this does mean for the future of
13:57
OpenAI? Like, can they survive this scandal
14:00
as they have survived other scandals in
14:02
the history of the organization? Or is
14:04
there something different about this? I mean,
14:06
sure. I think
14:09
the next 30 to 90 days
14:11
are critical for this company. Is
14:14
it the case that a bunch of other
14:16
top executives and engineers
14:19
at this company follow Sam and Greg
14:21
out the door and start another company
14:23
with them? If so, then yes, I think that's
14:25
a very serious challenge to OpenAI. I
14:28
think it also might be one of the best days Anthropic
14:30
has had in a while. It's almost certainly the best day Google
14:32
Bard has had in a while, okay, right? Because
14:34
these folks now have an opportunity maybe to
14:37
get ahead while their top competitor
14:39
is trying to pick up the pieces. The
14:42
second thing I would say is they still have a
14:44
lot of advantages. GPT-4 is
14:46
the best generally available large
14:49
language model. ChatGPT is
14:51
probably the most popular chatbot on the
14:53
market. It's almost certainly the one that's making the
14:55
most money. A lot of that is not going to stop
14:58
being true, because as fascinating as it is to us
15:00
who is running this company, to
15:03
most of its users, the question is just, hey, can I
15:05
sign up for ChatGPT Plus yet?
15:08
It's interesting, Kevin, when you and I sat down with Sam, it seemed
15:10
like the biggest problem in his life was that he had to pause signups
15:12
for ChatGPT Plus because they had
15:14
gotten too popular, right? So they were
15:16
in the middle of their kind of infrastructure and engineering
15:19
crisis before the leadership crisis
15:21
started. I don't
15:23
think OpenAI is going to collapse tomorrow,
15:26
but is this a very real risk
15:28
to the company? I do think it is. What do you think? Yeah,
15:32
something is just not adding up for me. It
15:35
is truly inconceivable
15:37
to me that the
15:39
board of OpenAI would push
15:42
out Sam and sort
15:44
of by extension Greg Brockman without
15:48
a really good reason for doing so. I mean, this is
15:50
a CEO who has generated
15:54
enormous value for OpenAI and has
15:56
for all of the flaws
16:00
that his critics might find
16:03
with him. He's been a very effective
16:05
leader in terms of positioning OpenAI
16:08
at the head of the pack when it comes to AI. I
16:10
don't know. I guess
16:13
we'll find out more in the coming days.
16:18
But it just truly does not square to
16:20
me that it could be something like
16:23
a disagreement about AI safety
16:25
or something, without it being something bigger. I mean,
16:28
this is also a board of people that Sam Altman,
16:30
for the most part, picked. So it's not as
16:32
if these are people who he had a history of
16:35
open conflict with. Something
16:38
happened here. I mean, look, something
16:41
did happen. And while
16:44
I cannot speculate, and I don't have any reporting,
16:46
I would just say that Semafor reported
16:48
today that Sam has been in the
16:50
middle of raising a new venture fund. He
16:53
has other side projects. He
16:55
co-founded Worldcoin. That is the
16:57
orb that scans your eyeballs and gives
16:59
you crypto. And
17:02
it might be that none of that has anything to do with what
17:04
happened today. But I would say that in general,
17:07
if you are a CEO, and you're
17:09
also a CEO of other companies, or you're building
17:11
other projects, or you're engaged
17:14
in a lot of other complicated deals
17:16
and investments, that
17:18
stuff does get messy, actually. And there
17:21
are disclosure rules. If you
17:23
don't put the right thing down on
17:26
a form, the next thing you know, you could find yourself
17:28
in a lot of trouble. So we have actually seen a lot of
17:30
CEOs go down over the years for
17:32
what are essentially financial entanglements.
17:35
So again, as I grasp for a what
17:37
could possibly have happened that would make the board
17:40
do this, I have to say that's the
17:42
number one place that I've been coming down. Yeah.
17:44
So aside from the sort of main
17:46
question here, which is what the heck happened, there's
17:49
also this question of, like, well, what is next
17:51
for OpenAI and for Sam Altman?
17:54
For OpenAI, I imagine under Mira
17:56
Murati, they will continue to, you
17:58
know, refine and launch new products.
18:02
I imagine Microsoft,
18:04
which is sort of heavily dependent on OpenAI
18:07
for a lot of its AI work, will try
18:09
to continue to work with the company. But
18:12
I can't imagine that Satya Nadella, the
18:14
CEO of Microsoft, was happy to see this news. He
18:16
and Sam Altman were on stage
18:19
at OpenAI's Dev Day just
18:21
recently, and it didn't seem
18:23
like there was anything but love between the two. So
18:25
I think for a lot of employees
18:27
of OpenAI, what I'm hearing is that they are trying
18:30
to figure out what to do with themselves.
18:32
As you mentioned, Sam was the reason that a lot
18:34
of them came to work at OpenAI. And so
18:37
I imagine that the recruiters of
18:39
Silicon Valley are gonna be very busy in the coming
18:41
weeks. Yeah, and
18:44
also, no disrespect to the new leadership
18:46
at OpenAI, they have very strong backgrounds. They're
18:49
clearly incredibly talented technologists,
18:52
but I wouldn't be surprised if there were people over at Microsoft
18:55
sort of feeling like maybe they actually just gained a lot
18:57
of power over OpenAI, and that maybe they
18:59
can now steer the future a little bit
19:01
more strongly than they could when they had Sam
19:04
Altman at the helm. Yeah. All
19:06
right, well, Casey, I'm sorry
19:08
that our planned week off next
19:10
week has been interrupted. This keeps happening.
19:13
This summer, when we both planned to take a week off,
19:16
we had the launch of Threads.
19:18
That was a big surprise, and we had to cut our vacations
19:20
short to do that. I'm beginning to think the problem
19:23
here is us. We plan to have an off week next week
19:25
and drop this big Sam Altman interview, and now
19:27
Sam Altman's been pushed out. So you could
19:29
call it the Hard Fork curse, or you could just
19:31
call it bad luck. Yeah, well,
19:33
call it what you want, but we
19:36
do like making the podcast, and thanks to everybody
19:38
who sent us messages asking us to do
19:40
an emergency pod. We did wanna hop on
19:43
and tell you what we know, which isn't much, but
19:45
obviously the story has our full attention. When
19:48
we know more, you will be the first
19:50
to know, and my guess is that Sam Altman
19:52
is going to continue to play a role in the history
19:55
of AI. Yes, this does not feel like someone
19:57
who is ready to ride off into the sunset. I-
20:00
I, on the other hand, am absolutely about to ride off into
20:02
the sunset. Actually, the sunset is a long time
20:04
ago, and now I have
20:06
to go eat dinner. Okay, well, thanks
20:08
for doing the emergency pod with me. And
20:11
we will reconvene when we
20:13
have more information.
20:44
Why rely on luck to get you through next-level
20:46
moments?
20:47
Stop crossing your fingers during a data center
20:49
migration. Don't
20:51
knock on wood to hope for a successful
20:53
office expansion.
20:54
Stop saying well wishes and looking to
20:57
your lucky stars to help you avoid
20:59
a cyber attack. Next-level moments
21:01
need the Next Level Network. With
21:03
the security, reliability, and expertise
21:06
to take your business further. AT&T
21:09
Business.
21:10
The network more businesses choose.
21:13
Hard Fork is produced by Davis
21:15
Land and Rachel Cohn. We're edited
21:18
by Jen Poyant. Today's show was engineered
21:20
by Corey Schreppel. Original music by Dan Powell. Our
21:24
audience editor is Nell Gallogly. Video
21:26
production by Ryan Manning and Dylan Bergeson.
21:29
Special thanks to Paula Szuchman, Pui-Wing Tam,
21:32
Kate LoPresti, and Jeffrey Miranda. You can
21:34
email us at hardfork at nytimes.com. Especially
21:36
if you're an OpenAI board member who knows what happened. Yeah, that'd be
21:38
great.
22:10
Capella University is rethinking higher
22:13
education. With their game-changing FlexPath
22:15
format, you can earn your degree on your schedule
22:18
so you can fit education seamlessly into
22:20
your life. Imagine your future
22:22
differently at capella.edu.