Episode Transcript
0:00
Casey, where are you right now? Okay, first
0:02
of all, you have to stop calling me, Kevin. I'm trying to be on vacation
0:05
over here. I'm literally in the middle
0:07
of gate E8 at San Francisco International
0:10
Airport. Is
0:13
it loud there? How are you recording this? It's
0:15
surprisingly quiet, but only because my flight just
0:17
got canceled. No! What
0:20
happened? Well, step
0:23
one was they were waiting on some paperwork.
0:26
Step two was they opened up some sort of compartment and
0:28
they said that they needed to replace a broken latch, and
0:30
then during that
0:30
process, they announced that something was cracked and
0:33
they were taking the airplane out of service.
0:36
So the best option
0:38
for us was to hop on and chat
0:40
while we wait for our next plane to
0:42
arrive. And that's how my Thanksgiving's going so far. Oh my God, I'm so sorry
0:45
that happened to you. Well, it— How's your
0:47
holiday going so far? Well, I would say
0:49
it's been very relaxing. I've
0:51
just been spending, you know, quiet
0:54
afternoons, you know, reading books and
0:56
catching up on sleep. And
0:58
of course, I have been in a frenzy
1:00
of
1:00
reporting and trying to figure
1:03
out what the heck is going on behind all this drama.
1:05
This is a sort of triumphal return of Sam
1:07
Altman and some OpenAI employees
1:10
have been saying things like, we are so back.
1:13
Well, before we get into the aftermath,
1:16
Kevin, we should probably just quickly tell
1:18
folks who may not have heard what happened,
1:20
since last week's episode came to them in an emergency
1:22
podcast just hours ago. I
1:25
feel like this week has just been one never-ending
1:28
emergency podcast. I
1:30
feel like we have
1:30
really not stopped recording since Friday.
1:33
But basically, here's the deal. Late
1:36
Tuesday, OpenAI announced that
1:38
Sam Altman was being brought
1:41
back after this five-day campaign
1:44
that had been waged by Sam
1:46
and his allies, OpenAI's
1:48
employees who had threatened to quit en
1:50
masse and go work at Microsoft, as
1:52
well as the company's investors. They
1:55
said they had a quote, agreement in principle
1:57
for Sam Altman to return,
1:59
and that they were overhauling
2:02
the company's board of directors. So Adam
2:04
D'Angelo, who is the chief
2:07
executive of Quora and was one of the
2:09
board directors who had voted to fire
2:12
Sam Altman, is staying on, but
2:14
he is the only member of that board who
2:16
is staying on. Helen Toner and
2:18
Tasha McCauley, two of the other board members
2:21
who voted to fire Sam Altman, are
2:23
leaving the board, and two new
2:25
people are joining to replace them.
2:28
Bret Taylor, who is an early Facebook
2:30
executive and former executive
2:32
of Salesforce, is coming onto
2:34
the board. He will be the new chairman, and
2:37
Larry Summers, the former Treasury
2:40
Secretary, is also coming
2:42
onto the board. So Casey, what was your reaction
2:44
to this news? I mean, on one
2:46
hand, it was very surprising,
2:49
given that there had been a few failed attempts to
2:51
return Sam to this position so far.
2:54
On the other hand, though, I think by
2:56
the time this happened, Kevin, this really
2:59
was inevitable. And there was one particular
3:01
detail I read in some reporting
3:03
that I want to share right now, and this was the moment where
3:05
I thought there is no way Sam Altman doesn't come back
3:07
to this board. Can I just share this with you? Yes, please. So
3:10
this is from a story written by Keach Hagey,
3:12
Deepa Seetharaman and Berber Jin at the Wall Street
3:14
Journal, and in their piece on the matter, they said,
3:17
and this is, of course, as the board is discussing
3:19
the situation with some supporters of
3:22
Sam Altman, and this is the quote,
3:24
the board agreed to discuss the matter with their counsel. After
3:26
a few hours, they returned, still unwilling
3:28
to provide specifics. Specifics in this case
3:31
about why Altman was fired. The story goes
3:33
on. They said that Altman wasn't candid and often
3:35
got his way. The board said that Altman
3:37
had been so deft, they couldn't even give
3:39
a specific example, according to the people
3:42
familiar with the executives. So
3:45
when the people who were trying to get Sam
3:47
back asked the board, hey, no,
3:49
seriously, why did you fire this guy? Their
3:52
answer was he's so good at being
3:54
bad, we can't even tell you what he did bad. And
3:56
that was the moment where I thought, this man is going to
3:58
be CEO of this company again. Yeah,
4:00
that's a good observation. And it dovetails
4:03
with some reporting that my colleagues and I have been
4:05
doing at the Times about
4:07
why Sam Altman was fired and about
4:10
some of the conflicts between him
4:12
and the board that have been going on
4:14
for a while now. In particular, this
4:16
conflict between Sam Altman and Helen
4:18
Toner, one of the board members who
4:21
is departing over this academic
4:23
paper that she had written
4:25
that sort of cast OpenAI
4:28
in a negative light. And
4:30
Sam Altman was upset about this.
4:33
And this is sort of part of what sparked
4:35
the disagreement between him and the
4:37
sort of Helen Toner faction of the board.
4:40
But obviously, you're right, we still don't know exactly
4:43
what the trigger was for
4:45
firing Sam Altman. But it seems
4:47
to have been vague enough or
4:49
unconvincing enough that the faction
4:51
of the board that wanted to push him out was not able to
4:54
stand their ground. And ultimately,
4:56
in these bargaining sessions, they agreed
4:59
to bring him back in exchange for certain
5:01
changes to the company's governance. That's
5:04
right. I do think that dispute
5:06
that you mentioned is important to spend
5:08
another beat on, right, because I do
5:10
think that the entire conflict is contained
5:13
in this story. OpenAI is, of course,
5:15
famously a nonprofit board
5:18
that runs a for-profit company. Helen
5:20
represented the nonprofit board. Sam,
5:22
his duty is to the nonprofit, right?
5:25
He is hired essentially by the nonprofit.
5:27
But I also think that his loyalties
5:30
have been much more to the sort of commercial
5:32
corporate side of the venture, at least as
5:34
this most recent drama has been playing
5:37
out. And the paper that Helen co-wrote
5:39
was a paper in part about AI safety.
5:41
And the thing that she and her co-authors
5:44
wrote was that OpenAI's rival,
5:47
Anthropic, which is of course co-founded by a bunch of former
5:49
OpenAI people, they wrote that
5:51
Anthropic had essentially built their product more safely
5:53
than OpenAI had. And so you
5:56
can understand why in Sam's
5:58
mind, that was a betrayal. But you
6:00
can also understand why in Helen's mind
6:03
that was just her doing her job. Her job is to make
6:05
sure that AI gets built in the safest
6:07
manner possible. Her job is not to protect the
6:09
reputation of OpenAI. And
6:12
so that appears to be where the schism
6:14
was. And even if that wasn't the trigger for why Sam got
6:16
fired, I think it tells you a lot about what happened over the past
6:19
week. Yeah. So we'll see who gets
6:21
added to the board in the coming days.
6:23
This is not the final composition
6:25
for the board. Are you throwing your hat in the ring, by the way, Kevin? I
6:29
will not serve if elected. This company
6:31
has already cost me too much sleep. So
6:34
it remains to be seen who will end up on the final
6:37
version of the board. This is sort of being seen
6:39
and portrayed as an interim board that's just
6:42
there to kind of sort things out and ultimately
6:44
decide who should be on
6:46
the nonprofit board going forward. But
6:48
I would say a couple things. One
6:51
is Microsoft is definitely
6:53
going to have a bigger hand in the governance
6:56
of OpenAI going forward. When
6:58
Microsoft did this deal with OpenAI,
7:00
investing billions of dollars in the company, they
7:03
were kind of a passive investor,
7:05
right? They did not have a seat on
7:07
the board. They were not making
7:09
the decisions about the future of this company,
7:12
even though this company and its technology have become
7:14
very important to Microsoft's future business
7:17
plans. The best joke I heard about that, by the way, was
7:19
from Matt Levine, who wrote something like Microsoft
7:21
invested in a nonprofit at a valuation of $80 billion.
7:25
Yeah. So
7:27
Microsoft obviously will
7:29
want to ensure that something like this
7:31
doesn't happen again, that its investment
7:34
in OpenAI is not sort of jeopardized
7:37
by this nonprofit board. And so I expect that
7:39
they will want a board seat going forward. And
7:42
the bigger picture here, I think, and
7:44
this is something that I've been writing about today, is
7:46
that this war in AI
7:49
between sort of the capitalists and
7:51
the catastrophists. That's catchy.
7:54
Yeah, thank you. So there are people who think that
7:56
AI is going to be a powerful business tool
7:58
and the people who worry that it's going to take
8:01
over the world and destroy humanity. That
8:03
war, I think, as of now, is
8:06
over. The capitalists have won. The
8:08
people who are now in charge of the board of OpenAI
8:11
are the kind of seasoned deal makers
8:13
and Silicon Valley insiders that you would expect
8:16
to govern a for-profit technology
8:18
company. They are not these kind of academics
8:21
and ideologues who worry that AI
8:24
could become too powerful and need to be
8:26
shut down. And I think that's mostly the
8:28
way that things are going to be from here on out. Certainly
8:31
I think the pro-safety people have lost
8:34
their most important perch
8:36
that they had on any sort of power in
8:38
the circumstance at all. At the same time, there
8:41
is a faction throughout the government,
8:43
academia, in journalism, and within
8:46
the industry itself that wants to build this technology in
8:48
a safe way. So I don't think that disappears, but
8:50
you're right, it did lose a lot of power. I think my
8:52
sort of wrapping-up question for you, Kevin, is how
8:55
much do you think this changes OpenAI? Is
8:57
it the case that Monday morning rolls around and
9:00
it is just back to business as usual for these
9:02
folks? Or do you think that this
9:05
crazy series of events will have affected
9:07
Sam and the company in some profound
9:09
way that might change what we expect to see from them going forward?
9:13
It's hard to say. I was talking to some
9:15
OpenAI employees who were
9:17
going back to the office to celebrate.
9:19
They were having a party at the office. I
9:22
did not get invited to that party, but I was
9:24
hearing dispatches from inside of it. And at one
9:26
point, the fire alarm of the
9:29
OpenAI offices was set off by
9:31
a fog machine in case you
9:33
want to do sort of a vibe check on how people at
9:35
the company are feeling right now. They are very
9:38
happy. They are celebrating. They
9:40
are bonded. Nothing bonds
9:42
people together like going through a crisis. There's
9:45
a lot of... Like the one we're having right now where I'm recording
9:47
an emergency episode in the airport? It's
9:49
true. I've never felt closer to you. I'm going
9:52
to be honest with you right now, Casey.
9:54
The people I'm talking to, they think that this
9:56
is going to be a real moment of reinvigoration
9:59
for the company, that employees
10:02
are feeling optimistic about the
10:04
future and they are now
10:06
even more devoted to this mission of building
10:09
AGI. And obviously
10:12
I think there are going to be some people
10:14
who come out ahead or behind
10:17
of this kind of reorganized, reconstituted
10:20
OpenAI. And there's a lot of questions
10:22
we still don't know the answers to about how
10:24
the company will change going forward. But
10:26
I think if you're looking for a sort
10:28
of clear before-and-after picture of OpenAI,
10:31
I would say before there was
10:33
this sense that there was this fragile structure
10:36
that needed to be sort of balanced, the needs
10:38
of the business and the needs of the nonprofit.
10:41
And now I think people feel like the
10:43
business is firmly in the driver's seat. That
10:46
all sounds right to me. I think the one thing I would add
10:48
is that I do think the company and Sam Altman
10:50
in particular are just going to be under more scrutiny
10:52
now, right? We all learned a lot about
10:54
the history of this company and of Sam
10:57
Altman in particular over the past week.
10:59
And I think to the extent that the company makes
11:01
moves that are perceived as sort of pro-corporate,
11:04
pro-Microsoft, and anti-safety,
11:07
I just think they're going to get 10 times the attention that they
11:09
did before all of this happened. And
11:11
that might be a good thing, right? So at
11:13
the end of the day, I think that the board
11:15
that they had did not execute its responsibilities
11:18
well and did need to go. But
11:20
I do hope that you and I will keep our eye
11:23
on some of the concerns that they were raising behind
11:25
closed doors, even if they would never be straightforward
11:27
about what those concerns actually were. Totally.
11:30
Well, Casey, thank you for taking one for the
11:32
team and recording an emergency podcast
11:34
from the airport. I hope that you are able
11:37
to get on a new flight and make your Thanksgiving
11:40
plans after all. And let me just say to the people
11:42
of OpenAI, this is the last one I have in me this
11:44
week. Okay, I don't care what kind of crazy shenanigans
11:46
you guys get up to with your fog machine and
11:48
your fire alarms that you're pulling at company
11:50
headquarters, you're not going to hear my voice again until next
11:52
Friday. And you can count on that. That's the
11:54
hard fork promise. We are going to take a vacation.
11:58
All right, maybe go grab that turkey out of the oven, Kevin.
11:59
I'm starting to see some smoke coming out from the
12:02
door behind you. That's just my fog machine
12:04
in solid air.
12:48
Hard Fork is produced by
12:50
Rachel Cohn. We're edited by
12:52
Jen Poyant. Today's show was engineered
12:55
by Dan Ramirez. Original music
12:57
by Dan Powell. Our audience
12:59
editor is Nell Gallogly. Video
13:02
production by Ryan Manning and Dylan
13:04
Bergeson.