BONUS: Understanding the Facebook Oversight Board Decision

Bonus: Released Thursday, 6th May 2021

Episode Transcript

Pushkin, from Pushkin Industries. This is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. This is a special bonus episode, a mini episode about some breaking news. This week, Facebook's Oversight Board decided the most important case in its short life: what to do about Donald Trump's temporary suspension from the platform, which had been announced by Facebook in the aftermath of the January sixth attack on the Capitol.

The story mattered to me because, as some listeners will know, I've been deeply involved with the Oversight Board, proposing it to Facebook in the first place and advising the company on its creation. In fact, I still advise Facebook on free speech and free expression related issues. So when it comes to the Oversight Board, I'm the very opposite of an objective observer. I am an observer who's deeply bound up in the institution and the process, and I care a lot about this decision. And let me tell you, it was fascinating and strange to see the decision of that institution plastered on the front pages of the newspapers.

After consultation with my terrific team of producers here at Deep Background, we decided that it might be useful to do a special mini episode on the Oversight Board decision.

And I'm going to tell you, just from my own perspective, three different aspects of what you should think, or what you might wish to think, about the Oversight Board decision. What I'm going to do is break my comments into three parts. First, what did the Oversight Board actually do? And as you'll hear, the answer is pretty different from what the headlines have said. Second, what is likely to happen next in the coming months? And last, but very much not least, why this matters, or may matter, in the big picture.

First, what did the Oversight Board actually do? There is some confusion around this, because the very first thing the Oversight Board said in its opinion was the slightest little bit misleading. The Oversight Board began by saying that it was upholding Facebook's decision, in the aftermath of the January sixth attack on the Capitol, to take Donald Trump off the service. And yet when you went on to read the fine print, the Oversight Board went on to say that Facebook's subsequent deplatforming of Donald Trump for an indefinite length of time was wrong, standardless, and unjustified. As a consequence, the first thing the newspapers reported was "Oversight Board upholds Facebook." Yet they could just as easily have said as their headline, "The Oversight Board told Facebook that it was not justified in suspending Trump from its service."

So what was the Oversight Board in fact saying when you drill down? Well, what it said is that the decision to block the content that Trump posted during and in the process of the attack on the Capitol was the right thing for Facebook to do, because Donald Trump's words, the Oversight Board believed, were contributing to ongoing harm, including violence, with respect to the attack on the Capitol. Therefore, said the Oversight Board, it was appropriate to take down that content. But the board then went on to say that when Facebook chooses to take down content, it doesn't ordinarily go on to remove the user from the platform.

Instead, Facebook has a range of things that it can do, which include just taking down the content, or temporarily freezing the account of the person who has posted that content, or, under some circumstances, actually deplatforming the person. What Facebook had never done before, according to the Oversight Board, was announce an indefinite suspension, which was neither labeled as a mechanism to prevent future harm, nor as a punishment for explicit violations by Trump of rules of the platform that can get you deplatformed.

In essence, what the board was saying was that Facebook needs to go back to the drawing board. It needs to clarify and specify what its rules are going to be going forward for taking people off the platform, and then to see if those rules, which it has to state, explain, and announce, would apply to Donald Trump. Once it reaches that conclusion, if its clearly stated rules don't apply to Trump, Trump has to be put back on the platform. If it says that its rules do qualify Trump for permanent removal, then it could take Trump off the platform. And Trump, of course, would then have the opportunity to go back to the Oversight Board and ask for it to review the issue again. Whether it would listen to his case or not is uncertain, but it seems probable that it would, given the great importance of the issue.

You probably noticed that a lot of this decision therefore depends on what Facebook does in the next six months. And you might also be wondering, and the truth is, I'm wondering about this a little bit too, how did the Oversight Board decide to give Facebook six months to figure out what it was going to do next?

So let's turn to that six month period. And here's why that six month period matters so much. Some observers of this decision have said that the Oversight Board punted the question of what to do about Donald Trump back to Facebook, and in a sense that is correct. Acting in a manner not unlike what many actual supreme courts or constitutional courts would do, the Oversight Board declined to say: here, Facebook, are the rules which you must follow when the time comes to decide whether to kick somebody off the platform. The Oversight Board saw its role as doing oversight, not as specifying policy. So there is a punt, or a return of this issue back to Facebook, insofar as the Oversight Board was telling Facebook: you have to write the policy, we're not going to do it for you.

That said, the Oversight Board gave substantial guidance to Facebook with respect to what that new policy should look like. When Facebook now goes to rewrite its policies, it will go into the details of what the Board suggested. And although the Board did not say that Facebook had to listen to these principles, the strong implication was that if Facebook made a decision that violated the principles that the board laid out, the board might well overturn Facebook's policies the next time around.

What was good for Trump is that the Oversight Board made it very clear that Facebook, in deciding whether someone like Trump can be permanently deplatformed, has to look at whether his presence on the platform would cause significant imminent (that means immediate) harm. Here's the money quote: "Facebook must assess whether reinstating mister Trump's accounts would pose a serious risk of inciting imminent discrimination, violence, or other lawless action." In other words, Facebook can't just say we don't like Donald Trump, we think Donald Trump's lousy, or even we think Donald Trump is in general dangerous.

They have to create rules according to which a removal of Trump would be conditioned on this serious risk of inciting discrimination, violence, or lawless action. That's good for Trump: now that he's no longer president of the United States, and now that he's not commanding a mob that's about to attack the Capitol, it would not be that easy for Facebook to show that putting him back on the platform would incite imminent violence or lawlessness.

What's less good for Trump is that, in describing what Facebook should do over the next six months, the Oversight Board also seemed to suggest that Facebook should require Trump to back down from some of the spurious claims being made about election fraud. Here's the money quote: "Facebook should, for example, be satisfied that mister Trump has ceased making unfounded claims about election fraud in the manner that justified suspension on January 6." And in the real world, we all know that Donald Trump, who responded to the Oversight Board decision with a loud statement of rejection in which he referred to himself as the president of the United States, doesn't seem very likely to take steps like that.

In any case, what Facebook is now going to have to do is engage in an internal process of figuring out how to state rules that will be designed to justify and explain whatever they decide to do about Trump. That internal process will involve those people within Facebook who make content policy rules, and they will have to figure out how to apply those rules in a public way. Those rules will not only cover Donald Trump, but will also cover anybody else whom they wish to take off the service.

The Oversight Board made it very clear in its decision that Facebook cannot have one rule for Trump and another rule for every other government leader. It also strongly implied that Facebook should not have different rules for public figures who influence a lot of people than it does for regular users. Regardless, the Oversight Board was very concerned that Facebook pay attention to the potential dangers and harms posed by users and explain the connection between those harms and any decision to deplatform the person. We may not know much publicly about how Facebook undergoes this process right away, but the good news is, under the board's guidance and oversight, Facebook will have to explain clearly and publicly what its rules are, and will have to show how those rules operate.

That brings us to the grand question of whether any of this matters. It may not surprise you to hear that I think it matters a lot, and for several reasons.

First is the fact that the Oversight Board actually did its job. That is to say, it operated in such a way as to render a decision that neither rubber stamped what Facebook had done nor fully reversed what it had done. Instead, the Oversight Board did oversight. That is, it held Facebook to account by saying that Facebook had an obligation to follow rules and principles that would be made public in the realm of free expression. On its own, Facebook had not clarified publicly exactly why Trump was removed.
It had acted in a somewhat "let's figure out what to do under these circumstances" ad hoc manner, and the Oversight Board told Facebook it just couldn't get away with that. Yet the Oversight Board also was unwilling to shoulder all of the responsibility for telling Facebook exactly what it should do in the future. It wanted Facebook to take on board its own responsibility for getting it right, and that seems to be exactly what oversight should be about.

Second, the Oversight Board decision was treated by news organizations throughout the world the way a decision by an actual supreme court would probably be treated. It wasn't just discussed. It was analyzed, pored over, evaluated, argued about, and indeed also much anticipated when it came down.

The fact that the world seems to have treated the Oversight Board's decision as a real decision suggests that the institution may have passed its first major test of legitimacy. Sure, it will be criticized, and indeed criticized harshly by supporters of Donald Trump, and it may also be criticized by people who think that the board didn't go far enough in telling Facebook exactly what to do. But those are the kinds of criticisms to which real world courts are subject all the time. It's therefore very important that this decision was made, was discussed, was analyzed, because it suggests that a possible future direction for the way important decisions like this are going to be made is in dialogue between Facebook and its Oversight Board. Some people might prefer that there not be a dialogue, that the Oversight Board just speak and the conversation be finished, but that's not how real world courts operate, and that's probably not how the Oversight Board is going to operate for now. Instead, to engage in oversight, it's going to have to participate in an ongoing process of dialogue.

Last, but not least, one of the crucial reasons for the creation of the Oversight Board in the first place was the sense that the most important decisions about free expression on social media are too big to be made solely by the people who run the company. The Oversight Board told Facebook's leadership: we don't like how you made this decision, go back and do it again. Facebook will then have to make a new decision, and that decision, too, is subject to being reviewed, finally, by the board. In other words, there will be a sharing of ultimate responsibility for decision making. That sharing is, at least in my view, a step in the right direction, away from a world where the most important decisions about free expression are made by the CEOs of platforms, with no option for recourse and no independent review by any third party body.

Everything that I've just said to you is subject to revision and review as time develops and as the story continues. And just to remind you, none of it comes from my objective analysis. It all comes from my own connection to, and care about, this nascent institution. That said, I will say I'm pretty proud today of what the Oversight Board did. I don't know that I would have written the opinion the way the Oversight Board did. I don't know that I would have given Facebook six months in order to make this decision. I might have thought it could do it in a substantially shorter amount of time. I might have explained why six months was the amount of time that was being chosen, as opposed to just suggesting it as a reasonable amount of time in which Facebook could act. But those are nothing but little quibbles. In the end, this institution acted as an oversight body and gave feedback to Facebook, and Facebook is going to have to listen. And that, for once, seems to be a small step forward in the world of regulation and ethics in the context of big tech.

I'll be back to you soon with a full episode. In the meantime, have a terrific week, stay safe, and be well.

Deep Background is brought to you by Pushkin Industries. Our producer is Mo LaBorde, our engineer is Martin Gonzalez, and our showrunner is Sophie Crane McKibben. Editorial support from Noam Osband. Theme music by Luis Skara at Pushkin. Thanks to Mia Lobell, Julia Barton, Lydia Jean Cott, Heather Fain, Carly Migliori, Maggie Taylor, Eric Sandler, and Jacob Weisberg. You can find me on Twitter at Noah R. Feldman. I also write a column for Bloomberg Opinion, which you can find at bloomberg dot com slash Feldman. To discover Bloomberg's original slate of podcasts, go to bloomberg dot com slash podcasts, and if you liked what you heard today, please write a review or tell a friend. This is Deep Background.
