Is It Time to Unfriend Facebook?

Released Thursday, 7th November 2019

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

Next Question with Katie Couric is a production of

0:02

iHeartRadio and Katie Couric Media. Hi

0:05

everyone, I'm Katie Couric and welcome to Next

0:07

Question, where we try to understand the

0:09

complicated world we're living in and

0:11

the crazy things that are happening by

0:14

asking questions and by listening

0:16

to people who really know what they're talking about.

0:19

At times, it may lead to some pretty

0:21

uncomfortable conversations, but

0:23

stick with me, everyone, let's all learn

0:26

together. More

0:33

than two point one billion people

0:35

use Facebook or one of its services

0:37

like Instagram or WhatsApp every

0:40

single day. That's nearly one

0:42

third of the entire world's population.

0:45

But recently the company has gone from

0:48

the brilliant brainchild of a Harvard dropout

0:50

named Mark Zuckerberg to one of the most

0:52

controversial companies on the planet.

0:55

He was recently grilled on Capitol Hill

0:57

by members of Congress concerned about the platform's

1:00

increasing footprint in

1:02

almost every aspect of our lives.

1:05

Sure, Facebook can bring communities together,

1:08

help you share photos with your family, and

1:10

even start movements, but it can

1:12

also unfairly impact elections,

1:15

spread misinformation, create

1:17

a safe space for child pornographers,

1:19

and white supremacists, invade our

1:21

privacy, exploit our personal information,

1:24

and increase the deep divisions of our

1:26

already polarized nation. That's

1:29

quite a laundry list, isn't it, And with

1:31

the election fast approaching,

1:34

you may be wondering if it might be deja

1:36

vu all over again, and

1:38

worried that, to borrow a phrase from the nineteen

1:41

sixty six movie The Russians Are

1:43

Coming, The Russians are Coming, not to

1:45

mention China and other foreign

1:47

powers, and the company's recent

1:49

decision not to fact check political

1:52

ads led to a heated debate on social

1:54

media between Zuckerberg and Democratic

1:56

presidential candidate Elizabeth Warren,

1:59

who said the platform had become a quote

2:01

disinformation for profit machine,

2:04

and she even placed an ad on Facebook

2:06

saying Zuckerberg was supporting Trump for president

2:09

to test if it would be removed. It

2:12

wasn't. Meanwhile, more than two

2:14

hundred and fifty of its own employees

2:17

signed an open letter warning that

2:19

the ad policy is quote a

2:21

threat to what Facebook stands for.

2:24

So I was impressed that the company's COO,

2:27

Sheryl Sandberg was willing to sit down with

2:29

me recently at the Vanity Fair New

2:31

Establishment conference in Los Angeles.

2:34

She's been with the company since two thousand

2:36

eight and has played a pivotal role

2:38

in shaping both its culture and its

2:40

business strategy, leading it to

2:42

more than twenty two billion dollars

2:45

in profits last year. She's

2:47

also an advocate for women in the workplace

2:50

with her two thousand thirteen book and organization

2:53

Lean In. And I got to know Sheryl

2:55

after her husband, Dave, died unexpectedly

2:58

in two thousand fifteen. She

3:00

reached out because I too had

3:02

lost my husband at an early age.

3:05

Sheryl wrote a book about her experience, called

3:07

Option B, and I interviewed her

3:09

for that back in two thousand seventeen.

3:12

If you're interested, you can find that interview

3:14

in my feed. Our recent conversation

3:17

at the Vanity Fair summit got a lot of

3:19

attention, and I thought it made sense

3:21

to share it with all of you on my podcast.

3:24

So my next question for Sheryl

3:26

Sandberg is: is Facebook doing

3:28

enough to protect its more than two billion

3:31

users and our democracy?

3:34

Or is it time to unfriend Facebook?

3:38

Sheryl, thank you for being here. We

3:40

have a lot to talk about,

3:42

as you know, so let's get right to it.

3:44

We're just over a year from

3:47

the election. Three hundred and seventy

3:49

eight days to be exact. Who's counting? Yeah,

3:51

But I think the way Facebook

3:54

addresses and fixes

3:56

the platform that was used in two

3:58

thousand and sixteen is seen as a major,

4:00

critically important test. I know

4:02

certain measures have in fact been implemented,

4:05

for example, thirty five thousand moderators

4:07

looking for fake accounts and suspicious patterns.

4:10

Mark Zuckerberg announced new safeguards

4:12

like labeling media outlets that are state controlled.

4:15

But do you believe that's enough? I mean, do you

4:18

really seriously believe that we won't

4:20

witness the kind of widespread interference

4:22

we saw in two thousand sixteen. Well,

4:25

we're gonna do everything we can to prevent it. Um.

4:27

I do think we're in a very different place. So

4:30

if you think back to twenty sixteen, we

4:32

had protections against state actors,

4:34

but when you thought about state actors going

4:37

against a technology platform, what

4:39

you thought of was hacking the Sony emails,

4:41

the DNC emails, stealing information.

4:44

And that's what our defenses were really set up

4:46

to prevent, and so were everyone else's.

4:48

What we totally missed, and

4:50

it is on us for missing it, and

4:53

everyone missed it. This was not stealing

4:55

information, but going in and

4:57

writing fake stuff was a totally different

5:00

threat and our systems weren't set up to deal with

5:02

it. So the question is as you're asking,

5:04

what are we doing going forward and how are we going

5:06

into the election? And how did we do

5:08

in twenty eighteen? And we're in a

5:11

totally different place. The FBI

5:13

has a task force on this. They didn't have anyone

5:15

working on it. Homeland Security is working

5:17

on it, all the tech companies are

5:19

working together, because when you try

5:21

to interfere on one platform, you try to interfere

5:24

on another. In twenty sixteen,

5:26

we didn't know what this threat was. We

5:29

did one takedown. In the last

5:32

year, we did fifty. And I read a

5:34

shocking number. You took down more than

5:36

two point two billion fake

5:38

accounts in a three month period.

5:41

That's right. We take down millions

5:43

every day. So thirty five thousand moderators,

5:46

is that even enough? Given I

5:48

mean two point two billion is almost the number

5:51

of people who are on the platform. So

5:53

the moderators are looking for content. The

5:55

fake accounts are being found with engineering.

5:58

That's the only way to find those fake accounts. Most

6:00

of those are found before anyone

6:03

ever sees them. And fake accounts are

6:05

a really important point here because everything

6:07

that was done by Russia in twenty sixteen, everything

6:11

was done under a fake account. So if you can find

6:13

the fake accounts, you often find

6:15

the root of the problem. And

6:17

so we are now taking down millions

6:19

every day, almost all of which no one

6:21

has seen. You talked about disrupting

6:24

fifty individual campaigns from multiple

6:26

nation states so far. But what about domestic

6:29

threats. Facebook's own former

6:31

security chief Alex Stamos has

6:33

said, quote, what I expect

6:35

is that the Russian playbook is going to

6:38

be executed inside of the US

6:40

by domestic groups, in which case some

6:42

of it, other than hacking, is not illegal.

6:45

My real fear, he says, is that

6:47

in it's going to be the battle

6:50

of the billionaires of secret

6:52

groups working for people aligned on both

6:54

sides who are trying to manipulate

6:56

us at scale online. So

6:59

what is Facebook doing to defend the platform

7:02

against this kind of domestic threat. It's

7:04

a really good question, because things are against

7:06

our policies if they're fraudulent

7:09

or fake accounts, but people can also kind

7:11

of deceive. Again, if you look at where

7:13

we were and where we are, the

7:16

transparency is dramatically different.

7:18

So you look on every page on Facebook,

7:21

you can now see the origin of where the

7:23

person is. So if someone has a

7:25

page that's called, I don't know, US Whatever,

7:28

but they're from the Ukraine, it's clearly

7:30

marked that. If you look at our ad

7:32

library we didn't have this last

7:34

time, you can see any political ad

7:36

running actually anywhere in the country or in most

7:39

places of the world, even if they're not targeted

7:41

to you. So before, if they were

7:43

trying to reach you, you could see it, but you couldn't

7:45

see anything else. Now you can see everything.

7:48

And we rolled out a Presidential Ad

7:50

Tracker so that you can see the presidential

7:52

campaigns much more holistically.

7:54

So with the transparency measures we have,

7:57

we should be able to get

7:59

rid of the accounts, and for the ones that are legitimate,

8:02

whether they run domestically or globally,

8:04

make sure people understand who the people

8:06

are behind what they're seeing. But then why

8:09

did Facebook announce it would not

8:12

fact check political ads last month?

8:14

I know the Rand Corporation

8:16

actually has a term for this, which is truth

8:18

decay. And Mark himself has

8:21

defended this decision even as he

8:23

expressed concern about the erosion

8:25

of truth online. So

8:28

what is the rationale for that?

8:31

And I know you're gonna say we're not a news organization.

8:33

We're a platform. I'm not going to say that, but

8:37

it's a really important question, and I'm really glad

8:39

to have a chance to take a beat and really think about

8:41

it and talk about it. So one

8:43

of the most controversial things out there right now

8:46

is what ads do we take? What

8:48

ads do others take? And do we fact check

8:50

political ads? And it is a hard conversation

8:53

and emotions are running very high on this. I

8:55

also sit here realizing it's however

8:57

many days you said before the election. So

9:00

the ads that are controversial now we have not

9:02

even seen the beginning of what we're going to see. There are

9:04

going to be a lot of controversial ads

9:07

and controversial speech. So why

9:09

are we doing this. It's not for the

9:11

money. Let's start there. This is a very

9:13

small part of our revenue five percent or something.

9:15

We don't release the numbers, but it's very small,

9:18

very small, and it is very controversial.

9:21

We're not doing this for the money. We take

9:23

political ads because we really believe

9:25

they are part of political discourse and

9:28

that taking political ads means

9:30

that people can speak. If you look at

9:32

this over time, the people who have most benefited

9:35

from being able to run ads are

9:37

people who are not covered by the media so

9:39

they can't get their message out otherwise, people

9:42

who are challenging an incumbent, so

9:44

they are a challenger, and people

9:46

who have different points of view. That's

9:48

that's been true historically. And so we

9:51

also have this issue that if we let's say we took political

9:53

ads off the service, we would still have

9:55

all the issue ads. So I'm running

9:57

an ad on gender equality, I'm running an ad on

9:59

another political issue. Those ads are

10:02

much much much bigger in terms of scope

10:04

than the political ads, so you would have every

10:06

voice in the debate except the politicians

10:09

themselves. So instead, what we're

10:11

doing is as much transparency

10:14

as possible. Every ad has to be marked by

10:16

who paid for it. We're doing verification

10:18

to make sure the people are who they say they are, and

10:21

that ads library I started talking about

10:23

is really important because you can't hide.

10:25

You can't run one ad in one state, one

10:27

ad in another, one ad to one group, one ad

10:30

to another. Anyone can go

10:32

into that library and see any ad

10:34

that any politician is running anywhere.

10:36

Well, this is what Vanita Gupta wrote,

10:38

the former head of the DOJ Civil Rights

10:41

Division, in Politico. Simply

10:43

put she wrote, while major news organizations

10:46

are strengthening fact checking and accountability,

10:48

Facebook is saying, if you are a

10:50

politician who wishes to peddle in lies,

10:53

distortion, and not so subtle racial

10:56

appeals, welcome to our platform.

10:58

You will not be fact checked. You are

11:00

automatically newsworthy. You're automatically

11:03

exempt from scrutiny.

11:05

So I know Vanita, and I've had a chance to speak

11:07

to her since she posted

11:09

that, and I think the debate is really important.

11:12

I've had a chance to work with her on our civil rights

11:14

work. We've taken a lot of feedback from her

11:16

already, and we'll continue to. What she was writing

11:18

there was not only about ads, it was really

11:20

about content on the platform. So taking

11:23

a step back, here's what we do.

11:26

When you write something. We have a very

11:28

strong free expression bent. We think it's very

11:30

important that we judge as little

11:32

as possible and let people express themselves.

11:35

But we don't allow just anything on the platform. If

11:37

something is hate, terrorism,

11:39

violence, bullying, you know, hate

11:42

against protected classes, it comes down

11:44

we take it off, voter suppression. If

11:46

something is false, misinformation,

11:49

fake news, we don't take it off.

11:51

We send it to third party fact checkers. If

11:54

they mark it as false, we mark it as false.

11:56

If you go to share it and it's marked as false,

11:58

we warn you with a pop up and we say, do you want to share

12:00

this it's been marked as false? We

12:03

dramatically decrease distribution,

12:05

and

12:08

we show related articles. How can you possibly

12:10

do that with two point seven billion

12:13

users? How can you possibly keep

12:17

up with all the content

12:19

that's being produced on Facebook

12:21

and distributed and shared, etcetera. We

12:24

can't fact check everything. We're not trying to fact check

12:26

everything or send everything to third party fact checkers

12:28

at all. We prioritize in terms

12:30

of what's going most quickly. So when something

12:33

is growing really quickly, it gets referred, it goes to

12:35

the top of the heap sending it to fact checkers.

12:37

And these are really news

12:39

links. You know, if you're a bad example

12:41

because you're a media journalist, but you

12:43

know, if my sister writes a post about

12:45

her kids and her dogs, which she does all the time,

12:48

that's not getting fact checked. That said,

12:51

the challenges of scale here are really

12:53

important, and in a lot of the areas where

12:55

we are reluctant to weigh in,

12:58

it's because we know we can't do this well

13:00

at scale, so we have to rely

13:02

on other sources. I think one of the most

13:04

important things we're rolling out in the next year

13:06

is our Content Advisory Board. We

13:09

understand that there are real concerns with the amount

13:11

of power and control we have that right

13:13

now we are the ultimate arbiters of what stays

13:16

on our service, and so we're setting up a

13:18

content review board. The final charter has

13:20

just been released. We've consulted

13:22

with over a thousand experts around the world and

13:24

they're going to be forty people appointed and

13:27

by next year they're going to start hearing cases. They don't

13:29

report to me, they don't report to Mark. It

13:31

means that if you disagree and

13:34

something was pulled down and you think it should be up,

13:36

or if you disagree and we

13:38

are letting something run from someone else,

13:40

that you don't think, you have a place to go and

13:42

we're going to abide by their decisions. Since

13:44

two thirds of people get their news and information

13:47

now from social media, do you

13:49

have any responsibility in your

13:51

view to at least attempt to

13:54

make sure that the news on your platform

13:57

is factual? Because oftentimes

14:00

I've heard, well, we're a platform,

14:02

we're not a publisher, right, and

14:04

so we're basically the pipes. So

14:06

where do you see your responsibility in terms

14:09

of that? So we do think we have a responsibility

14:11

for fake news and misinformation? Would you say you're

14:13

not a publisher still? Well,

14:16

what would you call it? So that is

14:18

a complicated thing and it means different things

14:20

to different people. Here's what we are. We

14:22

are a technology company. A lot of things

14:24

are published on us. But what I think when

14:26

people ask that question, they're wondering

14:28

if we take responsibility for what's on our

14:30

service. And my answer to you is yes,

14:33

we're not a publisher in the traditional sense because

14:35

we don't have editors who are fact checking, but we

14:38

take responsibility and what we've done

14:40

on misinformation has decreased

14:42

people's interactions. Stanford just published a study

14:45

there down by more than half. Since it's

14:48

not perfect, we're not able to fact check

14:50

everything. But we had no policies

14:52

against this in the last election, and you fast

14:55

forward to today. I think we are in an

14:57

imperfect but a much stronger

14:59

position. Let's talk about the free speech

15:01

rationale at Georgetown. Mark used

15:04

Martin Luther King Jr's name in his defense

15:06

of free speech on Facebook, but King's

15:08

daughter, Bernice tweeted, I'd

15:11

like to help Facebook better understand the challenges

15:13

that MLK faced from

15:15

disinformation campaigns launched

15:18

by politicians. These campaigns

15:20

created an atmosphere for his assassination.

15:23

And then Sherrilyn Ifill, as you know, president

15:26

of the NAACP Legal Defense Fund,

15:28

called his speech quote a profound

15:31

misreading of the civil rights movement in

15:33

America and a dangerous misunderstanding

15:36

of the political and digital landscape

15:38

we now inhabit. It

15:41

was a controversial speech, and I

15:43

think the civil rights concerns are very

15:46

real. Um. In terms of

15:48

Bernice King, you know her, her father's

15:51

legacy, I know her. I actually spoke to her

15:53

after that tweet, totally

15:55

scheduled separately. She's coming to Facebook tomorrow

15:58

and I'm going to be in your chair interviewing, and then I'm hosting

16:00

her for dinner tomorrow night. And what

16:02

I told her is what I'll say to you, which is that

16:04

I was grateful she published. We would have liked her to

16:06

post on Facebook, not just tweet, but

16:09

we were grateful she spoke out because this is

16:11

the dialogue we want to have. And she actually tweeted

16:13

again this morning that she heard from Mark

16:15

and is looking forward to sitting down and talking with

16:17

him about civil rights.

16:19

She's smooth, isn't she. I

16:23

mean, these are just facts, she tweeted.

16:25

You can check. Again, to my friend

16:28

Bernice: We'd like you to post on our platform too.

16:30

But this is the dialogue, right, there's

16:33

a lot of disagreement. Civil

16:35

rights and protecting civil rights are hugely

16:37

important to Mark, hugely important to me. I'm

16:39

personally leading the civil rights work at Facebook

16:42

and we'll continue to do that. And while we don't agree with

16:44

everything, and there was certainly disagreement

16:46

over some of Mark's speech, there

16:48

were other things that we've done because we've

16:51

listened and learned from them over the last year, that

16:53

I think they feel really good about. We've taken much

16:56

stronger steps on hate, looked

16:58

at white nationalism and white separatism

17:00

because they informed us of it. We've

17:03

really come down on a very strong policy

17:05

on voter suppression. We are

17:07

taking down voter suppression as hate. If

17:09

you publish you know the polls are open

17:11

on Wednesday, not Tuesday. We're taking that down

17:14

because it's as important to us as hate. And

17:16

that's all based on that work. And so why

17:18

is voter suppression more important than voter

17:20

misinformation. It's

17:23

not. It's not more important. It's just a question

17:25

of how we handle it when we have misinformation.

17:28

What we believe is that

17:30

unless it's hate or going to lead to real world

17:32

violence, we need to let the debate continue.

17:35

We massively dial down the distribution.

17:37

As I said, we don't want things to go

17:39

viral. We mark them as false, but then we

17:42

publish related articles. Here's

17:44

the other side of the story. We think

17:46

that's how people get informed that it's the

17:48

discourse about the discourse. It is Mark

17:51

giving a speech and Bernice King disagreeing

17:54

with it publicly, and that dialogue that matters.

17:57

Whereas if it's hate or if someone's really going

17:59

to show up to the polls the wrong day, we

18:02

just want to take it off our service. And this

18:04

is really hard because one

18:07

person will think something is clearly

18:09

just they really disagree

18:11

with it, and we do too, but

18:13

they think it's someone else's free expression,

18:16

and so these lines are going to continue

18:18

to be really hard to draw. Do you

18:20

really think that people use Facebook as

18:22

an opportunity to look at both sides

18:24

and to see something when it's corrected,

18:26

or don't you think that people are getting

18:28

stuff in their feed that is

18:31

really affirmation of that information.

18:33

And I'm so glad you asked this because there's actually really

18:35

strong data here and no one understands this.

18:38

So when you think about your contacts

18:40

in the world, psychologists say

18:43

you have what's called um your

18:45

tight circle of contacts and then

18:47

your broader circle of contact. So you

18:49

basically can keep in touch with five to seven

18:51

people. That's your mom, your daughters,

18:54

your husband, John, the people who you know where they

18:56

are. What Facebook enables

18:58

you to do is keep in touch with many more people.

19:01

Without Facebook, without social media, without Instagram,

19:03

Twitter, you won't hear from your college friends

19:05

or the people you grew up with that often. So

19:09

if you compare people who do not use social

19:11

media to people who do, the people

19:13

who use social media see much more

19:16

broad points of view because

19:18

if you don't use social media, you go to

19:20

maybe one or two news outlets. They have one

19:22

particular point of view. You read one or two newspapers,

19:24

and that's it. On Facebook, you

19:27

will see that, on average, some of

19:29

the news you see will be from another

19:31

point of view, which means it's not half

19:33

and half, but it is broadening of your views.

19:36

And that's something that I don't think we've

19:38

been able to explain so that other people really understand.

19:41

And the reason for that is if you go to your news feed,

19:43

you don't see like half blue and half red.

19:46

You just see a bit more

19:48

from the other side than you otherwise

19:51

would. So it is unequivocally true

19:53

that Facebook usage and usage of social

19:55

media shows you broader points

19:57

of view, not narrower points of view than you would see

20:00

otherwise. And that's something no one understands. When

20:02

we come back, we take a deep dive

20:04

into the rise of deep fakes,

20:07

Facebook's role in the increasing polarization

20:10

of our country and what the consequences

20:12

should be if the company doesn't

20:15

put the proper safeguards in place

20:17

for the presidential election.

20:23

Let's talk about the free speech argument, which

20:25

came under attack earlier this year

20:28

when Facebook decided not to take down

20:30

that doctored video of House Speaker Nancy

20:32

Pelosi. Her speech was slowed

20:34

down, it made her appear to be slurring

20:37

her words, so that people thought she

20:39

was drunk. You defended the decision by

20:41

saying, we think the only way to fight bad

20:43

information is with good information, and

20:45

you said it had been marked as false

20:48

but at that point, Sheryl, it had been viewed

20:50

two point five million times. So isn't

20:53

the damage already done

20:55

at that point, like when you do a correction in the

20:57

newspaper two days later in tiny print

21:00

on page two. And studies

21:02

have shown if you see the

21:04

false story enough and the

21:06

correction fewer times, then the false

21:08

story actually stays in your head. Not

21:11

to mention, another study by MIT

21:13

that fake news spreads seventy

21:15

times faster than real news

21:17

on Twitter. So I guess, isn't

21:20

the current standard operating procedure on

21:22

videos like this a case of too little,

21:24

too late? I think with the Pelosi video

21:27

it was, and we have said that publicly. Our fact

21:29

checkers moved way too slowly. The process,

21:31

and not the fact checkers, the process for getting

21:33

it to them and getting it back, moved way too slowly,

21:36

and we've made a change in how we do that to

21:38

prioritize things that are moving quickly and

21:40

massively cut down the review time. In

21:42

that case, we should have caught it way earlier. We

21:45

think you're right, and we want our systems to work more quickly.

21:47

because now the technology allows people

21:49

to appear that they're doing something and are saying

21:51

something other than what they're actually

21:53

saying. I mean, how do you keep up with all

21:56

of those things? Well, deep fakes is

21:58

what you're talking about. It is a new and emerging... That's

22:00

what I meant, deep fakes. Yeah. And it's a new and emerging

22:02

area, and it is definitely one that we don't

22:04

believe we know everything about

22:06

because we don't even know what they're gonna look like. Here's what

22:08

we know. We know we're gonna need to move way,

22:10

way, way faster. We know we're going to need

22:13

very sophisticated engineering to

22:15

detect them in the first place. We also

22:17

know that the policies themselves are hard to set

22:20

right, and so we this is an area where

22:22

we know we moved too slowly with the Pelosi video.

22:25

We are trying to move faster. But we're also setting

22:28

up working groups and AI working groups to try

22:30

to develop the technology that will

22:32

help us identify these in the first place. I

22:34

wanted to ask you about Joe Biden because

22:36

I know he's cut down substantially on

22:39

his Facebook ad spending because he wasn't

22:41

seeing very good return. Some strategists

22:43

have speculated that his message

22:46

is too centrist and lacking

22:48

in the inflammatory red meat content

22:51

that does so well on platforms

22:53

like Facebook. Are you concerned that you are

22:55

creating an environment where the most

22:57

aggressive, inflammatory,

23:00

tribal content is what sells.

23:02

I know you addressed that briefly in saying

23:04

that people get different points of view,

23:07

but certainly these people

23:09

seem to gravitate towards that kind of content.

23:12

I mean, I think that's true across political discourse.

23:14

I think it's a problem we face. I think you see it

23:16

in rallies. I think you see it in the debates. I

23:18

think the problem of people

23:21

making more inflammatory statements and people

23:23

rallying to those, particularly as things get

23:25

more polarized, is a real problem.

23:27

I don't think it's unique to us. But do you think you've contributed

23:30

to the polarization in the country. Um,

23:33

I think everything has contributed.

23:35

I do think Facebook is, I

23:37

think, held accountable for that. Well, I think

23:39

we have a responsibility to help make

23:41

sure people get real information, to

23:44

help participate in the debate, and make sure

23:46

that people can see other other points of

23:48

view. So I think, but are they getting real information?

23:50

If they're if they are getting the most

23:52

aggressive inflammatory, in other words,

23:54

sort of more moderate points of view, they're

23:57

not as provocative. They don't

23:59

stoke outrage as much as some of this

24:01

other content. Look, I think that's true. I think

24:03

you see it in rallies too. I think you see it on social

24:05

media. I think you see it in rallies when does

24:07

the crowd cheer. I think you see it in the

24:09

debates. But I think here's what matters. What

24:12

matters is that we want people to be able to communicate,

24:14

express themselves. We want people to

24:16

register to vote and stay in the political

24:18

process. What I will most worry about is

24:21

if people start opting out. So one

24:23

of the things I'm proud of that Facebook has done

24:25

is we registered over two million people

24:27

to vote, and on

24:30

Facebook, when you turn eighteen, we

24:33

basically say happy birthday, and you should register

24:35

to vote. We have a really easy tool

24:37

that lets you find your local representative. Most

24:39

people don't know who their local representative is.

24:41

So yes, I worry about all that, but we

24:43

also worry about core engagement to

24:46

making sure that people don't just opt out, but

24:48

stay engaged, that they vote, that they

24:50

know who their representatives are, they know who they're

24:52

voting for, and they participate in the debate.

24:55

Mark said recently in a leaked audio

24:57

from an internal Facebook meeting that if Elizabeth

25:00

Warren becomes president and

25:02

tries to break up the company, it would be an

25:04

existential threat and Facebook

25:06

would go to the mat. What does

25:08

that mean exactly, go to the mat? We'll

25:11

have to see. But what this is about is whether

25:15

I mean what we'll have to see is

25:17

what this is. This is about whether or not Facebook

25:20

should be broken up. And that's a really important

25:23

question. I think we're facing it. I think all the tech

25:25

companies are facing it um

25:27

And it's interesting. What do you think about the fear

25:29

about that? Well, I

25:31

don't know if it's the biggest fear. I just think

25:34

it's... Would you be okay if it was

25:36

broken up? Well, we don't want Facebook to be broken

25:38

up because we think we're able to provide great

25:40

services across the board. We

25:42

think we're able to invest in security across

25:45

the board. So do you invest enough in security

25:47

across the board? We invest

25:49

a lot. We're investing much much, much

25:51

more. We have hired an extra thirty

25:53

five thousand people, We've put tremendous

25:56

engineering resources, and we're doing

25:58

things like red teams, asking what do we

26:00

think the bad guys would do and how would we do it.

26:02

So we're never going to fully

26:04

be ahead of everything. But if you want to understand

26:06

what companies care about, you look at where they invest

26:08

their resources. And if you look back three

26:11

to five years and you look at today, we've

26:13

totally changed when we invest our resources. And

26:15

my job has changed too. If I

26:17

look at I've been at Facebook eleven and a half years. For

26:20

the first eight or so, I

26:23

spent most of my time growing the company and some time

26:26

protecting the community. We always did some protection,

26:28

but now that's definitely flipped. My job is

26:30

a majority building the systems

26:33

that protect and a minority growing. And so

26:35

we're definitely changing as a company. We're

26:37

in a different place across the board on all of these

26:40

things. Do you think you're changing enough fast enough?

26:43

I hope. So we're trying. We're definitely

26:45

trying. I mean, I think it's about not just

26:48

the current threats, but the next threat. The question

26:50

we ask ourselves every day is, Okay, we

26:52

know what happened in twenty sixteen, and

26:55

now we're going to work to prevent it. What is the

26:57

next thing someone is going to do? And that's

26:59

going to take a lot of thought and

27:01

a lot of cooperation across the board. Do you

27:03

see breaking up Facebook as the existential

27:06

threat Mark Zuckerberg described?

27:08

And how are you feeling about Elizabeth Warren these

27:10

days? So

27:13

I know Elizabeth Warren, and would

27:15

you support her if she's the Democratic nominee?

27:18

I mean, I'm a Democrat. I have supported

27:21

Democrat nominees in the past. I imagine

27:23

I will support a Democrat nominee

27:25

if it's Elizabeth Warren. I

27:27

mean, I'm

27:30

not in the primary right now. I think

27:32

that's a good place for us to be, and so

27:34

I'm not going to let you drag me into the primary. But

27:36

I am a very well understood

27:38

Democrat. I was a supporter of Hillary Clinton.

27:41

I have spoken for many years about my desire

27:43

for my daughter and yours to see

27:46

a woman as president. And so I'd like that

27:48

sounds like a yes, I'd like that. Not just here,

27:51

I'd like that all over the world. I have this really funny

27:53

story from a friend of mine in Germany whose son

27:55

I love this, said to his mother when

27:57

he was five, I can't be chancellor,

28:00

and she said why

28:02

not? He said, well, I'm not a girl. Because

28:05

of Angela Merkel. Because the only person

28:07

he has ever known as chancellor was Angela Merkel. That's pretty

28:09

good. You've said yourself that you have to get

28:12

it right. What should be the consequences if

28:14

Facebook doesn't. I

28:18

mean, I think we have to earn back trust. Trust

28:21

is very easily broken. It is hard

28:23

to earn back. I think we have to

28:25

earn back trust. I think we need

28:28

deeper cooperation across the board. We

28:30

are arguing for regulation in some of these

28:32

areas, including things that would impact

28:34

foreign interference, and I think

28:36

the consequences to us will be grave if

28:39

we don't, what is it? What does that mean? Consequences

28:41

will be? I think it would further erode trust.

28:44

I think people will have less faith in the platform

28:46

and our services. People are continuing to use our services.

28:49

That's trust we need to earn back, not just with what

28:51

we say, but what we do. And it is

28:53

about finding those syndicates and taking them

28:55

down. It is showing that we can cooperate

28:58

across the board, on both sides of the aisle in Congress

29:00

and around the world to find the things that

29:02

threaten our democracy. What can other

29:05

people do to help Facebook solve

29:07

some of these problems? Well,

29:10

thank you for the question. I mean, I think there's a lot of things.

29:12

So one of the things that makes us very different

29:14

than where we were years ago is I think pretty

29:16

radical transparency. So, for example,

29:18

our community standards are completely public.

29:21

We go public every week or so I think

29:23

every two weeks with here's

29:26

some of the decisions we're making, and we take feedback.

29:28

We're publishing a transparency report

29:31

by next year. We're gonna do it every quarter, just

29:33

like earnings, because it's just as important to us

29:35

as earnings, which says: Here's

29:37

all the stuff we took down. So

29:39

here's how many billions of accounts, that's where that number

29:41

comes from. Here's how much terrorism content,

29:44

Here's how much hate speech, and then

29:46

how much of it did we find before it was reported

29:48

to us? So what that report

29:50

shows is, for ISIS and al-Qaeda content, of

29:53

what we take down, we find it

29:55

before it's reported. Hate speech,

29:57

we're in the mid sixty percent range now. That's

30:00

more than double where we were a year and a half ago,

30:02

but it still means that thirty percent of the hate

30:05

speech we take down has to be reported

30:07

to us, which means someone has seen it, and

30:09

so we are whack a mole in a way, though, Sheryl,

30:12

that everything you take down, something pops

30:14

up in its place. How can you ever

30:17

really get control over this? Well,

30:19

it is like whack a mole, right, we take something down. I

30:21

mean, right now, as you and I have spoken on this

30:23

stage, someone, many people, have posted things.

30:26

Our job is to build technology

30:28

that takes that down as quickly as

30:30

possible, and have enough human staff that they

30:32

can take down the rest really quickly. It

30:35

is whack a mole, but it is the price of

30:37

free speech. We have a service that two

30:39

point seven billion people are using.

30:43

That means that there's going to be, you know, all

30:45

the beauty and all the ugliness of humanity.

30:47

And our job, and it is whack a mole, is

30:50

to get as much of the bad off as quickly as

30:52

possible and let the good continue.

30:55

And the only way to get rid of all of it is to shut down

30:57

all of these services. And I don't think anyone's really for

30:59

that. What about temporarily

31:01

shutting them down so you can fix the problems?

31:03

Would you ever do anything like that? I

31:06

don't think the temporary shutdown would

31:08

fix the problems because we have to be in the

31:10

game to see what people are doing to build the

31:12

systems to shut down. But the point

31:15

is people have speech. Now, Like, if you think about

31:17

my childhood, right, I grew up in Miami.

31:19

I went to public school. If I wanted to say something

31:21

to the world, I had no opportunity to do it. Couldn't

31:23

get on your show. No one

31:26

was no. Seriously, you weren't going to take me as a guest.

31:28

No, I wasn't young enough for me that. But hypothetically

31:31

I couldn't get on the person. Before I could

31:33

write an op ed to the local paper, they weren't going to take

31:35

it. People did not have voice full stop.

31:38

Now, that was a world that people

31:40

felt actually pretty comfortable in, and you could fact

31:42

check everything. You could fast

31:44

forward to today, whatever services

31:46

get shut down, you can post somewhere, which

31:49

means that everyone has voice, which means that things are not

31:51

fact checked. Now that doesn't mean

31:53

we don't have responsibility. We do, but

31:56

we are in a fundamentally different

31:58

place where people around the world have

32:00

voice. And as hard as this

32:02

is and as challenging as it is, I

32:05

so deeply believe in that world, so

32:07

deeply I am. As

32:10

a friend of mine behind the stage who went to my high school,

32:12

our high school teacher found a

32:14

kidney donor on Facebook because

32:18

she could publish, and she could reach people in a

32:20

way she never could. We just announced

32:22

that two billion dollars have been raised by

32:24

people on Facebook for

32:27

their birthdays and their personal fundraisers.

32:29

Does that mean everything on Facebook is good? Of course

32:31

not. But you can't shut this down without

32:34

shutting down a lot of good. And I

32:36

don't think that's an unacceptable answer. And so

32:38

we're going to fight to get the bad

32:40

off and let the good keep happening.

32:42

And I think there is a lot of good out there.

32:46

When we come back a look at the alarming

32:48

psychological effects of social media

32:50

on our kids, whether it's time

32:52

to take a second look at lean in in light

32:55

of the me too movement, and I'll

32:57

ask Sheryl about her legacy.

33:05

Let's talk about kids in social media.

33:07

This isn't so good. The addictive

33:10

nature,

33:12

the addictive nature of social

33:14

media is just one concern. But as

33:16

you know, I know you have two kids twelve

33:19

and fourteen. Now, depression

33:21

is up dramatically among young

33:23

people, and the suicide rate of adolescent

33:26

girls is up one hundred and

33:28

seventy percent after two decades

33:30

of decline. And as you know, the leading explanation

33:33

is the arrival of smartphones and social

33:36

media. So, as a parent and someone

33:38

who has been a powerful voice for

33:40

women, how do you respond

33:43

to that terrifying statistic and

33:45

the bigger question, what can be done

33:47

about it? We take this really

33:50

seriously. I take it seriously as a Facebook executive.

33:52

I take it seriously as a mom. So

33:54

it turns out that all uses of

33:57

phones, all uses of social media, are not

33:59

equal. There are some that are actually quite

34:01

good for well being, and there are some that are not

34:03

as good. So when you are actively

34:05

consuming, when you are sharing, when you are

34:07

messaging, when you are posting, liking, you're

34:10

interacting with people, that's fairly

34:12

positive. When you are more passively

34:14

consuming, that is more negative.

34:16

And so we made a very big change to the Facebook

34:19

algorithms in January.

34:21

And what about Instagram as well? Yeah,

34:24

and Instagram we're working on as well. But we dramatically

34:26

dialed up the friends and family sharing and

34:28

dramatically dialed down on

34:31

self harm. Our policies are very

34:33

strict. We do not allow any glorification

34:36

of it. We don't allow any We don't

34:38

allow any glorification of self harm. We don't allow

34:40

any encouragement. We do allow people

34:42

to post about their experiences, and that has

34:44

been very important. We've worked

34:47

really hard to develop automated tools,

34:49

so if you post something that looks

34:51

like you might be about to self harm,

34:53

we will automatically flag UM

34:56

phone numbers and helplines. We've had a tremendous

34:59

response from this, and if we think there's

35:01

imminent danger, we refer it to local law

35:03

enforcement, and many people have actually been

35:05

saved by this. The other thing where

35:07

well, that's sort of not addressing the problem

35:10

of addiction, of you know, comparison

35:13

being the thief of joy. Let me finish

35:15

some of the other things we're doing, because these are all really important,

35:17

and I'm conscious that this clock is beeping

35:19

at us UM, so they're

35:21

gonna give me a little extra so they are okay, then

35:24

I can slow down. So

35:29

so one of the other things that happens is, you know, social

35:31

media can be considered by some to

35:34

be a place where you know, you're supposed to have the

35:36

perfect the perfect life, the perfect body,

35:38

a real issue for teenage girls, which you and I have

35:40

talked about. We're really trying to go against

35:42

that. We ran a campaign that's very popular

35:45

UM on Instagram with real men and

35:47

women with real body types talking

35:50

about that. We've worked with the National Suicide

35:52

Awareness Lines on this. We're working with the W

35:54

H O on mental health. We're

35:57

also I think the answer is almost always

35:59

technology. So one of the things I think is

36:01

great. We have a comment warning now that

36:03

we've been rolling out, where our automatic

36:06

filters detect that you might be posting something

36:08

that's not nice. We will do a pop

36:10

up and say do you really want to post that? And

36:13

again we're seeing a tremendous response.

36:15

We also have abilities to restrict people to

36:17

prevent bullying, so that you know, if

36:19

someone were bullying you, you can restrict

36:22

them. They won't know you're restricting them, and

36:24

if they comment on your post, no one can see them.

36:26

And so these issues are real and we

36:29

have to work hard on building the technology

36:31

and that technology and the answers. There's

36:33

so many huge challenges and

36:35

how difficult is it, Sheryl, truly

36:38

to address any of these when solving

36:41

them in some ways works against

36:43

your business model. You know, one critic

36:45

said Facebook has priced itself

36:47

out of morality, and I'm

36:49

just curious if implementing

36:52

some of these changes is bad for business.

36:55

So on this, I'm really pretty proud of

36:57

our track record. If you look back a

36:59

number of years ago and you listen to

37:01

our earnings calls. So earnings calls are exactly

37:03

what people are worried about. They're directed at investors.

37:05

It's our quarterly report. If you actually watch us and

37:08

earning calls, we are spending as

37:10

much time talking about the measures we take

37:12

on safety and security as we are about

37:14

our business growth. Easily. We

37:16

actually said many quarters ago, this

37:19

is so important to us that we are going to make massive

37:21

investments and change the profitability

37:23

of our company by making real resource

37:26

investments. And we have to the tune of billions

37:28

and billions of dollars, and we will keep doing it. We've

37:31

taken action after action after action

37:33

that is better for protecting the community

37:36

than it is for our growth, and we're going to continue

37:38

to do that. Mark has said it over and over again. I

37:40

have said it over and over again. Let me ask you about

37:43

Mark testifying before the House Financial

37:45

Services Committee in a hearing focused on Facebook's

37:48

plans to launch a new digital currency called

37:50

Libra. Given the massive reach and

37:52

trust the public has experience with Facebook

37:55

selling personal information through

37:57

third parties, is it realistic to expect

38:00

the world to embrace a cryptocurrency

38:03

initiative like Libra given

38:05

that protecting personal financial

38:07

data really is next level

38:09

in terms of the need for

38:11

security. And I understand

38:14

you were supposed to testify, but you had kind

38:16

of a testy exchange with Maxine

38:18

Waters when you were up on Capitol Hill or

38:20

somewhere. Can you tell us what happened. We

38:22

have a lot of respect for Maxine Waters for the work

38:24

we've done, and we worked really closely with her committee.

38:27

It was her choice to have Mark testify, and that's

38:29

obviously something we respect. But what

38:31

happened between you just

38:33

answer the question, if you don't mind, on Libra. Um,

38:37

what we have said is that we are

38:39

working on a digital currency. I think it's

38:41

really important to think about how many people in

38:43

the world are not financially included

38:46

in the banking system. By the way, not a shock.

38:48

Most of those are women. Women

38:51

pay huge remittance fees. If you go to

38:53

work as a domestic worker in another home

38:55

in another country, you're sending

38:57

back money and you're paying larger fees if

38:59

you're a woman. And there are people who are unbanked. They

39:02

work in the fields and their

39:04

money can be stolen by anyone, and women are

39:06

the most vulnerable. So I think there are really

39:08

good reasons for a digital currency to exist,

39:11

and I think they will be good for a lot

39:13

of people. That said, we've been

39:15

very clear that we're not launching

39:17

this until we have regulatory approval.

39:20

It's not a Facebook project. The currency

39:22

itself is an international nonprofit

39:25

set up that we are part of. I know that we

39:27

wanted to have a moment to talk about

39:30

lean In and some of the research that you

39:32

have found about the discomfort men

39:35

feel mentoring and spending time

39:37

alone with women. This is something that

39:39

greatly concerns you. And

39:41

what can we do about the increasing

39:44

unwillingness of men to mentor

39:46

their female colleagues and tell us a little

39:48

more about that research. Well, it's really

39:50

important because look, the MeToo

39:52

movement, you and I have had a chance to talk about it,

39:54

is so important because women have faced

39:56

too much harassment for too long and I

39:59

think we're in a better place, but we're certainly not protecting

40:01

everyone we should. That said,

40:04

we have to worry about the unintended consequences.

40:06

So what our research shows, this is Lean In and

40:08

SurveyMonkey, is that sixty percent

40:11

of male managers in the United States

40:14

are not willing right now, are nervous

40:17

about having a one on one interaction

40:19

with a woman, including a meeting. We do

40:21

a show of hands in the audience. Who's promoted

40:23

someone you've never met with? Just

40:28

in case you can't see, there are no hands. If

40:30

you cannot get a meeting, you cannot get a

40:32

promotion. A senior man in

40:34

the world today is nine times more likely to

40:36

hesitate to travel with a junior woman, and

40:39

six times more likely to hesitate

40:41

to have dinner with a junior woman

40:43

than a man. So who's getting the travel? The men.

40:46

Who's getting the dinners? The men. And who's gonna get promoted?

40:48

the men? Which is what was happening before,

40:50

and talks a lot about that. It's

40:53

absolutely the case you promote the people

40:55

you know better now. I think everyone

40:57

should be able to do all of these things with everyone.

41:00

You should be able to have a meeting, keep the door open, and if

41:02

you want to, travel does not mean a hotel

41:04

room. Travel means a public airport. Dinner

41:07

does not mean your flat, dinner means a

41:09

restaurant. We have to be able to do all of

41:11

this. But what we really want men to understand is

41:13

that if you're not going to have dinner with women, don't

41:15

have dinner with men, group lunches

41:17

for everyone, make access

41:20

equal, because if we don't make access equal,

41:22

we're never going to move these numbers at the top, and women

41:24

today have seven percent seven

41:27

percent of the CEO jobs. Before,

41:30

before we go, I want to talk to you because we talked about

41:32

Lean In prior to Me Too, and

41:35

given the systemic failures of so many

41:37

organizations that we've seen that have tolerated

41:40

sexual misconduct and harassment

41:42

silenced women through NDAs, do you think,

41:45

in retrospect, given the very real

41:47

revelations that have surfaced

41:49

as a result of the Me Too movement, lean

41:51

in might have put too much of the onus

41:53

on women to change instead of getting

41:56

a lot of these screwed up companies to change.

41:59

Well, we've always done both. One of the problems

42:01

with the word lean in is you can really oversimplify

42:04

without actually reading the book itself. But if you read

42:06

actually what we've written and the work my foundation

42:08

has done. What we've always said is

42:10

that we wanted it to be okay for women to be ambitious,

42:13

and we want companies to change

42:15

and fix and it has to be both. It's

42:18

actually pretty interesting: if you say the sentence

42:20

he's ambitious, it's pretty

42:22

neutral or positive. He's going to get the job done.

42:25

She's ambitious. That's a negative.

42:27

And that is still true today.

42:29

If you look at the use of the word bossy.

42:32

You know, go to the playground anywhere, I

42:35

promise, LA or anywhere this weekend, and

42:37

you will see a little girl get

42:39

called bossy, her parents probably did it. And you walk up to

42:41

her parents and you say, big smile

42:43

on your face, that little girl's

42:45

not bossy. That little girl has executive

42:48

leadership skills. No

42:50

one says that. No one

42:52

says that because we

42:55

don't expect leadership from girls, and

42:57

so we have to fix that problem.

42:59

And that means companies have to change, culture

43:01

has to change, and women have to feel

43:03

free. Now they're really well. I

43:06

have one question that I

43:09

might ask, but it's time to wrap. Thank

43:11

you, Graham. Was that my

43:13

final question? Getting

43:15

back to all the controversies,

43:18

I mean Facebook, My

43:20

last question is I'm gonna gaun

43:22

us, but no, I'm curious because I

43:24

just wanted to end this conversation, Sheryl,

43:27

given all the controversy Facebook

43:29

is facing, clearly in the crosshairs,

43:32

I mean, the company people

43:34

love to hate. Since you

43:36

are so associated with Facebook,

43:39

how worried are you about your

43:41

personal legacy as a

43:43

result of your association with this company.

43:47

I think I have a really big responsibility here

43:49

for a company I love and believe in that.

43:52

I really believe in what I said about

43:54

people having voice. I really know

43:56

that when I was growing up, I had no ability

43:59

to reach anyone, and most people

44:01

in the world didn't, and social media has changed

44:03

that. There are a lot of problems

44:05

to fix, and we did a great job in this audience

44:08

talking about a lot of them in this interview. They're

44:10

real and I have a real responsibility

44:13

to do it. But I feel more committed

44:15

and energized than ever because

44:17

I want to fight to preserve the good. Because

44:20

I met a woman not so long ago

44:22

who for her birthday raised

44:24

four thousand dollars for a domestic violence

44:26

shelter that she volunteers at, and

44:29

crying, she told me I saved two women

44:31

from domestic abuse. I never could

44:33

have done that before Facebook, and so

44:36

there are really big issues to fix, but

44:38

I am so committed to

44:40

giving people voice and giving people a way

44:42

to react that I just want to keep doing the work

44:45

and stay committed. I feel honored to do it and committed

44:47

to fixing the problems. I want to fix

44:49

them. All right. Well, they're definitely

44:51

gonna kill me if I don't stop now. Sheryl

44:54

Sandberg, thank you. Thank you, thank you. After

44:59

we were done, Sheryl and I later exchanged

45:02

emails. She told me this was the toughest interview

45:04

she had ever done, but complimented

45:06

me on being so well prepared.

45:09

She was incredibly gracious about

45:11

the whole thing. Meanwhile, about a week

45:14

after our conversation, Twitter CEO

45:16

Jack Dorsey announced it was banning

45:18

all paid political ads globally.

45:21

Facebook, though, is still sticking with its

45:23

policy, at least for now.

45:26

Thanks so much for listening everyone. If a

45:28

weekly podcast isn't enough

45:30

of me, you can follow me on

45:33

social media Facebook, Instagram,

45:36

and Twitter. And if you feel like you're drowning

45:38

in a 24/7 sea of news and information,

45:41

sign up for my morning newsletter, wake

45:43

Up Call at Katie Couric dot

45:46

com because, as they

45:48

say, the best part of

45:50

waking up is Katie

45:52

in your inbox. Sorry,

45:55

Folgers, that was pretty bad, wasn't

45:57

it. Everyone, Thanks again for listening.

46:00

Everyone, and I can't wait to be in your

46:02

ear again next week.

46:10

Next Question with Katie Couric is a production of

46:12

iHeartRadio and Katie Couric Media.

46:15

The executive producers are Katie Couric, Lauren

46:17

Bright Pacheco, Julie Douglas, and Tyler

46:19

Klang. Our show producers are Bethan Macalooso

46:22

and Courtney Litz. The supervising

46:24

producer is Dylan Fagan. Associate

46:26

producers are Emily Pinto and Derek Clemens.

46:29

Editing is by Dylan Fagan, Derek Clements,

46:31

and Lowell Brolante. Our researcher

46:33

is Barbara Keene. For more information

46:36

on today's episode, go to Katie Couric dot

46:38

com and follow us on Twitter and Instagram

46:40

at Katie Couric. For

46:47

more podcasts from iHeartRadio, visit the

46:49

iHeartRadio app, Apple Podcasts, or

46:51

wherever you listen to your favorite shows.
