8. I Sung of Chaos

Released Monday, 25th March 2024

Episode Transcript


0:00

This is the BBC. This

0:03

podcast is supported by advertising outside

0:05

the UK. Blockchain,

0:11

NFTs, AI. What does this mean for

0:13

you and me? I'm Sherrell

0:16

Dorsey, host of the TED Tech Podcast,

0:18

where we bring you the latest innovations

0:20

and biggest ideas in tech. Tech

0:22

is evolving fast and it affects our lives, from

0:25

the metaverse to the watches on our wrists. You'll

0:27

learn why people in AI make good

0:30

business partners, about our future self-driving robo-taxi,

0:32

what the next generation of Siri, Alexa,

0:34

Google looks like, and a lot more.

0:37

Find TED Tech on Apple Podcasts,

0:40

Spotify, or wherever you listen. BBC

0:47

Sounds, music, radio, podcasts.

0:51

I remember waking up, the alarm clock woke

0:53

me up like it had done for days,

0:56

quite early, and

0:58

looking out the window and it was overcast,

1:02

crisp, bright, and

1:04

just wondering what the day might

1:08

bring. In my mind,

1:11

I had no doubt that

1:13

social media had played part in

1:16

Molly's death, but of course

1:18

we didn't know what the coroner would conclude.

1:23

On Friday 30th September 2022, the

1:27

family of Molly Russell arrive at the

1:29

coroner's court for one last time. Molly's

1:32

dad, Ian, her mum, Janet, her

1:34

sisters, and the family's legal team.

1:38

Ian and one of their lawyers from

1:41

Leigh Day solicitors, Merry Varnham, are frustrated

1:43

with the way the social

1:45

media companies, particularly Meta, have

1:47

behaved during the inquest. What

1:51

are those three D's that I was warned

1:53

about? Deny, delay

1:57

and deflect. I was told that that

1:59

would be the case: the approach of

2:02

the tech platforms. We

2:04

run our platforms safely, those

2:06

sort of denials. Delay

2:08

certainly because we struggle to get

2:11

any data at all from Molly's

2:13

digital life. As Ian

2:15

walks through the door he feels a

2:17

strange sense of relief. It's

2:20

almost over. Coroner

2:22

Andrew Walker reviewed thousands of pieces

2:25

of content. Some of the posts

2:27

Merry Varnham showed me in her

2:29

office. He's heard witness

2:31

statements read out in court, listened

2:33

to testimony from family and representatives

2:35

of social media companies. Meta's

2:38

head of safety and well-being Elizabeth

2:41

Lagone said she believed nearly all

2:43

of the content Molly viewed was

2:45

admissive. She called it admissive

2:48

that a hashtag for example

2:50

hashtag I want to die

2:52

wasn't violating because it was

2:54

somebody wanting to express their

2:57

own feelings and

2:59

a complete denial that it

3:01

was an unsafe place for a child. But

3:06

a child psychologist also reviewed the

3:08

material Molly saw online. He

3:10

told the court that he found it so

3:13

disturbing that he lost sleep for weeks. Every

3:19

day of the hearing there had been some journalists

3:21

and members of the public present but

3:23

today it's different. And

3:27

as we entered the courtroom I'd

3:29

never seen it so full. It was

3:32

pretty full most days. Lots of the world's press seemed

3:34

to be there most days but

3:36

there wasn't even standing room only. The people

3:38

were sitting on the floor. They'd taken every

3:40

vantage point to hear what the coroner had

3:42

to say and the sort of low hubbub

3:44

that existed

3:47

in such a way that it just stopped as

3:49

we all walked in. There

3:54

was an air of formality about it because

3:58

even though it's an inquest it still happens in a

4:00

court, but nonetheless it came

4:02

to a focus. Finally,

4:05

Andrew Walker arrives and the

4:07

court quietens to hear the

4:09

coroner's verdict. It

4:11

would not be safe to leave suicide

4:13

as a conclusion. She

4:16

died from an act of self-harm

4:19

while suffering from depression and

4:21

the negative effects of online content.

4:25

It is likely that the above

4:27

material viewed by Molly, already

4:30

suffering with a depressive illness and

4:32

vulnerable due to her age, affected

4:35

her mental health in a negative way

4:38

and contributed to her death in a

4:40

more than minimal way. It

4:48

doesn't feel like a victory for the

4:50

Russell family, of course, but

4:53

for Merry Varnham, the coroner's

4:55

conclusion is groundbreaking. For

4:57

a non-legal mind, what significance

5:00

does that language have? The

5:02

conclusion says that online harms

5:04

caused or contributed to Molly's

5:07

death. It was a material

5:09

factor in why

5:12

Molly came to die, when and how she

5:14

did. And has that ever

5:17

been concluded before in an inquest

5:19

in the UK? Not

5:22

that we're aware of. How

5:24

significant is that conclusion

5:27

for what's coming next?

5:29

My view it's a really

5:31

significant conclusion. It's often

5:34

compared to the first finding of

5:36

a coroner who found that

5:38

asbestos had played a role in her death

5:40

and that that led to sweeping

5:43

changes and better accountability

5:46

for hundreds and hundreds of victims.

5:49

And our hope is

5:51

that the conclusion that Molly's family

5:53

fought so long and strongly

5:56

for will have that sort of

5:58

impact. Later that

6:00

day, the magnitude of what's happened hits

6:02

Ian. He gives a press

6:05

conference at a nearby church hall. We

6:08

should not be sitting here. This should not happen because

6:10

it does not need to happen. But

6:16

we told this story in the

6:18

hope that change would

6:20

come about. And I hope the

6:23

digital world particularly will be a safer place.

6:28

And the final thing I want to

6:30

say is

6:34

thank you Molly for being my daughter. Thank

6:39

you. Maybe

6:46

there's a part of

6:48

me that thinks she

6:51

should have lived her life. She deserved to live her life. But

6:57

however sad it is that she's no longer with us, she's

7:00

still doing the good that she was

7:02

always destined to do. It

7:09

all started with a dream to connect the

7:11

world. But growth and

7:13

engagement caused problems. They

7:16

were calling action. Here's a snake.

7:18

Go and drink his blood. And

7:21

efforts to keep these platforms safe

7:23

were divisive. A senior

7:25

White House adviser went on television

7:27

to accuse me personally of being

7:30

biased against conservatives and

7:32

biased against Donald Trump. Now

7:35

a court of law has concluded

7:37

that online content contributed to Molly

7:39

Russell's death. After 20 years, is

7:43

Silicon Valley's radical experiment to connect

7:45

the world about to

7:47

implode? The

8:05

BBC Radio 4. This is

8:08

The Gatekeepers. I'm Jamie

8:10

Bartlett. Episode eight: I Sung

8:12

of Chaos. Around

8:19

the same time as the

8:21

Molly Russell inquest concludes, a

8:23

strange story is unfolding inside

8:25

Twitter HQ in San Francisco.

8:28

Well, I got a, I

8:30

got a note from a person at

8:32

Twitter, and came out to

8:34

Twitter. And really wasn't

8:37

told anything. Elon

8:40

Musk had only just bought

8:42

the company for forty four billion

8:44

dollars. Now, journalist Matt Taibbi

8:47

is sitting in one of

8:49

the Twitter meeting rooms wondering why

8:51

the world's third richest man has

8:53

called him in on a top

8:55

secret assignment. Matt's

8:58

a seasoned journalist who used to

9:00

be a political writer for Rolling

9:02

Stone Magazine. Now he runs his

9:04

own newsletter which he says gives

9:07

him the freedom to say what

9:09

he wants. Obviously he was picking

9:11

people who are not tied to

9:13

legacy media organisations and you know

9:15

there aren't a whole lot of

9:18

I would say, investigative reporters

9:20

working anymore. After

9:25

Donald Trump's Twitter account is

9:28

suspended in 2021,

9:30

tensions around censorship and what

9:32

is allowed on the platforms

9:34

reach fever pitch. Some on

9:36

the right believe that the

9:38

social media companies are using

9:41

their power to subtly push a

9:43

kind of establishment-friendly, liberal

9:45

worldview. Musk

9:48

gives Matt access to

9:50

hundreds of thousands of

9:53

internal Twitter documents: exchanges

9:55

between employees, discussions about

9:57

content moderation, minutes of

9:59

meetings, emails. It

10:01

was so weird, Jamie and me,

10:04

and I can't even tell you how

10:06

surreal the whole thing was. There

10:09

was a moment in the first days of this project

10:12

where there were probably 10 people in

10:15

a conference room and

10:17

Elon, sort of Fawlty Towers style,

10:19

kind of popped his head in the door at

10:21

one point and he said, anyone

10:24

need anything? Coffee? You

10:26

know? And then popped out. Over

10:28

the next few months, Matt sits

10:30

in a conference room at Twitter

10:32

HQ on Market Street, along with

10:34

a few other carefully selected journalists,

10:36

and pores through the documents. He

10:39

eventually posts his findings in a

10:41

series of very long Twitter threads

10:43

which become known as the Twitter

10:46

Files. The

10:49

fallout from the Twitter Files is

10:51

strange, because some people

10:53

think this is explosive, and

10:56

for others, it's nothing. On one hand,

10:58

you've got the New York Post blaring on

11:00

its front page today, Twitter scandal exposed. Then

11:02

there's the Rolling Stone, which calls it a

11:04

snooze fest. Even Fox

11:06

News went with the much

11:09

more neutral Elon Musk reveals

11:11

what led to Twitter suppressing

11:13

Hunter Biden's story. The libertarian

11:15

outlet Reason... Matt says the

11:17

documents show that government agencies

11:19

and Twitter work together, taking

11:22

down election misinformation, censoring conservative

11:24

views, even deleting true stories

11:26

about the side effects of Covid

11:28

vaccines. You

11:30

call it a scarier model

11:33

of digital censorship. Absolutely. This

11:35

system assumes that the public

11:38

is too stupid to handle the

11:40

material, and it basically

11:43

approaches content moderation with this idea

11:45

that we have to shape

11:48

the information landscape so that

11:52

the audiences will come to the

11:54

correct conclusion and aren't

11:56

exposed to truth that might mislead

11:58

them. It just

12:00

annihilates the whole concept of what free

12:03

speech is supposed to be for and free speech culture.

12:08

There's one person named more than anyone

12:10

else in those files, Yoel

12:12

Roth. Twitter's former head of

12:14

trust and safety who played a key

12:16

role in banning Trump. Yoel

12:19

says Matt Taibbi is pushing

12:22

highly misleading conspiracies. Yoel

12:24

acknowledges that Twitter did sometimes

12:27

remove true information, but

12:29

only, he says, if it was

12:31

tied to inauthentic behavior, like

12:33

if it was part of a

12:36

coordinated Russian campaign using fake accounts

12:38

to target and influence Americans. If

12:41

you look at the content of the Twitter files

12:43

themselves, what you see are people

12:46

debating difficult decisions. And

12:49

while the right-wing press argue

12:51

that Twitter is censoring prominent

12:53

conservative voices, Yoel is

12:56

adamant the company didn't. If

12:58

anything, it was the opposite. What

13:00

you see actually is every time

13:02

I am personally asked about moderating

13:05

Donald Trump or moderating a

13:07

piece of content from a Republican, I

13:10

push back on moderating the Republicans.

13:13

I advocate against censoring

13:16

conservatives, and I do that

13:18

not because of my political opinions, but

13:21

because the content did not violate

13:23

Twitter's written rules. The

13:25

best evidence that Elon Musk's

13:27

cherry-picked writers could come up

13:29

with are a bunch of

13:31

examples of me personally, the

13:33

censor-in-chief, not censoring

13:35

conservatives. The

13:38

companies, the Yoel Roths of the world,

13:41

can talk all they want about how, yeah,

13:43

we didn't agree all the time. Like, we

13:46

push back. As a journalist,

13:48

the whole idea of the American

13:51

speech system is to prevent the

13:53

government from doing that sort of

13:55

thing. Matt and Elon

13:57

Musk are no longer on speaking terms, but

14:00

Matt carries on writing about this

14:02

new form of secretive digital censorship.

14:07

I haven't seen all the documents that Matt was

14:10

given access to. Most

14:12

other journalists who've looked into the

14:14

reporting don't think it's proof of

14:16

anything sinister going on. Critics

14:19

say Matt Taibbi himself has become

14:21

a conspiracy theorist, pushing an agenda

14:24

of his own, something he denies

14:26

of course. But

14:28

maybe what really matters is that

14:30

large numbers of people believe it.

14:34

For some, the Twitter files

14:36

stoked the fires of a

14:38

grand Silicon Valley conspiracy to

14:40

control information and therefore

14:43

the American mind. Yoel

14:50

Roth left Twitter in late

14:52

2022. He now works as head

14:54

of trust and safety for the match group

14:56

that runs dating apps Tinder and

14:58

Hinge. He felt

15:01

that all the work he'd been doing

15:03

at Twitter for a decade was being

15:05

undone. When I left

15:07

Twitter, I tried my best to

15:09

de-escalate things with Elon. And

15:11

so the day that I resigned

15:13

when he and I spoke on the phone and he

15:16

tried to convince me to stay at Twitter and I

15:18

turned that down, I

15:20

made it a point of trying to part, if

15:22

not as friends, then at least neutrally. I

15:25

stressed that I was on his side, that

15:27

I was rooting for his success and for

15:29

Twitter's success, and I did this

15:31

because I didn't want him to come

15:33

after me. And

15:36

then he did. And in that

15:38

moment, the threat exceeded even

15:40

what had happened after Donald Trump went

15:42

after me. Thousands

15:45

of tweets, many of

15:47

them still live on Twitter, told

15:50

me that I should be executed, that

15:53

I should be hung, that I should

15:55

be thrown in a wood chipper. And

15:57

this wasn't just random. This

15:59

was... an attempt to keep me

16:01

from speaking about what my job

16:03

at Twitter had been and to

16:05

make an example of me. The

16:08

problem was they'd so comprehensively destroyed

16:11

my life that at that point there

16:13

really wasn't an incentive not to

16:15

keep speaking publicly.

16:17

X hasn't

16:19

replied to our requests for

16:21

comment. The

16:25

stories of Molly Russell and the

16:27

Twitter Files are really about the

16:29

same thing. A feeling

16:32

that some mysterious force decides

16:34

what we get to see,

16:36

that we're not in control

16:38

anymore. The right

16:40

are worried about Silicon Valley using

16:42

their immense power to push a

16:45

liberal worldview behind closed doors. The left,

16:47

that these massive companies just don't

16:49

care about the offline harms they're

16:52

causing to people like Molly Russell.

16:56

But both can agree they have

16:58

too much influence over our lives

17:00

and that something needs to be

17:02

done. On

17:10

January the 31st, 2024, the

17:13

bosses of Discord, Snap,

17:15

TikTok, X and Meta lined

17:17

up in Washington DC in

17:19

front of twenty or so angry

17:22

looking US senators, both Republicans and

17:24

Democrats. This hearing of the Senate

17:26

Judiciary Committee will come to order.

17:30

I thank all those in

17:32

attendance. These bosses had been called

17:34

to testify before a Senate committee

17:36

looking into online child protection. Mark

17:39

Zuckerberg and TikTok boss

17:41

Shou Zi Chew appeared voluntarily. Linda

17:44

Yaccarino of X and

17:46

Evan Spiegel of Snap only attended

17:48

after being sent government-issued

17:50

subpoenas. And

17:52

in the gallery behind something no

17:54

one had ever seen before: dozens

17:56

of parents holding up photographs of

17:59

their children. 11-year-old

18:02

Selena Rodriguez, who died by

18:04

suicide after being solicited for

18:06

sexually exploitative content by a

18:09

stranger on Instagram and Snap.

18:12

Mason Bogard, who died aged

18:14

15 after a TikTok choking

18:16

challenge. Jordan

18:19

DeMay, who killed himself

18:21

aged 17 after being scammed on

18:23

Instagram. Mr.

18:26

Zuckerberg, you and the companies

18:28

before us, I know you don't mean it to

18:30

be so, but you have blood on your hands.

18:32

There's not a damn thing anybody can do about

18:34

it. You can't be sued. For

18:37

too long, we have

18:39

been seeing the social media companies turn

18:41

a blind eye when kids

18:44

have... We can no longer trust

18:46

Meta and frankly, any

18:48

of the other social media to...

18:51

Mr. Zuckerberg, what the hell were you thinking?

18:54

A couple of hours in, Republican

18:56

Senator Josh Hawley waves his arms

18:58

at the grieving families in the

19:00

gallery and turns to Mark Zuckerberg.

19:03

There's families of victims here today. Have you apologized

19:05

to the victims? They're

19:08

here. Would you like to apologize for what

19:10

you've done to these good people? Mark

19:14

Zuckerberg stands up and faces the families.

19:17

He says, sorry for everything that's happened

19:19

to them. No one should have to

19:21

go through the things that your families

19:23

have suffered. The

19:32

story of modern social media started back

19:34

in 1996 when

19:37

Congressman Chris Cox and Ron Wyden

19:40

created Section 230, that

19:43

immunity law that meant the platforms

19:45

aren't liable for what we post

19:47

on them. It's

19:49

always been the foundation on which all

19:51

social media is built. Now,

19:54

I didn't realize that Chris Cox

19:56

and Ron Wyden weren't really motivated

19:58

by technology. They were looking

20:00

for a subject that could unite

20:03

Republicans and Democrats. And

20:06

it just happened to be this exciting

20:08

new thing called the Internet. My

20:11

name is Christopher Cox, but

20:14

please call me Chris. This

20:17

is back in the 90s, but it was

20:19

very, very clear to us that there

20:21

was no mixture

20:24

of thought between the two

20:26

major parties. And my

20:28

hypothesis was that politics

20:31

focused constantly on the same

20:33

old questions. But if

20:36

we focused on what I called green

20:38

fields, on problems on

20:40

the horizon, then we could

20:42

force people to think about problems

20:45

that they hadn't already solved, problems

20:47

that were new to them, and they'd have to

20:49

reason it through. So that was sort of a

20:51

general approach and the way we went. And

20:54

now politicians in America from both

20:56

ends of the spectrum have found

20:59

that greenfield subject again. Except

21:01

this time they have a very different

21:04

goal. And they're

21:06

even contemplating the unthinkable.

21:09

In 1996 we passed Section 230

21:12

of the Communications Decency Act. This

21:15

law immunized the

21:17

then fledgling Internet platforms

21:20

from liability for user-generated

21:22

content. For the

21:24

past 30 years, Section 230 has

21:26

remained largely unchanged. That

21:29

has to change. Thank

21:31

you, Mr. Chairman. The Republicans will answer

21:33

the call. All

21:35

of us, every one of us, is ready

21:38

to work with you and our Democratic colleagues

21:40

on this committee. It

21:42

is now time to repeal

21:45

Section 230. This

21:47

committee is made up of the ideologically most

21:49

different people you could find. I mean

21:51

we've found common ground here that just

21:54

is astonishing. For 25 years

21:57

this Section 230 immunity has

21:59

held fast. These

22:01

technology companies have powerful and

22:04

well-funded lobbying operations. If

22:06

I could just start with a little plain talk here

22:09

this morning. Big Tech is

22:12

the biggest most powerful lobby in the

22:14

United States Congress. They spend millions upon

22:16

millions upon millions of dollars every year

22:19

to lobby this body. And

22:21

the truth is they do it successfully.

22:24

They successfully shut down every meaningful piece

22:26

of legislation every year and I have

22:28

seen it repeatedly. We'll get all kinds

22:30

of speeches in committee. We'll get speeches

22:32

on the floor about how we have

22:34

to act and then this body will

22:36

do nothing. Why? Money. That's why. Gobs

22:39

of it. Gobs of it. Influencing

22:43

votes. A hammer hold

22:45

on this process. It is time for

22:47

it to be broken and the only way I know to

22:49

break it is to bring the truth forward

22:51

and that's why we are so glad that you are here

22:53

today to do it. Thank you Mr. Chairman. If

22:56

section 230 is ever

22:58

repealed, social media as we know it

23:00

might disappear. That

23:03

law passed when I was still a teenager.

23:05

I never thought about it before this series

23:08

but it's defined my life, my

23:10

relationships, my work, my politics.

23:14

In the same way whatever we

23:16

do next will influence the lives

23:18

of people especially younger people for

23:20

years to come because

23:23

however chaotic and confusing the online

23:25

world seems now it's

23:27

about to get even stranger. In

23:32

order to make America great and

23:34

glorious again I am tonight announcing

23:36

my candidacy for President of the

23:38

United States. The UK will probably

23:40

have a general election sometime this year

23:42

but we still don't know when. And

23:45

voting is underway in

23:47

presidential elections in Russia which

23:49

will almost certainly see Vladimir

23:51

Putin extend his quarter of

23:53

a century in power. 2024

23:58

is a big year for elections around the world,

24:00

you've probably already heard that, but

24:03

it's more than just elections, and

24:05

it's more than just this one

24:07

year. The harms of the

24:10

first human contact with AI haven't

24:13

been addressed, and it's gotten worse,

24:15

and it's going to get worse. Maria

24:18

Ressa, the Nobel Peace Prize-winning

24:21

journalist from the Philippines, tried

24:23

to warn the world about

24:25

election manipulation on Facebook in

24:27

2016. That experience turned her

24:30

from a passionate supporter of technology

24:33

to a skeptic, and then

24:35

an activist. She

24:37

thinks many of the problems she saw

24:39

nearly 10 years ago, emotional

24:41

manipulation and misinformation pushing people

24:44

to the fringes will be

24:46

made even worse by advances

24:49

in artificial intelligence, especially

24:52

something called generative AI,

24:54

where machines can produce

24:56

human-like content. Okay,

25:00

so what happens with generative

25:02

AI? If

25:05

they fed it social media, which

25:08

actually GPT-3, they admitted they

25:11

did, the garbage in is

25:13

garbage out. Generative

25:16

AI needs examples of human speech

25:18

to learn how to replicate or

25:20

mimic us, and

25:23

the biggest data set of human speech ever

25:26

created is on social media. What

25:30

are we using to train these AI? Everyday

25:33

human speech, or the speech

25:35

that social media incentivizes?

25:38

Character-limited, emotional, chasing engagement.

25:40

What if generative AI

25:43

becomes a self-perpetuating mirror of

25:46

our social media lives, filling

25:48

the internet with our own

25:51

dark impulses? It

25:53

can sound like it

25:55

is human, it

25:57

chats with you, but all it

26:00

does is pattern

26:02

recognition and it pulls it together.

26:04

In my words, it just

26:07

spews lies. Why

26:09

was this released into the public sphere?

26:11

Let me like moderate my anger and

26:14

like just go through right. So what

26:16

is going to happen with

26:18

elections in 2024?

26:21

If you do not have integrity

26:23

of facts, you cannot have integrity

26:25

of elections. This goes right

26:28

back to the vote, right? If we're

26:30

being manipulated through our emotions, we

26:32

change the way we look at the world and the

26:34

way we act through our

26:36

emotions. Early this

26:38

year, I was in Paris at

26:41

UNESCO and we counted

26:43

how many elections from 2023 to 2024.

26:45

There are nine

26:48

zero, 90, key

26:50

elections we need to look at. And

26:53

if you look at the patterns

26:55

as they stand and we don't

26:57

change anything significant. In

27:00

January, Taiwan will have elections.

27:02

February, Indonesia, the world's largest

27:04

Muslim population. You have the

27:07

EU, Canada, you have the

27:09

US elections coming up. The

27:11

last factoid is V-Dem,

27:14

which is a think

27:16

tank in Sweden. Last

27:18

year, they said that 60 percent

27:20

of the world is now under authoritarian

27:22

rule. This

27:25

year in January, that number went

27:27

up to 72 percent. Maria

27:31

Ressa was right in 2016. She

27:34

tried to sound the alarm that

27:36

our information ecosystem was in danger.

27:39

What if she's right again? We cannot

27:43

be insidiously manipulated because

27:45

if we are, then

27:47

democracy really, there's no shared

27:49

reality first. Democracy

27:51

can't stand this. Democracy

27:54

will fail. I mean, I

27:56

don't know what else to say. I feel like

27:58

really truly, a Cassandra, right?

28:00

And yet the

28:03

technology is

28:05

running rampant. I was

28:07

much calmer in 2016. 2024

28:10

will be a tipping point. Yeah

28:14

I sound really dystopian

28:16

so sorry. We

28:19

wind up our interview and Maria rushes

28:21

off to another meeting before flying back

28:24

to the Philippines. Speaking

28:27

with her makes me think

28:29

about how fast technology changes.

28:32

How long ago 2016 already feels. And

28:37

in a decade from now we'll look back to 2024

28:39

and feel the same way. The

28:48

medium is not something neutral. It

28:51

does something to people. It takes hold of

28:53

them. It wraps them up. It massages them.

28:55

It bumps them around. In the 1960s a

28:58

brilliant media theorist called

29:01

Marshall McLuhan was thinking about

29:03

the dominant new communication technology

29:05

of his age. The

29:08

television. He said

29:10

that electronic communications like the TV

29:12

would change the world. Change how

29:15

we saw ourselves. How we saw

29:17

each other. It

29:19

would create a new global village.

29:23

But there would be a clash between

29:25

the old world and the new. The

29:27

global village is at once as wide as

29:29

the planet and as small as a little

29:32

town where everybody is maliciously engaged in poking

29:34

his nose into everybody else's business. It

29:42

wasn't TV that gave rise to that clash.

29:45

It was Silicon Valley's promise

29:47

to democratize information and connect

29:49

us all. And

29:51

in many ways they achieved that. Social

29:54

media created the global village.

29:57

But someone had to run that village. And

30:00

that fell to its creators, a handful

30:02

of people, ambitious and

30:04

idealistic and young. They

30:07

became our new gatekeepers. Legislation

30:10

freed them up to run this village

30:13

in whichever way they thought best, and

30:16

they used a mixture of

30:18

engagement-based ranking, complex algorithms and

30:20

content moderation. That

30:23

business model made the tech bosses rich,

30:26

but left many of

30:28

us feeling confused, disorientated

30:30

and manipulated. For

30:32

20 years we've all been part of

30:34

this global village. We were

30:37

drawn in by the possibilities this

30:39

new digital age offered up. Then

30:42

it turned into something darker, more

30:44

chaotic than anyone could have imagined. I

30:48

think this radical experiment is finally coming

30:50

to an end, and

30:52

we're about to find out if

30:55

Silicon Valley's utopian dream has

30:57

become just another paradise lost.

31:19

The Gatekeepers is presented by me, Jamie

31:21

Bartlett. It was written by me and

31:24

Caitlin Smith. The producer is Caitlin

31:26

Smith. Research by me,

31:28

Caitlin Smith, Rachel Fulton, Elizabeth

31:31

Anne Duffy and Juliet Conway.

31:33

The executive producer is Peter McManus.

31:36

Sound design by Eloise Whitmore

31:38

and the composer is Jeremy

31:40

Warmsley. The story consultant is

31:43

Kirsty Williams and the commissioner is

31:45

Dan Clark. This was

31:47

a BBC Scotland production for BBC Radio

31:49

4. If

31:56

you are suffering distress or despair and

31:58

need support, including urgent

32:00

support, a list

32:02

of organisations that can

32:05

help is available at

32:07

bbc.co.uk/actionline. I'm

32:11

Aleks Krotoski. And I'm Kevin Fong.

32:13

How do you feel about AI?

32:15

Does it scare you? Very quickly

32:17

that question comes up, you know, is it going to think for

32:19

us? Does it excite you? I

32:22

say, how is the AI going to help us to

32:24

think better? Do you worry about how it'll change

32:26

your life? Your job? Your

32:28

kids? AI is built into many

32:30

of the software applications that we now use

32:33

in the schools every day. In every episode

32:35

of The Artificial Human from BBC Radio 4,

32:38

Kevin and I are here to help. We

32:40

will chart a course through the world of AI

32:42

and we will answer your questions. It

32:44

doesn't just lie, but it lies in an incredibly

32:46

enthusiastic, convincing way. That ability to be able to

32:48

kind of think critically is just going to be

32:50

so important as we move forward. The

32:53

Artificial Human with me, Aleks Krotoski.

32:56

And me, Kevin Fong. Listen

32:58

on BBC Sounds. What

33:29

the next generation of Siri, Alexa, Google looks like.

33:31

And a lot more. Find

33:34

TED Tech on Apple Podcasts, Spotify,

33:36

or wherever you listen.


From The Podcast

The Gatekeepers

Jamie Bartlett traces the story of how and why social media companies have become the new information gatekeepers, and what the decisions they make mean for all of us.

It's 20 years since Facebook launched and the social media we know today - but it all started with a crazy idea to realise a hippie dream of building a "global consciousness". The plan was to build a connected world, where everyone could access everyone and everything all the time; to overthrow the old gatekeepers and set information free.

But social media didn't turn out that way. Instead of setting information free, a new digital elite conquered the world and turned themselves into the most powerful people on the planet.

Now, they get to decide what billions of us see every day. They can amplify you. They can delete you. Their platforms can be used to coordinate social movements and insurrections. A content moderator thousands of miles away can change your life. What does this mean for democracy - and our shared reality?

It starts in the summer of love, with a home-made book that taught the counter-culture how to build a new civilisation - and accidentally led to the creation of the first social media platform. But a momentous decision in the mid-2000s would turn social media into giant advertising companies - with dramatic ramifications for everyone.

To understand how we arrived here, Jamie tracks down the author of a 1996 law which laid the groundwork for web 2.0; interviews the Twitter employees responsible for banning Donald Trump, who explain the reality of 'content moderation'; and speaks to Facebook's most infamous whistle-blower in a dusty room in Oxford. He goes in search of people whose lives have been transformed by the decisions taken by these new gatekeepers: a father whose daughter's death was caused by social media, a Nobel prize-winning journalist from the Philippines who decided to stand up to a dictator, and the son of an Ethiopian professor determined to avenge his father's murder.

Far from being over, Jamie discovers that the battle over who controls the world's information has only just begun.
