Rage Against the Machine

Released Friday, 3rd November 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

Ew, gotta get rid of this

0:02

old Backstreet Boys t-shirt. Tell me why!

0:04

Because it stinks, boys. Tell

0:07

me why! I've washed it so many times

0:09

but the odor won't come out. Tell me why.

0:11

No, you tell me why I can't get rid of this odor.

0:13

Have you tried Downy

0:14

Rinse & Refresh? It doesn't

0:17

just cover

0:18

up odors, it helps remove

0:20

them. Wow, it worked, guys. Yeah.

0:23

Downy Rinse & Refresh removes more odor in one

0:26

wash than the leading value detergent in three washes.

0:28

Find it wherever you buy laundry products. This

0:32

is a CBC Podcast.

0:37

Hi, I'm Nora Young. This is Spark. It's

0:40

safe to say the bloom is off the rose when

0:42

it comes to big tech. From data breaches

0:44

and privacy violations to exploitative

0:47

labor practices, the excesses of

0:49

the online platforms we use every day

0:51

are increasingly evident. But from

0:54

content moderators to Etsy sellers, workers

0:56

to everyday users, there are growing signs

0:59

of opposition to what's been called techno-feudalism.

1:02

So this time, rising up and learning

1:04

from the Luddites.

1:14

Have

1:17

you ever balked at a new technology but

1:19

then immediately made sure to defend yourself?

1:22

It's not like I'm a Luddite. I just

1:24

don't see the point of a smart thermostat. Or

1:26

maybe someone's called you a Luddite for,

1:29

say, refusing to get on TikTok or not

1:31

getting a smart thermostat.

1:32

Ugh. It's

1:34

commonly used as a derogatory term

1:37

for anyone opposed to any kind of technological

1:39

advancement thanks to the real Luddites,

1:42

British textile workers in the 19th century

1:44

known for destroying new factory machinery.

1:47

But...

1:48

They do not hate technology. They are

1:50

technologists and technicians themselves. They're

1:52

hands-on with the machines. And in fact,

1:54

they become Luddites because they understand technology

1:57

so well and the implications of how it's used

1:59

in different...

1:59

This is Brian

2:02

Merchant, tech columnist at the LA Times

2:04

and the author of Blood in the Machine: The

2:06

Origins of the Rebellion

2:08

Against Big Tech. Every

2:10

time that there's an advance just about, workers

2:13

recognize the way that it's going to affect

2:16

the structure of work or their

2:18

own livelihoods, and in many cases, they rebel

2:20

against it.

2:21

Understanding the real story of the Luddites

2:24

has powerful lessons in pushing back

2:26

against similar excesses by big tech

2:28

today. Prior to the Industrial

2:30

Revolution, the textile industry in England was

2:33

decentralized. Skilled individuals,

2:35

alongside their families and friends, worked

2:38

from home to create goods, a

2:40

structure that was threatened when automated machines

2:42

and factories emerged.

2:48

You know, the Industrial Revolution doesn't just explode

2:50

all at once at the turn of the century

2:53

in the early 1800s. It comes in

2:55

fits and starts over the 1700s

2:58

when there's the spinning jenny that sort of automates

3:00

how you can spin yarn. And that

3:03

is protested. So, there are little

3:06

outbursts here and there, but it's

3:08

not until we see mechanization

3:10

reach more of a critical mass, and

3:15

it's adopted by more entrepreneurs,

3:19

we'd call them today, who are looking

3:21

to sort of really maximize efficiency,

3:24

organize work into sort of the factory mode,

3:26

and there's a number of other things going on in the early

3:29

1800s. There's a trade

3:31

depression that's associated with sanctions

3:34

that England has put on any allies

3:36

of France, because the Napoleonic wars

3:38

are going on, and there's a trade

3:40

shortfall as a result of that, and

3:43

then there's a crop failure that leads

3:45

to high food prices, and

3:48

then sort of a lot of the entrepreneurs use this opportunity

3:50

to kind of hit the gas on automation

3:53

and buy some of this automating machinery

3:55

that can do the work of those skilled

3:58

tradesmen twice as fast, six

4:00

times as fast, you know, often much

4:03

shoddier quality, but they can produce

4:05

more, they can begin to do mass production. And

4:08

it's then in 1811 or so when

4:10

sort of all of these different trajectories

4:13

come together and you have this perfect

4:15

storm and the cloth workers finally

4:18

rise up after they had spent the last 10

4:20

years really pushing Parliament to say, hey, you

4:22

got to protect our jobs. After enough

4:24

was enough, they became Luddites as a tactic

4:26

of last resort.

4:33

I mean, I have to say, Brian, I thought I knew the real

4:35

story of the Luddites, but I did not know that Ned

4:37

Ludd may not even have been a real person.

4:40

So tell me what we know about Ned Ludd

4:42

and how he became this legendary figure.

4:44

Yeah. So the cloth workers sort of adopted

4:47

this avatar. It's kind

4:49

of almost like a meme, Ned

4:51

Ludd, who was this probably

4:54

apocryphal figure who

4:56

was an apprentice weaver who

4:58

didn't like the work of weaving and

5:01

his master was forcing him to work harder and

5:03

harder. And eventually he

5:05

said, I won't work. The

5:07

magistrate then had him whipped

5:10

at his master's behest, which threw

5:12

him into a rage and he smashes the machine and

5:14

flees into Sherwood Forest. You know,

5:16

it's a legend, it's a myth, and it first

5:19

sort of is printed only after

5:21

the Luddite uprisings begin.

5:23

So he's this figurehead and

5:26

the Luddites use

5:28

him as sort of a symbol and

5:30

as also a tactical tool. So what they'll

5:32

do is they, to that entrepreneur

5:34

who's got a hundred machines that are automating

5:37

jobs, they'll write them a letter and say,

5:40

we know you have 100 of the obnoxious

5:42

machines. If you don't take them down, you will get a visit

5:44

from Ned Ludd's army. Then they'll sign it General

5:47

Ludd. If the entrepreneur complies,

5:49

well, they won't, they'll leave him alone. If

5:52

he doesn't, then the Luddites do what

5:54

Luddites became famous for, which is slipping

5:57

into the factory with a giant sledgehammer

5:59

and smashing. Just the machines that

6:01

are automating work. Just those machines.

6:06

Yeah, can we talk about that? Because we think of the Luddites

6:08

as sort of mindlessly smashing machines,

6:10

of course, but how did they actually pick their targets?

6:13

Yeah. So the Luddites pick their targets

6:15

because there are certain machines that

6:17

can be used to either

6:20

devalue their jobs, degrade their

6:22

wages, or attempt to sort of replace

6:25

them with child laborers altogether,

6:28

basically. So these machines

6:30

are doing three things. They're automating

6:32

production. They're reducing the quality

6:35

of the goods that are coming out of

6:38

the region, of the industry.

6:40

So it's basically dinging

6:43

the reputation of all of these tradesmen and

6:45

the amount that they can ask for for their own high

6:48

quality stuff. So it's turning out

6:50

cheap, low quality stuff. So

6:52

any machine that's doing that, there's a handful

6:54

of them in different contexts and different regions.

6:57

And that's what made Luddism so interesting is that

6:59

it was adaptable even miles

7:01

and miles away depending

7:04

on what sort of thing you wanted to champion

7:07

as your cause. So gig mills,

7:10

the wide frames that would allow

7:13

sort of stocking knitters to quickly

7:15

make stockings in two pieces and then

7:18

they could just kind of slam them together and they were shoddy

7:20

and they'd fall apart. And the knitters who

7:22

are the biggest sort of industry

7:25

in Nottingham at the time, they hated

7:27

this machine because it did both those

7:29

things. It automated production. They could throw them out of

7:31

work, reduce the amount that bosses

7:33

could pay them and just ruin the goods

7:35

and the reputation for the goods that they were

7:37

making and you could hear them. That's

7:40

the thing. The entrepreneurs knew that they were unpopular.

7:43

So they would try to not tell anybody that they were

7:45

using these machines and they'd hire children to run

7:47

them. But the Luddites knew what they sounded like. They would

7:49

make this loud clanking noise so they could

7:51

identify and then they would slip in

7:53

and just smash those machines. That's

7:56

the machines that were degrading conditions.

7:58

You know, there's a language

8:00

of automation but in fact, whether it's children

8:03

or deskilled workers, people were still

8:05

required to essentially make sure that the machines worked

8:08

okay and they still had jobs in many cases

8:10

but they were just these very deskilled, low

8:12

wage kind of jobs. Yeah, 100%. It's

8:15

just kind of the

8:16

enduring myth of automation that the

8:19

worker will go away and

8:21

it's just the machine and you can have

8:23

this great system that's just producing stuff.

8:26

Well, no, it's more like a transference and

8:29

that was true at the time and it's true today.

8:32

You know, the skilled workers,

8:35

you know, demanded more money for the work that they

8:37

did. So if you have a machine

8:39

that can churn more out, you need to

8:41

sell more to make up your margins

8:45

but it's not an automaton really,

8:47

it still needs to be, as you said, managed

8:50

by a worker. So

8:52

they would fill the factory with

8:54

unskilled workers, undercut the skilled

8:57

workers on wage and

8:59

then leave the sort of those

9:02

who would become Luddites with few options.

9:04

You could either go into the factory but a lot

9:06

of times they didn't want the skilled cloth workers

9:09

in the factory because they knew the trade too

9:11

well and they were proud and they

9:13

weren't as malleable or pliable or

9:16

abusable really. I mean, the children,

9:18

of course, in the Industrial Revolution are

9:20

subject to tragic circumstances and

9:23

they really kind of forecast the

9:26

future of the next few decades of

9:28

what work was going to be like in the factories and the Luddites

9:30

really wanted to stop that.

9:31

Yeah. So how were the Luddites viewed

9:34

by various strata of society at the time?

9:37

So among other working people, the

9:39

Luddites were, especially in the beginning,

9:41

hugely popular. They were the Robin Hoods of

9:43

the day and that's why they use this

9:46

moniker, Ned Ludd. It sounds

9:48

a lot like Robin Hood, Ned Ludd, Robin Hood, Ned Ludd

9:51

and Sherwood Forest

9:54

is around them. So there's this tradition of dissent

9:56

that they're plugging into and the

9:59

myth and the sort of crusade

10:01

that they were on really worked quite well

10:03

and people cheered them in the streets

10:05

as they smashed machines. You

10:07

know some sympathetic officials would

10:10

just kind of stand by and let them do it because they

10:12

sympathized with the Luddites. Now

10:14

the British crown was not so thrilled

10:16

with it and neither were the factory owners. So pretty

10:20

quickly they move to make

10:24

frame-breaking or machine-breaking a crime

10:26

punishable by death. Parliament

10:29

kind of pushes this through and sort of

10:31

an interesting aside: Lord Byron is

10:33

coming up as a Lord at this time for the

10:35

first time, and he gives his maiden

10:38

speech to Parliament in defense

10:40

of the Luddites, trying to prevent

10:43

this bill that would make machine-smashing a

10:46

capital offense, but it

10:49

doesn't sway enough people. The law goes into

10:51

effect. The crown deploys the military.

10:54

There's just tens of thousands

10:56

of troops and militiamen and mercenaries

10:58

that are camped out at the factories, you

11:01

know, ready to fight the Luddites. It's the biggest domestic

11:04

occupation of England in

11:06

history to that point. It's

11:08

really, you know, looking a lot like

11:11

kind of a civil war and of course the

11:13

most powerful of that strata

11:16

are really, you know,

11:18

working with the British crown and it's

11:20

one of the first times that we see this sort of alliance

11:23

of the state and industry

11:26

sort of aligning against workers to forcefully

11:29

put them down, which is eventually what happens

11:31

to the Luddite rebellion. Yeah, but

11:33

to what extent do you think the Luddite movement was not

11:35

just about the machines but about the emerging

11:37

factory system itself? I

11:40

mean, I think it was more about opposing

11:42

the emerging factory system, more about opposing

11:45

the exploitation that that enabled,

11:48

more about opposing a

11:51

system that they, quite

11:53

correctly in my opinion, saw as

11:56

engendering inequality

11:58

and more poverty. The machines

12:01

were, I mean, they enabled

12:03

this sort of transfer and this

12:05

evolution of work and the entrepreneurs

12:08

were using this machinery to

12:10

this effect, but it wasn't the

12:12

machinery itself that was

12:14

the source of protest. It was, again, how

12:16

it was being used. If there was a way

12:19

that, you know, all of the cloth workers

12:22

could have sort of banded together

12:24

and collectively decided how best to use this

12:26

machinery and, you know, maybe it would save them some work,

12:29

maybe there would be cases where it would be good for, maybe there

12:31

would be other ones where they would want to leave it alone

12:33

and not use it to make a certain garment or

12:35

a bit of cloth. Then you

12:37

can imagine an alternate scenario where

12:40

technology advances without

12:42

causing this huge rift between

12:45

the industrialists and the workers

12:47

who really feel like they're being exploited.

13:02

I'm Nora Young and today we rage against

13:04

the machine. Well, maybe we

13:06

don't, but we're certainly talking about the history

13:08

of rebellion against automation from the Luddites

13:11

to today. Right now, my guest is LA

13:13

Times tech columnist and the author of Blood

13:15

in the Machine, Brian Merchant. The

13:20

subtitle of your book,

13:20

Brian, is The Origins of the Rebellion Against

13:23

Big Tech, and part of the argument in the book

13:25

is that there are parallels between those early

13:27

industrialists and today's big tech titans

13:30

and the big tech platforms that

13:32

dominate the tech scene now. So can you spell

13:35

out where you see those parallels? Yeah,

13:38

it really starts with, again, this mode

13:41

of technological development where,

13:44

you know, somebody like Richard Arkwright, who I

13:46

kind of name as the first tech titan,

13:49

wasn't really a great inventor.

13:50

It comes out later that

13:53

machines that he patented were, you

13:55

know, somebody else's and he gets his patents invalidated,

13:58

but he invents, quote unquote, well, you

14:00

know, this device called the water frame, it's

14:02

kind of like a great big wheel

14:05

that you can put next to a stream

14:08

and it will produce yarn with

14:10

water power. It's sort of an advancement of the

14:12

spinning jenny and it can produce huge

14:14

volumes of yarn. And his major

14:17

innovation though was that he was

14:19

willing to sort of break

14:23

the laborers, break

14:25

their will into working

14:27

in this brand new mode of production

14:29

which was a factory. Like you know, it wasn't

14:32

a natural or normal thing. I mean, there's a reason that

14:34

they relied on so many children

14:36

and vulnerable populations because

14:40

they didn't have the wherewithal to sort of resist

14:42

this new awful

14:45

seeming mode of work. I

14:47

mean, the Luddites, we talked about in the

14:50

beginning about how they worked at home and they had all

14:52

this autonomy and then all of a sudden

14:55

you're being organized into

14:58

a grid of workers where you're tending a machine

15:01

inside where you can't take breaks

15:03

unless the overseer tells you you

15:05

can and you have to stand at their command. So Richard

15:08

Arkwright sort of institutes this new model

15:10

and that's kind of what his major contribution

15:14

is, is getting this new mode of

15:17

division of labor instituted and

15:19

using sort of that power and

15:21

that will. So I kind of, he's kind of an amalgam

15:23

I say in the book of somebody like Steve

15:26

Jobs who kind of takes these ideas that are out there

15:29

and, you know, patents them under

15:31

his own name, pushes them out into

15:33

the mainstream. You know, Steve Jobs always said, you

15:35

know, great artists don't borrow,

15:37

they steal, paraphrasing Picasso.

15:40

And then someone like Jeff Bezos who's really pushing the envelope

15:43

and seeing how productive people

15:45

can be. You know, neither one

15:48

are, they're great businessmen but they're

15:50

not great inventors or technologists. So this

15:53

route sort of of this model where these

15:55

are the sort of the folks that we

15:57

tend to celebrate in

16:00

pop culture and in the annals of entrepreneurship

16:04

are really taking a page out of the playbook

16:06

of these early tech titans and that

16:08

conflict that gets rooted right

16:11

then and there I think is one that we're

16:13

still seeing reverberations of

16:16

today where Uber and Lyft

16:18

and Amazon where the

16:20

idea isn't necessarily so novel, you

16:23

know, hailing a cab on your phone is not

16:25

that different than calling your cab

16:27

with, you know, with your phone. Amazon,

16:30

you order the product on a website

16:32

instead of going into a store but technology

16:34

allows entrepreneurs,

16:37

allows tech titans to sort of argue

16:40

that the old rules don't apply where

16:42

those norms and standards and worker protections

16:45

and all those kind of things have evolved to keep pace

16:47

with industry. Technology can

16:49

do great and wonderful things but it can also

16:51

sort of be an excuse to say like, well,

16:53

this isn't a taxi company, this is a software

16:55

company so we don't have to pay attention to the municipal

16:58

taxi code. No, no, no, it's a peer-to-peer

17:00

service that you're just plugging in connecting

17:02

with an independent contractor. No, it's totally different

17:05

and then they can just throw decades

17:07

of, you know, legal protections

17:09

out the window as a result. Yeah. So,

17:12

you're a technology journalist in daily life.

17:15

Why do you think we reify technology

17:17

like this? So, it's like the technology

17:19

is the actor, the technology is the economic force

17:22

and not the social relations or the business model

17:24

or the power structures that are at play

17:27

underneath the technology or alongside the technology.

17:30

Yeah. At the top, I would say, you know, technology is

17:32

exciting, you know. It is very

17:35

much human nature to create and

17:37

to innovate and to build

17:39

new things and so the

17:42

things that we do come up with and create, we do

17:44

want to celebrate that and that does feel like,

17:46

you know, that's one of the best indicators of how

17:48

we're progressing as a, you know,

17:51

as a society or even as a species, all this

17:53

new stuff that we're making that we weren't able to do a generation

17:56

ago. However, the...

17:59

industrialists or the tech titans

18:02

are keenly aware of that aura that technology can

18:04

create. And

18:10

so is everybody who's ever hoped

18:13

to profit off of selling it. So it

18:15

can also be used as a force to

18:18

trample over our better intuitions or our ability

18:23

or willingness to question whether

18:26

or not something that is an

18:28

advancement is actually good for us as a society

18:30

and it has historically been that way

18:33

for 200 years. We see all

18:35

these instances of people being kind of

18:38

shouted down when a new technology arises

18:40

and they stand up to protest. And

18:42

that's what happened from the beginning

18:44

of the Luddite rebellion. They were painted

18:47

as backward looking, as technophobes,

18:49

as deluded, as people who knew not

18:52

what they did and that was very

18:54

deliberate. It was almost kind of early

18:56

propaganda that benefits

19:00

the people who are making the technology and who

19:02

would rather not have questions asked about it,

19:04

who would rather not have pieces of

19:06

it questioned or put

19:09

under the microscope. So

19:11

there's a great quote that I include in the book

19:13

from Theodore Roszak, the cultural critic, that if

19:16

the Luddites didn't exist, then their critics

19:18

would have to invent them. It's very useful

19:21

to have this boogeyman that you can

19:23

point to and say, oh, you don't want to be like that. Yeah.

19:26

But what about the argument that you can

19:29

look back

19:29

through the history of technological change and yes,

19:31

it's eliminated jobs, but it's also created new

19:33

jobs and increased productivity and

19:35

maybe that applies now with automation

19:38

and generative AI, et cetera, et cetera. Yeah.

19:42

To me, this is a very deterministic

19:45

way of thinking. It indicates

19:47

that it had to be that way. Could

19:50

we have advanced technology without

19:53

immiserating hundreds of thousands

19:55

of children and workers and migrants

19:57

and women? Could we have

19:59

moved,

19:59

you know, the technological needle forward

20:02

without doing all that? You know, I don't

20:04

think I'm too much of an optimist in saying absolutely

20:07

we could have. And it was, as you mentioned

20:09

earlier, more about the context

20:11

and the social relations and the power structures

20:14

in which technology was being developed. Yeah,

20:16

so there's no reason that we can't have

20:19

technology developed that is not, as

20:21

the Luddites would have said, hurtful to commonality.

20:24

If we have more democratic

20:26

inputs into how we build technology

20:29

if more people are given more say, it's

20:31

going to adversely affect fewer people.

20:34

Since, you know, the Luddites'

20:36

time, we have had this model where

20:38

the people with the most money, the most power, you

20:40

know, it's a small, unrepresentative

20:43

sliver of people, they get to call the shots

20:46

on how technology is developed and rolled

20:48

out and affects people.

20:50

It's still us in Silicon Valley today. You

20:52

know, we have these venture capital firms who can

20:55

funnel hundreds of millions of dollars to

20:57

the firms or the startups, the

20:59

founders of their choosing, often

21:02

it's people who look and think a lot like

21:04

them, and then they get to develop the technologies

21:07

and if we don't like it then

21:09

we have to play this constant game of

21:11

sort of rearguard action. You know,

21:13

like look at what happened with Facebook, you know,

21:15

it's taken over the world. It's got billions

21:18

and billions of users. Its motto

21:20

was move fast and break things. Mark Zuckerberg:

21:22

It was just let's get it out there and ask questions

21:25

later and we've done incredible harms

21:27

to society, some benefits too. But

21:30

think about if it was rolled out in a way that wasn't

21:32

just so top-down, antidemocratic,

21:35

maybe we could have avoided some of that.

21:37

Yeah, but I mean, in the book it seems like you think

21:39

that grassroots organizing is

21:41

more promising than government regulation.

21:44

But you know, these global platforms are so

21:46

much larger than 19th-century factories.

21:49

Like, isn't there an argument that we need, say,

21:51

a larger international regulatory

21:54

change? Oh,

21:55

absolutely. I think it's both. I

21:57

think neither will happen in a vacuum. So

22:00

grassroots organizing can demonstrate

22:03

support for regulatory change. So

22:05

I think it's kind of an all of the above approach

22:08

we need to take. I think more

22:11

tech companies and workers affected by tech companies

22:13

should be organizing and trying

22:16

to fight for their rights

22:18

on the ground. And we just saw a great

22:20

example of how successful that can be with the

22:22

writers in the WGA, who

22:24

just won a big victory against the

22:26

studios. And they won control

22:29

over how they use AI in

22:31

the labor process there. So the studios wanted

22:33

to say like, oh, we can use AI however we want,

22:35

you know, like, let's just keep an open dialogue

22:38

about it. And they said, no, how about in

22:40

our contract, we say, the studios don't

22:42

get to use AI to write scripts at all.

22:45

If AI is going to be used, then we'll determine how that

22:47

is. And they actually won

22:50

that victory, which was huge.

22:52

And I consider that a Luddite victory

22:55

because they did it by rejecting an exploitative

22:57

use of technology and holding that line

22:59

and saying no. And then now they have

23:01

a much better situation where

23:03

they have more control over how they do or do

23:06

not want to use AI. So I think that's a model for

23:08

on the ground organizing. I do think we got

23:10

to fight to break up some of these big tech companies. And

23:12

I think we do need better regulatory

23:14

control over a lot of this stuff.

23:17

Right. But what about consumers

23:18

and all this? I mean, people know these

23:20

things about big tech. They know about

23:22

the working conditions. They know about the limits of gig

23:24

work. But people seem fine about taking Ubers

23:27

and ordering from Amazon and so on.

23:29

Yeah, it's a really hard battle to

23:31

fight because these tech companies

23:33

have become so ubiquitous.

23:36

And again, that just speaks back

23:38

to the anti-democratic

23:40

model of tech development I was talking about because you

23:42

could not have a better example

23:45

than Uber, which has

23:48

not ever really been profitable. Maybe

23:50

in the last year it's had some glimmers

23:52

of profitability, but it had 10 years

23:55

of being bankrolled

23:57

by huge war chests of venture

24:00

capital. It didn't have to be a business. Throughout

24:03

most of human history, if you want

24:05

to grow as a business, you have to prove that you

24:07

can turn a profit. Uber didn't have to do that

24:09

until it had saturated

24:11

the market so extensively. So

24:13

it had become basically too big to

24:15

fail. People rely on it to get to work. There

24:17

are places without great public transit, especially

24:20

here in the states where you

24:23

want to get somewhere in LA. It's really hard

24:25

to do it unless you've got a car. Amazon,

24:28

very much the same. Amazon didn't turn a profit for

24:31

a really long time. They just focused on relentless

24:33

expansion. So it's there. It's in

24:35

the fabric of society. We have to sort

24:37

of figure out what we want to do

24:39

with that. I wish we could see

24:42

more sustained sort of backlash.

24:44

I personally don't use Amazon

24:47

because I do think at this point, it

24:49

is a non-ethical company

24:51

but I also don't think consumer boycotts are

24:54

the answer. They've insinuated themselves

24:56

too deeply into the fabric of modern society.

24:58

So the answer isn't just to shame people for

25:01

relying on these tools

25:03

and systems that have become so

25:05

commonplace. The solution

25:07

I think is to fix them. Yeah. Finally,

25:09

just ultimately, what do you hope we learn from the history

25:12

of the Luddites? I hope

25:14

we learn that it is absolutely

25:16

okay to stand up and say no

25:19

when you see a technology that

25:22

is being used by management or by

25:24

a boss to exploit you

25:27

or your working conditions. It's okay

25:29

to resist technology and

25:32

more and more people are taking this

25:34

page out of the Luddite book. For so long, we've

25:36

just seen Silicon Valley

25:39

as these sort of champions of progress

25:42

and innovation and we haven't been

25:44

good at questioning everything that they've done. That's changed

25:46

in the last few years but I think we can get

25:48

even more pointed about it.

25:50

I think we can push back further. We

25:52

can demand much more of a say in how

25:55

we want technology to shape our lives and

25:58

the future we want it to help build.

25:59

Thanks so much for your insights on this. It's a really great book.

26:02

Thanks, Nora. I really appreciate it.

26:05

Brian Merchant is the LA Times tech columnist

26:07

and author of the new book, Blood in the Machine.

26:27

You are listening to Spark. This is

26:30

Spark. This is

26:32

Spark. This is

26:36

Spark.

26:37

This is Spark. This is Spark. With

26:42

Nora Young on CBC Radio.

26:47

Hello, I'm James

26:49

Milton. For 15 years, I produced

26:51

the Vinyl Cafe with the late, great Stuart

26:53

McLean. Every week, more than 2

26:55

million people tuned in to hear funny, fictional,

26:57

feel-good stories about Dave and his family.

27:00

We're excited to welcome you back to the warm

27:02

and welcoming world of the Vinyl Cafe with

27:04

our new podcast, Backstage at the Vinyl

27:06

Cafe. Each week, we'll share two

27:08

hilarious stories by Stuart. And for the

27:11

first time ever, I'll tell you what it was like behind

27:13

the scenes. Subscribe for free wherever

27:15

you get your podcasts.

27:18

I'm Nora Young, and this time on Spark, we're talking about

27:20

fighting back against big tech, what

27:23

we have to learn from the past, and the current

27:25

dissatisfaction with how the tech platforms

27:27

so many of us rely on are governed. One

27:30

of the ways that dissatisfaction is expressing

27:32

itself is through union drives. We've seen

27:35

moves

27:35

for unionization amongst Uber drivers,

27:37

Amazon employees, and food delivery workers.

27:41

In May of this year, over 150 content

27:43

moderators came together in Nairobi to

27:45

form the African Content Moderators Union.

27:48

Members include current and former workers of

27:50

third-party moderation contractors who

27:52

provide services for companies like OpenAI,

27:55

Meta, and TikTok. We

27:57

tried to gather... with

28:01

the people who had gone through the same encounter

28:04

within the same organization. And

28:06

so there was that need to

28:08

voice our frustration. And

28:10

so out of that we decided

28:13

to form a union. This is Richard

28:15

Mathenge, one of the lead organizers of

28:17

the African Content Moderators' Union. He's

28:19

a former content moderator who worked on

28:21

the creation of ChatGPT through Sama,

28:24

a company OpenAI outsourced the work

28:26

to. We were actually training

28:28

the chatbot to work

28:31

with toxic messages or toxic

28:34

pieces of text so that the

28:36

people who would interact with

28:38

the platform much later will have an

28:41

easy time in as far as their encounter

28:43

is concerned. That means because

28:45

of the work of Richard and his fellow moderators,

28:48

whenever you or I use ChatGPT,

28:51

we aren't subjected to racist, sexist

28:53

or violent content. Had it

28:55

not been for the labor and

28:58

because of the effort and the sacrifices and

29:00

the commitment that we put in on

29:02

a daily basis, we

29:04

will not be talking about ChatGPT

29:07

as of now. Those sacrifices

29:09

were enough to spur the action of forming a union.

29:12

And it makes sense that it happened

29:14

in Kenya, which has a booming tech sector, both

29:16

in homegrown Kenyan tech companies and

29:18

as the place where tech giants like Google, Amazon

29:21

and Microsoft have set up their African

29:23

headquarters. But that's also raised

29:25

questions about what equitable work conditions

29:28

there ought to look like. The

29:31

moment we introduced

29:33

ChatGPT, there was no euphoria,

29:36

there was no excitement. Unfortunately,

29:41

during our stay, it was not that worthy

29:44

as we expected or as

29:46

we anticipated. I will see

29:49

my brothers and sisters being frustrated

29:51

on a daily basis with

29:53

the excitement that was there before

29:55

when they were starting the project. We're

29:58

simply deteriorating and figuring out what we need to do. away

30:01

on a daily basis. And so I

30:03

could tell that they were traumatized

30:05

because of the text messaging that

30:07

they were reading day in day out. So

30:10

I tried to use my diplomatic skills

30:12

to reach out to their management and remind

30:15

them of their commitment to providing

30:17

a conducive environment including psychiatric

30:20

assistance for my brothers and sisters.

30:23

Unfortunately, commitment was not

30:26

there too. So I

30:28

felt I needed to do more.

30:32

There were also concerns about the amount of

30:34

money

30:34

the moderators were receiving for this difficult

30:36

work. According to the Wall Street Journal,

30:39

workers on the OpenAI ChatGPT

30:41

project were paid an average of between $1.46 and $3.74 per

30:43

hour US, citing a Sama spokesperson. But

30:49

Richard says that when you factor in things like remittances

30:51

workers sent home to family, there wasn't

30:54

much left.

30:55

Remember, these are individuals who are

30:58

breadwinner. They have their

31:00

families. Some of them have been raised

31:02

by single mothers. And respectfully,

31:06

they were required to, you know, out

31:08

of love, reach out to their

31:10

parents and their single mothers and

31:13

tell them, you know, you educated me all

31:15

through my school life. And this

31:18

is just a token to say thank you. But

31:22

it was not enough. It was not, even

31:25

when you're sending something back at home, you

31:28

are almost left with nothing

31:30

at the end of the month. It

31:32

was not a rosy affair. Yeah.

31:34

I know that at that union meeting in the

31:36

spring, it included workers from YouTube,

31:39

TikTok, Facebook, as well as

31:41

OpenAI. So, you

31:43

know, how many different types of workers could potentially

31:46

be in this union?

31:48

So, at the start,

31:50

it was 150 individuals drawn

31:54

from different AI

31:56

organizations within the city.

31:59

But as we have moved along,

32:02

we are speaking of almost 400 individuals.

32:05

And this is our graduate

32:07

ancestors from respective

32:10

AI organizations all over the country.

32:13

So right now we are speaking of an inventory of

32:15

about 400 individuals. Wow.

32:18

I understand you and others also approached

32:21

Kenyan Parliament. Can you tell me about that?

32:23

Yes. So we

32:26

approached Kenyan Parliament to come

32:28

up with legislation

32:31

that will provide a clear

32:33

pathway on some of these

32:36

organizations on how they are supposed to be

32:38

run and how they are supposed

32:40

to be conducted. So

32:42

we reached out to our representatives

32:45

with three clear objectives

32:48

and petitions. The

32:50

first one was to try

32:52

and launch investigation as

32:55

far as quantum moderation work is

32:57

concerned, specifically

32:59

with respect to Sama. The

33:02

second petition was to come up

33:04

with legislation that will stop

33:07

organizations like Sama from targeting

33:10

young and vulnerable individuals who

33:13

are just graduating from high

33:15

school. Some of them are graduating from

33:17

campus, to do this

33:20

kind of traumatic work.

33:24

The other final petition was

33:26

to come up with a very

33:28

clear and robust mechanism

33:31

that will address the issue

33:34

of content moderation work for

33:37

these organizations to provide clear

33:39

pathway in terms of psychological

33:42

support. So those are

33:44

the three petitions that we rendered

33:46

to Parliament. We pray

33:49

that they work on this

33:51

as a matter of urgency because as

33:53

we speak right now, Sama is dedicated

33:56

and committed to recruiting

33:59

young individuals, even as we speak,

34:02

from campus. Then they train them and avoid

34:05

providing psychological support.

34:07

Nora.

34:09

Nairobi is a tech hub

34:11

on its own, and a lot of tech work is also

34:13

outsourced to Kenya. So beyond

34:15

the Content Moderators' Union, how

34:17

would you like to see Kenyan

34:19

tech workers' jobs improve?

34:21

Content moderators and tech workers

34:23

need to be treated respectfully. Their

34:25

mental health needs to be addressed,

34:29

as well as the remuneration

34:31

as well. We need proper

34:34

policies and proper mechanisms

34:37

put in place to see the

34:39

improvement and the

34:41

commitment of this organization

34:45

on working on the lives of

34:48

these tech workers. This is not something

34:50

that we can go gain from. Richard,

34:53

thank you so much for your insights on this. Thanks,

34:56

Nora.

34:57

Richard Mathenge. We

35:13

reached out to Sama AI for a statement. They

35:16

told us the company disputes the claims made

35:18

by moderators in

35:21

regards to wages and psychological support.

35:24

This is a statement from the company in March of 2023.

35:34

You're listening to Spark from your friends

35:37

at CBC Radio.

35:40

As we heard from Richard, the work of online moderators

35:43

can be very difficult and thankless, and

35:45

yet it's necessary to make the tech tools

35:48

operate. So how much of a difference

35:50

can something like the Content Moderators' Union

35:52

make?

35:53

I think it's a wonderful development

35:55

for workers and in some

35:57

ways an inevitability

35:59

that the industry should have foreseen

36:02

because of their demonstrated

36:05

lack of interest in improving content

36:07

moderation worker conditions. As

36:10

we know, those kinds of progressive

36:13

efforts within labor

36:16

do not come from management. They come

36:17

from workers pressing

36:20

and making demands for their

36:22

basic humanity to be respected. And

36:25

I think the members of this union

36:27

were right to do that. This is Sarah

36:30

T.

36:30

Roberts.

36:31

I'm a professor at UCLA in Los

36:34

Angeles, California. I'm the director

36:36

of the Center for Critical Internet Inquiry

36:38

at UCLA and the author of Behind the Screen:

36:41

Content Moderation in the Shadows

36:43

of Social Media. In the book, Sarah

36:46

sheds

36:46

light on the invisible work done by moderators

36:48

to shield users from hateful language, violent

36:51

videos, and cruelty on the commercial internet.

36:54

Her work also looks at how these workers and

36:56

users can combat the excesses

36:58

of the big tech platforms.

37:01

Invariably, any kind of

37:03

worker organizing will be met

37:05

with hostility from the management class

37:08

and from the owners. But this

37:11

particular group has done very

37:13

well for itself in terms of articulating

37:15

the conditions that have pressed them

37:18

into the position of wanting

37:20

to organize in this way. I

37:23

think that they have a very strong

37:26

media presence and someone in

37:28

leadership who can really articulate their

37:30

situation

37:31

and their needs, which are

37:33

wholly reasonable.

37:35

They're asking to not be psychologically

37:37

damaged by the work that they do and to be properly

37:40

compensated

37:40

for the dangerous nature of the work.

37:43

Those seem like

37:45

basics. And we're hardly in

37:47

a moment where the companies can say we didn't

37:49

know. It's been years.

37:53

So there's the formation of the African Content

37:55

Moderators Union, which happened this past

37:58

spring. But there's also a lawsuit

38:00

currently making its way through the courts in Kenya.

38:03

This case involves Meta, Facebook's parent

38:05

company, and two third-party moderation

38:07

companies, one of which is Sama. More

38:10

than 180 moderators are seeking redress

38:12

over pay and working conditions. They

38:15

also want Meta to confirm their right to unionize

38:17

and changes to mental health support. Settlement

38:20

talks between Meta, Sama, and the moderators

38:22

recently broke down. But Sarah

38:24

says she sees promise in this type of case.

38:27

For years, I have believed

38:30

that these kinds of progressive

38:32

efforts will yield the most success

38:35

coming from outside the United States. In

38:37

the US, there have been some

38:40

court actions of a similar nature

38:42

alleging similar things. But what

38:44

tends to happen is that those

38:46

court cases get settled

38:49

before they really see the light of day and

38:51

are subject to the public being

38:54

able to witness them. And they

38:57

are subject to

38:57

non-disclosure agreements and we never hear anything

39:00

else. So the individuals kind of settle

39:02

out their needs financially or

39:05

hopefully met through that process. But

39:07

in other places where we're seeing workers

39:10

come together as collective and we're

39:12

seeing some strategic lawsuit

39:14

filing and so on, I think there's

39:18

perhaps an opportunity

39:20

to make

39:22

some change in these systems. So

39:26

the firms will tell you and it is true

39:28

to some extent that this activity requires

39:31

a large amount

39:32

of available labor, people willing

39:34

to do the work, people who also

39:36

have specific cultural linguistic

39:39

competencies. So that necessitates

39:42

in many cases outsourcing to places

39:44

around the globe to meet those needs, fair

39:47

enough. But where I start to diverge

39:49

with the claims around the necessity for

39:51

this is where it becomes clear

39:53

to me that content moderation

39:56

on the one hand is a

39:59

mission-critical activity for these

40:01

firms and they'd be the first to let

40:03

you know that it is. And

40:05

yet, it is treated as an afterthought,

40:08

it is treated as a low status and therefore

40:10

low wage kind of activity. People

40:13

are considered replaceable and expendable

40:16

and the companies do not treat it

40:18

as a central or core part

40:20

of their function. They

40:21

outsource it out, they work with third

40:23

parties and in

40:26

some ways they wash their hands of it in

40:28

that sense. That's kind of

40:31

the ideological piece where if

40:33

they could, they would wave a magic wand

40:35

and automate the whole process but it's simply

40:37

not possible. And lastly,

40:39

I would say and perhaps

40:42

most cynically in this case, we

40:44

have a well-worn playbook from many

40:46

industries, the textile industry, manufacturing,

40:48

others, of

40:51

out

40:51

of sight, out of mind, globalizing

40:53

activity to chase the cheapest

40:55

absolute bottom line in terms

40:57

of pay and plausible

41:00

deniability when things go wrong. So

41:03

in other words, this puts them at arm's

41:05

length from activities that

41:07

are known to be harmful and known to be

41:09

incredibly onerous and difficult for workers

41:12

and yet, they will gesture at those third

41:14

parties for being responsible for the poor working

41:16

conditions when the truth is that the

41:18

tech companies have incredible power

41:21

to set the tone and the expectations

41:23

and the mandate around these issues. So

41:25

they really kind of worked out a sweetheart deal

41:28

for themselves where they can

41:30

get and wrest all

41:32

of the competency and all

41:34

of the well-being out of these employees

41:37

until these employees just aren't able to

41:39

do the work anymore and

41:41

they just go and find another

41:43

person to replace them.

41:45

So you wrote a book called Behind the Screen

41:47

about the work that content moderators on social

41:50

media do. Can you tell me a little bit

41:52

about how big the sector is and

41:54

also why this is such difficult

41:56

work?

41:56

Well, it's a sector that has

41:59

grown exponentially,

41:59

especially alongside the public's

42:02

engagement with social media. So

42:05

just as we have seen

42:07

almost every aspect of our lives sort

42:09

of contained

42:10

within and constrained by

42:12

these platforms, all of that

42:14

output is now subject

42:17

to review, reporting, falling

42:20

in line with the rules of engagement for the

42:23

platforms, etc. So the

42:25

human review process can begin

42:27

a number of ways, particularly because most

42:30

platforms also use computational

42:33

mechanisms now

42:33

to cull material

42:36

that otherwise wouldn't necessarily be reported

42:39

and that also has to be vetted. And certainly

42:41

what I was looking at in my book is that human

42:43

review process that begins

42:46

when someone like you or me encounters

42:48

something disturbing, startling

42:51

that we think is inappropriate for whatever

42:53

reason on a platform and we file a report

42:55

about it. Eventually that makes

42:57

its way to human review and

43:00

these are people who are trained

43:03

to achieve a high level

43:06

of both efficiency, so

43:08

a high level of productivity, but also a high

43:10

level of accuracy vis-a-vis the rules of

43:12

the platform. And they are

43:15

looking at a new report

43:17

or a new piece of content perhaps every 10

43:19

seconds. It is

43:21

akin to being on an assembly line in that

43:23

regard. It's an always-

43:26

on situation when you

43:28

are working as a content moderator and

43:30

especially as a generalist and

43:33

there's never a moment where you will come to the

43:35

end of the line and say,

43:36

okay, I've got it all, I've reviewed

43:39

it all. It's an endless stream of

43:41

material. Sometimes the

43:44

job can be incredibly boring and

43:46

incredibly mundane, I mean to the point where

43:49

the difficulty of it is the rote nature

43:51

of it and sort of the mind numbing

43:54

of it. But

43:55

the difficulty with that is that it

43:57

will often be punctuated by moments

43:59

of

43:59

abject horror, extreme

44:01

material that really no one

44:05

would ever want to see. And I

44:07

guess the last thing I would say about the work and

44:09

the workers is that despite the fact

44:11

that so many of them are outsourced

44:14

to third parties and undervalued

44:16

and disregarded, these workers

44:18

are well aware of their mission

44:20

critical role and they often

44:23

articulate that to people like me. They

44:26

say, you know, I'm doing this work so

44:28

that you don't have to see what I have to see.

44:31

There's a real sense of sacrifice and altruism

44:33

there that many of them

44:35

didn't sign

44:35

on for initially but they make

44:38

the meaning out of the work

44:40

through realizing that what they're doing

44:42

is in essence protecting

44:46

the rest of us. All of this

44:48

again for a relatively low wage, for

44:51

precarious work conditions, for

44:53

not even being directly employed by the companies

44:56

for which they labor. So

44:58

it's a really tall order. It's a really

45:01

tough job and I

45:03

began my research all the way back in 2010 and I can

45:05

tell you today

45:07

in 2023, I haven't

45:09

seen a significant

45:12

change in the industry with

45:14

regard to these conditions even though

45:17

the promised AI and, you know,

45:19

generative AI in particular has arrived

45:22

as I thought and predicted, it is

45:24

in fact a bit of

45:27

a reinforcement for the need for

45:29

moderation itself because as

45:31

we know, these workers are now involved

45:34

in building training models that

45:36

require them to be mired

45:38

in this material 100% of the time. So

45:41

there's really not been any significant

45:43

relief.

45:54

I'm Nora Young. Today on Spark we're talking about

45:56

what's at stake for content moderators

45:58

and for the tech platforms that rely on their

46:00

labor. Right now, my guest is Sarah

46:02

T. Roberts, author of Behind the Screen.

46:05

It seems like there's an ever-growing number of examples

46:08

of people protesting in various ways against

46:10

the abuses of tech platforms. We're

46:12

even starting to hear the term neo-Luddism

46:15

being used.

46:17

Across the board in the United States

46:19

and elsewhere, there's been a resurgence

46:23

of the labor movement, a

46:25

new labor movement in some regards

46:28

and in some ways. And it's happening interestingly

46:30

across many sectors. So

46:33

we've got Starbucks organizing here

46:35

in Los Angeles. We've got grocery store

46:37

workers who are currently organizing.

46:40

We've seen the SAG-AFTRA strikes

46:42

and the Writers Guild strikes. So

46:45

there's sort of a labor sentiment

46:47

across the board. But in the tech

46:49

sector, especially,

46:51

it was sort of considered to be strike-proof

46:55

in so many ways because the tech

46:57

sector for many of

46:59

its employees, but certainly not all,

47:01

was able to provide high

47:03

levels of remuneration, great

47:06

benefits, an elite

47:08

work experience. But that isn't true

47:10

for all the workers by any means. The

47:12

tech employees, all kinds of workers who are sort of

47:14

at the bottom of the ladder

47:16

in terms of pay and status

47:18

and conditions and especially

47:21

in the tech sector where there is such a

47:23

gap between workers like that and those

47:26

workers at the top, I would

47:28

say it was almost a situation

47:30

where tech created the preconditions

47:33

for their

47:33

workers to want to respond in

47:35

this way. And I do think it's exciting,

47:38

especially these movements that are happening

47:40

around the world.

47:42

So there's people who are hired to do

47:44

digital labor like content moderators, as we've been

47:46

talking about, but there's also people who earn their livelihood

47:49

through these platforms. This spring,

47:51

food delivery workers in India went on strike for

47:53

a week

47:53

over pay cuts. Or this summer, people who sell

47:55

their wares on Etsy boycotted Etsy UK

47:58

because the platform was holding back

47:59

as much as 75% of their sales earnings

48:02

for a period of time. So how effective

48:04

can things like protests

48:05

and strikes and boycotts be?

48:07

Well, they're incredibly effective. And the

48:09

simple reason is these platforms,

48:12

despite advertising themselves as

48:14

all tech all the time, run on humans.

48:17

They run on human labor. They run

48:19

on the ingenuity and input and

48:22

pounding the pavement in some cases of

48:24

human beings. They rely on their

48:26

creativity and output. And

48:29

it's very easy for those at

48:31

the top of these firms to lose sight of that

48:33

because they're so enamored with

48:35

the technology as well. And

48:37

they really disregard that humanity.

48:40

But

48:41

behind a very thin

48:43

veil, you will find

48:46

legions of human beings. So

48:49

the companies continue to undervalue

48:51

that human element at their

48:53

own peril. But

48:55

presumably, it partly depends on whether there

48:57

are alternative platforms or whether there's just

48:59

one behemoth dominating the whole market.

49:02

Well, I mean, it makes sense to

49:04

believe that that could be true. But I

49:06

think the status quo really

49:09

is that we're in largely

49:11

a situation of monopolies or maybe at

49:13

best duopolies. And these

49:15

companies, they came in and sort of became

49:18

the only game in town, the monopoly in town,

49:20

and then started to do all sorts

49:23

of things,

49:23

surge pricing, poor

49:26

conditions,

49:28

constantly lowering the take-home

49:30

pay of the people who make the

49:32

company really go. So in

49:35

some ways, that monopoly status makes

49:38

them quite fragile because if all of

49:40

the Uber drivers take

49:43

an action, they're sort of out of luck

49:45

in that regard. Yes, there's many delivery

49:49

services, but not that many

49:51

across the board like we might see in some

49:54

other industries where you could have your pick. So

49:56

these labor actions tend to be

49:58

very significant. with

50:00

regard to the bottom line of the one

50:02

or two firms who are controlling the

50:04

market in that

50:05

particular sector. Yeah. Companies

50:07

like Uber, of course, argue that their drivers are not employees.

50:11

So how does that complicate the picture of, you

50:13

know, labor management

50:16

relationships?

50:17

Well, they have unfortunately

50:19

been able to successfully defeat legislation

50:22

in places like

50:22

California, where I'm from, through their

50:25

financial capacity and

50:27

ability to lobby. But at

50:29

the end of the day, the drivers

50:31

will demonstrate their

50:34

worth and merit to Uber when

50:36

they withhold their own labor. So

50:39

in the context of a labor action

50:41

that involves withholding labor, I

50:43

think that status or the argument

50:45

around that will certainly take a backseat

50:48

to the fact that their non-employees are

50:50

non-driving.

50:50

Right, right, right.

51:01

I'm Nora Young, and right now my guest is Sarah

51:03

T. Roberts, an associate professor in the

51:05

Department of Information Studies at UCLA.

51:08

We're talking about the unseen true cost of

51:10

digital labor. The title of Sarah's

51:12

book, Behind the Screen, suggests

51:14

the invisibility of content moderators,

51:17

but also the human labor behind our tech

51:19

services more broadly. Think about it. You

51:22

order your food through an app, it shows up at your door.

51:24

You may not even see or interact with the person

51:26

who delivered it. And Sarah says

51:29

that has an impact on labor action.

51:32

These models are designed

51:35

to obfuscate the humanity

51:37

involved in their delivery

51:40

of services or the production

51:42

that they do. And that goes down

51:44

to Silicon Valley's peculiar

51:46

cyber libertarian

51:48

ideology that puts machines

51:51

in computation at a premium

51:54

above the basic

51:57

recognition of human effort

51:59

and human rights, humanity and humanness itself.

52:02

Of course, that does pose problems when

52:05

it comes to organizing or when it

52:07

comes to advocacy and

52:09

awareness among the general public.

52:12

I'm always happy to participate

52:15

in conversations like the ones we're having because

52:17

this is one important key way

52:20

that people can become aware of

52:23

the circumstances of these behind

52:25

the scenes, behind the screen workers that

52:28

exist in so many contexts

52:29

within

52:30

what we think of as tech. But

52:33

on the other hand, presumably the argument is that people

52:35

can just go elsewhere, whether they're users

52:37

or people who are earning money,

52:39

the tech platforms have the right to control how

52:41

they run their businesses as long as they're complying with the

52:43

law. So what do you make of that argument?

52:46

Well, of course they do, but no

52:48

individual or collective is

52:51

mandated by law to give their time and

52:53

energy and effort and creativity and humor

52:56

and arguing and so on

52:58

to those platforms. So they have

53:00

to strike a balance there. I mean, in

53:03

some places in the European Union and

53:05

other jurisdictions, there

53:07

are mechanisms being put in place

53:10

that mandate certain types of

53:12

protections and other things that

53:14

the tech companies will have to comply with.

53:17

In North America, not so much, particularly

53:20

in the United States, the kind

53:22

of regulatory apparatus has been broken

53:24

in this country for over 40 years.

53:28

But yes, there are alternatives

53:30

and people can move to them and they will. They

53:33

will. So if you're the owner

53:35

of X perhaps, and you see a complete

53:38

exodus from your platform

53:40

that was at one point an

53:43

incredibly powerful political

53:46

and cultural engine, you

53:48

have a problem. You have

53:50

a problem and it's on you to fix it.

53:52

You can't just throw a tantrum and demand that

53:54

the users come back or that the advertisers come back.

53:57

You have to make a hospitable environment that

53:59

people are interested and participating in and that

54:01

frankly is just business. Sarah,

54:04

thanks so much for your insights on this.

54:06

I appreciate it. Thanks for having me.

54:09

Sarah T. Roberts is the director of the UCLA

54:11

Center for Critical Internet Inquiry and

54:14

the author of Behind the Screen: Content

54:16

Moderation in the Shadows of Social Media. You've

54:19

been listening to Spark.

54:28

The show is made by Michelle Parisi, Samarit

54:31

Yohannes, Megan Carty and me, Nora

54:33

Young and by Brian Merchant, Richard

54:36

Mathenge and Sarah

54:36

T. Roberts. Subscribe

54:39

to Spark on the free CBC Listen app or your

54:42

favorite podcast app. I'm Nora Young. Talk

54:44

to you soon.
