7. MySecurity

Released Wednesday, 24th May 2023

Episode Transcript
0:10

This is an iHeart original.

0:15

What's interesting about the Drew case, of course, is that

0:18

I think it's a very sympathetic impulse

0:21

to want to use some sort of law here

0:24

to convey that

0:27

what Lori Drew and her daughter

0:30

did in this case, not only did

0:32

it have a tragic ending, but

0:34

the act itself was quite

0:37

cruel. I'm

0:39

Joanne McNeil, and this is Main

0:41

Accounts: The Story of My

0:44

Space, episode

0:47

seven: MySecurity.

0:56

Users regularly experienced

0:58

online harassment on MySpace: hateful

1:02

and hurtful comments, even bullying,

1:05

especially users who were very young.

1:08

Yet at the time, with social

1:10

networking still new and regarded

1:12

as an experiment, these experiences

1:15

weren't often taken seriously, and

1:18

to those who did experience online harassment

1:21

in the context of MySpace and

1:24

at the time, in the aughts, it

1:26

was so unexpected or bizarre that

1:28

they often didn't know how to process

1:31

it. Take for example,

1:33

Roommates, the web series

1:36

MySpace produced from two thousand and seven

1:38

to two thousand and eight. The

1:40

show was available to watch on MySpace.

1:44

It was also promoted through the

1:46

social network. It was critical

1:48

for driving engagement. The team

1:50

set up accounts for individual characters

1:53

on the show. Audiences could

1:55

interact with the characters like they were

1:57

real people. It was exciting,

2:00

but as Roommates creator Scott

2:02

Zakarin soon found out, there

2:04

were drawbacks to being this publicly

2:07

available. Some of the comments

2:09

made about the characters played

2:11

by real people, actors,

2:14

were really cutting. Yeah,

2:17

I mean, it happened a few times. I mean, in

2:19

our early shows when people would

2:21

see the photos and people would comment

2:23

on their looks, any

2:25

actor, male, female, it

2:29

was really painful for them. At first, we removed

2:31

the garden-

2:34

variety hater, or

2:36

we warned them we only would

2:38

go to MySpace if there was something that,

2:41

you know, was beyond what

2:43

we should be doing, so you can get a sense

2:45

of you know, okay, this guy can be salvaged

2:48

or ignored, but

2:52

somebody else if they start to get you know, too

2:55

sexual, or you know, something

2:57

that goes beyond our standards and practices, that's

2:59

when we would kick it up to MySpace. The line

3:02

between stan and stalker is

3:04

kind of a thin one. Like the people who

3:06

like really get enthusiastic

3:08

about a celebrity, it can

3:11

very easily tip the other way, because

3:13

I think people were more nervous about doing that back

3:16

then, because they didn't know what it would cause. Now,

3:18

tearing people apart is

3:20

common interactivity. Most

3:26

users on MySpace did

3:29

not have access to MySpace the company,

3:31

like the Roommates cast and crew did. They

3:34

could not make special requests for the company

3:36

to intervene. Typically,

3:38

it was expected back then that if

3:40

you were harassed online, the

3:43

only thing to do was ignore

3:45

it, don't feed the trolls.

3:48

This is something Bridget Todd, host

3:51

of There Are No Girls on

3:53

the Internet, commented on

3:55

in our interview. I think for far

3:57

too long, people in positions

4:00

of power, or like parents and educators

4:02

and administrators, people who are in positions where

4:04

they're meant to help young people understand

4:06

the world around them, have been telling

4:08

them this complete fiction that

4:11

what happens on the Internet is just the Internet, and like

4:13

your real life is your real life, and like who

4:15

cares what they're saying, it's just words on a

4:17

screen. And so when young people

4:20

are facing this kind of thing, there's

4:22

oftentimes not a lot of adults in the room

4:24

who can really understand what's happening

4:26

and, like, talk to them about it in a real way that's going

4:28

to be meaningfully helpful. When we're talking about

4:31

things like online harassment, it is really important

4:33

to keep that in mind. And I think that

4:35

we're seeing that that attitude

4:37

sort of slowly change, but I think

4:39

it's changing far too slowly

4:42

to actually, you know, deal

4:44

with the problem at any kind of scale.

4:48

The tragic death of Megan Meier,

4:51

which resulted in major news coverage

4:53

ongoing for years following

4:55

the Lori Drew trial, was

4:59

a reckoning. The thinking

5:01

until then, a mix of "On

5:03

the Internet, nobody knows you're a dog," as

5:06

a famous New Yorker cartoon caption

5:08

from nineteen ninety three put it, combined

5:12

with a moral panic over youth online

5:14

that we addressed in earlier episodes,

5:18

belied how sometimes

5:20

the real threat of online harassment

5:22

is more prosaic. Your own

5:24

neighbor could create indescribable

5:27

pain for your family. Lori

5:30

Drew, the mother of Megan Meier's

5:32

classmate and a neighbor of the Meiers,

5:35

became virtually universally scorned

5:38

when her role in the bullying of Megan

5:40

became public. But while

5:43

most people familiar with the case believe

5:45

that her behavior toward Megan was

5:47

cruel, there was no clearly

5:50

drawn path to accountability. There

5:52

were no laws that perfectly

5:55

prevented someone else from behaving

5:57

the way that Lori Drew had on MySpace.

6:00

She was taken to court in the case United

6:02

States versus Drew and

6:04

faced felony computer hacking charges.

6:09

Drew was convicted in two thousand and eight

6:11

of misdemeanor offenses of

6:14

unauthorized access to MySpace.

6:17

This was overturned in two thousand and nine

6:19

and Drew was fully acquitted. Now

6:22

to a new development in the case of the Missouri teenager

6:24

who took her own life after she was harassed

6:27

on the internet. Her family wanted the mother allegedly

6:29

behind the hoax to be prosecuted, but authorities

6:31

hit a roadblock. But now there has been a

6:34

surprising development. What

6:37

kind of reactions to the Drew case do you

6:39

get from your students to this case?

6:41

Specifically, I

6:43

would say that by

6:46

the time in the semester where I'm introducing the students

6:48

to this case, they've

6:51

already been quite outraged by the

6:54

scope of the CFAA in certain

6:56

other cases. But I think many

6:58

of them are primed to think

7:01

that the prosecutor here was really overreaching.

7:04

They already have Aaron Swartz's

7:06

story in their minds as

7:09

somebody who was mass downloading

7:11

academic articles from JSTOR

7:15

and was prosecuted and then

7:17

ultimately tragically took his own

7:19

life after

7:22

being charged under the CFAA.

7:24

They have him in their minds. They have other

7:27

types of cases in their minds, where you

7:32

know, LinkedIn is trying to stop

7:35

another company from scraping its website

7:38

the public profiles that people have posted

7:41

on LinkedIn, and they're trying to use the CFAA

7:44

for that, and

7:47

so by the time they get to the Drew case,

7:49

I think many of them are already a little

7:51

skeptical of the

7:54

ways in which especially

7:57

sort of corporate actors can

8:00

use a law like the CFAA

8:04

to exert

8:07

forms of power and control

8:10

over websites

8:12

that they create. That's

8:16

Thomas Kadri. He teaches at

8:18

the University of Georgia School of Law,

8:20

and he's an affiliated researcher with

8:23

the Cornell Clinic to End Tech

8:25

Abuse, and as he just mentioned,

8:28

United States versus Lori Drew is

8:30

a case that he brings up in his classes.

8:33

What's interesting about the Drew case, of course, is that

8:36

I think it's a very sympathetic impulse

8:38

to want to use some sort of law here

8:42

to convey

8:44

that what

8:48

Lori Drew and her daughter

8:51

did in this case, not

8:53

only did it have a tragic ending, but

8:56

the act itself was quite

8:58

cruel. And so

9:00

I think there's at least, you know, there's this perception

9:03

that what happened was

9:07

at the very least uncivil

9:09

and mean-spirited. And

9:13

so I think the students do

9:16

have a tough time squaring

9:20

their opposition to this very

9:23

far reaching federal law that

9:26

probably makes all of them a cyber

9:28

criminal. In my classroom,

9:32

in every class when they

9:34

drift off for a second and they go on a website

9:37

and they, you know, they

9:39

violate a term of service without even realizing it.

9:43

After the break, we'll learn more

9:45

about the CFAA and

9:47

find out why this and other laws

9:51

were too slippery to hold Lori Drew to account.

10:01

Essentially, the CFAA, the Computer

10:03

Fraud and Abuse Act, is a federal

10:05

criminal law that

10:09

makes it a crime to access

10:12

a computer without authorization

10:15

or to exceed your authorized access

10:17

on a computer. Now

10:20

what all of those magic words

10:22

mean has been the subject of now

10:25

decades of scholarly

10:28

debate and different court

10:30

decisions. Many

10:33

different interpretations have been kind of put forward,

10:35

and different cases have kind of tested the boundaries

10:38

of what some of those key terms might

10:40

mean, especially the idea of what it

10:42

means to access a computer

10:45

without authorization or to exceed your authorized

10:47

access. That concept

10:49

of unauthorized access is

10:52

really at the heart of a lot of these disputes.

10:55

Colloquially, the CFAA is talked

10:57

about as the federal hacking

10:59

law. But of

11:01

course even what constitutes hacking

11:04

is kind of disputed, and

11:08

some of the confusion surrounding the interpretation

11:10

of the statute reflects some of those

11:13

kind of colloquial tensions as

11:16

well. One thing that I just was curious

11:18

about, because it is a law that, unless

11:21

I'm mistaken, has been on the books for decades

11:24

now. So has the

11:27

perception changed over the decades

11:30

because of changes in the technology

11:33

or what does

11:35

it mean to have a law like that hold? Absolutely.

11:38

Yeah. So one of the interesting things about the

11:40

CFAA is that I think

11:42

perceptions surrounding the law have

11:45

changed, and the law itself has also

11:47

changed. It's been amended by Congress several

11:49

times since it was initially passed in

11:52

the nineteen eighties, and so we've

11:55

got these kind of two parallel changes that

11:57

are going on, and they're not always synced

11:59

up. So sometimes I would say public perceptions

12:01

surrounding the law have changed in

12:04

response to a case like

12:06

the Lori Drew case, or a situation

12:09

like Aaron Swartz, one

12:12

of the founders of Reddit, who was famously

12:14

charged under the CFAA. We've

12:17

had other high profile cases more

12:19

recently than those two, but there

12:21

are these kind of moments where there's increased

12:23

public consciousness surrounding the CFAA, usually

12:27

paired with opposition

12:29

to how it's being enforced or interpreted.

12:34

The origin of the CFAA, possibly

12:36

apocryphal, is that President

12:39

Reagan watched WarGames

12:41

at Camp David. It's that Matthew

12:43

Broderick movie from nineteen eighty three. You

12:46

know it. The only winning move is

12:48

not to play. Yeah, that one.

12:51

Reagan was allegedly so disturbed

12:53

by the hacking depicted in this movie that

12:56

he whipped up legislation that extremely

12:58

broadly outlawed computer

13:01

access without authorization.

13:04

In the summer of twenty twenty one, there

13:06

was a Supreme Court case which narrowed

13:08

the scope of the CFAA. The

13:11

way that the CFAA was applied

13:13

in the Lori Drew trial would probably

13:15

be considered obsolete. But

13:18

also I am very very

13:20

obviously not a lawyer, so I'll let

13:23

Thomas Kadri take it from here. But

13:26

I still teach the case to my students because

13:29

I think it really helps to kind of ground the stakes

13:32

of what the Supreme Court was really

13:35

doing in this case last summer

13:37

in saying, well, there's

13:40

this one way that we could read the statute that might

13:42

cover all of these forms of conduct,

13:45

some of which may be harmful

13:48

but maybe not harmful in the way that this statute

13:51

was designed to cover, and others

13:53

of which may not be harmful at all, maybe really innocuous.

13:56

And of course, one of the main things that was at issue in the Drew

13:58

case and in many other cases involving the CFAA

14:01

is the violation of terms

14:04

of service or some other form

14:06

of contractual agreement

14:09

or written policy: whether violating

14:12

that kind of a restriction, one that

14:15

isn't bypassing some sort of technical restraint

14:18

on access to a computer but is really doing

14:21

something that you're not supposed to be doing under

14:23

some sort of rule that's written down or

14:26

that's conveyed to you in some way or maybe that's implied.

14:30

Those were the cases that always, I think,

14:32

gave judges some of the greatest discomfort

14:35

in saying that the CFAA should apply there.

14:38

But some courts and some judges,

14:41

including the judge in the Lori Drew case, felt

14:43

compelled to reach that conclusion in

14:46

part because the

14:48

terms in the statute say that you're

14:50

doing something, you're accessing a computer without

14:53

authorization, and the statute

14:55

gave no specialized definition

14:57

about what without authorization should mean,

15:00

and so often when judges

15:02

are faced with interpreting a law like that, they

15:05

just look to the ordinary meanings, and we know what

15:07

without authorization means. It's synonymous

15:10

with things like without permission, not

15:12

allowed. And so

15:15

if something is forbidden in a written policy

15:17

and you go ahead and do it anyway, it

15:20

sort of makes sense to talk about that as lacking

15:22

authorization of some kind, lacking permission,

15:25

And so judges, like the judge in the

15:27

Lori Drew case felt compelled

15:30

to say, well, these actions

15:32

because Lori Drew

15:35

and her co conspirators, as the

15:37

court puts it, co conspirators here being her

15:40

daughter and her eighteen-year-old employee,

15:43

the mother's eighteen-year-old employee. They

15:46

violated various terms of service

15:49

that MySpace had laid out, and so they were acting

15:52

without authorization, and

15:55

therefore they violated the law. And so I still

15:57

teach it to my students because it's a fascinating case

15:59

to kind of show. I think a lot

16:01

of people, my students included, they

16:04

have some sympathy with

16:06

the idea that operators of websites

16:08

should be able to set certain rules and

16:11

if those get violated, it's

16:13

not just a question of oh, you breached the contract,

16:16

but you did something that violated a

16:18

criminal law, and

16:20

so we should be able to use the criminal

16:22

law to get at those kinds

16:25

of permissionless

16:27

uses of computers. But

16:30

I think the Drew case kind of pushes some

16:32

of those impulses to say, well, if

16:35

this is allowed, then this is the extent

16:37

that you could go to. One question I have about

16:40

this case and then going back to this moment

16:42

in time in two thousand and seven, two thousand and eight, I

16:46

think a lot of reactions to the story

16:48

of Megan Meier is that something

16:51

terrible happened, and

16:56

you know, what does justice look

16:58

like in this situation,

17:01

what is on the books at

17:03

all, and if the CFAA is

17:05

imperfect legislation, was

17:10

there anything at the time that could

17:12

have been more suitable, or in

17:15

the years since then have

17:17

there been developments to enforce cases

17:20

of what I would say is extreme online

17:22

harassment of this nature

17:24

or any nature. At

17:26

the time, there certainly

17:29

weren't as many laws that would apply as

17:32

there are now. And that's one of the reasons

17:34

why I would imagine federal prosecutors

17:36

reached for a law like the CFAA that,

17:41

given its interpretation at the time, was

17:44

something of a catchall, or at least it could

17:46

help fill in the gaps where

17:49

some other laws wouldn't apply. And

17:52

so you might have had certain laws

17:54

that prohibited forms

17:59

of harassment but that didn't

18:02

yet apply to internet based

18:04

harassment. Or you might have had

18:07

um,

18:11

claims that could be brought for intentional infliction

18:13

of emotional distress. That's a tort that

18:15

had existed for a long time, but the

18:18

government can't bring that as a criminal charge.

18:20

That's a private lawsuit that needs to exist

18:22

between people. So you know, Megan Meier's

18:25

parents, for example, might have been able to sue

18:27

for intentional infliction of emotional distress.

18:31

But actually there are all sorts of very complicated reasons. We

18:33

won't get into about why it's difficult for parents

18:35

to sue when something

18:37

like that happens to a child. But anyway,

18:39

the

18:41

bigger point is that, yes, that's

18:44

one of the reasons why prosecutors

18:46

reached for a law like the CFAA, where

18:50

you can bring it in. It

18:54

gives a legal basis for our

18:57

sense of moral outrage that something

18:59

bad happened and somebody needs to be held responsible.

19:03

Since the Drew

19:06

decision, which

19:08

remember, even though she was convicted,

19:11

her conviction was ultimately overturned

19:14

because of her constitutional challenge

19:16

that she raised to her

19:19

conviction. Since

19:21

then, there have been a

19:24

whole slew of cyber bullying

19:27

and harassment and stalking statutes that

19:29

have been passed in many states across the country, including

19:31

one in Missouri, the home state

19:34

where these events kind of mainly

19:36

took place. Missouri passed

19:38

Megan's Law, which was

19:41

a statute designed to

19:43

get at various forms of cyber bullying

19:45

and cyber harassment.

19:50

The terms of the statute certainly seemed

19:55

much closer to what

19:57

happened here, right? It's actions

20:00

that are taken with the purpose

20:02

to frighten and intimidate and cause

20:05

emotional distress. There

20:07

are different provisions that apply depending on whether

20:09

the perpetrator is a minor or

20:11

an adult. So laws

20:14

like this have since been passed, but they've also

20:16

been subject to a lot of constitutional challenges

20:18

as well, usually First Amendment challenges

20:21

based on the freedom of speech. So

20:24

courts tend to get, let's just say,

20:26

a little more skittish when laws

20:30

make it illegal

20:33

to communicate with people

20:36

with the intent to annoy, with

20:39

the intent to, um,

20:43

yeah, pester, you

20:46

know, if it's with an intent to threaten,

20:48

if it's with an intent to

20:51

harass, generally,

20:54

that's you know, the courts are

20:56

a little less

20:58

likely to strike down those laws as unconstitutional.

21:02

But the story of kind of cyber

21:04

bullying laws across the country has been one of

21:07

a few successes and many failures

21:09

in terms of those laws standing,

21:13

of them

21:17

being upheld by courts when they're

21:19

challenged. Yeah, that was

21:22

a really helpful explanation

21:24

there. It raised a question that

21:26

I have now, which is it seems

21:28

like with online harassment legislation

21:31

you have different stakeholders,

21:34

the users, the people,

21:39

the business, the executives

21:42

of a platform, the victims

21:45

of cyber bullying or online

21:47

harassment. How

21:50

do you negotiate? I mean, again,

21:52

I imagine it's a very imperfect process. But

21:55

how do you negotiate that

21:58

balance between the First Amendment rights

22:01

and accountability? Who

22:03

is responsible for what? And

22:06

how has this process evolved

22:10

over the past couple decades. Yeah,

22:13

it's constantly evolving. It

22:16

is by no means settled. And

22:19

I'll add one additional complication,

22:21

not that we need any more. We've got enough to be

22:23

getting along with. But law

22:26

is only one possible regulatory

22:29

tool that can be used here to

22:31

address some of these harmful forms

22:34

of conduct, communication,

22:36

interaction, right, that are conducted

22:39

through technology. Technology

22:42

itself is another regulatory

22:45

force here. Technology can

22:47

enable and constrain different forms of

22:49

behavior in ways

22:51

that are certainly not a direct analog

22:54

to law, that can be complementary

22:56

to law and sometimes not so complementary to

22:58

law. And there are other regulatory

23:00

forces as well. Right, There can be

23:03

certain like market constraints on some

23:06

of these forms of behavior, and social

23:08

norms are working in the background as

23:10

well to, again,

23:13

push certain types of behavior, enable

23:15

it or constrain it. But technology

23:17

in particular is really important to think

23:19

and talk about in this context because

23:23

your question asked about how do we navigate,

23:25

for example, First Amendment

23:28

rights to free speech or

23:31

just the political value

23:33

of freedom of expression, right, with

23:37

laws and other forms

23:40

of regulation, including technology that

23:42

might seek to regulate this kind

23:45

of behavior, right. And this is a constant

23:47

process of evolution. I would say that

23:51

we see play out, right, everything

23:53

from when a

23:56

former president of the United States gets

23:59

kicked off Twitter, whether that is

24:01

a First Amendment issue, a free

24:03

speech issue, whether those things are one and the

24:05

same. Right, they aren't, but they

24:08

often get lumped together. Cue

24:13

the question of online

24:15

harassment by cyber

24:19

mobs, doxing,

24:22

non consensual distribution of intimate

24:25

images, other

24:27

forms of kind of networked harassment. The

24:31

values that are at stake in

24:34

each of those different situations, the

24:36

types of regulations that might be appropriate

24:39

to deal with them, the constitutional

24:41

issues at stake, in

24:44

some ways, I like to think that, you know, they

24:46

are all deserving of very distinct treatment

24:49

because they do often raise very

24:51

different questions of how

24:53

to try and mitigate or address some of those

24:56

harms. And yet at the same time, they're all

24:58

intimately connected. Right,

25:00

The types of lines that you draw in one

25:02

context will inevitably at

25:04

least have to be reckoned with in the other context,

25:07

even if they don't directly apply. And

25:09

so if we want Twitter to be able

25:12

to or you know, let's

25:14

use MySpace instead, since it is still around, if we want

25:16

MySpace nowadays to be able to address

25:19

certain forms of networked

25:21

harassment or

25:24

targeted threats that are you know, communicated

25:28

through its platform, that

25:31

has a certain vision of the

25:34

ability of those platforms to kind of govern

25:36

and police their spaces

25:39

that they've created online. That

25:43

might also apply in the context

25:46

of trying to de platform

25:48

somebody or

25:51

remove somebody's ability to engage in

25:53

these kinds of expression, and how

25:55

they go about doing that. Right, Sometimes it's

25:57

going to be a question of law. Sometimes it's going to be a

26:00

question of other forms of regulation that they might

26:02

put in place, but it's all pretty connected

26:05

in this ecosystem. Social

26:08

media at scale is difficult

26:10

to govern. Any proposed

26:12

law that might aim to rid social

26:14

networks of online harassment and

26:16

prevent future Lori Drews could

26:19

backfire in countless ways. But

26:22

while online harassment is real, what

26:25

you believe constitutes online harassment

26:28

depends a lot on who you are. And

26:31

when I write in this area, and when I teach these

26:33

issues, I can't

26:35

just teach law. I

26:38

have to teach technology as well. I

26:40

have to teach to some extent social

26:43

norms because they're all

26:45

interacting in this space. There

26:48

are occasionally laws that are going to be a

26:51

major motivating factor, but

26:53

often there are going to be other forces that are actually pushing

26:56

some of the key protagonists in this space

26:58

to act in certain ways, to remove

27:01

certain types of content, to

27:03

protect people from certain types of harm.

27:05

Well, it seemed like no one on the corporate

27:07

side of MySpace cared what

27:10

the users were doing. In fact, there

27:12

were workers at MySpace who

27:15

were tasked with removing objectionable

27:17

content from the social network.

27:21

More on the MySpace content moderators.

27:23

After the break, MySpace

27:34

seemed like a free for all, a place

27:37

where you could post or upload anything,

27:41

and some took advantage of the lax rules.

27:44

There were users who uploaded incredibly

27:47

vile content. I once interviewed,

27:50

very early on in my research, a

27:53

person who had been an

27:55

executive at a digital media company,

27:58

and that person said to me very

28:02

wryly and sagely,

28:05

if you open a hole on the Internet, it

28:07

gets filled with shit. And

28:11

that was like, you know,

28:13

like, mic drop. So MySpace

28:16

opened this hole in the Internet for people to fill

28:18

in with photos

28:20

and imagery. You

28:23

know. There were also, like, you know,

28:26

kind of crude computer graphics

28:28

that were part of it too. So you can

28:30

imagine how quickly swastikas

28:33

would have shown up or you know what I mean, just

28:35

whatever crappy thing people could do,

28:38

they took the opportunity to do it. You know. It

28:40

reminds me of like when

28:44

there's a fresh piece

28:47

of sidewalk cement that

28:50

they've put in, you know, and they put some barriers

28:52

around it when they put it down, and then

28:54

in no time people are in there writing on it

28:57

and putting their face in it, like Michael Scott

28:59

in The Office, and just doing stuff to it, you

29:01

know. And that's what this is. Like, it

29:03

was like this blank slate and

29:06

then what? That's Sarah T. Roberts,

29:09

author of Behind the Screen and professor

29:11

at UCLA. MySpace

29:13

was the first social media company at

29:16

massive scale, which meant that things

29:18

like kicking people off the platform for

29:21

say, posting swastikas,

29:24

was not an easy process. There's

29:27

no size of labor

29:30

force that you could employ that could have even

29:33

gotten to all the material on MySpace, you

29:35

know, much less on some of the platforms

29:38

that are out there now that are just exponential

29:41

in comparison. A lot of Sarah's

29:43

research and writing focuses on

29:46

content moderation. Big

29:48

social media platforms like Facebook

29:50

and YouTube employ massive teams

29:52

of workers, usually contract workers,

29:55

to remove photos of violent or

29:57

sexual content that users have uploaded.

30:00

The worst thing you can imagine, well, someone

30:03

has probably tried to get that up on a

30:05

social media platform at

30:07

some point. Almost every

30:10

major platform thinks

30:13

of content moderation a

30:16

little late, like they think

30:18

of it because some crisis has

30:22

precipitated a new conversation within

30:24

the firm, like, oh, we

30:26

actually have to have some policies, or oh

30:29

my god, I didn't think someone would

30:32

nefariously do this, but here are

30:34

a bunch of people doing this thing with our

30:37

tooling or our systems. And not

30:40

only is that distasteful

30:42

to us, but maybe it's illegal, you know, in the case

30:44

of circulating child sexual

30:46

abuse material, which people do all the time,

30:49

all the time on social media to this day,

30:52

and it is illegal, right? The thing about

30:54

content moderation of social media is

30:58

that it's

31:00

treated as a trade secret.

31:03

The practices,

31:06

the specifics, who

31:08

does what and where and

31:10

exactly how.

31:13

There's no consortium of social

31:15

media companies getting together and being like, hey,

31:17

we all have the same problem. MySpace,

31:20

as the first social media company at

31:23

a massive scale and one that was largely

31:25

image based, was the first

31:27

social network to grapple with the consequences

31:30

of scale. There

31:35

were like a series

31:37

of maybe moral and ethical responsibilities

31:43

that MySpace felt, and then there

31:45

were also maybe some potential legal ones

31:48

that kind of came into

31:50

play, and so all of that necessitated

31:56

some gatekeeping

31:59

of some sort. But

32:03

the firms have a hard time thinking

32:05

about that kind of activity: gatekeeping,

32:08

taking material down, enforcing rules,

32:11

thinking about what can't

32:14

be done. They have a hard

32:16

time thinking about that as revenue

32:18

generating. If you were a user who encountered

32:20

some of this vile stuff, maybe

32:23

someone left a testimonial with a

32:25

picture of dead animals, it wasn't

32:28

clear how to flag this material, and

32:30

it wasn't clear what would happen if you did.

32:33

They often would have had no idea where it was

32:35

going. And I think in many

32:37

cases probably just presumed, oh, I'm sending

32:40

it off to the computer, whatever that meant,

32:42

when in fact, you know they were sending it off to people,

32:45

but they were doing free labor

32:47

on the front end of triaging that material

32:49

already. So like maybe at

32:53

one point you had to go through

32:55

a series of menus to find where you would

32:57

report. Now, usually the

33:02

convention is like to have that much more

33:04

available to users, like those buttons,

33:07

the red button or something, I've got to report this. But

33:11

you know, it was a process of, like, flow

33:14

chart logic where you would find this place

33:17

to report, and then this is the macro category

33:20

of why it's a problem

33:22

because it's violent, or because it's

33:25

inappropriate sexual material, or because

33:27

it's some other kind of thing.
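
To picture that flow-chart logic, here is a minimal sketch in Python. The category names, queue names, and routing below are hypothetical illustrations only; MySpace's actual reporting tool was never publicly documented.

# A minimal sketch of flow-chart-style report triage, assuming hypothetical
# category and queue names; not MySpace's real (undocumented) tooling.
from enum import Enum


class ReportCategory(Enum):
    VIOLENT = "violent content"
    SEXUAL = "inappropriate sexual material"
    OTHER = "some other kind of thing"


def route_report(category: ReportCategory) -> str:
    """Return the (hypothetical) human review queue for a user's flag."""
    queues = {
        ReportCategory.VIOLENT: "violence-review",
        ReportCategory.SEXUAL: "sexual-content-review",
        ReportCategory.OTHER: "general-review",
    }
    # Whatever the macro category, the flag ultimately lands in a queue
    # that a human moderator works through.
    return queues[category]


if __name__ == "__main__":
    # The user picks the macro category; the report is routed from there.
    print(route_report(ReportCategory.SEXUAL))  # sexual-content-review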

33:29

I mean, I would argue that making

33:32

a better, safer, more comfortable place

33:34

for people ultimately will generate revenue,

33:36

but that's kind

33:39

of a longitudinal argument for

33:41

companies that want quarterly returns, so

33:43

it's hard to make that case. So

33:46

what happened was, in

33:48

the case of MySpace, you know, they

33:50

had to build up a content

33:53

moderation department,

33:56

which meant they also had to create a bunch of policies

33:58

simultaneously, because the policies

34:01

governed the operations of content moderation.

34:04

Executives often rationalize these

34:06

haphazard content moderation workforces

34:09

with haphazard workflows. They

34:12

assume it will all get automated

34:14

eventually, and for those who

34:16

work as content moderators, the

34:19

experience can be traumatizing. Sarah

34:24

talked to moderators from multiple platforms

34:26

for her book, including someone who moderated

34:29

MySpace content. She said,

34:31

well, for the three years

34:33

after I worked at MySpace,

34:36

if I met someone, I

34:38

wouldn't shake their hand. So I said, can you tell

34:40

me more about what you mean by that? She

34:42

said, well, I know how

34:44

people are, and people are nasty and

34:47

they're gross, and I don't

34:49

want to touch a stranger's hand because

34:52

I know the stuff they do. So

34:55

this is kind of how she went forward

34:57

in the world after that experience.

35:00

She told me that she had

35:03

coworkers or employees that she

35:05

was worried about unleashing back

35:08

into the world because of the

35:11

harm that they underwent in

35:14

what they were doing and seeing. You

35:16

know. She told me, maybe some of these people started

35:18

out a little bit weird and this job

35:21

just you know, took them to the mat

35:23

psychologically, and she said,

35:26

you know, she often worried about what became

35:29

of those people, where did they end up? And in

35:32

case you're wondering, automating

35:35

content moderation would

35:37

be extremely difficult to do.

35:40

In fact, many of the much

35:42

heralded AI applications depend

35:46

on this kind of labor too. A

35:48

recent Time magazine feature revealed

35:51

that workers in Kenya moderate

35:53

and filter ChatGPT content

35:56

for less than two dollars an hour. Does

35:59

it have to be that way? I

36:03

guess companies

36:05

think so for now, um

36:10

and they throw a lot of resources at it, you

36:12

know, again computationally, but there's

36:15

no getting away from the human, the

36:19

human ability to

36:21

discern that is so uniquely

36:26

human. To take all of these

36:28

inputs: symbols,

36:31

language, cultural meaning,

36:34

you know, the specificities of a particular

36:36

region in Mexico, you know, for

36:39

example, um, and

36:42

the political situation

36:44

in that place, and

36:47

having someone who knows intimately

36:50

that area and

36:52

can respond to it, like, that

36:55

is nuance. And it's

36:59

so uniquely human in some ways.

37:02

It's like that discernment and judgment,

37:07

like, yes, if, you know, there's too much

37:09

boob in the photo. Okay, a computer

37:11

can, like, make a decision about that, yes.

37:14

But when we bring in all

37:16

of these elements language, culture,

37:19

symbols, politics,

37:25

you know, like regional politics in some cases,

37:27

very specific religion,

37:30

all of these elements that are so complex

37:33

that people spend entire careers

37:35

studying them or, you know, whatever, and

37:39

then we ask very lowly paid people

37:41

in a completely different part of the world to decide

37:44

about it, or we try to create

37:46

algorithms that can imitate

37:50

those decisions. You know, things fall

37:52

through the cracks, and it's

37:54

a really hard, hard problem

37:57

to solve under the

37:59

current business model of social media, which

38:01

says post it and we'll

38:04

sort it out later. MySpace is

38:06

still grappling with content moderation

38:09

because MySpace still exists.

38:11

It is still around. It

38:14

exists as a company, it exists as a

38:16

platform. It collapsed, certainly;

38:19

no one I know has used it in a decade

38:21

but people still work there, people post

38:24

on it. What is MySpace now

38:27

in twenty twenty three? In the

38:29

next episode, we're going to explore

38:31

what's left of it. Thanks for listening

38:33

to Main Accounts: The Story of MySpace,

38:36

an iHeart original podcast. Main

38:39

Accounts: The Story of MySpace

38:42

is written and hosted by me, Joanne

38:44

McNeil. Editing and sound design

38:46

by Mike Coscarelli and Mary

38:48

Do. Original music by Alice

38:51

McCoy. Mixing and mastering by

38:53

Josh Fisher. Research and fact

38:56

checking by Austin Thompson, Jocelyn

38:58

Sears, and Marissa Brown. Show

39:01

logo by Lucy Quintanilla. Special

39:04

thanks to Ryan Murdoch, Grace

39:06

Views, and Heather Fraser. Our

39:08

associate producer is Lauren Phillips,

39:10

our senior producer is Mike Coscarelli,

39:13

and our executive producer is Jason

39:15

English. If you're enjoying the show, leave

39:18

us a rating and a review on your favorite podcast

39:20

platform. Sadly, my

39:22

MySpace page is no longer around, but

39:25

you can find me on Twitter at @jomc.

39:27

Let us hear your MySpace story, and

39:30

check out my book Lurking. Main

39:33

Accounts: The Story of MySpace

39:36

is a production of iHeart Podcasts.
