Deepfake Abuse is a Crisis w/ Kat Tenbarge

Released Thursday, 4th April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

This is your fault. You made this technology. You did not think about this, or if you did, you did not create guardrails around these obvious problems, and now people are suffering as a result.

0:29

Hello, and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and before we get into this week's episode, I just want to talk about something pretty exciting with the podcast. This month we celebrate our fourth birthday. It's hard to imagine saying that about a show that I started in the early days of the pandemic, when we were all in lockdown and I figured it was probably a better time than any to start a podcast, and now four years later we're here. I've done over two hundred episodes of the show, digging into a whole range of critical topics on the tech industry with so many fantastic guests.

1:02

Over those four years, we've put more than ninety show transcripts on our website for people to explore, to go back over things, or just if they prefer to read rather than listen to these interviews. We've done livestreams, like our usual end-of-year livestreams where we go over what happened in the past year in tech, and of course the recent livestream that we did on Dune, and we want to start doing some more of those. I think you all have fun with our annual Worst Person in Tech contest, which of course we'll be doing at the end of the year again this year. And in October of last year we did an in-depth series called Elon Musk Unmasked, digging into who this man is, where he came from, and how he built this mythology of himself that is hopefully rapidly falling apart as he takes this turn to the extreme right. But that doesn't mean that there aren't really serious dangers that come with the power that he has amassed.

1:58

Along with the many listeners who enjoy the show and share it with their friends, media outlets have also recognized the work that Tech Won't Save Us is doing. Last year the New York Times recommended Tech Won't Save Us for people wanting to know more about AI and said, quote, "For anyone alarmed by all the widespread predictions about AI swallowing whole entire job sectors, the show's measured coverage might prove reassuring." Gizmodo said, quote, "Tech Won't Save Us weeds through the crap and snake oil of the industry and discusses the human consequences of the technology." And Mashable said, quote, "a healthy counter-dose to the nauseating tech-utopia idealism that usually surrounds Silicon Valley and enthusiast coverage." People recognize what we're doing here with Tech Won't Save Us by spreading critical perspectives on technology and why it's so important.

2:40

In the coming months, I'll be looking to explore even more issues that we haven't gone in depth on before, like Starlink, Neuralink, fast fashion, and geoengineering, and new ways to bring you tech criticism, and it's all because of the support of listeners like you. Thank you to all those who already support the show, and as we celebrate the show's fourth birthday, I'm asking those of you who don't already to consider going to patreon.com/techwontsaveus and supporting the show for as little as five dollars a month, so I can keep doing this work and to help us tackle a new project. If we get two hundred new supporters this month, we're going to do a new series like the one we did on Elon Musk last year, but this time we'll be tackling the AI hype of the past year by digging into the false promises of tech titans like Sam Altman; the environmental consequences of these tools, including everything from the massive energy demands and water use to the growing backlash to hyperscale data centers built by companies like Microsoft, Amazon, and Google around the world; and, last, the key question: do we really need this much computation? With the AI hype in full swing and the growing drawbacks of the Silicon Valley model becoming all too apparent, it feels like the kind of series we need right now, and you can help us make it. You can go to patreon.com/techwontsaveus and become a supporter this month to help us reach our goal. So thanks so much for your support.

4:00

Now, let's turn to this week's episode. My guest is Kat Tenbarge. Kat is a tech and culture reporter at NBC News. Now, I'm sure that you've seen all of these stories circulating recently about the deepfake nude images and AI-generated explicit images that have been produced of celebrities, women celebrities in particular. Taylor Swift of course comes to mind, but many others too, and the conversation this is getting people to start having about the broader impacts of these generative AI tools that tech companies have been rolling out over the past year or so. In this episode, we dig deep into what is actually happening there and the problems with these tools, and we don't just focus on celebrities, even though those are the ones that really start these conversations and get people to pay attention to it. This is actually having very serious effects for non-celebrities, the regular people of the world, as well. In particular, these tools are being used to generate explicit images of middle schoolers and high schoolers in the United States and many other parts of the world, and that's having serious consequences for the victims as those images circulate.

5:04

I think it's positive that we're starting to have conversations about this and what we're going to try to do about it, but it seems like too little, too late at this point, when so many people have already been harmed. So this is a conversation that I've been meaning to have for a while, and I'm very happy that Kat was willing to come on the show to dig into it with us and to give us some great insights into the serious problems that these generative AI tools are creating as they allow people to generate these explicit images of people in their lives, and the serious consequences that come of that. One more thing to note: given the subject matter that we're discussing, we also get into some pretty heavy topics like suicide and self-harm. I know some people prefer to be aware of that ahead of time. So, if you enjoy this conversation, make sure to help us hit our goal this month of getting two hundred new supporters. You can join people like Nina from Essex, Antonin from the Czech Republic, JK from Connecticut, Danielle from Victoria, BC, Sabian from Pow, Mathias from Switzerland, and Jim in Seattle by going to patreon.com/techwontsaveus, where you can support the show for as little as five dollars a month and help us hit our goal to make that new series digging into the AI hype and the backlash to data centers. Thanks so much. Enjoy the conversation.

6:17

Kat, welcome to Tech Won't Save Us.

Thanks for having me. It's great to be here.

Yeah, absolutely. I've been really looking forward to speaking with you. I've been following your reporting and coverage for NBC News for a long time, and one of the topics that you have been writing about a lot recently, because it's obviously been in the public conversation, is also an issue that is of incredible importance. I think even though it has become something that people are more aware of, it's an issue that is not getting the level of attention that it deserves, because of the widespread harm it is causing to a wide and growing number of people, particularly women, but beyond that as well, right? And so I wanted to start with probably the moment or the event that many people will be most familiar with, and this is when, in January, a number of explicit AI-generated images began spreading of Taylor Swift, on X in particular, but also on some other platforms. Can you talk to us a bit about what happened there and what the significance of that moment was?

7:15

So, for the past year, roughly, I've been seeing more and more incidents like that on X, as it's now called. This was one of the biggest incidents, probably, in the entire deepfake space so far, and the virality of the moment really hinged on it being Taylor Swift, who has been victimized in this way before, but also on how it happened. Basically, there was an account that had a modest following, and the way that it had built that following, the way it had gone viral, was by posting things related to sports, things related to celebrities, just pop culture tweets that were engagement bait. A lot of these accounts can go viral pretty easily by sexualizing women in the public eye. Sometimes they're able to do this in a more innocuous way, by reposting an Instagram photo that's kind of sexy, or, you know, commenting about various women in the public eye. But in this case, what they did was they actually posted an image of Taylor Swift. It was an artificially generated image; the entire image was fake. A lot of times when you see these, it's a real photo that's stitched into something else, or it's an edited photo that was edited with AI, but in this case it almost looks more like a photorealistic drawing, and if you looked closely you could kind of tell this isn't a real image. But what it depicted was someone who is very obviously Taylor Swift in a football stadium, being sexually harassed, sexually assaulted even, by various men in the football stadium who were depicted as fans.

9:00

That played on a bunch of different elements. The first element was it being non-consensual. Not only did Taylor Swift obviously not consent to this image being created or distributed, but the scene she's being depicted in hinges on this idea of non-consensual sexual assault. And the other aspect here is that early this year Taylor Swift was constantly in the news for being at her partner's football games, because he's one of the biggest people in the NFL, and so it had become a cultural phenomenon already, this sort of sexist portrayal of Taylor Swift, like she's taking the attention away, or why is she there, why are we even hearing about her at all. And so this deepfake image capitalized on all of this, and I think that's why it really took off. By the time it had been taken down, it had been viewed over forty million times. When this reached the mainstream, it was showing up on almost everybody's timeline on X, and then the news started covering it, and it just blew up from there.

10:08

I think that was really revealing when it comes to the way that these images spread around. I think you would imagine that if something like this was going to happen, especially to someone like Taylor Swift, it would be addressed very quickly, because someone like Taylor Swift is obviously not only very clearly in the public eye, but also quite an influential person. I'm sure that she, or her people, can get in touch with the social media companies to try to ensure that something is done, and it seemed quite notable in this event that it continued spreading around for such a long period of time. What was going on there, and why did it seem like Twitter, or X, or whatever, was not able to get a hold of this? And I believe, in a story of yours that I read, that these images also started to show up on platforms like Instagram and Facebook, and I'm sure elsewhere as well. How does it spread so much when these sorts of images should be taken down?

11:01

So, what really confronted a lot of the issues in this space is how the whole incident played out with the Taylor Swift images, and it exposed something that people who have been looking at this have been aware of for a while, which is that platforms are extremely reluctant to do anything about it. They're reluctant to actually take down the images, they're reluctant to suppress the images and links to the material, and above all they're extremely reluctant to suspend the people who are posting this stuff and the companies that are pushing this stuff. And so in the case of the Taylor Swift images, Twitter did not do anything at first. I actually don't think Twitter ever did anything in regards to that viral image. What happened was that Taylor Swift fans started a campaign to mass report this image, and after hours and hours and hours of presumably hundreds if not thousands of Taylor Swift fans reporting the image, it was finally taken down, because they overwhelmed the reporting system, and that's how they got the image taken down.

12:09

And it wasn't until like twenty-four hours after it was posted that you saw Twitter start to actually respond to this. And this really tracks with the average experience of someone who is victimized by material like this, including celebrities. I think we're going to go into this a little bit more later, but other, less famous celebrities, but still people with PR teams, people with lawyers, other celebrities and influencers and creators, have talked about how shocked they were that it was impossible to get this stuff taken down, and how they went through every avenue available to them and still nothing was being done about the problem. So the Taylor Swift situation really exposed to the mainstream not only what this problem looks like, but how difficult it is to get anyone to do anything about the problem.

13:01

Obviously, if you have the Swifties on your side, maybe you can even get some action, because they can actually pressure these platforms to do something. But otherwise, if you don't have this kind of ravenous fan base, it might be difficult to actually get some action. And as you were saying there, you've been reporting for a while now on how this has been affecting a number of other people, but in particular it seems to be young women, in their early twenties or their late teens, potentially even younger, getting their images used and spread around in these ways through these deepfakes and these kinds of AI-generated images. What are you seeing there, and, as you were saying, what is their experience of this?

13:40

Yeah, so the practice of deepfaking got on my radar in the late 2010s, which is when you saw the technology evolve to this place. There were already sort of rudimentary AI-generated deepfakes coming out in the years between 2016 and 2018. You saw, in the corners of Reddit, in the corners of 4chan, people starting to do this, and the people who they were targeting were often celebrity women, which makes sense, because if you're trying to go viral, or even if you're not trying to go viral, if you're just trying to get some attention for your technology or whatever you're doing with it, then it would make sense that you're going to go after a highly visible, high-profile woman. And especially with the deepfake community, when you study how the community has evolved over the past seven years or so online, it is a highly gendered environment, and the ideology behind what they're doing is highly, highly gendered. So you see the communities of deepfake creators, and they are just dominated by men. If there are women there, you're not really seeing them identify themselves as women; it's sort of an anonymous void.

14:50

And a lot of people who have studied the deepfake space have talked about how it's emerged as a kind of social bonding community for a lot of men of various ages. So in the early days, this was relatively contained to these male-dominated corners of the internet, but over the past few years it got on my radar because I was seeing influencers I covered having to deal with this, though not in a major, major way. So if I was already looking at a thread where there were certain attacks against an influencer, one who had a really controversial reputation, I would sometimes see this material, or if the influencer was extremely vulnerable. Like, one of the first times I saw deepfakes, it was actually of child influencers; I found deepfakes of them on an extremely unsavory website, and my reaction, this was probably around 2019, was: this is horrible, but I have to be careful about how I approach this, because what I don't want to do is make the problem inadvertently bigger by shining a spotlight on it. At the time it felt very underground, kind of like a new threat.

16:00

And then in 2023, in January actually, so a year before the Taylor Swift images, something happened that registered to me as a really big deal, and it did become somewhat of a watershed moment, though most of the public, I doubt, is really aware of this. What happened is that a major streamer on the platform Twitch was livestreaming one day, and in the corner of what he's showing on his camera, you can see his web browser, you can see what he's doing on his computer, and he has a tab open, and the tab he's looking at is deepfake imagery of some of his peers, women streamers, on one of these really prolific deepfake websites. You can see what he's doing. I don't believe that he meant to showcase that, I don't think he meant to expose what he was doing, I think it was an accident, but it caused this cataclysmic outpouring of attention onto the issue, and the people who were paying attention were the people who actually wanted to consume this content. From that day forward, we saw the amount of traffic being directed to this website just start to skyrocket, and it has never stopped; month over month, ever since this happened, the traffic has just continued to climb. And what happened by the end of the year is that in 2023 there were more deepfakes created than in every other year combined thus far. That's when the problem really emerged as a mainstream issue.

17:41

And not just something in the corners of the internet with the most unsavory types of people. You also saw immediately the effect that this had on the women whose deepfakes he was looking at, because some of those women, who were some of the biggest female Twitch streamers, were really, really, understandably traumatized by this. They talked about all the various consequences it had on their mental health. One of the women who was affected talked about how she had struggled with an eating disorder in the past, and seeing her deepfake had reignited some of those triggers, because she was seeing her face on a different woman's body, and it was starting to make her question her own body, like, should my body look like the body in the deepfake that I'm viewing? And that's just one of the many under-explored and under-recognized consequences that can happen, in addition to all of the trauma of being essentially sexually abused. I know a lot of people struggle to make that connection; they're like, it was not real, it's not happening to you. But in reality, our brains, on a neurophysiological level, do not recognize that, and this is something I've talked about with people at companies like Adobe. We all recognize why this is an issue, because we all know that the way the brain works is that when you're processing videos and images, your brain is kind of treating it as if it's real.

19:09

So even if you know cognitively that what you're looking at is fake, it still has a real effect, not only on the person who is depicted, but on the viewer. And so it is a very similar issue, in that sense, to what used to be known as revenge pornography, which we now prefer not to call revenge porn; we prefer to use terms like non-consensual sexually explicit material because it's less stigmatizing. But the phenomenon has played out so similarly to how that issue played out in the 2010s, with women's nude photos being posted online without their consent. It's really just like watching the same thing play out yet again, because where we're at right now is women and victims of this are sharing how harmful it is, and tech companies and the people who are responsible for this problem have not caught up. And so now we're in a space where there's very little regulation, very little oversight, very few paths to recourse, but the problem is growing and growing and growing.

20:14

It's striking hearing you explain all of that. I remember that story about the Twitch streamer. It wasn't something I paid a lot of attention to, and I remember when it was passing through the kind of media cycles that I follow in the tech industry. And again, unlike the Taylor Swift moment, it wasn't something that I saw break out into the broader public discourse; it was something that was contained to these sorts of communities that pay attention to the creators and what happens on TikTok, and I'm sure the kind of communities that are paying attention to the deepfakes and AI-generated explicit images and things like that. But I'm happy that you brought up the comparison to what was happening in the past, right? Because it's not as if nude images of celebrities in particular, or even of minor public figures, are something entirely new. These have circulated in the past, but usually they had to be real images, not things that were being created. As I was reading some of your stories and preparing for this, I thought back to the leaks from the Apple iCloud storage stuff that was particularly focused on Jennifer Lawrence but, I believe, as I said, a number of other people, about ten years ago, when their nude images were circulating around. And I wonder what you see in the similarities between what happened then and what is happening now, but also the differences, in what I would see as the scale of the problem, since these images can just be created with these tools that are now easily accessible, in particular since the boom of generative AI tools being released in the past couple of years. What do you see as the differences between now and then, but also the similarities?

21:50

That is a great question, and it really illuminates the scale of the problem now, because in the 2010s, when you saw this issue arise, it happened in a lot of the same ways that we're seeing it now, where the highest-profile incidents were things like the iCloud hacks and the leaks. And, you know, fascinatingly and disturbingly enough, some of the same websites and some of the same people who posted that illicit material back in the 2010s, it's the same people posting the deepfakes now, it's the same websites pushing the deepfakes now, because we never really got a handle on how to stop it back in the day. Tech companies eventually, after years of women suffering, created pathways for them to be able to at least take this material down from Google, from Pornhub, from wherever, but the websites that existed as sort of the black market of this stuff, those websites were never taken down. They're not mainstream tech platforms, they're not going to get dragged in front of Congress, people don't know their names, they're not recognizable, so they are less susceptible to scrutiny and media pushback and regulation.

23:02

And those websites and those people, they're still out there, they still exist, and this has become the new gold mine for them. As you alluded to, I think that one of the major problems here is that the non-consensual sharing and distribution of intimate images is still a massive problem, and even when, for example, it became more commonly known that you don't want to send your nude images to anyone because they could put them on the internet, even when that became kind of common knowledge, of course abusive partners and vengeful ex-partners would still release this material after they broke up. But in addition to that, predators have always gone out of their way to acquire this material through whatever means necessary.

24:00

Back in the 2010s, when it was called revenge porn and that was the big deal, I remember one of the guys who actually did go to prison. The reason that they were able to convict him is because he had hired a hacker to hack into women's devices to find this kind of material, because that guy, Hunter Moore, was making money off of it. As a lot of people were able to, he was profiting; he was able to monetize the spread of this non-consensual material on the internet. Once it became profitable to do this, in addition to being something that people just wanted to do maliciously, that's when it really became a kind of unstoppable force that institutions eventually had to pay attention to. And eventually, I think, like with the celebrity iCloud leaks, it reached a point where it could no longer be ignored, because you can ignore thousands of anonymous women, but you cannot ignore Jennifer Lawrence. She has access to the New York Times. She can say, hey, this is a sex crime, and so then you start to see things happening. And it's the same thing with Taylor Swift.

25:01

In terms of what is different now, what is so staggeringly horrifying about the deepfake issue is exactly what you said, Paris. Before, it had to be some sort of real material, and, you know, you could use hidden webcams, you could go to great lengths to acquire real explicit material of a victim, but now you don't need to do any of that. Let me rattle off a list of a couple of real things that people are doing to create deepfakes. There are people who are going onto public livestreamed footage of courtrooms and pulling images of people testifying at the stand and entering them into deepfakes. There are people going through women's Instagram accounts and LinkedIn profiles and taking close-up pictures of them and running them through programs that undress them. People are doing this with girls' yearbook photos and pictures of women and girls taken at school.

26:00

There are people doing this with broadcast news interviews, as well as movies and TV and podcasts and social media posts and all the other ways in which we are able to share our images today. And that doesn't even get into what created those Taylor Swift images, because they didn't even need any real picture; they were just able to create it out of thin air. So the problem now is that the scale of being able to perpetrate what should be considered a crime is unimaginable, and you see this with individual perpetrators, with the amount of damage that they're able to cause. I think of cases where an individual perpetrator has been able to create deepfakes of women who are close to him, like he's created deepfakes of a dozen of his female classmates, and then, additionally, he's running all these celebrity and influencer women through the same technology, so now he's able to victimize an entire range of women, those who he knows personally and those who he doesn't know. That is a level of criminal potential that is somewhat modern, and the scale, I think, is something that people have yet to fully realize, just how much of it has revealed itself.

27:14

It's shocking when you actually start to grapple with the broader impacts of it, and I want to ask about how it's affecting people beyond the celebrities, but I have one more question before I do that. You mentioned how this can be sort of a bonding thing, like there are these communities where men make these images of women and share them with one another, and those are groups that exist online. But I also wanted to know a bit about how the economy of this actually works, because, as you said, there are also a lot of websites that profit off of this and have been doing so for a long time, and obviously we know there are plenty of tools that are created to make these sorts of images. How do these companies make their money, and how does this become such a big problem that so many different actors can make money off of, even as so many people are suffering as a result of it?

28:01

So, with the internet, there are so many possible ways to monetize things now, and there are so many ways that you can set up monetization schemes, both with, you know, sort of mainstream financial institutions and outside of that. One way that I've seen people monetize the creation of deepfakes is there will be a website that is kind of like a YouTube clone, and it functions the same way that a lot of free porn websites do, like Pornhub, where you go on and, in the current state of our regulatory environment, depending on what state you're in, you may or may not need to provide your ID to view what's on that website, on Pornhub specifically, but typically you can just access it from your browser. You can just go to the website, you can just watch free videos, and that's how most people consume porn today, for free. If you're looking to make money off of your videos, there are deepfake websites that are basically YouTube or Pornhub clones, and you can go and watch a couple of minutes of a deepfake video for free, and then in the description it'll be like, here are various ways you can unlock longer content, customized content, and, most disturbingly, content that features an individual for you personally.

29:21

And so there has to be a way to actually get money into the hands of the people who are making this stuff, and the way that I've seen them do that is, big news, cryptocurrencies. So cryptocurrency wallets have become a big part of this economy, and it is a very difficult kind of thing to figure out, like, how are we going to stop this, because of the nature of cryptocurrency: it's harder to trace, it's harder to control, and there's no government that is going to determine the use of a certain type of cryptocurrency. So that's one way that they can profit off of it, through crypto. Another way, staggeringly, is that in an NBC investigation we found a website that was like an OnlyFans clone, and they had Mastercard and Visa hooked up to this marketplace where they were selling deepfakes.

30:16

We reached out to Mastercard and Visa and we were like, hey, why are you offering your payment processor services for this website? We never got an answer, and it's unclear whether it was because of our reporting or not, but that one site that we found basically just banned everyone; that site ended up going down. But there is the potential to just create a new one, and I think that while a lot of these financial institutions and payment processors have become really strict about supposedly pornographic content over the past several years, deepfake stuff is getting through, and it raises the question of who is at the wheel, who's monitoring, who at large is holding Visa and Mastercard accountable, and are they doing a good enough job? Because at the current stage, deepfake producers are able to market and make money off of this content using people's credit cards.

31:08

It's really striking, that point, when you consider how keen payment processors have been to crack down on, you know, the ability to sell one's own nude images and stuff like that, on any kind of sex worker or anything like that, and the target that has been placed on that by lawmakers and by payment processors and whatnot, but how these deepfake companies and whatnot are seemingly able to get away with it, at least so far. And I imagine part of that is because it can kind of be like a whack-a-mole situation, with new ones popping up here and there. But as you were saying, this is not just something that affects the celebrities that we all see on the news all the time.

31:43

It's average, everyday people who are also being affected by this and who are having these images circulate about them. As we were talking about before we started recording, there was a documentary, I don't know if it's fully released right now or if it's still showing at festivals, that is really grappling with this issue, called Another Body, and it looks at this woman, I believe she's a high schooler in the video, and they actually, kind of creatively, use deepfakes in order to hide who she actually is, by using a deepfake for her throughout the whole film, and you don't find out until the end, if I remember correctly, or maybe it's partway through. That is the kind of film that really showed me how much of a problem this is. What are the widespread implications of this, and what are we seeing when it comes to regular women, or even girls in high school and things like that, when, you know, the people they know are making these images of them?

32:40

So, Another Body, I have seen the documentary, and it opened my eyes to the scale of this problem. It's wild. And I think part of the reason that the deepfake problem we're dealing with at this very moment exists is that even in 2018, 2019, 2020, in those years, the technology existed and people were using it for this purpose, but there was kind of an entry level of sophistication that you had to be able to access to create a deepfake. You had to have a computer that could store all of this technology and process all of this at once, and you had to have the technical ability to know how to do that. And so for years that limited the spread and the scale of the destruction. In the Another Body documentary, the perpetrator who they identify is a computer science student in college, so he understands computer science, he's able to understand how to do that.

33:36

What we're dealing with today is that there are hundreds of apps on the Google Play Store and on the App Store right now that you can download and input photos into, and some of them are really rudimentary. I tested out a couple of them on pictures of myself to try to figure out, like, what can you really do with these apps, and some of them don't do it very well, some of them are super, super rudimentary, but there are a bunch of them, and with a lot of them you have to pay a few dollars or you have to sign up for a subscription. A lot of them are scammers, so I try not to give them my credit card info.

34:11

And in that case, Google and Apple are also getting their cut as people are doing this.

Exactly. And so with all of these apps, you no longer need to have any sophistication at all, because I'm someone who doesn't have a lot of computer science ability; I'm at, I believe, a rudimentary level. And so when I'm testing this now, I'm coming at it with the sensibilities that your average fifth grader would have and their ability to navigate these apps, and that's exactly why we're seeing this problem in middle schools. Because now what's happening is these apps are being advertised on mainstream social media platforms, they're being advertised to young people, they're being advertised with photos of young celebrities that fall into these people's age group, and the message that's being sent to high schoolers and middle schoolers around the world is: just download this app and do this, and it will take five seconds. And that is exactly what is happening. We've seen it at at least a dozen middle and high schools, and I truly believe that that is just the tip of the iceberg.

35:16

Because what we've seen in some of these cases is that the schools kind of try to cover it up more than they actually fix the problem, because what middle or high school wants to be on the national news for a deepfake scandal? But regardless, the problem has woven its way into all of these various communities around the world. There was a middle school in Spain where this happened, I think there was a school in Canada where this happened. When you look at the map of where this technology and the creators behind the deepfakes are coming from, it's all over the world. I've spoken to victims from India, I've spoken to victims from Australia, I've seen technology developed in China, in Ukraine. The technology and the victimization are very clearly global.

36:00

And that really hinders the ability for change, because even if, let's say, we banned deepfakes in the United States, which we're nowhere near doing, so many of these apps are produced outside the United States, and how do you even track down the perpetrator? You know, really frustratingly, a lot of times when someone is a victim of one of these deepfakes and they go to their local police department, the local police officer who they interface with more likely than not does not have any specialized training or knowledge to deal with this issue. And unfortunately, and this tracks with how we respond to sexual violence across the board, a lot of times what victims are hearing is: we don't think this is a crime, we can't help you, and even if we do think this is a crime, we're not going to devote any resources to helping you figure out who's making these deepfakes of you. So it's really left to the victim, in most cases, in the vast majority of cases, to try to seek some sort of recourse themselves.

36:58

As you describe that, I can only think about the harm that comes with it as well. I'm the furthest thing from an expert on this issue, but I know that I have read multiple stories of people who were in high school and had nude images that they took of themselves, or that someone took of them, spread around through their school, and then, and I know these are really sensitive topics, engaged in self-harm or even attempted suicide or died by suicide as a result of that. And now these sorts of images can just be created by anybody, by any of their fellow students, and spread around throughout their schools, and they have so little control over that. Are we seeing a broader impact on the students and these victims from the creation of these images?

37:40

Yeah. There has been at least one reported case, I believe in the UK, of a young person who died by suicide after seeing deepfakes of themselves. I don't have that reporting myself, we didn't follow it up ourselves, but from what I saw reported in the tabloids, the child made reference to this issue before they died. So absolutely, we're seeing these consequences spiral out, and I think we're going to be seeing and hearing a lot more. And in addition to that, one of the things that really stands out for me about what's happening here is the perception of fear, and the ways that women and girls are now trying to protect themselves and the ways they're being encouraged to protect themselves. The fact is, there is nothing that you as an individual can do to prevent someone from doing this to you, but people are going to try. They're going to try to find a way to avoid this happening to them. And what that looks like is women unenrolling from male-dominated fields because of their male colleagues doing this to them. That's what we saw with the Another Body documentary: it was a computer science course, which is already a very male-dominated field, and they're doing this to all the women in their classes. So the end result could therefore be that you see this gender discrimination continue to be perpetuated in these male-dominated fields, and I think that honestly has a lot more to do with it than people realize, because that's the story of, unfortunately, a lot of paths that come back to this issue, where you don't have women in the room, you don't have women in leadership.

39:19

And so sexual harm and non-consensual imagery become a key functionality of the tech that we produce. So that's one huge issue with this. But in addition to that, you also see women and girls wanting to recuse themselves from public life out of fear that this will happen to them. And I've seen people, even people who seemingly have good intentions, say things like, this is why you shouldn't put your face on the internet. And it's like, dude, you're doing the work for them, because let's be real, the only way to be a public figure in 2024 is to have some sort of online presence. So you're basically telling women and girls: don't try to be a public figure, don't try to go into politics, don't try to be a visible person in your field or in the world in general, out of fear that your image will be corrupted or that your image will be abused. And I think that's a really harmful message that is now being perpetuated, because the saddest thing about it is that it wouldn't even work. When we see these cases pop up in middle and high schools, it's not because the girls have social media presences; it's because it's their classmates doing this stuff. And it really tracks with the entire spectrum of gender-based violence and how we see it most commonly perpetrated, which is by people who are physically close to the victim.

40:44

Yeah, I think it's such an important observation for you to make, and I'm not surprised to hear those sorts of things, but to think that these are the effects that women are experiencing as a result of these technologies, and that there's so little accountability for the people creating these technologies, let alone the people using them in order to create these images, it just makes you profoundly angry.

41:06

Yeah, yeah. And it makes me feel, like, I know that after looking at this for the past year, and I know there are other reporters, like Samantha Cole, who has been reporting on this from day one, from like the day of the first deepfake, you start to understand. Over time you start to realize that this giant, pressing issue does not matter to the major companies and the major people who are racing ahead in this AI arms race. They're acting like this doesn't even exist and that it's not even happening, because if people were to really grapple with the actual harm that is truly being committed by this technology, then we would start to ask: should places like Microsoft be producing this stuff at this rapid rate with zero guardrails? Should OpenAI really have the prominence that it currently does in business and culture? Shouldn't we be asking these tough questions? They don't want that to happen. They love how much money they're making from AI right now. They don't want to have to deal with this conversation. They would prefer that nobody talked about it, and when we do talk about it, they try to talk about it as if it's a thing that's going to happen in the future and not something that is currently already happening.

42:20

It just makes me so angry. I remember reading one of the stories that you wrote, I believe it was after the Taylor Swift incident, where the Microsoft CEO was asked about this, and he was like, yeah, we definitely need, you know, rules around this or whatnot, we need to pay more attention to it. But it's like, that is not nearly enough. That doesn't get to the scale of what is happening and how, in many cases, it's the tools that are created by these major companies that are helping to enable these things to actually be done.

42:49

That has been one of the most shocking, and it shouldn't be shocking, but one of the most staggering things for me personally. I think that's a good case study, how Microsoft responded, because as he was saying that, I was figuring out that the Taylor Swift images were created with Microsoft's generative AI tools, and I remember Microsoft tried to say, we don't think that's true, we don't think it was our tool, but eventually they relented and they were like, yeah, it was our tools, and now we've strengthened the protections. We cannot function in a society where we're going to let the harm happen first and then we're going to respond to it. We simply cannot. These tech companies have gotten away with this for so long, and their status quo is: they're just going to create new technology, they're going to push it out, they're going to wait for people to abuse it, and then they'll have the conversation. That is the process. It's not, let's consider the harmful effects before we push it out. People will lose their lives; they already are. And I feel as though they exist in this echo chamber of plausible deniability, and at some point we have to puncture that and be like, no, this is your fault, you made this technology. You did not think about this, or if you did, you did not create guardrails around these obvious problems, and now people are suffering as a result.

44:18

Definitely. You know, it's not just the people creating these images that are at fault here and need to be held to account, though they absolutely do; it's also the people who are enabling it in the first place, who are creating these technologies, who are not thinking about the broader consequences, or just ignoring the fact that there will be broader consequences, because that is much more beneficial and easier for them, to be able to roll out all this stuff and create all the hype around it and get the investors excited, rather than saying, oh, there are a lot of potential problems here that we actually need to address. Can you talk a bit about, you mentioned Microsoft there, but obviously OpenAI has these tools, Google has these tools, I believe Facebook or Meta is working on their own. Do we see similar things from a lot of these major companies when it comes to people being able to use them to create images like this?

45:02

Yeah, and I think that part of what makes this complicated is that a lot of this technology is open source, and a lot of it is able to be taken from, you know, code repositories like GitHub, for example, which is owned by Microsoft. Microsoft owns GitHub, and if you go to GitHub right now, you will see hundreds if not thousands of AI models that are on there for anyone to use that are created just for this purpose; they're not even under the pretense of dual use or other purposes. There are just these bounty requests popping up everywhere, like, please, someone AI this one, actually, someone make an AI tool that can do this to women's faces. It's right out in the open.

45:48

addition to that. One of like

45:51

the really fundamental issues that I had

45:53

with the current Ai space is that

45:55

they're putting these products out into the

45:57

consumer market and then not even creating

46:00

technology. There is no technology that can

46:02

reliably to tax is something is A

46:04

I generated and in a lot of

46:06

cases like open A I have a

46:09

good example of best. When.

46:11

They first launched Touchy Be T. They

46:13

were like serious aside program where you can

46:15

put the text into it and will

46:17

tell you if it came. From. Taxi pity

46:19

that didn't work. They pulled

46:21

that program like a year

46:23

later, and they were like, this doesn't work,

46:26

the accuracy rate is so low, so now there's

46:28

just nothing. And they're very concerned about

46:30

that. Like, they write on their

46:32

website like there's nothing that exists to

46:35

reliably detect AI-generated

46:37

text. There's nothing that exists to reliably

46:39

detect AI-generated video. So they're

46:42

throwing the rest of us to the

46:44

wolves and I think that people need

46:46

to have some solidarity, but I understand

46:48

why people don't, because it is all

46:51

moving so quickly that I don't think

46:53

people realize, like, the sheer impact and

46:55

magnitude of everything. But it's like, these

46:58

companies do not exist in our interest.

47:00

When they talk about strategies to combat

47:03

the harms of AI, they're not talking

47:05

about you and me, they're only talking

47:07

about themselves, and the sooner everybody

47:09

clicks to that, the better. I'm just like, AI

47:12

is not even for us. It's not

47:14

for us. It's to take our jobs

47:16

away, it's to make our creative

47:19

work less valuable. It's to make us

47:21

more productive for their bottom line. And

47:23

very little of what AI is projected

47:26

to do will impact the average person

47:28

in a meaningful way. It's all

47:30

about creating and inflating these artificial stock prices

47:32

and value for the people at the

47:35

top to benefit the most. That is

47:37

so well said, and even reading your

47:39

work I was struck by how the lack

47:41

of responsibility isn't just in the creation

47:43

of these tools and what people can

47:45

use them for, but also just in

47:47

the search engines and the way that

47:49

many people access information right? if you

47:51

go on to Google's search engine, you know,

47:53

just the other day I was for

47:55

example trying to look up images of

47:57

like Elon Musk and Mars for

48:00

a piece I was writing, and

48:02

so many of the images

48:04

that, like, Google served me up were

48:06

AI-generated stuff, and obviously it

48:08

wasn't labeled as that. But it reminds

48:11

me of the past when the image

48:13

results used to be filled with like

48:15

Pinterest images, and now it's just like

48:17

stuffed with AI-generated garbage, and

48:19

while you can clearly tell in many

48:21

cases you would think that they're just

48:23

normal images or something that someone

48:26

had created. But reading your reporting, this

48:28

is also a serious problem with explicit

48:30

images, where these deepfakes show up

48:32

in Google Images even when you search

48:34

for the names of certain celebrities for

48:36

example, it will show up in their

48:38

kind of results. Like, what are we

48:41

seeing on the search engine side of

48:43

things, and are these companies even properly

48:45

responding to that? The search

48:47

engine component, it's huge. A

48:49

lot of the ways that people are

48:51

exposed to this material whether they want

48:54

to or not is through the search

48:56

engines. When you look at those major

48:58

deepfake websites that host

49:00

the majority of this material, the way

49:02

that people are getting to that website

49:04

is they're Googling something

49:07

and then Google is giving

49:09

them the links to go to this

49:11

website. This is a Google problem. Yes,

49:13

the website is ultimately at fault for

49:15

hosting this material, but they

49:17

all have a lot of responsibility

49:20

for powering the existence of this

49:22

website. People wouldn't be going to

49:24

this website if Google wasn't sending

49:27

them there. And Google's stance

49:29

is that, like, we're gonna

49:31

wait and see how the cultural

49:34

conversation and specifically the legal conversation

49:36

around this develops. They're

49:38

basically like we're going to wait

49:41

and see if this becomes illegal

49:43

and then we'll react, like, our

49:45

policies are shaped by local

49:48

legal requirements, I

49:50

suppose, is just how they put it. Like, they're not

49:52

looking at this from a moral

49:54

perspective. They're not listening to the

49:56

people who use their platform. What

49:58

they're listening to is the only thing

50:01

that has the power to keep them

50:03

accountable, and that system of accountability is

50:05

not doing enough to combat this issue.

50:07

So Google has removed itself from having

50:09

responsibility. The only way that it's gonna

50:12

take responsibility is if people demand

50:14

that Google take responsibility for this. And

50:16

something that is so insidious about the

50:18

way that that happens is like Google

50:20

in its own defense will say, we're not

50:23

showing you deepfakes, we're not even

50:25

showing you sexually explicit material if you

50:27

just type in the words Jennifer

50:29

Lawrence; you have to type in Jennifer Lawrence

50:31

deepfake to get there.

50:34

And that's true, but what

50:36

some researchers found is that

50:38

if you are a news consumer

50:40

who wants to hear about Jennifer Lawrence's

50:42

thoughts on deepfakes, or you want

50:44

to hear about Jenna Ortega's thoughts

50:46

on deepfakes, or Scarlett Johansson, what

50:48

has she said about deepfakes? Because

50:50

Scarlett Johansson is one of

50:52

the most victimized women in the deepfake

50:54

space, and she has

50:56

been talking about this for a

50:59

long time. So when you go

51:01

to try to find an article about

51:03

Scarlett Johansson's new thoughts on

51:05

deepfakes, Google is not just giving you

51:07

that in the results. Google is giving

51:09

you links to deepfakes when

51:11

you're explicitly looking for things that aren't.

51:13

And Google's defense is like, we only give

51:16

people what they're looking for, but that's

51:18

not true. And even if it was

51:20

true, I think we really have to

51:23

ask Google, is that right? Should we

51:25

give people this type of material that's

51:27

harmful just because they want to see

51:30

it? That's one of the biggest questions, I

51:32

think, in the regulatory space, is like,

51:34

does Google have a responsibility to actually

51:37

prevent people from accessing material that's harmful?

51:39

And in other cases the answer is yes. When

51:41

it comes to things like child sexual abuse

51:44

material, Google is not gonna give you that

51:46

just because you want to see it, but with

51:48

deepfakes they haven't quite reached that point yet,

51:50

and they need to be pushed into that

51:52

point. Yeah. And I guess with

51:54

the child sexual abuse material, that's because that is

51:57

explicitly illegal and so they have to act on

51:59

it, right? Yeah,

52:01

and even then and this is

52:03

something I did an article about, even

52:05

with child sexual abuse material,

52:08

I talked to a bunch of legal

52:10

experts and I went back to

52:12

the statutes that were created in

52:14

the late nineties around what constitutes

52:16

that type of material. Even back then,

52:18

in a way, they were already

52:20

kind of thinking about what we're

52:22

dealing with right now because they

52:25

wrote into those statutes in the

52:27

US code that computer-generated child

52:29

pornography is illegal too. They already

52:31

had this kind of protection set

52:33

up, because AI is new in

52:35

some senses, but it is not

52:37

new. People have been

52:39

behind artificial intelligence since, like,

52:41

the 1970s, and CGI, like

52:43

the Shrek movie, was

52:46

computer generated. So this has

52:48

been something that has existed for

52:50

a long time. And what I

52:52

found by just searching like pretty

52:54

general terms related to deep

52:56

fakes, I found deepfake examples

52:58

in the top Bing and Google

53:00

search results and what they were

53:02

was they were pictures of celebrities

53:04

taken before the age of

53:06

eighteen. The one in my article

53:08

that I really focus on is

53:10

like a picture of Miley Cyrus

53:12

and in the picture of Miley Cyrus, I

53:14

thought, at oldest she was fifteen in

53:16

the photo. And they had taken her

53:18

face and they had pasted it over an

53:21

adult nude body, and this is coming

53:23

up in the top Search results for like

53:25

Miley Cyrus deepfakes. That image

53:27

should not be there. That is

53:29

technically not allowed, and when I

53:32

showed that to Google, they took it down and

53:34

they're like, yeah, we recognize that, that's not right,

53:36

that's prohibited. But

53:38

it just goes to show that even though that

53:40

is prohibited, they're not necessarily going to catch it. And

53:43

I think that's, like, another part of

53:46

this is they have to actually effectively

53:48

be able to detect and remove this

53:50

material. And when it comes to

53:52

deepfakes, we're not seeing that, not even

53:54

with deepfakes that depict children.

53:57

Yeah, once again, like, I'm

53:59

shocked but not shocked at what you're saying. And

54:01

it really strikes me that when you talk

54:03

about responsibility and you

54:06

think about how little

54:08

Google is doing on this, and how little

54:10

so many of these tech companies are

54:12

doing on this, meanwhile, we saw it

54:14

just a few weeks ago, there was

54:16

this rapid backlash to their Gemini AI

54:18

tool because it dared to show some

54:21

racial diversity in historical events when it

54:23

was prompted. And something like

54:25

that, it seemed like it immediately had a

54:27

response and immediately had something to say. And

54:29

the CEO had something to say about it.

54:32

But when we see these issues with deepfakes

54:34

or when we see other issues that have

54:36

come of their AI tools, they're much less

54:39

likely to actually say something or actually take

54:41

some degree of action. Like, what

54:43

do you make of the difference in the way

54:45

that they respond to these different issues? That's a

54:47

really good question. And I think that a

54:50

lot of it depends on who

54:52

is raising these complaints. With

54:54

the Gemini stuff, you

54:57

are seeing not only giant

55:00

conservative voices speak out about

55:02

this, but there's this

55:04

alliance currently in the big

55:06

tech space between certain

55:08

venture capitalists and certain billionaires

55:11

and certain tech platform owners who

55:13

have become very close and

55:15

very friendly with these extremely

55:17

conservative voices. And so when

55:19

you see someone like Elon Musk or Bill

55:21

Ackman start to speak out on an

55:23

issue, well, they're in the same room

55:25

as the people who run Google.

55:27

So now it's your colleagues who

55:30

are calling this out. And I truly think

55:32

that if someone like Elon were going

55:34

to make a big deal out of

55:37

the question of deepfakes, maybe

55:39

you would see a response. But Elon can't

55:41

do that because his own platform is

55:43

part of the problem. And I don't think Elon

55:45

is interested in women's rights more

55:49

broadly speaking. Yeah,

55:51

I would tend to agree with that. We've

55:54

been talking quite a bit about the responsibility of

55:56

the companies and the responses that they have had

55:59

to this, and how those responses have

56:01

been truly ineffective and not nearly meeting

56:03

the bar that I think most people

56:05

would expect of them. But on the

56:07

kind of legal side of things when

56:09

we look at lawmakers on the federal

56:11

level in the United States, but also

56:13

on the state level. And I don't

56:15

know if you have any insight about internationally,

56:17

what are we seeing, you know,

56:19

from our politicians when it comes to

56:21

trying to address this issue, and

56:23

does it seem like there are any attempts

56:25

that would actually make some real difference

56:27

here? There have been some positive

56:29

strides both internationally and within the

56:31

United States. Australia was one of

56:34

the first countries to actually form,

56:36

like a task force dedicated to

56:38

this issue. Europe in general has

56:40

way better and tighter regulation around

56:42

this type of thing, although not

56:45

necessarily with deepfakes; Europe has better

56:47

protections around things like data

56:49

privacy and because of the way

56:51

they've legislated online issues in the

56:53

past, there's a clearer way to

56:55

legislating something like deepfakes. In

56:58

the United States, generally

57:00

speaking, when it comes to regulating

57:02

the internet, we are massively

57:04

behind, we have very few regulations,

57:06

and the process of actually getting

57:08

anything passed federally really is

57:10

super convoluted and nasty and difficult

57:13

to deal with. On the state level,

57:15

it's much easier to pass

57:17

things like this. So we've seen

57:19

legislation here and there; we've seen

57:22

a bunch of states and more

57:24

and more with every passing week

57:27

introducing legislation, passing legislation, getting things

57:29

on the books related to

57:31

deepfakes. The problem then is that,

57:34

like for example, California has

57:36

passed some laws around this

57:38

issue, but a lot of high-profile

57:41

victims, like celebrities and

57:43

influencers, are based in

57:45

California. The problem then

57:47

becomes identifying and having jurisdiction

57:49

over the perpetrators and the

57:52

problem also becomes, who bears

57:54

liability here, because

57:56

the act of deepfaking

57:58

somebody into an image may

58:00

be illegal, but the act of

58:02

carrying out that law, the enforcement of

58:04

that law. That's where you begin to

58:07

run into a lot of questions and

58:09

hypotheticals, you know. And a lot

58:11

of times, unfortunately for victims, there's so

58:13

much involved in the process of trying

58:15

to get justice. This is something

58:17

that applies to victims of all kinds

58:19

of crime, and it's an issue that

58:21

is very frequently overlooked, which is

58:23

that you have to have resources you

58:25

have to be able to afford access to a

58:28

lawyer, you have to have the time

58:30

and the money to dedicate to fighting

58:32

your case. These are things that are

58:34

not available to the vast majority of

58:36

victims, and so we're not gonna see

58:38

the vast majority of those victims,

58:41

even under the laws, get that kind

58:43

of justice. And so, you know, approaching

58:45

the issue of deepfakes then takes

58:47

on a much more multifaceted sort of

58:50

approach, because we also have to look

58:52

at social factors, disincentivizing what these young

58:54

boys are doing in these middle

58:56

schools and high schools, where we sort of

58:58

make it clear that this type

59:01

of behavior isn't going to be

59:03

tolerated on various levels. And again, with

59:05

deepfakes there, we've seen some good things

59:07

happen. I would say we've seen some

59:10

middle schools recently really crack down on this

59:12

in a way where they're removing perpetrators out

59:14

of the school system. They're separating perpetrators

59:17

from the victims, showing victims that

59:19

you matter and that this is a

59:21

problem and that it was wrong. And

59:23

even something that simple can make a

59:26

huge difference in actually combating this

59:28

issue on a cultural level. Speaking

59:31

of the cultural level do we see

59:33

a shift on that as well? Because

59:35

I remember instances in the past when

59:37

people talk about kind of nude images

59:39

circulating it was very sort of shameful

59:41

thing and you know could have kind

59:43

of severe consequences for people especially people

59:46

in the public eye you know my

59:48

question is not to kind of dismiss

59:50

how important this is or dismiss the need

59:52

for action on this but do we

59:54

also see a change kind of in

59:56

the social norms where it's like if

59:58

this happens to you you shouldn't

1:00:00

feel shamed or you know people aren't going to

1:00:02

think worse of you, do you see changes there?

1:00:05

I see a lot of different things happening

1:00:07

at once. And so it

1:00:09

really depends case to case looking at

1:00:11

the various influences within the community of

1:00:14

the person who's being affected. So

1:00:16

in some ways we've seen like

1:00:18

some progress. So a case

1:00:21

that I reported on a couple weeks

1:00:23

ago involved a middle school in Beverly

1:00:25

Hills. And what I saw

1:00:27

in that case was some

1:00:29

really progressive kind of

1:00:31

action that I don't

1:00:33

always see. But I think part of why

1:00:35

I was seeing that is because Beverly Hills

1:00:38

is a community that is

1:00:40

unlike most other communities. It's

1:00:42

an extremely wealthy, high profile

1:00:45

community that is kind of

1:00:47

incomparable to most communities in

1:00:49

the United States and globally.

1:00:51

And so with those vast

1:00:54

resources and that vast spotlight,

1:00:56

they did what seems to be the

1:00:58

right thing. Elsewhere you're

1:01:00

not always going to get people

1:01:02

reacting in the same way. And

1:01:04

I think culturally, for example, not

1:01:06

just in the United States but

1:01:09

also in other countries, in conservative

1:01:11

communities, you might

1:01:13

have an approach to this issue that

1:01:15

blames the victim. And you have, we've

1:01:17

seen that. We've seen,

1:01:19

especially in the early days of

1:01:21

deep fakes, women who were targeted

1:01:23

in really culturally conservative areas faced

1:01:26

a lot of backlash from their communities

1:01:28

and they were blamed in a lot

1:01:31

of cases and they faced violence as

1:01:33

a result of being violated. So

1:01:36

something that concerns me outside of the

1:01:38

deep fake issue, but that intersects with

1:01:40

it is we have this really radical

1:01:43

anti-feminism ideology that is growing in

1:01:46

lots of different areas. We

1:01:48

see it with young boys and

1:01:50

young men in various communities around

1:01:52

the world. They're influenced by

1:01:54

people like Andrew Tate And

1:01:57

people who are telling them, actually, you should do

1:01:59

stuff like this. You should assert

1:02:01

your power and your dominance as a man

1:02:03

by sexually violating the women and girls

1:02:05

around you. When you have boys getting

1:02:08

that message, that's going to influence how these

1:02:10

sorts of incidents play out and I

1:02:12

think right now we're seeing kind of

1:02:14

like a growing gap where some communities

1:02:17

are becoming more progressive and some people

1:02:19

are becoming more and more progressive, but you also

1:02:21

have people becoming more and more regressive.

1:02:24

So I think that for some

1:02:26

victims there will be cultural things

1:02:28

that help them. But I think that

1:02:31

you're still seeing the Me Too movement

1:02:33

sort of reverberate and make women more

1:02:36

confident in coming forward about being the

1:02:38

victim of a sexual crime, but you're also

1:02:40

seeing a backlash to the Me Too

1:02:42

movement that is trying to make

1:02:45

women not feel as confident. We have

1:02:47

all of these competing cultural influences that

1:02:49

are going to shift the environment for anyone

1:02:52

who is the victim of this. Yeah.

1:02:55

That's a really good point, and unfortunately those kind

1:02:57

of Andrew Tate ideas and the

1:02:59

people like him who promote this are far

1:03:01

more influential than they should be. And as

1:03:03

you say, even if norms are changing to

1:03:06

a certain degree, there's still that kind of

1:03:08

visceral reaction of seeing these images of yourself,

1:03:10

and how as you were talking about earlier,

1:03:12

this can lead to people wanting to kind

1:03:14

of, you know, move out of public life

1:03:17

and try to avoid, you know, situations

1:03:19

or careers or sectors where they might face

1:03:21

a higher risk of having these images being

1:03:23

made of them or being put in these

1:03:25

situations and that is completely unacceptable. And

1:03:27

so I think my final question would

1:03:29

be what do you see in the activism

1:03:32

around this? You know, we talked about,

1:03:34

before we started recording the My Image

1:03:36

My Choice movement that's been

1:03:38

put together by the people who created the

1:03:40

Another Body documentary. Obviously I think

1:03:42

that this issue is becoming something that

1:03:44

more people are aware of and that you

1:03:46

know lawmakers are feeling more pressure to

1:03:48

do something about. What do you see

1:03:50

on that angle, and where do

1:03:52

you see this issue kind of going over.

1:03:55

the next year? So, like

1:03:57

you just said, we have seen some

1:03:59

really heartening activism work and

1:04:01

advocacy work popping up in this space. A

1:04:03

lot of people who have been committed to

1:04:05

this issue for a long time, because

1:04:07

before deepfakes reached this

1:04:10

point, they were working on the

1:04:12

same issues in response to the

1:04:14

revenge porn crisis of the

1:04:16

2010s, and the same people who

1:04:18

dealt with a lot of concepts

1:04:20

like internet privacy, which is sort of

1:04:22

a modern concept in itself. People

1:04:24

who have been leading the charge

1:04:27

on privacy issues are also

1:04:29

responding to this issue, and so are

1:04:31

advocacy organizations, as well as the

1:04:33

organizations that exist for survivors of

1:04:35

all types of gender-based and

1:04:37

sexual violence. Something that you end

1:04:39

up seeing happen a lot, as

1:04:41

you're looking at how this plays

1:04:44

out, is people who were already abusive;

1:04:46

they just expanded their toolkit. So

1:04:48

something like deepfakes becomes

1:04:50

the new tool in the abusive

1:04:52

toolkit that a lot of

1:04:54

individuals weaponize against their victims. So,

1:04:56

like help lines and resources for

1:04:58

victims of all the types

1:05:00

of crimes, they're seeing this happen

1:05:02

more and more in cases,

1:05:05

in the same way that they

1:05:07

saw a lot of internet privacy

1:05:09

issues in regards to real

1:05:11

images impact their clients ten

1:05:13

years ago, now they see deepfake

1:05:15

images and, like, fake things impact

1:05:17

their clients as well. So we're seeing

1:05:20

a lot of response on a lot

1:05:22

of different fronts that I think is

1:05:24

really important. And in terms of how

1:05:26

these issues are going to develop over the

1:05:28

next couple of years I think we're

1:05:31

just starting to hit that stride. I

1:05:33

think we're going to see so much

1:05:35

more movement come out of this.

1:05:38

Because a lot of times when you look

1:05:40

at the field of victims, right, it takes

1:05:42

people time to process what has

1:05:45

happened to them before, they're in a place

1:05:47

where they can do something about it. So

1:05:49

I think that a lot of people who

1:05:51

have tragically been victimized right now and over

1:05:54

the past couple of years in the coming

1:05:56

years they're gonna reach a place where they're

1:05:58

like, I have now processed what

1:06:00

I went through and I want to do something about

1:06:02

it. And so we're going to start hearing these voices

1:06:04

and these testimonies and they're going to get bigger and

1:06:07

bigger and bigger. And I think that bringing

1:06:09

it back to what we talked about at the very beginning, after

1:06:12

the Taylor Swift deepfake incident, I

1:06:15

personally saw more legislative and more

1:06:17

just like action and attention and

1:06:19

interest and support happening than I

1:06:21

had seen at any other previous

1:06:23

point. It was like a wave,

1:06:26

a tidal wave of just

1:06:28

attention being paid to this issue. And

1:06:30

so having celebrities be involved in

1:06:32

this, their advocacy can be important

1:06:35

in the same way that Jennifer

1:06:37

Lawrence saying what happened to me

1:06:39

was a sex crime and anyone

1:06:41

who viewed those pictures is a

1:06:44

sexual offender, when she said that,

1:06:46

that reverberated. That said to so

1:06:48

many women, like, I'm not

1:06:50

alone. Like Jennifer Lawrence is speaking for me.

1:06:53

And it said to people like, you should

1:06:55

reconsider what you consider to be okay. And

1:06:57

I think that we're going to see those

1:06:59

norms shift and it's going to take place

1:07:01

with a lot of big conversations as well

1:07:03

as a lot of smaller conversations.

1:07:06

Yeah, and that's so important. And it's something that

1:07:08

absolutely needs to happen. And this accountability needs to

1:07:10

start being something that we see a lot more

1:07:13

of, both on the level of the people who

1:07:15

make these images. But as we were talking about

1:07:17

the companies that are making the tools that allow

1:07:19

them to do it in the first place as

1:07:21

well. Kat, this is such an important issue and

1:07:23

you've given us so much insight into understanding the

1:07:25

broader ramifications of it. Thank you so much for

1:07:27

taking the time. Thank you for having

1:07:29

me and for giving a platform to this issue.

1:07:34

Kat Tenbarge is a tech and culture reporter at

1:07:36

NBC News. Tech Won't Save Us is made in

1:07:38

partnership with the Nation Magazine and is hosted by

1:07:40

me, Paris Marx. Production is by Eric Wickham and

1:07:42

transcripts are by Bridget Palou Fry. Tech Won't Save

1:07:44

Us relies on the support of listeners like you

1:07:46

to keep providing critical perspectives on the tech industry.

1:07:48

You can join hundreds of other supporters by going

1:07:51

to patreon.com/Tech Won't Save Us and making a pledge

1:07:53

of your own. Thanks for listening and make sure

1:07:55

to come back next week.
