Artificial Intelligence: The Criminal Threat

Released Tuesday, 28th November 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00

This is the BBC. This

0:03

podcast is supported by advertising outside

0:05

the UK. Have

0:11

you ever wondered what sets exceptional

0:13

leaders apart? Discover

0:15

how renowned leaders from around

0:17

the globe have harnessed their

0:19

natural talents to achieve remarkable

0:21

success. Uncover the

0:23

secrets of leadership excellence, one

0:26

strength at a time, through Gallup's

0:28

Leading with Strengths. Dive

0:31

into compelling stories at

0:33

leadingwithstrengths.com What

0:53

you've just heard is a real 911 call,

0:55

made earlier this year to emergency services in

0:57

Scottsdale, Arizona. The

1:20

alarm was raised by a concerned citizen,

1:22

eyewitness to a crime unfolding

1:24

in real time. A

1:27

kidnapper calling, a daughter in danger,

1:29

a mother in shock, trapped in

1:31

a waking nightmare. And a

1:34

story that has a timely, troubling

1:36

and truly unexpected twist.

1:40

When I answered the phone I just said hello and

1:42

it was my daughter Brianna crying and

1:45

sobbing and saying mom. And

1:47

I'm like what's up and she goes mom

1:49

and she continues to cry and sob, I

1:52

messed up. That's Jennifer

1:54

DiStefano, the mom in

1:56

question, this all happened to her. Then a man comes on the phone

2:00

and he says, listen here, I have your daughter. If you

2:03

call anyone, I'm gonna pump her so full of drugs

2:05

and have my way with her. I'm gonna drop her

2:07

in Mexico and you're never gonna see her again. At that

2:11

point I had my hand on a door

2:13

handle to the dance studio. I threw

2:16

open the door while she's fading

2:18

off in the background pleading for help and the

2:20

man starts making only threats and I just

2:22

started screaming for help. And did

2:25

you have any doubts at

2:27

all at that point that that was your daughter?

2:30

No, I had no doubt. I

2:32

was so sure of her voice. I was so sure

2:34

of her cries, of the way she would have cried.

2:37

Faced with what she believed was

2:39

a life-or-death situation for her daughter

2:41

Brianna, she was then forced

2:43

to negotiate the terms of her

2:45

release with her still nameless anonymous

2:47

kidnappers. He originally started with

2:49

a million dollars that

2:52

wasn't possible and he said $50,000 and I asked him

2:54

how he wanted to receive the

2:58

money, how he wanted the $50,000. He was

3:00

demanding that I actually be physically picked up

3:02

in a white van, a bag put over

3:05

my head so I wouldn't know

3:07

where I was going and I'd better have all the

3:09

cash otherwise my daughter and I are both dead. Though

3:11

the would-be kidnappers warned Jennifer not

3:13

to call the police she ignored

3:15

the threat instead gesturing

3:17

to a swelling huddle of

3:19

onlookers and eavesdroppers that they should

3:21

call 911. Within seconds the

3:24

call was made and the quick-thinking

3:26

operator who answered turned the

3:29

tables. So that is a very

3:31

popular scam. I need somebody to try and

3:33

get a hold of her daughter. Do you have a phone number that I can get a

3:35

hold of her daughter? It's something that they do.

3:37

They sound incredibly violent. Usually

3:39

you can hear somebody in the background, etc. Acting

3:42

on the advice, onlookers then scramble

3:44

to contact Jennifer's husband. He's

3:47

alarmed but also puzzled and

3:49

confused by what he's hearing.

3:52

He races upstairs to find

3:54

Brianna in her room. He

3:57

puts her on the line and at the other end

3:59

the phone is handed to her mum,

4:01

Jennifer. She's like, mum, I have no

4:04

idea what's going on. I'm safe with dad. I'm

4:06

literally resting in bed. I have absolutely no

4:08

idea what you're talking about. And

4:10

I just collapsed and started

4:12

crying, honestly, with relief because I knew

4:14

that my daughter was safe. So

4:17

what actually happened on that April

4:20

morning in Arizona? When

4:22

the dust finally settled, Jennifer discovered

4:25

she'd been the target of a

4:27

new type of scam, driven and

4:29

enabled by artificial intelligence. The

4:32

criminals behind the fraud had cloned her

4:34

daughter's voice using AI software, next-generation

4:37

futuristic technology that allowed

4:39

them to literally put

4:41

words in her mouth, words

4:44

she didn't say. To do

4:46

that, however, they would have needed

4:48

samples of Brianna's voice. To this

4:51

day, Jennifer has no idea how

4:53

or where they found them. Her

4:56

social media accounts are very small, very private.

4:58

There was no voice. And there was certainly

5:00

no sobbing and crying. The sobbing and crying

5:02

is what threw me off more

5:05

than anything. So I don't

5:07

know. Do you think they really were

5:09

planning on coming to get you,

5:11

to essentially abduct you as well, to take

5:13

your money? Yeah, that's actually

5:16

just given me chills, because

5:18

that's the part of the story that was real.

5:20

That's the part of the story that wasn't fake.

5:23

To date, nobody has been caught or

5:25

convicted for the attempted extortion

5:27

and the kidnapping hoax that Jennifer

5:30

endured. She did, though, go

5:32

on to address the US Senate about the dangers

5:34

of AI crime, a

5:36

steadily emerging phenomenon, first forecast

5:38

in the world of sci-fi

5:40

fiction, but now very much

5:43

a reality, and not just in

5:45

the United States. Action

5:51

Fraud, the National Reporting Center for

5:53

Fraud and Cybercrime, covering England, Wales,

5:55

and Northern Ireland, has told File

5:57

on 4 there have been 110

6:00

reports of AI-related incidents between May of

6:02

2019 and October 2023, a drop in the

6:04

crime statistics

6:08

ocean it seems. But

6:10

they also told us those figures may not

6:12

be an accurate reflection of the

6:14

actual numbers because, in their words,

6:16

many people will not know if

6:18

they've fallen victim to a fraud

6:20

that involves AI. My

6:23

name is Peter Barron. I was a

6:25

police officer for just over 30 years.

6:28

Finished in the Metropolitan Police, I was

6:30

a detective for most of my career

6:33

and I finished as the head of crime

6:35

performance and strategic risk in the Met.

6:39

At present, Peter Barron is leading a project

6:41

on fraud for the Mayor's Office for Policing

6:43

and Crime in London. He

6:45

tells me that, like most

6:47

fraud, AI crime is likely

6:49

to be significantly under-reported but

6:51

is now classified within law

6:53

enforcement circles as a real

6:55

and present danger. It

6:58

first appeared on my

7:00

radar probably about

7:02

eight, nine months ago. I did

7:04

some work with a chap

7:06

who is a reformed fraudster.

7:09

The bargain we struck was that basically

7:11

he would be willing to

7:15

disclose the tradecraft and methodology

7:17

of fraudsters as

7:19

long as I guaranteed his

7:22

anonymity. One of the

7:24

crimes his contact, the former fraudster, claims

7:27

to have insider knowledge on is an

7:29

almost carbon copy of the story you

7:31

just heard from Jennifer, a

7:34

kidnap scam with AI voice cloning

7:36

at the dark heart of the

7:38

operation. Except the victims, the

7:40

targets, well, they're not in Arizona,

7:42

they're in places like Accrington, Aberdeen,

7:44

Armagh, cities and towns across

7:47

the UK. In brief,

7:49

the former fraudster told Peter Barron about

7:51

a face-to-face meeting he'd had with one

7:53

of the criminals running the fraud. And

7:55

in that meeting, he claims to have

7:58

heard audio of a young girl who's

8:00

crying, she's terrified. Audio

8:03

that was doctored using AI and

8:05

used as part of a real

8:07

crime. He played a recording

8:10

on a mobile phone of a young girl

8:12

squealing for help, pleading

8:15

for her parents to pay £10,000. Otherwise,

8:19

they weren't gonna see her again.

8:22

Her voice was

8:24

taken from social media where

8:27

she was on holiday and

8:30

apparently it needed very, very little time

8:32

to actually morph her voice

8:35

using AI in

8:37

order that they could then make her, if you like,

8:40

say whatever they wanted. There was

8:42

also a cruel and calculated plan

8:44

in place to stop the

8:46

parents from phoning the authorities and

8:48

then to pressure them into making payment.

8:51

What they said was basically that, do

8:53

not contact police. As soon as the

8:55

money arrives, there will be a

8:57

window of four hours and we are monitoring

8:59

your phone and this will

9:03

not go well if in fact you contact

9:05

anybody. They wanted £10,000 by

9:07

close of play the same day. And

9:10

they got it. Yep. The

9:12

final piece of the crooked puzzle,

9:14

more tech, allowing the scammers to

9:16

further convince the parents now in

9:18

their crosshairs that it is

9:20

their daughter that's been abducted. When

9:23

the parents receive the call, the

9:26

name that comes up in the window

9:28

of their phone is their

9:30

daughter's name. Now that's

9:32

a form of software, if you like.

9:34

It's called spoofing. With

9:39

spoofing in mind, I start to

9:42

wonder just how easy it might be

9:44

to clone a voice using AI.

9:47

I mean, can anybody do it? Better

9:49

still, could I do it? Okay,

9:53

well, will we do a run through with you? Is that

9:55

okay? Yeah. All right then, so

9:57

are you ready? Kimberly

10:00

you happy? All right then ring

10:02

ring. Hi Paul. Hey

10:05

how's it going? Good

10:07

fine what can I help you with? I'm

10:10

at University College London sat with

10:12

two experts in the field of

10:15

artificial intelligence. Professor Lewis

10:17

Griffin now an advisor to the

10:19

government on threats posed by AI

10:21

and Kimberly Tan Mai a

10:23

PhD researcher. For the

10:26

past few months we've been working

10:28

together to create a synthetic but

10:30

credible version of my voice. Why?

10:33

Well so we can put it to what

10:35

for me will be the

10:37

ultimate test. The kind of voice

10:40

cloning software that's used at the moment

10:42

is basically a generative AI algorithm.

10:45

What it does is it listens

10:47

to sound examples, usually of

10:49

how humans speak and it learns

10:51

the patterns and characteristics. Generative

10:54

artificial intelligence is by the way

10:56

a version of the

10:58

technology capable of generating text,

11:00

images or other media. Kimberly

11:03

goes on to tell me that

11:05

voice cloning generators are generally fed

11:07

many thousands of samples of

11:09

human speech. That's how

11:11

they come up with, that's how

11:14

they replicate tone, inflection, emphasis

11:16

and the like.
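To make that concrete, here is a minimal sketch of the kind of acoustic representation voice-cloning models typically learn from: a mel spectrogram computed from a speech sample. It uses the open-source librosa library; the file name is a placeholder, and this is an illustration rather than the software used in the programme.

```python
# Illustrative sketch: the mel-spectrogram representation that voice-cloning
# models commonly learn tone, inflection and emphasis from.
# Assumes the open-source librosa library; "speech_sample.wav" is a placeholder.
import librosa
import numpy as np

audio, sample_rate = librosa.load("speech_sample.wav", sr=22050)
mel = librosa.feature.melspectrogram(y=audio, sr=sample_rate, n_mels=80)
log_mel = librosa.power_to_db(mel, ref=np.max)  # log scale, closer to human hearing
print(log_mel.shape)  # (80 mel bands, number of time frames)
```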

11:18

So that's basically what I did. I found

11:21

voice cloning software online and

11:23

the appeal of this initially was that you didn't

11:25

have to do any kind of training; you didn't

11:27

need to have any knowledge

11:30

of coding to use them. To train

11:32

the cloning model and the various quirks

11:35

and characteristics of my own

11:37

speech pattern Kimberly asks me to send

11:39

her recordings of me reading

11:42

newspapers, online articles and such,

11:44

this kind of stuff. They've now confirmed

11:46

for the first time that

11:49

atoms of anti-matter fall

11:51

downwards. Far from being a

11:53

scientific dead end this opens the doors to

11:55

new experiments and theories. What I tried

11:57

to do initially was feed the model some... And

14:00

my voice, better than anyone. Colin

14:04

Hagen! Hey, how's it

14:06

going? Alright, I

14:08

was just doing the garden outside. I'm a

14:11

bit distracted, sorry. I've either been an idiot

14:13

and left my wallet somewhere or it was

14:15

nicked off the train. I've

14:17

no idea. Can't work it

14:19

out. My head's gone. But I've no

14:21

wallet and no money, no debit card

14:23

either. Can you stick a couple of hundred quid

14:26

in until I get home? Sorry,

14:28

this is too... Did somebody steal all

14:31

of your stuff today? I'm just struggling

14:33

to hear you. Sorry, I'll just say

14:35

this. Okay, did somebody steal all of

14:37

your stuff? Someone must be on the

14:39

phone. Can I text you the bank

14:41

details? Okay, tonight I'll ring Lyndon

14:43

and she'll put it through to you. Okay,

14:46

sound. Thanks again. So sorry, Dad. Okay, no, no, no,

14:48

no, no, no, no, no, no, no, no. I'll see

14:50

you a bit later. I'll get that back. Okay, okay,

14:52

okay. See you, bye. Okay,

14:55

so yeah, so we know it

14:57

works. And if you target

15:00

the right person at the right time with the

15:02

right dialogue, the right audio, the right pitch, the

15:04

right tone, then

15:07

yeah, it works. I

15:09

didn't leave my poor mum in suspense.

15:11

You'll hear plenty more from her later.

15:14

But you don't have to go to UCL to pull a

15:17

stunt like that. AI cloning software

15:19

is, as Kimberly mentioned, easy to

15:21

find on the web. To criminal

15:23

minds, the tech opens up a

15:25

new world of possibilities. A Pandora's

15:27

box. So

15:31

I'm just browsing here, looking at, you

15:34

know, a number of online banking websites.

15:36

And there are plenty now offering voice

15:38

ID as a way of accessing your

15:40

bank accounts. Some really interesting

15:42

things here. They describe voice ID

15:45

as like a fingerprint unique to

15:47

you and that their software even

15:49

recognizes you when you're having a bad

15:51

day, as in a cold or

15:53

a sore throat. But some say

15:55

the unique characteristics of your voice

15:57

make it much more secure than

15:59

a conventional password, and that voice

16:01

recognition also saves you time.
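For a rough sense of what happens under the hood of such systems, here is a minimal sketch of speaker verification, comparing a stored voiceprint against a caller's audio with the open-source SpeechBrain toolkit. The file names are placeholders and this is not any bank's actual system; real deployments layer on many more checks.

```python
# Illustrative sketch: speaker verification by comparing voice embeddings.
# Uses the open-source SpeechBrain toolkit and a public pretrained model;
# file names are placeholders, not any bank's real system.
from speechbrain.pretrained import SpeakerRecognition

verifier = SpeakerRecognition.from_hparams(
    source="speechbrain/spkrec-ecapa-voxceleb",
    savedir="pretrained_models/spkrec",
)
# Returns a cosine-similarity score and a same-speaker decision.
score, same_speaker = verifier.verify_files("enrolled_voiceprint.wav", "caller.wav")
print(float(score), bool(same_speaker))
```

The weak point, as what follows makes clear, is that a convincing clone can push that similarity score past the accept threshold too.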

16:04

Peter Barron, though, isn't convinced. My

16:07

bank has recently introduced

16:09

voice recognition, and

16:12

I half-jokingly said

16:14

this to my guy, and

16:17

he said, be very careful

16:20

about anything which

16:22

involves voice recognition,

16:24

including your bank. Everybody

16:26

is going to have to reach their own decision on this.

16:29

For me, I personally won't

16:32

activate that feature. It's

16:34

another layer of security which

16:37

would theoretically keep your

16:39

account secure. However, we

16:42

know voice cloning is incredibly

16:44

easy, and

16:47

it's yet another example of how fraudsters have... how

16:50

they're not only a couple of steps ahead of

16:52

law enforcement, but when

16:54

something new comes out, they look at it and

16:57

find a way around it. The banks have acted

17:00

in good faith and with the

17:02

best intentions to try and

17:04

protect people's accounts. Voice

17:07

cloning effectively gives them a way

17:09

around it, so that could

17:11

potentially open you up to having

17:13

your account broken into or

17:16

taken over. Do you think that

17:18

layer of security has now, by AI

17:21

cloning software, been compromised?

17:25

We find ourselves in something of a race where

17:27

banks are doing the best they possibly can with

17:29

the best of intentions, but

17:31

fraudsters are still by and large finding

17:34

the workarounds very easily. UK

17:37

Finance, which represents the banking industry in the

17:39

UK, told us the sector

17:41

is continuously monitoring the use of generative

17:44

AI in scams. On the

17:46

issue of whether someone could use a

17:48

fake AI voice to access telephone banking,

17:51

it told us that banks assess the risks

17:53

depending on the kind of transaction being made

17:56

and that they're using more layers of

17:58

controls and boosting customer authentication processes.

18:01

Banks are also using AI

18:03

themselves to better identify suspicious

18:05

activity. But

18:09

voice cloning isn't the only threat

18:11

posed by generative AI. And

18:13

whilst for the likes of you and me, the

18:16

associated risks are only starting to surface

18:18

to be talked about and discussed across

18:21

the news and in social media, those

18:23

at the coalface of this tech

18:26

saw most of it coming. Those

18:28

advances in generative AI systems are happening

18:31

as we speak all of the time.

18:34

Professor Lewis Griffin again. So

18:36

images, I think much of

18:38

the time the images generated

18:40

are completely undetectable. And

18:43

I think that's just going to increase very

18:45

rapidly. And they can be absolutely perfect.

18:49

Video is kind of further

18:51

behind that, but it's moving

18:54

along very fast. If

18:56

you start to make more complex

18:58

imagery, then there will start to

19:00

be small problems in it that

19:03

you can spot. But overall, the

19:05

impression is highly convincing. So how

19:07

soon do you think will

19:10

video and images be where

19:12

audio is now? How

19:14

long until it's perfect? I

19:16

think in months and years, not

19:18

decades. And

19:21

it was some video that helped to mislead

19:23

the next person I'm going to talk to.

19:26

It was generated by AI

19:28

and demonstrates the pernicious potential

19:31

of deepfakes. It's

19:36

the first time I had

19:39

used an online dating app in my

19:41

life. That is Anna. Not

19:44

her real name. She's asked us not to

19:46

share that. Today, she

19:48

invited us to her home in southern England.

19:51

She's in her fifties, has dark

19:53

shoulder length hair that's brightened by

19:55

a smattering of silvery grey and

19:58

kind eyes despite the ordeal

20:00

that she's been through. How

20:03

are you feeling about deep diving into it

20:05

all again? I've

20:07

been deep diving all weekend, all weekend,

20:10

so I'm actually back in that groove.

20:13

Anna, you see, was the victim

20:15

of a years-long romance scam. She

20:17

was duped into falling in love and

20:20

parting with vast sums of money by

20:22

an unscrupulous scammer who claimed to be

20:24

someone else entirely. She knew

20:27

him as Simon, but

20:29

it's how he convinced her

20:31

it was all real that makes her

20:33

story unlike any you'll have

20:36

heard before. I'd been through a very,

20:38

very, very brutal divorce

20:40

abroad and for four years

20:42

I wasn't ready to even have a

20:44

cup of coffee with a man, not interested. So

20:47

the weeks leading up to it, it was actually the

20:49

months leading up to it, my friend said, I think

20:51

you're ready, you're bouncing again, you're smiling again. The

20:54

first hit was

20:58

Simon's, the very first

21:01

text. The face

21:03

of an angel and an Adonis

21:06

of a man. And

21:08

then the text turned

21:10

very loving. Anna gave us

21:13

access to voice messages that she and

21:15

Simon shared over the course of their

21:17

relationship. Here's just one example and this,

21:19

Anna believes, is the fraudster's

21:22

real voice. I'm going to make you

21:24

happy, baby. You're such

21:26

a wonderful woman, a

21:28

woman with great heart. You are the

21:30

woman I want all my life. Soon

21:32

they made plans to meet up in

21:34

person at a restaurant in West London

21:37

near where he claimed to live. It was

21:39

three weeks into our two and a half

21:41

year relationship. He booked a restaurant that I'd

21:43

looked up existed. He booked a table and

21:46

he had a business crisis and had to go to

21:48

Paris. Simon

21:50

told her he was a wealthy businessman, was

21:52

of Bulgarian Scottish heritage and

21:54

dealt in edible oils. Now

21:56

his complex web of lies and

21:59

drawn-out extortion began with

22:01

a tall tale. He told

22:03

Anna that a sizeable shipment of his products had

22:06

been held, blocked from clearing customs

22:08

in Paris, a simple case of

22:10

the wrong paperwork he said. He needed

22:13

cash quickly to get the whole thing

22:15

moving again and asked Anna if on

22:17

this one occasion she could help

22:19

out. She agreed. The

22:21

first one was £1,500 to

22:25

clear the port charges so that the containers could

22:27

leave. You believed all of that at that point?

22:30

He sent receipts. He sent loads

22:32

of receipts. Requests for money

22:35

continued and came thick and fast.

22:38

Anna would sometimes make him sign

22:40

contracts promising her money would be

22:42

repaid. He did so without hesitation

22:44

each time and soon Anna found

22:46

herself more than £60,000 down. But

22:50

Simon's influence, his powers of manipulation

22:52

and persuasion, continued to cloud

22:54

her better judgment to draw

22:57

her in ever deeper. Finally

23:00

a face-to-face meeting is

23:02

arranged, not in the flesh

23:04

as such but a video call on

23:06

Skype. At the time she was

23:08

again led to believe that Simon was in Paris.

23:11

The Skype video absolutely

23:14

looks like a French cafe. I know

23:16

I've been to Paris many times. He

23:18

comes online and he's moving and he's talking

23:21

and moving and he said I can

23:23

hear you I can hear you I can hear you I can see you can

23:26

you see me and he's moving around and talking like

23:28

can you see me I'm like yeah I

23:30

said but I can't hear your voice so

23:32

you could see him lip-syncing. That did it for me,

23:34

that did it for me. To

23:38

me it was him, saying my name and "I love you". Soon

23:40

after that a second video call

23:43

a different setting this time

23:45

but the same technical difficulties

23:47

picture no sound he's

23:50

sitting on a bed and he told me, I'm calling you from my bed

23:52

and breakfast. It was a bit dark,

23:54

it was him he could hear me and see me I

23:58

couldn't hear him but I could see him. So I

24:00

had two occasions and I was like, hey, you're real.

24:03

I was floating, I was floating on

24:05

clouds after having seen him. I

24:07

was floating on clouds

24:10

for weeks. And you think AI

24:12

was involved in both occasions, do you? It

24:20

later transpired that those real-time video

24:22

calls were fake. Or to

24:24

be more precise, deep fakes. The

24:28

typical Parisian background was

24:30

probably AI-generated, superimposed, as

24:33

was the face and body of the man Anna

24:35

thought was Simon, the supposed love

24:37

of her life. But

24:43

her story is far from over. The

24:45

deceit would deepen, as would the

24:48

gaping financial hole she found

24:50

herself in when it all came to

24:52

a devastating, if bizarre, end. How

25:00

long ago was it that you heard about

25:02

the first case? In my

25:05

experience, probably under a year, I would say.

25:07

It's very, very new to the team. Lisa

25:10

Mills is a senior fraud expert

25:12

at the charity Victims Support. She

25:16

specialises in romance fraud and has dealt with cases that

25:19

echo Anna's experience. When

25:21

I have seen victims who have not destroyed

25:23

exchanges between themselves and

25:26

the fraudster, they will show me

25:28

images that you can see have been

25:30

doctored. And

25:32

they will also tell me about real-time video exchanges that

25:36

they've had with a suspect. With the growth of

25:39

AI, it

25:41

will lead to fraudsters having

25:43

an ever-increasing capacity to

25:47

convince their victims that they are who they say they are,

25:50

to place them in the situations

25:52

that they describe. In 2022, UK

25:54

victims of romance scams lost close

25:56

to a ho... That's

26:02

according to the National Fraud Intelligence

26:04

Bureau and those in

26:06

the know fear that already

26:08

alarming figure could increase even

26:10

further. Now the fraudsters are

26:12

monetising and in a sense weaponising

26:15

AI's darker capabilities. I

26:18

love you baby. I

26:22

love you from the bottom of my head. I just

26:24

woke up now and what I got in my name

26:26

is just you. I'm

26:29

just a little bit of a light in love you. Back

26:33

with Anna in the south of England, we

26:35

talk through the closing chapters of her story.

26:38

A grim collection of carefully

26:40

concocted lies which culminated in

26:42

Simon claiming he'd been abducted

26:45

and would be killed unless Anna could help

26:47

him pay his way out of it.

26:50

Come on, come on, come on please,

26:52

don't make me cry, don't make me

26:54

sad, you are my last love, you

26:56

are my only love, please come to

26:58

my rescue and assist me on this

27:01

last date. Please my love, please,

27:03

I love you so much. Panicked

27:05

and convinced that Simon could at any

27:08

moment be murdered, Anna posts a message

27:10

and a picture of him

27:12

on social media. She asks to

27:15

be contacted if anybody has any

27:17

information whatsoever on his whereabouts

27:20

and she gets a reply from

27:22

a woman that changes the course

27:24

of her life forever. She goes

27:26

I'm Mexican. I said okay. She

27:30

went the picture you posted is

27:32

probably our most famous soap star.

27:35

His name is Juan Soler. After

27:38

two and a half hours I still didn't believe

27:40

her and she could hear it but eventually she

27:42

does believe it. Upon

27:44

much closer inspection she

27:47

realises that every image she's ever been

27:49

sent by Simon and the man staring

27:51

back at her in those

27:53

video calls is in fact

27:55

Juan Soler, an

27:57

Argentine-Mexican actor. This

28:00

is what he actually sounds like. To be

28:02

clear, the real Juan Soler had no part

28:04

in the scam. Nor

28:14

did he have any idea that his

28:16

image was being used as part of

28:18

the fraud. How

28:21

much did you give him in total? With

28:27

the help of the financial ombudsman, Anna did

28:30

claw back more than two thirds of the

28:32

money she lost. Clearly

28:34

the victim of a

28:36

drawn-out and devastating scam. How

28:39

many of you have seen the video? AI

28:48

voice cloning, deepfake videos,

28:51

both have paved the way

28:53

for dark twists, malicious evolutions

28:55

of tried and tested scams.

28:58

But in perhaps the darkest corners

29:00

of the internet, these new frontier

29:02

technologies are being used

29:04

for something even more disturbing. Something

29:08

Professor Lewis Griffin and the group

29:10

of experts he assembled years ago didn't

29:13

foresee. There's an aspect

29:15

of deep fakery, a use of

29:18

deep fakery, that we

29:20

completely missed. In

29:22

fact, it's the most advanced of these

29:24

threats, and that is the generation

29:27

of synthetic child sexual

29:29

imagery. The people who create

29:31

these images are the early

29:34

adopters, and they are developing

29:37

it in greater volume, with

29:39

more technical finesse, than in

29:41

any other area of criminal

29:43

use. And

29:46

it has enormous potential

29:48

for harm. The

29:57

Internet Watch Foundation, a charity based in

29:59

Cambridge, are known globally for the

30:02

work they do in trying to cleanse

30:04

the internet of images of child sexual

30:06

abuse. Recently, they

30:08

published a timely, eye-opening

30:11

report on the role

30:13

AI now plays in the creation and

30:15

manipulation of such images. So

30:17

the major findings were after spending a month

30:21

reviewing a forum dedicated to

30:23

the sharing of AI-generated

30:27

images of child sexual abuse,

30:30

we were able to assess more than 11,000

30:34

images and confirm that nearly

30:36

3,000 of those were

30:38

images which would breach UK law.

30:41

Dan Sexton is their chief technology officer.

30:44

So this material was primarily located

30:46

in the dark web. Those

30:49

communities, these are technically minded people

30:51

within those dark web communities

30:54

that have a potential sexual

30:56

interest in children and an interest in

30:58

sharing content of children, are

31:00

themselves experimenting with this technology. Dan

31:03

tells me those experiments tend to

31:05

produce two kinds of synthetic image.

31:08

That's an image, by the way, that has

31:10

been fully or partially created using AI rather

31:12

than being captured on a conventional

31:14

camera. Now, the first type

31:16

of image is made from

31:18

scratch. An image of a child that

31:20

doesn't exist, completely manufactured by AI. The

31:24

second is the manipulation of

31:26

existing images of children,

31:29

those already in circulation on the

31:31

web, to create fake images in

31:33

which those children are placed in

31:35

sexual positions and situations. Both

31:38

are illegal. As

31:40

part of the report, the IWF

31:42

also expressed their grave concern at

31:45

how easy the software is to

31:47

find and to use. sexual

32:00

images of other children. The

32:02

National Society for the Prevention of Cruelty

32:05

to Children has publicly echoed

32:07

those concerns and has shared

32:09

with File on 4 transcripts of

32:11

phone calls made by teenagers

32:13

to Childline, their well-known in-house

32:16

counselling service. In

32:18

the calls, the emerging dangers of AI can

32:20

be heard if not felt.

32:23

The words you're about to hear have

32:26

been changed slightly to protect the identity

32:28

of a 15-year-old girl who became a

32:30

victim. Her words are spoken by

32:33

an actor. A stranger

32:35

online has made fake nudes of me. They

32:38

look so real. It's my

32:40

face and my room in the

32:42

background. They must have taken pictures

32:44

from my social media and edited

32:46

them. I'm so

32:49

scared that they'll send them to

32:51

my parents. The pictures are really

32:53

convincing and I don't think that they

32:55

believe me that they're fake. Not

32:58

only that, but the NSPCC has

33:00

received reports of children

33:02

being blackmailed. Richard

33:04

Collard is their Associate Head of Policy

33:07

and Public Affairs. It's often coming up

33:09

in the case of sextortions, so that's

33:11

where someone has called our hotline to

33:14

say that they are being extorted for

33:16

sexual pictures of themselves. We're now starting

33:18

to see that happen through generative AI.

33:21

They're convinced that if they show or

33:23

tell anyone that they'll believe it's a

33:25

real image of them and they'll get

33:27

into trouble, it's embarrassing, so it's incredibly

33:30

distressing for the children. Dan

33:34

Sexton at the Internet Watch Foundation

33:37

believes social media platforms need to

33:39

do more to safeguard children and

33:42

that new legislation could pave the way.

33:45

The online safety bill is going to

33:47

give the UK government the ability to

33:49

address these kinds of harms. like

34:00

terrorism, revenge porn and other content

34:02

that could be harmful for children.

34:05

If they don't, they could be slapped with fines of

34:07

up to £18 million or 10% of global annual turnover,

34:13

whichever is higher.
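As a quick worked example of that cap, with a made-up turnover figure:

```python
# Worked example of the fine cap described above: the higher of GBP 18 million
# or 10% of global annual turnover. The turnover figure is hypothetical.
turnover_gbp = 2_000_000_000              # a platform turning over GBP 2bn a year
fine_cap_gbp = max(18_000_000, 0.10 * turnover_gbp)
print(f"GBP {fine_cap_gbp:,.0f}")         # GBP 200,000,000 - the 10% route is higher
```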

34:15

But what about this still simmering, soon to

34:17

boil over, threat from

34:19

generative AI? How

34:21

can it be tackled head on? Professor

34:24

Griffin again. We've already got

34:26

examples which completely fool people and they're

34:28

going to be increasing in common and

34:30

it's going to be very easy to

34:32

fool people in

34:35

the near future. To make

34:37

them undetectable to algorithms is

34:40

much harder. You know, the algorithms,

34:43

they're looking at weird statistical

34:45

details down in the pixels to which

34:48

we would never pick up on and

34:50

they're saying that's not quite right. There's

34:52

an arms race already running where,

34:55

you know, someone makes a tool

34:57

that generates synthetic images but there

34:59

is an industry of people producing

35:01

tools to detect that.
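To give a flavour of those "statistical details down in the pixels", here is a toy sketch of one such cue: the high-frequency residual left after smoothing an image. Real detectors are trained classifiers; this function and any threshold are purely illustrative.

```python
# Toy sketch of a pixel-statistics cue a synthetic-image detector might use:
# the high-frequency residual left behind after smoothing. Real detectors are
# trained classifiers; this function and any threshold are purely illustrative.
import numpy as np
from scipy.ndimage import median_filter

def residual_energy(image: np.ndarray) -> float:
    """Mean squared difference between an image and a median-smoothed copy."""
    gray = image.mean(axis=2) if image.ndim == 3 else image
    residual = gray - median_filter(gray, size=3)
    return float(np.mean(residual ** 2))

# Hypothetical usage: compare the statistic against values calibrated on
# known-real and known-synthetic images.
# is_suspicious = residual_energy(img) < calibrated_threshold
```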

35:04

Social media platforms that responded to our request

35:06

for comment told us they

35:08

have systems in place to report, detect

35:11

and remove illegal content and

35:13

that this will be reported to the authorities. This

35:16

includes images of content generated

35:18

by artificial intelligence. The

35:21

government told us it's investing £100

35:23

million to create a national fraud squad

35:25

with 400 new officers who will pursue

35:28

cyber criminals and other scammers wherever they

35:30

are in the world. Finally,

35:35

you're probably wondering how my poor mum

35:37

Mary is doing after I tried to

35:40

scam her out of a couple of

35:42

hundred quid. Well, she's grand, she's fine.

35:45

As soon as it became clear that the cloned

35:47

version of my voice had done its job, I

35:49

jumped in to tell her the truth. But,

35:52

well, we managed to, you know, make light

35:54

of it. What played out

35:56

that day is a dark and

35:58

troubling sign of what's on the

36:01

near horizon. OK, Mum, Mum,

36:04

Mum, hang on, hang

36:06

on a second. So,

36:08

and just please remember that

36:10

you love me, all right? OK, of

36:13

course I love you. Well, thank God for that.

36:16

So, you know the

36:18

way I'm making a documentary for

36:20

BBC Radio 4, File on 4, OK? So

36:24

we're making a documentary about artificial

36:26

intelligence and artificial intelligence

36:28

driven crime. So what

36:31

you just heard, Mum, wasn't

36:33

me. That was voice

36:36

cloning software. So I

36:38

am fine. Oh, I've

36:40

had it last of many hours and you want to be sent to

36:42

someone. That wasn't me, Mum. That

36:44

was the AI software. You thought

36:46

it was me. You still thought it was me. Yeah,

36:49

of course. Yeah, I did actually. How

36:52

would you feel if I'd been on the radio, Mum? I

36:55

don't care. My points wouldn't be very good. I

37:01

knew you'd laugh. I knew you'd laugh. Good.

37:03

Are you OK? That's good.

37:05

That's good. All right. OK. Thank you.

37:07

Love you too. Talk to you later. Bye.

37:09

OK, see you. Bye. See

37:12

you. Bye, Mum. OK, so... You're

37:15

going to pull it, Mum. Yeah, I

37:18

feel like... You're very sensitive

37:20

somehow. I will. I will. She didn't

37:22

laugh, though. She laughed. She's OK. She's

37:24

OK. And

37:49

all that. A

38:00

huge rift in Carolyn's family. That's our

38:02

mom. We're not going to let you

38:05

just do that. I'm Sue Mitchell, and

38:07

this story unfolded in California, on the

38:09

street where I live. Look what you

38:12

brought into your house! He's a con

38:14

artist, mother. Is Dave a

38:16

dangerous interloper, or the tender carer

38:18

he claims to be? That's why I'm

38:21

here. Thank the Lord. Find out

38:23

in Intrigue, Million Dollar Lover, from

38:25

BBC Radio 4. Listen

38:28

on BBC Sounds. If anything

38:30

happens to him, I

38:32

will just die. Have

38:39

you ever wondered what sets exceptional

38:41

leaders apart? Discover

38:43

how renowned leaders from around

38:45

the globe have harnessed their

38:47

natural talents to achieve remarkable

38:49

success. Uncover the

38:51

secrets of leadership excellence, one

38:54

strength at a time, through Gallup's

38:56

Leading with Strengths. Dive

38:59

into compelling stories at

39:01

leadingwithstrengths.com
