The chilling rise of AI scams

Released Monday, 4th September 2023

Episode Transcript
0:00

This is The Guardian. Today,

0:08

the story of a mum who gets the worst

0:10

phone call imaginable. And

0:13

why, one day soon, you could get the

0:15

same call too.

0:58

The day started as normal

1:00

as anything. My older daughter

1:03

is a former ski racer and

1:05

my younger son was competing

1:08

in a snowboarding event a few hours

1:10

away. And since my daughter is

1:12

a former racer, she thought, well, it'd be fun just

1:14

to jump in and join. So I just

1:16

told her to be safe.

1:18

Jennifer DeStefano lives in the US state

1:21

of Arizona. She's a mother of four, including

1:23

a 15-year-old called Brianna, or Brie.

1:27

I was down in the valley, it's a few hours south,

1:30

and all of a sudden I got a call from

1:32

an unknown number as I'm gathering my things,

1:34

getting out of my car in the parking lot.

1:36

Originally,

1:38

I was going to ignore it because I was trying to get

1:41

in to go meet my other daughter. And

1:43

the thought crossed my mind, well, I should answer

1:45

it because an unknown number is oftentimes

1:48

a hospital or the police or something like that. So

1:51

I decided to answer it as I had locked my

1:53

car and I was walking through the parking lot. And

1:55

it was my daughter, Brianna, crying

1:58

and sobbing, saying, I

2:00

messed up. So at

2:02

that point I was like, okay, wait, what's going on?

2:05

That's extreme. What happened? And

2:07

then all of a sudden she says, mom, these bad men have me.

2:09

Help me, help me, help me. And then the

2:12

phone gets pulled from her as her

2:14

voice fades off. And then that's when she

2:16

starts screaming, mom, please help me, pleading

2:18

in the background. And this man takes

2:21

over the phone and he says, listen here, I have

2:23

your daughter. You call the police,

2:25

you call anybody. I'm going to pop her stomach

2:28

so full of drugs. I'm going to have my way with her

2:29

and drop her in Mexico and you're never going to see her again.

2:36

Artificial intelligence has incredible

2:38

potential to change the way people communicate

2:41

and to help them do terrible things.

2:44

Today is the story of the latter and

2:47

how to protect yourself against that threat.

2:56

From The Guardian, I'm Michael Safi. Today

2:58

in focus, the era of AI scams

3:01

is already here. Are you ready?

3:10

So, Jennifer, what did you think as

3:12

you were given that terrible threat over

3:14

the phone? In that moment, the

3:16

way you feel is just sheer terror. And

3:19

I started screaming for help. One mom ran

3:21

outside and called 911. 911, do you

3:23

need police, fire or medical? 911, police.

3:26

I want to talk to my daughter. A mother just came

3:29

in, she received a phone call from

3:31

someone who has her daughter. Her

3:34

daughter, like, I can hear him on the phone, saying he wants

3:36

a million dollars.

3:37

He won't let her talk to her daughter. He's got her daughter.

3:40

The man was making all these threats, vulgar

3:42

threats. So I started

3:45

screaming at my daughter to call her dad,

3:47

call her brother. I'm trying to text my older

3:49

son. Help me, we've got to find

3:52

Brie. Something happened to her. Someone's kidnapped

3:54

her. As the man's making all these

3:56

threats, he then starts demanding

3:58

a ransom. And he wants originally...

3:59

a million dollars, which, that's

4:02

not possible. Are we

4:04

going to do this or are we not? Are you going to get me the money

4:06

or are we not? So he at

4:08

that point was demanding $50,000. They said that

4:11

they were going to pick me up in a white van, that they're

4:13

going to put a bag over my head, and that I'd better have all $50,000

4:15

in cash. And they were going

4:17

to transport me to my daughter. And if I didn't have

4:19

all the cash, then we were both dead.

4:22

At that point, the mom

4:24

who had stepped outside to call 911 came

4:26

in and she said to me that 911 tipped

4:29

her off that there's a scam going around where

4:32

they can take someone's voice, they can do anything

4:34

with it. Okay, so that is

4:36

a very popular scam. This

4:39

is what they do to try and scare people to give them money.

4:41

I need somebody to try and get ahold of her daughter. Do you have

4:43

a phone number? I can get ahold of her daughter. Call

4:46

Brie. Call Brie. No, I'm not going to doubt it. Okay, get

4:48

ahold of Brie. We need to see if she can get ahold of Brie. This is

4:50

a common scam. Okay, okay.

4:53

I didn't believe her. I was like, no, it's not just her

4:55

voice. It wasn't a recording of her voice. I had an

4:57

interaction. It was a conversation. She

4:59

was crying. She was sobbing at me. It was her.

5:01

Okay, do they have her on the phone?

5:04

No, no, Brie. Did they get Brie

5:06

on the phone? No, they can't get Brie

5:08

on the phone. There's a recording of her voice, but

5:11

that's it. Okay, what's her phone number? Can

5:13

you give me Brie's phone number?

5:17

Then the other mom finally was able to get my

5:20

husband on the phone and he went

5:22

running through the place where he was at with

5:24

my older daughter and was able to find my

5:27

older daughter safely resting in bed.

5:29

She has no idea what's going on. And

5:32

then all of a sudden the other mom comes to

5:34

me, she's like, look, your daughter's safe. She's with your husband.

5:37

They found her. Okay. And she's okay.

5:39

And she's okay? Yeah.

5:41

Okay. This is something that they're known to do.

5:43

They try and scare people and it works. As

5:46

long as Brie's okay. And I still

5:48

was so sure

5:51

of my daughter's voice and I had spoken to her. I was

5:53

like, I couldn't wrap my head

5:55

around it. I'm like, I need to talk to my daughter. I need to really

5:57

validate who I'm talking to and who's

5:59

really real. I kept asking

6:02

her over and over again, are you sure? Are

6:04

you sure you were safe? Are you sure this is the real

6:06

Brianna? Who's the real Brianna? I'm not

6:08

really sure who I've even been speaking to, but

6:11

after a couple of interchanges, I

6:13

was confident it was her. We already

6:15

reported you to the police,

6:17

and then hung up.

6:18

Okay. Thank you. Thank

6:21

you very much. You're welcome. Have a good night. Okay.

6:25

At that point I hung up with them. I literally

6:30

then just collapsed to the floor, started

6:32

sobbing, trying to process everything

6:34

that just happened. I mean, you've got the fear that

6:37

then turns into relief, but then you're

6:39

trying to process what did just happen.

6:41

Do they know where I am? Because they're

6:43

making threats to come get me. Do they know where she

6:46

is? And

6:47

then I was furious. Then

6:49

my fear turned into fury that

6:51

this is the lowest of the low money scam

6:53

I've ever heard of in my entire life. That's

6:56

what's crazy. Like they're stealing an

6:58

identity. It's not just a voice. It's

7:00

an identity. It's the way they talk. It's the

7:02

way they respond. It's their mannerisms,

7:05

the way they cry, the way they sob, the way they look,

7:07

they can do video, right? So

7:10

it is really a stolen identity and they're

7:12

using that identity for

7:14

evil.

7:25

Oliver Devane, you're a senior researcher at the

7:28

computer security company McAfee. We've

7:30

just been hearing about the absolutely shocking

7:33

AI scam that Jennifer DeStefano

7:35

was subjected to.

7:36

How does the technology in that scam

7:39

work? So how it works is

7:41

you first need to get an original audio

7:43

clip of someone speaking. And once you

7:45

have that, you can either make use of software

7:48

that's on your local box, or you can upload

7:50

it to a service provider where you

7:52

basically upload the audio and it will then

7:55

generate a synthetic clone that

7:57

sounds just like the original.
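
To give a sense of how little that workflow involves, here is a minimal sketch assuming the open-source Coqui TTS package and its XTTS v2 voice-cloning model; the file names are placeholders, and it presumes you are cloning your own voice, or one you have explicit consent to use.

    # A minimal voice-cloning sketch, assuming the open-source Coqui TTS
    # package ("pip install TTS") and its XTTS v2 model; file names are
    # placeholders. Only clone a voice you own or have clear consent to use.
    from TTS.api import TTS

    # Download and load a pretrained multilingual voice-cloning model.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A short, clean reference recording is enough to condition the clone.
    tts.tts_to_file(
        text="This is a demonstration of a cloned voice.",
        speaker_wav="my_own_voice_sample.wav",  # roughly 30 seconds of speech
        language="en",
        file_path="cloned_output.wav",
    )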

7:59

I mean, that seems really simple but also

8:02

pretty incredible technology. The

8:04

people who invented the ability

8:06

to do this, what did they have in mind that

8:09

it would be used for? I

8:10

mean there would be lots of use cases

8:12

for this technology. I'm sure that none of them

8:15

would have created it for scammers. So

8:17

one of them would be, you

8:18

know, if someone's not necessarily comfortable speaking

8:21

at public events or even on the phone, they'd

8:24

be able to generate a voice clone of

8:26

themselves. They'd be able to communicate with

8:28

their friends and loved ones. Another use

8:30

case would be if someone is ill.

8:33

Imagine if someone has throat cancer and they

8:35

may potentially lose the ability to speak,

8:37

well they'd be able to use this technology to clone

8:39

their voice

8:40

so that in years to come they'd still be able to communicate

8:43

with their friends using their original voice. I

8:46

mean all of that sounds really promising,

8:48

a really great use of this technology but it's

8:51

also so far away from the way it's being

8:53

used in Jennifer's case. One

8:56

of the things that she was wondering is where

8:58

they got the samples of Brianna's voice

9:00

because she's not a celebrity,

9:02

she's not someone who,

9:04

for example, hosts a podcast whose voice would

9:07

be kind of out there. She's just a kid

9:09

growing up in the suburbs. She did a radio

9:11

interview, she's done some sports interviews,

9:14

she has 32 followers on TikTok, it's very private,

9:17

it's dancing, it's not vocalisation,

9:20

there's no crying, there's no sobbing, she's not even talking.

9:23

So where they got the crying and sobbing, where they

9:25

tracked her mannerisms and

9:27

her personality profile, if you will,

9:29

that's what's baffling to me. So really

9:32

that's where it starts to become scary.

9:34

So where would they have gotten

9:37

her voice samples from? So Jennifer

9:39

did say that her daughter did have a social media account,

9:42

albeit, you know, there weren't many followers,

9:44

but people would still be able to access and view

9:47

those postings. There was also talk there

9:49

of radio interviews which would be

9:52

kind of like a perfect source.

9:54

Most people use social media these days

9:56

and you know people are pretty comfortable uploading

9:59

a very short video on TikTok or

10:01

Instagram, for example, and that's

10:03

kind of all they need. You don't need a

10:05

very long audio clip of someone;

10:08

you could have a 30-second clip that you

10:10

could take from a video

10:11

and then use one of these services to actually clone

10:14

the voice. And also, once

10:16

you generate the voice, you are able to alter

10:19

it and change the speed and change the tone

10:21

of the audio that has been generated so you can

10:23

make it appear that people are either

10:26

happy or in distress potentially.
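
The speed and tone changes described here are ordinary signal processing. A minimal sketch, assuming the Python libraries librosa and soundfile, with a placeholder input file:

    # A minimal sketch of altering a generated clip's speed and tone,
    # assuming the librosa and soundfile libraries
    # ("pip install librosa soundfile"); "clip.wav" is a placeholder name.
    import librosa
    import soundfile as sf

    # Load the audio at its native sample rate.
    y, sr = librosa.load("clip.wav", sr=None)

    # Slow the clip down slightly, then lower its pitch by two semitones.
    slower = librosa.effects.time_stretch(y, rate=0.85)
    darker = librosa.effects.pitch_shift(slower, sr=sr, n_steps=-2)

    sf.write("altered_clip.wav", darker, sr)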

10:29

What's interesting too about this story is

10:31

that they happen to call Jennifer at a time

10:33

when Brianna was away, when she wasn't

10:36

just in the next room. You know, Jennifer couldn't walk

10:38

into the living room and say, no, Brianna's here, she hasn't

10:40

been kidnapped. Do you often see

10:43

that these scams are obviously

10:45

very technologically sophisticated but they also

10:48

seem to have this kind of social element where

10:50

the scammer has researched

10:53

an aspect of your life. They know a little bit

10:55

about your whereabouts, your interests, all

10:57

these things that help to make it sound more

10:59

authentic.

11:00

For sure. And I think this is where the

11:02

social media aspect comes into play. So

11:05

let's imagine a scenario where I'm on holiday

11:07

in the south of France and I am going out

11:10

to a restaurant for dinner. So I could post

11:12

that, hey, I'm going out to this lovely

11:14

restaurant. What a scammer could do is he

11:17

could actually see that post occur and

11:19

then say an hour later, he'd be able to message

11:21

someone and say, hey, I was at this restaurant

11:24

and I damaged my phone so I'm not able

11:26

to contact you, but you know,

11:28

something bad's happened and I need help. And

11:31

because he would be using the information

11:33

that I've posted on social media, it's much

11:35

more believable because if my friends are following me,

11:37

they will know that I actually went to

11:39

that restaurant and I was there. So it just

11:42

kind of adds that extra layer

11:44

of truthfulness to the actual scam.

11:55

Obviously, online

11:57

scams are nothing new, but there does

11:59

seem to be something particularly insidious

12:02

and invasive about the way this technology

12:04

is used. How widespread

12:07

do we think these kinds of voice

12:09

powered AI scams are?

12:12

They are fairly widespread. So McAfee released

12:14

a report earlier this year, and in that we

12:16

did a survey across the globe. And

12:18

from that survey, we identified around

12:21

one in four British people

12:23

have experienced or know someone who

12:25

has received an AI voice clone scam. One

12:27

in four? Yeah, which is

12:29

a very high number. But what this illustrates

12:33

is that it is increasing, but

12:35

also I would say that

12:37

these types of scams are underreported.

12:40

Because unfortunately,

12:42

when people fall victim to these types of scams,

12:44

they don't tend to tell anyone, they tend

12:46

to feel

12:47

a little bit embarrassed that they

12:49

have fallen victim to it.

12:52

But actually, there is nothing to be embarrassed

12:54

about. And they are very convincing.

12:57

And I would urge anyone listening to this now,

13:00

if you have fallen victim to this, then please

13:02

report it to the authorities. Go to Action

13:04

Fraud and let them know what happened so

13:07

that the police can investigate it.

13:09

What's really surprising is how

13:11

accessible this kind of technology is already.

13:14

We found some software on the internet that claims

13:17

to be able to simulate people's voices and we

13:19

fed my own into it and here was

13:21

the result.

13:23

Want to get rich quick? Invest now in

13:25

Michael Saficoin, the hottest cryptocurrency

13:28

on the market. Don't wait, you won't regret it.

13:30

I think that sounds nothing like me for the record.

13:33

I definitely should not be replaced

13:35

by AI anytime soon, but it is

13:37

pretty shocking how easy that was to

13:39

get. It cost us a dollar and we had to tick a box

13:41

promising that we wouldn't use the voice clone

13:43

for anything illegal, fraudulent or harmful.

13:46

And once we had ticked that box, we had full access

13:49

to the technology. So given

13:51

how easy this is to get, what

13:54

are the other ways that people are using

13:56

this kind of software to scam

13:59

people and steal their money?

14:00

We have definitely seen the use of AI voice

14:02

clones as well as deepfake videos being used

14:05

for investment scams. And there was even

14:07

a case recently where Martin Lewis

14:09

was being used to advertise a crypto

14:11

investment scam. The money expert,

14:13

I mean, someone people really trust. Yes, exactly.

14:17

Elon Musk presented his new project in

14:19

which he has already invested more than $3 billion.

14:22

Musk's new project opens up great investment

14:24

opportunities.

14:25

Gosh, Martin, I mean, that is terrifying,

14:27

isn't it? It looks like you.

14:29

It sounds like you.

14:32

Yeah, I'm even wearing, I think, the same shirt that

14:34

I'm wearing today, which the image has been taken from.

14:36

This is a deepfake. These are still only the early

14:38

stages of the technology and they are only

14:41

going to get better.

14:42

So in that case, it was Martin Lewis. We've also

14:45

seen Elon Musk being used a lot, especially

14:47

around crypto investment scams.

14:49

The other thing that strikes me about this is

14:52

in relative terms, we're in the infancy of

14:54

AI. This technology is only

14:57

going to get better at an exponential

14:59

rate over the next few years. Like

15:02

in the same way, are the scams going to get

15:04

more sophisticated?

15:05

What we believe is going to happen is the scams are going

15:07

to become more personalized. So deep

15:10

down, the scams tend to remain

15:12

the same. So even the AI voice

15:15

clone is just kind of a variation of the Hey, Mum

15:17

scam that's been around for a while. That's

15:19

the scam where you might get a text from someone saying, Hey,

15:22

Mum, I need some money urgently. Please send

15:24

it to this account.

15:25

Exactly that. Yeah, exactly.

15:28

So what they're doing is they're using the technology to

15:30

make it more personalized and make it seem more believable.

15:33

But the actual fundamentals of the scam

15:35

remain the same.

15:39

Coming up, why when Jennifer went

15:41

to the police,

15:42

they said there wasn't much they could do.

16:23

Jennifer, what happened in the days

16:26

after that call? Did you try to follow

16:28

it up with the police?

16:30

When I called the police and reported what had happened,

16:33

she had said that there

16:35

was nothing they could do because no

16:37

one had been kidnapped, no money had been transferred.

16:40

So therefore it was a

16:42

prank call, unfortunately, and no crime had been committed.

16:45

It was a prank call, like this threat

16:47

that your daughter had been kidnapped,

16:49

she could be harmed, they wanted at one stage a million

16:51

dollars. It was a prank.

16:54

It was a prank call. I said to

16:56

her, I understand there's these AI calls and they're

16:58

trying to scam money. However, they were trying to

17:00

make arrangements to come physically pick me up.

17:03

And isn't there something that can be done?

17:05

And it was no, sorry, I'm

17:07

sure you're not in harm's way. She's like, all I

17:10

can offer you is I can have a police officer call

17:12

you from an unknown number, which

17:14

was the number that I had answered

17:16

to begin with. So she had offered

17:19

to have a police officer call me just to

17:21

reassure me that I'm probably

17:23

safe. And I said, yes.

17:26

However, I missed that unknown call. I

17:29

was in the shower when they attempted to call

17:31

an hour later and then I never got another

17:33

call back.

17:35

From the tone of the police officer

17:37

on your 911 call, it sounded

17:39

like this is actually a pretty common scam

17:41

that they see. Have you spoken

17:43

to anyone else who's been the victim of it?

17:46

Yeah. So we have an online bulletin board,

17:48

it's called Nextdoor. So it's

17:50

just neighbours within the geographical region.

17:52

So I put it out on the community

17:55

message board and then I had hundreds of

17:57

responses, hundreds. So many people

17:59

came forward with:

17:59

this happened to my father, this happened

18:02

to me, this happened to my mother, this happened to my friend.

18:05

One was literally halfway through driving down to Mexico

18:07

with a bag of cash to go meet somebody.

18:09

Someone else showed up at their door. And

18:13

the stories went so far and wide and

18:16

I was shocked. I was so baffled that this

18:18

has been going on, to the point where people are making

18:20

human contact with these perpetrators.

18:23

There was one mom in our studio whose cousin

18:25

had this happen to her, and it showed up as a call from

18:28

her daughter's phone. So it was her daughter's face,

18:29

her daughter's number. She had no

18:32

doubt because it was her daughter calling her as

18:34

well as her daughter's voice. She transferred

18:36

the money, it dead-ended in Mexico,

18:38

and then it was, well, now it's an international affair and there's

18:41

nothing more that can be done.

18:50

Oliver, we've just heard from Jennifer that despite

18:53

the completely terrifying ordeal

18:55

that she went through, the police treated

18:57

what had happened to her as nothing

18:59

more than a prank phone call. And in

19:01

the months since, Jennifer has fought

19:04

to have this crime taken more seriously.

19:06

She recently testified in front of the US

19:08

Congress.

19:09

It was my daughter's voice. It was her cries.

19:12

It was her sobs. It was the way she spoke.

19:14

I will never be able to shake that voice and

19:16

the desperate cries for help out of my mind. The

19:19

longer this form of terror remains unpunishable, the

19:22

farther and more egregious it will become. There

19:24

is no limit to the depth of evil AI can

19:26

enable. But

19:28

given how difficult this crime

19:30

seems to be for police to investigate, is

19:33

it something that we as individuals need

19:35

to figure out how to protect ourselves from?

19:38

And if so, how do we do that?

19:41

So usually how these scams would happen is they

19:43

would contact you from a new number. So

19:46

if you receive any communication from a

19:48

new number claiming to be someone you already

19:50

know,

19:51

then we would advise you to try and contact

19:53

that person on the number that you have associated

19:55

with them. So that would be the first warning

19:57

sign. And something else to look

19:59

out for is when they try to raise urgency

20:02

and they try to make you act quickly without

20:04

really having time to think

20:06

and send them money without giving it

20:08

too much thought because if you were to step

20:11

back and wait a few minutes and think about

20:13

it you would probably start to realize that you know this

20:15

is very suspicious and this is more than

20:17

likely a scam.

20:18

But the fundamentals of the scam

20:20

will be the same so if you're aware of the warning

20:22

signs and you know the things to look out for

20:25

then you should be able to protect yourself against

20:27

them.

20:28

So what are those warning signs? So

20:30

there are four things that we would recommend.

20:32

So the first one would be to create a codeword.

20:35

So this codeword you would create

20:38

with your family and friends and if there

20:40

ever is an emergency then you would

20:42

use that codeword to let the person who you're calling

20:44

know that it is legitimately you.
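
For illustration, that codeword check can even be made mechanical. The sketch below is hypothetical rather than an existing tool: it keeps only a salted hash of the agreed codeword, so the word itself is never stored, and it compares a caller's answer in constant time.

    # A hypothetical sketch of a family-codeword check: store only a salted
    # hash of the agreed codeword, and compare a caller's answer to it in
    # constant time. Illustrative only, not an existing tool.
    import hashlib
    import hmac
    import os

    ITERATIONS = 200_000  # PBKDF2 work factor

    def enroll(codeword: str) -> tuple[bytes, bytes]:
        """Derive a salt and hash from the agreed codeword."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac(
            "sha256", codeword.strip().lower().encode(), salt, ITERATIONS
        )
        return salt, digest

    def verify(spoken: str, salt: bytes, digest: bytes) -> bool:
        """Check a caller's answer without storing the codeword itself."""
        candidate = hashlib.pbkdf2_hmac(
            "sha256", spoken.strip().lower().encode(), salt, ITERATIONS
        )
        return hmac.compare_digest(candidate, digest)

    salt, digest = enroll("blue otter pancake")         # placeholder codeword
    assert verify("Blue Otter Pancake", salt, digest)   # caller passes
    assert not verify("mom, it's me", salt, digest)     # caller fails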

20:46

Okay codeword what next? So

20:48

the second thing we would recommend is to question the source.

20:51

So as I mentioned earlier if you receive a

20:53

message or a phone call from an unknown

20:55

number you should be a little bit suspicious if

20:57

they're claiming to be someone you know, you

20:59

know, and if you're not able to verify that. Even

21:02

if they sound like someone you know. Even

21:04

if they sound like someone you know. So

21:06

if you receive a call from an unknown number

21:08

and they claim to be, let's say, your brother, then

21:11

you know, we would advise you to

21:13

not interact with that person and actually

21:15

try to contact your brother via the usual

21:17

method so you would call him on his usual number.

21:19

The third thing we would

21:21

advise is to kind of think before you share. So this

21:24

is around social media. So

21:26

as I mentioned, I mean, if I'm on a holiday

21:29

and I'm constantly posting, say,

21:31

hourly updates of what I'm doing and where I'm

21:33

at, then that provides information to the scammer

21:35

that they can use, you know, to scam your

21:38

friends. And you know you can

21:40

still let everyone know what an amazing time you had

21:42

and everything you did

21:43

but maybe wait a few days to

21:45

actually let them know. Interesting and what's

21:47

the last one? And the last one we

21:49

would recommend is to use an identity theft

21:52

protection service and what this would

21:54

do is it would monitor your personal details online.

21:56

So if they do become available, say on the dark web,

21:58

then they'll

21:59

let you know that this has happened and provide

22:01

you steps to take in order to

22:04

protect yourself. Okay. So these are services

22:06

that are constantly scanning the dark

22:08

web, scanning the internet for any time that

22:10

your personal information pops up.

22:12

Exactly that. Yeah. And

22:14

do you in your own life, I mean, implement all

22:17

of these security measures, the code words,

22:19

the questioning, all of these things? Yes,

22:21

I do. So, yeah. So earlier this year,

22:24

I set up a codeword with my family, not

22:26

my son, he's a little bit too young, but when he's older,

22:29

I will for sure set one up with

22:31

him. All right. I hope you never have to use it.

22:33

Yeah, fingers crossed. Jennifer,

22:44

finally, has

22:45

this changed the way that

22:47

you and your family live your lives,

22:50

communicate with each other? What's been the

22:52

legacy of this completely

22:54

terrifying incident?

22:56

My daughters are definitely more aware

22:59

now after this event, my

23:01

older daughter now is concerned: you know, is someone following

23:03

me, is someone tracking me, are they coming after my siblings?

23:06

Are they coming after me personally? And

23:08

unfortunately, now you kind of, you

23:11

have to teach your children and your loved ones, you

23:13

can't trust everything you hear, you can't trust everything

23:16

you see. This is uncharted

23:19

waters. So you're trying to navigate

23:21

an unknown that's so hard

23:23

and nearly impossible because you can't navigate

23:25

what you don't know. That's

23:26

scary. Jennifer, thank

23:29

you so much for sharing your story. Thank

23:31

you so much for having me. Thanks for the global

23:33

awareness, too. That's the only way we're all going to get ahead

23:35

of it is if we all come together on it.

23:39

That was Jennifer DeStefano, who wrote about

23:41

her story for The Guardian. You can find it at our

23:44

website. Thanks also to Oliver

23:46

Devane who researches these security

23:48

threats for the digital security company McAfee.

23:51

Before we go on this week's Science Weekly, Madeline

23:54

Finlay speaks to The Guardian's technology reporter,

23:56

Hibaq Farah, about Worldcoin, a

23:58

new cryptocurrency offering users tokens

24:00

in exchange for a scan of their eyeballs. Hibaq

24:03

explains what the motives behind the company are,

24:06

why they think we all need to become verified humans,

24:09

and how governments have responded to the project.

24:11

And that is it for today. This episode was

24:13

produced by Tom Glasser and Eli Block. Sound

24:16

design was by Rudy Zygadlo. The executive

24:18

producers were Homa Khaleeli and Sami Kent,

24:21

and we'll be back tomorrow.

24:29

This is The Guardian.

