Your Face Belongs To Us — Part 2

Released Wednesday, 28th February 2024

Episode Transcript
0:00

How widespread should this be? How

0:02

often do we want to be subject to this? When

0:05

can we be comfortable that

0:07

our faces aren't being scanned, that we aren't

0:09

being identified? Welcome

0:11

to Politicology. I'm Ron Steslow, and

0:14

this is the second half of

0:16

my two-part conversation with Kashmir Hill

0:18

about her groundbreaking reporting on facial

0:20

recognition software and why

0:22

we might be living in the twilight of

0:25

anonymity. She focuses specifically

0:27

on one secretive startup at the center of

0:30

this technology, Clearview AI, which I was

0:32

somewhat surprised to learn had roots in the

0:34

MAGA movement. So if you missed part one

0:36

last week, I'd recommend starting there, because today

0:39

in part two, we pick up where we

0:41

left off and dive into technical

0:43

sweetness and the lack of ethical

0:45

considerations by the people building these

0:47

new technologies. We talk about

0:50

the different approaches to privacy laws in

0:52

the United States and Europe and how

0:54

companies are navigating this uneven terrain. Then

0:56

we get into the trend of

0:59

law enforcement agencies getting around constitutional

1:01

protections by buying information from private

1:03

companies. Then lastly, we

1:06

talk about the broader implications of facial

1:08

recognition and what the

1:10

future of privacy and anonymity

1:12

might look like in this age

1:14

of ubiquitous surveillance. This

1:17

was a fascinating discussion for me, so

1:19

I hope you enjoy it. Kashmir

1:22

Hill is a tech reporter at The

1:24

New York Times and the author of

1:26

Your Face Belongs to Us, a secretive

1:28

startup's quest to end privacy as

1:30

we know it. And

1:33

now here's part two of my discussion with Kashmir.

1:39

A little earlier, you mentioned we

1:43

can only trust the technology,

1:46

to the extent that we can trust the person who holds it.

1:50

This raises the term you introduced in

1:52

the book, or you surfaced in the

1:54

book, technical sweetness. A

1:56

couple of years ago, I talked with two...

2:00

Stanford professors who

2:02

were co-authors of System Error:

2:05

Where Big Tech Went Wrong. And

2:07

one of the key takeaways from their book

2:09

was that tech companies by default ask

2:12

themselves whether

2:14

they can do something and how they can

2:17

do something, solve a technical problem without

2:19

any of the ethical deliberations that you

2:22

would want in the evolution of a

2:24

technology. So they don't ask themselves whether

2:26

they should. The

2:31

most radical form of this ideology, I think

2:33

we should note, considers all technical

2:35

progress virtuous in and of itself. But

2:38

I wonder if you can expand on that role

2:41

that technical sweetness has played in pushing this

2:43

tech forward and

2:46

how you think about the

2:49

degree to which ethics are

2:51

baked into the technology itself

2:55

without really philosophers or ethicists being

2:57

involved in the development. Yeah,

3:01

I mean, I often think of this

3:03

as the Jurassic Park problem. They

3:08

were so excited about making the dinosaurs,

3:10

they didn't think about whether they should

3:12

make the dinosaurs. And

3:15

I certainly encountered this. I was

3:17

going back for the book. I really wanted

3:19

to understand the kind of path of facial

3:21

recognition technology. And because I think technology in

3:23

and of itself is

3:25

not inherently interesting, it's

3:27

more people and how it

3:29

can affect people's lives. But

3:31

I was talking to these

3:34

early engineers who worked on it, made

3:37

breakthroughs in

3:39

the 80s and 90s, often

3:42

not on behalf of the government,

3:45

but of private companies, like

3:47

a ratings company that wanted to put facial

3:50

recognition in TVs so it could know who

3:52

was sitting in front of them and watching,

3:55

so that it could better understand whether this

3:58

advertising time should be for

4:00

women or men or young people or old people. And

4:03

I would say, you know, you were working on this.

4:05

You made this critical breakthrough that kept driving facial

4:08

recognition forward to the point it was

4:10

today, to the point it is today

4:12

where it's incredibly powerful. And

4:14

I said, you know, were you thinking

4:16

about the implications of kind of

4:19

developing this technology, what it would mean

4:21

for our privacy, for anonymity? And

4:24

they said, no, not really. We just weren't

4:26

thinking about that. Like, it barely worked at

4:28

the time. It was hard to imagine it

4:30

ever working at the scale of

4:33

thousands or millions or billions

4:35

of people. We

4:38

just wanted to see if we could get computers to

4:40

see. And

4:43

so, yeah, so they're developing this. And

4:46

meanwhile, eventually, it gets so

4:48

powerful and

4:50

widely available. Like one thing that shocked

4:53

me about Clearview

4:55

AI is that Hoan

4:57

Ton-That, as I said, he made

4:59

Facebook quizzes, he made iPhone games.

5:02

He wasn't necessarily like a biometric

5:04

technology genius. And I

5:06

asked him, I mean, how did you go from he made

5:09

an app called Trump Hair that would put Donald

5:11

Trump's hair on people's

5:14

faces in photos. I said, how did you go from that to

5:17

building this, I mean, revolutionary,

5:21

earth-shattering technology

5:24

of facial recognition technology? And he

5:26

said, yeah, you know, I

5:29

went on Twitter and I started following machine

5:31

learning experts. I went on GitHub and I

5:33

looked up facial recognition and I

5:36

was just kind of staring at him. And

5:38

I start laughing and he starts laughing. And he

5:40

goes, I know, it sounds like I Googled flying

5:42

car and then I built one. But

5:47

he said, I was standing on the shoulders

5:49

of giants. And it's kind of

5:51

what we were talking about earlier. You know, Facebook

5:53

was developing this technology and the

5:56

thing about the people that are working on

5:58

this kind of AI, machine learning, neural

6:00

networks is there was this culture

6:02

of sharing what you developed. And

6:04

so they would publish what they

6:06

were doing. It's kind of hard

6:08

to wrap your head around it. In

6:10

the book, I say it's as if

6:12

Coca-Cola decided to release the recipe for

6:15

what it had made for everyone else

6:17

to make themselves. But

6:19

he said, standing on the shoulders of giants, I

6:21

was using research that had

6:24

come before me.
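
For readers who want a concrete sense of how low the barrier became once that research was published, here is a minimal illustrative sketch using the open-source face_recognition library (a community wrapper around dlib). It is not Clearview's code, the file names are hypothetical, and it only shows that a basic face comparison now takes a handful of lines built on other people's published work.

    # Illustrative sketch only -- not Clearview's code. The open-source
    # face_recognition library (built on dlib) does the heavy lifting.
    import face_recognition

    # Hypothetical file names: one reference photo, one new photo to check.
    known_image = face_recognition.load_image_file("known_person.jpg")
    unknown_image = face_recognition.load_image_file("new_photo.jpg")

    # Each detected face is reduced to a 128-dimensional embedding ("faceprint").
    known_encoding = face_recognition.face_encodings(known_image)[0]
    unknown_encodings = face_recognition.face_encodings(unknown_image)

    # Compare every face in the new photo against the reference faceprint.
    for encoding in unknown_encodings:
        is_match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]
        print("Same person" if is_match else "Different person")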

6:26

And then it gets put in

6:28

the hand of this young guy who

6:31

is just trying to kind of make

6:34

it in the tech field. He

6:36

wants to be famous. He wants to make

6:39

an app that everybody uses. And

6:41

I asked him, I said, okay, well, you've created

6:43

Clearview AI. You've

6:46

done something that I would

6:48

later find out Facebook and Google had also

6:50

done, but decided not to release because they

6:52

did think it was too dangerous. They

6:54

didn't want to be the companies to put it out there, but

6:57

he did. I said, you've broken through this

6:59

taboo. As you

7:01

said, you're standing on the

7:03

shoulders of giants. There's a whole bunch of other people

7:05

that can stand on the shoulders with you. There's gonna

7:07

be other people who created this. Clearview

7:09

AI decided to only allow

7:11

police to use their

7:13

technology ultimately. But I said, other

7:16

companies are gonna release this publicly.

7:18

Like, what

7:20

you have created may end privacy and

7:22

anonymity as we know it. This might

7:24

be really widespread. It might be on

7:26

everyone's phones. How do you feel

7:28

to be the one that's pushing this out

7:30

into the world? And

7:34

he said, that's a really good question. I'll have

7:36

to think about it. And

7:38

so you have this long chain

7:40

of people that are all developing

7:42

this technology. And all along the

7:44

way, nobody is thinking about the

7:46

implications. They keep expecting someone else

7:48

to do it. They're the technologists.

7:50

They develop, they do

7:52

what's possible. They wanna solve this puzzle. They wanna

7:54

put it out into the world. And then they

7:56

figure there's someone else who will decide what the

7:58

rules are for this technology, or what the guardrails

8:00

will be, that's kind of not their

8:02

job. It

8:05

raises the big difference between Clearview and

8:08

other companies, which is

8:10

not their technical ability,

8:13

right? It's not the actual technology

8:16

beneath what they're offering. It's what they're willing to let people do with

8:18

it. So you mentioned Facebook and

8:20

Google to develop this and then decided not

8:22

to release it publicly. How have different companies

8:24

approached this question and arrived

8:27

at different conclusions? And

8:29

are we now having this conversation

8:32

on the eve of perhaps another

8:34

company taking an altogether different approach,

8:36

do you think? Yeah.

8:39

So right now, so Clearview

8:41

AI has this database of

8:43

40 billion photos only allowing

8:46

law enforcement to use it and

8:48

only in the US in part because there was

8:50

such a backlash. After

8:52

my initial report about Clearview and

8:55

privacy regulators in other countries said, okay,

8:57

this is illegal. You

8:59

are violating our privacy laws. You can't just

9:01

gather a whole bunch of people's photos without

9:04

their consent and subject it to biometric analysis.

9:07

But then there's another company called PimEyes.

9:11

It's based, the corporate headquarters are

9:13

in the UAE. The

9:17

person who runs it is in the country

9:19

of Georgia and it's a

9:21

public face search engine. Their

9:24

database is about 2 billion photos and

9:26

it's only from news sites. So it's

9:28

kind of the most public of photos.

9:31

But basically anyone can use it. Anyone

9:34

listening to this that they kind of want to

9:36

have a sense of how powerful facial recognition is

9:39

or see where their face is, you can go

9:41

to PimEyes. You're supposed to

9:43

only upload your own face. It's supposed

9:45

to be a search engine for you to find out

9:47

where you appear on the internet. But

9:50

I have a subscription which allows me to see

9:52

where an image came from and the full image

9:56

and it's $30 per month and it lets me do

9:58

25 searches a day, and

10:01

there's no technical measure in place to make

10:03

sure I'm only searching one face,

10:05

my own face, 25 times a day. Yeah,

10:09

there's another public face search engine

10:11

similar to PimEyes. Facebook

10:14

and Google still

10:16

haven't released this technology, but Facebook,

10:19

now Meta. Their chief

10:21

technology officer, Andrew Bosworth, has said,

10:24

I would love to put facial

10:26

recognition capabilities into their augmented

10:28

reality glasses so that you

10:30

could look at someone and know their name. He

10:33

said, who hasn't been to a cocktail party

10:35

where you run into somebody who you've

10:37

met before, you should know their

10:39

name, you can't remember. He said, we could just

10:42

give you their name. It would be so

10:44

easy, but we're worried about the kind of

10:46

legal regulatory environment for that.

10:52

I think it's illegal in some places like Illinois,

10:54

one of the few states that has a really

10:57

strong law on this, but I

10:59

could imagine a world where maybe

11:02

Facebook did do that. I

11:04

don't think that they would just force it

11:07

on all of us as they have forced

11:09

certain things in the past. I could imagine

11:11

it being a kind of opt-in environment where

11:14

maybe me, Kashmir, maybe I'd

11:16

say, okay,

11:19

I'm a journalist, I'm pretty public. I'll just make

11:21

my face public. Anyone who sees

11:23

me on the street wearing Facebook's augmented reality

11:25

glasses can know who I am. They

11:27

can come up to me, give me their tips, tell

11:30

me great stories. Maybe somebody

11:32

else who's more private would say, okay,

11:34

I'll let my Facebook friends who I'm

11:36

connected to identify me or friends

11:38

of their friends, or maybe you mark your

11:40

face private. I can imagine a

11:42

world, this would be a very American approach to

11:45

facial recognition, but where you

11:47

have privacy settings for your face like you do

11:49

for your Facebook profile.
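
Purely as a hypothetical sketch of the opt-in model being described here, and not any real Facebook or Meta feature or API, the idea could look something like a per-person visibility setting that a recognition feature would have to check before it ever returns a name:

    # Hypothetical sketch of per-face privacy settings, as imagined above.
    # None of this is a real Facebook/Meta API; names and logic are invented
    # purely to illustrate the opt-in idea.
    from enum import Enum

    class FaceVisibility(Enum):
        PUBLIC = "public"    # anyone's AR glasses may show this person's name
        FRIENDS = "friends"  # only people they are connected to
        PRIVATE = "private"  # never identify this person

    def can_identify(viewer, subject, settings, friends):
        """Return True only if the subject has opted in for this viewer."""
        visibility = settings.get(subject, FaceVisibility.PRIVATE)  # default: private
        if visibility is FaceVisibility.PUBLIC:
            return True
        if visibility is FaceVisibility.FRIENDS:
            return viewer in friends.get(subject, set())
        return False

    # Example: a public-facing journalist opts in; a more private person does not.
    settings = {"kashmir": FaceVisibility.PUBLIC, "alex": FaceVisibility.FRIENDS}
    friends = {"alex": {"sam"}}
    print(can_identify("stranger", "kashmir", settings, friends))  # True
    print(can_identify("stranger", "alex", settings, friends))     # False
    print(can_identify("sam", "alex", settings, friends))          # True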

11:53

I've asked this question, I've been

11:55

on book tour, and I've asked different

11:57

audiences, okay, if this existed, who here

11:59

would opt in to having

12:01

their face recognized. And

12:04

in San Francisco, half of

12:06

the room raised their hands. In

12:08

New York, it was just a few. And

12:11

then I just got back from Italy

12:13

last week in both Milan and Turin,

12:16

nobody raised their hands. They shook

12:18

their heads, they're like, no, we

12:20

don't want that. But

12:23

a lot of the companies developing this

12:25

technology are in San Francisco where there's such a

12:27

different approach to what is wanted. So

12:30

I just thought that was really interesting. It's

12:32

interesting you mentioned that in Italy because I

12:35

think the Europeans maybe have a different relationship

12:37

with the idea of privacy or their own

12:39

data than we do in

12:41

the United States, right? Because they

12:43

led with GDPR which gives Europeans

12:47

very strong protections over their personal

12:49

data that we don't have in

12:51

the United States. I thought that

12:53

contrast was important. You

12:57

mentioned there was a story about a guy,

12:59

I think, who was a German

13:01

citizen who wrote to

13:04

Clearview and

13:07

asked for all of the data that they had on

13:09

him and they were required by EU law

13:11

to give it to him. What

13:14

protections do they have that we

13:16

don't in the United States? Yeah,

13:20

it's so different. And covering privacy for

13:22

10 years, it's just so striking the different approach

13:24

in Europe versus the US. The

13:27

simplistic version is that in the

13:30

US, we believe we have freedom

13:32

to things and in Europe, they

13:34

have freedom from things being

13:36

done to them. But Europe

13:38

just has a stronger right to privacy

13:41

that says that, for example, companies

13:43

can't use your data without your

13:45

consent and we have very little

13:48

of that in the US. And

13:51

so when it comes to facial recognition, it's

13:55

all these European countries and Australia

13:57

and Canada said that what Clearview

14:00

AI did was illegal. And

14:02

so Clearview pulled out of those countries

14:04

and those countries, Italy, Greece,

14:06

France, the UK, said

14:09

delete our citizens from your database. But

14:11

Clearview said, how do we know if a

14:14

face is Italian or

14:16

it's American? We can't really

14:18

do that. But

14:22

individual European citizens could reach out

14:24

to Clearview and say, hey, I

14:26

wanna see my report, I wanna

14:28

be deleted. And so Matthias

14:30

Marx is one of those people that went

14:32

ahead and did that with Clearview, also with

14:34

that site PimEyes. But then in order

14:36

to keep him out of the database, they

14:38

said, we're gonna have to keep your face so we

14:40

can block it from being

14:42

collected again because Clearview is collecting new images

14:45

all the time. They say they're collecting 75

14:47

million images per day. But

14:51

since the book was published, Clearview

14:53

says they're not respecting GDPR anymore.

14:55

They're not gonna do those opt-outs.

14:58

So they're just not complying with European law, which

15:00

is, this is where facial recognition

15:02

technology and AI in general is

15:05

complicated because it's all

15:07

operating on a global scale.

15:09

And so regulating it

15:11

is difficult. Whereas here in the

15:14

US, yeah, there's no real law that's applicable

15:16

to what Clearview AI is doing on the federal

15:18

level. And there's a few

15:20

states that have laws, Illinois being the

15:22

main one. So,

15:26

speaking of law, you write about the ACLU a bit, and

15:31

there's a very, very interesting paradox

15:34

that you explain when you're writing about

15:36

the conversation within the ACLU after Clearview's

15:39

existence became public. You

15:41

write, Clearview was part of a

15:43

trend the organization was seeing. Private

15:45

vendors selling law enforcement agencies the

15:47

fruit of surveillance methods that

15:50

would likely be unconstitutional if deployed by

15:52

the government itself. And a little

15:54

later in that paragraph, police didn't need a warrant.

15:56

They just needed to pay for the intel. Can

15:59

you... explain the tension within

16:02

the ACLU, which has traditionally

16:04

sort of stood

16:06

up for the rights of citizens

16:08

against governments, right? And the

16:11

protections that we have constitutionally are designed

16:14

to protect us from the government,

16:16

not necessarily from private industry. Can

16:19

you explain that a little bit? Yeah.

16:23

So the ACLU is very

16:26

protective of our right to challenge

16:28

the government, to protect protest and

16:32

protecting us from some privacy

16:35

intrusions. And so when they saw Clearview

16:37

AI and what it

16:39

had created, they did

16:42

not like it. And they were worried, this is the tool. You

16:48

have people getting together for Black Lives

16:51

Matter protests, protesting against

16:53

police brutality, and the police

16:55

can just take a photo and know everyone who's

16:57

there. And that could be very chilling. At

17:00

the same time, what

17:03

Clearview AI did was collect

17:06

photos from the public internet, from

17:08

social media sites, but the photos

17:10

that were public. And so

17:12

Clearview AI said, we have

17:14

a first amendment right to do this. We

17:17

are not, you know,

17:19

hacking into anybody's Facebook account. People made

17:22

this public. We're just going on the

17:24

internet and we're collecting faces that anybody

17:26

can see. We're just like Google.

17:29

In the same way Google crawls the

17:31

internet so that you can Google

17:34

Cashmere Hill and see everywhere where

17:36

her name appears, now

17:38

we've just reorganized the internet by face.

17:41

And you're going to Google Kashmir Hill's

17:43

face and just see all the photos she's in

17:45

on the internet. There's nothing intrusive about this. We have the right to do this.
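
To make the "reorganized by face" idea concrete, here is a tiny illustrative sketch of a search-by-face index, again using the open-source face_recognition library with a hypothetical local folder of photos. It bears no resemblance to Clearview's actual systems or scale; it only shows the core move the argument describes: turn each photo into a face embedding, then look up a new face by nearest distance.

    # Toy search-by-face index -- illustrative only, nothing like Clearview's
    # scale or code. Assumes a hypothetical folder of photos on disk.
    import glob
    import numpy as np
    import face_recognition

    # Index step: one 128-dimensional embedding per photo that contains a face.
    index = {}  # filename -> face embedding
    for path in glob.glob("public_photos/*.jpg"):
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:
            index[path] = encodings[0]

    def search_by_face(query_path, max_distance=0.6):
        """Return (distance, filename) pairs whose face is near the query face."""
        query_encoding = face_recognition.face_encodings(
            face_recognition.load_image_file(query_path))[0]
        files = list(index)
        distances = face_recognition.face_distance(
            np.array([index[f] for f in files]), query_encoding)
        hits = [(d, f) for d, f in zip(distances, files) if d <= max_distance]
        return sorted(hits)  # closest match first

    print(search_by_face("query_face.jpg"))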

17:48

And so I

17:50

think the ACLU is very, ACLU

17:52

is also very protective of

17:54

the First Amendment and

17:57

people who have sometimes, you

17:59

know, unpopular opinions. The ACLU has famously taken cases protecting the right to have parades and public gatherings. And so they are a bit torn on this,

18:11

but they ultimately came down on the side of anonymity. Privacy is too important to the way that we live, and we really need to protect it. And they did sue, in Illinois,

18:25

because Illinois has this very special law, a unique law passed in 2008, one of the rare laws that moved faster than technology: the Biometric Information Privacy Act. And it says that a company cannot collect

18:40

somebody's biometric information, including their faceprint, without consent, or they face a very big fine. And so the ACLU decided to file a class action in Illinois saying that what Clearview did violated this law. Clearview hired the very

18:55

famous First Amendment attorney Floyd Abrams, who has a long history of defending the First Amendment, going back to The New York Times and their right to publish the Pentagon Papers, and he made that argument. He said, hey, you know, this is just public information. We're

19:08

basically like journalists gathering what's on the public internet and, you know, organizing it and making it searchable. It did not end up flying in Illinois. The judge said, okay, yes, you have the right to gather the photos, and if you want a human being to look through them, that's fine, but you don't have the right to create this biometric identifier and use it to search them. And ultimately

19:33

the ACLU and Clearview AI wound up settling, and Clearview agreed to only sell their database to law enforcement and not to companies or private individuals. So

19:47

we haven't seen this kind of question go all the way to the Supreme Court, but there is this question of, should this company be allowed to do this? And I think if the government did this, if the US government, if the FBI decided to start making a database of faces just scraped from the internet, I think people might have a problem with that. They might say, yeah, that's an unconstitutional search and seizure,

20:13

you know, we don't want you going through our photos. But because Clearview is doing it and the government is just collecting from them, it may be an end run around those questions.

20:26

How big of an impact is this having on the broader civil rights and privacy rights landscape outside of Illinois? It hasn't gone to the Supreme Court

20:37

yet, probably because there isn't a suitable vehicle. Yeah, it hasn't yet. But I'm curious where the debates are and who the

20:45

actors within the debate are. How does that landscape look right now? So

20:51

there's lawsuits all around the country. There's the one in Illinois. There's another one in California, which has its own privacy law. There's one in Vermont, filed by

21:02

the Vermont Attorney General for violating a data broker law. So Clearview AI really kind of got a big pushback.

21:11

I mean, it seems like people are pretty comfortable with the idea of police searching this public-ish database to identify people to solve crimes,

21:27

even though there are some big questions. And that is kind of playing out with individual police departments. I think this is really interesting. Like, the Detroit Police Department has

21:38

been thinking a lot about how do we use face recognition technology, and it's significant because Detroit is actually a place where there have been three false arrests based on face recognition searches,

21:51

where they ran a search of somebody, say they got a hit, and they wound up confirming it with an eyewitness, where an eyewitness said, yeah, that's the person

21:59

that was involved in that crime. But

22:03

there you can run into a real danger of

22:06

a bad feedback loop where you've gotten this computer

22:09

to go through millions of photos, find a person

22:11

who looks most like the person in the photo,

22:13

and then an eyewitness agrees with the computer even

22:15

though it's the wrong person. So, they had two arrests.

22:20

I talk about one in the book,

22:22

Robert Williams, arrested for shoplifting, arrested

22:25

right before his birthday, held overnight, charged,

22:27

had to hire a lawyer. Another

22:29

man, Michael Oliver, arrested

22:32

for stealing a smartphone. And

22:34

then the most recent was Portia Woodruff, eight

22:36

months pregnant, arrested on a Thursday morning,

22:38

getting her kids ready for school, for

22:41

a carjacking that had happened the month before, committed by

22:43

a woman who was not visibly pregnant.

22:46

She was taken to jail, she ended up

22:48

in the hospital that night, she was so

22:51

stressed out, dehydrated, needed fluids. I

22:53

mean, this is just, when you have these

22:55

kinds of encounters, it's terrible. And

22:58

they're basically arrested for the crime of looking

23:00

like someone else. The Detroit Police Department says, we

23:03

don't want this to happen. This

23:05

was bad police work. This is not the

23:07

technology. This is that we didn't do

23:09

enough gathering of other evidence in these

23:11

cases. And we wanna do this right. They're

23:14

not using Clearview AI, in

23:17

part, I think, because they're worried about

23:19

the implications of searching through 40 billion

23:21

photos to identify somebody. It

23:25

increases the chance that you might make

23:27

a mistake.
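
A quick back-of-the-envelope sketch of why database size matters: with an assumed, purely illustrative per-comparison false match rate (not a figure for Clearview or any real system), the expected number of coincidental look-alike "hits" grows in direct proportion to how many faces you search.

    # Back-of-the-envelope sketch: bigger galleries mean more chances for a
    # coincidental look-alike. The false match rate is purely illustrative,
    # not a measured figure for Clearview or any real system.
    false_match_rate = 1e-6  # assumed chance one non-matching face clears the threshold

    for gallery_size in (100_000, 1_000_000, 40_000_000_000):
        expected_false_hits = false_match_rate * gallery_size
        p_at_least_one = 1 - (1 - false_match_rate) ** gallery_size
        print(f"{gallery_size:>14,} faces searched -> "
              f"expected false hits: {expected_false_hits:,.1f}, "
              f"chance of at least one: {p_at_least_one:.4f}")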

23:29

They're only using it for serious crimes, like

23:31

murder, assaults, home invasions.

23:35

And they're no longer

23:37

doing arrest based on facial

23:40

recognition combined with eyewitnesses. And

23:43

so you're basically, at this point, it's like up

23:45

to individual police departments. So

23:47

it's really happening at the local level. There's

23:50

some cities that banned it, San

23:52

Francisco, Portland. Some

23:58

cities around Boston have... said we don't want

24:00

police using this yet, we really need to assess

24:03

the civil liberties here and

24:05

the possibility for algorithmic

24:07

bias. But yeah, I mean,

24:09

we are in the melting pot right now. And

24:12

it's, yeah, for you as

24:14

listeners, as citizens, it's the time

24:17

to find out what is my

24:19

department doing and thinking about what you want to

24:21

happen because it is really like local

24:23

government here, city councils that are making

24:25

these big decisions. And

24:27

some places are kind of recording how often

24:30

they use facial recognition, some aren't.

24:32

Yeah, it's

24:34

really in the early

24:36

days, but I think that it could kind of solidify

24:38

very quickly, which is why I wanted to write the

24:41

book now. Like, I think these are big questions that

24:43

we need to answer. So

24:46

what does the partisan landscape look like

24:49

on this issue? Is it actually still somehow bipartisan?

24:52

You mentioned that in the book at

24:54

a certain stage, this was one of

24:56

the issues that seemed to freak everybody

24:58

out in perhaps equal measure.

25:01

And so you had a lot of agreement across

25:03

party lines. Is that still the case? Yeah,

25:07

right now in terms of doing something

25:09

at the federal level, it seems to

25:11

be coming more from the left and

25:13

the Democrats in terms of bills that

25:15

have been floated. But

25:17

it is truly a bipartisan issue.

25:20

And I saw that over the years

25:22

as, as, as facial recognition was

25:24

kind of being debated in DC back

25:26

in 2001, when it was deployed for

25:29

the first time at the Super

25:31

Bowl in Tampa, it was called, it

25:33

was called the Snooper Bowl by the

25:35

press. Dick

25:37

Armey, very conservative

25:39

Republican teamed up with the ACLU to

25:41

put out a press release saying, Hey,

25:43

this shouldn't be happening. We shouldn't be

25:45

deploying facial recognition on crowds. That's an intrusion

25:48

on civil liberties. As recently as

25:50

2018, there was a hearing organized

25:53

by John Lewis, the late John Lewis, you know,

25:55

civil rights leader, Democrat leading

25:58

the investigation

26:00

into Trump. He paired up with Mark

26:02

Meadows and Jim Jordan to do

26:05

a hearing about facial recognition technology Very,

26:07

you know conservative on the right big

26:10

supporters of Trump and they said

26:12

we don't agree about much the three of

26:14

us But we agree that we need to

26:16

do something about facial recognition technology. It's too

26:18

great a threat to our civil liberties So

26:21

it does seem like there's agreement that something

26:24

should happen. You know, this is a bipartisan issue, and

26:26

yet for some reason on the federal level just

26:29

not a lot of movement yet. There

26:31

has not been much movement. Yeah, I

26:33

wonder if that'll get invoked as we seem

26:36

to be moving forward toward something on social

26:38

media. And the

26:41

Hill seems to be abuzz about that. I just can't

26:43

imagine how they wouldn't

26:45

consider the entire landscape of data

26:47

and privacy rights sort

26:51

of as one big whole, which

26:53

is how I think it should be considered.

26:57

I mean, that may be the challenge, right?

26:59

Maybe it's just too much. Yeah, so

27:01

much data to be regulated. Yeah,

27:05

yeah, it does raise a question of what

27:08

is the fundamental right we need to protect here? And

27:11

if you can come up with a principle that maybe you can You

27:14

know establish something that the courts can

27:16

then arbitrate individual cases

27:18

later on. Okay,

27:21

I want to talk a little bit about this tension between security and

27:24

privacy Do

27:28

you think the debate can

27:31

be reframed such that the invasion

27:33

of privacy makes us less safe? You

27:37

know that that security and privacy are maybe not

27:39

mutually exclusive or zero-sum because I do think that's

27:41

the way we are sort of conditioned

27:44

to think about the trade-offs here between

27:48

security and privacy or anonymity and

27:50

convenience is there a different way

27:52

of thinking about the relationship between

27:54

those things? So

27:56

I don't think it's two sides of a coin. It's more of a spectrum, right?

28:04

There are some ways in which it might make us more secure, and yet we lose freedom because of it and ultimately are a less secure society.

28:16

Let me just talk about it personally, how this could be used for your own personal security. So let's say you're a person going out to a bar on Saturday night.

28:30

There's a bunch of people there, and you've got an app on your phone, or PimEyes, which is browser-based, on your phone, and you're talking to somebody. They're telling you about themselves, and you're wondering, is this person who they say they are?

28:44

You take a little photo, you search their face, and you find out their name, maybe you Google them, and you get this information, and their bio lines up with what they told you. You may feel more confident being with that person.

28:57

On the flip side, let's say you're out at a bar and there's some really creepy person staring at you obsessively. They keep trying to talk to you, you do not want to talk to them, you're trying to avoid them. They take a picture of your face, now they know your name, they can find your social media profile.

29:14

Maybe they find out where you live. So it's both things at the same time, right? It could protect your security, or it could be used to invade your privacy and allow this person that you never want to see again to now figure out who you are and where you live.

29:29

So it is two things at the same time. Part of how we're addressing this right now is, as we were saying, okay, we're comfortable with Clearview AI selling this to police,

29:43

where we trust the police to use this responsibly and to do the work that they're supposed to once they identify somebody. But it can lead to insecurity if, for example, it's used irresponsibly,

29:56

like the cases we talked about where people are falsely arrested, where police fall prey to automation bias, where the system tells them, this is your person, and they put too much trust in that,

30:08

and then they see all other evidence as confirming that initial identification, and they go and arrest somebody who shouldn't be arrested, who should be living his life, be with his family, not spending a night in prison, not hiring a lawyer

30:22

to defend him against charges that are completely irresponsible. So yeah, I mean, it can go so many different ways that I think we need to think about it on a deeper level

30:37

of, okay, this exists, it is out there. How

30:44

widespread should this be? How often do we

30:46

want to be subject to this? When

30:49

can we be comfortable that

30:51

our faces aren't being scanned that we aren't

30:53

being identified? You include

30:55

a quote from Al Franken's former

30:57

staffer Alvaro Bedoya about what

31:00

the future could look like and he

31:02

says Do I want to

31:04

live in a society where people can be identified

31:06

secretly and at a distance by the government? I

31:09

do not and I think I'm not alone in that

31:12

and there are so many stories in the book about

31:16

people who were freaked out by facial

31:18

recognition being used on them during a

31:20

demo and As I

31:22

was reading I was thinking I would definitely be

31:24

freaked out myself if someone you know showed this

31:26

to me In

31:29

a demo, although the you know the marketer

31:31

in me would think oh my god, this

31:33

is so unbelievably powerful But

31:37

it really invokes. I think a much broader question

31:41

because You know any

31:43

resistance that we might have to

31:45

sacrificing our privacy or our anonymity and I

31:48

think those are two different things ultimately

31:51

has been quickly pacified by the Convenience

31:56

offered by these tools, but the

31:58

trade-off is usually You know,

32:00

obscured by a wall of legalese that we

32:02

just click accept very quickly on. And

32:05

as a species, I think we have proved fairly pliable

32:08

to the technology companies.

32:12

And the cost has been more broad than just privacy.

32:14

I mean, we touched on a little

32:16

bit earlier, but consider the mental health crisis that we are

32:18

now, I think, rightly outraged

32:21

about, and how indisputably

32:23

linked it is to the rise of social

32:25

media. You also

32:27

quote the science fiction writer William Gibson in

32:30

the book discussing that the future is already

32:32

here. It's just not that evenly distributed, which

32:34

I thought was brilliant

32:37

and true. And it made me think of all

32:39

of the other technologies beyond

32:42

facial recognition that you don't

32:44

discuss directly. But for

32:46

someone who pays close attention,

32:49

facial recognition technology is one sort

32:52

of rising piece of

32:55

a very broad landscape of

32:57

technologies that are very

33:00

quickly eroding privacy

33:02

and anonymity. If you consider the

33:05

aggregation of DNA with sites like Ancestry and

33:08

23andMe with vast databases

33:11

of DNA records submitted

33:13

voluntarily by customers, you

33:16

have voiceprints, which you did mention. A few months ago, 404

33:18

Media, I think, broke a story about a

33:24

marketing firm, a data

33:26

firm in Georgia, announcing,

33:29

sort of broadcasting that your devices are listening

33:31

to you actively, and they're selling

33:33

that data to advertisers who want to advertise to

33:35

you the moment you are talking about a thing,

33:37

a product that you might be interested in. Then

33:41

you have... I have concerns

33:43

about that story. I'm very skeptical. Oh, are you

33:45

skeptical? Okay. Yeah, I

33:47

don't think they did enough digging on that

33:49

story. Nevertheless,

33:54

the list goes on. I mean, behavior... I'm looking

33:56

into it. I'll tell you that, Ron. Oh, perfect.

33:58

Okay. Behavioral fingerprinting,

34:01

which I'm sure you're aware of, right,

34:04

which is the unique signature that

34:08

you give off in the way that you move your

34:10

cursor around the web can

34:12

be used to identify you

34:15

or the keystrokes, the pattern

34:17

of your keystrokes. This

34:19

is offered, I think, by firms like Palantir. And

34:24

then obviously there's state-level hacking software

34:27

like Pegasus, which we've talked about before. And

34:33

oh yeah, and then there was that demonstration,

34:36

I think, that Tristan

34:38

Harris showed in a

34:41

presentation where these neural

34:43

nets, the large language models, can now use

34:45

Wi-Fi signals to identify body

34:48

images through

34:50

walls. And anybody's Wi-Fi

34:52

router can now be hijacked to see

34:55

what's going on inside their homes. So

34:58

I just want, I think, listeners to

35:00

think about the broader landscape of technologies

35:02

that are being used to de-anonymize

35:06

them and whether

35:09

there's anything, what they can do

35:11

to maybe

35:13

fight back or maybe secure their

35:17

own privacy, their own anonymity, or maybe there's a

35:19

different way they should be thinking about all these

35:21

technologies. But

35:24

you write in the book about Minority Report, right,

35:26

about the facial recognition in Minority Report. But

35:29

the one thing you didn't mention

35:31

about Minority Report was the precogs and the

35:33

ability to use all of this data to

35:35

predict what someone is going to do. And

35:38

I couldn't help think about

35:41

all of these tools, as I'm

35:43

reading your book, ultimately

35:46

being aggregated and centralized

35:50

to paint a

35:52

nearly perfect picture of an individual

35:54

and spit out highly

35:56

accurate predictions of whether

35:59

they would commit a crime in the future, or

36:01

fill in the blank of the ways you could use

36:04

this data to try and predict the future. That's

36:07

a lot to think about, but I

36:09

just am curious how

36:12

you think about all of this

36:14

because you cover it every day

36:16

and what do you do in

36:18

your day-to-day life to maybe

36:20

guard against the future that

36:22

is coming or that is here now, but it's just

36:25

not evenly distributed. Wow.

36:29

It's rare that somebody paints

36:32

a more dystopian version of

36:34

the world than I do. I do not mean to.

36:38

You just did. But

36:41

look, right, technologically,

36:44

that world is possible

36:47

where all of our faces are known, all

36:49

of our voices are known, all of our gaits

36:51

are known. You can identify us by our walk.

36:54

They have our DNA. Every

36:58

glass and bit of hair you leave

37:00

behind can be traced back to you.

37:02

We could live in that world. I

37:06

hope we will choose not to. And

37:09

I have some optimism from

37:11

the past, and that is

37:13

that we are in a kind of similar

37:15

moment in the 1950s, 1960s, when there was

37:17

the development of small listening devices, bugs, and

37:20

wiretapping equipment.

37:27

And there was a national

37:29

panic about the

37:32

end of privacy in terms of

37:34

your conversations that any

37:37

phone call you make might be recorded

37:39

because there were private detectives that were

37:41

tapping lines all the time, hired by

37:43

husbands trying to prove that their wives

37:46

were cheating, prove infidelity. Richard

37:49

Nixon was recording every

37:52

single conversation in the White House. People

37:55

were in a panic that you couldn't have a

37:57

private conversation. And we passed

38:00

laws that made it illegal to

38:02

secretly record people, made

38:06

it illegal for you to

38:10

be wiretapped, and that the government needed to

38:12

get a warrant to wiretap you. And

38:15

so what you were talking about is this idea

38:17

of a company that secretly

38:19

is listening to us and giving

38:21

us ads based on what we say. If they were

38:23

doing that, that would be very illegal. That

38:25

would cost them a lot of money and

38:28

somebody would probably go to jail. We

38:30

did pass laws to prevent that. And it's part of

38:32

the reason why the surveillance cameras

38:34

that are all around this country, that

38:37

many of us are passing hundreds

38:39

of cameras every day in our workplaces,

38:41

on the streets, in grocery

38:43

stores, they're only recording our images

38:46

and not audio. They're not recording

38:49

our conversations because we decided we

38:51

didn't want to live in a world where

38:53

everything we said was recorded all the time.

38:58

We said we don't want that. And so we're

39:01

in that moment now. What do we

39:03

want the world to look like? Technologically,

39:06

it is possible to do these things. But

39:09

I think that we can restrain them through

39:11

norms, what we

39:14

do, what we want, and then very

39:16

importantly through laws, and then

39:18

also the technology companies themselves, what they

39:20

decide to send to

39:22

market, offer to us. So

39:26

yeah, in terms of what people

39:28

can do specifically about facial recognition

39:30

technology, get yourself a

39:32

very fashionable ski mask, that

39:35

you can wear around. Do those work then? Whenever

39:37

you're doing something very private. Yeah,

39:40

full color, yeah. Oh yeah, ski

39:42

mask. If you're going to dinner with your secret lover, you're

39:45

going to dinner with your secret lover, definitely both

39:47

wear ski masks. No one will think

39:49

that's strange around you. No,

39:51

but just

39:54

think about right now, think about

39:56

what you're putting on the public internet. I'm a mother, I

39:58

have two young children. I try

40:01

not to post photos of them to the public

40:03

internet. Sure. I still use Instagram. I have a

40:05

private account I text

40:07

photos to my loved ones But

40:09

I'm not putting photos out there for

40:11

all the world to see Photos

40:13

that will follow them for the rest of their lives Trying

40:16

to let them choose their

40:19

own kind of privacy I

40:22

I know this is gonna be this may be

40:25

surprising, but I do think people should go to

40:27

PimEyes and look up their faces and see

40:29

what's out there. It's not gonna give you

40:31

a Clearview-level search. But

40:33

it will at least show you if your face kind of

40:35

shows up on a news site If

40:37

your face easily leads somebody to your name or

40:40

to photos you don't want them to see. And

40:43

PimEyes at this point has an opt-out

40:45

system So if there are photos you don't like or

40:47

you don't want your face to be searchable You

40:50

can opt out of there. You can opt out of

40:52

their search so that doesn't show up for other people

40:55

But yeah, and then it's just you know Figuring

40:58

out what you want and kind of through

41:01

our democratic process letting lawmakers know,

41:03

you know What should they do

41:05

about facial recognition technology? Should they

41:07

do something like in Illinois where there's a

41:09

law that says companies can't use this information

41:12

without your consent? Should

41:14

there be more oversight over

41:16

how your police department using facial

41:18

recognition technology? I think

41:20

just being aware and kind of operating in this

41:22

world knowing that the power exists And

41:25

thinking about how you want the power

41:27

to manifest in your day-to-day life. It's

41:29

just a very important time to

41:33

be thinking about. Totally agree. Kashmir

41:36

Hill, is there anything in the book or

41:39

otherwise that we didn't touch on that

41:41

you want to mention or dig into? The only thing

41:44

is Madison Square Garden. I think it's a crazy

41:46

example. I

41:48

feel like we should bring it up. Yeah, this

41:50

is the venue, I think, that

41:52

I was referring to in

41:55

the intro, about the lawyers being

41:57

turned away at the door because they were

41:59

working on a... You

42:01

can fill in the details, but it's

42:03

still happening, right? They're still able to do this? Yeah,

42:07

so Madison Square Garden put in facial

42:10

recognition systems for security reasons. I believe

42:12

they did it around the time of

42:14

the Emmys, and they may have been

42:16

using it in the Taylor Swift model,

42:18

where Taylor Swift has been reported

42:20

to have facial recognition cameras to identify

42:23

stalkers, to be aware of them if

42:25

they're coming to her concerts, and to keep them out. But

42:27

they're using it for security reasons. Somebody had a

42:29

fight at the venue, threw a beer bottle down

42:31

on the ice during a hockey game, they would

42:33

get on the ban list. But

42:35

as I was finishing the book, I heard

42:37

about something wilder that was

42:39

happening. The

42:42

first case I heard of is a

42:44

mother who was taking her daughter's Girl

42:46

Scout troop to see the Rockettes at

42:48

Radio City Music Hall. And when

42:50

she got to the door, she

42:52

got pulled aside and they said,

42:54

you can't come in because

42:57

you're a lawyer and you work at a law

42:59

firm that has a suit against Madison Square Garden,

43:01

which owns Radio City Music Hall. And

43:03

you're not welcome here until that suit

43:06

is resolved or dropped. And

43:08

it turned out that

43:10

Madison Square Garden had about 90 law

43:12

firms on its ban list. It had gone

43:14

to the firm's websites and

43:16

scraped the photos of the lawyers from

43:18

their own bio pages. And

43:21

they had thousands of lawyers on the ban list

43:23

and they couldn't go to Rangers games or Knicks

43:25

games or Mariah

43:27

Carey concerts. Even if

43:29

they had a friend buy the ticket, it didn't matter, they could

43:31

be turned away at the door by face. And I

43:34

actually, I wanted to see this happen. So I bought

43:36

Rangers tickets for me and a personal injury attorney who

43:38

was on the list. And thousands

43:41

of people streaming into the garden, we go

43:43

to the door, put our bags on the

43:45

conveyor belt. By the time we picked them up, security

43:47

guard walked up to us, asked her for ID and

43:50

said, okay, sorry, you can't come in. And

43:52

she said, I'm not working on any case against the

43:55

garden. So he said, it doesn't matter, your whole

43:57

firm's banned until it's done. And

43:59

so, I mean, this was shocking to me.

44:01

I just didn't think I would see a use case

44:03

like this for another five or ten years kind of

44:06

weaponizing the technology against your enemies

44:10

But it was yeah, I mean it

44:12

was just it showed how the

44:14

technology really could usher in this new

44:17

era of discrimination, where what

44:20

is invisible about you could become visible.

44:22

Let's say there's a list of people

44:24

with certain political viewpoints or

44:27

people who are vaxxed or anti-vax, or

44:30

they're journalists, you know, or

44:32

they work for the government. There's just all these ways that you

44:34

could be monitored or

44:36

have services denied to you based

44:39

on your face because you know being a

44:41

lawyer is not a protected class. It harkens all

44:43

the way back to the beginning with Hoan Ton-That

44:45

and the convention

44:47

and wanting to ID, you know

44:50

The libtards or whoever right? And

44:54

Yeah, what really worries me about facial

44:56

recognition technology It just takes like all

44:58

these things that have happened on the internet the data

45:00

collection, knowing who you are,

45:02

seeing what you're doing the kind

45:04

of polarization of I

45:07

know what you are. I know what you believe I hate you, you

45:09

know all that can just be

45:11

transferred to the real world because our face

45:13

would become a way to unlock the kind of

45:16

internet and what's knowable about us and That's that

45:18

is that is one thing I find very

45:20

chilling about that possible future Since

45:23

you mentioned protected class, how do you think

45:25

we'll need to reassess discrimination with this

45:28

much data at anyone's fingertips?

45:32

I mean, I think it's

45:34

either reassessing discrimination, like, do we add

45:36

protected classes so that you can't discriminate against

45:38

people based on their job? I don't

45:41

know. Or

45:43

do you regulate

45:45

the means of discrimination, the facial

45:47

recognition itself. So Madison Square Garden also

45:51

owns a theater in Chicago, which

45:53

is in Illinois. And so

45:55

lawyers cannot be discriminated against

45:58

at the Chicago theater by

46:00

face because Madison Square Garden would

46:03

need their consent to use their

46:05

face prints. So they're still banned

46:07

from the Chicago theater, but

46:10

they can't be kept out by face. So if a friend buys

46:12

them a ticket, they can still get it in. They

46:15

need another way in. Okay.

46:21

This has been terrific. I really enjoyed

46:23

this. If people want the book, where's

46:25

the best place to go to get it? Also,

46:27

there's an Audible version, we should say, read by

46:29

the author. It's fantastic. Yeah, I filmed it here

46:31

actually. Did you enjoy this conversation? Oh, did you?

46:34

Yeah. Oh, yeah. Yeah.

46:37

It's terrific. Yeah.

46:39

Where's the best place to send people and follow your

46:41

work more generally? Yeah, you can

46:43

get it all over the place. I mean,

46:45

I love Bookshop. I really appreciate Barnes and

46:47

Noble, where you can actually buy it in

46:49

person, Amazon, of course. But yeah,

46:52

the audio book was fun

46:54

to do. Was it? Yeah.

46:57

Yeah. I really

46:59

liked getting to read the book.

47:02

It was taxing. But

47:05

people seem to really enjoy hearing from the author.

47:07

They do. I do certainly. I always think the

47:09

book comes to... Whatever the book is, it comes

47:11

to life more when the author is reading it

47:14

than someone else. So kudos to you for doing

47:16

that. It's

47:18

a very entertaining listen also. Okay.

47:22

I think that's all we

47:24

got. So your Twitter handle also. Oh, yeah.

47:27

My X handle is Kash Hill. Your

47:29

X handle, sorry. Yeah,

47:33

I'm Kash Hill basically everywhere on the internet. And

47:35

then Kashmir Hill, of course, at The New York Times.

47:38

Terrific. Or people can just take a picture

47:40

and Google it in the search field.

47:42

Find you everywhere. PimEyes

47:44

you. Okay. Kashmir, thanks

47:46

so much for being here. Thank you so much for

47:48

having me on. This was a great conversation. Thank

47:52

you to everyone at home and on the go for

47:55

listening. And make sure you're subscribed

47:57

so you get notified when the second part of

47:59

this conversation... drops next week. If

48:01

you haven't yet, we'd appreciate it if you

48:04

could open up the Apple Podcasts app and

48:06

give us a five-star rating and review over

48:08

there. This helps us rise

48:10

in the rankings so that new

48:12

people can discover Politicology organically. If

48:16

you have questions about anything we've talked

48:18

about, you can reach us as always

48:20

at podcast at politicology.com. We

48:23

do read everything you send us, whether it's

48:25

an episode idea, a guest recommendation,

48:27

or just a simple note about how the

48:29

show has impacted you, and we love hearing

48:31

from you. I'm Ron Steslow. I'll

48:34

see you in the next episode.
