State of scrutiny: Is mass surveillance justified?

Released Tuesday, 31st October 2023

Episode Transcript

0:04

Welcome to Doha Debates, where we

0:06

explore an urgent issue from various sides

0:09

and try to find common ground. Get

0:11

ready for a conversation that's well-informed,

0:13

spirited, civil, and respectful.

0:16

Hey there, I'm Joshua Johnson, and I will be your

0:18

moderator for this debate. Today we're

0:21

talking about government surveillance,

0:23

the benefits, and the costs. Now

0:26

this debate took on a whole new dimension in 2013

0:29

because of Edward Snowden, a former

0:31

contractor for the U.S. government's National

0:34

Security Agency. He exposed

0:36

a program that had collected data

0:38

from American phone calls. The

0:40

NSA is supposed to focus on

0:43

foreign intelligence, not on spying

0:45

on the United States. So that raised all

0:47

new questions about government surveillance going

0:50

too far.

0:51

But things have gotten much more sophisticated

0:53

since then.

0:55

Facial recognition technology can identify

0:57

people in photos almost instantly. A

1:00

license plate reader can map out your daily

1:02

movements. And closed-circuit TV

1:04

is more abundant than ever. New York

1:07

now has more than 25,000 cameras monitoring

1:09

its streets. And

1:13

according to Amnesty International, they are

1:15

disproportionately placed in predominantly

1:17

black and brown neighborhoods. Supporters

1:20

of this technology might say these tools are

1:22

necessary to keep everyone safe.

1:25

Terrorism has changed the debate over

1:27

surveillance in very big ways, as

1:29

folks in New York very well know. Opponents

1:32

might raise concerns over privacy and

1:34

human rights abuses. China, for example,

1:36

monitors internet searches and social media

1:39

while expanding its use of facial recognition.

1:42

More liberal democracies are also reckoning

1:44

with mass surveillance. For example, last

1:47

year, Belgium passed a data

1:49

retention law which required internet

1:52

providers and telecommunications companies

1:54

to retain

1:55

user data. The year before

1:57

that, the EU's Court of Justice

1:59

ruled that Belgium's prior data

2:01

retention law violated privacy

2:04

rights. So what is the proper

2:06

role of mass surveillance? How

2:08

do we balance individual privacy

2:11

and collective safety? Let's

2:13

get into all of that with today's guests. Joining

2:15

us from Washington, DC is Jamil Jaffer.

2:18

He is the founder and executive director of

2:20

the National Security Institute at

2:22

George Mason University School of

2:24

Law. Jamil Jaffer, welcome. Good to have you with us. Hey,

2:27

Joshua. Glad to be here. Joining us from San Francisco

2:29

is Cindy Cohn. She's the executive director

2:32

of the Electronic Frontier Foundation,

2:34

a nonprofit organization focused on

2:37

civil liberties in the digital world. Cindy

2:39

Cohn, welcome. Thanks. And as always,

2:41

we have a global listener who will have some questions

2:44

for our panelists a little later in

2:46

the debate. Before we begin, I have

2:48

two ground rules. First,

2:51

no personal attacks. We're here to pick

2:53

apart the issue, not each other. Second,

2:56

every question needs a direct answer.

2:59

It's fine to think out loud, of course, but

3:01

please don't pivot to another topic until

3:04

you've answered the question at hand. And please,

3:07

my personal pet peeve, don't

3:10

answer a question with a question.

3:14

If we're agreed on all that, Cindy Cohn, let

3:16

me start with you. EFF has raised concerns

3:19

for years about mass surveillance and

3:21

the concerns over infringing on human

3:23

rights. Summarize

3:26

for us why those concerns exist.

3:28

Well, whether you're talking about

3:30

in the United States as a matter

3:32

of the Fourth Amendment or the First Amendment,

3:34

or you're talking internationally with regard to freedom

3:36

of expression or the need to have government

3:39

activities be necessary and proportionate

3:42

to the government's goals, mass surveillance

3:44

violates those. It

3:46

flips the ordinary idea that

3:48

you are innocent until proven guilty on

3:51

its head and basically puts all of us

3:53

in a perpetual lineup. The

3:55

government's goal in terms of, let's

3:57

just talk about national security mass surveillance,

4:00

is to collect it all first

4:03

and then sort out second what it is they

4:05

actually need. This puts people's

4:07

privacy rights, honestly at the bottom

4:09

of the list of things to do. It also impacts

4:12

freedom of expression in terms

4:14

of kind of a rights-based analysis.

4:16

I think there are also serious questions about how effective

4:19

this kind of surveillance can be at scale.

4:22

If you're trying to spy on the entire world,

4:24

you're not gonna do it very well. And

4:27

we see over and over again in

4:29

the context again of the American national security

4:31

infrastructure, they can't do it very well.

4:34

Even the rules that they put in place for themselves,

4:36

which frankly are far too lenient

4:39

for me and for what I think human rights

4:41

require for people, they can't

4:43

even stick to them. They get in trouble

4:45

over and over again from the very limited

4:48

review that the secret

4:50

FISA court does or the Congress does. So

4:53

I think that it's time for us to acknowledge that

4:55

mass surveillance is not only a problem for

4:57

our rights, but I think there's a real question

4:59

about whether the government, at least

5:01

the US government, though I think this is a problem internationally too,

5:04

can even do this at scale in a

5:06

way that is even marginally consistent

5:09

with our values. And I just

5:10

wanna clarify for viewers around the world, FISA,

5:13

the term that Cindy Cohn referred to, is

5:15

the Foreign Intelligence Surveillance

5:17

Act. That's an American law that

5:19

requires certain kinds of intelligence collection to be regulated,

5:22

including being approved by a specific

5:25

judge before it can be carried out.

5:27

Jamil Jaffer, I understand you share some of

5:29

these concerns, but believe that

5:31

surveillance technology can still, broadly speaking,

5:34

be beneficial to society.

5:36

Tell us why. Well, Joshua, I think that's exactly

5:38

right. There are governments in the world that use

5:41

their surveillance capabilities with reckless

5:43

abandon, and that's hugely troubling. Frankly, to be honest

5:45

with you, a lot of those governments are allies

5:47

in Europe who claim to have concerns about privacy

5:49

and the like and tend to be the most voracious

5:52

collectors of intelligence. There are also

5:54

our adversary nations, like China and Russia,

5:56

that sweep up all the communications in countries,

5:59

search it by keyword and the like, but what we've

6:01

learned about at least American surveillance, as

6:03

it turns out, is that's not how America conducts

6:05

surveillance.

6:06

The National Security Agency, we now have a

6:08

tremendous amount more information than we

6:10

ever have about American surveillance collection.

6:13

We search for email

6:15

addresses, phone numbers, and the like that

6:17

are specific targeted ones, a

6:19

few hundred thousand of those, and that's

6:21

what our government looks at. I can tell you American

6:24

surveillance under anybody's standards, Cindy's

6:26

or anybody else's, isn't mass surveillance

6:28

as much as Cindy might want it to be, right,

6:30

or might say it is. Like the truth is,

6:32

now that we have all the information, a

6:35

few hundred thousand email addresses and phone numbers across

6:37

the globe being surveilled, chances are

6:39

you're probably not under surveillance by the US government.

6:42

Cindy, go ahead. I mean,

6:43

I just like every single

6:45

time the FISA court has looked at what

6:47

the US government is actually doing, they

6:49

find that the difference between Mr.

6:52

Jaffer's ideal world and what is actually

6:54

happening on the ground is vast. There's

6:57

millions and millions of queries by the FBI alone,

7:00

affecting way more people

7:02

than just who they're supposed

7:03

to. We discovered

7:05

what they called LOVEINT, right? NSA

7:07

officials doing searches on their exes

7:10

to try to find out where they are. So

7:12

the US government is not perfect at this. They're

7:14

not even close to perfect at this. They have

7:16

had to be scaled back over and over and

7:19

over again. And again, I know this is a global audience.

7:21

I think it's really important that we talk about the

7:23

surveillance under Executive Order 12333 is

7:26

the United States government granting itself

7:30

the ability to spy on the entire world, which

7:32

I think is very inconsistent with international

7:34

law and not necessarily the way the rest

7:36

of the world looks at people who are not

7:39

citizens. But I think that

7:41

the idea that,

7:43

first of all, that only the US government

7:45

would ever be able to do this and that they do it perfectly

7:48

are just not borne out by

7:50

the truth as it trickles out

7:52

of what is an extremely secretive thing.

7:55

I don't think we disagree at all that some of this stuff

7:57

has to be secret. But

7:59

even in the context of an American warrant,

8:01

eventually you have to tell

8:04

the defendant that you did a wiretap and

8:06

you have to show them the basis upon which

8:08

you did the wiretap. And again, this is a situation

8:10

in which in the national security context,

8:13

the US government has been very, very bad

8:16

at doing that when they've done it at all. And

8:18

there are serious concerns that lots of people

8:20

have been prosecuted in the United States

8:23

without being given fair notice that

8:25

national security warrants were used against them. And

8:28

in fact, it wasn't happening at all until

8:30

the US government got caught lying to the Supreme

8:32

Court about it and had to kind

8:34

of quickly change course and

8:37

start giving some kind of notice, although

8:39

it wasn't enough. So the level of

8:41

secrecy that the government is demanding

8:44

isn't really about protecting their operational

8:46

things now. I think it's about preventing

8:49

the American people from getting a clear enough view

8:51

of this to be able to exercise their democratic

8:53

rights to decide whether this is what

8:55

they want the government to do, and to challenge

8:57

it in court. So I think there's

9:00

a huge piece that

9:02

we could do about removing some of this opacity

9:05

and making some of this more clear and transparency

9:08

is the best disinfectant before we get to

9:10

the, you know, what are the names of

9:12

the terrorists they're tracking? And

9:14

we see the government overclaiming secrecy

9:17

all over the place here.

9:18

I do want to be clear also when we talk about

9:20

surveillance, just so everyone is clear,

9:23

I am talking more about for the

9:25

sake of this debate, the kind of broad

9:27

spectrum sweeping, just

9:30

in case we catch something kind of monitoring,

9:32

as opposed to, you know, this

9:35

person is known to be a criminal,

9:37

we want to get this kind of evidence to for this

9:39

kind of case, we'll talk about that

9:41

a little bit later on. But for the purposes

9:44

of this, we really are talking about kind of sweeping

9:47

broad data collection. That's

9:49

kind of what the focus is of our conversation. Cindy,

9:51

go ahead.

9:51

Yeah, I think it's important that we also clarify terms.

9:54

I think the difference is like, are you at the top of the

9:56

funnel or perhaps at the bottom of the

9:58

funnel to be kind? It's

10:00

different than a wiretap

10:02

where you're tapping into one person

10:05

who you've pre-identified and gotten a warrant.

10:07

In my world, and I think actually in most

10:09

people's world, this top

10:12

of the funnel is mass surveillance.

10:14

And we don't just look at the bottom of the funnel. We look at

10:16

all the people who are impacted because there's a lot

10:18

of trouble that can happen between the top of the funnel

10:21

and the bottom of the funnel. And we care about

10:23

the ability to have a private conversation or

10:25

to do private browsing online. And

10:28

the fact that the government looks at

10:30

it and then maybe decides not to target, maybe

10:33

decides not to keep your communication still means

10:35

that it's not private, right? Because

10:37

there is an initial time when they're taking a look

10:39

and deciding if it's the thing they want to keep or not.

10:42

So to me, and I actually think to most

10:44

people around the world, that is mass surveillance.

10:46

And trying to reframe it as targeted

10:48

surveillance by looking at the bottom of the funnel

10:51

instead of the top of the funnel, I think ends up confusing

10:53

people because they tend to think that the only thing

10:55

that's happening is something akin

10:58

to a traditional wiretap where you go to a judge

11:00

and you make a probable cause showing and you get a very

11:02

specific order that lets you just look

11:04

at the person who you've pre-identified. That's

11:07

targeted surveillance. Mass surveillance

11:09

is when you look at everybody. And

11:12

are you fishing with a drift net? Are

11:14

you fishing with a line and a pole? And the kind

11:16

of surveillance that EFF has been suing

11:18

about since, frankly, way before Mr.

11:21

Snowden showed up. He did

11:23

a tremendous service to all of us by presenting

11:26

so much information that the government could no longer

11:28

lie and say it wasn't happening. But

11:31

the drift net fishing is something that

11:33

we think is inconsistent with rights

11:35

and that fishing with a line and a pole, some

11:38

surveillance can happen in the instance

11:40

of a probable cause warrant

11:42

made under the Fourth Amendment with a judge.

11:45

That's targeted surveillance. And the broader stuff

11:47

is mass surveillance. So as long as we can agree

11:49

on terms and we can agree or disagree about

11:51

what we like. But I think we have to start by understanding

11:54

the difference between what

11:57

Professor Jaffer said and what I'm talking about

12:00

is pretty significant in terms of what's

12:02

actually happening.

12:03

Well, let me get deeper into that difference. Jamil

12:05

Jaffer, let me come back to you with regard to that

12:07

funnel. And we mentioned Edward Snowden.

12:09

You have said before that the big

12:12

takeaway from this incident was

12:14

that the data collection was done without

12:17

enough public scrutiny around the process

12:19

and the way that it was done. But the NSA

12:21

is a spy agency. Their job is to keep

12:23

secrets. Is it even possible

12:26

to do this with public scrutiny

12:28

to make the funnel a little more like,

12:30

I know I'm getting too deep into this metaphor, but more

12:33

like a colander that is still able to

12:35

do all of this filtering, but

12:37

in a way that is not quite so

12:39

opaque. It feels like the opacity

12:42

of the process is kind of necessary

12:45

for it to work at all,

12:48

and it seems impossible to do

12:50

it really with public

12:52

scrutiny. Well, Joshua, it's

12:54

a great question. I think there is a way to do it with

12:57

some amount of scrutiny, both public and

12:59

non-public. Look, when we're talking

13:01

about who the targets of surveillance are,

13:04

we never reveal that publicly. We don't do that

13:06

in the case of criminal surveillance. Right? When you go

13:08

get a warrant, you go to a federal judge, you're

13:10

a U.S. attorney, and you go get a

13:12

warrant to surveil a criminal

13:15

defendant or a criminal suspect, right?

13:17

You don't tell that criminal defendant

13:19

or suspect that they're under surveillance. You don't

13:21

tell their attorney. It happens ex

13:23

parte and in camera behind closed doors,

13:26

just a lawyer for the government, just that federal judge

13:29

approving the warrant. Same thing is true

13:31

in the foreign intelligence surveillance context. Right? We

13:33

get orders for individuals behind

13:35

closed doors with a federal judge. That's

13:38

for individuals in the United States or

13:40

American citizens anywhere around the globe.

13:43

They get an individualized warrant process because they have

13:45

rights under the U.S. Constitution. When it comes

13:47

to foreigners located outside the United

13:49

States, that is to say non-Americans, they

13:52

don't have rights under our Constitution. But

13:54

nonetheless, we go through a process to get the surveillance

13:56

overall approved for an entire year, and

13:59

individual targets get an individualized determination

14:02

within the agencies. And so, you know,

14:04

it turns out it's great now that with Edward Snowden's

14:06

revelations, unlawful and illegal

14:09

though they may have been, that we are able to

14:11

talk about this a lot better. And so it's important to talk

14:13

about, we talk about this idea of mass surveillance,

14:15

it is actually just like a colander. And

14:18

so while it's true that all the water that

14:20

goes through the colander goes through the colander,

14:22

all that gets left behind and that's looked at are

14:24

is that pasta, right? That's in the top of the colander,

14:27

the salad or whatever it is. I knew I carried this

14:29

metaphor too far. I knew I went too far with it. But

14:32

here's the beauty of it, right, Joshua, which

14:34

is that, you know, to Cindy's point, we're

14:37

not actually looking at any

14:39

of that water that's going through, what's being looked at

14:41

are those targeted few hundred thousand

14:44

email addresses, phone numbers and the like that

14:46

are under collection. So while we

14:48

put the net in the stream, right, you

14:50

have to dip the net, whatever size the

14:52

net is, whether it's a fishing line or a net, you

14:55

got to put it in the ocean to catch the fish, right?

14:57

That's certainly true. And I guess in some sense,

15:00

you're filtering all the water in the global

15:02

ocean through that net, right? All you're pulling

15:04

out of the water that you actually put

15:06

in the boat, right, or look at are

15:08

those few hundred thousand selectors. So to

15:10

me, even under what Cindy

15:13

describes as the average American's understanding of filtering

15:15

or searching or the like, that

15:18

is targeted. We're not taking everything

15:20

in the ocean and reviewing it. We're taking the

15:23

information from a few hundred thousand selectors,

15:26

cell phone numbers, email addresses, that's what the government's

15:28

looking at, even in its broadest collection

15:30

program.

15:31

One of the problems of this is that

15:33

the consequences of mass surveillance don't

15:36

land evenly across our society.

15:39

It tends pretty much consistently

15:41

every time we've looked at it to disproportionately

15:43

affect marginalized people, people who already

15:46

have a hard time having their voices heard or

15:48

their needs considered or being

15:51

treated fairly by law enforcement. You

15:53

know, facial recognition has had several

15:55

incidents of mistaken identity.

15:58

All of those are

15:59

people of color.

16:01

So it's not just that it's bad,

16:03

it's that it's bad in a way that essentially

16:06

hypercharges the discrimination that we're

16:08

already trying to fight in our society.

16:10

Let me ask you, Cindy, a little bit further about, and

16:12

ask both of you actually about, some of the

16:15

principles underlying this, just to kind

16:17

of pull us away from specifically

16:20

US or Western law and

16:23

mores necessarily. I understand

16:26

that you know everyone's gonna view this slightly differently, different

16:28

countries have slightly different kinds of rules in terms

16:30

of free speech and privacy and the proper

16:32

role of government and so on.

16:34

I think that the core arguments

16:37

include that the

16:40

adversaries of [insert

16:42

government here] don't care

16:44

about our lives, let alone our liberties,

16:47

and that there is a reasonable

16:50

realm in which it can be acceptable

16:52

to subject

16:54

people, you know, even in more

16:57

covert ways, to more

16:59

surveillance as a way of

17:01

catching adversaries we're not prepared for, of

17:04

preventing things before they happen, that it would

17:06

be better to apologize to you later,

17:08

hey Josh, sorry we

17:10

went through your Facebook history through

17:12

all the sites that you're a member of, we

17:14

caught this person but this happened, sorry

17:17

it happened but we saved some lives, that that

17:20

would be preferable to, hey

17:22

Joshua I'm sorry we didn't go through

17:24

your Facebook history to catch this person

17:27

who ended up detonating a bomb

17:29

that killed your parents, you know what I'm saying, like

17:31

that is the real world rationale

17:34

behind some of the people who are within

17:36

these intelligence apparatuses and

17:39

it's persuasive, like if

17:41

you can save mom's life and dad's

17:43

life by kind of going through my

17:45

posts on Instagram, I can understand

17:48

why someone might say that's a reasonable sacrifice, I

17:50

know you don't but I want you to explain why

17:52

that's not a reasonable sacrifice.

17:54

I mean let's just start with we're not living in an

17:56

episode of 24 right, like

17:59

I think that the argument that mass surveillance

18:01

works in this way is a product

18:04

of wishful thinking and, frankly,

18:09

too much TV. It

18:12

doesn't work like that. And again, I think that

18:14

right now the people of Israel are discovering

18:16

that the mass surveillance that they embraced

18:19

and did didn't work

18:21

in the way that they had hoped it would work.

18:23

And that is a tragedy and a horror, but I think

18:25

it does require us to think seriously about

18:28

whether mass surveillance is actually working

18:30

in the real world the way that it works when

18:32

Kiefer Sutherland has a script that tells

18:34

him how this kind of thing works. So

18:37

I think there is a serious question about

18:40

whether it works in the way that you're talking about.

18:42

And the fact that you can dream up a scenario

18:44

in which your mother was saved is not the same

18:46

thing as real intelligence and real

18:48

thinking about what works and what doesn't work, especially

18:51

again at scale, right? We're

18:53

not talking about individual, we're

18:55

not talking about systems of individual

18:57

investigations here. We're talking about

19:00

the massive top of the funnel

19:02

to the bottom of the funnel kind of idea, as

19:04

opposed to I've got evidence that there

19:06

might be somebody in who you're

19:09

communicating with who is a spy and

19:12

I want an individual warrant to go after

19:14

that. We're not talking about that. We're talking about a different

19:16

kind of surveillance with a different kind

19:18

of proposition. So I think I just wanna

19:20

be really clear about that because often we get these

19:23

targeted stories, real targeted

19:25

stories as a justification for something that isn't

19:27

actually targeted at all.

19:30

And I think the second thing is that, so I don't

19:32

think it does work, but more importantly, we have to decide

19:34

what kind of society we live in, right?

19:36

If it makes you safer to have everybody

19:39

who you're afraid of in jail, we

19:41

still don't do that as a society, right?

19:43

We make the people who are trying to keep

19:45

us safe act consistently with our values,

19:48

and abandoning our values because the people

19:50

on the other side don't have values. I mean,

19:52

that's just a race to the bottom in terms

19:54

of people's rights and liberties

19:57

and privacy. And as I mentioned,

19:59

this is gonna. disproportionately

20:01

affect marginalized people, not

20:03

people who already have power in society.

20:05

That's just how this works. So I

20:08

don't think that this proposition, I think that this

20:10

proposition is largely, it's

20:12

informed by a lot of fantasy and

20:15

that we need to put the intelligence

20:17

community through its real paces. What we see

20:20

in the debate right now in the United States about renewing

20:22

Section 702 is cherry-picked

20:24

examples run past the

20:27

public without a broader view of what is

20:29

this costing us? How often are they wrong?

20:31

What is the cost of these kinds

20:34

of things? And you really shouldn't let the

20:36

people who want the power to continue to have

20:38

the power be the ones to tell you which

20:41

of their stories are successes and which of their stories

20:43

are not.

20:44

All right, now let's go to our global listener and

20:46

get another question about mass

20:48

surveillance. We are pleased to welcome

20:51

Isra Fazulay. She's a journalism student

20:53

at Northwestern University in Qatar. And

20:55

her studies have included corporate surveillance

20:58

and she has a question about that for you, Jamil

21:01

Jaffer. Isra, welcome. What's on your mind?

21:03

Thank you so much for having me. So

21:05

I wanted to ask you Jamil in your opinion,

21:08

are there specific legal frameworks or oversights

21:11

that can be put in place or that have

21:13

been put in place to prevent

21:15

potential abuse of mass surveillance programs

21:17

used by private companies or

21:19

entities such as social media platforms?

21:22

A very simple example is algorithms, ensuring

21:24

that citizens' privacy rights are safeguarded

21:27

while addressing security concerns effectively.

21:30

It's a great question and a really important one, Isra.

21:32

You know, the general framework, at least in the United

21:34

States, when it comes to the collection

21:37

of information by private parties is

21:39

individuals can either consent to or

21:42

not consent to the collection

21:44

of information. I think the challenge with social

21:46

media companies more often than not is that when

21:48

you and I sign up for Facebook

21:51

or you do a Google search or sign up

21:53

for Gmail or the like, we

21:55

consent to them reviewing our

21:57

data, right? Now they have privacy policies

22:00

and these long forms, and nobody reads them, right? We all just scroll

22:02

through, yes, yes, yes. We click go, right?

22:04

Because we want access to Gmail, we want access

22:06

to Facebook or Instagram or whatever, but

22:08

we are voluntarily giving our information

22:11

to these companies. You know, there's this old saw that

22:13

says, if you don't know what

22:15

the product is that somebody's selling you, it's because

22:17

you're the product. Right? And what these

22:19

companies do and the way they make money, right, is they collect

22:21

our information, they learn about us, and they

22:23

sell that information to other people or they send us advertising,

22:26

right? I mean, how do we know this, right? Oftentimes,

22:28

at least back in the day, you'd

22:30

read an email between you and somebody else,

22:32

an email that came on Gmail, between you and a friend,

22:35

and you'd start getting ads about that email. We're

22:37

talking about a vacation in Puerto Rico, and

22:39

you start getting ads about Puerto Rico. We

22:41

all knew that was happening. We didn't, some people didn't

22:43

like it, some people liked it, but we all kept using

22:45

Gmail, right? We could go to a lot

22:48

of other more privacy protective services

22:50

that don't do that. We're making a conscious choice, and

22:52

in America at least, the rule is,

22:55

if you consent, that's your choice,

22:57

right? There are frameworks in Europe,

23:00

you have the General Data Protection

23:02

Regulation, GDPR, that has

23:04

provided a layer of what

23:06

the Europeans call privacy rights on top of

23:08

it. I'd ask you to really dig into that and

23:11

see whether there are real privacy rights actually being effectuated.

23:13

I don't think there are to be candid with you, and frankly,

23:15

the way the Europeans have used it, it's been a cudgel

23:18

against American companies. It hasn't really

23:20

protected the privacy of Europeans. All of us have

23:22

to click through all sorts of banner ads on our webpages,

23:25

but we all say yes anyway, we all consent

23:27

nine times out of 10. I happen to reject a lot of them, but

23:30

I still end up going to that webpage, they still collect

23:32

some of my information because it's necessary, right?

23:34

So I think GDPR in large part has

23:36

been a total nothing burger, and the only thing it's

23:38

really done is given Europe a tool to hit American

23:41

companies over the head and not really successfully

23:43

protect anybody's privacy, but there

23:45

are these regimes that have cropped up, but

23:48

by and large in the US, our view is if you consent,

23:50

it's

23:51

your choice. Isra, let me come back to

23:53

you and just get your reaction to that, and

23:55

then Cindy, I'll bring you back in. Isra?

23:56

Yeah, that was very insightful, Jamil. I

23:59

am studying journalism. I have a

24:01

minor in strategic communications, where

24:03

we learn a lot about the PR and marketing

24:05

agencies, and something that

24:07

those agencies use in their favor is a third

24:10

party that is part of, you know,

24:12

kind of getting this information or research from

24:14

social media, or let's say the

24:16

hearing mechanisms of what people,

24:19

you know, are looking for. And

24:21

in that sense, are people consenting

24:23

to that? Do you think that people know about

24:26

this, really, when they are consenting to the

24:28

social media uses?

24:29

Yeah, it's a great question. It's hard to

24:31

know whether people actually know what they're consenting to, and

24:33

that may be part of the problem: people are

24:36

saying yes, but they don't know what they're saying yes to. Now

24:38

it turns out that in a lot of these circumstances, you are

24:41

consenting to the people that collect it in the

24:43

first instance, the Facebooks, the Googles, the Instagrams,

24:45

whatever. With your consent, they're then selling it

24:47

to a third party that then aggregates it

24:49

from multiple different sources and essentially

24:52

can create a dossier on you, right? You know,

24:54

you look at services like Lexis and

24:56

ChoicePoint and the like, they have

24:58

information from a variety of sources, all collected,

25:01

usually as far as I know, with consent, and

25:03

they're combining the data and then providing it to other

25:05

people. Now, you'll hear people say, well, you

25:08

know, that's not fair, that's not appropriate, and you could put

25:10

constraints on those companies if you wanted to, right? But

25:12

recognize that changes a business model, and

25:14

recognize that most people are saying yes to those things

25:16

now. If we want to inform

25:19

people better, give them more information about what's

25:21

being collected and why it's being collected, right,

25:23

the Europeans have tried to do that with GDPR. I

25:25

do worry that we're creating a lot of roadblocks

25:28

and not really creating effective solutions that

25:30

actually help people protect their own privacy,

25:33

not informing them effectively. They're feel-good

25:35

measures, and then they don't really do anything

25:37

effective on the back end.

25:38

Cindy. How do you see it?

25:40

I don't know that we disagree very much.

25:42

Well, I think I do about consent. I

25:44

think it does violence to the definition of consent

25:46

to say that people consent, given how little people understand

25:49

and how badly written those terms

25:51

of service are. I mean, when I went to law

25:53

school, consent meant a meeting of the minds, where you

25:55

sit down and you both know what you're agreeing to, and

25:57

you sign on. These are one-sided

26:00

contracts, not written in your favor,

26:03

very unclear about what's happening. And I think

26:05

you're right to talk about the data brokers, right,

26:08

not the people who collect it in the first place, but the people

26:10

behind it who are shadowy and harder

26:12

to spot, who are doing a lot. Now

26:14

we just passed a data broker bill in California.

26:17

We also have a comprehensive

26:19

privacy law. We'll see. EFF

26:22

has supported a much stronger kind

26:24

of privacy law than the one in California,

26:27

the one in the EU and even one nationally,

26:29

kind of working in a slightly different way than

26:31

just giving people a whole lot more

26:34

click-throughs, but more actually limiting

26:36

some of the pieces of the surveillance

26:39

business model. You know, the

26:41

kinds of algorithmic decision-making

26:44

that is based upon tracking every single thing you

26:46

do and then trying to predict what you're going to do

26:48

next requires a lot more data than

26:51

just, you know, I'm searching for

26:53

shoes on my favorite search engine and an ad

26:55

for shoes comes up. So the difference between

26:57

contextual advertising, which is the

27:00

shoe example I gave you, and the kind of predictive

27:02

advertising is a difference

27:05

that we could decide that we don't like that

27:07

business model, that it causes more harm

27:09

than is good. It doesn't kill, it's not

27:12

gonna, I don't think it's gonna kill the internet.

27:14

The marginal difference in terms of ad

27:16

revenue between these two kinds of advertising

27:19

has been demonstrated by actually some folks at George

27:21

Mason and elsewhere to be pretty small,

27:24

but we would gather a lot more of our privacy

27:26

back, a lot more of the skeevy, nasty

27:28

things that are happening in the data broker world would

27:30

go away if we just drew that line. And I think

27:33

we can draw that line. So that's one of the things that

27:35

EFF has advocated for. There

27:37

are a bunch of proposals for how to regain our

27:39

privacy, but I think that we've

27:41

let this fiction stand that people are consenting

27:44

to things that are really not what

27:46

people would do if they had real choices. And

27:49

we've been interpreting the fact that people feel

27:51

like they don't have very many and very good choices

27:53

as consent to the world as it is for far

27:56

too long. And it's time to actually

27:58

change course. And the good news is, we have

28:00

lots of ideas about how to do that.

28:02

Let's wrap up today's discussion with a few more areas

28:04

where our two panelists hear

28:07

some agreement. And I've already heard a few points

28:09

of agreement between the two of you. It seems like

28:11

a lot of, at least the bulk of what you

28:13

disagree on is about kind of

28:15

the way that these surveillance

28:19

or the oversight or

28:21

law enforcement tools are implemented,

28:24

but there are a lot of overlaps, at least in terms

28:26

of some of what I've heard you both say. Cindy,

28:28

let me stick with you as we wind down. Where

28:31

do you hear some areas of agreement

28:34

on this issue?

28:34

Well, I mean, I think that we agree that American

28:37

values, actually international human rights values

28:39

are important and they can't be tossed aside

28:41

because of the things of today and

28:43

I really appreciate that. That is something that

28:46

I think people in government and people like

28:49

me who are often trying to hold them to account

28:51

always can agree on, that rule

28:53

of law is important, rules are important,

28:56

and that the fact that

28:57

the people who we might be trying to fight

29:00

don't have values is no reason for us

29:02

to put ours on the shelf. So I think that's an

29:05

area where there's broad agreement. And then

29:07

once we have that, then we can begin

29:09

to talk about what

29:10

the rules ought to be. And Jamil

29:12

Jaffer, before we go, areas where we

29:14

have some agreement or overlap.

29:16

Yeah, look, I think Cindy and I agree

29:19

that you have to look at both the top of the funnel

29:21

and the bottom of the funnel to understand what

29:23

it is that funnel is doing, right? I think we

29:25

just disagree about what the top of it looks like. Is it

29:27

the whole ocean that you're dipping the net in or

29:29

is it just the fish that are going through the net and that

29:31

are getting caught in the net that we're talking about? I think

29:34

it's that simple a distinction, but I think we agree

29:36

that you've got to understand both of those things

29:39

and share a commonality of viewpoints on

29:41

that. And I think at the end of the day, what really

29:43

matters here, right? Is that the

29:45

rule of law matters and operating

29:48

under the rule of law. Whether you think the rule of law

29:50

is perfect or not, that's

29:52

a debate that we can have, but we can only have those

29:55

in societies under constructs

29:57

where people believe in the rule of law and believe in

30:00

separation of powers, and those things are actually implemented

30:02

in a serious and effective way and I think we both agree on

30:05

that as well.

30:06

Jamil Jaffer and Cindy Cohn, I

30:08

appreciate you both being here for

30:10

a terrific debate. We have worked our way through funnels

30:12

and colanders and nets and pipes.

30:14

We've gone through the whole hardware store today but

30:17

I think that there have been some really worthwhile

30:20

points in terms of the values

30:22

that we hold or we say we hold and

30:24

how we enact those values within the

30:26

sense of a rule of law and of

30:29

human rights and I think you've helped us really

30:31

dig into some of those fine points very

30:33

very clearly. Thank you both for a great debate

30:36

and to our global listener, Isra Fazulay, best

30:38

of luck to you at Northwestern and thank you also

30:40

for making time for us. Doha Debates

30:42

is a production of Qatar Foundation. Our podcast

30:45

is produced by FP Studios and Doha

30:48

Debates. Our producers include Ashley

30:50

Westerman, TJ Raphael, Claudia

30:52

Tatey, Katrine Dermody, and James

30:55

Gwale. FP Studios managing director

30:57

is Rob Sachs. Our executive producers

31:00

are Jay Fit Weeks, Amjad Atala,

31:02

and Jigar Mehta. You can explore our

31:05

other podcasts, short films, upcoming

31:07

events, and more online at

31:09

Doha Debates. That's dohadebates.com.

31:15

If you like this program, please follow the podcast

31:18

and write us a review to help other people find

31:20

the show and be sure to check out my

31:22

podcast The Night Light with Joshua

31:24

Johnson, a program about democracy,

31:26

culture, and solving the problems we

31:29

share. So until we meet again,

31:31

I'm Joshua Johnson. Thanks for listening.
