Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
First Contact with Laurie Segall is a production
0:03
of Dot Dot Dot Media and iHeartRadio.
0:10
There's a great quote on the ACLU website.
0:13
The fact that technology now allows an individual
0:15
to carry such information in his hand does not make
0:17
the information any less worthy of the protection
0:19
for which the founders fought. Exactly.
0:22
I like to talk about, you know, one of the whole points
0:24
of the Constitution adding the Fourth Amendment,
0:26
which is the protection of privacy, is
0:28
they wanted to protect what was in Benjamin Franklin's
0:30
desk. Nobody should know if he was
0:32
writing some things that were anti government, and
0:35
we now have that on our cell phone, so of course,
0:37
but that's where I think that a lot of the protection
0:40
of civil liberties is applying our
0:42
fundamental principles in different circumstances.
0:56
We are in a moment of reckoning as
0:58
we enter an age of ubiquitous surveillance,
1:01
questionable data collection practices,
1:04
even algorithms that discriminate.
1:07
It's minorities, especially black
1:09
and brown communities, that are disproportionately
1:11
affected. Over the last months, as
1:13
the nation has grappled with a conversation around
1:16
police brutality, we've seen predator
1:18
drones used for aerial surveillance at protests
1:21
facial recognition technology that wrongfully
1:24
accused a black man of a crime he
1:26
didn't commit, and it wasn't a coincidence.
1:29
Reports say the tech is a hundred times
1:31
more likely to misidentify African
1:33
American and Asian people, and
1:36
as COVID nineteen continues to spread,
1:39
there are serious questions being raised
1:41
about contact tracing apps and how
1:43
that data collected could be misused.
1:46
These issues raise ethical questions about
1:48
technology and its impact on our civil
1:51
liberties, equality, and the future
1:53
of our country. For Susan
1:55
Herman, it is an extraordinary
1:57
time to be sitting in her seat as
1:59
president of the ACLU. Over
2:02
the years, the American Civil Liberties
2:04
Union has filed lawsuits fighting for free
2:06
speech, reproductive rights, and privacy.
2:09
But as technology continues to muddy
2:11
the waters, the trade offs become more
2:13
complicated. Where do we draw the
2:15
line between security and privacy and
2:18
how do we prevent technological innovation
2:20
from outpacing the law. I'm
2:22
Laurie Segall, and this is First Contact.
2:28
Susan. Thank you for being virtually
2:30
with me today. Well, thank
2:32
you for inviting me, Laurie. Yeah,
2:35
you know, I I always start out these interviews
2:37
with our first contact. I talked
2:39
to guests about how we met, and
2:42
we don't really have a first contact. We've never met
2:44
in person, but we met on an email
2:46
chain because we were going to do an interview together
2:48
for something else, and it fell through.
2:50
So I said, you've got to come on the podcast because
2:52
you are just sitting in such
2:54
an extraordinary seat at
2:57
such an extraordinary moment in
2:59
time. So that's our first contact.
3:01
Well, thanks. It just seems to me like our first contact
3:04
is total serendipity. Yeah, exactly,
3:07
so you know, to get started. You've been the
3:09
president of the ACLU since two thousand
3:11
and eight, and I said this before, but
3:14
you know, what an extraordinary time to
3:16
be sitting in your seat. You know, how are you
3:18
feeling? Oh my, it's just sort
3:20
of overwhelming. You know, as president, I'm essentially
3:22
chair of the board, so you know, I'm not the
3:24
one doing the day to day work as all
3:27
of the members of our staff are. But to
3:29
be a member of the ACLU staff right now
3:31
is just it's mind boggling because
3:33
we had, you know, a lot of work that we
3:35
were already doing before two thousand and sixteen,
3:38
with all of the states making worse and worse
3:40
laws about reproductive freedom and voting rights
3:42
and immigrants rights, and you know, all sorts of other
3:45
things. Then came the election, and
3:47
since then we have brought a hundred
3:49
and seventy-three legal actions against the Trump
3:51
administration for things like
3:53
family separations and the travel ban and
3:56
prohibiting trans people from serving in the military. Then
3:59
in March, COVID hit and
4:01
at that point, you know, since then, we've also brought
4:03
over a hundred lawsuits, including
4:06
lawsuits just about people who are
4:08
incarcerated in jails and prisons and ICE
4:10
detention and who are just in
4:12
a hot spot. You know, they have no control over
4:14
whether they can social distance, and
4:17
so we've been working very hard to get vulnerable
4:19
people out of you know, those terrible situations
4:23
basically death traps. Plus the COVID
4:25
also led to a number of states opportunistically
4:28
restricting things like freedom of abortion,
4:31
declaring abortion to be a non
4:33
essential procedure so people could just wait
4:35
until the pandemic is over to get an abortion
4:38
right. And voting rights has also
4:40
just been a really fraught area
4:42
right now because all the restrictions
4:44
on voting and the ways in which the vote was becoming
4:47
distorted have just
4:49
been magnified
4:51
by all the difficulties, and so
4:54
there's a lot to talk about. So I was about to say, what
4:56
I'm hearing from this is you're sleeping really well at night.
4:58
You know, there's no work
5:00
to do, almost nothing to do. The
5:02
staff, they're just sitting around polishing their nails.
5:05
Yeah, So I mean, like, take me to March, like
5:08
coronavirus hits. You have been involved
5:10
in some of these monumental cases
5:12
that have just shaped society and our civil
5:14
liberties, like coronavirus
5:17
hits, And now you know, we have a
5:19
little bit I don't even think we have the luxury of
5:21
perspective at this point, but we have a little bit
5:23
more perspective. But like, take me to March,
5:25
Like in your role at
5:27
this extraordinary moment, Like what
5:30
was going through your head? What were you concerned
5:32
about at the time? Well,
5:34
you know, one of the first concerns is just you have
5:37
to close the office. So the first concern is
5:39
how can people do all this remotely?
5:41
It increases the work and makes it more
5:43
difficult to do the work. So we
5:45
just had to really make sure that our technology
5:48
was up to doing things. So one thing that the
5:50
ACLU did was to buy new laptops for
5:52
some staff people who were going to be working, and
5:54
you have to worry about, you know, how the technology
5:57
is working. Um, which has
5:59
been a question for us every time there's
6:01
something really big hits. When the travel
6:04
ban hit, there were so many people wanting
6:06
to donate to the ACLU, our website
6:08
crashed. So even things
6:10
like that, you know, that's like number one of how
6:12
do you handle this? We have been fortunate
6:14
so far that the ACLU is so well
6:17
managed and we had not spent every
6:19
penny that all of our donors had given
6:21
us up until that point, so we have not had to lay
6:23
people off, which is very fortunate
6:25
because, as you're saying, there's more than enough work
6:27
to do. But yeah, that's the first concern of just
6:29
you know, how do you keep the organization up to speed
6:32
and ready to do, you know, what staff
6:34
members now need to be doing an incredible amount
6:36
more work. But for some of them
6:39
it's well, they're juggling a toddler
6:41
and a dog. Yeah, can
6:43
you give me a run through of some of the cases
6:46
that you've been involved in. And correct me if I'm
6:48
wrong. You started out as an intern, right
6:51
and really just worked your way up.
6:53
I mean, I can imagine you've been involved,
6:55
and I know you've been involved in some pretty extraordinary
6:58
cases. To give listeners some context,
7:00
can you explain some of the cases that kind of stick
7:02
out to you? Well, I was an
7:04
intern for the ACLU back, you know,
7:06
in the nineteen seventies, you know, around the
7:08
time when I was in law school. And just to
7:10
make sure that everybody understands, I don't
7:13
actually work at the ACLU. My day
7:15
job is I'm a law professor, and I
7:17
don't generally work on the cases. What I'm
7:19
generally doing is we run
7:21
the organization. But I'll tell you
7:23
I think, you know, an interesting start:
7:26
the first ACLU case that I actually
7:28
did work on, which was while I was a law student, and
7:31
this was the case. One of my connections
7:33
with the ACLU originally was that one of my law
7:35
professors in the first year was connected
7:37
with the New York Civil Liberties Union and
7:40
he had some clients who came to him who
7:42
were graduate students at Stony Brook
7:44
on Long Island, and they had just discovered
7:46
they were not allowed to live together. They had rented
7:48
a house together. There were six of them, and they
7:50
had just discovered they weren't allowed to live together
7:53
because there was an ordinance in their village, a village
7:55
called Belle Terre, that prohibited
7:57
more than two persons unrelated by blood,
8:00
marriage or adoption from living together.
8:02
So, you know, they were pretty shocked. And it turned out
8:05
that under the law
8:07
as it was at the time, by the time they were talking
8:09
about this, they were liable
8:11
for all sorts of criminal fines and punishment.
8:14
It was really very heavy stuff. So
8:16
I started working on that case with my law professor
8:19
and we went to a
8:21
federal judge to ask
8:24
for a temporary restraining order, which
8:26
means that, just until we had litigated
8:28
whether or not that was a constitutional thing to
8:30
do, to tell people who they could and couldn't live
8:32
with, the village should not be allowed
8:35
to either kick them out of their house
8:37
or to, you know, start locking them up because
8:40
you know, they owed so many fines for having
8:42
been illegal residents. So
8:45
the judge ended up signing the order.
8:47
And then, one
8:50
of the ways, which was actually the original
8:52
way in which our clients had discovered that they
8:54
were illegal residents, was that they had applied
8:56
for a residents-only beach permit, and
8:58
they were told they couldn't have one because they were illegal
9:00
residents. So the judge who we had,
9:02
the district judge, who was a very nice man, looked
9:05
at the order we had written out and he said, well, you know,
9:07
it's the summer. Don't your clients want to go to the beach
9:09
while the litigation is pending? Do
9:11
you mind if I write that in that they have to
9:13
be allowed to park in the parking lot of the beach.
9:16
So we said, sure, that's very nice,
9:18
so he wrote that in. Then, as the junior
9:20
member of the team, I was sent out to
9:22
explain to our clients, to show them the order and
9:25
explain to them what was going on. And they
9:27
gave me a tour of what the village
9:29
looked like and the residents-only beach, and
9:31
the minute the wheels of their car hit the parking
9:34
lot, this very large, fierce-looking man
9:36
comes striding across and says, what are you doing
9:38
here? You're not allowed to be in this parking lot, and
9:41
they all look at me, and I'm thinking,
9:43
what am I? I'm like, you know, twenty-something, I'm not very
9:45
tall, and what am I supposed to do with this large
9:47
man who doesn't want us there in his parking lot?
9:50
And then I remembered that I had a federal court order
9:52
right on my person, so I kind of drew
9:55
myself up, and I showed him my federal court order
9:57
and I said, well, I'm with the New York
9:59
Civil Liberties Union, and I
10:01
have a federal court order saying that these people
10:03
are allowed to be in this parking lot and go
10:05
to the beach. And he melted.
10:09
That was, I think, one of the points at which I thought,
10:11
wow, you know, this is really powerful
10:13
stuff. Yeah, you saw that
10:16
from your first day. Yeah, exactly,
10:18
that's great. And I read that some of
10:21
your earliest memories of speaking
10:23
up to authority involved I think a dispute
10:26
over a book at your school library.
10:28
Yeah, that's right. Even before the Belle Terre case.
10:31
My first civil liberties hero was my mother.
10:33
So when I was in third grade, we were doing
10:36
a school play about a story called Johnny
10:38
Tremain, about a boy in the American Revolution,
10:41
and I thought the play was interesting. Plays
10:43
don't have that many words. And we were told
10:46
that this was based on the book. So I went
10:48
to my school library, my public school library,
10:50
and I asked to take out the book, and the
10:52
librarian said, oh, you can't take out that book,
10:54
dear, that's in the boys' section. And
10:57
I was I was surprised to find this out.
11:00
I'd been reading books in the girls' fiction section, which were
11:02
all collections of fairy tales and
11:04
biographies of presidents' wives, but
11:06
it had never occurred to me that I wasn't allowed to take
11:08
out a book from the boys' section. So
11:10
I went home and I told my mother about this,
11:13
just thinking, you know, that's the way things are, and she just
11:15
exploded and she called
11:17
the librarian the next day and said, how dare you
11:19
tell my daughter, you know, what she's not allowed to
11:21
be. So the librarian told
11:23
me that from then on, I could take out any
11:25
book I wanted, and you know, not long
11:28
after that, they changed the policy for everyone.
11:30
So, you know, there was another example of how you
11:32
know, you can kind of speak up to authority when
11:34
they kind of tell you who to be and prevent
11:36
you from making your own choices. Were
11:39
you always like that? Well, you know, that's
11:41
third grade, and I feel like yes, I
11:43
think for most of us, our values form when
11:46
we're pretty young. Yeah, so you know,
11:48
seeing my mother do that, I'm sure that would
11:50
have had an impact on me. Yeah,
11:53
that's such a good story. And did you I mean, did you
11:55
always know you wanted to go into law? No,
11:58
I actually really didn't because having grown
12:00
up as a woman during that era, my father was
12:02
a lawyer and he always used to talk about
12:04
the fact that law was really not a good profession
12:06
for women. Why would you want to do that if you could be an
12:08
English teacher and have the summer off to
12:11
take care of your children. So it took me a
12:13
while. I graduated from college and then spent a
12:15
few years doing other things and then decided
12:17
to go to law school. Well,
12:19
I mean, it's so interesting, and
12:21
now kind of seeing where you're at
12:24
and seeing this moment, it
12:26
does feel like a moment. And I was looking at something
12:28
you said about you know, this feels like a moment. We
12:31
can be optimistic because so
12:33
many Americans are beginning to really understand
12:36
the scope and the depth of structural racism.
12:38
It certainly feels, you know, I'm based
12:40
in New York City. You can just feel it right
12:42
on the streets with the protests, and
12:45
you hear the sirens and the helicopters,
12:47
you know, as we sit here. And
12:50
given, you know, your rich history
12:52
in covering and caring about these issues,
12:54
what is the challenge for you guys ahead?
12:59
Well, you know, the challenge on that particular subject
13:01
is that this is work that we had already
13:03
been doing. One of our top priorities
13:06
for the past several years has been trying
13:08
to break our addiction to mass incarceration,
13:12
which, as everybody is now really coming to terms
13:14
with, has really been a system
13:16
that has disproportionately affected people on
13:18
the basis of race and income
13:21
and disability. A quarter of the people
13:23
who are arrested are people who are mentally ill,
13:26
and our feeling is that the system has
13:28
been fundamentally broken and misguided
13:30
for a long time. So part of
13:32
what we're trying to do with this moment is to capitalize
13:34
on the fact that people want to look at what the police
13:36
do. We're trying to encourage people
13:38
to look beyond the police. It's not just,
13:41
who are the police arresting, and how are they treating
13:43
the people they arrest. I think
13:45
behind that is the question of what do we
13:47
really want to treat as a crime. So
13:50
when you treat all sorts of very minor
13:52
misconduct as a crime, you're
13:55
really setting up a situation where there are going to
13:57
be more contacts and therefore potentially
13:59
more arbitrary and discriminatory contacts.
14:02
So, if you think about it, Eric Garner ended
14:05
up dying because he was selling single
14:07
cigarettes on which the tax had not been
14:09
paid. George Floyd.
14:12
The basis for that encounter was that they thought
14:14
he might be passing a counterfeit twenty-dollar bill.
14:17
So I think that if you look at why
14:20
are we criminalizing some of the things we criminalize,
14:22
especially if you're talking about people who are mentally
14:25
ill and are having problems.
14:27
Do we really want the police to be the people
14:29
who are the first responders to people
14:31
who are having a mental health crisis or
14:34
is there some more effective way to deal
14:36
with that that would avoid putting
14:38
those people into the criminal justice system,
14:40
which isn't really good for anyone, And
14:42
to maybe recommit, reallocate
14:45
some of the resources we're using on arresting
14:48
people and locking them up to actually
14:50
dealing with the mental health crises, to
14:52
have mental health treatment. So instead
14:54
of viewing everything, all
14:56
dysfunction as a matter of policing, why
14:59
don't we spend more on reinvesting
15:01
to try to prevent more dysfunction. It's
15:03
sort of like the old thing. You know, if you're a hammer,
15:05
everything looks like a nail. Well, you know, not
15:07
every problem in our society is a problem
15:10
for the criminal justice system, and an occasion
15:12
to arrest people and lock them up. A
15:14
lot of them really should be an occasion for
15:16
thinking about public health treatments, and
15:18
thinking about how we want to approach homelessness,
15:21
and having a lot of much deeper thoughts
15:23
about how you prevent dysfunction. Rather than
15:25
answering everything with, we're going to send in the
15:27
police. It certainly seems
15:29
also like this moment, even coming out of the pandemic,
15:32
I can only imagine the mental health crisis is
15:34
going to be even worse. Yeah, that could
15:36
well be. And I think
15:39
the pandemic is also showing us. Somebody
15:41
asked me the other day whether the protests
15:44
over policing and police brutality
15:46
are related to the pandemic. And
15:48
I was in a webinar and one of the smart people
15:50
in the room said, oh, no, no, they're two entirely different
15:53
things. And I said, what do you mean? The
15:55
same people who are being disproportionately
15:57
affected by policing and police brutality
16:00
are the people who are being disproportionately
16:02
affected by COVID. The statistics
16:04
show that people of color are much more likely to die,
16:07
and there are a lot of reasons for that, having to do
16:09
with underlying health and having
16:11
to do with the fact that minorities
16:14
and people who are not affluent don't
16:16
get to work from home, they don't get to
16:18
work through Zoom. They are the people who are out there
16:20
on the streets, being the first responders,
16:23
being the people who are picking up the garbage, being
16:25
the people who are stocking the supermarket shelves.
16:28
And I feel like the virus
16:30
is really amplifying so many of the inequities
16:33
we've had in our society. And I think
16:35
especially you know, I don't know what it's like for everyone else,
16:37
but I live in Brooklyn and in New
16:39
York City. It really felt like a lot of the people who
16:42
were out on the street. They were out on the street
16:44
because they were upset about George Floyd. But
16:46
I think it was more that they recognized
16:48
that George Floyd was the tip of the iceberg and
16:51
that there was just a lot going on that they
16:54
really could not tolerate any longer.
17:00
More from Susan after the break, and make
17:02
sure to subscribe to First Contact on Apple
17:05
Podcasts or wherever you listen so you don't
17:07
miss an episode
17:24
Putting on the tech hat. You know, I think
17:26
most people probably don't think of tech when they think of
17:28
the ACLU, but there's quite
17:30
a bit of litigation in regards
17:32
to security and privacy issues around
17:35
contact tracing, surveillance, algorithmic
17:37
bias, and obviously the ACLU has
17:40
a hand in checks and balances and a lot of the issues
17:42
that are emerging from the pandemic. You
17:44
know, what are some of the tech
17:46
developments that you guys are most
17:49
concerned about. Well,
17:51
since you were mentioning the COVID
17:53
and the contact tracking and tracing, I'll
17:55
start with that. So the upshot is that
17:58
we are neither for nor against contact
18:00
tracing. If contact tracing is something
18:02
that really will contribute to public health.
18:04
Our concern is not to say no, you can't do
18:07
it, or yes, go right ahead and do whatever you want.
18:09
What we're concerned about is to minimize
18:12
the damage to privacy,
18:14
the damage to equity. Again,
18:17
there are a lot of concerns that we have. The
18:19
other thing that we're concerned about is discrimination
18:21
again, because there
18:23
are ways in which the technology
18:26
could also increase pre
18:28
existing social inequities. We
18:30
think that people should not be coerced into participating
18:33
in testing. We think it should be voluntary,
18:35
and we also think that it should be nonpunitive,
18:39
because if you start having the criminal justice
18:41
system enforcing whether or not people
18:43
are willing to use their phone
18:45
to take a test or whatever it is,
18:48
you're just creating more opportunities
18:50
for police interactions
18:53
that will at some point be arbitrary
18:55
or discriminatory. So we don't
18:57
want to see rules and regulations
19:00
that are good public health rules. Even if they really
19:02
are good public health rules, we don't
19:04
want to see those become occasions for
19:07
filling up the jails with the people
19:09
who aren't complying, because
19:11
we've already seen there were some statistics
19:13
in New York that when you asked
19:15
the police to start enforcing who's wearing
19:17
a mask and who's not wearing a mask, that
19:20
right away, it was racially
19:22
disproportionate in terms of who they were questioning
19:24
and who they weren't questioning. So I think
19:27
there's just a lot of issues there, which are very
19:29
much up your alley, because they're very much ethical
19:31
issues. Yeah, you know,
19:33
one of the cases that I'm
19:35
fascinated by. And,
19:37
you know, honestly, I felt like it was only
19:40
a matter of time until we saw this headline.
19:42
And then we saw the headline, you know, a man
19:44
was arrested after an algorithm wrongfully
19:47
identified him. You know, I've
19:49
been covering this for so many years: AI is biased.
19:51
AI is trained on you know,
19:54
on data online, which can be
19:56
very racist, you know. And I think for so many years
19:58
we've been having this conversation. But the question
20:00
of okay, well, what happens when it gets
20:03
into the hands of the police, what
20:05
happens you know, if if it
20:07
could go for policing, And so I think it's such
20:09
a fascinating case. And and
20:11
you guys, the ACLU, filed an administrative
20:14
complaint with Detroit's police
20:16
department over what you guys are calling
20:18
the country's first known wrongful arrest
20:21
involving facial recognition technology.
20:23
I mean, for context,
20:26
a man was arrested because he was
20:28
wrongfully identified by an algorithm.
20:30
The police department thought he had robbed
20:33
I believe, like stolen watches, and
20:35
he was arrested. I mean,
20:37
can you talk to me about the significance
20:40
of this case. I can't help but put
20:42
on my tech hat and scream, you guys, this
20:44
is a really big deal. Yeah,
20:46
it is a really big deal. And as you're saying, Laurie,
20:49
we were aware of this problem for a long
20:51
time and we've been complaining. So going
20:53
back for a minute before getting to the case you're
20:55
talking about, Robert Williams:
20:58
the National Institute of Standards and Technology
21:01
says that African American and Asian people
21:03
are up to a hundred times as likely to
21:05
be misidentified by facial recognition.
21:08
So that's the background problem. And so we
21:10
knew that, right, you know, we knew that before
21:12
the case came up in Michigan.
21:15
and it's not the algorithm's fault. Obviously,
21:17
there's something that's being put into the algorithm
21:19
that, you know, has a bias.
21:22
And I think people tend to think that algorithms
21:24
are you know, are so neutral and that we can rely on
21:26
algorithms. That's what I was saying about the contact
21:29
tracking and tracing, that you
21:31
start relying on algorithms or apps
21:33
that you think are neutral, and you really have to be very
21:35
wary of that. So again,
21:38
before getting to the Robert Williams case,
21:40
an ACLU staffer at the
21:42
ACLU of Northern California had the really
21:44
interesting idea of trying out Amazon's
21:47
facial recognition program, Rekognition
21:49
with a K, because, yeah,
21:51
they were just offering this to the police or whatever.
21:53
This is great, it will help you identify and
21:55
see if you have somebody who matches a mug shot.
21:58
Well, what they tried to do, which I thought was very clever,
22:01
was they tried to match mug shots
22:03
against the members of Congress. They
22:05
got, you know, the official pictures of all the members
22:08
of Congress. This was in July, and
22:12
there were twenty eight members of Congress who
22:14
were misidentified as matching the mug shots.
22:17
There were twenty-eight mistakes out of that, and
22:20
not only that, but that the false matches were
22:22
disproportionately people of color. And
22:24
one of the people who was identified as
22:26
matching a mug shot, and therefore, you know, probably
22:29
you know, this criminal, was civil rights
22:31
legend John Lewis, the guy who
22:33
was beaten up on the bridge in Selma, you know, to
22:35
get us all voting rights. So
22:38
yeah, we know that almost forty percent of
22:41
the false matches
22:43
were of people of color, even though people
22:45
of color made up only twenty percent of the
22:47
members of Congress. So in
22:49
some ways, you know, the Robert Williams case is completely
22:51
predictable. We knew that
22:54
we allowed for that to happen. It might have
22:56
already happened elsewhere, but, you know, subterraneanly,
22:58
in a way that we didn't see the case.
23:01
But what's amazing about the Robert Williams cases
23:03
that it happened right there, you know, visible to everybody
23:06
where. You can just see it. So what happened
23:08
was that they told him that he was being arrested
23:11
because they believed that
23:13
the algorithm had said that
23:16
he was a match for this mug shot,
23:18
and they showed him the mug shot, and he said
23:20
to them, do you guys think all black people
23:22
look alike? That looks nothing like me. So
23:25
you know, it was pretty clear that if you used your eyes
23:28
and looked at the picture yourself, if
23:30
you didn't trust the algorithm, and if you looked
23:32
at the picture and this man's face, they didn't
23:34
look alike. But nevertheless, he spent
23:37
thirty hours in jail under some pretty
23:39
miserable conditions because the algorithm
23:41
said it was a match. So I think
23:43
that's really important. In some ways, the fact
23:45
that you know a problem
23:47
exists is not as inspiring
23:50
to make people want to do something about it
23:52
as when you see it. So that's
23:54
what happened with all the protests about George Floyd.
23:57
People could watch that horrible video. They
23:59
could see it. It was recorded on the
24:01
video. And here we have an actual
24:03
person, not just what hypothetical statistics
24:06
are showing, but an actual person who
24:08
did get arrested and did have a miserable
24:10
time. He was arrested in front of his family. It
24:13
was really traumatizing, and based
24:15
on, again, the officers
24:18
involved were trusting the science
24:20
more than they were trusting their own eyes,
24:22
when anybody could see he didn't
24:25
look like the picture, right? And
24:27
you know, he wrote an op-ed in the Washington Post,
24:29
and he asked the question. He said, why is law enforcement
24:31
even allowed to use this technology when it obviously
24:34
doesn't work? So I guess I'm asking
24:37
a legal scholar the question.
24:39
You know, police departments all around the country
24:42
are using different variations of facial
24:44
recognition software. So you
24:46
know, what regulations
24:48
should we see as we enter this era
24:50
of algorithmic discrimination. Yeah,
24:53
that's a great question. And again we've been urging, you
24:55
know, long before Robert Williams turned up, we've
24:57
been urging police departments not to rely on
25:00
the facial recognition technology, that it
25:02
was just not reliable enough to
25:05
hold people's fates in the hands
25:07
of the algorithms. Algorithms don't have hands,
25:09
but for people's fates to be dependent
25:11
on the facial recognition technology
25:13
which was being touted. And again, it's great
25:15
if a company is doing something to make money,
25:18
but if wanting to make money is your only consideration,
25:21
and if you're not considering whether you
25:23
are unleashing something that is really going
25:25
to be disruptive of people's lives unfairly,
25:29
either because it's just going to be wrong, or because
25:31
it's going to be wrong in a racially skewed
25:33
way. I think that's just really a problem.
25:36
So we've been urging
25:38
police departments not to buy and use
25:40
the technology, and I'm sure you know Amazon
25:42
has withdrawn the facial recognition technology
25:45
temporarily and they're not sure whether or
25:47
not they'll bring it back. So
25:49
the probability of wrongful arrest
25:51
is one thing, but when you draw the
25:53
camera back and look at all the technology
25:56
in the bigger picture. In addition to
25:58
facial recognition, one thing that police
26:00
departments have been doing with facial recognition
26:03
and different law enforcement agencies is
26:05
to try to see who attends the demonstration
26:08
or see who's in the crowd. So
26:11
it ties not just into, is
26:13
somebody likely to be wrongly arrested like
26:15
Robert Williams because there
26:17
was a false match. But it starts
26:20
becoming big surveillance, too, that
26:23
an agency has the cameras
26:25
on and then they have the facial recognition
26:27
and they're purporting to identify
26:30
all the people in that crowd so that
26:32
then they can track those people. They now know that
26:34
you were at the George Floyd demonstration and
26:37
that person was at the
26:39
anti-war demonstration, and at
26:41
that point the government starts having more and more
26:44
information about all of us, to
26:46
the point where it feels like, instead of we're
26:48
controlling the government, it's like the government
26:51
controls us. So I think the
26:53
facial recognition is only one part
26:55
of the whole tendency
26:58
of technology to amplify
27:02
government power to
27:04
be kind of watching, watching what we
27:06
do. Yeah, I mean, it's
27:09
it's interesting to hear you say that.
27:11
you know, that type of technology
27:14
is just a part of it, especially when it comes to this moment
27:16
where people are out protesting police brutality,
27:19
where people are out fighting for their civil liberties.
27:21
You know, there's all sorts of technology that's
27:24
being built. There are cameras
27:26
being built that can recognize
27:29
people in real time that police are wearing.
27:31
There's all sorts of technology. This is
27:33
just the beginning of it. I
27:36
know you mentioned Amazon put a hold on their sales
27:38
of Rekognition software. Microsoft said it's
27:40
not going to sell face recognition
27:42
software to police departments until there are
27:44
federal regulations. I know IBM said
27:47
that it was going to announce a ban on general
27:49
purpose facial recognition. Is
27:51
that enough? Like, what is, I guess,
27:53
you know, what is the government's
27:56
role here? Like, what do you think should happen, especially
27:58
since this is, just, as you say,
28:01
one small part of a larger issue that
28:03
we're facing as a society. I
28:05
think that's right, and I think that there could be, you know, government
28:08
regulation, but that's not going to happen unless
28:10
the public wants to urge their representatives
28:13
to start controlling this. And what we've
28:15
seen is that an enlightened public can
28:17
make something happen even without regulation,
28:21
right, So, you know, it was that the public was becoming
28:23
concerned and that's the reason why Amazon acted
28:25
to withdraw this. They started being concerned
28:28
that their customers were not going to be happy
28:30
with them. And I think
28:32
at this point that's almost more effective
28:35
than government regulation. And
28:37
once you have that wake up call, then
28:39
you can start having serious debates.
28:41
And I think those debates have to take place in many
28:44
places. They should be taking place
28:46
in legislatures where people can talk
28:48
about the trade off between privacy
28:50
and mass surveillance and whatever the
28:52
government is trying to accomplish. Why
28:54
do they need this technology? Is it really worth
28:57
it? Are there crimes that they wouldn't be
28:59
solving without it? And are they crimes
29:01
that we're concerned about solving or
29:03
do they fall into the category of, you know,
29:07
is that something that we don't think should be a crime at
29:09
all. People are generally unaware in terms
29:11
of what the police do, that only four to
29:13
five percent of all arrests involve crimes
29:15
of violence. So when
29:17
people think about, we want to enable law enforcement
29:20
to be better at catching criminals, or
29:22
we're concerned about divesting or defunding
29:24
the police because who's going to protect us from physical
29:27
harm? Almost none of what the police
29:29
and law enforcement do is about physical harm.
29:31
It's a tiny percentage. Everything
29:34
else that they're doing is about this whole
29:36
array of all sorts of other things that we criminalize.
29:39
And I think that in addition to having better
29:41
conversations about is
29:44
there a potential for some of these technologies
29:46
that the government is using to
29:49
create arbitrary or discriminatory
29:51
enforcement, I think we need to dig deeper behind
29:54
that question, in the same way that you need to dig
29:56
deeper beyond the George Floyd murder
29:59
and to ask if there's something systemically
30:01
wrong here, do you need to rethink the whole question.
30:03
So when people say, well, you know, but we need the facial
30:06
recognition technology because it helps the
30:08
police solve crimes. Well, okay, but you know
30:10
what crimes and what are the costs?
30:12
So I think once people are educated enough,
30:15
and once they realize what the nature of the
30:17
problem is, kind of what's
30:19
being unleashed, they can start really being
30:22
ready to have that broader conversation. And
30:24
I think it should take place in legislatures, but
30:26
I think it also should take place and evidently
30:29
is taking place in boardrooms
30:31
at Amazon, Facebook, and Google
30:33
and Microsoft. They should be talking
30:35
and they do sometimes if the people
30:37
demand it. And it also has to take
30:39
place just among people, you know, among
30:42
you know, tech communities and people
30:44
just beginning to talk about what are our responsibilities
30:47
here? Is it okay for us to create products
30:49
to make money if
30:51
we know that there are dangers that the products
30:54
are going to be misused, or maybe
30:56
aren't reliable enough, or that they just feed
30:58
into this enormous surveillance state.
31:01
So let me compare this to an earlier moment.
31:03
After nine eleven, we had a
31:05
kind of a similar phenomenon that in order to deal
31:07
with catching terrorists, we
31:10
changed a lot of laws that ended
31:12
up really sacrificing a lot of privacy
31:14
and allowing a lot more government surveillance,
31:17
and for a number of years that went unchallenged,
31:19
and people kept saying, oh, well, you know, if
31:21
that's what we need in order to be safe, we're
31:24
willing to give up a little privacy.
31:26
So, first of all, I think people didn't think about
31:28
the fact that they weren't giving up their own privacy,
31:30
they were giving up somebody else's. And
31:33
second of all, people didn't realize how extensive
31:35
the surveillance really was until Edward Snowden.
31:38
So then after Edward Snowden came along
31:40
and people realized how the government was
31:42
just scooping up tons of information about
31:45
people and just keeping it in government databases
31:47
and started realizing the horrifying
31:50
potential of all that. What
31:52
happened was that Congress made a couple of
31:54
little changes to the law. But more
31:56
important, Microsoft and Google and other
31:58
places started to realize that their customers
32:00
were concerned, and they started
32:03
being a little less cooperative. At the beginning,
32:05
right after nine eleven, all of the telecoms,
32:07
all these companies, were just saying to the government,
32:09
you want information? Here, take it all. Your
32:11
Verizons were saying, sure, you know, here are all the records of all our
32:14
customers, take it all. You're keeping us safe. And
32:16
I think that to
32:19
me, the most important thing is an informed
32:21
public. That if people can examine
32:23
for themselves whether they really think that we're
32:25
being kept safe by all of this, and
32:28
really examine both the costs and the benefits
32:30
in an educated way, I think we
32:33
get much better discussions. And I think not only
32:35
do you have the possibility of getting better
32:37
legislation or regulation, you also
32:39
have the possibility that private
32:42
companies and, you know, the tech companies
32:44
are not going to want to do it anymore because their customers
32:46
don't want them to. Yeah, I mean
32:49
it's hard to have an informed public
32:51
and to have these discussions, even in this current
32:53
environment to some degree. I mean, people
32:55
I think are struggling with the idea of
32:57
truth. People are, you
32:59
know. And I remember, by the way, I remember the
33:02
Snowden leaks. Like, I remember being in the newsroom
33:04
covering technology and thinking to myself
33:06
because I rode the tech bubble
33:08
all the way up, right? And thinking, this
33:11
is an extraordinary moment because we
33:13
saw that we've been sharing all our data, but we
33:16
saw for the first time that you know, the
33:18
government had a lot of access
33:20
to things that we had no idea
33:22
they had access to. And I think it was a fundamental
33:25
shift, and the lens on tech companies
33:27
changed at that moment, and
33:29
tech companies behaviors changed quite
33:31
a bit after that. You know, I wonder
33:34
this moment we're sitting in where we're having these debates
33:36
about surveillance and privacy and whatnot.
33:39
These are sticky debates, and they're very politicized
33:41
as we're heading into an election, as we
33:44
have misinformation spreading online, as
33:46
a lot of people don't know what to believe and what not
33:48
to believe. As the media landscape
33:50
has changed, it certainly seems
33:53
like a harder environment to
33:55
even have some of these conversations.
33:57
Well, I think in some ways it's harder.
34:00
I think the other thing that is a catalyst for
34:02
the discussions is realizing that there is a dimension
34:04
of race to all of this. I think in
34:06
talking about artificial intelligence and facial
34:08
recognition, not many people saw that
34:11
as an issue of structural racism.
34:13
You know, that there's something wrong with how we're putting together
34:15
the algorithms, and it ends up that John Lewis
34:17
is going to be misidentified as somebody
34:20
who matches a mug shot and that Robert Williams
34:22
is going to be arrested. So I think that
34:24
the fact that we now know that
34:26
that is an additional concern enables
34:29
us to have richer conversations. So
34:31
we're not only talking about is there a trade
34:33
off between security and privacy? Plus,
34:36
I think the other thing that people are feeling much more
34:38
open to is to have that deeper conversation
34:41
about what are our goals
34:43
here and if we're enabling
34:45
all this government surveillance in order to
34:48
help the government to catch criminals, well,
34:50
you know, what do we mean by criminals? What crimes
34:52
are they solving? And, you know,
34:54
how is this actually being used, in
34:56
service of what? So I feel like in
34:58
some ways, you know, with the election coming up,
35:01
I think that gives people more
35:03
impetus to want to talk about these issues,
35:05
because the elections aren't only about the president.
35:08
They're also about local prosecutors
35:10
and sheriffs and the people who make the decisions
35:12
about whether to buy surveillance
35:14
equipment and what they're gonna do
35:16
with their authority over the criminal justice
35:19
system. So one thing the a c l U
35:21
has been doing in addition to everything else, as
35:23
we've been very involved in elections
35:25
of prosecutors because that's the place
35:27
where people almost never used to pay attention
35:30
to, you know, who were these people running? And
35:32
maybe they would vote for somebody without really knowing
35:34
what they voted for. So what we're urging,
35:37
and I think this is very much what we're talking about
35:39
about having an educated public. We're
35:42
urging people to go to elections
35:44
or to go to debates, to go to campaign
35:46
events, attending, I guess, on Zoom these
35:49
days, to attend campaign events
35:51
and ask the candidates questions, what
35:53
would be your policy about whether or not you're going
35:55
to accept military equipment from the
35:57
federal government in your police department? Are
35:59
you going to buy tanks? Are you going
36:01
to buy you know, these horrible weapons that
36:04
are used? Is that something you would do?
36:06
Are you going to buy you know, facial recognition
36:08
software? Is that how you would use your power? If
36:10
we elected you, say,
36:13
as the prosecutor, would you support a
36:15
reduction in cash bail
36:18
with increased alternatives to incarceration?
36:21
So that's a place where without waiting for the
36:23
government to do something, we can
36:25
ourselves affect what's happening in our
36:27
communities. By encouraging
36:29
candidates to think about
36:32
what positions they're taking on these different issues
36:34
and letting them know that they're gonna lose votes. The
36:37
more people are educated,
36:40
the more they can tell candidates that they'll lose votes,
36:42
and to try that. This is something that's worked in some
36:44
places to encourage candidates to
36:47
take a better position. Yeah.
36:49
Yeah, they might have never thought of that, but you
36:51
know, once they commit themselves, you know that's going to be
36:53
better. So there are all sorts of ways that
36:55
we can affect things. More
37:00
from Susan after the break, and make sure
37:02
you sign up for our newsletter at dot dot dot media
37:04
dot com slash newsletter. We'll
37:06
be launching this summer. Before
37:23
I move on from specifically some of
37:25
the tech issues, I have to bring up predator
37:27
drones. Uh, right.
37:31
You know, the U.S. Customs and Border
37:33
Protection flew a large Predator drone over
37:35
the Minneapolis protests. You know, people were
37:37
protesting police brutality in the killing of George
37:39
Floyd, and for many reasons, it almost
37:42
felt symbolic. You know, it was raising all
37:44
these questions about aerial
37:46
surveillance about what
37:48
data was being collected, where
37:50
was this going. What is your
37:53
take on this? Well, you know, as
37:55
you're saying, Laurie, you know, it
37:57
really magnifies the opportunity to gather
37:59
more information because you don't even have to have
38:01
the helicopters or whatever. But
38:04
so, you know, that of course is a concern, just
38:06
how much information is the government gathering,
38:08
What are they going to do with it, who's going to have access
38:10
to it? Will it ever be deleted or will it just
38:12
stay there in the government databases
38:15
forever. But I think the other thing that
38:17
the Predator drone brings to mind is
38:19
a question that people were also asking, which
38:21
is about the militarization of law enforcement
38:24
we have had for years in this country the
38:26
Posse Comitatus Act, as it's called,
38:29
which says, you don't want the military doing
38:31
everyday law enforcement because
38:34
that's not our country. We
38:36
don't want the military to be quote dominating
38:38
the streets, and we don't want the
38:41
people who are out protesting to be considered
38:43
the enemy of the United States. There
38:46
are people who are expressing their opinions, and
38:48
so the whole idea of you know, it's
38:50
one thing. It's enough if the police
38:53
helicopters are flying overhead
38:55
and trying to keep track of, you know, who's in the crowd
38:57
and what the crowd is doing. But once you
39:00
start adding an element of something like
39:02
the military helicopters or the military
39:05
drones or things that feel like we
39:08
are being treated as the enemy of the government
39:10
instead of the people who are the government, who
39:12
are supposed to be controlling the government. I
39:15
think that's just a very bad paradigm.
39:19
You think it's a slippery slope, Well,
39:21
it's a slippery slope unless we stop the slipping.
39:23
And as we saw with Amazon and the
39:25
facial recognition, if people say, wait a minute,
39:28
yeah, I think we can make that stop. But
39:30
I think if people don't pay attention, I think we
39:32
have a very slippery slope. And that's
39:34
what I've been saying about most of the issues we've
39:36
talked about, starting with the contact
39:39
tracing and the surveillance
39:41
and everything else. It seems to be that what's really important
39:43
is transparency. We should know what the
39:46
government is doing and accountability.
39:48
Back on the issue of contact tracing, one
39:50
thing that the ACLU did, together with the ACLU
39:53
of Massachusetts, is we have
39:55
filed a lawsuit, or actually a records
39:57
request demanding that government,
40:00
including the CDC, release information
40:02
about the possible uses of all the location
40:05
data that they would be collecting in connection with
40:07
contact tracing, because
40:09
you know, if you don't know what they're
40:11
doing, then you can't have a discussion about what they
40:13
should be doing. And one reason why I
40:15
was bringing up all the post nine eleven changes
40:18
of law is that I think that the whole idea
40:20
that we can't know what the government
40:23
is doing. The government has to act in secret
40:25
in order to keep us safe, or else the enemy
40:27
will be able to know what they're doing and you know,
40:29
and work around it. But the
40:31
government can know everything that we're doing. I
40:34
think that just has democracy backwards. You
40:36
know, we have to be able to know what's happening
40:38
inside the government. And that applies to why
40:41
are they sending the Predator drone? What are they going to
40:43
do with the information? What does this mean? Are they
40:45
going to do it again? And it also
40:47
has to do with the contact tracking
40:49
and tracing. Once they get that data,
40:51
what happens to it? Are they going to
40:53
erase it ever, you know, who do they share it with, what are
40:55
they going to do with it? And I feel, you
40:57
know, those are really important issues in a democracy
41:00
that we just have the right to know what the government
41:02
is doing so that we can talk about it. And
41:05
I feel like to sort of say, well, this is what
41:07
the government is doing and that's really bad,
41:09
and that upsets me. I think that
41:11
kind of misses the point. If the government is doing
41:13
something bad, then it is the duty of
41:16
every American to find out
41:18
what they're doing and to push back. And
41:20
so at the a c l U we have a program
41:22
that we call people Power. We
41:24
first invented that and used it to explain
41:27
to cities and localities
41:29
all over the country about how they could fight
41:31
back against draconian immigration
41:33
rules by becoming quote sanctuary cities,
41:36
what their rights actually were. We then
41:39
used it for voting rights. We're about to
41:41
use it some more for voting rights. But what
41:43
we have really urged and I hope that you know, some of
41:45
your listeners will go to the ACLU website
41:47
and see what People Power is doing in
41:49
addition to what the ACLU is doing. Because
41:51
what is the ACLU doing? That's all the staffers
41:54
at home trying to, you know, work on their new laptops
41:56
while they're trying to, you know, keep their toddlers quiet.
41:59
But People Power is about what every single
42:01
person can, and I think should, be doing.
42:03
You know, if people really educate themselves and
42:05
think about the ethical issues, the
42:08
costs and benefits of all this technology
42:11
in addition to a lot of other things going on, I
42:13
think we get a lot better results if people
42:15
pay attention. Yeah, I mean it's interesting
42:17
to watch the a c L you take on issues
42:19
like surveillance, facial recognition. I know,
42:21
the ACLU filed a lawsuit against Clearview
42:24
AI, which was this very controversial company
42:27
that was using biometric data. I
42:29
think facial recognition technology helped them collect
42:31
something like three billion faceprints, and they were
42:33
giving access to private companies, wealthy
42:36
individuals, federal, state, and local law enforcement
42:39
agencies, and you know, coming from
42:41
the tech space, it certainly feels like
42:43
sometimes these stories, you just don't know what
42:45
these companies are doing until
42:47
you start, you know, peeling back the
42:49
layers and seeing all the data went to here
42:52
and here, and why did it go there? And why wasn't
42:54
this disclosed? And oftentimes
42:56
it takes the watchdog to really
42:59
understand where some of this can
43:01
go wrong, and how it's being
43:03
used in ways that
43:05
can be dangerous. Yeah,
43:08
I think that's exactly right. And that's why I was saying before
43:10
that our concern is, before everybody jumps on the bandwagon
43:13
about let's have more contact tracing and you know,
43:15
like everybody should just be sharing all this information,
43:18
I think we have to get a watchdog. Yeah,
43:21
you're not gonna have the watchdog telling you things unless
43:24
you build a watchdog into the system. And
43:26
if everything is just you know, a company has invented
43:28
this and is selling it to the police, or a company
43:30
who has invented this and now we're all going to buy it.
43:33
If you just leave out any sort of oversight,
43:35
then you really have a tremendous potential
43:37
problem. Are there any other examples
43:39
of tech that we're not thinking about the unintended
43:42
consequences for our rights or privacy
43:44
yet? Well, you know,
43:46
AI is really big altogether,
43:49
as you're saying, across many different kinds of
43:51
issues. I was just actually, this
43:53
is a little tangential to your question, but you were
43:55
asking me before about cases that I had worked
43:57
on, and there was another case that I worked on that was
44:00
about tech, where I wrote the
44:02
ACLU's brief in the Supreme Court.
44:04
It was an amicus brief. It wasn't about
44:06
our client, but it was a case called Riley versus
44:08
California. And what the police were
44:11
saying there, most law enforcement
44:13
places, the federal government as well as the state
44:15
of California and many other jurisdictions,
44:18
was that when you arrest somebody, the
44:20
police get to do what is called a search incident
44:22
to arrest, so they get to see what you have
44:24
in your pocket. Makes some sense, right, you know, if you have
44:26
a gun in your pocket, that's a problem or you know whatever,
44:29
So they get to do a search incident to arrest.
44:31
And the law had been that if they find
44:33
something in your pocket, like, that's a container,
44:36
they can search inside the container to see
44:38
if there's anything in it that
44:40
could be harmful. And in fact, there was one situation
44:43
where they opened up a cigarette package
44:45
that somebody had and they you know, they could find
44:47
a razor blade, they could find a marijuana
44:49
cigarette, whatever. So that was the law, where
44:51
the Supreme Court said, yes, you're allowed to search
44:53
people and search the containers that are on them.
44:56
Well, what law enforcement said was your cell
44:58
phone is a container. When
45:00
we arrest you, we can search your cell phone. It's
45:02
a container. We have the right to search incident
45:05
to arrest. And so we wrote a brief saying,
45:07
no, it's not you know, it's a container, but it's a
45:09
container that essentially is your home, it's
45:12
your library, it's your desk. So
45:15
allowing the police to look in your cell phone
45:17
when they only had really very feeble
45:20
and very unlikely scenarios,
45:22
things that just wouldn't happen too often for what the
45:24
need was. You know, maybe you had some remote thing
45:26
that would go off and would blow something up. You know,
45:28
oh come on. But yeah, there were other ways
45:30
to deal with a lot of that, and so the Supreme
45:33
Court actually agreed with that. They said, yeah,
45:35
this really is just a technological
45:37
way of finding out what's
45:39
in all your papers and books and records.
45:42
It used to be they were in your desk, and now they're in your
45:44
cell phone. So that, to me, is
45:47
a sort of a whole thread of what we've been talking
45:49
about. But the challenges to civil liberties
45:51
are different and in some ways
45:53
greater when the technology builds
45:55
up. Yeah, there's
45:58
there's a great quote on the ACLU's website.
46:00
The fact that technology now allows an individual
46:02
to carry such information in his hand does not make
46:04
the information any less worthy of the protection
46:06
for which the founders fought. That's U.
46:09
S. Supreme Court Chief Justice John Roberts
46:11
exactly, I like to talk about, you
46:13
know, one of the whole points of the Constitution
46:16
adding the Fourth Amendment, which is the protection
46:18
of privacy, is they wanted to protect
46:20
what was in Benjamin Franklin's desk. Nobody
46:23
should know if he was writing some things that were anti
46:25
government, and we now have that on our cell
46:27
phone, so of course. But that's where
46:29
I think that a lot of the protection
46:31
of civil liberties is applying our
46:34
fundamental principles in different circumstances.
46:37
Taking a gigantic step back, what
46:39
do you think is the biggest threat to civil liberties
46:42
in the new World Order? In
46:44
the New World Order? Well, you know, it's hard to
46:46
just select one. It's sort of like Sophie's choice,
46:48
you know, which is your favorite child? But
46:51
right now, I think one of our very top priorities
46:53
is mass incarceration. It's a big one
46:55
because so many people's lives are just being totally
46:58
disrupted, and often their families, and
47:00
the question really has to be, for what? One
47:03
thing that we're hoping is that the work we've been doing
47:05
around trying to get vulnerable people released
47:08
from prison so that they won't get the virus
47:10
and get seriously ill
47:12
or possibly die, is we're hoping
47:14
that once jurisdictions see that
47:17
they were able to release thousands of people
47:19
from prisons and jails and that it's
47:21
not going to cause a spike in the crime rate, that it
47:24
really is a pretty safe thing to do. We're
47:26
hoping that that's going to stick and
47:28
that in the long run we'll be able to rethink,
47:30
well, did we really need to put all those people in
47:32
prison and jail to start with? What are we doing
47:34
with the criminal justice system? So that's really
47:37
big. But the other thing that I think is really
47:39
big right now is voting rights. I
47:41
had alluded to this at the beginning of our conversation,
47:44
but the premise of democracy
47:46
is that the people get to decide on who
47:49
should be running the government and who should be making
47:51
the policy about all these things we're talking
47:53
about here. You know, what are the
47:55
regulations about technology? What are
47:57
the regulations about your reproductive freedom?
48:00
Or everything else, LGBT
48:02
rights? And if
48:04
the people's vote is distorted, that's a
48:07
real problem that people can't vote. So
48:09
we have litigation going on right now
48:11
in I think it's like thirty different
48:13
states trying to get
48:15
people the opportunity to vote. So
48:18
one of the things that has happened,
48:20
in addition to all the ways
48:22
that incumbents had been using to try to protect
48:25
their own seats, is that the
48:27
virus has really made it dangerous for people
48:29
to vote in public places. So we
48:31
saw the election in Wisconsin
48:34
where people were just lined up for, you know, tremendous
48:37
distances, waiting for a really long time to vote
48:39
because Wisconsin would not allow them
48:41
to submit absentee ballots. And
48:43
in fact, a study showed afterwards that at least
48:45
seventeen people got the virus from
48:48
voting. Many, many polling
48:50
places were closed because, first
48:52
of all, the poll workers are generally elderly
48:54
people, and the poll workers were not able and willing
48:57
to man the polling places. There are
49:00
a number of states that don't allow absentee
49:02
ballots at all, unless you have a particular
49:04
situation, like if you're disabled, and the states
49:06
are saying, oh, well, you know, fear of the virus and
49:08
getting ill, that's not a disability.
49:11
Or before you get an absentee ballot, you have
49:13
to have it notarized, you have to have witnesses.
49:15
Now, how is all this going to happen? So
49:18
it's very concerning that people
49:20
are going to have to choose between their health
49:22
and their right to vote. And we don't
49:24
think that that should happen. And that's something that
49:27
has to be attended to right now because
49:29
if states don't come up with plans for
49:32
enabling everyone who wants to vote
49:34
to be able to vote, and
49:36
for counting absentee ballots and for
49:38
administering this program. If you don't come up
49:40
right now with the plan and the resources, a
49:42
lot of people are going to be left out and they're going to find
49:45
that either you know, they can't vote because
49:47
they're afraid to go out to the polls, or
49:49
the vote is not going to be adequately counted. So
49:51
I think that right now making democracy work
49:54
is really one of our top projects. What
49:56
is the solution to some of these problems? What are
49:58
your tangible solutions? Well, one tangible
50:01
solution is that more states have to make absentee
50:03
balloting available to people without having
50:05
all these conditions and, you know, obstacles.
50:09
The other solution: you
50:11
were talking before about truth. A
50:13
lot of the reason that's
50:15
given, the very thin veneer of
50:17
justification that's given, for we don't want absentee
50:20
ballots, or we need voter ID, people
50:22
to carry government-approved
50:24
voter ID, which means you have to go down
50:27
to a governmental office live and get your voter
50:29
ID and show it at the polls. The
50:32
excuse for a lot of this is that
50:34
there could be fraud. Well, studies have shown
50:36
that there's virtually no voter fraud, and
50:39
it's just really a unicorn. And
50:41
again, I think if people understood that,
50:43
that might sound good, but it's not true. I
50:45
think truth is another thing that we're really
50:47
fighting for these days. Can you listen to the evidence?
50:49
Can you listen to the public health officials? Can you
50:52
listen to what's real? I
50:54
know for a fact that tech companies are very
50:56
concerned about voter suppression, you
50:58
know, and misinformation spreading online.
51:00
This idea of countering untruths around a lot of
51:02
these very important initiatives, whether it's
51:05
absentee ballots, whether it's showing up to the polls,
51:07
all that kind of thing. You know, I'd
51:09
be curious to know your take. There's a current battle
51:12
happening right now. You have seven hundred fifty advertisers
51:14
boycotting Facebook asking for better
51:17
policing of hateful content. Are
51:19
social media companies doing enough to police
51:21
harmful content, especially
51:24
as we head into an election where voter suppression
51:26
and the spread of misinformation will most certainly
51:28
be a tactic used to manipulate voters.
51:31
Well, let me actually break your question down into two
51:33
different parts, because you started by talking about
51:35
the concern about voter suppression. I
51:37
think one thing that everybody should be doing is
51:39
to increase awareness of
51:42
what is a fair way to improve
51:44
access to the ballot for everybody.
51:46
And some of those things are tech solutions. We've had
51:48
tech solutions for years that are available
51:50
and not widely enough used. How do you
51:53
enable differently abled people
51:55
to vote? You can bind people vote, do
51:57
they have the technology? So there
51:59
are a lot of areas where we need the tech community
52:01
and we need everybody to find out how
52:04
voting can be made
52:06
easier, and to let people know
52:08
what the rules for voting are where they live. So one
52:11
thing the ACLU is doing is we have on our
52:13
website Know Your Rights, so you know what
52:15
your voting regulations are. And
52:17
that's something that I think people really have to start thinking
52:19
a lot about, and to let
52:21
all their communities, all their friends and family
52:23
know about the importance of voting and
52:26
what they have to do to vote, and to urge
52:28
them to just get out and vote in whatever form
52:30
that's going to take. So I think that's really
52:32
important. In terms of disinformation
52:35
on social media, people
52:38
talk about the First Amendment
52:40
and whether, you know, there's a First Amendment problem
52:42
with Facebook telling you what you can't
52:44
do. Well, there isn't, because the First Amendment
52:46
only applies to the government, so
52:49
you don't have a First Amendment right to say whatever
52:51
you want on Facebook. However, I
52:53
have to say that, you know, we don't regard
52:56
that issue as altogether a simplistic issue,
52:58
that Facebook should be telling everybody what they can't
53:00
say, because even though the First Amendment
53:02
does not apply to private companies, there's
53:05
still a tremendous value to free speech. And
53:08
there are a number of examples, which, you know,
53:10
we've come up with, about people
53:13
who have had speech suppressed for bad
53:15
reasons. I'll give you one example. There was a
53:17
woman, an African American
53:19
woman, who posted something on
53:21
Twitter and she got all these horrible racist
53:23
responses and she posted
53:26
a screenshot of the responses that she got
53:28
to show people what she was up against, and
53:30
Twitter took it down because it included racist
53:32
words, which, you know, okay,
53:35
you know, kind of misses the point. There
53:37
was another case. An ACLU lawyer
53:39
wrote about a statue in Kansas
53:42
that was a topless statue of a woman who
53:45
was bare-breasted, and so whatever
53:47
the locality was in Kansas decided
53:49
to take it down because, yeah,
53:52
they considered that to be important. So the
53:54
ACLU lawyer, who was challenging whether or
53:56
not the, I think it was the city, could take
53:58
it down, posted a picture
54:00
of the statue, and it was on Twitter
54:02
or I think Facebook, and that was taken down
54:05
on the ground that it was obscene, so she couldn't
54:07
post the picture of what she wanted to do.
54:09
So we think that social media
54:12
control is really a two-edged sword. What
54:14
I liked is at one point Facebook had a protocol
54:17
about what's true and what isn't true,
54:19
And what they did was they gave you a flag. So
54:22
if they were concerned that something that was said
54:24
wasn't true, they would have a neutral fact
54:27
checker check it, and then if it didn't
54:29
turn out well, they would put a little flag over
54:31
it and say this has been questioned, and you could click
54:33
on the flag and you could see why it was questioned.
54:36
But they didn't just take it down. So
54:38
you know, I agree that, you know, disinformation
54:40
is a tremendous problem, but I think
54:43
that the idea that the solution is to ask
54:45
the tech companies to decide what we should and
54:47
shouldn't see, yeah, I don't think that's
54:49
so great either. And certainly
54:51
they should not be doing it without a lot of transparency
54:54
and accountability. If they're going to be taking things
54:56
down, they should tell us what
54:58
their protocols are, and you
55:00
know, there should be more public discussion about
55:03
where the balance is there. Yeah, it certainly
55:05
seems like the protocols change quite a bit, especially
55:07
having covered tech for this many
55:09
years. It certainly seems like Facebook changes them, Twitter
55:11
changes them, and oftentimes it depends on
55:13
public pressure. I'm curious to see what happens
55:16
with all these advertisers boycotting. I think
55:18
personally, I have a feeling it won't impact the bottom line
55:20
much and they'll go back to business as
55:22
normal. But who knows, you know. I do
55:25
know that Zuckerberg cares deeply
55:27
about his employees, but they've
55:29
been kind of up against, you know, public
55:31
scrutiny for a very long time. But it certainly
55:34
is interesting, especially when the stakes get higher
55:36
and disinformation can go further,
55:39
and especially as we get closer
55:41
to an election, it certainly feels like everyone
55:43
feels more triggered around it. Yeah.
55:46
Yeah, well, you know, one of the classic statements
55:49
about the First Amendment is that in the marketplace
55:51
of ideas, the best antidote to bad
55:53
speech is more speech, right? So,
55:55
you know, with suppression, I think we always have to worry every
55:57
time somebody is censoring and suppressing. Yeah,
56:01
who are we giving that power to? You know,
56:03
nearing a close because we don't have you for
56:05
too much longer. I saw that you gave
56:07
a talk, A Democrat
56:10
and a Republican walk into a bar and
56:12
you're saying that it seems like these days Democrats
56:14
and Republicans can't really agree on anything, but
56:17
we all need to agree on fundamental
56:19
American principles like due process, equality
56:21
and freedom of conscience?
56:23
So is that possible?
56:26
Do you believe, are you an optimist? Do you
56:28
believe that in this current environment? Is
56:30
that possible? Well,
56:33
I think that's a great wrap-up question. So
56:36
that speech I gave at the Central Arkansas Library.
56:39
And my chief point, as you're saying, is I
56:41
think that people have to be able
56:43
to agree on neutral
56:45
principles. The Constitution
56:48
was designed not to say what we're going to
56:50
do about everything. It was designed to have
56:52
everybody have a fair opportunity
56:54
to be part of the process of
56:56
deciding what we're going to do. So it sets
56:59
up all these democratic structures where we get
57:01
to vote for the people who are the policy makers
57:03
and we all get to decide. But the
57:05
principle there, the underlying principle, is
57:07
that everybody should have a fair shot. And, you know,
57:09
the principles should be neutral: everyone should get to
57:11
vote. It's not like, you know, if you're
57:13
a Democrat, your vote doesn't count in this area,
57:16
and if you're a Republican, your vote doesn't count in
57:18
that area, that's not fair. And
57:20
the basic ideas of the
57:23
freedom of speech, freedom of religion, they're
57:25
all, to me, manifestations
57:27
of the golden rule: that if I
57:30
want the ability to just choose my own religion
57:32
and decide what religion I'm going to practice, I
57:34
have to respect your right to make a different choice
57:36
and have your own religion, because that's the golden
57:38
rule. If I want to say something that's unpopular,
57:41
I have to respect your right to say something that's
57:43
unpopular. And if I want to be treated
57:45
fairly and not locked away for doing
57:47
something minor and never given a fair trial, I
57:50
have to respect your right to have the same thing
57:52
happen to you. And to me, all
57:54
those fundamental principles are things that we really
57:56
all should agree on. I think people
57:58
get into arguing and assuming
58:01
that they can never agree on
58:03
the principles because they're differing on what
58:05
they think the results should be. And
58:07
I think that, to me, part of the point of civil
58:09
liberties is it's all about process, it's not
58:11
about results. The ACLU is nonpartisan.
58:14
We don't try to get Republicans elected. We don't
58:16
try to get Democrats elected. We don't
58:18
favor or disfavor individual politicians
58:20
or individual parties, but we
58:22
favor the idea that there should be neutral
58:25
principles that everybody can agree
58:27
to, to say, okay, here's what's fair. And
58:29
the analogy I used in that talk
58:31
at the Central Arkansas Library, it
58:34
was one of the nights during the
58:36
World Series, but fortunately not a night where
58:38
there was a game, so people were able to come, and
58:41
I said, okay, so what happens before a
58:43
baseball game is that everybody has agreed
58:45
on the underlying rules, and everyone agrees
58:48
that your umpires, your
58:50
referees, in any sport, should be neutral. And
58:53
you don't want somebody who's partisan. If they were favoring
58:55
one team, you'd get rid of them. All sports
58:58
fans could agree to that. You know, maybe they would
59:00
be a few who would be just, you know, so Machiavellian,
59:02
that they would rather have the biased umpire
59:05
to always rule for their side. But I
59:07
think sports fans can agree on what you really
59:09
want for a fair game. Because you want a
59:11
fair game, you want everyone to agree on the principles
59:14
beforehand. And I think that if we
59:16
could sit down in small groups around the
59:18
country and really talk about what the fundamental
59:20
principles are. I am
59:22
enough of a patriot to think we actually could agree
59:24
about a lot. And let me give you an example
59:26
of why I think there's some basis for hope.
59:30
Maybe not optimism, but certainly hope. We
59:33
were talking about voting rights. So one of the
59:35
major problems is gerrymandering, the
59:37
way when a party is in power
59:39
they try to distort all the
59:42
districts and they try to stack the deck so that
59:44
their party will remain in power. Or
59:46
if the party in power in a particular state
59:49
thinks it's to their advantage to not have that
59:51
many people vote, they try to make it harder
59:53
to register to vote for new
59:55
voters, etcetera.
59:58
The ACLU, and
1:00:00
a number of other organizations working
1:00:02
in coalition with us have had a fair amount
1:00:04
of success doing ballot initiatives
1:00:06
going to the people of a state in
1:00:09
states like Michigan and Nevada
1:00:11
and Missouri and Florida, where
1:00:13
we were part of getting Amendment Four passed,
1:00:15
that gave the vote back to people who
1:00:18
had been convicted of a felony at some point.
1:00:20
When you ask the people
1:00:23
of the state, you can get a majority, sometimes
1:00:25
a supermajority, of people who say no, we want
1:00:27
the rules to be fair. Who
1:00:29
doesn't want the rules to be fair? Legislators
1:00:32
who are incumbents and who want to keep
1:00:34
their seats even if it takes unfair procedures
1:00:37
to do it. So that's a real
1:00:39
problem that we have right now, that the incumbents,
1:00:41
the people who are trying to maintain power and
1:00:44
not allow any sort of regime change, are
1:00:46
pulling all the levers. But I
1:00:48
think the chief grounds for optimism
1:00:51
is that when you go to the American people
1:00:53
themselves and say, well, do you want a fair system
1:00:55
or do you want a system where you think your side is more
1:00:58
likely to win? You talk to
1:01:00
them about that, and I think that you're going to get them to say
1:01:02
they would really like to see a fair system,
1:01:04
and that is the promise of America.
1:01:07
Last question. You have taught at Brooklyn Law School
1:01:10
for a long time. What is the lesson
1:01:12
your students will take from this moment in history?
1:01:15
Well, I know there
1:01:17
are lots of lessons, but if you could
1:01:19
extract it, what is the lesson your
1:01:21
students will take from this moment in history?
1:01:24
Well, you know, in an individual
1:01:27
setting, one thing I'm doing for the fall is
1:01:29
I am preparing a course that I'm calling COVID
1:01:31
nineteen and the Constitution. So
1:01:33
what we're gonna do in this seminar is we're going to be looking
1:01:35
at the way in which the Constitution has been
1:01:38
challenged and to see, you know, how well it holds up.
1:01:40
What does the Constitution have to say about whether
1:01:42
you can quarantine people and whether
1:01:44
you can allow people to be at a religious assembly
1:01:46
but not go to a protest, and etcetera,
1:01:49
etcetera. So I think there's a lot of interesting things
1:01:51
there which I think are very much of this particular
1:01:53
moment, but big picture, what I would
1:01:55
like the students to take away, the
1:01:58
constitutional law students especially is
1:02:00
essentially what I just said to you, that the Constitution
1:02:03
is about process. It's not about results,
1:02:05
it's not about you know, you're a Republican and you're a
1:02:07
Democrat, and we have two different countries depending
1:02:10
on what your party is. I
1:02:12
think that we have one country and it's all
1:02:14
about a neutral process for very good
1:02:16
reasons, and I would like people to think more about
1:02:18
that. After my speech at the Central Arkansas
1:02:21
Library, I had two
1:02:23
examples of people who talked to me. One guy came
1:02:25
up to me, he said, I'm the Republican
1:02:27
who walked into that bar, and
1:02:31
he said, you know, you're making a lot of
1:02:33
sense to me. And then there was another guy
1:02:35
who talked to me who was a Democrat. He said, you know,
1:02:37
I never really thought about that, that maybe it's not
1:02:39
right if we're only trying to win. I never thought
1:02:42
about, you know, that's not what we do in sports.
1:02:45
And that's what I'd like people to think about. You
1:02:47
know, do you really want to do things that are only about how
1:02:49
you think it's going to come out and cheat and destroy
1:02:52
the system, and, you know, put a thumb on the scale
1:02:54
and you know, stack the deck in
1:02:56
order to make things come out to what your
1:02:58
preferred result is in the short
1:03:00
run? Or, long term, is that
1:03:02
just a really bad idea, because
1:03:04
it's just totally inconsistent. You know, we've
1:03:06
just come from the Fourth of July. It's totally inconsistent
1:03:09
with the premises on which we
1:03:11
would like to believe our country was founded.
1:03:15
Does technology throw a wrench in the system?
1:03:17
I mean it does. It does create lots
1:03:19
of things you can't control, and it
1:03:22
always does. It's always
1:03:24
a new environment. So, you know, a different kind of example:
1:03:26
we were talking about technology and surveillance,
1:03:28
where of course technology has enabled
1:03:30
a whole lot of surveillance that we then have to deal
1:03:32
with. But technology also enabled
1:03:35
a whole lot of new marketplaces of ideas.
1:03:37
So the ACLU did a lot of litigation a
1:03:40
few decades ago on applying First
1:03:42
Amendment principles to the Internet, right,
1:03:44
you know, could the government censor what
1:03:46
was on the Internet because, you know,
1:03:48
a child might see it? And
1:03:51
so you know, every new generation of technology,
1:03:53
there are new challenges about
1:03:56
how you apply our principles like privacy
1:03:58
and free speech, et cetera, to
1:04:01
the Internet, but the principles
1:04:03
remain the same. I
1:04:16
hope everyone is doing well in these strange
1:04:18
and surreal times and adjusting
1:04:20
to the new normal. Most important,
1:04:23
I hope you're staying healthy and somewhat
1:04:25
sane. Follow along on our social
1:04:27
media. I'm at Lorie Siegel on Twitter
1:04:29
and Instagram, and the show is at First
1:04:31
Contact Podcasts on Instagram and on
1:04:34
Twitter. We're at First Contact pod
1:04:36
and for even more from Dot dot dot sign up
1:04:38
for our newsletter at dot dot dot media dot
1:04:40
com, Backslash Newsletter, and
1:04:42
if you like what you heard, leave us a review on Apple
1:04:45
podcasts or wherever you listen. We really
1:04:47
appreciate it. First
1:04:51
Contact is a production of dot dot dot Media
1:04:54
Executive produced by Laurie Siegel and Derek
1:04:56
Dodge. This episode
1:04:59
was produced and edited by Sabine Jansen
1:05:01
and Jack Regan. The original theme
1:05:03
music is by Xander Sang. First
1:05:14
Contact with Lorie Siegel is a production of dot
1:05:17
dot dot Media and I Heart Radio.