Episode Transcript
0:04
There Are No Girls on the Internet is a production of iHeartRadio
0:06
and Unbossed Creative. I'm
0:12
Bridget Todd, and this is There Are No Girls on the
0:14
Internet. Graduation
0:18
is supposed to be a time for eager college
0:20
grads making speeches and walking
0:23
across the stage to receive their hard-earned
0:25
diplomas, but campuses
0:28
around the country at places like USC,
0:30
Columbia, and Emory have either
0:33
shrunk or canceled graduation
0:35
ceremonies in the wake of protests
0:37
demanding universities divest from Israel,
0:40
in other words, withdraw any funds
0:42
their university endowments have invested in
0:44
companies that are linked to Israel. The
0:48
protests have really escalated recently,
0:50
and even before that, a digital billboard
0:52
truck funded by a well-moneyed conservative
0:55
group drove around university campuses
0:57
at Columbia, Harvard, and UPenn with
1:00
pictures and names of students, accusing
1:03
them of being, quote, their campuses'
1:05
leading anti-Semites. One
1:07
student whose picture and name were on a
1:09
billboard truck at Columbia spoke
1:11
to The Verge, saying, I literally
1:14
did not leave my apartment the days my
1:16
friends told me about it. The student
1:18
said that her name was listed on the truck because
1:21
a club that she was no longer a part of had
1:23
signed on to an open letter urging Columbia
1:26
to cut ties with Israel. I completely
1:28
wiped as much as I could of my online
1:30
presence and stayed in my apartment as much
1:33
as I could, the student said. Meanwhile,
1:35
Canary Mission, a website that lists
1:38
people perceived as being supportive of Palestine
1:40
in order to damage their future job prospects,
1:43
has stalked students across the country. It
1:46
is your duty to ensure that today's radicals
1:48
are not tomorrow's employees. A
1:50
video describing the project explains. Caroline
1:53
Sinders and their colleague Sam are
1:55
part of Convocation Research and Design, or
1:58
CoRD for short. They help people who
2:00
are being doxed or at risk of being
2:02
doxed, train people on how to stay
2:04
safe both online and off while engaging in protest,
2:07
and advocate for technology to be designed in
2:09
more responsible ways. When
2:11
I first caught up with Caroline and Sam, I
2:13
thought this whole conversation would be pretty straightforward,
2:16
you know, turn off your phone out of protest, that
2:18
kind of thing. But as we spoke, Caroline
2:21
and Sam described just how charged
2:23
of a landscape we're talking about. And
2:25
I realized the conversation is so
2:27
much more complicated than that. So
2:34
when I first reached out to y'all to do this episode,
2:37
I thought it was going to be a pretty basic guide
2:39
for how folks protesting on
2:41
campus and off campus can stay
2:43
safe digitally. I thought we'd be talking
2:45
about pretty basic things like use
2:48
a pass code to unlock your phone so that if you're
2:50
detained by police they have a harder time getting into
2:52
it, or cover any distinctive tattoos,
2:55
that kind of thing. But then when
2:57
you and I started talking, I realized
2:59
there was so much more going on here,
3:01
and that I needed to take a step back. You all
3:03
told me about things like provocateurs
3:06
setting up fake dating profiles and then
3:08
going on fake dates with protesters
3:10
and trying to secretly record them saying
3:13
things to make them look bad, James O'Keefe
3:15
style, or how right wing influencers
3:17
are kind of selling themselves as citizen
3:20
journalists and trying to make names for themselves
3:22
by engineering viral content of
3:24
protesters looking bad on top
3:26
of threats to protesters that
3:28
you might traditionally think of, you know, threats from
3:31
campus administration or police. So
3:33
can you just give us kind of the lay of the land
3:36
in terms of the threats that these student protesters
3:38
are facing right now?
3:40
These threats are wide and varied. I
3:43
think student activists are seeing
3:45
threats not only from
3:48
outside, but from inside as well,
3:50
so they're having to contend with school
3:53
administration, sometimes
3:57
hostile students
4:00
who are against
4:02
what they're protesting for, as
4:04
well as outside forces like
4:07
the police or outside
4:11
sometimes infiltrators like people
4:15
like, Project Veritas style, where
4:18
they're trying to get into
4:20
protests in order to record
4:22
people who they perceive
4:25
as their political opposition in order
4:27
to make them look bad and then posting
4:29
that on the internet for clout
4:32
or subscriptions or money, as
4:35
well as a variety of other threats
4:39
that I think a lot of people don't really take into account.
4:42
So you talked about people doing this for clout
4:44
or engagement. Do you think that's a big
4:46
undercurrent of what we're seeing here, Like people who
4:48
are like, oh, this is a great opportunity to put
4:50
out some viral video that makes a protester
4:53
look bad or look uninformed, or look
4:56
threatening or dangerous. I'll put that
4:58
online and become a superstar myself.
5:00
I mean, we are more in a
5:03
time of the kind of like citizen journalists
5:06
some of them call themselves, or like man
5:08
on the street interviews where those are very
5:10
popular on like TikTok or Instagram or
5:13
Twitter or YouTube or
5:15
Rumble, like the list goes on and on
5:17
of all these different platforms that they're able
5:19
to monetize these videos for
5:22
and it is very big in the
5:24
cultural climate
5:28
that these protests are happening in.
5:32
So yes
5:34
and no, like they are a small part
5:37
of these protests.
5:37
But any large event I feel like that's
5:41
happening collectively in
5:44
the country is going to
5:46
have probably a similar amount of
5:48
people that are chasing it for some
5:51
type of engagement.
5:52
Just to build on that. You know, I really do think we are
5:55
in the age of the influencer. Like,
5:57
I think that is something a lot of folks feel
5:59
comfortable saying. It is really hard
6:01
to say, you know, citation
6:04
needed in terms of how many folks are attending these
6:06
protests in person
6:08
or following them online right
6:11
and
6:13
commenting in a way that is related
6:16
to influencer or citizen journalism.
6:18
Like there are a variety of organizations
6:21
that are trying to identify student
6:23
protesters and dox
6:25
them and put their information online. Like one that comes
6:27
to mind is Canary Mission, for example. But
6:30
yeah, there are a lot of folks that are
6:33
either, or not a lot, but there
6:35
are definitely folks that are showing up to the protests
6:37
that, we could argue, have
6:40
you know, maybe some more malicious
6:42
intentions that are in
6:44
this citizen journalism or influencer
6:47
space. But there's also lots of folks watching
6:50
online trying to identify
6:53
different folks
6:55
at these protests and sort
6:57
of doing a kind of let's say, like recap
6:59
or an analysis. And this is stuff you know, we've
7:01
seen in many other different
7:04
kinds of spaces and ways. Like what comes
7:06
to mind for me is, you know the Johnny
7:08
Depp Amber Heard trial where we saw so
7:10
many sort of armchair investigators
7:14
and, in heavy, heavy quotes, body language
7:16
experts
7:19
like this, the lip readers
7:21
and things like that. But
7:24
that had like a real impact on how people
7:26
really perceived that trial and
7:29
how, I would argue, it really
7:32
painted Amber Heard not as a victim
7:34
but as a perpetrator, when in fact
7:36
she was a victim, right, And so I think,
7:39
you know, we're still seeing a lot of that here
7:42
and now with folks I think also maybe
7:44
not physically being at the protest, but commenting
7:47
on them, analyzing
7:49
them, right, and providing
7:52
different kinds of like wrongful
7:54
assessments.
7:56
And similar to what we saw with the
7:58
Amber Heard Johnny Depp trial. Do
8:00
you have you seen that those
8:02
kind of armchair folks who aren't
8:04
necessarily involved in
8:06
the protest but are like you know, commenting
8:09
on the viral video they saw from
8:11
the protests or whatever, that is having
8:14
an impact on how everyday
8:16
people are perceiving what's going on on college
8:19
campuses across the country.
8:21
I think it's leading to like stratification
8:23
for sure, and I think some of that is
8:26
also you know, I would argue actually
8:28
also coming from a place even
8:30
of sadly like traditional media.
8:33
One of the things I like to highlight is we don't live in a harassment
8:35
literate society, and
8:38
I think a lot of the analysis
8:41
that is sort of existing around this,
8:43
this particular conflict,
8:46
right, is sort of caught up in
8:49
a real lack of nuance, right? So the
8:52
inability to see let's
8:54
say, the people of Gaza as individuals
8:57
that we are watching a genocide
8:59
unfold, that we are watching
9:02
things where many
9:04
respected, reputable international bodies,
9:07
right, are weighing in and saying like this is unprecedented
9:09
in terms of like a famine that's being caused,
9:12
in terms of the attack, in terms of the amount of journalists
9:14
killed, in terms of the amount of children that are killed
9:17
in this particular conflict,
9:20
right, And a
9:22
lot of what I think was
9:24
happening is a version of context collapse in
9:26
which I
9:29
think some folks are
9:33
weaponizing anti-Semitism to
9:35
push for other agendas, and
9:37
that is also creating a space
9:39
in which it's very difficult to have one
9:42
could argue a nuanced conversation, right,
9:44
And I think that is a large part of
9:48
this. And then from there that kind of
9:50
gives bad actors a space to
9:54
push different kinds of agendas and
9:56
different kinds of information and also
9:58
in a way, almost again like justify the
10:01
doxing of different activists,
10:03
right. And so
10:06
I think it's
10:09
sort of hard to like definitively say, but
10:11
I do see on
10:15
like the, you know, the
10:18
right or conservative end, uh,
10:20
this sort of lack of nuance and inability
10:22
to understand like
10:26
a human rights conflict really muddying
10:29
the waters in terms of also how do you contextualize
10:32
or talk about let's say, like student
10:34
protesters.
10:35
So we talked a lot about like media
10:39
entities. I guess I'll say, who are known
10:41
provocateurs that like nobody should trust.
10:43
But I do think that national reporting
10:46
is not all it could be. I have not seen a lot
10:48
of national reporting that is super clear
10:50
about what is going on, like the fact
10:53
that we're talking about, like, a
10:55
nineteen-year-old college freshman or something
10:58
being up against a coordinated, organized,
11:00
well-funded institution with
11:03
a mission to dox them
11:05
and make them look bad, right, and so like, I
11:08
don't feel like we are really getting
11:11
a lot of opportunities to have that full
11:13
story told. I think
11:15
that the way that you have described it, Caroline, as
11:17
like we just don't live in a harassment literate
11:20
society where you know the
11:22
actual context of what that would be like and
11:24
what and like why that is happening
11:27
is not being reflected perhaps. And I
11:29
guess I wonder, like, how can we get to
11:31
a place where that story is
11:33
being told more honestly, where people really understand
11:35
that like, yeah, you're talking about a
11:38
college student for whom a
11:41
billionaire funds a truck with their
11:43
picture and their name on it to drive around their campus,
11:45
right or like, like, how truly fucked
11:48
up that really is.
11:49
Totally, and thank you so much for that.
11:53
I don't know how
11:55
we get towards a harassment literate society.
11:57
That's why I'm really grateful for podcasts
11:59
like There Are No Girls on the Internet. I've been really grateful
12:01
for a lot of the programming we've seen across different
12:04
kinds of civil society organizations. Been
12:07
extremely grateful for Teen
12:09
Vogue and its, like, existence over the
12:11
past few years. I
12:13
think a lot of this is really
12:16
sort of insisting on having nuanced
12:20
conversations. But I think a lot of it also actually
12:22
does come down to different
12:24
kinds of education, and
12:27
I don't know at what level that should
12:30
happen or exist. Like, for example,
12:32
with our organization, we don't really work a
12:34
lot with children for a variety of reasons. Some
12:37
of that is also just like regulatory
12:40
reasons. It's not one of our advocacy
12:42
focuses because children are like under
12:44
eighteen, they're just a different legal group
12:46
than adults. But
12:48
you know, I think there's something to be said around how
12:52
12:54
do we sort of talk about things
12:56
like digital harm and also offline harm.
12:59
But I think that this is also an issue that adults
13:01
run into as well. Like you
13:04
can't necessarily summarize,
13:06
I think often
13:08
or I think it's difficult to summarize a lot of harassment
13:11
cases into a tweet, right, to summarize
13:13
it in a few characters.
13:16
And I think in this sort of space of us
13:18
trying to maybe editorialize
13:22
or, you know, take a stance
13:24
on something that often
13:26
really collapses the
13:29
nuance of what we're talking about. I
13:31
also think that there's like a
13:33
binary in which people tend
13:35
to approach good and bad,
13:38
and then that sort of weighs
13:40
into harassment. I think that there's also
13:43
a deeper context here. So this is one of the
13:45
things I've noticed in some of the
13:47
trainings we do with folks that are like victims
13:49
that are facing harassment or have been
13:52
harassed. And
13:55
I've also noted this with folks that are, let's
13:57
say, friends of harassers. It can be very
14:00
destabilizing and difficult to
14:02
sort of come to terms with that someone you care
14:04
about might harm someone else. Folks
14:07
often I've found in my
14:09
experience, so this is, like, in my observations,
14:13
like they'll sort of reflect on their
14:17
own personal feelings about it,
14:19
right, and what that says about them.
14:22
And it's one of those things where I like to say,
14:24
harassment isn't complicated, it's
14:26
your feelings about the harasser that are complicated,
14:29
right, Like, those are the complexities. So
14:32
if you're a fan of Johnny Depp,
14:34
it's probably very difficult to
14:37
reconcile your fandom and
14:39
like this thing you've invested in, even
14:42
with clear evidence that
14:44
he has harmed someone, right, or
14:47
if someone is your friend. So then
14:49
what I've found is people try to look for logical
14:51
reasons, like, oh, what did the victim
14:54
do? There must be
14:56
a reason that this thing or
14:58
person or institution or country
15:00
I really like is doing
15:02
this. There has to be a logical
15:05
reason. And one of the things I like to
15:07
point out, too, is like sometimes there's not. It's
15:09
like the bitch eating crackers meme. Someone just
15:11
might not like that other person and
15:14
that's all that it takes, and they happen to like
15:16
you. And that is sometimes just
15:18
how harassment unfolds.
15:19
Right.
15:20
Other times people are seeking out a
15:23
weaker opponent to feel stronger.
15:25
You know, there's all different reasons. My
15:29
work in particular doesn't really focus
15:31
on the psychology of the harasser
15:33
or why they harass. That doesn't necessarily help me
15:35
help victims. What
15:38
helps me help victims is often
15:40
looking at like how they're being targeted, how
15:42
are systems being subverted to harm them,
15:45
how do we improve those systems,
15:47
and how do we also support victims in
15:49
a trauma informed way. So
15:51
I think this is where we're sort of seeing
15:54
this almost at scale. I
15:56
think the complexities here are
15:58
also in terms
16:00
of policy, they get very complex.
16:03
I would say in terms of is it a genocide
16:05
or not, that's not complex: it is a genocide.
16:07
But I think this is where people also don't
16:09
understand international
16:12
relations or international politics,
16:14
i.e., that the people of Gaza are
16:16
not Hamas, that they don't understand
16:19
how like aid work works, that they don't
16:21
understand different rules
16:23
of law and rules of war. In
16:25
terms of also how or how journalists
16:28
and the media are allowed
16:30
to engage and document, how
16:33
different spaces or sites
16:35
are targeted or not targeted, right,
16:38
how aid and food is allowed to flow
16:40
in and out. Like these are we
16:42
work in human rights. These are even areas
16:45
that are complex to me because I don't work
16:47
in like war torn areas, right, So
16:50
these are also very complex systems
16:52
to myself as someone who is a human
16:54
rights expert. And so I think there's
16:56
no space right now with how
16:58
Internet conversation has been defined
17:01
or how we're engaging in it to be able
17:03
to have a conversation. I can't give you
17:05
a one tweet description
17:08
or summary of this, like I have to give you
17:10
a long essay. And then this
17:12
also butts up against you,
17:14
know, then
17:17
different kinds of mis- and disinformation that
17:20
I think build on what Whitney Phillips calls
17:22
people's deep memetic frames, so beliefs
17:24
that you have and then you see those reflected
17:27
in other spaces. So if you've ever
17:30
like I'm from Louisiana, I very distinctly
17:32
remember Hurricane Katrina. You know,
17:35
I can understand how certain folks
17:37
in New Orleans and Louisiana and Mississippi,
17:39
where my mom's family is from,
17:41
like might have a slight
17:44
distrust of disaster relief
17:46
programs or trusting the government to like
17:48
handle disaster relief, right because people
17:50
have lived through many hurricanes and not seen
17:52
that go very well. So, like that's
17:55
a deep memetic frame of them saying like, oh,
17:58
can I trust an international body?
18:00
Right?
18:00
And while those are different things, you can see
18:02
how some of that mistrust
18:04
gets sown and how, like, mis- and disinformation
18:08
can sort of exacerbate
18:11
that at scale. So then how do
18:13
you have a conversation with someone who's
18:15
had lots of negative experiences with FEMA to then
18:17
be like, well, here's why actually you should trust
18:19
like this UN organization that you've never heard
18:21
of that only operates in
18:24
this one region. It becomes actually
18:26
very difficult to have those conversations.
18:29
And I think this is what I mean,
18:31
like these are the ripple effects of not living in
18:33
a harassment literate society, is
18:36
that we can't look at things just through a good
18:38
and bad binary. You have to look
18:40
at the context. You have to look at
18:42
impact versus intent. So like
18:44
what is the impact of someone's actions versus
18:48
like what they say their intent is. You
18:50
have to also look at I think, or
18:54
I think, really analyze and think very
18:56
deeply about one's own ego in
18:58
these situations, and like the
19:00
seeking of justice or like retribution,
19:03
that that sometimes results in
19:05
more and more and more harm. And
19:07
this is also very complex in which you want
19:09
to allow a space for victims to be
19:11
able to vocalize anger
19:14
and distrust and dissent, like they're like, you're
19:16
allowed to be angry. This also then conflates
19:19
or combats
19:21
our idea very like
19:23
I would say, Global North, Western, American
19:25
idea of like what a victim is. There isn't
19:27
one kind of victim. There's so many different kinds
19:30
of victims. There's no one way to
19:32
embody victimhood. And I
19:34
think that's also a complexity people don't
19:36
understand, which is how we then get into I
19:39
don't know if I believe that person or I would have done
19:41
this in this situation, And
19:43
the answer is like, you don't know what you would do in that situation
19:46
until you're in that situation.
19:49
And so that's also a very strange
19:52
and skewed way to view how an
19:54
individual or a group of people are responding to
19:56
trauma.
20:03
Let's take a quick break.
20:14
And we're back.
20:16
It reminds me so much of
20:19
what we saw around GamerGate in some
20:21
ways, and just this idea that
20:23
it's so much about, like
20:26
what's happening is maybe not so complex.
20:28
The feelings surrounding it might be
20:30
the complex thing. And then the
20:32
bit where we then don't
20:35
have a media that is informed
20:37
enough or harassment literate enough
20:40
to really lay out what's
20:42
happening, right, And so they try to
20:44
find a way of like reporting
20:46
along those binaries of like well maybe
20:49
they, maybe they really do just
20:51
care about ethics in games journalism,
20:53
and like maybe this woman really did deserve
20:55
it and have it coming, and maybe you know, maybe
20:58
women have taken things too far in their criticism
21:01
of gaming or whatever. Rather than actually
21:04
laying out like the sort
21:06
of complexities that you did and the nuance that
21:08
you just did, it's so much easier
21:10
to have it be this binary, simple
21:12
thing. And I almost wonder if
21:15
they're like giving us what we want in a kind of way,
21:17
because we want simple answers. Sometimes
21:19
we want like oh bad guy, good guy,
21:21
or here's the reason they did that, and what
21:24
we sometimes don't want is like, oh, well
21:26
it's actually a really complicated situation. Totally.
21:29
I mean, this is something I see and I
21:32
talk about a lot in my coaching
21:35
with different victims of harassment, is that
21:38
one of the very
21:40
unsatisfying things that comes out
21:42
of harassment. There's many unsatisfying
21:45
things. There's many awful things. There's many harmful things.
21:48
is that you might never get a why, and
21:50
the why might be something that
21:53
feels simplistic or
21:55
minimal or naive, or you
21:58
know, not even very well thought out.
22:01
And one of the things I
22:05
try to really coach people through is it
22:07
might also not be worth thinking about
22:09
why, Like it might actually,
22:12
you know, you are being targeted,
22:14
but it might not have anything to do with
22:16
you. And in terms of seeking the why,
22:19
like why this person is doing this,
22:23
that might not give you the peace of mind
22:25
or the closure that you're looking for. And
22:27
I think that's also I think
22:29
that's really hard for people. And I know
22:31
that, like when I have faced harassment,
22:35
sometimes that's even hard for me, even though I know that,
22:37
like I know that if someone
22:39
were to tell me why, it wouldn't
22:42
be it wouldn't be the answer that
22:44
I'm looking for. It maybe wouldn't solve
22:46
any of the conflicting feelings I'm feeling
22:49
about that particular instance.
22:52
And I think, you know, this is where it is sometimes
22:54
important to separate harassment research and
22:56
harassment like literature from let's
22:58
say international politics and
23:01
war and conflict. But I
23:03
do think it is at times
23:05
important to sort of understand. Like
23:08
the why can be incredibly
23:11
simplistic, in heavy quotes, like
23:13
I think a lot about my
23:15
partner's family. He was born in nineteen ninety one
23:18
in the Western Balkans, when Yugoslavia
23:20
was breaking apart, and like, what are
23:22
the whys there? There's a lot of
23:25
whys that Yugoslavia was breaking
23:27
apart. There's a lot of whys in
23:29
which different
23:31
groups were targeted, and a lot of those whys
23:33
are not satisfying, you know, because it comes
23:36
down to hate
23:38
and discrimination and Islamophobia
23:41
and like very entrenched
23:45
like ethnic ties and views of
23:47
a particular region. And that's an overly simplistic
23:50
way to even look at that region, right.
23:52
But that's also a why, and that doesn't explain
23:55
or excuse anything. And I also don't know if that
23:58
improves
24:01
what it feels like to have lived through that. That's
24:03
just an observation, right, And
24:05
I think sometimes you
24:07
know, like, sometimes with the most awful
24:09
things that happen, there is a simple
24:12
answer again, which is unsatisfying, which is just we
24:14
live in a society that is full
24:17
of bias and hate and power
24:19
differentials.
24:22
True, but I could see
24:24
how that's not the most satisfying
24:26
answer, particularly if you've been
24:29
in a situation that has like upended your life,
24:31
right? Like if you're like, oh, someone
24:33
got a picture of me at a protest and now I
24:35
am unable to live a normal life. Having
24:37
the answer be like, oh, well, we
24:40
live in a society that's full of hate, and
24:42
somebody identified
24:44
you as an easy target to pick on, and
24:47
that's what they did. I could see how that could be
24:49
unsatisfying. But it goes back to what you
24:51
said about sometimes the why is
24:53
not necessarily worth dwelling
24:55
on. Totally.
24:57
And this is something where I'd like to say, like that also
24:59
doesn't excuse the harm you're facing, Like
25:01
like you shouldn't be facing that, right, Like that's something
25:03
that no one should be exacting
25:06
or putting on to you for you to experience,
25:10
which I think is why Sam and I really
25:12
do focus on victims
25:15
and their care. What are the ways in which
25:17
systems, inadvertently
25:20
or you know, via their own design,
25:22
create space, create flaws
25:24
and vulnerabilities and spaces for harm
25:27
because I don't need to know the why to focus
25:30
on the victim. We don't need to know the why
25:32
to support someone and
25:35
even for us, I think, even if I were to
25:37
put on my research hat, the why isn't also
25:39
a satisfying thing for
25:42
me to seek personally, even in a research
25:45
context.
25:47
So you might be listening to this and think that it sounds
25:49
like I'm trying to dissuade you from showing up
25:51
to a protest IRL because you could find
25:53
yourself a target for this kind of harassment,
25:56
surveillance, and criminalization. But
25:58
Caroline and Sam don't see it that way. In
26:00
fact, they say that you should still make
26:02
your voice heard while also having a
26:04
good understanding of the realities of our surveillance
26:07
landscape and what's at stake. When
26:09
I reached out, I thought this conversation
26:11
was going to be a lot more simple than it ended up being.
26:14
And I think that is reflective of
26:17
the need to talk about this whole conversation
26:19
in a way that is surveillance literate, right, that like
26:22
really takes into account the full sense
26:24
of like why experts
26:26
like yourselves make certain recommendations
26:29
to people, how it connects to
26:31
a larger criminalization
26:34
and surveillance landscape when it comes to things
26:36
like protest and dissent. And so I
26:38
wonder, like do you think we live
26:40
in a surveillance literate society,
26:42
and do you think people have a
26:44
good sense of like how these things are all connected.
26:48
Oh,
26:49
Absolutely, absolutely not. We
26:51
do not live in a surveillance literate
26:54
society. I mean there's just
26:56
the capabilities of state and non-state actors
26:59
is not
27:01
really completely understood. I feel like people are
27:03
oftentimes protesters or
27:05
activists, shall we say, are oftentimes operating
27:08
with a knowledge base that's maybe like five,
27:11
ten, fifteen years even sometimes
27:13
out of date. Or
27:16
sometimes people will just be like, oh, well,
27:18
the adversary maybe the state
27:21
we'll say that's the adversary in this example
27:23
is so advanced that why
27:26
should I even bother? Like they can read
27:28
everything that I'm doing,
27:30
they can read my mind,
27:32
basically. They're just that
27:34
capable. Not the
27:36
case.
27:38
The opposition is oftentimes
27:42
not as sophisticated as we think.
27:45
And there are things that we can implement
27:48
best practices in digital
27:50
security and privacy that we can
27:53
implement that will keep us safer.
27:57
And it's like super simple
27:59
stuff. Use Signal with disappearing
28:01
messages for communications, like don't
28:03
just use text messages or phone
28:06
calls, like super
28:08
simple stuff like that will
28:11
keep people safer, like you
28:13
said in the beginning of this podcast, like don't
28:15
have a face unlock on your phone
28:18
or fingerprint unlock on your phone, because
28:21
the cops, if you are arrested, can just
28:23
take the phone and hold it up to your face and then unlock
28:25
your phone.
28:27
Stuff like that. But backing
28:29
up a little to surveillance.
28:32
You know, the capacity for
28:34
surveillance has really
28:37
increased with the proliferation
28:40
of AI-assisted facial
28:42
recognition or you
28:45
know HD cameras that can do
28:48
like a thousand X zoom
28:51
and still have like four K quality,
28:55
or even like we could talk about
28:57
all the different optics that are on like helicopter
28:59
cameras, police helicopter cameras that can
29:01
like see in thermal, in infrared,
29:04
in different heat signature patterns
29:06
in you know, movement
29:09
outlines and stuff like that, to
29:12
where they can also like zoom
29:15
in to where they can essentially see
29:17
a protester's screen on their phone
29:20
from a helicopter.
29:21
That type of
29:21
stuff, or just
29:24
like super basic stuff that's been around for
29:26
like decades, like police pole cams that
29:28
they'll just like stick somewhere and people just
29:30
won't notice, or ALPRs,
29:33
automatic license plate readers to kind
29:36
of map, like where people are driving around
29:39
all of these things. They exist, they're in
29:41
the real world, and they're used by
29:44
law enforcement on a daily basis
29:46
almost. And this is just like the physical
29:48
world that we're talking about.
29:49
This isn't even the digital world where we're
29:52
talking about, like, them
29:54
having the capabilities of seeing
29:57
all of your social media accounts interlinked
30:00
and essentially
30:03
pulled together in one little package
30:05
deal that is maybe like this is Sam,
30:08
Like this is his Facebook, this is his
30:10
Instagram, this is his Tumblr, this is his MySpace
30:12
from fifteen years ago. They
30:15
have these capabilities, but
30:17
we can do things to
30:20
prevent them from having fuller
30:24
access to them, and we can make
30:26
things private. We can have burner
30:28
accounts, burner emails, burner phone numbers,
30:31
things that aren't maybe directly associated
30:33
with us.
30:33
And there again, this
30:35
does boil back down to best
30:37
practices in digital security
30:39
and privacy.
30:40
Yeah, I have heard the feeling
30:43
like, oh, the state they have everything
30:45
about me? Why bother? I've heard that described
30:48
as like a kind of nihilism, like
30:50
surveillance nihilism, where you're like it doesn't
30:52
even matter anymore. I
30:55
like, I get it, because
30:57
the list of ways the state surveils
31:00
us is vast, and I could see, I
31:02
totally understand that as a reaction,
31:05
but Sam, you're
31:07
so right. There are still
31:09
things we could do, basic steps that
31:11
we could take to maintain
31:14
privacy, and I think the state,
31:16
it kind of, like, is counting on us being
31:18
like, oh, well, there's no point in doing any of this. I
31:20
may as well just leave my phone unlocked. I may as well just
31:22
like not wear a mask. I may as well just whatever,
31:25
because they already you know, they already have everything.
31:27
It is.
31:27
It is. There are things you can do, and the state
31:30
totally benefits from us thinking that
31:32
that's not true, that there's nothing we can do, despite
31:35
the fact that I do understand that as a reaction
31:38
totally.
31:38
And I think there's something else also here
31:41
too, where I want
31:43
to highlight: what I'm about to say sounds contradictory,
31:45
and I'm going to try my best to say
31:48
it in a way with nuance. Lay
31:50
it on us. The
31:53
state is very powerful, and I think also
31:55
at times people overestimate
31:58
or don't totally understand how the tools
32:00
of the state work, so they have a lot
32:02
of data on us. But also
32:05
like there's a lot of things that surveillance
32:07
technology can't do that I think
32:10
often people misunderstand that
32:12
it can do. And I
32:15
think that isn't to
32:17
say we should all take a sigh of relief,
32:20
you know. I think we should all be like breathing in and out constantly,
32:24
you know, trying to quell different
32:26
levels of anxiety. And I say this as a highly anxious
32:29
person.
32:29
Same.
32:31
But one of the things I think is important,
32:33
and I'm thinking about this in like a protest context,
32:36
is you know, like the state has a
32:39
lot of data on a lot of people, as
32:41
we learned from
32:44
like Edward Snowden and
32:47
like the PRISM program, for example. But
32:49
also there's like so much data that at times
32:52
it can be hard to identify people. And that's
32:54
the other end of security nihilism that I see
32:56
is people being like, I'm just a drop of data
32:58
in a very large data bucket. And
33:01
for me, it's like, well, you never know when
33:03
the state is going to decide to look at you. And
33:06
if you have any marginalized
33:08
identity, then you know already that
33:12
bad actors of the state, like a local police
33:14
force, they don't need very much to decide
33:17
to look at you. In fact, they need almost nothing
33:19
to decide to arrest you to like,
33:22
you know, fuck up aspects of your life, et
33:24
cetera. And that's only getting worse
33:26
and worse now, especially
33:29
in the United States. If we look at the overturning
33:31
of Roe v. Wade, of attacks on gender affirming
33:34
care, like it's so difficult to
33:36
sort of just do things we should be able
33:38
to do. One of the things I want
33:40
to highlight, this is something I think we've seen a little
33:42
bit in our training, is people sort
33:45
of, at times,
33:47
assuming that, let's say, like, police
33:49
surveillance is magical,
33:52
or that like AI is magical.
33:55
It's not. There are things we actually can do to
33:57
be safe. And I think that's
34:00
like what I want to sort of emphasize.
34:02
This is like the space of nuance, Like we should
34:04
be afraid, and also there are things we
34:06
can do to try to like mitigate
34:09
and reduce these harms and be
34:11
safer, and we
34:13
should do those things. We should especially
34:15
do those things, and we should do those things if we're also
34:18
in community with marginalized
34:20
community members. So like I'm
34:24
a white, non-binary person, you
34:27
know, I was born in the United States. For
34:29
me, going to a protest unmasked
34:33
even though it's COVID, so actually you should all be wearing
34:35
masks out there anyway. But
34:37
like if I were to go hypothetically unmasked,
34:40
which I wouldn't do, Like
34:42
me getting arrested is very different than someone else getting
34:45
arrested. But I shouldn't be worried just about
34:47
me. I should be worried about the people
34:49
I'm standing with, right, And I should
34:51
be worried about how I've saved those
34:53
people in my phone, And I should be worried
34:55
about what kinds of things can
34:57
be done with just a little bit
35:00
of like nudging from the police.
35:02
Right.
35:03
And this is where understanding
35:06
security measures and privacy measures
35:08
and having good what Matt Mitchell
35:10
calls like security hygiene and digital hygiene
35:13
is really important. Right,
35:15
So like disappearing messages, using Signal,
35:17
not bringing my phone at all to a
35:19
protest, these are really helpful
35:22
things versus going
35:24
to a protest and thinking you need like
35:27
a Faraday bag and a mic jammer
35:29
and all these things. But then let's say
35:31
you're connecting to public Wi
35:33
Fi and you're messaging and you're
35:35
posting on Twitter, like
35:38
that has almost negated the other
35:40
things. You also don't necessarily need something like
35:42
a mic jammer if you
35:45
like, you know, if you don't have malware on your phone.
35:47
Right.
35:47
So I think that there's these things where people sometimes
35:50
sort of overestimate
35:52
the capabilities of
35:54
some of the tools that the police have and
35:57
then underestimate how little
36:00
evidence the state needs to
36:02
seize your devices or be able
36:04
to engage
36:06
with your phone. And so I think it's this
36:08
balance of understanding where does, like, security
36:11
on the device stop, and
36:13
like good protest
36:16
tactics start, like my good
36:18
physical security, right, And
36:20
I think these are very much intertwined.
36:23
And I think that's also another level of complexity
36:27
we've been dealing with. Like we've also
36:29
worked with activists who are really nervous
36:31
and like want to put their phone in the freezer
36:33
or the microwave. And you know, that's
36:35
something too where it's like I totally hear that
36:38
you're scared if you
36:40
want to do that, Like I
36:42
can understand how that makes you feel
36:44
safer. But the issue more is, like,
36:47
let's talk about what would be happening
36:49
on your phone that would cause
36:51
that to happen, and
36:54
also that there's an even
36:56
more serious question of that means there's something on your
36:58
phone that is listening in all other different
37:00
instances, and so if
37:03
you're really worried about that, then the problem
37:06
is something we need to look at on the phone.
37:08
Right.
37:08
The problem is spyware or malware
37:11
on your phone, right, And
37:14
that's something we need to deal with immediately.
37:17
And I think that's also something that I think
37:20
is very difficult to understand or
37:24
work through if you aren't a technology
37:26
expert. And that sucks, because you
37:29
know, we want people to have the
37:32
most accurate and
37:34
like salient information as possible, and
37:37
I think a lot of the barrier to this
37:39
is having to be constantly updated with how
37:42
technology works, what are the different kinds
37:44
of tools and apparatuses of the state, and
37:46
then how do you get that information into
37:48
people's hands.
37:54
More after a quick break,
38:05
Let's get right back into it. Yeah,
38:10
I'm almost cringing a little bit because I definitely
38:12
went to a meeting where we
38:15
all put our phones in the freezer because somebody
38:17
saw it in the Snowden doc. And then I definitely
38:19
like logged onto Starbucks WiFi.
38:21
I had a whole conversation about what we talked about
38:23
via Starbucks WiFi,
38:26
via just SMS messages, like oh well,
38:29
like, it
38:32
felt cool to put our phones in the freezer,
38:34
but I took no other security steps
38:36
that day, and in fact did things that were
38:38
like risky that you know, a little
38:41
common sense, basic considerations
38:44
would have probably been more effective
38:47
or made a bigger impact on
38:49
my security than, like,
38:52
let's all put our phones in
38:54
the freezer. And I think, Caroline,
38:56
you made a point about people
38:58
not being tech experts. I wonder,
39:01
you know, in twenty twenty four, with the ubiquity
39:03
of smartphones and facial recognition technology
39:05
and doxing and like all of these
39:08
technological innovations and advancements,
39:11
do you think that folks feel
39:14
a barrier to protest or
39:16
to speak up and use their voices because
39:18
it seems like, well, this is so much, this is
39:20
so opaque. Some of the guides
39:23
and the way that people talk about technology and
39:25
security feels not
39:27
accessible. So I'm just gonna like not show
39:29
up because who has the time? Like I wonder, is
39:32
that something that you've seen in your work?
39:33
So I haven't personally seen that. And
39:36
some of that I will highlight might be from where
39:38
we sit as an organization because
39:41
we tend to work with other
39:43
community organizations. We
39:45
tend to work with human rights defenders,
39:47
journalists, activists, and
39:49
then members of the general public who
39:52
like want to engage with this knowledge.
39:54
So I would say that like, I haven't seen
39:57
necessarily a deterrent. What
39:59
I have seen is people showing up but still
40:01
being scared and or
40:04
showing up and then something bad happens
40:07
and they weren't prepared,
40:10
let's say, for
40:12
what was happening. And I want to highlight,
40:14
that's not their fault. Like, we live under
40:17
surveillance capitalism, how software
40:19
and hardware has been designed is not the fault
40:22
of the user or the vulnerable individual.
40:25
That's the fault of capitalism
40:27
and big tech, and, like, the lack of
40:30
regulation we have in the United States,
40:32
right, and so I think luckily
40:35
at least you know, Again, also we're speaking
40:37
from a very specific sort of space, Convocation,
40:41
our lab, in which we are often
40:43
engaging with people that are you know,
40:45
human rights defenders, journalists, activists,
40:48
community organizations. So we
40:50
haven't seen a hesitation.
40:53
But what we have seen is people recognizing
40:55
that there
40:58
are, like, skills and tools that
41:00
they don't have, and they're
41:02
seeing that also in real time. I think right
41:04
now because of the amount
41:07
of doxing attempts that are
41:09
happening on social media, and
41:11
so I think that's weighing in. People are now
41:14
a bit more aware. You know, there's
41:16
been many stories, as you know, Bridget, as you pointed
41:18
out, with billionaires renting buses
41:22
and putting people's faces on them or putting
41:24
people's names on them. And I think that
41:26
is, you know, causing folks to reflect and be
41:29
like, oh, like that could be me, and I still
41:31
need to show up. I still want to show up,
41:33
but how do I create
41:36
or maintain some safety knowing that
41:38
that is a potential outcome. So that's
41:40
a little bit more of I think
41:42
what we've been seeing.
41:43
I started this conversation wanting to use
41:45
the focus or framing of like campus
41:48
protesters, campus activists, and what's happening
41:50
on campuses right now. But how have
41:52
you seen these same tactics being used to
41:54
target anybody, like people who are not necessarily
41:57
protesters, who maybe work for companies
41:59
that are, like, deemed too woke
42:02
or in some way have been
42:04
like perceived as ideologically
42:06
against the bad actors who are doing
42:08
the doxing. Like, is this the kind of threat
42:11
that really all of us might
42:13
need to be aware of, whether or not we've ever set foot
42:16
on an IRL protest on a campus.
42:18
I would say so. I would
42:20
say, with how you know, regardless
42:23
of the country you live in, with
42:26
how politics have been going over
42:28
the past few years, it is always
42:30
good. It's very important
42:33
to be thinking about your own
42:35
digital footprint, data that's out
42:37
there about you, and how it can
42:39
be used or misused to harm you. I
42:42
think that that's something we
42:44
all now especially
42:46
really need to be thinking about.
42:49
I don't want people to listen to this episode and
42:51
think every threat that we have
42:53
discussed is something that they personally
42:56
like, are likely
42:59
to become a target for. Like, I don't
43:01
want people to be paranoid. I want people to be informed
43:03
and smart, right, And I guess, like, in
43:05
your work, how have you prepared
43:08
people to understand, like, their
43:10
specific needs in this
43:12
whole conversation.
43:14
Oh gosh, I think this is one of the hardest things. This
43:16
is where this is why we do I
43:18
think really targeted workshops. But Sam,
43:21
do you want to weigh in?
43:21
The answer is
43:24
threat modeling.
43:25
Oh, you
43:28
were going to say that, I was going to say that,
43:30
Sam is going to say it.
43:31
Well, we all have different
43:33
threat models, right, and what Caroline's
43:36
threat model is going to be way different than mine, and
43:39
what Bridget's is going to be way different
43:41
than Caroline's. Threat modeling is
43:43
the process of trying to figure
43:45
out the possibility
43:48
or probability of a threat
43:51
that the person
43:54
may encounter and
43:56
weighing
43:58
the consequences of that threat
44:00
against how realistic
44:03
it is that it will happen or will not happen. And
44:06
I think that people can
44:10
try to figure out what
44:12
their personal threat model is by
44:16
considering who the bad
44:18
actors that they may encounter are
44:21
and what their capabilities are, and
44:23
then what they're trying to protect.
44:26
So that might be student protesters
44:29
that want to keep their anonymity,
44:32
and so maybe they will not bring
44:34
their phone to a protest, or maybe they will wear
44:37
a mask or cover their tattoos in
44:39
order to not be identified physically.
44:44
So that's just like one example of
44:48
a threat posed to campus
44:53
activists. And this
44:55
threat modeling process can really be used
44:59
for all aspects of life. You
45:02
can threat model everything in
45:04
your daily life.
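(As a rough, hypothetical sketch of the weighing Sam describes, score each threat by how likely it is and how bad its consequences would be, then spend your energy on the highest scores first. The adversaries, assets, and numbers below are invented for illustration, not from the episode.)

    # Hypothetical threat-modeling sketch: rank threats by likelihood x impact.
    from dataclasses import dataclass

    @dataclass
    class Threat:
        adversary: str   # who might act, e.g., a doxer or local police
        asset: str       # what you are trying to protect
        likelihood: int  # 1 (rare) to 5 (expected)
        impact: int      # 1 (minor) to 5 (life-altering)

        @property
        def risk(self) -> int:
            return self.likelihood * self.impact

    threats = [
        Threat("doxer scraping photos", "anonymity at the protest", 4, 5),
        Threat("police phone seizure", "contacts and messages", 2, 4),
        Threat("nation-state malware", "everything on the device", 1, 5),
    ]

    # Address the highest-risk threats first.
    for t in sorted(threats, key=lambda t: t.risk, reverse=True):
        print(f"risk={t.risk:2d}  {t.adversary} -> {t.asset}")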
45:05
I mean, just to build on that. Like one thing I
45:07
want to highlight is we all threat model every
45:09
day, as Sam is saying, like you
45:12
can use it in your everyday life, and you do, Like
45:14
when you decide to cross the street not
45:16
at the light, you are threat
45:18
modeling right when you're sort of making
45:20
a decision around how
45:23
you're going to get
45:25
home or go somewhere. That
45:28
is like a form of threat modeling. And
45:30
I think
45:33
this is what gets
45:35
really tied into then trying
45:37
to understand a bit more about surveillance
45:39
literacy and like security literacy and
45:41
privacy literacy is really important. Threat
45:43
modeling is incredibly important.
45:47
It's how you can decide, you know,
45:49
it's how you can maintain some safety.
45:52
What I think is harder, what
45:54
the challenge is, is also helping folks
45:56
feel safe and secure around
45:59
threat modeling. And some of that now
46:01
comes down to how do we understand like
46:04
the tactics and tools
46:06
of our adversaries. That's not understanding
46:09
their why, right, the why doesn't
46:11
matter, but it's understanding like what are they using?
46:14
And so I think for folks
46:18
that are going to actions, please
46:20
go. Please try to
46:22
wear something that sort
46:24
of helps anonymize you. So
46:26
I would not recommend wearing your
46:29
you know, Cal
46:32
State, the year you're graduating,
46:34
shirt, or
46:38
like your really awesome jacket that
46:40
you made that's one of a kind. I
46:43
would recommend wearing something that's a little
46:45
bit more plain. I
46:47
would really try to cover
46:49
your face, cover your tattoos. I
46:52
would recommend leaving your phone at home
46:55
if you feel comfortable doing that, you
46:59
know, I think this gets into a different space
47:01
if we're talking about folks that are
47:05
either there to observe
47:08
or document. There's
47:11
a lot of great guides out there on how
47:14
to document a protest
47:16
safely. Some
47:18
of that, you know, we point to
47:20
our friends at OpenArchive
47:23
and Witness who have really great guides written
47:26
for human rights defenders on how
47:28
to document actions and safely upload
47:30
them. We recommend following that, you
47:33
know, please check out different safety
47:35
guides that have been put out, you know, pretty
47:37
recently from the EFF or The Markup in
47:39
terms of how to stay safe at a protest. We're
47:42
updating
47:44
and creating a new anti-doxing guide, and we're also
47:46
putting out guides hopefully soon. But
47:49
a big thing is, you know, maybe don't bring your phone with
47:51
you, and if you have an iPhone and you need to bring
47:53
your phone, consider turning it off,
47:55
putting it in Lockdown Mode, and turning it off.
47:57
Yeah.
47:58
We'll actually hear from an EFF technologist
48:00
who wrote that guide next week about some concrete
48:03
tips for digital security at protests. So folks
48:05
should definitely tune in. But are you
48:07
all working on any guides that folks should know about.
48:10
One thing is like, you know, if
48:12
you turn your phone on and you've brought it
48:14
with you at the protest and you're like tweeting
48:17
about where you are and you're taking a photo
48:20
of where you are, that might negate
48:22
a lot of the safety tips that you've already
48:25
gone through, right, And so one thing to consider
48:27
is also like can you put
48:29
that out later? Can you scrub metadata from it?
48:33
Are you accidentally doxing your
48:35
fellow protesters? Like, are
48:38
their faces covered? You can use
48:40
Signal's face-blurring tool, which they
48:42
have. You can blur out people's
48:44
faces. That
48:46
also helps strip the image of metadata.
48:49
So there's like all these different things you
48:51
can do, and I think it's worth doing those
48:53
and going to the protest.
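(For instance, here is a minimal, hypothetical sketch of scrubbing metadata from a photo before posting, using Python's Pillow library; the file names are made up. Rebuilding the image from its raw pixels drops EXIF fields such as GPS coordinates and capture time.)

    # Hypothetical example: strip EXIF metadata (GPS location, timestamp,
    # device model) by copying only the pixel data into a fresh image.
    from PIL import Image

    original = Image.open("protest_photo.jpg")       # made-up file name
    clean = Image.new(original.mode, original.size)  # blank image, no metadata
    clean.putdata(list(original.getdata()))          # copy pixels only
    clean.save("protest_photo_clean.jpg")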
48:54
I think some of this is
48:55
also shifting our own
48:57
concepts of what it means to sort of document
49:00
and protest safely.
49:02
And I think some of that is
49:05
recognizing that, like, going to the protest is
49:08
almost more important than
49:10
publishing an image that you were there. I
49:13
think there's other ways to talk about being there, And
49:16
that's not to discourage people from posting,
49:19
but rather it's to say threat
49:21
model and really think about what's in this image?
49:24
You know, what does this image reveal?
49:26
Right?
49:26
What does this image reveal about me? Does
49:28
it reveal about other people? Could
49:32
someone be identified? Who
49:35
is you know, who I'm protesting
49:37
with? How could this negatively
49:39
impact them?
49:40
Right?
49:41
I think it is thinking about some
49:43
of those things as we engage
49:45
in collective action.
49:49
And so those are like some of the tips I want
49:51
people to like think about, which is, you
49:54
know, make a plan with your friends later of when you're
49:56
going to meet up and pick a time and make
49:58
sure you know how to get there. And if
50:01
you don't show up in a certain amount of time, maybe
50:04
that's a signal to them that something has
50:06
happened, and like have that conversation. And these
50:08
are like really safe ways to still go about
50:11
and engage in this
50:14
really necessary, I would argue, like, civic
50:16
action that we need to be engaging
50:18
in, and that's just a
50:20
really great way to make sure
50:23
your phone
50:25
isn't, you know, contributing to
50:30
this ongoing surveillance
50:32
apparatus, or just turn
50:34
your phone off and don't turn it on until
50:37
it's over.
50:38
Caroline, Sam, thank you so much
50:40
for being here, and like truly thank
50:43
you for your work. We need folks like
50:45
you who are making it easier and
50:47
safer for everybody to use their voices
50:49
right now. So I hope this gives people
50:51
a sense of how they can do that. Thanks
50:53
for being here. If
50:58
you're looking for ways to support the show, check out
51:00
our merch store at tangoti dot com slash
51:02
store. Got a story
51:04
about an interesting thing in tech, or just want to say
51:06
hi, you can reach us at hello at tangoti
51:08
dot com. You can also find transcripts for today's
51:11
episode at tangoti dot com. There Are
51:13
No Girls on the Internet was created by me, Bridget Todd.
51:15
It's a production of iHeartRadio and Unbossed Creative,
51:18
edited by Joey Pat. Jonathan
51:20
Strickland is our executive producer. Tari Harrison
51:23
is our producer and sound engineer. Michael
51:25
Almado is our contributing producer. I'm your
51:27
host, Bridget Todd. If you want to help us
51:29
grow, rate and review us on Apple Podcasts. For
51:32
more podcasts from iHeartRadio, check out the iHeartRadio
51:34
app, Apple Podcasts, or wherever you get your podcasts.