Episode Transcript
0:05
Hey, this is Annie and Samantha and welcome
0:07
to Stuff Mom Never Told You, a production of iHeartRadio.
0:18
Today we are once again joined
0:21
by our friend and coworker,
0:23
Bridget Todd. Thank you so much
0:26
for being here, Bridget. It's always a delight, always
0:28
a delight for me. I'm so happy to be back. Yes,
0:31
And we were talking before
0:34
this about how important
0:37
the topic that you brought to us today is and how much we've
0:39
been wanting to talk about it, and because I can't help
0:42
myself, I immediately thought
0:44
about there's a new genre
0:47
of horror movies about this very
0:49
thing, and I actually find
0:52
them quite disturbing because there's
0:54
a sense of the like because
0:56
we're talking about online harassment today, and
1:00
these horror movies have this sense of like
1:02
you can't control
1:06
what happens on
1:08
the internet, and that like there's this sinister
1:11
force that is going
1:13
to wreck your life and
1:16
no one takes you seriously, which is how
1:18
a lot of horror movies do go, especially when you're
1:20
talking about women, like, you know, raising the alarm,
1:23
there's a problem here, and people are like, I
1:25
don't see the problem or get over it,
1:28
but I do. That was one of the
1:30
first things when you sent
1:33
this topic to us, and that was where my brain
1:35
went. It's like, oh, yeah, I've seen these horror
1:37
movies and they actually legitimately
1:39
freaked me out. Well, I saw that movie friended,
1:42
or maybe it's called Unfriended. Unfriended,
1:44
it's Unfriended. Where there's nefarious
1:47
stuff happening on the computer on
1:49
the internet and you cannot control
1:52
or escape it. That movie scared the crap out of me. I mean.
1:54
It's funny that you bring this up because it is a really interesting
1:56
parallel. I've never thought about before sort
1:59
of how the horror movies really
2:02
capitalize on this phenomenon
2:05
that a lot of women online experience where it's
2:07
something bad is happening online and no one
2:09
is taking me seriously, um,
2:12
and it can really get out of control. I hadn't sort of
2:14
thought about how films have really captured
2:16
that. Yeah, and I
2:19
was thinking on the ones
2:21
that I've seen, which admittedly is only you know,
2:23
a handful maybe ten. I
2:26
love horror movies and they almost
2:28
always, I can't think of an exception, have
2:31
a woman at the heart of it who
2:34
just gets harassed
2:36
and her entire life ruined and no
2:38
one believes her. Um
2:41
and we'll like go to authorities
2:43
or even go to like in some cases maybe
2:47
a professor or like a mentor and
2:49
they're like, uh, don't
2:51
really believe you. That can't be that big
2:53
of a problem. Get
2:56
over it. And because
2:58
as we've spoken about, horror
3:00
is a great reflection of
3:03
current societal fears,
3:06
and so it's interesting to me that we do see
3:08
more and more of those types of movies. But
3:11
we're not actually talking about the
3:13
horror movie aspect today. We're talking
3:16
about real life
3:18
instances of online
3:20
harassment with women and marginalized
3:23
communities. Yes, that's right, although
3:25
I do agree that, like, movies
3:28
and film do a great job of representing this,
3:30
but it is a really scary
3:32
topic. It's great fodder for a horror movie.
3:35
You know, we have seen this wave of women facing
3:37
harassment online for just doing
3:39
their jobs in some cases like journalists, or
3:42
just for existing online at all.
3:44
And I want to be very clear that I don't
3:46
want to scare anyone. This can
3:48
be kind of a heavy topic. Hopefully it won't be
3:50
too heavy during this conversation, but you
3:53
know, it is, it is a reality,
3:55
and so I think that as women as queer folks,
3:58
as trans folks, as marginalized people. The
4:00
Internet is rightfully our domain,
4:02
right, Like, that's the main thing that I want to drive home.
4:05
We have been so instrumental in
4:07
setting up the Internet as it exists today,
4:09
So it is our domain, and it's our right
4:12
then to show up or not show
4:14
up online however we please. And so I
4:16
think that all of us should show up as
4:18
our full, colorful, wonderful selves
4:21
online, but we should still acknowledge
4:23
some of the risks that it involves and
4:26
some practical steps that we can take to keep
4:28
ourselves safe online while we're saying all
4:30
the things that we want to say and showing up in the ways that
4:32
we want to show up. And so it can
4:34
be tough because let's keep it real.
4:37
Like, nobody likes talking about harassment.
4:39
When I was doing research on this, almost
4:41
every woman I spoke to was like, Oh, I don't really like
4:43
talking about it because people think you're
4:46
a whiner. People are like, oh, you know, you're
4:48
taking it too seriously, like, oh,
4:50
you know, what a baby, grow up. But that
4:53
silence, that kind
4:55
of expectation that we just suck it
4:58
up and, you know, deal with it and not
5:00
speak up about it, means that we're all kind
5:02
of suffering in silence alone. And
5:04
so as much as it's a drag to talk
5:06
about, we should definitely be talking about
5:08
it. Yeah, And I
5:10
think when
5:13
we've talked about like, I
5:15
know, this is a bigger conversation for a separate
5:17
episode, but when people, you know, shout that First Amendment
5:19
thing, and I always remember this
5:21
argument I read that, well, imagine
5:24
that you're somebody trying to speak and there's a thousand
5:26
other voices yelling at you, trying to shut
5:28
you up. And then what,
5:31
we're silencing people? As you said,
5:34
and just the idea
5:36
that, as you said many times, Bridget
5:38
when you've been on here, that the Internet
5:40
is real life, Like this is
5:42
not just something that you can dismiss
5:45
or suck it up and get over it. It impacts
5:48
your every day and it
5:51
can be very hurtful and
5:53
very scary. And recently
5:55
we've seen some pretty high profile cases
5:58
of women being
6:01
forced off social media exactly.
6:04
I mean, that's exactly what you saw with Chrissy Teigen.
6:06
Chrissy Teigen, she left Twitter in March.
6:08
She said that the platform no longer
6:10
served her positively. And this was a
6:12
big deal for a few reasons. First of all, Chrissy
6:14
Teigen, people used to call her like the
6:17
mayor of Twitter. She was such a prolific user.
6:19
If you use Twitter, you definitely probably saw
6:21
her viral tweet. She was very funny,
6:24
and she also was pretty embedded
6:26
in Twitter. You know, Twitter as a
6:28
company. She spoke at one of their
6:30
company events. You know, she was pretty buddy-buddy
6:33
with them. And so the fact that she
6:35
was this powerful power user
6:37
who was so prolific on the platform,
6:40
and that she didn't feel like the platform,
6:42
you know, really served her positively,
6:45
really indicates the scope of the problem.
6:47
Right, this is probably their most high-profile
6:49
VIP user, and that user was
6:52
not able to have a positive experience on
6:54
the platform and had to leave that platform.
6:56
For me, that really indicates that something
6:58
is broken with your platform. And
7:01
to be clear, the kind of harassment that she was
7:03
facing, it's not just people being
7:05
like, oh, I don't like your cookbook, or
7:07
oh this joke is not funny. It
7:09
was pretty intense. She had QAnon
7:11
followers constantly connecting her to
7:14
baseless allegations of child abuse, as they
7:16
do. Um, and this is especially awful considering
7:18
that she was publicly dealing
7:20
with a miscarriage that she talked about on Twitter,
7:23
So pretty abhorrent behavior,
7:26
you know, And I think there's
7:28
a pretty big difference between criticism
7:31
of someone who's a public figure or a celebrity and
7:33
the kind of harassment that she faces
7:35
that frankly nobody deserves, right.
7:38
I know, there was a definitely back and forth of the
7:40
damned if you do, damned if you don't with
7:43
her, whether it's because she did want
7:45
to share the tragedies she went through
7:47
as a way of saying you're not the only one
7:50
you know, and showing her heartbreak, being very
7:52
honest, to being told
7:54
that that's being too public, or she's doing this for sympathy,
7:56
garnering sympathy versus you know, you're
7:58
not saying enough. It was this weird back and forth
8:01
of damned if she did, damned if she didn't.
8:04
And it's also kind of the same thing, that there
8:06
were other people
8:09
in here that were, I guess the best
8:11
word is, siccing people, all these other
8:13
people, on her. And it's amazing, I know, just like the
8:15
journalist who was the intern, her
8:18
first article, and it really just was, we
8:20
investigated this. Myself and two other reporters
8:23
investigated this. Here's the article. I'm so
8:25
proud. To then have someone going after
8:27
her and making her a target, the amount
8:29
of harassment that she went through. And she wasn't
8:31
even saying anything on Twitter, but the amount
8:34
of vitriol and harassment she got
8:36
was unreal. To be honest
8:38
with you, that specific incident was what
8:41
prompted me to do an episode
8:43
of my own podcast, There Are No Girls on the Internet, about
8:45
harassment. So the person that you're talking about
8:47
Brenna Smith, she was an intern at USA Today.
8:49
She tweeted that she had her first byline on
8:52
a story about how defendants
8:54
in the Capital Insurrection were using
8:57
all of these sort of backdoor approaches to
8:59
crowdfund for their defense. And it was a fairly
9:01
like standard piece. It was not an opinion
9:03
piece or an op ed. It was a standard reported piece
9:06
about what's happening. It was a co-byline,
9:08
her and another woman reporter
9:10
whose name escapes me, and then a white male
9:13
reporter, Will Carless. And so she tweeted,
9:15
so, so happy about her first byline as
9:17
an intern. And I think it was Glenn
9:19
Greenwald who retweeted it to his millions
9:22
of followers to basically say
9:25
that she should be ashamed of herself,
9:27
you know, just a really mean tweet.
9:30
And to be clear, if he had a problem
9:32
with that article, there is a way to talk
9:34
about the problems that you have with
9:37
a piece that that a journalist writes in
9:39
a way that is not harassment. It's
9:41
so different to retweet
9:44
an intern, you know, the power
9:46
imbalance alone there, I think, is worth
9:48
noting. It's so different to amplify
9:53
your really harsh personal
9:56
swipe at somebody who really doesn't
9:58
have a lot of power. And yeah,
10:00
that when I saw that, it kind of broke my heart because
10:03
I remember what it was like to be an intern.
10:05
I remember what it was like to have to
10:07
feel like you didn't know what you were doing, and to
10:09
have people not take you seriously, and
10:12
that would have broken me. And to
10:14
her credit, I was so happy to see so many
10:16
reporters, including folks like Ronan Farrow,
10:19
you know, supporting her and being like great job,
10:21
like good work, welcome to the journalism club, because
10:24
it would have really really hurt
10:26
me if a high profile
10:29
reporter with a big platform amplified
10:33
this kind of attack against me because they
10:35
didn't like something that I wrote. And I'm so glad that you brought
10:37
that up, because that was such a, I
10:40
don't know why, but that incident
10:42
really stuck with me, and really made me want
10:44
to talk about harassment and sort
10:46
of point out all of the things that women
10:49
online are putting up with that we just didn't have
10:51
to. We shouldn't have to put up with this to do our jobs.
10:54
It definitely spoke to the fact that they weren't
10:56
criticizing her work. They just were coming after
10:58
her. Half the people had no idea why they
11:00
were mad other than, oh, you're
11:03
calling out those of us who were in support
11:05
of the insurrection, which, by the way, yeah, we should
11:07
be criticizing you. You were in
11:10
something bad and she said you're doing something bad. And all
11:12
she really did was investigative journalism
11:14
where they went down like different money
11:16
trails to see what was happening in the story,
11:19
and then they were harassing her looks, harassing
11:21
just her being a woman being around. The amount
11:24
of just hate and anger that came
11:26
onto her, which made no sense because it was not
11:28
any critique of her work. It literally
11:30
was just, you, I don't like you, you're an
11:33
awful person. Of course, we're putting this more nicely
11:35
than what was on there, because we know the whole
11:38
range of harassment that happens, especially
11:40
when it comes to the threats and the death threats that
11:43
happen to people who are
11:45
just doing their job, and essentially her
11:47
doing her first piece, really
11:49
being proud of the fact that she got something out there
11:52
and doing something that was investigative, which seemed
11:54
fairly straightforward as
11:57
a piece. Like, it's just unbelievable. Yeah,
11:59
it was really upsetting to see. And she's not
12:01
alone. Unfortunately. According to a survey
12:04
from the United Nations Educational, Scientific,
12:06
and Cultural Organization, seventy-three percent
12:06
of women journalists that they surveyed
12:11
say they have experienced online violence
12:13
in the course of their work. And then it goes even
12:15
further because some of those
12:17
women journalists who participated in this survey
12:20
say they have been attacked offline in
12:22
connection with online violence targeting
12:24
them. So absolutely right, I am so
12:27
sad to report that what happened to her is
12:29
not an isolated experience and that
12:31
for so many women journalists
12:33
that online harassment, that that wave
12:36
of online hate is then connected
12:38
to offline, real
12:41
world attacks or offline threats,
12:43
and so it really is something that is
12:45
not just happening online, even though it's
12:47
very tempting to think that it is. It's something that
12:49
has a real world impact, both in the kind
12:52
of silencing that Annie was talking about
12:54
earlier and in the fact that this is
12:56
like people showing up at your house, people calling
12:58
your parents, things that are very you know,
13:00
very much happening in
13:03
your real world. I
13:18
know I've spoken about this before on the show, but I
13:22
when I was a producer on this show,
13:25
I got doxxed, and I didn't have
13:28
a big presence online,
13:30
and also like it wasn't really advertised
13:33
that I was on as
13:36
part of the show, and I just remember
13:38
it being so shocking and feeling
13:40
like, well, how did they
13:43
find me? And it was very
13:46
very scary, and we
13:48
had to hire security at our company,
13:50
we had to put these systems
13:53
like software in place, and
13:55
being like the source of that just made
13:57
me feel really small, like
14:00
and what blows my mind,
14:03
and this shouldn't matter at all, but it does to me, is that
14:05
it was a really, to me not super
14:08
controversial topic of the things we've discussed
14:10
so much, like really
14:13
not. It was making fun of that, like,
14:15
Hot Crazy Matrix video, and
14:19
just that the whole idea of
14:23
you don't even have to be somebody that's
14:25
like famous or has a big following
14:28
for this to happen to you, which
14:30
is very frightening. I don't know, Bridget,
14:33
if you want to speak about it, have you experienced
14:35
any harassment online?
14:38
Yeah. I mean, first of all, I'm so sorry
14:40
to hear what you went through, and I'm sorry to
14:42
hear that it sounds like it was a really
14:44
isolating experience. I think that's the
14:47
hallmark of so many folks that I talked to who
14:49
have experienced that. As for me, I mean, I've gotten
14:51
my fair share of like
14:54
sexist, gross racist responses.
14:58
You know, I once had a situation, Um,
15:00
I guess I would say this was at the beginning
15:03
of the Trump administration, and
15:05
like the Trump campaign, I guess the Trump era,
15:08
I felt like online discourse
15:10
had gotten a lot more
15:12
heated, and so I noticed, you
15:15
know, more people in my mentions
15:17
saying racist things or using racial
15:19
slurs and that kind of thing. Only once
15:22
in my life has harassment ever
15:24
gotten to the point where I needed to involve like authorities.
15:27
And this was kind of like you, this was
15:29
when I was first starting out in my career, I
15:32
was not someone who had any kind
15:34
of big public profile. I think maybe
15:37
I was on Twitter and had a
15:40
couple of handfuls of followers. I don't think I had Instagram.
15:42
You know. I was not someone who was like a public
15:45
figure at all, and similar
15:47
to you, the thing that I was writing about
15:49
was fairly mild. I remember
15:52
very clearly. It was this woman,
15:54
Anita Sarkeesian, who at the time
15:56
was making this video series about
15:59
tropes in video games, and so, Lord
16:01
knows, she gets a whole hell of
16:03
a lot of harassment that she does not deserve. It is
16:05
wild to hear what happens to her. But
16:08
I was just giving a pretty
16:11
fact based, like aggregation of
16:13
one of her videos. So it wasn't even like I was saying, Anita
16:16
Sarkeesian wrote this thing and it
16:18
was so great and I really agreed. It was just like, here's
16:20
what she says. And from that,
16:23
somebody reached out to me and basically
16:25
indicated that they knew where
16:27
I lived. And I don't want to say too much, but
16:29
the way that they framed it, I was like, oh, this person actually
16:32
somehow does know where I live. Like, he used
16:34
such specific language to talk about
16:36
where I used to live. That like, it occurred
16:39
to me that he did know where I
16:41
lived. And again it was one
16:43
of those things where I thought, how could anyone find this
16:45
out about me? Why would anyone want to find this out about
16:47
me? I'm such a nobody. But that is so
16:51
so important for folks to understand. You don't
16:53
have to be Chrissy Teigen or
16:55
you know, a high profile journalist
16:58
or someone who talks a lot on social
17:00
media or has a big following for someone
17:02
to just decide that they want to harass you. And
17:04
that's why it's so important to do what we can to
17:07
make it harder for folks who would like
17:09
to weaponize any kind of information that's out
17:11
there about you to do that, because it
17:13
can happen. You know, you don't have to be high profile
17:16
for this stuff, and it could really happen to anybody.
17:18
Yes. And it
17:20
is so upsetting, going back to that
17:23
like silencing thing, and just I
17:26
know I've I've said before about how I stopped
17:28
online gaming because of this very thing,
17:31
and just those fears
17:35
of well, I guess
17:37
it's safer if I don't say anything at all, which is
17:39
the worst. So
17:42
do you have some steps people
17:44
can take to hopefully
17:49
prevent this at least
17:51
a little bit. Absolutely, So I
17:53
have to give a shout out to journalist Lyz
17:56
Lenz for this helpful list. She
17:58
was a guest on my podcast earlier this week. She
18:01
really kind of gave me a little crash course
18:04
in how we can prevent information
18:06
from being weaponized against us by online
18:09
harassers, and so I want to thank her so
18:11
much for providing this. She's from the Midwest.
18:13
She calls herself a practical Midwest mom,
18:15
and so these are some practical tips that we
18:17
can all use. So the first one that she said
18:19
was to be really choosy about
18:22
what you decide to post on social media,
18:24
specifically Instagram. Uh, Lyz herself
18:26
is a single mom, and she says that she never posts
18:28
pictures of her kids on Instagram because
18:30
she doesn't want their faces to be too
18:32
familiar or too recognizable to people
18:35
who would, you know, want to harass her.
18:38
And like, just in general with Instagram,
18:41
you would be surprised what kind of things
18:43
can be sort of like taken out of context, what kind
18:45
of images can be taken out of context to smear you
18:48
or harass you, And so I know that
18:50
kind of sucks. It's something that I do as
18:52
well, Like people always say like, oh,
18:54
Bridget, like are you in a relationship,
18:56
what's your what's your family like? Because I
18:59
rarely ever post things about my romantic
19:02
life, my personal life on social media
19:04
precisely because of that, you know, and
19:06
it does suck. It sucks to feel
19:09
like you have to be a little bit guarded
19:11
about what you share on social media
19:14
because you don't want, you know, you
19:16
don't want people to use it to weaponize you. Like
19:18
I see all my friends posting their Valentine's
19:20
posts of them and their boo, and it's like, I don't
19:23
do that. But Lyz suggests, if this is something
19:25
that really bums you out, if you're like, man,
19:27
I really want to be able to share
19:30
whatever I want to share, you could consider having
19:32
a private Instagram account, which the kids sometimes
19:35
call a finsta, just for your approved
19:37
friends and family, and then a separate public Instagram
19:39
account where you're a little bit more guarded. And so that's
19:42
just a way, if you really feel compelled
19:44
to share things but don't want them going
19:47
to a wide audience. That's one thing that you can do.
19:49
Another is something that I had never heard
19:51
of, which was signing up for a service called
19:54
DeleteMe or some other kind of
19:56
service that scrubs your information from
19:58
the web. And so, as I said earlier,
20:00
and as you said, Annie, like I was really surprised
20:02
that someone online was able to zero
20:05
in on where I lived, and I spent a lot of time
20:07
thinking like, how could that be? How could someone have my address?
20:10
Like how is this happening to me? Well, I
20:12
did not know this, but there are sites online that collect
20:15
and sell your personal data. And if you
20:17
sign up for a site like DeleteMe or some kind
20:19
of other similar site that scrubs your data from the Internet,
20:22
that will make it a lot harder for folks
20:24
to find your personal information. And
20:26
Lyz actually says that most times when
20:28
harassers were able to find their target's personal
20:30
information, like their address or their phone number, it's
20:33
because of those kinds of sites that are just selling
20:35
your personal information to whoever is willing
20:37
to pay for it. And so you know, this is not a commercial
20:40
for DeleteMe. They have not paid me in any way,
20:42
but literally, after talking to Lyz,
20:44
I signed up for it and I found
20:46
thirty-five different sites that were selling my address,
20:49
my current address, where I'm currently sitting right
20:51
now, online. But not anymore, so
20:54
yay for something like that. Boom.
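For the curious, the core idea behind a service like DeleteMe can be sketched in a few lines: scan data-broker listings for records that match you, then file an opt-out request with each site that has one. This is a hypothetical illustration, not DeleteMe's actual code; the broker names, records, and function name here are all made up, and a real service crawls live sites rather than a dictionary.

```python
# Hypothetical sketch of what a data-broker removal service automates:
# find which brokers hold a record about you. All data below is made up.

MY_NAME = "Jane Doe"

broker_listings = {
    "example-people-search.test": [
        {"name": "Jane Doe", "address": "123 Main St"},
        {"name": "John Smith", "address": "9 Elm Ave"},
    ],
    "example-address-lookup.test": [
        {"name": "Jane Doe", "address": "123 Main St"},
    ],
    "example-phone-directory.test": [
        {"name": "Alex Roe", "address": "77 Oak Rd"},
    ],
}

def brokers_listing(name, listings):
    """Return the broker sites that hold a record matching this name."""
    return sorted(
        site
        for site, records in listings.items()
        if any(record["name"] == name for record in records)
    )

# These are the sites a removal service would send opt-out requests to.
exposed = brokers_listing(MY_NAME, broker_listings)
print(exposed)  # prints the two made-up sites that list Jane Doe
```

The useful part is the loop: a real service repeats this check on a schedule, since brokers often relist your data after a while.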
20:58
Yeah, it doesn't help that things like Facebook
21:00
have happened and sold five million people's
21:02
information either, exactly.
21:05
It's like when you think, when you put it in that
21:07
context, there are so many bad
21:10
actors out there making money
21:12
off of our information,
21:15
our data, right, Like, when you think about
21:17
it, it really is.
21:19
The picture is a tad terrifying. Yeah.
21:22
Another thing that she suggests, which I also
21:24
do, is deleting your tweets. Harassers
21:27
love to take things that you've posted against
21:29
you, so she recommends that everybody
21:31
be deleting their tweets regularly, and
21:33
again, it sucks to do this. I have had
21:35
to delete some tweets I was like very
21:38
proud of, or like jokes I thought were particularly
21:40
funny, even though I'm sure only I loved
21:42
them. But in the long run, it's so much
21:45
safer because I mean, think about every joke
21:47
that you've ever made on Twitter that didn't land,
21:49
or everything you've ever tweeted after a few glasses
21:51
of wine, and now think about
21:53
that being on Twitter forever. Then
21:55
anybody who is inclined to
21:58
can go back and dig and scrape
22:00
and find that joke you made after four glasses
22:03
of wine that you regret. Right, And so for
22:06
Lyz, you know, she is this writer who
22:08
writes a lot about politics and extremism,
22:10
so she's often harassed. The thing that comes
22:12
up with her a lot is this joke
22:15
that she made after just a few drinks out with friends
22:17
about online dating, where she tweeted
22:20
a man on Tinder unmatched with me after
22:22
I told him I'm a journalist, what's he hiding?
22:25
And so obviously that's just a joke,
22:27
right? She's not, she didn't. It's
22:30
just it's just a joke that she made, you
22:32
know, about her experiences
22:34
dating as an online journalist. But to
22:37
this day, people who are harassing her
22:39
have screenshotted that joke. It's many years
22:41
old now, and they use it to falsely
22:44
claim that she doxxed
22:46
an innocent man because he wouldn't date her. This
22:48
was a joke people. She says that people
22:50
still ask her like, did you dox
22:53
an innocent man? And she's like, no, I
22:55
was just I was just kidding around on Twitter, like come
22:57
on, And so yeah, it
23:00
really just goes to show how people
23:02
can take whatever you have to say really
23:04
wildly out of context and use it to
23:07
harass you, all right, or if
23:09
you were Matt Gaetz, yes, maybe
23:11
evidence. Yes, absolutely.
23:14
Okay, keep going. So
23:17
you can go to tweetdelete.net.
23:19
It is a free service that you can use to regularly
23:21
delete your tweets without you even having to
23:23
do anything. And I really recommend it because again,
23:26
it makes it so that you don't have these
23:28
like, you know, because you just don't
23:31
know what thing someone's going to use to take
23:33
out of context to harass you. Look, I'm sure that when
23:35
Liz made that joke, she wasn't thinking that years
23:37
later people would still use it to falsely
23:39
claim that she's doxing people who don't want to date
23:41
her. So it can just be a good way to give
23:43
you that peace of mind where it
23:46
will make it a little bit harder for folks to
23:48
find information out there about you to then
23:50
use to harass you. Yeah, when you
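The age-based rule a tool like TweetDelete applies is simple enough to sketch. This is a hypothetical illustration, not TweetDelete's actual code: the function name and the stand-in tweet data are assumptions, and a real tool authenticates against the Twitter API and deletes the matching tweets, where this sketch just filters plain dictionaries.

```python
from datetime import datetime, timedelta, timezone

def select_tweets_to_delete(tweets, max_age_days, now=None):
    """Return the tweets older than max_age_days; a real tool would
    then call the API to delete each one."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [t for t in tweets if t["created_at"] < cutoff]

# Example with made-up data: one old joke, one recent post.
now = datetime(2021, 6, 1, tzinfo=timezone.utc)
tweets = [
    {"id": 1, "text": "old joke after four glasses of wine",
     "created_at": datetime(2018, 3, 10, tzinfo=timezone.utc)},
    {"id": 2, "text": "new episode is out!",
     "created_at": datetime(2021, 5, 20, tzinfo=timezone.utc)},
]

# With a one-year window, only the 2018 tweet is flagged for deletion.
to_delete = select_tweets_to_delete(tweets, max_age_days=365, now=now)
print([t["id"] for t in to_delete])  # prints [1]
```

Running this on a schedule is the whole trick: anything past the cutoff is gone before anyone can dig it up out of context.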
23:52
said that tweet, it seemed like
23:54
such a straightforward joke. I wouldn't have thought
23:57
of how to use that as a weapon against
23:59
her. But yeah, now that you say it, I'm like, huh, okay,
24:02
okay, which is dumb because in my head
24:04
it was obviously a joke. But wow.
24:08
And yeah, we've definitely talked about DeleteMe, because
24:10
when we did a couple of episodes, including
24:13
about incels, we were so sure we
24:15
were holding our breath and going back and
24:17
forth about what we were going to say, about how we're
24:19
going to say it, and whether or not we were going to be attacked.
24:22
My partner at that time was like, please don't do this, Please
24:24
don't do this. You're gonna get doxxed. They're gonna get
24:26
you. And we did. We had that conversation, Okay,
24:28
we're gonna do DeleteMe, just in case we need
24:31
to check and see. Of course, I think
24:33
we've got a pretty great audience and fam who know
24:35
us enough to not use
24:37
that as how we're going to show people
24:39
who they are. Essentially, I guess it's the best
24:42
way to say it. So we haven't gotten any
24:44
backlash, but that definitely was a concern, and
24:46
I hate that that has to be a concern
24:48
for us to even just talk about anything. We didn't
24:50
even say anything necessarily bad other than what
24:53
it was and what it has done and how it's impacted
24:56
women. But that alone seems
24:59
to be offensive in the nature of,
25:01
you're calling us out somehow, right. Yeah,
25:04
I mean, oh god, do I identify
25:06
with what you just said. Even in this episode
25:08
that I did about online harassment with Lyz
25:11
last week, I didn't tweet about that episode.
25:13
It's the only episode I didn't
25:15
personally tweet about because of what you just said.
25:18
I don't advocate anybody living in fear,
25:20
and so I want to be very clear that I hope
25:22
I'm not making it seem like, oh, this is so
25:24
scary, but it is hard, and it
25:27
sucks to have to be that mindful.
25:29
It sucks to have to have, Sam,
25:32
as a podcaster, a conversation
25:34
with your partner about what you
25:36
can and can't say, and what you should and shouldn't
25:39
say, and how to keep yourself safe if you say
25:41
X, y Z, just to do your job as
25:44
a podcaster, and I
25:46
find myself always equivocating and being like,
25:48
well, it's not even like I said anything. It's
25:50
not like I did something bad like
25:53
express an opinion. It
25:55
was just a fact-based account, and even
25:57
that just seems like such a, such
26:01
a, like, it just really makes me feel
26:03
as if I've internalized that I
26:06
need to really watch what I say
26:08
and that if I'm going to be talking about a handful
26:11
of subjects like incels or harassment,
26:13
I need to make sure that I am not expressing
26:16
my opinions and I am just stating
26:18
verifiable facts and blah blah blah. In
26:20
the back of my mind, I'm always thinking, well,
26:23
I don't want to say the wrong thing. I don't want to get
26:25
on a radar of the wrong people. And that's
26:27
just, it just sucks that this is
26:29
how we have to exist, right,
26:32
I mean, shoot, this could be a whole other episode
26:34
of how this is a setback for
26:36
women who have finally come on board being
26:39
able to be confident in their jobs
26:41
and being able to be confident in what they're
26:43
doing. And now it's not so much that we are
26:46
being mansplained or, you
26:48
know, talked down to. It's so much more that we are
26:50
afraid physically afraid
26:53
of what the backlash will be, even
26:55
if it is just a factual conversation
26:57
that we're trying to have. Because we are women
27:00
or we identify as women, we may
27:03
be targets in the story, whether it's because
27:05
some man really takes it personally that
27:08
we say this and this and this, the
27:10
A, B, C and D, whether it's hey, this
27:12
group is really really you know, not
27:14
great for women. They treat women not so
27:17
nicely. You know, stuff like that, white supremacy
27:19
is not a great thing. What, how dare you? You
27:21
know, stuff like that where it seems obvious
27:24
in this day and age, But we've kind of backtracked
27:27
in living in fear
27:29
for our safety once again just by
27:31
saying something. Whereas, you know, many
27:34
men right now are talking about cancel culture, whether it's
27:36
because of the hashtag MeToo or all
27:38
of these things, saying some stupid
27:40
stuff, saying racist things and saying,
27:42
oh, I'm being canceled. That's their
27:44
biggest fear half the time, where women
27:46
are truly being threatened with rape and death,
27:49
and it's a whole different level of Okay, how
27:51
do we combat this? Yeah, it is
27:54
terrifying, I mean it kind of reminds me of that joke
27:57
where men are afraid their date
27:59
is going to be unattractive. Women are afraid
28:01
their date is going to kill them. Right.
28:04
And again, I want to be very clear, and there's a difference
28:06
between legitimate criticism and response
28:08
and all of that. But when a man
28:10
puts his opinions out there,
28:12
people don't respond by calling them
28:15
a whore, or, you know, a slur, or
28:17
saying they're gonna be raped or they're gonna kill their
28:19
kids. It's so different. And
28:22
I think that you hit the nail on the head, Sam,
28:25
and that we talk so much about cancel
28:27
culture and things like that,
28:30
we need to make room to talk about you
28:32
know, what happens when women
28:35
and other marginalized people are pushed from
28:38
these platforms because of this kind of
28:40
harassment and even just the threat of it. What happens
28:42
when you know, prolific gamers
28:45
like Annie just stop playing
28:47
games online because of the harassment,
28:50
right, Like, that is also a form of
28:52
silencing. That is also a speech issue.
28:54
And yet when we talk about these things, the
28:56
conversations around like cancel culture and that
28:59
kind of thing just dominate the conversation,
29:02
and we don't see the ways that an
29:05
entire generation of women, people of
29:07
color, and marginalized voices have been
29:09
silenced and pushed off of their platforms because
29:11
of harassment. And that is also something we should
29:13
be talking about. Man
29:29
with all these big platforms. Maybe
29:31
I'm wrong, but there seems to be no protection.
29:33
We've talked about the fact that Facebook is
29:36
now facing a new round of accusations
29:38
that millions of people's information
29:40
has been hacked, and this is not the first time that
29:43
that's happened to them. Um, we know that Twitter
29:45
is in a constant battle of hey, are
29:47
you going to stop this level of abuse? What are
29:49
you going to do? As well as when
29:51
we talked about Twitch, what have they been doing for
29:54
women gamers who are being constantly harassed?
29:57
Is there not something more that they can do or is it just
29:59
literally a free for all? So I'm glad that you
30:01
asked this. There is so much they
30:03
can be doing, The question is do
30:06
they want to do it? I think, especially with
30:08
Facebook, part of my day job involves
30:10
trying to hold Facebook accountable. There is
30:12
so much they can be doing, and the question is do
30:15
they want to be doing it right? And speaking
30:17
of the Chrissy Teigen thing, Chrissy Teigen
30:20
made it very clear she was like, I don't blame
30:22
Twitter for me having to leave the platform. They're
30:25
like Twitter staffers worked with her
30:27
team to try to help her deal with
30:29
the harassment that she faced regularly,
30:32
and I'm sure they did. Chrissy Teigen is wealthy,
30:34
she's well connected, She's married to John Legend, she
30:37
is an a lister, and it doesn't surprise
30:39
me that Twitter was very helpful in trying
30:42
to help her handle this. But here's my thing. You
30:44
shouldn't have to be Chrissy Teigen to
30:47
get a tech platform to make
30:49
it so that you can use their platform
30:52
without facing this kind of intense harassment.
30:54
Right. A similar thing happened with Leslie
30:56
Jones, who used to be on SNL and was in
30:59
the reboot of Ghostbusters. The kind of racist,
31:01
sexist, disgusting, targeted
31:03
harassment that she got, she eventually
31:06
left Twitter. She deleted her Twitter account, and again
31:08
she was another prolific user, and Jack
31:11
Dorsey, the head of Twitter, personally
31:13
intervened to bring her back to the platform,
31:15
which I thought was great, but again you
31:17
shouldn't have to be famous to get
31:20
these big platforms to do
31:22
something. And bear in mind that a lot of the people
31:25
who work for these platforms they make so
31:27
much money. These platforms have so much money.
31:29
I refuse to believe that there's
31:31
nothing they can do. But oftentimes it
31:34
feels like we are just left alone to,
31:37
you know, get DeleteMe and be
31:39
super careful about what we say and how we say it
31:41
to protect ourselves in absence of their
31:43
leadership. It is a total lack of leadership
31:46
on their part and we're all just
31:48
left kind of picking up the pieces. And it's horrible,
31:51
right. And it seems like it's propping up companies
31:53
like DeleteMe and like the Twitter
31:55
site. They're getting money to do something
31:58
that shouldn't be a concern to begin with, to be
32:00
a part of these social media platforms. And then
32:02
I guess another question I had: with sites
32:04
like Clubhouse, which is a whole new format now
32:06
coming back through. I guess it's kind of a resurgence
32:09
of the old platforms
32:12
of social media however you want to say it, But
32:14
what does securing ourselves look like before
32:17
we even jump into those types of platforms.
32:20
Well, with Clubhouse, that's a really good question.
32:22
A lot of folks are interested in it. I would
32:24
say the number one thing to remember about Clubhouse is,
32:27
even though the app really goes out of its way
32:29
to be like, we don't record conversations, your
32:32
conversations are being recorded. Like, that's
32:34
just point blank, end of sentence. And so just
32:37
know that whatever you say on
32:40
Clubhouse, someone is probably recording
32:42
it somewhere. And I think that that's
32:44
something that I've seen with Clubhouse.
32:47
And for those who don't know, Clubhouse is, I
32:49
think, still an invite only social
32:51
media app that is audio and conversation
32:54
based. So basically you join, and
32:56
you join a room, and it's just people's voices
32:58
in real time, and one of their big rules
33:01
is no recording, even though it does get recorded.
33:04
And so I think because of the ephemeral
33:06
nature of an app where the
33:09
interactions you have are just voice, I
33:12
think that it can incite
33:14
people to say wilder stuff
33:16
and to behave worse
33:19
than they would in a written medium
33:21
where there's going to be some record of it. And you
33:23
know, I've seen, if
33:25
I'm being honest, I was in a Clubhouse room
33:27
with an A list celebrity. I'm
33:29
not gonna say her name, but she's someone
33:32
who you might know where she
33:34
was part of a harassment
33:37
campaign against a doctor
33:39
for giving information about COVID right
33:42
and like it was. When I watched it, a
33:44
part of me was like, I cannot even believe what I
33:46
am listening to right now. And I think
33:48
the only reason why that happens is because
33:50
people they feel bolder
33:53
or emboldened to behave like
33:55
that, because it's clubhouse and the whole
33:57
I think the vibe is like, oh, you can sort of get wild
33:59
in these rooms late at night, and there's no paper
34:01
trail of it. But I think you
34:04
know, what it comes down to for me is that on
34:07
any kind of social media app or really anything
34:10
online, there needs to be very
34:12
clear policies around what is
34:15
tolerated and what's not tolerated. Otherwise
34:18
stuff just gets wild and it's just the wild wild
34:20
West, and people are being harassed and bullied
34:23
and mis- and disinformation is spreading and it
34:25
can really be a free for all. So if there's
34:27
one thing I know about technology is that if you don't
34:29
have from the onset very clear and
34:31
specific and explicit policies
34:34
around this kind of thing. People will just do whatever,
34:36
and it'll be out of your control. If you don't
34:38
say right out of the gate,
34:41
you know, no white supremacists allowed.
34:43
White supremacists will find it, and
34:46
they did. But I found it interesting because, in your
34:48
case specifically, I don't have Clubhouse. It's
34:50
an iPhone thing, and I don't
34:53
have an iPhone, and so therefore I can't
34:55
get an invite. I feel very left out but I don't feel sad
34:57
at the same time, but
35:00
that is that actually bled into Twitter, and I know exactly
35:02
what you're talking about, and I know the level of harassment
35:05
that the doctor went through because
35:07
of that, and no one took responsibility
35:09
because, literally, it
35:11
was a social media platform that no one really takes
35:14
seriously, but people did, and
35:17
so those who allowed for that kind
35:19
of harassment to happen never acknowledged
35:21
it as being a problem, which is again the
35:23
problem exactly.
35:26
And yeah, clubhouse, you know it's not
35:28
accessible for people who don't use iPhones. It's
35:31
also not accessible just in general, like there's
35:33
no, like, for an app that is
35:36
being used by so many tech
35:38
and Silicon Valley influencers, there's no captioning
35:40
for folks who need captions. Like, it's so
35:43
wild to me that an app that is so not accessible
35:47
is such a big part of the conversation right
35:49
now. I know Instagram
35:52
is trying to do something similar to it. Oh
35:54
yeah, I'm sure. I think Twitter has Spaces
35:56
now, like it's going to become the
35:59
new vibe, I feel, that a lot of
36:01
platforms are trying to adopt, because
36:03
that's how things are with tech. You know, that's
36:06
an interesting point
36:08
because a lot of what we've been talking about is
36:11
how the onus is on women and marginalize
36:13
people to sort
36:15
of put up with or deal with harassment
36:18
that is almost expected, and when
36:20
these new apps or
36:23
social media platforms come out, like I remember when
36:25
Snapchat came out and people joked about
36:27
how it was like dick roulette, and
36:29
that was just the thing, and
36:33
and it's wild to me, Like we've
36:35
talked before about how women do face
36:38
things like revenge porn, or having your
36:41
image manipulated and passed
36:44
on and then sent to your family
36:46
or sent everywhere, and having that sort of
36:48
gendered aspect to this, and then also
36:52
I think it's changed. But for a while, like
36:55
Xbox Live had this whole thing of
36:57
like how to deal with harassment online, and,
37:00
you know, it had a couple of bullet points, but the final bullet point
37:02
was, if it gets too bad, leave. It's
37:05
like, I
37:08
mean, think about that. Their official
37:10
guidance was if you can't
37:13
take the heat, get out of the kitchen. Like
37:15
that, Like that really tells you all you
37:17
need to know about where they
37:19
put the onus and the burden and the responsibility.
37:22
Certainly not the people who are
37:25
paid very handsomely to develop
37:27
Xbox Live platforms. No, no,
37:29
if you don't like it, you can just leave. That
37:32
should not be what we're telling a generation
37:34
of people who want to show up
37:36
and use their voices and take up space online.
37:39
And to the fact that social media has
37:42
become so huge that if you're not a part
37:44
of it, there are big aspects of job
37:46
requirements that you're going to miss out on. Like,
37:49
just the fact that you have to be savvy
37:51
with some of these platforms, and and the
37:54
fact that oh, well you can't take it because you're
37:56
a woman, maybe you shouldn't do this job is kind
37:58
of that push and pull, which is kind
38:00
of like, this is absurd. But I did want to ask,
38:02
because I'm not sure. We've talked about
38:04
a few examples of people being harassed. Have
38:07
we seen examples where they kind of came up on top
38:10
even through all of that harassment, or
38:12
maybe where a platform actually did something
38:14
good outside of Leslie Jones. Yeah, I mean,
38:16
I think that Leslie Jones is
38:18
a good example. I can't think of a time where
38:22
someone was harassed and they sort of had
38:24
the last laugh. But I will say this and
38:26
that a lot of situations
38:29
that you're seeing right now, where particularly women
38:31
journalists are being harassed. The thing that gives
38:33
me a lot of hope is I
38:35
think that we're seeing a community come together
38:38
to be like that is not cool. And for me, that
38:41
really illustrates that kind of culture shift
38:43
because I think a few years ago that wouldn't have happened.
38:45
So folks like Rachel Abrams
38:47
at the New York Times, she did this critical report
38:49
on the far right, you know, conspiracy
38:51
peddling cable news network One America News,
38:54
and after she did that report, they
38:56
put her personal phone number and
38:58
email address, her personal cell phone number and
39:00
email address, on television and
39:03
encouraged their viewers, many of whom
39:05
are, let's say, fringe
39:07
types, to reach out to her. They said,
39:10
we encourage our viewers to stand up to
39:12
the intimidation by the left and feel free
39:14
to reach out to this Times writer. Now, I
39:16
think if that had happened a couple of years ago, and
39:19
The Times was getting lots and lots
39:21
of angry calls about this report, it
39:24
wouldn't be shocking to me if The Times had fired
39:26
her, suspended her, or just
39:28
done nothing and let her deal with this on her own.
39:31
But The Times put out a statement supporting
39:33
her and calling out what happened. And
39:35
so I think that when we see these cases
39:38
of outlets and institutions sticking up
39:40
for their reporters who are the targets of this kind of harassment,
39:43
and communities, it's like regular
39:45
people online being like, this isn't cool.
39:47
We see what's happening. This is harassment and you should
39:49
be ashamed of yourself. I think that's
39:51
the only thing I can point to for
39:54
people who are the targets of this kind of
39:56
harassment getting through it. It really
39:58
does take a community, an
40:00
institutional like response
40:03
that is a united front. And I also
40:05
just wanted to say something. You know, Sam,
40:07
you you brought up like the idea that when
40:10
companies like Xbox are just like, oh well, you
40:12
should just get off these platforms if it gets
40:14
too bad. I wish
40:16
I could get off Twitter, right, And if
40:19
I were wealthy, I would
40:21
never be on Twitter arguing
40:23
with people, like come on, you know. The
40:26
thing that keeps me there is that the
40:28
sad reality is for a lot of the kind of work
40:31
that I do and a lot of people that I know do, you
40:33
need to have a social media presence? Right? I
40:36
once had a conversation with someone because I wanted
40:38
to write a book, and the first thing they asked me was how many
40:40
Twitter followers do you have? You know? And I
40:42
think it's especially hard for folks who
40:44
are otherwise marginalized
40:47
or have a harder time getting a platform, because social
40:49
media can be a way that you
40:51
can get opportunities that you wouldn't have gotten otherwise.
40:53
And so it's this real kind
40:56
of catch twenty two of like, I don't
40:58
really enjoy being on Twitter anymore,
41:01
but I feel like I have to be there because I know
41:03
that it can be kind of validating
41:07
for my professional work. And I need
41:09
that validation because you know, I
41:11
don't work for the New York Times
41:13
or work with Harvard or that kind of thing. And so
41:16
you know, for a lot of us, we need that validation where we can
41:18
get it. But I do not enjoy spending
41:20
time on Twitter a lot of the times. It can
41:23
be a real unpleasant
41:25
experience, but it's
41:27
one that I feel like I unfortunately
41:29
sometimes need to be there to get
41:32
that kind of validation. I mean, it's almost like a
41:34
currency, a part of your resume now. That's
41:36
exactly what I'm saying. Like it would
41:38
be great if we weren't so dependent,
41:40
but it's become such a dependent part of our
41:43
careers and our livelihoods that
41:45
it's hard not to, like, have to have the
41:48
Internet in order to work. The likelihood
41:50
of you being successful without having that type of connection
41:53
is just not there. You're
41:55
not going to be able to proceed or move on
41:57
and move forward, and you need to have that type
41:59
of experience. And
42:01
you're right, And it's amazing the
42:04
level of what that blue check mark means.
42:06
You know, it is this whole like validating
42:09
whether or not, not because you want it or
42:11
because you have to have it, but because you need
42:14
it for your career. Yeah,
42:16
and to be taken seriously. And it's such an absurd
42:19
idea, but that's how involved we've become
42:21
in the social media world. And that's
42:23
also why having types of harassment
42:26
like this needs to be taken seriously. But people don't
42:28
want to because they don't want to take it seriously.
42:31
But you can't walk away
42:33
from that because it's already been established as a
42:35
part of our world culture.
42:39
Yeah. And I think back in the early days
42:41
of our company, I had to respond
42:44
to every YouTube comment and
42:46
I legitimately went to my boss and
42:48
said, I think this qualifies as sexual
42:50
harassment and we had to like
42:52
sit down and talk about the legality of if you're
42:55
going to make me respond to these
42:57
comments that are awful, like,
43:00
is that legal? And I think that's the conversation
43:03
we're going to have to continue to have, especially
43:05
if it's a part of your job requirement. And I
43:07
think that the whole idea,
43:09
Bridget, that, you know, you
43:13
find Twitter to be often a
43:16
terrible experience. That should say
43:19
something. As someone who
43:21
developed a platform, if people were
43:23
telling me I actually
43:25
really dread your platform.
43:29
I just feel like you've got to
43:31
think about that. Oh yeah, And I want
43:34
to be clear that I actually believe
43:36
that, particularly for Twitter, that
43:39
thing of not enjoying
43:41
being there, I think that is baked into
43:43
Twitter's functionality because of the
43:46
way that the algorithm works, things that you
43:48
say that are inflammatory get
43:50
much more traction than things that you say that are
43:52
not inflammatory. You know, things that you say
43:54
that are untrue get
43:57
much more traction than things that you say that are true.
44:00
And I think that Twitter, because of the way
44:02
that the algorithm works, I think
44:04
it amplifies inflammatory
44:07
takes, mean tweets, not
44:09
even things that maybe qualify as harassment,
44:12
but like things that are Yeah,
44:14
inflammatory, I guess it's the only word I can think of for
44:16
it. And I think when you are
44:18
on a platform where the content that you're
44:20
being surfaced most often is content
44:23
that is inflammatory or highly
44:25
charged or negative, and then the content
44:28
that is positive or thoughtful
44:30
or nuanced you're seeing less of. It's
44:32
no wonder that so many people report
44:35
that Twitter, you know, doesn't feel
44:37
good, you know. And I think, especially with
44:39
COVID and the quarantine, it's like we're on
44:42
social media so much more, and
44:44
we're all exhausted and stressed
44:46
and anxious and just like not being our best selves.
44:49
And I think it is a recipe for it just
44:51
not being a very good user experience. And it's
44:53
a user experience that is marked by
44:56
bad faith criticism and inflammatory
45:00
content. And that's how I feel when I'm on Twitter, to
45:02
be honest with you, It's like I see things that
45:04
bum me out, and it just
45:06
makes me so sad that
45:10
that is the driving feeling
45:12
of this kind of discourse.
45:15
And I remember the days when Twitter used to feel
45:17
hilarious, like the early days
45:19
of Black Twitter. Something would happen
45:21
and we will be laughing about stuff for weeks.
45:24
It was so much fun. Or when
45:26
like a bad movie was premiering on Lifetime
45:29
and we're all going to watch it as a family and make fun
45:31
of it. We used to have a blast.
45:34
And I feel like those days are gone, and they
45:36
have been replaced with conversations
45:38
that are tedious at best and
45:40
like really harmful at worst. And
45:44
I miss the days where I could just watch a mascot who fell
45:47
over get back up. Yes.
45:52
Well, I
45:55
know we've covered some ground in this one. But
45:57
I know there were some other examples you
46:00
thought of, like high profile examples of
46:02
online harassment. Do you want to talk about those? So
46:05
there's two. I want to mention the Washington Post's
46:07
Seung Min Kim. So during
46:10
the Senate confirmation hearing for Neera Tanden.
46:12
Neera Tanden was tapped by the Biden administration
46:14
to lead the Office of Management and Budget.
46:17
Tanden's harsh, critical tweets were
46:19
cited as the reason why some Republican senators
46:21
would not vote to confirm her. Now, this is
46:23
its own complicated gender story
46:25
on its own, but that's a topic
46:28
for another episode. But Seung Min Kim,
46:30
who is this journalist? She was interviewing
46:33
Senator Lisa Murkowski about Tanden's
46:35
tweets, and Senator Murkowski said that
46:37
she had not seen them, and so she, on
46:39
her cell phone, showed Senator Murkowski
46:42
some of these tweets to get her comment.
46:44
Now, this is a completely
46:46
standard process for journalists, like
46:49
nothing unusual or unethical
46:51
happened. But she was photographed
46:53
showing Senator Murkowski these tweets, and
46:56
people made it seem like she had done, you
46:58
know, something unethical or something, you
47:00
know, something bad, and she was really
47:03
harshly harassed for
47:05
this. And I have to say the people who were
47:07
doing a lot of the harassing were folks
47:09
on the left, folks who were like, oh, like, you're
47:12
getting in the way of Biden having his person
47:14
confirmed. How dare you? And so it was pretty
47:17
harsh to see that. And I also want to note
47:19
that, you know, this journalist, she's
47:21
Asian American, and we know that for women of color
47:24
it's harder for us when we face online harassment.
47:26
According to an Amnesty International report,
47:28
black women were most at risk, with the
47:31
study determining that they were eighty four percent more
47:33
likely than white women to experience abuse on Twitter.
47:35
Latinx women were eighty one percent more likely,
47:37
and Asian women were seventy percent more likely.
47:40
And so this stuff is always worse when you're
47:42
marginalized. Like I felt
47:44
so bad seeing this person just being harassed
47:47
in this very harsh charged way just
47:49
for doing her job, you know. And
47:51
I also wanted to point out the New York Times's
47:53
Taylor Lorenz, who Sam, I think,
47:56
mentioned earlier that's a situation
47:58
where, really, it's
48:00
just something where it's like, if it was happening to me, I
48:02
would be I don't even know what I would be doing.
48:04
I would be so I would be so hurt. For
48:07
International Women's Day back in March,
48:09
Taylor Lorenz tweeted: For International
48:11
Women's Day, please consider supporting women enduring
48:13
online harassment. It's not an exaggeration
48:16
to say the harassment and smear campaigns I have faced
48:18
over the past year have destroyed my life. No
48:20
one should have to go through this. And I
48:23
bet you can guess what happened next. Fox News's
48:25
Tucker Carlson used his huge platform
48:27
of four point three three million viewers,
48:30
literally the highest rated US cable
48:32
news TV program in history. He
48:34
spent two full nights just making
48:36
fun of her. It wasn't even a news story.
48:39
She basically didn't even report
48:41
anything. She just said, this is my experience
48:44
as a woman who has faced online harassment,
48:46
you know, stick up for women for International Women's
48:49
Day. Just that one tweet, he
48:51
did two back to back nights
48:54
with segments just making fun
48:56
of her for, as he put it, pretending
48:58
to be oppressed, right, and so the
49:00
fact that he used
49:02
such a massive platform just to
49:05
crap on a woman for talking about her experience
49:07
is dealing with harassment. I think it just goes
49:10
to show that when we talk about
49:12
these things, people really
49:14
don't like it, and we really need to
49:16
change that. We need to make it so that when we talk about
49:18
our experiences online and speak up about them,
49:21
we're not attacked. Further, that we're listened to and
49:23
supported, you know, And I think that's
49:26
kind of the the ultimate point that I want to
49:28
make in this episode, I guess, is that online
49:31
harassment is very real. If it
49:33
impacts you or hurts you or otherwise
49:35
makes you feel like you are not able to show
49:37
up as your full best self, that
49:40
is a completely valid way to feel. And
49:42
you shouldn't let people make you feel
49:44
like you're being a baby, or that you're whining, or that
49:46
you should just turn off the computer or just log
49:49
off. You know, if it gets
49:51
to you, you know, it is harmful. It is silencing,
49:54
and it's something that none of us should be dealing with. And
49:56
I just want to make that very clear that the kind of things
49:58
that these women are going through, it's just not acceptable,
50:00
right, and yeah, it was incredible just
50:03
to see the responses. It made no sense.
50:05
I was really like, who is this? Was she a stand
50:07
up comedian before? Right before I knew what was
50:10
going on, I was like, did she say something as a stand up
50:12
comedian, because, you know, that's usually kind of the level
50:14
that people will go after you for, like you said something offensive
50:16
in a joke. That's when I was like, maybe she did that.
50:19
And just realizing all she said was be nice,
50:25
that was enough to elicit all of this hate,
50:27
just because I was just kind of dumbfounded
50:30
in that sense, because she's a notable
50:32
person, meaning she has some kind
50:35
of platform, she's a journalist, and all she
50:37
said was I've gone through some bad things, not even
50:39
talking about anything specific, and
50:41
then please be nice. And that
50:43
brought the ire of
50:46
one very angry white man who apparently
50:49
is just angry about women period
50:51
and marginalized people period. I mean, we
50:53
see how dangerous it is when white
50:56
men get angry, and
50:58
this is a great example. Two
51:01
nights, and mind you, this was
51:03
two nights where big news was happening.
51:05
You know, so many other huge
51:08
things are happening in the world, and you're gonna spend two
51:10
nights talking about a journalist
51:12
who tweeted about her experience and telling
51:14
people to be nice, really right.
51:18
And I have to applaud her willingness
51:20
to back other women who are going through things, especially
51:22
journalists, including, you know, rallying behind Brendan
51:24
Smith. It was beautiful. I love to
51:27
see when things do show. Because when she did that, those who
51:31
realize what is happening are gonna
51:33
fight even harder. And that's
51:35
what I don't think he knew was going to
51:38
happen, was that he kind of gave her credibility
51:40
as a journalist, in and of itself,
51:43
end of story. No matter what, he gave her enough credibility
51:45
with the rest of us, who know what he is
51:47
doing, that she is a fierce
51:49
individual to be afraid of. For him, obviously,
51:52
he's fearing something here. And I love seeing
51:54
that she was rallying behind Smith and being like,
51:56
hey, I got your back. I
51:59
mean, that's the thing. Like obviously,
52:01
I hate seeing people being harassed in
52:03
this way, but that's been this sort of silver lining,
52:05
is seeing women come out
52:08
and like just go hard for other women,
52:10
you know, women lifting as they climb
52:12
and be like, listen, girl, I've been harassed. I
52:14
see you're going through it. I support you, and that
52:17
is really I think it has been beautiful
52:19
to see. You know, there is nothing
52:22
like the power of women being in community
52:24
with each other. When we come together in support
52:26
of each other and when we're pissed off. Also
52:29
we can take on the Tucker
52:31
Carlsons of the world, the people who have these
52:33
huge platforms and all this institutional power.
52:36
The power of women being in community with each
52:38
other is a powerful, powerful,
52:40
special thing and it has been very
52:43
endearing to witness that in real time.
52:45
Yeah, yeah, very very
52:48
much. So thank you so much, Bridget,
52:50
as always for being here. You
52:53
always bring such great conversation.
52:56
Um, where can the good listeners find you? Well,
52:59
again, I have to shout out Lyz Lenz,
53:01
who is an amazing journalist who really helped me put
53:04
together all of my ideas for this episode. If you want
53:06
to hear from Liz herself, you can check out the
53:08
most recent episode of my podcast. There are no girls
53:10
on the Internet on this very network, and yeah,
53:13
subscribe you would love to have you And you can find
53:15
me on Instagram at Bridget Marie
53:17
in d c and on Twitter
53:20
at Bridget Marie. I
53:23
know I just said I hate being on Twitter, and you
53:25
should still follow me, I guess. Yeah,
53:29
provide support and I
53:32
guess GIFs of mascots falling over.
53:35
I miss those. Mine got deleted. I actually
53:38
had one saved and it got deleted. I was very sad about
53:40
this, the
53:42
struggle. I think we can help you with this, Okay,
53:45
But yes, Bridget you, I follow you, and I
53:47
think you have some really great messages. And I always
53:49
love seeing your perspective because even
53:52
though it is an awful, awful place to be, you put
53:54
up some good, good resources. So thank
53:56
you. Yeah, and one of my idols
53:59
online, she says, when you are online,
54:01
don't turn to the darkness, bring the light. So I
54:03
always try to bring the light on social media. Awesome.
54:08
And listeners, if you would like to
54:10
email us, you can. Our email is Stuff Media
54:12
at mom Stuff at iHeartMedia dot com. You can
54:15
find us on Twitter at mom Stuff podcast
54:17
or on Instagram at Stuff Mom Never Told You. Thanks as
54:19
always to our super producer, Christina, thank
54:21
you, and thanks to you for listening. Stuff
54:23
Mom Never Told You is a production of iHeartRadio. For more
54:26
podcasts from iHeartRadio, visit the iHeartRadio app,
54:28
Apple Podcasts, or wherever you listen to your favorite shows.