You Can't Cancel Taylor Lorenz

Released Thursday, 8th December 2022

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:17

Hey, and welcome to What Future. I'm

0:19

your host, Joshua Topolsky, and today

0:22

very special guest, very

0:24

a very special guest who has experienced

0:27

a lot of intense online

0:29

harassment and which

0:32

it does sound like something to tout, but that

0:35

is the reality. I'm Lyrah. As you

0:37

know, I've been thinking a lot about being

0:39

online. It is sort of one of my main things.

0:42

It's possibly my only thing. I'm starting to think maybe

0:44

I don't have any other things except being online.

0:47

Who would I be if I were not online? Happy?

0:50

I'd be a happy person. That's what I would be if I

0:52

were not online. Anyhow, I am a little

0:54

bit fixated on social media as of late because

0:56

it's been such a disgusting mess out there.

0:58

And after talking to Casey,

1:01

our guest today retweeted

1:03

it and liked the post I did

1:05

about the Casey Show, and

1:07

it's Taylor Lorenz, who I don't know if

1:09

do you know Taylor Lorenz? Are you familiar with her writing

1:11

at all? Yeah? I'm familiar with her as

1:13

being like a very sought after guest, very sought

1:15

after guest, very hot guest, but also like an

1:17

incredible journalist and who's responsible for like

1:20

some really really crazy stories, and and

1:22

often when she writes

1:25

something, it is one

1:27

it is like some huge deal like exposé,

1:30

or like you've discovered some horrible thing

1:32

about some person. You're like, oh, this like

1:35

famous troll, like now we know who

1:37

it is, or Newt Gingrich actually has

1:39

a burner account that he's used to harass women or

1:41

whatever. It's like things like that, and people

1:43

lose their minds on the Internet

1:45

and they go after Taylor, and

1:49

from what I can tell, seem to be making her life

1:51

a living hell. But like I

1:53

thought, let's talk about like this state

1:55

of things on the Internet, because it feels

1:57

to me like it's so intense and overwhelming,

2:00

particularly for her. And as we're at this kind of

2:02

inflection point of this sort of change over

2:05

in how I feel like we're all living with

2:07

and dealing with social media, I think she's

2:09

like a perfect investigator

2:12

of this moment in time and sort of like

2:14

interrogator of the

2:16

culture of being online. And

2:19

so not to build

2:21

it up too much, but our guest today is

2:23

Washington Post writer and brilliant

2:26

journalist Taylor Lorenz. Taylor,

2:46

I'm very excited to be talking to you because there's a lot

2:48

happening right now in your world, which

2:51

is the Internet. Actually, we

2:53

talked a little bit. There was a Twitter space. Sure

2:55

you recall this like a few weeks ago and

2:58

you were on it for a little bit. Do you recall what I'm talking about

3:00

or do I just sound crazy? Right? Of course? I remember

3:03

you came on and we started talking about Twitter

3:05

and other social networks. We talked about Mastodon, which is

3:07

one of my favorite topics as of like because I'm

3:09

a huge nerd, and you said something

3:11

to me that I thought was so surprising

3:14

and interesting considering who you are and

3:17

what you do. I don't want to put words

3:19

in your mouth, but you were basically like, there's

3:21

not enough trolls on Mastodon. Was like

3:23

kind of my uh

3:25

takeaway? Would you say, would

3:28

you say that's true? You don't feel like there are

3:30

enough like nasty elements

3:32

on like some of these new social networks.

3:36

Uh No, I wouldn't say nasty

3:38

elements. But I do think

3:40

that it's overly sanitized. I

3:42

don't think that there's enough sort

3:44

of creativity and free form

3:46

discussion. I say that as

3:48

I'm like literally being canceled on Twitter.

3:51

But I know are you really, Oh

3:53

my god, I'm joking. I like

3:55

fucked up and I like defended this

3:58

like terrible person because I didn't realize they're terrible —

4:00

now I did. I deleted my tweet, but

4:02

of course someone screenshotted it and it's like

4:04

gotten thousands of likes on some

4:07

tweet calling me a monster. And

4:09

you're like, that's what you like? You're missing that when

4:11

you're on Mastodon, is that people aren't like mobbing

4:13

you. No, I don't

4:16

like any of that. I don't like any of that. But

4:18

I guess what I do appreciate about

4:20

Twitter is like the serendipity of it

4:23

and the kind of like being exposed

4:25

to new things and being challenged

4:27

on things like I do like that, Like I

4:29

do you know like how

4:32

kind of big and messy it is and

4:34

Mastodon just feels too controlled.

4:37

Yeah, it's definitely

4:40

smaller. Right. It reminds me of like

4:42

not to get into a Mastodon conversation because it was more like

4:44

about your perspective on this, which I find so interesting.

4:47

And I'm reading, now, what did

4:49

you like retweet something about some chili story?

4:52

Yeah, well so like well, no, let me just

4:54

explain in like a few times

4:56

this lovely, kind woman made

4:59

chili for her neighbors, and this

5:02

like other person comes in

5:04

and basically accuses her of

5:06

being a horrible person and being racist or

5:08

something because she made assumptions

5:10

about her. That's crazy.

5:13

Okay, So so thousands of people

5:15

pile on that deranged reply, and

5:18

I was like, do we really need to

5:20

pile on this like deranged person,

5:23

Like clearly they're having a bad day or there's

5:25

some shit going on with them. Like, I

5:27

don't think that these dynamics are productive.

5:29

I mean the person who just sorry, just to be clear, the person

5:31

who was like this story about this woman

5:33

making chili is like racist or whatever.

5:36

Right, You're like, don't pick

5:38

on that person. Well, my opinion of all

5:41

of that is just like block that person and

5:43

like move on. Like people

5:45

say a lot of dumb shit on the website, and

5:47

not to say that they don't deserve pushback.

5:50

In this case, they did, but like, I

5:52

don't know if it's productive weeks later to

5:54

be like, you know, harassing

5:57

them off the internet. And what

5:59

I didn't realize is that person has actually

6:01

harassed the chili woman. Um and I say

6:03

chili woman because I don't actually know her full

6:05

name, and I don't want to say her Twitter handle, but

6:09

she'd actually been really cruel to her for a while,

6:11

So that was my bad. I didn't know any of that. She

6:13

had been cruel to the Chili woman for a while.

6:15

Yeah, she had been harassing her for a while. Okay,

6:18

So I feel like some of this is a little bit like

6:21

an extension of there was a conversation that started

6:23

maybe a day or two ago on Twitter about the craziest

6:26

thread that you've seen on Twitter, like of how

6:28

how illustrative it is of how unwell

6:31

everybody is, because these threads are so like

6:34

there's it's like there was a thread a few months

6:36

ago where somebody was like reading

6:39

books is ableist. Like it's

6:42

not what they said, that's not what they said. So

6:44

this is a good example of exactly,

6:49

Josh, they didn't though, this is like this is the

6:51

point is that like, they didn't say that,

6:53

And these really nuanced discussions that

6:55

people try and have, especially around disability

6:57

in their small in-group of

7:00

followers, often get taken

7:02

out of context, simplified

7:04

and kind of made into a mockery.

7:06

And look, you know, and a

7:08

lot of these people that get piled on are

7:10

like, you know, they're neurodivergent,

7:13

they're not you know, they're not necessarily communicating

7:15

very well. My feeling of that is,

7:18

if they're not being problematic, if they're

7:20

not being bullies themselves, just

7:23

let them have their discussions and we don't

7:25

need to dogpile people, especially if they

7:27

have like under a thousand followers, which you know

7:29

in some cases is the case. Like but this

7:32

this tweet was just to be clear, Sorry, the tweet I'm

7:34

talking about is this is the

7:36

tweet and I and if I misunderstand it, then

7:38

it's, again, Ana Mardoll. It's Ana Mardoll.

7:40

There's a thread going around mocking writers who quote

7:43

don't read very much, and I'm trying not to haul out my soapbox,

7:45

but this is ableist. Not everyone can

7:47

read for pleasure or indeed at all, and some of

7:49

those people are writers. I think,

7:52

yeah, let me let me just like you

7:54

know, let me try. And there's actually a whole

7:56

thread that I believe that person who's

7:59

also turned out to work for Lockheed Martin. So

8:01

I'm certainly not defending them. This

8:03

is a person who got then canceled because they're like,

8:05

I'm a contractor for Lockheed Martin. And so

8:07

again I really really trust

8:09

me. This is not me like defending this

8:12

person or any of these people. I'm

8:14

just saying, like, you know, a lot of

8:16

these discussions, like somebody that is very

8:18

dyslexic as a writer has always struggled

8:21

with reading, Like they're

8:23

not saying that reading is ableist, but I think they're trying

8:25

to open a discussion, albeit poorly,

8:28

about you know, writing

8:30

and reading and the accessibility sort

8:33

of nature of that. Right, Like, fine,

8:36

you know, obviously they're not communicating it.

8:38

Well, obviously it's easy to dunk on. Does

8:41

that mean that that person is to be subject

8:43

to weeks of pile-on?

8:46

And like, no,

8:49

no, that's just crazy, definitely not right.

8:51

And so first of all, I was always team

8:54

make chili for your neighbors always. I always

8:56

think that that is that woman is an angel

8:58

and lovely and a person attacking

9:01

them was clearly not you

9:03

know, being very kind and

9:06

whatever. But my feeling is just as

9:08

somebody that deals with a lot of harassment is like, Okay,

9:10

we don't need to like necessarily destroy

9:13

that other person's life. In this case, I realized

9:15

that person is actually a notorious troll, so I'm not really

9:17

going to go out of my way to defend them, but right,

9:19

but when they're not, sometimes people

9:22

don't word things correctly or word things

9:24

perfectly, and I don't know

9:26

it gets picked up by some account, but it is

9:28

that that is like part of the

9:30

problem with Twitter, I mean social media kind

9:32

of largely, but it's really

9:34

easy on Twitter. I mean Twitter is sort

9:37

of custom made for

9:39

this sort of like one I said

9:41

something super dumb, maybe not even

9:44

dumb, but like I said something that was a poorly

9:46

formed thought, or I worded it badly, or

9:48

I was having a bad day, and so I kind of said it

9:50

in a way that was a little bit more aggressive or something.

9:52

And then immediately all of the tools of

9:54

Twitter are like not the people, but like the

9:56

tools they provide are like you

9:58

could amplify it, and you can dunk

10:01

on it, and you can like screenshot it, and you

10:03

can quote tweet and you can do all these things that are like

10:05

make it very easy to

10:08

generate that kind of context

10:10

free commentary about

10:13

the thing. In fact, you and I talked about this on Twitter

10:15

a little bit. Um you were talking about the quote

10:17

tweet function and how it's like not it's

10:19

not automatically like a harassment function,

10:21

but it can be used that way. I mean you were sort

10:24

of like saying like it's not really

10:26

that big of a deal, and I agree, but people

10:28

do use it in a way that's like it feels like

10:30

it's sort of custom designed sometimes for harassment.

10:33

But a part of that

10:35

sort of sequence of events that we're talking about, which

10:38

the chili thing, and I don't know the whole details

10:40

of the chili thing. The poor the poor

10:42

woman that made the chili, and I hesitate to even talk

10:44

about her because she doesn't want any

10:46

more attention. This woman deactivated her

10:49

account like she's been through such a nightmare

10:51

experience herself. She just made the chili

10:53

for her neighbors. She doesn't deserve any of this. This woman

10:56

literally like went out of her way to do a nice

10:58

thing and is being harassed

11:00

by terrible people that I

11:03

think I inadvertently defended by just being

11:05

like no harassment. Like I just I'm

11:08

always very wary when I see pile-ons, you

11:10

know, But so I'm actually looking

11:12

the first thing that came up, and now it's like a little bit older.

11:14

There's new tweets that replaced it, like Glenn Greenwald

11:17

is tweeting about you or something because he's

11:19

obsessed with you. Oh, never search

11:21

Taylor Lorenz on Twitter. Here's

11:24

a tweet about you. It is screenshotting

11:28

a tweet. I believe it's about the chili.

11:30

I mean, you say, I said from the start, the woman who made

11:32

the chili, blah blah blah. Then there's a person who says,

11:34

well, chinche and I don't know who that is, but is that maybe

11:37

the person who made the chili? I'm not sure. Had

11:39

to deactivate again as a direct result of Taylor

11:42

Lorenz talking out of her fucking ass

11:44

about chili discourse. But I

11:46

guess that's fine because she has quote always

11:48

been on her side for real. Go fuck

11:50

yourself, Taylor, this is a tweet.

11:54

I know you don't even My

11:57

whole DMs have been

11:59

horrible. I'm getting hate mail. People

12:01

have tweeted me screenshots of personal

12:04

details already. I mean, this is

12:06

just I'm getting another

12:08

wave of harassment. So yes, I wish

12:11

I could deactivate. I mean I can't do that.

12:14

You could. If you did, what would happen

12:17

if you were just like I'm deactivating, Then they win. They

12:19

did that and then Fox News called

12:21

and, you know, then it becomes a story.

12:24

You are a huge target, like Tucker

12:26

Carlson constantly talks about you, right, he's

12:29

obsessed with you. I think we can say, um,

12:31

you're a big target for obviously right wing

12:34

people because well you're a journalist

12:36

first off, and they hate that. Actually it's interesting,

12:39

but there's a whole genre of internet

12:41

journalism which is like applying

12:43

basically the real journalistic

12:46

sort of tactics that anybody

12:48

in media would use for any kind of story

12:50

to like a space that where they kind of don't

12:53

know that journalism exists a lot of the time. This

12:55

is what I see happening. It's like you'll write

12:57

a story and you'll, you know,

12:59

name the person who runs some account, right,

13:02

like I think, maybe did you do a Libs of TikTok

13:04

thing it was something like this. Of course,

13:06

I I've found the woman behind

13:09

it and wrote about it. Libs of TikTok is

13:11

an account that harasses people basically

13:13

like on Twitter and maybe also TikTok

13:15

I don't know, and you

13:18

like, named who the person is, and people who

13:21

don't know how journalism works and don't know that

13:23

like when you put yourself in the public eye. You have absolutely

13:26

there's really zero chance that you should

13:28

expect to not have your name out there. Like

13:31

if you're an actor and you're using a fake name, someone's

13:33

going to figure out what your real name is eventually,

13:35

because that's just how it works. And and

13:37

they get they're really mad because they're like, this is

13:40

wrong. You're, like, doxxing this person. But

13:42

but you're just doing like normal journalism, and people

13:44

lose their shit and so so like right when

13:46

people get really upset about that, Tucker

13:48

Carlson gets really upset about it, and

13:51

you're now like a character, right

13:53

like, and I don't know how what this is like for you, but you're

13:55

like a character the people. Yes,

13:58

oh my god, Now you made me search myself often. I never

14:00

searched myself because it's terrifying. Like

14:03

the top tweet is like Taylor Lorenz

14:05

is an irresponsible, spoiled little shit that

14:07

will defend libs of TikTok what

14:10

and then only someone I mean, these people, these

14:12

like you said, I'm just a character to

14:15

them. I don't have humanity. I mean, this

14:17

is a problem with the internet. And this is exactly what

14:19

I was trying to say with my original tweet, like

14:21

although obviously it was defending the wrong people,

14:23

but it's like people just don't they

14:26

don't have any empathy for anyone, and they

14:28

just they flatten people down

14:30

into characters in their little online world.

14:32

And it's well, I mean, this is so, this is the thing that

14:34

I find like unbelievable. I've had

14:36

a couple of moments on the Internet when people

14:38

have piled on me for a variety of reasons,

14:41

some warranted, mostly not warranted.

14:44

But you're like, I mean, people are like

14:46

seriously badly harassing

14:48

you online all day, like all day

14:50

every day, Like they're like, oh, okay, we

14:52

need to pull somebody off the shelf for harassment,

14:55

Like there's Taylor Lorenz. I have no idea

14:57

what it's like. It must be completely insane most

14:59

of the time. But to your

15:01

point about empathy, don't

15:03

you feel like we've just designed

15:05

a lot of systems that are like essentially

15:08

big comment threads, like the old school

15:10

comment threads from the original Internet

15:12

that we you know, know and love and

15:14

that really like discourse on Twitter essentially

15:17

amounts to this sort of anonymous

15:19

one-upmanship of a comment thread, which

15:22

feels like generally unhealthy discourse.

15:24

Right, you don't have to agree with that, but I

15:26

totally agree. And so do

15:29

you believe there's a

15:31

type of social network that can

15:33

exist that doesn't end

15:36

up like this? I mean, because you've seen the worst of

15:38

it? Is there a version of this? Do you

15:40

think that where it isn't like this? Yes?

15:46

No, God no, TikTok is ten times

15:48

worse. Okay, I'm gonna get into that in a second,

15:50

But let me hear your thoughts. I mean, I

15:53

guess I'm such an optimist

15:55

about technology. I just I

15:58

mean, yes, it's horrible, and I think I've experienced

16:00

the worst of it, which is why I feel so

16:03

badly for inadvertently defending some

16:05

terrible person that's harassing this lady

16:07

that made chili. I think you're right that the

16:09

tools that we have now are broken. Twitter is fundamentally,

16:12

you know, not a great tool

16:15

set to communicate. But I do

16:17

think that there's value

16:19

in sort of connecting people at scale

16:21

in certain ways and allowing people to build

16:23

community. And I don't think that

16:25

like these platforms are irredeemable. I think

16:27

they're run by sociopaths

16:30

that don't care about these things.

16:32

But like, you know, it's it's not that

16:34

we couldn't take this and

16:36

build a better system, and I guess

16:39

that's what the goal is of Mastodon. It's just, it's

16:41

I think it's not consumer friendly enough,

16:43

right, Well, I agree with that. I mean, the problem with this

16:45

stuff is like the balance between making it easy

16:47

for everybody and making

16:49

it not unhealthy is like they're sometimes kind

16:52

of at odds, right, Like, it's kind of the easier

16:54

you make things, the less barrier to

16:56

entry there is, the more quickly

16:58

you ramp up on people sort

17:01

of abusing it. But it's

17:03

amazing to me that you can go through this stuff

17:05

and you

17:08

know, and not just be totally demoralized

17:10

and go like I'm going to quit this right Like,

17:12

I mean, look, it's bad online for

17:14

women generally, it's bad online for reporters

17:17

generally. But I feel like you experienced

17:19

harassment at a completely different level. I

17:22

mean, I've definitely experienced the worst of the worst

17:25

of the worst and literally like living

17:27

it like you said, I mean every day and today

17:30

my bad because I really defended a really shitty

17:32

person, but never normally, you

17:34

know whatever. I mean, the Glenn Greenwald

17:36

stuff today I don't even know what charge he made.

17:38

I just saw that Glenn Greenwald made like a crazy

17:40

board or whatever, like I don't know what you call it. It's like the

17:43

thing in a show when somebody's got like the

17:45

It's Always Sunny — the It's Always Sunny in Philadelphia

17:47

board. It's weird because Glenn Greenwald used to be

17:50

a kind of amazing journalist. It's

17:52

funny. It's like, you know, I've seen the greatest minds

17:54

of my generation where it's like Matt Taibbi, who

17:56

is like now working for Elon Musk doing, you

17:58

know, the Twitter Files or whatever, used

18:01

to be like a guy I'd see on like Bill Maher. Bill

18:04

Maher used to be kind of cool and like looks like

18:06

I'd see him go on the show and I'd be like, wow, Matt Taibbi is

18:08

an awesome journalist. And now it's just like

18:10

weird and embarrassing. But anyhow,

18:12

Glenn Greenwald was obsessed with you. Yeah,

18:14

I mean, I just think Tucker and Glenn and all these

18:16

people just like you're mentioning. They take

18:19

people, they make them into avatars for

18:21

everything that they want to rant against, and then

18:23

they you know, attack

18:25

people for it. And I just I think having been

18:27

through that and living in it, I just like

18:30

when I see pile-ons on viral

18:32

tweets, like it just it makes my skin

18:35

crawl and I'm like, no, like wait a

18:37

minute, everyone, like just relax, right,

18:40

I mean, you have a seat where you're kind of like, having

18:43

experience it, You're probably a little bit more sympathetic

18:45

to the person who's at the end of the

18:48

pylon, right, I mean, because you've

18:50

been there. But sometimes those are terrible

18:52

people. I guess, which is like trolls

18:54

all the way down. I think, yeah, yeah, Okay.

19:06

Now, TikTok is a social media platform that

19:09

I have, you know, some

19:12

experience with. I am not a TikTok

19:14

user. I actually I kind

19:16

of like try to avoid it, to be honest, because

19:18

I find what it does to be um,

19:22

you know, it's like candy. There's something about

19:24

it that feels unhealthy when I'm using it,

19:26

like particularly unhealthy. You're

19:28

saying TikTok is way worse than Twitter.

19:31

You just said something to that effect. Can you talk

19:33

about like what is going on with TikTok because I don't

19:35

have a lot of awareness, but it is not the stepping

19:37

stone to a better future for social media.

19:39

Is that what you're saying. No, TikTok

19:42

is not the stepping stone. I mean I

19:44

always describe it as like you basically just took

19:46

Twitter and you combined it with YouTube

19:50

drama and commentary channel culture,

19:52

and you meshed it together, and that's

19:54

what we have. So if you think quote tweets

19:56

are bad, wait until you get stitched. You

19:58

know, explain for the novice

20:00

TikTok user, what is stitching. Yeah,

20:03

it's basically like the TikTok version

20:06

of a quote tweet, where you kind of grab

20:08

a clip from someone's video and then you can make a reaction

20:10

to that. Right. I've seen people doing this where

20:12

they're sort of like responding to the things that the

20:14

person says in the video. Yeah,

20:16

it's a lot more effort, though, don't you think like it's

20:19

like you have to like think of production value.

20:21

Oh god no, I mean trolls on there don't

20:24

care. They just get on and start ranting about

20:26

you, and they'll quote tweet anything. They'll take a

20:28

sentence that you said out of context. And

20:31

and the thing is to like, as you know, there's

20:33

no mercy for anyone online. There's no

20:35

forgiveness, there's no damn

20:37

I fucked up. Sorry about that. It's like you

20:39

must pay and we will not be happy

20:42

until you run yourself

20:44

off the internet, which is very Kiwi Farms. You

20:46

know, like there's group think, right, and the

20:48

crowds of people will often follow,

20:51

you know, whoever is the loudest voice or whatever, and you end

20:53

up with a mob. You know, the feeling

20:55

of wanting to participate in that and like to

20:57

grab a pitchfork and like whatever. March up

20:59

the hill does not do anything

21:01

for me, but for a lot of people, it's really satisfying,

21:03

right to like have that moment of

21:06

I don't know, I actually don't know what it is like as

21:09

a person who has been on the receiving end of it, Like is

21:12

it is it hate? Is it envy?

21:14

Like it's pure, it's it's hate,

21:16

it's envy, it's anger, it's

21:19

everything. I mean, I'm just literally

21:21

just looked at my Twitter replies and this person

21:23

with you know, almost

21:25

thirty followers is yelling at

21:27

me, and you know, I

21:30

think it's, like you said, it is base human nature. I

21:33

guess it's like it's like this this like

21:35

mob justice mentality, um,

21:38

and I just think it's toxic

21:41

to feed into that in any form,

21:43

and yet you have to keep dipping back

21:45

into these waters because you do it for like

21:47

a living, Like you cover this for a living, right,

21:50

I mean there is this weird thing that's

21:52

happening where you're reporting

21:54

on the thing and

21:56

then you're in the thing, right yeah,

21:58

which which seems almost like, I don't

22:01

know, you don't ever just think about quitting, like they

22:03

couldn't even talk to you if you weren't there. Well,

22:06

sure, but no, because I guess,

22:08

I mean, I just think about what my life was like

22:10

before the Internet, and I never ever, ever,

22:12

ever would want to go back to that. I mean, that's

22:14

worse than anything that I deal with today, in my opinion.

22:18

Yes, I mean I can't imagine

22:21

it. So bad? Was your

22:23

life before the internet that bad? Yeah?

22:25

I was miserable. I mean I was deeply, deeply,

22:28

deeply depressed. Like I

22:30

can't I that would be the worst

22:33

thing to go back to no Internet.

22:35

I just I don't know. I mean, it's

22:37

bad now, but I think what what's

22:39

better is there's also all of this

22:42

positive that comes with it. Right, You're not only

22:44

getting the bad every day, You're getting a lot of good.

22:47

And I think It's like I don't

22:49

want to give up that good no matter how much bad

22:51

comes with it. For now. I

22:53

mean, to hear that from you is like kind of amazing.

22:56

I think about Elon Musk as such a strange

22:59

like I can't understand it because he's

23:01

like the richest man in the world or used to be

23:03

before he bought Twitter. And if

23:06

I were the richest man in the world and I

23:08

was like, I'm going to go to Mars and I'm gonna make electric

23:10

cars or whatever, I would never tweet, like I

23:12

would not go online. Well,

23:15

like, why would you go online? You know, if I

23:17

was a billionaire, Josh, I wouldn't be online.

23:19

To be clear, I'd be okay,

23:21

right right. I'm

23:23

not saying that you have the luxury, but I

23:25

do find it interesting. Like I mean, is there

23:27

no version of this where you don't ever look

23:30

at Twitter? And you're not looking This is exactly what

23:32

got me into trouble today, Josh, is

23:34

that I didn't pay attention. I I honestly,

23:36

I didn't go that deep into the chili discourse,

23:38

and had I gone that deep, I wouldn't be getting all

23:40

this backlash. It's like, if I don't know every

23:43

single niche drama niche

23:45

nuance, that I get crucified for not knowing

23:47

and and it is on me to know.

23:50

But the chili discourse, which incredible

23:53

that we're even talking about the chili discourse. We

23:55

need to stop because I don't want people bothering this

23:57

woman. Well, you've already like stepped in it,

24:00

so I think it's okay. You know, this is like maybe if people listen

24:02

to this, they'll be like, Okay, she really did feel bad about

24:04

that tweet that she did because she wasn't

24:06

like hadn't read all of the discourse, but

24:08

like the chili discourse does not matter right,

24:10

like in the world and the grand scheme of our lives.

24:13

Well, yeah, can I just push back on that? Is I

24:15

think that these things are actually indicative

24:17

of important ideas. I mean, that's what I liked about

24:19

Rebecca Jennings piece, which is what made me weigh

24:22

into all of this to begin with. I

24:24

do think that to write about technology

24:27

from the user side, which is what I do,

24:30

you have to be a user yourself. Like I mean

24:32

so much, so much of what I've been

24:34

through has really deeply informed my

24:37

reporting. And also people

24:40

know that they can trust me because

24:42

I understand things um in

24:44

that way. Yeah,

24:47

But I mean, do you do you think the

24:49

chili discourse is important? I

24:51

think that we're reaching a point where these

24:53

dog piles are getting pretty frequent, and

24:56

I think that we should have a conversation about

24:59

whether they're productive or I personally think they're

25:01

not productive. I don't think that this woman

25:03

should be dog piled right for making chili.

25:06

Well, like Rebecca Jennings, who's an amazing reporter

25:08

reporting on like the culture of the Internet,

25:11

she actually references another one

25:14

of these sort of dog piles, which is about

25:16

this woman who tweeted that her and her husband like sit

25:19

out in their garden and have coffee and talk

25:21

and she loves it, and then people were like,

25:23

fuck you, how dare you say you

25:26

enjoy having coffee with your husband. What

25:29

Rebecca very aptly points out, which

25:31

is exactly — I mean, Rebecca makes

25:33

the point — I've linked her piece a hundred times because I'm

25:35

like, Rebecca made this point better than me. But what

25:37

she points out is that like that there

25:39

was only a couple of those replies,

25:42

but people go searching for those replies

25:45

to get outraged about and then make content

25:47

about, like can you believe right.

25:50

I mean, I don't know if you remember the SNL character,

25:52

the Debbie Downer character, but we

25:54

live in a kind of like sea of

25:57

Debbie Downers. Like it is kind of like I

25:59

mean, but like Josh,

26:02

like those people are often getting bullied.

26:04

Those people are clearly miserable, like clearly

26:06

sure, yes, people who are

26:08

mean to other people are often bullied. They're not

26:11

necessarily mean to people. They're not even being mean.

26:13

A lot of times they're just like, hey, can you consider

26:15

X Y Z? Is that inappropriate? Sure?

26:17

Kind of it's it is

26:20

crazy now, it is crazy. You're like, I

26:22

enjoyed this movie. And then someone's like, you're a fucking

26:24

idiot for liking that movie, I hate you. I hope

26:26

you die. That's not what these people are saying. No,

26:29

they're not okay, but they are saying stuff

26:31

like someone's like, okay, I love to

26:33

sit with my husband drink coffee, which is

26:35

totally innocuous, Like that's great if you like to

26:37

do that. I think that's wonderful. And people's responses

26:39

are like, oh, that's nice for you. While I

26:41

have to work like three shifts at my job and

26:44

do X, Y and Z, and like I'm glad you're enjoying

26:46

your life, and it's like sure, like

26:48

everybody's life is different, and like you definitely

26:51

can feel shitty that you don't get to like sit in the

26:53

garden with your husband, which honestly

26:55

the whole thing sense. You know, it's really fucked up,

26:57

Josh. But you know what's really fucked up

27:00

is that people then screenshot that reply

27:02

from that random, miserable, you know,

27:04

person that's clearly in a sad place,

27:07

and then tens of thousands of people

27:10

will dox and harass that person for

27:12

weeks. That's what It's

27:15

all very unhealthy. What the

27:17

whole the whole system is very

27:19

unhealthy. Like I actually feel like we

27:21

have discovered areas of the Internet

27:24

that are not great, and areas of humanity

27:26

that make no sense for us, Like like, here's

27:29

the deal the woman who loves to sit outside with her

27:31

husband and drink coffee or whatever, Like I

27:33

don't know who the message was for. I

27:36

mean, did I need to ever hear it? I'm not sure

27:38

that that's the case. Did you need to hear it? I don't know if that's

27:40

the case. The people who are agitated by

27:42

it because they don't have the same kind of luxury in life.

27:44

Did they need to hear it? I'm not sure that's the case.

27:46

And then on the flip side of it is

27:49

like, did we need to hear their agitation?

27:52

With the first thing, It's like we're

27:54

not supposed to think out loud

27:57

all the time, right to large groups of people.

27:59

And so I think there's this like fundamental

28:02

damage that's been done to us as

28:04

as human beings that we

28:06

think a natural order of things is that

28:08

when you think something anything, you

28:11

say it to this larger group of people as

28:13

possible, and then you wait for the replies

28:15

to roll it. Like there's no historical

28:18

precedent for anything like that for

28:21

humans, right, Like I

28:23

think we're emotionally and mentally falling apart

28:25

because of it. And I say this, by the way, as I

28:27

could not be a bigger fan of the Internet, you will

28:29

have met no person who was more of

28:31

a proponent of the internet would be like in

28:33

the earliest days or even in the later stages

28:36

of it, even like I mean, I wrote this thing

28:38

about the Internet at the beginning of the pandemic, like about

28:40

how great the Internet is because think about

28:42

all of the ways it was like useful to us Honestly,

28:45

the pandemic would have been fucking crazy if

28:48

the Internet didn't exist, Like, there are so many things

28:50

that would have been Obviously it was crazy for all

28:52

sorts of reasons. But I mean, the pandemic,

28:54

the the ongoing pandemic. It's

28:57

so there are still good things. But I

28:59

guess my point here is, like, I

29:01

think it's interesting that you have a ton of empathy

29:03

for the person who is like the shitty

29:05

person in the exchange or

29:07

like the other shitty person, for

29:10

both sides of it, because I think

29:13

because I've been through it so many times

29:15

and I just know how these pylons

29:17

go, that I just again,

29:20

it's just and I'm probably maybe I'm too brain

29:22

poisoned in that way now where

29:24

I have empathy for the trolls. Clearly,

29:26

I'm clearly I'm like empathizing with some monster

29:29

this morning. And but I agree with you,

29:31

Josh. I mean, I think context collapse is a huge

29:33

problem on Twitter. And one thing that I've always

29:35

said, although these people love to claim that I want,

29:37

you know, censorship and stuff, no that's

29:40

not true. I mean what I want, and I've always

29:43

said, is like the ability to segment our audiences

29:45

more. I mean, so much of this is

29:47

that context collapse, like who are you posting that?

29:50

for? Probably just your little following,

29:52

right or maybe you know

29:54

you don't need it or mean for it

29:56

to even be sucked into that. But the way the networks

29:59

are designed is you

30:01

want maximum audience, right, like

30:03

like wait, listen you're on? I said, I

30:05

was talking to Casey last week on Mastodon,

30:08

which you're like, I don't even like Mastodon. You have like

30:10

sixty thousand or maybe more now, like followers,

30:13

which is a ton. I haven't seen very many

30:15

people on Mastodon with that many, and like hardly any

30:17

followers, and it clearly is it works

30:19

in a different way, like getting followers

30:22

is different than on Twitter, but like you

30:24

have a big following there, you wouldn't want to just be followed

30:26

by like ten people that you're close friends, right,

30:28

Like that doesn't make any sense almost now,

30:31

I just like, oh my god, what the fuck? Who are

30:33

these people? How are you getting eighty thousand

30:35

followers? Oh my god,

30:38

no, you do. It's incredible. There's things

30:40

that I really like about Mastodon. I think

30:42

the norms are different on there, and

30:44

people have more space to express themselves

30:46

like Twitter is bad. And

30:48

also now we have a CEO setting a terrible example

30:51

himself, So I don't think it's

30:53

gonna necessarily get better. I

30:55

have to get — I've got to get rid of my Tesla that I

30:57

lease, because I'm embarrassed to drive it. You

30:59

have a Tesla, Well, this is the funny thing, is like, when

31:01

I've got the Tesla, I was like,

31:03

Elon Musk is like, you know, he's kind of got a little

31:06

rough around the edges, but he seems like a pretty interesting guy, and this

31:08

car seems pretty great, Like there weren't a lot

31:10

of options for electric cars, and I didn't want to get another

31:12

gas car. And now when I drive it, I

31:14

feel like, I mean, my car

31:16

has been spit on by people while I'm driving

31:18

for the record, Like my car

31:20

got spit on by a guy randomly the other

31:22

day. Uh, I'm assuming

31:25

because it's a Tesla. I mean, I don't think I was driving

31:27

in any way that was particularly erratic or something.

31:30

So now I'm like, how do I get rid of this? But

31:32

like you know, Elon Musk, I would love

31:34

for him to concentrate on the Teslas. I don't know

31:36

why he's interested in Twitter it doesn't

31:38

make any sense to me, Like I am, I am

31:41

so perplexed as to

31:43

again, this is what I'm saying about the billionaire,

31:46

Like you could do anything. Why

31:49

are you down here with us in this like

31:51

garbage pile? You know, Like I

31:54

had to use Twitter because I was like, I'm a journalist,

31:56

and we were all on Twitter because that's where all the journalists

31:59

were. And then, like, eventually you can find like stories and

32:01

talk to people and do whatever. But it's like he

32:03

doesn't have to be there. He didn't have to buy it. I mean,

32:05

I guess eventually he did have to buy it, But I

32:07

mean, but it's all ego and culture war.

32:09

He's clearly been red-pilled. Literally,

32:23

by the way, as I'm talking to you here, I

32:26

just saw you just retweeted

32:28

something you retweet live while we're talking. We're

32:30

just now, Yeah, well, I just got an alert

32:32

that somebody tagged me, so I retweeted,

32:36

you're so online you can't even take forty

32:39

minutes for a podcast. Okay, I

32:42

think I know It's fine. I don't care. Believe me. I literally

32:44

you can see both my hands now. I just I

32:46

wanted to support the podcast that my

32:48

friend tagged me in. I like the fact that you are

32:51

authentically being yourself.

32:53

You're online, You're on Twitter right now. Here's

32:56

the thing, Okay, I want to have a couple of questions for

32:58

you. I want to know if you

33:00

had to rank the harassment you've experienced, like,

33:02

what was the thing that triggered the worst harassment

33:04

you've experienced in your career? Do

33:07

you have like a top pick? Oh

33:09

god, um, I don't know. I

33:11

mean, PewDiePie made a video

33:14

about me like years and years

33:16

and years and really and yeah, he

33:19

was mad because I said something about Mr Beast.

33:21

I just remembered that because that was one

33:24

of the few times that people really started targeting my

33:26

family and and that that was like

33:29

memorable to me as like a turning

33:31

point where I was like, oh my god, because people would

33:34

harass me before that. And then ever

33:36

since then, it's just been NonStop

33:38

stuff with my family and friends,

33:40

really anyone I've ever had with on Instagram.

33:43

Yeah, I mean PewDiePie is. And what's interesting

33:45

about PewDiePie is that I think a huge segment

33:47

of his audience is like young, very

33:49

young people, Like it's pretty

33:51

popular with teens. I

33:54

have no beef with PewDiePie. I actually

33:56

think he's he can be very funny

33:58

his audience is children, as you mentioned. And I think

34:01

that these YouTubers, you know,

34:03

the Jake Pauls also have, you

34:05

know, done a lot of crazy stuff. It's like, I don't

34:07

that kind of harassment doesn't bother me because

34:10

it's children and I can read it and just be

34:12

like, Okay, an eleven year old wrote this message,

34:14

right, Oh god, it makes me feel

34:17

bad for the eleven year old though, Like I get depressed,

34:19

I mean hearing about that. Like I

34:21

understand, like you know, there's there's

34:23

always some element of this in the world,

34:25

but it does feel like we've just created

34:27

this system where like

34:30

what is that? I mean, when I started blogging back

34:33

in the old days, when there were no cars and I had to walk

34:35

through several miles of snow to get to the

34:37

blogs. Uh, But when

34:40

I started blogging, like it

34:42

was very clear to me that there was some element

34:44

on the Internet that, like fanboys

34:47

is what we would have called them, like an end gadget or

34:49

whatever, where they

34:51

were so into something, they were

34:53

so driven by this love of this

34:55

thing that it that it made them do unusual

34:58

things, right like unhealthy thing. I

35:00

think I got like something like — somebody made a

35:02

death threat to me because I reviewed a Windows

35:05

phone or something that like I gave it a bad

35:07

review. This is like two thousand eight

35:09

or something, you know, like someone was like,

35:11

I'm going to come to your house and kill you because

35:14

you don't like this phone or something. And I'm like, it's

35:16

such an extreme and unusual

35:18

behavior. And I don't think they were kidding. I mean, they were

35:21

not serious, but I don't think that their emotion

35:23

was like I'm making a joke. I think they were like,

35:25

I'm really mad at this person. And you

35:28

cover this stuff so much so you see it firsthand.

35:31

We've created it this unhealthy

35:33

environment. And I guess, like, do

35:35

you ever think about just getting a different job, like

35:37

is it worth the effort? Like you're very

35:40

talented, you're very smart, you're very accomplished,

35:42

You've done amazing work in the world of journalism.

35:45

Do you ever think like, I'm just gonna fucking get a PR

35:47

job or something, or like, uh,

35:49

I don't know what would you do? What would the next job

35:51

be if it wasn't journalism. I

35:53

think about that a lot. I Mean, the thing is

35:56

is I feel like the reason I have this job

35:58

is because I want to affect the media,

36:00

and I care about media a

36:02

lot, and I I feel like when people don't

36:04

understand the Internet, really bad things happen,

36:06

right, And I think we need to understand

36:08

the world around us. My whole goal for

36:11

a long time was just getting people

36:14

to take this beat seriously. I

36:16

think there's so many amazing reporters like you mentioned

36:18

Rebecca, and you know there's others on my beat

36:20

as well that are just like phenomenal reporters.

36:22

So it's not like I quit and the beat

36:24

dies. But I do think that I

36:26

care a lot about my work. I mean, yeah,

36:29

sometimes I think maybe

36:32

I'll just go back to doing social media strategy.

36:34

I could definitely make more money, and it would be

36:36

great to pay off some debt. I think of all you've

36:38

learned. I think of all you've learned of the last decade

36:41

or whatever, of of of like riding

36:43

the insane waves, of of of what

36:45

you've experienced. But whatever, we

36:47

only have one life, you know what I mean, You might as well

36:49

just like do what you want and try

36:52

to affect the world in some way or

36:54

change the world in some way, like

36:56

I want to change the media industry,

36:58

so I might as well work at it. You know,

37:01

do you have a lot of de-stressing techniques

37:03

that you because like there have been a couple,

37:05

like I said, a couple of times when I've experienced like really

37:07

directed harassment. But this

37:10

is like it's so overwhelming, Like do you ever disconnect

37:12

from this? Like how do you disconnect from

37:14

it? I mean, I think in was

37:16

like a bad year for everyone.

37:19

Um, you know, I lost

37:21

several friends to suicide

37:23

and it was really hard, and like

37:26

the pandemic you know, kicked off. I'm

37:28

severely immunocompromised, so that's been a nightmare.

37:30

Like all this stuff has been bad. And definitely if

37:32

you ask me in like, I did

37:35

try to quit my job actually, and my editor talked me

37:37

out of it. But

37:40

but now I'm just like whatever, the worst that's

37:42

going to happen has happened a million times over

37:44

to me, and that is very freeing. So

37:48

you know, why would I quit now?

37:52

I mean, I've already been through all that. But to answer your question,

37:55

de-stress — I mean, yeah, I'm very, very

37:58

very strict about keeping my personal

38:01

life off the internet. So I have

38:03

a personal life like with friends, and

38:06

I do things in L.A. and, like, it's never

38:08

online and never will be online, and I have friends

38:10

that respect that. You're

38:13

right, Like your Instagram is

38:15

not like pictures of you and your friends doing stuff,

38:17

right, It's like it's still like just

38:19

kind of online stuff. I mean, maybe you have a

38:21

secret Instagram. I don't know, but it's like I

38:24

have like thirty Instagrams. People can find

38:26

every one of my instagrams. You won't find any

38:28

of those photos because it's not something that I participate

38:31

in. I don't, I don't like the parasocial

38:33

stuff. I don't want people who follow me to develop

38:35

a parasocial bond because I

38:37

think it's unhealthy and

38:39

I just I think having that strict boundary

38:42

even and having a strong sense of self Like

38:44

ironically, I think the people who really helped me

38:46

deal with managing

38:48

all of this and being in the public eye are people that

38:50

are sort of way more famous

38:52

and regularly see themselves kind

38:54

of reflected through this like bizarro

38:57

world version of yourself,

38:59

like on the Internet or in the media. Like that

39:01

used to be really disorienting and it used to really fuck

39:03

with my head and I'd want to get online and defend myself

39:06

and be like, no, that's you have it all wrong.

39:09

And now I kind of, you know, I'm

39:11

at peace with it, and you're just like

39:13

whatever, they're going to think what they're gonna think, and it doesn't

39:15

matter what I say. Yeah, I

39:17

mean I think there you have to take some steps to like

39:20

manage your reputation. But like it

39:22

doesn't affect me the way that it affected me two years

39:24

ago. Right, Okay, we're pretty much at time,

39:26

but I mean, god, we could talk for so much longer, but

39:29

talking about this stuff, I

39:31

do have one question, Well maybe it's like

39:33

one ish question, Like

39:35

I feel like one of the things you're really really good at, like if we can

39:37

just talk about things that aren't like super unpleasant, like people

39:39

who are asking, you're really good at like seeing

39:42

trends, like trend lines of things that are

39:44

happening, like and I think actually one

39:47

of your greatest assets as a reporter and

39:49

as a journalist is to look

39:52

at all of this sort of insanity and all of this

39:54

chaos and go, this

39:56

thing here is important and we need to pay attention

39:58

to it. Or this is a thing that's both good and bad.

40:01

But it's like influencer culture,

40:03

for instance, like the boom

40:05

and I'd say it's the last, I mean, I'm gonna — we'll

40:07

call it five-ish years, right, I mean, obviously

40:09

there's YouTube and stuff, but like the rise of

40:11

TikTok and Instagram and this whole

40:13

sort of like economy and culture around

40:16

influencers which has so many

40:18

ways of being expressed, and you were

40:21

I feel very very early in observing

40:23

that or in talking about it, this

40:25

is my long-winded way of getting to an actual question

40:28

what is the next thing? Because influencer

40:31

culture is kind of like weirdly waning or I

40:33

don't know, it's like feels like it's weakened. I

40:36

think it's I think it's just more

40:38

distributed now, it's more distributed,

40:41

but but undeniably we're moving

40:43

towards the more fractured

40:45

and fragmented media environment in

40:47

a lot of ways. Do you see anything forming? You

40:50

know, is it is it some

40:52

other mutation of influencer

40:55

culture? Yeah?

40:57

You know, like what is it? Like? What is the next thing? What

41:00

I think is that we're at an inflection

41:02

point. And maybe you feel like this too,

41:05

Like I feel like we obviously,

41:09

like the early were

41:11

the rise of digital media, and we were all so excited

41:14

about the Internet, and we all had these positive feelings

41:16

and then it all kind of came crashing down, you

41:18

know, and everyone's like, oh, no, disinformation,

41:21

da da da da um. But

41:23

I also think we're all just getting more and more

41:25

online, and I don't think that's

41:27

gonna change at all. Like I think

41:30

that the Internet. No, you think

41:32

we're getting more online. You feel like that's

41:34

the trend line. Yeah, I think we're gonna have

41:36

like computers in our brains in like thirty

41:38

years. That's definitely possible. I

41:40

mean sure, I mean obviously the logical progression

41:43

is like brain implants or something, though I

41:45

think we're pretty far away from that. I

41:47

if you had asked me that, I would say, I

41:49

feel like the trend line is like people are

41:52

backing away from being online and

41:54

like starting to go like

41:57

maybe I don't actually need this. No,

42:00

well not to say like no, um

42:02

my prediction. Let

42:05

me quote tweet you: false. Um.

42:07

You're right that people are backing away from these

42:09

open, broadcast based social platforms.

42:12

I think that the era of us posting

42:14

that we're at brunch to the entire

42:16

world is definitely going away. I mean,

42:19

I think people are retreating into private

42:21

communities. That's been a trend that's been documented

42:23

for years. Um so I do.

42:25

I do think that's happening. But I don't think that

42:27

our world is getting less networked or

42:30

you know, things are becoming less online, especially

42:32

post well not we're not post pandemic,

42:35

but since the pandemic started. I

42:37

think that accelerated so many of those

42:39

shifts. And yeah, there's like a slight retraction

42:41

with some of it, but not really like it

42:44

pushed us forward so

42:46

many years that even if it's two

42:48

steps forward one step back, like we're still moving

42:50

that way. Yeah. I mean, even like delivery

42:53

culture, the way it's changed in the past

42:56

three years or four years or whatever has been

42:58

not like just been. I mean,

43:00

you don't even think twice now about like having your groceries

43:03

delivered, where like I think if you go back four years

43:05

ago, most people weren't even thinking about it right,

43:07

Like they would never even consider it. But what's

43:09

so interesting about that is it's like utilities

43:12

in a way that are very online, that

43:14

are connected to like our

43:16

phones and are you know, laptops or whatever,

43:18

but are not It's not the same

43:20

thing. I mean to your point about people going into these like smaller communities.

43:23

I think that's really interesting, or more

43:25

personalized communities, and

43:27

that not again not to talk about Mastodon,

43:30

but that is one of the things I think is interesting about this idea

43:32

of a decentralized social network. I

43:34

thought for sure I wouldn't want to join another

43:37

social network, but I do think there's something

43:39

interesting about like downsizing

43:41

the social sphere. And I've talked about

43:44

this a bunch of this podcast because it's on my mind all the

43:46

time. Where you've got like people talking about Twitter being

43:48

this like town square, like Global town

43:50

Square. It's like, I think people

43:52

are realizing that, like they don't

43:55

want that, and but there is some social

43:57

connection they want and how do you get at it? And I don't

43:59

know that anybody's figured

44:01

out like the killer app for that yet. You

44:04

know, it's like, is it, is it iMessage? No,

44:07

I think, I mean, I do think it's things

44:09

sometimes it's things like TikTok that allow for

44:11

more segmentation, and it's not on the there's no

44:13

burden on the user to define

44:16

their own you know, it's kind of self selecting,

44:18

like, hey, this is content you're interested in, These are people you're interested

44:20

It puts you inherently into these niches and

44:23

then yeah, I mean, and then of course group chats

44:25

and discord and things like that. For

44:27

sure, I loved Ian Bogost, who

44:29

I used to work with at The Atlantic, wrote a great piece about

44:31

the end of social media. I don't know if you read it, but

44:34

I haven't. But I'm a big fan of anybody

44:36

where that starts with the end of blank. I'm

44:38

always — having written some of those myself.

44:41

It's a good it's a good history. He's very smart

44:43

about it all. I agreed with him. I wrote a similar piece

44:45

and I killed my piece because I thought he wrote it better than me.

44:47

Yeah, Like, it'd be cool if social media ended.

44:49

I think that'd be great. I agree

44:51

with you. I think, like these big broadcast social

44:54

platforms are really bad um

44:56

in a lot of ways, and I think they

44:59

should not function

45:01

the way that they function now for sure, But they're

45:03

valuable for other reasons. And I don't think that we should

45:05

lose spaces to connect with people

45:07

en masse, if that's what we're trying to do.

45:10

But imagine we found a better way to do it. Like,

45:12

imagine there was something else that came after

45:14

this, and people are like, oh wait, I've

45:17

been screaming to people online for no reason,

45:19

Like I spent all this time yelling

45:22

and I didn't have to. I think that would be really

45:24

lovely. I just can't wait to see what happens. I

45:26

was just gonna say, like, the reason I love my job

45:29

and I love covering all this stuff is

45:31

I'm sure you do too, Josh. It's like you get

45:34

such a front row seat to all of these shifts, and it's

45:36

so fun to get to talk to people, and

45:39

it's interesting. I mean, I

45:41

do still love the Internet, but I feel such a sadness

45:43

now so often that like we

45:46

haven't done more. It feels like, you know,

45:48

we have spent so much time and energy

45:51

and it's like

45:53

so much of it has gone bad.

45:55

But I will say, talking to you, a

45:58

person who has received an enormous amount

46:00

of harassment and hatred and

46:02

just frankly rudeness all over the

46:04

Internet, the fact that you can continue to

46:06

have what I would describe as

46:08

an optimistic viewpoint about the

46:11

very things where like your harassment has taken

46:14

place is honestly, like it's very

46:16

inspiring. I have been

46:18

feeling lately, you know, am I just a little

46:20

bit too doom and gloom about everything. But I

46:23

don't know. You're giving me potentially

46:25

a feeling of hope, because if you haven't

46:27

been beaten down, then maybe

46:29

there's hope for us all. It's

46:32

like that meme, you know that meme of the kid on the school

46:34

bus, and there's one guy looking out

46:36

at the dark side, and I

46:38

like the one where it's like they both say like everything is

46:40

terrible or something, and it's like

46:42

the one guy's like everything is terrible and the other person's like everything

46:45

is terrible. That's

46:47

how I feel. Wow, so you're the everything

46:49

is terrible with exclamation marks, is what you're saying.

46:52

Taylor, thank you so much for taking time to do

46:54

this. I really have enjoyed this conversation.

46:56

I really appreciate you taking time out of

46:58

your... I mean, you were probably about to file a story

47:01

that will lead to harassment, or you

47:03

may have missed out on some harassment during

47:05

this conversation. Um, but I really

47:07

appreciate you taking the time and I hope that you'll come back and we

47:10

can figure out if like the next thing happens

47:12

and then you can tell us all about why

47:15

it's either good or very bad. So thank you,

47:17

yeah, thank you so much for having

47:19

me. Uh

47:29

well, that was fascinating. I'm

47:33

I'm so surprised at

47:35

Taylor's attitude towards being

47:38

online. I mean, if I were her now,

47:40

again, I've received only a small amount

47:42

of harassment by comparison, but I would crumple

47:45

like a wet newspaper and

47:47

that would be it for me. I mean, if

47:49

you've looked at the kinds of stuff

47:52

that people are saying to Taylor on a daily basis, Like

47:54

literally Tucker Carlson makes her like the

47:56

point of his monologue. Sometimes it's

47:59

so unhealthy. I mean, I'm not saying

48:01

she's being unhealthy, but like there's so much unhealthy

48:03

stuff that's like thrown at her. It's

48:07

hard to believe and

48:09

imagine being able to like face that and then

48:11

go yeah, I'm gonna keep going, so like

48:14

what backbone, what resolve! Resilient,

48:18

resilience, unbelievable. I mean, nevertheless,

48:20

she persisted, I think, is really what

48:22

I wanted to say there. Um,

48:25

you coined that, right, that was my original

48:27

phrase. Um, that's my

48:29

thing about all the ladies fighting

48:32

the good fight. You don't see those shirts

48:34

a lot anymore. You don't see that. Nevertheless, she persisted,

48:37

No, it was pretty trendy. Yeah,

48:40

but anyhow, that was super interesting. I mean, I know

48:42

we've talked a lot about social media lately because

48:44

it's just frankly, look, since

48:46

the world didn't blow up, there was no red wave,

48:49

you know, in fact, there's

48:52

like an extra Democratic senator as of

48:54

last night or whatever. You know, news

48:56

has been kind of quiet, and Twitter has been a disaster,

48:59

and there's all kinds of nasty social media stuff

49:01

going on. But Taylor, she's

49:04

such a unique character

49:06

in the mix because she

49:09

is one of these reporters, and

49:11

there aren't that many of them, who report

49:14

on this stuff. They're very

49:16

I mean, her reporting is very, very good

49:18

and smart and gets at stuff that people never

49:20

see in a million years until like she illustrates

49:23

it. But then she's also in it. She's like

49:25

in the stuff that she's talking about. Like, she

49:27

is there online with the people she's reporting

49:29

on, and like she's not flying in at

49:32

this like ten-thousand-foot view, the way a

49:34

traditional journalist might go to, you

49:36

know, cover the strike at the coal mine or

49:38

whatever, but like, you're not working in the

49:40

coal mine. You're not like you don't live in West Virginia

49:43

or whatever. You go and do the story and

49:45

then you leave and then you don't encounter any of

49:47

that. But this is like she's literally

49:49

like, well, I went to report on the strike in the coal

49:51

mine, and then I went down to the coal mine and started like

49:54

hammering away at the coal or whatever it is they

49:56

do in the coal mine. I assume there's some kind

49:58

of mining that goes on, and Uh,

50:01

that takes it. I mean, it just takes a kind of

50:04

I don't know resolve

50:06

that I don't feel like

50:09

like if somebody was like, you're gonna break the biggest

50:11

story about the worst trolls on the Internet and you're

50:13

gonna get super famous

50:15

for it, and someone's gonna want to give you a book deal and

50:18

you're gonna like make a movie out of it, I think I

50:20

would definitely pause and maybe

50:22

say I don't want to do that, because you would

50:24

become a lightning rod for some of the worst like

50:28

hate that you can possibly

50:30

imagine. And so yeah,

50:32

it takes a special kind of fearlessness

50:34

to do it. And Taylor's got it. Taylor's

50:37

got it in spades. I can't

50:39

get in touch with that. If anything, I

50:41

mean, my dream is to retreat to the woods,

50:43

as you know, to get my axe

50:46

and to just start chopping away at

50:48

the wood, all of the wood that I can get my hands on.

50:50

This is a classic like New

50:53

York media guy fantasy.

50:55

It's like cutting wood, building

50:58

fires. I love doing that. Um,

51:00

I've been having this like thing with my trash.

51:02

Have I told you about my trash problem?

51:04

No. I moved my trash

51:07

cans outdoors because I didn't want them

51:09

inside. They were in my garage originally, and I wanted

51:11

to put them outside. And so I have this thing

51:13

that I built where it's got a lid, and I put the

51:15

trash cans in it, and I'm like, that'll

51:17

be fine. The raccoons will never get in

51:19

there. And they did

51:21

get in there, and they like

51:24

chewed like I had plastic trash cans. They

51:26

basically chewed the lids of the trash cans to the point where

51:28

they looked like a cookie that like somebody had

51:30

taken like a chunk out of like plastic,

51:32

you know, did they eat the plastic? I don't know, because like there's no

51:35

plastic scraps anywhere. And so they

51:37

like found a way to get into my trash cans.

51:39

And then they were eating, they were pulling

51:41

the trash cans out. Then I put a barrier in so they

51:43

can't pull them out. And they were going into the trash

51:46

cans and they were eating inside and drawn

51:48

and it's like and

51:50

then
