Should Big Tech Stop Moderating Content?

Released Wednesday, 27th March 2024

Episode Transcript

0:06

If you were on social media this past weekend, you

0:08

might have seen some users who were really upset over

0:10

a new policy. Instagram

0:12

decided to remind us just how much they

0:15

suck with their latest update. Now

0:17

the new default setting will limit the political

0:19

content you might otherwise seek. Meta

0:22

is literally censoring political topics off

0:24

of Instagram and Facebook. This

0:26

comes just a couple weeks after the House passed

0:28

a bill that aims to ban TikTok if the

0:30

company doesn't change owners. Foreign adversaries

0:33

like the Chinese Communist Party pose the

0:35

greatest national threat of our time. It

0:38

seems like everyone, or at least everyone's mom, wants

0:40

to rein in the power of social media. They're

0:43

worried about their data being stolen, or the

0:45

election being influenced by propaganda, or

0:47

about their kids' brains being fried. Jay

0:51

Caspian Kang, a staff writer at The New

0:53

Yorker, shares some of those concerns. But

0:56

at the same time, he believes that access

0:58

to social media is a First Amendment right, even

1:01

if we're accessing it way too much. I

1:04

think that the structure of the

1:06

internet of all social media is

1:09

to argue about politics. And

1:11

I think that is baked into it, and I don't

1:13

think you can ever fix it. You're

1:16

listening to The Political Scene. I'm Tyler Foggatt, and

1:18

I'm a senior editor at The New Yorker. In

1:26

your latest column, you wrote that we

1:28

should resist a society in which every

1:30

human interaction gets processed through an algorithm

1:32

and broadcast out to a frequently nasty

1:34

public. I'm wondering if you can

1:36

talk a little bit about what this resistance to

1:38

social media has looked like over the years and

1:40

how it has evolved. Yeah,

1:43

I mean, I think that there's two

1:45

levels of it. The first is happening

1:47

on the sort of official government level,

1:49

right, where there have been all these

1:51

attempts to regulate social media companies. I

1:54

think for almost the

1:56

entirety of his reign as the

1:58

CEO of Facebook, for example, now,

2:01

Meta, Mark Zuckerberg has been

2:03

just routinely hauled in front of Congress. I

2:05

mean, you know, it's almost like a clock,

2:07

right? Like there he is in front of

2:09

Congress again. And so those efforts have not

2:11

really yielded too much in the way of

2:14

regulating social media and there's some very good

2:16

reasons for that, right? Like I'm not sure

2:18

if Congress or the government really

2:20

should be doing any of this. But on

2:23

the individual level, I think that some of

2:25

what inspired me to write about this and

2:27

this is actually, you know, something I write about quite

2:29

a bit is that I generally find

2:31

it odd that there is so little

2:34

resistance on a personal level, right? Or

2:36

even on a small community level to

2:38

these types of devices and to social

2:41

media generally. We expect the government to do

2:43

it instead of just doing it ourselves. Right,

2:45

that there isn't like, for example, phone-free

2:47

communes, right? Like I only say that

2:50

like partially facetiously, like

2:52

there are not social movements

2:54

to really get yourself away

2:56

from the internet. And I've

2:59

thought for a long time about why that

3:01

is because it just seems so natural that,

3:03

you know, this thing is taking over people's

3:06

lives that people generally have a negative impression

3:08

of it and an increasingly negative perception of

3:10

it. And that generally when those

3:12

types of things happen, those conditions present themselves and

3:15

you see little pockets of people

3:17

who consciously object to it

3:20

and push themselves away. And

3:22

yeah, I just

3:24

have not seen too much of it, you know.

3:26

I am personally invested in trying to find it

3:28

because I feel like I would join such a

3:30

society, right, at some point. But yeah, there's not

3:32

much of it, and the question of why

3:34

has been really interesting to me. So

3:36

when you talk about like phone-free communes or something

3:38

of that kind, I mean it seems like what

3:41

you're referring to is sort of the way in

3:43

which social media has destroyed our attention spans. But

3:45

when we think about the way that social media

3:47

is talked about by the government,

3:49

usually the conversation is centered around

3:51

misinformation and, you know, at least

3:53

in the case of TikTok, potential

3:56

national security concerns. And I'm wondering if you

3:58

can speak a little to that. Are

4:00

there different concerns, or do you think

4:02

that the misinformation stuff is actually more

4:04

of like a Trojan horse for the

4:06

actual concern among parents, which is

4:08

just that these phones and computers

4:10

and apps are kind of ruining

4:13

their kids' lives to some extent?

4:15

Right, yeah. I

4:17

do think there's two levels to that critique,

4:19

right? The first is basically what

4:21

you said, which is that social media

4:23

is harmful for a number of reasons

4:26

outside of just like one's well-being and

4:28

sense of self and ability

4:30

to connect with other people. That there

4:32

is a sense that there is so

4:34

much disinformation, misinformation on the internet and

4:36

that it is profoundly affecting political

4:39

outcomes, right? So, you know, obvious

4:41

example being 2016 election, there is

4:43

all sorts of talk about how there are

4:46

these bot farms in Estonia or whatever in

4:48

Russia and that they were tricking

4:50

people into believing things that would cause

4:52

them to vote for Donald Trump or

4:54

to undermine democracy, et cetera.

4:57

I personally am quite skeptical of that type

4:59

of work. I don't

5:01

think that social media has that much

5:03

of an effect. I think it does

5:05

have an effect. I think it's generally

5:07

overblown when talked about as having a

5:09

sort of totalizing effect that we can

5:12

explain everything via disinformation.

5:14

I think one example that I've always

5:16

found interesting was that there is

5:18

a study done by a group

5:20

of academics about misinformation and the

5:23

Asian American community. What

5:25

they concluded was that the reason why

5:27

Asian Americans were opposed to

5:30

affirmative action was because

5:32

of misinformation, which to me was

5:34

kind of an outrageous claim because

5:38

I think that a lot of Asian Americans

5:40

felt that things like the Harvard case, for

5:42

example, were showing discrimination against them and that

5:44

that was why they were – that's

5:47

why they opposed affirmative action,

5:49

right? They felt like there's a misapplication

5:51

of that term and an overexpansion of

5:53

that term. And so I'm somewhat skeptical

5:56

toward that part. In terms of propaganda,

5:58

I think that, in terms of

6:00

TikTok, it's very hard to know, right,

6:02

what ByteDance or TikTok

6:05

are doing. These things are quite cloudy.

6:07

But if it is

6:09

propaganda, the Supreme Court has

6:11

clearly shown that Americans have the right

6:13

to receive propaganda, and so for that

6:15

reason, like, I'm not as concerned with

6:17

that as well, because I think it's

6:19

part of your freedom of speech to

6:21

say that you can receive propaganda. Basically,

6:24

every single thing that people are trying to

6:26

do to suppress social media networks is an

6:28

infringement on the First Amendment. I think it's

6:30

quite clear, and, you know, I

6:32

don't – you don't have to be a

6:34

raving First Amendment person like I am to

6:36

actually believe that. It's just pretty

6:39

settled court stuff. Can you explain

6:41

how that's the case? Because I guess I'm

6:43

wondering, like, let's say that the government

6:45

bans TikTok. You would assume that all

6:47

the creators on that platform would then, you know, retreat

6:50

to YouTube Shorts and Instagram Reels

6:52

themselves. How is getting rid

6:54

of one app, or limiting one

6:56

app, necessarily conflicting with the First

6:59

Amendment? It's more that it is

7:01

not the expression of free speech,

7:03

it is the ability

7:06

to receive speech. So

7:08

in the article, I talk about this case,

7:10

Lamont v. Postmaster General, from 1965.

7:12

Jameel Jaffer, who is the head

7:14

of the Knight First Amendment Institute at Columbia, sort of

7:16

directed me to this case because he's written

7:18

about it as well. Basically, there's this man

7:21

in 1965, a philosopher. He was

7:23

a socialist, and he was trying to

7:25

get a copy of something called the Peking

7:27

Review, which is explicit Chinese

7:30

propaganda, sent to him from

7:32

abroad. And I'll stress that

7:34

there's nobody hiding anything. It's

7:36

not cloak and dagger. At the

7:39

time, the United States had this law

7:41

that if you wanted explicit foreign

7:43

propaganda from a small

7:45

number of countries, right, that are deemed to

7:47

be hostile, you would have to fill out

7:50

a little reply card. So the post office

7:52

holds onto it and says, we got this

7:54

propaganda, we just really want to make sure that

7:56

you want it. If you check and send back

7:59

the reply card, then we'll send it to you, right? And

8:01

so Lamont sued the Postmaster General saying

8:03

that just the act of filling out

8:06

this little reply card, after which you

8:08

would still get the propaganda, right, was

8:11

violating his right to receive

8:13

information because he would have to be putting himself

8:15

essentially on a list. And the

8:17

court agreed with him, right? And this

8:19

is still upheld at this point. And

8:21

so the right to receive propaganda

8:23

or the right to receive any type of

8:26

information that you want to make those types

8:28

of decisions for yourself is part of free

8:30

speech. And so eliminating something

8:32

based on the threat of propaganda, right,

8:34

I think is a violation of the

8:36

First Amendment. What do you

8:38

think is the distinction between someone kind of like opting

8:40

in to receiving a certain kind of

8:42

political propaganda, say

8:45

choosing to subscribe to a communist magazine

8:47

versus going on TikTok to

8:49

see dancing videos, and then the

8:51

algorithm for some reason gives you propaganda

8:53

of some kind? I

8:56

don't think there's a distinction, but I understand

8:58

I'm in the minority. Part of what

9:00

I've started to realize in my advancing

9:03

age is that people don't really

9:05

care about the First Amendment in

9:08

so much. And for

9:10

a while, I thought that they do. But they do

9:12

until they don't. Yeah, yeah, it's like very, the people

9:15

who are, like

9:17

I'm not saying that in, like, a good way,

9:19

right, but the people who are just kind of

9:21

absolutist and principled about it, we're sort of dwindling

9:23

in number. And so

9:26

I would say there's no difference, right, and that people

9:28

still have the right to do that, and they have

9:30

the right to distinguish what is

9:33

propaganda and what is not propaganda. And

9:35

they also have the ability to

9:37

just not look at TikTok, right? And so

9:39

I think that those are decisions that

9:41

should be left to the individual. And I think

9:43

that governments trying to interfere with that

9:46

is bad. However, I

9:48

have a daughter, right, and I don't

9:50

want my daughter to be on social media. My

9:53

concern is that I don't want her on

9:55

those platforms, which obviously is like a real

9:58

contradiction, like it's not particularly principled:

10:00

everyone

10:02

should be able to access these platforms, but not my daughter.

10:04

It's almost quite NIMBY-ish. And

10:08

so that's the tension I think that's happening

10:10

right now. And I think the

10:12

question that people need to ask and that I

10:15

think a lot of people need to ask is,

10:17

am I willing to sacrifice some

10:19

civil liberties if it means that

10:22

my child will not be on social media? Is

10:25

that an okay compromise to do? I

10:27

would say no, right? But

10:29

I think that most people would say yes, and I think that's

10:31

sort of what – I think that is a direction

10:34

that we're heading in right now. I'm

10:36

wondering about like some of the smaller

10:38

restrictions that states like Utah have tried

10:40

to impose, calling for age restrictions and

10:42

for parental controls on these apps. I'm

10:44

wondering what the free speech argument or concern is

10:46

there. Oh, yeah. That's another place.

10:49

So Florida passed a bill that

10:51

is similar to the Utah bill in that

10:54

it creates an age restriction,

10:56

and I think at least

10:58

in Utah it would enforce penalties on the social

11:00

media company if they were out of compliance with

11:02

it. The free speech argument

11:04

against that is that people under 18 also have

11:07

free speech rights, right? And

11:09

that social media is the public square,

11:11

which is something that the court has

11:13

essentially said, right? And which I

11:15

think any single person who pays

11:17

any attention would say is true, right? That social

11:20

media has become the public square, even if

11:22

it is privately owned. And that

11:24

young people should have the right to

11:26

access that type of stuff and make

11:28

their voices heard or express their opinion about

11:30

anything and not wait until they're 18. There's

11:33

an additional problem with this, which

11:36

is that – in terms of the First

11:38

Amendment, which is that to verify one's age,

11:40

like how do you do that, right? In

11:42

Utah, at least, you need to provide

11:44

a state ID. That means that anybody without

11:46

a state ID, everybody who

11:48

cannot provide age verification in the way that

11:50

the state of Utah wants, is

11:53

now going to also be banned from the

11:55

public square, right? So it doesn't just affect

11:57

children, it affects people without IDs as well.

12:00

Jay, I'd like to ask you more about your

12:02

own experience on these apps and also your perception

12:05

of what's driving the public discourse about them, but

12:07

first we're going to take a quick break. You'll

12:10

hear more of the political scene from The New Yorker

12:12

in just a moment. If

12:18

you like the show, we'd love it if you could leave us

12:20

a rating and review on the podcast platform of

12:22

your choice. And while you're there, don't

12:24

forget to hit the follow button so you never miss an

12:26

episode. You

12:31

come to The New Yorker Radio Hour for

12:33

conversations that go deeper with people you

12:36

really want to hear from, whether it's

12:38

Bruce Springsteen or Questlove or Olivia Rodrigo.

12:41

Liz Cheney, or the godfather

12:43

of artificial intelligence, Geoffrey Hinton,

12:46

or some of my extraordinarily well-informed

12:48

colleagues at The New Yorker. So

12:51

join us every week on The New Yorker

12:53

Radio Hour wherever you listen to podcasts. I

13:07

have a confession to make, which is that I

13:09

actually don't have TikTok, you know, with all of the

13:11

talk of the national security concerns. You know, we

13:13

work in journalism. I just thought better not to

13:15

have it, even though The New Yorker

13:17

now has a TikTok account, so maybe I should just

13:20

get over that. But I'm

13:22

curious about your personal relationship with the app,

13:24

since you've mentioned that you are worried about

13:26

your kids, you know, kind of becoming addicted

13:28

to it. I

13:31

do use it a lot. At the beginning

13:33

of TikTok, I thought it was the best

13:35

social media app ever because it was quite

13:37

international. And I could

13:39

basically train my algorithm to do stuff

13:43

like watch construction workers in India

13:45

who would post TikToks about what their days were

13:47

like. And this was like fascinating to me. I

13:49

don't know what, I didn't know what, you know,

13:51

northern Indian construction workers' life

13:54

was like. There was a whole

13:56

bunch of sort of nature videos out of China

13:58

that I found really interesting. The most

14:01

popular one was like the people would sort of

14:03

set up by a stream and catch a fish

14:06

and cook it next to the stream. And

14:08

it's like the most relaxing, beautiful thing that

14:10

you will ever see. And

14:12

I felt like, you know, like the sort of international

14:14

component to it was going to be really

14:17

eye-opening, right? And in a

14:19

way, I could sort of funnel

14:21

myself away from all the dancing teens for

14:23

months. Like I use this

14:26

app daily and I never saw any of

14:28

that stuff, right? I didn't know who Charli

14:30

D'Amelio or any of these people were.

14:32

I still don't really, you know? But

14:34

since then, it's become much more typical, right?

14:36

It's become everything I think is kind of

14:39

funneled towards the same thing. You can't train

14:41

it in the same type of way. And

14:43

so I don't know, I use it much less than

14:46

I did because I find it to be frustrating. I'm

14:48

like very uninterested. I'm 44 years old. Like

14:51

It's not out of virtue. I just can't

14:54

actually physically care about a lot

14:56

of this stuff that happens on

14:58

TikTok. So

15:01

yeah, I don't know. I don't think

15:03

that it's necessarily worse than YouTube shorts.

15:05

I don't think it's worse than Instagram

15:07

reels. I think there's larger

15:10

questions around it, you know, at the root.

15:12

But in terms of the viewer experience, like

15:14

I don't really see much difference between any

15:16

of them. In fact, I think this is something that

15:18

happens to almost everybody, which is that while

15:22

I'm just scrolling through, I oftentimes

15:24

forget which app I'm on. I

15:27

don't remember if I'm on Instagram or TikTok or

15:29

YouTube shorts. And then when I find out, I'm

15:31

just like kind of surprised by what app I'm

15:33

on because it's all the same. I mean,

15:35

all the videos like I am, as I

15:37

said, I don't use TikTok, but I do

15:40

watch Instagram reels and YouTube shorts and I'll

15:42

often send my friends videos from those platforms

15:44

and then they'll say, oh, I saw this

15:47

on TikTok. So I'm pretty sure that I

15:49

am basically on TikTok, even if technically

15:51

I'm not. You know, since the content does

15:53

seem to be, you know, quite similar among

15:56

the various platforms, I mean, I'm wondering how

15:58

much of the TikTok hate should we be

16:00

attributing to the fact that this is a non-American

16:02

company? Why is it that we

16:04

are seeing so many efforts to ban TikTok

16:07

but never meta or X? I

16:09

don't want to fully minimize the

16:11

national security concerns, right? Because I

16:13

think that there is

16:15

something odd about the fact that

16:18

so many American teenagers and young

16:20

people are completely addicted to an

16:22

app that won't really tell us what they're

16:25

doing, right? That is quite vague about what

16:27

happens to the data. And

16:29

so perhaps there is a national security concern,

16:31

however, I would like I said

16:33

before, I would need it articulated much more

16:35

clearly than it has been articulated by the

16:37

people who are trying to, you know,

16:40

what I see as a de facto ban

16:42

of TikTok. In terms of privacy and data,

16:45

I do not think that this is a

16:47

particularly compelling argument, in fact, because

16:50

basically all your data

16:52

is already for sale anyway, right? There

16:55

are these data brokers who

16:57

collect and sell people's data. China

17:00

at any point could approach one of these people

17:02

and they could just buy the same data, right?

17:05

Unless you believe that there is some

17:07

sort of special data that TikTok harvests,

17:09

right? That is not keystrokes and facial

17:11

recognition and your address and all the

17:14

other data that is readily for sale

17:16

on the open market right now. You

17:19

would also have to explain to me what that special

17:21

data is that only TikTok has and that, you know,

17:23

you couldn't just go and buy on the open market

17:25

from the data broker right now. So

17:27

I don't find that reason to be particularly compelling

17:29

either. But those are the two things. I

17:31

just feel like they haven't been articulated in

17:34

a way that is convincing to me. So

17:37

you've been on these apps for a while. How

17:39

much has the content on these apps changed from

17:41

your perspective over the years? Because I

17:43

feel like when TikTok first came out, it was all about

17:45

dancing. And now it seems to be way more focused on

17:47

politics than it used to be. I

17:50

do think that since 2020 that there's

17:53

been an acceleration of political

17:55

content on these platforms. But I think

17:57

that's probably user-driven, that

18:00

there is a virality around things

18:03

like police killings or around

18:05

footage from war, and that

18:08

opportunistic people on these platforms are

18:10

always going to use those moments

18:12

as a way to promote

18:15

their own brand or to get more

18:17

followers or whatever, right? I

18:19

think the idea that there is

18:21

some sort of secret plot to

18:24

promote this type of stuff to

18:26

sow division, honestly, it feels very

18:28

reminiscent of many anti-Semitic tropes that

18:30

are out there about how George Soros

18:33

or whoever is seeding dissent

18:35

within the American public through controlling

18:38

the media, et cetera, et cetera, right? In

18:42

terms of TikTok and China, the reason why

18:44

such theories still seem plausible

18:46

to us, again, is because ByteDance

18:49

and TikTok are not really clear about what they're doing. And

18:51

there's a way in which they could be a

18:53

little bit more transparent. And I think it would

18:55

put some of these fears to rest. However,

18:58

I think even if they did put these

19:00

fears to rest to somebody like me, that

19:03

the accusations of propaganda would still

19:05

be there, right? Because obviously, people

19:08

make political hay at this point

19:10

by criticizing China. They do

19:12

these congressional hearings where they drag people

19:14

in front of Ted

19:16

Cruz or Tom Cotton. And

19:20

that there is a theater of interrogating these

19:23

social media companies, not just TikTok,

19:26

every social media company. And

19:28

that Congress at this point

19:31

and politicians, I think lawmakers, they

19:34

see some political benefit in presenting

19:36

themselves as adversarial to these

19:38

companies, right? Nobody wants to defend

19:40

them at this point. And so within

19:43

that political atmosphere, I think one of the

19:45

charges that you would logically

19:48

place on somebody is that you're spreading propaganda,

19:50

right? But I don't know. What

19:53

is that propaganda? Why would it

19:55

be effective? What side is it

19:57

on? To what purpose is it, right? I will

19:59

say that, at the beginning of TikTok, I did

20:01

feel a lot better about China because

20:03

I would see these people fishing by these

20:05

beautiful streams and I'd be like, oh man,

20:08

that is a beautiful stream

20:10

in China, you know, but that's not like a

20:12

political propaganda, right? So, yeah, I

20:15

don't know. I think that overall

20:17

it's, I'm mostly

20:19

confounded by the arguments just because I think

20:22

as unclear as TikTok and ByteDance are

20:24

being, I think, you know, Congress is

20:26

not really clear about what their problem

20:28

is here either. So Jay,

20:30

I want to ask you about something I noticed over

20:33

the weekend on Instagram and what it means as we

20:35

head into an election season, but first we have to

20:37

take another break. Hi,

20:46

I'm Lauren Goode. I'm a senior writer

20:48

at Wired and I'm co-host of Wired's

20:51

Gadget Lab along with Michael Calore. Each

20:53

week on Gadget Lab, we tackle the

20:55

biggest questions in the world of technology

20:57

with reporters from inside the Wired newsroom.

20:59

We cover everything from personal tech. Because

21:02

asking people to put a computer on one of

21:04

the most personal and sensitive parts of your body

21:07

is just like, it's a big bet. Broader

21:09

trends in Silicon Valley. There are just

21:11

so many laid off workers out there

21:13

that workers just don't have a lot

21:15

of power. And the exciting and terrifying

21:17

world of AI. It's inevitable that the

21:19

internet is going to be filled with

21:21

like AI generated nonsense. And so he

21:23

just thinks he might as well make

21:26

some money playing a small part in

21:28

a thing that he sees as unstoppable.

21:30

Wired's Gadget Lab is here to keep you

21:32

informed and to keep it real. The

21:35

entire point of the phone

21:38

should be on some level to

21:40

hate it. New

21:42

episodes of Gadget Lab are available weekly,

21:44

wherever you get your podcasts. Over

21:56

the weekend, I noticed on my Instagram that lots

21:58

of people were upset about the fact

22:00

that Meta, which is the company that owns Instagram,

22:03

is now limiting content that it deems to be

22:05

political. It seems like a kind of wild thing

22:07

to do in an election year, and that it

22:09

could have a decent impact considering that at least

22:11

half of adults in the US get their news

22:13

at least sometimes from social media. And

22:15

I'm wondering what you make of this move. I

22:18

think it's done out of

22:20

a fear of litigation and a fear

22:22

of what happened in 2016, right, with

22:26

News Feed and all the claims

22:28

around disinformation. When Instagram

22:31

launched Threads, right, which was their

22:33

Twitter competitor, they said the same thing.

22:35

The CEO of Instagram said that they

22:38

were going to sort of de-emphasize political

22:40

content. And then the question

22:42

arose, well, what are we going to talk

22:44

about? It's really vapid on Threads, I got

22:46

to say. And so what

22:49

ends up happening is that anytime like an

22:51

internet forum basically lays a law down and

22:53

says we can't talk about the thing that

22:55

everybody's talking about because it's too toxic, then

22:57

people post photos of their dogs, right? And

22:59

then it's the most boring thing. I have

23:02

a dog. I love dogs. But

23:04

I don't need to see everyone's dog, especially

23:06

strangers that I don't know on the internet. It's

23:08

like I don't need to see your dog at all. So

23:13

I think this is part of a long-term move.

23:16

Adam Mosseri, I think, who is the CEO

23:19

of Instagram, he was also in charge of

23:21

News Feed in 2013. And

23:25

I think that this narrative of how

23:28

Meta, Facebook, Instagram, all these

23:30

companies within the same group

23:33

affect political outcomes has been

23:35

an ongoing conversation within that

23:37

company. And I think that

23:39

generally speaking, they have decided at

23:41

this point to sort of step back, right? They don't

23:44

want to be a part of it. They want to

23:46

sell themselves as a tech company. They still want to

23:48

sell themselves as connectors or whatever,

23:51

right, between people. But they

23:53

don't want to be the world's newsfeed anymore. And

23:56

I can actually understand why they feel

23:58

that way. I do think that, at

24:00

some level, it is really

24:02

gonna restrict their popularity, but maybe that

24:04

is like a decision that they're

24:06

willing to make at this point. It's

24:09

interesting because they also say that, I mean,

24:11

Meta has said we won't proactively recommend

24:13

content about politics in our recommendation services

24:16

across Instagram and Threads, but if you still

24:18

want these posts recommended to you, you can

24:20

kind of go into your settings and basically

24:22

click a button and all of a sudden you'll

24:24

see them again. And so I've been wondering

24:26

if that, you know, kind of makes a

24:28

difference to you when it comes to the

24:30

freedom of speech concerns, since you can

24:33

press buttons and make it go back to normal,

24:35

or whether the expectation that people will

24:37

go and proactively do this is enough

24:39

to raise concerns. I'll be

24:41

honest, I was unaware that feature exists,

24:43

though I know my initial response is more,

24:46

if I go to

24:48

my personal settings, what I'd

24:50

actually be really fascinated by, as a

24:52

psychological experiment, is I'd like to

24:55

reverse the filter, say, override my own

24:57

settings and say, give me all the

24:59

divisive stuff, all of that, give me

25:01

all the arguments. I would

25:04

do it for a day, then I'd turn

25:06

around and switch it right off. Right

25:09

right. It's funny that, yeah, we're at the point where

25:11

we're kind of equating, like, political

25:13

content with, yeah, content that's, like,

25:15

you know, sexually explicit

25:18

or, you know, violent or something.

25:20

I don't think it will work,

25:22

though. I think that the structure

25:25

of the internet, of all social

25:27

media, is to argue about politics, and

25:29

I think that is baked into

25:31

it, and I don't think you can ever fix

25:33

it or remove it from it. And that,

25:36

basically, since the beginning of the internet, people

25:39

have used message boards or the connectivity

25:41

or the technology that allows

25:43

you to send a bunch of

25:45

text from one place to another

25:47

instantaneously, to just argue with one

25:50

another and to fight. There's

25:52

never been an internet, or a version of

25:54

it, that has not descended into this.

25:57

It's all of it: every single forum, Reddit,

25:59

everything. And so the idea that you

26:01

could do it by trying to

26:03

limit this stuff, I think, is outrageous.

26:06

They'll just fight in the comments, right?

26:08

Or they'll fight over, like, whatever,

26:10

right? There'll be arguments everywhere. I feel

26:13

like I've noticed a bunch of this lately where there'll be

26:15

like an Instagram account that, you

26:17

know, post something on Instagram. And

26:20

then you'll have like a whole subreddit

26:22

devoted to just like tearing

26:24

this Instagram apart and debating it

26:26

and whatnot. And it seems

26:28

like you're right that these conversations are going to

26:30

happen. The political conversations will happen

26:33

either in the comments of the app or

26:35

they'll happen on a separate app. But like

26:37

it's impossible to separate the

26:39

content on Instagram, even

26:41

the dog photos from the

26:43

rest of it because people seem to have a tendency

26:45

to want to, you know, engage in political debate whenever

26:47

they have an opportunity to. Right. It's

26:50

something I noticed this morning, right, which is something

26:52

that I thought about more than

26:54

I should. You know, anytime there's a

26:56

K-pop video of somebody dancing, right, an

26:59

Asian person dancing, the first or second

27:01

comment is always a discussion

27:03

of cultural appropriation,

27:05

whether it's appropriate

27:08

for Asian people

27:10

in Asia to be doing hip-hop

27:12

dances. And they're

27:14

always at the top because those are the most

27:17

engaged with comments, right? There's huge arguments underneath. It's

27:19

just like- It's way better than like, hey,

27:21

cool video. This morning

27:23

it was a 10-year-old in Vietnam who was

27:25

doing a dance, and that was the first

27:28

comment. There's like a thousand responses to it,

27:30

right? Like you can't stop it, right? Like

27:32

you would think that here's a cute child

27:35

dancing would be free from political conversation. But

27:37

of course it immediately goes to that type of place.

27:39

So I'm quite skeptical that something like

27:42

that would ever work. I'm

27:45

wondering what you think content moderation should

27:47

look like if it has to exist. And

27:50

if there are any platforms out there that seem to

27:52

be doing a better job than others. I

27:55

don't think it should really exist,

27:59

but I do think that certain things, like people

28:02

being killed, violence, depictions

28:06

of sexual violence, should

28:09

obviously not be allowed, there

28:11

should be restrictions upon that. In

28:13

terms of any type of speech or

28:15

whatever hate speech, I'm much more resistant

28:17

to the idea that people should be

28:20

banned from platforms or that content moderation

28:22

should exist for those types of things

28:24

unless they're physical threats

28:26

of violence. And so

28:28

ironically enough, despite my general

28:30

feelings about Elon Musk,

28:33

I think that if you look at

28:35

the totality of what has happened, especially

28:37

since October 7 in terms

28:39

of the war in Gaza, that

28:43

there are much freer conversations

28:45

about that, I think, on Twitter

28:47

than anywhere else. And I think

28:50

that part of that is because

28:52

I think that Elon Musk's general

28:54

thoughts about free speech are completely

28:56

convoluted and mostly disingenuous. But I do

28:58

think there's part of it in which I'm not going

29:00

to regulate this because I already staked myself to this

29:02

position. I would probably say

29:04

they're doing a better job, but that's just because

29:07

I think that most things should be

29:09

OK. And again, I

29:11

don't think that's a very popular opinion. I

29:15

think people, I don't know, I'm Asian,

29:17

right? Sometimes people on these

29:19

social media platforms will call me a racial

29:21

slur. It doesn't happen very often, but occasionally

29:23

happens. And I don't know, I think they

29:25

should have the right to type that if

29:28

they want without getting banned from the platform.

29:30

I have the option to block them if

29:32

I want, right? And so I don't know.

29:36

I think that less moderation is good. Yeah,

29:39

my problem is that I'm, I guess, I think

29:41

really highly of myself. And so I kind of

29:43

want to be able to receive all

29:45

of the content, even if it's propaganda. And

29:47

I trust myself to be able to tell

29:49

the difference between the real stuff and the

29:51

fake stuff. And we

29:53

see the memes that went viral in 2016

29:56

that apparently had some effect on the election.

29:58

And I was like, ah, that wouldn't. That

30:00

wouldn't have gotten me. I don't

30:02

know, at the same time, I'm sort of like, oh, I guess

30:04

it's probably not great for a 12-year-old to be looking

30:07

at a bunch of propaganda on YouTube

30:09

or on TikTok. So it's

30:11

like, I'm cool with other people's content being moderated.

30:13

I just don't want mine to be moderated. So

30:17

when you're that much of a hypocrite, you just have

30:19

to be anti-moderation. You got to click that

30:21

button, right, in your settings, and then get it

30:23

all to smash you in the face. I

30:28

think there are already safety controls for children,

30:30

right? And I think those are appropriate. In

30:33

reality, content moderation is done abroad,

30:35

right? It is outsourced. One of

30:37

the things that has always troubled

30:39

me about it is the idea

30:41

that some executives at a tech

30:43

company are making decisions that

30:45

they're outsourcing to overworked, underpaid people, say,

30:47

in the Philippines, who then have to

30:50

watch all this stuff and enact whatever

30:52

the policy is. It's

30:54

a really inexact system, right,

30:56

one that is going to have

30:58

all sorts of failures

31:00

throughout. And I

31:02

think that the biggest failures are really at the

31:04

top, and that any time you have people

31:07

who are now controlling what is generally thought

31:09

of as a public square, who

31:11

are just making these decisions seemingly arbitrarily,

31:15

that it's probably

31:17

better to just figure out a way

31:19

to do it without them making those decisions, if

31:21

at all possible. We

31:23

talked earlier about Zuckerberg just constantly

31:25

having to do these congressional hearings

31:27

and whatnot. And so it seems

31:30

like it's less ideological,

31:33

"we want the people on our platforms to be exposed

31:35

to the right things," and more just a

31:38

matter of protection. I guess I'm just wondering why more

31:40

of them don't take the

31:43

approach of less moderation,

31:45

given that it seems like it

31:47

would be a good marketing strategy in some ways. Yeah,

31:50

I think they are afraid of getting hauled

31:52

in front of Congress over and over again. They

31:55

do see the polling that comes

31:57

out about the decreasing popularity

32:00

of these platforms. I think

32:02

they do see things like the Utah bill

32:04

as being a public move against them that

32:06

is supported by large portions of the public

32:09

and they are wondering what

32:11

their future looks like and I don't think

32:14

that their future can just be, they think

32:16

they rightfully understand that their future can't just

32:18

be, hey we are

32:20

not the publisher, right, and cite Section 230

32:22

and then just kind of

32:24

move on with it over and over

32:26

again and say like no this is

32:28

a user doing it, it's not us,

32:30

you know this was what you saw

32:32

during the Reddit days right, Reddit sort

32:34

of asserting itself over and over saying

32:36

the stuff that happens is not on

32:38

them, it's on the user, it's not

32:40

their fault, it's the user's fault. I

32:42

found that compelling but eventually many many

32:44

people didn't and Reddit had to change

32:47

their rules. We're seeing a

32:49

large-scale version of that happening right now

32:51

and these companies I think are just

32:53

adjusting based on risk. So

32:55

you end your latest column on a

32:58

bit of a nihilistic note, basically you

33:00

question whether we can ever actually moderate

33:02

content online without infringing on the First

33:04

Amendment and whether instead we should just

33:06

let the internet quote run wild and

33:08

grow as big and stupid as it

33:10

can be. That way there's even

33:12

more reason to ignore it. For

33:14

those of us who care about free

33:16

speech and preventing brain rot on social

33:19

media, is that the only path forward?

33:21

Yeah I think so, I think that

33:23

people need to opt out and that

33:25

they need to basically say that the

33:27

government can't protect us from this and

33:29

that they won't and that the

33:32

companies are not going to do anything that

33:34

really compromises their bottom line, right, they're not

33:36

going to act judiciously or they're not going

33:38

to act in a benevolent type

33:40

of way and that we should probably

33:42

stop wondering when they'll do this

33:45

right, when they'll get it right. I don't think there

33:47

is a way to really get it right and

33:49

so I think that it falls on

33:51

the individual or on communities to then just

33:53

opt out of it and I think that

33:55

we haven't seen too much of it but

33:58

I think we'll start seeing more of it in the

34:00

future, right, in the near future. I think

34:02

that there will be much more emphasis on

34:05

phones being damaging. I think there will be

34:07

much more emphasis on people

34:10

trying to live lives away from this sort

34:12

of stuff. At first, it'll

34:14

feel very culty, right, but I think that

34:16

it's kind of necessary at this point and

34:19

that it would be a good time to

34:21

keep an open mind to it, right, to

34:23

different ways of existing without devices

34:26

and without the internet. That's the only alternative

34:28

I can think of, but I don't know.

34:31

I live in Berkeley, California, so maybe it's

34:34

ingrained in me that communes are a

34:36

good way to go about life.

34:40

I have trouble picturing it in New York,

34:42

but it would be

34:45

nice to at least have the option

34:48

of an opt-out community. It seems

34:50

like right now the only way to do it is just to

34:53

resign yourself to being really disconnected

34:55

from everything that's going on in a

34:57

way that would be damaging to your

34:59

education if you're a kid or your

35:01

social life or your career. Right,

35:04

right. Parents generally want

35:06

this for their kids, but they

35:09

all say the same thing, which is that

35:11

it's impossible because their

35:13

kids enter a stage where

35:15

they need it to have any

35:17

friends at all. Now, there's very many

35:20

different versions of it, right? Kids play

35:22

a lot of video games online, right?

35:24

And so you can't do that. They

35:26

text back and forth, can't do that.

35:28

I mean, you're really sort of setting

35:31

your kid up for a social disaster in that

35:33

type of way. Most parents, I think, rightfully don't

35:35

want that. And so, yeah, it

35:38

takes, I think, a community effort at that level.

35:42

Well, thank you so much, Jay. Thank

35:48

you. Jay Caspian Kang is a staff writer at

35:50

The New Yorker. You can read his latest column

35:53

at newyorker.com. This

35:55

has been The Political Scene. I'm Tyler

35:57

Foggatt. The show is produced by Julia

35:59

Nutter and edited by Stephanie Kariuki,

36:01

with production assistance from Mike Cushman. Our

36:04

executive producer is Stephen Valentino. Chris

36:07

Bannon is Condé Nast's head of global audio. Our

36:10

theme music is by Allison Leyton-Brown. Enjoy

36:12

your week and we'll see you next Wednesday.
