Making Meta Better

Released Thursday, 26th January 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

So you wanna marry my daughter? Yes. I did.

0:02

So do you hang out in the hood all the time, or do you

0:04

just come up here for our food

0:06

and women? This January. Your

0:08

family, I think, Emily. I

0:10

don't know how this is gonna work.

0:12

I like your braids.

0:13

Thank you. Exhibit ad Brads. Jonah

0:15

Hill, Lauren London, David Duchovny,

0:17

Nia Long, with Julia Louis-Dreyfus,

0:20

and Eddie Murphy. What's up with white guys?

0:22

Am I white guys? Well, I'm not.

0:24

You People, directed by Kenya Barris,

0:26

rated r streaming January twenty seventh

0:29

only on Netflix. It's

0:34

Thursday, January twenty sixth twenty twenty

0:36

three from Peach Fish Productions. It's The Gist.

0:38

I'm Mike Pesca. And here's the

0:40

problem with personification. Everything

0:43

becoming not a thing talking geckos,

0:45

talking garden gnomes, talking waffles, talking

0:47

mochi, basically every Pixar movie.

0:49

What if bugs talked? What if cars talked? What

0:51

if fish talked? What if toys talked? Four times

0:53

plus a spin off. See, today,

0:56

Oreo introduced what they are calling

0:58

the Oreo Oreo. And the

1:00

thing is if you haven't been following Oreo,

1:03

or aren't what insiders call

1:05

Oreophiles. They don't call it

1:07

that; I made that up. Oreo has

1:09

a lot of flavors, almost a hundred flavors.

1:11

Toffee crunch, snickerdoodle, hot and

1:13

spicy cinnamon, nineteen seventy three Chateau

1:16

Montelena chardonnay, fudge covered,

1:19

carrot cake, birthday cake. Now

1:21

one of those might have been conjured up in an Oreo and

1:23

do sugar combo, but really And also

1:25

let's point out birthday cake is not a flavor of

1:27

cake. It's an occasion for cake. I

1:29

always have chocolate hazelnut sandwiches on

1:31

my birthday. That, by the way, is literally another

1:33

flavor of Oreo. But now there's this.

1:36

The newest Oreo variant is

1:38

called the most oreo oreo.

1:41

It has a larger portion of cream filling

1:43

and the filling is stuffed with

1:46

ground Oreos. They're

1:48

being called Oreo-stuffed Oreos, and

1:51

as a result, Oreo Twitter was on

1:52

fire. Here's YouTuber Markie Devo's

1:55

reaction. The Oreo gods are

1:57

back at it again. They drop this huge

1:59

ass. Oreo. This is the most Oreo

2:01

Oreo cookies and cream. It

2:03

was so big that they had to name it Oreo,

2:06

freaking Oreo. It's got loaded

2:08

cream. It's like two cookies deep and it has

2:10

cookie pieces. So no problem. Right? Other

2:12

than our national diabetes problem. And where does

2:14

personification come in? Is

2:16

that a person talking? A character

2:19

Oreo? It's just an Oreo filled with Oreos all

2:21

the way down. It's that in every

2:23

article about the Oreo-stuffed Oreo,

2:25

there was a link to an article about

2:27

M and M's spokescandies. These

2:30

sexy or relatable or also

2:32

human candies that once asked us

2:34

to eat them. And so now,

2:36

when I think of Oreo-stuffed Oreos, I

2:38

can think of nothing else other than cannibalism,

2:41

which puts me in the minds of some things I once

2:44

read about chickens who started pecking each

2:46

other and end up eating

2:48

each other. A quote from the

2:50

Penn State University extension

2:53

school, poultry cannibalism, prevention,

2:55

and treatment guide. Quote, cannibalism

2:58

usually occurs when the birds are stressed by

3:00

poor management practice. Once

3:02

becoming stressed, one bird begins picking

3:04

the feathers, comb, toes, or vent of another.

3:07

Once an open wound or blood is visible

3:09

on the bird, the vicious habit of cannibalism

3:12

can spread rapidly through the entire

3:14

flock. If you notice the problem

3:16

soon after it begins, cannibalism can

3:18

be held in check. Yes. Yes. We

3:20

all need to manage our chicken

3:22

cannibalism, and that's why I take Fowl-

3:25

Don't. For moderate to severe outbreaks

3:27

of chicken cannibalism. Don't take Fowl-Don't

3:29

if you're already on Koopa or allergic

3:31

to Fowl-Don't. They list why

3:34

chicken cannibalism occurs: over

3:36

crowding, brightly lit nests,

3:39

quote, allowing cripples, injured, or

3:41

dead birds to remain in the flock, they say.

3:43

Cripples, their word, not mine.

3:45

And then you ask the chickens. How could you guys do

3:47

such a thing? And they just answer, I don't know. I kind of taste

3:49

like chicken. The last reason

3:52

that they give is something called prolapse

3:54

pecking. It is the most disgusting food

3:56

related thing I've ever heard about, and yet

3:58

I still eat chicken. No problem. I'm

4:00

going to eat chicken tonight. I ate chicken

4:02

last night. You have got to go out of your way

4:04

not to eat chicken in America. But

4:07

I do think of rampant disgusting

4:09

cannibalism when I think

4:11

of the new Oreo stuffed Oreo

4:14

because everything is alive, except

4:17

my appetite and any pleasure that's

4:19

left for a sweet taste that doesn't

4:21

try to be another sweet taste or some

4:23

sort of infinite regression of the same

4:26

sweet taste. Sweet taste. That's

4:29

our advanced computer assisted

4:32

effects. What I'm saying

4:34

is, and please hear me clearly and don't

4:36

miss my point what if they

4:38

just put the green M and M in sensible

4:40

flats? Espadrilles, something that wouldn't

4:42

upset the Fox hosts or Catharine

4:44

MacKinnon. Why can't we have nice

4:46

things is getting more and more

4:48

complicated every day. On the

4:50

show today, how I should feel about those

4:52

exhausted activists and how to

4:54

burn it all down whilst avoiding

4:57

burnout. But first, Ravi

5:00

Iyer is a data scientist and a moral

5:02

psychologist. He worked for

5:04

Facebook. He's now the Managing Director of the

5:06

Psychology of Technology Institute at

5:08

USC's Neely Center. We're gonna talk about

5:10

on this day after Donald

5:12

Trump was restored to

5:14

Meta, Ravi's old company.

5:16

We're gonna talk about what he thinks

5:19

social media should do to try

5:21

to stop the spread of misinformation, the spread

5:23

of angry reaction posts, and if he

5:25

thinks we can content-moderate ourselves

5:28

out of this situation. Ravi Iyer,

5:30

up next.

5:41

So of course, the New Year means New Year's

5:43

resolutions, which probably

5:47

is under the rubric of eating

5:49

better and living healthier and also being

5:51

more efficient, factor does

5:53

both. Factor is a ready to

5:55

eat meal kit. And by ready to eat,

5:58

we mean just that. You don't have to chop. You

6:00

don't have to prep. You don't have

6:02

to clean up. They're fresh. They're

6:04

never frozen and their meals in just

6:06

two minutes. Just heat them up and enjoy.

6:08

Many different kinds, meals to

6:10

fit in with your lifestyle, also New Year's

6:12

resolution, maybe you're going keto, or

6:15

vegan and veggie, or protein

6:17

plus. All of those meals are on the

6:19

Factor menu each week, prepared

6:21

by chefs approved by dietitians.

6:23

They also have thirty six quick

6:25

bites, smoothies, juices, and other satisfying

6:28

add ons. You could eat vegan

6:30

or veggie quite easily

6:32

because Factor meals are approved by

6:34

dietitians, and they know all the ingredients

6:37

you want and nothing that you don't.

6:39

So this is really a hassle

6:41

free, very efficient,

6:44

very clean living option.

6:46

America's number one ready

6:48

to eat meal kit, Factor. Start

6:51

saving time, eating well, and living your best year

6:53

ever. Head to factor seventy five dot

6:55

com slash the GIST sixty and

6:57

use the code the GIST sixty to get

6:59

sixty percent off your first box.

7:02

That's code the gist sixty

7:04

at factor seventy five dot com

7:06

slash the gist sixty to

7:08

get sixty percent off your first

7:10

box. Couple

7:13

weeks ago, The Wall Street Journal had a

7:15

very well reported story. Facebook

7:17

wanted out of politics. It was

7:19

messier than anyone expected.

7:22

And the basic thrust of the article

7:24

was when Mark Zuckerberg just said,

7:26

this isn't worth it. Turning the

7:28

spigot from wherever it was to

7:30

zero had, like the article said,

7:32

some add on effects. People didn't

7:34

like it. Donations to Facebook's sponsored

7:37

charities went down. Facebook didn't get any

7:39

credit for being less toxic. Excellent

7:42

news outlets were deprioritized,

7:45

and places like Mother Jones found that fewer

7:47

people were reading its articles. And

7:49

this was a little buried in the article, but I

7:51

thought the worst add-on effect was

7:54

that the percentage of just

7:56

poorly sourced stories was

7:58

higher in users' feeds after

8:00

Facebook tried to correct for what

8:02

people saw as its toxicity. There

8:05

seems to be no good

8:07

answer for what Facebook can

8:09

do to satisfy all constituencies.

8:12

But a person quoted in this article

8:14

and someone who worked for Meta, which

8:16

is what I think when he worked there, it was called

8:18

Facebook. Now it's Meta. He was

8:20

offering, I thought, the best insight, and I

8:22

wanted to have him on. Ravi Iyer is

8:24

the managing director of the psychology

8:26

of technology institute at

8:28

USC's Neely Center for Ethical

8:30

leadership and decision making. Ravi,

8:32

welcome to The Gist. Thanks, Mike. I'm glad

8:34

to be here. It is an impressive, at least

8:36

the organization is impressively

8:37

titled. So, well, you know,

8:40

universities, you know, we we like to

8:42

explain what we're doing in

8:44

detail. I mean, is your journey a little

8:46

like Alfred Nobel invented dynamite?

8:48

And then said, oh god,

8:48

I gotta I gotta found the peace prize

8:50

now. I've either

8:52

you know, I tried I try to figure out what

8:54

my path is, you know, as I go.

8:56

So this is this is the place for me to be at

8:58

this moment. No. But seriously, and we'll

9:00

get to specifics, but did you look back at

9:02

your time at meta and from what I understand

9:04

of your tenure there, you were on

9:06

the side of trying to advocate

9:08

for less toxicity and

9:11

greater actual engagement, but do

9:13

you look at it with

9:15

enough either regret or

9:17

just I don't know,

9:19

just clear eye on this that this was

9:21

the next obvious role

9:23

for you to play societally.

9:25

Yeah. Definitely. I mean, so I'm

9:27

grateful for my time at Meta. You know, it

9:29

wasn't easy. You know, I went to battles. I lost some

9:31

battles. But you

9:33

know, I really did learn some things. You know,

9:35

one thing I say to people is that we made things

9:37

three percent better, and three percent is not an actual

9:40

number. It's just meant to illustrate

9:42

that we made a measurable difference for a

9:44

large number of people, but there's a lot

9:46

of things that are still left to be

9:48

done. And there are some things we learned that I think

9:50

apply not just to Facebook, they apply to TikTok,

9:52

they apply to YouTube. And so I

9:54

wanted to take a shot, take a swing at that other

9:56

ninety seven percent There's

9:58

some decisions to be made that I think aren't just Meta's.

10:00

I don't think people at Meta want to make those decisions. I

10:02

think they'd be glad if people in the world

10:05

understood the decisions we made and help them

10:07

with them. So I do

10:09

think that's the next step. It's it's I

10:11

think there's a lot of good to be done.

10:14

And there are some kernels. You know,

10:16

the one thing I might slightly disagree with from

10:18

your intro is that I do think there are some

10:20

things that we did that made a

10:22

difference that we're like positive. And

10:24

I wouldn't argue that they solved the problem,

10:26

but I think we can learn from those things and implement

10:28

them more widely. Yeah. That seems true. And if

10:30

I gave another impression

10:32

in the intro, we'll correct it

10:34

as listeners hear this interview.

10:36

But as they hear the interview, just give them

10:38

if you would some idea

10:41

of your background. I know you worked for

10:43

Ranker, which is R-A-N-K-E-R,

10:45

not the rancor that maybe we

10:47

associate with some social media. But I

10:49

also see you on Google Scholar, and did you

10:51

work with Jonathan Haidt in

10:53

the NYU psychology department? Just

10:55

tell me about yourself a little bit. Yeah. I mean, I

10:57

started my career as a programmer. I

11:00

decided that it was

11:02

meaningless to some degree. I would program,

11:04

like, you know, some bank's phone systems

11:06

so they could keep track of all the phones that they

11:08

had with all their employees. And so I

11:10

decided I wanted to be more meaningful and so I got a

11:12

degree in psychology. A lot of

11:14

people in grad school end up studying themselves to some degree. So I

11:16

studied myself. I studied like, what does it all mean? Why am

11:18

I doing anything? Which led me to John

11:21

Haidt, and he was doing a lot of interesting work

11:23

on moral psychology. We

11:25

worked together. So I actually got my PhD at

11:27

USC, but I met him at

11:29

a conference, with numerous

11:32

other collaborators. And together,

11:34

we built YourMorals.org. We studied,

11:36

you know, people's psychology,

11:38

how they make moral decisions. We

11:40

wrote about them from a more descriptive perspective versus

11:42

a judgmental perspective. So people who are

11:44

interested in understanding

11:46

the other side as opposed to demonizing the other

11:48

side would often read our work.

11:50

You know, John obviously became

11:52

more well known. So when I graduated,

11:54

we had a small nonprofit, Civil

11:56

Politics, where we actually helped

11:58

bring Liberals and conservatives together and measure the

12:00

effects alongside many partners in the

12:02

community. And then while I

12:04

was in grad school, I also was working

12:06

at Ranker. So a friend of mine started a company

12:08

called Ranker. And so I had this dual

12:11

career. That started to blow up. It's about a hundred

12:13

person company. It's not a giant company, but it's a decent

12:15

sized company. In LA.

12:18

And I, you know, I was working at Ranker, and I was

12:20

also working on polarization. And

12:23

I had a friend of mine who offered

12:25

me, you know, said, we're working on on

12:27

polarization at Facebook. There's a

12:29

great opportunity to you know, do some

12:31

good in the world, sort of bring your disparate careers

12:33

together, and so that's

12:35

kind of how I made it to that point. Yeah. So

12:37

Ranker is the kind of site that has best

12:39

songs about breakups, facts about World

12:41

War two we just learned today that make us

12:43

say whoa. It's, let's say, very

12:45

populist. I don't think it causes

12:47

toxicity. I don't know

12:49

exactly though how it dovetails

12:51

with Jonathan Haidt's learning to

12:53

understand the love language of the

12:55

other tribes. I mean,

12:57

it does get a lot of its traffic from

12:59

social media. So I experienced

13:01

the incentives of social

13:03

media firsthand at Ranker. I

13:05

mean, it's less political, obviously, and and

13:07

that's part of the article is that, like, you know, maybe

13:09

there are some things that it's okay to sensationalize,

13:12

like, you know, your song or your

13:14

movie in a way that it's maybe not okay to

13:16

sensationalize your political views.

13:18

But I definitely learned a lot at Ranker,

13:20

just about, you know, I built

13:22

their initial algorithms. It

13:24

helped me, like, stay current in the tech

13:26

space. And you

13:29

know, I learned also about crowdsourcing. So

13:31

if you think about, you know, best songs I

13:33

don't know, best songs to work out to. Right?

13:35

Like, where crowdsourcing signals at Ranker

13:37

come from people's lists, people's voting, trying to

13:39

get diverse opinions. At

13:41

Facebook, you're also crowdsourcing, like, what

13:43

kind of content should I show you? Right? Like, you're taking the

13:45

signals of people who liked it, people who

13:47

commented on it. And there's similar

13:49

kinds of patterns where, like, for example,

13:51

diverse signals do better than narrow

13:52

signals. Right? So things that a narrow group of

13:55

people like tend to be worse than things that

13:57

a large group of people like, which is something that holds

13:59

true across all algorithms, whether Ranker or

14:01

Facebook or anywhere in

14:03

the world.
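
A minimal sketch of that diverse-versus-narrow point, with invented field names and weights rather than anything from Ranker's or Meta's actual ranking code: an item's score depends not only on how much engagement it gets but on how broad the set of engagers is.

```python
from collections import Counter

def diversity_weighted_score(engagements):
    """Score an item by total engagement, discounted when the engagement
    all comes from one narrow slice of users.

    `engagements` is a list of (user_id, group_id) pairs, where the group
    could be a community, cluster, or demographic bucket. All names and
    weights here are illustrative assumptions, not a real platform's code.
    """
    if not engagements:
        return 0.0
    total = len(engagements)
    groups = Counter(group for _, group in engagements)
    # Share of engagement held by the single largest group:
    # 1.0 means one narrow group produced everything.
    concentration = max(groups.values()) / total
    # Diverse engagement keeps most of its value; narrow engagement is discounted.
    return total * (1.0 - 0.8 * concentration)

# A post liked 100 times by one tight-knit group scores lower than a
# post liked 60 times across a dozen different groups.
narrow = [(i, "group_a") for i in range(100)]
broad = [(i, f"group_{i % 12}") for i in range(60)]
print(diversity_weighted_score(narrow))  # 100 * (1 - 0.8) = 20.0
print(diversity_weighted_score(broad))   # 60 * (1 - 0.8/12) = 56.0
```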

14:05

The diverse versus narrow: how might that play

14:06

out in terms of what

14:08

Facebook is to some extent rightly criticized

14:10

for, which is contributing to

14:14

extremism in politics

14:16

and people's perceptions of the

14:18

world. Yeah. I mean, I think one

14:20

thing about social media, and we

14:22

made a little bit of progress on there, on this while I

14:24

was there, but I think there's a lot more to be

14:26

made. One thing that's

14:28

true of social media is that

14:30

it's easy for a small group of people to make

14:32

something go viral even though

14:34

maybe a larger group of people don't like it. And

14:36

that's true in part because it's easy to

14:38

say you like something and it's really hard to say you

14:40

don't like something. Right? So, like, if

14:42

I like something, the like button's right there,

14:45

I can share it. If I dislike it,

14:47

I have to go, like, to this three dot menu

14:49

in the upper right, it's kinda hard to

14:51

find that. It's kind of a pain. One

14:54

thing we did in my time there was

14:57

we made that a little bit easier to find. Right?

14:59

And so, like, now there's a little x on the

15:01

upper right in many posts. And

15:03

that gives you a little more signal on things that people might

15:05

dislike. We had this See More,

15:07

See Less feature, which

15:09

helps people see, like, know, what they explicitly

15:11

want versus what they engage with. And so

15:13

those are those are things that that help.

15:16

There are sites, like Reddit, which do

15:18

that better. Right? Which have more negative signal. And

15:20

so, things like adding more negative signal. And

15:22

and so one thing that I'd like to do in my time

15:24

outside of Facebook is, you know,

15:26

think about how we could reimagine the platform.

15:28

You know, big companies, they

15:30

are always gonna move a little slower than

15:33

startups. And there's a lot of innovation in the space

15:35

as far as startups. Where people are

15:37

really experimenting with these reactions,

15:39

experimenting with, like, you know, there's a there's a

15:41

project, the Narwa project, that I

15:43

started playing around with. And they have, like, a

15:45

clarifying button or a new to me

15:47

button. Right? Like, if we really completely

15:50

reimagined the reactions so that we had,

15:52

you know, some amount of negative feedback, some amount of explicitly

15:54

positive feedback, how could we use that

15:56

in algorithms to shape our space to be better?
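
Here is a toy illustration of that reimagined-reactions idea, with hypothetical signal names and weights (nothing here is any platform's real formula): explicit negative feedback such as the hide X or a "see less" tap subtracts from a post's score instead of being invisible to the ranking.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    likes: int = 0
    loves: int = 0      # explicit positive
    see_more: int = 0   # explicit "show me more like this"
    hides: int = 0      # the little x in the upper right
    see_less: int = 0   # explicit "show me less like this"

# Illustrative weights only; the point is that the negative column exists at all.
WEIGHTS = {
    "likes": 1.0,
    "loves": 2.0,
    "see_more": 3.0,
    "hides": -4.0,
    "see_less": -3.0,
}

def score(post: Signals) -> float:
    """Weighted sum of explicit positive and negative signals for one post."""
    return sum(WEIGHTS[name] * getattr(post, name) for name in WEIGHTS)

# A post that a vocal minority pushes hard but many people quietly hide
# no longer outranks a post that people mildly but genuinely like.
viral_but_disliked = Signals(likes=300, loves=10, hides=120, see_less=40)
quietly_liked = Signals(likes=220, loves=30, see_more=15)
print(score(viral_but_disliked))  # 300 + 20 - 480 - 120 = -280.0
print(score(quietly_liked))       # 220 + 60 + 45 = 325.0
```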

15:59

So "I don't like this," which is

16:01

negative feedback. You're

16:03

portraying it as rare and valuable,

16:05

but it's confusing to me

16:07

because I think that a lot of, for instance, on

16:09

talk radio, the people who hate-

16:11

listen are a major

16:13

driver in making that format

16:15

popular And also, I'm thinking

16:17

about the first time I encountered your

16:19

name was in an article. I think it was in the Washington

16:21

Post about the angry button.

16:23

Right? So Facebook had this

16:25

a bunch of emojis and one was

16:27

angry. Like, I don't like it. This makes me angry,

16:30

which wound up being

16:32

not the solution, but in fact, a source of

16:34

a lot of the problems

16:35

there. Right? Yeah. I mean, and and

16:37

it's all a question of you

16:40

know, the signals that you get. So I

16:42

mean, so some of the low-hanging fruit

16:44

when I was working there was just, and,

16:47

you know, I give credit to

16:49

Facebook to some degree for this because they did this

16:51

without the Washington Post article. You know, we

16:53

we did this before the Washington Post article came

16:55

out. Right? And so, you know, originally,

16:59

love reactions, anger reactions

17:01

counted the same in the algorithm. Right? And so if

17:03

I anger at something or I love something, I would get more of

17:05

it. Now, logically,

17:08

like a lot of times people are angry-reacting

17:10

because they don't like something. Like, sometimes you're sympathetically

17:12

angering something, sometimes you're I

17:14

don't like that angering something, but it's not a pure signal

17:16

that I want more of that thing. It's often a signal

17:18

I don't want more of that thing. Right. Right. So you're

17:21

saying, like, sometimes there's an article

17:23

about something that's supposed to make you angry in

17:25

the world. Let's just take something legitimate.

17:27

Like, for me, what the

17:29

Republicans are doing with the debt ceiling? I

17:31

might press angry because goddamn

17:33

what the Republicans are doing with the debt

17:35

ceiling, whereas a big fan of

17:37

Matt Gaetz might press angry

17:39

because they don't wanna see the critical article of the Republicans. Yeah.

17:41

Or they may not, like, just like the framing, or they

17:43

might think it's disinformation. Like, there's a lot of

17:45

reasons you do it. Right? And so it's not a great

17:47

signal. I understand, I think, but from what we've been told

17:49

about social media is they don't really care. They

17:52

just like intense emotions. That's what

17:54

drives traffic. To some degree, but I

17:56

think there's also a longer term

17:58

view here. So, you know, we did eventually

18:00

change it so that angry reactions don't count in the

18:02

algorithm. Right? And so,

18:04

you know, it's like baby steps.

18:07

Right? But, you know, if if

18:09

we had a perfect measure of

18:11

what would be long term in the

18:13

interests of users and society. You know, I

18:15

think platforms would do that. The hard part is how

18:17

do you measure that. And it's really

18:19

easy to measure the short term stuff. It's really easy

18:21

to measure, like, I made a change. People use

18:23

the platform more. It's hard to

18:25

measure something, like, what's gonna happen in two

18:27

years? Right? Because you have to wait two years to

18:29

happen. There's an interesting article that Facebook

18:31

Analytics put out recently about notifications.

18:34

Right? And this is like a very, like, low hanging fruit case

18:36

where they reduced the amount of notifications, and in two years,

18:38

they got more people to use the platform. If

18:40

they could do some of these subtler things,

18:42

like optimizing for, you know, angry reactions

18:45

is a simple thing. Optimizing for comments. Like, you mentioned, like,

18:47

people hate-listen. Right? And they write, like, something like,

18:49

I hate this. I don't want this. Right? Or this

18:51

is terrible. Right? And, you know,

18:54

optimizing for comments is something that

18:56

was done for political content. And in that

18:58

article you referenced, it was

19:00

taken out. And that's in part because of that. Like, a lot of

19:02

times, you know, generally a comment on

19:04

something like, you know, like, if we if we went out for

19:06

beers and we had a picture and, like, everyone was

19:08

commenting on it, like, that's probably something

19:10

that people are gonna wanna see. If it's

19:12

an article about, like, the debt ceiling and people

19:14

are commenting on it, like, a lot of times people are commenting

19:16

in more like that, you know, I can't believe what that guy did

19:18

or I can't believe that guy did. And

19:20

it's not a great experience for everyone. Maybe for some people it is,

19:22

but not for everyone. And so, you

19:25

know, just taking these ambiguous signals

19:27

like anger reaction or comments out

19:29

of our incentive system is

19:32

a step. And then it it leads to thinking

19:34

about what more steps we can

19:36

take? How can we get more explicit signals of

19:38

people's positive and negative reactions as opposed

19:40

to relying on these ambiguous signals which often lead

19:42

us in the wrong direction? Are there great

19:45

signals? Are there gold standards of

19:47

signals that are good for the platforms, but

19:49

also at least neutral for

19:51

society? Yeah. I mean, you know, I

19:53

think, you know, anything where you get explicit,

19:55

like, a person really liked this thing.

19:57

Right? So, like, a love reaction. Right?

19:59

Like, it's unlikely you love something and it was actually

20:01

something that was like a bad experience for you. Right?

20:03

You you can like something that's a bad experience in part

20:05

because it's like, you know, people

20:08

like things because they agree with the opinion or

20:10

they they're trying to be nice to the person. But if if

20:12

you go the extra step to love something, like,

20:14

it's probably, like, explicitly a

20:16

good thing. There's a reason why the See More, See Less button, you know,

20:18

works well, in part because it's explicit.

20:20

It's like, I want to see more of that. I want to see less of

20:22

that. It disentangles the

20:24

social signaling from

20:26

"I want more of that."
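
A small before-and-after sketch of the specific change described earlier, where an anger reaction once counted the same as a love and later counted for nothing. The numbers and weights are invented for illustration; only the direction of the change comes from the conversation.

```python
# Two hypothetical weight configurations for reaction types.
BEFORE = {"like": 1, "love": 1, "haha": 1, "wow": 1, "sad": 1, "angry": 1}
AFTER = {"like": 1, "love": 2, "haha": 1, "wow": 1, "sad": 1, "angry": 0}

def reaction_score(reactions, weights):
    """Weighted sum of reaction counts for one post."""
    return sum(weights.get(kind, 0) * count for kind, count in reactions.items())

outrage_bait = {"angry": 900, "like": 100}
warm_post = {"love": 400, "like": 300}

for label, weights in (("before", BEFORE), ("after", AFTER)):
    print(label,
          "outrage_bait:", reaction_score(outrage_bait, weights),
          "warm_post:", reaction_score(warm_post, weights))
# before: outrage_bait 1000 vs warm_post 700, so the outrage post wins
# after:  outrage_bait  100 vs warm_post 1100, and it no longer does
```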

20:28

Doesn't the love signal though optimize

20:30

for cute cat videos, which I like,

20:32

but I don't want to replace my

20:34

idealized debt ceiling coverage? Yeah.

20:36

And and that's where, like, platforms that are experimenting

20:38

with other buttons. Right? Like, ideally, there'd be

20:40

an informative button or a clarifying button,

20:43

like, like the novel project is doing. Right? So, like, you need to

20:45

experiment with some of these things, and that's where I say, like,

20:47

you know, we have some nuggets, but we

20:49

didn't solve the problem. There's a lot more to be done.

20:51

And and I'm excited for the fact

20:53

that, you know, I can talk to people, you know,

20:55

experimenting on top of Mastodon,

20:57

you know, building all new platforms; there's like

20:59

a half dozen of these. Right? Like and and people

21:01

can build, like, a clarifying button or

21:03

an informative button. You

21:05

know, that can give you the idealized

21:08

debt ceiling content that you

21:10

want.
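
One way to read the clarifying or informative button idea is as a separate quality channel, so that affection for cat videos and informativeness about the debt ceiling are not forced onto a single engagement scale. A hedged sketch with made-up names, weights, and numbers, not any shipping product's logic:

```python
def blended_score(post, affinity_weight=1.0, info_weight=1.5):
    """Combine two separately normalized channels instead of one engagement
    scalar: warmth (love reactions) and informativeness (a hypothetical
    'informative' or 'clarifying' reaction)."""
    views = max(post["views"], 1)
    affinity = post.get("loves", 0) / views
    informative = post.get("informative", 0) / views
    return affinity_weight * affinity + info_weight * informative

cat_video = {"views": 10_000, "loves": 2_000, "informative": 10}
debt_ceiling_explainer = {"views": 10_000, "loves": 150, "informative": 1_200}
print(blended_score(cat_video))               # 0.2015
print(blended_score(debt_ceiling_explainer))  # 0.195
# Raising info_weight (say to 2.0) puts the explainer ahead of the cat video
# without anyone having to pretend they "love" debt-ceiling coverage.
```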

21:10

Frances Haugen, who

21:13

you cite appreciatively in your writing, and

21:15

others have said social

21:17

media optimizes for outrage, full

21:19

stop. How true is that? I

21:21

think especially for political content,

21:23

and that's

21:23

why, like, you know, most

21:26

of the content on social media isn't necessarily

21:28

political. And as the article states,

21:30

like, that's something that people are trying to

21:32

get out of. Right? So for political

21:35

content, there were definitely

21:37

bad incentives, and that's why, like,

21:39

that's why those changes were made. And

21:41

so, yeah, I mean, I think

21:43

you need different it it doesn't have to

21:45

be the case though. Right? Like, it's somewhat a

21:47

function of this

21:49

system that exists. One can imagine a

21:51

different system. One can imagine, like, you know, what they're

21:53

doing at Front Porch Forum. Right? Like, one can imagine

21:55

different systems that are designed better. And

21:59

so, it has historically

22:01

been true in the political space.

22:03

In the future, it doesn't always

22:05

have to be true. Right. So it doesn't have to be

22:07

true and there can be better systems. It

22:09

just happens to be that

22:11

our dominant social media, our

22:14

Facebook, to a lesser

22:16

extent Twitter, YouTube, which

22:18

emphasized or maximized for

22:20

outrage in their algorithms. It

22:23

just, unfortunately, was the case with

22:25

the ones that dominate our consciousness

22:27

right now. Yeah. I mean so

22:29

obviously, these platforms are

22:31

not optimizing for outrage. Right? They're optimizing for comments

22:33

or shares, and that happens to correlate.

22:35

They're optimizing for engagement,

22:37

which correlates to emotion,

22:40

which is

22:40

often outrage. Right? But that is changeable, though.

22:42

Like, and so that is, again... What

22:44

is it? Not trying to interrupt, but is it changeable to the

22:46

extent that if you are sitting atop a

22:49

massive billion dollar or potential

22:51

billion dollar empire, you would be

22:53

convinced, okay, we could change this without losing money

22:55

or even with making money. I know it's changeable, but

22:57

is it changeable and also

23:00

extremely profitable as the old way has been

23:02

so far? I mean, it's hard because

23:04

it might not be. I mean, and so and

23:07

and that's the innovator's dilemma moment. That's why I

23:09

changed. That's why I put my stock in some of

23:11

these, like, upstart companies because they don't

23:13

have to be responsive to

23:15

shareholders or profits. Even Elon Musk, I, you know, I

23:17

see this kind of stuff he's posting and I'm and I'm,

23:19

like, he's probably looking at

23:21

his engagement. And, you know, it's it's clear that,

23:23

like, he's experiencing the same

23:25

incentives that many publishers say

23:27

that they experience towards, like, the more sensational, the

23:29

more divisive, and also the more engaging. So I I

23:31

do think that, like, it it'll

23:33

be harder for the big companies now. Not

23:35

that the companies aren't willing to do it to some

23:38

degree. Right? Like, the article has

23:40

a change which led to reduced

23:42

usage and therefore reduced ad revenue.

23:46

But it's a baby step. And there is

23:48

more that could be done to reimagine the

23:50

platforms entirely. And, yeah, I I think it'll

23:52

be hard for the platforms with their profit

23:54

incentives to make the kinds of

23:56

big changes they need to make versus a lot of

23:58

these upstarts. So

24:00

is the big problem

24:03

that Facebook, YouTube, the dominant

24:05

social media, are

24:07

who they are. And so

24:09

you've talked many times about, you can

24:11

imagine a way, or there are other

24:13

sites, Mastodon and Reddit, doing things a

24:15

little bit differently. So my question is,

24:18

is the big problem that the

24:20

huge dominant media are

24:22

so dominant it will be hard to

24:24

dislodge them just because they have the market

24:26

share they do, or is it more the

24:28

problem that the reason they're

24:30

so dominant is that they're

24:32

fueled by, you know, the

24:35

crack of interaction and

24:37

engagement. Right? Let me

24:39

make an analogy. Is it

24:41

the case that Exxon and BP

24:43

are such big companies, it's

24:45

hard to knock them off their

24:47

pedestal or is the problem or

24:49

the challenge with climate change, the

24:52

fact that we're all addicted to gasoline, and

24:54

they're the gasoline companies. I

24:57

definitely think that the

24:59

companies, like, a lot of

25:01

a different world is possible. So the

25:03

companies will not

25:05

always be the dominant companies. And I think

25:07

they know that, you know, you see TikTok

25:10

rising, you see people experimenting with

25:12

other platforms. There's a phrase that Jon

25:14

Haidt often says: moral thinking is for

25:16

social doing. It's hard to get

25:18

people to use a platform, to

25:20

post pictures of their kids if they don't think it's if

25:22

they don't trust it, if they don't think it's good for the

25:24

world. And so I think the

25:26

companies realize that it's

25:28

not just a moral position

25:32

to try to make their platforms better. It's

25:34

actually like in it's in their

25:36

long term it's a long term

25:38

necessity to be

25:40

perceived as better for the world. I don't know if they can move

25:42

fast enough to get there, and so I think the

25:44

other companies might the small

25:46

companies, the upstarts might beat them. But I

25:48

do believe that moral thinking is for social

25:50

doing that people are not in the long term going

25:52

to use a platform that they

25:54

think is bad for the world. And the thing to

25:56

look at is, you know, maybe go on there

25:58

sometimes and, you know, just in case my

26:00

old high school friend posts something. So I might

26:02

use it occasionally. So I I don't

26:04

think, like, readership or usage

26:06

is the problem. I think posting is the

26:08

problem. So you like, are people

26:10

posting pictures of their kids anymore? Are people

26:12

posting their vacation photos

26:14

anymore? On Twitter, you can see Elon Musk asking,

26:16

he's he's always bragging about how much engagement

26:18

he's getting, but he's sort

26:21

of intimating that people aren't posting. Right? Like,

26:23

he's inviting people: post more. Post

26:25

more. And if people aren't posting, like,

26:27

that's the ballgame. Right? Like like,

26:29

it's it's then it's just Netflix. If it's just

26:31

like a bunch of people like, you know, making

26:33

viral videos, like social media

26:35

needs people to want to

26:37

post there. And if people think that this isn't a

26:39

safe place for me to post my

26:41

life, it's not gonna work. Ravi Iyer

26:43

is managing director of the Psychology

26:45

of Technology Institute at the USC Neely

26:47

Center and a former product manager at

26:49

Meta, and I thank him very much. Thank you,

26:52

Ravi. Thanks.

27:04

In the beginning of the eighteen

27:06

forties, California, had fewer than

27:08

ten thousand well

27:10

Americans as counted by

27:12

the Census Bureau. By eighteen

27:14

fifty, right? So this was after

27:16

the forty niners got there. It was

27:18

up to a hundred thousand. And

27:21

yet, things did not go well.

27:23

Just getting there killed many.

27:25

The rumors of mountains brimming

27:27

with gold, of streams

27:29

overflowing with precious metal,

27:32

were rumors, were lies in

27:34

many cases, and what the

27:37

settlers found, the dreamers

27:39

found, were swindlers, crooks, and charlatans.

27:42

American History Tellers explores

27:44

the events and the people who shaped our

27:46

history. The newest season, the California

27:49

Gold Rush, tells the

27:51

tale of those who risked

27:53

it all to strike it rich. And

27:55

while some did strike it rich, for

27:57

most the rumored

27:59

riches were a bust. The

28:01

Gold Rush accelerated the nation's expansion,

28:04

cemented California's reputation as a

28:06

destination for dreamers,

28:08

but also failed to deliver

28:10

on the promise for so many.

28:12

Follow American History Tellers wherever

28:14

you get your

28:15

podcasts. You can listen ad free on the

28:17

Amazon Music or Wondery app.

28:20

Ninety two percent of households that start the year with

28:22

Peloton are still active a year later.

28:24

Ninety two percent because of a

28:26

bike, not just bikes. We also

28:28

make treadmills and rollers, Olympic guests

28:30

for elite athletes only. Right?

28:32

Nope. It doesn't matter if you're an avid

28:34

exerciser or new to working out. Peloton can

28:36

help you achieve your fitness goals. Ninety

28:38

two percent stick with it. So can you.

28:40

Try Peloton

28:41

Tread or Row risk-free with a thirty day home

28:44

trial. New members only, not available in

28:46

remote locations. See additional terms at one

28:48

peloton dot com slash home dash trial.

28:50

And now, the Spiel.

28:52

New Zealand has a new prime minister, Chris

28:54

Hipkins, a name you will likely

28:56

never hear again. If he's anything like

28:59

other New Zealand prime ministers except the

29:01

last one. Through her charisma,

29:03

policies, biography, and savvy, Hipkins's

29:06

predecessor, Jacinda Ardern,

29:08

earned

29:08

renown. And the reason that

29:11

Jacinda Ardern is no longer a

29:13

leader of that country is

29:15

according to Jacinda

29:16

Ardern. I know what this job

29:19

takes, and I know that I no longer

29:21

have enough in the tank to

29:23

do it justice. It's

29:25

that simple. Which was

29:27

interpreted as an awareness of a term that's very much

29:29

on people's minds these days, burnout.

29:31

Ardern wasn't just the second ever head of

29:33

state to have a baby in office, and

29:35

the young woman who steered her nation through

29:37

COVID with just about the world's lowest

29:39

death rate per capita, She

29:41

also became a self care hero, setting

29:44

an example to emulate. New York Times:

29:46

Jacinda Ardern says no to

29:48

burnout. Axios: Ardern's

29:51

exit after unprecedented

29:53

threats shows toll of burnout for

29:55

women leaders. Vogue: In her decision to

29:57

step down, Ardern is showing

29:59

her leadership instincts until the end, trusting your gut, and

30:01

when the moment comes, doing what's best

30:04

for you. That actually seems quite the

30:06

opposite of leadership and something of a

30:08

definition of selfishness. Here's

30:10

Elle magazine: It's an empowering move, one of

30:13

giving up power, that is, one that allows us

30:15

to regain control, look at our lives

30:17

holistically and make positive change.

30:20

Be it in spending more time with family, as Ardern

30:22

plans to do, focusing on our well-being,

30:24

learning to put ourselves first,

30:26

go down an entirely

30:29

new route or even making the leap into

30:31

nothing. Now a couple of

30:33

caveats. Ardern is absolutely

30:35

within her rights as a person, as

30:37

a leader, to recognize that she's

30:39

a human being with frailty and

30:41

needs. And even if New Zealand

30:43

is compared to the US, a functional happy

30:45

society with the bonds of communities still

30:47

in place, Her job is

30:49

extremely stressful. She opted for

30:51

lockdown measures, and they saved lives, but

30:53

they also sparked protests and a

30:55

lot of pushback. She's

30:57

the head of state at a time of inflation.

30:59

She's getting blamed for that

31:02

unfairly. I don't think that burnout, a word

31:04

that she never said in her press conference

31:06

though, best describes her decision. Basically, it's

31:08

that in a parliamentary system, a weakened

31:10

less popular head of government will often

31:12

bow out to make way for a more

31:15

plausible successor from within

31:17

her party, and she's done so

31:19

because she's good at reading the tea

31:21

leaves, not because she's desperate to curl

31:23

up with a chamomile beside

31:25

a snugly fire. Ardern's not

31:28

lying or even misleading. She

31:30

was a leader with diminishing

31:32

political options who correctly assessed

31:34

the situation. A situation

31:36

that also included job stress,

31:38

but also that stress was compounded

31:40

by her diminishing political options.

31:43

Alright. All this so far has been specific to Ardern. I

31:45

wanna get into a thought that I

31:47

think I shouldn't have had. A notion I

31:49

couldn't get out of my head I

31:51

kicked it around. What I do is I don't immediately opine. I talk

31:53

it over with loved ones. I do

31:56

this. I do this.

31:58

Often, I read things. But

32:00

I do have a daily podcast. It's called the gist. You know it? And

32:02

when no one else is really talking about

32:04

an issue, even if it's maybe treading

32:06

on grounds that will paint me as the Great Santini,

32:09

I kinda have to go for it. I shouldn't, but I will. Here

32:11

it goes. The celebration

32:13

of Ardern's recognition of burnout,

32:15

the applause for her wisdom and

32:18

bravery, we have to admit, don't

32:20

we, is just a little bit

32:22

in tension with the loudest voices of

32:24

those approving of her who

32:26

also always tell us that the

32:28

stakes of activism

32:30

of being active of leadership

32:33

of fighting against the

32:35

forces of evil the

32:37

stakes couldn't be higher. That's why it's so

32:40

important to fight the good fight and do the

32:42

work. So I do sympathize

32:45

progressive or maybe progressive

32:47

signifying world leader bows

32:49

out because of the stress

32:51

and the toll But

32:53

it's a stress and toll that's portrayed

32:55

as fighting really

32:57

nefarious forces that need to

32:59

be fought, the forces of

33:02

evil. And you know who doesn't engage

33:04

in self care or who never takes

33:06

a me day as far as I could tell.

33:08

Donald Trump, Jair Bolsonaro,

33:11

Vladimir Putin, That guy is

33:13

always advancing, literally advancing,

33:15

never retreating. He does

33:17

not surrender. He does not go

33:20

on to live, what is it in a world that's living

33:22

holistically and making positive changes.

33:24

So if it's a pitched battle

33:27

between activists who

33:29

talk about the stakes and

33:31

malefactors who represent

33:34

destruction aren't the

33:36

activists acknowledging that they're at a giant

33:39

disadvantage and by removing

33:41

themselves from the battlefield, putting

33:44

their enemies at a much greater

33:46

advantage and aren't the forces of

33:48

light, to take their own construction?

33:50

Aren't they showing their enemies that

33:52

they can be defeated and how

33:54

they can be defeated. You

33:56

know, Martin Luther King moved into

33:58

the Chicago projects to show the world how

34:00

terrible the Chicago projects were. He

34:02

did so while battling serious depression.

34:06

MLK struggled with depression his whole life.

34:08

Abraham Lincoln did too. Yeah,

34:10

I know. These are impossible moral beacons for the

34:12

mere mortals around now.

34:14

Impossible to emulate them.

34:18

Only they actually are literally or were mere mortals before

34:21

their near deification. And we are

34:23

told about the importance of meeting

34:25

this dire, this critical

34:28

moment with nothing short of the

34:30

resolve that it took to

34:32

win critical battles in

34:34

the past.

34:36

I don't hear anyone today saying

34:38

black lives matter, of course, we can't

34:40

fight as hard as they did during the civil

34:43

rights era. The notion is we have to fight just as hard.

34:46

I think they probably

34:48

do, but it's not fighting just as

34:50

hard if you also

34:52

celebrate bowing out to do

34:54

what's right for you. And like I said,

34:56

Trump's tank seems perpetually

34:58

full. Here's a seventy

35:00

six year old obese man given

35:02

to fits of rage, but somehow

35:04

he has no off switch. He doesn't

35:06

go in for self

35:08

care. I mean, take this example. Writing in the New

35:10

Yorker, Jill Lepore cited the work

35:12

of the January sixth Commission,

35:14

which, quote, counted at

35:16

least two hundred attempts which he, meaning

35:18

Trump, made to influence state

35:20

or local officials by

35:22

phone, text, post, or public remarks. A

35:24

Trump campaign spreadsheet documents

35:26

efforts to contact more than a

35:28

hundred and ninety Republican state legislators

35:32

in Arizona, Georgia, and Michigan alone. Wow, what a motor on

35:35

that guy? Sauron never took a day

35:37

off from creating orcs, did he? You

35:40

know, after equity, the most common word used by activists is

35:44

exhausted. In May, Sadhvi

35:46

Mohan Kumar, an undergraduate

35:48

at the College of New Jersey

35:50

gave a TED Talk called

35:52

we need to fight activism

35:54

fatigue. Here's a clip. Activism

35:57

fatigue is the feeling of exhaustion when you've been

35:59

learning every nuance of every issue, yet

36:01

it never seems to

36:04

be enough. And it's the

36:06

feeling when you tirelessly campaign for

36:08

change in your

36:08

community, but never reap the rewards

36:11

of

36:11

your hard work. And it's the feeling when you opened your news

36:13

app this morning and read about what's going on in

36:16

Ukraine. But then got a notification

36:18

about another school shooting, so we

36:20

should talk about gun

36:22

laws now. But did you hear about

36:24

that new law that's restricting

36:26

abortion access? Or the fact that

36:28

Syrian refugees are still

36:30

being relocated? Did you know

36:32

Flint still does not have access to clean

36:34

water? Unequal

36:36

access to education, child brides,

36:39

the Amazon rainforest is still

36:41

dying. There are concentration camps

36:43

in China

36:44

today, and no one is talking about them.

36:46

People are talking about it. That's how it came up on your app, which

36:48

maybe you should change the settings of.

36:50

Also, Flint has had water

36:54

for going on three years now.

36:56

It'll be four in February. It's a horrible crisis.

36:58

The results are still

37:01

being shown in children's

37:04

development there. But there's

37:06

water. Flint does have clean water, has

37:08

for years. So I don't

37:10

know, maybe

37:12

it's fatiguing when progress is made, but then is

37:14

ignored because untruthful

37:16

memes are more powerful than

37:18

just looking into the facts. Alright.

37:21

That's one undergrad, but the sentiment is

37:24

everywhere. How exhausting it

37:26

is to advocate for change

37:28

compared to what? Compared to not

37:30

gluing your hand to an oil painting or

37:32

compared to the generally

37:34

hard work that so many people do of

37:36

improving the world with more targeted

37:38

and practical solutions. They're

37:40

less dramatic. They're less grandiose, but

37:42

they actually get the job done. Might

37:44

leave a little in the tank also. Alexandria

37:46

Ocasio-Cortez, for whom I have a lot of respect in

37:49

many ways, told the New York Times that

37:51

she thinks of quitting. AOC:

37:54

I didn't even know if I was going to run for reelection this year. New

37:57

York Times: Really, why? Answer: It's the

37:59

incoming. It's the stress.

38:01

It's the violence. It's the

38:03

lack of support from your own party.

38:06

It's your own party thinking you're the

38:08

enemy. Well, you

38:10

get elected. And then you told the

38:12

other Democrats they were on notice if

38:14

they weren't progressive enough, and then

38:16

you primary them or organize

38:18

primaries against them because you were

38:20

trying to pick off moderate

38:22

Democrats. You were successful in a few cases,

38:24

unsuccessful in more cases. But yeah,

38:26

that was your choice in executing

38:28

your theory of change, which I think is wrong,

38:30

but I know is going to get you marked as

38:32

something less than a team player.

38:35

Again, I'm sounding like Robert Duvall

38:37

in that movie. I don't want to just pick

38:39

on young or youngish women who've removed themselves

38:41

from the arena over stress. In New

38:43

York City, Corey Johnson decided against running

38:45

for mayor, citing mental health. I've had

38:47

on Jason Kander, who talked about bowing out

38:49

of electoral politics to care

38:51

for his PTSD. And another counterpoint to women who've exited

38:53

isn't just men who've exited. Let's talk about women

38:56

who stick it out. I think of Illinois senator

38:58

Tammy Duckworth, who,

39:00

like Ardern, had a baby while in office. The Senate

39:02

is not an institution that makes child care

39:04

easy and she doesn't live in a

39:07

country with maternal leave unlike Ardern,

39:10

Duckworth is doing this all as a double

39:12

amputee by the way. Again, it's tough

39:14

to compare real people to impossible

39:16

standards. But Tammy Duckworth's

39:18

real. And, yeah, if Tammy Duckworth said

39:20

I couldn't do it anymore, we'd have to be

39:22

understanding. But shouldn't the

39:24

celebration be of doing

39:27

it at least as much as of

39:29

not doing it? We

39:32

definitely, as a society, have undergone

39:34

a shift away

39:36

from insensitivity. We were so

39:38

ignorant of mental health.

39:40

We were so callous. We

39:42

didn't identify with the humanity of

39:45

everyone else. We were sexist, absolutely. We still are.

39:47

Absolutely true. But the shift

39:49

isn't just from

39:52

insensitivity to now a properly calibrated sensitivity.

39:54

Because when we favor

39:56

the virtue of sensitivity, we're

39:58

choosing against the virtue of

40:02

resilience, not always. Sometimes it's just proper sensitivity.

40:04

Sometimes that which is called resilience

40:06

is something like unrealism

40:09

or toughing it out, or

40:12

cruelty on the part of the person

40:14

advocating it. And, you know, in

40:16

general it's counterproductive to paint the

40:18

world in absolutes. I

40:20

know Simone Biles wasn't a

40:22

coward for pulling out of a few Olympic

40:24

events. It is dangerous when you're

40:26

twisting in the air. But I

40:28

question whether she's more heroic

40:30

for pulling out of the Olympics than she

40:32

is for being the greatest gymnast of

40:34

all time. And that took more

40:36

sacrifice and pain than the average

40:38

person could possibly understand.

40:40

Again, it's not really about Jacinda

40:42

Ardern. Go live, laugh, love, Jacinda, I say

40:44

to thee. But if this

40:46

is a fight, if this is the

40:48

struggle we're told

40:50

it is, the combatants

40:52

need to ask themselves, who has

40:54

the advantage? The sensitive or

40:56

the resilient? I'd say you

40:58

have to have some of both.

41:00

Keeping in mind that the enemies, their

41:02

tanks seem to be perpetually full. And

41:10

that's it for today's show. Corey

41:12

Wara is the producer of

41:14

The Gist, and Joel Patterson

41:17

is the senior producer of the

41:19

Gist. Michelle Peska is the prime minister

41:21

of New Zealand. The GIST is presented

41:23

in collaboration with Libsyn's

41:26

AdvertiseCast. For advertising inquiries, go to advertisecast dot com

41:28

slash the gist. Oopoo g poooo.

41:30

Thanks for listening. When

41:32

you tell but

41:34

sour on this, because

41:36

you will die because

41:40

of me.

41:50

This

41:50

week, an RVER sponsored by

41:53

Progressive Insurance. Oh, then a

41:54

doctor is dropped at Cortez. Please. He's

41:57

just another

41:57

RV league

42:00

aided surgeon with good hair. No. He's different.

42:02

Nurse, we got a Class A motor

42:03

home with a detached driver's side mirror. Meet

42:05

me in the OR. Stat.

42:07

Right away

42:08

No. No. No. She's on break. I'll handle this one.

42:10

Oh, you conniving... When your

42:12

RV really needs saving, progressive has

42:15

you covered. See if you could save. A leader in RV insurance: Progressive

42:17

Casualty Insurance Company and affiliates. Coverage subject to

42:19

the policy terms.
