US 2.0: What We Have In Common

Released Monday, 29th January 2024

Episode Transcript

0:00

This is Hidden Brain. I'm Shankar Vedantam.

0:03

In 1980, Ronald Reagan became President

0:05

of the United States. He

0:08

quickly raised the temperature of the Cold War

0:10

and assumed a muscular stance toward the

0:13

Soviet Union. Let us be aware that

0:15

while they preach the supremacy of the

0:17

state, they are the focus

0:19

of evil in the modern world. That

0:23

September, a message flashed in a

0:25

secret bunker at Serpukhov-15, a

0:28

secret Soviet outpost that analyzed

0:30

satellite data from the United States. Inside

0:33

the bunker was a 44-year-old

0:36

Soviet lieutenant colonel named Stanislav

0:38

Petrov. The military

0:40

commander saw a button pulsing red. His

0:43

panel told him the unimaginable had

0:45

happened. The United States

0:47

had launched a ballistic missile. Within

0:53

minutes, the satellite data showed four

0:55

more missiles had been launched. It

0:57

looked like the United States was trying

1:00

to cripple the Soviet Union with a

1:02

sudden, deadly nuclear attack. There

1:04

were only seconds for the Soviets to

1:06

launch strikes in response. Stanislav

1:13

Petrov debated whether to report the

1:15

attack. If he did,

1:17

it could have triggered a massive Soviet

1:20

response. The

1:22

Soviet commander did not do what was expected

1:24

of him. He decided the

1:26

satellite data was wrong and did

1:28

not report the missiles. He

1:32

was right. The satellite signals

1:34

were reflections of sunlight off

1:36

clouds. You

1:40

probably have never heard of Stanislav Petrov, but

1:43

you might owe your life to him. Retaliatory

1:45

strikes could easily have killed half

1:47

the populations of both countries. Researchers

1:50

have estimated that the nuclear winter

1:53

that followed could have killed 2

1:55

billion people worldwide. There

1:59

is a lesson in the story about whether

2:01

fallible human beings should ever have nuclear weapons

2:03

at their disposal. But

2:05

our focus today is on a psychological

2:07

idea, how our minds work

2:09

when we are under attack. It's

2:13

also the start of a series we're calling US

2:15

2.0. As

2:17

we begin what promises to be a

2:19

pivotal and contentious election season in the

2:21

United States and many countries around the

2:23

world, we're taking a close look

2:25

at how we engage with our opponents. Over

2:30

the next few weeks, we'll explore the

2:32

assumptions we make about our allies and

2:34

our foes. We'll look to

2:36

history for lessons. And we'll

2:38

offer specific strategies to engage constructively

2:40

with our opponents, whether in

2:42

the political realm, at the dinner table,

2:45

or at work. We

2:48

begin with the psychology of threat,

2:51

this week on Hidden Brain. Support

2:54

for Hidden Brain comes from Discover. In

2:57

today's world, it seems the best treatment is reserved for

2:59

only a few. Well, Discover wants

3:01

to change that. That's why, with

3:03

your Discover card, you have access to 24-7

3:05

live customer service and

3:08

zero-dollar fraud liability, which

3:11

means you're never held responsible for unauthorized purchases. No

3:15

matter who you are, you'll feel special

3:17

with Discover. Learn

3:20

more at discover.com/credit

3:22

card. Limitations apply.

3:30

When something bad happens, it's

3:32

human nature to look for someone to

3:34

blame. Needless to say, that person usually isn't us.

3:39

The tendency to see others as villains

3:41

and to cast ourselves as innocent victims

3:45

causes harm in interpersonal relationships. It

3:48

may also lie beneath some of our deepest societal divides. At

3:51

the University of North Carolina, psychologist

3:53

and neuroscientist Kurt Gray is

3:56

a leading expert on the human brain.

4:00

He studies what happens in our minds when

4:02

we think about our political opponents. Kurt

4:05

Gray, welcome to Hidden Brain. Thanks

4:07

so much for having me on. Kurt,

4:09

I want to start our conversation with a story

4:11

that is very far away from politics, but I

4:13

think it has a deep

4:16

connection with politics at a

4:18

psychological level. When you

4:20

were a teenager, you used to drive

4:22

around with a bunch of high school

4:24

friends. I understand your car was nicknamed

4:26

fireball. Does that say something about how

4:28

fast you used to drive? It

4:31

does. I used to drive a

4:34

two-door Pontiac Grand Am. It wasn't

4:36

a flashy car, but I like to drive it

4:38

very fast and didn't always

4:40

pay attention. So one time

4:42

you and your friends were heading to a

4:44

movie when something fairly dramatic happened to you. Tell

4:47

me the story of what happened, Kurt. I

4:50

was 16, had just got my license,

4:53

and we were driving in the night

4:56

to go see a movie. It

4:59

had just rained, and so the

5:01

streets were shining in

5:04

the orange sodium lights.

5:06

We were roaring up the road because

5:09

the movie started in five

5:11

minutes and we were 10 minutes away. I

5:14

was in the right-hand lane. There was a lane

5:16

to my left, and

5:19

my friend riding shotgun all of a sudden

5:21

said, Kurt, you're going

5:23

to miss the turn. Turn left. I

5:27

hauled on the steering wheel to the left. I

5:29

didn't look in the lane next to me because

5:31

I had to cross the lane to be able to turn

5:33

left, and there was a car driving

5:35

next to me. Oh my gosh.

5:38

As I turned, this car slammed on

5:40

its brakes. I suddenly became aware

5:43

it was there. I slammed on my brakes. We

5:45

screeched and squealed. The roads were wet.

5:48

We spun around in the

5:50

intersection. I didn't hit it, this

5:53

other car. It didn't hit me, and we didn't hit anything

5:55

else. Luckily, everyone was

5:57

safe. No one was around. We ended

5:59

up stopped in the

6:01

wrong direction on the other side of the road,

6:03

just kind of in a desolate night.

6:07

I mean, it's still a heart-stopping moment though, because

6:09

I think in that instant, everyone must have seen

6:11

how close they came to a crash. It

6:14

was terrifying, and it happened so fast.

6:17

I mean, the music was so loud, we barely realized

6:19

anything. We were kind of wrapped in our own world.

6:22

And so I opened my

6:25

window to start to apologize to

6:27

the driver of the other car.

6:29

It was a silver Mercedes Benz. And

6:32

this driver, this guy gets out of the

6:34

car, he was in his early 20s, had

6:38

pretty nice clothes on. I remember he

6:40

had curly hair, it was gelled, he

6:42

had some silver chains on. And

6:45

I just started opening my mouth

6:47

to say sorry. And he looks

6:50

at me, just daggers right in

6:52

his eyes, and his shoulders

6:55

are set. And he is coming towards

6:57

me fast. And he says, you're

7:00

dead. Get out of the car.

7:03

I'm gonna kill you. So I took

7:08

off, and I was flying through a strip mall. It was

7:10

a really built-up kind of, like, big...

7:12

I

7:29

was totally panicked. I had

7:31

no idea where I was going. And

7:33

again, it was dark. No one was around,

7:36

even though the movie theater half

7:38

a mile away was bustling. No one was

7:40

around these stores. And so

7:43

I'm just taking turn after turn, and

7:46

he's getting closer and closer on my tail.

7:48

And eventually I turn into a

7:51

parking lot of a Home Depot store. And

7:55

he revs and gets

7:57

close to me, and I turn and I

8:00

go behind the store, into

8:02

the loading docks. And so there's

8:05

a steep embankment on my right, so

8:07

I'm really like funneled into this little

8:09

canyon with this guy behind me. And

8:13

he accelerates up

8:16

beside me and then in front of me and

8:18

starts kind of like cutting me off. He kind

8:21

of like corralled me into the wall, kind

8:23

of into a corner, and I realized I was

8:25

trapped. Caught.

8:28

I was so paralyzed with fear I could

8:31

barely think. And

8:34

he gets out of his car

8:36

and starts walking towards me again,

8:38

very menacing, he's so angry. And

8:40

all my friends, we were talking

8:42

on the way there, obviously we're

8:44

having fun, and now it's deadly silent, no

8:46

music. And my one friend

8:48

in the back who's thinking lucidly,

8:51

her name's Jesse, she

8:54

says, lock the door. And so

8:56

I immediately lock the door, and a

8:58

second later he grabs my handle and

9:00

starts to haul on the handle,

9:02

trying to pull me out of

9:04

the car. Wow. But the door's

9:06

locked at this point, so he can't get in. Exactly.

9:09

But I also realized I had to defuse

9:11

the situation. Because he's

9:14

so angry and much

9:16

bigger than me. And so I

9:18

do the only thing I can,

9:20

which is to start to

9:23

apologize. So I unrolled my window a

9:25

few inches, saying, I'm so sorry, I,

9:27

I know it's my fault, I wasn't watching

9:30

where I was going. And he gets,

9:32

you know, you're dead, I'm gonna

9:34

kill you. And then he reaches in through

9:36

the crack in the window and he tries

9:38

to unlock the door from

9:41

the inside with his hand.

9:43

Wow. And so I'm simultaneously

9:45

trying to stay calm and

9:47

continue to apologize to this

9:49

man while frantically slapping away

9:51

his hands so he can't

9:53

unlock the door. And

9:55

then it's clear he's not going to be able to

9:57

unlock it, and so he just starts slapping me

10:00

through the crack in the window, he's grabbing me

10:02

by my collar and just kind of shaking me,

10:04

just repeating, like, I'm going to kill you, you're

10:06

dead, again and again. How

10:09

does this end? The

10:12

friend of mine in the backseat, Jesse, you

10:14

know, the cogent one, her

10:16

mom happened to work for a

10:19

cell phone store. And

10:21

cell phones back when I was in high school were

10:23

not popular. Not everyone had

10:25

one, but she had one, lent

10:27

from her mom just in case anything

10:29

happened, if she had to make any

10:32

phone calls. And it was a, you know,

10:34

a kind of brick of a phone as the

10:36

old ones were. And she holds it

10:38

up and she says to this guy, I've

10:41

got a cell phone and I'll call the cops.

10:44

And so this doesn't sink in right away

10:46

to the guy who keeps on slapping me

10:50

and threatening to kill me. And

10:52

then eventually he stops, it sinks

10:54

in, and he takes his hand

10:56

away and he kind of bends down and he looks through

11:00

the crack in the window at all of us in the

11:02

car and he says, fine,

11:04

you call the cops and

11:06

I'll tell them what you did. And

11:09

this statement was perplexing

11:12

to me because clearly

11:16

in my mind, if

11:18

I explain what had happened to the

11:20

police, they would surely be

11:22

on my side. I was the one

11:24

getting assaulted, getting threatened

11:26

with murder, but

11:29

I was puzzled because he was

11:31

so confident that the police

11:33

would be on his side. I

11:35

couldn't understand how he could be so confident

11:37

that he was in the morally right position.

11:40

And yet I was confident that I was morally

11:42

correct. You know, I

11:45

think when things like this happen to us, you

11:48

know, we're very quick to try and

11:50

defend our particular points of view. But

11:53

as you're telling me the story, I'm an observer, and

11:56

I can see things from both points of view. I can

11:58

see how he must have been driving along the road,

12:01

someone swerves in front of him at

12:03

high speed, nearly kills

12:05

him, and he says, clearly

12:08

I'm the victim here. This crazy

12:10

teenager could have killed me. And

12:13

from your point of view, you're saying, you

12:15

know, I made a mistake, a

12:17

simple mistake, and I'm really

12:19

sorry about it, but surely that mistake

12:21

doesn't warrant somebody chasing me through dark

12:24

streets for mile upon mile, cornering me

12:26

and threatening to kill me. I

12:28

agree. And as I started to do

12:30

research on moral psychology, I

12:33

came more and more to

12:35

recognize the genuine concerns

12:38

that he had about being harmed.

12:41

He genuinely felt like he was

12:43

victimized, and so did I. And

12:45

so this presented a puzzle to

12:47

me. We experienced the

12:49

same situation and had completely opposite

12:52

perceptions of blame and harm.

12:56

I have to ask you, what happened that night

12:58

after your friend threatened to call the police and

13:00

he said, go ahead, call them. How did the

13:02

incident come to an end? Well,

13:05

he, after he

13:07

told us to go ahead and call the cops, he

13:09

kind of stood there and looked at

13:11

us for a while and, you know,

13:14

maybe recognized that we were all frightened

13:17

teenagers, you know, trapped in

13:20

a little metal cage, like veal

13:22

in some parking lot. And

13:24

he stormed back to

13:26

his car, slammed his door and squealed

13:29

off into the night. When

13:36

we come back, how Kurt's story

13:38

speaks to our deep political divides.

13:42

You're listening to Hidden Brain. I'm Shankar

13:44

Vedantam. Now

13:55

is the time to embrace a new wave of workers.

13:57

Every day your team grows younger, more digital. It's

14:00

easy to get distracted by email alerts, texts, and phone calls. But

14:07

psychologist Gloria Mark says there's another source

14:09

of distraction that's just as

14:37

insidious. Self-interruptions. And

14:42

we find that when the

14:44

number of external interruptions goes

14:47

down in the next

14:49

hour, the number of self-interruptions

14:51

goes up. So

14:54

it's almost as if people want

14:56

to maintain this level of

14:59

interruptions. And if you're not being

15:01

interrupted by something external

15:03

to yourself, then you self-interrupt.

15:08

Learn how to rebuild your attention span in

15:10

our recent episode, Finding Focus.

15:13

You can find it right now in this podcast feed

15:16

or on our website, hiddenbrain.org.

15:30

This is Hidden Brain. I'm Shankar Vedantam. At

15:33

the University of North Carolina Chapel

15:35

Hill, psychologist Kurt Gray studies the

15:38

science of political polarization. Along

15:40

with other researchers who study how we think

15:43

about our political opponents, Kurt finds

15:46

that we make a series of assumptions and draw

15:48

a series of conclusions about people

15:50

who disagree with us politically. These

15:53

assumptions and conclusions are especially powerful

15:56

because they happen so swiftly,

15:58

automatically, and unconsciously that

16:00

they don't feel like assumptions

16:02

or conclusions. They

16:04

seem like axioms, self-evident

16:07

facts. The first of

16:09

these has to do with what we

16:11

think is happening inside our opponents' minds,

16:13

or rather, what we think

16:15

isn't happening inside our opponents'

16:17

minds. We generally

16:19

think that our side is smart,

16:22

that we vote in our own self

16:24

interests and that we do things that

16:26

make sense. And we think

16:28

that we, you know, know what

16:30

policies are gonna help ourselves and the

16:33

country, but when we think of our

16:35

opponents, we think of them as being

16:37

quite stupid. We think of them

16:39

as not voting in their own

16:41

self interest and we think of them

16:44

as not wanting policies that are gonna help them.

16:46

And so in one study we

16:48

did in North Carolina, we

16:50

presented people with a bunch of

16:52

amendments that were part of an

16:54

election a few years ago and

16:57

we just asked people

16:59

about those amendments,

17:01

and why someone on the

17:03

other side might vote differently

17:06

than they do on those

17:08

amendments. And so we might

17:11

ask a progressive participant in

17:13

North Carolina, why might a

17:16

conservative person vote yes on

17:18

these things? And what we

17:20

found is that people

17:24

think that people on the other side

17:26

are dumb.

17:29

And they don't appreciate what's

17:31

best for themselves or their

17:34

country or the state. So in other

17:36

words, we think the other side

17:38

is filled with simpletons. Exactly. We

17:41

think that we are thoughtful and

17:43

rational and doing the best we

17:45

can with complex issues. And we

17:48

also think that those on the

17:50

other side are not those things

17:52

at all. They are tricked by

17:55

the media, they are deceived by

17:57

some leader that they cheer

18:00

for, and we just think that they're

18:02

stupid. So we think our

18:04

opponents are not very smart. But it's

18:06

also the case that we feel like

18:08

we don't like our opponents very much,

18:10

but we think our opponents have stronger

18:12

feelings about us. Tell me about the

18:15

work you've done looking at how we

18:17

think about our opponents and how we

18:19

think our opponents feel about us. So,

18:22

it's true that we don't like folks on

18:24

the other side. But a

18:26

lot of research finds that we

18:28

severely overestimate how much the other

18:31

side dislikes us. In

18:33

one paper, the researchers show

18:36

that we inflate our

18:39

estimates of how much the other side

18:41

dislikes us by somewhere between

18:43

fifty to three hundred

18:45

percent. Wow. So

18:49

Republicans might mildly

18:52

dislike Democrats in general. But if you

18:54

ask Democrats how much they think Republicans

18:56

dislike them, they think it's this

18:59

deep, burning hatred that seeks

19:01

to harm them and their political party,

19:03

and it's just not true. And

19:06

of course the reverse is

19:08

true as well, that Republicans believe that

19:10

the Democrats hate them. And I understand

19:12

that this work, I think it was

19:14

done by Samantha Moore-Berg in

19:17

2020, found that the more partisan

19:19

people are, the more strongly partisan people

19:21

are, the more they feel despised. That's

19:23

right. So no matter what side of the political

19:26

spectrum you are on, the further out on

19:28

that spectrum you are, the more

19:30

you inflate your estimates of

19:32

how much the other side hates you. I'm

19:35

wondering what the effect of this is. If

19:37

you and I are in conflict with one

19:39

another and we have a disagreement about something,

19:42

I can tell myself, you know, Kurt wants

19:44

X and I want Y, and we can

19:46

figure out, you know, is there a middle

19:48

ground between X and Y. But if I

19:50

tell myself, you know, Kurt doesn't just want

19:52

X, Kurt actually hates me and really wants

19:54

the worst for me, it becomes very difficult

19:57

to think about splitting the difference between X

19:59

and Y. That's right. So

20:01

Compromise and democracy more

20:03

generally requires that we're

20:06

willing to talk with others who

20:08

might disagree with us, cooperate with

20:10

others who might disagree with us.

20:12

And if you think that the

20:14

other side hates you, it can

20:16

be hard to even engage in

20:18

conversation with them, right? It's a,

20:20

it's a fight for survival.

20:27

So, thinking back to that incident that took

20:29

place when you were a teenager, the incident where

20:31

you got into a conflict with another driver,

20:33

one of the things that strikes me

20:35

is that in that moment when your friend

20:38

was threatening to call the police, you felt

20:40

righteous because you felt clearly I'm the one

20:42

who's been wronged here and the police will

20:44

see my side of the story and the

20:47

other driver felt self-righteous too and said surely

20:49

the police will see my side of the

20:51

story. In some ways, when we believe that

20:53

people hate us, it gives us license

20:56

to feel righteous. Absolutely right.

20:58

We feel righteous and

21:00

in the moral right because

21:02

not only are we hated,

21:06

but also because we're being harmed.

21:08

There's a villain on the

21:10

other side. They're attacking us and

21:12

that makes us the victim. And

21:16

when we're feeling victimized, we

21:18

feel licensed to protect ourselves

21:20

in any way that we can. I'm

21:26

wondering how much of this is about, you

21:28

know, what psychologists sometimes call, you know,

21:30

cognitive closure, seeking cognitive simplicity. If

21:32

I have to say, Kurt wants X,

21:35

I want Y, what's the middle ground?

21:37

It's complicated. But if I can just

21:39

say, Kurt hates me, Kurt is

21:41

clearly wrong, I'm in the right, in

21:43

some ways it's cognitively simpler. Yes,

21:46

Our minds want simplicity.

21:49

And that's especially true when it comes

21:51

to morality. And the

21:53

reason is because if we

21:55

acknowledged that the moral universe

21:58

is complicated, then we'd have to

22:00

acknowledge that our moral beliefs might

22:02

sometimes be wrong. And so when

22:05

I was in that car right

22:07

recognizing that the other driver had

22:10

a legitimate, you know, a genuine feeling

22:12

of victimhood meant that I might

22:14

be the villain. I might

22:17

be the perpetrator there. And that's

22:19

a tough pill to swallow.

22:26

So we've looked at how we believe that

22:28

our opponents are stupid and that our

22:30

opponents hate us, as you say. Another

22:33

belief that we hold about our opponents

22:35

is whether they care about democracy and

22:37

whether they care about our shared civic values.

22:39

Talk to me about this research,

22:42

Kurt. A team of

22:44

scientists have found that although people

22:46

generally support democracy, everyone

22:49

in America generally

22:51

supports democracy, we vastly

22:53

over-inflate how

22:55

much people on the other side don't

22:59

want to support democracy. So

23:01

our side is pro-democracy.

23:03

Their side is, if not

23:06

anti-democracy, at least willing

23:08

to let democracy slide to

23:10

win at politics.

23:13

That perception means that now we feel

23:15

threatened. Now we're in a war.

23:17

They are trying to destroy democracy,

23:19

and so in a war we

23:21

have to fight dirty too. And

23:23

the perception that the other side

23:25

is anti-democratic licenses

23:28

our side to also

23:30

do anti-democratic things. Because

23:33

they're willing to steal elections, for us to even

23:35

stay in the game, we should be willing to

23:37

bend the rules as well. That's right. I'm

23:40

wondering, how would you make sense of recent events

23:42

in the United States? For example, if

23:44

you think about the January 6th insurrection, how

23:47

would you think about that? I think many

23:49

Democrats would look at that and say, you

23:51

know, very clearly these are people who actually

23:53

tried to overturn the election. Clearly they are

23:55

anti-democratic. It's not just a perception in

23:57

my head.

24:00

The January 6th example

24:03

is a great one because from

24:05

the outside it seems like these

24:07

are the folks who are just

24:09

trying to destroy democracy. But I

24:11

think from the inside, if you look

24:13

at it from their perspective, they

24:16

think that they're upholding

24:18

democracy, because they were

24:20

led to believe that the election was

24:22

stolen. And maybe what they're

24:24

doing is undemocratic. But

24:26

as they thought that Democrats

24:29

kind of fired the first shot,

24:31

they're just retaliating. But in

24:33

that case there is a

24:36

difference between what everyday people

24:38

think and what political

24:41

elites are doing. So

24:43

I think when it comes to

24:46

the behavior of the many people

24:48

who went up to the

24:50

Capitol on January 6th. It's easy

24:52

to see how their worldview

24:54

supports the idea that they

24:57

are standing up for democracy.

24:59

And for freedom. I'm not

25:02

sure that I would be as sympathetic

25:04

to some elites who are propagating that

25:06

idea. So

25:08

we've looked at how people think the

25:11

other side is stupid. The other side

25:13

is irrational. The other side is anti

25:15

democratic. You recently published a study, Kurt,

25:18

that reported on what Democrats and Republicans

25:20

in the United States believe about one

25:22

another when it comes to topics such

25:24

as murder, child pornography, and embezzlement. Tell

25:27

me about the study. Yeah,

25:29

so that study, I should say it's

25:31

not yet published, but it is available

25:33

online, all the data and the manuscript.

25:35

And in that study, we wondered how

25:38

people would view the morality of the

25:40

other side. And of course we already

25:42

know that progressives and conservatives

25:44

disagree about hot-button

25:47

issues. So you might think

25:49

the other side is wrong about abortion,

25:52

about capital punishment,

25:54

or immigration. But

25:57

there are many moral issues that seem

25:59

totally uncontroversial, like murder

26:02

or embezzlement or, as you say, child

26:04

pornography. And so the

26:07

question is, what do people

26:10

think about those on the other side when it

26:12

comes to those issues? Would

26:14

Republicans think Democrats are okay

26:17

with child pornography or embezzlement or

26:19

infidelity and so forth? And

26:22

vice versa. So

26:25

when we look at the data, we find

26:27

that consistent with what

26:29

we've been talking about before,

26:31

people vastly overestimate how

26:34

much those on the other side

26:36

see these obvious moral wrongs as

26:38

acceptable. We show

26:41

that both Democrats and Republicans

26:44

think that 15% of the other side view child

26:49

pornography as acceptable. That's

26:53

crazy. The real answer

26:55

is basically zero. But

26:58

we really inflate how much

27:01

the other side is

27:03

evil and lacks a basic moral sense. So

27:06

I mean, we're in really deep waters

27:08

here because now our dislike

27:10

for one another is not just about policy

27:12

matters, and we're not even dressing it up

27:14

as being about policy matters. We're

27:16

actually saying our opponents now are just evil people who are

27:18

bent on just destroying the world. So

27:21

I think this rampant

27:23

polarization makes people endorse

27:26

something that we call a destruction

27:29

narrative where the

27:31

other side is motivated by

27:35

the urge to destroy our

27:37

side and also America. And

27:40

it's really the sense that the other side wants

27:42

to watch the world burn. I

27:46

should say that people think

27:48

that the other side is more stupid

27:50

than evil, more

27:53

misguided than demonic,

27:56

but it's still not a great place to be, obviously.

28:00

There's still a sense that the other side is motivated

28:02

by some destruction. When

28:05

the other side passes some

28:07

policy with some unintended

28:10

negative side effect, as all

28:12

policies have, some

28:15

research shows that people

28:17

think that those on the other side

28:20

intend those negative

28:22

policy consequences, that

28:24

they want to hurt people

28:26

on the other side. But

28:28

of course that's not true. Folks are

28:31

just trying to do the best they can when it comes

28:33

to these policy preferences. So

28:38

we've talked in different ways and offered

28:40

different examples of how in our political

28:43

discourse we want to see

28:45

ourselves as being the

28:47

victim, as being potentially harmed, and seeing the

28:49

other side as the perpetrator, the other side

28:52

as the villain, the people who are trying

28:54

to do us harm. And of course the

28:56

other side feels exactly the same way. But

28:58

it raises a really interesting point, which is

29:00

that the animating force in much of politics

29:03

might not be animosity and

29:05

aggression, but it might be a

29:07

feeling of victimhood, a feeling that we

29:09

are under siege, that we are under

29:11

attack. Can you talk about this idea? Because I think

29:14

that's not the way most people think about

29:16

politics, that people think about politics as

29:18

being a blood sport, very aggressive. But

29:20

the picture that you're painting, I think,

29:22

is slightly at odds with that, where

29:24

the feelings of vulnerability that we have

29:26

are in fact the dominant drivers of

29:29

our perceptions and behavior. That's

29:32

right. So when we think about the

29:34

motivations of others, we think that they

29:36

are aggressive. We think that they

29:38

are trying to destroy us. We think they

29:40

are motivated by some deep instinct

29:43

to hurt us. But

29:46

my reading of the

29:48

literature and my work suggests that ultimately

29:51

people are motivated by this desire

29:53

to protect themselves, to

29:56

guard against threats. They're motivated by

29:58

a sense of vulnerability. So

30:01

rather than a destruction narrative, I

30:03

think that politics is better

30:06

described by a protection narrative, where

30:08

people are trying to protect themselves

30:10

and their vulnerabilities. Where

30:12

do you think this comes from, this sort

30:15

of constant need to protect ourselves, to see

30:17

ourselves as under threat? Where do you

30:19

think this comes from, Kurt? I think our

30:22

desire to protect ourselves from threat

30:24

in politics and the modern world

30:27

comes from way back in human nature.

30:30

I think that the human

30:32

experience is ultimately an experience

30:34

of threat and fear

30:37

and worry about our

30:39

vulnerabilities. I'm

30:41

wondering, Kurt, if some people might say, you

30:43

know, that can't possibly be true. Humans

30:46

are at the apex of

30:49

the planet right now. You know, every

30:51

other species should fear humans, because in

30:53

fact, we are the most deadly predator

30:55

on the planet right now. But

30:58

you're making the case that humans, in fact,

31:00

are motivated almost entirely by fear, by vulnerability.

31:03

There seems to be a mismatch there. There

31:06

is a mismatch. And there

31:08

is no doubt that today we are

31:10

apex predators. We can

31:12

hunt wolves from helicopters. We

31:15

can remake the world. But

31:18

there is a fallacy in thinking that just

31:20

because we are predators today, that that's how

31:22

we have always been. And

31:25

in fact, if you look back

31:27

in the mists of time, where

31:29

our minds and our human

31:32

nature evolved, we were not predators at

31:34

all. Instead, we were prey. We

31:37

evolved not as predators, but as prey.

31:41

I understand that you had an incident in your own

31:43

life that brought home to you your own

31:45

vulnerability as an individual creature. Tell

31:47

me the story of what happened, Kurt. Before

31:51

I wanted to be a social psychologist,

31:53

I thought I wanted to be a

31:55

geophysicist. And a

31:57

geophysicist spends a lot of time out

31:59

of doors in remote locations

32:01

looking for natural gas

32:04

or oil. And

32:06

so I was very far north.

32:09

So if you drive to the border between

32:11

Montana and Alberta

32:14

and you drive 18 hours straight north

32:16

and then you turn left and

32:19

drive for another hour, you come to

32:21

Rainbow Lake, Alberta. And

32:23

then from Rainbow Lake, which is extremely

32:26

isolated, you take

32:28

a helicopter ride, another 30 miles into

32:30

the bush, that's where we

32:32

were looking for natural gas. So

32:35

there was a crew of five of us, four

32:38

college students, and one old man

32:40

named Ian, who was at the

32:42

time 25, but he seemed like an

32:45

old man to us. And we

32:47

would spend our days in the middle of the

32:49

wilderness driving around on

32:51

snowmobiles and pounding stakes

32:53

into the ground to try to find natural

32:55

gas. On

33:00

one of these expeditions, Kurt and his team

33:02

had just finished a tough day's work. But

33:05

before the helicopter could come fetch them, bad

33:07

weather rolled in. The helicopter

33:09

pilot told them he'd come get them

33:12

the next morning. But

33:15

it was winter and it was very cold. The

33:18

five of us were literally

33:21

in the middle of nowhere and

33:23

had no water, no food, except some

33:26

leftover sandwiches from lunch. And

33:29

we had to spend the night

33:31

alone in the middle of the Canadian

33:33

wilderness. And

33:38

so we went off to the forest. We

33:40

built a lean-to. We

33:42

gathered some firewood. We lit it with gasoline,

33:45

which I wouldn't recommend unless that

33:47

was your only source of anything

33:49

flammable in the middle of the wilderness. And

33:52

then we just sat down to wait through the

33:54

night until the helicopter might be able to come

33:56

pick us up. assuming

34:00

it was pitched apart from the fire, it was pitch

34:02

dark. It was yeah, it was

34:04

pitch dark. It was minus 10

34:06

degrees Celsius. We had

34:08

no other blankets, we had no other

34:11

jackets other than our one

34:13

piece fire-retardant Nomex

34:16

coveralls. You know, we weren't

34:18

prepared to weather a night outside. So

34:21

we shared the remnants of our lunch, we

34:24

sat around the fire talking and then it was time

34:26

to go to bed. And so we all five in

34:28

a row, we spooned with each other to stay warm.

34:31

But that proved to be too cold in the

34:33

night. And so we eventually found our way back

34:35

to the fire. And we curled up around it

34:37

in a circle and tried

34:40

to sleep while the flames

34:42

were high and while we were warm.

34:44

And then when the flames died down, we

34:46

would wake up and we'd add

34:48

some wood to the fire and try

34:50

to sleep again. And we did that for 10,

34:53

12 hours. Now,

34:55

obviously, you know, there's no sort of, there

34:57

are no human predators out there, but presumably

35:00

there are animals. Before

35:02

this night, I had

35:04

never thought about predators, right?

35:06

I grew up in a city

35:09

in Canada. But there

35:11

was a couple of times when I woke up in the middle

35:13

of the night where I felt

35:15

uneasy. And you

35:19

might say, of course you felt uneasy because you were

35:21

stuck in the middle of the wilderness hoping not to

35:24

die, you know, of cold

35:26

or thirst or something like that. But

35:30

I just couldn't shake the sense that, you know, there

35:32

was something out there and it's so

35:34

dark, you can't see beyond this little circle of

35:36

light. So you could look into the woods and

35:39

there was absolutely nothing but blackness. And

35:44

it's not like there's some serial killer out there,

35:47

right? It's not like a horror movie because

35:49

we're so far from civilization.

35:52

But I still got this uneasy

35:54

sense. And then bit

35:56

by bit, the sky turns gray, it's

35:59

still pretty cloudy out, and

36:01

we get up and we stretch and

36:03

as we walk around the campsite we notice that

36:06

there are paw prints all the way

36:08

around. Very close to where

36:10

we were sleeping and

36:14

they were lynx paw prints and

36:17

so what had happened in the

36:19

night was that some lynx had

36:21

heard us, had smelt us and

36:23

had crept close to us in the night. There

36:27

are overseas listeners who live in tropical climates, can

36:29

you tell me what they are? Lynx

36:32

are big fluffy bobcats.

36:36

I don't think they could take down an adult

36:38

man but I think

36:40

they could probably eat a small

36:43

child and certainly they could

36:45

rip out the throat of

36:47

someone who's sleeping in the

36:49

darkness and that realization

36:52

hit home to me

36:54

as we sat there in the morning waiting to

36:56

get picked up by the helicopter. And

37:00

we couldn't have done anything to prevent this because

37:02

humans are weak and

37:05

we don't have nails and we don't have teeth and

37:07

if there had been a real predator, if it

37:10

had been a mountain lion then

37:12

we wouldn't have stood a chance.

37:16

Not very long ago this was not unusual

37:18

at all. 150 years

37:20

ago and earlier stuff like this

37:22

happened probably all the time in all parts of the

37:24

world. We were living in

37:27

close proximity to nature and

37:29

in fact were vulnerable in ways that we

37:31

simply don't feel today. Absolutely.

37:33

So for the

37:35

last millions of

37:38

years of our evolution we

37:40

have been vulnerable to predation and it's

37:42

really only in the last hundred,

37:46

couple hundred years that that threat has

37:48

basically dropped down to zero for most

37:50

of us. For

37:52

a long time people were hunted by wolves,

37:56

tigers, bears,

37:59

but even today, in

38:01

our industrialized world, many people are

38:03

still vulnerable to predators. There

38:06

was a case in Canada several years

38:08

ago of a pop

38:10

singer going for a walk through a Nova

38:13

Scotia forest, and

38:16

she was killed and

38:18

partially eaten by a pack of

38:20

coyotes. I'm

38:23

wondering what effect this has on our

38:25

minds, the fact that in some ways

38:27

we've had a very long evolutionary history

38:29

where we are vulnerable and potentially under

38:31

threat, and a very recent

38:34

evolutionary history where that threat has receded.

38:37

What has that done to our minds, Kurt? Our

38:40

long-standing vulnerability to predation has

38:43

really shaped our psychology

38:45

in our modern world.

38:47

Even if we don't think about predators today

38:49

very much, we are

38:51

still fundamentally concerned with

38:53

protecting ourselves from threats. And

38:57

those threats might not be sitting in

38:59

the forest or in the jungle behind

39:01

our houses, but

39:04

we are constantly bombarded with threats

39:07

today when it comes to politics,

39:09

when it comes to morality. So

39:12

we bring forward this long-standing

39:15

evolutionary feeling of threat

39:18

into our modern political realm. And

39:20

this is why we typecast the other

39:22

side as predators. And

39:24

I think it's important to recognize this because

39:28

fundamentally those folks on the other

39:30

side who we see as predators

39:33

also feel like prey.

39:36

Even the other driver in the parking

39:39

lot that night, he felt like the

39:41

victim, like the prey. Of

39:49

course, that doesn't mean that liberals and conservatives have to

39:51

define harm the same way. What

39:53

you might consider harmful might not necessarily be what

39:55

I consider harmful, which is why we can be

39:58

worried about different issues. That's

40:00

exactly right. So, in

40:02

my research, we find

40:05

that liberals might emphasize

40:07

harms to the

40:09

environment, or

40:12

they might emphasize harms to members

40:14

of disadvantaged groups, whereas

40:16

conservatives might emphasize harms to

40:20

social order, to those trying

40:23

to protect our society,

40:25

like police, and

40:27

perhaps to religious entities,

40:30

like God or the Bible. You

40:33

can see this very well, even

40:35

with hot-button issues, like immigration.

40:39

So progressives might worry about

40:41

the harm done to undocumented

40:43

immigrants, who they perceive as

40:45

vulnerable, whereas conservatives might

40:47

worry about the harm done

40:49

by undocumented immigrants, who might

40:52

be criminals or drug traffickers in

40:54

America. So both of those

40:57

positions are motivated by a desire

40:59

to protect us from harm. They

41:01

just emphasize competing harms in that

41:03

issue. It's

41:09

a problem we face in nearly every dimension of our

41:11

lives. Our brains

41:13

were sculpted by evolution over thousands of years.

41:16

Our minds today are the product of

41:18

those evolutionary forces. We are

41:20

walking around with machines that were designed, if you

41:22

will, in the Stone Age. Unsurprisingly,

41:25

there are mismatches between what those brains

41:28

were designed to do and the challenges

41:30

we confront today. When

41:33

we come back, how understanding the

41:35

psychology of our political conflicts can

41:38

help to bridge seemingly intractable divides.

41:43

You're listening to Hidden Brain. I'm Shankar

41:45

Vedantam. It's

41:55

easy to get distracted by email

41:57

alerts, texts and phone calls. But

42:00

psychologist Gloria Mark says there's another

42:02

source of distraction that's just as

42:05

insidious. Self-

42:07

interruptions. And

42:10

we find that when the

42:12

number of external interruptions goes

42:15

down in the next

42:17

hour the number of self-interruptions

42:20

goes up. So

42:22

it's almost as if people want

42:24

to maintain this level of interruptions.

42:27

And if you're not being interrupted

42:30

by something external to yourself, you

42:34

self-interrupt. Learn

42:36

how to rebuild your attention span in our recent

42:38

episode, Finding Focus. You

42:41

can find it right now in this podcast feed or on

42:44

our website, hiddenbrain.org.

42:56

This

43:01

is Hidden Brain. I'm Shankar Vedantam. Psychologist

43:04

Kurt Gray studies the science of political

43:07

polarization. In

43:09

a number of studies he and other

43:11

researchers have found that Democrats and Republicans

43:13

in the United States and partisans in

43:15

other countries have very strong and very

43:18

wrong views about their opponents We

43:21

tend to think our opponents are idiotic and

43:23

irrational. That's the mild stuff. We

43:26

also think they're anti-democratic evil and

43:28

are okay with children being harmed.

43:31

We ask ourselves, what is wrong with those

43:33

people? How can any decent person

43:35

have such terrible and misguided thoughts? Our

43:38

certitude about our moral superiority means we don't

43:41

have to understand our opponents or give them

43:43

the benefit of the doubt So

43:49

you've done a lot of work Kurt sort of

43:51

looking at ways in which we can turn down

43:53

the temperature on political

43:55

polarization and you say that

43:58

one of the first and most practical things

44:00

that we can do is to

44:02

frame our positions on issues in

44:04

terms of harm. So

44:07

in other words, we think that facts

44:09

are what bridge divides, but in fact,

44:11

it's our shared concern about harm that

44:13

actually is what bridges divides. That's

44:15

right. And we have

44:17

a big paper with 15 studies that

44:20

shows that people think

44:22

that facts are the key to bridging

44:24

divides, but when you actually give people

44:27

facts in heated conversations about morality, it

44:29

doesn't work. Instead, what does

44:31

work to bridge divides is allowing

44:34

people to talk about

44:36

their own concerns with harm, to

44:38

talk about their own worries

44:41

about threats and the pain that they

44:43

or their family may have suffered. And

44:46

that makes them seem less like sheeple.

44:48

It makes them seem less stupid and

44:50

less evil. Even though

44:52

they disagree with you, they have

44:54

the same concerns about harm. So

44:56

they're similar to you, but

44:58

it also makes sense that they would make this

45:01

decision. And so now they're

45:03

not voting against their own self-interest. They're not

45:05

being irrational. What they do

45:07

makes sense, and that makes people willing to

45:10

respect them and have conversations with them. Kurt,

45:13

one of the things you say is that it's important for

45:15

us to remind ourselves that the

45:17

other person's feelings about harm are

45:19

genuine, even if those feelings of

45:21

harm seem unfounded to us. Why

45:24

is this hard to do and why is it helpful? It's

45:27

so hard to recognize

45:29

the authenticity of other people's

45:32

perceptions of harm, especially

45:34

when those perceptions are

45:37

opposite to our own. And that's

45:40

because our perceptions of harm are

45:42

deeply intuitive. We feel them in

45:44

our gut. If

45:47

you're a pro-choice person thinking

45:50

about the abortion debate, in

45:53

your gut, you know it's

45:55

about protecting women. But

45:58

if you're a pro-life person, then in your gut,

46:00

you know it's about protecting unborn

46:02

children. And the power of those

46:05

intuitions about harm make it difficult

46:07

to realize that the other person

46:09

is authentically trying to protect someone

46:11

from harm. But

46:13

it's so crucial, because that's

46:16

what we need to do to recognize

46:18

that those on the other side are

46:20

motivated by protection and not destruction.

46:29

You also talk about an idea

46:31

called moral humility, which you say

46:33

is different from intellectual humility. Yeah.

46:37

There's been a lot of discussion

46:39

these days about intellectual humility. And

46:42

I think it's important to recognize that

46:44

you might be wrong about how the world

46:46

works. But it's

46:48

much harder to think

46:51

that your moral judgments might not be

46:53

100% right. We are deeply motivated to

46:58

think that we are good

47:00

people. And yet moral humility

47:02

is appreciating that even

47:04

if we are good people, other

47:07

people might be good too. And even

47:09

if they disagree with us, they're still

47:12

good. And so what that means is

47:14

that we might not be 100% right about our moral

47:17

judgments. And it's hard to have that kind of

47:19

humility. I

47:24

want to talk about a demonstration

47:26

of moral humility that was in

47:28

a recent documentary called Guns and

47:31

Empathy. It was produced by a

47:33

nonprofit organization called Narrative 4

47:35

in partnership with New York Magazine. And

47:37

during this documentary, one of the participants

47:40

was a woman named Carolyn Tuft, who

47:42

was shot three times in a mass

47:44

shooting at a mall in Salt Lake

47:46

City, and her 15-year-old daughter was killed.

47:49

I want to play you a little

47:51

clip of what Carolyn said. could

48:00

actually lose their business, lose

48:03

their house, lose their family. You

48:05

know, I think that that gun would not have

48:08

so much, so much

48:10

hold. And a

48:12

little while later, Kurt, there was another

48:14

person who spoke at the same event.

48:17

Her name was Jillian Weiss, and she had a

48:19

very different view on guns. She was

48:21

born with a disability, and she bought

48:23

a pistol after she was stalked. And

48:26

after she learned that disabled women were

48:28

much more likely to be sexually assaulted

48:30

than women without disabilities. Let me

48:32

play you a clip of Jillian. I

48:34

have my gun with me in my home, and

48:37

I feel so much safer

48:39

knowing that should anything happen, I

48:42

can defend myself. What

48:44

is the effect of hearing these two different

48:46

stories on people who are listening?

48:48

What's happening in their minds? Listening

48:51

to these stories might not persuade you, but

48:54

it does make you see the position

48:57

of the person telling these stories as

49:00

rational, as something

49:02

that makes sense, and it

49:04

makes you respect that position, and

49:06

makes you willing to interact more with that

49:08

person. And those feelings

49:11

of respect and the willingness to

49:13

engage are essential in

49:15

our pluralistic democracy, right?

49:18

We depend on compromise,

49:21

on open dialogue in our society,

49:23

and so these stories of harm are

49:26

a good first step at motivating

49:28

the kind of respect that we

49:30

need to decrease polarization

49:33

and increase our willingness to engage

49:35

with others. It

49:37

often seems to many people, Kurt, that the

49:40

divides that we have in

49:42

our country and in many countries around the

49:44

world are so intractable, so painful, that it

49:46

can seem as if there

49:48

is no way out, that there are no solutions,

49:51

that there's no hope in sight, and I think

49:53

that's understandable because the temperature has

49:55

been turned up to such a pitch. But

49:58

you cite a historical example of

50:00

a moment when people put aside their differences

50:03

and truly saw the humanity of the other

50:05

side. And it occurred in 1914 in the

50:08

First World War. Can you tell us what happened?

50:11

It was the first Christmas of

50:14

the First World War. And,

50:16

you know, the sides were

50:18

dug in in their trenches. They

50:21

had the barbed wire up. And

50:24

even though they were supposed

50:27

to be killing each other, as

50:29

Christmas approached, they

50:32

started being kinder to each other. Right? They

50:34

would hear each other singing Christmas carols in

50:36

the trench over, and might

50:40

wave at each

50:42

other. Right? Shout some pleasantries.

50:45

And eventually the situation

50:48

got so positive

50:51

that the Germans and the Allies

50:53

decided to have a soccer game

50:55

in no man's land where

50:57

they exchanged gifts. And so this

50:59

is really an act of defiance

51:02

against, you know, the generals

51:04

who wanted them to kill each other.

51:06

And it was an act of camaraderie

51:08

and bridging divides that I think is

51:11

remarkable even today. Their

51:15

mission was to literally murder

51:17

each other. And yet, they

51:20

found space to come together and

51:23

see past their disagreements. I

51:27

think it holds powerful, applicable

51:29

lessons for our own time.

51:32

The elites in our government and in the

51:34

media are telling us to hate each other

51:37

and telling us that we should hate each other.

51:40

But we already know from all

51:42

the scientific work we talked about today that the

51:44

other side actually doesn't hate us as much as

51:46

we think. And so this

51:48

should be an inspiration that even in

51:51

war, real war, people can rise up

51:54

and come together and we can too. Kurt

52:04

Gray is a psychologist and neuroscientist at

52:07

the University of North Carolina at Chapel

52:09

Hill. He plans to publish a

52:11

book about these ideas in 2025. The

52:14

book is going to be titled, Outraged,

52:16

Why We Fight Over Morality and

52:19

Politics. Kurt, thank you so much

52:21

for joining me today on Hidden Brain. Thanks

52:23

for having me. Have

52:27

you tried to talk with someone who disagrees with

52:29

you about politics? Have you found effective

52:31

ways to get through? Have you

52:34

lost friends over political disagreements? If

52:37

you'd be willing to share your stories with the

52:39

Hidden Brain audience, along with any questions you have

52:41

for Kurt Gray, please record a voice

52:43

memo and email it to us at ideas

52:46

at hiddenbrain.org. That

52:48

email address again is ideas

52:51

at hiddenbrain.org. Use

52:53

the subject line, politics. Hidden

52:57

Brain is produced by Hidden Brain Media. Our

53:00

audio production team includes Bridget McCarthy,

53:02

Annie Murphy-Powell, Kristen Wong, Laura

53:05

Kwerel, Ryan Katz, Autumn Barnes,

53:07

Andrew Chadwick, and Nick Woodbury.

53:10

Tara Boyle is our executive producer. I'm

53:13

Hidden Brain's executive editor. Next

53:17

week in our Us 2.0 series,

53:19

the mistakes we make when we try to

53:21

change someone's mind, and a

53:23

better way to talk to political opponents. If

53:26

you're asking somebody to give up their moral

53:29

values, people are willing to fight and die

53:31

for their values, right? Like people really, really

53:33

are invested in not changing their minds about

53:35

that. I'm Shankar

53:37

Vedantam. See you soon.
