The psychology of political messaging, with Drew Westen, PhD


Released Wednesday, 28th September 2022

Episode Transcript


0:00

Once again, it's election season.

0:03

This year, contests from governorships

0:05

and Senate seats to city councils

0:08

and school boards will turn on headline

0:10

grabbing issues including abortion, the

0:12

economy, climate change,

0:14

and education. But political

0:17

and psychological research has found that

0:19

most often, voter behavior

0:21

is not driven by the nuances of

0:23

policy debates on these topics.

0:26

Instead, it's how voters feel about

0:28

candidates and political parties, whom

0:30

they trust to share their values and the

0:32

emotions that politicians' messages,

0:34

speeches and ad campaigns evoke.

0:37

So how do emotions drive our political

0:39

behavior? What makes an effective

0:41

political speech or ad campaign versus

0:43

one that falls flat? How

0:45

can small changes in wording reshape

0:48

voters' opinions on controversial topics?

0:51

And what role might political

0:53

messaging play in shaping our increasingly

0:56

polarized public discourse?

1:00

Welcome to Speaking of Psychology, the

1:02

flagship podcast of the American Psychological

1:05

Association that examines the links

1:07

between psychological science and everyday

1:09

life. I'm

1:10

Kim Mills.

1:13

My guest today is Dr. Drew Westen,

1:16

a clinical, personality, and political

1:18

psychologist and professor in the departments

1:20

of psychology and psychiatry at

1:23

Emory University. He also

1:25

runs the consulting firm Westen

1:27

Strategies, which advises progressive nonprofits

1:30

and Democratic candidates on how to talk

1:32

with voters about a range of issues, from

1:35

abortion to immigration to taxes.

1:37

He's tested political messages with thousands

1:40

of voters over the past two decades. He

1:42

is also author of the two thousand eight book,

1:44

The Political Brain: The Role of

1:46

Emotion in Deciding the Fate of the Nation.

1:49

And he's working on a follow-up book to be published

1:51

in twenty twenty three. He is

1:53

a frequent contributor on political

1:55

and psychological issues on radio, television,

1:58

and in print in venues such

1:59

as The New York Times, The Washington Post,

2:02

and CNN. Thank

2:04

you for joining me today, Dr. Westen. Thanks

2:06

for having me again. Before I

2:08

toss out my first question, I want to make

2:10

clear that APA is a nonpartisan

2:12

organization, so I want to be sure that

2:14

our discussion is balanced and that we will talk

2:17

about Democrats, Republicans, and

2:19

independents without fear or favor.

2:22

How does that sound to you?

2:23

I promise to be just like Fox

2:25

News, fair and balanced. But

2:27

I will say

2:29

that you

2:31

did introduce me as someone who works

2:33

with Democrats, and Republicans

2:35

will be especially happy with

2:37

what I have to say because Republicans

2:38

tend to be very good at messaging

2:40

and Democrats tend to be really off-planet.

2:42

So I'll be more critical of Democrats. Alright.

2:46

Thank you. So my first

2:48

substantive question.

2:50

In an article a couple of years

2:52

ago called How to Win an Election, you

2:55

wrote that politics is less

2:57

a marketplace of ideas than a

2:59

marketplace of emotions. Why

3:01

is that? Why is it so crucial

3:04

for politicians to reach voters on an

3:06

emotional level rather than just an

3:08

intellectual one? You

3:09

know, in some ways, you could answer that question

3:11

by asking, how do we choose a mate?

3:14

or how do we choose

3:16

dinner? You know, we don't go through

3:18

a checklist. If

3:20

if we do go through a list of, let's see, I'm

3:22

gonna make a list of pros and cons. There's

3:24

fifteen on this side, fifteen on that side.

3:27

We would typically be divorced in about

3:29

two years because that's

3:31

simply not how our minds

3:33

work. And

3:36

I guess another way to put it is that emotions

3:38

provide the fuel for human behavior.

3:41

and cognition, or

3:43

thinking, provides the roadmap. So

3:45

there are really two basic

3:47

questions that voters ask

3:49

when they're trying to decide which

3:52

party or candidate they support. One

3:54

is, does this person understand

3:57

and care about people like me?

3:59

And

3:59

the

3:59

second is, does this person share my values?

4:02

And

4:02

you know what? If you're an educated voter,

4:04

if you're not a terminally wonky voter,

4:06

Those are the same two questions that you're asking.

4:09

In an emotional way, they're actually

4:11

pretty rational questions most of the time.

4:13

So

4:14

much of your work has to do with

4:16

exploring the importance of language

4:18

and political messaging. One example

4:20

I've heard you put forward is that people react

4:22

completely differently to a program

4:25

that's described as, say, helping

4:27

the unemployed versus one

4:29

that helps quote, people who have

4:31

lost their jobs. Why

4:33

are these seemingly small changes in

4:35

language so significant?

4:38

From a neuroscience perspective, they're

4:40

activating what you would call,

4:43

and you would call this from a clinical perspective

4:45

as well, different networks of associations

4:48

in our minds and brains, that is, interconnected

4:50

sets of thoughts and feelings and

4:52

images and memories and values so

4:55

that when you activate part of that network, you

4:57

activate the rest. So when you, for example,

4:59

say,

5:00

Oh, I'm I'm concerned about

5:02

the unemployed.

5:03

You're taking real people with

5:05

pain-lined faces, who may

5:07

have just had to tell their kid that

5:09

they're moving: sweetie, this

5:11

isn't gonna be your room anymore. That's

5:14

a really evocative thing and it makes

5:16

you feel something. You're turning people

5:18

like that into a nameless, faceless

5:20

abstraction, the unemployed, which

5:23

is also an other. It's like,

5:25

if I'm not unemployed right now, boy,

5:27

I'm sure glad I'm not them.

5:30

And as soon as you themify

5:32

somebody, you decrease empathy

5:34

for them.

5:35

So can you really flip public opinion

5:37

on a question just by describing the

5:40

topic in a different way? I mean, how much

5:42

difference might this make in terms of

5:44

poll numbers?

5:45

I'll give you a couple of examples. Recently,

5:48

I did a study on Southern

5:51

voters looking at

5:53

how to defuse the

5:55

issues of race and race-baiting

5:57

that were brought up by the

5:59

right's

5:59

attack on quote unquote, critical

6:02

race theory, or CRT, which,

6:04

you know, isn't taught in any

6:06

school. It's not taught in

6:08

the vast majority of

6:11

undergraduate universities. It's taught

6:13

in a handful of law school classes. In

6:15

fact, there's one law school class

6:17

at Emory on it. And

6:19

and the professor asked me to guest teach

6:21

on how to talk about it. So that's the

6:24

extent to which children, quote,

6:26

are being exposed to critical race theory. But

6:28

But what the right did

6:30

was completely unethical, but very

6:32

smart. And they're really open about

6:34

it. They basically said, alright.

6:36

How can we take

6:37

anything

6:39

that's being taught

6:41

about race

6:42

or racism

6:44

in elementary and high schools,

6:47

middle schools, how do we turn that into

6:49

something that sounds really scary? Well,

6:51

let's call it critical race

6:54

theory. Why? First of all, it's

6:56

critical of America. Now that's an

6:58

unconscious association. They say critical.

7:00

They don't say of America, but that's

7:02

part of that network of associations that's

7:04

activated unconsciously. So you get in

7:07

there, it's critical of America, critical

7:09

of white people. It's

7:11

it's about race, and it's

7:13

just a theory. No

7:15

facts. Just a theory promulgated

7:18

by those socialist communist people

7:20

who just hate white people and

7:23

can't stand white privilege and all that kind

7:25

of thing. So that's the right side of it.

7:27

It turns out that you can

7:29

defuse that really,

7:31

really easily, but Democrats

7:33

tend to use the wrong language. They'll say,

7:35

we must fight we must

7:37

fight systemic racism. Well,

7:39

that would be great except

7:41

that, as I actually found

7:43

in that particular survey, less than

7:46

twenty percent of people, including twenty

7:48

percent of people of color, can define

7:50

critical race theory. They have no idea what you're

7:52

talking about. And when you

7:54

talk to someone, it's like,

7:56

I often say to leaders, if

7:59

you're speaking to someone

7:59

who's a native Spanish speaker, don't

8:02

talk to them in English. Speak to

8:04

them in the language that they use,

8:06

and the litmus test for me on language

8:08

that I always try to tell people is

8:10

if this isn't how ordinary, normal

8:13

Americans would speak about this,

8:15

I don't care how activists speak about it.

8:17

If this is not how ordinary

8:19

voters speak about it, just don't use

8:22

it. So instead of saying critical

8:24

race theory, for example, or instead of

8:26

saying systemic racism, if

8:28

you simply describe it, if you say, hey, look,

8:30

we all know that back in the fifties when

8:33

Eisenhower was developing our

8:35

interstate highway system, if

8:38

a highway had to be

8:40

built through one of two neighborhoods.

8:42

One of them was a poor black neighborhood. The

8:44

other one was a white neighborhood. We

8:46

all know which one it was gonna go through,

8:48

because back then that was just

8:50

seen as acceptable. Racism

8:52

was acceptable in a way that

8:55

We've had some politicians who've made it acceptable

8:57

again, but it wasn't acceptable for about fifty

8:59

years. So what you say is,

9:01

look, the fact is, we know which way it would go.

9:04

And it's easy to say that that's history,

9:06

but here's the problem. The problem is

9:08

that fifty years later, think about

9:10

the value of the houses on

9:13

one side of that

9:15

highway versus the value

9:17

of the houses in that white neighborhood.

9:20

The white people's houses have

9:22

skyrocketed in value, and what the people

9:24

of those Black neighborhoods have

9:26

been passing on to their kids isn't worth much

9:28

at all. And then you ask, why do these

9:30

poor Black kids have all these problems with asthma?

9:33

What do you think happens when you have fumes

9:36

coming off the highway? If you say that and

9:38

then you say to a white working class

9:40

voter, you know, we just passed a trillion

9:42

dollar bipartisan infrastructure

9:44

act. Why

9:45

don't we fix this? And they'll

9:47

say, absolutely, that's not

9:49

fair.

9:50

But if you say to them that,

9:53

you know, we have this problem with systemic racism.

9:55

their immediate response is to get defensive.

9:58

Just one more quick example like

9:59

that is if you say,

10:02

you know, when we teach about history,

10:04

we

10:04

need to teach about the history of racism,

10:06

you will

10:08

break even

10:09

or do a little better than break even

10:12

with the average voter on that. If

10:14

instead, you say, we need to teach

10:16

about the history of race and racism

10:19

you'll

10:21

kick up an extra fifteen points in the polls.

10:23

And the reason for that, I mean,

10:25

it's a tiny change, from racism

10:28

to race and racism. What you've

10:30

just done is you have

10:32

blocked white people from getting

10:34

defensive. The second you say, we need to

10:36

teach about the history of racism, immediately

10:39

it feels accusatory if you're

10:41

white.

10:42

And you can

10:44

understand why

10:46

people feel that way, because racism, you

10:49

hear it and you're like, oh, I'm

10:51

about to be attacked. Instead, if you

10:53

say the history of race and racism, people go,

10:55

well, of course, we should teach about that.

10:57

There's

10:57

been a lot of discussion of how increasingly

11:00

polarized our political landscape has

11:02

become. Does that affect

11:04

the way people respond to political messaging?

11:07

I'm I'm wondering if it's possible these

11:09

days for a Democratic politician

11:11

to connect emotionally with a Republican

11:14

voter or vice versa or do

11:16

people just tune out whatever comes

11:18

from the other side? You

11:20

know, it really depends on how

11:23

far to the other side the other side is.

11:25

I actually don't agree with

11:28

many of my colleagues

11:30

on the left who

11:32

were pollsters about who

11:34

they think is movable because who's

11:37

movable depends on how you talk to them.

11:39

You know, for example,

11:41

on abortion.

11:44

If you talk

11:46

to, say,

11:48

suburban independent

11:51

or suburban Republican

11:52

voters, or

11:54

even a lot of rural

11:56

voters, as we learned in Kansas,

11:59

where you'd get these bright red counties

12:01

where forty percent of people would say, no, I

12:03

want the right to abortion. If

12:05

in a polling question, you ask

12:07

people, do you believe in

12:09

abortion? Well, in those suburban

12:11

Republican areas, you're gonna

12:13

get a mix of feelings. Well, you'll

12:15

get more positive than negative. You'll get about

12:18

two thirds of Americans will say yes to that.

12:20

But if you ask instead, are

12:22

you pro life or pro choice? People

12:25

have split evenly between those two

12:27

things for the last twenty five years. Until

12:29

the Dobbs decision, the decision that overturned

12:31

Roe, which led the pro

12:34

choice side to go way up in the polls.

12:36

Well, if you look at that, you might think those

12:38

are really conflicting results. Right? Like,

12:40

two thirds of people say that they're for

12:42

abortion rights,

12:45

yet under half say

12:47

they're pro-choice. Well, if

12:49

instead of using language like

12:52

even pro-choice, which is pretty

12:54

common. Pro-life suggests,

12:56

no matter what, I

12:58

believe

13:00

that from the moment of conception, you're

13:02

killing babies. And that's the position that

13:04

the right is now taking, and it's taken up a

13:06

really extreme version of

13:08

that position. So, you

13:10

know, why is it that people

13:13

reject the language of pro choice half the

13:15

time when they believe in abortion rights?

13:17

It's because Democrats and

13:20

progressives are offering them a position that's

13:22

not equally untenable to the right's

13:25

but sounds untenable, which is, anytime

13:27

you feel like it, you can abort. But the

13:29

reality is most of us don't actually feel

13:31

that way. I mean, what people feel is

13:34

early on in pregnancy, when you

13:36

look at what a fetus or embryo

13:38

looks like, it

13:39

doesn't look anything like us. You know,

13:41

for weeks, you can't tell a chicken

13:43

embryo from a human embryo. That's

13:45

why most of us intuitively feel like

13:48

you need better and better reasons,

13:50

the further along you are. You know,

13:52

early on, it

13:54

is clearly a question

13:56

between a mother's rights and the rights

13:58

of fetal

13:59

tissue.

13:59

Because that's what it is. We don't say,

14:02

oh, when's your baby coming, when you're

14:04

not even showing yet? We say that towards the

14:06

end though, and that says a lot about how we feel.

14:08

But the point I was getting to

14:10

about this is, if you were to use language

14:13

like

14:14

reproductive justice. It's

14:17

again one of those abstractions. First of

14:19

all, no one has any idea what you're talking about.

14:21

What does reproduction have to do with

14:23

justice? It's like when people use words

14:25

like

14:26

environmental justice. I've

14:28

tested that one as well. Less than

14:30

ten percent of people can accurately define

14:33

environmental justice. People say, I

14:35

don't know, like, being good to the earth?

14:37

You know? And that's

14:39

actually not what it's about. So if you

14:41

say reproductive justice, not only

14:44

are you turning something that's really deeply

14:46

personal and that you feel

14:48

when you hear about, say, that ten year old

14:50

girl who was raped and

14:53

couldn't get an abortion in her own

14:55

state. You know, Democrats

14:57

should be referring to that as a moral

14:59

issue. You know, there's a moral choice

15:01

between two sides and I

15:04

I would urge Democrats to

15:06

say, yeah, they believe that

15:08

every rapist has the right to choose the

15:10

mother of his child. We believe that

15:12

every woman has the right to choose the father

15:14

of hers. That's

15:15

the difference in our moral world views.

15:18

But,

15:18

see, that's a long phrase. And

15:21

I'm thinking, you know, pro choice was

15:24

poll tested before it ended

15:26

up in common use. And

15:28

pro life has been poll tested, and they're

15:30

both handy and short. But what

15:32

if pro choice doesn't work, then

15:34

what's the shorthand alternative

15:37

that will work?

15:39

That's also a great question because the

15:41

left tends to have more nuanced positions

15:43

on things than the right. The right

15:45

will simply say,

15:47

guns? Second

15:49

Amendment. Wow. You know, that's the

15:51

you know, that's their position. Or

15:54

or, you're killing babies.

15:57

I mean, you know, those are

15:59

their positions. Or,

15:59

the free market,

16:02

can't interfere. Yep. There you

16:04

go. It's pretty simple. Whereas it's

16:06

not that easy on the left because the

16:08

left is defined by having

16:10

more nuanced views. So you might have to get

16:12

up to a few more words, like something like this,

16:14

instead of talking about reproductive justice

16:17

or pro choice. If

16:18

you say, this is about the

16:20

freedom to decide when and whether

16:22

to have a kid. That's

16:24

pretty much it. Or,

16:25

if you wanna expand it, the freedom to decide

16:27

whether, when, and who to have a child

16:29

with, or who to have a family with. Everyone

16:32

understands exactly what that means. And

16:34

if you notice, when I say the freedom,

16:36

I'm

16:38

emphasizing the value that is

16:40

core to what this

16:42

whole thing is about. It's not about justice. It's

16:45

not really about health so much. It's

16:47

first and foremost about our freedom,

16:49

one of the most essential

16:52

freedoms we have, to decide

16:54

who and when and whether to have

16:56

a child. But the other thing

16:58

about it is, you know, I can put emotion in

17:01

my voice when I say it. If I say,

17:03

I'm pro-choice. No

17:05

feeling in that. You

17:06

could say it louder, you could say, I'm

17:08

pro choice, you know, or you could

17:10

say, I'm pro-choice and I'm proud.

17:13

Well, you know, actually not that many

17:15

women feel like

17:17

they go into an abortion clinic or

17:20

to Planned Parenthood because they

17:22

have an unwanted pregnancy.

17:25

They're not going in thinking,

17:27

I'm proud. They're

17:29

not necessarily thinking I'm not

17:31

proud. But it's not about

17:33

pride. It's about freedom. It's about

17:35

this wasn't the right time or this

17:37

isn't what I wanted or this was a

17:39

Tinder date, for God's sake.

17:41

Now

17:41

you've done work using fMRI

17:44

to study people's brains while they

17:47

absorb political information. And

17:49

I'm just wondering, do we absorb

17:51

political information differently from

17:53

other types of information, particularly

17:56

information that maybe doesn't comport

17:58

with our preconceived notions? Yeah.

18:01

That's a real problem. This

18:04

was a design flaw built into the

18:06

human brain by, if you're

18:08

on the left, natural selection, if you're on

18:10

the right, God messed up. And

18:13

and that is that, you know, we

18:15

learned from Skinner

18:16

seventy-five,

18:17

eighty years ago that

18:19

people are, well, he didn't like to use feeling

18:22

words, but that we're

18:24

essentially drawn to things that

18:26

we associate with

18:28

reinforcement or with positive

18:31

outcomes for ourselves, for our families, for

18:33

the communities that we care about. And

18:35

we either fight or flee things that do

18:37

the opposite. That makes complete

18:39

sense from an evolutionary standpoint. An organism

18:42

that didn't do that, we wouldn't be knowing that organism

18:44

today. It would have gone extinct,

18:47

you know, millions of years ago.

18:49

The problem is that we as humans can

18:52

do exactly the same thing with ideas.

18:54

And that is that we are drawn

18:56

towards ideas, towards

18:58

beliefs that make us feel

19:00

good, that are reinforcing,

19:01

and we're repelled by beliefs that

19:04

make us feel the opposite. So

19:05

if you wanna know how that actually works in

19:07

the brain, we did a study back in two

19:09

thousand four, in the election

19:12

between John Kerry and George W. Bush,

19:15

where we looked at the brains of strong

19:17

partisans as they listened to

19:20

information and we asked

19:22

them to perform a reasoning task with

19:24

that information about their candidate

19:26

or the other candidate. And what we found

19:28

was if you looked at the, say,

19:30

one thousand or fifteen hundred prior

19:33

studies using neuroimaging

19:34

techniques like functional

19:36

MRI. If

19:38

you look at the, say, one thousand studies before

19:41

that used reasoning tasks, they all found

19:44

activation in an area towards

19:46

the top of the front of the brain

19:48

called the dorsolateral prefrontal cortex,

19:51

which is involved in

19:52

conscious, rational thinking,

19:55

in holding things in mind consciously

19:57

so you could make decisions about them, and

19:59

in

19:59

more abstract thinking.

20:02

And when people reasoned,

20:04

they used those circuits, which made a lot

20:06

of sense. Well, we had a suspicion

20:09

that in politics, when

20:12

it got to be about your candidate and you

20:14

had an emotional investment in your candidate,

20:16

we didn't

20:16

think any of those circuits were gonna turn on

20:18

at all. We expected that what was really

20:20

gonna happen. So here's an example.

20:22

And this was a slightly altered

20:25

example. So we did alter examples, but

20:27

most of them were very close

20:29

to things that the candidates actually said or did.

20:32

So here's an example that was a Kerry example.

20:34

People

20:36

are lying in the scanner. They're reading

20:38

this, they're listening to this, and they're

20:40

about to make a rating. The first

20:42

slide comes up. It says, in nineteen

20:44

ninety six, John Kerry was on Meet

20:46

the Press,

20:47

discussing Social Security. And he

20:49

said, we

20:51

have a generational responsibility to put

20:53

everything on the table here, whether that's

20:56

means testing, whether that's

20:58

changing the

21:00

automatic cost of living adjustments.

21:03

because we have to make sure this program is solvent

21:05

for all these, you

21:06

know, decades down the road. So

21:09

that thing seems

21:09

pretty reasonable. Well,

21:11

then the next slide comes up. It says,

21:13

this

21:13

year, two thousand four, on Meet

21:15

the Press, John Kerry was

21:17

asked about Social Security and said, we

21:19

should never touch Social Security. We have a generational

21:22

responsibility to our seniors to leave it just

21:24

as it has

21:26

been. So

21:27

then the next slide comes up and says consider

21:29

whether Mr. Kerry's words

21:31

and actions are inconsistent. Well,

21:33

obviously, they are. Right? I mean,

21:36

you know, nineteen ninety six says one thing,

21:38

two thousand four

21:40

says a different thing. And again, let me just

21:42

stress. This was not a completely

21:44

real example. It had elements

21:46

of truth. It was not entirely real. So

21:48

So

21:49

we

21:50

then had them rate, on a four-point

21:52

scale, the extent of inconsistency, from

21:55

one,

21:56

not inconsistent at all,

21:58

to four, completely inconsistent. And

22:01

the finding was that

22:03

Democrats had no trouble seeing the

22:05

inconsistencies for George W. Bush.

22:08

Republicans

22:08

had no trouble seeing the inconsistencies

22:11

for John Kerry.

22:13

but here's where there was a difference. The difference

22:16

was when they were looking at their own candidates,

22:18

we saw no activations in the

22:20

reasoning circuits whatsoever. We

22:22

saw what we expected, which was a bunch

22:25

of bursts of activation in

22:27

emotion circuits, negative emotion circuits

22:29

that were

22:30

just pinging, pinging, pinging,

22:32

saying, uh-oh,

22:33

this doesn't feel good. How am I

22:35

gonna get out of this? These were activations

22:37

in parts of the frontal

22:40

lobes right around

22:42

our eyes or between our eyes or just

22:44

above our eyes, the ventromedial

22:46

prefrontal cortex, which we had

22:48

hypothesized would be involved in

22:50

people unconsciously regulating

22:53

their emotions, trying to figure out how to shut

22:55

them off. And then there was a huge activation

22:58

in a part right behind the eyes called

23:00

the anterior cingulate, which

23:02

is involved in, among other things,

23:04

monitoring and trying to figure out what to

23:06

do about conflict. And the prior

23:08

studies, you know, focused on cognitive

23:10

conflict; here, we were giving people emotional conflict.

23:13

So all of those circuits were just

23:15

firing wildly. Then about

23:17

twenty seconds later, they started

23:19

to all turn off as people came up with

23:21

rationalizations for their candidate. Oh,

23:23

no. Not inconsistent at all. And

23:27

once they did that, this was the part

23:29

we did not expect.

23:31

They got bursts of dopamine

23:34

in reward circuitry in the brain.

23:36

And essentially, they were getting reinforced

23:39

for coming to a biased conclusion.

23:42

And that's the part that's

23:44

scary. And it was scary

23:46

back in two thousand four when we did the

23:48

study when you

23:50

know, we were still in the

23:52

post-Watergate era. Well, not

23:55

quite, we had moved out of it,

23:57

but, you know, back in the Watergate era,

24:00

Republicans were the ones who went to

24:02

Richard Nixon and said, you gotta resign,

24:05

buddy. We just can't support

24:07

you in the Senate because you obstructed justice.

24:09

I mean, it seems like

24:11

that

24:12

was two hundred years ago because you can't imagine

24:14

it happening now. The problem now is

24:16

that on top of that design flaw

24:19

in the brain, we

24:20

have a design flaw in the media, which

24:22

is that we have now most people getting

24:24

their news from social media, and they're

24:26

getting it from sources that have no

24:28

fact checking.

24:30

and who are largely the

24:32

sources who are sending them information

24:34

because they're like-minded. So now we don't even

24:36

share the same facts, let alone

24:38

reason about those facts in ways that

24:41

are different. So,

24:42

you started out as a personality and

24:44

clinical psychologist, and you still do

24:46

academic research in those areas.

24:49

But what made you turn your attention to

24:51

politics and how do those two parts

24:53

of your career fit together? It's

24:55

a really funny story. It started

24:58

first during the Clinton impeachment trials. And

25:00

I was really struck by the fact that

25:02

you'd have all of these commentators coming

25:04

on television from the right and from the left.

25:06

They would be marshaling quote unquote

25:08

evidence. You know, they've been talking about

25:10

facts. They've been talking about, well, what

25:12

did Hamilton and

25:15

Adams have in mind when they were, you know,

25:17

when they were crafting this

25:19

language on impeachment and the

25:21

constitution, etcetera. And

25:24

they're talking about that and they're talking about

25:26

the facts of what happened with Lewinsky, what

25:28

didn't happen with Lewinsky, etcetera.

25:31

And they all seemed to come down

25:33

on the side of what they wanted to

25:35

believe in the first place. It was clear that the

25:37

facts were making no difference whatsoever. So

25:39

I actually started doing a little bit of research

25:42

back then, and I found that there were

25:44

actually

25:44

three predictors of

25:47

which

25:47

side people came down

25:49

on, pro-impeachment or against impeachment.

25:53

One was their feelings towards the

25:55

parties, that was primary.

25:57

Second was their feelings towards Bill

25:59

Clinton.

26:00

There were Democrats who didn't like him and

26:02

there were Republicans who did. And

26:04

then the third was their

26:07

feelings about feminism.

26:09

And if they had very strong

26:11

feelings about feminism, they were more

26:13

likely, even beyond their feelings about Bill

26:15

Clinton, to believe that this was an

26:18

impeachable offense. And

26:19

the point of it all was, when

26:21

you got down to their knowledge of the facts,

26:24

about either the constitution or about what

26:26

had happened that had led to this,

26:29

what had Clinton done or not done?

26:32

The facts predicted one percent

26:34

of the variance in people's voting, and

26:36

the rest was controlled by

26:38

people's emotional reactions to

26:40

the parties, to Bill Clinton, and to feminism.

26:43

And so what that led me to think was, wait

26:45

a minute. And it actually predicted eight

26:47

out of the nine justices on

26:49

the Supreme Court and how they voted in Bush

26:52

v. Gore. The only one who was unpredictable,

26:54

who frankly has been my favorite

26:56

justice in my lifetime,

26:58

was David Souter. And I say

27:00

that not because I agreed with his politics;

27:03

it's because I never knew which way he was gonna

27:05

come down, because he seemed like he was

27:07

a justice who did what a justice is supposed to

27:09

do. You know, look at the facts

27:11

of the case. Don't start out with your own

27:13

values, your own preconceptions, your own

27:16

politics, but just look at the facts of

27:18

the case. If they were actually on this line, I really

27:20

have to tip this way, or I really have

27:22

to tip that way. But the thing that really

27:24

kicked me over from being, you

27:26

know, mild mannered clinical psychology

27:29

professor to being an

27:31

ill-mannered political consultant was

27:34

watching, first, Al Gore

27:36

throw an election

27:39

in two thousand by speaking

27:41

like a Democrat, listing

27:43

his ten-point plans, not

27:46

speaking enough about his values, never

27:49

speaking about the one thing that his consultants

27:52

told him not to speak about, which

27:54

was

27:55

typical

27:58

Democratic

27:59

consulting. Don't talk about

28:02

what matters to you because it's not high in

28:04

the polls, and that was energy and climate.

28:06

Imagine if Al Gore

28:08

had gone to

28:10

Florida. And

28:12

he had given speeches on the coast

28:14

of Florida, and he had said, look,

28:16

there are a lot of you who are parents and grandparents,

28:19

and who worked really hard for this land

28:21

that we're standing on right now, that

28:23

you want to pass on to your kids and grandkids.

28:25

You know,

28:26

I know a lot of you are not sure whether or not

28:28

there's anything to this idea about climate change.

28:31

I understand it, although it's kind of

28:33

a lot like what we saw when

28:35

the tobacco, quote, scientists in

28:37

their white coats were saying, oh, you

28:39

can breathe this

28:41

black smoke into your lungs,

28:43

and don't worry, it's not gonna do anything to you.

28:46

It's kinda like hearing these people

28:48

in white coats for the oil company

28:50

saying,

28:50

oh, yeah, you can

28:52

pump this pollution into the air,

28:54

and then it goes up into the air and there's trillions

28:56

of tons of it up there. Don't worry. It won't

28:58

affect the air. It won't affect the atmosphere.

29:00

But even if you don't believe any of that,

29:03

even if you don't believe what the vast majority

29:05

of scientists now do believe, do

29:07

you want to gamble with the land

29:10

and the homes that

29:12

you have worked so hard to leave for your

29:14

kids and grandchildren? Do you really wanna

29:17

do that? It's not a gamble I'd wanna

29:19

make, because I could be wrong about

29:21

this. But

29:22

you know, people

29:23

on the other side could be wrong about this

29:25

too. And if they are you're

29:27

not gonna be leaving anything to your grandchildren.

29:29

Now, he lost by five hundred

29:31

votes in Florida.

29:33

Imagine if he

29:35

had done that in Florida. So

29:38

just to wrap up, you wrote your

29:40

book The Political Brain fifteen years

29:42

ago and a lot has happened since then.

29:44

And you're working on a follow-up book. So can

29:46

you give us a little preview of what

29:48

might be in the next book?

29:50

Yeah, sure can.

29:53

You know, when I wrote the last book,

29:55

I'm being totally honest here, I had

29:57

no idea,

29:59

before

29:59

and when the reviews came

30:02

out whether they were gonna come out saying,

30:04

this guy's a total fraud.

30:06

He's a psychologist who's,

30:09

yeah, he's done a bunch of reading on elections.

30:12

He's gone back,

30:14

maybe

30:15

gone

30:15

all the way back to FDR,

30:18

and studied the convention addresses all

30:20

the way on up. But I hadn't done

30:22

any work in practical politics before.

30:24

I'd never given a speech to a

30:26

political audience. I might have

30:28

given one academic talk

30:31

at a political psychology conference

30:33

once. I don't even think I'd done that by that

30:35

point. So, you know, I'm thinking I

30:37

have no idea how this is gonna be received because

30:39

it was advancing what at that time was

30:42

believe it or not, you're gonna laugh, a radical

30:44

thesis: that emotions are central

30:46

to politics. Because at

30:47

that time, Democrats

30:49

were all running

30:51

on this. All they were being taught

30:53

was, and they were running campaigns

30:55

on the idea, that a campaign is

30:58

a debate on the issues. So what you

31:00

wanna do is you wanna spell out where you

31:02

are on the issues and your ten-point plan

31:04

on every one of them. And, you know, the

31:06

thesis in my book was, no, actually,

31:10

that's not what an election is about. That's

31:12

not what they're interested

31:12

in. And if you look at the history of elections,

31:15

that's not how people vote at all.

31:18

You know, if you look at Barack Obama

31:20

versus Hillary Clinton, I

31:22

don't think they differed on anything

31:25

in their politics, in their policies. but

31:27

they ran really different campaigns, say,

31:29

in two thousand seven, two thousand eight,

31:32

where he could speak to people

31:34

emotionally talking about the same

31:36

issues,

31:37

and she just had a much harder

31:39

time, a much harder time, speaking to the

31:41

average person in that

31:43

emotional way. So anyway, the book was advancing this

31:45

kind of radical thesis that not

31:47

only was that Democratic strategy wrong,

31:50

but so were political scientists' models,

31:53

which were virtually all rational choice models

31:55

at that point. You know, oh, let's see.

31:57

I weigh abortion this percent,

32:00

I weigh the economy this percent,

32:03

I weigh

32:03

immigration this

32:05

percent, you know, I have this whole list of

32:07

twenty five issues that I'm keeping in my mind, that

32:09

I'm weighing, how much do I like them, and how much do I

32:11

like each candidate's and each party's position on each one?

32:13

Well, you know, in psychology, we call

32:15

that doing multivariate statistics, and

32:18

it requires really, really

32:20

well worked out statistical programs

32:22

and software. And it

32:24

involves hundreds

32:27

of thousands or millions of split

32:29

second calculations that our brains

32:31

cannot do. Most of us cannot do

32:33

multivariate statistics in our heads.

32:35

So what do we do?

32:37

We simplify the equation by

32:40

asking those two questions. Does

32:42

this person understand and

32:44

care about people like me, and does this person

32:46

share my values? So that's

32:49

where I was coming from at the time. I

32:51

had no idea how the political

32:53

community would receive it. And

32:55

I also really wasn't entirely

32:57

sure I wasn't a fraud. And

32:59

then I

33:02

actually got a call, two weeks after

33:04

the book came out, from this

33:05

person. This person says,

33:09

is this Doctor Westen?

33:11

Yes. And I'm in a Starbucks in my,

33:13

you know, in my sweats,

33:15

in Atlanta. And the person says, would

33:17

you hold for President Clinton?

33:20

And this was two thousand and seven.

33:22

And I think, yeah, and I'll hold

33:24

for Mahatma Gandhi too, you know. And then

33:26

maybe Martin Luther King while we're at it.

33:29

But I didn't say that; that's just what I was thinking.

33:31

Next

33:32

thing you know, this voice comes on.

33:35

It's

33:35

Bill Clinton. He says,

33:37

you know, I'm in Iowa

33:40

right now with Hillary, and we're, you know,

33:42

we're doing stuff where she's campaigning. And

33:45

I'm reading your

33:47

book and reading

33:47

her passages from it that I think are

33:50

particularly useful for her. And he said,

33:52

you know,

33:53

those things that you said Al Gore should

33:55

have said in his debates against

33:58

George W. Bush,

33:59

that

33:59

stuff was spot on. And

34:02

alright.

34:02

If

34:03

this guy's saying this, maybe I'm not a complete

34:06

fraud. In clinical work, you're doing the

34:08

same thing. You're kind of threading the needle. Someone's

34:10

got a conflict about something. You're trying

34:12

to figure out how do I talk with this person

34:14

about this in a way that won't make them defensive,

34:17

but will allow them to consider possibilities

34:19

that they might not have considered before and

34:22

consider changing in one way or another. Well,

34:24

you know, and what I realized was when

34:26

I started to do political consulting, I was

34:28

doing psychotherapy with sixty million people

34:31

at a time. The skills were

34:33

really not all that different. The the

34:35

only thing that was different was that

34:37

I wanted to make sure that if

34:39

I generated, say, nine

34:41

different ways of talking about

34:44

an issue, whether it was

34:47

back then the Iraq War, whether it was

34:49

immigration, whether it was

34:52

trade policy, whether it was

34:54

taxes, whether it was how do we help

34:56

working people, whether it was abortion.

34:59

And most of these issues, by the way, have really

35:01

stayed the same since I

35:03

started working on this stuff fifteen years ago.

35:06

I didn't wanna just do this

35:08

by the seat of my pants. I wanted to test

35:10

those messages against each other and

35:12

against the absolute toughest

35:14

message I could write from the

35:17

other side. So what I said was,

35:19

yeah, I'll do this, but only if I

35:21

can do the quantitative work along with it,

35:23

which at first meant working with pollsters and

35:25

then I eventually started doing that myself. But

35:28

this is a long winded way of saying that what

35:31

I often told people when they said, well,

35:33

that's gonna be a lot more expensive, you

35:35

know, to do all that polling along with

35:37

it. I said, yeah, but, you know,

35:39

The best antidote to

35:41

narcissism,

35:42

and especially

35:44

male narcissism, is

35:47

empiricism. You

35:49

can think you have the greatest ideas of how

35:51

to talk about this thing and then you

35:53

test them, and you learn that, well,

35:55

that one beat the opposition, but

35:58

there was another message that I didn't

35:59

realize was gonna do even better.

36:02

let's

36:02

go with that one or let's offer people

36:04

several messages that they could

36:06

use because they're actually all beating

36:08

the opposition, but you know, I'd catch

36:11

all kinds of words and phrases that I was

36:13

using that I thought, oh, jeez, how

36:15

did I make that stupid error? Well,

36:17

I learned that I made the stupid error by testing

36:19

it. So in the meantime, since The Political

36:21

Brain came out fifteen years ago, I've

36:23

tested about a million messages

36:26

with over one hundred thousand voters. And

36:28

so this next book is gonna have the benefit

36:30

of not only the background in psychology

36:33

and neuroscience, but having actually

36:35

tested these things in real politics with

36:38

real voters in focus groups,

36:40

in online testing, in regular

36:42

opinion polls by telephone surveys,

36:46

you know, with thousands of voters. So

36:48

I actually now can tell you, you know, there's an old

36:50

adage in advertising, which is that

36:53

half of our advertising dollars

36:55

are a complete waste of money and the

36:57

other half work. The

36:59

problem is we

37:00

don't know which half is which. And

37:02

I'm now at the point having done this, that

37:04

I'm up to about seventy five, eighty

37:06

percent getting it right. But

37:08

that still means I'm twenty,

37:10

twenty-five percent getting it wrong. And

37:12

if I wanna work for nonprofits or

37:15

for a party or for a candidate,

37:18

in the US or somewhere else whose

37:20

values I share, I wanna

37:22

do the best possible job for them I

37:24

can rather than just assuming that because

37:26

I thought it, it must be right.

37:28

So

37:28

is this book going to give away all your trade

37:30

secrets then?

37:32

It's

37:34

gonna give away a lot of them.

37:37

And, you know, I

37:40

will give a plug now. You know, this is

37:42

a nonpartisan podcast. I will give

37:44

a plug for someone on the other side who

37:47

he and I were often pitted against each other

37:49

on issue after issue where he

37:51

was doing the right-wing side of

37:53

things, of how best to talk about

37:55

this. And I was doing it on the left,

37:57

or at least from center left to left.

38:00

and that's Frank Luntz, who

38:02

was Newt Gingrich's brilliant

38:04

wordsmith. I really like

38:06

Frank, but I didn't like the work that he did because

38:09

I didn't like the values behind it, but

38:11

but Frank is as good as

38:13

you get at this. I taught

38:15

a course at Emory for many years, a seminar

38:18

called The Psychology of Political

38:20

Persuasion in American Electoral Politics.

38:23

And I always assigned

38:25

Frank's book because he's got a book called

38:27

Words That Work. It's just absolutely

38:29

brilliant and describes a lot of

38:31

these similar kinds of principles. I remember when

38:34

I eventually read it. I hadn't read it until

38:37

after mine came out. They came out at pretty similar

38:39

times. But I remember reading it and thinking,

38:41

wow,

38:42

he's really let out all the secrets from

38:44

the right. But,

38:47

you know, my

38:48

attitude on that is

38:50

kind of, as

38:52

opposed to just having a competition

38:54

of ideas in the marketplace of

38:56

ideas.

38:57

Why don't we have both sides

39:00

present

39:00

their ideas in the most

39:02

emotionally compelling ways possible?

39:05

so that voters really know what they're looking at,

39:07

so that they really know, you know,

39:10

one out of every fifty

39:12

women has a pregnancy

39:15

where

39:15

the embryo implants outside

39:17

her uterus, an ectopic pregnancy.

39:20

That fetus or that embryo is

39:22

going to die, no matter what.

39:24

So the idea that now

39:27

doctors are afraid to take

39:30

dead fetal tissue out

39:32

of a woman's body that could cause

39:34

sepsis and kill her, or that could

39:36

lead her to

39:38

be infertile. And the other

39:40

side's calling that a culture

39:43

of life. You know, to me,

39:45

that's how you talk about it. And

39:47

voters should hear it in that form,

39:49

not the sanitized, you know,

39:51

form of, you know, some

39:53

people believe that we should have choice.

39:55

Some believe that we should be pro life.

39:58

Do you strongly agree, agree,

40:00

disagree? That's how pollsters ask these

40:02

questions, and it's not how they play out.

40:04

And,

40:04

frankly, it's not how politicians speak

40:06

about them. And I don't think it's how politicians

40:08

ought to speak about it. I think they should

40:10

speak in the most emotionally compelling

40:13

ways about things that they

40:15

believe are true that fit

40:17

their values and that there's good evidence

40:20

that they are true, and then let the

40:22

chips fall. Well, Doctor Westen,

40:24

I really appreciate you joining me today.

40:26

This has been a fascinating conversation. Thank

40:28

you so much. Thanks for having

40:30

me on. You can find previous

40:33

episodes of Speaking of Psychology on our

40:35

website at www.speakingofpsychology.org

40:37

or on Apple, Stitcher,

40:39

or wherever you get your podcasts. If

40:42

you have comments or ideas for future

40:44

podcasts, you can email us at speaking

40:46

of psychology at APA dot

40:48

org. Speaking of Psychology is

40:50

produced by Lea Winerman, our sound

40:52

editor is Chris Condayan. Thank

40:55

you for listening. For the American

40:57

Psychological Association,

40:59

I'm Kim Mills.
