Chapter 11: Social Psychology Pt. 1

Released Monday, 29th May 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:09

Welcome

0:14

to the Psych Podcast, the podcast where

0:16

we talk about everything intro psych.

0:18

I'm joined as always by Paul

0:21

Bloom. Paul, welcome. Thank you, David. Good

0:23

to talk with you again. We are continuing to

0:25

talk about a few

0:27

topics involving the self. And

0:30

I really like the way that you've written about this

0:32

in your book because it's

0:34

a good way to capture this whole

0:37

huge body of research in social

0:39

psychology by simply saying

0:41

the way that we think about ourselves

0:44

is fundamentally different than the way we think about

0:46

other people. Yeah, I use that as an organizing

0:49

principle in part because your

0:51

field, the field of social psychology is so broad.

0:54

Social psychology includes everything from

0:56

the famous Milgram experiment, how willing

0:58

are people to obey, how willing

1:01

are they to comply. Social psychology

1:03

often connects to personality psychology and how

1:05

to break people's personalities up, which we'll talk

1:08

about at a different time. And then there's issues like

1:10

priming, which we'll also get to today, which

1:12

is a whole different avenue

1:14

of research. But I think

1:16

that the core of social psychology is the question

1:18

of how do we see ourselves and how do

1:20

we see others?

1:21

How do we make sense of this? How do we make sense of those

1:24

bizarre categories of people,

1:26

types of people, individual people? And then

1:28

the interesting question is, are there systematic

1:31

ways in which we see ourselves different from other people?

1:33

And I think there's a lot of evidence suggesting the answer is yes.

1:36

It makes sense that we would find that there are all

1:39

these ways in which when we think of ourselves,

1:42

we think differently than when we think of other people. For

1:45

one, we have access to large aspects of

1:47

our own cognition. So I know

1:49

all of my intentions and desires and wishes

1:52

and, you know, have my memories. I

1:55

think of myself as that

1:58

complex person.

1:59

All I know about you is what you tell me and what I see

2:02

you do. It's a big difference. We've talked

2:04

about Freud and Freud complicates things. So

2:06

Freud says maybe we don't know as

2:08

much about ourselves as we think we do. Maybe

2:11

some of our views about ourselves are, in fact, mistaken.

2:14

We think we're doing something for one reason, it's for another. Maybe

2:17

our memories have been distorted through repression

2:19

and so on. But there's still a big difference.

2:23

You know, we're we have a big fight. We're yelling

2:25

at each other.

2:26

I know why I'm yelling at you.

2:28

I know exactly what's going on in my head. I

2:31

at least I have a strong belief that

2:33

when you wronged me in this way, I was having a hard

2:35

day and you really touched a nerve with what you said

2:38

and so on. All I see is you yelling.

2:41

And so for me, I have access

2:43

for you, I make inferences, and

2:45

that's a big difference. Take an even simpler case: we

2:47

both bite into some food. Well, how

2:50

do I know what I'm feeling? Well, I get the sensation.

2:52

How do I know what you're thinking? Well,

2:55

I look at your facial expressions. I

2:57

look and listen to what you tell me. What

2:59

could be more different? The example that I always

3:01

come to is one that you talk about in

3:03

your book, which social psychologists refer

3:06

to often as the planning fallacy. So

3:08

I go home over Thanksgiving break.

3:10

I'm a sophomore in college. Every time I

3:12

would go home for Thanksgiving break, I would pack

3:15

my backpack full. I swear it, I was

3:17

going to study for my midterms or whatever

3:19

it was.

3:20

And every time I would bring

3:22

that same backpack unopened back

3:25

to campus and think to myself, why

3:27

did I even bring it, you know, to stress myself

3:29

out when I look at it, there is some cool

3:31

research showing that if

3:34

you asked anybody else, is David

3:36

going to study over Thanksgiving

3:38

break? They would probably say no. Yeah. Because

3:41

they've seen this happen time and time again. I

3:43

somehow am too stubborn because I

3:45

know that I'm intending to do it this time. I

3:48

know I really value

3:50

being the kind of person who

3:52

would study over Thanksgiving. All

3:54

of these things are infusing

3:57

my own judgment in a way that they don't

3:59

infuse others.

3:59

My example is whenever I travel, and this is the

4:02

time before iPads and

4:04

Kindles and so on, I would bring

4:06

books, but I would bring so many

4:08

books. I'd fill my suitcase with books. It's

4:11

a three-day trip, and my suitcase is laden

4:13

with books. And of course, I'm on a flight,

4:15

and I watch the movie, and I read the in-flight magazine,

4:18

and I get to my hotel room, and I watch TV. The

4:20

books go unread. And the odd thing

4:22

is, you'd think that would happen once, and then I would

4:25

learn, but it just seems to stick. Danny

4:27

Kahneman, who's, of course, this great sage of rationality

4:29

in our field, has a wonderful line saying

4:31

that, everything takes longer

4:33

than you expect, even when you

4:35

take into account the planning fallacy. Right.

4:38

That's true. I talk about one study in which

4:40

you ask college students to

4:42

estimate when they're going to finish their senior

4:44

thesis. The researchers ask them to give

4:46

their estimate, whatever date they

4:49

normally expect, and then to give their

4:51

just like their worst estimate. If

4:54

everything were to go wrong, if all of the things

4:56

that you think are going to happen actually don't happen

4:59

right, when do you think you'll

5:01

actually turn in your thesis? And so

5:03

they adjust it to something like

5:05

three weeks later. And

5:07

the results are that everybody is always like

5:09

two weeks later than that. The worst scenario

5:12

they can imagine is still not as bad as

5:14

their actual behavior. What's interesting is there's

5:16

huge asymmetry, so I fall for

5:18

it as myself all the time. But

5:20

when I'm talking to my student, my student

5:22

is very late in getting things in, and then he says

5:24

to me, I'll get you the paper in two weeks. This

5:26

time I'm going to do it. I say to myself, no, you aren't. You

5:29

are not. You never

5:31

get things in on time. It's always much longer. And this

5:33

is, I think, a general fact. We're hypersensitive

5:35

and kind of accurate to the flaws of other people and

5:37

often ignorant of our own. So this is one

5:40

example of a whole

5:42

set of findings that we might

5:44

put under the umbrella of positive illusions

5:46

or self-enhancing illusions, is

5:49

that consistently and robustly

5:52

we find that people

5:54

think of themselves as better on

5:56

any number of dimensions, on any number

5:59

of traits.

5:59

than other people. So this is the Lake

6:02

Wobegon effect, the better-than-average effect. Do

6:04

you wanna run through some examples? Yeah,

6:06

sure. So if you ask professors, how

6:08

good are they at teaching compared

6:11

to the average professor?

6:13

90% of them or something

6:15

like that say that they're better than average. Of

6:17

course, I always tell my students, what's

6:19

very weird is that I actually am better.

6:22

Somebody's gotta be in the top 90%. And

6:26

this is true even when, so it also, are

6:28

you a better driver,

6:29

a better friend, better sense of humor.

6:32

And these tend not to be subtle effects. People

6:34

just tend to think they're better than average at everything. And you

6:36

could argue that there's a fuzziness to the question, different

6:38

ways to interpret it and so on. Right, in fact,

6:41

there is some good research by my former colleague

6:43

here, David Dunning, that shows that people sometimes

6:46

construct what he calls idiosyncratic trait

6:48

definition. And this is what allows

6:51

them to have this view that they're better

6:53

than others. And what this means is, if

6:55

I say, how smart are you? You

6:57

might think, well, smart means

7:00

practical intelligence and smart means

7:02

street smarts. And I am high on that. And

7:04

so therefore I'm smarter than most people. Whereas

7:06

somebody else might say, well, I'm book smart. I

7:08

do well on tests. And that's their definition

7:11

of intelligence. And if you do constrain

7:14

those definitions, you get some of

7:16

that effect going away. But as you say

7:18

in the book, by no stretch, does that remove

7:21

the effect. That's right. There's a study,

7:23

I think it's by Chris Chabris, but I know it's

7:26

discussed in a book by Dan Simons and Chris

7:28

Chabris, The Invisible Gorilla, about

7:30

chess players. And

7:32

what's interesting there is there's a ranking

7:34

that chess players have. And an

7:37

extraordinary number of chess players

7:39

believe their real ranking

7:41

is much higher

7:42

than the ranking they've been given based on their win-loss

7:44

record. So they're much better than what

7:47

the ranking says they are. Which puts you

7:49

in an odd position that when two players

7:51

of equal ranking play one another, each

7:54

is confident that they have a substantial

7:56

advantage over the other one. That's actually hilarious.

7:58

And that is a case of a... Very constrained score,

8:01

right? Like I assume that there's a simple equation.

8:03

That's right. I have an example of the above average

8:06

finding, which may or may not stay in this

8:08

podcast, but it's done by somebody

8:10

on Twitter who does polls on

8:12

unconventional things named Aella. And

8:14

she did a very simple poll. She said, is

8:17

your penis larger than average or smaller

8:19

than average? And

8:21

overwhelmingly, most people on her poll said their

8:23

penises were larger than average. It's salacious,

8:25

but it's very clever because it's typically an aspect

8:28

of oneself that's fairly hidden.

8:30

If you asked about noses or

8:33

how tall you are, you'd have plenty of validation.

8:36

But this is something where you just, you know, for the most part,

8:38

it's a

8:38

judgment call.

8:39

Importantly, I don't think that people

8:41

are lying. There's plenty of evidence to show that they

8:44

really believe it. This poll doesn't show, it's

8:46

anonymous, right? It doesn't- It's

8:48

totally anonymous. Just click on it. So it's not like you're publicly

8:50

reporting because in some cases, obviously

8:52

people might lie to make themselves appear better.

8:55

But no, this is in your own private thoughts. This is

8:57

the way that you categorize the

8:59

social world. You just end up on top. I've

9:02

always found it interesting how the enhancement bias doesn't just apply

9:04

to ourselves, it applies to those that we

9:06

love.

9:07

So there's another study that says that people

9:10

strongly, they also believe their partners

9:13

have IQs and other abilities that are above

9:15

average. And certainly most loving parents

9:17

think their kids are,

9:19

they often, oh my kid is really the sharpest

9:21

and the handsomest and so on. And I've

9:24

often wondered whether this is a true belief

9:26

or just something one says because it makes

9:28

the kid feel happy because it's a way of expressing

9:30

your love. So I sometimes

9:32

wonder what you do if you make the person sort of put their

9:35

money on it. Right. I

9:37

think they've shown plenty of times that people will put

9:39

their money where their mouth is for their own

9:41

abilities, but whether they'll do it for their

9:44

kids. I suspect

9:46

yes, but I don't know. What do you

9:48

think? I suspect yes too.

9:50

I think, let me talk about perception. We

9:52

talk about bottom up and top down. And bottom up

9:54

is the world. Top down is your expectations

9:56

and what you want to believe in. To some

9:59

extent.

9:59

how we make sense of the world is determined by what we want

10:02

to believe. So just like when asked

10:05

how good a professor I am, I could focus

10:07

on the ways in which I'm good. When I look at

10:09

my kid and ask how good a kid he is, how

10:11

good an artist he is, I can just focus on the

10:13

positives and use that as a way to kind of get

10:15

a higher estimate than I otherwise would. And you

10:18

might be collecting information about the

10:20

positives and ignoring information about

10:22

the negatives in a way that you don't even

10:24

realize. And so you feel like it's a

10:26

genuinely honest estimate

10:28

that you're providing. There is, I think, some social

10:31

pressure. I remember when my daughter was

10:33

very little, I remember some family member

10:35

saying, oh, she's so smart.

10:37

And I would say, who knows? I've

10:40

never raised another kid. I don't know whether what she's doing

10:43

is smart at all. Like she might be below

10:45

the 50th percentile. And they would

10:47

be very upset. So there

10:49

is pressure sometimes. I

10:51

mean, these issues raise a very interesting

10:53

question of why such a bias exists. And

10:56

it could just be a glitch in the system. But

10:58

I don't think so. I think there are reasons why self-enhancement exists.

11:01

This is one of these interesting cases where it's

11:03

better to be wrong in a certain way

11:06

than to be

11:07

accurate. And it's

11:09

complicated. But here's a simple

11:11

case, which is if you ask people who just

11:13

get married, what are the odds you'll

11:15

get divorced? They typically,

11:18

even when asked privately, give very low odds. A

11:21

friend of mine gave us 0% odds.

11:23

Even though if they were to be asked

11:25

about somebody else and they knew this, it's just like, oh,

11:28

the odds of them getting divorced are 50% or 30% or

11:30

whatever it is. And I can see there's something

11:33

good about that, this positive illusion.

11:35

It's like in sports, the reporters

11:38

before game always ask the players,

11:40

do you think you're going to win this one? And

11:43

it would be a little odd if they said no.

11:45

It's

11:47

so odd. It's just kind of funny: "No, the

11:49

odds are against us. We're not really that good."

11:51

So one of the things that you could be saying is

11:54

that it's good to believe

11:56

that a good outcome will

11:58

occur. Because. Maybe it would be deflating

12:01

and you wouldn't try as hard if you thought

12:03

that you weren't going to do well. I think it's true for

12:05

all sorts of endeavors. So most attempts

12:08

at diets fail, but maybe it's

12:11

adaptive to keep trying and trying and trying and trying,

12:13

but it's hard to try in any single try if you think

12:15

your odds are one in 10.

12:17

So you bump up the odds, it makes you

12:19

try, and in the long run that's good for you. As

12:21

you mentioned in your chapter,

12:23

it's important to talk about the different kinds

12:25

of errors that you can make here. That's

12:27

right. One way is to think, to

12:31

overestimate your own skills

12:33

and try at something

12:35

and fail, because you think you could do it. And that's

12:37

kind of bad.

12:38

Another different sort of failure

12:41

is to underestimate your own skills and not try

12:43

in a case you would have succeeded. Now,

12:47

which is smarter? Which kind of error to make is smarter? There's

12:49

no general answer to that. Suppose somebody

12:52

is shy and

12:53

they want to ask people out for dates,

12:56

but the person has

12:58

fear of being rejected. You might tell this

13:00

person, look, if you ask somebody out for a date and

13:02

they say, no, that's not perfect. Maybe that's

13:04

a bit embarrassing and so on. But missing

13:08

out on the chance that somebody really would like you and

13:10

have a great life with you and everything, that's

13:12

a worse mistake.

13:14

So ask more people out. On the other hand,

13:17

if it's a case of sort of getting into a

13:19

violent confrontation like trying to kill somebody,

13:22

and then knowing that if you fail, there'll

13:24

be terrible retaliation, it's actually

13:26

probably a lot better to have an accurate or even

13:28

an underestimation of your skills. So you don't

13:30

try,

13:31

rather than foolishly trying and failing.

13:33

It's interesting that social psychology

13:36

has focused so much on the positive illusions

13:38

when we overestimate our skills and abilities.

13:41

When you look at the literature, you could come away

13:43

thinking, everybody is completely

13:46

misguided in one direction. And

13:48

I don't think that is the case. I always

13:51

would think to myself, this can't have been evolutionarily

13:54

adaptive in some ways, right? If

13:56

I think I can clear a six foot

13:58

jump, and I can...

13:59

only clear a four-foot jump. I would

14:02

probably fall to my death if I were trying

14:04

to jump that gap over a cliff. People

14:07

aren't completely idiotic

14:09

about this stuff. Right. I think that's a really

14:11

deep point. You know, some evolutionary psychologist,

14:14

and Martie Haselton has done some work where she argues

14:17

that men, for instance, often

14:19

overestimate how attractive they are

14:21

to women.

14:22

And she says there's an evolutionary logic to that. It

14:25

causes them this overestimation, this overconfidence

14:27

causes them to approach more women, and

14:30

they have more false alarms. They

14:32

think that a woman will be attracted to them. They're wrong.

14:35

But because of that, overall,

14:37

that's not such a bad thing, while missing an

14:40

opportunity for romance is a bad thing. It's

14:42

a perfectly good strategy. As she points out,

14:44

it's not so good for the women who get hit upon by

14:46

overconfident men. This is why it's

14:49

important for women to also do

14:51

evolutionary psychology. Yes. Yes. I

14:53

think the same point holds for clinical psychology,

14:55

too. If you look at anxiety in a clinical

14:58

psychology textbook, in fact, in my own book, I'm

15:00

talking about anxiety disorders. The anxiety

15:02

disorders all involve too much anxiety.

15:05

You have panic attacks, you have obsessive

15:07

behaviors, phobias, which are a form

15:09

of anxiety. You are fearful of certain

15:11

situations. And that seems very

15:13

natural.

15:14

And that's when you go to a psychiatrist and psychologist

15:17

to get treated. That's what drugs are invented for. But

15:20

Randolph Nesse, who's a very sharp

15:23

psychologist, points out

15:25

that there's another sort of anxiety disorder

15:27

that's never studied called too little anxiety.

15:30

And he says that people with too much

15:32

anxiety end up in therapist's offices, people

15:35

with too little anxiety end up in prisons

15:37

and morgues. If you walk around with

15:40

too little fear, bad things will happen

15:42

to you because you're not afraid enough of them.

15:43

So for all of this, there's kind of an optimal

15:46

level of how frightened to be. If

15:48

somebody says they have a pill or intervention,

15:51

that will make you fearless, run away from them, because

15:53

that will destroy your life.

15:55

I was never great at

15:57

studying for exams.

15:59

My sister, on the other hand, who went on to become

16:02

an attorney, went to a very good law school, was

16:04

a straight A student. She would convince herself before

16:06

every exam that she was going to fail.

16:08

And I always thought this was

16:10

the weirdest belief for the very

16:13

reasons that we were discussing at the beginning of this episode,

16:15

which is that I have information

16:18

about you, and that information is that you have never

16:20

failed an exam. But this belief seemed

16:22

to motivate her to study

16:24

hard. I remember

16:26

my last year at university, I

16:29

had senioritis bad. I was doing

16:31

poorly on exams,

16:32

and it's because I was not

16:35

anxious at all about my performance. And

16:37

that lack of anxiety led me

16:39

to very, very maladaptive

16:41

behaviors. The day that I stopped being

16:43

nervous before a talk is the day that I'll give a terrible talk

16:46

because I won't have prepared at all.

16:48

I totally agree. I'm always anxious before

16:50

talks. Less than I used to be, maybe because

16:52

less preparation is needed. My

16:55

anxiety motivates me to over practice and

16:57

prepare. And maybe

17:00

I could titrate it down a little bit, but

17:02

without

17:02

it, I'd just be casual and be

17:04

cool and calm and give much worse talks. Now,

17:07

of course, some people get so anxious to get paralyzed

17:09

or they can't do it or it's incredibly unpleasant. But

17:12

again, for all of these things, there's sort of... imagine

17:14

a dial. And

17:15

I think for everybody in every situation, there's an optimal

17:18

degree on how much to turn that dial

17:20

for anxiety, for self enhancement,

17:23

for all the things that we talk about.

17:25

The big story of social psychology has

17:27

always been what we study

17:30

is the self, the person, in

17:33

social contexts. There's another social

17:35

context that we haven't discussed yet,

17:37

but has played a big role in the last

17:39

few years in psychology, and that is culture.

17:42

There are probably many people listening to us talk about

17:44

this right now who might object

17:47

and say,

17:48

this seems like a very Western way

17:50

of thinking or maybe even a very American way

17:52

of thinking, where everybody

17:55

is better than average.

17:59

Culture, the way we relate to each other, affects our psychologies

18:02

in ways that social psychologists study. So

18:05

maybe inadvertently,

18:07

social psychologists, in talking

18:09

about the phenomena we're talking about here, the better-than-average effect,

18:12

self-enhancement biases, may be

18:14

giving us the psychology of a good

18:17

chunk of the world,

18:18

but actually not most of the world.

18:20

Maybe here we could talk a little bit about another

18:23

bias, the fundamental attribution error because

18:26

there's some very interesting work on this

18:28

across cultures.

18:30

One of my favorite articles is

18:32

now a classic from 1994.

18:35

It's Morris and Peng who

18:37

were looking at attributions

18:40

for cases in which somebody

18:42

had committed murder. So

18:46

they looked at the way

18:48

that Chinese newspapers and American

18:50

newspapers wrote up similar

18:52

cases. And so these were, I believe,

18:55

cases where there had been a shooting. American

18:57

newspapers focused a lot on

18:59

the individual

19:00

and the causal

19:02

factors being their character,

19:05

their particular inclinations,

19:07

their history, whereas the Chinese

19:09

papers focused much more on the

19:12

social context. So blame

19:14

was less on the individual,

19:17

or at least causality was less at the

19:19

individual level. It was more at the level of

19:22

the social groups and institutions that that person

19:24

belonged to. And that illustrates,

19:26

I think, nicely a difference between

19:29

collectivist and individualist

19:30

cultures, where the context,

19:33

the group, plays a much stronger

19:35

role than in cultures like ours. And

19:37

it could be argued that certain biases we

19:39

have would show up in our culture, but not

19:41

in others. And the sort of fundamental attribution

19:43

bias, which if I remember it right,

19:46

is the idea that what other people do

19:49

is because of their natures, not their situations.

19:52

That's right. Your confusion is understandable because there

19:54

are a number of biases in social psychology,

19:56

all of which seem to say very similar

19:58

things. I don't want to get the wrong

19:59

bias here. But at one

20:02

point, Joe Henrich, who wrote the

20:04

WEIRD article, said the fundamental

20:06

attribution bias isn't fundamental, it's just WEIRD.

20:09

I'm not sure that's entirely true. There's

20:11

a lot of debate. And in fact, some

20:13

of these biases may actually be universal,

20:16

though they show up to a lesser extent in collectivist

20:19

societies. I think it is important not

20:21

to throw out the baby with the bathwater. It's an open question

20:23

as to whether or not some of

20:25

the things we find in the U.S.

20:28

will replicate in other cultures.

20:31

I think we have varying degrees of confidence

20:33

in which of these might be the case. It's also

20:36

very possible that these biases

20:39

might just

20:39

rear their heads in slightly different ways. And so

20:41

if you are a member of a culture in

20:44

which the unit is the family or the

20:46

group or the nation, you might have a very

20:48

strong belief that

20:50

your group or your family or your nation is superior.

20:53

You might be likely to bring that kind

20:55

of positive illusion to

20:57

whatever unit you're focused on. That's right.

21:00

And even within societies, you can imagine different

21:03

feedback systems exaggerating

21:05

or diminishing a bias. So

21:07

in Australia, apparently, there's something

21:10

called a tall poppy syndrome,

21:12

where if you stand up above everybody

21:15

else, you get chopped off.

21:16

Part of showing proper

21:19

manners and doing well in society is to not make such

21:21

a big thing of yourself as opposed to like in New

21:23

York City or something.

21:26

We've been talking about how we evaluate ourselves and

21:28

now we sort of shifted a little bit to how we evaluate others. I got

21:31

to talk a little bit and ask you your thoughts about thin

21:33

slices. Oh, yeah. It is one of the sort

21:35

of cooler things that have come out of psychology.

21:38

The general idea of thin slices is

21:40

that we are often surprisingly

21:42

good

21:43

at figuring out

21:45

based on very brief exposures to people

21:48

or situations, either facts

21:50

about people or facts about their abilities. So

21:52

I think the original studies were with teachers.

21:55

You could tell me if I'm getting this right, but the idea would be you'd

21:57

give a lecture, you'd give a full class,

21:59

your students would give

21:59

an evaluation of how good you are as a lecturer. And

22:02

then we get another group and we show

22:04

them a short clip

22:05

of us lecturing. And I

22:07

say short, like three seconds long,

22:10

maybe it's sound off or something. And

22:12

it turns out based on that short clip,

22:14

people are surprisingly good

22:16

at judging how overall we

22:18

are as teachers as rated by people who've had a lot

22:20

more exposure to us. Right. A full

22:23

semester of exposure to a professor

22:25

doesn't seem to change the ratings

22:28

very much at all from just a few

22:30

seconds of seeing them lecturing.

22:33

Even the way, Paul, that you describe

22:35

the finding, I think shows

22:37

a bit of the ambiguity

22:40

in many of these findings that

22:42

I remember finding frustrating. Is

22:44

this demonstrating that we

22:46

are accurate? That

22:48

is, am I within 10 seconds

22:51

assessing your true abilities as a professor?

22:54

Or is it the case that whatever superficial

22:57

bias I have that may lead me to

22:59

judge that you're a good professor? So if you smile

23:02

and you're charming and you seem like your body

23:04

language is confident,

23:07

I might say you're a good teacher. And guess

23:09

what? You keep doing that.

23:12

And so I keep saying that you're a good teacher. Now,

23:14

what does it mean to be a good teacher? It's hard.

23:16

I mean, this is a domain in which it's unclear

23:18

what these ratings are saying. And a lot of people have

23:21

a lot of things to say about teacher ratings,

23:23

especially teachers, it turns out. You

23:26

are right to some extent,

23:27

there's sort of an interesting circularity

23:29

in some of this work. So there's research on people's

23:32

judgment of how trustworthy

23:34

others are by looking at their faces. And

23:37

Alexander Todorov has done some very interesting work

23:39

finding these trustworthy intuitions

23:42

can predict pretty well who

23:44

gets elected. Yes. You know,

23:46

he does these studies, these wonderful studies where he shows

23:48

a pair of people from some

23:50

congressional district far away and says,

23:53

basically he finds that

23:55

people, when asked proper questions, can predict

23:57

who wins by looking at their face, not a hundred

23:59

percent of the time. better in chance. And that again

24:01

is a case where there is some ambiguity about

24:03

what's driving the effect. That's right. Are they capturing a

24:06

true thing about them or is it that, well,

24:08

everybody sees them as trustworthy so they vote for them?

24:10

Right. Whatever bias you see

24:13

within a split second is

24:15

the bias that leads you to vote. So

24:17

it's hard to know. There is some work

24:19

trying to show whether or not these judgments are accurate,

24:22

but it's a little tricky and it gets a little messy because,

24:24

for instance, what does it mean

24:26

to show that you're actually trustworthy? People

24:29

might just bring you into the lab and have an economic

24:31

game that involves you behaving in

24:33

a manner that seems trustworthy. But

24:36

it's unclear whether that's truly speaking

24:38

to your overall character. That's right. I

24:40

think what is absolutely

24:40

true is that we make these snap

24:43

judgments within split seconds of meeting somebody.

24:45

There's another aspect of the research which maybe

24:47

takes us away from social psychology, but I just find

24:49

it so interesting is, and

24:51

this avoids the problem we're talking about here, is

24:54

the ability of people to tell from one another's faces

24:56

their sexual orientation and their political

24:59

leanings, not perfectly, but

25:02

better than chance. And now they have

25:04

AIs that do this. Many people may

25:06

have reason to find this disturbing, that

25:08

you could have a machine scan your face and reveal

25:11

those things. But how in the world

25:13

do they do it?

25:14

It's absolutely incredible

25:17

that these algorithms are capable

25:19

of making these determinations. As far as I

25:21

know, in the thin-slice work where humans are

25:23

making these judgments. So I show you a

25:25

bunch of pictures of men and ask

25:27

you whether they're gay or straight. People

25:30

are better than chance,

25:32

but very slightly better than chance.

25:35

But machines, much better than chance.

25:37

So the study I discuss in my book that used

25:39

a facial recognition algorithm on images

25:42

of 800,000 people on dating sites

25:44

could predict political

25:46

orientation.

25:48

Liberal and conservative at 72%

25:51

accuracy,

25:52

which is... Now, people can do it too.

25:54

And people are 55%, which is actually better

25:56

than chance for such a big sample. It turns

25:59

out when you do a kind of.

25:59

deep dive, part of it is

26:02

they use demographic cues. If you

26:04

see an older white man, it's

26:05

a better bet that he's a conservative,

26:08

for instance. But when you give

26:10

it a sample of the same age,

26:12

gender, and ethnicity, it still does

26:15

well.
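As a rough illustration of why 55% can still count as "better than chance for such a big sample", here is a minimal sketch with a made-up number of guesses (the human raters' actual sample size isn't given in the conversation), using a normal approximation to the binomial:

import math

# Hypothetical numbers, purely for illustration: suppose raters made 10,000
# guesses and got 55% of them right, where pure guessing would give 50%.
n_guesses = 10_000
p_observed = 0.55
p_chance = 0.50

# Standard error of a proportion under the chance hypothesis, then a z-score
# for how far the observed accuracy sits above chance.
std_error = math.sqrt(p_chance * (1 - p_chance) / n_guesses)
z = (p_observed - p_chance) / std_error
print(f"z = {z:.1f}")  # about 10 standard errors above chance: a small edge, but a reliable one at this scale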

26:16

And it seems to have to do with things

26:19

like how people face the camera. So for

26:21

some reason, liberals are more likely to

26:23

face the camera directly and more likely to look

26:25

surprised.

26:26

And I find this, you

26:28

know, the idea that things like our political

26:30

orientation could leak out in our faces

26:33

is just to be fascinating. I find it absolutely fascinating

26:36

as well. You do point out that some people

26:38

are distressed by this, but I think it's important

26:40

to tease apart the distress that you might have about

26:43

the intrusiveness of corporations

26:45

or government in trying to find

26:48

out information about you that you did not reveal.

26:50

But remember that these are all faces

26:53

that people post themselves, and

26:55

these are political orientations that they

26:56

report themselves. So the research

26:59

itself isn't unethical.

27:01

What people do with it may or may not be

27:03

unethical, which leads me, Paul, to the

27:05

studies that Stanley Milgram conducted

27:08

on obedience to authority. I would say

27:10

these are the most famous studies in all of psychology.

27:12

The main idea was that Milgram

27:15

and his assistants brought people into the Yale

27:17

lab under the pretext of

27:20

doing an experiment on memory.

27:21

And so they're brought in the room, there's another volunteer,

27:24

but the volunteer is actually another

27:27

experimenter; it's a trick.

27:28

And they pretend to flip a coin or something

27:30

to figure who gets to be the teacher and who gets to be

27:33

the learner. But it's set up so that the

27:35

other experimenter is really the learner. And

27:37

so the real subject of the study

27:40

is the teacher. And the teacher is instructed,

27:42

he is always a man, the teacher was instructed

27:45

that he had to give this other person words to remember

27:47

and when they failed, give them an electric shock.

27:50

Right. I believe the cover story was, we're

27:52

interested, you know, nobody has looked

27:54

at the effects of punishment on

27:56

memory. Yeah. That sounds,

27:58

it

27:58

sounds like not a bad study.

27:59

And the shocks

28:02

are in the shock box, which has a series

28:04

of shocks going higher and higher and higher till

28:06

at one point it goes XXX and

28:09

danger. 300 volts or something, yeah. Yeah.

28:11

And then basically when a person makes a mistake, they have

28:14

to move up, the teacher is told to

28:16

move to the next stage and shock them at

28:18

a higher rate. And as soon as the person

28:20

starts shocking them, the learner, again the actor,

28:23

starts screaming

28:24

and then begs to be let out.

28:27

And at this point in the- There's a prerecorded

28:29

track, actually. It's a prerecorded track. So everybody hears the

28:31

same thing. And at this point, there

28:34

were also videos one could watch of the Milgram

28:36

thing, which are really striking. Yes, absolutely,

28:38

yeah. And

28:39

from the videos, you can see these people get very upset

28:42

and they say, can I stop the experiment? And the experimenter says something like,

28:44

the experiment must go on. And

28:46

basically it reached a shock point

28:48

where the person doesn't respond anymore as if they're unconscious.

28:50

And at one point he yells, he has a heart condition. Yeah,

28:54

it goes from yelling and screaming, I have a

28:56

heart condition, please stop, please stop.

28:58

And then it goes to no response. That's right.

29:01

And the main finding is, contrary

29:03

to predictions of everybody,

29:05

is that most people

29:07

simply by being asked,

29:09

went all the way, hit the highest button,

29:12

basically believing they had killed somebody.

29:15

And it's important to realize some things about this

29:17

thing. People were not, it's not like they were

29:19

sadistic. They're upset.

29:22

Very upset. And I recommend anybody

29:25

just Google for seeing these original

29:27

videos. And you can see

29:29

that not only are people upset, they are asking

29:31

repeatedly to be let out of the experiment.

29:35

And the resistance that is offered

29:37

is simply, no, you must continue, it's

29:39

part of the study. That's right. Nobody

29:41

was physically preventing them from leaving,

29:43

of course. That's right, that's right. And

29:46

it is the case as you, it goes back to what you said before,

29:48

but first person, third person, where if you

29:50

ask most people, what would you do

29:52

in that situation? People would say, well, I'd say to hell

29:54

with you. I'd leave, get up and walk away. I bet some people

29:56

did that, but most didn't. And

30:00

most people were in anguish, being

30:02

ordered to kill somebody, even though

30:04

there was no gun to their head. And

30:07

Milgram used this as a model of the Holocaust,

30:11

arguing that part of the reason why the

30:13

Germans were motivated to kill so

30:15

many millions of people was simply that it's

30:18

natural for us to obey authority.

30:20

This was, needless to say, controversial. I

30:22

mean, for one, people sometimes

30:24

take offense if you try to explain any

30:26

evil act. There is some resistance

30:29

because explanation sounds like excuse-making,

30:31

which I don't think is fair

30:34

to Milgram and what he was doing. He was trying

30:36

to understand how regular

30:38

old people can be put in a situation

30:41

and actually commit acts that

30:44

were heinous. But the

30:46

ethicality of these experiments was a big controversy.

30:48

Yeah,

30:49

I think I may be wrong, but I think

30:52

these experiments were why human subjects committees were

30:54

started.

30:55

They're at least

30:57

one reason for it, absolutely. When

31:00

I was a student at Yale, my

31:02

friend and I, fellow graduate student, ventured

31:04

into the basement where all the student

31:06

offices are, and we decided

31:09

to go looking through parts of the

31:11

old psych department basement that weren't

31:14

in use. These were still sort of unfinished,

31:17

and

31:17

we saw that there were a bunch of file

31:20

cabinets in this dank, unfinished area

31:22

of the basement, and we started opening and

31:24

looking through the filing cabinets

31:26

to see what was there. One of the things that

31:28

was there was faculty

31:31

files dating back to probably

31:33

the 20s, I think. Yes,

31:36

in there was Stanley Milgram's

31:38

file, and in that file

31:41

were letters

31:43

that had been solicited from people

31:45

in the field as we do for promotion cases. We

31:48

asked people in the field to say, is this a person

31:50

good or not? And

31:53

we read through some of those letters. Half

31:56

of them said, this guy's a genius, he

31:59

is

32:01

probably the best social psychologist

32:03

out there right now. He's doing exciting, important

32:06

work. And the other half were like, this

32:08

guy is a moral monster.

32:10

Yeah. He did not get tenure at Yale. No,

32:12

he did not. Yeah. Yeah. And if I had

32:14

to vote, I would think both sides, I think he certainly

32:16

was a genius.

32:18

He had several other experiments that were also

32:20

the notion of six degrees of separation, by

32:23

the way, that is also a Stanley

32:25

Milgram idea. But I would agree that

32:27

the experiment was in its way monstrous. The people

32:30

had no understanding of what they were going to go into. And

32:32

of course, it was meant to be a deceptive experiment.

32:34

You couldn't tell them what was going to happen, but

32:37

it must be horrible

32:41

to believe you just killed a person.

32:43

And that is way beyond what would be permitted

32:46

in any psychology experiment now.

32:47

I absolutely agree. And in fact, the

32:50

specific regulation that is a result

32:52

of that is that if any participant

32:55

asks to be let out of a study,

32:57

you must let them out immediately. You

33:00

cannot force people to stay

33:03

in a study. So it's deeply unethical

33:05

for researchers to do this, probably was

33:07

then, certainly is now.

33:10

Not so for reality TV, though. And in

33:13

fact, if you're wondering about the

33:16

robustness of this effect, is this

33:18

one of these effects that doesn't replicate?

33:20

Every attempt

33:23

at

33:23

recreating this study that I know of

33:26

has succeeded. And I think it's important to

33:28

point out that Milgram himself did

33:30

a number of studies on

33:32

this very thing. He was quite the

33:35

rigorous researcher and detail-oriented. So

33:37

he manipulated all sorts of things

33:39

ranging from whether or not the

33:41

experimenter was wearing a lab coat, what

33:44

school the experimenter was from. And

33:46

he found, as you might expect,

33:48

that the more cues of authority that you had, the

33:51

more likely people were to obey. He

33:53

also varied the distance that you had from

33:56

the quote unquote learner, the person who

33:58

was receiving the shocks. And it

33:59

turns out, as you might expect, if you could see that

34:02

person, you were less likely to deliver

34:04

the shocks to the full extent. So

34:07

he was quite systematic. There's a surprising amount

34:09

of work that he did. And there had been replications,

34:11

recent ones, in different ways. Some

34:14

online, using online people, but

34:16

like in a sort of Second Life situation,

34:18

some pretty much doing

34:20

it in places where maybe the requirements

34:23

of ethics were more like... Yeah, I wasn't

34:26

joking about the reality TV thing. Yeah.

34:29

There are ironies

34:29

of life or of law that

34:32

to do experiments, you need to go through elaborate

34:34

human subjects procedures where the ethics are analyzed.

34:37

But to do a reality show, you don't have to go through any of

34:39

this. You probably just can't break the

34:41

law.

34:42

You sort of asked me what I would do.

34:46

And knowing the Milgram experiment,

34:48

I'd like to think I would say no. And I'd like to think for

34:50

similar things,

34:52

that hearing about the Milgram experiment inoculates

34:54

me against the sort of simple obedience. But

34:57

absent that, to be honest,

34:59

I probably would have killed a guy. I

35:02

don't think there's anything in my personality that's particularly

35:04

rebellious to authority. I'm a professor,

35:06

which is a

35:07

reasonably docile, obedient sort

35:10

of occupation.

35:11

People who give the middle finger to authority

35:13

don't tend to become professors. Sad

35:16

as it is to say, and shameful it is to admit,

35:18

I would obey. I feel the same way. And

35:22

when I admit this to students, they're a bit surprised.

35:24

I mean, with obviously the caveat

35:26

that we know about the Milgram experiment. So hopefully

35:30

we wouldn't. And I think

35:32

whether or not we're right and whatever

35:35

people listening to us, whatever their intuition is,

35:38

you've got to avoid this reflexive temptation

35:40

when you hear a bunch of people do something and a majority of people

35:42

do, to say, well, I wouldn't, I'm special.

35:45

Well, some people are. Not

35:48

everybody does. Some people walked out

35:51

at the very beginning.

35:52

But not everybody can be special. And

35:55

you need some independent evidence. I think

35:57

that if there is anything that

35:59

you

35:59

can conclude from all of this

36:02

work showing that people consistently

36:04

think of themselves as better across

36:06

so many different domains, it's okay to

36:09

remind yourself that maybe

36:11

you aren't that special. Yeah. There's

36:14

even one study that said, how

36:16

subject are you to the better than average effect

36:19

where people inflate their own abilities to make themselves special?

36:22

And the majority of people said, oh, I'm actually

36:24

better than average at avoiding that effect. It's

36:27

a bit depressing.

36:29

This episode of the psych podcast is brought to you once

36:31

again by GiveWell. If you're

36:33

considering donating to charity or

36:35

if you've never donated to charity before and you're interested

36:38

in figuring out where your money would

36:40

go, well, that can be difficult

36:42

if not downright impossible.

36:44

Charities aren't exactly open

36:47

and forthcoming with that information sometimes

36:50

on their websites. So it can be hard

36:52

to track down exactly what your money is doing

36:54

and how much impact it will have. That's

36:57

what GiveWell is all about. They've

36:59

dedicated a ton of staff

37:01

over 40,000 hours each

37:04

year to trying to find the

37:06

highest impact evidence backed

37:09

opportunities for people to donate

37:11

their hard earned money. Over a

37:13

hundred thousand donors have used GiveWell's

37:16

research and their service

37:18

to donate more than $1 billion. And

37:21

rigorous evidence suggests that these donations will save

37:23

over 150,000 lives and improve

37:27

the lives of millions more. So

37:30

if you are just interested in the research,

37:32

you can go to givewell.org and they've published

37:34

all of their research absolutely free. You

37:37

don't even need to sign up. No personal information

37:39

needs to be given and you can use all

37:41

of their research to figure out where to donate

37:44

to, where your money will

37:46

go and essentially how much

37:48

impact each dollar that you give will have.

37:51

But also if you want to use GiveWell

37:54

and donate through them, they'll allocate

37:56

your tax deductible donation to the charity or

37:59

fund of your choice.

37:59

and they won't take a cut at all. They're

38:02

not about

38:03

taking any money. They really

38:05

just want to make the process

38:08

of donating to charity more effective.

38:10

So if you've never donated to GiveWell's recommended

38:13

charities before, you can have your donation

38:15

matched up to $500 before

38:17

the end of the year, or as long

38:20

as matching funds last. To claim

38:22

your match, all you have to do is go to givewell.org

38:25

and pick podcast and enter psych

38:28

at checkout. Just to make sure that

38:30

they know you heard about them through us.

38:32

That's givewell.org,

38:33

pick podcast, and enter

38:35

psych at checkout. Our thanks to GiveWell

38:37

for sponsoring this episode of Psych.

38:40

So the chapter

38:43

ends on an issue of controversy. For

38:45

the most part, what we're doing and what we're talking

38:47

about today are things social psychologists

38:49

have batted about, sort of well-known

38:51

phenomena, but I end up to some

38:53

extent picking a fight. And I would understand if

38:55

not everybody in the field will agree with my final conclusions

38:59

of the chapter. I talk about something

39:01

which comes under the name of social priming.

39:03

And I

39:04

think maybe the best way to introduce

39:06

this

39:07

would be if I could list a bunch of phenomena.

39:10

Sitting on a wobbly workstation

39:13

makes people think their romantic relationships are

39:15

less likely to last. College

39:17

students who fill out a questionnaire about their political opinions

39:19

when next to a dispenser of hand sanitizer

39:22

become at least for a moment more politically conservative.

39:25

Exposure to noxious smells makes people feel less

39:27

warmly towards gay men. If you're holding

39:30

a resume on a heavy clipboard, you'll think better

39:32

of the applicant. If you're sitting on a

39:35

soft cushioned chair, you'll be more flexible

39:37

when negotiating.

39:38

Thinking of money makes you less caring of other

39:40

people. Holding a cold object makes

39:43

you feel lonely.

39:44

And I could go on, I go on for about twice

39:46

as much in my book, but there are thousands,

39:49

or if not thousands, at least many hundreds of findings

39:51

like this where

39:53

what they have in common is

39:55

something that seems like it

39:58

shouldn't matter, some mere exposure. Holding

40:00

a cold object, thinking about money,

40:03

being inside of a school, holding

40:05

a warm coffee cup

40:07

influences you in ways you don't know about. And

40:10

a lot of people, a lot of respected

40:12

social psychologists believe that this sort

40:14

of unconscious priming where

40:16

we're pushed around by factors out of our control

40:19

plays a radically powerful role in our lives.

40:22

You don't make decisions, you don't have free will, you're

40:24

just primed. Yes. And you'll notice some

40:26

of the studies I talked about were done by you

40:28

and me. I did notice you've primed

40:31

me to feel defensive. Yeah,

40:33

so. Yeah, you know, we talked about the replication

40:36

crisis in the previous episode. This

40:39

body of research is likely

40:41

the biggest cause and

40:43

victim of the replication

40:46

crisis. Some of these effects just

40:48

cannot be found if people

40:50

try to repeat them.

40:52

And that is bad. So

40:54

to the extent that there were poor

40:57

methods used or selective reporting

40:59

of findings that worked, all of the reasons

41:01

that we talked about in the episode

41:03

on replication might have given rise

41:06

to some of these effects. Now, to

41:08

be fair, priming itself

41:11

in the cognitive sense is a robust

41:13

phenomenon. So to give a famous

41:16

example from a set of studies on memory, if

41:18

I say pillow, bed,

41:21

alarm clock,

41:22

comforter,

41:23

the mind to some extent is organized

41:26

by associations, by a network

41:28

of associations. And I've given you a bunch

41:30

of concepts that are very closely related to

41:33

sleep. So the word sleep

41:35

and the thought sleep, the concept sleep is

41:37

going to be very easy now for you

41:39

to pull out of your mind. This

41:42

and various other ways in

41:45

which the mind can be primed

41:47

has been demonstrated in cognitive psychology

41:49

over and over again. Nobody doubts that

41:52

these effects exist. That's right.
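A toy sketch of the "network of associations" idea described above (the network and its weights here are invented purely for illustration, not taken from any specific model): activation from the primed words spreads to their neighbors, which is why a related concept like "sleep" becomes easy to retrieve.

# Invented association network: each word points to related concepts with a
# made-up association strength between 0 and 1.
associations = {
    "pillow": {"sleep": 0.8, "bed": 0.6},
    "bed": {"sleep": 0.9, "pillow": 0.6},
    "alarm clock": {"sleep": 0.5, "morning": 0.7},
    "comforter": {"sleep": 0.7, "bed": 0.5},
}

def prime(words):
    """Accumulate activation on every concept associated with the primed words."""
    activation = {}
    for word in words:
        for neighbor, strength in associations.get(word, {}).items():
            activation[neighbor] = activation.get(neighbor, 0.0) + strength
    return activation

# Hearing "pillow, bed, alarm clock, comforter" leaves "sleep" the most active
# concept, which is the sense in which it is now easy to pull out of your mind.
result = prime(["pillow", "bed", "alarm clock", "comforter"])
print({concept: round(level, 2) for concept, level in result.items()})
# -> {'sleep': 2.9, 'bed': 1.1, 'pillow': 0.6, 'morning': 0.7}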

41:53

And I just listed the effects and I didn't give

41:56

any explanations for why they would occur. But

41:58

there are reasonable enough stories for why they

41:59

occur, related to what you're talking about. So

42:02

warmth is connected at

42:05

a deep level to social contact

42:07

and to love and affection. So

42:10

it's not crazy to imagine holding a warm object

42:12

might make you feel more loved and holding a

42:14

cold object might make you feel

42:16

more lonely. This is a case where

42:19

some of the work on metaphor for instance has

42:21

had an influence on social psychology.

42:23

And one of my favorite effects is

42:26

the Macbeth effect

42:28

where

42:29

drawing upon part of the play

42:31

Macbeth by Shakespeare. The

42:33

finding is that washing your hands

42:36

physical cleansing in some

42:38

way removes guilt and anxiety. And

42:41

again the logic here is that notions of cleanliness

42:43

metaphorically extend to

42:47

both morality and goodness

42:49

and also to actual physical cleanliness. And

42:51

in some way the mind embodies these

42:54

metaphors. Sometimes it might just be more

42:56

about association. There are findings that

42:58

holding money

42:59

might make you care less about other people. Well

43:02

money's associated with a sort of

43:04

competitive sort of mindset and

43:06

less with love. And for each

43:08

of these findings they're not crazy findings.

43:11

There's a logic behind them.

43:13

My complaint, and we'll double

43:15

back, but part of

43:17

it is that these things are hard to replicate.

43:21

But I want to be cautious here. I think some

43:23

of them are real.

43:24

I think some of these findings are real, and some

43:26

of them have replicated and done well. It's

43:29

just

43:29

they are for the most part hard to replicate and

43:32

probably weaker than we thought. But the

43:34

second point is more conceptual which is

43:37

suppose it's true that,

43:39

to use an example, eating on a white

43:41

plate makes you think the food is fresher. You

43:44

know white is a clear and clean color,

43:47

fresher food and all that. Or maybe makes

43:49

it taste better, something like that. It'd be a mistake

43:51

to assume from that effect that, well,

43:54

that's all that matters for whether we

43:56

taste the food.

43:58

It is compatible with this sort of priming being

44:01

a part of the story, but if it is, it's a

44:03

very small part of the story.

44:05

I didn't decide to

44:06

set up my microphone and talk to you

44:08

today because I was primed

44:11

by something. I did it because it was on my calendar.

44:13

We talked about meeting and so on. And maybe

44:16

the

44:16

color of my room or some lights made

44:19

me a fraction of a second slower,

44:21

or faster to sit down. But

44:23

that's not the interesting stuff. The interesting stuff

44:25

is the planning, the deliberating, the deciding.

44:27

There's two issues there. Is the effect

44:30

real? Did this really document

44:32

some connection between whatever

44:35

the stimulus was and the outcome?

44:38

And then if it is real, to what

44:40

extent does it actually influence

44:42

any real world judgments and behavior?

44:45

And so you could believe that these are all real

44:48

and still not believe that

44:50

they affect our behavior all

44:53

that much. And you know what? In a lab, when

44:55

you're trying to isolate all the

44:57

variables that you could possibly isolate, so

44:59

the only thing that you manipulated

45:02

between, say, two different participants is the color

45:04

of the walls. Maybe this

45:07

is an ideal situation in which to find an

45:09

effect of color. These effects are so

45:11

counterintuitive and

45:13

so potentially interesting

45:16

that not only did social

45:19

psychologists talk more and

45:21

more about them and write

45:23

books and articles, but journalists picked

45:25

up on these effects

45:28

and just loved them. It's

45:30

so entertaining to tell somebody that

45:33

perhaps the crossword puzzle that

45:35

you did had words relating

45:37

to how old you were. And

45:39

this actually made you walk a little bit more slowly

45:42

to your refrigerator. If that's true,

45:44

that's

45:44

crazy.

45:46

And proponents of this view

45:48

often said, much like

45:51

Freud, we are uncovering the

45:53

fact that we are not in full control

45:56

of our behavior. And I

45:58

always resisted this. In fact, you... You

46:00

and I and our colleague Eric Uhlmann wrote

46:03

a paper on these effects way

46:05

back in the day, trying to... I

46:07

think we were very sober about these even

46:09

back then, about the role of unconscious

46:13

priming. So whether or not some

46:15

of these are real, I think, yes, we need

46:17

to replicate them. We

46:19

need to know whether or not they're real.

46:22

But I would put money on that none of

46:24

them matter much for a real-world judgment or

46:26

decision. That's right. And they could matter to us as

46:28

psychologists. It could be really interesting,

46:29

but they get swamped by real

46:32

world effects. Suppose in

46:34

a laboratory situation, showing me a defendant

46:36

wearing stripes, and

46:38

I have to judge what kind of prison sentence he

46:41

gets. Suppose in a tiny way,

46:43

in a little statistical way, I give him a

46:46

harder prison sentence than if he

46:48

wasn't wearing stripes, maybe because I think of stripes

46:50

as the uniform prisoners wear, and that sets up an association

46:52

in my head. But that doesn't mean that when

46:54

we go into a courtroom, whether that

46:57

person wears stripes or not is going to have a big effect

46:59

on how harsh the sentence is. The stuff that

47:01

really matters is the stuff you can't get a paper

47:04

published on, because it's kind of obvious. Do

47:06

you think he did it? How severe was

47:08

the crime? Is he repentant? And I think

47:11

these primes might

47:13

push and pull us in very subtle ways of

47:16

a sort that is absolutely fascinating to us as scientists.

47:19

The Macbeth effect, which has failed

47:21

to replicate, but imagine it

47:23

turned out to be true, would be very interesting

47:25

if there's a metaphorical system relating

47:28

physical cleanliness with moral guilt

47:30

and credit.

47:32

That would tell us a lot about how the mind

47:34

works, even though whether

47:36

or not people wash might have no effect outside

47:38

the laboratory. This is in general

47:41

good to remember for most psychological

47:43

effects. People

47:46

often talk about the strength of

47:48

an effect, and they use a

47:50

statistic, a family of statistics called

47:52

effect size. So the larger

47:55

that number is, the stronger

47:57

the relationship between the two things, say

47:59

the color

47:59

of the walls and your judgments.
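
To make that statistic concrete, here is a minimal sketch, not from the episode and using made-up numbers, of how one common effect size, Cohen's d, is computed for a two-group comparison like the plate-color example.

```python
# Minimal sketch (illustration only): computing a standardized effect size,
# Cohen's d, for a hypothetical two-group study with made-up ratings.
import numpy as np

def cohens_d(group_a, group_b):
    """Difference in means divided by the pooled standard deviation."""
    a, b = np.asarray(group_a, dtype=float), np.asarray(group_b, dtype=float)
    pooled_var = (
        (len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)
    ) / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical freshness ratings (1-10) for food served on white vs. dark plates.
white_plate = [7.1, 6.8, 7.4, 6.9, 7.2, 7.0]
dark_plate = [6.7, 6.5, 7.0, 6.6, 6.9, 6.4]
print(f"Cohen's d = {cohens_d(white_plate, dark_plate):.2f}")
```

The larger that number, the stronger the relationship in that sample; it is exactly the figure that can look impressive in a controlled lab setting and still be swamped outside it.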

48:03

Even in the cases where in the laboratory

48:05

we can get fairly big effects,

48:08

that does not at all mean that in the real

48:10

world we'll get those effects, because as

48:12

you said, the

48:14

causes for your behavior in the real world are

48:17

so myriad that they might get swamped. And

48:20

I want to say it's a good thing. Imagine

48:24

a creature like us that

48:27

was guided so strongly

48:29

by very small things like

48:31

the temperature in the room and the color of the walls

48:33

and the color of the plates. Our behavior

48:35

would be inexplicable even to us, to

48:38

ourselves. You know, this is not

48:40

priming, but research that you have done,

48:43

some of it with me and my colleague Yoel

48:45

Inbar, on the relationship

48:47

between disgust sensitivity, the

48:49

tendency to feel disgust,

48:51

as a sort of stable personality trait,

48:53

like how easily grossed out are you, and

48:56

political orientation.

48:58

We have found that this relationship

49:01

exists in a number of different samples,

49:03

international samples, different languages. Oftentimes

49:06

when I give talks about this, people ask me, so

49:08

if I wanted to know about somebody's

49:10

political orientation, I can just see how grossed out

49:12

they are. And I always say, that would

49:15

be the least efficient way of

49:17

finding out what someone's political orientation

49:19

is, and probably the least predictive

49:22

of all the ways. You know what the best way

49:24

to find out what someone's political orientation

49:26

is? You ask them what their political orientation

49:29

is, because that is, after all, how

49:31

we found out that this relationship exists

49:33

to begin with. Where they're raised,

49:36

you know, what their parents' political orientation

49:38

is, how old they are, all of these are things that

49:40

are way more predictive. And I

49:43

like a world in which reasonable

49:45

things like that are better

49:47

predictors than something

49:49

like, you know, how grossed out are you.
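
As a made-up illustration of that point, and not the actual data from any of those studies, the sketch below simulates a population in which disgust sensitivity really does correlate with political orientation, yet an obvious predictor like parents' orientation carries far more of the signal.

```python
# Made-up illustration (not real data): a weak but real predictor, like disgust
# sensitivity, gets dwarfed by a mundane predictor, like parents' politics.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

parents = rng.normal(size=n)   # hypothetical parents' political orientation
disgust = rng.normal(size=n)   # hypothetical disgust sensitivity score
# Child's orientation: driven mostly by parents, only slightly by disgust, plus noise.
child = 0.6 * parents + 0.1 * disgust + rng.normal(scale=0.8, size=n)

print(f"r(child, parents) = {np.corrcoef(child, parents)[0, 1]:.2f}")
print(f"r(child, disgust) = {np.corrcoef(child, disgust)[0, 1]:.2f}")
```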

49:51

I can't resist going back a bit to the brain now,

49:54

which is, a lot of people say, isn't this amazing?

49:56

We could do scans of the brain and find out whether

49:58

people like Diet Coke

49:59

or not.

50:02

We show them Diet Coke, we see what parts of the brain light

50:04

up. Well, there's two problems with that, both

50:07

compatible with what you just said. One is the relationship

50:10

is very weak and very subtle. And

50:12

the second thing is just ask them,

50:14

do they like Diet Coke? Because that's how

50:16

we know that part of the brain is relevant

50:18

in the first place. You've described some research

50:20

that might be under the label, neuromarketing.

50:23

That was an attempt to use brain

50:25

science for marketing. Some British psychologists

50:28

describe it, a bit more politely in British English, as

50:29

neurobollocks. It's

50:34

hard to find out which of these effects are

50:36

real. Like we were discussing

50:38

earlier, cognitive priming certainly exists.

50:41

We do have conceptual networks,

50:44

thoughts connect to each other in some

50:46

cases more strongly than in other cases.

50:49

How much that influences behavior,

50:52

it might matter whether the behavior that you're talking

50:54

about is an eye movement or

50:56

a life choice. I'd

50:59

put more money on the eye movement than on the life

51:01

choice. I'll put this to you and you

51:04

tell me if you can think of a case. I can think of some

51:06

priming phenomena which I

51:08

believe, and certainly the cognitive ones, we're

51:11

faster to say the word nurse, or read the word

51:13

nurse if it's preceded by the word doctor. And there's

51:15

a million findings of that, pretty robust.
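
Here is a toy simulation of that kind of lexical-decision measurement, with made-up reaction times rather than data from any real study, just to show how a facilitation of a few tens of milliseconds can be statistically solid while remaining tiny in absolute terms.

```python
# Toy simulation (made-up parameters, not real data) of a doctor -> nurse
# semantic priming effect on lexical-decision reaction times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical mean RTs in milliseconds: a ~20 ms facilitation for related primes.
rt_related = rng.normal(loc=580, scale=60, size=n_trials)    # doctor -> NURSE
rt_unrelated = rng.normal(loc=600, scale=60, size=n_trials)  # bread  -> NURSE

t, p = stats.ttest_ind(rt_related, rt_unrelated)
print(f"mean related   = {rt_related.mean():.0f} ms")
print(f"mean unrelated = {rt_unrelated.mean():.0f} ms")
print(f"t = {t:.2f}, p = {p:.4f}")
# A difference of a few tens of milliseconds can be highly reliable in the lab
# while still being negligible for any real-world decision.
```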

51:17

I can't think of a single priming finding

51:20

that's not just true

51:23

but also powerful enough to make a real difference

51:25

in people's lives that

51:27

you'd want to use as some sort of technology

51:29

or treatment or ways

51:32

to persuade somebody. I can't

51:34

either. This is something

51:36

that has been sold

51:39

to popular audiences for

51:41

a long time. For a while there was

51:44

this craze about subliminal messages

51:47

in marketing, the idea that if you could flash popcorn

51:49

subliminally, below the threshold

51:52

of awareness in a movie theater, people would be more

51:55

likely to go buy popcorn. I always go

51:57

to, what does the market say about this?

51:59

I mean, think about it. If that stuff really worked,

52:01

people would be doing it a whole

52:04

lot and they'd be making a lot, a

52:06

lot of money. And the fact is they're not.

52:09

I can tell you with some degree of confidence, we are

52:11

not making money hand over fist selling

52:13

priming studies to corporations.

52:16

It never followed from the findings

52:18

that we've discussed, on color and other kinds of priming, that

52:21

this would influence actual behavior.

52:23

And so properly thought

52:25

of, I

52:26

think these priming studies should be thought of as

52:28

just a really interesting avenue in social

52:30

psychology to explore how our minds

52:33

work, how it's all tangled, how it all connects together.

52:36

But I think this is the turn

52:38

you talked about. So much of it has turned into self-help

52:40

and marketing promotion and everything. Paint

52:43

your rooms blue, you'll be more creative, put

52:45

a flag in the corner, people will vote for your political

52:47

party. And none of that has panned

52:49

out. I think that once you remove those grandiose

52:52

claims and just bring it right back to the lab, this

52:55

stuff is of great value.

52:57

As far as I know, there have been

52:59

no attempts to replicate our finding that disgusting smells

53:02

lead to different ratings of homosexual men.

53:04

But if people tried and it failed,

53:07

I would be the first to say,

53:09

well, like let's take the body of evidence

53:11

and say, you know, if we were wrong, we were wrong.

53:14

And it's

53:16

not clear at all that we were not wrong. That's

53:19

right. We might be right. The way

53:21

that science proceeds has to be one in

53:24

which we say, if it's wrong, then we

53:26

should discard it. I'll say

53:28

something on a professional level, which I'm not sure

53:30

we've ever touched upon, maybe repeating ourselves,

53:33

but it's an example of how as a working

53:35

psychologist or working scientist, it's

53:37

good to work on many projects. Yes. I

53:40

feel very bad for my colleagues who work on exactly

53:42

one thing for like 40 years. And

53:44

then there's always a risk that you're just on the wrong

53:47

track and then it collapses. I mean,

53:49

maybe if you're on the right track, you get a Nobel Prize and

53:51

it's wonderful. But I feel so bad when I see

53:53

something fail to replicate and I say, oh my God,

53:56

my friend, that's his life.

53:59

And you and me, we did some

54:01

disgust work, we like it, but we

54:04

do a lot of other things. We won't have to

54:06

hang ourselves in the basement if it fails

54:08

to replicate.

54:09

You and I have talked outside of this podcast

54:12

about this, but I view it as an investment

54:14

portfolio. You want to diversify. Yeah,

54:18

eggs in many baskets or something like that.

54:20

Diversify one's career.

54:22

That's right. Well, that is

54:24

advice that could change people's lives. It

54:28

absolutely is. Don't be so

54:30

singularly focused. OK.

54:32

On that note then. Great talking. Great

54:34

talking to

54:35

you.
