Daniel Kahneman doesn't trust your intuition (Re-release)

Released Tuesday, 26th December 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:01

TED Audio Collective. Thanks

0:07

to PagerDuty, an essential platform for

0:10

critical work. Transform your operations and

0:12

move forward faster with PagerDuty. Hey

0:15

listeners, today we're sharing a past episode

0:17

of Rethinking from the archives. Enjoy.

0:24

Hi everyone, it's Adam Grant. Welcome back to

0:26

Rethinking, my podcast on the science of what

0:28

makes us tick. I'm an

0:30

organizational psychologist and I'm taking you inside the

0:32

minds of fascinating people to explore new thoughts

0:34

and new ways of thinking. My

0:37

guest today is Daniel Kahneman. Danny

0:39

won a Nobel Prize in economics. He's been

0:42

named one of the most influential economists in

0:44

the world. But he's not on

0:46

board with that. Danny is

0:48

one of the great psychologists of our time. Actually,

0:51

of all time. You

0:53

may have read his influential book, Thinking Fast

0:55

and Slow. This conversation

0:57

with Danny challenged one of my core

0:59

beliefs about intuition. It also

1:01

gave me a new perspective on which ideas are

1:03

worth pursuing. And since Danny

1:05

is an expert on decision making, I thought

1:08

I'd start by asking about what we're actually

1:10

trying to accomplish in so many of our

1:12

choices. You've

1:23

spent a lot of your career studying happiness

1:26

and related topics. And

1:29

really, for the first time in my career,

1:31

I started to wonder why are we so

1:33

obsessed with happiness as psychologists? You know,

1:35

I'm all for people leading enjoyable,

1:38

satisfying lives. But if

1:40

I had to choose, I would much rather have

1:43

people focus on character, on

1:45

trying to build their generosity, their

1:47

integrity, their commitment to

1:49

justice, their humility. And I wonder

1:51

if you could talk to me a little bit about

1:54

whether you think we've lost

1:56

our way a bit and character has

1:58

been too little in

2:00

focus or too far in the background or whether

2:02

you think happiness deserves the attention it's gotten? Well,

2:05

I think my

2:07

focus would be neither happiness nor

2:09

character. It would be misery.

2:13

And I think that there

2:15

is a task for society to

2:19

reduce misery, not to increase

2:21

happiness. And when you

2:23

think of reducing misery, you

2:25

would be led into very different

2:28

policy directions. You would

2:30

be led into mental health issues. You would

2:32

be led into a lot

2:34

of other problems. So

2:36

reducing misery would be my focus.

2:40

Character and happiness or

2:42

misery are not substitutes. The

2:45

idea which has been accepted both in the

2:47

UK and in many other places, in quite

2:49

a few other countries by now, is

2:51

that the objective of society,

2:54

the objective of policy, should be

2:57

increasing human welfare or

2:59

human well-being in

3:01

a general way. I

3:03

think that's a better

3:06

objective for policy than

3:08

increasing the quality of the

3:10

population's character. I think it's

3:12

a better objective. I think

3:15

it's a more achievable objective, except I

3:18

would not focus on the positive end.

3:20

I would focus on the negative end.

3:22

And I would say it is the

3:24

responsibility of society to try to reduce

3:27

misery. And let's

3:29

focus on that. We

3:31

speak of length and not of shortness. And

3:34

we speak of happiness. The

3:37

dimension is labeled

3:39

by its positive pole. And

3:43

that's very unfortunate, because actually,

3:45

increasing happiness and reducing misery are very

3:48

different things. I agree. And it's interesting

3:50

to hear you say that reducing misery

3:52

is more important than promoting happiness. In

3:55

some ways, that feels like a critique

3:57

of the positive psychology movement. It is.

4:00

And tell me a little bit more about why. Well,

4:04

I think the positive

4:06

psychology movement has in

4:09

some ways a deeply conservative

4:12

position. That is, it

4:14

says, let's accept people's condition as

4:16

it is and let's make people

4:19

feel better about their unchanging condition.

4:22

You know, there has been some critique of

4:25

positive psychology along those lines. I'm

4:28

not innovating here. But

4:31

I think that focusing

4:33

on changing circumstances, on

4:35

dealing directly with misery

4:38

is more important and

4:40

is a worthier objective for

4:42

society than making people feel

4:44

better about their situation. Yeah,

4:46

I mean, I think it certainly tracks with

4:49

how I think about, in general, bad

4:51

being stronger than good and the

4:53

alleviation of misery contributing more to

4:55

the quality of people's lives than

4:57

some degree of elevating of

5:00

the amount of joy that they feel. But

5:02

I also wonder at times if this is

5:04

not a false dichotomy, that if you want

5:06

to make people happy, it's awfully difficult to

5:08

do that if you don't pay attention to

5:10

the misery or suffering that they might experience.

5:14

Actually, we

5:17

once did a study in which

5:19

we were measuring how people feel,

5:22

how much of the

5:24

day are people in different states,

5:26

positive or negative. And

5:30

it turns out that people

5:33

are in a positive state on

5:36

average 80% of the time,

5:38

more than 80% of the time. That

5:42

is, on average, people are on the positive

5:44

side of zero. Now,

5:46

look at, say, the 10%

5:49

of the time that people spend suffering

5:52

overall. Most

5:54

of the suffering is concentrated

5:56

in about 10% to 15% of the

5:58

population. So

6:02

it actually is not the same

6:05

people that you would make less

6:07

miserable or happier. Those

6:09

are different populations. And the

6:11

question is, where do you direct the

6:14

weight of policy and what do

6:16

you pay more attention to? Very

6:19

interesting. I like it. So

6:22

you're basically saying, look, if we have scarce

6:24

resources, whether those are financial or time or

6:26

energy, we want to concentrate on the group

6:28

of people who are suffering, as opposed to

6:30

those who might be languishing. It

6:34

seems to me that to some extent,

6:37

we have been trapped by

6:39

a word. I mean, it's

6:41

the word happiness, which

6:44

seems to stand for the whole dimension.

6:49

And I think this is leading

6:51

to some policies. Actually, this is

6:53

failing to lead to

6:56

policies that would really

6:58

be directed at increasing

7:00

human well-being by decreasing

7:02

misery. Yeah, I

7:04

think so too. And it's something I've thought about

7:07

a lot at work, given that the hat I

7:09

wear most often is organizational psychologist. I

7:11

feel like the obsession with employee engagement

7:14

has really missed the mark. I don't go to

7:16

work hoping that I'm going to be engaged today. I

7:19

hope that I'm going to have motivation and

7:21

meaning and that I'm going to have a

7:23

sense of well-being. And I wonder

7:25

if one of the effects that the pandemic has had

7:27

on a lot of people and a lot of leaders

7:29

in workplaces is to get them to

7:32

recognize, you know what, we need to care about people's

7:34

well-being in their lives, not just their engagement at work.

7:38

Well, I thought that,

7:40

you know, I'm not an expert, this is your

7:42

field, not mine, but I thought

7:45

that engagement is

7:48

close to feeling good at

7:50

work. I mean, whether it's

7:52

the responsibility of workplaces

7:55

to deal with people's well-being in

7:57

general, I agree that it's their

7:59

responsibility for dealing with people's

8:02

well-being at work. And that doesn't

8:04

seem to me to be very

8:06

different from trying to make people

8:08

engaged and happy with what they're

8:11

doing. So I'm a

8:13

bit curious to hear more about

8:15

the dichotomy or the distinction

8:18

that you're drawing between engagement

8:20

and well-being. My interpretation of

8:22

engagement was fairly close to

8:25

well-being at work. Yeah,

8:27

I think in large part it depends

8:29

on which conceptualization and measure of engagement

8:31

we're talking about. But one

8:33

of the more interesting patterns

8:36

in the literature that's gotten me thinking

8:38

quite a bit is that it's possible

8:40

to be an engaged workaholic.

8:44

And this has been differentiated recently

8:46

from being a compulsive workaholic. You

8:49

know, are you working a lot because

8:51

you find it interesting and worthwhile? Or

8:53

are you doing it because you feel guilty when

8:56

you're not working and you feel kind

8:59

of obsessed with the problem that you're

9:01

trying to solve? And I think

9:04

that one version of engagement is probably

9:06

healthier than the other. And

9:08

I associate well-being much more with being

9:11

an intrinsically motivated workaholic than

9:13

with a compulsive workaholic, even though both are

9:15

highly engaged. I agree.

9:20

You know, I worked for a while with Gallup.

9:22

I was a consultant with

9:24

Gallup many years ago. And

9:27

their concept of engagement, I

9:29

think was a positive concept. One

9:32

of the criteria that I remember for people

9:34

being happy at work is having a friend

9:37

at work. So

9:39

clearly, at least their concept

9:41

of engagement, which is the

9:43

only one that I know

9:45

much about, is by

9:47

and large a positive concept. And

9:49

certainly, we don't

9:52

want people to be compulsive, although

9:55

I don't know

9:58

how to describe it. I described myself,

10:00

for example, when

10:03

I work hard or when I used

10:05

to work very hard, was I doing

10:07

so compulsively? Was I doing so out

10:10

of intrinsic motivation? I think both.

10:13

I was intrinsically motivated and I

10:15

was compulsive about it.

10:17

So I'm not sure of

10:19

the distinction that you're drawing

10:21

between being compulsive and being

10:24

intrinsically motivated. Well

10:27

I'd like to call out the ambivalence

10:29

there because I think it speaks to the point

10:31

that you raised earlier, which is that positive emotions

10:33

and negative emotions can coexist. You

10:35

can work because you're passionate about it and because

10:37

you feel bad if you're not doing it. That's

10:40

right. I want to ask you about the joy of being wrong. The

10:43

place I wanted to begin on this is to ask you, when

10:46

you were growing up or earlier in your life,

10:48

how did you handle making mistakes? I'm

10:53

hesitating because I can't. It's

10:56

not that I didn't make any mistakes. I

10:58

certainly made many. But I wasn't very impressed

11:00

by my mistakes. They were not very salient

11:02

in my life. So if you're asking

11:04

about my early, as a student and so

11:07

on, I don't have much to report that's

11:09

of any interest. As a researcher, I found

11:13

my mistakes very

11:15

instructive and there

11:18

were sort of positive experiences by and

11:20

large. That's such an odd thing to hear

11:22

you say. Most

11:25

of us experience pain, not pleasure when

11:28

we find out that we're wrong or we discover

11:30

that we've made a mistake. How

11:33

did you arrive at a place where you found

11:35

that to be a teachable moment? Well,

11:39

those are situations in which you're surprised.

11:42

I really enjoy changing my mind

11:44

because I enjoy being surprised and

11:46

I enjoy being surprised because I

11:48

feel I'm learning something. It's

11:52

been that way. I've been lucky, I think, because I

11:55

think you're right that this

11:57

is not universal, the positive

12:00

emotion to corrected

12:02

mistakes. But it's just

12:04

a matter of luck. I mean, I'm

12:07

not, you know, claiming I'm any

12:09

better here. It's

12:12

fascinating to watch though, because I've seen

12:14

your eyes light up and, you know,

12:16

it's palpable, right?

12:19

When you discover that you were wrong

12:21

about a hypothesis or a prediction, you

12:24

look like you are experiencing joy. And

12:27

I've started to think a

12:29

lot about what prevents people from

12:31

getting to that place. And

12:33

I think a lot of it is for so many people,

12:36

they get trapped in either a

12:38

preacher or a prosecutor mindset of

12:41

saying, you know, I know my beliefs

12:43

are correct, or I know other people are wrong. And

12:46

at some point, their ideas become part of

12:48

their identity. And I know even scientists struggle

12:51

with this, right? I think, at

12:53

least when I was trained as a social

12:55

scientist, I was taught to be passionately dispassionate.

12:59

But I know a lot of scientists who

13:01

struggle with detachment, and you don't seem to. So

13:03

how do you keep your ideas from, I guess,

13:05

becoming part of your identity? Well,

13:08

I think that, I

13:10

mean, this is going to sound awful. I

13:13

have never thought that ideas are

13:15

rare. And, you know,

13:18

if that idea isn't any good, then there

13:20

is another that's going to be better. And

13:23

I think that is probably

13:25

generally true, but not generally

13:28

acknowledged. So that for people

13:30

to give up on an idea may,

13:32

in many cases, lead to a sort of

13:35

panic. If I don't have that idea, then

13:37

what do I have? Who am I? If

13:40

I don't have that idea? So

13:42

being less identified with your ideas is

13:44

also associated, I think, with

13:47

having many of them, discovering

13:49

that most of them are no good,

13:52

and trying to do the

13:54

best you can with a few that are good. So

13:58

it's seeing ideas as abundant rather than scarce

14:00

that makes it easy to stay detached? Yeah.

14:03

Yeah, I mean, I used to tell my

14:05

students, ideas are a dime a dozen. I

14:07

mean, don't over invest in your old ideas.

14:11

And so I used to encourage my

14:13

students to give

14:15

up at

14:17

a certain point. I certainly never wanted

14:19

to read a dissertation by

14:21

a student with a chapter that would

14:24

explain why their experiment failed. So

14:27

that was the kind of

14:29

advice that I would give them. Think of

14:31

another idea. Do

14:33

you ever worry about getting too

14:35

detached? I think, for

14:38

example, about messenger RNA technology,

14:41

which was seen as, I think, a

14:43

joke for a long time. And if

14:46

not for the courage and tenacity of a

14:48

small group of scientists who persisted with it

14:50

anyway, we might not have a COVID vaccine

14:52

right now. Oh. I

14:57

think, well, in the first place,

14:59

science, like many other social

15:02

systems, doesn't

15:04

thrive on everybody being

15:06

the same. So you

15:09

may have some advice that is good for some

15:11

people. And it's clear

15:13

that some people who are

15:16

irrationally persistent achieve great

15:18

successes. Indeed, if

15:20

you look back at

15:22

great successes, you will generally find

15:24

that there is some irrational persistence

15:27

behind them and irrational optimism behind

15:29

them. That doesn't mean that

15:32

when you are looking from the

15:34

other side that irrational optimism or

15:36

irrational persistence

15:39

are good things to have. So

15:42

the expected value of it might

15:44

be negative, although when you look

15:46

back, every big success you

15:48

can trace to some irrationality.

15:51

Well, that goes beautifully to one of my

15:53

favorite ideas of yours, that we

15:56

look at successful people and

15:58

we learn from their habits, not realizing

16:00

that we haven't compared them with people who

16:02

failed, who had many of the same habits.

16:07

And I wanted to, I guess, ask you

16:09

a broader question, which is having put these

16:11

kinds of decision heuristics and cognitive biases on

16:13

the map, which one do

16:15

you fall victim to the most? Is it confirmation

16:17

bias? It sounds like maybe not. I

16:19

just wondered which of the

16:22

biases that you've documented is your greatest

16:24

demon? All of them. Really,

16:27

all of them, except, as you

16:29

said, confirmation bias. By

16:31

the way, people close to me find

16:33

this irritating. That is, that

16:36

whenever they have a problem with someone,

16:39

I automatically take the other side and

16:41

try to explain whether someone might be

16:44

right after all. So I have that

16:46

contrarian aspect

16:48

to what I am. This

16:52

reminds me a little bit of a possibly

16:54

apocryphal story that I

16:57

think is told to every doctoral student in

16:59

social science these days, which is that

17:02

not long after you won the Nobel Prize for

17:04

your work on decision making, there was a journalist

17:06

who asked you how you made tough

17:08

decisions, and you said you flip a

17:10

coin. Is this true? No. Okay,

17:13

good. Absolutely not. I've never flipped a coin

17:15

to make a decision in my life.

17:19

The version of the story I heard was that

17:21

you would flip the coin to observe your own emotional

17:23

reaction and figure out what your biases were.

17:26

I might have said that this is one

17:28

of the benefits of flipping a coin, but

17:30

I personally have never used it. But

17:34

it's true that flipping

17:37

a coin would be a way of discovering

17:39

how you feel if you didn't know earlier.

17:42

That I still believe. I feel very

17:44

relieved to know that because I was worried about

17:46

you, given all you know about

17:48

decision making, making important life choices

17:50

with a coin toss. Generative

18:00

AI. Sure,

18:19

everyone's talking about it, me included. Well,

18:22

PagerDuty is doing something different with it,

18:25

something hyper-relevant to operations teams.

18:27

They're using Gen AI to help IT

18:29

and engineering teams do faster and smarter

18:31

critical work. Whether it's co-authoring

18:34

status updates in the heat of an

18:36

incident, or creating full post-mortems with a

18:38

complete rundown of what happened, PagerDuty's Gen

18:40

AI is changing the game for incident

18:42

management in the enterprise. When

18:44

you're in the middle of an incident,

18:47

you're focused on managing risk, revenue, and

18:49

customer experience. You shouldn't be

18:51

spending precious time manually updating people about

18:53

the process. Now with just

18:55

a click, you can let PagerDuty's Generative AI

18:57

do that for you. PagerDuty's

18:59

AI-powered platform can help in countless

19:01

other ways too, from

19:03

automating tasks to improving productivity.

19:06

Learn more at pagerduty.com. When

19:13

I look back at my life, there's

19:17

been a series of things that ultimately

19:20

I made decisions or

19:22

I made life choices clearly, but

19:25

I did not experience them as

19:29

decisions. I have

19:31

very little to say describing myself about

19:34

making decisions, in part because

19:37

I have

19:39

pretty strong intuitions and

19:41

I follow them usually. The

19:45

decision doesn't feel hard if

19:47

you know what you're going to do. If

19:50

you know yourself and you're going to do

19:52

it anyway, it doesn't feel very hard. You

20:00

have spent most of your

20:03

career highlighting all the fallacies

20:05

that come into play when

20:07

we over rely on our intuition. Well,

20:12

you really have to distinguish judgment

20:16

from decision making, and

20:19

most of the intuitions that

20:21

we've studied were fallacies

20:24

of judgment rather than decision

20:26

making. And second,

20:29

my attitude to intuition is not that

20:32

I've spent my life saying

20:35

that it's no good. In

20:38

the book that we're writing, that we've

20:40

just finished writing, our advice

20:42

is not to do without intuition. It is

20:44

to delay it. That

20:46

is, it is not to decide

20:49

prematurely and not

20:51

to have intuitions very early. If you

20:53

can delay your intuitions, I

20:56

think they are your best guide, probably, about what

20:58

you should be doing. Okay,

21:01

so two questions there. One is how, the other

21:03

is why. Well, if

21:06

you delay your intuitions, now

21:10

I'm talking about formal decisions, decisions

21:12

that might be taken within an

21:15

organization, or a decision that an

21:17

interviewer might take in deciding whether

21:19

or not to hire a candidate.

21:23

And here, the advice of

21:25

delaying intuition is simply because when

21:27

you have formed an intuition, you

21:30

are no longer taking in information. You

21:33

are just rationalizing your own decision, or

21:35

you're confirming your own decision. And there

21:37

is a lot of research indicating

21:40

that this is actually what

21:42

happens in interviews. That

21:44

interviewers spend a lot of time. They

21:46

make their mind up very quickly, and

21:48

they spend the rest of the interview

21:50

confirming what they believe, which is really

21:52

a waste of time. Yes,

21:55

yes. So the idea of delaying your intuition is

21:57

to make sure that you've gathered

22:01

comprehensive, accurate, unbiased

22:04

information so that then

22:06

when your intuition forms it's based on

22:08

better sources, better data? Is

22:10

that what you're after? Yes, because I don't

22:13

think you can make

22:15

decisions without their being endorsed

22:17

by your intuitions. You

22:20

have to feel conviction. You

22:22

have to feel that there is some good

22:24

reason to be doing what you're doing. Ultimately,

22:28

intuition must be involved. But

22:31

if it's involved, if you

22:34

jump to conclusions too early or

22:36

jump to decisions too early, then

22:40

you're going to make avoidable mistakes.

22:45

This is an interesting twist on how

22:47

I've thought about intuition, especially in a hiring

22:49

context, but I think it applies to a

22:51

lot of places. My

22:53

advice for a long time has been, don't trust

22:56

your intuition. Test your intuition. Because

22:59

I think about intuition as a subconscious

23:01

pattern recognition, and I want to

23:03

make those patterns conscious so I can figure out whether

23:06

whatever relationship I've detected in the past

23:08

is relevant to the present. It

23:11

seems like that's what you've argued as well

23:14

when you've said, look, you can trust your

23:16

intuition if you're in a predictable environment, you

23:18

have regular practice and you get immediate feedback

23:20

on your judgment. I

23:22

think the tension for me here is I don't

23:25

know how capable people are of delaying

23:27

their intuition. I wonder if

23:29

what might be more practical is to say,

23:32

okay, let's make your intuition explicit instead of

23:34

implicit early on so that then you can

23:36

rigorously challenge it and figure out if it's

23:39

valid in this situation. I've

23:41

been deeply influenced by something that I

23:43

did very early in my career.

23:45

When I was 22 years old, I

23:48

set up an interviewing system for the

23:50

Israeli army. It was to

23:53

determine suitability for combat units.

24:00

The system that I designed broke

24:02

up the problem so that you

24:04

had six traits that you were

24:06

interviewing about. You were asking

24:09

factual questions about

24:11

each trait at a time, and

24:13

you were scoring each trait once

24:15

you had completed the questions about

24:18

that trait. Jumping in here,

24:20

because this is such a cool example, but it needs

24:22

a little explaining. Danny created

24:24

a system for interviewers to rate job candidates

24:26

on specific traits, like work

24:28

ethic, analytical ability, or integrity. But

24:31

interviewers did not take it well. They

24:34

really hated that system when

24:37

I introduced it. And they told me, I

24:40

vividly remember one of them saying,

24:42

you're turning us into robots. Danny

24:45

decided to test which approach worked best. Was

24:48

it their intuition or their ratings from the data?

24:51

The answer was both. Their

24:54

ratings plus their intuition. And

24:56

not their intuition at the beginning. Their

24:58

intuition at the end, after they did

25:00

the ratings. But as

25:03

you rate those six traits, and

25:06

then close your eyes

25:09

and just have

25:12

an intuition, how good do you

25:14

think the soldier is going to be? When

25:17

the data came back, it

25:19

turned out that that intuition

25:21

at the end was the

25:23

best single predictor. It

25:26

was just as good

25:28

as the average of the six

25:30

traits, and it added information. So

25:34

I was surprised. I

25:39

just was doing that as a favor to

25:41

them, letting them have intuitions. But

25:43

the discovery was very clear. And

25:46

we ended up with a system in

25:48

which the average of the six traits

25:51

and the final intuition had equal weight.
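
To make the arithmetic of that scheme concrete, here is a minimal sketch in Python. It is only an illustration, not the army's actual form: the 1-to-5 scale and most trait names are assumptions (work ethic, analytical ability, and integrity are borrowed from the examples given earlier in this conversation), and the one element taken from Danny's description is the equal weighting of the trait average and the final intuition.

```python
# A minimal sketch of the scoring scheme described above, not the army's actual
# procedure: average the structured trait ratings, then blend that average with
# the final "close your eyes" intuition at equal weight. Trait names and the
# 1-to-5 scale are illustrative assumptions.
from statistics import mean


def combined_score(trait_ratings: dict[str, float], final_intuition: float) -> float:
    """Equal-weight blend of the trait-rating average and the end-of-interview intuition."""
    trait_average = mean(trait_ratings.values())
    return 0.5 * trait_average + 0.5 * final_intuition


if __name__ == "__main__":
    ratings = {
        "work ethic": 4,          # example traits mentioned in the conversation
        "analytical ability": 3,
        "integrity": 5,
        "sociability": 4,         # remaining traits are placeholders
        "energy": 3,
        "reliability": 4,
    }
    print(combined_score(ratings, final_intuition=4))  # ~3.92
```

The ordering is the important design point in what Danny describes: the holistic intuition is recorded only after all the trait questions have been scored, so it is informed by the structured ratings rather than formed before them.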

25:54

It sounds like what you recommend

25:57

then concretely is for a manager to

25:59

make a list of the... skills and values that they're

26:01

trying to select on, and to

26:03

do ratings that are anchored on those dimensions.

26:06

So, you know, I might judge somebody's coding

26:08

skills if they're a programmer or their ability

26:10

to sell if they're a salesperson. And

26:13

then I might also be interested in whether they,

26:15

you know, they're aligned on our organizational values. And

26:17

then once I've done that, I want to form

26:20

an overall impression of the candidate because I may

26:22

have picked up on other pieces of information

26:24

that didn't fit the model that I had. I

26:27

think that's about right. It's such

26:29

a powerful step that I think

26:32

should bring the best of both worlds

26:34

from algorithms and human judgment. There's

26:37

something that's a little puzzling to me about it though,

26:39

which is why are managers

26:41

and people in general so enamored with

26:43

intuition? I think it's because people don't

26:46

have an alternative. It's

26:49

because when they try to reason

26:51

their way to a conclusion, they

26:53

end up confusing themselves. And

26:57

so the intuition wins

27:00

by default. It makes

27:02

you feel good. It's easy to do and

27:04

it's something that you can do quickly. Whereas

27:08

careful thinking in

27:11

a situation of judgment where there is

27:14

no clearly good answer, careful

27:16

thinking is painful. It's

27:19

difficult and it leaves you in a

27:21

state of indecision or in a state

27:23

of even if one option is better

27:25

than the other, you know that the

27:27

difference is not something you can be

27:30

sure of. Whereas when you

27:32

go the intuitive route, you'll end

27:34

up with overconfident certainty and feeling

27:37

good about yourself. So

27:39

it's an easy choice, I think. You

27:43

wrote about this topic at length in what

27:45

some have called your magnum opus, Thinking Fast

27:48

and Slow. I'm

27:50

wondering what you've rethought since you

27:52

published that book. There are things I

28:00

published in that book that were wrong. I

28:02

mean, you know, literature I quoted that didn't hold

28:04

up. Now, the

28:07

interesting thing about that is

28:09

that I haven't changed my

28:11

mind about much of anything, but

28:14

that is because changing your mind

28:16

is really quite difficult. Dan

28:20

Gilbert has a beautiful word for

28:22

that, unbelieving. And unbelieving things

28:24

is very difficult. So I find

28:27

it extremely hard to unbelieve aspects

28:31

of Thinking Fast and

28:33

Slow, even though I know that my

28:35

grounds for believing them are now much

28:37

weaker than they were. But

28:39

the more significant thing that

28:42

I have begun to rethink is

28:47

that Thinking Fast and Slow,

28:50

like most of the study

28:52

of judgment and decision making is

28:55

completely oblivious to individual

28:58

differences. And

29:00

all my career, I

29:03

made fun of anybody who was studying individual

29:05

differences. I say I'm

29:08

interested in main effects, I'm interested

29:10

in characterizing the human mind. But

29:13

it turns out that when you go into detail, in

29:17

those studies, it's

29:20

not that everybody is behaving like

29:22

the average of the study, that's

29:24

simply false. There are different

29:26

subgroups who are doing different

29:29

things. And life

29:31

turns out to be much more complicated than

29:33

if you were just trying to explain the

29:36

average. So the

29:38

necessity for studying individual

29:40

differences is, I think

29:42

the most important thing that I have rethought,

29:46

it doesn't have many implications for me because

29:48

it's too late for me to study individual

29:50

differences. And I wouldn't like doing it anyway,

29:52

it's not my style. But

29:55

I think there is much more

29:57

room for it than I thought there would be

30:00

when I was writing Thinking Fast and

30:02

Slow. Another

30:04

thing I wanted to ask you about is the choices

30:06

you make about what problems and projects to

30:08

work on. I'm

30:11

not a good example for

30:13

anybody. I really never had a

30:15

plan, more

30:17

or less followed my nose, and

30:20

I did many things that I shouldn't have done.

30:24

I wasted a lot of time

30:27

on projects that I

30:31

shouldn't have carried out, but

30:34

no, I've been lucky. Well,

30:38

I think that's probably an encouraging message

30:40

for a lot of us. And

30:42

the idea is this is an area

30:46

where there is gold and I'm going to look

30:48

for it. I mean, that's an

30:50

idea. And formulating

30:52

a new question, that's an idea

30:55

in my book. I'm going to use that. This

30:57

is an area where I think there might be

30:59

gold and I want to look for it. Such

31:02

a nice reframe. So

31:04

Danny, you mentioned your new

31:07

book, Noise. One of

31:09

my favorite ideas when I read Noise was the

31:11

idea of the inner crowd. And I wondered if

31:13

you could explain that. There've

31:15

been two lines of research by

31:19

Vul and Pashler and by Hertwig on

31:22

asking people the same question on

31:25

two occasions or in two different frames

31:28

of mind. And it

31:30

turns out that when you ask the same

31:32

question, like an estimate

31:34

of the number of

31:36

airports, when

31:38

you ask people the same question twice,

31:41

separated by some time, then

31:44

they tend to give you different answers.

31:46

And the average of the answers is

31:49

more accurate than each of them

31:51

separately, even in the case that

31:55

the first answer is more valid than the

31:57

second. And it's also the

31:59

case that the longer you wait,

32:01

the better the average is, the more

32:04

information there is in the

32:06

second judgment that you make. What

32:09

it indicates is clearly that

32:12

what we come up with when

32:14

we ask ourselves a question is

32:17

we're sampling from our mind. We're

32:19

not extracting the answer from our

32:21

mind. We're sampling an answer from

32:23

our mind. There are many

32:26

different ways that that sample could

32:28

come out. Sampling

32:30

twice, especially if you make

32:32

them independent, sampling twice is going to

32:34

be better than sampling once.
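
To see that sampling idea as arithmetic, here is a toy sketch in Python. The "number of airports" value and both guesses are invented for illustration, not taken from the studies mentioned; the only element carried over is that two of your own answers to the same question, ideally separated in time, get averaged.

```python
# A toy sketch of the "inner crowd" averaging described above. The true value
# and both guesses are invented; the point is only that the average of two of
# your own estimates of the same quantity can beat either estimate on its own.
def inner_crowd_estimate(first_guess: float, second_guess: float) -> float:
    """Average two of your own estimates of the same quantity."""
    return (first_guess + second_guess) / 2


if __name__ == "__main__":
    true_value = 19_700                 # hypothetical "number of airports"
    first, second = 15_000, 26_000      # two guesses made at different times
    average = inner_crowd_estimate(first, second)
    for label, guess in [("first", first), ("second", second), ("average", average)]:
        print(f"{label:>7}: {guess:8.0f}  |error| = {abs(guess - true_value):6.0f}")
```

This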

32:39

is one of the most practical, unexpected

32:42

decision-making and judgment perspectives

32:45

that I've come across in the last few years,

32:47

in part because it says, I don't

32:49

always need a second opinion if I

32:52

can get better at forming my own second opinions. I

32:56

think as we say in that chapter, sleep

33:00

over it is really

33:02

very much the same thing. That

33:04

is, sleep over it, just wait, and

33:07

tomorrow you might think differently. The

33:10

advice is out there. Reinforcing it

33:12

may be useful. Your

33:15

collaboration with Amos Tversky is obviously legendary.

33:18

There's a whole Michael Lewis book about

33:20

it. Is there a

33:22

lesson that you took away from that

33:24

collaboration that's informed either how you choose

33:26

your collaborators now or how you work

33:28

with the people on your teams? I

33:31

think that one really

33:33

important thing is

33:37

to be genuinely interested in what

33:40

your collaborator is saying. I'm

33:45

quite competitive. Amos was also quite competitive.

33:47

We were not competitive when we

33:50

worked together. The joy

33:52

of collaboration

33:55

for me always was that, and that

33:58

was more so with Amos

34:00

than with almost anyone else: I would

34:03

say something and he would understand it

34:05

better than I had. That's

34:07

the greatest joy of

34:10

collaboration. But in my other

34:12

collaborations, taking pleasure

34:15

in the ideas of your collaborator

34:17

seem to be very useful, and

34:20

I've been lucky that way. On

34:22

that note, almost anyone who's ever won

34:25

a Nobel Prize has complained that it

34:27

hurt their career. I've

34:30

wondered what the experience has been like for

34:32

you. It hurts

34:34

people's career if they're young. I

34:38

got mine when I was 68, and

34:40

for me it was a net plus. Why does

34:42

it get people in trouble if they get it earlier? There

34:47

are a variety of ways that this can happen.

34:49

In the first place, it's very destructive. People

34:53

start taking you more

34:55

seriously than they did, and hanging on your

34:58

every word, and nonsense like

35:00

that. If

35:04

you begin to take yourself too seriously, that's

35:07

not good. If you

35:10

take time away from your work

35:14

to do what you're invited to

35:16

do when you get a Nobel, which is a

35:18

lot of talking and a lot of talking

35:21

about things that you don't know much about, that's

35:24

a loss. Then if

35:26

it makes you self-conscious that everything that

35:29

you have to do has to be

35:31

important, that's a loss. There

35:33

are many different ways, I think, in which

35:36

getting a Nobel early is a bad idea.

35:39

It's not the best. I

35:42

was at a good age to get it because

35:44

I had some

35:46

years left in my career and

35:48

it made many things much easier

35:50

having a Nobel. It

35:54

made the

35:56

end of my career more productive,

35:58

I think, and

36:00

happier than it would have been

36:02

otherwise. Rethinking

36:11

is part of the TED Audio Collective. The

36:14

show is hosted by me, Adam Grant, and

36:16

produced by TED with Transmitter Media. The

36:18

team includes Colin Helms, Retticoat, Dan

36:20

O'Donnell, Joanne Deluna, Grace Rubenstein, Michelle

36:23

Quinn, BanBan Cheng, and Anna

36:25

Phelan. This episode was

36:27

produced by Constanza Gallardo and Jessica Glazer.

36:30

Our show is mixed by Rick Watt, with original music

36:32

and additional

36:35

production. You

36:47

ever feel like your laptop just keeps

36:49

going, but you are

36:51

completely drained? I think a lot of

36:53

us don't realize how much pain we

36:56

live in because of our interactions with

36:58

computing.
