Magical Overthinking with Amanda Montell

Released Monday, 29th April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:04

There is like a beauty in magical

0:06

thinking. It can really help us cope

0:08

and tap into our imagination, and

0:10

it's really when you combine it

0:12

with the overthinking of the

0:14

information age that we're sort of

0:16

displacing those instincts onto a

0:18

more complex and cerebral world. Amanda

0:23

Montell is the host of

0:26

the podcast Sounds Like a

0:28

Cult, as well as the

0:30

author of three books, including

0:33

Cultish and The Age of

0:35

Magical Overthinking: Notes on

0:37

Modern Irrationality, which we will

0:40

be talking about today. Each

0:42

chapter, or essay, in this

0:44

brand new book looks at

0:47

a different cognitive bias, those

0:49

irrational thought processes of our

0:51

ancient, outdated hunter-gatherer brains

0:54

that no longer serve us

0:56

in our modern world. For

0:58

this episode, Amanda covers

1:01

many of these cognitive biases

1:03

and explains why they were

1:05

important for the survival of

1:07

early humans, as well as

1:10

how they currently manifest in

1:12

a technologically advanced society that

1:14

has very little in common

1:16

with the distant past. Whether

1:19

personal or cultural or

1:21

political, these thinking patterns affect

1:23

us all in tremendous

1:25

ways and Amanda explains

1:27

what we can do

1:29

to understand, combat, and

1:31

maybe even harness these

1:33

cognitive biases to

1:35

create a better life

1:38

and a better world.

1:40

I'm your host, Chelsea

1:42

Webber Smith and this

1:44

is American Hysteria.

1:52

Amanda, I am so thrilled that

1:54

you are on our show right

1:56

now! I finished your book this

1:59

morning, and I thought it

2:01

was absolutely fantastic and so vital

2:03

to everything that's happening right now

2:05

and I'm so glad you're here.

2:07

Ah, thank you so much

2:09

for having me. The

2:11

thrill is mine.

2:23

So I think the reason that

2:25

I call it vital is that

2:27

on our show we do talk a lot

2:29

about cognitive biases, and there's

2:32

this part missing from, I

2:34

think a larger conversation in

2:36

our world that has to

2:38

do with these ways that

2:40

our brains work without us

2:42

truly being aware because we

2:44

still have the same brains

2:46

that we had before civilizations,

2:48

which is actually a really big

2:50

deal, and it's really something

2:52

that maybe we don't wanna

2:54

think about because we hope

2:56

we're beyond things like that.

2:58

But basically your book has tackled

3:00

a whole bunch of these different

3:02

ways that our brains betray us,

3:04

in a sense. So I would

3:06

love to just start with: what

3:08

made you want to write a

3:10

book? You at some point

3:12

say in your book that

3:14

this is the book you always

3:16

wanted to write. That feels

3:18

like a big deal, right? So

3:20

what was it that got you

3:22

writing this and why did

3:24

it feel so vital to you

3:27

to write? Yeah, well, oh gosh,

3:29

I mean, I dreamt of writing books

3:31

from a pretty young age. I think

3:33

I came to the terrifying conclusion that

3:35

I wanted to try to pursue a

3:37

career as an author. When I was

3:40

like nineteen. I remember being, like,

3:42

very afraid of that ambition and

3:44

what it would demand of me.

3:46

Like, you know, you're

3:48

even a child up until

3:51

college, and you're like, everything's pretend, and

3:53

then you're like oh no these like

3:55

lulu visions for my life that

3:57

I have had, I actually want to

3:59

write. That's so scary.

4:02

But so this

4:04

is my third book and it really

4:06

does feel like a culmination of

4:09

everything that's ever felt important to

4:11

me. I mean, I keep

4:13

joking that this book feels like my

4:15

goodbye tour. Like everyone I've ever cared

4:18

about, like every piece of writing that's

4:20

ever meant anything to me has a

4:22

place in this book. But

4:24

the sort of thesis of it crystallized

4:26

for me over the past few years

4:28

as I was just contemplating this

4:30

notion that despite living in the

4:33

information age, life only seems to

4:35

be making less sense. And it

4:38

certainly doesn't seem to be feeling

4:40

any better. And then as

4:42

I was researching my last book, Cultish,

4:44

which is about the language of cults

4:46

from Scientology to SoulCycle, so the

4:48

wide spectrum of groups, I

4:51

kept coming across all this really

4:53

fascinating psychology and behavioral economics research

4:55

that made mention of cognitive biases.

4:57

Some of the more famous ones

4:59

like confirmation bias and the sunk

5:02

cost fallacy as part of the

5:04

explanation for why cult followers

5:07

would say stay in

5:09

the Moonies for longer than made

5:11

sense to anyone else, including them,

5:13

you know, but I couldn't help

5:16

but notice how these biases also

5:18

illuminated so many of

5:20

the everyday behaviors that I was seeing in

5:22

the culture at large, and also some of

5:25

my own confounding choices

5:27

that I could never even explain to

5:29

myself, like my decision in

5:31

my early 20s to stick

5:33

out a romantic relationship that really felt like a

5:35

cult, a cult of one sort of dynamic that

5:38

I kept justifying to myself and hoping would get

5:40

better, even though there was no evidence to suggest

5:42

that was true. Or, you know,

5:44

my tendency to engineer online

5:46

enemies based on conflicts I'd

5:49

invented in my own head.

5:51

Yeah, and so

5:53

there are hundreds of cognitive biases that

5:55

have been described over the years for

5:58

the book, I essentially chose my

6:00

eleven, the eleven that I

6:02

thought felt the most urgent, the

6:04

most relevant to what I was

6:06

observing in the zeitgeist, ones

6:08

that I felt like could really

6:11

relate to many of the observations

6:13

that I'm sure readers were also

6:15

noticing, like you know,

6:17

extreme cycles of celebrity worship and

6:19

dethronement. Or, you know,

6:21

nostalgia, how that works for

6:23

better and for worse in our

6:25

minds or like our embrace of

6:28

Instagram manifestation gurus. So the book

6:30

was just like this opportunity to

6:32

really follow my curiosities and play

6:34

and blend genres. And it was

6:36

just like an absolute blast to

6:38

write and very cathartic. I

6:40

imagine it was, because you also

6:42

put yourself in the hot seat

6:44

which, you know, I

6:47

think is probably one of the

6:49

best ways to invite people into

6:51

the conversation is to say I

6:53

do this too, I'm admitting it has

6:55

happened, you know, I have had

6:57

these experiences that I'm going to

6:59

share with you to illuminate this

7:02

particular bias and you do that

7:04

so masterfully. And so

7:06

I wanted to know if there

7:09

was one of these particular

7:11

biases that you feel like

7:13

you most related to as a

7:16

chronic overthinker yourself.

7:18

Of course your book is about

7:20

overthinking. Like, when you were

7:23

growing up? Oh, as I was

7:25

growing up. Ah, okay, so as

7:27

a young person I feel like

7:30

proportionality bias really dominated a lot

7:32

of my relationship with shame.

7:34

So proportionality bias is the second

7:36

bias I tackle in the book.

7:38

It is the theme of

7:41

a chapter called I Swear I

7:43

Manifested That and I talk about

7:45

proportionality bias in the context

7:47

of arguing that manifestation is its

7:50

own kind of conspiracy theory. So

7:52

I will sort of explain what

7:54

I mean. Proportionality bias

7:56

is this proclivity to think that

7:58

big events or even big feelings

8:01

must have had a big cause.

8:03

It just makes proportional sense to

8:05

us to misattribute cause and

8:08

effect in this way. So you

8:10

know, on a grand scale, it

8:12

would be, you know, a global pandemic strikes.

8:14

This is such you know, an

8:17

immense calamity, it could not have

8:19

been simply the result of a

8:21

bunch of small random misfortunes. A

8:23

government must have engineered it on

8:25

purpose. That's the only way this

8:27

seems to make sort of cosmic

8:29

logic. And

8:32

I also argue in the book

8:34

that manifestation, while positive and rooted

8:36

in some truth, that like your

8:38

mindset can totally influence outcomes, is

8:41

also a kind of misattribution of cause

8:43

and effect. This idea that you know if

8:45

you only vision board hard

8:47

enough then you know you can

8:50

reclaim some agency over your life

8:52

during times of tumult or

8:54

unpredictability. And that can have

8:56

some pretty nefarious consequences. And I've

8:59

noticed in the current age that

9:01

a lot of these online

9:03

manifestation gurus who are claiming to

9:06

have this like bespoke manifestation technique

9:08

that you can sign up for

9:10

thirty dollars a month to access

9:12

have sent some followers down a

9:15

much more pernicious, conspiratorial, QAnon-y

9:17

rabbit hole, and some of

9:19

the psychology and behavioral economics at

9:21

the root is really the same,

9:23

but I find and I argue

9:25

in this chapter that anybody who's

9:28

ever attributed a larger than

9:30

life cause that was actually unrelated

9:32

to the larger than life

9:32

effect that they were trying

9:36

to parse out has a pinch

9:38

of conspiracy theorist in them. In

9:40

really come Down Malaysia, you and

9:42

your phone is that working or

9:45

when your car is acting. up

9:47

and you're like why are you doing

9:49

this to me is the authentic now

9:51

he has a human intention that is

9:54

conspiring against me it's you know we

9:56

all have this impulse to like project

9:58

this overly simplistic intentionality onto,

10:01

you know, problems that

10:03

we encounter that don't naturally make sense

10:05

to us or don't have like a

10:07

sort of clear rationale behind them. And,

10:10

you know, kids do this all

10:12

the time, like kids are sort

10:14

of the OG magical thinkers coming

10:16

up with, you know, like

10:18

superstitions and play pretend games. And

10:20

there is like a beauty in

10:22

magical thinking as well, like it

10:24

can really help us cope and

10:26

tap into our imagination and things

10:29

like that. It's really when you combine

10:31

it with the overthinking of the information

10:33

age that it starts to get kind

10:35

of painful and sinister and dangerous and

10:37

those things. But it's almost like magical

10:39

thinking anyone who has OCD will will

10:42

relate to this. Obviously, you know, you

10:44

think that like rituals will result in

10:46

protecting your family. And I think, you

10:48

know, a lot of kids and I

10:50

remember as a kid engaging in certain

10:52

rituals, hoping that they would have, you

10:54

know, this big effect on my life

10:57

or in the safety of my loved

10:59

ones, you know, definitely. And I

11:01

don't engage in those rituals in

11:03

the same way anymore. But yeah,

11:05

growing up, I definitely found myself

11:08

doing a ton of that. And I

11:10

still have proportionality bias in my daily

11:12

life. Like we as human beings adapted

11:14

for paranoia. Like when you think about

11:17

human beings living in a physical

11:19

world without technology or modern culture,

11:22

like if something huge happened in

11:24

nature, like, I don't know, like

11:27

a branch falling, like it was

11:29

probably from a humongous gust of

11:31

wind, you know, like, and

11:34

we did adapt for a world

11:36

full of, you know, hostile intentions

11:38

and threats and things. It's just

11:40

that we're sort of displacing those

11:42

instincts onto a more

11:44

complex and cerebral world. Absolutely,

11:47

absolutely. And, you know,

11:49

something that really kind of got in

11:52

my head about the proportionality bias is

11:54

that I think a

11:56

lot of our culture

11:58

around therapy and psychology. You

12:01

know we have these big feelings kind

12:03

of no matter who you are a

12:05

lot of us have big anxiety we

12:07

have, I mean, problems with over

12:10

thinking. We have, you know, the sense

12:12

that something huge must have caused these

12:14

things. And yeah, I was part of

12:17

this, ah, self-actualization group when

12:19

I was a kid into my teen

12:21

years and there was a huge emphasis

12:24

on, like, we've got to go back and

12:26

find it, even if you don't remember it,

12:28

we've got to go back and

12:30

find that big thing that caused all

12:33

of your problems right now. And a

12:35

lot of people there were just like

12:37

well, I don't, I don't have that

12:40

big thing, and it felt like we've

12:42

gotta find that big justification for these

12:44

big feelings. And that feels kind

12:46

of similar and a lot of ways

12:49

to this but just kind of projected

12:51

back into the past. Yes that's such

12:53

a good point, because even, personally,

12:55

I feel that. Like, even

12:58

throughout this book launch process, for

13:00

a book about overthinking, it's

13:02

stressful, like, to have to

13:04

talk about. I don't want to

13:08

jinx it, like, really, really, I don't

13:10

want to jinx it, but I'm, I

13:12

feel, I feel really good today,

13:14

knock on wood. But yeah, but until

13:16

the book came out you know I

13:18

was just feeling like anxious all the

13:21

time and like crying like why am

13:23

I? Writing

13:25

the book felt really good and promoting

13:27

the book felt really bad and I was

13:29

like, you know, I have all of

13:31

these big feelings, and, like, where

13:34

did it all come from? I

13:36

was thinking like, I don't

13:38

deserve to feel this way because you

13:40

know when I was thinking back on

13:42

like all of the contributing factors, it

13:44

was like a bunch of small things

13:47

and I was like, oh, that hasn't

13:49

earned me the right to, like,

13:51

need these more significant

13:53

interventions. Like, you know, it wasn't

13:55

like that one big trauma. And you're

13:57

totally right, like that is like my

13:59

proportionality bias at work.

14:01

Yeah, these biases,

14:04

these are wild. Yeah.

14:06

Okay. So is there

14:10

one particular bias that was most

14:12

kind of surprising to you when

14:14

you started to read about it

14:16

that kind of shook you a

14:18

little bit to your core? Yeah,

14:21

well, there were a lot of them.

14:24

But there is a chapter in

14:26

the book that covers something called

14:29

overconfidence bias, which is

14:31

defined by a few qualities, some

14:33

of which are the near

14:35

universal tendency to over

14:38

attribute positive outcomes to

14:40

ourselves to like overestimate

14:43

our knowledge or capabilities

14:45

in certain arenas like

14:47

driving or sex or

14:50

cooking. And

14:52

going into that chapter, I originally

14:54

wanted to explore the Dunning-Kruger

14:56

effect, which is a phrase that

14:58

I had come across multiple

15:00

times, which describes just sort

15:03

of like colloquially the tendency

15:05

for people who know the

15:07

least about a subject to

15:09

most overestimate their expertise. And

15:12

it's a sort of license for a

15:14

lot of people to kind of feel

15:16

smug like, you know, only the dumbest

15:19

people think that they're so smart and

15:21

like the smartest people know their limits

15:23

and the Dunning-Kruger effect is often

15:25

called upon by pundits and cultural commentators

15:27

to lambast, you know, like ding

15:29

dongs on the internet. And

15:33

then I started looking into the

15:35

Dunning-Kruger effect and I very

15:38

quickly came across a sort of

15:40

like post mortem on this original

15:42

study that was co authored by

15:45

David Dunning, a social scientist, and

15:47

it was revisiting some sort of

15:49

like, salacious, conventional wisdom

15:51

that had been spread based on

15:54

this study. It's actually not that

15:56

people with the least expertise are

15:58

the only ones who overestimate

16:00

themselves, everybody does it. Absolutely everybody,

16:02

no matter what their level of

16:05

expertise is in a certain field,

16:08

will likely, you know,

16:10

sort of aggrandize their

16:12

morality or their expertise

16:14

across various subjects. It's

16:16

just that people who are really, really

16:18

smart or well educated in one subject will

16:20

tend to do that over like a

16:23

smaller range of subjects. But

16:25

everybody alive overestimates themselves. And this

16:27

was a really tough pill to

16:29

swallow because I was like, surely

16:32

this bias cannot apply to me.

16:35

Like I am a normal person. I

16:37

hate myself. But yeah. But

16:41

then it's like, well, actually,

16:43

there is a sort of

16:45

positive side to over crediting

16:48

yourself with certain outcomes or

16:50

mistaking another person's knowledge for

16:52

your own. That really

16:54

reflects the idea that human beings

16:56

are so, so collaborative and we

16:59

are so good at making things

17:01

happen together that we

17:03

tend to sort of like

17:06

forget where our knowledge ends

17:08

and another person's begins. Like

17:10

we have these fuzzy cognitive

17:13

brain barriers and that really

17:15

helps us drive progress. You

17:17

know, it's like we are

17:19

constantly overreaching ourselves. And

17:22

there's a certain element of

17:24

that that's really good. Like

17:26

if everyone who engaged in

17:28

grassroots activism had to have

17:31

a PhD in global

17:33

politics, like I don't think anything

17:35

would ever be accomplished. And you know, if

17:37

I had to know exactly what I wanted to

17:40

say when I set out on a writing project,

17:42

like I would never write books. So you know,

17:44

so many of us really, like, push

17:47

past our limits in

17:50

a way that's sometimes really

17:52

lovely and sometimes absolutely catastrophic,

17:54

as we can see in

17:57

so many overconfident populist leaders.

18:00

So finding out that the Dunning-Kruger effect didn't

18:02

say exactly what we thought it did, and

18:04

also finding out that overconfidence is present in

18:06

all of us was kind of shocking. And

18:09

I like the idea that people could

18:12

be wrong about a cognitive bias and

18:14

to use their own cognitive bias to

18:16

then twist that other cognitive bias to

18:19

misunderstand it. Totally. Feels

18:21

very us. Yes. So us. More.

18:25

After this. You

18:29

ever notice how finding time

18:31

and energy to do the

18:33

most basic human necessity, eat

18:36

literal food, has become just

18:38

another exhausting task jammed into

18:41

our increasingly inhuman schedules? Well,

18:44

your spring can be a little

18:46

more stress free with Factor. Factor

18:50

will provide you with

18:52

delicious, never frozen, ready-to-eat

18:54

gourmet meals that are

18:57

chef-crafted, dietician-approved, and ready-to-eat

18:59

in just two minutes. Each

19:02

week, you get to choose from

19:05

a menu of 35 options to

19:07

create your perfect breakfast, lunch, or

19:09

dinner with absolutely no prepping, cooking,

19:12

or cleaning up. And

19:14

Factor makes sure you get exactly

19:16

what you want. You can tailor

19:18

deliveries to your schedule and customize

19:21

how many meals you want each

19:23

and every week, and you can

19:26

pause anytime. So just head to

19:28

factormeals.com/AmericanHysteria50 and

19:31

use code AmericanHysteria50 to

19:34

get 50% off your first box plus

19:36

20% off your next box. That's

19:40

code AmericanHysteria50 at

19:43

factormeals.com/AmericanHysteria50 to get

19:45

50% off your

19:48

first box plus 20% off

19:50

your next box while your

19:53

subscription is active. Check out

19:55

Factor today. Cat

20:00

owners deserve better than any old fashioned

20:02

litter. That's why I teamed up with

20:04

scientists and veterinarians to create PrettyLitter. Its

20:06

innovative crystal formula has superior odor control

20:08

and weighs up to 80% less than

20:11

clay litter. PrettyLitter even monitors health by

20:13

changing colors to help detect early signs

20:15

of potential illness. It's the

20:17

world's smartest kitty litter. Go to prettylitter.com

20:19

and use code spotify for 20% off

20:21

your first order and a free cat

20:23

toy. Terms and conditions apply, see site

20:25

for details. Your

20:28

brain needs support and new OLLY

20:30

brainy chews are a delightful way to take

20:32

care of your cognitive health. Made with

20:34

scientifically backed ingredients like Thai ginger, L-theanine,

20:36

and caffeine. Brainy chews support healthy brain

20:39

function and help you find your focus,

20:41

stay chill or get energized. Be kind

20:43

to your mind and get these new

20:45

nootropic chews at olly.com. That's olly.com. These

20:48

statements have not been evaluated by the

20:50

Food and Drug Administration. This product is

20:52

not intended to diagnose, treat, cure or

20:55

prevent any disease. And

20:58

now back to the show. All

21:01

right, if we've done kind of the

21:03

confirmation bias that unnerved you, what is

21:05

the one that comforted you the most

21:07

when you learned about it? Ooh, great

21:09

question. Okay, so the

21:12

last chapter in the book talks about

21:14

a bias called the IKEA effect, which

21:16

is kind of fun. And

21:19

it describes our penchant

21:21

to overvalue products that

21:24

we helped create, especially if

21:26

we made them with our own hands. And

21:28

I found this really quaint.

21:31

Yeah, it's a fairy tale. It's

21:33

just like so delusional, but

21:35

also adorable to think that

21:37

like just because we contributed

21:40

to someone's crowdfunding campaign or

21:42

like because we helped build our

21:44

own piece of IKEA furniture, that

21:46

that thing is like so much

21:48

more beautiful and so much more

21:50

valuable and that we want to

21:52

like show it to our friends

21:54

and overprice it when we sell

21:56

it on Facebook Marketplace or when

21:58

we part with it or

22:01

whatever. And I found that so

22:03

interesting, especially in the context of

22:05

this time when, you

22:08

know, technology is moving so so

22:10

exponentially quickly. And, you know, there's

22:12

a lot of fear surrounding AI

22:15

taking the place of artists and

22:17

us not being able to work

22:19

with our hands in this creative,

22:22

like profoundly human way anymore, because

22:24

like the robots might do it.

22:26

And so like working through some

22:29

of that, and also sort of

22:31

taking the temperature down on some of

22:33

that alarmism and fear was a really

22:35

cool exercise for that last chapter. Do

22:38

you want to tell us about the

22:40

precious thing that you made? Oh,

22:44

which one I mean, I'm talking about

22:47

your cushion, the cushion. Yeah, I'm looking

22:49

at it right now. And

22:51

actually, I feel more proud of it than ever. So

22:57

yeah, I mean, I do not identify

22:59

as particularly crafty, like I don't think

23:02

I'm really patient enough for crocheting

23:04

and sewing and like all these

23:06

things that would reflect so well

23:08

on me. And that would allow

23:10

me to like really fulfill my

23:13

cottagecore homesteader fantasy. Yeah, I don't

23:15

I don't work well with my

23:17

hands. But during the pandemic,

23:19

I did or during lockdown, I

23:21

should say, I did sew something

23:26

very special. I

23:28

had bought a big girl office

23:30

chair on Facebook Marketplace. It's

23:32

very aesthetic. It's like beech

23:35

wood. It's super like Scandi,

23:37

Japandi. I'm obsessed with it,

23:39

but not comfortable for the

23:41

derriere. So I

23:43

decided to make an attempt

23:45

at sewing my own seat

23:48

cushion. And so I made

23:50

this really janky green, personal

23:53

pizza size, you know,

23:55

circular seat cushion stuffed with the innards

23:57

of a discarded dog toy, might I

23:59

add. Beautiful. And I am

24:01

so proud of it. Like I

24:03

show it to everybody. I think

24:06

it is the most important object.

24:09

I'm like this

24:11

book, who cares? This seat cushion

24:13

is all I want to talk

24:16

about. The pinnacle. Yeah. And it

24:18

was just like, so funny, like

24:20

read about this cognitive bias and

24:22

be like, oh, this applies to

24:24

something I did literally two months

24:26

ago. It's like that with

24:28

everything I write about, whether it's like,

24:31

cultishness or cognitive biases,

24:34

it always like, generates

24:37

a sort of appreciation in me

24:39

for how mystical and communal and

24:41

silly and quaint human beings are.

24:43

Like, so,

24:45

speaking of not knowing where our

24:48

knowledge ends and another person's begins,

24:50

we actually don't even know where

24:52

our knowledge ends and the computers

24:54

begin. Like this is actually a

24:56

described phenomenon called the Google effect.

24:59

We fail to remember the

25:01

information that we learn via web search,

25:04

but we also fail to recognize that

25:06

we have forgotten it. Like, we think

25:08

we know everything we've ever googled. And

25:10

that is so silly.

25:13

And but I think it also

25:15

generates a certain amount of ego

25:17

in us where like, everything is

25:19

knowable. I am so

25:21

smart because of Google. But actually,

25:24

like we are not meant

25:26

to, like, be able to, like,

25:28

hold that much information in our minds. We

25:30

have limited time, limited cognitive

25:33

resources, limited memory storage, we are

25:35

the same sort of like

25:37

stone age brains that we've always been.

25:39

And yet we're not aware of that.

25:42

So there is this ego that like

25:44

we can know everything under the sun.

25:47

And yet, we're simultaneously

25:49

like sewing. I'll speak for myself.

25:51

I'm sewing these like shoddy seat

25:53

cushions and thinking like I'm Michelangelo.

25:56

So humans are so silly. We

26:00

sure are, yes, we very much are. And

26:02

okay, so one of my

26:05

favorite moments in the book is when

26:07

you talk about how humans, though

26:10

we've always had fear and

26:12

we've always had a certain sense of

26:15

anxiety, it really wasn't until the dawn

26:17

of reason and the middle ages and

26:19

those times that we really started to

26:22

be, our

26:24

minds were robbed of faith, I

26:26

think is how you put it.

26:28

And you also quoted Dennis Tewary,

26:30

who wrote that the new science

26:32

empowered mind that dared to know

26:34

was also quote, a vulnerable mind

26:36

robbed of the medieval certainty of

26:38

faith. And I think on our

26:41

show, that's something I like

26:43

to talk about a lot because we've

26:45

kind of written off faith as this

26:47

sort of old timey thing that

26:51

we don't really need anymore in the age

26:53

of science and technology and

26:55

progress. But really it leaves

26:57

us very vulnerable

26:59

to overthinking and to

27:02

just kind of this fracturing that

27:05

happens when, again, you have all

27:07

the knowledge in the world, essentially

27:09

at your fingertips, but we kind

27:11

of lack the sense

27:13

of community and shared myth

27:15

that used to unite groups

27:17

of people and kind of

27:19

be this thing that we could fall

27:22

back on. And so I would love

27:24

if you would give us a little

27:26

bit about your relationship to faith and

27:29

what faith can be in the

27:31

information age. Ooh.

27:33

Yeah. Yeah, I mean,

27:35

listen, I would not trade now for

27:39

the middle ages. Probably not.

27:42

Anyway. Yeah, I think, you know,

27:44

I don't wanna demonize the

27:46

democratization of information. It

27:49

is so much better to have

27:51

access to knowledge and

27:55

for it not to be kept on this

27:57

high throne, like just for queen, queen.

28:00

I always end up saying Queen instead of

28:02

King and Queen, which I actually love. It's

28:04

like the gender-neutral version for royalty. Just

28:08

kept a high throne for royalty and

28:10

scribes back in the day

28:12

when almost no one could read. Like,

28:15

yeah, sure, probably no one suffered from

28:17

information overload the way that we do

28:19

now, but that was worse. Of

28:22

course, yeah. And yet,

28:24

yeah, there is a

28:26

sense of security and

28:28

communality and ritual in

28:31

these safe communities that can be

28:33

really healing as long as you

28:35

can participate within the space of

28:37

ritual time. That's a concept that

28:39

I talk about in my book,

28:41

Cultish. I learned about this

28:43

concept of ritual time from a writer

28:45

named Gary Eberl who wrote a book

28:47

called Dangerous Words. And he talks about

28:50

how ritual and metaphysical

28:52

belief and things like that

28:55

can be really healing and

28:57

awe-inspiring as long as they

28:59

take place in this

29:01

specific space and time where there's a

29:04

ritual to signify you're entering that space

29:06

and time. There is a ritual to

29:08

signify that you're exiting it, and it

29:11

doesn't follow and dictate in an

29:13

absolutist or dogmatic way every decision

29:15

that you make throughout your entire

29:17

life. And that's where things get

29:19

dangerous. And I think as we

29:21

confront so much chooser's paradox in

29:23

our lives right now, like there

29:25

is at least the illusion of

29:28

like endless possibilities for where life

29:30

could go, what job you could

29:32

have, what hair color you could

29:34

have. In addition to this real

29:36

sense of isolation that we feel

29:38

in a culture of like digital

29:40

age capitalism and that

29:42

pressure to know everything under the sun, this

29:44

kind of lie that says

29:47

that everything is knowable, all of

29:49

those things combined have created like

29:51

a really tense cultural environment. And

29:54

that can make the space for

29:56

sort of secular religions to become

29:59

really... fanatical and

30:01

cult-like. Yes, faith in

30:03

traditional religion is declining

30:05

in so many places all

30:07

over the world, and yet

30:10

that craving for meaning and

30:12

connection and ritual is going

30:14

nowhere. And so we are

30:17

putting our faith in, you

30:19

know, Reddit forums or pop

30:21

standoms. And sometimes those interactions

30:23

get really, like, crusades-level brutal.

30:26

And that can be freaky.

30:28

But yeah, I know I said

30:30

it in a way just a moment ago.

30:32

But like, every time I look into one

30:34

of these topics, whether it's cults or

30:36

cognitive biases, I find myself wanting to

30:38

engage in that stuff more. I

30:41

know writing a book about cults

30:44

and how cultishness shows up in

30:46

everyday life just kind of made

30:48

me want to tap into my

30:50

mysticism a little bit more within

30:53

the space of that ritual time.

30:55

Because yeah, I don't want to

30:57

like completely write off irrationality or

30:59

superstition. Like, that stuff is so,

31:02

so human and can really help

31:04

us during certain times. Like, no

31:06

one has the answers to the

31:08

most existential questions in life. And

31:10

so like, whatever you got to

31:12

do to like cope with life

31:15

on earth, like I, I think

31:17

like, it sometimes makes sense to

31:19

do things that don't make sense,

31:21

if that makes sense. But within this

31:23

sort of protected space where you can

31:26

have this kind of skeptical twinkle in

31:28

your eye. I don't know

31:30

if that answers your question. Yeah, it

31:32

absolutely does. And I mean, you've done

31:34

a lot of work around cults, as

31:36

you mentioned, your last book

31:39

was about cults, you have a podcast

31:41

around things that approximate cults. And I

31:43

was curious too, about when you're

31:46

talking about what we were just

31:48

talking about, you know, having to

31:50

have this set time where you

31:52

get to practice ritual, because otherwise,

31:54

it's going to take over your

31:56

entire life, which often means that

31:58

some kind of charismatic. leader

32:00

is going to be taking over your

32:02

life. And what do

32:04

you think is the cognitive bias

32:06

that is most responsible for the

32:09

extremes that we go to with

32:11

this type of overthinking? What is

32:13

the bias that pushes us into

32:16

things that are cults or

32:19

approximate cults? Yeah, well, oftentimes

32:21

a whole bunch of these

32:23

biases are working together at

32:26

once, you know, like the first one that came

32:28

to mind when you asked the question was

32:30

confirmation bias. This is the cognitive

32:32

bias that gets the most play

32:35

in, like, headlines and things like

32:37

that. But it's our tendency to

32:39

only seek out and identify and

32:41

remember information that validates what we

32:44

already believe while, you know, ignoring,

32:46

discarding, forgetting information that contradicts it. And

32:49

yet, you know, that that

32:51

bias almost never works alone

32:53

because, you know, if someone

32:55

else is right about something

32:58

that will trigger our zero-sum

33:00

bias, or our tendency to think

33:02

that another person's gain inherently means your

33:04

loss. And so if another person is

33:06

a little bit right about something that

33:08

can feel like it's taking away from

33:11

you being right about that thing. And

33:13

so then you start feeling really competitive

33:15

and you dig your heels in, you're

33:17

like, no, really, I'm correct. And you

33:19

start cherry picking even more information to

33:21

defend that. I came across this really

33:24

sort of startling fun fact that the

33:27

increase in science literacy has

33:30

not actually helped us become

33:32

better at identifying real facts.

33:34

It's just made us better

33:36

at defending our existing beliefs

33:39

with cherry-picked science. So

33:41

yeah, yeah. So that's kind

33:43

of horrifying. Yeah,

33:45

I think the information age is

33:47

causing the global attention span to

33:50

shrink. We are not

33:52

necessarily getting better at thinking

33:54

critically all on our own,

33:57

just because there's more to critically think about.

33:59

And and confirmation bias is

34:01

this really, really powerful

34:03

cognitive inclination because it just

34:05

helps us make efficient decisions.

34:07

It's like if we didn't

34:09

have confirmation bias, we would

34:11

agonize over every tiny life

34:13

choice. We would dwell on

34:16

questions from like, should I have ordered

34:18

a salad instead of fries with my

34:20

lunch to like, should I have gone

34:22

to a cheaper university? It's like we

34:24

constantly are looking for evidence that we

34:26

made the right choice, that we are

34:29

currently making the right choice, and that

34:31

just helps us move forward and

34:33

survive. However, as the

34:35

world becomes more complicated, as

34:37

our problems become more abstract,

34:40

it's just ever more important to become

34:42

aware of those things. And while

34:45

we are pretty bad at changing

34:47

other people's minds with facts and

34:50

data, studies suggest that

34:52

we are much better at changing

34:54

our own minds. And so, yeah,

34:56

just, I mean, I found that

34:58

becoming aware of confirmation bias in

35:00

particular in myself has helped

35:03

to a degree, at least

35:05

it's like opened up conversations

35:07

about it. And I just

35:09

always try to learn

35:11

to stomach cognitive dissonance a little

35:13

bit more than I otherwise would.

35:15

I feel like a responsibility to

35:17

do that, but it

35:20

is very difficult. Yeah, it certainly

35:22

is. And so you

35:24

also talk about in the

35:26

cult vein that there is

35:28

such a thing as a cult of one

35:31

in a toxic relationship. I think there

35:34

are, you know, it could be

35:36

romantic, it could be parental, but

35:38

that there is this propensity in

35:40

many of us to enter into

35:42

those types of relationships. And I

35:44

think one of the biases that

35:47

you mentioned having to

35:49

do with that is the sunk cost

35:51

fallacy. So would you give us a

35:53

little overview of the sunk cost fallacy

35:55

and maybe how it plays into our

35:58

culture in a bigger way? Yeah. So

36:00

the sunk cost fallacy. This is

36:02

another one that's a little bit

36:05

more famous, but this one describes

36:07

our tendency to think that resources

36:09

already spent on an endeavor, money,

36:11

time, but also emotional resources like

36:13

hope or secrets justifies

36:16

spending even more. So, you know,

36:18

a really low stakes example would

36:20

be like I am playing poker

36:23

I've already, you know, put so much

36:25

on the table I can't fold now

36:27

even though my cards are bad or

36:30

I don't know if anyone has ever had

36:32

the experience of you know Getting a hundred

36:34

episodes into a TV show. You don't even

36:36

like it anymore But you're like, I'm this

36:38

far in like I have to finish it

36:41

Those are some scenarios that don't really matter that

36:43

much. Well, I guess depending on if you're a

36:46

high-stakes gambler, then it could be a lot of

36:48

money. But I

36:50

found that it really can apply to

36:52

our personal relationships as well And this

36:55

is the most sort of personal chapter

36:57

that I write in the book about

36:59

my own experience with the

37:01

most irrational choice I think I've

37:03

ever made was the decision to

37:05

stay for many many years in

37:07

a romantic relationship That really felt

37:09

like a cult and the worse

37:12

it got the more I told

37:14

myself like commit harder Like surely

37:16

the win that you were promised

37:18

is just around the corner like

37:20

any day now It'll get better

37:22

even though there was only evidence

37:24

to suggest the opposite But it

37:26

really goes back to this loss

37:28

aversion that is very natural to

37:30

us humans And also the idea

37:33

that like again, we are social

37:35

creatures and it behooves us to

37:37

seem predictable to ourselves

37:39

like it doesn't feel good to

37:42

recognize and cop to the idea that

37:45

you made a bad bet on yourself

37:47

that you, you know, were

37:49

not aware of what would make you

37:51

happy, that you made a choice that

37:53

caused you great suffering and that

37:55

you know, you've posted so many happy-looking

37:58

photos of you and your partner online and

38:00

yet it's not helping

38:02

and you don't want to create

38:04

the impression that you're some kind of loose cannon. And so

38:07

in the digital age in particular, I

38:09

find that the sunk cost fallacy really

38:11

motivates our decisions to stay in relationships

38:13

with people that we've really flaunted on

38:16

social media as being like our perfect

38:18

match. And I definitely suffered from that

38:20

and you know, learning about the sunk

38:22

cost fallacy and how it related to

38:25

this choice of mine really

38:27

helped me sort of forgive myself because it

38:29

made me feel like I was

38:31

not just some total irrational,

38:33

pathetic, nincompoop, you know,

38:35

like there was this sort of

38:37

behavioral economics reason for my choice

38:40

that is, you know, pretty relatable.

38:42

And anyone who's ever, you know,

38:44

doubled down on their decision to

38:46

stay in a job that isn't

38:48

serving them anymore or anything like

38:50

that can probably relate to this.

38:52

But yeah, very healing chapter to

38:54

write, I'll say that. I'm

38:56

glad it was healing to read and I bet

38:58

it was for a lot of people too. More

39:02

after this.

39:36

And now back to the show. So

39:40

I think that what you're talking

39:43

about to me also relates in

39:45

this bigger way to the

39:47

sunk cost of getting into a

39:50

certain political mindset or getting into

39:52

conspiracy theories or following a particular

39:54

candidate. It really could be anyone

39:57

and just being like I've invested

39:59

my emotions, my

40:01

thinking, my time into

40:04

this. And I formed, you

40:06

know, likely relationships to other

40:09

people who share this belief

40:11

structure. And losing that would

40:14

be not only scary, but also

40:16

feel like I wasted my time

40:18

and I was stupid in some

40:20

way. And so the other thing

40:22

that that reminded me of, and

40:24

I think it was in your

40:26

confirmation bias section, you have this

40:28

this sentence that said, if changing

40:30

your mind means losing your tribe,

40:32

it's not worth it. And that's

40:34

very like, that's one that hit

40:37

me really hard. Because I think

40:39

we've all had that experience where

40:41

we encounter information that challenges our

40:43

belief structure that challenges our candidate

40:46

that challenges something that means a

40:48

lot to us politically. And

40:51

sometimes I think all of

40:53

us reject that information. And

40:56

I think that that is one of those

40:58

biases that we all need to really

41:01

be thinking about. But it still

41:03

comes from this place of like

41:05

wanting community and not wanting

41:07

to lose that. So can you

41:09

talk a little bit about how

41:11

that kind of worked in a

41:14

prehistorical way? Or like, why

41:16

is that a bias that we

41:18

have today? Do you think? Yeah,

41:20

well, I mean, again, we we

41:23

have always been these like

41:25

intensely collaborative social creatures who

41:28

don't survive very well on

41:30

our own. And also for

41:32

most of human history, the problems

41:34

that we needed to solve every

41:36

single day were more physical, like

41:39

they were less cerebral, they were

41:41

less complicated, there was less sort

41:43

of, like, misinformation, you

41:45

know, and so we developed these

41:48

psychological shortcuts to handle

41:50

a set of problems that really

41:52

just related to survival in the

41:54

physical world. And that had so

41:56

much to do with our

41:58

social relationships, and making

42:01

sure that we could stay a part

42:03

of our community, because that was so

42:05

much better for survival, whether that means

42:07

aligning ourselves with a certain role model

42:09

or seeking out information

42:12

that suggests that your

42:14

tribe is better than another, or

42:18

coming up with a fight or flight

42:20

response to protect yourself from a threat

42:22

that came from a more physical

42:24

stimulus or something like that. But

42:27

now we're applying those psychological

42:29

shortcuts to scenarios that they

42:32

weren't really meant for. Like

42:34

I do talk in the book about how

42:37

I have always been mystified by my own

42:40

panic response and reaction to something as

42:42

non-threatening as like a curt email or

42:44

like, you know, an Instagram comment that

42:46

was like kind of salty that was

42:48

left on my account like three days

42:50

ago when I didn't see it. And

42:52

it's like not actually a threat to

42:54

me, but you know, I'm seeing it

42:56

now. But you think you're gonna die.

42:58

But I think I'm gonna die, yeah.

43:01

And that has like so much

43:03

to do with these cognitive

43:05

inclinations that develop to help

43:07

us react to potential threats

43:09

that like really could lead

43:12

to our demise in the physical world. You

43:14

know, if like something was new to you,

43:17

new information to you, like the sound of

43:20

rustling in the bushes or whatever, like that

43:22

probably meant that that could be a predator

43:24

or at the very least, you know, it

43:26

wouldn't hurt to react like it was to

43:28

save your own hide. But now we're sort

43:30

of reacting to Instagram conflicts or

43:33

pretend conflicts as if they're like a

43:35

rustling in the bushes that could attack

43:37

us. And that is just serving no

43:39

one. But yeah, I mean, consistently, I

43:42

just keep coming back to the idea

43:44

that like a lot of these biases

43:46

developed for an evolutionary purpose or at

43:49

least they developed as a sort of

43:51

side effect that for a long time

43:53

didn't hurt us. And now because of

43:55

the culture that we've created, they are

43:58

hurting us, you know. Like, yes,

44:00

it's very important to protect your community

44:02

and make sure that you're not ousted

44:04

by them. But as

44:07

yeah, problems become political and

44:09

disembodied, you know, we're often

44:11

like communicating with people that

44:13

we've never met in person,

44:15

we just, it's becoming

44:17

our responsibility to be able to

44:19

tolerate disagreement, and yet we're only

44:21

seeming to tolerate it less. So

44:24

these are like some of the

44:26

ideas that I don't claim to

44:29

like definitively solve in the book, but

44:31

they're the ideas that I'm reckoning with for sure.

44:33

Certainly. And also what

44:35

you said, you mentioned it a little

44:37

bit, but I was going to ask

44:40

you about the negativity bias, which I

44:42

think is one that haunts my soul

44:44

as well as it sounds like yours.

44:46

And I think many of those highly

44:48

sensitive people out there, or

44:51

just regular people, everybody's sensitive, they

44:53

just hide it better. But you

44:55

know, this idea that we will

44:58

focus on one

45:00

negative comment, even if it comes padded

45:02

in like a sea of positive comments,

45:04

you actually referred to it in a

45:07

way that I thought was so good

45:09

as a snake in the field of

45:11

flowers. And I was

45:13

wondering if you would just extrapolate a

45:15

little bit on negativity bias. And again,

45:17

how did that serve us in

45:20

a time when predators were

45:22

a bigger problem for us

45:24

than frightening thoughts

45:26

that we just made on our

45:28

own? Yeah, yeah, yeah.

45:31

Negativity bias. Oh my God.

45:33

What a validating concept to

45:35

learn about. It is this

45:37

tendency to assign greater weight

45:39

to unfavorable events. So like

45:41

you said, you know, we

45:44

internalize the one insult we

45:46

got in a litany of

45:49

compliments so much more like if you were

45:51

to get 10 hugs and

45:53

one slap in the face, like you would

45:56

not remember any of the hugs. You know

45:58

what I mean? Right, right. And

46:00

yeah, I describe it in the book as

46:02

like an insult in a litany of compliments

46:04

as today's rattlesnake in a field of flowers.

46:07

And that connects to yeah, this

46:09

more adaptive thing where

46:11

like, if you were in

46:13

a field of magnolias

46:16

and you noticed a snake, it

46:18

would not behoove you to like

46:20

ignore the one snake and focus

46:22

on the flowers, even if the

46:24

snake actually just turned out to

46:27

be a stick, even if it

46:29

wasn't really a threat, zeroing

46:31

in on that one potential threat

46:34

could help you. But

46:37

now we are sort of

46:39

hyper fixating on negativity when

46:42

it really doesn't make any evolutionary

46:44

sense to do so. We haven't

46:46

outgrown that tendency. So yeah,

46:48

I there are a lot of websites that I

46:51

don't go on. Oh my gosh.

46:53

Yeah, don't read the reviews people. No,

46:56

no. Or maybe do. No,

46:58

maybe that's a better thing to do

47:00

and like exercise that muscle. Well, no,

47:03

I don't feel the need to. I

47:05

spare myself because the thing too is

47:07

just like, it's not natural to be

47:09

exposed to so many opinions. No, you

47:11

know, even reading a whole

47:13

bunch of like positive reader reviews

47:15

makes me feel anxious, because

47:17

it just makes me feel really

47:19

exposed. You know what I mean?

47:21

And it's like, I know, in

47:24

theory, that when I write a book,

47:26

other people will read it. But

47:30

unless I am having a conversation with

47:32

them the way that you and I

47:34

are right now, it tends to feel

47:36

like, oh my god, like, how'd you

47:38

get that? Why are you

47:41

reading that? My diary. Yeah.

47:44

My diary like, leave me alone. It's

47:47

weird. Yeah, absolutely. I

47:50

mean, I think people who are creative,

47:52

who are sharing their work probably in

47:55

general feel like that unless perhaps

47:57

the overconfidence bias is

47:59

largely at play for them. I would say

48:01

even then, I don't think anybody is

48:03

spared from feeling that. Well, if they are, I

48:05

mean, maybe I would take a

48:07

freaking self-actualization workshop from them. Like, I

48:10

would love to learn how

48:12

to not care so much about what

48:14

the rogue strangers who happen not to

48:16

connect with something that I've written have

48:18

to say, but I don't think I

48:20

ever will. Like, being sensitive is such,

48:22

like, an important part of being a

48:25

creative person, or like a person in

48:27

general. It's like, we should care what

48:29

people think; it's just that we shouldn't

48:32

have to care what the entire internet

48:34

thinks. Wouldn't

48:36

that be nice? Yeah, these are like constantly

48:38

like all of the ideas that I'm trying

48:40

to hold in my mind at the same

48:42

time. It's like, I am very porous when

48:45

it comes to other people's feedback and yet

48:47

in the age of the internet I will

48:49

just get flooded you know like I can't

48:51

be that porous so it's a tough balance

48:53

and I don't think anyone has like

48:55

the answers for how to navigate

48:58

it but I'm trying. Being

49:00

aware of it is the first step right?

49:04

So you talk about the halo effect which feels

49:06

appropriate to talk about now with this

49:09

and I

49:11

would love if you could for

49:13

our audience explain the halo effect

49:15

because when we're talking about people

49:18

who are looking up to some

49:20

public figure there are these assumptions

49:23

that we make that are

49:25

projected onto that person and then can

49:27

make us desire to also take them

49:30

down in some way eventually so would

49:32

you give us a little primer on

49:34

the halo effect? Yeah

49:36

for sure. Okay, so the halo effect is the first kind

49:38

of bias I talk about in the book. I was, like,

49:40

very eager to cover it the second I learned what it

49:43

was. It is our proclivity

49:45

to admire one quality in a person

49:47

and then jump to the conclusion that

49:49

they are perfect overall. So, you know,

49:52

we like a pop star's music so

49:54

we assume that they must also be

49:56

you know, educated, worldly, and

49:58

maybe

50:01

align politically with us since we are

50:03

sort of seeing them as like an

50:06

aspirational role model figure. You know

50:08

we like an influencer's sense of

50:10

style. We follow them, we start

50:12

to get invested in them parasocially,

50:14

we start jumping to conclusions that

50:16

they are congenial and smart

50:19

and you know wealthy and all

50:21

these things that there's little to

50:23

no evidence to suggest that they

50:25

are and that

50:27

can set up some

50:29

expectations that ultimately aren't healthy

50:31

for the fan or the subject

50:33

of their fandom and again

50:35

there is a sort of evolutionary

50:37

reason for the halo effect. Back

50:39

in earlier days in human history,

50:41

if you saw someone in your

50:43

community with, like, big muscles and

50:46

intact teeth, you might think, like,

50:48

oh, that person must be

50:50

a skilled hunter or a skilled fighter,

50:53

and thus, you know, good to align

50:55

with for survival purposes. I want that

50:57

person in my corner. Even if they,

50:59

you know, had good muscles for

51:01

another reason other than being a

51:04

good hunter, it was, like, you

51:06

know, fair enough, it was a

51:08

good enough conclusion to jump to.

51:10

But we're now mapping those tendencies

51:12

onto these modern parasocial relationships involving

51:14

celebrities and influencers again in this

51:16

much more complicated time and that

51:18

is what I think is responsible

51:20

for so many of the cycles

51:22

of celebrity worship and dethronement that

51:25

we see these days. Absolutely,

51:27

and it kind of relates to

51:29

another cognitive bias that you

51:31

talk about, declinism, yeah, in

51:33

your chapter called nostalgia porn again

51:35

probably very relatable to many people

51:37

and you talk about the way

51:40

that we take these figures from

51:42

the past say Britney Spears Paris

51:44

Hilton and we look at them

51:46

through a new lens

51:48

and sort of cleanse them

51:51

of the qualities that were once

51:53

ascribed to them by tabloids by

51:55

us like let's admit it right

51:57

like by people who saw them negatively

52:00

because that's kind of what we were fed

52:02

to believe and now they're having a

52:05

moment in the warm

52:07

glow of nostalgia and we see

52:09

them differently and that extends far

52:11

beyond people. So would you talk

52:14

a little bit about declinism and

52:16

your section about nostalgia porn? Yeah.

52:19

So a lot of these cognitive

52:21

biases have an affective analog. So

52:24

I talked about zero-sum bias,

52:26

that tendency to think another person's

52:28

gain means your loss. That could

52:30

sort of correlate to envy. Like,

52:33

envy is to zero-sum bias as

52:35

nostalgia is to

52:38

declinism. So yeah,

52:40

declineism is this idea that

52:42

life and society are just getting

52:44

worse and worse and worse irreversibly

52:46

and it's all downhill from there.

52:48

The human mind is kind of

52:50

like weird about time. You know,

52:52

we tend to romanticize the past.

52:54

We tend to like overly freak

52:57

out about the present and then

52:59

we sort of like devalue the

53:01

future. We tend to think that

53:03

the future is like dismissible or

53:05

scary. You know, a lot of

53:07

us don't like to think about

53:09

the future or if we do

53:11

think about the future a lot,

53:13

it tends to be with like

53:16

great anxiety. So

53:18

during times of present pain,

53:21

we don't tend to think about

53:23

the future explicitly. We tend to

53:25

sort of idealize the past or

53:27

even a past that we never

53:29

knew. And that sort

53:31

of creates this idea that some

53:34

distant bygone era was superior

53:36

to the present and following

53:38

from there, things are probably

53:40

only getting worse. But actually,

53:42

I went into that

53:44

chapter thinking like, oh, nostalgia is dangerous.

53:46

Like it just paves the way for

53:48

delusion and for the weaponization of figures

53:50

like Donald Trump to say make America

53:52

great again and stuff like that. But

53:55

then I talked to a bunch of

53:57

scholars of nostalgia and every time I

53:59

talked to academics who are experts

54:01

in the subjects that I'm interested in

54:03

looking into. They always fill me with

54:05

a sense of hope, they always ground

54:07

me, it's a really nice feeling. But

54:09

a lot of these nostalgia scholars are

54:11

saying that actually what's called personal nostalgia

54:13

or the nostalgia that you might have

54:15

of your own memories, even if it's

54:17

a little bit romanticized and not totally

54:19

accurate, it's a coping mechanism that's really

54:21

healthy. It helps us, you know, generate

54:23

a sense of imagination and this hope

54:25

for the future. Like if we romanticize

54:27

a memory that we had, it makes

54:29

us feel like, oh, maybe I could

54:31

have memories that are that positive again.

54:33

It helps you dream of days to

54:35

come. Nostalgia only starts to get sinister

54:38

when populist public figures, from,

54:41

you know, politicians to

54:43

influencers, even start to weaponize

54:45

and exploit visions of a

54:47

distant past that never existed

54:50

as a way to catastrophize

54:52

the current moment and sell

54:54

the public on a vision

54:56

of the future based on

54:59

that total revision of the

55:01

past. And then

55:03

there's also, like you're

55:05

saying, this idea of nostalgia relates so

55:07

much to populist politics, make America

55:10

great again. You mentioned that not

55:12

just Donald Trump, but George Bush

55:14

has used it, Bill Clinton used

55:16

it, Ronald Reagan used it, you

55:19

know, it is a very popular

55:21

way to get people to

55:24

believe that there will be this

55:26

resurrection of this old kind

55:29

of perfection that we lost

55:31

through, you know, depending on

55:33

who's talking some modern process.

55:36

And yeah, I think that it's

55:38

really important to get

55:40

a grip on these

55:43

biases and we may

55:45

not be able to do it

55:47

with other people, but there is

55:49

hope in us doing it for

55:51

ourselves. But, you know, you mentioned

55:53

it's difficult to get people to

55:55

face their own biases, but have

55:57

you found a way,

56:00

even though it's difficult to use

56:02

this information that you have to

56:04

perhaps get to the

56:07

people who would be less

56:10

willing to confront their own

56:12

biases. Like, do you feel

56:14

like just learning about these

56:16

biases alone can just get

56:18

into people's heads enough that

56:20

maybe something rearranges a bit?

56:23

Sometimes. It depends on the bias.

56:29

So there's one bias in the

56:31

book called the illusory truth effect,

56:33

which describes our penchant to hear

56:36

something repeated multiple times and then

56:38

mistake it as true. You

56:41

know, sort of confuse processing fluency

56:43

as it's called, with accuracy. And

56:46

unfortunately, with a bias like that,

56:49

even an awareness of it doesn't

56:51

help. Maybe does the opposite.

56:53

Yeah. If we hear something

56:56

over and over and over again, even

56:58

if it's like blatantly untrue and impossible

57:00

or whatever, we really do tend

57:02

to internalize it as a fact.

57:05

And so the sort of counter to

57:07

that might just be to like, spread

57:10

more accurate information, you

57:12

know, like, as much as possible

57:14

to make accurate information sound catchier

57:16

and more engaging and more

57:19

easy to repeat and stuff like

57:21

that. But with some of

57:23

the other biases like sunk cost fallacy

57:25

and declinism, I actually have really

57:28

found that at least for me becoming

57:30

aware of them has influenced

57:32

my behavior in some capacity.

57:35

Related to the sunk cost fallacy,

57:37

there is a study that I

57:40

quote in that chapter that talks

57:42

about additive versus subtractive solution bias,

57:44

basically like as human beings naturally,

57:46

but especially growing up in consumer

57:48

society, we tend to think that

57:50

in order to solve a problem,

57:53

any problem, we should add something

57:55

to the equation rather than taking

57:57

something away. So this study

57:59

involves presenting participants with a

58:01

set of colored blocks and a sort

58:03

of like visual puzzle that they could

58:06

solve by either adding or subtracting colored

58:08

blocks. And the much more

58:10

efficient way to solve the problem would

58:12

be to take a block away, but

58:14

the vast majority of participants opted for

58:16

the much more cumbersome way of

58:19

like adding a whole bunch of blocks to

58:21

solve the problem, which just goes to show how

58:23

deep-seated this is. And I find

58:25

it in my life in so many different

58:27

ways, like whenever I am struggling with a

58:30

certain anxiety or even just like clutter

58:32

in my house, I mean get this,

58:34

I was looking at my junk drawer

58:36

in my home the other week and

58:38

I was just like, oh god what

58:40

a mess. You know what would solve

58:42

this? If I went to the container

58:44

store and got some like acrylic drawer

58:46

organizers, you know, like that would solve

58:48

the problem. It didn't naturally occur to

58:50

me to be like, no, just throw

58:52

some things away and this

58:54

applies to, you know, different areas

58:57

of life. Like that relationship that

58:59

I described in that chapter, whenever we were

59:01

experiencing our worst moments, I used to think, like,

59:03

you know what we need to solve our problems?

59:05

We need to go on

59:07

another vacation or replace our

59:09

furniture or like add

59:12

something maximalist to our lives. It

59:14

really did not occur to me

59:16

to maybe like break up.

59:19

Yeah, I found

59:21

that with some of these biases

59:23

an awareness of them has really

59:25

helped especially because they're so interesting

59:27

to learn about and I found

59:30

just personally and anecdotally when a

59:32

subject is fun to

59:34

learn about, when it provokes curiosity

59:36

and it's like fun to talk

59:38

to other people about, then it's

59:40

also in a way like easier

59:43

to apply because I don't feel

59:45

so attacked by it. It's

59:47

not just like some boring idea or

59:49

some shaming thing. It's more like, oh,

59:51

this is so interesting that the human

59:53

brain works this way. I see it

59:56

working in me and I

59:58

think I would like to choose of my own volition

1:00:00

to maybe make decisions differently

1:00:02

with that awareness. Absolutely.

1:00:04

And I loved the part

1:00:06

that you talked about rhyme

1:00:09

as reason related to this,

1:00:12

where some kind

1:00:14

of piece of rhyming

1:00:16

information will not only stick

1:00:18

in your mind, but maybe,

1:00:20

you know, give you some

1:00:23

kind of sense of pleasure with language.

1:00:26

Yeah. Yeah. In that

1:00:28

illusory truth effect chapter, which is

1:00:30

really about like the power of repetition

1:00:32

and language to make you feel like

1:00:34

something is true, I talk about

1:00:36

like some other speech devices that

1:00:38

make a statement feel truer, even

1:00:40

if it's not. And yeah, one

1:00:42

of them is rhyme. There's

1:00:44

a, yeah, this effect called the rhyme as

58:47

reason effect, and in studies of

58:49

the rhyme as reason effect, scientists have

1:00:51

found that the phrase "woes unite

1:00:53

foes" is evaluated to

1:00:55

feel more truthful than "woes unite

1:00:57

enemies," even though they mean the

1:00:59

exact same thing, one of them

1:01:01

rhymes. And so it

1:01:04

really goes to show like the

1:01:06

power of rhyming political slogans to,

1:01:08

you know, spread information for better

1:01:10

and for worse. And so,

1:01:12

you know, like some of these linguistic tactics can

1:01:14

be weaponized in a really dangerous way, but we

1:01:16

can also use them for good as well. The

1:01:19

last question I want to ask you is in

1:01:21

your opinion from your research, is there one

1:01:26

bias that you think that as

1:01:28

a culture and as individuals or

1:01:30

maybe a combination of biases that

1:01:33

would best serve us going forward

1:01:35

in the information age, in the

1:01:37

AI age, whatever's coming next, is

1:01:40

there one that you found was

1:01:42

the one that you kind of

1:01:44

wanted to scream from the rooftops? Yeah,

1:01:50

I mean, the overconfidence bias one

1:01:52

does come to mind because I

1:01:55

find that we tend to

1:01:57

demand overconfidence in our leaders,

1:02:00

even when it behooves no one for them

1:02:02

to have it. We

1:02:04

don't really accept

1:02:06

very well when an expert claims

1:02:09

not to know or not

1:02:11

to have a definitive answer. That's

1:02:13

not very engaging. It's not very

1:02:15

comforting. And so, whether we're talking

1:02:18

to doctors or watching politicians give

1:02:20

speeches or whether we're listening to

1:02:23

business leaders, we really want them

1:02:25

to portray overconfidence. And as the

1:02:27

world becomes more volatile

1:02:30

and technologically driven and things,

1:02:33

well, I guess not more, the world has always been

1:02:35

pretty volatile, but as the world

1:02:37

becomes more technologically volatile, I

1:02:39

think it would really help

1:02:41

for each of us to

1:02:43

maybe not expect and demand

1:02:45

overconfidence as much in our

1:02:48

leaders because it will

1:02:50

only incentivize them to bullshit us.

1:02:53

Yeah, absolutely, absolutely. Well,

1:02:55

thank you, Amanda, so much. This has

1:02:57

been such a great conversation. And I

1:03:00

just loved your book. And I really

1:03:02

hope that everybody goes and gets a

1:03:04

copy of it because it's information

1:03:06

that was comforting to me, was

1:03:08

scary to me, but also I

1:03:10

think overall just like knowledge

1:03:13

is comforting even

1:03:16

though a lot of times it's

1:03:18

not comforting. So... Yeah, it's

1:03:20

all of it everywhere, everything all

1:03:22

at once. Exactly. And I

1:03:24

know that you've got something coming up that's

1:03:26

very exciting and very relevant to your book.

1:03:29

Would you tell us about it? Oh, yeah,

1:03:31

of course. Well, yeah, so the book came

1:03:33

out April 9th. And of course,

1:03:36

I am not ready to stop

1:03:38

talking about magical overthinking. So I

1:03:40

am launching a podcast

1:03:42

that was inspired by the

1:03:44

book called Magical Overthinkers. And

1:03:47

it will premiere Wednesday, May 15th,

1:03:50

wherever you get your podcasts. And

1:03:53

it's been so much fun to work

1:03:55

on so far. Basically every episode is

1:03:57

dedicated to a different buzzy, confounding

1:04:00

subject that we can't stop overthinking

1:04:03

about from narcissism

1:04:05

to imposter syndrome and I

1:04:07

interview a really fascinating

1:04:10

charismatic expert guest about the subject

1:04:12

to put our overthinking to bed

1:04:14

once and for all. Man,

1:04:17

I'm really excited for this and

1:04:19

I know that people can't subscribe

1:04:21

on their podcast app yet. But

1:04:24

you all can follow the show

1:04:26

on Instagram at Magical Overthinkers.

1:04:29

Please do. And Amanda, I'm

1:04:31

so grateful that I got to have you

1:04:33

on the show and I really hope that

1:04:35

you will come back someday, because I'm sure

1:04:37

there are so many more things that we

1:04:40

could talk about. I would love that.

1:04:42

Thank you for having me. This

1:04:48

was American Hysteria. Make

1:04:51

sure you follow Amanda at Amanda

1:04:53

underscore Montel on Instagram and also

1:04:55

make sure you pick up a

1:04:58

copy of her book, The Age

1:05:00

of Magical Overthinking. You can find

1:05:02

those links in our show notes.

1:05:05

If you want to get more of our show

1:05:07

or if you'd really like to support us. The

1:05:09

best way you can do it is by going

1:05:11

to patreon.com/American

1:05:13

Hysteria, or subscribe on Apple Plus;

1:05:16

you'll get ad-free episodes and you'll

1:05:18

also get access to the other

1:05:20

podcast that I do with our

1:05:23

producer Miranda, Hysteria Home Companion, where

1:05:25

we tell you stories that were

1:05:27

left out of episodes, or stories

1:05:30

we think you'd really love. That's

1:05:32

patreon.com/AmericanHysteria, or

1:05:34

subscribe on Apple Plus. Another

1:05:37

really great way to help us out

1:05:39

is to leave us a five-star review

1:05:41

on the app of your choosing. If

1:05:44

you go to AmericanHysteria.com, you

1:05:46

can pick up some of our merch

1:05:48

and you can leave us a message

1:05:50

on our Urban Legends hotline about a

1:05:52

tale that you remember from growing up

1:05:54

and we might do an entire episode

1:05:56

about it. That's AmericanHysteria.com. Our producer

1:06:00

and editor is Miranda Zickler.

1:06:02

Our associate producer is Riley

1:06:05

Smith, and I'm

1:06:07

your host Chelsea Weber-Smith. Thanks

1:06:10

as always for listening and I

1:06:12

hope you have a great week.

