America Needs a Self-Help Book. Tim Urban's Got One.

Released Friday, 12th May 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

When you download the Kroger app, you have easy

0:02

access to savings every day. Shop

0:04

weekly sales and get personalized coupons

0:07

to get the most value out of every trip, every

0:09

time, whether you shop in-store or online.

0:11

Download the Kroger app now to save big.

0:14

Kroger, fresh for everyone. Must have a digital

0:16

account to redeem offers. Restrictions may apply.

0:18

See site for details.

0:20

Save big on your favorites with the Buy 5 or

0:22

More, Save $1 Each sale. Simply buy 5 or

0:24

more participating items and save $1 each with

0:26

your card. Kroger. Fresh for

0:29

everyone.

0:31

I'm Bari Weiss and this is Honestly.

0:33

Tim

0:36

Urban is the only cartoonist who

0:38

has elicited an existential crisis

0:40

in me. It's not because he's some great

0:42

illustrator. Tim's drawings are comically

0:45

simple. They're of stick figures. Or if he's feeling

0:47

fancy, maybe he'll do a chart.

0:49

What makes them so affecting is the way

0:52

he's able to capture and distill the

0:54

most complex and profound questions

0:57

we face.

0:58

Questions like, what does it

1:00

mean to be a human being? What

1:03

is the purpose of our lives?

1:05

Are we spending our finite time on earth wisely?

1:07

And do we even grasp how short

1:09

that time is?

1:11

By capturing the length of our days in, say,

1:13

the amount of times we have left to swim in the ocean,

1:16

or the books we have left to read, or the dumplings

1:18

we have left to eat, assuming we live to the age

1:20

of 90, Tim takes an abstract

1:23

subject like time and makes

1:25

it tangible.

1:27

In one of my favorite blog posts of his, Tim breaks

1:29

down the amount of time, realistically, that we have left to spend with our parents.

1:35

Did you know that by the age of 18, you've

1:37

already used up like 95% of your parent time? It's

1:41

something

1:41

that stuck with me. So

1:45

Tim's done this sort of thing for years on his singular

1:47

and must-read blog, Wait But

1:49

Why, which is full of everything

1:51

from posts on AI to aliens to the Fermi

1:53

paradox to marriage.

1:56

But a few years ago, like six years ago,

1:59

like many

1:59

of us, Tim was troubled

2:02

by what he was seeing going on in the

2:04

world around him. He noticed

2:06

that while technology was progressing

2:09

in unbelievable ways, people were going

2:11

to the moon on private rocket ships, computers

2:13

were the size of Starbucks coffee cups, and foraging

2:16

was a thing of the past. Yet

2:19

we were seemingly more unhappy

2:21

than ever before. We were petty. We

2:24

were turning against each other. And

2:26

the very things that have allowed for this kind

2:28

of progress, things like democracy

2:31

and liberalism and humanism, those

2:34

were under siege. Why,

2:36

Tim wondered, was everything such

2:39

a mess? When did things

2:41

get so tribal? And why

2:43

do humans do this stuff to each other? His

2:46

new book, called What's Our Problem,

2:49

a self-help book for societies, is

2:51

an answer to those questions and more. Tim

2:54

looks back at hundreds of thousands

2:57

of years of history. Trust me, it works. He

2:59

condenses.

2:59

And he argues that we are living through more

3:02

change, more rapidly, than

3:04

at any time ever. And

3:06

the stakes of that are almost too

3:09

high to comprehend.

3:11

But what he argues is that the danger

3:13

we face in the end is not global warming,

3:16

it's not an asteroid racing toward Earth,

3:18

it's not an impending alien invasion. It's

3:22

ourselves. And Tim argues

3:24

that we got ourselves into this mess,

3:27

but he's pretty sure we can

3:29

also get ourselves out of it.

3:36

Stay with us.

3:40

Have you noticed that the legroom on airplanes keeps

3:42

shrinking? Or that when you order clothes online,

3:45

the quality is way worse than it used to be? Ladies

3:47

and gentlemen, we are suffering from skimpflation.

3:51

Quality dupes are everywhere and it's easy to get scammed.

3:54

Stick with what you know.

3:55

Raycon delivers premium audio at

3:57

the perfect price point, so you can listen

3:59

to what you

3:59

want, when you want, without breaking the bank. Whether

4:02

you're looking for a pair of everyday earbuds, low

4:05

latency gaming headphones, or if you're in your

4:07

20s, a speaker with a battery that will last you all

4:09

night, Raycon has you covered. You

4:12

can even get a pair and a spare and still pay

4:14

less than you would with some of those other big

4:16

name tech brands that are out there. With

4:18

three customizable sound profiles, earbud

4:21

tap functions, and noise isolation, Raycon

4:24

is premium audio at the perfect price

4:26

point. Plus, they offer buy now,

4:28

pay later options, and every purchase has

4:31

an easy and free return guarantee. Go

4:33

to buyraycon.com slash Barry

4:35

today to get 15% off your

4:38

Raycon order. That's buyraycon.com

4:40

slash Barry to score 15% off.

4:43

Hi, Honestly listeners. I'm here to tell you about

4:46

an alternative investing platform called Masterworks.

4:49

I know investing in finance can be overwhelming,

4:51

especially given our economic climate, but

4:53

there's one thing that will never go in the red, and that

4:56

is a painting from Picasso's Blue Period. Masterworks

4:59

is an exclusive community that invests in blue chip

5:01

art. They buy a piece of art, then they file

5:03

that work with the SEC. It's almost like filing

5:05

for an IPO. You can then buy a

5:07

share representing an investment in the art. Masterworks

5:11

holds the piece for three to 10 years, and

5:13

then when they sell it, you get a prorated portion

5:15

of the profits minus fees. Masterworks

5:18

has had 11 sales to date from artists like

5:20

Andy Warhol, George Condo, Banksy,

5:22

and Monet, and the most recent sale netted

5:25

a 35% return. Over 650,000

5:28

investors are using Masterworks to get in

5:30

on the art market. Go to masterworks.com

5:33

slash honestly for priority access.

5:36

You can also find important regulations and disclosures

5:38

at masterworks.com slash CD.

5:44

Tim Urban, welcome to Honestly. Thank

5:46

you for having me. You just published this

5:49

book called What's Our Problem? A self-help

5:51

book for societies. You've been working on this

5:53

book for more than seven years I think. Is

5:55

that right?

5:56

About six and a half. Okay. You're

5:59

sort of a famous procrastinator. I am.

5:59

which we're going to get to later in this conversation.

6:02

But that is not my sense of why

6:05

this book, which has made up at least half

6:07

of cartoons, took so long to get

6:09

into the world.

6:11

What was so hard about getting

6:13

this book out into the public?

6:15

Well, if you're trying to assess

6:18

what's going on in a society and

6:20

why things are the way they are, there are

6:22

so many other

6:25

topics that feed into that. So it

6:27

was an overwhelming amount of material

6:30

to try to kind of put together and synthesize,

6:32

but also on top of that, I

6:34

just had all of this

6:37

resistance to it, to saying anything about politics.

6:40

And then other people would feed into that. They'd

6:42

say, are you crazy? Don't write about

6:44

that. You don't have any haters right now.

6:46

Why would you go and write about politics? Just write about anything

6:48

else. And that to me was interesting. I said,

6:50

well, what's good? This is like, I'm supposed to, I write

6:52

about whatever I want, right? I write it if something's important

6:55

in society. I wrote when AI first became a

6:57

big topic in 2015. I wrote a huge thing about

6:59

it. You know, whatever I'm thinking about, I write

7:01

about. So this one topic, which is so important,

7:03

it's how we all are living together. It's

7:06

the fate of our society.

7:09

There's this incredible

7:12

incentive to stay away from it. What

7:15

is going on there? And that got

7:17

me thinking, this is part of the story, the

7:19

fear I have of talking about it. That's, there's

7:22

a much bigger topic here.

7:24

You recently wrote in your wonderful blog, Wait

7:26

But Why, about this six and

7:28

a half year journey. And you said this about

7:30

starting the book in 2016. Something

7:33

seemed off about the society around

7:35

me, like there had been a subtle foreboding shift

7:38

in the balance between reason and madness.

7:40

It felt like we were losing our grip on something

7:43

important. Let's talk a little bit

7:45

about that shift. What were you starting to witness

7:47

in 2016 that

7:50

made you feel like the balance

7:52

between reason and madness was tipping into madness?

7:54

And to what did you attribute

7:56

the shift? So

7:58

think about middle school

8:00

and how people act in middle school. You

8:03

know, there's popular kids and then

8:05

there's unpopular kids and there's kind of like the

8:08

most popular person is often kind of a mean

8:10

person that everyone's a little bit scared of and then there's

8:13

real in-group and there's out-group and

8:15

people are cruel, right? And I

8:18

started noticing that the grownups

8:21

were acting like this.

8:23

We all have that middle school persona

8:25

in us somewhere, whether we were the person who was

8:28

the bully or the person who was the sidekick

8:30

of the bully, sucking up to the bully or the person who

8:32

was the target of the bully or, you know,

8:35

if we were especially

8:36

grown up at that age, maybe we were the

8:38

person who stood up to the bully, you know, but either

8:40

way, that person's still in us and

8:43

something was bringing it out. And the

8:46

shift I noticed was not just that more people were acting

8:48

this way, but that people that

8:50

would normally criticize this kind of behavior,

8:52

that would criticize cruelty or

8:55

overt tribalism or, you know, gross

8:58

stereotyping of giant groups of people, you

9:00

know, or just kind of old school bigotry.

9:03

The people that would normally stand up and criticize that

9:05

were all doing it only in private. And

9:08

you know, it seemed like the power shifted where

9:10

the people acting kind of in that

9:12

inadmissible way

9:14

had some kind of power that everyone was scared of.

9:17

And it felt like some crazy news story would be happening

9:19

that any reasonable person watching it would say, well,

9:22

that's ridiculous. But no one's saying

9:24

it out loud, right? We're all either saying it in private

9:26

or in some cases you find even in private

9:29

a private dinner party, you see that

9:31

even there, no one knows who's thinking what

9:33

at that dinner party and everyone's kind of virtue

9:36

signaling to each other at the dinner party. And so

9:38

it didn't feel like it was always like that. It felt

9:40

like this was happening more than it used

9:42

to. Meanwhile, as I'm writing, as I'm

9:45

thinking about this, you know, Donald Trump is ascending

9:47

in power and he's this is, you know, as the primary

9:50

was going on. And he has a total disregard

9:52

for

9:52

truth and a disregard

9:54

for a lot of the kind of norms that most

9:57

other politicians had had to follow. And he was

9:59

rising up.

9:59

And so there were things

10:02

happening across the political spectrum, across society

10:04

that, again, felt kind of foreboding. We

10:07

were losing our grip on the

10:09

kind of things that keep a liberal society

10:12

kind of healthy and strong.

10:13

To me, it seems like everyone

10:16

has what I think of as

10:18

their, I guess it's like a play on woke,

10:20

like their waking up moment where they realize

10:23

something's a little off here. Like right for me,

10:25

it was the Tom Cotton op-ed

10:28

and my boss James Bennet being, like, struggle-sessioned

10:30

and fired and then ultimately leaving the New York Times

10:33

and sort of looking at the capturing

10:35

of the institutions. For other people, it was the

10:37

Trump years. For other people, it was the

10:39

COVID lockdowns. For others, it was the moment

10:42

where Kenosha was burning,

10:44

but CNN had the chyron saying,

10:46

you know, fiery but mostly peaceful protests.

10:49

Did you have a kind of like aha

10:52

moment that,

10:53

or was it just kind of a series of

10:55

dinner parties like the

10:57

ones you just alluded to? Yeah. So

11:00

when you talk about these aha moments, I think the

11:02

way they work is

11:03

you don't just see one example of something

11:06

for the first time and suddenly your whole worldview

11:08

shifts. It's more that there's,

11:10

we have these priors, right? You have a prior worldview,

11:13

but somewhere in your subconscious, you've

11:16

been noticing

11:18

things that conflict with that prior and

11:20

maybe it's in your subconscious or maybe your conscious notices

11:22

it and disregards it and you keep noticing things and

11:24

saying, well, that's a freak incident. And

11:27

then the aha moment, I think comes at the

11:29

end of a bunch of these. It's that moment

11:31

when

11:32

you realize that, you know what? I might have

11:35

to question that prior

11:37

and suddenly all those other examples

11:39

that you had been pushing away or that only your

11:41

subconscious had noticed, they all come into

11:43

your consciousness and you realize, wait a second. This

11:46

is a whole pattern. I've been wrong about

11:48

this whole thing. And now

11:50

it's like all these other things fall into place

11:52

in your head. And so

11:54

for me, the

11:56

closest I can come to that exact moment was

11:59

Greg Lukianoff, you know, who's the head

12:01

of FIRE, took this video. He happened

12:03

to be walking through Yale campus of Nicholas

12:06

Christakis getting essentially

12:08

struggle sessioned out in the quad

12:10

by a bunch of students over an email

12:13

his wife had written, which was suggesting

12:16

that the school had gone

12:18

too far in telling

12:20

students

12:21

what they should and shouldn't wear for Halloween. And

12:24

she in very gentle wording basically said, I don't

12:26

think it's our job to tell students what they should or shouldn't

12:28

wear. And maybe even if a student

12:30

wears something that's kind of offensive, maybe that's something

12:32

that college students should be able to do. And

12:35

if other students don't like it, maybe the students

12:37

should talk to them and not the administrator, whatever.

12:40

Meanwhile, that turns into now her husband,

12:42

Nicholas Christakis, being screamed

12:45

at and this happened to be captured by Greg, put

12:47

on YouTube, so it went viral, a lot of people saw

12:49

it. But it was watching that video,

12:52

it

12:52

was noticing what kids were saying, which was stuff like,

12:55

you know, this is supposed to be a safe space, saying

12:58

that he's disgusting, that, you know, that

13:00

word disgusting, you know, is deeming him basically

13:03

subhuman

13:04

and snapping. And

13:06

what I mean by that is, you know, that

13:08

when one kid would say something, all the other kids would

13:11

snap in unison. And I know if I were

13:13

there and I'm one of those students, whether I agree or not,

13:15

it's going to be hard to, you know, not snap or

13:17

at least just

13:18

stay silent. Because that's scary

13:20

when everyone is, you know, kind of, you know,

13:23

that snapping is saying, you know, the group is behind

13:25

you. We are behind you. We are one, right, against

13:28

this man. And so it was

13:30

looking at that, and them saying that Yale is so unsafe. And

13:32

the idea that this woman's

13:35

email, it makes things

13:37

unsafe. That is just so counter

13:40

to everything that I think and

13:43

the way I think a college campus should be. So now

13:45

I'm thinking, though, but the prior I had

13:47

for a long time was kind of blue,

13:49

good, red, bad, right, in the U.S.

13:52

Just in general, not blue is perfect, but like

13:55

the people on the blue side were the ones who were pushing the

13:57

country forward in a good direction,

13:59

and the red side was pulling it back. You know, I grew up in a progressive suburb and

14:02

went to progressive college and lived

14:04

in LA and then lived in New York. So, you know, I was surrounded by

14:06

this. And I was an independent thinker in a

14:08

lot of areas, but, you know, these priors, especially with

14:10

something like politics, can be very strong. You know,

14:12

everyone around you thinks this thing, it can make

14:15

an otherwise independent thinker pretty beholden

14:17

to this other framework. And

14:20

I remember looking at that and thinking like, if this

14:22

is what blue is today, then like, something is incredibly

14:25

wrong. And I don't think that, you know, it just kind of,

14:27

it's that moment when your head explodes

14:29

a little,

14:29

because the house of cards that was my

14:32

prior, that was based basically on a

14:35

kind of tribalism, on a feeling that

14:37

I'm part of the good team. And that whole thing

14:39

just kind of shattered over the next couple months.

14:41

And I was reading and I was thinking and I was reflecting

14:44

on everything and looking back at my own emails,

14:46

like, like a year earlier, two years earlier,

14:49

and man, I was like, wow, I was really close

14:51

minded about this stuff. So then I

14:53

said, okay, this is something I have to

14:55

write about.

14:56

You describe your book, which I love as

14:58

a self-help book for societies. And

15:01

I want to get into the self-help portion in just

15:03

a minute. But first I want to talk about how we

15:05

got to the place where we need help. You

15:08

start this book in this wonderful

15:10

way by giving a brief history

15:12

of humanity, which you call the story of us.

15:15

And you say this, you say, if we wrote the story

15:17

of us out in a thousand pages, here's

15:19

what it would look like. From page one

15:21

to page 950, there's basically not

15:24

much going on. And

15:26

then on page 976, recorded history begins-ish,

15:29

as you put it. And Christianity

15:34

isn't even born until page 993. So basically, the

15:36

first 95% of the book of human history

15:41

is so, so unbelievably slow

15:44

and boring. And the last 5% is a page

15:46

turner. Why

15:47

is that? What

15:50

is the spark that creates

15:53

the propulsive change that

15:55

we see in those last

15:57

few pages? So

16:00

one of the things that separates humans from other animals

16:03

is language and the

16:05

ability for people to take what's

16:07

in their brain and put it in

16:09

a very detailed way into other people's brains and

16:12

then people can pass the idea on

16:14

through language to their children. And

16:17

so if I learn something about the world, I

16:19

can tell my whole tribe who then can tell

16:21

their descendants and the tribe now

16:24

knows that. And it's like this kind

16:26

of collective knowledge, like a tower of

16:28

knowledge in the middle of that tribe that

16:30

kind of grows and builds and builds. And

16:33

then right around page 975, you start having writing. And

16:38

so now someone can take an idea and

16:40

put it on a tablet or later a book and

16:43

send it into the minds of millions of people. And

16:45

it can last for centuries and it can last in the

16:47

exact wording it was meant to be transmitted

16:49

in. So you have instead of a tribe

16:51

of 150 people's collective knowledge, you now

16:54

have 10,000 or 50,000. And

16:57

so that knowledge tower becomes a skyscraper.

16:59

And so things start happening very

17:02

quickly. You start being able to do things that

17:04

no individual human or no small tribe could

17:06

ever do, like build giant temples and

17:08

understand that the world is round and

17:11

start to come up with governing structures

17:13

that are based on trial and error of hundreds

17:15

of years of governing structures before that. And

17:17

so things start to advance much faster.

17:20

But the craziest thing is this has an exponential

17:23

kind of effect because a more advanced species with

17:25

the bigger knowledge tower makes progress faster

17:28

than a less advanced species. So not

17:30

only do things keep building, but they build faster

17:33

and faster and faster. And

17:35

eventually this comes to a point where things are soaring

17:38

ahead and you're

17:39

making more progress in a century than you did

17:41

in all of human history before that when

17:43

it comes to tech and knowledge building. The

17:45

reason I like the thousand page book thing

17:47

is it allows you to look at the last page alone,

17:50

just page 1000. So now we're not talking

17:52

about the last 5% of the book. We're talking about the last 0.1%

17:55

of the book. And

17:57

that goes from like the 1770s to today,

17:59

right, 250 years per page. And

18:03

it is a total anomaly

18:05

compared to all the pages before it. And

18:08

it's again, I don't think it just happened on its own. It's

18:10

what eventually happens if you keep this pattern

18:12

going of exponential growth, exponential

18:14

growth, which is awesome, right? I mean, we have an amazing

18:17

quality of life. Modern tech is great. And

18:19

none of us, I don't think, want to go back to the

18:21

1600s. It's also

18:23

really scary. It's also really scary because

18:26

when things start racing forward ahead, you can

18:28

get very reckless and you

18:29

have a lot of godlike power from all of this tech.

18:32

And so that's kind of, to me, it's exciting

18:34

and it's scary. The other thing that's

18:37

anomalous about that page from call

18:39

it the mid 1700s till now, isn't

18:42

just all of the unbelievable leaps ahead

18:44

in terms of technology. It's also the advent of

18:46

liberal democracy, which is

18:48

completely ahistorical and

18:51

miraculous.

18:52

How do you understand sort of the rise

18:55

of liberal democracy in the story

18:57

of us? Because to me, that is the

18:59

background to this entire book is frankly

19:02

your

19:03

love and gratitude for a system

19:05

that I think you feel is really

19:08

under siege in ways that many

19:10

people aren't appreciating.

19:11

A liberal democracy is

19:14

a totally artificial invention.

19:17

And it's not that, you know, the

19:19

Enlightenment thinkers of the 1600s and 1700s,

19:22

it's not that they invented this from scratch, but

19:25

this is kind of the best crack yet. And they finally

19:28

came up with a way to do it in a way that could last

19:30

for centuries. It was robust. So

19:33

it's this kind of, it seems obvious to us because

19:35

I think of it as a house. It's a house

19:38

that is built, right? The support beams

19:40

of the liberal democracy and the structure

19:42

and the roof, all of this is an

19:44

invention. It's not the law of

19:46

nature. We made the house. And

19:49

so we now are growing up within

19:51

the house and so did our parents and so did our

19:53

grandparents. And so it's been a lot

19:55

of generations in a place like America,

19:58

at least for a lot of people,

19:59

since someone has been outside this house. My

20:02

wife, she's Persian, her parents immigrated

20:05

here in 1979, or they left Iran

20:07

then, like a lot of Iranians,

20:10

and

20:11

they do not take the house for granted. They

20:13

think the liberal house is unbelievable,

20:16

right? And talk to someone who's coming from a communist country

20:18

or coming from a dictatorship, and you

20:20

will hear a love for this house because they're saying,

20:22

oh my God, look around, this house is incredible. And

20:24

we're saying, huh, it is, I guess, I don't know. It's

20:27

the house, it's just the house. That's where we all live.

20:29

That's all fine until

20:32

the house is under threat. And then

20:35

this cockiness that the house will always

20:37

be here, of course, we have to get rid of

20:39

that and say, well, hold on. Why is

20:41

the house under threat? What does that mean, and

20:43

how do we preserve it?

20:44

I think one of the paradoxes of the last

20:47

page of the book that we're on is that

20:49

on the one hand, we're enjoying more

20:51

abundance, more progress, more

20:54

genuine historical privilege than

20:57

any group ever has before, ever. And

20:59

yet that

21:01

exponential progress, as you describe it,

21:04

is also the source of

21:06

a lot of chaos, a lot of misery,

21:09

and a lot of our uncertainty. And you really

21:11

lay these out in a powerful way in the book.

21:13

So I wanna kind of go through them.

21:16

One of the things that you

21:18

write about in the book is this distinction between

21:20

what you call the primitive mind and what you

21:22

call the higher mind, right? And the thing

21:25

that sets human beings apart from other animals

21:27

is this higher mind because the higher mind

21:29

is rational. It feels complex emotions

21:32

like empathy and ego. It makes long-term

21:34

goals for the future.

21:36

It's able to look beyond itself. And

21:39

yet, like all other animals, we

21:41

also have another mind inside

21:43

of us. And that is what you call the primitive mind

21:45

or what a lot of people call our lizard brains. And

21:48

that's the part of our brains that

21:50

hunts for prey, that protects our

21:52

kin, that privileges people that

21:54

look like us or animals that look like us above

21:56

others. And as you explained it in the book,

21:58

the higher mind's goal

21:59

is to get to the truth, but

22:02

the primitive mind has a very different goal. It's

22:04

confirmation of its existing beliefs. So

22:07

in an era in which, Tim, there's just so

22:09

much abundance, there's objectively

22:11

so much freedom, you know, and the structures

22:14

that support the higher mind

22:16

are all around us. Why is it

22:18

that the primitive mind feels so

22:21

often like it's taking over? When

22:23

did the higher minds become less powerful

22:25

than the primitive ones?

22:27

Yeah, I mean, the primitive

22:29

mind is kind of our survival brain, and

22:31

it was programmed for a world

22:34

living in a small tribe. But what's cool

22:36

about humans, like you said, is that the

22:38

primitive mind in our brain, that's kind of the preprogrammed

22:40

software, and not

22:43

only does it start off thinking it's

22:45

in 30,000 B.C., it will never understand

22:47

that it's not there. Meanwhile, the higher

22:49

mind is this cool part of us that can

22:51

override, can see what's happening, see

22:53

that our instincts don't make sense here in

22:56

one area or another, and

22:57

actually override them. I use the example

22:59

of candy. When we

23:01

binge on candy, and then we regret it, or any kind of unhealthy

23:03

food, it makes no sense, right? We're

23:05

mad at ourselves later, and this is the thousandth time this

23:07

has happened. And what's happening is

23:09

that the primitive mind is programmed for

23:11

a world where calories are hard to come by, and

23:14

if you don't eat this dense, chewy

23:16

fruit that you just came by (of course, it's candy) right now, you might not find

23:18

calories for two weeks, so binge,

23:21

right, which makes sense then. It

23:24

doesn't make sense today. And so our higher minds

23:26

can actually get in there and override this. That's

23:28

why we don't always eat unhealthy food, and

23:30

some people can be really good at developing healthy eating

23:32

habits, and other ones not so much, because

23:35

it's this kind of tug of war in our heads between

23:37

this one voice that wants to do what

23:39

it's programmed for, and the other voice that says, wait a second,

23:41

wait a second, and the world we live in,

23:43

what we're programmed for, makes no sense here. So

23:46

it's a general concept, but you can apply it to lots of

23:48

things. You can apply it to how we think, right? So

23:51

you can say that it makes sense in today's world to

23:53

try to find the truth. We don't want to be delusional, and,

23:56

you know, we want to

23:57

play with ideas and change our mind and get wiser

23:59

as we grow older, that makes sense. But

24:02

the other part of our brain, the pre-programmed

24:04

software, is made for

24:06

a world where the tribe has their beliefs,

24:08

their sacred beliefs. And people

24:10

who would challenge those sacred

24:13

beliefs and were too independent

24:15

thinking and said, this doesn't actually make sense,

24:17

where's our evidence? They didn't fare very well.

24:20

And so it is our nature to identify

24:23

with

24:24

our sacred beliefs. They're part of

24:26

us, they're part of who we are, and they're part of our

24:28

group. The people like us, we believe these

24:31

things. And the last thing you'd ever want

24:33

to do is change your mind about that. So you

24:35

go through all this effort to confirm the sacred beliefs, and

24:37

you'll spend time with people talking about how great

24:39

the sacred beliefs are and how bad the people are who disagree

24:42

with it. And that's the survival brain

24:44

mode we're in when we're doing that. Or we

24:46

can override that. And we can say, you know what? I

24:48

know I have the urge to confirm my

24:50

beliefs right now, but that's unwise, and I

24:52

need to overcome that and actually

24:54

get better at changing my mind.

24:57

That's this kind of initial seed of a framework

24:59

that then I take into politics. And I'll just talk

25:02

about the higher mind's way

25:04

of doing politics and

25:06

the primitive mind's way of doing politics. And

25:08

I think that that's a framework we can add into our

25:10

existing discussions.

25:12

What is the higher mind's way of doing politics

25:14

and what is the primitive mind's way of doing politics?

25:16

I use a ladder because it's a spectrum, right?

25:18

It's not just one mind is doing the thinking, the other

25:20

is not. It's not as simple as that. Sometimes

25:23

it's a mix, right? Sometimes you're conflicted between

25:25

two things. You're doing honest research,

25:28

but you find a little confirmation bias there. So

25:30

maybe you're in the middle or whatever it is. So

25:32

the ladder is kind of a spectrum. And when you're

25:35

up in the high rungs, maybe you have some conflict,

25:37

but the higher mind is running the show. And

25:39

down in the low rungs, the primitive mind is running the show.

25:41

And then in groups, the primitive

25:43

minds in the group will kind of band together, and

25:46

the

25:46

whole group will be politically low rung together.

25:49

And it's very hard to get out of that once you're in it. And

25:51

likewise, a whole group can kind of work hard to stave

25:54

off those instincts and actually stay up on the high rungs

25:56

together. So we just talked about ideas.

25:58

That's one way you can think about this.

25:59

At the top, in

26:02

high rung politics, people care about

26:04

truth. They're open to debate, they're open to changing

26:06

their minds. And when they're right about something, they will argue

26:08

it fully. But they're open to being challenged

26:10

and they don't identify with the ideas. If they're wrong about

26:12

something, they don't have this fight or flight instinct.

26:15

They will admit they're wrong and they'll move on. Their

26:18

principles are consistent, again, because this is what makes sense.

26:20

So if you believe that government overreach

26:23

is bad, then you care about that when both

26:25

parties are in office. If you believe

26:27

that discrimination based

26:29

on skin color is bad, you care about that regardless

26:32

of the skin color being discriminated against. And

26:34

then finally with tactics, in politics

26:36

you want to change things, right? You want to make things happen.

26:39

You do it via persuasion in a liberal democracy.

26:42

So there's a focus on truth, there's consistency

26:44

with principles, and you try to get your

26:46

way by persuading others

26:49

and building a mind changing movement. And these

26:51

all go together. When a group is doing

26:53

one of those things, they tend to be doing all three.

26:55

Now, low rung politics, which

26:57

I think is born of our survival

26:59

brain's instincts, and when it, you know, bands

27:01

together with others, they do politics the

27:04

old school way, just pure tribalism.

27:06

So there's the good people and there's the bad people

27:08

with the good ideas and the bad ideas, completely

27:11

not open to changing their mind. There's tons of

27:13

confirmation bias. They don't give the other

27:15

side a fair hearing. They're really not usually open to

27:17

an honest debate. That's in the ideas realm.

27:19

And then again, with principles, there's total flip-flopping

27:22

on principles based on whether it helps the

27:24

tribe or not. So there's total lack of

27:26

consistency. We've seen a million

27:29

examples of this. And then tactics, you know, again,

27:31

the old school way to get what you want

27:33

is not persuasion. That's the way in this weird

27:36

house of liberal democracy that we do it. The

27:38

way we're programmed to do it is coercion. We will

27:40

try to force our way and force

27:42

people to do things with blackmail and fear and

27:45

violence sometimes. And so when

27:47

I look around at low rung politics, again,

27:50

people who are doing one of those things tend to be doing all

27:52

three. And I think this is just kind of like a vertical

27:54

axis. We can add to the left, center,

27:56

right, horizontal political axis

27:58

rather than saying, are you, you know,

27:59

left wing, right wing, far right, far left, are

28:02

you centrist? Well, how about making it a square

28:04

and being like... Are you high rung or low rung?

28:06

Yeah. And now you can be far

28:08

left and high rung, or you can be centrist and low

28:11

rung, or far right and in the middle somewhere.

28:13

And I think it's useful.

28:15

Why in our culture right now do

28:17

we so... I wouldn't even say reward

28:19

the low rung politicians, but I think people are actually

28:22

addicted to them. Like, I think people

28:24

really enjoy watching the

28:26

AOCs, watching the Marjorie Taylor Greens. Obviously,

28:29

those people are very different. I don't mean to compare

28:31

them, but what is going on in our

28:34

current culture where the low

28:36

rung people seem to be the ones that get

28:38

all of the attention, all of the rewards,

28:41

and no one really seems to care very much

28:44

about the high rung principled

28:45

ones? They're like the also-rans. Yeah.

28:48

People do care. What happened

28:50

is the media

28:52

landscape has totally shifted. And we're

28:54

in a world now where 24-hour

28:57

news networks, they

28:59

didn't used to exist, used to be a half hour of news

29:01

at night on three networks that broadcast to

29:03

the whole country. Now you've got 24-hour

29:06

news networks going all day to one

29:08

political tribe. And these

29:10

networks realize you could make... I think Fox

29:12

News probably pioneered this and

29:15

other ones have caught on. This idea that

29:18

you can make a lot more money if you kind

29:20

of say what you're doing is news and what you're really

29:22

doing is kind of political entertainment. And

29:25

I use the example of

29:27

like reality show. Reality

29:30

show is interesting all the time, even

29:32

though the actual reality is not that interesting,

29:34

because the editors cut

29:36

in a constant string of conflict with bombastic

29:39

characters

29:39

and it's fun, right? It's our

29:42

primitive minds get addicted to that thing. The same way we

29:44

get addicted to junk food, this

29:46

is political junk food. And

29:48

so these stations realized that

29:50

the same thing that Mars Inc. realized

29:52

about selling candy, you could make a ton

29:54

of money by selling junk food. And so

29:57

these news networks are really entertainment

29:59

networks that sell... to kind of our primitive

30:01

minds, they sell political junk food to our primitive

30:04

minds. And unfortunately, unlike candy, this

30:06

has major implications,

30:08

which is that the politicians who get cast

30:11

on the show, you see AOC is one of 400 something

30:14

people in the House. There are bills passed every

30:16

week that never get talked about, right? There's all these other, but

30:19

AOC is one of the characters on the show. She's

30:22

been cast on the reality show, and so

30:24

is Marjorie Taylor Greene, right? And Trump is one of

30:26

the major characters on the show. And so

30:28

they're going to be on all the time.

30:29

And so of course,

30:32

it incentivizes politicians

30:34

to say, well, getting on the show is a huge career

30:36

break. I need to be bombastic. And

30:38

so that's going to have a lot of effects on

30:40

people. Otherwise normal people are

30:43

going to get addicted to this reality show, and they're

30:45

going to be kind of sucked into kind of

30:47

hardcore political tribalism.

30:49

In addition to the sort of distinction you make between

30:51

primitive mind and higher mind, which I

30:53

loved, and the idea of sort

30:55

of high-rung political thinkers and

30:58

actors and low-rung ones, you also

31:01

make a distinction between two different

31:03

intellectual cultures, one that you

31:05

call idea labs and the other

31:07

that you call echo chambers. I would

31:09

love if you could give me an example of

31:12

an idea lab and an example of an echo

31:14

chamber.

31:15

Yeah.

31:17

So every group of friends has a culture that

31:19

includes how they do birthdays, how

31:22

they do texting, how they do

31:24

emojis, how they talk behind each other's

31:26

backs, what's acceptable, what's distasteful. Every

31:28

group, no matter what you're in, you're full of rules

31:31

about how we do things, your social rules. And

31:34

an idea lab to me is a group that has a high-rung

31:36

intellectual culture, where how we do

31:38

things here is

31:41

truth comes first. Truth matters. And

31:43

disagreement is great. Respectful

31:46

disagreement. People are to be respected.

31:48

Ideas are not. It's not cool to identify

31:51

with your ideas and get super offended if someone

31:53

disagrees with your idea, where

31:55

people call each other out on bias and on

31:57

logical fallacies.

31:59

You know, unearned conviction, someone who's acting

32:02

like they're sure and they turn out to be wrong a bunch

32:04

of times. That person is not cool in

32:06

the idea lab culture. They quickly lose

32:08

respect. People don't take them very seriously.

32:11

And so when primitive minds in the group act up,

32:14

someone will again get really offended, but everyone keeps

32:16

them in check. And kind of the idea

32:18

lab's immune system kicks in and says, wow,

32:21

you're upset about that, like, you know, and they get made

32:23

fun of for that, and then they don't want to do

32:25

that again next time. And that keeps the whole

32:27

group kind of up on the high rungs, and it keeps every individual,

32:29

because all of us are subject to this kind of internal

32:32

tug of war, it keeps every individual, their

32:34

mind up in the high rungs. You can't really get away with

32:36

slipping down too far, or the idea lab will call

32:38

you out. And so sometimes you can have a couple,

32:41

a married couple has an intellectual culture.

32:44

If one person knows that you just never

32:46

disagree with my husband on politics,

32:49

or it's gonna be a nightmare, that husband is imposing

32:51

the other kind of culture on the marriage. And

32:54

this can happen in groups. One person in the group

32:56

can, if they have enough cultural power

32:58

in the group, can kind of say,

32:59

no one is allowed to... Disagree

33:02

about X. Right. And so you quickly

33:04

can slip into the other culture, the echo chamber culture. And

33:07

groups do it together. When one person starts doing it, sometimes

33:10

everyone's just scared of them, but often the whole group starts doing

33:12

it together without even realizing they're doing it. And that's when

33:14

the primitive minds have taken over. And if you think

33:16

the primitive mind's goal in your head, in an

33:18

individual's head, is to confirm

33:20

the beliefs, your sacred beliefs, well, the group has

33:23

their sacred beliefs, and the primitive

33:25

minds band together to protect those

33:27

beliefs. So they're very hostile

33:29

to someone who says, I think we might be wrong, or I think the

33:32

other side is not so bad, or is right about

33:34

this thing. People will

33:36

call them a bad, you know, they'll basically be relegated

33:38

to the out group. They'll get a really negative

33:40

reaction, because they violated

33:42

something sacred. You know, in an idea lab, no idea

33:44

is sacred, but in an echo chamber, there's very sacred

33:46

ideas, and it's like going into a church and you're

33:49

slandering Christ. You don't do that in a church.

33:51

And so I don't think this is, you know, some people

33:54

do this and other people do that. I think we all can

33:56

think of different groups at different times of

33:58

the year where you find,

33:59

Oh my, we're being really echo chambery about

34:02

this right now. Oh, it's one of those things where we're

34:04

all getting a little too much pleasure about all

34:06

agreeing and all, we

34:08

constantly are just all on the same side and we're always

34:10

just talking shit about the people who disagree. And

34:13

you know it's bad when you realize that it's something, you

34:15

see something's being a little distasteful or going a little

34:17

too far and you have this incentive not

34:19

to say it because it's gonna kill the vibe. It's

34:21

gonna, people are gonna be kind of like, you know, roll their eyes

34:23

at you and maybe they'll talk shit about you now behind

34:25

your back. That means that the group has

34:28

slipped down the rungs of the ladder

34:29

into echo chamber land. And that's

34:32

fine, by the way, in a liberal democracy, you're welcome

34:35

to be part of echo chambers or idea labs as

34:37

long as you live and let live. When

34:39

you don't live and let live, that's when there's a problem.

34:41

Tim, is there an example from your life of

34:44

being in an atmosphere that was really

34:46

idea labsy or like a moment

34:49

over the course of the past few decades that you feel like, aha,

34:52

that was like a high watermark of America

34:55

celebrating the idea labs culture

34:57

because I don't think anyone listening to this

34:59

would disagree that we're in a culture

35:02

overall right now in which it feels

35:04

like echo chambers are the thing that are actively

35:07

being cultivated. Maybe another way of asking

35:09

this is, when did the idea lab go out of

35:11

fashion and are there any

35:13

pockets of it, you know, whether

35:16

it's a friend group or an institution or whatever

35:18

that you feel like are trying to revive it? I

35:21

think a ton of people, individuals want

35:23

to revive it. And there are pockets, Intelligence

35:26

Squared is a great

35:28

podcast. It's a two on two Oxford

35:30

style debate. And it's a classic idea

35:32

lab. Everyone's respectful. It's

35:35

two people taking one side, two people taking another

35:37

side. And it's basically

35:39

like, you know, you have two attorneys in that courtroom

35:42

and the audience and anyone listening can play juror

35:45

and listen to them clash and learn a little more

35:47

along the way. And you know, you hear really,

35:49

but everyone's really smart. You hear really compelling ideas

35:52

from both sides. It's fascinating intellectually.

35:55

And the way echo chambers

35:58

get formed is when it becomes kind

36:00

of social norm to say that one

36:02

side of this particular debate is

36:04

not welcome here because, and it's almost, it's never

36:07

stated as, because we're an echo chamber; no one admits that.

36:10

It's almost always because

36:11

those ideas are dangerous. Those

36:13

ideas are harmful. Or it's because we're

36:16

moral and that's immoral. Yes. The

36:18

idea that the other side of this debate

36:21

is actively, only bad

36:23

people would hold it. And actually

36:25

it's dangerous to even have

36:28

it in the room. And

36:31

the key is that you live and let live, right? So echo

36:33

chamber, you want to go form that with your group of friends or

36:35

you want to start an institution and they're openly dedicated

36:38

to a religion or to a certain set of

36:40

ideas. Great. You're welcome

36:42

in the US to go form your echo chamber. Just leave

36:44

everyone else alone.

36:45

And when you have that, if you scale that

36:47

up, what you have is a lot of idea lab

36:50

pockets and a lot of echo chamber pockets, but

36:52

that inherently makes the whole country a big idea

36:55

lab because each echo chamber is going to argue their

36:57

one position. Idea labs are all over the place.

36:59

They're going to change their mind, but they're still going to be arguing different

37:02

positions and you have this big mix

37:04

of ideas.

37:05

It's the federalism of echo chambers

37:07

makes a national ideas lab. I got

37:09

you. Exactly. And that's

37:12

in general, the idea with liberal democracy is that

37:14

you can have this low rung stuff going on

37:16

everywhere, just as long, but it has to be contained.

37:18

It can't go and start messing with other people

37:21

and infringing upon others. So you

37:23

have this grand idea lab. And what I think

37:25

the trajectory has been is that

37:28

echo chambers have begun forcefully,

37:30

kind of again, using coercion to

37:33

not

37:33

just police their own members, which is

37:35

okay coercion. Again, it's not admirable, but

37:37

there's nothing wrong with it from a liberal sense.

37:40

They've been using coercion to forcefully

37:43

expand, which I compared to like the

37:45

difference between a benign tumor and a malignant

37:48

tumor. The first

37:50

way is benign. It's going to police its own people.

37:52

It's going to, it's going to say, no one can disagree with me.

37:54

You know, or you're not my friend. Okay.

37:56

I can choose to be your friend or not. It's

37:59

this other mentality

37:59

that's saying actually no one outside,

38:02

even outside of our friends, is allowed to have these ideas.

38:05

And not only is that mentality increasing, but

38:08

it's been succeeding. I call that idea

38:11

supremacy, which is a distinct

38:13

difference from kind of the zealotry or just no

38:15

one can change my mind. Idea supremacy says no

38:17

one else, whether I know you

38:20

or not, is allowed to express these ideas. It's

38:22

trying to kind of play a cultural

38:24

dictator. And so you've had

38:26

these echo chambers expanding across

38:28

the land and kind of

38:30

forceful, coercive expansion and

38:33

kind of holding pockets that used to be

38:35

idea labs, now hostage and

38:37

saying the new rules here are the rules of our

38:39

echo chamber.

38:43

After the break, Tim

38:46

Urban explains why what happened

38:48

to our universities matters. Stay

38:50

with us. Today's

38:55

episode is supported by our sponsor, NetSuite.

38:57

NetSuite gives you the visibility and control you

39:00

need to make better decisions faster. NetSuite

39:03

has been the number one cloud financial system for 22

39:06

years. And now you

39:23

can defer payments of a full NetSuite implementation

39:26

for six months. NetSuite gives your

39:28

business everything you need in real time

39:30

and in one place to reduce manual

39:32

processes, to boost efficiency, to

39:34

build forecasts and to increase productivity

39:37

across every department of your company. Imagine

39:39

the power of having all the information in one

39:42

place to make better business decisions.

39:44

There's no payment and no interest for six

39:46

months and you can take advantage of their special

39:49

financing offer today. 33,000 businesses

39:51

already use NetSuite, giving

39:54

them visibility and control over financials,

39:56

inventory, HR, e-commerce and more.

39:59

Head to NetSuite for

40:02

a special one-of-a-kind financing offer. That's

40:05

netsuite.com slash Barry.

40:07

If you've been sizing up NetSuite to make the switch, take

40:10

advantage now. That's netsuite.com

40:12

slash Barry to get the visibility and control

40:15

you need to weather any storm.

40:17

This show is sponsored by BetterHelp. Dear

40:20

listener, I think you deserve to hear this from me.

40:22

You need the guidance and expertise of a licensed

40:25

mental health professional. I know how

40:27

hard it can be to take time for yourself between working,

40:29

running around town, and tending to your family

40:31

and friends. It's no wonder that a lot of

40:33

us are feeling stretched thin and a little burned

40:35

out. Therapy can give you the tools

40:38

you need to find more balance in your life

40:40

so you can keep supporting others without neglecting

40:42

yourself. BetterHelp Therapy is

40:44

entirely online and it's designed to be convenient,

40:47

flexible, and suited to your schedule.

40:49

To start, you fill out a brief questionnaire

40:52

to match with a licensed therapist. If it doesn't

40:54

work out, you can switch therapists at any time

40:56

for no additional charge. Find

40:58

more balance with BetterHelp by visiting

41:01

betterhelp.com slash honestly

41:03

today to get 10% off your first

41:05

month. That's betterhelp, B-E-T-T-E-R-H-E-L-P.com

41:07

slash

41:12

honestly.

41:15

The ultimate idea lab is supposed

41:17

to be the university, right? The university is

41:19

supposed to exist for a singular

41:21

goal. Maybe there are secondary goals, but the key

41:24

goal is the pursuit of truth, right?

41:26

And I think

41:27

one of the ways that

41:29

is most illustrative of the point you're making

41:31

in the book is to look at what has happened

41:33

to universities. And you dedicate almost

41:35

an entire chapter in your book to the problem

41:38

of universities today, and

41:40

the way that they've sort of been transformed from

41:42

idea labs into either echo

41:45

chambers or idea supremacist

41:47

chambers. You'll tell me the difference. And

41:49

you do this with this amazing illustration

41:52

that you call the social justice horse. And

41:54

it's sort of similar to the idea of the Trojan

41:56

horse, right? Except you have this social justice

41:58

horse which you draw

41:59

with this little... very cute rainbow mane

42:01

and tail. And the

42:04

social justice horse says

42:06

very, very lovely progressive

42:08

sounding things. Like the horse will say an inclusive

42:11

environment.

42:13

And yet the actual

42:15

idea that they're smuggling in under the

42:17

idea of an inclusive environment is

42:20

disagree with us and we'll smear you, for

42:22

example. Or the social justice

42:25

horse will say something like diversity

42:27

statements, but really the thing being

42:30

brought in

42:31

under this rhetoric is, as

42:33

you put it, McCarthyist political litmus tests

42:36

and loyalty oaths, right? And this sort

42:38

of theory of the social justice

42:41

horse or the mechanics of the social justice horse

42:44

is happening

42:45

all over the country, not just in elite schools,

42:48

but in schools coast to coast, everywhere in

42:50

between. And we've gotten to

42:52

a point in

42:53

which something like 52% of college

42:55

students, according

42:57

to one recent survey say, they always

43:00

or often refrain from expressing

43:02

views on political and social issues in classrooms

43:05

because of concern for how

43:07

it will be perceived, out of concern for their reputation,

43:09

out of concern for their grades. How

43:12

did we get here? How was this

43:15

social justice horse so

43:18

unbelievably effective? So

43:21

one of the first things I wanted to do when I was getting into

43:23

this very spicy topic of social justice

43:26

is I

43:27

said, we need two terms here because

43:29

there's two completely different

43:31

things that are called social justice. The

43:33

first is what I would call liberal

43:36

social justice, liberal meaning classic liberal.

43:38

And so we talked about the liberal house, right?

43:40

This idea of liberal, this

43:43

house we live in that has liberal rules and liberal

43:45

norms and liberal laws and liberal

43:47

social justice,

43:49

its goal is to make

43:51

the house more perfect. It says liberalism

43:54

is great. The constitution

43:56

is awesome,

43:56

but we don't always succeed in keeping

43:59

its promises. There are flaws

44:02

in the house. This awesome house

44:05

has some, you know, some people

44:07

or policies or norms have, have

44:09

made it weaker and have made it actually

44:12

some people in this house are being treated

44:15

unfairly in a way they're not supposed to be treated

44:17

in the liberal house. This

44:19

was of course what Martin Luther King, you know, in his I

44:22

have a dream speech, he talked about a promissory

44:24

note and how the US has

44:26

defaulted on its check to

44:28

black Americans, right? So that's him saying this house

44:31

is great, but black Americans are

44:33

not being treated the way this house is supposed to treat them.

44:35

Let's fix the house. That's the goal. Let's

44:37

make it the best house it can be. And

44:40

so not only does it have liberal goals,

44:42

right, which is he wants more liberalism, but it has,

44:44

it uses liberal means, right? The

44:46

civil rights movement was all about free

44:49

assembly and free speech and,

44:51

you know, protest and all of these tools

44:53

of liberalism, they're tools that the house gives you

44:56

to fix the house. And you know,

44:58

this idea of colorblindness is a very liberal

45:00

idea, the individualism, right? It's not

45:02

about the color of your skin. It's about who

45:04

you are as a person. This idea of your

45:06

character is another way of valuing

45:09

individuals.

45:09

Each individual is a sacred thing, period.

45:11

It doesn't matter what else is, you know, about them. And

45:16

so that's liberal social justice. This is, this is the

45:18

movement behind gay marriage in 2012. This

45:20

is the movement behind women's suffrage

45:22

back in the 1910s. So

45:25

it's a great tradition in the US and it's something that most

45:27

Americans, you know, for sure progressives, but even

45:29

a lot of conservatives are very proud of this movement.

45:32

Now, what's the other thing? There's another

45:34

thing called social justice right now, which

45:37

is what I call social justice fundamentalism, otherwise

45:39

known as wokeness. And it's

45:41

important because wokeness itself sounds derogatory.

45:43

It sounds like it's, you know, it has a lot of cultural

45:46

baggage there. So I try to just

45:48

use a different term to describe what it actually is, which I

45:50

call social justice fundamentalism or SJF.

45:53

And this movement is, self-proclaimed,

45:56

outside the house with a wrecking ball. The

45:59

house, because it

46:01

was built by flawed people, is rotten

46:03

to its core. Its foundation

46:05

is built to uphold the power

46:08

of the powerful, in

46:10

this case, you know, white supremacy or the patriarchy.

46:13

And that liberalism itself is an

46:15

invention of those ideas

46:17

which,

46:19

whether it was intended or not,

46:22

has the property of being exploitative

46:25

and of enhancing inequality

46:28

and of entrenching the power of the powerful

46:31

and holding down the oppressed. And

46:33

so that's a fundamental disagreement. Liberal social

46:36

justice and social justice fundamentalism have

46:38

opposite, not different, but opposite goals. One

46:40

wants to make the house better and one wants to

46:43

break the house down. They use words like

46:45

liberate, liberate from this whole

46:47

system. It's very revolutionary,

46:49

much more revolutionary than liberal social justice.

46:51

Liberal social justice wants to overhaul norms

46:54

and policies and laws. Social

46:56

justice fundamentalism wants to overhaul the whole

46:58

house, just level it and build

47:00

something new. And that's

47:03

okay. That's the thing: liberalism is

47:05

awesome because in a liberal democracy, it actually

47:07

has room for even critique of itself. Sure, bring

47:09

it in, bring it in. You hate liberalism, bring it into

47:11

the discussion. You're allowed to be here. Go

47:14

make your arguments, do whatever you want. Try to persuade people.

47:16

Maybe we're all missing something. Sure, go for

47:18

it. What's not okay is

47:20

illiberal tactics. And

47:22

that kind of goes together. Because if you think the house is bad,

47:24

well, you also think that the same tools that

47:26

were used by the civil rights movement, those liberal tools

47:29

like free speech and all those things, that

47:31

those things are bad too. Black

47:33

feminist Audre Lorde says you can't dismantle

47:36

the master's house with the master's tools.

47:38

So there's a lot in that statement. It's the idea that A,

47:40

this is the master's house. It doesn't belong to all of

47:42

us. It is a house for slave masters.

47:45

And therefore the tools themselves are rotten, just

47:48

as rotten as the house itself. And so

47:50

they will be hostile to all kinds

47:52

of liberal things,

47:54

free speech. So free speech

47:57

is actually dangerous, right? It's allowing

47:59

the platforming of dangerous ideas.

48:02

Not the First Amendment necessarily, but the culture of free speech

48:05

is bad. The US and the West, they're

48:07

bad. The group is what matters, much

48:09

more than the individual. And that's

48:11

why they will talk about, they

48:14

will make broad generalizations about white

48:16

people and black people. They treat these groups as monoliths.

48:19

There's a common enemy kind of tone to it instead of

48:21

the much more classic liberal common humanity

48:24

tone. Equality of opportunity is

48:26

a liberal staple. But social

48:29

justice fundamentalism believes

48:29

that

48:31

because groups are the same, there's no such thing

48:33

as equality of opportunity that doesn't lead to equality

48:35

of outcome. And so they actually

48:38

are for equality of outcome. That is

48:40

very anti-liberal, because you

48:42

can't have freedom with equality of outcome, enforced

48:45

equality of outcome. So there's a lot of these examples,

48:47

but again, it is like you mentioned with the social

48:49

justice horse, it's tricky, right? It's

48:53

a little bit, they don't quite go

48:55

out and say this. They'll say, we want a

48:58

safe space. But really

49:00

what they're saying is a space that

49:02

doesn't have free speech in it. Where

49:05

our ideas are treated as sacred. And

49:07

they'll use things like the harm principle and saying

49:09

that, well, this thing is harmful, so therefore it must

49:11

be stopped. So back to universities.

49:14

What's happened

49:16

is this: in

49:18

the 60s, liberal social

49:20

justice was the major thing at universities, right?

49:22

You have Berkeley, you have free speech protests,

49:25

right? And we want more liberalism. But there

49:27

was also developing, in the corners of universities,

49:30

this concept of SJF, this neo-Marxist

49:32

kind of take on social justice. And

49:35

what's happened much more recently is, as

49:38

universities have gone from kind of pluralistic

49:41

with some conservatives,

49:43

more progressive, but some conservatives, they've

49:45

transitioned to be almost entirely

49:48

progressive. And the numbers are stark. The

49:50

ratio of professors

49:53

left to right has gone from like four to one to,

49:55

in some departments, 17 to one, 40 to one,

49:57

sometimes 100 to one.

49:59

And so when an environment

50:02

is kind of purple, it has good

50:04

defenses against extreme

50:06

components, both from red and blue. But

50:08

when an environment becomes bright blue and

50:10

things get really kind of tribal, it

50:13

becomes weak. It develops

50:15

a soft spot for kind of illiberalism

50:17

of its own color. So social

50:20

justice fundamentalism has taken advantage

50:22

of this transition to political

50:25

purity at universities and

50:27

has been able to

50:29

rise up and actually institute

50:32

its own values. And so like

50:34

you said, the university is supposed

50:37

to be the ultimate idea lab, right? Veritas

50:39

is written on the gate above the university. And

50:42

for Veritas to happen, you have to have not

50:45

just ideas put out, but people who will challenge

50:47

those ideas. Or else there

50:49

is no way for a group of people to

50:52

find truth together if no one's allowed to disagree.

50:56

But as SJF has risen up, it

50:58

has created new rules and

51:00

has basically said, we live in a small echo chamber

51:03

whose rules are now actually going to be the rules of the

51:05

entire campus. And

51:08

ideas that disagree with SJF in

51:10

particular, with our tenets, are

51:12

going to lead to firings or

51:15

to ostracism or to the investigation

51:17

of students. And that completely turns the whole

51:19

point

51:23

of a university on its head.

51:24

For the skeptical listener who's like,

51:27

I don't want to say SJF, but maybe sympathetic

51:30

to some of its aims, or who just

51:32

believes that what happens at universities

51:35

doesn't much matter, with the

51:37

triumph of the social justice horse

51:40

in what is meant to be the ultimate ideas

51:42

lab, what has been the impact of that on the

51:44

broader culture in this country?

51:45

Well, it matters in two ways.

51:49

Universities educate young people

51:52

who then go and become our future leaders

51:55

and are running the country 20, 30 years later.

51:58

And what

52:00

universities are supposed to teach college

52:03

students is how to think,

52:05

how to debate, how

52:07

to find the truth, how to be tougher, more robust thinkers,

52:10

and teach them a wide variety of lenses,

52:12

political lenses and other kinds

52:14

of lenses that they can then you know take into their

52:16

heads and use. It's kind of training for your brain

52:19

so you go to college, you come out a better

52:21

thinker forever which of course would serve our society.

52:24

But it also teaches kind of general

52:27

liberal values.

52:29

It teaches students that disagreement is okay

52:31

and it's supposed to teach students that

52:34

enforcing an echo chamber

52:36

on an idea lab institution is

52:38

not a good thing. So when an

52:40

ideology like SJF which is you know

52:43

kind of illiberal to its core takes

52:46

over, instead of teaching students

52:48

a wide variety of lenses, it teaches

52:50

them one lens. So that's the difference between teaching them how

52:52

to think and arming them with a lot of tools and teaching

52:54

them what to think, teaching them there is one

52:57

correct worldview with

52:59

one correct, you

53:01

know, set of politics in it. And

53:04

if

53:04

you try to bring a speaker to campus that disagrees

53:06

with it we will you know disinvite the

53:09

speaker or shout them down. If

53:11

you try to teach us that wide

53:13

variety of views in a class that professor is

53:15

going to be reported. And so students

53:17

instead of soaking in this thing,

53:19

this idea lab culture that makes them better

53:22

thinkers and more humble about what

53:24

they know, are being taught to be zealots

53:26

and to be intellectual bullies. It teaches them that the

53:28

way to be a good person who

53:31

tries to fight against harm is that

53:32

you need to

53:34

punish anyone who shares harmful

53:37

ideas which happen to be of course the ideas that

53:39

conflict with SJF. So

53:41

it's kind of doing the exact opposite of what I

53:44

think

53:45

we want colleges teaching young people to do,

53:47

both, you know, in terms of

53:49

what they're teaching them and also how they're teaching

53:51

them to be as thinkers and how they're teaching them to treat

53:53

others. So that's going to affect all of us.

53:55

Those students then go enter companies

53:57

and start wreaking havoc there with

53:59

this idea of someone

54:02

said something harmful and they need to be fired for it, we'll

54:04

start a petition and use our power

54:06

to try to get this person fired.

54:08

And eventually those people are our leaders and

54:10

they're making policies. So that of course affects

54:12

everybody. The other thing that universities do

54:14

is they're supposed to be our primary truth finding centers.

54:17

Universities are the center of

54:20

academic research and science that

54:22

happens at universities. And

54:25

what a society knows is basically

54:28

what universities produce for knowledge. And

54:32

when an ideology that does

54:34

not believe in Veritas

54:36

culture takes over, it starts

54:38

to affect what can and cannot be researched.

54:41

And it starts to maybe lower the standards

54:44

for

54:45

academic work that

54:46

confirms SJF and it starts

54:49

to retract papers

54:51

or not publish them in the first place or

54:53

maybe even fire the professor for having the nerve to write

54:56

it for ideas that conflict with its

54:58

ideology. And so that also harms

55:00

everyone because there's already a strain in this

55:02

country that doesn't believe the science, right?

55:04

And that's no matter what and that's not good. But

55:07

this

55:08

gives so much credence to those,

55:10

that whole idea. And so it's really

55:12

bad for knowledge production and it's bad for education

55:14

of our future leaders. And often

55:17

what happens at universities ends

55:19

up happening everywhere else a little bit later.

55:22

Social media started at universities with Facebook, and

55:24

it soon was everywhere. There's a lot of cultural

55:27

fads. And so when you see something happening

55:29

at universities, people should take it seriously because it

55:31

very well might appear across society

55:34

five, 10 years later, which of course in this case it

55:36

has.

55:37

Tim, there are a lot of huge,

55:39

I would say almost existential themes to your work.

55:42

But one of the biggest themes that runs through

55:44

everything you write

55:45

is the question of time. And

55:48

I think you do such an incredible job of

55:50

making readers aware

55:53

of a thing that feels

55:55

perhaps more abstract than almost anything.

55:57

And you just do an incredible job of

55:59

making it concrete, making it real for us.

56:02

One of the things that

56:04

you have talked about personally is that you're

56:07

an epic waster of time, or at least you used

56:09

to be. In 2016, you

56:11

gave this TED talk that I think now has something like 50

56:14

million views, which grants it a spot on the

56:16

most popular TED talks of all time, right

56:18

next to Brene Brown and Bill Gates. And

56:20

it's called Inside the Mind of a Master

56:22

Procrastinator. And it's all about how

56:25

the procrastinator's mind works, which

56:27

as you describe it, contains a rational

56:30

decision maker and an instant

56:32

gratification

56:32

monkey. And the instant

56:34

gratification monkey takes over

56:36

in the mind of procrastinators and

56:38

throws the rational decision maker part

56:41

of the mind out by the wayside.

56:43

But then toward the end of the talk, you

56:45

say that you had an epiphany, which

56:48

is that actually we're all procrastinators.

56:51

Why are human beings such skilled

56:53

procrastinators?

56:55

Well,

56:56

I think this relates to what we were just talking about.

56:58

I think that we're procrastinators for the

57:00

same reason we eat

57:03

unhealthy food and fall into tribal

57:05

politics, which is that, for the

57:08

primate we are, the world we were programmed

57:11

to be in didn't really have

57:13

long term projects that often. You

57:15

had to get food, you had to survive, you had

57:17

to mate, and you had

57:20

to sometimes fight. And so you conserved energy

57:22

most of the time and you expended it when you

57:24

had to. Now

57:27

we live in this world, we've been kidnapped out

57:30

of our home forest and we

57:32

are dropped into an advanced civilization where the

57:34

way to have a gratifying

57:36

successful career is to

57:39

think really long term and to work hard on stuff

57:42

today that you might not see the fruit of for months

57:44

or years. It's

57:45

to resist the marshmallow in the test. Yeah,

57:47

exactly. We have to work really hard to override

57:50

our sense of instant gratification because in the

57:53

world we were supposed to be in, you really didn't need to override

57:55

it very often. And

57:58

so I think procrastination is...

57:59

is a problem a lot of us have.

58:02

But I think there's kind of two kinds of

58:04

procrastination. There's the first

58:06

kind, which is the one that we usually talk about when people say, 20% of

58:09

people are chronic procrastinators: we talk

58:11

about deadlines. And people who get

58:13

behind on deadlines or they're late to a meeting, they're

58:16

unable to do the work until

58:19

they absolutely panic. Basically, there's

58:22

this resistance in them that will

58:24

resist and resist and resist until this panic

58:27

gets bigger and bigger and bigger. And eventually, the

58:29

panic threshold crosses the

58:31

resistance threshold. And they will freak out

58:33

and cram for the test or whatever. Do

58:36

their work. And that often leads to, obviously, stress.

58:38

And it's not healthy. And often, you don't do your best work.

58:40

And it's a miserable way to live. But the

58:42

reason I said there was an epiphany that I think

58:44

that we're all procrastinators is that there's

58:47

a whole other much sneakier kind of procrastination.

58:49

And it happens in all the situations

58:51

when there's something important that has no deadline

58:53

at all around it. So

58:56

that's a lot of stuff at work, of course. We're

58:59

trying to improve on our skills

59:02

and to actually rethink

59:04

the company culture or whatever. There's a lot of examples

59:06

at work of stuff that you would call kind of important,

59:09

but not urgent. No deadline. But

59:12

there's a ton of stuff outside of work. There's

59:15

no deadline on seeing your family enough. And

59:17

how many people regret not spending more

59:19

time with their parents before they passed

59:21

away or not spending more time with their kids before

59:23

they left for college or whatever it is. That

59:25

is such a common regret.

59:29

Our brain is a tool. And it's often not that smart.

59:32

And one of the ways that your brain is not that smart is that

59:35

we have this kind of delusion in our heads that

59:38

time is unlimited,

59:39

that there's endless weeks ahead. And that's

59:41

not true. And time is

59:43

quite finite. If there is a friend

59:45

that you really love, but you see them,

59:48

I don't know, once every two years, because

59:50

they live in a different city. Every couple of years, you catch

59:52

up and you have an amazing four-hour drink and

59:54

dinner. And it was

59:56

such a good time. And then that's it. You see them two years

59:58

later. Okay. Well, just say

1:00:01

you're 30 and you, you know,

1:00:03

maybe, if you're lucky, both of you live into

1:00:05

your 80s. That's 50 years. If

1:00:08

you're seeing them every two years,

1:00:10

it's this crazy moment when you realize that I'm seeing

1:00:12

them 25 more times ever. We

1:00:15

have 25 of these dinners left. What?

1:00:17

That doesn't, the math doesn't add up. And so

1:00:20

I think that when it comes to a lot

1:00:22

of really important things, when there's no deadline,

1:00:25

the combo of that and the, this delusion

1:00:27

that we have unlimited time and unlimited, you

1:00:29

know, rounds of something, makes

1:00:31

us very complacent in a way that we

1:00:34

shouldn't be. So you have two things you can do with that information.

1:00:36

Going back to that friend example, one is really

1:00:39

savor that time with your friend. Maybe that's true.

1:00:41

Maybe that's a sad but true fact that you're going to see them 25 more times

1:00:44

if you're lucky and

1:00:46

really enjoy it and appreciate what you're seeing here.

1:00:48

You have 20, this is, this is number two out of 25.

1:00:50

Next one is number three out of 25.

1:00:53

Or you can increase that

1:00:55

number 25 by prioritizing

1:00:57

that friend more. And maybe you're going to go and

1:00:59

try to see that friend twice a year. Okay. You

1:01:01

just expanded that number from 25 to a hundred. So

1:01:04

there is a lot of things you can do with time when you look

1:01:06

it in the face. And when you look it in the face,

1:01:08

it's often different than what you think it looks

1:01:10

like.
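
A minimal Python sketch of the arithmetic Tim walks through here, assuming his hypothetical numbers (age 30, living to about 80, a dinner every two years versus twice a year); the function and its parameters are illustrative, not anything from the conversation itself:

```python
def remaining_meetups(current_age: int, final_age: int, meetups_per_year: float) -> int:
    """Rough count of how many more times you'll see someone at a fixed pace."""
    years_left = max(0, final_age - current_age)
    return int(years_left * meetups_per_year)

# Tim's example: seeing a friend once every two years from age 30 to 80.
print(remaining_meetups(30, 80, 0.5))  # 25 dinners left
# Prioritizing the friendship and seeing them twice a year instead.
print(remaining_meetups(30, 80, 2))    # 100 dinners left
```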

1:01:11

You have this blog post that I

1:01:13

got kind of obsessed with called The Tail End.

1:01:16

And you just, it puts in perspective

1:01:18

visually how little time the average

1:01:21

person has to do the things

1:01:23

they love or spend time with the people

1:01:25

they love, you know, from how many dumplings

1:01:27

you have left to eat. You have like a whole graph of little

1:01:30

baby dumplings to the amount of oceans

1:01:32

you're likely to jump in before you die. And

1:01:34

I think the most affecting part of it is

1:01:37

about your parents. And you say, it turns

1:01:39

out that when I graduated high school, I had already

1:01:41

used up 93% of my

1:01:44

in-person parent time.

1:01:46

I'm now enjoying the last 5% of that

1:01:48

time. We're in the tail end. In

1:01:50

other words,

1:01:52

you basically get 18 years with your

1:01:54

parents and then the whole rest of your life

1:01:56

is the additional year in terms of actual

1:01:59

time you spend with them.

1:01:59

And I wonder, in doing

1:02:02

all of these exercises that, if

1:02:04

I'm honest, have a huge impact on

1:02:06

me when I read them, and then I kind of am

1:02:08

lulled back into the sleep that most of us go

1:02:10

through life in when it comes to something that

1:02:13

seems infinite like time. Has

1:02:15

doing these exercises actually

1:02:18

changed the way that you function in the

1:02:20

world? Has it changed the way that you

1:02:22

prioritize time with your family,

1:02:24

time with your wife, time with your daughter? Has

1:02:27

it transformed the way that you live

1:02:29

out

1:02:29

your day to day? Yes,

1:02:32

but not as much as it should.

1:02:34

So I've been arguing

1:02:37

on the side of more family vacations. And

1:02:40

I used to think, oh, we're all going to be in this

1:02:42

house for a week. And I'll make it for the last four

1:02:44

days, but I have a lot going on. And

1:02:47

I stopped doing that. If there's a thing going on, I'm not

1:02:49

missing an hour of it. And

1:02:51

then the other side of it, when I am with people I love, I

1:02:54

do sometimes find myself thinking, this is

1:02:57

precious. This is precious. There

1:02:59

is actually a very finite number of these moments.

1:03:01

And so yes, it does have an effect on me,

1:03:04

but

1:03:04

I'm like you too, where I

1:03:06

then go and I, it's so easy to just

1:03:08

fall back into our happy human haze of

1:03:11

thinking time is endless. And

1:03:13

by the way, another delusion that goes

1:03:15

along with this, which is

1:03:17

this delusion that things are just, I'm

1:03:19

the way that I am. Things are the way they

1:03:21

are. This is how much I see people. This is the city I

1:03:23

live in. That's just what it is. And

1:03:26

that's actually not true at all. You have a lot

1:03:28

of agency over where to take your life and you can

1:03:30

make big changes if it's important. And

1:03:33

so yes, I can say all

1:03:34

this. Do I do it enough?

1:03:36

No. But because it's in my head, I do

1:03:39

feel like I can be on a path getting 10% better

1:03:41

about this every year. You know,

1:03:43

like I will say like my grandmother is

1:03:45

very old, right? She's,

1:03:48

you know, 98, and

1:03:51

I will try to

1:03:53

make sure I spend time with her. You

1:03:55

know, I probably spend double the time with her that

1:03:57

I would have before I wrote this

1:03:59

post.

1:03:59

So I do think it has an effect, but I

1:04:02

think if you're really looking it in the face, people

1:04:04

should be doing things like moving to the city

1:04:07

that their parents or friends live in, even

1:04:09

though they don't like that. That's intense,

1:04:11

but that is kind of what's called for here,

1:04:13

but I don't do that and a lot of people don't do that.

1:04:16

How has the advent of

1:04:18

this piece of glass that I'm holding in my hand

1:04:21

made this challenge even more difficult,

1:04:23

the challenge of procrastination and avoiding

1:04:26

what's truly important in our lives?

1:04:27

Yeah, I mean the phone slash

1:04:30

the computer internet, all

1:04:32

of that to me goes together, because for me it's, if

1:04:34

I'm trying to write, I'm writing on the same

1:04:37

exact device that

1:04:39

I procrastinate on. I mean the world,

1:04:42

I could go into a prison cell for 100

1:04:44

years and never run out of shit

1:04:46

to do on the internet. You're tempting

1:04:49

the instant gratification monkey much, much more.

1:04:52

If you're someone who struggles with unhealthy eating, well a great

1:04:54

way to deal with that is to not have any unhealthy food in the house,

1:04:56

but the internet

1:04:57

basically is like filling all of our houses just

1:04:59

out there in front of us, is just junk food

1:05:01

everywhere. So now the willpower

1:05:03

required is a lot greater. So yeah, that's

1:05:06

a big one.

1:05:07

Did people used to procrastinate on the typewriter?

1:05:10

Like will human beings always find

1:05:12

a way to procrastinate? I do wonder

1:05:13

what, like, people in the 1700s procrastinated on.

1:05:19

And it must have been something, they had books, so

1:05:21

it must have been something, but it was definitely

1:05:23

less tempting. But I think there's writers

1:05:26

who for

1:05:27

centuries have said stuff like, I

1:05:29

love having written, but I hate the process

1:05:31

of writing. I mean, actually the

1:05:34

word procrastination

1:05:36

is a Latin word. It

1:05:38

means to put off till tomorrow. So

1:05:41

we're talking about the Roman Empire, this was

1:05:43

a problem. And by the way, there's another word

1:05:45

called perendination I think.

1:05:47

And to be a perendinator

1:05:50

is actually to put things off till the

1:05:53

day after tomorrow. So procrastination

1:05:55

is to put things off till tomorrow, perendination to put things off till the day after tomorrow,

1:05:57

so they have a nuanced understanding of someone who's

1:05:59

a

1:05:59

disaster or like a super disaster in

1:06:02

this regard. So yeah, like I'm sure Julius Caesar

1:06:04

was a procrastinator. I just don't know what

1:06:06

he exactly did when he was procrastinating. All

1:06:08

right. Let's talk a little bit about technology

1:06:11

and happiness and the way that technology

1:06:14

is either making us happier or more miserable.

1:06:17

You once tweeted this 300 year old quote

1:06:19

by Montesquieu that says this, if

1:06:21

you only wish to be happy, this could easily be

1:06:23

accomplished, but we wish to be happier

1:06:25

than other people. And this is always difficult

1:06:28

for we believe others to be happier than

1:06:30

they are.

1:06:31

And I think everyone would agree that social

1:06:34

media has made it impossible not

1:06:36

to compare ourselves to others,

1:06:39

right? It's what we do all of the time, every single time

1:06:41

we're looking at Instagram, they're happier on their vacation.

1:06:43

They're skinnier than I am. They have the better clothes,

1:06:46

right? How has social media put

1:06:48

the human urge to compete and compare

1:06:51

that Montesquieu talked about 300

1:06:52

years ago on steroids? And

1:06:55

is there any way to resist it?

1:06:57

I have a term I call like image crafting, which

1:06:59

is I think what people do on social

1:07:02

media, they image craft, right? They're going

1:07:04

to present a

1:07:06

person that

1:07:07

is not them,

1:07:09

but is who they want people to think they are,

1:07:11

which people, again, people have always done. But

1:07:14

social media, it's much,

1:07:16

yeah, it's like you said, it's on steroids.

1:07:19

A, people don't broadcast their failures

1:07:22

and they don't broadcast their shitty vacations.

1:07:24

So you're already seeing this distorted lens. One

1:07:27

of the crazy things about humans is that how

1:07:29

we feel about our own life is almost

1:07:31

entirely derived from

1:07:34

comparison. So it

1:07:36

used to be, if your car isn't as nice

1:07:38

as your neighbor's car, you feel poor.

1:07:40

But now

1:07:43

comparison is in our face. Instead of seeing our

1:07:45

neighbor and our couple of coworkers and our friends

1:07:47

and how they're doing,

1:07:48

we see everyone

1:07:51

and we see the most successful. So instead of

1:07:53

seeing, you know, there's someone who's the most successful

1:07:55

person from your high school, right? Who you knew

1:07:57

in high school. Normally you might hear through

1:07:59

the grapevine

1:07:59

about them. Oh, you hear they're doing great,

1:08:02

whatever, whatever, you forget about it. You know, you

1:08:04

don't hear about them again for 10 years. Now that

1:08:06

person's in your face because everyone's talking about them online

1:08:08

and they're there and everyone's forwarding their things. And

1:08:11

so it's kind of a nightmare of comparison

1:08:13

now. And then you combine that with the fact that everyone's

1:08:16

presenting the best version of their life. And

1:08:19

you really have a recipe for

1:08:21

misery and you know, inequality is

1:08:24

always a problem, but inequality is really rubbed

1:08:26

in your face with social media.

1:08:29

There's this term coined by the sociologist

1:08:32

Ray Oldenburg called the third place, which

1:08:34

is exactly what it sounds like, right? It's a place outside

1:08:36

of home or work for adults or for

1:08:38

kids that cultivates a sense of community.

1:08:41

Starbucks wanted to pride itself on being

1:08:43

the third place. And for some people it's still a

1:08:45

bar, maybe a coffee shop or a community library

1:08:48

or a park or a playground, but it's meant to

1:08:50

be sort of like this common leveler where

1:08:52

everyone's welcome regardless of

1:08:55

social class, race, gender, et cetera.

1:08:57

But in our world today, I would

1:08:59

argue that the internet is

1:09:01

that third place,

1:09:03

right? And you can use it

1:09:05

to get lost in an app or TikTok

1:09:07

or Twitter or a game. And

1:09:10

in certain ways it's incredible. It attracts

1:09:12

those looking for a community. You know,

1:09:14

if you're living in a rural place, you can connect

1:09:16

to people all over the world that have a like-minded

1:09:19

view to you,

1:09:21

but it can also be this extremely destructive

1:09:23

thing in ways that I don't even need to go into because

1:09:25

everyone knows what I'm talking about, alienation,

1:09:28

isolation, radicalization, all of those

1:09:30

things. How do we

1:09:32

use this tool for the good? How

1:09:35

do we use this tool in a way that

1:09:37

cultivates our higher rung

1:09:40

values, our higher

1:09:42

mind? How do we protect

1:09:45

ourselves from slipping to the lower rungs,

1:09:48

from giving in to our primitive mind,

1:09:50

especially as technology is

1:09:53

continuing to advance? Who knows what's going

1:09:55

to be here six months from now because of AI?

1:09:57

People

1:10:00

who want to lose weight,

1:10:01

it's very logical to

1:10:04

keep only healthy food in the house. Surround

1:10:06

yourself by healthy food and you'll probably eat more of it.

1:10:08

You can do the same thing on the internet.

1:10:11

You can actually try to avoid junk food,

1:10:13

internet junk food, and surround yourself

1:10:16

with influences that'll make you better. Think

1:10:18

about Twitter. Twitter, people rag

1:10:20

on it as this hellscape, and it is, but

1:10:23

not for everyone. Just for us? Well,

1:10:26

yeah, certainly. But if

1:10:28

you're going to tweet about politics, you're going to invite

1:10:30

the hellscape into your world. But the point

1:10:33

is, a lot of people, they log on and they see a

1:10:35

bunch of interesting people talking about science and

1:10:37

history, and comedians making funny

1:10:39

jokes, and then some of their friends. And it's not a hellscape

1:10:41

at all. It's awesome, right? One thing I

1:10:43

did for this book is, because

1:10:45

I wanted to not end up in an echo chamber of people

1:10:48

who felt the way I did, I tried to follow

1:10:51

on different social media platforms a

1:10:53

wide range of people. If someone

1:10:56

was getting a lot of attention and I detested

1:10:58

what they thought, instant follow. I

1:11:00

want to see what they're saying. And then also, if someone

1:11:02

who I thought was a good thinker and they disagree with me, even

1:11:05

more so, instant follow. So you can

1:11:07

surround yourself by a wide variety

1:11:09

of views if you want to get the full picture and

1:11:11

not let yourself fall too much in an echo chamber. And

1:11:14

we talked about idea lab culture. You can choose

1:11:16

to go to the sites and listen

1:11:19

to the podcast and follow people that

1:11:21

you believe have a high-rung approach,

1:11:23

which means they might be anywhere on the political

1:11:26

axis or any other axis, but

1:11:28

they approach things like a grown-up,

1:11:31

not identifying with their ideas, not

1:11:33

attacking people who disagree with them,

1:11:35

but attacking their ideas. So you can

1:11:37

curate your own internet world pretty well,

1:11:40

I think, the same way. Again,

1:11:43

it's just like food. You're going to sometimes go

1:11:45

out or order delivery. You're

1:11:49

going to still end up going into this. Someone's going

1:11:51

to send you a tweet. You're going to end up scrolling down the comment section

1:11:53

and getting angry and all that. But

1:11:55

you can go a long way. Now, at a macro scale,

1:11:57

how do we do better? Because

1:12:00

collectively, we can be very smart and wise.

1:12:03

And also the lowest common denominator

1:12:05

can win out. And the worst of our human

1:12:08

nature can come out collectively. But

1:12:10

what we can do is if we build enough awareness about this

1:12:12

concept, and people already, there's a lot more people

1:12:14

talking about how social media is bad, right? That's new.

1:12:17

People, you said that you

1:12:20

didn't even need to list the things. You

1:12:22

said it was just obvious why the internet can be bad.

1:12:24

That's pretty new, actually. This

1:12:27

idea that these algorithms make us miserable.

1:12:29

And so right there, you're going to start having some

1:12:31

pressure. You're going to start having some

1:12:34

kind of shaming of the people who own the platforms

1:12:36

if their algorithms are geared towards engagement,

1:12:39

which of course usually means geared towards

1:12:41

amplifying anger and bombastic

1:12:44

people. And I think we could get to

1:12:46

a world where no one

1:12:48

would ever join a platform that still

1:12:51

has an algorithm like that. We all know that that's

1:12:53

not so. And in that world, everyone's

1:12:55

suddenly incentivized to make their algorithms

1:12:58

better and more pleasant. I think we could get there, where there's kind

1:13:00

of a mass shift where it becomes:

1:13:02

it looks like the Wild West back when the algorithms were

1:13:04

just going for engagement. And now, of course, we don't do that

1:13:06

anymore. And it wouldn't be

1:13:08

that hard for algorithms to find ways

1:13:11

to drive different behavior,

1:13:13

to reward different kinds of behavior.
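
As a rough, hypothetical illustration of what "rewarding different kinds of behavior" could mean in a feed-ranking algorithm, here is a small Python sketch; the weights, fields, and the idea of an explicit outrage score are invented for the example, not something Tim or any platform describes:

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    replies: int
    reshares: int
    predicted_outrage: float  # 0.0-1.0, from a hypothetical classifier

def engagement_score(p: Post) -> float:
    # Pure-engagement ranking: anything that drives interaction rises.
    return p.likes + 2 * p.replies + 3 * p.reshares

def adjusted_score(p: Post) -> float:
    # Same signals, but explicitly down-weights posts predicted to provoke anger.
    return engagement_score(p) * (1.0 - p.predicted_outrage)

calm = Post(likes=100, replies=20, reshares=10, predicted_outrage=0.1)
angry = Post(likes=100, replies=20, reshares=10, predicted_outrage=0.9)
print(engagement_score(calm), engagement_score(angry))  # identical under pure engagement
print(adjusted_score(calm), adjusted_score(angry))      # the angrier post now ranks much lower
```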

1:13:15

Just turning back to your book and the story

1:13:17

of us that we started this conversation with, and

1:13:20

where we are on your 1,000 page historical

1:13:23

timeline of humanity, you

1:13:25

say that the disasters on page 1,000

1:13:28

of the history of us are exponential

1:13:32

compared to the disasters on page 999.

1:13:35

Why is that? Technology's

1:13:38

a double-edged sword. I mean, look at the 20th

1:13:40

century. It had record

1:13:42

numbers in terms of GDP per capita

1:13:44

and the eradication of disease and

1:13:47

the fewest people ever in extreme poverty

1:13:51

and just general prosperity. But it also saw

1:13:53

the two biggest wars in history. It

1:13:55

saw the two biggest genocides in history.

1:13:58

And it saw the advent

1:13:59

of the biggest existential risk

1:14:02

weapon in history, the nuclear weapon. So

1:14:06

the same century that was the

1:14:08

best ever was also the scariest

1:14:10

ever, and in some ways the worst ever in certain

1:14:12

areas. So what does that mean

1:14:14

about the next century, right? What does that mean about

1:14:17

everything, if tech is continuing to explode?

1:14:20

It means that we could solve

1:14:22

everything, just advancements in AI

1:14:24

alone. I mean, we could solve all

1:14:27

disease, we could solve world hunger

1:14:29

and eradicate poverty, we

1:14:31

could solve climate change, we could solve

1:14:34

aging even, people could die

1:14:36

when they want to. This is all

1:14:37

realistic, this really could happen. But

1:14:39

the existential risk, now there's not just one,

1:14:42

there are many, and they go together,

1:14:44

they feed on each other. So

1:14:46

you can look at that and you could sum that up and say the

1:14:48

stakes are higher than ever. If

1:14:51

we can kind of move forward wisely, we

1:14:53

can live in what would seem like a utopia to

1:14:55

people today. And

1:14:58

if

1:14:59

we don't, then if you live in this

1:15:01

advanced society, the fall might be

1:15:04

the worst ever. So the point is people should be scared,

1:15:06

we shouldn't be cocky. And the reason

1:15:08

I like the liberal house, and I talk about it a lot,

1:15:11

is that I think that gives us our best

1:15:13

chance to proceed wisely. I think liberalism

1:15:16

is the tool and the system that

1:15:19

can get us to a really good future.

1:15:22

And I think the destruction of liberalism is

1:15:25

the ultimate existential threat because I think

1:15:27

it enhances all the other existential

1:15:29

threats. And so yeah, I don't want us to get

1:15:31

cocky about what we have and the stability

1:15:33

we have because we really need it going

1:15:36

forward, and we should never take it for granted.

1:15:39

You have a baby daughter. What

1:15:41

is the biggest piece of advice you

1:15:44

could give someone, maybe like

1:15:46

her, when she understands words, about

1:15:49

the world and the world she's being

1:15:51

born into as we move on to that

1:15:53

thousand and first page?

1:15:55

I would try

1:15:58

to teach her independent

1:16:01

reasoning. Conventional wisdom from 10, 30,

1:16:04

50 years ago is often not accurate anymore. It's

1:16:07

not wise anymore. Conventional wisdom does

1:16:09

not stay wise for long, and

1:16:11

it's always going to lag behind.

1:16:13

And so I

1:16:14

would encourage her to trust her

1:16:16

independent reasoning, and when

1:16:18

it conflicts with conventional wisdom about

1:16:21

how the world is, about where it's going, about,

1:16:23

you know, who the, you know, who the harmful

1:16:25

people and the productive

1:16:27

people are, to continually

1:16:30

observe and reflect. And when

1:16:33

what she comes up with there disagrees with conventional

1:16:35

wisdom, to trust it

1:16:36

and to continually stay

1:16:38

humble so that she can continue to change her

1:16:41

mind. Because if you live in 50,000 BC,

1:16:43

the world is the way it is. Your great grandparent

1:16:45

lived the same life you did. Conventional wisdom is the

1:16:48

same as it always has been, and it's wise. When

1:16:50

the world is rapidly changing, you have

1:16:52

to be nimble as a thinker and continue to adjust. So

1:16:54

I would want her to do that.

1:16:58

Tim Urban, thanks for coming on

1:17:00

today. Thank you, Barry.

1:17:10

Thanks for listening. If you like this conversation,

1:17:12

if it moved you in some way, or if

1:17:15

Tim said something you appreciated or disagreed

1:17:17

with or were provoked by, it's all

1:17:19

good. Share this conversation

1:17:21

with your friends, with your family, with your community,

1:17:24

and use it to have a discussion of your own. And

1:17:26

if you want to support Honestly, there's only one way to

1:17:28

do that. Go to thefp.com,

1:17:31

T-H-E-F-P dot com,

1:17:34

and become a subscriber today. We'll see

1:17:36

you next time. Transcribed

1:17:50

by https://otter.ai
