#365 – Sam Harris: Trump, Pandemic, Twitter, Elon, Bret, IDW, Kanye, AI & UFOs

Released Tuesday, 14th March 2023

Episode Transcript


0:00

The following is a conversation with Sam Harris,

0:02

his second time in the podcast. As

0:04

I said two years ago, when I first

0:06

met and spoke with Sam, he's one of

0:08

the most influential pioneering thinkers

0:11

of our time as the host of the Making

0:13

Sense podcast, creator of the

0:15

Waking Up app, and the author

0:17

of many seminal books on human nature

0:20

and the human mind, including The

0:22

End of Faith, The Moral Landscape,

0:24

Lying, Free Will, and

0:26

Waking Up. In this

0:28

conversation, besides our mutual fascination

0:31

with AGI and free will, we

0:33

do also go deep into controversial

0:35

challenging topics of Donald Trump,

0:38

Hunter Biden, January sixth,

0:40

Vaccines, Lab Leak, Kanye West,

0:43

and several key figures at the

0:45

center of public discourse, including Joe

0:47

Rogan and Elon Musk. Both

0:50

of whom have been friends of Sam

0:52

and have become friends of mine. Somehow,

0:55

an amazing life trajectory that

0:57

I do not deserve in any way and

0:59

in fact believe is probably a figment

1:02

of my imagination. And

1:04

if it's alright, please allow me to say a few words

1:07

about this personal aspect of the conversation

1:09

of discussing Joe, Elon, and

1:11

others. What's been weighing heavy

1:13

on my heart since the beginning

1:16

of the pandemic now three years ago,

1:18

is that many people I look to for wisdom

1:20

in public discourse stop talking

1:22

to each other as often with respect,

1:25

humility, and love. When the world

1:27

needed those kinds of conversations the most.

1:30

My hope is that they start talking

1:32

again. They start being friends again.

1:34

They start noticing the humanity that connects

1:36

them that is much deeper than the disagreements

1:39

that divide them. So

1:41

let me take this moment to say with

1:43

humility and honesty, why

1:45

I look up to and I'm inspired by

1:48

Joe, Elon, and Sam. I

1:50

think Joe Rogan is important to

1:52

the world as a voice of compassionate curiosity

1:55

and open mindedness to ideas

1:57

both the radical and mainstream. Sometimes with

2:00

humor, sometimes with brutal honesty,

2:02

always pushing for more kindness in

2:04

the world. I think

2:06

Elon Musk is important to the world as

2:08

an engineer, leader, entrepreneur, and

2:11

human being who takes on the hardest problems

2:13

that face humanity. And refuses to

2:15

accept the constraints of conventional thinking

2:18

that made the solutions to these problems seem

2:20

impossible. I think

2:22

Sam Harris is important to the world

2:25

as a fearless voice who fights

2:27

for the pursuit of truth against growing

2:29

forces of echo chambers and

2:31

audience capture, taking unpopular

2:34

perspectives and defending them with

2:36

rigor and resilience. I

2:39

both celebrate and criticize all

2:41

three privately, and they criticize

2:43

me, usually more so, for

2:45

which I always learn a lot and always appreciate.

2:48

Most importantly, there is respect

2:50

and love for each other as human beings.

2:52

The very thing that I think the world needs

2:55

most now in a time of division

2:57

and chaos. I will

2:59

continue to try to mend divisions, to

3:01

try to understand, not to deride,

3:03

to turn the other cheek if needed, to

3:06

return hate with love. Sometimes

3:09

people criticize me for being naive, cheesy,

3:12

simplistic, all that. I

3:15

know. I agree. But

3:17

I really am speaking from the heart

3:19

and I'm trying. This world is

3:21

too fucking beautiful not to try.

3:24

In whatever way I know how. I

3:27

love you

3:27

all.

3:29

And

3:29

now a quick few-second mention of each sponsor.

3:32

Check them out in the description. It's

3:34

the best way to support this podcast. We

3:37

got notion for AI powered

3:39

note taking and team collaboration. Indeed

3:42

for hiring great teams and the master

3:44

class for online learning.

3:46

Choose wisely my friends. Also,

3:49

if you want to work with our team, we're always

3:51

hiring, go to lexfridman dot com slash

3:53

hiring. And now, onto the full

3:55

ad reads, as always, no ads in the middle,

3:57

I try to make this interesting, but if you must

4:00

skip them, please still check out our

4:02

sponsors. I enjoy their stuff. Maybe

4:04

you will too. This show

4:06

is brought to you by Notion. A

4:09

note taking and team collaboration tool,

4:11

my favorite note taking and

4:13

team collaboration tool. And

4:16

they have a new feature Notion

4:18

AI that I've been using

4:20

and loving. And this thing is probably

4:23

the best implementation

4:26

of a system that

4:28

uses a language model

4:30

to generate text because

4:33

it integrates across the entirety

4:36

of your note taking process, and it

4:38

adds just a giant number

4:40

of small and big features that help

4:42

you out. That save a lot of time,

4:44

but also make everything more fun and

4:46

creatively sort of inject ideas

4:49

into your workflow. So just to list

4:51

some of them: they can edit the voice and

4:54

tone of the text you already wrote, so they

4:56

can rewrite it in different tone. They

4:58

can change the text, which I love. They can make

5:00

it shorter or longer. Also, they

5:02

can simplify the

5:04

text, which to me is at

5:07

the core of the writing process make

5:10

things as simple as possible, but not

5:12

simpler as Einstein said. And

5:14

to have tools that give you

5:16

ideas how to do that, not necessarily completely

5:18

automate everything, but give you really

5:21

powerful ideas on how to

5:23

get ninety percent there. This is just brilliant.

5:26

Also, if there's technical jargon, they can rewrite

5:28

the text and explain it simply. What

5:30

else? They can obviously summarize the text.

5:33

If you start writing, they can continue your

5:35

writing. If you're having

5:37

trouble starting to write and there's a

5:39

blank page glaring back

5:41

at you, they can generate

5:43

based on a topic, a bunch of text to get you going.

5:46

I mean, there's so many just amazing features.

5:48

I love I love it when

5:51

great, powerful language

5:54

models or any idea in AI is

5:56

then injected into a tool that's actually

5:59

usable and useful and

6:01

powerful across a number of

6:03

use cases to a huge number of people.

6:05

I mean, this is really, really, really exciting.

6:07

Notion AI helps you work faster, write better,

6:10

and think bigger. Doing tasks that normally

6:12

take you hours in just minutes. Try

6:14

Notion AI for free when you go to notion

6:17

dot com slash lex. That's all lower

6:19

case. Notion dot com slash lex

6:21

to try the power of Notion

6:23

AI today. This

6:26

show is also brought to you by Indeed, a

6:28

hiring website. I

6:31

used it. I continued to use it

6:33

to hire folks for the teams I've

6:36

been on, I've led.

6:38

From engineering to creative, everything

6:41

requires a rigorous systematic

6:45

artistic all

6:48

many adjectives I'd use, process

6:50

to build up an amazing team because

6:53

there's nothing more important to the success

6:56

of an endeavor or the success

6:58

of life or to just

7:00

your contentment and happiness and

7:02

joy and fulfillment and

7:04

a source of meaning than the team

7:07

that you take on and hard challenges of

7:09

life with, of work with. So

7:12

you should use best tools for the job of

7:14

hiring, and you should take hiring

7:16

very, very, very seriously. Don't

7:19

overspend on hiring. Visit indeed

7:21

dot com slash Lex to start

7:23

hiring

7:24

now. That's indeed dot com slash lex.

7:26

Terms and conditions apply.

7:30

This

7:30

show is also brought to you by Masterclass,

7:33

one hundred and eighty dollars a year,

7:35

gets you an all access pass to watch courses

7:37

from the best people in the world and their respective

7:40

disciplines. One of the people I just recently

7:42

talked to is Chris Voss. He's

7:44

a former FBI hostage

7:47

negotiator, brilliant guy.

7:49

On the mic, I really enjoyed talking to

7:51

him. There is

7:54

kindness, camaraderie, thoughtfulness,

7:57

humor, wit,

8:00

also a certain sort of cultural

8:02

density and complexity, hailing

8:04

from New York or whatever that

8:07

rich sexy accent is from. It's just really

8:09

fun to listen to him to

8:12

listen to him, discuss what he's really good

8:14

at. That was true in the podcast, and that

8:16

is very much true in his masterclass.

8:19

Well, he really systematically breaks down

8:21

his ideas of what it takes to negotiate with

8:23

terrorists, negotiate with

8:26

hostage takers, negotiate with

8:28

bank robbers, but I think the most important

8:30

thing is negotiate in everyday

8:32

life to negotiate

8:35

in business relationships, all

8:37

of that. It's just a really brilliant,

8:40

concise, clear actionable advice

8:42

that he gives. And that's true for almost

8:46

every single master class they

8:48

have and you get access to all of them. Get

8:50

unlimited access to every master class

8:52

and get fifteen percent off an annual

8:54

membership at masterclass dot

8:56

com slash Lex. This

9:00

is the Lex Fridman Podcast. To support

9:02

it, please check out our sponsors in the description.

9:05

And now, dear friends, here's

9:07

Sam Harris. What

9:25

is more effective at making a net

9:27

positive impact on the world? Empathy

9:30

or reason? It depends

9:33

on what you mean by empathy. There are two at

9:35

least two kinds of empathy. There's the

9:38

the cognitive kind, which

9:40

is, you know, I would argue

9:42

even a species of of reason. It's

9:44

it's just understanding another

9:47

person's point of view. You know, you understand

9:50

why they're suffering or why they're happy or what,

9:52

you know, we just you have a theory of

9:54

mind about another human being

9:56

that is accurate. And so

9:59

you can you can navigate

10:01

in relationship to them more effectively. And

10:06

then there's another layer entirely, not

10:08

incompatible with that, but just distinct, which

10:10

is what people often mean by empathy.

10:13

Which is more a

10:15

kind of emotional contagion. Right?

10:18

Like, you feel depressed and

10:21

I begin to feel depressed along

10:23

with you because, you know, it's just it's

10:25

contagious. Right? I I, you know, we're so

10:27

close and I'm so concerned about you and your

10:29

problems become my problems. And that

10:32

bleeds through right now. I

10:34

think both of those capacities are

10:37

very important, but the

10:41

emotional contagion piece. And

10:45

this is not really my thesis. This is

10:47

something have more or less learned from

10:49

from Paul Blum. The

10:52

psychologists who

10:54

wrote a book on this topic titled Against Empathy,

10:57

the emotional, social contagion piece

11:00

is a bad guide

11:02

rather often for ethical behavior

11:05

and ethical

11:06

intuitions. Holy boy. And

11:08

I will go so I'll give you the clear example

11:10

of this, which is we

11:16

find stories

11:18

with a single identifiable protagonist

11:21

who we can effortlessly empathize

11:23

with. Far more compelling than

11:26

data. Right? So if I tell you,

11:28

you know, this is the classic case of of

11:30

the little girl who who falls down a well

11:33

Right? You know, this is some somebody's

11:35

daughter. You see the parents distraught

11:38

on television. You hear

11:40

her cries from the bottom of the well. The

11:42

whole country stops. I mean, this there was an example

11:45

of this, you know, twenty, twenty-five years ago,

11:47

I think, where it was just wall to wall on CNN.

11:49

This is just the perfect use of CNN. It

11:51

was, you know, seventy two hours or whatever

11:54

was of continuous coverage of extracting

11:56

this girl from a well. So we

11:59

effortlessly pay attention to that. We care

12:01

about it. We'll donate money toward

12:03

it. I mean, it's just it marshals a hundred percent

12:06

of our compassion and altruistic impulse.

12:09

Whereas if you hear that there's a genocide

12:11

raging in some country you've never been to

12:14

and never intend to go to. And the numbers

12:16

don't make a dent, and we

12:18

find the story

12:21

boring. Right? We'll change the channel in the

12:23

face of a genocide. Right? It doesn't matter. So the

12:25

and it literally and perversely, it

12:27

could be five hundred thousand little girls

12:30

have fallen down wells in that country, and

12:32

we still don't care. Right? So So

12:35

it's, you know, many of us

12:37

have come to believe that this is a bug rather

12:39

than a feature of our moral psychology And

12:42

so the empathy plays an unhelpful

12:44

role there. So ultimately, think

12:47

when we're making big decisions about what we

12:49

should do and how to mitigate

12:51

human suffering and what's worth

12:53

valuing and how we should protect those values.

12:58

I think reason is the better tool.

13:00

But it's not that I would want to dispense with any

13:02

part of empathy

13:03

either. Well, there's a lot of changes to go on there.

13:05

But briefly to mention, you've recently

13:07

talked about effective altruism

13:10

on your podcast. I think --

13:12

Mhmm. -- you mentioned some interesting statement.

13:15

I'm going to horribly misquote you. But

13:17

that you'd rather live in

13:19

a world, like it doesn't really make sense,

13:22

but you'd rather live in a world where you'd care

13:24

about maybe your daughter and son

13:26

more than a hundred people that live

13:29

across the world, something like this. Like

13:31

where the calculus is not always perfect,

13:33

but somehow it makes sense to live

13:35

in a world where it's irrational in

13:37

this way and yet empathetic

13:40

in the way you've been

13:40

discussing. Right. I'm not sure what the right

13:43

answer is, or even whether there is

13:45

one right answer, there could be multiple peaks

13:48

on on this part of the moral landscape. But So

13:51

the the opposition is between

13:53

an ethic that's articulated by

13:55

someone like the Dalai Lama. Right? You know, really

13:57

any exponent of of you

14:00

know, classic Buddhism would say

14:02

that sort of the ultimate enlightened

14:05

ethic is true dispassion

14:07

with respect to friends and strangers.

14:09

Right? So that you would the, you know, the

14:11

mind of the Buddha would be truly

14:14

dispassionate. You would love and

14:16

and care about all people equally. And

14:20

by that light, it seems some

14:22

kind of ethical failing or at least,

14:24

you know, failure of of to

14:26

fully actualize compassion

14:28

in the limit

14:29

or, you know, enlightened wisdom in the limit

14:32

to care more or even

14:34

and much more about your kids than

14:36

the kids of other people or and to and to

14:38

prioritize your your energy

14:41

in that way. Right? So you spend all this time trying to figure

14:43

out how to keep your kids healthy and happy. And

14:46

you'll attend to their minutest concerns

14:48

and however 365. And

14:52

again, there's a genocide raging in

14:54

Sudan or wherever and

14:56

it takes up less than one percent of

14:58

your bandwidth. I'm not sure it would

15:00

be a better world if everyone was running

15:03

the the Dalai Lama program there. I think some

15:05

prioritization of of one's

15:08

nearest and dearest ethically

15:13

might be optimal because

15:15

we'll we'll all be doing that and we'll all be doing

15:17

that in a circumstance

15:20

where we have certain

15:22

norms and laws

15:25

and and other structures that

15:28

force us to be dispassionate where

15:30

that matters. Right? So, like, when I go to when

15:33

my daughter gets sick, I have to take her

15:35

to a hospital, I

15:38

really want her to get attention. Right?

15:40

And I'm worried about her more than I'm worried about

15:42

everyone else in the lobby. But the truth

15:44

is I actually don't want a totally

15:46

corrupt hospital. I don't want a hospital

15:49

that treats my daughter better

15:51

than anyone else in the lobby because she's my

15:53

daughter and I have, you know, bribed the

15:55

guy at the door whatever, you know, or the guy's a

15:57

fan of my podcast or whatever the thing is,

15:59

you don't want starkly corrupt,

16:02

unfair situations. And

16:05

when you sort of get pressed

16:08

down the hierarchy of Maslow's

16:10

needs, you know, individually and and and

16:12

societally, a bunch

16:14

of the bunch of those variables change

16:16

and they change for the worse, understandably.

16:19

But, yeah, when things are when everyone's

16:21

corrupt and it's you're you're in

16:23

a in a state of of collective

16:26

emergency. You know, you've got a lifeboat problem.

16:28

You're scrambling to get into the lifeboat. Yeah,

16:31

then then fairness

16:33

and norms and and the,

16:37

you know, the the other vestiges of civilization

16:40

begin to get stripped away. We

16:42

can't reason from those emergencies

16:46

to normal life. I mean, normal life

16:48

we want justice, we want fairness, we

16:50

want we're all better off for it even

16:53

when the spotlight of our

16:55

concern is focused on

16:57

the people we know, the people who are friends, the

16:59

people who are family, the people we we we have

17:01

good reason to care about. We

17:03

still by default want the system that

17:06

protects the interest of strangers too.

17:09

And we know that, generally speaking, in game

17:11

theoretic

17:12

terms, we're all going to tend to be better

17:14

off in a fair system than a corrupt

17:16

one. One of the failure modes of

17:18

empathy is

17:21

our susceptibility to anecdotal

17:24

data. Just a good story

17:26

-- Yeah. -- will get us to not think

17:28

clearly. But what about empathy

17:31

in the context of just discussing ideas

17:33

with other people? And then there's a large

17:35

number of people, like in this country, you

17:38

know, red and blue, half the population

17:41

believes certain things on immigration or

17:43

on the response to the pandemic

17:46

or any kind of controversial issue, even

17:48

whether the election was fairly executed.

17:53

Having an empathy for their

17:55

worldview, trying to understand

17:57

where they're coming from, not just in the explicit

18:00

statement of their ideas, but the entirety of,

18:02

like, the roots from which their ideas stem,

18:05

that kind of empathy while you're discussing

18:08

ideas, what is, in your pursuit

18:10

of truth, having empathy for the

18:13

perspective of a large number of other

18:15

people. Versus

18:18

raw mathematical reason? I

18:20

think it's important, but I just it only

18:22

takes you so far. Right? It doesn't

18:25

get you to truth. Right? It's not

18:27

truth is not a it's

18:29

not decided by, you

18:31

know, democratic principles. And

18:35

Certain people believe things for understandable

18:38

reasons, but those reasons are nonetheless bad

18:40

reasons. Right?

18:41

They they don't scale, they don't generalize, they're not

18:43

reasons anyone should adopt

18:45

for themselves or respect,

18:47

you know, epistemologically. And

18:50

yet their their circumstance

18:52

is understandable and it's something you can care

18:54

about. Right? And so,

18:56

yeah, like, let me just take I think there's

18:59

many examples of this you might be thinking

19:01

of, but the one that

19:03

comes to mind is I've I've been super critical

19:05

of Trump, obviously. And

19:09

I've been super critical of certain people

19:12

for endorsing him

19:14

or not criticizing him when he

19:16

really made it, you know, painfully obvious who

19:19

he was, you know, if if there had been any

19:21

doubt initially, there was

19:23

no doubt when we have a sitting president who's

19:25

not not

19:27

agreeing to a peaceful transfer

19:30

of power. Right? So I was

19:34

critical of all of that and yet the

19:37

fact that many

19:39

millions of Americans didn't see

19:42

what was wrong with Trump or bought into

19:46

the didn't see through his con.

19:48

Right? I mean, they bought into the idea that he was a

19:50

great businessman who might just

19:52

be able to change things because he's so unconventional

19:55

and so you know, his heart is

19:57

in the right place. You know, he's really a man of the people

19:59

even though he's a, you know, gold plated everything

20:02

in his life. They

20:05

bought the myth somehow of,

20:08

you know, largely because they had seen him

20:10

on television for almost

20:13

a decade and a half pretending

20:15

to be this dangerous businessman who could

20:17

get things done. It's

20:20

understandable to me that many

20:22

very frustrated people who have

20:24

not had their hopes and dreams actualized,

20:28

who have been the victims

20:30

of globalism and and many

20:34

other, you know, current trends. It's

20:38

understandable that they would be

20:41

confused and and and

20:43

not see the liability of electing

20:46

a grossly incompetent morbidly

20:49

narcissistic person into

20:51

into the presidency. So

20:54

which is to say that I

20:56

don't blame. There are many, many

20:58

millions of people who I don't necessarily blame

21:00

for the Trump phenomenon. But I

21:02

can nonetheless bemoan the

21:04

phenomenon as as indicative of,

21:06

you know, very bad state

21:09

of affairs in our society. Right? So it's it's

21:11

there's two levels to it. I mean, one is I think you have

21:13

to call a spade a spade when you're talking

21:15

about how things actually work and

21:17

what things are are likely to happen

21:19

or not. But then you can recognize

21:23

that people are have very

21:25

different life experiences and and Yeah.

21:28

I mean, I think empathy and, you know, probably the

21:30

better word for what I would

21:32

hope to embody there is compassion.

21:34

Right? Like, really, you know, to

21:38

really wish people well, you know, and to

21:40

really wish, you know, strangers

21:42

well, wish them well. I mean,

21:44

to realize that there is no opposition

21:47

between in the at bottom, there's no real

21:49

opposition between selfishness and

21:51

selflessness because wise

21:54

selfishness really takes into

21:56

account other people's

21:58

happiness. I mean, you know, which do you do you wanna

22:00

live in a society where you have everything but

22:03

most other people have nothing or

22:06

do you wanna live in a society where you're surrounded

22:08

by happy, creative, self

22:10

actualized people who are having their

22:13

hopes and dreams realized. I think it's obvious

22:15

that the the second society is much

22:17

better. However, much you can guard

22:20

your good

22:21

luck. But what

22:23

about having empathy for

22:26

certain principles that people believe, for

22:28

example, the

22:30

the pushback, the other perspective on this,

22:32

because you said they bought the myth of Trump

22:34

as a great businessman. There could be

22:37

a lot of people that are supporters of Trump

22:39

who could say that Sam

22:41

Harris bought the myth that we

22:43

have this government of

22:46

the people by the people that actually represents

22:48

the people as opposed to a bunch

22:50

of elites who are running a giant

22:52

bureaucracy that is corrupt that is feeding

22:55

themselves, and they're actually not representing

22:57

the people. And then here's this chaos

22:59

agent Trump who speaks off

23:01

the top of his head yeah, he's flawed

23:04

in all this number of ways. He's

23:06

more comedian than he is a

23:08

presidential type of figure. And

23:11

he's actually creating the kind of chaos that's going

23:13

to shake up this bureaucracy, shake

23:16

up the elites that are so uncomfortable because

23:18

they don't want the world to know about

23:21

the game they got running on everybody

23:22

else. So that's, you know, you

23:24

know, that's kind of perspective that they would take

23:27

and say, yeah, yeah, there's these flaws that

23:29

Trump has, but this is necessary.

23:32

I agree with the first part. So I I haven't

23:34

bought the myth that it's,

23:38

you know, a truly representative democracy

23:41

in the way that we might idealize.

23:45

And, you know, on some

23:47

level, I mean, this is a different conversation, but on some

23:49

level, I'm not even sure how much I think it

23:51

should be. Right? Like, I I'm not sure we

23:55

want in the end everyone's

23:58

opinion given equal

24:00

weight about what we should

24:02

do about anything. And I include myself in that. I mean, there are

24:05

many topics around which I don't

24:07

deserve to have a strong opinion because

24:09

I don't know what I'm talking about,

24:11

right, or what I would be talking about if I had a strong

24:13

opinion. So and

24:17

I think we'll probably get to

24:20

some of those topics because I've declined to have

24:22

certain conversations on my podcast just because I think

24:24

I'm the wrong person to have that conversation. Right.

24:27

And and it's and I think it's important

24:29

to see those bright lines

24:32

in in one's life and in in the moment

24:34

politically. And ethically.

24:37

So, yeah, I think,

24:40

on the viability of

24:42

democracy, I'm

24:44

I'm under no illusions that

24:47

all of our institutions are,

24:49

you know, worth preserving precisely

24:52

as they have been up until the moment this

24:54

great orange wrecking ball came swinging

24:56

through our lives. But I

24:59

just it was a very bad bet to

25:01

elect someone who was grossly

25:04

incompetent. And worse

25:07

than incompetent, genuinely

25:11

malevolent in his selfishness. Right?

25:14

I think, and this is something we know based

25:16

on literally decades of him being

25:19

in the public eye. Right? He's not as he's

25:21

not a public servant in any

25:24

normal sense of that term. And

25:27

he couldn't possibly give an honest or

25:30

sane answer to the question you

25:32

asked me about empathy and reason and

25:34

and like how should we you know, what should guide

25:36

us. I

25:39

genuinely think he is missing some necessary

25:41

moral and and psychological tools.

25:45

Right? And and this this is I can feel

25:47

com compassion for him as a human being

25:49

because I think having those things

25:51

is incredibly important and genuinely loving

25:53

other people is incredibly important and and

25:56

knowing what all that's about is is is

25:58

that's really the good stuff in life. And I

26:01

think he's missing a lot of that. But

26:04

I think we we don't

26:06

wanna promote people to to the highest

26:08

positions of power in our society who

26:10

are far outliers

26:13

in in

26:15

pathological terms. Right? We want them

26:17

to be far outliers

26:20

in the best case, in wisdom and compassion,

26:22

and some of the things you've some

26:24

of the topics you brought

26:25

up. I mean, we want someone to be deeply informed.

26:27

We want someone to be

26:31

unusually curious, unusually alert

26:34

to how they may be wrong or getting

26:36

things wrong, consequently. He's

26:38

none of those things. And if and

26:40

insofar as we're gonna get normal mediocrities in

26:43

that role, which I think, you know, is

26:45

often the best we could expect, let's get

26:47

normal mediocrities in that role, not

26:51

once in a generation narcissists

26:53

and

26:57

frauds. I mean, it is like the let

26:59

me just take honesty as a single variable. Right?

27:02

I I think you want yeah. Yes. It's possible

27:04

that most

27:06

politicians lie at least some of the time.

27:09

I don't think that's a good thing. I

27:11

think people should be gen you know, generally

27:13

honest even to

27:15

a fault. Yes,

27:18

there are certain circumstances where lying, I think,

27:20

is necessary. It's kind of on a continuum

27:22

of self defense and and violence. So

27:24

it's like if you're gonna, you know, if the Nazis

27:27

come to your door and ask you if you've got Anne Frank

27:29

in the attic, I think it's okay to lie to them.

27:33

But, you

27:35

know, Trump, there's I I arguably,

27:38

there's never been a person in that

27:40

anyone could name in in human history

27:42

who's lied with with that kind of velocity.

27:45

I mean, it's just it was it was just

27:48

a blizzard of lies, great

27:50

and small, you know, to to

27:52

pointless and

27:54

consequential, but

27:57

it's just

27:59

it

28:01

it says something fairly

28:05

alarming about our society that a person

28:07

of that character got promoted.

28:10

And so yes, I have compassion

28:12

and concern for for half of the society

28:14

who didn't see it that way, and that's gonna sound

28:17

elitist and and and

28:19

smug or something to anyone who's

28:22

who's on that side listening to me. But it's

28:25

genuine. I mean, I'm I I understand that,

28:27

like, like, I barely have the I'm like one

28:29

of the luckiest people in the world, and I barely

28:31

have the bandwidth to pay attention

28:33

to half the things I should pay attention to in order

28:35

to have an opinion about half of things we're gonna

28:37

talk about. Right? So how much less

28:40

bandwidth for somebody who's working two jobs

28:42

or, you know, a single mom who's

28:45

who's, you know, raising, you

28:47

know, multiple

28:49

kids, you know, even a single kid. It's

28:51

just it's unimaginable to me that people

28:53

have the the bandwidth to to

28:56

really track this stuff. And so then they jump

28:58

on social media and they they see they get

29:00

inundated by misinformation and

29:02

They see what their favorite influencer just said.

29:06

And now they're worried about vaccines and they're

29:08

it's just it's we're

29:10

living in an environment where our

29:13

information space has become so corrupted.

29:16

And we've built machines to

29:18

to further corrupt it, you know? I mean, we built a business

29:20

model for the Internet that it further corrupts it.

29:23

So it's it is just is

29:26

chaos in informational terms.

29:28

And I don't fault

29:31

people for being confused

29:33

and impatient and at

29:35

the at their wits end. And,

29:39

yes, Trump was a an

29:41

enormous fuck you to the establishment,

29:44

and that and that's that was understandable

29:46

for many reasons. To me, Sam

29:49

Harris, the great Sam Harris is somebody

29:51

I've looked up to for a long time,

29:54

as a beacon of voice of reason,

29:56

and there's this meme on the Internet,

29:59

I would love you to steelman the case for

30:01

and against, that Trump broke

30:03

Sam Harris' brain. That

30:05

there's something disproportionate

30:08

to the actual impact that Trump had on

30:10

our society. He had an

30:13

impact on the on the ability

30:16

of balanced calm

30:19

rational minds to see

30:21

the world clearly, to think clearly.

30:24

You being one of the beacons of that.

30:26

Is there is there a degree to which you

30:29

broke your brain? Well,

30:32

otherwise known as Trump Derangement Syndrome?

30:34

Yeah. Well, sorry, medical question.

30:36

Yeah. I mean, I think Trump Derangement Syndrome

30:38

is a is a very clever meme because

30:40

it it just throws

30:43

the, you know, the the problem back on the person who's

30:45

criticizing Trump, but -- Sure. -- in truth,

30:47

the true Trump

30:50

Derangement Syndrome was not to have seen

30:53

how dangerous and divisive it

30:55

would be to promote someone like Trump to

30:57

that position of power. And to not and

30:59

in the final moment, not

31:02

to see how

31:05

untenable it was to

31:08

still support someone who, you know,

31:10

a sitting president who was not committing

31:12

to a peaceful transfer of power. I mean,

31:14

if if that wasn't a bright line for you,

31:17

you have been deranged by something because

31:20

that was, you know, the that

31:22

was one minute to midnight for our democracy

31:24

as far as I was concerned. And I think

31:27

it really was, but for the

31:31

integrity of a

31:33

few people, that we didn't

31:35

suffer some real constitutional crisis

31:38

and a real emergency on

31:41

January sixth. I mean, if Mike Pence

31:43

had caved in and decided

31:45

to not certify the election. Right?

31:48

I mean, you can count on two hands

31:50

the number of people who held things

31:52

together at that moment. And

31:55

so it wasn't for want of trying

31:57

on Trump's part that we we

32:01

didn't succumb to some, you know, real

32:04

to truly uncharted catastrophe

32:07

with our democracy. So the fact that that didn't

32:10

happen is not a sign that

32:12

those of us who were worried

32:14

that it was so close to

32:16

happening, were exaggerating the problem. I

32:18

mean, it's like, you know, you almost got run over

32:21

by a car, but you didn't. And

32:23

so, you know, you're the fact that you're adrenalized and

32:26

you're thinking, you know, boy, that was dangerous. I probably

32:28

shouldn't, you know, wander in the middle

32:30

of the street with my eyes closed.

32:33

You weren't wrong to feel that you really

32:35

had a problem. Right? And

32:39

came very close to something truly terrible

32:42

So I think that's where we were, and I think

32:44

we shouldn't do that again. Right? So the fact that

32:46

he's he's still he's coming back around as

32:48

potentially a viable candidate I'm

32:51

not spending much time thinking about it, frankly, because

32:53

it's, you know, I'm just waiting for the moment

32:55

where it it requires some thought.

32:58

I mean, they it did it

33:00

took up I

33:03

don't know how many podcasts I devoted to

33:05

the topic. It wasn't

33:07

that I mean, it wasn't that many

33:09

in the end, you know, against the the

33:11

number of podcasts I I devoted to other

33:14

topics, but there are people who

33:16

look at Trump and just find

33:18

him funny, entertaining,

33:21

not especially threatening. He's

33:23

like not a you know, it's just it's just good

33:25

fun to see somebody who's like who's just

33:28

not taking anything seriously. And

33:30

it's just just putting a, you know, a stick

33:32

in the wheel of of business as usual.

33:35

Again and again and again and again and

33:39

they don't really see anything much

33:41

at stake. Right? Doesn't really doesn't really

33:44

matter if we don't support NATO. Doesn't

33:46

really matter if he says he trusts Putin

33:48

more than our intelligence services. I

33:51

mean, none of this is it does matter if he's

33:53

on the one hand saying that he loves

33:58

the leader of North Korea and on the other

34:00

threatening it threatens to to,

34:02

you know, bomb them back to the stone age, right, on

34:04

Twitter. It's all it all can be

34:06

taken in the spirit of kind of reality television.

34:08

Like, this is just this is the part of the movie that's just fun

34:10

to watch. Right? And I

34:13

understand that. I can even inhabit

34:16

that space for a few minutes

34:18

at a time, but there's

34:21

a deeper concern that we're in process of

34:23

entertaining ourselves to death. Right?

34:25

That we're just not taking things seriously. And

34:28

this is a problem I've had with several other

34:30

people. We might name who just

34:32

who just appeared to me to be

34:34

goofing around at scale. And

34:37

they lack a kind of moral seriousness. I

34:39

mean, they're they're touching big problems

34:42

where lives hang in the balance, but

34:44

they're just fucking around. And I

34:47

think there are really important problems that we have

34:49

to get our head straight around. And

34:51

we need, you know, it's not to

34:54

say that institutions don't become corrupt

34:56

I do think they do. And I'm quite

34:58

worried that, you know, both about the

35:00

the loss of trust in our institutions and

35:03

the the

35:05

fact that trust has eroded for good reason.

35:07

Right? That they have become less trustworthy. I

35:10

you know, they've become infected by political

35:13

ideologies that are not truth tracking. I mean,

35:15

I worry about all of that. But I

35:19

just think the we need institutions. We

35:21

need to rebuild them. We need we

35:24

need experts who are real experts.

35:26

We need to value expertise over

35:29

you know, amateur speculation

35:31

and conspiracy thinking and

35:34

just, you know, and

35:35

bullshit. The kind of amateur speculation

35:37

we're doing on this very podcast.

35:41

I'm usually alert to the moments where

35:43

I'm just guessing

35:45

or where I actually feel like I'm just

35:47

talking from within my wheelhouse, and I try

35:49

to telegraph that a fair amount

35:52

with people. So

35:56

yeah. I mean, but it's it's not it's

35:58

different. Like, I mean, you you can invite

36:00

someone onto your podcast. Who's an expert

36:03

about something that you're you're

36:05

not an expert about. And

36:07

then you you in

36:09

the process of getting more informed yourself,

36:12

your your audience is getting more informed. So you're asking

36:14

smart questions. And you might be pushing

36:16

back at the margins But

36:19

you know that when push comes to shove, on

36:21

that topic, you really don't

36:23

have a basis to have a strong

36:25

opinion. And if

36:28

you were going to form a strong

36:30

opinion that was counter to the expert

36:32

you have in front of you, it's going to be

36:34

by deference to some other expert who you've

36:36

brought in or who you've heard about or whose

36:38

work you've read or whatever. But there

36:40

there's a paradox to how

36:43

we value authority in science

36:45

that most people don't understand. And

36:47

I think we should, at some point, unravel

36:49

that because it's the basis for

36:52

a lot of public confusion. And, frankly,

36:54

there's a basis for a lot of, you know, criticism

36:56

I've received on these topics, whereas, you know,

36:58

people think that I may you

37:01

know, I I'm against free speech

37:03

or I'm an establishment shill

37:06

or it's like they just think I'm

37:08

a credentialist. I just think people with

37:10

PhDs from, you know, Ivy League universities

37:13

should, you know, run everything. It's

37:15

not true, but there's a ton of there's

37:17

there's a lot to cut through to get to daylight

37:20

there because people are very

37:23

confused about how we value authority

37:26

in the service of rationality generally. You've

37:28

talked about

37:29

it, but it's it's just interesting. The intensity

37:31

of feeling you have. You've you've had this

37:34

famous phrase about Hunter Biden

37:37

and children in the basement.

37:39

Can you just revisit this case?

37:41

So let me

37:43

give another perspective on the situation

37:46

of January sixth and Trump in general.

37:49

It's possible that January

37:52

sixth and Things of

37:54

that nature revealed that our democracy

37:56

is actually pretty fragile. And

37:58

then Trump is not a malevolent and

38:01

ultra-competent figure, but

38:03

is simply a jokester. And

38:07

he just by creating the chaos

38:09

revealed that it's all pretty fragile. Because

38:11

you're a student of history and there's a lot of

38:13

people like Vladimir Lenin, Hitler,

38:16

who are exceptionally competent at

38:19

controlling power. At

38:21

being executives and taking that power

38:23

controlling the generals, controlling all the

38:25

figures involved, and certainly

38:28

not tweeting, but working in the shadows

38:30

behind the scenes to gain power.

38:32

And they did so extremely

38:35

effectively, and that is how they were able to gain

38:37

power. The the pushback with Trump, because

38:39

he was doing none of that. He was

38:42

creating which he's very good

38:44

at creating drama, sometimes

38:47

for humor's sake, sometimes for drama's

38:49

sake, and simply revealed

38:51

that our democracy is fragile. And

38:53

so he's not this once

38:56

in a generation horrible figure, once

38:58

in a generation narcissist. No.

39:00

I I don't think he's he's a

39:03

a truly

39:05

scary sinister you

39:08

know, Putin like or, you know, Hitler, much

39:10

less Hitler-like figure, not at all. I mean,

39:12

he's not ideological. He doesn't care about

39:14

anything beyond himself. So it's

39:16

not no.

39:19

No. He's much less scary than

39:22

any really scary you

39:24

know, totalitarian. Right? I mean, and

39:26

he's more Brave New World than Nineteen

39:28

Eighty-Four. This is what, you know, Eric Weinstein

39:31

never stops battering

39:33

me about, but, you know, he's still wrong,

39:36

Eric. You know,

39:38

I I can you know, my my analogy for

39:40

Trump was that he's an evil Chauncey Gardiner.

39:43

I don't know if you remember the the the

39:45

the book or the film Being There with

39:48

with Peter Sellers. But, you

39:50

know, Peter Sellers is this gardener who

39:52

really doesn't know anything, but

39:54

he gets recognized as this wise man and he

39:56

gets promoted to immense power in

39:58

in Washington. Because he's speaking

40:01

in these kind of semblances

40:04

of wisdom. He's got these very simple statements that

40:06

seemed to be profound. He's just talking, oh, he cares

40:09

about his garden and he's just talking about his

40:11

garden all the time. Yeah. But, you know, he'll

40:13

say something. But, yeah, you know, in the spring, you know, the

40:15

new shoots will will bloom and

40:17

people read into that some kind of genius,

40:20

you know, insight politically. And so he gets

40:22

promoted and so that's the joke of the film. For

40:24

me, Trump has always been someone

40:26

like an evil Chauncey Gardiner. I mean, he's

40:29

he's it's not to say he's

40:31

totally yes. He has a certain kind

40:33

of genius. He's got a genius for

40:36

creating spectacle around himself. Right?

40:38

He's got a genius for getting the the eye

40:40

of the media always coming back

40:42

to him. But

40:45

it it's only it's a kind of

40:47

it's a kind of, you know, self promotion

40:49

only works if you actually are truly

40:52

shameless and don't care about

40:54

having a reputation for anything that that

40:56

that or you would wanna have a reputation

40:58

for. Right? It's like pure the

41:00

pure pornography of attention. Right?

41:02

He and he just wants more of it. I

41:05

think the truly depressing and genuinely

41:08

scary thing was that we

41:11

have a country that at

41:13

least half of the country given

41:15

how broken

41:18

our society is in many ways. We

41:20

have a country that didn't

41:23

see anything wrong with that. Bringing

41:25

someone who's who obviously

41:27

doesn't know what he should know to be president

41:30

and who's obviously not a good

41:32

person. Obviously doesn't care

41:35

about people, can't even pretend to

41:37

care about people really, right, in a credible

41:39

way. And

41:42

so, I mean, if there's a silver lining

41:44

to this, it it's it's

41:46

along the lines you just sketched. It

41:48

shows us how vulnerable our system

41:51

is. To a truly brilliant

41:53

and sinister figure. Right? I

41:55

mean, like, I think we are

41:59

we really dodged a bullet. Yes. Someone far

42:01

more competent and

42:05

conniving and ideological could

42:07

have exploited our system in

42:10

a way that Trump didn't. And and that's

42:13

yeah. So if if we plug

42:15

those holes eventually,

42:19

that would be a good thing and he would have done a good thing

42:21

for our society. Right? I mean, one of the things

42:23

we realized, and I don't

42:25

think anybody knew. I mean, I certainly didn't know

42:27

it and I didn't hear anyone talk about

42:29

it, is how much our

42:31

system relies on

42:33

norms rather than laws -- Yeah. -- as civility.

42:36

Right. Yeah, it's just like it's it's

42:39

quite possible that he never did anything

42:41

illegal. You know, truly, truly

42:44

illegal. I mean, I think he probably did

42:46

a few illegal things, but, like, illegal such

42:48

that he really should be thrown in jail for

42:50

it, you know. At

42:52

least that remains to be seen. So

42:56

all of the chaos, all of the,

42:59

you know, all of the diminishment

43:02

of our stature in the world, all of the just

43:04

the the opportunity costs of

43:06

spending years focused on

43:08

nonsense. All

43:11

of that was just norm violations. All

43:14

that was just that was just all a matter

43:16

of not saying the thing you should

43:18

say. But that doesn't mean they're insignificant.

43:20

Right? It's not that it's like it's not

43:23

illegal for a sitting president

43:25

to say, no, I'm

43:27

not gonna commit to peaceful transfer

43:29

of power. Right? We'll wait and see whether

43:31

I win. If I win, the election

43:33

was valid. If I lose,

43:35

it was

43:37

fraudulent. Right?

43:39

But aren't those humorous

43:42

perturbations to our system of

43:44

civility such that we know what the limits

43:46

are? And now we start to think that

43:49

and have these kinds of

43:50

discussion. But that that wasn't a humorous perturbation

43:52

because he did everything

43:54

he could granted he wasn't

43:56

very competent, but he did everything he could

43:59

to try to steal

44:02

the election. I mean, the irony is he claimed

44:04

to have an election stolen from him

44:06

all the while doing everything he could to steal

44:09

it, declaring it fraudulent in advance, trying

44:11

to get the votes to to to

44:13

not be counted as the evening wore on

44:16

knowing that they were gonna be disproportionately

44:19

Democrat votes. Because

44:21

of the the, you know, because of the position

44:23

he took on mail in ballots. I mean, all

44:25

of it was fairly calculated. The

44:28

whole circus of of of,

44:30

you know, the clown car that

44:33

crashed into, you know, Four Seasons Landscaping.

44:35

Right? And you got Rudy Giuliani

44:38

with his hair dye and you got Sidney Powell and

44:40

all that all these grossly incompetent

44:42

people, lying as

44:44

freely as they could breathe about

44:47

election fraud. Right? And all of these

44:49

things are getting thrown out by, you know, Republican,

44:51

largely Republican election officials and Republican

44:54

judges. It

44:56

wasn't for want of trying that

44:58

he didn't maintain his power

45:00

in this country. He really tried to

45:02

steal the presidency. He just was

45:04

not competent. And the people around him weren't

45:06

competent. So that's a good thing,

45:09

and it's worth not letting that

45:11

happen again. But he wasn't competent, so

45:13

he didn't do everything he could. Well,

45:15

he did everything he could, he didn't do everything

45:17

that could have been done by someone more competent.

45:21

Right. But the the tools you have

45:23

as a president. You could do a lot of things. You can

45:25

declare emergencies, especially during COVID.

45:27

You could postpone the election. You can

45:29

create military conflict. That,

45:32

you know, any kind of reason to postpone the

45:34

election. There's there's a lot of Well, he he tried

45:36

to do things and he would have to have done

45:38

those things through other people and there are people

45:40

who refused to do those things.

45:42

The people who said they would quit, they would they would

45:44

quit publicly. Right? I mean, this is you

45:46

start again, there are multiple

45:49

books written about

45:51

the the last hours of of this presidency.

45:54

And the details are shocking

45:57

in what he tried to do and tried to get others

45:59

to do and it's awful.

46:01

Right? I mean, it's it's just awful that we

46:03

were that close. To

46:06

something

46:09

to to a true unraveling of

46:11

our political process. And

46:13

it's the only time in our lifetime that anything like

46:15

this has happened. And it's

46:19

deeply embarrassing. Right? To,

46:21

you know, on the world stage, it's just like we we looked

46:23

like a banana republic there for a while.

46:26

And we're

46:29

the the lone superpower. It's

46:32

a bet. It's it's not good. Right?

46:34

And so we shouldn't like, there's no

46:36

there's no... The people who thought,

46:38

well, we just need to shake things up and this is a

46:40

great and great way to shake things up

46:42

and having people, you know, storm our Capitol

46:45

and smear shit on the walls. That's just

46:47

more shaking things up. Right? It's

46:49

all just for the walls. There's

46:52

a nihilism and cynicism to

46:55

all of that, which, again, in certain people,

46:57

it's understandable. You know, frankly, it's

46:59

not understandable if you got a billion dollars

47:02

and you're you, you know, have a compound

47:04

in Menlo Park or wherever. It's like there

47:06

are people who are cheerleading this stuff who shouldn't

47:08

be cheerleading this stuff and who know that

47:10

they can get on their Gulfstream

47:12

and fly to their compound in New Zealand

47:14

if everything goes to shit. Right? So

47:17

there there's a cynicism to all of that that

47:19

I think we should be deeply critical

47:21

of.

47:21

What I'm trying to understand

47:23

and analyze

47:26

is not the behavior of this particular human being,

47:28

but the effect he had in part

47:31

on the division between people.

47:34

To me, the degree, the

47:36

meme of Sam Harris'

47:38

brain being broken by Trump represents

47:42

You're like the person I would look to

47:44

to bridge the division. Well,

47:47

I don't think there is something

47:51

profitably to be said to

47:53

someone who's truly captivated

47:56

by the the the personality cult of

47:58

Trumpism. Right? Like, there's nothing that I'm

48:00

gonna say to there's no conversation

48:03

I'm gonna have with Candace Owens, say, about

48:05

Trump. That's gonna converge on something

48:07

reasonable. Right? You

48:08

don't think so? No. I mean, I've tried I haven't tried

48:10

with Candace, but tried with, you know, many people who

48:14

are in that particular orbit.

48:16

I've had conversations with people who

48:19

won't admit that there's anything

48:22

wrong with Trump. Anything. So

48:24

I'd like to push for the empathy versus

48:26

reason. Because when you operate in the space of

48:28

reason, yes. But

48:30

I think there's a lot of power in

48:33

you showing, in you somehow showing

48:35

that you're willing to see the good

48:37

qualities of Trump publicly

48:40

showing

48:40

that. I think that's the way to

48:42

win over. But he has so few of them.

48:44

He has fewer good qualities than virtually

48:47

anyone I can name. Right? But Lex

48:50

he's funny. He I'll I'll grant you that he's

48:52

funny. He's he's he's a good entertainer.

48:54

Do others look at just policies and

48:56

actual impacts? I've I've admitted that

48:59

No. No. So like so I've admitted that

49:01

many of his policies I agree with.

49:03

Many many of his I mean, so Probably

49:07

more often than not. I mean, at least on balance,

49:09

I agreed I guess, I I agreed with his

49:11

policy that, you know, we should take China

49:14

seriously as an adversary. Right? And we're

49:16

and I

49:18

think, I mean, again, that

49:20

you have to there's a lot of fine print to a lot of this

49:22

because the way he talks about these things

49:25

and and many of his motives that

49:27

are obvious, aren't things that I'd

49:29

support. But take immigration. I

49:31

think there's It's obvious

49:34

that we should have control of our borders.

49:36

Right? Like, I I don't see the argument for

49:39

not having control of our borders. We should let

49:41

in who we wanna let in and we should

49:43

keep out who we wanna keep out, and we should

49:45

have a sane immigration policy. So

49:48

I don't I didn't necessarily think it was a priority

49:50

to build the wall, but I didn't I never

49:54

criticized the impulse to build the wall because

49:56

if tens of thousands, hundreds of

49:58

thousands of people are coming across the border and we are

50:00

not in a position to know who's coming,

50:03

that seems untenable to me. So and

50:05

I can recognize that many people in

50:08

our society are

50:10

on balance the victims of immigration

50:13

because and and there is a in in in

50:15

many cases, zero sum contest between the

50:17

interests of actual citizens and

50:19

the interests of immigrants. Right? So

50:21

I think we should have a we should have control

50:23

of our borders. We should have a sane and compassionate

50:26

immigration policy we should have. We should let in

50:29

refugees. Right? So I didn't you know,

50:31

Trump on refugees was terrible. But

50:35

no, like, I would say eighty

50:37

percent of the policy concerns

50:40

people celebrated

50:44

in him are

50:47

concerns that I either share entirely

50:50

or certainly sympathize with.

50:52

Right? So like, that's not That's

50:55

not the

50:55

issue. The issue is a

50:57

threat

50:57

to democracy and so on. The

50:59

issue is largely what you said it was. It is

51:01

not so much the person. It's the effect

51:04

on everything he touches. Right?

51:06

He just he has this this

51:08

superpower of

51:11

deranging and destabilizing almost

51:15

everything he touches and sullying

51:17

and compromising the integrity of almost anyone

51:19

who comes into his orbit. I mean, you look at these people

51:22

who served as the chief of staff

51:24

or, you know, in various cabinet positions,

51:27

people who had real reputations, you know, for

51:29

for probity and and the

51:31

level-headedness, you know, whether

51:33

you share their politics or not. I mean, these

51:35

were real people. These were not you

51:38

know, some of them were 365,

51:40

but many

51:45

people who who just got

51:48

totally trashed by proximity

51:51

to him and then trashed by him when

51:53

they finally parted company with him.

51:57

Yeah. It's just people bent

52:00

over backwards to accommodate his

52:02

norm violations. And it

52:05

it was it was bad for them and it was

52:07

bad for our our our

52:09

system. And but

52:12

that but none of that discounts the

52:14

fact that we

52:16

have a

52:19

system that really needs proper

52:21

house cleaning. Yes, there are bad

52:23

incentives and entrenched

52:26

interests. And I'm not a fan

52:28

of the concept of of

52:30

the deep state, because

52:32

it, you know, it's been so propagandized. But, yes,

52:34

there's there's something like

52:36

that, you know, that

52:40

is not flexible enough

52:44

to respond intelligently to the needs

52:46

of the

52:46

moment. Right? So there's a lot of rethinking of

52:48

government and of institutions in general that I

52:50

think we

52:53

should do, but we need

52:55

smart, well informed, well

52:57

intentioned people to do that job.

52:59

And the well intention part is

53:02

is hugely important. Right? It's just

53:05

give me someone who is

53:08

not the most selfish

53:11

person anyone has ever

53:14

heard about in their lifetime. Right?

53:16

And what we got with Trump was that, like, literally,

53:18

the one most selfish person, I think,

53:20

anyone could name. I mean, and

53:23

again, there's so much known about this

53:25

man. That's the thing. It was like, it

53:27

predates his presidency. We knew this

53:29

guy thirty years ago. And

53:32

and this is why to come back to those

53:34

inflammatory comments about Hunter Biden's laptop.

53:37

The reason why I can say with

53:39

confidence that I don't care what was on

53:41

his his laptop is that there

53:43

is and and that includes any

53:46

evidence of corruption on the on the part of his

53:48

father. Right? Now there's been precious

53:50

little of that that's actually emerged.

53:52

So it's like there is no as far as I

53:54

can tell, there's not a big story associated

53:56

with that laptop as much as people bang on

53:58

about a 365 emails. But

54:02

even if there were just obvious

54:04

corruption, Right? Like, Joe Biden

54:07

was at this meeting and he took, you know, this

54:09

amount of money from this shady guy

54:11

for bad reasons. Right? Given

54:15

how visible the lives of these two men

54:18

have been, right? And given how much we know

54:20

about Joe Biden and how much we know about Donald Trump

54:22

and how they have lived in public for

54:24

almost as long as I've been alive, both of them.

54:27

The the the scale of

54:29

corruption can't possibly

54:32

balance out between the two of them. Right? We

54:34

I mean, if you show me that Joe

54:36

Biden has this secret life, where

54:38

he's driving a Bugatti and he's living

54:40

like Andrew Tate. Right? And he's he's

54:42

doing all these things I didn't know about.

54:45

Okay. Then I'm gonna start getting a

54:47

sense that Alright. Maybe this guy is

54:50

way more corrupt than I realized. Maybe there

54:52

is some deal in Ukraine or with China

54:54

that is just Like this guy is not who

54:56

he seems. He's not the public servant he's

54:58

been pretending to be. He's been on the

55:00

take for decades and decades, and he's

55:02

just, he's as dirty as can be. He's

55:05

all mobbed up and he's a nightmare. And

55:08

he can't be trusted. Right? That's possible

55:11

if you show me that his life is not at

55:13

all what it seems. But on the assumption that

55:16

I having looked at this guy for literally

55:18

decades, right, and

55:21

knowing that every journalist has looked at him

55:23

for decades. Just how many affairs

55:25

is he having? Just how much, you know,

55:28

how many drugs is he doing? How many

55:30

houses does he have? Where, you know, what

55:33

what is what are the obvious conflicts

55:35

of interests? You know? You hold that against

55:37

what we know about Trump. Right? And,

55:41

I mean, the litany of indiscretions

55:44

you can put on Trump's side that

55:46

that testified to his personal

55:48

corruption, to testify to the fact that he has

55:50

no ethical compass. There's

55:52

simply no comparison. Right? So that's why

55:54

I don't care about what's on the laptop

55:56

now. Now, if you tell me

55:58

Trump is no longer running for president

56:01

in twenty twenty four, and we can put

56:03

Trumpism behind us. And

56:05

now you're saying, listen, there's a

56:07

lot of stuff on that laptop that makes Joe

56:09

Biden looked like a total asshole. Okay.

56:12

I'm all ears. Right? I mean, it was

56:14

in twenty twenty, it was a forced

56:16

choice between a sitting

56:18

president who wouldn't commit to a peaceful transfer of

56:20

power, and a guy who's obviously

56:23

too old to be president

56:25

who has a crack

56:27

addicted son who who, you

56:30

know, who lost his laptop. And

56:33

I just knew that

56:35

I was gonna take Biden in spite of

56:37

whatever litany of horrors was gonna come tumbling

56:40

out of that

56:40

laptop. And that might involve sort

56:42

of So the actual quote is, Hunter Biden literally

56:44

could have had the corpses of children --

56:47

Mhmm. -- in the basement. There's there's

56:49

a dark humor to it. Right? Which is, I think, you speak

56:51

to, I would not have cared. There's

56:53

nothing. It's Hunter Biden. It's not Joe Biden.

56:55

Whatever the scope of Joe Biden's corruption is,

56:57

it is infinitesimal compared to the

56:59

corruption we know Trump is involved

57:01

in. It's like a firefly to the sun.

57:03

It's what you're speaking to. But let me make

57:06

the case that you're focused

57:08

on the surface stuff that

57:11

it's possible to have corruption

57:14

that masquerades in the thing we mentioned,

57:16

which is civility. You can spend

57:19

hundreds of billions of dollars or trillions of

57:21

dollars on wars in

57:23

the Middle East. For example, something that

57:25

you've changed your mind on in terms of

57:28

the negative impact it has on the world.

57:30

And that, you know, the military industrial

57:32

complex everybody

57:34

is very nice. Everybody is very civil.

57:37

They're very upfront. Here's how we're spending

57:39

the money. Yeah. Sometimes somehow

57:41

disappears in different places, but

57:43

that's the way, you know, wars are complicated. And

57:46

everyone is very polite, there's no coke

57:49

and strippers or whatever is on the laptop.

57:52

It's very nice and polite. In the meanwhile,

57:54

hundreds of thousands of civilians die, and

57:57

just

58:00

an incredible amount of hate is created because

58:02

people lose their family members, all that kind of

58:04

stuff. But there's no strippers and cocaine on a laptop.

58:07

So -- Yeah. -- but but it's not just

58:09

superficial. It is

58:13

when you when someone only

58:15

wants wealth and power and

58:17

fame. That is

58:19

their their objective function.

58:21

Right? They're like a robot that

58:23

is calibrated just to those variables.

58:26

Right? And they don't

58:28

care about the risks

58:30

we run on any other front. They don't care

58:32

about, I mean, Environmental

58:35

risk, pandemic risk, nuclear

58:37

proliferation risk, none of

58:40

it. Right? They're just tracking fame,

58:43

and money, and and

58:45

whatever can personally redound

58:49

to their self interest along those lines. And

58:51

they're not informed about the other risks

58:53

we're running really. I mean, in Trump, you

58:55

you had a president who was repeatedly

58:58

asking his generals, why can't we use

59:00

our nuclear weapons? Why can't we have

59:02

more of them? Why do I have fewer nuclear weapons

59:04

than JFK? Right? As though that were a

59:06

sign of anything

59:08

other than progress. Right? And

59:13

this is the guy who's got then the

59:15

the button. Right? I mean, he's got somebody's

59:17

following him around with a bag waiting to

59:19

take his order to to launch. Right?

59:24

That is a it's

59:26

just a risk

59:29

we should never run. One thing

59:31

Trump has going for him, I think, is that he's he

59:33

doesn't drink or do drugs. Right? I don't

59:35

know. There are, you know, people who allege he does

59:38

speed, but let's take

59:40

him at his word. He's not

59:42

deranging himself with

59:43

pharmaceuticals, at least. But

59:47

apart from diet coke.

59:51

But there's nothing wrong. Just for the record,

59:53

let me push back on that. There's nothing

59:54

wrong with Diet Coke. Yeah. Oh, I know.

59:57

I occasionally have some myself. There's no

59:59

medical there's no scientific evidence that I

1:00:01

observed the negatives of, you know,

1:00:03

all those studies about aspartame and all of

1:00:05

that. No, I don't know.

1:00:07

I hope you're right. Yeah.

1:00:10

I mean, everything you said about the military industrial

1:00:13

complex is true. Right? And

1:00:15

it's been we've been worrying about that on both

1:00:17

sides of the aisle for a very long

1:00:19

time. I mean, that's just that phrase

1:00:22

came from Eisenhower. It's

1:00:29

I mean, so much of

1:00:31

what ails us is a story

1:00:33

of bad incentives. Right? And

1:00:36

bad incentives are so powerful that

1:00:39

they corrupt even good people. Right?

1:00:42

How much more do they corrupt bad people?

1:00:44

Right? Like, so it's like, you want

1:00:47

at minimum, you want reasonably

1:00:49

good people, at least non pathological people

1:00:52

in a in the system trying

1:00:55

to navigate against the grain of bad

1:00:57

incentives. And better

1:00:59

still, all of us can get

1:01:01

together and and try to diagnose those

1:01:03

incentives and change them.

1:01:06

Right? And and we will really

1:01:08

succeed when we have a system of incentives

1:01:10

where the

1:01:12

the good incentives are so strong that

1:01:16

even bad people are effortlessly

1:01:19

behaving as though they're good people because they're

1:01:21

so successfully incentivized to behave

1:01:23

that way. Right? That's you

1:01:25

know, so so it's almost the

1:01:27

inversion of our current situation. So yes,

1:01:30

And you say I changed my mind about the war.

1:01:33

Not quite. I mean, I was never

1:01:37

a supporter of the war in Iraq. I

1:01:39

was always worried that it was a

1:01:41

a distraction from the war in Afghanistan. I was

1:01:43

a supporter of the war in Afghanistan, and

1:01:45

I will admit in hindsight that looks

1:01:47

like you

1:01:49

know, at best a highly ambiguous

1:01:52

and painful exercise, you know,

1:01:54

more likely a fool's errand. Right?

1:01:56

It's like, that did not turn

1:01:58

out well. It's it's it

1:02:01

wasn't for want of trying. I I don't, you

1:02:03

know, I have not done a deep dive on

1:02:05

on all of the failures there,

1:02:08

and maybe all of these failures are failures

1:02:10

in principle. I mean, maybe it's just maybe that's

1:02:12

not the kind of thing that can be done well

1:02:14

by anybody. Whatever our intentions.

1:02:18

But, yeah, the the move to Iraq always seemed

1:02:21

questionable to me. And when

1:02:24

we knew the problem, the immediate problem

1:02:27

at that moment, you know, Al Qaeda was

1:02:31

in Afghanistan and you know,

1:02:33

and then bounced into Pakistan. Anyway,

1:02:37

all you know, so yes. But my

1:02:40

my sense of the possibility of

1:02:43

nation building, my sense of

1:02:45

of, you know, and

1:02:47

so and so far as the the the

1:02:49

NeoCon spirit

1:02:53

of of, you

1:02:55

know, responsibility and idealism that, you know,

1:02:57

America was the kind of nation that

1:02:59

should be functioning in this way as

1:03:02

as the world's cop, and we got we have to get in there

1:03:04

and and untangle some of these

1:03:06

knots by force. Rather

1:03:09

often because, you know, if

1:03:11

we don't do it over there, we're gonna have to do it over here

1:03:13

kind of thing. Yeah,

1:03:15

some of that has definitely changed.

1:03:17

for me, in my thinking. I mean,

1:03:20

there are obviously cultural reasons why

1:03:22

it failed in Afghanistan, and and if you can't change

1:03:24

the culture, it's

1:03:29

You're not going to force a change at gunpoint

1:03:31

in the culture or it certainly seems that

1:03:34

that's not going to happen. And it took

1:03:36

us you know, over twenty years

1:03:38

to, apparently, to realize

1:03:39

that. That's one of the things you realize with the wars.

1:03:42

There's not going to be a strong signal

1:03:44

that things are not

1:03:45

working. You can just keep pouring money

1:03:47

into

1:03:47

a thing,

1:03:48

a military effort. Well, also, there there are

1:03:50

signs of it working too. You have all the

1:03:52

stories of girls now

1:03:54

going to school. Right? You know, the girls are getting battery

1:03:56

acid thrown in their faces by religious

1:03:58

maniacs, and then we come in there and

1:04:00

we stop that. And now girls are getting

1:04:03

educated and there's a and that's all

1:04:05

good. And our intentions are good there. And I

1:04:07

mean, we're on the right side of history there.

1:04:09

Girls should be going to school. You know,

1:04:11

Malala Yousafzai

1:04:13

should have the Nobel Prize, and she shouldn't

1:04:15

have been shot in the face by by the

1:04:17

Taliban. Right?

1:04:20

We know what the right answers are there. The question

1:04:23

is, what do you do when there are enough

1:04:25

in this particular case religious maniacs

1:04:28

who are willing to die and let their children

1:04:30

die in defense of crazy ideas

1:04:33

and moral norms that belong in the

1:04:35

seventh century.

1:04:38

And it's a problem we couldn't solve, and

1:04:40

we couldn't solve it even though we spent

1:04:43

trillions of dollars to solve it. Just

1:04:45

reminding me of the

1:04:47

thing that you and and Jack

1:04:49

Dorsey jokingly had

1:04:51

for a while, the discussion about

1:04:53

banning Donald Trump from

1:04:56

Twitter. But does any of it

1:04:58

bother you? Now that Twitter files came out

1:05:01

that I mean, it has to

1:05:03

do with sort of the Hunter laptop

1:05:05

Hunter Biden laptop story. Does

1:05:07

it bother you that there could

1:05:10

be a collection of people that make decisions

1:05:12

about who to ban or not. And that

1:05:16

that could be susceptible to bias and

1:05:18

to ideological influence? Well,

1:05:21

I I think it

1:05:23

always will be or, you know, in

1:05:25

the absence of perfect AI, it always

1:05:28

will be. And this becomes relevant with AI

1:05:30

as well. Yeah. Because, you know, you have some censorship

1:05:32

on AI

1:05:32

happening. Yeah. And it's an interesting question there

1:05:34

as well. I don't think Twitter is important as

1:05:36

people think it is. Right? And and I

1:05:39

I used to think it was more important when I was on it.

1:05:41

And now that I'm off of it, I think it's it's I

1:05:45

mean, first, let me say it's

1:05:47

just an unambiguously

1:05:49

good thing in my experience to

1:05:52

to delete your Twitter account. Right? It's like it

1:05:54

is just even the good parts

1:05:56

of Twitter that I miss were

1:05:59

bad in the aggregate. In

1:06:01

the the degree to which it

1:06:03

was fragmented my attention. That degree

1:06:06

to which my life was getting

1:06:08

doled out to me in periods

1:06:11

between those moments where I checked

1:06:13

Twitter, right, and had my attention diverted.

1:06:15

And I was, you know, I was not

1:06:18

a crazy Twitter addict. I mean, I

1:06:21

was probably a pretty normal user. I mean, was

1:06:23

not someone who was tweeting multiple

1:06:25

times a day or even every day. Right?

1:06:27

I think I probably

1:06:30

averaged something

1:06:32

like one tweet a day. I think I average.

1:06:34

But in reality, it was like, you know, there'd be, like, four tweets

1:06:37

one day, and then I wouldn't tweet for, you know, the better

1:06:39

part of the week. And but I was

1:06:41

looking a lot because

1:06:44

it was my news feed. I was just following, you

1:06:46

know, two hundred very smart people, and I just

1:06:48

wanted to see what they were paying attention to.

1:06:50

They would recommend articles, and I would read those articles.

1:06:53

And And then when I would read an article that then I would

1:06:55

that I would thought I should signal boost, I would tweet.

1:06:57

And so all of that seemed good and,

1:07:00

like, that's all separable from

1:07:02

all of the bullshit that came back

1:07:04

at me in in response to this largely in

1:07:06

response to this Hunter Biden thing. But

1:07:10

even the good stuff has

1:07:13

a cost. And

1:07:16

it comes at just this

1:07:18

point of your phone

1:07:20

is this perpetual stimulus, the presence

1:07:22

of which

1:07:26

is intrinsically fragmenting time and

1:07:28

attention. And now my phone is

1:07:30

is much less of a presence in my life.

1:07:32

And it's it's not that I don't check Slack

1:07:34

or check email, and I, you know, I

1:07:36

use it to work But my

1:07:41

sense of just what the world is and my

1:07:43

sense of my place in the world, the sense of

1:07:45

where I exist as a person, has

1:07:48

changed a lot by deleting my Twitter

1:07:50

account. I mean, I had a and it's just it's

1:07:54

and the and the things that I think mean, we

1:07:56

all know this. I mean, because we say

1:07:58

of someone who that person's too online. Right?

1:08:00

Like, what does it mean to be too online? And

1:08:03

where do you draw the that

1:08:06

that boundary? You know, where how do you know

1:08:08

what constitutes being too online? Well, in

1:08:12

some sense, just be I think being on on

1:08:14

social media at all is to be too online.

1:08:16

mean, given what it

1:08:18

does to given the kinds

1:08:21

of

1:08:21

information, it it

1:08:24

signal boosts. And

1:08:27

given the

1:08:29

given the impulse it kindles in

1:08:31

each of us to reach out

1:08:33

to our audience in at

1:08:36

in specific moments and in specific

1:08:38

ways. Right? It's like there

1:08:40

there are lots of moments now where I have an opinion

1:08:42

about something, but there's nothing for me

1:08:44

to do with that

1:08:45

opinion. Right? Like,

1:08:46

there's no Twitter. Right? So, like, there there are

1:08:48

lots of things that I would have tweeted -- Mhmm. --

1:08:50

in the last months. That

1:08:53

are not the kind of thing I'm gonna do a

1:08:55

podcast about. I'm not gonna roll

1:08:57

out ten minutes on that topic on my podcast.

1:08:59

I'm not gonna take the time to really think about it.

1:09:01

But had I been on Twitter, I would

1:09:03

have reacted to this thing in the

1:09:05

news or this thing that some somebody did.

1:09:07

Right? What do you do with that thought? No.

1:09:10

just let go of it. Like, chocolate

1:09:12

ice cream is the most delicious thing. Yeah. It's

1:09:14

it's usually not that sort of thing, but it's

1:09:16

it's just But then you look

1:09:18

at the kinds of problems people create for

1:09:20

themselves. You look at the

1:09:22

deranging and reputation-destroying

1:09:25

things that people do. And

1:09:28

and I look at the things that that have the

1:09:31

analogous things that have happened to me, I mean, the things

1:09:33

that have really bent my life around professionally.

1:09:36

Over the past, you know, decade. So

1:09:39

much of it is Twitter. I mean, honestly

1:09:42

in my case, almost a hundred percent

1:09:44

of it was Twitter. The the controversies I would

1:09:46

get into, the things I would

1:09:48

think I would have to respond to and a like, I would

1:09:50

release a podcast on a certain topic. I

1:09:52

would see some blowback on Twitter.

1:09:55

You know, it would give me the sense that

1:09:57

there was some signal that I really had to

1:09:59

respond to. Now that I'm off Twitter,

1:10:02

I I recognize that most of that was just it

1:10:05

was totally specious. Right? It was it was not something

1:10:08

I had to respond to. But yet, I would

1:10:10

then do a cycle of podcasts

1:10:12

responding to that thing that, like, taking

1:10:15

my foot out of my mouth or taking someone else's foot

1:10:17

out of my mouth, and it became

1:10:19

this this self perpetuating cycle,

1:10:24

which I

1:10:27

mean, it's you know, if you're having

1:10:29

fun, great. I mean, if it's if it's if it's

1:10:31

generative of useful information

1:10:35

and and engagement professionally and

1:10:38

and psychologically great.

1:10:40

But and and there, you

1:10:42

know, there was some of that on

1:10:44

Twitter. I mean, there were people who I've connected

1:10:46

with because IDW just you know,

1:10:49

one one of us ed the other on Twitter, and it

1:10:51

was hard to see how that was gonna happen otherwise.

1:10:53

But it

1:10:55

was largely just

1:10:58

a machine for

1:11:00

manufacturing

1:11:01

unnecessary controversy. Do you think it's possible

1:11:03

to avoid the drug of that? So now that

1:11:05

you've achieved this Zen state, is

1:11:07

it possible for somebody like you

1:11:10

to use it in a way that doesn't pull you into

1:11:12

the whirlpool.

1:11:12

And so anytime there's

1:11:14

a pull, you just, I mean, that's how I'd have

1:11:17

to use it. Yeah. But it's it's not

1:11:19

the way I wanted to use it. It's not the way

1:11:21

it promises itself

1:11:23

as. I wanted to have debate. I wanted

1:11:25

to actually communicate with people. I

1:11:28

wanted to hear from the person because

1:11:31

again, it's like being in Afghanistan. Right?

1:11:33

It's like there there there are the the

1:11:35

potted cases where it's

1:11:37

obviously good. Right? So in Afghanistan,

1:11:40

the girl who's getting an education, that is just

1:11:43

here. That's why we're here. That's obviously

1:11:45

good. I've had those moments on Twitter

1:11:47

where it's okay. I'm hearing from a smart person

1:11:49

who's detected an error I made

1:11:51

in my podcast or in a book

1:11:53

or they've just got some great idea

1:11:56

about something that I should spend time

1:11:58

on. And I would never have heard from this

1:12:00

person in any other way. And now

1:12:02

I'm actually in dialogue with them. And it's it's fantastic.

1:12:05

That's the promise of it to actually talk to

1:12:07

people. And so I kept getting lured

1:12:09

back into that. No.

1:12:12

The the way the sane or,

1:12:14

you know, sanity preserving way of of using

1:12:16

it is is

1:12:19

just as a marketing channel. You just put your

1:12:21

stuff out there and you don't look at what's coming back

1:12:23

at you. And that's,

1:12:25

you know, for you know, I'm on other social media

1:12:27

platforms that I don't even touch. I

1:12:29

mean, my team put post stuff on

1:12:31

Facebook and on Instagram. I never even see

1:12:33

what's on there. So

1:12:34

you don't think it's possible to see something

1:12:37

and not let it affect your mind? Well, no. That

1:12:39

that's definitely possible. But the

1:12:41

question is and I did that for

1:12:44

vast stretches of time. Right? And

1:12:46

But then the the

1:12:49

promise of the platform is dialogue

1:12:52

and feedback. Right? So, like, so why am

1:12:54

I, if I know, for

1:12:57

whatever reason, I'm gonna see,

1:12:59

like, ninety nine to one awful

1:13:02

feedback, you know, bad faith feedback.

1:13:04

Malicious feedback. Some of it's probably even

1:13:06

bots, and I'm not even aware of who's a person,

1:13:09

who's a bot. Right? But I'm just gonna stare

1:13:11

into this fun house mirror of

1:13:14

acrimony and dishonesty that

1:13:17

is going I mean, the the reason why I got

1:13:19

off is not

1:13:21

because I couldn't recalibrate

1:13:24

and and and find equanimity again with

1:13:26

all the the nastiness

1:13:28

that was coming back at me. And not that I couldn't

1:13:30

ignore it for vast stretches of time. But

1:13:34

I could see that I kept coming back to

1:13:36

it hoping that it

1:13:38

would be something that I could use as a

1:13:40

real tool for communication. And I was

1:13:42

noticing that it was

1:13:45

insidiously changing the

1:13:47

way I felt about people. Yeah. Both

1:13:49

people I know and people I don't know. Right? Like,

1:13:52

people I you know, mutual friends of ours who

1:13:54

are behaving in certain ways on Twitter, which just

1:13:56

seemed insane to me. And

1:13:58

then that became a signal. I felt like

1:14:00

I had to take into account somehow. Right?

1:14:03

You're seeing people at their worst, both friends

1:14:05

and strangers. And

1:14:08

I felt that it was, as much as I could

1:14:10

sort of try to recalibrate for it, I

1:14:14

felt that I was losing touch with

1:14:16

what was real. Because

1:14:18

people are performing, people are faking,

1:14:20

people are not being themselves, or you're

1:14:23

just seeing people at their worst. And

1:14:25

So it felt like, alright, what

1:14:27

was being advertised to me

1:14:30

here on a not just a daily basis,

1:14:32

I'm, you know, hourly basis

1:14:35

or, you know, increment sometimes of, you know,

1:14:37

multiple times an hour. I mean, I probably check

1:14:39

Twitter, you know,

1:14:42

at minimum ten times a day and maybe

1:14:44

I was checking it a hundred times a day

1:14:46

on some days, right, where things were really active

1:14:49

and I was really engaged with something. What

1:14:54

was being delivered into my brain there

1:14:56

was it was subtly

1:15:00

false information. About

1:15:02

how dishonest

1:15:04

and you

1:15:09

know, just generally unethical, totally

1:15:14

normal people are capable of being. Right?

1:15:16

It was like, it was it was a it is a fun house mirror.

1:15:18

It was it was I was seeing the most grotesque

1:15:21

versions of people who I know.

1:15:23

Right? People who I know I could sit down at

1:15:26

dinner with and they would never behave this way.

1:15:28

And yet they were they were coming at me on

1:15:30

Twitter, you know, I mean, essentially

1:15:34

turning ordinary people into

1:15:36

sociopaths. Right? It's like people are

1:15:38

just you

1:15:41

know, it's and there are analogies that many of

1:15:43

us have made it. It's like it's like one analogy

1:15:45

is road rage. Right? Like, people behave in the confines

1:15:48

of a car -- Mhmm. -- in ways that they never

1:15:50

would if they didn't have this metal box

1:15:52

around them, you know, and moving at speed. And it's

1:15:54

it's, you know, all that becomes quite hilarious

1:15:57

and and, you know,

1:16:00

obvious when they actually have to

1:16:02

stop at the light next to the person they just flipped off.

1:16:04

And they realized they didn't didn't

1:16:06

understand that the person coming out of that car next

1:16:08

to them with cauliflower ear is

1:16:10

someone who they never would have you

1:16:12

know, rolled their eyes at in public because they

1:16:14

they would have taken one look at this person realized this

1:16:16

this is the last person you wanna fight

1:16:18

with. That's one of the heartbreaking things is

1:16:20

to see See people who I

1:16:22

know who I admire, who I know

1:16:25

are friends be everything

1:16:27

from snarky to downright. Yeah.

1:16:31

mean and derisive

1:16:34

towards each other. But it doesn't make any

1:16:36

sense like this. This is the only place

1:16:38

where I've seen people I really admire

1:16:41

who have had a calm head -- Yeah. -- but

1:16:43

most things, like, really be shitty

1:16:45

to other people. It's probably the only place

1:16:47

I've seen that. And I I don't I

1:16:50

tend I choose to maybe believe that that's

1:16:52

not really them. There's something about the

1:16:54

system. Like,

1:16:56

if you go paintballing with Jordan

1:16:58

Peterson and You're

1:17:00

gonna shoot up You're gonna shoot your friends. Yeah. Yeah. You're gonna

1:17:02

shoot your friends when you kind of accept it. That's kinda

1:17:04

what you're doing in this little game that you're

1:17:06

playing, but it's sometimes hard to

1:17:08

remind yourself of that. And

1:17:10

I think I was guilty of that

1:17:13

sometimes. You

1:17:16

know, I don't think there's anything, I don't think

1:17:18

I ever did anything that I really

1:17:21

feel bad about. But, yeah, it was always

1:17:23

pushing me to the edge of that

1:17:26

somehow. And It's

1:17:29

just not healthy. It's not it's not

1:17:33

So the reason why

1:17:35

I deleted my Twitter account in the end was

1:17:37

that It was obviously making me

1:17:39

a worse person. And

1:17:41

and so and, yeah, is

1:17:43

there some way to be on there where it's not making me

1:17:45

a worse person? I'm sure there is, but it's

1:17:48

given the nature of the platform and given

1:17:51

what was coming back at me on it,

1:17:54

the way to do that is just to basically use it as

1:17:57

a one way channel of of communication. Just

1:18:00

just just marketing. You know, it's like, here here's

1:18:02

what I what I'm paying attention to. Look

1:18:04

at it if you want

1:18:05

to. And just you just push it out and then you don't you

1:18:07

don't look at what's coming back at you. I put

1:18:09

out a call for questions on Twitter. And then,

1:18:11

actually, quite surprising, there's a lot of

1:18:14

good ones. They're

1:18:16

like even if they're critical, they're

1:18:18

like being thoughtful, which is

1:18:20

nice. I

1:18:20

used it that way too, and that was what kept me

1:18:23

hooked.

1:18:23

But then there's also touch

1:18:26

balls sixty nine, wrote a question -- Mhmm.

1:18:29

-- asked. Can't imagine. This

1:18:31

is part of it. But one way to solve this is, you

1:18:33

know, we've got to get rid of anonymity for

1:18:36

this. It's like

1:18:36

Let me ask the question. Ask Sam why

1:18:38

he sucks. Was the question. Yeah. That's that's

1:18:40

good. Well One one reason

1:18:42

why I sucked was Twitter. That was and

1:18:45

I've since solved that problem. So,

1:18:47

touch balls sixty

1:18:49

nine. Yeah. Touch ball sixty nine should be happy

1:18:51

that I suck a little bit less

1:18:53

now that I'm off Twitter. I mean, the fact that you don't

1:18:55

have to hear from touch balls. Sixty nine on the

1:18:57

regular. The fact that you have

1:19:00

to have to see that

1:19:03

it probably can have a negative effect

1:19:05

just even in moderation,

1:19:07

just to see that. For me,

1:19:09

the negative effect is slightly

1:19:12

losing faith

1:19:13

in the underlying kindness of humanity.

1:19:16

Yeah. But you That

1:19:16

was true for me. Yeah.

1:19:17

You can also just reason your way out of it saying

1:19:19

that this is anonymity and this is kind of fun and

1:19:21

this kind of just the

1:19:23

the shit show of Twitter. It's okay, but

1:19:26

it does mentally affect you a little bit. Like,

1:19:28

I don't read too much into that

1:19:30

kind of comment. It's like that it's just

1:19:33

that's just trolling.

1:19:36

And it's, you know, I I get what's I

1:19:39

get I understand the fun the

1:19:41

person is having on the other side of that.

1:19:43

It's like, do you, though? I do. Well,

1:19:45

I do. I don't I mean, I don't behave that

1:19:47

way, but I

1:19:48

do. And for all I know that person

1:19:50

could be, you know, sixteen years old. Right? So it's

1:19:52

it's, like, it

1:19:53

could be an alt account for Elon. I don't

1:19:55

know. Well, yeah. That's right. Yeah.

1:19:57

Yeah. Yeah. Because I'm pretty

1:19:59

sure Elon would just tweet that. Yeah. It's totally

1:20:01

under his own name at this

1:20:02

point. But

1:20:04

you tell each other. Okay. So the

1:20:07

do you think so speaking of which, now that Elon

1:20:10

has has taken over Twitter,

1:20:13

Is there something that he could do to

1:20:15

make this platform better? This

1:20:17

Twitter and just social media in general, but

1:20:20

because of the aggressive nature of his

1:20:22

innovation that he's pushing. Is

1:20:24

there any way to make Twitter

1:20:26

a pleasant place for Sam Harris? Maybe.

1:20:31

Like, in the next life? I don't know. I

1:20:33

think I'm agnostic as to whether or not he

1:20:35

or anyone could make a social media

1:20:37

platform that really was healthy. So

1:20:39

you were just observing yourself week

1:20:41

by week seeing the effect as in your

1:20:44

mind and on how much you're actually

1:20:46

learning growing as a person and it was

1:20:48

negative. Yeah, I've seen the negativity in

1:20:50

other people's lives. I mean, it's obviously I

1:20:52

mean, he's not gonna he's not gonna admit

1:20:54

it, but I think it's obviously negative

1:20:56

for Elon. Right? And it's just not it's

1:21:01

and that was one of the things that, you know, you know,

1:21:03

when I was looking into the fun house mirror,

1:21:05

I was also seeing the fun house mirror

1:21:07

on his side of Twitter, and it was just even

1:21:09

more exaggerated. It's like, when

1:21:11

I when I was asking myself, why is he spending

1:21:14

his time this way? I then

1:21:16

reflected on why why, you

1:21:18

know, why was I spending my time this way,

1:21:20

to a lesser degree. Right? And at

1:21:22

a lesser scale, and at lesser

1:21:24

risk, frankly. Right? And so

1:21:29

and it was just so It's

1:21:31

not just Twitter. I mean, it's it's this

1:21:33

is part an Internet phenomenon.

1:21:36

It's like the the the whole Hunter Biden mess

1:21:38

that you explored. Thanks.

1:21:41

That was based,

1:21:43

I mean, I was on somebody's podcast, but

1:21:46

that was based on a clip taken from

1:21:48

that podcast, which was highly

1:21:50

misleading as to the the the general

1:21:52

shape of my remarks on that podcast. Even,

1:21:55

you know, I I had to then do my own

1:21:57

podcast untangling

1:21:59

all of that and admitting that

1:22:01

even I, even in the full context, was not

1:22:03

speaking especially well and didn't say exactly

1:22:06

what I thought in a way that was would have

1:22:08

been recognizable to anyone, you

1:22:10

know, even someone with not

1:22:13

functioning by a spirit of

1:22:15

charity, but but the clip was

1:22:17

quite distinct from the podcast itself.

1:22:20

The reality is is that we're living in an environment

1:22:22

now where the people are

1:22:25

so lazy and there's their

1:22:27

their attention is so limited, that

1:22:30

they they only have time for clips. But,

1:22:32

you know, ninety nine percent of people

1:22:34

will see a clip and will assume there's

1:22:37

no relevant context I need to understand

1:22:39

what happened in that clip. Right? And

1:22:42

obviously, the people who make those clips know that. Right?

1:22:45

And they're doing it quite maliciously.

1:22:47

And in this case, the person who made that clip and

1:22:49

subsequent clips of other podcasts was

1:22:52

quite maliciously trying to engineer,

1:22:54

you know, some reputational

1:22:57

immolation for me. And

1:23:02

being signal boosted by Elon

1:23:04

and other prominent people who can't

1:23:07

take the time to watch anything

1:23:09

other than a clip even when

1:23:11

it's their friend or someone who's ostensibly

1:23:13

their friend in that clip. Right? So it's

1:23:15

a total failure and understandable failure

1:23:18

of ethics that everyone

1:23:20

is so short on time and they're so fucking

1:23:22

lazy that that

1:23:25

and and and that we now have these context in

1:23:27

which we react so quickly to things.

1:23:29

Right? Like, Twitter is inviting an

1:23:31

instantaneous reaction to this

1:23:34

clip that

1:23:37

It's it's just

1:23:39

too tempting to just say

1:23:42

something and not know what you're

1:23:44

even commenting on. And most of the people

1:23:46

who saw that clip don't

1:23:48

understand what

1:23:50

I actually think about any of these issues. And

1:23:53

the irony is People are gonna find

1:23:55

clips from this conversation that are

1:23:57

just as misleading. Mhmm. And they're gonna export

1:23:59

those, and then people are gonna be dunking on those clips.

1:24:01

And, you know, we're all living and dying by

1:24:03

clips

1:24:04

now. And it's it's

1:24:07

dysfunctional. See, I I think it's possible to

1:24:09

create a platform. I

1:24:12

think we will keep living on clips.

1:24:14

But, you know, when I saw that clip of you talking

1:24:16

about children and so on, just knowing

1:24:18

that you have a sense of humor, you just

1:24:20

went to a dark place in terms of humor.

1:24:22

Right. So, like, I didn't even bother.

1:24:24

And then I knew that the way clips work,

1:24:27

is that people will use it for virality

1:24:29

sake, but the giving

1:24:32

giving a person benefit of the doubt

1:24:34

That's not even the right term. sound like I was

1:24:37

just really, like, interpreting

1:24:39

it in the

1:24:41

context that Trump has passed. The

1:24:43

truth is, I even give

1:24:46

Trump the benefit

1:24:48

of the doubt when I see a clip of Trump.

1:24:50

Because there are famous clips of Trump

1:24:52

that are very misleading as to what he was saying

1:24:54

in context. And I've been honest

1:24:57

about that. Like, the whole, you know, there

1:24:59

were good people on both sides scandal

1:25:01

around his remarks after Charlottesville,

1:25:04

that the clip that got exported and

1:25:07

got promoted by everyone,

1:25:09

you know, left of center.

1:25:12

From Biden on down, you know, the New York

1:25:14

Times CNN, there's nobody that

1:25:16

I'm aware of. Who has

1:25:19

honestly, you know, apologized

1:25:22

for what they did with that clip.

1:25:24

That clip, he did not say what he seemed to

1:25:26

be saying in that clip about the the Nazis

1:25:29

at Charlottesville. Right? And I've I've

1:25:31

always been very clear about that. So

1:25:34

it's just, you know, even

1:25:37

even people who I think should

1:25:40

be marginalized and people who

1:25:42

who who

1:25:46

should be criticized because they really are

1:25:48

terrible people who are doing dangerous things

1:25:50

and for bad reasons, I

1:25:53

think we should be honest about what they actually

1:25:56

meant in context. Right? And

1:25:58

and this this goes to Anyone

1:26:00

else we might talk about, you know, who are who's more

1:26:03

where the case is much more ambiguous. But

1:26:05

Mhmm. Yeah. So everyone's

1:26:09

it's just so. And I'm

1:26:12

sure we're gonna get to AI. But, you know,

1:26:14

the the prospect of being able to manufacture

1:26:16

clips with

1:26:18

AI and deep fakes and

1:26:21

that where it's gonna be hard for most people most

1:26:23

of the time to even figure out whether

1:26:25

they're in the presence of something real, you

1:26:29

know, forget about being divorced from context.

1:26:31

There was no context. I

1:26:34

mean, that is, that's an information

1:26:37

apocalypse that we are right on

1:26:39

the cusp of

1:26:39

and, you know, it's It's

1:26:41

terrifying. Or it could be just a new world

1:26:43

like Alice going to Wonderland,

1:26:46

where humor is the only thing we

1:26:47

have, and it will save us. Maybe

1:26:49

in the end, Trump's approach

1:26:51

to social media was the right one after

1:26:54

all. Nothing is true and everything is absurd.

1:26:57

We can't live that

1:26:58

way. People function on the basis of what

1:27:00

they assume is true. Right? They

1:27:02

couldn't

1:27:03

function otherwise. To do anything,

1:27:05

it's like, I mean, you have to you have to know what

1:27:07

you think is gonna happen. Or

1:27:10

you have to at least give a probabilistic weighting

1:27:13

over the possibilities. Otherwise, you're

1:27:15

you're gonna be incapacitated by

1:27:17

You're not gonna like, people want certain

1:27:20

things, and they have to have a rational plan to get

1:27:22

those desires gratified. And,

1:27:24

you know, they don't wanna die. They don't want their kids

1:27:26

to die. You tell them there's a comet

1:27:28

hurtling toward Earth and they should get outside

1:27:31

and look up. Right? They're

1:27:33

gonna do it. And if it turns out it's misinformation,

1:27:35

you know,

1:27:39

it's gonna matter because it comes down to, like,

1:27:41

what medicines do you give your children?

1:27:44

Right? Like, we're gonna be manufacturing fake

1:27:47

journal articles. I mean, this is I'm

1:27:49

sure someone's using chatGPT for

1:27:52

this as we speak. Mhmm. And

1:27:55

If it's not credible, if it's not persuasive

1:27:59

now to most people, I mean,

1:28:01

honestly, I don't think we're gonna is this

1:28:04

I'll be amazed if it's a year before.

1:28:07

We we can actually create journal

1:28:09

articles that would take, you know, a

1:28:12

PhD to debunk they're

1:28:15

completely fake. And

1:28:18

there are people who are celebrating this

1:28:20

kind of you

1:28:25

know, coming cataclysm. But I I just it's

1:28:27

it's just they're the people who don't

1:28:30

have anything to lose who were celebrating it or

1:28:32

just are so confused that they just don't even know what's

1:28:34

at stake. And then there are the people, the

1:28:36

few people who we could count

1:28:38

on a few hands who have managed to insulate

1:28:40

themselves or at least imagine they've insulate insulated

1:28:43

themselves from the downside here enough that

1:28:46

they're not implicated in the great

1:28:48

unraveling. We are witnessing

1:28:50

or could could

1:28:51

witness. The shaking up of what is

1:28:53

true. So actually, that returns us to

1:28:55

experts. Do you think experts can save

1:28:57

us? Is there such thing as expertise

1:29:00

experts of something? How do you know if you've achieved

1:29:02

it? I think it's important to

1:29:04

acknowledge that there's

1:29:07

something paradoxical about how

1:29:09

we relate to

1:29:11

authority, especially within

1:29:14

science. And I don't

1:29:16

think that paradox is going away, and it's just

1:29:18

it doesn't have to be confusing. It's just and it's

1:29:20

not, essentially, a paradox. just like there

1:29:22

are different moments in

1:29:23

time. So

1:29:26

it is true to say that

1:29:31

within science or within within

1:29:33

rationality generally. I mean, whenever

1:29:35

you're having a fact based discussion

1:29:37

-- Mhmm. -- about anything. It

1:29:40

is true to say that the truth

1:29:42

or falsity of a statement does

1:29:45

not even slightly depend on

1:29:48

the credentials of the person

1:29:50

making the statement. Right? So

1:29:52

it doesn't matter if you're a Nobel laureate, you can

1:29:54

be wrong. Right? The

1:29:57

last sentence you spoke could be total bullshit.

1:29:59

Right? And it's also possible

1:30:01

for someone who's deeply uninformed to

1:30:04

be right about something. Or or to

1:30:06

be right for the wrong reasons. Right? Or or someone just

1:30:08

gets lucky, or somewhat or

1:30:10

And there are middling cases where

1:30:12

you have like a a backyard

1:30:15

astronomer who's got no credentials.

1:30:17

But he just loves astronomy and he's got

1:30:19

a telescope and it's he's spent a lot

1:30:21

of time looking at the night sky and

1:30:23

he discovers a comet that no one

1:30:26

else has seen, you know, not even the

1:30:28

professional expert astronomers.

1:30:31

And got to think that happens less

1:30:33

and less now, but but some version

1:30:35

of that keeps happening and it and it may

1:30:37

always keep happening in every area

1:30:39

of expertise. Right? So

1:30:45

It's true that truth

1:30:48

is orthogonal to the

1:30:50

reputational concerns we have among

1:30:52

apes who are talking about the truth.

1:30:56

But it is also true that most of the

1:30:58

time real

1:31:00

experts are much

1:31:02

more reliable than frauds

1:31:05

or people who are not experts.

1:31:07

Right? So expertise

1:31:09

really is a thing. Right? And when you

1:31:11

know, when you're flying an airplane in a

1:31:14

in a storm, you don't want

1:31:16

just randos come into the cockpit

1:31:18

saying, listen, I've got a new idea about how to, you

1:31:20

know, how we should tweak these controls. Right?

1:31:23

You want someone who's a trained pilot, and

1:31:25

and and that training gave them something. Right?

1:31:27

It gave them a set of competences and

1:31:29

intuitions and they

1:31:32

know what all those dials and switches do.

1:31:34

Right? And I don't. Right? I shouldn't

1:31:36

be flying that plane. So

1:31:40

when things really matter, you know,

1:31:43

and, to put this at thirty thousand feet

1:31:45

in a storm, to sharpen this up,

1:31:49

we want real experts to be in

1:31:51

charge. Right? And we

1:31:54

are at thirty thousand feet a lot

1:31:56

of the time on a lot of issues. Right?

1:31:58

And whether they're public health issues,

1:32:00

whether it's a geopolitical

1:32:03

emergency like Ukraine, I

1:32:05

mean, climate change. mean,

1:32:08

just pick your pick your topic.

1:32:12

There are real problems And

1:32:15

the clock is rather often ticking and

1:32:17

their solutions are non obvious. Right?

1:32:20

And and so expertise is

1:32:22

a thing and deferring to

1:32:24

experts much of the time

1:32:27

makes a lot of sense. It's at minimum,

1:32:30

it it prevents

1:32:33

spectacular errors of incompetence

1:32:36

and and just,

1:32:39

you know, ignorance. But

1:32:42

even in in the case of some where where you're talking

1:32:44

about someone, I mean, people like ourselves

1:32:47

who are like, we're well educated. We

1:32:49

we're not the the worst possible candidates

1:32:52

for, you know, the Dunning-Kruger effect. When

1:32:54

we're going into a new area where we're

1:32:56

not experts, we're fairly alert

1:32:58

to the possibility that we don't, you know, it's

1:33:00

not as simple as things seem at first, and

1:33:02

we don't know how our tools

1:33:05

translate to this new area. We can

1:33:07

be fairly circumspect, but we're also

1:33:10

because we're well educated, we

1:33:12

can we're and we're pretty quick studies.

1:33:15

We can learn a lot of things pretty fast, and

1:33:18

we can begin to play a language game

1:33:20

that sounds fairly expert. Right?

1:33:23

And in

1:33:25

that case, The

1:33:28

the invitation to do your own research, right,

1:33:31

is in when when

1:33:33

times are good, I

1:33:35

view it as an invitation to waste your

1:33:37

time pointlessly. Right?

1:33:39

When times are good. Now the truth

1:33:42

is times are not all that good. Right? And we

1:33:44

have the

1:33:47

ongoing public display of

1:33:49

failures of expertise. We have

1:33:51

experts who are obviously corrupted by bad incentives

1:33:53

we've got experts who, you

1:33:56

know, perversely won't admit they were wrong

1:33:58

when they, in fact, you know, are demonstrated to be

1:34:00

wrong. We've got institutions that have

1:34:02

been captured by a political

1:34:05

ideology that's not truth

1:34:06

tracking. I mean, this this this whole woke

1:34:11

encroachment into really every

1:34:13

place, you know, whether it's universities or

1:34:15

science journals or government.

1:34:17

It's just, that has been

1:34:19

genuinely deranging. So

1:34:23

there's a lot going on where

1:34:25

experts and the very concept of

1:34:27

expertise has seemed to discredit itself.

1:34:29

But the reality is is that there is a massive

1:34:32

difference when anything matters, when there's anything

1:34:34

to know about anything, there is a massive

1:34:36

difference most of the time between

1:34:38

someone who has really done the work

1:34:41

to understand that domain. And someone

1:34:43

who hasn't. And

1:34:46

if I get sick or someone close

1:34:48

to me gets sick, you

1:34:51

know, I have a PhD in neuroscience,

1:34:53

right? So I can read a medical journal article

1:34:56

and understand a lot of it. Right?

1:34:58

And, you know, I'm just fairly conversant

1:35:01

with, you know, medical terminology. And

1:35:03

I understand these methods, and I I'm

1:35:05

alert to the different because I've, you know, because

1:35:08

in neuroscience, I've spent hours and hours

1:35:10

in journal clubs, you know, diagnosing you

1:35:12

know, analyzing the

1:35:14

difference between good and bad studies, and

1:35:17

I'm alert to the difference between good and bad

1:35:19

studies in in medical journals. Right? And understand

1:35:22

that bad studies can get published and, you

1:35:24

know, etcetera. And

1:35:27

experiments can be poorly designed. I'm

1:35:30

alert to all of those things, but when I get sick or

1:35:32

when someone close to me gets sick, I

1:35:34

don't pretend to be a doctor. Right?

1:35:36

I've got no clinical experience. I

1:35:39

don't go down the rabbit hole on Google

1:35:41

for days at a stretch trying

1:35:43

to become a doctor. Much less a

1:35:45

specialist in the domain of problem

1:35:48

that has been visited upon me or my

1:35:50

family. Right? So if someone close to me gets

1:35:52

cancer, I don't pretend to be an oncologist.

1:35:55

I don't go ahead and start I don't start reading,

1:35:57

you know, in journals of

1:35:59

oncology and

1:36:01

try to really get up to speed as an

1:36:03

oncologist. Because it's it's not it's

1:36:08

one is a bad and

1:36:11

impotent and very likely misleading

1:36:15

use of my time. Right? And

1:36:18

if

1:36:20

I decided, if I had a lot of runway,

1:36:23

if I decided, okay, it's really

1:36:25

important for me to know everything I can.

1:36:27

At this point, I wanna I know someone's gonna

1:36:29

get cancer. I may not go back to school

1:36:31

and become an oncologist, but what I

1:36:33

want to do is I want to know everything I can know

1:36:35

about cancer. Right? So I'm going to take the next

1:36:37

four years and spend most

1:36:39

of my time on cancer, okay, I

1:36:41

could do that. Right? I still think

1:36:43

that's a waste of my time. Mhmm. I

1:36:46

still think at the end of even at the end

1:36:48

of those four years, I'm not going to be the best

1:36:50

person to have intuitions

1:36:52

about what to do in the face of the next

1:36:54

cancer that I have to confront. I'm

1:36:58

still gonna want a better oncologist than

1:37:01

I've become to tell me what

1:37:03

he or she would do if they were in my shoes

1:37:05

or in the shoes of, you know, my family member.

1:37:08

I'm gonna you know, I'm what

1:37:10

I'm not advocating is a

1:37:14

a blind trust in authority. Like, if you

1:37:16

get cancer, And you're

1:37:18

talking to one oncologist and they're recommending some

1:37:20

course of treatment. By all means, get a second

1:37:22

opinion. Get a third opinion. Right?

1:37:25

But it matters that those opinions

1:37:27

are coming from real experts and not from,

1:37:30

you know, Robert

1:37:32

Kennedy junior, you know, who's telling

1:37:35

you that, you know, you got it because you got a, you

1:37:37

know, a vaccine. Right? It's like it's it's just

1:37:39

that we're swimming

1:37:41

in a sea of misinformation where you've got people

1:37:44

who are moving the opinions of

1:37:46

millions of others who

1:37:49

who should not have an opinion

1:37:52

on these topics? Like, there's no

1:37:55

there is no scenario in

1:37:57

which you should be getting your opinion

1:37:59

about vaccine safety or or

1:38:01

climate change or the

1:38:04

war in Ukraine. Or anything

1:38:06

else that we might wanna talk about from

1:38:09

Candace Owens. Right? It's just like,

1:38:11

the like she she's not

1:38:13

a relevant expert on any of those topics.

1:38:16

And what's more she doesn't seem to care?

1:38:18

Right? And and and she's living in a culture

1:38:21

that has amplified

1:38:24

that not caring into a business model

1:38:27

and an effective business model. Right? So it's

1:38:29

just it's and that is something

1:38:31

very trumpian about all that. Or, like, that's that's

1:38:33

the problem is

1:38:35

the culture, not these specific

1:38:37

individuals. So

1:38:39

the paradox here is that expertise

1:38:43

is a real thing. And

1:38:45

we defer to it a lot as

1:38:47

a as a labor saving device, and it's just as

1:38:49

and just based on the

1:38:53

the the reality that it's very hard

1:38:55

to be a polymath. Right? And specialization is

1:38:58

a thing. Right? And so that people who specialize in

1:39:00

a very narrow topic, they know more about that

1:39:02

topic than the next guy no matter how smart that

1:39:04

that guy or gal is. And

1:39:09

that those differences matter, but

1:39:11

it's also true that

1:39:13

when you're talking about science, sometimes

1:39:19

the best experts are wrong. The

1:39:21

scientific consensus is wrong. You

1:39:23

get AAA sea change in the thinking

1:39:26

of a whole field because one person

1:39:28

who's an outlier for whatever reason decides,

1:39:31

okay, I'm you know,

1:39:33

I'm gonna prove this point and they

1:39:35

prove it. Right? So somebody like

1:39:38

the doctor who believed

1:39:41

that stomach ulcers were not due to stress,

1:39:43

but were due to to h pylori

1:39:46

infection. Right? So he just drank a vial of H.

1:39:48

pylori bacteria and and proved that

1:39:50

and then quickly got an ulcer and

1:39:52

convinced the field that that at

1:39:54

minimum h pylori was involved in

1:39:56

in that process. Okay. So,

1:39:59

yes, everyone was wrong. That

1:40:01

doesn't disprove the

1:40:04

reality of expertise. It doesn't

1:40:06

disprove the utility of relying on

1:40:08

experts most of the time, especially

1:40:11

in an emergency, especially when the clock is

1:40:13

ticking, especially when you're, you

1:40:15

know, you're you're in this particular cockpit

1:40:17

and you only have one chance to land

1:40:19

this

1:40:20

plane. Right? You want the real

1:40:22

pilot at the controls.

1:40:24

But there's just a few things to say.

1:40:28

So one, you mentioned this example

1:40:30

with cancer and doing your own research. There

1:40:32

there are several things that are different about

1:40:35

our particular time in history.

1:40:37

One, doing your own research has

1:40:39

become more and more feasible. Because

1:40:42

you can read the Internet made

1:40:44

information a lot more accessible. So

1:40:46

you can read a lot of different meta

1:40:49

analysis You can read blog

1:40:51

posts that describe

1:40:53

to you exactly the flaws in the different papers

1:40:56

that make up the meta analysis.

1:40:59

And you can read a lot of those blog posts that

1:41:01

are conflicting with each other, and you can take that

1:41:03

information in. And in a short amount of time,

1:41:05

you can start to make good

1:41:09

faith interpretation. For example, I don't

1:41:12

know. I don't wanna overstate things, but

1:41:15

if you suffer from depression, for example,

1:41:18

then there you could go to

1:41:20

an expert and a doctor that prescribes you some

1:41:22

medication. But you

1:41:24

could also challenge some of those ideas and see,

1:41:26

like, what are the different medications? What are the different

1:41:28

side effects? What are the different solutions to depression?

1:41:31

All that kind of stuff? And I think depression

1:41:33

is just a really difficult problem that's very

1:41:37

I don't want to again say incorrect things,

1:41:39

but I think it's there's

1:41:41

a lot of variability of what depression

1:41:44

really means. So being introspective

1:41:46

about the type of depression you have and

1:41:49

the different possible solutions you have. Just

1:41:51

doing your own research as a first

1:41:53

step before approaching a doctor or

1:41:55

as you gather multiple opinions, could

1:41:57

be very beneficial in that case. Now,

1:42:00

that's depression that's something that's been studied

1:42:02

for a very long time with a new pandemic

1:42:05

that's affecting everybody. It's,

1:42:08

you know, with the airplane, I'd

1:42:11

equate it to, like, nine eleven or something. Like,

1:42:14

the the the new emergency just happened,

1:42:16

and everybody, every

1:42:19

expert in the world is

1:42:21

publishing on it and talking about

1:42:23

it. Mhmm. So doing your own research there

1:42:25

could be exceptionally effective in

1:42:27

asking questions. And then

1:42:29

there's a difference between experts,

1:42:32

virologists. And it's actually a good question.

1:42:34

Who is exactly the expert in

1:42:37

a pandemic? Yeah. But

1:42:39

there's the actual experts doing

1:42:42

the research and publishing papers. And

1:42:44

then there's the communicators of that expertise.

1:42:47

And the question is, if

1:42:50

the communicators are failing to

1:42:54

a degree where doing your own research

1:42:57

is actually the more effective way to figure

1:42:59

out policies and solutions because

1:43:01

you're not competing with experts. You're competing

1:43:03

with the communicators of expertise. That

1:43:05

could be WHO or CDC in the case of

1:43:07

pandemic or politicians or

1:43:10

political type of science figures like Fauci.

1:43:13

There's a question there. Of

1:43:17

the effectiveness of doing your research, your

1:43:19

own research in that context. And

1:43:23

the competing forces

1:43:25

the incentives that you've mentioned, is

1:43:27

you can become quite popular by being

1:43:30

contrarian. By saying, everybody's

1:43:32

lying to you, all the authorities lying to you, all the institutions

1:43:34

lying to you. Mhmm. So those are the

1:43:36

waters you're swimming in. Yeah. But

1:43:38

I think doing your own research in

1:43:40

that kind of context could be quite

1:43:43

effective.

1:43:44

Let me be clear. I'm not saying you

1:43:47

shouldn't do any research. Right? I'm not saying that

1:43:49

you shouldn't be informed about an issue. I'm not

1:43:51

saying you shouldn't read articles on

1:43:53

on whatever the topic is. And yeah, yes, if

1:43:56

I got cancer or someone close to me got cancer,

1:43:59

I probably would read more about cancer than I've

1:44:01

read thus far about cancer, and

1:44:03

I've read some.

1:44:06

So I'm not making

1:44:08

a virtue of ignorance and

1:44:11

a blind obedience to authority. And

1:44:13

I again, I recognize that that authorities

1:44:16

can discredit themselves or they can be wrong

1:44:19

they can be wrong even when there's no

1:44:21

discredit. There's just there's a lot we don't understand

1:44:23

about the the nature of the world.

1:44:28

But still this this vast

1:44:30

gulf between truly

1:44:32

informed opinion and

1:44:35

bullshit exists. It always

1:44:37

exists. And and

1:44:40

conspiracy thinking is, rather

1:44:44

often, you know, most of the time, a

1:44:47

species of bullshit, but it's not

1:44:49

always wrong. Right? There are real conspiracies. And

1:44:51

there there really are just

1:44:54

awful corruptions,

1:44:57

you know, born of bad incentives within

1:45:00

our, you know, our scientific processes

1:45:02

within institutions. And again,

1:45:05

we've mentioned a lot of these things in the past

1:45:07

anyway. What what

1:45:09

woke political ideology did to

1:45:11

scientific communication during the

1:45:13

pandemic was terrible. And

1:45:15

it was really corrosive of public trust,

1:45:17

especially on the on the right.

1:45:21

For understandable reasons. And it was just it

1:45:23

was crazy. Some of the things that were being

1:45:25

said, and still are. And

1:45:27

these cases are all different. I mean, like, you take depression.

1:45:30

We just don't know enough about depression for

1:45:33

anyone to be that confident about anything.

1:45:35

Right? And there are many different modalities in

1:45:38

which to interact with it as a problem.

1:45:40

Right? So there's yes, pharmaceuticals have

1:45:43

whatever promise they have, but there's there's

1:45:45

certainly reason to be concerned that they don't

1:45:47

work well for everybody and

1:45:51

and I think that's obvious. They don't work

1:45:53

well for everybody, but they do work for some

1:45:55

people. But

1:46:00

again, depression is a multifactorial

1:46:02

problem and there are different

1:46:05

levels at which to influence it and

1:46:07

there are, you know, there are things like meditation, there are things

1:46:09

like just life changes and

1:46:12

and one of the worst

1:46:14

things about depression is that when you're depressed,

1:46:17

all of the things that would be good for you to do are

1:46:19

precisely things you don't wanna do. You don't have

1:46:21

any energy to socialize. You don't wanna get

1:46:23

things done. You don't wanna exercise. You don't and

1:46:27

all of those things if you got those up and running,

1:46:29

they do make you feel better in, you know, in

1:46:31

the aggregate. But the

1:46:34

reality is that there are clinical

1:46:36

level depressions that are so bad that it's

1:46:39

just we just don't have good

1:46:41

tools for them. And it's not

1:46:43

enough to tell them. There's no life change someone's

1:46:45

gonna embrace

1:46:47

that is going to be an obvious remedy for

1:46:49

that. I

1:46:53

mean, pandemics are obviously a

1:46:55

complicated problem, but I

1:46:58

would consider it much simpler than depression

1:47:00

in terms of you know,

1:47:02

what's on the menu to be chosen

1:47:05

among, you know, the various choices. Just less complexity in the

1:47:08

logic by which you would make those choices. Yeah. So

1:47:10

it's, like, We have a virus. We have

1:47:12

a new virus. It's

1:47:14

some version of bad. You know,

1:47:17

it's human transmissible. We're

1:47:19

still catching up. We're catching up to every aspect

1:47:21

of

1:47:21

it. We don't know how it spreads. We don't know how

1:47:24

effective

1:47:24

masks are. At a certain point, we knew it was respiratory,

1:47:27

but before we knew that, and

1:47:29

whether it spread by fomites. So, like,

1:47:31

all that we were confused about a lot of things,

1:47:33

and we're still confused. It's been a moving

1:47:35

target this whole time. And been changing this whole

1:47:37

time. And our responses to

1:47:40

it have been, you know,

1:47:42

we ramped up the vaccines as quickly

1:47:44

as we could, but too

1:47:47

quick for some, not quick

1:47:49

enough for others, we could have done human

1:47:51

challenge trials and got them out more quickly

1:47:54

with better data. I

1:47:56

think that's something we should probably look at in the future

1:47:58

because that, to my eye,

1:48:00

that would make ethical sense to do challenge

1:48:02

trials. And

1:48:07

so much of my concern about COVID, many

1:48:09

people are confused about my concern about COVID.

1:48:12

My concern about COVID has for

1:48:14

much of the time not been

1:48:16

narrowly focused on COVID itself,

1:48:19

how dangerous I perceive COVID

1:48:21

to be as an illness.

1:48:25

It has been for the longest time even

1:48:28

more concerned about our ability

1:48:30

to respond to a truly

1:48:33

scary pathogen next time.

1:48:36

Like, you know, outside

1:48:38

those initial months, you know, give

1:48:41

me the first six months to be

1:48:44

quite worried about COVID and and

1:48:46

the unraveling of society, buying a supply of

1:48:48

toilet paper. You wanna secure a steady supply

1:48:50

of toilet paper. But beyond

1:48:53

that initial period when

1:48:55

we had a sense of what we were dealing with and

1:48:57

we had every hope that the vaccines are actually

1:49:00

gonna work and we're getting and we knew we were getting those

1:49:02

vaccines in short order, right? Beyond

1:49:04

that and we knew

1:49:06

just how dangerous the illness was

1:49:08

and how dangerous it wasn't. For

1:49:12

years now, I've just been worrying

1:49:14

about this as a failed dress rehearsal

1:49:17

for something much worse. Right? I think what we

1:49:19

proved to ourselves at this moment

1:49:21

in history is that we have

1:49:23

built informational tools that we

1:49:25

do not know how to use.

1:49:27

And we have made ourselves, we basically

1:49:30

enrolled all of human society

1:49:32

into a psychological experiment that

1:49:35

is deranging

1:49:38

us and making it virtually

1:49:40

impossible to solve coordination

1:49:43

problems that we absolutely have to

1:49:45

solve. Next time when things are worse.

1:49:47

Do you understand who's at fault for

1:49:50

the way this unraveled? The

1:49:53

way we didn't seem

1:49:55

to have the distrust of

1:49:58

institutions and institutional science

1:50:00

that

1:50:00

grew, like, seemingly exponentially or

1:50:02

got or got revealed through this process,

1:50:05

who who is at fault

1:50:06

here? And what's the

1:50:08

fix? So much blame to go around, but

1:50:10

so much of it is not a matter

1:50:12

of bad

1:50:14

people conspiring to do bad things.

1:50:17

It's a matter of

1:50:20

incompetence and misaligned incentives

1:50:23

and just just ordinary,

1:50:25

you

1:50:26

know, there's plain vanilla dysfunction.

1:50:29

But my problem was that people like you,

1:50:31

people like Bret Weinstein, people

1:50:34

like that I look to for reasonable,

1:50:36

difficult conversations on difficult topics,

1:50:39

have a little bit lost their mind, became

1:50:41

emotional, dogmatic, in style

1:50:43

of conversation, perhaps not in the depth of

1:50:45

actual thinking. But

1:50:48

there is, you know, at least something

1:50:50

of that nature about you. And it just

1:50:52

feels like the pandemic made

1:50:55

people really more emotional

1:50:57

than before. And then Kimbal Musk

1:50:59

responded I think something

1:51:01

that you probably would agree with. Maybe

1:51:03

not. I think it was the combo of Trump and

1:51:06

the pandemic. Trump triggered the far

1:51:08

left to be way more active than they

1:51:10

could have been without him. And then the

1:51:12

pandemic handed big

1:51:13

government, nanny-state lefties a huge platform

1:51:15

on a silver platter. A one-two punch,

1:51:18

and here we are. I would agree with some

1:51:20

of that. I I'm not sure how much to read into

1:51:22

the nanny state

1:51:24

concept. But but yet, like, basically,

1:51:27

got people on the far left really activated

1:51:30

-- Yep. -- and then gave control

1:51:32

to I don't know if you

1:51:33

say, the nanny state, but just control the

1:51:35

government. That

1:51:38

one, executed poorly, has created

1:51:40

a complete distrust in government. My

1:51:43

fear is that there was gonna be that complete distrust

1:51:45

anyway, given the nature of the information

1:51:48

space, given the level of conspiracy thinking,

1:51:50

given the gaming of of

1:51:54

these tools by an anti

1:51:56

vax cult. I mean, there really is an anti

1:51:58

Vax cult that that just

1:52:01

ramped up its its energy during

1:52:03

this moment. But it's

1:52:05

a small one. It's it's not to say that everything

1:52:08

every concern about vaccines is

1:52:10

a species of it was born of misinformation

1:52:13

or born of this cult, but there is a cult

1:52:15

that is just, you know, and and, you know,

1:52:18

And the core of Trumpism is

1:52:20

a cult. I mean, QAnon is a cult.

1:52:24

And so there's a lot of lying and there's

1:52:26

a lot of confusion.

1:52:29

You know, there there are it's

1:52:31

almost impossible to exaggerate how

1:52:34

confused some people are and how and how

1:52:36

fully their their lives are organized

1:52:38

around that confusion. I mean, there are people who think

1:52:40

that the world's being run by pedophile

1:52:42

cannibals and that, you know, Tom Hanks

1:52:44

and Oprah Winfrey and Michelle Obama

1:52:47

are among those cannibals. I mean, like, they're adjacent

1:52:51

to the pure crazy. There's the semi

1:52:53

crazy. And adjacent to the semi crazy,

1:52:55

there's the cynical opportunist

1:52:58

asshole. And the

1:53:01

layers of of

1:53:03

bad faith are, you know,

1:53:06

hard to fully diagnose.

1:53:08

But the problem is all

1:53:11

of this is getting signal boosted by

1:53:14

an outrage machine that is preferentially

1:53:16

spreading misinformation. It has a business

1:53:18

model that is is

1:53:20

guaranteed, that is is preferentially

1:53:22

sharing misinformation. Okay. Actually, it's just on a small

1:53:24

tangent. Yeah. How do you defend

1:53:27

yourself against the claim that you're a pedophile

1:53:29

cannibal? It's

1:53:31

difficult to here's the case I would make

1:53:33

because I don't think you can use

1:53:35

reason. I think you have to use empathy.

1:53:38

You have to understand. But what but what like,

1:53:40

part of it, I mean, I find it very difficult

1:53:42

to believe that anyone believes these things.

1:53:44

I mean, I think that there's And there's I'm sure

1:53:47

there's some number of people who

1:53:49

are just pretending to believe these things

1:53:51

because it's just again, it's this

1:53:53

is sort of, like, the 4chan-ification of everything.

1:53:56

It's just it's just a goof. It's just Pepe

1:53:58

the frog. Right? Like, none of this is what it

1:54:00

seems. They're not signaling an

1:54:03

alliance with white supremacy or neo

1:54:05

nazism. But they're not not doing it,

1:54:07

like, they just don't fucking care. It's just cynicism

1:54:10

overflowing its banks. Right?

1:54:12

It's just fun to to to wind up

1:54:15

the normies. Right? Like, look at all the

1:54:17

normies. You don't understand that a green frog is just

1:54:19

a green frog, even when it isn't just a green

1:54:21

frog. Right? Like, they're just it's just

1:54:23

gumming up everyone's cognitive

1:54:25

bandwidth with bullshit. Right? I

1:54:28

get that that's fun if you're a teenager and you

1:54:30

just wanna vandalize our our

1:54:32

new public square. But

1:54:35

at a certain point, we have

1:54:37

to recognize that real questions

1:54:40

of human welfare are in play. Right?

1:54:42

There's like, there really is this:

1:54:44

a war is getting fought or not fought. And

1:54:46

there's a pandemic raging and there's

1:54:49

medicine to take or not take. But I mean

1:54:51

to come back to this issue of COVID, I

1:54:54

don't think my I don't think I got

1:54:57

so out of balance around COVID. I think

1:54:59

people are quite confused about what

1:55:01

I was concerned about. I mean,

1:55:03

like I there was a yes. There was a period

1:55:06

where I was crazy because

1:55:08

anyone who was taking it seriously was crazy because

1:55:10

they had no idea what was going on. And so it's like,

1:55:12

yes, I was wiping down packages with

1:55:14

with alcohol wipes. Right?

1:55:17

Because people thought

1:55:19

it was transmissible by

1:55:21

a touch. Right? So that when

1:55:23

we realized that was no longer the case, I stopped

1:55:25

doing that. But so there there again,

1:55:27

it was it was a moving target and

1:55:30

A lot of things we did in hindsight around

1:55:32

masking and school closures

1:55:36

look fairly dysfunctional.

1:55:38

Right? But it's necessary I think the criticism

1:55:40

that people would would

1:55:43

say about your talking

1:55:45

about COVID, and maybe you can correct me. But

1:55:48

you were skeptic or you were

1:55:51

against skepticism of

1:55:53

the safety and efficacy of the vaccine.

1:55:56

So people who get

1:55:58

nervous about the vaccine,

1:56:01

but don't fall into the usual

1:56:04

anti-vax camp, which I think there

1:56:06

was a significant yeah. Yeah. There were enough

1:56:08

number -- Right. -- they're asking they're getting

1:56:10

nervous. I mean, especially

1:56:13

after the wars in Afghanistan and

1:56:15

Iraq, I too was nervous

1:56:17

about anything where a lot of

1:56:19

money could be made. And

1:56:22

you you start you just see how

1:56:24

the people who are greedy, who come they come to

1:56:26

the surface all of a sudden. And then a lot

1:56:28

of them that run institutions are actually

1:56:30

really good human beings. I know a lot of them,

1:56:32

but it's hard to know how

1:56:35

those two combine together when there's hundreds

1:56:37

of billions, trillions of dollars to be

1:56:39

made. And so that skepticism

1:56:41

I guess, you know, the

1:56:43

the sense was that you weren't open enough to the

1:56:45

skepticism. I understand that people have that sense.

1:56:48

I'll tell you how I thought about it and

1:56:50

think about it. One, again, it was

1:56:52

a moving target. So there was a point in

1:56:54

the timeline where it

1:56:56

was totally rational to expect

1:56:59

that the vaccines

1:57:03

were were both working,

1:57:05

but but both they were they

1:57:08

were reasonably safe and

1:57:10

that and that COVID was reasonably

1:57:13

dangerous. And that the trade off for basically

1:57:15

everyone was, it was rational to get vaccinated.

1:57:17

Given how many given the level of testing

1:57:20

and how many people have been vaccinated before

1:57:21

you, given what we were seeing with COVID,

1:57:24

right? That

1:57:26

that was a forced choice. You're thinking you're eventually gonna

1:57:28

get COVID. And the question is, do you wanna be vaccinated

1:57:30

when you do? Right? There

1:57:32

was a period with that forced choice where it

1:57:34

was just obviously reasonable to

1:57:37

get vaccinated especially

1:57:39

because there was every reason

1:57:41

to expect that while it wasn't a perfectly

1:57:44

sterilizing vaccine, it

1:57:46

was going to knock down transmission a

1:57:48

lot and that matters. And so it wasn't just

1:57:51

a personal choice. You were actually

1:57:54

being a good citizen when you decided

1:57:56

to run whatever risk you you were gonna

1:57:58

run to get vaccinated because

1:58:01

there are people in our society who can't actually

1:58:03

can't get vaccinated. I know people who can't take

1:58:05

any vaccines. They're so allergic to them,

1:58:07

I mean, they they in their own

1:58:09

person seem to justify

1:58:12

all of the fears of the anti-vax cult. I

1:58:14

mean, it's like they're the kind of person who Robert

1:58:16

Kennedy Junior can point to and say, see, vaccines

1:58:18

will, well, kill you. Right?

1:58:21

Because because of the experience that

1:58:23

and we're still I know people who have

1:58:25

kids who fit that description. Right? So

1:58:28

we should all feel a civic responsibility to

1:58:32

be vaccinated against egregiously

1:58:35

awful and transmissible diseases

1:58:38

for which we have relatively safe

1:58:40

vaccines to keep those

1:58:42

sorts of people safe. And there was a period of time when

1:58:44

it was thought that the vaccine could stop transmission.

1:58:46

Yes. And so, again, all of this has

1:58:49

begun to shift. I

1:58:51

don't think it has shifted as much as

1:58:53

Bret Weinstein thinks it has shifted. But,

1:58:56

yes, there are safety

1:58:58

concerns around the mRNA vaccines, especially

1:59:02

for young men, right? As far

1:59:04

as I know, that's the

1:59:06

purview of the of

1:59:08

actual heightened concern. But

1:59:12

also, there's

1:59:14

now a lot of natural immunity out there, a

1:59:16

lot of basically, everyone who was gonna

1:59:18

get vaccinated, has gotten vaccinated. The

1:59:21

virus has evolved to the point in

1:59:23

in in this context where it

1:59:26

seems less dangerous.

1:59:29

You know, again, I'm going

1:59:32

more on the seemings than

1:59:34

on research that I've done

1:59:36

at this point, but I'm certainly less worried about

1:59:38

getting COVID. I've had it once I've been vaccinated.

1:59:40

I feel like it's like, so you ask

1:59:42

me now, how do I feel about getting the next

1:59:45

booster? I don't know

1:59:47

that I'm going to get the next booster. Right?

1:59:49

So I was somebody who was waiting

1:59:51

in line at four in the

1:59:53

morning, you know, hoping to get

1:59:55

get a some overflow vaccine when

1:59:57

it was first available. And I that

2:00:00

was at that point, given what we knew

2:00:03

or given what I thought I knew based

2:00:05

on the best sources I could consult and based

2:00:07

on based

2:00:09

on anecdotes that were too vivid to ignore,

2:00:12

you know, both data and and

2:00:14

personal experience, It

2:00:17

was totally rational for me to want

2:00:19

to get that vaccine as soon as I could.

2:00:22

And now I think it's totally rational

2:00:24

for me to do a different

2:00:26

kind of cost benefit analysis and wonder,

2:00:29

listen, do I really need to get

2:00:31

a booster? Right? You know, like, how

2:00:33

many of the how many of these boosters am I gonna get for

2:00:35

the rest of my life? Really? And

2:00:37

how safe is the mRNA

2:00:40

vaccine for a man of my age.

2:00:42

Right? And do I need to be worried about myocarditis

2:00:45

you know, all of that is completely

2:00:47

rational to talk about now. My

2:00:49

concern is that at

2:00:52

every point along the way, I

2:00:55

was the wrong person, and

2:00:57

and Bret Weinstein was the wrong person,

2:00:59

and there's many other people I could add to this

2:01:01

list, to have strong opinions about

2:01:04

any of this stuff. I just disagree with

2:01:06

that. I I think, yes,

2:01:08

in theory, I agree one hundred percent.

2:01:11

But I feel like experts failed

2:01:13

at communicating. Not at doing. They

2:01:15

did. I I and I just feel like

2:01:17

you and Bret Weinstein actually

2:01:20

have the tools with the Internet given

2:01:22

the engine you have in your brain

2:01:24

of thinking for months at a time

2:01:27

deeply about the problems that

2:01:29

face our world that you actually

2:01:31

have the tools to do pretty good thinking

2:01:33

here. Here's the

2:01:34

problem I have with that: there would

2:01:36

be deference to experts and pseudo

2:01:38

experts behind all of that. Well, with the papers

2:01:40

you would stand on the shoulders of giants. But you can

2:01:42

surf those shoulders better than the giants

2:01:44

themselves. I knew we were gonna disagree about

2:01:46

that. Like, I saw his podcast

2:01:48

where he brought on these experts who

2:01:51

had many of them had the right

2:01:53

credentials. But for

2:01:55

a variety of reasons, they didn't

2:01:57

pass the smell test for me, maybe

2:01:59

one larger problem and this goes back to the

2:02:01

problem of how we rely

2:02:04

on authority in science is that you

2:02:06

can always find a PhD or

2:02:08

an MD to to champion

2:02:10

any crackpot idea. Right? You could you could I

2:02:12

mean, it is amazing, but you could find

2:02:15

PhDs and MDs who would sit

2:02:17

up there in front of Congress and

2:02:19

say that they thought smoking was not

2:02:21

addictive, you know, or that it was not harmful

2:02:23

to there was no direct link

2:02:25

between smoking and lung cancer, you

2:02:27

could always find those people. And you

2:02:29

can – but

2:02:31

some of the people Bret found were people

2:02:33

who had obvious tells to my point

2:02:35

of view, to my eye. I mean,

2:02:38

and I saw some of the same people on

2:02:40

Rogan's podcast. Right? And

2:02:45

And it's hard because if a person

2:02:47

does have the right credentials and

2:02:49

they're not and they're not saying something

2:02:52

obviously mistaken. And

2:02:54

we're talking about something where it's

2:02:57

there are genuine unknowns. Right? Like, how

2:03:00

much do we know about the safety of these vaccines,

2:03:02

right? It's at that point,

2:03:04

not a whole hell of a lot. I mean, we have no long

2:03:06

term data on mRNA vaccines.

2:03:09

But to confidently say that

2:03:11

millions of people are going to die because of these

2:03:13

vaccines, and to confidently say

2:03:15

that Ivermectin is a panacea. Right?

2:03:18

Ivermectin is the thing that prevents COVID.

2:03:20

Right? Mhmm. There was no good reason

2:03:22

to say either of those things at that moment.

2:03:24

And that's and so, given

2:03:27

that that's where Bret was, I

2:03:29

felt like there was there was just no there was nothing

2:03:31

to debate. We're we're both the wrong people to

2:03:33

be getting into the weeds on this. We're

2:03:35

both gonna defer to our chosen

2:03:38

experts. His experts look

2:03:40

like crackpots to

2:03:41

me. And or

2:03:44

at least the ones who are most vociferous on

2:03:46

those most on on those edgiest points that

2:03:48

seem most

2:03:49

and your experts seem like, what is the term?

2:03:51

Stereotypical? I forget the term. Well, it's

2:03:54

no. But it's like it's like with, you know,

2:03:56

climate science. I mean, this this

2:03:58

old it's received as

2:04:00

a canard in half of our society

2:04:02

now, but the claim that ninety seven

2:04:04

percent of climate scientists agree that human

2:04:06

caused climate change is a thing.

2:04:09

Right? So do you go with the ninety

2:04:11

seven percent most of the time? Or do you

2:04:13

go with the three percent most of the time?

2:04:15

It's obvious you go with the ninety seven

2:04:18

percent most of the time for anything that matters.

2:04:20

It's not to say that the three percent are always

2:04:22

wrong. Again, that there

2:04:24

are things that get overturned. And yes,

2:04:27

as you say, I've spent much more time

2:04:29

worrying about this on my podcast than I've spent worrying

2:04:31

about COVID. Our institutions have

2:04:35

lost trust for

2:04:37

good reason. Right? And and

2:04:40

it's it It's an open

2:04:42

question whether we

2:04:45

can actually get things done with

2:04:47

this level of transparency and

2:04:50

and pseudo transparency given our

2:04:52

information ecosystems. Like,

2:04:54

can we fight a war? Really

2:04:56

fight a war that we may have to fight? Like, the next

2:04:59

Nazis? Can we fight that war

2:05:01

when everyone with an iPhone is

2:05:03

showing just how awful it is that

2:05:06

little girls get blown up when we drop our bombs?

2:05:09

Right? Like, could we, could we, as

2:05:11

a society, do what we might

2:05:13

have to do to

2:05:15

actually get necessary things done?

2:05:17

When we're living in this this

2:05:19

panopticon of just, you know,

2:05:21

everyone's a journalist. Right? Everyone's

2:05:24

a scientist, everyone's an expert, everyone's

2:05:26

got direct

2:05:28

contact with the facts or or some

2:05:30

or a semblance of the facts. I don't know.

2:05:32

I think yes. And I think voices like

2:05:34

yours are exceptionally important, and I think there's

2:05:37

certain signals you send in

2:05:39

your ability to steel-man the other side

2:05:41

in your empathy, essentially. So

2:05:45

that's the signal. That's the mechanism by

2:05:48

which you resist

2:05:51

the dogmatism of this

2:05:53

binary thinking. And then if

2:05:56

you become a trusted person that's

2:05:58

able to consider the other side, then

2:06:00

people would listen to you as

2:06:03

as the aggregator, the communicator of expertise.

2:06:05

Because the virologists haven't been able

2:06:08

to be good communicators. I still to this

2:06:10

day don't really

2:06:12

know what is

2:06:14

the what am I supposed to think

2:06:16

about the safety and

2:06:18

efficacy of the vaccines today.

2:06:20

As it stands today -- Right. -- what are we supposed to think?

2:06:22

What are we supposed to think about testing?

2:06:24

What are we supposed to think about the effectiveness of

2:06:26

masks or lockdowns? Where's

2:06:29

the great communicators on this

2:06:31

topic that consider all the

2:06:33

other conspiracy theories, all the other

2:06:36

all the communication that's out there, and

2:06:38

actually aggregating it together and being

2:06:41

able to say, this is actually what's most

2:06:43

likely the truth. And

2:06:45

also some of that has to do with humility,

2:06:48

epistemic humility, knowing that you can't

2:06:50

really know for sure, just

2:06:52

like with depression, you can't really know for sure.

2:06:55

You know, I'm not seeing those

2:06:57

communications

2:06:58

being effectively done even still today.

2:07:00

Well, I think the jury is still out

2:07:02

on some of it. And again, it's

2:07:05

a moving target. And some of it

2:07:07

I mean, it's complicated. Some of it's a self fulfilling

2:07:11

dynamic where -- Sure. -- like so,

2:07:13

like, lockdowns in theory,

2:07:16

lockdowns a lock down would work

2:07:19

if we could only do it, but we can't really

2:07:21

do it. And there's a lot of people who won't do it because

2:07:23

they're convinced that it's this is the totalitarian boot

2:07:26

you know, on the neck of

2:07:30

the good people who are

2:07:34

always having their interests traduced

2:07:36

by the elites. Right? So, like, if

2:07:39

you have enough people who think a lockdown

2:07:41

for any reason in the face of any

2:07:43

conceivable illness, right,

2:07:46

is just code for the

2:07:48

new world order coming to fuck you over

2:07:50

and take your guns. Right? Okay.

2:07:52

You have a society that is now immune to

2:07:55

reason. Right? Because there absolutely are

2:07:57

certain pathogens that

2:08:00

we should lock down for next time. Right?

2:08:02

And and and it was

2:08:04

completely rational in the beginning

2:08:07

of this thing to lock down

2:08:09

or, to attempt to lock down,

2:08:11

we never really locked down. To

2:08:13

attempt some semblance of lock down just

2:08:16

to quote bend the curve to spare our

2:08:18

healthcare system, given

2:08:20

what we were seeing happening in Italy. Right? Like

2:08:22

that moment was it was not hard to navigate.

2:08:25

At least in in my view, it was obvious

2:08:27

at the time. In retrospect, my

2:08:29

views on that haven't changed except

2:08:32

for the fact that I recognize maybe

2:08:35

it's it's just impossible,

2:08:37

given the nature of people's response

2:08:40

to that kind of demand.

2:08:42

Right? We live in a society that's just not gonna

2:08:44

lock down unless the pandemic is

2:08:46

much more deadly. Right. So that's a

2:08:48

point I made, which, you know, was maliciously

2:08:50

clipped out from some other podcast where someone's trying

2:08:53

to make it look like I wanna see

2:08:55

children die. Like, it's a pity more children

2:08:57

didn't die from COVID. Right? This

2:09:00

is actually the same person who who and

2:09:03

that's the other thing that got so

2:09:06

poisoned here. It's like that person, this

2:09:08

this psychopath or effective psychopath who's

2:09:10

creating these clips of me on podcast. The

2:09:13

second clip of me seeming

2:09:16

to say that I wish more children died during

2:09:18

COVID, which but it was it was so

2:09:20

it was so it was so clear in context

2:09:22

what I was saying that even the clip betrayed the

2:09:24

context, so it didn't actually work. This

2:09:27

psycho. And again, I don't know whether he actually

2:09:29

is a psychopath, but he's behaving like one

2:09:31

because of the incentives of Twitter. This

2:09:33

is somebody who Bret signal boosted

2:09:36

as a as a very reliable

2:09:39

source of information. Right? He

2:09:41

he kept retweeting this guy at

2:09:43

me, against me. Right?

2:09:45

And this guy at one glance,

2:09:48

I knew how unreliable this guy was. Right?

2:09:50

But I think I I'm

2:09:53

not at all settled. One thing I think I did wrong.

2:09:56

One thing that I do regret one

2:09:58

thing I have not sorted out for myself

2:10:01

is how to navigate

2:10:03

the professional

2:10:06

and personal pressure that

2:10:09

gets applied

2:10:13

at this moment where you have a friend

2:10:15

or an acquaintance or someone you know who's

2:10:18

behaving badly in public. Or

2:10:21

behaving badly, behaving in a way

2:10:23

that you think is bad in

2:10:25

public. And they

2:10:27

have a public platform where they're influencing

2:10:29

a lot of people. And you have your own public

2:10:31

platform where you're constantly getting

2:10:34

asked to comment on what this

2:10:36

this friend or or acquaintance or colleague

2:10:38

is doing.

2:10:41

I haven't known what I think

2:10:43

is ethically right about

2:10:45

the choices that seem forced

2:10:47

on us at moments like this. So, like, I've I've criticized

2:10:50

you in public about your

2:10:52

your interview with Kanye. Mhmm. Now

2:10:54

in the case in in that case, I reached

2:10:56

out to you in private first and told you exactly

2:10:58

what I thought. And then when I was gonna

2:11:00

get asked in public or when I was touching that topic

2:11:03

on my podcast, I

2:11:05

more or less said the same thing that I said to you in private.

2:11:07

Right? Now that was how I navigated that moment.

2:11:10

I did the same thing with with

2:11:13

Elon, at least on

2:11:15

at the beginning. You

2:11:19

know, this we have

2:11:21

we have maintained good vibes, which

2:11:24

is which is, I know, not what you

2:11:25

want. But I don't think I I disagree

2:11:28

with you because good vibes in the moment.

2:11:30

There's a deep core of good vibes that

2:11:32

persist through time between you and

2:11:34

Elon, and I would argue probably between

2:11:37

some of the other folks you mentioned. I think

2:11:39

with Bret, I failed to

2:11:41

reach out in private

2:11:43

to the degree that I should have. And we

2:11:45

we never really had a a we

2:11:48

we had tried to set up a conversation in

2:11:50

private that that never happened, but

2:11:52

there was some communication. But it

2:11:55

would have been much better

2:11:58

for me to have made more of an effort in private

2:12:00

than I did before it spilled

2:12:02

out into

2:12:03

public. And I would say that's true with other people

2:12:05

as well. What kind of

2:12:07

interaction in private do you think you should have

2:12:09

with Bret? Because my case would be

2:12:11

beforehand and now still the

2:12:14

case. I would like, and this is part of the criticism

2:12:17

you sent my way, maybe

2:12:20

it's useful to go in that direction.

2:12:22

Actually, let's go in that direction because

2:12:24

I think I disagree with your

2:12:26

criticism as you stated publicly put the Sorry

2:12:28

about that. Your of your five fifteen AM.

2:12:30

The thing you criticized me for is actually the right

2:12:33

thing to do with Bret. Okay. You

2:12:35

You, Lex, could have spoken with Kanye

2:12:37

in such a way as to have produced

2:12:39

a useful

2:12:40

document. He didn't do that because

2:12:42

he has a fairly naive philosophy about the

2:12:44

power of love. Mhmm. Well,

2:12:48

see if you can maintain that philosophy in the present.

2:12:50

Let's go.

2:12:51

No. It's beautiful. He

2:12:54

seemed to think that if he just got

2:12:56

through to the end of

2:12:58

the conversation, where the two of them

2:13:00

are still feeling good about one another

2:13:02

and they can hug it out, that would

2:13:04

be by definition a success. So

2:13:08

Let me make the case for this power

2:13:10

of love philosophy. Right? And first of all,

2:13:12

I I love you, Sam. You're still an inspiration

2:13:15

and somebody I deeply admire. Okay.

2:13:17

Back at you. To

2:13:20

me, in the case of Kanye, it's

2:13:24

not only that you get through the conversation

2:13:26

and have hugs. It's

2:13:29

that the display that you're willing

2:13:31

to do that has power. So

2:13:33

even if it doesn't end in hugging, the

2:13:36

actual the turning the

2:13:38

other cheek, the act of turning the other cheek

2:13:41

it communicates both to Kanye

2:13:43

later and to the rest of the world

2:13:46

that we should have

2:13:49

empathy and compassion towards each other. There is

2:13:51

power to that. I know that maybe

2:13:53

that is naive, but I believe in the

2:13:56

power of that. So it's not that I'm trying to convince

2:13:58

Kanye that some of his ideas are wrong,

2:14:00

but I'm trying to illustrate that

2:14:03

just the act of listening and truly trying

2:14:05

to understand the human being. That

2:14:10

opens people's minds to actually

2:14:12

questioning their own beliefs more. It takes

2:14:14

them out of the dogmatism, deescalates the

2:14:17

kind of dogmatism

2:14:19

that I've been seeing. So in that sense,

2:14:21

I would say the power of love is

2:14:24

the is the philosophy you might apply to

2:14:26

Bret because the right conversation

2:14:29

you have in private is not about,

2:14:31

hey, listen. You're you know, the

2:14:34

the experts you're talking to they

2:14:36

seem credentialed, but they're not actually

2:14:39

as credentialed as they're illustrating. They're

2:14:41

not grounding their findings in actual meta

2:14:43

analysis and papers and so on. Like,

2:14:45

making a strong case. Like, what are you doing? This

2:14:47

is gonna get a lot of people in trouble. But instead

2:14:49

just saying, like, being a friend

2:14:51

in the simplest of ways, being

2:14:54

like, respectful,

2:14:57

sending love their way, and

2:14:59

just having conversation outside of all

2:15:01

of this. But, basically,

2:15:04

showing that, like, removing

2:15:09

the emotional attachment to this

2:15:11

debate even though you are very emotionally

2:15:13

attached because in the case of COVID, there

2:15:16

is a very large number of lives at

2:15:18

stake. But removing

2:15:20

all of that and remembering that you have

2:15:23

a friendship.

2:15:25

Yeah. Well, so I think these are highly non

2:15:27

analogous cases. Right? So your your conversation

2:15:29

with Kanye misfired

2:15:32

from my point of view for a very different reason.

2:15:34

It was it was it

2:15:37

has to do with Kanye. I mean, so Kanye I

2:15:40

don't I don't know. I have never met Kanye. So,

2:15:42

obviously, I don't know him. But

2:15:46

I think he's either obviously

2:15:49

in the midst of a mental health crisis

2:15:52

or he's a colossal asshole. Or

2:15:55

both. I mean, the fact is those aren't mutually exclusive.

2:15:57

So one of three possibilities: he's either

2:15:59

mentally

2:16:00

ill, he's an asshole, or he's

2:16:02

a he's mentally ill and an asshole. I think

2:16:04

all three of those possibilities are possible

2:16:06

for

2:16:07

the both of us as well. No comment. I would argue

2:16:09

none of those are are are -- Right. -- likely

2:16:11

for either of us, but possible.

2:16:14

Not not to say we don't have our moments, but

2:16:15

So so the

2:16:18

reason not to talk to Kanye so, you know, I

2:16:20

think you should have had the conversation

2:16:22

you had with him in private. That's great.

2:16:24

And there's no I have got no criticism

2:16:27

of what you said had it been in private.

2:16:29

In public, I just thought you're

2:16:32

not doing him a favor. If if he's

2:16:34

mentally ill. Right? He's

2:16:37

in the middle of a

2:16:39

manic episode or, you know, I'm

2:16:41

not a clinician, but I've, you know, heard it said of him

2:16:43

that he is bipolar. You're

2:16:47

not doing him a favor, sticking a mic

2:16:49

in front of him and letting him go off on

2:16:51

the Jews or anything else. Right? We

2:16:55

know what he thought about the Jews. We

2:16:57

know that there's not much illumination

2:16:59

gonna it's gonna come from him on that

2:17:01

topic. And if it

2:17:03

is a symptom of his mental illness

2:17:05

that he thinks these things, well then you're

2:17:08

not doing him a favor making that even

2:17:10

more public. If

2:17:12

he's just an asshole and he's just

2:17:14

an antisemite and an ordinary, you

2:17:16

know, garden variety antisemite, well

2:17:18

then there's also not

2:17:20

much to say unless you're really gonna

2:17:22

dig in and kick the shit

2:17:24

out of him in public. And I'm

2:17:26

I'm saying you can do that with love. The

2:17:29

other thing here is that I

2:17:31

don't agree that compassion and love

2:17:33

always have this

2:17:36

patient embracing

2:17:39

acquiescent face.

2:17:42

Right? They they don't always feel

2:17:44

good to the recipient. Right? There is a

2:17:46

sort of wisdom that you can wield

2:17:49

compassionately in moments like

2:17:51

that where someone's full of shit and you

2:17:53

just make it absolutely clear to them and to

2:17:55

your audience. That they're full of shit. And

2:17:57

it's no there's no hatred being communicated. In

2:18:00

fact, you could just it's like, I'm gonna do everyone

2:18:02

a favor right now and you know,

2:18:04

just take your foot out of your mouth

2:18:07

and and and

2:18:10

the truth is, you know, I wouldn't I just wouldn't have aired

2:18:12

the conversation. I just don't think it was a document that

2:18:14

had to get out there. Right? I get that

2:18:17

many people this is not a signal you're

2:18:19

likely to get from your audience. Right? I get that

2:18:21

many people in your audience thought my god.

2:18:23

That's awesome. You're you're talking to Kanye and

2:18:25

you're doing it in your style where it's just

2:18:27

love and you're not treating him like a pariah.

2:18:30

And, you know, you you're you're holding this tension

2:18:33

between he's this creative genius whose work

2:18:35

we love, and yet he's having this moment that's

2:18:37

so troubling. And what a tight

2:18:39

rope walk and I get that maybe

2:18:41

ninety percent of your audience saw it that way.

2:18:44

They're still wrong. And I and

2:18:46

I I still think that was, on balance,

2:18:48

not a good thing to put out into the world. You

2:18:50

don't think it opens up the mind and heart of people

2:18:52

that listen to that? Just having it, seeing

2:18:54

a picture. Because it's

2:18:56

it's opening up in the wrong direction

2:18:59

where just gale-force nonsense

2:19:01

is coming in. Right? I I think we should

2:19:04

have an open mind and an open heart, but

2:19:06

there's some clear things here

2:19:08

that we have to keep in view.

2:19:10

One is the mental illness component is

2:19:12

its own thing. Yeah. I don't pretend to understand

2:19:15

what's going on with

2:19:15

him. So but insofar as that's the reason

2:19:18

he's saying what he's saying,

2:19:20

I would not put this guy on camera

2:19:22

and let him. You know what, sorry. On

2:19:24

that point real quick, I had a bunch of conversations

2:19:26

with him beforehand, and I didn't get a sense of

2:19:28

mental

2:19:29

illness. That's why I wanted to

2:19:31

sit down. Okay. And I

2:19:32

didn't get it. I mean, mental illness is such

2:19:34

a

2:19:36

But when he shows up and goes on

2:19:38

Alex Jones's podcast, I think this is

2:19:40

either that's

2:19:42

more, you know, genius performance

2:19:44

in his world or it's he he's unraveling

2:19:47

further. I wouldn't put that under mental illness.

2:19:49

I we have to I think

2:19:51

there's another conversation to be had about

2:19:54

how we treat artists.

2:19:57

Right. Because they're they're

2:19:59

weirdos. They're very I

2:20:01

mean, we You know, taking

2:20:04

taking words from Kanye as

2:20:06

if he he's like Christopher Hitchens or something

2:20:09

like that. Like, very eloquent, researched,

2:20:13

you know, many books on

2:20:16

history, on politics, on geopolitics, on

2:20:19

psychology. Kanye didn't do any of that.

2:20:21

He's an artist just spouting off. And so

2:20:23

there's a different style of conversation and

2:20:26

a different way to treat

2:20:28

the words that are coming out. Let's let's leave the mental

2:20:31

illness side. So if if if we're gonna say that there's no reason

2:20:33

to think he's mentally ill, and this is just him being

2:20:35

creative and brilliant and opinionated,

2:20:38

Well then, that falls into the asshole bucket

2:20:40

for me. It's like, then then he's someone and,

2:20:42

honestly, the most offensive thing about him

2:20:45

in that interview, from my point of view, is not the

2:20:47

antisemitism, which you know, we can talk

2:20:49

about, because I think there are problems

2:20:51

just letting him

2:20:54

spread those memes as well. But The

2:20:56

most offensive thing is just how

2:20:59

delusionally egocentric he

2:21:01

is or was coming off in that

2:21:03

interview and in and in others. Like, he

2:21:05

he has an estimation of himself as

2:21:08

this omnibus genius to

2:21:10

to to rival not only to rival Shakespeare

2:21:13

to exceed Shakespeare. Right? I mean, he's like,

2:21:15

he's he's the greatest mind that has ever walked

2:21:17

among us. And he's more or less explicit

2:21:20

on that point. And yet he manages to

2:21:22

talk for hours without saying anything actually

2:21:24

interesting or insightful or or

2:21:26

factually illuminating. Right? So

2:21:28

it's complete delusion of

2:21:31

a very Trumpian sort. You know, it's like it's

2:21:33

like, you know, when Trump says he's a genius who

2:21:35

understands everything, but nobody takes him

2:21:37

seriously. And one wonders whether

2:21:39

Trump takes himself seriously. Kanye

2:21:41

seems to believe he he seems

2:21:43

to believe his own press. He actually thinks he's

2:21:46

he's you know, just

2:21:49

a colossus. And he

2:21:52

may be a great musician. You know, I'm not, you

2:21:54

know, I've it's certainly

2:21:56

not my wheelhouse to compare him to any other musicians.

2:22:01

But one thing is plainly obvious

2:22:04

from your conversation is he's not

2:22:06

who he thinks he is intellectually or

2:22:08

ethically or in any other relevant

2:22:11

way. And so when you couple

2:22:13

that to the antisemitism

2:22:15

he was spreading, which was genuinely

2:22:18

noxious and ill considered and

2:22:22

has potential knock on effects

2:22:24

in the black community. I mean, there's

2:22:26

an ambient level of antisemitism in the black

2:22:29

community that is worth worrying about and

2:22:31

talking about anyway. There's a bunch of

2:22:33

guys, you know, playing the knockout game in Brooklyn,

2:22:35

just punching Orthodox Jews in the face. And

2:22:38

I think letting Kanye air his

2:22:40

antisemitism that publicly

2:22:43

only raises the likelihood

2:22:45

of that rather than diminishes it. I don't know. So

2:22:47

Let me say just a couple of things. So one,

2:22:50

my belief at the time was

2:22:52

it doesn't. It decreases it. Showing empathy

2:22:54

while pushing back decreases the likelihood

2:22:56

of that. It might

2:22:58

on the surface look like it's

2:23:00

increasing it, but that's simply because the

2:23:03

antisemitism or the hatred in general is

2:23:05

brought to the surface, and then people

2:23:07

talk talk about it. But I

2:23:09

should also say that you're one of the

2:23:11

only people that wrote to me privately criticizing

2:23:13

me. Mhmm. And,

2:23:16

like, out of the people I respect and admire,

2:23:18

and that was really valuable that, like, I had to

2:23:21

pause because I had to think through it for a while.

2:23:23

Sure. I'm still it still haunts

2:23:25

me because the other kind of

2:23:27

criticism I got a lot of people

2:23:30

basically said things

2:23:33

towards me

2:23:36

based on who I am that they hate me. Just just

2:23:38

you mean antisemitic things or that you're antisemitic?

2:23:40

Because I just hate the word

2:23:41

antisemitic. It's a it's like

2:23:43

racist.

2:23:44

But here here's the reality. So

2:23:47

I'm someone so I'm Jewish,

2:23:49

you know, although obviously not religious.

2:23:53

I have never taken, you know, I've

2:23:55

I've been a student of the Holocaust. Obviously, I

2:23:57

I know a lot about that and and there's reason

2:24:00

to be a student of the

2:24:02

Holocaust. But in

2:24:05

my lifetime and in my experience, I

2:24:07

have never taken antisemitism very

2:24:10

seriously. I'm not worried about it.

2:24:12

I have not made a thing

2:24:14

of it. I've done exactly one podcast on

2:24:16

it. I had Bari Weiss on my podcast

2:24:19

when her book came out. But

2:24:24

it really is a thing and

2:24:27

it's it's

2:24:29

something we have to keep an eye on societally

2:24:33

because it it it's a it's a unique

2:24:35

kind of hatred. Right? It's a it's

2:24:38

it's unique in that it

2:24:40

seems it's it's knit together with it's not just

2:24:42

ordinary

2:24:42

racism. It's knit together with lots of

2:24:44

conspiracy theories that never seem to die out.

2:24:48

It's it can, by turns,

2:24:51

equally animate the left and the right

2:24:53

politically. That's what's so perverse about antisemitism.

2:24:55

It's like, look, in the American context. With

2:24:57

the far right, you know, with white supremacists, Jews

2:25:01

aren't considered white, so they they hate us in

2:25:03

the same spirit in which they hate black

2:25:05

people or brown people or anyone who's not

2:25:07

white. But on the left,

2:25:09

Jews are considered extra white.

2:25:11

I mean, we're we're the beneficiaries

2:25:14

of white privilege. Right? And

2:25:16

in the black community, that is often

2:25:18

the case. Right? We're we're a minority that

2:25:20

has thrived. And so and it

2:25:22

seems to stand as a counterpoint

2:25:25

to all of the problems of of

2:25:27

that other minorities suffer, in particular,

2:25:30

you know, African Americans, in the American context.

2:25:33

And, yeah, Asians are now getting a little bit of

2:25:36

this, you know, like the the the model minority

2:25:39

issue. But Jews have had this

2:25:41

going on for centuries and

2:25:43

and millennia, and it never

2:25:45

seems to go away. And it's this again, this

2:25:47

is something that I've never focused on.

2:25:50

But this has been

2:25:52

at a slow boil for

2:25:55

as long as we've been alive and there's

2:25:57

no guarantee it can't suddenly become

2:26:00

much much uglier than we have

2:26:02

any reason to expect it

2:26:04

to become even in our society. And so

2:26:07

there's there's kind of a special concern at

2:26:10

moments like that where you have an immensely

2:26:12

influential person in a community

2:26:16

who already has a checkered history

2:26:18

with respect to their own beliefs about the

2:26:20

Jews and the conspiracies and all the rest.

2:26:24

And he's just messaging, you know,

2:26:26

not especially forcefully opposed

2:26:29

by you and anyone else who's who's given him

2:26:32

the microphone at that moment.

2:26:34

To the world. And that so that

2:26:36

that, you know, made

2:26:38

my Spidey sense tingle. Yeah. It's complicated.

2:26:41

It's the stakes are very high. And

2:26:43

I somebody who's been obviously

2:26:45

family and also reading a lot about World

2:26:47

War Two. Yeah. And just this whole period

2:26:49

is a very difficult conversation. But

2:26:51

I believe in the power,

2:26:54

especially given who I

2:26:56

am of not

2:26:58

always, but sometimes often

2:27:01

turning the other cheek. Oh,

2:27:03

yeah. And, again, things change.

2:27:06

When

2:27:08

when they're for public consumption, you know, when

2:27:10

you're so it's like, I mean, the

2:27:13

the cutoff for me that, you know, has just

2:27:15

the use case, I keep stumbling upon

2:27:17

is the the kinds of things that I will say

2:27:19

on a podcast like this or if I'm giving

2:27:21

a public lecture versus

2:27:23

the kinds of things I will say at dinner with

2:27:27

strangers or with friends. Like, if

2:27:29

you're in an elevator, like, if I'm in an elevator

2:27:31

with strangers and I hear someone

2:27:33

say something stupid, I don't feel

2:27:36

an intellectual responsibility to

2:27:38

turn around in the in the in

2:27:41

the confines of that space with

2:27:43

them. And say, listen, that thing you just said about

2:27:45

x, y, or z is completely wrong, and here's

2:27:47

why. Right? But if somebody

2:27:49

says it in front of me in some

2:27:51

public forum where I'm actually talking about

2:27:54

ideas, that's when, you

2:27:56

know, there's a different responsibility that

2:27:58

comes online. The question is, how you say

2:28:00

it? How you say it. Or even whether

2:28:02

you say anything in those. I mean,

2:28:04

there are moments to

2:28:06

privilege

2:28:07

civility or just to pick your battle. Sometimes

2:28:10

it's just not worth it to get into it with somebody

2:28:12

out out in in real

2:28:15

life. I just believe in the power

2:28:17

of empathy both in the in the

2:28:19

elevator and

2:28:21

when a bunch of people are listening

2:28:24

that -- Yeah. -- when they see you

2:28:26

willing to consider another

2:28:29

human being's perspective. It

2:28:32

just gives more power to your to

2:28:35

your words.

2:28:37

Well, yeah. But until it doesn't,

2:28:39

like, if you because you can you

2:28:41

can -- Right. -- you can extend

2:28:43

charity too far. Right? You could

2:28:45

like, it can be absolutely obvious what someone's

2:28:47

motives really are. Right. And they're

2:28:50

they're, you know, dissembling about that. Right?

2:28:52

And so then your taking at face

2:28:54

value, their representations begins

2:28:56

to look like you're just being duped and

2:28:58

you're not you're not actually doing the work

2:29:00

of of putting pressure on

2:29:02

a bad actor. You know, so it's it's and

2:29:04

again, the whole the mental illness component here

2:29:06

makes makes it very difficult to think about

2:29:09

what you should or shouldn't have

2:29:11

So I think the topic of platforming

2:29:13

is pretty interesting. Like, what's

2:29:15

your view on platforming

2:29:17

controversial people? Let's let's

2:29:19

start with the old: would

2:29:21

you interview Hitler on your

2:29:23

podcast? And how would you talk to

2:29:25

him? Oh, and follow-up

2:29:28

question. Mhmm. Would you interview him in nineteen

2:29:30

thirty five? Forty

2:29:34

one.

2:29:36

And then, like, forty five. Well,

2:29:38

I think we have an uncanny valley problem with

2:29:41

respect to this issue of

2:29:45

whether or not to speak to bad people. Right?

2:29:47

So if a person is sufficiently bad,

2:29:49

right? If they're all the way out of the valley,

2:29:52

then you can talk to them and it's just it's

2:29:54

totally unproblematic to talk

2:29:56

to them because you don't have to spend any time

2:29:59

signaling to your audience that you don't agree with them. And

2:30:01

if you're interviewing Hitler, you don't have to say, listen,

2:30:03

I just gotta say, before we start, I don't

2:30:05

agree with the whole, you know, Nazi thing.

2:30:07

And you know, I just think you're

2:30:10

killing, you know, killing mental patients

2:30:12

in vans and all that. That was

2:30:14

all bad. It's a bad look.

2:30:16

See, it

2:30:18

can go without saying that

2:30:20

you don't agree with this person, and you're not platforming

2:30:23

them to signal boost their their

2:30:26

views, you're just trying to

2:30:28

if they're sufficiently evil, you can go

2:30:30

into it very much as an anthropologist

2:30:33

would Just just

2:30:35

wanna understand the nature of evil.

2:30:38

Right? You just wanna understand this person. Like,

2:30:40

how is this person who

2:30:42

they are. Right? And

2:30:44

that strikes me as an intellectually

2:30:47

interesting and and

2:30:50

morally necessary thing to do.

2:30:52

Right? So, yes, you think you always

2:30:54

interview Hitler.

2:30:55

Wait. Wait. Wait. Wait. Wait. Wait. Wait. Wait. Wait. Wait.

2:30:57

Wait. Well, when he when you know once

2:30:59

he's Hitler, but when do you know

2:31:00

it? Once he's legitimately.

2:31:02

But when do you know it? Is the genocide

2:31:04

really happening? Yes. Like in forty

2:31:07

two fifty three. No. If if you're on the cusp of

2:31:09

it where it's just he's someone who's gaining

2:31:11

power and you don't wanna you don't wanna help facilitate

2:31:13

that, Then

2:31:16

there's a question of whether you can you can undermine

2:31:18

him in by while pushing back against

2:31:20

him in that interview. Right? So there are people I wouldn't

2:31:22

talk to just because I don't want to

2:31:25

give them oxygen, and I don't think that

2:31:27

in the context of my interviewing

2:31:29

them, I'm gonna be able to to

2:31:32

take the wind out of their sails at all. Right?

2:31:34

So if, for whatever reason — either

2:31:36

because of an asymmetric advantage

2:31:38

because I just know that they can do

2:31:41

something that they they they within

2:31:43

the span of an hour that I can't

2:31:46

that I can't correct for. Sure. Sure. You know, it's

2:31:48

like they can light many small fires, and

2:31:50

it just takes too much time to put

2:31:51

them out. That's more, like, on the topic of the vaccines, for example.

2:31:53

Having a debate on the efficacy of vaccines.

2:31:55

Yeah. Okay. It's not that I don't think sunlight

2:31:58

is usually the best disinfectant. I

2:32:00

think it is. You

2:32:01

know, even given these asymmetries. I

2:32:03

mean, it is true

2:32:06

that a person can always

2:32:08

make a mess faster than you can clean it up.

2:32:10

Right? But still, there are debates worth having

2:32:12

even given that limitation. And

2:32:15

there are the right people to have those specific debates.

2:32:17

And there's certain topics where, you

2:32:19

know, I'll debate someone just because

2:32:22

I'm the right person for the job and

2:32:24

it doesn't matter how messy they're gonna

2:32:26

be. It's just it's just worth it

2:32:28

because I I can make my points land

2:32:31

at least to to the right part

2:32:33

of the audience. So some of it is just your own

2:32:35

skill and confidence and also interest

2:32:37

in preparing

2:32:38

correctly? Well, yeah. Yeah. And the nature

2:32:40

of the subject matter and and

2:32:43

but, yeah, but there are other people who just by

2:32:45

default, I would say, there's no reason

2:32:47

to give this guy a platform. And

2:32:48

there are also people who are so confabulatory

2:32:51

that they're

2:32:53

making such a mess with every

2:32:55

sentence that

2:32:58

you insofar as you're even trying to

2:33:00

interact with what they're saying,

2:33:03

you're going you're by definition going to fail

2:33:06

and you're going to seem to fail to

2:33:09

a sufficiently large part of the

2:33:11

audience. Where it's gonna be a net negative

2:33:14

for for the for the cause of truth

2:33:16

no matter how good you are. So, like, for instance,

2:33:18

I think talking

2:33:21

to Alex Jones on any topic

2:33:23

for any reason is probably a bad idea

2:33:25

because I just think he's he's

2:33:28

just neurologically wired to

2:33:30

just — I mean, utter a string of

2:33:33

sentences, he'll get twenty sentences out,

2:33:35

each of which has to be each of which is,

2:33:37

you know, contains more lies than the last.

2:33:41

And there's

2:33:43

just there's not time enough in the

2:33:45

world to run down and certainly

2:33:47

not time enough in the span of a conversation. To

2:33:49

run down each of those leads to

2:33:52

to bedrock so as to falsify it. And

2:33:54

it'll just make shit up. It just and

2:33:57

or and or it makes shit up and then

2:33:59

then weaves it in with, you know, half truths

2:34:02

and micro truths

2:34:04

that may give some sense semblance of

2:34:07

credibility to somebody out there. I mean,

2:34:09

apparently millions of people out there. And

2:34:13

there's just no way to to untangle

2:34:15

that in real time with

2:34:16

him.

2:34:16

I have noticed that you have an allergic

2:34:19

reaction to untruth.

2:34:24

Yeah. Untruth. Confabulation. Yeah.

2:34:27

That if somebody says

2:34:29

something a little micro untruth,

2:34:32

it really stops your brain. Here,

2:34:34

I'm not talking about micro untruths. I'm just talking about

2:34:36

making up things out of whole cloth. So

2:34:39

it's like, if someone says — some people,

2:34:41

like, well, what about and then

2:34:43

the then the thing they put at

2:34:46

the end of that sentence is just a

2:34:48

set of claims, right,

2:34:50

that you can't possibly authenticate or

2:34:53

not in the span of that conversation. They

2:34:56

will, you know, whether it's UFOs or anything

2:34:58

else. Right? They

2:35:00

will seem to make you look like an ignoramus

2:35:03

when in fact everything

2:35:05

they're saying is specious. Right?

2:35:08

Whether they know it or not? I mean, there's some people who are

2:35:10

just crazy and there's some people who are

2:35:13

who are just bullshitting and they're not even tracking

2:35:15

whether it's true or it just feels good and there's some people

2:35:17

who are consciously lying about

2:35:19

things. But don't you think there's just

2:35:21

a kind of jazz masterpiece

2:35:23

of untruth that you should be able to just

2:35:26

wave off by saying, like,

2:35:29

well, none of that is backed up by any evidence.

2:35:31

It's almost like take it to the humor

2:35:33

place. Well, but the thing is — okay.

2:35:35

Just the place I'm familiar with

2:35:37

doing this and not doing this is is

2:35:40

on specific conspiracies

2:35:42

like nine eleven truth. Right?

2:35:44

Like the the nine eleven sorry. Because of

2:35:46

my because of what nine eleven

2:35:48

did to my intellectual

2:35:52

life. And it really just, you know, it

2:35:54

it sent me down a path for the better

2:35:56

part of a decade. Like, I became a critic

2:35:58

of religion Well,

2:36:00

I don't know if I was ever gonna be a critic of religion.

2:36:02

Right? Like, but that like, it happened to be

2:36:04

in my wheelhouse because I'd spent so much time

2:36:07

studying religion on

2:36:09

my own, and I was

2:36:12

also very interested in in the the underlying

2:36:14

spiritual concerns of every religion.

2:36:17

And so I was I was you

2:36:20

know, I devoted

2:36:23

more than a full decade of

2:36:25

my life to just you know, what is

2:36:27

what is real here? What is possible? What is

2:36:29

what is the nature of subjective reality?

2:36:32

And how does it relate to reality at large? And is

2:36:34

there anything to You know, who

2:36:36

just who was someone like Jesus or Buddha? And

2:36:38

are these are these people frauds? Or are they are they

2:36:40

are these just these just myths?

2:36:42

Or or or is there really a

2:36:45

continuum of of insight to be had

2:36:47

here that is

2:36:48

interesting. So I spent a lot of time

2:36:50

on that question

2:36:52

through my twenties — the full decade of my twenties,

2:36:54

and that was launched in part by nine eleven.

2:36:57

No. Then when nine eleven happened,

2:36:59

I had spent all this time reading religious

2:37:02

books, understanding em empathically

2:37:04

understanding the motivations of religious people,

2:37:07

right, knowing just how

2:37:09

fully certain people believe what they say they

2:37:11

believe. Right? So I took religious

2:37:13

convictions very seriously. And

2:37:16

then people started flying planes into our buildings, and

2:37:18

I so I knew that there

2:37:20

was something to be said about allegedly. The

2:37:22

the the core doctrine of Islam. Exactly.

2:37:25

So so I went down so that was that

2:37:27

became my wheelhouse for a time, you

2:37:30

know, terrorism and and jihadists

2:37:32

and related topics.

2:37:35

And so the nine eleven truth conspiracy

2:37:37

thing kept, you know,

2:37:40

getting aimed at me. And

2:37:43

the question was, well, do I do I

2:37:45

wanna debate these

2:37:46

people? Right? Yeah. Like

2:37:47

Alex Jones, perhaps. Yeah. I mean, yeah. So Alex

2:37:49

Jones, I think, was an early purveyor of it. Although, I

2:37:52

don't think I knew who he was at that point.

2:37:56

And so, and privately, I had some

2:37:58

very long debates with people who,

2:38:00

you know, the one person in my family went way

2:38:02

down that rabbit hole. And I just, you know, every

2:38:04

six months or so, I'd literally write

2:38:06

the two hour email, you know, that that would

2:38:09

try to try to deprogram

2:38:11

them, you know, however futilely. And

2:38:15

so I went back and forth for years on that topic

2:38:17

with with in private with people. But

2:38:20

could see the structure of the conspiracy. I could

2:38:22

see the nature of of of

2:38:24

of how of how

2:38:26

impossible it was to

2:38:29

to play whack a mole sufficiently well

2:38:32

so as to so as to convince anyone

2:38:34

of anything who was who was not

2:38:37

seeing the

2:38:40

problematic structure of that way of thinking.

2:38:42

I mean, it's it's not actually a thesis. It's

2:38:44

It's a proliferation of anomalies that

2:38:48

don't you can't actually connect all the dots

2:38:50

that are being pointed to. They they don't connect in

2:38:52

a coherent way. There are incompatible theses

2:38:55

that are not and and their incompatibility is not being

2:38:57

acknowledged. But

2:39:00

they're they're running this algorithm as

2:39:03

if things are things are never what they seem.

2:39:05

There's always malicious conspirators doing

2:39:07

things perfectly. We see

2:39:10

all we see evidence of human incompetence everywhere

2:39:12

else. No one can tie their

2:39:14

shoes, you know, expertly

2:39:17

anywhere else. But over here,

2:39:20

people are perfectly competent. They're

2:39:22

perfectly concealing things. Like, the

2:39:24

the thousands of people are collaborating. You

2:39:27

know, inexplicably. I mean,

2:39:29

incentivized by what who knows? They're

2:39:31

they're collaborating to murder thousands

2:39:33

of their neighbors, and no one is breathing

2:39:35

a peep about no one's getting caught on a on

2:39:37

camera. No one's and and no one's

2:39:40

breathed a word of it to a journalist. And

2:39:45

so I've

2:39:48

I've dealt with that style

2:39:50

of thinking and I know what it's like

2:39:52

to be in the weeds of a conversation like

2:39:54

that and and and the person

2:39:56

will say, okay. Well, but

2:39:58

what do you make of the fact that all

2:40:01

those F-16s were flown

2:40:03

eight hundred miles out to sea in

2:40:05

the morning of nine eleven to do an exercise

2:40:08

that hadn't even been scheduled for that day, but it

2:40:10

was And now all of these are

2:40:12

I dimly recall some thesis of that

2:40:14

kind, but I'm just making these things up now.

2:40:16

Right? So I got that detail. Hadn't even

2:40:19

been scheduled for that day yet was inexplicably run

2:40:21

that day. Like, so how long would

2:40:23

it take to track that down?

2:40:25

Right? The idea that this is anomalous.

2:40:28

Right? That there was an F-16

2:40:30

exercise run

2:40:33

on nine eleven. And it wasn't even supposed to have been run

2:40:35

that day. Right? Yeah. Someone

2:40:38

like Alex Jones, their

2:40:40

speech pattern is to pack as

2:40:42

much of that stuff in as

2:40:44

possible at the highest velocity that

2:40:47

a person can speak. And

2:40:49

unless you're knocking down each one

2:40:51

of those things, to that audience,

2:40:53

you appear to just be uninformed. Appeared

2:40:56

to just not. But you don't oh, wait. He he didn't know

2:40:58

about the F-16s. Yeah. Sure.

2:41:01

He he doesn't know about Project Mockingbird. You

2:41:03

haven't heard of Project Mockingbird? I just made up Project

2:41:05

Mockingbird. I don't know what it is, but that's

2:41:08

the kind of thing that comes tumbling out

2:41:10

in a conversation like that.

2:41:12

That's the kind of thing, frankly, I was worried about

2:41:15

in the COVID conversation because not

2:41:17

that someone like Brett would do it consciously,

2:41:21

but Someone like Brett is

2:41:23

swimming in a sea of misinformation on

2:41:25

social media, living on Twitter, getting

2:41:28

people sending the blog post

2:41:30

and the study from, you

2:41:33

know, the Philippines that showed that in

2:41:35

this cohort, Ivermectin did Right?

2:41:37

And and

2:41:39

not like, to

2:41:42

actually run anything to ground. Right?

2:41:44

You have to actually do

2:41:46

the work, journalistically, and scientifically,

2:41:51

and run it to ground. Right? So for many for

2:41:53

some of these questions, you actually have to be

2:41:55

a statistician to say,

2:41:57

okay, they

2:42:00

they use the wrong statistics in this

2:42:02

experiment. Right? Now, yes,

2:42:05

we could take all the time to do that,

2:42:07

or we could at every stage along the way

2:42:10

in a in context where we we have

2:42:13

experts we can trust go with ninety

2:42:15

with what ninety seven percent of the experts are saying

2:42:17

about lockdowns, about the safety of

2:42:20

mRNA, about the transmissibility

2:42:22

of COVID about whether to wear masks or not wear

2:42:24

masks. And I completely agree

2:42:27

that that broke down unacceptably

2:42:32

over the last few years. And that

2:42:36

but I think that's largely

2:42:39

social media and blogs and

2:42:41

and the efforts of podcasters and

2:42:43

Substack writers. They were

2:42:48

not just a response to that. It

2:42:50

was a I think it was a symptom

2:42:52

of that and a cause of that. Right?

2:42:54

And I think we're we're living in an environment

2:42:57

where people

2:43:02

we basically we we have trained ourselves

2:43:04

not to be able to agree about facts

2:43:07

on any topic no matter how urgent.

2:43:10

Right? What's what's flying in our sky?

2:43:12

You know? What is, you know, what is what's

2:43:15

happening in Ukraine? Is is Putin

2:43:17

just denazifying Ukraine?

2:43:20

I mean, like, there are people who

2:43:23

we respect who

2:43:25

are spending time down that particular

2:43:27

rabbit hole. Like, this is this

2:43:29

is, you know, maybe there are a lot of nazis

2:43:32

in

2:43:32

Ukraine, and that's the real problem. Right?

2:43:34

Maybe Putin's maybe

2:43:37

Putin's not the bad actor here. Right?

2:43:39

How much time do I have to spend? Empathizing

2:43:43

with Putin to the point of thinking, well, maybe

2:43:45

Putin's got a point and it's it's

2:43:47

like, what about the polonium and

2:43:50

the nerve agents and the killing of journalists

2:43:52

and the in the nivolny and, like,

2:43:55

does that count or not? Listen, I'm not paying

2:43:57

much attention to that because I'm following all these interesting

2:43:59

people on Twitter and they did

2:44:01

give me some pro Putin material

2:44:04

here. And there is a there

2:44:06

are some nazis in Ukraine. It's not like there

2:44:08

are no nazis in Ukraine. How

2:44:10

am I gonna weigh these things? I

2:44:12

think people are being driven crazy by Twitter.

2:44:15

Yeah. But you're you're kinda

2:44:17

speaking to conspiracy theories that pollute

2:44:19

everything and then you but every every

2:44:21

example you gave is kind of a bad faith

2:44:25

style of conversation. But but

2:44:27

it's not necessarily knowingly bad faith by I

2:44:29

mean, the people the people who are who are

2:44:31

worried about Ukraine Ukrainian Nazis,

2:44:34

to my I mean, some of the same people, they're the same

2:44:37

people who are worried about

2:44:39

ivermectin got suppressed. Like,

2:44:42

Ivermectin is really the panacea, but

2:44:44

it got suppressed because

2:44:46

no one could make billions on it. It's

2:44:51

the same it's literally the it's in

2:44:53

many cases the same people and the same

2:44:57

efforts to to unearth those.

2:44:59

And you're saying it's very difficult to have conversations

2:45:01

with those kinds of people. What about conversation

2:45:04

with Trump

2:45:05

himself? Would you do a podcast with

2:45:07

Trump? No.

2:45:09

No. No. I don't think so. I don't think I'd

2:45:11

be learning anything about him.

2:45:13

It's like with with Hitler, and

2:45:16

I I'm not comparing Trump to Hitler. But

2:45:18

Clip this, guys -- Yes. -- your chance -- Yeah. -- you

2:45:20

got this one. With certain world historical

2:45:22

figures, I

2:45:25

I would just feel like, okay, this is an opportunity

2:45:28

to learn something that I'm not gonna learn. I

2:45:30

I think Trump is among

2:45:32

the most superficial people we have

2:45:34

ever laid eyes on. Right? Like, he is he

2:45:37

is in public view.

2:45:39

Right? And I'm sure

2:45:41

I'm sure there's some distance between who he is in private

2:45:43

and who he is in public, but it's not gonna

2:45:45

be the kind of distance that's gonna blow

2:45:47

my mind. And

2:45:50

I think so

2:45:52

I think the liability — so for instance,

2:45:55

I think Joe Rogan was very

2:45:59

wise not to have Trump on his podcast. I

2:46:01

think all he would have been doing is is

2:46:03

he would have put himself in a situation where

2:46:05

he couldn't adequately contain the damage

2:46:08

Trump was doing, and he was just gonna make Trump

2:46:10

seem cool to a whole you

2:46:14

know, potentially new cohort of his massive

2:46:16

audience. Right? I

2:46:19

mean, they would have had a lot of fun. Trump's

2:46:21

funny. The

2:46:25

entertainment value of things

2:46:27

is so influential

2:46:31

I mean, there was that one debate where Trump,

2:46:34

you know, got a massive laugh

2:46:37

on the, you know, his line, you know, only

2:46:39

Rosie O'Donnell. Right? The

2:46:41

truth is we're we're living in a political system

2:46:44

where if you can get a big laugh

2:46:46

during a political debate, you

2:46:49

win. Does it matter who you

2:46:51

are? Like, that's the level of

2:46:53

of, you know, doesn't matter how uninformed you are.

2:46:55

It doesn't matter that half the debate was about

2:46:57

what the hell we should do about about, you

2:47:00

know, the threat of nuclear war or anything else.

2:47:03

It's

2:47:05

We're monkeys. Right? And we like to

2:47:07

laugh. Well, because you brought

2:47:09

up Joe. He's somebody, like you, I look

2:47:11

up to. I've

2:47:13

learned a lot from him because

2:47:15

I think who he is

2:47:17

privately as a human being, also

2:47:19

his He's kind of the voice of

2:47:21

curiosity to me. He inspired

2:47:23

me that it's like unending open minded

2:47:26

curiosity. Much

2:47:28

like you are the voice of reason. They

2:47:31

recently had a podcast. Joe recently had a podcast

2:47:34

with Jordan Peterson, and they

2:47:36

brought you up, saying they still

2:47:38

have hope for you. Mhmm. Yeah.

2:47:40

Any chance you could Any chance

2:47:42

you could talk to Joe again? And reinvigorate

2:47:45

your friendship?

2:47:47

You know what? I reached out to him privately

2:47:49

when I saw that. Did you use the power

2:47:51

of love? Joe knows I I love

2:47:53

him and consider him a friend. Right? So there's no there's

2:47:56

no issue there. He

2:47:58

also knows I'll I'll be happy to do his podcast.

2:48:02

When we get that together, you know. So there's

2:48:04

no I I have got no policy of not talking

2:48:07

to Joe or not doing his podcast.

2:48:11

I mean, I think we got a little sideways

2:48:13

along these same lines where, you know, we've

2:48:15

talked about Brett, Elon, and other people.

2:48:19

It was never to that

2:48:21

degree with Joe because Joe's

2:48:26

in a very different lane. Right? He's unconsciously

2:48:28

so. I mean, Joe is a stand up

2:48:30

comic who interviews,

2:48:33

who just is interested

2:48:35

in everything, interviews the

2:48:37

widest conceivable variety of people

2:48:40

and just lets his interests collide

2:48:42

with their expertise or, you know, lack

2:48:44

of expertise. I mean, he's he's again, it's

2:48:47

a super wide variety of people. He'll

2:48:50

talk about anything and he can always

2:48:52

pull the ripcord saying,

2:48:55

you know, I don't know what the fuck I'm saying. I'm a comic.

2:48:57

I'm stoned. We're we just drank too much.

2:48:59

Right? Like like as as very entertaining, it's

2:49:02

all in, you know, to my eye, it's it's

2:49:04

all in good

2:49:04

faith. I think Joe is an extraordinarily ethical

2:49:07

good person.

2:49:08

So he doesn't use Twitter. He doesn't really use

2:49:10

Twitter. Right. Yeah. Yeah. No. The crucial difference

2:49:12

though is it because he

2:49:15

is an

2:49:18

entertainer first. I mean, I'm

2:49:20

not I'm not saying he's not smart and doesn't understand

2:49:22

things. I mean, what's potentially

2:49:24

confusing is he's very smart and he's

2:49:26

also very informed. He's — his

2:49:28

full time job is talking, you know, when

2:49:31

he's not doing standup, or doing color

2:49:33

commentary for the UFC. His

2:49:36

full time job is talking to

2:49:39

lots of very smart people at great length.

2:49:41

So he's he's created a, you know, the the

2:49:43

Joe University for himself and he's he's

2:49:45

gotten a lot of information crammed

2:49:47

into his head. So It's

2:49:49

not that he's uninformed, but he can

2:49:52

always when he feels that he's uninformed

2:49:54

or when it turns out he was wrong about something.

2:49:57

He can always pull the ripcord and say,

2:49:59

I I'm just a comic. We were stoned.

2:50:02

It was fun. You know, don't don't take medical

2:50:04

advice from me. I don't play a doctor

2:50:06

on the Internet. Right? I

2:50:09

can't quite do that. Right?

2:50:12

You can't quite do that. We're we're in different

2:50:14

lanes. I'm not saying you and I aren't exactly

2:50:16

the same lane, but for much of Joe's

2:50:18

audience, I'm just this establishment shill

2:50:20

who's just banging on about, you know, the universities

2:50:22

and medical journals and and

2:50:26

It's not true, but that would be the perception.

2:50:28

And as a counterpoint to a lot of what's being

2:50:30

said on Joe's podcast or or, you

2:50:33

know, certainly Brett's podcast on these topics,

2:50:35

I can see how they would form that

2:50:38

opinion. But in

2:50:40

reality, if you listen to me long enough,

2:50:42

you hear that I've

2:50:45

said as much against the woke nonsense as

2:50:48

anyone, even any lunatic on the right

2:50:50

who can only keep that

2:50:52

bright that bright shiny object in view. Right?

2:50:55

So there's nothing that Candace Owens has

2:50:57

said about Wokeness, that I haven't said about Wokeness

2:51:00

insofar as she's speaking

2:51:02

rationally about Wokeness. But

2:51:07

we have to be able to keep multiple things

2:51:09

in view. Right? If you if you could only

2:51:11

look at the problem of Wokeness and you couldn't

2:51:13

acknowledge problem of Trump and Trumpism

2:51:16

and QAnon, and and

2:51:18

the explosion of irrationality that was

2:51:20

happening on the right, and bigotry that was happening

2:51:23

on the right. You

2:51:25

You were just disregarding half

2:51:27

of the landscape, and

2:51:30

many people took half of the problem.

2:51:33

In in recent years. In the last five years, there's

2:51:35

a story of many people taking half

2:51:37

of the problem and monetizing that

2:51:39

half of the problem. And and getting

2:51:42

captured by an audience that

2:51:45

only wanted that half of the problem talked

2:51:47

about in that way. And

2:51:50

And this is the this is the larger issue of

2:51:52

of audience capture, which, you

2:51:55

know, is very I'm sure it's

2:51:57

it's an ancient problem, but It's

2:52:00

a very helpful phrase that I think comes to us courtesy

2:52:02

of our mutual friend, Eric Weinstein. And

2:52:07

Audience capture is a thing, and I believe

2:52:09

I've witnessed many casualties

2:52:12

of it. And if there's anything I've been on

2:52:14

guard against, in my life, you

2:52:16

know, professionally, it's been that. And

2:52:18

and when I noticed that I had a lot of

2:52:21

people in my audience who

2:52:23

didn't like my criticizing Trump,

2:52:26

I really leaned into it. And when I noticed

2:52:28

that a lot of the other cohort in my audience

2:52:31

didn't like me, criticizing the

2:52:33

far left and wokeness. They thought I was, you

2:52:35

know, exaggerating that problem. Mhmm.

2:52:37

I leaned into it because I thought those parts

2:52:40

of my audience were were absolutely wrong.

2:52:43

And I didn't care about

2:52:45

whether I was gonna lose those parts of my audience.

2:52:49

There are people who have created, you know,

2:52:52

knowingly or not, there are people who have created different

2:52:54

incentives for themselves. Because

2:52:56

of how they've monetized their podcast and

2:52:58

because of the kind of signal they've

2:53:00

responded to in their audience. And

2:53:04

I worry about, you know, you know, Brett

2:53:06

would consider this a totally invidious ad

2:53:10

hominem thing to say, but I really do worry

2:53:12

that that's happened to Brett. I think

2:53:14

I cannot explain how you do

2:53:16

a hundred with all the things in the

2:53:18

in the universe to be interested

2:53:21

in, of all the things he's competent

2:53:23

to speak intelligently about, I don't know

2:53:25

how you do a hundred podcasts in a row on

2:53:28

on

2:53:28

COVID. Right? It's

2:53:29

just — it makes no sense. Do you think

2:53:32

in part audience capture can explain

2:53:34

that? I absolutely think it can. Yeah. What about

2:53:37

you? Like,

2:53:39

for example, do you feel pressure to

2:53:41

not admit that you made a mistake on COVID

2:53:44

or made a mistake on Trump? I'm

2:53:46

not saying you feel that way,

2:53:48

but do you feel this pressure? So

2:53:51

you've avoided audience capture within

2:53:53

your within the way you do stuff

2:53:55

so you don't feel as much pressure

2:53:57

from the audience but within your own

2:53:59

ego. I mean, again, the

2:54:01

the people who think I'm wrong about any

2:54:03

of these topics are gonna think,

2:54:05

okay, you're just not

2:54:08

admitting that you're wrong, but

2:54:12

now we're having a dispute about specific

2:54:14

facts. There

2:54:19

are things that I believed about COVID are

2:54:21

worried my very might be true about

2:54:23

COVID two years ago that

2:54:26

I no longer believe or I'm not so worried about

2:54:28

now. And vice

2:54:30

versa. I mean, things have changed, certain

2:54:32

things have flipped upside down. The question

2:54:36

is, was I wrong? So

2:54:39

here's here's a cartoon version of it, but

2:54:41

this is something I said probably eighteen months

2:54:43

ago and it's still

2:54:44

true. You know,

2:54:45

when

2:54:45

I saw what Brett was doing on COVID, you

2:54:47

know, let's call

2:54:49

it two years ago.

2:54:54

I said, even if he is right, even

2:54:56

if it even turns out that

2:54:58

Ivermectin is a

2:54:59

panacea, and the mRNA

2:55:01

vaccines kill millions of people.

2:55:03

Right?

2:55:04

He's still wrong right now

2:55:07

His reasoning is still flawed right

2:55:09

now. His 365 still suck

2:55:11

right now. Right? And his

2:55:13

and his confidence is is

2:55:16

unjustified now. That

2:55:18

was true then. That will always be true then.

2:55:20

Right? And and so And

2:55:23

not much has changed for

2:55:25

me to revisit any

2:55:28

of my time points along the way. Again,

2:55:30

I will totally concede that if I had

2:55:33

teenage boys and

2:55:36

their schools were demanding that they'd be vaccinated

2:55:38

with the mRNA vaccine, I

2:55:41

would I would be powerfully

2:55:43

annoyed. Right? Like, I would I wouldn't know

2:55:45

what I was gonna do, and I would be I would

2:55:48

be doing more research about about

2:55:51

myocarditis, and I'd be badgering

2:55:53

our doctors, and I would be worried

2:55:56

that we have a medical system and a pharmaceutical

2:55:59

system and a healthcare system

2:56:01

and a public health system that's not

2:56:03

incentivized to look at

2:56:05

any of this in a fine grain way and they just want

2:56:07

one blanket

2:56:10

admonition to the entire population,

2:56:12

just get just take the shot, you idiots.

2:56:16

I view that largely as

2:56:18

a result, a panicked response to

2:56:20

the misinformation explosion that happened

2:56:23

and the public, the populist resistance

2:56:26

animated by misinformation that just made it

2:56:28

impossible to get anyone to to cooperate.

2:56:30

Right? So it's just part of it is,

2:56:32

again, a pendulum swing in the wrong direction.

2:56:35

Somewhat analogous to the woke response to Trump

2:56:37

and the Trumpist response to woke. Right? So there's

2:56:40

a lot of people have just gotten pushed around for

2:56:42

bad reasons. Or understandable reasons.

2:56:46

But yes, it's

2:56:49

There are there are caveats to

2:56:52

my things have changed about my view of

2:56:54

of COVID. But the question is, if

2:56:56

you roll back the clock eighteen months,

2:56:58

was I wrong

2:57:00

to want to platform

2:57:04

Eric Topol, you know, a very

2:57:06

well respected cardiologist on this

2:57:09

topic or,

2:57:12

you know, Nicholas Christakis to

2:57:14

to talk about the network effects of, you

2:57:16

know, whether we closed

2:57:18

schools. Right? He's written a book on COVID.

2:57:21

You know, network effects are his

2:57:23

wheelhouse, both as an MD and as a

2:57:25

sociologist.

2:57:29

There was a lot that we believed we knew

2:57:31

about the efficacy of of closing

2:57:33

schools during pandemics, right, during the, you

2:57:35

know, during the the Spanish

2:57:38

flu pandemic and others,

2:57:40

right? But there's a lot we didn't know about

2:57:42

COVID. We didn't know we didn't know how

2:57:47

negligible the effects would be on

2:57:49

kids compared to older people. We didn't know

2:57:51

like the so my my problem I really

2:57:54

enjoyed your conversation with

2:57:55

Eric Topol, but a lot of people didn't. So

2:57:57

he's one of the great communicators,

2:58:00

in many ways, on Twitter, like,

2:58:02

distillation of the current data,

2:58:05

but he I hope I'm

2:58:07

not overstating it, but there

2:58:09

is a bit of an arrogance. From

2:58:12

him that I think could be explained

2:58:15

by him being exhausted, by

2:58:17

being constantly attacked by conspiracy theorists,

2:58:20

like, anti-vaxxers. Yeah. To

2:58:22

me, the same thing happens with people that

2:58:25

start drifting to

2:58:27

be being right wing.

2:58:29

is they get attacked so much by the left. They

2:58:32

become almost irrational and arrogant in their

2:58:34

beliefs. And I

2:58:36

I felt your conversation with Eric Topol did

2:58:39

not sufficiently empathize

2:58:41

with people that have skepticism, but also

2:58:44

did not sufficiently communicate uncertainty

2:58:46

we have. So, like, many

2:58:49

of the decisions you made, many of the things

2:58:51

you were talking about,

2:58:53

we're kinda saying there's a lot of uncertainty, but

2:58:55

this is the best thing we could do now. Well, it

2:58:57

was a forced choice. You're gonna get COVID.

2:58:59

Do you wanna be vaccinated when you get it? Right.

2:59:01

That was always,

2:59:04

in my view, an easy choice. And

2:59:07

it's up until you you start breaking

2:59:09

apart the cohorts and you start saying, okay, wait

2:59:11

a minute. There is this myocarditis issue

2:59:14

in in young men. Let's talk

2:59:16

about

2:59:17

that. When

2:59:19

that story emerged, it was just — it was

2:59:22

just clear that

2:59:25

this is — even if it's

2:59:28

not knocking down transmission as much as we

2:59:30

had

2:59:30

hoped, it is still mitigating severe

2:59:33

illness and death.

2:59:35

And

2:59:38

I still believe that it is the

2:59:41

current view of

2:59:43

Most people competent to analyze the data

2:59:46

that we lost something like three hundred

2:59:48

thousand people unnecessarily in the

2:59:50

US. Because of because of

2:59:52

vaccine hesitancy. But I think there's

2:59:55

a way to communicate with humility about

2:59:57

the uncertainty of

2:59:58

things. That would increase the vaccination rate.

3:00:01

IDW believe that it is rational

3:00:04

and sometimes necessary to

3:00:08

to signal impatience with

3:00:10

certain bad ideas.

3:00:12

Right? And certain conspiracy theories, and certain

3:00:15

forms of misinformation. So It

3:00:17

could — it's just — I just think it makes you look like a

3:00:19

douche bag most times. Well, no. I

3:00:21

mean, certain people are persuadable. Certain people

3:00:23

are not persuadable, but it's

3:00:27

no. Because there's not enough it's it's the

3:00:29

opportunity cost. Not everything can

3:00:31

be given a patient

3:00:33

hearing. So you can't have a physics conference

3:00:36

and then let people in to

3:00:38

just trumpet their pet theories about

3:00:40

the grand unified vision of physics. When

3:00:45

they're obviously crazy or they're obviously

3:00:47

half crazy or they're just not, you know,

3:00:49

the peep like, you begin to

3:00:52

you begin to get a sense for this

3:00:54

when it is your wheelhouse, but there are people

3:00:56

who kind

3:00:59

of declare their their irrelevance

3:01:03

to the conversation fairly

3:01:05

quickly without knowing that they have done

3:01:07

it. Right? And And

3:01:11

the truth is, I think I'm one

3:01:13

of those people on the topic

3:01:15

of COVID. Right? Like, I it's like it's

3:01:17

not It's never that I

3:01:19

felt, listen, I know

3:01:22

exactly what's going on here. I know these mRNA

3:01:24

vaccines are safe. I know exact — I

3:01:26

know I know exactly how to run

3:01:28

a lockdown. I no.

3:01:30

This is this is a situation where

3:01:32

you want the actual pilots to fly

3:01:34

the plane. Right? We needed experts

3:01:37

who we could trust. And insofar as

3:01:39

our experts got captured by

3:01:42

by all manner of thing. I mean, some of them got

3:01:44

captured by Trump. Some of them were made to look ridiculous

3:01:47

just standing next to Trump while

3:01:49

he was bloviating about, you

3:01:51

know, whatever that, you know, that it's just gonna

3:01:53

go away. There's just fifteen people, you know,

3:01:55

there's fifteen people in the cruise ship, and it's just gonna

3:01:57

go away. There's gonna be no problem. Or it's, like,

3:02:00

when he said, he, you know, many of these doctors

3:02:02

think, I understand this better than them. They're just amazed

3:02:04

at how I understand this. And you've got doctors,

3:02:07

real doctors, the heads of the CDC and

3:02:10

and NIH standing around just

3:02:13

ashen faced while he is

3:02:15

talking, you know, all

3:02:17

of this was deeply corrupting of the

3:02:19

public communication of science. And then

3:02:21

again, I've banged on about the

3:02:24

depredations of wokeness. The woke thing was

3:02:27

a disaster. Right? Still is a disaster.

3:02:30

But I mean —

3:02:34

I mean, but the thing is there's a big difference

3:02:36

between me and Brett in this case. I

3:02:38

didn't do a hundred podcasts on COVID. I

3:02:40

did like two podcasts on COVID. The

3:02:43

measure of my concern about COVID can be

3:02:45

measured in how many podcasts I did

3:02:47

on it. Right? It's like once we had a

3:02:49

sense of how to live with COVID, I

3:02:51

was just living with COVID. Right? Like,

3:02:54

okay, if you get vaxxed or don't get vaxxed,

3:02:56

wear a mask or don't wear a mask. Travel or don't

3:02:58

travel. Like, you've got a few things to decide, but

3:03:01

my kids were stuck at home on iPads, you

3:03:04

know, for too long. I didn't agree

3:03:06

with that. You know, it was obviously not necessary. Like,

3:03:10

I criticized that on the margins, but

3:03:12

there was not much to do about it. But the the

3:03:14

the thing I didn't do is make this my

3:03:16

life and just browbeat

3:03:19

people with one message or another. We

3:03:22

need a public health regime where

3:03:25

we can trust what competent people

3:03:27

are saying to us about you know,

3:03:29

what medicines are safe to take. And

3:03:31

in the absence of that, craziness

3:03:34

is gonna — even in the presence of that, craziness

3:03:36

is gonna proliferate given the tools we've built. But

3:03:38

in the absence of that, it it's gonna proliferate

3:03:40

for understandable reasons. And that's gonna

3:03:43

it's it's not gonna be good

3:03:45

next time when when something orders

3:03:48

of magnitude more dangerous hits us.

3:03:50

And that's I

3:03:52

spend — you know, insofar as I think

3:03:54

about this issue, I think much more about

3:03:57

next time than this time. Before

3:04:01

this COVID thing, you and Brett had

3:04:03

some good conversations. I would say you were

3:04:05

friends. What's your — what do you admire

3:04:07

most about Brett, in spite of all the

3:04:09

criticism we've had about this COVID topic?

3:04:13

Well, I I think Brett is very

3:04:15

smart and he's a very ethical

3:04:18

person who wants good things for the

3:04:20

world. I mean, I have no reason to doubt that.

3:04:23

So the fact that we're on, you know,

3:04:25

we're cross wise on this

3:04:27

issue is not does not mean that I

3:04:29

think he's a bad person.

3:04:32

I mean, the the thing that worried me about

3:04:35

what he was doing, and this was

3:04:37

true of Joe, and this was true of Elon, it's true of

3:04:39

many other people, is that once you're

3:04:41

messaging at scale to

3:04:44

a vast audience, you

3:04:47

incur a certain kind of responsibility, not

3:04:50

to get people killed. And I do, I

3:04:52

did worry that, yeah,

3:04:55

people were making decisions on

3:04:57

the basis of the information that was getting shared

3:04:59

there. And that that's why I was I

3:05:02

think, so circumspect. I just

3:05:04

said, okay. Give me the center

3:05:08

of the fairway expert opinion at this time

3:05:10

point and at this time point and at

3:05:12

this time point and then I'm

3:05:14

out. Right, I don't have anymore to say about

3:05:16

this. I'm not an expert on COVID, I'm not

3:05:19

an expert on the safety of mRNA vaccines.

3:05:24

If something — if something changes or those become

3:05:26

newsworthy, then maybe I'll do a podcast. So — I mean, I just

3:05:28

did a podcast on the lab leak.

3:05:31

Lab leak. I was never

3:05:33

skeptical of the

3:05:35

Lab Leak hypothesis. Brett was

3:05:37

very early on saying, this is this is

3:05:39

a lab leak. Right? At

3:05:42

a point where my only position was

3:05:44

who cares if it's a lab leak. Right? Like, the

3:05:46

there's the thing we have to get straight

3:05:48

is what do we do given the nature of this pandemic.

3:05:50

But also, we should say that you've

3:05:52

actually stated that it is a possibility.

3:05:55

Oh, yeah. We should say — it doesn't — I mean,

3:05:58

is it the time to figure that

3:06:00

out now? I've actually — I have had my

3:06:02

podcast guests on this

3:06:04

topic change my view of this because,

3:06:06

you know, one of the guests, Alina

3:06:09

Chan, made the point that, no,

3:06:12

Actually, the the best time to figure

3:06:14

out the origin of this is immediately.

3:06:17

Right? Because in time you lose touch with the evidence, and

3:06:19

I hadn't really been thinking about that. Like, I didn't if

3:06:21

you come back after a year, you

3:06:24

know, there are certain facts you might not be able to get

3:06:26

in hand, but I've always felt

3:06:29

that it didn't matter

3:06:31

for two reasons. One

3:06:33

is we had the genome of the

3:06:35

virus and we could

3:06:37

design — we were very quickly, immediately

3:06:39

designing vaccines against that genome. And

3:06:41

that's what we had to do. And then we had to figure out how

3:06:44

to vaccinate and to

3:06:46

mitigate and to develop treatments

3:06:49

and all of that So the origin

3:06:51

story didn't matter. Generically

3:06:54

speaking, either origin

3:06:56

story was politically

3:07:00

inflammatory and

3:07:02

made the Chinese look bad. Right? And the Chinese

3:07:04

response to this looked bad, whatever the origin

3:07:06

story. Right? They're not cooperating they're

3:07:09

stopping their domestic flights,

3:07:11

but letting their international flights go.

3:07:13

I mean, it's just they were bad actors. And

3:07:15

they should be treated as such regardless

3:07:18

of the origin. Right? And and, you know,

3:07:20

I would argue that the wet market

3:07:22

origin is even more politically

3:07:25

invidious than the lab leak origin. I

3:07:27

mean — Why do you think? Because with the lab leak,

3:07:29

to my eye, the lab

3:07:31

leak can happen to anyone. Right? We're all running

3:07:34

all these advanced countries are running

3:07:36

these dangerous labs. That's

3:07:38

a practice that we should be worried about

3:07:41

you know, in general. We

3:07:43

know lab leaks are a problem. There have been

3:07:45

multiple lab leaks and even worse things

3:07:48

that haven't gotten out of hand in this way,

3:07:51

but worse pathogens. We're wise

3:07:57

to be worried about this. And

3:07:59

on some level, it could happen to anyone.

3:08:01

Right? The wet market makes

3:08:04

them look like barbarians living in

3:08:06

another century. Like, you gotta clean up those

3:08:08

wet markets. Like, what are you what are you doing

3:08:10

putting a bat on top of a pangolin, on top

3:08:12

of a duck, It's like, get your

3:08:14

shit together. So, like, if anything,

3:08:17

the wet market makes them look worse

3:08:19

in my view. Now, I'm sure there's I'm

3:08:21

sure that what they actually did to

3:08:24

conceal a lab leak. If it was a lab leak, and

3:08:26

all of that's gonna look odious.

3:08:29

Do you think we'll ever get to the bottom of that? I mean,

3:08:31

one of the big negative, I

3:08:36

would say, failures of Anthony Fauci and

3:08:38

so on is to be transparent and

3:08:40

clear and just a good communicator about gain of

3:08:42

function research, the dangers of that. This

3:08:45

like, the, you know, why

3:08:48

it's a useful way of research,

3:08:50

but it's also dangerous. Right. Just

3:08:52

being transparent about that as opposed to

3:08:54

just coming off really shady. Of course,

3:08:56

the conspiracy theorists and the politicians

3:08:59

are not helping. But this just

3:09:01

created a giant

3:09:04

mess. Yeah. No. I would agree. So

3:09:06

so that exchange with Fauci and

3:09:08

Rand Paul they went viral.

3:09:11

Yeah. I would agree that Fauci

3:09:13

looked like he was taking

3:09:16

refuge in kind of very lawyered

3:09:19

language and not

3:09:22

giving a straightforward account of what we

3:09:24

do and why we do it and So,

3:09:26

yeah, I think it it looked shady. It played

3:09:28

shady and it probably was shady.

3:09:30

I mean, I don't I don't know how personally entangled

3:09:33

he is with any of this, but yeah,

3:09:35

the the gain of function research

3:09:37

is something that I

3:09:39

think we're wise to be worried

3:09:42

about, and insofar

3:09:44

as I judge myself adequate to

3:09:46

have an opinion on this, I

3:09:49

think it should be banned. Right? Like,

3:09:51

it's probably a

3:09:54

podcast I'll do, you know, if if you

3:09:56

or somebody else doesn't do it in the meantime, you

3:10:00

know, I I would like a virologist on

3:10:03

to defend it against

3:10:05

a virologist who who who

3:10:07

would criticize it. Forget about just the

3:10:10

gain of function research. I

3:10:12

don't even understand virus hunting at this

3:10:14

point. Mhmm. It's like, I don't know. I don't even know

3:10:16

why you need to go into a cave to find this

3:10:18

next virus that could be circulating

3:10:20

among bats that may jump

3:10:23

zoonically to us. Why

3:10:26

do that when we can make when we when we can

3:10:28

sequence in a day and and make

3:10:30

vaccines in a in a weekend? I mean, like, Like,

3:10:33

what kind of head start do you think you're getting? That's

3:10:35

a surprising new thing. How quickly we can develop

3:10:37

a vaccine. Exactly. That's Yeah.

3:10:41

That's really interesting. But the shadiness

3:10:43

around lab leak — I think

3:10:45

the point I didn't make about

3:10:47

Brett's style of engaging this issue is people

3:10:50

are using the fact that he was early on lab leak

3:10:52

to suggest that he was right about ivermectin

3:10:55

and about mRNA vaccines

3:10:58

and all the rest. Like, no, that that's

3:11:02

none of that connects. And it

3:11:04

was possible to be falsely confident.

3:11:07

Like, you shouldn't have been confident about lab leak, but

3:11:09

no one should have been confident about lab leak early,

3:11:11

even if it turns out to be Lab Leak. Right?

3:11:14

It was always plausible. It was never

3:11:16

definite. It still isn't definite. Zoonotic

3:11:18

is is is also quite plausible.

3:11:21

It certainly was super plausible then.

3:11:25

Both are politically uncomfortable.

3:11:28

Both, at the time, were inflammatory to

3:11:32

be banging on about when we were trying to

3:11:34

secure some kind of cooperation from the Chinese.

3:11:36

Right? So there's a time for these things. And

3:11:39

and and it's possible to be right by accident.

3:11:42

Right? That's the that is it

3:11:47

your your reasoning the style of reasoning

3:11:49

matters whether you're

3:11:52

right or not. You know, it's like

3:11:54

because your style of reasoning is dictating

3:11:56

what you're gonna do on the next topic.

3:12:02

Sure. But this is a

3:12:04

multivariate situation here. It's

3:12:07

really difficult to know what's right on COVID,

3:12:10

given all the uncertainty, all the chaos, especially

3:12:12

when you step outside the pure biology,

3:12:14

virology of it and you start getting

3:12:16

to policy. Yeah. It's just

3:12:19

really yeah. It's just trade offs. Yeah.

3:12:21

Like transmissibility of the virus. Sure.

3:12:25

Just knowing, if sixty

3:12:28

percent of the population gets vaccinated, what

3:12:30

effect would that have? Just

3:12:33

even knowing those things, just modeling

3:12:35

all those things. Given

3:12:38

all the other incentives — I mean, sure. But

3:12:41

you can always — you, I want to say, had the CEO of Pfizer

3:12:43

on your podcast. Did you leave that conversation

3:12:46

feeling like, this is a person

3:12:48

who is consciously reaping

3:12:55

windfall profits on a dangerous

3:12:59

vaccine and putting

3:13:02

everyone at intolerable risk or

3:13:04

do you think this person did you think this person

3:13:06

was was making a good faith attempt to

3:13:08

save lives and

3:13:11

had no no

3:13:13

bad no no taint of bad

3:13:15

incentives or something.

3:13:18

The the thing I sensed and I

3:13:21

felt in part It was

3:13:24

a failure on my part, but

3:13:26

I I sensed that I was talking to a politician. Mhmm.

3:13:29

So it's not — I'm not thinking there

3:13:31

was malevolence there or benevolence. There

3:13:35

was he just had a job. He

3:13:37

he put on a suit, and I was talking to a

3:13:39

suit. Not a human being. Now

3:13:42

he said that his son was a big fan of the

3:13:44

podcast, which is why he wanted to do it.

3:13:46

So I thought I would be talking to a human being.

3:13:48

And I asked challenging

3:13:50

questions, or what I thought — the Internet thinks

3:13:52

otherwise. Every single question in

3:13:55

that interview was

3:13:57

a challenging one. Mhmm. But

3:14:00

it wasn't grilling, which is what people

3:14:02

seem to want to do with pharmaceutical

3:14:04

companies. There's a deep distrust of pharmaceutical

3:14:07

companies. But what was the alternative? I mean,

3:14:09

I I totally get -- Right. -- that windfall

3:14:11

profits at a time of of,

3:14:14

you know, public health emergency are bad.

3:14:16

It's a bad it is a bad look. Right? But

3:14:18

-- Yeah. -- what do how do we

3:14:21

reward and return

3:14:23

capital to risk

3:14:25

takers who will spend

3:14:27

a billion dollars to design a new drug

3:14:30

for a disease that

3:14:33

maybe only harms a

3:14:35

single digit percentage of the population. It's

3:14:37

like, what do we want to encourage? And

3:14:39

who do we want to get rich? I mean, so the person

3:14:42

who cures cancer, do we want that

3:14:44

person to get rich or not? We we want the

3:14:46

we want the person who gave

3:14:48

us the iPhone to get rich, but we don't want the person

3:14:50

who who cures cancer to get rich.

3:14:52

I mean, what are we trying to — I think it's a very

3:14:55

gray area. What we want is a person

3:14:57

who declares that they have a cure for cancer,

3:14:59

they have authenticity and transparency. There's

3:15:02

like, I think we're good now

3:15:04

as a population at smelling bullshit.

3:15:07

And there is something about the the Pfizer

3:15:09

CEO, for example, just CEOs of pharmaceutical companies

3:15:12

in general just because they're

3:15:14

so lawyered up, so much

3:15:17

marketing and PR people, that

3:15:19

they are you just smell bullshit.

3:15:21

You're not talking to a real person. That

3:15:24

they just it just feels like none

3:15:26

of it is transparent to us as a public.

3:15:28

So, like, this whole talking

3:15:32

point that Pfizer is only

3:15:34

interested in helping people just doesn't ring

3:15:36

true, even though it very well

3:15:38

could be true. It's the same thing with

3:15:40

Bill Gates -- Mhmm. -- who seems to

3:15:43

be at scale helping a huge amount

3:15:45

of people in the world. Yeah. And yet there's

3:15:47

something about the way he delivers that message.

3:15:49

that people don't like. They

3:15:52

seem suspicious. What's happening

3:15:54

underneath this? Right. There's certain kinds of

3:15:56

communication styles that seem to

3:15:59

serve as

3:16:01

better catalysts for conspiracy theories.

3:16:03

And I'm not sure what that is

3:16:05

because I don't think there's an alternative for

3:16:08

capitalism in delivering

3:16:11

drugs that help people. But

3:16:14

also at the same time, there seems to need to

3:16:16

be more transparency. And plus, like,

3:16:18

regulation that actually makes sense versus --

3:16:20

Mhmm. -- it seems like pharmaceutical

3:16:23

companies are susceptible to corruption. Yeah.

3:16:27

I worry about all that.

3:16:29

But I I also do think that

3:16:32

most of the people going into those

3:16:34

fields and most of the people going into government.

3:16:36

They wanna do good, they're doing it for good, and they're

3:16:38

not psychopaths, trying to get

3:16:40

good things done and trying trying to solve hard

3:16:43

problems, and they they're not trying to get

3:16:45

rich. I mean, many of the people are it's like, they're

3:16:50

I mean, bad incentives or or something.

3:16:52

Again, I've uttered that phrase

3:16:54

thirty times on this podcast,

3:16:57

but it's it's just almost

3:17:00

everywhere it explains normal

3:17:02

people creating terrible

3:17:05

harm. Right? It's it's not that there are that many

3:17:07

bad people. You know? And yes,

3:17:11

it makes it makes the truly bad

3:17:13

people that much more remarkable and and

3:17:15

worth paying attention to. But the

3:17:17

bad incentives and the

3:17:21

power of bad ideas

3:17:25

do much more harm. Because, I mean, that's what

3:17:27

that's what gets good people running

3:17:29

in the wrong direction or or

3:17:33

doing things that are that are clearly

3:17:36

creating unnecessary suffering. You've

3:17:40

had and I hope still have a friendship

3:17:42

with Elon Musk. Especially

3:17:44

over the topic of AI. You have

3:17:46

a lot of interesting ideas that you both share

3:17:48

concerns that you both share. Well,

3:17:51

let me first ask, what do you admire

3:17:54

most about Elon? Well,

3:17:57

you know, I've had

3:17:59

a lot of fun with Elon. I like Elon

3:18:02

a lot. I mean, the Elon I knew as a friend, I

3:18:04

I like a lot. And you

3:18:08

know, I've you know, it's it's not gonna

3:18:10

surprise anyone. I mean, he's he's

3:18:13

done and is continuing to

3:18:15

do amazing things. And I think he's

3:18:21

If many of his aspirations

3:18:23

are realized, the world would be a much better place.

3:18:26

I think it's just amazing to see

3:18:28

what he's built and what he's attempted to build

3:18:30

and what he may yet build. So

3:18:32

with Tesla, SpaceX, with — Yeah. No.

3:18:34

I'm I'm a fan of almost all of that.

3:18:37

I mean, there are wrinkles to to

3:18:39

a lot of that, you know, or some of that. And

3:18:43

humans are full of wrinkles. There's something

3:18:45

very Trumpian about how he's acting on

3:18:47

Twitter. Right? I mean, Twitter, I think Twitter's he

3:18:49

doesn't he thinks Twitter's great. He bought the

3:18:51

place because he thinks it's so great. I think Twitter

3:18:54

is driving him crazy. Right? I think he's I think

3:18:56

he's needlessly complicating his life

3:18:58

and harming his reputation and creating

3:19:01

a lot of noise. And and harming

3:19:03

a lot of other people. I mean, so, like, he the

3:19:05

thing that I objected to with him on Twitter

3:19:07

is not that he bought

3:19:09

it and made changes to it. I mean, that was

3:19:11

not again, I

3:19:14

I remain agnostic as to whether or not he can improve

3:19:16

the platform. Mhmm. It was

3:19:18

how he was personally behaving on Twitter,

3:19:20

not just toward me but toward the world.

3:19:23

I think when you, you know,

3:19:26

forward an article about Nancy

3:19:28

Pelosi's husband being attacked,

3:19:31

not as he was by some lunatic, but

3:19:34

that it was just some gay liaison gone wrong.

3:19:38

Right? That's not what it seems. And you link

3:19:40

to a website

3:19:42

that previously claimed that

3:19:45

Hillary Clinton was dead and that a body double

3:19:47

was campaigning in her place. That

3:19:50

thing was exploding in Trumpistan

3:19:52

as a conspiracy theory. Right? And it was

3:19:54

having its effect, and it matters that he

3:19:56

was signal boosting it in front of a

3:19:58

hundred and thirty million people. And so it

3:20:00

is with saying that your,

3:20:03

you know, your former

3:20:05

employee, Yoel Roth, is a pedophile.

3:20:07

Right? I mean, like, that has real consequences.

3:20:10

It appeared to be complete bullshit. And

3:20:12

now you get this guy is getting inundated with

3:20:15

death threats. Right? And

3:20:17

all of that's totally predictable. Right?

3:20:19

And he's so he's behaving quite recklessly.

3:20:22

And there's a long list of things like like

3:20:24

that that he's done on Twitter. It's

3:20:27

not ethical. It's not good for him.

3:20:29

It's not good for the world. It's not serious.

3:20:32

It's just it's it's

3:20:35

a very adolescent relationship to

3:20:38

real problems in our society. And

3:20:40

so my my problem with how

3:20:43

he's behaved is that he's he's

3:20:45

purported to touch real issues by turns

3:20:47

like, okay, do I give the satellites to Ukraine

3:20:50

or not? Do I Do I

3:20:52

minimize their use of them or not? Is

3:20:54

this should I publicly worry about

3:20:56

World War three or not? Right? He's

3:20:58

doing this shit on Twitter. Right? And

3:21:03

at the same moment, he's doing

3:21:05

these other very impulsive

3:21:09

ill-considered things. And he's

3:21:11

not showing any willingness to really

3:21:13

clean up the messes he makes. He

3:21:16

brings Kanye on knowing he's an

3:21:18

anti semite who's got mental health problems

3:21:20

and then kicks him off for a swastika, which

3:21:23

I probably wouldn't have kicked him off for his swastikas. Like,

3:21:26

that's that's even like, can you really kick

3:21:28

people off for swastikas? Is that something that

3:21:30

you get banned for if

3:21:32

you're a free speech absolutist? If you can't

3:21:35

let a swastika show up, I'm not even

3:21:37

sure that's an enforceable term of

3:21:39

service. Right? There are moments to

3:21:41

use swastikas that are not conveying hate

3:21:43

and not raising their the risk of violence.

3:21:45

Clip that. Yep. Anyway. But

3:21:48

so much of what he's doing given that he's again,

3:21:51

scale matters. He's doing this in front of a hundred and thirty

3:21:53

million people. That's very different than a million people,

3:21:55

and that's very different than a hundred thousand people.

3:21:58

And so when I went off

3:22:00

the tracks with Elon, he was doing this

3:22:02

about COVID. And

3:22:05

again, this was a situation where I tried to

3:22:07

privately mitigate

3:22:09

a friend's behavior, and

3:22:12

it didn't work out very well. Did you try

3:22:14

to correct him sort of highlighting

3:22:17

things he might be wrong on -- Yeah.

3:22:19

-- or did you use the Power of

3:22:21

Love method? Should I write, like, a pamphlet

3:22:24

for Sam Harris to Well, no. But it was it was

3:22:26

totally coming from a place of love because I was

3:22:28

concerned about his reputation. I was concerned

3:22:31

about what he I mean, there was a twofold concern.

3:22:33

I could see what was happening with the tweet.

3:22:35

I mean, he'd he'd had this original tweet that

3:22:37

was I think it was

3:22:40

panic panic over COVID is dumb.

3:22:42

Something like that. Right? This is wait. This is in

3:22:44

March. This is early March

3:22:46

twenty twenty. Oh, super early days.

3:22:49

Super early. So when nobody knew anything,

3:22:51

but we knew we saw what was happening in Italy.

3:22:53

Right? It was totally kicking off. That

3:22:56

was a wild time. That's when the toilet paper

3:22:59

was totally wild. But that became the

3:23:01

most influential tweet on

3:23:03

Twitter for that week. I mean, it had more engagement

3:23:05

than any other tweet more than any crazy

3:23:08

thing Trump was tweeting. I mean, it went viral,

3:23:11

again, but it was just a nuclear

3:23:14

bomb of misinformation.

3:23:17

And I could see that people

3:23:19

were responding to it like, wait

3:23:22

a minute. Okay. Here's this genius technologist.

3:23:25

Who must have inside information about everything.

3:23:27

Right? Surely, he knows something that

3:23:29

is not on the surface about this pandemic.

3:23:32

And they're reading they they were reading into

3:23:34

it. A lot of information that

3:23:36

I knew wasn't there. Right? And I and

3:23:39

I at the time, I didn't even I didn't

3:23:41

think he had any reason to be suggesting

3:23:43

that. think he was just firing off a tweet. Right?

3:23:47

So I reached out to him in private. And mean,

3:23:49

because it was a private text

3:23:52

conversation, I I won't

3:23:55

talk about the details, but I'm just saying, if

3:23:57

that's the case, among

3:23:59

the many cases of friends who have public

3:24:01

platforms and who did something that I thought was

3:24:04

dangerous and ill considered, this

3:24:06

was the case where I I reached out in private

3:24:09

and tried to to

3:24:13

help genuinely help because it was

3:24:15

just I

3:24:17

thought it was harmful in in every sense

3:24:20

because it was being misinterpreted and it was like,

3:24:22

oh, okay. You can say that panic over anything

3:24:24

is dumb, fine. But this was not

3:24:27

how this was landing. This was, like, non

3:24:29

issue, conspiracy. There's

3:24:31

gonna be no COVID in the US. It's

3:24:34

gonna peter out. It's just gonna become a cold. I mean,

3:24:36

that that's how this was getting received. Whereas

3:24:38

at that moment, it was absolutely obvious how big

3:24:41

a deal was gonna be or that

3:24:43

it was gonna at minimum going to be big deal.

3:24:45

I don't know if it was obvious, but it was

3:24:47

it was obvious that it was a high probability

3:24:50

that it could be a big deal. I remember it

3:24:52

was unclear. Like,

3:24:54

how big because there were still stories of

3:24:56

it. Like, it's probably going to, like, the

3:24:58

big concern, the hospitals might get overwhelmed, but it's

3:25:00

gonna -- Yeah. -- be done in, like, two months or something. Yeah. We

3:25:02

didn't know, but it was there was no way

3:25:05

we weren't going to have tens

3:25:08

of thousands of deaths at

3:25:10

a minimum at that point. And and

3:25:13

it was totally rational

3:25:15

to be worried about hundreds of thousands. And

3:25:17

when Nicholas Christakis came on my

3:25:19

podcast very early, know, he

3:25:21

predicted quite confidently that we would have about

3:25:23

a million people. Dead in the US.

3:25:26

Right? And that didn't

3:25:28

seem, you know, it was, you know,

3:25:30

I think appropriately hedged, but it may still it

3:25:32

was just like, okay. It's just gonna you

3:25:35

just look at the we were just kinda riding this

3:25:37

exponential and we're and

3:25:40

it's gonna be, you know, it'd be very surprising not

3:25:42

to have that order of magnitude and

3:25:45

not something much much less. And

3:25:49

so anyway, I mean, to again, to

3:25:51

to close the the

3:25:54

story on Elon. I

3:25:57

could see how this was being received

3:26:00

and I tried

3:26:02

to get him to walk that back,

3:26:04

and then we we

3:26:07

had a fairly long and

3:26:09

detailed exchange on

3:26:12

this issue and that

3:26:14

so that intervention didn't work.

3:26:17

And it was not done, you know, I was not

3:26:19

an asshole. I was not I

3:26:21

was just concerned, you know, for

3:26:23

him, for the world, for and,

3:26:26

you know, And

3:26:29

then there are other relationships where I

3:26:33

didn't take the time. Again, that's an example

3:26:35

where taking the time didn't work, right, privately.

3:26:38

There are other relations where I thought, okay, there's just gonna

3:26:40

be more trouble than it's worth. And so I just

3:26:42

ignored it, you know. And there's a

3:26:44

lot of that. And, frankly, again, I'm

3:26:46

not comfortable with

3:26:49

how this is all netted out because

3:26:51

I don't know...

3:26:54

you know, I'm I'm not, you know, frankly, I'm not comfortable

3:26:56

with how much time this conversation, we've spent

3:26:58

talking about these specific people. Like,

3:27:01

what good is it for

3:27:03

me to to talk about

3:27:05

Elon or Bret or anything. I think there's

3:27:07

a lot of good because... listen,

3:27:09

as a fan, these are the conversations

3:27:12

that I Loved.

3:27:16

Love is a strong word, and it feels like

3:27:18

COVID has robbed the world of these conversations.

3:27:21

Because you are exchanging back and

3:27:23

forth on Twitter, but that's not what I mean by conversations,

3:27:26

like long form discussions, like a debate

3:27:28

about COVID. Like a normal

3:27:31

debate. But there's no there

3:27:33

is no Elon and I shouldn't be debating

3:27:35

COVID. You should be. Here's the thing with

3:27:38

humility like, basically saying,

3:27:40

we don't really know. Like, the Rogen method.

3:27:42

Right. We don't we're just a bunch of idiots.

3:27:44

Like, one is an engineer here.

3:27:47

It's like, and just

3:27:49

kinda okay. Here's the evidence and be, like, normal

3:27:51

people. That's what everybody was doing. The

3:27:53

whole world was, like, trying to figure out what the

3:27:55

hell was happening. Yeah. But the issue was

3:27:57

that at that So at the moment, I had

3:27:59

this collision with him on,

3:28:02

certain things were not debatable. Right?

3:28:04

It was just it was absolutely

3:28:07

clear where this was going. It

3:28:09

wasn't clear how far it was gonna

3:28:11

go or how quickly we would mitigate it, but it

3:28:13

was absolutely clear that it was gonna be an

3:28:15

issue. Right? The the the

3:28:18

the train had come off the tracks in Italy. We

3:28:21

knew we weren't gonna seal our borders.

3:28:24

There were already people, you

3:28:26

know, who there are there are already cases

3:28:29

known to many of us personally in the

3:28:31

US at that point. And

3:28:37

He was operating by a very different logic

3:28:39

that I couldn't engage with.

3:28:41

Sure. But that logic represents a

3:28:43

part of the population. And there's a lot of interesting

3:28:46

topics that have lot of uncertainty around

3:28:48

them, like the effectiveness of masks. Like

3:28:50

Yeah. But no. But where things broke down was

3:28:53

not at the point of, oh, there's

3:28:55

a lot to talk about, a lot to debate. This is all very

3:28:57

interesting, and who knows what's what?

3:28:59

It broke down very early at

3:29:02

this is, you know, there's

3:29:05

nothing to talk about here. Like, it like,

3:29:08

it's, like, either there's a water bottle

3:29:10

on the table or there isn't. Right? Like,

3:29:13

it would technically there's only

3:29:15

one fourth of of a water bottle. So

3:29:18

what what defines the water bottles? Is it water

3:29:20

inside the water bottles? Is it the water bottle?

3:29:22

What I'm giving you is an example of what's worth a conversation.

3:29:25

This is difficult because this is we had an exchange

3:29:28

in private and I wanna -- Sure. Sure. Sure. --

3:29:30

wanna honor not not exposing

3:29:34

the the details of it. But, you

3:29:37

know, the details convinced me

3:29:40

that there was not a follow-up

3:29:42

conversation on that topic on this

3:29:44

topic. Yep. That said, I

3:29:47

hope and I hope to be part

3:29:49

of helping that happen, the the friendship is

3:29:51

rekindled because one of the topics I care

3:29:53

a lot about, artificial intelligence, you're

3:29:57

you've had great public and private conversations

3:29:59

about this topic. And Yes. And

3:30:01

Elon was very formative in my

3:30:03

taking that issue seriously. I mean,

3:30:05

he he and I went to that initial

3:30:09

conference in Puerto Rico together, and

3:30:11

it was only because he was going

3:30:13

and I found out about it through him and I

3:30:15

just rode his coattails to it, you

3:30:17

know, that I that I got to

3:30:20

dropped in that side of the pool to

3:30:23

hear about these concerns at that point.

3:30:25

It it would be interesting to hear

3:30:27

how your concern has

3:30:30

evolved with the coming

3:30:32

out of ChatGPT and

3:30:35

these new large language models

3:30:37

that are fine tuned with reinforcement

3:30:39

learning and seemingly to be able to

3:30:41

do some incredible human like things.

3:30:44

There's two questions. One, how is your concern

3:30:46

in terms of AGI and superintelligence evolved?

3:30:49

Mhmm. And how impressed are you with ChatGPT

3:30:52

as a as a student of the human mind

3:30:54

and mind in general? Well,

3:30:57

my concern about AGI is

3:30:59

unchanged. And so I I did

3:31:01

a I have spoken about

3:31:03

it a bunch of my podcast, but I, you know, I did

3:31:05

a TED talk in twenty sixteen,

3:31:07

which was the the kind of summary

3:31:09

of what that conference

3:31:12

and and, you know, various

3:31:14

conversations I had around that

3:31:16

did to my my brain on on this topic.

3:31:20

Basically, that once

3:31:22

superintelligence is achieved, there's

3:31:25

a takeoff. It becomes exponentially

3:31:27

smarter. And in a matter of

3:31:29

time, we're just ants

3:31:32

and they're gods. Well, yeah, unless

3:31:34

we find some way of permanently

3:31:38

tethering a

3:31:41

superintelligent, self-

3:31:45

improving AI to our value system.

3:31:47

And, you know, I don't believe anyone has

3:31:50

figured out how to do that, or whether that's even possible in

3:31:52

principle. mean, I know people like Stuart Russell who I

3:31:54

just had on my podcast are

3:31:57

oh, really? Have you have you released it?

3:31:59

Haven't released it yet? Okay. He he's been on previous

3:32:01

podcast, but we just recorded this week

3:32:04

because he haven't done an app, I guess, in a while.

3:32:06

So Yeah. Great. Yeah. That's great. He's good person

3:32:08

to talk about alignment with. Yeah. So Stuart,

3:32:10

I mean, Stuart has been, you

3:32:12

know, probably more than anyone, my

3:32:15

guru on this topic. I mean, like, you're just reading his

3:32:17

book and... I think I've done two

3:32:19

podcasts with him at this point. Is it The

3:32:21

Control Problem or something like that?

3:32:23

His book is Human Compatible. Compatible.

3:32:26

Yeah. He talks about the control problem. And

3:32:30

yeah, so I just think the idea that

3:32:32

we can define a value function

3:32:35

in advance that permanently tethers

3:32:38

a a self improving super

3:32:40

intelligent AI to our

3:32:43

values as we continue to

3:32:47

discover them, refine them, extrapolate

3:32:49

them in an open

3:32:51

ended way. I

3:32:53

think that's a tall order. And there think there are

3:32:55

many more ways. There must be many more ways of

3:32:57

designing superintelligence that

3:33:00

is not aligned in that way and is not

3:33:02

ever approximating our

3:33:05

values in that way. So I mean, Stuart's idea

3:33:08

put it in a very simple way

3:33:10

is that he thinks you don't

3:33:12

wanna specify the value function in advance. You

3:33:14

don't wanna imagine you could ever write

3:33:17

the code in such a ways

3:33:19

to admit of no loophole, you

3:33:22

want to make

3:33:24

the AI uncertain as

3:33:26

to what human values are and perpetually

3:33:29

uncertain and always trying to

3:33:32

ameliorate that uncertainty by by hewing

3:33:34

more and more closely to what our values

3:33:37

are. So, like, there's it's always

3:33:39

interested in us saying,

3:33:41

oh, no. No. That's not what we want. That's not what we

3:33:43

intend. Stop doing that. Right? No matter how

3:33:45

smart it gets, all it wants to do is

3:33:47

more perfectly approximate human values.
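To make the idea Sam attributes to Stuart Russell a bit more concrete, here is a minimal toy sketch in Python. It is not Russell's actual formulation (his framing involves assistance games and inverse reward design); the candidate value hypotheses, actions, and thresholds below are invented purely for illustration. The only point is the shape of the loop: the agent never receives a final value function, it keeps a belief over what the human wants, defers when that belief is too uncertain, and updates from the human's corrections.

```python
CANDIDATE_VALUES = {
    # hypothesis about the human -> reward that human would assign to each action
    "wants_tidiness": {"clean_room": 1.0, "play_loud_music": 0.3, "do_nothing": 0.0},
    "wants_quiet":    {"clean_room": -0.2, "play_loud_music": -1.0, "do_nothing": 0.0},
}
ACTIONS = list(next(iter(CANDIDATE_VALUES.values())).keys())
belief = {h: 1.0 / len(CANDIDATE_VALUES) for h in CANDIDATE_VALUES}  # uniform prior

def expected_reward(action):
    """Average reward of `action` under the agent's current belief about the human."""
    return sum(belief[h] * CANDIDATE_VALUES[h][action] for h in belief)

def update_belief(proposed, human_approved, noise=0.1):
    """Bayes update: hypotheses that predict the human's reaction gain weight."""
    for h in belief:
        predicts_approval = CANDIDATE_VALUES[h][proposed] > 0
        belief[h] *= (1 - noise) if predicts_approval == human_approved else noise
    total = sum(belief.values())
    for h in belief:
        belief[h] /= total

def choose(confidence=0.8):
    """Act only when one hypothesis clearly dominates; otherwise check with the human."""
    if max(belief.values()) < confidence:
        return "ask_human"
    return max(ACTIONS, key=expected_reward)

# Simulated run: the human actually "wants_quiet" and corrects the agent's proposal.
TRUE_HUMAN = "wants_quiet"
for step in range(4):
    decision = choose()
    if decision == "ask_human":
        proposal = max(ACTIONS, key=expected_reward)           # best guess so far
        approved = CANDIDATE_VALUES[TRUE_HUMAN][proposal] > 0  # "no, stop that"
        update_belief(proposal, approved)
        print(step, "asked about", proposal, "-> belief", belief)
    else:
        print(step, "acted:", decision)
```

The deference step is the part that matters for the argument: once the human's reaction rules out the "tidiness" hypothesis, this toy agent stops pursuing it rather than optimizing a frozen objective.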

3:33:50

And I think there are a lot of problems

3:33:52

with that, you know, at a high level, I'm not a

3:33:54

computer scientist, so I'm sure there are many problems at

3:33:57

a low level that I don't understand or

3:33:59

like, how to force a human into the loop

3:34:01

always. No matter what. There's that and,

3:34:03

like, what humans get a vote and

3:34:06

just just what does show what

3:34:08

what do humans value and what

3:34:10

is the difference between what

3:34:12

we say we value and our revealed

3:34:14

preferences. Which, if

3:34:17

you were a superintelligent AI

3:34:19

that could look at humanity

3:34:22

now, I think you could be

3:34:24

forgiven for concluding that

3:34:26

what we value is driving

3:34:28

ourselves crazy with Twitter and living

3:34:31

perpetually on the brink of nuclear war

3:34:33

and, you know, just watching, you

3:34:36

know, hot girls in yoga pants on TikTok

3:34:38

again and again again. It's like when you're saying

3:34:40

that is not This is this is all this is all

3:34:42

revealed preference and it's

3:34:45

what is an AI to make of that. Right? And what

3:34:47

should it optimize? Like, so And

3:34:50

part of the this is also

3:34:52

Stuart's observation that one

3:34:54

of the insidious things about, like, the YouTube

3:34:56

algorithm is it it's not that it just caters

3:34:59

to our preferences, it

3:35:01

actually begins to change

3:35:04

us in ways so as to make us more predictable.

3:35:06

It's like it finds ways to make us

3:35:09

a better reporter of our of

3:35:11

our preferences and to trim

3:35:13

our preferences down so that it can

3:35:17

further train on that signal.
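The preference-narrowing dynamic described here can be shown with a toy loop; this is an assumption-laden illustration, not the actual YouTube system. The topic list and the drift rate are made up. The only point is that a greedy recommender plus a small "engagement reinforces interest" effect steadily shrinks the entropy of the user's preference distribution, which is exactly what "making us more predictable" means.

```python
# Toy illustration of a recommender that narrows the preferences it serves.
import math

topics = ["music", "politics", "cats", "chess"]
prefs = {t: 0.25 for t in topics}          # the user's interest distribution
DRIFT = 0.15                               # how much engagement reinforces a topic

def entropy(dist):
    """Shannon entropy in bits: lower means the user is more predictable."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def recommend(estimated_prefs):
    """The platform greedily serves whatever it predicts the user engages with most."""
    return max(estimated_prefs, key=estimated_prefs.get)

for step in range(10):
    shown = recommend(prefs)               # assume the platform knows prefs exactly
    # Watching the recommended topic nudges the user's own interests toward it.
    for t in prefs:
        prefs[t] *= (1 - DRIFT)
    prefs[shown] += DRIFT
    print(f"step {step}: showed {shown:8s} entropy={entropy(prefs):.3f} bits")
```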

3:35:20

So the main concern is that most

3:35:22

of the people in the field seem

3:35:25

not to be taking intelligence

3:35:27

seriously. Like, as

3:35:29

as they design more and

3:35:31

more intelligent machines and as they profess

3:35:34

to want to design true

3:35:36

AGI. They're

3:35:39

not, again,

3:35:41

they're not spending the time that Stuart

3:35:43

is spending trying to figure out how to do this

3:35:45

safely above all. They're

3:35:48

just assuming that these these

3:35:50

problems are gonna solve themselves as we as

3:35:52

we make that final stride into the end

3:35:54

zone or they're saying very you

3:35:58

know, Pollyannaish things like,

3:36:01

you know, an AI would never

3:36:03

form a motive to harm humans. Like,

3:36:05

why would it ever form a motive to to

3:36:08

to be malicious toward humanity,

3:36:11

right, unless we put that motive in there. Right? And

3:36:13

that's that's not the concern. The concern is

3:36:15

that in the presence of of

3:36:18

vast disparities in competence. And

3:36:22

in certainly in a condition

3:36:24

where these the machines are improving themselves. Are

3:36:26

improving their own code. They

3:36:28

could be developing goal

3:36:31

instrumental goals that

3:36:33

are antithetical to our well-being without

3:36:36

without any intent to harm us. Right? It's

3:36:38

analogous to what we do to every

3:36:41

other species on earth. I mean, you and I don't

3:36:44

consciously form the intention to

3:36:47

harm insects on a daily

3:36:49

basis, but there are many things we could

3:36:51

intend to do that would, in

3:36:53

fact, harm insects because,

3:36:56

you know, you decide to repave your driveway

3:36:58

or whatever whatever you're doing. You're like, you're not you're

3:37:00

just not taking the

3:37:02

the interest of insects into account because

3:37:04

they're so far beneath you in

3:37:07

terms of your cognitive horizons.

3:37:09

And so

3:37:11

that the real challenge here is that

3:37:15

if you believe that intelligence, you

3:37:17

know, scales up on a continuum, to

3:37:20

toward heights that we

3:37:22

can only dimly imagine. And I think there's every

3:37:24

reason to believe that. There's just no reason to believe

3:37:26

that we're near the summit of intelligence. And

3:37:30

you can define you know, define maybe

3:37:32

maybe there's maybe there's

3:37:34

some forms of intelligence for which this is not true,

3:37:36

but for

3:37:38

many relevant ones, you know, like

3:37:40

the top hundred things we care about cognitively.

3:37:43

I think there's every reason to believe that many

3:37:45

of those things, most of those things are

3:37:48

a lot like chess or go where

3:37:50

once the machines get better than we are,

3:37:52

they're gonna stay better than we are. Although --

3:37:54

I don't know if you've caught the recent thing with Go.

3:37:56

Where... and this guy actually came out of Stuart's lab.

3:37:59

Yeah. Yeah. Yeah. One one

3:38:01

time a human beat a machine. Yeah.

3:38:04

They found a hack for that. But anyway,

3:38:06

in the ultimately, it's

3:38:08

gonna there's gonna be no looking back. And then the question

3:38:11

is, What

3:38:13

do what do we do in relate in

3:38:15

relationship to these

3:38:18

systems? That are more competent than

3:38:20

we are in every relevant respect. Because

3:38:23

it will be a relationship. It's not like

3:38:25

the peep the people who think we're

3:38:27

just gonna figure this all out, you know,

3:38:31

without thinking about it in advance. It's just gonna

3:38:33

the solutions are just gonna find themselves. They seem

3:38:38

not to be taking the

3:38:40

prospect of really creating autonomous

3:38:44

super intelligent seriously. Like like,

3:38:46

what does that mean? It's every

3:38:49

bit as independent

3:38:51

and ungovernable ultimately. As

3:38:55

us having create I mean, just imagine if we

3:38:57

created a race of people that were

3:38:59

ten times smarter than all of us. Like,

3:39:01

how would we live with those people? They're

3:39:03

ten times smarter than us. Right? Like, they

3:39:05

begin to talk about things we don't understand.

3:39:07

They begin to want things we don't understand. They

3:39:10

begin to view us as obstacles to them

3:39:12

so they're solving those problems or

3:39:14

gratifying those desires, we

3:39:17

become the chickens or the monkeys

3:39:19

in their presence. And I

3:39:22

think that it's

3:39:26

but for some amazing solution

3:39:28

of the sort that Stuart is

3:39:30

imagining that we could somehow anchor

3:39:32

their reward function permanently no

3:39:34

matter how intelligence scales. I

3:39:36

think it's it's

3:39:39

really worth worrying about this. I

3:39:41

do I do buy the the,

3:39:43

you know, the notion

3:39:46

that this is an existential risk if

3:39:48

we don't do it well. I worry

3:39:50

that we don't notice it. I'm

3:39:52

deeply impressed with ChatGPT, and

3:39:55

I'm worried that it will

3:39:57

become super intelligent. These

3:39:59

language models have become super intelligent because

3:40:02

they're basically trained in the collective intelligence

3:40:04

of the human species And

3:40:06

then it'll start controlling our behavior if

3:40:08

they're integrated into our algorithms, the

3:40:11

recommender systems. And then we

3:40:13

just won't notice that

3:40:15

there is a superintelligent

3:40:18

system that's controlling our behavior. Well,

3:40:21

I think that's true even before

3:40:23

far before superintelligence or even before

3:40:25

general intelligence. I mean, I think just the

3:40:27

narrow intelligence of

3:40:31

these algorithms and of what

3:40:34

something like, you know, ChatGPT,

3:40:36

can do. I mean,

3:40:42

it's just far short

3:40:46

of it developing its own

3:40:48

goals

3:40:51

that are at cross purposes with ours.

3:40:53

Just the unintended

3:40:55

consequences of of using

3:40:57

it in the ways we're going to be incentivized

3:41:00

to use it. And and, you know, the the money to be

3:41:02

made from scaling this thing.

3:41:05

And what it does to

3:41:07

to our information space and our sense of

3:41:09

of just being able to get the ground truth

3:41:12

on

3:41:14

any issue. It's... Yeah.

3:41:17

It's it's super scary and it

3:41:19

was it's do you think it's a

3:41:21

giant leap in terms of the development towards

3:41:23

AGI, ChatGPT, or is

3:41:26

this still just an

3:41:28

impressive little toolbox?

3:41:31

So, like, when when do you think the singularity

3:41:34

is coming? Oh, or

3:41:36

is it to you? It doesn't matter. I I

3:41:38

have no intuitions on that front apart from

3:41:40

the fact that if we continue

3:41:42

to make progress, it will come.

3:41:44

Right? So it's just you just have to assume

3:41:47

we continue to make progress. There's only

3:41:49

two assumptions. You you have to assume

3:41:51

substrate independence So there's there's no

3:41:54

reason why this can't be done in silico.

3:41:56

It's it's it's just we can build arbitrarily

3:41:58

intelligent machines.

3:42:02

There's nothing magical about having having

3:42:04

this done in the wetware of our

3:42:06

own brains. I think that is

3:42:08

true, and I think that's, you know, scientifically, parsimonious

3:42:10

to think that that's true. And

3:42:13

then you just have to assume we're gonna keep

3:42:15

making progress. Doesn't have to be any special

3:42:17

rate of progress, doesn't have to be Moore's Law,

3:42:19

can just be we just keep going. At a certain

3:42:22

point, we're gonna be in relationship to

3:42:25

minds. Leaving consciousness

3:42:28

aside, I don't have any

3:42:30

reason to believe that they'll necessarily

3:42:33

be conscious by virtue of being super intelligent,

3:42:35

and that's its own interesting ethical

3:42:38

question. But leaving

3:42:42

consciousness aside, they're

3:42:44

gonna be more they're gonna be more competent than

3:42:46

we are. And then

3:42:48

that's like, you know, that the aliens

3:42:50

have landed. You know, that's literally that's an encounter

3:42:54

with, again, leaving aside the possibility

3:42:57

that something like Stuart's path

3:43:00

is is is actually

3:43:03

available to us. But

3:43:05

it is hard to picture, if

3:43:08

what we mean by intelligence, all things

3:43:10

considered, and it's truly general, if

3:43:14

that scales and,

3:43:18

you know, begins to

3:43:21

build upon itself how you

3:43:25

maintain that

3:43:27

perfect, slavish devotion until

3:43:30

the end of time, that tether.

3:43:33

The tether to humans. -- Yeah. -- I

3:43:35

think my gut says that

3:43:37

that tether is not oh, there's

3:43:39

a lot of ways to do it. So it's

3:43:41

not this increasingly impossible

3:43:43

problem. Right. That so I have

3:43:45

no you know, as you know,

3:43:47

I'm not computer scientist. I have no intuitions

3:43:50

about, algorithmically, how

3:43:52

you would approach that. And

3:43:54

my main intuition is

3:43:57

maybe deeply flawed, but the main intuition

3:43:59

is based on the fact that most of the learning

3:44:02

is currently happening on human

3:44:04

knowledge. So even Chat

3:44:06

GPT is just trained on human data. Right.

3:44:10

I don't see where the leap

3:44:12

happens where you completely go above

3:44:14

human wisdom. The current impressive

3:44:17

aspect of ChatGPT is that it's using

3:44:19

collective intelligence. Of all

3:44:21

of us. Well, from what

3:44:24

I've gathered, again, from people who know much

3:44:26

more about this than I do, I I

3:44:29

think we have reason to be skeptical that

3:44:32

these tech techniques of, you

3:44:34

know, deep learning are actually

3:44:36

going to be sufficient to push us

3:44:38

into AGI. Right? So it's just not they're not

3:44:41

they're not generalizing in the way they need to.

3:44:43

They're not certainly not learning, like,

3:44:46

human children. And so they're they're they're

3:44:49

brittle in strange ways. They're

3:44:51

they're It's

3:44:54

not to say that the path is the only

3:44:56

path, you know, and maybe there's

3:44:59

we might learn better lessons by ignoring

3:45:01

the way brains work, but we

3:45:05

know that they don't generalize and use

3:45:08

abstraction the way we do. And

3:45:12

so Although the they have strange

3:45:14

holes in their competence. But the size of

3:45:16

the holes is shrinking every year. And

3:45:18

that's Mhmm. So the intuition starts

3:45:20

to slowly fall apart. You

3:45:22

know, the intuition is, like, surely, it

3:45:24

can't be this simple. To juice --

3:45:27

Right. -- to intelligence. Yeah. But what

3:45:29

is becoming simpler and simpler. So

3:45:31

I don't know. I don't the progress is

3:45:33

quite incredible. I've been extremely impressed

3:45:35

with ChatGPT and the new models, and

3:45:37

there's a lot of financial incentive to make

3:45:39

progress on this regard. So it's

3:45:42

we're going to be living through some very interesting

3:45:44

times. In

3:45:49

announcing that I'm going to be

3:45:51

talking to you, a lot of people brought up this topic

3:45:53

probably because Eric Weinstein had talked to Joe

3:45:55

Rogan recently and said that he

3:45:57

and you were contacted by folks about

3:46:00

UFOs. Mhmm. Can

3:46:02

you clarify the nature of this contact?

3:46:05

Yeah. Yeah. That you all talked about I've got

3:46:07

very little to say on this. I mean, he has

3:46:09

much more to say. think he think he went down

3:46:12

this rabbit hole further than than I

3:46:14

did, which wouldn't

3:46:17

surprise anyone. He's

3:46:19

got much more of taste for this sort of thing

3:46:21

than I do. But I think we're contacted by the same

3:46:23

person. It wasn't clear

3:46:25

to me who this person was or how this person

3:46:28

got that my cell phone number they

3:46:31

didn't seem it

3:46:34

didn't seem like we were getting punk.

3:46:36

I mean, it's the person seemed credible to me.

3:46:38

And then we're talking to you about the release of different

3:46:41

videos on your Yeah. And this this is when there's

3:46:43

a lot of activity around this. So it was like

3:46:45

there's a big New Yorker article

3:46:47

on the topic, and

3:46:50

there was there was rumors

3:46:53

of congressional hearings, I think, coming.

3:46:55

And then with the

3:46:57

the the videos that were being debunked

3:47:00

or not. And

3:47:02

so this person contacted both

3:47:04

of us, I think around the same time. And I think he

3:47:06

might have contacted Rogan or others. Eric

3:47:08

is just the only person I've spoken to

3:47:10

about it. I think who

3:47:13

I know was contacted. And the

3:47:19

what happened is the person kept

3:47:22

you know, writing a check that he didn't cash.

3:47:25

Like, he he kept saying, okay, next

3:47:27

week, I'm gonna you know, I understand this is sounding

3:47:29

spooky and, you know, you have no reason to really

3:47:32

trust me. But next week, I'm

3:47:34

gonna I'm gonna put you on a Zoom call

3:47:36

with people who you will recognize and they're gonna

3:47:38

be, you know, former heads of the CIA

3:47:40

and, you know, people who just you're gonna

3:47:43

within five seconds of being on the Zoom call, you'll

3:47:45

you'll know this is not a hoax. And

3:47:48

I said, great. Just let me know. Just send me

3:47:50

the Zoom link. Right? And I

3:47:52

went that happened maybe three times.

3:47:54

You know, it was just one phone

3:47:57

conversation and then it was just

3:47:59

like, you know, that's just a bunch of... And

3:48:02

I think Eric spent

3:48:05

more time with this person, and I'm not

3:48:07

I haven't spoken about it. I know he's spoken about

3:48:10

it publicly, but So

3:48:13

I I you know, it's not that my bullshit detector

3:48:16

ever really went off in a big

3:48:18

way. It's just the thing never happened. And

3:48:20

so I I lost interest. So you you

3:48:22

made a comment which is interesting that

3:48:24

you ran which I really appreciate.

3:48:27

You ran a thought experiment of

3:48:29

saying, okay, maybe we do have

3:48:32

alien spacecraft or you just the

3:48:34

thought experiment the aliens did visit.

3:48:36

Yep. And then this very

3:48:38

kinda nihilistic sad thought

3:48:40

that it wouldn't matter. It

3:48:43

wouldn't affect your life. Can you can

3:48:45

you explain that? Well, no. I

3:48:47

was... I think many people noticed

3:48:50

this. I mean, this was a a sign of how

3:48:52

crazy the news cycle

3:48:54

was at that point. Right? Like, we had COVID and

3:48:56

we had Trump, and I forget when the UFO

3:48:58

thing was really kicking off, but it

3:49:01

just seemed like no one had the bandwidth

3:49:04

to even be interested in this. It's like, I

3:49:06

was amazed to notice

3:49:09

in myself that I

3:49:11

wasn't more interested in finding

3:49:14

out what was going on. And

3:49:16

I considered, okay, wait a minute.

3:49:20

If this is true, this

3:49:22

is the biggest story in anyone's lifetime.

3:49:24

I mean, contact with alien intelligence

3:49:28

is by definition, the biggest story

3:49:30

in in anyone's lifetime in in human

3:49:32

history. Why

3:49:35

isn't this just totally

3:49:38

captivating. And it it not only was

3:49:40

it not totally captivating, it would just

3:49:42

barely rise into the level of

3:49:45

my being able to pay attention to it. And

3:49:48

I view that, I mean, one as a,

3:49:53

to some degree, a an understandable

3:49:55

defense mechanism against the the

3:49:58

the bogus claims that that have

3:50:00

been made about this kind of thing in the past.

3:50:05

You know, the the general sense is probably bullshit

3:50:07

or probably has some explanation that

3:50:09

is, you know, purely terrestrial and not

3:50:11

surprising. And there was there's there's

3:50:13

somebody who what's

3:50:15

his name? Is it Mick West, I think? Is

3:50:18

it the YouTuber? Yeah. Yeah. He debunks

3:50:20

them. Yeah. I don't... I mean, you know,

3:50:22

I have since seen some of those videos.

3:50:24

mean, now now this is going back still

3:50:27

at least a year, but Some of those videos

3:50:29

seemed like fairly credible debunkings of

3:50:31

some of the optical evidence.

3:50:35

And I'm surprised we haven't seen more

3:50:38

of that. Like, there was a a fairly

3:50:40

credulous sixty minutes piece that came out

3:50:42

around that time, looking at some of that video,

3:50:44

and it was a very video that he was debunking

3:50:47

on YouTube. And, you know, his his video only

3:50:49

had, like, fifty thousand views

3:50:51

on it or whatever. But

3:50:53

again, it seemed like a fairly credible debunking.

3:50:56

haven't seen debunkings of his debunkings. But

3:50:59

I think there are. But he's basically saying that

3:51:01

there are possible

3:51:03

explanations for it. Right. And usually in

3:51:05

these kinds of context, if there's a possible explanation,

3:51:08

even if it seems unlikely, is

3:51:10

going to be more likely than

3:51:12

an alien civilization visiting

3:51:15

us. Yeah. It's extraordinary claims require

3:51:17

extraordinary evidence principle, which I think

3:51:19

is generally true.
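The "extraordinary claims" point can be phrased as a quick Bayes calculation. The numbers below are assumptions chosen only to illustrate the structure: with a very small prior on the alien hypothesis, even evidence that is nearly twenty times likelier under that hypothesis than under a mundane one leaves the posterior tiny.

```python
# Back-of-the-envelope Bayes update with made-up numbers, not real estimates.
prior_alien = 1e-6            # assumed prior that a given sighting is an alien craft
p_video_given_alien = 0.9     # assumed chance of such footage if it really is one
p_video_given_mundane = 0.05  # sensor artifacts, parallax, balloons, etc.

posterior_alien = (p_video_given_alien * prior_alien) / (
    p_video_given_alien * prior_alien
    + p_video_given_mundane * (1 - prior_alien)
)
print(f"posterior ~ {posterior_alien:.2e}")  # still on the order of 1 in 50,000
```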

3:51:21

Well, with aliens, I think, generally, I

3:51:24

think there should be some humility about what

3:51:26

that would look like when they show up. But

3:51:28

I tend to think they're already here. The amazing

3:51:31

thing about this AI conversation though was that we're

3:51:33

talking about a circumstance. We're we

3:51:36

would be designing the aliens.

3:51:38

Yeah. And they would and there's

3:51:40

every reason to believe that eventually this

3:51:43

is gonna happen. Like I said, I'm not at all

3:51:45

skeptical about the

3:51:47

the coming reality of the aliens that we're

3:51:49

gonna build them. Now here's the thing. Does

3:51:51

this apply to when superintelligence shows

3:51:53

up? Will this be

3:51:55

trending on Twitter for a day?

3:51:57

And then we'll go on to complain about

3:52:00

something Sam Harris once again said in his podcast.

3:52:03

Sure thing. You trended... you

3:52:06

trended on Twitter even though you're not on

3:52:08

Twitter. It's great. Yeah. I

3:52:10

don't I haven't noticed. mean, I did

3:52:13

I did notice when I was on, but

3:52:16

So you have this concern about

3:52:20

AGI, basically, the same kind of thing

3:52:22

that we would just look the other way. Is there something

3:52:24

about this time where even

3:52:26

like World War three, which has

3:52:28

been throwing around very casually, conceivably

3:52:32

so, even that the

3:52:34

news cycle wipes that away. Yeah. Well,

3:52:36

I think we have this this

3:52:41

general problem that we can't make

3:52:46

certain information, even,

3:52:49

you know, unequivocally

3:52:51

certain information, emotionally

3:52:56

salient. Like like we we

3:52:58

we respond quite readily to certain things.

3:53:01

I mean, as we talked about, we we

3:53:03

respond to the the

3:53:05

little girl who fell down a well. I mean, that just

3:53:07

that gets hundred percent of our emotional resources.

3:53:10

But the abstract probability

3:53:14

of nuclear war Right? Even a

3:53:16

high probability, even just even an intolerable

3:53:18

probability, even if we put it at thirty

3:53:22

percent. Right? You know, like, it's

3:53:25

just like that's that's

3:53:27

a Russian roulette with a, you know, a gun with three

3:53:29

chambers and, you know, it's aimed at the

3:53:31

heads not only your head, but your kid's head and

3:53:33

everyone's kid's head, and it's it's just twenty

3:53:36

four hours a day.
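A quick bit of arithmetic behind the Russian-roulette image: if the per-year probability of a nuclear exchange is p, the chance of at least one over N years is 1 - (1 - p)^N, which compounds to alarming levels even for small p. The per-year values below are placeholders for illustration, not the estimates of the people Sam mentions.

```python
# Cumulative risk from a constant annual probability (illustrative numbers only).
for annual_p in (0.005, 0.01, 0.02):
    for years in (30, 70):
        cumulative = 1 - (1 - annual_p) ** years
        print(f"annual risk {annual_p:.1%} over {years} years -> {cumulative:.0%}")
```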

3:53:39

And I think people who lived with this pre-

3:53:41

Ukraine. I think the people who have made it

3:53:43

their business to, you

3:53:46

know, to think about the

3:53:48

risk of nuclear war and to mitigate

3:53:50

it, people like Graham Allison or

3:53:52

William Perry, I mean,

3:53:56

they were putting, like, the

3:53:58

ongoing risk. I mean, just the risk that

3:54:00

we're gonna have a proper

3:54:02

nuclear war at some point

3:54:04

in the, you know, the next generation,

3:54:08

people are putting it at, you know, something like fifty

3:54:10

percent. Right, that we're living with this

3:54:12

sword of Damocles over our

3:54:14

heads. Now, you might

3:54:17

wonder whether anyone

3:54:19

can have reliable intuitions about the probability

3:54:21

of that kind of thing. But the

3:54:24

the status quo is truly

3:54:26

alarming. I mean, we've got you know,

3:54:28

we've got ICBMs... I mean, leave aside

3:54:31

smaller exchanges and, you know,

3:54:33

tactical nukes and how that could how we could have

3:54:35

a world war you know,

3:54:38

based on, you know, incremental changes.

3:54:40

We've got the

3:54:44

biggest bombs aimed at the biggest

3:54:46

cities in both directions. And

3:54:49

it's old technology. Right?

3:54:52

And it's, you know, and it's

3:54:54

vulnerable to some

3:54:56

lunatic deciding to launch or

3:54:59

misreading, you know, bad data.

3:55:01

And we know we've been saved from nuclear

3:55:03

war. I

3:55:06

think at least twice by,

3:55:10

you know, Soviet submarine

3:55:12

commanders deciding, I'm not gonna pass this

3:55:14

up the chain of command. Right? Like,

3:55:16

this is this

3:55:19

is almost certainly an error. And it turns

3:55:21

out it was an error. And it's like... and

3:55:24

we we need people to

3:55:27

I mean, in that particular case, like he saw, I

3:55:29

think it was five what

3:55:31

seemed like five missiles launched from

3:55:33

the US to Russia. And he he reasoned

3:55:36

that if America was gonna engage in a

3:55:38

first strike, they'd launch more than five missiles.

3:55:41

Right? So this has to be an error,

3:55:44

and then he waited long enough to decide that

3:55:46

it was an error. But the probability

3:55:48

of of a nuclear

3:55:51

war happening by mistake or

3:55:54

some other species of inadvertent

3:55:56

you know, misunderstanding technical

3:56:01

failure. That's intolerable.

3:56:04

Forget about the the intentional

3:56:06

use of it by by people who are,

3:56:09

you know, driven crazy

3:56:11

by some ideology. And

3:56:14

more and more technologies enable

3:56:16

a kind of scale of destruction. And

3:56:18

misinformation plays into this

3:56:20

picture in a way that is

3:56:23

especially scary. I mean, once

3:56:25

you can get a deep fake of, you

3:56:28

know, the any current president of the the United

3:56:30

States claiming to have launched a first strike,

3:56:33

you know, and just, you

3:56:36

know, send that everywhere. Mhmm. But that could change

3:56:38

the nature of truth, and then we that

3:56:40

might change the the

3:56:43

engine we have for skepticism is

3:56:46

sharpening it. The more you get

3:56:48

people you might have AI and and

3:56:50

digital watermarks that help us.

3:56:53

Maybe we'll not trust any

3:56:55

information that hasn't come through.

3:56:58

Specific channels. Right? I mean, so,

3:57:01

like, in in my world, it's

3:57:04

like, I no

3:57:06

longer feel the the need to,

3:57:09

you know, respond to anything other than

3:57:12

what I put out in in my

3:57:14

channels of information. It's like there's there's so much

3:57:16

there are so many people who have clipped stuff of

3:57:18

me that shows the opposite

3:57:21

of what I was actually saying in context. I mean, the

3:57:23

people have, like, edited my podcast audio

3:57:25

to to make it seem like a I said the opposite

3:57:27

of what I was saying. Like, unless

3:57:29

I put it out, you know, you

3:57:31

can't be sure that I actually said it, you

3:57:33

know. I mean, it's it's just But

3:57:37

I don't know what it's like to live like

3:57:40

that for all forms of

3:57:42

information. And,

3:57:45

strangely, I think it may require

3:57:47

a a greater

3:57:49

siloing of information in

3:57:51

the end. You know? It's like it it it it's

3:57:55

we're living through this sort of Wild West

3:57:57

period where everyone's got a newsletter and everyone's

3:57:59

got a blog and everyone's got an opinion. But

3:58:02

once you can fake everything, there might be

3:58:04

greater value for expertise -- Yeah. -- for

3:58:07

experts, but a more rigorous system

3:58:09

for identifying who the experts are. Yeah. Or just

3:58:12

or just knowing that, you know, it's gonna be

3:58:14

an arms race to authenticate

3:58:17

information. So it's like, what

3:58:19

if you can never trust a photograph --

3:58:22

Yeah. -- unless it has been

3:58:24

vetted by Getty images because

3:58:26

only Getty images has the resources to

3:58:29

to authenticate the provenance

3:58:31

of that photograph and attest

3:58:33

that it hasn't been meddled with by AI.

3:58:37

And again, I don't even know if that's technically possible. I

3:58:39

mean, maybe the whatever the tools available

3:58:43

for this will be, you know, commodified

3:58:45

and the cost will be driven

3:58:47

to zero so quickly that everyone will be

3:58:49

able to do it, you know. It could be like encryption.
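The authentication arms race Sam sketches comes down to attaching cryptographic provenance to published media. Below is a bare-bones Python illustration using only the standard library: the publisher computes a tag over the exact published bytes, and any later alteration breaks verification. Real provenance systems (the kind a Getty-scale publisher would actually deploy) use public-key signatures and signed metadata rather than a shared-secret HMAC; this is only the simplest runnable stand-in for the tamper-detection step, and the key and bytes here are made up.

```python
# Minimal tamper-detection sketch: a keyed tag over the published bytes.
import hmac
import hashlib

PUBLISHER_KEY = b"demo-key-held-by-the-publisher"   # stand-in secret for the sketch

def tag(image_bytes: bytes) -> str:
    """Produce an authentication tag over the exact published bytes."""
    return hmac.new(PUBLISHER_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, claimed_tag: str) -> bool:
    """True only if the bytes are exactly what the publisher tagged."""
    return hmac.compare_digest(tag(image_bytes), claimed_tag)

original = b"\x89PNG...original pixels..."
published_tag = tag(original)

print(verify(original, published_tag))                        # True
print(verify(original + b"one altered byte", published_tag))  # False
```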

3:58:52

But it will be proven and tested

3:58:54

most effectively first, of course, as always,

3:58:57

in porn. Yeah. Which is where

3:58:59

most of human innovation technology happens

3:59:01

first. Well,

3:59:04

I have to ask because Ron Howard, the director

3:59:06

asked this on Twitter since we're

3:59:08

talking about the threat of nuclear war

3:59:10

and otherwise, he asked, I'd be interested

3:59:12

in both your expectations for human society

3:59:15

if when we move beyond Mars. Will

3:59:18

those societies be industrial based? How

3:59:21

will it be governed, how will criminal

3:59:23

infractions be dealt with? What

3:59:26

have you read or watched in sci-fi that comes closest

3:59:28

to sounding logical? Do you think about

3:59:30

our society beyond Earth? If

3:59:33

we colonize Mars, if we colonize space?

3:59:35

Yeah. Well, I I think I have a pretty humbling

3:59:39

picture of that. I mean, so because we're

3:59:41

still gonna be the apes that we are. So

3:59:44

when you when you imagine colonizing Mars,

3:59:47

you have to imagine a first fist

3:59:49

fight on Mars. Yeah. You have to imagine first

3:59:51

murder on Mars. Also the first --

3:59:54

Yeah. -- extramarital affair on Mars.

3:59:57

Right? So it's it's gonna get really

4:00:00

homely and boring really fast, I

4:00:02

think. You know, it's like only the space suits

4:00:05

or... the exigencies

4:00:08

of just living in that atmosphere

4:00:11

or lack thereof will

4:00:13

limit how badly we

4:00:15

can behave on Mars. But do you think most

4:00:17

of the interaction will still be in meatspace

4:00:19

versus digital? Do you think there'll be

4:00:22

do you think we're, like, living through a

4:00:24

a transformation of of a kind where

4:00:27

we're going to be doing more and more interaction in

4:00:29

digital space? Like, everything

4:00:31

we've been complaining about Twitter. Is

4:00:34

it possible that Twitter is just the early days

4:00:36

of a broken system that's actually giving

4:00:39

birth to a better working system that's ultimately

4:00:41

digital. I

4:00:43

think we're gonna experience

4:00:48

a pendulum swing back into the

4:00:50

real world. I mean, think many of us are experiencing

4:00:53

that now anyway. mean, just just wanting

4:00:55

to have face

4:00:57

to face encounters and

4:00:59

spend less time on our phones and less time

4:01:01

online. I think I think you know,

4:01:03

maybe everyone isn't going in that direction, but

4:01:07

I do notice it myself

4:01:09

and I notice I mean,

4:01:11

once I got off Twitter, then I noticed the

4:01:13

people who were never on Twitter. Right? And I

4:01:15

know and and the people who were never basically,

4:01:18

I mean, I know I have a lot of friends who were never on Twitter.

4:01:20

Yeah. They and they actually never

4:01:22

understood what I was doing on Twitter. It's like,

4:01:24

they just like, it it wasn't that they were seeing

4:01:26

it and then reacting

4:01:29

to it. They just didn't know it's like

4:01:33

it's like being on it's like I'm not on Reddit either,

4:01:35

but I don't spend any time thinking about not being

4:01:37

on Reddit. Right? I'm just not on Reddit.

4:01:40

Do you think the pursuit of human happiness is

4:01:42

better achieved, more effectively achieved outside

4:01:46

of Twitter world? Well,

4:01:49

I think all we have is our attention

4:01:52

in the end, and we we just have to notice

4:01:54

what these various tools are

4:01:56

doing to it. And it's

4:01:59

just it it became very clear

4:02:01

to me that it

4:02:03

was an unrewarding use of my attention.

4:02:05

Now it's not to say there isn't some digital

4:02:08

platform that's conceivable that would be

4:02:11

useful and rewarding.

4:02:14

But Yeah.

4:02:17

I mean, we we just have you know,

4:02:20

our life has doled out to us in moments,

4:02:22

and we and we have and we're continually

4:02:25

solving this riddle of what

4:02:27

is gonna suffice to make this

4:02:29

moment engaging and

4:02:32

meaningful and aligned

4:02:35

with who I wanna be

4:02:37

now and how I want the future to

4:02:39

look. Right? We're all... we have this tension

4:02:41

between being in

4:02:44

the present and becoming in the future.

4:02:46

And you

4:02:49

know, it's a seeming paradox. Again, it's

4:02:51

not really a paradox, but it can seem like,

4:02:53

I do think the ground truth for

4:02:56

personal well-being is

4:02:58

to find a mode of being where you

4:03:01

can pay attention to

4:03:04

the present moment, and this is, you know, meditation

4:03:06

by another name. You can

4:03:08

pay attention to the present moment with

4:03:11

sufficient, you know, gravity that

4:03:14

you recognize that that

4:03:17

just consciousness itself in the present moment,

4:03:19

no matter what's happening, is

4:03:21

already a circumstance of freedom

4:03:24

and and contentment and

4:03:26

and tranquility. Like, you can be happy

4:03:28

now before anything happens.

4:03:31

Before this next desire gets gratified, before

4:03:34

this next problem gets solved, there's

4:03:36

there's this kind of ground truth that that

4:03:38

you're free, that consciousness is free and

4:03:40

open, and unencumbered by

4:03:43

really any problem until you get

4:03:45

lost in thought about all the problems that

4:03:48

may yet be real for you.

4:03:50

So the ability to catch and observe

4:03:52

consciousness that in itself

4:03:55

is a source of happiness. Without being lost

4:03:57

in thought. And so... this happens

4:04:00

haphazardly for people who don't meditate

4:04:02

because they find something in their life that's so

4:04:05

captivating It's so pleasurable,

4:04:07

it's so thrilling. It it can

4:04:09

even be scary, but even

4:04:12

being scared, it's captivating. Like, so it gets

4:04:14

it's it's gets their attention. Right? Whatever it

4:04:16

is. Like, you know, Sebastian Junger,

4:04:19

you know, wrote a great book

4:04:21

about people's experience in

4:04:23

war here. You know? It's like like, you

4:04:25

can be strangely can be the best experience

4:04:27

anyone's ever had because everything it's like

4:04:30

only the moment matters. Right? Like, the the

4:04:32

bullet is whizzing by your head, you're

4:04:34

not thinking about your your

4:04:36

401K or that thing that you

4:04:38

didn't say last week to the person you shouldn't

4:04:41

have been talking about. You're not thinking about Twitter.

4:04:43

It's like you're just fully

4:04:45

immersed in the present moment. Meditation

4:04:49

is the only way... I mean,

4:04:52

that word can mean many things to many people.

4:04:54

But what I mean by meditation is simply

4:04:56

the discovery that there is a

4:05:00

way to engage

4:05:02

the present moment directly regardless

4:05:04

of what's happening. You don't need to be in a war. You don't

4:05:06

need to be having sex. You don't need to be on

4:05:08

drugs. You don't need to be surfing. You don't need

4:05:11

anything. It doesn't have to be a peak experience. It

4:05:13

can be completely ordinary, but you can

4:05:15

recognize that in some basic

4:05:17

sense, there's only this and

4:05:20

and everything else

4:05:22

is something you're thinking. You're thinking

4:05:24

about the past. You're thinking about the future.

4:05:27

And thoughts themselves have

4:05:29

no substance. Right? It's kind of mysterious

4:05:32

that any thought ever

4:05:34

really commandeers your sense

4:05:36

of who you are and and makes you

4:05:38

anxious or afraid or or angry

4:05:40

or whatever it is. And

4:05:43

the more you discover that, the

4:05:45

half life of all these negative emotions that

4:05:47

blow all of us around get

4:05:49

much much shorter. Right? And you can you can

4:05:51

literally just you

4:05:53

know, the the anger that would have kept

4:05:55

you angry for hours or days

4:05:58

lasts, you know, four seconds

4:06:01

because you just the moment it arises,

4:06:03

you recognize it and you can get off that. You can decide

4:06:05

at minimum, you can decide whether it's useful

4:06:07

to to stay angry at that moment.

4:06:10

And, you know, obviously, it usually isn't.

4:06:12

And the illusion of free will is one of those

4:06:14

thoughts. Yeah. It's all just happening.

4:06:16

Right? Like, even the mindful and

4:06:19

meditative response to this is just happening

4:06:21

happening. It's just, like, even the

4:06:24

moments where you recognize or not recognize

4:06:26

is just happening. It's not that this

4:06:29

does open up degree of freedom for a person,

4:06:32

but it's not a freedom that gives any

4:06:34

motivation to the notion of free will. It's just

4:06:36

a new way

4:06:38

of being in the world. Is there a difference

4:06:40

between intellectually knowing

4:06:42

free will's an illusion and -- Yeah. --

4:06:44

really experiencing it? Yeah. Yeah. What's the

4:06:47

what's the longest you've been able to experience?

4:06:49

The escape, the illusion, if

4:06:52

you will? Well, it

4:06:54

is... it's always obvious to me when I

4:06:56

pay attention. I mean, when whenever

4:06:59

I'm mindful, this is the the

4:07:01

term of jargon, you know, in the Buddhist and

4:07:04

and increasingly, you know, outside the Buddhist context

4:07:06

is is mindfulness. Right? But they're sort

4:07:08

of different levels of mindfulness and there's

4:07:11

there's different degrees

4:07:14

of insight into this. But, yeah. So what I'm

4:07:16

calling evidence of lack of

4:07:18

free will and lack of, you know, lack of

4:07:20

the self. They're two sides of

4:07:22

the same coin. There's a sense of being

4:07:25

a subject in the middle of experience to

4:07:28

whom all experience refers --

4:07:30

Mhmm. -- a sense of I, a sense of me,

4:07:32

and that's

4:07:34

almost everybody's starting point when they

4:07:36

they start to meditate and that's almost

4:07:38

always the place people live most

4:07:41

of their lives from. I do think that

4:07:43

gets interrupted in ways that go unrecognized.

4:07:45

I think people are constantly losing the sense

4:07:47

of I. They're losing the sense of subject

4:07:49

to object distance. But they're not

4:07:51

recognizing it. And and meditation

4:07:54

is the mode

4:07:56

in which you can recognize. You can you

4:07:58

can both consciously precipitate it you can

4:08:01

look for the self and fail to find it

4:08:03

and then recognize its

4:08:05

absence. And that's

4:08:07

just the flip side of the coin of free

4:08:09

will. I mean, the the

4:08:11

the feeling of having free will is

4:08:13

what it feels like to feel like a self

4:08:16

who's thinking his thoughts and doing

4:08:18

his actions and intending his intentions.

4:08:21

And he the the man

4:08:23

in the middle of the boat who's rowing that's

4:08:27

the start that's the false starting

4:08:29

point. When you find that there's no one in the middle

4:08:32

of the boat, right, or in fact, there's no boat,

4:08:34

there's just the river, there's just the

4:08:36

flow of experience, and

4:08:38

there's no center to it, and there's no place

4:08:40

from which you would control it. Again,

4:08:43

even when you're doing thing, this

4:08:45

does not negate the difference between voluntary

4:08:47

and involuntary behaviors. Like, I

4:08:49

can voluntarily reach for this,

4:08:51

but when I'm paying attention,

4:08:54

I'm aware that everything

4:08:56

is just happening. Like, just the intention to

4:08:58

move just arises. Right?

4:09:00

And I'm in no position.

4:09:03

to know why it didn't arise a

4:09:05

moment before or a moment later

4:09:07

or, you know, fifty

4:09:10

percent stronger or weaker or, you know,

4:09:12

so as to be ineffective or to be

4:09:14

doubly effective, or whether I lurched for

4:09:16

it versus moved slowly. I mean, I

4:09:18

can

4:09:21

never run the counterfactuals. I can never

4:09:24

I mean, all of this opens the

4:09:26

door to an

4:09:29

even more disconcerting picture along the same

4:09:31

lines, which subsumes

4:09:33

this conversation about free will and it's the

4:09:36

question of whether anything

4:09:41

is ever possible. Like,

4:09:43

what if... This is a question I haven't

4:09:46

thought a lot about, but it's

4:09:48

been a few years I've been kicking this question

4:09:50

around. So

4:09:54

I mean, what if only the actual

4:09:57

is possible? What if...

4:09:59

So we live with

4:10:02

this feeling of possibility. We live with a sense

4:10:04

that... Let

4:10:08

me take a... so, you know, I have two

4:10:10

daughters. I could have

4:10:12

had a third child. Right? So,

4:10:15

what does it mean to say that I could've had

4:10:17

a third child? Right? So, you don't have kids?

4:10:19

I don't think so. Not that I

4:10:21

know of. Yes. So the possibility might

4:10:23

be -- Right. So what do we mean

4:10:25

when we say you

4:10:28

could have had a child or

4:10:31

you might have a child

4:10:33

in the future. Like, what

4:10:36

is that space? What's the relationship

4:10:38

between possibility and actuality

4:10:41

and reality? Is there a reality in

4:10:43

which non

4:10:46

actual things are

4:10:48

nonetheless real? And

4:10:51

so we have other categories of, like, non

4:10:54

concrete things. We have things that don't have

4:10:56

spatial, temporal dimension but

4:10:59

they nonetheless exist.

4:11:01

So, like, you know,

4:11:03

the integers. Right? So numbers. There's

4:11:07

a reality. There's an

4:11:09

abstract reality to numbers.

4:11:11

And this is philosophically interesting to think

4:11:13

about these things. So, like, in

4:11:16

some sense, they're

4:11:19

real and they're

4:11:21

not merely invented by us. They're discovered

4:11:24

because they have structure that we can't impose

4:11:26

upon them. Right? They're not fictional

4:11:28

characters like, you know, Hamlet and

4:11:30

Superman, which also exist

4:11:32

in some sense, but at the level

4:11:34

of our own fiction

4:11:37

and abstraction. But it's like, there are

4:11:40

true and false statements you can make

4:11:42

about Hamlet. There are true and false statements

4:11:44

you can make about Superman because

4:11:46

the fictional worlds we've

4:11:49

created have a certain kind of structure. But, again,

4:11:51

this is all abstract. Like, it's all

4:11:53

abstractible from any of its concrete instantiations.

4:11:56

It's not just in the comic books and just in

4:11:58

the movies. It's in our, you

4:12:00

know, ongoing ideas about these

4:12:03

characters. But natural

4:12:05

numbers or the

4:12:07

integers don't function

4:12:10

quite that way. I mean, they're similar, but they also

4:12:12

have a structure that's purely a matter of

4:12:14

discovery. You can't just make up

4:12:17

whether numbers are prime. You know,

4:12:19

If you give me two integers, you know,

4:12:22

of a certain size. Let's say you

4:12:24

mention two enormous integers. If

4:12:27

I were to say, okay. Well, between those two integers,

4:12:29

there are exactly eleven prime numbers.
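A claim of that form is, at least for modestly sized numbers, mechanically checkable, which is part of what makes it a determinate matter of fact. The sketch below is only an illustration of that point, not anything from the conversation; the bounds in it are arbitrary, and the trial-division test is just the simplest thing that works, not what a number theorist would actually use for enormous integers.

```python
# Minimal sketch: counting the primes strictly between two integers by trial division.
# The bounds below are arbitrary illustrative values, not numbers from the conversation.

def is_prime(n: int) -> bool:
    """Return True if n is prime (simple trial division; fine for small n)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def primes_between(lo: int, hi: int) -> int:
    """Count primes p with lo < p < hi."""
    return sum(1 for n in range(lo + 1, hi) if is_prime(n))

if __name__ == "__main__":
    lo, hi = 1_000, 1_100          # arbitrary example bounds
    print(primes_between(lo, hi))  # a claim like "exactly eleven primes lie between them" is simply true or false
```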

4:12:32

Right? That's a very specific claim

4:12:34

about which I can be right or wrong. Whether

4:12:36

or not anyone knows I'm right or wrong. It's like that's

4:12:39

just there's a domain of facts there, but these

4:12:41

are abstract. It's an abstract reality that

4:12:44

relates in some way that's philosophically

4:12:46

interesting, you know, metaphysically interesting to

4:12:49

what we call real reality, you know, the

4:12:51

spatiotemporal order, the

4:12:54

physics of things. But possibility,

4:12:59

at least in my view, occupies a different

4:13:01

space. And this is something, again,

4:13:03

my thoughts on this are pretty inchoate. And

4:13:05

I think I need to talk to

4:13:08

a philosopher of physics

4:13:10

or a physicist about how this may interact with

4:13:13

things like the many worlds interpretation of quantum mechanics.

4:13:15

Yeah. That's an interesting one. Right. Right. Exactly.

4:13:17

So I

4:13:19

wonder if discoveries in physics

4:13:21

like, further proof or more concrete proof

4:13:23

that the many worlds interpretation of quantum mechanics

4:13:26

has some validity -- Right. --

4:13:28

if that completely starts to change things.

4:13:30

But even that's just more actuality.

4:13:34

So if I took that seriously -- Sure.

4:13:36

-- that's a case of

4:13:39

And the truth is, that happens even if

4:13:42

the many worlds interpretation isn't true, but

4:13:44

we just imagine we have a physically infinite

4:13:47

universe, the implication

4:13:50

of infinity is such that things

4:13:52

will begin to repeat themselves. Yeah. You

4:13:54

know, the farther you go in space. Right? So

4:13:56

you know, if you just

4:13:58

head out in one direction, eventually, you're gonna meet

4:14:00

two people just like us having a conversation

4:14:03

just like this, and you're gonna meet them an infinite

4:14:05

number of times in every, you

4:14:07

know, infinite variety of permutations

4:14:10

slightly different from this conversation. Right? So,

4:14:12

I mean, infinity is just so big

4:14:14

that our intuitions of probability completely

4:14:16

break down. But what I'm suggesting

4:14:18

is maybe

4:14:20

probability isn't a thing. Right?

4:14:23

Maybe there's only actuality.

4:14:25

Maybe there's only what happens,

4:14:28

and at every point along the way,

4:14:31

our notion of what could have happened or what

4:14:33

might have happened, is just that. It's

4:14:35

just a thought about what

4:14:37

could have happened or might have happened. So it's a

4:14:39

fundamentally different thing. Just because you can imagine

4:14:41

a thing doesn't make it real.

4:14:44

Because that's where that

4:14:46

possibility exists. That's in your imagination.

4:14:49

Right? Yeah. And possibility itself

4:14:51

is a kind of spooky idea because

4:14:53

it too has a sort of structure.

4:14:56

Right? So, like, if I'm gonna say, you

4:15:03

know, you could

4:15:05

have had a daughter, right,

4:15:07

last year. So

4:15:12

we're saying that's possible, but

4:15:15

not actual. Right?

4:15:18

That is a claim. There are things that

4:15:21

are true and not true about

4:15:24

that daughter. Right? Like, it has a kind of structure.

4:15:26

It's like there's a lot

4:15:28

of fog around that

4:15:31

the possibility. It feels almost

4:15:33

like a useful narrative. But what does

4:15:35

it mean? So, like, what

4:15:37

does it mean? If we say, you

4:15:39

know, I just did that, but

4:15:42

it was conceivable that I wouldn't have done

4:15:44

that. Right? Like, it's possible that I just threw

4:15:46

this cap, but -- Right. -- I

4:15:48

might not have done that. So you're taking it

4:15:50

very temporally close to the original. Like,

4:15:53

what appears as a decision? Whenever

4:15:55

we're saying something is possible, but not

4:15:58

actual. Right? Like, this thing just happened,

4:16:01

but it's conceivable, it's possible

4:16:03

that it wouldn't have happened, or it would have happened differently.

4:16:07

In what does that possibility

4:16:10

consist? Like, where is that? What would it take

4:16:13

for that to be real, for the possibility

4:16:15

to be real, what

4:16:18

claim are we making about

4:16:21

the universe? Well, isn't that an extension

4:16:23

of the idea that free will is an illusion

4:16:25

that all we have is actuality, that the possibility

4:16:28

isn't real? Right. Yeah. I'm just extending it beyond

4:16:30

just human action. Like,

4:16:33

this goes to the physics of things. This

4:16:35

is just everything. Like, we're always

4:16:37

telling ourselves a story. Yeah. That

4:16:39

includes possibility. Possibility

4:16:41

is really compelling for some reason. Well,

4:16:45

yeah, because, I mean,

4:16:47

yeah, this could sound

4:16:49

just academic, but every

4:16:52

backward looking regret

4:16:54

or disappointment and every

4:16:56

forward looking worry is

4:17:00

completely dependent on this notion

4:17:02

of possibility. Like, every regret

4:17:04

is based on the sense that I could have

4:17:06

done something else. Something else could have happened.

4:17:10

Every disposition to worry about

4:17:12

the future is

4:17:14

based on the feeling that there's this range

4:17:16

of possibilities. It could go either way.

4:17:18

And you

4:17:21

know, I mean, whether or not there's

4:17:24

such a thing as possibility, you know, I'm convinced that

4:17:26

worry is almost never psychologically

4:17:30

appropriate because the reality is in any

4:17:32

given moment, either you can do

4:17:34

something to solve the problem you're worried about or

4:17:36

not. So if you can do something, just do it.

4:17:39

You know? And if you can't, your worry is just

4:17:41

causing you to suffer twice over. Right?

4:17:44

You're going to get the medical procedure next

4:17:46

week anyway. How much time between

4:17:49

now and next week do you want to spend worrying about

4:17:51

it, right? The worry

4:17:53

doesn't accomplish anything. How much do

4:17:55

physicists think about possibility? Well,

4:17:57

they think about it in terms of probability, more

4:18:00

so. But probability just describes... And

4:18:04

again, this is a place where I might be

4:18:06

out of my depth and need to talk to somebody

4:18:09

to debunk this.

4:18:11

Do therapy with a

4:18:13

physicist? Yeah. But probability,

4:18:15

it seems, just describes a pattern of

4:18:17

actuality that we've observed. Right?

4:18:20

I mean, there are certain things we observe,

4:18:22

and those are the actual things that have happened.

4:18:25

And we have this additional story about

4:18:28

probability. I mean, we have the frequency

4:18:30

with which things have happened in the

4:18:32

past. You

4:18:36

know, I can toss a fair coin and

4:18:38

know in the abstract that I have a belief

4:18:41

that in the limit, those

4:18:43

tosses, you know, should

4:18:45

converge on fifty percent heads and fifty percent

4:18:48

tails. I know I have a story as to why

4:18:50

it's not gonna be exactly fifty percent

4:18:52

within any arbitrary time

4:18:54

frame. But

4:18:59

in reality, all we ever have are

4:19:01

the observed tosses. Right?
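The convergence Sam alludes to here is just the law of large numbers: the observed frequency of heads drifts toward one half as the number of tosses grows, even though all we ever hold in hand is a finite record of actual outcomes. The snippet below is a minimal simulation sketch, not anything from the conversation; the toss counts and the random seed are arbitrary choices made purely for illustration.

```python
# Minimal sketch: observed frequency of heads in a simulated fair-coin experiment.
# All we ever "have" is the finite record of actual tosses; the 50% figure is a
# statement about the limit, not about any particular finite run.
import random

def observed_head_frequency(num_tosses: int, seed: int = 0) -> float:
    """Simulate num_tosses fair-coin flips and return the fraction that came up heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(num_tosses))
    return heads / num_tosses

if __name__ == "__main__":
    for n in (10, 1_000, 100_000):   # arbitrary sample sizes, small to large
        print(n, observed_head_frequency(n))  # the frequency tends to wander closer to 0.5 as n grows
```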

4:19:04

And then we have an additional story that, oh,

4:19:06

it came up heads, but it could have come up

4:19:08

tails. Why

4:19:10

do we think that about

4:19:13

that last toss? And

4:19:16

what are we claiming is

4:19:18

true about the physics

4:19:20

of things? When we

4:19:22

say it could have been

4:19:25

otherwise. I don't think we're claiming that

4:19:27

probability is true. That

4:19:29

just allows us to

4:19:32

have a nice model of the world, gives

4:19:34

us hope about the world. Yeah. It

4:19:36

seems that possibility has to

4:19:39

be somewhere to be effective. It's

4:19:41

It's a little bit like what's happening with

4:19:43

the laws. There's something metaphysically

4:19:46

interesting about the laws of nature too because

4:19:48

the laws of nature impose

4:19:50

their work on the world. Right?

4:19:52

We see their evidence. But

4:19:56

they're not reducible to any

4:19:58

specific set of instances. Right?

4:20:00

So there's some structure there. But

4:20:03

the structure isn't just

4:20:05

a matter of the actual things,

4:20:08

the actual billiard balls that are banging

4:20:10

into each other. All of that actuality

4:20:12

can be explained by what actual things are actually

4:20:14

doing. But then we have this notion

4:20:17

that in addition to that, we have the

4:20:19

laws of nature that are explaining

4:20:21

this actuality. But how are the laws of nature

4:20:24

an additional thing in addition

4:20:26

to just the actual things that are actually interacting causally?

4:20:28

And if they are an additional thing,

4:20:31

how are they real? If

4:20:34

not, they're not among the actual things that are just

4:20:36

actually banging around. Yeah. And

4:20:38

so to some degree, for that,

4:20:41

possibility has to be hiding somewhere for the

4:20:43

laws of nature to be possible. Like,

4:20:45

for anything to be possible,

4:20:47

it has to be placed somewhere,

4:20:49

I'm sure. That's where all the possibility is, because

4:20:51

it has to be attached to something. So,

4:20:55

you don't think many worlds is that? Yeah.

4:20:58

Because many worlds still exists? Yeah. Well, because

4:21:00

we're in this strand of that multiverse.

4:21:03

Yeah. Right? So still

4:21:06

you have just a local instance of what is

4:21:08

actual. Yeah. And then if it proliferates elsewhere

4:21:10

where you can't be affected by it,

4:21:12

many worlds is such that you can't really

4:21:14

connect with the others. Yeah. Yeah.

4:21:17

And so many worlds is just a statement of

4:21:21

basically everything that can happen happens

4:21:23

somewhere. Right?

4:21:25

And that's -- Yeah. -- you know, I mean, maybe

4:21:27

that's not an entirely kosher

4:21:29

formulation of it, but it seems pretty close.

4:21:33

But there's just whatever happens. Right?

4:21:36

In fact, relativistically, there's,

4:21:38

you know,

4:21:40

Einstein's original notion

4:21:42

of a block universe, which seems

4:21:44

to suggest this. And it's been a while since I've been

4:21:46

in a conversation with a physicist where I've gotten a

4:21:48

chance to ask about the standing of this concept in

4:21:51

physics currently. I don't hear it discussed much,

4:21:53

but the idea of a block universe is

4:21:55

that, you know, spacetime exists

4:21:58

in its totality. And

4:22:01

our sense that we are traveling through

4:22:03

spacetime, where there's

4:22:05

a real difference between the past and the

4:22:08

future,
