Understanding Power, Corruption, Politics, AI, Religion, Tribalism & Free Speech | Sam Harris PT 1

Released Tuesday, 8th August 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

Today's guest is none other than Sam

0:02

Harris, an author and public intellectual

0:04

who introduced millions to mindfulness

0:07

and meditation, and quite honestly helped

0:09

shape an entire generation of thinkers as

0:11

a part of the fabled intellectual dark

0:13

web and representative of the New

0:15

Atheist Movement. But he also sparked

0:18

a ton of outrage with his recent

0:20

controversial take on how information

0:22

regarding a major political candidate should

0:25

be knowingly positioned to influence

0:27

a democratic election towards a desired outcome.

0:29

Coming into this episode I was not sure how to reconcile

0:32

those two versions of Sam Harris. He

0:35

is one of the clearest thinkers

0:37

I've ever had the pleasure of sitting down with, but

0:39

in a world where it's hard to reach a consensus

0:41

on what truth actually is, I'm

0:43

very skeptical of anyone who wants

0:46

anything other than as unvarnished

0:48

an information stream as they can get. I

0:51

really believe that people should distrust

0:53

themselves and simply hunger for

0:56

as many people to get access to information

0:58

as possible. Sitting

0:59

down with Sam though, I was once again impressed

1:02

by his openness and willingness to discuss difficult

1:05

topics head-on without any

1:07

hiding or shucking and jiving. While there

1:09

are certainly things that we don't see eye to eye on,

1:11

I was grateful for my time with him and think you

1:13

guys will be too. And if you guys

1:15

hate anything that either of us have to say, fair

1:17

enough, but at least you have access to our

1:20

unvarnished thoughts and can make up your own mind. Since

1:22

I know you won't want to miss out on part two, be

1:24

sure to subscribe now on Apple podcasts to

1:27

get part two ad free when it drops as

1:29

well as

1:29

access to hundreds of archived episodes

1:32

and extras that you won't find anywhere else.

1:35

I'm Tom Bilyeu and now I bring you Sam

1:37

Harris. Culture is software. We

1:40

know it's continually failing us. We should

1:42

not have a significant number

1:44

of Americans alleging the election

1:47

was stolen. There was a pretty gnarly

1:49

one-two punch between COVID and Trump

1:51

that I really think caused a sense-making

1:54

apparatus to fall apart in some

1:56

way. My question

1:58

is, can we stop

1:59

the rise of evil or is that already a

2:02

fallacy? And if we can, then how?

2:05

Well, I mean, I think your first point is that we

2:08

didn't have anything like a consensus

2:10

around what was going on. I mean,

2:12

people were siloed into various

2:14

echo chambers and just not converging

2:16

on an assessment of just what the

2:19

facts are about anything. I

2:21

don't consider myself the best judge of how

2:24

this happened because there were

2:26

people who

2:27

were sufficiently far from me

2:30

on the information landscape so that

2:32

I just ceased to understand how

2:34

they could be thinking and doing what they were

2:37

thinking and doing. I'm not

2:39

seeing their social media feed. I'm

2:41

seeing some fragments

2:43

of what they're finding persuasive, but

2:47

it's just amazing to me that there are, I don't

2:48

know what it is, 30, 40% of

2:51

American society still thinks

2:54

that Trump was not only

2:56

fine, he was better than fine, they're

2:58

just impeccable on some level, ethically,

3:01

and that January 6 was

3:03

a non-event and that there was really nothing

3:05

at stake there. It's all just been, insanely,

3:09

it was a combination of nothing happened,

3:12

but everything that did happen was

3:14

Antifa or Trumped

3:17

up by the CIA. It

3:19

was not violent, but the violence was

3:22

from some other source. This is not a coherent

3:24

view, but you literally have

3:27

something like 100 million people who think

3:29

that there was just no

3:31

factor. I mean, there was just nothing, the only

3:34

thing bad that happened is that the

3:36

election got stolen from the

3:38

rightful president, which was Trump. I

3:41

don't know how you are paying attention to

3:44

anything like a valid source of information

3:46

and you

3:48

still believe that. There are

3:50

people who believe that or pretend

3:53

to believe it enough that they're, I do

3:55

think there are people who are just being fundamentally dishonest

3:58

with their audiences. So you have... And it's

4:00

amazing how it can come down to a couple of dozen

4:02

personalities that really are close

4:05

to the lever that

4:07

moves public opinion here. So you have someone like Tucker Carlson,

4:10

who we know on the basis of the Dominion lawsuit

4:13

behind closed doors was talking

4:15

about Trump as a demonic force. He

4:18

couldn't wait for him to disappear

4:20

from the public conversation. He

4:22

hated him with a passion. And those are basically verbatim

4:25

quotes of his text messages

4:27

within, behind the scenes at Fox that

4:30

got entered into evidence in the Dominion

4:32

case. And yet his public facing

4:34

message is all, Trump

4:37

supporting conspiracy theorizing all

4:39

the time for years. He was the most watched

4:42

person on Fox. And pretty

4:44

soon he'll be the most watched person, wherever

4:47

he finally hangs his hat, but his Twitter

4:49

videos get apparently tens

4:51

of millions of views. And so he's got this

4:53

enormous audience that seemingly doesn't

4:55

care about his hypocrisy,

4:58

which is amazing to me. I don't know how you maintain

5:00

an audience with this kind of loss

5:03

of face. The mainstream media

5:05

was not shy in advertising the

5:07

discrepancy between what he was saying behind closed doors

5:10

and what he was saying on his show. So

5:13

either you have a hundred million people

5:15

who just simply never

5:17

watch any mainstream media product

5:22

or read it, and that's quite possible.

5:26

But the basic problem before

5:28

anything, before we

5:30

think about antithetical ethical

5:32

commitments or political commitments, or people

5:35

who disagree about evil really

5:37

at bottom, we just can't even

5:39

converge on a discussion of facts.

5:42

I mean, people just can't agree about what is happening

5:45

or much less why or what should happen.

5:47

The thing that we have to start with, one, to

5:49

set the table is, is

5:51

that what people are doing, that they really

5:53

do all have their own good intentions.

5:56

They all think that they have spotted the evil, but they're just

5:58

spotting it from different sides. If that's

6:00

true, then people's behavior at least makes sense.

6:03

I understand how it's self-motivated. Now, it's

6:05

never going to be that pure. I'm sure there are also some people that

6:07

are just grabbing for power. But

6:10

if the public response is, okay,

6:13

I see from where I'm sitting from my side

6:15

that the other side is evil and I need

6:17

to really react accordingly, then

6:20

it starts to make sense. Now, if that's

6:22

what's going on, then it becomes, okay, well,

6:25

now we need the sense-making apparatus by which

6:27

we figure out what is

6:29

evil,

6:29

what is the right response.

6:32

The first step in that is going to be, I think,

6:34

to identify what's true, that

6:37

we need some anchor in the

6:39

storm that we're going to say, okay, this

6:41

is the foundation and we're going to build up from here. There

6:44

has to be a mechanism by which we start to

6:46

figure all this out. I'll lay out

6:48

my rough thesis as a way just to guide

6:51

the conversation. So I

6:53

think that there's something about

6:55

the modern world, largely

6:58

as it's married to technology, that creates

7:01

this inability to get people to

7:04

share a narrative, which allows us then to

7:06

approach any issue from

7:09

the same perspective of what we're trying to achieve.

7:11

So you've got velocity of information. So

7:14

information is just coming out so fast and furious. I've

7:16

heard you talk about Alex Jones in that context

7:18

of like, hey, this guy just talks as fast

7:20

as he can, throwing out so many points at you.

7:23

Each one just makes you look uneducated. If you're

7:25

like, I didn't know what that was. And you're like, you find out it doesn't

7:27

even exist. And so that is

7:29

social media. Just the rate at which information

7:32

can come at you is so fast. The

7:34

business model of social media

7:36

is that whatever grabs attention is going to be monetized.

7:39

So then people very quickly realize the

7:41

more salacious, the more sort of grand and aggressive,

7:43

the more likely it is to get attention. So now

7:46

it's coming at you hyper negative and hyper

7:48

fast. When you combine that with this

7:50

sense of everything has to roll up

7:52

into a headline. So all of these

7:55

ideas are incredibly

7:57

nuanced. The problem is

7:59

to get them to

7:59

propagate on social media, they have to be

8:02

a headline. It has to be something that's memorable.

8:04

It has to be something that's easily digestible

8:06

and it has to be something that's repeatable. And when

8:09

you repeat it, that the other person's like, oh,

8:11

that's sick, you got it just right. And now they want

8:13

to go tell somebody else. And so for anybody

8:16

that's being bombarded with all this information

8:18

as a way to wrap their

8:20

heads around it, they just pick a team. And

8:23

then the team just tells them, these are

8:25

all your positions. So now all you have

8:27

to memorize are the headlines for

8:30

your group. And I actually

8:32

am deeply empathetic to that because holding

8:34

onto the nuance of a complicated situation is

8:37

already very difficult. When

8:39

you are able to roll that

8:41

up into a headline, it becomes something that you can hold on

8:44

to far more easily, but then the truth of

8:46

the on the ground interaction points all

8:48

get lost. And so as

8:51

I was watching all of this unfold, then

8:53

it became those 12

8:56

people that you're talking about, and certainly I will

8:58

put you as one of them, it began to

9:00

be unclear, like, okay,

9:03

wait, what is the foundation

9:05

that you're building on that you level

9:07

up from this? And so what I

9:10

want to get through in the beginning here is, what

9:12

is that foundation? So in a world where the sense

9:14

making apparatus is dealing

9:16

with velocity of information, misinformation,

9:20

power grabs, corruption, but you can actually

9:22

hide a lot of that

9:24

through velocity of headline,

9:27

rolling up a complex topic into an oversimplification.

9:30

Where do we get the bedrock? Now,

9:32

I am aware of the, not debate,

9:35

maybe it was a debate that you and Jordan did about

9:37

what is truth and I know that you

9:40

can devolve into madness, but like, if

9:42

you were to give a simple explanation of

9:44

how you ground yourself when you think

9:46

through these things, what does that look like?

9:49

First of all,

9:50

not being tribal, right? So not being,

9:52

not caring really about, I

9:56

care about sources of information

9:58

as a proxy for... or just not

10:00

having to figure out everything from bedrock

10:04

every time, right? So I think- Meaning

10:06

expert as a source. Yeah, so I think you

10:09

can default to expertise most

10:11

of the time, all the while knowing that expertise

10:13

can fail. It's just a sanity sparing,

10:16

and certainly time sparing practice, to

10:19

say, okay, most of what

10:21

is printed in the New York Times has

10:23

to be, to a first approximation, mostly

10:26

true, otherwise the New York

10:28

Times is no longer the New York Times. Now

10:30

I think there have been moments where, and certainly

10:32

on specific topics where it's been valid

10:35

to worry that the New York Times is no longer the

10:37

New York Times, right? I think it's sort of systematically

10:39

getting certain topics wrong or shading

10:42

the truth for, you know, as

10:44

an expression of obvious political bias. So

10:46

there's moments where all of our institutions

10:49

have, if not, frankly failed

10:51

us, showed a capacity to fail

10:53

us, you know, at times.

10:57

And what that did to much of the country

10:59

is just torpedo

11:02

any trust in institutions, right?

11:04

So the trust in the mainstream

11:07

media is at its all time

11:09

low, I would imagine. Certainly

11:11

the last time I looked at a poll on that topic, that seemed to be

11:13

the case. But so it is with government messaging

11:16

on virtually any topic, in particular

11:18

public health, our scientific institutions, our universities.

11:22

And all of this is understandable in

11:26

that in the last, you know, six

11:28

years, post-Trump and post-COVID,

11:31

we had this almost perfect

11:33

storm politically where there really did

11:35

seem to be a capture of the mainstream institutions

11:38

by a very intolerant

11:40

and really at bottom illiberal political ideology.

11:43

I mean, it's supposedly liberal, it's far

11:45

left, but it's, you know, in terms of its style

11:47

of thought, it was,

11:50

you know, we were edging toward,

11:52

you

11:53

know, Chinese

11:55

show trials. I mean, it was really, it was just,

11:57

you know, the kinds of, and the truth is, you

12:00

didn't need that many specific cases

12:03

to feel like, okay, you've seen enough,

12:05

there's no reason to listen to these

12:07

people ever again. I mean, if you're somebody

12:10

who's just poised to throw the baby

12:12

out with the bathwater, you just need to hear one

12:15

case of someone being defenestrated

12:17

at the New York Times for

12:21

not surviving one specific

12:23

blasphemy test.

12:25

And

12:26

then the New York Times is no better than

12:30

the Epoch Times or Breitbart or anything

12:32

else that is in the business

12:34

of putting things in

12:36

font and shipping them. And

12:40

it's all journalism, right? So

12:44

for me, you had to

12:46

recognize that though our institutions

12:49

are challenged, there is still

12:52

such a thing as expertise. There

12:54

is still such a thing as institutional

12:56

knowledge. There's a need for institutions

13:00

that we can trust. Certainly

13:02

when you're in the middle of a pandemic, we need

13:05

a CDC and an FDA

13:07

that we can actually trust, right? So the

13:10

fact that we felt that we couldn't quite trust them is

13:13

an enormous problem. And the thing

13:16

we need to shore up is the

13:18

trustworthiness of indispensable

13:20

institutions. It's not that we need

13:23

to tear everything down to the studs such

13:25

that there are no institutions, no one

13:27

thinks in terms of institutions, it's all just podcasts

13:29

and sub-stack newsletters as far

13:32

as the eye can see, and we're just gonna all do our own research,

13:34

right? I

13:36

think it's not to say that doing your own research

13:39

is never valid and it's never

13:42

even important. I mean, there's certainly

13:44

cases where one person

13:46

can pull at a thread

13:48

long enough that something really important

13:51

unravels and we're all wiser for it. Or

13:55

that one individual given

13:57

a specific problem in their lives, a

14:00

medical problem, they do their own research

14:02

and they discover the remedy for

14:04

the thing that was ailing their family member

14:07

or whatever. And the doctors didn't do it and the CDC

14:10

didn't do it and the FDA was wrong and they found

14:12

the thing that helped. Okay, great.

14:14

Generally speaking, when times

14:17

are good, doing

14:19

your own research is just frankly a waste of time.

14:22

And when things really matter, it's

14:25

very likely a dangerous waste of time, right?

14:28

It's like you don't get on an airplane

14:30

and decide, you

14:33

know, well, I'm not so sure I trust the pilot or

14:35

the guys who repaired the engines. I'm gonna

14:37

do some of my own research here, let

14:40

me in the cockpit, I wanna interrogate some

14:43

of those dials and switches. It's like that's not

14:45

a situation where

14:49

anyone would tolerate this

14:51

sort of contrarian

14:54

anti-establishment, I'm gonna innovate,

14:56

just break stuff and

14:58

see what happens. And

15:02

the problem is in many respects,

15:05

we are at 30,000 feet together,

15:07

all of us all the time. And

15:09

we're having to figure out what is

15:12

real and what to do. And we

15:14

do need experts that actually warrant

15:19

our trust because they are in fact experts,

15:22

right? So when, you know, there's just specific

15:24

cases like what's

15:26

really happening in Ukraine and why

15:28

and what should we do about it, right? What should we, should

15:30

we be sending them arms? Is

15:33

Putin lying about, you know, the last

15:35

thing he said was true on the ground? When

15:39

our state department has a press conference

15:42

and tells us what's going on, we

15:44

need a state department that we

15:46

trust to inform us, right? And

15:49

when the New York Times has some point of view on

15:52

what's happening there, we need a New York Times that

15:55

is sourcing information in a way that is valid.

15:58

We just, we can't have...

15:59

everyone trying to

16:02

get to ground truth based on their own

16:04

private efforts to come up with

16:07

what we should all think about Ukraine or what

16:09

we should all think about mRNA vaccines or

16:11

whatever it is. But people do feel

16:13

that. And I certainly felt that at

16:15

the beginning of this. So as I hear

16:18

you say, so remember my

16:21

initial question is what's your foundation? And the

16:23

foundation felt like you wanted to

16:25

be experts but you understand how much they've eroded

16:27

their credibility. So my question,

16:29

because I don't disagree with you, if I was at 30,000 feet,

16:31

I don't want people going in and

16:33

trying to mess with the pilot. But I think

16:36

that the analogy might not quite

16:38

be right for what we went through. What

16:41

it felt like, I'll just speak for myself,

16:43

but I think I represent a lot of people. What

16:45

it felt like was, oh, I'm

16:47

realizing that the pilot is lying

16:49

to me. Now I'm willing

16:52

to be generous and say the pilot is lying

16:54

to me because they're trying to stamp out evil.

16:57

And they really believe

17:00

that

17:01

to

17:03

trick the American public into taking

17:05

a vaccine, whatever,

17:07

is the right answer.

17:08

And that they are doing

17:11

it with a big heart and

17:13

that they really just wanna help people get where

17:15

they're going. But when they said masks don't

17:17

work, it was

17:19

just like, come on, that doesn't make any sense. And

17:21

then they flip and they're like, no, of course, actually

17:23

masks do work. And we were just lying to you because we needed to

17:26

get them into the medical professionals' hands. So

17:28

it was like, just slowly, slowly, they start eroding

17:30

and you suddenly realize everybody

17:33

has an agenda.

17:34

Now that, I

17:37

get it. But where it starts

17:39

to be a problem for me is when if

17:41

we understand that experts have

17:44

an agenda and I'll even, I

17:46

had Peter Attia on my show who's

17:48

a medical expert, amazing. And

17:51

if he tells me to do something, I basically just do it.

17:53

Like he's unbelievable. And he,

17:55

in his new book, he talks

17:57

about how you actually can be fat and healthy.

18:00

And when he said that my first impulse was, Peter,

18:02

you can't tell people that. Because even

18:05

if it's true, it's in such

18:07

a like edge case, narrow percentage,

18:10

and the vast majority

18:12

of the bell curve are all people

18:14

who are fat and unhealthy. And

18:16

if you give them that out, they're gonna take it, they're never gonna

18:18

make any changes, and they're gonna die. And they're

18:20

gonna raise their kids worse, and their kids are gonna have shorter

18:22

life expectancy. And I really had

18:25

an emotional response of like, you can't tell people

18:27

that even if it's true. But in

18:29

that moment I realized, oh, this is the very thing

18:31

that drives me crazy. Yeah, so I'm

18:33

like, you can't. So I was like, well, if

18:35

it's true, it's true, and the consequences

18:38

are gonna be what the consequences are gonna be. So

18:40

then I start going, okay, then

18:43

if I can't just moor myself around,

18:48

experts are gonna know, and they're gonna be able

18:50

to say, because I don't think three

18:52

years ago, Peter would have written this book, I don't think he had the

18:54

insights into it. So even somebody as bright as him over

18:56

time is changing. So experts don't really

18:59

know what's right and what's wrong,

19:01

especially not in a hot and heavy situation like that. On

19:04

top of that, there are inevitably

19:07

going to be things that I think

19:10

should be said, somebody else thinks shouldn't be

19:12

said, or vice versa. And so

19:14

now you get to, okay, well, if we

19:17

can all agree, we're only gonna say what's true.

19:20

Like even that I think is a task, but let's say

19:22

that we all agree that we're only gonna say what's

19:24

true. Now this gets really complicated.

19:27

And I

19:29

will put forth that what I think is quote

19:31

unquote true is based

19:34

on perspective, interpretation,

19:37

and reinforcement.

19:40

So there's physics,

19:42

which we don't even understand fully, and then there's

19:45

everything else. And because

19:47

a gigantic

19:50

part of everyday truth, and

19:53

I don't know if that will help us get to

19:55

a sort of in the weeds working definition,

19:58

but

19:58

everyday truth seems

20:00

to be predicated on that. You have to take

20:04

into account the person's perspective. So how

20:06

do they see the world? Blue, red, right? You

20:09

have to take into account

20:11

their interpretation. So looking at

20:13

data, some people are gonna say, no, it doesn't

20:15

show that, it shows this. And you can get people

20:17

that look at the data and just violently disagree

20:19

on what it shows. And then

20:21

you've got the reinforcement.

20:23

So if they put out a tweet

20:26

saying their version of the truth and they get

20:28

a wall of reinforcement,

20:29

then

20:31

that's the one that feels good. They're

20:34

gonna see it more and more and more and more and more, and so just

20:36

the sheer repetition of it all. Yeah.

20:39

Any of those pieces feel wrong?

20:42

Well, I think the thing you pointed

20:44

to there in your conversation with Peter

20:47

is important to focus on because people have

20:49

a very hard

20:52

time just

20:55

keeping track of everything that's said, right? And

20:57

keeping things in proportion, right? And so I

20:59

think your intuition

21:02

that it's dangerous to be precisely

21:05

true on that point because

21:08

most people, most of the time will draw the wrong message.

21:11

That is the style of thinking as you observed

21:13

that our public health officials were too

21:16

encumbered by. Like they were

21:19

aware that

21:20

they were messaging into a very

21:23

dirty information landscape,

21:26

right? It was just polluted with conspiracy thinking

21:28

and frank lies and we had a president

21:31

who was, by turns

21:33

minimizing everything and

21:35

lying about it, I mean, just telling pointless

21:38

lies, like there's, we have 15 cases

21:40

and it's gonna go away immediately, right? And

21:44

so, and

21:47

there was just this basic fact that's

21:50

quite inconvenient in the case of a pandemic that

21:52

it's this moving target where

21:55

we're finding that we actually don't understand what

21:58

we thought we understood yesterday, right? So

22:00

the message is changing. It's not that we have

22:02

a completely clear message

22:05

that is still difficult to parse and we

22:07

have to be careful. We have to talk to people like children

22:10

or at least in a kind of paternalistic way and say,

22:13

okay, listen, most fat people

22:15

are not healthy. It's generally not

22:17

healthy to be fat. Virtually every

22:19

fat person would be healthier if they were less

22:22

fat, but it's still possible

22:24

to be healthy if you're fat. And there are people who are skinny,

22:26

who are not healthy, and it gets confusing. So,

22:29

but you can't really go wrong

22:32

with this basic message that you

22:35

wanna be thin and fit and you

22:37

wanna do the things, you wanna be on your way to being

22:40

thin and fit at minimum. You wanna be active, you

22:42

wanna be eating well, et cetera. In

22:44

the case of COVID, the

22:46

truth was getting overturned

22:49

by further revelations. I mean, the

22:51

truth with respect to the disease, the

22:54

epidemiology of it, how contagious

22:56

it was, we were getting new variants,

22:59

so it was literally the disease itself was

23:01

changing. What we

23:03

understood about vaccines was changing. Initially

23:07

in the beginning, there was every reason to believe,

23:10

at least there was every reason to hope that the vaccines

23:12

would block transmission and therefore

23:15

not getting vaccinated was a decision, not just

23:17

with respect to your own health, but the health of the people around

23:19

you. Later on, that began to

23:21

unravel and it was clear, okay, that doesn't really block

23:24

transmission all that much, maybe a little bit, but not

23:26

really. It's a personal

23:28

decision and it's not a decision that

23:30

you're making for others. You're not a bad citizen,

23:33

therefore if you don't get vaccinated. We

23:37

were messaging into an environment where

23:40

there's so much misinformation

23:42

around, specifically things like vaccines.

23:44

There's literally like an anti-vaccine cult

23:47

that has been

23:49

working in the background of our culture for

23:51

decades. And this

23:54

was their moment to really seize

23:56

the reins of the social

23:58

media conversation, at least.

24:01

So it was understandable that

24:03

our public health officials and

24:06

doctors generally felt like, okay,

24:09

we gotta keep this really simple. This

24:11

has gotta be idiot proof. Get

24:13

vaxxed, COVID is dangerous,

24:16

wear a mask. Don't wear a mask

24:18

when you're stealing the masks from people who don't get,

24:21

who are our first line responders who need

24:23

the PPE. But once we had enough, wear

24:25

a mask. And the problem was

24:28

when that began to unravel,

24:32

there

24:35

were so

24:37

many clear moments of dishonesty

24:41

that

24:43

anyone who was gonna have their trust broken with

24:45

mainstream institutions, just

24:48

they broke up right there and

24:50

they seemingly broke up permanently. Right, it was just,

24:52

okay, you're gonna tell me that

24:57

I have to get vaccinated because it stops transmission

25:00

and now I'm hearing that it no longer

25:02

stops transmission. Okay,

25:04

I'm done, right. And then

25:07

the other problem is that now

25:09

we have an information landscape

25:12

where basically everything

25:14

survives. There is not the normal

25:17

Darwinian contest between sources of

25:19

information where if something

25:21

gets sufficiently discredited, you

25:23

never hear from it again. The

25:26

internet is big enough and

25:28

friction free enough such that you can

25:30

be a complete

25:33

lunatic who everyone knows

25:35

is a complete lunatic and yet you

25:37

create an ecosystem that enough people

25:39

love and you can figure out how to monetize

25:42

it. You can have an

25:44

audience of a million people forever, it

25:46

seems, right. And you're literally, literally you

25:48

could be saying that, I

25:52

mean, the craziest case is something

25:54

like QAnon where it's like

25:57

the actual claim that people are bonding

25:59

over is that the world is being

26:02

run by child raping cannibals.

26:04

And among those cannibals are people

26:07

like Michelle Obama and Tom

26:09

Hanks. And I mean, it's just- I always knew

26:11

it. Yeah, I mean, so it's like, okay, we're

26:14

really saying that these people are cannibals and

26:16

pedophiles. And

26:19

we're gonna spend a lot of time having this conversation

26:21

amongst ourselves. We don't care if the

26:23

rest of the world thinks we're crazy because

26:26

in this space, this

26:28

is just our playground. This is our information playground.

26:31

There's nothing- We're not bumping into any

26:33

hard objects here because

26:35

we have

26:37

as much real- We have as much information real

26:39

estate as we can- As we want to carve out for ourselves.

26:42

I mean, that didn't used to be- It used to be

26:44

that if you wanted to publish

26:47

books or publish print newspapers or magazines,

26:49

you needed enough contact

26:52

with the normal kind

26:54

of reinforcement of just

26:57

mainstream consensus that

27:00

you would survive financially. It's like, there's

27:02

something about the internet that has just made the

27:04

cost of spreading information go to zero.

27:07

And when

27:10

you're dealing in bits and no

27:12

longer dealing in atoms,

27:15

everything survives and persists in

27:17

some basic sense. Bad weather can often

27:19

be used as an excuse to procrastinate, be

27:22

lazy, or just get out of doing something.

27:24

But whatever your reason for getting

27:26

outside this summer, stop letting bad weather

27:29

be an excuse with Vessi.

27:31

Vessi shoes are 100% waterproof, so

27:33

you don't have an excuse not to get out there. And

27:36

Vessi's unique waterproof technology makes

27:38

their shoes stretchy and breathable, so you're

27:40

not having to sacrifice comfort to

27:42

keep your feet dry. And if you haven't seen the

27:44

videos of these things being dunked in water and

27:46

just how watertight they are, you gotta check it out. It's

27:49

pretty trippy. And for anybody that's ready to try,

27:51

go check out all of Vessi's styles at

27:53

Vessi.com slash

27:56

impact and use code impact for 15% off

27:58

your order.

27:59

That's Vessi, V-E-S-S-I

28:03

dot com slash impact. And

28:06

use code impact to save 15%. With

28:10

Vessi, weather is never an

28:12

excuse.

28:13

If I want to show up, lead a team, build

28:15

my business, and still have time to pour

28:17

my energy into my marriage with my wife

28:19

Lisa, then I know what

28:22

I eat matters. So much of the

28:24

quality of your life is going to be determined by the

28:26

quality of your food. You are literally

28:29

what you eat. So if you're eating junk,

28:31

you're going to be built from subpar

28:33

building blocks. Hence my obsession with

28:36

ButcherBox. Monday through Friday, if

28:38

I'm awake, I'm either working or working out.

28:40

So convenience as well as quality

28:42

of food is essential to my

28:44

routine and my success in building

28:47

the things that I'm trying to build. And ButcherBox

28:49

has me covered. I cannot stress

28:52

enough how making the switch to the quality

28:54

grass fed meat that ButcherBox offers has

28:56

improved both mine and my wife's health.

28:59

She suffered massively from gut issues

29:02

for years and ButcherBox has been

29:04

one of the anchor points that we've used to build

29:06

her back. So save your

29:08

precious time like I do, get the high

29:10

quality ingredients that I'm getting and skip

29:13

the

29:13

line in the grocery store to get

29:15

the high quality meat from ButcherBox

29:18

at a great price delivered right

29:20

to your door. Now the best part. ButcherBox

29:23

is offering Impact Theory listeners a special

29:25

deal. Sign up today using code

29:27

IMPACT to receive ground beef for

29:30

a year plus $20 off your first order.

29:33

That's two pounds of ground beef free

29:36

in every box for a year plus $20

29:39

off your first order when you sign up at butcherbox.com

29:43

slash impact and use

29:45

code IMPACT. Make the switch today.

29:49

I know firsthand how frustrating it can

29:51

be when you want to connect with people from different

29:53

countries but you struggle because you don't know the language.

29:56

Trust me, I'm married to a Greek girl so I

29:58

know exactly what this is like.

29:59

I couldn't speak to her family the way I wanted to in the beginning, not

30:02

until I learned to speak Greek. It was super

30:04

lame, but you guys don't have to struggle

30:06

to express yourself the way that I did

30:08

because you have access to the learning app,

30:11

Babbel. Babbel takes the frustration

30:13

out of learning a new language and gets you speaking

30:15

in just three weeks. Backed by

30:18

language experts, Babbel uses science-backed

30:20

techniques like spaced repetition and

30:22

interactive lessons to make your learning

30:24

journey more efficient, effective,

30:27

and enjoyable. So join over 10

30:29

million subscribers who have ditched the

30:31

frustration and are finally learning a new

30:33

language with Babbel, which learning a new

30:36

language is gonna be one of the coolest things you ever do. And

30:38

Babbel has a special limited time offer

30:40

just for our listeners to get you started right now.

30:43

Our listeners get 55% off your Babbel subscription

30:46

at babbel.com slash impact

30:49

theory. Again, that's 55% off. Just

30:52

go to babbel.com, which is spelled

30:54

B-A-B-B-E-L, dot

30:57

com slash impact theory. Rules

31:00

and restrictions apply, of course, but you're gonna love it. Get

31:02

after it.

31:03

I think it's important to really understand

31:05

what that mechanism is. So it's what I'll call

31:07

velocity of information. If you have a better

31:10

name for it, I'm all for it. But there's

31:12

something about

31:13

packaging an idea up

31:16

in an environment where there's so much information, all

31:18

you can digest is the headline. When something

31:20

is hyper-transmittable, that

31:23

it just has, you know, whether it's clever, it rhymes,

31:26

it whatever, that has just that little

31:28

bit of extra juice on it. It's something

31:30

that's funny,

31:32

memified, that it's really gonna burn

31:34

through culture. Now,

31:37

for me, where this

31:39

all begins to become deeply problematic

31:41

is that it isn't so much

31:44

that just the internet is forever. It's

31:46

that

31:48

Socrates hated democracy because he didn't think

31:50

people were smart enough to parse through the information.

31:53

And he thought, man, you shouldn't be able to vote in

31:56

this thing if you're not educated on this thing. And the reality is most

31:58

people aren't gonna be educated and therefore democracy

31:59

is really not gonna survive. So

32:02

I

32:03

take a totally different approach to this,

32:05

which is I think

32:07

that

32:08

if you create an environment where everybody gets

32:11

to vote in a world that has the velocity

32:13

of information that we have, information

32:15

is free to send, it's

32:18

easy to package it up, roll up into headlines.

32:21

There's no doubt that

32:22

a lot of misinformation is gonna get out there

32:24

and people just don't even know where to check, where to turn

32:27

to know like who's who, what's what.

32:30

That will have very negative consequences.

32:33

But the only flip side of that that I see

32:36

is top down authoritarian control where

32:38

it's like, I decide, whoever

32:40

I is, the government, Twitter, YouTube,

32:43

whoever,

32:44

they decide what's real information

32:46

and what's not. Because what people are trying

32:48

to get back to is what you were talking about before,

32:51

where information velocity was slow, that

32:53

you had to go to a print

32:55

piece of newspaper, they

32:57

even put extra checks where it was like, it

33:00

had to be vetted by three sources or whatever. And

33:02

I'm not saying yellow journalism didn't exist, of course it

33:04

did. But there were self-imposed

33:07

constraints, there was a business model that let

33:09

even those self-imposed constraints really

33:11

be financially viable, it was just

33:13

harder to do, harder to get out there. And

33:16

so by reducing it,

33:18

there were only so many narratives that

33:20

you were gonna be able to get out. So even if New York,

33:23

back in the 20s or whatever, had 50 newspapers

33:26

just in New York City, that's still

33:29

only New York City, you're not dealing with a

33:31

global readership. So

33:33

you have just this natural constraint.

33:36

Now, once nature isn't giving

33:38

you the constraint anymore,

33:41

the second you want that constraint from the

33:43

top down, you now step into what I call

33:45

the trifecta of evil. And

33:48

the trifecta of evil is three books that

33:50

I read, they technically have nothing to do with each other, but

33:53

just completely explain how all of this goes awry

33:56

and has made me absolutely terrified of

33:58

top down authoritarian control, far more

34:00

than I am afraid of the absolute

34:03

chaos of a thousand Alex Jones.

34:06

So the three books are, The

34:08

Gulag Archipelago by Alexander

34:11

Solzhenitsyn, Mao,

34:13

The Unknown Story, and then The Rise and

34:15

Fall of the Third Reich. And

34:18

those three books tell you just

34:20

how wrong things go when

34:22

people are told, shut up, your opinion

34:24

doesn't matter, and this person knows

34:26

better, and you're just gonna get in line.

34:29

The great thing about the Mao book is that I

34:32

hadn't realized how evil Mao

34:34

was. I mean, I thought he was

34:37

like a junior level of evil compared

34:39

to Stalin and Hitler. But you read

34:41

that book and it's just, I mean,

34:43

the details are just so sadistic

34:45

and ghastly. Sadistic, yeah.

34:47

Yeah, unbelievable. I literally had

34:49

no idea when I started reading that book. The

34:51

fourth, if I were gonna do an honorable mention to

34:54

give Stalin some more love would be Red Famine.

34:58

That book is shocking, shocking.

35:01

Have you read it? No, but I've read a lot

35:03

about Stalin. Yeah,

35:07

I don't know if I can recommend it. There's this one part

35:09

where a woman telling a tale, a

35:11

woman comes up and looks through another woman's,

35:13

this is in the Ukraine, starving, 1921 or whatever, and

35:17

she looks through the window and catches her neighbor eating

35:19

her seven-year-old daughter. And you're just like,

35:22

I just, I can't imagine. So

35:25

that scares me a lot

35:27

more than

35:31

we're all having a hard time figuring out what is

35:33

true. Now I have a pitch for how I

35:35

think we

35:36

figure out what is true that is

35:38

certainly gonna be flawed. But

35:41

the first thing I want to either

35:43

agree, debate, whatever

35:46

is,

35:47

do you agree that this is sort of the sequencing

35:50

of events, that we have this wall of information,

35:52

it's coming in too fast, it's all rolled up into headlines,

35:54

there's no nuance. Most people probably

35:57

aren't smart enough to deal with the nuance anyway.

35:59

And now temptation one

36:02

is to just go, oh, dear elites, pre-masticate

36:05

all of this for us and tell us what to do. We

36:08

tried that. They have agendas, even

36:10

if they're really being sincere and trying to be good,

36:12

they have agendas and that just feels

36:15

absolutely shitty. Feels like you're being manipulated. It breaks

36:17

all your trust. Can't do that. The other

36:19

one is, you know, just absolute top

36:21

down, do what the fuck you're told, shut up and, and

36:23

this is it. Both of

36:26

those strike me as horrendous. And

36:28

that leaves the third option, which is free

36:30

speech, which has become contentious

36:33

somehow. So as a child of the eighties, to me,

36:36

that's like the greatest thing ever. I'm all for free speech. I

36:38

love what Elon is doing on Twitter. I think it's amazing.

36:41

Um,

36:42

but at the same time, I know that's not widely

36:45

shared. I'm not even sure where you fall down. Yeah, no,

36:47

I can push back on some of that. Well, so a couple of distinctions.

36:49

One is that it's not just

36:52

that everything gets boiled down to headlines, right?

36:54

I mean, that is a problem. Sometimes the headline doesn't even faithfully

36:57

represent what's actually in the article. Um,

37:01

and so many people only read the headlines.

37:03

They never even read the article, right? So it's, there's

37:05

that problem. That problem has been with us for awhile.

37:08

There's the algorithmic boosting

37:10

of outrage and misinformation

37:13

preferentially, which is, which is the

37:15

problem on social

37:17

media and the distinction.

37:20

So, so I would make one distinction, which

37:23

is, and this is, you know, many people have made

37:25

this like freedom of speech and freedom of reach

37:27

are different things, right? So you, you should

37:29

be free to post

37:32

whatever you want to post, but it is a, it

37:34

is a choice on the side of the social

37:37

media company to preferentially

37:39

boost or dampen whatever

37:41

they want to boost or dampen, right? To, so it's to change

37:43

the character of the conversation.

37:45

And they have to make

37:48

decisions there with, to make

37:50

no decision is itself a decision, right? So you

37:52

can, if you're going to make a completely flat, people

37:54

will have one experience if you're going to, if

37:57

you're going to tweak it algorithmically, people

37:59

have a different experience. And that is a business

38:01

choice that they are incentivized to make, largely

38:05

because they have a terrible business model. I mean, the gaming

38:07

of attention is a bad business

38:09

model, I would argue. So the fact that it's

38:11

an ad-based attention

38:14

economy has a lot

38:16

to do with what we... the

38:19

original sin of social media, I think,

38:21

is the business model. And

38:25

if these were all subscription businesses, I think we

38:27

could have a different landscape there. With

38:30

respect to social media. But still,

38:33

there would be a moderation burden. And it's something

38:35

that it seems like they're never going to get right. Even,

38:38

I mean, except for the... in

38:42

the presence of something like omniscient AI

38:44

that we could trust, I don't see

38:46

how your effort to moderate what hundreds of millions

38:48

of people say to one another,

38:51

or

38:53

even in some cases billions of people. That's

38:57

always going to produce casualties. It's always going to produce

38:59

somebody who was just a completely valid

39:01

academic, who just took an edge

39:05

case position and got

39:07

flagged as a Nazi or whatever.

39:10

And there has to be some process of

39:12

appeal, etc. The

39:15

other distinction I would make is that there's

39:16

a big difference between governments

39:20

silencing speech and

39:22

actually punishing people for errant

39:25

speech. And companies,

39:27

private companies, or even publicly held companies, deciding

39:31

that they want to be

39:33

associated with or not associated

39:35

with certain kinds of speech. And

39:40

so when I look at this through a free speech

39:42

lens from a US-centric

39:44

First Amendment lens, we

39:49

should acknowledge that most of the world doesn't have the

39:51

protection of the First Amendment, and they're worse off for it. And

39:54

so if you're living in the UK and you're perceiving

39:56

this debate, you're looking at

39:58

it as someone who... feels

40:01

stifled by the reality that you don't have

40:03

a First Amendment to default back

40:05

to. And

40:07

that's, you know, I've

40:10

been slow to appreciate just

40:12

how different that is politically and ethically

40:14

for people. So speaking

40:16

from the U.S. context, I

40:18

think we have it right that the government should

40:21

not make any kind of speech

40:23

illegal, with a few exceptions like inciting

40:25

violence. So

40:28

I think you should be free to be a Nazi

40:30

and say your Nazi things, and you should be free

40:33

to reap the reputational

40:35

costs of that, right? People now know

40:37

you're a Nazi, they don't want to do business with you, they

40:39

vilify you on their forums.

40:43

But

40:45

the question is, should a platform

40:48

like Twitter or any other platform

40:51

be legally required to

40:54

associate with Nazis? Can

40:56

they have in their terms of service a

40:58

no Nazis policy? And I think

41:01

they should, I think my free

41:04

speech concern now

41:07

is aimed at the owners and

41:09

employees of those platforms. So I'm

41:11

thinking about the person who starts a social

41:13

media company. The

41:15

truth is, we're going to do this. We're going to, you know,

41:18

this is an experiment because my, you know, as

41:20

you might imagine, my faith that you can actually produce

41:22

a social media platform that works is

41:24

pretty low. But for Waking Up, my

41:27

meditation app, we are going to launch a basically

41:30

a forum of some kind. And

41:35

that will very quickly have tens of thousands

41:37

and even some hundreds of thousands of people in it, presumably.

41:41

Should I be able to have

41:43

a no Nazis policy? Right now, I'm not expecting

41:46

any Nazis. I mean, first of all, this is a subscription business. So

41:49

that's already a gatekeeping function that is

41:51

that is helpful there, I think, ensuring

41:54

a kind of good faith and quality. But

41:58

there, you know, anyone who needs free access to Waking

42:00

Up also gets it. So there's a lot of free users of it. So

42:02

it's not a perfect paywall.

42:07

I

42:09

think

42:11

when I start a platform like that, first of all,

42:13

I should be able to just

42:17

zero it out overnight. Like if it's not working,

42:19

if I don't like the way this is working, I should be able to just send

42:22

everyone emails saying, sorry, you guys

42:24

broke this place. I don't like the conversation. This

42:26

is over, right? That's

42:28

actually something I told Jack Dorsey when he was still

42:31

running Twitter that he should just delete it and

42:33

he'd win the Nobel Peace Prize and he would deserve

42:35

it. So I think

42:38

you should be able to, you should be free to delete

42:41

your social media account if you in

42:43

fact own it. And

42:46

you should be free to decide, okay, these

42:48

are the standards of conduct in this space.

42:51

Like it's, you know, this is true if you open a restaurant or

42:54

if you open a movie theater, if you open any public space,

42:57

it doesn't change if it's merely digital,

43:00

you should be able to set the terms of service.

43:03

And if it's a no Nazi space, well,

43:05

then Nazis are not welcome here, right?

43:07

So if you demonstrate that you're a Nazi, we

43:10

kick you out of our platform. Now,

43:13

so I'll grant you that

43:15

any company should be able to do what they wanna

43:17

do. If Hooters wants

43:19

to hire only attractive women, by all means,

43:21

let them hire only attractive women. If someone wants to do

43:23

a female only company, I don't have a beef

43:25

with it. I don't even have a beef if Harvard only

43:28

wants to, you know, if Harvard wants to make

43:30

it near impossible for an Asian student to get in,

43:32

as long as they are clear and transparent and don't take government

43:35

money, I'm all for it. I

43:37

don't care. But the transparency

43:40

matters to me. But my real question is, like

43:42

as we think about actually

43:45

solving

43:45

the problem, and so I'm asking

43:47

this largely in the connotation of 2024 is

43:49

coming, we're gonna be running into this again. I

43:52

really think the right way to set the table is

43:54

you've got people on the left and people on the right who

43:57

both think that the other side is evil. They both think that

43:59

they have recognized

43:59

the problem reincarnate.

44:03

And

44:05

if we don't establish a new sense

44:07

making mechanism for a world in which

44:10

the velocity of information is this fast, AI

44:12

is coming, so deepfakes are gonna be a real thing. Like

44:15

we need a method that we

44:17

can all rely upon in order to think

44:19

through these problems well. And so

44:22

like, where I come down on free speech isn't

44:24

whether a private company should

44:26

be able to limit the reach of

44:28

somebody that's a Nazi or say, I don't want Nazis on my

44:31

platform, that's fine.

44:33

There is a real consequence to that though, which is then it

44:35

just bifurcates and you get the right

44:38

and the left, because that's really what's being argued

44:40

about as far as I can tell. I

44:42

have not seen any of what I would call real Nazi-like

44:45

stuff. It's normally just behavior people

44:47

don't like that's coming from the opposite side of the aisle.

44:50

So when I say, okay, what is the

44:52

right way to deal with this? My answer

44:54

is, I

44:56

think everybody needs to distrust themselves a

44:58

little. So they should not assume that

45:00

they are right. Everybody should be

45:02

willing to put their ideas forward on how

45:04

they think through the problem. So rather than only

45:07

listening to experts, it's like, hey, I'm

45:09

an expert or I'm not, but this is how

45:11

I think about the problem. This is how I've ended up with

45:14

this conclusion. I've looked at this or I've studied

45:16

that, whatever. But this is how I come to this conclusion.

45:19

And then to want the

45:21

collision of ideas. And the second

45:23

people are more worried

45:26

about

45:27

bad ideas being out there,

45:29

they're either saying, I completely give up to

45:32

this velocity of information problem. So we

45:34

have to just choke it. And we have to

45:36

make sure that there's only the authorized

45:39

information.

45:40

Or you accept the consequences

45:44

of letting the ideas battle it out

45:46

in the public consciousness. And as

45:48

far as I can tell, the

45:50

second you say,

45:51

the people aren't smart enough to battle these ideas

45:54

out. The system of information distribution

45:56

is so broken that it's

45:59

unsafe, maybe that isn't

45:59

the perfect word, but you'll never

46:02

get a good outcome by doing that. You

46:04

can't have democracy. Like it

46:06

is literally only in the face

46:09

of the ability for people to say what they believe

46:11

is true and to battle out those ideas

46:13

that we have any hope of people really understanding

46:16

as close to the sort of unsculptured

46:19

way

46:20

of presenting an idea

46:22

that we're going to get. And look,

46:25

there are going to be people that won't be able

46:27

to navigate that mess. And so I'm

46:29

certainly not saying that this is perfect, but

46:31

when I step back and look at the reality

46:35

of the landscape that we're in, algorithmically

46:37

controlled, all of that, everybody has a

46:39

voice in social media, et cetera, et cetera, I don't see

46:42

a way around it.

46:45

Well, so we don't have a pure democracy,

46:47

right? It's not like you just get online and vote

46:49

and it's one person, one vote, and then we decide whether

46:51

we go to war with Russia based on the tally.

46:56

We pick representatives and there

46:58

I think it's important that we have representatives who are

47:01

not blown around

47:03

like weather vanes by just whatever's

47:06

happening on Twitter that day, right? So yes,

47:08

they need to care about what their constituents want

47:11

but I think it's

47:13

good that there's a looseness of fit between

47:15

what 500 people in the government do

47:20

and the cacophony on social

47:23

media, right? It may be to

47:25

some degree informing their

47:28

impression of what their constituents want, right?

47:30

So we need serious people in

47:34

serious roles of responsibility and

47:37

insofar as we're losing that and there's

47:39

definitely signs that we are losing that, I mean,

47:42

we've got at least one person in Congress who

47:44

when our state is

47:46

on fire, she speculates that

47:48

maybe it's Jewish space lasers starting those

47:50

fires. This is Marjorie Taylor Greene. Yeah,

47:52

Marjorie Taylor Greene has heard

47:55

that there were some space lasers

47:57

put up there by Jews. I

48:00

think it was Roth trial funding that

48:03

could start fires and we might wanna look into that, right? So,

48:07

insofar as that is happening and

48:09

that we're getting people

48:11

so anti-establishment that they

48:14

are effectively lunatics in

48:17

positions of real power, I think

48:20

that's a, maybe that

48:22

problem has always been with us to some degree, but at

48:25

least I am perceiving it, certainly

48:28

post Trump

48:29

as uniquely worth worrying about

48:32

at this moment that like

48:34

populism is tending to promote

48:37

candidates that

48:39

almost by definition

48:42

have fewer institutional

48:44

commitments with all the good

48:46

and all the bad that that entails, right?

48:48

So, but there's a lot that's good that if you

48:50

care about the

48:53

last 100 years of scientific knowledge,

48:56

right? And your statements about,

48:59

let's say something like climate change is

49:02

gonna be constrained by a

49:04

basic awareness of what most

49:07

climate scientists most

49:09

of the time think about climate change. That's

49:11

your one kind of representative. If you're

49:13

somebody who's just gonna freewheel

49:17

based on what they heard Alex Jones say, or

49:20

literally Trump gave his first

49:22

interview, I think, to Alex Jones, right? There's

49:26

a difference of kind of the center of

49:31

narrative gravity there in

49:33

populism that I think we need to worry about,

49:36

and there's obviously right wing and left wing variants

49:39

of populism, both are

49:41

problematic,

49:43

but we

49:47

have to recognize that there are asymmetries. So

49:50

what you seem to be recommending is that we basically

49:53

talk to everyone, give everyone

49:55

a fair hearing. It's

49:57

only when you just bring sunlight to everything

49:59

that

50:00

people are gonna be able to make up their minds and

50:03

they, left

50:05

and right, are still gonna demonize one another, but we're

50:07

gonna approach something like maximum understanding

50:10

if we just talk about everything. So why

50:13

not have RFK Jr. on your podcast,

50:16

right? Rogan brings him on the podcast

50:18

and just, you know, RFK, tell

50:21

me, give me the world as you see it, tell me

50:23

who killed your father, who killed your uncle, what

50:27

do you think about these vaccines, do vaccines cause

50:29

autism? Just let them go for four

50:31

hours.

50:34

The downside with that is that even

50:37

in the presence of somebody who is a subject

50:39

matter expert, who's there to

50:42

provide some kind of guard rails to that conversation,

50:45

there is an asymmetry between

50:47

the time and effort it takes

50:50

to make a mess and the time and effort it takes

50:52

to clean it up, right? And whether it's even possible

50:54

to clean it up given the resources

50:57

available, right? So if somebody is, if

50:59

someone's just gonna make up lies in

51:01

front of you, even if you're an expert in that area,

51:07

there's only so much you can do because like

51:09

they're playing

51:11

with a completely different kind

51:14

of information physics, right? They're just gonna

51:16

make something up. So you

51:18

might be a climate change

51:20

expert or a vaccine expert,

51:23

and if you have somebody

51:25

who's a pure conspiracy

51:27

theorist on those topics,

51:30

you know, in my experience, you're sitting with someone

51:32

who is very often unscrupulous

51:35

enough to just make stuff up, right?

51:37

Or to be so delusional

51:40

in the way they have interacted with

51:43

even valid information in the past that

51:46

the word salad they're gonna produce is

51:48

effectively just a tissue of lies.

51:52

And yet there may be no way

51:54

to actually interact

51:56

with it in an honest way in a

51:58

way that tracks the truth

51:59

51:59

in real time on Rogan's podcast or anywhere

52:02

else, so as to properly debunk

52:04

it. So you can't just take, let's take RFK Jr.

52:06

as an example.

52:08

Do we think that he is wrong

52:12

and well-intentioned or do

52:14

we think he is sinister?

52:17

Why, well, wrong and well-intentioned

52:20

can cover for a lot of

52:24

dangerous error, right? I mean, you can really

52:26

make a mess being wrong and well-intentioned. I

52:28

think with him, he's got

52:30

so much sunk

52:31

cost. First

52:35

of all, there's so many, like there is just

52:37

a

52:39

characterological, psychological

52:42

phenotype that

52:45

just is addicted to

52:48

the contrarian

52:49

take on

52:51

more or less everything, right?

52:54

So it's just like, and it's not

52:56

an accident. The people

52:59

who are all in on the JFK

53:02

conspiracy, right? No way it

53:04

was a single shooter, Oswald was just

53:06

a patsy, whatever it is, right? Those

53:08

people by and large tend

53:11

to just jump

53:13

on all the other conspiracies, whether it's

53:16

the moon landing or 9-11 Truth or it's like, and

53:20

you have someone like RFK Jr. where it's, it

53:23

seems, I don't know the man, but I've been

53:25

paying attention of late.

53:27

It seems like there's almost no

53:30

conspiracy that he doesn't have an

53:32

appetite for, right? So like when someone says,

53:34

well, what about Bill Gates injecting

53:38

transponders into us with the vaccine?

53:41

He's got time for that,

53:43

right? He doesn't say, oh no, that's, I

53:45

mean, you really think Bill Gates is doing that? That's just, isn't

53:47

that obviously bullshit? No, no, he's like,

53:50

well, this is something we really have to look into.

53:54

I don't have verbatim what he said on that, but he's

53:56

way too open-minded on points

53:59

like that, right? And so it is with everything.

54:01

And now

54:03

it's deeply inconvenient for

54:06

someone like me at a moment like this to have to recognize

54:09

some of these conspiracies turn out to be true,

54:11

right? And some always looked

54:14

plausible from the very beginning. So like the

54:17

origins of COVID coming from the Wuhan

54:19

Institute of Virology, right? Like, is it a lab

54:21

leak or is it the wet market? Well,

54:24

it always looked plausible that it

54:27

could be a lab leak, right? That was always a

54:29

valid thesis worth worrying

54:32

about and investigating. And it was never racist

54:35

to speculate that that might have been the case,

54:37

right? So the fact that our medical

54:40

establishment tried to

54:42

tamp that down in a completely

54:44

bad faith way, and maybe for

54:47

reasons that are, if you dig deeper

54:49

into Fauci and into the other players,

54:52

maybe there's some deeply

54:54

invidious things to discover about people's conflict

54:57

of interest and

55:00

research we funded and now we don't want to admit we

55:02

funded or whatever it was. I mean,

55:04

it was that moment in Congress where Fauci

55:07

and Rand Paul were kind

55:10

of debating the meaning of gain of function research

55:12

and Fauci looked, to

55:15

many people's perception, and I actually

55:17

shared it at the time, he looked like he was

55:19

just doing this sort of Talmudic

55:22

hair splitting on what the phrase

55:24

gain of function meant. Whereas

55:27

Rand Paul was saying, just be honest with

55:29

the American people, like you know that if

55:31

you are changing the function of the dynamics

55:34

of a virus, such that it spreads more among

55:36

humans, that's gain of function, you

55:38

know, by any other name. So

55:42

maybe there's something sinister beneath

55:45

all of that, right? So here's one conspiracy among

55:48

all these other conspiracies that

55:51

people

55:52

were branded as conspiracy theorists

55:54

for entertaining,

55:59

and yet it had always been

55:59

plausible to be worried about that. But

56:02

so- This by the way is exactly

56:05

the thing that I'm worried about. So it

56:07

becomes very easy to shut

56:09

people down to say, oh, that's just conspiracy.

56:13

And to start having

56:16

the apps get

56:18

involved. So YouTube marking it as like,

56:20

oh, you talked about ivermectin, shutting

56:22

this episode down, like just so

56:25

many things coming at you at once trying to say

56:27

this is outside the Overton window. And

56:29

so my whole thesis is

56:32

very simple. That the, in a

56:34

world where there's too much information coming in,

56:36

the answer cannot be to choke it off to

56:38

try to limit the amount of information because you will

56:41

get that wrong. It is manipulative by its

56:43

very nature. The closest thing you're

56:45

gonna get is to let ideas battle out.

56:47

There are going to be consequences. I wanna be very clear.

56:50

We probably, when it comes

56:52

to things like this, you were better off when you had

56:54

trusted but hyper limited media

56:57

sources that could at least get everyone

56:59

to walk towards the exits in a calm and

57:01

orderly fashion. So I'm

57:03

not denying that, but that world is over.

57:06

And so now you

57:07

really only have

57:09

two options that I can see. You either top

57:12

down, clamp down, or, and

57:14

I'll lead people back to my trifecta

57:17

of evil, or you go, there

57:19

are gonna be consequences to letting people battle the

57:21

ideas out in public. And there are gonna be a lot of

57:23

people that get confused by things that should

57:25

never be taken seriously. And

57:27

they're going to be taken seriously and we're gonna have

57:29

lives lost because of

57:32

that, but it will balance out and

57:34

we won't have lost all

57:36

of our faith. And you're not gonna

57:38

get the small guy

57:39

trampled to death, sent off to the gulag,

57:42

killed because he's an inconvenient voice, whatever

57:44

the case,

57:45

which also opens a door to the power grabs

57:47

and now you get will to power, which we saw

57:49

a lot and COVID was like, oh, I can

57:51

get a little bit of power. And so it draws people

57:54

into that. So you get power grab, power grab. So

57:56

when I think back, and

57:59

in fact, maybe the...

57:59

the right way to ask the question is,

58:02

and I wanna keep this tied to RFK. I

58:04

don't wanna depart from that yet. So you've

58:06

got RFK,

58:08

your view is that he's making things

58:10

up, that it's conspiracy, maybe just personality-wise,

58:13

he's drawn to it. I don't hear you saying that

58:15

you think he's sinister. Just, that's

58:17

just how his mind works.

58:20

Other people, though, think that he's bang on, that he's right.

58:23

You've already admitted that he'll probably end up being

58:25

right about some of these things. I

58:27

know you well enough to know you're gonna say, well, there's nuance,

58:29

and if you're right for the wrong reason, you really do have to think that

58:31

through, it's not enough just to be right. We'll

58:34

set that aside for now. So you've got this guy,

58:36

conspiracy-minded, not being sinister, probably

58:39

will be right about some of these things, but

58:41

probably you still don't want him to be platformed.

58:45

And then it

58:46

becomes... Well, but

58:48

the crucial distinction, again, is between

58:50

it being illegal to platform

58:52

him or just the choice for any

58:55

private platform. I only care about the choice. I wanna

58:57

know why,

58:58

because you are such

59:00

a potent sense maker. There's so

59:03

many of us that are like children of Sam

59:05

Harris, where you really helped us think

59:07

about putting these ideas together. And

59:09

then there's something in this one-two punch

59:12

of COVID, Trump, where

59:14

all of a sudden I felt like, wait, I've

59:16

been using the tools you gave

59:19

me, and now

59:21

I feel like you're using a different set of tools.

59:24

And so I'm trying to remap. Because

59:27

here's how I approach you right now.

59:29

Obviously I've watched some portion of

59:31

the internet go, Sam's brain

59:33

broke. He used to be Sam Harris, now he's not Sam

59:35

Harris. You did your own podcast about it, which is brilliant,

59:37

by the way, where you're like, some portion of my

59:40

audience thinks like, what have you done? I forget the

59:42

exact phrase you used, but you're so

59:44

aware of how people have responded, but

59:47

you've stayed really steady. So

59:49

I'm like, okay, then maybe there's something here

59:51

I'm just not getting, which is why I keep laying

59:54

my thing out, because you don't feel erratic

59:56

to me, but I want to understand,

59:58

you're layering. ideas that

1:00:01

allow you to make sense of this in a way where

1:00:03

you're calm. Like I'll

1:00:05

just,

1:00:06

if people are gonna freak out on Twitter, I'm just gonna step

1:00:08

back, I'm gonna keep doing my thing. Like

1:00:10

being with you does not feel like I'm in

1:00:12

the presence of someone who's on a manic episode

1:00:15

or anything. Like you feel as ever sort of

1:00:17

calm and centered. And so because

1:00:19

I am skeptical of my own approach, I want to understand

1:00:22

yours. Now I'm not gonna pull back on the parts

1:00:24

where I think that it doesn't make sense. I'll

1:00:26

say it doesn't make sense. But I actually do wanna understand.

1:00:29

So where I was going with that is, don't

1:00:32

worry about illegal, not illegal. I just

1:00:34

wanna know why you think it makes sense to

1:00:36

deplatform or to not

1:00:39

platform maybe a more accurate way

1:00:41

to say it, to not platform someone like RFK

1:00:43

Jr. When the

1:00:45

founding fathers said,

1:00:47

hey, the one thing you don't wanna fuck

1:00:50

with,

1:00:50

make sure people can say whatever the hell they want.

1:00:54

Okay, well, it matters what the platform

1:00:56

is. So with

1:00:59

a podcast, it's very simple. I'm only gonna

1:01:01

do 40, 50 podcasts this year. I

1:01:05

just have to make a choice. It's an editorial choice.

1:01:08

It's a publishing choice. What do I wanna pay attention

1:01:10

to? Whose book do I wanna read? Who do I

1:01:12

wanna talk to?

1:01:13

Most people- But you have said don't,

1:01:16

like there are certain people you don't think should be put

1:01:18

on a platform. Trump was one of them. Certain

1:01:20

people I wouldn't talk to for specific reasons. You're

1:01:24

not talking, fair enough, but are there people

1:01:26

that you think should never be platformed?

1:01:29

Well, I think if you're gonna platform it, so what I

1:01:31

said about RFK on my podcast is if you're

1:01:33

going to platform him, you have

1:01:35

a journalistic responsibility to do your homework,

1:01:38

not only in anticipation of the things

1:01:40

he's going to say on your podcast, but

1:01:43

you need to catalog the things he's already said,

1:01:46

which are obviously bullshit that you should

1:01:48

challenge him on, right? So there

1:01:51

is just a wide range of things

1:01:54

where there's broad scientific

1:01:56

consensus at this point, that there is no

1:01:58

link between childhood

1:02:00

vaccines and autism. Right now, autism

1:02:03

is a problem, autism rates have gone up, we

1:02:05

don't understand autism,

1:02:07

but

1:02:08

people have gone deep studying

1:02:11

not just the MMR vaccine but

1:02:14

vaccines in general and autism, and

1:02:16

found no linkage, right? So, and

1:02:18

he is out there telling people, anyone

1:02:21

who will listen, that there's

1:02:23

every reason to believe that vaccines cause

1:02:26

autism, or we should be worried about it, or

1:02:28

I'm hearing from mothers who have

1:02:31

seen the clear correlation, they had

1:02:33

a normal kid on Tuesday and on Wednesday, they

1:02:35

got vaccinated and the autism

1:02:37

started, right? So, he

1:02:40

is spreading that fear, and

1:02:42

as far as I can tell, it's on the basis

1:02:45

of no valid scientific

1:02:47

information now. Now, it's also, this

1:02:49

is now linked up with everyone else's concerns

1:02:51

about COVID vaccines and just

1:02:54

the reliability of medicine in general and bad incentives

1:02:57

and pharmaceutical companies, and there's

1:03:00

a lot of there there in stuff

1:03:05

that is worth worrying about. I mean, I think a

1:03:07

profit-driven motive in medicine is something that

1:03:10

we're always gonna be in tension with because

1:03:13

what we want, we

1:03:15

want the medical establishment to be recommending

1:03:18

drugs because

1:03:21

they're safe and truly safe and effective

1:03:23

to people who truly need them, right?

1:03:25

We don't want people

1:03:28

in

1:03:28

the privacy of their own minds or in the privacy

1:03:31

of their board meetings celebrating

1:03:33

how they're gonna make billions of dollars at

1:03:35

this new opportunity because

1:03:37

they can market

1:03:39

this drug successfully to people who may not need

1:03:42

it, may not benefit from it, may be in fact be

1:03:44

harmed by it, right? So, it's that

1:03:47

misalignment of incentives that is specific

1:03:50

in the case of medicine that is, I

1:03:53

think people are understandably uncomfortable with

1:03:56

but... So

1:04:00

in the narrow case, when you're talking about having

1:04:02

a podcast, first of all,

1:04:05

the burden is not on you to platform everybody.

1:04:07

You can just decide who you want to talk to. If

1:04:09

you're going to talk to someone like RFK Jr.,

1:04:12

I think given

1:04:15

his track record and given how much I think

1:04:18

genuine misinformation he has spread and

1:04:20

you know, consequential misinformation, I

1:04:22

think you have a responsibility not

1:04:25

to just put a mic in front of him and let him rip.

1:04:28

You actually need to

1:04:30

debunk him and maybe bring on someone who can

1:04:32

also debunk him. Now, again,

1:04:35

as

1:04:36

I said, you have no problem with that. So if it was

1:04:38

done in a debate format with another

1:04:41

expert or superior expert, then

1:04:43

it's we're good. Yes, except

1:04:47

the asymmetry I pointed

1:04:50

out before still applies. If he's going

1:04:52

to just make stuff up. Right. So like, you

1:04:54

know, he will.

1:04:56

I mean, the example I mentioned on my podcast, he's

1:04:58

been telling a story, I think,

1:05:02

in several venues that, you

1:05:05

know, he had collaborated with the

1:05:07

journalist Jake Tapper 15 years

1:05:09

ago on a documentary. They

1:05:12

had just put in a ton of effort. They

1:05:15

did a really deep dive on the link between

1:05:17

vaccines and autism. And at the

1:05:20

last minute, Jake Tapper

1:05:22

called him and said, listen, we're going to

1:05:24

pull this. We just I've never in all

1:05:26

my years as a journalist, I've never

1:05:28

had this experience, but this just came down

1:05:30

from corporate. They're just pulling

1:05:33

the plug on this. I'm so sorry. And,

1:05:36

you know, so the punchline for him, and

1:05:38

I'm not sure if he said this on Rogan, but he definitely said

1:05:40

this on some podcasts I listened to.

1:05:44

Punchline for him is, OK, the pharmaceutical

1:05:46

companies have such pull

1:05:49

with I mean, they spend so much money with CNN

1:05:52

and these other outlets that, you know,

1:05:54

if they don't want something to air, it's not going to air. Right.

1:05:56

That's how corrupt journalism is now.

1:05:59

Now.

1:06:00

If I was on a podcast debating RFK

1:06:02

Jr. and he

1:06:04

trotted out that story, I

1:06:06

would just have to eat it. I would say, all right, well,

1:06:08

that's

1:06:09

that's bad. I agree that

1:06:12

looks bad, right. Now

1:06:15

Jake Tapper has published an article

1:06:17

saying this is just a lie, right?

1:06:20

This is just, like, this has like a two

1:06:22

percent relationship to what actually

1:06:24

happened. There was nothing about

1:06:26

it. It's just,

1:06:28

it's all upside down and I keep

1:06:31

debunking this and he keeps telling this story,

1:06:33

right? So

1:06:36

unless you know that, it doesn't

1:06:38

matter that you're a vaccine expert or

1:06:41

you could be an expert in a dozen you

1:06:43

know relevant disciplines if

1:06:46

Someone just gonna make up a story that

1:06:48

is perfectly shaped to

1:06:50

tickle the contrarian, you

1:06:53

know, they're all a bunch of fucking liars part

1:06:55

of the brain, he still

1:06:58

lands that blow in real time

1:07:00

on a podcast. There's no way to debunk it in real

1:07:02

time. You literally need Jake Tapper,

1:07:05

you need to pull him out of the woodwork, for

1:07:07

that particular point. But

1:07:10

there are many things like

1:07:12

that. That's the thing I mean there it's not like so

1:07:14

someone who has this style of reasoning

1:07:18

Again, some of it's conscious lies, some of it's

1:07:20

misremembering, some of it's that they're reading

1:07:22

studies and they're not understanding them. And they're

1:07:24

just pulling, you know,

1:07:27

half-truths out of studies that

1:07:29

can be made to seem real. And

1:07:32

so they're making such a mess that

1:07:34

it is genuinely hard to be

1:07:37

an expert in that, you know, we're

1:07:39

riding shotgun on all of that and debunking

1:07:41

in real time. But the

1:07:44

only responsible way to do it would be to have

1:07:46

an expert there to try

1:07:49

to do that. I think it's worth

1:07:51

stepping back and asking the question, why

1:07:55

is anyone listening to RFK Jr. about

1:07:57

vaccines at all right? He's not

1:07:59

an expert in the relevant domain,

1:08:02

right? He's not an expert in epidemiology,

1:08:04

he's not an expert in immunology, he's

1:08:06

not a vaccine guy, he's like, he's

1:08:09

a lawyer and an activist who

1:08:11

got this particular bee in his bonnet 20 years

1:08:14

ago, and he's just made

1:08:16

a lot of noise about this. And

1:08:19

interestingly, he's also a climate

1:08:22

science activist, right, and

1:08:24

there you can see a very bizarre mismatch

1:08:30

between how he deals

1:08:32

with mainstream scientific consensus

1:08:35

in climate

1:08:36

and how he disavows

1:08:38

mainstream

1:08:40

scientific consensus on the topic of

1:08:42

vaccines. And everything is flipped,

1:08:45

I mean, it's just like he's got all the...

1:08:47

I almost never drink alcohol

1:08:50

because it is brutal on the body,

1:08:52

and I encourage you guys to do the same, but

1:08:54

I know many of you are going to

1:08:56

ignore that advice. If that's you,

1:08:59

at least do something to ensure that you still

1:09:01

feel good the next day and can keep chasing

1:09:03

your goals at full speed. To

1:09:05

that end, be sure to check out Z-Biotics.

1:09:08

Z-Biotics, pre-alcohol probiotic,

1:09:11

is the world's first genetically engineered

1:09:14

probiotic. It was invented by PhD

1:09:16

scientists to tackle

1:09:17

rough mornings after drinking.

1:09:20

Here's how it works. When you drink,

1:09:22

alcohol gets converted into a toxic byproduct

1:09:24

in the gut. It's that byproduct,

1:09:27

not dehydration, that's

1:09:29

to blame for your rough next day. Z-Biotics produces

1:09:31

an enzyme to break this byproduct down.

1:09:34

It's designed to work like your liver, but

1:09:36

in your gut where you need it most. Drink

1:09:39

Z-Biotics before drinking alcohol, stay

1:09:41

hydrated, and get a good night's sleep. You

1:09:44

will wake up the next day ready to hit the ground running.

1:09:46

So if you decide to drink, make

1:09:49

sure you do what you can to mitigate the suffering

1:09:51

and stay on

1:09:52

track. Go to zbiotics.com

1:09:55

slash impact to get 15% off

1:09:57

your first order when you use impact

1:09:59

at checkout. Z-Biotics is

1:10:02

backed with 100% money back guarantee. So

1:10:04

if you're unsatisfied for any reason, they'll

1:10:07

refund your money, no questions asked. Remember

1:10:09

to head to zbiotics.com

1:10:11

slash impact and use the code impact

1:10:14

at checkout for 15% off. Thank

1:10:17

you Zbiotics for sponsoring this episode.

1:10:20

This is the perfect example for me of the very

1:10:22

thing that I'm worried about. So here you have a guy, he's

1:10:25

either sincere and wants

1:10:27

to help the climate, in which case

1:10:29

I think most people have cognitive dissonance,

1:10:33

or he's sinister

1:10:35

and wants people to not take vaccines that are

1:10:37

gonna save their life, or he's

1:10:40

maybe right about something that other people disagree

1:10:42

with, or maybe he's wrong but well-intentioned.

1:10:45

So- I think wrong but well-intentioned

1:10:47

covers for a lot. I mean, just think about, the vaccine

1:10:50

thing is really a unique case because

1:10:53

what you have is an

1:10:55

intervention on a healthy

1:10:57

person, very often a child,

1:11:00

that is nevertheless

1:11:02

risky to some degree. Some

1:11:08

number of people are gonna have a bad reaction. Some

1:11:10

number of people are going to die from it.

1:11:15

I mean, this is just everyone's

1:11:18

worst, certainly every parent's worst nightmare. You

1:11:20

know, I let them stick a needle in my child

1:11:23

and he was never the same, or he died, right?

1:11:25

Like that's just, so you

1:11:27

just have to hear one story like that,

1:11:30

right? It doesn't even have to happen to you. It's just, it

1:11:32

could be a friend of a friend of a friend. You hear

1:11:34

this story and you think, man, it's

1:11:36

just, it's not worth it. Like I just, I

1:11:38

don't, you know.

1:11:41

In the case of childhood

1:11:44

illness,

1:11:45

you know, infectious disease, you can, as

1:11:48

basically everyone who doesn't get their kids vaxxed

1:11:51

does, you can just be a free rider

1:11:53

on this larger system of herd immunity.

1:11:55

You can say, listen, most people are gonna be vaccinated

1:11:57

for measles. I don't even have to get my kid

1:11:59

vaxed. Like I just don't, I'm not gonna run this

1:12:02

risk. I'm just gonna opt out.

1:12:05

And so he's intersecting with totally

1:12:08

understandable fears that get, wherein

1:12:11

specific anecdotes, specific

1:12:13

stories get amplified

1:12:16

to the, to the, I mean, they're above data. They're

1:12:18

more important than data. You can show me all the data

1:12:20

in the world. I know what happened to my

1:12:22

kid, right? That's, again,

1:12:25

scientifically, that's all upside down,

1:12:28

but

1:12:28

it is so compelling that-

1:12:30

What should we do though

1:12:33

with people that are in that situation? Because

1:12:35

for me, if a parent doesn't

1:12:37

wanna vaccinate their child, I do not think you

1:12:39

should be able to force them. Even at the height

1:12:41

of COVID, where I was like,

1:12:43

when I really believed everybody just needs

1:12:45

to go get vaccinated and some people were like, I

1:12:48

don't wanna do it. I was like, word, then fine.

1:12:50

Like I, that just feels, so

1:12:53

it felt wrong to me. And this is where it feels

1:12:55

like everybody needs to have a moral compass. And

1:12:58

part of where I think the breakdown is

1:13:00

happening, I've heard you refer to something as a great

1:13:02

unraveling. Now, I don't know what you mean

1:13:04

by that, but I started mapping out what I thought you meant

1:13:06

by that. And one of the things

1:13:09

that I would certainly mean by a great unraveling is that

1:13:11

we don't have these shared morals anymore.

1:13:13

We don't have one religion

1:13:16

to carry us through. And you know, cause

1:13:18

what I think ends up happening, and the thing that you

1:13:20

and I have been talking about without really talking about is

1:13:23

this is a battle for the truth. If things were

1:13:25

clear, they'd be clear. Like if we really

1:13:28

knew, like vaccines don't cause

1:13:30

autism, like

1:13:32

if vaccines

1:13:35

caused autism, it'd be very clear. You just see

1:13:37

it, boom, done, right? So it's in some

1:13:40

sort of weird, like maybe it

1:13:42

does. Like there's enough credibility

1:13:44

there that people can still buy into it. There

1:13:47

isn't enough just like unequivocal

1:13:50

evidence in the other direction. People go, I've looked at this

1:13:52

because if it were true, I could just show you this

1:13:55

and I could show you this. And then people would

1:13:57

go down the line. Nobody's arguing about

1:13:59

whether what you eat impacts what

1:14:01

you shit, right? Everybody just gets it. Now

1:14:03

I don't have to tell anybody, I don't have to go convince people.

1:14:06

It's just like your life is

1:14:08

such proof that there is a one-to-one

1:14:11

relationship between what you put in your

1:14:13

mouth and what comes out the other end. So

1:14:15

there is some weird gray area. So

1:14:18

the question to come- And it's mostly gray area

1:14:20

for most things. Right, so now if

1:14:22

we know we're living in this area where everything is gray,

1:14:24

nobody knows who's gonna be the expert, you

1:14:27

started the conversation by saying,

1:14:29

okay,

1:14:29

we really do need experts, but

1:14:32

no joke, like nine words into your sentence,

1:14:34

you were caveating that experts

1:14:37

have sort of thrown away their credibility. And so it's

1:14:39

like, that is the world that we live in. Like this

1:14:41

stuff is so complex. So

1:14:44

the thing that we have to take on in

1:14:46

head-on collision is, how do

1:14:49

we discover what is true?

1:14:52

Well, if you do it the way RFK

1:14:54

Jr. is doing it for climate,

1:14:57

like you notice that you can- You

1:14:59

can find any- Believe the preponderance

1:15:02

of experts. Yeah, it's like, it's not an accident that

1:15:05

almost all climate scientists,

1:15:08

right? I mean, there's a general principle

1:15:10

you have to understand here is that it's

1:15:12

always possible to find a handful

1:15:15

of PhDs or MDs

1:15:18

who are crazy or conflicted,

1:15:21

or just for whatever reason disposed to

1:15:24

stake out a genuinely

1:15:27

disreputable and

1:15:29

indefensible thesis. You could also, the cigarette

1:15:31

companies could always find somebody

1:15:35

with a seemingly relevant degree to

1:15:37

say, I don't think smoking causes cancer.

1:15:40

I don't think it's addictive. But I like it. You could

1:15:42

find that guy. And then that guy would sell

1:15:44

his wares to

1:15:46

the chemical companies that are

1:15:49

putting fire retardant in mattresses. And

1:15:51

he could say, well, I don't think it

1:15:54

gets into the bloodstream, it's not a problem, right? So

1:15:57

you can always find those people. So...

1:16:00

I'm not saying we

1:16:03

are always ruled by scientific consensus because

1:16:06

there are genuine breakthroughs in science that

1:16:09

overturn even a 99.9% consensus, right?

1:16:13

But scientific

1:16:15

consensus is still

1:16:18

salient and still matters most

1:16:20

of the time. And it's not arrived at by

1:16:22

accident. And there's so much tension

1:16:25

in science

1:16:26

to disprove other

1:16:28

scientists, right? It

1:16:31

is such a competitive atmosphere

1:16:33

that,

1:16:36

again, there are studies that don't get replicated,

1:16:38

there's false ideas

1:16:41

that survive far longer than you think they would. But

1:16:44

generally speaking, you are

1:16:46

not going to go wrong most of the time

1:16:49

by lining up with what 99% of specialist

1:16:54

X thinks on this very

1:16:57

specialized topic. So,

1:17:00

RFK Jr.

1:17:02

plays that very center of the fairway game

1:17:08

on the topic of climate, and

1:17:10

he does something completely different when he's talking

1:17:13

about medicine. Now, I don't know,

1:17:15

maybe he has a story that reconciles

1:17:17

that difference, but

1:17:21

we need, yes, we need a healthy

1:17:23

institutional and

1:17:26

scientific conversation such that good

1:17:28

ideas

1:17:29

generally survive and

1:17:32

bad ideas are generally debunked, and

1:17:34

that we know that most of the time,

1:17:39

our experts are real experts, they got

1:17:41

that, they acquired

1:17:44

their expertise by a process that was

1:17:46

going to weed out the

1:17:48

imbeciles and the delusional, and the,

1:17:50

you know,

1:17:54

deliver somebody who really is,

1:17:58

is

1:18:01

arriving at their opinions

1:18:03

on the basis of a methodology

1:18:06

that we generally

1:18:08

can trust, right? They're not

1:18:11

obviously conflicted by- Can we

1:18:13

lay out that methodology? Well,

1:18:16

you're on guard for obvious

1:18:18

cognitive bias and wishful thinking

1:18:21

and certainly bad incentives,

1:18:23

right? So, understand, it's like, yes, if

1:18:26

RJ Reynolds is funding research

1:18:28

into the toxicology of cigarettes,

1:18:31

right?

1:18:32

It's not to say that

1:18:34

obviously conflicted money

1:18:36

is always gonna fund a study that

1:18:39

is false, right? I mean, you could run it. It's

1:18:41

not, wouldn't be hard to run a totally

1:18:44

valid study where the money came

1:18:46

from, you know, what

1:18:49

would classically be considered the wrong

1:18:51

source, but it's easy

1:18:53

to see that there's, at

1:18:56

least the optics are bad enough that that's

1:18:58

not how you wanna fund that particular science,

1:19:00

right? And

1:19:03

at minimum, scientists have to declare

1:19:05

any economic interest they had in any

1:19:07

part of this, you know, part of this picture.

1:19:10

But the truth is,

1:19:12

the science is deeply flawed,

1:19:16

and yet it's better than any other

1:19:18

part of culture with respect to how we play

1:19:20

this game of just letting ideas collide

1:19:22

against one another and seeing

1:19:25

what survives. That I agree with. So

1:19:27

the problem is I don't feel like that's what's

1:19:29

happening or what's being championed.

1:19:33

So broadly, and

1:19:35

then we can get specific to you and I and exactly

1:19:37

what we're saying. But

1:19:39

the way that I think

1:19:41

about this is you've

1:19:43

got even something like

1:19:46

science.

1:19:48

If you talk to Eric Weinstein, he talks about

1:19:50

the disc, I think it's the Distributed

1:19:52

Information Suppression Complex. So

1:19:55

he talks about how there is,

1:19:57

for whatever reason, just inherent in

1:19:59

the world of science, there's a certain bias,

1:20:02

certain ideas they don't want getting out because people

1:20:04

have built their entire careers on something and if you're putting something

1:20:06

forth that challenges it, not necessarily

1:20:08

that they're being evil, but it's the same kind of idea

1:20:11

of the cigarette guy is gonna see what

1:20:13

he wants to see. And the guy whose entire

1:20:15

career collapses if your new idea is right,

1:20:18

well, magically the peer review that you get

1:20:20

is terrible and he's got a laundry

1:20:23

list of things like that. And

1:20:25

so I'll back up. So in

1:20:27

a business context, I created something

1:20:30

called the physics of progress. And it was

1:20:32

me trying to teach to my students exactly

1:20:34

how you solve novel problems. So I was like, hey,

1:20:36

if you wanna grow a business, I have no

1:20:38

idea what the product is, how the

1:20:40

audience is gonna respond, what the market

1:20:43

situation is gonna be like. So you really have

1:20:45

to understand how to think through a

1:20:47

new business, new product, new era, new market

1:20:50

dynamics, whatever. And the way that you do

1:20:52

that is physics of progress. And I lay this whole thing out

1:20:54

and I'm super proud of it and I'm pitching it

1:20:56

to my team. And I'm like, okay,

1:20:59

you're gonna start with where are

1:21:01

you trying to get to? What's your goal? You're gonna identify

1:21:03

the obstacle between where you're at and

1:21:06

your goal. And like, why won't I just automatically

1:21:09

achieve my goal? Then you're gonna come up with

1:21:12

an experiment that you can run, a thing

1:21:14

that you can do to try to overcome that

1:21:16

obstacle. You're gonna do that thing, you're gonna look at the data,

1:21:19

figure out whether you made meaningful progress. You're

1:21:21

gonna then re-inform your hypothesis about how

1:21:23

to overcome that obstacle and you're gonna start over. And

1:21:26

one of the guys in my team goes, that's

1:21:29

the scientific method.

1:21:30

And I was like, is it? I actually don't know the

1:21:33

scientific method. And he's like, yeah, that's the scientific method.

1:21:35

And I was like, okay, that makes sense.

1:21:38

To me, that's it. The reason

1:21:40

that I called it the physics of progress, and again,

1:21:42

just completely acknowledging it's the scientific

1:21:44

method, but the reason that it

1:21:46

occurred to me as a physics of progress

1:21:49

is because it is the only way to make

1:21:51

progress when you're not gonna know the answer.

1:21:54

You're just taking your best guess. You know where

1:21:56

you're trying to get to. You have a guess about what the

1:21:58

thing is that's stopping you. You're gonna try

1:22:00

something, the problem is, and when I teach this,

1:22:03

the big issue is,

1:22:05

going back to what I

1:22:07

was saying about truth, truth is perception,

1:22:09

interpretation, and reinforcement. And

1:22:12

at the moment you look at the data, so

1:22:14

I ran my test to see if I could overcome that

1:22:17

obstacle, I get a result. When

1:22:19

I look at that data, I'm bringing

1:22:21

my perception, my interpretation,

1:22:24

and my reinforcement to that. And

1:22:26

it's not that I'm evil, but I'm not necessarily

1:22:29

going to see what's true. And this is

1:22:31

where science then begins to break down. It is

1:22:33

the right answer. Like it is what we need to do. As

1:22:35

far as I can tell, it's the only way to make

1:22:38

progress in anything. But

1:22:40

what we're living through right now is

1:22:42

that moment of the interpretation,

1:22:45

the perception, and the reinforcement

1:22:48

causes you to see something that's

1:22:50

not actually there. You're looking

1:22:52

at the world through a fun house mirror. And so,

1:22:55

what- Sorry, one

1:22:57

second, let me bring this all together. The one thing

1:22:59

that I live in perpetual fear of is

1:23:01

that you have the guy, the doctor,

1:23:04

you'll probably remember his name, I don't, who

1:23:06

was like, you know what? I think after you do an autopsy,

1:23:09

you need to wash your hands before you give birth to somebody.

1:23:11

Yeah, Semmelweis, yeah. What, Semmelweis? Semmelweis,

1:23:14

yeah. Semmelweis. So Semmelweis

1:23:16

goes and tells people this, hey, I think this is

1:23:18

causing the death of mothers. We

1:23:20

really have to start washing hands. People make fun of

1:23:22

him, lambast him, drive him into

1:23:25

an insane asylum where he dies

1:23:28

before it's discovered, oh yeah, germ theory, he

1:23:30

was right. And so, that's

1:23:32

how wrong this goes.

1:23:34

I don't think that humans have changed. I

1:23:36

think that we still

1:23:37

have that reaction where it's like,

1:23:41

they're not necessarily even trying to be mean.

1:23:43

It just doesn't make sense to them.

1:23:46

And it would cause all these changes and

1:23:49

we don't really know that this is a thing. And

1:23:51

so, to me, the

1:23:53

people that want to make the

1:23:56

decisions, they lack the humility

1:23:58

to make the decisions, to recognize that

1:24:01

the odds of me being wrong

1:24:03

border on 100%. Not on everything,

1:24:07

but when you take everything in totality, you're

1:24:09

going to be wrong and you just don't know about which

1:24:11

things. And so if I'm thinking,

1:24:13

okay, you've put forth your idea, we need these,

1:24:15

we need experts, we need an institutional

1:24:18

response that we can trust. My thing

1:24:20

would be the closest thing I could imagine

1:24:22

to that, you have to red team, blue team. You've

1:24:24

got to get

1:24:26

somebody like RFK Jr. who really

1:24:28

believes this is a problem, this

1:24:30

is really causing autism. And

1:24:33

then you debate it

1:24:35

with data that you

1:24:37

predict ahead of time. So it

1:24:39

comes down to, okay, what was your prediction? What did

1:24:41

you think was going to happen? Then you run the experiment.

1:24:43

Did it actually happen? Yes or no. And then both

1:24:45

sides, because when you look at the data, is the

1:24:47

point at which you're most likely to make your

1:24:50

errors, bring your biases, all that. You look

1:24:52

at the data and then you try to go

1:24:54

with a consensus. Now,

1:24:57

I don't see any way, and climate is a great

1:24:59

one to talk about, I don't see any way to stop

1:25:02

all kinds of prolonged

1:25:05

debate.

1:25:06

But then you hope that when there really is

1:25:08

evidence, that it starts to be just

1:25:10

one

1:25:11

by one, all the detractors

1:25:13

start falling away, it just becomes too self-evident.

1:25:17

And then you really can get something approaching

1:25:20

consensus in action.

1:25:23

Yeah, well, so

1:25:26

let me see if I can just isolate what we're

1:25:28

disagreeing about here, because you

1:25:32

seem to be imagining that we can have

1:25:34

an information landscape, whether let's say it's

1:25:36

on a platform like Twitter or

1:25:39

YouTube,

1:25:40

where it's

1:25:42

as flat as, there's as much,

1:25:46

no one's doing anything deliberate to

1:25:49

tune

1:25:50

the signal to noise ratio, right?

1:25:52

Because to do that would be to be biasing it

1:25:54

based on- The honest answer is I think people are hyper

1:25:56

biased.

1:25:57

So that is factored in.

1:25:59

While I didn't talk about that, that

1:26:02

has factored into how I think you have to let

1:26:04

the ideas battle so that the wiser,

1:26:08

more eloquent fighters,

1:26:12

and I think this is probably a lot

1:26:14

of some people's

1:26:16

pushback on you stepping back

1:26:18

somewhat, exiting Twitter, as

1:26:20

they were like, you were the guy I counted on to be able to throw

1:26:22

and dodge punches and bring humor

1:26:25

and all of that.

1:26:27

And so we've lost one of the great

1:26:29

fighters of this. And

1:26:32

that gets hard because I don't

1:26:34

see myself as talented enough

1:26:37

in the arena of public opinion to do that.

1:26:39

So you have to have people that can

1:26:42

dismantle these ideas. Like I've seen you on

1:26:44

stage do this,

1:26:46

particularly with religion, which

1:26:48

is outside of where we're at right now in the conversation,

1:26:51

but where you have been,

1:26:53

you've been funny and

1:26:56

it's shareable. The clips themselves

1:26:59

are amazing because they hit and they shake you up,

1:27:01

but they're easy to transmit and remember. And

1:27:04

so when you find a great

1:27:06

orator, Douglas Murray is another guy that

1:27:09

can really do this.

1:27:14

That feels to me, and I know you've

1:27:16

sort of, you keep saying I'm not the guy to

1:27:18

think through COVID or whatever. And I'm like,

1:27:21

ah, actually. So as

1:27:23

long as it's people who

1:27:26

are disagreeing respectfully, who

1:27:28

care about truth, you have

1:27:30

to as the individual care about truth and

1:27:32

you have to not pride yourself on being right. You have

1:27:34

to pride yourself on identifying the right answer. But

1:27:38

I have a feeling that

1:27:41

experts almost need that external

1:27:44

panel of people who are like, I'm not

1:27:46

invested in this, but I know how to think through

1:27:48

novel problems. Here's how I'm parsing

1:27:50

this data. Let me ask you really

1:27:53

pointed questions, give me your feedback,

1:27:55

and then I'll triangulate on an answer. That

1:27:58

feels like, in the reality

1:27:59

of how ideas win at scale,

1:28:02

that feels the closest to true. But

1:28:05

it requires that people be able to say whatever

1:28:08

the hell they want, that they don't get

1:28:10

booted, kicked off, silenced, whatever.

1:28:13

Well, but so let's just take that last

1:28:15

claim first.

1:28:16

All of these platforms have to

1:28:18

kick people off for

1:28:21

specific violations of their

1:28:23

terms of service, right? You need some terms of

1:28:26

service. As far as I know, even

1:28:29

4chan has a terms of service, right?

1:28:31

I think maybe 8chan doesn't, but like if

1:28:34

you want to be more extreme than 4chan,

1:28:36

you have to go somewhere else. So

1:28:38

I will give you that barriers must be put up. Now where

1:28:40

we put the barriers, we are going to- But the

1:28:42

moment you can see that, right?

1:28:45

Then you recognize that there's

1:28:48

absolutely nothing novel about what Elon

1:28:50

is doing on Twitter. He's just, he's

1:28:52

just biasing it in a way that he likes

1:28:54

better than the previous bias, right? So

1:28:57

he brought Kanye on knowing he was an anti-Semite

1:29:00

and then Kanye did something and he kicked him off because

1:29:02

he realized, okay, I can't really have this happening

1:29:04

on my platform. And

1:29:07

so, I mean, he's cozying

1:29:10

up to sort of QAnon lunatics

1:29:12

and he may not even know who he's signal

1:29:14

boosting. He's just like, you know, glad handing somebody

1:29:17

who said something he thought was clever, sent him

1:29:19

a meme he thought was clever. And he's actually signal

1:29:21

boosting somebody who's just odious

1:29:23

in their ideological commitments and

1:29:25

in their line about everything under the sun, right? And

1:29:28

I'm not saying Elon's actually paying attention to all

1:29:30

that,

1:29:32

but he's doing something incredibly

1:29:34

ad hoc and sloppy and

1:29:37

it's still not free speech

1:29:39

absolutism, right? Free speech absolutism

1:29:42

just doesn't exist. It doesn't even exist on 4chan.

1:29:45

I mean, as far as I know,

1:29:47

I have this on just good faith because

1:29:50

I'm never on,

1:29:52

I don't think I've been on 4chan ever, but I

1:29:55

think it has a terms of service and that's

1:29:57

why 8chan was spawned. It's like, in

1:30:00

protest over the

1:30:03

puritanical control

1:30:05

on 4chan. So

1:30:09

what you seem to be recommending is

1:30:11

the 4chanification or the 8chanification

1:30:14

of everything. And what that would be, in

1:30:16

my view,

1:30:17

if that were happening on YouTube and TikTok

1:30:19

and Twitter and threads

1:30:22

and everywhere else,

1:30:24

it would be a maximally

1:30:26

noisy,

1:30:29

uncivil space. So

1:30:31

it would get hard, insofar as

1:30:34

you achieve that ideal of

1:30:37

no libtard institutional

1:30:39

bias, we're just gonna let

1:30:41

it rip. Anyone with

1:30:44

anything to say gets equal chance

1:30:46

to say it.

1:30:49

What you're gonna have there is just pure

1:30:52

cacophony,

1:30:54

and it's gonna be harder and harder to find the

1:30:56

signal in the noise. So

1:30:58

the moment you admit that, you admit

1:31:01

you're in the business of favoring

1:31:03

certain voices over others, platforming

1:31:05

and even deplatforming

1:31:07

people when they prove

1:31:10

on the 10th infraction that they're

1:31:12

truly beyond the pale and just committed to making

1:31:14

a mess. And so my leaving

1:31:16

Twitter was just much more of a personal decision.

1:31:21

It wasn't a decision that it was just a decision

1:31:24

about how I was gonna spend my time and attention ultimately.

1:31:26

But the reason why

1:31:28

I didn't see a

1:31:31

benefit to my staying

1:31:33

there is that it's just

1:31:35

the wrong space in which to try

1:31:38

to have a conversation, a conversation

1:31:41

that converges on anything useful on

1:31:43

these kinds of topics, really any polarizing

1:31:45

topic. Because I tried, I mean it was

1:31:48

the only social media platform that I ever used personally.

1:31:51

I mean, I'm on others, but those are just marketing channels

1:31:53

and I never see them. But it was really

1:31:55

me on Twitter. I was really trying

1:31:58

to make sense to people. What

1:32:00

I was getting back was just this

1:32:02

tsunami of bad faith attacks

1:32:05

and craziness and

1:32:07

what was actually exploding

1:32:10

was not just, you know, headlines

1:32:13

that were false but, like, you know, in

1:32:15

the final case, it

1:32:17

was a clip of me on another podcast

1:32:20

which was genuinely

1:32:22

misleading as to what I said

1:32:24

in context on that podcast. But

1:32:26

it simply didn't matter because the clip itself

1:32:29

seemed to be, it seemed intelligible enough,

1:32:31

it seemed clear enough what I was saying within the clip.

1:32:35

That people just they

1:32:36

didn't even want to hear that there was a context.

1:32:38

The Hunter Biden thing? Yeah, I saw that myself.

1:32:41

So they didn't, like literally, they didn't

1:32:43

want to hear it. And even...

1:32:46

Like no one has the bandwidth

1:32:48

to go back and find the context

1:32:51

for the thing that they just reacted to that

1:32:53

just you know primed

1:32:55

all their you know satisfied all their salt

1:32:58

and sugar and fat receptors. And

1:33:01

what so when you ask what my principles

1:33:04

are, my general

1:33:07

principle is to be very on guard

1:33:09

for doing that sort of thing myself,

1:33:12

so like even when there's somebody

1:33:14

who

1:33:16

I know I understand

1:33:19

and

1:33:20

revile appropriately

1:33:22

somebody like Trump, I'm

1:33:24

still on guard for the clip of

1:33:27

him that is actually misleading, right?

1:33:29

And I will actually defend him as

1:33:32

much as I, you know, find him indefensible.

1:33:34

I have burned a fair amount of, you know, reputational

1:33:37

capital now over here on the left

1:33:40

by defending him on specific

1:33:42

points when I felt that the attack on him was just

1:33:45

based on lies, right? So like when

1:33:47

he, when he gave

1:33:50

a press conference after Charlottesville, said

1:33:52

that, you know, there were good people on both sides, and

1:33:55

he seemed to be saying in that clip that

1:33:58

the neo-Nazis are good people, right?

1:33:59

And that was spun from, that

1:34:02

was spun politically from, you know, everyone

1:34:04

from Biden on down, it was spun by the New

1:34:06

York Times. I mean, literally, I would

1:34:08

have,

1:34:09

if I had to bet

1:34:11

95% of the commentary

1:34:14

at left of center still

1:34:17

thinks he was praising neo-Nazis

1:34:19

in that, in that press conference. And yet within

1:34:22

the, within the context of the press conference, it's absolutely

1:34:25

clear that he's not doing that, right?

1:34:27

He says, you know, within 20 seconds

1:34:30

of the clipped piece, he says, you know, I'm not talking

1:34:32

about the white supremacists and the neo-Nazis,

1:34:35

I'm talking about other people who were there for,

1:34:37

you know, who were just worried about, you know, monuments

1:34:40

getting torn down.

1:34:43

And so

1:34:46

I think we have to acknowledge that

1:34:48

there's, we're in a

1:34:50

media landscape now where

1:34:53

people are being reliably misled by clips

1:34:56

that,

1:34:56

so the

1:34:59

underlying ethic here is that when

1:35:01

people are arguing in a partisan way, they

1:35:04

don't really care

1:35:07

what their enemy has

1:35:09

said or meant to say.

1:35:11

What they want to do is they want to hold them

1:35:14

to

1:35:14

the worst possible

1:35:17

interpretation of what they said and

1:35:19

make that stick, right? And the game

1:35:21

is just, see if you can make it stick. See

1:35:23

if you can, and so I've long, I've

1:35:26

made this claim,

1:35:27

you know, for years now, and this is

1:35:30

more based

1:35:32

on what happens to me from the left as opposed

1:35:34

to the right. I mean, the example we're talking

1:35:36

about now was sort of defamation coming from

1:35:38

the right. But,

1:35:41

you know,

1:35:43

I've made this point before and this

1:35:45

is, you know, it's an inconvenient point

1:35:47

to make because like even this can get clipped out

1:35:50

to my disadvantage, but it's just, it's worth saying because

1:35:53

this shines as bright as possible

1:35:55

a light on what everyone is doing and many

1:35:58

good people are doing it kind of unthinkingly. But

1:36:01

so I'm living, I've long known, for

1:36:03

at least eight years or so,

1:36:05

I've known that I'm in an environment where if

1:36:08

I say on my podcast,

1:36:10

black people are apes,

1:36:12

white people are apes, we're

1:36:15

all apes, racism doesn't make any

1:36:17

sense, right? There are

1:36:19

some considerable number of people

1:36:22

who will clip out

1:36:24

me saying black people are apes, or make

1:36:26

a meme black people are apes, Sam

1:36:28

Harris, right? And they will

1:36:31

export that to their channels,

1:36:34

apparently with a clear conscience saying

1:36:36

this is fair game, right?

1:36:38

And

1:36:42

that's the kind of people I was dealing with on Twitter. And

1:36:44

the kind of person who clipped that clip

1:36:46

from that podcast is exactly that sort

1:36:48

of person and he was being signal boosted

1:36:50

by lots of other people we could

1:36:52

name. And

1:36:55

so I just recognize that this

1:36:57

is a,

1:37:00

just it's just the wrong occasion to

1:37:03

try to have a conversation

1:37:05

with people. And it's built into the dynamics

1:37:08

of the system where people are incentivized

1:37:10

just to dunk on everybody, however

1:37:12

dishonestly, and then move on. And

1:37:16

part of the pathology I saw with Elon taking

1:37:18

over the place was not so much what he

1:37:20

was doing to the place as its owner, but

1:37:23

just how he was behaving personally

1:37:25

on the platform himself. I mean, he was doing the same

1:37:27

thing. One of the first things he did

1:37:29

after he took over

1:37:31

Twitter

1:37:32

was he spread

1:37:34

this meme about Nancy Pelosi's husband, after

1:37:39

the hammer attack on him. It's like, it's not what

1:37:41

it seems. This could have been a gay tryst

1:37:44

gone awry. And he linked to a website

1:37:46

that had an article to

1:37:49

that effect. This was a website

1:37:52

which during the 2016 election

1:37:56

claimed that Hillary Clinton was

1:37:58

dead and that a body double was campaigning

1:38:01

in her place, right? So Elon links

1:38:03

to that as a source, right? In front

1:38:05

of, at that point, probably 110 million people, amplifying

1:38:09

a completely crazy conspiracy theory that is

1:38:12

getting spun up in QAnon circles.

1:38:16

And then when that gets pointed out to him, just how

1:38:19

wrong all of that was, and

1:38:22

how irresponsible it was,

1:38:24

he never corrects the record, he never apologizes,

1:38:27

he never changes his

1:38:29

appetite for doing that again. In that

1:38:32

case, I think he just deleted the tweet, right, and moved

1:38:34

on. And so it's like

1:38:36

even people who reputationally

1:38:39

have a tremendous amount to lose by

1:38:41

behaving that way, you would think, are

1:38:44

goaded into behaving that way because of the

1:38:46

mechanics of the platform. And so, you know,

1:38:49

for me personally, I simply don't

1:38:51

understand how people have

1:38:54

audiences that will

1:38:56

still follow them after they

1:38:59

prove that they don't care

1:39:01

to make any of these distinctions. I mentioned Tucker

1:39:03

Carlson. The fact that behind closed

1:39:05

doors, he's saying that Trump is a demonic

1:39:08

force,

1:39:09

and then in front of the camera, he's

1:39:11

basically messaging to Trumpistan 85%

1:39:13

of the time, you know, in a very

1:39:16

supportive way.

1:39:18

I don't understand how, like in

1:39:20

front of my audience, if a similar

1:39:22

thing were revealed about me,

1:39:25

my

1:39:27

audience would just completely

1:39:29

disavow me. I mean, it would be no, it

1:39:33

would be a complete breach of trust with

1:39:35

my audience. And that's the way I think it should

1:39:37

be. But Alex Jones, I mean,

1:39:39

like Alex Jones has an audience of some

1:39:41

tens of millions of people in

1:39:43

the aftermath of Sandy Hook,

1:39:46

right? You know, like he has lied

1:39:48

and lied and lied about Sandy

1:39:51

Hook being a false flag, fake,

1:39:53

you know, confection of the

1:39:56

Obama administration. You

1:39:59

know, they're all crisis actors. The kids never

1:40:01

died or like, I mean, it's like, I don't,

1:40:04

I don't know how deep

1:40:06

those claims went, but some version

1:40:08

of that, right? Like this is, none of this is

1:40:10

as it seems.

1:40:13

He created immense harm, demonstrable

1:40:15

harm to all the Sandy Hook families. I mean, these are families,

1:40:18

in many cases, they've had to move home.

1:40:21

Some have moved homes 10 times since

1:40:24

he started spreading those lies about them.

1:40:27

All of this gets kind

1:40:29

of,

1:40:30

you know, forensically documented at

1:40:32

trial. He gets a billion dollar

1:40:35

judgment against him.

1:40:37

How does he still have an audience? Or like,

1:40:39

who are these people who are still listening

1:40:41

to him? Is that legitimately confusing? Cause that one

1:40:44

to me is very clear. He's entertaining.

1:40:47

All these people are mentally ill or like, what? Cause

1:40:49

I don't, look, I don't know Rogan

1:40:52

well at all, but

1:40:55

Rogan keeps having him on. I don't know if he has

1:40:57

recently, but he thinks he's funny.

1:40:59

Since then, but yeah, that I

1:41:02

don't know. But certainly my take

1:41:04

is that people find him amusing.

1:41:06

He's funny and he's gotten enough weird

1:41:09

stuff right. That

1:41:11

people are like, all right, look, he missed. I

1:41:13

don't, I know literally nothing about the Sandy

1:41:15

Hook stuff. So I'm certainly not defending

1:41:17

that. I'm just saying, I understand the phenomenon.

1:41:20

The phenomenon is that,

1:41:22

that this is,

1:41:24

this is an age where the algorithm

1:41:26

is a big part of how information spreads. And so

1:41:29

the reason that certain people become the voice

1:41:31

is that they also are able to speak in

1:41:33

a way that people find really compelling,

1:41:36

entertaining, engaging. And so

1:41:38

that person is going to keep going. Alex

1:41:40

has a way of delivering information that's

1:41:43

zany. It's crazy. It's over the top.

1:41:45

You can't believe it's real. He's funny. He's

1:41:47

fun to make fun of. And there

1:41:49

are enough things that people like, I mean,

1:41:52

he gets memed, right? It's like another

1:41:54

crazy thing that he said five years ago just

1:41:57

came true. And so he's a bit

1:42:00

like the Simpsons

1:42:03

in that people would be like, yo, Simpsons 12 years ago

1:42:05

predicted this thing. You're like, that's insane. So

1:42:08

same thing with Alex. But now that doesn't

1:42:10

mean that he isn't a destructive

1:42:12

force. It just means that it's very

1:42:15

easy for me to see why he's entertaining

1:42:17

enough that people are gonna keep

1:42:19

going back to him. I'll grant you the

1:42:21

entertainment component, especially in his case, but

1:42:24

in his

1:42:26

case, perhaps to a lesser degree, but

1:42:29

certainly in Tucker's case,

1:42:31

there's this pretense of, I'm

1:42:33

just giving you honest information, right? I'm just

1:42:35

calling it as I see it. Like, this is just, like,

1:42:38

what you see is what you get. There's not like, this is

1:42:40

just, this

1:42:43

is,

1:42:44

there's a fundamental integrity to the

1:42:47

message. That's what his audience seems

1:42:49

to think they're getting. And then we

1:42:51

know that he is a completely

1:42:53

different person

1:42:55

behind closed doors.

1:42:57

Monday through Friday, if I'm awake, I'm either working

1:42:59

or working out. So I know intimately

1:43:01

that you need to keep your energy levels high

1:43:04

if you wanna do something extraordinary.

1:43:07

Getting all the nutrients that you need to perform at

1:43:09

your best is one of the things you're gonna need to

1:43:11

do if you want to perform at an optimal

1:43:13

level. If your mission is achieving

1:43:15

excellence, you've got to support your body

1:43:18

and your mind, and AG1 is

1:43:20

here to help. One of the great things about AG1 is

1:43:23

it's whole food turned into supplement

1:43:25

form. And I think you guys could really

1:43:27

use it to optimize your daily performance. AG1

1:43:30

is packed with 75 premium

1:43:33

vitamins, minerals, and whole

1:43:35

food sourced ingredients that elevate

1:43:37

your immune system, uplift your mood, and

1:43:39

promote restful sleep. AG1

1:43:41

is offering a great deal for our listeners.

1:43:44

If you're looking for a simpler, effective

1:43:46

investment for your health, try AG1. And

1:43:49

get five free AG1 travel

1:43:51

packs and a free one year supply

1:43:54

of vitamin D with your first purchase.

1:43:57

Go to drinkag1.com/impact.

1:44:00

That's drinkag1.com slash impact.

1:44:07

Be sure to check it out.

1:44:09

I really can't say this enough times. You're

1:44:11

never going to reach peak cognitive performance

1:44:14

if you're not taking care of your body. But

1:44:16

working out, if you don't know what you're doing,

1:44:19

can be intimidating, disheartening, and can even

1:44:21

cause injury. Future is a fitness

1:44:23

app that pairs you with your own highly

1:44:25

credentialed personal trainer who will build

1:44:27

your custom workouts that you can

1:44:29

complete on your own time whenever. It

1:44:31

starts with a video call with your trainer

1:44:34

who will discuss your goals and design an equipment

1:44:36

agnostic fitness plan unique to you.

1:44:39

Future's trainers help you stay accountable with

1:44:41

voice prompts, in-app messaging, and video

1:44:44

check-in calls. All so you can

1:44:46

improve your health and gain the discipline and

1:44:48

clarity to reach the rest of your goals.

1:44:50

If you finally want to make a change, then

1:44:53

I have a special offer for you. Use my

1:44:55

link to get your first month with a Future

1:44:57

trainer for only $19. Think

1:44:59

of what you can accomplish during that

1:45:01

month. Go to tryfuture.co

1:45:05

slash impact to get started.

1:45:07

And until next time, be legendary.

1:45:10

Thanks for tuning in to another episode of Impact

1:45:12

Theory. I hope you guys found today's conversation

1:45:14

insightful, inspiring, and

1:45:17

useful. If you want to support the show and stay

1:45:19

up to date with all of our latest episodes,

1:45:21

the easiest way to do that is by following

1:45:24

us on your favorite podcast platform. Whether

1:45:26

you're listening on Apple Podcasts, Spotify,

1:45:29

Amazon Music, or any other platform, just

1:45:31

hit that follow button to never miss

1:45:34

an episode. And if you really love the show

1:45:36

and want to help us spread the word, leave

1:45:38

us a review and share the podcast

1:45:39

with your friends and family. But that's not

1:45:42

all. You can now listen to all Impact

1:45:44

Theory episodes completely ad-free.

1:45:46

We've also curated amazing playlists

1:45:49

on critical topics like health,

1:45:51

mindset, business, and relationships.

1:45:54

Click through the show notes to subscribe on

1:45:56

Supercast, which allows you to

1:45:58

listen on any preferred podcast

1:45:59

player. Subscribe now and let's

1:46:02

continue to learn, grow, and

1:46:04

make a legendary impact

1:46:05

together.
