433. Streaming, Politics, & Philosophy | Destiny (Steven Bonnell II)

Released Thursday, 21st March 2024
 1 person rated this episode

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:10

Hello, everyone! I'm here today talking

0:12

to Steven Bonnell, known

0:14

professionally and online as Destiny.

0:16

He's an American streamer, debater,

0:20

and political commentator. He

0:22

really came to my attention, I would say, as

0:25

a consequence of the discussion he had with

0:27

Ben Shapiro and Lex Fridman, and I decided

0:29

to talk to him, not least because it's

0:31

not that easy to bring people who are

0:34

identified at least to some degree with their

0:36

political beliefs on the left into a studio

0:38

where I can actually have a conversation with

0:41

them. I've tried that more often than you

0:43

might think. It happens now and then,

0:45

but not very often. So today, we talk

0:47

a lot about, well,

0:49

the differences between the left and the right

0:52

and the dangers of political ideology per se

0:54

and the use of power as opposed to

0:56

invitation, and all sorts of other

1:00

heated, often heated, and

1:02

contentious issues. And so

1:05

you're welcome to join us, and I was

1:07

happy to have the opportunity to do

1:09

this. So I guess we

1:11

might as well start by letting the people who

1:13

don't know who you are get to know who

1:15

you are with a little bit

1:18

more precision. So why

1:20

have you become known, and how

1:22

has that developed? That's a

1:25

pretty broad question. Like,

1:27

I started streaming maybe fifteen years ago, when it

1:29

wasn't really a thing. There were a

1:31

few people that did it. I started early

1:33

on. I guess back then

1:35

professional gaming and esports had

1:37

just started to come out, and there was a

1:39

game called StarCraft 2, and I streamed myself

1:41

playing that game. I was a pretty good

1:44

player, it was pretty entertaining to watch, and then

1:46

that kind of took over for

1:48

maybe the next seven years, just streaming

1:50

that game for people to watch. Back then,

1:52

I started on a

1:55

website called Livestream, then moved to Ustream,

1:57

then moved to a site called Justin.tv.

1:59

And then that turned into Twitch.tv. So

2:03

after streaming there for like seven or eight years, I

2:05

was a semi-professional Starcraft 2 gamer. That game kind of

2:08

came and went, but I had a lot of other

2:10

interests. Around 2016, I

2:12

started to get more involved into the world of politics.

2:14

as kind of a left-leaning figure. Because

2:16

my background is like esports and internet

2:18

gaming and internet trash talk. I had

2:20

more of a kind of like a

2:22

combative attitude. And that was kind of

2:24

rare for left-leaning people at the time. So it's basically

2:27

where my early political popularity came from.

2:29

I think from like 2016 to 2018

2:31

was debating right-wing people. So was there

2:33

a game-like element to the debating, do

2:35

you think? And is that part of

2:37

why that morphing made sense? No,

2:41

I wouldn't say so. I mean, if you get

2:44

really reductionist, everything in life is kind of a game.

2:46

But it's not very satisfying. I

2:49

think I grew up like very argumentative. My mom

2:51

is from Cuba. So my family was like very

2:53

conservative. And then I grew up like listening to

2:55

the news all day, listening to my mom's political

2:57

opinions all day. And then I argued with kids

2:59

in high school and everything. And I've always been

3:01

kind of like an argumentative type A aggressive personality.

3:03

So I think that probably lent itself well to the political

3:05

stuff in 2016. Was that

3:08

useful in gaming? That

3:11

personality? In some ways,

3:13

yeah. In some ways, no. I

3:15

don't know directly for the games itself. I don't know

3:17

how much it necessarily mattered. But for all

3:19

the peripheral stuff, in some ways, it was really beneficial. I could kind

3:21

of like cut out my own path and I could be very unique.

3:23

And I could kind of be on my own. In

3:26

some ways, it was very detrimental. I'm very,

3:28

I can be very difficult to get along with. And I'm

3:30

very much kind of like I want to do this thing.

3:32

And if you try to tell me what to do, I

3:34

don't want to have like a sponsor or a team or

3:36

anybody kind of with a leash on me. So yeah, I

3:38

guess it worked out. It's interesting because the

3:41

temperamental proclivity that you're describing

3:43

that's associated with low agreeableness.

3:46

And generally, well, and that's more

3:48

combative. It's more stubborn. It's more

3:50

implacable. It's more competitive.

3:54

The downside is that it's more skeptical.

3:56

It can be more cynical. It

3:59

can be less cooperative. But generally a temperament

4:01

like that is associated with, is

4:03

not associated with political belief on

4:05

the left, because the leftists tend

4:08

to be characterized by higher

4:11

levels of compassion, and

4:13

that's not low agreeableness. So, you

4:15

know, that element of your temperament at

4:17

least is quite masculine, and a

4:20

lot of the ideology that

4:22

characterizes the modern left has a

4:24

much more temperamentally

4:26

feminine nature. So,

4:28

all right, so why do

4:30

you think the shift from your popularity

4:32

to political commentary worked? And

4:35

you said that started about 2016,

4:37

and why do you think that

4:39

shift happened for you, like in terms of

4:41

your interest? I think I've always

4:43

been interested in a lot of things. Like I grew

4:45

up with a very strong political bent. It was conservative

4:48

until I got into my streaming years, probably five

4:50

or six years of streaming. I slowly kind of started

4:52

to shift to the left. I

4:55

would say that I

4:58

guess in around 2016 when I saw all

5:01

of the conversations going on with the election and

5:03

with all the issues being talked about, I felt

5:05

like conversations were very low quality. And

5:07

in my naivety, I thought that maybe I could come

5:09

in and boost the quality, at least in my little

5:11

corner of the internet, to have better conversations about what

5:13

was going on. And so

5:15

that was basically my injection point into all

5:17

of that was, yeah, fighting about those political

5:20

issues and then arguing with people about them,

5:22

doing research and reading and all of that.

5:24

And so did you do that by

5:26

video to begin with as well? Yeah, it was all

5:28

streaming. Yeah, it was all streaming. And so I

5:31

presume you built an audience among the

5:33

people who were following you as a

5:35

gamer first, and then that started to

5:37

expand. Is that correct? Basically, yeah. Without

5:39

getting too much into the business or streaming

5:41

side of things, basically, actually, it probably carries

5:43

over to basically all media,

5:45

I would imagine, is you've got people that will

5:48

watch you for special events. So

5:50

maybe you're like a commentator for the Super Bowl,

5:52

or maybe you're hosting a really huge event. Then

5:54

you've got people who will watch you every time

5:56

you're participating in your area of expertise. So for

5:58

me, that's like a particular game I

6:01

might be playing, or it might be when

6:03

you're on like a particular show or something

6:05

that people watch. And then the

6:07

final, and like the best fans that you can

6:09

have, the most loyal viewers, I guess,

6:11

is an audience watching you basically no matter what you're

6:13

doing. And does that audience follow you

6:16

from area to area? For me, I think, because of the

6:18

way I did gaming, and I talked about a

6:20

lot of other stuff, like politics, science,

6:22

current events, whatever, I had a lot of loyal

6:24

fans that kind of followed me wherever I went.

6:26

So a lot of it's built on reputation. Yeah,

6:28

so how would you characterize your reach now? How

6:30

would you quantify it?

6:33

Can you be more

6:35

precise? How many people

6:37

are watching a typical video that you

6:39

might produce, and what are you doing

6:42

for subscribers, say, on YouTube, and do you

6:44

have any idea about total reach? Yeah, well,

6:46

I mean, I guess, my stats, on YouTube

6:48

I have around, I think that's right, seven

6:51

hundred seventy thousand on my main channel,

6:53

and I probably do, between all three

6:55

channels, I think around fifteen to twenty million views

6:57

a month. And then I livestream

6:59

to anywhere from five to fifteen thousand concurrent

7:02

viewers a day, for hopefully around

7:04

eight hours a day.

7:06

It gets a little crazy, of course.

7:08

Substantial reach. And so you said

7:10

that initially you were

7:12

more conservative-leaning, but that

7:14

changed. Okay, what

7:17

did it mean that you were more conservative-leaning? And

7:19

how and why did

7:21

that change? When I was young,

7:23

I was writing articles for

7:25

the school newspaper defending George Bush and the

7:28

Iraq War. I was very much, like,

7:31

I think it's like an insult

7:33

now when people say neocon, but I

7:35

was very much like a conservative, a Bush

7:37

era conservative. I also supported big business, supported traditional,

7:39

all of the conservative, I guess, foreign

7:41

policy, you know, hawkish policy,

7:43

or whatever that meant as like a

7:45

fourteen-year-old. I remember there was the

7:47

Elián González incident, that was very

7:49

big for Cuban Americans, where there was

7:51

a Cuban boy that tried to come to the United

7:54

States with several other people and his mother, and their

7:56

raft, I guess, capsized or something, the

7:58

mom died and some other people died, and

8:00

it was a huge debate on whether or

8:02

not to send him back to Cuba, and Clinton

8:04

ended up sending him back to Cuba, and

8:06

I know that my mom was super irritated about all

8:08

that. That was at least

8:10

then. But once I hit college, I

8:12

think it was Ron Paul in two thousand,

8:14

it would have been two thousand and eight. I was

8:16

one of the Ron Paul libertarian guys at

8:19

my school. When I went, I

8:21

went to a Jesuit high school, and

8:23

I kind of became atheist in that

8:25

process, started reading Ayn

8:27

Rand. Very, very, very, very, very conservative,

8:30

with the libertarianism.

8:32

I would say the libertarianism

8:34

initially, like, does make more sense in

8:37

relationship to your temperament. So it began,

8:39

you know, you could say, it was like

8:41

Christian conservative, and then it became like

8:43

libertarian conservative. Within my lifetime

8:45

I took like a wacky path, and

8:47

then as I started working, I kind

8:50

of dropped out of school, was working,

8:52

and then I got into streaming, and

8:54

once I started streaming, I had a son

8:56

right around the first year or so

8:58

of streaming, and as I started to

9:01

go through life, and I went from

9:03

kind of being in this working

9:05

poor position to making a lot of

9:07

money, especially with a young child,

9:09

I saw how different life was when

9:11

I had more money versus less, and

9:13

I guess like the differences between

9:15

what was available to me and my

9:17

child as I made more money. While

9:20

I wasn't really wealthy, not that wealthy,

9:22

it kind of started to change the way

9:24

that I saw it. So I see you're attuned to the consequences

9:26

of inequality. Basically, I would say

9:29

yeah. Okay, and

9:31

so, okay, how did that

9:33

lead you to develop more sympathy for

9:36

left-leaning ideas? Well,

9:39

I guess my core beliefs

9:41

have never really changed, but I think

9:44

the way those become applied kind of

9:46

changed, in much the same

9:48

way that you might think that everybody

9:50

deserves a shot to go to school and

9:52

have an education, that might be like a

9:54

core belief, whereas as a libertarian or conservative

9:56

I might think that as long as a

9:58

school is available, everybody's got the opportunity to

10:01

go and study, but maybe now it's like a

10:03

liberal or progressive or whatever you'd call me, I

10:05

might say, okay, well, we need to make sure

10:07

that there's enough, you know, maybe like food in

10:09

the household or household or some kind of funding

10:11

program to make sure the kid can actually go

10:14

to school and study, basically. So like the core

10:16

drive is the same, but I think the applied,

10:18

the applied principle ends up changing a bit based on

10:20

what you're going through. Right. So

10:22

is your concern essentially something like the

10:25

observation that if

10:27

people are bereft enough of

10:29

substance, let's say, that

10:31

it's difficult for them to take

10:33

advantage of equal opportunities even if

10:35

they are presented to

10:37

them, let's say. Yeah, essentially. Yeah.

10:40

And you have some

10:42

belief, and correct me if

10:45

I'm wrong, you have some belief that there

10:47

is room for state intervention at the

10:49

level of basic provision to make

10:52

those opportunities more manifest.

10:55

Yeah, to varying degrees. Yeah. Okay.

10:58

So let's start talking more broadly

11:00

than on the political side. So

11:04

how would you characterize the difference in

11:06

your opinion between the left

11:08

and the conservative political

11:11

viewpoints? On

11:16

a very, very, very broad level, if

11:20

there's some, I would

11:22

say if there's some like good world

11:24

that we're all aiming for, I think people

11:27

on the left seem to

11:29

think that a collection

11:31

of taxes from a large population that

11:33

goes into a government that's

11:35

able to precisely kind of dole out where

11:37

that tax money goes, you're basically

11:40

able to take the problems of society, you're

11:42

able to scrape off hopefully not

11:44

super significant amount of money from people that

11:46

can afford to give a lot of money.

11:48

And then through government programs and redistribution, you

11:50

target those taxes

11:53

essentially to people that kind of need whatever

11:55

bare minimum to take advantage of the right

11:57

to society. Okay. And

12:01

I guess conservative would generally think that

12:04

why would the government take my money? I think from

12:06

a community point of view through churches, through community action,

12:08

through families, we can better allocate our own dollars to

12:10

our own friends and family to help them and give

12:12

them the things that they need so that they can

12:15

better participate in a thriving society basically. Okay.

12:18

So, one of the things that I've always

12:20

found a mystery, and I think there's an

12:22

equal mystery on the left and on the right in

12:24

this regard, is that the more

12:27

conservative types tend to be very skeptical

12:29

of big government, and the leftist types

12:31

tend to be more skeptical of big

12:34

corporations. Right? Well,

12:37

you... Okay. So, following through

12:39

the logic that you just laid out,

12:41

you made the suggestion that one of

12:43

the things that characterizes people on the

12:45

left is the belief that government can

12:47

act as an agent of distribution... can

12:50

and should act as an agent of

12:52

distribution. Okay. A potential problem for that

12:54

is the gigantism of the government that

12:56

does that. Now, the conservatives

12:58

are skeptical of that gigantism, and

13:01

likewise, the liberals, the progressives

13:04

in particular, we'll call them progressives,

13:07

are skeptical of the reach

13:09

of gigantic corporations. And

13:11

I've always seen a commonality in those two

13:13

in that both of them are skeptical of

13:15

gigantism. And so, one

13:17

of the things that I'm concerned

13:21

about generally speaking with regard to

13:23

the potential for the rise of

13:25

tyranny is the emergence of

13:27

giants. And one

13:29

potential problem with the view that the

13:32

government can and should act as an

13:34

agent of redistribution is that there is

13:36

an incentive put in place, two kinds

13:38

of incentives. Number one, a

13:41

major league incentive towards gigantism and

13:43

tyranny. And number two, an

13:45

incentive for psychopaths who use compassion

13:48

to justify their grip on power

13:50

to take money and to claim

13:52

that they're doing good. And

13:55

I see that happening everywhere now in the

13:57

name of, particularly in the name of compassion.

14:00

It's made me very skeptical in particular

14:02

about the left, and at

14:04

least about the progressive edge of the left.

14:07

So I'm curious about what you think

14:09

about those two. First of all, it's

14:11

a paradox to me that the

14:13

conservatives and the leftists face off each

14:15

other with regard to their

14:17

concern about different forms of gigantism

14:20

and don't seem to notice

14:22

that the thing that unites them

14:24

is some antipathy, and this is especially

14:26

true for the libertarians, some antipathy

14:28

towards gigantic structures per se. And

14:31

so then I would say, with regards

14:33

to your antithesis between liberalism

14:35

and conservatism, the conservatives are pointing

14:37

to the fact that there are

14:40

intermediary forms of distribution that can

14:42

be utilized to solve the social

14:44

problems that you're describing that don't

14:46

bring with them the associated problem

14:48

of gigantism. And

14:50

it's been stunning to me

14:54

to watch the left, especially in the

14:56

last six years, ally

14:58

itself, for example, with pharmaceutical companies, which

15:00

was something I never saw, never thought

15:02

I would see in my lifetime. I mean,

15:05

for decades, the

15:07

only gigantic corporations the

15:10

left was more skeptical of than the

15:12

fossil fuel companies were the pharmaceutical companies,

15:14

and that all seemed to vanish overnight

15:16

around the COVID time. So I

15:18

know that's a lot of

15:21

things to throw at you, but it

15:24

sort of outlines the territory that

15:26

we could probably investigate productively. So,

15:29

There's a couple things. I would say

15:32

that the political landscape we have today,

15:34

I think, is less about the

15:36

conservative approach of backing corporations and the

15:38

liberal approach of backing like large governments.

15:40

I think today the divide we're starting

15:42

to see more and more is more

15:45

of like a populist versus anti-populist

15:47

divide, or even like an institutionalist versus

15:49

anti-institutionalist divide. So, for instance, I think

15:51

conservatives in the United States are largely

15:53

characterized, I'd say, with populism

15:55

and with supporting certain

15:58

figures, namely Donald Trump, who they think

16:00

alone can kind of like lead them

16:02

against the corrupt institutions, be

16:04

them corporate or government. I

16:06

feel like most conservatives today are not as

16:08

trustful of big corporations as they were back

16:10

in like the Bush era where we would,

16:12

you know, conservatives would champion big corporations. Yeah,

16:15

I think that's right. Yeah. That's

16:17

a strange thing because it makes the modern conservatives

16:19

a lot more like the 60s leftists.

16:22

Potentially, yeah. I mean,

16:24

that brings us into the issue too of

16:26

whether the left-right divide is actually a reasonable

16:28

way of construing the current political landscape at

16:30

all. And I'm not sure it is, but

17:42

Right

17:46

now it kind of is, but only because

17:48

so many conservatives are following Trump. So like

17:50

your populist, anti-populist thing kind of maps on

17:52

kind of cleanly to the left and right.

17:55

It gets weird with progressives though, or the

17:57

far left, because they're also anti-large everything. So

18:00

in a surprising way, on very, very far left

18:02

people, you might find them having a bit more

18:04

in common with kind of like a MAGA Trump

18:06

supporter than like a center-left

18:09

liberal. So for instance, like both of these groups

18:11

of people on the very far left will be

18:13

very dovish on foreign policy, probably a little bit

18:15

more isolationist. They're not a big fan of like

18:17

a ton of immigration or a ton of trade

18:19

with other countries. They might think

18:21

that there's a lot of institutional capture of

18:23

both government and corporations. Both all of the

18:25

MAGA supporters and the far, far left might

18:28

think that corporations don't have our best interest

18:30

at heart and the government is corrupt and captured,

18:32

obviously. Yeah, you'll see a lot of overlap there. I

18:36

think that sometimes there's a couple things.

18:38

One, this is something I feel like I've discovered.

18:40

People have no principles. I think that people are

18:43

largely guided by whatever is kind of satisfying them

18:45

or making them feel good at the time. But

18:48

I think that's a really important thing to understand

18:50

because people's beliefs will seem to change at random

18:52

if you're trying to imagine that

18:54

a belief is coming from some underlying

18:56

principle or is governed by some internal,

18:59

like moral or reasonable code or whatever. I think

19:02

generally there are large social groups and people kind

19:04

of follow them along from thing to thing, which

19:06

is why you end up in

19:08

a strange world sometimes where like

19:10

the position on vaccines and being an

19:12

anti-vaxxer might have been seen as something 10 years

19:15

ago as kind of like a

19:17

hippie leftist and now maybe it's more like a

19:19

conservative or it's associated more with like MAGA Trump

19:21

supporters or whatever. I think as a result of

19:23

how the social groups move around. When

19:26

it comes to the – you mentioned this like gigantism

19:28

thing. That's another thing where I'm

19:30

not sure if people actually care about gigantism or if

19:32

they're using it as a proxy for other things that

19:34

they don't like. Like I could totally imagine – But

19:36

I care about it. Sure. Yeah,

19:38

you might. Yeah, sorry. It's just in general. Because

19:42

like I could imagine somebody saying that like they

19:44

don't trust like a large government. They think there's

19:47

too much prone to tyranny or something like that,

19:49

but also be supportive of an institution like the

19:51

Catholic Church, which is literally one guy who is

19:53

a direct line to God. Right, but they can't

19:55

tax and they don't have a military. That

19:57

is – And they can't conscript you. True. And

20:00

they can't throw you in jail. That is true. Yeah.

20:04

I mean, well, those are major. Those are not insignificant. I

20:06

mean, I get that I get the overlap. Don't get me

20:08

wrong. Sure. Any church or

20:11

tribe, usually they've got some form of enacting punishment.

20:13

It'll be sometimes more brutal, but they can throw you

20:15

in jail. Conscription hasn't existed

20:18

in the US since the Vietnam War. Yeah.

20:21

I mean, yeah. Is that true?

20:23

Yeah, true. So yeah, I think

20:25

that I guess when I look at.

20:28

Well, let's go back to the redistribution

20:31

issue. We

20:36

pay 65% of our income at say

20:40

upper middle class, middle class, upper middle

20:42

class level in Canada. It

20:44

isn't obvious to me at all that that money

20:46

is well used. In fact, quite the contrary. In

20:49

my country now, our citizens

20:52

produce 60% of what you

20:54

produce in the US.

20:57

That's plummeted over the last 20 years as

20:59

state intervention has increased, I'm

21:01

not convinced of the

21:04

claim that

21:08

the interests of people who lack

21:10

opportunity are best served by state

21:13

intervention. And there's a couple of reasons for that. I

21:16

mean, first of all,

21:18

I'm aware of the relationship

21:20

between inequality and social problems.

21:22

There's a very well developed literature on that. It

21:25

essentially shows that the

21:27

more arbitrary, the

21:30

broader the reach of inequality in a political

21:32

institution of any given size, the

21:35

more social unrest. So

21:37

where all people are poor, there isn't much

21:40

social unrest. And where all people are rich,

21:42

there isn't much social unrest. But when there's

21:44

a big gap between the two, there's plenty.

21:46

And that's mostly driven by disaffected young men

21:49

who aren't very happy that they can't

21:51

climb the hierarchy. There are barriers in

21:53

their way. And so there

21:55

is reason to ameliorate relative poverty. The

21:58

problem with that, just to some degree, is

22:01

that most attempts to ameliorate relative

22:03

poverty tend to increase absolute poverty

22:05

and they do it dramatically. And

22:07

the only solution that we've ever been

22:09

able to develop to that is something

22:11

approximating a free market system. I wouldn't

22:14

call it a capitalist system because I

22:16

think that's capture of the terminology by

22:18

the radical leftists. It's a free exchange

22:20

system. And the price you

22:22

pay for a free exchange system is you still

22:24

have inequality, but the advantage you gain is that

22:27

the absolute levels of privation plummet.

22:29

And I think the data on that are,

22:32

I think they're absolutely conclusive, especially,

22:34

and that's been especially demonstrated in

22:37

the radical decrease in rates of poverty since the

22:39

collapse of the Soviet Union in 1989. Because

22:42

we've lifted more people out of poverty in the last

22:44

four decades than we had in the entire course of

22:46

human history up to that date. And

22:48

that's not least because the

22:51

statist interventionist types who argued

22:53

for a radical state-sponsored redistribution

22:55

lost the Cold War. And

22:59

that freed up Africa to some degree

23:01

and certainly the Southeast Asian countries to

23:03

pursue something like a free trade economy.

23:07

And that instantly made them

23:09

rich, even China. So

23:14

that's an argument, let's say, on the side of

23:16

free exchange. But it's also

23:18

an argument, a twofold argument pointing out

23:20

how we ameliorate absolute poverty, which should

23:22

be a concern for leftists, but doesn't

23:24

seem to be anymore, by the way.

23:27

But also an argument for the maintenance

23:30

of a necessary inequality. I'm not

23:32

sure that inequality can be decreased

23:34

beyond a certain degree without that

23:36

decrease causing other serious problems. And

23:39

we can talk about that. But

23:42

it's a complicated problem. Yeah. But

23:44

for one point of clarification, when you say leftist, what

23:46

do you mean by that? Well,

23:49

I was going with your

23:51

definition, essentially, the core

23:53

idea being something like the

23:56

central concern being

23:58

one of relative inequality and

24:00

distribution of resources and the central solution

24:02

to that being something like state sponsored

24:05

economic intervention. I mean,

24:08

there's other ways we could define left

24:10

and right. Sure. I'll stick with the

24:12

one that you brought forward to begin

24:14

with. Gotcha. Gotcha. Okay. I only want

24:16

to be clear on that because people

24:18

get mad if I call myself a leftist. Oftentimes

24:21

online or in, especially in

24:23

Europe or worldwide, leftists will

24:26

refer exclusively to like socialists

24:28

or communists and anybody to

24:30

the right of that would be considered like a

24:32

liberal. No, usually a fascist. Well,

24:34

depending on your time. Very rapidly. Yeah.

24:36

I just wanted to be clear on

24:38

that. So I'm absolutely a pro capitalist,

24:41

pro-free-market guy. I'm never

24:43

going to... Okay. Okay. Yeah. Okay.

24:45

Okay. Well, that's good to get that clear. Why? Yeah.

24:48

Because I would argue

24:51

that when you look at like the fall of the

24:53

Soviet Union or you look at the failure of like

24:55

socialists or communist regimes, I don't know

24:57

if the issue there was so much redistribution. I think the

24:59

problem... That was one of many issues. I don't think it

25:01

was an issue at all, actually, I would say. I think

25:03

the issue was... Wait a minute.

25:06

Wait a minute. What do you mean redistribution wasn't

25:08

an issue? What the hell do you think they

25:10

did to the kulaks? That was

25:12

forced redistribution. It resulted in the death

25:14

of 6 million people. So maybe I'm

25:16

not understanding what you mean, but

25:18

that was redistribution at its

25:21

pinnacle and forced redistribution. It

25:23

was brutal. When I

25:25

think of the strengths of capitalism, the

25:28

ability for markets to dynamically respond

25:30

to shifting consumer demand is like

25:32

the reason why capitalism and free

25:35

market economies dominate the world. When

25:37

you've got socialists or communist systems

25:39

command economies where a government is trying to say, this

25:41

is how much this is going to cost, this is

25:44

how much you're going to produce and make, this is

25:46

a failed way of managing a state economy. Even

25:48

in places where they still do it, there are always

25:50

shadow economies and stuff. There were in the Soviet Union

25:52

that pop up, where people try to basically

25:55

ameliorate the conditions that are resulting from

25:57

said horrible command economy practices. So

26:00

I guess in a way you could argue a command economy

26:02

is kind of like redistribution as a form of it. No,

26:05

it's a worse problem. If you're pointing to the fact

26:07

that that's a worse problem, I'm not going to say

26:09

that. Yeah, I would say that's definitely the reason why

26:11

these places failed because they just

26:14

weren't able to respond to changing

26:16

conditions. Okay, so

26:18

what's the difference between a state

26:21

that attempts to redistribute to foster

26:23

equality of opportunity and a command

26:25

economy? Is it a difference

26:28

of degree? Are you looking at models, let's

26:30

say, like the Scandinavian countries? I

26:32

wouldn't use Canada, by the way, because

26:35

Canada is now, what

26:37

would you call it, predicted by economic

26:39

analysts to have the worst performing

26:42

economy for the next four decades

26:44

of all the developed world. So

26:46

maybe we'll just leave the example

26:48

of Canada off the table. Scandinavian

26:50

countries are often the polities

26:52

that are pointed to by, I would

26:55

say, by people who, at least in

26:57

part, are putting forward a view of redistribution

26:59

for purposes of equality of opportunity

27:01

like you are. But they're

27:04

a strange analogy because they're very small

27:06

countries and up till now they were

27:08

very ethnically homogenous. Exactly,

27:10

and that makes a big difference when you're trying

27:12

to flatten out the

27:14

redistribution. Plus, they're also incredibly wealthy,

27:17

which makes redistribution, let's say, a

27:19

lot easier. So

27:25

why doesn't the government that's bent

27:27

on redistribution fall prey to the

27:29

pitfalls of command economy and forced

27:32

redistribution, for that matter? How do you protect

27:34

against that? I think what you have to

27:36

do is very, very, very difficult. People

27:39

get very ideologically captured by both ends, and

27:41

they feel very, I guess, like

27:43

committed or they feel very allegiant to pushing

27:45

certain forms of economic organization. And I think

27:48

sometimes it blinds them to some of the

27:50

benefits of what exists when you incorporate kind

27:52

of multiple models or I mean you'd call

27:54

them mixed economies, which is really what

27:56

every capitalist economy today is, some form

27:58

of free market capitalism combined with some

28:00

form of like government intervention to control for

28:02

negative externalities. These are the ways that all

28:04

economies, even in Scandinavia and the world work.

28:07

And I think that recognizing the benefits of

28:09

both systems is the best way to make

28:12

things work out. Fair enough. And the Scandinavian

28:14

countries seem to have done a pretty good

28:16

job of that. But like I said, they

28:18

have a simpler problem to solve, let's say,

28:20

than the Americans have. Negative externalities. That's

28:24

an interesting rabbit hole to wander down

28:26

because the problem I have with negative

28:28

externalities, you made a case already that,

28:31

and again, correct me if I've got this wrong, but

28:34

I think that I understood

28:37

what you said. A

28:40

free market, free exchange economy is a

28:42

gigantic distributed computational device. Basically, yeah. Right,

28:44

exactly. Which funnily enough, one of the

28:47

big problems for command economies is called

28:49

the computation problem because no central body

28:51

can actually compute the end of the-

28:54

Right, exactly. Right, that's not, yeah, that's

28:56

a fatal problem, right? It doesn't have

28:58

the computational power. It certainly doesn't have

29:01

the speed of data recognition.

29:03

It doesn't have the on the ground

29:05

agents if all of the

29:07

perception and decision making is centralized.

29:10

It's way too low resolution, it's gonna

29:12

crash. Okay, so, and I think that

29:14

that's comprehensible technically as well as ideologically.

29:17

All right, so, but having

29:19

said that, with

29:22

regards to externalities, all

29:24

the externalities that

29:26

a market economy can't compute

29:30

are so complex that

29:32

they can't be determined sensibly by

29:34

the same argument. And so- So

29:37

There are ways to account for them though. Really?

29:39

That work with- Tell me. Yeah, so- I

29:42

can't see that because I can't see how

29:44

that they can be accounted for without

29:47

the same computational problem immediately arising. Yeah, and

29:49

I understand that. And I think that's a

29:51

problem sometimes of people very far on the

29:54

left when they wanna deal with certain problems.

29:57

I think that they wanna bring like heavy handed,

29:59

like things like price controls, and to say, well,

30:01

we need less of this, so let's just make this

30:03

cost this particular thing, which, ironically enough, introduces a whole

30:05

other set of externalities that will happen when you get

30:07

a lot of friction between where your price floor or

30:09

ceiling is set compared to what a market would set

30:11

it at. But ideally, if

30:13

you're a reasonable person and you view economies as

30:15

mixed economies, what you try to do is you

30:17

try to take these externalities, meaning things that aren't

30:19

accounted for with your primary system. So in a

30:21

capitalist system, an externality might be something that caused

30:23

a negative effect, but it doesn't cost you any

30:26

money. Pollution would be a good example

30:28

of that. Rather than saying, well, no company can

30:30

pollute this much, or if you're a company, you have

30:32

to use these things because the other things are making

30:34

so much pollution. All you do is you say, OK,

30:36

well, if we've determined that carbon is bad for the

30:38

atmosphere, we're just going to attach a little price to

30:40

that. So the government is going to say that, yeah,

30:42

if you pollute this much, here's the price. And then

30:44

if you want to pay for it, you can. But

30:47

that type of intervention in the economy

30:49

basically allows the free market to hopefully do its

30:51

job because the government has tacked on a little

30:53

bit of a price limit that tries to account

30:55

for the cost of that externality. Yeah. Great. That's

30:57

a great example. We can go right down that

30:59

rabbit hole. Carbon. OK, so first

31:01

of all, one

31:03

of the things I've seen, you tell me

31:05

what you think about this, something that I've

31:07

seen that actually shocks me that I was

31:09

interested in watching over the last five or

31:11

six years. I wondered what

31:13

would happen when the left,

31:15

the progressives, ran into a conundrum. And

31:18

the conundrum is quite straightforward. If

31:21

you pursue carbon pricing and you make energy

31:23

more expensive, then you hurt the poor. And

31:25

I don't think you just hurt them. In

31:27

fact, I know you don't just

31:29

hurt them. I heard a man

31:32

two days ago who's fed 350 million

31:34

people in the course of his life,

31:37

heading the U.N.'s largest relief

31:39

agency, make the claim

31:41

quite straightforwardly that misappropriation

31:44

on the part of interventionist

31:47

governments increased the

31:49

rate of absolute privation dramatically

31:52

in the world over the last four or five years. And

31:55

that has happened not least because of carbon

31:57

pricing, not just carbon pricing, but

31:59

the. insistence that carbon per se is

32:01

an externality that we should control. Now,

32:03

Germany's paid a radical price for that,

32:05

for example. So their power is now

32:07

about five times as expensive as it

32:09

could be. And they pollute

32:12

more per unit of power than they

32:14

did 10 years ago before they introduced

32:16

these policies that were hypothetically there to

32:18

account for externality and the

32:20

externality was carbon dioxide. I don't think

32:23

that's a computable externality. And I don't think

32:25

there's any evidence whatsoever that it's actually

32:27

an externality that we should be warping

32:30

the economic system to ameliorate if the

32:32

cost of that, and it will be,

32:34

will be an increase in absolute privation

32:37

among the world's poor. So

32:39

here's an additional argument on that

32:41

front with regards to externalities. You

32:44

get that wrong and here's something you could get

32:46

right instead. If you

32:48

ameliorate absolute poverty among the world's

32:50

one billion poorest, they take

32:53

a longer view of the future. And

32:55

that means they become environmentally aware.

32:57

And so the fastest route to

32:59

a sustainable planet could well be

33:01

the remediation of absolute poverty. And

33:03

the best route to that is

33:05

cheap energy. And we are interfering

33:07

with the development of cheap energy by meddling

33:11

with the hypothetically detrimental externality

33:14

of carbon dioxide. And

33:16

so I

33:19

think this is a complete bloody travesty. By the

33:21

way, we are putting the lives of hundreds of

33:23

millions of people directly at risk

33:26

right now to hypothetically

33:28

save people in the future, depending

33:31

on the accuracy of our

33:33

projections a hundred years out.

33:35

And these interventionists, these

33:37

people who are remediating externalities,

33:39

actually believe that they

33:41

can calculate an economic projection

33:43

one century out. That's

33:46

utterly delusional. So, OK,

33:48

so just to be clear, the first thing I

33:50

was giving an example of how you can use

33:52

like a government intervention to make a free market

33:55

track something which which is what cap and trade

33:57

or like carbon taxes would do, without necessarily

34:00

speaking to the strength of that individual thing. But yeah, but

34:02

that's the thing to focus on. Yeah, we can focus on

34:04

that as well. That's the main directionality. We can focus on

34:06

that as well. So the first

34:08

thing, this is gonna sound mean, but

34:11

I'm very realistic. There

34:13

needs to be a better argument than just

34:15

it disproportionately impacts the poor. That's a classic

34:17

leftist starting point. It might be, but-

34:20

But it's the same argument you made to

34:22

justify your swing to the left at the

34:24

beginning of our discussion. You said that you

34:26

were looking at economic inequalities that disproportionately affected

34:29

the poor. So I can't see

34:31

why, and I'm not

34:33

trying to be mean about this either. I

34:35

can't see why you could base your argument

34:37

that it was morally appropriate

34:39

for you to swing to the left from

34:42

your previous position because you saw disproportionate effects

34:44

on the poor. And I can't use that

34:46

argument in the situation that I'm presenting it

34:48

right now.

36:30

I think that just helping the poor

36:32

isn't an argument like

36:35

a blank check to do every

36:37

possible thing to satisfy poor people.

36:39

Right. I agree. And it's going to depend on

36:41

a particular issue. Yeah, that's fine. So like for instance, I

36:43

think... But that's because the poor, everyone who's poor is not

36:45

a victim. Some people who are

36:48

poor are psychopathic perpetrators. Sure. And

36:50

it's very useful to distinguish them. But I was making

36:52

a much more specific argument. My argument

36:54

was that the fastest way out of

36:57

absolute privation for the world's bottom billion

36:59

people is through cheap energy. Yeah,

37:01

I understand what you're saying there. Just working my way towards that.

37:03

Yeah. Yeah, I just want to say that just

37:05

because something targets the poor is not necessarily an argument against it. It

37:08

depends on how hard it targets them and it

37:10

depends on whether mass starvation is the outcome.

37:13

The outcome is important. That I agree with. So

37:15

for instance, like a sin tax... The outcome will

37:17

be mass starvation... Yeah, I'm getting to it. Yeah,

37:19

I'm getting to it. Okay. Sin taxes

37:21

on like cigarettes and alcohol are always going to disproportionately

37:23

impact the poor or even sugar, we might say, right?

37:25

But just because that disproportionately impacts the poor, is that

37:28

a good thing or a bad thing? These

37:30

are probably the people that suffer the most from those particular

37:32

afflictions. Right. Right. So

37:35

I think that's the target... That is immediate versus

37:37

delayed issue too, right? Because the reason... Well, I

37:39

mean, obesity isn't immediate. I don't think alcohol is

37:41

immediate. I mean, the reason for the tax is

37:43

to stop people from pursuing

37:46

a certain form of short-term gratification at the

37:48

cost of their longer term well-being. Correct.

37:51

So that exact same idea, if you believe

37:53

climate models or if you believe that we're

37:55

heading in a certain direction in terms

37:58

of climate, the overall warming of the planet... would

38:00

be the same argument you would make for climate

38:02

change. Only if you believe that you could

38:04

model economic development 100 years into the future.

38:06

Well, we're not trying to model, we're more

38:08

concerned with modeling climate development than economic development.

38:10

No, no, no. We are equally... Absolutely. Well,

38:13

okay, tell me how I'm wrong. I don't

38:15

believe that because what I see happening is

38:17

two things. We have climate models that purport

38:20

to explain what's going to happen over a

38:22

century on the climate side, but we have

38:24

economic models layered right on top of those

38:26

that claim that there's going to be various

38:28

forms of disaster for human beings economically

38:30

as a consequence of that climate change.

38:32

And so that's like two towers of

38:34

Babel stacked on top of one another.

38:37

And so, because if people were just

38:39

saying, oh, the climate's going to change,

38:41

there'd be no moral impetus in that.

38:43

The climate's going to change and that's

38:45

going to be disastrous for the biosphere

38:47

and for humanity. But that's an economic

38:50

argument as well as a climate-based argument.

38:52

It's both, but the worst projections of

38:55

what would happen if the climate took

38:57

a disastrous turn are worse than

38:59

the worst projections of what is our planet going

39:01

to look like economically if we hardcore police... Right.

39:04

Why would you... Okay, but I don't understand

39:06

the distinction between the models. Well,

39:09

the argument would be that whatever pain

39:11

and suffering poor people might endure right now because

39:14

of a move towards green energy, that pain and

39:16

suffering is going to be short-term and far less

39:18

than the long-term pain and suffering... Right, but that's

39:20

dependent on the integrity of the economic models. And

39:23

the climate models as well, right? Exactly, but

39:25

in exactly the stacked manner that I

39:27

described it. Like, there's nobody in

39:29

1890 who could have predicted what

39:31

was going to happen in 1990 economically. Uh-huh.

39:35

Not a bit. Not a bit. And

39:38

if we think we can predict

39:40

50 years out now with the

39:42

current rate of technology and calculate

39:44

the potential impact of climate change

39:46

on economic flourishing for human beings,

39:48

we're deluded. No one can do

39:50

that. And then... Mm-hmm. So

39:53

imagine that as you do that and you

39:55

project outward, your margin of error increases. That's

39:58

absolutely, definitely the case. And

40:01

at some point, you're certainly on the

40:03

climate side, the margin of error gets rapidly to

40:05

the point where it subsumes any estimate of the

40:07

degree to which the climate is going to transform.

40:09

And that happens even more rapidly on the economic

40:12

side. Essentially. I think right now, this is a

40:14

disagreement on the fact of the matter, though, not

40:16

the philosophy of what we're talking about in controlling

40:18

externalities. If we think, I'm curious, let's say that

40:20

we think we can accurately predict the climate and

40:22

the economic impact. And we think that the climate

40:25

impact would be far worse if we don't account

40:27

for that, both in terms of

40:29

human conditions. And if we believe any of those presumptions...

40:33

But then if you don't, but I mean, like, obviously, if

40:35

I agreed with that factual analysis, I would probably agree with

40:37

you on the prescription here, too. Right.

40:39

And why don't you think the climate models are

38:41

accurate, or that we can accurately predict anything? They're not...

38:43

Well, they're not sufficiently accurate.

40:46

That's the first thing. And second, because they have

40:48

a margin of error, and it's a large margin

40:50

of error, they don't even model cloud coverage. Well,

40:53

that's a big problem. They don't have the

40:55

resolution. They don't have nearly the resolution to

40:58

produce the accuracy that's claimed by the climate

41:00

apocalypse mongers. But we just got another one

41:03

of the hottest years on record. How many

41:05

times are we going to have another hottest

41:07

year on record? How many times we're gonna

41:09

have an increase of carbon dioxide concentration in

41:11

the atmosphere before we're finally like, okay, I

41:13

don't know. And the reason

41:15

I don't know is because it

41:18

depends. The scientific answer to that

41:20

question depends precisely on the timeframe

41:22

over which you evaluate the climate

41:24

fluctuation. And that's actually an

41:26

intractable scientific problem. So you

41:28

might say, well, if you take the last 100

41:30

years, this variation looks pretty dismal. And

41:32

I'd say, well, what if you took the last 150,000

41:34

years, or the last 10,000, or

41:38

the last 10 million, you can't

41:40

specify the timeframe. The timeframe is incredibly

41:42

important. That'd be like saying, look at,

41:44

you know, let's say somebody developed cancer,

41:46

they didn't realize it. And the person

41:48

has lost, you know, 40 or 50

41:50

pounds in the past six months. And

41:52

I'm just like, you look very sickly. And you're like, okay,

41:55

well, look at my weight fluctuation for the past 10 years.

41:57

Well, that doesn't really matter; what matters is... Well,

42:00

but I'm saying that it is important. Yeah, but you're

42:02

saying I don't know how to specify it. Well,

42:04

you would probably specify it with the beginning of the industrial age,

42:06

right? Why? Because when specimen carbon

42:08

dioxide, which is a gas that's seen as trapping more

42:11

heat on the planet, begins to... Why is that relevant

42:13

to the time over which you compute the variability? Because

42:15

it seems like as carbon dioxide has increased in the

42:17

atmosphere, the surface temperatures have risen at a rate that

42:19

is a departure from what we'd expect over 150,000 year

42:22

cycles of temperature variations on

42:24

the planet. No, not without a time

42:26

frame. That's not the case. Absolutely the case.

42:28

No, what do you mean? You just flipped

42:30

to 150,000 year time span. What

42:33

I'm saying is that if we expect to see a temperature

42:35

do this in 150,000 year time span,

42:38

in 100 year time span, seeing it do

42:40

this, that's very worrying. You mean like Michael

42:42

Mann's hockey stick, the one that's under attack

42:44

right now in court by a major statistician

42:47

who claimed that he falsified his data. I

42:49

mean, that spike? I'm

42:51

talking about the record temperatures that

42:53

have been declared for like the

42:55

past five years that have also

42:57

increased with the concentration of

42:59

parts per million of carbon dioxide in the atmosphere. I

43:02

mean, I'm not going to tell you that every model is

43:04

perfect. They're not going to be. Sure, but

43:06

right now we're like standing in traffic with our eyes

43:09

closed saying the car hasn't hit me yet, so I

43:11

don't think there are any coming. I think it's pretty undeniable

43:13

at this point that there is an impact on climate

43:15

across the planet. I think that's highly

43:17

deniable. We have no idea what the impact is from.

43:20

We don't know where the carbon dioxide is from. We

43:22

can't measure the warming of the oceans. We

43:25

have terrible temperature records going back 100 years. Almost

43:28

all the terrestrial temperature

43:32

detection sites were first put outside urban areas.

43:34

And then you have to correct for the

43:36

movement of the urban areas, and then you

43:39

introduce an

43:42

error parameter that's larger than the

43:44

purported increase in temperature that you're

43:46

planning to measure. This

43:49

isn't data. This is gas. And

43:51

there's something weird underneath it. There's

43:53

something weird that isn't oriented well

43:55

towards human beings underneath it. It

43:57

has this guise of compassion. We're going to do

43:59

this. to save the poor in the future. It's

44:01

like that's what the bloody communist said. And

44:04

they killed a lot of people doing it. And we're

44:06

walking down that same road now with

44:08

this insistence that, you know, we're so compassionate

44:10

that we care about the poor 100 years

44:13

from now. And if we have to wipe

44:15

out several hundred million of them now, well,

44:17

that's a small price to pay for the

44:19

future of utopia. And we've heard that sort

44:21

of thing before. And the

44:24

alternative to that is to stop

44:26

having global level elites plot out

44:28

a utopian future or

44:31

even an anti-dystopian future. And

44:34

that's exactly what's happening now with organizations

44:36

like the WEF. And if this wasn't

44:38

immediately impacting the

44:40

poor in a devastating manner, I wouldn't care

44:42

about it that much. But it is. You

44:45

know, I watched over the course of the last

44:48

five years, the estimates of the number of people

44:50

who were in serious danger of food privation rise

44:52

from about 100 million to about 350 million. That's

44:56

a major price to pay for a little

44:58

bit of, what would you say,

45:01

progress on the climate front that's so narrow,

45:03

it can't even be measured. I don't think

45:05

the increase in hungry people on the planet

45:08

is because of climate policies.

45:10

Why not? Because I don't

45:12

think that countries in Africa are being pushed away from fossil

45:14

fuel. Of course they are. They can't

45:17

even get loans from the World Bank to

45:19

pursue fossil fuel development. And there's plenty of

45:21

African leaders who are screeching at the top

45:24

of their lungs about that, because the elites

45:26

in the West have decided that, well,

45:28

it was OK for us to use fossil

45:30

fuel so that we wouldn't have to starve

45:33

to death. And our children had some

45:35

opportunities. But maybe the starving masses that are

45:37

too large a load for the world anyways

45:39

shouldn't have that opportunity. And

45:42

that's direct policy from the UN

45:44

fostered by organizations like the WEF.

45:46

They're going to have to turn

45:48

to renewables. Yeah, well, good

45:51

luck with that. Because renewables have no

45:53

energy density. Besides that, they're not renewable

45:55

and they're not environmentally friendly. And then

45:57

one more thing. There's one more thing

46:00

underneath all of this. Okay. Well,

46:02

let's say if carbon dioxide was actually your bugbear

46:05

and it was genuine. Well, then

46:07

why wouldn't the greens, for example,

46:09

in Africa, the progressives, be agitating

46:11

to expand the use of nuclear

46:14

energy, especially because Germany has to

46:16

import it anyways, especially because France

46:18

has demonstrated that it's possible. We

46:21

could drive down the cost of energy with

46:23

low-cost nuclear, and there'd be no carbon production.

46:25

And then the poor people would have something

46:27

to eat because they'd have enough energy. And

46:30

that isn't what's happening. And that's one of

46:32

the things that makes me extremely skeptical of

46:34

the entire narrative. It's like two things. The

46:37

left will sacrifice the poor to save the planet.

46:40

And the left will de-industrialize even at

46:42

the nuclear level, despite the fact that

46:44

it devastates the poor. And

46:46

that's even worse because if

46:49

you devastate the poor and

46:51

you force them into a short-term

46:53

orientation in any given country

46:55

where starvation beckons, for example, they will cut

46:57

down all the trees and they will kill

46:59

all the animals and they will destroy

47:02

the ecosphere. And so even

47:04

by the standards of the people

47:06

who are pushing the carbon dioxide

47:08

externality control, all the consequences of

47:11

that doctrine appear to me to

47:13

be devastating even by their own

47:16

measurement principles. We're

47:18

trying to fix the environment. Well,

47:20

boys and girls, it doesn't look like it's

47:22

working. All you've managed to do is make

47:25

energy five times as expensive and more polluting.

47:28

You were wrong. That didn't work. And

47:31

so – and I can't understand. You can help

47:33

me. That's why you're here today talking to me.

47:35

I can't understand how the left can support this.

47:37

Sure. Just one quick thing. Let's

47:39

say that everything you've said is true. What do you think is

47:41

the plan then? What is the goal? What is the drive? Like

47:44

why push? Why push obviously horrible ideas for the

47:46

planet and the poor? That's a good

47:48

question. That's a good question. Well, what do

47:51

you think? Well, because you're positing it, right? So what

47:53

do you think is the driving goal? I listen

47:55

to what people say. Here's

47:57

the most terrible thing they say. There

48:00

are too many people on the planet. Okay,

48:03

so who says that? I've heard people say that

48:05

for 30 years. Perfectly

48:07

ordinary, compassionate people. Well, there's too many

48:10

people on the planet. And

48:12

I think, well, for me, that's like

48:14

hearing Satan himself take possession of their

48:16

spine and move their mouth. It's

48:18

like, okay, who are

48:21

these excess people that you're so concerned

48:23

about? And exactly who has

48:25

to go and when and why and

48:27

how? And who's going to make that

48:29

decision? And even if you don't, even

48:31

if you're not consciously aiming at that,

48:34

you are the one who uttered the words.

48:37

You're the one who uttered the phrase. What

48:39

makes you think that the thing that possessed

48:41

you to utter those words

48:43

isn't aiming at exactly what you just declared?

48:46

And so that's, you know, that's a terrible

48:48

vision. But when you look

48:50

at what happens in genocidal societies

48:52

and they emerge fairly with fair

48:55

regularity and usually with a utopian

48:57

vision at hand, the

48:59

consequence is the mass destruction of millions of people.

49:02

So why should I assume that something

49:04

horrible isn't lurking like that right now,

49:06

especially given that we have pushed

49:09

a few hundred million people back

49:11

into absolute poverty when we were

49:13

doing a pretty damn good job of getting rid of that? I

49:17

just don't understand what's happening in Germany or in

49:19

the UK. Like, it's insane. Like,

49:23

look, man, if they would have got

49:25

rid of the nuclear plants and

49:27

made energy five times as expensive and

49:29

the consequence would have been they weren't

49:32

burning lignite coal as a backup

49:34

and their pollution per unit

49:36

of energy had plummeted. You

49:39

could say, well, look, you know, we hurt a lot

49:41

of poor people, but at least the air is cleaner.

49:44

It's like, no, it's worse and everyone's poorer. So

49:48

like, explain to me

49:50

how the hell the left could be anti-nuclear. I

49:52

don't understand it at all. Gotcha. All right. concerning

50:00

to me. I

50:02

feel like when people get political beliefs, I

50:04

feel like what happens is what we think

50:06

happens, what we hope happens is you have

50:08

some moral or philosophical underpinning. And

50:10

then from there, you combine this with some epistemic

50:13

understanding of the world. And then you combine these

50:15

two things, you engage in some form of analysis

50:17

and your moral view. Yeah,

50:20

you start to apply like prescription.

50:22

So maybe I'm religious, maybe I

50:24

analyze society and I see that

50:27

particular TV shows lead to premarital sex.

50:29

So my societal prescriptions, we should ban these

50:31

TV shows, right? Ideally, this is how you

50:34

would imagine this process works. What I found

50:36

happens, unfortunately, all too often is what people

50:38

do is they join social groups. And then

50:40

with those social groups, they inherit something that

50:42

I call like a constellation of beliefs. And

50:44

this constellation of beliefs, instead of rationally

50:46

building on each of these, you basically

50:49

get this like Jenga tower

50:51

that is like floating over a table

50:53

and every block is like supporting itself.

50:55

And no real part of the tower

50:57

can be addressed if you pull out

50:59

one piece, it all falls apart, right?

51:02

They become like very stuck in all

51:04

of this combined constellation stuff. And

51:06

none of it is really given like any analysis

51:08

and you can't really push anybody from from one

51:10

way or another. In terms

51:13

of like reevaluating any of the beliefs that

51:15

are part of this constellation. I wish I

51:17

would have done... we have... that's

51:23

fine. Well, you know, there are models

51:25

now of... sure... there are models now

51:27

of cognitive processing belief, belief

51:30

system processing that

51:33

make the technical claim that what

51:35

a belief system does is constrain

51:37

entropy. Check. That's

51:40

not surprising at all. Yeah. And

51:42

now, the signal for

51:44

released entropy, which would be a

51:46

consequence of say violated fundamental beliefs

51:49

is a radical increase in anxiety, right

51:51

and a decrease in the possibility of

51:53

positive emotion. And so people

51:55

will struggle very hard against that, which

51:57

is exactly the phenomenon that you're describing.

52:00

Okay, I agree with what you said. Oh,

52:02

here's, here's my... yeah. So I'm not sure

52:04

how that is relevant to the issue I

52:06

was getting at. Anyway,

52:08

here's my issue. Okay. So when

52:12

I'm trying to evaluate a situation, I like to

52:14

think that I have some, uh, I've got some

52:16

insulation from the effects of what liberals think or

52:18

what conservatives think is because on my platform, I

52:21

don't necessarily have an allegiance to a particular political

52:23

ideology. Like right now I'm like center left to

52:25

progressive, but I break really hard from progressives on

52:27

certain issues. I think Rittenhouse was in

52:29

the right. I think basically everything you guys are

52:32

doing with indigenous people is insane, uh, including the

52:34

complete mass grave hoax. Uh, I think

52:36

that I'm a big supporter of the second amendment. Uh, I

52:38

have beliefs where I can break from my side, you know,

52:41

pretty hardcore because I am not, like, loyal

52:43

to a certain political ideology. One

52:45

thing that worries me with this constellation of

52:47

beliefs thing is that sometimes when it comes

52:49

to evaluating a particular policy or a particular

52:52

problem, I feel like it's part of the

52:54

constellation and sometimes it inhibits people from like

52:56

taking a step back and reasonably thinking about

52:58

the issue. So when we're talking about climate

53:00

change, you mentioned the WEF sacrificing

53:02

tons of people, the UN global elites, uh,

53:05

five times energy costs in Germany,

53:07

uh, genocidal people, I

53:09

feel like this is part of like a whole

53:11

thing where it's like, okay, well let's take a

53:14

quick step back and let's just like think rationally

53:16

about this particular issue for one moment. Okay. You

53:18

asked me what the motivation for anti-poor policies might

53:20

be. So that's why I was like, but I

53:22

got all of those things before I even asked

53:25

that question, um, because I think it's totally

53:27

possible that somebody might say, okay, well, when you

53:29

put carbon dioxide in the atmosphere, it seems to

53:31

cause an increase in surface temperatures. This has been

53:34

happening from about the 1800s. And

53:36

as we started to track surface temperatures, whether the thermometer

53:38

is on top of the empire state building or in

53:40

the middle of the field, it seems like there's an

53:42

average rise in temperatures and people all around the world

53:44

are observing this in some places more than others. If

53:46

you live in Seattle and 20 years ago, your apartment

53:48

building wasn't built with air conditioner units. You feel that

53:50

now. If you live in a place in London and

53:52

you've never had air conditioning before, now that's not

53:54

acceptable. I think that people on the ground can see that

53:56

there are changes. And I think that scientists, when they look in

53:58

labs, can see changes. It might be that some models aren't

54:01

precise enough and it might be that for reasons

54:03

we don't even understand now. Well the economic models

54:05

certainly aren't precise enough. Sure, maybe. Maybe that might

54:07

be true. Not maybe. They can't

54:09

even use them to predict the price of a single stock for six

54:11

months. The economic models are not

54:14

sufficiently accurate to calculate out the consequences

54:16

of climate change over a century. I

54:18

don't in the... When you... I

54:20

like the comparison because economic models can't predict

54:22

individual stocks but they do predict the rough

54:24

rise of the market. You invest in the

54:26

S&P 500. Yeah, you said barring cataclysmic

54:28

collapse. No, even with the cataclysmic collapse accounted

54:30

for, you're going to see about 7% returns

54:33

on average with inflation over long periods

54:35

of time. I wouldn't call an average

54:37

a very sophisticated model analogous to the

54:39

climate change models. That's the difference between climate

54:41

and weather though, right? It's that climate isn't going to

54:44

tell you what the temperature is on a given day but

54:46

it might tell you the average surface temperature over a period

54:48

of one year or 10 years. And then

54:50

that's the difference between climate and weather. That's interesting like

54:52

the market and stuff. Well that's a hypothetical difference. It

54:54

is a hypothetical but again we're seeing more and more

54:56

and more data every single year and things are getting

54:58

hotter and hotter. Let's jump

55:00

out of our cloud of presuppositions for a

55:03

minute. Sleep

55:05

is the foundation of our mental and

55:07

physical health. You must have a consistent

55:09

nighttime routine to function at your best.

55:12

If you're struggling with sleep, you need to

55:14

check out Beam. It's not just your run

55:16

of the mill sleep aid. It's a concoction

55:19

carefully crafted to help you rest without the

55:21

grogginess that often accompanies other sleep remedies. Several

55:23

people on our team use Beam's Dream Powder

55:25

to sleep better through the night and show

55:27

up ready for work. Other sleep

55:29

aids can cause next day grogginess.

55:31

The Dream contains a powerful all-natural

55:33

blend of reishi, magnesium, L-theanine,

55:35

apigenin and melatonin to help you

55:38

fall asleep, stay asleep and wake

55:40

up refreshed. Now available in delicious

55:42

flavors like cinnamon cocoa, chocolate peanut

55:44

butter and mint chip, better sleep

55:46

has never tasted better. And today

55:49

our listeners get a special discount

55:51

on Beam's Dream Powder. For science-backed,

55:53

healthy hot cocoa for sleep with

55:55

no added sugar. Better

55:57

sleep has never tasted better. Mix Beam

56:00

Dream into hot water or milk,

56:02

stir or froth and enjoy before

56:04

bedtime. If you find yourself struggling

56:06

to sleep, give it a shot.

56:08

Get up to 40% off for

56:10

a limited time when you go

56:12

to shopbeam.com/Peterson and use code PETERSON

56:14

at checkout. That's shop, B-E-

56:16

A-M dot com slash Peterson, and use code PETERSON

56:18

for 40% off. True.

56:23

Now, one of the things that... No,

56:26

wait. Okay. There are some things that

56:28

we've gotten as a result of investing

56:30

in green energy that have been good.

56:32

So for instance, the price of solar

56:35

energy has dropped dramatically in the United

56:37

States, faster than anybody thought possible, such

56:39

that solar

56:41

energy is like competitive or beating fossil fuels in certain

56:43

areas. As long as you can set the solar panels

56:46

up, you're literally beating fossil fuels. Yeah, and as

56:48

long as the sun is shining. Well, it

56:50

still is, but we're not in nuclear winter yet. But

56:52

it isn't when it's cloudy and it isn't in the

56:54

winter. That's why it's depending on where you live. There

56:56

are places, an equatorial place, if you're trying to set

56:58

up a solar panel in Seattle, you might not have

57:00

as much luck. Or New York City, Manhattan, or

57:03

in Germany, true. Or Europe, or in

57:05

Canada. There are also other issues

57:07

that are coming up that I think are obfuscating

57:10

our ability to evaluate what's being caused by green

57:12

energy versus not. When we look at energy increases

57:14

in Germany, I think

57:16

there's a similar constellation around nuclear energy, for

57:18

instance. People don't want nuclear energy because they

57:20

think of nukes, and they think of nuclear meltdowns,

57:22

and they think of Chernobyl, and they think of

57:24

Fukushima, and they think of atomic bombs, and that's

57:26

it. And that's stupid. And I agree with you.

57:28

But nuclear energy is a totally viable alternative to

57:31

other forms of fossil fuels. Then why does the

57:33

radical left oppose it? You think it's just this

57:35

map? See, you might think you're- For the same

57:37

reason, the right opposes vaccines because it sounds scary

57:39

and it gets a big thing and they don't

57:41

trust it. It comes with a big problem. Well,

57:43

the right has a reason to distrust vaccines in

57:45

the aftermath of the COVID debacle. Because they were

57:47

imposed by force, and that was a very bad

57:49

idea. You get to choose if you have a nuclear power plant. That's

57:51

imposed by force too, no? You don't get to

57:54

choose where your energy comes from. If you live in a

57:56

country, you turn the light switch and hopefully you don't have

57:58

a Chernobyl that melts down in your particular town, right? Well,

58:00

you get to choose it because you can buy it

58:02

or not. That's the choice. I

58:04

mean, it does, but the negative... Nobody had a choice

58:07

with the vaccines. Nobody had a choice whether or not

58:09

they lived near Chernobyl or not. Nobody

58:11

has a choice. Really, they could move away. Well, I

58:13

don't really think it's easy to move like 500 miles. That's

58:16

like telling conservatives when Biden tried to do the OSHA

58:18

mandate for vaccines, like, well, you can just get a

58:20

different job, right? I don't want to debate about whether

58:22

or not large nuclear power plants are frightening. They are.

58:25

Sure. And there are technologies

58:27

now where that's not a problem. And

58:29

I think that's a counterproductive place for

58:32

our discussion to go because I also

58:34

understand why people are afraid of it. But

58:36

what I don't understand, for example, is

58:38

why the Germans shut down their nuclear

58:40

power plants and the Californians are thinking

58:42

of, and have been, doing the same thing

58:45

when they have to import power from

58:47

France anyways. Like, it's completely... Or burn

58:49

coal, which is a million times worse.

58:51

Not just coal, lignite. Yeah. Right.

58:54

And then with regards to these renewable power sources,

58:56

they have a number of problems. One is they're

58:58

not energy dense. They

59:01

require a tremendous infrastructure to

59:03

produce. They might be

59:05

renewable at the energy level, but they're

59:07

not renewable at the raw materials level.

59:10

So that's a complete bloody lie. They're insanely

59:12

variable in their power production. And because of

59:14

that, you have to have a backup system.

59:17

And the backup system has to be reliable

59:19

without variability. And that means if you have

59:21

a renewable grid, you have to have a

59:23

parallel fossil fuel or coal grid

59:25

to back it up when the sun doesn't

59:27

shine and the wind doesn't blow, which

59:30

is unfortunately very, very frequently. And

59:33

so again, and so I'm not going

59:35

to say there's no place for renewable

59:37

energy like solar and wind, because maybe

59:39

there are specific niche locales where those

59:41

are useful. But the logical... What

59:45

would you say? Antidote to the problem

59:47

of reliability if we're concerned about carbon,

59:49

but we're really not, would be to

59:51

use nuclear. The Greens haven't

59:53

been flying their bloody flags

59:56

for 30 years saying, well, we could

59:58

use fossil fuels for 30 years for fertilizer

1:00:00

and feed people and we could use

1:00:02

nuclear power to drive energy costs down

1:00:05

in a carbon-dioxide-free manner. That

1:00:07

seems pretty bloody self-evident to me. And

1:00:09

so then it brings up this other

1:00:11

mystery that we were talking about earlier.

1:00:13

You know, what's the impetus behind all

1:00:15

this? Because the cover story is, oh,

1:00:17

we care about carbon dioxide, which I

1:00:20

don't think they do, especially

1:00:22

given the willingness to sacrifice

1:00:24

the poor. It

1:00:26

makes no sense to me. And I

1:00:28

think it's relevant to the issue you

1:00:30

brought up, which is that people have

1:00:32

these constellations of ideas and there's a

1:00:34

driving force in the midst of them,

1:00:36

so to speak. They're not necessarily aware

1:00:39

of what that driving force is. Don't

1:00:41

we, isn't it more likely that people

1:00:43

are either misinformed or misguided than people

1:00:45

are legitimately trying to depopulate the planet?

1:00:47

I'm, look, misinformed and ignorant.

1:00:50

That's plenty relevant and worth

1:00:52

considering. And stupidity is always

1:00:54

a better explanation than malevolence.

1:00:56

But malevolence is also an explanation. And

1:00:58

no, I don't think it's a better

1:01:00

explanation because why would we waste so

1:01:02

much money sending food aid, having Bush

1:01:05

do, you know, programs in Africa for

1:01:07

AIDS, having other billionaires like Bill Gates

1:01:09

invest so much money in anti-malarial stuff?

1:01:11

Like why would all the global elites

1:01:13

be so invested in helping and killing

1:01:15

the people there at the same time?

1:01:17

Well, some of it's confusion. Okay. You

1:01:19

know, and some of it's the fact,

1:01:21

you know, many things can be happening

1:01:23

simultaneously with a fair bit of internal

1:01:25

paradox because people just don't know

1:01:27

which way is up often. But the

1:01:30

problem with the argument, okay, so you

1:01:32

tell me what you think about this.

1:01:35

So, you know, Hitler's cover story was

1:01:37

that he wanted to make the glorious

1:01:39

Third Reich and elevate the Germans to

1:01:41

the highest possible status for the longest

1:01:43

possible period of time. Okay,

1:01:45

but that wasn't the outcome. The

1:01:48

outcome was that Hitler shot himself

1:01:50

through the head after he married

1:01:52

his wife who died from poison

1:01:54

the same day in a bunker

1:01:56

underneath Berlin while Europe was inflamed.

1:01:58

Well, he was insisting that the Germans deserved exactly

1:02:01

what they got because they weren't the noble people

1:02:03

he thought they were. And then

1:02:05

you might say, well, Hitler's plans collapsed

1:02:07

in flames and wasn't that a catastrophe?

1:02:09

Or you could say that was exactly

1:02:11

what he was aiming for from the

1:02:13

beginning, because he was brutally resentful and

1:02:15

miserable, right from the time he was,

1:02:17

you know, a rejected artist at the age of

1:02:19

16. And so he

1:02:22

was working or something was working within him

1:02:24

and something that might well be regarded as

1:02:26

demonic, whose end goal was precisely what it

1:02:28

attained, which was the devastation of

1:02:30

hundreds of millions of people. And Europe

1:02:32

left in smoking ruin. And the

1:02:34

cover story was the grand Third Reich.

1:02:37

And so there's no reason at all to assume that we're

1:02:39

not in exactly the same situation right now. I

1:02:42

think that's a great reason to assume I think that Hitler's motives and

1:02:44

everything he was trying to do weren't a secret. I

1:02:47

don't think that anybody had to guess that he was

1:02:49

incredibly anti-Semitic, that his Aryan supremacy was going to lead

1:02:51

to the destruction and the murder of, like, so many different

1:02:53

people in concentration camps, like none of this was a

1:02:55

secret. So he was hiding it. He

1:02:57

didn't want to. I mean, like he tried to

1:02:59

maybe hide the death camps. Nobody in Germany was

1:03:01

wondering like, wow, crazy that pogroms are happening against

1:03:04

Jewish people. That's so crazy. Or wow,

1:03:06

they're all being shipped, just mainly the Jews, to camps to

1:03:08

work. Like that's kind of interesting. Or wow, he talks about

1:03:10

this a lot in Mein Kampf, but it was just a

1:03:12

coincidence. I don't think you could compare

1:03:14

like Hitler to people that are worried about climate

1:03:17

change. The worry that I have here is because

1:03:19

we're applying this lens... people in Germany thought

1:03:21

Hitler was perfectly motivated by the highest of benevolence.

1:03:24

If I were to take this standard of evidence and

1:03:26

apply this lens of analysis, couldn't I say the exact

1:03:28

same thing about the conservative constellation of beliefs? They don't

1:03:30

want to intervene anywhere in the world because they don't

1:03:33

care about the problems there. They're

1:03:35

anti-immigration because they hate brown people. Trump wanted to ban

1:03:37

Muslims who come to the United States because he's

1:03:39

xenophobic. Conservatives don't want to have

1:03:41

taxes to help the poor because they want all the poor folk

1:03:43

to starve and die in the winter. But

1:03:45

like, I feel like if I- Some of that's true. And

1:03:47

yes, you can adopt that criticism. I

1:03:49

think the difference with regards, especially to

1:03:51

the libertarian side of the conservative enterprise,

1:03:53

but also to some degree to the

1:03:55

conservative enterprises, they're not

1:03:58

building a central gigantic

1:04:00

organization to put forward this

1:04:02

particular utopian claim. And

1:04:04

so even if the conservatives are as morally

1:04:06

addled as the leftists, and to some degree

1:04:08

that might be true They're not organized with

1:04:10

the same gigantism in mind And

1:04:12

so they're not as dangerous at the moment now They

1:04:15

could well be and they have been in the past

1:04:17

but at the moment they're not. And

1:04:19

so of course you can be skeptical

1:04:21

about people's motivations when they're brandishing

1:04:25

Why would we say that they're not

1:04:27

as concerned about the gigantism? I feel like everybody

1:04:29

is when it's a particular thing that they care about

1:04:32

You mean whether they would be

1:04:34

inclined in that direction? For sure. Conservatives wield

1:04:37

the power of the government whenever they feel they

1:04:39

need to, just as liberals do. Conservatives were

1:04:41

very happy... Well, for instance, abortion was brought back

1:04:43

as a... That's a good, that's a good

1:04:46

objection. I think that You're

1:04:48

correct in your assumption that once

1:04:50

people identify a core area

1:04:52

of concern, they're going

1:04:54

to be motivated to seek power to

1:04:56

implement that concern. I think cancel culture is

1:04:59

a good example, too. I think conservatives

1:05:01

prior to the 2000s if they could censor

1:05:03

everything related to, you know, LGBT stuff

1:05:06

or weird musical stuff that they don't want

1:05:08

the kids to watch conservatives would do it But now

1:05:10

that you see that like liberals and progressives are kind

1:05:12

of wielding that corporate hammer now conservatives are very much

1:05:14

Well, hold on we need freedom of speech. We need

1:05:16

to platform everybody. And now progressives are like, well, hold

1:05:18

on Maybe we shouldn't platform people Got

1:05:20

no disagreement with those things that you said and

1:05:22

I have no disagreement about your proposition that

1:05:24

people will seek power to impose their

1:05:27

central doctrine.

1:05:30

Okay, so then you might say and so

1:05:32

we can have a very serious conversation about that

1:05:35

What do we have that

1:05:37

ameliorates that tendency? In

1:05:39

the United States, we've got a D hopefully a

1:05:41

form of decentralized government. I can't speak to Canada

1:05:43

as much. Yes, well, yes, that's, that's true. So

1:05:46

that's one of the institutional protections against

1:05:48

that because what that does is put

1:05:51

various forms of power striving in conflict

1:05:53

with one another Right and

1:05:55

so that's a very intelligent solution. But

1:05:57

then there are psychological solutions

1:06:00

as well. And one

1:06:02

of them might be that you abjure the use of

1:06:04

power, right, as a principle. And

1:06:08

so that, and this is one of the things that was done

1:06:10

very badly during the COVID era, let's say, because

1:06:13

the rule should be something like

1:06:15

you don't get to impose your

1:06:17

solution on people using compulsion

1:06:20

and force. There's a doctrine

1:06:22

there, which is any policy that requires compulsion

1:06:25

and force is to be looked upon

1:06:27

with extreme skepticism. Now it's tricky because now

1:06:29

and then you have to deal

1:06:31

with psychopaths, and they tend not to

1:06:33

respond to anything but force. And

1:06:36

so there's an exception there that always has to be

1:06:38

made, and it's a very tricky exception. Look,

1:06:42

let me, let me tell you a story. And you

1:06:44

tell me what you think about this, because

1:06:46

I think it's, it's very relevant to the

1:06:48

concern that you just, you just expressed. And I

1:06:50

don't believe that the conservatives are

1:06:53

necessarily any less tempted

1:06:56

by the, by

1:06:59

the calling of power than the leftists. That's

1:07:01

going to vary from

1:07:03

situation to situation. Though I would say

1:07:05

probably overall in the 20th century, the

1:07:07

leftists have the worst record in terms

1:07:10

of sheer numbers of people killed. I

1:07:12

mean, it depends on how we're quantifying

1:07:14

that. Not really. Okay, we'll just quantify

1:07:16

Mao. How's that? Direct death

1:07:18

of 100 million people. So,

1:07:21

you know, that's a pretty stark fact. And if we're going

1:07:23

to argue about that, well, then we're really not going to

1:07:25

get anywhere. So, and you know,

1:07:27

I'm not disagreeing that the Holodomor happened as well. The

1:07:29

Soviet Union and the, and China were horrible. I mean, I'm

1:07:31

not going to, I'm not going to

1:07:34

say, for World War II, it depends on how much you

1:07:36

attribute the war dead to Nazi Germany, et cetera, et cetera.

1:07:41

But sure, like, largely speaking, I don't think that

1:07:43

the left beat the right because the right wasn't

1:07:45

trying. I don't think it's

1:07:47

because Hitler's lack of trying led him to kill fewer

1:07:49

people than ended up dying during the Great Leap

1:07:52

Forward or during the industrialization of the Soviet Union. Yes.

1:07:54

Well, I also think it's an open question still to

1:07:56

what degree Hitler's policies were right-wing versus left-wing, and

1:07:58

no one's done the analysis properly yet

1:08:00

to determine that. Well, what do we consider? Because it

1:08:02

was a national socialist movement for a reason. And the

1:08:04

socialist part of it wasn't accidental. Well, but the social,

1:08:06

I mean, there were no, you know, cooperatively

1:08:09

formed businesses that were owned by all of the

1:08:11

people for the people and distributed to the people.

1:08:13

And I don't think redistribution was high on Hitler's

1:08:15

list of things to do for it. That's true,

1:08:17

that's true. It was a strange mix of authoritarian

1:08:19

politics. I don't think it was a strange mix.

1:08:21

I think it was a bid to appeal to

1:08:23

Mid-Left and Center Left, the KPD and the German

1:08:25

Socialist Party by calling themselves national socialists. I think

1:08:27

it was very much like an authoritarian, ultra nationalist

1:08:30

regime that pretty squarely fits with, people get mad at

1:08:32

me because I'm far right or far left because they have

1:08:34

an attack on the environment. Well, you know, one of the

1:08:36

things I would have done if I would have been able

1:08:38

to hang on to my professorship at the University of Toronto

1:08:40

would have been to extract out a

1:08:43

random sample of Nazi

1:08:45

policies and strip them of

1:08:47

markers of their origin and present them to

1:08:49

a set of people with conservative or leftist

1:08:51

beliefs and see who agreed with them more.

1:08:53

And that analysis has never been done as

1:08:55

far as I know. So we actually don't

1:08:57

know. And we could know if

1:09:00

the social scientists would do their bloody job,

1:09:02

which they don't generally speaking. That's

1:09:04

something we could know. We could probably

1:09:06

use the AI systems we have now,

1:09:08

the large language models to determine to

1:09:11

what degree left and right beliefs intermingled

1:09:13

in the rise of national socialism. So

1:09:15

that's all technically possible. And

1:09:18

it hasn't been done, so it's a matter of

1:09:20

opinion. Sure, I don't necessarily disagree. That's

1:09:23

something you could do. Okay, so I was gonna

1:09:25

tell you this story. Okay, well, this has to

1:09:27

do with the use of power. So

1:09:31

I spent some time with

1:09:34

a group of scholars

1:09:36

analyzing the Exodus story in an Exodus

1:09:38

seminar recently. And so

1:09:41

the Exodus story is a very interesting

1:09:43

story because it's a,

1:09:45

what would you say? It's

1:09:48

an analysis of the

1:09:50

central tendency of

1:09:53

movement away from tyranny and slavery. That's a good

1:09:55

way of thinking about it. So the

1:09:58

possibility of tyranny and of

1:10:00

slavery are possibilities that present

1:10:04

themselves to everyone within the confines

1:10:06

of their life psychologically and socially.

1:10:08

You can be your

1:10:10

own tyrant with regards to the imposition of

1:10:12

a set of radical doctrines that you have

1:10:14

to abide by and punish yourself brutally whenever

1:10:16

you deviate from them. And we all contend

1:10:19

with the issue of tyranny and slavery. And

1:10:22

there's an alternative path and that's what

1:10:24

the Exodus story lays out. And Moses

1:10:26

is the exemplar of that alternative path,

1:10:28

although he has his flaws. And one

1:10:31

of his flaws is that he turns

1:10:33

too often to the use of force.

1:10:36

So he kills an Egyptian, for

1:10:38

example, an Egyptian noble who has

1:10:40

slain a Hebrew, one of

1:10:42

Moses' Hebrew slave brothers, and he has

1:10:44

to leave. There's a variety of indications

1:10:46

in the text that he

1:10:49

uses his staff, he uses his

1:10:51

rod, and he uses power when

1:10:53

he's supposed to use persuasion and

1:10:56

legal or verbal invitation

1:11:00

and argumentation. And this

1:11:02

happens most particularly, most spectacularly, right at

1:11:04

the end of the sojourn. So Moses

1:11:06

has spent 40 years

1:11:09

leading the Israelites through the desert. And

1:11:12

he's right on the border of the promised

1:11:14

land. And really what that means, at a

1:11:17

more fundamental basis, is that he's

1:11:21

at the threshold of attaining

1:11:24

what he's been aiming at, what he's devoted his whole

1:11:26

life to. And he's been a servant

1:11:30

of that purpose in the highest

1:11:32

order. And the

1:11:34

Israelites are still in the desert, which

1:11:36

means they're lost and confused. They don't

1:11:38

know which way is up, they're still

1:11:40

slaves. And now they're dying of thirst,

1:11:42

which is what you die of, spiritual

1:11:45

thirst if you're sufficiently lost.

1:11:47

And they go to Moses

1:11:49

and ask him to intercede with God. And God

1:11:52

tells Moses to speak to the rocks so

1:11:55

that they'll reveal the water within. And

1:11:57

Moses strikes the rocks with his rod

1:11:59

twice instead, right? He uses

1:12:01

force. And so God says to

1:12:04

him, you'll now die before you enter

1:12:06

the promised land. It's Joshua who enters and

1:12:08

not Moses. Okay, and you might wonder

1:12:10

why I'm telling you that story.

1:12:12

I'm telling you that story because those

1:12:15

concepts at the center of that cloud

1:12:17

of concepts that you described are stories,

1:12:20

right? They're stories and if

1:12:22

they're well formulated, they're archetypal stories. And

1:12:24

this is an archetypal story that's illustrating

1:12:27

the danger of the use of compulsion

1:12:30

and force. You know, and

1:12:32

so one of the problems you're obviously

1:12:34

obsessed by and that I'm trying to

1:12:36

solve is what do we do as

1:12:38

an alternative to tyranny, whether it's for

1:12:40

a utopian purpose in the future, or

1:12:42

maybe for the purpose of, like, conservatives

1:12:45

censoring music lyrics, they don't approve

1:12:47

of. And one answer is we

1:12:49

don't use force, we do the

1:12:51

sort of thing that you and I are trying to do right now,

1:12:54

which is to have a conversation that's aimed at

1:12:56

clarifying things. And so

1:12:58

that's a principle that that's

1:13:01

something like the consent of the governed, right? It's

1:13:05

something like, but it's also something like you have

1:13:07

the right to go to hell in a handbasket,

1:13:09

if that's what you choose. And, as long

1:13:11

as you don't, you know, in

1:13:13

doing so, you're not in

1:13:16

everyone's way too much, you

1:13:18

have the right to your own destiny. Right.

1:13:20

And so, and you don't get to use power

1:13:22

to impose that. That's the other thing that worries

1:13:26

me about what's going on on the utopian

1:13:28

front, because the problem is, you know, once

1:13:30

you conjure up a climate apocalypse, and

1:13:34

you make the case that there's an impending

1:13:36

disaster that's delayed, and you

1:13:38

might say, well, delayed how long? And the response would

1:13:41

be, well, we're not sure, but it's likely to occur

1:13:43

in the next 100 or so

1:13:45

years, which is pretty inaccurate. You

1:13:47

now have a universal get out

1:13:49

of jail card that can be

1:13:51

utilized extremely well by power mad

1:13:53

psychopaths. And they will absolutely

1:13:55

do that, because power mad psychopaths use

1:13:57

whatever they can to further their

1:14:00

cause. People are driven by the

1:14:02

search for better but when it comes

1:14:05

to hiring the best way to search for a

1:14:07

candidate isn't to search at all. With over

1:14:09

350 million global monthly visitors Indeed's

1:14:11

matching engine helps you find quality

1:14:13

candidates fast. You get to ditch

1:14:15

the busy work. Use Indeed

1:14:18

for scheduling, screening, and messaging so you

1:14:20

can connect with candidates faster. It's not

1:14:22

just about hiring faster though. 93% of

1:14:25

employers agree Indeed delivers the highest quality

1:14:28

matches compared to other job sites according

1:14:30

to a recent Indeed survey. Don't

1:14:32

search, match with Indeed. And listeners

1:14:35

of this show will get a

1:14:37

$75 sponsored job credit to get

1:14:40

your jobs more visibility at indeed.com/podcast.

1:14:42

Just go to indeed.com/podcast right now

1:14:44

and support our show by saying

1:14:47

you heard about Indeed on this

1:14:49

podcast. indeed.com/podcast terms and conditions apply.

1:14:52

Need to hire? You need Indeed.

1:14:54

So here's

1:14:56

my this is my issue I think. This is my issue with

1:14:59

a lot of people when it comes to political conversations. I

1:15:01

think that everything you've said is

1:15:03

true and I think that all

1:15:05

of it is it's it's good

1:15:07

analysis but I feel like it just

1:15:10

gets wielded sometimes in one direction and

1:15:12

then people kind of miss that it

1:15:14

completely and fully describes their entire side

1:15:16

as well and the thing that

1:15:18

I feel like the only solution for this is you hinted

1:15:20

at it. It's more than just

1:15:22

conversation although that's a good start. We have

1:15:24

to go back to inhabiting similar areas we

1:15:26

have to go back to inhabiting similar like

1:15:28

media landscapes. I think that the issue that

1:15:30

we're running into right now more than anything

1:15:33

else is people live in completely separate realities

1:15:35

at the moment such that if

1:15:37

we were even to describe basic reality how many illegal

1:15:39

immigrants came into the United States last year. That should

1:15:41

be a factual number that we can know. How many

1:15:43

do you think? So, um, the

1:15:47

actual number is probably

1:15:49

in the hundreds of thousands. I think some conservatives think it's

1:15:51

three million per year over the past three years because they

1:15:54

look at like border contacts or they look at asylum seekers

1:15:56

and they're not looking at the process a lot. Yeah I

1:15:58

think it's 3.6 million. Into the

1:16:00

US and stay? Yes, through the Southern border.

1:16:02

Okay. So. You know the historical,

1:16:04

you know the, I got an armory, I got

1:16:06

a million. I understand, I understand the shirt. Well,

1:16:08

historically, there's like 13 to 15 million people

1:16:11

living in the United States illegally. That's like the history

1:16:13

of illegal immigration in the United States. But some,

1:16:16

but hey, maybe I'm wrong there, right? So we can

1:16:18

say that that's an example of us living in a

1:16:20

fundamentally different reality. Well, the

1:16:22

Pew Research Group has established quite conclusively that

1:16:25

the variability over the last 20 years for

1:16:27

illegal migration at the southern border is between

1:16:29

300,000 and 1.2 million. Well,

1:16:32

the Pew Research can only establish, I think the number

1:16:34

of people attempting to cross. I don't know if they

1:16:36

can, no, I don't know if Pew does like census

1:16:38

analysis. I'd have to see the- Well, I don't, well,

1:16:40

that's a different issue, right? Sure. Because

1:16:43

I don't know how you measure how many illegal

1:16:45

immigrants there are actually in the country. I understand.

1:16:47

I just want to point out. I just want

1:16:49

to point out, I agree with you. I listen

1:16:51

to a lot of Rush Limbaugh growing up. I

1:16:53

understand the fear of having a government agency say,

1:16:55

climate change, therefore, we have a blank check to

1:16:57

do whatever we want. That's scary. That's

1:16:59

what they are doing. The conservatives do the same thing

1:17:02

though. I'm not saying otherwise. Yeah, but the problem is

1:17:04

I think people don't talk about it. So for instance,

1:17:06

I heard, so we can pretend now that

1:17:08

the conservative argument was just compulsory vaccines are

1:17:10

bad because they infringe on my freedom. That

1:17:13

wasn't the conservative argument. The conservative argument was

1:17:15

that mass deaths were going to happen, mass

1:17:17

side effects were going to happen. There was

1:17:19

going to be all this corruption and stuff

1:17:21

related to vaccine distribution, to the

1:17:24

crazier theories where microchips and blah, blah, blah. None

1:17:26

of that came true. Absolutely none of the conservative

1:17:28

fear mongering related to the mRNA vaccines came to

1:17:30

fruition, but now that's all forgotten. And that was-

1:17:32

What do you mean, none of this?

1:17:34

What do you make

1:17:37

of the excess

1:17:39

deaths? For

1:17:41

related to vaccines, there are almost none. This,

1:17:43

the mRNA vaccines have been administered to- Excess

1:17:45

deaths in Europe- Related to vaccines. We don't

1:17:48

know, no, no, no. We absolutely know. We

1:17:50

absolutely know. It is like, what do we

1:17:52

know? What do we know? In

1:17:54

terms of vaccine related to that? No, no, no, no, no, no. That's

1:17:56

not my question. Excess

1:17:59

deaths in Europe are up about 20% and

1:18:02

they have been since the end of

1:18:04

the COVID pandemic. Sounds really high to

1:18:06

me. 20%? I mean, I'll check afterwards,

1:18:08

but is this including the Ukrainian war

1:18:10

with Russia? No, no, it's not including

1:18:12

the Ukrainian war. Okay. Are

1:18:14

you implying that you think it's because of vaccines?

1:18:17

I'm not implying anything. I'm saying what the excess

1:18:19

deaths are. But what is your take on what's

1:18:21

causing it? And you

1:18:23

said that in a counter to me describing mRNA

1:18:25

vaccines, you said, well, the excess deaths are 20%.

1:18:27

That makes it the implication is that the vaccines

1:18:29

are causing it? Okay. First of all, something

1:18:32

is causing it. Well, that's obvious,

1:18:34

yeah. Something is causing it or some

1:18:36

combination of factors. Sure. Now,

1:18:39

one possibility is that the healthcare systems

1:18:41

were so disrupted by our insane focus

1:18:43

on the COVID epidemic that we're still

1:18:45

mopping up as a consequence of that.

1:18:47

Wait, are these excess deaths tracing back

1:18:49

through COVID as well? Post COVID.

1:18:52

Just post COVID. Post COVID. Okay.

1:18:55

Right. They're terrifying. Right. They're terrifying.

1:18:58

And they're not well publicized.

1:19:02

I think the excess deaths, the fact that you're speaking

1:19:04

to them right now seems like... Yeah, but I ferret

1:19:06

down a lot of rabbit holes. It's not like it's

1:19:08

front bloody page news on the New York Times. Sure.

1:19:11

But I think excess deaths, that's a metric that you

1:19:13

can Google. And I'm pretty sure there are like three

1:19:15

different huge organizations that track excess deaths around the world.

1:19:18

Many more than three. Yes. In every

1:19:20

single European country. Right. Okay.

1:19:23

Well, so one relatively

1:19:25

straightforward hypothesis is that it's a

1:19:27

consequence of the disruption of the

1:19:30

healthcare system, the staving off of

1:19:32

cancer treatment, et cetera. The increase

1:19:34

in depression, anxiety, suicidality, and alcoholism

1:19:37

that was a consequence of the

1:19:39

lockdowns, the economic disruption. And there's

1:19:42

plenty of reason to believe that

1:19:44

some of that is the case.

1:19:46

But the other obviously glaring possibility

1:19:49

is that injecting billions of people

1:19:51

with a vaccine that was not

1:19:54

tested by any stretch of the imagination,

1:19:56

with a thoroughness that it should have

1:19:58

before it was forced upon people. also

1:20:00

might be a contributing factor, partly

1:20:03

because we know that it

1:20:05

led to a rise in myocarditis among young

1:20:07

men. And we also know that

1:20:09

there was absolutely no reason whatsoever to ever

1:20:11

recommend that that vaccine was delivered to young

1:20:13

children. The risk of death from COVID was

1:20:16

so close to zero that it might as

1:20:18

well have been zero. When you're talking about

1:20:20

a disease, the risk of death isn't the

1:20:22

only thing that you worry about for the

1:20:24

disease. It's also long-term. So you're talking about

1:20:26

transmission? Because that was another thing that the

1:20:28

COVID vaccine, which we can

1:20:31

talk about, absolutely did, because

1:20:33

it decreased transmission, it didn't get rid

1:20:35

of transmission, but it reduced transmission. But it was

1:20:37

claimed that it would get rid of transmission. Only if

1:20:39

you take one reading of one single quote, I think

1:20:41

that, oh, Biden said one time where he said, no,

1:20:43

come on, I've heard so many times, because I'm gonna

1:20:46

say, oh, you can't take anything Trump says seriously. Biden

1:20:48

one time on the news says, if you get the

1:20:50

vaccine, you won't transmit it. That is so

1:20:52

silly. Which was it, no. Do you know

1:20:54

that our prime minister in Canada deprived Canadians of the

1:20:56

right to travel for six months because

1:20:58

the unvaccinated were going to transmit COVID

1:21:00

with more likelihood than the vaccinated. So

1:21:04

this wasn't one bloody statement. This was the third

1:21:06

government- No, hold on. What I'm

1:21:08

saying is there wasn't a statement given that if

1:21:11

you get vaccinated, there is a 0% chance

1:21:13

of transmitting the disease. The idea is that

1:21:15

vaccines were supposed to help. They could reduce

1:21:17

this. It reduces your hospitalization, it reduces death,

1:21:19

and it reduces transmission, hopefully by making it

1:21:21

so that people don't get sick or don't

1:21:23

get sick for as long. All three of

1:21:25

those things, the vaccines did exceedingly well. They

1:21:27

continue to do that to this day, but

1:21:29

especially for the first variant and then the

1:21:31

Delta variant, the vaccines helped immensely here. They

1:21:34

were tested. The myocarditis rates are like seven

1:21:36

out of 100,000 injections and

1:21:39

the myocarditis is generally acute. And it's generally not as

1:21:41

bad as even getting the coronavirus itself, which will lead

1:21:43

you also to have a- Those are

1:21:45

much worse side effects than side effects that have caused

1:21:47

other vaccines to be taken off the market before. That

1:21:50

seven out of 100,000 rate of acute myocarditis or

1:21:54

pericarditis is not a worse side

1:21:56

effect than any other vaccine. It is a completely acceptable,

1:21:58

given that the disease itself is more likely- to

1:22:00

cause myocarditis or pericarditis? Yes, I

1:22:02

don't think the data support

1:22:04

that presupposition anymore. The latest peer-reviewed

1:22:06

studies show that that's simply not

1:22:08

true, especially among young men. So

1:22:11

there is an age bracket of young men

1:22:14

where the elevated rate of myocarditis, acute myocarditis

1:22:16

from the vaccine, might have been higher, but

1:22:18

we're talking about like three or four cases

1:22:20

per 100,000 people. And again, myocarditis and pericarditis

1:22:23

are generally acute conditions. They don't last for

1:22:25

very long. I told you at the beginning

1:22:27

of this conversation that the progressive leftists were

1:22:30

on the side of the pharmaceutical companies. It's not about

1:22:32

being on the side of the pharmaceutical companies. It's

1:22:34

about... Really? Really, yeah. Yeah,

1:22:36

well, I see. So what I see as

1:22:38

the unholy part of that alliance with the

1:22:41

pharmaceutical companies is that it dovetails with the

1:22:43

radical utopian's willingness to use power to impose

1:22:45

their utopian vision. Well, then what do you

1:22:47

make of that? Because otherwise, how would you

1:22:49

explain it? Because the leftists should have been

1:22:51

the ones that were most skeptical about the

1:22:53

bloody pharmaceutical companies. And they jumped on the

1:22:55

vaccine bandwagon in exactly the same way that

1:22:58

you're doing right now. Pharmaceutical companies have helped

1:23:00

us tremendously. Yeah, right. There we go.

1:23:02

Fine. No. Like, my medicine hasn't. No,

1:23:05

I don't think so. You're just wrong.

1:23:07

I think they're utterly wrong. So you

1:23:09

don't think that the pharmaceutical companies who

1:23:12

dominate the advertising landscape with 75% of

1:23:14

the funding are corrupt? I

1:23:18

don't... Corrupt is a very broad... No,

1:23:20

no, no. It's... Do you think that... Corrupt with

1:23:23

a tinge of malevolence, willing to extract

1:23:25

money out of people by putting their

1:23:27

health on the line. You don't believe

1:23:29

that? Do you think that we get

1:23:31

effective drugs from pharmaceutical companies? Not

1:23:34

particularly. Okay.

1:23:37

So do you think that any vaccines work? Yes. Do

1:23:40

you think that any... I don't think 80 of them

1:23:43

work at once

1:23:45

for babies. I

1:23:47

think that's a little risky. But yet we've

1:23:49

been on this vaccine schedule for how many

1:23:51

decades? Like this. Like this.

1:23:54

Not like this. Not carefully.

1:23:57

I had a ton of vaccines when I was a child. I'm

1:23:59

pretty sure that was... the norm for people.

1:24:01

There were a ton of vaccines. There's

1:24:03

way more now. Okay. And you

1:24:05

think that you can understand why. I mean,

1:24:07

look, part of it, no doubt, no doubt

1:24:10

part of it is a consequence of the

1:24:12

genuine, genuine willingness to protect children. But the

1:24:14

moral hazard is quite clear. And people on

1:24:16

the left used to be aware of this.

1:24:19

Where do you think the mRNA vaccine speeding up

1:24:21

came from? What do you make of the

1:24:23

fact that it was Donald Trump that did Warp Speed?

1:24:27

Terror. Foolish panicking, just like we're

1:24:29

doing with the climate issue. So

1:24:31

you think Trump was? Truly panicking. Was he in bed with

1:24:33

the pharmaceuticals? Was he working with the left? Or was it

1:24:35

just a dumb? That was the only panicky thing he made.

1:24:37

He didn't try to push for the mass lockdowns, like other

1:24:39

far left people would want him to do. That was just

1:24:42

the one mistake he made with the pushing for the vaccine?

1:24:45

No, I think Trump undoubtedly made all

1:24:47

sorts of mistakes and lots. And it

1:24:49

wasn't, it certainly wasn't only the left

1:24:51

that stampeded toward the

1:24:53

forced COVID vaccine. Um,

1:24:56

debacle. But it

1:24:58

was most surprising to me

1:25:01

that it emerged on the

1:25:03

left, because the left at

1:25:05

least had been protected against

1:25:07

the depredations of gigantic predatory

1:25:09

corporations by their skepticism of

1:25:13

the gigantic enterprises that can engage in regulatory

1:25:15

capture. And that just vanished. Is it not

1:25:17

possible that maybe people looked and they said,

1:25:20

hey, if all the governments, all the institutions,

1:25:22

all the schools, all the private companies across

1:25:24

all the countries around the world are saying

1:25:26

the same thing. Yeah. Maybe it is the

1:25:28

case that this vaccine just helps. Is that

1:25:31

not possible? Oh, sure. They probably-

1:25:33

Sure, of course it's possible. But that didn't mean it

1:25:35

was right. They use

1:25:37

force. If they use force, we use

1:25:39

force for all sorts of things in

1:25:41

terms of public health. We don't generally use force

1:25:43

to invade people's bodies. How long have

1:25:46

vaccine mandates been a thing in Canada, the United

1:25:48

States and the entire world? I

1:25:50

don't think they should have been, then, either. That's great.

1:25:52

If you don't think they should have been, then to say

1:25:54

we don't generally use force- We absolutely use force. We

1:25:56

have used forced vaccines for a long time. It's an important

1:25:58

part of public health. Fair enough. We

1:26:01

did it on a scale and

1:26:03

at a rate during the COVID

1:26:05

pandemic, so-called pandemic, that was unparalleled.

1:26:07

And the consequence of that was

1:26:09

that we injected billions of people

1:26:12

with an experimental, it wasn't

1:26:14

a bloody vaccine. No, it wasn't. No,

1:26:16

it isn't. It's

1:26:20

not. It's not 100% success rate. What do you think is the

1:26:22

definition of vaccine? Well, the point of the vaccine is to

1:26:24

give your body a protein to try and prime an

1:26:26

immune response to it. The word vaccine. Who cares what

1:26:29

you're saying? There's plenty of different types of- They used

1:26:31

the word vaccine so that they didn't have to contend

1:26:33

with the fact that it wasn't the same technology. There

1:26:35

are different types of vaccines. There certainly are. There are

1:26:37

different technologies. Fine. The mRNA vaccines are the type of

1:26:40

vaccine technology. There used to be vaccines. Now

1:26:42

this is vaccines. No, it was like this

1:26:44

and now it's like this. No, no, no.

1:26:46

It was like this and now it's like this.

1:26:48

The mRNA technology was a radical,

1:26:51

qualitative leap forward in technology. You can call

1:26:53

it a vaccine if you want to, but

1:26:56

it bears very little resemblance to any vaccine

1:26:58

that went before that. The reason it was

1:27:00

called a vaccine was because vaccine was a

1:27:02

brand name that had a track record of

1:27:05

safety and shoehorning it in that was one

1:27:07

of the ways to make sure that people

1:27:09

weren't terrified of the technology. I think the

1:27:12

reason I call it a vaccine is because

1:27:14

they're injecting you with something that's inoculating you

1:27:16

against something in the future because of those

1:27:18

proteins that resemble a virus that- There are

1:27:21

overlaps between the mRNA technologies

1:27:23

and vaccines to be sure, but

1:27:26

they wouldn't have been put forward with the

1:27:28

rate that they were put forward if they

1:27:30

weren't a radical new technology. It's bad in

1:27:33

principle to inject billions of people

1:27:35

with an untested new technology. Isn't it also bad

1:27:37

in principle for billions of people to get infected

1:27:39

with a worldwide pandemic that initially was causing a

1:27:42

decent number of deaths, a ton of complications, shutting

1:27:44

down world economies? Maybe, maybe it was. Maybe it

1:27:46

was. So should we engage in that

1:27:48

analysis and figure out like if we look at-

1:27:50

We're not engaging in the analysis. No, because now

1:27:53

we're debating whether or not vaccines are even vaccines

1:27:55

or not. No, no, no, no, no, no. Don't

1:27:57

play that game. That is not what I was-

1:28:00

doing. I was making a very specific and

1:28:02

careful case. The mRNA technology

1:28:04

by wide recognition is an extraordinarily

1:28:06

novel technology. That doesn't make it

1:28:08

not a vaccine though. Well, okay,

1:28:12

it's a radically transformed form of

1:28:14

vaccine. I don't give a damn.

1:28:16

That still makes it something so

1:28:18

new that the potential danger

1:28:21

of its mass administration was

1:28:23

highly probably, highly probable

1:28:25

to be at least or more dangerous than

1:28:28

the thing that it was supposed to protect

1:28:31

against. And we are seeing that in

1:28:34

the excess deaths. Absolutely not. So are you

1:28:36

implying that the excess deaths were caused by

1:28:38

the vaccines or? I don't bloody well

1:28:40

know what they're caused by. Look, if

1:28:43

you're going to use the Occam's razor,

1:28:45

you're kind of stuck in an awkward

1:28:47

place here. Absolutely not stuck. It's the most

1:28:49

administered vaccine in the history of inoculation. Whatever

1:28:51

you're going to call it, the history of

1:28:53

all of mankind, every single organization around the

1:28:56

world is motivated to call this out if it

1:28:58

was a bad thing. You don't think Russia or China

1:29:00

would be screaming if Donald Trump or the United States

1:29:02

warp-sped a vaccine that was having deleterious effects on

1:29:04

populations all around the world. You don't think there wouldn't

1:29:07

be some academic institution. You didn't think there'd be more

1:29:09

than a handful of doctors and Joe Rogan and some

1:29:11

conservatives saying this vaccine might've been bad if it was

1:29:13

the case that American companies, working with companies

1:29:15

in Europe and Germany, especially, because that's where BioNTech is

1:29:17

from, in order to create or manufacture a vaccine

1:29:19

that was causing issues all around the world. There are

1:29:22

so many different people that would be motivated to call this

1:29:24

out. How do you explain that? No, it's a handful

1:29:26

of people. Where are the governments calling it out? Where are the

1:29:28

academic institutions calling it out? Where are the other private companies

1:29:30

calling it out? Wouldn't you stand to make a killing

1:29:32

if you were a private company in Europe and you

1:29:35

could say, look, the mRNA vaccines for sure are causing

1:29:37

all of these issues. Why wouldn't Putin, why wouldn't Xi

1:29:39

Jinping, why wouldn't anybody else in the world call this

1:29:41

out, if it was as horrible as it was? There are

1:29:43

plenty of people attempting to call this out.

1:29:45

Nobody credible and no huge institution. What

1:29:48

do you make of the excess deaths?

1:29:51

You haven't come up with a bloody hypothesis. I don't even

1:29:53

know if there are 20% excess deaths in Europe

1:29:55

right now. If I had to guess off the top of

1:29:57

my head, it's going to be, like you said, one might be

1:30:00

lingering effects of an overwhelmed healthcare system. Another one might

1:30:02

be deaths related to the war in Ukraine. Another one

1:30:04

might be rising energy costs that have happened for a

1:30:06

couple of reasons. But it's absolutely impossible that any of

1:30:08

it could be unintended consequences of a novel technology injected

1:30:10

into billions of people. I think that if excess, first

1:30:12

of all, there aren't billions of people in Europe. Well,

1:30:14

I didn't say there were. So if there were excess

1:30:16

deaths, I understand, but you're talking about excess deaths in

1:30:18

Europe. I'm not aware of excess deaths that exist in

1:30:20

other places that are completely and totally unaccounted for, but

1:30:22

the only explanation would be the vaccine. I

1:30:24

think it's a bit more. I think more people

1:30:27

would be talking about it. Well, we have to.

1:30:29

Well, first of all, the number of people talking

1:30:31

about something is not an indication of the scientific

1:30:33

validity of a claim. I agree with that. But

1:30:35

why are you using mass consensus as

1:30:38

the determinant of what constitutes true? Because

1:30:41

I think for something that was given to billions

1:30:43

and billions of people, if this was something that

1:30:45

had a measurable effect on people, it would be

1:30:49

impossible to cover it up or ignore it. We wouldn't

1:30:51

have to look at the one case brought up on

1:30:53

a documentary. We wouldn't have to look at the one

1:30:55

thing being talked about. And what do you make of

1:30:57

the VAERS data? There's more

1:31:00

negative side effects reported from

1:31:02

the mRNA vaccines than there were

1:31:05

reported for every single vaccine ever

1:31:07

created since the dawn of time

1:31:10

and not by a small margin. So it's not

1:31:12

just the excess deaths. I agree. It's the VAERS

1:31:14

data. What is VAERS data? It's

1:31:17

the database that, until the

1:31:19

COVID-19 pandemic emerged, when we had

1:31:21

the unfortunate consequence that there were

1:31:23

so many side effects being reported,

1:31:25

was the gold standard for

1:31:27

determining whether or not vaccines were

1:31:29

safe. And now as soon as

1:31:31

it started to misbehave on the

1:31:33

mRNA vaccine front, we decided

1:31:35

that we were going to doubt the validity of

1:31:38

the VAERS reporting system. Okay. The VAERS reporting system

1:31:40

was never the gold standard for anything. The VAERS reporting

1:31:42

system is when you report that there is some issue

1:31:44

that you have after getting a vaccine. That's it.

1:31:46

I think it's Vaccine Adverse Event Reporting System. What the hell do you think it

1:31:48

was set up for? To report adverse events.

1:31:50

Why? And after a vaccine to track and

1:31:52

see if something was related to the vaccine.

1:31:54

Right. Why? So most people

1:31:56

didn't even know VAERS existed until after

1:31:58

the COVID vaccine. Once

1:32:01

people know that it exists, of course more people are

1:32:03

going to engage with it, but what happens is... So

1:32:05

it's all noise. No, well, it

1:32:07

could be or couldn't be. So what do

1:32:09

you do? When a bunch of times... Well,

1:32:11

first of all, you might begin by suggesting

1:32:13

that maybe it's not all noise. Correct. So

1:32:15

when all of these things are admitted to

1:32:17

VAERS, what they do is, from there, they

1:32:19

investigate. All VAERS is, is

1:32:21

I might go and get a vaccine, and maybe in

1:32:24

three days they go, hmm, I've got a headache. I'm

1:32:27

going to go ahead and call my doctor and make this report,

1:32:29

and they'll say, okay, well, it's an adverse event after a vaccine.

1:32:31

It doesn't mean the vaccine caused the headache, and now that more

1:32:33

people know about this... I'm saying that there's... It

1:32:36

is not the gold standard for determining if a vaccine is

1:32:38

working or not. Compared to actual

1:32:41

longitudinal, prospective, randomized controlled trials. You mean like the ones

1:32:44

they should have done to the goddamn vaccine? Like the

1:32:46

ones that they did do for the vaccine, that they

1:32:48

continued to do to this day. Yes, that is correct.

1:32:50

Yes. You really think that

1:32:52

you're in a position to evaluate the scientific credibility

1:32:55

of the trials for the vaccines, do you? No,

1:32:57

I don't. So I have to trust... Then what are

1:32:59

you doing? I don't trust the bloody data. First

1:33:02

of all, you have to trust third parties to

1:33:05

some extent. I don't have to trust... Of

1:33:07

course you do. You do every day. When you turn the

1:33:09

keys in your car, you hope your engine doesn't explode. When

1:33:11

you're drinking water, you hope that the public water or

1:33:13

whatever tap or bottled water you got out of isn't

1:33:16

contaminated or poisoned with cholera. I don't do that as

1:33:18

a consequence of consensus. No,

1:33:20

of course you do. No, I don't. I do that

1:33:22

as a consequence of observing multiple times that when I

1:33:24

put the goddamn key in the ignition, the truck started.

1:33:26

Why do you know it's going to start the 50th

1:33:28

of the 100th time? How many times do you wear the... I'm

1:33:31

not playing cue. You don't know

1:33:33

if the denim in those jeans isn't leaking

1:33:35

into your bloodstream. To some extent, we trust,

1:33:37

we have to trust third party institutions to

1:33:39

make determinations. Except when they use force. How

1:33:42

about that? That's not... they do use force. We trust

1:33:44

the police officers. We trust the... We do, eh? We

1:33:46

do. We on the left trust the police. To some

1:33:48

extent, we do. If somebody's breaking into your house, you

1:33:50

can be the most defunded person in the world. Who

1:33:55

are you going to call? Are you going to call your neighbor? Are you

1:33:57

going to call Joe Biden? Are you going to call Obama? like

1:34:00

Panthers, you're going to call the... Okay, so tell me this,

1:34:02

tell me this then, because the core issue here is

1:34:05

use of force as far as I'm concerned. We

1:34:07

examined some of the weeds around that. Politicians

1:34:12

throughout the world, and this would be true on

1:34:14

the conservative side now, in the aftermath of the

1:34:16

COVID, tyranny,

1:34:19

because it was more a tyranny than a

1:34:21

pandemic, are now

1:34:23

saying that we actually didn't force anybody to take

1:34:26

the vaccine. People are driven by the search for

1:34:28

better, but when it comes to hiring,

1:34:30

the best way to search for a candidate isn't to

1:34:32

search at all. With over

1:34:34

350 million global monthly visitors, Indeed's

1:34:36

matching engine helps you find quality

1:34:38

candidates fast. You get to ditch

1:34:40

the busy work. Use

1:34:42

Indeed for scheduling, screening, and messaging so

1:34:44

you can connect with candidates faster. It's

1:34:47

not just about hiring faster though. 93%

1:34:50

of employers agree Indeed delivers the highest

1:34:52

quality matches compared to other job sites.

1:34:55

According to a recent Indeed survey, don't

1:34:58

search match with Indeed. Enlisteners

1:35:00

of this show will get a

1:35:03

$75 sponsored job credit to get

1:35:05

your jobs more visibility at indeed.com/podcast.

1:35:07

Just go to indeed.com/podcast right now and

1:35:10

support our show by saying you heard

1:35:12

about Indeed on this podcast. indeed.com/podcast

1:35:15

terms and conditions apply.

1:35:18

Need to hire? You need Indeed. So

1:35:22

what do you think of that claim? So let's define

1:35:24

force. I think it's technically true, but I think it's

1:35:26

silly. What do you mean it's technically

1:35:28

true? Define force. Technically true in that, in the

1:35:30

United States at least, I think the idea of

1:35:32

what they tried to do, they weren't able to

1:35:34

do it because the Supreme Court shot it down,

1:35:36

was Biden tried to make it so that OSHA,

1:35:38

who is the body that regulates job safety, could

1:35:41

make it so that employees had to get vaccinated,

1:35:43

or, I think it was, get tested, before they'd lose

1:35:45

their job. Okay, does that qualify as force? That's

1:35:48

why I said technically. Yeah, I know, but I'm

1:35:50

just a serious question. I mean because we need

1:35:52

to define what constitutes force before we can, it

1:35:55

seems to me... Could argue it's a type of force, sure. I

1:35:57

mean I think it'd be silly to say it's nothing. It is a type of

1:35:59

force. Is it the same as a cop telling you, you have to do

1:36:01

this, or you're going to be killed? No, but it's on the

1:36:03

spectrum. Sure, of course. Yeah. It's as much

1:36:05

force as the mRNA, mRNA

1:36:08

vaccines, or vaccines? Sure.

1:36:11

It is a type of force, and the mRNA. OK,

1:36:14

OK, so I really think I really think

1:36:16

the problem was with the COVID response. I

1:36:18

really think the problem was the use of

1:36:21

force. I mean, I can understand to some

1:36:23

degree, although I'm very skeptical of

1:36:25

the pharmaceutical companies and far more skeptical than

1:36:27

your insistence upon the utility of consensus might

1:36:29

lead me to believe you're skeptical of them,

1:36:32

which is surprising, I would say, given that

1:36:34

you're a cop. I'm very skeptical of them.

1:36:36

That's why I'm glad there's multiple companies, multiple

1:36:38

countries, multiple academic institutions do research and the

1:36:40

FDA. Yeah, I'm very skeptical. You should be

1:36:43

in any private system. You should be skeptical

1:36:45

of every private company, of course, whether we're

1:36:47

talking media, pharmaceuticals or automobile manufacturers. Yeah. But

1:36:50

skepticism doesn't mean a blind adherence to the

1:36:52

complete total opposite of whatever it is they're

1:36:54

saying. Right. There are undoubtedly like you look

1:36:57

at Alzheimer's research, there has been groundbreaking impact

1:36:59

on drugs to treat Alzheimer's research over the

1:37:01

past three years that five years ago, not

1:37:03

these drugs even existed. And yes, I mean,

1:37:06

how about if you're skeptical of anyone who's

1:37:08

willing to use force to put their doctrine

1:37:10

forward? So you're skeptical

1:37:12

of literally every single person and political

1:37:14

ideology ever to have existed

1:37:16

in all of humankind. Some degree

1:37:18

of force, you would undoubtedly

1:37:21

believe this, right? Some degree of force is

1:37:23

probably necessary for any kind of cohesive society.

1:37:25

Right. No, I don't believe that. Of course

1:37:27

there is. No, even if you had a

1:37:29

tribe of 100, 120 people. If

1:37:33

somebody was if somebody was stealing something,

1:37:35

right. You have to punish that person.

1:37:37

I said earlier that that becomes

1:37:39

complicated when you're dealing with the psychopathic

1:37:42

types. Right. So that's a

1:37:44

complication. But I would say generally speaking

1:37:46

that the necessity to use

1:37:48

force is a sign of bad policy.

1:37:50

And no, I don't think see, I'm

1:37:52

not particularly Hobbesian. I don't think that

1:37:55

the only reason people comport themselves with

1:37:57

a certain degree of civility in civilized

1:37:59

societies is because they're terrified by the fact

1:38:01

that the government has a monopoly on force

1:38:03

that can be brought against them at any

1:38:05

moment. I think that keeps the psychopaths in

1:38:07

line to some degree, but I think that

1:38:10

most people are enticed into a cooperative relationship

1:38:12

and that formulating the structures that make those

1:38:14

relationships possible is a sign of good policy.

1:38:16

I've got to, I have to ask, because

1:38:18

I have watched a lot of your stuff

1:38:20

in the past. I remember

1:38:22

you speaking very distinctly on this, that

1:38:24

for instance, when two men are communicating

1:38:27

with each other, there is an

1:38:29

underlying threat of force that puts

1:38:31

on the guardrails those particular social interactions.

1:38:34

Yeah, the threat of force is don't

1:38:36

be psychopathic. How

1:38:39

broad is psychopathic here? How are we defining it? Well,

1:38:41

I can define it. Sure, yeah, go for

1:38:43

it. Well, a psychopath will gain short-term advantage

1:38:46

at the cost of long-term relationship. Okay.

1:38:48

That's really the core issue. Well, you

1:38:51

made a reference to something like that

1:38:53

earlier in your discussion when

1:38:55

you pointed out that people claim

1:38:57

to be motivated, let's say, by principle,

1:39:00

but will default to short-term gratification more

1:39:02

or less at the drop of a

1:39:04

hat. Yeah, yeah, yeah, exactly. Well, the

1:39:07

exaggerated proclivity to do that is at

1:39:09

the essence of psychopathy. So it's a

1:39:11

very immature— I'm curious, with this definition

1:39:13

of psychopathy, does that mean— It's the

1:41:16

definition of psychopathy. It's not an ad hoc one,

1:39:18

it's not mine. That's the core

1:39:20

of psychopathy. Okay, I'm not—in the United States, I

1:39:22

think we call it all ASPD now. No,

1:39:25

it's separate from—that's antisocial personality disorder.

1:39:27

I thought that's consumed psychopathy and

1:39:29

sociopathy. Psychopathy is—no, psychopathy is more

1:39:32

like some—it's more the pathological core

1:39:34

of antisocial personality disorder. Okay, maybe

1:39:36

that might be true, okay. So

1:39:38

that's a better way of thinking.

1:39:40

Like the worst—a small number of criminals

1:39:43

are responsible for the vast majority of crimes. It's

1:39:45

1% commit 65%, something like that. Do

1:39:48

you think psychopathy is something that can be

1:39:51

environmentally induced, or do you think this is

1:39:53

core to a person? It's both. So,

1:39:56

for example, if you're disagreeable, like you are,

1:39:58

by the way— One of

1:40:00

your proclivities, if you went wrong, would be

1:40:02

to go wrong in an antisocial and psychopathic

1:40:04

direction. That's more true of men, for example, than

1:40:06

it is for women. That's why men are

1:40:08

more likely to be in prison by a lot.

1:40:11

It's 10 to one, 20 to one generally,

1:40:13

depends on the particular crime, with it

1:40:16

being higher proportion of men as the

1:40:18

violence of the crime mounts. So you

1:40:20

can imagine on the genetic versus environment

1:40:22

side, so imagine that when

1:40:25

you're delivered your temperamental hand

1:40:28

of cards, you're going to

1:40:30

have a certain set of advantages that go along

1:40:32

with them that are part, parcel of the genetic

1:40:34

determination. And there's going to be a certain set

1:40:36

of temptations as well. So for

1:40:38

example, if you're high in trait neuroticism, you're

1:40:40

going to be quite sensitive to the suffering

1:40:42

of others and be able to detect that

1:40:44

that's useful for infant care. But the cost

1:40:47

you'll pay is that you'll be more likely

1:40:49

to develop depression and anxiety. And

1:40:51

if you're disagreeable, if you're

1:40:53

disagreeable, extroverted and unconscientious, then

1:40:55

the place you'll

1:40:59

go if you go badly is in

1:41:01

the psychopathic or antisocial direction. And there

1:41:03

are environmental determinants of that to some

1:41:05

degree. Sure. Genes express themselves in

1:41:07

an environment. I agree. I'm

1:41:10

just curious for the definition of psychopathic, for short term

1:41:12

gain at the expense of long term relationship. Really? That's

1:41:15

probably the best. Yeah. When you

1:41:17

look at stuff like people that are self destructive, say people that

1:41:19

engage in behavior, at least like obesity, is that like a type

1:41:21

of psychopath to you? Or is that like something different? Or how

1:41:23

do you define these types of things, I guess? Or how do

1:41:26

you view that type of thing? Well,

1:41:28

the no, no, there

1:41:31

is an overlap in that

1:41:33

addictive processes, one of which

1:41:35

might lead to obesity, do have

1:41:38

this problem of prioritization of the

1:41:40

short term. But the

1:41:42

distinct so that overlaps with the short

1:41:44

term orientation of the psychopath. But a

1:41:46

psychopath is, see an obese

1:41:48

person isn't gaining anything from

1:41:51

your demise to facilitate

1:41:53

their obesity. Right. So

1:41:55

there's a predatory and parasitical element to

1:41:57

psychopathy that's not there in other addictive

1:42:00

short-term processes. Do you think, is it possible

1:42:02

that there are things, because then to circle

1:42:04

back to the tribal example I gave, isn't

1:42:07

it possible that people can commit harms against other

1:42:09

people, or they're not necessarily gaining from their

1:42:11

demise, but it's just some other sort of gain?

1:42:13

So for instance, like I'm talking to some person,

1:42:16

I'm just gossiping or shit talking to another

1:42:18

person. I'm not necessarily feeling good that I'm trashing

1:42:20

them per se. I'm feeling good because this group

1:42:22

of friends might view me more favorably because I have

1:42:24

like a gossip or something to share with

1:42:26

them. Well, but that's the gain right there. And

1:42:29

you are contributing to the demise of the people who

1:42:31

you're gossiping about. But I think there's like, I feel

1:42:33

like there's fundamentally different type of thought process between like,

1:42:36

I want to tell you something juicy about this guy

1:42:38

because it'll make you like me, versus I want to

1:42:40

do something juicy about this guy because I hate this

1:42:42

guy, and I want him to like have a worse

1:42:44

reputation among people. I feel like there's different drivers for

1:42:46

that. I would say, that's an interesting distinction.

1:42:48

I would say probably that

1:42:51

the hatred induced malevolence

1:42:53

is a worse form of malevolence

1:42:55

than the popularity inducing malevolence. Yeah,

1:42:57

the only reason I- The

1:43:00

only reason I bring that up is because

1:43:02

I feel like a lot of malevolence that

1:43:04

we have social guardrails for is that type

1:43:06

of like selfish malevolence where you're not, I

1:43:09

would argue even the majority of malevolence in

1:43:11

the world usually people acting selfishly or being

1:43:13

inconsiderate, not necessarily like, I hate this guy.

1:43:15

Yeah, I think that's right. I think that,

1:43:17

well, that's why Dante

1:43:19

outlined levels of hell, right?

1:43:21

Yeah, well, exactly that. And I

1:43:24

mean, that book was an investigation into

1:43:26

the structure of malevolence, right? He put

1:43:28

betrayal at the bottom, which I think

1:43:30

is right. I think that's right because

1:43:32

people who develop post-traumatic

1:43:34

stress disorder, for example, which almost

1:43:37

only accompanies an encounter

1:43:39

with malevolence rather than tragic

1:43:41

circumstances, they are often betrayed

1:43:44

sometimes by other people, but often by themselves.

1:43:46

And yes, there are levels of hell, and

1:43:48

you outlined a couple there. So

1:43:51

I guess then my question is just that if you

1:43:53

have people, so the kid that steals an orange from

1:43:55

a stand, not because he hates the shop owner, but

1:43:57

because he wants the orange or he's hungry, without some

1:43:59

type of society. It doesn't have to be the

1:44:01

government. It could be family, religious, without some type

1:44:03

of use of force. Do you

1:44:05

think that society ever exists without… Do you use force on

1:44:07

your wife? Well,

1:44:09

what are we considering force? Is withholding

1:44:12

sex, for instance, is that considered force,

1:44:14

or is saying we're going to cancel

1:44:16

a vacation? Well, deprivation of an expected

1:44:18

reward is a punishment. So you could

1:44:21

– well, but this is a serious

1:44:23

question. Look, if we're thinking about the

1:44:25

optimization of social structures, we

1:44:27

might as well start from the base level of

1:44:29

social structure and scaffold up. So

1:44:32

if a wife is upset at a husband, for

1:44:34

instance, would that be considered use

1:44:36

of force? I think a negative punishment. You're removing a

1:44:38

stimulus to punish a person for something. Yeah, would you

1:44:40

consider that like a use of force, or… I would

1:44:42

say it would depend to some degree on the intent.

1:44:45

The intent is to punish a behavior, right? If the

1:44:47

intent is to punish them, then it's starting to

1:44:49

move into the domain of force. I mean,

1:44:52

look, while we've been talking, there's been

1:44:54

bursts of emotion, right? And that's because

1:44:57

we're freeing entropy and trying to enclose

1:44:59

it again. And so that's going to

1:45:01

produce – it produces negative

1:45:03

emotion fundamentally, most fundamentally anxiety and

1:45:06

pain, and secondarily something like anger,

1:45:08

because those emotions are quite tightly

1:45:10

linked. And so within

1:45:13

the confines of a marriage, because we might as

1:45:15

well make it concrete, there are going to be

1:45:17

times when disagreements result in bursts of emotion. And

1:45:20

those bursts of emotion don't necessarily

1:45:22

have to have an instrumental quality,

1:45:24

right? It's when the emotion is

1:45:26

used manipulatively to gain an advantage

1:45:28

of that short term for the

1:45:30

person, and then maybe that's

1:45:32

at the expense of the other person, or

1:45:34

even at the expense of the person's

1:45:36

own future self. And then it

1:45:38

starts to tilt into the manipulative –

1:45:41

there's a tetrad of

1:45:43

traits. So narcissism,

1:45:47

Machiavellianism, that's manipulativeness. Narcissism is the

1:45:49

desire for unearned social status. That's

1:45:51

what you'd gain, for example, if

1:45:53

you were gossiping and elevating your

1:45:55

social status. Machiavellianism,

1:45:57

narcissism, psychopathy, that's –

1:46:00

predatory parasitism and

1:46:02

those culminating in sadism, and

1:46:04

that cloud of negative emotion that's released

1:46:08

in the aftermath of disagreement

1:46:10

can be tilted in the direction of those

1:46:13

traits and that's when it becomes malevolent and

1:46:15

that's when the problem of force starts to

1:46:17

become paramount because I think I think that

1:46:19

you're I think that your

1:46:21

fundamental presupposition was both Hobbesian and ill-formed

1:46:24

I do not believe that the basis

1:46:26

for the civilized polity is force now

1:46:28

You're saying that you know, you can't

1:46:31

abjure the use of force entirely and

1:46:33

I would say unfortunately, that's true,

1:46:35

like I said earlier. But if the

1:46:38

policy isn't invitational, if I

1:46:40

can't make a case

1:46:43

that's powerful enough for you to go there

1:46:45

voluntarily, then the

1:46:47

policy is flawed. Now, it

1:46:49

may be that we have some cases where we

1:46:51

can't do better than a flawed policy because we're

1:46:54

not smart enough and maybe the incarceration

1:46:56

of criminals with

1:46:58

a long-term history of violent offenses is

1:47:00

a good example of that. We

1:47:02

don't know how to invite those people to play. They

1:47:06

have a history, generally from the

1:47:08

time they're very young children from the

1:47:10

age of two of not being able

1:47:12

to play well with others And it's

1:47:14

a very very intractable problem There's no

1:47:16

evidence in the social science literature at

1:47:18

all that hyper aggressive boys

1:47:21

by the age of four can

1:47:23

ever be socialized in the course

1:47:25

of their life. The

1:47:27

penological evidence suggests that if you have

1:47:29

multiple offenders, your best bet is to

1:47:31

keep them in prison till they're 30

1:47:34

and the reason for that is it might be delayed

1:47:36

maturation, you know, biologically

1:47:38

speaking. But most criminals start to burn

1:47:40

out at around 27. So

1:47:43

it spikes, the big spike is when puberty

1:47:45

hits, and then there's stability

1:47:48

among the hyper-aggressive types. So

1:47:51

actually what happens is the aggressives at four tend

1:47:53

to be aggressive their whole life And then they

1:47:55

decline after 27. The

1:47:58

normal boys are not aggressive spike

1:48:00

at puberty and go back down to

1:48:02

baseline. Right. And so you don't really

1:48:04

rehabilitate people at present for obvious reasons.

1:48:06

I mean, look at the bloody places.

1:48:08

They're great schools for crime, by and large,

1:48:11

but if you keep them there until they're old enough,

1:48:13

they tend to mature out of that. Even

1:48:17

the worst of them tend to mature out

1:48:19

of that predatory, short-term oriented

1:48:21

lifestyle. So yeah, that's

1:48:24

the first issue. Yeah, I agree.

1:48:26

I agree. So, fundamentally, to

1:48:28

clear my

1:48:31

stance up, I

1:48:33

agree that fundamentally you're not

1:48:35

building society on force. If

1:48:37

for no other reason, because there'd be so much friction,

1:48:39

it would fly apart at the seams, right? You can't

1:48:41

force. You get resistance. Yeah. Fundamentally, we're building

1:48:43

off of cooperation. You want to invite people to participate

1:48:45

in society. I agree with that. I just, I feel

1:48:47

like once you start to hit certain threshold or certain

1:48:49

points, and you've got so many different types of people

1:48:51

involved, at some point, we're

1:48:53

going to have to have force around the edges and

1:48:56

the guardrails just to make sure that we don't allow…

1:48:58

You know, I really like tit for tat systems. Yeah.

1:49:01

Tit for tat is probably a really important

1:49:03

part of our evolutionary biological history and an

1:49:05

important part of the animal kingdom. And I

1:49:07

think to some degree, that tit for tat

1:49:09

punishment is important. That's force or justice. You

1:49:12

can call it what it is. No, no, no. I'm

1:49:14

curious what you think. I'm very, there's a very serious

1:49:16

question because the tit for tat,

1:49:18

the tit for tat is very bounded, right?

1:49:20

It's like you cheat, I

1:49:22

whack you. Then I cooperate. Yeah. So, and

1:49:25

I do think that there's a model there

1:49:27

for what we actually conceptualize as justice, like

1:49:29

you don't get to get away with it,

1:49:32

but the goal is the reestablishment of the cooperative endeavor

1:49:34

as fast as possible. Of course, I agree. But in

1:49:36

a reductionist way, we're kind of just using justice here

1:49:38

as a stand in for force, right? Well, I don't,

1:49:40

I don't, I don't.

1:49:42

So there are different types

1:49:44

of tit for tat systems, right? You've got tit for

1:49:46

tat, you've got tit for two tats, you've got, there's

1:49:49

all sorts of types of systems where maybe you'll let

1:49:51

somebody make a mistake one or two times, but you

1:49:53

can't have a tit, tit, tit, tit, tit system because

1:49:55

then somebody can come in and take advantage of it.
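[Editor's note: the strategies being contrasted here come from the iterated prisoner's dilemma, and the point about exploitation can be sketched in a few lines of Python. The payoff values, the ten-round length, and the alternating exploiter are illustrative assumptions, not anything from the conversation: plain tit for tat retaliates after one defection, while the more forgiving tit for two tats waits for two defections in a row, so an opponent who never defects twice consecutively can farm it indefinitely.]

```python
# Sketch of tit for tat vs. tit for two tats in an iterated prisoner's
# dilemma. Assumptions (not from the conversation): standard 3/5/1/0
# payoffs, 10 rounds, and a hypothetical exploiter that defects every
# other round so it never triggers tit for two tats' retaliation.

C, D = "cooperate", "defect"

def tit_for_tat(opponent_history):
    # Open by cooperating; afterwards mirror the opponent's last move.
    return C if not opponent_history else opponent_history[-1]

def tit_for_two_tats(opponent_history):
    # More forgiving: only retaliate after two defections in a row.
    if len(opponent_history) >= 2 and opponent_history[-2:] == [D, D]:
        return D
    return C

def alternating_exploiter(opponent_history):
    # Defects on even-numbered rounds only, so it never defects twice
    # consecutively and tit for two tats never punishes it.
    return D if len(opponent_history) % 2 == 0 else C

# (my_payoff, their_payoff) for each (my_move, their_move) pair
PAYOFF = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)  # each player sees the other's history
        move_b = strategy_b(hist_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, alternating_exploiter))       # (25, 25)
print(play(tit_for_two_tats, alternating_exploiter))  # (15, 40)
```

Over ten rounds, tit for tat ties the exploiter 25 to 25 because every defection is answered, while tit for two tats never retaliates at all and loses 15 to 40, which is the "somebody can come in and take advantage of it" failure mode described above.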

1:49:57

Yes, that is the problem with the compassionate left. Exactly.

1:50:00

To some extent, sure, it can be, or a problem with

1:50:02

the right that's far too forgiving of Donald Trump. But

1:50:05

I would say that that pat part,

1:50:07

you can call it justice. I think

1:50:09

justice is a perspective of force, right,

1:50:11

where some people might consider a force

1:50:13

to be just the cop that arrests

1:50:16

the murderer, and other people might consider

1:50:18

that force, that pat, to actually be

1:50:20

injustice because the murderer was responding to

1:50:22

environmental conditions, blah, blah, blah, or with

1:50:24

financials, whatever. That's a stupid theory, that

1:50:26

responding-to-environmental-conditions theory. Because here's

1:50:29

why it's not. I mean, because essentially

1:50:31

that's Rousseau. So here's why. So

1:50:33

if you assume that there's a causal

1:50:35

pathway from early childhood abuse to

1:50:37

criminality, let's say, which is the test

1:50:40

case for environmental determination

1:50:42

of the proclivity for the

1:50:45

exploitation of others, then

1:50:47

it spreads, it unfolds near

1:50:50

exponentially in populations. That isn't what

1:50:52

happens. So here's the data. Most

1:50:55

people who abused their children were abused as children.

1:50:59

But most people who are abused as children

1:51:01

do not abuse their children. And

1:51:03

the reason for that is because if you were

1:51:05

abused, there's two lessons you can learn from that.

1:51:08

One is identify with the abuser. The

1:51:11

other is don't ever write exactly. And

1:51:15

what happens? And if this didn't

1:51:17

happen, every single family would be abusive to the

1:51:19

core very rapidly. What

1:51:21

happens is the proclivity for violence

1:51:23

itself, it dampens itself out

1:51:25

as a consequence of intergenerational

1:51:27

transmission. So the notion that

1:51:29

privation is a pathway to criminality,

1:51:31

that's not a well-founded formulation. And

1:51:37

there are an infinite number of counter examples

1:51:40

and they're crucial. Some

1:51:42

of the best people I know, and I mean

1:51:44

that literally, are people who had childhood

1:51:47

so absolutely abysmal that virtually anything

1:51:49

they would have done in consequence

1:51:52

could have been justified. And they

1:51:55

chose not to turn into

1:51:57

the predators of others. That

1:52:00

was a choice and often one that

1:52:02

caused them to reevaluate themselves right down to

1:52:04

the bottom of their soul. And

1:52:07

so that causal association of relative

1:52:09

poverty even with criminality, we know

1:52:11

also, we know this too. You

1:52:15

take a neighborhood where there's relative poverty,

1:52:17

the young men get violent. They

1:52:19

don't get violent because they're all hurt and they're

1:52:21

victims. They get violent

1:52:23

because they use violence to seek social status.

1:52:26

So even in that situation, it's not, oh, the poor,

1:52:29

poor. It's no wonder they're

1:52:31

criminal because they need bread. It's like,

1:52:33

sorry, buddy. That's not how it works.

1:52:35

The hungry women feeding their children don't

1:52:37

become criminals. The extraordinarily ambitious

1:52:40

young men who feel it's unfair that

1:52:42

their pathway to success is blocked become violent. And

1:52:46

that's 100% well

1:52:48

documented and generally by radically left-leaning scholars.

1:52:50

Sure. I don't disagree with any of

1:52:53

that. Wealth inequality in areas is a much

1:52:55

better predictor of crime than

1:52:57

absolute poverty. Right, but it's a very specific

1:52:59

form of crime. It's status seeking crime

1:53:02

by young men. Right? But

1:53:05

that shows you what the underlying motive

1:53:07

is. It's not even redress of the

1:53:09

economic inequality. It's actually the men striving

1:53:11

to become sexually attractive by gaining position

1:53:14

in the dominance hierarchy. There's

1:53:16

nothing to leave big about. I think you have to be

1:53:18

really careful with that assessment though, because you can say

1:53:20

that it's not economically, it's not

1:53:22

seeking economic. Why do you have to be

1:53:24

careful? The biggest predictor of a male... Well,

1:53:26

because we're assuming that people that commit crime

1:53:28

in these types of circumstances are status seeking

1:53:30

and not trying to seek economic

1:53:33

remedy. But it might be... That's exactly what

1:53:35

we're assuming. But it might be the case,

1:53:37

for instance, that in economically prosperous areas, that

1:53:39

the men there aren't actually seeking economic prosperity.

1:53:41

They're also just trying to elevate status, but

1:53:43

they do it through economic prosperity. It's potential,

1:53:45

right? They do it with

1:53:48

a longer term vision in mind. Sure, they're trying

1:53:50

to elevate. I wouldn't disagree with that in the

1:53:52

least. But they do it with

1:53:54

a much longer time horizon in mind. And

1:53:56

we know this partly because there have been

1:53:58

detailed studies of gang members... for example, in

1:54:00

Chicago, who are trying to

1:54:03

ratchet themselves up the economic ladder, but they do it

1:54:05

with a short term orientation. Most of them think they're

1:54:07

going to be dead by their early twenties. So

1:54:10

they're trying to maximize short term gain. It

1:54:12

has nothing to do with the redress

1:54:14

of economic inequality, except in the most

1:54:16

fundamental sense. And it is status driven

1:54:18

because they're looking for comparative status. I

1:54:22

don't think any human being has baked in

1:54:24

a desire to seek economic prosperity. I think

1:54:26

that that's like a third order thing that

1:54:28

we look for. And fundamentally, it's probably more

1:54:30

like safety, security for ourselves, and then status

1:54:32

seeking for other things. I think that changes

1:54:34

when you have children. Well,

1:54:37

I mean, the safety and security of children. Or

1:54:39

starts to become irrelevant. I mean, depending

1:54:42

on how you view your status, right? You

1:54:46

can't do that every time we have a discussion. Well, I'm just

1:54:48

saying, one of the important things for my child is to be

1:54:50

able to send my child to a good school. I need to

1:54:52

have an elevated status, right? I need to be able to buy

1:54:54

a house in the right school district. I need to be able

1:54:56

to pay the education. Right. But you're not telling me, I hope.

1:54:59

That the driving factor behind

1:55:01

your desire to care for your children

1:55:03

is an elevation in your status. No,

1:55:06

but I'm saying that the elevation of status might

1:55:08

be what allows you to take a job. So

1:55:10

for instance, one of the biggest predictors of getting

1:55:12

married is already at a position. Well,

1:55:15

that's what I'm saying. I'm saying there's like a there's a I know.

1:55:18

Yeah. Okay, look, we're running

1:55:20

out of time. Yeah, you're good. That

1:55:25

tit for tat thing. I was just saying that

1:55:27

the tat thing. There is something underlying, built into

1:55:29

probably our genes, right? Because you see it all

1:55:31

throughout the animal kingdom, that there's some level of

1:55:33

punishment or some level of force. No,

1:55:36

but but I think I think it's the right. What's

1:55:38

this when you're the tatter, not when you're the titter

1:55:40

though, right? No, you're the titter. It's this retribution. No,

1:55:42

no, no, I don't think that's true either. Look, if

1:55:44

you read crime and punishment, for example, one

1:55:46

of the things you see that emerges

1:55:49

when Raskolnikov gets away with murder

1:55:51

and it's a brutal murder and he gets away with

1:55:53

it. It's completely clear and he has a justification for

1:55:56

it. And what happens as a

1:55:58

consequence is that that disturbs his own relationship

1:56:00

with himself so profoundly that

1:56:02

he can't stand it such that

1:56:05

when a just punishment is finally meted out

1:56:07

to him, it's a relief. And

1:56:09

that's not rare. And that is

1:56:11

like there isn't anything more terrifying. This is why

1:56:13

crime and punishment is such a great novel. There

1:56:16

isn't anything more terrifying than breaking a moral rule

1:56:18

that you thought you had the ability to break

1:56:20

and finding out that you're somewhere now that you

1:56:22

really don't want to be. And

1:56:24

then that, you

1:56:26

know, there's nothing worse in your own

1:56:29

life than waiting for the other shoe to drop. If

1:56:31

you've transgressed against a moral rule and now

1:56:34

you're an outsider because of that, you live

1:56:36

in no man's land, the fact that you

1:56:38

have just retribution coming to you, that can

1:56:40

be a precondition for your atonement and your

1:56:43

integration back into society. But it's probably important

1:56:45

to note that depending on the system you

1:56:47

exist in, those moral transgressions just aren't transgressions, right?

1:56:49

So to take it back to, I'll use

1:56:52

your leftist example, you might consider a threat

1:56:54

of force for somebody to get a vaccine to

1:56:57

be a highly immoral thing that might be a

1:56:59

transgression against some fundamental moral thing. But a person

1:57:01

left might think that they're actually satisfying their moral

1:57:03

requirement to society by doing so, much the same

1:57:05

as a child soldier or not,

1:57:08

I won't use child soldier, but maybe an older

1:57:10

person that's committing intifada or some kind of Islamic

1:57:12

terrorist thinks that they're fulfilling some moral calling as

1:57:14

well. No doubt, no doubt that that's the case.

1:57:16

That's why I was focusing in on the use

1:57:18

of force, is that I think it's a good

1:57:20

rule of thumb policy that

1:57:22

if you have to implement your goddamn

1:57:25

scheme with force, then there's something

1:57:27

wrong with the way it's formulated. Does it bother

1:57:29

you with every religion? We could

1:57:31

have used a purely invitational strategy to distribute

1:57:33

the vaccine. It would have been much more

1:57:35

effective. And it was bad policy,

1:57:37

right? We're in an emergency, we have to use

1:57:39

force. It's like, no, no you weren't. It

1:57:43

wasn't the kind of emergency that justified

1:57:45

force, not least because behavioral psychologists have known

1:57:47

for decades that force is actually not

1:57:50

a very effective motivator. It produces a

1:57:52

vicious kickback. So one of the

1:57:54

things, this is going to happen for sure, is

1:57:57

that the net deaths from people's

1:58:00

stopping using valid vaccines

1:58:02

as a consequence of

1:58:04

general skepticism about vaccination is going

1:58:06

to cause, in my estimation, over

1:58:08

any reasonable amount of time, far

1:58:10

more deaths than COVID itself caused.

1:58:14

You violate people's trust in the public health

1:58:16

system at your great peril, and

1:58:18

you do that by using force, and we did that.

1:58:21

And so you can see already

1:58:23

that there's hordes of people who

1:58:25

are vaccine-skeptic, this generalized skepticism that

1:58:27

to some degree you were rightly

1:58:29

decrying. It spreads like

1:58:31

wildfire, and no wonder, because if you make

1:58:33

me do something, I'm going to be a

1:58:35

little skeptical of you for a long time.

1:58:37

You know, this conversation, we're here voluntarily, like

1:58:39

we're trying to hash things out, and in

1:58:41

good faith, you know, but neither of us

1:58:43

compelled the other to come here, and neither

1:58:45

of us are compelled to continue. And so

1:58:47

that makes it a fair game. And

1:58:50

a fair game is something that everyone can be invited

1:58:52

to. And I suppose that's something

1:58:54

that's neither right nor left, you know, hopefully,

1:58:56

right? Something we could conceivably agree on. And

1:58:58

I also think that I don't

1:59:00

have any illusions about the fact that there

1:59:03

are people on the right who would use power

1:59:05

to impose what they believe to be their core,

1:59:08

their core, what

1:59:10

would

1:59:12

you say? Their core

1:59:14

idol. Of course,

1:59:17

the temptation to use force is

1:59:20

rightly pointed to by the leftists

1:59:22

who insist that power is the

1:59:24

basis for everything. It

1:59:26

isn't the basis for everything; that claim is wrong,

1:59:28

really wrong. But it's a severe

1:59:31

enough impediment to progress forward that we have

1:59:33

to be very careful about it. So

1:59:35

look, we have to stop. I

1:59:38

want to know if there's anything else you'd like

1:59:41

to say before we stop, because unfortunately, we have

1:59:43

to stop rather abruptly. And so I

1:59:46

think we got

1:59:48

pretty far into this. What

1:59:51

are you trying to accomplish? Let's start with that. We

1:59:54

found out a little bit

1:59:56

about who you are. I mean,

1:59:59

you formulated your

2:00:02

proclivity, to

2:00:04

some degree, in terms of delight in

2:00:06

argumentation, or facility at it, which you certainly

2:00:08

have. The

2:00:10

danger in that, of course, is that you

2:00:13

can be oriented to win arguments rather

2:00:15

than to pursue the truth. And that's

2:00:17

the danger of having that facility for

2:00:19

argumentation. But what are you hoping to accomplish

2:00:22

by engaging in conversations like this

2:00:24

in the public sphere? Elevation

2:00:26

of status, you know? That's one possibility.

2:00:28

No, I feel like, I think debate

2:00:31

or argumentation is good because it forces

2:00:33

two sides to make their ideas somewhat

2:00:35

commensurate to the other. If

2:00:37

two people are having a conversation, they have to be able

2:00:39

to communicate said ideas to the other person, otherwise it's just

2:00:42

a screaming match. And I think there

2:00:44

is a good, for the sake of, like just

2:00:47

being bipartisan or having a collection of people

2:00:49

in a certain area and having different people

2:00:51

together, just that in and of itself without

2:00:53

anything else happening, I think produces a good,

2:00:55

at least for a democratic society. For

2:00:58

instance, I would agree that schools,

2:01:01

maybe not faculty, but administrators are very, very,

2:01:03

very, very far left today. Dangerously

2:01:05

so, I don't have to talk to you about this, obviously. But

2:01:08

I think part of the responsibility is that, I think,

2:01:10

rests at the feet of conservatives who, instead of maintaining

2:01:13

participation in the system, decide that they're gonna

2:01:15

throw their hands up and disengage. When

2:01:17

I go with I. Or be forced out. Or be forced out,

2:01:19

sure. Yeah, in my case, for example. That's fine, yeah, sometimes it

2:01:21

can happen. But I think that rather

2:01:24

than accepting being forced out, or rather than

2:01:26

encouraging other people to disengage, the engagement has

2:01:28

to happen. It can't be a, I'm losing

2:01:30

faith in the system, so all of us

2:01:32

are gonna be here to do our thing.

2:01:34

It has to be like, no, we're gonna

2:01:36

be here in these conversations, whether you like

2:01:38

it or not, because in a democracy, sometimes

2:01:40

the guy you don't like wins. Sometimes the

2:01:42

policy that you don't like is enforced. Sometimes

2:01:44

a guy you don't like is somebody you

2:01:46

have to share an office or a classroom

2:01:48

with. And we need to be okay with

2:01:50

that. And I'm worried that the internet is

2:01:52

driving people into these very homogenous, but very

2:01:54

polarized groups. The data on that, by the

2:01:56

way, aren't clear. Like,

2:01:58

whatever's driving polarization doesn't seem

2:02:00

to be as tightly related to the

2:02:02

creation of those internal bubbles as you

2:02:04

might think. Like, I've looked

2:02:07

at a number of studies that have

2:02:09

investigated to see whether people are being

2:02:11

driven into homogenized information bubbles,

2:02:13

and it isn't obvious that that's the

2:02:15

case directly, although it's

2:02:17

the polarization that you're pointing to that

2:02:19

you're concerned about that seems to be

2:02:22

clearly happening. So why

2:02:24

that is, well, that's a matter

2:02:26

of, you know, intense speculation. Like,

2:02:28

the homogeneity, I feel like

2:02:30

it's not so much... this is not

2:02:32

research based at all, just a total feeling. Yeah,

2:02:34

so, I mean, the feeling that I

2:02:36

have is it's not necessarily that homogeneity has increased

2:02:39

It's that homogeneity has increased as a

2:02:41

byproduct of the bubbles becoming larger So

2:02:43

for instance, it might be that I'm

2:02:45

from Omaha, Nebraska. It's a town,

2:02:47

or a city really, in Nebraska, right? It might have been that

2:02:49

50 years ago There

2:02:52

are bubbles for living in Omaha and there

2:02:54

are different bubbles for living in Lincoln and

2:02:56

there might be bubbles in Toronto or neighborhoods

2:02:58

in Toronto or there might be bubbles in

2:03:00

Vancouver, but now as the internet exists and

2:03:02

things become more internationalized, these

2:03:04

bubbles... it's not just a bubble that exists

2:03:06

in these cities; now the bubbles have come together.

2:03:08

And as a result of that coming together... Sure.

2:03:12

Yeah, or a globalization problem, or

2:03:14

a communication problem. But you're right, this issue where somebody

2:03:16

might be in a particular city or state and

2:03:18

have a really strong opinion about what AOC says

2:03:20

but they don't know anything about their local political

2:03:22

scene And I think that that's an

2:03:24

issue because the bubbles have gotten so large and they're

2:03:26

encompassing so many people now and you're expected to have

2:03:29

like a similar set of beliefs between all of these

2:03:31

different people now that might live in totally different places

2:03:33

That's, I think, a big issue. Mm-hmm. We're running

2:03:36

into... Yeah. Well, that could be. We'll close with

2:03:38

this I think that might be one

2:03:40

of the unintended consequences of hyper connectivity

2:03:43

Right, is that we're driving levels of

2:03:45

connectivity that get rigid and

2:03:47

that we also can't tolerate. Yeah.

2:03:50

All right. Well, that's a good place to stop. Well,

2:03:53

thank you very much for coming in today. That's

2:03:55

much appreciated. And you're

2:03:58

a sharp debater, quick on your

2:04:00

feet, so that's fun to see, and

2:04:02

I do think that your closing remarks were

2:04:04

correct, that

2:04:07

the alternative to talking

2:04:09

is fighting. So

2:04:12

when we stop talking it's not like the disagreements

2:04:14

are gonna go away. We

2:04:16

will start fighting. Yeah,

2:04:18

right. And

2:04:20

talking can be very painful because

2:04:23

a conversation can kill one of your cherished

2:04:25

beliefs, and you will suffer for that, although

2:04:27

maybe it'll also help you but

2:04:30

the alternative to that death by

2:04:32

offense is death. Right.

2:04:36

So better to substitute the abstract

2:04:38

argumentation for the actual physical combat.

2:04:40

Right. Sometimes like the worst

2:04:42

relationships are the ones where couples fight a lot. Yeah,

2:04:44

the really bad ones are where they don't fight. And

2:04:47

then all of a sudden there's a blow-up. Better that a couple

2:04:49

fight and reconcile. Exactly. Yes,

2:04:52

exactly. All right. All

2:04:54

right. Well, that was good. Thank you very

2:04:56

much. To everybody on the YouTube platform, thank you very

2:05:01

much for your time and attention, and we're

2:05:03

going to spend another half an

2:05:03

hour or so on the daily wire side.

2:05:06

So if you're inclined, tune into that. And

2:05:08

we'll find out a little bit more about the

2:05:10

background of our

2:05:12

current interviewee, Destiny. See you

2:05:15

later, guys.
