Sam Harris || Free Will (Part 2)

Released Thursday, 4th March 2021

Episode Transcript

0:14

Okay, well, there

0:16

are all

0:21

of the causes of

0:24

what I'm conscious of are

0:26

first unconscious, right,

0:29

So, like, you know, I'm not aware

0:32

of what my brain is doing at

0:34

the synaptic level. But let's just be

0:36

again, I'm not even a dog I'm not a I'm

0:39

certainly not a dogmatic materialist. I'm not

0:41

you know, like I can bracket the ontology

0:44

here, But let's just talk in terms of materialism

0:46

too. And Dennett certainly

0:48

is a materialist, right, So most most

0:51

scientific compatibilists are materialists. So let's

0:53

just grant materialism. And

0:56

its deterministic flavor

0:59

for this kind of conversation, right? It's like, my

1:01

mind is what my brain is doing in

1:03

this moment. Right, So if I'm going to

1:05

get to the end of the sentence, it's

1:07

because of micro changes

1:10

at the level of neural circuits that accomplish

1:13

that, right? So the grammar of human

1:15

language,

1:17

in my case English, is somehow encoded

1:20

in the physical substrate of my brain, as

1:22

it would be in the physical substrate of a robot

1:25

that was also speaking English successfully, although

1:27

it just would be a very different kind of computer.

1:29

So what we're talking about is

1:32

information processing in a physical system.

1:34

In my case, the computer's made of meat. In

1:36

a robot's case, it will be made

1:38

of silicon. And

1:44

in neither case, is

1:47

there something extra which

1:50

is emerging or being

1:52

added that is giving a degree

1:54

of freedom beyond just the impressive

1:58

complexity of the

2:00

system in dialogue with its

2:02

environment. I think there is. I

2:04

think that is. Let me like pinpoint

2:07

precisely what I think that extra thing is. You

2:09

know, cognitive control includes

2:12

things like implementation intentions. If we could build

2:14

a robot to have the capacity

2:17

for implementation intentions and what

2:19

I mean by that is error correction, the ability to

2:22

take its current... Because you're right, you

2:25

know, in the moment, you know, we

2:27

don't really have free will, but we have the capacity

2:29

to shift our behavior in the future

2:31

so that we can learn from our mistakes, so

2:34

that we can even make moral reasoning

2:36

decisions. You know, turtles, chimps,

2:38

apes and robots right now don't

2:40

really have a great capacity for

2:43

moral reasoning about an

2:45

action they already did so they can change their behavior

2:47

in the future. To me, that conscious

2:49

control is free will. It's

2:52

free will. But I don't

2:54

think I can convince you to use

2:57

that label for that phenomenon. Is

2:59

that right? Well, it's

3:01

again you're either changing the subject

3:04

or like either you're going to interact with the thing people

3:06

think they they really think they have, right,

3:09

or you're gonna or you're gonna grant Okay, they

3:11

don't have that and we're going to talk about

3:13

this new thing. I mean, so, I like, there's

3:15

there's no doubt in my mind that there is a difference

3:18

that's that's rational to care about between

3:22

you know, voluntary and involuntary action or an

3:24

ability to regulate emotion or not. I

3:26

mean, there are people who have brain damage who you

3:28

know, just blurt out everything they're thinking. They

3:31

can't stop doing that, right, So, and

3:33

psychopaths are morally

3:35

blind, right? Yeah. So there are people

3:37

who are ethical and people who are unethical. There are people who

3:39

are sensitive to certain things and not We could

3:42

you know, you could be more or less intelligent. Like all of this,

3:44

all this is descriptively true

3:46

of human beings, and none

3:48

of it requires free will in the

3:51

common sense to

3:53

understand or to acknowledge.

3:57

And so I can

3:59

conserve the data of human

4:01

experience, all the while repudiating

4:05

free will as an incoherent

4:07

idea. Many

4:10

people worry, but what is

4:12

truly novel about

4:14

what I'm arguing for is

4:18

that you can recognize this subjectively

4:21

too. So the only reason

4:23

why we're talking about free will, the only reason why anyone

4:25

cares about this topic, is because people

4:28

are having an experience. They

4:30

think they're having an experience of being

4:33

a self that can

4:35

author its actions.

4:38

So the experience of having free will

4:40

and the experience of being a self,

4:43

being a subject in the middle of experience,

4:47

being a thinker, in addition to thoughts

4:49

themselves right or feeling

4:52

that one is those things, that's

4:55

just those are two sides of the same coin, right

4:57

it is. You know, people think that

5:01

they are a subject in the middle of experience.

5:03

They don't feel identical to

5:06

experience itself. They feel like they're appropriating

5:09

experience from a point of view internal

5:12

to their bodies. I mean, actually, the truth is most people

5:15

don't even feel identical to their bodies. They

5:17

feel like they have bodies, they feel

5:19

like they're in a body. They feel like they're

5:21

a subject in their head most

5:23

of the time. Right now, that is a

5:26

meditation successful meditation absolutely

5:29

proves to you, from the first person side, from

5:32

the experiential side, that that's a false

5:34

point of view, right, And that is

5:36

the point of view that gives

5:39

motivation to this claim

5:41

about free will, because that is what that's

5:44

how you feel when

5:46

you feel you are the

5:48

conscious upstream

5:51

cause of the next thing

5:53

you think and do, because you're not

5:55

noticing that the next thing you think

5:58

or intend to do is simply coming

6:01

out of the darkness behind you that

6:03

you can't inspect, right,

6:05

and that it is genuinely mysterious

6:07

and so like, So you take a moment of conscious

6:10

deliberation. I could decide,

6:12

well, uh,

6:17

you know, I have a glass of simplest possible

6:19

case. I have a glass of water on the desk, and I could decide

6:21

to pick it up and take a drink now, right, Or I

6:23

could decide to wait. Right, this is a this

6:25

is the prototypical case of me

6:28

being in the driver's seat. You know, I'm free

6:30

to do this. It's no coercion, no one's no

6:32

one's got a gun to my head. No one's saying don't pick it up,

6:34

or pick it up right now. No inner compulsion, right?

6:37

I don't have some kind of uh

6:41

uh, you know, compulsive water drinking

6:43

behavior. So it's but you know,

6:45

and I'm a little bit thirsty, but I can choose to

6:47

resist my thirst, right, So I'm conscious of thirst,

6:49

but then I can I can consciously decide to resist.

6:52

And that seems to be me prosecuting my freedom

6:54

there, right. But the

6:56

more you pay attention to what

6:58

it's like to make that choice

7:02

out of your own free will, the

7:04

more you will discover that

7:07

it is absolutely mysterious

7:10

in every particular why

7:13

and how you do what you do, when

7:15

and how you do it, right? Like,

7:18

subjectively, I

7:21

have no idea why or

7:24

how I do any

7:26

of these things. And I have no idea why

7:28

or how one

7:31

particular moment becomes decisive,

7:33

right, So, like I could be telling myself,

7:36

all, right, now you're gonna wait, you just drank it. You

7:38

just you just had to sip a few minutes ago, you

7:40

know, you just just wait, right, And

7:43

then I think, well, actually, I'm just gonna

7:45

move now. No no no, no no wait

7:47

wait no no no, I'm gonna do it no matter

7:49

how many times I go back and forth, right vetoing

7:52

the decision that almost just got made. And

7:55

then breaking through the veto

7:57

and in the end saying, oh fuck it, I'm just going to pick up a glass of water. Right,

8:00

that every increment

8:03

of that subjectively is

8:06

totally mysterious, totally,

8:09

and and the thresholds that are being

8:11

crossed or not crossed to make

8:14

a thought or intention behaviorally

8:16

active or or

8:18

insufficient to provoke behavior,

8:20

right, all of this is totally

8:23

compatible with some guy,

8:25

some evil genius in another room,

8:28

typing in the instructions to my completely

8:30

determined and coerced

8:33

brain you know, in this case using

8:35

synaptic you know potentials.

8:38

But in the case of a robot using the

8:40

whatever code the programmers put in there,

8:44

it is there is. I

8:46

am a puppet who cannot see his strings

8:49

there right, no matter how, And again

8:52

I'm not none. None of this is to

8:54

deny that certain outcomes

8:56

in life are better than others and worth wanting.

8:59

None of this is to deny that there are ways to get

9:01

what you want out of life and ways to fail to get

9:03

what you want. None of this is to deny

9:06

that there is just a

9:09

vast landscape of experience

9:12

and we need to navigate towards

9:14

one part of it so as to be happy and functional,

9:16

and we should avoid navigating

9:18

to another. We should avoid being captured

9:20

by another part which leads to you

9:23

know, the worst forms of misery. All

9:25

of that's true, and all we can talk about

9:27

you know how to do all of that, and all

9:29

of that includes the prospect

9:32

that people can learn,

9:35

people can improve themselves, people

9:37

can can you know. It's like first

9:39

I first I didn't know how to play the guitar, and

9:41

then I did. And there's there's a pathway

9:44

between point A and point B. Right, all

9:46

of that's true. None of it requires

9:48

free will, and none of it requires that we

9:51

overlook the absolute

9:55

mystery of

9:57

our subjectivity, of our conscious subjectivity

9:59

in each moment, right, like it is, it is

10:01

just totally inscrutable, every

10:04

part of this, like like literally, you don't

10:06

you do not know what you will think

10:08

next, Right, in what sense

10:11

is that a basis for free will? You do

10:13

not know what you will think. You do

10:15

not know what it will take to make it behaviorally

10:18

active. You do not know. You

10:20

do not know what is happening

10:22

when you're second-guessing the thing

10:24

you just thought and that becomes behaviorally

10:27

active. You don't if something in you

10:29

then just suddenly pulls the brakes and says no, no, no

10:31

no. I'm like, I can't say that you

10:34

don't know why that happens when it does, and when

10:36

it works when it does, and when it fails when

10:38

it does. All of this is

10:41

mysterious, not

10:43

just a little mysterious, one hundred

10:45

percent mysterious. Right,

10:48

you have no insight into it, like

10:50

this, like the next thing right

10:52

right, like like so that in so

10:54

far, like the laughter, where

10:57

did it come from? Right? I agree?

10:59

I agree, you're making good points, and

11:02

I don't think what you're saying is wrong. But

11:04

I think you're confusing what it is to

11:06

have no free will. I think

11:08

you're confusing the hell out of people, because

11:11

I think that you made great points

11:13

that the kind of free will that matters to

11:16

humans we have all that you

11:18

know, and I think that's an

11:20

interesting project. I see the difference

11:22

between our projects all of a sudden, because I don't

11:24

think either of us are saying anything that's factually incorrect.

11:27

I think that, you know, I try to every

11:29

day, I try to, like show people, you know, the kind of free

11:32

will that matters to you as a cybernetic

11:34

system. You know, you're taking like the ultimate

11:36

universe perspective, but from like a cybernetic

11:38

perspective, all we have are stating

11:40

our initial state. I'm also taking these subjective.

11:43

So it's just not true that people

11:45

understand this. Almost no one understands

11:47

this. Dan Dennett does not understand

11:50

this. It's a big cree.

11:53

He obviously doesn't understand this. He obviously

11:56

feels like a self, right,

11:59

And that's the that is the string

12:01

upon which all of this controversy is strung.

12:03

Right if you feel like me right

12:06

that like most of the people listening to me right now are

12:08

thinking, what the fuck is he

12:10

talking about? I can like, but

12:13

here's what I'm not. Not my audience, not my audience.

12:16

That voice in your head, they get

12:18

it, that says, what the fuck is he talking about?

12:21

That? That feels like you?

12:24

That isn't you? Right like

12:26

that's not a self. What do you mean it's not

12:29

you? It is, it's you. Again,

12:31

you're a dualist when you say that. It's

12:34

no more you than the bead of sweat

12:36

that drips down your forehead. Is

12:38

you? It is an object? It isn't. I

12:41

disagree. People don't identify themselves

12:43

with their hand, but they identify themselves with

12:45

their conscious desires and motivations. So

12:47

we can have gradations of

12:50

parts of our body that people identify

12:52

themselves with. From the point

12:54

of view of consciousness, there's

12:56

simply consciousness and its contents.

12:59

Like like, I'm not saying that there aren't important

13:02

distinctions in terms of what causally

13:05

follows from certain contents. I mean, like the

13:07

bead of sweat is

13:09

truly epiphenomenal for

13:11

any project I care about, right? So

13:14

it's not going to get a lot of things done in the world,

13:16

and my thoughts might, but there's

13:19

no causal property of the sweat to like

13:21

being able to realize your

13:23

loftiest ambitions in life. It's

13:25

true, it's true. But I mean take my

13:28

loftiest ambitions, right, like, like

13:30

there there are and

13:33

this is why, this is why the concept of free will

13:35

makes no sense. So let's

13:38

say I

13:40

here's a project to which I could be purposed.

13:43

Right, I could decide

13:46

that I want to become a classically

13:49

trained musician. Right,

13:51

there are people who have given their lives to that

13:53

project, right, just from from

13:56

the earliest years of their lives, as

13:58

you know, and

14:00

then there are people who late in life presumably decide,

14:03

Okay, I've done a lot of things, and I know

14:05

I'm never going to be Mozart now, starting

14:07

in my fifties. But I

14:10

you know, let's just see what I can do here. I really,

14:13

I really want to do this. Right. This is what I

14:15

love. What I love more than anything on

14:18

earth is classical music. Right, I love

14:20

Bach more than anything. Right, Okay,

14:22

I just that's true of somebody.

14:25

None of that's true of me? Right?

14:27

Why not? Why don't I care

14:29

about Bach, right? Why don't

14:31

I care more about classical music? Why, when

14:34

I do listen to music, is it almost never classical

14:36

music, right? All of these things have reasons,

14:38

they have explanations, causally, right,

14:41

some some in my corner of the universe,

14:44

classical music is just not that interesting,

14:46

right, it's not those Those are the things

14:48

that make you who you are, though, even if

14:50

you don't know why they were caused by

14:52

environmental and biological confluence.

14:56

Great, yes, okay, so it's

14:58

deterministic, it's random, it's

15:00

something, right, it's some pattern of causation.

15:02

Right, But so what does

15:04

it mean to say that

15:07

I am free to

15:09

take a deep and

15:12

and really all

15:15

encompassing interest in classical

15:17

music right now? Like

15:20

I'm free, No one's no one's telling me you

15:22

can't apply to Juilliard

15:24

and and really just get into this. You know. It's

15:26

like like no one's saying. No

15:28

one's saying I can't give it a shot. You're not completely

15:30

free, and there are constraints. But

15:33

I wouldn't go so far as to say you have none. Okay,

15:36

but clearly I could.

15:38

I could decide to leave this podcast and

15:41

spend the rest of the day listening to classical

15:43

music, trying to figure out which instrument

15:46

I'm going to learn, finding a teacher. Yes,

15:48

there are constraints. We've got a COVID pandemic

15:50

that I'm worried about. So like I'm gonna have zoom

15:52

classes now in the cello, right, I can't.

15:54

I'm not going to do it face to face until I get

15:57

a vaccine, right? Yes, there

15:59

are things in the world that I'm navigating around, but there's

16:02

nothing stopping me from just

16:04

going all in on the cello from

16:07

this afternoon forward, just dropping

16:09

everything. I'll teach you, I played cello.

16:12

You do? My grandfather was in the Philadelphia

16:14

Orchestra for cello. Okay, so, all right, so

16:16

I could. I

16:19

could absolutely do this. I am free

16:21

to do this, right? This is

16:23

the stage upon which

16:25

my free will is going to demonstrate

16:27

itself right now. But

16:30

what's the fun, what's the book? The

16:33

problem is I have almost

16:36

no interest in playing the cello or

16:38

like so like, I can't. It's an end. The

16:41

fact that I don't is

16:43

something that I did not author.

16:46

It's a constraint. It's a constraint, for sure.

16:49

But can we zoom in on one thing? This

16:52

is where free will... When

16:54

you're talking

16:56

about devoting your life to playing the cello, not

16:58

wanting to play the cello is a little

17:00

bit of a constraint. Yes, no, it's a huge

17:02

constraint. You're right, you're right. But I think that you're

17:04

not distinguishing between first-order goals and second-order

17:07

goals. And I think that what gives us free will as

17:09

a human species, what you asked me, what's

17:11

that extra thing? It's the wanting to want, you

17:13

know. It's it's our capacity to use implementation

17:16

intentions to get out of bed in the morning and go to the gym,

17:18

even though we don't want to. I don't want to get up in the morning

17:20

and go. I built a whole workout thing

17:22

on my porch here. I don't want to do it, you

17:24

know. But my freedom lies in my capacity

17:27

to use my consciousness to change

17:29

my environment in all sorts of ways where it's easier,

17:31

where the constraints aren't as big. You

17:33

don't see that as an important part of free

17:35

will that matters to people. Okay,

17:38

I just see no reason to call it

17:40

free will. It's when I

17:43

inspect what it's like to be me here

17:45

ye again, when I think about myself from the

17:47

first person side, and when I triangulate

17:50

on myself and think about myself from a third person

17:52

side. At no point

17:55

does free will, any

17:57

version of it, seem

18:01

aptly applied to this situation. So

18:04

I am free to play the cello. I

18:06

am free to do it right, there's no

18:08

question. So yes, I can talk about the difference

18:10

between being coerced and not coerce.

18:13

That's a you know, it's a political fact

18:15

about me that I'm not coerced to play or

18:17

not play the cello. Right, it's

18:20

a social fact. But

18:23

you still do it even if you're unmotivated. Right,

18:26

I can't account for why

18:28

I don't want to play the cello or

18:30

I want so many other things so much more

18:33

that I functionally don't want to play. I mean,

18:35

the truth is, I potentially

18:37

want, given an infinite amount

18:39

of time and an infinite amount of energy.

18:42

Well, then I potentially want an infinite

18:44

number of things, right, Like all kinds of things are

18:46

interesting, and I would love to have all these skills

18:48

that I don't have, But I just don't.

18:50

I don't feel that. It's just

18:52

not when I rank

18:55

order my priorities, you know, either

18:57

consciously or unconsciously, And there

18:59

can be a difference there, obviously, you

19:01

know. We can. We can think we want certain things

19:03

and tell other people we want certain things, but act

19:05

as though those things aren't anywhere

19:08

near the top of our list of wants. But

19:14

I have no idea why I don't

19:17

like classical music more than I do. I

19:19

have no idea why, and I do have

19:22

Actually I do have the second order desire.

19:24

I wish I liked it more than I like it. I

19:26

mean... I'll teach you,

19:28

I'll teach you. It's like, I think

19:31

I would be less of a philistine if

19:33

I knew more about it, if I

19:36

wanted to know more about it, appreciated it, and certainly

19:38

if I could play, you know, the cello,

19:40

that would be a wonderful thing to

19:42

be able to do. Uh. And

19:45

yet I

19:47

am as I am with respect to classical

19:49

music. Now,

19:53

just imagine

19:56

I decided by force of this conversation you

19:58

said something in conversation that

20:01

inspired me to be different

20:03

than I'm tending to be right like this, this really

20:06

would be, really,

20:11

the ultimate instance of free

20:13

will, because this would be a kind of

20:16

you know, just a

20:18

surmounting of all my prior

20:21

tendencies into this new uh

20:24

commitment. I could decide, you know what, Scott,

20:28

you just on the basis of this

20:30

conversation, I'm going to take up the cello, like

20:33

right now, I'm going to do it. You inspired

20:35

me, right, Yeah, So

20:38

just email me, Scott, I want you to give me the

20:40

name of someone who can teach me. And I'm like, I'm

20:42

shopping for cellos right now. Like, you

20:44

know, you get the point. When

20:47

I look at what it would be like for

20:49

me to experience that

20:51

sort of awakening

20:55

in my own consciousness that's aimed

20:57

in the direction of classical music and cello

20:59

man, right, right, that

21:02

would be totally compatible

21:05

with the evil genius in the next room saying

21:07

all right, we're going to give him the cello desire.

21:10

here. We're going to give him, you know,

21:13

he's now going to be

21:16

just fixated on on you know, the

21:18

difference between Mozart and Beethoven, and

21:21

it's all, like, just going to download

21:23

the classical-music

21:25

infatuation program, right, that

21:28

would be what that would be compatible

21:30

with. It would not demonstrate anything

21:33

like free will. It'd

21:35

be like what came over me?

21:38

You know, I've had fifty-three

21:40

years to discover in myself a

21:43

desire to play the cello. If I discover

21:45

it right now, what the

21:47

fuck has come over me? Right? This

21:50

comes from outside of consciousness.

21:52

This is the point. I see your point,

21:55

But I'm still gonna you know, because

21:57

I know you don't like, let's just agree to disagree.

21:59

So I'm really taking this seriously. I

22:02

think that what people, you know, you're using

22:04

these extreme examples that sound good and

22:06

and they prove a certain point. But the

22:08

point, let me tell

22:10

you what my point is. Yeah. Picking

22:13

up a glass of water is not an extreme example and

22:15

not extreme. Extreme was not the right

22:17

word. I take back extreme. I take back extreme.

22:19

You're using examples that

22:21

that make it sound like you've won the argument.

22:24

But let me just say what my point

22:26

was, actually. It's this: the cybernetic

22:28

system wants to reach a goal that it

22:30

desires. So you're right, you can pick examples

22:33

where it doesn't. But let's pick you know, within the

22:35

realm of freedom of things you do care

22:37

about and you are motivated for. Don't

22:39

you think that's a sensible uh sense

22:42

of the term free will? That you have the free will to

22:44

make You know, you want to write a book. You want to write

22:46

the Moral Landscape Part two, you know, back

22:49

to future parts. You know what, and

22:51

you write that book. You know your capacity

22:54

to write the book and to use your consciousness

22:56

to make that a reality to exist in the

22:58

world. You don't see that as the kind of

23:01

free will that people truly care about.

23:04

Well, do people care? Yes, people

23:06

care about realizing

23:09

their goals in life, right, and there

23:11

are causal ways to succeed

23:14

at that and causal ways to fail at that, right,

23:16

And that's all it's like. Yes, so learning to play the cello

23:19

is not going to happen to me by accident, right,

23:21

So my denying free will is

23:23

not the same thing as endorsing fatalism,

23:25

where one would expect Okay, you

23:28

know, I mean this is this is how people misunderstand

23:31

this criticism of free will. They think, well, okay,

23:33

if I have no free will, then why do anything?

23:35

Why not just wait to see what happens? Right?

23:38

And so if I'm going to wait to see if I

23:40

accidentally learn to play the cello, we

23:42

know what's going to happen there. I'm not going to learn to play the cello,

23:45

right? Like, so, the only

23:47

way to learn is to intend

23:49

to learn, to practice, to seek instruction,

23:52

all of that. Right. So, people

23:54

care about outcomes in life

23:57

that are worth caring about. They want good relationships

23:59

rather than bad relationships. They want to understand

24:02

things rather than be confused all the time. All

24:04

of that. But none of

24:06

that requires free will to talk about, right,

24:09

And that's some ultimate free will,

24:11

and behavioral regulation is part

24:13

of it, right, Like again, there's

24:15

a difference between someone who can

24:18

defer gratification for long

24:21

enough to actually get something done,

24:23

as opposed to just you know, gobbling up everything

24:25

in the refrigerator, you know, the

24:27

moment it presents itself. So

24:31

it's it's all. But again,

24:33

all of that can be understood. There

24:35

are genuine paradoxes here which

24:38

are interesting to think about, which have ethical

24:40

implications, and which are completely

24:43

ignored the moment you embrace

24:46

the compatibilist framing here

24:48

and also again and the subjective

24:50

insight is completely ignored because what

24:53

happens to you when you recognize

24:56

that free will doesn't make any sense

24:58

subjectively, or what has

25:00

to happen to you in order for you to have that

25:02

recognition is you have to recognize

25:05

something about the way

25:07

consciousness is and the way the mind is.

25:10

And it's incredibly freeing

25:13

to recognize that, right, and to recognize

25:15

that is the antidote to a tremendous

25:18

amount of psychological suffering. Right.

25:20

And so let's say I do something

25:24

that is incredibly embarrassing,

25:27

right, you know, Like I say, I'm

25:29

giving a public talk in front of two thousand people,

25:32

and uh, you

25:34

know, I I spill my water

25:36

over you know all of the Actually I've

25:38

done this. I was at a public talk in front of two thousand people

25:41

and knocked over a water glass

25:44

and it just spilled over all the you know,

25:46

the equipment that was, you know, connected

25:48

to my microphone, you know, at that podium right

25:51

right, So what

25:54

what reaction do I have to doing

25:56

that? Right? Like, like how long do I stay embarrassed

25:58

for? Well, there's one way of feeling

26:01

about oneself and one's freedom to do

26:03

otherwise and one's you know, and just kind

26:05

of the integrity of selfhood that

26:09

leads one to feel like, fuck,

26:12

I'm such a fool, you

26:14

know, I'm just like this moment says

26:16

a lot about me, Right, this is like, how

26:18

did I become such a schmuck who would get up for a

26:20

public talk and knock his glass of water? And

26:23

all these people see me, they've seen me do it, and maybe

26:25

it's on video, and fuck,

26:27

you know, and then so like now open

26:30

the bottomless pit

26:32

of self-mortification and

26:34

go as far down as you want right

26:37

like. That's that's a certain kind of person

26:39

to be, right You could

26:41

also be someone who instantly

26:44

notices that it's funny, right

26:46

like, who sees it, enjoys

26:49

it, enjoys it from the point of view of the people watching

26:51

it. And you could actually have the internal

26:53

lightness about oneself in the moment,

26:56

so that it's just you

26:58

realize you have a good story to tell

27:00

later in the day, and your wife is going to laugh

27:02

at it, and all of this is just more

27:05

comedy, right like, So that like

27:07

like no problem, right Like, That's a different

27:10

sort of person to be that has its own

27:12

consequences.

27:15

But another sort of person you

27:17

could be is recognizing that

27:20

that, one, in

27:22

this case, you obviously didn't intend

27:24

to knock the glass over. It says basically

27:27

nothing about you except

27:29

you know, a

27:31

failed moment of motor programming,

27:34

and you

27:36

know, apart from your worrying that you

27:38

might have some neurological disease that you

27:40

know, caused you to knock the glass over,

27:43

there's really nothing to think about any

27:45

longer, right? There's just, if it needs drying up, it's

27:47

like, is there something to dry up? You

27:49

dry it up because you're well intentioned towards

27:51

the whole project of maintaining the integrity of this

27:54

institution. But there's

27:57

there's no self in the middle of this that

28:01

just got exposed, right,

28:03

there's just consciousness and its contents,

28:05

and you're free. You're like it is.

28:08

You're in a circumstance of total psychological

28:10

freedom to just move on to the

28:12

next moment without any rumination about

28:15

the last moment, and deliver the talk

28:17

you were going to deliver.

28:20

You're free to take that glass and move it

28:22

out of the way so you're less likely in the

28:24

future to spill. There's

28:26

some intelligent response to this, But there

28:29

is no mortification, there's no

28:31

there's no place for it to land. There's just you

28:33

are the next moment is

28:35

is the world is truly

28:38

born anew in the next moment, if

28:40

you will only let it be right

28:42

and rumination and perseveration

28:45

and this mechanism that is

28:47

so common to

28:50

just keep beating yourself,

28:52

your non-existent self, up

28:54

over this last thing that happened,

28:57

which the self

28:59

that is being beaten up didn't truly

29:01

author because it just fucking happened,

29:04

right, That

29:07

is, Uh, there's a spell to be broken

29:09

there, and it can be broken. And when you break

29:11

it. You don't know what anyone's talking

29:13

about when they're talking about free will. I mean,

29:15

really, you don't. You can't. You lose sight

29:18

of the problem, right? And,

29:20

yes, but there's one piece to add

29:22

here, because there are paradoxes that are interesting.

29:25

The responsibility

29:28

paradox is real, and I

29:31

still don't know what I think about it. And

29:33

it's this, and perhaps you've heard my example

29:35

here. It's something that,

29:37

you know,

29:41

in a kind of written debate

29:43

with Dan Dennett, came up: this

29:46

idea of, you take the prototypical

29:48

case of kind of behavioral

29:50

control, of a golfer trying

29:53

to sink a short putt, right, and

29:55

when you're a bad golfer and you

29:57

fail to sink a short putt, well, you

30:01

know, you know, everyone looks at

30:03

that and says more or less, well, of course,

30:05

you know you're a bad golfer. And what do you expect? You're

30:07

going to make some of those, you're going to miss a lot of those, right,

30:09

but if you're Tiger Woods or some great

30:12

golfer, you really

30:14

should make that putt, right? And it

30:16

would seem like, it

30:19

would seem appropriate to beat yourself up over

30:21

missing it, because you

30:24

know, you make a putt of that length, you

30:26

know, nine hundred and ninety times out of a thousand

30:28

times, right, And so what

30:31

does it mean to

30:33

say that you should have made that putt if

30:35

you're a great golfer and

30:38

that, you know, this is just an analogy. The

30:40

morally relevant analogy

30:42

is, like, what does it mean, you

30:44

know, when a psychopath misbehaves you

30:47

or someone with brain damage misbehaves, well,

30:49

of course, what do you expect. This is what people

30:51

with the relevant brain damage do, right? They

30:53

they they're not they're

30:58

not competent to be true free

31:00

to behave well because they're they're they're malfunctioning

31:03

robots on some level. But when you take

31:05

a truly good

31:07

person or a truly competent person

31:10

who then does something horrible

31:13

right and then really misbehaves ethically,

31:17

that person's really responsible. You

31:19

know, that's like the true case of responsibility.

31:21

But the paradox

31:23

for me is that the

31:26

more competent

31:28

you make the person, the more

31:30

their failures to behave

31:33

well become inscrutable.

31:35

Right, So you take the best golfer

31:37

on Earth missing the

31:39

shortest putt he's

31:42

ever missed in his life. Right,

31:45

that seems to say almost nothing

31:47

about him. That seems

31:50

to be an error on

31:52

the part of the universe, right? That seems

31:54

to cry out for an explanation, which

31:57

doesn't tell you he

32:00

should have done otherwise. It tells

32:02

you that there was some noise in the system, there was some

32:04

neurological glitch. It's like whatever,

32:06

whatever happened there intervened.

32:09

It was adventitious

32:11

to his life. Like, he's going to make that putt

32:13

a thousand times in a row. Now, what

32:15

can we say about him based on the fact

32:17

that he missed this one? We can say almost nothing

32:20

about him other than he was unlucky. I

32:22

think it's a good point. So

32:25

I want to acknowledge that. When I read your

32:27

paradox of responsibility, I thought it was a really,

32:29

truly good point. It seems like we've

32:31

got it all backwards in our society. You know,

32:34

nice people well intentioned people who make

32:36

little tiny mistakes are being canceled, and

32:38

the assholes are running the country. I don't know

32:40

if you see any linkage here. Yeah,

32:42

yeah, well that you know, that's something I

32:45

haven't linked those two topics,

32:48

but yeah, I mean, there's

32:50

an asymmetry there that many of us have found totally

32:52

galling, and I think it

32:55

is

32:58

relevant to connect them, in that free

33:01

will, the concept of free

33:03

will, or the concept that someone could have done otherwise,

33:06

isn't the thing that helps

33:09

us understand what

33:12

matters here? What matters

33:15

is there are certain systems,

33:18

and you know, human and certain human brains are

33:21

among these systems, and

33:23

certain types of thinking, certain ideologies,

33:26

certain thought systems, right, that

33:28

reliably produce harm that

33:30

we should want to avoid because they make

33:33

life suck, right. I mean, these are these

33:35

are these are these are disproportionately

33:39

bad, even if there's some good to be found in there,

33:41

that the

33:44

bad outcomes are so reliably

33:46

produced and they're so unsurprising that

33:49

we should want to figure out how to avoid

33:51

all of that, right? And having a

33:53

malicious

33:57

and uninformed and sinister

34:07

and utterly

34:10

selfish person in charge of our country.

34:13

is a bad system, right? Like,

34:16

now, there's no free will. You don't need

34:18

to attribute free will to Trump, the

34:20

idea that he could be otherwise. He

34:22

can't be otherwise, right? He's

34:24

a moron, and he's

34:26

a narcissist. Right? What do

34:29

you expect him to do

34:31

with the power of the presidency? Right

34:34

now, he's not as dangerous,

34:36

he was not as dangerous as he might have been. He's not Hitler,

34:39

right, But Hitler is another bad

34:41

system that we should want to disempower,

34:44

right, and certainly not work to empower.

34:47

Right. But again,

34:49

all of this is susceptible to a

34:52

mechanistic interpretation.

34:54

We don't want to put a bad robot

34:56

in charge of our world, right.

34:59

We don't want to code

35:02

a system of artificial intelligence that

35:04

has bad ethics, right?

35:07

And we know that if we do, we're going to get bad outcomes.

35:10

But again, there's

35:13

no... and it is in fact true,

35:15

I mean, this comes back to the paradox. The

35:18

more finely you calibrate

35:21

any system toward

35:23

good outcomes, the more inscrutable

35:26

its failures to achieve those outcomes

35:28

become. You know, it's

35:31

just,

35:33

it's like, if you have a

35:38

robot that has, you know, only a

35:40

one in a billion error

35:43

rate, if you experience a

35:45

one in a billion error today,

35:47

you know, when you're interacting with that robot,

35:51

Damn, that's surprising. And

35:53

it says virtually

35:56

nothing about what the robot did in the past,

35:58

because it worked perfectly in

36:00

the past, and it says virtually

36:03

nothing about what it is likely to do in the future. And

36:06

you and the robot were

36:08

both unlucky today. You

36:11

know, you got the one-in-a-billion error, right,

36:14

So again, to remind

36:16

people what the paradox is here. The

36:19

more you make someone

36:22

seemingly responsible, really responsible

36:25

for their actions, the more you make them as

36:27

competent as they can possibly be in that

36:30

domain, right, so that they

36:32

can shoulder this responsibility of feeling

36:34

like, god, damn, I'm really culpable

36:36

for my failures, like, I should

36:39

have done otherwise, I should have made that

36:41

putt, right? The

36:43

more that's the case, the less it

36:46

seems like it really reflects them,

36:49

right, the more mysterious

36:52

the failure is. And so it's almost

36:54

like an uncanny-valley effect. It's like, most

36:56

of us live most of the time in the uncanny

36:59

valley of just the chaos

37:01

of our imperfect

37:05

calibration. Right, that's the

37:07

place where you can sort of heap claims

37:09

of responsibility onto people, and

37:11

they seem to land. And my argument

37:13

is that it never makes sense

37:16

to tell someone they should have done otherwise.

37:18

Like, I mean, this is very clear in

37:20

parenting. Like, you know, I have daughters

37:23

who I'm certainly not browbeating

37:25

about the illusoriness of free will. No, I'm trying

37:27

to raise them to be competent, self

37:30

regulating human beings. Right,

37:32

So when I say, if

37:34

I talk to one of my

37:37

daughters and I say, you know, you really should have

37:39

done otherwise, right, I mean, that's not a way

37:41

I would put it, obviously. But if

37:43

that's the implication of what I'm saying, like, the thing

37:45

is, it'd be nice if you put

37:47

your plate in the sink after

37:49

we're done eating, right? Like,

37:51

can you do that next time? It

37:53

really is much more... It's

37:56

never a claim

37:59

that in this instance, if I rewound

38:01

the universe, they might have done

38:03

otherwise, right, No, this is a causally

38:06

determined outcome

38:09

that was always going to be the way it was going to

38:11

be, even if you introduce randomness. Right.

38:13

So there's no free will here. But it

38:15

is a conversation about what

38:18

I want them to do next time, right,

38:20

And saying

38:23

that is a further input

38:25

into the clockwork of their lives.

38:28

So that will change them ultimately. It will

38:30

change them ultimately. If my daughters are going

38:32

to become civilized human beings, they

38:34

will not behave the way they

38:36

did at you know, at seven years old

38:38

or twelve years old or you know when they're

38:41

in their forties, right, and

38:44

those changes will be causally

38:46

effected on the basis of demands

38:48

imposed on them. But again,

38:55

there's

38:55

no place for the folk psychological notion

38:57

of free will to land there. It's like, you wouldn't

38:59

give your daughter any credit if she became

39:01

president of the United States some day. You would look at her not

39:03

with pride, because you'd say, well, she

39:05

didn't, you know, ultimately cause that. Well,

39:09

honestly, so, I do feel like

39:11

pride is

39:15

a virtue that

39:18

has an expiration date in

39:21

a human life. I mean, I think it's developmentally,

39:24

there's an appropriate there's like a critical period

39:27

where pride is not

39:28

an ethical error

39:31

or

39:33

a sign of psychological confusion. It's actually

39:36

something you want to get

39:38

into the code, right? So, like, I

39:40

love it when my daughters

39:43

are proud having

39:46

accomplished something that seems like a good thing psychologically.

39:49

But at a certain point, I think you clearly

39:51

want to outgrow it. It's clearly

39:53

not a durable

39:56

basis for self-

39:58

esteem. It's not a basis

40:01

for compassion for oneself and others.

40:03

It does

40:06

tend to rest on a confusion about just

40:09

what it is you can reasonably be responsible

40:12

for and what was just a happy

40:14

accident, right, And

40:16

people tend to take credit

40:18

for things that they weren't actually in control

40:21

of, and, you

40:23

know, attribute their

40:25

failures differently, so there's kind of a

40:28

delusion built into it. In the normal case.

40:30

I mean, if you're depressed, that probably flips

40:32

and then you're more realistic about

40:35

what was actually within your purview

40:39

to control. But yeah,

40:42

I don't. I don't

40:44

feel pride about anything in

40:46

my life now. I mean, I'm not like I have all kinds

40:48

of outcomes I prefer and

40:51

sometimes I realize them and sometimes I don't.

40:54

And so the

40:56

obverse of pride, of course,

40:59

is something like shame.

41:01

Shame is

41:04

an important thing to be able to feel,

41:07

but ultimately I think it

41:09

reaches its shelf life. I think you

41:11

want to be able to transcend shame

41:14

again, you know, not too early. This is an

41:16

interesting topic

41:18

and again, I

41:20

don't totally know what I believe about it, because I

41:22

think there's certainly a pathology, you

41:26

know, and a lot of danger on the other side

41:28

of losing one's sense of shame. But

41:32

I do think ultimately,

41:35

you know, there's a psychological freedom in outgrowing

41:39

pride and shame and just seeing

41:41

that there's just no basis to

41:43

feel either. Ultimately, you're

41:45

just telling yourself a story about the past. In both

41:48

cases, you're thinking thoughts

41:50

in the present that nominally

41:53

refer to the past, and they're

41:55

making you feel a certain way. You're feeling

41:57

good about if like you're watching

41:59

a movie about your past and

42:02

you're being entranced by it, and

42:04

it's kindling an emotional response

42:07

that has a certain half life, and

42:10

it's incredibly boring. In the end, it's

42:13

an incredibly boring thing to do with your attention.

42:15

On the pride side, the pleasure side,

42:18

it's a masturbatory

42:20

and self-

42:24

directed pseudo-

42:28

source of gratification, which divides

42:30

you from... Least importantly,

42:33

it

42:34

sets

42:36

up a system

42:39

of comparison between yourself and others that

42:41

ultimately is not a source of well being.

42:44

Right and like you, if you're comparing yourself favorably

42:46

to other people and feeling good about

42:49

that, you know, then five

42:51

minutes later you're going to be comparing yourself unfavorably

42:53

to other people who are doing yet more impressive

42:56

things, and you're gonna feel bad about that. Like

42:58

that, that pinballing between those two

43:00

things is not the

43:02

right algorithm to live a truly self-actualized

43:05

life. So I do think both pride and shame

43:07

ultimately get outgrown. But at

43:09

what point? That's an interesting question. I

43:12

love this transcendent view, and also the point

43:14

that you make

43:17

about hate. You know,

43:19

hate doesn't really have a place to

43:21

be programmed into the robot here once we

43:24

understand that there's no ultimate free will.

43:27

And what I don't understand though, is

43:29

you know, so, your view, absolutely, and I loved your

43:31

point about how we can increase sympathy for others when

43:33

we realize, you know that we're

43:35

not always aware of

43:38

or most of the time we're not constantly aware of

43:40

the outputs or the inputs

43:43

into our outputs. But what I don't understand

43:45

is, like, in

43:47

applying that in your own life, you

43:49

don't apply that when you talk about Trump. I mean,

43:51

you hate him, you get really

43:53

angry, but you don't say things like,

43:56

we should have sympathy for Trump, you know? Yeah,

43:59

I mean so, well, there's certainly moments

44:01

where I'm

44:04

captured by something that I find

44:06

so despicable that I'm,

44:08

you know, I'm blind to

44:11

my own philosophy here, like, I'm just lost. I'm

44:13

lost in thought, you know, I'm

44:15

identified with a moment

44:17

of finding Trump despicable.

44:20

Say, and yeah,

44:22

so I'm just in the dream, you know, I'm

44:24

asleep and dreaming and unaware. Human. Yes,

44:27

yeah, so I'm not a Buddha.

44:30

But much of the time, a

44:33

different thing is happening. And it's

44:36

not personal. It's

44:39

not that I hate Trump personally.

44:43

It's that I hate... And again, this is all slightly

44:45

anachronistic because now he's no longer president, so

44:48

I'm basically never thinking about

44:50

him now, which is wonderful. But it's

44:52

not that I hated him personally, it's that

44:55

I hated the fact of him,

44:58

right. The fact that we made this sort

45:00

of man president was

45:02

so terrible, I mean, for all

45:05

the things it said about us as a society

45:07

and all of the risks we were then

45:10

running for four years, to

45:12

literally put

45:14

someone in charge of our nuclear

45:16

codes who couldn't figure out why we

45:18

can't use nukes. We've got them,

45:20

why not use them? Right? Like, take that

45:23

one factoid about Trump that he had

45:25

to be repeatedly admonished by

45:28

his Joint Chiefs of Staff

45:30

that it was a good thing that

45:32

we had reduced our warhead count

45:34

from the sixties. And he asked, why don't I

45:37

have as many? When he heard that, you know,

45:39

Kennedy had ten times the number of bombs

45:41

that he had, Trump thought

45:43

that was a problem. Like, why can't I,

45:45

why did he have more bombs? Right?

45:47

Like, the guy was so

45:50

dangerously ignorant of in

45:53

this case, just the

45:55

the game theory of nuclear deterrence,

45:57

and all the rest,

45:59

you know, which our very lives,

46:02

and the lives of our children, and the fate of civilization

46:04

depend on someone not being that catastrophically

46:07

ignorant about that thing, you know,

46:10

about which only

46:12

he has the responsibility at this point,

46:14

for the next four years. Right? So that fact alone

46:17

joined to, as you said, ten

46:19

thousand other facts about this man.

46:22

Right. Reliably. He's reliably

46:25

awful, right? He's as reliably

46:28

bad as a malfunctioning

46:31

robot. Right. You

46:33

can predict it with ninety-eight percent

46:35

accuracy. Yeah. So

46:37

it's a little bit analogous

46:39

to if we elected

46:41

a rhinoceros to be president.

46:44

I'd be fucking tearing my hair out

46:46

over how awful that is. At

46:49

no point in my imagining that

46:51

the rhinoceros can be anything other than a rhinoceros,

46:54

And at no point in my imagine, at no point

46:56

am I wishing suffering

46:58

upon the rhinoceros. I don't hate the rhinoceros.

47:01

The rhinoceros just shouldn't be president of the United

47:03

States, right Like. That's

47:06

a catastrophe to do that. And

47:09

in some sense, we elected a rhinoceros

47:11

president, and so I spent a long time

47:13

complaining about that because

47:16

of all the things to which that was

47:18

connected in our society and in

47:20

our possible future that were worth worrying about.

47:23

I hear you, and I hope you understood my point too.

47:26

You know, you never

47:30

said these words, such as, you

47:30

know, I think everyone's coming

47:32

from such a place of hate with Trump. You have to understand

47:35

he doesn't have ultimate free will, and I think we need

47:37

to have more sympathy for him like you would

47:39

with a rhinoceros, you know, while

47:42

still taking action to prevent him

47:44

from ruining the world and pressing

47:46

the nuclear button. I've never

47:48

heard you like say that in a sympathetic

47:51

way towards Trump applying your own

47:53

principle. Do you see my

47:55

point here? So, I

47:57

mean Trump is someone who

48:00

is, I find it unusually hard

48:02

to feel compassion

48:04

for him because

48:07

he

48:09

seems damaged in ways

48:12

that specifically render him

48:15

impervious to suffering.

48:18

Right? So he's, like, not someone who seems

48:20

to suffer anything ever,

48:23

right? Now, maybe, I'm sure he probably does,

48:25

but that requires an extra

48:28

act of imagination. Imagine

48:31

what he's like, you know, in the privacy of his

48:33

mind when he's suffering. He actually, I

48:35

mean, he doesn't seem comfortable. He doesn't

48:37

seem like he

48:39

has a nice mind to inhabit, but he

48:42

seems to be missing a module that

48:45

would naturally provoke compassion.

48:47

Free will. Well, no,

48:49

it's not free will. It's just,

48:51

like, does

48:54

he care about relationships?

48:56

Does he love people? Does he care about like when

48:58

someone close to him dies? Does he feel

49:01

grief? I don't know, actually.

49:03

he seems like he might be damaged in precisely

49:05

the way that would prevent someone

49:08

from ever shedding a tear

49:10

about anything, right? Like, he's

49:12

a kind of cartoon for

49:14

real, right? It's not just that

49:17

I've made him a cartoon because I don't know enough

49:19

about the man, or because

49:21

I find him so despicable that

49:24

I'm just not disposed to think clearly

49:26

about him. He actually does seem like a very

49:29

unusual person to me, And

49:32

I wouldn't say this about even

49:34

objectively worse people, like you know, Osama

49:36

bin Laden, right? Like, Osama bin Laden seemed

49:39

like a much more normal person

49:41

to me than Donald Trump, albeit

49:45

one who was committed to specific ideas that

49:47

I found much more reprehensible,

49:50

was much more dangerous. It's a good thing

49:52

we killed him, all of that, like, but

49:54

much more. I understand

49:56

his psychology, much more

49:58

than I understand Trump's. There's a randomness

50:03

there, isn't there. There's

50:06

a malfunctioning-robot-

50:06

like aspect to Trump, where

50:08

he's got a few obvious

50:11

pieces of his code. He wants

50:13

to be rich, he wants to be famous,

50:15

right, and then on top of that, he

50:18

just wants to win. Yeah,

50:21

he wants whatever he imagines he

50:23

is winning. But

50:27

then, I

50:30

mean, there's almost no reason to talk about the

50:32

man now. But the

50:35

thing that got

50:37

most under my skin was

50:40

that half of our society apparently

50:43

couldn't see what was wrong with him,

50:45

right? Like, literally, they just couldn't see the thing

50:48

I was seeing. In every

50:51

instance, it was

50:53

completely unmediated. Unmediated, too:

50:55

It's not that I'm believing the

50:57

New York Times profile, credulously

51:00

believing the libtard profile on

51:02

him, and the man is being besmirched

51:04

by a lying, you know, fake

51:07

news media. No. Everything

51:10

I feel about Trump was

51:13

fully communicated by seeing

51:15

him at the podium talking

51:18

like, so it was unmediated, you know,

51:23

and in many cases virtually

51:23

unedited. You know, you sit down and watch two

51:25

hours of Trump. That's

51:27

who I'm reacting to, right, And the fact

51:29

that half of the country,

51:36

I mean, either couldn't see

51:39

it or felt, you know, upon

51:41

seeing it, that that's

51:44

exactly who they wanted to be president,

51:44

You know, uh, that there was

51:46

something you know, crazy making

51:48

about that, because it's it just was a

51:50

transparent act of lunacy from my point

51:52

of view, I mean, just like, okay, let's just drive

51:56

you know, human history toward

51:58

a cliff, you know, as as

52:00

quickly as possible and see what happens.

52:02

Right, It's like it seems like an act of

52:04

chicken with with to

52:07

be playing with human history. And

52:11

so anyway, I was reacting to that much

52:13

more than than I was motivated, because the truth

52:16

is, I see, like I can,

52:18

I can I can have the reaction

52:20

that the people who like Trump, I

52:22

can actually run that reaction in emulation,

52:26

Like I can see that he has moments where he's

52:28

genuinely funny, genuinely charming,

52:31

genuinely charismatic, like I like,

52:34

I get that, right, and so, and I can feel

52:36

my monkey brain light up

52:39

in precisely the way every everyone else's

52:41

brain lights up when someone who is

52:43

charismatic and charming and funny and

52:45

taking risks shows up. Right,

52:48

So, you know, like when he's at the debate and he said,

52:50

you know, he's he says that completely, you

52:52

know, he's he's challenged by Megan

52:56

Kelly for all the heinous things he said

52:58

about women, you know, in

53:00

his history as a buffoon, and he

53:02

says, you know, only Rosy O'Donnell,

53:05

right, you know, and that did us a huge laugh line and

53:07

it completely undercuts everything

53:09

Megan Kelly thought she was going to achieve in that

53:11

moment journalistically, and it wins him the debate.

53:14

Like there's part of me that just

53:16

finds that hilarious too, right, Like,

53:19

so I get I get that. But to

53:21

not have seen the bigger picture here, and to

53:23

not have seen that this man

53:25

is actually a sociopath with respect

53:27

to his ethics,

53:30

and to not have cared about any of that right,

53:32

to not have done the moral arithmetic,

53:35

and imagine who you'd have to be to

53:37

have run the fraud of Trump

53:40

University and have to have

53:42

defrauded elderly people, to have encouraged

53:45

elderly people to max out their credit

53:47

cards to get your fake knowledge

53:49

it's your fake you know, your you know, you

53:51

know scam university. To

53:54

have been that that one data point alone

53:56

in his backstory should have been absolutely

53:58

disqualified. Know, like we

54:00

should never have heard from him again after that.

54:03

It's because it says so much

54:05

about who you are and who

54:07

you're likely to be in further moments, like that's not

54:09

just a missed putt, that is a

54:12

million puts missed in

54:15

a row, right, Like we we know

54:17

you can't play golf after you miss that many

54:19

putts, not even in a row all at once, Sam,

54:22

all once. I won't even say hit

54:24

a row. It's so much

54:26

once that you can't even focus on one

54:28

at a time. It's over, the conversation

54:30

is over, you know. But

54:33

so you don't hate him, No,

54:35

No, in my in

54:38

my clearest moments,

54:40

I don't hate anybody, I mean,

54:42

And yet there are people who I

54:45

would I would sanction that we kill,

54:47

right. It's like, I mean, it's like again, it's not I'm not a pacifist.

54:50

There are people who should. It's like we

54:52

you know, we've invented guns for good reason,

54:55

right, And it's there are people

54:57

who I hope I'm not one of them after this

54:59

interview, by the way, No, and I'm

55:01

against the death penalty, right. So it's like, once we have

55:03

safely confined somebody, then there's

55:05

no reason to kill them, right, I mean, then then then that's

55:07

a major ethical lapse. But no, there

55:10

there are you know,

55:13

there are acts of self defense that are totally rational.

55:16

And you know, you know, someone comes into your

55:18

house and wants to kill you and your kids, by

55:20

all means, shoot that person in the head, right, Like,

55:22

that is what guns are for, and

55:25

you should do it. You should do it if it's a grizzly bear,

55:27

and you should do it if it's a person who

55:30

seems to think he has free will to

55:32

kill you and your kids. Right. So it's like that that's

55:34

that's morally uncomplicated in

55:36

my view. But again, hatred,

55:39

this is this is an asymmetry that I

55:41

think you were referencing. People

55:45

wonder, well, what about

55:47

love? If people have no free will, how do you love anybody?

55:50

Right? And this is a beautiful

55:52

asymmetry between hatred and love,

55:54

at least in my view, which

55:57

is hatred really does require

56:00

and an attribution to

56:02

someone that they could and

56:05

should have done otherwise. Right, Like, it's like

56:07

you believe they really are the authors of their

56:10

bad actions, and the moment you find

56:12

that they have a brain tumor or whatever it is that

56:14

is exculpatory. Then you then

56:17

you change your response. You think, oh, wow,

56:19

you know I did hate Charles Charles Whitman

56:21

for getting getting up

56:23

on that clock tower and killing fourteen kids.

56:26

But once they

56:28

performed an autopsy on him and found a

56:31

massive brain tumor pressing on his amigdala,

56:34

well, then okay, then I recognized you

56:36

can't hate the guy. He was unlucky. He was unlucky

56:38

as he was as unlucky as the kids he shot,

56:41

right, I mean, that's just that's terrible, right

56:45

on some level, that happens to everybody.

56:48

Once you recognize that free will is

56:50

an illusion. But

56:54

love doesn't require an account

56:57

of human behavior in that way that demands

56:59

that people be the true upstream

57:02

cause of all of their actions. Love

57:05

just requires that you

57:07

really just two things that you that you care

57:10

about the difference between suffering

57:12

and happiness, right for for the for

57:15

yourself and others. Right like you, you want

57:17

people to be happy, and

57:20

that is really the the what

57:23

it is to love someone. You want them, we

57:26

want to relieve their suffering. You want to

57:28

maximize their happiness. And additionally,

57:31

you take a certain you find a certain

57:34

pleasure and well being in their

57:37

company, right, you want to be with them, right, So like

57:39

you're in the presence of someone who you want to be with, who

57:41

makes you happy and who you want

57:43

to be happy, right, and you want to and

57:46

and that and that that positive

57:48

social orientation and

57:51

that direct enjoyment of the state of

57:53

love in your own mind. That's

57:55

what we mean. I would argue, that's what we

57:57

should mean when when we say

57:59

we love someone body, and

58:01

none of that requires a belief that

58:05

they are the you know, they've

58:07

pulled themselves up by their own bootstraps

58:10

causally speaking, whereas

58:13

hatred really does on some level,

58:15

like the moment you notice that. I

58:18

mean, the example I always use is of

58:20

Oude Hussein because he's

58:24

it's not as hackneyed as referencing Hitler. But

58:27

you take a truly evil person who's

58:29

done just objectively heinous things

58:32

and just walk back the timeline of their

58:34

life. You know, it's just like you know, you

58:37

know, you take Hitler. Hitler as a forty

58:39

year old was absolutely somebody who

58:42

was just fit for nothing other than a

58:44

bullet. But you

58:46

know, walk him back to when he

58:49

was four years old. He's

58:51

just a little kid who's going to become Hitler? Right,

58:53

But he's an unlucky little kid. He's

58:55

got bad jeans or bad parents, or bad

58:57

society or something's kind of beat

59:00

him into the shape we now recognize

59:02

with the bad mustache and the and and

59:05

the the the dangerous

59:07

beliefs. Right, And

59:10

at what point along the way did he get

59:12

free? Will? Well, at no point? So at

59:14

what point along the way are

59:17

you just that you've become justified in hating him

59:19

and and feeling no compassion for

59:21

him? You know, I would argue at no point,

59:23

although it admittedly it's very hard once

59:26

he becomes an adult to find

59:28

any kind of basis for a compassion. But

59:30

he didn't make himself, you

59:33

know, And you

59:35

know I certainly you know I certainly would have uh

59:39

killed Hitler had I could have

59:43

at any moment along the way

59:45

to to to stop his his real

59:47

harms. But it's an interesting question

59:50

like this is this is a Ricky Gervais

59:52

bit, right, Like if you get get into a time

59:54

machine and go back to kill Hitler, what

59:56

if you land back with him as a kid? Are

59:59

you going to kill the four year old Hitler? Right?

1:00:01

Do you've killed Trump? At each four, Well,

1:00:05

well, no, but I mean it's it's it's

1:00:08

easy. It's I wouldn't say I would have killed Trump

1:00:10

at any point, but it's easy to

1:00:12

say I would have killed Hitler at a certain point

1:00:14

given the harms he caused. But

1:00:17

killing the four year old Hitler's seems

1:00:19

like, you know, an act

1:00:21

of pure psychopathy. Right, He's a four year old kid

1:00:23

who hasn't right, Right,

1:00:26

So there's no point is

1:00:29

is compassion unjustified?

1:00:32

And at no point is hatred justified?

1:00:34

In my view? And every so

1:00:36

and so the so then the question is in

1:00:41

the in these occasions where violence

1:00:43

seems not only justified but necessary,

1:00:45

right, killing Hitler? You know, I thought,

1:00:48

you know, assassinating Hitler when it could actually do the

1:00:50

world some good. You

1:00:52

just you just never need hatred for

1:00:55

that to be motivated, right,

1:00:57

Like, it doesn't, It doesn't, it doesn't require

1:01:00

but but hatred does require

1:01:03

a false description of authorship

1:01:06

or and and human agency. And that's why,

1:01:09

and that's why I would say it is possible

1:01:11

to get rid of hatred without getting rid

1:01:13

of love. Psychologically, well,

1:01:16

I like the spirit of a lot of what you're saying, and even

1:01:18

your your teachings and your and the whole

1:01:20

point of loving kindness meditation, you can have

1:01:23

love, you know. Sharon Salzburg beautifully shows

1:01:25

how we can meditate on our enemies, right, you

1:01:28

know, you know we

1:01:30

can wish them well because if

1:01:32

they are clearly Trump wasn't

1:01:35

well you know what I mean, like like, if

1:01:37

we wish Trump well, that's only going to

1:01:39

be the benefit of the world as

1:01:41

well. Yeah, yeah, you know, so I

1:01:43

really love the spirit. Yeah, I love that.

1:01:46

No, and I love these higher principles. Okay,

1:01:48

So the is all to say, you have

1:01:51

an interesting discussion. You say

1:01:53

the separation between science and human values

1:01:56

is an illusion. Now, why you

1:01:59

for the first time in history thousands, you

1:02:01

know, one, hundreds hundreds of years, no one's been able to

1:02:03

put a guillotine on the on Hume's guillotine.

1:02:06

And and how are you able to

1:02:08

finally take us from an is to

1:02:10

an aught? Can you walk me through the logic

1:02:13

of how you think that that's possible to go from

1:02:15

facts to values. Well,

1:02:18

I think it is a trick of people are getting

1:02:20

hung up on language. It is a kind of a semantic

1:02:23

distinction that I just don't think

1:02:25

we need to be taken in by

1:02:29

and It really is not something that Hume himself went

1:02:31

deeply into. I mean it was, it's much more of an aside

1:02:34

in his writing, this distinction, and it's

1:02:36

been blown up into like this foundational

1:02:39

notion of of meta ethics

1:02:41

somehow that you can never get

1:02:44

an is from an aught. I mean, there

1:02:47

are a few ways to see what's wrong with this.

1:02:50

I mean one is, you

1:02:52

can never get an is without

1:02:54

certain aughts, Right, you can never make

1:02:57

a factual claim about

1:02:59

the world without following certain intellectual,

1:03:04

logical rational oughts

1:03:06

or values, right, buying into thems

1:03:08

like what why should why should we value evidence?

1:03:11

Why should we value logical consistency?

1:03:14

Like if someone doesn't value

1:03:17

logic at all, what logical

1:03:19

argument could you invoke so as to convince them

1:03:21

that they should value it? And if someone doesn't value

1:03:23

evidence, what evidence could you provide to

1:03:26

suggest that they should value it? Right, It's just

1:03:28

at a certain point there are certain axioms,

1:03:31

Certain things are axiomatic, and that's not

1:03:33

a problem. We can't do science without

1:03:35

it, we can't do math without it, we can't do anything

1:03:37

without it. And yet people are acting

1:03:40

like if you need any

1:03:42

of that to get your morality started, there's

1:03:45

no such thing as morality, or they change.

1:03:47

There's a double standard here that we should notice,

1:03:49

like that there people are finding people

1:03:52

take the fact that there can be moral

1:03:54

controversy has convinced

1:03:57

most people that there's no such

1:03:59

thing as objective but

1:04:01

there can be controversy about anything. There can

1:04:03

be fat there can be controversy

1:04:05

about about physics. We

1:04:08

would never have moved from that the mere fact

1:04:10

of controversy to

1:04:12

to to the claim that there's no objective physics.

1:04:16

You know, the fact that you know, the fact that the

1:04:18

Taliban disagree with us about

1:04:20

morality is invoked as a reason

1:04:23

to believe that morality has to

1:04:25

be relativistic. There are no universal

1:04:27

truth claims to be made about right and wrong or good and

1:04:29

evil because millions

1:04:31

of people over here don't agree with us. Well,

1:04:33

millions of people over here don't agree with us about physics

1:04:35

either, or evolution either, because

1:04:37

they know they're just they're not adequate to the

1:04:39

conversation, you know, because they're just, they're

1:04:42

they're obscured by they're they're

1:04:45

mired in other belief systems. That

1:04:49

never causes us to wonder whether or not

1:04:51

biology may just be made up or a cultural

1:04:54

construct or relativistic or

1:04:56

right, which is, unless word

1:04:59

haven't been complete, unless

1:05:01

we've been completely taken in by some kind of postmodernism.

1:05:04

But so

1:05:07

that double standard is worth noticing.

1:05:09

That you can't get to facts

1:05:11

without certain without indulging certain

1:05:14

values, at least implicitly. But

1:05:19

you know, I usually I just find

1:05:21

a different starting point, which is okay, Fine,

1:05:23

Let's say there's no such thing as oughts,

1:05:25

there's no shoulds, there's no there's

1:05:28

no morality, there's no values. Right, let's

1:05:31

just deal with a universe of facts. Let's just

1:05:33

start there. Well, it

1:05:35

is a fact that we live in a universe

1:05:39

where there's a

1:05:42

a vast landscape

1:05:44

of possible experience on

1:05:47

offer, and we have a navigation

1:05:49

problem, right, like we can we can navigate

1:05:51

toward places on this landscape

1:05:54

that are more and more sublime, where

1:05:58

you know, hairless apes like ourselves

1:06:01

have better and better experiences. Collaboratively,

1:06:04

creatively, we

1:06:06

we produce you know, brilliant works

1:06:09

of art and have the free time to enjoy

1:06:11

them, and we have epiphanies that

1:06:13

that that you know, cause you

1:06:15

know, the hair on

1:06:18

the back of our neck to stand up not from fear

1:06:21

but from you know, the rapture

1:06:23

of just just how beautiful the cosmos

1:06:25

is, right, and we can have and we have no

1:06:28

idea how good all of that can

1:06:30

get right with like that we could, we genuinely

1:06:32

cannot see the horizon line

1:06:34

there, but we just know we can push

1:06:36

into this area where cooperation

1:06:41

and and curiosity

1:06:45

and joy and loving kindness

1:06:48

and all of this just gets like, just gets tuned

1:06:50

up more and more and more, and the music gets

1:06:52

better, and the and the and the people

1:06:55

and people like myself who don't yet

1:06:57

understand how good music can be learned

1:06:59

more and more about out all that and get better to

1:07:02

get more adequate to that conversation. And

1:07:05

then over here we can have

1:07:07

failed states where sadistic

1:07:10

monsters torture people for pleasure

1:07:13

and nothing fun

1:07:17

happens at all apart for just

1:07:20

you know, more creative sadism uh

1:07:23

and the benefits thereof you know, accruing

1:07:25

to the few creative sus sadus

1:07:27

who get to uh stay

1:07:31

on top of that heap of misery before someone figures

1:07:33

out how to murder them, and and the

1:07:35

and the cycle continues, right and

1:07:38

then you know there and they're you know, over

1:07:40

here there are cures for diseases because

1:07:42

we have the free time to find them, and we have the

1:07:44

the the insight into the you know, the mechanism

1:07:47

that would allow us to find them, and we get

1:07:49

vaccines quickly. And over here

1:07:52

people don't even know the germ theory of disease,

1:07:54

and they kill people for witchcraft and

1:07:56

and you know, cut out the tongues of blasphemers,

1:08:00

you know, because there they think that might

1:08:02

be a cure for the bubonic plague, as

1:08:04

we did for centuries in Europe. Right,

1:08:06

So there are two very different attractor

1:08:10

states on this landscape that

1:08:12

we already know a lot about because

1:08:15

we've lived in both of them, uh

1:08:18

sometimes for centuries, and

1:08:20

we have a navigation problem we can and

1:08:23

all of this again, this is all these

1:08:26

are fact based claims right about

1:08:29

how to move in this space. There are right and

1:08:31

wrong answers about

1:08:33

how to move. There are and and the and at

1:08:35

every level at which we are gathering human

1:08:38

knowledge, there are right

1:08:40

and wrong answers. There are genetic

1:08:45

things that determine you know, where

1:08:47

we where we're inclined on this landscape. There

1:08:50

are environmental aspects

1:08:53

to this, and all of

1:08:55

this can be you know that broad

1:08:57

strokes distinction can be can be

1:09:00

defined and understood at

1:09:02

every level that we have a specific

1:09:04

science or a specific almost science that

1:09:06

addresses it. So we're talking about

1:09:09

the truths of physics on through

1:09:11

biochemistry, and you

1:09:13

know, neurophysiology

1:09:16

and psychology and sociology and economics

1:09:18

and all of it. Right, any place

1:09:20

where we're going to make a fact based distinction about

1:09:22

anything is potentially

1:09:24

relevant to how we navigate on

1:09:27

this space of possible experiences. So

1:09:30

we have not introduced morality yet, we haven't

1:09:32

introduced any oughts yet,

1:09:37

and so then, and we don't even have the word should

1:09:40

yet. Right now, let's invent

1:09:42

the word should. Right, What does

1:09:44

it mean? What should I do? What

1:09:47

should we do? Right? I would

1:09:49

simply claim this is the only thing you have to

1:09:51

grant me in order to get my moral

1:09:55

worldview booted up.

1:09:59

All you have to grant me is that the word should,

1:10:01

if it means anything at all, it

1:10:04

means that we should avoid

1:10:07

the worst possible misery for everyone.

1:10:10

Right, If we should do anything, we should do that

1:10:14

anything else on this landscape is

1:10:17

better than this one

1:10:19

place where it's true

1:10:21

to say that any

1:10:24

conscious system that can suffer

1:10:26

is suffering as much as it possibly can

1:10:29

for as long as it can without any good

1:10:31

thing coming up it. Right, there's no silver

1:10:33

lining to the suffering. You know, we've

1:10:35

created computer systems that

1:10:38

are just hell realms, where conscious

1:10:40

computer programs suffer immeasurably

1:10:44

for an apparent eternity. Right, everything

1:10:47

that can suffer is suffering the most

1:10:49

harrowing and pointless misery it

1:10:51

can possibly endure for as

1:10:54

long as it can possibly happen compatible

1:10:56

with the laws of physics. Right, that's

1:10:58

the worst possible misery for everyone. If

1:11:01

anything is bad, that is

1:11:03

bad. If the word bad means anything,

1:11:05

Okay, it doesn't yet. We haven't invented the word bad

1:11:08

yet. Now we have what

1:11:10

does it mean? If it applies

1:11:12

to anything? It applies to that.

1:11:15

What does good mean? Good means

1:11:18

good? The direct where are you going to point toward

1:11:20

the good? You're going to point away from

1:11:23

the worst possible misery for everyone. Anything

1:11:26

is better than that? And now then

1:11:28

the question is how much better does

1:11:30

better get? Well? Over here, we've

1:11:32

got a beautiful

1:11:35

global civilization figuring

1:11:37

out how to colonize the galaxy based

1:11:39

on environmentally sustainable you

1:11:42

know, collaborative

1:11:44

principles that are just redounding to the advantage

1:11:47

of almost everyone. And they're

1:11:49

curing diseases as quickly as they crop up,

1:11:51

and it's just, you know, and

1:11:54

they're fast completing something like a

1:11:59

mature psychology of human self actualization.

1:12:01

And it just it's like, basically the entire world

1:12:04

has become Esslin Institute on

1:12:07

its on the most beautiful afternoon

1:12:09

it ever enjoyed in the you know,

1:12:11

in the summer of love. Right, right,

1:12:14

that's the worst day anyone has for

1:12:16

the next thousand years. Right, that's

1:12:18

a pretty good planet to be on, right, certainly

1:12:21

better than the worst possible misery for

1:12:23

everyone, right then, So

1:12:25

so all the people who are who are getting

1:12:28

wrapped around the axle of is an aught are

1:12:30

saying, wait

1:12:33

a minute, but is that is

1:12:35

the worst possible misery for

1:12:37

everyone? Really bad? Really?

1:12:42

Should I should? I? Should I? Really

1:12:44

avoid it? Who are you to say

1:12:47

philosophically that

1:12:49

that you should

1:12:51

avoid the worst possible misery for everyone, or

1:12:53

you should actualize a galaxy

1:12:56

full of of of uh

1:13:00

amazingly happy conscious

1:13:02

systems. And

1:13:05

that's just a it's a misapplication of language.

1:13:08

It's just if the word if the word should

1:13:10

mean the place you are standing,

1:13:13

So is to have the pretense of doubt

1:13:16

about what those words mean, right,

1:13:19

that like the place you're standing to say,

1:13:21

well, is the worst possible misery for everyone,

1:13:24

really bad? Might there

1:13:26

be something worse? Right? That

1:13:28

is, that place doesn't exist

1:13:30

if you if you understand what these

1:13:33

words mean, right, if

1:13:35

you're actually running them in anything like a kind

1:13:37

of emulation so that you're you're understanding

1:13:39

them, it should be obvious

1:13:41

to you that that's not there's

1:13:43

no place to stand from which to do that philosophy

1:13:46

I mean like a much a much crasser

1:13:49

but also emphatically convincing.

1:13:53

Uh framing

1:13:55

would be Okay, put

1:13:57

your hand on a hot stove, and

1:14:00

then tell me about your philosophy about

1:14:02

whether or not that is bad

1:14:05

and worth avoiding. I'd like to double

1:14:07

click on that example, though, go for it,

1:14:10

because everything up to that example I was with

1:14:12

you, and then you know the collective well being.

1:14:14

But once we get to the individual level, you

1:14:16

know. The way I think about it is that

1:14:18

there's no should without in order to which

1:14:20

is a goal. If someone says you should do X,

1:14:22

that necessarily implies that you should

1:14:24

do X in order to get why there could be no

1:14:27

a should without reference to a goal. Now,

1:14:29

what if what if the goal is to avoid the

1:14:31

worst possible misery for everyone, then

1:14:34

that's a good point. But what if that's not your

1:14:36

value? So what or what if?

1:14:38

How could it not be your value? Well? What

1:14:40

there's one could conceive of a situation

1:14:43

in which suffering at an individual

1:14:45

level is what will lead

1:14:48

to a greater good for everyone. If your

1:14:50

value system is that greater good that you're

1:14:52

talking about. Couldn't you make the case

1:14:54

you have to do sucky things sometimes in order to

1:14:56

get there? Yeah, yeah, yeah,

1:14:59

yea. So there are trade offs and and there, and there are

1:15:01

forms of suffering that have silver

1:15:04

linings, right, I Mean, that's why I'm careful to

1:15:06

define the worst possible misery for everyone is

1:15:08

as as really the worst possible

1:15:10

misery for everyone. But if you're if you're going to talk about,

1:15:13

you know, any project in life

1:15:16

that is hard, that that can't be achieved

1:15:19

but for hard work, you

1:15:21

know, i e. Some measure of suffering, Yeah,

1:15:25

then those that's just as easy to understand.

1:15:27

I Mean, there's there's sometimes there's some things that

1:15:29

suck and nothing good come

1:15:31

from them. And there's some things that suck but

1:15:34

they're on their way because the value judgment

1:15:36

though, but

1:15:39

they could be in they could be well, no, there's there're

1:15:41

framing effects, so that there are things that are unpleasant,

1:15:45

but framed a certain way, we

1:15:48

actually kind of like them. It's a kind of

1:15:50

it's within the range of unpleasantness that we actually

1:15:52

like because we know what it means. Right,

1:15:55

So, like a good workout is my is my

1:15:57

favorite example here. It's like like that like

1:15:59

once once you learn to love lifting

1:16:01

weights, that physical

1:16:04

stress, which if you felt in another context,

1:16:06

if you woke up in the middle of the night feeling

1:16:08

what you feel when you're doing

1:16:11

you know, the heaviest deadlift you

1:16:13

can you can you can accomplish,

1:16:16

Well, you'd think you were dying, right, you'd be terrified.

1:16:19

You'd call nine one one, right, like

1:16:21

like like that's a medical emergency. But

1:16:23

because you're in a gym getting stronger,

1:16:26

you actually like that experience.

1:16:29

Right. So

1:16:32

that's but you know, that's that is interesting

1:16:34

psychologically, But that's not a counterpoint

1:16:37

to my argument. That's just it's just in

1:16:39

fact true that the cognitive

1:16:41

frame you put around certain sensory experience

1:16:44

matters. But it's

1:16:46

and it's it's also true that certain good

1:16:49

things in life can only be accomplished by

1:16:52

going through certain hard experiences,

1:16:54

right, And it may it may be true both

1:16:57

individually and collectively. And this is

1:16:59

why my moral escape analogy could

1:17:03

be relevant here on this question.

1:17:06

You know, just imagine a landscape where our

1:17:08

peaks and valleys and the peaks

1:17:11

correspond to increases in well

1:17:14

being, you know, individually and collectively. And let's

1:17:17

take a collective moral landscape for this argument.

1:17:20

So we're on and you

1:17:22

know, all of these, all of

1:17:24

the landscape disappears into the mists

1:17:26

beyond which we can see. So you never quite know you're

1:17:29

on a peak, right, Like there's never maybe there's maybe

1:17:31

the peaks go up infinitely, we don't

1:17:33

know, but we just we're on a high spot.

1:17:36

We know we can move higher and things get better

1:17:38

and better, Things get happier

1:17:40

and happier for all of us, or most

1:17:42

of us, or

1:17:44

us in the aggregate. But there's

1:17:47

and we know what it's like to go downward and things

1:17:49

seem to get worse and worse. Well, it may in fact be

1:17:52

that there's a we're just on some local

1:17:54

maximum. But the way to

1:17:56

get to a much better spot, a much higher

1:17:58

spot on this landscape would

1:18:01

require a collective

1:18:03

descent into some kind of valley,

1:18:05

right, like things actually will get worse

1:18:08

in order for us to get better. It's like you do have to

1:18:10

rip the band aid off, and that sucks,

1:18:13

but you have to rip it off. And that may

1:18:15

be true of certain things like what if? What

1:18:18

if climate change? What if our solution

1:18:21

to climate change is absolutely necessary

1:18:24

but actually painful economically.

1:18:26

Now, I'm not convinced that's the case, right,

1:18:28

I'm not convinced that we need to make significant

1:18:30

sacrifices. But what if it's

1:18:33

just an accident. If that's true,

1:18:35

what if we lived in a world where we're

1:18:38

on a collision course with something truly

1:18:40

horrible, and the only way

1:18:42

to get off of it is to make a

1:18:45

major sacrifice that diminishes,

1:18:48

that noticeably diminishes the well being

1:18:51

of more or less everyone for a generation.

1:18:54

It's totally possible that we could

1:18:56

be if we're not in that situation. Now,

1:19:00

that's an intelligible situation to be

1:19:02

in. That would be an example of like,

1:19:04

Okay, this is going to suck, but

1:19:07

we have but here's why we're doing it, and

1:19:09

and it's rational for us to do it. The

1:19:12

better and worse are value judgments I don't know why

1:19:14

you don't see that. You know the thing about

1:19:16

well being that people on

1:19:19

a hot stove, and then tell me that

1:19:21

if it was in order, if my goal was

1:19:24

a broader goal and putting my hand on hot stove would

1:19:26

help me with that broader goal, I'd put my hand on a hot

1:19:28

stove and deal with the suckiness of

1:19:30

the feeling. But the thing is, it's not to

1:19:33

say that. To say that the worst

1:19:35

possible misery for everyone is

1:19:37

bad is a value

1:19:39

judgment, is to

1:19:42

say nothing. Well, you're

1:19:44

accepting a particular definition of well

1:19:46

being. The point is that psychology

1:19:49

people, no, no no, no, no no, I'm not definition. You're

1:19:51

just you're just not understanding my claim. Take

1:19:53

any definition. Take let's

1:19:56

take value fulfillment as a definition, which

1:19:58

is ano definition less. Imagine

1:20:01

a universe of

1:20:04

radical pluralism with respect

1:20:06

to values. Right, so we've got

1:20:09

so, we've got we've got a place over here where

1:20:12

you have you have perfectly matched

1:20:14

sadists and masochists who if you

1:20:16

could just get them together, they get really really happy.

1:20:19

Right, But we want

1:20:21

nothing to do with any of that, right Like that just

1:20:23

sounds like hell to us. But the truth is psychologically

1:20:26

for them. They're having a fine old time

1:20:29

in their you know, BDSM dungeon.

1:20:31

Uh and uh, we

1:20:34

have no idea how weird all that can get. But it's just

1:20:36

different values. Right. And

1:20:40

let's say there's a functional infinity,

1:20:43

a functionally infinite number of value systems.

1:20:46

I'm asking you to imagine a universe where every

1:20:49

conscious creature, by its own values,

1:20:52

is made as miserable as it possibly can

1:20:54

be. So everything is tortured, even

1:20:57

if your torture is my highest

1:21:00

enjoyment, and vice versa. Grab

1:21:03

whatever knobs there are and turn them

1:21:05

down to the hell realms for everybody

1:21:09

for as long as possible, right, with

1:21:11

nothing good coming of it. If something good's

1:21:13

coming of it, well, then that's just not as bad

1:21:15

as things can get. Let's make them worse. Right,

1:21:19

Everything gets dialed down to the utmost

1:21:21

misery by whatever uh

1:21:25

causal structure would allow for that

1:21:27

misery. And it's so if you're going

1:21:29

to flip the flip, all the the

1:21:31

the valances on a value system in some

1:21:33

other corner of the universe will then okay,

1:21:36

that's the antimatter of morality over there,

1:21:38

will find it

1:21:40

functions by its own principles. Let's

1:21:42

let's screw let's screw things up perfectly

1:21:44

over there, that's

1:21:47

the place that's the base case. Right.

1:21:49

So if you're going to say, well, that's just a value judgment

1:21:51

that the worst possible misery for everyone is

1:21:54

bad, I don't know what you're talking

1:21:56

about. You're just you're

1:21:58

making you're making noisiers. It

1:22:00

feels like you're inserting in AUGHT.

1:22:02

You're only getting to IS because you're inserting in

1:22:04

AUGHT with the IS. You're hearing them together.

1:22:07

I'm granting. I'm granting you. There's

1:22:10

no such thing as ought. We live in the universe

1:22:12

without oughts. You don't have to do anything. You

1:22:15

shouldn't do anything. Okay,

1:22:17

you're just you're you're you're off scott free, you're off

1:22:19

the hook. There's no free Yeah,

1:22:22

there's no morality, you know, Johnny

1:22:25

Philosopher. Right, so

1:22:27

we're we're in a universe where you

1:22:29

don't you don't. You don't have to get out of bed, right,

1:22:32

you don't have to get out of bed in the morning. I'm not

1:22:34

going to judge how are facts

1:22:36

going to at all lead

1:22:38

me to action? The fact the facts

1:22:40

are that there there

1:22:43

are very different experiences

1:22:45

on offer here, and you will

1:22:47

helplessly find yourself preferring

1:22:53

the good day at Estlin over

1:22:56

the the rat filled dungeon. Just

1:22:58

to take the fairly parochial differences

1:23:01

that we can notice here on earth. Good can only

1:23:04

be used in relationship to a goal.

1:23:06

How are you divorcing it from the goal? No,

1:23:09

it's it's it's just it's a it's

1:23:12

basic. Well, no, because there's just

1:23:15

there's just like the the

1:23:17

the valance of certain experiences

1:23:20

within consciousness that have no necessary

1:23:22

reference to a goal. It's like, you can,

1:23:24

you can take you can. You can be so happy

1:23:29

or unhappy that

1:23:32

it has no reference point in past or

1:23:34

future, Right, Like you can have

1:23:36

the best possible acid trip or the

1:23:38

worst possible acid trip and

1:23:40

you're not. There's no goal there.

1:23:42

There's just a the sheer

1:23:45

extremis of your physiology

1:23:48

pushed to the breaking point. I

1:23:50

do think that there's pleasurable, there's unpleasant,

1:23:53

But I don't think they map onto

1:23:55

good or bad in the way

1:23:58

that you kind of it seems like you're mapping your mong

1:24:00

to give them, give them enough, to dial them

1:24:02

up enough, and give them enough time. Right,

1:24:04

Like, what what if what if existence was just

1:24:06

that? What if existence? What if there

1:24:09

was a way to to there

1:24:11

was a place you could be. Let's say, let's say reincarnation

1:24:13

is true. Right, let's say that's a possibility

1:24:16

and wouldn't

1:24:19

yeah, that would that would be interesting. I mean, the

1:24:21

truth is both both situations are interesting.

1:24:23

You know, you know, getting non

1:24:26

existence is also I mean the fact that you know it

1:24:29

you appear and then you disappear, and you really disappear,

1:24:31

that is also interesting. It's like

1:24:34

the firm It's the Fermi problem, you know, it's

1:24:36

like thinking of a universe

1:24:38

teeming with other life

1:24:41

forms and advanced civilizations. That is, that

1:24:43

is about the most

1:24:45

astonishing possibility on

1:24:48

offer, except if you think that there's

1:24:50

no one else in the universe, that is also astonishing.

1:24:53

I mean, they're both, they're both just jaw dropping

1:24:55

to think about. But if

1:24:58

you could be reborn in

1:25:00

a state of perfect bliss, uncomplicated,

1:25:03

uncomplicated by any goals, right

1:25:10

that that that may not be the

1:25:12

the most interesting possibility

1:25:15

for you. But maybe your

1:25:17

intuitions about that caused you to

1:25:19

think it's not the most interesting possibility

1:25:23

are just born of your own you

1:25:26

know, glitch in your own code. Whereas

1:25:28

if we could change that, we could change those intuitions. Let's

1:25:30

say we just perform the necessary you

1:25:32

know, brain changes for you a

1:25:35

little slightly, you know, maybe maybe analogous to teaching

1:25:38

me how to play the cello. All

1:25:40

of a sudden I would appreciate you, like I don't know what

1:25:42

I'm missing with respect to classical music. But

1:25:44

once you, once you gave me the intuitions

1:25:46

of a of a Beethoven, well,

1:25:49

all of a sudden, I'd recognize, Okay, this is there

1:25:51

was a there there. This is just you know, the person

1:25:53

I was, the philistine I was who just didn't

1:25:55

get it and who would rather listen to led Zeppelin

1:25:58

just didn't know what he was missing. Right.

1:26:01

My point is, with respect to the moral

1:26:03

realism here is that, just

1:26:05

as it is with any other realistic

1:26:08

claim about knowledge about facts,

1:26:10

you know, with physics or anything else, it

1:26:13

is possible in the moral domain not

1:26:16

to know what you're missing. Right. You don't

1:26:19

know how good or bad things

1:26:21

are over there. Right, you don't even

1:26:23

know there isn't over there based on your own

1:26:25

experience. Right, And when

1:26:29

we were asking questions about how

1:26:32

to navigate in this space of possible experience

1:26:34

and whether it would be good or bad,

1:26:37

to go one direction

1:26:39

or another. We

1:26:41

are constrained based on our own

1:26:45

moral intuitions rather often, but we

1:26:47

can triangulate on those and

1:26:49

recognize that, you know, we're

1:26:52

living in a universe where it could be

1:26:54

possible to change one's moral intuitions.

1:26:56

In fact, it is possible based on pedagogy,

1:26:59

and and you's just collisions with other people

1:27:01

who have different intuitions, you know, through

1:27:03

conversation, but

1:27:06

ultimately ultimately might be possible

1:27:09

to change them very directly, Like we're going to change

1:27:11

your actual the

1:27:13

code you're running on your brain, or

1:27:16

we're going to upload you into a different brain, right

1:27:18

Like like that's like, so, what what

1:27:20

sort of robot do you want to be? Once we once

1:27:23

we can really change you materially,

1:27:26

there's this further question of okay,

1:27:29

if I can change your

1:27:31

intuitions about right and wrong and good

1:27:33

and evil, right, so that up is down,

1:27:36

up is down and down is up? Right, there's

1:27:39

this further question of asking would

1:27:42

it be good to do that? Well,

1:27:44

Who's whose intuitions are you referencing?

1:27:47

Bye by when you even ask

1:27:49

that question, Like, who's going to decide whether

1:27:51

it's good to do that? Well? Again, we're

1:27:55

we have to fall back on this original navigation

1:27:57

problem. There's this moral landscape.

1:28:00

There's a functionally infinite number of experiences

1:28:02

on offer you probably almost

1:28:04

certainly not infinite, but functionally

1:28:07

so. And

1:28:12

we know that some of these experiences

1:28:14

suck. Right, It's just built into

1:28:16

the very logic of this case,

1:28:19

which is, whatever your intuitions

1:28:21

are, we can concoct an experience that is

1:28:23

maximally terrible for you, right,

1:28:26

But whatever however your mind is built, we

1:28:28

can make you suffer. Right, So

1:28:32

getting away from if we should do anything,

1:28:34

if the word should is going to mean anything,

1:28:37

if the word good is going to mean anything,

1:28:39

if the word better is going to mean anything, if

1:28:42

a valance towards the positive is going to mean

1:28:44

anything. Getting away from

1:28:47

the burning stove that is burning

1:28:49

everyone in the worst possible mode

1:28:52

of burning that they're you

1:28:54

know, their organization can admit of forever.

1:28:57

Right, Getting away from that

1:28:59

is good and better and worth

1:29:01

doing. I agree, and I think you should

1:29:04

do it. But I only agree with you because I share

1:29:06

your value system. No, no,

1:29:08

over there, your whatever your

1:29:11

value system is, its ultimate

1:29:13

repudiation is

1:29:16

part of this picture. You're if

1:29:18

you are capable of wanting anything, you're

1:29:21

going to want to be elsewhere once

1:29:23

I get you into the worst possible misery for

1:29:25

everyone. It

1:29:27

doesn't take a lot, and that takes

1:29:29

a lot of hubris though for us to think that we

1:29:32

know the right way. More

1:29:34

important this is this

1:29:37

is where the This is where the double standard I referenced

1:29:39

earlier comes into the picture. People

1:29:41

seem to think that a diversity of opinion

1:29:44

or disagreement on values,

1:29:46

on moral values has

1:29:49

to mean something when it doesn't

1:29:51

mean anything when we're talking about scientific

1:29:53

values or facts. The

1:29:55

guy who shows up at the physics conference

1:29:59

who doesn't care about causality

1:30:01

and doesn't care about consistency,

1:30:03

and doesn't care about logic, and doesn't care about

1:30:06

the history of physics and all the conversations

1:30:08

that were had before he got there, and doesn't

1:30:10

know the math and doesn't he's just not adequate

1:30:13

to the conversation. Doesn't get

1:30:15

to be part of the conversation. The

1:30:17

psychopath doesn't

1:30:19

get to inform our ethics.

1:30:22

The Taliban don't get a vote.

1:30:25

They don't know what it means to live a

1:30:27

good life and produce a good society and

1:30:29

treat women well. They're

1:30:31

imbeciles. They have a shitty culture.

1:30:34

We know this. This is not and

1:30:36

it shouldn't be taboo to say this. They're

1:30:39

relig they're trying to live by

1:30:41

the lights of a fifth century book

1:30:44

or seventh century book, which

1:30:48

wasn't a good book even

1:30:51

in the seventh century, right

1:30:53

it was. So it's like, by

1:30:56

the lights of the seventh century, it was possible to do

1:30:58

better than what Muhammad managed more really

1:31:00

speaking, probably a good book. And so

1:31:02

it is with the Bible, and so it is with the Book of

1:31:04

Mormon, right, the book Like you know, if you're going to

1:31:06

hold the Book of Mormon up as a history of the world

1:31:08

or a history of anything else, or a book about physics

1:31:10

or a book about medicine, it sucks.

1:31:13

It also sucks as a moral orientation

1:31:16

in the twenty first century, right, It

1:31:18

just it's like, we can do better than all of these things,

1:31:21

and we want to have an open ended conversation about

1:31:23

the nature of reality. We

1:31:25

do not have to be constrained by

1:31:29

this this spurious notion that

1:31:32

values are something other

1:31:34

than facts. They're not like there's

1:31:37

a way of talking about them that

1:31:40

that can seem to they can they can

1:31:42

motivate that distinction for the purposes of certain

1:31:44

conversations. But it's like

1:31:46

the distinction between reason and emotion,

1:31:49

right, It's like, yes, there's there's

1:31:51

they're not, they're not precisely the same thing. We

1:31:53

know what it's like to have motivated reasoning

1:31:55

where your emotion is causing you to

1:31:57

misconstrue certain arguments or or

1:32:00

or cherry pick certain data or whatever it is because

1:32:02

you you know, you want things to come out

1:32:04

a certain way, like like yes, reason

1:32:06

and emotion or are you

1:32:08

know, part of a ven diagram that don't completely

1:32:11

overlap. But it's

1:32:13

also true that part

1:32:15

that that there is a an emotional

1:32:18

aspect two our to

1:32:21

the cognitive apparatus

1:32:23

that is producing our rationality. And

1:32:25

if you and if you're damaged emotionally

1:32:27

in the right way is if you have you know, orbital

1:32:30

medial prefrontal damage

1:32:32

that causes you not to feel

1:32:34

the implications of certain

1:32:37

reasoning strategies or certain correct uh

1:32:40

conclusions, you will you

1:32:42

will malfunction. You'll you'll you'll

1:32:45

know what's right and be unable to

1:32:47

use what's right, you know, rationally speaking

1:32:49

to reference Antonio Dimasio's

1:32:52

work on gambling tasks there or

1:32:55

and and even just the even just the feeling

1:32:57

of doubt is an emotion. The

1:33:00

feeling of certainty is an emotion, the feeling

1:33:02

of of of Aha.

1:33:04

Now I see how that adds up, like two plus

1:33:06

two makes four. I get it there.

1:33:10

That is leveraging emotion

1:33:13

in order to land rationally

1:33:15

right, Like there's it's not it's not. It's

1:33:17

not completely devoids. It's not. It's

1:33:19

not unemotional, and

1:33:22

so it's it's on some level it's a spury.

1:33:24

It's not entirely clear distinction. And

1:33:27

we have to be careful in

1:33:29

how we differentiate reason and emotion. And

1:33:34

it's it is to a certain purpose, but so but

1:33:36

the and so and so it is

1:33:38

with values, It's like, hey, you can't

1:33:41

you can't do arithmetic without

1:33:43

valuing the you know, the

1:33:46

the operations in

1:33:48

certain ways, like if if you're going

1:33:50

to do arithmetic, or pretend

1:33:52

to do arithmetic and imagine

1:33:56

that you should be free to

1:33:59

think of the numbers differently on

1:34:03

either side of the equal sign.

1:34:06

Right, two means two over here, but

1:34:08

it means something different over here. That's

1:34:11

what I value. I I don't I you

1:34:14

know, I don't like your colonialist

1:34:17

you know mansplaining white

1:34:19

guy values of consistency

1:34:21

across the equal sign. You

1:34:24

know that got drummed into you in your

1:34:26

in your private

1:34:29

school. But over

1:34:31

here in my you know, Taliban

1:34:33

funded Academy of Arithmetic,

1:34:36

we've got you know or my my, you

1:34:38

know where we read uh dere

1:34:40

dah and uh and uh uh

1:34:44

you know fuco on the topic of arithmetic.

1:34:47

And we realize this is just a socially constructed

1:34:49

project. And on the other side of the equal sign

1:34:52

we can we can have left arithmetic and

1:34:54

right arithmetic. I mean, I'm just obviously I'm just

1:34:56

making this up, just confabulating, but what

1:35:00

like that's not going

1:35:02

to produce the results that

1:35:05

we that we we dignify as

1:35:07

arithmetic for

1:35:09

very obvious reasons, and values

1:35:13

are built into that project. Now

1:35:15

again this gets subverted in

1:35:18

specific instances, in very interesting

1:35:20

ways which we which we also have

1:35:23

to to enshrine into

1:35:25

our values. Right. So for instance, and

1:35:27

this is a little bit like how intuition is

1:35:30

is at bedrock for us, but

1:35:33

it's constantly being subverted by

1:35:36

the systems we build to leverage our intuitions,

1:35:38

Like you have to have the

1:35:42

only way you understand two plus two

1:35:44

makes four is based on intuition.

1:35:47

But we know our intuitions

1:35:49

fail in other areas of mathematics, and

1:35:51

we have to we have to account for that. But

1:35:54

so you can take something like like what I just said

1:35:56

that the kind of arithmetic that would be impossible

1:35:58

and and and and laughable.

1:36:02

There we know we can find ourselves

1:36:05

in situations where that seems

1:36:07

to be so and it isn't. Right, Like when

1:36:09

someone poses non proposes

1:36:12

non Euclidean geometry

1:36:14

for the first time. Right, you have a mathematician

1:36:17

like Remond I believe

1:36:19

was, you know, was certainly one of the first people to do

1:36:22

this. Like everyone thought

1:36:25

that a triangle

1:36:28

had to have one hundred and eighty degrees, and here

1:36:30

comes somebody saying, no, no, I'm not going to be

1:36:33

played by those rules.

1:36:36

Not all triangles have one hundred and eighty degrees.

1:36:38

Okay, there's there's

1:36:41

there's one time point where he seems

1:36:43

like a lunatic, or

1:36:45

at least he just doesn't understand what he's saying. But

1:36:48

there is a path from from

1:36:50

that initial seemingly crazy

1:36:52

claim to making sense

1:36:55

incrementally, and there's just not that

1:36:57

many increments here where then

1:36:59

you think, oh my god,

1:37:02

unless everyone understands what this guy

1:37:04

just said, they don't understand

1:37:06

geometry anymore. Right, Like so

1:37:08

like it goes from this is blasphemy

1:37:10

and you're an idiot to you're

1:37:13

a genius, thank you. Right,

1:37:16

And we know what it's like to traverse

1:37:18

that boundary, and we know and we

1:37:20

know that there's principles of intellectual honesty

1:37:22

and self criticism and openness to evidence

1:37:24

and argument, and patience and and

1:37:27

and and being sensitive to bias.

1:37:30

We know all of it. We know what we need to have

1:37:32

in the toolkit, and

1:37:34

we know we get continually surprised by

1:37:36

new discoveries and

1:37:40

the fact that someone like Remond can come along

1:37:42

and say, okay, look a triangle

1:37:44

on a curved surface is going

1:37:46

to have more or less than one

1:37:49

hundred and eighty degrees, right, do you understand

1:37:51

what I'm talking about? And

1:37:54

that and the point Lands, we

1:37:56

know that is a different project than

1:38:01

postmodern you know, everything's

1:38:03

everything's pure context relativistic

1:38:06

bullshit. We know it's different than

1:38:09

the Taliban. May be right that

1:38:11

the Koran is the perfect word of the creator of the universe,

1:38:13

and it should subsume every other human project.

1:38:16

We know enough to know then

1:38:19

the kinds of errors that are being made there.

1:38:21

And so

1:38:24

it's just It's like the the exceptions, you know,

1:38:27

those those exceptions where everybody's

1:38:29

wrong and then suddenly some lone genius

1:38:32

rewrites our collective appraisal

1:38:35

of reality. That does that

1:38:37

needn't open the door to this

1:38:39

this you know, quasi

1:38:42

nihilistic picture that every doubter

1:38:44

who comes along needs to be taken seriously.

1:38:47

Right. The person who doesn't understand that the

1:38:49

worst possible misery for everyone is

1:38:51

worth avoiding, just doesn't get

1:38:54

to come to the Conference on Morality to

1:38:57

belong. There so many good points

1:38:59

and make a lot of good points in the moral

1:39:02

landscape, but none of that is

1:39:04

countering the naturalistic fallacy, which just

1:39:06

simply says you can't have You

1:39:09

can't have only factual premises.

1:39:11

You have to have something that has no I

1:39:13

am I'm I'm agreen. I am agreeing

1:39:16

with that. Even when you only have factual

1:39:18

premises, you can't get to

1:39:20

a fact without

1:39:22

first presupposing certain values. You

1:39:26

just can't like it because if you

1:39:28

don't value not

1:39:31

contradicting the last thing you claim

1:39:33

to believe, right, if you want your beliefs

1:39:36

to cohere, I see what you're saying, that's a value

1:39:39

I can't both like to believe

1:39:41

that something is red and

1:39:43

something is also blue. You

1:39:45

know, red and blue all over, right,

1:39:48

not partly red and partly blue. But it's red

1:39:50

and it's blue. Right, is

1:39:54

a contradiction I

1:39:58

can eat. So I could decide

1:40:01

I have to have a value system to organize

1:40:05

those two propositions. I mean, either

1:40:07

I don't think that's Let me

1:40:09

give you a good example, a counterexample,

1:40:12

and you tell me if this is not a counter I

1:40:14

hated IQ tests, I hated intelligence

1:40:17

research. I went into the field

1:40:19

with the mission, because of my values, to

1:40:21

take down IQ. But I collected

1:40:24

the facts to such I had

1:40:26

to put my values aside as much as I possibly

1:40:28

could, And I studied with the leading IQ

1:40:31

researcher in the world, Nicholas McIntosh University

1:40:33

of Cambridge, and I published

1:40:35

studies which directly contradicted my whole

1:40:37

experience as a child, and it

1:40:39

was hard for me to stomach. But I still published

1:40:41

it because I had a commitment to the truth. Now

1:40:45

what that commitment to the truth is

1:40:47

a statement of your more value

1:40:50

system. Value system of commit truth trumped

1:40:52

my value system of wanting to change

1:40:55

the intelligence field in it by taking

1:40:57

down IQ. So I had two competing

1:40:59

value. Is that is what that's what you're arguing. Yeah,

1:41:02

yeah, no, you're value. Yes. We need

1:41:04

to identify the values that

1:41:07

scale best that

1:41:09

that can help. So we need to identify

1:41:12

the intellectual and ethical values

1:41:15

that allow for the

1:41:17

the again, the

1:41:19

the sanest and

1:41:22

most efficient way of

1:41:24

navigating in this space of all

1:41:26

possible experience toward

1:41:29

better and better, more creative, more insightful,

1:41:31

more beautiful experiences which we are

1:41:33

right to care about, if we're right to care about

1:41:35

anything? Right, if caring? I

1:41:38

mean this is this is where

1:41:41

where language bites its own tail, right,

1:41:43

Like, this is this is where the

1:41:45

definitions of terms become circular.

1:41:48

Right, Like for someone

1:41:50

to say, okay,

1:41:55

what if I want to experience

1:41:57

the worst possible misery for everyone? Right,

1:42:00

that's not a use of the word want

1:42:03

that makes any sense. Right, What

1:42:05

you mean by want doesn't

1:42:08

map onto this landscape, right,

1:42:10

So like we have to like words people.

1:42:15

This is what's happening with the isot problem. People

1:42:18

are pretending to think certain

1:42:20

thoughts. They're not actually thinking them.

1:42:23

They think they're thinking them, but

1:42:25

they think that uttering a

1:42:27

sentence is the same thing

1:42:30

as thinking a thought. It

1:42:32

isn't, right? I can pretend

1:42:34

to think this thought. You know what I

1:42:37

have in my refrigerator right now? I have a

1:42:39

round square, right,

1:42:44

It's a sentence, right, It's

1:42:47

utter bullshit. It has no

1:42:49

reference point logically or empirically

1:42:52

in our universe or any other that

1:42:54

I can imagine, because round

1:42:56

square makes no sense, right.

1:43:00

Round is exactly

1:43:02

what a square isn't. Right, So the fact

1:43:04

that I can say the phrase round square

1:43:07

shouldn't make you think

1:43:10

that I'm

1:43:13

tracking the thought,

1:43:15

that there's a thought on offer that

1:43:17

my

1:43:20

mind isn't passing through. It's like,

1:43:22

every thought, every sentence is

1:43:26

like a needle that

1:43:28

you actually need to think through. You

1:43:31

need to be able to thread the needle, right, But I

1:43:33

am not threading it. I'm just saying, here's a needle

1:43:37

and I'm

1:43:40

pretending that my

1:43:42

intelligence has passed

1:43:44

through the eye of that needle

1:43:46

by uttering the sentence. But

1:43:48

it's completely empty, it's

1:43:51

completely vacuous. It is not a thought,

1:43:54

right, I'm not thinking that thought.

1:43:57

And that's not a paradox. I'm

1:43:59

making small mouth noises

1:44:02

and pretending to be a philosopher when

1:44:04

I say that sentence, right, that's

1:44:07

what's happening with this is-ought

1:44:09

distinction, in my view. I

1:44:12

mean, it's just empty

1:44:14

language when you actually drill down

1:44:17

on the circumstance we're actually in, right,

1:44:20

and the way our intuitions

1:44:25

allow us to make any claim at

1:44:27

all cognitively or

1:44:29

or behaviorally,

1:44:32

to feel any motive to do anything

1:44:35

right. It's like, what is it to

1:44:37

be a cognitively

1:44:40

and volitionally a

1:44:43

live system?

1:44:48

We're

1:44:51

hurling words at this circumstance,

1:44:55

trying to make some

1:44:57

appraisal of it. And again,

1:45:00

you know, whether we choose to think of it

1:45:02

or not. We're trying to navigate

1:45:05

within this space of possible experiences, like

1:45:07

I feel something that makes me uncomfortable, and

1:45:09

I want to stop feeling that way, right

1:45:12

with my apish brain. But

1:45:14

you might not want to stop feeling that way if you had

1:45:16

a certain value system that allowed that

1:45:18

to happen. Right. But then other

1:45:21

things count as uncomfortable in that value

1:45:23

system, right? So, like, I've got whatever

1:45:25

I've got, which I

1:45:27

can't perfectly inspect. In

1:45:29

fact, I'm really bad at inspecting

1:45:31

it. Right, I can't look inside myself

1:45:33

and find my values. They have to come

1:45:36

out in dialogue with

1:45:38

the world, right? They get

1:45:40

revealed to me, as

1:45:42

they get revealed to you, by these

1:45:44

collisions, you know, linguistic collisions and

1:45:46

behavioral collisions with the world, Like how

1:45:49

do you know you don't like your hand on a hot stove?

1:45:51

Well, touch one for the first

1:45:53

time, right, and then you know,

1:45:57

how do you know you're allergic to strawberries? You

1:45:59

know you eat them for the first time and you have a

1:46:01

reaction. How do you know

1:46:05

you don't? You

1:46:09

want to reject the inconsistency

1:46:12

in this other person's argument, like

1:46:15

like someone's telling you something that isn't

1:46:17

adding up, right? You're in your first

1:46:20

philosophy class, and you've

1:46:22

got some anti

1:46:25

natalist arguing that it would be better

1:46:27

not to have been alive, Right, it'd be better not to be

1:46:29

born, and having kids is totally

1:46:31

unethical for that reason because life

1:46:33

sucks, and

1:46:37

you feel like there's got to be

1:46:39

something wrong with this argument, because

1:46:41

it seems, uh,

1:46:44

it just seems to open the door to all kinds of things that

1:46:47

seem starkly unethical. Which

1:46:49

is to say, if you killed everyone in their sleep tonight,

1:46:51

painlessly, that would be a good thing to do, right?

1:46:54

Let's just murder everyone in

1:46:56

their sleep tonight. There'd be no one bereaved, no

1:46:58

one would be suffering any of

1:47:00

the outcomes of that, and there'd be no attendant

1:47:03

suffering to the deaths themselves. Like

1:47:05

That's just... if you could do that, you'd

1:47:07

be a moral monster not to do that. That's sort of

1:47:09

the... Sorry, I'm uncomfortable

1:47:12

with that. This

1:47:14

is all kind of revealed values. And

1:47:17

then you get especially uncomfortable

1:47:19

when someone says something which amounts to two plus two

1:47:21

equals five, like, okay, that's

1:47:23

bullshit. We scan your brain

1:47:26

while you're doing all of that, and we see you're using some

1:47:28

of the same neural structures that you use

1:47:30

when you find, you know, certain smells disgusting,

1:47:33

right, because there are no other areas

1:47:35

of the brain to leverage to have these

1:47:37

kinds of reactions. Because you're an ape

1:47:39

after all, right,

1:47:42

So you're using a very old

1:47:45

toolkit, you know, in

1:47:48

evolutionary terms, to do any of these higher

1:47:51

cognitive things. So

1:47:53

we're navigating. But the

1:47:55

truth is, we have enough that

1:47:59

is abstract and

1:48:02

not merely conforming to the appetites

1:48:05

born of evolution, uh

1:48:08

that allows us to take

1:48:12

something like the view from nowhere, to stand

1:48:14

outside ourselves where we can say, okay, yeah,

1:48:16

we're just apes. We're just,

1:48:19

you know, these

1:48:21

warm and moist,

1:48:24

uh,

1:48:27

meaty things that

1:48:30

crave certain

1:48:32

outcomes. But

1:48:35

here we

1:48:37

have this language game that

1:48:39

is getting interesting enough that

1:48:41

seems to promise that we can stand

1:48:44

outside of this if only for you know, between

1:48:46

the hours of nine and five at

1:48:48

a place like MIT or

1:48:51

Harvard or Stanford or some

1:48:54

institution that for whatever reason has

1:48:56

carved out enough free time, you

1:48:58

know, to put our twenty

1:49:01

watts of brain power toward

1:49:03

problems that aren't immediately relevant

1:49:05

to feeding ourselves and

1:49:07

not dying, and we can have a

1:49:09

conversation that seems

1:49:12

to look back on this creaturely circumstance

1:49:14

of being mere apes, you know, trying

1:49:17

not to die. And we can say,

1:49:20

what should we do when we

1:49:23

can rewrite the firmware

1:49:25

of our nervous systems and do anything

1:49:27

we want? What will it be right to want under

1:49:30

those conditions? When I can change

1:49:32

your moral intuitions and I could make

1:49:34

you a happy member of the Taliban if

1:49:36

you want to be that? Should you want

1:49:39

to be that? When I can make you someone who recognizes

1:49:41

how shitty it is to be a happy

1:49:43

member of the Taliban? Where

1:49:45

can we stand? And what does this

1:49:47

moral realism seem to promise? In

1:49:50

my view, it promises this

1:49:55

moral landscape, which is, okay,

1:49:58

now let's finally admit we have a navigation problem,

1:50:01

and part of our compass is

1:50:04

and part of the problem of

1:50:06

solving this navigation problem

1:50:08

is recognizing that now we can make changes

1:50:10

to our compass itself. Right,

1:50:13

it's not just reading the true north of

1:50:16

I'm an ape born on earth that feels

1:50:18

a certain dopamine rush.

1:50:21

I mean, wow, this is

1:50:23

why what you're saying is amazing, because

1:50:25

it's precisely that metacognition and

1:50:27

mindfulness that you're exhibiting that I

1:50:29

think gives us, as a species, gives

1:50:31

us free will. Okay, but it's amazing.

1:50:34

You're illustrating exactly what

1:50:36

I've been saying. The reason why it's not free will is

1:50:38

because all of it is being pushed

1:50:40

from behind causally, either

1:50:42

deterministically or randomly, or both, such

1:50:45

that at every momentary

1:50:48

instance of navigating and

1:50:50

doing anything at all. Again, just

1:50:52

me getting to the end of this sentence, right, is

1:50:56

fundamentally mysterious, being

1:50:59

driven from behind. And

1:51:03

no matter if

1:51:05

I maintain my current course

1:51:07

or I change it, both are inscrutable.

1:51:11

If I pick up the glass of water with my left

1:51:13

hand or my right hand, either

1:51:15

is inscrutable. If I decide to suddenly

1:51:17

want to learn to play the cello, that

1:51:20

change in me is an absolute

1:51:22

mystery which I cannot account

1:51:24

for. I cannot. But that's

1:51:26

all beside the point. What's really interesting

1:51:29

here is that it's totally compatible

1:51:31

with determinism, and

1:51:33

free will is not compatible with

1:51:36

real determinism. Because if you could say

1:51:38

to someone, listen,

1:51:40

the movie of your life has already

1:51:43

been shot and scored and edited,

1:51:46

it's done, right? There's

1:51:49

a place to stand from which we

1:51:51

know exactly what you are going to say two

1:51:54

years from now to your

1:51:56

wife in this conversation where

1:51:58

you think you're having some kind of epiphany. You

1:52:01

know, that's already written.

1:52:03

We wrote it. We have the dialogue

1:52:06

on our supercomputer. Right. Our

1:52:09

lives are compatible with that. Our

1:52:11

phenomenology, our moment to moment phenomenology

1:52:14

is compatible with that. I'm not saying that's true. I'm not saying

1:52:16

randomness isn't part of the picture. I'm

1:52:18

saying our experience

1:52:21

is totally compatible with that. And

1:52:23

to recognize that experientially

1:52:26

changes things. It feels different,

1:52:30

and it has moral implications,

1:52:32

and it closes the door to hatred and

1:52:35

real hatred, and it does not

1:52:37

close the door to love. And

1:52:40

it makes you someone who can

1:52:44

stand completely free

1:52:46

of certain forms

1:52:48

of psychological suffering that seem

1:52:50

to be an imperative if you don't

1:52:53

experience your mind that way. So

1:52:56

much of this is semantic, because I'm

1:52:59

right on board with your whole life

1:53:01

project. I can see so clearly the

1:53:03

thread that unites all of these. You know, we talked

1:53:05

about religion, you know, and

1:53:08

all this stuff. You know, you want to pull back

1:53:10

the curtain and have us

1:53:12

derive values from some

1:53:15

universal base of truth and reality

1:53:17

as opposed to deriving values from some belief

1:53:19

that's cockamamie, as my grandmother would

1:53:21

say. I totally am on board with all

1:53:23

this. I can see that, semantically, I'm

1:53:25

not going to convince you of some of these things. But it's funny

1:53:28

because I love your project and your mindfulness app.

1:53:30

I am a subscriber to Waking Up. I listened

1:53:33

to your morning meditation today. And

1:53:36

as a human being myself,

1:53:38

my own teachings, with my own courses

1:53:40

I teach, like my Transcend course, all this is very much

1:53:43

in line with what you're teaching, which is helping us

1:53:45

to ascertain the reality of our mind

1:53:47

and understand our patterns

1:53:50

so that we can have the potential to change

1:53:52

in our lifetime. To me, the kind

1:53:54

of stuff you're working on, the mindfulness,

1:53:56

the metacognition, deriving values from facts,

1:53:59

to me, that's worth

1:54:01

calling free will from

1:54:03

a human perspective. Well,

1:54:06

so I mean freedom.

1:54:08

Freedom is definitely something to

1:54:10

value and to aspire to in all of its

1:54:13

guises. Right, So you

1:54:16

know, I think, you know, if someone...

1:54:18

and again, all of this

1:54:20

can be as transcendental

1:54:25

or as prosaic as you want it to

1:54:27

be. But, you know, if

1:54:29

you just take freedom in the context

1:54:32

of any goal, like if

1:54:34

you want to lose weight, it's

1:54:37

better to feel

1:54:40

the kind of free

1:54:42

access to the internal resources that will

1:54:44

allow you to do that with

1:54:46

the minimum amount of suffering.

1:54:49

Right, It's like you're not constantly

1:54:51

racked by

1:54:53

irreconcilable impulses like you want to

1:54:55

lose weight, but you're desperate to eat chocolate,

1:54:58

and then you eat the chocolate and you feel guilty and

1:55:00

you cycle back and forth between that, and then you don't

1:55:03

make any progress, and six months have passed and you're

1:55:05

the same weight, and yet you've spent, you

1:55:07

know, thousands of dollars to join

1:55:09

diet clubs. You're frustrated. All

1:55:12

of that is not as good

1:55:14

as having your capacities

1:55:18

and your aspirations truly aligned, where

1:55:21

it's like, Okay, I've made a decision. I want to lose weight.

1:55:23

I know how to do that. I got to eat at a caloric deficit,

1:55:25

I got to work out more, and I'm going to do

1:55:27

that without any sense of internal conflict,

1:55:29

and it's just going to be a source of joy for me. And the pounds

1:55:31

are just going to fall off hour

1:55:34

by hour. You know. Literally,

1:55:36

there's not going to be an hour over the

1:55:38

next three months where I'm not going to be

1:55:40

losing some amount of weight. I'm going to

1:55:42

be happy the whole time. And three

1:55:44

months from now, I'm going to look in the mirror or look at the

1:55:46

scale and realize I've achieved my goal

1:55:49

without any impediment. Right, those

1:55:51

are two possible experiences. Freedom

1:55:55

is a concept that

1:55:58

is useful

1:56:00

to differentiate those two experiences,

1:56:03

right? You know, in one case, I'm free

1:56:05

to just follow my own advice

1:56:07

without conflict and follow the advice

1:56:09

of others, and to

1:56:12

not be

1:56:14

coerced by my own internal cravings and

1:56:16

addictions, and like I'm not addicted. I'm

1:56:20

I'm free of those kinds

1:56:22

of impulses. But

1:56:24

again, I just think there's something misleading

1:56:26

about invoking this

1:56:28

traditional concept of free will, because I know

1:56:30

where people are starting from. It's

1:56:33

the same problem with self, Like there are ways

1:56:35

to talk about, there are ways to use the word

1:56:37

self that, you know, are

1:56:40

unavoidable. I mean, you know, there's

1:56:42

no problem in talking about oneself

1:56:45

or the self. But what

1:56:48

most people most of the time mean by

1:56:50

the feeling of self is

1:56:52

referring to something that is illusory,

1:56:55

that is a source of real suffering. Right.

1:56:57

It is the feeling of being a subject in

1:57:00

the middle of experience. And when you

1:57:02

lose that feeling... you

1:57:05

know, this

1:57:08

is why this is a good analogy. Actually,

1:57:10

it's the same. It's more than an analogy.

1:57:12

When you lose that feeling, you also

1:57:14

lose the feeling to which this

1:57:16

notion of free will can be attached, right,

1:57:19

Like, that's where it's obvious

1:57:21

that there's nothing

1:57:24

in experience to which

1:57:26

that would refer. It's

1:57:28

obvious that the next thought simply

1:57:31

arises. There's no

1:57:33

other way for it to appear. And

1:57:35

I didn't think it before I thought it, and

1:57:39

I am, in some sense,

1:57:41

maybe the first to know what

1:57:44

it is, but I'm also the last to know what it's like.

1:57:46

And

1:57:49

even with something like speech,

1:57:52

unless I've prepared my speech in

1:57:54

advance, right, unless I have a script

1:57:57

I'm hearing what I

1:58:00

say at the same time you're

1:58:02

hearing it, right, It's

1:58:04

like, I mean, again, there's some delta

1:58:06

here. I mean, granted,

1:58:08

there are kinds of speech acts where it's

1:58:10

not totally scripted, but it's sort of you

1:58:14

know, I have some internal sense of where I'm going as

1:58:16

I'm going there. But rather often

1:58:18

it's just you know, if

1:58:20

I'm thinking of an analogy, if we're talking and

1:58:23

I think, well, just imagine, you

1:58:25

know, imagine the old story of

1:58:27

putting rice, you

1:58:29

know, one grain of rice on the first square

1:58:32

of a chessboard, and you double it each time,

1:58:34

and by the time you get to the end of the chessboard... Like, I

1:58:37

just thought of that old you know

1:58:41

story, I

1:58:43

didn't. I didn't

1:58:46

think of it. I mean, it's not the greatest

1:58:48

example because I sort of thought of it before the

1:58:50

words got out.
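[For reference, a quick worked version of that old chessboard story's arithmetic: one grain on the first square, doubled on each of the 64 squares, comes to \( \sum_{k=0}^{63} 2^k = 2^{64} - 1 \approx 1.8 \times 10^{19} \) grains.]

But you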

1:58:52

know, you and

1:58:54

I are hearing my thoughts at the same time for

1:58:57

the most part when I'm speaking, and

1:59:02

my thoughts aren't evidence of your

1:59:04

free will, right,

1:59:07

Like they're just appearing, right,

1:59:10

And that's what's happening for me

1:59:12

too. And the impulse

1:59:14

again, if I'm going to go for the glass of water: like, a

1:59:16

moment ago, I wasn't thinking of water, wasn't

1:59:19

feeling anything about the water, I wasn't

1:59:21

thirsty. But now I thought it might

1:59:23

be nice to have a sip of water that

1:59:28

came out of nowhere, out

1:59:30

of nowhere. But that link between

1:59:32

thought and action is not

1:59:34

one to one. You know, there's a really interesting paper on whether

1:59:37

smokers have free will. That's the

1:59:39

Baumeister paper, and he found that in almost

1:59:41

every case, people overestimated

1:59:43

the extent to which they wouldn't be able to

1:59:46

quit, or wouldn't be able to exert free will in the face of the urge.

1:59:48

But it turns out that humans have much more self-

1:59:50

control than they realize they're capable

1:59:53

of. You know, obviously,

1:59:55

you don't have to. You know, you can want the water, but if

1:59:57

you suddenly activate your prefrontal

2:00:00

cortex, you could override that and be like, you know what, I'm gonna

2:00:02

wait. But again it's subjective, so

2:00:04

I'm not denying that. Again,

2:00:07

there's a difference between voluntary and involuntary

2:00:10

action. There's a difference between

2:00:13

behavioral self control and lacking

2:00:15

that capacity. Right, Like, let's

2:00:17

say I have goals, you know;

2:00:20

my goal is to stop smoking, but

2:00:22

I'm completely incapable of not

2:00:25

smoking. Right. That's one way

2:00:27

to be. The other way to be is I have

2:00:29

a goal to stop smoking, and I can actually

2:00:33

veto the impulse to smoke when

2:00:35

it comes online, right for the

2:00:37

time it takes me to actually kick the habit. But

2:00:41

that's not all. Again, every

2:00:44

instance of this, like let's say

2:00:46

I'm you know, I'm trying

2:00:49

to stop smoking, and I

2:00:54

I'm able to successfully preempt

2:00:56

the impulse to smoke on Tuesday afternoon,

2:01:00

but Wednesday morning, I reach for a cigarette

2:01:02

and smoke, but I only take one

2:01:04

puff and then I throw it away. Like every

2:01:07

bit of that, the sufficiency

2:01:09

of my strength

2:01:14

of will in one case, my weakness

2:01:17

of will in another case, the fact

2:01:19

that it wasn't so weak that I smoked the whole

2:01:21

cigarette in that case, every

2:01:23

bit of it is being determined

2:01:26

by states of my brain which

2:01:28

I didn't author, which I didn't create, which

2:01:31

I'm subject

2:01:33

to. But my liver is still me

2:01:35

and it gives me absolutely no sense of free will.

2:01:38

If my liver stops functioning... My

2:01:40

liver is working exactly the way

2:01:42

it is in this moment, and no other way.

2:01:45

If it works better tomorrow or stops

2:01:47

working completely on Friday,

2:01:50

I am a mere victim

2:01:53

of those changes or witness to their

2:01:55

consequences. It's not

2:01:57

within the domain of

2:01:59

my autonomy or agency,

2:02:03

but so it is with states of my brain.

2:02:07

So it is with each instance

2:02:09

of neurochemistry in

2:02:12

my brain. And yet that

2:02:14

is producing everything I

2:02:17

experience, including

2:02:20

my preferences and my goals,

2:02:23

and my impulses that are

2:02:25

in conformity with preferences and goals,

2:02:27

and then my sudden subversion of

2:02:29

those goals with some alternate impulse, you

2:02:32

know, the thought, Oh God, wouldn't it be great to have a cigarette right

2:02:34

now? That's

2:02:36

getting piped up from below. And

2:02:40

I'm the one who can

2:02:43

seem to reach for the antidote. I can

2:02:46

seem to say, oh, no, no, no, I'm not going to follow that

2:02:48

thought. I've been taught mindfulness by

2:02:51

Sam and Scott, right. But

2:02:54

the fact that that comes online in that moment

2:02:56

and doesn't in another, right,

2:02:59

that's mysterious. The fact that it comes

2:03:01

online to the degree that it does and

2:03:04

not one degree further is also mysterious.

2:03:07

It's probably dependent on other things that seem

2:03:09

completely adventitious to my character,

2:03:11

like whether I got enough sleep the night before,

2:03:14

or whether I had a full lunch, or, you

2:03:16

know, whether I

2:03:18

got enough sunlight. I mean, like, it's just who knows,

2:03:21

like totally mysterious. Like I'm

2:03:23

an absurdist, so I love

2:03:25

a lot of what we were saying, because I will see things that I

2:03:27

do and I'm like, oh, that was really kind

2:03:30

of predictable that I would do that. But you

2:03:32

know, it's not totally mysterious if we have an understanding of

2:03:34

psychology, of how genes work. You know,

2:03:36

I kind of get why I have the dopamine

2:03:38

drive. You know, I get what's pumping through my head.

2:03:40

I can kind of understand it from a mechanistic level.

2:03:43

Right? It's not like totally [inaudible].

2:03:46

Yeah, well... no, I get it. I guess,

2:03:49

descriptively, in certain cases,

2:03:51

it's not mysterious at all. I

2:03:53

mean, we know causally we

2:03:55

can tell a story about it. But

2:03:59

again, it's just two different levels

2:04:01

of connecting to the phenomenology

2:04:04

here. When I say mysterious, I mean,

2:04:07

like I can move my hand,

2:04:10

right? This is one of the most prosaic

2:04:13

things about me, that I can move my hand. I

2:04:15

can do this. I

2:04:19

have no insight into

2:04:21

how I do this. Right. If

2:04:24

I suddenly couldn't do it, yes,

2:04:26

that would be flabbergasting. But the fact

2:04:28

that I can do it is also flabbergasting.

2:04:30

I have literally no insight. Now

2:04:33

I know something about the neurology

2:04:35

of this, right, I can talk about muscle

2:04:38

fibers, and, you know, actin,

2:04:40

and, you know, the transduction

2:04:42

in motor nerves, and, you

2:04:45

know, acetylcholine. I

2:04:48

can kind of vomit

2:04:51

my concepts onto this experience.

2:04:55

Right. None of that reaches

2:04:57

into the experience, none

2:05:00

of it, right? So this

2:05:03

is irreducibly mysterious.

2:05:07

You know that's interesting because some people have argued that autistic

2:05:09

savants actually are an exception to all this,

2:05:12

and they actually can get inside the module, you

2:05:14

know, in a very deep way that most

2:05:16

of us can't. That's why I wish I knew what the qualia

2:05:18

of an autistic savant were, because I talked to the

2:05:20

late Darold Treffert. He was a dear

2:05:22

friend of mine. He was, you know, the Rain Man

2:05:24

scientific advisor. He spent his whole life

2:05:27

studying these people. And it's interesting

2:05:29

because it seems like some of those autistic capacities,

2:05:32

like to be able to just verbatim

2:05:36

play back something actually

2:05:38

requires the ability to get into the module

2:05:41

consciously, in some way that

2:05:43

the rest of us are not privy to. So I think there's actually

2:05:45

some really interesting neuroscientific

2:05:48

exceptions to some of this. Well, I

2:05:50

wouldn't say so. I'm not saying that you

2:05:52

can't have more and more

2:05:55

fine-grained insight into the experience,

2:05:57

right? So, like, you can learn to pay enough

2:06:00

attention to anything,

2:06:02

a simple motor experience like moving your hand, and

2:06:05

it can break down, you

2:06:08

know, it can become pixelated

2:06:10

in ways that are interesting, right, So

2:06:12

you can become more sensitive

2:06:15

to the

2:06:17

the link between mind and body.

2:06:19

I mean, like, the arising of intention

2:06:21

having a kind of threshold effect

2:06:23

that actually does, you know, trigger

2:06:26

a motor program.

2:06:29

You can become more

2:06:31

sensitive to things we know to be factually

2:06:33

true. Like if you touch a hot stove, you

2:06:36

can actually experience the

2:06:38

reflex component of withdrawing

2:06:40

your hand, so that you actually withdrew it before

2:06:42

you felt the pain. Right, Like

2:06:44

the you know, the road to the amygdala is

2:06:47

actually faster than the road

2:06:49

to sensory cortex that actually

2:06:51

registered the conscious percept of oh my god,

2:06:53

that's too hot, right? So you can notice that;

2:06:55

you can actually become sensitive enough

2:06:57

to notice that first

2:07:00

you withdrew your hand and then you felt

2:07:03

the pain of how hot it was. Right.

2:07:06

So I'm not saying you can't have any insight into this, but

2:07:08

there is still

2:07:11

something. However deep you go into

2:07:14

anything, however atomized

2:07:16

your conscious experience of

2:07:19

phenomena becomes. There

2:07:21

is just simply this fact that

2:07:24

first something wasn't

2:07:26

there, and then it's there.

2:07:28

You can shatter

2:07:30

your

2:07:33

subjective experience down to its atoms and

2:07:37

notice that things are just

2:07:39

appearing out of the darkness,

2:07:41

right sights, sound, sensations, thoughts,

2:07:44

intentions, emotions, or their

2:07:47

micro constituents, in so far as you can find

2:07:49

those with your attention. And again, things

2:07:51

can get incredibly pixelated

2:07:54

when you spend months on retreat doing nothing but

2:07:56

pay attention to mostly

2:08:00

sensory perception, it

2:08:02

breaks down, especially

2:08:04

if you're doing it

2:08:07

strategically in the particular way

2:08:09

so as to look for its kind

2:08:11

of smallest and briefest aspects, which

2:08:14

is one style of meditation. Things

2:08:16

become amazingly pixelated in your body. Like,

2:08:19

you don't feel that you have a body anymore. You have a

2:08:21

you know, a cloud of sensation, you

2:08:23

know, of temperature and pressure

2:08:26

and movement that

2:08:28

just, you

2:08:30

know, doesn't have the shape of a body

2:08:33

at all, right? You don't feel hands. You

2:08:35

feel these

2:08:37

micro

2:08:39

changes of primary

2:08:42

sensation at each moment. But

2:08:45

again, whatever you're noticing

2:08:49

is there, and then it's not there, and

2:08:52

then something else is there, and then it's not there. You

2:08:56

are not doing any of it. That's the crucial point.

2:08:58

You, the one who is witnessing,

2:09:01

aren't doing any of it. But you

2:09:03

can do something about it. That's

2:09:06

the point I'm trying to make. Well,

2:09:09

But whether or not you do in

2:09:11

the next moment is just as

2:09:13

mysterious as this. I

2:09:16

get your point, though. Yeah, I see what you're saying.

2:09:19

It is. Look, let's end here. But you're testing

2:09:21

my limits of free will right now because I'm

2:09:23

starving,

2:09:25

and I'm sure you must be as well. I

2:09:27

want to just agree with you that it's all awe-inspiring.

2:09:30

I walk around constantly in a state

2:09:32

of awe and wonder. My default mode

2:09:35

is curiosity about

2:09:37

everything, even about people I'm supposed to hate. I

2:09:41

actually just witness them,

2:09:41

just like I would witness my consciousness. So I'm

2:09:43

with you on a lot of that. I really can't thank you enough

2:09:46

for coming on today, spending four freaking

2:09:48

hours with me, covering

2:09:50

almost everything about human existence.

2:09:53

And I would still say to be continued, because

2:09:55

I didn't get to the Twitter questions. I didn't get to

2:09:58

the mindfulness thing. I'm not saying we have to

2:10:00

like, have you come back on my podcast, but I just suspect

2:10:02

we'll talk again someday. You know, the conversation

2:10:04

will continue. Yeah, yeah. It's been a pleasure,

2:10:06

Scott. So thank you. Great questions. Thanks

2:10:09

for listening to this episode of the Psychology

2:10:12

Podcast. If you'd like to react

2:10:14

in some way to something you heard, I encourage

2:10:16

you to join in on the discussion at the Psychology

2:10:19

podcast dot com. That's

2:10:21

the Psychology podcast dot com.

2:10:23

Thanks for being such a great supporter of the show,

2:10:26

and tune in next time for more on the mind,

2:10:28

brain, behavior, and creativity.
