Episode Transcript
0:19
Hey, and welcome to What Future. I'm your host,
0:21
Josh Topolsky. And before
0:24
we get into the show, I want to talk about traveling.
0:28
I have a very particular feeling
0:31
about traveling, particularly traveling by
0:33
plane, and having
0:35
just done some traveling, I
0:38
have certain points of dread, I would say, in
0:40
the experience, and I don't know if everybody has these. I don't
0:42
know if everyone experiences travel quite the
0:44
way that I do. Like, I feel like when I'm in an airport
0:47
and I look at people, people seem
0:49
generally. Now this could be a projection
0:52
or whatever, me reading into things, I feel
0:54
like people at the airport are having a better
0:56
time than I am. Like when I look around,
0:59
I feel like everybody is enjoying what
1:01
they're doing or
1:03
enjoying their trip more than I am. For
1:06
me, a trip is just a
1:08
series of hurdles that
1:10
I've got to vault over, a series
1:12
of anxiety portals
1:15
that I must pass through until I get
1:17
to my destination. And I think some of that
1:19
is driven by my, well...
1:23
I think I have a problem with flying because
1:25
it takes me completely out of control of the situation
1:27
and I'm not a good passenger anywhere,
1:30
Like in a car or whatever. I don't like to take trains
1:32
that much. But on a plane
1:34
you feel especially powerless and out of control.
1:36
And you know, of course if a plane typically
1:38
when a plane crashes, which doesn't happen that often, but
1:41
it does happen, it's not like, oh, there were
1:43
some injuries, you know, like a car crash is like, oh the guy
1:45
got his arm broken, or well he had to be he was in the hospital
1:47
for six months, but you know then he recovered, or
1:49
well he'll never walk again, but he's still alive.
1:51
I mean, of course, people die in car accidents, in fact, more
1:54
often than, you know... you're more likely
1:56
to die in a car accident than you are on a plane.
1:58
So, but there's that kind of feeling like, well, if the plane
2:01
does crash, that's it
2:03
for me. I'm I'm toast. And
2:05
then you know, there's the whole thing with seats
2:07
because I'm very tall, and
2:11
uh, it can be very
2:13
uncomfortable. Uh,
2:18
it can be very
2:20
hold on, I'm
2:23
getting a call which
2:26
I need to take. I
2:28
understand this, this is very bad timing. But one
2:32
second, hello.
2:35
Josh, yes, sir,
2:37
you know who this is, right, Well,
2:40
I mean there's only one person who can
2:42
cut right into mind. Who
2:45
is this?
2:46
It's the professor. It's always the professor.
2:49
It's only the professor.
2:50
Professor scientists, Sir, professor scientists,
2:53
Sir professor scientists, sir. Yes,
3:13
I like to take control immediately.
3:15
That's good.
3:16
You know you hear my laugh right now,
3:19
right? And I do so enjoy our conversations,
3:21
both on air and
3:24
off air. Yes, but I will
3:26
tell you this. I come into this conversation today
3:29
with a shit ton of
3:31
anxiety, and
3:34
imagine a ton of
3:36
shit. Okay, any
3:38
type of excrement I've got, I've got
3:40
it, do you really? I
3:43
got it in my mind's eye right now. I've
3:46
got it in my mind's eye, and it's very detailed. It's
3:49
on your front doorstep. It's
3:52
more than the doorstep. It's in your yard. Okay,
3:54
your chowt is about.
3:57
Could be good. I mean you have to stop.
4:00
I held it from going out inquiring what
4:02
is that thing everywhere near that?
4:04
It's, get away from that pile of shit.
4:07
Yeah.
4:07
It's a ton, yeah, of anxiety,
4:10
of anxiety, and I'm not
4:12
enjoying it. And even though I'm giggling about
4:14
it, yeah, I'm very
4:16
serious.
4:17
Well, giggling is just a it's a defense
4:19
mechanism against anxiety. Yeah,
4:22
I feel like I'm doing
4:24
a bad job at everything, and that makes
4:27
me feel anxious. But I'm not so anxious
4:29
that I'm going to do a better job. If that makes you feel
4:31
any better, So.
4:34
I should write that down. That would be like
4:36
a perfect little piece of wisdom
4:38
in your in your book.
4:39
Maybe I should write a book of little little witticisms
4:42
like that.
4:42
Yeah, exactly, all right, my anxiety
4:44
levels for shit?
4:45
What's causing you anxiety? You're you should
4:47
be relaxed, man, I should be
4:50
Yeah, well, I know you've got a lot of stress because
4:52
you've got you've got big projects, you got to deal with big
4:54
people, big personalities.
4:55
It's not just stress. It's
4:58
just I don't know. I think stress
5:00
is more of an external force and
5:03
anxiety is more of an internal force.
5:06
That sounds like something that a very
5:08
smart therapist would say.
5:09
No, I just I just said it. I just blurted
5:12
that out.
5:12
You should write a book of witticisms.
5:15
Oh you want to hear some of my witticisms. Yes,
5:18
I said this to somebody today. I
5:20
can be accused of being a little self righteous,
5:23
which I'm okay with. Yeah, because
5:25
I can be I have a little bit of a bar.
5:27
So yeah, I'll say yeah.
5:29
And I came up with this thought years
5:31
ago that when I
5:33
feel like I have a leg to stand on, which
5:37
to me indicates I'm right,
5:39
I'm just right about it. I'm not always right, but this,
5:42
in a particular moment, it's irrefutable.
5:44
I am right. Somebody's
5:47
wrong, somebody's lying, somebody's
5:49
doing something.
5:50
Yeah, if I.
5:51
Have a leg to stand on, I take
5:53
the leg I'm not standing on and I'll beat
5:55
somebody over the head with it. I
5:57
should write that one down figuratively
6:00
literally, Yeah, figure it. Both
6:02
your legs are attached permanently. Yeah,
6:04
it's not like I'm gonna go beat anybody.
6:06
Up without naming names.
6:08
Can you give me a basic structure of the what's
6:10
causing the anxiety?
6:12
Well, scaffolding, it's uh,
6:16
mortality has always
6:19
been an issue for me.
6:23
Aging mortality
6:25
is an issue for everybody.
6:26
Yeah, but I have an acute
6:29
sense of my mortality, as my
6:31
shrink says, which I do.
6:33
That's interesting, acute.
6:35
But what I've never really been conscious
6:38
of so much so, not
6:40
in a, in a silly way, or
6:43
is the word blithe? Is that, no, not in a blithe
6:45
way, is, uh,
6:48
aging?
6:52
But I've got aging on the brain.
6:54
Now this is this is an interesting topic because
6:56
I have a similar I've
6:58
had a similar thing on, on my mind.
7:00
And I'm not feeling my age, but I'm conscious
7:03
of my age. And that didn't happen until
7:06
a handful of years ago. And
7:08
through the handful of years I had a couple
7:10
dear friends pass away and we
7:13
were all right around the same age. And I've
7:15
been thinking about my age,
7:17
which was not part of my internal
7:20
dialogue right and now it's there
7:23
every day. Why what are you thinking about your age?
7:26
Well, I will say that I
7:28
think that I've always felt and
7:30
as for as long as about as long as I can remember, I've
7:32
always felt that I was sort of like didn't
7:34
have enough time to accomplish all the things that
7:36
I wanted to accomplish, even though
7:39
I'm a master procrastinator and will
7:41
definitely sit on shit indefinitely. But
7:44
what's interesting is I don't think
7:46
at all about at
7:48
least not consciously. I don't spend a lot of time
7:50
thinking about mortality, nor
7:52
do I think about my age. And if I
7:54
have any thought about my age, it is that
7:57
I don't feel dramatically different or
7:59
older than I
8:02
did twenty years ago. By the
8:04
way, I don't either, right, Okay, So
8:06
so now, Laura, my wife, likes
8:08
to point out that we are old all the time,
8:11
like on a regular basis. She will say stuff
8:13
like we're old now.
8:14
Yeah, that's a slippery slope.
8:16
It makes me feel kind of shitty because because
8:18
I don't feel I don't feel old, and
8:20
I don't and I don't identify as like
8:23
an old person, and I don't feel like I have like
8:25
somehow done all the things I'm going
8:27
to do or lived all of the life I'm going to live. But
8:29
it makes me think about it. I'm like, well, am I just
8:32
deluding myself? How deluded am I?
8:35
About who I am and where I sit in
8:37
the kind of spectrum on the spectrum of time?
8:40
So you know, it pops up occasionally. I mean, it's funny
8:42
because every time I talk about my age on the show, I say, oh, I'm
8:44
very close to dying. But it's like it's a joke
8:46
because I'm obviously not. I like to think
8:48
that I'm not. But you know, I might have
8:50
to talk to Laura, I might have to sit Laura down. You
8:52
should you should talk to her about it, because I think she's
8:55
creating a real bad... It creates a really bad vibe.
8:57
It becomes a slippery slope, self
8:59
fulfilling prophecy.
9:01
Well, and I think it's like, why do you think about yourself
9:03
like that? Why think about it in these terms of, like,
9:05
old or young? I don't feel any
9:07
age.
9:08
I never did until about five years
9:10
ago.
9:10
Yeah, and I'm not. Nobody's dying in my world. I
9:12
mean, I'm just just hanging out, just
9:15
living.
9:16
You know, all right, I'm going to jump around
9:18
a little bit, but I'm in a mood today. Here
9:21
we go, Here we go. I saw
9:23
a documentary it's about
9:25
the No No People, which I had never heard of
9:27
before. Okay, but the No No People
9:30
were the Japanese
9:33
West Coast, primarily Japanese people
9:35
who were interned during World
9:38
War two, Right. And
9:40
there was a twenty-eight-question
9:43
questionnaire that was given to these
9:45
to all these people as they're about to be interned.
9:48
As I recall it, like, number twenty
9:50
seven is will
9:54
you take up arms on behalf of the United States?
9:56
Right basically, And then number
9:59
twenty eight was some
10:01
version of do you sympathize
10:06
or side with Japan, right? So
10:08
these people wouldn't answer
10:11
those questions. Hence they answered
10:13
twenty six, but on twenty
10:15
seven and twenty eight they wouldn't answer them, and
10:17
they became known as the no no people right
10:20
now, and they were sent to
10:22
a more harsh internment
10:24
camp in Arizona called Camp Tully,
10:27
right, right, So all of it
10:30
was harsh, but that camp was a little bit more
10:32
harsh, and these people were seen as real
10:34
troublemakers. But in fact, so
10:37
many of the Japanese people, as it turns out,
10:39
were you know, like God
10:41
love them. It's just a you know, this is a
10:44
beautiful race of people. And they were
10:46
like, they weren't rioting. They weren't
10:48
like pulling a January sixth on anybody,
10:51
right, they certainly weren't happy.
10:53
Only the only white people do stuff like that.
10:55
In fact, by the way, only white people do
10:58
that. Because here's my question, if
11:01
we were to think that, okay, because of geography
11:03
and proximity. Okay, so Asian
11:06
and we were often referred to as the West Coast
11:08
Asians, right, So it makes you think probably
11:10
not too many Asians at that time in America on
11:12
the East Coast in relationship to
11:14
the West coast. But if
11:17
we imprisoned all the Japanese
11:19
people, did
11:21
we try and imprison any of the German
11:23
people on the East coast?
11:25
I mean, this is a whole This is a whole different.
11:28
And why did we not? Oh? Because
11:31
they looked like us, because they were a
11:33
bunch of white people.
11:35
This is uh. But also a
11:37
huge amount of Nazi sympathizers in America.
11:40
Oh yeah, I made that. I made that point to Amanda
11:42
earlier in the day.
11:43
A huge amount of it, a bit of it still
11:46
to this day, and, and, and historically
11:48
for sure. Just imagine
11:50
thinking you're the center of the known universe for
11:53
your entire existence, in the existence of
11:55
all of your ancestors, just
11:58
thinking that you control and rule
12:00
all of the earth, and
12:02
discovering that
12:05
that's not the case, And
12:08
how little it must make you feel deep down. I
12:11
mean I can only imagine. I
12:13
mean, I'm a white man, I'm Jewish,
12:15
so I'm a minority. Basically, we're
12:18
a certain kind of minority. We're not we look like I
12:20
look like a white guy. But I definitely will be killed as
12:22
soon as the Nazis get to power.
12:24
Oh I shouldn't laugh at that one.
12:26
If the guys in New York who are the
12:28
Hasidic Jews can identify me and
12:31
ask me if I want to pray with them on
12:33
the street, they'll just pull me over and say
12:35
do you want to? That happens, Oh yeah, it's
12:37
a very common thing that happens in New York. They try
12:39
to get something called a minyan together, which is
12:41
a group of men praying. I've heard of that,
12:44
and at any rate, but if they can identify
12:46
me, I guarantee you whoever, like
12:49
Donald Trump Junior, can identify that I'm Jewish
12:51
as well. But yeah, no, I mean, like, I don't
12:53
know what it's like. I don't know what it's like to feel
12:56
to feel so good about myself. I don't
12:58
know, because I don't think any Jewish person knows
13:00
this, To feel really good about
13:03
like who you are and what you've done, Like, I'm
13:05
not sure that's an emotion I can get in touch with.
13:08
But but the but these Europeans, these
13:10
white people, I mean, they've just been riding
13:12
this wave of success, you
13:15
know, seeming success, and it is very I think
13:17
it's very difficult to to imagine
13:19
a world where you are not the top
13:21
dog. And of course that world is a
13:24
there's only one solution for them,
13:27
because that world's imminent, that world is
13:29
is here. The only solution for them
13:32
is to kill, is to do just like
13:34
to do a genocide, because like there's
13:36
no way, how well you
13:38
can't stop they can't. They know they can't stop
13:41
it. I mean, the ship that's going on in America with these Nazis
13:43
and stuff like actual Nazis and this
13:45
Ronda Santis shit is just a
13:48
pure expression of like the last
13:50
gasp of this kind of European
13:53
U eurocentric white
13:55
culture that they've built up
13:57
and that they think was they thought was forever,
13:59
was going to be forever. And unfortunately
14:02
it's not. Very sad for them. So I mean,
14:04
but you know, they might just, they might just, like, atom
14:06
bomb us, you know, they've done it before.
14:08
You're giving me segues and you don't even
14:10
know it.
14:11
I don't I don't know it. I have no idea what we're talking
14:13
about or why we're talking about it.
14:14
But we're in lockstep today on lots of stuff. Too often,
14:17
often you and I are combative.
14:20
I was gonna say "often." We
14:24
are arguing... that's interesting.
14:26
We like to we like to rib each other a lot,
14:29
But today I'm not in a ribbing mood.
14:31
No, you're too anxiety ridden to rib I'm too
14:33
anxiety too much thinking
14:35
about mortality.
14:36
Did you read the latest report out the BBC
14:38
on AI as if everybody
14:41
in the world isn't talking about AI?
14:43
But no, no, but I'm happy. Let's get engage on
14:45
this. I'm ready. What's the report? Oh, the one
14:47
about how it makes us extinct or
14:49
whatever?
14:49
Yes, even the I think even the guy
14:51
who's created or funding.
14:55
And the guy who runs OpenAI is very
14:58
worried. Interesting marketing tactic, I
15:00
would say, to make your products seem extremely
15:02
powerful and valuable. A couple
15:04
of things on that. First off, like, I
15:06
hope it's I would love to see
15:08
I'd love to be wiped out by an AI personally,
15:11
speaking on just a straight
15:13
up basic level.
15:14
It just lasers you from outer space.
15:17
We'd be so lucky to have the AI turn
15:19
on us and wipe us out.
15:21
Man, How does it wipe you out? Does a laser
15:23
just come from outer space?
15:25
Yeah? Lasers? Sure? Fine, Listen,
15:27
you have to you do have to game it out a little bit,
15:29
right, because today the AI will
15:31
uh it doesn't do a lot. And also
15:34
it's not really AI, it's a it's
15:36
a language model, which is
15:38
because humans are very, very dumb, I think
15:40
we should just first I want to lay this out there.
15:43
Well, we're dumb. We're dumb because I
15:45
mean, storytelling is it
15:48
is what drives humanity. And by the way
15:50
that my career in journalism and
15:52
in other forms of storytelling has been very much
15:55
driven personally by this idea that telling
15:58
a story the right way, or finding this story
16:00
to tell and telling it is like, has huge, huge
16:02
value for people. It can change someone's life,
16:04
it can change someone's mind, it can it can it
16:06
can reorient you in the world, and it's like very
16:09
powerful. Right, So storytelling is
16:11
extremely important, but it is also like the basis
16:13
of all humanity essentially right that we
16:15
are this is the only way that we ever
16:17
get anywhere is by convincing each
16:20
other of a narrative. Right. Like
16:22
gay marriage is a good example. Like right gay
16:24
marriage happened in America largely
16:27
because we convinced, like
16:29
there was a narrative that started
16:31
to make sense to people, that you could tell them a
16:33
story about what marriage
16:35
was and what it meant that was different than the one they
16:37
had been told. Previously, and they could, they could, they
16:39
could understand it, and they'd buy into it. And
16:42
I think so much of that had to do with
16:44
like changing the narrative, like literally
16:46
changing the conversation and the story that we tell.
17:00
What do you think the dude is doing?
17:01
Who?
17:02
Who? What's his name? Stuart?
17:04
Who, Sam Altman? Sam? Runs OpenAI?
17:06
What's what's he doing? When he says,
17:08
Hey, this thing is really dangerous?
17:09
Basically, I think he's saying, my
17:12
company is so valuable and so powerful.
17:14
We've created something that even I don't
17:16
understand the power of. And I
17:18
hope I get I hope you will. Uh, the
17:21
government America, I hope your military
17:23
will invest heavily billions of dollars
17:25
of my technology to use raft to learn
17:28
how to you know, bomb people
17:30
better or whatever. And the NSA would
17:32
want it to help them learn about what people's behaviors
17:35
are. And uh, you know, other companies
17:37
will want to buy. And he's like, well, it's so powerful.
17:39
I don't know. I mean, maybe you can throw me a few bucks.
17:42
We could figure it out together. I think he's
17:44
a master marketer. Uh. And
17:46
but here's what it is. I mean, this model
17:49
tells us stories really well, and we're so
17:51
fucking stupid. We're so dumb.
17:54
And I gotta say just monkey level stupidity
17:56
here with humanity that the
17:59
better the story is, the more believable
18:01
it is to us, regardless of whether it is actually
18:04
believable or has any facts in it. Like
18:06
there's a great story that's been going around about this
18:09
lawyer who presented this brief,
18:11
this dissent or something in a case, and it had
18:14
all cited all of these different cases that
18:16
he had used chat GPT to get
18:18
and all of the cases were made up with like
18:21
extremely detailed citations. But
18:23
every one of the cases was invented by chat
18:25
GPT because its job is to model
18:29
what it believes we want it
18:31
to tell us, right. It models what it thinks
18:33
the next part of
18:36
the conversation is. So we're in
18:38
this state of total panic
18:40
because like, what else could it do? But I think
18:42
to a couple of things there.
18:44
Let me interrupt you before, because when
18:46
you talk about predictiveness
18:50
meets AI meets
18:53
telling a story meets marketing. You
18:55
know, have I ever told you to read this book called If
18:57
Then by Jill Lepore?
18:59
I have been reading it, actually, I started reading it
19:01
because you told me. That's right, we did have that. I, I
19:04
fell off of it. I'm in the middle of it. You guys
19:06
never finished it? It's great, it's great. It's a great
19:08
book.
19:08
It's really a great book. And she's a great writer, but
19:11
terrific, one of the best. What she
19:13
points out and I won't ruin
19:15
it for people, But the way that this
19:18
Marketer, you know, Madison
19:20
Avenue guy got hold of a
19:22
computer, so to speak, and guys who
19:24
knew how to operate the earliest, earliest days
19:26
of computer and how they their
19:29
initial interest was how do we
19:31
get I think it was how do we get a
19:33
person to switch from one cigarette to another?
19:37
And it went through multiple
19:40
hands, political hands, eventually
19:43
started to years
19:45
later predict where
19:47
we were supposed to bomb certain
19:50
villages in Vietnam.
19:51
Right, Kissinger got his hands on it, and it was
19:53
off to the races. Yeah, I assume.
19:56
I don't know, but he's one of the greatest war criminals
19:58
we've ever known.
19:59
I know you I have a passion for him.
20:01
No, I don't really Actually, it's just anyhow he's
20:03
been in the news because he just turned one hundred.
20:06
But god.
20:07
Where was I so the guy, he's a great marketer,
20:09
and he just.
20:11
Talking about the stories, the stories. It's very
20:13
good.
20:13
And then the other guy, the attorney, creates
20:15
all those things.
20:16
But it's just it's just excellent at
20:19
shaping something that feels so
20:21
alive, feels so real, that
20:24
we start to ascribe all these
20:26
qualities to it. Give it all of these give
20:28
it all this power like money in a way, actually,
20:31
you know. And so we're all in a
20:33
panic now about AI wiping out humanity.
20:36
And again I have to say, I strongly advocate
20:38
for it wiping out of humanity at the hands of
20:40
AI if it can, but
20:42
it cannot. I think it's hard
20:45
for us to imagine the
20:47
mind of a person who comes to this not in
20:50
where we have come to it mid mid
20:52
life already having learned a bunch
20:55
of other habits, but somebody who's coming
20:57
to it brand new. Will they be more addicted,
20:59
will they be less addicted? Will they find the things that
21:01
we find uh so fascinating
21:04
less interesting. I think there's a
21:06
as a really like wild kind of set
21:08
of possibilities that have nothing to
21:10
do with anything that we already think that we
21:12
know about how we use these devices Because I don't
21:15
think that we have as a as a species
21:18
actually begun to even
21:20
understand what they do at all. So
21:23
so you're, you know, the counter argument could be, well, that's
21:25
why we're going to blow ourselves up with AI. I
21:28
think right now, the danger that AI poses
21:30
is that it makes things, It makes faking
21:33
things very easy, and misinformation
21:36
more than anything. And by the way, going back to the
21:38
narrative of storytelling, misinformation
21:41
is far more dangerous than the atom bomb in
21:43
a lot of ways, right like the oh
21:45
fucking snore, snore. If you are a professor,
21:48
you're, you're stating the obvious. I can't believe
21:50
that you are here. Am
21:52
I? Do you think the most dangerous
21:54
part is faking, creating falsehoods? So.
21:56
It's now I think at this moment, in
21:58
this moment.
21:59
And do you think that that's actually going to get better
22:01
because it's these fake false
22:03
moments almost just created a civil
22:05
war.
22:06
And you think when where.
22:09
January sixth could have maybe but.
22:11
That AI had nothing
22:14
to do with that. That was just regular old people doing their
22:16
thing.
22:16
Well it wasn't... AI didn't have anything to do
22:18
with it, but false news had something to do with it, right.
22:21
But that's so that's not even a thing. That's a fake
22:23
idea too. That's just like fake news is a fake
22:25
concept.
22:26
But you don't think you just said we're going to be
22:28
able to create more falsehoods. Yes,
22:31
through through an information just
22:33
information.
22:34
Call it what you will, but that's what the fucking Protocols
22:36
of the Elders of Zion is. And
22:38
it's been around forever. It's behold
22:40
a pale horse. It's all this fucking the
22:43
illuminati shit. It's all the same
22:45
thing. It's just a huge pile of
22:47
secrets and misinformation that people that
22:49
a certain segment of the population will fucking buy
22:51
into, just like trickle down economics
22:54
and all that bullshit. I hate to
22:56
do it. I hate to cite this, but there's
22:58
it.
22:59
But go ahead and say it.
23:00
Well, there's a great Trotsky
23:04
right piece of writing.
23:05
You slip when you slip
23:07
into Leon Trotsky. Well, I
23:10
know I'm not I'm not.
23:12
A hardliner or anything, just saying. He
23:15
wrote a piece in nineteen oh one called on Optimism
23:17
and Pessimism, and I think about it all the time. I'm
23:20
just going to read you the last line because it's the last
23:22
line, is the one that's the last two lines are
23:25
the ones that are important. I'm just gonna read them to you. Surrender
23:29
you pathetic dreamer. Here, I am your long awaited
23:31
twentieth century, your future. By the way, this piece
23:33
just details how horrible society
23:35
is in the twentieth century. This is when he wrote in nineteen
23:37
oh one. Right, this is the start of the twentieth
23:39
century. Surrender you, pathetic
23:41
dreamer. Here, I am your long awaited twentieth
23:44
century, your future. And
23:46
the last line is no replies the unhumbled
23:49
optimist, You are only the present.
23:52
I think about this all the time, every day, that
23:54
we are in this mode
23:56
of envisioning, that we are
23:58
in the end state. But
24:00
we're not in the end state. We're like in the Opening
24:03
Innings. We're in the opening innings.
24:05
And I've said this to you before, and I'll see disagree.
24:08
Humanity is not going to be destroyed by
24:11
a global pandemic. Let
24:13
me, let me, it's not gonna be destroyed. And
24:16
I'm agreeing with you by a nuclear conflagration.
24:19
I agree, it's going to be destroyed when we invent a pair
24:21
of shoes that let you jump very high,
24:23
and then it turns out one day
24:26
they go haywire and everybody's legs start flying off
24:28
because the shoes. And that's how we're gonna It's gonna
24:30
be something so fucking stupid and unexpected.
24:34
You know, It's like it's actually like the pandemic. I think
24:36
a lot of a lot of it is like, you know, we
24:38
thought it'd be zombies and fucking buildings
24:41
on fire and nuclear missiles
24:43
and whatever, the robots, or the
24:45
fucking guys from The Matrix, the robots are The Matrix,
24:48
and and what it actually is like you got to sit
24:50
in your house and work. You're not allowed
24:52
to go out. You can't go to the grocery store. That's the
24:54
apocalypse. That's our apocalypse. It's like you've
24:56
got to be on slack with your coworkers
25:00
while you know everybody's getting sick
25:02
around you. Anyhow, Listen,
25:04
I don't know how we got into this. I have no idea what we're
25:06
talking about.
25:06
Let me take something back. No, let me take something
25:09
back, because I said I agree, but
25:11
I don't agree.
25:12
I don't even know. I'm not even sure what
25:14
our topic is.
25:15
No, but you had said it's going to be some you
25:17
know, some fucking tennis shoe
25:19
that gets the tennis shoe. You're
25:21
gonna wipe us.
25:22
Out as your legs are flying off and
25:24
you put them on.
25:25
I have heard you say that before. Yeah, No, I think
25:27
it's going to be bigger than that.
25:29
Maybe. But the thinking about AI is this, like
25:31
it is, it appears
25:34
very scary because it does things that we
25:36
that seem like they are beyond understanding.
25:39
I think that, you know, if you look at what the real the
25:41
the interesting critics have said, and I think
25:44
to Amanda's point, where
25:46
its true danger lies at this point is in
25:48
the misuse of AI by human beings.
25:51
But it's kind of a people
25:53
don't kill people, guns do or whatever argument,
25:56
like yeah, like ultimately a
25:58
person has to pull the trigger, but
26:00
the gun is the thing that lets
26:02
them kill. And I think that,
26:05
like, you know, we can
26:07
have the debate about, you know, what the true
26:09
danger of AI is and and it's
26:11
both things, right, it's both the technology and the
26:14
people. But it's like, but this, at this point,
26:16
we're so early in this game, and
26:19
what it's doing is such a parlor trick, and
26:22
there's no evidence that the parlor trick becomes it
26:24
can become a more elaborate parlor trick. It'll
26:26
become a very sophisticated parlor trick.
26:29
Is the AI sentient? Does
26:31
it have a desire? Does the AI
26:34
want something? No, it
26:36
doesn't, and we don't know that it could ever. We
26:39
have no idea that
26:41
there is us. There's no possibility
26:43
that we could know that you could make a computer
26:45
system that has a desire
26:48
for something. It can do things we tell
26:50
it to do. It can do things that
26:52
it thinks we want it to do, or it thinks it should
26:54
do on its own. But that's not the same thing as
26:56
like a motivating factor, like a like
26:58
a dream. Right, dream
27:01
is not just a random processing of
27:03
information in our brain. It's not just random.
27:05
It's some combination of
27:08
the pieces of information, right, and it's some
27:11
part of us that is putting them together in a certain
27:13
way. People and machines
27:15
aren't like one and the same if you just make a
27:18
machine that's complicated enough. So this idea
27:20
that like someday it will be fucking Skynet
27:23
from the Terminator movies is like kind
27:26
of a weird, bad human fantasy that has
27:28
been I don't know. To me, it feels like
27:31
a little bit of a childish view
27:34
of the technology, because someone like
27:36
James Cameron wrote a movie about
27:38
a machine that becomes sentient and wants to kill humans.
27:41
We have basically decided that that's what the
27:43
machine's going to do. The smarter
27:45
it becomes like, I
27:47
don't know, the machine's probably going to be
27:49
able to see the Terminator and
27:52
it'll probably be like, huh,
27:54
maybe I shouldn't do that. That seems like it ends
27:56
badly, Like it's not a good ending
27:58
for the machines.
28:00
You got a sequel first, and now that badly.
28:03
The best Terminator movie is Terminator three, starring
28:06
Claire Danes, And uh,
28:08
I'll take that one to my grave.
28:10
Wait, my favorite Aliens movie?
28:13
Yeah three?
28:14
Well, that's interesting. Fincher David
28:17
Fincher's first feature film.
28:18
I get shouted down by more people over
28:20
that, but.
28:21
You'll have to agree. You will agree with me. David Fincher's best
28:23
movie is Zodiac. Uh
28:26
Probably, probably. No, there can't be an
28:28
argument there.
28:29
I don't know Aliens three.
28:31
Man, you're saying Aliens three
28:33
is better than Zodiac. That's crazy. That's just
28:35
that.
28:35
Go back and revisit Aliens three out there, all
28:37
right? And Charles, Charles Dutton staring
28:41
down the alien, saying, before
28:44
that lead, come on, it doesn't get better
28:46
than that?
28:46
All right? I do. It does make you want to revisit it.
28:48
I have to tell you that scene.
28:50
Charles Dutton, by the way, one of the great great
28:53
actors, never gets talked about.
28:54
Charles Dutton, Roc. Roc! Whatever
28:56
happened to that show? You don't hear anything about it.
28:58
Like a world class actor?
28:59
On the stage, incredible. Okay, hold
29:01
on, have we gone through all your questions? Uh?
29:04
Well, we didn't really cover as much
29:06
of the Uh what are you laughing at
29:08
me about?
29:09
No? Just I just I want to know what we did cover. I
29:11
just think it's funny.
29:12
Thanks, exactly. Did you, have you ever
29:14
actually sat down and read The Fountainhead or Atlas...
29:17
No.
29:17
I wouldn't read that pornography, The Fountainhead.
29:22
go back shad, who sucks?
29:24
I think I might have said this to you. There's a
29:27
fascinating early Mike Wallace
29:29
interview, black and white, with Ayn Rand.
29:33
oh yeah, it's great, chilling,
29:35
so chilling.
29:36
He really puts it, puts her, you know, gives her
29:38
some tough questions, he.
29:39
Puts her through her paces. But one
29:41
of the most remarkable things about that interview,
29:43
and anybody who's listening. Really,
29:45
go go watch that interview. It will show
29:48
you unless you're an objectivist,
29:50
it'll show you
29:52
what a fraud this person was. A big thinker,
29:54
right, a great brain whatever, basically
29:57
says yeah, I just thought it up. Well,
29:59
I mean I just imagined this thing one
30:01
day when he starts asking her about
30:04
like, well, where do your inspirations come from? What are
30:06
the references? And she's like, just, I
30:08
thought it up.
30:10
Well?
30:11
Is that all ideas? Isn't that all good and
30:13
bad? Just somebody thinking of an idea?
30:15
I suppose. But boy, that that many people
30:17
caught on then and continue to catch
30:20
on. It's just remarkable
30:22
to me.
30:22
Well, people like to hear things that make them feel
30:24
good about the way that they behave. So
30:27
you know the thing about somebody like Ann Rand
30:29
or Ayn Rand, depending on who you talk to, is
30:32
you know, she condones a lot of behavior that's
30:34
basically selfish and shitty and bad. And that's
30:36
what the Republicans do often, right, Like
30:39
it's about protecting your interests
30:41
versus other people's, or thinking about
30:43
a kind of space where other people
30:45
should be considered, which is, you know, in
30:48
essence, the behavior of a child.
30:50
Right, the behavior of a person
30:53
with a very limited range of understanding.
30:56
Which Wallace kind of gets into a little bit there.
30:58
Well, to get it it is.
31:00
I'll tell you you should create it. You should create an app the professor
31:03
recommends. And it's just because you've recommended
31:05
several pieces of content here during
31:07
this conversation, and all
31:10
they all are very interesting.
31:11
Sound by the way, I shot my wad.
31:12
That was it.
31:13
I got three.
31:14
I get multi to those were your those were your three
31:16
recommendations.
31:19
Like I said, that's all I had.
31:21
Well, soon you'll be shuffling off
31:23
this mortal coil and you won't have to worry about recommending
31:25
things to anybody. That was brutal.
31:28
Well, you know, just thinking about
31:31
it, since you brought it up at the beginning of the conversation.
31:33
Were you going to speak at my funeral?
31:35
Am I being asked? I would love to speak at your funeral.
31:37
I've got some big ideas.
31:38
I would love for you to get up and pontificate to
31:41
the point where this is what you hear in the audience.
31:44
Ah.
31:45
I was thinking about doing something a little more like a Carrot
31:47
Top type of routine, something with, like, props.
31:51
That really would be.
31:52
Funny, you know, just pulling some shit out of a bag.
31:55
You know what I'm getting a lot of comfort from in
31:58
the midst of my well. Is
32:00
it existential angst? I'm not sure if it's so
32:03
existential.
32:03
It's more angst or dread. It sounds to me
32:05
more like dread.
32:06
It's more dread. You're right, it's more dread.
32:08
Yeah, be careful, are
32:10
you know? They're close, but they're not exactly the same.
32:13
What I'm getting some comfort from, and I
32:15
mean it is images from
32:17
the Webb telescope.
32:18
Oh yeah, contemplating the vastness
32:21
of reality.
32:22
Talk about a speck.
32:25
Well, that's an interesting one.
32:26
Beyond the colors and the figures.
32:29
Yeah, it's like, oh yeah, there's so much
32:31
out there.
32:32
Well, you got to be careful though, because then you really start
32:35
to feel bad about yourself, about your insignificance
32:37
and the meaninglessness of all of your toils.
32:39
No, that that hasn't That hasn't
32:41
been the reaction. No, I look at it and go,
32:43
oh, should there be
32:46
another space,
32:48
time, continuum matrix
32:50
of it all?
32:51
Yeah?
32:51
Like I want to be out there
32:53
floating in the well.
32:54
Who knows what happens when you when you leave
32:57
your you know, your physical body,
32:59
you know, have you thought about, have you thought about getting
33:01
into video games, though? Maybe if you really feel
33:03
despondent. I don't believe in it.
33:05
You don't believe in it? You don't
33:07
believe in it? We talked about this before. I think
33:09
I recommend gaming to everybody. You don't
33:11
believe in blueberries. I don't believe in video
33:13
I.
33:13
Think I don't believe in blueberries. Is
33:15
this something that came up?
33:16
I believe you said this before. Last time we talked,
33:18
you one day chastised
33:21
me in a pleasant enough way.
33:23
You said this before I recollect
33:26
and I.
33:27
was being foolish for believing in the antioxidants.
33:29
I'd say I chastised you for eating blueberries
33:32
believing in the antioxidant qualities
33:34
of blueberries.
33:35
Oh, yes, yes, I think that's probably some kind
33:37
of scam. I'm gonna have to it's that that to
33:39
me. Whenever I hear whenever somebody.
33:40
Says, that is a scam to you,
33:43
but AI is a positive thing.
33:45
Whenever I hear somebody say, no, I didn't
33:47
say that. I'm just saying that. I think we.
33:48
I think we're going to quote the great mystic
33:51
Mister T: I pity the fool.
33:53
No, whenever anybody says anything
33:55
about a food, no matter what it is, any
33:58
quality the food is supposed to have, I immediately
34:01
think there's a complex
34:03
system of bullshit that led
34:05
to this moment, and and whatever it is.
34:07
I'm sure the blueberries are healthy. I
34:10
have no doubt broccoli is very healthy.
34:12
I'm sure there's all sorts of shit that's really good for you. But
34:14
I just feel like getting putting too much.
34:16
What's too much?
34:17
Too much faith in the ability of a
34:19
single item, a food item.
34:22
You'd rather any
34:24
kind of people behind a kind of meaningful,
34:28
healthful reaction.
34:30
I think it's just like, is misguided. I think
34:32
it's misguided. I don't think you know,
34:34
I don't believe in the Beatles. I just believe in me, I guess
34:36
is what I'm saying.
34:38
God, that's another one. Please write that down.
34:40
No, that's fucking John Lennon said that.
34:42
Oh, oh that's right.
34:43
I think it's actually in a song.
34:44
I don't know enough about the Beatles.
34:46
Excuse me, Oh really, no, I
34:48
think it's I want to say it's in a song.
34:50
I listen to the Beatles, but I don't like I don't
34:52
quote the Beatles clearly.
34:53
Some of Lennon's solo stuff is really pretty
34:55
fucking amazing.
35:07
Do you remember when the Paul McCartney death
35:09
hoax happened?
35:10
Oh he's still alive. Interesting, But do you.
35:12
Remember the DJ in Detroit
35:14
who started that rumor about Paul McCartney.
35:17
Yeah, yeah, I think about AI doing that
35:19
just widespread. Yeah,
35:22
what are we going to do? What are we going to do?
35:23
Buy into it?
35:25
Brace for impact?
35:26
I don't know. You're vacillating now. I don't understand.
35:28
I'm not quite clear.
35:29
Are you joking? I'm joking. What's gonna happen?
35:31
Oh wait, I have a question?
35:32
Shut than shot?
35:36
Please shut up? Please just show do
35:38
you?
35:40
I love listening to you speaking like, shut the fuck
35:42
up?
35:42
I didn't say the F word. Why
35:47
is it, Professor Positive?
35:50
When it comes to AI, it
35:52
just seems to be in the hands
35:54
of diabolical people?
35:56
Yeah?
35:56
And we on the left are
35:59
always, you know, under
36:01
the sword of Damocles,
36:05
and the guys on the right seem
36:07
to use it to their advantage.
36:10
Yeah, because this is the classic bringing
36:12
a knife to a gunfight situation, Like
36:15
democrats are I.
36:16
Mean, but do you really think they're going to stand
36:19
down at some point and say, all right, your turn? Who
36:22
the guys on the right? They're only going to get
36:24
better at it, They're going to get more efficient and brutal.
36:26
Well the fucking but no. But the other side is like the Joker,
36:29
you know, like they're like, what
36:31
is it? No, you know the joker from.
36:33
Batman, or the Joker, the guy they call
36:35
the Joker, the center for the Denver Nuggets?
36:37
The Joker keep going, well, I wouldn't know anything about
36:39
that because I hate sports. You
36:41
know. He's like an agent of chaos, right, He's
36:43
like, he just does whatever the fuck, ever. I think that's
36:45
Uh, you can't have like a rational
36:48
this is this is like the Nazi thing, right, This
36:50
is like the the you can't be tolerant
36:52
of the intolerant, right, you can practice
36:55
tolerance, right, that's a good thing. But
36:57
then when you go, well, but I have to be tolerant
37:00
of people who think that I should not exist,
37:02
Then you reach a kind of like a threshold.
37:05
And I think that, I don't subscribe to that at all.
37:07
That's where I go back to taking
37:09
the leg you're not standing on and beating somebody over the
37:11
head with it.
37:12
Yeah, but there's a whole there's a whole set of
37:14
people that are like, we're trying to
37:16
participate in this like thing called reality
37:19
and in in truth. And remember
37:22
we're not perfect all the time. And there's a bunch of shitty democrats
37:25
and people on the left who are just as stupid and
37:27
bad as people on the right. But there
37:29
are limits to and
37:32
and ways of being that
37:34
they will never go into. They will never be like,
37:37
let's exterminate an entire set
37:40
of people, like they're just not going to say that. They're
37:42
On the other hand, on the flip side, there are people who
37:44
are like, we should have an only like
37:46
a totally white nation, like just
37:48
a crazy, unhinged, fucking anti
37:52
humanity statement.
37:53
Well that does border on extermination
37:56
or right or just like sending
37:58
people off in boats.
38:00
You can't have like a healthy debate with those people.
38:02
Yes, because they're going to take every They're
38:04
going to take every possibility, everything
38:07
they can use to create an environment
38:09
of shit. They're going to do
38:11
it because they don't care. It's
38:13
kind of in sync with a lot of the religious
38:16
thinking of like this world
38:18
doesn't matter, this life doesn't matter, and that there's
38:20
something better waiting for you.
38:22
Yes, absolutely, Mike Pence believes
38:24
in Yeah, we're you.
38:25
Know, it's like why the why do the evangelicals
38:27
follow a guy like Donald Trump? Some
38:30
of the worst terrorists in America are Evangelical
38:32
Christians. Like, why we
38:34
should interrogate that. Yeah, there's it's
38:37
just like any fundamentalist terrorism.
38:39
But as a matter of fact, they want it to come the rapture.
38:42
Yeah, right, exactly, there's this weird uh
38:45
it's a death cult. Right anyhow,
38:47
God, we're way, we're so far afield. I don't even know what
38:49
we're talking about it anymore, but.
38:51
I think it's been fascinating. Oh well, I
38:53
mean, we certainly like to talk, like, top shelf
38:56
stuff.
38:57
I mean, I don't know.
38:58
I can't tell anymore our
39:00
business. You're, you're the Courvoisier to me. You're
39:02
the Courvoisier, okay, of podcasts?
39:05
Is that good? I'd rather be the top shelf.
39:07
What's that very expensive whiskey?
39:10
It's like Pappy Pappy
39:13
van Winkle.
39:14
Oh, there is a thing called Pappy van Winkle.
39:16
Yeah, Pappy Van Winkle is like the most expensive whiskey
39:18
you can buy. It's like a very rare special.
39:21
It's like a it's five thousand dollars a
39:23
bottle or something. Wow. Yeah,
39:25
I'm the Pappy van Winkle of podcasting.
39:28
If this podcast clicks
39:30
like in a big way, it's not going
39:32
to We're just rolling in
39:34
the dough. Yeah, oh, I think I'm
39:36
going to buy you a bottle.
39:38
I guess they haven't, haven't given you the numbers. I
39:41
don't think you're in any danger of buying me a bottle of Pappy
39:43
Van Winkle. Let's put it that way.
39:45
Maybe a shot glass of Pappy.
39:47
Van Winkle maybe, yea, I think that's
39:49
that's possible.
39:50
Do we have anything else? Do you have anything for me?
39:53
For you? Yeah, well I'm
39:55
want to I know, I you know it's I'm scared.
39:58
I'm scared scared, And
40:01
yeah, why? I don't understand, don't
40:04
get it. Such a rube. Listen,
40:06
you're a successful man. You've
40:08
you've more than proven your value. You've more
40:10
than proven your your worth. You've you've made incredible
40:12
things you continue to
40:14
influence to this day. You
40:17
have a circle of friends that love
40:19
and adore you and think the world of you.
40:21
Get a little choked up, get a little misty.
40:23
And employees that fear you, that cower in
40:25
fear. I've seen them in person. They
40:28
absolutely don't even want to walk a little
40:30
bit in front of you because they're they're afraid they'll get
40:32
knocked down.
40:33
Don't look at me, don't eyeball me. I think
40:35
often I say to them, don't eyeball me.
40:37
What I would ask you is what's missing? What don't you
40:39
have? What did you want that you haven't
40:41
gotten? You know, where is it? What is it?
40:43
Oh, God, I wish I could handle that on the air.
40:45
I mean, name one thing that you wanted that you
40:48
haven't gotten.
40:49
Ugh, I can't... A pony in my backyard.
40:51
I often use that expression.
40:53
You could have it, though, nothing stopping you. Nothing stopping
40:55
you.
40:56
By the way, I did say that to a friend of mine one time,
40:59
who I will not name.
41:00
But the next day, there's a pony in your backyard.
41:03
He sent a pony into my office right exactly,
41:05
This is what I'm talking about. He did have a woman bring
41:07
a pony by and parade up and down
41:09
my hallway for about an hour.
41:11
You live the life of a king. What
41:13
is it?
41:14
Just?
41:14
I want you to say, I want to know. I
41:16
really want to know one thing you wanted
41:19
or that you've wanted really badly, truly,
41:21
that you have not been able to
41:22
get. Uh,
41:26
let's end it on this one. Yeah,
41:29
having a dad would have been good. Okay,
41:32
that's... you asked for it. I gave it
41:34
to you. I mean that's
41:37
good.
41:39
As a guy with a father, I got to say, it's not that great.
41:41
I mean
41:43
he's fine. He's fine.
41:44
By the way, wherever your dad is right now, he wins
41:47
and he said to he said to your mom, he's
41:49
like, ugh, just so I
41:51
got a little adjutant my like my
41:53
roop case.
41:54
They the more the more
41:56
criticism in my family, the better, to be honest
41:58
with you. He's embracing it.
42:00
Well, maybe I should spend more time with your dad.
42:02
Well you didn't know your You didn't know your dad at all.
42:04
Oh, let's not go there now, let's.
42:06
Get into it. Now, let's do it. No another two hours,
42:09
solve it all.
42:10
Your mother would be nice to have had. Yeah,
42:13
Like, okay, I was raised in
42:15
a single parent fantastic
42:18
mother household.
42:19
Yeah, and look at look at what it look
42:21
at what it made you. So you wouldn't
42:23
be... if you had a dad, you might not be you, to
42:25
be honest, you know, think about it.
42:27
But mister butterfly effect, look
42:29
at the the thing you be killed they called
42:31
the butterfly effect. Right, yeah, there we go.
42:33
Well, no, but it's true. I mean, I'm sure it drove
42:36
you in all sorts of a different ways, just like all of my
42:38
failings and my needs and wants have driven
42:40
me in different ways.
42:42
Oh, I got one final question for you. Just
42:44
popped in my head earlier in the day. Let's do it early
42:46
in the conversation.
42:47
I'm ready.
42:48
You are king of the
42:50
world. Okay, you're omniscient,
42:53
You're omnipotent.
42:55
All right?
42:56
You are borderline
42:59
godlike?
43:00
Am I immortal?
43:01
No? That has nothing to do with it. Well,
43:04
I mean a god would be immortal, in my opinion.
43:05
I said borderline.
43:07
Okay, I'm omnipotent. I'm omniscient,
43:10
but not immortal.
43:11
Okay, all right, not immortal, borderline
43:14
god?
43:14
Like I have an expiry date.
43:16
You have one thing to fix
43:19
in the world. What would
43:21
you fix one thing? I
43:23
mean, like, can I agonize?
43:24
I mean, a thing? What do you mean, thing?
43:26
Though?
43:26
Like?
43:27
Do you cure cancer? Do you get rid
43:29
of guns? Do you stop war? Do
43:31
you it's only is it only physical things? Or can I
43:33
like remove a component of humanity?
43:35
Oh?
43:35
Yeah, you can do that too. You're near godlike,
43:38
you can do pretty much whatever you want. One
43:41
thing you alter, change, eradicate
43:45
one.
43:46
Here's what I would change, I think. I think it would
43:50
you know, who knows if it was only one thing,
43:52
who knows if it would work. I would
43:54
make it so that every person could
43:57
understand or sympathize with another
44:00
person. I would make it so that everybody was able
44:03
to feel empathetic towards another person.
44:05
That's the thing I would change, that there was a
44:08
sense of empathy for another person whenever
44:10
they interacted with them in
44:12
a desire to empathize with them.
44:16
I think that thing would probably fix a
44:18
lot of our problems in society.
44:20
I don't think like removing guns, somebody
44:22
would just make a laser like you know
44:24
what I mean, Like, they'd make up, they'd use
44:26
bombs, they'd use bows and arrows like or whatever.
44:29
I'm not saying, like, it's, that's the same thing. But removing guns
44:31
would fix a problem temporarily, but not permanently.
44:34
A permanent fix would be like if you could
44:36
empathize with another person, If everybody
44:38
could empathize with the people around them, even
44:40
the people that seem to suck, I think
44:43
it would go a long way, you know, or
44:45
like, you know, I don't know, erase the emotion of hate,
44:47
but I don't think that's I'm not sure that would
44:49
have the result we wanted to be honest.
44:52
Okay, anyhow, I.
44:53
Don't know what would you do if you were omniscient and
44:55
omnipotent, soft
44:58
and cuddly, what would you do? What would
45:00
you your one act?
45:01
My one act?
45:01
Big man?
45:03
Yeah, big man, you sound
45:05
like Dennis Hopper in Apocalypse Now.
45:07
what do you want me to say? He was a wise man?
45:11
What?
45:12
Yeah, I didn't think about you turning the tables
45:14
on me.
45:15
That's right, baby, check it out. You've got to answer
45:17
this fucking question. Now, look at you, you're in the spotlight.
45:19
Well, there's so many things. You know, there's the obvious
45:22
you'd cure diseases, But uh,
45:24
I do think that, as difficult as it
45:27
is to say, there's there's a Darwinian
45:29
nature to life that you you
45:31
maybe shouldn't fuck with. No,
45:34
I would say, honestly, I think it's
45:36
to rein in the internet.
45:41
Yes, I do.
45:41
I mean, like, in comparison, mine's
45:44
way way better. I think it's like not even
45:46
I can't even calculate how much cooler and better.
45:48
My answer was to this, you're
45:51
holier than thou, I mean.
45:54
I love people.
45:55
Look at TikTok last if that's the one thing. God,
45:58
now you're just now you're denigrating. That's
46:01
this is not a nice way.
46:02
If they looked
46:05
at their phone, there was a timer saying, enough
46:07
Internet.
46:07
I think the Internet is the end
46:10
of the world.
46:10
Now you just think that because you're aging.
46:13
I think an unregulated Internet
46:15
is the end of the world. That's what I think.
46:18
Regulated, unregulated, That's not our problem.
46:21
Our problem is us. It's not that thing
46:23
we think. We always want to put it in the object.
46:26
We always want to say it's the thing that makes
46:28
it. So.
46:29
You think people are gonna get smarter, more
46:31
educated.
46:32
People thought when they put radios in cars that people
46:34
were gonna drive off the fucking road because
46:36
they were so mesmerized by the magic
46:38
box producing music that they couldn't steer
46:40
the car. And you know, it probably
46:42
did cause a few accidents, But over time
46:45
we learned to change the channel
46:47
without plowing into a family.
46:49
I'm duly chastised. I'm gonna take
46:51
mine back. I'm gonna take mine.
46:53
back. All right.
46:55
Here is what I wish for the future.
46:57
What I would change, What you would change
46:59
as a godlike creature.
47:01
As a godlike creature, we
47:03
would eat lollipops every day. We
47:07
would all have ponies in our backyard.
47:09
Okay, let's say several things.
47:12
I'm condescending right now to you. We
47:14
would have you'd wish wealth,
47:17
great wealth on everybody, great
47:19
emotional and physical
47:21
monetary wealth.
47:22
Yeah, you're mocking me.
47:24
Now get eight hours of sleep a night.
47:26
You know. It's funny, you're mocking me, but we agree. I
47:29
am mocking you, but we ultimately agree.
47:31
By the way, I'm going to go back, I think there's far
47:33
more empathy in the world than you might think.
47:35
I'm not saying there's there's plenty of empathy,
47:38
just not enough. Just not enough. There
47:40
isn't. If we could... if people were empathetic, they
47:42
wouldn't act the way they act. It's
47:44
hard, it's hard. It's hard to imagine what it's
47:46
like for somebody else. But I think that the
47:48
more we can do that, the better off humanity
47:51
is.
47:51
Actually, you sold me on it. I agree with you.
47:53
I hate to be like all lovey dovey fucking
47:55
hippie.
47:56
It was a little hippie, hippie-ish for you.
47:57
But no, I know I'm disgusted with myself
48:00
for even saying that's just that. I'm like, everybody
48:02
should be horny all the time. You know,
48:04
No, I'm on board now. You sold
48:06
it to me. I I sounded like a
48:09
rube. I sounded like mister
48:11
ten sixty is what I sounded like. And
48:13
uh and then it's just like, just like the
48:15
idea of that. You're like, if we could just limit the internet,
48:18
it'll be like if we can just
48:20
like you know, cancel America online,
48:23
nobody can get an account on AOL.
48:25
We're all set, all right, Okay,
48:28
no more, I'm shamed, almighty.
48:31
Shut It's a very lovely
48:34
It's kind of lovely. It's funny.
48:36
You're like, just put the phone down.
48:38
I'm off.
48:39
Put the fucking phone.
48:40
I'm often described as lovely. All
48:42
right, anything else? When are we next talking?
48:45
I don't know. I hope soon, because
48:47
it's I can't go for too long without a little bit of
48:49
this in my life.
48:50
All right, professor says, how
48:53
do you well?
48:54
Once again, your interruption has been my
48:56
pleasure.
48:58
Okay,
49:04
Well, that that
49:07
that I think is our show.
49:09
I think that we've done far more
49:11
than anyone was expecting to do.
49:14
I had a whole show planned. In fact,
49:16
I was going to spend several hours talking about
49:18
my travel anxiety. But you
49:21
know, I think, obviously, I think what we've learned is there are more
49:23
important things to focus
49:25
on in this in this world and in this life,
49:29
and we focused on some
49:31
of those things just now. I'm
49:33
not sure that. I'm not sure why, I'm
49:36
not sure how, But as
49:38
usual, the Professor made it happen. All
49:41
right. I got to get out of here,
49:43
I've got a lot to think about. But
49:46
we'll be back next week with more What Future,
49:49
And as always, I wish you and your family the very
49:52
best