Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
What's happening, people? Welcome back to
0:02
the show. My guest today is
0:04
Gurwinder Bhogal. He's a programmer and a
0:06
writer. He also happens to be
0:08
one of my favorite Twitter follows.
0:10
He's written yet another mega thread
0:12
exploring human nature, cognitive biases, mental
0:14
models, status games, crowd behavior and
0:16
social media. And it's fantastic.
0:19
So today we get to go
0:21
through a ton of my favorites.
0:23
Expect to learn whether cynical people
0:25
are actually smarter, why people tend
0:27
to find certain outcomes so intolerable,
0:29
whether you would rather lie than say
0:31
what you really think, why people would rather
0:33
be hated than unknown, why appearing to do
0:35
good has become more important than actually doing
0:38
good and much more.
0:41
This guy is so great. This
0:44
must be his sixth or seventh episode I think he's
0:46
had on the show now. And he's just so incisive
0:49
and interesting and unique with the way that he
0:52
goes about things. You should check out his
0:54
Substack. His Substack's great; phenomenal writer, great speaker.
0:56
And yeah, I can't get enough of these ones.
0:58
I hope that you take tons away from this
1:01
because I had an awful
1:03
lot of fun recording it. Also this
1:05
Monday, Dr. Mike Israetel, one of the
1:07
best, if not the best evidence-based training
1:09
coaches on the planet, Doctor of Exercise
1:11
Science. He is a professor at Lehman
1:14
College in the Bronx. And
1:16
he's gonna teach us over the space of two
1:18
hours how to build muscle using science and
1:20
research. And none of that is bro science.
1:23
So yeah, huge few weeks coming
1:25
up, including some massive, massive guests
1:27
next month as well. So get
1:30
ready for those ones. This episode
1:32
is brought to you by Shopify. Shopify
1:34
is the global commerce platform that helps
1:36
you sell at every stage of your
1:38
business. From the launch your online shop
1:41
stage to the first real life store
1:43
stage, all the way to the did
1:45
we just hit a million orders stage,
1:47
Shopify is there to help you grow.
1:49
Whether you're selling scented soap or offering
1:51
outdoor outfits, Shopify helps you sell everywhere.
1:54
From that all-in-one e-commerce platform to their
1:56
in-person POS system, wherever, and whatever you're
1:58
selling, Shopify has got you covered.
2:00
Shopify helps you turn browsers into buyers
2:02
with the internet's best converting checkout. 36%
2:05
better on average compared to other leading
2:07
commerce platforms. You would be amazed at
2:10
how many massive brands you love use
2:12
Shopify. Gymshark, perhaps one of the biggest
2:14
independent sportswear companies in the world, uses
2:17
Shopify and if it is good enough for them
2:19
it is good enough for you. So if you
2:21
are looking to get started at selling something online
2:23
Shopify is the easiest, quickest and most convenient way
2:25
to do it. Plus you can sign up for
2:27
a $1 per month trial period
2:30
at shopify.com/modernwisdom, all lowercase. That's
2:32
shopify.com/modernwisdom to grow your business no
2:34
matter what stage you're in. You might
2:36
have seen that I recently went on
2:39
tour in the UK, Ireland, Dubai, Canada
2:41
and the US and the entire time
2:43
for a full month I didn't check
2:46
a single bag in, because hold
2:48
luggage is a psyop meant to
2:50
keep you poor and late. In fact
2:52
I never need to check bags anymore
2:54
thanks to Nomatic. They make the best
2:57
most functional durable and innovative backpacks and
2:59
luggage that I've ever found. The
3:01
20 litre travel pack and carry-on classic
3:03
are absolute game-changers. They're beautifully designed not
3:05
over engineered and will literally last you
3:07
a lifetime with their lifetime guarantee. Best
3:09
of all you can return or exchange
3:11
any product within 30 days for any
3:13
reason so you can buy your new
3:15
bag try it for a month and
3:17
if you do not like it they
3:20
will give you your money back. Right
3:22
now you can get a 20% discount and see everything
3:25
that I use and recommend
3:28
by going to nomatic.com/modernwisdom
3:30
and using the code modern
3:32
wisdom at checkout. That's nomatic.com/modernwisdom
3:34
and modern wisdom at
3:37
checkout. This episode is brought
3:39
to you by LMNT. I have
3:41
started my morning every single day for
3:43
the last three years the same way
3:45
which is with LMNT in water. It
3:48
tastes fantastic, it reduces muscle cramps and
3:50
fatigue, it optimizes your brain health, it
3:52
regulates appetite and it helps to curb
3:54
cravings. It's got a science-backed electrolyte ratio
3:56
of sodium, potassium and magnesium. Super simple
3:59
there is nothing fancy going on
4:01
here, but it really does work. Also,
4:03
they have a no BS, no questions asked refund
4:05
policy, so you can buy it 100% risk free.
4:09
And if you do not like it, they
4:11
will give you your money back and you
4:13
don't even need to return the box. That's
4:16
how confident they are that you'll love it.
4:18
They are the exclusive hydration partner to Team
4:20
USA Weightlifting and relied on by tons of
4:22
Olympic athletes and high performers in the NFL,
4:24
NBA, NHL, and FBI sniper teams, plus tech
4:27
leaders and everyday athletes around the world. Head
4:30
to drinklmnt.com/modernwisdom to
4:32
get a free sample
4:34
pack of all eight
4:36
flavors with any purchase.
4:38
That's drinklmnt.com slash
4:41
modern wisdom. But now
4:43
ladies and gentlemen, please welcome Gurwinder
4:45
Bhogal.
4:48
Every
5:05
single time, dude, you keep releasing these mega threads
5:07
with cool ideas. I keep loving going through them.
5:09
So today we're gonna go through as many of
5:11
your ideas and some of mine that I've brought
5:13
from home. And we'll see what we can
5:15
get to. First one, cynical
5:17
genius illusion. Cynical people
5:20
are seen as smarter, but sizable
5:22
research suggests they actually tend to be
5:24
dumber. Cynicism is not a
5:26
sign of intelligence, but a substitute
5:28
for it, a way to shield
5:30
oneself from betrayal and disappointment without
5:32
having to actually think. Yeah,
5:36
so this is actually based on
5:39
a pretty large study, which was conducted in 2018
5:41
by Stavrova et al. And
5:47
it's basically what they did was they did a
5:49
series of surveys to test the
5:52
hypothesis that cynical people are
5:55
more intelligent because a lot
5:57
of sort of TV
5:59
popular culture, portrays cynical people
6:01
as intelligent. So you see characters like
6:04
Dr. House, played by Hugh Laurie in
6:06
that show, Sheldon from
6:08
Big Bang Theory. A lot of these
6:10
characters tend to be very cynical, very
6:12
pessimistic, but also geniuses. So it's become
6:14
a bit of a stereotype. So these
6:17
researchers decided to test this by
6:19
actually doing a massive study, which
6:22
involved about 200,000 people in 30 different
6:25
countries. And it was a
6:27
series of surveys, firstly,
6:29
to test the
6:32
cynicism, and secondly, to test
6:34
their competence, essentially their IQ.
6:37
And it was interesting because they actually found the
6:39
opposite of what a lot of people believe, which
6:42
is that cynical people actually tend to be lower
6:45
IQ, or at least lower in
6:47
their performance of cognitive tests. And
6:50
it's actually very interesting because they
6:54
sort of posit, as an explanation for this,
6:56
the idea that cynicism
6:59
is basically an evolutionary
7:02
heuristic to basically save
7:05
people from having to think. It's
7:08
basically a way
7:10
to protect
7:13
yourself against betrayal, to protect yourself
7:16
against any form of
7:18
kind of treachery, including treachery of your
7:20
own expectations. And
7:22
I can see how this would have probably been
7:25
a useful heuristic, say about 100,000 years
7:27
ago. In
7:29
the study, they describe it as the "better
7:32
safe than sorry" heuristic. So
7:34
it's this idea that, for instance, if
7:36
you're out there, and you're in a low information environment,
7:38
so let's go, let's go 100,000 years
7:41
back into the past, right? So we don't
7:43
have the internet, we don't have TV,
7:45
we don't have books, we don't have
7:47
real knowledge, we're in a low information environment, we're in
7:49
the middle of a forest, and
7:51
we see this alien looking fruit on
7:54
a tree. And we have
7:56
the choice whether we can eat it or not eat
7:58
it. And we don't know what this is
8:00
we've got no books, we've got no understanding of
8:02
it, we've never seen it before. So
8:04
in that situation, the best thing
8:06
to do is to default to
8:09
believing that it's dangerous. Because
8:11
obviously, one fruit, if you eat it,
8:13
and it turns out to be harmless, is not going to
8:16
benefit you that much. But if you eat
8:18
that fruit, and it turns out to be poisonous, that's
8:20
the end. So obviously, from
8:22
that point of view, it makes sense to
8:24
have this kind of pessimistic, risk averse sort
8:28
of approach to life. Now, the
8:30
thing is, is obviously the world now is very,
8:32
very different from the world that we had. And
8:35
yet we retain the same basic psychology,
8:37
the same kind of biology, we
8:40
are averse to risk. And that
8:42
involves being very sort
8:44
of distrusting of other human beings. You
8:47
know, because we don't know... The one
8:49
thing that I'm trying to bifurcate
8:51
here, what's the difference between cynicism
8:53
and conservatism or risk aversion, or
8:56
something like that? So cynicism
8:58
is a kind of
9:00
pessimism, but it's a pessimism with
9:03
respect to other people's intentions. So
9:05
it's believing that people are always doing things
9:08
for the worst possible reasons. It's usually you
9:10
can summarize it as saying that people are
9:12
only in it for themselves. You
9:14
know, so basically, you can't trust people,
9:17
basically. So obviously,
9:19
some conservatism
9:22
could be a function of cynicism.
9:25
But I think that obviously, conservatism is much broader
9:27
than that. And it
9:29
takes into account many other different heuristics.
9:31
So the thing with
9:33
cynicism is, it's
9:36
very low cognitive effort. It
9:39
doesn't require you to really expend
9:41
much mental effort to do anything. All you've got to do
9:43
is not trust something, you know, and
9:45
to basically just say to yourself,
9:48
Oh, wow, you know, I
9:50
don't, I shouldn't do this, because something
9:52
bad might happen. And our
9:54
brains are very, very good at finding reasons
9:56
not to do something. So there's, you know,
9:58
there's an idea where if
10:01
you have a hole in your roof, you could reason
10:03
to yourself on
10:06
a sunny day, you don't need to
10:08
repair that hole in your roof. So you just don't do
10:10
it. It'd be like, Oh, what's the point? I don't need
10:12
to do it. It's sunny outside. It's just letting sunshine into
10:14
my house. It's actually a good thing. On
10:16
the other hand, if it's raining, you could also say,
10:18
Oh, wow, it's raining. So I
10:21
don't want to get wet. So I won't go
10:23
out. I might slip from the ladder and fall,
10:25
you know, so your brain is very good at
10:27
inventing reasons not to do things. And so we
10:29
have this natural kind of cynicism. And it actually
10:31
takes mental effort to overcome that. It
10:34
actually takes mental effort. In the study, they
10:36
actually found that people with higher IQs actually
10:38
tend to be more trusting, which
10:40
is quite an unusual thing, you would expect it to
10:43
be the other way around, you'd expect high intelligent people
10:45
to be less trusting, but they're
10:47
actually more trusting. And this is because
10:49
they tend to be, they're not necessarily
10:51
better at determining whether they should trust
10:53
someone or not. But they're better at
10:55
determining whether cynicism is
10:57
warranted or not, which is slightly different.
10:59
Right. But there's this sort of a presumption
11:03
that hoping
11:05
for the best or that believing in
11:08
people is naive. And smart
11:10
people would never be naive. One of the
11:12
worst things that you could do is have
11:14
the wool pulled over your eyes. It's seen
11:16
as kind of juvenile or innocent or unsophisticated.
11:19
And the converse of that
11:21
is, you know, cynicism or skepticism
11:24
is more mature intellectually in
11:26
some way. Yeah,
11:28
I mean, this is sort of like
11:30
a very popular misconception, I think. And
11:32
that's why cynicism is very popular, because
11:34
it has the illusion. Because obviously,
11:37
if you take no risks in life,
11:39
then you're not gonna fail at anything,
11:41
because you never put yourself
11:43
out there. You have this idea that
11:46
I've heard you speak about called the
11:48
Cynicism Safety Blanket, which I think really
11:50
sort of jives with this very well,
11:52
because obviously,
11:55
cynicism is a form of protection.
11:57
It's sort of like this front.
12:00
that you put up, which protects you
12:02
from any risk taking. If
12:05
you don't take any risks, if you don't go out
12:07
there, and if you don't try to succeed anything, then
12:09
you won't fail at anything. So, you know, it's basically
12:11
like a way to guard yourself
12:13
against any form of failure. And
12:15
that's why I think people who
12:18
maybe don't want to expend mental effort or
12:20
emotional effort, because there's an emotional aspect of
12:22
this as well, they will
12:24
instead just choose not to take the risk
12:26
is much easier to just say, Oh, I'm
12:30
not going to take the risk because everything's gone
12:32
to shit, everybody's out for themselves. I'm not going
12:34
to trust this person, I'm not going to love
12:36
this person, because you know, they might betray me,
12:38
they might, they might not return
12:40
the affection, I'm not going to go out and try
12:42
this new thing because I might fail. It's
12:45
much easier just to not do any of that
12:47
stuff. And then you can just say to yourself,
12:49
Oh, well, I've never failed. You know,
12:51
it's like a kind of ego trick that you
12:53
play on yourself. But the thing is,
12:57
the truly sort of intelligent people will say
12:59
to themselves, well, look, yeah, I might
13:01
fail. But at the end of the day, it's
13:03
worth, it's worth trying.
13:05
Because at the end
13:07
of the day, if you don't try, you'll never
13:09
achieve anything, you're not actually going to better yourself,
13:12
you're just going to remain in the same situation
13:14
forever. And even failure can be good. If you're
13:16
intelligent, failure can be good, because you learn from
13:18
failure. In fact, failure is pretty much the
13:20
only thing we learn from, you know, it's the only
13:22
lesson that we learn from, we don't learn when we
13:24
succeed, we don't learn when we're happy. So
13:27
intelligent people will tend to put themselves
13:29
out there, they will risk engaging
13:32
in ambitious endeavors,
13:35
because they know that at the end of the day, even
13:38
if they fail at that endeavor, they're actually
13:40
still improving their station, because they're improving their
13:42
knowledge, they're learning from it. I
13:45
think that's ultimately what it comes down to is, if
13:48
you're not, if you don't have a high
13:51
IQ, you can feign a high
13:53
IQ by criticizing other people, their
13:56
efforts and saying, Oh, look at this fool,
13:58
he failed, you know, whereas you'll never fail
14:00
so you always have that,
14:02
you know, I've never failed, but then you've
14:04
never actually succeeded either. So I think
14:08
it's, it's a guard, it's an emotional guard,
14:10
and it's an intellectual guard. Segal's
14:13
Law. A man with a watch
14:16
knows what time it is. A man
14:18
with two watches is never sure. Ancient
14:20
societies followed a single narrative. Modern
14:23
societies are cacophonies of competing
14:25
narratives. Without trust, more data
14:27
doesn't make us more informed,
14:30
but more confused. Yeah,
14:33
so if you talk
14:36
to a lot of
14:38
these sort of disinformation academics,
14:40
people who study disinformation and stuff,
14:43
they'll often say that there's a
14:45
problem of people not
14:47
getting enough information, you know,
14:49
there's this whole idea of low information
14:51
voters and stuff, you know, that's what
14:53
people tend to euphemistically call people
14:56
that they regard as stupid as low
14:58
information. But the
15:00
thing is, is the problem
15:02
in society at the moment is not actually a
15:04
lack of information. It's a lack
15:06
of trust. That's the bottleneck that is stopping
15:09
progress. Because look, we have, we
15:12
have more information than we've ever had in
15:14
whole of human history. I
15:17
think I read somewhere that sort of, every
15:19
year, more information is produced than in
15:21
all of the preceding years of
15:23
human history. That's how much
15:25
information is exploding. The most exponential
15:27
of exponentials. Yeah, yeah.
15:30
And so information is
15:32
not the problem. We have more than enough
15:34
information. The thing that's holding people back is
15:36
a lack of trust. And I
15:39
think it's got particularly bad since sort of
15:41
the pandemic. Because, you know,
15:43
obviously, our mainstream institutions, which we sort of
15:45
rely on to navigate the world for us,
15:48
they showed that they were flawed during the
15:50
pandemic, you know, for instance, at
15:52
the beginning of the pandemic, the World Health Organization said
15:54
that COVID is not airborne. And if you go on
15:56
Twitter, and you look at that page, the
15:59
tweet's still up, which says that
16:01
COVID is not airborne, but we very quickly found that
16:03
COVID was airborne, and it was actually
16:05
disastrous because people obviously were lulled into
16:07
a false sense of security. So that
16:10
was obviously a big problem. And then we also had
16:12
the problem with the masks, how
16:14
efficacious are they? Then
16:16
there was a problem of vaccines,
16:18
how efficacious are vaccines, what
16:21
are the side effects? And then of course,
16:23
there was the lab leak hypothesis. That
16:26
was instantly dismissed as a conspiracy theory, despite
16:28
the fact that there is at
16:30
least as good an argument
16:33
that COVID escaped from a lab as that it
16:35
was a
16:37
natural spillover. So these
16:40
events, I think really destroyed trust
16:42
in institutions. But I mean, obviously, this
16:45
problem began before COVID, it just
16:47
COVID exacerbated it a lot. And
16:50
obviously, things have not gotten any better since then,
16:52
you know, we've seen, for instance,
16:55
the whole Harvard scandal, the plagiarism scandal. This
16:58
year, we've seen many big academic
17:00
studies, which have been shown to
17:02
be completely bunk. There's a famous
17:05
scientist, his name escapes me,
17:07
but he did a series
17:09
of studies about systemic racism in
17:11
which he basically showed that systemic racism is a
17:13
thing. And this was picked up
17:15
by the New York Times, the Washington Post to
17:17
basically say, hey, look, systemic racism is a real
17:19
thing. Look at these disparities in treatment of white
17:21
people and black people. That was
17:24
all shown to be complete nonsense. It was all
17:26
fabricated, all data was fabricated. And Dan
17:29
Ariely, who's a famous psychologist, his work was
17:31
also found to be fabricated. And ironically,
17:34
there was a Harvard professor who was studying faking
17:37
of information who ended up having her
17:40
own work exposed as fake. You
17:44
know, so this year has been really bad
17:46
for academia. There's
17:48
been a massive drop in trust. And if you
17:50
look at any poll regarding trust in the
17:52
media, you see a gradual downward slope,
17:54
you see decline
17:56
on both sides of the
17:59
aisle, but particularly amongst people on the right because
18:01
obviously there's this idea that most of
18:04
the mainstream institutions in the West lean
18:06
left. But even
18:08
the left have less trust over
18:10
time in institutions
18:12
and obviously this has gotten a lot
18:15
worse over the past few years. So
18:20
the problem with trust is it's
18:22
like a tree where
18:25
it takes a long period of time of nourishment
18:28
and light, seeing what's going on, to
18:31
actually grow it. But it can be
18:33
chopped down in like a day. It takes years for
18:35
a tree to grow but it could be chopped down
18:37
in a single day. Institutions
18:42
over many years, they try to build trust with
18:44
the public but a few real bad
18:48
instances of betrayal
18:50
of that trust have now caused
18:52
the trust to nosedive. What's
18:55
interesting here is this dovetails with what we
18:57
were talking about previously about the cynical genius
18:59
illusion because a lack
19:02
of trust leads to more cynicism.
19:05
And the cynicism stops people
19:07
from doing
19:10
things. People become more risk averse.
19:13
They become less likely to form
19:15
partnerships with people, even to form
19:17
relationships with people. And
19:20
so there's a lot less innovation
19:23
in a sense because people distrust a
19:26
lot of things. You see it in
19:28
our daily lives with
19:30
the ways... Again, I'm not saying
19:33
that this distrust is unwarranted. A lot of it
19:35
is warranted. I mean, if you look at
19:37
what's going on in America and San Francisco and
19:39
places like that, where you see the
19:43
government in San Francisco had an opportunity
19:45
to clean up the streets, to take
19:47
the fentanyl users off the streets, to
19:49
house them in a decent place and
19:52
to try and give them help and to clean
19:55
up the streets generally. And they didn't do it.
19:57
They only did it when the president of China,
19:59
Xi Jinping, came out, you know, they thought, okay,
20:01
now we've definitely got to do something about it.
20:03
So that just showed that they just didn't
20:05
care. Obviously, you know, when there's
20:07
a foreign leader coming to visit, then they suddenly cleaned up
20:09
the streets. So this is obviously, you
20:12
know, this distrust isn't
20:14
necessarily unwarranted. But what's
20:17
happened is the result of this is that people
20:20
tend to no matter how much information
20:22
you give them, no matter how much
20:24
information, the World Health Organization or,
20:26
you know, governments
20:28
or corporations even try to give people,
20:31
the fact is that there's this paucity of trust.
20:35
And to be honest, I don't
20:37
think that this trust is ever going to be fully
20:39
restored. I personally don't
20:42
trust institutions
20:44
anymore. I find that
20:46
it's easier to trust individuals. Now, that's
20:49
what I do. I don't really trust institutions.
20:51
And the reason for this is although there
20:53
are a lot of low integrity individuals, there
20:55
are also a lot of extremely high integrity
20:57
individuals. And it's much
21:00
easier to gauge whether an individual is
21:02
high integrity than whether an institution is
21:04
high integrity. In fact, most institutions tend
21:06
to fall to the level of their
21:08
lowest integrity members. This
21:11
is because corrupt people obviously
21:13
tend to rise high in institutions because they tend
21:15
to be more ruthless, they tend to be more
21:17
dishonest, they tend to play the game. And
21:20
so they, the dishonest people
21:22
rise to the top in institutions.
21:24
People who are trustworthy on their
21:26
own in solitude also
21:28
become untrustworthy due to negligence or
21:30
fear or compliance or the Abilene
21:32
paradox. All of that, all of
21:35
those things happen. So you get
21:37
honest individuals and untrustworthy,
21:41
highly falsified groups,
21:43
even if they're made up of
21:45
trustworthy constituent parts. Yeah,
21:48
that's it. Yeah, it all comes down
21:50
to the sort of perverse incentive structures that
21:52
institutions have, you know, they tend to be
21:54
these sort of closed systems of status games.
21:57
They also tend to be chasing money. And
22:00
a lot of the time, these people are
22:02
playing against each other for status.
22:05
So it leads to purity spirals, for
22:07
instance. And a
22:09
lot of these perverse incentives ensure that
22:11
institutions can never really rise above their
22:14
worst members. Whereas individuals,
22:17
they are a lot more variable. Not
22:20
every individual is more trustworthy than
22:22
every institution. But of the high-integrity
22:24
individuals, they're a lot more trustworthy
22:27
than high-integrity institutions. And
22:30
so I tend to trust individuals a lot more in
22:32
the ways that I learn whether I can trust someone
22:34
or not. I have a few heuristics, but for
22:37
instance, one of them would be, are
22:40
they willing to publicly admit when they get
22:42
things wrong? Because
22:44
it takes integrity to admit when you're wrong, but
22:46
it takes a huge amount of integrity to do
22:48
it publicly. And if you can
22:50
do that, and that's a very rare skill, it
22:53
takes a huge amount of strength to be
22:55
able to go out there and say, okay, I was wrong.
22:58
And so that, for me, is a very good indicator
23:00
that somebody's high-integrity. It shows that they value the truth
23:03
more than their own ego. Do you know what? One
23:06
of my favorite heuristics for this is, when
23:08
was the last time that the person you're
23:10
thinking about surprised you with one of their
23:13
takes? If they
23:15
are very predictable with the things that
23:17
they do, if you know one of
23:19
their views and from it, you can
23:21
accurately predict everything else that they believe,
23:23
they're probably not a serious thinker. They've
23:25
just absorbed some ideology wholesale. What you
23:27
want is someone who you don't always
23:29
necessarily agree with, but definitely you can't
23:31
predict. Like, obviously, most people do fall
23:33
in some sort of grouping of ideologies.
23:36
That's why we tend to have people that, birds
23:38
of a feather. But yeah,
23:40
when was the last time that this person surprised
23:42
you with something that they commented about? Yeah,
23:45
that's definitely one of mine as well, because it
23:47
shows that somebody's willing to sort of think for
23:49
themselves rather than sort of subscribe to a
23:52
total package ideology, which just gives you everything.
23:55
It tells you what to think about abortion, it
23:57
tells you what to think about gun control, it tells you what to think
23:59
about free speech. All of these
24:01
things are generally unrelated, but if somebody's got all
24:03
of these predictable opinions it shows you that they're
24:05
kind of getting it all wholesale
24:07
from someone else. There's something that I
24:09
think is associated with this,
24:11
another one of yours, ambiguity aversion. People
24:14
tend to find uncertain outcomes less tolerable
24:17
than bad outcomes. De Berker et al. (2016)
24:19
found that test participants who were
24:21
told they had a small chance
24:24
of receiving an electric shock exhibited
24:26
much higher stress levels than those
24:28
who knew they'd certainly receive an
24:30
electric shock. Yeah,
24:32
I mean this explains so much, I
24:34
mean everything from sort of the world of
24:36
investing. It explains market
24:38
volatility, but
24:40
it also explains things at a personal level where
24:44
one thing I've found in my personal life is that
24:47
things are never as bad as I think that they're going to be
24:49
pretty much. It's
24:52
a very simple thing, but I
24:54
find that the sort of the anxiety
24:56
of trying
24:59
to expect what's going to happen is often worse
25:01
than the actual, even the worst eventuality.
25:05
For instance, if I were one
25:07
of my old selves from say 10 years
25:10
ago I might be nervous having this conversation with you right
25:12
now knowing that a lot of people are listening. I
25:15
would probably be playing in my head a lot of
25:17
times where it could go wrong. I might say the
25:19
wrong word, I might say something really bad, I might
25:21
say the N word or something accidentally, and then
25:24
that's it. I
25:27
think about the worst possible scenario, right? That
25:31
would really like give me nightmares, but then I would
25:34
find that even if the worst did happen
25:36
it probably wouldn't actually be that, not that
25:38
I'm actually going to say the N word,
25:40
but things are
25:42
always worse in your mind because your mind
25:45
is more terrifying than reality. Your imagination
25:47
is more terrifying than reality. It's
25:49
a more skilled sort of
25:52
scaremonger than reality because it
25:55
knows your worst fears. I think
25:58
when you're uncertain... You
26:00
can often imagine extremely
26:02
bad outcomes because
26:05
in that uncertainty, that's where your
26:08
imagination runs riot. That's one
26:10
aspect of it. With regards to
26:15
the ambiguity aversion that you talk about
26:17
with the electric shocks, again,
26:19
it's managing the anxiety of
26:22
uncertainty that takes a bigger toll on
26:25
somebody than actually just resigning themselves
26:27
to the worst outcome. I
26:30
found that this is again, if I
26:32
know that something is going to happen,
26:35
something bad is going to happen, it gives me a
26:37
sense of peace of mind. Because I
26:40
know what to predict, I know what to expect. And
26:43
so I don't need to expend
26:45
stress and mental effort in
26:48
trying to find
26:50
a way out of it, trying to sort of predict
26:55
what's going to happen. Because trying to predict what's
26:57
going to happen is a very stressful sort
27:01
of thing to do. It basically requires
27:03
you to consider an extremely
27:06
wide swathe of possibilities.
27:09
And our minds are just not very good at doing
27:11
that. If we have just one path ahead of us,
27:14
even if that's a bad path, even
27:17
if it's got a ditch at the end of it, it's much
27:19
easier to just continue along that path and say, okay, so
27:21
when it happens, I'll deal with it, than
27:23
it is to say, okay, which of
27:25
these paths has got a ditch at the end? How
27:30
many steps away is
27:32
it? Every step you take,
27:34
you have to be worried that you might fall
27:36
down that ditch. So it's the stress of
27:39
having to navigate possibility, which
27:41
ends up causing more mental discomfort
27:43
than the actual bad outcome itself.
27:46
Do you think that ambiguity aversion
27:48
explains some of the conspiratorial
27:51
thinking, the doomsday, cultish-like
27:53
fads that we've seen,
27:56
that it actually closes
27:58
down the potential
28:00
ambiguity of the world to one thing. One
28:02
bad thing, but it gives you a sense
28:04
of certainty as opposed to leaving you open
28:06
to ambiguity. Yeah. Absolutely
28:09
one hundred percent. Because I
28:11
think there is one
28:13
thing that's scarier than a
28:15
conspiracy of people plotting everything. And
28:18
that is no conspiracy of people
28:20
plotting everything. That everything is just
28:22
rudderless, that society is rudderless basically and
28:25
nobody knows what they're doing. You
28:27
know? Everybody is just kind
28:29
of trying to navigate the world as
28:31
best as they can. There
28:33
is no overarching plot. That's scary
28:36
also. And so it
28:39
leads to uncertainty. When you don't know what to
28:41
expect, when you can't blame your problems on a
28:43
single thing, that leaves, again,
28:45
so many paths ahead of you that you
28:47
just become overwhelmed.
28:51
The stress of trying to work out which
28:53
path is the true one, that is an
28:56
underrated form of stress, whereas the
28:58
stress of knowing that there
29:00
is a bad group of people
29:02
out there who are planning everything actually isn't
29:04
really stressful. That is actually quite
29:06
interesting. Because then you can go
29:08
online and no longer have any uncertainty about
29:10
it. Yeah, I have this idea called Anxiety
29:12
Cost. So in the same way as you have
29:15
opportunity cost, it's the amount of time you spend
29:17
thinking about the thing that you could have gotten
29:19
rid of had you just done the thing. When
29:21
you wake up in the morning you need to
29:23
meditate, walk the dog, go to work. The longer
29:25
that it takes to meditate, the more times you
29:27
have to have the thought, I still need to
29:29
meditate today. That is a very unhelpful thing to
29:32
do and this is like a protracted version of
29:34
that. There's this from Matthew Syed in The Times
29:36
back in 2020.
29:38
Psychologists have conducted experiments to shed light
29:40
on why people lose or at least
29:42
suspend rationality. One experiment asked people to
29:44
imagine going to a doctor to hear an
29:46
uncertain medical diagnosis. Such people were significantly
29:48
more likely to express the belief that
29:50
God was in control of their lives. Another
29:52
asked participants to imagine a time of
29:54
deep uncertainty when they feared for their
29:57
jobs or the health of their children,
29:59
they were far more likely to see
30:01
a pattern in meaningless static or to
30:03
infer that two random events were connected.
30:05
This is such a common finding that
30:07
psychologists have given it a name: compensatory
30:10
control. When we
30:12
feel uncertain, when randomness intrudes upon our
30:14
lives, we respond by reintroducing order in
30:16
some other way. Superstitions and conspiracy theories
30:18
speak to this need. It is not
30:21
easy to accept that important events are
30:23
shaped by random forces. This is why,
30:25
for some it makes more sense to
30:27
believe that we are threatened by the
30:29
grand plans of malign scientists than the
30:32
chance mutations of a silly little microbe.
30:35
Yeah. Absolutely. I think it
30:37
explains so much about why we
30:40
dramatize reality. We tend to sort
30:42
of turn events
30:44
into stories because it's
30:46
much more orderly.
30:49
If you try to comprehend
30:51
the world as it actually is,
30:55
you might be overwhelmed. You know, there's just
30:57
so many variables going on all
30:59
over the world that we have to
31:01
reduce things down to simple patterns which we
31:03
call stories, in which
31:06
we basically, we simply
31:08
collapse the sort of web of causality
31:10
down to a single thread. And
31:13
that makes life so much easier to sort
31:15
of comprehend, even if it's not sort
31:17
of completely true
31:19
to life, it's true enough
31:21
that we can get on with our
31:23
lives and just kind of, you
31:25
know, not have to worry about it.
31:27
What you're looking
31:29
for with any kind of sort of sense-
31:32
making, truth-making system is: I
31:34
want to be able to move
31:36
through the world with reliable, predictive
31:38
accuracy of what's gonna happen. But what I
31:40
really want, deeper than that, is:
31:42
I just don't want to expend
31:44
that much mental effort trying to
31:46
work out what's going to happen.
31:48
And the difference between those two
31:50
allows this to slip in. Which
31:52
is what mono-thinking is, right?
31:54
If every single problem in
31:56
the world is because of capitalism,
31:58
or climate change, or the
32:00
Illuminati, or whatever. Every
32:02
single problem has the
32:04
same solution. That's because the demand
32:06
for answers outstrips your ability to
32:08
supply them, so you just retrofit
32:10
one answer to all questions. Yeah.
32:13
Absolutely. Again, it's a cognitive
32:15
sort of energy-
32:17
saving mechanism that
32:20
people engage in. And I
32:22
think, yeah, it does
32:24
explain so much of the
32:26
current
32:29
online landscape, and it particularly explains
32:31
tribalism. It's just much easier.
32:33
I saw this
32:35
really good tweet by
32:37
Michael Malice, who has
32:39
been on the show, on your
32:41
show, many times, I think. Yeah. Yeah.
32:44
He said, and I won't get the wording exactly right,
32:46
but it went something like:
32:48
people don't see the world...
32:51
Most people
32:53
don't navigate the world by a
32:55
true and false filter, but by an
32:57
us-versus-them filter. And
32:59
so, like, true
33:01
and false is too much of a cognitive demand.
33:04
Trying to work out what's true and
33:06
false is just way too much effort for
33:08
most people. It requires statistical
33:12
analysis. It requires looking
33:14
at hard data, it
33:16
requires sort of
33:18
suppressing your own emotions. There's so much
33:21
that you need to do in order to
33:23
actually work out what's true. Whereas if you
33:25
just adopt a very simple us-versus-
33:27
them heuristic, it's so much easier and you
33:29
can still get on with your life, because
33:31
if you have an Us versus them and
33:34
sort of strategy then you're going to be in
33:36
the same boat with a group of
33:38
people who will help you you know sort
33:40
of navigate the world, and they'll become your
33:42
allies. So it's just so much less cognitively demanding
33:44
to do that. And pretty much
33:47
everything about our sort of mental architecture
33:50
is configured to
33:52
this sort of system, because
33:54
that's how we evolved. When we
33:56
were hunter-gatherers we lived in
33:58
tribes and we
34:00
engaged in tribal warfare. So everything that
34:02
we've just been talking about, this
34:05
pattern matching and everything, is all
34:07
in the service of tribalism, ultimately.
34:10
So we will see the
34:12
best in our allies
34:14
and we'll see the worst in our enemies;
34:16
we'll interpret everything they do
34:18
in the worst possible way. We will
34:21
see signs in the clouds that sort
34:23
of portend that God's on our side or
34:25
whatever, that he's on our side
34:27
and he hates the
34:30
enemies, you know, whatever it is. We
34:32
will find patterns that justify an us-versus-
34:34
them sort of attitude. You know,
34:36
that's what comes naturally to us. It
34:39
also explains why we see things in
34:41
terms of drama rather than data. And
34:44
drama and data was one of my
34:46
sort of concepts. It talks
34:48
about compassion fade, and this
34:50
idea comes from those
34:54
experiments conducted in
34:56
which people
34:59
engaged in sort of
35:02
these appeals for charity. So what
35:04
they did is like a sort
35:06
of campaign for funding
35:09
for charity and they had two different
35:11
ways of doing it. One way was
35:13
based on presenting statistics and
35:15
hard data. And the other
35:17
was based on presenting the story of a single
35:19
starving girl. And
35:21
people tended to donate a lot more to
35:23
the girl.
35:26
And the reason for this is that hard data
35:28
is alien to the human brain and
35:30
statistics is something
35:33
our brains are not formatted
35:35
for. That kind of data
35:37
analysis, you know, it
35:39
is too much, it requires too
35:41
many calories and too much time. So
35:43
what our brains do is we, again,
35:45
we collapse the web of
35:47
causality, we collapse all the variables into
35:49
a single thread, a
35:51
single line, a single linear sort of
35:53
narrative, which has a beginning, a
35:56
middle and an end. So you'd have:
35:58
this girl is starving, she needs your help; you give her
36:00
your help, she is no longer starving
36:02
and therefore you've saved, you know, you saved a girl,
36:04
and then that's it, and then you're a good person,
36:06
you know. So that's how we sort of,
36:09
we've collapsed the whole world down to these
36:12
single narrative threads and it just makes, because
36:15
obviously we think in the language of story, if
36:18
you want to convince people, that's how you've got
36:20
to appeal to people, you've got to, statistics
36:23
aren't going to help. You can rattle off all the
36:25
numbers you want, you know, and the bigger they are,
36:27
the more alien they are, and the less they'll be
36:29
believed, the less they'll be really comprehended. You
36:32
get the story of a single girl and
36:34
you present her story in a narrative sequence,
36:37
in the way that people tell
36:39
stories, you know. You
36:41
could use the three act structure, you could use
36:43
the hero's journey, whatever system you want, but as
36:45
long as it's a narrative thread, a single narrative
36:48
thread, you'll reach a lot more people. So,
36:51
yeah, we're not donating a million
36:53
times more money or feeling a
36:55
million times worse when we
36:57
hear the story of a million kids compared with
36:59
that of a single kid. In fact, it's
37:02
probably the opposite; the single kid pulls on our heartstrings more.
37:04
Yeah, the personification of data and
37:06
stories, and you can see this, the charity
37:08
example is perfectly right. They are
37:10
split testing into
37:12
oblivion what the most effective way to
37:14
pull on people's heartstrings is. Like they
37:17
know. So if you want to find
37:19
out how to motivate people's behavior, just
37:21
watch a charity advert, because they're not
37:23
doing the thing that doesn't motivate behavior,
37:25
they're doing precisely the thing that motivates
37:28
behavior, and they'll have had behavioral scientists,
37:30
behavioral economics guys, they'll have had Rory
37:32
Sutherland will be in there, and the
37:34
copywriters and all the rest of it,
37:36
split testing, everything, and that's what they've
37:39
arrived at. Right, next one. Preference falsification.
37:41
If people are afraid to say what they
37:43
really think, they will instead
37:45
lie. Therefore, punishing speech, whether
37:47
by taking offense or by
37:49
threatening censorship, is ultimately a
37:51
request to be deceived. Yeah,
37:55
I mean, so this, I
37:58
think is another reason why there's actually a
38:00
distrust in institutions because
38:03
they've tended to react to
38:06
criticism by essentially censoring
38:09
people. But
38:13
censorship is based on a very outdated way
38:16
of operating. It's based on
38:18
a very outdated information architecture.
38:22
So censorship would have worked very well 100 years
38:25
ago when there was a centralized
38:28
authority which passed information
38:30
down to everybody, whether
38:32
it was via printed
38:34
leaflets or television screens.
38:37
Information was very centralized. But
38:42
that system no longer works because the
38:44
reason it worked in the past was
38:46
because the authorities provided a single
38:49
system of information. So for instance, think about the TV.
38:52
In the UK, the TV tended to only have four
38:55
channels originally when I
38:57
was young, very young. And
39:00
those four channels all tended to have the same sort of narrative.
39:02
So if you wanted to censor certain
39:04
information, you could just basically pass a
39:07
law because this
39:09
was broadcast media. So they were beholden to
39:11
government intervention. So you could pass
39:13
a law saying that, oh, the four
39:15
channels are not allowed to talk about this. So
39:18
therefore, now none of that information is going to get
39:20
beamed into people's homes. So now
39:22
nobody can ever know what that information was.
39:26
But that kind
39:28
of sort of centralized information structure
39:30
no longer exists.
39:32
All information in the West, at
39:34
least, is decentralized or
39:37
it's decentralizable in the
39:40
sense that somebody can pick up on
39:42
anything now and make it go viral. So
39:46
now censorship doesn't work. Now what happens is
39:48
people are well aware of what's being censored.
39:51
And you have this thing, obviously, the Streisand
39:53
effect, where when people learn what's
39:56
being censored, then they
39:58
want to know what that thing is even more. you
40:00
know, in the past, like, the further back into the
40:02
past we go, the less
40:04
likely the Streisand effect was
40:07
because people wouldn't even know what was
40:09
being censored since information was centralized. But
40:12
now because information is
40:15
everywhere, that information is going
40:17
to leak out, people are going to know what's
40:19
being censored people are going to know, even if they
40:21
don't know the precise thing that's being censored, they're going
40:23
to know what kind of information
40:25
is being censored from them, because somebody's going
40:27
to spill the beans somewhere because of how
40:29
interconnected everything is. You know, all it takes
40:31
is just one person to spill the beans,
40:34
and then that's going to go viral,
40:36
everybody's going to find out about it. And we see
40:38
this repeatedly, you know, like, for instance, with
40:40
the lab leak... going back to the lab leak hypothesis. Immediately,
40:43
as soon as Facebook
40:46
and Twitter and everybody else tried to stifle
40:48
that story, it went
40:50
viral and everybody was talking about it.
40:52
The Hunter Biden laptop. Yeah,
40:54
that's another perfect example. There's many
40:57
other examples, as soon as one
40:59
organization tries to censor something, other
41:03
individuals will immediately raise the
41:05
alarm. And as soon
41:07
as that happens, everybody now wants to know what that
41:09
thing was censored, they want to know why it was
41:11
withheld from them. This is,
41:14
you know, this thing called reactance, sometimes
41:16
called the backfire effect, where when you
41:18
withhold something, when people can't have
41:20
something, they become even more adamant
41:22
that they want it even more, you
41:24
know. And so this
41:28
leads to essentially
41:30
a backfire, you know, that's what's called the backfire effect.
41:33
And what happens is
41:35
that people then decide
41:38
that hang on a second, if this is
41:40
being withheld from me, then it's obviously important. I
41:42
mean, I'm going a little bit,
41:44
I'm veering off a little bit from the original
41:46
thing. But that's one aspect of it.
41:49
But like, yeah, another aspect of
41:51
this whole censorship thing is that when
41:54
people realize that they can't say certain things,
41:57
they instead will lie. And
42:00
they will then they're not going to
42:03
change their beliefs. Like I said, the backfire effect
42:05
means that if you
42:07
censor people, they're not going to become less
42:09
likely to believe that thing, they're going to become more
42:12
likely to believe that thing, you know. And
42:14
the only thing that's going to change is that if, if
42:16
they know that they're going to get banned for saying
42:18
something, they'll just lie. But they're
42:20
not going to change their thoughts.
42:23
In fact, the opposite is happening. And so
42:25
it's a counterproductive thing to do in the
42:27
digital age. That's why censorship just doesn't work
42:29
in the digital age. Because although people you
42:31
can control what people say online, you can't
42:33
control what they think. In fact, what you
42:35
do is you make people more adamant to
42:38
think what they want to think. So they
42:40
become more entrenched in their beliefs. Well, you
42:42
taught me a couple of episodes ago about
42:44
the chilling effect. When punishment for what people
42:46
say becomes widespread, people stop saying what they
42:48
really think and instead say whatever is needed
42:50
to thrive in the social environment, thus limits
42:52
on speech become limits on sincerity. It seems
42:54
very similar to preference falsification. Is there
42:56
a distinction between the two? Where
42:58
is the difference? Yeah,
43:01
so I mean, they are essentially the same thing. I
43:03
mean, maybe the difference would be something of scale, where
43:06
preference falsification really refers more to
43:08
the individual actions. You
43:11
know, and then you have things like the spiral of silence,
43:13
which is another way of saying the same thing, spiral of
43:15
silences is the cumulative effect
43:17
of preference falsification. So what happens
43:20
is that certain
43:22
ideas become more and more verboten over
43:24
time. And when they become verboten,
43:26
then people don't want to
43:28
be the first person to say them. It's
43:31
just a fancy way of saying forbidden. Okay,
43:33
that's cool. Verboten. Yeah, German.
43:35
For some reason, I don't know why I said verboten when
43:37
I could have just said forbidden. It's nice. I like I
43:39
prefer it. Yeah. But
43:41
like, yeah, what happens
43:44
is that it leads to a spiral
43:46
of silence. So the
43:48
more that an idea becomes unsayable,
43:52
the less likely people are to say it. And
43:54
so the more unsayable it becomes. Yeah.
43:58
So it's kind of like a cycle. Yeah,
44:00
a self-reinforcing system. Yeah. Yeah.
44:03
So yeah, I mean, I just don't
44:05
know what people are thinking like these organizations when
44:07
they think that they can censor information in the
44:09
digital age, it just it just very, very rarely
44:12
works. It might work in a place like China.
44:14
But even in China, right, where the
44:17
government has absolute control, you know, they've got
44:19
the, sort of, what
44:21
they call the Great Firewall. But
44:23
even that is not enough. There have been
44:25
cases now where information has gone viral, but
44:27
the CCP didn't want to go viral, even though
44:29
they were trying to stifle it. And
44:32
in the age of, you know, even
44:34
though they do all they
44:36
can, it just isn't possible because of the number,
44:39
because of how fast information travels in the digital
44:42
age, and because of the number of connections between
44:45
nodes, it's just not
44:47
possible to use censorship anymore. So any
44:49
organization that's trying to use censorship,
44:51
they're using 20th century tactics against
44:54
21st century information systems, it just doesn't
44:56
work. And again, it
44:58
leads to more distrust of institutions. So this
45:00
goes back to this whole thing that
45:03
we're talking about with, you know, the
45:05
the problem of trust in society, and
45:07
it leads to more cynicism. So
45:09
all of it, you know, between
45:11
the backfire effect and the
45:13
whole cynical thing, you know, it just makes
45:15
things worse. And I don't know when institutions
45:18
are going to learn this, but eventually
45:21
they will, hopefully, you end up
45:23
with this kind of
45:25
game where they chase their own tail a little
45:27
bit. So for instance, you see this with YouTube
45:30
channels. So a YouTube channel will begin
45:32
to struggle with plays. And
45:34
they won't be too sure why. And everybody
45:36
has on YouTube, when it comes to the
45:38
way that they frame their episodes and what
45:40
they do both content and framing, they have
45:42
an Overton window that they exist within. And
45:44
they're not prepared usually to go beyond a
45:47
particular level of boring, because people aren't going
45:49
to click. And there's usually an upper bound
45:51
of clickbaitness that they're also not prepared to
45:53
go past, because that seems kind of hacky.
45:55
And what will happen is they will begin
45:57
to skew more and more towards clickbait
46:00
side. They will use more
46:02
limbic hijack words, war, battle,
46:04
imagery, the whole Mr. Beastification of thumbnail
46:10
faces. They'll lean more down that route, but
46:12
the problem that you have, as you
46:14
begin to pull that lever more and
46:16
more to chase ever-declining plays, your
46:19
audience becomes increasingly desensitized to the subtlety
46:21
that you want them to come back
46:23
to. So it's a one-way street. When
46:25
you start to pull that lever. Like
46:27
Russell Brand's channel, regardless of what you
46:29
think about Russell Brand, what he says,
46:31
his content is, I would
46:33
challenge anybody to say that the framing
46:35
around his YouTube channel is fair
46:39
and gentle and reassuring. As someone that
46:41
talks a lot about love
46:44
and you awakening wonders and all this
46:46
sort of stuff, it's like, they are
46:48
coming for your kids. You won't believe
46:50
they did what? It's the most limbic
46:52
hijack. I'm pretty sure it was his
46:54
channel that did that image
46:56
of the Hawaiian laser beam hitting the
46:58
roof of a thing. I'm pretty sure
47:01
that he either created or his team
47:03
created or used something like this. Anyway,
47:06
my point being that you chase that
47:08
sort of limbic hijack game and
47:11
it makes people become increasingly desensitized
47:13
to the things that you can say in
47:16
the same way as institutions that
47:18
feel like they're losing control increasingly
47:20
apply more rigorous high
47:22
levels of scrutiny, high levels of control. What
47:25
happens? It drives the trust
47:27
down ever more. You can't
47:29
dictate trust top down.
47:31
It has to be emergent. It has to come out
47:33
bottom up. But they're chasing it more
47:36
and more. Oh my God, we need to do more
47:38
because the trust is declining and that means that we
47:40
need to use ever more totalitarian techniques to do this.
47:43
It doesn't work.
47:46
And the fact that they think that it's going to
47:48
work actually makes it even harder to trust them because
47:51
they're just so wrong about that. So you ask yourself
47:53
what else are they wrong about? They got to be
47:55
wrong about so many things. If they don't understand this
47:57
basic facet of human psychology, then
48:00
they're pretty much impossible to trust on anything else.
48:02
Yeah. Uh, Herostratic
48:05
fame. Many
48:07
people would rather be hated than
48:09
unknown. In ancient Greece, Herostratus
48:12
burned down the temple of
48:14
Artemis purely so he'd be
48:16
remembered. Now we have
48:18
nuisance influencers who stream themselves committing
48:20
crimes and harassing people purely
48:22
for clout. Yeah,
48:26
so this has become a serious problem now, I
48:28
think. So, I don't know
48:30
if you know who Jack
48:32
Docherty is. I do. This
48:34
sort of world of IRL
48:37
streamers and Jack Docherty tell
48:39
me when I get this wrong. There are
48:41
a few of them. He appears to kind
48:43
of start fights in person
48:46
and has massive bouncers slash
48:49
security guys with him, most of
48:51
whom seem to be black, and
48:53
then they will sort
48:55
out whatever the issue is by punching
48:57
or choking out the person that Jack
49:00
just started some beef with and then
49:02
the internet goes completely crazy by saying,
49:04
dude, started on somebody then got
49:06
his six foot seven behemoth of a security
49:08
guy to step in and smash some kid
49:10
in the face. And now
49:13
he's getting paid millions of dollars and has a
49:15
Lambo and lives in LA or something. Exactly,
49:18
yeah. And he's not
49:20
the only one. I mean, this is a whole
49:22
trend, you know, there's people like Mizzie, for instance,
49:25
you probably know about Mizzie as well, who was
49:27
the guy who was going into libraries and ripping
49:29
up the books, whilst filming the librarians
49:31
to see what they would do. And
49:34
then you have like, Johnny Somali,
49:36
who would go out and start harassing
49:39
people in the streets and recording their reactions.
49:42
And he actually went to Japan and it's quite interesting because he
49:44
first he got knocked out. He got hit
49:46
in the face and knocked out because in Japan, they don't screw around, right?
49:49
And then he got arrested and now he's in jail. At least
49:51
he's in jail. He's
49:53
in jail in Japan, right? So
49:55
there is occasionally there is comeuppance, but I mean, most
49:57
of the time there is no comeuppance for these influencers.
50:00
And they just go out there and they harass
50:02
people in the streets and they record it because
50:04
they know, again, this is limbic hijacking, right? They
50:06
know that just by appealing
50:08
to the worst, most base impulses of the
50:11
human brain, they can get a lot
50:13
of eyeballs. And so, basically, there's a
50:15
lot of
50:17
pressure on young people
50:20
to have a lot of followers on social media, for instance.
50:22
You know, and they want to be popular,
50:24
everybody wants to be the cool kids. And
50:26
one way to get a large following online, if
50:29
you don't have other talents, is
50:31
to just be an asshole, you know. Just
50:33
be an asshole and film people around you,
50:35
and people will get hate followers. They'll
50:37
get hate audiences who watch them simply to
50:39
hate on them. And
50:41
I think you know people like Mizzie and
50:43
Jack Docherty have fallen to this this kind
50:45
of strategy I think I think
50:48
Jack Docherty originally he was just some he
50:50
just did some other lifestyle stuff But
50:53
he obviously found this niche and he thought
50:55
wow making way more when he doing this
50:57
And now he's a millionaire. I mean, he's
50:59
got a lot of money and you know
51:01
He's he's got a very glamorous lifestyle. At
51:03
least it appears glamorous if you look at
51:05
his Instagram account, you know He's surrounded by
51:07
fancy cars and beautiful women and
51:10
all this stuff, you know And he portrays
51:12
this kind of lifestyle of you know,
51:14
I'm a success. But really, when you look at what he
51:16
does to earn that success now He
51:19
just goes out there and he just makes
51:21
life miserable for everybody and this is bad because
51:23
this is creating again This is creating a
51:25
very perverse incentive structure Fueled
51:27
by TikTok, again. And the Chinese government probably
51:30
knows that they're doing this, and they're allowing
51:32
these nuisance
51:34
influencers to get a lot of views
51:36
on TikTok, because they know that it's bad for
51:39
America, and it's bad for the
51:41
UK, and it's bad for the West in general. But
51:44
yeah, I mean so it's a race to the
51:46
bottom now, where you've got a lot of people competing to
51:48
be the most nuisance, to
51:51
be the biggest nuisance, to be the worst possible
51:54
human being. People who formerly were pranksters,
51:56
people like FouseyTube. So
51:58
you'd probably never... Well, yeah, he
52:01
basically had a full-on psychological
52:03
break on camera, got
52:06
arrested by Miami police, called
52:08
the cops on
52:10
himself, pretended that he was someone who had
52:12
a knife or a gun or something. Yeah,
52:15
wild, wild. Exactly. And the
52:17
crazy thing is that we don't even know if
52:19
this was genuine or not. This could have all
52:21
been part of, again, just being a nuisance. It
52:24
might be real, it might not. We don't
52:26
know because the line between real
52:29
and fiction is blurring now.
52:32
And for instance, Mizzy said that all of his pranks
52:35
were planned and stuff. But it's hard to
52:37
believe that he would go into, say, Asda,
52:40
go into a superstore and
52:42
he would start riding on the
52:44
disabled trolley things that they have
52:47
and just smashing shelves and stuff and that the
52:49
supermarkets would actually allow him to do that. It's
52:52
just not, I don't believe that. A
52:55
lot of them will face the flood act to defend themselves if they
52:57
get into a lot of hot water. And
52:59
ultimately, what this does is that this creates
53:01
really bad incentives for kids. Because if you
53:03
think about the past, at
53:05
the dawn of
53:08
YouTube, for instance, in order to
53:10
get a big following on YouTube, you tended to have to
53:12
do something that was extraordinary in some way
53:14
and extraordinary in a positive sense. You
53:17
tended to have to be
53:19
talented at something. The first big YouTubers
53:21
tended to be musicians or
53:25
athletes of some kind, people who had some kind
53:27
of skill. But very
53:29
soon, people realized that you could actually
53:31
develop just as big of a following
53:34
by having zero talent and just being a
53:36
nuisance, just being an asshole to people. And
53:39
once that happened, this kind of nuisance influencing
53:41
went viral. And it's essentially a race to
53:44
the bottom now where people are competing now
53:46
to be the worst possible human being, which
53:49
really sets a bad precedent. It sets
53:51
bad incentives for other young kids watching
53:53
this. Because when the kids watch it, they
53:56
say, oh, I want to be like Mizzy. I want to be
53:58
like Jack Doherty. I want to have all these fancy girls,
54:00
all these fancy cars, I
54:02
want to be like that, so I'm
54:05
going to learn how to be an
54:07
insufferable human being. That
54:11
person is bringing no value to life and they're
54:13
getting rewarded for it. They respond to incentives. Yeah,
54:16
they respond to incentives. If
54:19
you can, rather than working really hard
54:21
at a thing consistently for a long
54:23
period of time, accumulating skills and
54:25
making yourself worthwhile, shortcut it, I
54:27
think that's the reason, in
54:29
part, that there is some
54:32
distaste for silver
54:35
spoon dynasty children and OnlyFans
54:37
influencers. That
54:39
there's something unfair. It
54:43
feels like, well, hang on, you've got that, but you didn't
54:45
work for it. In a meritocratic system,
54:47
which is what we've got, that's always
54:49
going to get people's backs up. I have to
54:52
work harder than this person to get less. How
54:54
can that be fair? Oh, well, it's
54:56
because they were given a privilege that
54:58
I didn't get. That seems unfair. It's
55:00
because they're prepared to compromise their morals
55:03
in some way that I see as something
55:05
I wouldn't do, therefore I am somehow
55:07
superior to them. There's this puritan nobility
55:09
that gets associated with it. When
55:12
we're talking about nuisance influencers, which I think
55:14
is a phenomenal term I've never heard of
55:16
before, and that first sentence that you
55:18
put, many people would rather
55:20
be hated than unknown. Just brilliant.
55:23
I know that you've got two books in the works, one of which
55:25
you may have submitted, but I can't
55:27
wait for both of them, man. All
55:29
of the times I watch very similar stuff to
55:31
what you watch, and yet what you're able to
55:33
pull out of it is significantly more in depth
55:36
than me. I'm very, very excited for what you've
55:38
got coming up. Thank
55:41
you. Here's one that I made earlier.
55:46
Toxic compassion. In a world where
55:48
our opinions have been separated from our deeds,
55:50
appearing to do good has become more
55:53
important than actually doing good. The
55:55
prioritization of short-term emotional comfort
55:57
over actual long-term flourishing motivates
56:00
people to say the things which make
56:02
them appear caring and empathetic, even if
56:05
they result in negative outcomes over time.
56:07
And this is seen most obviously in
56:10
support for the body positivity movement. Rather
56:12
than make someone feel uncomfortable about their
56:15
weight, you would say that weight has
56:17
no bearing on health, even
56:19
if that discourages
56:21
people from losing weight, which results
56:23
in worse outcomes over the long
56:25
term. And the same thing could have
56:27
been seen with defund the police: rather
56:30
than talk about some of
56:32
the challenges that are faced by different
56:34
groups, when it comes to policing,
56:36
you say that all police are
56:38
mistreating minorities, therefore the police
56:40
should be withdrawn, even if
56:42
the actual outcome over the
56:44
long term is poorer
56:48
policing and more negative outcomes for those precise
56:50
minorities that you were looking to protect in
56:52
the first place. Yeah, absolutely.
56:55
Yeah, so this brings together quite
56:57
a few very, very
56:59
interesting and informative ideas, one
57:03
of which would be luxury beliefs, which
57:05
I think you kind of alluded to at the end there. And
57:08
also my idea of the opinion pageant, where
57:12
the whole thing about the social media has
57:15
caused us to sort of overvalue opinions as
57:18
a gauge of character, we're sort of judged more by
57:21
what we say than by what we do. And
57:23
so this goes to what you were saying initially about
57:25
how it's all about looking
57:27
good rather than doing good, which again echoes
57:29
what Elon Musk said, I think
57:31
in a talk, I think with the New York
57:34
Times, a couple of weeks ago, you
57:36
know, where he just expressed a
57:38
bit of outrage at how corporations are trying
57:40
to look good, but not actually doing good.
57:42
And yeah, I think this is one of
57:44
the key concepts to understand the digital age,
57:47
where, because we now have an
57:50
image-oriented sort of
57:53
economy, your success
57:55
in life is based on how
57:58
you appear to others, now more than ever,
58:00
because social media is where
58:02
people come to promote their stuff. Whether you're
58:04
a corporation, whether you're a politician, whether
58:07
you're an influencer, you know, everybody's
58:10
on social media trying to promote themselves,
58:12
trying to show why their brand is
58:14
the brand that you should, you know, sort
58:17
of buy into. And
58:20
part of this is this whole social game this
58:22
new social game I mean obviously there's always been
58:24
a social game as long as there's been a
58:26
society but it's been sort
58:28
of pushed to the forefront by the
58:30
fact that the vast majority of our
58:32
lives now are spent trying
58:35
to appear a certain way to people
58:37
in terms of you know just on social
58:39
media it
58:41
really explains so much of
58:44
everything from sort of cancellation to
58:49
the kinds of politics that we have now
58:51
polarization and even disinformation you know all of
58:54
these things really ultimately come down to people
58:56
trying to look as good as they can
58:58
rather than trying to do as good as
59:00
they can. So people are, you
59:03
know, peddling theories that,
59:06
again, are going to
59:08
hijack people's brains and, you
59:10
know, scaremonger them. Or they're
59:12
trying to convince people
59:14
that they're morally superior so they'll
59:17
you know they'll
59:20
sort of post their luxury
59:22
beliefs online and
59:26
I think that it's
59:28
hard to really work out how
59:30
we go from here where everything is image
59:32
oriented and things are becoming more so um
59:37
I think that ultimately
59:39
there may be some kind of backlash. I
59:43
mean, we're kind of seeing it already, where we've seen
59:45
a kind of backlash, with people just
59:48
going against looking good,
59:51
people counter-signaling. There's
59:53
been a rise of counter-signaling. I
59:55
think that Trump's election in 2016 was a
59:57
form of counter signaling where people
1:00:00
elected the most obnoxious
1:00:02
outwardly, you know, like kind of somebody
1:00:04
who just made no effort to even
1:00:06
appear good, or at least they
1:00:09
did it in a really, really obnoxious and
1:00:11
sort of overbearing cartoonish way, almost
1:00:13
as a parody of the society that we're
1:00:15
living in. I think that
1:00:18
was a kind of counter-signal, but I think that,
1:00:20
yeah, there's also, there's been the rise of vice-signaling
1:00:25
as a response to the sort of prevalence of
1:00:28
virtue-signaling. But even like
1:00:30
vice-signaling is where people will outwardly
1:00:33
just say things that they know are going
1:00:35
to upset people. You
1:00:37
could even say that this nuisance-influencing is a kind
1:00:39
of vice-signaling, where people are like, I don't care.
1:00:42
I'm over and above the
1:00:44
morality game. I don't have to appear good.
1:00:47
I can just be the worst person possible.
1:00:49
People like Andrew Tate, for instance, who have
1:00:52
developed massive followings by saying the opposite of
1:00:54
what is considered good by the majority of
1:00:56
society, you know, you see, with
1:00:59
Elon Musk, Elon Musk is counter-signaling very, very strongly
1:01:01
on Twitter a lot of the time, where he
1:01:03
will say things that are the complete opposite of
1:01:05
what we've been taught, we should say,
1:01:07
by the New York Times, by the Washington
1:01:10
Post, by the World Health Organization, all these
1:01:12
other mainstream organizations. They tell us that we
1:01:14
should be saying these kinds of
1:01:16
beliefs. We should be portraying this kind of person.
1:01:18
We should be, you know, this is how we
1:01:20
should be to be a good person. And then
1:01:23
you've got these rogues like Elon, like Donald Trump,
1:01:25
like Andrew Tate, who are basically saying, no, screw
1:01:28
that. Let's do the opposite of what they say. So
1:01:30
that's the kind of backlash. But in a
1:01:32
strange sense, this vice signaling is itself a
1:01:34
kind of virtue signaling, because it
1:01:37
is signaling to others that you
1:01:39
are way above all
1:01:41
of this silly sort of,
1:01:44
you know, bickering that people are engaging in.
1:01:46
It's the same reason why Yeezys have got
1:01:48
progressively more ugly over time. And if you
1:01:50
actually look at what counts for a lot
1:01:53
of super fashionable streetwear at the moment, it's
1:01:55
almost like hobo chic. Well,
1:01:57
that's because you're saying, look, I have
1:01:59
so much surplus cool in
1:02:02
me that I can basically dress in what
1:02:05
is so orthogonal to what other
1:02:08
people think is cool and still be cool,
1:02:10
that's how cool I am. Which, oddly, because
1:02:12
cool is kind of,
1:02:14
it's just so subjective: if
1:02:16
you call something cool and if enough people agree it kind of
1:02:18
is and no one can falsify whether
1:02:21
it is or not. But yeah, this toxic
1:02:23
compassion thing I've been playing around with for
1:02:25
ages and it's the interesting bit is that
1:02:27
second part is the prioritization of short term
1:02:29
emotional comfort over long term flourishing. Saying
1:02:31
things, you know, you're totally correct. Living
1:02:34
life online has caused us to flatten
1:02:37
down how we are
1:02:39
judged to be about proclamations rather
1:02:41
than actions. And it's the
1:02:43
reason that people were bullied about whether they did
1:02:45
or didn't post a black square, it's about whether
1:02:48
you do or don't have Ukraine in your bio,
1:02:50
it's about whether you do or don't have pronouns
1:02:52
in your email signature, all of
1:02:54
those things. And yeah,
1:02:57
again, it's perverse
1:02:59
incentives, you know, I think that's probably the running theme
1:03:02
of today's discussion is we're
1:03:05
creating all these perverse incentives for
1:03:07
people to follow. And
1:03:10
that's essentially what's driving these behaviors is that
1:03:12
we're rewarding, we're rewarding, like you said, we're
1:03:14
rewarding the short term gains over
1:03:16
the long term, the actual proper gains, which are
1:03:18
the long term gains, we're sort
1:03:20
of trapping ourselves in these, in
1:03:23
these compulsion loops. So compulsion loops are
1:03:26
this idea from gaming
1:03:28
and gamification, where you
1:03:30
trap people in these short term cycles
1:03:32
of effort and reward, that
1:03:35
can often lead them away from what they
1:03:37
should really be doing. And
1:03:39
we're all getting trapped in these compulsion loops, whether
1:03:42
it's being, you know, being
1:03:44
a nuisance being an asshole online, or
1:03:47
whether it's virtue signaling online, you know, we're
1:03:49
kind of all chasing these short term rewards at the
1:03:51
expense, well, not all of us, but I mean, many
1:03:53
of us are, you know, I like to think that
1:03:55
you and I are a little bit better. But
1:03:57
I mean, we're not completely free of it.
1:04:00
I mean, think about how many times anyone
1:04:03
that's ever been on a plane, knowing that
1:04:05
they don't have connection, gets their phone out,
1:04:07
swipes up, cycles through a bunch of apps,
1:04:10
even knowing that nothing can have happened. It's
1:04:12
a compulsion. It's just a, it's ingrained in
1:04:14
there. Right. Next one.
1:04:16
Tazwell's razor. Emotion
1:04:19
causes bias, but it also causes
1:04:21
motivation. As such, we're most likely
1:04:23
to act when our judgment can
1:04:25
be trusted least. So,
1:04:28
don't trust thoughts you have while emotional,
1:04:30
instead pause and wait for the feeling
1:04:32
to pass before acting. Yeah.
1:04:36
So I think
1:04:38
everybody is not
1:04:40
a single person, but it's a collection
1:04:42
of selves. And
1:04:45
some of these selves are much
1:04:47
more representative of who we are
1:04:50
at our core than others. And I
1:04:52
think emotion can bring
1:04:55
out a side of us that is not really us. And
1:04:58
it can cause us to act in ways that we
1:05:00
would later regret. And
1:05:03
I've found this myself, like I don't really do it
1:05:05
anymore, but back in the early days, you know, like
1:05:07
10 years ago, I would
1:05:10
get sometimes I'd get angry online. If
1:05:13
somebody said something nasty to me, and I
1:05:15
would be spiteful and I would say something nasty back.
1:05:18
And I would later read back what I'd written
1:05:20
and I'd just be like, wow,
1:05:23
you know, I can't believe I actually said that. You
1:05:25
know, why would I? I basically was just
1:05:27
as bad as them. Like, you know, I should be,
1:05:29
I should be better. And
1:05:31
I just realized that that person that is saying those
1:05:33
things was not actually me. Because if
1:05:35
I'm later regretting, when I'm calm,
1:05:37
what I actually said when
1:05:39
I was angry, then
1:05:42
it's not really me. One
1:05:44
of the things I say is that, you know, when you act, when
1:05:46
you're emotional, you are an ambassador for your
1:05:49
most primitive self. You're
1:05:51
basically acting for your most animal self, because you're
1:05:53
engaging your reptilian brain. And
1:05:57
any decision that I've made when I've been emotional has
1:05:59
pretty much turned out to be a bad decision. I
1:06:01
mean, or at least it's been suboptimal.
1:06:04
I always make better decisions when I'm
1:06:07
mentally sort of balanced. And I think that's true
1:06:10
of pretty much anybody. But if you send that
1:06:12
email in the spirit of the moment, more
1:06:15
often than not, you're going to think, I could
1:06:17
have worded that better, you know, I could have worded that a lot better. So
1:06:21
what I do now is
1:06:23
it's not like I'm a robot, I do feel emotions,
1:06:26
if somebody says something nasty to me online, I get an
1:06:28
urge to just be nasty back, you know, I get it
1:06:30
like we all do, we're all humans. But I
1:06:32
never do it now. I never, you know,
1:06:35
just reach straight for the phone.
1:06:37
If I reply to somebody, and sometimes I'm snarky, I am
1:06:39
snarky. But I tend to do it in a way
1:06:41
that I think is more productive. But what I always
1:06:44
do is I if I'm feeling particularly emotional, I'll
1:06:46
always wait for
1:06:48
that emotion to pass because it will pass. And
1:06:51
it's amazing how often when
1:06:54
you let that emotion pass. And then
1:06:56
you consider what you were would have done when
1:06:58
you're emotional, you realize how idiotic it would have
1:07:00
been, you know, that's happened to me
1:07:02
so many times that it's I actually am afraid
1:07:04
of acting when I'm emotional. Because I
1:07:07
just realize how, how demented
1:07:09
I am, right? And I think
1:07:11
this is true of everybody. Yeah,
1:07:13
it is deranging because I mean,
1:07:16
emotions ultimately are the
1:07:18
opposite of rationality. They are a
1:07:20
shortcut. There's a thing called the affect
1:07:22
heuristic, which is this idea that
1:07:25
emotions evolved. I
1:07:28
mean, I would say emotions evolve for two purposes. One
1:07:30
of them is they evolve for
1:07:32
motivation. And the other is
1:07:34
that they evolved for decision making in low-information
1:07:36
environments. And, you know,
1:07:39
your feeling, for instance, your gut feeling
1:07:41
is how you make decisions when you don't have enough
1:07:43
information. And
1:07:46
the thing with gut feeling is it's actually often wrong. People will
1:07:48
say, oh, I swear by it, you know, I've got a really good
1:07:50
gut, I've got a really good gut feeling, I'm always, you know,
1:07:52
I always trust my gut. But what they're doing
1:07:54
is they're engaging in confirmation bias. They
1:07:57
usually remember when their gut feeling is right, but they won't
1:07:59
remember when their gut feeling is wrong. And
1:08:02
so they're obviously going to naturally be skewed towards
1:08:04
believing that their gut feeling is more accurate than
1:08:06
it actually is. And that's
1:08:08
why I don't really trust it so
1:08:10
much. I mean, there's something called intuition,
1:08:12
which is a little bit more than gut
1:08:14
feeling, which is more something that you've learned
1:08:17
to trust over time. It's something where you pick up
1:08:20
on cues that you just
1:08:22
see, and then from that, you can build a
1:08:24
fuller picture. But just relying on emotion alone is
1:08:26
usually not a good strategy for decision making. Because
1:08:29
again, emotion favors short
1:08:31
term compulsion loops over long
1:08:33
term compulsion loops. And so this
1:08:40
is why I think you
1:08:42
should always, if you're
1:08:44
going to make an important decision, just
1:08:47
wait for the emotion to pass. It
1:08:49
will pass. Most emotions don't last very
1:08:51
long. Most emotions last a
1:08:54
few minutes. And then they're
1:08:56
usually, they weaken and they fade. And
1:08:58
that's all you've got to do. Just wait a couple of minutes, and
1:09:01
then see, compare your actions when you're
1:09:03
not emotional to how you were going
1:09:05
to act when you're emotional. And
1:09:07
you will realize there's a massive difference. And
1:09:10
that way you will prevent
1:09:12
yourself from many regrets, I think. Yeah,
1:09:27
so we see this online a lot again
1:09:29
with people calling other people names in order to
1:09:46
dismiss anything that they've said. So an
1:09:50
example of this might be calling somebody a bigot,
1:09:53
you know, saying, oh, you're a bigot and
1:09:56
stuff, and basically saying, oh, why does he feel this? Why
1:09:58
does he say this? Oh, it's all because he's
1:10:01
a bigot. And for
1:10:03
many people, that's enough. Oh okay, he's a bigot,
1:10:05
so I don't need to listen to what he says
1:10:07
anymore. But really, what
1:10:09
is bigotry?
1:10:11
Bigotry is not an explanation for behavior.
1:10:14
It's a description of behavior, right?
1:10:16
It's a description, by definition.
1:10:18
It's a statement that somebody is prejudiced
1:10:21
towards somebody, right? So
1:10:23
it's not really an explanation. I mean, you could
1:10:25
use it as a very shallow explanation, but
1:10:27
it doesn't really explain much if you really
1:10:29
want to know. If you
1:10:32
really want an explanation, you've gotta delve a little
1:10:34
bit deeper, you've gotta go a bit
1:10:36
further back. And you've got to say, okay, so
1:10:38
this person's a bigot; that's a description.
1:10:40
So now we need an explanation for why this person
1:10:42
has become a bigot. Why would they say that thing?
1:10:46
And then you would say, okay, it could be
1:10:48
many things. Let's take, for instance, an example of
1:10:50
classic bigotry. So
1:10:52
somebody might, for
1:10:54
instance, hate immigrants.
1:10:58
You know, they might say, I hate
1:11:00
immigrants, I just don't want these
1:11:02
boats to keep coming to our shores, whatever. And
1:11:04
the standard response from many people in
1:11:07
positions of power is to say, that
1:11:09
is bigoted. Move on,
1:11:11
next question, you know. But if you
1:11:13
really want to understand, you could ask yourself
1:11:15
why this person is bigoted. And it
1:11:18
may be a pretty enlightening
1:11:21
answer. It might be that they
1:11:23
had their job taken away,
1:11:25
that they had that job taken away by
1:11:27
immigrants, and now they're out of work
1:11:29
and, you know, on the dole or
1:11:31
on welfare or whatever, and
1:11:33
that their life plans have
1:11:35
been destroyed by the fact that they've
1:11:37
just been superseded by somebody from
1:11:39
another country. Or
1:11:43
it might be that a family
1:11:45
member of theirs was a victim of crime by
1:11:47
an immigrant, you know. So if you
1:11:49
can actually get past the instinct to
1:11:51
dismiss somebody by disguising a description
1:11:54
as an explanation, then you can actually
1:11:56
get to the real explanation and then
1:11:58
you can start to actually resolve the
1:12:00
question. You can actually say, okay, well, so
1:12:02
if this is the case, then I can
1:12:05
go out there and I can convince this
1:12:07
person that hang on a
1:12:09
second, immigration might
1:12:11
have taken your job, but some immigrants also create
1:12:14
jobs or whatever. I mean, I'm not going to
1:12:16
go into the whole debate of whether immigration is good
1:12:18
or bad. But this is just
1:12:20
an example of what somebody could do. You
1:12:22
know, so maybe, if
1:12:25
you were interested in getting people
1:12:27
to accept immigrants, if you were one of these people,
1:12:30
that's basically what you could do. And you could do that, instead
1:12:32
of dismissing them and making them
1:12:34
hate immigrants even more, which
1:12:36
is going to happen, you know, if
1:12:39
you dismiss somebody's concerns, they're only going
1:12:41
to react. Again, it's what we
1:12:43
talked about earlier: reactance, the backfire effect. If
1:12:45
you tell people that their opinions are bigoted, it's
1:12:47
not going to stop them from being bigoted, it's going to
1:12:49
make them more bigoted. And it's going to, you
1:12:51
know, they're going to start thinking, oh, there's a conspiracy now,
1:12:54
you know, there's a conspiracy by
1:12:56
the Jews to, you know, flood
1:12:59
the West with immigrants, and
1:13:01
all this. And these people are calling me a
1:13:03
bigot because they're trying to destroy my life, because
1:13:05
they don't want the truth to come out. So
1:13:07
it's always going to basically just
1:13:09
have a negative effect for everybody, it's just going
1:13:11
to make things worse for everybody. And that's why
1:13:13
these semantic stop signs are bad, because they don't
1:13:15
resolve the question. They don't really, they don't solve
1:13:17
anything, they just make the problem worse. And
1:13:20
that's why I don't call people racist, I don't
1:13:22
call people bigoted, I don't call people transphobic. What
1:13:25
I do is I might, I
1:13:27
might call something they've said bigoted. I
1:13:31
don't really even do that. But if I were if I
1:13:33
were going to use the word bigoted, because I don't like
1:13:35
the word bigoted, I feel it's overused. I don't like the
1:13:37
word racist, I feel it's overused. I don't think
1:13:39
these words really mean anything anymore. But if I were
1:13:41
going to use those words, I
1:13:44
wouldn't call people racist, I wouldn't call
1:13:46
people bigoted, I would call their actions
1:13:48
bigoted, I'd call their actions racist. Because
1:13:50
I think that's much more helpful. Because
1:13:53
if you call somebody bigoted, or call them racist,
1:13:55
or call them transphobic, or sexist, or misogynistic, or
1:13:57
fascist, or any of these other words that are thrown
1:14:00
around so casually these days. If you use
1:14:03
those terms to describe a person,
1:14:05
you're essentially implying that that person
1:14:07
is irredeemable, that you can't help
1:14:09
that person because they're a
1:14:11
lost cause because they're just a bigot.
1:14:14
Whereas if you call their actions bigoted, if
1:14:16
you call their actions racist or transphobic, and
1:14:18
I'm not advocating this, but I'm just saying
1:14:20
it's better than calling them a bigot. Because
1:14:23
if you call their actions bigoted, that actually
1:14:25
allows you to still see them as a
1:14:27
human. Because I feel that calling somebody a
1:14:29
racist is actually dehumanizing in a sense. Especially
1:14:32
when you consider that
1:14:35
terms like fascist, Nazi, a lot of
1:14:37
these terms are used to
1:14:41
paint people as the worst possible human beings. Because when
1:14:43
you think of the term fascist, when you think of
1:14:45
the term Nazi,
1:14:48
racist, when you think of these terms, you think of
1:14:50
pretty much the worst human beings. You think of the
1:14:52
Nazis, the
1:14:55
Nazis of Germany in the 1930s. You
1:14:58
think of the Ku Klux Klan, you think of really
1:15:00
bad human beings, you think of people who lynched black
1:15:03
people, you think of the worst
1:15:05
human beings. So it's dehumanizing in a sense
1:15:07
because you're portraying people as villains. You're saying
1:15:09
this person is a villain, so
1:15:12
I can just discount everything that they say. Whereas
1:15:15
when you call their actions bigoted
1:15:17
or whatever, then you can say, okay, well, we
1:15:19
can actually convince this person to behave differently. So
1:15:23
I think these semantic stop signs are a very
1:15:27
harmful aspect of our society.
1:15:30
And that's just one example that
1:15:32
I just gave you. We have many other examples in
1:15:34
which these kinds of questions
1:15:36
that people have are just dismissed
1:15:40
by disguising descriptions as
1:15:42
explanations. Yes.
1:15:49
Mac's Content Razor. So this is from
1:15:51
mutual friend George Mac. Would
1:15:53
you consume your own content? If
1:15:55
not, don't post it. And
1:15:57
it's just the easiest
1:16:00
way to work out whether or not what
1:16:02
you're producing is actually something you should
1:16:04
continue producing. And I had a similar idea,
1:16:07
a tangential idea about your content diet: if we
1:16:09
presume that your body is made up of
1:16:11
what you put in your mouth, your mind
1:16:13
is made up of what you put in
1:16:16
your eyes and ears. Your content diet should
1:16:18
be nourishing food for your soul, not fast
1:16:20
food for your amygdala. Yeah.
1:16:24
One hundred percent agree.
1:16:24
I'm very selective now about the kind
1:16:27
of
1:16:29
content I consume. I used to be
1:16:31
very careless, I used to just mindlessly
1:16:33
browse my Twitter feed, and whatever got my
1:16:35
attention, you know, I would follow it. But
1:16:37
the thing is, I found that just leads
1:16:40
to a lot of wasted time and very low
1:16:42
information, because by and large, social media is not
1:16:44
very information-dense. Your feed is
1:16:46
probably a lot of the time junk for the mind, because,
1:16:48
really, I only follow about a hundred people,
1:16:50
and I'm quite strict about who I follow, so I
1:16:53
never really browse my feed, I usually use
1:16:55
lists. But yeah, I mean, yeah,
1:16:59
I do absolutely go by that
1:17:01
razor, because I find that it's
1:17:04
a good heuristic to use. One of the
1:17:06
reasons why I originally wrote those megathreads,
1:17:06
those megathreads on Twitter, was
1:17:08
because they were the kinds of things I wanted
1:17:10
to read. I wanted to
1:17:13
learn about the world, and I thought it would
1:17:15
be a good exercise: if
1:17:17
I can get like forty concepts that
1:17:19
are very useful, then I think
1:17:21
people will understand the world better. It was
1:17:23
actually the kind of content I would love, but
1:17:25
nobody was doing it at the time, as far as
1:17:27
I was aware, so I thought, okay,
1:17:29
I'll be the person to do it.
1:17:31
And it was interesting because,
1:17:37
you know, so in 2020, I think it was
1:17:39
right at the beginning of 2020,
1:17:41
I posted that first megathread,
1:17:43
and it went viral, and I
1:17:45
realized there were so many people that actually wanted
1:17:47
to see that kind of thing, but nobody had
1:17:49
thought to do it,
1:17:51
or hadn't for quite a while, as far as I can
1:17:54
tell.
1:17:56
But what was quite interesting was, in the
1:17:58
aftermath of that, the number of people
1:18:00
who did exactly the same thing I was
1:18:02
doing, who wanted to kind of replicate the success
1:18:05
I'd had with the megathread. They were
1:18:07
just sort of all over the
1:18:09
place, people doing these threads, and
1:18:11
I've got nothing against people
1:18:13
doing that. I think everybody's got the
1:18:15
right to do it. But it was
1:18:17
interesting, because I think it just made something
1:18:19
click in people's minds, where they thought, wow,
1:18:21
this is a great idea, and I want to do it
1:18:23
myself. And it showed
1:18:25
that if you do
1:18:28
the kinds of things that you
1:18:28
want to see, if you create the
1:18:30
kinds of content that you want to see,
1:18:32
then, because you're a human being and you
1:18:34
share like ninety-nine percent of your DNA
1:18:36
with every other human being, there is
1:18:39
gonna be a large number of people
1:18:41
that will have similar enough interests that they
1:18:43
will actually want, you know,
1:18:45
what you want. And this actually
1:18:47
fits
1:18:49
in quite nicely with one of the other
1:18:51
concepts in one of
1:18:53
my recent megathreads, which is Hotelling's law.
1:18:58
Which is basically this idea
1:19:00
that people
1:19:03
will tend to copy whatever is
1:19:05
successful, whether
1:19:07
we're talking about business or politics
1:19:09
or art or whatever. And
1:19:11
as a result of that, content
1:19:15
tends to converge, it tends to become
1:19:18
more similar over time. And you see
1:19:20
it with TikTok, where a
1:19:22
very small number of people, like
1:19:24
Bella Poarch and Charli D'Amelio,
1:19:26
became extremely popular on TikTok, and
1:19:28
they became the most
1:19:30
viewed people on TikTok, and
1:19:32
all they did was lip-sync,
1:19:34
sing and dance. Now I
1:19:36
have no interest in watching that kind of stuff,
1:19:38
but evidently they thought it was fun. Maybe that's
1:19:41
the kind of content they wanted to see, but
1:19:43
somehow that stuff blew up. And as
1:19:45
a result of that, it
1:19:47
spawned a whole new genre of TikTok
1:19:49
video where you'd just have people
1:19:51
singing and dancing, and everybody was doing
1:19:53
it, and that decreased the
1:19:55
value of doing it. And it's
1:19:58
the same with politics. You know, if you look at, for instance,
1:20:00
the UK and
1:20:03
its political parties, Labour and Conservative: if
1:20:05
you look at the postwar period,
1:20:07
Clement Attlee versus
1:20:09
Winston Churchill. Clement Attlee was
1:20:11
a socialist, he really was, and
1:20:13
the Labour party was full of
1:20:15
socialists, while Winston Churchill's Conservatives
1:20:17
were proper conservatives, they weren't
1:20:20
conservatives in name only. And
1:20:22
over time, the two parties
1:20:24
have moved towards the centre, so Labour's
1:20:24
become more right wing and the Conservatives have become
1:20:26
more left wing. And it's interesting, because the
1:20:28
right wing party of the UK, the Conservatives, is
1:20:31
now to the left of the left wing
1:20:33
party in the US. And the
1:20:35
reason this has happened
1:20:37
is because of Hotelling's law. Because
1:20:39
what happened is that when
1:20:44
certain politicians in these parties appealed
1:20:46
to the centre, they had huge
1:20:48
success, and the other people saw this and
1:20:50
thought, wow, we can capture the
1:20:52
centre and get some of these people's
1:20:55
audiences from them. And so these two parties
1:20:57
gradually began to try to eat up
1:20:59
the centre, to eat as much of the centre
1:21:02
as they could before the other party did, so they
1:21:04
moved closer and closer to the centre. And that's
1:21:06
the same characteristic, that they tend to converge
1:21:08
over time. And the great thing about
1:21:10
Mac's razor that you just spoke
1:21:13
about, when you create content that you
1:21:15
yourself would want to see, is that you
1:21:17
can avoid Hotelling's law, because you're creating
1:21:19
content that you want to see, you're
1:21:21
not chasing what everybody else is doing.
1:21:23
You're doing the opposite. Because the interesting thing
1:21:25
about Hotelling's law is, the more it
1:21:27
happens, the more these content creators or these
1:21:29
politicians or whoever we're talking about, the more
1:21:31
that content converges, the
1:21:34
more value there is in being different. In
1:21:36
actually, you know, trying
1:21:38
to do something that you want to see.
1:21:42
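[Editor's note: the convergence dynamic described above is Hotelling's law, and it can be made concrete with a tiny simulation. The following Python sketch is illustrative only, written for this transcript under simple assumptions (a uniform audience spread along a 0-to-1 spectrum, greedy one-step moves); it is not anything said in the conversation.]

def audience_share(a: float, b: float) -> float:
    # Share of a uniform [0, 1] audience won by the competitor at position a,
    # when every audience member simply picks the nearer of the two competitors.
    if a == b:
        return 0.5
    midpoint = (a + b) / 2
    return midpoint if a < b else 1 - midpoint

# Start the two competitors far apart, e.g. a socialist party and a
# properly conservative party, or two very different content styles.
a, b = 0.1, 0.9
step = 0.01

for _ in range(200):
    for move in (step, -step):
        # Each competitor nudges toward whichever position wins more audience.
        if audience_share(a + move, b) > audience_share(a, b):
            a += move
        if audience_share(b + move, a) > audience_share(b, a):
            b += move

print(round(a, 2), round(b, 2))  # both end up at 0.5: converged on the centre

[This also illustrates the flip side discussed above: once everyone has converged on 0.5, a newcomer positioned away from the centre wins the whole stretch of audience nearer to them than to the crowd, which is why differentiation regains value as convergence increases.]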
Yeah, like for instance, getting back to
1:21:44
my megathreads, I saw a lot of
1:21:47
stuff about mental models, but
1:21:49
it was not portrayed in
1:21:51
the way that I'd have done it. It
1:21:53
was more about picking a single mental model and
1:21:55
doing the thread about that, and loads of people were doing
1:21:57
that, and I honestly wasn't gonna do that.
1:22:01
I'd just be doing the same thing everybody else is doing if I
1:22:03
did that. Because that form
1:22:05
was originally popularized by people like Tim
1:22:07
Ferriss, you know, they popularized that stuff and they
1:22:09
became very successful with it. And it was
1:22:11
such a good formula that a lot of
1:22:13
other people tried to do the same for a
1:22:15
while. So why did I do something different instead?
1:22:17
Because I decided to just go against it, and I thought,
1:22:20
I don't want to see this, I wouldn't
1:22:22
actually consume this kind of content, because I've already
1:22:24
consumed it, because so many other people are doing
1:22:26
it. So I thought, let me do something a little
1:22:28
bit different, and let me just create a thread of
1:22:31
various different concepts. And that was different enough
1:22:33
that it actually allowed me to go viral
1:22:36
when I did it. So it's a very good
1:22:38
strategy to chase not what other people are
1:22:40
doing, but what you want to see.
1:22:42
Yeah, I agree, I hear it.
1:22:45
I understand some people would say
1:22:47
that if you
1:22:49
copy successful content,
1:22:52
you avoid making stuff
1:22:54
which is absolutely atrocious. Like, your
1:22:56
instincts could just be completely off
1:22:58
kilter, like you're aiming at a
1:23:01
target that's in the north when you should be aiming
1:23:03
south, basically. So there's
1:23:06
a base layer, a foundation, of understanding writing, for instance,
1:23:08
if you're going to do that. The thing is, if
1:23:10
you can't write, it doesn't matter how good your
1:23:12
idea is, it's not going to work, if you
1:23:14
don't understand how Twitter works, if you don't understand
1:23:16
the concepts, if you can't portray them in an
1:23:18
interesting way. There are a lot of things that
1:23:21
you need to get in place, but once you've
1:23:23
got basically the rules of the game, you can
1:23:25
then start to maybe step outside
1:23:27
and completely break them. So for instance, with these listicle-
1:23:29
style episodes that I do, they're some of
1:23:31
my favorites. I think that they keep the episode
1:23:34
moving really quickly, and I know that me and you, when
1:23:36
we finish these episodes, feel like the
1:23:38
two hours have fucking flown by. How's it
1:23:40
been two hours already? I did them with
1:23:43
Alex Hormozi, I did them with Shaan Puri, and I did them
1:23:45
with George Mac, I've done them with yourself, you know,
1:23:47
going through a list of things, thoughts that would
1:23:49
be fun to me. If I was left with this like pressure
1:23:53
hose of insights
1:23:55
about human behavior, I would
1:23:57
have left an episode going, wow, that's cool.
1:23:59
And yeah, it was something that was
1:24:01
my instinct. Now that being said, it's framed in
1:24:03
a way that we know works for the algorithm.
1:24:06
It's presented from a
1:24:08
tech perspective in a way that we think is
1:24:10
engaging. Dean edits these things in a way that
1:24:12
keeps stuff engaging. So again, we're playing within the
1:24:15
physics of the system in some regard,
1:24:17
but we're also trying to give our
1:24:19
own spin on something with something new. And
1:24:22
Douglas Murray has said this as well, like
1:24:24
follow your instincts. Your instincts are a pretty
1:24:26
good guide. It allows you to be unbelievably
1:24:28
unique and it allows, like if you were
1:24:30
interested in something, there is a
1:24:32
pretty good likelihood that some non insignificant minority
1:24:34
of other people are also interested in it.
1:24:37
And given how broad the access that you
1:24:39
have on the internet is now, you only
1:24:41
need some non insignificant minority
1:24:43
of other people to have a massive
1:24:45
audience. Like millions of people. Absolutely. Yeah.
1:24:47
That's one aspect of it. And another
1:24:49
aspect is that if you are
1:24:52
genuinely passionate about something, if you're genuinely interested
1:24:54
in something, you will make it interesting to
1:24:56
other people, you know, because you'll be
1:24:58
passionate about it. If you're just chasing, you know, metrics,
1:25:00
if you're just looking at what other people are doing
1:25:02
and you're, then you just copy them, your passion is
1:25:05
not going to be in it. You're not going to
1:25:07
be interested in it. You're just going to be interested
1:25:09
in getting as many views or whatever. You'll be chasing
1:25:11
the wrong metrics. The right metric is: is it interesting
1:25:13
to you. Because if you make it,
1:25:15
if it's interesting to you, you'll make it interesting
1:25:17
to other people because you, your passion is contagious.
1:25:20
And I think that's the best sort of advice
1:25:22
I'd give to somebody who wants to sort of
1:25:25
make a start in sort of, you know,
1:25:28
just being an influencer or whatever, you know,
1:25:30
is just to, to just find what interests
1:25:32
you. Right. Don't try to find what you
1:25:34
think other people are going to find interesting
1:25:36
because no matter what it is, even if
1:25:38
it's something like stamp collecting or whatever, right?
1:25:40
If you are passionate about, about it enough,
1:25:42
you will make it interesting to
1:25:44
other people. Dude. So me
1:25:46
and, me and my housemate Zach love
1:25:48
these videos of guys that watch, uh,
1:25:50
rally cross. So it's like Colin McRae,
1:25:52
you know, four wheel
1:25:55
drive cars going through a dirt road. And
1:25:57
these blokes will have gone up to fucking
1:26:00
Scotland or Quebec or something,
1:26:02
and they've stood in a poncho, under
1:26:05
an umbrella, in the pissing
1:26:07
rain, basically in the middle
1:26:09
of a forest, to see them,
1:26:13
to see them for like point three of a second.
1:26:15
And then when these cars go past,
1:26:17
they will turn to each other and
1:26:19
go nuts. And we love watching it,
1:26:21
watching anyone get fired up about anything.
1:26:23
It makes you feel fired up as well. It's just
1:26:26
a lotta people that love things. And yeah, yeah,
1:26:28
if you follow your passions in that regard, you're
1:26:30
always going to remain on
1:26:32
the right track, and you'll also be motivated
1:26:35
as well. Yeah, exactly. Nothing is
1:26:37
better for getting you motivated. Yeah, yeah. Next one
1:26:39
up, the next concept.
1:26:41
You know that if you'd lived in
1:26:44
a different place or time, read different
1:26:46
books, had different friends, you'd have different
1:26:48
beliefs, and yet you're convinced that your
1:26:51
current beliefs are correct. So
1:26:53
are you wrong, or are you the
1:26:55
luckiest person ever? Yeah.
1:26:58
This is one that gets me
1:27:00
a lot, because I find that
1:27:02
a lot of my opinions are in
1:27:04
sync with the society in which I
1:27:06
live. So I'm broadly,
1:27:08
sort of, kind of, quite
1:27:11
liberal in a sense, you know. I
1:27:13
wouldn't say that I'm actually a liberal, but I
1:27:15
have very liberal views, and we
1:27:18
live in a liberal society.
1:27:20
And I find that
1:27:23
it's hard to
1:27:25
extricate my beliefs from
1:27:27
the time and place in which I'm
1:27:30
living. I always wonder what I'd believe
1:27:32
if I'd, say, been
1:27:34
born in, sort of, India, for
1:27:36
instance. If I were born in India, what
1:27:38
would I believe there? If I'd been
1:27:40
born in the nineteenth century, what would I
1:27:42
believe? If,
1:27:44
you know, if I was born into a
1:27:46
rich family, or lived in a poor family, what
1:27:48
would I believe? And all of
1:27:50
these things make me question
1:27:52
my beliefs, because I think to myself,
1:27:55
my beliefs seem to be quite local
1:27:57
to where I'm living in time and space.
1:28:01
I think this is very true of religious people
1:28:03
in particular. So if you think about
1:28:05
say a Muslim person, a
1:28:08
Muslim obviously believes things
1:28:10
that were originally sort
1:28:13
of a belief system that was invented
1:28:15
in 7th-century Arabia. But
1:28:18
what would happen if that person was born
1:28:20
before the creation of Islam? So
1:28:23
if they had been born sort of
1:28:25
in the 2nd century, would
1:28:28
they still be a Muslim? Obviously
1:28:30
not. Would they still have Muslim principles?
1:28:32
Obviously not. And this is interesting because
1:28:34
Islam is supposed to be a religion
1:28:37
for all times and all places. That's
1:28:40
its sort of main claim to fame. And
1:28:43
so although there's this
1:28:45
concept in Islam called Jahiliyyah, which
1:28:48
is about basically this idea that before
1:28:50
the coming of Islam there was ignorance, still
1:28:53
you've got to ask yourself, surely
1:28:57
that means then that being born before
1:28:59
the creation of Islam means that you're
1:29:01
not going to have the advantage in
1:29:04
God's eyes of somebody who's born
1:29:06
after the creation of Islam. Because the person who's
1:29:08
born after the creation of Islam is going to
1:29:10
be more likely to follow Islam than the person
1:29:12
before. So there's this weird sort of disparity there.
1:29:14
And I think you could extend this to any
1:29:16
belief system. Communism for instance as
1:29:19
well. If you're born before
1:29:21
the creation of communism, you're
1:29:23
not going to be a communist. And
1:29:25
so would a communist be different
1:29:27
if they were born before the creation
1:29:29
of communism? Of
1:29:32
course they would. And so how can
1:29:34
they be sure that their belief is right? They just happened
1:29:36
to be born at the right time in history to have
1:29:38
the right beliefs. And
1:29:42
that's why my solution to this
1:29:44
problem is to
1:29:46
try to find beliefs that are as universal
1:29:48
as possible. So one way that I
1:29:51
can gauge whether a belief is a good one is
1:29:53
whether I can view myself
1:29:55
as having believed that no matter what time or place
1:29:57
I was living in. It's not a perfect
1:30:00
system because obviously knowledge is constantly
1:30:02
growing. And obviously I
1:30:04
wouldn't know the germ theory of disease a
1:30:07
thousand years ago, but I do believe it
1:30:09
now. And I think I'm pretty justified in believing in
1:30:11
the germ theory of disease, given the evidence for it.
1:30:15
But as a general rule, I think it's
1:30:17
a pretty good one where you think about that is
1:30:19
this belief a product of the society in which I'm
1:30:21
living or is it one that can be applied to
1:30:23
any time in any place? And the thing with the
1:30:25
germ theory of disease is even though it didn't exist
1:30:27
a thousand years ago, it would still have
1:30:29
helped me a thousand years ago. It
1:30:32
would still have been beneficial to believe in it a
1:30:34
thousand years ago. So I think that's a good heuristic
1:30:36
to use in order to determine whether your beliefs are
1:30:39
right. It doesn't matter if they're a product
1:30:41
of your time. What matters is, will
1:30:43
they be useful in any time and
1:30:45
any place? That's the kind
1:30:47
of universality of a belief. So
1:30:50
if your beliefs wouldn't work very well a
1:30:53
thousand years ago, then that's a good sign that you're probably
1:30:55
just imbibing what you're learning
1:30:57
from the present day. You're kind of, you
1:31:00
know, you're myopically sort of trapped in the
1:31:02
present moment and in the present place. So
1:31:05
yeah, I think universality of
1:31:08
applicability is what you want
1:31:10
to look at. So can you apply
1:31:12
it universally? And if you can, then that's a sign that
1:31:15
it's a good belief. So
1:31:17
Rob Henderson put something in his newsletter a
1:31:19
couple of weeks ago and I gave it
1:31:21
a name. So I've come in at the
1:31:23
end and thrown like a pretty bow on
1:31:25
top of something which I really like as
1:31:27
an idea. So I called this the intellectual's
1:31:29
treadmill. Some thinkers, as they
1:31:32
rise in prominence as a result of
1:31:34
their interesting ideas, gradually devote less time
1:31:36
to reading and more time to lucrative
1:31:38
opportunities. This is a mistake. They are
1:31:40
neglecting one of the core habits that
1:31:42
made them so interesting in the first
1:31:44
place. I
1:31:48
think I'm guilty of this. I
1:31:50
tend to read less than I used to. But
1:31:53
I think
1:31:55
I definitely agree with it in general.
1:31:57
I think one
1:31:59
of the problems. with a lot of thinkers
1:32:02
is that they tend to just resort to the
1:32:04
same set of tools that got them famous. So
1:32:07
a classic example of this would be somebody
1:32:09
like Nassim Taleb. He
1:32:11
became famous through a handful
1:32:14
of concepts like anti-fragility, the
1:32:16
Lindy effect, skin in the
1:32:18
game. And these obviously are
1:32:20
great ideas. They're really good ideas. And
1:32:23
that's why they became popular. But
1:32:25
since then, what I've noticed in him
1:32:28
is that he tends to sort of try to
1:32:30
apply these concepts to pretty much anything that happens.
1:32:32
This is the golden hammer, isn't it? The golden
1:32:34
hammer. Yeah, we've spoken about this before, the golden
1:32:36
hammer. And it also sort of links
1:32:39
in with another thing called the toothbrush problem,
1:32:42
where the toothbrush problem is basically
1:32:45
where intellectuals treat
1:32:48
theories like toothbrushes. They
1:32:50
don't want to use anybody else's, you know, they just
1:32:52
want to use their own. And
1:32:54
that's the opposite of
1:32:56
me who just shamelessly repurposes everybody else's.
1:33:00
Well, I think that's the healthiest way to
1:33:02
be. I think oftentimes, you know, it's when
1:33:05
you just rely on your own theories, you're
1:33:07
just closing yourself off from so
1:33:09
much learning and so much knowledge. And
1:33:12
that's why I try not to do these things, you know. But
1:33:15
I mean, it's hard because when you
1:33:18
do become famous for a certain idea, you develop
1:33:20
a certain brand and you want to sort of
1:33:22
you want to you want to overstate the kind
1:33:24
of importance of your ideas. So
1:33:26
obviously, Taleb got very famous from his
1:33:29
three major ideas, and, you
1:33:31
know, tail risk and all the other ideas that he's come out with.
1:33:33
And so he's
1:33:35
incentivized to, instead
1:33:37
of learning new ideas by reading books, to
1:33:40
just double down on his own ideas, by
1:33:42
just constantly writing about them. And
1:33:44
so he's that's obviously going to get him more clout,
1:33:47
because the more important his ideas seem, the
1:33:49
more important he seems. And the
1:33:51
more opportunities he's going to get to sort
1:33:53
of expound upon
1:33:55
various social issues and apply,
1:33:57
you know, his
1:34:00
golden hammers to those. Well, I
1:34:03
remember hearing Peterson a while
1:34:05
ago, it's probably five years ago, he was
1:34:07
on Rogan and he was really at
1:34:10
the crest of this huge growth curve that he
1:34:12
was on, maybe just after the Kathy Newman interview,
1:34:14
something like that. And he said
1:34:17
something along the lines of, I need to
1:34:19
take some time to go away because if
1:34:21
you are outputting more than you are inputting,
1:34:23
all that you're doing is just saying the
1:34:25
same things over and over again and you
1:34:27
end up becoming a caricature of yourself, which
1:34:29
is dangerous. I learned from Critical
1:34:31
Drinker, do you follow that guy? Yeah,
1:34:34
I watch his videos, yeah, he's funny. So
1:34:38
I learned from him that there's four
1:34:40
stages to most media movements.
1:34:42
So let's say like the superhero genre
1:34:44
that we've seen since sort of the
1:34:46
mid-noughties. There is like
1:34:48
the introduction phase, the growth phase, the maturity
1:34:51
phase and then the parody phase. What's
1:34:53
interesting about that is you can track it
1:34:55
perfectly with Thor. So you have this kind
1:34:57
of groundbreaking or maybe less so
1:34:59
Iron Man because he died, I guess, before he
1:35:02
could get into parody. But certainly with
1:35:04
Thor, you get this groundbreaking one and everyone's like, oh my
1:35:06
God, Chris Hemsworth's so ripped. And then you get into growth
1:35:08
and it's sort of still developing. Then you get into maturity
1:35:10
where it's a little bit more predictable and you've kind of
1:35:12
got an idea. Then you get into Love
1:35:15
and Thunder, which was the most recent one. And you even saw
1:35:17
bits of parody earlier on in it, but where
1:35:19
he's the butt of a joke. He's
1:35:22
the butt of all of the jokes. He's doing a
1:35:24
set of splits on the top of like a pair
1:35:26
of dragons like Jean-Claude Van Damme. Even
1:35:29
Doctor Strange, I guess he featured
1:35:31
as a sort
1:35:33
of ancillary character in lots of other
1:35:35
things, but he only got two. So
1:35:38
he had the first Doctor Strange with
1:35:40
Benedict Cumberbatch. He's a phenomenal actor. First
1:35:42
one, super like sincere
1:35:46
in the way that they did it. And it was very meaningful
1:35:48
about him. The second one, a zombie
1:35:51
version of Benedict Cumberbatch goes back
1:35:53
in time to a different universe
1:35:55
to tell the Central American daughter
1:35:57
of a lesbian couple called America.
1:36:00
that she just needs to believe in herself. Like
1:36:03
it's just the most parody of the most parody
1:36:05
that you can think. So
1:36:07
yeah, and I think that one of the problems
1:36:09
that you get is what Peterson
1:36:12
identified, if you are outputting
1:36:14
more than you're inputting, you end
1:36:16
up just regurgitating ideas, you bastardize them,
1:36:18
you don't have anything fresh, you become
1:36:20
a caricature of yourself, you become easy
1:36:23
to be parodied, and that's dangerous.
1:36:25
And he was saying, you know, you have to take some time away. Someone
1:36:27
that we can say absolutely has adhered to that.
1:36:29
And there's also, dude, how are you gonna say
1:36:32
no to another speaking gig? How are you gonna
1:36:34
say no to another Joe Rogan experience episode?
1:36:36
How are you gonna say no to all these things? I get it, right?
1:36:39
But someone that definitely has done this was
1:36:41
Naval, who just said, I did
1:36:43
my Rogan episode, and I'm now away on sabbatical
1:36:45
because I never want to say the same thing
1:36:47
twice, and I won't be doing any more podcasts
1:36:50
until I have three hours worth
1:36:52
of new things to talk about. Fair
1:36:55
play. Yeah, yeah, yeah,
1:36:57
I mean, I think Naval's
1:37:00
very wise in that he's done this, I think,
1:37:02
to avoid audience capture. I
1:37:04
think that's ultimately what we're
1:37:06
talking about, because when you
1:37:08
have the same set of ideas, there's a pressure on
1:37:11
you to continue to talk about those ideas, again,
1:37:14
to sort of emphasize
1:37:16
their importance. And I think
1:37:18
Taleb is a very good example of this, going back to
1:37:20
him, because I feel he has kind
1:37:22
of been audience captured in a sense, where it's
1:37:24
now expected that he's gonna try to explain
1:37:27
things in terms of tail risk
1:37:29
or whatever. And it's because it's
1:37:31
what he knows, and I understand why he
1:37:33
does it, because it's kind of wise to a certain
1:37:36
extent to just stick to what you know. But
1:37:38
he's clearly a very intelligent man, and he's a
1:37:40
man who could learn a lot
1:37:42
more about many other things, but he instead
1:37:44
just chooses to pretty much talk about
1:37:47
the same sort of things again and
1:37:49
again. He's doing what Jordan Peterson
1:37:51
essentially warned about, where
1:37:53
instead of learning, because he's
1:37:56
Taleb, he's a smart guy, but
1:37:58
he's arrogant as hell. And
1:38:00
he thinks that he sort of has the final
1:38:02
answer. He thinks he understands things
1:38:05
even when he doesn't really have a grounding in it.
1:38:07
He thinks he understands IQ, but he
1:38:09
makes very elementary mistakes about IQ. But
1:38:13
yeah, he tends to just
1:38:15
sort of focus on
1:38:19
a very narrow field of maths, statistical
1:38:22
kind of tail risk analysis,
1:38:24
risk analysis, that kind of
1:38:26
stuff. So
1:38:29
he uses a very narrow set of tools,
1:38:32
very useful tools, but they're very narrow. And
1:38:34
he uses that very narrow system
1:38:36
of tools to explain everything
1:38:39
from like COVID to
1:38:42
polarization to Israel
1:38:45
and Palestine. He talks about
1:38:47
a lot of these things often just using
1:38:49
a very narrow set of tools. And it's
1:38:52
weird because he's otherwise he's quite an
1:38:54
erudite guy, but he just chooses not
1:38:56
to sort
1:38:58
of progress beyond what
1:39:00
made him successful. And
1:39:03
I see this with a lot of other
1:39:05
influencers, a lot of other intellectuals, where they
1:39:07
just stick to the thing that made them
1:39:09
successful over and over again, as if they're
1:39:12
just sort of scared of venturing into new territory.
1:39:15
You see it with a lot of sort
1:39:17
of anti-woke accounts online now as well, where
1:39:19
the same thing is always the case. It's
1:39:21
always about wokeness. Everything's wokeness. Everything
1:39:24
can be explained in terms of wokeness. So you see
1:39:26
it on the opposite side with everything's racist. You know,
1:39:28
racism is the explanation for everything. Oh,
1:39:30
it's because of systemic racism. It's because of whiteness, it's
1:39:32
because of white fragility, all
1:39:35
of this stuff. And then you just see the
1:39:37
same sets of explanations being used over and over again
1:39:39
because these people are not reading new things. They're
1:39:42
just regurgitating what was already in their head
1:39:44
again and again and again, that they're basically
1:39:46
being spoon fed their own intellectual vomit and
1:39:49
just kind of recycling it and vomiting
1:39:51
it out again. And it just degrades. It's
1:39:53
like ChatGPT being trained on its
1:39:56
own outputs. You know, it's kind of
1:39:58
like that. It
1:40:01
is a very dangerous thing and that's why
1:40:03
I think I try to go broad
1:40:05
rather than narrow in on one thing. I
1:40:08
do occasionally narrow in on one thing when I write
1:40:10
a long read or whatever, but what
1:40:12
I try to do is to just keep
1:40:14
learning, learning new concepts and new things. Like
1:40:17
I've set up a pretty good thing now where I've got
1:40:20
an audience that expects me to write about
1:40:23
a wide range of different things, but very,
1:40:25
very shallow things. I do write
1:40:27
pretty shallow stuff in general just
1:40:29
because I've got so many ideas to cover that
1:40:31
I can't go into too much detail. I'm
1:40:34
not always shallow. I do sometimes go on
1:40:36
deep dives into articles and essays where I
1:40:38
write 4,000, 5,000 words about
1:40:41
a single concept, but usually I write
1:40:43
a wide range of things,
1:40:46
but quite shallow in order to give
1:40:48
people ideas for them to springboard their
1:40:50
own ideas. That's generally what I like
1:40:52
to do. I find that
1:40:54
that's a healthy way to approach things because it
1:40:56
means I'm constantly learning new ideas instead of
1:40:58
just focusing on one idea and using that
1:41:00
one tool to explain everything, which
1:41:03
is a temptation. It seems like this is
1:41:05
related to another one I got from you, Beginner's
1:41:07
Bubble Effect. "You cannot learn
1:41:09
that which you think you already know," from Epictetus.
1:41:11
The most ignorant are not those who
1:41:13
know nothing, but those who know a
1:41:16
little because a little knowledge grants the
1:41:18
illusion of understanding which kills curiosity and
1:41:20
closes the mind. This
1:41:25
would appear to go against what I've just said.
1:41:27
It would seem like, oh, okay, you shouldn't learn
1:41:29
just a little thing. You should really go deep
1:41:31
into that, but in practice, that's not actually possible.
1:41:34
You can't just learn one thing in loads and
1:41:36
loads of detail and not learn anything else. You're
1:41:38
always going to be in a situation where you
1:41:40
have to learn a little bit. The key to
1:41:43
overcoming the Beginner's Bubble Effect is not to
1:41:46
learn more because you can't learn more about everything.
1:41:49
The key is to recognize your
1:41:51
limits, is to recognize how much
1:41:53
you actually know, basically. Once you
1:41:55
learn how much you actually know,
1:41:57
and that comes from humility and
1:41:59
from curiosity, then you're no longer
1:42:01
subject to the Beginner's Bubble Effect. The Beginner's
1:42:03
Bubble Effect is a product of thinking
1:42:05
you know more than you actually do. It
1:42:08
usually comes from having a very shallow explanation
1:42:10
for something, because once you have a shallow
1:42:12
explanation, you think you have a full explanation.
1:42:14
It's just the way our brain works. You
1:42:17
know, it just kind of kills your
1:42:19
curiosity when you have a shallow
1:42:21
explanation for something, and, you know, it
1:42:23
fools you into thinking that you understand
1:42:26
it. And that's where the danger lies.
1:42:28
So I'm not saying you shouldn't learn little things.
1:42:30
In fact, yeah, I think you should
1:42:32
learn a little about a lot, rather
1:42:34
than a lot about a little. A little
1:42:36
about a lot. And the reason
1:42:39
for this is where
1:42:41
this goes to Philip Tetlock's
1:42:44
work. Tetlock is one of
1:42:46
the founding fathers of decision theory,
1:42:48
and along with sort of people
1:42:50
like, er, Daniel Kahneman,
1:42:52
they founded sort of
1:42:54
what we'd call rationalism. And
1:42:56
Tetlock is all about predicting the
1:42:58
future, basically, because the
1:43:01
true measure of how rational you are and how
1:43:03
much truth you have is whether you can predict
1:43:05
the future consistently. If you can consistently
1:43:08
do that, you know, you
1:43:10
can't bullshit your way to that, to predicting the
1:43:12
future. That's one thing you can't bullshit. So
1:43:15
you have to know the truth in order
1:43:17
to be able to consistently predict the future, and
1:43:19
that's what he's into, the whole thing about
1:43:21
forecasting. And,
1:43:23
erm, basically, he found that the
1:43:25
people who are most accurate at predicting the
1:43:28
future... he did a series of
1:43:30
trials, which actually involved the CIA,
1:43:32
like, there was massive funding from the
1:43:34
CIA, and he did some pretty crazy stuff
1:43:36
in the mid-eighties, where he basically ran
1:43:38
these competitions to see who could predict the future
1:43:40
the best, and people adopted various strategies,
1:43:43
various kinds of strategies. And
1:43:45
this phenomenon became known as superforecasting,
1:43:47
and what he found was that
1:43:49
the people who tended to be the
1:43:51
best at predicting the future were not the
1:43:53
people who knew a lot
1:43:55
about a little, but the people who
1:43:57
knew a little about a lot. And this
1:43:59
is because... I think there are several
1:44:01
explanations for this, but one of the key explanations
1:44:03
is that the people who know a lot
1:44:06
about a little tend to try
1:44:08
to solve all
1:44:10
problems by recourse to
1:44:12
that little, narrow sort of
1:44:14
sliver of information that they know really, really well, because
1:44:16
they feel safe on that territory and they don't
1:44:18
want to venture outside of it. So they tend to
1:44:20
try to view everything through the lens
1:44:22
of what they know really, really well. Whereas
1:44:25
the people who know a lot about...
1:44:27
or, yeah, know a lot about a
1:44:29
little... they know a little about a lot, sorry,
1:44:32
I've got it backwards. They tend to be
1:44:35
a lot more generalist and they are more
1:44:37
flexible in their thinking. And so this is
1:44:39
why I would advocate, if you have a
1:44:41
choice between specializing in just a small number of
1:44:44
topics, or learning a little about a lot,
1:44:46
I would advocate the latter, because that
1:44:48
puts you in good territory to sort
1:44:50
of be flexible in your thinking and learn. You
1:44:53
can then learn, if you wanna know more about
1:44:55
a said thing, you can read about it. And there's
1:44:58
a concept called the Curiosity Zone, which is
1:45:00
when you learn a
1:45:02
little... when you learn a little about
1:45:04
a lot. What happens is
1:45:06
that your curiosity
1:45:09
gets stoked and you want to learn
1:45:11
more, because curiosity is not...
1:45:14
is not stoked by an absence of knowledge,
1:45:16
it's stoked by having a little knowledge,
1:45:19
because when you have a little knowledge,
1:45:21
you see... Curiosity
1:45:23
is the desire to fill gaps in
1:45:26
knowledge. And to
1:45:28
have gaps in knowledge, you need to have
1:45:30
actually learned things, because a
1:45:33
complete absence of knowledge is not a gap in
1:45:35
knowledge. You need something that teases you, that you
1:45:37
don't know yet.
1:45:39
And a gap can only
1:45:41
exist between two objects. You
1:45:44
can't have a gap without them, you know; empty
1:45:46
space is not a gap. A gap is going
1:45:48
to be in the middle of two things. So if
1:45:50
you learn those two things, you know them and
1:45:52
you have a gap. Now you
1:45:54
have incomplete knowledge, and that gap is where
1:45:57
your curiosity is stoked, basically. So.
1:46:00
If you want to stoke your curiosity, if you want to
1:46:02
sort of evoke curiosity in
1:46:04
yourself, then the best way
1:46:06
to do that is to learn a little about a lot. Because
1:46:10
that way you'll want to know more; it'll motivate you to want
1:46:12
to know more. And so, yeah,
1:46:15
that's what I would definitely advocate.
1:46:17
That's why I like to be more of
1:46:19
a generalist, rather than specialising in a single
1:46:21
sort of concept. I think it's much
1:46:24
better to do that. Agenda-
1:46:27
setting theory: most of the time, what's
1:46:30
happening in the news isn't actually important.
1:46:32
It only appears important because it's in
1:46:34
the news. The public conversation is based
1:46:36
on whatever's reported by the press, giving
1:46:39
the impression that this news matters most,
1:46:41
when really it's just what was chosen
1:46:43
by a few editors and thoughtlessly amplified
1:46:45
by the masses. Yeah,
1:46:49
so this is why I don't
1:46:51
really read the news very much. I
1:46:53
browse it very, very casually, often just once
1:46:56
in a while. I don't really read it
1:46:58
much because what I've found
1:47:01
is that 99% of the
1:47:03
time, the news doesn't
1:47:06
make me any wiser. It doesn't make me any more
1:47:08
informed. It doesn't really help me in
1:47:10
my day-to-day life. It doesn't help me understand
1:47:12
the world any better. It's
1:47:14
just something I do for entertainment. And
1:47:17
I think most news is just that. It's just
1:47:19
entertainment. I think it's
1:47:21
entertainment that is presented in such a
1:47:24
way that you don't feel guilty for consuming it because
1:47:26
you think you're learning about the world. A
1:47:29
lot of the time, the reason
1:47:31
for this is that
1:47:34
news is hijacking what
1:47:37
we call shiny object syndrome.
1:47:40
Shiny object syndrome is a
1:47:42
concept, another concept, I think, from one of my
1:47:44
recent threads, where
1:47:49
in our evolutionary history, we sort
1:47:53
of evolved. I'll keep saying the word
1:47:55
evolve. But basically, we evolved to
1:47:58
basically favour... new information
1:48:00
over old information because new information tended to
1:48:02
be more useful. In
1:48:05
a low information environment, new information can often be the
1:48:08
difference between life or death. So new information, for
1:48:10
instance, a thousand years ago or a hundred
1:48:12
thousand years ago, would be seeing
1:48:15
a lion coming out from the undergrowth.
1:48:17
That's new information. And that's crucial
1:48:20
information. If a lion is
1:48:22
coming out of the undergrowth and it's charging towards you, you
1:48:25
need to know. So obviously we
1:48:27
became biased towards new information because
1:48:29
new information could be the difference
1:48:32
between life or death in a way that old information
1:48:34
wasn't. And so we have this
1:48:37
bias towards novelty. We're attracted to
1:48:39
new. Anything that's new, we're just
1:48:41
attracted to it by virtue of
1:48:43
its novelty. And
1:48:45
news hijacks this evolutionary impulse
1:48:49
by providing us with new content. People
1:48:51
are always searching for what's new. They're
1:48:53
constantly looking for the breaking news,
1:48:55
the big bar in red which says
1:48:57
breaking news. Or they're looking to see
1:49:00
new tweets or
1:49:02
whatever. Click the button, see new tweets,
1:49:04
see the latest posts, all
1:49:07
this stuff. People want to see what's the latest. They want to
1:49:09
know what's the latest. And this is
1:49:13
a maladaptive desire because
1:49:17
in a world where information is
1:49:19
mass produced, it's
1:49:22
no longer actually valuable to have
1:49:25
new information most of the time because
1:49:28
the majority of the new information has been
1:49:30
created for one reason and one reason only.
1:49:32
And that is to hijack your impulse for
1:49:34
novelty, your desire for novelty. It's
1:49:37
there to just... It's basically rushed
1:49:39
out. The information is rushed out. So
1:49:42
if you look at a lot of the latest breaking news, it's
1:49:44
usually wrong because the journalist wanted to
1:49:46
be the first person to break the story. So they
1:49:48
just rushed it out as fast as they could. And
1:49:51
they didn't do their due diligence and they didn't
1:49:54
really give you all
1:49:56
the facts. And likewise, people want to be
1:49:58
the first to retweet this new story and
1:50:00
talk about it and so they'll just hastily,
1:50:02
they'll just retweet the headline without
1:50:04
reading the article or whatever. So a lot of
1:50:07
this new stuff is rushed out and
1:50:09
that's why news is generally not
1:50:11
that valuable because it's just
1:50:14
reported, it's often reported impulsively
1:50:16
by editors and by journalists.
1:50:19
They just say, oh okay, this sounds like it might
1:50:21
do well online so let's just post this, let's just
1:50:23
write about this. And then what
1:50:25
happens is that people think that because it was reported
1:50:27
by the news, therefore
1:50:29
it must be important but
1:50:31
it's not. A lot of the time
1:50:34
it's not. A lot of the time it's there
1:50:36
simply to hijack your attention, hijack your desire for
1:50:38
novelty and you're not going
1:50:40
to remember it, you're not going to benefit from
1:50:42
it. Just think about it,
1:50:44
go to any news page and just look
1:50:46
at the top stories and a
1:50:49
lot of the time it's just not really
1:50:51
stuff that... it might be interesting, it might
1:50:53
interest you for
1:50:55
a couple of minutes. You
1:50:57
might think, oh okay, that's okay. But most
1:50:59
of the time it's not really going to be that interesting. The
1:51:03
exception to this would be news
1:51:07
that's directly relevant to
1:51:10
your field, your chosen
1:51:12
field. So for instance,
1:51:14
if you are a biologist and you are
1:51:16
interested in curing, let's
1:51:18
say you're a medical professional
1:51:21
and you're interested in curing cancer or
1:51:23
whatever and then if there's a
1:51:25
new vaccine for cancer, which there is, which
1:51:28
is an amazing story, then
1:51:31
that's obviously going to be interesting news and you want
1:51:33
to know about that. But that's rare, that's very rare
1:51:35
and you usually get that not from looking
1:51:37
at the mainstream media, you usually get that
1:51:39
from specialised news outlets. So you want to
1:51:41
go to science news outlets which
1:51:44
will tell you about the latest breakthroughs
1:51:46
in technology. The mainstream media is usually
1:51:48
just generalised stuff that
1:51:50
is just not really going to be of value to
1:51:52
many people. It's just
1:51:54
going to be there to tickle your desire
1:51:56
for novelty. So mainstream
1:51:59
media news is... generally not that useful. That's why
1:52:01
I don't really read it much. I mean, I do read it,
1:52:03
but only because a lot of people
1:52:05
expect me to comment on it. If I
1:52:07
wasn't a writer, I wouldn't check
1:52:11
the news. I would only
1:52:13
just check information that's
1:52:15
relevant to me. So maybe if I was an
1:52:17
investor, I would check stock prices and stuff like
1:52:19
that. But I wouldn't check the general news, because
1:52:21
the general news is usually just
1:52:23
worthless. And people fall into believing that it's
1:52:25
important because it's reported. But it's
1:52:28
not. It's just what was chosen by a few editors.
1:52:32
Yeah, it's strange what we click
1:52:34
on and what editors know will
1:52:36
drive interest and engagement often
1:52:39
has absolutely no correlation
1:52:41
with something that's important. Like
1:52:43
how many times have we seen "left-
1:52:48
wing woman says that she can't get a
1:52:50
man to hold the door open for her,"
1:52:52
and it goes like superviral online and everyone's
1:52:54
got the same take off it as a conservative. And
1:52:57
it's like... whatever.
1:52:59
It's a slow medium
1:53:02
pitch. It's there to fuel
1:53:04
engagement and engagement farming. It's basically...
1:53:07
a lot of it's rage
1:53:09
bait. They want to try and make you as angry
1:53:11
as possible because they want to start a fight online.
1:53:13
Because if they start a fight online, then the
1:53:16
two factions that are fighting are going to
1:53:18
be inadvertently promoted, you know. And
1:53:21
then also, just stuff that's
1:53:24
reported just generally like, for
1:53:26
instance, if you're an average
1:53:28
person, you'll
1:53:30
hear, oh, 30 people
1:53:32
died in a bombing in Gaza. It's
1:53:35
bad. It's tragic. It's horrible news. But
1:53:38
most people are not going to ever do anything about it. They're
1:53:40
just going to read it and then that's it. They're going
1:53:42
to forget about it. And it's like they
1:53:44
may as well have not even learned about it because
1:53:46
it's just it's not going to
1:53:49
change their life in any way. They're not going to go out there
1:53:51
and stop the bombing. Apart
1:53:54
from maybe they're a bit more ambiently
1:53:56
anxious about the world and the impending gentle
1:53:59
sense of division. Yeah, exactly,
1:54:01
it's going to make them feel bad
1:54:03
a lot of the time. And
1:54:05
there's the negativity bias in the news
1:54:08
reporting as well. You know, it
1:54:10
was interesting, I think Steven Pinker recently
1:54:12
posted a list of sixty-six or so
1:54:14
news reports that were actually positive, so
1:54:17
positive developments, but they didn't get any
1:54:19
traction because they were positive, whereas the
1:54:21
negative does. The negative stories always get
1:54:23
way more engagement. And so,
1:54:26
if you're constantly consuming news, you
1:54:28
can develop this sort of cynicism. You
1:54:30
can develop greater cynicism, more pessimism; you're
1:54:33
going to become depressed in a
1:54:35
sense. You know, you could feel bad because you'll
1:54:37
just assume that the world is falling apart.
1:54:39
If you actually go to,
1:54:41
again, good specialized news outlets,
1:54:43
so you go to science reporting and all, then
1:54:46
you'll find a lot of stuff about medical
1:54:48
breakthroughs, which is actually a lot more interesting,
1:54:50
because that will allow you to predict the
1:54:52
future a little bit better. You
1:54:54
know, if there's been a breakthrough, you
1:54:56
can maybe do something about it,
1:54:58
you can maybe invest in it. Say
1:55:00
you've learned that there's a vaccine for
1:55:02
cancer, you can invest in it and you
1:55:04
can help the people that are actually trying to
1:55:06
make that happen, you know. So that's
1:55:09
a lot more useful. That stuff is positive,
1:55:11
and positive news tends to be
1:55:13
more useful overall than the kind of negative,
1:55:16
engagement-driven stuff. Two of my
1:55:18
favorite websites that I
1:55:20
go to are PsyPost and Psychology Today.
1:55:22
They're both really good, yeah, phenomenal,
1:55:24
and they have good insights about human
1:55:26
nature, if you're interested in that. A
1:55:29
lot of the studies I cite on the
1:55:31
show come from PsyPost
1:55:33
or Psychology Today. They're both great.
1:55:35
And, you know what? The Browser, are
1:55:37
you familiar with that? The
1:55:39
Browser? No. So The Browser, it's
1:55:42
been going for, I think, over a decade
1:55:44
now. It is a daily email
1:55:46
of five articles, and there is nothing,
1:55:48
the articles have nothing in common
1:55:51
at all other than the fact that
1:55:53
the editor has found them to
1:55:55
be interesting. And it's, it's
1:55:57
my favorite place to just get exposed
1:56:00
to always new, new,
1:56:02
new stuff. Like here is
1:56:04
the life story in 3000 words
1:56:07
of like a boot polisher from 1800s Birmingham.
1:56:10
Uh, and here is some new
1:56:13
drone technology that's coming out of
1:56:15
China. And here is like a
1:56:17
story about Genghis Khan
1:56:19
and whatever. Like it's just so varied
1:56:21
and literally the only single
1:56:24
thread between them all is that the guy, uh, Robert
1:56:26
Cottrell, I think, the dude that's in charge, has
1:56:28
found them interesting. And, uh, on
1:56:30
the whole, not every one is for me, but at
1:56:32
least one to two per day are
1:56:34
amazing. And I think it's maybe like 40 bucks
1:56:37
a year. Uh, and your Substack,
1:56:39
something else that people should subscribe to,
1:56:41
which they can go to gurwinder.substack.com. Uh,
1:56:44
I'm, I'm definitely some sort
1:56:46
of premium member, which I enjoy. What can
1:56:48
people expect from you over the next few
1:56:50
months? What's what's coming up? Yeah.
1:56:53
So I'm working on, uh, um, my most ambitious article
1:56:55
yet, which is going to be a long read. It's
1:56:57
going to be about 5,000 words. Uh,
1:56:59
I'm working on it for UnHerd, but I'm also going to
1:57:01
be posting the longer version on my Substack. Uh,
1:57:04
and it's about, uh, gamification and,
1:57:06
uh, how it can be used to
1:57:08
control us, but how we can take advantage of it. That's going
1:57:10
to be, I think a very useful one for a lot of
1:57:13
people. Um, I also got
1:57:15
my, my book. I don't want to talk too
1:57:17
much about my book yet because, uh, it's coming.
1:57:19
It's coming, but, uh, there's something big in
1:57:21
the works. Is it going to be, is it the first one or the
1:57:23
second one? Cause it's two, right? Yeah, that's
1:57:25
two. Yeah. The first one's coming out next
1:57:28
year, so not long. So then
1:57:30
the one after it's probably going to come the year
1:57:32
after. So that'll be in 2025, but yeah, there's going
1:57:34
to be a book hopefully next year. Uh,
1:57:37
and I'm also going to be trying
1:57:39
to, uh, actually start doing videos as
1:57:41
well. Uh, cause I've had a
1:57:43
bit of demand from that. So, um, I think by the
1:57:45
time this comes out, I might actually have a YouTube channel.
1:57:47
I don't know. Uh, but if
1:57:49
you're watching this and you're interested in
1:57:51
hearing me ramble more, then you
1:57:54
might want to search my name on YouTube. I'm
1:57:57
going to guess people go to your... everything
1:58:00
will be on the Substack, I'm gonna guess. Everything's on
1:58:02
the Substack, yeah, and also Twitter. I'm gonna be more
1:58:04
active on Twitter. I've got another mega
1:58:06
thread coming up actually because I'm gonna do
1:58:08
one for the winter. The 2024 mega
1:58:11
thread is going to be out in about a month or two, so
1:58:15
that's going to be the next big thing on Twitter but I'll
1:58:17
be posting a lot more now so
1:58:20
because the bulk of my work on my book's done so yeah
1:58:23
I'm hoping that 2024 is going to be a very
1:58:25
productive year for me. I look
1:58:27
forward to it, man. I'll try to do mega
1:58:32
threads. Yeah, yeah, you
1:58:34
might want a bit more sleep than I get. But
1:58:36
yeah, dude, look, I really cherish these episodes. That's
1:58:38
two hours that's come by in literally no time at
1:58:38
all. Once the next mega
1:58:40
threads up you will come back on we will talk
1:58:42
about it again and we will we'll have more fun
1:58:44
but for now, ladies and gentlemen, Gurwinder Bhogal, thanks so
1:58:46
much for today, mate. Thank
1:58:49
you, always a pleasure, Chris.