Baratunde Thurston Explains Our Pain [Rerelease]

Released Thursday, 24th August 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

Today's episode of What Future with

0:02

Joshua Topolsky is a re

0:04

release. It's a really interesting

0:06

listen. We will be back

0:09

next week with an all new episode.

0:12

Thanks for listening,

0:31

hey, and welcome to What Future.

0:34

I'm your host, Joshua Topolsky, and

0:37

We've got a great episode today because

0:39

I have been wanting to talk to our guests for

0:41

a long time, and in fact, the guest

0:44

of today's show I had on

0:46

my old podcast a long, long

0:49

time ago, and it was

0:51

one of the greatest conversations I've ever had in my

0:53

entire life. And frankly, I've spent

0:55

a lot of years in therapy dealing

0:58

with the fact that not every conversation could be as

1:00

good as that one. Of

1:02

course, talking about my conversation

1:05

with Baratunde Thurston, who

1:07

is a podcaster and a

1:09

writer and a TV personality,

1:12

I think he's done a Ted talk. I

1:14

believe he is maybe an astronaut.

1:17

He's doing many things, and he's

1:19

a fascinating guy. Currently he's writing for Puck

1:22

News, which you may have heard of, is one of my favorite

1:24

new websites, and he's been writing about

1:26

social media and technology and one

1:28

of the things that he talks about a lot, and I

1:31

know thinks about a lot is sort of the way we get

1:33

information and what's in our diet,

1:35

how we get the news of the

1:37

day, how we process it, how we think about

1:40

it, how generationally we're thinking about

1:42

these things, And that to me is a fascinating

1:44

topic because I feel like we

1:47

have this information crisis in

1:50

the world right now. I mean, I really believe

1:54

the root of most of our

1:56

problems in society.

1:58

Okay, that's pretty broad, but I think the one

2:00

of the main causes of the

2:02

pain that we experience in modern life

2:06

is the fact that it's

2:08

become so impossible to

2:10

get and understand information. There is

2:13

so much noise, so

2:15

much misinformation, so

2:18

much that is wrong

2:20

that you have access to. Right when

2:23

we thought of the Internet, you know, me

2:25

and the think tank that created the Internet that I

2:27

worked on, when we envisioned the

2:29

Internet, or when we thought about it in its earliest days,

2:31

this idea that it could give you access

2:34

to all the world's knowledge, that you could have every

2:36

book and every library, every

2:38

encyclopedia, every piece of information

2:40

available, felt like this superpower

2:43

that we would all have, that this is

2:45

like the beginning

2:47

of a new era.

2:50

Of understanding and education and

2:52

information and humanity

2:54

coming together and people solving their

2:56

problems, and we're all going to communicate

2:59

and share and explore

3:02

together in cyber space. And look

3:04

at all this access that we now have to edify

3:07

ourselves to become smarter, to become better

3:10

citizens of the world.

3:12

And what nobody

3:13

seemed to think of, and myself

3:15

included, is that the inverse

3:18

of all of the world's information

3:20

is true, is that all of the world's disinformation can

3:22

also exist there. And in fact, it

3:24

turns out disinformation and misinformation

3:28

is louder and faster and

3:30

sexier and more interesting and more exciting

3:33

than the real stuff, And it's

3:36

easier to share, and it's easier to scam

3:38

people with, and it's easier to convince

3:40

people of. For whatever reason, we

3:43

are drawn as human beings to the

3:45

more dramatic stories, and the more dramatic

3:48

stories are often the lies. And so

3:50

the Internet has become as much as it

3:52

is a tool for humanity

3:54

to explore the world's

3:57

knowledge and to explore all

3:59

of the fascinating aspects of

4:01

humanity itself, it's also

4:03

become this massive and horrible tool

4:06

for disrupting in

4:08

the worst way possible with the most negative connotations,

4:10

possible, disrupting how our brains

4:13

work and disrupting what we know and what we believe

4:15

is true and what we agree on is

4:17

scientific, and whether or not the

4:19

Earth is flat, Like, these are now things that are

4:21

up for debate in a lot of circles.

4:24

And I say, in a lot of circles. On the internet, it feels

4:26

like a lot of circles, but in reality,

4:29

maybe it's only a few circles. But

4:31

those circles are growing,

4:34

and the pools

4:36

inside the circles are deepening, and we're being

4:39

sucked down to the bottom of the sea. And

4:42

now I'm imagining like a cylinder filled with water

4:44

and we're all in it.

4:45

At any rate. This is my roundabout

4:47

way of saying.

4:48

I thought, who better to talk about the

4:51

Internet and information and the future

4:53

of humanity as we know it. Who better

4:55

to have a conversation with about those topics

4:58

than Baratunde Thurston. And

5:01

he agreed to come on the show. He's here

5:03

with us right now.

5:21

Okay, I'm very happy you're here.

5:22

Yeah.

5:23

I had you on my show in twenty fifteen,

5:25

Yeah, which is like kind of amazing.

5:28

I went back and listened to a little bit of it. We were both drinking.

5:30

First off, you

5:32

were having a gigantic glass of whiskey

5:35

and I was having vodka. We got into a

5:37

long conversation about the Gowanus Whole

5:39

Foods. Which... you don't live in New York

5:41

anymore?

5:41

Do you? You live in LA?

5:43

I live in LA. I left New York

5:46

four years ago.

5:47

Now that's smart, that's so smart. I'm jealous

5:49

of you, all right, So let's talk about you for a second.

5:51

Now, Yeah, I mean, obviously we now know where

5:53

you're located. That's important. So

5:56

I've been reading you a lot because you're writing

5:58

for Puck Puck

6:00

News.

6:01

Is that what we call it.

6:02

We call it Puck. The website is

6:04

puck dot news, but the name is just Puck.

6:07

I like Puck a lot. I gotta tell you, I'm a big

6:09

fan. I'm a subscriber. I

6:11

have been reading and following along.

6:13

But your stuff is really interesting lately,

6:16

especially because you're always interesting.

6:18

But you've been talking a lot about social media.

6:20

I mean, you just wrote a thing about TikTok that I thought was

6:22

super interesting. You've been obviously talking

6:24

about Elon Musk and the Twitter situation.

6:27

You made a passing comment.

6:29

Obviously that's been a topic of great interest

6:31

to me because because it's your fault, because

6:33

I personally am responsible for Elon

6:35

Musk's behavior.

6:37

You laid the path I did.

6:40

Why did I do that?

6:41

Why did I Why did they make him act

6:43

in that way? But you

6:45

made this kind of passing comment. You're like, yeah, Twitter

6:47

as we know it is basically dead or something like that.

6:50

I'm paraphrasing. Do you really believe that? Is it

6:52

over?

6:53

I think so. What I added in that

6:55

comment was that you know, I was referring

6:57

to Twitter in the past tense because

7:00

I think of Twitter, it is code,

7:02

but it is also culture, and

7:05

it is the unique combination of code and culture

7:07

that created the environment that we knew as

7:09

Twitter. Same goes for cities like

7:12

New York is a different

7:14

place than when I lived there four years

7:16

ago, I think, especially if a place goes through a major

7:19

tragedy, it is substantively

7:22

different. You know, after if you lived in New York

7:25

before 9/11 and never lived there after, you don't

7:27

really know New York anymore. Like the character

7:29

of the place has shifted, even though it's still New York.

7:31

It's kind of an existential question, or a metaphysical

7:34

one too. If like all my cells have changed,

7:36

am I still me? But in this case,

7:38

I think what Elon has done at Twitter is so

7:40

violently abrupt

7:43

in terms of driving out staff

7:45

and also users of

7:47

a certain kind, that he has shifted

7:50

the nature and culture of the place. He

7:52

has signaled really obviously

7:54

and sort of boneheadedly what

7:57

type of speech he wants to be free,

8:00

just his right wing buddies, and what type

8:02

he will police, which is anything critical of him

8:05

or any jokes he doesn't understand. Twitter's

8:07

culture is now Elon's limitations.

8:11

And he's just scaled himself, right,

8:13

He's like implemented his predilections

8:15

and personality into the code,

8:18

right, And and he has the right to do

8:20

that. That's what billions of dollars

8:22

and debt and other people's money can

8:24

help you do. But it is not the

8:27

Twitter that we commonly refer to anymore

8:30

because a lot of the people who made

8:32

that place what it was aren't spending time

8:34

there. Yeah, and I include myself in

8:37

that. And I'm late to the exodus by

8:39

some measures, even though I've

8:41

substantively like downgraded my time there

8:43

years ago, before Elon even came on the scene.

8:45

He just accelerated a path

8:48

outward and made it easier for me to spend time

8:50

elsewhere.

8:50

I mean that point is like I

8:53

get it so well because I feel the same way

8:55

like this. Yeah, you know, there definitely

8:57

was a point, you know, I don't know when I was just like

8:59

now I'm like, I can't do this anymore. But it

9:02

has to end, right? Like, it can't... it's not financially

9:04

sustainable in its current state, don't

9:06

you know?

9:07

It isn't. And I think there's cultural death,

9:09

there's technological or kind of

9:11

operational death, and

9:14

there's financial death, and all of these

9:16

are on the table, and I think they

9:18

probably move in that order. Certainly, the cultural

9:20

one, I think has already very

9:23

much begun. When you start like banning

9:25

journalists who criticize you as if you were Saudi

9:27

Arabia itself. It's

9:29

one thing to take their money. It's another to adopt

9:32

their stance on you know, freedom of expression

9:34

and free press in your

9:36

public square business with

9:39

no sense of irony about it,

9:41

like legit just doing that. So when

9:43

people like me and you and others, a lot of the

9:45

black Twitter folk, even some of the crypto

9:47

kids, Like there's a whole community that's just

9:49

like, I'm not here for this. It's exhausting.

9:52

The town square shouldn't be the subject

9:54

of discussion of the town square, right,

9:57

But they don't

9:57

gather in the town square to be like, Wow, look at this. I love

9:59

what they've done with the stones here or whatever.

10:01

Yeah, like they gather to talk about other stuff right about

10:03

like the janitor, you know.

10:05

Yeah, it looks like one of those things

10:07

like when the story is about you, right, and it's

10:09

like it's like when you've made it about you or whatever.

10:11

Yeah, so to your point about the money, you

10:14

know. My colleague at Puck, William Cohan, has

10:16

written much more incisively on this angle

10:18

because he's a former banker and understands some

10:21

of the Wall streetness of how Elon put this

10:23

deal together and what's required and when payments

10:25

are due, and he's gonna

10:27

owe some significant amount of money

10:29

on the debt he took out to pay for this thing. He

10:32

can afford it, right, But with

10:34

every drop in the share price at Tesla, it's

10:36

going to sting a little more. With every increase

10:39

in the FED rate, it's gonna sting a little more.

10:41

Yeah, and with every dollar

10:44

of revenue that brands aren't spending

10:46

in his toxic environment because they don't want to be adjacent

10:49

to white supremacy and Nazis and fake

10:51

news, it's going to sting a little more. And

10:54

at some point he's gonna

10:56

have to do something even more radical than

10:58

what he's already done just to keep

11:01

the lights on, which he had decided

11:03

so far to handle by not paying his rent, right,

11:05

which is so very trumpy of another

11:07

brilliant, genius billionaire over

11:10

it.

11:11

Like not paying your rent is one of those things where like

11:13

if you're a regular human being, it's

11:15

such a great demonstration of like the difference

11:17

in class in

11:20

like what the billionaire class or the very rich

11:22

are capable of that we are not. And by the way,

11:24

the rent on Twitter spaces is not like

11:26

thirty eight hundred dollars a month or something. I

11:28

mean, it's not like a one bedroom apartment

11:31

in like Bushwick or something. It's

11:33

like I don't even know what the rent would be. It's probably way more than

11:35

that right now.

11:35

Add zeros.

11:36

Yeah, but like he's just like I'm not going

11:39

to pay the rent, and everybody's like, well, Okay,

11:42

I guess, like what do we do. We can't do anything

11:44

to Elon Musk or like to the company, but

11:46

like if you don't pay your rent, or if I don't pay you know what I mean,

11:48

like if a normal person, And maybe

11:51

we're not perfectly normal, but no, I'm

11:53

definitely above normal at this point.

11:54

But in general, a

11:57

person who does not pay their rent, there are consequences,

12:00

gonna get fucking evicted, like we're

12:02

gonna take you to court. And like none of that happens.

12:04

And it's like, to me, what's more galling than like

12:06

him going, I'm not gonna pay it is that nobody does

12:08

anything.

12:09

It's like everybody sits there.

12:10

Look, we're raised, Josh in this world

12:12

where when someone like Elon,

12:15

of his gender, of his race,

12:17

of his inherited wealth decides

12:20

to transgress, we tend

12:22

to celebrate it right until

12:25

it's too late, you know.

12:28

Walter White.

12:28

Yeah, I mean, he's like just

12:31

like Heisenberg

12:33

was way cooler than this guy.

12:35

His drugs aren't as good, but

12:38

his blue meth is like, I don't know, fucking

12:40

the blue check mark.

12:42

Yeah, but he can afford

12:44

to call our bluff

12:47

as a society. He can afford to wait

12:49

us out. And by us I don't mean like me

12:51

and you specifically, but just society

12:54

at large. You know, people who

12:56

need money to live, money

12:59

that they get paid on a weekly,

13:01

biweekly, or monthly basis. He

13:04

can take the hit longer because

13:06

he's got a big old cushion

13:08

to fall back on, and so he can just like

13:10

he knows he'll get sued for how he treated the employees

13:13

when he first came in. And he's

13:15

just gonna see, like will they go through their trouble of

13:17

hiring lawyers or will

13:20

it burn so much of their time because they can't afford

13:22

to spend it on that, and I can

13:24

just wear them out. It's the Trump move,

13:27

you know. It's like I'm gonna break this law. Let's see if the government

13:29

bothers to prosecute I'm

13:31

going to not pay these vendors. Let's see if they bother

13:34

to send lawyers after me, or will I

13:36

just settle for pennies on the dollar

13:39

and it's a deal. I've won the deal

13:41

of the century. But you're also just it's

13:43

also criminality, that's another word for

13:46

it, right, and dishonor you

13:48

know, at best, the most charitable

13:50

interpretation is he's a meanie

13:53

and he's not honorable.

14:09

The thing that I find so and I've talked about this a bunch,

14:11

but like you know, I don't want to belabor it, and I'm sure you're all

14:13

talked out over Twitter, but like, what's

14:15

so incredible to me is I never was like, Elon

14:18

Musk is clearly the smartest man in the world or anything.

14:20

But there was a period where I was like, this

14:22

guy's interesting. Yes, way before

14:24

he was whatever his Twitter persona was, there

14:27

was a period of years where Elon Musk

14:29

was like you didn't hear that much about him

14:31

on a personal level, but you heard

14:33

a lot about the things he was doing, and the things he was

14:35

doing were really fucking interesting

14:37

and really seemed to be working right, Like

14:40

once he was like involved in Tesla, the

14:42

SpaceX stuff, some of his like

14:44

big ideas that he would talk about, but you never really

14:47

heard him go like way into detail on it.

14:49

It's so strange to me to see the evolution

14:53

or de evolution of

14:55

his like character, of his personality

14:57

like to whoever he plays in public.

15:00

Yeah, it's like he's hearing these cheers from whoever

15:02

wherever it is.

15:03

A different section of the stands. He's

15:06

playing to that part of the room. Yes, so

15:09

I share a big part of that, Josh.

15:11

And I think I went from

15:14

we're talking a decade ago, you know, not really

15:16

knowing anything about this guy, heard

15:18

about Tesla a little bit. Oh the roadster looks

15:21

kind of cool. I'll never have one of those anyway,

15:24

to twenty eighteen, maybe

15:26

twenty seventeen, getting serious about wanting to get

15:28

a car in New York because

15:31

my wife came from California. She's like, I

15:33

can't do this subway thing all the time because the grocery

15:35

stores... is how do you live? Like, this isn't a life, you call

15:37

this? I agree with her, and

15:39

I was like, cool, cool, cool, but we're not getting a combustion

15:41

engine car. Like, that's my line. Her

15:43

line was I need car. My

15:46

line was not gas car. And

15:48

there was one option, yeah, you know, which was Tesla

15:50

Model three. Okay, so I start paying more

15:53

attention. I'm building spreadsheets to analyze

15:55

what's the best car, and the Niro is coming out

15:57

but it's not available yet from Kia and

15:59

Hyundai. And I'm also

16:02

keen on the Space thing, and I'm like starting to look

16:04

into what's up with SpaceX, and

16:07

the YouTube algorithm is great for me. I have

16:09

never really been fed Nazis and

16:12

sort of disinformation. I've been fed like

16:14

how to make great cocktails, how

16:17

to compost, you know, and

16:19

like what electric cars should you buy? Like my YouTube

16:21

is dope? And YouTube

16:23

got a whiff that I was interested in space again

16:25

because I was a space kid, and so I watched

16:28

his announcements around the Mars

16:30

launcher and the reusable rockets,

16:32

and I just admired his

16:35

systematic thought. He

16:38

has like thought through all these parts, and he was laying

16:40

it out very logically, very engineering oriented,

16:43

and over promising stuff that would eventually

16:45

happen but definitely wouldn't happen when he said

16:47

it would definitely. That's the one thing you know is

16:49

true is that he's not telling the truth about

16:52

the ready date. But beyond that,

16:54

added up: multiplanetary species, climate

16:57

change, EVs. Yes, sign

16:59

me up. So came to Cali, got

17:01

the Tesla, leased the Three, real excited

17:04

yeah, had friends at SpaceX, almost visited

17:06

SpaceX never made it over to visit there, but

17:09

just at admiration for these

17:11

two pieces of the puzzle. Boring company

17:13

I don't know, flamethrowers. That's kind of weird.

17:16

That's exactly when it started to go the moment.

17:18

That's a moment.

17:20

Yeah, it was like he was getting high on his own supply,

17:22

Like that's the fucking reality. People were

17:24

like, you're so cool on He's like, yeah, I'll put a flamethrower

17:26

out.

17:27

And that was it.

17:28

To explain part of the transition, You

17:30

explain what happens to a lot of

17:33

young men who get radicalized

17:36

by the Internet. In this case

17:38

young maybe a political

17:40

but ultimately hyper conservative,

17:44

largely white men who

17:47

fall into the rabbit holes of you

17:49

know the thing about feminism, right, they

17:52

want to cut your balls off. Wait, so I'm not going to be a

17:54

man anymore?

17:55

God, is there a video?

17:57

You know the thing about black people, They want to take everything

17:59

you ever worked hard for and rub it in your

18:01

face. You know the thing about indigenous

18:03

people, you know the thing about climate change, and

18:06

it just becomes this like low

18:08

appeal to people's deep insecurities.

18:11

And that was modeled for him by

18:14

the former president very well,

18:16

very effectively, by Fox News

18:18

and by people in an atmosphere

18:21

online where he spent a lot of time you

18:23

know he loves memes. I mean, so much of this stuff

18:25

was born out of meme culture with Pepe the Frog

18:27

and Gamergate, you know, that was

18:29

all meme warfare early on. And

18:32

he's he's a Reddit kind of guy. You know,

18:34

he's a he's a hashtag kind of dude. And

18:36

is he a 4chan kind of guy? He's a 4chan guy,

18:39

clearly. And, you know, he plays with Q.

18:41

You know, 'my pronouns are'... was

18:44

it 'Prosecute/Fauci'? I mean, that's

18:46

that's some cute stuff, right, but

18:48

it's actually very explicable

18:51

if you have, as you have, spent

18:54

time on the part

18:56

of the Internet. That's not just about cloud services

18:58

and profits going up and to the right, but that's about

19:01

emotion and culture and disaffected

19:03

young men, which are usually the source of

19:06

most societal collapses. Is like young

19:08

men with too much time on their hands. I guess

19:11

I agree, and and not productively put

19:13

to use. And Elon, who

19:15

as CEO of four companies, should have had plenty

19:17

to

19:17

do. Well,

19:31

that's my thing.

19:32

This is as you're saying this, all I can think

19:34

is, and I've said this so many times, but like if I

19:36

were the world's richest man. Do you know

19:38

what I would not do is I would not fucking tweet.

19:40

I would not go on Twitter. I would not need to

19:43

go on.

19:43

Like he's that you and him are different.

19:45

I feel like there's so many things to do in the

19:47

world that you could be doing that isn't

19:49

tweeting. And I think you probably identify with this.

19:52

We're very online people and we've probably

19:54

spent like a pretty good amount of our career like building

19:57

our whoever we are, like our persona.

19:59

Like it's like on Twitter, on Instagram or whatever,

20:02

like we're communicating, but we're also and

20:04

I'm not saying I did this consciously, but it becomes

20:06

a thing where you're like that's your platform, Like that's where you

20:08

talk to people. That's like where you need to kind of be

20:11

doing what you're doing, Like you do it there, right,

20:14

But like to build an electric car,

20:16

you don't do it on Twitter, Like you don't need

20:18

to do it on Twitter. Like to go to space, you don't

20:20

do that on Twitter. It's not part of the like doesn't

20:22

need to be part of his birth. Like Tim Cook doesn't fucking

20:25

tweet, you know. Tim Cook, like, has

20:27

he tells like his fourth assistant to

20:30

get his social media person

20:32

to tweet about the new watch or whatever.

20:34

Tim Cook has something

20:38

that Elon's missing. He's

20:41

got a whole sense of himself.

20:44

Yeah, he has self affirmation

20:47

that doesn't require thousands

20:49

of strangers every day to affirm him

20:52

instead. Right, he's got self love

20:55

in a way that is satisfied and quiet

20:57

moments as opposed to having to make noise to

21:00

prove that he still exists. Yeah,

21:03

and so it's not that

21:05

mysterious. You know, we're not inside of

21:07

Elon specifically, but no,

21:09

I know what it feels like to

21:12

like, pics or it didn't happen

21:15

right like I And I'm

21:18

I think objectively more balanced,

21:21

you know than Elon. Even though I have way less

21:23

money, I have more of what I need.

21:25

Yeah, that's right. You're not the world's richest guy

21:28

here, yet you are acting normal, so

21:31

unusual.

21:32

We get to watch somebody,

21:35

you know, with an unmet need act

21:37

out in public. And we've seen it with Kanye,

21:40

and we've seen it with Trump, and we've seen it with

21:43

a lot of folks, with all kinds of people

21:45

thrust into public life who

21:47

don't have their own private lives sorted out,

21:49

and so in that sense. There's no mystery,

21:52

it's not intriguing, it's not novel.

21:54

It's just so much more in our face because

21:58

he forces us to watch. Because when

22:00

he.

22:00

Literally bought the thing, yeah that we all

22:02

were communicating on, like he bought like the

22:05

phone company, you know, he

22:07

was like, no, I own the lines now, So you're gonna hear

22:10

me a lot. I'm just gonna kind of, like, hear you guys, what's going

22:12

on? Like you're like, dude, I didn't. You're not in this call.

22:14

He's just popping up. But okay, But it's

22:16

interesting you mentioned Kanye, and obviously Trump

22:19

like Kanye is very I mean, I assume he's

22:21

still pretty rich, but has been at times very

22:23

rich and very powerful, very much like

22:26

in the public eye and very much in control

22:28

of like the public sort of conversation,

22:31

not where Elon is, like, he's not the world's richest

22:33

man. And yet they both have access

22:35

to presumably in Trump

22:37

as well, to kind of any information

22:39

they want or need. And yet they

22:41

all all from very different backgrounds,

22:44

all doing very different things,

22:46

seem to gravitate towards him publicly,

22:49

gravitate towards this weird

22:51

conspiratorial thinking

22:55

that I don't know. I don't want to give anybody

22:57

too much credit. I think that all of them

22:59

including Trump, I hate to say

23:01

it, are intelligent in some

23:03

way.

23:04

Yeah, you know, it's not a stretch, that's

23:06

yes, easy.

23:07

But like the shit they gravitate towards is kind of

23:09

the lowest, like the bottom

23:12

of the barrel, like conspiracy shit.

23:14

Yeah right, Like it's not like, oh,

23:16

I didn't hear this one before. It's not like, oh

23:19

wow, like I hadn't considered that angle.

23:20

Oh the Jews. Really the Jews,

23:23

that's your hot intelligent take? The

23:26

thousand year old conspiracy.

23:27

It's like the protocols of the Elders of Zion

23:30

or whatever is the new conspiracy

23:32

that you've now been clued

23:34

into. But it's funny. It's like

23:36

Elon shares the same shit. It's like the Fauci

23:39

conspiracy stuff. Yeah, and it's like the stuff

23:41

that is really designed for

23:43

for people who don't have better access to information.

23:46

Right, So a lot of what

23:48

you've said applies

23:50

to many of us. You could take

23:53

someone from the early

23:55

nineteen hundreds and they

23:57

could make that speech about

23:59

anyone alive after you

24:02

know, nineteen ninety after

24:04

two thousand and said, like, you have access to

24:06

incredible information, right,

24:09

I thought drinking mercury on a daily

24:11

basis would improve my humors, you

24:13

know, and like you know better, Wait

24:16

it doesn't, right, And you have you know, Healthline

24:19

and Wikipedia and doctors like

24:21

you have, you have science, and

24:24

yet you still engage

24:26

in superstition and conspiracy

24:29

and unproven things. So it's

24:32

so tempting and very easy. And I'll

24:34

often be like, what's wrong with those people?

24:36

Right? But it's like, Okay, we're all people. Something's

24:38

wrong with all of us. They have a bigger

24:40

stage on which that's That's

24:42

what I'm.

24:43

Saying is so yeah, I feel like I can understand when

24:45

you don't have access to.

24:47

So access doesn't mean anything. You

24:50

know, people who enslaved other

24:52

people had access to the idea

24:54

of human rights right right,

24:57

Like it wasn't like a secret, you know, it wasn't

24:59

like that, oh my god, people could just own

25:01

themselves what that's madness.

25:04

They weren't waiting for you know, King

25:06

to say it out loud right right, It was no,

25:09

it was knowable, Yeah, And yet there

25:11

were incentives set up and benefits

25:13

set up to encourage millions

25:16

of people to go along with something we now say

25:18

is like absolutely insane and dehumanizing

25:20

and terrible. Right, that's a long way of getting to like

25:23

what the incentive is for people

25:25

with presumably a lot of power and access

25:28

to resources and better information

25:30

than us to engage in the opposite.

25:34

Transgression is a hell of a drug.

25:37

And if you can be

25:39

self perceived as like, yo, I'm so punk

25:41

rock, you know, I find the real truth?

25:44

Like it goes back to X Files. Man, we're old enough

25:46

to remember, like the truth is out there, which

25:49

planted a seed. Not that show specifically,

25:51

but it was of a moment where we

25:53

can't trust anything.

25:56

Yeah, like two plus two is four? Maybe

25:59

prove it? Yeah, well okay, but what

26:01

kind of numbers are you and what system of math

26:03

are you basing that on? What assumption?

26:05

Right?

26:06

You can trick yourself into thinking

26:08

you're thinking when you're just viewing

26:10

nonsense. And there's a lot

26:12

of nonsense that passes for intelligence

26:15

and thought these days. That's just counterintuitive

26:18

bullshit. Right. But it's packaged

26:21

up as Andrew Tate, you

26:23

know, wow, it's packaged up as

26:25

Ben Shapiro, And he sounds

26:27

like a nerd, so he must be smart. I love

26:29

Ben right, not

26:32

that everything he says is bullshit, but so

26:34

much is so much, And

26:36

so I think people like a Kanye

26:39

or an Elon or parts of us, it's

26:41

like, all right, I know you told me that

26:44

COVID, you know we'd have to do

26:46

these things. But I know the government's

26:49

been wrong about this stuff before. Look,

26:51

the black person in me, not specifically me,

26:53

but ancestrally, who was experimented on in

26:56

Tuskegee by government health experiments,

26:58

understands skepticism about government health

27:00

mandates that might hurt people.

27:03

So the anti vax thing is not even like, you're

27:05

so dumb, why don't you believe science?

27:07

Well, fucking science is always changing, right, and

27:09

sometimes science is abused. When

27:12

you have no trust or very low trust in the environment,

27:14

then you can take good faith

27:16

nuance and weaponize it as faith

27:18

in nothing, and it becomes nihilism,

27:21

and it becomes ship posting, and it

27:23

becomes pretend intelligence when you're

27:25

just peddling noise and nonsense. And I think

27:27

Elon crossed that line so much. As a business

27:30

owner, he was frustrated at California's

27:32

mandates to keep people safe, right,

27:34

and he took it to the extreme and

27:38

is now pedling conspiracy theories about

27:40

one guy who's spent most of his life

27:43

literally trying to save millions and millions of people,

27:46

right, Like, there's many people who could and should

27:48

be vilified. Anthony Fauci ain't one of them. He's

27:50

made mistakes. I would acknowledge that some

27:52

of the messaging wasn't always clear,

27:55

but prosecution those

27:58

are working out here trump a bit. Yeah, it's

28:00

like metal, you know, not

28:03

like firing lines. I don't

28:05

know, man, Well I know some of

28:07

the answer and it gets exhausting

28:09

sometimes. But then this is also like, okay, what's the what's

28:12

the lesson or the opportunity here besides

28:17

spinning up media cycles?

28:19

Talking about the guy who's powered by media

28:21

cycles? Right? And I

28:23

mean, I think the deeper thing

28:25

that we're playing around with here, which is like, Okay,

28:28

what's the motivation, what's the incentive?

28:31

How do we try to like protect

28:33

against this because he's not alone,

28:35

he's loud. But I just I think

28:37

there's something about being perceived as

28:41

brave and clever,

28:44

and there is a value to victimhood

28:47

which anyone in power

28:50

loves to wrap themselves in.

28:52

That flag. The greatest deflection

28:54

against criticism of your own power is

28:56

to claim to be the victim.

28:58

You know.

28:59

It's it's just so crazy too. You're like,

29:02

I don't think you're the guy who's had the

29:04

hard time.

29:05

Yeah. Yeah.

29:17

The thing you said about the X Files, which I think about

29:19

all the time, is like, you know, when I

29:21

was a kid, you know, or whatever, young adult, I guess

29:23

when it was on. There's a lot of stuff

29:25

on that show that circles of friends

29:28

of mine we were like, yeah, what is up with not

29:30

like the things like there's aliens or whatever. Yeah,

29:32

but there were books that now form

29:35

the basis for like some of the craziest,

29:37

most fringe conspiracy theories that were

29:39

like these books like Behold a Pale Horse in

29:42

like these like old school like the Robert Anton Wilson

29:44

books like the Illuminatus trilogy

29:46

and stuff that were like, oh, yeah, the Knights

29:48

Templar and the Vatican and there's all this like cover

29:50

up and whatever, and you're kind of like, yeah, I kind of believe

29:53

that because the Vatican seems sort of fucked up

29:55

and who knows what's really.

29:56

Going on there?

29:57

All the people I knew it was all like the

30:00

fascist state, the government man

30:02

is like covering it up or whatever.

30:04

And now it's like weirdly flipped.

30:06

Yeah, the right wing now has like somehow

30:09

adopted these things that I felt like were these

30:11

really like almost like hippieish sort of conspiracies.

30:14

If you think of conspiracy theories as a virus

30:17

capable of spreading morally

30:19

neutral in terms of the hosts,

30:22

it will attach itself to yeah,

30:24

and it has variants, and it will mutate, and

30:26

it will find new ways to

30:29

increase the mortality rate, but not so

30:31

much that it decreases the

30:33

R-naught and the spread. Like, all this stuff we shouldn't know, but

30:36

we know now because of COVID. We're all little epidemiologists

30:38

now. I just dropped R-naught in a casual conversation.

30:41

So because I read a white

30:43

paper, you know, or five in

30:45

the past three years.

30:46

You're staying informed, you know what's going on.

30:48

And again overwhelming my brain with things

30:50

that I'd like specialists to actually specialize in.

30:52

Right, So you know, being

30:55

liberal is not an inoculation against

30:57

conspiracy. Right, And the same doubt

31:00

of systems, you know, is

31:02

available to liberal and

31:05

conservative and it's the basis for americanness.

31:07

To begin with, like we don't trust the king. We

31:10

started with distrust. That's a good

31:12

point. We should overthrow the government.

31:14

Actually, yeah, So it's like whether you're like

31:16

a black panther who doesn't trust the government, or

31:19

like some libertarian farm owner who doesn't trust

31:21

the government. Like we share something. We like, what's

31:23

up with this government? And even though

31:25

one may vote for Democrats and the other

31:26

for Republicans. What's up with this government, I think that's

31:28

a... that's a great

31:29

question, is

31:31

a super good question. But I also like

31:34

I used to indulge in conspiracy theory.

31:37

I'm gonna I can't believe I'm about to say before everybody

31:39

else did.

31:40

Josh, Wow,

31:42

Oh you're like the the indie rock guy.

31:45

Yeah, Like I'm that guy. I'm that guy that like I

31:47

was there at the at the indie club,

31:50

you know, like in the base.

31:51

Like losing my edge.

31:52

This is the LCD Soundsystem 'Losing

31:54

My Edge,' you know. Like, I was there when it happened.

31:56

And now that everybody's into them, they're not

31:59

cool anymore.

31:59

You were the original infoce.

32:01

Theories are not cool when the

32:04

richest guy in the world loudly embraces

32:06

them, right? They lose their edge. Suddenly it's

32:08

not fun. It is no longer transgressive or countercultural

32:11

when the ruling class rules

32:13

by conspiracy theory.

32:15

Yeah.

32:15

Right, So when the President of the United

32:17

States is like, you know what's true, lizard

32:19

people like fuck, I guess I can't be

32:22

engaged in that.

32:23

It actually says him, but lizard people didn't. I mean,

32:25

he's got so.

32:27

It's not cool, and I think it's it's more but

32:29

it's more dangerous. There's something about indulging,

32:32

like with highly processed

32:35

foods or dessert, right,

32:37

a little bit has value. It

32:39

can add balance, it can add an interesting,

32:42

rich flavor. But if you just eat

32:45

that, if you only eat Twizzlers, yeah,

32:47

you're gonna die a horrible death sooner.

32:49

They're actually a sponsor of this episode,

32:52

so sort of disappointing to hear that you're not a

32:54

fan, but.

32:55

It's too popular. They need to stay punk,

32:57

you know, punk can ever be mainstream.

32:59

Twizzler is the most punk candy of all

33:01

time as well. You know, it's funny

33:03

that you mentioned that you were into conspiracy

33:06

when it was cool or whatever, but.

33:07

Like ham Radio times, you know what I'm saying, Like, as a

33:09

whole different level of commitment.

33:11

Well, I used to.

33:12

I used to love Coast to Coast, you know, the Art

33:14

Bell Show, which is like people calling and they'd be like,

33:16

you know, I was haunted by a spirit

33:18

or whatever. Or he'd have people on who had

33:20

had like UFO sightings. And

33:22

actually Alex Jones comes

33:26

from his show, like he became a regular

33:28

or something on the show, which was like, you know, it was like

33:30

a nationally syndicated... like yeah, he was on at midnight

33:33

and it would be like people talking about conspiracy

33:35

or talking about ghosts or talking about whatever. And

33:37

it was fun because it was like, be crazy

33:39

if that were real, you know, to your

33:41

point, like when the president is like it's

33:43

real, it's like okay, like you know what, Yeah,

33:46

that loses a little bit of its novelty factor.

33:49

Yeah, And I think a little bit of like

33:52

high sodium, high fructose corn

33:54

syrup, like beef jerky

33:57

whatever the thing is, like a beef jerky

33:59

food cans, you know, like, oh,

34:01

that's that's cool. If you're traveling to space,

34:04

it shouldn't be your whole diet.

34:06

I don't know what tun

34:08

to diversify outside the cave, and

34:11

it's a little bit of conspiracy theory.

34:13

It probably does a body good, right, a whole

34:15

system built on it. Then the whole system falls

34:18

down because we actually do need to believe in

34:20

something. Yeah, non belief

34:22

isn't enough. And if you start doubting

34:24

everything, you don't have anything left to stand

34:26

on. You find yourself alone in the universe, which is

34:28

pretty existentially terrifying, which is

34:31

Elon Musk almost. It's where he's headed.

34:32

That's why he's like trying to like fly

34:35

out to Mars, like be literally alone

34:37

in the universe.

34:37

But the cost of his indulgence is a

34:40

lot of the rest of us, you know.

34:41

No, I mean, we're all just the shrapnel

34:43

flying at everybody from his like own weird like

34:46

midlife crisis that he's experiencing that he has

34:48

to like talk about on Twitter. Adjacent

34:50

to Twitter, you just wrote this piece about TikTok, about

34:53

America trying to shut down TikTok Get

34:55

Out of TikTok.

34:56

Which people can read at puck dot News,

34:59

Puck dot News, buy my page. TikTok on

35:01

the Clock or something like that.

35:03

is the title... hold on, I'll tell you what it is. The

35:05

headline is TikTok on the Clock.

35:06

I have it up here what happens to the creator economy

35:09

If the world's most important new social platform

35:11

is banned in the United States, we

35:14

may be about to find out, says Baratunde.

35:16

Uh okay, but like it's not gonna

35:19

be banned, is it? They're gonna ban...

35:22

You said that like gen Z would.

35:24

And Gen Alpha

35:27

would unite form Voltron

35:30

from gen.

35:30

I don't... they don't have any power.

35:32

We all have more power than we know. And

35:35

I think if these kids stopped

35:37

signing up for military duty, if

35:40

they stopped paying for their parents

35:42

and grandparents social Security checks, because

35:44

we're in this pay-as-you-go system, they

35:46

could they could shut some shit down.

35:48

Do you think TikTok is that important? You don't think they'd

35:50

just be like, well.

35:51

You know, I nodded to the idea that

35:53

something so culturally enmeshed and

35:56

with such deep daily habits, you know it's

35:58

some high percentage. There's some stats I left out of this

36:00

piece as far as like the number of people under nineteen

36:03

that represent the core usage of TikTok

36:06

It it seems impossible.

36:09

I don't think it is. It seems impossible

36:11

that one day that would just disappear.

36:13

I think you say the average time daily

36:16

is like.

36:17

Ninety minutes a day is the average TikTok

36:19

usage, and they have a billion monthly active

36:21

users.

36:22

Well, I say that's crazy, But then I'm like, how much time

36:24

did previous generations spend watching TV?

36:26

And it's way more than ninety minutes a day.

36:29

Yeah, it was like four hours a day or more.

36:31

I remember finding these stats in the early two thousands.

36:33

So we're like all blown away by that, but like, actually

36:36

it's much closer to TV, because TikTok

36:38

is in some way like you're kind of flipping through channels,

36:40

so it is actually closer to a

36:43

TV experience than maybe any other

36:45

social.

36:45

app. And then it's higher for our

36:48

high usage people, right, you know, it's like that's

36:50

the average user, and then the ones

36:52

who over index are going to be two hours a day, three

36:54

hours a day and just living in the palm of their

36:56

hand. So I think you'd be

36:58

based on the combination of concerns

37:01

about TikTok. You've got people concerned about data

37:04

and privacy. You've got people

37:06

concerned about youth mental health, and that's

37:08

like a catchy thing to suddenly give

37:10

a shit about. You've got

37:12

folks concerned about China, some

37:15

for good reasons, some for racist reasons,

37:18

but their reasons don't matter. They're on the same team

37:20

when it comes to like skepticism toward

37:23

anything that comes from that nation.

37:25

It's like that meme, the meme

37:27

with the arms locked together. You

37:30

know I'm talking about it looks like two guys like arm

37:32

wrestling, but it's like, is

37:34

it like a black arm and a white arm and they're

37:36

like locked together?

37:38

Am I crazy?

37:38

I just like to make you feel crazy. It's

37:40

fun to watch you try to paint a picture. Has it

37:44

got a name? I don't know. Let's leave it to a listener.

37:46

This will be the Easter egg. The first listener

37:49

to find me on Mastodon and

37:51

tell

37:52

me. It's called epic handshake.

37:54

Okay, you know what I'm talking about, right.

37:56

I probably know what you're talking about, but not by that.

37:58

Damn it.

37:59

Hold on, I'm just gonna send you this meme template right

38:01

now. I'm going to put in the chat here. Oh

38:04

you were saying that, like, yeah, the people

38:07

who are like racists, Yeah, it's a coalition

38:09

people who have good reasons to worry about China.

38:11

It's like this meme.

38:13

It's like, it's like mental health advocates,

38:15

racists, national security

38:18

people. Right, the

38:20

Venn diagram is very

38:22

diverse here, that's the illustration we should

38:24

have done.

38:25

I like that, actually. Could this be the thing that brings

38:27

us all together?

38:28

Yes?

38:28

I mean I do think a common enemy.

38:30

I thought that before COVID. I was like, oh, COVID

38:32

is the alien invasion. It's as close as we're going to get.

38:34

And it ripped the country apart in the world too.

38:36

So No, it's funny because I thought

38:38

the same thing about when Russia invaded Ukraine.

38:41

I was like, Okay, now America,

38:44

we're classically against Russia. You don't

38:46

like Russia historically like we

38:48

are. And Matt Gaetz is like, actually,

38:50

no, it's crazy, right, it's fucking

38:53

bizarre. These Republicans are like, actually,

38:55

you know, Putin's got a good point. It's like, dude, does he like

38:58

is that where we're at? Is that where your party's

39:00

at already? Anyhow?

39:00

Sorry, not not to fucking lose it, but

39:03

let's take some deep breaths together. Josh is TikTok

39:07

dangerous? Yeah, this sounds like the most bullshit

39:09

sorry like CNN question of all time.

39:12

Bear Tunde. Is TikTok dangerous

39:14

for the youth?

39:15

No?

39:15

I mean, like what is your take on it?

39:16

Like do you actually think there's real concern

39:18

Like at the privacy thing I get, I get snooping

39:20

shit, but like.

39:22

I think there's real concern in

39:24

that is super addictive and

39:27

it's become back to this world

39:29

of low and no trust.

39:32

TikTok is the place people go to get real questions

39:34

answered. It's like ChatGPT, but

39:36

like way larger scale, and

39:39

sometimes the results you get back are as

39:41

non factual as that bot. Oh

39:43

yeah, but they present as like this is

39:45

what you should do to heal this thing, this is what

39:47

you should do in terms of how you vote.

39:49

No, misinfo on TikTok is crazy.

39:51

Yeah, yeah, So it's it's a risky

39:54

vector for misinformation. It's

39:56

risky in terms of the black box of how it operates

39:59

and semi public square

40:01

media ecosystem without

40:04

checks and balances, and that's always

40:06

been that was the case with Facebook, that's the case with But

40:09

it just transfers to whatever the hot thing is, right,

40:11

and TikTok for me is more

40:14

fascinating and potentially more dangerous

40:17

because it's a social network

40:19

that like you don't have to bring friends

40:22

to experience it. You just trust,

40:25

like the algorithm is the one friend you need.

40:27

Whether you're a creator or

40:31

I'm more of a passive consumer. In terms

40:33

of your experience on there, you like a

40:35

few things, you linger a half second

40:37

longer on this thing versus that thing, And

40:39

they just dialed you in quick and

40:41

they put you in a little box and like you are

40:44

this type of user for four to six dash B

40:46

seven and you get that stuff in your

40:48

feed. And it's like if

40:51

a pharmaceutical company could

40:53

like build a highly addictive drug

40:56

just for you within ten minutes

40:58

of knowing you.

40:58

That sounds great, unless yet

41:02

that's what I want. Okay, you got a little

41:04

bit high from this, Like what if we kind of juice

41:06

that.

41:06

Yeah, it's for the personalized drugs.

41:09

Yeah, it's actually it's the promise of your

41:11

actual.

41:11

Personal sound attractive and exciting

41:13

to me? Is that a problem for me personally?

41:16

As you've been describing TikTok just

41:18

in this last thirty seconds or forty five seconds,

41:21

I have been in the back of my head, I'm like, maybe I should

41:23

be looking at TikTok more often, like almost

41:26

very sick man. I'm doing dry January, so it could

41:28

be I'm just looking for something like addictive,

41:30

a new fix, you know, but.

41:32

Like I get it. I've been there,

41:34

I do.

41:35

I'll open TikTok occasionally and I'm like, you

41:37

know, you'll start looking at shit. It is like the

41:40

pringles, you know, like once you pop, you can't

41:42

stop or whatever.

41:43

It's also and what I didn't put in the puck

41:45

piece because I can never put everything in one thing.

41:48

So we copy and paste,

41:50

like we are a meme driven society, even before

41:52

we use that word the way Hollywood

41:55

does a copy and paste on types of movies. You

41:57

know, one thing hits and everybody's doing the thing right.

41:59

Everybody wants a Marvel cinematic universe, and

42:01

everybody needs big ip and like

42:03

leverage a book or a superstar or preferably

42:06

both, so the risk is lower

42:08

and we can just spend a fuck ton of money

42:10

on this and make a fuck ton of money after that and not be

42:12

original. Right. That also happens

42:14

in business and tech and social media. Facebook

42:17

just copies and paste. So the more successful

42:19

TikTok is, the more everything else looks like TikTok.

42:22

Right, our Instagram isn't Instagram

42:25

anymore.

42:25

It's just TikTok. Yeah, but a failed

42:27

TikTok, like a bad TikTok. So

42:29

if I'm on a reel and I accidentally go to the next

42:31

reel, I'm like, I'm immediately

42:34

turned off and always like how do I get out

42:36

of this? Which they don't make very easy. Of

42:38

course they don't. It's entrap mean, but you

42:41

can't get out of the reels once you're in there. I've

42:44

quit the app on multiple occasions.

42:45

They're not your friendly neighborhood drug

42:47

dealer. They're not

42:50

cal or Doug or Jena,

42:52

whoever your dealer is. They are

42:54

big pharma of social

42:57

media.

42:57

I have a medical marijuana license, so I get it from

43:01

the state.

43:15

When it jumped the shark a bit for me, I'm on

43:17

Instagram. Like you had Twitter with

43:20

text. You like reading,

43:22

you know, you like writing, but not too much. That's

43:25

Twitter for you. You want to write too much, that's medium

43:27

for you. There's Blogspot or something, to date

43:29

myself if you want. If you want

43:31

still photos and like cool

43:34

photos with stories associated with them

43:36

in captions, then you go to Instagram.

43:38

You want short form video that kind of moves at

43:40

the speed of freaking meth. Then

43:42

you go to TikTok. But now everything

43:44

is everybody just has to be in everybody else's shit,

43:47

and nobody can just be who they are. So

43:49

there's no fixed identities in this world. So

43:52

everybody just wants to be what the cool kid is at

43:54

the time. So Facebook is TikTok,

43:56

Instagram is TikTok. Twitter is just whatever.

43:59

It's dead. But I think

44:01

when I'm on Instagram and I see

44:04

I see a still photo, but

44:07

it's actually a video because the

44:09

algorithm and the incentive structure

44:12

forced a normal human being to

44:14

use a weird app to turn their

44:17

photo into a looping,

44:19

thirteen second video. Not a GIF. GIFs

44:21

are a whole thing. That's a whole different type of media format.

44:24

Totally had a place for that too. Nope, so

44:26

we're just gonna We're gonna contort your natural

44:30

abilities and sensibilities into

44:32

this other thing because we don't know how to make

44:34

our own shit work, so we're just going to copy

44:36

theirs and make you TikTok. So even

44:38

people who are on Instagram, I hate TikTok. No you don't,

44:41

because you're in it, right now you're living in

44:43

TikTok's great shadow.

44:45

But I think that people on Instagram do legitimately

44:48

hate it, and I believe strongly

44:50

that the only reason why reels, if it's

44:52

successful at all, it's because it's being like forcibly

44:56

pushed on the users in a way that it's like really

44:59

uncomfortable and they don't like.

45:00

And here that was the great Facebook

45:02

pivot to video that accelerated

45:05

the overinvestment of so many newsrooms.

45:07

And it was like years later, like, well, actually

45:10

we were counting three second video

45:12

starts. Well, okay, actually

45:15

we added a zero to all the video

45:17

plays across the network.

45:19

Now, if listeners may not know, there was a period

45:22

where on the Internet when there was like

45:24

a news company or whatever, they would just publish

45:26

like texts. Sometimes they'd do like

45:28

a long form video, there'd be some images, they

45:30

would do a gallery. And then one day Facebook

45:33

was like, video is going to be the next big thing

45:35

for our platform. Everybody should really

45:37

make video. And then literally many

45:40

large news organizations that were well funded

45:43

were like, we're firing the people who

45:45

write and we're gonna make video yep.

45:47

And then it was a huge failure. Nobody watched any

45:49

of the video.

45:50

Nobody gave a shit, and then a bunch of those companies went out

45:52

of business, like a crazy amount

45:54

of companies now don't exist because

45:56

of the pivot to video. And then

45:58

Facebook was like, oh, we'd like completely miscounted

46:00

the numbers. Also, like those views we said you were getting,

46:03

like we're not real at all, Like we didn't have

46:05

any real clue about what we were doing there,

46:08

or we lied.

46:08

And on top of that, you

46:11

cannot trust the

46:13

outlet with an incentive to inflate

46:15

its numbers to tell you what's going on.

46:18

And part of what happened is Facebook

46:21

shifted its algorithm to prioritize

46:24

video, you know, right, to make sure it

46:26

went more viral, got more views and

46:29

ranked higher than stills or

46:31

texts. And then they publicly

46:33

said we're seeing a shift toward video

46:35

amongst our users.

46:37

Right, They're like, they're like, it's

46:39

weird.

46:40

We put video now on their feed and

46:42

they seem to be watching it because

46:44

there's nothing else.

46:45

I mean, it's true.

46:46

That's like, you know, Xi Jinping

46:49

saying, you know, we're just seeing a high degree of loyalty

46:52

among the people of China. Right, for me,

46:55

people want me to rule forever,

46:59

said the man holding a gun to his people.

47:01

We should be so, we should be so lucky

47:03

to have such an opportunity to just completely

47:05

railroad people like left, right and center

47:08

and just just forcing this shit on them. It's

47:11

a bleak picture, but I have a thought

47:13

listening to you describe what's going on with the tiktokification

47:16

of all the social networks, which is like, to

47:18

me, it almost feels like the

47:21

Netflix effect, which I think we've

47:23

seen now kind of start to crumble a bit, which

47:25

is like Netflix is like, we're Netflix. We're

47:27

doing our thing. Now, we're making original stuff. Okay,

47:29

now that's fucking killing it. Now everybody is scrambling

47:32

like to play catch up to the streaming

47:34

Netflix to be Netflix, right, and you've got HBO

47:36

doing it, and you've got fucking Paramount doing it whoever,

47:39

And it turns out there's like a limit

47:41

to how much of that shit we want and that

47:44

will tolerate, and frankly, like there's a fatigue

47:46

that will set in after a while, right where you're

47:48

just like this may be the best shit in the world. I may

47:50

have one hundred of the best shows to watch, but like I don't

47:52

really have the time or the attention span

47:55

or the interest at this point, because like my dopamine

47:57

has

47:57

left my brain. I'm fucking like tapped out.

48:00

Is it possible that this is the

48:02

final breath or the near final breath

48:04

of like social media

48:06

as we know it, that it's collapsing

48:09

in upon itself, like they've run out of ideas,

48:11

they've run out of attention. I do think this

48:13

is just me riffing now, but like I do think

48:16

we are entering a phase, not post

48:18

pandemic, because there's still like a pandemic happening,

48:20

but post the worst, most

48:22

intense part of it that we experienced where

48:24

we were all alone inside online

48:27

all day for a long time.

48:29

I feel like there's a little bit of like people going like

48:32

I don't know if that's what I want to be now,

48:34

Like, do I want to be like that? So

48:36

is it possible there's these

48:39

like all these forces coming together. This is

48:41

just my weird wishful thinking that like people

48:44

don't want TikTok everywhere,

48:47

they don't even maybe want to be on social

48:49

media that much anymore. Maybe

48:52

TikTok is just this like super

48:54

pronounced last gasp

48:58

of social media as we know it

49:01

Does that sound at all possible?

49:04

I think, I think that's a beautiful sentiment. I

49:06

want that world. I still

49:08

believe that world is possible. I

49:10

remember I loved Clubhouse,

49:13

and then I went outside. Clubhouse,

49:15

God, I forgot about Clubhouse. Right.

49:17

It was very simple. I couldn't

49:19

socialize, I couldn't host things, and

49:22

that was a wonderful substitute. And

49:24

then I left my house right, and I went and

49:26

filmed my TV series America

49:28

Outdoors on PBS. Check it out in the PBS

49:30

apps, gratuitous self-plug. And I

49:33

was like, all of a sudden, I'm not spending all my time talking

49:35

to strangers on a party line.

49:38

I'm just like at a restaurant

49:40

again. Well, I love restaurants, you know what? I

49:42

love strangers on the street. I love random people

49:44

at the coffee shop. Taking too long to order? Give

49:46

me a DMV line. Yes, And

49:49

it was all superior to that

49:52

novel, interesting, but unsustainable

49:55

blip in the grand sweep

49:57

of all the social technologies. It was like a COVID

50:00

blip. And so I was wrong about the sustainability

50:03

of that. And more to the point, so were the people

50:05

who devoted so much more of their time to it. With

50:08

social media, I am

50:10

excited for us to escape the

50:13

phase that we've been in: the centralization,

50:16

the feed-based stuff. Like even

50:18

the word feed just makes me feel like livestock,

50:21

right, just like, oh, I'm going to go to the feed. I'm going to

50:23

feed. I'm feeding on the memes and the data,

50:26

and then some. They're just fattening me up to

50:29

shove ads into the feed to

50:31

take stuff from me so I can

50:33

buy things

50:33

I don't need. To monetize you.

50:35

Yeah, And so we

50:38

will always tend toward

50:40

that because of capitalism, which

50:43

won't let a diverse ecosystem prevail.

50:45

That will tend toward monoculture

50:48

in food, in entertainment, in

50:50

technology platforms, because people

50:52

want more and more and more, and shareholders want more and more and more,

50:54

and that means everybody's got to taste the same damn rabbit. But

50:59

the Gen Z and the Alphas in particular,

51:01

you know, they grew up in the ashes of the financial

51:04

crisis, and they bear the brunt

51:06

of the trillion dollars of college debt. You

51:08

know, their older siblings especially, so

51:10

they're like, I don't know so much about this. And then the planet's

51:12

reminding us like, Hey, this whole

51:15

feed-based system, infinite feeding,

51:17

you'll eat your home planet and won't

51:19

have anywhere to live. That's not

51:21

sustainable either. So more of

51:23

us, I don't know if the whole system will totally

51:26

change, but more of us will seek this alternative

51:29

universe you're talking about. It excites

51:31

me, and so I joke about Mastodon.

51:33

I think it's like a hint that

51:36

we can create other environments.

51:39

In my podcast How To Citizen, the most

51:41

recent season is exploring

51:43

how we use technology to help us show

51:45

up as active citizens and members of society

51:48

and not just be fed to an economic

51:50

engine that we didn't build and that we generally

51:52

don't benefit from. And I just learned a

51:54

lot talking to people who've done different things

51:56

with tech, the folks at New Public

51:58

who were designing digital public spaces differently,

52:01

Esra'a Al Shafei over in

52:03

the Middle East, who's created this social

52:05

media network that operates very differently

52:07

from all the noise we've been talking about and

52:10

just doesn't give people infinite powers on day

52:12

zero to fuck around and find out how

52:14

much damage they can cause the society. It

52:17

respectfully onboards people into actual

52:19

communities first. It doesn't

52:22

have VCs, which makes

52:24

it possible. I love that, and

52:26

because the incentives, like

52:29

if you start with venture capital money,

52:31

it's rich people who need to get richer.

52:34

So this is already just like disposable,

52:37

but they drive everything

52:39

you pursue downstream from that. All your

52:41

user research, all your UX

52:43

design, all your growth marketing tactics

52:46

are so that a millionaire can get to be

52:48

a billionaire. Then whose

52:50

needs are you really meeting with that feed? And

52:54

so to really have TikTok be the last

52:56

gasp, we need different funding mechanisms

52:59

which are available but not wide scale,

53:01

and we need people designing these spaces

53:03

who are more than just bored engineers.

53:07

Yeah, we need people who actually know how to design

53:09

spaces, you know, like designers,

53:12

interior designers, cultural

53:14

programmers, artists. We

53:16

need folks who understand people to

53:19

build spaces for people. And

53:22

many of those folks have not been included. They've

53:24

been subjected to the visions and

53:26

imaginations of a very small group of folk

53:29

who don't have the broadest human experience.

53:32

And who are incentivized to make money.

53:34

Yeah, above all. They might talk about like

53:36

community and shit, but.

53:37

You don't have to take a vow of poverty

53:40

but if your

53:42

premise is returning ten x

53:45

and getting valued at a billion dollars and being

53:47

a unicorn, then you've

53:49

already You've already I wouldn't even say

53:51

corrupted. You have vastly limited

53:54

the possibilities for what you create under

53:56

those conditions, and so we

53:58

need fewer limits.

53:59

First of all, your vision, this vision you're just painting, is

54:01

very beautiful. I feel like that.

54:03

I think I think we you know, we did that a little

54:05

bit together. I'll take it.

54:06

I will take some credit.

54:08

You can have fifteen percent of the vision at

54:10

a pre-money valuation, generous,

54:12

call it thirty million.

54:13

Thank you. That sounds good.

54:15

Fine, I'll take that. I like that, all

54:17

right. So okay, I think that's a great place to leave

54:19

it. But you have a ton of stuff going on. You mentioned a couple

54:21

of things now before we end. I

54:23

want you to tell me then, and by proxy

54:26

tell everybody listening. Obviously you're

54:28

writing for Puck. You've got a PBS

54:30

show you just mentioned.

54:31

Yeah, it's called America Outdoors. When

54:33

can I see that? Where can I see that? You can see

54:35

it now. It's re-airing on,

54:38

you know, find your local PBS affiliate. You

54:40

can go to PBS dot org slash America Outdoors.

54:43

They have a great streaming app that doesn't strip

54:45

your identity and sell it back to you. Just

54:47

the PBS digital app PBS

54:49

Passport, if you pay. Oh yeah, I subscribe

54:52

to that.

54:52

It's great. It's it's money well spent. It's

54:54

like I think it's cheap too. It's like a yearly thing, it's

54:56

like forty bucks.

54:57

Yeah, it's great, and you're supporting your local PBS

54:59

station, which in some communities is the only

55:01

local news available and local media

55:03

available. So I've become an even

55:06

bigger fan of PBS, working with them and

55:08

not just watching some of their stuff. So check

55:10

out America Outdoors. It's not a

55:12

wildlife show. It's a human show about

55:14

our relationship with nature, and

55:16

it's people we don't typically see. A

55:19

couple of examples. I spent time with formerly

55:22

incarcerated folk who learned how to

55:24

fight forest fires in prison and

55:26

now do it as free people and

55:29

understand some of their journey. I spent time with multiple

55:31

indigenous communities in what

55:33

we call Idaho and Minnesota

55:36

and California and learned

55:38

many of their traditions with respect

55:41

to the land that they're on. I

55:43

go crabbing with the most

55:45

conservative person I've ever knowingly

55:47

been around in my life in the Chesapeake,

55:50

and I go hiking

55:52

with refugee kids in Idaho, like it's

55:55

surfing and horseback riding and just

55:57

beautiful people. And it helped me remember

56:00

how much we still have in common, which includes,

56:02

at the most basic level, our planet. So

56:05

check out America Outdoors, check

56:08

out Puck, pay for it, it's worth

56:10

it. And How To Citizen is

56:13

my podcast where we take Citizen to be a verb.

56:15

And you're in season three.

56:16

We are at season four now, we're we're

56:18

about to drop season four. We launched in twenty

56:21

twenty, got some awards, had

56:23

some really dope conversations, and

56:25

the show is all about sharing

56:27

new ways to show up

56:29

in society and exercise our power. Sometimes

56:32

that's voting, most of the time it's

56:34

a totally new bag. It's business stuff,

56:36

it's tech stuff, it's culture stuff.

56:39

And I tend to interview one person at a

56:41

time about their thing, and then we

56:43

give listeners in each episode a

56:46

way to make it your thing and actions

56:48

you can take in your life to just

56:50

practice these muscles that have atrophied so

56:52

much, because the only thing we've been told to do is

56:54

shop. Which is, God,

56:56

You gotta admit it's pretty fun though, like it's

56:59

super fun.

57:00

Hot, in fact. Yeah, they've gamified

57:03

resource loss.

57:04

It's so true.

57:04

So Josh, it's a great reunion. Yeah,

57:07

thank you for having me on here. Thank you

57:09

so much for doing this. You got to come back

57:11

do it some more, talk about some new

57:13

things. I'm sure you'll have like ten new things

57:15

next time. We talked to each other. Happy to

57:17

come back.

57:22

Well, that conversation was amazing

57:24

and also went kind of long, so I

57:27

don't have a lot of time. They're telling me I have to wrap

57:29

it up now, they're telling me. There's several

57:31

people motioning towards me. Look,

57:33

they're doing that thing where they're pointing to their watch

57:36

and then like making their hand go in a circle

57:38

like wrap it up. So I'm feeling a lot

57:40

of pressure right now. So that is our show

57:42

for this week, and we'll be back next week with more What

57:45

Future. And I should say before I go

57:47

that What Future is an iHeartMedia podcast.

57:49

It's executive produced by Lyras Smith, Adam

57:52

Wand is our editor, and Jenna Caagel,

57:54

am I saying that right? Kegel, just like Bagel, right,

57:57

is our supervising producer.

57:58

She's mad at me

57:59

now. I hope you're all happy, and if you

58:01

would like to, and you don't have to, no one's going to force you

58:03

to, but it would be great for me. I could finally

58:06

hold my head up proudly at family dinners.

58:08

If you would just go on iTunes

58:11

or Apple Podcasts or any place

58:13

where you can add a star rating to this podcast

58:16

and give it five or six

58:18

stars, that would be great because my

58:20

family doesn't respect me, and I think if you do that,

58:22

they'll start to come around to

58:24

the idea that I'm worth respecting.

58:28

So I really appreciate that.
