Maximum Reading with Max Read

Released Thursday, 18th May 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:17

Hey, and welcome to What Future.

0:19

I'm your host, Joshua Topolsky, and

0:21

we have a crackerjack of a show. Do

0:24

people still say that, crackerjack?

0:26

Does that even exist anymore?

0:28

Do you know?

0:28

Cracker Jack is, like, a candy. It's

0:31

like a box. It's a candy. It's a candy corn.

0:33

No, no. Candy corn is a candy. It's

0:35

like caramel corn or caramel corn depending

0:38

on who you talk to, which I

0:40

guess is corn covered in a sugary coating.

0:43

And then in the box it was like a red stripe box.

0:45

Does this even... Does anybody know what I'm talking about?

0:47

Lyra or Jenna? Do you know what I'm talking about?

0:49

Of course I know what Cracker Jack is.

0:51

You know what Cracker Jack is? Okay.

0:52

And it had a prize in the box, and.

0:54

At the bottom of the fucking box there'd

0:57

be a prize, like a toy or something.

0:59

That's great. They don't do that anymore. They

1:01

don't do it. Maybe they do.

1:02

They probably still sell it. It's probably very readily

1:05

available. It's probably one of the best selling products

1:07

at grocery stores.

1:08

Hey, Cracker Barrel, I bet you.

1:09

Cracker Barrel is a right wing establishment dedicated

1:12

to the suppression and oppression

1:15

of many groups, and I can't stand it. I

1:17

won't stand for it. Although they do have really good biscuits

1:20

in my recollection.

1:21

But I would bet you good money that they sell

1:23

Cracker Jacks in the.

1:24

general store. Crackerjack at the Cracker

1:26

Barrel.

1:27

You're probably right. Anyway. Yeah, we have a crackerjack

1:29

show for you. We've got a wonderful

1:31

man, Max Read, a writer, an

1:34

editor, a screenwriter, and

1:36

also a friend, I should say, a buddy,

1:39

and he's got a great newsletter called

1:41

Read Max that I love. And we're going to talk

1:43

about all sorts of stuff. I don't want to waste

1:45

one minute, so let's get into this conversation.

2:05

You were editor in chief of Gawker when

2:07

it was sued out of existence?

2:08

Is that right?

2:09

The lawsuit was ongoing when I

2:11

was editor. I left in twenty fifteen. The

2:14

following year spring, I want to say

2:16

the following year was when the judgment came down and

2:19

Gawker declared bankruptcy.

2:20

Who was the editor of Gawker at that time.

2:22

I think Alex Pareene was the editor in chief.

2:24

I thought, wow, yeah, the last

2:27

and final Yeah.

2:27

No, not the last and final of that Gawker. There

2:30

were many other iterations of Gawker.

2:33

They just transitioned. They became, uh, what

2:35

was it called.

2:36

Splinter, Splinter, Splinter, Splinter,

2:38

but splinter was Splinter was owned

2:40

by that was? When? Was

2:42

it owned by Fusion? Is that

2:45

was it?

2:45

Yeah.

2:45

Fusion became I'm sorry, this is crazy shit

2:48

to think about, but like I know, Fusion

2:51

was a thing that was that existed. It was a

2:53

new media startup. Correct

2:55

me if I'm wrong.

2:56

It was... It was new media within Telemundo.

2:58

It was like Telemundo's new... Yes, a

3:01

startup. Telemundo

3:03

started a thing called Fusion dot net,

3:06

which was a diverse

3:09

millennial news operation, right

3:11

that was sort of like the overarching which

3:13

like from like I get it, like very

3:16

of the moment, like very like Mashable, like in

3:18

the spectrum of like a Mashable and uh

3:20

yeah, I mean who else who else would it have been? I guess BuzzFeed

3:23

News, Mic, whatever. This is not Mic,

3:25

Yeah, Mic's the other perfect example.

3:27

Actually, I would say in a way Fusion was

3:29

like a Mic competitor is probably the most

3:31

accurate. Anyhow, sorry, this is not... we don't

3:34

have to talk about this at all. In fact, I wasn't really planning on talking

3:36

about it, but although I do want to talk about

3:38

media because you are a member

3:40

of the media elite as we know, I

3:43

don't know. What is the word for a person who just

3:45

has a newsletter?

3:45

What do you call that?

3:47

I call myself an owner operator, a small

3:49

business, small business owner.

3:50

You're a small business owner.

3:52

Okay, so you get some of Biden's

3:54

tax break, tax breaks for the one

3:57

percent, for the elite. So you're

3:59

in the media, right, you'd say you're in the media,

4:01

like you continue to publish content

4:04

that is in the sphere of news media.

4:06

Would you say? Yeah, I mean, I write twice

4:09

a week on my substack, do some freelance

4:11

journalism still too. I mean, it's

4:14

funny that we sort of started

4:16

talking about this and then immediately started remembering

4:18

some websites. It's very hard. I

4:20

started my substack not really intending to write

4:22

very much about media. But if you have

4:24

been in the business for long enough, and especially

4:27

if you lived through the period

4:29

that we're talking about, the twenty tens, the crazy

4:31

twenty tens, it's hard not to

4:33

just keep writing about it. Because it

4:36

still sticks in your mind. I don't want

4:38

to say trauma, but you know it's.

4:39

No, no, no, no, it's trauma. Trauma's

4:42

the right word. I don't know.

4:42

I think the word trauma's overused, and I don't want to say

4:45

that people who use it don't have trauma. But

4:47

I do feel like people say trauma about things

4:49

that are like they like, you know, I don't know,

4:51

they've got... they couldn't find a pair of shoes that fit,

4:53

and they're like, I have trauma from that. I'm like,

4:55

I don't think that's what the word means. Like, I feel

4:57

like it might get a little overused. But because

5:00

I had a call earlier today, I was talking

5:02

to a reporter and we

5:05

got into a conversation about

5:08

the Messenger, Yeah,

5:11

which is... The Messenger is, like... so you actually,

5:13

I said the boom period. You were talking about

5:15

Fusion and stuff, but actually, what Fusion is

5:18

in that era, which is like the late

5:20

twenty tens. I want to say, like, right,

5:22

I don't know when Fusion started or that, like

5:25

what the but I want to say, like twenty

5:28

ten, that's probably later than twenty ten, fourteen,

5:31

around thirteen fourteen. It's actually

5:33

the beginning of the end for the

5:35

new media boom. It's like Vice

5:38

had matured into this monolithic thing,

5:40

and BuzzFeed News was at least rolling,

5:43

probably pretty successful at that point.

5:45

It was sort of like the beginning of the end for like the

5:47

boom era of like blogging, which

5:50

would have been in the, you

5:53

know, two thousand and four

5:55

to two... I mean, obviously stuff happened. Let's say

5:57

two thousand and four to two thousand and ten

6:00

something like that is like, yeah, the biggest like boom

6:02

for like blogging, blogging, not like corporate

6:04

owned blogging.

6:06

Well, there was, I mean, the way I think about it is always like

6:08

there was that early period where

6:11

people would still go to websites. You know, you would

6:13

actually go to Engadget dot com

6:15

or Gawker dot com and you would refresh

6:17

it to see the feed. And then it was sort

6:20

of SEO came along and it allowed you

6:22

to maintain a relatively similar format.

6:24

But the thing that really sort of turned it was when

6:27

Facebook arrived as like the thing

6:29

that would give you traffic around twenty twelve,

6:32

and all of a sudden, everything you wrote

6:34

didn't matter what your website looked like, didn't matter what your website

6:37

was. Everything you wrote had to sound like it

6:39

could be shared on a social feed or

6:41

whatever.

6:42

That's right.

6:42

And that's when Fusion and all these that's when all the VC

6:44

money started really coming in and the big

6:46

corporate money.

6:48

Yeah, a very dark period. I mean, I'm

6:50

just trying to think. I'm trying to think of it in personal

6:52

terms only because that's easier for me. My

6:54

memory is very bad, and so I have to go like, well, what

6:56

was I doing? What was I doing

6:58

at that point?

6:59

Right?

6:59

So, like, yeah, the Engadget crew left to start the Verge

7:01

in like twenty ten, I think

7:04

maybe early twenty eleven. We launched the

7:06

Verge in November of twenty eleven. I'm pretty

7:08

sure that sounds wrong, but I

7:10

think it's actually right anyhow, whatever, But that

7:12

that to me is like sort of the bit that's about

7:14

when the kickoff of the end was. It's not

7:17

to be, not to be you know, self

7:19

centric or whatever. But like, I think we

7:21

were legitimately doing the blogging thing

7:23

where we're like, we're going to take our team and go do a

7:25

thing. And we went to a company that wasn't... it was VC backed

7:27

but it wasn't like it was an NBC

7:30

or whatever. It wasn't like Telemundo

7:32

wasn't backing it or whatever. You're

7:35

writing. You have a Substack? Am

7:37

I allowed to say that?

7:38

Yeah?

7:38

A lot of times on the Internet somebody says Substack.

7:41

They might put an asterisk in between the

7:43

S and the B because they don't know if Elon Musk

7:45

is going to block.

7:45

It for all of Twitter. Yeah, well, just don't put it in the title

7:48

of this podcast.

7:49

Yeah, let's talk. We're gonna have to end.

7:50

As a note for Lyra and Jenna, please

7:52

let's make sure we don't add the word Substack

7:55

fully spelled out in the title.

7:57

But you have a Substack, which is

8:00

a company that has created a platform

8:03

for people to publish newsletters which

8:06

look a lot like blog posts. Yeah,

8:08

Just stop me at any point if I sound like...

8:10

if this sounds like it's wrong. And also,

8:12

you can charge people a subscription fee

8:15

for your newsletter and in doing

8:18

so create a like a personal

8:20

media company, like a single person

8:23

media brand.

8:24

And that's what you've done.

8:25

Now you have a thing called Read Max

8:27

which I have to say, as far as titles

8:31

go, very unoriginal for you, but

8:34

I guess pretty clever as well. Explain

8:37

to the listeners, who I'm sure are all subscribers

8:39

of Read Max. By the way, I think the crossover

8:42

is extremely high. But just explain to them, like what your

8:44

newsletter does.

8:46

The tagline is that it's a newsletter about the

8:48

future.

8:49

Really, yeah, is that the tagline? Yeah?

8:52

It is? Wow?

8:53

Okay, interesting. You know, I started

8:55

it. It's funny. I was thinking about this the other day. I

8:57

started it with this, with the sort of manifesto

8:59

post that people can still go back and look at where

9:01

I said I wanted to write a lot about

9:04

the way life in the twenty-first century

9:07

had been shaped by the mega

9:09

platforms of the Internet, the way that

9:11

not just sort of economics

9:14

but also culture and social formations

9:17

get worked through your Facebooks

9:19

and your Instagrams and your TikToks. But

9:22

as it happens, you know, my

9:24

editorial philosophy has always sort of been that readers

9:27

like to feel passion and

9:29

joy. I sound like I'm giving like an upfronts presentation

9:31

to sell to the advertisers.

9:32

I love that. I wish you had like a clicker

9:35

and some slides.

9:35

Exactly right, if you really could, you have a

9:37

big like a pie chart with some percentage

9:39

of readers who like the.

9:40

Readers love imagine I'm in like a black

9:42

turtleneck.

9:43

There's like, yeah, it's like it's like seventy eight

9:45

percent of readers love to feel passion.

9:47

There's a small sliver that don't like passion

9:49

anyhow.

9:49

So this is what happens when you spend any time in editorial

9:51

management: you learn to talk like this. That

9:53

being said, yeah, cliched

9:56

as it sounds, it's like not worth

9:58

me finding something to say about

10:00

tech every week if it means that once a week

10:02

is going to be boring. So in

10:05

practice, the tagline of the newsletter is a newsletter

10:07

about the future. In practice, it's a newsletter about

10:09

things that I think are interesting, which includes

10:12

AI, includes crypto, includes

10:15

you know, platforms like Facebook, includes

10:17

media stuff. Also includes action movies, sci

10:20

fi movies. Yes, sort of whatever

10:22

is interesting to me at any

10:24

given moment. So it's hard to give the elevator

10:26

pitch. But people who are

10:29

on the same wavelength that I am, I think, like

10:31

it, because it gives

10:33

them a sort of weekly dose of interesting thinking about,

10:36

you know, whatever is going on in

10:39

whatever things are interesting to me at that moment.

10:41

Yeah, I think I think the most successful

10:43

publications, whether they're a person

10:46

doing a newsletter or you know, lots

10:48

of people working on something, tend to be the ones

10:51

that center not around necessarily

10:54

a very hard line of specific topics,

10:56

but a kind of philosophical through line.

10:59

And I mean, I agree with you, it would

11:01

be boring to simply write about like whatever

11:04

that description is supposed to encapsulate

11:06

or could encapsulate. It probably would be very boring every

11:08

week to hear about that. But you

11:10

do go pretty far afield. Yeah,

11:12

Like I downloaded and started watching this film

11:14

Nemesis the other day, which I definitely saw in my youth.

11:17

I mean, you might have written on more than one occasion about

11:19

the movie, but it's not a good movie.

11:22

I mean, to me, just for starters, I

11:24

want to be very clear, it's not like high art

11:26

or anything. It is like it was an

11:28

early nineties. Yeah, like a cyber

11:31

science fiction kind of movie. It's like it's

11:33

like there were a lot of movies in the early nineties. Another one

11:35

of them is Hardware, which I'm sure you've seen I

11:37

hope you've seen, which originally

11:40

was rated triple X very like

11:42

for real, like actually it was rated X or

11:45

triple X, whatever they called it when it was, you

11:47

know, like, but there's this kind of genre

11:49

of film people who like read Neuromancer

11:51

and were like, that's cool, But I don't

11:53

really have much of a budget. What could I do?

11:56

Like if I like this like a cyberpunk

11:58

idea, but I don't really have the budget to go

12:00

all the way. And so Nemesis

12:02

is one of those. You actually did a post, I

12:04

want to say it was like a ranking

12:07

of the sunglasses from Nemesis, the

12:09

cool sunglasses from Nemesis.

12:11

Yeah. I felt so inspired after seeing

12:14

that.

12:14

I was like, I got to watch this movie again, and so I

12:17

uh, of course, went on the Pirate

12:19

Bay, which is still in existence, even

12:22

though you have to definitely get a

12:24

computer.

12:24

Virus to access it. You absolutely

12:27

have.

12:27

To get a computer virus to go to the Pirate Bay

12:29

and downloaded, you know, an

12:32

HD, like, Blu-ray rip of Nemesis.

12:35

Pristine twenty one sixty.

12:37

You know, no, no, no, I can't be.

12:39

I can't be watching Nemesis in some seven twenty

12:41

shit like I need to see I need

12:43

to see every grain of sand in the extremely

12:46

sandy atmosphere

12:48

of the film. Anyhow. But so you're writing about

12:50

like you write about stuff that is definitely not the

12:53

future. But I am curious what

12:56

is like in your mind? Like when you say the future,

12:58

what are things you've written and talked about and are thinking

13:00

about recently that would fall into that category.

13:03

I mean, I'm not any different than anybody else who's

13:05

paying attention right now in thinking that, like

13:08

AI is kind

13:10

of the thing that everybody

13:12

is talking about, and therefore becomes interesting

13:14

just because it is a thing that everybody's

13:17

talking about. You know, I was

13:19

over the last couple of years, I would describe myself as a cryptoskeptic,

13:22

but I still found crypto

13:25

as a kind of like object

13:27

of interest worth writing about and thinking

13:29

about, both as a

13:31

reason to criticize it, but also you know, the

13:33

communities that arise around something like crypto,

13:35

the kind of like the art

13:38

work I suppose, or the cultural formation,

13:41

if you want to call it that, I mean, between

13:43

Nemesis and Bored Apes. Clearly I have a real

13:45

fascination with bad art or something. You're

13:47

interesting to me here.

13:49

I've often described the NFT boom as the

13:51

Olympics of bad art, which I would... I

13:53

think it's just like an astounding quantity of

13:57

just like the people competing for the top slot and like

13:59

the bad art. Yeah, I mean just some of the worst

14:01

shit ever put into humanity.

14:03

I mean, truly terrible. But to me, that's

14:05

like it's sort of fascinating, Like there's something fascinating

14:07

about this, like weird grifter

14:10

culture, like tacky grifter culture

14:12

emerging like billionaires.

14:15

Yeah, what happens when you find out that like

14:17

something that you could scam people out of money with

14:19

is art, and, like, it's like

14:21

not art, like, I don't know, I have a Picasso,

14:24

right? It's not... not that that's a scam. That's

14:26

a whole different type of scam actually, but

14:29

one that is real. But

14:32

but yeah, like, what happens? I mean, that's

14:34

an interesting place to explore it, right, Yeah, I hear

14:36

what you're saying, Like and.

14:38

You know, like right now, all of

14:40

that energy, all of that kind of like frenetic

14:42

like what's next, what's big? What's interesting

14:45

energy is plunged into large

14:47

language models and the AI scene

14:49

in general. And to me this is even more

14:51

kind of fruitful and interesting than crypto

14:53

because the technology is much more

14:55

obviously impressive and like

14:58

has applications you can think of without

15:00

having to like, you know, get a lobotomy

15:02

and start talking like Marc Andreessen or

15:04

what.

15:05

About slurp juice to understand

15:07

like what AI could do for you?

15:09

Yeah, and you know, it's also interesting to

15:11

me because for the obvious reason that, like,

15:13

as a writer, like, this is technology

15:16

that directly overlaps

15:18

with what I am trained to and paid

15:20

to do, you know, and

15:22

I try to be in general, my approach to this

15:24

stuff is to try and

15:26

figure out, like, what

15:29

excites other people about something that doesn't

15:31

necessarily excite me, or like what is what do people

15:33

find so captivating about it.

15:36

It's a little easier to do that because, like I said, it's

15:38

sort of obviously impressive, and I try to

15:40

approach it not from like a purely critical,

15:42

purely kind of negative standpoint.

15:45

Even if I think, you know, Sam Altman

15:47

is not a great guy or somebody

15:49

who's got my best interests in mind.

15:51

Frankly, are they ever a great guy? Are they? You

15:53

know?

16:04

When was the last time there was like a huge new technology

16:07

started by somebody that you're like, that is a

16:09

great person.

16:10

I really like him. They seem cool.

16:12

I mean it is unusual, right, yeah.

16:14

I mean people used to love like Steve

16:16

Jobs, but even Steve Jobs,

16:19

upon further reflection, by the way, always

16:22

seemed like kind of not a great guy, like he was

16:24

sort of a dick.

16:25

I mean, you must remember, I feel like it's

16:28

just a cultural difference between like the nineties

16:30

and two thousands and now. People used to sort of laugh

16:32

about Steve Jobs as an abusive boss, which

16:35

he really obviously was. Every story you'd

16:37

hear, these sort of semi heroic stories about

16:39

him just completely bitching people out, screaming

16:42

at them whatever, like in the in the

16:44

Mac press, in MacAddict or whatever. It was also

16:46

treated as this funny little thing. And now you'd

16:49

be like, if that, if that stuff had come out

16:51

now, people would have been up in

16:53

arms. I mean, because he sounds like he was a horrible

16:55

person to work for in basically every way.

16:57

You do, yes, and then but also you

16:59

have to wonder, could

17:02

a Steve Jobs level creator

17:04

exist? I mean, this is the ultimate

17:06

question. If the man wasn't allowed to be

17:08

a complete like rageaholic or whatever. It's also

17:10

right, he also cried. He also cried a lot, right,

17:13

That was the other thing. Like in the Isaacson book,

17:15

they talked about him crying, which I find just like

17:17

incongruous with him. I mean, I

17:19

can see it, like in my mind's eye. I'm like, okay,

17:21

I get it. But like, imagine being in any

17:23

work scenario. Okay, you're

17:26

a guy. You go into an office, your

17:28

boss is berating you or something, or

17:30

like, you have to deliver bad news, and his reaction is

17:33

he starts crying in

17:35

the room. Forget

17:37

about abuse for a second. Let's

17:39

take the abuse stuff out of it, just for a

17:41

moment, just put it to the side. I

17:43

can't think of a more uncomfortable situation

17:46

for a person to be in. I mean, obviously there are more uncomfortable

17:48

situations, but very unusual and apparently

17:50

happened on a regular basis that he would break down

17:53

in tears during a meeting or

17:55

like during an argument or something, which is like, it's

17:58

just very like he had a lot of volatile emotions.

18:00

But that's how we got the iPhone. From crying? Lots

18:02

of it.

18:03

Fucking raises the question because you know what happened.

18:05

I guarantee you. I guarantee you. Some people brought him

18:08

a shitty fucking touch screen like the original

18:10

Android. I mean, I don't know if you remember how Android

18:12

phones were, the first version of them, they had

18:14

like the touch screen sensitivity was all

18:16

over the fucking map. They sucked to use. Yeah,

18:19

somebody brought him that shit. Yeah, and he had

18:21

a meltdown on them. He threw the fucking phone

18:24

in their face. He did a David Pogue.

18:26

He chucked the phone directly at their face. Remember

18:28

when David Pogue threw a phone at his wife. This is oh, yeah,

18:30

it was sorry, It's just I think about it all the

18:32

time. Whenever I notice them talk about

18:34

the iPhone, I'd be like, good old Dave. David Pogue throwing

18:36

an iPhone at his wife. Oh also horrible,

18:38

by the way, horrible abusive thing to do. But

18:40

yeah, Steve Jobs, they gave him the shitty touch screen.

18:43

He threw it at somebody. He was like, this is not it. Eventually

18:45

started crying, I assume at some point,

18:47

and then eventually, out of fear, right,

18:50

they brought him a really good touch screen,

18:52

like just out of pure fear.

18:54

They were like, all right, like we have to do this, or

18:56

like he's gonna be really upset. We don't

18:59

like, want to make Steve upset. You know,

19:01

would we have gotten the iPhone or would we have

19:03

a would it be a whole different future that we're

19:05

living in right now. Maybe there would be no social

19:07

media because when you think about it,

19:09

without a great touch screen, the iPhone's probably not a

19:11

successful product. And if it's not successful,

19:14

then the whole social media boom and all that shit really

19:16

doesn't happen.

19:16

Maybe. Yeah, I mean when you put it like that, you're not making

19:19

an argument that it was a good thing, or anything else.

19:21

I'm not sure. So if it's... without

19:23

bad bosses, we wouldn't have social media. Wouldn't

19:25

that be great?

19:26

Right?

19:26

Better workplaces and no Facebook.

19:28

But I've come full circle. I mean, when

19:31

I was twelve years old. I got on the Internet when I was

19:33

like twelve, right, and it was early internet.

19:36

But now I'm like, yeah, we got to shut it down, like,

19:38

why did we ever invent any of this stuff? This seems

19:40

bad, like we made a huge mistake.

19:43

Well, it's funny.

19:44

One thing about AI that I noticed is

19:46

I think we've gone from a position

19:49

I mean the sort of capsule history, which is

19:51

always more complicated than this. But the capsule

19:54

history I think that people tell about themselves, journalists

19:56

tell about themselves and the tech boom is that we

19:58

were all a little bit too positive

20:01

towards the tech industry, that we were too excited

20:03

that we gave them a pass through their rise.

20:06

Yeah, we're like the touch screen on this iPhone is

20:08

fucking amazing. I mean, as a guy who reviewed

20:10

most of them, I was like, oh my god, the touch

20:12

screen is so good?

20:13

How did they do it? Steve Jobs has done it again?

20:16

And then and then when everything kind of fell apart and it

20:18

became clear how bad for all

20:20

of us, all of this stuff.

20:21

Kind of was.

20:22

You know, everybody hunched over and was like, we can't

20:25

let that happen again. And I think a lot of the like

20:27

media critical reaction to

20:29

a lot of AI stuff is born out of that fear

20:32

that we're going to be too... You know,

20:34

that we really need to give these products

20:37

like this and technologies like this really

20:39

rigorous and clear kind of investigations

20:41

goings-over, like, make them as good

20:44

as they possibly can.

20:45

Right.

20:45

That's obviously you want journalists who are oppositional

20:47

and aggressive and critical in all these ways.

20:50

But it creates a very funny and different

20:52

kind of relationship with the tech industry

20:54

than existed in say like two thousand and five

20:56

or two thousand and six.

20:58

Oh no, totally, totally.

21:00

I hear, by the way, the sounds of New York in

21:02

the background there, it's like you got a

21:04

siren just creeping up. It's been so

21:06

long since I've heard that, now that I live out in

21:08

the country. No, you're right,

21:10

I mean I think there is probably in some way

21:12

an over adjustment.

21:14

I mean, I actually, and I've probably told the story before.

21:16

But I

21:18

had a meeting, actually

21:21

I was fundraising for the Outline, and

21:23

we met with Marc Andreessen and it

21:26

turned into an argument between the two of us.

21:28

It was an interesting meeting because there were like seven

21:31

people there and it just ended up with us

21:33

arguing with each other. But one of his

21:35

arguments, generally, about

21:37

like the tech press. I mean, the Outline wasn't pure

21:39

tech, but it was like he was like, you guys

21:42

are of all, you're all trashing us, and

21:44

you're taking shots at us, and you're trying to

21:46

like tear down all the work that we're doing. And

21:48

I was like, one of the things that

21:50

I remember, and I think maybe struck

21:52

a chord with him, I was like, we like

21:55

actually have been like nothing but like

21:58

wonderful to the tech industry, and we

22:00

expected, like, people in tech to

22:02

be better. Yeah, and you guys have ended

22:04

up being exactly like the fucking robber

22:06

barons of yesteryear. Like you're supposed

22:08

to be the next generation of like these leaders

22:11

who hold themselves to a higher

22:13

standard or are more aware of what's going

22:15

on in the world and like you know, more

22:17

sensitive to like the users

22:20

and who they are and what they represent

22:22

and anyhow. But like,

22:24

I think that's... they've reacted very poorly

22:26

to getting bad press because they have gotten

22:29

so much good press, and so there's

22:31

a hugely defensive stance.

22:33

I also think, I mean, I think that the other aspect of

22:35

this that is sort of unexpected and you wouldn't

22:37

have been able to predict in two thousand and six, is these

22:39

are like the two main power

22:42

user groups on Twitter. So

22:44

like suddenly all of the media kind of got shuttled

22:47

through this one single social media

22:49

feed, and it was the same feed

22:51

that all of all of tech

22:53

was also on, and it just was like

22:56

just sort of a bad neighbors situation.

22:58

Like maybe like all of a sudden, you went from like

23:01

maybe you get the Times delivered and you see

23:03

a negative article, but you also see the positive article,

23:05

and you also get the Journal and there's a bunch of different things.

23:07

Now you're like you see every single even the

23:09

lowest level, you know, copy editors

23:11

talking shit about you or like replying to

23:13

you and calling you egghead or whatever,

23:16

and you're freaking out, like it's too much

23:18

for you.

23:19

That is mean. You should never... I don't think... I

23:21

think you should never go after people's looks.

23:23

I think you don't you know

23:25

what I mean. I have always felt this.

23:26

I mean, especially during the Trump years, people would

23:28

be like ooh, the orange whatever. I'm like, that's not productive,

23:31

Like it just isn't like it's not a good.

23:33

Clearly, I mean, it clearly got under... well,

23:35

clearly got under a bunch of people's skins, various

23:37

various people. I mean, it doesn't.

23:39

I mean.

23:39

The funny thing is that Andresen then started his

23:42

own publication to much fanfare,

23:44

called, I think, Future.

23:46

He's also very interested in covering

23:49

the future, like What Future and.

23:50

Your publication.

23:51

Then they shut it down because, because who

23:53

wants to read a fucking white paper from a sixteen

23:55

Z or from Andreessen Horowitz, like nobody wants to

23:57

read, nobody cares, nobody wants to read PR.

24:00

First off, I wish I could tell all the PR people

24:02

though I wish they could learn.

24:03

But I mean, and the other thing is like running an actual

24:05

like publication that publishes

24:08

good things that people want to read with any kind of regularity

24:10

is much more difficult than I think people

24:13

appreciate. I have this thing about, I think there's

24:15

something real about the way people,

24:18

certain kinds of people relate to journalists and the media

24:20

is because of what we do is just

24:23

write and just sort of tell people what's

24:25

happening in the world. There's there's there's a

24:27

sense that like anybody could kind of do

24:29

it, and so there's a there's a there's a resentment

24:32

from people like Andreessen or whomever, that sort

24:34

of the suggestion that, like, you know, I

24:36

could be doing what you're doing, I could do it better. And

24:39

when you actually try to do it, like run the whole

24:41

thing as a business, like like hit everything

24:43

that you need to hit to be a good journalist

24:46

and a good editor and a good publisher, you recognize

24:48

that, especially in the current environment, especially in the

24:50

environment that people like Andreessen created, it's

24:53

a lot of work. It takes a lot of work and a lot

24:55

of like accumulated skill and knowledge.

24:57

It's not something you can just replicate because you've

25:00

decided that today's journalists don't know how

25:02

to do it right or whatever.

25:04

I agree with everything you just said one hundred percent, but

25:06

I will also say we have so

25:08

devalued what looks like.

25:10

Journalism and news.

25:14

It's been so greatly terrifically devalued,

25:16

and there are so many people who truly suck

25:18

doing it, like really bad. There are

25:21

like bad actors at places like Daily

25:23

Wire or whatever. There are people

25:25

who are bad at it, like that just aren't good

25:27

at their job because it has become probably there

25:30

was a period where it was easier than ever to

25:33

go, hey, i'm gonna write, I'm gonna make content,

25:35

Like journalism became this thing called content,

25:38

not the journalism with the capital J that you're

25:40

probably thinking about or we're talking about.

25:42

Really.

25:43

And also it's all free, yeah, right, it's

25:45

free to everybody, and everybody thinks it should be free.

25:47

And so I think it's that combination of

25:50

like you've got like legitimately

25:52

bad actors, you've got legitimately like kind of people

25:54

who should never have gotten into the craft

25:56

to begin with. Like not to throw it back to

25:58

the NFT thing, but it's a good it's kind of an interesting

26:01

parallel where it's like you really

26:03

aren't an artist. You didn't like, you didn't

26:05

train, you didn't really want to be an artist, you

26:07

kind of don't care about it that much, but like you

26:09

could get a job doing it right. You can get a

26:11

job like making an NFT and like maybe

26:14

sell some and make a quick buck. I think

26:16

there's a lot of people who kind of found their way into like content

26:18

creation that was like a machine.

26:20

It was like it kind of like, you.

26:22

Know why we're all talking about AI a lot lately in

26:24

this in this venue is like it

26:26

became a very machine almost like a machine

26:29

generated game, where it's like you write a headline

26:31

that it will get clicks, and you put some content

26:33

below it that feels like enough information

26:36

to call it a story, and that's like what

26:38

journalism is. There's actually a fair

26:41

argument to say that there is a lot

26:43

of bad journalism. It's just that what

26:45

they're talking about isn't like

26:48

the Daily Wire, and it isn't

26:50

like the user generated content on BuzzFeed

26:52

dot com. It's people

26:55

who say things that are true that they don't want to

26:57

hear or don't want anybody else to see.

27:00

Basically.

27:00

The other thing I wouldn't want to say is that it's not as

27:02

though. I mean, the thing I still love about the Internet

27:04

is that you can find incredible

27:07

geniuses doing work

27:09

that would otherwise never have been covered.

27:11

It's not like citizens, I like, do

27:14

I believe in citizen journalism like conceptually

27:17

circa twenty twelve, Jeff Jarvis

27:19

talking about like said citizen journalism, No,

27:21

but do I believe that there are like people there are

27:23

incredibly talented writers, reporters,

27:26

journalists out there who would not

27:28

have access to audiences if it wasn't for the Internet.

27:30

Yeah.

27:30

Absolutely, I mean that those people are those.

27:32

People one hundred percent. I mean, to be

27:34

clear, we would not be having this conversation.

27:36

I would have done nothing if it weren't for

27:38

the fact. I mean, I didn't go to, as

27:41

I think you know, I barely even went

27:43

to high school, but I definitely didn't go to

27:45

college for journalism. Like I didn't

27:48

go to J-school and then leave and go intern

27:50

at the New York Times. Like there used to be a way

27:52

that people did this that was very linear, right, and

27:54

it was a very closed circuit.

27:56

For sure.

27:57

The Internet has has I know that everybody talks

27:59

about this is like kind of like bullshit, but I think

28:01

it, I mean, these days, but it had

28:03

leveled the playing field for if

28:06

a person was eager and excited

28:08

and good, yeah, and had any

28:10

bit of talent, Like there was a place to go

28:13

and go and do it and find like hone

28:15

it, you know, and like I that's

28:17

real and awesome. Yeah, And a lot

28:19

of our best journalists, a lot of the best journalists

28:22

working today came from

28:24

that sort of background,

28:26

like not straight from J-school.

28:28

You know, something that I believe,

28:31

like incredibly strongly, like on a

28:33

political level, is that there's all this incredible,

28:36

unused, untapped talent,

28:38

creative talent, creative genius intelligence

28:41

out there in the world, in the US

28:43

but around the world that is just ill

28:46

served by the political economic

28:48

systems that we have working right now. And the Internet

28:50

at its best is a way to level

28:52

that playing field and to like find outlets

28:55

for those people, find ways for them to make use of

28:57

their incredible genius that

28:59

otherwise wouldn't exist. And then the Internet,

29:01

at its worst, is also like a

29:03

way for very rich guys to like find those

29:05

talents and then just exploit the hell out

29:07

of the stuff they're creating. I mean, I think this about social

29:10

media for real. It's like, what makes

29:12

Facebook valuable is not the technology,

29:14

what makes or Twitter or any or TikTok

29:16

or anything. It's the people who are creating

29:19

stuff that is engaging and entertaining and

29:22

funny and weird for free for these

29:24

social networks.

29:25

Right.

29:25

The drill the drills, Yeah, exactly

29:27

right, And you know this is this is the thing that I think

29:30

is worth thinking about as AI comes

29:32

into being is the sort of sense that, like, you

29:34

know, what is AI going to be

29:36

used for? Because I think it has this

29:38

possibility to be a sort

29:41

of production tool in the same way that you

29:43

know, like electronic music

29:45

production tools gave access to all these

29:47

kids in their bedrooms the ability to make beats and

29:49

to make music and to create

29:52

create stuff themselves without having to you know,

29:54

pay one thousand dollars an hour for a studio

29:56

or whatever and get that stuff together. Like,

29:59

I see a lot of generative AI that feels like,

30:01

maybe not right now, but pretty soon, this is going to

30:03

help people create things themselves, even if

30:05

they don't really have access to these huge

30:07

resources outside of it, right, And what frustrates

30:10

me is that it sure seems like the business

30:12

model that places like open ai are gunning

30:14

for is instead of like, let's enable people who

30:16

otherwise wouldn't be able to, you know,

30:19

do these things, let's let's find ways

30:21

to replace the people who are already doing these

30:24

things with the shittier version that we can

30:26

have the bosses control or whatever.

30:28

Right, there's a lot there that I agree

30:31

with and also to unpack. But I will say

30:33

that it is interesting that. I

30:35

mean, I don't want to get into a you know, capitalism

30:38

or whatever, whether capitalism is good or not. There's

30:41

obviously some problems with it, but

30:43

like there is that you know. The history

30:46

of capitalism is like these

30:48

like incredible innovations that are then like manipulated

30:51

into like a massive business that then

30:54

like needs a bunch of like worker bees

30:56

to like go and do and it's like owned

30:58

by a very small segment. Like the thing

31:00

itself is owned by a very small segment of the population,

31:03

but the actual work to make the

31:05

thing is done by a much larger segment.

31:07

Of the population.

31:08

And it's usually grueling and shitty, and people

31:10

are underpaid for it. And it's like is and that is

31:12

like in some way, like a Facebook is, like

31:15

you said, like all these people are Facebook or Twitter or

31:17

whatever. Is like these amazing people create for

31:19

it, like that's the engine of it or whatever. But

31:21

it is, you know, over time

31:24

turned into kind of like a just

31:27

like it's a part of like a machine, Like those people

31:29

are a part of a machine that drives commerce.

31:32

Right, Yeah, I think that would be fine if the system

31:34

of commerce that it was built on actually

31:37

made any fucking sense. But the

31:39

system of commerce it's built on is based on a

31:41

mistake, like essentially a mistake about

31:44

the value of people on the Internet. I

31:56

think there's like a foundational fundamental

31:58

flaw and like how we monetize content

32:00

on the Internet. I've probably put this a million times and

32:02

I don't have to go into my spiel here, but like

32:05

the way that Google came up with monetizing

32:08

search is the way that we basically monetize.

32:10

Everything, and it is it's

32:13

wrong. It just was wrong.

32:14

It just was like they could do that at the time because

32:16

that was all that was available to them, Like they had one

32:19

notion of like could we make money, and like, well, this

32:21

is a way, like a billboard on a road,

32:24

Like that's one way to make money. Like you put a billboard

32:26

up and some people drive past it, and you can

32:28

sell space on the billboard. And sometimes people

32:30

who drive past it will be like, hey,

32:33

like yeah, I need a new car, Like I should

32:35

go check out the fucking Audi that

32:37

I you know, the sign I drove past. But like,

32:39

out of the people who drive past it, most of them never

32:41

go check out the Audi or whatever. But over

32:44

time, there's some small value

32:46

that can be extracted from that billboard. And

32:48

that's but that's the entire business model of the

32:50

Internet, is that every single thing

32:52

on it is like as devalued

32:55

as.

32:55

Like a billboard or whatever.

32:56

Like anyhow, I mean it's not the perfect analogy,

32:59

but you know what I'm saying, So.

33:01

But it doesn't.

33:01

I mean, part of it too, is that it means that when you're doing

33:03

creative work, like you are also

33:06

having to think about how you're like you yourself

33:08

are also a billboard. And so like I like

33:10

my job, I like my life. I like what I do on Substack,

33:13

but like part of what I

33:15

have to be, part of what my Substack is, and like part

33:17

of what I have to do as a writer who doesn't

33:20

currently have W-2 employment, is

33:22

sort of be my own marketing

33:24

team across a bunch of different platforms

33:26

so that people know they can hire me, people know they can

33:28

find me.

33:29

Right.

33:29

It's the hustle, yeah, and it's it's not a

33:31

great way to live. And the thing that but the thing that

33:33

really I think is a problem with it is that we

33:36

know, for a fact there are a lot of writers who are

33:38

more talented than me, who are better than me at all the things

33:40

I do. For

33:42

instance, well, who don't want

33:44

to let's without naming names. I don't

33:46

want to be part of that that hustle who

33:48

are not interested in that. Like, No,

33:52

I'm talking about the opposite. I'm talking about somebody who

33:54

is not good at this kind of this kind

33:56

of thing, who doesn't want to do it. Yes, there's

33:58

like very few ways for that person. Yeah, there are

34:00

very few ways for that person to be compensated for work

34:03

they do. Whereas you have people

34:05

who are who are incredibly good at the hustle

34:07

part of it, again not naming, incredibly

34:09

good to the hustle part of it, not not

34:11

particularly intelligent or whatever.

34:13

Writer. I think that's what we're talking

34:15

about.

34:15

I mean, we can all imagine in our heads

34:18

somebody who fits these categories and say,

34:22

for exactly, for example, they're they're ubiquitous.

34:24

Unintelligent, but good at the hustle. I'm thinking,

34:26

actually, they shake.

34:27

The right hands, they put themselves in the right names.

34:29

It's one reason why the internet

34:31

can make you so mad when you go online these

34:34

days. Like the reward is marketing.

34:36

Like you get much better rewarded for marketing than you do

34:38

for quality.

34:39

Right, of course no, yes, I mean and

34:41

and and if you're bad at or don't like

34:43

or feel like you know, some people like if

34:45

you're from Generation X like myself, you're

34:48

not gen X.

34:49

Are you? You're a millennial?

34:50

No, I'm a millennial.

34:50

Yeah, if you're if you're like a gen X person, you

34:52

you feel a kind of physical impulse

34:55

to reject self promotion. And

34:57

now, I don't know, maybe people would say that I don't. I

34:59

haven't rejected self-promotion. I'm not really sure, but I would

35:01

tell you I would. I think if you ask Lyra and Jenna

35:03

if I am promoting this podcast enough, I

35:06

think that they would tell you that.

35:08

I am not. And they wish I was hustling.

35:10

They wish I was hustling more. I don't know. I don't want

35:12

to speak for them.

35:13

All that said, you know, if

35:16

you even are close to making

35:18

a living at doing what you're doing,

35:21

it's also fucking wild because you

35:24

know your parents and your grandparents

35:26

and certainly every generation that is, almost every

35:28

generation that's come before us. This abstract

35:32

thing that we are doing, this abstract

35:35

I mean, it is fucking weird, like it is weird,

35:37

like we go on to this thing

35:40

the screen, We go into the screen and

35:42

then we do something that frankly we like.

35:44

Probably for the most part, you probably like the writing

35:47

that you're doing, right, You're not like, you don't wake

35:49

up every day and you're like, I mean, I don't know, maybe you got to do the hustle,

35:51

but you don't wake up every.

35:52

I mean some weeks. Some weeks you're like, ah, do I really

35:54

have to do this? But for the most part, yeah, it's good,

35:56

right.

35:57

And I mean, I'm sure there are many struggling

35:59

YouTubers you feel the same way. But like, if you can

36:01

even make if you I'm sorry, if you can even make one

36:04

dime just shooting a video

36:06

of yourself and put it on the internet, Like, it's kind

36:08

of amazing because there was no

36:10

period in history before this

36:13

where such a thing was possible in really any venue.

36:15

I mean, you couldn't just put a show

36:17

on and have people come to your play or

36:19

whatever. You couldn't just make a movie

36:21

and put it in a theater. You couldn't just

36:24

like write a newspaper and have it exist

36:26

in front of people. So there is this amazing

36:28

flip side, which is like all of this seems like there's

36:30

fantasy here. Like I think if

36:32

you were to go back fifty years and describe.

36:34

This to somebody, they'd be like, it would just.

36:36

Be sound like such a fantastic

36:39

story, like such a total bit of fantasy.

36:42

So there's a flip side to all of it, which is

36:44

like, the internet's a cesspool full of horrible people, and

36:46

the people who own and control most parts of the

36:48

Internet are tyrants who are working

36:50

everybody to the bone and don't give us our fair shake.

36:52

And the model, the monetization model

36:54

the Internet is built on, it's a complete shit show that sucks,

36:57

and it's broken at its most fundamental

37:00

level. And yet there

37:02

are more people creating

37:05

interesting works of fiction,

37:07

non fiction, art, non art, like journalism,

37:09

whatever, than we've probably ever had in the history of

37:11

the world. There's probably more like journalism

37:14

being done than ever before. Yeah,

37:17

am I crazy for saying that?

37:18

No? I don't think so. I mean, I think it's a it's a I

37:20

mean for me, the question is, how can

37:23

we have that flowering of creativity

37:25

and talent and intelligence, and how can

37:27

we you know, like make sure that people are reaching

37:30

their potentials without the layer

37:32

of exploitation that right

37:34

now seems like integral to all

37:37

this stuff. And I'd like to say, this

37:39

is something that I've been thinking about a lot lately, because

37:41

so I'm in the Writer's Guild, the TV Writers

37:43

Guild and we're on strike right now, or

37:46

the TV and Film Writers Guild and we're on strike right

37:48

now, and something that is a huge contrast

37:50

between being a journalist, where you

37:53

might have a workplace union but

37:55

oftentimes in general you don't, versus

37:57

being in a heavily unionized

37:59

industry part of a pretty powerful union

38:02

is seeing what

38:04

can happen, how much better a job

38:06

can be when you have

38:09

that kind of at least, you have a little

38:11

bit of leverage against the exploitation that you're

38:13

not completely at the whims of what the bosses

38:15

want at any given moment. You're not completely like,

38:18

am I going to have a job today? Am I going to have a job tomorrow?

38:20

Am I going to make money from this thing I'm working on on?

38:22

Spec? Am I not going to make money from it?

38:24

Right?

38:25

And I think Hollywood is a really

38:27

interesting example of an industry that is a creative

38:29

industry that produces a wide

38:31

range of things from total shit

38:33

to like you know, instances

38:36

of surpassing genius that manages

38:39

nonetheless to like adequately up

38:41

until recently, adequately reward the

38:43

creative people who work on it like and sometimes

38:45

like more than just rewards, sometimes make them extremely

38:48

rich because they they did a really good job

38:50

at whatever it is you're supposed to do it. And I think

38:52

that like to you know, I'm not saying like, I

38:54

don't know how content creators would. I'm not suggesting

38:56

that content creators unionized, though I would love to. I

38:58

would think it would be great.

38:59

Imagine if like everybody on

39:01

YouTube unionized, like you.

39:02

Know, there's like the German German

39:05

vloggers have been trying to do this for

39:07

years now, and in fact, they're unionizing

39:10

under the steel workers union in Germany, which I guess

39:12

is extremely powerful. It's maybe the biggest union

39:14

there, and so they've got a special like YouTube

39:17

content creators organizing

39:19

like group that they're.

39:20

Trying to I like, I really like the idea of like

39:22

steel workers and YouTubers like you

39:24

know, together, that's like very I

39:27

don't know that sounds.

39:28

Right. I might be remembering this wrong, but I'm

39:30

pretty sure the guy behind it is this German guy who

39:32

does videos of where he like builds his own catapults

39:34

and just like it's the most YouTube thing. But also

39:37

I think I like, right at the intersection of like steel

39:39

workers YouTube and German, it's like the

39:41

guy.

39:41

It's like the guy who's like, I'm going to build a house from

39:44

like just like this mud plane. I like, come

39:46

here and I'm going to build like a mudhouse out of like lumber,

39:48

I chop down and yeah, something like that. So,

39:51

right, there is this huge strike going on right now. Of

39:53

course, Hollywood is an industry that's been around a lot longer

39:55

than the Internet, right, and has

39:57

has gone through lots of situations

39:59

where there's tremendous abuse of power

40:01

by the people who are in charge of like the studios.

40:04

I imagine there's still a tremendous abuse of power

40:06

by the people who, yeah, the studios. But but

40:09

yeah, of course people eventually were like, hey, we need

40:11

to unionize. Of course, It's interesting the modern

40:13

narrative about unions is very

40:16

confused. I think there's so much misinformation

40:18

about about what unions are and what they

40:20

do that it has really done its

40:22

job of like making a lot of people very skeptical

40:24

about the power of a union.

40:26

Yeah. I mean, Hollywood unions all formed in

40:28

the forties and fifties, when union density in the US

40:31

was like three or four times what it is now. When

40:33

you know, it was like pretty common to be in a union.

40:36

There were like institutions that people knew well; everybody

40:38

knew at least one person or a family member who

40:40

was in a union. So it wasn't crazy to be like,

40:42

oh, yeah, we're going to form a writers' union.

40:43

It wasn't a political thing that like, you know,

40:46

I mean there was a wow.

40:47

I mean, it depends on the you depends on the union. Like some

40:49

of those you know, the writers in particular were

40:51

and still are relatively left wing compared to

40:54

the others, right.

40:54

But I mean, I mean, Hollywood is generally

40:57

speaking not a I mean,

40:59

I'm not saying there are no conservatives in Hollywood,

41:01

but it is obviously the owners are all

41:03

probably there's no question, but I mean, generally

41:05

speaking, people in the creative arts tend

41:08

to be more left leaning, more

41:10

liberal than people in a non creative

41:12

arts. I think, yeah, I think that's fair, and so unsurprisingly

41:15

that the union would be you know, obviously very very

41:17

liberal or i mean socialism, you

41:20

know, is directly tied to lots

41:22

of unions in their creation, and like the

41:24

concept of a union of itself.

41:26

Is like a very socialistic idea, although

41:29

you know that being said, like if you go and you talk

41:31

to like the camera operators who work shows

41:33

in New York City who are all members

41:36

of IATSE, and the teamsters who are part of the Stage

41:38

Hands Union, which is one of the other big

41:40

powerful Hollywood unions. Yeah, yeah, I'm not saying

41:42

any of them vote for Trump, but you're going to talk to guys who have a very

41:44

different set of politics than

41:46

I do. Solid union guys, guys who are showing us solidarity

41:49

on the strike. But there's like a there is a wide

41:51

range of political.

41:52

You know, but you see how those cut across.

41:54

There's a very blue collar, white collar intersection

41:57

there, where you've got like a camera operator,

42:00

or like somebody who's doing set design

42:02

or or like literally some of the labor. Like I remember

42:04

when I used to go on on Fallon and like I

42:07

remember once I tried to move a box like we were like

42:09

setting up some like gadget or whatever, and they're

42:11

like, you can't touch that, And I was like Okay, there's

42:13

only the union people who moved

42:15

the boxes could touch the boxes. Like, it was

42:17

like literally illegal for me to touch the box. But

42:20

it's very blue collar work, you know, it's like not like

42:22

sitting down at your think pad and writing,

42:24

you know, the next episode of Lost.

42:26

It's like.

42:27

One of the things that, I mean, one of the things that has been very

42:29

inspiring and has made me feel

42:31

really optimistic and positive about the strike we're

42:34

on right now is the extent of the

42:36

solidarity shown by the teamsters to

42:38

the Writer's Guild so that we've managed especially it's supposed

42:40

to work. Yeah, I mean, the idea is that everybody

42:42

that there's a there's solidarity between unions because everybody

42:45

understands the basic dynamic. And that

42:47

hasn't always been the case. You know, the last

42:49

strike writers went on was two thousand and eight. A

42:51

lot of this is sort of like anecdotes,

42:53

but people said, you know, back then, people

42:56

didn't think the writers need to go on strike. You

42:58

know, teamsters were worried about missing their

43:00

hours. You know, when productions are getting

43:02

shut down, in a lot of cases, they're not getting paid.

43:04

So they would break picket lines, they would they would

43:06

walk through, they would try to keep working, try to get paid,

43:09

and that just has it's been a total

43:11

change from that for this most recent one,

43:13

which you can attribute to a million things, you know, like one

43:15

of it is just, I think, like you were saying, we've

43:18

gone from a real low ebb of unions to like

43:20

the sparks. We're not yet in a place where

43:22

unions are like have the same kind of density

43:24

that they used to, but we've gone from a low

43:26

ebb to there's you know, there's a changes

43:29

coming, I think, And so I think people are more positive

43:31

in general that we're learning

43:33

more.

43:33

I agree.

43:34

I think people maybe

43:36

are starting to realize and maybe the internet

43:38

is we can thank the internet for some of this,

43:41

just of being able to visualize what

43:43

happens in non union environments

43:46

and like going, hey, wait a second, like, yeah, it makes

43:48

sense that we have some leverage because

43:50

if you're not in a union, your leverage is basically nil

43:53

with the business. And by the way, have you been picketing?

43:55

Are you out there? Like have you you out there with this?

43:57

Yeah?

43:58

I haven't picketed this week, but I was picketing

44:00

last week and the week before.

44:01

Are there shifts? Are people doing shifts on picketing?

44:03

Yeah, so they've got big shifts that they announced

44:05

the day before. Like right now, upfronts,

44:07

which are the big ad sales events that the

44:09

networks I'll put on, are going on. So there's pickets outside

44:11

all the upfronts. You know, usually they

44:14

bring on the actors for the big fall shows or whatever.

44:16

Actors, like the teamsters have been showing a

44:18

ton of solidarity with the writers. Most of them have refused

44:20

to cross. So it's like, you know, a couple

44:23

of nightly news anchors though even even

44:25

some of the news anchors refused to cross

44:27

too. I hear so oh really, yeah, so

44:29

there's like, that's interesting. Yeah, Lester Holt,

44:31

he won't cross, he isn't crossing. Both refused

44:34

to cross.

44:34

That's cool.

44:35

Andrew Ross Sorkin of The Times and CNBC

44:37

did cross.

44:38

Oh wow, Andrew not cool.

44:40

Stephanie Ruhle of MSNBC.

44:42

Stephanie Ruhle crossed? She definitely crossed.

44:44

Yeah, okay, who else? Anybody else? Whose is juiciest?

44:47

Seacrest crossed.

44:48

Seacrest crossed. Yeah. What's going on with Fallon?

44:50

I saw there's some He came on Blue Sky, he

44:52

joined Blue Sky and people were like fucking hazing

44:55

him. I felt bad for the guy, to be honest,

44:57

It's like, I mean, I know you shouldn't feel too bad, but I.

44:59

Mean, the thing the show is, it's always complicated

45:01

because you know, you don't want to cross the

45:03

picket line. But if you're doing a nightly, if you're

45:05

doing a nightly talk show, like you've got a huge number

45:08

of people who are that's their living, but you

45:10

can shut it down for two weeks, but at some point you have to

45:12

go back to work. Yeah,

45:24

I think this one's going to last a long time, and so I would expect

45:27

most of the talk show hosts to go back on without writers

45:29

and have to do something similar, though.

45:32

I mean, we'll see, Like I don't.

45:33

They've got AI. Just spin up ChatGPT

45:36

to do some bits. I mean, how hard can it be?

45:38

Right?

45:38

This is maybe this would be the first big test of whether

45:41

ChatGPT can replace human beings

45:43

in creative endeavors.

45:44

I mean, this is one of the items in our

45:46

negotiations. It's very important to us because

45:48

I mean, I think one thing that one way to think about all

45:51

this, you know, just to connect it to us talking about

45:53

media before is like we lived

45:55

through when we were talking about earlier on, we lived

45:57

through a huge sea change in the way

46:00

like journalism is just created and distributed,

46:03

going from print to online. That

46:05

just decimated the industry. That's not

46:07

the same thing necessarily as what's happening with streaming

46:10

versus studios, But there's

46:12

a lot of similarities. And one of the similarities

46:14

is just having these huge cash

46:17

rich companies, tech companies like Netflix

46:19

and Amazon and Apple come in and just kind

46:21

of throw money around in a way

46:23

that is very hard for people to say no to that kind

46:25

of money, but often means also diminishing

46:28

work protections, accepting deals that you wouldn't

46:30

have otherwise accepted, of course, and now I think a lot

46:32

of people are sort of looking around and saying, hold on,

46:35

like what direction is this going in? And

46:37

having lived through that as a journalist and seeing

46:40

sort of what happened to journalism, I think it's

46:42

very clear what happens when you allow

46:44

yourself to sort of buy the line that like, oh,

46:46

the technology has changed, so you have

46:48

to accept that you just can't get paid as much anymore.

46:51

And you know the other thing about

46:53

that is that Hollywood actually is a great example

46:55

of an industry that's gone two or three times now

46:58

through major technological shifts in how stuff is created

47:00

and distributed. These unions have lasted since

47:02

before TV, they lasted

47:04

before the VCR, before DVDs, And every

47:07

time there's been one of these shifts, there has been almost

47:09

like you could almost track it by the year,

47:11

there's been a strike because every

47:14

single time the studios try and say, oh, well, stuff's

47:17

changed, we just can't pay as much as we used to, and

47:19

writers and actors and directors

47:22

and stage hands tend to stand

47:24

up and say, hold on, that's not what's

47:26

going to happen. We're going to figure out a new system so

47:28

that we get paid what we deserve to create

47:30

the content that people want to see. And

47:32

so at the end of the tunnel right now you

47:34

can see a place where studios

47:36

want to use AI to create

47:39

content as a way to pay writers less,

47:41

as a way to employ fewer writers. And

47:44

it's really important to us to stand up and

47:46

say hold on, Like, for example, if

47:48

you come up with an idea

47:50

for a story via ChatGPT

47:53

or whatever sophisticated Hollywood AI system.

47:55

I'm sure Netflix is developing somewhere

47:57

in a dark room in their huge complex. You

48:00

still have to pay the writer who takes that idea

48:02

and turns it into a script the same rate that

48:05

you would pay a writer to write a script

48:07

from a Wikipedia article or whatever.

48:09

Right, I mean, that just seems like a no-brainer.

48:11

Yeah, the work is the same exactly

48:13

right.

48:14

Yeah.

48:14

And you know there's another thing where it's like most TV

48:16

shows are written with a writers' room, so you get

48:18

ten smart, funny writers. I

48:20

mean, this is like one of those things. I you know, because I started

48:22

as a journalist and moved into TV writing, I'd never

48:24

really experienced the TV writers' room. And now it's like, I

48:27

wish every single thing I write I could write

48:29

with a writers' room, to be able to sit down and say,

48:31

and be like, here's what we're trying

48:33

to do. Let's get ten like extremely

48:35

funny, smart people to like

48:37

like just really work on it for five months

48:39

and make it really good. And I think with studios

48:41

they want to cut down the length of the writer's

48:44

rooms. They want to cut down the number of writers that you

48:46

have to hire because they have this idea in their heads.

48:48

And you know, I don't think any of them have announced

48:50

this specifically, but this is what everybody's sort of hearing.

48:53

This is the chatter that you

48:55

have one guy, you have your Damon

48:57

Lindelof or J.J. Abrams or

49:00

Shonda Rhimes. They come up with the idea,

49:02

and then your writers' room is ChatGPT

49:04

or AI or whatever

49:07

else. And then not only does that make for

49:09

worse content, but it's also that means that writers

49:11

in general, when you want to have just one writer come

49:13

in, you know, a human writer coming in to punch it up,

49:16

they're now not protected by the contract

49:18

structures that previously existed.

49:20

So all of a.

49:20

Sudden, it's like, you know, there's all these things

49:22

that the studios, there's ways to make it sound reasonable.

49:25

It's like, oh yeah, sure, why do you need to

49:27

have ten writers or whatever? And the answer is, and not

49:29

every show needs to have ten writers. But if

49:31

you don't guarantee that every show

49:33

has ten writers, then all of a sudden, all the shows

49:35

that really do need ten writers suddenly won't get them,

49:37

right, because the studios don't want to pay for it. They're going to point

49:39

to the guy who just did it himself, like Mike White,

49:42

who wrote all of The White Lotus by himself, and say, well, Mike White

49:44

did it.

49:44

Why can't you? That's a good point. That's a great point. Actually,

49:46

come on, why not? Why can't you just come in.

49:48

Right, exactly. And then you

49:50

have guys who really can't, doing ten-episode seasons,

49:52

you're going to get a lot more, a lot worse TV.

49:54

Well.

49:55

I mean, it's interesting how it mirrors a lot

49:57

of the rest of what's going on in reality, Like

50:00

there is this pursuit of like massive

50:03

growth and production, and like in some way,

50:05

I feel like we're starting to see the limits of

50:07

like the content mill. Like I feel

50:09

like with the streaming services, like there's

50:11

just too fucking much. Like like we know

50:14

that people are starting to watch less. We know that

50:16

there's like an attention sort of drift happening.

50:18

I mean, I don't know about you, but I feel like the Sunday

50:21

night appointment viewing thing has come back into

50:23

focus in a big way around the stuff that HBO

50:25

is doing and some other shows like stuff like Yellowjackets,

50:28

where people are like it really is week to week,

50:30

like talking about there's like a discourse about a show,

50:32

not about the million things that

50:34

you could possibly be watching. I mean, I think it is

50:36

that pursuit of that scale is very

50:39

tech-market-based

50:42

fucking thinking on like all

50:44

of this can be bigger and there can be more of it, and we

50:46

can just pump this out, and like AI is a solve

50:49

I mean, yes, the solve for, like, Oh, I don't want

50:51

to deal with these fucking creatives or I don't want to room with ten people.

50:53

I want one guy.

50:54

But it's also like a solve for scale, right,

50:56

You're like, oh, I can just pump this shit

50:58

out and I can hone it and refine it to what the audience

51:00

wants and it'll be this perfect marriage of like no

51:03

cost production and content

51:05

that people love, and like we'll just make all

51:08

the money for ourselves. I want to believe

51:11

there's some impossibility there. I want to

51:13

believe that there's something. I think we're all convinced

51:16

there's some fear lingering out

51:18

there, this AI fear, you

51:20

know, and listen, I see it in art, like, I mean, the stuff

51:22

that Midjourney does is crazy.

51:24

Like I put out a song.

51:25

The other day and I had Midjourney do the art for it,

51:27

and it is like as good as any maybe better

51:29

than any piece of art I could have gotten from another person

51:31

because it did exactly what I wanted, exactly

51:34

what I was trying to get it to do. And

51:36

that's scary. That's going to cost people jobs. There's

51:38

no question it already is costing people jobs.

51:40

Like I see it all over the place, Like I see it on

51:43

articles on news stories all the time,

51:45

right where the content was devalued. So

51:47

also has the art content been devalued.

51:49

Go figure.

51:51

I want to believe that there's going to be a point

51:54

and maybe I'm wrong, And frankly, I don't know if I'm

51:56

right. That the capability

51:58

of the AI won't be what we think

52:00

it will be to deliver the thing that we think it can deliver.

52:03

Like I feel like in technology and in life

52:05

in some way, we're always looking for this

52:08

magic single solution that you put

52:11

the thing the input into and the output

52:13

is your solve, right like, and

52:15

I think that this happens all the time. I

52:17

think pivot to video is one of these things,

52:19

right Or like newsletters is one of those things.

52:22

Not a knock on your thing, but it's like this

52:24

is it. Yeah, we found the solution

52:27

to the problem here, and if

52:29

we just put that all the our bets on that thing.

52:31

It's going to fix it. And like, I think AI has a lot

52:33

of like.

52:33

Promise and obviously a lot of potential danger,

52:36

but is it really like can it really do the things?

52:38

Is there an AI that will be Mike White before

52:41

Mike White exists?

52:41

Like?

52:42

Yeah, can AI do The White Lotus? Just

52:44

because it's able to synthesize what has come

52:46

before it? Yeah, and spit something out

52:48

even if it's of some quality.

52:51

Have we underestimated the

52:53

randomness of the human mind? Like I

52:55

think then, I think we have a little bit to

52:57

be honest.

52:58

You can almost talk about this in purely technical

53:00

terms right where it's like we've seen these

53:02

incredible especially large language models have

53:04

advanced by leaps and bounds just by throwing

53:06

power at them, basically. But it doesn't

53:09

follow from that that just by continuing to add

53:11

more computers, you know, more chips,

53:14

more power behind those LLMs,

53:16

that we're going to get to you know, maybe we can

53:18

get to ninety percent of human capability or

53:20

whatever, and maybe for a lot of applications that's

53:22

good enough. Ninety percent is fine, and even

53:25

maybe for television, But that last ten

53:27

percent, like the ten percent that is memorable

53:30

and meaningful and original

53:33

and new we may never ever reach

53:35

that, we may not ever get.

53:36

There or whatever.

53:36

Yeah, I think that because I think the other the flip side

53:38

of the sort of dynamic you're describing is

53:41

and I can't remember who wrote it, but I feel like there was

53:43

an old Wired article that first kind

53:45

of articulated this idea that like the

53:48

one of the things the Internet has done is sort of

53:50

allowed the reign of the good

53:52

enough. That there's just like once you realize

53:55

you can kind of penetrate into what people are actually accessing

53:57

and seeing and doing whatever else, that you realize that, like

54:00

you don't have to make everything perfect. You can just make

54:02

stuff good enough and people will consume it in vast

54:04

quantities. In some ways, that's incredibly freeing

54:06

as a creative that like you can allow stuff

54:08

that wouldn't necessarily hit your standards out there

54:10

into the world. But the flip side of it is that like

54:13

if good enough is good enough, then like

54:15

what's the motivation to take it to

54:18

beyond good enough? Especially like on a corporate

54:20

level where you're trying to impress shareholders and make a

54:22

profit whatever else, Like you don't get extra money

54:25

by going from good enough to truly

54:27

excellent. Like the dystopian fear about

54:29

AI. I think I'm just sort of echoing what you're saying,

54:31

is like maybe AI will never be able to write the White

54:33

Lotus or Succession or whatever, or The Sopranos. Certainly

54:36

not The Sopranos, whatever TV show you love.

54:37

Definitely not The Sopranos, like, you're like the other

54:39

one.

54:40

Well, the Sopranos to me is like that's part

54:42

that's like, that's that's memorable, that's

54:44

for it.

54:44

I don't I don't disagree, But they'll be remaking

54:47

it in like twenty years with a new cast, or not even

54:49

twenty years probably.

54:50

Yeah, right. Anyway, there's a million shows

54:52

that would be fine to write with AI,

54:54

and people might not notice the difference.

54:56

But maybe Grey's Anatomy, for instance.

54:58

I mean, I don't want to again, I don't want I have to. I have to

55:00

get a job when the strike is over. So I'm not going to

55:02

say anything out loud.

55:03

No, I've never watched Grey's Anatomy, so I wouldn't

55:05

be able to tell you.

55:06

But and I think, I mean, I think

55:08

that's a legitimate fear. I mean the other part of it is I

55:10

just think like, there's

55:12

something about having things created by

55:14

humans that I think is really important to

55:16

the way we consume art

55:18

and journalism and any other thing.

55:21

To you, to a lot of people, it's not

55:23

important at all.

55:23

The same people who would happily read a user-generated

55:25

piece of content on BuzzFeed.

55:27

I don't know and enjoy it.

55:28

I wonder, like I do kind of. Jay Kang had

55:30

this great line in a piece he wrote for The New Yorker a few weeks

55:32

ago where he said that, like, it's really

55:34

important to him when he reads an article that he's getting angry

55:37

at a human. And Jay is like, as

55:39

a writer I love. Like, one of the things

55:41

he does is get really mad.

55:42

At people, and he's mad

55:44

online.

55:45

Yeah, and the human component, like the fact that

55:47

there's a human to be mad at, is important to

55:49

him. And I think you can transpose that to all kinds

55:51

of emotional states that you want to be. Let's

55:54

say this is a hope more than like a thing that I

55:56

absolutely know to be true. This is but it's something

55:58

that I hope

56:00

with reason that people

56:02

will feel some kind of difference between watching

56:05

Let's pretend some far future AI could

56:07

from a single prompt create a ten-episode

56:09

season, not just the script, the whole thing.

56:12

That there's a difference between watching that and watching

56:14

something that somebody else made right even

56:17

that, like, even there's a difference between

56:19

you, Joshua, like I really want to watch a

56:21

cyberpunk adaptation of whatever

56:24

You type that into your TV

56:26

generating AI and get it that there's a difference

56:28

between that versus something that

56:31

Mike White typed into his thing. I want to write a

56:33

thing right that, even between those

56:35

two uses of AI, there's a difference that, like

56:37

so much of what we consume, there's this social basis

56:39

to it that I think hasn't quite been worked out

56:41

by places like Netflix. That they're

56:44

betting on the other half, which is, like

56:46

you say that if we can sufficiently allow

56:49

the prompt to just solve all of these

56:51

problems, then we're all set. But I

56:53

mean, man, I've been so wrong as

56:55

I try to get people to subscribe to my newsletter about the future. Let

56:57

me just say, I'm so wrong all the time.

57:00

But that's good to know.

57:01

People love to read about a guy who's predicting

57:03

the future but getting it wrong. That's one of their favorites.

57:05

Yeah. I mean, actually, that's it gives you a person to hate

57:07

online.

57:08

I feel just perhaps, I mean maybe

57:10

what saves humanity is

57:12

that we need to be mad online, but we need to be mad

57:14

online at someone

57:17

specifically. We can't just be mad

57:19

online, Like being mad online at an AI

57:21

feels like unproductive, right, just

57:24

not enough, you know. That's that's that's really

57:26

seizing the means of production, the productive

57:28

anger.

57:29

Yeah, all right.

57:30

Yeah, we got to wrap up, Max.

57:31

This was really great first

57:33

off, not surprising, but really enjoyable.

57:35

And I feel like there's about twenty things

57:38

we didn't get to.

57:38

I had.

57:39

I know that during the time that you were talking, several

57:41

times you were saying something, I'm like, I have a great,

57:44

hilarious rebuttal to this, and I didn't get

57:46

a chance to even get there because we got

57:48

it to so many other things. So you got to come back

57:50

and do this again. Yeah, and people

57:53

can find you. You're not on Twitter. You're

57:55

not on Twitter.

57:56

No, I'm on Bluesky now, though

57:58

I'm.

57:58

Not really, but nobody else is on Bluesky.

58:00

So that's why you can't promote yourself.

58:02

What is your name on Bluesky? How can people find you?

58:04

It's just it's max read dot info.

58:06

It's my it's my personal.

58:07

Lives at r O.

58:08

That's great dot info.

58:09

I'm also on Bluesky, I should say, because

58:12

that's a cool thing to tell people.

58:13

Joshua Topolsky dot

58:14

Com is mine. This is

58:16

my Bluesky handle, which is great.

58:18

You obviously have this great read Max newsletter

58:21

that you can subscribe to. It's on substack.

58:23

The U R L is Max read dot substack

58:26

dot com. And my last name is spelled R E A

58:28

D like a book.

58:29

Yeah, we know how to spell read. Do people put an E

58:31

on the end a lot of the time?

58:33

Like, is this R E E D sometimes?

58:35

Yeah, sure, I guess I don't think of that read

58:38

at all. What else? Instagram? You don't

58:40

do Instagram?

58:40

Not really?

58:41

You know, I'm a podcast No, but yeah,

58:45

I think when the higher-ups at iHeart hear

58:47

this, they're going to be like, this guy.

58:48

We got to get this guy on the air.

58:50

One can only hope.

58:51

Or a ChatGPT equivalent of this

58:53

guy. No.

58:54

I invite people. I invite people to subscribe to the

58:56

newsletter. It's free. I should mention there

58:58

is a subscription fee. You can pay for it, but I

59:01

write one free weekly column.

59:02

So I think you can get that the Nemesis Sunglass

59:05

Post for free. And I think that if you want to go

59:07

a deeper dive on nemesis, to get that, you

59:09

have to pay a few bucks.

59:10

That's exactly right, and you will after you read the Sunglass

59:12

post, you will want the deeper dive.

59:14

I do think that's a big way to kind of dangle

59:16

the subscription at people.

59:17

To get that.

59:18

The sunglass post. Anyhow, it was super fun. Thank

59:21

you for taking the time. Well,

59:28

that is our show for this week. I think obviously

59:30

I've concluded the conversation, and now

59:32

that I have nothing left, all

59:34

of my essence has been drained from me. That

59:37

sounds disgusting, actually, but I

59:39

go into a cryogenic chamber at the end

59:42

of every show and I'm there until the next show,

59:44

so I have to begin my long slumber.

59:48

We'll be back next week with more What

59:50

Future, and as always, I wish you and

59:52

your family the very best.

