The Posters Will Rise - TikTok Ban, Changing Expectations for AI

Released Thursday, 14th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

Hey everybody, it's This Week in

0:02

Google. Leo's out and I'm filling

0:04

in. We've got Jeff Jarvis as

0:06

usual and a special guest, Ed

0:08

Zitron. This week on the show, we talk

0:11

about AI and the new

0:13

groundbreaking legislation the EU passed.

0:15

We also talk about TikTok.

0:18

Will it end up being forced to shut

0:21

down or be sold? We

0:23

also talk a lot about Kara

0:25

Swisher and this great New York

0:27

Times story about how automakers are

0:29

sharing consumers' driving behavior with insurance

0:31

companies. All that and more coming

0:33

up on the show. Stay tuned.

0:38

Podcasts you love. From

0:40

people you trust. This

0:43

is Twig. This is Twig. This

0:45

Week in Google, Episode 759, recorded Wednesday, March 13th, 2024.

0:47

The posters will rise.

1:01

This episode of This Week in Google

1:04

brought to you by Melissa, the

1:06

data quality experts. Melissa has helped

1:08

over 10,000 businesses worldwide

1:10

harness accurate data with their

1:12

industry leading solutions. Processing,

1:15

get this, over 1 trillion

1:17

address, email, name and

1:19

phone records. 1 trillion. G2

1:21

in its 2024 Grid Report has once

1:24

again recognized Melissa as a leader

1:26

in data quality and address verification.

1:29

Melissa knows money is tight.

1:32

They offer free trials, sample

1:34

codes, flexible pricing with an

1:36

ROI guarantee and unlimited technical

1:39

support. All customers everywhere in

1:41

the world. When

1:43

it comes to data, Melissa treats

1:46

your data like the solid gold

1:48

it is and protects it. In fact, they are

1:50

FedRAMP authorized as an address data

1:52

resource. That means you're getting

1:54

the highest level of security for your

1:56

customer data. It also means governmental agencies

1:59

can use Melissa with confidence. It's

2:01

also GDPR and CCPA compliant.

2:03

It meets SOC 2 and

2:05

HIPAA HiTrust standards for information

2:08

security management. They got it

2:10

all covered. Improve your e-commerce

2:12

operations with Melissa. Now as

2:14

an Esri partner, Melissa's cloud-based

2:16

tools standardize, validate, verify, and

2:18

enhance data on which top

2:20

businesses rely, ensuring address data

2:22

is seamlessly location optimized. They're

2:25

constantly getting better. Oh, and

2:27

don't forget to try the

2:29

free Melissa Lookups apps. They're on

2:31

Google Play or the iOS app

2:33

store. No signup is required. Get

2:36

started today with Melissa: 1,000

2:38

records cleaned for free at

2:40

melissa.com slash twit.

2:43

That's M-E-L-I-S-S-A,

2:45

melissa.com/twit. It's

2:48

time for twig this week in Google. As

2:50

you may have guessed, I am not Leo

2:52

Laporte. In fact, I'm Para Smartenome. Leo has

2:54

abandoned us this week, fleeing the country to

2:57

the shores of Cabo. In his

2:59

stead, I have seized the means of

3:01

podcast production and God only knows what

3:04

will follow. Laughing over there is,

3:06

you know, who is not in Cabo this week?

3:08

Jeff Jarvis. I'm never anyplace funny. And

3:10

I, I for one, welcome my new

3:12

master. Thank you. He's

3:14

the director of the Tow-Knight

3:17

Center for Entrepreneurial Journalism at the

3:24

School of Journalism at the City University

3:26

of New York. It feels so powerful

3:29

to be able to cue that up.

3:31

Ed, right away, he just said,

3:33

"Christ, I'm going nowhere." Yeah. I was

3:35

about to say speaking of people

3:37

who are just right immediately on the pulse of

3:39

the show, we've got Ed Zitron

3:42

here, CEO of EZPR and

3:44

host of the Better Offline

3:46

podcast. Ed, welcome. Thank

3:49

you for having me. I will be Leo Laporte

3:51

this episode. That's

3:53

really important. We all at some point are

3:55

going to need to take turns being Leo

3:58

and you know, you'll never. You're never

4:00

going to know which one of us is

4:02

going to suddenly become an AI accelerationist and

4:04

that's really important Yeah,

4:07

that's definitely what I'll be doing Was

4:14

like I we need a we need

4:16

a contrarian Perspective to balance out

4:18

the amount of shows we've had that

4:20

are like AI should have access to

4:22

everything Yeah,

4:25

no Also, even

4:27

if it did what's it gonna do? It doesn't do anything

4:30

Great great point Well,

4:34

the information just had a story about

4:36

that About trying

4:39

to tamp down the enthusiasm because it

4:41

isn't doing enough Yeah

4:43

earlier this week my colleagues at

4:45

the information reported that Amazon

4:48

and Google are quietly trying to

4:50

tamp down expectations around generative AI

4:54

and they are Basically saying

4:56

that the hype about the technology has gotten

4:59

ahead of what it can actually do for

5:01

customers at a reasonable price That's

5:05

the sound of a balloon being blown up ready

5:07

to pop I Know

5:11

somewhere out there on the beaches of Cabo

5:13

Leo's shaking his hand going but my

5:16

you know Cool chat

5:18

GPT thing for Emacs. Well

5:21

Leo, you're not here. Paris, I'd love

5:23

to know the background of the story,

5:26

because they're trying to let air out of the balloon Right

5:30

before it gets poppy. Was

5:33

that the journalists saying? How

5:38

do I ask this question are

5:40

the companies Putting

5:44

out an active strategy of Lowering

5:47

expectations or did the

5:49

journalists kind of just hear this among

5:51

investors and among others that they should cool it

5:53

a little bit Oh, I

5:56

actually just I just read this so I'm

5:58

very excited to go found this So

6:00

a lot of it appeared

6:02

to come from direct sales calls and conversations

6:04

with internal people from what I can understand

6:06

saying that they're having to do this and

6:09

they're having to make people step back and The

6:13

weirdest one was this KPMG bought access

6:15

to Microsoft co-pilot for like 40 something

6:17

thousand people But when asked why they

6:20

were like, ah don't really

6:22

know why we're using it. But you know If

6:25

our customers ask any questions, we should

6:28

really know which is just complete

6:31

Nonsense. Like, this isn't the balloon popping. I don't even think the balloon is being deflated publicly. I think these companies are trying to quietly deflate it while keeping it publicly interesting. That's what I'm

6:44

saying Yeah, I think there's three

6:46

ways right that they're they're trying to

6:48

deflate it one is Well,

6:51

it can't do everything we said it could do Two

6:53

is well The impact won't be as profound as

6:55

we said it would be and three is the

6:57

revenue won't be as earth-shattering as we hinted It

6:59

would be Right. Yeah,

7:01

and I think the thing that people don't realize

7:04

is Media is kind of

7:06

not help with this exactly either is even

7:08

if it gets The biggest

7:10

problem isn't just that it's not doing enough.

7:12

It's that even when it gets profitable when

7:14

it helps people get profitable I should say

7:18

What if it's only doing like two to four percent more

7:21

profit? Well, how about that

7:23

because generative AI right now is

7:25

exceedingly unprofitable very very unprofitable

7:28

Lastly, all of it is flowing back to Google and Microsoft. OpenAI and Anthropic are basically in their pocket; they both agreed to their own versions of exclusive contracts with them. So by the way, this is all a system where Microsoft invested, what, ten billion dollars, to basically buy ten billion dollars of revenue. Same deal with Google and Anthropic for about three billion.

7:52

But the other thing in there was they mentioned this Klarna story, that Klarna saw forty million dollars of savings in there. That's not what Klarna said. Klarna said... there was a specific phrase, let me bring it up, because this is really important and very strange. I'd not seen this phrasing before. It was, quote, "estimated to drive a $40 million USD in profit improvement to Klarna in 2024." So by the way, that does not say profits, and it does not say savings. It says estimated, in 2024, which is the year we're currently in.

8:32

So what does "profit improvement" even mean? I've never heard of that. It means that it could theoretically save money somewhere that could contribute to the top-line profit at some point, in some form, to a company called Klarna that is already not particularly profitable, because their entire business model is based on people getting free money, and they're not making a billion dollars. They also laid off, what, six, seven hundred people, and have blamed that on AI rather than management, or perhaps on not giving everyone zero-percent loans all the time with no real credit checks, which is the same problem Affirm faces. It all feels like we're being conned.

9:19

In some ways this is interesting, because we look back at the two-thousands press: lots of it was about using VC money to buy audience that wasn't really legitimately theirs. Yes. So now they're using VC money to buy what?

9:34

Well, that's the interesting thing. They're using VC money to find a new way to make money from their customers. It's the same deal. It actually ties back to your question, because if it's Microsoft and Google, what they're doing is investing money in AI companies that then become dependent on them for their models to run. So all Microsoft and Google are doing is buying revenue; most of the ten billion that was invested in OpenAI was

10:01

cloud credits. Anthropic agreed

10:03

just before they took $3 billion from Google

10:05

to be exclusively using Google's

10:07

cloud and AI services. So in their

10:09

case, it's just they've created a new

10:11

money stream. It's kind of akin to

10:13

like the cloud boom in the 2010s.

10:16

So it's an old model, equity for revenue. Yes.

10:19

But in the startups case, I don't

10:24

know. I don't know what

10:26

these people, there were some really, like there was some very

10:28

niche like AI things are kind of cool,

10:30

but you see it in Snap. You

10:32

see it in these big type of Facebooks talking about

10:34

generative AI. What's it going to do for Facebook? My

10:38

biggest theory is that Google wants to replace Google

10:40

search with generative AI anyway, but that's a whole

10:42

other point. Yeah, no, no,

10:44

no, I wrote a whole thing about it, Jeff. You got to read

10:46

it. I

10:49

thought it was interesting in the article my

10:51

colleagues wrote, they mentioned that at

10:53

AWS, they recently gave their salespeople

10:56

kind of a reality check on

10:58

the tech. They had an analyst come

11:00

in during annual kickoff event a couple

11:02

of weeks ago and say that the industry

11:04

is at the peak of the hype cycle

11:08

and that he anticipates the hype

11:10

could veer into quote, a trough

11:12

of disillusionment in the coming years

11:14

as customers realize generative AI's limitations.

11:17

What

11:20

a downer tone for a company presentation

11:22

about selling AI. I

11:24

have this vision that at the headquarters of

11:26

the information, there's a big dial and the

11:30

BSometer. No,

11:33

but that's a really good idea for

11:35

our interior design. It is. And

11:39

so how far on the BSometer now,

11:41

I mean, does, does, because

11:44

I'd imagine, especially the information you two had, but

11:46

the larger organization of all the wealth and

11:48

resources you have, at some point you just

11:50

all have conversations saying, is this, is this

11:52

real? Is this going anywhere? I

11:55

don't know if I want to comment on that, on

11:57

that one. I got something. So it's about some Of the. Major

12:00

decisions, but. I'd

12:02

be very curious what the information has

12:04

to save face but Busan skinner of

12:06

a I have to look for to

12:08

just lessons coverage or am I mean

12:11

I can't speak to. Her any other

12:13

individual employees lot on it, but for

12:15

me it's. I think it's

12:17

I think has been quite clear on

12:19

the podcast. I think it's quite a

12:21

lot. Of hype. It reminds me of

12:23

said the in a whole medafor. Sprays

12:25

or the crypto N N F T crazes

12:28

need is just the. Same. Cycle

12:30

of the Moments and I haven't

12:32

yet really see any evidence that

12:35

this is a revolutionary technology. It's

12:37

in the way it has been

12:39

described by assuming. This is as

12:41

have the first. Moment.

12:44

where we're starting to see the eventual fall of the same? Leo's running down the beach right now to get back on the show and scream at us. Leo is pulling up a big phone and trying to dial in to tell us off. Leo, for context, has been radicalised ever since he went on a walk with an unnamed AI accelerationist who convinced him that in five to ten years AI will be everything. I'd be curious whether we can say who that person is. That was a good in-joke, I mean. I could guess.

13:16

The big difference, I think, between this and, like, the metaverse and crypto is there is a product here. The problem is that everyone's kind of sold on the idea of AI being big in the same way that, Paris, you remember the early twenty-tens, the big software-as-a-service boom? Everyone loved this new way that we could make money off of software, that you could sell seats and support. That was actually the thing that tech never really had consistently, which was a structured way to sell on a contract basis, things that had an obvious value. So as soon as AI landed on it, it was great. It feels more like the cloud boom, which was good until you realized it's the same level of hype as the metaverse, the same kind of thing, which is: AI is going to automate everything. Okay, how? Well, it will. Okay, but how will it

14:08

do that? Integrations and APIs. Awesome, great, but what does that mean? I do not know. Ah, you paid thirty-five million dollars to me, I work for McKinsey, I was just here to tell you AI is good. But the thing is, with the Valley right now, everyone likes this idea because it's a new thing everyone can use, everyone can integrate, everyone can think through how AI applies to a business they work at. Here's the thing I don't like, though. I don't know, maybe that makes sense; it's a logical thing to think about. However, when you really sit and think about the actual technical side: damn, how much of the thing you're trying to automate can be messed up? Because that's a big but; messing up things is a big part of generative AI. And then even if you work out a way this can directly help you and integrate it there, it is expensive, it is unprofitable for the service provider, and by the way, it's unreliable.

15:06

But even then, it's hard to find the boundaries, and there was a glorious Wall Street Journal article a few weeks ago where they were saying that Amazon and Google, or rather OpenAI and Google, are trying to sell their generative AI solutions and they keep running into the problem of hallucinations. And then they said, what could we do to fix this? The answer was like, yeah, we could just tell the AI not to answer if it isn't confident, which then led to another quote from someone saying, yeah, but at some point the AI would just go, I don't know, I'm sorry, I don't trust myself.

15:38

It's just so strange, because there was this insane amount of money. You know, Mark Andreessen has publicly said the money that went into AI is off the charts. But not as much as the money that went into Flow with Adam Neumann, I should say. So much money has gone into these systems, but where's the actual thing? What was the thing? Where's the thingy that I use now? Where is the essential thingy now? I don't see it. You don't think ChatGPT is the creme de la creme? I don't

16:09

know what it's for. I would love someone to tell

16:11

me how to use it. I am lazy, I don't want

16:13

to do stuff if the computer can do the thing

16:15

for me. I'd be so happy But

16:17

I never use ChatGPT for anything, you

16:20

said right I've

16:22

tried I've tried to use it for exactly one

16:24

thing I was I forget what I was trying

16:26

to look up something maybe related to niche Healthcare

16:29

industry terms for a story I was working

16:31

on and I was having a really hard time summarizing

16:33

it like a human person Right eventually asked chat GPT

16:36

a number of different questions and it kind of helped

16:38

me. I used it for the only thing I use it for, for the second time today. My best use, and I've said this on the show before, is to summarize strategy documents. I just put it all in there and it gives you two nice paragraphs.
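For the curious, the "put it all in there" workflow Jeff describes is roughly a one-call script. A minimal sketch, assuming the OpenAI Python client (v1+), an API key in the environment, and a hypothetical file name; the model string is a placeholder, not a recommendation.

```python
# Minimal sketch of summarizing a strategy document into two paragraphs.
# Assumes openai>=1.0 and OPENAI_API_KEY set in the environment;
# "strategy_memo.txt" and the model name are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("strategy_memo.txt", "r", encoding="utf-8") as f:
    memo = f.read()

response = client.chat.completions.create(
    model="gpt-4",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system",
         "content": "Summarize the following document in two short paragraphs."},
        {"role": "user", "content": memo},
    ],
)

print(response.choices[0].message.content)
```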

16:54

You know, I think your point is something here: part of the presumption in that Information story is that so many people see AI as a method to efficiency. That's the mistake, because that presumes that this general machine can do

17:08

things we do, right? Somewhere in the rundown I put up Zach Seward, who's the head of AI in the newsroom at the Times now. He did a really good presentation at South by Southwest, which I haven't been to for years and had no FOMO all these years, but I would have liked to hear that one. He put it up, and it's all about finding anomalies in data. That's a journalistic way to look at it: you know, analysis of data, and finding anomalies and patterns.
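A hedged sketch of what that anomaly-finding workflow can look like in practice: flag rows in a table that sit far outside the normal range so a reporter can check them by hand. The file name and column are hypothetical; pandas is the only dependency.

```python
# Sketch of the "find anomalies in data" use described above: surface rows whose
# values fall well outside the interquartile range, then leave judgment to a human.
# "city_contracts.csv" and "award_amount" are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("city_contracts.csv")

col = "award_amount"
q1, q3 = df[col].quantile([0.25, 0.75])
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = df[(df[col] < low) | (df[col] > high)]

# The script only nominates candidates; a reporter still has to verify each one.
print(outliers.sort_values(col, ascending=False).head(20))
```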

17:35

Now, it seems to work pretty well, because you're giving it to yourself for judgment. But replacing jobs? No. Efficiency? So you can write more PowerPoints? That's not efficient, it's spamming the world. I don't know where

17:50

it comes from. I quite agree, and I would love it to help with my day, my PR firm job. There's a lot of spreadsheets, a lot of boring documents, lots of anodyne copy. You'd think this would be what these things were for, but every

18:02

time I ask it to do stuff, it gives me possibly

18:05

the most half-assed work I've ever seen.

18:09

Just like you have all of, like

18:11

you're throwing several zoos, a hundred

18:14

cats and a few trees in there

18:16

just for this query, and you can't

18:18

even give me like a filled-in spreadsheet?

18:20

I can't upload a spreadsheet and have

18:22

you tell me what's in it? I

18:24

thought this was the very basic thing this could do.
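One workaround for the spreadsheet complaint, at least as a sketch: profile the file locally and paste the compact summary into the prompt instead of the raw sheet. The file name is hypothetical; it assumes pandas (plus openpyxl for .xlsx files).

```python
# Sketch: build a short, pasteable profile of a spreadsheet instead of uploading it.
# "client_report.xlsx" is a hypothetical file; requires pandas and openpyxl.
import pandas as pd

df = pd.read_excel("client_report.xlsx")

summary = "\n".join([
    f"{len(df)} rows x {len(df.columns)} columns",
    "Columns: " + ", ".join(f"{name} ({dtype})"
                            for name, dtype in df.dtypes.astype(str).items()),
    "Numeric overview:",
    df.describe().round(2).to_string(),
])

print(summary)  # short enough to drop into a chat prompt by hand
```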

18:29

It's not quite as infuriating as crypto, where there was nothing.

18:31

Yes, yes, yes. Absolutely nothing, same with

18:33

the metaverse. And I grew up writing

18:35

about MMORPGs. NFTs as

18:37

well. This

18:40

thing does something, but at the same time,

18:42

despite everyone losing their

18:45

nut over Sam Altman

18:47

conning the nonprofit

18:49

board into reinstating him with

18:51

several other Bebop-and-Rocksteady-style

18:54

cronies, everyone lost their proverbial

18:56

over that. But this company has yet

18:59

to prove itself. I

19:01

saw a journalist say, it's so beautiful, by the

19:03

way, on the night that Sam Altman was brought

19:05

back. I felt sick to my stomach. Oh my God!

19:08

It's disgusting. It's disgraceful to be

19:10

like that, especially for like a

19:12

goblin like Sam Altman. A

19:15

nasty little man. I'm sorry,

19:17

Sam. Ugh. Nasty. Redditors,

19:20

they made nothing off the IPO, nothing off the

19:22

Reddit IPO. Sam Altman could have made a couple

19:24

hundred million dollars. It's nice to see the good

19:26

guys win, huh? Of course.

19:29

I think part of the issue is that when

19:31

people are talking about AI, they describe it, people

19:34

much like our absent

19:37

host, Leo, describe it as

19:39

a thinking, almost feeling

19:41

machine. They humanize it in a

19:44

way and project intelligence onto it

19:46

in a way that is just

19:48

completely untethered from

19:50

reality. Paris, I use

19:53

this phrase today. I've written a new preface, or beginning, for my book, The Gutenberg Parenthesis, on sale now in paperback as well, so I had to write about that. And I called it the literate machine. It's more in our reaction to it than the reality, but we see it as the literate machine. Oh,

20:12

it can, it can, we can speak to it and it can speak to us. And

20:15

that's what's so shocking to humans

20:17

and media is just that it

20:21

uses our language, not always well, not or

20:23

Lord knows not accurately, but, but

20:26

it doesn't know anything. It's not, I agree.

20:29

It doesn't know what it's saying. Yeah, I agree. Just

20:32

to be clear, I'm agreeing with you. Like it's, it

20:35

doesn't know anything. It's why everyone's like, Oh,

20:37

well, the future versions of Sora video generator

20:39

will look so much better. No they won't; they're not going to look better. Why? Because of the hallucinations, which are a feature, not a bug. These things don't know anything. You can't tell it that a monkey has... four legs, sorry, two arms, two legs; I'm just regular stupid, I'm meant to be able to learn things. It's not making any claims about being intelligent here, and I don't think anyone makes that mistake, but it's going to keep

21:06

making these errors because even if it makes

21:08

it, and this is the thing that's helpful

21:11

with the AI generated video, especially it's, we

21:13

don't realize how perfect the world around

21:15

us is and that the world around

21:17

us is we interpret it through basic

21:19

semiotics, the science of signs. And

21:22

we know what a thing like, we know what a

21:24

monkey looks like. If a monkey had three horns, we'd

21:26

be like, what the hell? That's a strange looking monkey.

21:28

But we'd know it was a monkey based on some

21:30

of the features. But when

21:32

you get down to basic things like

21:34

how a human being walks, how something

21:36

we know moves, and you try and

21:38

do that and you make even little

21:40

mistakes, people know that people aren't stupid.

21:42

And I actually think that that's a

21:44

big thing that generative AI people aren't

21:46

realizing: people are not dumb. They're going to see it, they'll see it doesn't do

21:51

anything. And this is also one of

21:53

the problems I have with the claim

21:55

that Sora is going to replace

21:57

filmmaking and generative AI is going

21:59

to. completely wipe out

22:02

the filmmaking industry as we know it. Filmmaking

22:05

is a... To be a director, there's a very

22:07

high bar. Directors of

22:10

major movies are particular when it

22:12

comes to the shots that they

22:14

have. You're not going to

22:16

be able to describe

22:18

a very specific shot

22:21

as well as get the sort of

22:24

performance from your AI actors with the

22:26

same ease you would just

22:28

standing in the room with two humans that

22:30

you have hired for a specific role. It's

22:34

going to be more complicated. No, but I think TikTok,

22:36

imagine TikTok with Sora, right? There's

22:38

stuff that in

22:40

lay hands can

22:42

do more than it could. I think that

22:44

all of this, it's not a business, but I think

22:46

if it had all been put out there as a creativity machine.

22:50

And that's all it does is make stuff up

22:53

and you can make stuff up with it and it's going to be good

22:55

or bad and you can have fun with it. That's

22:57

cool. No problem with that. I think there's

23:00

a lot there, but it's this efficiency

23:02

machine that's going to replace search

23:05

and replace people with jobs and

23:07

be smarter than us is such... Yes, I

23:09

got to be on the Dan Le Batard podcast

23:12

this morning. French Ambassador. Does

23:14

that mean you've done now three podcasts today?

23:16

In one day. In one day,

23:18

yes. I am. I did four last

23:20

Monday. I've been doing the promo tour for this

23:22

damn show. Anybody can answer.

23:25

So I use language there that I'm not allowed to use here,

23:28

but they started asking, well, what do you

23:30

know? I had all the big

23:32

swinging Richard of the

23:35

AI boys. I quite like that. Yeah. Yeah.

23:40

And I think the reason that

23:42

I think that Google will go through with

23:45

this is not because it's a good

23:47

idea, but because I think that

23:49

Google wants to... I

23:52

think Google has made a big mess. They made a big

23:54

stinky. They've turned the web into

23:56

a big hole with their allowance of SEO.

23:59

And the thing is... Starting aside, AI spam, by the

24:01

way, Google is at fault because they are the ones

24:03

that have catered to the freaks, the SEO industry, and

24:05

the people in charge of the media are the ones

24:08

that have been trying to dance to the Google song. I

24:12

think Google will do a generative bot that

24:14

turns them into a form of ISP. I

24:16

think they want Google search to be much

24:18

more controlled. I think they want it to

24:20

chew up and regurgitate the internet. To be

24:22

clear, I think this is a horrible product,

24:24

but I think that Google and Sundar Pichai

24:26

in particular have just become

24:28

so scummy and so

24:30

actively abusive of the people on

24:32

the internet that just I think that

24:35

this is their eventual endgame. I don't think it's good.

24:37

I don't think it's efficient. I don't

24:39

think it's even right most of the time. But

24:41

guess what? Google needs more money.

24:44

Google must show growth. The rot economy

24:46

must survive. We must always have more

24:48

growth in tech, even if it's unsustainable,

24:50

even if it's horrifying for human beings

24:52

and indeed the human capital working at

24:54

these companies. Well, that's

24:56

depressing. Yeah. I

24:59

mean, the world is a vampire. I

25:01

mean, it's part of what you get when you

25:03

have to have companies that are always showing quarterly

25:05

growth. That's right. Speaking

25:09

of AI,

25:12

today European lawmakers approved what's

25:14

being called the world's most

25:16

comprehensive legislation yet on AI, setting

25:18

out sweeping rules for developers of

25:20

AI systems and new restrictions on

25:22

how the tech can be used. It's

25:25

called the AI Act, and the rules

25:27

are set to take effect gradually over

25:29

the next several years and apply to

25:31

all AI products sold in the EU market,

25:33

regardless of where they were developed. Some

25:36

of the notable parts of this include

25:39

their prohibitions in the legislation, including

25:42

bans on the use of emotion recognition

25:44

AI in schools and in the workplace,

25:47

and on untargeted scraping of

25:49

images for facial recognition databases.

25:53

The new rules will eventually require providers

25:55

of general-purpose AI models like ChatGPT

25:57

to have up-to-date

26:00

technical documentation and publish a summary

26:03

of the content used to train the model.

26:07

And makers of the most powerful AI models,

26:09

which are those the EU has deemed to

26:11

have kind of systemic risk, will be required

26:13

to put those models through state-of-the-art

26:17

safety evaluations and notify regulators

26:19

of any serious incidents

26:21

that occur with the models and

26:23

implement, I guess, mitigations for

26:26

potential risks or cybersecurity

26:28

protection. What

26:30

do you guys think about this? I think the EU

26:32

is the only force that could stop the tech industry.

26:35

I don't know, I haven't read the fundamental

26:37

of the bill, but I do think regulation

26:39

is necessary here. And I think in

26:41

particular they need to start saying how these

26:43

models are trained and what they're trained on,

26:45

because it's pretty hard to make

26:48

these things forget. You can't really do it, you just have

26:51

to revert the training data. And I think

26:54

that, I'm surprised Sam Altman hasn't

26:56

created the anti-Butlerian

26:58

Jihad Europe for

27:00

this, because this is anything that

27:02

makes OpenAI responsible for their

27:04

training data is fatal for that company.

27:07

There's a reason that Mark Andreessen was freaking

27:10

out about the idea of AI companies having

27:12

to abide by copyright. Same

27:14

reason that OpenAI is so freaked out by them, why

27:16

they're paying people, because if there's anything

27:18

that sets precedent here, they're screwed,

27:22

you can't unring the

27:24

bell of training, they will have to probably

27:26

start again, because they can't really

27:28

prove that it's not there anymore.

27:30

And if you look at Sora, a great deal

27:32

of it looks very very similar to a lot

27:34

of Shutterstock stuff too. So, I

27:38

think that, I don't know if they have a

27:41

relationship with them, but I'm pretty sure that all

27:43

of these models are based on some form of

27:45

plagiarism. The EU rocks, I love that they're doing

27:48

stuff like this, it's better than the

27:50

nothing we're getting over here. So

27:53

I'll disagree. I'm

27:56

finally that, I've finally summoned

27:58

someone on this. podcast who agrees

28:01

with me. I think

28:03

you want to go first? You go first. No,

28:06

you go. I'll hear your rebuttal. All right. All

28:08

right. I think the legislation is better than it

28:10

could have been. They were talking about outlawing

28:13

open source, which would be disastrous. It

28:15

would be regulatory capture. They were talking

28:17

about, I think

28:20

that the reason OpenAI is not screaming is

28:22

because it is regulatory capture. The big companies

28:24

with the big money will be able to

28:26

deal with this legislation and regulation and newcomers

28:31

won't. And so that's why you

28:33

always hear Microsoft just

28:35

saying, regulate us. Yes, please. And that's

28:39

what Zuckerberg says too. And that's what

28:42

OpenAI says as well. So I

28:44

think there's issues there. I also think

28:46

that Europe constantly says, we're ahead on

28:48

regulation, but nothing else. And then they

28:50

come back around and they whine, why

28:52

don't we have our innovative companies? Because

28:55

they don't invest, they don't do it. And

28:57

they regulate reflexively. The problem, I think, in

28:59

the end, I'm fine. We have some regulations.

29:01

The regulation is not bad. But

29:03

I also put it on the rundown. He says begrudgingly.

29:06

Yes, it's a bit lower. That's my view. But

29:08

I don't think it's going to do much. And

29:11

I think in its later stages, it's a

29:14

very flimsy negligee

29:17

on a, I'll stop that metaphor.

29:19

Jeff immediately retreats. It's his third

29:22

podcast of the day and it's

29:24

gotten too much for

29:32

him. So I put up

29:34

this thing from AI Snake Oil, the Arvind

29:38

Narayanan and Sayash Kapoor, which

29:40

makes the argument that I've

29:42

been on the show before,

29:44

that guardrails are impossible at

29:46

the model level. They say

29:48

that AI safety is not a

29:51

characteristic of models. It's a general

29:53

interest machine. It's like saying Gutenberg,

29:55

you're responsible for everything coming off

29:57

the press. It's like expecting Microsoft.

29:59

to tell us as we're using

30:01

word, no, you can't type that.

30:03

I'm programmed to not let you type that. You can't do that.

30:06

Obviously absurd. And so

30:08

putting the responsibility level in the model

30:12

isn't going to be effective. And so

30:15

it's a veil that makes people

30:17

think we're doing something and we're

30:19

gonna protect ourselves from this dangerous

30:21

technology. The problem I have with

30:23

it is that. No, we should be honest

30:25

about it that you cannot protect

30:27

yourself from this technology and the bad things

30:29

that people will try to do with it.

30:32

Now deal with that. The thing

30:34

I think that is kind of interesting about

30:36

this legislation is

30:38

it's kind of tiered. Originally,

30:41

I believe it was set

30:44

to exclusively focus on developers

30:46

of what they call high-risk

30:49

AI systems, which is

30:51

kind of a catch-all term that I

30:54

guess, I'm gonna break this

30:56

down from a really interesting website.

30:59

I think the EU put together

31:01

called artificialintelligenceact.eu that has kind of

31:03

what looks like, it's almost an AI-generated

31:05

summary of the act itself, but I'm

31:07

guessing it probably isn't. But in

31:09

their prohibited AI systems list, you've

31:11

got a couple of tiers. One is prohibited AI systems,

31:14

where it's like AI can't be used for these sort

31:16

of things, which is compiling

31:18

facial recognition databases, inferring

31:20

emotions, social scoring,

31:23

stuff like that. But

31:25

then most of

31:27

what this act is

31:29

targeting is developers of

31:31

so-called high-risk AI systems,

31:33

which is if

31:36

they profile individuals or include

31:38

automated processing of personal data

31:41

to assess various aspects of

31:43

a person's life, like work performance,

31:45

economic situation, or health, or

31:47

if it is somehow, I guess,

31:49

related to EU law, if

31:52

it is like the government is using something related to

31:54

the system, which I think is kind of an important

31:56

distinction. The

32:00

vast majority of these regulations

32:02

are focused on these type

32:05

of AI, like developers

32:07

and companies, which is a very specific

32:09

subset. For like the general AI systems,

32:12

like a chat GPT or something, that

32:15

is the regulations being

32:18

imposed here are of a much lower

32:20

standard. It's like they've got

32:22

to provide technical documentation and

32:25

whatnot. And for free and open-license AI models, and I know Leo often talks about how this is going to squash free, open-license AI models, they only need

32:40

to comply with copyright restrictions

32:42

and publish the training data summary, which

32:45

I think is fair. Also

32:47

I think at this point you should realize,

32:50

Leo, you're not here so of course my

32:52

perfect favorite argument where the person cannot respond.

32:55

But very basic thing here is if you're

32:57

worried about little AI models, you should already

33:00

have realized they've already failed

33:02

their screwed. These massive deals between

33:04

open AI and and

33:06

Tropic and the major cloud providers and

33:09

the fact that they can afford to buy

33:11

these massive data sets, $60 million of Reddit,

33:13

for example, with Google. The

33:15

war's already been lost. The little models can never

33:17

train at the scale. They

33:19

never will be able to. But they do

33:22

need to. Well kind

33:24

of. The question has to be addressed. The

33:26

argument is that that was all big swinging

33:29

Richard. You didn't need to have the biggest

33:31

models. That's Sam Altman who's saying

33:33

that. Sam Altman is the one pushing back

33:35

now saying, oh yeah, you don't need big

33:37

models anymore. You need a small model. Yann

33:39

LeCun says the same thing. He

33:43

is just one is

33:45

just an extremely annoying character and

33:47

I disagree. Nevertheless, putting that aside,

33:50

otherwise I'll just spend the whole time getting mad at them. It's

33:53

just. Wait a second. Wait, wait, wait.

33:55

Who's the most irritating person in AI? I want

33:57

to hear each of you answer that. Oh,

34:00

what's his name? The

34:05

one that said there would be a Bitcoin virus. Could

34:09

be any number of people. I guess my

34:12

answer is Elon Musk, even though I'd argue

34:14

that he's not in AI. That's a cheat.

34:17

It's an easy answer. It's

34:20

the easiest answer and just because Grok, sometimes

34:24

I'll just be alone at

34:26

home thinking, should be thinking about normal things

34:28

related to my life. And I'll just remember

34:31

the existence of Grok AI. Does

34:33

it exist? Here's a question. Does Grok really exist? Yes. Yeah?

34:38

Grok exists on X slash rate my new

34:40

stop is, the Elon Musk website. And it's

34:42

great because every time you see a screenshot

34:44

of it burning someone, like trying to roast

34:46

them, it does the same thing. It says,

34:48

oh boy, where do I start? It's so

34:50

bad. That's how you

34:52

know it's a sick burn. That's how

34:55

you know it's written by Elon Musk. Yeah.

34:58

It's just like, this is extremely,

35:00

this is a terrible insult. I'm

35:02

going to say, oh boy, where

35:04

do I start here? Yeah.

35:07

So I'm trying to find this

35:09

guy because he was a less wrong guy and he

35:12

pops up. He's like an AI. God damn

35:14

it. I'm going to be thinking about this for a while.

35:16

I think Yann's a pretty good choice, but I think Sam

35:18

Altman. I truly think

35:20

Sam Altman is the most annoying because he gets

35:23

away with it. He also does not sound eloquent.

35:25

I don't know if you heard Sam Altman speak,

35:27

but for the smartest guy in the bloody room,

35:29

he sounds pretty dumb. I

35:32

don't even mean like Dollyn

35:34

A just not a good public speaker thing.

35:37

He's just doesn't, oh, he

35:39

kind of seems like ChatGPT. Just kind of

35:42

mediocre. He's like, yeah, we'll get big in

35:44

like four years. And then everyone at CNBC.com immediately just copy-pastes the transcript and publishes it.

35:51

And it's just, it

35:53

frustrates me. Sam Altman frustrates me a great

35:56

deal. He's fallen. That man has

35:58

failed upwards like seven times. I

36:01

mean it is still mind-boggling

36:03

to me that he was ousted

36:06

from the board, ends up getting

36:08

reinstated and accumulates even more power.

36:11

We still don't know why. We still do

36:13

not know why. No, we don't. Nope. We

36:16

haven't. It could be to do with his sister. We'll

36:18

never know. Could be to do with any number of

36:21

things. Well hey, maybe it was the fact that he

36:23

wanted to turn the non-profit wing of OpenAI into a

36:25

profitable thing. And you put Larry

36:27

Summers on the board, so no worries there.

36:32

Yeah, this week Bloomberg reported that...

36:36

Oh sorry, Eliezer Yudkowsky. Eliezer Yudkowsky. Eliezer. He's

36:41

very annoying. Because

36:44

not only that, he is the worst, worst,

36:46

worst of the doomers. In

36:49

the Journal of Moral Panic, otherwise known

36:51

as Time Magazine, that's

36:53

where he writes his screeds. Hey, moral panic, drink.

36:56

Drink. I

36:59

think Gutenberg deserves a beer and Moral

37:01

Panic a wine, is my view. Or

37:03

vodka. Go to Diet Coke. So

37:07

Yudowski is the one who's out there screaming

37:09

that paperclips will kill us all. Yeah. And

37:12

that he's the worst of the doomers. He is

37:14

the worst of the doomers. He's

37:17

in the New Yorker story with 99.9%

37:20

certainty that it's going to destroy us and we have to stop

37:23

it. And he's just... He would love that. He's

37:25

a big fan of

37:27

Roko's Basilisk, also known

37:30

as the Weenies version of Pascal's

37:32

Wager. What

37:34

is Roko's Basilisk? Okay, this is

37:37

beyond our podcast's scope here. So

37:39

Roko's Basilisk is this idea that

37:41

we need to start building or

37:44

working on or pleasing a theoretical

37:46

machine god before it comes into

37:48

existence. Otherwise it

37:50

will punish us if we do not

37:52

do so. It is such a dumbass story.

37:54

It's exactly the kind of dumb

37:57

guy's intellectual exercise. Like, what

37:59

if... computer was scary.

38:01

What if the computer was mad at me? Oh

38:04

no, I'm so smart for thinking of this.

38:07

It's a recursive intellectual exercise. I do

38:09

think it's quite bold to reinvent

38:12

religion in the year

38:15

of our glory. It's

38:18

the same thing. Hmm. Roko's Basilisk is controversial for a few reasons.

38:25

It relies on a lot of speculation about the

38:27

nature of future AI and its motivations as if

38:29

it has any motivations. The idea

38:31

of AI punishing people in the past

38:33

for not helping create it is seen

38:35

by many as illogical. Yeah,

38:38

yeah, yeah. But it's

38:40

exactly what I'm traveling to. I

38:42

didn't know that. Wait, I'm traveling

38:44

now? I'm into it. It sounds like travel. That's

38:46

fun. What a big test for

38:49

you. Yeah, the time travel into the tea.

38:51

Oh God. Oh, that's really smart. I

38:55

think we need to add more letters

38:57

into TESCREAL. I think we should

38:59

make it more complicated, ideally. Yeah, my

39:01

challenge while Leo's gone is to get TESCREAL

39:03

into the title of the show today. That'll really

39:05

Oh, he would fly back from Cabo to stop

39:07

it and the TESCREALs. Yeah,

39:10

that. So wait, wait,

39:12

wait, wait. So, so Yudkowsky. I didn't

39:14

know this. He's also the co-founder of LessWrong.

39:17

Yes. What is that? Great place, if you

39:19

want to find a bunch of libertarian guys barely

39:22

covering up their racism and

39:25

talking about subjects with less articulation

39:28

than a Redditor or a Stack

39:30

Overflow member and

39:33

a little less anger than

39:35

the average Hacker News user. It's a

39:37

very useless place to go if you

39:39

want to meet a bunch of people that you'll never want

39:41

to meet in real life. Wow.

39:44

Yeah. On the Wikipedia

39:46

page for LessWrong.

39:49

The first subhead under

39:51

history is Roko's Basilisk.

39:53

The biggest thing they've

39:55

created is a

39:57

version of Pascal's Wager. What's

40:02

great is you go on there now,

40:04

let's read some of these titles. Meta-Honesty:

40:06

Firming Up Honesty Around Its Edge-Cases. Oh my god,

40:08

go outside! These

40:11

people have never touched grass in their lives.

40:15

There's another one that says no one in my

40:17

org puts money into their pension. Notes

40:19

from the prompt factory, there is

40:22

way too much serendipity. Okay.

40:26

Jesus. Here's the problem.

40:30

These people are getting huge amounts of money. They're

40:34

using money on the major

40:36

college campuses to do clubs and fellowships

40:38

and classes around this crap. They're

40:42

getting the ear of legislators

40:47

and they have the ear of media. And they're

40:49

idiots! Honestly

40:53

it's like the Jay Rosen scam, but for

40:56

AI it's kind of cool. Just

40:58

like you do. Yeah,

41:00

I was going to say, you're talking to

41:02

a... You've got a Jay Rosen adjacent

41:04

here. And

41:08

why you... Jay

41:10

Rosen is at least an academic. At

41:13

least he has academic cred. At

41:16

least he has spoken to reporters and knows

41:18

them. These people are just like, yeah, thought

41:21

about it really hard, wouldn't it be scary

41:23

if the computer did this? And everyone goes,

41:25

oh, god damn, holy crap. What

41:28

if the computer was smart? But if you look

41:30

at the wider media, they're buying a not much

41:33

more sophisticated story from Sam Altman. Sam

41:35

Altman's like, yes, it can automate everything. It

41:37

could be super smart and automate all

41:39

the stuff. He's about that vague. I

41:43

love these... They're

41:45

just wonderful. I want t-shirts for every

41:47

one of them. How

41:49

could I have thought that faster? The

41:52

parable of the fallen pendulum. What

41:55

could a policy banning AGI look like?

41:57

There isn't any AGI, it's not going

41:59

to happen. give it up. Notes

42:01

from a prompt factory. Oh my

42:04

lord, the serendipity one is a beautiful

42:06

one. Yeah, there's one about the COVID-19

42:08

pandemic of course because these guys these

42:11

guys love to think about stuff and

42:13

get really close to saying something racist.

42:16

Highlights from Lex Friedman's interview

42:18

of Yann LeCun. Wow.

42:24

What a... How to have

42:26

polygenically screened children. Lex

42:28

Friedman is so funny though.

42:31

Lex Friedman is

42:33

awful. Because he buys all the

42:35

BS. Remind me who Lex Friedman

42:37

is. So imagine if

42:39

you will, a very pallid boring

42:42

man that nevertheless has become one of

42:44

the most successful tech podcasters. I will

42:46

depose him. He is

42:48

an anime. Oh that's true, I have seen you

42:50

tweeting about better offline rising in the ranks. You've

42:53

got to take over Lex's spot. I will take

42:55

him on by doing a good job unlike him.

42:57

He does these like two three hour long interviews

42:59

with people. He got Elon Musk, he got

43:01

Jeff Bezos and he gets, like, Pichai of course.

43:04

I was gonna say, how many hosts of the All-In podcast do you think you could take in a fight? How many members, or hosts? How many hosts, I guess. Could you take Chamath? He's a...

43:23

Oh absolutely. Those guys you know those guys make

43:25

some like depressing looking food. But Lex Friedman what

43:27

he does is he does these three hour long

43:29

interviews that are the most... He

43:31

talks like this the entire time.

43:34

He is also just one of the least articulate

43:36

men to ever walk the earth. I want to

43:38

read... Why do you like him?

43:41

Like I don't know why people like him. I

43:43

don't know why people like him at all but

43:45

let me read you this question. The transcript of

43:47

the question he asked Jeff Bezos and

43:50

this is the exact way he said it. You went

43:52

to Princeton with aspirations to be

43:54

a theoretical physicist. What attracted you to

43:56

physics and why did you change your

43:58

mind and not become... Why why

44:00

you're not Jeff Bezos the famous

44:02

theoretical physicist. Oh

44:05

no brother man

44:07

brother man and to be clear

44:09

he's reading this question. Oh, yeah,

44:11

yeah, slowly slowly slow. Is

44:14

he's a member of the slow

44:16

talkers of

44:21

America. Maybe he's popular. Maybe

44:25

comes from the culture of people listening to

44:27

podcast and like 1.5 speed. I

44:31

think it's that and yeah, you could probably listen to

44:33

this like 8 speed and get these beans flow. So

44:35

that makes you sound smarter. I

44:38

think he's speaking slowly because that's how fast the

44:40

information comes out. Let

44:42

me read you the description of his

44:45

YouTube channel. Lex Friedman

44:47

conversations about science, technology, history, philosophy, and the nature of intelligence, consciousness, love, and

44:54

power. All right,

44:56

did he

44:59

have a young man. We're gonna

45:01

have to go to an ad break. Let's say

45:03

your last Lex Friedman

45:05

thoughts in. Any final thoughts,

45:07

guys. Michael play 30

45:09

seconds of him interviewing Tucker Carlson.

45:11

No, I don't think we can

45:13

do that. Actually, just just to

45:15

tie this up. He does these

45:18

very long-winded things where he gives

45:20

platforms to incredulous freaks like Tucker Carlson. He lets Elon Musk go, oh yes, well, there's very, very big problems with the immigration, and the works, and

45:32

he's so dull, but he's also a

45:35

he's basically a right-wing guy. He's

45:37

basically just another right-wing tech guy and

45:39

that is a problem. The two of

45:41

the biggest podcasts are center-right to right-

45:43

wing. That sucks. That's bad. Anyway enjoy

45:45

the ad break. Enjoy the ad.

45:48

this episode of this week in

45:50

Google brought to you by Rocket money.

45:53

By the way, thank you very much, Paris Martineau,

45:55

for filling in for me while I'm on vacation. I did

45:58

want to come back and tell you about Rocket Money

46:00

because they have saved me enough to pay

46:02

for this vacation. Come to think of it,

46:05

did you know nearly 75% of people

46:07

have subscriptions they have forgotten

46:09

about. Things

46:12

like fitness apps and streaming services and

46:14

delivery services. When I started using

46:16

Rocket Money, I couldn't believe how

46:18

much I was paying for that I'd completely forgotten

46:20

about. I found campaign

46:22

contributions from 2022 that I was

46:25

still paying monthly and I cancelled

46:27

it with Rocket Money. I

46:30

found just the other day a

46:32

WordPress subscription I'd forgotten about,

46:34

got $300 back thanks to Rocket Money.

46:37

Rocket Money saves me money on

46:40

all those subscriptions I've completely forgotten

46:42

about. It's a personal finance app

46:44

that finds and cancels your unwanted

46:46

subscriptions but it does a whole

46:48

lot more. Rocket Money also monitors your spending, helps

46:50

you lower your bills, you can grow savings. It's

46:53

really an easy to use app. I can see

46:55

all my subscriptions in one place and if I

46:57

see something I don't want, Rocket

46:59

Money can help me cancel it with

47:01

a few taps. They'll deal with

47:04

customer service for you. Awesome!

47:07

Rocket Money, more than 5 million

47:09

users, it saved a total of

47:11

half a billion dollars in cancelled

47:13

subscriptions. On average it

47:15

saves members $740 a year when you use

47:17

all the app's features. It

47:19

saved me more than that. I love Rocket Money. Stop

47:22

wasting money on things you don't use. Cancel

47:24

your unwanted subscriptions by

47:27

going to

47:29

rocketmoney.com/twig. That's

47:31

rocketmoney.com/ twig. You got

47:33

to try this thing. It's awesome. rocketmoney.com

47:38

slash twig. Now back

47:40

to This Week in Google. Paris, you're doing a great

47:42

job. Thanks for filling in for me. Thanks

47:46

Leo. This

47:49

morning the House approved a bill

47:51

to force TikTok to either

47:54

cut ties with ByteDance or be banned in

47:56

the US. The bill is

47:58

now moving to the Senate, where its prospects

48:00

are less clear. Its

48:03

passage to the House was crazy fast.

48:05

It was only introduced last week and

48:08

passed by a House panel on a

48:10

unanimous vote. TikTok

48:12

has tried to lobby against the bill, asking

48:14

users whenever you use the app, you've probably

48:16

seen the pop-up come up, it's

48:19

asking users to call their representative to

48:21

protest. Apparently that's already

48:23

alienated some Congress people, who

48:25

would have guessed. Biden has said that

48:28

he'll sign it if it reaches his desk, it's

48:30

unclear if that will happen. If

48:33

it gets there, implementation will likely be delayed

48:35

because TikTok's expected to file suit to block

48:38

it. Meanwhile, China's also expected

48:40

to block ByteDance's attempts to

48:42

sell TikTok if it gets to this. Jeff,

48:46

I think before the show, you were saying you're

48:48

gonna get mad about this, so get mad, yeah.

48:52

I actually heard someone on Morning Joe this morning say,

48:57

well, yeah, it was

48:59

Mikie Sherrill from New Jersey, who

49:02

said, TikTok is

49:04

against free speech, so what are we gonna do? We're gonna

49:06

take away the free speech. The

49:09

House logic here is just awful. It's

49:11

moral panic, drink, everybody, meeting

49:14

political nihilism. It

49:18

is ridiculous, as Morning Joe goes on,

49:20

about all the Chinese propaganda, I don't

49:22

see any Chinese propaganda there, and give

49:24

some credit to citizens that in a

49:26

democracy, we can hear it and laugh

49:29

at it and get rid of it.

49:31

We're not sheeple. And

49:33

a lot of people and a lot of

49:35

voices who otherwise are not heard in big

49:37

old white mass media now have their stage

49:39

and they're gonna take it away, and so

49:41

it pisses me off majorly.

49:44

Could we get the moral panic clip to

49:46

play, folks? Nice.

49:50

Oh, no. I got a bad feeling about this. Where's the audio?

49:54

There we go. Thank you. It's a moral panic,

49:56

everybody. Cue the moral panic. One

50:00

of the most interesting things about this is, I

50:02

mean, this isn't the first time TikTok has,

50:06

the US has tried to ban TikTok,

50:08

but apparently TikTok was blindsided

50:11

by the bill. The

50:13

Wall Street Journal reported that just two

50:15

weeks ago executives from TikTok's US operations

50:18

flew to their company's headquarters

50:20

in Singapore to meet with

50:22

their bosses and tell them everything was A-OK

50:24

with the app and that they totally weren't

50:27

in danger of being banned. Obviously

50:30

a big miscalculation. Inside

50:33

TikTok, some leaders were apparently aware that

50:35

lawmakers were working on legislation, but they

50:38

didn't expect it would win so much

50:40

support so quickly. Some of the people

50:42

familiar with the matter told the Wall

50:44

Street Journal. What

50:46

do you think young voters are going to say about this? They're

50:49

not going to understand the nuances of it and they're going

50:51

to get mad they can't watch TikTok. I

50:54

mean, yeah, they're probably going to blame the

50:56

Biden administration and be quite pissed. I

51:00

thought it was interesting. My colleague, Kaya

51:02

Yurieff, who writes our

51:05

Creator Economy newsletter, was at South by

51:07

Southwest this week and kind

51:09

of did a little scene report from,

51:12

you know, the various panels

51:14

with creators. Apparently, creators are totally

51:17

unfazed with this. A

51:20

TikToker named Remy had

51:23

said at a panel, a lot of creators

51:25

are probably ignoring this right now because we've

51:27

heard it so many times and it's possible

51:30

that it could happen. Which I mean,

51:32

I think that that's probably the approach that most

51:34

users in the app are taking. Like this is

51:36

not the first time. It's happened many times over

51:38

the last four years that TikTok, the, oh, TikTok

51:41

is going to be banned, panicked. I

51:43

don't know. Do you think that they actually have

51:46

the political clout to do this this time? It

51:49

seems unlikely that the Senate would approve this, but

51:51

I also wouldn't have expected the House to unanimously push

51:54

this through. I think there's momentum behind

51:56

it. I'm not a policy guy, but

51:58

it cleared the House surprisingly fast, it popped

52:01

up surprisingly fast, I

52:03

just don't know if any of the old people

52:05

in the Senate really care. I think they'll be

52:07

like, I don't like China. So damn,

52:12

sure. Who gives a damn? I'd

52:14

be against child porn, be against China

52:16

on that list. I mean, being

52:19

against child porn is the unanimously agreed

52:21

upon thing I would hope. But the

52:24

China one is kind of

52:26

like, that one is beginning to get, I

52:28

mean, the Tom Cotton part of those hearings, where he

52:30

was just repeatedly racist.

52:33

It's that one guy who was very much

52:35

not, wasn't he like, I'm from Singapore. What

52:38

are you talking about? Yeah, the ByteDance

52:40

executive who he was trying to accuse of

52:43

being a Chinese guy. What

52:46

are you talking about? Where are you from?

52:48

Where are you really from? Where are you

52:50

really from? I think the

52:52

journalists should have gone all race purity

52:55

on Tom Cotton, they

53:00

should have like traced his background back to another

53:03

country that wasn't America just to mess with

53:05

him, because that is the same thing. It's

53:07

disgraceful that that happened. And this bill was

53:09

kind of disgraceful, too. It doesn't seem to

53:11

actually be fixing any problems. It seems to

53:13

be setting up standards to

53:15

punish other apps. While also,

53:18

yeah, if the CCP actually has access

53:20

to TikTok data, that's incredibly dangerous. Like I

53:22

think the people need to realize

53:25

that. But for

53:27

the average citizen, how dangerous is it? I

53:31

don't think the harm, where's the danger? I

53:33

don't think it's the danger now. It's the danger that could

53:35

be there. But at the same time,

53:37

when they come in, they'd been really, but

53:39

that's kind of like, it is a good

53:41

point. Like, what can they do with that

53:43

data? What does that data say? Now, the

53:45

example given by one of the witnesses in

53:48

the ByteDance hearings was that the CCP used

53:50

it to chase down dissidents in Hong Kong.

53:53

And that is disgraceful, disgusting, and

53:56

could be bad. Like, I mean, doesn't

53:58

China have... Every Chinese student

54:00

I've ever had is extremely savvy

54:03

about this and very careful. It

54:05

doesn't matter what the platform is. It doesn't matter anything else.

54:07

And they're not going to come on and say, I'm going

54:09

to criticize the regime on TikTok

54:11

and think I'm safe. They know better. Does

54:13

that mean that we shouldn't protect them? Yeah.

54:16

But I'm saying that there is presumption

54:18

that that's how the Chinese government is

54:20

going to use it is naive. But

54:23

it's how they've been potentially using it

54:25

already. Well, we know we've had back

54:27

in the day, one of the first horrible things that

54:29

happened is Yahoo turned in a

54:33

dissident. And he went to jail for

54:33

many, many years. I

54:36

just think that this is not the

54:38

way to do it, but something has to be done. But

54:40

I am not aware of what that

54:42

would be. And the speed at which this has

54:44

been pushed through and the way it's being pushed

54:47

through is harmful for America more than it is

54:49

harmful for China or TikTok. And

54:51

also Koebler over at 404 Media

54:53

had an interesting take on it today. I

54:56

mean, yeah, the goat. That

54:58

the US, the headline of the article was the

55:00

US wants to ban TikTok for the sins of

55:02

every social media company. And I

55:04

think that that is a really relevant point here

55:07

is that we wouldn't be in this place where

55:09

we are today with this

55:12

really coalesced animus towards TikTok if

55:14

it wasn't incredibly politically expedient for

55:16

politicians on both sides of the

55:18

aisle here in the US to

55:21

be rallying their forces against big

55:23

tech generally. I think we see

55:25

this with every of these nonsense

55:27

social media hearings, one of which

55:30

our very own Jeff Jarvis participated

55:32

in it. I

55:35

don't really know how we go back from this.

55:38

I also think that perhaps I don't know. Here's

55:40

an idea. So there's an executive

55:42

order that was going out that was going to

55:44

restrict the flow of sensitive data through intermediaries like

55:46

data brokers to foreign countries

55:48

like China. America's fine,

55:50

though. No need to

55:53

protect Americans data from American companies.

55:55

That's what's really bothering me about

55:58

this. There are things that

56:00

TikTok has done, or might

56:02

do, or theoretically could do, that

56:05

American companies do today. Yeah,

56:07

and it's not just us. Canada,

56:10

Canada wants to stop the data from coming

56:12

here. The EU wants to stop data

56:14

from coming here, for exactly what you're saying. That

56:16

is because, well, America's the best, us. And

56:19

it's just like, how about Meta?

56:21

Why didn't we make Mark Zuckerberg

56:23

divest from Meta? Before his...

56:25

or after, you know, when the Cambridge

56:28

Analytica scandal happened. Why was that

56:30

allowed? Don't get me wrong, occasionally

56:32

they get hit with a bunch of BS in that regard. I mean,

56:35

yes. Facebook has... even when

56:37

you put that aside, Facebook has stopped

56:39

providing a service. Facebook has... Facebook

56:42

has taken advantage of the people using the

56:44

platform. It's done a lot

56:46

of that. The end result: Meta has to be

56:49

salivating at the thought of TikTok being banned.

56:51

So this is not a bad outcome for

56:53

Meta, then. Oh no, it's a precedent that I

56:55

don't think they can be very happy about either.

56:57

You know? And, I mean, Meta has,

57:00

has had, like... Their

57:02

current focus is trying to get

57:04

users to spend a

57:06

lot more time on Instagram Reels, which is really

57:08

a TikTok clone, and they'd want TikTok

57:10

out of the way. That's true. It's

57:13

just, it's frustrating because this kind

57:15

of shows that there's not real

57:17

concern about, like, the actual

57:19

abuses, the things that

57:21

most social networks do. And

57:26

it sucks that the actual problem isn't...

57:28

that the problem is "China." Just

57:30

kind of vague xenophobia. The same

57:32

stuff Trump

57:34

took office with. Racism. Xenophobia. Just

57:36

call it what it is. There

57:38

is a degree here of just this

57:40

sheen of, ooh,

57:43

the scary China. It's...

57:45

it's disgraceful. And also it

57:47

isn't protecting anyone. No one is safer

57:49

because of this. All it's

57:51

going to do is allow Bobby goddamn

57:54

Kotick, who oversaw the destruction

57:56

of Activision Blizzard, who oversaw

57:59

multiple allegations of sexual harassment, a

58:01

horrifying culture at Activision. Now he's saying,

58:03

oh, I might buy TikTok. That's

58:05

a better situation. You want to trust that greasy

58:08

pervert? These

58:11

are our saviors. It just frustrates, all of this

58:13

frustrates me because it isn't being done

58:15

for anyone other than a vague

58:18

vibe of political goodness or

58:20

social, not even social, just

58:22

economic goodness. Yeah.

58:24

The Wall Street Journal reported earlier

58:27

this week that Kotick, the former

58:29

CEO of Activision, is

58:32

looking for partners at a dinner at an

58:34

Allen & Co. conference earlier this week. So,

58:36

as the Journal writes, this is where all these things keep

58:38

happening. Kotick floated the idea of partnering to

58:40

buy TikTok to a table of people that

58:43

included OpenAI CEO Sam

58:45

Altman. Right. OpenAI could

58:47

use TikTok to help train its AI

58:49

models if a partner such as Kotick

58:51

could raise the capital for such an

58:54

acquisition. It

58:56

all seems to converge. Right. They're

58:58

not going to sell. Of

59:01

course. They're not going to sell.

59:03

And I mean, if this someone's passing the

59:05

Senate and ends up on Biden's desk where

59:07

he signs that we're going to have a

59:09

protracted legal battle before

59:12

it ever gets to a place where someone could buy

59:14

it. Yeah, about the first effing amendment. Yeah.

59:17

This is a matter of freedom of expression

59:19

in this country. I

59:22

guess so. What's the matter in that? Is

59:24

that, is it a matter of freedom of expression in

59:27

the company? Oh, I absolutely think so. Not

59:29

a bite dance, but of all the people,

59:31

all the citizens on TikTok

59:34

whose speech is cut off. There

59:37

is a class action, I think, there. It

59:39

said that you're taking away our platform. Our

59:42

means to choose to a private enterprise.

59:46

Well, no different.

59:49

I mean, I don't believe the

59:51

Internet is a medium, but now

59:53

putting that aside, it'd be no

59:55

different from saying that Simon &

59:58

Schuster is outlawed by the

1:00:00

government. The authors, at that point,

1:00:02

have a cause of action.

1:00:05

Their platform for speaking is gone.

1:00:07

Take away WordPress. Take

1:00:10

away, ah,

1:00:15

an open source platform. It's...

1:00:18

it's a matter of expression, and

1:00:20

it's weird that we don't treat it that way. We

1:00:22

respect books, we don't respect TV, we're, we're

1:00:24

still not sure how to treat this. I,

1:00:26

I, I am not a lawyer.

1:00:29

Sure, none of us are, you know. They're

1:00:31

the ones who have to regulate, vote for things, I

1:00:34

said, and they screw this up. And

1:00:36

your comment earlier, Ed,

1:00:39

calling him a greasy pervert, is protected

1:00:41

under free speech, by the way.

1:00:43

If Kotick wants to call back with his

1:00:45

opinion: email me, Bobby. We'll

1:00:47

get, we'll get you on the

1:00:49

podcast anyway. And I

1:00:53

saw a whole thing posted about whether

1:00:55

restricting individual speech by

1:00:57

banning a platform counts. I

1:00:59

honestly would be fascinated if there was a

1:01:02

class action around this whole

1:01:05

event or concept. It's

1:01:07

also interesting that my colleagues at

1:01:10

The Information reported yesterday that,

1:01:12

ah, unlike all the

1:01:14

previous times that TikTok has

1:01:16

been threatened with a ban, the

1:01:18

investors aren't coming to its defense,

1:01:21

according to the reporting. Sources

1:01:23

said that few of the

1:01:25

ByteDance investors that previously lobbied for

1:01:27

TikTok are doing it now, and reportedly

1:01:29

said that they don't see any reason to

1:01:31

speak out on such a polarizing issue. Or

1:01:34

that they could be perceived as

1:01:36

disloyal to the US if they

1:01:39

come out in support of TikTok. Or they see

1:01:41

the Senate stuff as performative:

1:01:43

the House wanted to get theirs, yeah,

1:01:45

yeah, but the Senate's not gonna pass

1:01:47

it, and so don't worry about it.

1:01:50

the it seems like they it's

1:01:52

a couple there's a couple different

1:01:54

parts are one is they're worried

1:01:56

about wasting, like, venture capitalists seem

1:01:58

to be worried about wasting their political

1:02:00

clout and want to pick their

1:02:02

battles because they say that they

1:02:05

have upcoming battles

1:02:08

around AI and defense tech that they

1:02:10

think are really important and want to

1:02:13

save their political capital to engage

1:02:15

with. But they've also said,

1:02:17

and this is quoting from the

1:02:19

article, that after

1:02:22

four years of an on and off discussion about

1:02:24

the ban, there's a growing chance that TikTok will

1:02:26

be forced to withdraw from the US. They noted

1:02:28

that a US ban wouldn't end TikTok. The

1:02:31

app could still operate outside the country,

1:02:33

although its ability to make money would

1:02:35

be sorely diminished given the US is

1:02:37

the world's biggest ad market. And some

1:02:39

VCs even play down the financial impact

1:02:41

that TikTok has on ByteDance,

1:02:43

arguing that the Chinese tech company

1:02:46

might in some ways be healthier without it. I

1:02:50

think it's very interesting that this seems

1:02:52

to be a moment where some of

1:02:55

the company's biggest supporters are also bowing

1:02:57

to pressure. I

1:02:59

was a bit surprised by this sudden pivot too.

1:03:03

The internet is very unpopular right now. I

1:03:06

mean it is, but also I think

1:03:08

that the venture capital firms might be

1:03:10

a little bit scared of the government

1:03:14

because if perhaps they went and looked

1:03:16

at how much money they put into

1:03:18

things like crypto or metaverse

1:03:20

or how much money they put into

1:03:22

just completely non-existent, I think like

1:03:24

a weight loss startup got $450 million, $500 million in 2021. There's

1:03:29

a lot of things that VCs did that

1:03:31

definitely flaunted very basic due

1:03:33

diligence. Look at FTX for example. And

1:03:37

I think that they're scared of the exposure. I don't blame

1:03:39

them. I wouldn't want it. And

1:03:42

also... Yeah, especially after the huge boom time

1:03:44

we're just coming off where anything kind of

1:03:46

goes, it is now crunch time to

1:03:48

say the least. That's not a sympathetic

1:03:51

character. It's a multi-billion

1:03:53

dollar monstrosity

1:03:56

algorithm machine. And Mark Zuckerberg hates

1:03:58

it because he was never... able

1:04:00

to create an algorithm this good. But

1:04:02

everybody here likes it. Of

1:04:05

course, I'm not even, I'm not saying it's bad,

1:04:07

I'm just saying the perspective that people are

1:04:09

coming from. I find it upsetting.

1:04:11

It makes me feel 100 years old. I load

1:04:13

it up. I was just saying, do you use

1:04:15

TikTok? I assume you don't make

1:04:17

TikToks, but do you watch them? I

1:04:20

can't understand it. I

1:04:22

use it and I watch. No,

1:04:24

it really is. I'm getting old. I've never been

1:04:26

good at video either, but also

1:04:28

there's something that gives me anxiety that the

1:04:31

feed never ends, that there's always more. I

1:04:33

mean, that's every website now though. No, but

1:04:35

it's worth saying, isn't it? But

1:04:38

it's also, I'm being fed stuff that I didn't

1:04:40

ask for, which also really upsets me. I

1:04:43

don't like the algorithm interfering as much as

1:04:45

it does, or at least I like it

1:04:47

to be a little less overt with it.

1:04:49

Instagram is bordering on unusable for the

1:04:51

same reason. TikTok is just hyper aggressive with

1:04:53

it. I kind of respect the fact that they're so honest with

1:04:56

it. It's just like, yeah, we control this. You're going to see

1:04:58

whatever we want. We want you, little pig. Jess,

1:05:00

you use, you watch TikTok through

1:05:02

the app, right? Yeah. I'm

1:05:05

curious if you've had the same experience. I

1:05:07

feel like the main point of TikTok is

1:05:09

they hyper curate your feed to be everything

1:05:11

you'd want and more. But

1:05:13

I've noticed the last maybe dozen times

1:05:16

I've opened TikTok for the past couple

1:05:18

of weeks, my feed is bananas. It

1:05:20

is completely random. It is like all

1:05:22

preferences they've previously had for me have gone

1:05:25

out the window. I'm not seeing any more

1:05:27

content about the guy with the eel pit

1:05:29

or the woman who is legally digging

1:05:31

tunnels, which are my two favorite TikTok

1:05:33

subjects. Suddenly, I'm just

1:05:35

seeing random stuff out there. There are still

1:05:37

cat videos usually, which is what's keeping me

1:05:40

going back, or strange musical content. Have you

1:05:42

noticed any change to your home? My

1:05:45

theory, yes, I have, but it's

1:05:47

sporadic. I find that like every

1:05:49

third day, boy, it's feeling

1:05:52

weird today. And the next day it's stuff I

1:05:54

like. I think they try stuff

1:05:56

out on you to see what

1:05:58

happens. they what I've

1:06:01

never managed to get it dialed in it's

1:06:04

never engaged me I've never been like I must

1:06:06

watch this but also I just don't consume that

1:06:08

much video like I'm not even saying it's a bad

1:06:10

app I don't like video that much I don't

1:06:13

I am similar I don't watch YouTube

1:06:15

videos ever is my famous

1:06:17

statement in the pie I watch no YouTube

1:06:20

videos or like 12-minute long ones hmm

1:06:24

and but it's just this whole bill I think we

1:06:26

can wrap it up by saying who's

1:06:28

it help? Who's this actually protecting? Who

1:06:31

does this bill benefit? Yeah. Yeah.

1:06:34

I mean, I think that it helps the centralization

1:06:36

of power in the hands of a few guys, white

1:06:38

guys within the tech industry yeah

1:06:41

white homogeneity pushed

1:06:43

at scale it's going to make rich

1:06:45

men even richer it's going to centralize

1:06:47

the same tools by the way the

1:06:49

same things are basically done

1:06:52

using American social networks look at what Jason

1:06:54

Kobler is tweeting right now and

1:06:56

it sucks it sucks because if there

1:06:59

was a real problem I'd love to see it and

1:07:03

we should be supporting open source,

1:07:05

we should be supporting Mastodon,

1:07:07

and pushing blue sky and even threads to

1:07:09

open up blue sky just opened

1:07:12

up its its moderation structure good name

1:07:14

ozone I like that I love blue

1:07:16

sky brilliant I love you too I

1:07:18

like it a lot I haven't been

1:07:20

posting there as much I've been tweeting

1:07:23

a lot lately getting

1:07:25

back into it but I've got to start blue

1:07:28

sky blue sky threads the

1:07:30

problem is, people I like are on Threads, but

1:07:33

I don't like Threads as much. It is

1:07:35

a putrid place it is like it's

1:07:37

like someone created LinkedIn After

1:07:39

Dark, which is not that

1:07:41

fun. That sounds like someplace Jason

1:07:43

Calacanis would be. No, he

1:07:45

goes on Twitter and

1:07:48

he goes you know what racism is bad

1:07:50

but if we thought about why and it

1:07:52

would just be like a horrible thread where

1:07:54

he's platforming someone very racist. He did

1:07:56

a whole thing about Alex Jones where he

1:07:58

was like why don't a good But

1:08:00

why don't I agree? And

1:08:02

he says this, I'm just asking the

1:08:04

questions thing, this vile right-wing thing, like

1:08:06

a classic right-wing trope. And

1:08:09

he's like, I wasn't saying I even

1:08:11

want Alex Jones back, but now he's

1:08:13

coming back. Isn't that

1:08:15

interesting? It's just, it's

1:08:18

very frustrating, all of that stuff. Completely different

1:08:20

subject, I realize. Switching gears

1:08:22

a little bit, there was a really interesting story from

1:08:24

Kashmir Hill in the New York Times this week. Yes,

1:08:26

I'm glad you're doing this. Sort of his thought on

1:08:29

how the auto industry is

1:08:31

collecting reams of data about consumers'

1:08:34

driving habits and sharing it with insurance

1:08:36

companies. The opening

1:08:38

anecdote in the story follows a guy

1:08:41

named Ken Dahl, who drives a leased

1:08:43

Chevrolet Bolt and says, he's a careful

1:08:45

driver. He's never been

1:08:47

responsible for an accident. And Dahl

1:08:50

was surprised in 2022 when

1:08:52

the cost of his car insurance skyrocketed by

1:08:54

21%. Quotes

1:08:56

from other insurance companies were also high,

1:08:58

and one insurance agent told him

1:09:01

that his LexisNexis report was a

1:09:03

factor. LexisNexis is this

1:09:05

New York based kind of global data broker.

1:09:08

They catered to the auto insurance industry

1:09:10

with this one kind of division they

1:09:12

have, and has traditionally kept tabs on

1:09:15

car accidents and tickets and like helps

1:09:17

determine potential insurance rates from that. Dahl

1:09:20

wanted to know what was causing his

1:09:22

insurance increase, so he requested info from

1:09:24

LexisNexis under the Fair Credit Reporting Act. LexisNexis

1:09:27

sent him a 28 page, no 258

1:09:29

page, sorry, 258 page consumer disclosure report, and

1:09:36

what it contained stunned him from

1:09:38

the Times. More than 130

1:09:40

pages detailing each time he or his

1:09:42

wife had driven their Bolt over the

1:09:44

previous six months. It included the dates

1:09:47

of 640 trips,

1:09:49

their start and end times, the

1:09:51

distance driven, and an accounting

1:09:54

of any speeding, hard braking, or sharp accelerations.

1:09:56

The only thing it didn't have was where

1:09:58

they had driven their car.
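[For concreteness, a minimal hypothetical sketch of what one of the 640 trip records described above might contain. The field names here are invented for illustration only; this is not GM's or LexisNexis's actual schema, just the fields the report is said to include: dates, start and end times, distance, and counts of speeding, hard braking, and sharp acceleration.]

```python
# Hypothetical trip record, based only on the fields described in the report.
# These names are illustrative; they are not GM's or LexisNexis's real schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TripRecord:
    start: datetime                  # trip start time
    end: datetime                    # trip end time
    distance_miles: float            # distance driven
    speeding_events: int             # count of speeding incidents
    hard_braking_events: int         # count of hard-braking incidents
    sharp_acceleration_events: int   # count of sharp accelerations
    # Notably, per the report: no location or route data.

example = TripRecord(
    start=datetime(2023, 6, 1, 8, 15),
    end=datetime(2023, 6, 1, 8, 42),
    distance_miles=11.3,
    speeding_events=0,
    hard_braking_events=1,
    sharp_acceleration_events=0,
)
```

And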

1:10:01

so according to this report, the trip details

1:10:03

had been provided to LexisNexis by

1:10:05

GM, the manufacturer of the

1:10:07

Chevy Bolt. And a

1:10:09

bunch of insurance companies had requested

1:10:11

information about Dahl from LexisNexis over

1:10:13

the previous month. A quote

1:10:16

from Dahl said, it felt like a

1:10:18

betrayal. They're taking information that I didn't realize

1:10:21

was going to be shared and screwing with

1:10:23

our insurance. And

1:10:25

I think kind of the money quote in this story is, automakers

1:10:30

and data brokers that have partnered to

1:10:32

collect detailed driving data from millions of

1:10:35

Americans say that they have driver's permission

1:10:37

to do so. But

1:10:39

the existence of these partnerships is nearly

1:10:41

invisible to drivers. This consent is obtained

1:10:43

in fine print and murky privacy policies

1:10:46

that few read. As

1:10:48

someone who doesn't drive, I feel it's inappropriate

1:10:50

for me to react to this story. So

1:10:52

I'm turning it over to the two, I

1:10:54

assume, drivers on this podcast

1:10:56

today. Ed? I

1:10:59

mean, welcome to the machine. I mean, this

1:11:02

has been going on since Metromile came

1:11:04

in, since Progressive, over

1:11:06

a decade ago, started doing an OBD thing:

1:11:09

you plug it into your car and it would lower

1:11:11

your insurance. They've been collecting this data forever.

1:11:13

How do you think the insurance rates have

1:11:17

changed over the years? They've changed because they've been collecting

1:11:19

this data. It's I

1:11:22

assume that Tesla with their insurance does

1:11:24

the same thing. They're fairly

1:11:26

sure they do something like this. And

1:11:29

guess what? This is the

1:11:32

future. This is why having a car

1:11:34

full of stuff, full of tech, full

1:11:36

of apps is actually a really bad

1:11:38

thing. I don't like that much information

1:11:41

out there. Yep, there it is. I

1:11:44

hope. See this, so this, by the way,

1:11:46

this feels like something that the government and

1:11:48

the house and Congress need to get their

1:11:50

up in arms about. Basic

1:11:52

transparency in data collection and its

1:11:55

use. That is a this

1:11:57

story should be bigger than TikTok because

1:11:59

this is. affecting more Americans right now

1:12:01

today. The cost of car insurance is

1:12:03

an incredible burden on the average man. I

1:12:06

have a beautiful Volvo XC40 electric car.

1:12:08

It's got mechanical doors, technical

1:12:11

doors, they open, they close, it goes forwards, backwards, side

1:12:13

to side, it's an amazing vehicle. Does it

1:12:15

have screens? Does it... can you, like,

1:12:17

have an app? It also has a beautiful physical

1:12:21

knob. Fun little side, I met

1:12:23

Steve Wozniak the other day and he was talking

1:12:25

to me about how electric cars suck. They all

1:12:27

have screens, right? And how he hates these new cars

1:12:29

because they've got big screens and everything's got a

1:12:31

screen. Mercedes has got a screen. And

1:12:34

I think that that's part of the problem. But the

1:12:36

other problem is how much of that data, because a

1:12:38

lot of them have like sensors about driver awareness, for

1:12:40

example. Driver how engaged you are with

1:12:42

the road. These are the real, these are

1:12:45

the things that are going to really screw people. Don't

1:12:47

be scared of China coming into the TikTok app. Be

1:12:49

scared of the American

1:12:51

manufacturers that are currently tracking

1:12:53

how much you are seeing

1:12:55

the road. How engaged

1:12:58

you feel. These are the scary

1:13:00

things. Especially because these tools aren't

1:13:02

always accurate. I know with the

1:13:04

driver awareness tool, when I

1:13:06

reported on Amazon, they had kind of installed

1:13:08

these in the cabs of some of their

1:13:10

trucks and, you know,

1:13:13

driver vans. And there was kind

1:13:15

of massive wave of complaints from

1:13:17

drivers, not because they were like,

1:13:19

oh, we want to not pay

1:13:21

attention to the road, because they're like,

1:13:23

I'll be sitting there driving, paying attention

1:13:25

to the road. And it will ding

1:13:27

me five times for inattention, perhaps because,

1:13:30

you know, the person driving is

1:13:33

Asian or just not white.

1:13:35

And the camera can't properly

1:13:37

attune itself to their eyes.

1:13:39

This has been a slippery

1:13:41

slope. Well, this is the smart

1:13:43

car. It's

1:13:46

probably a middle school car. It's what,

1:13:48

10 years old. So probably

1:13:50

doesn't have all of this. No, I

1:13:53

had to spend six hundred dollars to get Android auto. What

1:13:57

is that? Android Auto, same as your Apple

1:14:00

CarPlay. I can look at the map on

1:14:02

the car's screen. It has a screen. First,

1:14:07

let's underscore, as we like to say in

1:14:09

journalism, great reporting

1:14:11

by Kashmir Hill. This is

1:14:13

the kind of privacy story. Instead of doing the moral

1:14:15

panic about the internet,

1:14:19

it's taking all of our data and I don't

1:14:22

know my data and it's non-specific enough to the

1:14:24

point. This is specific. It

1:14:26

is a violation of your

1:14:28

own space and what

1:14:31

you do. Point one. Point

1:14:33

two, none of this required the internet. This

1:14:36

is about evil data

1:14:38

brokers and evil insurance companies.

1:14:41

And the data is going to be out

1:14:43

there. What's needed for legislators, as

1:14:45

you say Ed, what's absolutely needed now is

1:14:47

for them to come up with and what they can do

1:14:49

is they can forbid the use of this data. You

1:14:52

can't, it's going to be collected for

1:14:54

various reasons, for various things. Okay, whatever.

1:14:56

You can forbid it being collected. You

1:14:59

can forbid it being used for insurance.

1:15:01

That's what legislators are for. Or

1:15:03

even at the bare minimum mandate

1:15:05

that they disclose, it's being collected and

1:15:07

give you the opportunity to opt out. But

1:15:10

also to your point earlier,

1:15:13

Paris, like, remember

1:15:15

Kinect, Microsoft's camera, like

1:15:18

the thing for the Xbox, it couldn't see black

1:15:20

people. The

1:15:22

panopticon systems that many contractors

1:15:26

use, the ones that they use to

1:15:28

tell if you're actually watching the screen,

1:15:30

they continually see things like dreadlocks

1:15:33

or even just the hair of African-American

1:15:35

people as things that suggest they're not

1:15:37

there. These systems

1:15:40

are biased. They are racially biased. And

1:15:42

that's before you get to the obvious,

1:15:44

like this is just invasive. These systems

1:15:47

are biased against people like many algorithmic

1:15:49

systems, like the mortgage system that was

1:15:52

something like 80%. There

1:15:54

was an insane amount of bias in the mortgage system. Using

1:15:58

datasets to make calls like this is...

1:16:00

insane and by the way I'm correct

1:16:02

Tesla does have a safe driving system

1:16:04

for their insurance they give you a

1:16:06

score kind of similar to Metro mile

1:16:08

these companies want to do this and they're

1:16:10

doing it because guess what they can just

1:16:12

they can now hyper justify how

1:16:15

they judge you but guess what if these

1:16:17

are the systems then to be clear there

1:16:19

is a separate thing here there is the

1:16:21

I'm tracking where you're going and what you're

1:16:23

doing that is separate to

1:16:25

racial discrimination and the engagement

1:16:28

systems those I don't know if

1:16:30

they're using that data but just

1:16:32

a very basic driving here that

1:16:35

is going to have a class issue as well

1:16:37

you're driving through rougher neighborhoods which

1:16:39

as judged by an insurance company

1:16:41

if you're driving in such a way

1:16:44

that suggests your profile is now

1:16:46

risky which is racially biased these

1:16:48

systems are going to hurt so many people

1:16:50

and they already are and

1:16:53

yet here we are chasing tick I

1:16:55

just right right right right it's just

1:16:57

like this is a thing hurting people

1:16:59

today. Kashmir does a great job at

1:17:01

this. Kashmir's done great work

1:17:03

on data brokers, companies like Spokeo and

1:17:05

Intelius. These companies

1:17:09

are actually... Acxiom, not Axios, Acxiom,

1:17:11

and LexisNexis, the business one. All they have,

1:17:13

who knows.

1:17:17

but we got these companies these companies

1:17:19

are actual scum they're actually dangerous to

1:17:22

Americans they are brokers you can pay

1:17:24

someone 25 bucks a month to

1:17:26

look up everything they can find out your

1:17:28

phone number your address in many cases sounds

1:17:30

like a moral panic it actually really isn't

1:17:32

companies like Abine have made good money

1:17:35

deleting that information from the internet but

1:17:37

there's only so much you can do these

1:17:39

are things used to target somebody. Data

1:17:41

brokers should be illegal, but

1:17:44

here's the other issue well

1:17:47

two things. One, data brokers go way back. When

1:17:50

I worked at Time, we used data brokers like

1:17:52

crazy and that's how we did subscription models and

1:17:54

all this kind of stuff I used to scare

1:17:56

students and say let me go

1:17:58

to Acxiom, not Axios. And I'd

1:18:01

say, let me find out the names and

1:18:03

addresses of, just to be really creepy, women

1:18:06

between the ages of 21 and

1:18:08

35 who live within n

1:18:10

miles of this place who have a

1:18:12

college degree and a car. And

1:18:15

it'll give it to me. Far more than the

1:18:17

internet ever does. Far more. But

1:18:20

here's the other issue. I just said we need laws, we need

1:18:22

legislators. But of course, government is the worst

1:18:24

enemy of privacy. Because what's going to happen is,

1:18:26

fine, this data doesn't tell the insurance company where

1:18:29

you went. But it's discoverable

1:18:32

and certain states are going to come in and find

1:18:34

out whether or not you went to an abortion clinic.

1:18:37

And the data's there. And

1:18:39

that's the real issue. Exactly.

1:18:42

And so, there

1:18:44

needs to be transparency. There needs to be the

1:18:46

right to erase it. Again, it's not the internet.

1:18:48

It's your car and the insurance company and your

1:18:51

state's government that are the real problems here. Yeah,

1:18:53

the internet can do stuff too. But it's more

1:18:55

anonymized there. It's not in your personal space. This

1:18:59

is about,

1:19:02

I wrote a book about privacy called Public Parts. And

1:19:05

so, I learned about all this

1:19:07

stuff. Wow, a rare drink opportunity.

1:19:09

We never get Public Parts. Yeah,

1:19:11

Public Parts. Private Parts is Howard

1:19:13

Stern. I

1:19:18

dedicated it to Howard Stern as a result because

1:19:20

he was part of the title. There

1:19:25

is no right to privacy in the Constitution.

1:19:28

It was invented by case law

1:19:30

over time. It's something that,

1:19:33

and I believe in the value of publicness and being

1:19:35

out there and talking, I believe in the value of

1:19:37

privacy. We've got to protect it. But

1:19:39

it's a... And

1:19:43

the problem I have is that when

1:19:45

you have crap like Shoshana Zuboff and

1:19:49

surveillance capitalism, it

1:19:51

ruins it because it just goes overboard with, I'm

1:19:54

going to say it again, moral panic in a

1:19:56

way that's not really about harm. It's not really

1:19:58

about the real problems. It's a distraction

1:20:01

from this kind of stuff. Don't pay attention to

1:20:03

Zuboff; pay attention to Kashmir Hill. Yeah,

1:20:07

Karl Bode wrote

1:20:09

a really good breakdown of Kashmir's

1:20:12

article in Techdirt, which

1:20:14

all I'll quote from a little bit here Again

1:20:18

like countless past scandals This is the

1:20:20

direct byproduct of a country that has

1:20:22

proven too corrupt to pass even a

1:20:24

baseline privacy law for the internet era

1:20:26

too corrupt to regulate data brokers, and

1:20:29

obsessed with steadily defanging, defunding, understaffing,

1:20:32

and curtailing the authority of

1:20:34

regulators tasked with overseeing corporations

1:20:36

the broad and reckless disdain for US

1:20:38

consumer privacy and safety Yeah,

1:20:42

I do think he has a point. Karl

1:20:44

rocks. Top bloke,

1:20:47

Karl. Absolutely nuts, that fella.

1:20:49

He's absolutely ruthless. He's one of the

1:20:51

few people who was on Elon Musk early.

1:20:54

He really, he was pushing very hard. Of

1:20:58

course, not just him, I want to be clear:

1:21:00

Laura Kolodny was very early on the

1:21:02

Elon beat. Absolutely ruthless, big up Laura.

1:21:04

Yep, but he's done an excellent job

1:21:06

of really

1:21:08

getting to the heart of how angry you should be. I

1:21:12

used to... and I feel like this

1:21:14

thing is, a lot of what

1:21:16

the tech industry does that leads to things like

1:21:18

this is they turn everything into a very complex

1:21:20

shell. They make it sound like magic. It goes back

1:21:22

to the AI thing, it goes back to TikTok,

1:21:24

all of this. Not many senators could really

1:21:27

explain to you why TikTok's bad. They're just

1:21:29

like ah China Right, right.

1:21:31

I know China's scary. They're in my phone This

1:21:34

thing with the cars sounds

1:21:36

complex, was designed in a complex way, but

1:21:39

is fairly simple: they are invading your privacy.

1:21:41

And I feel like there should

1:21:44

be someone in Congress, the House, dare

1:21:46

I say even the president, who

1:21:48

could just say these things, like

1:21:50

Karl Bode does. Karl is actually very

1:21:53

cleanly spoken and furious about the stuff as he

1:21:55

should be And I feel like more

1:21:57

people in tech should be I'm

1:22:00

angry every time I read about the stuff I get

1:22:02

angry because you see the world burning around you and

1:22:05

you see The fire

1:22:07

engines all rolling up to a house that isn't

1:22:09

on fire And

1:22:11

it's just yeah, but who wants to go after

1:22:13

you're going after in this case the

1:22:17

auto industry the insurance industry and

1:22:20

the data broker industry which all politicians

1:22:22

use for all of their direct mail

1:22:24

and course advertising

1:22:29

and They say well, they're Alex. They can. Yes.

1:22:31

Well, they're gonna be Well,

1:22:33

that sucks on a on another positive

1:22:36

note. I think that we should use this opportunity

1:22:38

to go to an ad break Jeff

1:22:41

Jarvis at Zitron everybody and

1:22:44

now you're gonna go to an ad

1:22:46

break You

1:22:51

need parts O'Reilly auto parts has

1:22:53

parts need them fast We've

1:22:56

got fast no matter what you

1:22:58

need We have thousands of professional

1:23:00

parts people doing their part to

1:23:02

make sure you have it product

1:23:04

availability Just one part that makes

1:23:07

O'Reilly stand apart the professional parts

1:23:09

people Um,

1:23:18

I want to talk next about a great

1:23:21

article that the New York Times

1:23:23

did on taking a deep look

1:23:25

into Musk's Charitable

1:23:29

foundation and I want to also

1:23:31

throw it to Ed for kind of an overview

1:23:33

of this a little bit because I know you

1:23:35

Just did a podcast episode about this where you

1:23:37

interviewed One of

1:23:40

the authors of it, David Fahrenthold.

1:23:42

I guess can you tell me a little bit about the article and

1:23:44

like what I got was like, yeah

1:23:46

absolute absolute legend David he jumped on

1:23:48

the phone the day after the article

1:23:51

went out What a legend so

1:23:53

long story short, Elon Musk put about

1:23:55

five, six billion dollars of Tesla shares,

1:23:57

to be clear, he didn't buy them,

1:24:00

just shares he had, in the Musk

1:24:02

Foundation, a non-profit. Now part of the rules

1:24:04

of doing this is you have to spend

1:24:06

5% of it every

1:24:08

year to qualify for the $2

1:24:10

billion tax break

1:24:13

he got. He regularly fails to do

1:24:15

so. He regularly fails to invest enough

1:24:17

money to do so.
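[A rough back-of-the-envelope sketch of the 5% payout rule as described here, using the roughly five to six billion dollars in shares mentioned above rather than actual filings; on those assumed numbers the required distribution works out to a few hundred million dollars a year.]

```python
# Back-of-the-envelope sketch of the minimum-distribution rule described above.
# The asset value is the rough "$5-6 billion" figure mentioned on the show,
# not a number taken from the foundation's actual filings.
donated_shares_value = 5.5e9   # assumed midpoint of the $5-6B in Tesla shares
minimum_payout_rate = 0.05     # foundations must give away roughly 5% of assets per year

required_annual_giving = donated_shares_value * minimum_payout_rate
print(f"Required annual giving: ${required_annual_giving:,.0f}")  # about $275,000,000
```

But he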

1:24:21

also, unlike, say, Larry Page's foundation, he

1:24:23

doesn't dump it in a donor advised

1:24:25

fund, which is kind of a black

1:24:27

box for giving grants to people. No,

1:24:29

he has no staff, he

1:24:32

has two unpaid volunteers, and he

1:24:34

is the other person on the board, and

1:24:36

it isn't really obvious where

1:24:39

the money goes. When it does

1:24:41

go somewhere, it goes in either very

1:24:43

small amounts and does very little, or

1:24:45

very large amounts and also does very

1:24:47

little. There is a school called Ad

1:24:49

Astra, which is literally inside a SpaceX

1:24:51

compound where Musk's own children go. It

1:24:54

used to be outside, now it's inside

1:24:56

a SpaceX compound. They

1:24:58

got several million dollars. He gave $100

1:25:00

million from the Musk

1:25:03

Foundation to another non-profit just

1:25:05

called The Foundation. It's really

1:25:07

firing on all cylinders there.

1:25:09

The Foundation.

1:25:12

It really is good. Anyway, it

1:25:15

bought a bunch of land out

1:25:17

very close to a boring company

1:25:19

site in Texas. Very

1:25:21

cool, very good stuff there. He

1:25:23

also donated $55 million to a

1:25:25

cause of a guy who auctioned

1:25:28

off seats on a SpaceX flight,

1:25:30

who then immediately bought three more

1:25:32

seats on another SpaceX flight. And

1:25:35

I think one of the interesting things that David,

1:25:37

the reporter, brought up in your

1:25:39

interview is part of what

1:25:42

is supposed to be happening here is if

1:25:44

you are getting this tax

1:25:47

break from being a charitable organization, you have

1:25:49

to do charitable things. Meaning you have to

1:25:51

do things that benefit the public good. And

1:25:53

this article, the thing that it kind of

1:25:55

hits again and again, is that it's unclear

1:25:57

if you have any questions about

1:26:00

any of the Musk

1:26:02

Foundation's investments

1:26:04

or philanthropic endeavors,

1:26:08

it's unclear who they benefit outside of

1:26:10

Elon Musk or Elon Musk's

1:26:12

employees and customers, which is

1:26:14

incredibly unusual for a charitable

1:26:16

foundation. Well, another good one:

1:26:19

he said he was going to fix the water contamination

1:26:21

problems that are plaguing Flint, Michigan. Promised

1:26:24

he'd do so. He actually tweeted at one point that he'd

1:26:26

already done so and then deleted the tweet in 2018. He

1:26:30

gave him about one point two million dollars, which is

1:26:33

it's good he did something. They bought

1:26:35

water filters for the school. They bought laptops for

1:26:37

the school. Great idea. They then responded to him

1:26:39

with a four page plan basically saying, here's how

1:26:41

you do. Here's how

1:26:43

we will do the thing you promised to help us with.

1:26:46

And then you can fix the

1:26:48

water as you promised. He sent

1:26:51

a Tesla development executive to

1:26:53

Flint, Michigan, who gave people

1:26:55

rides around, around

1:26:58

the parking lot

1:27:01

of the city hall. He

1:27:03

arrived in the Tesla. That's incredibly

1:27:05

charitable. And then he said, hey, well,

1:27:09

we might build an office out here. You'll never

1:27:11

guess what happened. He didn't build anything. He

1:27:14

didn't fix. Flint, Michigan.

1:27:18

At all. Some

1:27:21

of the details in the story are just wild.

1:27:23

I mean, as you'd expect from an Elon Musk

1:27:25

story. Some of them, I'll

1:27:28

quote from here. Among the donations

1:27:30

the Musk Foundation has made, there was a

1:27:32

fifty five. There was fifty five million to

1:27:34

help a major SpaceX customer meet a charitable

1:27:36

pledge. There were millions that went to Cameron

1:27:38

County, Texas after a rocket blew up and

1:27:41

there were donations to two schools closely tied

1:27:43

to his business. One that

1:27:45

was literally physically walled off inside a

1:27:47

SpaceX compound. And

1:27:50

the other, like you mentioned, is located next

1:27:52

to a new subdivision for Musk employees. I

1:27:55

just want to go into a little bit more about the one you

1:27:57

mentioned, Ad Astra, which is Latin for to

1:27:59

the stars. It ostensibly was

1:28:01

founded by Musk as a

1:28:03

nonprofit school to explore new ways to

1:28:06

teach math and science. And this is

1:28:08

from the Times again, but that school

1:28:10

too would serve a personal purpose for

1:28:12

Mr. Musk. In its first year

1:28:14

of operation out of his home in the

1:28:16

Bel Air neighborhood of LA, five of

1:28:19

Ad Astra's 14 students were

1:28:21

his own children. The

1:28:24

headmaster said the only criteria

1:28:26

for admission were quote, kindness and

1:28:29

eagerness to learn and parents that

1:28:31

worked at SpaceX. Company

1:28:35

store. I just

1:28:37

want to again underscore, I made fun

1:28:39

of the New York Times using the verb underscore

1:28:41

the other day for what reporters really want to

1:28:43

say to themselves. So anyway, I'll

1:28:45

underscore Cashmere Hill and David

1:28:47

Ferrenholt. The Times

1:28:49

is driving me completely batty lately with

1:28:52

his credulous coverage of fascism and

1:28:54

its Biden, his birthday polls

1:28:56

and all this stuff. But

1:28:59

the Times is also the repository for people like

1:29:01

Ferrenholt and Hill who do this kind of

1:29:03

great work. And we just more of this, please, less of

1:29:05

the other crap. Yeah, it's

1:29:07

how I feel. And David was great

1:29:09

as well. He was a real, he was a

1:29:11

president. But he was also very,

1:29:13

he was very, he was

1:29:15

very entertaining. Like he was very into the story

1:29:17

and you could tell it frustrated him. And

1:29:20

you could tell he'd like physically gone out to

1:29:22

where Ad Astra was meant to be. But

1:29:25

he wasn't able to get any of that was a, he

1:29:28

said this in the podcast, but there was just a sign

1:29:30

basically saying, do

1:29:32

not pass. I forget what the nonsense

1:29:35

they say, but it's like, no, no, no

1:29:37

unauthorized entry, blah, blah, blah, blah, blah, for

1:29:39

a school, just to be clear. And

1:29:42

it's just one of the

1:29:44

more depressing things he said, though, is the people

1:29:46

that would regulate this are like the IRS and

1:29:50

other parts of the government that are

1:29:52

underfunded. And it sucks. It sucks

1:29:54

because Musk will get away with this. Musk

1:29:57

will get away with this. And Ryan Mack.

1:30:00

who was the co-reporter on the piece as

1:30:02

well. He did a good amount of the reporting,

1:30:04

and he basically found that no one talked

1:30:06

to Musk about the foundation. Sorry,

1:30:09

the Musk Foundation I should say. No

1:30:11

one, he didn't bring it up. It was not

1:30:13

something that he talked about. It's not something he

1:30:15

thought about. It's just a big tax dodge. And

1:30:18

something else that Fahrenthold added at the end of

1:30:21

the podcast was, he was saying, and I mentioned

1:30:23

it earlier, Larry Page is the same deal. He

1:30:26

just dumps it into a donor-advised fund but he doesn't

1:30:29

invest as much as he needs to. And it's just

1:30:31

a way for them to get away with it. And

1:30:33

I think that Musk probably doesn't want

1:30:35

to spend that money because he doesn't

1:30:38

want to liquidate billions of dollars of

1:30:40

Tesla stock. What's also disgusting is everyone

1:30:42

who covered every single time he claimed

1:30:44

he would do anything with the foundation

1:30:46

while nothing happened. Every major

1:30:49

publication, Wall Street Journal headline was, Musk

1:30:51

gives 11.6 million Tesla shares

1:30:54

to charity, failing to

1:30:56

put in that headline the

1:30:59

words, his own. Failing to

1:31:01

follow up. This stuff has been out

1:31:03

there a while. A

1:31:05

lot of this is publicly available stuff.

1:31:08

We all have deadlines. Not my job, I'm running a PR firm.

1:31:10

But, I mean, someone should

1:31:12

have been on this already. But I think there is, if

1:31:15

not a fear of Elon Musk, there

1:31:18

is still this want to believe that

1:31:20

he means well. And Kara Swisher has

1:31:22

been particularly scummy on this one. I think it's

1:31:24

the same sort of dynamic that we saw with

1:31:26

Trump, where it's like the

1:31:28

traditional media apparatus is

1:31:31

not equipped to

1:31:33

handle someone

1:31:35

who manipulates the truth at

1:31:38

such a, you know, regularity.

1:31:41

They are equipped. All you need to do

1:31:43

is refer to him as a con artist.

1:31:45

But I realize all you need to do

1:31:47

is put his statements in context with reality.

1:31:50

But I remember seeing this right

1:31:52

after I think, you know, when his star

1:31:54

was rising with the acquisition of Twitter, there

1:31:57

was some completely offhand

1:32:00

tweet or something that he'd put out that

1:32:03

Bloomberg or someone wrote a whole story about it

1:32:05

as if it was true and of course it

1:32:07

just ended up being something he tweeted and

1:32:09

never ended up revisiting

1:32:12

every Musk story, every Musk story. In

1:32:14

every Musk... he was gonna give six billion

1:32:16

dollars to world hunger: tweet. The Neuralink

1:32:18

customer, we put it in the first one: tweet. I'm gonna

1:32:20

buy Wikipedia: tweet. Every

1:32:22

utterance, it becomes a

1:32:24

story. Sure. Except

1:32:30

it's dumber with Elon Musk he

1:32:32

is a, like, Musk... Musk and Trump

1:32:34

are both proven liars. They're both

1:32:37

dangerous to society in both different ways

1:32:40

with musk tech

1:32:42

industry knows better but I think the tech industry

1:32:44

look if you want to just be an enthusiast

1:32:46

industry that's fine be happy with everyone

1:32:49

no critiques of anyone. You can't critique some but

1:32:51

not all. Musk

1:32:53

has posted multiple great

1:32:55

replacement theory things. He's

1:32:58

continually doing this right-wing

1:33:00

firebrand nonsense this anti-immigrant

1:33:02

stuff he is out

1:33:04

and out racist he posted something very

1:33:06

racist against african-americans a couple months ago to

1:33:09

do with sex which I'm just not gonna go

1:33:11

any further on and

1:33:13

nobody did anything the great

1:33:16

replacement thing was a few weeks ago nobody

1:33:18

covered nobody but they will

1:33:21

cover it when he farts out this "I'm

1:33:23

going to, I'm gonna remove likes." I mean,

1:33:25

very much, yes,

1:33:28

"and we're going to make the tweets

1:33:30

go sideways now." And it's, and

1:33:33

the thing is you know who actually has done a really good

1:33:36

job on this I'm gonna give a lot of credit to TechCrunch

1:33:38

TechCrunch who really did start off as kind of

1:33:40

very rah rah for tech has been damning on

1:33:42

Musk. People like Amanda Silberling

1:33:45

do an excellent job. Sadly,

1:33:47

Darrell Etherington has moved on from there now,

1:33:49

but they're doing a

1:33:52

great job calling out but CNBC isn't

1:33:54

CNBC other than Laura Kolodny who is absolutely

1:33:56

amazing and does a hell of a

1:33:58

job there you can't also You

1:34:00

can't, it undermines great reporting

1:34:02

like that when you sideline it

1:34:04

with just Musk's vague promises and

1:34:06

outright lies and then you don't

1:34:09

revisit them. And Paris, I

1:34:11

get your point. The comparison with Trump is

1:34:13

true. Except Trump, as a former president and

1:34:16

presidential candidate, there is a difference

1:34:18

there. Now, he's still

1:34:20

covered terribly, but there is a difference. Elon

1:34:22

Musk is a private citizen worth billions of

1:34:25

dollars. Rip into shreds! Take

1:34:27

his arse down! He

1:34:29

only went after media matters because he knew they

1:34:31

didn't have much money. He won't go after actual

1:34:33

journalists. Other than, here's the thing,

1:34:35

the real abdication of authority with Elon Musk is

1:34:38

in the hands of Kara Swisher. Kara Swisher should

1:34:40

be ripping into him. She's doing a media

1:34:42

tour for Burn Book right now. Which

1:34:44

is an astonishing book. I mean, just

1:34:46

the fact that she's being interviewed by

1:34:48

Sam Altman and Reid Hoffman

1:34:51

about her book. It's an insult to

1:34:53

journalism. So you read it so we

1:34:55

don't have to. Say more. Yes. You

1:34:57

should read Paris Marx's review of it.

1:34:59

I was just Googling that right now.

1:35:01

Paris Marx, another great guy. But

1:35:04

with those, he makes the point that she kind

1:35:06

of acts like she criticizes people

1:35:08

but never really gets down to it. We don't have

1:35:10

time to get into his whole review, it's excellent. But

1:35:12

the long and short of Kara's book is, yeah,

1:35:15

all these people suck, but

1:35:17

I was at their parties. Right. With Elon Musk, she

1:35:19

only turned on him because he called her an asshole.

1:35:22

And to be clear, the thing he called her an

1:35:24

asshole for was because he misread a tweet where she

1:35:26

was supporting the US government paying him for Stalin in

1:35:28

Ukraine. Something he promised to do for free. People

1:35:32

like Kara have the ability to stop

1:35:34

men like this. They have

1:35:36

the ability to say, you are lying. They

1:35:38

have a huge... Stop. Same

1:35:41

deal. Oh, thank you.

1:35:43

Oh, my bad. Noire.

1:35:46

Why are you getting such fucking attention? I

1:35:49

want... I'll be honest, like... They

1:35:52

shouldn't get that. Better off when it's better

1:35:54

than they let it go out. Well, I mean, they get

1:35:56

attention because they are hyperbolic. Because they... No,

1:35:59

because they appeal to... very center liberal audience

1:36:01

of people who don't really have

1:36:03

morals but they do like posting

1:36:05

Instagram things. They appeal

1:36:07

to people who don't really want to believe in anything

1:36:09

other than that which feels convenient and

1:36:12

doesn't make them think too much. Kara is

1:36:14

the arbiter of that kind of information and

1:36:16

on top of that with Musk, she

1:36:19

was still defending his ass well into

1:36:21

2022. She was the one

1:36:23

saying he's hard to unpack. Oh he's

1:36:25

hard to unpack. He's so smart and we

1:36:27

know he's a flipping dullard. He's

1:36:30

a big wobbling candle of boringness

1:36:33

and he's not invented any of

1:36:35

the things he claims he invented. His

1:36:37

company's boring company, constantly goes to cities,

1:36:39

doesn't do anything, burns

1:36:41

people in Las Vegas. He's hurt

1:36:43

so many laborers in a very blue

1:36:45

collar city. Elon Musk

1:36:47

is a scumbag and calling him a scumbag and you

1:36:50

don't even have to do it in the newspaper. You

1:36:52

just say it in a talk. Maybe

1:36:54

you don't give him a talk. Andrew Ross Sorkin shouldn't be

1:36:56

sitting there for the times and going, oh Elon, tell

1:36:58

me about your things. Tell me about your travails. No,

1:37:01

sit there and roast him and if he won't do

1:37:03

the interview, screw him. Say

1:37:05

he won't do the interview. It's time to

1:37:07

hold these people accountable. Anyway, sorry, I'll stop. No,

1:37:09

this is fantastic. I could let you go

1:37:11

on. I was like, we got it. We

1:37:13

got to give a moment

1:37:15

to breathe. The great long,

1:37:18

I guess, review of

1:37:20

Kara's book, Burn Book by

1:37:22

Paris Marx is called Kara Swisher's

1:37:24

Reality Distortion Field. I posted a

1:37:27

link to it in the discord

1:37:30

chat. Here's a quote from

1:37:32

it. In the

1:37:34

end, Burn Book is the story of

1:37:36

one of Silicon Valley's most prominent access

1:37:38

journalists who took a page from the

1:37:40

billionaire she covered and created her own

1:37:42

narrative to see and present herself as

1:37:44

something else entirely. The story Swisher is

1:37:46

trying to tell helps to distance herself

1:37:48

from the decades she spent boosting companies

1:37:51

that have now been quote, disastrous, as

1:37:53

she herself admits. But it also works

1:37:55

for the industry. Letting Swisher present

1:37:57

herself as a tough reporter allows Silicon Valley to

1:37:59

be a Valley to pretend it was

1:38:01

being held to account this whole time

1:38:03

when really Swisher was along for the

1:38:06

ride and bought into the tech determinist

1:38:08

worldview guiding the industry. What

1:38:11

do you think of her, Jeff? Oh, I

1:38:14

say probably she blocked me long ago. And

1:38:16

I wasn't even saying anything that was that

1:38:18

awful. She's very, she acts all tough and

1:38:20

she's very oversensitive to criticism

1:38:23

herself. I

1:38:28

think she could have been a good analyst 20 years

1:38:30

ago. I think she could have figured things

1:38:32

out. But

1:38:36

I'm not sure where it went south. Was it putting people

1:38:38

on the red chair and making a lot of money from

1:38:40

that? I think

1:38:42

that there is this like version,

1:38:45

I mean, it's something that many people

1:38:47

are, can

1:38:50

happen to many people. Is when you move

1:38:52

from being a down

1:38:54

in the trenches reporter to

1:38:57

being a quote unquote commentator with

1:38:59

like a capital C and your

1:39:01

job is to have tech luminaries

1:39:04

or luminaries in the industry sit

1:39:06

across from you at a big

1:39:08

conference and ask them questions, part

1:39:10

of what is going on there

1:39:12

is you're existing in an economy

1:39:14

where your continued success relies on

1:39:17

the fact that you've got to

1:39:19

get those people to pick up your

1:39:21

calls and come back for next year's conference.

1:39:24

And I think that that causes a little

1:39:26

bit of brain rot in the sense of it's hard to be

1:39:30

truly pushing the envelope in

1:39:33

those industries. I think

1:39:35

In comparison, I've watched Galloway; it was

1:39:37

a different case. I

1:39:39

used to go to the DLD conference in Munich and

1:39:41

he always did this spiel, you know, 100 slides

1:39:45

and 20 minutes and I'm going to be really funny the

1:39:47

whole time. And then

1:39:49

I was on MSNBC once with him

1:39:51

on his first appearance on TV and

1:39:53

I saw the drug shot right into

1:39:55

his vein. He

1:39:57

was agog at

1:40:00

being on TV. It's no big deal. I was like,

1:40:02

you know, on daytime show, nobody sees it. My mother

1:40:04

didn't notice, you know, who cares? But he was, this

1:40:07

was it. This was the drug for him. He wanted

1:40:09

fame. And so it wasn't, in his

1:40:11

case, it's a little bit different. He gets invited to

1:40:13

give hugely expensive talks at places,

1:40:16

but he's not an

1:40:18

access journalist so much as

1:40:20

he's a faux-wry voice. And

1:40:23

so the two of them together share that

1:40:25

BS of acting like they're tough critics

1:40:27

of the world. He's a terrible predictor.

1:40:29

He's wrong about crap. She's not a

1:40:32

tough critic at all. It's

1:40:34

an act and people buy the act.

1:40:37

And that's the thing. I mean, I think part of it, obviously I

1:40:39

think they are responsible for their

1:40:41

own actions, bad takes and

1:40:44

bad predictions and bad interviewing. But I

1:40:46

also think that part of it is

1:40:49

what happens when you

1:40:51

have chosen to exist

1:40:54

within an industry that demands your

1:40:56

constant opinions and hot takes on

1:40:58

everything on a daily and weekly

1:41:01

and monthly basis. Like, I mean,

1:41:03

we've got this podcast, it's long,

1:41:05

but it's once a week that

1:41:07

I am expected to give commentary

1:41:10

and opinion on a very

1:41:12

specific niche amount of events that are changing.

1:41:15

She and other commentators at that

1:41:17

echelon are doing this every single

1:41:19

day. That is how they got

1:41:21

to this point. They have to get on TV to do it. And you

1:41:23

have to get on TV and you've got to get clicks and

1:41:26

it is, it rots the brain.

1:41:28

It's the attention economy.

1:41:33

I think you're right, by the way. There's one other

1:41:35

layer, which is still an issue. In

1:41:39

the last month and a half, I've done

1:41:41

maybe 25 interviews. I have

1:41:43

had to come up with a number of opinions for my

1:41:46

newsletter for free for years. I've written 3000 words for pretty

1:41:49

much every newsletter, at

1:41:51

least the last 12 of them. I

1:41:53

do this while running a PR firm. Kara, the job's

1:41:55

not that hard. And

1:42:01

I'll go on TV. I've done tons of podcasts. The

1:42:03

second one I did today, did

1:42:05

a bunch yesterday, did a bunch last week. I'm not boasting.

1:42:07

I'm just saying boo bloody hoo

1:42:10

if that's your problem. But also,

1:42:12

It's one thing to be a commentator and that's fine.

1:42:15

if that's all you are. And that's all I am. I'm

1:42:17

not pretending to be a journalist.

1:42:19

I'm not doing investigative reporting. I'm

1:42:21

raising up things and saying my

1:42:23

opinions. Kara Swisher, though:

1:42:25

her last things at

1:42:27

All Things Digital, back in 2012, 2013

1:42:30

even, were things like funding announcements and tech

1:42:32

stocks and stuff. You

1:42:34

can't be a news source while

1:42:36

also doing what

1:42:39

she does. Because she's not stupid:

1:42:41

Kara Swisher is a great broadcaster.

1:42:43

Scott Galloway is a

1:42:45

C-plus at best. He's dull. When

1:42:47

I saw him on the John Oliver Elon Musk thing,

1:42:49

I was so angry. So

1:42:52

very angry because he's a boring sod.

1:42:54

No one needs an

1:42:57

anti-union freak talking about how

1:42:59

he got into boxing so that he could get

1:43:01

women. Loser. Anyway, back

1:43:03

to Kara. She is

1:43:06

better than this. I think that really

1:43:08

is it: she is better than this.

1:43:10

Had she stopped pretending that she

1:43:12

was some sort of objective arbiter when

1:43:15

she was friends with these people. Puck

1:43:17

does a great job with this. No

1:43:19

one believes that Puck is

1:43:21

not friendly with the billionaires. They're all in with

1:43:23

them. They know it. They're upfront with that. I

1:43:25

like it. It's why we get interesting stories The

1:43:28

British media is a lot like this as well

1:43:30

They're relatively in bed with sources, but they'll rip

1:43:32

them to shreds. People used to

1:43:34

go on Jeremy Paxman and get their asses lit

1:43:37

up by Paxman. Because

1:43:40

they had to, because these were

1:43:42

the terms that the media offered you. Kara

1:43:44

Swisher could have offered those terms. She

1:43:46

doesn't even

1:43:48

need to be endlessly tough, but sit there and

1:43:50

give him a little more push, a little more sizzle,

1:43:53

and frankly there are other journalists who haven't done

1:43:55

that as well. I want to give a big

1:43:57

shout of respect, though, to Kevin Roose, who was

1:43:59

very wrong Yeah, when it came to

1:44:01

crypto, but he had Chris Dixon of

1:44:03

Andreessen Horowitz on their podcast, ripped him to

1:44:05

shreds, called him to account. We need

1:44:07

more stuff like that. But also I'm

1:44:09

not mixed on Roose. I'm also

1:44:12

mixed on Roose. I believe he likely lost

1:44:15

many people a lot of money by supporting

1:44:17

cryptocurrency in the way he did in a

1:44:19

shocking abdication of his responsibility as a journalist,

1:44:21

and I think his coverage of ChatGPT

1:44:23

was appalling. You're

1:44:27

referring to the front page

1:44:30

New York Times story that I'm forgetting the

1:44:32

exact name of but it was right when

1:44:34

ChatGPT was first coming on the scene,

1:44:36

where he published a quote-unquote

1:44:39

interview with ChatGPT, or

1:44:42

perhaps it was a different chatbot.

1:44:44

He was saying, oh,

1:44:47

ChatGPT asked me to divorce my

1:44:49

wife or something. It tried to convince

1:44:51

him he was unhappy with his wife.

1:44:54

He tried again and

1:44:56

again and again to get ChatGPT

1:44:58

to go to its dark side. And

1:45:00

then when he finally succeeded, after

1:45:02

the guardrails popped in multiple

1:45:04

times, he finally got it to go

1:45:06

to wacky places. Then he wrote I couldn't get to

1:45:09

sleep that night. It

1:45:11

was the Bing chatbot. What?

1:45:13

That's what it was? Yeah, which is so funny.

1:45:15

Don't put in the newspaper that Bing did

1:45:18

a psyop on you. Come on, man.

1:45:20

Kevin's smarter than that. But

1:45:23

that's the thing. You can,

1:45:25

I believe, cover this tech

1:45:27

industry and be excited about it. You can

1:45:29

say, I find ChatGPT really interesting. You

1:45:31

can. The Times has people that

1:45:34

have this approach to tech coverage. Brian

1:45:36

Chen. Fantastic.

1:45:38

Repeatedly, like with the Apple

1:45:40

Watch, he was originally quite negative on the

1:45:42

Apple Watch. The Wall Street Journal's Joanna Stern, one

1:45:44

of the best tech journalists working out there

1:45:47

who's done some amazing writing, today had

1:45:49

an interview with the CTO of OpenAI,

1:45:51

and it was a good interview and it

1:45:53

was straight on. But the headline was, it makes

1:45:56

these amazing videos and it freaks us out. What

1:46:00

does that matter? You're lying. I

1:46:02

know, but they lie. But you

1:46:05

are obfuscating the truth, which is,

1:46:07

you should say, initially impressive, but

1:46:09

on closer look bad. Because

1:46:11

that's the story. The story is not

1:46:14

what it does, but also what it can do. And

1:46:18

in the event that they don't fill it, if

1:46:21

the company leaves the gaps for the journalist to

1:46:23

fill in, the journalist has to go, I don't

1:46:25

know. Or say, or

1:46:27

maybe not give the most preferential answer. But I still

1:46:29

think Joanna does an excellent job. She is the reason

1:46:31

that. The interview was very good. It was the editing

1:46:33

picture. And she reme… But a lot of this comes

1:46:36

back to, what are you

1:46:38

doing there? If Kara was just a pure

1:46:40

commentator, if that's all she did, if she

1:46:42

only claimed to be an opinion person who

1:46:45

did entertainment, I'd fully respect it. I'm serious.

1:46:47

If it was… All it was was just entertainment.

1:46:49

She's like, I have my biases, I have my things, I'm going

1:46:51

to roast them as much as I can. That's what Michael Arrington was

1:46:54

in the day, right? I will say,

1:46:56

if you could hear us, exactly one point… I'm

1:47:00

saying, I'm not defending him, but I'm just saying

1:47:02

he was what he was. Yeah, if

1:47:04

Michael Arrington, if you're saying that he was unashamed

1:47:06

and what he was, you're correct. That's

1:47:09

what I'm saying. Okay, that's fine. Okay.

1:47:11

A brief aside, to give Kara a little, a

1:47:13

tiny bit of credit. Today she did get

1:47:15

a scoop on a topic that I think is

1:47:17

interesting. Earlier today, she

1:47:20

tweeted out a scoop, and

1:47:23

I'll

1:47:25

put aside my judgment on that,

1:47:27

about Don Lemon. Specifically,

1:47:30

that Don Lemon had

1:47:32

partnered with X and Elon

1:47:34

Musk, and X had agreed

1:47:36

to throw its financial support behind

1:47:38

the creation of Lemon's new venture

1:47:40

called The Don Lemon Show. But

1:47:43

what Kara tweeted out today

1:47:46

is that the first

1:47:48

interview for the show was publicized

1:47:50

as being between Don Lemon and

1:47:52

Elon Musk. And today, after the

1:47:54

interview happened, Elon

1:47:56

Musk sent a terse text to

1:47:58

Lemon's reps saying, contract terminated,

1:48:00

the show is off

1:48:03

because Lemon did a

1:48:05

tough interview. And I guess the

1:48:07

interview happened last Friday that was

1:48:10

not to Elon Musk's liking. It included

1:48:12

questions about his ketamine use and

1:48:15

other subjects. I mean,

1:48:17

this seems pretty obvious that someone

1:48:20

like Don Lemon should have seen it coming. But

1:48:24

of course now he did. He did a deal with the devil.

1:48:27

He did. And now he's being... I think

1:48:29

it's really nice that Don Lemon interviewed Kara

1:48:31

Swisher about her book, Burn Book, at

1:48:34

the 92nd Street Y in New

1:48:37

York. I think it's

1:48:39

interesting that I wonder if she asked him about

1:48:41

the sexual harassment allegations. She did. She says in

1:48:43

her tweets on this, I told Don that this

1:48:45

is exactly what would occur at a recent... Oh

1:48:47

wow, thank you Kara. Wow. You

1:48:49

asked him if he's a perp. Seriously,

1:48:52

I'm just, I'm going by the British

1:48:54

standard where these questions would get asked.

1:48:57

I've come from the Paxman School. You've got to

1:48:59

ask the questions and you need to, when they

1:49:01

answer them, you need to ask further questions. I

1:49:03

get that it's difficult, but guess what? These people

1:49:05

will, they have to talk to the press. They

1:49:08

have to. Even Musk, even Musk has to.

1:49:10

Musk can run scared, but he needs the

1:49:12

press to an extent. Also,

1:49:14

by the way, do you remember when Sam Altman

1:49:16

was out at OpenAI and Kara Swisher was getting

1:49:18

scoops and it was very obvious

1:49:20

it was just like, it was

1:49:23

a mixture of bollocks and Greg Brockman

1:49:25

just texting it directly. Yeah. It's

1:49:27

just, that's what I don't get. She

1:49:30

has this opinion gig. She can just

1:49:32

go on whatever and get a bunch

1:49:34

of money for speeches. But it's access,

1:49:36

man. It's access. It's fame.

1:49:39

I get it. But just enjoy

1:49:41

the fame. Stop trying to pretend like you're like

1:49:43

doing journalism. It sucks. She's

1:49:46

not stupid. She's a great broadcaster. Why

1:49:48

can't she? Why can't she do

1:49:50

better? That's the thing. This isn't a case where

1:49:52

someone's rubbish at their job and they're incapable of

1:49:54

doing it. Is this that fun?

1:49:57

Is this fun? Does this, I mean, it seems to be a

1:49:59

little at the very least. Sure,

1:50:02

but it'd be just as lucrative

1:50:04

to actually do a good job. Well,

1:50:06

she's tried to be, you know, the other thing about her career, she's

1:50:10

like a human NFT. She'll...

1:50:15

That's the wildest description I've ever heard you use.

1:50:17

Well, it's the short attention span. I'm

1:50:21

a hard-ass journalist. I'm

1:50:23

a host of events.

1:50:26

I'm a podcaster. I'm a columnist. I'm

1:50:28

a podcaster again. I'm a this. I'm

1:50:30

a that. I mean, she's a public brand. Yeah,

1:50:33

but she... So along comes

1:50:35

the New York Times and makes her a columnist. If

1:50:38

you're a public brand, that's not a bad gig. No.

1:50:40

But then she just doesn't keep it up very long. She has

1:50:42

a short attention span. It's just... It's

1:50:45

not of great value. It just sucks because

1:50:47

she does know a lot. She

1:50:50

has great context for this entire industry, decades

1:50:52

worth. And if Burn Book had been just

1:50:55

her saying, I don't know what I've become,

1:50:58

had it been something like that, had this

1:51:00

thing come out and she... That would have

1:51:02

been quite interesting. Is it actually exploring her

1:51:04

relationship with these folks? Yeah. Well,

1:51:06

also just even if she's like, I don't want to

1:51:08

lose these friendships, she just

1:51:10

said that. If she was just like, I

1:51:12

don't know what I am now compared to what I was before, I would

1:51:15

have a deep, abiding respect for her. I

1:51:17

really would. I would genuinely

1:51:19

be impressed at the introspection and acceptance

1:51:21

of what has happened, because I'd be

1:51:23

empathetic with her. Taken

1:51:26

away by fame. She also was doing the best

1:51:28

job. Human memory is very cruel as is watching

1:51:31

people on video. You could say, oh,

1:51:33

at the time maybe she thought she was being pressing

1:51:35

maybe to her that was at the time and

1:51:38

by the standards of the tech industry, it was

1:51:40

quite critical. I still say the

1:51:42

British press was doing better 20 years ago. But

1:51:45

if she had the introspection to see what

1:51:47

she was instead of lying, because the

1:51:49

way she talks about Elon Musk and

1:51:51

always being a critic is a lie. It's

1:51:55

disgraceful and that is misleading people and it's

1:51:57

teaching people that you can just get away

1:51:59

with that. both the sources and people

1:52:01

in society. I'll

1:52:03

give you an example of a journalist doing it

1:52:05

well, I think, Sophie Schmidt, Eric Schmidt's

1:52:08

daughter, went to North

1:52:10

Korea with her father on a,

1:52:12

you know, a privileged trip with

1:52:14

the State Department 10 years ago. And

1:52:17

oddly, kind of

1:52:19

wonderfully ironic, she put it up on Google, and then Google, of

1:52:21

course, killed that feature, so it was gone. So

1:52:24

somebody at Rest of World, which is a

1:52:26

wonderful, amazing site that she started, resurrected

1:52:29

the piece, and then she wrote a piece

1:52:31

today about all the ways she was wrong.

1:52:35

So I respect that. She showed off something she did

1:52:37

10 years ago, she's kind of embarrassed about in some

1:52:39

ways, it was kind of naive in the situation. And

1:52:41

then she came back and she talked to experts about

1:52:43

it and said, here's where I was wrong. That's the

1:52:45

kind of model to show. She's rich, she

1:52:48

doesn't have to do any of this, she's, I think,

1:52:50

Rest of World is spectacular, she could rest on

1:52:52

those laurels, but she came out and did that. And

1:52:54

that's what Kara Swisher is never going to do. Why

1:52:56

doesn't she do the tech equivalent of SmartLess? Yeah, it's

1:52:58

called What I Missed When I Went to North Korea,

1:53:00

11 years after her Pyongyang

1:53:02

trip, Rest of World's founder revisited how

1:53:04

she interpreted the country, its people and

1:53:07

their culture. That's

1:53:09

true. And what I wish Kara

1:53:11

would do is the tech equivalent of SmartLess: if

1:53:15

you can get these people, get them on a podcast,

1:53:18

chat nonsense for an hour, let's see what they

1:53:20

say. If they're not willing to do it, they're

1:53:22

not friends. Or if

1:53:25

you're not willing to push them to do

1:53:27

it, you're not doing your

1:53:29

job, you're not putting on entertainment. If it's

1:53:31

just entertainment, show these people

1:53:33

for what they are. Because also, I think the

1:53:35

other thing is, most of these guys are terrifyingly

1:53:37

boring. Kara can

1:53:39

be quite interesting, she'd be quite electric.

1:53:42

I think Scott Galloway is boring, but

1:53:44

she's interesting. But the people she talks

1:53:46

to are so terrifyingly dull. Sam Altman,

1:53:48

Reid Hoffman, even Marc Benioff. Oh

1:53:51

god, the same different versions of everything,

1:53:53

it's like chat GPT given life. They

1:53:55

all say the same kind of things,

1:53:58

they all mumble. Mark Zuckerberg, what

1:54:00

a dullard, good lord! Blah

1:54:03

blah blah. Steve Jobs by the way, scumbag,

1:54:05

I just finished doing the Behind the Bastards

1:54:07

series with Robert Evans about him. Oh I

1:54:09

love that podcast. Steve Jobs by the way,

1:54:11

one of the worst people to ever walk

1:54:13

the earth, a deadbeat dad who had to

1:54:16

be sued by the District Attorney of California

1:54:18

to pay support for his kid. Walter Isaacson

1:54:20

wrote a fair and balanced biography. Well you

1:54:22

know Walter Isaacson has never written an unfair

1:54:24

biography ever. And speaking of Elon Musk, I

1:54:26

want to shift gears a little bit. Yeah

1:54:29

yeah, sorry, I'm going off. Another, I mean,

1:54:31

listen, I'm sure everybody loves hearing about Kara

1:54:33

Swisher as much as we love talking about

1:54:35

her. But another story

1:54:37

in the rundown is Trump,

1:54:40

this week the Washington Post reported that

1:54:42

Trump asked Elon Musk if he wanted

1:54:44

to buy truth social. And

1:54:47

unsurprisingly, the idea went nowhere.

1:54:50

But it's still kind of

1:54:52

interesting that Trump and Elon

1:54:54

have kind of continued to

1:54:56

communicate more and more. This

1:54:58

comes after I believe last

1:55:01

week where either the

1:55:03

Post or the Times reported that

1:55:06

Trump had came to

1:55:09

Elon Musk to potentially get

1:55:11

him to try and invest in his presidential

1:55:13

campaign. Well so I think that's

1:55:15

a different story. My theory on that is, because

1:55:17

Musk said I'm not going to, I'm not going

1:55:19

to contribute to any candidate. Well Trump

1:55:22

needs something else. He needs a

1:55:24

few hundred million dollars bond. And

1:55:28

I think that's what he's going after people for. We

1:55:31

don't know where he got the money for the first bond

1:55:33

he had to put up. He has a

1:55:35

much larger bond he has to put up. That's

1:55:37

why I think with Elon, however not being smart

1:55:40

enough to realize that Elon probably has very little

1:55:42

liquid capital. Yes.

1:55:45

Everything's tied up in Tesla stock. So

1:55:48

why isn't anyone talking more about

1:55:51

this Elon Musk, Chancery court

1:55:53

thing? If he has to reorganize the

1:55:55

board of Tesla, he's

1:55:57

screwed. Any

1:56:00

his current group of people

1:56:03

look kind of like Jim

1:56:05

Henson creatures Just

1:56:08

like weird cronies of his that

1:56:10

have like transparently crooked deals with

1:56:12

him. And that's why the Chancery

1:56:15

Court judge ruled against him, because they

1:56:18

were like, yeah, these people have no control

1:56:20

over you. That

1:56:22

is the biggest story in tech right now

1:56:25

where Elon Musk loses tens of

1:56:27

billions of dollars of stock options. The richest

1:56:29

man in tech won't be able to create

1:56:31

the world's most anti-woke AI. Also,

1:56:34

why isn't Grok being

1:56:37

funded by tens of billions of dollars? Where

1:56:41

are we gonna get our contrarian news from? I

1:56:43

don't know that Many

1:56:45

other news outlets that seem to be

1:56:47

willing to post right-wing stuff critically

1:56:51

because they're incapable of... I don't

1:56:54

know I'm just I find the whole

1:56:56

Twitter thing very depressing as well I find the whole

1:56:58

Elon Musk thing very depressing, but I'm British. That's how

1:57:00

we roll Let's

1:57:03

uh do the Google change log By

1:57:13

the way, I have to confess to you I put

1:57:16

those both in there with irony. Listen,

1:57:19

I assume that's the only way that Google

1:57:21

changelog can happen: breaking news, with my

1:57:24

ironic comment on how new it is. From

1:57:26

CNET: new Google Messages

1:57:28

feature lets you turn your

1:57:31

blue chat bubbles green or

1:57:34

orange or purple, if you're feeling particularly...

1:57:39

How exciting. Google is

1:57:41

testing the addition of color and background customization

1:57:44

for its Google message app on Android phones

1:57:46

Another way the internet giant hopes

1:57:49

to distinguish its RCS messaging services

1:57:51

this time with some pizzazz. Thrilling.

1:57:55

Moving on to Jeff's seemingly

1:57:57

ironic breaking news. Published

1:58:00

one day ago, Google's updated sign-in

1:58:02

page appears to be rolling out widely.

1:58:04

Hey, hey, how's that produced? If you've tried

1:58:06

to sign into a Google product,

1:58:09

you may realize that the

1:58:11

sign-in page looks a little different. And

1:58:14

I'm sure quite a lot of

1:58:16

teams of people got paid quite handsomely to

1:58:18

do that, and for that, we

1:58:20

salute them. Where's Marissa Mayer when we

1:58:22

need her? The Google change

1:58:25

log. She's running a calendar app. Yes,

1:58:27

she is. Oh,

1:58:30

she has contacts, too. She has a contact? Huge.

1:58:32

Huge. Wow. We're going

1:58:35

to go to an ad break after which we'll

1:58:37

come back with our picks of the week, guys. And

1:58:42

we're back. Jeff, what's your pick

1:58:44

this week? Oh, you got to pick

1:58:46

on yourself first. Pick on myself

1:58:48

first, all right. Even though you're the boss.

1:58:50

My pick this week is a, you know, I'm

1:58:52

the boss, but yet I clearly don't have a

1:58:54

strong sense of self because I

1:58:57

immediately deferred to you. My

1:59:00

pick this week is a new

1:59:03

short film out, I guess

1:59:05

a normal-sized film called The Disruptors. Taylor

1:59:09

Lorenz did an interview this week

1:59:11

with the writer and director of

1:59:13

it, Adam Frucci. And

1:59:16

the headline of it is, A New Satire

1:59:18

Takes Another Whack at Silicon Valley and The

1:59:20

Men Who Fund It. This

1:59:23

is perhaps a little bit of a

1:59:25

preemptive pick, but I saw, Jeff, you'd

1:59:27

included this story in the rundown. So

1:59:29

I kind of had to highlight it

1:59:31

as my pick because the people

1:59:34

behind this film and the

1:59:36

two main actors are

1:59:40

both kind of creators that are prominent in

1:59:42

the dropout cinematic universe. And the dropout for

1:59:44

people who have been listening for a while

1:59:46

is this indie streaming service and content

1:59:49

kind of creation, I guess,

1:59:52

studio that I've been a huge fan

1:59:54

of. It is by the former

1:59:56

College Humor people and is one of my favorite things

1:59:58

in media right now. And basically what

2:00:01

this movie is about is the

2:00:03

plot is that a basic

2:00:06

an uber driver played by this

2:00:08

guy Grant O'Brien decides to try

2:00:10

and scam a venture

2:00:13

capitalist into giving him lots

2:00:15

of money by making up

2:00:17

a fake startup idea. And

2:00:20

I love this interview with

2:00:23

the director and writer because

2:00:26

it I mean it is

2:00:28

a really interesting look because basically the

2:00:30

Washington Post interview ends asking, like, oh,

2:00:32

what

2:00:34

led you to do this

2:00:37

satirical critique on Silicon Valley?

2:00:39

And he basically says venture

2:00:41

capitalists are some of the most powerful people on

2:00:43

the planet. And

2:00:45

these basically every job I've

2:00:48

had has been ruined in

2:00:50

one way or another by venture capitalists or

2:00:52

the tech industry, which

2:00:54

I think is a really interesting take

2:00:57

on it. I just

2:00:59

think it's cute that someone did a

2:01:01

movie about creating something that is basically

2:01:03

symbolic capital to raise money from venture

2:01:05

capitalists, which is otherwise known as

2:01:07

venture capital in 2021. Very

2:01:12

cute. Sorry. Jeff,

2:01:15

I got. I'll keep going on the

2:01:17

capitalism is evil and private equity and

2:01:19

venture capital and hedge funds are ruining

2:01:21

the world and gutting journalism, with two

2:01:23

little notes. One

2:01:26

is that the Associated Press, which we thought

2:01:28

would have standards, has done

2:01:30

a deal with the evil Taboola,

2:01:33

the company that junks up

2:01:35

every web page with "and you

2:01:37

won't believe." And they're

2:01:39

going to do a commerce site with

2:01:41

Taboola. Have they no pride? Have they?

2:01:44

Yeah, Taboola famously is the chum

2:01:46

box that you see below

2:01:49

articles where it's like one weird trick

2:01:51

gets rid of fat fast. You

2:01:54

won't believe and then it's something that's just

2:01:57

made up. Yeah. I

2:02:00

have clicked on one

2:02:02

once and I was just like,

2:02:05

and you just go, no, this isn't real. This

2:02:08

is, it's not obvious how they're making money, but

2:02:10

you know they are. So you just close the

2:02:12

window. So

2:02:14

then the second thing is speaking of

2:02:17

private equity and bad people,

2:02:19

the Los Angeles Times and

2:02:21

investor media now owned by

2:02:23

Patrick Soon-Shiong, ruined by

2:02:26

previous owners at Alden.

2:02:29

They are closing it. You

2:02:32

could, yes, you can play this video without the sound so

2:02:34

we won't get it taken down if you'd like. This

2:02:36

is the last press run at the

2:02:39

LA Times. Huge,

2:02:42

amazing Olympic press

2:02:46

hall and after

2:02:49

more than 30 years it's going out of business. Look

2:02:51

at the size of these presses, these magnificent pieces.

2:02:53

Where are they printing it now though? Well,

2:02:56

it's going to Alden, which now owns

2:02:58

San Diego and Orange County,

2:03:00

and so the evil Alden hedge

2:03:02

fund will get the printing business

2:03:04

and make money off the

2:03:06

LA Times. Very cool.

2:03:09

Wow. So

2:03:11

the last, I love, I'm old

2:03:13

enough, I'm old enough to

2:03:16

remember Linotypes and presses. When

2:03:19

I worked at the Chicago Tribune and San Francisco

2:03:21

Examiner, at a certain hour you

2:03:23

would feel the floor rumble as the press starts.

2:03:25

It was a wonderful, wonderful thing to go down

2:03:27

and watch it and smell it. It's

2:03:30

gone. I don't

2:03:32

regret paper going away. It's

2:03:35

like horses went away and so did their crap. It's

2:03:37

okay. Jeff, I have a dumb

2:03:39

question for you but in your first journalism

2:03:41

job were you using computers or typewriters? Ah,

2:03:44

here's how old I am. Or using a stone

2:03:46

tablet and just a tablet. Ah, there was a

2:03:48

tablet and a bird that said, it's a living.

2:03:51

So this is Uncle Jeff moment here. So

2:03:55

I'm old enough that my first typewriters

2:03:57

were not electric. When

2:04:00

I was at the Chicago Tribune, I was in the

2:04:02

job called, I was a rewrite man, sorry for the

2:04:04

sexism of that, and I

2:04:06

would sit on rewrite on deadline stories and there

2:04:08

was a prison break in

2:04:10

Indiana. Reporters are calling

2:04:13

in to me with stuff. I'm calling people to get notes.

2:04:15

I'm calling up the clips on other prison

2:04:17

breaks. And then I would write the story on what we

2:04:19

called half books, half a sheet of paper with

2:04:22

many carbons. And I would

2:04:24

type the first paragraph, the lead of the story. I

2:04:27

would rip it out of the typewriter quite

2:04:29

dramatically and scream, copy! And

2:04:31

somebody two years younger than me would come.

2:04:34

And copies of the copy would go all around

2:04:36

creation. And it would get edited by the CD

2:04:38

desk, then edited by the copy desk, and then

2:04:40

it would get pneumatically tubed down to the composing

2:04:42

room where it was set in lead down

2:04:45

there while I'm still writing. So

2:04:48

I'm writing the next paragraph and the next paragraph and the

2:04:50

next paragraph, and I've got, and

2:04:52

I keep one copy for myself, I've got to

2:04:54

find out whether or not, did I make that

2:04:56

first reference? Did I say who the DA was?

2:04:58

Did I say the first name? And if

2:05:01

not, I've got to find a way to write that in

2:05:03

because I can't get it back. It's being set in type.

2:05:05

Oh my gosh. And so on that story,

2:05:07

I always

2:05:10

remember Ralph Hallenstein, sorry, you're going to get an Uncle Jeff

2:05:12

going here with his old days, was

2:05:14

the news editor. You're getting lost in your stories.

2:05:17

Hello. He's a mondo smoker. And

2:05:19

at the end of the shift. Indoors? Oh yeah,

2:05:21

that was a back thing. Yes, I was old.

2:05:24

We used to smoke inside. And at

2:05:26

the end of the shift, honest to

2:05:28

God, the next shift would have a

2:05:30

ghoul pool about how many cigarette butts

2:05:32

there were in Ralph's ashtray. Ralph

2:05:35

of course died from lung cancer. But

2:05:38

then in came the

2:05:40

first computers in the Tribune newsroom and I was on

2:05:43

the midnight shift waiting for somebody to die a

2:05:45

horrible death so I could write about it. And

2:05:48

I was the kid who wasn't scared of them. So I played with them.

2:05:50

They couldn't do anything yet. And

2:05:53

so come the day when they were going to turn them

2:05:55

on, I was the only person who wasn't scared of them.

2:05:57

I trained the entire newsroom in the first

2:05:59

computers. I had to say, well this

2:06:01

is a cursor. You

2:06:05

have to put the cursor where you want to

2:06:07

do something. No,

2:06:09

no, no, don't hit return at the end of the line.

2:06:12

Trust me, just keep typing. No, no, no, keep typing.

2:06:14

It's smart, it'll do it. So

2:06:16

I learned computers early, early on. This one

2:06:18

got me all dirty. And the

2:06:21

final bit of Uncle Jeff moment is

2:06:23

that because I wrote so fast and

2:06:25

rewrite, it changed immediately the

2:06:27

way I wrote so I would write as fast as I

2:06:30

could to get a draft down and then

2:06:32

I spent every minute until deadline editing. And

2:06:35

so computers fundamentally changed how

2:06:37

I thought and wrote. And

2:06:40

that's what fascinates me. There's a book that

2:06:43

I absolutely love by a friend of mine

2:06:45

named Matthew Kirschenbaum called Track

2:06:47

Changes. It is a history of word

2:06:49

processing. Oh, that's

2:06:51

fascinating. I'm going to order that. Wonderful. It

2:06:54

is really wonderful. He's an English

2:06:56

professor at UMD. And

2:06:59

if you're into this stuff about how it kind of changes, it

2:07:01

changes the way we look at things. It changes

2:07:03

the way we write and we think. So sorry,

2:07:05

Paris, you asked me a simple question of Uncle Jeff

2:07:07

and Uncle Jeff couldn't stop. Well, now I want to

2:07:09

ask a question of Ed. I don't know how old

2:07:11

you are, Ed, but was your first

2:07:13

job, did you use a computer for it? Yes.

2:07:17

Though I didn't really

2:07:20

need a job when I was a kid. My

2:07:22

parents paid for school. So like I just I

2:07:24

was a games journalist. So I was just writing

2:07:26

on computers. But I do remember when I was

2:07:29

like nine walking around my dad's

2:07:31

office. My dad was a public

2:07:33

housing management consultant of sorts. And

2:07:37

so he had a Reuters terminal

2:07:39

on the old Reuters. And

2:07:42

I was fascinated by this thing. And I now know that

2:07:44

those things were probably worth like hundred

2:07:46

and fifty. It's way too much money. But

2:07:49

I was just fascinated by this idea that the news

2:07:51

would come through during the day. Just

2:07:54

as a kid, you were just like, oh, the news

2:07:56

exists only on television and on paper. But then, you

2:07:58

know, the computer has news, and it always had

2:08:00

more news. It's remarkable I

2:08:03

didn't break that thing. But

2:08:05

I tried. I mean,

2:08:07

I was messing around with it a great deal, but it's fascinating.

2:08:10

That was like my... probably my

2:08:12

most formative computer memory was

2:08:14

messing with that terminal. Because

2:08:17

it was just like the idea that information

2:08:19

came through in this manner. And it was

2:08:21

good information. It wasn't just like someone... like

2:08:23

this was clearly thoughtful, carefully

2:08:25

done stuff. Cool.

2:08:27

And also, you're onto something. When

2:08:30

I started my first news

2:08:33

sites in 1994, my children just

2:08:35

as the browser started, I

2:08:39

got the AP wire and I started

2:08:41

this page where it would update with

2:08:44

the entire AP wire every minute. It

2:08:46

gave us page views. And the public loved

2:08:48

it. It was hugely popular because it was

2:08:50

the entire AP feed. And you

2:08:53

could get whatever you wanted. It was like Dave Winer says, it

2:08:55

was a river of news. No judgment,

2:08:57

nothing else. You could just see the latest news. Readers

2:09:00

loved it. The AP effing hated it.

2:09:02

And they fought and fought and fought

2:09:04

and finally killed it. Wow. I was

2:09:08

going to say my

2:09:10

first reporting job used Google Docs,

2:09:12

Slack, Twitter. Oh, look at you. That's

2:09:16

fine. Ed,

2:09:18

do you have a pick of the week for

2:09:20

us? Something you like. Is it okay if it's

2:09:23

not like news or anything normal? Yes, it's very

2:09:25

okay. I believe my pick of the week once

2:09:27

was the phrase, consider the humble corn maze. So

2:09:29

anything goes. Okay.

2:09:31

So it's about that normal. So in 2003,

2:09:34

Metallica released the album St. Anger. It was pretty

2:09:36

much universally panned. They said that the drums

2:09:38

weren't right in it. There were no solos.

2:09:40

It was classically considered the death of Metallica

2:09:42

for quite some time. A

2:09:45

few months ago, about three weeks ago, Michael

2:09:47

Shea on YouTube. And not enough people have

2:09:49

found this yet. Michael Shea on YouTube. He

2:09:52

re-recorded and recut the entire album.

2:09:55

He kept James Hetfield's vocals, even, but

2:10:00

he re-recorded most of the album. He

2:10:02

finessed parts, he added basslines, rhythm guitar.

2:10:06

I talked to him briefly about it because I'm that kind of guy

2:10:08

to go and say like, this is amazing. And

2:10:10

he was like, I'm not a great guitarist, but

2:10:13

I know what good sounds like. And

2:10:15

so he basically took this album that kind

2:10:17

of sucked and made it really good. Like,

2:10:20

it's a very good album now. Lyrics are still dumber

2:10:23

than dog poop. It

2:10:25

was still very much a Metallica album, but he

2:10:27

added depth to the album. When it came out in

2:10:29

2003, so quite some time ago, I was

2:10:31

in high school at the time. I remember listening

2:10:33

to it with just an abject sadness. And

2:10:36

I wish I could go back in time and say, it will get

2:10:38

better. You'll be able

2:10:40

to do a job on the computer and

2:10:42

St. Anger will be fixed. But

2:10:45

it's so weird because this album has been

2:10:47

redone a lot. Three

2:10:51

years, two, three years ago, someone did one where

2:10:53

they re-recorded and they actually re-sang it as well. That

2:10:55

one did it wrong because they didn't accept the

2:10:57

problem. The inherent problem with St. Anger is that

2:10:59

it needed to be re-recorded. The

2:11:02

song ideas were good, but the actual fundamentals

2:11:04

needed to be reworked a bit. And

2:11:06

this guy, Michael Shea has done it. It's

2:11:09

genuinely a good album now. Invisible Kid, which is one

2:11:11

of the worst Metallica songs ever, now actually has some

2:11:13

depth to it. It's such a good album. I love

2:11:15

it. I've listened to it so many times. I listened

2:11:17

to the original a lot for more

2:11:20

mental health reasons, just like

2:11:22

it was just something that I damaged myself

2:11:24

with. But now this album is actually good.

2:11:26

And it's called St. Banger on

2:11:29

YouTube. St. Banger. Yeah.

2:11:32

Good title, too. Great pick.

2:11:34

A banger of a pick. Well,

2:11:37

thank you guys both so much for

2:11:39

being here on my twig takeover. Thank

2:11:43

you, Jeff Jarvis. As always,

2:11:45

you're always here. And thank you

2:11:47

so much, Ed Zitron, for coming and

2:11:50

joining us here in the

2:11:52

Leo-less void that exists on

2:11:54

the internet. Ed, where can people

2:11:56

find you? What do you want to plug? Okay.

2:12:00

You can find me on Twitter

2:12:02

at edzitron. That's at

2:12:04

Ed Zitron, E-D-Z-I-T-R-O-N. And you

2:12:06

can find me on Bluesky at zitron

2:12:09

dot bsky dot social. My newsletter is

2:12:11

Where's Your Ed At, wheresyoured.at, and

2:12:13

the podcast Better Offline at betteroffline.com, anywhere you

2:12:16

click podcasts. And we have all the links, you don't have

2:12:18

to ask me where to spot it, I think it's

2:12:20

all there. Please don't ask me.

2:12:22

Just great tweets, too.

2:12:26

Some top-drawer posts, a

2:12:28

real poster's heart. And I think that that's what

2:12:31

it's all about. And

2:12:33

that's who I am. Like, I grew up on the

2:12:35

internet posting. This

2:12:38

is my calling, and I feel like the posters will

2:12:40

rise. The posters will

2:12:42

rise again, and thank you

2:12:44

so much everybody for listening to this. Thank

2:12:46

you club twit members for subscribing

2:12:49

and making this podcast possible and Thanks,

2:12:53

everybody. Good night. Thank

2:12:56

you
