I Was Merely Fluffing - TikTok ban coming to the US? ChatGPT Windows 11 taskbar, Twitter job cuts

Released Thursday, 2nd March 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

It's time for TWiG, This Week in Google.

0:02

Ant Pruitt, Jeff Jarvis are here. Stacey's

0:04

got the week off, but Jason Howell from All About Android

0:07

fills in, and we had some great conversations. It

0:09

could be the end of the line for TikTok.

0:12

I'll play devil's advocate and say it's about

0:15

time. Well, also talk about

0:17

Microsoft. They say they've got the

0:19

new Bing-powered ChatGPT in

0:21

Windows 11. They lied.

0:25

And we'll talk about Twitter cuts

0:27

And who is in and who is out?

0:29

We may even have an inkling of who the next

0:31

CEO of Twitter will be. It's all coming up next.

0:34

On TWiT, podcasts

0:38

you love, from people you trust.

0:41

This is TWiT. This

0:48

is TWiG, This Week in Google,

0:50

episode seven hundred five,

0:52

recorded Wednesday March first twenty

0:55

twenty

0:55

three. I was merely fluffing

0:59

This

0:59

Week in Google is brought to you by Fast

1:01

Mail, reclaim your privacy, boost

1:04

productivity, and make email

1:06

yours with FastMail. Try it free

1:08

for thirty days at fast mail dot com

1:10

slash twit. FastMail is also giving

1:12

TWiT listeners a fifteen percent

1:15

discount in the first year when you

1:17

sign up today. Thanks

1:19

for listening to this show. As an ad-

1:21

supported network, we are always

1:23

looking for new partners with products

1:26

and services that will benefit our

1:28

qualified audience. Are you ready

1:30

to grow your business? Reach out to advertise

1:33

at twit dot tv and launch your

1:35

campaign now. It's

1:37

time for TWiG, This Week in Google.

1:40

The show where we cover the latest news from

1:42

the Googleverse. Ant Pruitt's

1:44

here in his pink hat. It's

1:47

not pink, but it's orange, you know. It's

1:49

orange. Right? Yes, sir.

1:51

Okay. Clemson orange. Hands-On

1:53

Photography host, also the community

1:55

manager for Club TWiT, and

1:57

just a man about the twit.

1:59

He's always here, which is nice, runs our

2:01

flaws show. I see every

2:04

Sunday now on Ask the Tech Guys. We're gonna do

2:06

a thing with you on Sunday. Yes,

2:08

sir. I am we're gonna have a therapy

2:10

session on

2:11

Sunday,

2:11

sir. That's what I'm calling it. So

2:13

Can we say -- can we get a couch or

2:15

something in here that Ant can lie on?

2:18

He's

2:18

switched from Windows to Mac and he needs therapy.

2:21

Oh, that's what I was -- like, who's giving the therapy?

2:23

And who's receiving the therapy? I have to know.

2:25

So I think Mikah is gonna talk him

2:27

off the ledge. We'll see. That's right. Oh,

2:30

boy. That's right. Also with us,

2:32

the director of the Tow-Knight Center for Entrepreneurial

2:35

Journalism at the Craig Newmark

2:40

Graduate School of Journalism at the City University

2:42

of New

2:42

York. Mister Jeffrey Jarvis,

2:44

hello, Jeff?

2:46

Hello there. I was called onto the show to convince

2:48

Ant that he did it all wrong. He should just come

2:50

to the Chrome. Come

2:51

to the fold. Like me. To

2:53

the proper side. Was that ever

2:56

an option at? Was that ever a consideration

2:58

for you? No.

3:00

Last Sunday on Ask the Tech Guys, I demoed

3:03

a new Acer gaming laptop.

3:05

Was that at all intriguing to

3:06

you? That was a nice Chromebook? It

3:09

was a nice Chromebook, but that did not intrigue

3:11

me at all because the

3:13

most I could do would be DaVinci Resolve

3:16

on a Linux partition. Yeah. That one

3:18

one. That's it. Yeah. No.

3:20

I just need to be able to turn the computer

3:22

on and get to

3:22

work. As a production machine and

3:25

not just like producing words on a screen

3:27

in a, you know, in a doc or

3:28

whatever. You need you need more than a I think

3:30

the Mac is a good choice. Yeah. I do too. Yeah.

3:32

Yeah.

3:33

I think I like it. Sorry, Jeff.

3:37

Although, you can play Sorry for ant.

3:39

I feel bad for ant having to go into

3:41

it from one empire into the

3:42

other. You could place some of the forest on it

3:44

though, which is pretty cool. Now,

3:47

normally, Stacey would be

3:49

here. Stacey's a little under the weather. But

3:51

that's good news for us because we've got

3:53

Jason Howell from

3:54

Hello! All About

3:55

Android. All About Android.

3:58

You got Carl Pei coming on the show?

4:00

Yeah. We yes. Big

4:03

announcement.

4:04

Oh, I'm sorry. Was it a secret?

4:07

I don't know if it was a secret. Yes, we do

4:09

have Carl Pei coming on the show in a couple of

4:11

weeks. So of

4:11

the Nothing company.

4:12

Yeah. CEO of Nothing.

4:14

That's a big deal. He was a big deal. He

4:16

was at OnePlus for a long time.

4:18

Yeah. He was he was kind of the OnePlus

4:20

guy for the longest

4:21

time. And then the parent company kinda

4:23

sucked it back in and bought

4:25

That's right. And a lot of people have not been

4:27

very happy about that. Yeah. I know

4:29

the reverberations of that. Yeah. And so

4:31

I think Pei said there's an opportunity here

4:34

with

4:34

Android, and he created the Nothing company.

4:36

Yeah. And they have the Nothing Phone. Do they

4:38

sell that in the US yet? The Nothing

4:40

Phone 1, now you can get it in the

4:42

US. Of course, you're not gonna get it in, like, carrier stores

4:44

and everything, but they are selling it online.

4:47

You can get it. They've announced though

4:50

that the Nothing Phone 2 officially

4:52

comes to the US. Like, you can get the Nothing

4:54

Phone 1, but I don't know, you know, you're probably

4:57

gonna have to jump through some hoops. The Nothing Phone 2

4:59

they're bringing directly to the US, and I think they even

5:01

had an announcement at Mobile World Congress that they're gonna

5:03

have the latest Snapdragon powered

5:05

in

5:05

there. So it's gonna be higher powered. Do they

5:08

still have the LEDs on the back? I

5:10

mean, why wouldn't they? That's the trademark.

5:13

The big trademark

5:14

of this device. Like, whether it's a good trade-

5:17

mark, I don't know, but it's different, at least.

5:19

I

5:19

know you said that part. It's just a thing.

5:21

It doesn't it doesn't do anything. Right?

5:24

I mean, it's well, you know, style,

5:26

of course, but also notification. There

5:29

there's some integration into the system

5:31

with, you know, notifications

5:33

alerts, charging status. That's

5:35

right, it'll show you how much your battery is

5:37

charged based on that. It means you can't put a case

5:39

on it, I suppose, because then you

5:41

wouldn't see the glyph.

5:43

Does it drain the battery, Jason?

5:46

Well, I mean, I have not used it. It's

5:49

LED. So it shouldn't use too much. Right?

5:51

Yeah. Yeah.

5:52

I mean, does it drain the battery more than

5:56

if the LEDs weren't there? Yeah.

5:56

But I don't know how much how much it's really,

5:58

you know. It's -- and I mean, I guess, it's not --

6:01

even though it looks like it's lines of LEDs,

6:03

it's probably just one LED with a

6:05

kind of a light-conductive channel. Yeah.

6:08

Yeah. So it's not you know, it's maybe

6:10

three LEDs. Phones used

6:12

to have those notification

6:13

lights. If

6:14

I kind of miss those -- Yep. -- to be honest, the little

6:16

alert light. I didn't enjoy that display.

6:18

Yeah. Bring them back. So this is I

6:21

understand. Anyway -- Yeah. -- it will be it's

6:23

a big get, and it'll be exciting to see what Karl

6:25

has to

6:25

say. Yeah. Definitely looking at Coming

6:27

soon to all about Android. That's right.

6:30

You

6:30

know what's coming soon in the United States? I

6:32

think a TikTok ban, a real life,

6:35

honest to goodness, TikTok ban.

6:39

The story, of course, a couple of days ago,

6:41

was that the federal agencies

6:43

now have thirty days to get TikTok

6:47

off devices. Now these are

6:49

devices that are owned

6:51

by the federal agencies and then used by

6:53

employees. I think employees could

6:55

still have it on their personal devices,

6:57

I would think. Christopher

6:59

Wray, who has been kind of virulently nasty

7:03

about China and and especially He's

7:06

he's one of the people who's saying COVID

7:08

was released from the Wuhan labs despite

7:11

the fact that, you know, there's disagreement among the intelligence

7:14

agencies. He's he's acting like, I don't know,

7:16

it's the Chinese. And now I think -- he really

7:18

he's been going hard against TikTok,

7:20

and now he might be getting his way because the

7:24

House committee

7:27

approved it, has

7:29

passed the bill

7:30

along. The Foreign Affairs

7:32

Committee voted on Wednesday along party

7:34

lines to

7:35

give Joe Biden the power to

7:37

ban TikTok in the

7:40

US. But

7:41

when you say that, it's like in the US dot

7:43

dot dot on government device.

7:46

Right?

7:46

No. No. No. This is, like, bam --

7:48

like, for everybody, for everyone.

7:51

That's No more

7:53

TikTok corner for us. No

7:55

kidding. The bill does not precisely

7:57

specify, according to Reuters, how the ban will

7:59

work. It gives Biden the power

8:01

to ban any transactions with TikTok,

8:04

which in turn could prevent anyone in the US from accessing

8:06

or downloading the app on their phones. It would effectively

8:09

ban it because -- Right. -- they would force

8:11

the Android Play Store and the Apple

8:13

Store to take it off the store.

8:16

Just as they did with Huawei, by the way. They've kinda

8:18

basically put Huawei out of business in

8:20

the US by saying no company could do business

8:22

with

8:22

Huawei. They could do this similar thing to

8:24

TikTok. You you would

8:26

still have it on your phone. If you had it on your

8:28

phone, you could probably still side load it.

8:31

Right? Like, that's that's one way possibly

8:33

that these these

8:34

things, you could use

8:35

it on the web. You

8:36

could use it on the web. Yeah. Right. They're not --

8:38

they're not blocking, like, website

8:39

access, I wouldn't think. And

8:40

TikTok would, of course, spend a lot of

8:42

money. There's a First Amendment case here. Yeah. I mean,

8:44

I'm sure well, I don't know. We've

8:47

banned we banned Huawei.

8:49

Yeah. So if this is

8:51

the users' own platform for

8:53

speech. Well, it seems

8:55

to me. And I and I have to say, Mike Masnick

8:57

at Techdirt used this

8:59

phrase, you'll be happy to know moral

9:01

panic. Oh,

9:02

man. Thank you.

9:03

Karl Bode, actually. No. It's

9:07

yeah. It's in there. Our growing TikTok

9:09

moral panic still

9:12

isn't addressing the actual problem. The

9:15

problem really being, you

9:18

know, and honestly, I understand

9:21

why a company or government agency would say

9:23

ban TikTok. Just like,

9:25

for instance, the military banned

9:27

Strava the running app because

9:29

it turned out

9:31

a lot of military were using it to

9:33

map their runs.

9:34

And it was certainly

9:35

giving a map of the inside of the Pentagon. So it's easy

9:37

to understand why

9:38

the government would -- but is the ban for government users or for

9:41

everybody? For government. Oh, government users.

9:43

There's you know, I understand

9:45

that, and I think that's you

9:46

know, certainly within their purview, and I guess

9:48

you could make a case. But honestly,

9:50

do they ban Facebook? I

9:53

think that's a that's a point that Karl

9:56

really makes in this article that is absolutely

9:58

true, which is okay. So we can look

10:00

at TikTok and we can say, hey, you're doing all this

10:02

stuff. You're collecting this data, you know, we don't know

10:04

exactly what's happening behind the scenes of that data

10:07

is that transferring over to the Chinese government, blah,

10:09

blah, a lot of people assume or think, you know,

10:11

that they have the details that point to that actually

10:13

happening. But then we've got all of these other apps

10:15

that are on our phones. Facebook, Instagram,

10:18

they're all tracking our location. They're all doing

10:21

stuff with our data. Why is it different

10:23

over here versus over here?

10:25

I mean, you it it really doesn't seem to solve

10:27

the problem. If if you have a problem with this

10:29

kind of information, you know, data brokers

10:31

and everything, having access

10:34

to that information, then do something about

10:36

that. Why is it so lucrative to trade in

10:38

people's

10:38

data? And maybe that's the problem. You

10:41

said you said another key phrase in there

10:43

as far as TikTok. You said that

10:45

we don't know what they're doing

10:47

with that information once it goes back

10:49

to China. Hey. We don't know what to do with

10:51

the information on those three platforms. You just

10:53

mentioned, that are here in the US.

10:55

So so why aren't we -- which

10:58

is Karl Bode's whole point -- doing something, other than just trying

11:00

to -- Karl

11:00

writes. A couple months --

11:01

Karl writes. That's what Karl wrote. All right.

11:03

Yes. TikTok plays fast and loose with consumer

11:06

data. So does nearly every other

11:08

foreign and domestic app and service on

11:10

your phone. From apps that track and

11:12

monetize your every waking movement in

11:14

granular detail to apps and

11:16

services that casually traffic in

11:18

your mental health specifics. And that's

11:20

before you get to the telecom industry, which

11:22

has pioneered irresponsible collection and

11:24

monetization of user info. All

11:26

this data feeds into a massive and

11:28

intentionally confusing data broker

11:31

market. Remember those two words data

11:33

broker -- that regulators have been generally

11:35

disinterested in seriously

11:38

policing, lest US companies

11:41

lose

11:42

money.

11:42

I don't know. We don't wanna pass a privacy law

11:45

or be tough on data brokers, app makers, OEMs,

11:47

telecoms, because rampant surveillance and

11:49

data monetization is simply too

11:51

lucrative. So in a way, this is hand

11:53

waving and saying, pay no

11:56

attention to what's going on in the

11:57

US? It's China.

11:59

China. It's political. Yeah. Political

12:01

points. We

12:02

had a lot of distraction. And

12:04

and and data brokers -- Acxiom has been

12:06

around long before the Internet. And talked about

12:08

this before, where where I've freaked

12:10

out students by showing how I can at the names

12:12

and addresses of women two

12:14

miles from me who have these characteristics,

12:17

which you can't do otherwise. You

12:20

you Google doesn't know all that. But

12:22

but Acxiom does, and it's supported

12:24

and used by magazine companies and media companies

12:26

all

12:26

over.

12:27

Well, as Karl points out, it's

12:29

trivially -- trivially easy for the Chinese,

12:32

Iranian, or any other government intelligence agency to

12:34

buy -- to buy this data from

12:36

the data brokers. That is a really great point. They don't

12:38

need TikTok. It's all

12:40

available if they want it. The other

12:42

claim, Bode says, is the Chinese government will use

12:45

TikTok to fill US kids' heads with

12:47

gibberish and propaganda, but

12:49

-- I love Karl. He's been

12:51

on the show. We gotta get him on again. Not only is there no

12:53

evidence that's actually happening at scale, it's a rich

12:56

concern coming from a country so

12:58

inundated in authoritarian propaganda,

13:01

across AM radio, Fox News, to the Internet,

13:04

radicalizing people who increasingly engage

13:06

in widespread domestic terrorism. Yeah.

13:09

It it's it's coming from within the house.

13:12

But it's really I think far

13:14

too easy, and I think Christopher Wray is is

13:16

an example of this, to just

13:19

deflect blame by blaming China. There's

13:21

also the FCC commissioner who's been calling

13:23

for TikTok's ban -- Oh. -- who is notoriously

13:25

lax on regulating telecom because

13:27

he takes money from the telecom industry. So

13:31

I I think this is lip service only to privacy.

13:34

It's it's politically expedient because

13:37

Blame China seems to be the popular

13:40

thing. Right? Yeah. Alright. Even

13:42

when we say this, people say, well, why are you giving

13:44

aid and support to the enemy? Well,

13:47

like, we're -- I'm giving aid and support to Elon

13:49

by being there.

13:51

Not me. I withdrew my aid and support

13:53

to Elon. The ACLU

13:55

on Twitter, though, did did

13:57

tweet. This bill is a serious

13:59

violation of our first amendment rights.

14:02

Congress must vote no on this vague,

14:04

overbroad, and unconstitutional

14:08

legislation. Somebody

14:10

in the chat room has pointed out, and I think it's it's

14:12

true that, yes,

14:15

there's first amendment rights. But as soon as it comes down

14:17

to National Security, the court

14:19

well, look at the what just happened in the

14:21

Supreme Court. They

14:23

decided to not review the

14:27

Wikimedia lawsuit

14:29

against NSA's collection of data

14:32

blanket collection of data, saying national

14:35

security. And the NSA's defense

14:37

wasn't, we're not doing it, or

14:39

we're only targeting international

14:41

people. They didn't say any of that. They just

14:43

said, hey, national security. If we were to

14:46

stop, that would hurt national security. Think

14:48

of the children. So I

14:50

think that's the problem is that, yeah, the ACLU

14:53

could fight

14:53

this. They could take it all the way to Supreme Court. And the

14:55

Supreme Court would say no. National Security.

14:57

And even without any evidence, without

15:00

any known evidence, without any actual discussion of

15:02

what's happening. What is The problem with all of this

15:04

is we don't talk about actual harm. We

15:06

talk about

15:07

fears. We talk about what-ifs.

15:09

But the evidence of harm is not

15:12

Alright. I am gonna be because somebody has

15:14

to argue this. And I think there

15:16

are people it's a legitimate point of view. There are a lot of

15:18

people in our audience who say, no. Wait a minute. A,

15:21

it's just a social media app. What's the harm in

15:23

banning it? B, it's pretty clear the Chinese

15:25

government has access to anything

15:28

TikTok has access to. C,

15:30

we already know TikTok employees

15:33

were using location information for TikTok

15:35

to find out who

15:37

journalists were meeting with. They were trying to track

15:40

down a leak. So they so they looked

15:42

at where journalists American journalists were

15:44

on their TikTok app. To see if they were

15:46

meeting with any TikTok employees. So

15:48

we already know they've reached that boundary.

15:51

So why shouldn't we ban TikTok? I mean,

15:53

This is this is pretty clearly a hazardous

15:57

app. Yeah. There are other problems.

15:59

I don't even say this. There are other

16:01

problems. We should some day, let's

16:03

take a look at Facebook and telecoms too. But

16:05

right now, we got this proximate problem

16:07

with the with

16:08

TikTok. What's wrong with

16:10

that? That doesn't answer the national problem right now. We have

16:12

a national problem with

16:13

Facebook and Instagram and everybody

16:15

tracking our own people, our

16:17

own citizens in in trying

16:19

to target stuff for us for their own profitable

16:21

gain. Yeah. So that's happening right now.

16:24

Yeah. Well, okay. Well So Fine. We'll

16:27

get to that. But but right now, we're

16:29

get to that. We'll get to that. We'll get to

16:31

that. This doesn't mean get in line. It's

16:33

not a defense to say, well, everybody does

16:35

it. That's our own backyard. That's

16:38

the stuff that I've always been

16:40

bugged about with our government as we do

16:42

so many things that are and and

16:45

I'm I'm gonna piss some people off. We put

16:47

our nose in a lot of different business elsewhere,

16:50

but we really do a really craptastic

16:52

job of taking care of our home backyard and

16:54

issues right here.

16:57

Yeah. But why should we let a

16:59

foreign government have access

17:02

to Well, but to your to your mister

17:04

devil's advocate, your

17:06

argument about the employees going

17:08

after a couple of

17:09

reporters. As a reporter, I shouldn't like that. But,

17:11

a, they got fired. And b, that doesn't

17:14

They got fired when they got

17:15

fired when they got fired. Yeah.

17:17

And what it demonstrated you're right. It doesn't affect National

17:19

Security, but it demonstrated they have the ability

17:21

and the will and the ability

17:23

to do that. Somebody did. Yeah. But within

17:25

TikTok, Now let's say president

17:27

Xi or his proxy goes

17:29

to mister TikTok and says, hey, I

17:32

like I like that same information, please.

17:34

I noticed you did that.

17:36

Well, here's the thing -- he can't say no to Xi. The point,

17:38

mister devil's advocate,

17:41

is that we're setting a precedent here,

17:43

devil. To

17:44

be used by everybody. You keep saying that

17:46

over and over so that nobody thinks you actually

17:48

espouse this view. I don't think

17:50

so. But I you know what? I do have to I should probably

17:53

recuse myself because my son makes

17:55

a living on TikTok.

17:57

That's a He's an example of impact. And this

17:59

is what some people say is Yeah. But think of

18:01

all the creators who use TikTok, legitimately

18:03

hurt. Somebody like my boy who

18:05

has made a very good living. He's now got a cookbook.

18:08

He's talking about TV show. He was

18:10

-- Mhmm. -- he was in Miami making cubano

18:12

sandwiches with Guy Fieri, all

18:14

because of TikTok. Yep.

18:16

Yeah. But don't worry. YouTube shorts just crossed

18:19

fifty billion daily views. Please. So he could

18:21

just go on YouTube. He's on Instagram.

18:23

It's TikTok that really drove his

18:25

success to point one million followers. And

18:28

it was the key. Actually, you know,

18:30

he was very smart. He he he you

18:33

know, I we had this conversation. We went to lunch.

18:35

It was, like, a year and a half ago. And

18:38

he had, like, about thirty thousand followers on

18:40

TikTok. Doing these cooking

18:42

things. He said, what do you think? I said, I think you should go

18:44

all in and and push

18:46

it. And he studied the algorithm. He

18:48

figured out what people wanted. He looked at comments.

18:50

He really got involved,

18:52

carefully tailored something to TikTok, and it was a

18:54

huge success. He was able to do that. And

18:56

he did it really and he's created career

18:58

around it. And he's and it's it's giving him a

19:01

living. So that's a harm. So and

19:03

now oh, I'm sorry. Wait a minute. I took off my devil's advocate.

19:05

Where'd I put it? So But I

19:08

actually asked him. I said, what do you are you worried about

19:11

TikTok being banned? He said, no. I'm I'm I got

19:13

I got the Instagram now. There I

19:15

make more money. He gets more engagement on Instagram.

19:17

So -- which does not mean

19:19

that that he didn't he used TikTok to

19:21

get to the next

19:22

level. Yeah. That's that was just

19:24

the whole part of being a content creator,

19:26

though, sir. Is being able to

19:29

not necessarily depend on one

19:31

particular

19:32

product. You you sort

19:33

of market yourself everywhere because

19:35

stuff goes away, especially when there's

19:37

companies like Google involved and all of sudden

19:40

they kill something

19:40

off, you still aren't able to My point exactly.

19:43

That's why we should ban TikTok because there's

19:45

other avenues. It's not like we're gonna lose

19:47

anything. So

19:52

why should we let the Chinese spy

19:54

on our people. It's not like it's the

19:56

only way that that that people can

19:58

create and and put stuff

20:00

out. Let's let's

20:01

understand the dialogue. Mosseri,

20:03

Mosseri, or whatever his name is, Adam

20:06

Mosseri at Instagram. Why should we just

20:08

allow him to dig into the analytics

20:10

of us and I'll get to assume that we're gonna

20:13

legislate that later. We're gonna

20:15

get to there on Instagram. committee's looking

20:17

at the

20:17

platform, you know, but we we

20:19

feel that you all really want to see more videos

20:22

and we noticed that you're all not watching you

20:24

live in

20:24

it, so we're gonna dial it back just little

20:26

bit. So I

20:27

Do you see anything here? A Chinese company

20:29

should have the same rights and privileges as a US

20:32

company. Well, here's the issue, mister

20:34

Senator Devil. Is that

20:38

other countries have said the same thing about us.

20:41

So but they're they're right.

20:44

The EU, Canada, Brazil have

20:47

all tried to pass laws saying that America

20:49

does horrible things with data. That

20:51

number one, at least, data should be stored locally.

20:53

Though, of course, that's so the government can get hold of it in

20:55

that

20:55

country. Or number two, that we shouldn't

20:58

trust America at all. We should ban American apps.

21:00

That's fine. Senator president. President.

21:02

China. Let him surprise us. Because

21:04

we make the Gosh darn best apps in the whole wide

21:07

world. And if and let them try

21:09

it. Just like France try it and

21:11

Spain try to ban Google, let them

21:13

try it, because they'll come crawling back to us.

21:15

Because we make the best damn apps in the

21:17

world. Well, you really do

21:19

a great demo of that. Yeah. Let's fire it

21:21

up right now. See?

21:27

Yeah. No. I see. It's a good

21:29

argument. Great devil's advocate. Well,

21:31

I'm running for Congress, and I'd like to have your vote,

21:33

mister Pruitt. You

21:36

know, black people love me

21:38

too. They love me. Right. I just want you to know. Right.

21:40

Right? I

21:41

know, as Trump said, you know, black people -- I

21:43

should say, the blacks. The blacks. They

21:46

love me. They love me. So

21:48

just warning you, you know. Oh,

21:51

good. Actually Biden. Didn't didn't Biden

21:53

say -- he said if you don't vote for

21:55

me, you're not black. There's another that's another

21:57

good one. That's Yeah. Good one. Mhmm.

21:59

Jeez. You ain't

22:00

black. What are you talking about? Yes,

22:02

I am. Well, I'm not. But,

22:05

anyway, Good.

22:08

Okay. I think I I think I I think

22:10

he's stumped. Yeah.

22:10

I think he's stumped.

22:11

You did your best. You did your best, bud.

22:13

Did I win the day?

22:15

I

22:15

did win the day. You're all shut up. I didn't.

22:18

Well, so what happens then in

22:20

this world where the devil

22:21

wins? Yeah. And TikTok

22:24

is banned in the US. Now we go after Facebook

22:26

next. Is that what happens? Because you told me that we would.

22:29

Or or China is still in the negotiations.

22:31

No. I no. I think it's a mistake

22:34

to say that a Chinese company

22:36

should have the same rights as an American company. If

22:38

you're an American company, that's different.

22:42

Protectionism, man. Protectionism all

22:44

the way. No. But

22:44

also, those are those are bad

22:46

guys. We're the good guys. No

22:49

Chinese rice, only rice from Sacramento.

22:53

So so if an

22:56

immigrant comes in and breaks into my

22:58

home and decides to steal some of my stuff,

23:00

that's a problem. But if my

23:02

neighbor that's been here for all of their

23:04

life decides to come in and steal my

23:06

stuff, That's okay because he's been here

23:08

all his life. Is that what you're

23:10

saying? Well

23:12

now, if you really want to get down a law and order,

23:14

that's another conversation.

23:19

Hey, Nick. He's an American, so I'm just

23:21

kidding. No. No. But it ain't But

23:23

we're not talking about individuals. Stealing stuff

23:26

from you. We're talking about companies doing

23:30

their job companies. Oh, a

23:32

bunch of but I don't trust look at it.

23:34

Don't you know that any Chinese company

23:37

has to do the bidding of the Chinese government? That's

23:39

the way it's constructed. Right?

23:42

The CCP

23:42

And an American company does the bidding of people

23:44

like Elon Musk? Well, that's different. He's

23:46

just a private citizen. It's not like they I

23:48

mean, it's pretty obvious that

23:51

the federal government told Twitter what

23:53

to do, but we've investigated

23:55

that. Thank you very much. Just

24:00

I think you could say on the face of it that

24:03

that you're asking for the

24:05

rights of a Chinese company to

24:08

invade our privacy to,

24:10

you know, propagandize

24:14

our youth. You're saying that they have the right to do

24:16

that? I say they

24:17

don't. Alright.

24:19

Are you done with your can you take your hat off?

24:22

I think that's permanently stuck on at this

24:24

point. I'm a little Look, I'm starting to like this. I think

24:26

I might run for Congress. I got the

24:28

time.

24:28

Yeah. You're you're incredibly

24:30

persuasive. Yeah. We also have

24:32

the coffee stains on the shirt.

24:37

I didn't go to Harvard for eighteen

24:39

years to be treated

24:42

like this. Do

24:44

you know our fresh ice skater? Don't you? Alright.

24:47

Moving on. You

24:49

know, it is funny, though, that

24:51

when you dress like this, it does kinda

24:53

make you more of a Republican. Just

24:57

I'm not kidding.

24:59

I'm not kidding. I just feel more

25:01

I feel like I'm right. Oh

25:04

gosh.

25:06

Dress for success. That's what the So do you

25:08

think? So I don't

25:10

know if it's related or not, but but

25:14

we were talking last week about how the

25:18

International Trade Commission might ban

25:22

Apple Watches. And

25:24

they're potentially gonna ban the import of

25:26

Apple watches because they believe

25:29

Apple infringed on a patent from a company

25:31

called AliveCor for the

25:32

EKG. This thing that you you

25:34

use -- the AliveCor Kardia, Jeff.

25:36

Right?

25:36

Right. Right. Kardia. Where -- who,

25:40

who where? What con is it international? Country?

25:43

What? US? US? US. US?

25:46

They they would ban the import of Apple watches

25:48

because they say they violated

25:50

AliveCor's patents. Now it's interesting. It's more

25:52

complicated than that because the US Patent

25:54

and Trademark Office invalidated

25:57

AliveCor's patents minutes before.

26:00

ITC said we're gonna ban it, but that's

26:02

another thing. That's another thing. So

26:04

so Apple

26:06

went to president Biden said, would you please

26:09

veto that ban. We

26:11

don't we you know, this and he refused

26:14

to. So then I don't know if that's

26:16

related or a piece of evidence, but he refused

26:18

to. Do you think Biden, who now

26:20

has the encouragement of

26:22

the

26:23

of the US Congress, to ban

26:25

TikTok. He now has the power to do that. Do you think

26:27

he will do that? Biden's

26:29

a a Luddite at his core. I like I like

26:31

Joe. Joe's my man, but Biden, on Section

26:34

two thirty, on TikTok, on this kind

26:36

of stuff, he he's got people in his administration who

26:38

are reflexively moral

26:40

panicking.

26:41

Yeah. Remember, Biden's administration sent

26:46

a brief to the Supreme Court against section

26:48

two thirty. Yeah. Oh, yeah. He said it

26:50

when he was campaigning, he thought two

26:52

thirty should be overturned,

26:55

which is protect the the twenty six words

26:57

that created the Internet. So

27:02

It's unknown. But I yeah. I'm worried about

27:04

Biden on this. I think he could actually do it. I think he'd actually

27:06

ban TikTok. What would

27:08

that but really, what would be the real consequence of

27:10

that besides the fact that we'd have to stop our TikTok

27:12

segment. Oh, shucks. Hey.

27:15

Hey. Hey. I mean, I'm I'm

27:17

I'm authentically, like, curious to

27:19

know, like, obviously, if you can't get a

27:21

TikTok app in the US anymore,

27:24

like, if it's truly and completely cut

27:26

off. Like, people are gonna have to stop

27:28

using it. But, I mean,

27:30

man, it is so popular.

27:33

Like, when I think of of kind of like

27:35

teenagers, like, the the youths of

27:37

today. And TikTok is

27:39

ingrained into the fabric of

27:41

being a teenager. Right now. It's it's

27:43

part of the technological experience

27:47

if you're a

27:47

kid. True of your

27:48

kids, Jason? No. My kids

27:50

do not have access to TikTok. Yeah.

27:52

Yeah. Too too young at this point.

27:55

Fine. Good parent at the table. Trying

27:58

trying failing sometimes, but trying

28:00

I mean, the the but the drive is there. The desire

28:03

is there. And actually -- Yeah. -- even if TikTok

28:05

is banned,

28:06

all that TikTok content ends up on YouTube

28:08

anyways. Right. Or Instagram.

28:11

No. It's not like it's not like the content that was

28:13

processing. There is. We do know we

28:15

have some experimental data. Kay Ivey

28:19

in Alabama banned TikTok on government

28:21

devices, which also banned it in state

28:23

run universities. So --

28:24

Oh, okay. -- University of Alabama -- Oh, and

28:26

I'm sorry. -- ain't no TikTok. Access

28:28

as well. Yeah. So many

28:31

researchers I know are now having problems

28:33

with

28:33

this.

28:33

So on campus, that means TikTok

28:36

on the low local network

28:38

at the

28:38

Wi Fi is completely inaccessible. Yeah.

28:41

They block it. Still obviously getting it through

28:43

their cellular network. And, of course and and I think

28:45

that's one data point, which is didn't

28:47

really faze the students. Well,

28:49

because they already have it on their phone anyways through

28:51

their data plan. Yeah. They just they just use the

28:53

phone's data. Right. But if

28:54

they can't get it there, I don't know.

28:57

Auburn banned TikTok and

28:59

students can't stop talking about

29:01

it. This is in New York

29:02

Times. This is last week.

29:03

Is Auburn a state school or it was private

29:05

school?

29:06

Well, I don't know. Yeah. Auburn is a

29:09

state school. Yeah. Oh, okay. Oh, okay.

29:12

I didn't even know that. So a

29:14

senior at Auburn. So surprised

29:16

last month about a new ban on TikTok. She

29:18

read the news alert about it aloud to her friends.

29:20

We were like, oh, that's weird. Why would she

29:23

do that? And laughed it off and

29:25

moved on. She is the editor

29:27

in chief of the campus newspaper, which

29:29

has its own TikTok account, I

29:33

think they're gonna stop posting on TikTok,

29:36

but I don't think the the students are gonna stop

29:38

using TikTok. Right? Nineteen governors

29:40

have banned

29:41

TikTok. And and and

29:43

what what what do they have in common? They're

29:45

all Republicans, conservatives. Yeah.

29:47

Yeah. They're banning abortion

29:49

too. So Well, should we get more And I

29:51

should point out that the the

29:53

congressional committee that banned it was pretty much

29:55

party line vote. There

29:58

wasn't a hundred percent party Right? No. There was

30:00

one one or two Democrats voted in favor

30:02

as well, I think. Let's

30:05

see. Let me look at this. This is the the

30:08

House Foreign Affairs Committee Twenty

30:11

four to sixteen. Yeah. So

30:13

that's not pure party line. Yeah.

30:17

Democrats opposed the bill, saying it was rushed

30:19

and required due diligence through debate and consultation

30:21

with experts.

30:23

I don't have the actual vote count

30:26

here. But

30:27

Who did -- well, they said along

30:29

party lines. So as I

30:31

remember, another source said,

30:33

I think one Democrat voted to

30:35

ban as well. But

30:38

I is it it's not really partisan? It's

30:40

not a partisan

30:41

issue? Is it? What isn't partisan

30:43

today? No. There.

30:46

Good point. But this is the anti-China, you're-

30:48

gonna-be-tougher-on-China wing's

30:51

take. Right?

30:52

I've kind of I've kind of felt like the the

30:55

big tech, you know, anti tech movement

30:57

kinda crossed party

30:59

lines, to be honest. No. I really Oh, that's

31:02

Yeah. It seemed like it was

31:04

kind of in vogue right now to be very anti-tech, regardless of

31:06

ideology. And this one

31:08

falls -- falls firmly in there.

31:09

Is it our duty to defend tech

31:12

as a tech

31:14

The freedoms that the that the Internet

31:16

gives us is not the tech.

31:17

Yes. It's we're defending the Internet and

31:20

the freedoms we got. That's

31:22

my next book. Because

31:26

the Internet isn't tech. It isn't

31:28

wires. It isn't tubes. It isn't these

31:30

companies. It is the ability

31:32

of people who never could be heard

31:34

in mass media to now speak. Yeah.

31:36

It's it's the voices. It's the community.

31:38

The the the black Twitter event that we

31:40

held at the school two weeks ago, you

31:43

know, you look and and and the

31:46

the peril that it's in in Musk's hands

31:48

now. But you look at what we lose there,

31:51

the power of the movements that came there. And those

31:53

movements are precisely what scare the

31:56

old white guys who look like me.

32:01

I think it's also our duty, those tech

32:04

journalists, to

32:06

weigh the evidence and

32:09

to be honest about when tech

32:11

isn't

32:13

necessarily right or good.

32:16

Oh, yes. Oh, for sure. That's what I -- I

32:18

agree with

32:18

that. I don't wanna unilaterally defend

32:21

big tech and just say,

32:22

oh, no. No.

32:23

I totally agree with that. We need to criticize them.

32:25

Absolutely. But we also have to separate out

32:27

those things which are Tech's fault

32:29

or the Internet's fault versus those things that are

32:32

innate in our society. Tech

32:34

and the Internet didn't make us racist. Didn't

32:36

make us racist. We came into the game. Yes,

32:38

sir. Say Tech did not make us

32:40

racist. We already were. And

32:42

so when we blame the Internet for saying,

32:44

oh, it increased racism in America. No. No.

32:47

It was brought out the opposite. It

32:49

brought well, it brought out voices who

32:51

were not heard in a white

32:53

hegemony, and the white hegemony wants

32:55

to burn the place down now as a

32:57

result. That's what's really happening.

32:58

Mhmm. Yeah.

33:02

Just

33:02

processing.

33:02

It is certainly something we we on

33:05

all of our shows, we are always struggling

33:08

with is to find the truth. Right? To and

33:10

I think we try to be honest and not

33:12

partisan. I know people a lot of people say we're partisan

33:15

because you're such a lefty Jeff

33:17

Jarvis. Jeff, but Well

33:20

well, senator devil. But

33:23

I think, really, we know that it's our

33:26

mandate to dig.

33:29

And and and because we know tech to dig

33:31

as quickly as we can, it'd be critical of it, but

33:33

also defend it as needed. And and

33:35

but to really look for the truth of it. And

33:37

the truth isn't always obvious, but,

33:41

you know, tech is a double edged sword like

33:43

like many things. It can be used for good

33:45

or ill. Mhmm.

33:48

And it used to tech used to be

33:51

a cool thing that I that that people

33:53

were into. And now it's an

33:55

integral integrated part

33:57

of the fabric of life. It's society. Yeah.

33:59

It's it's everything. Yeah. So it's

34:02

so it's important, I guess, is my point.

34:04

For us to look

34:06

at both sides of that coin. I want

34:08

tech to be

34:09

better. I want because I love

34:11

technology. Right.

34:12

I love it when it works. Right. And

34:14

if it's not working, that's why sometimes

34:16

with the TikTok story, it's hard for

34:18

me because it really goes back for me to,

34:20

okay, I get the argument that you're

34:22

making against TikTok, but

34:24

show me the

34:25

proof, show me that this horrible

34:27

stuff is actually happening before we vilify

34:30

--

34:30

Yes. -- and remove it. Yes. Well, okay. Senator

34:33

Devil here. Oh, god. He's back.

34:38

The worry in that is that we do not know --

34:40

we wouldn't know. Yeah.

34:42

How would we know

34:44

if China was using

34:46

TikTok. We can't know. It's behind

34:48

the scenes. We can't tell. We can't tell. And

34:50

by the time you know, it may be too

34:52

late. Yeah. So why

34:53

not? Cut it

34:56

short. Why not just say -- you

34:57

could say the same about the Murdochs. Where do we

34:59

draw that line?

35:00

We don't need TikTok. Because

35:02

you could say that about a million different things, and

35:04

you could draw that line anywhere you want

35:06

and kill, I would say, a bunch of different

35:08

things. It's a pretty easy thing to

35:10

do. If it's something

35:12

from an enemy nation -- Yeah. --

35:15

we don't allow Russian social

35:17

media in the United States. Right?

35:21

Oh, we can oh, I can use that

35:24

today. Sure I can. What -- what

35:26

you what is Russian social media? There

35:30

used to be -- Pavel Durov created the Russian

35:32

Facebook. In fact, that's

35:34

why he left Russia, because -- and and

35:36

founded Telegram. Because telegraph --

35:40

Telegram. Because he was --

35:42

it was taken from him. Right? So

35:44

what was -- is that still around? VK.

35:46

VKontakte. It would be VKontakte.

35:49

Yeah. VK. Can you use it right

35:52

now? Yeah. Sure.

35:53

Well, that should be both answers. Telegram?

35:57

How about, you know, in fact I can I

36:00

think I can get it on let's see if I get my

36:01

phone? Six hundred fifty six million users

36:04

as of May twenty first -- Mhmm. -- twenty twenty. Tele-

36:06

Telegram. No. We're talking about VK.

36:08

Yeah. Oh, VKontakte. Which

36:10

was Durov's original

36:13

creation was taken from him. And

36:16

then he went off and

36:17

Yeah. I can install I can install it from

36:19

my my endpoint. Notice

36:21

they call it the largest European social

36:24

network. Russia's always

36:26

wanted to be European. Haven't they? Well,

36:28

alright. I got a better example.

36:31

WeChat. WeChat. And

36:33

we know we know We know WeChat is

36:35

used by Chinese, military, and

36:38

Chinese government officials to contact

36:42

overseas Chinese to

36:44

say, hey, we know you have family in China.

36:48

You you might wanna consider what you're

36:50

saying. We know they use it that way.

36:52

WeChat. And we

36:53

use it we use these services

36:55

in Iran to try to undercut an

36:57

evil government. Right? So

37:00

why don't we ban WeChat?

37:02

It's available in the App Store.

37:04

And what's it gonna accomplish if we do?

37:06

That's that's the funny thing. Same

37:09

thing that it would accomplish if they got rid

37:11

of TikTok. I mean and and I can't that's

37:13

kind of part of my point that I was

37:15

making earlier is if TikTok goes,

37:18

I think that line gets drawn further and further,

37:20

potentially, you know, the WeChat or whatever.

37:23

And I I guess the reality is I don't

37:25

know whether it's whether it it

37:27

is a good idea to do it or not. Like, maybe,

37:29

you know, maybe there are reasons and everything, but

37:31

it just seems like that line is nebulous

37:34

and continues to stretch out if if you

37:36

go

37:36

there. I got another question, Jason, out

37:38

of that. Because you were talking before

37:42

about the the companies that now are proprietors

37:44

of the net. I'm curious if

37:46

if we go ten years forward. Do

37:49

you think that Facebook will still be around?

37:52

Do you think Google will still be dominant? Do

37:54

you think Amazon will still be dominant? Do

37:56

you think that some of these companies are are

37:58

really long term? Or are they like

38:01

FriendFeed and Myspace,

38:04

that they are evanescent. What

38:07

do you guys think? I

38:09

mean, I I think I place

38:11

a Facebook and a Google and an Amazon

38:14

in the same category as a Microsoft and

38:16

Microsoft has been around a very, very long

38:18

time and has changed and transformed

38:21

over

38:21

the years. It's less

38:23

powerful than it was, but still around

38:25

and

38:26

still actually pretty powerful. Like, Microsoft's

38:28

not a, you know, not

38:30

a -- Yeah. -- a small company. They they may

38:32

they're not as powerful and influential

38:35

as they once were, but they've certainly turned

38:37

things around. And I think think Google

38:39

is kind of in that that part right now. I don't think

38:42

Google's going anywhere. I think in ten years, absolutely,

38:44

Google will be around. But do I

38:46

think that a TikTok is going to

38:48

be the next Google in ten

38:50

years? I don't know. Oh, yeah. I I mean,

38:52

it's -- So the reason -- I put them in

38:54

the same class. I put Amazon

38:57

and Microsoft in the same class. I don't necessarily put

39:00

Google and Facebook in the same class as

39:02

Microsoft. I think Amazon has

39:04

that long that the long tail,

39:06

you know, they've they've started

39:09

out just selling books and it continued

39:11

to grow with this very, very long vision

39:13

to get to where they are now. And

39:16

before Bezos was out, I'm pretty sure

39:20

the vision was was put into Jassy --

39:22

is that the current CEO? -- put into

39:24

his head in in the leadership to figure out,

39:27

hey, we can still continue to grow and continue

39:29

to just do all the things

39:31

the right way from a data analytics

39:34

standpoint and just just quietly

39:36

keep growing and quietly just keep gobbling

39:38

things up, all while continuing to have,

39:40

like, some small businesses be on the plat-

39:43

form and benefiting too, and just

39:45

keep taking over the world one bit at

39:47

a

39:47

time. But what's your point, Jeff? What what

39:49

is it? I don't

39:50

I don't understand. That that we we we

39:52

demonize the present proprietors. We say

39:54

these companies are rural in the world. I don't know. No. No. We

39:56

got you. We never got to do something. Right? Microsoft,

39:58

we the browser is they're they're screwing everything

40:00

up and the EU and so on and so

40:02

forth. And I think that we end up fighting yesterday's war

40:05

all the time. There's somebody new out there coming.

40:07

Maybe. But remember, in the

40:09

teens, and I'm talking about the

40:11

nineteen teens. Oh.

40:14

Antitrust law was created because of

40:16

the robber barons, because of the extreme

40:18

power, Standard Oil, and the

40:20

railroads. And

40:23

they had such dominant market positions.

40:25

There was nobody who could -- Yeah. -- compete with

40:27

them. Nobody checked them. So the government decided

40:29

that it had to. And that's when that's when antitrust

40:32

law -- the Sherman Antitrust Act and so forth --

40:34

were enacted. Isn't it

40:36

appropriate that, as big -- let's

40:38

say, Google and Amazon. I

40:42

think you probably have to include maybe

40:44

include Microsoft. Well, you could

40:46

pick who it is. But as these four or

40:48

five tech giants become bigger,

40:50

more dominant, more powerful. Isn't it

40:52

appropriate just as it was in the nineteen tens

40:56

to cut down the robber barons? Isn't it time to to

40:58

cut down them to size so that we can so

41:00

that we can have something

41:02

new. You're Well, I guess you're saying they're

41:04

they're

41:04

gonna die on their own. They're

41:05

gonna they're gonna Yeah. Microsoft faded in

41:07

its in its overarching

41:10

power on its

41:11

own. But nobody

41:12

argued that the standard oil was gonna

41:14

die on its own. I mean, we broke it up.

41:17

Well, bell bell telephone too.

41:19

Same thing. Yeah. And then I think

41:21

I think that that these things are

41:23

are the the arc is quicker for them.

41:26

I agree with you, Ant, on, I think, Amazon and

41:28

I I would say Google. Jason,

41:31

what you're saying? They're Microsoft. They'll they'll go

41:33

up and down They'll be around. They won't be as powerful.

41:35

They won't be as scary. I think Facebook could

41:37

disappear. I I that's who

41:38

I think would disappear.

41:39

Right? Facebook is on the that's an easy

41:42

easy prediction since they're on the imminent

41:45

cusp of disappearing. Right? Even as we

41:47

speak. Yeah. They're they're out of the discussion. I mean,

41:49

I I go there occasionally, which I do,

41:51

though not nearly as often as I used to. I still find

41:53

myself -- oh, that friend. I missed that friend. I

41:55

still actually have engaged in it. But

41:57

It's less of the public conversation. It's less of the

41:59

fear conversation. The

42:02

Meta stuff, I think, is kind of -- the

42:04

metaverse is kind of a joke. And it's going to -- like, Twitter

42:06

is gonna go bankrupt one way or the other.

42:10

And so I think we we

42:12

demonize these corporations

42:14

And this is part of Karl's point, I think, and

42:16

in his piece about TikTok. Without looking

42:18

at it at the principled level, we're not

42:21

we're not defending the privacy of American

42:23

citizens instead we're going after TikTok. Do

42:26

you think it's even possible that that's

42:28

why these

42:30

members of congress are doing this because

42:33

it distracts from the real privacy

42:35

issues? Yes. Yes.

42:37

Mhmm. I

42:39

think so. And we need someone to blame for

42:41

our woes. Oh, yeah. And

42:43

-- Yeah. -- and not address the issues,

42:45

the real issue head on. Especially when

42:48

you know, unfortunately, money is so

42:50

important in in our government.

42:54

And and they need -- they need the

42:56

the contributions of these telecom

42:58

companies. And But it goes back to what Jason was saying

43:00

earlier, the the convergence

43:03

of left and right attacks on the

43:05

net. It's it's it's a it's a

43:07

pincer movement. And and,

43:10

you know, in two thirty, the sword and the

43:12

shield Democrats go after the

43:14

the shield because they think that the platform

43:17

should be better at taking down hate speech. The Republicans

43:19

go after the sword because their hate speech is being

43:21

taken down. They find themselves

43:23

in this strange bedfellows

43:25

conspiracy now against the tech companies

43:28

because the tech companies are easy targets They

43:30

are folk devils for everything wrong

43:32

with us. And so you're not just distracting

43:35

from the tech companies and money and privacy

43:38

and that. You're distracting again from these

43:40

higher societal ills

43:42

we have that we're not facing. You

43:45

know, when when ChatGPT spits

43:47

back stupidity, it's our stupidity it's

43:49

spitting

43:49

back. And we're not dealing with that.

43:52

TikTok has been, you know, they've seen this

43:54

coming, and they've spent more than a billion

43:56

and a half dollars so far trying to

43:58

respond to this moving their data

44:00

centers to the US, to Oracle,

44:04

this is another one that cracks me up that that

44:06

that, you know, we trust Oracle more

44:08

than we trust TikTok. I

44:11

don't know. Yeah. You

44:14

know, they they they understand

44:16

that this is an existential threat to them.

44:19

They have a business in China. But if the US cuts

44:21

off TikTok, it's probably, you know,

44:23

not good for TikTok in the long run. What would

44:25

the citizenry say?

44:29

American or Chinese? American.

44:32

Voters. What would voters say? Yeah. And

44:34

I think that's probably the calculus that's gonna

44:36

be -- you know, Congress still has to vote on this.

44:40

I imagine, in fact, if you care about

44:42

this, you probably should write your congress critter.

44:44

I imagine they're, you know, putting their finger to the

44:46

wind right now saying, well, what's gonna happen?

44:49

If young people would vote a little more,

44:51

they might not be at risk. You

44:53

know. But this might be the

44:55

the the type of thing that gets young people to

44:57

vote. And I'm sure that's

44:59

what they I'm

45:00

losing my tick talk. Yeah. You better believe

45:02

I'm voting on that. You know what? Sure. That's what Senator

45:04

Know-Nothing is investigating.

45:07

But the people in my state --

45:09

younger people did vote, though, in the last couple

45:12

of weeks. It's better. It is.

45:14

And I think as you as you continue to

45:16

eat away things that they care about, Supreme Court,

45:18

yesterday looked like in

45:21

the oral arguments, they were gonna vote to

45:23

overturn president Biden's student

45:26

loan forgiveness, that's

45:28

another one. You you do that. Mhmm. And

45:30

that that gets young people into the

45:33

Well, maybe not that young, people in their hopes.

45:35

Hopefully, it does. Get them into the the voting booths.

45:38

Yeah. This is why you need to vote, things

45:40

like this. And you're I'm sure that this

45:42

is every member of Congress is going, is

45:44

this gonna cost us a lot of votes under twenty

45:47

five? Yeah. Yeah. Probably.

45:49

Yeah. You know?

45:50

The other thing is is that media

45:52

should be defending TikTok. Just

45:54

like the question whether media don't

45:55

But that's a hard thing to solve. I even feel little

45:58

weird. Defending TikTok. I know. But

46:00

we should be defending the first amendment and rights

46:02

of speech on the Internet as podcasters. I

46:04

think we especially because because what

46:06

we are doing right now was enabled

46:09

by TikTok. Obvious My goodness,

46:12

TikTok. Oh, no. Not TikTok. TikTok.

46:14

Oh, your son is on TikTok. Yeah. By the way,

46:16

we heard more about the --

46:18

oh, here, about the cookbook. You were saying

46:20

that he got a big advance. Six

46:22

figure advance to write a -- Whoa.

46:24

-- cookbook. I don't know.

46:26

Kidding. You don't know? This is awesome. And

46:29

that's So, man, one of the things he's doing

46:31

is he's going -- I

46:33

-- Mike Elgan. I talked to Mike Elgan. Mike

46:35

Elgan. And as you know, we went down to Oaxaca last

46:37

year for the Day of the Dead. And the Elgans,

46:39

Mike and Amira, know every chef in Oaxaca,

46:42

including one of the most famous chefs,

46:44

Alejandro Ruiz. And

46:47

Mike has arranged for Henry to go down

46:49

for a month. Introduced him to

46:52

chef Alex. Wow. It's

46:55

an unusual thing because he's gonna do TikToks

46:57

in Alex's kitchen. Yeah.

47:01

And and that's gonna and he says, I think I'm gonna

47:03

do an Oaxaca chapter in the book. Which is

47:05

great, by the way, because Oaxacan food is

47:07

is the amazing wonderful cuisine

47:09

that has yet to really enter the United

47:11

States. Once it does, people are gonna

47:13

go, oh, this is amazing.

47:14

He's gonna make a mole sandwich for them,

47:16

isn't he? Oh, yeah.

47:18

Oh, yeah. Take the hallowed cuisine. And

47:21

turn it into

47:21

a sandwich. Into a sandwich. Jeez.

47:24

It's a Yeah. They they get

47:26

the taco or

47:27

anything. Yeah. I'd

47:28

say that would be a great start to

47:31

things. By the way, he wasn't originally

47:33

a sandwich guy. That was a thing that

47:35

he learned from TikTok. The algorithm

47:37

told him that when he made sandwiches, it made a

47:39

huge difference, so he made sandwiches.

47:41

You know? See, the Chinese—

47:44

the Chinese are controlling

47:45

him? They're controlling all of our

47:46

stomachs. I mean,

47:47

we wouldn't eat any sandwiches if it weren't for

47:49

China. Yeah.

47:50

They were invented there, weren't they? Now

47:53

okay. So Are

47:57

you gonna do your

47:58

senator thing? Yeah. You know what I'm saying?

48:00

Yeah. How did you? It

48:03

helps if you use the voice -- Yeah. -- but

48:05

the Chinese

48:06

government is not known for

48:09

freedom. They're known for repression. They're

48:11

known for throwing their citizens

48:14

into concentration camps. Yep.

48:16

Should we not try

48:19

to keep that kind of evil

48:23

out of the United States of America?

48:28

I

48:28

love your filibuster. It's quick.

48:32

I mean, don't

48:34

they deserve sanctions? Just like we sanction,

48:37

the the Rooskies — I'm sorry,

48:39

the Russians — because

48:42

of violations of human rights. I

48:44

don't think we should just

48:46

turn a blind eye to the Chinese just

48:48

because, you know, they make our phones.

48:50

That's not enough.

48:52

But we'll make fist

48:53

bumps to the

48:56

Saudis. Well, I don't like

48:58

them either, but that's another matter. Oh,

49:01

man. Jared can keep his billion dollars.

49:03

I think we should keep the

49:05

Saudis out of this. You know, it's interesting

49:07

because Ant and I are both big

49:09

F1 fans, Formula One racing. And

49:12

this is a problem because the Formula One

49:15

has told people like Lewis

49:17

Hamilton you you can't wear that

49:19

rainbow flag helmet

49:21

here next time we go to Saudi Arabia

49:23

-- Right. They're

49:25

telling them, no. We gotta you you gotta run

49:27

all that stuff by us

49:29

first, and a lot of the races

49:31

are in well, Bahrain's the first

49:33

race. Bahrain's the first one. And then

49:35

there Saudi Arabia is the second one. And

49:38

so this is a global sport.

49:40

I love I know you love

49:41

it, Ant, and we're very excited. Similar

49:43

actions happened there in the World Cup

49:45

at Qatar. Yep. They,

49:47

though— actually, F1's really bad. They won't even

49:50

let Lewis wear his earring. You

49:53

just can't wear earrings, dude. What

49:56

the hell — is it a safety concern? Is it nineteen

49:58

fifty-two? What are we talking here?

50:01

Yeah. Anyway,

50:04

I I

50:04

think you're gonna make us What is it? Are there women

50:06

drivers in F1? No. Sorry.

50:08

Not yet. There's one black driver. That's it.

50:12

No

50:13

women, no females. See,

50:15

you see, you left Twitter and Facebook

50:17

because of the principle. Well, I did. Why

50:19

didn't you watch? And I did not watch

50:22

the World Cup because I was

50:24

upset that it was in Qatar. By the way, there's

50:26

been F1 racing in Qatar. I—

50:30

it's hard. It's

50:32

hard. You

50:34

know, this this is the same sort of thing with

50:36

this TikTok. I don't— you

50:38

know, I'm just not in favor of how the Chinese

50:40

government treats its people.

50:43

I think it's reprehensible. But I love—

50:49

James— Hank's sandwiches are good, though.

50:51

I have to look and see if he makes any Chinese

50:53

sandwiches; he might be working on one. How's your Chinese

50:56

these days? It's very bad. I

50:56

I lost it all sad to say. But

50:58

I love the country. I love the people. I love

51:00

its history. I'm fascinated by it.

51:02

I think it is I think it

51:04

should be, by

51:06

all rights, one of the most successful

51:10

countries in the world because it's got the resources,

51:13

it's got the brains, it's got the people. And

51:16

it's unfortunately, I think the CCP which

51:18

was, you know, I have to say, in nineteen

51:20

forty five, Mao

51:23

Zedong did lift

51:25

the Chinese people out of a feudal

51:28

system into the modern world,

51:30

but he made a lot of mistakes subsequently that

51:33

caused a great famine. And

51:35

the current regime is not just— It's—

51:38

Remember— you know you're not a Southerner, sir?

51:40

How do you know that? Because

51:42

you didn't say "Mao Tse-tung." Mouths,

51:44

they don't— I heard it. That means they don't

51:46

go— My life.

51:48

Mao or the mouse? But

51:52

I but but Mao, in the early days,

51:54

did a lot for the Chinese people. And and

51:56

to this day — when we were in China in two thousand

51:58

nine, my son Henry and I went — and

52:01

we went to a small village, and they all had big

52:03

screen, flat screen TVs. And

52:06

we were told by our guide — essentially,

52:09

that's what happens. Every year, it's a new thing. It's

52:11

air conditioning one year, big screen

52:13

TVs the next year. They're slowly

52:15

raising the standard of living. And this is a very

52:18

poor Chinese village, you know, basically

52:20

subsistence agriculture. But

52:22

they have been lifted into the modern world

52:25

And so it's one of the reasons I think the Chinese people

52:27

are very supportive of their regime because

52:29

they get a lot of benefits from it. And

52:32

and it really was a vicious

52:34

feudal society. Right. Well, but but the other

52:36

thing is, people

52:36

say, well— the

52:37

thing I hate most is when people

52:39

say, well, China's not ready for democracy, or that's what most

52:41

Chinese people want. A,

52:44

it's wrong to say any people are not ready for

52:47

ruling themselves.

52:50

And b, opinion

52:53

polls — Edelman

52:55

does a trust index every

52:57

year. And the oppressive regimes —

52:59

Russia, the

53:01

Philippines, China — are

53:03

all high on trust in institutions, because what else

53:05

are you gonna say? Right.

53:07

Yes. So a stranger calls you: oh, yes. Yes. Yes.

53:09

I love them. I love them. They're looking out for us. Absolutely.

53:12

You can't separate that out very

53:13

easily. And and

53:15

as somebody's pointing out, Mao probably has the

53:17

highest death count of any dictator. He

53:20

killed — we don't even know

53:23

how many — hundreds of millions of people

53:25

in the Great Leap Forward. And the Cultural

53:27

Revolution in the sixties was horrible. So

53:29

that, you know, he committed a lot of crimes.

53:33

You know, it's— life is— this

53:36

stuff is complicated. And and to say, yeah,

53:38

we're gonna ban TikTok. That's gonna solve

53:39

everything. It it's not gonna solve

53:42

anything at all. That's where

53:44

media and politics come together

53:46

and oversimplify the

53:47

world, and lose history, and lose context.

53:50

And we're not good with— with—

53:53

with ambiguity and gray areas.

53:55

Right. We wanna we

53:57

want good and bad. We wanna know who's

53:59

the good

53:59

guys, who's the bad guys. Where's the good

54:01

guys? Tell me who to

54:02

hate. Tell me who to hate. Yeah.

54:04

I hate Senator Devil. Well,

54:07

my friend, my colleague

54:09

from the other side of the aisle

54:12

would take his head out of his keister

54:14

and start thinking about the real world

54:17

we live in.

54:18

Alright. Senator Devil's gonna

54:20

retire now. Good.

54:25

Now that he's essentially— Right. Ant and Senator

54:27

Devil both go into

54:30

the— Alright.

54:33

Let us take a little break and

54:35

there is more to talk about in

54:38

just a little bit. But we

54:40

do wanna mention our sponsor because I am

54:42

a fan, a big fan. Senator

54:44

Devil probably is too. Everybody loves

54:47

FastMail. I've said this

54:49

for some time now. If you care about

54:52

email, why are you using free

54:54

email? Email that treats you as

54:56

a product, not as the customer. FastMail

54:59

isn't free. It's

55:02

as little as three bucks a month.

55:04

It's not hugely expensive. But

55:06

free mail isn't free either. You pay with your

55:09

privacy. For over twenty

55:11

years, FastMail has been a leader in email

55:13

privacy because with FastMail, you're

55:15

a customer not

55:18

the product. Not only is it private, you get

55:20

great support because they

55:22

care about you. They wanna keep you happy.

55:24

You're their customer. Your personal

55:27

data is safe, kept away from third parties.

55:29

They have the best spam filtering ever.

55:32

I absolutely love fast mail spam

55:35

filters. I have set

55:37

up a very, I think, good

55:39

system for making email usable again.

55:42

No ads. No tracking

55:44

pixels. Your data is stored in the

55:46

US with fast mail and it's fully GDPR

55:49

compliant. Let me talk little bit about some

55:51

of the features you're gonna love about Fastmail. First

55:53

of all, Fastmail is real email.

55:55

It is not it is not some

55:57

company's idea of email. They

55:59

use the open source Cyrus IMAP server,

56:02

which is the king of IMAP servers,

56:04

configured exactly

56:07

the best possible way. In fact, FastMail

56:09

contributes back to the open source and

56:11

the Internet community. They've helped

56:13

create email standards for authentication to

56:16

reduce spam. They're

56:18

they're absolutely a good

56:20

citizen, a good neighbor in the email world.

56:22

So you're supporting somebody who's doing all

56:25

the right things. Because it's

56:27

true IMAP. You can, of course, use any

56:29

client you want. Doesn't matter,

56:31

you know, Thunderbird, Outlook,

56:35

Microsoft mail, Apple mail,

56:37

but you can also use their great web mail.

56:40

You won't be giving up that great web mail. In fact,

56:42

I think it's ten times better than Gmail's

56:44

interface. You can customize your workflow with

56:46

colors. Custom swipes.

56:49

You got night mode. They've got apps

56:51

for iOS and Android that are superb.

56:54

You can customize your inbox with scheduled

56:56

send. I love that feature. Snooze,

56:59

folders labels. Yeah. Folders and

57:01

labels. So that's kinda cool

57:03

too. Search bar, the

57:06

search is fantastic. I

57:08

keep all my email. Never delete

57:10

anything because FastMail has such a

57:12

good search. I know everything's there and I could find it

57:14

in a minute. In a second, keep

57:16

track of all the important details in your life

57:18

with Fastmail's powerful sidebar.

57:20

I've moved my calendar, my address book,

57:23

to Fastmail. I don't use Google anymore for

57:25

those. FastMail supports CalDAV

57:27

and CardDAV, the Internet

57:30

standards for that. The calendar works

57:32

great. I even have my notes stored at FastMail.

57:34

And I use FastMail as

57:37

my DNS host for

57:39

my websites because

57:41

then I get email there too, and

57:44

FastMail just makes it very

57:46

easy to turn on all the

57:48

email authentication protocols: SPF,

57:50

DKIM, and DMARC. That

57:52

way my emails get through better.

57:54

They're authenticated.
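
(A quick aside for the curious: here's a minimal sketch of checking those records yourself. It assumes the third-party dnspython package, and the domain is a placeholder, not a real FastMail-hosted site.)

```python
# Minimal sketch: look up a domain's SPF and DMARC TXT records.
# Assumes the third-party dnspython package (pip install dnspython).
# "example.com" is a placeholder, not a real FastMail-hosted domain.
import dns.resolver

def txt_records(name):
    try:
        return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []

domain = "example.com"
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records("_dmarc." + domain) if r.startswith("v=DMARC1")]
print("SPF:", spf or "none found")
print("DMARC:", dmarc or "none found")
```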

57:56

I can have an infinite number of email addresses,

57:58

by the way. As far as I can tell, there's no

58:00

limit on the number of domains I can have hosted

58:03

at FastMail. That's pretty cool. You wouldn't put

58:05

a website there, although I do have a single,

58:07

you know, like about

58:09

me page on Fastmail. So you could put a

58:11

web page there. But mostly, I use

58:14

it for DNS because I get the email. So

58:16

when I want to sign up for something, an account,

58:18

every time I sign up, I have a unique email. They even

58:20

support — which is really cool — 1Password and

58:23

Bitwarden's masked email

58:25

standard. You've seen this. This is something new

58:27

that really adds to your security. Of course, you

58:29

have a unique password for every website. Now

58:31

if you're using Bitwarden or 1Password, FastMail

58:34

will give you a unique email address

58:36

for every website. They all go to the same

58:38

inbox. But this way,

58:40

you've got kind of double the protection. A

58:43

hacker has to guess the unique email

58:45

as well as the password. That

58:47

really is cool.
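
(A toy illustration of the masked-email idea being described — one random, unguessable address per site, all delivering to the same inbox. This is a sketch of the concept only, not FastMail's, 1Password's, or Bitwarden's actual implementation; the domain is a made-up placeholder.)

```python
# Toy sketch of the masked-email idea: mint a unique, random address per
# website, so an address leaked by one site can't be tried anywhere else.
# Concept only -- not FastMail's or Bitwarden's real API. The domain is a
# placeholder.
import secrets

def masked_address(site, domain="example-inbox.invalid"):
    token = secrets.token_hex(4)          # e.g. "9f3a1c2b" -- unguessable
    return f"{site}.{token}@{domain}"     # every alias delivers to one inbox

print(masked_address("shopping-site"))    # shopping-site.9f3a1c2b@...
print(masked_address("newsletter"))       # a different address every call
```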

58:49

Desktop or mobile, FastMail is

58:52

the best, and their support is

58:54

the best. Not some script reader, but

58:58

a real US-based support team with

58:58

email experts answering the phone, answering

59:01

your questions. So then

59:03

they're always in reach. And because, again,

59:05

you're a customer, they give

59:07

you that kind of support. Fast

59:10

mail team believes in working for customers as

59:12

people to be cared for, not

59:14

products to be exploited. Advertisers

59:17

are left out, you and your privacy

59:20

are at the center. I I could

59:22

read you review after review of FastMail, but

59:24

listen to me when I tell you: I've been using FastMail

59:26

for more than ten years. I

59:28

am absolutely hooked. I will go

59:30

nowhere else. In fact, I just signed up again

59:32

for another three years. I do it three years

59:35

at a time. It's just

59:37

the best. It's very easy

59:39

to get your data out of any other email client

59:41

and into FastMail. Plus, FastMail will go out and

59:43

fetch that email if you wanna keep those accounts

59:45

alive. I would do that if you move email addresses.

59:48

And as I said, I have a custom domain. So I have

59:50

a custom email address with FastMail. Doesn't cost any more.

59:53

It's just great. They power open

59:56

source email. They

59:58

are they are moving email forward

1:00:00

with new Internet standards. It's a good

1:00:02

company with great people who really care

1:00:04

about email. If you care about email, get

1:00:08

FastMail. Reclaim your privacy, boost

1:00:10

productivity. Make email yours with

1:00:13

fast mail. Try it free for thirty days

1:00:15

fast mail dot com slash

1:00:17

tweet, FASTMAIL.

1:00:20

Fast mail dot com slash tweet.

1:00:24

And by the way, if you use that address fast mail dot

1:00:26

com slash tweet, you'll get fifteen percent off.

1:00:28

On your first year, your whole first

1:00:30

year when you sign up today, fast

1:00:32

mail dot com slash Twitter.

1:00:34

I I've been telling people this since

1:00:37

long before they were an advertiser. I don't know why everybody

1:00:39

doesn't just use Fastmail. It is literally

1:00:42

the best email service in the

1:00:44

market. Do it.

1:00:46

Do

1:00:46

it. Fast mail. Thank you, Fast mail

1:00:48

for supporting us.

1:00:52

Alright. Alright.

1:00:55

Enough of this -- Alright. -- Senator— Senator— You mean

1:00:57

that this is not a democracy and you're in charge.

1:00:59

We've just lost democracy. What's

1:01:01

next? Oh, actually

1:01:04

YouTube has a new leader. We talked about

1:01:06

Susan Wojcicki moving along. The

1:01:09

the new guy in charge, Neal Mohan,

1:01:11

wrote his first letter, and

1:01:14

of course he's smart. Our twenty

1:01:16

twenty three priorities, it's aimed particularly

1:01:18

at creators. Right? Who's more

1:01:20

who's most important to YouTube after advertisers,

1:01:23

because they're the most important? Creators —

1:01:25

somebody's gotta make that content. Yep.

1:01:29

He says, I thought he says, a little over

1:01:31

fifteen years ago, I visited company with an interesting

1:01:33

take on digital video. As

1:01:35

I walk through YouTube's small offices love

1:01:37

pizza parlor, I could see the promise of the

1:01:39

platform. I've thought about that moment

1:01:41

over the past few weeks as my longtime

1:01:44

friend and mentor, Susan Wojcicki, transitioned

1:01:46

to become an adviser to Google and Alphabet, and I

1:01:48

took the helm as the new leader of

1:01:51

YouTube. So what does he say?

1:01:57

He he quoted We need to do

1:01:59

everything we can to make Marques Brownlee

1:02:01

happy. Yeah. Well,

1:02:02

that's number one. Definitely

1:02:05

the new one. Yeah. And mister Beast

1:02:07

happy. He and mister Beast. Yeah. Mister

1:02:09

Beast happy. Probably more importantly, mister Beast.

1:02:11

He he did say he quoted a

1:02:13

study from Oxford Economics that

1:02:16

in twenty twenty one more than two million

1:02:18

creators, two million earn

1:02:20

the money equivalent of a full time job

1:02:23

on YouTube. That's

1:02:25

awesome. Wow. So you talk about TikTok

1:02:27

and and it's for sure true that TikTok is

1:02:29

powered a lot of people, but nobody nobody

1:02:31

comes close to what YouTube's done. Right? YouTube

1:02:36

has a shopping feature now, ad revenue

1:02:38

sharing on shorts. The

1:02:40

number of people subscribing to individual channels

1:02:43

has jumped. Twenty percent year

1:02:45

over year, six million people

1:02:47

now subscribe to channels. It's

1:02:49

not a huge number given the number of people use

1:02:51

YouTube. I think that's also

1:02:53

because creators keep saying, hit this. Smash

1:02:55

the subscribe button — it never

1:02:58

gets old. They have added — and this is,

1:03:01

you know, probably something Susan

1:03:03

Wojcicki had in process, but they just

1:03:05

added dubbing clips into alternate

1:03:08

languages. Right? How many languages do they have

1:03:10

now? It's kinda cool. Wow.

1:03:14

They— of course, mister Beast started—

1:03:16

It was tested

1:03:19

by mister Beast first. Oh. So

1:03:21

this isn't

1:03:22

this isn't caption. This is dubbing.

1:03:24

Dubbing.

1:03:26

Wow. Multiple language audio

1:03:29

tracks. They built it in the Can we do that?

1:03:31

Can we do that from a clip from the

1:03:32

show? Sure. I wanna hear twig

1:03:35

in German. Okay? We just gotta

1:03:37

get somebody we just gotta get a cast of

1:03:40

people together.

1:03:42

What's that okay? Can we just put this on YouTube?

1:03:44

What can we dump it there? Yeah. Yeah.

1:03:46

But again, somebody has to speak German. Are you gonna

1:03:48

do it? No. I thought it was I thought

1:03:51

it was a computer

1:03:52

done. No. No. No. No. No. That's

1:03:54

that's what I thought too, Jeff. At first.

1:03:56

Oh, okay. You have

1:03:58

to imagine that's coming at some

1:04:00

point. Yeah. Actually, why not? You know what?

1:04:02

Oh, yeah. That's a really sweet spot, come

1:04:04

to think of it. The AI could do it. I mean,

1:04:06

they're they can translate pretty well already.

1:04:08

They're doing the transcriptions. Yep. They're doing the

1:04:10

the translations. I guess all you'd have to do

1:04:12

is have the voice, have the voice. Yeah.
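
(A sketch of the pipeline being imagined here — transcribe, translate, then synthesize in the creator's own voice. All three steps are stubbed placeholders, not calls to any real service.)

```python
# Sketch of the auto-dubbing pipeline the hosts are imagining:
# speech-to-text -> machine translation -> voice-cloned text-to-speech.
# All three steps are stubbed placeholders, not any real library's API.

def transcribe(audio):
    return "hello everyone"            # placeholder speech-to-text

def translate(text, to):
    return f"[{to}] {text}"            # placeholder machine translation

def synthesize_voice(text, voice):
    return text.encode()               # placeholder voice-cloned speech

def dub(audio, target_lang):
    text = transcribe(audio)
    translated = translate(text, to=target_lang)
    return synthesize_voice(translated, voice="creator")

# One source video could then ship an audio track per language:
tracks = {lang: dub(b"...", lang) for lang in ["de", "hi", "vi", "ar"]}
print(sorted(tracks))                  # ['ar', 'de', 'hi', 'vi']
```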

1:04:15

Right now, the creator chooses

1:04:17

which language. So there's now a and

1:04:20

you could see this on the mister beast video.

1:04:23

Hindi, Thai, Vietnamese, Arabic. He

1:04:25

He dubs into a bunch of different languages.

1:04:28

Over fifteen percent of the dubbed videos' watch

1:04:30

time came from viewers in a different

1:04:32

language than the original

1:04:33

recording. So that's big. Mhmm. Viewers

1:04:36

watched over two million hours in January

1:04:39

alone of dubbed videos

1:04:40

So does mister Beast pay someone to do it or

1:04:42

volunteers do it? Well-known

1:04:45

creator, mister Beast, aka Jimmy

1:04:47

Donaldson, who has one hundred thirty million

1:04:49

global

1:04:49

subscribers. Dubbed his

1:04:51

eleven most popular videos in eleven

1:04:54

languages. In

1:04:55

an interview with YouTube's creator insider,

1:04:59

Is that Renee? I don't know. Mister

1:05:02

Beast explained — that should be

1:05:04

the name of my senator, Senator Beast —

1:05:07

explained why the

1:05:09

feature was beneficial, why I don't even have to

1:05:11

explain that. It's much easier

1:05:13

to just run one channel than twelve. You

1:05:16

have to make twelve different thumbnails blah to blah.

1:05:19

So I'm looking to see if they explain

1:05:21

where— how he got it done. I guess he hired.

1:05:23

Or YouTube? Probably.

1:05:26

YouTube did. Yeah. Somebody -- Probably

1:05:28

if you do that. -- testing it out with a big

1:05:30

PR

1:05:31

splash. Who wants to volunteer? Any

1:05:34

of our listeners, proficient

1:05:37

You pick the language listeners. You tell

1:05:39

them you pick it. You'd have to have four

1:05:41

people. Right? You'd have to have somebody doing

1:05:43

my voice, some Whatever and whoever does

1:05:45

my voice has to have has

1:05:47

to have set it a beast. But

1:05:50

it doesn't have to be doesn't have to be a

1:05:52

Foghorn Leghorn. It has to be something appropriate to

1:05:54

you or It could be a Bavarian accent. Oh, yeah.

1:05:57

It looks like this. So Oh,

1:05:59

boy. And that'd be you have to have a Jeff

1:06:01

want— you have to have an Ant,

1:06:03

you have to have a Jason. And

1:06:06

next week, you have to have a Stacey.

1:06:09

So Jason. Yeah. Good luck

1:06:11

with that. Jason.

1:06:13

Okay. Swam Brad says to double it and

1:06:16

click on

1:06:18

Portman's version of Portman's swamp.

1:06:20

Right? A

1:06:22

South according to Mohan's

1:06:25

letter of South Korean creator gained over

1:06:27

thirty thousand members after launching channel

1:06:29

memberships just seven months ago.

1:06:32

Six million viewers paid for channel memberships on

1:06:34

YouTube in December, twenty percent

1:06:36

increase. So, yeah, it makes sense

1:06:39

that Mohan would be focused on creators.

1:06:42

He talks about top gaming creators.

1:06:44

YouTube's always done that. Right? It's

1:06:46

always been about the the the

1:06:49

You know, I Yeah. And I don't I kinda don't like this

1:06:51

because it's it's it's

1:06:54

I think a little deceptive. To

1:06:56

the rank and file creator who is

1:06:58

never gonna make dollar one. Mhmm.

1:07:01

But you go look at all these people making

1:07:03

a living, on YouTube, there's

1:07:05

two million of them. YouTube could. But

1:07:07

what you're really saying is, come on, everybody. You

1:07:09

gotta work for us. Make more content for us. We're

1:07:12

gonna make the bulk of the

1:07:13

money. And they're not lying. Yeah.

1:07:15

There are two million. No. It's true. They take us to

1:07:17

understand what's happening. Yeah. And

1:07:20

you and and it's true that you could

1:07:23

also.

1:07:24

Are you likely to? That's a different

1:07:27

story. Oh, and here is Rene Ritchie,

1:07:29

creator liaison, in

1:07:32

that YouTube letter, what

1:07:34

creators need to know from Neil's letter.

1:07:37

So Yay Renee, They

1:07:41

also talked about protecting kids. Looking

1:07:45

ahead, This is a

1:07:47

pivotal moment for our industry. We face

1:07:50

challenging economic headwinds and

1:07:52

uncertain geopolitical conditions AI

1:07:55

represents incredible creative

1:07:57

opportunities, but must be balanced by responsible

1:08:00

stewardship. This

1:08:02

is what motivates me and everyone at YouTube

1:08:05

to do our best work every single day.

1:08:07

It's kind of an anodyne, didn't-know-what-

1:08:09

you'd-expect letter, but

1:08:12

There's some facts in the end too.

1:08:14

How does how does Salt Hank

1:08:16

do when it comes to YouTube.

1:08:19

I

1:08:19

mean, YouTube as a as a platform has

1:08:22

been around much longer than a TikTok than

1:08:24

You know, it's funny — when he was doing TikTok, I said,

1:08:26

why don't you do more YouTube? He said, So

1:08:28

he has done since then, he's done some long

1:08:30

form

1:08:30

YouTube. He does YouTube shorts. I've seen some

1:08:33

of his long

1:08:33

term stuff. Yeah. And it's good, but Insta's

1:08:35

his Yeah. Because of his style,

1:08:37

which was TikTok inspired. Inspired by

1:08:39

fast cuts. Short cuts. Right.

1:08:42

It really lends itself to Instagram Reels

1:08:44

are effectively

1:08:45

TikTok. Right. Yeah.

1:08:47

It's it's it's it's not just cooking too.

1:08:49

He's he's flirting in a specific way

1:08:51

with his audience. Right? It's wry.

1:08:54

It's— He's been— Really?

1:08:57

How many how many proposals has Hank

1:08:59

had? Quite a few. He's --

1:09:01

Yep. -- he'll be on The Bachelor next. Yeah.

1:09:03

Yeah. He's got a

1:09:05

girlfriend. No. I shouldn't say this, though, probably.

1:09:08

Here he is here he's making Did

1:09:10

he meet her through the through the Bondy.

1:09:13

No. She's an influencer, though. I hear

1:09:15

I understand. Oh, wow. Okay.

1:09:17

He's done a lot of collabs. That's one of the things

1:09:19

you do. Right? Yep. Yeah. You're

1:09:22

you're doing these these

1:09:23

collabs, which is cool. Yeah.

1:09:25

That's really a nice part of it. Yeah.

1:09:27

There is, you know — and I think YouTube has this

1:09:29

too. There's this kind of collegial thing

1:09:32

about it. Here he's making a chicken pesto

1:09:34

sandwich. Chef Reactions—

1:09:37

This is an ad. Did you see that real quick?

1:09:39

Yep. The Mezzetta peppers — it's an ad

1:09:41

for Mezzetta. And this is,

1:09:43

by the way, TikTok doesn't get any

1:09:45

of that money. Nor does, nor does

1:09:47

YouTube. And that's really how you

1:09:49

make money on these all of these forms

1:09:51

is selling your own ass. Oh, that was good.

1:09:53

Oh, that was good. Oh. And don't

1:09:55

forget the Mezzetta. This

1:10:01

is bizarre.

1:10:03

But isn't that subtle?

1:10:06

Yes. I freaking lied. Did I tell

1:10:08

you about this one still? Did I tell you about this

1:10:10

one? Did I tell you about I can't

1:10:12

play the music, but I'll take that off. But did this

1:10:14

one He got a partnership with

1:10:16

a Los Angeles company,

1:10:19

shock you l dot l a. They go

1:10:21

around and do catering for big events

1:10:23

or big companies. They gave him

1:10:26

that snow crab. It looked

1:10:28

like about a pound of caviar. A huge

1:10:30

amount of caviar. Yeah. And he made

1:10:32

a po'boy, a snow crab po'boy

1:10:34

with a caviar

1:10:36

remoulade. Oh,

1:10:36

like, he said it was it was a

1:10:39

seven thousand dollar sandwich. It

1:10:43

just looks so

1:10:44

delicious. That's eleven. You

1:10:46

have a roll or sandwich.

1:10:49

Yeah. You made a seven-thousand-

1:10:51

dollar sandwich today. Oh,

1:10:54

capitalism,

1:10:55

man. Capitalism. Is that chef the

1:10:57

only chef reaction account there is? No. I

1:10:59

don't think so. Hank— Oh, I don't think

1:11:01

so. I have to ask him about that. That's a good

1:11:03

question. Okay. So notice by the way

1:11:05

here on his Insta, he's linked to his

1:11:07

YouTube. So

1:11:10

Oh, okay. So, you know, they're gonna do it. And

1:11:12

this is a short. Right? I think this is a short.

1:11:15

YouTube. First YouTube is up.

1:11:17

YouTube two is up. Oh, these are

1:11:19

little

1:11:19

Insta stills. I see.

1:11:20

Okay. Insta things. And then he has the link

1:11:22

on

1:11:22

his profile.

1:11:23

Link — visit the link to YouTube

1:11:25

dot com slash promotions. Yeah.

1:11:27

This

1:11:27

is longer for Yeah. It's longer.

1:11:29

So this is where I guess he gets you into

1:11:31

the longer— Yeah. Yeah. And I don't know. Let's

1:11:33

see what the views are. I don't think these do

1:11:35

as well. Where do I see this?

1:11:37

Thirty-nine thousand. Thirty-nine

1:11:39

thousand. Four months. Yes. See that's nothing.

1:11:41

Oh, there is. That's nothing. Yeah.

1:11:43

But then — the salt. You're selling the salt. You're selling

1:11:45

the cookbook. You're selling a TV show.

1:11:48

You're selling your presence. Yeah. You know?

1:11:50

I am hungry. I know. It's impossible

1:11:52

to watch his videos and not get incredibly

1:11:54

hungry.

1:11:55

My boy. Yeah. I'll tell

1:11:57

you. He

1:11:58

Did I tell you he cooked a cubano with

1:12:00

Guy Fieri last weekend?

1:12:02

I heard you mention Guy Fieri. That that's

1:12:04

awesome. He was at Is

1:12:06

that gonna be on Diners, Drive-Ins and Dives?

1:12:08

No. God. Guy doesn't do that anymore.

1:12:11

They just rerun it forever. I don't think

1:12:12

No, no, no. He just visited—

1:12:15

You just visited places near me? Oh, okay.

1:12:18

So, no, he was at

1:12:20

a Miami Food Festival. Apparently,

1:12:23

it's a big one. And he

1:12:26

Hank did a couple of things. He was he

1:12:28

was at the Mister Clean booth making

1:12:30

bloody marys. I somewhere

1:12:36

have some video of that. And then

1:12:38

Guy Fieri was one of the celebrity chefs

1:12:40

there. And he did -- Yeah. -- and he

1:12:43

was on with Fieri. Fieri. He

1:12:45

was on stage with Guy doing

1:12:47

a cubano. Oh, that's so

1:12:49

cool. Yeah. So yeah.

1:12:51

You know, there's this whole— they make

1:12:53

the rounds basically these days. Yeah.

1:12:56

Food is it, you know, thanks to the food network.

1:12:59

Food is it.

1:13:01

But it's

1:13:01

what he has. Henry always wanted

1:13:03

to do that. You know, he he grew up

1:13:06

watching YouTube food videos. That's

1:13:09

how he learned to cook. It wasn't from me.

1:13:11

Oh. Yeah. So he

1:13:13

was always oriented in

1:13:14

that. What does

1:13:15

he think of your machines? Your many

1:13:17

machines? Does he get

1:13:19

shady, or jealous?

1:13:22

That's a good question. I don't know. He doesn't use

1:13:24

machines, really. He uses like

1:13:26

a Magic Bullet, mostly. Chops his

1:13:29

own. I gave

1:13:31

him my No. I had a very I had a

1:13:33

very nice knife roll with, like, the best knives,

1:13:36

not that I'm a chef, but I, you know, at one point,

1:13:38

I said, I want some good knives. And I gave

1:13:40

it to him one year. And, yeah, I don't think he's gonna use

1:13:42

it, so I don't know. If

1:13:45

he won't use my knives, he definitely won't use

1:13:47

my kitchen appliances.

1:13:50

Mister Anthony

1:13:52

Nielsen put in our Discord, you

1:13:54

know — speaking of AI — dubbing.

1:13:57

Oh, wait a minute. Now this is the is this we've

1:13:59

been working on this. We've been working on

1:14:01

this. We're very excited about this. At

1:14:03

some point, I can retire And

1:14:06

AI Leo could take

1:14:08

over. Der du von dem Himmel bist,

1:14:10

alles Leid und Schmerzen stillest,

1:14:14

den, der doppelt elend ist, doppelt mit Erquickung füllest; ach,

1:14:17

ich bin des Treibens müde! Was soll all der

1:14:19

Schmerz und Lust? Süßer Friede,

1:14:21

komm, ach komm in meine Brust!

1:14:24

What did I just say? I don't

1:14:26

do

1:14:27

not speak any of this. Anthony will let me

1:14:29

know. Nothing. Okay. But

1:14:30

it was real words -- Right. -- "in my breast."

1:14:33

It was real words, wasn't it? He put

1:14:35

the— put the transcript

1:14:37

there for you.

1:14:40

Oh, and and let me go look.

1:14:42

"Thou that from the heavens art,

1:14:44

every pain and sorrow

1:14:46

stillest" — is this good? —

1:14:51

"and the doubly wretched heart, doubly with

1:14:53

refreshment fillest. I am weary

1:14:55

with contending. Why this pain and desire?

1:14:57

Peace descending, come into

1:15:00

my breast." Okay. Play it again now.

1:15:03

Now that you know it's poetry. Der du

1:15:06

von dem Himmel bist, alles Leid

1:15:08

und Schmerzen stillest— Whose voice is that?

1:15:10

Den, der doppelt elend ist— That's my voice.

1:15:14

Ach, ich bin des Treibens müde!

1:15:16

Was soll all der Schmerz und Lust?

1:15:18

Süßer Friede, komm, ach komm in

1:15:20

meine Brust. Terrible at pronunciation.

1:15:22

Is

1:15:23

it? Oh, yeah. Yeah. Sounds

1:15:25

like you. That's the dubbed version.

1:15:27

Sounds like a young Leo.

1:15:28

Yeah. It sounds like a young me. But yeah. You

1:15:30

know this, Leo. Which

1:15:32

which which synthesis

1:15:34

are you using? Was that Descript? Or

1:15:36

was it eleven labs? I don't know which

1:15:39

one is eleven labs. Eleven labs. He's played with

1:15:41

he's played with both. Yeah.

1:15:43

He's getting quite good. You know what, though, Anthony,

1:15:45

the coffee stain is not on my shirt.

1:15:49

No. See AI removes those reflections.

1:15:52

Yeah. I like

1:15:54

it. Yeah. He's working on that.

1:15:56

It's kind of the low voxel version of Leo,

1:15:58

though. It's— it looks

1:16:00

more like I told him

1:16:02

we kinda want a Max Headroom thing.

1:16:06

Right? It's pretty close. Right? Yeah.

1:16:08

Yes. Yeah. The low poly. You got

1:16:10

it, but you gotta add

1:16:11

the stutter in there. Yeah. I

1:16:13

can do that. Sure.

1:16:16

Anthony Nielsen has become our king of AI.

1:16:18

He

1:16:19

really has. Yeah. Let's talk about AI.

1:16:21

AI is all the rage. These

1:16:24

days. In fact, Microsoft actually,

1:16:26

they kinda they kinda misled

1:16:28

people. They announced they have a new

1:16:31

thing they're doing when

1:16:33

they update Windows — they call it a "moment." And

1:16:36

moment two came out yesterday, and

1:16:38

they said that it would bring AI

1:16:40

powered Bing to

1:16:43

Windows eleven. Not

1:16:46

really. What it does is it puts a logo

1:16:48

on your Windows eleven task bar that when you

1:16:50

click it, opens up

1:16:52

Edge

1:16:53

and the Bing chat. So they gave

1:16:55

you a shortcut or an ad or

1:16:58

an

1:16:58

ad. Let's

1:17:00

call a spade a spade.

1:17:01

It's an ad. Oh,

1:17:03

boy. See this feature?

1:17:06

Yeah. Here's the if you look at

1:17:08

it in the in the little search pill

1:17:11

we call it, there's a B. When

1:17:13

you click it, it says introducing

1:17:15

the new bing. But if you wanna do

1:17:17

anything, you have to open up Edge. And— You can't

1:17:18

just use that little search

1:17:19

bar. No.

1:17:20

Plug it in there.

1:17:21

No. See, that

1:17:21

would be an integrated interface.

1:17:23

I mean, there's no problem— They didn't do that, except

1:17:25

they want— This is all performative. And Microsoft

1:17:27

is

1:17:28

playing a game here. Yeah.

1:17:31

Well, Microsoft was sure touted

1:17:33

there in the beginning as having made

1:17:35

a really smart move with this whole chat

1:17:37

GPT thing. Reporters are idiots.

1:17:40

We're a couple of weeks away from that now.

1:17:43

Has that proven to be the case? I mean,

1:17:45

Google didn't, you know, initial

1:17:47

mean, they they haven't -- They still haven't. -- they had their

1:17:50

Bard thing, but but they haven't

1:17:52

released theirs. Yeah. That's— It turned out

1:17:54

to actually be kind of a good thing by comparison.

1:17:56

Right?

1:17:58

Jason, in in the the

1:18:00

annex, in the kiddie table part of the rundown,

1:18:03

I put in two stories that was was interesting to

1:18:05

me, that there's already a tech-

1:18:07

lash against OpenAI

1:18:08

now. Oh, yeah. Well, So And

1:18:11

and this is reasonable because OpenAI,

1:18:13

which was founded in twenty

1:18:15

fifteen, partly by Elon Musk — who parted

1:18:17

company with them a few years later, to be

1:18:20

a response to the closed

1:18:22

source development of AI by

1:18:24

Microsoft and Google and

1:18:26

others, it was to be the open one,

1:18:28

the nonprofit one. But

1:18:30

it's no longer

1:18:32

Yeah. It's definitely not nonprofit anymore.

1:18:34

Yeah. Uh-uh. So this is a Motherboard

1:18:36

article: OpenAI is now everything

1:18:39

it promised not to be corporate closed source

1:18:41

and for profit. True.

1:18:43

Right? Yeah. Yeah. They

1:18:48

now this, to me, is a little nerve-

1:18:50

wracking. Sam Altman, the CEO,

1:18:52

wrote a blog post this week, Planning for

1:18:55

AGI and beyond. AGI

1:18:59

is the scary AI, the

1:19:01

general intelligence, which would

1:19:03

essentially be as

1:19:05

he says, our mission is to ensure that

1:19:07

artificial general intelligence — AI

1:19:10

systems that are generally smarter

1:19:12

than humans — benefits

1:19:15

all of humanity. If

1:19:17

AGI is successfully created— well,

1:19:21

yeah. I don't know.

1:19:26

I don't know. We don't have AGI. I think it's

1:19:28

pretty clear. Blake Lemoine

1:19:31

et al. notwithstanding. It's

1:19:33

not sentient. No. No.

1:19:35

It's not as smart as a human.

1:19:39

It sounds like it is. I

1:19:42

once asked Ray Kurzweil, who is

1:19:46

kind of the king of the singularity. Right?

1:19:46

He said it's sometime in in the

1:19:48

next twenty years, he said I think by twenty

1:19:50

thirty-five.

1:19:53

Computers will be — now this is his key

1:19:55

phrase — indistinguishable from humans. And

1:19:58

I said, but but will they be thinking and

1:20:00

he said it doesn't matter; if you can't tell the

1:20:02

difference, they're indistinguishable from humans.

1:20:04

That's that's it.

1:20:06

Mhmm. It doesn't it's foolish

1:20:08

to say, well, are they thinking? That's not

1:20:11

If you can't tell, does it matter? I don't

1:20:13

know if Jason's

1:20:14

thinking. Right. Well, thank you.

1:20:17

Thank you for that. But

1:20:19

but what is but what is thinking to

1:20:21

a computer? I mean, any any operation

1:20:24

that is determined by computer is

1:20:27

in essence thinking

1:20:28

Right? Not as we think. Not

1:20:30

as we think of thinking. But

1:20:32

when I think of

1:20:33

-- Yes.

1:20:34

-- it's really weird. When I think of thinking

1:20:36

-- it's thinking. I think of, you know,

1:20:38

I want this thing to happen. I have to understand

1:20:40

it. I have to figure out. I have to know how to

1:20:42

how to do this thing or

1:20:44

whatever. That's what a computer does when it's when it

1:20:46

you know, performing an action. I mean, the

1:20:48

type of thinking. Yeah. Yeah.

1:20:50

That's Ray's point. Exactly. Kurzweil said,

1:20:52

if you can't tell the difference, it doesn't matter

1:20:55

what the internal process is. Right. The result

1:20:57

is the same. Now

1:21:00

the real singularity comes when— And

1:21:02

this is where it gets scary when

1:21:04

AGI can design better AGI

1:21:07

or can build better

1:21:08

machines.

1:21:08

And so a better version of itself. Then

1:21:10

they're faster than humans. Mhmm. It it

1:21:12

becomes iterative and it gets faster and faster

1:21:15

and accelerates to the point

1:21:16

where, you know, the I don't know what. What

1:21:19

is

1:21:19

what's the limit? Yeah. In other words, right

1:21:21

now, we're holding it back. Mhmm. But

1:21:24

as soon as it could -- humans. -- pesky humans.

1:21:26

As soon as it can do it itself, nothing

1:21:29

to stop it. A machine that could design machines —

1:21:33

I don't know what — takes

1:21:35

over the universe. Yeah.
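
(A toy illustration of the feedback loop being described: once a system improves its own successor, capability compounds. The 10 percent per-generation gain is an arbitrary assumption, purely to show the shape of the curve — not a claim about any real AI.)

```python
# Toy model of recursive self-improvement: each generation designs a
# slightly better successor, so capability compounds. The 10% gain per
# generation is an arbitrary assumption, just to show the runaway shape.
capability = 1.0   # human-designed baseline
gain = 1.10        # assumed improvement per self-designed generation

for generation in range(1, 51):
    capability *= gain
    if generation % 10 == 0:
        print(f"generation {generation}: capability x{capability:.1f}")

# x2.6 after 10 generations, x6.7 after 20, x117.4 after 50: modest per
# step, exponential once no human is in the loop to slow it down.
```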

1:21:38

So we can imagine a world

1:21:40

in which humanity flourishes to a degree that

1:21:42

is probably impossible for any of us

1:21:44

to fully visualize yet. We hope to contribute

1:21:46

to the world, says Altman, an AGI

1:21:49

aligned with such

1:21:51

flourishing. And if not,

1:21:53

well, we try. I

1:21:57

mean, really? The

1:21:59

first AGI will just be a point along

1:22:01

the continuum of intelligence. We think it's likely

1:22:04

progress will continue from there. Yes. See, that's what

1:22:06

scares me. Possibly sustaining

1:22:08

the rate of progress we've seen over the past decade

1:22:10

for a long period of time. He

1:22:12

doesn't say "or faster." If

1:22:14

this is true, the world could become extremely

1:22:17

different from how it is today, and the

1:22:19

risks could be extraordinary. A

1:22:21

misaligned superintelligent

1:22:25

AGI could cause grievous harm to the

1:22:27

world. An autocratic

1:22:29

regime with a decisive

1:22:32

superintelligence lead could do

1:22:34

that as well. Yikes.

1:22:38

So what are they doing to prevent that?

1:22:42

Nothing. We

1:22:46

think a slower takeoff is easier to make

1:22:48

safe. And coordination among AGI

1:22:51

efforts to slow down at

1:22:53

critical junctures will likely

1:22:55

be important. Even in

1:22:57

a world where we don't need to do this to solve technical

1:22:59

alignment problems, slowing down may be important

1:23:02

to give society enough time to adapt. So

1:23:05

don't be in such a hurry. Successfully

1:23:09

transitioning to a world with superintelligence —

1:23:13

he should add nonhuman superintelligence — is

1:23:16

perhaps the most important and hopeful and

1:23:18

scary project in human history. Success

1:23:20

is far from guaranteed, and the

1:23:22

stakes, boundless downside, and

1:23:25

boundless upside will hopefully

1:23:27

unite all of us. It's not really

1:23:30

All he's saying is slow

1:23:31

down. He's not

1:23:33

really giving us any tools. He's also out there

1:23:35

saying— this was in the last week, too. It was

1:23:37

in in the rundown last week where he's

1:23:39

saying, regulate us, so we don't have to make decisions;

1:23:41

you make them for

1:23:42

us. Yeah. That's not— Kevin Marks and I

1:23:44

are both in the in the rundown, invading

1:23:46

the rundown.

1:23:47

Hi, Kevin. Oh, hi, Kevin. Hi,

1:23:49

Kevin. We invited him on, but I think it was

1:23:51

such short notice that

1:23:52

he couldn't probably couldn't get in. Yeah. I just

1:23:54

I just got an email from him. Yeah. So

1:23:57

Emily Bender, who was one of the stochastic parrots

1:23:59

paper coauthors, has

1:24:01

a thread, which is in there, in which she's tearing

1:24:04

it apart. She's amazing

1:24:06

in her ongoing comments

1:24:08

on the hype of

1:24:11

of

1:24:11

AI. But she has been a

1:24:13

naysayer from the first. Absolutely. Yeah.

1:24:16

Absolutely. Well, just just the naysayer mainly

1:24:18

of the hype. Mainly of the overpromise.

1:24:22

And so this is her reaction

1:24:24

to Sam's post. This piece.

1:24:26

Yes. From

1:24:28

the get go, this is just gross. They

1:24:32

think they're really in the business of developing,

1:24:34

shaping AGI, and they think they are positioned

1:24:39

to decide what benefits all of humanity.

1:24:39

ding ding ding ding

1:24:44

Where's the rest? I don't understand. That's the next—

1:24:46

How does this thread get to that?

1:24:49

That's not Twitter. It's Mastodon. Sure.

1:24:51

No, how does— Mastodon? That's her next post.

1:24:53

That's right

1:24:54

there. That's sort of an explanation. The advice

1:24:56

is that we need to imagine that AGI has been successfully

1:24:58

created is literally magic. Also,

1:25:00

What does "turbocharge the economy"

1:25:03

mean, if there's already abundance?

1:25:05

More dollars for the super rich, it has to be.

1:25:07

Also note the rhetorical sleight of hand

1:25:09

here: paragraph one has AGI

1:25:11

as a hypothetical. But by paragraph

1:25:14

two, it's already something that has potential. But

1:25:17

oh, no. The magical imagined AGI

1:25:19

also has downsides, but

1:25:21

it's also so so tempting and

1:25:23

important to create that we cannot not create

1:25:25

it. And the next rhetorical

1:25:27

sleight of hand here: now AGI is an

1:25:29

unpreventable

1:25:31

future. That's the essence

1:25:33

of what you

1:25:33

said. Really good post. I

1:25:37

I think I'm following Ms.

1:25:40

Bender. But if not, I'm

1:25:42

gonna follow her. You should. She's she's brilliant. Yeah.

1:25:45

I'm pretty sure I am following her.

1:25:47

You know, the problem on Mastodon—

1:25:48

you can't easily tell, switching over

1:25:50

from one page to another, if I'm

1:25:52

already following her. Well, if

1:25:54

actually, if I'd been following it as

1:25:57

a follower as opposed to in that

1:25:59

form, I could see it little bit better. But,

1:26:02

yeah, she's a professor of linguistics

1:26:05

at UW University of Washington.

1:26:07

She runs the master of science in computational

1:26:10

linguistics

1:26:11

program. So she knows what she's talking

1:26:13

about. Oh, yeah. Yeah. Yeah. And again, she

1:26:15

was coauthor with Timnit Gebru

1:26:18

and Margaret Mitchell. And

1:26:20

one other whose name I forgot — the fourth Beatle,

1:26:23

fifth Beatle — of the stochastic

1:26:25

parrots paper

1:26:25

Right. -- warning of this. So

1:26:28

what yeah. I mean, this is a good question. I

1:26:30

mean, talk about big tech. It's

1:26:34

kinda like nuclear

1:26:36

proliferation. We're

1:26:38

in we're in the era of

1:26:40

AI proliferation. And

1:26:43

there's nobody nobody talking about

1:26:45

disarmament. But here too,

1:26:47

To talk about "we must regulate this now" — we

1:26:49

don't know what it is. Well, you can't yeah. I

1:26:51

guess you can't, but what do you do then? Right. If

1:26:53

you can't regulate it, what do you do?

1:26:55

You you you educate people,

1:26:57

you be cautious, you keep people

1:27:00

accountable, you do what Emily

1:27:02

Bender is doing, and

1:27:04

mock them when they're going overboard. That's

1:27:07

what society

1:27:08

does. I mean, but what is

1:27:10

being cautious? There's gonna be a lot

1:27:12

of companies that see the potential,

1:27:14

the upside of getting

1:27:16

into AI. And their

1:27:19

bottom line is to make money.

1:27:22

So, like, where like, how

1:27:24

You don't want them to draw

1:27:26

the line of what caution is.

1:27:28

Because, like, they're actually gonna push it. Right?

1:27:31

Microsoft threw caution

1:27:33

to the wind. Google

1:27:36

almost did, but then held back and said, no, we're the

1:27:38

cautious one.

1:27:39

Yeah. Right. Yeah.

1:27:42

Yeah. So if we

1:27:44

did an AI show, I don't know if we're

1:27:46

gonna— We're not really in

1:27:49

a position— Please. —to launch new shows,

1:27:51

but everybody wants to be on the show, except

1:27:53

for Ant. And who's left? It's

1:27:55

Ant don't wanna do it. It's not

1:27:57

in my no. But but no.

1:27:59

He's not gonna do it either. But we're gonna

1:28:01

do a musical theater show. Gosh.

1:28:04

No.

1:28:05

Wayne Brook.

1:28:06

There you go. No. Yes.

1:28:07

If you pass on this one, I'll show you how to

1:28:09

do the next one, and that is musical theater.

1:28:11

So there you go. Oh

1:28:15

my gosh. I fear that we would

1:28:17

not have that much to We'd have plenty

1:28:19

to talk

1:28:19

about, but it'll be highly speculative. Right?

1:28:22

Speculation. That's all it is.

1:28:24

Every week, you're speculating on what

1:28:26

could potentially

1:28:27

happen. And I think that's gonna get

1:28:29

old. You got stupid coverage. You got stupid

1:28:32

companies. You got smart things happening. You

1:28:34

got great opportunities. I— we talked about

1:28:36

this— So in our management program, we

1:28:38

have twenty-three amazing high-level managers

1:28:40

in the executive program that I helped start.

1:28:43

At the school, we were talking about this

1:28:45

just yesterday. I did a class

1:28:47

on Mastodon and the Fediverse

1:28:49

and this. And one of the things that

1:28:51

came up — it's a very international group — they

1:28:53

are using it for translation. They're

1:28:56

using it to fix their work.

1:28:58

It's making people who are

1:29:01

shy about their English as a second language,

1:29:04

more confident. It's all these kinds of

1:29:06

uses that you don't know people are putting

1:29:08

it to and finding value

1:29:10

in it. One person said she's using it to

1:29:12

write grant proposals because

1:29:14

it's all the same BS. So

1:29:16

she just churns them out with it. Right?

1:29:19

There's there's interesting

1:29:21

uses. There's interesting dangers. There's

1:29:23

interesting development. There's interesting business.

1:29:25

There's bad media coverage about it. Oh, it's

1:29:27

rich. Yeah. Let me let me put my

1:29:29

Ms. Lisa Laporte hat on right

1:29:31

now. Okay. So we're gonna do this

1:29:33

this week in AI Show. You're gonna

1:29:35

be able to talk about this for a

1:29:37

year? At least, you know --

1:29:38

Oh, yeah. -- fifty

1:29:39

two episodes.

1:29:39

Oh, you can get

1:29:40

fifty-two episodes out of it. Oh, yeah. Oh, yeah.

1:29:43

Until I might get sick of it.

1:29:44

I feel like reading three quarters of Techmeme

1:29:46

on a daily basis has something to do with it.

1:29:48

It clearly sucks. And they all say the same

1:29:50

thing, mister Howell. Here— I have

1:29:52

that same qualm, but my qualm is that this

1:29:54

is just kind

1:29:56

of irrational exuberance and that we're gonna hit

1:29:58

another AI winter. We've done this before many,

1:30:01

many times even in my

1:30:02

memory.

1:30:03

So we've done Web3 and NFTs and

1:30:05

all that crap. We do. Oh, yeah. Watching.

1:30:07

But even about AI. We've gotten all excited

1:30:09

about it. We thought self driving cars

1:30:12

were just around

1:30:12

the corner. We thought we've this is

1:30:15

not— and there have been many AI winters.

1:30:17

And I just would hate to start a

1:30:20

show about something, and then the AI winter

1:30:22

happens. And

1:30:24

we all go, yeah.

1:30:25

It'd be a temporary show. It would have to

1:30:27

be temporary. It'd have to be just kind of

1:30:29

a I think if we did it, it would be in the club, and

1:30:31

it would be just kind of a chat like this.

1:30:33

Mhmm. But I see. I do wanna

1:30:35

talk to experts. I would like to have people like— Sure.

1:30:37

There's lots of people to invite, like Emily Bender.

1:30:39

Yeah. Like that young woman I sent you, whose name

1:30:41

I can't remember,

1:30:42

Jason, Oh, yes. On TikTok.

1:30:44

Whenever

1:30:45

people send me young women, I have to send them

1:30:47

back. It's kind of a legal thing.

1:30:51

Stacy, if Stacy were here.

1:30:53

It's a joke. It's a joke.

1:30:56

Oh, god. And an answer out of here. Oh,

1:30:58

God. Honest denial of it. Rachel

1:31:00

Woods.

1:31:01

You sent me the

1:31:02

name of a young woman. Let's be clear. Rachel

1:31:05

Woods. Okay.

1:31:06

Let's play a little of Rachel Woods, by the way.

1:31:08

She's she's really good on this topic. She's

1:31:10

on

1:31:10

on TikTok. She

1:31:11

put it here. Here you go.

1:31:13

Okay. I'll I'll you can't you can't do it

1:31:15

because it won't go on the won't go

1:31:17

on the thing. Rachel Woods She

1:31:19

explains AI on TikTok. So she's a TikToker

1:31:22

who explains AI. Oh, she's an AI

1:31:24

startup founder.

1:31:26

Okay. So she has a she has something,

1:31:28

but

1:31:29

Yeah. Yep. So here we go. This is

1:31:31

her most recent. Let me turn

1:31:33

on the turn on the sound

1:31:35

thing. Turn the Turn

1:31:37

on the machine. Can you get some real

1:31:39

intelligence to know how to turn on the sound? Can you turn

1:31:41

on can you turn on the machine?

1:31:45

That's a machine. I have to push that

1:31:46

button. I said, I

1:31:48

have

1:31:49

to push that button. I forgot

1:31:51

you're screaming. Okay. Here

1:31:53

we go. The ChatGPT API was announced and people

1:31:55

are going crazy. It's ten times cheaper

1:31:57

than OpenAI's previous model, which

1:31:59

means we're about to have ChatGPT in

1:32:01

every product. They already announced

1:32:03

this in a new grocery shopping assistant for

1:32:06

Instacart, a shopping assistant

1:32:08

for any Shopify store, a

1:32:10

tutor on any subject in Quizlet,

1:32:12

and your own personal AI in Snapchat.

1:32:14

Holy cow. There was a lot more in this announcement

1:32:17

around privacy and restrictions of

1:32:19

how you can use the API. We'll be covering

1:32:21

all of this tomorrow in our newsletter, so drop your

1:32:23

questions. If you're not already on it, the link is in

1:32:25

my

1:32:25

bio. Wow. I think

1:32:27

this is— She's great.
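
(For context, a minimal sketch of calling the chat API she's describing, as the openai Python package exposed it at the time — the ChatCompletion endpoint with the gpt-3.5-turbo model. The API key and prompt are placeholders.)

```python
# Minimal sketch of the newly announced ChatGPT API (March 2023 era):
# the openai Python package's ChatCompletion endpoint, gpt-3.5-turbo.
# The API key and prompt are placeholders.
import openai

openai.api_key = "sk-..."  # placeholder -- use your own key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful shopping assistant."},
        {"role": "user", "content": "What do I need to make tacos for four?"},
    ],
)
print(response.choices[0].message.content)
```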

1:32:30

I learned from her a lot. Yeah. Yeah. Yeah.

1:32:32

And she's got a newsletter. So

1:32:35

subscribe. Let me see if I'll see if

1:32:37

I can find the I needed it up too.

1:32:39

Yeah. Let me let me find her

1:32:42

her link. The AI Exchange

1:32:44

dot com slash

1:32:48

the Rachel Woods. And

1:32:50

there's a link there to her newsletter.

1:32:53

Interesting. Alright.

1:32:55

Jason can make that happen, mhmm,

1:32:58

because, you know what, despite

1:33:00

all appearances, he can think. True.

1:33:03

Sometimes despite all

1:33:04

appearances, and I don't don't believe

1:33:06

what you see here. I'm just teasing.

1:33:09

I'm just teasing. No.

1:33:11

Jason's a great producer. On my—

1:33:12

No. Oh, on Ant's head. He's

1:33:14

a great, great producer.

1:33:16

Producing as he's

1:33:18

Yeah. Hosting. Right? Typing and looking crazy.

1:33:20

Yeah. I'm pretty

1:33:21

It's amazing. That's really impressive. I have to

1:33:23

say.

1:33:23

It really is. We are very lucky. Extremely

1:33:26

lucky that Jason's with us. Also,

1:33:29

the host of All About Android, which, as I mentioned at

1:33:31

the beginning of the show, gets great guests on there

1:33:33

along with his regulars. Yeah.

1:33:35

Ron. And and when

1:33:37

when— Last night, we had Flo on. Flo

1:33:39

came back. Nice. She came back for a review.

1:33:41

Michelle

1:33:42

shows up from time to

1:33:43

time. Shaw Romand. Yes. You rotate a

1:33:45

lot of people in. JR Raphael from

1:33:47

Android Intelligence,

1:33:48

and I love JR.

1:33:49

Yeah. It's fun. We're

1:33:50

having a good time. He

1:33:52

also does Tech News Weekly. And,

1:33:54

actually, I thought of a because we

1:33:56

were talking about what what you should do tomorrow.

1:33:58

Yes. I'm sorry. I'm still actively

1:34:00

looking for my next topic for that segment. I can't

1:34:02

remember what it was. I should have written it down.

1:34:04

Hope you remember that. I know: it's Mike

1:34:06

McCue. Oh,

1:34:08

yes. This is Flipboard. Yes. Mr.

1:34:10

Flipboard.

1:34:11

Oh, yes. Okay. Because

1:34:14

Flipboard has joined the Fediverse. So

1:34:16

Flipboard was originally created, Mike,

1:34:18

created it with the goal of

1:34:20

being kind of a magazine based

1:34:22

on the people you follow on Twitter. Well,

1:34:26

Maybe that's not the best business model anymore.

1:34:29

So they're moving kind of away from Twitter

1:34:31

and over to Mastodon. They're even talking

1:34:34

about creating their own Mastodon instance,

1:34:36

flipboard dot

1:34:37

So he did. You're good. He did.

1:34:40

So this is really interesting. I mean,

1:34:43

like a lot of people who have abandoned Twitter,

1:34:46

but now if you've built a business around Twitter,

1:34:48

you might be very slow to do that. I'm

1:34:51

I— good for

1:34:52

Mike. Mike is very active on

1:34:54

Mastodon. At the same time,

1:34:56

Tony Stubblebine and Ev Williams have

1:34:59

gone all in for Medium. And

1:35:02

so the Medium instance,

1:35:04

m e dot d

1:35:05

m, pretty cool, MEDM, is

1:35:08

now up I'm getting all kinds of followers from it. People

1:35:10

are people are joining that one as well. Nice.

1:35:13

So you've got flipboard, you've got medium,

1:35:16

Who was it, though, that decided not to do the financial

1:35:18

times or somebody The f t. Yeah. Yeah. They

1:35:21

they had one, and they said, you know, man, this is a bad idea.

1:35:23

This is legal liabilities and all this stuff.

1:35:25

So I'm I for one I'm

1:35:27

not sure. I for a, I totally

1:35:30

think companies like medium and flipboard

1:35:32

should do this. I

1:35:34

mean, I guess, we're kind of in the same boat. I

1:35:36

shouldn't really I mean, because Twitter has

1:35:38

its own Mastodon instance. Mhmm. And it's made

1:35:40

up of our listeners. Right? Our fans.

1:35:42

Community. It's a community. Yeah. So it

1:35:44

makes sense

1:35:45

actually. I take it back. I was gonna say something

1:35:47

bad about it. Would you

1:35:49

question both community

1:35:52

czar is here. I was talking to

1:35:54

somebody the other day about about the need for

1:35:57

outsourced moderating services

1:36:00

for instance hosts. Is

1:36:02

that something you think as you grow in scale you

1:36:04

might

1:36:04

need? Or well, we don't actually

1:36:07

we're too nice. You

1:36:09

mean moderators. Yeah.

1:36:11

Well, so far on You're moderating

1:36:14

right now. Yeah. I haven't I haven't had

1:36:16

to outsource it. I think just

1:36:18

like our Discord, when

1:36:20

you have a really tight community and,

1:36:23

you know, I have to approve you to go in there and

1:36:25

I I kind of I'm pretty cautious

1:36:27

And by the way, if anybody acts out, I booed them immediately.

1:36:30

So I think it doesn't take a lot

1:36:32

of moderation. I every

1:36:33

day, I make sure I check everything.

1:36:36

How long how long every day do you spend doing

1:36:38

it?

1:36:38

Oh, an hour a day.

1:36:40

Not more. Wow. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's all. That's

1:36:42

all good. Well, and and the one

1:36:44

thing that everybody should know if you're on Mastodon

1:36:46

is that when you see a post you don't

1:36:48

like or a series of posts from somebody

1:36:51

you don't like, you have Total Control two,

1:36:53

you can mute or block anybody. And

1:36:56

by doing so, you're keeping them out

1:36:58

of your feed forever. So,

1:37:01

again, report them to me. And and then I would suggest

1:37:03

if it's somebody that you think is really toxic,

1:37:05

report them to me. I check every

1:37:07

day. We know we get at most a reported

1:37:09

day. And usually, it's not people

1:37:11

on our mastodon instances. You know,

1:37:14

people on other mastodon instances. So

1:37:16

I I think it it's actually

1:37:19

sustainable. I I spent an hour there because I'm

1:37:22

reading stuff. Not not Not

1:37:24

that typically moderate. Right. That's the total

1:37:26

amount of time. So I have two. Okay. I'll

1:37:28

just show you real quickly the dashboard that

1:37:30

I go I see. Two pending reports

1:37:33

and eight pending users. won't show you the

1:37:35

reports or the users. But So I will

1:37:37

review those every day. And if if it's an

1:37:39

account you know, I can

1:37:41

suspend an account even if it's not on our

1:37:43

instance. And that just means people on

1:37:45

our instance can't see it. So

1:37:47

the account still exists. It's just -- Yeah. -- it's

1:37:49

not So one of them is is one of

1:37:51

them is pretending to be You can show

1:37:53

this, the National Security Agency.

1:37:56

And I don't

1:37:58

know. I think that's probably a parody account. This one's

1:38:00

a little tricky. I will I will look

1:38:02

at

1:38:02

it. But it's verified.

1:38:05

There's a blue check. It

1:38:08

also says you're local from Zuckerberg.

1:38:11

It also says your local friendly neighborhood

1:38:13

surveillance agency. Yeah.

1:38:15

I'm just thinking a little It's probably a parody

1:38:18

account. I'll look at it. If it's clearly

1:38:19

parody, I won't block it, but tabs

1:38:21

that you

1:38:22

twenty four seven because we care. Yeah.

1:38:25

And then there's another one that

1:38:27

you should not show. That

1:38:30

is kind

1:38:32

of anti trans. So that one, I'm just

1:38:34

gonna press the suspend

1:38:35

button. They get a notification that

1:38:38

that social has suspend them. So

1:38:40

Oh, they get a notification. Oh, you

1:38:42

do it without

1:38:42

notification? Well,

1:38:45

yes, you can't. Because they're they're they're gonna do their

1:38:47

account. Yeah. I think they get notified. I may be

1:38:49

wrong on that, actually. I don't know. But

1:38:52

what happens as a result is even

1:38:55

the only reason it showed up at Twitter's socials

1:38:57

because somebody in Twitter's social was following them,

1:39:00

or maybe it's possible no. No. It would only

1:39:02

show up because somebody had followed them. So

1:39:05

that that person won't be able to see that

1:39:08

those posts and no one else into his social will.

1:39:10

Normal anybody able to follow them anymore.

1:39:12

So that's it pretty easy.

1:39:14

You saw I did it while we're on the

1:39:16

air. That's great. It wasn't that far.

1:39:19

We had Oigand Rocco on

1:39:21

Oh, josh. Or something. Oh, yeah. Nice.

1:39:23

How is Oigand Rocco? He's the creator

1:39:25

of Mastodon. He's the developer who invented and Mastodon.

1:39:29

So we blew minds there, and

1:39:31

and my my management students were

1:39:34

blown by this as well. He said that I don't

1:39:36

think it's a problem to say. That in the in the

1:39:38

life of Mastodon writes from two thousand eighteen

1:39:40

to today five years. He said, maybe

1:39:42

I've raised clean including the latest

1:39:44

brush, maybe a total of five hundred thousand dollars.

1:39:47

So he asked this thing that is challenging

1:39:49

Twitter that is presenting

1:39:52

as he's in that activity pub and but

1:39:54

he's he's making activity pub

1:39:56

come to life for more people, ten

1:39:59

million accounts signed up of

1:40:01

five hundred thousand dollars.

1:40:02

Nothing. Yeah. Why? And that's through

1:40:05

a Patreon. He has

1:40:07

a handful of developers. I know he hired a new

1:40:09

developer, but for the longest time, it was hired

1:40:11

with three people. Yeah. Got seventeen hundred applications

1:40:14

for three iPad. The thing that remember

1:40:17

is it's just one example of the Fediverse

1:40:19

and activity, but I always say

1:40:21

this. But one thing that's become clear that

1:40:23

we had Mike Mastodon on Sunday on Twitter,

1:40:26

and he's he said there are a number

1:40:28

of really good competitors to

1:40:30

Mastodon that aren't really

1:40:31

competitors. They're just their

1:40:34

similar projects have that work on activity

1:40:36

pubs so you could follow somebody on We also

1:40:38

had Darius who who started hometown --

1:40:40

There you go. -- room. Yeah. And and,

1:40:42

you know, he loves

1:40:43

basketball. He's got nothing against vast

1:40:45

on he forked it because he wanted to do some things

1:40:47

differently. And it's a great fork. think

1:40:50

more and more you're gonna see this. And I

1:40:52

for one, I think this is a really good

1:40:56

a strong movement

1:40:59

away from centralized social

1:41:02

you asked earlier, you know, about the his the future

1:41:04

for Facebook. I mean, this is, I think, the

1:41:06

future of social. I I love politics too.

1:41:09

I think Discards are a very good example.

1:41:11

Boy, that was an eye opener when we created this in

1:41:13

club twist.

1:41:15

Oh, this is a good thank you mashed

1:41:16

potatoes diagram. This is the tree

1:41:18

of the Fediverse, and it's actually out of date

1:41:20

now. There are many, many more compatible

1:41:27

projects going on. I mean, it's huge. So,

1:41:30

you know, you can and if you have a WordPress,

1:41:33

you can have it be equal partner in the

1:41:35

Fed ofverse two. So there's a plug in

1:41:37

for

1:41:37

that. So I think more and more,

1:41:39

you're gonna see that.

1:41:42

And Mozilla Oh, by the way, Drew,

1:41:45

take take who or yeah.

1:41:47

Take is is short for quarter main. So to

1:41:49

I wouldn't say true. Let's say take who

1:41:52

in our discretes in our club to it says

1:41:54

he has been suspended from Mastodon by

1:41:56

accident, and you do get a message. Oh,

1:41:59

okay.

1:42:01

And and I hadn't suspended you. Didn't I

1:42:03

say, hey. Okay. Good. You

1:42:06

do get a message. Yeah.

1:42:10

I'm I'm really encouraged by all of

1:42:12

that. I think this is Yeah. -- I

1:42:13

learned it from you and I'm very excited. It's

1:42:15

it's it's giving me hope. And is it safe

1:42:17

to say I think it is for me, but is it

1:42:19

safe to say for all of you that Mastodon now

1:42:22

kind of fills that

1:42:23

itch, that Twitter itch, or no?

1:42:26

For me. You

1:42:27

know what I mean? Not ant. Yeah.

1:42:29

I've I've just noticed in general for myself

1:42:31

my desire to engage

1:42:34

with a Twitter like feed

1:42:36

on a daily basis has dropped considerably

1:42:39

in the last

1:42:39

Yeah.

1:42:39

I still go to Twitter. When I see it, I go, I

1:42:41

don't wanna hang up. I open Twitter Yeah.

1:42:44

handful of times a week. I open

1:42:46

Mastodon slightly more than that,

1:42:48

and sometimes I post, but I'm just not sharing

1:42:50

a lot. And that's been a general trend for me

1:42:53

lately, and sometimes I feel a little bad about that

1:42:55

because of what I do and, you know, being in

1:42:57

touch with with the community and everything. But

1:43:00

Yeah. I don't know. My desire

1:43:02

to interact in that way has has dropped

1:43:05

a lot. And I don't know if it's just, you know,

1:43:07

a result of the turbulence

1:43:10

of recent months or if it's just kind

1:43:12

of shifting

1:43:12

interest. But There we are. Have you

1:43:14

we we just said this before, and and and you

1:43:16

already know that I'm I I moved

1:43:19

away from being an adviser to

1:43:20

it. Have you guys tried post dot news? Do you

1:43:22

have any opinions about

1:43:23

it? Have not. No.

1:43:25

I haven't. I'm not a fan only because it's an

1:43:27

Adriason Horowitz. It's joined a.

1:43:29

It is centralized

1:43:31

b. Yeah. I agree. I feel similar

1:43:33

to Really? Nobody's using it?

1:43:36

Well, no. No. No. No. No. That's not the case. People

1:43:38

are. But I follow, you know, like,

1:43:40

fifty four people. And now

1:43:43

when I go to my people I'm

1:43:44

following, it's all George

1:43:46

Conway, literally. Oh, man. Conway

1:43:48

and George Conway. Okay. That's a fit on what

1:43:50

I want. So Yeah. know.

1:43:53

I think a lot of people went there because it was like

1:43:55

Twitter. It was centralized

1:43:57

Twitter like place. And I think that the

1:43:59

people had said because Taylor Lawrence popped

1:44:02

in, briefly at Mastodon, didn't

1:44:04

like what she saw and went ran back to Twitter. I think

1:44:06

a lot of people really want Twitter. And you

1:44:08

were you were about to

1:44:09

say? I'm fairly

1:44:11

similar to mister Howe where III

1:44:14

just haven't really had a desire to

1:44:16

go in there and check either platform recently.

1:44:19

I'm pretty much in broadcast

1:44:21

mode and pretty much in campaign mode,

1:44:23

regarding my son. Every

1:44:25

now and then, usually, like, on a Friday night

1:44:27

or a Saturday, I'll go in and check notifications

1:44:30

from people that I, you know, that I

1:44:32

know just to make sure I didn't miss anything.

1:44:34

But now, even they know that

1:44:36

he's really in here that often, so you

1:44:38

may not get a reply quickly.

1:44:41

It's just social media

1:44:44

in general has just been a bit of crap

1:44:46

tastic mess, and I just try to

1:44:48

keep my energy

1:44:50

clean and and and happy if

1:44:52

you will. I know every time I'm tempted to post

1:44:54

on Twitter, fortunately, III

1:44:56

do I could, but I I don't wanna break

1:44:58

my

1:44:59

silence. Every time though, it's because

1:45:01

somebody got me angry. Right.

1:45:03

Right. That's just the thing I'm at this point. If you learn

1:45:05

one thing nice.

1:45:06

Nice. You're just opening that

1:45:09

we have voices. It's great that

1:45:11

we have the voice to be out be able to go

1:45:13

out there and just scream that

1:45:15

something has upset us and whether it be

1:45:17

an injustice or or something

1:45:19

really, really small. But usually

1:45:21

And I swear. It

1:45:22

seems like I'm seeing a lot of that stuff

1:45:24

out there, and it just brings me down. I'm like,

1:45:26

no. Yeah. Me too. Step

1:45:27

one. Let me go turn on some more mama's

1:45:29

family reruns and yeah. Why not for the rest

1:45:31

of the day? I'd say in the last couple of years

1:45:33

time and time again when I open

1:45:36

feed like Twitter or sometimes mastodon.

1:45:38

Although less so a mastodon, mastodon, for whatever

1:45:41

reason feels like a little bit more

1:45:43

open and accepting to me right

1:45:45

now than the Twitter universe does. But --

1:45:48

Yeah. -- in the last couple of years, I have many times

1:45:50

had the experience where I open up my app

1:45:52

like, okay. I got this thing. I got to put this out

1:45:54

there. I type it all up. I see

1:45:56

it on the screen, and then I hit delete.

1:45:59

It's like I don't even see it. Yeah. Because I'm

1:46:01

like I've done this to At a certain point by the

1:46:03

end of it. I'm like, is it even worth it for

1:46:05

me to put this out there? Because, like,

1:46:07

I kind of on one hand, I already did

1:46:10

the thing that like I was moved to do, which

1:46:12

was to put my thoughts into words.

1:46:14

But if I hit send, it's

1:46:16

so often it feels like all I'm doing

1:46:19

is inviting some sort of

1:46:22

reaction. And -- Yeah. -- you know,

1:46:24

I hope that that reaction is good. Or

1:46:26

I hope that people interpret my words

1:46:29

in the way that I intend for them to, but

1:46:31

then there's always that doubt that's there and

1:46:33

that's been keeping me from sharing

1:46:35

a lot on social media

1:46:36

lately. And it it it has to

1:46:38

be a lot. Again, I'm

1:46:41

really grateful for the community that I've been

1:46:43

able to build on Twitter and, you

1:46:45

know, as as well as to tweet that

1:46:47

social platform. Hey, because I'm looking right now

1:46:49

at Michael Kid that

1:46:52

gave me all of this great information about

1:46:54

NAS that I told me forgot. Right? I forgot about

1:46:56

true NAS. And it's just this

1:46:58

long thread of just useful

1:47:01

information. He's not yelling at me. He's

1:47:03

not belittling me. He's not telling me I should have

1:47:05

bought this system versus that system.

1:47:08

You know, that's how this stuff is supposed to

1:47:10

be. It's just gotten so

1:47:13

far fetched with with people just yelling and

1:47:15

fussing about any and everything. And

1:47:17

that's why I just stay away because I just don't

1:47:19

wanna pollute the good energy

1:47:22

here in the four walls of I also

1:47:24

I also wonder especially

1:47:27

for you, Jason, because you've been doing this for a long time,

1:47:29

almost as long as I have. If you're

1:47:31

not just as

1:47:33

tired of sticking your head out and just say, I wanna

1:47:35

I wanna wanna be a hermit. Are

1:47:37

you ready to

1:47:38

Yes. You're ready to have the absolute show up.

1:47:40

A little small room. Brick it up and have

1:47:43

people push food over the top to you. That's

1:47:45

good

1:47:45

luck. III

1:47:49

Even I get that one too. I haven't been

1:47:51

doing this as long as you are, but I new joy

1:47:53

just to to solitude from time

1:47:55

to

1:47:55

time. Yeah.

1:47:56

Yeah. Well, it's so

1:47:57

material as your iPhone. Sorry.

1:48:00

Go ahead. As long as she's gonna say so much my

1:48:02

career has has been

1:48:04

sharing and and and putting

1:48:07

my thoughts and my words and my feelings and

1:48:09

and everything out there, I think you're

1:48:11

I think you're right to a certain degree. 0II

1:48:13

absolutely feel that from time to

1:48:15

time where it's

1:48:15

like, you know,

1:48:16

there's a private citizen that a

1:48:19

simpler sometimes I just want a simpler

1:48:21

-- Yeah. -- a normal. And

1:48:23

sometimes that doesn't involve sharing

1:48:25

every piece of my life, which I

1:48:27

mean, I don't know. But I also

1:48:29

feel bad admitting that

1:48:31

on this show because so much of my career

1:48:34

is built around being public

1:48:36

and being kind of in that role. So then

1:48:38

I feel like I'm kind of like retracting

1:48:41

myself from this community that largely

1:48:44

is incredibly welcoming, but

1:48:46

it's just that one little portion that

1:48:49

is can be negative enough that it's

1:48:51

like, well, Why would I invite that negativity?

1:48:53

Why not just not have it? You

1:48:55

know? It's such a I

1:48:58

don't wanna be ungrateful

1:49:00

because it's such a privilege to

1:49:02

get to do what we One hundred percent identity.

1:49:05

And honestly, to

1:49:08

me, what more and more feels like will

1:49:10

what we do is

1:49:13

a conversation with, you

1:49:15

know, our friends, the host,

1:49:17

but also with our community,

1:49:20

And even though the I wish the community all had

1:49:22

a c at the table because it's it would be great if

1:49:24

we could do that. But we do as much as

1:49:26

we can for that. We have a stage open now

1:49:28

all the time. We have the IRC open

1:49:30

all the time. I read the Mastodon. I read

1:49:33

the Twitch forums. I know you all do too.

1:49:35

So I

1:49:37

feel like we're we're actually kind

1:49:39

of privileged to be in a conversation. And I

1:49:41

don't mind the I like the

1:49:43

conversation. What I

1:49:44

don't like Beautiful. Within

1:49:46

the community. What I don't like is the general

1:49:48

performative well, basically,

1:49:51

what tweeting is, which is is

1:49:53

shouting out to the

1:49:54

world. I don't think what I have to say is that

1:49:56

important to one thing. I think

1:49:58

it's a risk of big old mass media

1:50:01

where scale became -- Right. The goal

1:50:03

and performing for the whole world became the goal.

1:50:05

And it's this is something I learned from the Black Twitter event.

1:50:08

Is what special of Black Black Black Black Twitter is

1:50:10

not even the movements that mattered

1:50:12

like like last matter, which mattered greatly, but

1:50:14

it's the what what I what I learned in

1:50:16

the room that day, is the value of the

1:50:18

community for the community's own sake. To

1:50:21

have joy and sorrow and everyday life there

1:50:23

and to feel a place of comfort and

1:50:26

it's not one community, It's bunch of communities.

1:50:28

It's so people don't like each other that came out

1:50:30

too, but that's so different from I

1:50:32

gotta speak to millions of

1:50:33

people. That's a

1:50:34

completely different experience. Yes. The Internet

1:50:36

for this. It's mass mediated. My

1:50:39

favorite stat in the Goodburne Preparedness

1:50:41

is coming out in June. Preorder

1:50:43

is available now. Is

1:50:46

that before the mechanization

1:50:50

of print with steam powered presses

1:50:52

and the line of type, The average circulation

1:50:54

of a daily newspaper in the United States was four

1:50:56

thousand,

1:50:59

which

1:50:59

is just about the average number

1:51:01

of active people and twit dot socials

1:51:03

messed it on. It's it's almost

1:51:06

exactly the same

1:51:06

number.

1:51:07

Yeah. Right. Right. It's like a Dunbar number.

1:51:10

That's really interesting. So that's

1:51:12

the right in other words, that's kind of the right

1:51:14

number.

1:51:14

Yeah. Yeah.

1:51:15

Four thousand four active users. So

1:51:17

it's exactly that now. Wow.

1:51:19

That's awesome. So Leo, have

1:51:21

you as the as the iPhone user among

1:51:23

us, unless unless ants kinda get

1:51:25

drawn into further into the

1:51:27

apple world. No. No. He's good.

1:51:29

It's so way about the time. I

1:51:32

do not like Iowa. Do

1:51:34

not like that. Stay there. Stay there,

1:51:36

and

1:51:37

Leo, have you signed up for the Blue Sky

1:51:40

beta?

1:51:41

Oh, let's talk about it. Yeah. I haven't

1:51:43

been invited. About it. I haven't

1:51:45

been invited. For it.

1:51:47

You can sign up to be invited. Right.

1:51:50

Right? Which And I don't sign up to be invited.

1:51:52

Agents

1:51:52

are muted. Okay. Yeah. But I don't think I've been

1:51:54

invited. So this is Jack Dorsey's

1:51:57

last act as CEO of Twitter,

1:52:00

He funded I think

1:52:02

he put ten million dollars into

1:52:04

a research effort

1:52:06

to come up with basically

1:52:09

massed on, let's be honest, or the

1:52:11

Fed of protocol. A protocol

1:52:13

based open Twitter. It

1:52:16

was money that he put in that even when Elon

1:52:19

took over Elon could not take back. So

1:52:21

the blue sky has continued. I'm sure Elon's

1:52:23

not thrilled about it. He didn't like Mastodon either.

1:52:25

But it is now close to being

1:52:28

released. It's an invite only beta in

1:52:30

the app store. So

1:52:33

yeah, you know what? I guess I should go back in

1:52:36

and join it. But honestly, I feel like it's

1:52:38

solving a problem that that doesn't

1:52:39

exist, which is you know, trying to

1:52:41

create a fativerse, but a fativerse already

1:52:44

exists. Mhmm. So do we need an And the

1:52:45

question is whether or not it it it federates

1:52:48

with the fativerse. Will

1:52:49

be Well, right now, they have their own protocol,

1:52:51

which is very similar to Activity Pub.

1:52:55

It is like like scuttlebutt, we had ramble

1:52:57

on, who by the way? He was also at Lake River event.

1:52:59

Oh, nice. He came all the way from New Zealand for

1:53:01

you. Oh, that's wonderful. Lane Cook. It was wonderful.

1:53:03

Did you record this event?

1:53:05

No. That's a sore point. Nope.

1:53:08

I think this is a shame because

1:53:10

we this is something that I think the world should

1:53:12

here. It sounds like it

1:53:13

was a mess. I I

1:53:14

couldn't read more. I was I was told not to.

1:53:16

Not by the group, not by the group, by

1:53:19

by by a boss. But Anyway,

1:53:22

it was we have a we have report on it coming

1:53:24

out very soon, and it said it was an amazing event.

1:53:27

But I think it's too soon to say that

1:53:29

that activity pump or masked on or or or or home

1:53:32

done or anything is it? I I think that

1:53:34

if Jack comes along and and and I'm

1:53:38

suddenly forgetting her name. The the head of

1:53:40

a a blue sky j Braeburger. Invent

1:53:44

new things. The world benefits.

1:53:46

Right? It's and the same was

1:53:48

cuddle

1:53:49

button. It becomes people find

1:53:51

more good ideas in that kind of open

1:53:53

environment.

1:53:54

Well, let me I haven't downloaded it.

1:53:57

So You can?

1:53:58

Well, let's see. It's in the App Store,

1:54:01

which means you can, but you might have to Might have to,

1:54:03

yeah, have to code in order to log in

1:54:05

or whatever. Blue

1:54:06

Sky Mobile. Is that it? No. That's a

1:54:09

staffing service. Blue Sky

1:54:11

VPN? No. That's a VPN. This

1:54:14

is the problem. A Blue Sky social, see

1:54:16

what's next. That sounds right.

1:54:18

So let me download that. Or if

1:54:20

it's China. Yeah. It's China. This is,

1:54:22

by the way, a big problem on the on the app store,

1:54:24

but I I think that is it. Yeah. Blue Sky

1:54:27

social. That's it. Yeah. Yep. Let's open

1:54:29

it. See what's happening. Private beta,

1:54:32

create a new account, sign

1:54:33

in. Well, I don't I

1:54:37

I guess I could create a new account.

1:54:38

Create a new account? Trying to.

1:54:40

Yeah. How

1:54:42

do I get

1:54:43

back? Yeah.

1:54:43

It says private beta. The app is

1:54:45

available for download, but you will need an invite

1:54:48

code to create an invite code. Asking

1:54:50

for an invite code. Which

1:54:52

I'm you know, I did, you know, way

1:54:54

back in the day. I just I feel like

1:54:57

we shouldn't dilute our efforts at this point. We've

1:54:59

got activity

1:55:00

pub. Nobody's saying activity pub isn't good.

1:55:03

Yeah. I I kind of agree, but then

1:55:05

again, when we talk to Rebel,

1:55:07

about what he's doing is cuddle button

1:55:09

planetary. There were all kinds of

1:55:11

new ideas that were entirely different.

1:55:13

Yeah. And and I think that's really wild.

1:55:15

I agree. Rebel had, for instance,

1:55:17

the idea of this portability that your identity

1:55:19

wasn't tied, you know, you

1:55:21

had your own public key, the identity,

1:55:24

public private key identity, which you could take with

1:55:26

you. And

1:55:26

also with the mowry that it works offline

1:55:28

--

1:55:29

Yeah. --

1:55:29

in a in a in a closed network.

1:55:33

There's there's no. I think we I think there's lots

1:55:35

of room for development right now. So,

1:55:38

yeah, I wonder how I would

1:55:40

get an invite. Well, if you're

1:55:42

listening, it's like a did fight.

1:55:44

So would we

1:55:45

all Meanwhile, I'd also like an

1:55:47

app for Android just saying. Yeah.

1:55:49

Hey, everybody. Leo, look forward here. I'm the

1:55:52

founder and one of the hosts at the Twitter

1:55:54

podcast. Network. I wanna

1:55:56

talk to you a little bit about what we do here

1:55:58

at Twitter, because I think it's unique. And

1:56:01

I think for anybody who

1:56:03

is bringing a product or

1:56:06

a service to a tech audience, you

1:56:09

need to know about what we do here

1:56:11

at Twitter. We've built an amazing audience

1:56:13

of engaged, intelligent, affluent

1:56:16

listeners who listen to

1:56:18

us and trust us when we recommend.

1:56:21

A product. Our mission statement is

1:56:23

to it is to build a highly engaged community

1:56:25

of tech enthusiasts. Wait.

1:56:28

Already, you should be your ears should be working

1:56:30

up at that because highly engaged is

1:56:32

good for you. Tech enthusiasts, if

1:56:34

that's who you're looking for, this is the place.

1:56:37

We do it by offering them the knowledge they need,

1:56:39

to understand and use technology in today's

1:56:42

world. And I hear from our audience

1:56:44

all the time. Part of that knowledge comes

1:56:46

from our advertisers. We

1:56:48

are very careful. We pick advertisers with

1:56:51

great products, great services, with

1:56:53

integrity, and introduce them

1:56:56

to our audience with authenticity and

1:56:59

genuine enthusiasm. And

1:57:01

that makes our host red ads different from

1:57:03

anything else you can buy. We are

1:57:05

literally bringing you to

1:57:08

the attention of our audience and

1:57:10

giving you a big fat endorsement.

1:57:13

We like to create partnerships with trusted

1:57:15

brands. Brands who are in it for

1:57:17

the long run, long term partners

1:57:20

that wanna grow with us

1:57:22

and we have so many great success stories.

1:57:24

Tim Broom, who founded IT pro

1:57:26

TV in twenty thirteen, started

1:57:29

advertising with us on day one has been

1:57:31

with us ever since. He said,

1:57:33

quote, we would not be where we are

1:57:35

today. Without the Twitter network.

1:57:37

I think the proof is in the pudding. Advertisers

1:57:40

like IT pro TV and Audible that

1:57:42

have been with us for more than ten years,

1:57:44

they stick around because their ads

1:57:47

work. And honestly, isn't that

1:57:49

why you're buying advertising? You

1:57:51

get a lot with Twitter. We have a very a very

1:57:53

full service attitude. We almost think of

1:57:55

it as kind of artisanal advertising,

1:57:59

boutique advertising. You'll get a full

1:58:01

service continuity team.

1:58:04

People who are on the phone with you, who are in

1:58:06

touch with you, who support you from

1:58:08

with everything from copywriting, to

1:58:10

graphic design. So you are

1:58:12

not alone in this. We

1:58:14

embed our ads into the

1:58:16

shows. They're not They're not added later.

1:58:19

They're part of the shows. In fact, often,

1:58:21

they're such a part of our shows that our other hosts

1:58:23

will chime in on the ad saying,

1:58:26

yeah, I love that or just the other

1:58:28

day. One of our host said,

1:58:30

man, I really gotta buy that. That's

1:58:32

an additional benefit to you because

1:58:34

you're hearing people our audience trusts

1:58:37

saying, yeah, that sounds great. We

1:58:40

deliver always over deliver on

1:58:42

impressions, so you know you're gonna get the

1:58:44

impressions you expect. The

1:58:46

ads are unique every time. We don't

1:58:48

prerecord them and roll them in. We are genuinely

1:58:51

doing those ads in the middle of the show.

1:58:54

We'll give you great onboarding services, ad

1:58:56

tech with pod sites, that's free

1:58:58

for direct clients, gives you

1:59:00

a lot of reporting, gives you great idea of how

1:59:02

well your ads are working. You'll get courtesy

1:59:05

commercials. You actually can take our ads and share

1:59:07

them across social media and landing

1:59:09

pages that really extends the reach.

1:59:11

There are other free goodies too, including mentions

1:59:13

in our weekly newsletter. That sent the

1:59:15

thousands of fans engaged, fans

1:59:18

who really wanna see this stuff. We give

1:59:20

you bonus ads and social media

1:59:22

promotion too. So if you

1:59:24

want to be a long term partner, introduce

1:59:27

your product to a savvy, engaged

1:59:29

tech audience Visit twit dot

1:59:31

tv slash advertise. Check

1:59:34

out those testimonials. Mark McCreery

1:59:36

is the CEO of authentic. You probably know

1:59:38

him one of the biggest original

1:59:40

podcast advertising companies. We've

1:59:42

been with him for sixteen years.

1:59:45

Mark said the feedback from many advertisers

1:59:47

over sixteen years across a range of product

1:59:50

categories. Everything from

1:59:52

razors to computers is

1:59:54

that if ads and podcasts are gonna work for

1:59:56

a brand, They're gonna work on twitch shows.

1:59:59

I'm very proud of what we do because

2:00:02

it's honest, it's got integrity, it's

2:00:04

authentic, and it really is

2:00:06

a great introduction to our audience

2:00:09

of your brand. Our listeners

2:00:11

are smart, they're engaged They're

2:00:14

tech savvy. They're dedicated to our

2:00:16

network. And that's one of the reasons

2:00:18

we only work with high integrity partners

2:00:21

that we've personally and thoroughly vetted. have

2:00:23

absolute approval on everybody. If

2:00:25

you've got a great product, I wanna hear

2:00:28

from you. Elevate your brand by reaching

2:00:30

out today at advertise at twit

2:00:32

dot tv. Breakout of the advertising norm.

2:00:34

Grow your brand. With host red ads

2:00:37

on twit dot tv. Visit twit dot

2:00:39

tv slash advertise for more details or

2:00:41

you can email us advertise at

2:00:44

twit dot tv if you're

2:00:46

ready to launch your campaign now. I can't wait

2:00:48

to see your product. So it was a ring.

2:00:51

Have you tried You've got a pixel

2:00:53

there. Yeah. Have you tried that

2:00:55

ailing lens video that's supposed to crash the

2:00:57

pixel? No. I was worried. Do

2:00:59

it. God. Lola's talking about it.

2:01:01

Oh, Jason. Don't know, but I can do it

2:01:03

now. Alright. Hold on. Last time, I did something

2:01:05

like this on

2:01:06

this show. I ended up getting myself

2:01:08

kicked out of the Google account. Almost

2:01:10

never got it recovered.

2:01:11

Guys,

2:01:15

there is a particular video.

2:01:18

It is a and a clip from alien

2:01:21

that was on YouTube. D r clip. Right?

2:01:23

I believe so. I think they've fixed it.

2:01:25

Like, I think they fixed it on

2:01:27

the YouTube side and then Google's pushing

2:01:30

out a

2:01:30

fix. Well, let's click that. So it's

2:01:32

alien. Which which how do I find it? That's

2:01:34

a really great question. It's probably part of this

2:01:37

you know, I should just do a Google search shouldn't

2:01:39

I and see Let's see here. If I

2:01:40

could find a

2:01:41

link. It is this clip. But nobody responsible

2:01:43

would put a link. Alien four

2:01:46

k HDR. Get out of there

2:01:48

on Apex Cliffs.

2:01:49

Alien? Here's your four k

2:01:51

HDR get get

2:01:53

out of --

2:01:54

Oh, there. -- there. Apparently, the

2:01:57

minute you started, it crashes the the

2:01:59

whole phone. Right? That one

2:02:00

-- With the skull. -- at

2:02:02

Apex Clips. Yep.

2:02:03

Apex Clips. Alright. You to go

2:02:05

in and get rid of this crash.

2:02:08

We

2:02:08

gotta watch this crash. Maybe.

2:02:11

You think that maybe did? Yeah. I believe

2:02:13

that they

2:02:14

have It says two years ago. It

2:02:16

was posted. It's four gig. The same video.

2:02:18

So we're looking at the same video. This is the video

2:02:20

that was linked to from alright. You

2:02:22

ready? R as Technica. Okay.

2:02:24

And the second has started playing. It was supposed

2:02:26

to reboot your

2:02:27

phone. So I think they fixed it whatever it was.

2:02:30

But

2:02:30

what do they find out why? Yeah. What would

2:02:33

have caused that? Well, my understanding my

2:02:35

brief understanding from last night is that

2:02:37

there is suspicion that has something to do with

2:02:39

an HDR codec -- Uh-huh.

2:02:42

-- and something happening with this particular

2:02:44

video with HDR. I I beyond that,

2:02:46

I have no

2:02:47

clue. At least one person reported that not

2:02:49

only do reboot their phone, but they lost

2:02:51

connectivity until they reboot it

2:02:53

again. And again a second time. Right.

2:02:55

And flow flow did

2:02:58

not write about this for Gizmo to Florence

2:03:00

Ion, but she did experience this, and

2:03:02

she only wrote about it. Okay. So you can kinda

2:03:04

read her experience, either

2:03:05

fixed or -- Yeah. -- it doesn't

2:03:08

impact all -- Too

2:03:09

bad. Because it was a But it would have been

2:03:11

another great viral moment like you

2:03:14

knocking yourself off Google, like me Jason,

2:03:16

that was in an upcycle that was on this.

2:03:18

On my calendar's notes. Oh,

2:03:21

wow. But, you know, how long were you

2:03:23

off, Jason?

2:03:25

Before -- Out of my account. --

2:03:27

checking back on. Yeah. My Google account I

2:03:30

mean, all things considered,

2:03:32

I think it was only a couple of

2:03:33

days, but it was still

2:03:35

was scary. It was still scary. I mean,

2:03:37

it was a situation that made me realize

2:03:39

just how dependent upon my Google

2:03:42

account actually am and how much I really

2:03:44

lose if I lost

2:03:45

it. On Sunday, we had a guy call. He

2:03:48

his family asked him to digitize all the

2:03:50

family VHS tapes he

2:03:52

did. It included a

2:03:54

video of him and his sister as kids

2:03:57

in a in taking a bath or swimming

2:03:59

or something naked. Google

2:04:01

determined it was a child born, it wasn't,

2:04:03

obviously. And he has lost

2:04:05

his accounts, some business accounts, all of his

2:04:07

boom. Gone. And the and

2:04:10

he's appealed, and he's but the problem

2:04:12

is, you know, Google, it's hard to get any it

2:04:14

really is. Action out of them. Every once

2:04:16

in a while, I get a I get an email. Right?

2:04:18

Very ran them away from someone online

2:04:20

to, you know, my personal account or my work

2:04:22

account saying, hey, I ran across your video

2:04:25

on YouTube, this happened to

2:04:26

me. How did you get it figured out? And I always

2:04:29

hate, you know, to have to reply and be like, look,

2:04:31

it's

2:04:31

I have connections. Yeah. Like, because of

2:04:33

what I do, I I knew someone

2:04:35

who was able to really help me. And

2:04:38

that's probably not an option for

2:04:40

you, and I'm so sorry. I wish I had a better

2:04:42

answer. Don't

2:04:43

Well, I told me today, I get

2:04:45

people who come to me about Dell computers.

2:04:49

About two thousand five, I wrote Dell hell.

2:04:52

And to this day, someone could be, I'm having a

2:04:54

problem. Can you help

2:04:55

me? Did you have a connection? No. I don't.

2:04:57

Del Hell, is that the name of the article or

2:04:59

the book? Or

2:05:00

Oh, you don't know that story? No.

2:05:02

I'm not certain that I know that you're Dell hells.

2:05:05

I complained on my blog. This is before

2:05:07

Twitter about my my laptop. I

2:05:09

said, Dell sucks. And

2:05:12

then all kinds of things I was accused

2:05:14

of dropping the stock price of Dell, honestly,

2:05:16

but twenty percent. Michael Dilts came

2:05:18

back to the company. Power started

2:05:20

their blog with Weinhold Menchaka.

2:05:23

They hired they they they transferred a bunch

2:05:25

of technicians to go solve bloggers'

2:05:27

problems before they blew up. And this

2:05:29

led to the whole kind

2:05:31

of structure now of social

2:05:33

media crisis management. Around

2:05:36

companies and and jerk

2:05:39

customers like me. Well,

2:05:42

what

2:05:43

knowing you, Well,

2:05:48

alright. So was there really an

2:05:50

issue? mean Yeah. There was.

2:05:52

There

2:05:52

was. My Dell Hell,

2:05:54

August twenty ninth two thousand

2:05:57

five, I'm just a citizen,

2:05:59

a consumer. A guy who has

2:06:01

my own printing press. I

2:06:03

have a doctor's And I get to use it

2:06:05

however I want. Oh,

2:06:09

so this was on buzz machine. Dell lies.

2:06:11

Yep. Yeah. The the headline was Dell

2:06:13

lies and Dell

2:06:15

sucks. Not my proudest moment.

2:06:18

Not my most mature. Wow. Wow.

2:06:22

Well, good for you. You

2:06:25

know, one thing I realized, I I was

2:06:27

watching TV, somebody had passed away.

2:06:30

And it was the guy. Okay.

2:06:33

So it was the guy who Dick Cheney

2:06:35

shot in the face by accident? Oh,

2:06:37

right. He passed away a couple of maybe a few

2:06:39

weeks ago. And the poor guy, this is his

2:06:41

biopic. That's the open.

2:06:43

That's the open. They don't talk about anything

2:06:46

else. They don't

2:06:46

talk about his lie. It's just the guy, Joe Jamie.

2:06:48

It wasn't even it he was the victim. He'd

2:06:50

even do it. The guy Dick Cheney

2:06:52

shot by accident. And I thought, that's

2:06:55

the

2:06:55

problem. Yeah. Jeff,

2:06:57

when you die, Dell hell.

2:07:00

Dell hell?

2:07:01

That was it. You had

2:07:02

your moment. That was my moment. What's insane,

2:07:04

man? What's

2:07:04

mine gonna be center

2:07:06

devil. I don't know. Oh, No. No. No. No.

2:07:08

No. No. It's gonna be that you

2:07:10

were a a giant whole

2:07:12

robot. Yeah. That's

2:07:14

right. It'll be. Yeah. That's true. We

2:07:16

do have it all written down.

2:07:19

Here's here's Jeff's

2:07:20

area. You guys even answered that. That's

2:07:22

it. Yeah. It might be

2:07:24

that I was a dev null. That would be That

2:07:27

would be depressing. Yeah. That would be depressing.

2:07:30

You may remember him as that crappy animated

2:07:32

thing. I've never seen it. Oh,

2:07:37

lord. Pissed off, sold

2:07:39

it out or Bryant regularly. Yeah. So

2:07:43

we should probably do a little Twitter update.

2:07:45

Elon right now is on stage talking about master

2:07:47

plan number three. It's funny

2:07:50

when his first master plan came out.

2:07:52

I was very impressed. We talked a lot about it

2:07:54

master plan too as well. He was executing.

2:07:56

You know, it was things like create

2:07:59

a high priced electric vehicle

2:08:01

to support the creation of

2:08:03

an affordable electric vehicle to

2:08:05

save the planet. Stuff like that. was great.

2:08:09

I don't know if people are as quite as interested

2:08:11

in master plan three. Like,

2:08:14

maybe the bloom is off the rose

2:08:16

a little bit, but we'll we'll give you an update as

2:08:18

soon as it's released. I don't know if they've if they've done

2:08:20

it yet. Meanwhile, Elon

2:08:23

is still running Twitter. Although, According

2:08:26

to platformer, they're the they're the that's

2:08:29

gonna, by the way, be pacers. That's gonna

2:08:31

be Casey Newton's obituary. As

2:08:35

he he followed the Twitter collapse

2:08:37

in the mid twenty twenties.

2:08:41

According to Zoe Shiffer and Casey Newton,

2:08:43

what's been going on at Twitter might give you some

2:08:45

hint about the next CEO. So

2:08:50

clearly, AI generated bluebird --

2:08:52

Yep. -- dally. dally. That's great.

2:08:54

Yep. No surprise. So

2:08:59

Davis, who was currently CEO

2:09:01

of the boring company, loaned himself

2:09:04

out to Twitter when Elon bought

2:09:06

it. It turned out he was one of that gang

2:09:08

and has emerged as one of Musk's top

2:09:11

lieutenants. Even he had just He is

2:09:13

wife had just had a

2:09:14

baby, and I think he was bringing the newborn to

2:09:16

Twitter. Big layoffs.

2:09:19

other was sleeping in the office if

2:09:21

I -- Yeah. -- if I read read that correctly. Or another

2:09:23

new baby and the family was sleeping in the office. You

2:09:25

believe that? Another that's a

2:09:27

little more dedication than any totally. I

2:09:30

I thought the same thing. Okay.

2:09:33

Another random lamp since Saturday, another

2:09:35

two hundred. Employees, including

2:09:38

some big names. Esther

2:09:41

Crawford is the one most people talked

2:09:44

about. She was the one who was sleeping in the

2:09:46

office. Posted on Twitter the picture

2:09:48

of her in sleeping bag with a hashtag

2:09:51

charge of a blue hashtag

2:09:53

live where you work or sleep where you

2:09:55

work. Which is not a good ass damper.

2:09:57

And he'll normalize that. But also Leah Culver,

2:10:00

who have huge respect for her. She was in charge

2:10:02

of Twitter spaces. A

2:10:04

a serial startup person did

2:10:06

some really interesting. So think she did pounce.

2:10:09

Right? Which is a Twitter competitor

2:10:11

in the early days. Oh, yeah. All

2:10:13

how All people on the do

2:10:16

not fire list Martin

2:10:18

DeKuyper, who was the creator of review,

2:10:20

the newsletter program Twitter

2:10:22

bought, and then under Elon

2:10:25

has deprecated, has gotten rid of.

2:10:28

The thought is according to platformer,

2:10:30

it was gonna be so expensive to pay them

2:10:33

out that they were on a do not fire

2:10:35

list but

2:10:37

there's some some suspicion that

2:10:40

perhaps Elon isn't

2:10:42

planning on paying them out. Bankruptcy.

2:10:46

Yeah. They all can't afford. Yeah.

2:10:49

The company's head of sales, Chris Reidy.

2:10:51

Cut. What's

2:10:53

to sell? And all all of this under

2:10:57

the guidance of of

2:10:59

Davis So there's some suspicion

2:11:02

that Davis is the guy Elon is thinking

2:11:04

that's gonna take it over. Although, I hate to be

2:11:07

it's very much like the Pollot Bureau under

2:11:09

Stalin. You know, okay,

2:11:11

here's the next rental loyalist who are I

2:11:14

hate to be the next one in

2:11:16

line, right, beyond that. Yeah.

2:11:18

Did you ever learn? I made it

2:11:20

this time. Oh, I'm invincible. It

2:11:22

just hasn't gone to you yet. In December, the

2:11:25

information reported that Musk tasked

2:11:27

Davis with cutting half a billion in

2:11:29

costs and steady cut close to a billion,

2:11:32

all while sleeping in the office with his partner and

2:11:34

their newborn child. There you go. His

2:11:36

success in bringing down costs by any means

2:11:39

necessary has led to growing speculation internally

2:11:41

that Musk will choose him to be the next

2:11:43

CEO says, So he shipper in

2:11:46

case he didn't. He's

2:11:50

father Robert tweeted today that

2:11:53

Twit that social has been more reliable in the last

2:11:55

six months than Twitter. Is that the case? Has

2:11:57

Twitter been Twitter was down for a lot of

2:11:59

people. Right? And then the the timeline wasn't

2:12:01

loading? There've been some issues.

2:12:03

Yes. I

2:12:04

haven't had any accessibility issues

2:12:06

like that with Twitter. Okay.

2:12:09

I've

2:12:09

noticed a ton of notifications like I had

2:12:11

to mute. I basically muted all Twitter notifications

2:12:14

because suddenly it just ramped up. I was getting

2:12:16

a ton out of nowhere. Like

2:12:18

a lot of promotional, like, oh, in case

2:12:20

you missed it sort of things. See,

2:12:23

it's very pluggy when I log into

2:12:25

it. It's like, oh, it it really wants to be

2:12:27

smart and tell me what I wanna see, and

2:12:30

that has changed significantly. So maybe that's part

2:12:32

of the reason why I'm not using this much. But I haven't

2:12:34

I haven't seen it being, like, you

2:12:36

know, seen it broken

2:12:39

in any in any real major

2:12:41

way. I haven't encountered that. Which

2:12:43

I I'm kinda surprised about because

2:12:45

I I expected with all of the layoffs

2:12:47

that things would get really choppy and

2:12:50

I have that hasn't necessarily been my experience

2:12:52

so far. Elon

2:12:55

Musk's defense, according to

2:12:57

Siva Viganothan, of Scott

2:12:59

Adams, shows why he is misguided and

2:13:01

dangerous. When

2:13:04

Adam's makes a fool

2:13:06

Adam's a creator of Gilbert, makes fool

2:13:09

of himself, he mainly just harms himself, but

2:13:11

Musk has the power to harm others

2:13:13

said, CEVA. Scott

2:13:16

Adams lost his syndication for

2:13:19

the Gilbert comic strip after a

2:13:21

a racist of video on YouTube,

2:13:24

but Elon's

2:13:29

kind of been supportive. And

2:13:31

Elon also So he's he's he's racist.

2:13:33

I might add. Oh, you can as as

2:13:35

somebody said on Twitter, you can take Elon out

2:13:37

of South

2:13:38

Africa. Japan take the South Africa and Elon.

2:13:40

Yeah. Wow. Yeah.

2:13:43

Musk said I don't agree with everything Scott says,

2:13:46

but Gilbert is legit funny and insightful.

2:13:48

We should stop canceling

2:13:50

comedy. Actually,

2:13:52

Gilbert's not that funny. No.

2:13:54

It's okay. The

2:13:56

Del Marino was a funny

2:13:58

thing, but no. Yeah. He Musk

2:14:01

also said

2:14:04

you know, let's

2:14:06

see. The

2:14:08

the media is racist. Okay.

2:14:11

That was the other thing. That's great. Okay.

2:14:13

Musk said, without Evans, for a very long time, US

2:14:15

media was racist against non white people.

2:14:17

Now they're racist against whites and Asians.

2:14:20

Same thing happened with elite colleges and

2:14:22

high schools in

2:14:22

America. Maybe they could try not

2:14:24

being racist. Interesting.

2:14:27

He's just purely out of out of white supremacy

2:14:29

land. Pure. Yeah. Musk

2:14:33

agreed with tweet saying Adam's comments

2:14:35

weren't good, but had an element of truth

2:14:37

By the way, this is what racists always said. In fact,

2:14:39

this is what racists always think, racists

2:14:41

never say, I'm I'm a racist. They don't even

2:14:43

think they're

2:14:44

racist. You say no. I'm just telling the

2:14:46

trends

2:14:46

is black.

2:14:47

Yeah. It just telling me right. Yeah.

2:14:49

Exactly. Yeah. I'm really

2:14:51

close friends with black person.

2:14:54

Obviously, she's. And, of course,

2:14:56

Elon's tweets in support of

2:15:00

Scott Adams were responded to by Scott

2:15:02

Adams. Saying

2:15:04

thank you, Elon. So he's still on Twitter.

2:15:09

Elon Musk Elon's Elon's Go

2:15:11

ahead. Elon's Go ahead. Elon's

2:15:12

gonna say the same thing I

2:15:14

am. There you go. You're gonna build his

2:15:16

own AI. That is

2:15:18

not is not to

2:15:20

woke. Because it's Chassis and

2:15:22

GPT is to woke, and we're gonna

2:15:24

build an AI that's not woke. This

2:15:26

is according to the information. We're

2:15:28

gonna dial back the wokeometer. He

2:15:32

raised it in his

2:15:33

You'd have to be an idiot to go to work for Elon,

2:15:35

but he recruited team develop open AI

2:15:37

rival. Okay.

2:15:39

Fine. Who are you gonna get? What money?

2:15:41

Where's the money to do that? Oh, wait a minute.

2:15:44

He's the richest like, you know, this is the thing.

2:15:46

People talk about all it's going bankrupt. He's spending

2:15:48

all his money. He's still he's now once

2:15:50

again the richest man in the world. Has

2:15:52

something like a hundred ninety seven billion

2:15:54

dollars. He could lose all forty

2:15:56

four billion and still be in

2:15:59

the top ten richest people in the

2:16:00

world. He's not at risk of anything.

2:16:02

Nope. Well, Tesla

2:16:05

stock is no. But he's moved more than forty four

2:16:07

billion. He's leveraged at all on Tesla

2:16:09

stock. He set up my mind again. That's not somebody

2:16:11

was saying to me that one of our students was saying to me other

2:16:13

day is I said I think that Twitter will

2:16:15

go bankrupt and the banks will

2:16:17

own it for thirteen billion They

2:16:19

said, yeah, but so much a Tesla is mortgaged

2:16:21

on it. Elon is

2:16:24

a tweet. About

2:16:27

censoring, you know, chat

2:16:29

GPT wouldn't tell a

2:16:31

joke in the style of it would

2:16:33

would tell a joke in the style, Jerry Seinfeld. But

2:16:36

would not tell us to joke in the style

2:16:38

of Dave Chappell, which is weird,

2:16:40

but okay. Some

2:16:43

people said that's just because it's not as good I

2:16:46

can't I can't do it. But Elon

2:16:49

said, you know, we need we need a

2:16:51

what was it? We need nonwoke,

2:16:55

GPT, chat, GPT. Another

2:16:58

instance in response to a user asking open

2:17:00

CEO, Sam Altman, to turn off the woke settings

2:17:02

for GPT. Musk replied

2:17:04

saying the danger of training AI to be woke, in

2:17:06

other words, lie, is deadly.

2:17:09

Now, one thing I gotta point out, there's no

2:17:11

definition of

2:17:12

woke. Woke is

2:17:14

You just There's so

2:17:15

many things I disagree with. Right?

2:17:17

Woke started in black

2:17:19

America. I know it's

2:17:20

ironic. I coop it.

2:17:22

It's ironic. Yeah. It's

2:17:24

not just ironic. No. It's on purpose. It's on purpose.

2:17:26

You coop Language of your opponent.

2:17:29

Yeah. Yeah. Mozilla

2:17:32

is leading

2:17:34

Mammoth's precede funding.

2:17:37

Mammoth is an asked Mastodon app,

2:17:39

which I actually don't like very much. We

2:17:42

don't No. On it's on macOS,

2:17:44

but I guess it's also on iOS. It

2:17:47

was originally built on iOS Yeah.

2:17:50

I think it's a terrible app, frankly. And I don't

2:17:52

understand why you need one on the Mac since you could just

2:17:54

use the browser, which is much better. But

2:17:58

again, if if it's a venture capital funded

2:18:01

effort in the Fediverse, I'm

2:18:04

gonna be inherently advise

2:18:06

against it because I am woke. Just

2:18:11

telling you right now, the devil is

2:18:12

woke. Devil, whoa. Comment as

2:18:15

you work.

2:18:16

Devil's woke. There we go. Showtime. Showtime.

2:18:19

Devil's woke. What

2:18:22

does woke up? And and

2:18:24

is it being woke up?

2:18:28

I'm the wrong person to ask because apparently,

2:18:30

I'm not woke enough either

2:18:32

and the black man. I get a lot

2:18:34

of crap thrown at me and people

2:18:36

-- Oh, really? --

2:18:37

hit me because I don't always

2:18:39

agree with some of the stuff that that said

2:18:41

in the, quote, woke community

2:18:43

and stuff like that because something

2:18:47

doesn't make sense to me. Is the is

2:18:49

the premise if if

2:18:51

you're asleep, you're not aware of

2:18:53

semic racism, but once you wake up -- Yeah. --

2:18:55

you're aware of

2:18:56

it. Right? Because I and you're aware

2:18:58

of it. I live in a war in this world, and

2:19:00

I know that racism is this. I've

2:19:03

experienced racism, but

2:19:05

I'm the I'm probably one of

2:19:07

the few black folks that you're gonna meet that says,

2:19:09

you know what? Not everything that

2:19:12

that happens to me on a bad note is

2:19:14

because of

2:19:15

racism. Something is my bad

2:19:17

decisions. Yeah. You know? That

2:19:19

seems fair. I would I

2:19:21

am not when I say stuff like that. I would

2:19:23

I would never dream to say anyone as woke or

2:19:25

unwoke.

2:19:27

I don't I don't even like that word, to be honest

2:19:29

with you.

2:19:29

So Yeah. I I don't really it's gotten I don't

2:19:31

even really get into it too often because most of

2:19:33

the time, people just

2:19:35

Not for me to say black people. It's telling you

2:19:37

you should feel and Yeah. Right?

2:19:40

That's where that's where the origin is in

2:19:41

British police brutality. That's where the origin Oh,

2:19:43

Arjan's police brutality or Yes. Benito

2:19:46

says. Oh, I'm sorry. I mis mispronounced it. Miss

2:19:48

Benito. So it's not about systemic

2:19:50

racism. It's waking up to police brutality

2:19:52

of Benito. Something

2:19:55

along those lines. That's where it started. That's where the

2:19:57

origin is from.

2:19:58

I'm gonna have to take it over. Check

2:20:00

the urban dictionary on this.

2:20:01

So the the expert at this is Meredith

2:20:04

Clark, who's also an expert in black Twitter.

2:20:06

I put up something on bottom

2:20:07

of that one down. So sad that your black Twitter

2:20:10

event is not somewhere saved.

2:20:13

Coming back. And

2:20:15

I know he's feeling it. Here's here's

2:20:18

what the Neva AI says.

2:20:20

Being woke is defined as being aware of social and

2:20:22

political issues, particularly those related to racism,

2:20:24

and social injustice. It's also used to describe

2:20:27

somebody's conscious of disparities between

2:20:29

different demographics and socioeconomic standings.

2:20:32

Yep. Someone's pretentious about how much they

2:20:34

care about a social issue. That's the

2:20:36

other side of it. Finally, two has described someone

2:20:38

who's asleep. And uncritically accepting

2:20:40

whatever nonsense social science professors

2:20:43

dream up to advance Marxist

2:20:45

goals. Oh, wow. That's

2:20:48

the that's the other side of it.

2:20:51

That's the everything that's been piled

2:20:53

onto what Right. Yes. Right.

2:20:55

Right.

2:20:58

Alright. Now that

2:20:59

neither result, was that the AI summarizing

2:21:02

or was that

2:21:02

a clip? Okay. I was the AI summarizing. Interesting.

2:21:06

It was officially added to the dictionary in twenty seventeen.

2:21:09

It's I think this is one of the things that was summarizing.

2:21:11

Alright. Anyway, enough.

2:21:14

Enough enough. Should we do

2:21:16

a quick change log? Let's do it. Sure.

2:21:18

What the hey.

2:21:20

Google change log.

2:21:23

This is one of those weeks where I was like, god, I'm not

2:21:25

finding much and then suddenly -- Oh. -- there's a lot.

2:21:27

There's ton in here. Mister change log

2:21:30

you should be doing this, not me. Go

2:21:32

ahead.

2:21:34

Okay. New changes coming

2:21:36

to Android. A lot of

2:21:38

minis, like, miniature changes,

2:21:40

things like a Google Keep note

2:21:42

widget for storing, like,

2:21:44

a single note on your home screen and, you know,

2:21:47

always adding to or moving from it, whatever.

2:21:50

So Wear OS shortcuts for Keep. Chrome

2:21:53

getting three hundred percent zoom,

2:21:56

which you know, this is one of those updates

2:21:59

where it's

2:21:59

like, here's it was timed with Mobile World Congress

2:22:01

and we talked about it a little

2:22:02

that's why there's so much on All About Android.

2:22:05

It's like, here's a bunch of announcements,

2:22:07

but they're all, like, really small. Right.

2:22:09

So

2:22:10

Right. It's kinda silly. And this

2:22:12

is for the current Android, or is this the

2:22:14

next edition? No.

2:22:16

To be clear, these are updates that

2:22:18

hit, I think,

2:22:20

everyone on Android because it's done through Google

2:22:22

Play Services. So you're not waiting for a major

2:22:24

OS update this just kind of

2:22:26

happens. Google's announced these some of

2:22:28

them may be, you know, in progress, and

2:22:31

some of them already rolled

2:22:32

out. Noise cancellation in Meet. That's

2:22:34

good. Yep. Noise cancellation, PDF annotation

2:22:36

in Drive, Emoji Kitchen combos.

2:22:39

And Fast Pair coming to Chromebooks. This was announced

2:22:42

Oh, yeah. That's kinda like for Jeff.

2:22:44

Yeah. That's kinda like what happens on a Mac

2:22:46

where you open up your Bluetooth

2:22:47

headphones. It just goes, I see that. I mean, you use

2:22:50

I mean, with with a phone, I

2:22:52

I interact with Fast Pair a lot,

2:22:54

and it's awesome when it works.

2:22:56

Yep. And it works pretty well

2:22:58

now. So Okay.

2:23:01

Awesome

2:23:02

when it works. That's right.

2:23:05

Am I doing the rest of these or it's I'm

2:23:07

I'm happy to Well, you know, you just checked off three

2:23:09

out of the list, which is good. Android's

2:23:12

adding support for e why don't we trade it

2:23:14

off? I'll do one. You

2:23:14

do one. Okay. Android's adding support for

2:23:16

eSIM transfer between devices. I

2:23:19

didn't know it didn't have that. eSIM is really

2:23:21

a revolution in how smartphones work.

2:23:24

So that means you can have an account.

2:23:27

In fact, I kinda did it already.

2:23:29

I had an account on my pixel

2:23:32

six then I got the Samsung

2:23:34

Ultra. Mhmm. And in theory, the

2:23:36

eSIMs could just, you know, transfer

2:23:39

over. Yeah. And then you've

2:23:41

got the Samsung for

2:23:43

review. So I gotta transfer it back, that

2:23:45

kind of thing.

2:23:45

Right. But that's all we'd have to do.

2:23:47

So we end up using that.

2:23:48

It's it's all done in software. Yeah. Yeah.

2:23:51

Yeah. It's all done. So You know, actually, you if

2:23:53

you have Google Fi account, it

2:23:55

kinda does that for you anyway. When I

2:23:57

got the Samsung

2:23:58

Ultra, it says, complete the activation, install

2:24:00

Fi. Oh.

2:24:01

Because, I mean, you don't need to put in a SIM. That's

2:24:03

nice. Yeah. I've got Mint. So

2:24:06

I've got the They also support eSIM.

2:24:08

They could do this. They have eSIM. Yeah.

2:24:11

For I mean, a former

2:24:12

sponsor. Yeah. For whatever reason,

2:24:14

I'd I feel more comfortable having the physical

2:24:16

SIM. It's just so easy to look at it and be like,

2:24:18

I want this in the light. Boom.

2:24:21

I'm done. You're such a luddite.

2:24:23

I know. Right? I'm becoming a luddite as

2:24:25

I get older apparently. Let's

2:24:28

see here. This one I'm not super familiar

2:24:30

with. Chrome has improved

2:24:32

memory use and battery life on

2:24:35

the MacBooks. We talked about this yesterday

2:24:37

on Mac Break Weekly. Mhmm. Chrome is a notorious

2:24:39

pig. Well, yes, it is. On on

2:24:42

Macs. No question. And so they're trying

2:24:44

to do better. Is it really?

2:24:46

Yeah. Yeah. They talked about some things like

2:24:48

the tabs, they can sleep them faster. If

2:24:50

frames still sleep, there you

2:24:52

know, there is a power setting

2:24:55

on Chrome. Which only gains you thirty

2:24:57

minutes. But they claim on

2:24:59

a MacBook Pro, fourteen inch, you can

2:25:01

get seventeen hours of, you

2:25:03

know, video time. Okay? That's nice.

2:25:06

Nice. Watching videos for seventeen hours.

2:25:08

Alright.

2:25:09

That's because that's a lot. Those

2:25:11

times that I'm watching videos. For

2:25:12

seventeen hours. Yes. As long as a baseball game,

2:25:14

I think. Right. A little bit longer.

2:25:17

Just a smidge. Waymo's starting autonomous

2:25:19

testing in LA with no human driver. Waymo.

2:25:22

Didn't they announce was it Waymo?

2:25:24

No. It was Cruise that announced a million

2:25:26

miles without a -- Yeah.

2:25:29

-- Cruise. That's amazing. Everywhere

2:25:32

now you started. So I said earlier in the show

2:25:34

that, you know, full self driving was kind of

2:25:36

AI that hasn't taken

2:25:39

off, but I guess it really has some ways

2:25:41

it

2:25:41

is. Yeah. Not taken off in

2:25:43

the way of we're all riding

2:25:46

around in vehicles every day that are

2:25:48

driving

2:25:48

themselves. But this is where it begins.

2:25:51

Right? This is where that that trend because It

2:25:53

was Waymo Waymo one million miles with

2:25:55

no human driver on public

2:25:57

roads.

2:26:00

So that's a lot. They have even more

2:26:02

with a safety driver, eight million

2:26:05

miles. Mhmm. Now they also

2:26:07

revealed that they'd had a couple of accidents

2:26:10

Bunch of fender benders. But

2:26:12

for a million miles, that's a lot better

2:26:14

than a human. Absolutely. Why doesn't

2:26:17

Waymo get as much press as Tesla?

2:26:21

Because you because you can buy a Tesla.

2:26:23

You can't. Waymo's limited to a

2:26:25

few markets. Phoenix, like, San

2:26:28

Francisco, LA, And

2:26:31

so that's why I think most people don't have any

2:26:32

experience. I don't know. I've never ridden a Waymo, have you?

2:26:35

Nope. You've never been in one. But then you raised a

2:26:37

really important point, because Waymo

2:26:39

actually does it when Tesla

2:26:41

is only promising

2:26:42

it. Well, Tesla has kind of admitted

2:26:44

in filings that its so

2:26:46

called full self driving is level two driver

2:26:49

assist, which isn't really much.

2:26:51

Yeah. So we now know that's

2:26:53

just a marketing

2:26:54

term. That test You know, we talk about

2:26:56

regulation and I and I sound

2:26:58

like a libertarian in these moments when we do. If

2:27:00

you're gonna regulate

2:27:01

anything, self driving cars seems to

2:27:04

be the thing you'd want to regulate from the get

2:27:06

go.

2:27:06

Yeah. Yeah. Well, they are technically. Mhmm.

2:27:09

It's What's the do we must a way with stuff. They

2:27:11

haven't done an aggressive

2:27:12

trade. Crap on. A lot of

2:27:14

people complain about NHTSA, which sets the national

2:27:16

standards. That's the National Highway

2:27:18

Traffic Safety Administration. It has

2:27:20

kinda, up to now, given

2:27:23

Tesla a free pass, not so much these

2:27:25

days. They're getting more serious about it.

2:27:27

But they should never have been allowed to call it

2:27:29

autopilot.

2:27:30

Mhmm. No. That was that's not good.

2:27:33

No. I agree. What else do you got? Magic

2:27:36

eraser. Defining feature of

2:27:38

the Pixel devices coming to Google One subscribers

2:27:40

via the

2:27:41

photos, the Google Photos app. So

2:27:43

it's not limited to -- Excellent. --

2:27:45

devices with a

2:27:47

Tensor chip. Anybody gets Magic

2:27:49

eraser.

2:27:49

That's not a magic eraser. Even iPhone

2:27:52

users Yeah. And photos get magic

2:27:54

eraser. You Can we get into have do

2:27:56

we have it

2:27:57

yet? Should I try it? Oh,

2:27:59

it says starting on Thursday.

2:28:01

So it's gonna be tomorrow. It's

2:28:03

gonna be rolling out to Google One subscribers

2:28:06

using the Google Photos app. But,

2:28:08

yeah, that's Android, that's iOS, all

2:28:10

Pixel users, not just as it's

2:28:12

been limited to the six and the seven so

2:28:14

far. So nice.

2:28:16

There you go. Oh, cool.

2:28:17

Remove items from your photos

2:28:20

to your heart's content.

2:28:23

Yeah. Yeah. I would It's a feature that I

2:28:25

never use, but No. You don't use it. I mean,

2:28:27

I never think to use it. I just don't spend a

2:28:29

whole lot of time editing photos on my

2:28:31

phone, I guess. You know?

2:28:33

It works fairly well, too.

2:28:34

Lightroom does this, Affinity Photo. A lot

2:28:36

of programs do it. Right? Yes,

2:28:39

sir. I would still lean on my

2:28:41

trusty Photoshop for some stuff,

2:28:43

but -- Oh, for sure. -- I've played around with it recently

2:28:46

and like say I wanted to take

2:28:48

a picture of the living room or

2:28:50

something, but Kylo was sitting there. Right.

2:28:52

So I'll erase Kylo, but Kylo's

2:28:56

shadow is still there. So now I need to go

2:28:58

back and erase any other shadows, you

2:29:00

know. And then you're hoping that it gets that

2:29:02

right, and it doesn't always do it. But

2:29:04

the average consumer would totally

2:29:07

be fine with the results that that happened.

2:29:09

You know? With me on the other hand, I'm probably just gonna

2:29:11

go into Photoshop and Yeah. Fix what

2:29:13

I need to fix.

2:29:15

Yep. You know, I have to say

2:29:18

it'll be on iPhones,

2:29:20

but and maybe this will be

2:29:22

fixed, but Google Photos can't handle

2:29:25

the iPhone format. The HEIC. High

2:29:27

efficiency. He's resting.

2:29:29

High efficiency efficiency. I don't think it

2:29:32

can. Oh, wait a minute. If you're a Google One

2:29:34

member, you get access to extra

2:29:36

editing features.

2:29:38

So maybe they did add that capability.

2:29:40

Maybe it's their own And a quick tip, the

2:29:42

magic eraser depending

2:29:45

on what you're trying to

2:29:46

erase, it works best when you

2:29:48

zoom in and then do the

2:29:50

eraser. Okay.
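
[Editor's aside: Google hasn't published how Magic Eraser works internally, but the fill-in-the-gap behavior, and the shadow problem described above, are characteristic of mask-based inpainting. A minimal sketch of that general class of technique, using the open source diffusers library and public Stable Diffusion inpainting weights. The file names and prompt are made up for illustration; this is not Google's model.]

    # A sketch of mask-based inpainting, the general class of technique
    # behind tools like this -- NOT Google's actual Magic Eraser model,
    # which is unpublished. Requires: pip install diffusers torch pillow
    import torch
    from PIL import Image
    from diffusers import StableDiffusionInpaintPipeline

    pipe = StableDiffusionInpaintPipeline.from_pretrained(
        "runwayml/stable-diffusion-inpainting",  # public inpainting weights
        torch_dtype=torch.float16,
    ).to("cuda")

    image = Image.open("living_room.jpg").convert("RGB").resize((512, 512))
    # White pixels mark what to remove. The mask must cover the dog AND
    # its shadow; anything outside the mask is kept as-is, which is why
    # a missed shadow survives the erase.
    mask = Image.open("dog_mask.png").convert("L").resize((512, 512))

    result = pipe(
        prompt="empty living room floor",  # what to plausibly fill in
        image=image,
        mask_image=mask,
    ).images[0]
    result.save("living_room_erased.jpg")
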

2:29:53

Yeah. I have a bunch of stuff because I'm a Google

2:29:55

One subscriber, but not yet magic eraser.

2:29:58

It's

2:29:58

coming tomorrow. To my phone. Maybe.

2:30:00

Yeah. I see. But I already

2:30:02

had that. Didn't I already have that on the Pixel?

2:30:05

Yeah. It was already You would already have it on your Pixel.

2:30:07

Yeah. Yeah. The seven and the six the

2:30:09

six a. You

2:30:12

know, and on the Pixel, though,

2:30:14

I think it's integrated into the camera,

2:30:16

not necessarily the photos app.

2:30:19

Limited raw, so we're already able to say,

2:30:22

I shoot a lot in raw. I probably shouldn't shoot

2:30:24

so much in

2:30:25

raw. You

2:30:26

gotta be

2:30:27

eating up your photos, cloud

2:30:29

storage. What the hell?

2:30:32

You

2:30:32

know what? I still have one of those things. Free

2:30:35

unlimited original deal. On

2:30:37

Amazon photos, so I just upload to

2:30:39

Amazon as

2:30:39

well.

2:30:40

There you go. And then I

2:30:42

think I don't have to upload the originals

2:30:45

or I can't upload the originals and

2:30:46

photos. Right? It has unlimited you

2:30:49

have to

2:30:49

turn it on if he's -- I don't know. --

2:30:51

at the web on. Yeah. Yeah.

2:30:53

Google might try and

2:30:55

and opt you into original size

2:30:58

by default the first time you launch the app. Right. So

2:31:00

you'd have to say So you just, you know, you know, either

2:31:02

do that or I think they call it high quality, which

2:31:04

is like lower

2:31:05

res, but high enough in my opinion. How

2:31:07

often am I pulling photos out of my photo

2:31:09

reel and blowing it up to a wall size

2:31:11

image, like a mural -- Exactly. -- so who

2:31:13

cares? Magic eraser.

2:31:16

There it is. Yeah. Tap to erase

2:31:18

highlighted suggestions. Click

2:31:21

or brush to erase

2:31:22

more. Yeah. It doesn't have any suggestions

2:31:24

because this is a perfect fit for the photo. This

2:31:27

is amazing. But let's let's erase my

2:31:29

wife. I

2:31:29

dare you to try to erase her. I dare

2:31:31

you. No. Don't do

2:31:33

that. No. No. No. No. That one just

2:31:35

circle it. Oh, just circle it?

2:31:38

For that for that instance, circling would

2:31:40

work totally

2:31:40

fine.

2:31:41

Oh, because we'll see what he does.

2:31:44

Oh, it's a ghost. She's almost gone.

2:31:47

What am I gonna do? Okay. So circle

2:31:50

it. Yeah. And

2:31:52

then Well,

2:31:54

that's well Well, but now I can do

2:31:56

center. I would I do additional

2:31:57

And didn't you find center by brushing

2:31:59

feet? Yeah.

2:32:00

Yeah. Yeah. Oh,

2:32:00

look at that. It's

2:32:02

like she never existed.

2:32:07

Sorry, Lisa. Right now, she's doing the same

2:32:09

deal. She's mad now. She's mad.

2:32:11

She should be not ducking, sir. She's

2:32:14

erasing your phone number from her

2:32:15

phone. There's her shadow, but there's no.

2:32:18

Wow. That's

2:32:20

cool. Yeah. There you go. Cancel.

2:32:23

I didn't wanna do that. Discard. I

2:32:25

love my beautiful wife, but there

2:32:27

are plenty of pictures where I would like to erase

2:32:30

some people in. You know, oh

2:32:32

oh, she told me, okay, she said this is

2:32:34

a good picture of these two guys with

2:32:36

their solo cups on the

2:32:37

rocks, but she said I should've erased this

2:32:40

guy. Yeah. And just keep the

2:32:42

circle.

2:32:43

Let's try it. Let's try it. They

2:32:45

don't know. They're being

2:32:46

erased. First zoom in

2:32:48

first zoom in just a little

2:32:50

bit. But first, I have to pick it. Right? Tool.

2:32:52

It's under

2:32:52

tools. Magic eraser. K?

2:32:55

It's gonna suggest it does no suggestion,

2:32:57

but now I wanna

2:32:57

zoom in just a little. And just a little. Yeah.

2:33:00

And then circle.

2:33:02

Because

2:33:02

he wanted to still think about this.

2:33:04

He didn't know he'd be a Oh my

2:33:06

god.

2:33:07

Yeah.

2:33:08

Would you even know would

2:33:09

you even know if you didn't know? Oops.

2:33:11

Undo

2:33:12

that one. So now when you zoom out and look at

2:33:14

it, It it it's like

2:33:16

he never was there. That

2:33:17

was the other thing. They didn't even name the tool

2:33:20

Stalin. Yeah. Stalin. Look

2:33:22

at that. You got it. Now it's like, actually, it is a better

2:33:24

picture.

2:33:24

Yeah. So for most for most people, that's totally

2:33:27

gonna work. No kidding. Fake

2:33:29

news. Wow.

2:33:33

Keep fake

2:33:33

Well, now what I could do is save a copy. Yeah.

2:33:36

So I have the original. Wow.

2:33:38

So have the original with that random guy.

2:33:40

So I do it and then Let's see. There's

2:33:43

there's two guys there. Two

2:33:45

guys there --

2:33:46

Yep. -- and

2:33:48

go here? No. No. One guy more.

2:33:50

Random

2:33:50

guys. Pretty good. Yeah. Okay. That's pretty

2:33:53

easy.

2:33:53

That's nice. Now you'll be able to do that

2:33:55

Four people will be able to do that. Four people. That's right.

2:33:58

Yep. Yeah.

2:33:58

Gmail client-side encryption now

2:34:00

available to more businesses. Don't get your hopes up.

2:34:03

This is not encrypted email. Right?

2:34:09

Although it does hide it from Google. Quote:

2:34:12

the feature makes it so even Google itself can't

2:34:14

see the contents of the emails it's

2:34:15

hosting, with data being encrypted,

2:34:17

before

2:34:17

it reaches their network.

2:34:18

Is that plausible deniability? Yeah.

2:34:22

Customers have sole control over their encryption

2:34:24

keys So that's yeah. You know what? That's

2:34:27

good. Even end users can

2:34:29

encrypt emails they're sending within their organization

2:34:31

as well as emails they're sending to other

2:34:33

parties even if the recipient doesn't use

2:34:35

Gmail. Well, I wonder what

2:34:37

they're using for the encryption. Well, Interesting.

2:34:43

Client side encryption for Gmail, and

2:34:46

even Google

2:34:47

can't see what you're doing. That's a Google

2:34:50

Workspace feature. I

2:34:53

think that's good. Yeah. Right? Of course, turn

2:34:55

that sucker on. Until the encryption

2:34:58

is illegal. Let's go

2:35:00

all in on

2:35:00

it. So I don't think people know this.

2:35:03

The data is encrypted at rest. On

2:35:05

Google servers and in transit

2:35:08

between Gmail accounts. But

2:35:10

the keys aren't held by

2:35:11

you, they're held by Google. So if they were

2:35:13

subpoenaed or whatever, they still would have access

2:35:16

to it. So all they're really doing is handing

2:35:18

the keys over to the client and saying, we don't

2:35:20

have it. It's out of our

2:35:22

hands.
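
[Editor's aside: the distinction being drawn here is who holds the key. A minimal sketch of that client-side idea, not Google's actual Workspace CSE implementation, which routes keys through an external key service; this just uses Python's cryptography package to show that a server storing only ciphertext has nothing useful to hand over.]

    # pip install cryptography
    from cryptography.fernet import Fernet

    # The key is generated and kept on the client. The mail host below
    # never sees it, so a subpoena to the host yields only ciphertext.
    key = Fernet.generate_key()
    client = Fernet(key)

    ciphertext = client.encrypt(b"draft: quarterly numbers")

    # This is all the server ever stores or transmits:
    print(ciphertext)                  # opaque token, useless without the key

    # Only a client holding the key can recover the message:
    print(client.decrypt(ciphertext))  # b'draft: quarterly numbers'
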

2:35:24

Well, speaking of your hands and things being

2:35:26

in or out of them, the Pixel Watch

2:35:28

as a horrible segue. Well, I

2:35:32

appreciate the effort, Jason. Sometimes

2:35:35

it feels right to try and segue.

2:35:38

Fall detection coming to your

2:35:40

pixel watch. This was something that we've known about

2:35:42

for a while.

2:35:43

I could have used that getting out of the studio, Leo.

2:35:45

Yeah. You know, my Apple Watch did

2:35:47

not say -- No. -- not just taking a fall. But

2:35:49

the other day, I was making the bed and it said, did

2:35:51

you just fall? No. I was

2:35:54

fluffing. I was merely

2:35:56

fluffing. So

2:35:59

the worst thing is So at first, it'll

2:36:01

say, did you just fall? If you don't see it or

2:36:03

feel the buzz, it will then go, what?

2:36:05

What? Calling

2:36:07

911. No.

2:36:10

It's I really so far, I've been able

2:36:12

to stop it, but it happens more than it ought to.

2:36:14

Kind

2:36:14

of frightening. Yeah. What's the

2:36:17

exercise you do where you have dumbbells

2:36:19

and you lift one and then you put it down and you do

2:36:21

the other one like that. That's

2:36:23

either doing a bent over row, something like that

2:36:26

would you push up. Something with it. Something. And

2:36:28

that that's the one that sets it off every time

2:36:30

I put down the when I put down

2:36:32

the

2:36:32

dumbbell, and they

2:36:33

go Working on those lats. Yeah.

2:36:35

Get my lats. I was

2:36:37

just fluffing Google. Okay.

2:36:40

So now you too, on Pixel Watch, will have

2:36:42

that fine experience. There you go. Finally,

2:36:46

your Google Docs are about to look a

2:36:48

little bit

2:36:48

different. Did you put this in here? Is

2:36:50

that a change log,

2:36:52

really? No. Sometimes I don't know. A

2:36:54

small refresh. Oh, okay.

2:36:56

Well --

2:36:57

Yeah. -- little

2:36:58

material through some of that in early

2:37:00

on when I didn't have many things and then

2:37:02

That's probably it. I didn't trim it out though. And

2:37:04

that's the Google change

2:37:07

log. Now you

2:37:09

know, you know, it's him. It's

2:37:11

all it's all

2:37:11

him. It's all me. Well, it's the

2:37:13

beauty. And then sometimes you

2:37:16

you go off the rails and you take

2:37:18

chat room suggestions. We don't for

2:37:20

some reason, we we wait a minute. Wait a minute. Do

2:37:22

we have a ScooterX change log? I

2:37:25

don't know. Let's see here. Political

2:37:27

turmoil, defy satellite link.

2:37:29

Motorola Defy Satellite Link

2:37:31

unveiled at Mobile World

2:37:34

Congress.

2:37:35

So so just for the chat room, change

2:37:37

log has to be Google related. That's

2:37:39

that's the Motorola

2:37:40

account. Yeah. Motorola is not

2:37:42

a Google

2:37:43

Let's say war. Not anymore.

2:37:45

Right? It's a part of the wait a

2:37:47

minute. Here it is. He's got

2:37:49

a bunch of the scooter x one. I'll just

2:37:51

read the titles. YouTube help forum disables posting

2:37:54

in new comments changes planned. Android

2:37:56

fourteen will bring passkey support for Dashlane

2:37:58

and other apps. Android thirteen QPR

2:38:01

two beta gets final feedback survey before

2:38:03

launch. LumaFusion video editor

2:38:05

now fully available for Android and Chrome OS.

2:38:08

And you can now access Google tasks

2:38:10

on the web without using Gmail's sidebar.

2:38:13

Thank you. You've done better. That's the scooter

2:38:15

change log. There we go. There you go. Scooter

2:38:17

X. Alright.

2:38:21

Get ready. Your picks of the week. Are

2:38:23

next on our agenda.

2:38:29

But first, I wanna plug

2:38:31

the club just briefly. Mhmm.

2:38:33

First of all, before I go into that,

2:38:35

just thank you club members because you make so much

2:38:37

possible here. And increasingly, we're

2:38:39

having a hard time, frankly, selling ads.

2:38:42

It's not just us. Everybody's seeing downturn

2:38:44

in podcast ad sales. But unlike

2:38:47

NPR, which can, you know, fire

2:38:49

ten percent of its staff when it's down three hundred

2:38:51

million dollars in ad sales, We

2:38:54

don't wanna do that. So that's

2:38:56

why two years ago, during

2:38:58

COVID, Lisa created Club TWiT, and

2:39:00

it has been a boon. Thank you. Club

2:39:03

TWiT is seven bucks a month. That's all it is.

2:39:05

You get ad free versions of all the shows.

2:39:07

You get special shows we don't put out in public

2:39:10

on the TWiT plus feed like Hands-On Mac

2:39:12

with Mikah Sargent, Paul Thurrott's Hands

2:39:14

On Windows, The Untitled Linux Show

2:39:16

with Jonathan Bennett, the Giz Fiz with Dick

2:39:18

DeBartolo, Stacey's book club,

2:39:21

you get so much extra stuff. Ant's put

2:39:23

together a bunch of great events

2:39:25

coming up. I think Sam Abuelsamid is

2:39:28

is this Thursday. Right? That

2:39:30

is correct. In the club. That's gonna be fantastic.

2:39:33

A little AMA with Sam. He's our car

2:39:35

guy. Stacey's book club is coming

2:39:38

up next month. Victor

2:39:40

will be under the microscope

2:39:42

in our inside TWiT chat. He's one of our great editors,

2:39:45

Alex Wilhelm, who you'd probably

2:39:47

know well from TWiT; he's been on for years.

2:39:49

Just had a baby. We're gonna do an AMA

2:39:52

with Alex and then Sean Powers from

2:39:54

Floss Weekly. There's a lot of fun

2:39:56

also in this I mean, think the Discord

2:39:58

is really more than

2:40:00

just talking about the shows. It's all the

2:40:03

stuff that you're interested in. In fact, we've got to

2:40:05

add let me do that right now. An

2:40:07

AI section. I

2:40:09

I feel like we should have an AI section. We should

2:40:11

have an

2:40:11

AI.

2:40:12

Yeah. We should. AI, baby.

2:40:14

What what should I just call it AI? Yeah.

2:40:17

Okay. It's a text. We're gonna create a

2:40:19

channel. Look at that. That's how easy it is. Oh.

2:40:21

And I have to alphabetically sort

2:40:23

it. So I'll drag it

2:40:24

up to

2:40:24

Wait. I was gonna say, where is it?

2:40:26

That's how it is for us.

2:40:28

Yeah. You can't do it. But that's okay

2:40:30

because Ant is very active. And so if

2:40:33

you have a request, there's a request channel.

2:40:35

You can add it. So there you go. There's our AI

2:40:37

channel. See? I'll

2:40:39

tell you

2:40:39

what, it

2:40:40

should be wait. Wait. How did beer get above

2:40:42

AI? How did

2:40:44

beer get above AI? How was

2:40:46

it not above AI? It has

2:40:48

to be alphabetical. Are you

2:40:50

serious? Oh

2:40:53

my god. AI is already alive. It's

2:40:55

alive. That's

2:40:58

why we love the club because people in the Discord

2:41:00

are club members, and it's and it's just a great

2:41:02

community to hang out in. We

2:41:04

also, I should add, have

2:41:07

stuff that, you know, we

2:41:10

like before and after the shows, there's

2:41:12

just a whole bunch of content that we don't have

2:41:14

a place to put. It all goes into the

2:41:17

club. And it's just I think a great way

2:41:19

to support what we do. I really appreciate

2:41:21

it. And I think you get the benefits

2:41:24

of it. Seven bucks a month. If you're not

2:41:26

yet a club member and you're not because you're hearing

2:41:28

this, I guess, please

2:41:30

do me a favor and go

2:41:32

to twit dot tv slash club twit and

2:41:34

sign up. You could buy it for a whole year, eighty

2:41:36

four bucks. By the way, I should point out when

2:41:38

you do that. Single handedly,

2:41:41

you are guaranteeing another twelve months of

2:41:43

shows. Because we can't stop

2:41:45

now. You've paid for shows

2:41:47

through what? First twenty twenty four.

2:41:50

We had to do so. Thank you. Yeah.

2:41:52

So by signing up for a year, you're

2:41:55

in a way single handedly extending

2:41:57

the life of Twip by an entire

2:41:58

year. How about that? Twenty

2:42:01

minutes. That always extends

2:42:03

with the next person that buys the year subscription.

2:42:05

Oh, yeah. Whoever bought it last. Yeah. It's another

2:42:07

five minutes or whatever. Gonna be here forever. TWiT

2:42:09

isn't going

2:42:10

anywhere. Well, I will tell you this.

2:42:12

Be the canary if it goes away. So

2:42:15

Then you know you got one year. The clock is

2:42:17

ticking. Oh, boy. We hope we don't

2:42:19

have to do that, honestly. But

2:42:21

we don't have, you know, we don't have the pockets.

2:42:23

We don't have, you know, government funding.

2:42:26

We don't have VC funding. I'm

2:42:29

basically spending every penny I have to keep

2:42:31

this thing on the road. So help

2:42:33

us out a little bit. Just, you know, gas

2:42:38

grass or cash. No one rides for

2:42:40

free. I think that's the slogan.

2:42:42

I

2:42:43

think I heard some sticking like that before,

2:42:46

but I can't exactly remember the

2:42:48

bumper sticker. Twit, that's

2:42:50

that would be hitting the mean streets of Petaluma.

2:42:52

Yeah.

2:42:55

It's our new slogan for the club. I

2:42:58

like it. Gas, grass, or

2:43:00

cash.

2:43:01

No rides for free. So,

2:43:03

Leo, go back to the state to the discord screen.

2:43:05

You just showed a minute ago. I wanted to scroll down. There

2:43:07

was a neat illustration.

2:43:09

Not that old. No. Scroll

2:43:11

to old. Where you were. I I didn't

2:43:14

see it on my screen. I don't know why I keep going. It

2:43:16

was somebody sitting at a desk with pictures over

2:43:18

it. Yeah. That's mister Nielsen's

2:43:20

handiwork. There we go. Yeah. Yeah. Yeah. Oh,

2:43:22

that's amazing. There you are, Jeff.

2:43:25

The old The young Jeff. There's

2:43:27

Ant. There's Leo. And

2:43:30

this should have that lo-fi tunes going

2:43:32

on in the background. Yeah. Right. How did you do that,

2:43:34

mister Nielsen? He he is a master.

2:43:36

I think it's stable diffusion. Right? Anthony's

2:43:39

really

2:43:39

good. Yeah. You know what? Did you see that somebody

2:43:41

got stable diffusion running on a phone

2:43:43

fully? Yeah. I haven't. Unfold.

2:43:47

On my iPhone. On your phone.

2:43:49

Wow. Yeah. I've had it for some time

2:43:51

offline. Wow.

2:43:52

Yeah.

2:43:52

I thought you had it on a computer. No. Well, I

2:43:55

have it on a computer, but I also have it on the phone.

2:43:57

And the reason you can do that is because the models

2:43:59

aren't that big. And the iPhone has

2:44:01

a neural processing unit. I mean,

2:44:05

you forget how powerful these things are in

2:44:07

our pocket. Yeah. What was

2:44:09

it called? I have to remember

2:44:11

what it was called. It would probably be

2:44:14

in my pictures, like

2:44:16

my pictures thing, but where is

2:44:18

that not productivity? That's for sure.

2:44:23

The opposite of productivity.

2:44:27

Maybe it's in photos. Oh,

2:44:29

yeah. It is. Draw things it's called.

2:44:32

It's a it's so oh, nope. Nope.

2:44:35

Yep. So it's

2:44:37

this is a paper cut Christmas card

2:44:39

that I drew with Draw Things.

2:44:41

If you've used stable diffusion, you'll recognize

2:44:43

the interface. That's a stable diffusion

2:44:45

interface. Yeah. And you

2:44:46

can you can add different where

2:44:50

is it? Here it is. You can add different

2:44:52

models. So It has a lot of

2:44:54

the stable diffusion models versions,

2:44:57

but it also has, you know, cyberpunk and

2:44:59

Tron and there's an open journey

2:45:01

style, Eldon Ring style. So these are just

2:45:04

the models. Now they're they're fairly big.

2:45:06

You wouldn't add a

2:45:06

whole bunch

2:45:07

of them. Let me add Eldon Ring. Oh,

2:45:09

no. I think I already have that. Need to

2:45:11

download the selected model over the network, proceed,

2:45:14

and it is as they all are one point

2:45:16

six gigs.

2:45:17

But that model is now on the phone. And the phone actually

2:45:20

works pretty fast, almost as fast as it would on

2:45:22

a GPU based system.
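
[Editor's aside: Draw Things is a native iOS app, so this isn't its code, but the flow Leo describes, downloading a model once and then generating entirely on-device, can be sketched with the open diffusers library. The model id below is a public Stable Diffusion checkpoint, and the "mps" backend is PyTorch's route to Apple silicon, standing in here for the phone's Neural Engine.]

    # pip install diffusers torch
    import torch
    from diffusers import StableDiffusionPipeline

    # The first call downloads the weights (on the order of the 1.6 gigs
    # mentioned above); after that, generation needs no network at all.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    )
    # Use Apple's GPU backend if present, else a discrete GPU.
    pipe = pipe.to("mps" if torch.backends.mps.is_available() else "cuda")

    image = pipe("a paper-cut style Christmas card").images[0]
    image.save("card.png")
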

2:45:24

Oh, yeah. So expensive.

2:45:25

Draw Things, it's called. It's free. It's

2:45:28

a stable diffusion for the iPhone. That's

2:45:30

been up for a while. Alright.

2:45:35

Picks of the week. Picks.

2:45:37

Now, Jason, you you don't have

2:45:39

to do a pick. You were last minute. I I

2:45:41

have one in Alright. What is it?

2:45:44

Well, I just thought I would kinda throw a bone

2:45:46

for the app

2:45:48

artifact. I don't know if that's been been

2:45:50

talked about but

2:45:51

Only peripherally, and I'm actually glad you brought

2:45:53

it up. I've been meaning to. This is by

2:45:56

Kevin's system, the guy who started Instagram

2:45:58

and his

2:45:58

blog. Yeah. Yeah. And I would think it was just, like, a

2:46:00

couple of weeks ago that they really opened

2:46:02

it up and allowed everybody to kinda get in and

2:46:04

start using

2:46:05

it. We talked about it a little bit on All About

2:46:06

Android last night. And I think the general comparison

2:46:10

or consensus by us on the panel

2:46:12

last night was that it's hard to

2:46:14

tell right now what

2:46:17

is different between Artifact, which is like

2:46:19

a they're they're billing it as like an AI driven

2:46:21

social news aggregator.

2:46:24

Right? Like, here are the news items that are

2:46:26

that are happening right now that

2:46:28

you would like or you would be interested

2:46:31

in. And it's hard for me to really tell

2:46:33

the differences between what artifact is doing

2:46:35

and what something like Google News is

2:46:37

doing, because I use the Google News app on on

2:46:39

Android quite a bit, and Google of course

2:46:41

is making the same determinations on the back

2:46:43

end. As far as what news it thinks.

2:46:46

Whoa.

2:46:47

What's this? Built by my friend, Bill Gross? I was just about to

2:46:49

share this to you. Oh,

2:46:50

it's a Bill Gross, too. Different Bill Gross. Oh, okay. And

2:46:52

it's New York Post. That's but I will show you the

2:46:54

disadvantage to this. When I share it

2:46:56

to somebody

2:46:57

else, send it to my mom, she's gonna really

2:46:59

Oh, and it's it's right. It's a link Yeah.

2:47:01

To not New York Post, but artifact

2:47:03

dot news. Right. And that's

2:47:05

what it's the same thing Apple News does this.

2:47:07

Google does not. Google News

2:47:09

don't No. I don't believe so.

2:47:10

No. It shares the actual article. And

2:47:12

so I

2:47:12

think that's kind of a negative on this -- Yeah.

2:47:14

-- to be able to drive those users. Right.

2:47:17

You get some stats after you've read

2:47:20

a number of articles, you know, they they

2:47:22

actually it's kinda gamified a little bit. Like,

2:47:24

if you go there, you've got thirty seven reads. I think once

2:47:26

you get to, fifty reads, then it gives

2:47:28

you kind of like a thumbs up and says, hey, you've trained

2:47:30

our system. We know you better and we'll start,

2:47:32

you know, giving you more appropriate

2:47:35

or accurate stories, but you

2:47:37

can kinda see, you know, what are the topics that I follow

2:47:39

closely? What are

2:47:41

the publications that I tend to

2:47:43

read most? Things like

2:47:45

that. So but but it's still

2:47:48

like You can also follow

2:47:49

friends.

2:47:49

I'm gonna follow Renee. Oh, wait a minute.

2:47:51

Is that following? Or is it just my fault? I think

2:47:53

it's your fault. To use it. So I'm on

2:47:55

a two day streak. I have thirty seven reads.

2:47:59

So they've gamified it. A little

2:48:01

bit. Yeah. It supposedly gets smarter

2:48:03

as you as you read. Yeah. Because

2:48:05

one of

2:48:05

those things like TikTok, you wanna be really careful

2:48:07

about what you open? Right. Be

2:48:09

really sure. Yeah. Because certainly they're, you

2:48:11

know, they're tracking your scrolls and how long you

2:48:13

pause on a certain thing and and click

2:48:15

into.
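
[Editor's aside: Artifact hasn't published how its ranking works; what's being described here, reads, dwell time, and pauses nudging per-topic interest scores, can be sketched as a toy recommender. Everything below is illustrative, not Artifact's actual model.]

    from collections import defaultdict

    topic_score = defaultdict(float)

    def record_engagement(topics, dwell_seconds, opened):
        # Opening an article is a stronger signal than merely pausing on
        # it, and dwell time is capped so one long read can't dominate.
        weight = (2.0 if opened else 0.5) + min(dwell_seconds, 120) / 60.0
        for topic in topics:
            topic_score[topic] += weight

    def rank(stories):
        # stories: list of (title, [topics]); highest accumulated interest first.
        return sorted(stories,
                      key=lambda s: sum(topic_score[t] for t in s[1]),
                      reverse=True)

    record_engagement(["android", "google"], dwell_seconds=90, opened=True)
    record_engagement(["celebrity"], dwell_seconds=5, opened=False)

    print(rank([("Pixel feature drop", ["android", "google"]),
                ("Red carpet recap", ["celebrity"])]))
    # -> the Pixel story ranks first: the signals favor android/google topics
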

2:48:17

Yeah. I don't know. I've I've always done

2:48:20

a lot of your job of of giving me

2:48:21

Oh, you put up, like, the wrong

2:48:23

link. Oh, did I? It's

2:48:25

These and

2:48:26

catalog identify and assess all of your

2:48:28

art and collectibles in one place. Well,

2:48:29

don't you have a lot of art? Oh, shoot.

2:48:31

I did. Oh, I linked to all one. Look

2:48:33

at all of that.

2:48:34

Yeah. You're right. You're right. Here.

2:48:36

Instead, artifact dot news, that's,

2:48:38

like, their

2:48:39

site. That's why I was confused. Like, this doesn't

2:48:41

look like a feeder.

2:48:42

Yeah. No. Sorry. I did

2:48:45

that. Last minute before coming in here. So

2:48:47

artifact dot news, if you wanna go

2:48:49

and check it out, there's no, like, waitlist or

2:48:51

anything you can just There was. There was.

2:48:54

Now they've opened it up. But worth

2:48:56

checking out, you know, I think for me, the jury

2:48:58

is still out as far as, you know,

2:49:02

Google News is so ingrained in my in my usage

2:49:05

that I just use that all the time. So

2:49:07

will I start to use artifact? I I don't know.

2:49:09

At this point, I don't know why. I would but

2:49:12

I'm gonna I'm gonna give it the shot.

2:49:14

So

2:49:15

Yeah. Great. Artifact dot news. It's

2:49:17

certainly worth paying attention to. Yeah. Especially

2:49:19

because of its

2:49:21

lineage. Jeff

2:49:23

Jarvis, a number

2:49:25

of the week. I'll do anything.

2:49:28

Make fun of myself for the purposes of the show.

2:49:30

I try to

2:49:33

put myself out

2:49:34

there. So I've been watching

2:49:36

With mixed views,

2:49:38

the teenage look filter

2:49:40

on TikTok?

2:49:43

Yeah. I did it. It didn't didn't make me look

2:49:45

younger. Did it make you a lot of the

2:49:46

same thing here? I I did

2:49:49

I said,

2:49:49

I really bombed. I made a TikTok with

2:49:51

it.

2:49:52

It's, hey, I can actually do so much here.

2:49:55

Exactly. I've been seeing

2:49:57

all these people on TikTok looking young,

2:50:00

And then Jacob, will I try this?

2:50:02

So that's mine. Let me see yours here.

2:50:05

This is Jeff looking younger. No.

2:50:12

Which one's the Yolked? Which which

2:50:14

was the Yolked. Yeah. Which one? Mine

2:50:17

was worse than

2:50:17

that, maybe the top one. Mine was worse

2:50:19

than that.

2:50:20

The top one, was so disappointed. So

2:50:23

disappointed. I let

2:50:25

me let me see if I can get it to to work here.

2:50:27

So in order to do this, you

2:50:30

you have to swipe into your

2:50:32

TikTok plus pictures. And

2:50:34

I'm gonna do a filter, and

2:50:37

I'm gonna do the the younger

2:50:38

filter. Where where where do they stick to that one?

2:50:40

Is it portrait?

2:50:43

III came to it by Googling originally

2:50:45

going Sunny, vitality, warmth, vlog,

2:50:48

dim, peach, pink, Youth.

2:50:52

Youth. Look how much younger

2:50:54

I look than I did. It's

2:50:57

the

2:50:57

ugh. It doesn't look any different.

2:50:59

Turn it on. Turn it

2:51:00

off. Turn it off. Turn

2:51:02

on. Turn off. It doesn't

2:51:04

change a thing. And

2:51:07

I haven't said a hundred percent. A hundred

2:51:09

percent. It's

2:51:13

just dark. Filters in a hand. It probably

2:51:15

looks like a dashing, you know, ice cream. makes

2:51:17

it a little bit younger. It makes me look the same age.

2:51:20

And so Jason was a yes.

2:51:21

Oh, I am. Right here. Here here. You

2:51:24

you be my guest. See if does anything.

2:51:27

What is it? Is it fed? Sedans.

2:51:29

Yep. That's off. Okay.

2:51:33

Is it even doing anything? I'm not

2:51:34

No. Maybe I'm not doing it right. Is

2:51:36

it under effects? I don't even know where to log. Yeah.

2:51:38

It's in the it's in the filters. Let

2:51:41

me see though, Jeff, it's also posted

2:51:44

his glamour

2:51:45

look. And that So that's that's a controversial

2:51:47

one right now. Is everybody's using the glamour one?

2:51:51

Yeah. It's kinda that that

2:51:53

one made you look younger.

2:51:55

You look beautiful. I just wanna kiss

2:51:58

your big ruggied handsome

2:51:59

face.

2:52:00

You know? Yeah. She's been straightened

2:52:02

out my beard. Manage look.

2:52:04

Oh, okay. So you weren't in the right

2:52:06

area. Yeah. So there we

2:52:08

go. Oh. Oh.

2:52:11

Yes. I I do look Well,

2:52:12

let me try. How did

2:52:13

you look young? Oh. Now you

2:52:15

got it. I

2:52:18

do little short. Oh,

2:52:20

yeah. It's freaky.

2:52:22

What is the filter called? Because I

2:52:24

don't See this. My

2:52:25

hair is darker. Uh-huh. little darker

2:52:27

on the top there. Yeah. Uh-huh.

2:52:30

Yeah. It's still

2:52:31

strange. I look like Grandpa Munster.

2:52:34

I don't know about that. I guess it

2:52:36

does.

2:52:38

Oh, boy. Thank you. Thank

2:52:40

you. I don't have

2:52:41

any filters in here.

2:52:42

I know I was looking at the wrong I was looking at

2:52:44

Youth, but you were supposed to be teenage. You know?

2:52:46

Well, there's I think I think you were yeah.

2:52:49

I think there's a difference between, like, when you're in

2:52:51

the general camera mode, there's, like,

2:52:53

basic filters. And then there's the filter

2:52:55

section, which are, like, all of the, like,

2:52:57

expanded, like, trending things

2:52:59

and stuff. Oh, that was a basic filter.

2:53:01

Yeah. Wasn't a good one. Right. Well, now

2:53:04

I want I want this all the time. Oh my

2:53:06

gosh. ladies and

2:53:08

gentlemen. It is now time for

2:53:10

Ant Pruitt. And his pick

2:53:13

of the

2:53:13

week. Oh my gosh. Let me put

2:53:15

this I take the top stuff down because I don't know

2:53:17

what I'm doing with it. I

2:53:19

pick Ameran and well,

2:53:22

actually, it's Aputure and their

2:53:24

other brand, Amaran. They announced

2:53:26

some new gear yesterday. In

2:53:29

particular, the affordable Amaran

2:53:32

COB, chip on board, S series

2:53:34

lights. I've spoken about

2:53:36

them before. But these are their

2:53:39

updated versions, and they're still quite

2:53:41

nicely priced. You can use these for photography.

2:53:44

You can use these for video. Some

2:53:46

of them are really portable. If you can just

2:53:48

attach a D-Tap battery to them.

2:53:51

They work with the Sidus Link mobile

2:53:53

app, so you can do some

2:53:55

effects and things like that or control the

2:53:59

the brightness and so forth. So

2:54:00

yeah. Check them out.

2:54:02

And I I would like to have this for a trip

2:54:04

because then Lisa could hold this. I

2:54:06

would use my my impossible six

2:54:11

and a selfie stick. And people would
