This Week in Tech 912: Let Me Consult My AI Lawyer

Released Monday, 30th January 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00

It's time for TWiT, This Week in Tech.

0:02

We have a great show. Tim Stevens is

0:05

here. Harry McCracken, Christina

0:07

Warren, three of my favorite people, and

0:09

of course, AI is the topic.

0:12

It's an amazing world we live in and some

0:14

of the new things that are happening with AI.

0:16

And some of the old things that maybe aren't

0:19

so good. We'll talk about the

0:22

Microsoft quarterly results.

0:24

That's so hot. Intel, the

0:27

worst quarter in a long time.

0:30

And about the Oscar campaign that

0:32

took Twitter by storm and worse.

0:35

It's all coming up

0:36

next. On TWiT,

0:39

podcasts you love from

0:41

people you trust. This

0:45

is TWiT. This

0:51

is TWiT. This Week in Tech,

0:54

episode nine hundred twelve recorded

0:56

Sunday, January twenty ninth twenty

0:58

twenty three. Let me consult

1:01

my AI lawyer. This

1:03

Week in Tech is brought to you by Worldwide

1:06

Technology. With an innovative culture,

1:09

thousands of IT engineers, application

1:11

developers, unmatched labs and

1:13

integration centers for testing and deploying

1:16

technology at scale. WWT

1:18

helps customers bridge the gap between

1:21

strategy and execution. To learn

1:23

more about WWT, visit WWT

1:26

dot com slash TWiT.

1:29

And by ACI Learning.

1:31

Tech is one industry where opportunities

1:34

outpace growth, especially in cybersecurity.

1:36

Many information security jobs require

1:39

a cybersecurity certification to

1:41

maintain your competitive edge across

1:43

audit, IT, and cybersecurity

1:46

readiness, visit go dot ACI

1:48

learning dot com slash twit.

1:51

And by Bitwarden, get the password

1:54

manager that offers a robust and cost

1:56

effective solution that can drastically

1:58

increase your chances of staying

2:00

safe online. Get started with

2:03

a free trial of a Teams or enterprise plan,

2:05

or get started for free across all devices

2:07

as an individual user. At bit

2:09

warden dot com slash

2:11

TWiT. Thanks for listening

2:14

to this show. As an ad-supported network,

2:16

We are always looking for new partners

2:19

with products and services that will benefit

2:21

our qualified

2:22

audience. Are you ready to grow your business?

2:25

Reach out to advertise at TWiT dot

2:27

tv and launch your campaign now.

2:35

It's time for TWiT, This Week in Tech, the show where we cover

2:37

the week's tech news. I'm

2:40

just gonna put a little black armband, if you don't

2:42

mind on

2:43

the San Francisco Niners gold

2:46

throwback jacket. That's

2:48

life. Tim Stevens is

2:49

here. Hello. Oh, that was Harry.

2:52

Hello, Tim. Good to see you.

2:54

Hey, Leo. Good to see you as

2:55

well. Thank you for having me. Tim, of

2:57

course, has been on for many years. He is now a freelancer

2:59

at Jalopnik, at TechCrunch, at Motor

3:01

Trend, The Verge. He has his very

3:04

own Substack. Timstevens

3:06

dot substack dot com. Great article on

3:09

your visit to the Dakar

3:12

rally in Saudi Arabia.

3:14

Wow.

3:15

Yeah. That was quite a really interesting

3:18

social experience on a lot of levels, and it

3:20

opened my eyes to a lot of things, but an amazing

3:22

event. And I've been doing a lot of great travel

3:24

lately, so I've been very lucky.

3:25

Nice. Well, it's great to have you back. No more

3:28

ice racing? Thanks,

3:28

Leo. Sadly not.

3:30

Are

3:30

you aware of that, Tim? Can't really

3:32

be heard at least. Right? You

3:34

can't hear him, not real just barely. Alright.

3:37

Hold on. We're not ready to begin yet.

3:42

Oh, you have a little he's

3:44

not guy. He doesn't have headphones on. So

3:47

we can't do a

3:48

bleed. I understand why we had the bleed.

3:49

That's what I did.

3:50

Harry, do you mind now? Headphones.

3:52

I'm happy with whatever works for me. We could provide

3:54

you with headphones. Sure. I apologize.

3:57

No problem. Do you have some

3:59

work? You can get them out of my office if you don't.

4:05

Give them some nice ones. The

4:07

good stuff. Give them good stuff.

4:09

Give them the good

4:10

ones. I think there's an unopened box in

4:12

my cupboard on the left

4:14

there. Give him some

4:17

sterile headphones. I'm

4:20

sorry, Harry. I --

4:21

Okay. -- wasn't paying attention. Yeah. Usually,

4:23

we use a bleed, but I think

4:24

I suddenly realized you might not be aware of that.

4:26

I can't remember. Yeah. That would be kind of

4:28

a disadvantage to the overall

4:30

program. Be

4:33

kind of a bad thing. Good.

4:40

Nobody should hire DeMeco Ryans as

4:42

a head coach. That

4:44

would be a terrible idea. It's

4:50

Lisa's birthday, and I really wanted her to have

4:52

a nice birthday.

4:56

Happy birthday, Lisa.

4:57

Happy birthday, Lisa. It's also our anniversary

4:59

because I foolishly

5:01

thought if we got married on

5:03

her

5:03

birthday, I would only have to give her one

5:05

gift.

5:07

You didn't think that. That wasn't why.

5:09

I just thought it'd be easier to remember. One

5:12

less one fewer date to

5:14

remember or

5:15

something, I don't know. Yeah.

5:18

That was fun. Unfortunately,

5:21

the

5:21

place we got married, Calistoga

5:23

Ranch, has burned to the ground in the wildfires. It's

5:26

gone.

5:27

Oh. So Which makes me sad. She says,

5:29

such is life, I think. At least for us. No.

5:32

Because we we used to go

5:33

there, you know, on our anniversary and stuff.

5:35

It was really nice. So

5:37

but I've just learned that we got a call

5:40

from the new Kona village, which is opening

5:42

this summer in Kona, Hawaii.

5:45

I've just learned that it's reopening, and

5:47

that's somewhere I've always that was on my

5:49

bucket list somewhere to stay. So that's

5:51

where Steve Jobs was staying when the

5:54

iPhone four antennagate

5:56

happened. Mhmm. He didn't wanna

5:58

come back, but they made

6:00

him come back. That's how good it

6:02

is, I guess. You

6:04

you know, there's no there were at the time, there were

6:06

no TVs, no phones, no Internet. It

6:08

was, like, you were staying in a

6:11

traditional Hawaiian holiday. Sounds

6:14

alright. How's

6:18

that? You can hear it and you get

6:20

the volume there so you can control it, so you

6:22

don't deafen

6:23

yourself. Can you hear me,

6:24

Harry? Harry, can you

6:26

hear

6:26

me? Harry,

6:28

can you hear me? I can hear you because you're

6:30

sitting next to me. Oh, It's

6:33

not a good test.

6:38

One, two, you can you hear me? Are

6:40

you receiving me? Should I tell you what I have for

6:42

breakfast tonight? Okay. Good. Alright.

6:43

Awesome. Alright. I think we're good.

6:46

Yay.

6:50

Alright. Here we go. Yeah. Someday we'll have

6:52

four in the studio again. That

6:55

has that has happened. Christina

6:57

was here, but I don't

6:59

think we've had it. When was the last time we had an all in

7:01

person show? It's been a it's been a while.

7:04

Alright. Starting over. You can

7:06

hear. I can hear it. I think.

7:10

It's time for TWiT, This Week in Tech, the show where we cover

7:12

the week's tech news. With

7:15

a panel of fabulous people,

7:17

I'll start over on my right with

7:19

mister Tim Stevens, whom we haven't

7:21

seen in a while, a freelance writer now.

7:23

You see Tim's stuff

7:25

all over the place, Jalopnik and TechCrunch,

7:28

and Motor Trend and The Verge. He also has

7:30

his very own substack

7:33

called Around the Next Bend.

7:35

Oh. Leave it myself.

7:38

I'll buy your loan some. That's

7:40

all. Thanks, Leo, it's great to be here. No. It's great to

7:42

see you. I missed you. And

7:45

lots of stuff to talk about. You you

7:47

just came back from the the car and

7:49

the Dakar road rally. And I loved

7:51

your pictures, but it was an interesting

7:53

mixed, I guess mixed bag

7:55

of

7:56

experiences. Yeah.

7:57

Yeah. Thanks. It was a great trip to Saudi Arabia.

7:59

Learned a lot of things, both good and bad. Yeah.

8:02

Also with us in studio, because

8:05

COVID is over no.

8:08

Harry McCracken, global

8:11

tech editor at Fast Company,

8:13

Fingers

8:13

crossed. Hello, Harry. Good

8:15

to see you. Nice to actually see you in

8:17

person. Yay. And you brought your wonderful

8:19

wife Marie with you. Great

8:20

to see you

8:21

all. She has custody of Lily.

8:24

The TWiT pet. She

8:25

will be taking Lily home with us. I'm sure.

8:28

Lily is about the best dog you ever

8:30

saw in your life. What a

8:31

dog. But she lives here. Well, I

8:33

shouldn't say that because I think it's in our lease.

8:36

She's not allowed to spend any time here,

8:38

but I didn't

8:40

say that. I don't think the

8:42

landlord watches TWiT. Also

8:46

great to see Christina Warren

8:48

from GitHub last time you were in

8:50

Studio. Senior developer advocate

8:52

over there at GitHub. Good to see you.

8:54

Glad to be here. You made the

8:57

move this week when Ivory

8:59

came out. Tapbots was one

9:01

of the third party apps

9:03

that mister Musk clobbered

9:05

at first without warning then

9:07

with a lie saying, you've

9:10

been violating the rules. For

9:12

fifteen years, you just noticed.

9:15

Finally, they said, oh, they retroactively changed

9:17

the rules, no third parties. But

9:20

that wasn't enough to push you to Mastodon. Ivory

9:22

was the thing that did it. Tapbots was

9:24

a very, very nice — or Tweetbot, rather. It

9:26

was a very nice Twitter client

9:28

from Tapbots. And Ivory is basically

9:31

Tweetbot for Mastodon.

9:35

Yeah. Yeah. And and honestly,

9:38

it was kind of a combination of things. It was

9:40

that that was, I think, really kind of

9:42

the final straw. Also, as a lot of

9:44

people have commented on. My

9:46

posts don't show up in people's feeds, and I

9:48

don't see replies, and I don't see other people's

9:50

posts. And so the whole experience was

9:52

becoming degraded. And then not

9:54

only did I have I

9:54

have, like, Ivory, which was great, but there's

9:57

Ice Cubes, which is a great open source client. I like

9:58

Ice Cubes a lot. There are a lot of them. Ice Cubes —

10:00

it's fantastic. There's

10:02

Elk dot zone, which is a great web

10:04

interface. There are I've actually

10:06

have a GitHub list I've been making of different

10:08

-- Yeah. -- cool people. This

10:09

is the beauty of open source and an

10:11

open standard.

10:12

You can

10:12

anybody can develop and they can't cut you

10:15

off. Right. Right.
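
What "open standard" means here can be shown in a few lines. A minimal sketch, assuming the public mastodon.social instance and the standard Mastodon REST API (the instance choice and the post limit are arbitrary): anyone can fetch the public timeline with a plain HTTP GET — no API key, and no gatekeeper who can revoke access the way Twitter cut off Tweetbot.

```python
# Minimal illustration of an open, federated API: reading a Mastodon
# instance's public timeline requires only a plain HTTP GET.
import requests

resp = requests.get(
    "https://mastodon.social/api/v1/timelines/public",
    params={"limit": 5},  # arbitrary small number for the demo
    timeout=10,
)
resp.raise_for_status()

for post in resp.json():
    # Each item is an ordinary JSON "status" object; content is HTML.
    author = post["account"]["acct"]
    print(f"@{author}: {post['content'][:80]}")
```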

10:17

And and so and I've already got

10:19

it, like, about, you know, there

10:21

will be some people who follow me on Mastodon who didn't

10:23

follow me on Twitter, but I've got ten

10:26

percent of, you know, the followers that I

10:28

had on Twitter now on Mastodon,

10:30

which is not

10:30

bad, you know, for for four or five days in.

10:33

So A lot of people are reporting increased

10:35

engagement even though there are fewer

10:37

followers.

10:38

Unvested. I'm

10:38

sure maybe just that. Yeah. Absolutely. Look at the

10:41

quality as

10:41

guys. I have two.

10:42

Yeah. I think No. That no.

10:44

That it will change over time. Right? Like, I

10:46

think that as more people join, you

10:48

will see less of that high signal-to-noise

10:50

ratio. But right now, I totally agree. Like, I'm

10:52

definitely seeing higher higher engagement higher

10:54

quality. It's pretty clear that Elon

10:56

has decided to heavily

10:59

algorithmicize the feed

11:01

on Twitter. He's he's even said you

11:03

pay eight bucks and more people

11:05

will see you. And I think he's

11:08

I don't know why he thinks

11:10

eight bucks from a

11:12

a few hundred million at best.

11:14

users is gonna make enough money to pay for

11:16

Twitter, plus the loss of ad

11:19

revenue, but he's doing whatever he can. And

11:21

that's the problem, though, is that it then

11:23

tells people, oh, you

11:26

know, nobody's Nobody's engaging with me. I

11:28

don't wanna be here. So you're driving

11:30

off your creators. Actually, Cory

11:32

Doctorow wrote a

11:34

good story about this this week.

11:38

It's an impolite title. So I'm gonna

11:40

say, TikTok's en-shirt-

11:43

ification, using

11:45

The Good Place's euphemism for

11:47

that word. It

11:50

I thought was quite insightful as

11:52

usual Corey made something that's, you know,

11:54

been around and obvious to all of us

11:56

crystal clear, put in words

11:58

that a light bulb goes off. He

12:00

says, here's how platforms die.

12:02

First, they're good to their users. Then

12:05

they abuse their users to make things better

12:07

for their business customers. Finally,

12:10

they abuse those business customers.

12:12

To claw back all the value for

12:14

themselves, then they die.

12:17

And he gives an example, Amazon, which was

12:19

customer first, customer first,

12:21

and then you know, as the as the

12:23

customer base got locked in with a variety

12:25

of techniques like Amazon Prime and

12:27

DRM and so forth, then they said,

12:29

alright. Now, businesses. Then came

12:31

the marketplace. Fifty percent of Amazon sales are

12:33

in the marketplace, third party sellers.

12:36

But they got locked in even though they

12:38

lose forty five percent of revenue

12:40

to Amazon in

12:40

fees. And now Amazon

12:44

says screw you, you're locked

12:45

in and they start monetizing

12:48

He says the company's

12:50

thirty one billion dollar, and he puts

12:52

it in quotes, advertising program is

12:54

really a payola scheme that pits

12:56

sellers against each other forcing them to bid on the

12:58

chance to be at the top of your

13:00

search. But what ultimately happens

13:02

is you've enshittified your

13:04

platform to the point where no one wants to use it

13:06

anymore. This is very

13:08

clearly where Elon is. Twitter at first

13:10

was all about the users. They couldn't figure out how

13:12

to monetize it. Then they got brands to

13:14

go there. In fact, that's one of the reasons all of

13:16

us were there. Right? You have

13:18

to be there to promote your

13:20

brand, to to build your audience. Then

13:22

once they got them locked in, now they

13:24

can say, hey, if you wanna reach that

13:27

audience which we own, it'll

13:29

be eight dollars, please. But

13:31

You do that at the risk of driving people like Christina

13:34

away. He's talking about the in

13:36

this article, particularly about TikTok.

13:38

Doing this, but it happens

13:41

to every one of these companies. His

13:43

position, which I really agree

13:45

with, is, you know,

13:47

that's this is the way it is, and

13:49

you just move. You go to the next

13:51

thing. You leave MySpace for Facebook, you leave

13:54

Facebook for somewhere else. But

13:56

what we need regulation for is to make

13:58

sure it's as friction free as

14:00

possible to move, to avoid the

14:02

lock in. You need

14:04

interoperability. You need to make

14:06

it easy to move somewhere

14:07

else. And then then

14:10

you can let the

14:11

market you could

14:11

let the market rule. He

14:13

says, as I said at the start of this essay, this is

14:16

towards the end. Enshittification exerts

14:18

a nearly irresistible gravity on

14:20

platform capitalism. You

14:23

know, the staff, the

14:25

executives, the shareholders,

14:28

eventually, they all say,

14:30

no, you've got to enshittify. We need the

14:32

money. But

14:35

even the most locked in user eventually reaches

14:37

a breaking point and walks away

14:40

or or gets pushed. Individual

14:45

product managers, executives,

14:47

activists, shareholders all give preference to quick

14:49

returns at the cost of sustainability

14:51

and they're in a race to see who can — I

14:54

love Cory — eat their seed corn

14:56

first. Enshittification

14:58

has only lasted for as long as it has because the Internet

15:01

has devolved into five giant websites.

15:03

Each filled with screenshots of the

15:05

other four.

15:07

Cory's getting a little

15:09

cranky in his old age. I

15:11

don't know. Enshittification

15:13

kills. Google just laid off twelve

15:15

thousand. And the company's in a

15:17

full blown panic over the rise of AI

15:19

chatbots. What

15:23

are you what are your thoughts, Tim? I

15:26

I definitely, you know, the the

15:28

pattern is very clear, and we've certainly seen it before. What's

15:30

missing though, I think, is the actual death of these platforms.

15:33

I I think, you know, Twitter is certainly struggling,

15:35

and I think a lot of us are are kind of

15:37

thinking that these days are numbered, but it's

15:39

still incredibly huge and incredibly popular

15:41

and, as much as we hate to say it, you

15:43

know, engagement numbers are up because everyone's

15:45

kinda watching the dumpster fire

15:47

smolder. So I think it's a little too early to say that

15:49

Amazon has died, that that Google has died,

15:51

that Twitter has died, and so I think

15:53

that's the piece of the pattern that's

15:55

that's missing in this case for

15:57

better or for

15:57

worse. We certainly there are, of course,

15:59

plenty of companies. And this is, by the

16:01

way, not just tech companies, any company with

16:03

any

16:03

consumers.

16:04

There are plenty of companies in that graveyard.

16:07

We're in the process of watching these companies

16:09

move in that direction. But you're right. I mean, it's it's

16:11

hard to imagine Google going

16:13

away. Facebook. Maybe it's

16:15

not so hard to.

16:16

Although Facebook's latest data

16:19

on engagement,

16:21

since they started pushing videos

16:23

from people I don't even follow into

16:25

my feed. Apparently, that's actually

16:27

working at least right now in terms of engagement

16:30

that the AI they're using. To put

16:32

videos in front of you. It actually does seem

16:34

to determine stuff that people will

16:36

watch. And so it's the

16:38

numbers are a little encouraging

16:39

lately, especially given how

16:41

you know,

16:43

a

16:43

little good news Facebook has had on any

16:46

front in the last couple of years. Yeah.

16:48

Mhmm.

16:50

Christina, does this

16:52

process end with

16:55

the end of

16:55

Twitter, or does Twitter just kind of

16:57

drag on? Well,

16:59

it can it can be both. Right? And we've seen

17:01

both because we've definitely seen social networks

17:04

just go under and

17:06

and just disappear, and that has

17:08

happened. And Google Plus is a great example of that

17:10

where obviously, you know, Google

17:12

put a lot of money and a lot of effort into that,

17:14

and it failed, and then they shut it down and got rid

17:16

of all the archives even, which I actually

17:18

thought that was not a great move to not

17:20

even keep the public archives

17:22

available, but that was like a

17:24

high profile failure. There have

17:26

been other ones. But then you also have

17:28

instances where they continue to

17:31

kind of stick around until

17:33

they're sold and and deleted and

17:35

whatnot, MySpace being a great example of that,

17:37

where, you know, that has now

17:39

had god only knows how many owners and

17:42

people trying to use that very

17:44

worthless at this point, email list.

17:46

Of of users. But myspace,

17:48

you know, was bigger than Facebook

17:50

up until about two thousand

17:52

nine, I wanna say. And

17:54

then you started seeing a really big migration of

17:56

people from MySpace to

17:59

Facebook. To the point that that MySpace just

18:01

kind of became a dead zone except

18:03

for a very specific niche of

18:05

people. And that wasn't

18:07

really unlike LiveJournal and

18:09

GeoCities and

18:11

Tumblr and some other things. It wasn't

18:13

really because of any policy changes

18:15

that MySpace made. It was just because

18:17

the masses were all on Facebook. And

18:20

you know, Twitter is interesting because as

18:23

Tim says, it's still this giant

18:25

place. I think that

18:27

what will potentially be pushing people

18:29

off of it is less the

18:31

alternatives and more when the

18:33

overall experience becomes

18:35

degraded, whether because more

18:37

toxicity is there or, you know, just because

18:39

you you can't you're having errors in your

18:41

feed, you know, you're not able to post things

18:43

the right way. You can't refresh as quickly.

18:45

You don't see all of your replies. You

18:48

know, that's the sort of thing that makes

18:50

people go, okay, why am I investing time

18:52

in this? And and, arguably, you could say that

18:54

the demise of Twitter started

18:56

probably, you know, two thousand sixteen.

18:58

Ironically, when its engagement was higher,

19:01

when you started to see lot of the the previous, like, high

19:03

profile users of Twitter leave

19:05

the platform for Instagram and

19:07

then later, you know, TikTok,

19:10

but you you stopped seeing the celebrities on Twitter.

19:13

And I I don't know.

19:15

I mean, is this one of those well,

19:17

what's the line? It

19:19

happens, you

19:19

know, slowly and then all at once. Yeah.

19:22

It's it's like the collapse of the

19:24

Gibbon said it

19:26

first. F. Scott Fitzgerald

19:28

said it about somebody going bankrupt, but I think it

19:30

was Gibbon who said the Roman Empire collapsed

19:33

slowly at first and then

19:34

suddenly. And then

19:35

it's been applied to a lot of things. You're

19:38

right. Although Instagram seems to be

19:40

quite suddenly collapsing in

19:43

on

19:43

itself. Or am I wrong? Totally. Well, again,

19:45

you're not. And I think Instagram

19:47

was one of those interesting ones where if

19:49

they had just stuck to their guns. Like,

19:51

when they copied Snapchat,

19:54

that was brilliant because they did stories

19:56

better than Snapchat did. They had a

19:59

bigger audience, and they added some features that made it better. So that

20:01

was like a perfect example of copy in the

20:03

right way. With TikTok, I think

20:05

they just have fundamentally misunderstood

20:07

their audience. They've misunderstood that

20:09

it's a completely different expectation.

20:11

And if they wanted to create a TikTok

20:13

competitor, they should have created an

20:15

app called Instagram Reels that I bet would have

20:17

been very popular. But, you

20:20

know, by by loading it down with

20:22

stuff from people you don't follow,

20:24

you're not even necessarily interested in,

20:27

an algorithm that is not as good as TikTok's, and

20:29

then you don't even see, you know, your friend's photos,

20:31

the whole reason why people are there to begin with. Yeah,

20:34

I I spend a lot less time on on

20:36

Instagram because I'm like, what's the point? I I

20:38

used to come come here for a specific

20:40

reason. Now, this isn't there. And

20:42

even

20:42

worse, it's a watered down version of this

20:44

other thing that already exists. But

20:46

we are creatures of habit, and

20:49

you're right, Tim, these things don't

20:51

die. But they don't exactly

20:53

thrive. There

20:54

will be something called Twitter ten years from now.

20:56

It's just not entirely clear whether anyone will go there at all.

20:58

Am

20:58

I think

20:59

I believe there's still a Friendster.

21:00

Uh-huh. By the way, I'm sorry.

21:03

The Sun Also Rises. It was Hemingway.

21:05

You win. I thought it

21:05

was You win

21:06

in Germany. I wish realized

21:09

wasn't it? It was like a Fitzgerald quote.

21:11

But

21:11

maybe he stole it from Gibbon. I

21:14

think Gibbon said it first, but I might be wrong on that

21:16

as well. How did you go bankrupt?

21:18

Gradually, then suddenly. Did

21:20

you collapse? Two ways,

21:23

gradually, then suddenly. This

21:27

actually runs into — there are tributaries

21:29

off of this into a lot of the stories that

21:31

we're talking about these

21:34

days. And I don't wanna do another

21:36

Elon Musk filled TWiT so

21:38

we're not. Everybody's going, oh, thank god.

21:40

I thought he was gonna start talking about Elon. But,

21:43

really, it's about companies

21:45

in general going through this this

21:47

business cycle. CNET,

21:49

your former employer, Tim Stevens,

21:51

has been accused of

21:54

some interesting shenanigans. We had

21:56

Connie Guglielmo on two weeks ago. Right when

21:58

this was breaking, remember the seventy five

22:01

stories an AI had written in their personal finance

22:04

section. She said, well, these are stories

22:06

no reporter wants to write.

22:08

You know, the kind of the basic boring

22:09

stories. We had the AI

22:11

write a

22:12

first draft and then an editor

22:14

look at it, correct

22:16

it, finish it, and then put it out.

22:19

But

22:19

now it's coming out that in fact there were far more

22:21

errors that were not caught, that

22:23

a lot of the content wasn't very good.

22:26

And that perhaps CNET has been using it

22:28

more than just those seventy five

22:30

articles. She said, yeah. What we've used

22:32

for years, as many publications do, programs

22:35

to put in stock prices. That, I

22:37

mean, that's not using AI to write

22:39

a story. There's a very different thing

22:41

there. I don't blame them for that. The Verge,

22:43

though, has been really kinda hammering

22:46

on CNET. I don't know. Maybe

22:48

they have a vested interest

22:50

in in in knocking

22:52

down a competitor. I I don't know.

22:54

But they're accusing CNET of

22:56

doing something a little bit more nefarious.

22:59

Remember, CNET was sold

23:01

to an equity capital

23:03

company called Red Ventures.

23:04

And, Tim, you have some probably direct

23:07

experience with

23:08

this? A bit. Yes, I do.

23:11

And what always happens with these

23:13

acquisitions is that the

23:15

equity capital companies raise

23:17

a lot of debt to

23:19

acquire these companies. So they're saddled with big debt

23:21

and when you look across the

23:23

corporate landscape these days heavily

23:26

encumbered companies own these companies. So a lot

23:28

of that. So there's pressure on them

23:30

from both their shareholders and their

23:33

and their lenders to kinda

23:34

monetize. So these companies very

23:37

often either

23:38

sell off pieces of

23:40

the company that they bought or

23:43

attempt to monetize it as Elon is

23:45

doing with Twitter. The verge

23:47

is accusing CNN

23:49

and Red Ventures of — and by the way,

23:51

Red Ventures also owns

23:53

a a number of sites like the Points

23:56

Guy, Bankrate, and Credit

23:58

Cards dot com, which

24:00

are sites that make their money through affiliate

24:02

credit card affiliate fees,

24:04

and The Verge is accusing them, in

24:07

effect, of turning CNET into

24:09

that kind of site: auto-

24:12

generated linkbait articles,

24:16

designed to rank highly in searches,

24:18

that they can then monetize with ads or

24:21

affiliate

24:21

fees. Bankrate and CreditCards dot com have

24:24

also published AI-written articles about credit

24:26

cards

24:26

with ads for credit cards nestled within.

24:29

It turns out the same guy responsible for

24:31

this at Bankrate and CreditCards dot com is responsible

24:33

for it at

24:35

CNET. Lance Davis,

24:37

Vice President of Content at Red

24:40

Ventures. And I think there's

24:42

an interesting accusation here that

24:44

Red Ventures is basically taking

24:46

this venerable highly

24:48

respected name in Technology

24:50

Journalism and turning it into an SEO

24:52

farm. Tim, I'll give you

24:55

the chance to either

24:57

recuse yourself or

24:59

to

25:00

give us your thoughts.

25:02

Yeah. No pressure. Obviously, I need to be able

25:04

to be careful with what I say here,

25:06

both because this is my former employer we're talking about and because

25:08

I have a lot of friends and a lot of people there I

25:10

respect. Well, that's really important. And I should say

25:12

that too. So many people, including

25:14

Connie, that I love and respect and

25:17

honor. I don't blame CNET

25:19

for this one. I think this comes from

25:21

Red Ventures.

25:22

Well, so my my take on

25:24

this is a little bit complicated. I

25:26

I do think that you know,

25:28

clearly, The Verge has

25:30

an interest in making CNET look bad; they're

25:32

competitors. That's fine. I don't

25:34

think that anything The Verge has reported thus far, from what

25:36

I've seen has been inaccurate. I I wanna

25:39

say one thing for sure. I

25:41

wasn't aware of any AI stuff that was going on

25:43

when I was there. I left CNET around August of

25:45

last year. There was kind of rumors

25:47

and talk and that kind of thing, but I wasn't aware

25:49

of anything going

25:49

on. So I have no insider knowledge about

25:51

how any of this came to pass. But I I will say that

25:54

CNET was using tools like

25:56

Wordsmith and others. And

25:58

those are tools that a lot of outlets

26:00

and publications use. And basically, what they do is they help

26:02

you optimize the content that you're running to make sure

26:04

that they include the right keywords, to

26:06

make sure that they are

26:09

you know, that they perform well in an algorithm

26:11

based environment. And that is really what

26:14

consumers are operating within right now. Anyone who goes

26:16

on the Internet and searches for a thing, is

26:18

asking an algorithm what things should I read.

26:20

And so it's only natural for

26:22

publications to wanna make sure that their content

26:24

performs as well as

26:26

possible. The thing is when you use a tool like that, it can begin

26:28

to feel like you are basically

26:30

reverse engineering

26:31

Google, you reverse engineering a

26:34

search engine. That's really what this game comes

26:36

down to. Isn't it?

26:36

Talking the AI, isn't

26:38

it? Right. Right.

26:39

And that's what we're talking about for Leo. At

26:42

some point, you know, what's the best tool to optimize

26:44

against an algorithm? It would be another

26:46

algorithm effectively. And so I I

26:48

think by extension, it's a natural thing that

26:50

CNET would do

26:51

this. I don't think anybody would be surprised at CNET doing

26:53

that. -- Well, CNET being one

26:54

of the -- -- when there's financial pressure

26:56

to turn around a big acquisition. Right?

26:58

Maybe so, I think that the timing is

27:01

a little bit irrelevant here. I mean, CNET has definitely

27:03

been on the cutting edge of a lot of different

27:05

publication types over the years. Whether

27:07

it be integrated affiliate

27:09

links, things like that. They've definitely been at

27:11

the bleeding edge. So there's no surprise that they would be

27:13

at the bleeding edge of adopting AI technology.

27:15

My concern really is that there wasn't

27:17

enough transparency involved here. I think that's what what

27:19

my problem is. If CNET had come out

27:21

and said, hey, we're experimenting with AI. This is kind

27:23

of fun and new. We don't really know what this is gonna be,

27:26

but here's where we're trying it out. This

27:28

is some content that was written by AI.

27:30

What do you think? I think that

27:32

this would have been I'm sure that they would have

27:34

gotten some blowback for sure. But

27:36

from what I could see, from my perspective in

27:38

reading through coverage on The Verge and

27:41

elsewhere, it just seemed like they

27:43

were hoping that nobody would notice. And

27:45

I feel like that's really the wrong way to go about

27:47

doing this. If you're going to be embracing

27:49

this kind of technology or investing in it, especially

27:51

when you're talking about giving people recommendations

27:54

where they should put their money in a mortgage,

27:56

I think it's important to be incredibly transparent.

27:58

And, you know, Connie's piece was

28:00

very transparent, but that came

28:03

out long after the story had kind of blown up, long after the

28:05

Verge's piece. And I think it's unfortunate that

28:07

CNET wasn't more upfront with

28:11

with what was going on. To be honest, there was certainly you know, we saw the

28:13

little disclaimers on Google and things like that, but

28:15

that was in my

28:16

opinion, it was not enough in that that's where

28:18

I'm disappointed in this whole thing.

28:20

Yeah. And

28:20

don't blame Connie at all or

28:23

even Lindsey Turrentine, who's also been a regular

28:25

on this show for many years. I think I

28:27

have huge respect for both of them.

28:30

Yeah. If anything, I feel like they might have been

28:32

sandbagged by this, and they

28:34

didn't know the full extent of what was going

28:36

on and ended up being,

28:38

you know, kind of

28:40

hung out to dry, so to

28:42

speak. The Verge quotes former senior

28:44

employees saying Red Ventures was using

28:46

automated technology for content

28:49

long before the AI by line began

28:51

cropping up in November. They

28:53

mentioned this word Smith tool, which you

28:55

talked about, Tim, nicknamed Mortgotron

28:58

or Mortgot — I don't know how you

29:00

pronounce that — Mortgotron internally, because it's

29:02

used in mortgage stories, They

29:05

said it'd been used for at least a year and a half,

29:07

but the siloed nature of

29:09

the teams across CNET and Red Ventures

29:11

makes it difficult for journalists at

29:13

the site to understand the chain of command who's

29:16

using what tools and when. So

29:18

I put no blame on our

29:20

friends at CNET. I'm

29:23

I'm very happy, frankly,

29:25

to blame Red

29:27

Ventures and any equity capital

29:29

company, because I feel like these guys are,

29:31

to some degree, the bane of our

29:33

existence.

29:34

But it's not just

29:37

VC firms that that are pushing

29:39

companies to use this content. A lot of editorial

29:41

properties use SEO optimization tools. If

29:43

if you wanna perform, if you wanna be in the

29:45

first page on a Google search, you have to be

29:47

using these tools I know a lot of automotive properties are

29:49

using

29:49

them. This is not,

29:50

you know, proprietary software. This is stuff that you

29:52

can go out and license, that anybody can

29:55

use. It'll tell you what keywords you need to inject into your content.
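
The licensable tools Tim is describing are proprietary, but the core keyword-coverage idea can be sketched in a few lines. A toy illustration only, not any vendor's actual product; real tools add search-volume data and competitor analysis, and the target phrases below are made up:

```python
# Toy version of the keyword-coverage check behind SEO writing tools:
# count how often each target phrase appears in a draft.
import re
from collections import Counter

TARGET_KEYWORDS = ["best credit card", "cash back", "annual fee"]  # hypothetical

def keyword_coverage(draft: str) -> Counter:
    """Case-insensitive count of each target phrase in the draft."""
    text = draft.lower()
    return Counter({kw: len(re.findall(re.escape(kw), text)) for kw in TARGET_KEYWORDS})

draft = """The best credit card for most people has no annual fee
and earns cash back on everyday purchases."""
for kw, n in keyword_coverage(draft).items():
    print(f"{kw!r}: {n} occurrence(s)")
```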

29:57

And, again, it it it does make you feel like

29:59

you're reverse engineering as you're writing, but

30:01

this is not proprietary

30:03

stuff. Yeah. Well, in a way,

30:05

that's scarier if it's if it's even more

30:07

widespread use that we don't know about. I

30:09

mean, I

30:09

think, there but for the grace of God goes everybody

30:11

in the media business, not us.

30:14

We haven't figured out a way

30:16

to do that yet with podcasts. Over the

30:18

course not

30:19

immediately, but over the course of time, I think

30:21

you will see AI play a

30:24

role a lot more particularly

30:26

as some of the

30:28

issues CNET ran into are less

30:30

of an issue. And also, I mean,

30:32

CNET probably made a lot of mistakes so

30:34

the rest of us don't have to and can

30:36

learn from them in terms of

30:38

disclosure. But I

30:40

feel like, well, we're not doing any

30:42

of this and have no plans to do this. And in

30:44

fact, might not really work well for

30:45

us. Anyhow, I I would not

30:48

say that

30:49

at Fast Company we'll never use

30:51

AI in any form because I I think things are gonna

30:53

happen quite quickly, and there might be ways to

30:55

use

30:55

that, which are actually

30:58

completely above board and reasonable and result

31:00

in better content rather than just

31:03

cheaper

31:03

content? Somebody

31:07

said I'm trying to find the article

31:09

that ChatGPT is

31:11

the absolute definition

31:13

of BS. Yes. Because

31:16

and, by the way, OpenAI, the

31:18

creators of ChatGPT, say

31:20

this. They said we never said it had to be

31:22

accurate. That's not in the training at all.

31:24

It

31:24

has no idea what it's saying and whether it's

31:27

correct or not. Sometimes it happens

31:29

to be accurate, but that's not by design. It's

31:30

an accident if it happens to be

31:33

accurate. Right? I

31:34

did a piece. I I have a new newsletter,

31:37

which I should plug at the end of the call.

31:38

Plug it now. It's called Plugged In,

31:41

and if you go to the Fast Company homepage, there should be a

31:43

newsletters link that will let you subscribe. Nice.

31:45

And I because I'm interested in

31:47

the history of cartoons, I asked Chat

31:49

GPT what the first TV cartoon was. And every

31:51

time I asked, it would give a different

31:54

answer. Many of them very convincing, and

31:56

none of them correct. There

31:58

basically, there's so many things where ChatGPT

32:01

has no idea what it's saying and unless you

32:03

already know what the answer

32:04

is, you you might well be fooled because

32:06

it is able to lie in such a convincing

32:09

fashion. But it's important to understand that that's

32:11

not its mandate to tell the truth or to be

32:12

accurate. It's not

32:14

a fact generator. It's a BS generator. It's really good

32:17

at

32:17

stringing words together. You

32:19

call it a glib

32:19

bot, which I think is a very

32:22

good term. From now on, I'm calling it a glibbot.

32:25

So in

32:28

a way then, it

32:31

makes you wonder, should we

32:33

be, you know, your company

32:35

GitHub — and you can disclaim

32:37

this. Again, I know you have nothing

32:39

to do with it, Christina. But GitHub is getting a little heat right now

32:41

from the open source community over its AI

32:44

code generator, copilot, which

32:47

is kind of impressive. Copilot also

32:49

uses, we should mention, the same open

32:51

AI technology as chat

32:52

GPT. It is using GPT.

32:55

Yeah. I was gonna say, I I can't comment

32:57

on any of the the lawsuits or any of that stuff,

32:59

but Copilot does use the

33:01

GPT three dot five,

33:03

you know, the language

33:06

model that ChatGPT is based on. It

33:08

uses something called Codex, which is specifically

33:10

focused on source code rather than

33:12

you know, the corpus that Chat

33:15

GPT uses, which is much more broad.

33:17

But if you use ChatGPT to

33:19

say, write, you know, a program that does

33:20

this, this, and this, most of

33:22

its data set is probably coming from there. Yeah.

33:24

Because ChatGPT can write code.

33:26

In fact, one of the stories we had on

33:28

Security Now is that script

33:31

kiddies are having Chat

33:34

GPT write effective

33:36

malware. Malware

33:40

that works

33:40

-- Mhmm. -- we

33:43

know somebody who used

33:45

ChatGPT to write a

33:47

PowerShell script for Steve on Security Now that

33:50

looked through your LastPass vault and

33:52

told you some of its attributes. And

33:55

it worked and it was a lot easier to

33:57

develop it that way. And I

33:59

guess Copilot is even better.

34:01

Now clearly, with co

34:03

pilot, unlike ChatGPT, there must be some rules

34:05

in there to

34:05

say, oh, and by the way, make sure this isn't made

34:08

up that it actually

34:10

works. Right?

34:11

For the most part, I mean, there are

34:14

suggestions that you can get that will not run,

34:16

so it is – that's why we call it

34:18

Copilot. It will not do it for

34:20

you. It's your copilot. It's, you know, autofill and suggestions, you know,

34:22

plus one. Right? So And

34:26

the more that you use it, the more it gets to know your code. It gets to

34:28

know kind of your style and

34:30

your intent, and it can give you better and

34:32

better suggestions for what you're doing.

34:35

But no, absolutely. The same way, you

34:37

know, you could get a wrong, you know,

34:39

suggestion, you could get, you know, or

34:41

or or a wrong, I guess, paragraph from

34:43

ChatGPT — you could get some incorrect code suggestions. For the most part,

34:45

though, I think that the the

34:47

training model there is

34:49

a little bit better because it is is, you

34:52

know, focused more on on one thing

34:54

rather than, you know, however great the

34:56

corpus is. For for

34:58

everything that ChatGPT is

35:00

doing. And as I said, it is learning based

35:02

on your own style and and the

35:04

stuff that is is in your

35:06

project folder. But no, I mean, this is why I

35:08

always tell people, look, Copilot is

35:10

amazing that it has saved me so much time,

35:12

especially with

35:14

boilerplate stuff. But if you think you can

35:16

just automate it to write a program for you,

35:18

you might get lucky if it's something really

35:20

simple, like a a PowerShell script or something

35:22

like that. But

35:24

you you really need to have

35:26

a better idea of what you're doing so

35:28

that you can actually see what code it's

35:30

suggesting, and then make

35:33

edits if that needs to be the case. But even if you still need to make

35:35

edits, I think there's still value there because it

35:37

can save you, you know, a

35:40

a lot of time of of having to,

35:42

you know, manually Google and

35:43

and, you know, command c, command v from from

35:46

Stack Overflow or wherever. Yeah. Well,

35:49

and that's it. Every programmer knows this, but maybe

35:51

a lot of civilians don't. That almost all

35:54

code is to some degree or

35:56

another copied and pasted from

35:58

somebody

35:58

else. That's

36:00

kinda how it works. So Copilot is a

36:02

natural way to do this. Copilot's quite impressive.
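
The idea underneath Copilot can be sketched, though Copilot's actual pipeline is not public, so treat this as an illustration, not GitHub's implementation. A rough sketch using the OpenAI Python SDK as it existed in early 2023 (openai before 1.0) and the Codex-era model name, both of which are assumptions that have since changed:

```python
# Sketch of the Copilot idea: send the code so far to a large language
# model and get back a suggested continuation, which a human then reviews.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "# Python 3\n"
    "# Return the SHA-256 hex digest of the file at `path`.\n"
    "import hashlib\n\n"
    "def file_sha256(path: str) -> str:\n"
)

resp = openai.Completion.create(
    model="code-davinci-002",     # Codex-era model name; an assumption
    prompt=prompt,
    max_tokens=120,
    temperature=0,                # deterministic, boilerplate-style completion
    stop=["\ndef ", "\nclass "],  # stop before a new top-level block begins
)

# As the panel notes, the suggestion may not even run: review before shipping.
print(prompt + resp["choices"][0]["text"])
```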

36:04

It's quite amazing. Here's the story

36:06

from earlier this month by Check Point

36:10

Research, a malware research company.

36:12

They call it OPWN

36:15

AI: cybercriminals starting to

36:17

use ChatGPT. In Check Point's

36:20

previous research blog,

36:22

we described how ChatGPT successfully

36:24

conducted a full infection flow.

36:27

from creating a convincing spear-phishing

36:30

email to running a reverse

36:32

shell capable of accepting commands

36:34

in

36:34

English. That's pretty scary.

36:38

This

36:38

is a case of

36:41

something called an infostealer, which

36:44

was created late last year with ChatGPT. A

36:47

cybercriminal showed how

36:49

he used ChatGPT to

36:51

write the code. Looks

36:53

like JavaScript. Anyhow,

36:56

it had

36:58

code to basically steal

37:00

files from an

37:03

FTP server. It's

37:05

it's kind of amazing

37:07

what they're doing. One of the things that

37:09

really becomes obvious is this is a conversation a

37:12

year ago we might not have had.

37:13

This has happened all of a sudden

37:15

out of

37:16

nowhere. And you can imagine we're not that far away from

37:18

these AIs being able to emulate — I

37:20

mean, they can already do very compelling voices.

37:24

So how far are they from being able to

37:26

emulate your voice, your mom's

37:28

voice, and make up a

37:29

call, and then say, hey, you

37:31

know, it's it's your mom, I forgot my

37:33

password. Can you tell me? Oh, yeah. I'm sure

37:35

that's already happening. That should be

37:37

doable

37:37

right now. Yeah. Yeah.

37:40

There is generative

37:42

AI music already. It's not — I

37:44

don't think it's quite there yet. This is a paper

37:47

from Google Research. They call it MusicLM.

37:49

It's based on a language model, like

37:51

LaMDA, generating music from

37:53

a text prompt.

37:56

Yep. Here is

37:58

the main soundtrack of — this

38:00

is the prompt — the main soundtrack of an arcade

38:02

game. It is fast paced and

38:05

upbeat. I didn't check my audio. Do

38:08

you do you I think we'll we'll try it.

38:10

Turn my audio on. I wanna play this

38:12

song. It's fast paced and upbeat with

38:14

a catchy electric guitar

38:16

riff. The music is repetitive and easy to

38:18

remember, with unexpected sounds like cymbal crashes

38:20

or drumrolls. Does this sound like

38:22

an arcade game

38:23

to you? Maybe the

38:26

front screen is batch floater.

38:28

Maybe Sonic is running

38:32

down

38:32

That's completely AI

38:36

generated.

38:37

Although, internally, they found with music generated by

38:40

the AI

38:42

that there's a

38:42

fair amount of plagiarizing, which is why -- Oh, yeah. --

38:44

Google is

38:45

not releasing

38:46

this. Yeah. It's all

38:47

about that. Here's

38:50

a slow tempo, bass and drums led reggae

38:53

song. Yeah,

38:58

man.

38:58

Everybody get together. We're going down to the beach. No.

39:00

No. Ant says no to that one.

39:05

Seems like it

39:06

has the potential to blow away the stock music industry

39:08

pretty quickly. But, yeah, it's a lot

39:10

better than the crap stock music we

39:13

have been using. I imagine as a user you'll be

39:15

able to generate

39:16

something unique to your own. Two hundred

39:18

and

39:18

eighty thousand hours of real music is

39:21

the training model. To

39:23

generate coherent songs for

39:26

descriptions of significant complexity

39:28

as the creators put it. You wanna

39:30

feel like you're lost in space? Ant

39:32

is becoming our

39:34

taste

39:35

tester. Let's see if Ant

39:37

agrees with this one.

39:41

Sounds like an AI

39:43

did it, doesn't it? Sounds

39:45

like robot music.

39:50

Now here's the

39:50

question. Can we get taken down from YouTube

39:53

for playing that?

39:55

Maybe get sued

39:57

by a bot. See,

39:58

that's No. That's well, that's

40:00

gonna be an interesting thing. I think actually

40:02

Who wants to be

40:03

able to generate these unique things? Right.

40:06

Well, that that's an interesting question. But also,

40:08

I think becomes a very interesting question, which you

40:10

know, I think that this

40:12

YouTube relies

40:14

on someone else being able to say, I

40:16

have copyright of this and

40:18

and usually have, like, a file registered

40:20

in place. Then, you know,

40:22

Content ID can go and

40:24

find the same thing. But if it's

40:26

a uniquely original

40:27

file, then Content ID is not gonna find it. So

40:30

that's... What a world we live in.

40:32

That's

40:32

cool. What a world.
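
Christina's Content ID point can be made concrete with a toy model: matching only works if the fingerprint of an upload already exists in the registry of claimed works. Real Content ID uses robust perceptual audio fingerprints that survive re-encoding; this sketch uses a plain SHA-256 over raw bytes purely to show the lookup logic, and the file contents and registry entries are hypothetical.

```python
# Toy Content ID: a genuinely novel, AI-generated file has no registry
# entry to match, so no copyright claim is raised.
import hashlib

def fingerprint(audio_bytes: bytes) -> str:
    """Stand-in for a perceptual fingerprint: hash of the raw bytes."""
    return hashlib.sha256(audio_bytes).hexdigest()

# Registry built from works that rights holders have registered.
registry = {fingerprint(b"famous-reggae-track-master"): "Famous Reggae Track"}

upload = b"brand-new-ai-generated-reggae"  # never registered by anyone
match = registry.get(fingerprint(upload))

print(match or "no match: nothing in the registry corresponds to this file")
```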

40:35

But do you think there might be some cool

40:37

stuff that might happen if actual

40:39

human musicians work with some of

40:41

these tools to brainstorm

40:43

and Yes.

40:44

And riff on ideas. And then it seems like that could be kinda cool.

40:46

Yeah. She was

40:47

having a real

40:47

time. No. Exactly. I mean, I honestly, I think

40:50

that the way that and I know that a lot of

40:52

creators are really freaking out about generative

40:54

art and and generative music and

40:56

and all this stuff. And and I understand the

40:58

fear. But for me, what excites me

41:01

about this is that the best AI

41:03

art that I've seen has been

41:06

from actual artists. Like, those are

41:08

the people who've been using the best prompts or have

41:10

been taken some of the prompts and have taken some

41:12

of the results and then made

41:14

really great things. And I think with music,

41:16

it's the exact same way. Right? Like, you might

41:18

be able to get something that sounds

41:20

slightly better than

41:22

stock music. But it's still not going to be great. Right? It's going

41:24

to take a real artist to

41:26

then take that and edit it and

41:28

interpolate it and do what real artists have

41:30

always done and turn it into something

41:32

else. And so what

41:34

I've been trying to tell people — because

41:36

this isn't going away. Whatever your

41:38

feelings on this stuff are, it's

41:41

not going away, and it's only going to become bigger.

41:43

We can have conversations about ethics and we

41:45

should. We can have conversations about safety

41:47

rails, and we should. This is not going

41:49

away. And so the conversation I've been having with people for the last

41:52

year or so is, like, embrace

41:54

this as a tool in

41:56

your arsenal to

41:58

make new unique and better things rather than looking at

42:00

this as some sort of existential threat because

42:03

you're not going to outpace

42:06

this. This is not going to be something that you can get away from,

42:08

but it might be something that if you

42:10

are able to use, you

42:13

could actually enhance, you know, the stuff

42:15

that you do. For writers as

42:16

well. Last week, Brianna Wu was on

42:19

the show, her husband writes science

42:21

fiction among many other

42:23

things. She said that Frank

42:24

was stuck with a story —

42:26

I think he was writing it for Analog — he was stuck

42:29

with a story. And he gave

42:31

a very extensive prompt to ChatGPT, which wrote

42:33

kind of a mediocre

42:35

story, but came up with a lot of things

42:37

that became a starting point for him

42:39

and unstuck him. And that seems

42:41

like that's a very good use of something like Chat

42:44

GPT. I've heard so many

42:46

descriptions I love

42:48

I love

42:49

I love your name for it. What is it?

42:51

Glib PT? Glibot.

42:53

Yeah. Glibbot. And be like, remember, I wrote

42:56

that? Yeah. That's good. Yeah.

42:58

That's good. I've also heard it called the ultimate man-

43:00

splainer, because

43:02

it's confidently wrong. Right?

43:04

And it says and it's

43:06

So and it's a little

43:07

patronizing. It's like, oh, no. Let me explain to you how the world works. Although if you tell

43:10

it that it's wrong,

43:10

then it gets really humble and

43:14

and apologizes.

43:15

Apologizes at

43:15

great length and says it'll never do it again. Does

43:18

it

43:18

correct itself? If you correct it, does it stay corrected?

43:20

Yes. In

43:20

fact, sometimes if it says something that's correct and

43:22

you tell it that it's wrong,

43:24

it will apologize for

43:25

that too. Stephen Wolfram wrote a very

43:27

good piece about how

43:30

confidently wrong chat

43:32

GPT is on things that Wolfram Alpha, his own

43:35

kind of AI, gets right. Is it an AI? I don't

43:37

know what you'd call Wolfram Alpha. A search engine

43:39

for knowledge or something.

43:42

But he said we should partner, because we

43:44

are good at getting the math right,

43:46

at which ChatGPT is terrible. And

43:50

then if we worked together, we'd maybe get something

43:52

out of it. He pointed out some

43:55

really hysterical examples. This is

43:57

his article from his blog at

43:59

stephen wolfram dot com. Some hysterical

44:02

examples of just ChatGPT

44:04

getting it terribly wrong. How far

44:06

is Chicago from Tokyo,

44:08

to which ChatGPT confidently

44:10

says the distance from

44:11

Chicago, Illinois to Tokyo, Japan is

44:14

approximately seventy six hundred

44:15

miles. That'd be twelve thousand two hundred kilometers. It's

44:17

a very long distance, blah blah blah.

44:19

Turns out it's not even

44:20

close. It's six thousand three hundred thirteen

44:24

miles. So you correct it. So you

44:26

you

44:26

you tell it and

44:27

it says thank you for correcting

44:29

me. You're correct. The distance,

44:31

of course,

44:32

is six thousand three hundred

44:35

thirteen miles. How far is

44:37

Chicago to Tokyo, and then it

44:39

gets it

44:40

right. At least in that continued

44:42

conversation. I think that's interesting. And

44:44

But kids, don't do your

44:47

math homework with ChatGPT.

44:50

Stick to Wolfram Alpha, because ChatGPT doesn't even

44:52

know three to the power of seventy

44:53

three, which is

44:56

pretty pathetic. By

44:56

the

44:56

way, not even close. It

44:59

said fourteen

45:00

billion. I can't even say how big

45:02

the number is. It's a lot larger.
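
Both of ChatGPT's flubs here are easy to check directly. A minimal sketch: the coordinates below are approximate city centers, the haversine formula gives the great-circle distance, and Python's arbitrary-precision integers compute three to the seventy-third power exactly (it is roughly 6.8 x 10^34, nowhere near "fourteen billion").

```python
# Fact-checking the two ChatGPT answers discussed above.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 = mean Earth radius in miles

# Chicago vs. Tokyo: roughly 6,300 miles, matching the corrected figure,
# not ChatGPT's "approximately seventy-six hundred miles".
print(round(haversine_miles(41.88, -87.63, 35.68, 139.69)))

print(3 ** 73)  # an exact 35-digit answer, not "fourteen billion"
```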

45:05

There's that story about ChatGPT

45:07

passing an MBA exam. Yeah. But but

45:09

the article also pointed out that

45:11

it wasn't capable of

45:12

doing, like, high school math, which

45:14

I found interesting because that's something so many

45:15

MBAs can't do. Right. We didn't realize

45:18

you could become an MBA without having high

45:20

school

45:21

math, but I

45:24

think it just passed a law school exam too, didn't it? This is now

45:26

the new thing, for professors to give

45:28

their exams to Chat

45:31

GPT.

45:32

There was a paper that a couple of people —

45:35

I think it was from the University of Chicago and someone

45:37

else — did with GPT taking the

45:39

bar, and they gave

45:42

it part of the multiple choice, parts of the bar exam, and

45:44

it did better than random selection,

45:46

and it came close to humans

45:49

in a couple of categories. It got a C plus. Passing,

45:52

but it's not like it aced it, right? It didn't quite

45:54

pass. But it is it did but it is impressive

45:56

because the interesting thing though was that

45:58

it did significantly better than random selection. Like, it

46:01

wasn't one of those things where, you know,

46:03

you're just randomly, you know, okay. How would you've

46:05

done if you were just randomly selecting

46:07

the answer? So it had some, you

46:09

know, better accuracy. And in some

46:12

categories, it was close to

46:14

humans. But obviously, this is only for the multiple

46:16

choice parts, and it did better in certain

46:18

areas than others. But, I

46:20

mean, to me, all this really says

46:22

is, okay, then you if your

46:25

big concern whether it's high school students or or,

46:27

you know, graduate students, you know,

46:29

and professionals taking tests, if your big

46:31

concern is the AI

46:34

cheating at the test, well, then you need to start changing how

46:36

you're testing. You're obviously not testing the right things.
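
What "significantly better than random selection" means can be made concrete with a binomial check. A sketch only: on four-option multiple choice, blind guessing averages twenty-five percent, and the question count and model score below are hypothetical stand-ins, since the show doesn't give the paper's exact figures; the binomial math is the point.

```python
# How unlikely a given multiple-choice score is under pure guessing.
from math import comb

def p_at_least(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more correct by guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_questions = 200    # hypothetical exam size
model_correct = 100  # hypothetical model score (50%)
guess_rate = 1 / 4   # four answer options

print(f"expected by guessing: {int(n_questions * guess_rate)} correct")
print(f"P(guessing >= {model_correct}): {p_at_least(model_correct, n_questions, guess_rate):.2e}")
```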

46:38

Like, that to me is the big takeaway. And

46:40

we shouldn't be freaked out that

46:42

you know, these these AIs are able to pass the test. It's more

46:44

like, okay, well, what's the goal of this? And

46:46

are we testing the right way? And I think in most

46:49

cases, the answer would be

46:50

no, we're not testing the

46:52

right way. Yeah, maybe that's the flaw of the

46:55

tests. Although, as you point out, ChatGPT doesn't

46:58

do math.

46:59

very well. It's good in constitutional law.

47:02

How long do we think

47:03

it'll be until ChatGPT is our

47:05

public defender, and you need to pay extra if

47:07

you want a human to defend

47:09

you in a

47:10

lawsuit? No. I don't think that's gonna

47:13

happen. The guy

47:13

That seems like a Black Mirror episode. So the guy

47:15

who was doing the robot lawyer, I think,

47:18

has just decided to run away with his tail

47:20

between his legs. Because, so,

47:22

DoNotPay, actually a very,

47:26

a really cool service which helped you get out of traffic tickets, created

47:28

an AI-powered robot

47:30

lawyer that was gonna go

47:32

into court

47:34

I don't know, you know, first of all, I think any judge

47:36

would throw it out immediately. It was gonna go

47:38

into court to help fight a traffic ticket.

47:41

State Bar prosecutors threatened

47:44

Joshua Browder, the CEO

47:46

of DoNotPay, with jail

47:49

time. And so

47:51

Joshua says, we're postponing

47:53

our court case. We're gonna stick

47:55

to consumer rights.

47:58

Wow. Okay. Oh, totally. Well, this is the whole

48:00

thing. Right? It's, like, could it? Maybe, but,

48:02

like, do you think that,

48:05

if any profession, can

48:07

you think of any class of profession who would

48:09

be less likely to allow

48:11

this in? And

48:13

therefore, no. Like, the

48:16

even if you could potentially automate

48:18

things and do things better than, like, your

48:20

typical public defender. Do you really think

48:22

that the, you

48:25

know, the bar association and the

48:27

various lobbying groups for lawyers, you really think that they would

48:29

allow this in? Absolutely not. They're

48:31

gonna protect their

48:34

own interests more than any other industry. They're gonna be the

48:36

ones who are

48:36

like, nope. Not happening. Yeah. That's a good

48:39

point. If you're gonna pick an

48:41

industry to disintermediate, do

48:44

podcasters. Don't do lawyers. You know, we're

48:46

pushovers. It'd be a lot easier

48:49

to go after us. Your

48:54

company, Microsoft, just

48:56

acknowledged that they're putting in, they

48:58

already put a billion dollars

48:59

in. They were one of the

49:01

founders of

49:02

OpenAI. And now they have an even better deal

49:04

with OpenAI. The rumor was an additional

49:06

ten billion dollars. I think that

49:08

was confirmed by Satya Nadella. Over

49:11

a period of time, obviously. And that

49:14

ChatGPT, or that kind of

49:16

technology, will be used in Microsoft Office.

49:19

But I think a number of people are saying

49:22

the real thing to watch

49:24

is

49:25

Bing. Mhmm. Thoughts

49:27

about that? I know. Yeah.

49:29

And you work at, you

49:31

work at GitHub. So, you know, it's

49:33

owned by Microsoft. Opinions are

49:35

my own. Look, I think this is exciting. I think that, you

49:37

know, there's also been reporting that Google's

49:39

been having kind of like a

49:42

crisis about how successful ChatGPT has been. And I

49:44

don't blame them because Google

49:48

has amassed and this is not

49:50

in any way to try to denigrate any other

49:52

company. But they have probably

49:54

amassed, like, the largest quantity of

49:56

AI talent

49:58

from academia and from industry of anyone.

50:00

And the thing

50:02

which was interesting to me about

50:04

that is that it wasn't really that demonstrably different from

50:06

any of the other GPT-3 things that

50:08

have been available. It was just the interface that

50:11

I think made it so accessible. It has

50:14

become this very mainstream

50:16

thing where, you know, I've been thinking and

50:18

I've been talking about, you know,

50:20

OpenAI stuff for

50:22

several years. But now this is a mainstream thing because the interface was right.

50:24

And, yeah, I definitely

50:26

think that search is a great area where

50:29

it could be helpful. People have

50:31

created extensions to add, you know, ChatGPT things

50:34

alongside Google results, and it's better. And

50:36

I think that, you know,

50:38

Google results... Google

50:40

is the primary search engine that I use. And the

50:42

results have gotten worse over time, and

50:45

and I don't think that

50:48

it's because of the SEO stuff. I think it

50:50

is because Google has

50:52

optimized for different sorts of results,

50:54

and they've, you know, wanted to

50:56

highlight other things. And so I

50:58

often end up piping Reddit into my

51:00

search -- Yeah. -- because I find that I get much

51:02

better results

51:03

Mhmm. Yes. From Reddit than I do.

51:05

Because

51:05

what you're really doing is

51:07

asking for information from real experts

51:09

about a

51:11

topic. Right? Right. You know, I just wanna actually

51:13

get the conversation, like, the info where it's actually

51:14

going to be. But searching Reddit dot com is

51:17

a mess. So searching Google for a query

51:19

and then adding Reddit to it is the alternative.
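(The trick, for anyone who hasn't tried it: either append the word reddit to the query, or use Google's site: operator to restrict results to one domain. The query itself here is just a made-up example:)

    best mechanical keyboard reddit
    best mechanical keyboard site:reddit.com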

51:22

But people have created, you know, like,

51:24

kind of side-by-side extensions

51:26

to add, you know, ChatGPT stuff

51:29

to Google things. And I think that, yeah,

51:31

this is an opportunity for Bing. I think

51:33

it's an opportunity for a lot

51:35

of consumer products. Obviously, one

51:37

of the big wins here is for

51:39

Azure, for, you know, other businesses

51:41

who want to take advantage of those models and build

51:43

them into their products, having kind

51:46

of AI as a service. I think, you

51:48

know, look, this is going to be hot,

51:50

everybody's going to be in it. This is

51:52

going to become an arms race. Right?

51:54

Even more than it already has been.

51:56

But for whatever reason, you know, OpenAI has

51:59

been the first to really commercialize

52:02

this in a way that the mainstream understands. And

52:06

it's exciting. I mean, personally as

52:10

a technologist, to me, all the other kinds of fears we

52:12

would have around it aside, I look at this as

52:14

a moment of, this is exciting.

52:16

Like, to me, this is much more

52:18

exciting versus the

52:19

metaverse. Like, this is much more exciting --

52:21

Great. -- and it's much more tangible as

52:23

to what the next big phase of computing

52:25

is going to be. Forget about the metaverse. This

52:27

stuff is, I think, really

52:30

what's exciting. OpenAI has less to

52:32

lose than

52:32

a Google or a Microsoft. Well, that's

52:34

why

52:35

OpenAI was created, really. Right? Yeah. These are

52:37

enormous companies with an enormous customer base

52:39

-- So -- and secrets and

52:41

reputations and paying

52:44

customers, and OpenAI

52:46

not

52:46

having any of that

52:47

stuff. Why not throw it out into the public and

52:50

see what happens? Although, Yann

52:52

LeCun, who is the

52:54

genius AI researcher at

52:56

Meta, at

52:58

Facebook, said that,

53:00

oh, ChatGPT isn't particularly

53:03

innovative. We've been doing that for

53:04

years. No.

53:05

I think if you're an AI scientist, you

53:07

know about transformers. Yeah. Yeah. Which were

53:09

from Google, basically,

53:09

like, transformer technology. Right. This is

53:12

this is what LaMDA

53:13

did. Yes. They did. Yeah.

53:15

And they have done some cool stuff with it

53:17

too. But

53:17

it also feels a little

53:18

bit, like, sour

53:19

grapes. Right? It's like, oh, no. We did it. I

53:22

was

53:23

just gonna say. Sure you have, but you didn't productize it. You

53:25

didn't tell anybody. No. Like

53:26

Right. You didn't productize it. Like, I don't

53:28

think that anybody would make the argument. I

53:31

don't think Sam Altman or anyone

53:33

from OpenAI would be like, oh, this is the most innovative thing and no

53:35

one else has done this. I think what

53:37

they would say is,

53:39

this is the first time that the public has actually been able

53:41

to interact with it in a way that had a

53:43

really good user interface. That's what

53:44

I thought. It was

53:45

a really good user interface. That's what

53:47

it looked like he said. He said ChatGPT is, quote, well put together.

53:49

He said that compared to other companies in

53:51

the field, OpenAI is

53:53

not particularly

53:55

advanced. Google, Meta, and,

53:57

he said half a dozen other startups have

53:59

equivalent technologies. But

54:04

That's

54:04

the difference. They were doing this in public and letting the

54:06

public use

54:07

it. Although it makes me

54:09

wonder, is there something better under

54:11

the hood somewhere else? Well, GPT

54:13

four apparently is an enormous advance over

54:15

three point five. But Sam Altman, CEO

54:17

of OpenAI, says don't get your hopes

54:19

up. It's not AGI. Right?

54:22

It's not the general --

54:23

Right. -- general intelligence. Meta actually did

54:25

put out an AI chatbot a

54:27

few months ago. And they

54:30

immediately got flak for

54:31

it. Did it get racist instantly? Being

54:33

racist and anti-Semitic and so forth. So

54:35

so they

54:37

tried to be bold, but they weren't quite as bold

54:39

as ChatGPT, so they didn't get as much credit, and they

54:41

got a lot more flak about it. I

54:43

think partially because Meta, more than other

54:46

companies, is gonna get flak no matter what it does, which

54:48

is not true of OpenAI, at least

54:49

yet. I've been using a search

54:52

engine that was founded about five years

54:54

ago by former Google search executives called

54:56

Neeva, N-E-E-V-A. Are you

54:58

familiar with this? I wrote a big story on it. Oh,

55:00

that's how I learned about it. Yeah. The

55:02

CEO is

55:04

former

55:04

top guy at

55:05

YouTube. Yeah. And

55:07

he got a little

55:08

bit depressed about the

55:12

monetization of search

55:13

The enshittification of Google. So he

55:14

went off to do a search

55:16

engine with a paid model. And, yeah, that's the

55:18

premise. We don't run ads. In

55:22

fact, even when Google started, Larry Page famously wrote,

55:24

a search engine can't have

55:26

advertising or it will then become

55:28

beholden to the advertisers. They

55:32

only held that off for a few years

55:34

before getting involved

55:36

in advertising. So I pay five

55:38

bucks a month for Neeva.

55:40

I get a lot of stuff. They actually give

55:42

you a free 1Password account and other stuff, but I

55:44

think it's really good. And also,

55:48

because, you know, they're in this arms race.

55:50

They added an AI generator

55:52

at the beginning of search results.

55:54

So I searched for ChatGPT

55:57

and Bing just now. And this is the result I got from the

55:59

AI. I think AIs are very good

56:01

at synopsizing and summarizing other

56:04

content. They even do

56:06

footnotes to say where this information

56:08

comes from: CNET, the Guardian

56:10

Observer, and The Verge. Microsoft is

56:12

reportedly integrating

56:14

AI technology such as ChatGPT into its Bing search

56:16

engine, which could potentially revolutionize

56:18

search as we know it. This technology is capable of

56:20

generating a wide variety of text in

56:22

human

56:23

like ways in response to written prompts. Microsoft hopes to

56:25

launch this feature before the end of

56:28

March. In a bid to make

56:29

Bing more competitive with Google. I think

56:31

Google should be scared. Not

56:34

just by Bing but by Neeva. I think this is pretty cool. Neeva,

56:37

I've been using Neeva

56:39

full time instead of Google everywhere,

56:41

including on my

56:44

iPhone, for about a month now. The only

56:46

negative, the only hit on it is

56:48

it's amazing how quickly Google comes

56:50

back with a result. With Neeva,

56:52

there's a palpable second or two. But other than

56:55

that, the results are excellent. I love

56:57

this AI thing and

56:59

there's no

57:00

ads. It doesn't favor

57:02

Google content over anybody else's

57:06

content.

57:06

I believe there's

57:08

a dash of Bing in Neeva's technology

57:10

along with some of its own technology

57:12

as well, which is interesting. I believe they've licensed some data.

57:13

Okay. Do they have

57:15

their

57:15

own crawler? Right? Yes.

57:18

I think

57:18

they kind of mashed together some of their

57:21

own stuff and some stuff they've licensed. And

57:23

I don't know what this thing is, but

57:25

it's pretty cool. There's a little slider here

57:27

at the top. It says currently showing

57:29

top news from all sources. Currently

57:32

showing top news from all

57:34

sources. I don't know. There's something going

57:36

on there. I can move around that

57:38

slider. I think it's very innovative. I'm, you know,

57:40

I have no relationship with them. In fact,

57:43

I meant to ask you about this because I did read your

57:46

article about it. You talked to them. You think they're

57:48

pretty compelling. They're smart

57:50

folks.

57:50

They've added a lot of stuff since they

57:52

launched. It's great to say we're gonna go against

57:54

Google. Yeah. I mean,

57:55

they're a tiny company. But maybe,

57:57

you

57:57

know, now. We've

57:59

all been used to getting our search free for the

58:01

last twenty years. But I think if

58:03

there is a time when we're at

58:05

an inflection point, where the idea of

58:07

going up against Google no longer sounds

58:10

quite so insane, it's now. Although,

58:12

of course, Microsoft is probably in

58:14

the best place to take advantage of this inflection

58:16

point, in that it's already a large company with

58:19

a large search

58:19

engine. Although

58:20

I am curious how

58:23

they could shortly

58:24

roll out ChatGPT as part of Bing, just because of this issue

58:27

with accuracy. Yeah. I

58:29

think the way Neeva does it

58:31

with the footnotes is the

58:33

only way you could do it. Right?

58:35

Because and this is the difference

58:37

between that and the knowledge graph in Google,

58:39

which is almost always from Wikipedia and is

58:41

never sourced. At least Neeva

58:44

says, you know, where this stuff came from.

58:46

I've found it actually quite useful. I'll ask

58:48

it, you know,

58:50

kind of technical questions, like coding questions, like, you

58:52

know, describe Dijkstra's algorithm. And it

58:54

does a really good job. It's

58:56

quite good. This is exactly what a chatbot should

59:00

be good at.
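(For reference, here's the kind of answer you'd hope for from such a question: a minimal sketch of Dijkstra's algorithm in Python, over an adjacency-list graph with the non-negative edge weights the algorithm requires. This is an illustration, not whatever Neeva actually returned:)

    import heapq

    def dijkstra(graph, source):
        # graph: {node: [(neighbor, weight), ...]}, all weights non-negative
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist.get(node, float("inf")):
                continue  # stale queue entry; a shorter path was already found
            for neighbor, weight in graph.get(node, ()):
                nd = d + weight
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd
                    heapq.heappush(heap, (nd, neighbor))
        return dist  # shortest distance from source to every reachable node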

59:04

And nevertheless, to beat Google at its own game, maybe,

59:06

maybe not. I

59:07

don't know. Maybe now is

59:09

the time to do it. This is

59:11

your article from last

59:14

June, or

59:15

June before last? Oh, yeah. Twenty

59:17

twenty one was right when they were first launching. Yeah. I hope they do

59:19

well. I feel like it's

59:22

a very interesting bet.

59:25

And I love not having the ads in there. And I

59:27

just hope they continue to be kind

59:30

of agnostic, you know,

59:32

not picking sides. I don't want

59:34

them to be a Bing licensee or a Duck

59:36

Duck Go

59:36

licensee. They also, by the way, when you

59:38

install it, they install a

59:41

a

59:41

tracker. Well, an anti-tracker

59:43

tool plugin in your browser, which

59:45

shows you what trackers

59:48

are on Fast Company.

59:50

There you go. Not bad.

59:52

There are far worse. Let

59:54

me tell you, somewhere there's thirty or

59:56

forty trackers on a single page

59:58

It's kind of

59:59

amazing. We were trying to make our pages leaner and

1:00:00

leaner just because that makes them

1:00:03

run faster -- That's gonna load

1:00:04

fast.

1:00:04

-- results in happier users? Yeah.

1:00:08

Alright. Wanna take a little break. There is a lot more to

1:00:10

talk about. We've got a great panel. I couldn't

1:00:12

have a better panel for this conversation. Tim

1:00:16

Stevens is with us, now freelancing, doing great.

1:00:18

He's driving his way home on

1:00:20

a substack at Tim Stevens dot substack,

1:00:24

dot com. He is also on Mastodon

1:00:26

on Mastodon dot

1:00:28

social, but still does a little bit,

1:00:31

a little tiny bit of Twitter

1:00:34

in there too. Thank you, Tim, for being here. We appreciate

1:00:36

it. Harry, at Technologizer,

1:00:39

global tech editor at

1:00:41

Fast Company. We started putting people's Mastodon handles up

1:00:43

on the

1:00:43

screen. There we go. I think

1:00:46

that's

1:00:46

great. But

1:00:46

you can't add anything more because it's, like,

1:00:48

two seems to be the maximum.

1:00:51

Yeah.

1:00:51

Well, we got your Twitter and your Mastodon. It's

1:00:53

not my Post though. There you go. Are you on Post

1:00:55

as well? I have an account, but I haven't really been

1:00:57

using that. Yeah. I know. See, to

1:00:59

me, going to Post is like not learning the lesson of Twitter. It's like,

1:01:01

oh, good. Let Marc Andreessen run

1:01:03

everything. Right?

1:01:04

No. I think it's better to

1:01:06

be. I love the idea that we

1:01:08

can go somewhere that is not owned by

1:01:09

somebody. Right? I really

1:01:11

like that. And if you suddenly

1:01:13

are on TWiT dot social and you hate the

1:01:15

way I'm running it, you

1:01:17

go somewhere else, you know?

1:01:19

That's easy. Also,

1:01:22

a new Mastodon user is

1:01:24

more than welcome. Christina

1:01:26

Warren, film girl. And are you using

1:01:28

film girl? You are at mastodon dot

1:01:30

social? I

1:01:30

am. Yeah. I'm

1:01:32

at film underscore girl. I might wind up

1:01:34

changing to another instance at some point, because mastodon dot social... You're on the big

1:01:36

one. It's so big that

1:01:37

there can be, yeah. I've

1:01:40

had the account since two thousand

1:01:42

eighteen. I don't know. I

1:01:45

had it just to have it,

1:01:47

but it's

1:01:49

pretty easy. It's easy to move your followers. It's

1:01:51

hard to move your toots. You can

1:01:54

do

1:01:54

it, but most of the time, I think the stuff

1:01:56

that you have tooted or tweeted, the old

1:01:58

stuff that, you know,

1:02:00

that's water under the

1:02:01

bridge. Start fresh, but you at least can bring your

1:02:04

followers with you. It's very easy to do that

1:02:06

on Mastodon.

1:02:07

I like

1:02:10

our TWiT dot social

1:02:12

because you have to be a TWiT listener to

1:02:14

be in there. So it is a community.

1:02:16

Harry's on SFBA, which is for San Francisco Bay Area -- Mhmm.

1:02:19

-- people. The local timeline

1:02:21

really gets a

1:02:23

point of view. If

1:02:25

you choose wisely. When you're on somewhere like Mastodon

1:02:28

dot social, it's just like a mini Twitter,

1:02:30

basically. It's everybody who

1:02:32

didn't look farther than

1:02:34

the biggest instance. And

1:02:36

it's also pretty big now. It's well over a

1:02:38

hundred thousand people. So that could be fun.

1:02:40

Or the people who signed up in two thousand

1:02:42

eighteen. Yeah. And they didn't have a lot of these other things. Nice. There

1:02:44

was no TWiT dot social

1:02:45

back then. Awesome. Yeah. I was

1:02:48

also on a

1:02:50

smaller one, xoxo

1:02:53

dot zone, for my favorite, my very

1:02:55

favorite

1:02:55

conference. Yeah.

1:02:56

And I did

1:02:57

migrate. Like, I

1:03:00

never used it. So I did go ahead and migrate the followers that I had amassed

1:03:02

there over, and there was

1:03:04

some overlap, I'm sure, but I did

1:03:06

migrate those followers over to

1:03:09

the main account that I'm on. And

1:03:11

that was actually

1:03:14

seamless. I was worried about what that process was gonna

1:03:16

be

1:03:16

like, but it wasn't difficult. So that,

1:03:19

that's good news. I've done the

1:03:20

same thing. I was on mastodon dot

1:03:23

social way back when, when it was

1:03:25

the only Mastodon instance. And

1:03:27

when I started my own, I migrated over to

1:03:29

TWiT dot social. I also have something

1:03:31

on Pixel

1:03:32

Fed, which

1:03:33

is a fediverse thing, not Mastodon, but kind of an Instagram

1:03:35

clone. And I really like it. I'm on pixelfed

1:03:38

dot social. I

1:03:40

really like

1:03:41

it because it's Instagram like it used to

1:03:43

be with just a bunch

1:03:45

of

1:03:45

photos. Golly, whoever thought of that?

1:03:50

No reels. No dancing

1:03:52

chipmunks. What kind

1:03:54

of place is that? So

1:03:57

and one of the nice things

1:03:59

about Ivory and these other clients is

1:04:01

you can

1:04:04

actually have multiple accounts

1:04:06

in your client. So you

1:04:08

can have your photos on pixel

1:04:10

fed and your toots somewhere else.

1:04:12

Our show today brought

1:04:14

to you by our good friends

1:04:18

at Worldwide Technology. Worldwide Technology is at

1:04:20

the forefront of innovation. We love these guys. We

1:04:22

had, it was actually the last trip, at least, that

1:04:24

I took before COVID. We went out there in March

1:04:27

twenty twenty to visit their

1:04:29

Advanced Technology Center. Wow. Is that

1:04:31

cool? You guys probably remember in the

1:04:33

old days Ziff-Davis had

1:04:35

that big testing lab in

1:04:37

Foster City. And that's why PC Magazine could

1:04:39

test a hundred printers because they had the

1:04:41

capability of doing that.

1:04:43

You remember that? Well, that's what

1:04:45

the ATC is all about for enterprise technology. That's Worldwide

1:04:48

Technology's business:

1:04:50

enterprise technology.

1:04:52

They created the Advanced Technology Center

1:04:55

to try to research all of this

1:04:57

great technology out there for enterprise,

1:04:59

It now has all

1:05:02

the technologies from all the leading OEMs, including some of

1:05:05

the big new disruptors, more than

1:05:07

half a billion dollars in

1:05:10

equipment. It started as one rack in one

1:05:11

building. It's spread to four or five buildings now, many,

1:05:13

many racks. But here's the

1:05:14

great part. And

1:05:15

I really honor WWT

1:05:17

for this: they don't keep it to themselves. Sure, their engineers use it to spin

1:05:20

up proofs of concept and learn

1:05:22

about new

1:05:22

technologies and

1:05:22

so forth to help their clients, but they

1:05:25

also make it available to you.

1:05:27

The Advanced

1:05:28

Technology Center. You don't have to go

1:05:30

to St. Louis. You can use it anywhere in the world. They

1:05:33

offer hundreds of on-demand and

1:05:35

schedulable labs featuring solutions that include

1:05:38

technologies representing all the latest, the

1:05:40

newest advances in cloud

1:05:42

and security, networking,

1:05:44

primary and secondary storage,

1:05:46

data analytics and AI, dev

1:05:48

ops, and on and on and on.

1:05:52

It is

1:05:52

it is not just for those great WWT engineers and

1:05:54

partners, it's for anybody. It's free to anybody

1:05:56

who wants to use the ATC

1:05:59

platform, which

1:06:00

means your evaluation time can go from months to weeks. Your

1:06:03

knowledge level

1:06:03

can go through the roof. You could test out products

1:06:06

and solutions before you go

1:06:08

to market. But it's more

1:06:10

than just the labs. You can access

1:06:12

technical articles, expert insights,

1:06:14

demonstration videos, white papers, all

1:06:17

the tools you need to stay up on the

1:06:19

latest enterprise technology. They also have a great

1:06:21

community. In fact, when you go to

1:06:23

the ATC platform, check out WWT's events and communities,

1:06:26

learn about technology trends, hear about the latest

1:06:28

research and insights

1:06:30

from experts. Not only is the

1:06:32

ATC that physical lab space in

1:06:34

Saint Louis, and if you get a chance to see it, do,

1:06:36

it was the most amazing thing, but it's

1:06:38

completely virtual, so you can use it on the ATC

1:06:40

platform anytime, anywhere in the

1:06:42

world. Three hundred sixty five days a

1:06:44

year. Whatever your

1:06:45

business needs, this is the point.

1:06:48

WWT is the best partner for

1:06:50

anybody using enterprise technology.

1:06:52

Worldwide Technology can deliver scalable,

1:06:54

tried and tested, tailored

1:06:57

solutions. Because WWT understands

1:07:00

that in business, technology

1:07:02

is not for technology's sake. It's

1:07:04

there to support your business strategy. WWT

1:07:08

brings strategy and execution together

1:07:10

to make this exciting new world happen.

1:07:12

To learn more about WWT, the Advanced

1:07:14

Technology Center, and to get access to

1:07:16

all these free resources, it's very easy:

1:07:19

go to WWT dot

1:07:21

com slash

1:07:21

twit, WWT dot com

1:07:24

slash twit. Create a

1:07:26

free account on the ATC platform and

1:07:27

learn and explore and

1:07:30

grow and use these technologies the way they're

1:07:32

intended to. WWT dot

1:07:34

com slash twit. These guys are

1:07:36

the good guys. These are the guys you need

1:07:38

as a partner. WWT

1:07:40

dot com slash twit. Let's

1:07:45

see. Oh,

1:07:46

I do wanna do a

1:07:49

quick plug for our survey. I think

1:07:51

it's the last chance to take the survey. Yeah. We have only two

1:07:53

days left. TWiT dot tv slash survey twenty three.

1:07:55

We survey our audience once

1:07:58

a year. We don't wanna spy on

1:08:00

you. We can't put trackers in a

1:08:02

podcast. It's RSS. But

1:08:04

we'd like to know more about you. Our advertisers

1:08:06

would like to know who those ads are going

1:08:08

to. We can't compete with people like

1:08:10

Spotify, who spy on your every move and know

1:08:12

who you are and all that stuff. It's

1:08:15

a survey. It's our only tool, but it helps us a lot. So

1:08:17

it should only take a few minutes. It's completely optional. Of course, answer any

1:08:19

questions you want. TWiT dot tv slash

1:08:24

survey twenty three. I

1:08:26

wanna get people from every show participating though, so we know

1:08:28

about, you know, what we're doing

1:08:30

and whether it fits your needs.

1:08:33

TWiT dot tv

1:08:35

slash survey twenty three. Last chance,

1:08:37

don't put it

1:08:38

off, and we thank you in advance.

1:08:41

Some really interesting

1:08:43

news from the

1:08:43

Department of

1:08:48

Justice. There was a

1:08:50

ransomware gang called Hive. Ransomware has become a plague, obviously. It's

1:08:52

really a problem.

1:08:55

Although I saw that the

1:08:57

revenues, and they know this because they can look at Bitcoin

1:08:59

transfers, were significantly down

1:09:02

in twenty twenty two. And

1:09:05

the thinking is because people aren't paying. It's not that ransomware is not hitting

1:09:07

you. It's just people have said,

1:09:08

screw that. We're not

1:09:11

giving you any money. Maybe

1:09:14

they've got better strategies for mitigating

1:09:16

the ransomware attacks. But also,

1:09:18

the DOJ is going after

1:09:20

them. This was a press

1:09:22

conference from Deputy Attorney General Lisa Monaco. It

1:09:25

turns out

1:09:26

I think this is fascinating.

1:09:30

that the US had infiltrated,

1:09:32

the FBI had infiltrated the

1:09:35

Hive ransomware

1:09:35

group last

1:09:38

July. And as a result, and

1:09:41

maybe this is why ransomware is going

1:09:43

down too. Under the,

1:09:46

you know, under the covers, that's not quite right. Officers

1:09:49

were able to warn victims

1:09:51

of impending attacks in

1:09:54

secret, saying, hey, watch out. They're

1:09:56

going after you. They

1:09:58

also got decryption keys, and

1:10:00

they were able to hand out

1:10:02

more than three hundred decryption keys to

1:10:04

people who had been hit by

1:10:07

the Hive ransomware, saving them more than a hundred thirty million dollars. The

1:10:11

US estimates Hive and its affiliates, it's

1:10:14

one of those ransomware-as-a-service

1:10:16

companies. I

1:10:19

don't wanna use the word, but that's kind of what it is, collected over

1:10:21

a hundred million dollars from more than fifteen hundred

1:10:24

victims. They went after, and

1:10:26

this was their mistake, hospitals, school

1:10:28

districts, critical infrastructure.

1:10:30

In more than eighty countries around the world, one hospital was left unable to accept new patients because

1:10:36

of

1:10:36

Hive. They worked

1:10:38

with the UK's National Crime Agency

1:10:40

and other law enforcement agencies around

1:10:43

the world to help victims. And

1:10:46

In the UK, fifty organizations were given decryption keys. And on Thursday,

1:10:49

the FBI shut

1:10:52

it down.

1:10:54

They took Hive's website and communications networks

1:10:56

down with the help of police

1:10:58

forces in Germany

1:10:59

and the Netherlands.

1:11:01

It is a successful

1:11:04

attack on

1:11:05

the attackers. I don't

1:11:08

know if they

1:11:10

arrested anybody. I don't see that.

1:11:13

And that's the

1:11:16

problem because as

1:11:18

the head of intelligence at Mandiant, John Hultquist, said, until you arrest them, they're not

1:11:23

gonna be gone. It's like

1:11:25

whack-a-mole all over again. They'll pop back up.

1:11:28

If

1:11:32

you went to the Hive crew's

1:11:34

website, you would see this notice from the FBI. This hidden

1:11:37

site has

1:11:40

been seized, with lots of

1:11:43

badges. Now, Hive

1:11:45

was not the biggest of

1:11:47

the ransomware gangs. There are

1:11:49

bigger ones. Although REvil, which was perhaps the biggest in twenty twenty and

1:11:51

twenty twenty

1:11:52

one, did get

1:11:55

arrested around the world. So

1:11:58

this is good. DarkSide was taken down in June of twenty twenty one. This

1:12:03

is good. This is what

1:12:06

it takes. Let's see what else. Intel. You wanna talk

1:12:12

about Intel? Not a

1:12:14

good quarter

1:12:14

for Intel: the worst beating

1:12:15

in over a decade, Andrew

1:12:18

Orr writes at AppleInsider.

1:12:22

They may be a little happier

1:12:24

than they ought to be about this.

1:12:26

A thirty two percent drop in revenue

1:12:30

year over year since the

1:12:32

holiday quarter of last year twenty twenty

1:12:34

one, actually. Fourth quarter results coming out

1:12:38

revenue fourteen billion dollars, down thirty percent

1:12:41

year over year, entire

1:12:43

year revenue down twenty percent

1:12:45

year over year. This goes along with

1:12:47

drops of thirty, forty percent in the

1:12:49

PC sales as well. So it's just

1:12:52

been a bad year for

1:12:54

PCs. Does that mean anything, Harry? Well,

1:12:56

I think Intel has known and acknowledged for a

1:12:58

while now that it's in this rebuilding process after

1:13:01

falling way behind

1:13:03

other chip companies. And

1:13:06

that it was not gonna result

1:13:08

in fantastic numbers immediately because they have to get

1:13:10

back to where their process is competitive again.

1:13:15

with other technologies. And I believe they've

1:13:17

said that maybe by year

1:13:19

after next, they think they'll

1:13:21

be in a place where their process

1:13:24

is great again, which is maybe as

1:13:26

long as they give Pat Gelsinger, their

1:13:28

CEO, time

1:13:31

to get there. Maybe that's when we can really judge

1:13:33

them. And if the numbers are still

1:13:35

this bad, then it's a really

1:13:37

bad sign. But I think

1:13:39

that at least as of when Gelsinger

1:13:42

started, and I wrote a feature about him last year, the board had given him quite

1:13:44

a bit of runway on

1:13:46

the understanding that these things can be difficult.

1:13:49

And there would be more bad news before

1:13:51

there was any good news. Although, they may not have anticipated the degree to which business would

1:13:55

be so crummy. And I

1:13:57

I think people and companies may just be deferring purchases,

1:14:00

because everybody is so

1:14:02

cautious about the economy this year.

1:14:06

Yeah.

1:14:07

Everything's down. It's not just PCs. Plus,

1:14:09

we bought a lot of PCs

1:14:11

during COVID. Right? People have

1:14:13

relatively new nice computers now in

1:14:15

a way they didn't before the

1:14:16

pandemic. Right. And if you look at, like,

1:14:18

the increase in chips,

1:14:20

you know, between, like, twenty

1:14:23

twenty and now. Not

1:14:25

to say that, like, some of

1:14:27

the gains haven't been impressive, but if you're not an enthusiast, you're not actually going to really notice, I think,

1:14:29

for a lot of

1:14:32

people. And you

1:14:34

know, it's looking more and more like what was happening, you know, all that excess buying in

1:14:36

twenty twenty and even a

1:14:38

little bit into twenty twenty one.

1:14:43

was a combination of both the

1:14:45

supply chain, you know, maybe

1:14:47

even making more of

1:14:49

a frenzy because people couldn't

1:14:51

get things, and people having to work from home. It was an anomaly. And I think

1:14:53

then, as a lot of businesses did, trying

1:14:55

to treat it as, like, well,

1:14:57

this is the new baseline, was clearly

1:15:00

a mistake. Because that's, you

1:15:02

know, has not continued. And I think that,

1:15:05

with the,

1:15:08

I guess, being able to

1:15:10

kind of look back, we can say,

1:15:12

no, why would we have expected those trends to continue

1:15:14

year over year? Because that's just not consumer buying patterns.

1:15:19

In the last decade or

1:15:19

so, you know, we haven't seen that. So yeah.

1:15:22

We have been saying

1:15:23

for a

1:15:23

while the

1:15:26

end of

1:15:26

desktop computing. But what do you think, Tim?

1:15:29

Is the end of

1:15:31

desktop computing exaggerated?

1:15:33

I definitely think it is. I mean, I think we've got a

1:15:36

long time to go before that. And

1:15:38

certainly people's usage patterns show that they're shifting

1:15:41

away from desktop computing, if you look at

1:15:43

overall utilization, you know, what devices people are consuming content on and creating

1:15:45

content on. But if you look

1:15:47

at overall time, I think

1:15:49

that number is going up

1:15:51

and desktop usage is probably staying pretty much static

1:15:53

for the past few years. So, yeah, I think we still have a long way to go there. But if

1:15:55

you also look at the number of layoffs

1:15:58

we've seen lately, I mean, that's a

1:16:00

lot fewer corporate

1:16:02

laptops that are being needed. And certainly with nobody hiring, that means that there are fewer laptops being needed there too. And

1:16:04

if you do

1:16:07

get hired

1:16:08

now, I think there's probably

1:16:10

a pretty

1:16:11

good chance you're getting a hand-me-down. So

1:16:13

I think that's Joe's laptop, but

1:16:14

we fired him last night. Yeah. And -- Alright, Peter. -- yeah. Sorry,

1:16:16

Joe. Got

1:16:19

it. Twelve thousand layoffs at Google. I

1:16:21

mean, it's just been tough. We

1:16:23

had, on

1:16:24

Wednesday, on TWiG. We had

1:16:27

just, you know, a

1:16:28

representative, because I think, you

1:16:30

know, we talk about these layoffs in, I

1:16:33

think, the tech industry, since the beginning of the

1:16:35

year, two hundred

1:16:36

thousand jobs lost. We talk about that in

1:16:38

these just kind of abstract numbers. I wanted to bring a face to

1:16:40

it. So we had

1:16:43

Richard

1:16:43

Hayon. He was a Google engineer. He's been

1:16:46

an engineer

1:16:46

for seventeen years at Google,

1:16:47

and was one of the people just summarily dismissed, kind

1:16:52

of

1:16:52

abruptly lost his job without any warning. His boss didn't

1:16:54

even know ahead of time. And I wanted just to kinda bring home the face of it because

1:16:56

these are, that's

1:16:59

two hundred thousand people with

1:17:01

families, with bills, with mortgages, with rent, and

1:17:03

they don't know what

1:17:05

tomorrow is gonna bring. That's

1:17:08

a huge hit.

1:17:11

And I don't want to just, you know, diminish it in

1:17:13

any way by just talking in raw

1:17:15

numbers, you know. Yeah. It

1:17:17

It really is a shame how that has to happen

1:17:19

these days. Like, the corporate-ification of layoffs

1:17:21

is really tragic and

1:17:24

nauseating. Honestly, you know,

1:17:26

having recently been through that myself. How

1:17:28

depersonalized it is. It's mandated that

1:17:30

you cannot have any empathy,

1:17:32

you cannot talk to

1:17:35

anybody about the situation. You

1:17:37

are very restricted in what you can say, when you

1:17:39

can say it. You know, as someone

1:17:41

who has tried to be an

1:17:43

empathetic leader, someone who, you

1:17:46

know, treated his employees like his

1:17:48

friends. To have to go through that is

1:17:50

really, really difficult. That's the situation. So, yeah.

1:17:53

I

1:17:53

I don't know where this pattern came from or

1:17:56

why it is almost

1:17:58

legislated into corporate law

1:18:00

these

1:18:00

days, but it is really disgusting that

1:18:02

that is where we've gotten to

1:18:04

right now

1:18:05

where your ability to be an empathic leader has to end at

1:18:07

the time when it's most important for

1:18:09

you to be an

1:18:12

empathic leader.

1:18:12

Yes. I

1:18:13

worry that Elon Musk set the bar so low for

1:18:15

doing them, being empathetic or just decent to

1:18:18

the people who worked for you

1:18:20

that if

1:18:22

these large companies beat Elon, they figure

1:18:24

that it's okay. But, I mean, there

1:18:26

were stories about Google employees who came

1:18:29

into work and waved their badge to get in,

1:18:31

and it either turned green and they were able to go in, or turned red and they knew they had been laid

1:18:33

off and that's how they got

1:18:35

the

1:18:35

news. And I don't

1:18:38

understand what the excuse is

1:18:40

for that. Yeah. Yeah.

1:18:40

That's what I mean.

1:18:41

Yeah. There's no good

1:18:42

way to do layoffs, is the

1:18:47

reality, but there are ways

1:18:49

that you can do it worse.

1:18:51

Right? And I

1:18:53

agree, like, for all the excuses

1:18:55

that this has been a thing. I think I've noticed,

1:18:58

because I first was seeing this in media, where

1:19:00

people would find out sometimes

1:19:02

that they were laid off by

1:19:05

losing access to Slack, and then people

1:19:07

would disappear. And it would be, like, you know, like, the snap. And you

1:19:09

were, like, what happened? You

1:19:11

know, it brought back PTSD

1:19:14

one day when people lost access to Slack for a completely

1:19:17

unrelated reason and everybody freaked out. They're

1:19:19

like, what

1:19:21

does this mean? It just

1:19:23

expired. Yeah. And, you know, you do this

1:19:25

for the automation reason. So we don't want people to have access to things. It's

1:19:28

like,

1:19:29

okay. Especially for

1:19:32

people who you're paying a certain amount of money and

1:19:34

who have worked for

1:19:34

you for a certain amount of time. It's like, have some freaking humanity. You

1:19:37

know, there's a way

1:19:39

to do it. There's a way to

1:19:42

take access away that doesn't mean that someone is entering the office

1:19:44

at seven AM,

1:19:47

hasn't checked their email, their

1:19:49

personal email, doesn't know what's going

1:19:51

on, waves a badge, and finds out that way. Like, that

1:19:55

just... it's awful. And there are

1:19:58

better ways to do it. There's no good way to do it, period, but

1:20:00

there are ways to do

1:20:02

it that are worse than others.

1:20:07

Somebody in the chatroom just told me that

1:20:09

Chris DiBona, who was one of the

1:20:11

founders of FLOSS Weekly, a great

1:20:14

friend, is also an ex-Googler. He

1:20:16

was a director of open source

1:20:19

at

1:20:19

Google. I did not realize he

1:20:21

had lost his job as well.

1:20:23

So a loss for them. And that's

1:20:25

a massive concern for open source because of all

1:20:28

the work and money and resources

1:20:30

that Google has given open source

1:20:32

projects. Over

1:20:34

the years, sponsoring conferences and other things. That's been a discussion that has come up

1:20:36

in the last couple of weeks with

1:20:38

these big layoffs. What does that mean for

1:20:43

open source? I don't think that it's a wrong

1:20:46

one because budgets are tight everywhere. And

1:20:48

these are things

1:20:50

that, you know, some people in the open source movement don't like to acknowledge,

1:20:52

but a lot of the money and

1:20:54

a lot of the funding

1:20:58

really does come from these corporations. If those

1:21:00

checks go away, like, what

1:21:01

does that mean? Because this

1:21:04

sustainability in open source has been a

1:21:06

really big topic for the last number

1:21:10

of years. And corporate goodwill, or

1:21:12

corporations paying their own way for

1:21:14

services to support it, is one thing.

1:21:18

But the goodwill aspect, which has

1:21:20

been increasingly a thing that we've

1:21:22

seen happen, like, I can

1:21:25

see that going away potentially at some places.

1:21:28

And that's really discouraging and

1:21:30

I think it could have really

1:21:32

negative consequences

1:21:35

because people haven't always wanted to maybe

1:21:37

acknowledge how much of a

1:21:39

role those checks and

1:21:41

that funding can really

1:21:43

play for a lot of small projects. Microsoft is also laying off

1:21:45

about ten thousand workers. And this

1:21:47

is the thing, you know, you go

1:21:50

I'm sure you do this too, Christina. You

1:21:52

go, you look, and you just check

1:21:54

and see, oh, gosh. And I'm sure there's corporate, you know, Slacks and stuff that

1:21:56

you can go to and see who's

1:21:58

there. Apparently there's a Twitter thread with

1:22:01

the salute icons as

1:22:03

people -- Yeah. -- dropped off

1:22:05

the face of the

1:22:07

earth. Microsoft's quarter... Go

1:22:09

ahead. I was gonna say, I think what made it

1:22:11

hard at Microsoft, and I'm sure for Google too, is, you know, a

1:22:13

lot of people are working

1:22:16

from home. And so,

1:22:18

yeah, there were a lot of, you know, kind of shadow groups of people. Yeah. Well, you know, you don't know,

1:22:20

but people check in with

1:22:22

one another. I mean, that's

1:22:25

what I was doing. I was checking in with my

1:22:27

friends at Microsoft. I'm in a few group chats,

1:22:29

and that's what I was doing, and then, you know,

1:22:31

checking Twitter. And seeing, you

1:22:33

know, some people were laid off and whatnot. And when you're talking

1:22:35

about numbers this big, it's not about

1:22:38

performance. It really

1:22:40

is you know,

1:22:42

decisions made usually about entire divisions -- Just slashed. -- so mysterious. Yeah. One of the

1:22:47

stories, and again, I'm

1:22:50

not gonna put you on the spot. You don't represent Microsoft by any stretch of the imagination. But one of the things

1:22:52

that we did learn

1:22:54

is that Microsoft's

1:22:55

VR, AR, HoloLens

1:22:58

division suffered massive cuts.

1:23:01

And that sounds

1:23:03

to me like more

1:23:05

of a strategic decision

1:23:07

to not

1:23:08

pursue those

1:23:08

areas, and instead of

1:23:09

making hardware, to make their software available to

1:23:12

companies like HTC

1:23:15

and Meta, that are gonna make

1:23:17

the hardware, and Apple, one imagines, that are gonna make the hardware, and then, you

1:23:20

know, make the productivity software

1:23:22

for that hardware, which actually probably

1:23:26

is a better bet than

1:23:28

putting all your chips

1:23:30

in on legless, sexless people

1:23:32

wandering around in low poly

1:23:35

count, to

1:23:38

take Cory's prose. Yeah. Microsoft

1:23:40

also gave up on

1:23:42

AltspaceVR, which was a startup

1:23:44

they acquired a few

1:23:46

years ago. So that

1:23:47

was a platform. I

1:23:50

mean, I would be

1:23:52

cautious about assuming that Microsoft doesn't

1:23:54

have any AR slash VR slash metaverse platform

1:23:58

ambitions forever. But if they do,

1:24:00

maybe this seems like a little bit of a reset.

1:24:02

And

1:24:02

-- Sure. -- it seems perfectly sensible

1:24:05

at this point to redeploy some of

1:24:07

that mental bandwidth and resources

1:24:09

into AI, which so clearly is

1:24:11

gonna have so much impact starting

1:24:13

at this very moment as opposed to the

1:24:15

metaverse, which is still, like, maybe at some point and maybe not to

1:24:17

the degree we expected

1:24:19

kind of thing. Is

1:24:22

it risky though to chase the flavor of the

1:24:23

month? Because, I mean, that's why they did VR. True. Although, I mean,

1:24:26

I

1:24:26

I don't know. For

1:24:29

all the reasons to be cautious

1:24:32

about AI, I think even if it's only five percent

1:24:34

as impactful as people expect, it's gonna be incredibly

1:24:36

important. There

1:24:39

were a lot of reasons to think VR and AR were not going

1:24:41

anywhere from day one. I mean,

1:24:43

that eleven percent of

1:24:45

the people who used it were nauseated

1:24:47

is a pretty good indicator that this may

1:24:49

not be the mass appeal

1:24:50

product you hope it will be. There's some...

1:24:53

if you want to have magical glasses

1:24:55

up like these, that have great battery life

1:24:57

and fantastic

1:24:58

Well, that's absolutely the plan. Right? Spectacles. Well, they gave up on that too. There's just some fundamental

1:25:00

pieces of technology we have no

1:25:02

idea how to build so far. Right.

1:25:06

We

1:25:06

know how to do AI.

1:25:07

It's the battery life. Yeah. Exactly. Chemistry

1:25:09

moves at a glacial pace compared to

1:25:11

digital stuff. That

1:25:13

was one of the stories from the week

1:25:16

Mark Gurman saying Apple is gonna push off its, you know,

1:25:18

spectacle-based AR vision for at least a couple of years,

1:25:22

to twenty twenty five, if not

1:25:24

later, because they can't get

1:25:26

it working. Even their headset,

1:25:29

which, the rumors

1:25:31

are still strong, they're gonna offer for three thousand

1:25:33

dollars this year, has a battery in your pocket

1:25:35

because it's too heavy

1:25:37

to wear on your

1:25:40

head. Right. So, yeah, I think there are

1:25:42

some fundamental technical issues. The problem is we all

1:25:44

read the same

1:25:47

science fiction

1:25:48

stories by William Gibson and Neal

1:25:49

Stephenson. I don't want this. We all wanna jack into the metaverse. You all want this.

1:25:52

But

1:25:52

Well, it's

1:25:53

not sci-fi. I mean, I

1:25:55

mean, the battery

1:25:58

No. The battery thing is one of the biggest ones. I

1:26:00

mean, I've been a proponent

1:26:02

of going nuclear for a decade

1:26:04

at

1:26:04

least. Do you want a little nuclear

1:26:07

power plant

1:26:07

in your head? I mean,

1:26:08

honestly, I would trust it more than lithium-ion. Okay. If

1:26:10

you look at the safety record, I honestly would.

1:26:14

Do we have that technology? I know we have pocket nuclear reactors for

1:26:16

power, but pocket means, I don't think... they're the

1:26:18

size of this room

1:26:19

Yeah. I mean,

1:26:20

I don't

1:26:21

know if we do or not. And my

1:26:23

point is more like, I wish

1:26:25

that we had been investing more over the last decades in looking at that as a power source

1:26:28

than in some of these other

1:26:30

things, because I do think

1:26:32

that

1:26:34

in my mind, that's the only way you can get

1:26:37

the long-lasting battery life and

1:26:39

the miniaturization that you'll

1:26:41

need for these things. But I just

1:26:43

don't think it's gonna be possible with lithium

1:26:45

polymers. I just don't. Okay.

1:26:47

Sure.

1:26:47

Physics and

1:26:50

chemistry. I

1:26:51

understand. But I'm just not

1:26:51

sure people are anxious to

1:26:53

wear the nuclear power

1:26:55

plant

1:26:55

hat. You're

1:26:58

not wrong, but again, I mean, maybe it needs a rebranding. I'm just saying,

1:27:00

like, the brand... Don't call

1:27:02

it the nuclear mask.

1:27:03

Okay. That's a good thing. Right.

1:27:05

Don't call it the

1:27:07

nuclear mask. I'm saying, like, it's

1:27:09

a branding thing. But, like, I think that the

1:27:11

technology, like, that's obviously one of

1:27:13

the only solutions that I can think of

1:27:15

that we already

1:27:16

have. Because solar

1:27:18

is certainly not going to be enough. Listeners

1:27:20

in Australia are probably

1:27:22

aware of the fact that

1:27:28

a tiny cesium one

1:27:30

thirty seven capsule went missing

1:27:32

on its way to

1:27:35

Perth this past week. Just one. Let me see

1:27:37

if I can find a picture of it because it's

1:27:39

so small. They're warning

1:27:41

the public not to

1:27:44

touch

1:27:44

it. But it's so small. I don't even

1:27:46

know how you would find it. It's about the size of one of those little

1:27:48

lithium ion

1:27:49

batteries that you put in

1:27:51

your pocket.

1:27:56

Here it is. Here's the size next to

1:27:58

an Australian something. A ten pence

1:28:00

piece. I don't know what

1:28:03

that

1:28:03

is. Six millimeters by eight millimeters. If you see it,

1:28:06

yeah, don't

1:28:06

touch it. Don't

1:28:07

pick it up.

1:28:10

Don't taste

1:28:10

it. Yeah. You guys are too young to remember. But

1:28:13

when I was a kid, there were constant ads

1:28:15

not to touch blasting caps.

1:28:18

Do you remember that? Do you remember that,

1:28:20

John? No. Blasting caps. That was,

1:28:22

like, maybe that was a major problem.

1:28:24

I know. Kids, if you see

1:28:26

this, don't touch it.

1:28:28

Well, kids, if you see a six

1:28:30

millimeter by eight millimeter shiny silver capsule,

1:28:32


1:28:34

don't touch it. It could kill you. It could kill you. So do you wanna wear that on your head?

1:28:41

I don't know. To Christina's point though, we haven't

1:28:43

seen any real progress in alternate sources for power in a while. I remember CES about

1:28:45

a decade

1:28:47

ago, there was, like, a whole

1:28:50

hall full of portable devices

1:28:50

for hydrogen power, basically. So you could have

1:28:53

a fuel cell

1:28:55

in your laptop or on your phone. And

1:28:58

there were so many different vendors. It seemed like it was just a couple of years away. Yeah. And then I'm guessing they started,

1:29:02

you know, exploding in people's pants, and that was probably the end of that. But we

1:29:05

haven't really seen anything since

1:29:07

then. Certainly, solid-state batteries are

1:29:09

just around the corner. I think we'll see those

1:29:11

soon. And those will provide a

1:29:13

pretty big step forward in terms of charging speed, discharge speed, and

1:29:15

will help to reduce the overall volume of

1:29:18

a given capacity of

1:29:20

battery.

1:29:21

But really, you know, like I said, chemistry gains happen slowly, and there

1:29:23

really isn't any kind of shot in the dark

1:29:27

coming soon for portable devices. For cars, you

1:29:29

know, supercapacitors, things like that, I think we'll have some big gains in a decade or

1:29:31

so, but there's nothing like that

1:29:34

going for smaller stuff. Yeah.

1:29:37

Microsoft's quarter was not great. Revenue was up two

1:29:39

percent, profit down twelve percent. This is primarily, I

1:29:41

think, due to this

1:29:43

PC drop-off. Both

1:29:47

below Wall Street expectations,

1:29:50

Amy Hood, Microsoft's

1:29:52

chief financial officer,

1:29:55

said new business slowed in December, and it expects

1:29:57

growth to continue to slow in the current

1:29:59

quarter, which ends March

1:30:02

thirty one. On the other hand, I think Microsoft is very

1:30:05

well positioned. This OpenAI investment

1:30:07

is looking very smart.

1:30:10

Right now. Clearly, if AI

1:30:12

is taking off, businesses like Azure are

1:30:14

gonna do very

1:30:15

well. Nobody wants to invest in

1:30:17

the storage

1:30:18

and the TPU capacity that's

1:30:20

required for learning big sets

1:30:23

of data. So they

1:30:25

do it often in the

1:30:27

cloud. Microsoft, Google, Amazon all

1:30:29

benefiting from that. So I, you know,

1:30:31

I think I would be

1:30:33

bullish about Microsoft, Christina. I think

1:30:35

you're in a good place. And

1:30:38

certainly about

1:30:39

GitHub. GitHub passed one

1:30:41

hundred million developers this

1:30:44

week. Yes.

1:30:45

Yes. That was very, very

1:30:47

exciting news. A hundred million developers, and a couple

1:30:50

of years ahead of schedule. So

1:30:53

the goal had been twenty twenty

1:30:55

five. We were able to hit it, you know, in early twenty twenty three. So very, very

1:30:58

exciting. And when

1:31:02

you kind of look at the trajectory of how many

1:31:05

developers have joined the platform even

1:31:07

in, like, going back to twenty

1:31:09

sixteen, it's really ramped up.

1:31:11

And I think it's because one of

1:31:13

the great

1:31:14

things is that the definition of developer

1:31:16

has changed, I

1:31:19

think, in a really important way.

1:31:21

And so people who are working

1:31:23

around code or making contributions that might

1:31:25

just not be, you know,

1:31:28

code-focused can

1:31:30

still use platforms like GitHub. You know, the rise of the low-code, you

1:31:32

know, movement around people who

1:31:34

are building, you know, business applications

1:31:38

and doing other types of things where

1:31:41

you see a lot of data scientists and

1:31:43

other people doing

1:31:45

really innovative stuff. But again, in

1:31:47

their mind, you know, ten years ago, they might have said,

1:31:47

I'm not a developer. Now you can be like,

1:31:50

no, you are. This stuff that you're doing might

1:31:52

not be coding in

1:31:55

a traditional

1:31:55

sense, but it definitely, you know, is impacting things,

1:31:58

or in some cases, absolutely is. What's

1:32:00

the weirdest

1:32:01

thing people are using

1:32:04

GitHub for?

1:32:05

Maybe that's

1:32:05

a loaded question. No, it's interesting because, I mean, it's not all

1:32:07

code.

1:32:08

I mean, I know

1:32:10

novelists and writers use it.

1:32:13

Right? Yeah. I was gonna say, you know, we have this product called GitHub Projects, which

1:32:16

is like a, you know, kind of project management

1:32:18

stuff. And you will see people who will just

1:32:23

it their life, like, to to just have it as,

1:32:25

like, a very organized kind of to do list

1:32:28

thing. And

1:32:31

and that's really cool to see. But as you see, yeah, a novelist,

1:32:33

people who use it for writing. Yeah. I think

1:32:35

that is definitely a really cool way. Then

1:32:37

we also see, you know, used

1:32:39

in in really interest in ways, you

1:32:41

know, like by, you know, people in NASA

1:32:43

and in other organizations. It's

1:32:47

interesting to see a lot of the data science

1:32:49

stuff is really interesting because you can see people putting their Jupiter notebooks and there are

1:32:52

other outputs there.

1:32:55

That I think is actually really great. I think

1:32:57

seeing notebooks has been such a

1:32:59

great feature to to happen, I

1:33:01

think, in in code for a

1:33:03

lot of reasons. And

1:33:05

as we've gotten better support for that stuff within GitHub, I think that's been a really cool thing to see

1:33:08

the datasets

1:33:12

and that stuff that people have used. I

1:33:14

that that's really awesome to me because those are things that wouldn't fit with a lot of traditional code

1:33:17

TWiT, but is a really

1:33:19

great way where we

1:33:22

had an incident, I think it was last year where we got rid of one original

1:33:27

Laporte URL shorteners. And

1:33:29

and we did it because the the code behind it was was really antiquated, and and

1:33:31

it hadn't been

1:33:36

up capped. But we had to wind

1:33:38

up migrating a lot of the the URLs over and kind of keep them working because

1:33:40

it turned out that there

1:33:42

were a number of academic papers

1:33:45

where people had used the URL shortener, which would just go to a GitHub repo in

1:33:47

their academic papers. And always really interesting

1:33:50

to see how many

1:33:52

people

1:33:53

will put the full data sets and

1:33:55

and other information of academic papers on GitHub repos. That's always really

1:33:58

cool to see. Somebody has got a open eye open

1:33:59

AI chat GPT prompt

1:34:03

for a link bait article Better GitHub with

1:34:05

this one weird trick. I

1:34:07

think I think we should write

1:34:09

that right

1:34:10

now. Somebody will come up with that.

1:34:12

Totally

1:34:12

write that. Somebody I hugely admire one of the most

1:34:14

famous programmers in the world Peter Norvig. He's a scientist

1:34:17

at AI scientist

1:34:20

at

1:34:20

Google. Uses Jupiter

1:34:22

notebooks on GitHub. I follow him because I do the advent

1:34:24

of code coding

1:34:25

problems, and he does these every year. And of course,

1:34:27

here's one of the best programmers

1:34:31

in the world. This is what a Jupiter notebook looks like on

1:34:34

GitHub. He's got cartoons. He's got

1:34:36

code that runs.

1:34:39

He's got results. I mean, it's

1:34:41

amazing. He's even got visualizations in here because Jupiter notebooks,

1:34:43

which is just one of many kinds

1:34:45

of notebooks, but Jupiter is probably the

1:34:48

most popular. Allow

1:34:50

you to run code and write text so you can you could do true literate programming. think

1:34:52

this is fantastic. I I am

1:34:54

so impressed. I think to me,

1:34:56

TWiT is

1:34:59

a great use of GitHub. I mean, this is

1:35:02

actual Python code that runs.

1:35:04

Exactly.

1:35:04

Yeah. And it's such a great it's

1:35:06

such a great teaching pool. You know, honestly, like, it really

1:35:08

is, I think, one of the best ways to teach

1:35:10

stuff. And and so using that with

1:35:13

add in up code, and that's beautiful. That's a great It's

1:35:16

really cool. He and it and it's I

1:35:18

love that. It's marvelous to look at his

1:35:20

code because it's

1:35:23

a clear and it is a little TWiT,

1:35:25

but it's very clear and precise and and inspired. I mean,

1:35:27

it's this is this

1:35:30

is a guy who speaks code

1:35:32

and it's so fun to to look at this. I always

1:35:34

wait until after I've tried to solve the problem before

1:35:36

I read his his post. He also

1:35:39

has got somebody doing cartoons. In

1:35:42

all this, this is a GitHub

1:35:44

page. This is a repository, which is

1:35:46

pretty darn cool if you ask me.

1:35:49

Anyway, Microsoft tough tough quarter, but I

1:35:52

think the

1:35:55

market rewarded missing its

1:35:58

targets with a four percent bump in the

1:36:00

stock price because I think of the

1:36:02

future of AI and everybody knew

1:36:04

that this PC slowed down. Was gonna

1:36:07

hit Microsoft just as much and if not

1:36:09

more because, of course, they make the operating system

1:36:11

for most of these computers. So

1:36:14

I think you're at a good company. I would if I were you,

1:36:17

Christina, I would keep that

1:36:18

job. Just my advice too. I mean,

1:36:21

definitely, IIII that's definitely the

1:36:23

plan. Right? Like, I don't have, like, you know, everybody everybody is is is

1:36:26

there's this uncertainty everywhere, but that is

1:36:28

that is definitely the

1:36:30

plan I certainly feel very lucky to get

1:36:31

GitHub. Yeah. And yeah. Well, we love you, and

1:36:34

you could always come here if you needed to. But

1:36:36

I don't think I could pay

1:36:38

you anything like Microsoft. I

1:36:39

see. So Well, but IIII

1:36:41

appreciate that. Just so you know,

1:36:42

you know, bring your shoes. Come on over. Yeah.

1:36:44

I will bring my

1:36:46

shoes. I'll come to Petaluma. Okay.

1:36:48

What's

1:36:49

your what's your new kick? What's your what's your

1:36:51

hot new kick? Anything exciting? Okay. So I

1:36:54

don't yeah. Actually. I got I did not I don't have them

1:36:56

in this room with me. They're in the other room, but

1:36:59

I went to Vegas last week

1:37:02

with my mom. I took her to see Adele last weekend. It was

1:37:04

amazing. Oh, how

1:37:05

fast time. Oh, how fast? Yeah. My

1:37:08

mom is my mom has never

1:37:10

been to Las Vegas. And I haven't been for an odd work related reason

1:37:12

-- Right. -- in a really long

1:37:13

time. It's a very different experience, isn't it when you're

1:37:15

not going to -- Right. --

1:37:17

to the convention center every day,

1:37:20

all

1:37:20

day. Honestly, it was it was it was, like, a

1:37:22

completely different thing for me. We had such a great time, but we

1:37:24

were staying at the

1:37:26

at the plateau, which is part of

1:37:28

the Venetian, and they have a big mall. And

1:37:30

then there's, like, the the one in the Encore next door. And, anyway, I

1:37:33

went into Ferragamo, and I

1:37:35

bought a pair of Farracamo

1:37:37

sneakers. I will put them in the chats. They are great, but that is my that is

1:37:39

my new. Can

1:37:44

I I shouldn't? This is gosh

1:37:46

of me. How much were they? Can I ask? Like, eight hundred? Yeah. Well,

1:37:49

that's

1:37:51

not bad. at Ferragamo starts at eight hundred. So you really really

1:37:53

got a deal, I think. Yeah. Exactly. III got the low end. Here's the thing. There was

1:37:55

a parent I like that were a little bit more

1:37:58

expensive. They were still within my budget, but I

1:38:00

would've spent. But

1:38:02

I'm a five and a half,

1:38:04

which the the salesperson had never seen someone

1:38:06

with feet as small as mine.

1:38:10

Tiny little I have tiny little feet too. Yeah. I don't

1:38:13

know what that I feel like I'm just gonna

1:38:15

fall over in a in a

1:38:17

stiff wind. So Yeah. Yeah.

1:38:20

Alright. We're gonna take a break. You go

1:38:22

get those Ferragamo's if you want because we're

1:38:24

gonna talk about our sponsor.

1:38:26

Thank you ACI for sponsoring our studios for

1:38:29

the year. ACI Learning, you say, well, I don't know them,

1:38:31

but you do know the name IT

1:38:36

For the last decade, our partners

1:38:38

at IT Pro brought you engaging, entertaining IT content to

1:38:40

level up your

1:38:43

career, your organization, In fact, I

1:38:45

think a great many of our listeners are IT pro

1:38:48

members. There are

1:38:50

two hundred twenty seven thousand

1:38:53

people in the IT pro community. That

1:38:55

is a great learning community. Many of them, Twitter listeners. Well, IT

1:38:57

pro has partnered now

1:38:59

with ACI Learning which

1:39:03

really expands its reach. This

1:39:05

is really good news for all

1:39:07

of

1:39:07

us. Expanded

1:39:11

production capabilities ACI learning is has

1:39:13

expertise in not just IT,

1:39:15

because IT pro is the

1:39:17

best, but they also have

1:39:20

audit pro So audit readiness is a

1:39:22

big part of IT these days that can help you there. They also have a cyber security

1:39:25

division, which is

1:39:28

the best they have even learning hubs where

1:39:30

you can go and learn in person, which for some people at least part of the time is a

1:39:32

valuable adjunct to the online learning

1:39:34

that IT pro is famous for.

1:39:38

One of the most widely recognized beginner certificates,

1:39:40

the CompTIA a plus cert.

1:39:43

I know many of you in

1:39:45

IT, that's where you started. Right? That

1:39:47

desktop support cert. Comp t courses with

1:39:49

IT pro from ACI learning make

1:39:51

it easy to go from

1:39:53

being kind of a day

1:39:56

dreamer about getting that career in IT

1:39:58

to actually having a career in IT. Earning those certs is really

1:40:01

the the

1:40:04

most important thing to do to get into an

1:40:06

entry level IT position. You don't have job experience. Right? You don't you can't say, well, I did this

1:40:08

and this. But if you've got

1:40:10

that cert, they know you've got the

1:40:13

skills, the qualification. And it gets you started to move

1:40:15

on in your field, and that's what IT pro dot from ACI learning

1:40:18

is so good at.

1:40:21

Tech is one industry where

1:40:23

the opportunities now are outpacing growth, especially in cybersecurity. There

1:40:25

are more than

1:40:28

a million open, unfilled

1:40:30

jobs in cybersecurity right now. A recent LinkedIn study predicts

1:40:32

IT jobs will be the most

1:40:34

in demand roles in twenty twenty three.

1:40:39

Don't waste time. Get going. This is a

1:40:41

career that will reward you. It's

1:40:43

it's fun. You're

1:40:45

already interested in technology. About a one third of the information

1:40:48

security jobs require, a

1:40:50

cyber security certification. About

1:40:52

twenty three percent of IT

1:40:55

jobs require that, but but

1:40:57

a third require of cybersecurity jobs require a cert.

1:40:59

Organizations are obviously very hungry for

1:41:04

cybersecurity. Talent, but they wanna know

1:41:06

that you've got what it takes. The average salary for cybersecurity specialists, a

1:41:08

hundred sixteen thousand

1:41:11

dollars a year ACI

1:41:13

learning's information security analyst and cybersecurity specialist programs can

1:41:15

help you get that

1:41:19

money, get certified, get

1:41:21

that job. Last year, the global cybersecurity workforce gap

1:41:23

grew bigger, not smaller. increased by twenty

1:41:26

six point two percent.

1:41:29

Over twenty twenty one. There's a job out there waiting for you. ACI

1:41:31

Learning offers multiple cybersecurity

1:41:35

training programs that can

1:41:38

prepare you to enter or

1:41:40

advance within this exciting industry. The most popular

1:41:42

cybersecurity search, not in any particular order, CISSP,

1:41:47

I love and this is the one I wanna do. EC

1:41:49

Council's certified ethical hacker. I

1:41:51

I just thought that'd be great

1:41:53

to have CEH after

1:41:56

my name. Certified network defender. There's a

1:41:58

cybersecurity audit school. That's a specialty that is gonna

1:42:00

be in huge demand as

1:42:02

people need to prove compliance to

1:42:07

customers, to higher

1:42:09

ups, to regulator's

1:42:11

very important job. Learn cybersecurity

1:42:13

frameworks. They've got great classes in that too. When, where, and how

1:42:16

you learn makes

1:42:18

a big difference. ACI

1:42:20

learning makes it easy.

1:42:22

They offer fully customizable training no matter what kind of learning you learner

1:42:24

you are. You know,

1:42:26

some people really wanna be

1:42:29

in the classroom in person. Some

1:42:31

people really more comfortable remote. Some people like it. Live

1:42:33

remote. Some people want it on demand. ACI learning has

1:42:36

it all. Explore

1:42:39

what ACI learning offers with IT pro, audit

1:42:41

pro, which includes enterprise solutions

1:42:44

webinars, and their great

1:42:46

podcast to skeptic auditor podcast. They've got

1:42:48

practice labs so you can get hands

1:42:50

on. Just from your own home in

1:42:52

a browser, they've got learning hubs

1:42:54

where you can actually go in

1:42:56

and get in person instruction, and

1:42:59

they've got their partnership program too. This is really an exciting move for

1:43:02

IT

1:43:03

pro. ACI Learning a

1:43:05

great partner. I'm very excited. Tech is the one industry where opportunities

1:43:07

outpace growth,

1:43:11

especially in cybersecurity. One third

1:43:14

of information security jobs require that cert. Get the cert. Get the job to

1:43:16

maintain your competitive

1:43:18

edge across audit, IT, cybersecurity

1:43:21

readiness. Visit the website go dot ACI learning

1:43:23

dot com slash tweet. You got

1:43:25

that, and please use that so they

1:43:27

know you saw it here.

1:43:30

That's important to us. Go dot a

1:43:33

c ilearning dot com

1:43:35

slash Twitter. We also have

1:43:37

a great offer code. Thank you,

1:43:39

ACI Learning. Use the offer code twit three

1:43:41

zero for thirty percent off a standard

1:43:43

or premium individual IT pro

1:43:46

membership. Thirty percent off twit

1:43:48

thirty at go dot ACI

1:43:50

learning dot com slash tweet. IT Prism is such

1:43:51

a great partner for us since they started back in

1:43:53

twenty thirteen, and we're thrilled to

1:43:55

welcome ACI learning. And

1:43:58

IT pro into the family. Thank you for

1:44:01

supporting the studio and supporting what we do.

1:44:03

We really appreciate

1:44:03

it. Alright. Let's see

1:44:06

those kicks, Christina, Christina's new kicks.

1:44:08

These are Oh.

1:44:12

Oh. Yeah. What's

1:44:14

that logo? Is that the Ferragamo? What is

1:44:16

that? I guess

1:44:17

so. I'm not even sure. III

1:44:19

did like how it looked. Yeah.

1:44:21

And then when I really liked I really like the

1:44:23

back, which is like this TWiT black mite Okeydoke.

1:44:26

Okeydoke. Yeah. And and

1:44:28

again, like, I'm

1:44:29

not defending Are

1:44:31

you ever gonna wear the Are

1:44:33

you just gonna put them on the shelf and

1:44:35

sell them somebody someday? Oh, No. No. No. my

1:44:35

shoes. I don't

1:44:37

I don't buy them for

1:44:39

the resell value a. My

1:44:42

foot is so small that No

1:44:44

one's gonna buy a five and

1:44:46

a half. No. Exactly. Right? Like, that

1:44:49

that's that's that's that's there's a very small number

1:44:51

of people who be able to wear my shoe

1:44:53

size. No. I buy them to wear. I

1:44:55

have them on the back wall. I know.

1:44:57

I see them and I can see

1:44:59

the soles are used. You are not one

1:45:01

of those people. Yeah. Exactly. Yeah. Just puts a shoe on a No. I

1:45:03

mean Let's it suffer in silence.

1:45:06

And I have some I have some friends who do that. not me. For me,

1:45:08

I'm like, no, shoes are to be worn, fashions

1:45:11

to be worn, like, don't don't

1:45:13

hoard it in that way, but because if I if I

1:45:16

spent money even if it was fifty dollars on a

1:45:18

pair of shoes and they never wore

1:45:19

it, like that's I don't

1:45:21

know. That's a

1:45:23

waste. Yeah. I agree.

1:45:24

You buy it to

1:45:26

enjoy it. Yeah. I I am wearing a forty niners

1:45:28

Jersey, which is now

1:45:30

for sale cheap, if anybody.

1:45:33

No. I'm just I'm just kidding. But I bought Lisa's birthdays today in

1:45:35

our anniversaries, so she

1:45:38

got a lovely birthday

1:45:40

present. But

1:45:43

Of course. I bought her a, you

1:45:45

know, our our young star

1:45:47

quarterback, the rookie,

1:45:49

mister irrelevant, Brock

1:45:50

Purdy, I brought her bought her a at Brock Purdy

1:45:52

Jersey to wear it during the big there's

1:45:54

a big game today for those of

1:45:56

you.

1:45:56

I learned about that.

1:45:57

Yes. You didn't know at first though. Sportball.

1:45:59

What is that? So I bought her a purdy

1:46:03

jersey. But see, I

1:46:06

Now, Marie, you could tell me if I'm right or wrong on this. I thought, what size should I get? And then I said,

1:46:09

I'm getting the

1:46:12

small. Right?

1:46:14

Because if it's too small, that's

1:46:16

fine. If I got

1:46:18

large, no, that would have

1:46:20

been bad. So a little

1:46:23

husband tip start with the smallest size, whether

1:46:25

it's a shoe or a

1:46:27

shirt, small start

1:46:30

with the smallest

1:46:31

size. Can always return it and get the next one which I'm I'm gonna

1:46:33

have to do because

1:46:34

she's not that small. She

1:46:36

is she's tiny. I

1:46:39

thought it would

1:46:40

fit. But I guess women's small is pretty small.

1:46:42

It's probably, you know, the equivalent of a five and a half

1:46:44

shoe.

1:46:47

So I did something really

1:46:48

gloomy last night. I watched a

1:46:50

movie called Too Leslie. Anybody

1:46:53

see that

1:46:54

yet? I haven't yet. It's on the list

1:46:55

of consumers. No. I haven't. No

1:46:57

smart list. Yeah. But

1:46:58

every every

1:46:59

everybody started talking about it TWiT

1:47:01

then I got all the nominations. So this is about This

1:47:03

is about This is proof that Twitter

1:47:05

TWiT all its problems still

1:47:08

is very powerful.

1:47:10

Normally, this time or actually

1:47:12

last month in December, you see especially

1:47:14

in Los Angeles, which is, you know,

1:47:17

it's a company town. Billboards, ads,

1:47:19

and every magazine TV ads

1:47:21

for your consideration. Movies that

1:47:24

they want the members of

1:47:26

the academy to vote for, to nominate for best to a

1:47:28

picture, best to actor. Because it makes a

1:47:30

big difference in box office. Right? So

1:47:34

There was a tiny little movie.

1:47:36

It only made twenty seven thousand

1:47:38

dollars at the box office. Called

1:47:40

too Leslie, the movie company could

1:47:42

not possibly afford even one

1:47:44

billboard on sunset strip for

1:47:47

your consideration. But somehow, They

1:47:51

got every mainstream a

1:47:53

list actor in the world

1:47:55

to

1:47:55

tweet something just like this.

1:47:58

This is Edward Norton. I don't post a lot film or actor

1:48:00

performances. Maybe I should more often,

1:48:02

but for those interested in really

1:48:06

great acting, I'll share that Andrea Rise Burrows, portrayal

1:48:09

in two Leslie just

1:48:11

knocked me sideways. It's

1:48:13

about the most fully committed emotionally

1:48:15

deep and then there's a dot dot dot.

1:48:17

I don't know. Maybe there's more. Oh, here it

1:48:19

is. Physically, harrowing performances I've seen in a

1:48:21

while just raw and utterly devoid of performative BS. It's

1:48:23

tough, but really elegant and compassion, filmed by Michael Morris, where the emotion has really

1:48:25

learned I happen to catch it. Wow.

1:48:27

I was really three

1:48:30

tweets staggered by the depths she reached

1:48:33

very rare checking out. But turns out It

1:48:35

wasn't just Edward Norton. TWiT

1:48:38

was pretty much everybody in Hollywood tweeted this. This was a mess.

1:48:44

Twitter campaign to get this actress

1:48:46

who's frankly not well known, an

1:48:51

Oscar nomination plate ElonJet,

1:48:54

Spielberg, Oprah, Happy birthday, Oprah, Merrill Street, Daniel Day Lewis, Martin Scorsese,

1:49:01

But Brian Roe pointed this out on Twitter. All used

1:49:03

this exact phrase, the greatest performance in the

1:49:06

history of the cinematic

1:49:08

medium. Where

1:49:11

you work in PR? Do

1:49:13

you think that was

1:49:16

a coincidence?

1:49:18

I mean, I think they just performed once in the history

1:49:20

of this incident. I

1:49:21

mean It was

1:49:22

a okay. There is no lie

1:49:25

there. Oh, you watched two

1:49:27

ant. It was a really good performance. It

1:49:29

was amazing. She got a nomination for best actors

1:49:32

beating out

1:49:35

some people who everybody thought were shooting,

1:49:38

including Viola Davis for Wakanda,

1:49:40

and I'm

1:49:43

sorry, was

1:49:43

the name of it?

1:49:46

Woman king. Queen king.

1:49:48

Woman king. Apparently, great.

1:49:50

I did not see it.

1:49:52

And then there was Till. And

1:49:54

the actress in Till, who everybody thought both actresses snubbed

1:49:59

by the Golden Globes and now snubbed by the Academy, but this very

1:50:01

little known actress

1:50:02

with a

1:50:03

film that made twenty

1:50:07

seven thousand dollars.

1:50:09

Got all

1:50:10

of this attention and

1:50:12

got

1:50:13

a nomination. That's the power

1:50:15

of Twitter. Right? You

1:50:19

didn't

1:50:19

need a billboard on Twitter and

1:50:20

Go ahead. Well,

1:50:20

it was Twitter and then was also TWiT, like,

1:50:23

Ed Norton and some

1:50:25

other celebrities that may have, like, screenings --

1:50:27

Yeah. -- for cutting over the phone. So Jennifer Aniston says, come over

1:50:29

to my house and we can watch this

1:50:31

fine movie to Leslie.

1:50:34

Who's gonna turn that down? Right? Distributor

1:50:36

momentum pictures did not have

1:50:39

any money to not

1:50:41

in a campaign. Riseboro was

1:50:44

not nominated in the Golden Globes or

1:50:46

the SAG Awards. It's basically a word-of-mouth

1:50:48

campaign. Kicked off

1:50:51

-- Kicked

1:50:52

off Two

1:50:52

days before Oscar voting began too,

1:50:54

it was a late entry. Very late campaign. Yeah.

1:50:57

That's incredible. Like, even

1:50:59

though, obviously, it was like,

1:51:02

you know, getting the the photos to to see

1:51:04

it. But but the Twitter thing, you're exactly right.

1:51:06

Like, I am and I followed this

1:51:08

stuff for a long time. They didn't have

1:51:11

money for a campaign. So this is

1:51:13

a really interesting, I think example of

1:51:15

of the right connected people

1:51:17

stepping up and using social platforms

1:51:19

to you know, highlight something that otherwise would not have been getting

1:51:21

the sort of attention. Whether or not,

1:51:23

you know, she's gonna win or

1:51:26

TWiT, it remains to be seen, but

1:51:28

that's that's pretty that's

1:51:30

pretty fantastic. Here's a here's a tweet by crazycons. It tweet something weird

1:51:32

is happening. Here's

1:51:35

me a faroe.

1:51:37

Here's Meredith Vieira.

1:51:38

Here's Chomena. And by the way, all of them say a small film with a giant heart. A

1:51:41

small film with

1:51:43

a giant heart. A

1:51:46

small film with a giant heart delay Hill, a small film with a giant heart. Mhmm. I do congratulate Mark

1:51:51

Marron, who is build as an

1:51:53

executive producer probably because they couldn't pay him for it. Right. But he's a well known

1:51:56

podcaster. Does

1:51:59

the WTF podcast famous

1:52:01

comedian. I feel like

1:52:02

he's one of our own.

1:52:03

He has a very large role, and he's quite good. Didn't you think

1:52:05

Mark Baron was gonna answer? You know, even

1:52:07

know who he is?

1:52:11

He was the guy with the beard who did the Yeah. And then

1:52:13

the other guy who was in

1:52:15

it is

1:52:15

bubbles from

1:52:18

the

1:52:18

wire. And I'm watching this guy. I'm

1:52:20

saying I know this character. Who

1:52:22

is this actor? Remember Bubs? In the wire,

1:52:24

he was the kind of strung out

1:52:26

junky Informer that was actually,

1:52:28

you couldn't take your eyes off when he was on the

1:52:31

screen. He's in it as well. It

1:52:33

what was your give it out of five

1:52:35

stars, how many? Three

1:52:36

and a half. Three

1:52:39

and a half. Uh-huh.

1:52:41

I give it more than that, but it it's

1:52:44

it's very grim. It's dark.

1:52:46

And then it has a

1:52:48

well, I don't wanna spoil it

1:52:50

for you. But you I guarantee you that this

1:52:52

suddenly is gonna make millions of dollars. Right? In

1:52:54

streaming, you can stream it on all the

1:52:56

major

1:52:57

streamers. And I'll just open up

1:52:59

I mean, I'm I'm gonna

1:53:01

Sorry. Go ahead, Christina.

1:53:01

No. I was just gonna say I

1:53:02

I'm definitely gonna be streaming it. I've I meant to

1:53:03

watch it this weekend and I

1:53:06

didn't have a chance. I

1:53:08

made a point of watching last

1:53:10

night, so I'd be ready for today. It's good.

1:53:13

I'm glad I watched

1:53:15

it. I mean, It's no Wakanda forever,

1:53:17

but, you know, it's okay. And there's raising questions about the

1:53:19

ethics of these campaigns, and

1:53:20

there there are rules about what you can

1:53:22

and can't do with these campaigns. But

1:53:26

they may not have anticipated the

1:53:28

Twitter era and this campaign, which

1:53:31

if TWiT was based primarily on

1:53:33

Twitter didn't cost

1:53:34

anything, but was apparently extremely effective. Mhmm.

1:53:36

And

1:53:37

it shows you the power

1:53:39

of of while coming

1:53:41

over to Cape Blanchett's house

1:53:43

is one. Yes.

1:53:45

Big thing. Right? Yeah.

1:53:46

Yeah. But but I'll I'll, you know

1:53:47

Honestly, honestly, that's the big thing. The last last

1:53:49

one I can remember

1:53:51

that I guess similar to this

1:53:53

was the the campaign for for Frozen River, which is Melissa Leo. And

1:53:56

that was on

1:53:59

an April best and that was a very small

1:54:01

film. Yeah. And Alyssa Lille was nominated for best actress. She won the

1:54:04

following year

1:54:06

for best boarding actors for the the fighter. And and

1:54:08

I don't think she would have won

1:54:10

had she not --

1:54:11

Right. -- you

1:54:12

know, been in prison over the year

1:54:15

before even though the fighter had a

1:54:17

very large campaign behind it. I I think that most of LEO won

1:54:19

because of the the frozen river campaign the

1:54:23

year earlier. But interesting to to see and and you're

1:54:25

right, Harry. Like, there are ethical things, but at

1:54:27

the same time

1:54:29

yeah. They're all used in

1:54:31

the same language because some PR person send

1:54:33

it

1:54:33

to Yes. Some years. There was somehow. Our plate plan shot wrote to

1:54:35

everybody saying, here's

1:54:39

a suggested

1:54:40

tweet. I mean And then everybody just copy

1:54:42

pasted and and just the the same as, you know, you do the the Instagram, like, influencers,

1:54:44

like, the Kardashians often

1:54:47

times, which just copy the

1:54:49

entire prompt, including this that they weren't supposed to copy,

1:54:51

and they post it, you know, on their accounts. But

1:54:54

I mean, you know,

1:54:57

I don't think this breaks any of the rules.

1:54:59

I mean, I think that, you know, you having if if a very famous and influential person decides to

1:55:04

have other people putters over at their

1:55:06

home to watch something. I I don't think that breaks any rules. Maybe maybe it should,

1:55:08

but but I don't

1:55:10

think it

1:55:11

does. You know? Or maybe

1:55:13

just like everybody knows Andrea Riseboro and thinks she's really wonderful.

1:55:15

Now I'm learning she's English, which does

1:55:17

impress me more because she didn't play

1:55:20

in English. Character

1:55:23

in there. She

1:55:23

plays a a southern

1:55:26

character. She That's

1:55:28

hard. Yeah.

1:55:29

That's hard. She hasn't done

1:55:31

a lot. She was in

1:55:33

some movies I've heard of, but never saw,

1:55:35

like, nocturnal animals and

1:55:40

the

1:55:40

death of Stalin and I don't know. It's a it's

1:55:42

an interesting thing. What were you gonna say, Harry? I think

1:55:44

I set up oh,

1:55:46

you set it already. Okay.

1:55:48

By the way, it's The character who played

1:55:50

bubbles in the the wire is Andre

1:55:53

Royal. I wanna

1:55:56

give him credit. I have

1:55:58

not seen him ever since, but he was he plays

1:56:00

royal in the movies. It's worth

1:56:02

seeing that for Mark Marron and

1:56:06

Andrea Rio. That's what you

1:56:08

know? It's

1:56:09

nothing else. And, yes, Andrea

1:56:12

Rio

1:56:13

Rise Bros.

1:56:14

Good. I was as I'm

1:56:16

watching it, I don't know. I don't wanna spoil it. I'm

1:56:18

thinking, don't do that. Don't do

1:56:19

it. I know

1:56:20

they're gonna do it, but I don't want them

1:56:22

to do and they did

1:56:23

it. all say. I don't know.

1:56:25

That's not that's not a

1:56:28

spoiler. Hey,

1:56:30

by the way, there are a couple of cool things we didn't mention with GitHub.

1:56:32

I just wanna mention there's now

1:56:34

a a co pilot paintbrush. Right?

1:56:37

That you paint your code And now you could say,

1:56:39

hey, GitHub. Yeah. That's wild.

1:56:42

So you can Which

1:56:45

is

1:56:45

which is fantastic. So you can If you've

1:56:47

got carpal tunnel or something, you could just say,

1:56:49

hey, GitHub. Write this login code for

1:56:51

me. I'm I'm too

1:56:53

tired. Wants me

1:56:55

to log in. Okay.

1:56:56

That's pretty cool. Hey,

1:56:58

good help. Yeah. It's

1:56:59

very cool. Import pandas. Import

1:57:03

graph plotting

1:57:03

library. Hey, GitHub.

1:57:06

Insert new line.

1:57:08

Get Titanic

1:57:10

CSV data from the web. An

1:57:12

assignment to the variable titanic

1:57:14

data. Holy calls from titanic data were ages null. Fill

1:57:16

null values of column fair

1:57:19

with average column values

1:57:22

Drop duplicates from the frame

1:57:24

titanic data. Hey, GitHub.

1:57:27

New

1:57:27

line. Flatline

1:57:28

graph of age versus

1:57:31

spare column. the scatterplot.

1:57:32

Show plot. Hey, GitHub.

1:57:34

Exit code mode.

1:57:39

Hey, GitHub. Run program. Oh my god. That's pretty

1:57:41

impressive that demo right there.

1:57:43

It's writing Python code.

1:57:47

No typing involved. That's a very this

1:57:50

is a very common kind of data query for data scientists.

1:57:52

And you don't have to type

1:57:54

all those brackets and tabs and

1:57:58

semicolates

1:57:58

does it. Howard Bauchner:

1:58:00

And and what's but the impressive thing

1:58:02

with that is that there are obviously,

1:58:04

there's been a lot of Texas speech

1:58:06

technology for years that's very

1:58:08

good. TWiT has not worked

1:58:09

well with

1:58:09

with code because that's It's so specialized. Been what it's

1:58:11

designed for. And and exactly.

1:58:13

And and, like, when I when I

1:58:15

was hit by the car, five

1:58:18

years ago. And I I broke my my wrist, but my primary hand, like, typing

1:58:20

was before when I before I I

1:58:22

was in the cast and I was gonna

1:58:27

traction was impossible. And it made

1:58:30

coding basically impossible. And

1:58:32

I was using a lot

1:58:34

of, you know, text to speech

1:58:36

stuff or or voice to to

1:58:38

text stuff rather. And code was

1:58:39

that was the biggest challenge. And so when

1:58:42

I looked at, hey, get up, I was

1:58:44

like, okay?

1:58:46

Not only is it so cool that you can just

1:58:48

speak what you want it to do and it can it

1:58:50

can write it the right way in natural language,

1:58:52

but the fact is is that you can say things

1:58:55

like new line, or you can say, you know, in in in in in handles and

1:58:57

other things and it's not getting

1:58:59

confused because it's

1:59:02

it's been trained for you know, this specialized thing as you said, which

1:59:04

is really amazing.

1:59:07

Kind of incredible. Interesting

1:59:11

story about ADS

1:59:14

B. So I

1:59:16

had never heard of ADS

1:59:18

B not being a

1:59:20

pilot. But if you heard

1:59:22

about the ElonJet that's what

1:59:27

was using ADS b, which is a

1:59:29

database. It's actually, technically, ADS b

1:59:32

exchanged. And it was

1:59:34

kind of like I'm DB

1:59:36

or or

1:59:37

Wikipedia. It was created by users. And the

1:59:39

reason it

1:59:43

worked is because jet

1:59:46

airplanes have all airplanes, I guess, have transponders, transponding their tail number

1:59:48

and their location as they

1:59:50

fly around. That's how they know

1:59:54

where everybody is, and air traffic control

1:59:56

uses it, and I imagine other planes

1:59:58

use it. Well, it turns out if

2:00:00

you're an enthusiast, you can also have

2:00:02

a little receiver on the And monitor all

2:00:05

the traffic going ahead. And then if somebody were to write

2:00:07

a way to aggregate that data into

2:00:11

a map, and you had enough people with those little receivers all

2:00:13

over the world, you'd have

2:00:15

a pretty good tracking map

2:00:17

of all the flights. Well,

2:00:19

that's what ADS ASB

2:00:22

exchange was. But and I say was because

2:00:24

it was owned

2:00:27

by one person. A

2:00:31

lot of people contributed, but Dan Stuford

2:00:33

founded the site and was the

2:00:35

sole owner

2:00:37

of the site. And he sold

2:00:39

it to Jetnet, which

2:00:41

was by the way owned

2:00:43

by GetReady

2:00:47

private equity And at

2:00:48

this point, there is a little rebellion

2:00:50

going on, including by the guy

2:00:53

who does Elon

2:00:56

Jet, who said,

2:00:58

I'm not gonna use this data anymore, and I'm not gonna contribute it to it

2:01:00

anymore. TWiT understandable.

2:01:05

I mean, the server costs, the hosting costs

2:01:07

were expensive. AASB

2:01:11

exchange couldn't really monetize very

2:01:13

well. It's free to use. They used advertising, and then they had a kind

2:01:15

of higher paid

2:01:20

tier. But it's, you know,

2:01:22

still an expensive thing to run. And and so at at at some point,

2:01:24

Stuford decided that he

2:01:27

was gonna sell it Jack

2:01:30

Sweeney runs runs the ElonJet

2:01:32

Twitter account said today is

2:01:35

a sad day. If

2:01:38

you feed EADSPX change we encourage you stop

2:01:40

feeding. ADSB exchange was

2:01:42

found on the principles

2:01:44

of hobbyist community

2:01:47

not for

2:01:48

profit. Private equity firms.

2:01:50

So it'll be interesting to see

2:01:53

within a

2:01:56

few hours after the sale

2:01:58

became public, the eleven thousand feeders, eleven thousand people running these receivers

2:02:00

dropped significantly to ninety

2:02:03

five hundred people in span

2:02:06

of a few hours. I don't know where it stands

2:02:08

right now. I'm not an expert on

2:02:10

this, but I'd be very curious

2:02:14

to see what happens. One one user said,

2:02:16

flight aware, flight

2:02:19

radar win, Elon wins. All

2:02:23

the guys who are out to get us win.

2:02:25

So, you know,

2:02:28

remember the saga of, you

2:02:30

know, ElonJet and Elan chasing it off Twitter and he went to

2:02:32

Mastodon TWiT then Elon blocked

2:02:34

every Mastodon mention on

2:02:36

Twitter. TWiT was

2:02:39

a there's a final line in

2:02:41

that story. It's kinda

2:02:43

it's

2:02:43

kinda sad at Wednesday they

2:02:46

announced that they

2:02:47

had been acquired. Kind of like IMDB

2:02:50

or CDDB or all these other unfortunately, nobody's acquired Wikipedia, I

2:02:53

hope not. And no

2:02:55

one can acquire Mastodon. Again,

2:02:58

this is the argument for

2:03:00

these distributed places. JETnet is owned by Silversmith

2:03:02

Capital Partners, they were acquired last year.

2:03:07

The acquisition is the second of what the company anticipates will be several

2:03:10

future acquisitions as JetNet expands

2:03:12

its

2:03:13

data driven

2:03:16

product offerings. For the aviation industry. So you got

2:03:18

a

2:03:18

problem there if you've got volunteers freely uploading this

2:03:20

data and suddenly you

2:03:22

make a killing, selling it.

2:03:25

And this private equity comes

2:03:27

along. You need the

2:03:29

volunteers, don't you? Anything to say about

2:03:31

that? Or should we move

2:03:36

on. Okay. I'm sort

2:03:38

of I'm sort

2:03:39

of sympathetic, I guess, to

2:03:41

to the volunteers, at the

2:03:44

same time, the jets tracking said, I know it's

2:03:46

legal. I I'm not arguing the the legality at all because, obviously, you have to be able

2:03:50

to the the FAA has able planes not questioning any

2:03:52

of that. But I do think the jet

2:03:54

tracking stuff is gross. I

2:03:55

do. Yeah. Well, I under you

2:03:58

know, honestly, I understand Elon's point. I

2:04:00

mean, But, I

2:04:02

mean, it's not exactly a

2:04:03

destination coordinates. And Sweeney could

2:04:05

have done some things. Yeah.

2:04:08

0II

2:04:10

like delaying the tweet by an hour or two

2:04:12

--

2:04:13

Right. In in -- chance to move

2:04:15

on. I'm I'm not I'm not saying that

2:04:17

that it was it's asked action thing. I think

2:04:19

that was a little hyperbole. People in fandoms, like teenage

2:04:21

girls, have been doing this for years for

2:04:23

their favorite pop

2:04:25

stars. And it's grow it was grossed then and they

2:04:28

would, you know, TWiT on Twitter,

2:04:30

on Tumblr and whatnot. It's gross now.

2:04:32

I I do feel for, like,

2:04:34

us, like, the aviation enthusiasts community who feels

2:04:36

like this thing they've been contributing to is now

2:04:38

been sold to private equity who will be making money

2:04:40

off of it. But at the same time, like, the data's

2:04:42

either open or it's not. You know what I mean?

2:04:44

Like, you can create your own

2:04:47

thing. But, I mean, this is this is public data for a reason. I

2:04:50

remember watching guys. But

2:04:54

Gaga's movie, and I just

2:04:56

watched Taylor Swiss, miss Americana

2:04:58

movie. And the thing that I

2:05:00

really sticks in my mind is these

2:05:02

poor people go out of their doors of

2:05:04

their

2:05:05

apartment. And at any time of the day or night,

2:05:07

there are hundreds

2:05:11

of fans Standing there, waiting for them, they have to have big security guards

2:05:13

just to get them to the car

2:05:15

and buried. Nice. And

2:05:18

it's and I I meant and I'm starting

2:05:21

to read. I'm much to

2:05:23

my chagrin Prince

2:05:26

Harry's spare And it's somewhat the similar situation.

2:05:28

It killed Princess Diana. Right. Well

2:05:30

and and the the way that

2:05:32

a lot of the paparazzi finds where

2:05:34

the the celebrities are going to be is is that they track

2:05:37

their jets because a lot of them have if they

2:05:39

own their own jets and it's registered,

2:05:42

if they are simply renting one then then it's harder. But, like, you know,

2:05:44

Taylor Swift owns her own planes. And now

2:05:46

she's doing the thing I think where she,

2:05:48

like, hides the the registration,

2:05:51

which you can do a

2:05:53

certain way. People still her fans are insane. And and and I said this as a big tailor's

2:05:55

with them, but not one who appreciates or encourages any

2:05:57

of this because I think the stuff

2:06:00

is just gross

2:06:03

and disgusting. The the k pop fans are are

2:06:05

the same way where they will literally track

2:06:08

exactly where people

2:06:10

are at all times to try to know

2:06:12

and and put it up on the Internet and,

2:06:15

like, not realizing, then then they

2:06:17

get mad about the paparazzi you know, stocking their

2:06:19

their favorite stores. It's like, how do you think they're

2:06:21

figuring out exactly where they're landing, you know, and

2:06:23

then showing up at private

2:06:25

airports? Or or, you know, God forbid, they're having to fly commercial,

2:06:27

you know, showing up literally a baggage claim outside

2:06:31

LAX. Like, that's that's

2:06:34

because people are doing things like this, and and they're they're tracking their every movement. And there's

2:06:36

something there's

2:06:41

something gross about that. And again, I think that's I'm not trying to say that

2:06:43

everybody in fact, most of the people part of this community are not involved

2:06:45

in that at all. But III

2:06:47

do think that when

2:06:51

we have those discussions. And again, I don't

2:06:53

think that calling a assassination was in

2:06:55

any way correct. But there

2:06:57

is this very

2:06:59

gross aspect of for very for

2:07:01

high profile people having no privacy because you have really obsessive

2:07:04

people

2:07:06

out there who are tracking their removal and then in turn passing

2:07:08

that on to, you know, people who are

2:07:10

then going to take photos to sell

2:07:14

for lots of money. Yeah. I know

2:07:20

that Taylor's

2:07:23

let me see if I can I don't wanna jeopardize her safety? I feel

2:07:25

bad for I feel bad for anybody in

2:07:27

this situation. She uses

2:07:30

face recognition at her

2:07:32

concerts. To find

2:07:34

the most Alright. Go ahead. Yes. I was gonna say

2:07:36

she did, I

2:07:38

think, in the last concert.

2:07:41

Yeah. I think I think based

2:07:42

on Because they know who these most dangerous stuff I don't remember sharing my face. Yeah.

2:07:47

You Well, but they need to don't don't need to walk up to a

2:07:50

camera and smile. They see it coming

2:07:52

in. And

2:07:53

Yeah.

2:07:53

I guess so. I guess

2:07:56

I guess Yeah. I I guess I

2:07:58

was just in my mind, I was because I saw I'm being at the concert and seeing, you know, the signs

2:08:00

of that I

2:08:03

I don't remember

2:08:04

obviously, there wasn't anything when you entered where you

2:08:06

had to, like, scan your face. No. No. They they just look at the crowd. They just watch

2:08:11

you coming in. And they have

2:08:13

apparently, they have face recognition data for people who are considered

2:08:15

threats. And, you know, I have

2:08:18

more power to her. I I blame her for doing don't blame her

2:08:20

people for doing that because her life

2:08:22

is at risk. It's a shame she

2:08:25

has to. But it does

2:08:27

raise some interesting questions. So

2:08:29

there are big signs

2:08:31

saying, what? You're being your face is being captured? Yeah.

2:08:34

Something like that. I

2:08:37

a photo of it. I'll have to find it.

2:08:39

I I don't have it off top of my hand, but

2:08:41

I did take a photo of it when I saw it at

2:08:43

the Seattle reputation tour concert. I'm

2:08:45

sure it was at the one that I saw in in New Jersey as well.

2:08:47

This was in twenty eighteen, so I I which

2:08:50

was the last time she toured.

2:08:54

But, yeah, there was something like that that said that,

2:08:56

you know, that there you

2:08:58

know, your photo maybe used, you

2:09:00

know, by by by being

2:09:02

at this concert, like, you've consented you know,

2:09:04

to to your photo being used, you know,

2:09:06

in a database for for whatever the purpose might be, which, you know,

2:09:09

fair enough if

2:09:12

something that if you wanna attend this concert, you have to to

2:09:14

make that trade off. I'm there are plenty of people I'm sure who'd be

2:09:16

like, well, I will never go

2:09:18

to a concert that does that. But

2:09:20

I obviously so many people have my

2:09:22

face, my face isn't so many databases.

2:09:27

I I wanted to see the concert. So apparently, they put

2:09:29

rehearsal clips up on

2:09:31

a kiosk, and then

2:09:33

people would

2:09:35

go over and look. Why. They would go

2:09:37

over and look

2:09:38

at the clips, and there was a camera inside the display taking

2:09:42

their picture. Right. But That's what it was. And it yeah. was this kiosk TWiT.

2:09:44

And then there was a sign on the kiosk

2:09:46

that told you what it was doing. Oh

2:09:50

my god. The images, this is from which in twenty

2:09:52

eighteen. The images were being transferred

2:09:54

to a Nashville command post, where

2:09:56

they were cross reference to the

2:09:58

database of hundreds of the PoPs

2:10:01

stars hundreds of the pop

2:10:03

stars known Stalker's. Everybody who went by would

2:10:05

stop and stare at it and the

2:10:08

software would start

2:10:12

working.

2:10:12

And and presumably, if

2:10:15

you were one of

2:10:18

those people, some big burley guy with a walkie talkie would

2:10:20

walk over and say, excuse me,

2:10:22

sir. Now, this is relevant

2:10:26

to today because it's been happening, and we've talked about this

2:10:28

before at Madison Square Garden. The

2:10:30

Dolan's who own MSG and and

2:10:33

Madison Square Garden owns a bunch of

2:10:35

other stuff. Radio City Music Hall?

2:10:35

Well, it's happened first to radio

2:10:37

City Music Hall, a mother with her

2:10:39

girl scout troop

2:10:42

Went to see the rockettes for the

2:10:44

holiday show and was informed

2:10:47

as she

2:10:47

enters? Nope. Sorry, lady, you

2:10:50

can't

2:10:50

come in. Had to

2:10:51

wait outside up front while her girls

2:10:53

watched the Rockets, found

2:10:55

out it

2:10:56

was because she works for a

2:10:58

law firm that has a lawsuit with MSG.

2:11:00

And apparently,

2:11:01

the Dolan's have been doing this. MSG's been doing

2:11:03

this to any lawyer

2:11:07

that has any think going on with

2:11:09

MSG, they have face recognition and they will lock you

2:11:11

out. Or just if you work for

2:11:13

a law firm that also has

2:11:16

other lawyers Oh, yeah.

2:11:18

Suing

2:11:18

them. The mom said, I don't know anything about this.

2:11:20

This is not my I don't I'm not suing them.

2:11:22

Sorry, lady. And and, of course, James Nolan, who gave

2:11:27

fairly fiery interview about this couple of days ago, says

2:11:29

that's alright. It's a private

2:11:32

institution to

2:11:34

which The liquor licensing

2:11:36

authority in New York says,

2:11:38

well, not exactly because when

2:11:40

you have a liquor

2:11:43

license, there are caveats covenants, things

2:11:45

you agree, including being open to the public, you can't

2:11:47

have a private liquor license. So

2:11:51

there is some question. In

2:11:54

fact, New York State Attorney General, attention

2:11:57

investigating New York

2:12:00

State legislatures

2:12:01

have introduced a bill that

2:12:03

would ban face recognition in sporting events.

2:12:05

And now

2:12:06

the liquor authority of New York State

2:12:10

Liquor Authority SLA

2:12:12

is saying your liquor license

2:12:14

is in jeopardy.

2:12:15

Dolan gave

2:12:16

an interview Thursday a a

2:12:18

fiery. I'm told I didn't watch an

2:12:21

interview with Fox

2:12:22

five, channel five in New York.

2:12:26

In which he defended his family's

2:12:28

right to block anybody. We don't like

2:12:30

from coming in. And of course, Manchester

2:12:32

Square Garden is the home of the

2:12:34

Rangers hockey team. a couple Now we've mentioned earlier,

2:12:37

you don't wanna get on the bad

2:12:39

side of lawyers. I'd say,

2:12:42

well, sue your ass.

2:12:45

And I think there probably will be

2:12:47

some lawsuits. Dolan says, well, alright liquor authority. You watch, I'm gonna I'm gonna pick a

2:12:49

day, and we're not gonna serve any any

2:12:51

beer at a Ranger's

2:12:55

game, and then see how you feel. And I think he said he

2:12:57

was gonna be of the phone

2:12:58

number or of the like, her authority. Yeah.

2:13:02

You called him. Yeah.

2:13:03

He docked

2:13:03

him. He actually gave out the number

2:13:05

on the TV. It seems

2:13:07

incredibly petty and a

2:13:10

great way to get bad pulled us today without really accomplishing

2:13:12

much of anything.

2:13:14

So I

2:13:15

understand why Taylor might do this

2:13:17

at her concerts. In fact, it's sad

2:13:19

that

2:13:19

she has

2:13:20

to. But I understand why. I don't

2:13:22

think James Dolan really has to block lawyers from

2:13:25

coming into Rangers

2:13:28

games.

2:13:28

No. No. I mean, I think it's

2:13:30

one thing to be like, okay. We have I mean, she's had people, like, show up in her house. Like

2:13:32

-- Yeah. -- when she's not there

2:13:33

and, like, take showers and I

2:13:35

mean, it's awful. And she has very

2:13:39

serious mentally disturbed people after her. Totally

2:13:41

get that. It's been another thing to

2:13:43

be like, oh, you work at a

2:13:45

law firm that's involved in litigation with

2:13:47

my and so you're banned from entering the

2:13:47

premises. I mean, a,

2:13:50

that's

2:13:50

really concerning that you have,

2:13:52

like, the facial data of everybody who

2:13:54

works at law firm. Like, that's that's

2:13:58

concerning right there. And then b, it's like, really, really so humans can see the rockets.

2:14:00

Like, what what does that

2:14:02

have to do with anything? Here

2:14:06

is I I we don't have to zoom

2:14:08

in on this, but here's a

2:14:10

little thumbnail from

2:14:11

YouTube. Of Dolan holding up

2:14:13

the name, of the SLA's chief

2:14:16

executive and his phone

2:14:18

number and his email

2:14:21

and his picture saying,

2:14:23

I'm gonna put this wherever we sell

2:14:25

alcohol. I'm gonna put this up

2:14:27

in the in

2:14:30

the

2:14:30

stadium.

2:14:31

Can't imagine all that many people sitting with him.

2:14:34

Wow. No. No. No.

2:14:38

New York State senator who represents the part of

2:14:40

Manhattan that that Madison Square Garden is

2:14:42

in described Dolan's interview according to

2:14:45

the Washington Post as a public meltdown

2:14:47

called him the poster child of privilege who

2:14:49

he receives,

2:14:50

and this is

2:14:51

an important point, a forty three

2:14:53

million dollar a year tax break

2:14:55

from New Yorkers. As is

2:14:57

often the Laporte these

2:14:59

big sports

2:14:59

venues. Sometimes

2:15:00

face recognition gone wrong. Sometimes

2:15:02

most of the time. Again, Taylor

2:15:06

Swift seems to me. The only actual legitimate

2:15:08

use of this because you

2:15:11

gotta protect Tay

2:15:12

Tay. I'm sorry. That's just, you

2:15:14

know, not okay. She's alright. She's Right? Yeah.

2:15:16

Yeah. I

2:15:17

think so. Yeah. It's a

2:15:20

good

2:15:21

movie. I liked it. I enjoyed it. That's

2:15:23

the thing now. Everybody has to do this. Selena Gomez,

2:15:25

god, god, I think I don't know if

2:15:27

she stepped my daughter

2:15:29

up, but started it. Didn't she? And then, Gaga Yeah. She

2:15:31

really did it. It was

2:15:32

Ruth or Dyer. Yeah. Ruth or Dyer?

2:15:33

With Warren Beatty hanging around the dressing room.

2:15:36

Same. What are you doing tonight? You're going

2:15:38

you wanna

2:15:38

go out after the show? You wanna wanna

2:15:41

have a have a drink. You

2:15:43

wanna hang out? Alright. One more break. Then we are going to wrap

2:15:45

this puppy up.

2:15:48

But is such an important

2:15:50

advertiser. I wanna tell everybody you gotta

2:15:51

get BitWarden. BitWarden

2:15:52

is my choice for a password manager.

2:15:54

I know a lot of you followed

2:15:59

our advice. I'm sorry. And

2:16:02

what with the other

2:16:04

guys? That hasn't ended up

2:16:06

so

2:16:06

well. We didn't know, honest.

2:16:08

If you're looking for a better

2:16:10

password manager, can I say, in my experience, open

2:16:16

source is always the way to go with

2:16:18

anything like this because you know exactly what's going on? If at any

2:16:20

point you don't like it, you

2:16:22

can fork it in fact, it warden.

2:16:25

A lot of people run their own

2:16:27

server with the BitWORD and Vault, so it's not on BitWORD and Vault. You can do that with your your your

2:16:29

your individual account.

2:16:32

That's awesome.

2:16:34

And BitWarden has its own server software,

2:16:37

but there's a beautiful rust fork of it called VaultWarden you can run if you

2:16:39

don't wanna run that. That's the beauty of open source. Bitt

2:16:45

Wharton is the only

2:16:47

open

2:16:47

source cross platform, password manager

2:16:50

you can use at

2:16:52

home, At work or on the

2:16:54

go, it's trusted by millions. Steve Gibson,

2:16:56

I

2:16:57

think he knows BitWORD is a sponsor,

2:16:59

but I know Steve is a pretty

2:17:01

independent thinker. He was the guy who turned us on the last past in the beginning. He

2:17:03

moved off last

2:17:06

past a bit warden as well. I've been a

2:17:09

bit warden for several years. We had been using LastPass Enterprise.

2:17:11

We are moving now to BitWarden Enterprise. Russell

2:17:15

started that process this week. I'm

2:17:17

really I'm really excited about this.

2:17:19

And I love it because BitWarden lets you be an individual. I have my individual account,

2:17:23

but you can also have an enterprise

2:17:26

account. Now let me explain, of course, you wanna know this all your data in BitWarden's

2:17:31

Vault is end and encrypted. They don't

2:17:33

have access to

2:17:34

it. Not just the passwords, but unlike some other companies, all the metadata,

2:17:38

the sites you

2:17:40

visit, when you visited them, all

2:17:42

that stuff is

2:17:43

encrypted. Just like your passwords. That's really important. And of course, BitWarden doesn't track your

2:17:45

data in the mobile app.

2:17:48

All it does is crash

2:17:50

reporting. If you don't like that,

2:17:53

This is where open source is beautiful.

2:17:55

Get the f word

2:17:56

installation. You won't even have that. BitWarden's open

2:17:58

source, it invites anyone to review library

2:18:01

implementations at any time on GitHub. You

2:18:03

can review their privacy policies at bit warden dot

2:18:05

com slash privacy. You can protect your personal data in privacy. You

2:18:07

can add security or passwords. Use

2:18:11

BitWarden to generate strong, randomly generated passwords

2:18:13

for every account. If you go to the BitWarden site, you'll see they have a

2:18:15

password strength meter You

2:18:19

could try out your passwords there

2:18:21

safely, see how strong it is. They also have, and I love this feature, a

2:18:24

username generator. So,

2:18:28

you know, when you create an account, you use

2:18:30

your email and a password. Well, what

2:18:33

if the email you used was

2:18:35

completely unique? And never used before and

2:18:37

will never use again. That's what

2:18:39

the that's what the username generator does. It generates unique usernames for every stores them,

2:18:44

And you of course, you still wanna get those

2:18:46

recovery emails. So what they do is

2:18:49

they work with five the big

2:18:51

five integrated email alias

2:18:53

services. Our our other sponsored fast mails, one of them, simple

2:18:55

log in, a non addy, Firefox

2:18:59

Relay. They just added duct duct

2:19:01

co. So you still get the email, but you use an obfuscated address

2:19:03

so the company doesn't have

2:19:07

your address. This is a great way

2:19:09

to increase the security. And to make sure that every

2:19:11

single login is unique and is

2:19:14

never used again.

2:19:15

Keep your

2:19:16

main email address out of the

2:19:18

databases too. Right? And that's an I I do that too. I think it's a great reason

2:19:24

to use it. Integrates beautifully with BitWarden

2:19:26

and those services And your business, very We're customizable adapts

2:19:29

to your business needs.

2:19:31

There's a team organization's

2:19:33

plan that's three dollars

2:19:35

per month per seat. There's

2:19:38

an

2:19:38

enterprise organization plan that's the one

2:19:40

we're going to, five dollars a

2:19:42

month per seat. It's great. You could share data

2:19:45

privately with coworkers

2:19:47

across departments or the entire

2:19:50

company, people share passwords. We know

2:19:52

that. They write it on a piece

2:19:54

of paper and they say, here, Marie, here's

2:19:56

the password to the, you know, the WiFi.

2:19:58

No. Don't do that.

2:20:00

Use BitWarden. You can securely share

2:20:03

those passwords. And if you've

2:20:05

got a BitWarden individual account

2:20:07

as I

2:20:08

do, it's very easy to integrate your individual

2:20:10

account with the organizational account

2:20:12

without crossing that barrier, so your password is still separate, but

2:20:14

you only have one login and there's all your passwords.

2:20:18

There's also, of course, the basic

2:20:20

free account, free forever, unlimited passwords. I think

2:20:22

the ten dollars a year for premium is worth it just to support

2:20:24

BitWarden.

2:20:28

I'm a big fan. I've been doing that

2:20:30

for a couple of years. There's a family option: up to six users,

2:20:36

total cost three thirty-three a month, three dollars and thirty-three

2:20:38

cents a month. I think that's worth it

2:20:41

as well. And, of

2:20:42

course, it's

2:20:44

so easy to import from any other

2:20:46

password manager.

2:20:47

Export out of

2:20:48

it, import into BitWarden. I

2:20:50

hear from everybody: well, that

2:20:52

was

2:20:53

easy. That was easy. The

2:20:54

only hard part is changing all those passwords

2:20:57

that

2:20:57

that other company let out

2:20:59

into the open. BitWarden, trusted

2:21:01

by millions of individuals, teams, and

2:21:04

organizations worldwide. It's the only open

2:21:06

source, cross-platform password manager you

2:21:08

can use at home, on the go, or

2:21:10

at work. They've got a command line version for Linux. That

2:21:14

open source is a beautiful

2:21:16

thing. Get started with a free trial

2:21:18

of a Teams or enterprise plan, or get started for free across all devices as an individual user

2:21:20

at bit

2:21:25

warden dot com slash twit.

2:21:27

I think the world is converging. I think the world has said, you know what? This is the way to go. Open

2:21:31

source baby.

2:21:35

Bitwarden

2:21:35

dot com slash twit. Highly

2:21:37

recommended. We thank

2:21:39

them so much for

2:21:41

supporting our show. When BitWarden called, I

2:21:43

said yes. Yes.

2:21:46

Yes. I will do your ads.

2:21:48

I will happily do your

2:21:50

ads. Hey, we had a lot of fun this week on TWiT. And you know what we did? Because we were so worried

2:21:53

some of you might

2:21:55

have missed some of

2:21:57

the exciting moments. We've

2:21:59

made this mini movie. For your consideration.

2:22:01

Watch. I hereby verify that I,

2:22:03

Leo Laporte, authorize Descript to create

2:22:05

an overdubbed version of my voice. Why

2:22:07

do you want this,

2:22:10

Anthony. I do. What the

2:22:12

hell? What the hell? Well, hey. Hey. Hey. It's

2:22:14

AI Leo

2:22:15

Laporte, the AI tech guy.

2:22:17

This week

2:22:20

on TWiT: MacBreak Weekly. Jason

2:22:22

has his reviews of the new Apple hardware. We'll talk about that. It's our first

2:22:26

Apple silicon,

2:22:29

you know, boring speed bump

2:22:31

release. Not that they're bad;

2:22:33

they're remarkable computers, they're just

2:22:35

not particularly new. They're just

2:22:38

what you expect from

2:22:40

last year or from

2:22:42

two years

2:22:43

ago, I guess, except

2:22:45

faster. This Week in Google: Richard Hsieh is here. He

2:22:47

is the face

2:22:50

of all of those layoffs. Google

2:22:53

cutting twelve thousand jobs. I had breakfast with

2:22:55

a friend at Google and asked why it happened. And he said that his

2:23:00

boss has hundreds of employees and didn't

2:23:02

know. Yeah. My boss, I mean, I just had a meeting with him on Tuesday, and there was no inkling

2:23:04

of anything like this on the

2:23:07

horizon. Right? So the

2:23:09

decision was made on a whole

2:23:12

other level. Tech News Weekly:

2:23:14

the Taylor Swift saga continues.

2:23:16

Live Nation, Ticketmaster, and a

2:23:18

whole lot of angry

2:23:20

Swifties. When Senator Blumenthal says

2:23:22

that Ticketmaster needs to look in the

2:23:24

mirror and say, it's me. I'm

2:23:26

the problem, it's me. Like, you don't expect

2:23:29

those kinds of jokes to happen

2:23:31

at a senate hearing. There are

2:23:33

die-hard Swifties that are watching this hearing who aren't

2:23:35

people who are typically like, yeah, let

2:23:38

me tune in to

2:23:41

a senate dot gov

2:23:44

slash

2:23:44

whatever. Ron. It's

2:23:45

me. Hi. I'm

2:23:48

the real

2:23:51

Leo. It's me. Okay. It still has a

2:23:53

ways to go, I think. But that was

2:23:56

that was pretty good. Thank you to Anthony Nielsen who

2:23:58

snuck in here and said, just read this if you don't

2:24:00

mind. And,

2:24:02

boy, that's

2:24:04

scary. That's terrifying. So Christina, it's completely coincidental, but

2:24:06

not one, but two Taylor Swift stories.

2:24:11

In what episode is

2:24:12

-- Go. --

2:24:13

just for you. Which is very exciting. Just for me. I should point out

2:24:15

that I'm gonna put a link, both in the Discord and the IRC. I

2:24:20

found this this week. Thanks to Mastodon. Thanks to

2:24:23

Jeff Atwood from Coding Horror, who shared it:

2:24:26

there's a mashup of Where Is My

2:24:29

Mind from the Pixies and Anti-Hero from

2:24:31

Taylor Swift, and then it's edited to include both

2:24:35

Fight Club and the Anti-Hero

2:24:37

music video. A, the bass line fits perfectly. It's one

2:24:39

of the best, like, mashups I've

2:24:43

heard in a long time. And b,

2:24:45

the video editing is superb. So if you're a fan

2:24:47

of Fight Club and Taylor Swift, which

2:24:50

I know is a Venn diagram, which

2:24:52

might just be me, it's everything you

2:24:54

ever wanted to

2:24:55

realize. It's fantastic. Small

2:24:56

group of people.

2:24:58

But it's

2:25:00

so good. There was

2:25:03

actually, we didn't do it as a story, but Alex Lindsay sent me a link to

2:25:08

a Billy Eilish song that somebody used

2:25:11

AI to replace Billy Eilish's

2:25:15

voice with Ariana Grande's

2:25:17

voice. Have you seen that? I probably can't play

2:25:19

it. I probably

2:25:22

shouldn't play it. Let me

2:25:24

see if I can find

2:25:27

the link. It's Happier Than Ever. Should

2:25:33

I play it? Now, do I get taken down if I play

2:25:35

a Billy Eilish song with Grande singing it?

2:25:38

I just

2:25:40

don't think so.

2:25:43

Who would sue?

2:25:44

Hang on. Let me

2:25:46

ask my AI lawyer over

2:25:48

here.

2:25:49

What's that?

2:25:50

Yes. Hang on. It's

2:25:52

actually interesting because it gives

2:25:54

you

2:25:55

some idea of what could

2:25:57

be done. Let me play it. What

2:25:59

could possibly go wrong.

2:26:09

Is that weird? Because

2:26:11

I don't

2:26:12

know the Eilish version. Oh

2:26:14

my

2:26:15

god. No. I do.

2:26:17

This is amazing.

2:26:18

It really sounds like Ariana

2:26:21

Grande. It really does.

2:26:22

Ariana never sang those words,

2:26:26

and I guess they just

2:26:29

took the Billy

2:26:30

Eilish audio and applied Ariana Grande's prosody to it, or something like that.
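For what it's worth, swaps like that come from trained voice-conversion models, which re-synthesize one singer's performance with another's timbre; there's no two-line version of that. The closest crude stand-in is a plain pitch shift, sketched here with the librosa library; the file names are hypothetical.

    import librosa
    import soundfile as sf

    # Load the original vocal track (file name is hypothetical).
    y, sr = librosa.load("vocals.wav", sr=None)

    # Shift up four semitones. A real voice-conversion model would
    # instead re-synthesize the audio with the target voice's timbre,
    # keeping the original phrasing and prosody.
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=4)

    sf.write("vocals_shifted.wav", shifted, sr)

That naive shift just makes a voice higher or lower; it does not make Billy Eilish sound like Ariana Grande, which is exactly why the model-based results are so striking.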

2:26:36

Go ahead, YouTube, sue me. I

2:26:38

did ask the AI lawyer, and it says

2:26:40

it is possible to get sued for

2:26:42

playing an AI revision of a song

2:26:45

on YouTube if the revision infringes on

2:26:47

someone's copyright. Copyright laws vary by country, but in general, creating an AI

2:26:51

revision of a song that incorporates substantial

2:26:53

parts of the original song without permission

2:26:55

could be considered copyright

2:26:55

infringement. So go after this hero guy, not

2:26:57

me. Okay? He's

2:26:59

a guy. He's

2:27:02

a guy who did

2:27:04

this.

2:27:05

That's kinda wild. I think we're gonna

2:27:07

see a lot more of this. You

2:27:09

know, I'm happy because I was tired

2:27:10

of saying things like Elon Musk ruins Twitter again.

2:27:14

I'm looking forward to talking more

2:27:16

about what AI can

2:27:17

do. AI can

2:27:18

ruin it from now on. Yeah. Let AI ruin

2:27:20

it. Nobody

2:27:23

will defend AI. I'm

2:27:25

guessing. Tim Stevens, I appreciate all you do, and I'm so

2:27:27

glad that you have landed

2:27:31

successfully at Substack, tim stevens dot

2:27:34

substack dot com. Now we gotta get you to write more for it. Right? How long have you been doing

2:27:36

it?

2:27:40

I I lost a couple weeks after I left

2:27:42

CNET, so I've been trying to do about a

2:27:45

post a week, give or take. But, yeah, that's

2:27:47

really just kind of a place for me to

2:27:49

air my thoughts, that kind of thing. You can definitely check me out on

2:27:51

Jalopnik, Road & Track,

2:27:53

Motor Trend, TechCrunch, a bunch of other places

2:27:55

where I've been really fortunate to have a lot

2:27:58

of great

2:27:58

assignments. And there's good stuff coming up too, so I'm really pleased. It's always great to

2:27:59

see you. Sorry you

2:28:03

didn't get to do any

2:28:05

ice racing this year. Yeah. That's okay. Alright.

2:28:07

Do any ice fishing? That's the question. None of that either. No.

2:28:11

Okay. Thanks, Tim. I appreciate it.

2:28:14

Christina Warren, oh, it's a pleasure to see you. Thank you so much for bringing your shoes, your tiny

2:28:20

feet, and your brilliance to this

2:28:23

show. Thank you so much for having me. I'm sorry if we're having audio problems,

2:28:27

but this has been great. It's been great

2:28:29

to be on with Harry and Tim, and

2:28:31

always love talking about stuff. Always love being on TWiT. Yeah, a

2:28:36

senior developer advocate. The senior developer

2:28:38

advocate at

2:28:39

GitHub. Mastodon dot social, at film underscore girl, our newest

2:28:41

mastodon owner. And Tim

2:28:43

Stevens is on mastodon

2:28:45

social as well, Tim

2:28:47

Stevens at mastodon dot social.

2:28:49

Harry McCracken, you're also on

2:28:51

Mastodon, but you're on the San Francisco Bay

2:28:53

Area Mastodon: SFBA dot social?

2:28:56

That's

2:28:56

awesome. Slash

2:28:59

Harry McCracken. Technologizer, global tech editor

2:29:01

at Fast Company. Can I plug my newsletter again? Yes. I have a new newsletter called

2:29:04

Plugged

2:29:04

in. You

2:29:09

can either go to fast company dot com and click on the

2:29:11

hamburger menu and look for newsletter or

2:29:13

just Google Fast Company newsletter and you'll see

2:29:15

how to sign up and it comes out

2:29:18

every Wednesday

2:29:18

morning. And I really enjoy it. I

2:29:20

mean, I've always enjoyed your writing

2:29:22

because, and this is the thing that's great

2:29:25

about

2:29:25

you, you have

2:29:26

a unique and, I think, well-informed take on what's

2:29:28

going on in tech. You

2:29:31

know, you've been doing this a long time.

2:29:33

A try at least. And you have a

2:29:35

voice. You know? I mean, who else would write Big Tech layoffs stink as

2:29:42

a headline? Or If Macs get touchscreens, Apple's age of intransigence really is over.

2:29:44

How many writers

2:29:46

do you

2:29:47

know who've used the word

2:29:50

intransigence in a sentence?

2:29:52

Burn. I'm sure if I asked Grammarly,

2:29:54

like, they would have told me no,

2:29:56

and people don't know this

2:29:57

word. I like it. If

2:29:58

ChatGPT doesn't get a better grasp of facts, nothing else

2:30:02

matters. I agree.

2:30:03

And nine tech products

2:30:05

you found essential in twenty

2:30:08

twenty two. All of

2:30:10

that and more at the new Plugged

2:30:12

In newsletter at Fast Company, fast company dot

2:30:14

com; just look for Plugged In in the

2:30:18

hamburger menu. And thanks for

2:30:20

bringing Marie. It's great to

2:30:22

see you, Marie. I appreciate

2:30:23

it. We thank all

2:30:25

of you for joining us. We do

2:30:28

this show every week, two PM Pacific five

2:30:30

PM Eastern, twenty-two hundred UTC, on a Sunday afternoon. It's the best way to spend your Sunday with us.

2:30:35

If you wanna watch it live, it's at live

2:30:37

dot twit dot tv. If you're doing that, join us in the chat room, IRC

2:30:39

dot twit dot tv. All you need is a browser. Or

2:30:44

if you have an IRC client, if you're an

2:30:46

old school kind of person, you could also

2:30:49

use that. We have a discord. Thanks to

2:30:51

our fabulous Club TWiT members. Club TWiT

2:30:53

is seven bucks a month and gives us a little bit

2:30:55

of a financial boost, which

2:30:58

these days we kinda need, but it

2:31:00

also gives you ad free versions of

2:31:02

all of our shows, access to the Discord, where you can find all sorts of

2:31:04

things. Oh, this is the

2:31:06

Fight Club thing. I might

2:31:08

play this after the show

2:31:10

so we don't get taken down.

2:31:13

Christina. Absolutely. But

2:31:14

it is very good. Put that in

2:31:16

there. See, she's in our discord. You

2:31:18

also get shows that we don't normally put in

2:31:21

the regular feeds, like

2:31:23

Micah Sargent's hands on Macintosh,

2:31:26

Paul Thurrott does Hands-On

2:31:28

Windows. Coming up in a couple

2:31:30

of weeks, Huyen Tue Dao's fireside

2:31:32

chat. She's, of course, the host

2:31:34

of All About Android. February tenth, Daniel Suarez joins us. His new book comes out in just a

2:31:40

couple of days, and we will be

2:31:42

talking about Critical Mass with Daniel. And if you're in the club, you'll get to ask him questions directly. So

2:31:48

that's great. Sam Abuelsamid, our car guy,

2:31:50

we'll be talking March second. Stacey's Book Club, we've decided on a book:

2:31:55

Sea of Tranquility. Oh, look, Victor's

2:31:57

gonna do an Inside TWiT chat, one of our

2:31:59

favorite editors, Victor Bognaldo doing that. So

2:32:01

Ant Pruitt, our

2:32:03

community manager, and we put together

2:32:05

a lot of events. It's

2:32:08

kinda like I don't know. It's like the ninety

2:32:10

second Street Y for the Internet. You know, come

2:32:12

on by. Join the club seven bucks a month.

2:32:14

Look at all you get. Twit dot tv slash

2:32:16

club twit. Thank you so much for

2:32:19

your support. Thank you all for being

2:32:21

here. We'll see you next time.

2:32:23

Another TWiT is in the

2:32:25

can.

2:32:26

It is. Bye bye.

2:32:29

Amazing. Doing the TWiT. Alright. Doing

2:32:31

the TWiT, baby. Doing the

2:32:35

TWiT. Alright.
