Mark Zuckerberg on Threads, the future of AI, and Quest 3

Released Wednesday, 27th September 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


1:53

of

2:00

it. There's new products and

2:02

new ideas. Was that your sense of it?

2:05

Yeah, absolutely. Marcus had

2:07

quite a shift, I would say, in

2:10

the public perception of him over the last year,

2:12

thanks to all of his cage fighting

2:14

and billionaire CEO

2:16

feuds, and just generally leaning into

2:18

the product stuff at the company. And

2:21

you can tell that that really fires him up.

2:23

So there's a lot of news out of Meta, and there's actually some

2:25

hardware, and then obviously you talked about Threads, but tell us about

2:28

the hardware first. Yeah, the company finally debuted

2:30

its Quest 3 headset, which is the successor

2:32

to the Quest 2, and its next pair

2:34

of smart glasses with Ray-Ban that

2:37

have some extra AI sprinkled

2:39

in. And we also talked about really what

2:41

I think is probably the biggest news, which is that Meta

2:44

is releasing its own ChatGPT

2:46

competitor, and a bunch of other AI

2:48

assistants across WhatsApp, Instagram,

2:50

and Messenger. And I think given Meta's

2:53

pretty much unrivaled scale in terms

2:55

of users, it's a big moment for the AI industry

2:58

that will probably introduce a lot of people

3:00

to these kinds of agents for the first time. AI

3:03

is very much the cutting edge of technology. Obviously

3:05

Meta has a huge investment in the

3:07

Metaverse, the Quest 3 is also

3:09

sort of on the cutting edge of technology there. But

3:11

then you talked a lot about threads, which

3:14

is a competitor to X, the

3:16

company formerly known as Twitter, and

3:19

decentralized social media. These are kind of new

3:21

riffs on older ideas, but

3:24

he was really into it. Yeah,

3:26

he really got in depth about threads and

3:28

competing with Twitter and decentralized

3:31

social media and where that's going, which he's

3:33

never really talked about publicly. We

3:35

also spent some time talking about AI regulation.

3:38

He has some interesting thoughts there around open source.

3:41

We got into that big Senate hearing that recently

3:43

happened with Chuck Schumer and a bunch of other tech

3:45

CEOs. What's struck me most about this

3:47

entire conversation is for

3:50

the past few years, Mark has been a statesman,

3:52

right? He's acted like a politician. He's

3:55

determining whether posts on Facebook will stay up or

3:57

come down. He's been yelled at by

3:59

Congress. And in this interview, he's

4:02

really lit up when he's talking about

4:04

two things, building new products and

4:07

mixed martial arts. Yeah,

4:09

it's really interesting to hear where his head is at these days.

4:12

He's in a very different spot than he

4:15

was even just last year. We talked

4:17

at the end about kind of his

4:19

personal reflection on leading such

4:21

a big company being kind of the last founder

4:23

of his era that's still running one of these huge companies.

4:27

And of course, I did have to ask him about the cage

4:29

match and if he'll actually ever fight Elon. All

4:31

right, well, we got to get to that right away. Here's Mark

4:34

Zuckerberg, CEO of F

4:52

this is like a post fight interview

4:54

in Las Vegas right outside of the octagon

4:57

after you get out of a fight with Elon.

5:00

Maybe next year. Maybe next year. Not

5:02

Elon, but I want to keep competing, but

5:04

I just need to find someone to actually- Do you think he was ever

5:06

serious about fighting? I don't know. You'd

5:10

have to ask him. But I don't know. I mean, this is like a

5:12

thing that I just, I've really enjoyed doing it as a sport. So,

5:15

for me, there's a sort of level of like

5:19

it's competition. It's a sport. And

5:22

so, I mean, I love doing it. I trained with a bunch

5:24

of guys and I definitely

5:27

want to compete more, but we'll

5:29

see. Are there any other techs,

5:31

you know, rivals you would want to

5:34

fight if you could or kind of

5:36

move on from that? I think it'll be more fun to fight someone

5:38

who actually fights, takes it seriously.

5:41

Yeah. Yeah. So,

5:43

this is like they're- Yeah. Settling tech

5:46

business rivalries by combat. You don't think that's going

5:48

to become like a thing now? I don't

5:50

think so. I think that's not generally

5:53

the direction that our society is heading. Probably

5:56

for the best. Probably is for the best. Yeah. I

5:58

think a little bit of a- of a channel

6:00

to get some aggression out is good. I

6:04

think the one that was proposed

6:06

with Elon could have been fun, but it's okay. So

6:08

you're not ruling it. So I guess what I'm saying is if he

6:10

told you, if he came back to you and said, I'll fight on your

6:12

terms, you pick the venue, would

6:15

you still do that? I don't think it'll

6:17

happen. I don't think it'll happen. Okay,

6:20

fair, I agree with you. Yeah. I

6:23

just think that it's like, there's sort

6:25

of a valorization where people look at the stuff

6:28

and are like, oh, I could do that, but I mean, you

6:30

have to train. It's

6:32

very technical, it's very fun, very intellectual.

6:35

I mean, I used to, when I was a lot

6:37

younger, I used to fence competitively. And like

6:40

a lot of the striking aspects, I mean, obviously

6:42

it's different because in fencing, you're playing for points,

6:45

right? So when you get a touch, the point

6:47

is over, and the sequence is done, whereas here

6:49

you have to worry about being countered and all that. But

6:51

it's very intellectual. I

6:54

just, I really enjoyed thinking

6:57

about all the different combos and moves

6:59

and all that. And there's a period

7:01

where you're ramping up and learning

7:03

all the basic stuff before

7:05

you can really get to the intellectual part of it. But

7:08

once you're there, I don't know, it's super fun. I love

7:10

doing it with friends. So your mind doesn't just shut

7:13

off when you're doing it. You actually find it to be

7:15

mentally stimulating? Yeah. Interesting. Last

7:18

year, I asked you if you had any advice

7:20

for Elon as he was about to take over Twitter.

7:22

A lot has happened in a year. I'm not gonna ask

7:24

you to give him advice, but

7:27

a lot has changed in a year. You've got Threads out now.

7:30

And I'd love to get into

7:33

why you did Threads when you did

7:35

and the approach that you took and kind

7:37

of when you made that decision. Cause it seemed like it happened

7:40

pretty quickly. I think the aspiration

7:42

of Twitter, right? To build this text-based

7:46

discussion app, it should

7:48

be a billion person

7:50

social app. I mean, there are certain kind of fundamental

7:53

social experiences that, you know,

7:55

I look at them and I'm just like, okay, like if I were

7:58

running that, I could scale that to... reach

8:00

a billion people. And that's one of the reasons why

8:02

over time we've done different acquisitions and

8:04

why we've considered them. You tried to buy Twitter

8:06

way back in the day, right? Like many,

8:08

many years ago. Yeah. I mean, we had conversations.

8:11

I think this was, gosh,

8:13

this was like, I think when Jack was leaving

8:16

the first time. And look, I get it.

8:18

I mean, different entrepreneurs have different goals for

8:20

what they want to do and some people want to run their companies independently.

8:22

And that's cool. I mean, it's good that there's sort of

8:24

a diversity of different firms. But I

8:27

guess Twitter was sort of plodding along

8:29

for a while before Elon came. And

8:32

I think the rate of change in the product

8:35

was pretty slow, right? So it just didn't seem like

8:37

they were on the trajectory that would maximize

8:39

their potential. And then with

8:41

Elon coming in, I think there was certainly

8:43

an opportunity to change things up.

8:46

And he has, right? He's definitely

8:48

a change agent, right? And I think it's

8:50

still not clear exactly what trajectory it's on.

8:53

But I do think he's been pretty polarizing. So I think that

8:55

the chance that it sort of reaches the

8:58

full potential on the trajectory that

9:00

it's on is, I guess, I'm probably less

9:02

optimistic or just think there's less of a chance now than

9:04

there was before. But I guess just watching

9:07

all this play out just kind of reminded me

9:09

and rekindled the sense that

9:11

someone should build a version of

9:13

this that can be more ubiquitous. A lot

9:15

of the conversation around social media is

9:17

around sort of like information

9:20

and the utility aspect. But I

9:22

think an equally important part of designing

9:25

any product is how it makes you feel. Right?

9:27

What's the kind of emotional charge

9:29

of it? And how do you come away from that feeling? I

9:32

think Instagram is generally kind of

9:34

on the happier end of the spectrum. I think Facebook

9:36

is sort of in the middle because it has happier moments,

9:38

but then it also has sort of

9:41

harder news and things like that, that

9:43

I think tend to just be more critical

9:45

and maybe make people see

9:47

some of the negative things that are going on in the world. And

9:50

I think Twitter indexes very strongly on just

9:53

being quite negative and critical. I think

9:55

that that's sort of the design. It's not that the designers

9:57

wanted to make people feel bad. I think they wanted to have like

10:00

maximum kind of intense debate, right?

10:02

And I think that that sort of creates a certain

10:05

emotional feeling and load and

10:07

I just thought you

10:09

could create a discussion experience that

10:12

wasn't quite so negative or

10:14

toxic and I think in doing so it would

10:16

actually be more accessible to a lot of people. I

10:18

think a lot of people just don't want to use

10:20

an app where they

10:23

come away feeling bad all the time, right?

10:25

I think that there's a certain set of people who will either tolerate

10:27

that because it's their job to get access to that information, or

10:29

they're just warriors in that way. Yeah, they like

10:32

they want to be a part of that kind of intellectual combat. But

10:35

I don't think that that's the ubiquitous thing, right?

10:37

I think the ubiquitous thing is like they

10:39

want to get fresh information. I think there's a place for text-

10:42

based, right, even when the world

10:44

is moving towards richer and richer forms

10:47

of sharing and consumption. I think the

10:49

text isn't going away. It's not gonna be the big thing,

10:52

but I think how people feel is really important. So that's

10:54

been a big part of how

10:56

we've tried to emphasize and develop threads

10:58

and, you know, over time. You know, if you want it to be ubiquitous,

11:01

you obviously want it to be welcoming to everyone. But I think how

11:04

you see the networks and the culture

11:06

that you create there I think ends up being pretty important

11:09

for how they scale over time. Or with Facebook,

11:11

you know, we started with this real-name culture

11:13

and it was grounded to your college email address

11:15

and, you know, now it obviously hasn't been grounded to your college

11:18

email address for a very long time, but the kind

11:20

of real, authentic identity

11:23

aspect of Facebook has continued and continues

11:25

to be an important part of it. So I think how we

11:27

set the culture for Threads early on in terms

11:30

of being a more positive friendly

11:32

place for discussion will hopefully

11:34

be one of the defining elements for you

11:36

know, the next decade as we scale

11:38

it out. We obviously have a lot of work to do, but I'd say it's off

11:40

to quite a good start. I mean,

11:43

obviously there's the huge spike and then, right, you know, not everyone

11:45

who tried it out originally is gonna stick around immediately

11:48

But I mean the monthly actives and weeklies I

11:50

mean, I don't think we're sharing stats on it yet, but

11:52

you can tease, like, it's good. No, I mean, I

11:55

feel I feel quite good about it really about

11:57

about that and because there's been the reporting out there

11:59

that

13:35

You

14:00

guys aren't necessarily trying to emphasize news in

14:02

this experience, which is a whole another

14:04

topic really, but how do you

14:06

get that Twitter-like,

14:08

this is what's going on right now feeling? I think

14:10

it's a thing that we'll work on improving.

14:13

But, I mean, hard

14:15

news content isn't the only fresh

14:18

content. Sure. I think even within news,

14:20

there's a whole spectrum between hard critical

14:22

news and people understanding

14:26

what's going on with the sports that they follow, or

14:28

the celebrities that they follow, or things like that. And

14:31

a lot of those things don't leave people

14:33

with the same ... It's not as cutting as

14:35

a lot of the hard news, and especially the political discussion,

14:38

I think is just so polarized. Yeah, and

14:41

I think it's hard to come away from reading news

14:43

about politics these days feeling

14:46

good. Yeah, but that doesn't go for everything. And

14:49

part of this overall is just how you

14:51

tune the algorithm to basically

14:53

encourage either recency

14:55

or

14:56

quality, but less recency. I'm

14:58

not sure that we have that balance exactly

15:01

right yet. It may be the case that in

15:03

a product like threads, where people

15:05

may want to see more recent content, as

15:07

opposed to something like an Instagram or Facebook,

15:10

where it's more visual and the balance might just

15:12

be towards balancing

15:14

towards maybe a little more quality, even if

15:17

it's 12 hours ago instead of two hours

15:19

ago. I think that this is the type of stuff that we need

15:21

to tune and optimize, but I think

15:24

I agree with that point. This hasn't happened yet

15:26

with Threads, but you're eventually going to hook it into

15:28

ActivityPub, which is this decentralized social

15:30

media protocol. It's complicated

15:33

in layman's terms, but essentially people run their own servers.
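
For reference, here is a minimal sketch of the kind of JSON objects servers exchange under ActivityPub-style federation; the field names follow the public ActivityStreams/ActivityPub vocabulary, while the domains and usernames are made-up examples, not real services:

```python
# Rough sketch of ActivityPub federation: each server publishes actor documents
# and delivers activities (Follow, Create, ...) to other servers' inboxes.
# Field names come from the public spec; the example domains are hypothetical.
import json

CONTEXT = "https://www.w3.org/ns/activitystreams"

# An actor document describing a user on one server, including the inbox URL
# where remote servers deliver activities addressed to this user.
actor = {
    "@context": CONTEXT,
    "type": "Person",
    "id": "https://server-a.example/users/alice",
    "preferredUsername": "alice",
    "inbox": "https://server-a.example/users/alice/inbox",
    "outbox": "https://server-a.example/users/alice/outbox",
}

# Following someone on a different server means POSTing a Follow activity
# to that remote actor's inbox.
follow = {
    "@context": CONTEXT,
    "type": "Follow",
    "actor": actor["id"],
    "object": "https://server-b.example/users/bob",
}

# A new post is wrapped in a Create activity and fanned out to followers'
# inboxes, wherever their servers happen to be hosted.
create_post = {
    "@context": CONTEXT,
    "type": "Create",
    "actor": actor["id"],
    "object": {
        "type": "Note",
        "content": "Hello from a federated server!",
        "to": [actor["id"] + "/followers"],
    },
}

print(json.dumps(create_post, indent=2))
```
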

15:35

So instead of having a centralized company run the whole

15:37

network, people can run their own fiefdoms

15:40

federated. Yeah, that's the goal. So threads

15:42

will eventually hook into this. Yeah. Because

15:44

it's the first time you've done anything, I think, really meaningful in the

15:46

decentralized social media space. Yeah, and

15:48

we're building it from the ground up. Yeah. I've

15:51

always believed in this stuff. Really?

15:53

A lot of this hasn't ... Yeah. Because you run the largest

15:56

centralized social media platform. But it didn't exist

15:58

when we got started. And I think the

16:00

project of like, I've had our team

16:03

at various times do the thought

16:05

experiment of like, all right, what would it take to move

16:08

all of Facebook onto just some

16:10

kind of decentralized protocol? It's like,

16:13

that's just not gonna happen. There's so much functionality that

16:16

is on Facebook that like, it

16:18

just, it's way too kind of complicated.

16:21

Well, the technical debt. And you've accrued all the different things. Yeah.

16:24

And it would just take so long and you'd not be innovating during

16:26

that time. And I think that there's value

16:28

in being on one of these protocols, but

16:30

it's not the only way to deliver value.

16:33

So the opportunity cost of doing this massive transition

16:35

is kind of this massive thing. But when you're starting

16:37

from scratch, you can just design it so

16:39

it can work with that. And we

16:42

wanted to do that with this because I thought that that was one

16:44

of the interesting things that's evolving around this

16:46

kind of the Twitter-competitive space is that a lot of the others are on these protocols.

16:50

And there is a real ecosystem around that. And

16:52

I think it's interesting. So what

16:54

does that mean for a company like yours long term if

16:57

people gravitate more towards these decentralized

17:00

protocols over time? Where does a big

17:02

centralized player fit into that picture?

17:04

Well, I guess my view

17:07

is that the more that there's

17:09

interoperability between different

17:11

services, and the more content

17:14

can flow, the better all the services can be. And

17:16

I guess I'm just confident enough

17:19

that we can build the best one of the services

17:22

that I actually think that we'll

17:24

benefit and we'll be able to build better

17:27

quality products by making sure that our

17:29

products can have access

17:31

to all of the different content from

17:34

wherever anyone is creating it. And like,

17:37

I get that not everyone is going to want to use everything

17:39

that we build. I mean, that's, that's obviously the case.

17:41

And it's like, okay, we have 3 billion people using Facebook,

17:44

but like, you know, not everyone wants to use one

17:46

product. And I think making it so they

17:48

can use an alternative, but can still interact with

17:50

people on the network will make it so that that

17:52

product also is more valuable, that can

17:54

be pretty powerful and you can increase

17:56

the quality of the product by making

17:58

it so that you can give people access to all

18:01

the content, even if it wasn't created on

18:03

the network itself. So, I don't know,

18:05

I mean, it's a bet. There's kind of this funny

18:07

counterintuitive thing where I just

18:10

don't think that people like feeling locked into a system.

18:13

So, in a way, I actually think

18:15

people will feel better about using our products

18:18

if they know that they have the choice to leave. And

18:20

if we make that super easy to happen, and

18:23

obviously there's a lot of competition and we

18:25

do download your data on all our products and

18:27

people can do that today, but the

18:30

more that that's designed in from

18:32

scratch, I think it really just gives

18:35

creators, for example, the sense that, okay,

18:37

I'm not, I have agency.

18:39

Yeah, yeah, so in a way,

18:42

that actually makes people feel more confident

18:44

investing in a system if they know that they have freedom

18:47

over how they operate. So, I don't

18:49

know, maybe for phase one of social networking,

18:52

it was fine to have these systems

18:54

that people felt a little more locked into, but I think for

18:56

the mature state of the ecosystem, I don't

18:58

think that that's gonna be where it goes. So, I don't

19:00

know, I'm pretty optimistic about this. And then if we can build

19:02

threads on this, then maybe we can,

19:04

over time, as the standards

19:07

get more built out, it's possible

19:09

that we can spread that to more of the stuff that we're doing. We're

19:11

certainly working on interop with messaging. And I think

19:13

that that's been an important thing. The first step

19:15

was kind of getting interop to work between

19:18

our different messaging systems. Right, so we can

19:20

talk to each other. Yeah, and then

19:22

the first decision there was, okay, well, WhatsApp,

19:25

we have this very strong commitment to encryption, so

19:27

if we're gonna interop, then we're either

19:29

gonna make the others encrypted or we're gonna have to decrypt WhatsApp. And

19:31

it's like, all right, well, we're not gonna decrypt WhatsApp, so

19:34

we're gonna go down the path of encrypting everything else,

19:36

which we're making good progress on, but that basically

19:38

has just meant completely rewriting Messenger and

19:40

Instagram Direct from scratch. So you're

19:42

basically going from a model where all the messages

19:44

are stored in the cloud, it's like you're completely

19:46

inverting the architecture, where now all the messages are stored

19:49

locally and just the way that they're- While the plane's

19:51

in the air. Yeah, yeah, so that's been sort

19:53

of this heroic effort

19:55

by just like 100 or more people over a multi-year period.

20:00

And we're basically getting to the point where it's starting to roll

20:02

out now. But now that we're

20:04

at the point where we can do encryption across

20:06

those apps, we can also start to

20:09

support more interop, which I think is gonna be the direction too. With

20:11

other services like Meta doesn't own other

20:13

messaging. Yeah, well, I mean, the plan was always to start with

20:16

the interop between our

20:18

services, and then to get to that. But

20:21

yeah, we're starting to experiment with that too. We

20:26

need to take a quick break. When we're back, Mark

20:28

and I dive into the state of AI regulation,

20:31

the open source debate and Meta's

20:33

new hardware.

20:36

Decoder is supported by Choiceology,

20:38

an original podcast from Charles Schwab. Choiceology

20:41

is a show all about the psychology and

20:43

economics behind our decisions. Each

20:46

episode shares the latest research in

20:48

behavioral science and dives into questions

20:50

like, can we learn to make smarter decisions

20:53

or what is the power of negative thinking?

20:56

The show is hosted by Katie Milkman, an

20:58

award-winning behavioral scientist, professor

21:00

at the Wharton School and author of the bestselling

21:02

book, How to Change. In

21:05

each episode, Katie talks to authors,

21:07

athletes, Nobel laureates and more

21:09

about why we make irrational choices

21:12

and how we can make better ones. Choiceology

21:14

is out now. Listen and subscribe

21:16

at Schwab.com slash podcast

21:19

or find it wherever you listen. Support

21:22

for Decoder comes from Mint Mobile. If

21:24

you're looking to switch phone plans, you don't need

21:26

to go with a big wireless company. Mint

21:29

Mobile is a cheap, convenient alternative

21:31

to the traditional telecom giants. That

21:33

might be a good fit for you. They're the first

21:35

wireless company to take their services totally

21:38

online. That means they're spending nothing

21:40

on retail, which lets them pass those savings

21:42

on to you. You can shop for and order

21:44

a phone plan from the comfort of your home at

21:46

an affordable price. For a limited

21:48

time, Mint Mobile offers plans at just $15

21:51

a month. All their plans

21:53

give you unlimited talk and text as well as

21:55

high-speed data delivered on the nation's largest

21:58

5G network. When you switch to a...

24:00

I thought that was super interesting, right? In

24:02

a way that I think sort of reflects

24:05

pretty well on our system and the intellectual curiosity

24:07

of the people who are ultimately gonna be making those

24:10

kind of legislative decisions. So that

24:12

was fascinating to see. But no, I mean, I didn't

24:14

come away, apart from seeing

24:17

their heads nod when certain people made certain

24:20

points, it wasn't a time for

24:22

us to really get their sense on where

24:24

they are. I think it was more just they were hearing the

24:27

discussion of the issues. Have you seen some of the,

24:29

I don't think it's necessarily focused at you

24:31

specifically, but the criticism that the tech industry

24:33

is performing regulatory capture right now

24:35

with AI and is essentially

24:37

trying to pull the drawbridge

24:40

up behind them. Are you worried about

24:42

that at all? I have seen that

24:45

concern and I'm

24:48

somewhat worried about it myself. I mean, look, I think that

24:50

there are real concerns

24:53

here. So I think that, I think a lot of these folks

24:55

are truly earnest in their

24:57

concerns. And I think that there is

25:00

valuable stuff for the government to do,

25:02

both in terms of protecting

25:05

American citizens from harm

25:07

and preserving, I

25:10

think what is a natural competitive advantage for

25:13

the United States compared to other countries. I

25:15

think this is just gonna be a huge sector

25:18

and it's gonna be important for everything,

25:20

not just in terms of the economy,

25:23

but there's probably defense components

25:25

and things like that. And I think the US having a lead on that is

25:27

important. I think having the government think

25:29

through, okay, well, how do we wanna leverage the fact that

25:32

we have the leading work in the world happening

25:34

here and how do we wanna kind of control

25:36

that and what restrictions do we

25:38

wanna put on that getting to other places? I think that makes sense.

25:40

So there are a bunch of concerns

25:42

there that I think are real. One of

25:45

the topics that I've spent a lot

25:47

of time thinking about is open source. Because

25:50

we do a lot of open source work at Meta, obviously

25:53

not everything we do is open source. There's a lot of closed systems

25:55

too. I'm not like a zealot on this, right? But I think

25:57

I'm probably, I lean probably a little more pro-

26:00

open source than most of the other big

26:02

companies. We believe that

26:05

it's generally positive to open source

26:07

a lot of our infrastructure for a

26:10

few reasons. One is we don't have

26:12

a cloud business, right? So it's

26:14

not like we're selling access to the infrastructure so

26:16

giving it away is fine. And then when

26:19

we do give it away, we generally benefit

26:21

from innovation from the ecosystem. And

26:23

when other people adopt the stuff, it increases

26:26

volume and drives down prices. So if you look at stuff like-

26:28

Like PyTorch, for example. Well, when I was

26:30

talking about driving down prices, I was thinking about stuff like Open

26:32

Compute, where we open sourced our server designs

26:34

and now the factories that

26:36

are making those kind of servers can generate

26:39

way more of them because other

26:41

companies like Amazon and others are ordering

26:44

the same designs that drives down the price for everyone, which

26:46

is good. PyTorch is great

26:48

because it basically makes it so that it's like the standard

26:50

across the industry as people develop with this, which

26:53

means that more libraries and modules are created

26:55

for it, which just makes it better. And it makes it better

26:58

for us to develop internally too. So

27:01

all that stuff is true and works well for

27:03

open source. And also, I think it's pretty well

27:05

established that open source software,

27:08

it's generally more secure and safer

27:11

because it's just more scrutinized. People, when

27:14

more people can see stuff, every piece of software

27:16

has bugs and issues, but the more people

27:18

who can look at it, the more you're

27:20

gonna basically identify what those

27:22

issues are and have

27:24

eyes on fixing them. And then also because

27:27

there's sort of a standard that's deployed across the

27:29

industry, those fixes get rolled out everywhere,

27:31

which is a big advantage for

27:33

safety and security. And when I think about AI

27:35

safety, I think one of the big issues if there's

27:37

like a single super intelligence and it's closed

27:40

and someone figures out how to exploit it, then like,

27:43

everyone kind of gets screwed at the same time. Whereas

27:45

in an open source system, it's like, okay, people find

27:48

issues and just like your Mac or whatever gets

27:50

patched, right? It's like people find the issues and

27:52

it just gets rolled out across the industry. So

27:55

I think that that's

27:56

generally positive, but there's

27:59

obviously this whole... debate where when

28:01

you open source stuff, I mean, we can

28:03

build in safeguards, but if you open

28:05

source something, you're not fundamentally going

28:08

to be able to prevent bad guys from taking

28:10

that and running with it too. So

28:12

there is sort of this debate around, okay, well, what's

28:15

the balance of how capable

28:17

do you want the models that are open source? And I

28:19

think that there is a real debate there. I do

28:21

sometimes get the sense that some of the folks whose

28:23

business model is to basically sell

28:26

access to the closed models that they're developing. I

28:28

do think that they have to be careful because they are also

28:30

talking their book when they're talking about dangers

28:33

of open source. I think that there are dynamics

28:35

like that that happen that I hear

28:37

either overtly or sometimes

28:40

behind closed doors, something will get back to me that's

28:42

like, oh, like this company was talking about

28:44

why they're kind of against open source.

28:47

And it's like, yeah, well, their whole business depends

28:49

on selling access to proprietary models. So I think

28:51

you got to be careful about that. So I do think the

28:53

regulatory capture thing, I think you

28:55

need to be careful about for things like that. I

28:57

think one of the big benefits of open source is

29:01

it also just decreases the cost

29:03

of adoption for small companies

29:06

and a lot of other folks. So I do think

29:09

that's going to be a big thing. Which I think Llama and

29:11

the Llama 2 release has been a big thing for startups

29:14

because it is so free or

29:16

just easy to access. And

29:20

I guess I'm wondering, was

29:22

there ever debate internally

29:24

about should we take the closed route

29:27

you spent so much time on? I mean, you spent so much

29:29

money on all this AI research, you have one of the best

29:31

probably AI labs in the world, I think it's safe to say,

29:33

like you have huge distribution,

29:36

why not keep it all to yourself?

29:38

You could have done that. Yeah, you know, the

29:41

biggest arguments in favor of

29:43

keeping it closed were generally

29:46

not proprietary advantage. Or

29:48

competitive advantage? Yeah, no, it wasn't

29:50

competitive advantage. The two

29:53

and there was a fairly intense debate around

29:55

this. And did you have to be dissuaded? Were

29:57

you like, you know, we have to have it open,

29:59

and you might. The bias was that I thought it should be open,

30:02

but I thought that there were novel arguments

30:05

on the risks, and I wanted to make sure we heard them

30:07

all out. We did a very rigorous process, and my guess

30:09

is that we're training the next version

30:11

of Llama now, and I think we'll probably have the same set of

30:13

debates around that and how we should release it.

30:16

And again, I sort of lean towards wanting to do it open

30:18

source, but I think we need to do all the red

30:20

teaming and understand the risks, and then before

30:23

making a call. But the two big

30:25

arguments that people had against

30:28

making Llama 2 open were,

30:30

one, it's just that it takes a lot of time to prepare

30:33

something to be open. Our main

30:35

business is basically building consumer

30:37

products, and that's what we're launching at Connect. Llama 2

30:39

is not a consumer product. It's sort of the

30:42

engine or infrastructure that powers a bunch of that

30:44

stuff. But there was this argument,

30:47

especially after we did

30:49

this partial release of Llama 1, and

30:52

there was a lot of stir around that,

30:54

and then people had a bunch of feedback, and we're wondering

30:56

when we were going to incorporate that feedback. And

30:59

just kind of like, okay, well, if we release Llama 2,

31:01

is that going to distract us from

31:03

our real job, which is building

31:05

the best consumer products that we can? So that was

31:07

one debate. I think we sort of got comfortable

31:09

with that relatively quickly. And then the much bigger

31:12

debate was around the risk and safety.

31:15

I think it's sort of like, what is the framework for how

31:17

you measure kind of what harm can be

31:19

done, and how do you compare that to other

31:22

things? For example, someone

31:24

made this point recently, and this

31:26

was actually at the Senate event. I mean, someone

31:28

made this point that's like, okay, well, we took Llama 2,

31:32

and our engineers in just

31:34

several days were able to take away

31:36

the safeguards and ask it a question

31:38

to, can you produce anthrax? And

31:41

it answered. On its face, that sounds

31:43

really bad, right? That's an issue that you can strip off

31:45

the safeguards. Until

31:47

you think about the fact that you can actually just Google how to

31:49

make anthrax, and it shows up on the first page of the results

31:52

in five seconds, right? So I do

31:54

think that there's like a question when you're thinking

31:56

through these things about what is the actual incremental

31:59

risk that is created by having

32:01

these different technologies. I think

32:03

a lot of this stuff, we've seen this in protecting

32:06

social media as well. You know, if you have

32:08

like Russia or some country trying to create

32:12

a network of bots or inauthentic

32:14

behavior, it's not that you're ever gonna stop

32:16

them from doing it. It's sort of an economics problem.

32:19

You wanna make it expensive enough for

32:21

them to do that, that it is no longer their

32:23

best strategy because it's cheaper for them to go

32:25

try to exploit someone else or something else, right?

32:28

And I think the same is true here, right? And so for the risk

32:30

on this, you wanna make it so that

32:32

it's

32:33

sufficiently expensive that it takes engineers

32:36

several days to dismantle whatever

32:38

safeguards we built in instead of just Googling it.

32:40

So you feel generally good directionally with

32:43

the safety work? For Llama 2, I mean, I think that

32:45

we did leading work on that. I think the white paper

32:47

around Llama 2 where we basically outlined

32:49

all the different metrics and

32:51

all the different things that we did and we did internal red teaming

32:53

and external red teaming and we've

32:55

got a bunch of feedback on it. So because

32:58

we went into this knowing that nothing is gonna be foolproof,

33:00

right? It's like,

33:02

some bad actor is going to be able to

33:04

find some way to exploit it. We really knew that we needed

33:07

to create a pretty high bar on that. So yeah,

33:09

no, I felt good about that for Llama 2, but it was a very rigorous

33:12

process. And you guys have now announced the Meta

33:14

AI agent, which is your proprietary,

33:16

I'm sure it's using Llama technology, but

33:18

it's a closed model. You're not really disclosing

33:21

a lot about the model and its weights and

33:23

all that. But this is the new agent that

33:25

people are gonna be seeing in the apps. Yeah,

33:27

so in Connect we announced a bunch

33:29

of different things on this. So Meta AI and

33:32

the other AIs that we released

33:34

are based on Llama 2, right? So

33:37

it's not like exactly the same

33:39

thing that we open sourced because

33:43

we used that as the foundation and then we kind

33:45

of like built on top of that to build the

33:47

consumer products. But yeah, there were a few

33:49

different things that we announced. Like Meta

33:51

AI. I feel like that part,

33:54

the AI to me feels like the biggest deal in

33:56

the near term a lot of people are gonna be seeing

33:58

it. It may be the first time, even

34:00

with all the coverage of GPT, it may be the

34:02

first time that a lot of people experience

34:05

a chatbot like this, actually. Yeah,

34:07

I mean, I'm really curious. Which is different. Yeah,

34:10

I'm very curious to see how the stuff gets used. I used

34:12

it for a little bit and it has web, it

34:14

can pull in web results, so it's got recency,

34:16

which is nice. It wouldn't

34:19

give me advice on how to break up with my girlfriend,

34:21

but it, you know. It wouldn't? I don't have

34:23

a girlfriend. I was just trying to see, I'm married,

34:27

but I was just trying to see, I was trying to see

34:29

what it won't and will answer. It

34:31

seems relatively safe. It seems like the type

34:33

of thing that it should be. Fine, give

34:35

me advice. Well, I'll just tell you. But what

34:38

do you imagine people using this for? Because it's got that

34:41

search engine component, but it can

34:43

do a lot of things. I mean, is this a pure

34:45

GPT, ChatGPT competitor

34:48

in almost every way in your mind? Or how

34:50

do you think about it? I think that there's a bunch of different spaces

34:53

here. So that I think people are gonna wanna interact

34:55

with AIs around. Take a step back. I

34:57

think that the vision for a bunch of folks in the industry,

34:59

when I look at OpenAI or Google, is

35:02

the sense that there's gonna be one big super intelligence

35:04

and they wanna be it. I just don't think that

35:07

that's the best future. I think the

35:09

way that people tend to process the world is

35:11

we don't have one person that we go to for everything. We don't

35:13

have one app that we go to for everything. I don't

35:15

think that we want one AI. It's overwhelming.

35:18

I find this with the current chat bots. I

35:20

feel like it can do so much that I'm not actually sure

35:23

what aspect. Our

35:26

view is that there

35:28

are actually gonna be a lot of these where the people talk to you for

35:30

different things. One thought experiment

35:33

that I did to sort of prove

35:35

to myself that this would be the case is like, all right, let's say

35:37

you're a small business and you

35:40

wanna have an AI that can help you interface

35:42

with customers to do sales and support.

35:45

You wanna be pretty confident that your AI isn't

35:48

gonna be promoting your competitor's products.

35:50

You want it to be yours. You want it to be aligned with you. You're

35:54

gonna want a separate agent than your

35:56

competitor's agent. Then you

35:58

get to this point where, okay, well, there are gonna

35:59

be 100

36:01

million AIs just

36:03

helping businesses sell things. Then you get the creator

36:05

version of that, or every creator I think is gonna

36:08

want an AI assistant or

36:10

something that can help them build their community. People

36:13

are gonna really wanna interact with, it's

36:16

like there's just way more demand to interact with creators

36:18

and celebrities. There's only one Kylie Jenner and

36:20

you can't, yeah. I

36:22

mean there's I think a huge need here. People

36:24

wanna interact with Kylie. Kylie wants

36:27

to cultivate her community, but there

36:29

are only so many hours in a day. Creating

36:31

an AI that's sort of an assistant for her, where

36:34

it'll be clear to people that they're

36:36

not interacting with the physical Kylie

36:38

Jenner, it would be kind of an AI version that'll

36:41

help the creators and I think it'll be fun for consumers.

36:44

That one's actually really hard because I think getting

36:46

the creator one to work, we're not actually launching that now.

36:48

That's I think more of a next year thing because there's

36:51

so many, you can call

36:53

it like brand safety type concerns where

36:55

you, like if you're a creator, you really wanna

36:57

make sure these AIs reflect

37:02

the personality of the creator

37:05

and don't talk about things that the creator

37:07

doesn't wanna get into or don't

37:09

say things that are gonna be problematic for the creator and

37:11

their endorsement deals or different things. Creators

37:13

should have input in all of this. They should be able to say I don't want this. Oh

37:15

yeah, yeah, but I think in some ways the

37:17

technology doesn't even exist yet to make it that

37:20

constrained. I mean, this isn't code in

37:23

the deterministic sense. It's like a model

37:25

that you need to be able to train it to

37:27

stay right in certain bounds and a lot of that is still getting

37:29

developed. So that's more next year. Yeah, so

37:33

there's businesses, there's creators. That

37:35

stuff is fun; the business stuff is, I think, more

37:37

useful and then I think that there's a bunch of stuff

37:40

that's just interesting kind of consumer

37:42

use cases. So there's more

37:44

of like the utility, which is what Meta AI is,

37:47

like answer any question. You'll be able

37:49

to use it to help navigate your Quest 3 and the

37:51

new Ray-Ban glasses that we're

37:53

shipping, which we'll get to in a second.

37:55

That'll be pretty wild, having that AI that

37:58

you can just talk to all day long on your glasses. I

38:01

think that will be pretty powerful. But

38:03

then there are also going to be all these other new

38:05

characters that are getting created, which

38:08

is somewhat of an easier question to start

38:10

with than having AIs that are

38:13

kind of acting as a real person

38:15

because there aren't as many kind of brand safety

38:17

concerns around that, but they could still be pretty

38:20

fun. So we're experimenting

38:22

with a bunch of different AIs

38:25

for different interests that people have, whether it's

38:27

interest in different kinds of sports or fashion.

38:30

One I saw was a travel agent type person.

38:33

Yeah, travel. There's some that are more around giving people

38:35

advice. There's life coach

38:37

and an ant. And then there's some that

38:39

are more gamey. So

38:42

Snoop Dogg is playing the dungeon

38:44

master. And there's a few that

38:46

are just text-based adventure games. And

38:49

the ability to just drop that into a thread and play

38:51

a text-based game, I think, is going to be super fun. I

38:54

think part of this is we want to create

38:57

a diversity of different experiences to see what resonates

39:00

and what we want to go deeper on. This is sort

39:02

of the first step towards building

39:04

this AI studio that we're working on that

39:06

will make it so that anyone can build their

39:09

own AIs, sort of just like you

39:11

create your own UGC, your own content

39:13

across social networks. You should be able

39:15

to create your own AI and publish it.

39:19

And I think that's going to be really wild. I

39:21

do agree it's going to be wild. There's a bit of an uneasiness

39:24

to it for me of just the idea

39:26

that we as a society are going

39:29

to be increasingly having

39:31

relationships with AIs.

39:34

I mean, there's stories about like character

39:37

AI, which has a similar kind of library of personas

39:40

you can interact with and people literally like

39:42

falling in love with some of these chatbots. I

39:45

mean, what do you think about that phenomenon?

39:47

Is it just inevitable with where the tech is

39:49

going? That's not where we're starting.

39:52

So I think that there's a lot of use cases that are

39:54

just a lot more clear

39:56

than that in terms of someone

39:59

who can help you

39:59

make

40:01

workouts, right? Someone who can help

40:03

you with cooking, or help you figure out travel, or

40:05

even like the game type stuff. I think that

40:07

a bunch of these things can help you in your interactions

40:09

with people. I think that that's more

40:11

our natural space. One of the things

40:13

that we can do that's harder for others to

40:16

do is the ability to make

40:18

it so you can drop these into group chats. As we're

40:20

starting with Meta AI, you can just invoke it in any

40:22

thread. Like, yeah, I could be having a one on one thread with

40:24

you, and I could just ask Meta AI something, I could do that in

40:26

a group chat thread. I think that

40:28

that's going to be really fun. Just having

40:30

these kind of fun personalities

40:34

in these threads I think will create sort of an

40:36

interesting dynamic. I think especially around

40:38

image generation, and we haven't talked about that as much. I used

40:41

that. It was pretty impressive, and it was fast.

40:43

Yeah, I think the team has made awesome progress.

40:46

We're at good photorealistic quality. For

40:48

people who haven't used it yet, you just type into the bot what

40:50

you want the image to be in, it'll just make it. Yeah,

40:53

and the fact that it's fast and free, I think

40:55

is going to be pretty game changing. There are photorealistic

40:58

image generators out there, but a lot of them, they take a minute.

41:01

They're hard to use. Discord

41:03

or whatever. Yeah, you have to pay a subscription fee.

41:05

Yeah. So I think having it be free,

41:09

fast,

41:10

able to exist in group chat threads, I

41:12

think people are just going to create a ton

41:14

of images for fun. And I

41:16

don't know, I'm really curious to see how this gets used,

41:19

but I think it's going to be super fun. I already just

41:21

sit there with my kids and

41:24

the word that you say to get it to make an image is

41:26

imagine. And my daughter is just like,

41:28

I just want to play imagine and just like imagine this. And it's

41:30

like, oh, we get an image. And it's like, oh, well, I actually

41:33

want to change it. So imagine this and edit

41:35

the prompt, but because it's just a five second turnaround,

41:38

you could do that so easily. And you could do it over

41:40

the internet with group chat. I'm doing that

41:42

sitting there with my daughter, but I think that's

41:44

going to be really fun. So I think there are all

41:46

these things where you can

41:48

use these tools to facilitate connections

41:51

and just create entertainment, which

41:53

I

41:54

guess actually,

41:56

probably more what the technology is capable

41:58

of today than even in some of the more utility

42:01

use cases, because there is the factuality issue

42:03

with the hallucinations and all that. And we're

42:06

trying to address that by doing partnerships

42:08

with search engines that you mentioned. So

42:11

you can type in a question and ask

42:14

in real time like, who won the fight this weekend? And

42:16

it'll be able to go do a search and bring

42:18

that in. But there's still, I think, hallucinations

42:21

hasn't been solved completely in any

42:23

of these. So I think to some degree, the

42:25

thing that these language models have really been

42:27

best at is, I mean,

42:30

it's kind of, with the name generative AI, it suggests being generative,

42:32

suggesting ideas, coming up with things

42:35

that could be interesting or funny, are

42:37

much better than like, I wouldn't necessarily

42:39

yet want it to be like my doctor and ask it

42:41

for diagnosis and have to rely that it's not hallucinating.

42:44

So I think having it fit into a consumer

42:47

product where the primary goals are

42:50

interacting with interesting content

42:52

and entertainment is actually maybe

42:54

a more natural fit for what the technology is capable

42:56

of today than some of the initial use cases

42:59

that people thought about. It was like, oh, it's going to be this

43:01

kind of like all intelligence assistant, or it's going to

43:03

be my new search engine or something. I mean, it's fine

43:06

for those a bunch of the time. And I think it will get

43:08

there over the next few years. But I

43:11

think the consumer thing is actually quite a good fit today.

43:14

It seems like a key differentiator for Meta

43:16

in the whole model race is you have probably

43:19

second to maybe Google the most user data

43:21

to train on. And I know a lot of it's private, and you

43:23

wouldn't train on like, ever train on that private

43:25

chat. We don't. WhatsApp's

43:28

encrypted too, but like public

43:30

stuff, Reels, public Facebook posts,

43:32

that seems pretty natural for this. Is

43:35

that feeding Meta AI

43:38

right now? Yeah. I mean, like you said, we

43:40

don't train on kind of private

43:42

chats that people have with their friends or things like

43:44

that. But you're sitting

43:46

on this just massive hoard of data. Yeah.

43:50

It could be interesting in a way. I actually

43:52

think a lot of the stuff that we've done today is

43:56

actually still pretty basic. And there's a lot of upside,

43:58

and I think we need to experiment. see what ends up being useful.

44:01

But I mean, one of the things that I think is interesting

44:05

is these AI problems,

44:07

they're

44:07

so tightly optimized that having

44:10

the AI

44:13

basically like live in the environment

44:15

that you're trying to get it to get better at is

44:18

pretty important. So like, so for example, you know,

44:20

you have things like ChatGPT, they're just in like a kind

44:22

of

44:23

abstract chat interface. But

44:26

getting an AI to actually live in a group chat,

44:28

for example, is actually a completely different problem.

44:31

Because now you have this question, which is, okay, when

44:33

should the AI jump in? Right, right. So

44:35

it actually like in order to get an AI to be good

44:37

at being in a group chat, you need to have

44:39

experience with AIs in group chats, which even

44:42

though like, I don't know, Google or

44:44

OpenAI or other folks may have, you know, a lot of experience

44:47

with other things that kind of like product dynamic

44:49

of, of having the actual experience

44:51

that you're trying to deliver the product in,

44:54

that's super important. Similarly,

44:56

one of the things that I'm pretty excited about, I

44:58

think multimodality is pretty important interaction,

45:01

right? It's I think, you know, a lot of these things

45:03

today are like, okay,

45:06

you're an assistant, I can chat with you in a

45:08

box, you don't change, right?

45:10

It's like, you're the same assistant every day. I think

45:12

that that's not really how people tend to interact

45:15

in order to make things fresh and entertaining.

45:17

You know, even the apps that we use, they change,

45:20

right? They get refreshed, they add new features.

45:22

I kind of think that

45:24

people will probably want the AIs that they interact

45:27

with, I think it'll be more exciting and interesting

45:29

if they do too. So part of what

45:31

I'm interested in is this

45:33

isn't just chat, right? Chat, I think

45:35

will be where most of the interaction

45:37

happens. But these AIs,

45:40

they're going to have profiles on Instagram

45:42

and Facebook, and they'll be able to post

45:44

content and they'll be able to interact with people and

45:47

interact with each other, right? And I think that that's

45:49

there's this whole like, interesting set

45:51

of flywheels around how that

45:53

interaction can happen and how they can sort of evolve

45:56

over time. And I think that that's going to be very compelling

45:59

and interesting. And obviously, we're kind

46:01

of starting slowly on that, but

46:03

I think that having them sort of exist in that environment.

46:07

We wanted to build it so that it kind of worked across the

46:09

whole Meta universe of products, including

46:11

having them be able to, in the near future,

46:14

be embodied as avatars in the metaverse. As

46:16

you go into VR and you have an avatar

46:19

version of the AI and you can talk to them

46:21

there, I think that that's going to be really compelling.

46:24

It's at a minimum creating much

46:26

better NPCs and experiences

46:28

when there isn't another actual

46:30

person who you want to play a game with. You can just have AIs

46:33

that are much more realistic and compelling

46:36

to interact with. But I

46:38

think having this crossover where you

46:40

have an assistant or you have someone who tells

46:43

you jokes and kind of cracks you up and entertains

46:45

you and then they can show up in some

46:47

of your metaverse worlds and be able

46:49

to be there as an avatar, but you can still interact with them

46:52

the same way I think it's pretty cool. We

46:55

need to take another short break. When we return,

46:57

Mark and I discuss Meta's ambitions for the metaverse

47:00

and how he sees AR, VR,

47:02

and AI all coming together.

47:06

Support for the show comes from NPR. What

47:09

do labor strikes, climate change, and your crappy

47:11

office printer all have in common? Simple.

47:14

They all have issues with money. Money is everywhere,

47:16

fueling all our lives, altering our environment,

47:19

and driving behavior all around the world. If

47:21

you're curious to learn something new and

47:23

exciting about economics every week, you

47:26

can listen to the Planet Money podcast from NPR.

47:29

Planet Money takes the world of complex economy and makes

47:31

it make sense, where human stories supersede

47:34

abstract theories. Listeners learn, laugh,

47:36

and be entertained. It's econ down

47:38

to earth. Plus, Planet Money answers

47:40

some of your most burning questions. Will

47:43

AI take over our jobs? Is fancy vodka

47:45

just fancy marketing? Why are Christmas

47:47

trees so damn expensive? The Planet

47:49

Money team lives to tell the good story in

47:51

around 30 minutes. It's econ for the

47:53

rest of us. Tune in to Planet Money

47:56

every week for entertaining stories and insights

47:58

about how money shapes our world, stories

48:00

that can't be found anywhere else. Listen

48:02

now to Planet Money from NPR wherever you

48:04

get your podcasts.

48:09

Web agencies, you're gonna like this one.

48:11

Let me tell you about Wix Studio, the

48:14

platform that gives agencies total creative

48:16

freedom to deliver complex client sites while

48:19

still smashing deadlines. How?

48:21

First, let's talk about the advanced design

48:24

capabilities. With Wix Studio, you

48:26

can build unique layouts with a revolutionary

48:28

grid experience and watch as elements

48:31

scale proportionally by default. No

48:33

code animations add sparks of delight

48:35

while custom CSS gives total design

48:38

control. But it doesn't stop there. Bring

48:40

ambitious client projects to life in any

48:42

industry with a fully integrated suite

48:45

of business solutions from e-com to

48:47

events, bookings, and more. And

48:49

extend the capabilities even further with

48:51

hundreds of APIs and integrations.

48:54

You know what else? The workflows just

48:56

make sense. There's the

48:58

built-in

48:58

AI tools, the centralized workspace,

49:01

the on-canvas collaborating, the reuse

49:03

of assets across sites, the seamless

49:05

client handover, and that's not

49:07

all. Find out more at wix.com

49:10

slash studio.

49:13

We're

49:16

back. So

49:18

you think the advent of these AI personas

49:20

that are way more intelligent

49:23

will accelerate interest

49:25

in the metaverse and VR? Well,

49:28

I think that all this stuff makes it more

49:31

compelling. I think it's probably

49:33

an even bigger deal for

49:35

smart glasses than

49:38

for VR. You need some kind of

49:40

visual or voice control for

49:42

something. Well, I kind of thought, you know,

49:45

when I was thinking about what would be the key features

49:47

for smart glasses, I kind

49:50

of thought that we were gonna get holograms

49:52

in the world, and that was one. That's

49:54

kind of like augmented reality. But

49:56

then there was always some sort

49:59

of vague notion that you'd have

50:01

like an assistant that could do something.

50:03

I thought that things like Siri or

50:05

Alexa were very limited. So I was

50:07

just kind of like, okay well like over the time period of

50:09

building AR glasses like hopefully

50:12

the AI will advance. And now it definitely

50:14

has. So now I think we're at this point

50:16

where it may actually be the case that for

50:18

smart glasses the AI is compelling

50:21

before the holograms and the displays are. Which

50:23

is sort of you know where we got to with

50:27

the new version of the Ray-Bans that we're shipping this

50:29

year. When we

50:31

started working on the product all this generative

50:34

AI stuff hadn't happened yet. So we actually started

50:36

working on the product just as an improvement

50:38

over the first generation. So the photos

50:41

are better, the audio is a lot better,

50:44

like the form factor is better. It's just

50:46

sort of like a much more refined version of the initial

50:48

product. And there's some new features like you can

50:50

live stream now, which is pretty cool right because you can live

50:52

stream what you're looking at. But it was only over the

50:54

course of developing the product that we

50:57

realized that hey we could actually put this

50:59

whole generative AI assistant

51:02

into it and you could have these glasses that

51:04

are kind of stylish Ray-Ban glasses. You could

51:06

be talking to AI all throughout the day

51:09

about different questions you have. This isn't

51:11

in the first software release but sometime

51:13

early next year we're also gonna have this

51:16

multimodality. So you're gonna be able to ask

51:18

the AI hey what is it that I'm looking at? Like

51:20

what type of plant is that? Like

51:23

where am I? How expensive

51:25

is this thing? Yeah I mean it's because it has a camera built

51:28

into the glasses. So then you can just like look at

51:30

something and be like, all right, you're

51:32

filming with you know some Canon camera. It's

51:34

like, where do I get one of those? Again

51:37

this is all like really novel stuff, so

51:39

I'm not pretending to know you know exactly

51:41

what the key use cases are or how

51:43

people are gonna use that. But smart

51:46

glasses are very powerful for AI because

51:50

unlike having it on your phone,

51:53

glasses are a form factor where it

51:56

can see what you see and hear what

51:58

you hear from your perspective. So

52:00

if you want to build an AI assistant that

52:02

really has access to all of the inputs

52:04

that you have as a person, glasses

52:07

are probably the way that you want to build that. It's

52:09

sort of this whole new angle on smart

52:11

glasses that I thought might materialize over a

52:14

five to 10 year period, but in this odd

52:16

twist of the tech industry, I think actually is going

52:18

to show up maybe before even

52:20

super high quality holograms do. Is overall

52:23

interest in the Ray-Bans and

52:25

the Quest line

52:27

kind of tracking with where

52:29

you thought it would be at this point? Let's

52:32

take each of those separately. I know they're separate products.

52:35

Quest 1 was the first kind of standalone

52:37

product, and it did

52:39

well, but all the content

52:41

had to be developed for it. So it was really

52:44

when we developed Quest 2, which was the next generation

52:47

of it, that already had all the content built, and it

52:49

was sort of the kind of

52:51

refinement on it.

52:52

That one blew up. So Quest 2 was like a huge hit,

52:55

tens of millions, and that

52:58

did very well, and was sort of like the kind

53:01

of defining VR device so far.

53:04

Then we shipped Quest Pro, which was making the leap

53:06

to mixed reality, but it was $1,500. And

53:09

what we've seen so far is

53:11

that at least consumers

53:14

are very cost conscious. So we

53:16

expected to sell way fewer Quest Pros than

53:18

Quest 2s, and that was borne out. It's

53:20

always hard to predict exactly what it will be when you're shipping

53:23

a product at $1,500 for the first time, but

53:25

I'd say it was kind of fine, in

53:28

line with expectations. It wasn't like a grand slam,

53:30

but it did fine. And

53:32

now Quest 3 is sort of the refinement

53:35

on mixed reality, kind of like Quest 2 was

53:38

for Quest 1. With Quest 3, we're

53:40

sort of at the point where we've gotten mixed reality, which

53:42

is even higher quality than what was in Quest Pro,

53:45

but it's a third of the price. So it's $500. So

53:47

I'm really excited to see how that one will go. It

53:51

seems like you all, based on my demos,

53:53

still kind of primarily think of it as a gaming

53:55

device. Is that fair, that the main use cases

53:57

for Quest 3 are going to be... gaming

54:01

meets social, so you've got Roblox now. I

54:03

think social is actually the first thing,

54:06

which is interesting, because Quest used

54:08

to be primarily gaming, and

54:10

now if you look at what experiences are people

54:12

spending the most time in, it's actually just different

54:15

social metaverse type experiences. So, you

54:17

know, things like Rec Room, VR Chat, Horizon,

54:21

Roblox, but even with Roblox just

54:23

kind of starting to grow on the platform, social

54:26

already accounts for more time spent than gaming use cases.

54:28

So it's different if you look at the economics, because

54:30

people pay more for games, whereas social

54:33

kind of has that whole adoption curve thing that I talked about

54:35

before, where first you have to kind of

54:37

build out the big community, and then you can enable

54:39

commerce and kind of monetize it over

54:41

time. But this is sort of my whole theory

54:43

for VR, was people looked at it initially

54:46

as a gaming device, and I thought,

54:48

hey, I think

54:48

this is a new computing platform overall. Computing

54:51

platforms tend to be good for three major things,

54:54

gaming, social and communication,

54:56

and productivity, and I'm pretty sure

54:58

we can nail the social one, if we can find the right

55:01

partners on productivity, and if we can support

55:03

the gaming ecosystem, then I think that we can help this

55:05

become a big thing. So I'd say broadly

55:07

that's sort of on track, I thought it was gonna be a

55:10

long-term project, but I think the fact that social

55:13

has now overtaken gaming

55:15

as the thing that people are spending the most time on is

55:17

an interesting software evolution in how they're used.

55:21

But yeah, like you're saying, I mean, entertainment, social,

55:23

gaming, still the primary things, productivity

55:25

I think still needs some time to develop.

55:29

I tried the Quest 3, it's definitely a meaningful

55:31

step change in terms of graphics and performance

55:34

and all the things you guys have put into it. It

55:36

still feels like we're a little

55:38

ways away from this medium becoming

55:41

truly mainstream, becoming something that- What

55:44

do you mean by mainstream? Well, I know you're already

55:46

at kind of console level sales, so

55:48

you could say that's mainstream, but I guess in

55:51

terms of what you could think of as a general purpose

55:53

computing platform. So even like

55:55

PC or something like that,

55:57

seems- Well, in what sense? I think

56:00

there's a few parts of this. I think for productivity,

56:03

you probably want

56:04

somewhat higher resolution screens.

56:07

And that I think will come, and I think we're waiting for

56:09

the cost curve to come down. Basically,

56:12

like we could have super high resolution screens today,

56:14

just that the device would be thousands and thousands of

56:16

dollars, right? Which is basically the trade off that Apple

56:18

made with their Vision Pro. Have you tried it yet?

56:21

No, I haven't. Yeah. But

56:24

you're right. They guided towards that

56:26

one spec, you can tell. Yeah, you just have to

56:29

imagine over the next five

56:32

plus years, like there will be

56:34

displays that are that good, and they'll come down

56:36

in cost, and we're

56:38

just sort of riding that curve. So for today,

56:40

when you're building one of these products, you basically have the

56:42

choice of, if you have it

56:44

be that expensive, then you will sell

56:46

hundreds of thousands of units or something. But

56:49

we're trying to build something where we build up the community

56:51

of people using it. So

56:54

we're trying to thread the needle and have the best possible display

56:56

that we can, while having it cost $500. Yeah,

57:00

I reported on some comments you made to employees

57:03

after Apple debuted the Vision Pro, and you didn't seem

57:05

super fazed by it. Like it

57:07

seemed like it didn't bother you as much as it maybe

57:09

could have. I have to imagine if they released

57:11

a $700 headset, we'd

57:14

be having a different conversation, but they're

57:16

shipping low volume, and they're probably

57:18

three to four years out from a general,

57:21

like a lower tier type release that's

57:23

at any meaningful scale. So I guess, I

57:25

mean, is it because the market's yours foreseeably then

57:27

for a while? I mean, look, Apple is

57:30

obviously very good at this. So I don't want to be

57:32

dismissive, but because we're

57:34

relatively newer to building this, the

57:36

thing that I wasn't sure about is

57:38

when Apple released a device, were

57:41

they just going to have made some completely

57:43

new insight or breakthrough that just

57:45

made our effort... Blew your

57:47

R&D up. Yeah, that's just like, oh, like, well,

57:50

now we need to go start over or something. And

57:52

to me, that was the thing that, yeah,

57:55

I thought we were doing pretty good work. So I thought that

57:57

was unlikely, but you don't know for sure

57:59

until... They show up with their thing and

58:02

there was just nothing like that, right? So I

58:04

think that there are some things that they did that

58:06

are clever. I think, you know, when we

58:08

actually get to use it more, I'm sure there are gonna be other

58:10

things that we will learn that are interesting.

58:13

But mostly they just chose a different

58:15

part of the market to go in and I think

58:17

it makes sense for them, right? I mean, I think they sell, it

58:20

must be, what, 15 to 20 million MacBooks

58:22

a year, and from their perspective, if

58:24

they can, you know, replace those

58:27

MacBooks over time with things like Vision Pro, then

58:29

that's like a pretty good business for them. Yeah, right,

58:32

and it'll be many billions

58:34

of dollars of revenue, and, you know, I think they're pretty

58:36

happy selling 20 million or 15 million

58:38

MacBooks a year. It's good. But

58:40

we play a different game, right? I mean, we're

58:43

not trying to sell devices at a big premium

58:45

and make a ton of money on the devices. You

58:47

know, going back to the curve we

58:49

were talking about before, we

58:51

want to build something that's great, get

58:53

it to be so that people use it and

58:55

want to use it like every week and every day, and then

58:57

over time scale it to hundreds

59:00

of millions or billions of people. And,

59:02

you know, I think if you want to do that, then

59:04

you have to innovate not just on the quality of the device

59:07

but also in making it affordable and accessible to people.

59:09

So I do just think we're playing somewhat

59:11

different games. That

59:14

said, over time, you know, I'm

59:16

sure they'll build a high quality device in

59:18

the zone that they're focusing on, and it

59:20

may just be that these are in fairly different spaces

59:22

for a long time. Yeah, but I'm not sure.

59:25

I think we'll see as it goes. From

59:27

the developer perspective, does it help

59:29

you to have developers building

59:32

on it? I guess

59:34

you could lean too much into the Android

59:36

versus iOS analogy here. But yeah, I

59:39

guess, where do you see that going?

59:41

Does Meta really lean into

59:43

the Android approach and start licensing your

59:46

software and technology? I'd

59:48

like to have this be a more

59:50

open ecosystem over time. My theory

59:53

on how

59:54

these computing platforms evolve

59:56

is that there will be a closed, integrated stack

59:59

and a more open stack, and there have

1:00:01

been in every generation of computing

1:00:03

so far. The thing that's actually not clear

1:00:06

is which one will end up being the more

1:00:08

successful. I think we're looking, we're

1:00:10

kind of coming off of the mobile one now

1:00:13

where

1:00:14

Apple has truly been the dominant company. Even

1:00:16

though there are technically more Android phones, there's way more

1:00:18

economic activity, and the center of gravity for

1:00:21

all the stuff is clearly on iPhones. In

1:00:23

a lot of the most important countries for defining

1:00:25

this, I think iPhone has a majority

1:00:28

and growing share and I think it's clearly just the dominant

1:00:30

company in the space. But that wasn't true

1:00:33

in computers and PCs. Microsoft

1:00:36

and- So our approach here

1:00:38

is to focus on making it

1:00:41

as affordable as possible.

1:00:42

We want to be the open ecosystem and

1:00:44

we want the open ecosystem to win. So

1:00:47

I think it is possible that this will be more

1:00:49

like PCs than like

1:00:51

mobile where maybe Apple

1:00:54

goes for this for kind of a high

1:00:56

end segment and maybe

1:00:58

we end up being the primary ecosystem

1:01:02

and the one that ends up serving billions

1:01:04

of people. That's the outcome that

1:01:07

we're optimizing for. On the progress

1:01:09

that you're making with AR glasses, it's

1:01:11

my understanding that you're going to have your first kind of

1:01:13

internal, at least dev kit next year.

1:01:15

I don't know if you're going to show it off publicly or not, if that's

1:01:17

been decided. But is that progressing

1:01:20

at the rate that you have hoped as well?

1:01:22

It seems like Apple's dealt with this. Everyone's

1:01:24

been dealing with kind of the technical

1:01:27

problems with this. I don't

1:01:29

think we have anything to announce on that today. You

1:01:32

said AR glasses are a kind of end

1:01:34

of this decade thing and I guess what I'm trying to get

1:01:36

at is- To be more

1:01:39

of a mainstream consumer product, not

1:01:41

like a V1. I don't have anything new to

1:01:43

announce today on this. We have a bunch

1:01:45

of versions of this that we're building internally. We're

1:01:48

kind of coming at it from two angles at once.

1:01:51

We're starting with Ray-Ban, which is like,

1:01:53

if you take stylish glasses today, what's

1:01:55

the most technology that you can cram

1:01:57

into that and make it a good product? Then we're coming

1:02:00

at it from the other side, which is like, all right, we want to create

1:02:02

what is our ideal product with like,

1:02:05

full holograms. You walk into

1:02:07

a room and there are, like,

1:02:09

as many holograms as there are physical

1:02:12

objects. You can interact with, like, people

1:02:15

as holograms, AIs as holograms,

1:02:17

like all this stuff. And then how do we get that

1:02:20

to basically fit into

1:02:22

a glasses-like form factor at as

1:02:25

affordable of a price as we can

1:02:27

get to. I'd say the Ray-

1:02:29

Ban one, I'm really curious

1:02:32

to see how the second generation

1:02:34

of the Ray-Bans does. And the first one, I think the

1:02:36

reception was pretty good.

1:02:39

I mean, there's a bunch of reports about the retention

1:02:41

being somewhat lower. And then yeah, I think

1:02:43

that there's a bunch of stuff that we just need to polish where

1:02:46

the cameras are just so much better, the audio is so much better.

1:02:48

And we didn't realize that a lot of people were going to want to use

1:02:50

it for like listening to podcasts when they go on

1:02:52

a run, right? That wasn't what we designed it for.

1:02:55

But it was a great use case. So it's like, okay,

1:02:57

yeah, great. Like, let's make sure that's good in

1:02:59

v2. So, you know, it's the cycle

1:03:01

for iterating on this, if we're,

1:03:03

you know, doing like a Threads release or Instagram,

1:03:06

and the cycle is like a month, hardware, it's like 18

1:03:08

months, right? Right. So this is the

1:03:10

next step. And we're, I think, just gonna climb

1:03:13

up that curve. But the initial interest,

1:03:15

I think, is there, I think

1:03:17

this is an interesting base to build from. So

1:03:19

I feel good about that. Going

1:03:22

the other direction, I mean, the technology

1:03:24

is hard, right? And it's,

1:03:27

we are able to get it to work. It's

1:03:29

currently very expensive. So

1:03:32

if you want to

1:03:34

reach a consumer population,

1:03:36

that's gonna have to wait for the cost curve to come down.

1:03:38

Yeah. So that's kind

1:03:40

of the main limiting factor. Well,

1:03:42

I think there's that, and, yeah, I mean,

1:03:45

we want to keep on improving it. So I

1:03:47

think,

1:03:48

but look, you learn

1:03:50

by trying to assemble and integrate everything.

1:03:52

You can't just like do a million R&D efforts

1:03:54

in isolation

1:03:56

and then like, hope that they come together.

1:03:58

I think part of what lets you get to

1:04:01

building the ultimate product is just having a

1:04:03

few tries at practicing building the ultimate product.

1:04:05

And it's like, oh well, we did that, but I don't

1:04:08

know, it wasn't quite as good on this one

1:04:10

dimension as we wanted, so let's

1:04:12

not ship that one, let's hold that one and then do

1:04:14

the next one. So that's sort of some of the process

1:04:17

that we've had, is we have multiple

1:04:19

generations of how we're gonna build this. You

1:04:21

know, when I look at the overall budget

1:04:24

for Reality Labs, I mean, it's augmented

1:04:26

reality, the glasses I think is the

1:04:29

most expensive part of what we're doing. That's

1:04:31

why I ask, because I think people are wondering like, where's all

1:04:33

this going? I mean, but look, I

1:04:35

think at the end of the day, I'm

1:04:38

quite optimistic about both augmented and virtual

1:04:40

reality. I think AR glasses are gonna

1:04:43

be the thing that's like mobile phones, that

1:04:45

you walk around the world wearing. VR

1:04:48

is gonna be like your workstation or TV, which

1:04:51

is when you're like settling in

1:04:53

for a session and you want kind

1:04:56

of a higher-fidelity, more compute-rich

1:04:59

experience, then it's gonna be worth

1:05:01

putting that on. But you're not gonna walk down the street wearing a VR

1:05:04

headset. I mean, at least I hope not.

1:05:06

I mean, that's not the future that we're working towards.

1:05:09

But I do think there's somewhat of a bias, maybe

1:05:11

this is in the tech industry or maybe overall, where

1:05:13

people think that the mobile phone one, the glasses

1:05:16

one is sort of

1:05:18

the only one of the two that will end up being valuable.

1:05:20

But I think there are a ton

1:05:22

of TVs out

1:05:25

there, right? And there are a ton of people

1:05:27

who kind of, like, spend a lot of time in front of computers

1:05:29

working. So I actually think the VR one will be

1:05:31

quite important too, but I

1:05:34

think that there's no question that the larger

1:05:36

market over time, I think should be smart

1:05:39

glasses. And I mean, now I think you're gonna

1:05:41

have both all the immersive quality

1:05:44

of being able to interact with people and feel

1:05:46

present no matter where you are in sort

1:05:48

of a normal form factor. And you're

1:05:50

also gonna have like the perfect form factor

1:05:52

to deliver all these AI experiences over time,

1:05:55

because they'll be able to see what you see and hear what you hear.

1:05:57

So yeah, I mean, this stuff is challenging. I

1:06:00

think making things small

1:06:02

is also very hard. There's this funny

1:06:05

counterintuitive thing where I think humans

1:06:08

get super impressed by building big things

1:06:10

like the pyramids, but I

1:06:12

think a lot of the time building small things like

1:06:15

cures for diseases at a cellular

1:06:17

level or miniaturizing

1:06:20

a supercomputer to fit into your glasses are

1:06:22

maybe even bigger feats than building some

1:06:24

really physically large things, but it just

1:06:28

sort of seems less impressive for some

1:06:30

reason, but I don't know, it's

1:06:32

super fascinating stuff. The last year

1:06:34

has been, I feel like every time

1:06:36

we talk, a lot has happened in a year. You

1:06:39

seem really dialed in to managing the company,

1:06:41

and I'm curious kind of what motivates

1:06:44

you these days, because you've got a lot going on, and you're getting

1:06:46

into fighting, you've got three kids, you've got the philanthropy

1:06:49

stuff, there's a lot going on, and

1:06:51

you seem more active in

1:06:53

kind of day-to-day stuff, at least externally

1:06:56

than ever. You're kind of the last,

1:06:58

I think, founder of your era still

1:07:00

leading a company this large.

1:07:03

Do you think about that? Do you think about kind of what

1:07:06

motivates you still, or is it just kind of

1:07:08

still clicking, and it's kind of more subconscious?

1:07:11

I don't know, I mean, I'm not sure that much of the stuff

1:07:13

that you said is that new. I mean,

1:07:17

kids are seven years old, almost

1:07:19

eight now, right? So that's been

1:07:21

for a while. Yeah, the fighting thing is

1:07:23

relatively new over the last few years, but I've always been very physical,

1:07:26

so a lot of sports and stuff like that. But

1:07:30

we go through different waves

1:07:34

in terms of what the company

1:07:37

needs to be doing, and I think that that

1:07:39

calls for somewhat different styles

1:07:41

of leadership, and I think we went through a

1:07:44

period where a lot of

1:07:46

what we needed to do was tackle and

1:07:49

navigate some important social issues, and

1:07:51

I think that that required a somewhat different style.

1:07:54

And then we went through a period where we had some

1:07:56

quite big business challenges, handling

1:07:59

a recession and revenue

1:08:01

not coming in the way that we thought and needing to

1:08:04

do layoffs, and that required a

1:08:06

somewhat different style. But now

1:08:08

I think we're squarely back

1:08:11

in developing really

1:08:13

innovative products, especially because of

1:08:15

some of the innovations in AI. I

1:08:18

think that in some ways that just plays exactly

1:08:20

to, I think, my

1:08:22

favorite style of running a company.

1:08:25

But I don't know, I think these things evolve

1:08:27

over time. It feels like you're having more fun.

1:08:30

Well, how can you not? I mean, this

1:08:32

is I think what's great about the tech industry is like

1:08:34

every once in a while you get something like these

1:08:37

AI breakthroughs and it just changes everything.

1:08:39

And yeah, I mean, that can be threatening

1:08:41

if you're behind it, but I just think that that's

1:08:44

like when stuff changes and when awesome stuff gets

1:08:46

built. So then that's exciting. I guess

1:08:48

personally, I think a lot

1:08:50

of people, I mean, the world has been so weird over the last

1:08:52

few years, especially going

1:08:54

back to the COVID pandemic

1:08:57

and all that stuff. And I think it was an opportunity

1:09:00

for a lot of people to just sort of reassess

1:09:02

what they found meaningful in their lives. And

1:09:05

there's obviously a lot of stuff that was tough about it, but in

1:09:07

the silver lining, it was like I got to spend a lot more time with

1:09:09

my family and spend more time

1:09:12

out in nature because I wasn't coming into the office

1:09:14

quite as much. And it was definitely

1:09:16

a period of reflection

1:09:18

where I sort of, I felt like since

1:09:21

the time I was basically, I

1:09:23

was like 19 when I started the company. I mean, like every year

1:09:25

it's just, okay, we want to connect more people,

1:09:27

right? It's like connecting people is good.

1:09:30

That's sort of what we're here to do. Let's like make

1:09:32

this bigger and bigger and just like kind

1:09:34

of connect more people and

1:09:37

build more products that allow people to do that. I

1:09:39

guess we just sort of hit the scale where to me,

1:09:42

what I found sort of satisfaction in life

1:09:44

from and what I think is like the

1:09:47

right strategy, I think

1:09:49

both for like me personally and for the company

1:09:52

is less to just focus

1:09:54

on like, okay, we're gonna just like connect more people. And

1:09:56

more like, let's do some awesome

1:09:59

things. And... Sounds

1:10:01

very technical. I mean, there

1:10:04

are a lot of different analogies on this, but I mean, someone

1:10:07

made this point to me that doing

1:10:10

good things is different from doing awesome

1:10:13

things. And social media in a lot of ways,

1:10:15

it's good, right? It gives a lot

1:10:17

of people a voice and it lets them

1:10:19

connect and it's like sort of warm

1:10:21

and it's taking like a basic technology

1:10:24

and bringing it to billions of people. But I

1:10:26

think that there's an inherent awesomeness of

1:10:29

like

1:10:30

doing some technical feat for the first

1:10:32

time. And I

1:10:34

guess I'm, for the

1:10:36

next phase of what we do, just

1:10:38

a little more focused on that. Like,

1:10:41

I think we've done a lot of good things.

1:10:43

I think we need to make sure that they stay good,

1:10:45

right? I think that there's like a lot of work that needs to happen

1:10:47

on making sure the balance of all that

1:10:49

is right. But for the next wave

1:10:52

of, I guess, my life and

1:10:54

for the company, but also

1:10:56

outside of the company, what I'm doing at CZI

1:10:59

and just some of my personal projects, it's like

1:11:01

I sort of define my life

1:11:03

at this point more in terms of getting

1:11:05

to work on awesome things with great

1:11:08

people who I like working with. So it's

1:11:10

like I work on all this

1:11:13

Reality Labs stuff with Boz and

1:11:15

a team over there and like it's just super exciting.

1:11:17

And I get to work on all this AI stuff with

1:11:19

Chris and Ahmed and like the folks

1:11:21

who are working on that and like it's really exciting.

1:11:24

And like we get to work on some of the philanthropy

1:11:26

work and helping to cure diseases with

1:11:28

Priscilla and a lot of the best scientists in

1:11:30

the world. And that's really cool. And it's like, so just

1:11:33

then there's like personal stuff. It's like we get to raise a family.

1:11:35

It's like, that's really neat. And like there's

1:11:37

no other person I'd rather do that with. But

1:11:40

I don't know, to me, that's just sort of

1:11:42

where I am in life now. But um... Sounds

1:11:45

like a nice place to be. I mean, I'm enjoying

1:11:47

it. Mark Zuckerberg, the optimist. I

1:11:49

mean, always somewhat optimistic. But yeah,

1:11:51

no, but this is... Thanks for the time, Mark. Yeah. Appreciate

1:11:54

it. Thank you.

1:11:58

Thanks again to Mark Zuckerberg.

1:11:59

Thank you to Mark Zuckerberg for taking the time to talk today. Thanks

1:12:02

as always to Alex Heath for guest hosting, and thank

1:12:04

you for listening to Decoder. I hope you enjoyed it. You

1:12:06

can find Alex at his newsletter, Command Line. It's theverge.com

1:12:09

slash command line. It is jam-packed,

1:12:11

let's be honest. It is just a great newsletter.

1:12:14

Alex is also at Code this week, interviewing

1:12:16

Roblox CEO Dave Baszucki. Stay tuned for that

1:12:18

and plenty of more interviews from the

1:12:20

Code conference in the feed to come. As

1:12:23

always, I'd love to hear what you think of Decoder. You

1:12:25

can email us at decoder at theverge.com. We read every

1:12:27

email. You can also hit me up on Threads.

1:12:29

I'm at Reckless1280 on Threads. And

1:12:32

we have a TikTok, which is super fun. Check

1:12:34

it out at DecoderCon. If you like

1:12:36

Decoder, please share it with your friends and subscribe

1:12:38

wherever you get your podcasts. If you really like

1:12:40

the show, hit us up with that five-star review.

1:12:42

Decoder is a production of The Verge and part of the Vox Media

1:12:44

Podcast Network. Today's episode

1:12:46

was produced by Kate Cox and Nick Statt and edited by Calvin

1:12:49

Wright. The Decoder music is by Breakmaster

1:12:51

Cylinder. Our editorial director is Brooke

1:12:53

Minters and our executive

1:12:54

producer is Alan O'Donovan. Support

1:12:58

for this show came

1:12:59

from American Express Business. You've seen your business go

1:13:01

from just an idea to a success. Now it's time to find a partner to help

1:13:03

you grow it even more. American Express is here to

1:13:05

help. American Express Business cards are built

1:13:07

for your business with features and benefits like

1:13:09

the ability to earn membership rewards points on select cards,

1:13:12

the power to pay for big business purchases, and

1:13:14

24-7 support. American

1:13:16

Express Business cards are available for purchase on the American

1:13:19

Express website. You can purchase them online or directly

1:13:21

from the American Express website. Built for your business. Amex

1:13:30

Business.

1:13:30

Terms apply. Learn more at AmericanExpress.com

1:13:34

slash business cards.
