Fei-Fei Li: The Godmother of AI

Released Tuesday, 23rd January 2024
1 person rated this episode

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

Every New Year, it's tradition to set

0:02

new goals. Well, imagine setting a goal

0:04

now that lasts the rest of your

0:07

life for a limited time, Rosetta Stone

0:09

is offering a lifetime membership for just

0:11

fifty percent off. That's not an annual

0:13

payment, that's a one time payment. You

0:16

get full access to twenty five languages

0:18

and with it access to the world

0:20

one language at a time. Visit Rosetta

0:23

stone.com now and save fifty percent. That's

0:25

fifty percent off a lifetime membership. Welcome

0:29

to your twenty twenty-three work recap.

0:31

this year, you've been to one hundred and

0:33

twenty-seven meetings. You spent fifty-six

0:36

minutes searching for files and almost missed

0:38

a deadline. Twenty

0:40

Twenty Four can and should sound different

0:43

with monday.com You can work together easily,

0:45

collaborate and share data, files and updates,

0:47

so all work happens in one

0:50

place and everyone's on the same page.

0:52

Go to monday.com or tap the banner

0:54

to learn more.

1:05

I'm Alan Alda, and this is

1:07

Clear and Vivid, conversations about

1:10

connecting and communicating. Since

1:15

I was a little girl

1:18

like an eleven-, twelve-year-old,

1:20

I just loved physics. It

1:22

was my first love, and

1:25

in hindsight, what happened is

1:27

I think I love that

1:29

audacious quest to the unknown

1:31

mystery of the universe. Then.

1:34

It enlightened me in a manner

1:36

that I realized my own

1:38

audacious question that I love

1:41

the most is what is

1:43

intelligence? What makes intelligence? How

1:45

do we build intelligent machines

1:47

And that shift in the

1:49

middle or end of my

1:51

college years was

1:54

how I discovered A-

1:56

I. That's Fei-Fei

1:58

Li. She's often called, somewhat

2:00

to her embarrassment, the godmother of

2:03

AI. My conversation with

2:05

her is the first of three

2:07

special episodes of Clear and Vivid,

2:09

exploring the dramatic impact that artificial

2:11

intelligence has had in the last

2:13

twelve months since chatbots such

2:15

as ChatGPT burst

2:17

on the scene. Doctor

2:19

Li has written a wonderful book

2:21

recounting her work laying the foundation

2:23

for those chatbots some fifteen

2:25

years ago. But the book, called

2:27

The Worlds I See, is

2:30

also a vivid portrayal of her own

2:32

personal journey from her childhood in China

2:34

through a bold and risky gamble with

2:36

her career, to her

2:38

concern today that AI should

2:40

help humanity, not destroy it.

2:44

Thank you so much for doing this with

2:46

me. I almost feel guilty talking to

2:48

you for these few minutes because I feel

2:50

like I'm stealing you away from making a

2:52

future for all of us.

2:55

Thank you. No, I

2:57

think part of the task is

2:59

communicating with the public.

3:02

Well, you certainly do a great job

3:04

in your book. You paint such a vivid

3:06

picture of the people in your life who

3:08

have given you the foundation for

3:10

so many things you do

3:12

that affect all of our lives

3:14

I liked the book very much.

3:16

Congratulations. I was so

3:18

interested in your stories about your

3:20

life in China where you were

3:22

born. And I don't know

3:24

if it was as important a moment to

3:26

you, but it would have been to me

3:28

that moment where you were doing so

3:31

well in school. And then

3:33

one day the teacher said at the end of

3:35

the day, the boys should stay in,

3:37

the girls are free to go now. And you

3:39

lingered outside the door to listen to what was

3:41

going on. What did the teacher say? Oh

3:44

yeah, I have to contextualize though.

3:46

There are still parts of the

3:48

world where the

3:51

role of gender in that

3:53

deep history is still, you

3:55

know, not the way we

3:57

think of it here. And

3:59

the teacher. I'm pretty

4:01

sure she was coming

4:03

from a place of not

4:06

intentionally trying to hurt anybody, but

4:08

she wanted the boys to be

4:11

better and she wanted to encourage

4:13

the boys. And the way she

4:15

encouraged them is to say

4:18

what you can probably believe,

4:20

which is that the boys will

4:23

have more potential, that boys

4:25

were likely to be smarter or

4:27

better than us. And, you

4:30

know, as a little girl who was

4:32

a little feisty. At

4:35

certain areas, that didn't rub

4:37

me well. It was a grave moment

4:39

for me to see for myself,

4:42

being the person outside the door

4:44

listening to that: a woman,

4:46

an adult, telling the boys, you're

4:48

smarter than girls biologically, I expect

4:51

more from you. But don't worry,

4:53

the girls in a few years, during adolescence, will

4:55

get stupider. Whew,

4:58

I mean, it's crushing. And what's wonderful

5:00

is you didn't get crushed. You

5:02

almost took it as a

5:05

challenge. Maybe. That's because

5:07

of the people around me.

5:09

And, you know, first

5:12

of all I'm not trying to

5:14

paint the entire world with

5:16

that message right? This is one

5:18

teacher. It is

5:21

rooted in certain aspects of the

5:23

culture. But in the meantime,

5:25

my local culture, my family: I

5:28

was the firstborn grandchild,

5:30

and I was a girl, but

5:32

I was so supported by my parents

5:35

and by my grandparents at that time,

5:37

I didn't feel that crushing

5:40

weight, or a lack

5:42

of confidence in me by

5:44

them. And so, as

5:46

a kid I needed just

5:48

a few people who supported

5:50

me. And I

5:53

think that's probably true

5:55

for a lot of us: we

5:57

just need a few people who

6:00

love us so unconditionally.

6:02

You moved to the

6:04

United States when you were

6:06

fifteen, right? Yeah. That must

6:08

have been a really difficult

6:10

experience to live through because

6:12

you had very few words of

6:14

English, as I understand it.

6:17

Right, very, very little if

6:19

any. It was challenging

6:21

but it was also character-building.

6:24

Someone once asked me this

6:26

question, actually:

6:29

what if I didn't leave the

6:31

home country, what

6:33

would happen? I

6:36

think I have to say that

6:38

experience helped me to

6:40

become so much more

6:42

resilient, and also be

6:45

aware of what

6:48

one had to find

6:50

inside yourself and

6:52

bring out all the talent. That

6:55

really made me who I am

6:57

today. And you had your

6:59

math teacher in New Jersey, yes,

7:01

who was so supportive and continued

7:03

throughout your life to be supportive.

7:06

Yeah. I mean, that's why

7:08

the book is also a

7:10

tribute to those people like him.

7:12

Bob Sabella. He is by all

7:14

measures such an

7:17

ordinary public high school

7:19

teacher in our country. He

7:22

is not unique.

7:24

Yet he's so special, he's

7:26

so. He embodies the

7:29

values that this new country

7:32

instilled in me, that

7:34

kind of

7:37

compassion, that kind of respect,

7:39

and that kind of

7:42

generosity, that forever

7:44

you know, just stayed with me,

7:46

and I'm so grateful for that.

7:48

And I think of that moment, the

7:51

little girl standing outside the door, hearing

7:54

something so discouraging. And

7:56

only a few years later, you

7:58

get a full scholarship to

8:00

Princeton University. That's a

8:03

testament to your ability to

8:05

knock yourself out learning. I

8:07

did work hard but it's hard for

8:09

me to just give credit to myself

8:11

The book is a love letter

8:13

to the people who have

8:16

supported me. Once

8:22

you were launched in this

8:24

pursuit of science, what turned

8:26

you on to AI? Yeah.

8:28

So, that's a good question.

8:30

So. It really started

8:32

with, since I was,

8:35

I don't know, as soon

8:37

as I was a little

8:39

girl, like an eleven-, twelve-year-

8:41

old, I just loved physics.

8:43

It was my first love,

8:45

and in hindsight, what happened

8:47

is I think I love

8:50

that audacious quest to

8:52

the unknown mystery of the

8:54

universe. Like, physics allows you

8:56

to ask the craziest questions,

8:59

like the beginning of space-time,

9:01

boundary of the universe, the

9:03

smallest particle of matter.

9:05

And then, in

9:09

the middle of Princeton physics,

9:11

I discovered even the physicists

9:13

themselves like Albert Einstein and

9:15

Erwin Schrödinger. They

9:18

turned their attention to something

9:20

that was truly an equally audacious

9:22

quest, and that sort

9:25

of physics question is about

9:27

life. And I became so

9:30

enlightened, in a manner, that I

9:32

realized my own audacious question

9:34

that I love the most is

9:37

what is intelligence? What

9:39

makes intelligence? How do we

9:41

build intelligent machines? And

9:44

that shift, in the middle

9:46

or end of my college years, was

9:49

how I discovered AI. And

9:51

I didn't know we were in

9:53

an AI winter at all. I

9:55

I had no idea. I

9:57

didn't care. It didn't matter

9:59

to me. It was

10:01

summer for me because it was

10:03

so fascinating. Wait, can you

10:05

explain for those who haven't come across

10:07

that term. You mean that

10:10

AI has had times in our

10:12

history when it's bloomed and when people

10:14

have gotten discouraged and dropped their interest

10:16

in it, at least as far as

10:19

the public is concerned.

10:21

You were talking, in a

10:23

talk I saw you give about how

10:25

you were living in a town in

10:28

New Jersey, while thirty

10:30

miles away was Bell

10:33

Labs. Yeah, where Yann

10:35

LeCun was

10:37

working on developing

10:39

neural networks. Yeah, he and

10:41

many scientists. I had

10:43

no idea that they were there. When I

10:46

landed in America, it was the

10:48

moment that

10:51

AI was still in

10:53

a downturn,

10:55

yet computer scientists like

10:57

LeCun were working

10:59

on their neural networks

11:01

just thirty miles away from my

11:03

whole new life in America.

11:06

So how did you get from

11:08

that to concentrating on

11:10

images? I'm naturally

11:12

a visual person. Since

11:15

my early childhood my dad

11:17

took me on these nature

11:19

excursions and we looked at

11:22

butterflies, we drew pictures of

11:24

mountains, and I was fascinated by

11:27

seeing. I find understanding

11:30

visual intelligence to be the

11:32

most fascinating aspect of intelligence,

11:34

which led us

11:37

to vision. I've heard you

11:39

say that vision is more than

11:41

just a sense, that it's an

11:43

experience. Vision is

11:46

intelligence. Vision is experience.

11:48

And our

11:50

understanding of vision

11:52

is planning, is decision

11:54

making. Vision, in

11:56

that sense,

11:58

is a very enduring

12:00

cornerstone piece of intelligence

12:03

itself. At one point your

12:05

fascination with images caused

12:08

you to start to work

12:10

with ImageNet, which, I'd like to

12:12

have you help me understand better.

12:14

It's important to understand, because I think that's

12:16

the

12:18

project that made people call

12:21

you the godmother of artificial intelligence as

12:23

we know it today.

12:25

I know you're not one

12:27

to congratulate yourself too much, but

12:29

I've heard that said about you and

12:31

I'm trying to figure out in what

12:33

way was it a milestone?

12:36

Well, I think it's best explained with

12:39

looking at today's breakthrough in, say, Chat-

12:41

GPT-based AI.

12:44

We've seen that AI

12:46

breakthrough, because we see

12:49

powerful algorithms trained on a

12:51

vast amount of data, the

12:53

data of the internet and

12:56

that gives us such powerful

12:58

breakthroughs. But this concept of

13:01

neural networks trained

13:03

on large data was

13:05

nonexistent before

13:08

twenty-twelve, basically, because AI was

13:10

still going through a phase

13:12

where we didn't know what was

13:14

the best path to make

13:17

powerful AI. Two things

13:19

converged. One is there was

13:21

a group of people like

13:23

Professor Geoff Hinton who were

13:25

continuing to

13:28

explore neural networks. But

13:30

there was another ingredient,

13:33

another part of the puzzle

13:35

is big data.

13:37

At that time nobody saw

13:39

that big data would power

13:42

AI, and that's where

13:44

ImageNet came to play

13:46

a pivotal role, in that

13:48

my students and I recognized

13:50

the power of data. We

13:53

hypothesized, I

13:55

guess before most people, that

13:59

AI would have a paradigm

14:01

shift if we power it

14:03

with an enormous scale and

14:05

amount of data. This is,

14:08

let's say, a data-centric, data-

14:10

first approach. And

14:12

because we were

14:14

working on vision, we

14:17

wanted to make the biggest visual

14:19

dataset. And in

14:21

order to make the biggest

14:23

visual dataset, we had

14:25

this crazy idea of downloading

14:27

almost all the pictures we

14:29

could get on the internet

14:31

back in two thousand and

14:33

seven, and organized

14:36

a curated catalog, and

14:38

made it complete in

14:40

terms of visual objects

14:42

And that's what we

14:44

made, after three years,

14:47

between two thousand and seven and two thousand and nine.

14:50

We made a dataset of

14:52

fifteen million images across twenty

14:54

two thousand categories. And that was

14:56

out of cleaning up a

14:59

billion images. That's

15:01

what ImageNet is, and it was

15:04

ImageNet that brought the data-

15:06

first approach to A-

15:09

I. It also implicitly showed

15:11

the AI world an

15:13

important North Star problem to work

15:15

on, which is object recognition,

15:17

and also this commitment to big

15:20

data. That was a

15:22

big intellectual jump, because putting

15:26

together that kind of scale of

15:28

big data was,

15:30

well, it was a

15:32

gamble. It was a

15:35

career-risking one. So before one

15:37

does it, it is hard to

15:39

know if we should do it.

15:42

But that's what we had to do. And I

15:45

think you said what inspired you

15:47

a great deal in your progress

15:49

was what you call the Biederman number.

15:51

What was the importance

15:53

of the Biederman number? The

15:55

importance of the Biederman number

15:57

is it gives

16:00

a number on the size

16:02

of visual intelligence.

16:04

Like, what is

16:06

a number to define visual

16:08

intelligence? And that is really

16:10

a hard question. It's almost like

16:13

asking how many stars are

16:15

out there. And for

16:18

someone like me who was trying

16:20

to crack

16:22

the problem of visual intelligence,

16:25

I was looking for a

16:27

clue to tell me what

16:30

Is the scale and scope of the

16:32

problem. Like, if

16:34

we solved four

16:36

objects, is that enough? If we

16:39

solved twenty objects, is it

16:41

enough? If we solved a hundred

16:43

objects, is it enough? Nobody knows. A

16:46

thousand? But when I discovered the

16:48

Biederman number, I felt like

16:50

I discovered a

16:53

really important

16:57

Conjecture that no one was

16:59

noticing, which is tens

17:01

of thousands of categories

17:03

and Biederman put

17:06

it at thirty thousand categories. Once

17:08

I got that number, I felt like

17:10

I had a clue about the

17:13

size of this problem.

17:15

What would you mean by

17:17

categories?

17:19

The categories are the

17:21

natural way that humans

17:23

conceptualize objects.

17:25

We tend to conceptualize

17:27

them as German Shepherd,

17:29

microwave, yellow

17:31

sports car. You

17:33

know of course sometimes

17:35

we think about my

17:37

German Shepherd or your microwave,

17:39

but in general that

17:42

classification of visual concepts

17:44

is a fundamental visual

17:47

intelligence problem

17:49

that humans have worked on

17:51

and solved, and it's the very

17:53

foundation of our visual

17:55

intelligence. So you collected a

17:59

great number of pictures under the

18:01

category of dog and a great

18:03

number of pictures under the category of

18:06

cat, so the machine

18:08

is able to sort through that and put

18:10

a name on it when it sees

18:13

a picture. Well, it's not only

18:15

dog. There are hundreds of dog

18:17

species. For example,

18:20

we actually had hundreds

18:22

of different: terrier, German Shepherd, Corgi,

18:24

we have the different kinds

18:27

of dogs. So it's a

18:29

lot more than just dog versus

18:31

cat. Right, right. So

18:33

you've got subcategories as well,

18:35

yeah. Totally. I mean, a

18:37

lot, most of ImageNet,

18:40

is the subcategories,

18:42

right? Like there's hundreds of

18:44

dogs, hundreds and hundreds

18:47

of birds, and, you

18:49

know, many different kinds of cars,

18:52

buildings, trees, flowers.

18:54

You know, it's a,

18:56

it's a very, very vast catalogue

18:59

of the visual world. When

19:01

you got inspired by the Biederman number, you

19:04

had been working up until then on

19:07

a hundred and one

19:09

categories, which was hard work and

19:11

took a long time. What

19:14

made you think that you could

19:16

possibly get tens of

19:18

thousands of categories

19:20

and catalogue them? What gave you the

19:22

idea that you could do it? I

19:26

would like to say, delusion. I

19:30

was like that kid who went

19:32

on a treasure hunt and found

19:35

a clue to the biggest

19:38

quest and mystery, and I was

19:40

finding that clue. I was so

19:42

excited, and then I got so

19:45

daunted by myself. It's like,

19:47

it's so much bigger and I have

19:49

no idea how to do that. Except it

19:52

stayed in the back of my mind,

19:54

because I was getting

19:56

obsessed with that, the scale, and I

19:59

was like, what to do? And then,

20:01

serendipitously, a couple of years later, when

20:03

I was at Princeton, I stumbled upon

20:06

linguists who were cataloguing words. They

20:08

were not thinking about the visual world.

20:10

They were not thinking about Ai. But

20:12

they were just making

20:14

a knowledge taxonomy of words.

20:18

And when I made the

20:21

connection between Biederman's number,

20:23

which is visual concepts,

20:25

and the knowledge taxonomy, I

20:28

think it was a moment where I felt

20:32

so sure

20:35

That this is important enough I've

20:37

gotta do it. I just

20:39

kind of forgot about how

20:42

hard it would be. I kind

20:44

of set off. I just

20:46

needed to do it. I don't know

20:48

how, but if

20:51

I don't do it, I

20:53

cannot sleep.

20:55

So that was kind

20:58

of the momentum.

21:00

And that's how

21:02

I started the ImageNet project.

21:05

And it took a few years of

21:07

struggle, and it was also a few

21:09

years of realizing I

21:11

was so delusionally

21:15

fearless. How

21:18

do you make a decision like that?

21:21

What goes into it? Do you know

21:23

you're risking your own career? How

21:25

do you weigh the factors, risk

21:27

against the chance it will work?

21:32

I assume that, well, you don't, you

21:34

can't, because if you do, you wouldn't

21:36

do it.

21:38

You go with your gut. You

21:41

go with that scientific

21:45

conviction. And

21:47

at the end of the

21:49

day it has to go to the gut level, but it

21:51

wasn't irrational. It's not like I

21:54

came up with it at random. The

21:56

scientific conviction is there, and I'm

21:59

seeing that. And I

22:01

have to make a bet, and

22:03

I can be wrong. Honestly

22:05

I didn't have too much time thinking

22:07

about what would happen if I failed. I guess I

22:09

would go open another dry cleaner shop.

22:13

Which is a reference to a part of

22:15

your life that was so interesting:

22:17

while you were getting your

22:19

PhD, where were you working?

22:22

My entire undergrad at Princeton

22:24

and the first three years

22:26

of my PhD, so

22:28

seven years altogether, I was

22:30

running the family dry cleaner

22:32

shop business. Which used up

22:34

weekend after weekend. Yeah,

22:37

every weekend, you did it. You worked

22:39

with starch? No starch. Yeah,

22:41

actually, it was more than

22:43

that. I'm an expert.

22:46

When

22:56

we come back from our break, Fei-

22:58

Fei Li tells me how a question

23:01

from her mother, who was ill in

23:03

a hospital bed at the time, helped

23:05

Fei-Fei Li determine her present passion,

23:07

which is to ensure that the AI

23:09

being developed follows a human-

23:11

centered way. Just

23:16

a reminder that Clear and Vivid is a

23:18

nonprofit, with everything after expenses

23:21

going to the Center for Communicating

23:23

Science at Stony Brook University. Both

23:25

the show and the center are dedicated

23:27

to improving the way we connect

23:30

with each other and all the

23:32

ways it influences our lives. You

23:34

can help by becoming a patron

23:36

of Clear and Vivid at Patreon.com. At

23:39

the highest tier, you can join a

23:41

monthly chat with me and other patrons,

23:44

and I'll even record a voicemail message

23:46

for you: either a polite, dignified message

23:48

from me explaining your inability to come

23:50

to the phone, or a slightly snarky

23:52

one where I explain you have no

23:55

interest in talking with anyone at the

23:57

moment. And I'm happy

23:59

to report the snarky one is by

24:01

far more popular. If

24:04

you'd like to help keep

24:06

the conversation going about connecting

24:08

and communicating, join us at

24:10

Patreon.com/clearandvivid. P-

24:12

A T R E

24:14

O-N.com/clearandvivid.

24:17

And thank you. For

24:19

Delicious meals you can whip up

24:22

with fresh ingredients and chef-crafted

24:24

recipes at a price I bet

24:26

you'll like, say hello to Hello-

24:29

Fresh. Arlene and

24:31

I gave it a try and I'm

24:33

glad we did. The ingredients came to

24:35

our door in a box that kept them

24:37

cold and fresh, and in the exact right

24:39

portions. Arlene loves to cook,

24:41

and this seemed like something that would be fun to

24:43

do together. And in a little more

24:45

than twenty minutes, we had a delicious meal on

24:47

the table. And it would

24:49

have been quicker if I wasn't in her way.

24:52

Arlene was impressed by the

24:55

freshness of the ingredients and how

24:57

easy it was and it made

25:00

our taste buds happy. And to

25:02

top it off, if you subscribe,

25:04

they include a free

25:07

breakfast in every box. So join

25:09

us: go to hellofresh.com/aldafree

25:12

and use code aldafree

25:14

for free breakfast for life. One

25:16

breakfast item per box while subscription

25:19

is active. That's free breakfast for

25:21

life at hellofresh.com/alda-

25:23

free with code alda-

25:25

free. HelloFresh, America's

25:27

number One meal kit. Every

25:30

year, people come up with a list of

25:32

resolutions in an attempt to improve themselves,

25:35

but of course, many, many

25:37

resolutions fail. But Rosetta Stone offers

25:39

more than encouragement. They offer a financial

25:41

incentive. For a limited time, get fifty

25:44

percent off a Rosetta Stone lifetime membership. Just

25:46

a one time payment gives you full

25:48

access to all of its twenty-five

25:51

different languages. Sounds a lot easier than

25:53

losing ten pounds, right? Visit RosettaStone.com

25:55

now and save fifty percent. It

26:01

This is Clear and Vivid. And now back

26:03

to my conversation with Fei-Fei Li. She

26:05

tells a story in her book about

26:08

how her mother, who was ill in

26:10

a hospital at the time, asked a

26:12

question about how AI could help

26:15

ordinary people. And that sparked Fei-Fei

26:17

Li to help create the Institute for

26:19

Human-Centered AI at Stanford University.

26:24

Of course you know when I wrote

26:26

wrote that story in the book, I

26:29

guess I highlight that

26:31

moment, but in reality, I

26:33

think that moment combined with

26:35

the moment in history, including

26:37

my career, what I was

26:40

seeing in Silicon Valley, my

26:42

experience in industry on

26:44

my sabbatical as well as

26:46

taking care of

26:48

my mother and sharing her

26:50

reflections, all this combined

26:53

and compelled me to

26:55

re-examine the

26:57

technology I was making, to re-

27:00

examine my role in this

27:02

technology. And that's where I

27:04

realized AI has

27:07

come of age,

27:09

in a more profound way

27:11

than what I

27:14

knew when I was entering

27:16

AI. We now have

27:19

a technology that actually

27:21

makes real human impact: good,

27:23

bad and ugly. And

27:25

it's so important we

27:27

create a human-

27:29

centered framework so that we can

27:32

continue to develop this and deploy

27:34

this, and govern this, in a human-

27:36

centered way. What we've

27:38

seen in the technology movement

27:40

is what inspires you to think

27:43

like this? So, well, first

27:45

of all, I have had decades

27:48

of experience taking care of my

27:51

parents, my ailing mom, and

27:53

I think it's a very

27:55

grounding experience. I know

27:57

that I don't have the

28:00

profile of an AI leader.

28:02

I'm not a Silicon Valley

28:04

bro, I'm a woman, I'm

28:07

an immigrant, I'm a caretaker.

28:09

That background, my humanity,

28:11

grounded me with humility and

28:14

understanding of human dignity, human

28:16

self respect, human compassion, and

28:18

I wanted to see this

28:20

technology helping humans. In the

28:23

meantime, I was at Google.

28:25

And I was very fortunate

28:28

to be leading Google Cloud's

28:30

AI and machine

28:33

learning unit, and there I learned

28:35

how this technology is

28:37

impacting all industries. We were

28:40

literally working with Japanese

28:42

cucumber farmers all the way

28:44

to Fortune 500 companies, and

28:47

every sector, from health

28:49

care to finance to energy

28:52

to entertainment to e-commerce,

28:54

and seeing that sweeping impact

28:57

made me realize, wow, this

28:59

is just the beginning. We really

29:01

need to, we really

29:03

need to recognize the

29:06

responsibility of this technology

29:08

rubber hitting the road.

29:10

And we need to think deeply

29:12

about human impact at that point.

29:14

So what are some of the things

29:17

that the Institute for Human-Centered AI

29:19

works on? What are you working

29:21

on? So, we work on

29:23

three things: human-

29:25

centered research, human-centered

29:28

AI education, and

29:31

human-centered AI policy. So

29:34

in research we are

29:36

the leading institute,

29:39

ourselves as well as in the

29:41

academic world, that gives

29:44

grants and builds

29:46

interdisciplinary AI research.

29:49

For example, in HA-

29:51

I we have a digital

29:53

economy lab that studies

29:56

AI's economic impact; we

29:58

have a center on foundation

30:00

models that studies the

30:03

large language models and AI

30:05

and their impact. Then

30:07

we have neuroscience projects that look

30:10

at how we can combine brain

30:12

sciences with AI.

30:14

We have a lot of

30:17

interdisciplinary research. In education, we

30:19

put a lot of

30:21

focus on baking an ethical

30:23

framework into computer science

30:25

curriculum. We put a

30:28

lot of emphasis on

30:30

educating our civil society

30:32

like congressional staffers, like

30:34

journalists, like business leaders,

30:37

and create an ecosystem of knowledge. And

30:39

in policy, we put a lot

30:41

of emphasis on thought

30:44

leadership and public discourse,

30:46

forums for discussing

30:48

AI's policy implications, and

30:50

also advocating for important

30:52

policy changes, for example

30:54

public-sector investment in

30:57

AI. I get the

30:59

impression that there is a big effort on your

31:01

part to make sure that the

31:03

motivation for working on A-

31:06

I, developing AI further,

31:08

is that it benefits humanity.

31:10

Yes. Because there's a tendency,

31:12

I guess, for AI

31:14

to be considered something that competes with

31:16

humans, rather than assisting humans. Yeah,

31:19

this really bothers me because I

31:22

think we need to be very

31:24

clear what our relationship with tools

31:26

is. AI is a

31:29

tool, it's a very powerful

31:31

tool. Humanity has

31:33

had its struggles with

31:36

the relationship between us and

31:38

the tools. But it's important

31:40

to recognize that we

31:42

should have the narrative, we

31:44

should have the agency in

31:47

responsibly creating, using, and

31:49

governing that tool. So this

31:51

thing about letting

31:53

AI compete with us, or letting

31:56

AI take care of us, or

31:58

letting AI do the whole

32:00

thing, is, no. How

32:02

how I see the technology

32:04

is: it's wrong to give

32:06

agency to AI. It's

32:08

important we actually take that

32:10

agency. So people like me,

32:12

I'm a technologist. I should

32:14

feel responsible for what I

32:17

build. And in

32:19

the meantime, I hope

32:21

that business leaders also feel

32:23

responsible. I hope

32:25

civil society feels responsible.

32:27

We have to recognize that

32:30

agency and responsibility. You

32:32

seem to put a lot of emphasis on enabling

32:35

the people who create AI, bigger

32:38

and better AIs, to do it,

32:40

more often

32:42

than not, with a

32:45

recognition of the importance of serving

32:47

people, and catching

32:49

it in time, to

32:52

tame it before it accelerates out

32:54

of our control. You

32:56

tell me Allah lay I'm definitely hear

32:59

my own voice more than anyone's,

33:01

but I

33:05

do feel I'm

33:08

in this uphill battle of

33:10

trying to communicate AI

33:12

in a responsible way.

33:15

I feel sometimes our

33:17

airwaves are filled

33:19

with the sound of:

33:22

AI is gonna dominate us, AI

33:25

is gonna do things to us, we

33:27

are screwed because

33:29

AI, blah, blah, blah,

33:31

and I really wanna just stand

33:34

on top of the mountain and

33:36

shout that: let's

33:39

recognize AI as a

33:41

tool we collectively made and

33:44

will need to collectively apply

33:47

and govern. So it

33:50

truly is our own

33:52

responsibility to do this right.

33:54

Not AI's. Before

34:01

we end our conversation, I wanted

34:03

to ask you about something I saw

34:05

in the book repeatedly the importance

34:08

to this whole field of people

34:10

who were born outside the

34:12

United States, the value of

34:14

immigrants to our country in general,

34:16

but certainly to this field. And

34:19

I think your nonprofit organization

34:22

of AI for all is

34:25

working on that. Well,

34:27

a repeated theme in my book

34:30

is the recognition of people of all

34:32

walks of life. That

34:35

includes people of different gender, it

34:38

includes people of all kinds of

34:40

countries' origins. You know, I think

34:43

most of the people who have helped

34:45

me come

34:47

from another country. And

34:49

I also believe that if

34:51

we want to make this technology

34:53

human centered, we have to recognize

34:56

the human diversity of

34:58

the makers of this technology, as

35:00

well as who's going to use

35:02

and deploy it. So

35:04

with that mindset, I

35:06

did co-found this

35:09

nonprofit called AI for

35:11

All that focuses on

35:13

K-12 education of

35:16

AI and to

35:18

try to lift tomorrow's leaders

35:20

from all walks of life,

35:22

from all backgrounds, whether you're

35:25

in rural America, or if

35:27

you're in inner city America,

35:30

or if you are a young

35:32

woman, or if you're people of

35:34

different color

35:36

and different backgrounds. So

35:38

that's the goal. But

35:41

the bottom line is, if

35:43

AI is going to change all of us

35:45

tomorrow, because it's going to impact

35:47

our world, I want to make sure

35:49

the people who are changing AI represent

35:51

all of us. That's

35:54

well said. And in the process, you're

35:57

creating an unknown number of Fei-Fei

36:00

Lis, that'll get

36:02

us there. I

36:04

wish I had more time to talk with you. I

36:08

have hours of things to ask you about

36:10

and learn from. But before we

36:12

end every show, we

36:14

have seven quick questions. Sure.

36:17

Okay. Of all the things there

36:19

are to understand, what do

36:21

you wish you really understood? Of

36:24

all the things in the world? Of all the

36:26

things, not necessarily related to your work, could

36:29

be, whatever it is. My

36:32

children. Ha ha ha ha

36:34

ha ha. Ha ha ha ha. Ha

36:36

ha ha ha. Well,

36:39

remind me of my good friend, Steve

36:41

Strogatz, the mathematician who said he wished

36:43

he understood his dog Murray. But

36:46

yours is even more useful. Well,

36:49

I don't have a puppy. Ha ha ha

36:51

ha ha. So,

36:54

how do you tell someone

36:56

they have their facts wrong? That's

36:59

a great question. I

37:02

wanna say it depends on who they are. But

37:05

I guess I'll start by

37:08

I respectfully disagree. Okay,

37:11

let's go to the next one. What's the

37:13

strangest question anyone has ever asked you? The

37:17

strangest question. Oh,

37:20

I got one that's good. What

37:22

is the favorite category in

37:24

ImageNet? Oh, that's interesting.

37:27

What is the favorite? Is there

37:29

one? It is so hard to answer. I

37:33

would say I constantly take a lot

37:35

of joy in

37:39

browsing the pictures of the

37:41

category of wombats. Ha

37:43

ha ha ha ha. Wait

37:46

a second. Why is

37:49

that? Well, yeah, I

37:51

don't know. I love wombats.

37:54

Ha ha ha ha ha. Ha ha ha ha ha.

37:58

Well, that's not only the strangest question. It's

38:00

the strangest answer I ever got. Okay, I'm glad

38:02

you think that way. How

38:06

do you handle a compulsive talker? I

38:11

stay silent. And

38:13

you stay put. It depends. If

38:16

I'm rushing somewhere else, I need

38:18

to extract myself.

38:23

Okay, let's say you're sitting at a dinner

38:25

table next to someone you don't know, never

38:27

met before. How do you

38:30

begin a really genuine conversation? I

38:34

actually love those opportunities because it's such

38:36

a curious way of learning. I guess

38:38

I would ask what do you feel

38:40

so excited about these days? Hmm,

38:43

yeah, great. Goes right to the emotion. What

38:47

gives you confidence? Humanity.

38:51

This is why as

38:53

an AI scientist, I

38:56

find it strangely, I use

38:58

the word human so much these

39:00

days because humanity

39:03

is flawed. I'm

39:05

going to admit it. But

39:08

we're also incredibly resilient.

39:11

Our arc of history with

39:13

all its ups and downs

39:15

is bent towards goodness. At

39:17

least we want

39:20

to. I believe in Dr. King.

39:24

The arc of history has bent towards justice.

39:27

It's not just justice. For

39:29

me, there's

39:31

goodness in humans. So I do

39:34

believe in humanity. I have confidence

39:36

in humanity. Doesn't mean I

39:38

have confidence in every single individual. That

39:43

opens up a whole other conversation. We'll get to

39:45

it sometime when we meet. Last

39:48

question. What book changed

39:50

your life? Great

39:52

question. So

39:55

many, right? The

39:58

nerdy one, I would say. Home and let

40:01

service. Man. That's

40:04

great. I do wish

40:06

we could talk longer. You're

40:08

an extraordinary person, and with the

40:10

things you do, it's good we have you

40:13

on the planet.

40:15

Well, like I said,

40:17

I think at this moment communicating

40:19

AI is part of my work,

40:21

because I think humanity so

40:24

needs honesty

40:26

and authenticity in

40:29

talking about AI. I

40:31

worry that if we

40:33

don't, if we're not

40:36

honest in talking about AI, we're

40:38

not doing justice to the society,

40:40

to the,

40:42

to our community. And

40:45

being honest involves

40:47

talking about the great

40:49

benefits that'll probably come

40:51

and also the

40:53

potential danger to do harm rather than

40:56

good, and you don't want to

40:58

emphasize one at the disadvantage of the

41:00

other. Yeah. And you

41:02

don't wanna be hyperbolic,

41:04

you wanna be illuminating, nuanced, and thoughtful. Which

41:06

means you're no fun for

41:08

the news. That's right,

41:10

what a disaster.

41:13

Well, meeting

41:15

you was so much pleasure, I

41:17

think.

41:27

has been Clear and Vivid, at least I hope

41:29

so. My thanks to

41:31

the sponsors of this podcast and to

41:33

all of you who support our

41:35

show on Patreon. You keep Clear

41:37

and Vivid up and running, and

41:39

after we pay expenses, whatever is

41:41

left over goes to the Alda

41:43

Center for Communicating Science

41:45

at Stony Brook University. So your

41:47

support is contributing to the better

41:49

communication of science. We're very grateful.

41:53

Fei-Fei Li is the inaugural Sequoia

41:55

Professor in the Computer Science

41:57

Department at Stanford University and

42:00

Co-Director of Stanford's Human-Centered

42:02

AI Institute. During

42:05

her sabbatical from Stanford from January

42:07

two thousand and seventeen to September

42:09

two thousand and eighteen, she was

42:11

vice president at Google and

42:13

served as chief scientist of Google

42:15

Cloud. Her development

42:17

of ImageNet ushered in

42:19

the age of big data

42:22

in AI, laying the foundation

42:24

for chatbots like

42:26

ChatGPT and its successors. Her

42:28

wonderful book is The Worlds

42:30

I See: Curiosity, Exploration, and

42:32

Discovery at the Dawn of

42:34

AI. This episode

42:36

was edited and produced by

42:38

our executive producer Graham Chedd,

42:41

with help from our associate

42:43

producer James who may or

42:45

publicist is Sarah Hill, our

42:47

researcher is Elizabeth Haney,

42:49

and the sound engineer is Eric

42:51

have won. The music is

42:53

courtesy of the Stefan Thirty

42:55

trio. Next

43:06

in our series of conversations I

43:08

talk with Eric Schmidt. Being the

43:10

CEO of Google from two thousand

43:12

and one to two thousand eleven,

43:14

followed by four years as Google's

43:16

executive chairman, has given him a

43:18

bird's-eye view of the development

43:20

of AI. He has many concerns

43:22

about the harm that AI

43:24

can cause. He's especially worried about

43:26

AI-generated deepfakes, but

43:28

he also sees benefits, benefits that

43:30

today we can scarcely imagine. For

43:33

instance, while chatbots work by

43:36

predicting the next word, there are

43:38

many, many examples where predicting

43:40

the next word is also a technique

43:42

that can be used to predict the

43:44

next genome, the next

43:47

protein, and it uses the same

43:49

principles. So what does this

43:51

mean? How about better batteries?

43:54

How about more efficient

43:56

energy distribution? How about

43:58

better carbon management? Climate

44:01

change alone, one of the greatest

44:03

dangers to humanity in the

44:05

long run will be materially improved

44:07

by this. Plastics, paints,

44:10

pollutants of one kind or

44:12

another. We're going to look back

44:14

on this period and say we

44:16

were so ignorant because we were

44:18

using such simple materials,

44:21

components and so forth in our

44:23

built environment, and this is how

44:25

progress goes on. It's great. And

44:28

all of these are happening at

44:30

a speed that is incomprehensibly fast

44:33

compared to what it was twenty

44:35

years ago, thirty years. Eric

44:38

Schmidt on the good as well as

44:40

the bad and ugly of AI. Next

44:43

time on Clear and Vivid. For

44:46

more details about Clear and Vivid

44:48

and to sign up for my newsletter,

44:51

please visit alanalda.com and you can

44:53

also find us on Facebook and

44:55

Instagram at Clear and Vivid. Thanks

44:58

for listening. Bye for now. Welcome

45:13

to your twenty twenty-three work recap. This

45:15

year you've been to one hundred and

45:18

twenty-seven sync meetings, you spent fifty-six minutes

45:20

searching for files and almost missed a

45:22

deadline. Yikes. Twenty

45:25

twenty-four can and should

45:27

sound different. With monday.com you can

45:29

work together easily, collaborate and share

45:31

data, files, and updates, so

45:33

all work happens in one place

45:35

and everyone's on the same page. Go to monday.com or

45:38

tap the banner to learn more.

45:44

The next round's on me. Started already? Yeah,

45:47

yeah, just shopping for a car. A car?

45:49

For real? Yeah, come on, Carvana makes

45:51

it super convenient to shop whenever, wherever. For real?

45:53

That's a ton of car options. Yep, and

45:56

it's real with my price range. For really real?

45:58

You can afford that with Carvana. And boom!

46:00

Just like that, I'm getting it delivered in

46:02

a couple days. For really, really real? You

46:05

just bought a car! For

46:07

real, and you just lost, my turn.

46:09

Visit carbona.com to shop for thousands of vehicles

46:11

under $20,000.
