Why don't we have better robots yet? | Ken Goldberg

Released Tuesday, 26th March 2024
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:01

TED Audio Collective. You're listening to TED Talks Daily. I'm your host, Elise Hu. It turns out that the things human babies can master, like picking up tiny blocks, are giant challenges for robots. In his 2023 talk from TEDxMarin, the roboticist Ken Goldberg takes us through how advances in AI and deep learning are leading to big strides in training robots to do even the most precise tasks, like untying tangled cables. I don't know about you, but if robots could untangle my necklaces, I'd line up for that for sure. Hear his talk after the break.

0:47

Support for TED Talks Daily comes from Capital One Bank. With no fees or minimums, banking with Capital One is the easiest decision in the history of decisions, even easier than deciding to listen to another episode of your favorite podcast. And with no overdraft fees, is it even a decision? That's banking reimagined. What's in your wallet? Terms apply. See capitalone.com/bank. Capital One, N.A. Member FDIC.

1:16

TED Talks Daily is brought to you by Progressive, where drivers who save by switching save nearly seven hundred and fifty dollars on average. Quote now at progressive.com. Progressive Casualty Insurance Company and affiliates. National average savings of seven hundred forty-four dollars by new customers surveyed who saved with Progressive between June 2022 and May 2023. Potential savings will vary.

1:38

This show is brought to you by Schwab. With Schwab Investing Themes, it's easy to invest in ideas you believe in, like electric vehicles, renewable energy, water sustainability and more. Choose from over forty themes, buy as-is, or customize the stocks in a theme to fit your goals. Learn more at schwab.com/thematicinvesting.

1:58

I have a feeling most people in this room would like to have a robot at home. It would be nice to have it do the chores and take care of things. Where are these robots? What's taking so long? I mean, we have our tricorders, we have satellites, we have laser beams. But where are the robots? I mean, okay, we do have some robots in our homes, but they're not really doing anything that exciting.

2:31

Okay. So, I've been doing research at UC Berkeley for thirty years with my students on robots, and in the next ten minutes I'm going to try to explain the gap between fiction and reality. In the field, there's something that explains this that we call Moravec's paradox, and it's this: what's easy for robots, like being able to pick up a large, heavy object, is hard for humans. But what's easy for humans, like being able to pick up some blocks and stack them up, turns out to be very hard for robots. And this is a persistent problem, so the ability to grasp arbitrary objects is a grand challenge for my field.

3:16

Now, by the way, I was a very klutzy kid. I would drop things; every time someone would throw me a ball, I would drop it. I was the last kid to get picked on a basketball team. I'm still pretty klutzy, actually, but I have spent my entire career studying how to make robots less clumsy.

3:39

Now let's start with the hardware. So, the hand. It's a lot like our hand, and it has a lot of motors and a lot of tendons and cables, so it's unfortunately not very reliable. It's also very heavy and very expensive, so I'm in favor of very simple hands. This one has just two fingers, known as a parallel-jaw gripper. It's very simple, it's lightweight and reliable, and it's very inexpensive. Now, actually, in industry there's an even simpler robot gripper, and that's the suction cup, which only makes a single point of contact. So again, simplicity is very helpful in our field.

4:19

So now let's talk about the software, and this is where it gets really, really difficult, because of a fundamental issue, which is uncertainty. There's uncertainty in the control. There's uncertainty in the perception. And there's uncertainty in the physics. What do I mean by the control? Well, if you look at a robot's gripper trying to do something, there's a lot of uncertainty: the cables and the mechanisms cause very small errors, and these can accumulate and make it very difficult to manipulate things.
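To make that concrete, here is a minimal sketch of how tiny per-joint errors compound at the gripper, assuming a toy three-link planar arm; the link lengths and the half-degree noise figure are illustrative stand-ins, not numbers from the talk.

```python
import numpy as np

# Illustrative 3-link planar arm; link lengths and noise level are made up.
LINKS = np.array([0.30, 0.25, 0.15])   # meters
JOINT_NOISE_STD = np.radians(0.5)      # half a degree of backlash/cable stretch per joint

def end_effector(angles):
    """Forward kinematics: an error at one joint rotates everything downstream."""
    cumulative = np.cumsum(angles)
    return np.array([np.sum(LINKS * np.cos(cumulative)),
                     np.sum(LINKS * np.sin(cumulative))])

rng = np.random.default_rng(0)
target_angles = np.radians([30.0, 45.0, -20.0])
ideal = end_effector(target_angles)

# Repeat the same commanded motion many times with small per-joint errors.
errors = []
for _ in range(1000):
    noisy = target_angles + rng.normal(0.0, JOINT_NOISE_STD, size=3)
    errors.append(np.linalg.norm(end_effector(noisy) - ideal))
errors = np.array(errors)

print(f"mean tip error: {errors.mean()*1000:.1f} mm, worst case: {errors.max()*1000:.1f} mm")
```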

4:51

Now, in terms of the sensors: yes, robots have very high-resolution cameras, just like we do, and that allows them to take images of scenes in traffic, or in a retirement center, or in a warehouse, or in an operating room. But these don't give you the three-dimensional structure of what's going on. So recently there was a new development called lidar, and this is a new class of cameras that use light beams to build up a three-dimensional model of the environment. These are fairly effective; they really were a breakthrough in our field. But they're not perfect: if the objects have anything that's shiny or transparent, the light acts in unpredictable ways, and you end up with noise and holes in the images. So these aren't really the silver bullet. And there's one other form of sensor out there now, called a tactile sensor, and these are very interesting. They use cameras, actually, to image the surfaces as they make contact, but these are still in their infancy.

5:54

Now, the last issue is the physics. If we take a bottle on a table and we just push it, and a robot pushes it in exactly the same way each time, the bottle ends up in a very different place each time. And why is that? Well, it's because it depends on the microscopic surface topography underneath the bottle as it slides. For example, if you put a grain of sand under there, it would react very differently than if there weren't a grain of sand, and we can't see whether there's a grain of sand, because it's under the bottle. It turns out that we can predict the motion of an asteroid a million miles away far better than we can predict the motion of an object as it's being grasped by a robot.
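As a back-of-the-envelope illustration of that sensitivity, here is a toy Monte Carlo sketch: the robot delivers the same push every time, but the friction under the bottle varies in ways it can't observe. The friction distribution and the occasional grain of sand are invented for the example.

```python
import numpy as np

# Toy model: a bottle pushed with the same speed every time slides a distance
# d = v0**2 / (2 * mu * g); mu depends on microscopic surface details we can't see.
G = 9.81
PUSH_SPEED = 0.5          # m/s, identical on every trial
rng = np.random.default_rng(1)

def sliding_distances(n_trials=10_000):
    mu = rng.normal(0.30, 0.05, size=n_trials)      # unseen friction variation
    grain_of_sand = rng.random(n_trials) < 0.05      # occasional debris under the bottle
    mu = np.clip(mu + 0.4 * grain_of_sand, 0.05, None)
    return PUSH_SPEED**2 / (2.0 * mu * G)

d = sliding_distances()
print(f"slide distance: {100*d.mean():.1f} cm on average, "
      f"but anywhere from {100*d.min():.1f} to {100*d.max():.1f} cm")
```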

6:39

Now let me give you an example. Put yourself here into the position of being a robot. You're trying to clear the table. Your sensors are noisy and imprecise. Your actuators, your cables and motors, are uncertain, so you can't fully control your own gripper. And there's uncertainty in the physics, so you really don't know what's going to happen. So it's not surprising that robots are still very clumsy.

7:04

So there's one sweet spot for robots, and that has to do with e-commerce. This has been growing as a huge trend, and during the pandemic it really jumped up. I think most of us can relate to that: we started ordering things like never before. This trend is continuing, and the challenge is that to meet the demand, we have to be able to get all these packages delivered in a timely manner.

7:31

And the challenge is that every package is different and every order is different. So you might order some nail polish and an electric screwdriver, and those two objects are going to be somewhere inside one of these giant warehouses. What needs to be done is for someone to go and find the nail polish, then go and find the screwdriver, bring them together, put them into a box and deliver them to you. So this is extremely difficult, and it requires grasping. Today this is almost entirely done by humans, and humans don't like doing this work; there's a huge amount of turnover. So it's a challenge, and people have tried to put robots into warehouses to do this work, but it hasn't worked out all that well.

8:14

But my students and I, about five years ago, came up with a method using advances in deep learning to have a robot essentially train itself to be able to grasp objects. The idea was that the robot would do this in simulation. It was almost as if the robot were dreaming about how to grasp things and learning how to grasp them reliably. This is a system called Dex-Net that is able to reliably pick up objects that we put into these bins in front of the robot. These are objects it's never been trained on.
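The published Dex-Net pipeline is far more involved, but a minimal sketch of the underlying idea, training a grasp-quality network purely on simulated examples and then using it to rank candidate grasps, could look roughly like this. The toy data generator, the success rule and the tiny network are stand-ins invented for illustration, not the actual system.

```python
import torch
import torch.nn as nn

# Stand-in simulator: produce a random 32x32 "depth patch" around a candidate grasp
# and a success label from a hidden rule. (Real systems compute physics-based grasp
# metrics on object meshes instead of using a made-up rule like this.)
def simulate_grasps(n):
    patches = torch.rand(n, 1, 32, 32)
    flatness = patches[:, 0, 12:20, 12:20].std(dim=(1, 2))   # surface under the jaws
    labels = (flatness < 0.27).float()                        # pretend flat == graspable
    return patches, labels

model = nn.Sequential(                 # tiny grasp-quality CNN
    nn.Conv2d(1, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 5 * 5, 64), nn.ReLU(), nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):                # "dreaming" about grasps in simulation
    x, y = simulate_grasps(128)
    loss = loss_fn(model(x).squeeze(1), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At run time, score many candidate grasps on the current scene and take the best one.
candidates, _ = simulate_grasps(64)
best = torch.sigmoid(model(candidates).squeeze(1)).argmax()
print("execute candidate grasp", int(best))
```

The point of the sketch is only the shape of the loop: generate grasps in simulation, score them with a learned model, and execute the highest-scoring one on the real robot.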

8:45

And it's able to pick these objects up and reliably clear these bins over and over again. We were very excited about these results, and the students and I went out to form a company. We now have a company called Ambi Robotics, and what we do is make machines that use the algorithms and software we developed at Berkeley to pick up packages. This is for e-commerce: the packages arrive in large bins, all different shapes and sizes, and they have to be picked up, scanned and put into smaller bins depending on the zip code. We now have eighty of these machines operating across the United States, sorting over a million packages a week.

9:27

Now, that's some progress, but it's not exactly the home robot that we've all been waiting for. So I want to give you a little bit of an idea of some of the new research that we're doing to try to make robots more capable in homes. One particular challenge is being able to manipulate deformable objects: strings in one dimension, two-dimensional sheets, and in three dimensions, things like fruits and vegetables.

9:54

So we've been working on a project to untangle knots. What we do is we take a cable and we put it in front of the robot, and it has to use a camera to look down, analyze the cable, figure out where to grasp it and how to pull it apart to be able to untangle it. This is a very hard problem, because the cable is much longer than the reach of the robot, so it has to go through and manage the slack as it's working. And I would say this is doing pretty well: it's gotten up to about eighty percent success when we give it a tangled cable and ask it to untangle it.
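The real untangling system uses learned perception, but a minimal sketch of just that first step, finding the cable in an overhead image and proposing a grasp point, might look like this; the background-subtraction threshold and the farthest-from-centroid heuristic are simplifications chosen for illustration, not the method from the project.

```python
import numpy as np

def find_cable_pixels(overhead_gray, background_gray, diff_thresh=30):
    """Segment the cable by differencing against an image of the empty table."""
    diff = np.abs(overhead_gray.astype(int) - background_gray.astype(int))
    return np.argwhere(diff > diff_thresh)        # (row, col) coordinates of cable pixels

def pick_grasp_point(cable_pixels):
    """Crude heuristic: grasp near one extreme end of the cable mask."""
    if len(cable_pixels) == 0:
        return None
    centroid = cable_pixels.mean(axis=0)
    end = cable_pixels[np.linalg.norm(cable_pixels - centroid, axis=1).argmax()]
    return int(end[0]), int(end[1])

# Toy example: a diagonal "cable" drawn onto an otherwise empty 100x100 image.
background = np.full((100, 100), 120, dtype=np.uint8)
scene = background.copy()
for i in range(20, 80):
    scene[i, i] = 200
print(pick_grasp_point(find_cable_pixels(scene, background)))
```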

10:28

The other one is something I think we're also all waiting for: a robot to fold the laundry. Now, roboticists have actually been looking at this for a long time, and there has been some research done on it, but the problem is that it's very, very slow, about three to six folds per hour. So we decided to revisit this problem and try to have a robot work very fast. One of the things we did was teach the robot to throw, so it could swing the fabric the way we do when we're folding. Then we also used friction, in this case to drag the fabric to smooth out some wrinkles. And then we borrowed a trick known as the two-second fold. You might have heard of this. It's amazing, and the robot is doing exactly the same thing; it just takes a little bit longer. So we're making some progress.

11:22

And the last example is bagging. You all encounter this all the time: you go to a corner store and you have to put something in a bag. Now, again, it's easy for humans, but it's actually very, very tricky for robots, because humans know how to take the bag and how to manipulate it, while for a robot the bag can arrive in many different configurations, and it's hard to tell what's going on and hard for the robot to figure out how to open up that bag. So what we did was have the robot train itself: we painted one of these bags with fluorescent paint, we had fluorescent lights that would turn on and off, and the robot would essentially teach itself how to manipulate these bags.
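One way to read that setup is that the fluorescent paint gives the robot training labels for free: the painted part of the bag lights up whenever the fluorescent lamp is on. A rough sketch of that labeling trick, assuming each scene is captured once under normal light and once under the fluorescent lamp (the image pairing and the brightness threshold are illustrative choices, not the exact rig), might be:

```python
import numpy as np

def bag_rim_mask(uv_on, uv_off, glow_thresh=60):
    """Pixels that brighten sharply under the fluorescent lamp are the painted rim."""
    return (uv_on.astype(int) - uv_off.astype(int)) > glow_thresh

def make_training_pair(uv_on, uv_off):
    """The normal-light image is the network input; the glow mask is the label."""
    return uv_off, bag_rim_mask(uv_on, uv_off)

# Toy example: the painted rim occupies one band of the image and glows when lit.
rng = np.random.default_rng(2)
uv_off = rng.integers(40, 90, size=(64, 64)).astype(np.uint8)   # ordinary camera view
uv_on = uv_off.astype(int).copy()
uv_on[20:24, :] += 120                                           # rim fluoresces

image, label = make_training_pair(uv_on, uv_off)
print("labeled rim pixels:", int(label.sum()))                   # 4 rows * 64 cols = 256
```

In a setup like this, the normal-light image is what the deployed robot would actually see, and the glow-derived mask is the supervision signal it trains against.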

12:00

And so we've got it now up to the point where we're able to solve this problem about half the time. So it works, but I'd say we're still not quite there yet. So I want to come back to Moravec's paradox: what's easy for robots is hard for humans, and what's easy for us is still hard for robots.

12:22

We have incredible capabilities; we're very good at manipulation. But robots still are not there. I want to say, I understand. It's been sixty years, and we're still waiting for the robots that the Jetsons had. Why is this so difficult? We need robots because we want them to be able to do the tasks that we can't do, or that we don't really want to do. But I want you to keep in mind that these robots, they're coming. Just be patient, because we want the robots, but the robots also need us, to do the many things that robots still can't do. Thank you.

13:10

This show is brought to you by Schwab. With Schwab Investing Themes, it's easy to invest in ideas you believe in, like electric vehicles, renewable energy, water sustainability and more. Choose from over forty themes, buy as-is, or customize the stocks in a theme to fit your goals. Learn more at schwab.com/thematicinvesting.
