Deus Ex Machina

Released Thursday, 4th July 2019

Episode Transcript


0:04

Sleepwalkers is a production of iHeartRadio

0:06

and Unusual Productions. Every

0:11

day I go to work, I use my car. Every

0:13

time I talk to somebody who is more than a few feet

0:15

away. I use my phone, so I'm using

0:18

certain technologies that extend

0:20

me beyond my physical normal range. Yes,

0:22

so maybe I'm a cyborg. That's

0:26

Sebastian Thrun speaking. He founded

0:29

Google X and he's appeared on Sleepwalkers

0:31

a few times. As far as we know, he's

0:33

not a cyborg. If somebody took

0:36

all technology away from me, like refrigeration

0:38

and everything, would I not survive? I

0:41

have no clue. I guess I'd end up catching deer in my

0:43

neighborhood. But yes, I would be pretty

0:45

miserable. You said something which I found

0:48

quite interesting. You said, I would love

0:50

to directly interface my brain to all

0:52

the computers in the world, so I could be truly

0:55

superhuman. I would know everything, every

0:57

name, every phone number. If there was a

0:59

button you could press to do that right

1:01

now, would you have any hesitation pressing

1:03

it? I would press it in a microsecond.

1:07

Why? I am myself a product of data.

1:10

I went to college, I read probably

1:12

four or five books in that time, and

1:14

I built a scientific career on the shoulders of others.

1:16

That data empowered me to become who I am.

1:19

If you took that data away from me and said, Sebastian,

1:21

I'm going to put you into China in the seventies, you're gonna

1:23

work in the field, I would be less

1:25

of a Sebastian than I am today, because I

1:27

would probably know how to plow a field. But that would be

1:30

pretty much it. To Sebastian's

1:32

regret, there's been a hard limit on how

1:34

much data he can actually consume: his

1:36

hardware, his physical body, has

1:39

held him back. Unfortunately,

1:41

the human I/O, the input and output with the

1:44

ears and eyes and smell and so on, voice

1:46

are still very inefficient. If I

1:48

could accelerate the reading

1:50

of all the books into my brain, oh my god, that

1:52

would be so awesome. We

1:55

talked in the last episode about brain computer interfaces

1:57

in the medical world, helping disabled patients

2:00

like Jan Scheuermann move prosthetic

2:02

limbs with their minds. Today, we

2:04

look at transhumanism, becoming

2:07

cyborgs. What happens when

2:09

we merge with computers not just to restore

2:11

function, but to upgrade and enhance

2:13

ourselves. And as we head into this

2:15

new future, who might get left

2:17

behind? I'm Oz Woloshyn,

2:20

this is Sleepwalkers. So

2:34

Karah, Sebastian was talking about cyborgs

2:37

and whether we're already cyborgs because

2:39

we're so reliant on our technology, our cars,

2:41

and especially our phones, which

2:43

become kind of like a second brain, albeit

2:46

a very distracting one. Yeah. I think what he was

2:48

saying is that if technology is required

2:50

to keep us alive, that merger

2:52

between humans and machines has already happened.

2:55

We're already there. That's right. But let's be real,

2:57

that's not really what people think about when they

2:59

hear the word cyborg. They think about

3:01

more extreme cases like Sebastian having

3:03

his brain directly connected to all the information

3:06

in the world ever, you know, and there are already

3:08

some people working on that. There's this guy named Elon

3:10

Musk that you might have heard of, and he

3:13

actually, one of his new companies is called

3:15

Neuralink, and what he's trying to do is

3:18

create a mesh that can be inserted

3:20

in the brain that connects the brain to

3:23

a computer. And I think part of

3:25

his goal is to make humans

3:28

competitive enough to take on machines

3:32

if and when they outsmart us. Well, this idea

3:34

of humans and machines merging

3:36

to create something superior is

3:38

of great interest throughout Silicon Valley, and

3:41

one person who kind of wrote the book on this transhumanism

3:44

is Yuval Noah Harari. He is a historian

3:47

and a futurist and the author of Homo Deus,

3:50

a book that explores the future of our humanity,

3:53

and his writing is a major source of inspiration, not

3:55

just in Silicon Valley but for this podcast. So

3:57

I was rather excited when he agreed to have a conversation

3:59

with us. Every generation

4:01

in history thinks that they are on the

4:04

verge of the apocalypse, and

4:06

usually they are wrong. But as

4:08

a historian, I have the sense that we are on the

4:10

verge of the most important revolution

4:13

since the very beginning of life.

4:18

Yuval believes that right now we're actually looking

4:20

at the end of history as we know it, the

4:22

dawning of a new era. Something

4:25

has changed. I wanted

4:27

to know what. We're

4:29

really deciphering, the underlying

4:32

rules of the game of life and

4:35

now acquiring the ability to change

4:37

these rules. I mean, previously, the

4:40

expectations of the apocalypse were

4:42

directed outwards towards

4:44

the gods, towards some external

4:46

entity that will come and intervene

4:49

and change everything. Now we

4:52

don't expect an external entity to

4:54

do it. We expect ourselves

4:56

to do it. And when you look

4:59

really calmly and objectively at

5:01

the advances that science

5:03

has been making, it doesn't

5:06

sound so far fetched as

5:08

Yuval would have it. We no longer look to the heavens

5:10

for answers, because we ourselves are becoming

5:13

gods, and as technology improves,

5:15

we're getting better and better at finding answers

5:17

by looking inwards. Literally. One

5:21

of the biggest misunderstandings about the whole

5:23

AI revolution is that many people

5:25

see it as a revolution coming

5:28

out of computer science, but

5:30

actually it comes equally

5:33

from the life sciences, from biology,

5:35

from brain science. It's not enough

5:38

that computers are becoming smarter. It's

5:41

also essential that we view

5:44

humans as algorithms.

5:47

If our feelings are not the product

5:50

of some kind of extremely complicated

5:52

algorithm in the brain, then

5:54

no matter how smart computers

5:57

will become, many things won't happen. For

6:00

something like self-driving cars to

6:02

navigate a street full of pedestrians,

6:05

the car must be able to understand

6:07

human behavior and human feelings.

6:10

And if you think that human feelings are

6:13

the result of some spiritual

6:16

soul or something which can never

6:18

be deciphered, then you can't

6:21

have self driving cars. In

6:23

order to create so many of the technologies

6:25

we talk about on Sleepwalkers, from self

6:28

driving cars to targeted ads to parole

6:30

algorithms, we have to assume that human

6:33

behavior can be modeled, that our habits,

6:35

routines, even our personalities could

6:38

be expressed as mathematical formulas.

6:41

But even if you don't take things that far,

6:44

Yuval's key point about the transformative

6:46

potential of combining computer science

6:48

and biology is shared by people

6:50

in the business of building the future. You

6:53

may remember Arati Prabhakar from earlier

6:55

in the season. Arati ran the Defense

6:57

Advanced Research Projects Agency, DARPA.

7:00

But we're living in a time in

7:02

which the biological sciences

7:04

are converging with

7:07

information technology. At

7:09

DARPA, we are always looking for these

7:12

areas where we see these seeds

7:14

of technological surprise, and today I

7:16

think this intersection of biology

7:18

with technology is one of the most

7:20

fertile seed beds for surprise.

7:23

And when these advances are coming from the ability

7:26

to combine biology with computer science,

7:28

it may be hard for us to predict what surprises

7:31

might bloom. Yuval has argued

7:33

that advances in AI and gene editing

7:36

will lead to new forms of life. We

7:39

are now gaining the ability to break

7:41

or bend or change the rules of life.

7:43

We are about to create the first inorganic

7:46

life forms after four billion years

7:48

of evolution. We can't really

7:50

imagine what the new entities

7:53

would look like because even the wildest

7:55

dreams are still subject to natural

7:58

selection and organic biochemistry. So

8:02

according to Yuval, Karah, it's rather

8:04

difficult to say what might come next because

8:07

it's somehow out of our frame of reference. It's almost

8:09

like saying what aliens look like. And

8:11

I don't know if you ever watch movies, but in movies, aliens

8:14

and cyborgs tend to look remarkably

8:16

consistent. I actually watched

8:18

Men in Black last week, and it is alien

8:21

central casting. You know, a lot

8:23

of representations of cyborgs

8:25

tend to have this element of cliche. You know, there's a pitiable

8:27

robot, the robot that's healed by love. You

8:30

can think of WALL-E, C-3PO,

8:32

the Tin Man from The Wizard of Oz. The Wizard of Oz

8:34

was my favorite movie. It's kind of a fantasy film. We've

8:37

actually talked about real life people who may

8:39

meet the qualification of cyborg. Already

8:42

on this show, we spoke about Jan Scheuermann

8:44

in the last episode using a robotic arm

8:47

with her mind, but the research didn't

8:49

actually end there. In a subsequent experiment,

8:52

Jan flew a simulated plane with her

8:54

brain waves. Remember this research

8:56

was being funded by DARPA, and it raised

8:58

big questions for Arati Prabhakar, who was

9:00

running the agency at the time, about the ethics

9:03

of transhumanism. When

9:05

Jan went from moving

9:08

a prosthetic limb to controlling

9:10

a flight simulator, that was the moment

9:13

that it became visible that

9:15

these technologies that allow you to restore

9:17

function by harnessing motor control

9:20

signals also mean that

9:22

now we have a way for a human brain

9:25

to engage with the world in a

9:27

completely different modality. It

9:30

was a very eye opening moment. I think for

9:32

all of us. What did you feel in that moment?

9:34

It's creepy, it's powerful.

9:37

It makes you realize that

9:40

as hard as the technology

9:42

is, it might be the easiest part of figuring

9:45

out how we harness these capabilities

9:48

for good in the future. As

9:52

we've said, this technology remains firmly

9:54

in research labs at least for now.

9:57

Andy Schwartz is a neurobiologist at

9:59

the University of Pittsburgh who led the project

10:01

with Jan and he's very clear

10:03

that his mission was to restore function, not

10:06

enhance it. The idea was,

10:08

can we help paralyzed people coming back

10:11

from war zones regain function?

10:13

So I think that was very good motivation

10:16

for the kind of research we are doing. And despite

10:18

the huge potential of Andy's research, he's

10:20

quick to point out that the brain computer interface

10:23

he built is much less efficient than our

10:25

existing connection to the world. At

10:27

present, our bodies are the best we've got right

10:30

now. If you wanted to control the computer

10:32

with brain activity, it would be ten times

10:34

slower than it would be if you were typing with

10:37

your own fingers. Things move

10:39

so fast that we've gotten accustomed to the

10:41

idea that if you see it once, it's just going

10:43

to become pervasive. That's actually

10:46

not true of a medically invasive

10:49

procedure that involves putting implants

10:51

on the surface of the brain. We're

10:54

going to need all of that time to figure

10:56

out where we want to go with these technologies.

11:00

So it's a perfect moment to ask where

11:03

do we want to go? For

11:05

many in Silicon Valley, the principle of a more

11:07

direct interface between humans and computers

11:10

is enticing. But here's Sebastian

11:12

again. At this point, I

11:15

don't think customers would love having

11:17

surgery to get an implant in their brain just

11:19

to be a little smarter. But there

11:21

are other ways to approach the problem. You

11:24

may remember Google Glass. Sebastian

11:26

led that project when he ran Google X, and

11:28

in fact, when I first met him in twenty twelve,

11:31

he was wearing a pair. We built Google

11:33

Glass because I wanted a camera

11:35

right next to your eye, and I wanted a speaker right next

11:38

to your ear, so the computer could perceive

11:40

the same sensation, the same stuff you see

11:42

every day. I think these technologies

11:45

become closer and closer to us. When

11:48

I first met Sebastian, he was wearing

11:51

these weird Google glasses, Karah. He

11:53

kind of looked like a cyborg, and he

11:55

told me that these glasses might solve the problem

11:57

of people constantly having to look down at their

12:00

phones, because the information could be at eye

12:02

level. But that really hasn't

12:04

solved the problem of tech neck.

12:07

Tech neck? Women and men are now getting

12:09

this, like, very, you know, unsavory

12:12

under-neck fat from looking

12:14

down at their phones. Yeah, and then there's turtle

12:16

posture, which is people hunching forward because

12:18

they're sitting at their computers so much so we don't

12:20

know if these are going to have an impact

12:23

on us evolutionarily speaking, but I imagine

12:25

they might. Or thumbs if you think about

12:27

it, well, our bodies are already being changed by

12:29

our interaction with our phones. Absolutely.

12:32

There's actually, really quickly, this fake

12:34

video that was going around called Lookout, where

12:36

this guy invented a product where

12:39

your phone basically had the same camera it does,

12:41

except it would face out, so you never

12:43

had to look away from your phone. You could just look

12:45

at your life through your phone, which is essentially

12:47

what everybody is already doing well, And that was kind

12:49

of the insight behind Google Glass was what

12:52

if you could overlay kind of augmented

12:54

reality over what you saw And

12:57

there's something very promising and cool about it, but

12:59

it wasn't commercially viable because you had to

13:01

wear something on your face that made you look like a

13:03

cyborg. But for consumers, the

13:05

benefits of a computer at that proximity

13:09

just weren't enough to make up for the inconvenience

13:11

of wearing something on their face. There

13:13

is one group of people who are willing to put up with

13:15

inconvenient technology. You know, people like Jan

13:17

who had her skull opened so that she could

13:19

move an arm again. Right as Andy said,

13:22

this wasn't about letting her fly planes

13:24

with her mind. It was about restoring function.

13:27

When we come back after the break, we look at a

13:29

much less invasive technology that hopes

13:31

to be as miraculous as Jan Scheuermann's

13:33

robotic arm. Okay,

13:41

let me... do you want to test it with the

13:43

tambourine or whatever. Okay, that's

13:57

Noé Socha. He's an award-winning

14:00

Italian blues guitarist who lives in New York. When

14:02

Noé came to play for Julian and me in the studio

14:04

earlier this year, he asked that I meet

14:06

him at the bus stop. I

14:09

went to Berklee College of Music in Boston

14:11

to study jazz guitar performance. And

14:13

Noé's the real deal. While he was at Berklee,

14:15

He was given the Jimi Hendrix Award for being

14:18

the school's leading guitarist, as well as the Billboard

14:20

magazine endowed scholarship. And

14:22

now he plays a consistent roster of shows around

14:25

New York. But he's

14:27

working to reach a digital fan base as well. Every

14:29

day Noé posts a video to help build his audience.

14:32

This is how I found him. What

14:34

you wouldn't know just listening to this podcast is

14:36

that Noé also happens to be blind. I

14:39

was born three months early, so they

14:41

put me in the incubator, but there was too much oxygen.

14:44

In his infancy, Noé suffered from

14:46

retinopathy of prematurity known

14:48

as ROP. He needed surgery

14:51

to reattach his retinas and now he has

14:53

very limited vision. Totally

14:56

blind, like blind?

14:58

Technically, I guess it's called light and shades

15:01

or something. I mean, I can see colors, I can

15:03

read really large print only out of one

15:05

eye, but my vision field is small,

15:07

so I still use the cane and I still read Braille. This

15:10

is why Noé asked me to pick him up at the bus stop.

15:12

Getting to and from new destinations is

15:15

a huge pain in the ass. But here's

15:17

the thing. There are some companies that are trying to fill

15:19

the gap and make smart glasses for

15:21

people with visual impairments. Their

15:23

hope is that with the right technology, blindness

15:26

could become a thing of the past. When

15:29

we were researching for this podcast, Julian came

15:31

across a new technology company. Yeah,

15:33

after a few months, I had a pretty AI focused search

15:36

history, right, so my Instagram ads started

15:38

to reflect that, which is convenient, you know, AI

15:41

helping us make a podcast about AI. Yeah,

15:43

super considerate, right. So an

15:45

ad for eSight presented itself,

15:47

and eSight bills themselves as creators of the

15:50

world's most advanced sight-enhancing

15:52

glasses for the visually impaired. And

15:54

medical tech can be pretty expensive, but

15:57

eSight recently lowered their prices from

16:00

hundred dollars for a pair of these glasses

16:02

down to fifty what do

16:04

they call that? A price slash? All

16:06

Hans glasses must go. So

16:09

it's still pretty steep, but east Side as payment

16:11

plans and health insurance can help people out as

16:13

well. Yeah, but even so, I

16:15

mean I'd rather spend money on guitars, you

16:18

know, like that's what I do in my life. Like, why

16:20

would I spend money on something that is

16:22

still so experimental? Still,

16:25

I was curious to test out eSight's technology,

16:27

so I reached out, explained Noé's condition,

16:30

and we got an appointment. Yeah,

16:33

don't do anything crazy when you put them on, don't

16:37

start jumping around like a crazy person. Yes,

16:41

if it doesn't work, just make it up. I'm

16:45

I'm doing. The

16:47

eSight glasses basically look like Cyclops

16:50

from X Men, except without the blinding

16:52

laser beam that shoots out from them. The

16:54

glasses capture high quality video with a

16:57

camera above your nose, and then project

16:59

the video to a high definition screen in

17:01

front of each eye. The footage is enhanced

17:04

by software designed especially for people

17:06

with vision impairment. You can also zoom,

17:09

change focus, and do things like boost contrast

17:11

or go to grayscale. An eSight

17:14

rep named Nigel helped Noé get fitted with a pair,

17:16

but it can be a little hard to get used to. I

17:20

mean, I am trying to see. Isn't

17:22

it like right here where I should be looking like?

17:25

Yes, it's it's it should be right in front

17:27

of your pupil. Um, yeah, it's a little it's

17:29

a little high. Let me see if I can just

17:31

adjust that. Yeah, there you go. That's better?

17:33

Is better? Yeah? That

17:36

seems like I get more details out of things,

17:38

but I don't really know what I'm looking at. Nigel

17:40

also asked Noé to identify me. Let's

17:43

let's see if you can recognize Karah.

17:47

I see, I see it. Do you know what I

17:49

look like, Noé? Like, do you have a sense of what I look

17:51

like? I mean from what I saw when I was close

17:53

to you, you have like light hair, like light

17:55

brown or something like that. That's right, And

17:57

uh yeah, I mean it's cool. I

18:00

just don't know, you know, we have to see how

18:02

useful it would actually be, Like, I mean, I

18:04

don't think I would use it to recognize people, because I

18:06

would have no way of... I don't know what

18:08

anybody's face looks like anyway,

18:11

you know, because he has

18:13

been blind for his entire life, Noé

18:15

doesn't have a point of reference for the details

18:17

in people's faces. Imagine that

18:19

you have no idea what any of the people

18:21

in your life really look like. You

18:24

learn to identify them in other ways, like their

18:26

voices. So Noé is

18:28

ambivalent about the prospect of seeing people

18:30

clearly, but he would like it

18:33

to be easier to get around. Recognizing

18:35

people's faces would not be one of my priorities,

18:38

but reading where the bus stop is or when

18:40

it's coming will be a priority. I

18:42

mean, it's cool, it's interesting. I would just have to see how

18:45

practical it will actually be, because, I mean, here's

18:47

the thing, Like, when you're outside, it's not that you

18:49

have time to stop and figure stuff out, especially

18:51

when you're walking on the sidewalks of New York.

18:53

Like, I'd rather just use BlindSquare. Noé

18:56

first told me about BlindSquare when I picked him up

18:58

from the bus. It's just the most popular

19:01

iOS navigation app for blind people. It

19:03

describes the environment you're in, the intersection

19:06

you're crossing, and announces points of interest,

19:09

all using GPS. Noé's

19:11

point in mentioning BlindSquare to Nigel

19:13

is that it is a tool that he can work with within

19:16

his set of limitations. Noé spent

19:18

his whole life navigating the world one way,

19:21

his way, and eSight would actually change

19:23

that, maybe more than he would want. I've

19:26

never really seen a subway map anyway, so I

19:28

would have to get a hang of how it works

19:30

first, and then because

19:32

your brain needs to recognize it and then know what it's looking at,

19:36

that's hard for human beings to comprehend.

19:38

I think, yeah, it's like imagining a new

19:40

color, right. Yeah,

19:43

So Julian, eSight worked, but it wasn't

19:46

perfect for Noé. And I think

19:48

it's important to note that eSight is actually a really

19:50

cool technology, whether or not it worked for him.

19:52

Nigel mentioned that many elderly people who suffer

19:55

from macular degeneration, losing their sight

19:57

due to old age, have been able to see their grandchildren

19:59

for the first time, which is really cool. I mean

20:01

it does restore sight for a lot of people who have seen before.

20:04

Noé said the sacrifices he makes to get around

20:06

to his gigs as a blind person aren't so

20:08

bad, but there is one part of

20:11

his career that's less easy to navigate. Like

20:13

many musicians, Noé uses his social

20:16

media channels as his primary way of connecting

20:18

to his current fans and hopefully reaching

20:20

future ones. The problem is it's

20:23

pretty hard to reach your fans on Instagram

20:25

if you're blind. Instagram needs to listen

20:28

to me right now. Basically,

20:30

the issue is this: editing

20:33

videos and posting the part

20:35

that you want is not accessible on Instagram.

20:37

You know, thankfully the Apple app is accessible.

20:40

So I edit it on the iPhone app because there

20:43

are, like, sliders, and it tells you, like, current position:

20:45

zero seconds of like two minutes and thirty seconds,

20:47

and so basically you slide with your finger up and down.

20:50

If it wasn't for that, I wouldn't be able to post on Instagram.

20:52

I would only be able to post the first minute, which

20:54

is not what I always want to post. On the songs, sometimes

20:56

I post some other part, you know. But

20:59

Instagram hasn't implemented that because, you know,

21:01

blind people aren't supposed to be on Instagram. I've

21:06

never thought about this. Digital technology

21:08

is almost exclusively visual. I

21:10

don't think about how easy it is for

21:13

me to use an iPhone every day. I just use it.

21:15

And when you have these technologies like iPhones

21:18

being designed by people in Silicon Valley who fit

21:20

pretty healthily within the norm, it's not really

21:22

doing Noé any favors. And when

21:24

we focus on the next big thing in tech, we can

21:26

leapfrog past uses for

21:29

these existing technologies. So we say, okay,

21:31

well, when you're starting next big unicorn, we want

21:33

to do these miraculous, gigantic

21:35

ideas. But you don't think about subway maps. You don't think

21:37

about things like blind Square unless you

21:39

actually need that app. And I think

21:41

it's important to say we're not anti-technology

21:44

or anti unicorn. Those things are important,

21:47

you know. Noé actually said something really funny, which is

21:49

that, like, he would love a self-driving car. Of course

21:51

he would so we all benefit

21:53

from these moonshot innovations,

21:56

but it's really important I think that

21:59

everyone thinks about who is

22:01

forgotten. When we innovate too

22:03

quickly, we don't all benefit in the

22:05

same ways. We

22:12

began this episode debating what defines

22:14

a cyborg and asking how close

22:17

we are to being transhuman. These

22:19

are important philosophical questions, and

22:21

they may even become practical questions in our

22:23

lifetime. But as we

22:25

wrestle with them, it's important to remember that

22:28

a huge number of people still don't

22:30

have access to technologies that most of us take

22:32

for granted to expand our horizons

22:34

and what we can do. There are billions

22:36

of people in the world who don't have smartphones,

22:39

and millions of people in the US, like Noé,

22:42

who can't take full advantage of them. Bryony

22:45

Cole is tackling this problem head on. After

22:48

working at Microsoft, she founded The Future of

22:50

Sex, and as part of her work, she

22:52

produces sex tech hackathons around

22:54

the world. I

22:56

think it's so important is because no one's talking

22:59

about it. We all got here

23:01

by having sex. Sexual identity is

23:03

so core to who we are, and

23:06

the concentration of people innovating

23:08

in this field is quite small. So

23:11

Karah, I was a bit nervous to talk about sex

23:13

in this episode because I don't want people

23:15

to think we're being sensational for the sake of it.

23:18

But it is an area where we can see the tangible

23:21

consequences of what happens when one group

23:23

designs for the rest, like with

23:25

Noé. I don't think it's sensational at all.

23:28

It's like with Gillian in episode one, when

23:30

people programming ads for Facebook do

23:32

not understand different outcomes for

23:34

pregnancy. They don't

23:36

know that an outcome for pregnancy could be

23:39

still birth, and that's how Gillian's ads

23:41

were making her life worse because of people

23:43

programming them, men by and large. By

23:45

and large. Now it takes these edge cases

23:47

like Gillian being haunted by targeted ads

23:50

for us to notice. More often than

23:52

not, we don't have a counterfactual. We can't make

23:54

a comparison to a version of technology

23:57

that wasn't built by Silicon Valley. And

23:59

that is exactly why Bryony

24:01

Cole organizes events to design

24:04

new sex tech. While

24:06

hackathons sound really geeky and typically

24:09

you're going to attract just people involved

24:12

in technology, we're really careful in

24:14

structuring the hackathon, inviting other

24:17

people in from other classes,

24:19

from other ethnicities, from other genders

24:21

to allow us to build different sorts of products.

24:24

What was a surprise that we didn't

24:26

design for was people with disabilities

24:29

that showed up. And with these new

24:31

groups of people came new kinds

24:33

of products. One team came

24:36

up with a voice-activated vibrator for

24:38

people in wheelchairs. Another

24:40

team in Singapore, a deaf man and

24:43

a blind man, built a sex and

24:45

intimacy education platform about

24:47

bringing a woman back to their

24:50

dorm room and really not knowing how to read

24:52

intimacy cues because they've never been taught.

24:55

These people just showed up. Thank

24:57

you. Um. Invisible

25:01

technology very quickly becomes part of

25:03

our homeostasis. We take for granted

25:05

the fact that we have a supercomputer as our constant

25:08

companion, and that we can summon the entire

25:10

corpus of human knowledge with the flick of a

25:12

finger. But as we

25:14

constantly focus on the next frontier, what's

25:17

new, it's easy to forget that many

25:19

people don't get to share in the fruits

25:21

of what we've already built. Other

25:23

populations that also resonate with the hackathon's

25:26

people in remote areas or rural

25:28

populations that have trouble finding a mate,

25:31

and of course women, which

25:34

is predominantly who turns up to these hackathons.

25:37

It's people that typically don't have access

25:40

to this, providing input into

25:42

how we build it. Bryony's

25:45

hackathons allow people who feel invisible

25:47

to create products directly for themselves, people

25:50

who may even be invisible on the campuses

25:53

of Silicon Valley. For Bryony,

25:55

access begins at the design phase

25:57

and including new voices can create new

26:00

products and technological surprises

26:02

that otherwise might not exist. So perhaps

26:05

her hackathons could serve as a model for

26:07

tech innovation more broadly. I think where

26:09

I'd most like to see it go is to sort

26:11

of the trickle down effect across the world

26:13

in terms of how that technology reaches other

26:15

populations because it is so concentrated

26:18

in these like Silicon Valley types.

26:21

It's the access that I'm more excited

26:24

about. We actually have incredible

26:26

technology available to us, and yet

26:29

the uses right now are so clunky,

26:31

you know, and we just need to figure

26:34

out how we're going to use it and who we're going to invite

26:36

in. Bryony paints

26:38

the picture of a world where the benefits of godlike

26:40

technology spread to all people. But

26:43

as we've talked about before on Sleepwalkers, there's

26:46

a very real possibility that technology will

26:48

be co opted by the rich and powerful. So

26:50

we have to ask who gets to become a cyborg?

26:53

And what if the new gods don't

26:55

want to share their divinity? Here's

26:58

Yuval again. One disturbing

27:01

thought is that there is no us,

27:04

There is no we. Different groups

27:06

of humans have a very different future,

27:08

and they should prepare themselves in a different

27:10

way. It could lead to the creation

27:13

of a new caste system with immense

27:16

differences in wealth and power, much

27:18

greater than we ever saw before in history. It

27:21

could even lead to a speciation,

27:23

Homo sapiens splitting into

27:26

different species with

27:28

different capabilities. So the

27:31

descendants of Elon

27:33

Musk will be a different species

27:36

than the descendants of people

27:38

who are now living in some favela in

27:40

São Paulo. So it's a big question

27:42

always what is the future of humanity?

27:45

What should we do? What is our future?

27:48

What is the future of our humanity? What

27:51

are we sleepwalking towards? It's tempting

27:53

to look for the dramatic ways technology is

27:55

changing us. Might we be able

27:58

to physically merge with computers? Will

28:00

computer vision help the blind see, or the

28:02

marginalized experience intimacy?

28:05

They're big questions, but there's one that's

28:07

even more urgent. How is

28:09

the AI revolution already changing

28:11

the way we think about and perceive the world.

28:14

What are the algorithms we interact with every

28:16

day doing to us, and how

28:18

are they changing our society? More

28:21

on those questions with Yuval Noah Harari. When

28:23

we come back, until

28:35

I was twenty one, I didn't know I was gay.

28:38

And I look back at the time when

28:40

I was I don't know, fifteen or seventeen,

28:43

and I just can't understand how I missed

28:45

it. I mean, it should have been so obvious,

28:48

But the fact is I didn't know an extremely

28:51

important thing about myself, which

28:54

an AI could have discovered within

28:56

you know, like two minutes. For Yuval Noah

28:58

Harari, AI could have made

29:00

a big difference to his early life, and

29:02

that got him thinking. When we talk about

29:04

AI, we tend to greatly exaggerate

29:07

the potential abilities, but at the same

29:10

time we also tend to exaggerate

29:12

the abilities of humans. People

29:15

say that AI is

29:17

not going to take over our lives because

29:20

it's very imperfect and it won't

29:22

be able to know us perfectly. But

29:25

what people forget is that humans

29:27

often have a very poor understanding

29:30

of themselves, of their desires,

29:33

of their emotions, of their

29:35

mental states. For AI to

29:38

take over your life, it doesn't

29:41

need to know you perfectly. It just

29:43

needs to know you better than you know

29:45

yourself. And that's not very difficult

29:48

because we often don't know the most important

29:51

things about ourselves. So let's

29:53

say you could turn back the clock to being fifteen,

29:55

would you have wanted to live

29:57

in a world where there were sufficient sensors

30:00

to monitor your eyes, your eye

30:02

movement, your breathing, you know, while you're

30:04

going about your daily life, and then to interpret

30:07

that and say to you, Yuval, more

30:10

likely than not you're gay. That's a very

30:12

good question, which will become a very

30:14

practical question in a few

30:16

years. And the way that I

30:19

grew up and developed, it would

30:21

have been a very bad idea.

30:24

I wouldn't like to receive this

30:26

kind of insight from

30:28

a machine. I'm not sure how I

30:31

would have dealt with it when I

30:33

was fifteen, you know, in Israel, in the nineteen eighties,

30:36

and maybe partly it was, you know, a defense

30:38

mechanism. In the future, it depends

30:40

where you live. Brunei has

30:42

instituted a death penalty for gay

30:45

people, at least for people engaged in homosexual

30:47

sex. So if I'm a teenager

30:49

in Brunei, I don't want to be told

30:51

by the computer that I'm gay, because

30:54

the computer will then be able to tell that to

30:56

the police and to the authorities as well. Looking

30:59

to the future, say ten twenty years,

31:02

the danger is if I still don't know that

31:04

I'm gay, but the government and

31:06

Coca Cola and and Amazon

31:08

and Google they already know it. I'm at

31:10

a huge disadvantage. So

31:13

it could be something as frightening as

31:15

the secret police coming and taking

31:17

me to a concentration camp. But

31:20

it could also be something like Coca

31:22

Cola knowing that I'm gay, they

31:24

want to sell me a new drink, and they

31:27

choose the advertisement with the

31:29

shirtless guy and not the advertisement

31:32

with the girl in the bikini, and the next

31:34

morning I go and I buy this soft drink

31:37

and I don't even know why, and

31:39

they have this huge advantage over me

31:41

and can manipulate me in all kinds of ways.

31:44

What Yuval suggests is that once we

31:46

become reducible to data, we

31:48

become predictable to algorithms, and

31:51

once we're predictable, we can be manipulated.

31:55

We talked in the last episode about how AI

31:57

is helping us decode life's fundamental

32:00

mysteries: brain waves, health

32:02

outcomes based on genetics, time

32:04

of death. But the next frontier

32:06

could be our very behavior. Here's

32:09

Arthie again. We have

32:11

tools to evaluate vast

32:13

volumes of data that we have previously

32:15

collected or that we've always collected. We

32:18

have new sources of data. Think

32:20

about everything from fitbits

32:23

and those kinds of measurements that you

32:25

can make on individuals, to the volume

32:27

of data that people are spewing

32:30

out into the online environment every

32:32

day. The implications of spewing

32:35

this data onto the internet is where we began

32:37

the series. And knowing that your data

32:39

is being used to build an accurate model of

32:41

you may give you pause about putting it

32:43

out there. But it's a combination of

32:45

that data with advances in AI that's

32:48

allowing us to start to see into the

32:50

future. Data plus that

32:52

deep knowledge now allows

32:54

you to form hypotheses

32:57

and to design experiments that allow

32:59

us now to start a journey of building

33:02

better models of these complex systems.

33:05

This is at the core of the revolution

33:07

that we're seeing. But I think the next

33:09

wave, after starting to understand

33:12

biology, is about a transformation

33:14

in the social sciences. Of course,

33:17

in some ways, that transformation in the social

33:19

sciences is already here. It's

33:22

what Yuval was talking about in terms of computers

33:24

that understand us better than we understand ourselves.

33:27

And we've already seen the real world consequences

33:30

altering the course of history with Cambridge

33:32

Analytica, Brexit, and the 2016

33:35

presidential election. But

33:37

Yuval believes that was just the beginning,

33:40

and this revolution in biology and computer

33:42

science will shake the very foundations

33:45

of how we live. The

33:48

ideology of humanism basically

33:50

says that the human experience

33:53

is the ultimate source of

33:55

authority and meaning in the

33:57

universe. If you look at politics,

34:00

then originally political authority

34:02

came from the gods, now political

34:05

authority comes from humanity. The

34:07

idea is that the voter is always right.

34:10

You look at economics, the motto,

34:12

the slogan, is the customer is always

34:14

right. If the customers want something,

34:17

then, however, irrational and illogical

34:20

it is, this is what the

34:22

entire system is geared to provide.

34:25

And you have the same idea in ethics,

34:28

Why is it wrong to murder? Not

34:30

because some God said so, but

34:33

because it hurts other people?

34:35

So we already view ourselves in this sense

34:38

as kind of divine entities that

34:40

provide meaning and authority to

34:43

the world. The big question is what happens

34:46

once some algorithm can

34:48

decipher and manipulate human

34:51

feelings and experiences, then they can

34:53

no longer be the source of authority

34:55

if it's so easy to hack and manipulate

34:58

them. And this is part of the crisis

35:00

which we are already beginning to see today.

35:04

This crisis is the intersection, the

35:06

culmination of many of the ideas

35:08

that we've spoken about on this first season of

35:10

Sleepwalkers, and it's why it's so urgently

35:13

important not to ignore the changes

35:15

going on around us and to wake up.

35:18

If we refuse to see it, if

35:20

we just hold on to this liberal

35:23

belief that humans

35:26

are free agents, we have free

35:28

will, nobody will ever be able

35:30

to understand me, nobody will ever be

35:33

able to manipulate me. If

35:35

you really believe that, then you

35:37

are not open to the danger and

35:40

you won't be able to reinvent

35:42

the system in a better way. Not

35:44

everyone agrees that we've reached the point Yuval

35:46

describes. According

35:49

to Arthie, there's a world of mystery that

35:51

we've barely begun to penetrate. I

35:53

think we are so far from

35:56

having a full understanding

36:00

of any of these systems. The fact

36:02

that we are making this rapid accelerated

36:05

progress, I think sometimes leads

36:07

to a hyperbolic sense

36:10

that we're going to know everything and it'll all get reduced

36:12

to a bunch of algorithms. And I don't think

36:14

we're anywhere near that. And I'm not even sure

36:16

that's the endpoint, not the end point

36:18

in the sense that that will never happen, or there may be an

36:20

endpoint beyond that. My own view

36:23

is that what it means to be

36:25

human is so much richer than

36:27

these mechanistic components that we're

36:29

talking about. You know, if you said to me,

36:32

when do you think we will fully map and be able

36:34

to predict behavioral systems, my answer

36:36

might be never. The incredible

36:39

depth of these systems, how messy

36:41

and organic they are, means we've got a

36:43

long way to go before we've understood everything.

36:46

The uniqueness of our humanity lives to fight another

36:48

day. But as Yuval

36:50

said, for AI to profoundly influence

36:53

us, it doesn't need a perfect understanding.

36:56

It just needs to know a little bit more than

36:58

we do, and that's become reality.

37:01

So what should we do on the individual

37:03

level. It's more urgent

37:05

than ever to get to know yourself better

37:08

because you have competition. Once

37:11

there is somebody out there, a system

37:14

or an algorithm out there that knows you better

37:16

than you know yourself, the game is up.

37:19

You can do something about it, not just

37:21

by withholding data, but above

37:24

all by improving your

37:26

own understanding of yourself. The

37:28

better you understand yourself, the more

37:30

difficult it is to manipulate you. Know

37:34

thyself. Yuval references

37:36

the ancient Greek maxim, and last

37:39

episode Siddhartha called our attention

37:41

to the Hippocratic oath in the midst

37:43

of the incredible upheavals of modernity,

37:46

we're being urged to remember the wisdom of the ancients.

37:49

But how do we get to know ourselves better?

37:52

I meditate, some people go to therapy.

37:54

Whatever works for you, do it,

37:57

and do it quickly. Get to know

37:59

yourself better, because this is your best

38:01

protection against being hacked. If

38:04

you're an engineer, then one

38:06

of the best things you can do for humanity is

38:08

build AI sidekicks that

38:12

serve individuals and not corporations

38:15

or governments. AI systems that

38:17

yes, they monitor you and they analyze

38:20

you, and they hack you, but they serve

38:22

you. They reveal what they

38:24

find to you, and they work

38:27

like an antivirus, just like your computer

38:29

has an antivirus, so your brain

38:31

also needs an antivirus. And

38:34

this is something that AI can

38:36

do for us to protect us against

38:38

other malevolent AIs. Of

38:40

course, one of the problems we've talked about on the series

38:43

is that the vast majority of talented engineers

38:46

work for the very corporations Yuval says

38:48

we need to be protected from. But

38:50

there is something compelling about using technology

38:53

as a tool to protect ourselves from other

38:55

technology. And Bryony

38:58

Cole also sees avenues for technology

39:00

to help us get to know ourselves better. We

39:02

can explore in virtual worlds without

39:05

shame. It's pushing us

39:07

to reveal ourselves, even

39:09

things we didn't even know about ourselves. If

39:12

we think about, Wow, this dark

39:14

and wacky world of sexuality that we

39:16

haven't even explored ourselves because

39:18

we've been too afraid of what we

39:20

might discover. You put technology

39:23

in there where you're suddenly able to create any

39:25

world that you want, These totally

39:28

fantastical edges of

39:30

our minds that we can go to thanks

39:32

to technology. And according

39:35

to Bryony, this is not just about seeking

39:37

thrills. Our sexuality is actually

39:39

driven by something much more profound, the

39:42

core of our humanity. We want to connect, we

39:44

want to belong, we want to feel

39:46

like we're part of something that's

39:49

sort of like the core part of that right down to

39:51

our sexuality. What is the future

39:53

of sex? The answer has nothing to

39:55

do with technology, and it always

39:57

has to do with being human. We

40:00

can take great strides personally to get

40:02

to know ourselves better, but we also

40:04

have to recognize the limits of what we can achieve

40:06

as individuals. To create

40:09

a technological future that is fair and positive,

40:11

we need governance and policy. Here's

40:14

you again. We need to regulate

40:16

things like the ownership of data and

40:19

the immense power, the divine powers,

40:21

of creation, of being able

40:23

to engineer and create life. This should

40:25

be a major political issue of

40:28

who owns these kinds of abilities.

40:31

This is not something you can do by yourself.

40:34

So here my best recommendation

40:37

is join an organization. Fifty

40:39

people in an organization

40:41

can do far, far more than five

40:44

hundred individual activists.

40:47

So whatever cause

40:49

seems to you the most important, join

40:52

a relevant organization and do

40:55

it this week. The

40:59

AI revolution isn't far off in the future,

41:01

Kara, as Yuval says, it's here

41:03

with us, and we have this personal and political

41:05

responsibility to make sure that the future

41:08

is a future we want to live in. Yeah,

41:10

I think that's actually a good place to leave Season

41:12

one of Sleepwalkers. Don't, you're breaking

41:14

my heart. Yeah. It's been incredible to hear you

41:17

say, Kara. But

41:21

also it's been incredible to report on you know, to meet

41:23

people like no Way and Gillian Brockell

41:26

or Glenn Rodriguez, and to understand

41:28

how this new technology is affecting people's

41:31

lives. It's been a real privilege to hear those

41:33

stories and also to get access

41:35

to some of the people who are building the technologies

41:37

we live by, people like Yasmine Green

41:39

at Google's Jigsaw, Nathaniel Gleicher,

41:42

Facebook's head of cybersecurity, and

41:44

Ben Singleton, director of analytics at the

41:46

NYPD. We got to hear firsthand

41:48

what these leaders in the world of technology are

41:51

thinking about and how hard they wrestle

41:53

with the ethics of their creations. And there's so many

41:55

more areas that AI is transforming and

41:57

that we're going to look at in season two:

42:00

things like money and climate change

42:02

and some of the problems that I've noticed even

42:04

since we started doing this podcast, like

42:06

what happens if your boss is an algorithm. Well,

42:09

that's why we're doing a season two of Sleepwalkers,

42:11

So please join us. And in the meantime, we'll

42:13

be keeping our Instagram, Twitter and website

42:15

updated. That's at Sleepwalkers

42:18

Podcast on Instagram and at Sleepwalkers

42:20

Pod on Twitter. And if you have any

42:22

stories, suggestions or criticisms,

42:25

send us an email at Sleepwalkers Press

42:27

at iHeart media dot com. That's Sleepwalkers

42:30

p R E s s at iHeart media dot

42:32

com. Well, that's all from us. I'm

42:34

Oz Woloshyn and we'll see you next time. Sleepwalkers

42:49

is a production of I Heart Radio and Unusual

42:51

Productions. For the latest AI

42:54

news, live interviews, and behind the scenes

42:56

footage, find us on Instagram at

42:58

Sleepwalkers Podcast and at Sleepwalkers

43:00

podcast dot com. Special

43:03

thanks this episode to the Forward, the digital

43:05

news and culture website and Jonathan

43:08

and Cody from the SEO Agency make it all

43:10

work here in New York City. And thanks to Noise

43:12

So Hard, that's S-O-C-H-A, for his

43:14

involvement in this episode. If you'd like to hear

43:16

more of Noise's music, you can find him on Facebook,

43:19

YouTube, and Instagram at Simple blues

43:21

Boy. Thanks also to Ellen Ullman, author

43:23

of Life in Code, and Gary Marcus of

43:25

Robust dot Ai, who gave generous interviews

43:27

which helped shape our thinking for this series. Sleepwalkers

43:32

is hosted by me, Oz Woloshyn, and co-hosted

43:35

by me, Kara Price, and produced by Julian

43:37

Weller, with help from Jacopo Penzo and Taylor

43:40

Shakoyne. Mixing by Tristan McNeil

43:42

and Julian Weller. Our story editor is

43:44

Matthew Riddle. Recording assistance this episode

43:46

from Joe and Luna and Sabrina Boden.

43:49

Sleepwalkers is executive produced by me

43:51

Oz Woloshyn and Mangesh Hattikudur.

43:54

For more podcasts from iHeart Radio, visit the I

43:56

Heart Radio app, Apple Podcasts, or

43:58

wherever you listen to your favorite shows.
