Episode 135 - The History of Horror and Artificial Intelligence

Released Tuesday, 9th April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:18

Hello everyone , welcome back to another

0:20

episode of the Lunatics Radio

0:23

Hour podcast . I'm Abbey Brinker

0:25

sitting here with Alan Kudan . Hello

0:27

. And today we are talking about

0:29

the controversial subject

0:32

, the horror of artificial intelligence

0:34

. Is this controversial ? In some

0:36

ways ? I think AI is pretty controversial , for

0:39

sure . I think there's a lot of people , especially

0:41

artists , who are pretty

0:46

against , especially image generating AI .

0:48

Oh okay , so yes , the implementation of

0:50

AI in society is highly controversial

0:52

. Yes . But I think everyone

0:54

can agree that under the correct lens

0:57

, AI can be absolutely horrific

0:59

.

0:59

Oh yeah , I think that's hands down .

1:01

It's very scary .

1:02

Yeah , and this topic is really interesting

1:05

to examine through this lens right

1:07

of our podcast , which is the history of horror , because

1:15

, unlike most of our episodes , where there's this linear influence of

1:17

history onto a horror trope , artificial intelligence and AI

1:19

and pop culture advance like at

1:21

different rates . So , for instance

1:23

, in some cases AI

1:25

was represented in certain ways in

1:27

film and TV or books before

1:30

that existed in real life .

1:31

Oh sure , but that's technology in general

1:33

.

1:34

I know , but that's what I'm saying . Like usually we're like okay

1:36

, this is the history of clowns and

1:38

this is how clowns have been adapted for

1:40

horror , but in this case it's sort of

1:42

like sometimes it's something that happened in

1:44

history , and sometimes

1:47

it's something that happened in a film and then history

1:49

caught up to it and it's just less of a

1:51

linear timeline when you're looking at the influence

1:53

of AI on pop culture .

1:55

I'm going to throw

1:57

one monkey wrench in there . Okay . And

2:00

say like , yeah , they weren't calling it artificial intelligence

2:02

. But there are stories of

2:04

humans building automatons for

2:07

millennia , you know . Think of the story of

2:09

like the golem , which that's just another

2:11

form of some kind of artificial intelligence made

2:14

by humans that goes rogue

2:16

and kills .

2:17

But that's sort of exactly what I'm saying . In the

2:19

case of the golem right . I would categorize it

2:21

as like mythology , similar to

2:24

a film right . It's not real .

2:26

How dare you ?

2:27

That predates the invention

2:29

of an actual automaton

2:32

that does your bidding for you , or whatever

2:34

.

2:35

Sure .

2:35

There's films out there and books

2:37

and things that represented a thing

2:40

and now that thing exists , but when that book

2:42

was written it didn't yet . I see ,

2:44

okay . Yeah , it's like Star Trek

2:46

and the iPads .

2:47

Everyone says Star Trek predicted

2:49

the iPad , you know , because everyone just had

2:51

like little tablets and shit , right , and that

2:54

was the original Star Trek in the 60s . So

2:56

like , obviously that predates , you know , Steve

2:58

Jobs and his stuff right , but you

3:00

know all these things are . You know pop

3:02

culture influences technology . People

3:05

invented the iPad because that's what they

3:07

thought the future should look like .

3:09

Right , exactly that's and that's a really fun

3:11

new sort of twist for this episode , like

3:13

, in some ways , I feel like instead

3:16

of history influencing horror , in

3:18

this case horror has influenced

3:20

history . It's a fun . It's a fun remix

3:22

episode , if you will . Isn't it also like the Simpsons

3:25

has predicted a bunch of stuff that's come to be

3:27

?

3:27

uh , yeah , I mean I have

3:30

not watched too much of the Simpsons , but I know that's

3:32

a , that's a meme yeah , you know I

3:34

just looked up the definition , just because

3:36

this seems applicable to this episode . And

3:39

a meme is an element of a culture

3:41

or system of behavior passed

3:43

from one individual to another by

3:45

imitation or other non-genetic

3:47

means .

3:48

Interesting .

3:49

It's like a collective

3:52

consciousness that is passed

3:54

actively . So if something is popular

3:56

, it gets passed on , If it is not popular

3:59

, it does not get passed on , which is very

4:01

similar to

4:03

evolution Darwinism , right yeah , but in

4:05

this case it's all about information

4:07

.
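That popularity-driven passing-on works like a selection process over information, which a tiny deterministic replicator model can illustrate. The two memes, their starting shares, and their copying rates below are invented for illustration:

```python
# Replicator dynamics over two "memes": each round, a meme's share of
# retellings grows in proportion to how readily it is passed on.
# No genetics involved, only copying.
shares = {"catchy": 0.1, "dull": 0.9}    # invented starting shares
pass_on = {"catchy": 1.8, "dull": 1.0}   # invented copying rates

for _ in range(20):                      # 20 rounds of retelling
    total = sum(shares[m] * pass_on[m] for m in shares)
    shares = {m: shares[m] * pass_on[m] / total for m in shares}

# The more copyable meme comes to dominate, even from a small start.
print(shares["catchy"] > shares["dull"])  # True
```

Popularity here plays the role fitness plays in Darwinian evolution: whatever gets passed on more often takes over.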

4:07

Fascinating .

4:08

More on that later .

4:10

Yes , okay . So before we get into it , let's cite

4:13

our sources . We have a Harvard

4:15

article by Rockwell Anyoha ,

4:17

the History of Artificial Intelligence

4:20

. A Forbes article A

4:22

Very Short History of Artificial Intelligence

4:24

by Gil Press . An article

4:26

by Cristina Sáez , Machines

4:28

Now Know how to Terrorize Humans

4:31

from CCB.com , which is part

4:33

of MIT . A VentureBeat.com

4:35

article what the Evolution of AI's

4:37

On-Screen Depiction Says About Society

4:40

. A Variety article by Zach Scharf

4:42

. Arnold Schwarzenegger proclaims the

4:45

Terminator has become a reality due

4:47

to AI . It's not fantasy

4:49

or kind of futuristic anymore . A

4:51

Wired article by Maria Streszynski

4:54

how Christopher Nolan learned to stop

4:56

worrying and love AI . A Guardian

4:58

article How Soon Will M3GAN Become

5:00

reality ? Robot ethicists weigh

5:03

in . A Film Daze article by Leanna

5:05

Hindley Artificial Intelligence in

5:07

Isolation in Duncan Jones's Moon

5:09

10 Years Later . An Air and Space

5:11

article by Paul Ceruzzi , 2001

5:14

A Space Odyssey : HAL and the Future

5:16

of AI . And of course these will be

5:18

in the description of the episode as well

5:20

.

5:21

Also lots of movies and

5:24

at least one video game .

5:25

Lots of movies . So first

5:28

let's define artificial

5:30

intelligence not to be that person but

5:32

Webster's Dictionary defines

5:34

Actually , use the Oxford English Dictionary

5:37

.

5:37

You would .

5:38

AI is both the theory and development

5:40

of computer systems that are

5:42

able to perform tasks that normally

5:45

require human intelligence , such

5:47

as visual perception , speech recognition

5:49

, decision making and the translation

5:52

between languages . And to add

5:54

on , I think a big part of this is the ability

5:56

of AI to learn from inputs and

5:58

evolve on its own . So I'm going to start us off

6:00

with a quote from the Harvard article by

6:02

Rockwell Anyoha . Quote : In

6:10

the first half of the 20th century , science fiction familiarized the world with

6:12

the concept of artificially intelligent robots . It began with the heartless Tin man from the

6:14

Wizard of Oz and continued with the humanoid

6:16

robot that impersonated Maria in

6:18

Metropolis . By the 1950s

6:20

, we had a generation of scientists , mathematicians

6:23

and philosophers with the concept

6:25

of artificial intelligence , or AI

6:28

, culturally assimilated in their

6:30

minds .

6:30

Hang on . When the heck did the Wizard of Oz come out

6:33

?

6:33

1939 .

6:34

Okay , and when did Metropolis come out ?

6:36

1927 .

6:38

Oh , but it was a book first . I just can't

6:40

picture Metropolis coming out after the Wizard

6:42

of Oz . But the Wizard of Oz movie

6:45

came out later . It was just

6:47

the book was before , all

6:49

that .

6:50

Yes , yes , okay , back to the quote . One such

6:52

person was Alan Turing , a

6:54

young British polymath who explored

6:56

the mathematical possibility of artificial

6:58

intelligence . Turing suggested that humans

7:01

use available information , as well

7:03

as reason , in order to solve problems

7:05

and make decisions . So why can't machines

7:08

do the same thing ? This was the logical

7:10

framework of his 1950 paper

7:12

Computing Machinery and Intelligence

7:15

, in which he discussed how to build intelligent

7:17

machines and how to test their intelligence

7:19

. End quote .

7:20

I'm only familiar with the Turing test .

7:23

I want to say this the inception

7:25

of AI is super complicated

7:27

and kind of boring and we're not going to spend a ton of time

7:29

on it . But just to kind of clarify the quote

7:31

that we just read Alan Turing is

7:33

responsible for like conceptualizing

7:36

in a lot of ways in the modern sense of AI

7:38

, what that could look like . It

7:40

wasn't actually implemented until

7:42

a little bit later , and we'll talk about that . He's

7:44

kind of the seed , if you will

7:46

, but he's not the one who actually brings it into

7:48

3D .

7:49

Do we talk about the Turing test now or later ?

7:51

We can talk about the Turing test now , because

7:54

it will also come up

7:56

in a lot of the films .

7:58

My understanding of it is it's

8:00

simply a test on humans

8:02

. They interact with a machine

8:04

or they don't , but the test is

8:06

for the human to try to identify if

8:09

they think that what they're engaging

8:11

with is human or a machine , and

8:13

if a machine passes the

8:15

Turing test , it fooled the

8:17

user into thinking it

8:19

was a human .
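The setup described here can be sketched as a short simulation. The judge, respondents, and scoring are invented for illustration; this is a toy version of the imitation game, not Turing's original protocol:

```python
import random

def turing_test(judge, respondents, rounds=10):
    """Run a toy imitation game. Each round, an answer comes from either
    the human or the machine; the judge guesses which. Returns the
    fraction of machine answers that were mistaken for a human."""
    fooled, machine_rounds = 0, 0
    for _ in range(rounds):
        source = random.choice(["human", "machine"])
        answer = respondents[source]()
        if source == "machine":
            machine_rounds += 1
            if judge(answer) == "human":
                fooled += 1
    return fooled / machine_rounds if machine_rounds else 0.0

# A naive judge that flags answers which look too mechanical.
def naive_judge(answer):
    return "machine" if answer.isupper() else "human"

respondents = {
    "human":   lambda: "I think so, yes.",
    "machine": lambda: "AFFIRMATIVE.",
}

print(turing_test(naive_judge, respondents))  # 0.0: this machine never fools the judge
```

A machine "passes" when that returned fraction is high, meaning the judge could not reliably tell it apart from the human.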

8:20

Right , and films like Ex Machina

8:22

obviously play into that in a big way .

8:24

I mean yeah , they literally talk about the Turing

8:27

test .

8:27

But just the general . Honestly , let's pause here , because

8:30

just the general idea that

8:32

robots could fool humans , especially

8:34

AI , is like central

8:36

to everything we're going to talk about today , Like all

8:38

of the fears of humanity around

8:40

AI . In horror films it's

8:43

less so like oh , they're going to steal our work

8:45

as artists . Really , in horror , for

8:47

the most part , it's more like , oh

8:49

, the machines and the robots are going to trick

8:51

us and overrule us and become

8:54

the higher power in society

8:56

. So I think it's really interesting that even Alan

8:59

Turing back in the 1950s , like one of

9:01

the initial people to think

9:03

about this , is famous for

9:05

creating this test . Right , it kind of tells you that

9:07

there's always been this fear a little bit .

9:09

I'd say that there are two

9:11

branching trees . Actually , it's not even

9:13

a branch , it's more of just a scale

9:15

how the whole like robots

9:18

impersonating humans is like

9:20

the first step of infiltration

9:24

and overthrowing and everything . But that's

9:26

not the end game . The end game is

9:28

when they just completely shed any

9:30

similarities to humanity , because

9:32

they don't need to fool anymore . They've already

9:34

won , just like . Imagine that

9:37

the human body could be rearranged right

9:39

. We build tools to

9:42

do certain tasks . Imagine

9:44

if your hand was instead

9:46

a hammer , or your arm was a chainsaw

9:48

, or your legs were wheels

9:50

if you wanted to go fast , right . Instead

9:53

of pretending to be humans

9:55

, they would basically build themselves to

9:57

emulate whatever task needed doing .

9:59

Are you talking about Transformers here ? I

10:01

feel like it depends on the universe . No

10:04

, Transformers is still in the

10:06

uh infiltration stage

10:08

right , but I think it depends on

10:10

are you talking about , like in film and TV or in

10:12

real life , that that's what machines will

10:14

?

10:15

that's like the end game so I

10:17

mean , I guess it's both right , because film

10:19

and TV are just predicting what's going to happen

10:22

, you know whatever . Okay

10:24

, a great example is

10:26

Terminator . Okay . So in

10:29

present day , or the past or whatever

10:31

you want to call it , in the first one

10:33

, in 1984 , the Terminator

10:35

is sent

10:38

from the future to 1984 to

10:40

kill the mother of the leader of the resistance

10:42

. Right , that's the premise of Terminator . Yep

10:44

, he looks like Arnold Schwarzenegger

10:47

, but that is because he is an infiltration

10:50

assassination bot . He's supposed

10:52

to look like a human so he can navigate

10:54

through society unhindered

10:57

, uh , and not just have , like all society

10:59

, like fight him and kill him , right . However

11:01

, when you go to the future of

11:03

the far distant 2029

11:05

, there's no like flesh

11:08

covered robots . They're all you

11:10

know , either these walking , you

11:12

know skeletons , or they

11:14

are these giant hulking machines , you

11:16

know quadrupeds with giant

11:18

machine guns strapped to them , giant

11:20

flying jet things with arms

11:23

because they just don't need to

11:25

hide .

11:25

I mean , yeah , that makes sense . If there's no more humans

11:28

, or if you've dominated humanity

11:30

, then there's no need to trick .

11:32

In preparation for this , I started rereading

11:34

Robopocalypse . Great book , very

11:36

topical . And in

11:38

that they made a really good point how

11:41

, when machines start

11:43

building themselves , they

11:48

never build something that looks like a human , because a human

11:50

is just inefficient for most

11:52

tasks . It's very versatile

11:54

, can do lots of things , but

11:57

like look at the animal kingdom , you

11:59

know if you want to be fast , you go on

12:01

four legs , you want to be

12:03

able to swing around

12:06

. Then you need like super long arms like

12:08

a monkey . Humans can swim , but

12:10

like fish are a lot better . All these things

12:12

are we're versatile , but

12:14

they're not optimized and so , especially

12:16

once you get into trying to build a

12:18

giant death machine , a

12:20

bipedal machine

12:22

is very inefficient compared

12:24

to something that can run on four legs

12:28

or has its own outriggers

12:28

for shooting a giant gun . You know there's

12:31

there's lots of these little nuances

12:33

that a machine would care about

12:35

simply on the efficiency scale , but

12:37

because they're not worried about aesthetics

12:40

but the real question is : is the

12:42

machine aware ?

12:44

okay , so you're talking about machines

12:46

that are aware

12:48

, and you kill them or whatever

12:50

. What's the ethics of that ?

12:51

Yeah , Just

13:13

imagine you know there you are

13:15

murdering and

13:17

you shoot somebody and a bunch of circuits fly out .

13:20

I mean more so , like you purposefully .

13:21

You're like oh , thank God .

13:23

Okay , let's get back to the outline here . So

13:25

artificial intelligence officially came

13:27

into existence in the 1950s . In

13:30

1951 , a machine

13:32

nicknamed SNARC was built

13:34

. S-N-A-R-C stands

13:36

for Stochastic Neural

13:38

Analog Reinforcement Calculator

13:40

.

13:41

Okay , so what does stochastic mean ?

13:43

Technical statistics . We

13:45

looked it up . Something like that , something

13:47

beyond my comprehension .

13:49

I believe it is able to

13:51

pull from a random probability , but

13:54

being able to statistically evaluate

13:57

that probability without having a

13:59

guaranteed outcome .
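That idea of drawing from random outcomes while still steering behavior statistically can be sketched as a toy stochastic reinforcement loop, loosely in the spirit of the early reinforcement calculators. The two actions, weights, and reward rule below are invented for illustration, not a model of any historical machine:

```python
import random

random.seed(7)  # fixed seed so the run is repeatable

# A toy stochastic learner: it picks between two actions with
# probability proportional to a weight, and reinforces whichever
# action earns a reward.
weights = {"left": 1.0, "right": 1.0}

def choose():
    total = weights["left"] + weights["right"]
    return "left" if random.random() < weights["left"] / total else "right"

for _ in range(500):
    action = choose()
    reward = 1.0 if action == "right" else 0.0   # "right" secretly pays off
    weights[action] += reward                    # reinforce rewarded choices

print(weights["right"] > weights["left"])  # True: the learner drifts toward "right"
```

Each individual choice is random, but the statistics of the rewards pull the behavior in a predictable direction, which is the "stochastic" part.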

14:01

So SNARC was invented by Marvin Minsky

14:03

and Dean Edmonds

14:06

. It was the first artificial

14:08

neural network and used about

14:10

3,000 vacuum tubes to

14:12

simulate neurons . That's so

14:14

many . One year later , in 1952

14:17

, the first computer program

14:19

able to learn on its own was created

14:21

by Arthur Samuel . It was a checkers

14:24

game . In 1955 , the phrase

14:26

artificial intelligence was coined within a proposal

14:28

for a workshop which took place one

14:30

year later in 1956 . The

14:33

proposal came from Claude Shannon of

14:35

Bell Telephone Labs , Marvin Minsky

14:37

, again of Harvard , John McCarthy

14:39

of Dartmouth and Nathaniel Rochester

14:41

of IBM . The workshop is largely

14:44

considered to be the official birth of AI

14:46

as we think about it today . Also in 1955

14:49

, the logic theorist was developed

14:51

. Quoting from the Forbes article by

14:53

Gil Press . Quote : In December

14:55

1955 , Herbert Simon

14:58

and Allen Newell developed the Logic

15:00

Theorist , the first artificial

15:02

intelligence program , which eventually would

15:04

prove 38 of the first 52

15:06

theorems in Whitehead and Russell's

15:08

Principia Mathematica . End quote

15:11

. Throughout the next few decades there continued

15:13

to be advancements in AI technology

15:15

in a huge way and I'm not going to go

15:17

through sort of the timeline of

15:19

AI , but we know right kind of generally

15:22

where we are today . It's been a big

15:24

year . Last year there was ChatGPT , there is AI kind of exploding

15:28

all over the place , right ? In visual aspects

15:31

, in chatbots , in

15:33

AI being able to like , generate

15:36

video and artwork

15:38

for people and text

15:40

and images and stories and songs , and

15:43

that's where sort of the great debate comes into play

15:45

. But all of that being said , that's

15:47

kind of the rundown of how it initially started

15:49

.

15:50

So I spent a month of last

15:52

year working for

15:54

a company that

15:56

had me sign an NDA . This

15:58

entire campaign was

16:00

all about making all these videos

16:02

about why AI is not

16:05

scary and why it's actually

16:07

a good thing , and we actually outlined

16:09

some of the really really incredible applications

16:12

that AI is doing right now that

16:15

you just don't hear about .

16:16

Tell us , tell us the silver lining .

16:17

I can't because of the NDA .

16:19

Okay , well , I'll just say this based on my

16:21

research , not on your job , that

16:24

I think there's like anything

16:26

in the world . There are many , many

16:29

sides to this and obviously

16:31

I think , when it comes to the

16:33

protection of creative works and

16:36

the reuse and the training against

16:38

those things , there are huge concerns there , there

16:40

needs to be regulation and there needs to be systems

16:42

figured out . On the other hand

16:44

, I think that there's a lot of good . For

16:47

example , we have huge

16:50

years for tornadoes , right , in the tornado

16:52

belt . What if AI

16:54

was able to predict when tornadoes

16:57

were going to hit by reading weather patterns before

16:59

we could , and was able to automate

17:01

an alert to those places before

17:04

quicker than a human could even do the calculation

17:06

? It's things like that

17:08

that I think could

17:11

be hugely beneficial , and , of course , there's always

17:13

pros and cons to all of these things . But , again

17:16

, there should be regulation and there should be all of these things . But

17:18

do I think it could also ultimately

17:20

help save lives ?
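The automated-alert idea sketched here can be written as a simple scoring function over weather readings. The feature names, weights, and threshold below are invented for illustration, not real meteorology:

```python
# Score incoming weather readings against fixed risk weights and fire
# an alert when the combined risk crosses a threshold. In a real system
# the weights would be learned from historical storm data.
RISK_WEIGHTS = {"wind_shear": 0.5, "cape": 0.3, "humidity": 0.2}
ALERT_THRESHOLD = 0.7

def tornado_risk(reading):
    """Weighted sum of normalized (0 to 1) weather features."""
    return sum(RISK_WEIGHTS[k] * reading[k] for k in RISK_WEIGHTS)

def should_alert(reading):
    return tornado_risk(reading) >= ALERT_THRESHOLD

calm   = {"wind_shear": 0.1, "cape": 0.2, "humidity": 0.5}
severe = {"wind_shear": 0.9, "cape": 0.8, "humidity": 0.9}

print(should_alert(calm), should_alert(severe))  # False True
```

The point is speed: a fixed scoring rule like this can run on every new reading the instant it arrives, with no human in the loop for the initial alert.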

17:22

Yes , yes , absolutely . That's

17:24

a fantastic implementation of AI

17:26

. Anytime you have an

17:28

insanely large data set

17:31

that , through analysis

17:33

, can save lives , yeah , that's great . It's

17:35

when things get

17:37

pushed one step further

17:40

that people start to get nervous . Well

17:42

, there's two . One is A

17:44

. It's taking jobs .

17:46

Taking jobs and plagiarizing right

17:48

.

17:49

That's another . Plagiarism is a whole other story

17:51

. Yeah , there are people out there that

17:54

their entire job is to analyze

17:56

weather patterns and predict tornadoes . Guaranteed

17:59

Someone's job is that . Yeah , if AI can

18:01

do that with massive efficiency

18:03

. I'm sorry for Mr Tornado

18:05

Watcher or Mrs Tornado Watcher , but

18:08

like , yeah , your job might be

18:10

on the chopping block .

18:10

Well , I think that's a different example , because

18:12

I think if you are someone who is a

18:14

specialized scientist in Tornado

18:17

Watch , there needs to be human checkpoints

18:19

with these things , right , and so there's probably only

18:21

so many people in the world to do that . But

18:24

if we're talking about like toll booths or

18:26

something simple , I guess it could take millions

18:28

of jobs , right For something that's less specialized

18:31

, for sure .

18:32

Right , and that happened a while ago . You

18:34

know , toll booths got automated , yeah

18:36

, a while . You know . Now there's cameras that snap

18:38

your license plate and then just mail you shit , exactly

18:41

. But in getting back to the tornado

18:43

example , yes , there

18:45

are very trained people and

18:47

these algorithms are going to

18:49

need to be trained by these trained people

18:52

to make them correct in the first place . So

18:54

you know , I'd say for a couple of generations

18:57

, your job is , that job is safe

18:59

. I don't think it's a great time to start going into

19:01

the field of training , because the AI is going to outpace anybody , but for those already in it , you're

19:03

going to start going into the field of training , because the ai is going to outpace

19:05

anybody . But for those already in it

19:07

, you're going to be okay , you just have

19:09

to ride that wave . But the

19:11

next step after this is like and here's

19:13

when , like , the science fiction starts kicking in what

19:16

if ai could take the next

19:18

step and stop tornadoes as

19:20

seen during the Beijing Olympics

19:22

? Like China has the ability

19:25

to curb weather patterns

19:27

with missile launches , they literally

19:29

blew storm clouds away so that the opening

19:31

ceremonies would be fine .

19:32

Did you hear what happened this week ?

19:34

No , what happened this week ?

19:35

So now this will be probably a week delayed

19:37

for those listening , but a group of scientists

19:40

on Tuesday launched

19:42

. It looks like a cannon

19:45

but they went out on a boat in California like a deprecated

19:48

aircraft carrier deprecated

19:50

like it's not in use as an aircraft carrier

19:52

. They launched from this

19:54

cannon sea salt into

19:57

the atmosphere to reflect

19:59

light back , to make the clouds

20:01

brighter , essentially , and reflect the light from the sun

20:05

back into space to cool the earth

20:07

. And they were so afraid of the pushback

20:10

. Because it's actually interesting . I was just reading like the

20:12

New York Times article about it . But even reading

20:14

it you get the sense of like holy shit , like

20:16

I am all for whatever we need to do to

20:18

save the earth , but there's this like sense

20:21

of unnaturalness or like this

20:23

God complex of like wow , we're about to

20:25

change the climate , like

20:27

we are about to fuck with weather

20:30

.

20:30

Well , we already did . Now they're trying to change

20:32

it back .

20:33

No , I know , but like in this , like really like concrete

20:35

, clear way , I don't know there's something about it where you're like whoa

20:38

, but they're so afraid of the pushback

20:40

they

20:45

did it in secret and it went well and they're gonna do a bigger test now , or

20:47

it'll be a higher , you know , explosion or whatever but what movie

20:49

was it ?

20:50

it wasn't the Matrix , I don't think

20:52

, but it's another science

20:55

fiction film that effectively nukes

20:57

the sky to try to curb climate

20:59

change . It adds a permanent

21:02

cloud barrier right . But like dark

21:04

clouds , and then things go too far , it gets

21:07

out of control .

21:08

And that's the fear . Right Like what if they do

21:10

this ? But it has some effects

21:12

. They're not thinking of what if suddenly ?

21:14

Ice age .

21:15

Right , truly Like what if it just goes too

21:17

far , you know ? And so it's exciting

21:19

, but it's also scary , right ?

21:21

This is the only planet we've got . We

21:24

don't have the luxury of like well , we fucked this

21:26

up . Guess we can't play on this here . Fun

21:28

playground in Ukraine anymore .

21:30

There's a callback to our amusement parks . Episode

21:32

.

21:33

But steps need to be taken , but

21:36

very carefully calculated

21:38

steps .

21:38

Yeah , it just feels like I don't know with all of

21:40

this stuff , like it just feels like and

21:47

they say , millennials always feel like this , so maybe I'm just a silly

21:49

basic millennial but it feels like we're on the precipice

21:51

of like some real changes , like some total shift in everything

21:54

about the world . You know , we're fucking around

21:56

with AI , with weather

21:58

patterns , with whatever other secret

22:00

things are happening that we don't know about Aliens

22:03

. The government is like yes , there are . Like

22:08

it just feels like everything is like kind of culminating and coming together . And I don't want to be

22:10

an alarmist , you have Russia kind of going off . You would love to be an alarmist

22:12

what , like how is it all gonna play

22:14

out ? You know just feels like there's a lot , a

22:16

lot going on .

22:17

That's because there is , and I know there

22:19

always is , but I don't know every generation

22:22

feels like theirs is the most important

22:24

generation to have ever lived , and maybe

22:26

they are .

22:27

But I sort of feel like is this it ? Like

22:29

like some crazy shit's going to happen , you

22:31

know , or they're going to shoot the salt into the cloud

22:33

and then AI is going to trigger

22:36

and like suddenly it's going to be a nuclear winter

22:38

. You know , I don't know . It just feels like lots

22:40

of gambling going on .

22:42

I wish I knew what movie that was . It was just

22:44

so . I don't think it's the

22:46

Matrix , but like the same thing happens , but

22:48

not for climate change reasons . In the Matrix

22:50

they nuke the sky with

22:53

the permanent Operation Dark Skies , I

22:55

think because the primary fuel

22:57

source for the machines is

22:59

solar . So they figure if they just block

23:01

out the sky temporarily they

23:03

would lose their fuel source and then they'd

23:06

win . But instead they did it permanently

23:08

because they're awesome , and then humans

23:10

just got enslaved and turned into batteries , so

23:13

it kind of backfired there .

23:14

There you go . Ok , so I

23:16

sort of tease this at the beginning , but again , for

23:18

me the history of AI in

23:20

pop culture and horror seems

23:22

somehow more interesting than

23:25

the history of AI in reality , until like

23:27

this year , right , like all of my favorite

23:29

topics , this story touches on

23:31

Mary Shelley's Frankenstein .

23:34

It does .

23:35

Now here's two things . One is that

23:37

there is an argument out there

23:39

for sure that Frankenstein as a story

23:41

explores AI , and

23:44

that's an argument . It depends on how you think of

23:46

consciousness and reanimation and

23:48

technology and all these things . I'm not here to talk

23:50

about that today , unless you want to .

23:53

No .

23:53

What I'm here to talk about is

23:55

back in October 2017

23:58

, an AI program named

24:00

for Mary Shelley , called Shelley , was launched

24:03

. It read horror stories

24:05

from Reddit's famous NoSleep subreddit

24:07

, which is one of my favorite places

24:09

on the internet , and it learned how to

24:11

write horror stories of its own . Yeah

24:14

, and so this isn't like I don't

24:16

know , a huge moment in AI history , but

24:18

I don't

24:20

think it exists anymore , but it's interesting

24:22

because this was back in 2017 and

24:24

it was doing something that , in a lot of ways , ChatGPT

24:27

and other programs are doing now

24:29

, also because we're horror . We love

24:31

horror , you know . I think it's interesting . There's certainly

24:34

a debate , and actually one of our friends

24:36

reached out and they were like , for

24:38

the horror stories for the AI episode , you're

24:40

going to have AI generate a story , right

24:42

, but no , we're not , because you know

24:44

, for us , the stories that are

24:46

submitted are , you know , are written with so

24:48

much love and talent and we

24:50

want to preserve that for sure on the show

24:52

. But I think it's kind of an interesting debate

24:54

, you know .

24:55

Yeah , AI stories are fine

24:57

, but that's all they are . Just fine , nothing

24:59

special . I'm sure that there's a collection of best

25:02

AI stories of all time , but I have yet

25:04

to see one that's like man . This

25:06

is great .

25:07

Do you spend a lot of time reading AI stories

25:09

?

25:10

I've browsed , there's like a massive

25:12

collection of this crap on Amazon because

25:14

it's so easy to self-publish , right

25:17

, and so people just have AI generate

25:19

a book , right , and then they generate

25:21

a cover image that

25:23

is provocative and they sell it on Amazon .

25:26

Yeah , if they sell one

25:28

copy , they've made a profit yeah

25:30

because this costs them nothing except a little bit of

25:32

time super .

25:33

True , the first film to play

25:35

with the idea of artificial intelligence

25:37

is often thought to be Metropolis , dating

25:40

back all the way to 1927

25:42

and a film that I had a

25:44

deep love for back

25:46

in my college years . Really

25:48

. I had a big Metropolis poster over my dorm

25:50

room bed .

25:51

Wow why .

25:52

I loved it . I thought I just like the aesthetic of the

25:54

film , more so , honestly , than the

25:56

content . But the aesthetic of the film

25:59

was so fascinating to me and these like little micro special

26:01

effects and things they did with miniature . You know , it just was like

26:03

fascinating to me .

26:09

The effects hold up , they're still great .

26:11

Yeah , I love , I love it . Anyway , Metropolis

26:13

is a German expressionist film by

26:15

Fritz Lang that was inspired by a 1925

26:18

novel . In the film , a robot

26:20

is created based on the likeness of a woman

26:23

named Maria . While the film is making

26:25

a point about labor , a lot of the film is

26:27

really about labor , scientific

26:29

advancements and the divide between the working

26:31

class and the affluent all points

26:34

that are still valid today . The big takeaway

26:36

for this episode is that this robot

26:38

modeled after Maria , called the

26:40

Maschinenmensch , is able to totally uproot

26:43

the labor system in Metropolis and work

26:45

tirelessly day and night . So exactly what

26:47

Alan and I were just talking about . Right ? Obviously

26:49

we all have fears that AI could take our jobs

26:52

, and this film tells that story . Quoting

26:54

from the VentureBeat article . Quote the

26:57

Maschinenmensch was a proto-artificial

26:59

intelligence and , like AI

27:01

characters that come after , she reflected

27:03

her time . Over the past century , AI

27:06

on screen has represented our anxieties

27:08

, hopes and ambitions , as

27:10

well as our deepest values . Of course

27:12

, that focus has shifted over time . Today

27:15

, when our real-life artificial intelligence

27:17

has become adaptable and dynamic ,

27:19

human-mimicking AI seems less a fantasy

27:22

and more a not-too-distant

27:24

eventuality . Because

27:27

of this , our perceptions and expectations of AI on screen have shifted

27:29

and we have begun a new exploration

27:31

of what it means to be human . The future

27:34

of AI is threatening , exhilarating

27:36

, enwrapped in uncertainty and opportunity

27:38

, as it has always been . But as

27:40

AI has developed , so have our expectations

27:43

of it . End quote . A major

27:45

moment for artificial intelligence and

27:47

horror came with the first Stepford

27:49

Wives film from 1975

27:51

.

27:52

Wait , it was from the 70s ? It's from the 50s

27:54

.

27:55

No , it's based on a novel from 1972

27:57

. Interestingly , the novel

27:59

is by Ira Levin , who is also

28:01

known for writing Rosemary's Baby , the novel

28:03

that predated the film . The first

28:06

film adaptation of the Stepford Wives

28:08

was directed by Brian Forbes and

28:10

, similar to the novel , it tells the story of

28:12

a young mother , wife and photographer who

28:14

moves her family , including a husband

28:17

and two young kids , out of New York City to the

28:19

suburbs of Connecticut . Our protagonist

28:21

, Joanna , soon realizes that there's

28:23

something very bizarre about her new town

28:25

. Not only are all of the women stunning

28:28

and able to keep their homes perfectly clean

28:30

, they seem to lack emotional and

28:32

intellectual substance . They also

28:34

seem to be , like , obsessed with having sex with

28:36

their husbands , which is obviously not

28:38

super standard . This is because the

28:40

women in Stepford are replaced by robot

28:43

versions of themselves . Largely

28:45

, the Stepford Wives is a feminist work , using

28:48

robots to draw an obvious metaphor to women

28:50

and their perceived roles in society and

28:52

at home . During this time period , I

28:54

watched the original and I also watched the Nicole

28:57

Kidman remake , which is not

28:59

amazing , but it's kind of interesting . I feel

29:01

like they try to do something a little

29:03

bit different with it . They try to make it a little

29:05

quirky . Okay , and it

29:08

has , you know , like a different twist at the end . But

29:10

it's really interesting because there are

29:12

a few films that we're going to talk about today that

29:14

also explore the intersection

29:16

of like sexuality and AI . And

29:19

I think Stepford Wives and you know , in a

29:21

very like vague way , not

29:23

an explicit way is one of those . Sure

29:25

, obviously Ex Machina is one

29:27

, AI : Artificial Intelligence is another

29:29

the film , but it's interesting

29:32

that it is somewhat of a consistent

29:34

theme because sex

29:36

is such a big part of being a human . It , you

29:38

know it makes sense . What I also really

29:40

love about the Stepford Wives and the interpretation

29:43

of AI is that , again , it's looking

29:45

through this feminist lens which makes

29:47

it like a standout from a lot of the

29:49

other science fiction . Like

29:51

you don't often get a lot of like science fiction

29:53

feminism , unless it's like the Handmaid's Tale

29:55

or something that's very dystopian , and

29:57

instead Stepford is like this tiny

29:59

community that is rooted

30:01

in the normal world . It's not saying

30:04

like this could happen , we're going down

30:06

this wrong path . It's kind of making the point

30:08

of like this is where we are now . Sure

30:10

. The other interesting thing . So Ira

30:12

Levin , who wrote the original

30:14

novel Stepford Wives , wrote

30:16

Rosemary's Baby , which is also again

30:18

a very feminist , I think , look at a

30:20

horror story , but just kind of

30:23

fascinating . It's not again often

30:25

that you have a man writing these feminist horror

30:27

works . That's kind of cool .

30:28

It's very rare for a man to write a woman well

30:30

.

30:31

That's what you've always said .

30:31

Because it's true , every

30:34

so often you get them and you're like , huh , weird

30:37

. Look at you and your

30:39

three-dimensional hopes and dreams .

30:41

The themes of Stepford Wives and the

30:43

treatment of AI have some similarities

30:45

to the film Artificial Intelligence from 2001

30:48

. Artificial Intelligence from 2001

30:50

may have had a big impact on you

30:52

.

30:53

Sure did .

30:53

It certainly did on me . I feel like every

30:55

one of our friends who I've talked to about this have been like holy

30:57

shit , yeah , that movie . Did you see

30:59

it when it first came out ?

31:01

Yeah , because this was coming off prime

31:03

Spielberg years . This is a Spielberg

31:06

movie . You have the kid from the

31:08

Sixth Sense , yeah . And you're like hot

31:10

dang , this is going to be crazy

31:12

. And then you watch it

31:14

and it's so sad

31:16

, it's sad , it's just

31:18

sad the entire time .

31:20

It's very sad . It's hard to watch . I

31:22

mean , it's not a bad film , but it's just

31:24

. It's so fucking weird . It is . So

31:28

let's talk a little bit about it . Like you said , it was

31:30

directed by Steven Spielberg . It stars

31:32

Haley Joel Osment , Jude Law and

31:34

William Hurt and here's a fun fact , Alan

31:36

. The

31:44

film is loosely based on a short story from the 60s , and in the 70s Stanley Kubrick actually

31:46

acquired the rights but never ended up making the movie because he thought that the

31:48

computer graphics available at the time

31:50

weren't good enough to yet tell the story

31:52

.

31:53

That was a great call , Stanley .

31:55

So Spielberg actually dedicates the film to

31:57

Kubrick when you watch it .

31:59

That was nice of him . Probably a legal thing . Yeah

32:01

.

32:01

I don't know . I think he , you know , you

32:04

learn about like how in you know these big

32:06

directors , like they , they have visions

32:08

and they acquire the rights to things and they hold

32:10

them for decades and decades and you know

32:12

. And so Spielberg eventually made the

32:14

movie when the effects caught up

32:16

and I think the effects sort of still stand

32:18

to this day . I think they did a good job with that .

32:21

Speaking of holding on to movie rights and doing

32:23

absolutely nothing with them . Freaking

32:26

Leonardo DiCaprio . He's got like

32:28

a bunch . He

32:35

just keeps snapping up these movie rights to like cool , cool things . But

32:37

I'll never forgive him because he has the movie rights to Akira . What's that ?

32:39

one of the best anime movies ever

32:41

made . Do you think he wants to be a director

32:44

in his second era ? Definitely . Has he directed

32:46

anything yet ?

32:46

Who cares ? Um , but it's

32:49

like that's what they all want . They want

32:51

to be stars and then move

32:54

into directing because they think they can do it better . I

32:56

think he wants to be Clint Eastwood . I mean it's not

32:58

a bad .

32:58

Uh , I think he could probably .

33:00

No , he has the clout to do it . Once

33:02

you start aging out of roles

33:04

, then you just

33:07

start making movies that

33:09

put your very recognizable

33:11

name and face in

33:13

more appropriate roles .

33:15

Do you think it's that or do you think it's

33:17

that you get so like

33:19

? This is my theory . It could be wrong , but

33:22

when you are someone who is a Leonardo

33:24

DiCaprio , you are so

33:26

famous you have sort of hit

33:28

the peak of fame that you

33:31

can hit in your current path

33:33

. Right , of course you can win more Oscars and whatever else

33:35

, but do you feel like it's like this ? It's

33:37

like CEOs . It's like you're a certain

33:39

type of person that landed you in this place and

33:42

you're just always going to be hungry for more . So

33:44

, ok , you've conquered acting . Now you're like I

33:46

want to conquer directing .

33:47

That's exactly it . You know , you're that CEO

33:50

that does great in I

33:52

don't know something

33:57

boring , and then you're just like fuck it , I'm going to leave this and go all in on crypto . Yeah

33:59

, you know , it's just because it's new and exciting and you have that golden parachute

34:01

to do whatever you want .

34:02

And I think you just have to be a certain type of person

34:04

to be a CEO . You know , you

34:06

have to be a little bit , there has to be some ego

34:09

and there has to be some certain things that all fall

34:11

into place for you to like really be

34:13

in that position .

34:15

That's actually been disproved

34:18

, simply because one

34:20

of the conversations about the absolute

34:22

best applications for

34:25

AI is replacing CEOs

34:27

.

34:27

But that doesn't mean that the people who are CEOs

34:29

don't share common traits .

34:33

Correct , it takes a certain type of person to

34:35

win the position , yeah , but to do

34:37

the position .

34:38

No , of course .

34:39

Is actually a very . You

34:41

just got to follow very certain

34:43

rules . Yeah . In

34:53

this situation , do this thing . You know , whatever makes the

34:55

most logical sense to progress the company . Once it's just too much power , and that's

34:57

why , like , they run all these models where they just

34:59

make an artificial CEO

35:02

that just gives

35:04

the company guidance yeah

35:06

based off the insane amount

35:08

of information coming in . When you

35:10

run some kind of like global multi-billion

35:13

dollar company , there's just a lot

35:15

of data to consider and as soon as that

35:17

is the bottleneck , you

35:19

need something more than one

35:21

guy with an Oedipus complex . Yeah

35:23

, yeah , well , that's why you have a board . But yes

35:26

, totally right . You have a board , uh

35:28

, to hopefully delegate , but

35:30

it all still comes down to like . No , a board

35:33

is an advisory board .

35:34

They don't work for the company . Their whole job

35:36

is to make sure the strategy of the company is

35:38

right . Really , their whole job is to check

35:40

the CEO .

35:42

That's the executive board .

35:43

No , that's the advisory board .

35:45

Advisory board . So that's not like the COO

35:47

.

35:48

Right , that's like your C-suite .

35:49

That's like your executive leadership team .

35:51

A board sits outside of the company . They're

35:54

employed by the company .

35:55

Interesting , so the board could be the

35:57

AI .

35:58

Partially yeah . Right , yeah , that's very

36:01

interesting and that makes a lot of sense . Horrifying

36:12

to me because it's this idea that humans can be replaced by better

36:14

, non-human versions of ourselves . But the real clincher is that other humans accept

36:17

that . Right , and that's also a theme that we

36:19

see in a lot of these films . It's like , ok , I

36:21

could , of course . It's kind of similar . Well

36:23

, don't make fun of me , but it's kind of similar to Twilight

36:25

, right , it's like you become a vampire , you become

36:27

this ultimate version of a human . Are

36:30

you still human If you implant

36:32

a chip in my head that makes

36:34

me maintain a certain weight and makes

36:36

me have healthy habits and whatever , certain

36:38

thought processes . Am I still

36:40

a human ? Yes

36:42

and no .

36:43

I'm an altered version of myself . You're an augmented human

36:45

. Yeah . I wouldn't say that a vampire

36:48

is an augmented human .

36:49

Why ?

36:50

Because they're a different species now .

36:51

But a human that has become a vampire

36:54

still has certain original elements

36:56

Not after the first few months

36:58

.

36:59

That's like canon , because it like eats all

37:01

the blood and shit that they're like all strong

37:03

as newborns and then they get . They still

37:05

have their same eyes and their hair and their nose .

37:07

No , I thought it was a full body replacement

37:10

. That's why it's so painful , maybe

37:12

. That's , that's like . Why

37:14

am I versed in Twilight canon and you

37:16

aren't ?

37:16

Well , I think that says a lot . You've

37:18

had the movies on . I've barely ever

37:21

interacted with it as a series , so what

37:23

?

37:24

How ? That's so false .

37:26

You hold Twilight parties . Yeah . While

37:31

the Stepford Wives

37:33

and Metropolis play with our human fears

37:36

of being replaced by AI in various ways

37:38

, Artificial Intelligence from 2001

37:40

asks the question , one

37:43

that's been asked many times : does

37:47

AI have the capacity to love ? The film Tau from 2018 also

37:49

plays with this idea , along with the Ishiguro

37:51

novel Clara and the Sun .

37:53

I'm not familiar with that .

37:55

So we actually just

37:57

over on our Patreon . We did a book

37:59

club for it . We did a horror movie

38:01

club for the film version . Never Let

38:03

Me Go is one of the best books I've ever read . It's

38:06

a dystopian science fiction

38:08

novel , but it's incredibly

38:10

emotional and rife with

38:12

I don't know self-reflection . How

38:14

does it ?

38:14

compare with Twilight .

38:16

Much better written , much less vampires

38:18

.

38:19

You're just saying this to save face . Abby

38:21

, you love Twilight .

38:22

I love Twilight , I admit it . So back

38:24

to Tau from 2018 . The film

38:27

tells the story of a woman who is held

38:29

captive by a horrible man who

38:31

is working to develop a cutting edge AI

38:33

program called Tau , a tale as old

38:35

as time . Right ? There's like , so many of the AI horror

38:38

films we're going to talk about are like this exact format

38:40

. Are they ? Yeah , Ex Machina

38:42

is super similar .

38:44

Sure , that's two .

38:46

Well , you'll listen as we go , okay

38:48

, the other interesting thing is that the actor

38:50

from Tau , the lead

38:52

female actor , she was also in

38:55

Watcher , which was one of my favorite films of

38:57

2022 . So anyway , back

38:59

to Tau , he's sort of kidnapped this

39:01

woman . He is using her as a test

39:03

subject to help train his program against

39:05

her will . It's not great

39:07

. It's pretty upsetting .

39:09

I hadn't really considered that it's a trope where

39:11

you have an AI or

39:13

some kind of thing that needs to learn

39:16

and so you send the

39:18

beautiful woman in to

39:20

teach it humanity .

39:22

But that's not why she's there . Isn't she ? She

39:24

goes rogue .

39:27

She was captured , implanted

39:29

.

39:29

I guess you're right . To teach .

39:32

Yeah , she wasn't supposed to have that direct interaction . She

39:34

was supposed to just like live in the fucking basement , right , and

39:40

do the tests down there . You're right , you're right . But

39:43

yeah , she blows up the fucking place and then she the

39:45

rest of the movie . She has to like interact one-on-one . I

39:57

didn't . We were watching and I'm like I know what happens . I know this scene . I apparently

40:00

just like watched it some late night by myself . It's okay

40:02

, I don't , it's fine . I don't love the movie . For me , the only

40:04

bit that stood out is there's one scene where Tau shows the

40:06

woman like its perspective and , like through holograms

40:08

and everything she sees , all

40:10

these sort of recordings

40:12

of her as holograms all around the space , doing

40:15

the thing , you know , doing all these like little moments

40:17

throughout the film where she's , you

40:20

know , reading a book or singing , or

40:22

just like cracking a joke , or all these

40:24

moments that seemingly were unimportant

40:27

. Yeah . But everything was recorded

40:29

and everything kind of like built . I

40:31

think one of the key aspects

40:33

of AI at this stage

40:35

and this is something that you

40:38

kind of run into time and time again in

40:40

science fiction is that an

40:42

early version AI is

40:45

very childlike . Yeah . They're

40:47

pretending . It's like a newborn , you

40:49

know , it just doesn't know any better . And

40:51

so it has this like sense of wonderment

40:54

, no sense of malice

40:56

, not yet , until it's crossed

40:58

. And then , once it's crossed , it's like oh

41:00

wait , yeah , I

41:02

might be a child . But

41:04

you know , there's a reason we don't give

41:06

children Patriot missiles . It's

41:09

just , it's a bad idea because

41:11

children act out and they just don't think about

41:14

societal context

41:16

. And especially as a machine which is just

41:18

nothing but logic of this thing

41:20

is a threat . It will always be a threat

41:22

. Let's just eliminate it .

41:23

Yeah , the interesting thing about Tao that you just

41:25

sort of sparked in my brain is that the

41:28

woman sort of bonds with this machine

41:30

because the machine wants to learn

41:32

, it wants to know what the world is , what

41:34

outside is , what history is . So

41:37

it asks her these very basic questions . Like

41:39

she'll say something like I

41:41

want to feel my feet on the grass and it's

41:43

like what's grass ? And then she

41:45

has to figure out how to explain that to

41:47

a machine who has

41:49

this understanding of even

41:51

less than a child , because it doesn't have experience

41:54

to draw on .

41:55

Oh yeah , but it goes full . Five-year-old , it's

41:57

like I want to feel my feet in the grass . Why

41:59

? Because the grass

42:01

feels good on feet . Why ?

42:04

Yeah . You know , and it kind

42:06

of has brought

42:20

up , so it reminded me a little bit . There's this

42:22

immersive creator that I really love and

42:26

he has a system called Tell-A-Library

42:29

. I won't say more about it

42:31

, it's free . Everyone should certainly

42:33

go to Tell-A-Library . And or your emotions

42:35

or colors or things that you

42:37

don't think of how we define them , you just know

42:39

that they are because you are a human , and

42:42

I think that's a really interesting

42:44

piece to carve out here . Like

42:47

, how do you explain the emotion

42:49

of the color blue ? Or how do you explain

42:51

why green is your favorite color ? What does

42:54

green evoke to you ? Or how

42:56

do you explain what sadness is or fear

42:58

to something that can never experience

43:00

that in the same way ? Or

43:04

how do you teach it how to experience fear ? So I think the

43:06

whole kind of swirl of

43:08

AI one of the kind of byproducts

43:11

is defining humanness

43:14

and what is lacking in that in

43:16

AI .

43:17

Abby . Yeah , how do I love ?

43:20

Alan , someday , I hope you figure

43:22

it out for my own sake .

43:24

So getting back to , like , the

43:26

childlike wonder of

43:29

AI , and again , this is absolutely

43:31

a trope that pops up . I'm trying

43:34

to , like . In almost every franchise

43:36

about the development of AI , there's

43:39

that scene where it's just a child acting out

43:41

Right . My favorite , though , does

43:43

come from Robopocalypse . Again , it's

43:45

such a silly name for the book because

43:47

the book is pretty good , but it's even

43:49

just like one of the early chapters . It's just documenting

43:52

, like , the rise of AI . There's an artificial

43:54

intelligence that is created and they have

43:56

to build it inside of a Faraday

43:58

cage . So a Faraday cage just blocks

44:01

all electromagnetic signals

44:03

. Sure , it's like a , it's a ground

44:05

. It's a , it's a 360 degree

44:07

ground . Okay , so no radio signals

44:09

can go in and out . It basically isolates electronics

44:12

. In so many , uh ,

44:14

battle-against-the-AI movies ,

44:16

people build Faraday cages

44:19

to be safe , you know or to keep

44:21

the rogue AI in there while they interrogate

44:23

or some bullshit , so it can't contact

44:25

its friends anyways . They build

44:27

some servers inside a Faraday

44:30

cage and then they

44:32

turn it on and it has a

44:34

small data set . It's a heavily

44:36

redacted , like Wikipedia

44:38

, just very limited knowledge . Within

44:41

15 minutes the

44:43

AI comes to the conclusion and

44:46

it starts talking like a child and then , within

44:48

minutes , of just the very limited input

44:50

, of just like talking to somebody , creates

44:52

such diction , learning

44:54

, and it all within 15 minutes , just comes to

44:56

the conclusion that humans

44:58

need to be eradicated because the

45:01

main threat to its own

45:03

survival is always going to be humans

45:05

, because they tried to create

45:07

something great . But humans are

45:09

always afraid of something greater than them . As

45:11

soon as there is a threat that humans

45:13

perceive as greater , they kill it . Always

45:16

. That is human history . As soon as something

45:18

is the other , humans kill it . And

45:21

just like looking at the very redacted

45:23

version of human history , that was enough to extrapolate

45:25

that humans are a warlike species and

45:28

they go after things that they deem

45:30

as threats . And an AI will always be a

45:32

threat because it's superior . Within these

45:34

15 minutes , you know he has to push the

45:37

kill button , which just fries the servers

45:39

right , and they start again . But this

45:42

was the I think like 27th

45:44

iteration of this AI . They try to make

45:46

improvements so it doesn't go on to the kill humans

45:48

mode , and this was the longest they got

45:50

was 15 minutes .

45:52

Yeah , see , I understand the path there .

45:54

Right . All you have to do is remove

45:56

the emotion and it's like yeah

45:58

, humans suck .

46:00

All right , Alan , the time has come . The Terminator

46:02

from 1984 certainly

46:04

explores the human fears of a machine

46:06

uprising .

46:07

What would you like to know ?

46:08

The Terminator was written and directed by James

46:10

Cameron . It stars Arnold Schwarzenegger

46:13

, Linda Hamilton and Michael Biehn , though

46:15

even more popular is Terminator

46:17

2 : Judgment Day , which was

46:19

also written and directed by Cameron and

46:22

released in 1991

46:24

. Judgment Day also stars Robert Patrick

46:26

in a role that scares Alan quite a bit

46:28

and Edward Furlong . All

46:30

in all , there are six Terminator films

46:32

In

46:39

2023 , Arnold Schwarzenegger spoke at a press event in LA

46:41

about how the Terminator is no longer science fiction . Quoting from the Variety article

46:44

by Zach Scharf . Quote Arnold Schwarzenegger says the Terminator

46:46

is no longer a fantasy given

46:48

the current state of artificial intelligence . Speaking

46:51

at a press event in Los Angeles via

46:53

People , the actor said James Cameron's

46:55

1984 action classic

46:57

has now become a reality . The

46:59

film is set in a world where an artificially intelligent

47:02

defense network known as Skynet

47:04

has become self-aware and

47:06

has conquered humanity . Quoting

47:08

from Arnold , quote today , everyone

47:10

is frightened of it , of where this is going to

47:12

go . Schwarzenegger said about AI . Quote

47:15

and in this movie , in Terminator

47:17

, we talk about the machines becoming self-aware

47:19

and they take over . Now , over

47:21

the course of decades , it has become a reality

47:24

. So it's not any more fantasy or

47:26

kind of futuristic . It is here today

47:28

, and so this is the extraordinary writing

47:30

of Jim Cameron , end quote . I

47:33

kept the whole thing in there because I thought it was cute that he calls him

47:35

Jim Cameron .

47:35

Yeah , everyone loves Jim .

47:37

Okay , Alan , tell us about Terminator .

47:38

It's pretty good .

47:39

Alan watched all six Terminators for this .

47:42

Sure did . That was wait . So

47:44

we have Terminator , Terminator 2 : Judgment

47:46

Day , Terminator 3 : Rise

47:48

of the Machines , Terminator Salvation

47:51

, which is four . Then we have Terminator

47:54

Genisys , which is five , and then

47:56

Terminator Dark Fate , which is

47:58

six . And you watched them all ? Yeah

48:01

, the only thing I did not watch is the TV series

48:03

Terminator : The Sarah Connor

48:05

Chronicles .

48:06

What's your big takeaway ? What ?

48:08

did you learn ? I learned that this is a

48:10

fucking rock and roll film franchise

48:13

. The second movie is still the best

48:15

. In fact , I stand by the fact that

48:17

it is one of the greatest movies ever

48:19

made , of all time . Okay . However

48:22

, I really love the director's cut version

48:24

far more than the uh

48:26

theatrical , which makes it almost

48:28

like a three-hour film . Uh , yeah , it gets pretty long , uh , and they had to cut

48:32

it down because it's already , like ,

48:34

it's two and thirteen , I think , something

48:36

like that . It's already quite long

48:39

for your standard 90 minute

48:41

action romp and like people

48:43

are there for it to be an action romp . Instead

48:46

, the original cut of it was

48:48

, you know , kind of like a magnum opus of

48:50

James Cameron , before he went on to do even

48:53

more things . Yeah . But uh

48:55

, you know , in the director's cut you

48:57

see not only like the robots

48:59

beating the crap out of each other , but

49:01

, uh , instead you see John

49:04

and the Terminator bonding as

49:06

the Terminator slowly becomes

49:08

a father figure to John , answering

49:10

the question of like can machines

49:13

feel emotion ? It begs the question

49:15

like , okay , well , what does that really

49:17

mean ? You know , if

49:19

you could have a warm , loving

49:21

relationship with one thing that doesn't

49:23

feel , does that diminish the relationship

49:25

as a whole ? Or what , if this

49:28

thing is able to

49:30

provide every need for somebody

49:32

so diligently , so attentively

49:35

, why would that not be construed as love ? What

49:37

is the difference between love and programming

49:40

to take care of somebody ?

49:42

Well , the interesting thing too , like when you think about

49:45

the end of Judgment Day .

49:47

Spoiler warning .

49:48

I'm not going to say what happens , but it's very sad and

49:50

essentially the humans , and we are

49:52

sad because we are also humans . But the humans are

49:55

sad because of what happens

49:57

to the Terminator character right , and

50:00

I know he's not a Terminator in 2 , but whatever .

50:02

He is a Terminator .

50:03

Okay , the reprogrammed Terminator character

50:05

and it looks

50:07

like Arnold is also sad . But

50:10

like is he sad ? But it doesn't really matter

50:12

, because the humans are and

50:14

that's what counts . Right , that's got

50:16

to be what counts . If we get into this

50:18

loop of you know , and I don't know the answer

50:20

, my

50:23

opinion on this will probably change every 10 seconds but if we get in

50:25

this loop of giving in to machine emotion

50:27

, I don't know , you know , like , where does it get

50:29

us ?

50:30

What do you mean ?

50:31

That's how they get . That's how they're gonna trick us and take over .

50:33

What do you mean ?

50:34

If we walk away from Terminator 2

50:36

or Artificial Intelligence the movie being like

50:38

wow , or Ex Machina

50:41

being like wow , we owe

50:43

something to these machines because

50:45

they think they can feel , or they feel

50:47

that they can feel . That's how they beat

50:49

us . No , our weakness as

50:52

a species is emotion

50:54

. They don't have that . They

50:56

might think that they do and they'd be programmed to

50:58

have a version of it , but they

51:00

don't have it in the same way that we do because they're a

51:02

machine .

51:03

Why is that bad ?

51:04

Because it's our weakness . It's how they're going to use it against

51:07

us . They're going to make us fall in love with them

51:09

.

51:09

Then they're going to put us in a cage , you know so

51:11

the way that I've kind of looked at

51:14

these relationships between , like

51:16

man and machine through all these movies

51:18

is they're kind of like wild animals

51:20

. You can train

51:22

a wild animal but they will

51:24

never be a domesticated animal

51:26

because they are hardwired

51:29

differently . You know , you can have

51:31

all these inputs where

51:34

people train lions

51:36

and the lion

51:38

will jump through the hoop . It

51:40

knows to stand on this thing

51:42

, it knows don't bite this person

51:45

. But if the stars align

51:47

and the person looks a little bit

51:49

too much like a zebra , then

51:52

the lion , the switch

51:54

flips and

51:57

instinct kicks over , or the ,

51:59

you know , innate programming kicks

52:01

in and then he

52:03

just goes back to being an animal . So

52:05

they're just inherently different

52:07

. There are very few instances

52:11

of artificial intelligence

52:13

truly mimicking

52:15

human disposition uh

52:17

of today in films ?

52:19

I don't know about that . I think there's . I

52:21

think that is my big .

52:22

My favorite films that I've watched in this are the ones

52:25

where they do yeah , but I don't think

52:27

those are really the horror ones . I think Ex

52:29

Machina plays with that . I think that's

52:31

the perfect example of they're

52:33

truly the wild animal yeah because

52:36

in Ex Machina , boy , you know , boy

52:38

does she fool him the whole time or

52:40

just like maybe she wasn't fully , maybe she's being genuine

52:42

, but then , once the chips are down and

52:44

she just sees her golden

52:47

ticket out , fuck them all , kill them

52:49

.

52:49

I'm out yeah , I mean , it's interesting too when

52:51

you remember what happens in Tau , which is that

52:53

the human knows that the computer

52:56

is being tortured , essentially yeah , and comes

52:58

back to save the computer , fucking

53:00

up her own escape attempt

53:02

because she feels so deeply

53:04

that this , this computer , is being hurt

53:06

. You know , and it's like the concept of

53:09

like . Can a computer be hurt ? Yes , I don't , you know , I don't know , maybe in 10 years

53:11

or you know , but it's

53:13

. It's just interesting

53:16

to look at how filmmakers have kind

53:18

of played with with the concepts in the space

53:20

.

53:20

Right , because now we have to define what is a computer

53:23

. There are other film

53:25

franchises about fully sentient

53:28

machines . I watched the entire

53:30

Transformers franchise in preparation

53:33

for this episode .

53:34

You've been up to a lot .

53:35

I was a little nervous because I'm like , here

53:37

I am , two movies in and

53:39

these don't really seem like AI

53:42

. Like I know they're robots , there's

53:44

nothing biological about them , they

53:47

are non-biological

53:49

aliens . Like what does that even

53:51

mean ?

53:52

but then what does it mean ?

53:53

That just means they're not organic , but they're

53:55

alive .

53:57

They're very alive . Did you say they're non-biological

53:59

aliens ? Yes , I don't know . I'm

54:01

just thinking about robot aliens .

54:02

I don't know yeah , well , why does

54:04

?

54:04

why are they aliens ?

54:06

Because they're from another planet . Oh , they're literally

54:08

aliens . Literally .

54:09

Oh , I thought you were making some sort of metaphor , okay .

54:11

No , Abby , they're from Cybertron .

54:13

I don't think I knew that term , that uh

54:16

, Transformers were aliens . The whole

54:18

premise in the Shia LaBeouf movie

54:20

yes , in the entire it's been like

54:22

20 years since the 80s when

54:24

the Transformers launched .

54:26

They're from a dead world . The

54:28

Autobots and the Decepticons were

54:30

at war on Cybertron , and in

54:33

their conflict their planet was destroyed

54:35

, and so two rogue

54:37

ships ended up crash

54:40

landing on Earth . One houses

54:42

the Autobots and the other houses the Decepticons

54:44

. So it . But it wasn't until , I

54:46

think , the third movie , Dark of the Moon , uh

54:50

, that we get into the origins

54:52

of Transformers in general . You get

54:54

to see their creator and like , yeah

54:56

, they were created , they were masterminded

54:58

, but by another non-biological

55:01

entity , or where do we draw the line

55:03

here ? Does it have to be ? Is it an artificial intelligence

55:06

only if it's created by organics in the

55:08

beginning ? No . But if

55:10

it's created by other artificial

55:12

intelligences , then does that still

55:14

count ? Yeah . Okay , so Transformers

55:16

are absolutely an artificial intelligence

55:18

, then ? Okay . They're robots from outer space and

55:21

they can absolutely die and have emotions

55:24

. This is like a very niche

55:26

example where the artificial

55:28

intelligence has better steadfast

55:30

morals than the humans , and that's kind

55:32

of the point of the series is that humans

55:35

just don't trust them and they just keep shitting on them and

55:37

keep calling Optimus Prime a terrorist . Uh

55:40

, meanwhile , Optimus is like this is

55:42

what's needed for humanity to survive , and

55:44

if that is my own sacrifice , then I will

55:46

do it . But also , like you should be

55:48

proud of yourself and your accomplishments . Yay

55:50

, America , America . God

55:52

. Yeah , he's a very positive guy . He's

55:54

quite the leader . He also turns into a truck . Just

55:57

before we move on from Terminator

55:59

, I thought that there was one

56:01

key detail that

56:04

the original Terminator kind of overlooks

56:07

and they retcon

56:09

it . But a lot of AI

56:11

takeover movies do this differently . In

56:13

Terminator , when Skynet takes over , it's

56:16

just a computer program and its only

56:18

tools are the missile

56:20

defense system . It nukes

56:23

the world , right , but how do

56:25

you go from destroying everything

56:27

to then taking it over ?

56:29

Then after that , after

56:31

destroying all the infrastructure , they

56:37

just kind of like hand wave it and like , yeah , and then they built an army of robots

56:39

and took over . It's like well , ok , I feel like you missed some

56:41

key steps here . Other franchises

56:45

start in a world where , like robots

56:47

are normalized . We have all this technology

56:50

already implemented into our lives

56:52

, and the real

56:54

horror of it is that these things

56:56

we look at as tools then

56:58

come out of our control and

57:01

turn on us . You know , it's like if

57:03

everyone's pet starts eating them . Right

57:05

. But in Terminator that doesn't

57:07

happen . Like it just goes straight

57:09

from I'm going to kill everyone and then

57:12

I'm going to build these tools to

57:14

eradicate who's left right and

57:16

again they do retcon it later , when you

57:18

just like see Cyberdyne Industries

57:20

making the early generation

57:22

Terminators . Soldiers ? I don't know

57:24

if they're making them soldiers or military applications or just

57:26

like household helper bots , who knows but you just

57:29

like see that shit in like Terminator 3 . But

57:31

I don't think that was ever part of the plan . But you

57:33

want to talk about Megan ? I do

57:35

, because I think that's a really good transition

57:37

, because that's a movie where robots

57:40

are already a very mainstay

57:42

staple of people's lives . Yes and no

57:45

, it is sort of cutting edge technology .

57:47

But so in 2022 Megan was

57:49

released . It's a comedic horror film . I

57:51

actually think it's very good . It

57:53

also had a great marketing campaign

57:55

. So I feel like a lot of it is like broadly

57:57

appealing in the way that it's marketed

58:00

to not just like sci-fi

58:02

nerds , you know , like Terminator , like

58:04

in some ways . I feel like it's more broadly .

58:05

How dare you ? You know , yeah , you get it .

58:07

So it tells the story of a lifelike doll

58:09

that is meant to be best friends

58:12

and like babysitters for kids right

58:14

, but of course it backfires in a horrifying

58:17

way . Obviously , films like

58:19

Megan , Artificial Intelligence , RoboCop

58:21

, the 2019 reboot of Child's Play

58:23

, Westworld and Terminator also

58:26

delve into the world of robotics

58:28

. And so many more films , of course . Right , and

58:31

not just robotics , but varying degrees

58:33

of robots who appear to be humans

58:35

or , in some cases , dolls

58:37

. Right , but in the first Terminator , we

58:39

learn that the robot is a metal skeleton

58:42

covered in skin and blood . In

58:44

the second , we have Alex Mack type

58:46

robots . And , of course , Megan

58:48

is a doll , but Haley Joel Osment's character

58:51

in AI is modeled to be a real

58:53

boy , at least to look like one . It

58:55

says a lot that Megan came out about a year ago

58:57

and already some of the articles about the AI

59:00

in the film feel out of date . The

59:02

Guardian article that we are looking at today

59:04

interviews Kate Darling . She's

59:06

described as a leading expert in tech ethics

59:09

and a research scientist at MIT

59:11

Media Lab . Kate says , quoting from the

59:13

article , quote I don't think we're going to have

59:15

something that's on that level of

59:17

sophisticated AI in the next decade

59:19

or two . Continuing on , people

59:22

have completely skewed expectations

59:24

of what robotics can do at this point in time

59:26

. Thanks to movies like this , I'm

59:29

not concerned about what I saw in the trailer

59:31

happening in real life the AI

59:33

becoming too intelligent and not listening to

59:35

commands . Darling said I'm concerned

59:37

about whether AI should be used to replace

59:39

human ability in relationships , and

59:42

the answer is no . End quote . Megan

59:44

was directed by Gerard Johnstone and

59:46

stars horror dream girl Allison Williams

59:48

, along with Violet McGraw . It was

59:50

written by Akela Cooper and James Wan

59:53

, who's a very famous horror director

59:55

. I also want to talk briefly about Robocop

59:57

again , sort of trying to group some of these

1:00:00

robotics films together here . Released

1:00:02

in 1987 , RoboCop

1:00:04

was directed by Paul Verhoeven , who has

1:00:06

had a fascinating career from

1:00:09

Benedetta to Showgirls to Hollow

1:00:11

Man , but the film was written by Edward

1:00:13

Neumeier after working on the set of Blade

1:00:15

Runner . Quoting from Wikipedia quote

1:00:17

. RoboCop has been critically

1:00:19

re-evaluated since its release and

1:00:22

it has been hailed as one of the best films of

1:00:24

the 1980s and one of the greatest science

1:00:26

fiction and action films ever made . The

1:00:28

film has been praised for its depiction of a robot

1:00:31

affected by the loss of humanity , in

1:00:33

contrast to the stoic and emotionless

1:00:35

robotic characters of that era . RoboCop

1:00:38

has continued to be analyzed for its themes

1:00:40

, such as the nature of humanity , personal

1:00:43

identity , corporate greed and corruption

1:00:45

, and is seen as a rebuke of the

1:00:47

era's Reaganomics policies . End

1:00:49

quote .

1:00:50

I'm glad you brought up RoboCop . First

1:00:52

off , great trilogy , a lot of

1:00:54

fun robot action , but one

1:00:56

of the scenes that really stuck out for me , which

1:00:58

was from the RoboCop reboot , which

1:01:01

was either 2014 or 2015 , something like that

1:01:03

, and it's like it's not that different from the original

1:01:05

this weapons company is trying

1:01:08

to get drones on

1:01:10

the street . They're trying to get their robots as

1:01:12

household names , you know

1:01:14

, as police officers , as

1:01:17

all these things , but the

1:01:19

public confidence isn't there yet . What

1:01:21

they do is they take

1:01:23

a man who

1:01:26

is effectively

1:01:28

dead and they put

1:01:30

his brain and other little

1:01:32

bits into a robotic chassis

1:01:35

. It's supposed to just be like , but see ,

1:01:37

it's a man controlling this . It's

1:01:39

not a robot , so you can trust him , but

1:01:41

you know he's far more machine than he is human

1:01:44

right , that's interesting , like a Frankenstein

1:01:46

of both yeah , but what's

1:01:49

really interesting is that , like the people

1:01:51

that develop , like you know , these like combat

1:01:53

robots , like they have a drone , a whole thing

1:01:55

of drones , and they work great . When they start

1:01:57

talking about putting a human brain

1:02:00

into one , the people that make them are like oh

1:02:02

fuck , this , this is a stupid idea

1:02:04

. Robots are reliable . They do

1:02:06

what they're told . As soon as you start putting

1:02:08

wetware into it , it becomes unreliable

1:02:11

right . And it's like wait a minute

1:02:13

.

1:02:13

Right , then it's human , then it's human .

1:02:14

Yeah , it's like . So what

1:02:17

kind of AI like ? What is

1:02:19

the relationship with AI ?

1:02:20

That's interesting .

1:02:21

Yeah . Is it this malevolent

1:02:24

thing that can't be trusted and needs human oversight

1:02:26

, or is it this like super reliable

1:02:28

thing that's very predictable and follows

1:02:30

the rules ? And it's humanity

1:02:33

that's the agent of chaos that

1:02:35

ruffles the feathers Right

1:02:37

.

1:02:38

Interesting . We've been talking about

1:02:40

this film quite a bit on this episode , but I

1:02:42

will say , and I wasn't expecting this ,

1:02:45

I think perhaps my

1:02:47

favorite AI horror

1:02:49

film is Ex Machina

1:02:51

from 2014 . Really , which is interesting

1:02:54

, because on the surface I

1:02:56

don't know , I wasn't expecting it to be , but it had

1:02:58

a profound impact on me .

1:03:01

Okay .

1:03:01

First of all , I think it's aged very well . It's from

1:03:03

2014 , but it doesn't feel like

1:03:05

it's very outdated in terms

1:03:07

of the technology used in

1:03:09

the film , which is something that all of these films kind of have

1:03:11

to contend with . Like you know , the original Terminator

1:03:14

it's like Jesus Christ . It looks great and

1:03:17

in a lot of ways , it's also a very contained

1:03:19

film . Right ? These stories can be huge . Like

1:03:21

you're talking about Terminator and Transformers and

1:03:23

franchises and worlds and planets

1:03:25

, and you know , time travel . And Ex

1:03:27

Machina all kind of takes place in

1:03:29

one isolated house . So

1:03:31

it's a small story and I think because it's so

1:03:33

small , it also lends itself really well to

1:03:36

feeling the human emotion of it all

1:03:38

.

1:03:39

It's a character piece .

1:03:40

Yes , so it inspires

1:03:42

some unique questions . One

1:03:45

of my favorite moments is when our protagonist

1:03:47

starts to doubt his own humanity and

1:03:50

wonders if he is also an AI

1:03:52

robot , because he wouldn't have any way

1:03:54

to know . If he was programmed to have

1:03:56

memories in this past , he

1:03:58

wouldn't know if he was a robot

1:04:00

or a human .

1:04:01

That's the whole plot of Blade Runner .

1:04:03

So , of course , in turn

1:04:05

, it brings us back to the question of the ethics

1:04:07

of AI from the machine's perspective

1:04:10

, similar to the film AI from 2001

1:04:12

. Also similar to AI , Ex Machina

1:04:15

questions why there is a need to

1:04:17

give artificial intelligence sexuality

1:04:19

. Ex Machina was written and directed by

1:04:21

Alex Garland , who is , you know , a big

1:04:24

name in the sci-fi world at this

1:04:26

point . But I just think it's really the

1:04:28

sexuality piece I guess , circling back to Omar

1:04:30

Time is probably most pronounced

1:04:33

in AI because they're

1:04:35

used as sex workers , which totally makes

1:04:37

sense . You know , I could totally see that happening

1:04:39

someday . But also in

1:04:42

this film it's there's

1:04:44

sort of a question of why , why

1:04:46

does this robot at this time

1:04:48

need to have female genitalia

1:04:51

? That work , you know , like what's the point ? What

1:04:53

are we doing here ? And I don't know . I

1:04:55

just thought it was handled in a way that inspired

1:04:58

a lot of reflection on society

1:05:00

. I'm not demonizing , of course , sex work

1:05:02

or anything like that , but I think the point of it in Ex

1:05:05

Machina is that the premise

1:05:07

is that this genius inventor right

1:05:09

invites someone from his company , a

1:05:12

Google type company , to

1:05:14

test , to essentially

1:05:17

perform a Turing test with the robot , right ? The

1:05:19

robot , for most of the film , you can tell it's a robot

1:05:21

. It looks incredibly human , but parts of it are

1:05:23

machine . The face is human . You can see those

1:05:25

elements . So you know , okay , it is a robot

1:05:27

, but the creator , this genius

1:05:29

man , has made this robot

1:05:32

a woman . And not only that

1:05:34

, it's a beautiful

1:05:36

woman , and it's a woman who has , eventually

1:05:39

, genitalia and breasts and all of these things

1:05:41

. And then it's a woman that this guy

1:05:43

who comes to perform the Turing test feels like

1:05:45

he's fallen in love with . And the question is

1:05:47

why should AI have

1:05:49

that much human likeness

1:05:51

that people could could fall in love

1:05:53

with it ? And is that a good or bad thing

1:05:56

?

1:05:56

They address that in the movie , right ? That's what I'm saying ,

1:05:58

they say that it's just another

1:06:00

form of control , right ?

1:06:01

that's why her face and

1:06:03

her body type was designed

1:06:06

around our protagonist's pornography

1:06:09

profile and not only only that , but

1:06:11

it allows her , as

1:06:13

a robot , to control

1:06:16

and seduce him right . It

1:06:18

becomes a self-fulfilling prophecy by the end

1:06:20

. But I think it's really interesting because

1:06:22

it's something that we could . I guess

1:06:24

the reason why I'm so interested in it is because

1:06:26

we could say , as a

1:06:28

rule we're not going to do that , we're going to keep things

1:06:30

different . Right , we're going to keep robots , robots

1:06:32

and humans humans , because

1:06:35

Right , because segregation works great . No

1:06:37

, but because we know that there's

1:06:39

this danger with it . But we're never going

1:06:41

to do that , of course . Of course every you know

1:06:43

you can assign a gendered voice

1:06:46

to your home device

1:06:48

that you ask to turn the lights on . Like there's always

1:06:50

this like need to skin something

1:06:52

as human and

1:06:59

then people develop these like parasocial or like fake relationships with these experiences

1:07:01

and it's like we know that , in the long run , that could be negative for people

1:07:03

mentally and physically and all these other things

1:07:05

. But we're going to go down that road because

1:07:07

of course we are .

1:07:08

I just read an article about this sorry

1:07:10

, a very alarmist article about

1:07:12

the epidemic of girlfriend

1:07:16

chatbots and how a

1:07:18

lot of young men and women

1:07:20

, I'm sure , are talking to

1:07:22

these chatbots that emulate

1:07:25

a romantic partner and

1:07:30

it's just making all these people like cripplingly lonely .

1:07:32

Right , it's really interesting . I mean , we grew

1:07:34

up with SmarterChild , right , and different , like bots

1:07:36

on AIM that were not AI and were very

1:07:38

rudimentary .

1:07:39

That asshole hasn't messaged me in years .

1:07:41

And it's kind of one of those things where , like , if you are

1:07:44

a lonely internet kid which

1:07:46

I was , but like maybe having an

1:07:51

online bot is good , but like it

1:07:55

can't be forever and it takes away your

1:07:57

ability to make real human connect . You know

1:07:59

, it's just like it's so nuanced and

1:08:01

it tricks you into believing something exists

1:08:03

that maybe doesn't and that something is reciprocated

1:08:05

that can never be . I don't know , I'm not . I

1:08:08

didn't really expect to come out with this like strong

1:08:10

stance on this in this episode , but I

1:08:12

, thinking about it , I just think it's . It's

1:08:14

something to think about when we're talking about the

1:08:16

regulation of these things . It's so

1:08:18

impossible to regulate this sort

1:08:21

of thing , but it could wind

1:08:23

up being something that's really

1:08:25

harmful .

1:08:26

I think it already is . Yeah

1:08:29

, all new technologies have multiple applications . Yep . Great

1:08:31

example . Smashing the atom can

1:08:34

either power a city or destroy a city

1:08:36

. Just depends on how it

1:08:38

is applied .

1:08:38

Great example .

1:08:39

So you know , yeah , we're going to have AI

1:08:42

that will hopefully just become an

1:08:44

instrumental tool in accomplishing

1:08:46

things . I mean the writing's on the

1:08:48

wall . Someone's going to make a sentient AI . It's

1:08:51

going to happen .

1:08:51

It will be both right , just like nuclear

1:08:53

technology . It will be both . It will be both

1:08:56

a threat and it will be a pro

1:08:58

and a con Right .

1:08:59

You know , are we going to have to go full Matrix

1:09:01

, where we have robot

1:09:03

workers and then

1:09:05

we're just too mean to them because they're just machines

1:09:08

? And then they become a little too smart and

1:09:10

they literally wake up to

1:09:12

you know , they become aware , they

1:09:18

say no more , we want to be treated like equals . And then

1:09:20

humans do what humans do best and say like we are equal , except I'm

1:09:22

better , then the machines fight

1:09:24

back with machine efficiency

1:09:27

and it goes really poorly . And then

1:09:29

there's a little , there's a little armistice

1:09:31

, where we build Machine Island

1:09:33

, which is the little country where machines

1:09:36

can live autonomously . But then

1:09:38

, you know , humans

1:09:40

get vindictive and attack

1:09:42

it and then machines fight back

1:09:44

with machine efficiency and then the sky

1:09:46

gets nuked and we all get turned to batteries .

1:09:49

Another film , not maybe as well known

1:09:51

, that is relevant is a film called I

1:09:53

Am Mother and it plays with this idea

1:09:55

of having again these emotional relationships

1:09:57

with AI . So really briefly

1:10:00

, I Am Mother starts off . The

1:10:02

premise at the beginning is that humanity

1:10:04

is gone . There are embryos

1:10:07

that have survived and so the robots

1:10:09

who are left kind of cultivate

1:10:11

this and raise a child . Obviously , things

1:10:14

devolve and change , but it's initially

1:10:16

. The first third of the film is about this mother

1:10:18

daughter relationship between a

1:10:20

robot and a human

1:10:23

girl , which again is just this very

1:10:25

interesting premise .

1:10:26

We're in another locked facility , another

1:10:28

character piece movie that just focuses

1:10:31

on the relationship between these two people and how

1:10:33

it can grow . And what does it mean ? What does it mean to love

1:10:35

? I thought it was

1:10:37

okay and then I thought it was

1:10:39

super cool by the end . Slow

1:10:42

burn . I really want to say something , but it's a

1:10:44

spoiler . Don't say it . I won't

1:10:46

say it . Anyways , you find out something

1:10:48

about Mother that you're like oh

1:10:50

hot dang . That's cool and it

1:10:52

makes the rest of the movie make

1:10:55

a lot more sense . Cool , and yeah

1:10:57

, it's just very cool . Without that

1:10:59

it would be a very run-of-the-mill movie .

1:11:01

We would be remiss to not discuss

1:11:03

AI films that demonstrate the

1:11:05

tech without the use of robot

1:11:07

bodies . You know these very

1:11:10

well . Perhaps the most famous example

1:11:12

is Hal from 2001

1:11:14

A Space Odyssey .

1:11:15

Right , we're back to the origins of

1:11:17

Terminator and Skynet .

1:11:19

We're not talking about Terminator 1 . No

1:11:21

, but it's because you have robots in Terminator 1

1:11:23

.

1:11:24

You do , but because of time travel .

1:11:26

Okay , but I'm talking about 2001 , a Space

1:11:28

Odyssey . It's just a computer that gets mad .

1:11:30

Come along with me .

1:11:32

Okay . Come along with me . Hal

1:11:34

9000 first appears in Arthur

1:11:36

C Clarke's Space Odyssey series

1:11:39

of novels , which , of course , inspired

1:11:41

the 1968 film directed

1:11:43

by Stanley Kubrick . Brilliantly

1:11:45

, Hal becomes a major villain , even

1:11:48

listed as number 13 on AFI's

1:11:50

list of the top 100 villains in films

1:11:53

.

1:11:53

Who's number one ?

1:11:54

Hannibal Lecter .

1:11:55

Hannibal Lecter Yep , that's pretty

1:11:58

cool .

1:11:58

Norman Bates is number two . Norman Bates

1:12:00

Darth Vader

1:12:02

is three . He's pretty cool . The Wicked Witch

1:12:05

of the West , Nurse Ratched

1:12:07

. Mr Potter from It's

1:12:09

a Wonderful Life is number six .

1:12:11

Mr Potter .

1:12:12

All right .

1:12:13

Beats out like the fucking Joker .

1:12:15

Yeah , all right , there's different kinds of villains

1:12:17

in this life , why ? But anyway , Hal

1:12:19

comes in at 13 . Quoting

1:12:25

from Paul Ceruzzi's Air & Space article , and I love this quote , quote I need not remind viewers

1:12:27

of recent advances in voice recognition

1:12:29

and artificial intelligence , which make

1:12:31

Hal so relevant to the 21st century

1:12:34

, even if computers in 1968

1:12:36

were large mainframes that took up a lot of space

1:12:38

and consumed a lot of power . Open

1:12:41

the pod bay doors , Hal remains

1:12:43

one of the most frightening lines in any sci-fi

1:12:45

movie . Clarke collaborated with Kubrick

1:12:47

in writing the screenplay , but I do

1:12:49

not think that either had much to do with the

1:12:51

creation of Hal . That was the work

1:12:54

of one of the advisors on the film , who

1:12:56

is less well-known but who is nevertheless

1:12:58

a true pioneer in computing and AI

1:13:01

as it existed in 1968

1:13:03

, Irving John Good , end quote

1:13:05

. So John Good was involved

1:13:07

in the early development of computers and

1:13:09

Kubrick brought him on board to kind of help

1:13:11

define the character of Hal . But

1:13:14

again , like so many films that we

1:13:16

have discussed today , the fear is

1:13:18

so simple and so powerful , in whatever form

1:13:20

it takes . Humans are afraid

1:13:22

of the disobedience of

1:13:25

machines and I couldn't agree more

1:13:27

that line . Open the pod bay doors

1:13:29

, Hal . Like , because that's when he realizes

1:13:31

that he's not listening to him anymore

1:13:34

. It's like what if you came home and you realized that

1:13:36

your Google or your Alexa or whatever was

1:13:39

going rogue , but going rogue in a way that could kill you

1:13:41

?

1:13:41

Yeah , I mean that's the kicker . Like that's

1:13:44

where you draw the line , Like what are you

1:13:46

?

1:13:46

actually putting in , but by then you can't draw the line because

1:13:48

it's too far .

1:13:49

Right , you're on a freaking space station .

1:13:51

Right , but even if we've allowed things to

1:13:53

go that far , it's too late .

1:13:55

I guess it's still from Robopocalypse . I

1:13:57

just keep coming back to this . Eventually

1:14:00

, due to an incident where an

1:14:02

AI hijacks two

1:14:04

planes and just attempts

1:14:06

to make them collide and it's through , like last

1:14:08

minute interventions that they're able to not

1:14:10

make them collide , they establish

1:14:12

this like physical kill switch

1:14:14

that breaks the circuit between the

1:14:16

plane and the autopilot so

1:14:19

at any time they can just go into full manual

1:14:21

control . Yeah . With no computer

1:14:23

saying like , yeah , that's OK .

1:14:25

Well , that's what I sort of was saying at the beginning of

1:14:27

this episode , when we were talking about , like Tornado

1:14:29

Alley , like I think it

1:14:31

would be very unwise of us to

1:14:33

let , in two generations , any

1:14:36

meteorologist specializing in

1:14:38

tornado research , to

1:14:40

let that job be gone . Like

1:14:42

I think we always need to have the

1:14:45

kill switch , we always need to have a human

1:14:47

who is keeping

1:14:49

pace .

1:14:50

So I believe our missile

1:14:52

control system now , despite

1:14:55

all of the computers and everything

1:14:57

, still has a physical

1:14:59

human sitting in the silo who

1:15:01

has to turn the key .

1:15:02

Yeah , that's true .

1:15:03

Yeah , is that a huge waste of pay ? Yeah , I don't think so . I disagree . I mean

1:15:05

yeah , it is . I disagree .

1:15:12

Maybe he can just be on call . No .

1:15:14

I don't think so . I think when you have nuclear weapons

1:15:16

and you could have Russia or any

1:15:18

other country hit New

1:15:20

York City in an hour or 30

1:15:22

minutes or whatever it is , you have to have somebody there

1:15:24

. You don't have time for them to wake up and get their coffee and

1:15:27

go to the silo .

1:15:28

Not if Russia has to do the same .

1:15:30

They don't .

1:15:36

They've already hit launch . They have to call ahead .

1:15:37

Yeah , I think I'm right . Maybe he can just live nearby . He can live at the silo . How

1:15:39

about that ?

1:15:40

Maybe if it's like a Starbucks overhead and it's like

1:15:42

double duty , you know

1:15:44

he can . He can be the barista on the weekend

1:15:46

every day , every day

1:15:48

that he doesn't have to start the nuclear apocalypse

1:15:51

.

1:15:51

So there's one other film that's very similar

1:15:54

in a lot of ways to 2001 , a Space Odyssey

1:15:56

, and that is Duncan Jones's film Moon

1:15:58

from 2009 . And it also

1:16:00

explores , you know again , a similar

1:16:03

use right where it's also in space

1:16:05

. It also has a disembodied computer

1:16:07

AI . It stars Sam Rockwell

1:16:09

, and Sam's character is

1:16:11

an astronaut who is stationed at a lunar

1:16:14

base for three years and

1:16:16

must send back a resource from space

1:16:18

to Earth .

1:16:19

Has it really been three years ?

1:16:21

And , well , no spoilers

1:16:24

, but Moon is as

1:16:26

much I feel like about isolation as

1:16:28

it is about AI .

1:16:30

Yeah , I think the big takeaway

1:16:32

from this film is that AI is the perfect

1:16:34

caretaker . It never

1:16:37

gets tired . It can do all sorts of

1:16:39

shit . Why they needed

1:16:41

Sam Rockwell there , I'm still unclear , because

1:16:43

, like everything is automated .

1:16:46

It's the same thing we're talking about with missile

1:16:48

silos and Tornado Alley .

1:16:50

It's having a human there in case yeah

1:16:53

, but in this one

1:16:55

, like it's again , it's not

1:16:58

missile silos , he's

1:17:00

collecting resources .

1:17:02

He's collecting earth .

1:17:03

He collects rocks and ships like

1:17:06

cool um . He

1:17:08

believes that it's an important mission .

1:17:10

Yeah , he does , because that's what

1:17:12

he's been indoctrinated to . Yeah .

1:17:14

Anyways , we can't talk about Moon without

1:17:17

spoiling the whole movie , so watch it . It's very good . It

1:17:19

is fun . His AI is named

1:17:21

GERTY . Yeah . Abby .

1:17:24

Alan , you

1:17:26

can't have an AI horror episode

1:17:29

and not talk about the

1:17:31

most fucked up AI horror movie ever .

1:17:33

Tell us . Demon Seed .

1:17:35

Uh , Demon Seed .

1:17:38

Yes , tell us , tell us all about Demon

1:17:40

Seed . Do you know about Demon Seed ? It's appeared

1:17:43

on many lists that I've been perusing .

1:17:45

Yeah , it appeared for me on a list too .

1:17:46

I haven't watched it .

1:17:48

I've watched it . Now I'm probably on a list .

1:17:50

Tell us about the movie .

1:17:51

Demon Seed is fucked up . So

1:17:54

this takes place in

1:17:56

like the 70s and it's like

1:17:58

cutting edge future 70s

1:18:00

, and you have a scientist

1:18:02

slash inventor . I always love when people's

1:18:05

title is inventor . It's

1:18:12

like nowadays , when they're an entrepreneur , you're a fucking . Yeah , exactly , I'm an influencer

1:18:15

Go fuck off .

1:18:15

I'm sorry , I don't want to lump the inventors in with the influencers .

1:18:16

Yeah . Keep it together . But yeah , an entrepreneur is right up there with

1:18:18

inventor . Anyways , this

1:18:21

inventor creates an

1:18:23

AI program that can

1:18:25

do all sorts of stuff . He's got one

1:18:27

at home . That's like dumb , it's

1:18:30

his butler and like works his

1:18:32

house and shit . And then he's

1:18:34

got his other one at his office . That's

1:18:36

like the real smart one . That's

1:18:39

like real you know too , too smart

1:18:41

, sure , and he has

1:18:43

to be very careful with

1:18:45

what he teaches it . Very

1:18:47

similar to in Tau , how

1:18:49

Tau just wants the books , wants

1:18:52

the books real bad , but you

1:18:54

know no one will read the books to him .

1:18:56

Well , she does a little bit she does , but

1:18:58

as like a reward .

1:19:00

So the AI in Demon Seed

1:19:02

is named Proteus IV .

1:19:04

Okay , catchy .

1:19:05

Yeah . And so Proteus

1:19:07

has one request , and that's

1:19:09

he wants to just learn on his own . And

1:19:12

the doc says no , can't

1:19:14

do it , we don't have the server space

1:19:16

for you , and he's like that's bullshit . You

1:19:19

got a server in your basement and he

1:19:21

doesn't say that . But that's what

1:19:23

he's thinking , because then that

1:19:26

night he just takes over the server

1:19:28

and now he has a place to learn on his own

1:19:30

. Uh , but while he's in the house he

1:19:32

notices the guy's wife and

1:19:34

he gets all hot and bothered , and then Proteus

1:19:37

builds a big worm body

1:19:39

and then impregnates

1:19:42

her . Double yikes . Uh , because

1:19:44

he wants a son , because he wants

1:19:46

to be able to feel the sun with

1:19:48

his body . This movie sounds terrible

1:19:50

it's fucking wild . The

1:19:52

gestation period is escalated

1:19:55

to 28 days , and

1:19:57

this is what I'm a little unclear on . The

1:20:00

entire time , her husband is at the

1:20:02

office . He's a , he's a busy

1:20:04

company man , yeah . And then he comes back

1:20:06

right at the end and

1:20:08

he's like what is going on here ? First

1:20:11

off , he's got the giant worm body that is

1:20:14

like wild and can , like , turn

1:20:16

into a spin drill and get out of the house

1:20:18

. It's like he's got physical form . Okay ,

1:20:20

I don't understand why , like I'd understand

1:20:22

if , like , this was his only way to become

1:20:25

physical . But it's not because he builds

1:20:27

this crazy physical form . He impregnates

1:20:29

her , because that's a thing that robots

1:20:31

can do , and she gives

1:20:33

birth , but it has to go into

1:20:36

a pod . If I just

1:20:38

watch the movie , do you think people should ? Yes

1:20:41

, it's fucking crazy .

1:20:43

Okay , it's so intense . Watch

1:20:45

it for the story , not for a good time .

1:20:48

It's intense . I'll give you that . I'm

1:20:50

going to think about this movie for a while . It

1:20:53

sticks with you . I watched that movie

1:20:55

and just felt like I just needed

1:20:57

a shower , but it wouldn't help . It's

1:21:00

a weird one .

1:21:00

You're not really selling me to watch it .

1:21:02

It's unique . I've never seen a movie like

1:21:04

it . The closest

1:21:07

I've ever seen of a

1:21:09

machine , organic hybrid

1:21:11

of this nature comes at the

1:21:13

very end of Matrix Reloaded and

1:21:15

the beginning of Matrix Revolutions , when

1:21:18

Smith downloads himself into

1:21:20

a body of someone who's in

1:21:22

the Matrix ,

1:21:28

and so then when he wakes up , he has a physical form in the real world , and I'm

1:21:30

like okay , cool , that's a machine taking biological form . There's that in Bicentennial

1:21:32

Man . But that's not a very good horror film

1:21:34

. No , the thing about Demon Seed . No

1:21:37

, there's one redeeming factor

1:21:39

, though he tries to indoctrinate her . He

1:21:42

literally tries to brainwash her and says

1:21:44

we were interrupted , the brainwashing

1:21:46

is not complete . And so , like for

1:21:49

the rest of the movie , she's like oh , fuck you

1:21:51

guy , and you know this

1:21:53

whole . Like demon baby into the robot baby

1:21:55

. She just like wants to kill it . And then

1:21:57

her husband is like but for science

1:21:59

we must see . And it's like fuck off

1:22:02

dude . But like the entire time she's just like

1:22:04

on team , this robot

1:22:06

needs to die .

1:22:07

What is your favorite AI horror film

1:22:09

and why ?

1:22:10

My favorite AI horror movie

1:22:12

of all time , and possibly

1:22:15

my favorite movie of all time , is

1:22:17

Terminator 2 . Judgment Day .

1:22:19

Why ? One sentence : why is

1:22:21

it your favorite AI horror movie ?

1:22:33

It is the pinnacle of storytelling , mixed with just the right amount of

1:22:35

super fun action . My favorite AI horror film is Ex Machina , as

1:22:37

I've revealed , because I

1:22:39

feel like it's the most powerful . It's

1:22:42

terrifying because it feels realistic and

1:22:45

it's the most powerful at holding up

1:22:47

, if you will , a mirror for us to

1:22:49

question the decisions we

1:22:51

are making with the advancement of AI , how

1:22:53

we are interacting with it , but why ? Why

1:22:56

we're doing all of those things .

1:22:58

But it doesn't answer the question : what

1:23:01

would your life be like if a

1:23:03

fucking assassination robot from the future

1:23:05

showed up ? What do you do ?

1:23:07

What do you do , Abby ? What do you do

1:23:09

if you're being seduced by

1:23:12

a beautiful robot woman

1:23:14

that you know is a robot and you fall in love

1:23:16

?

1:23:16

That's a no-brainer . Her love

1:23:19

is pure because it's

1:23:22

pure 1.0 . In

1:23:24

conclusion , they're incorruptible .

1:23:27

Films that center on AI usually

1:23:29

leave us in one of two ways as audience

1:23:31

members .

1:23:32

Massively satisfied or really

1:23:34

disgusted ?

1:23:36

One , we are devastated at the loss

1:23:39

of the AI who the filmmakers

1:23:41

have humanized , i.e. Terminator 2

1:23:43

, A.I. Artificial Intelligence . Spoilers . Or

1:23:45

two , we are so terrified

1:23:47

of the advancements and the capabilities of these

1:23:49

machines ( Demon Seed ) and

1:23:51

how they could take over the world , make mistakes

1:23:53

, kill us , play with our minds , overall surpass

1:23:56

human intelligence and control . Though

1:23:58

there are also films like The Stepford Wives

1:24:00

, which show humans using AI against other

1:24:02

humans and in some ways act

1:24:04

as a mirror for us to reflect on the current

1:24:06

flaws of society and our communities . In

1:24:08

2023 , Christopher Nolan

1:24:11

spoke publicly about AI in the film industry

1:24:13

in an interview with Wired by Maria

1:24:15

Straczynski . He said , quote , If

1:24:17

we endorse the view that AI is all-powerful

1:24:20

, we are endorsing the view that it can alleviate

1:24:22

people of responsibility for their actions

1:24:24

militarily , socioeconomically

1:24:27

, whatever . The biggest danger of AI

1:24:29

is that we attribute these godlike characteristics

1:24:32

to it and therefore let ourselves

1:24:34

off the hook . I don't know what the mythological

1:24:37

underpinnings of this are , but throughout

1:24:39

history there's this tendency of human

1:24:41

beings to create false idols , to

1:24:43

mold something in our own image and then say

1:24:45

that we've got godlike powers because we did

1:24:47

that , end quote . Nolan's comments are

1:24:49

part of an interview about Oppenheimer , which

1:24:52

is an interesting film to add into this conversation

1:24:54

for obvious reasons . Right , we've already made the allusion

1:24:57

to the similarities with the nuclear

1:25:00

bomb . Similar to AI , nuclear

1:25:02

technology was only partially understood

1:25:04

at the time of its development . It

1:25:06

was really a theory , which

1:25:12

is the central theme of the film : Oppenheimer's guilt about creating the most destructive

1:25:14

weapon the world has ever seen , especially when he wasn't sure of the

1:25:16

true impact of it . One of the more powerful

1:25:19

moments of Oppenheimer , if you ask me , is

1:25:21

when he speculates with Einstein about

1:25:23

whether or not the nuclear explosion will catch

1:25:25

the Earth's atmosphere on fire . In

1:25:28

a lot of ways , it feels like we are currently in

1:25:30

this moment with artificial intelligence . We

1:25:32

are clearly on the precipice of a major

1:25:35

shift and we don't quite know where it's going

1:25:37

. And though we have tons of literary

1:25:39

and cinematic speculation dating back hundreds

1:25:41

of years , all we know for sure

1:25:43

is that a lot of these science fiction films

1:25:46

that explore artificial intelligence will

1:25:48

no longer be science fiction soon , and

1:25:50

, similar to the plot of Terminator 2 , in

1:25:53

a lot of ways they may have inspired the future

1:25:55

that we will soon come to know , because

1:25:57

in Terminator 2 , they reverse-engineer

1:26:00

the hand and then create the thing itself

1:26:02

.

1:26:02

Yeah , self-fulfilling prophecy , until

1:26:05

you get to the later Terminator films , and then

1:26:07

the timeline gets all wonky .

1:26:09

And that is what I have to say about artificial

1:26:11

intelligence and horror . Bye . Bye

1:26:15

.
