Personhood: The Animal and Robot Edition

Released Tuesday, 28th August 2012

Episode Transcript

0:03

Welcome to Stuff to Blow your Mind

0:06

from HowStuffWorks.com.

0:12

Hey, welcome to Stuff to Blow your Mind. My name

0:14

is Robert Lamb and my name is Julie Douglas,

0:16

and we're continuing our discussion

0:18

on personhood. That's right. Last

0:21

week we covered the foundation of what

0:23

personhood is, and this

0:25

week we are going to continue the conversation

0:28

and you'll hear a bit more about

0:30

personhood as it relates to animals

0:32

and robots. All right, So we have talked

0:34

about personhood in in this

0:37

idea of our past selves, our future selves.

0:40

Um, personhood as as a

0:42

corporate metaphysical entity. And now

0:45

animals in personhood. Oh yeah, yeah, I

0:47

guess we're kind of taking a step backwards because

0:49

they were here before us, but we don't talk about them

0:51

until we're done with corporations. Sorry, but the

0:54

corporations have more money, uh

0:56

than do the dolphins. Um.

0:58

The dolphins are certainly one

1:01

of the animals you hear about the most when it comes

1:03

to personhood and the great apes. Um,

1:06

you know, but it's the idea that you have, you

1:09

have creatures that they

1:12

are not really all that useful to us. Yes, we

1:14

can eat gorillas and dolphins, but they're

1:17

not that useful. Um,

1:19

and uh, and in that sense, they are

1:22

beyond use by humans,

1:24

although great apes

1:27

are sometimes used as bush meat, and Jane

1:29

Goodall is certainly someone who is an

1:31

advocate of personhood

1:33

for great apes, for instance, so that they,

1:36

you know, don't become extinct, so that they are not hunted

1:38

for bush meat. Yeah. Well, like I'm

1:41

gonna throw a quick quote here. Um, this is

1:43

where I got that "beyond use" bit,

1:45

And this is from

1:47

Thomas I. White, who is

1:50

uh, he has the Conrad N. Hilton

1:52

Chair of Business Ethics at Loyola

1:55

Marymount University in Los Angeles,

1:57

and he says, quote, the evidence for cognitive

2:00

and affective sophistication, currently

2:02

most strongly documented in dolphins, supports

2:05

the claim that these cetaceans

2:07

are non human persons, said

2:10

White. As a result, cetaceans

2:12

should be seen as beyond use

2:14

by humans and have moral standing

2:16

as individuals. It is therefore ethically

2:19

indefensible to kill, injure, or keep

2:21

these beings captive for human purposes.

2:24

So that's that's what White had to say about the matter,

2:26

pointing to the the level

2:29

at which the individual's mind is

2:31

operating and the the

2:33

limited extent to which they're useful to humans.

2:36

Yeah, and I think non human persons is the key

2:38

wording here, because he's not saying like, let's

2:41

give them the right to vote or sit on juries, but

2:43

he is saying that they are exhibiting, you

2:45

know, consciousness, They are exhibiting

2:48

um, you know, language ability, higher

2:50

cognition like abstract thinking, um,

2:54

both for great apes and for dolphins.

2:57

So he's saying that in these in this case, we

2:59

really need to try to figure out

3:01

what self awareness means on

3:03

this level. Yeah.

3:07

You see the mirror tests thrown out a lot

3:09

of times in dealing with

3:11

with with personhood with animals,

3:14

and this is simply if shown a mirror,

3:16

does the animal recognize itself

3:18

in the mirror? So you often see

3:20

these used with great apes, but it should

3:22

be mentioned that other animals can pull it off as well,

3:25

like the magpie, a little bird can

3:27

do it. Uh. And also worth

3:29

noting that human beings with face blindness,

3:31

uh, they may not be able to

3:33

do it. So it's not...

3:36

it's not such a simple test

3:38

that it's guaranteed to actually be

3:41

an effective gauge of personhood.

3:43

But but yeah,

3:46

you get into these these questions to what extent

3:48

are some, if not

3:51

many animals UM

3:54

worthy of a personhood status, Like is

3:56

a beloved pet? Just,

3:59

I mean, it doesn't quite have personhood status,

4:01

But a beloved pet has more rights

4:03

in this country than say,

4:06

like a wild coyote or

4:09

a protected species has more

4:11

rights than a non-protected species. So

4:13

I mean, there are varying levels of protection

4:15

in play. Like, if you kill somebody's pet, you're

4:18

in more trouble than if you kill obviously

4:21

a cockroach. Well, there

4:23

was something that I read, and I wish that I could

4:25

remember the person um and attribute this to

4:28

that person. But what they were basically saying is

4:30

that we are at a point in science now that

4:32

we understand much better the

4:34

way that animals think, they socialize,

4:37

uh, their ability to feel pain, to express

4:39

themselves, um, and animals

4:41

like dolphins and great apes, uh

4:44

that there's a sentient quality

4:47

to them. And we're not saying this is

4:49

apples to apples. They're not humans, obviously, but

4:51

he is saying that we're at a point where we should probably

4:53

step back in knowing what we know, what

4:55

science has has borne out in terms

4:58

of their intelligence, we should

5:00

begin to redefine how we

5:03

treat them or how we look at them, um,

5:05

and what personhood might mean in that situation,

5:09

Like, in a certain sense, personhood,

5:13

as far as personhood and people

5:15

goes, it boils down to this realization

5:18

that oh, it's not just me versus

5:20

the rest of the world. There are other people

5:23

that are on my side. And you eventually reach the point

5:25

where we realize, well, actually, most

5:27

if not all, of humanity is on my side.

5:29

We are all people, we all

5:31

have personhood status. And then extending

5:34

that out, well, are there other things in

5:37

the world that, um are worthy

5:39

of my consideration as a person

5:42

or just, well, just remove the "person" from

5:44

that, and it's just worthy of my consideration.

5:47

So the dolphins, should I maybe

5:49

think about it in a way that's more meaningful than

5:52

as that bump that my jet

5:54

ski just hit, you know?

5:56

Or should I should I think of of this

5:58

gorilla as something more than

6:01

just my my next meal in

6:03

the bush? Well? And then this is the problem with the

6:05

way that we view animals. And one of the books

6:08

that I'm rereading this

6:10

summer is Some We Love, Some We Hate,

6:12

Some We Eat, and, uh,

6:14

he talks about the author talks about this

6:17

idea that we really are sort of butt over

6:19

teakettle when it comes to this. We can't really think

6:21

straight. So you know, why would we look

6:23

into the eyes of dogs? Some cultures look into

6:25

the eyes of a dog and see that dopey stare and just

6:27

fall in love. And and now that

6:29

we know about oxytocin and this exchange

6:32

of this feel good hormone between these two

6:34

species, um,

6:36

you know, and then you know, elevate

6:38

this relationship we have with this dog

6:41

in this culture, Why would we do that and

6:43

not necessarily with another creature? So,

6:46

yeah, it messes you up. Like, I had to telework with

6:48

a kitten in that house yesterday and you

6:50

know, you keep good. You're like, maybe I should

6:52

go look at it again, and I'm like, I'm getting kind of stressed

6:54

about this. Articles you probably hold that kitten and then

6:57

you look at it and then toys with you like say

7:00

you could you could drop a bank with one. Well, this is this is

7:02

a kitten that you guys just rescued, and so you know

7:04

you're tapping into your empathy.

7:06

There's good stuff. On that note, We're

7:08

going to take a quick break, but we'll be right back

7:10

in just one minute,

7:16

and we're back. Speaking of empathy,

7:19

Uh though, uh, the final

7:21

level of personhood that we're

7:23

gonna talk about today: at what point

7:27

do we consider an artificial intelligence

7:30

at least an artificial

7:32

person. Okay, okay, So now they

7:34

can create art, as we've talked

7:37

about, and perhaps even create new

7:39

thoughts. Okay. Um,

7:41

And there's this idea that has always

7:44

been circulating that eventually we're

7:46

going to program machines and robots

7:48

to the extent that they are sentient, that they

7:50

do feel pain because they're programmed that way for

7:53

whatever reason, perhaps

7:55

for their I mean, yeah, you get into a lot of

7:57

weird area when you talk about this, because we

8:00

we mentioned in the other podcast the

8:02

Artists Young Robot, we mentioned the creativity

8:05

machine. Um, if you unplug

8:07

the creativity machine, it doesn't

8:09

die, it doesn't feel pain, and you don't go

8:11

to jail. All

8:14

of those things do not apply when you unplug

8:17

a person in a hospital. And

8:21

and yeah, so you get to the point like, at what point

8:24

have we given a

8:26

program enough of

8:28

a semblance of consciousness

8:31

and a semblance of of self

8:33

awareness that it deserves

8:36

rights? Like, if you did create,

8:39

say you did create this program, or

8:41

say there were a virtual person in a virtual

8:43

world that feels pain, then

8:46

you know and for whatever sadistic reasons

8:48

you created that program aside, then if

8:50

you go in and you administer pain to that AI,

8:53

are you really doing something horrible,

8:55

something morally reprehensible and should that

8:58

virtual person be protected by law?

9:00

Well, especially when you look at the creativity

9:03

machine, which Steven Fuller is saying

9:05

that it has the ability to learn, it has the ability

9:07

to produce new thoughts. So you

9:10

know, the extrapolation is that somewhere

9:13

down the road, this becomes

9:15

its own thing, right? It maybe

9:18

even has its own consciousness if

9:20

we can ever really figure out what consciousness

9:22

is. But we're trying to do that, right.

9:24

We've talked about the Blue Brain Project,

9:27

which is this reverse engineering of a

9:29

hundred billion neurons in our own brains

9:31

and trying to figure out what consciousness

9:33

is, uh and how that's happening.

9:37

And then we've talked about this idea

9:39

of virtual immortality, where

9:42

someday we're going to just download a

9:44

bunch of code from our brain that is representing

9:47

who we are who we think we are, this personhood,

9:50

and we're probably going to throw it into a machine

9:52

or to a robot, and then at

9:54

what point, you know, that's probably when we get really

9:57

concerned about this idea of of robots

9:59

or AI having personhood, because

10:02

now this thing is containing

10:04

us or this idea. It's

10:06

one thing to wonder if a gorilla has

10:09

personhood. But if you know that,

10:11

if you see that gorilla wearing your T-shirt and

10:13

reading your favorite novel, then you're like, whoa,

10:16

that gorilla kind of looks like me. Now I

10:18

feel really bad about eating it. And

10:21

likewise, yeah, to

10:23

whatever extent we create the AI in

10:25

our own image, if we actually attribute to it some

10:27

of our individual characteristics,

10:30

then to what extent is it

10:32

a person? Or the characteristics of

10:34

other beings, real or

10:36

unreal? And there's this idea too that it survives

10:39

us and then it makes independent decisions

10:41

based on our own thought processes. So

10:44

in the future do we say, like,

10:46

somehow I want to legislate that my

10:49

future me that is now being carried

10:52

out through this AI

10:55

is somehow still influencing

10:57

the world. Uh, you know,

10:59

especially if I have some sort of

11:01

corporate ties. It's an

11:03

interesting idea. Yeah. And

11:06

the one thing that I also wondered about too, and this

11:08

is you get into the idea of first order desires

11:10

and second order desires. Like a first

11:12

order desire is something

11:15

like I want to eat a candy bar, or me

11:17

want candy bar, or even I can has

11:19

candy bar. These are all examples of

11:22

first order desires, and

11:24

it doesn't take much to have them.

11:26

You know, insect wants food, insect

11:29

wants candy bar, but it doesn't mean that it is

11:31

a person. Uh. But a

11:33

second order desire is the realization

11:36

that you wish you didn't want

11:38

a candy bar. Okay, So

11:40

this is this idea that you're thinking about your future self

11:43

and you're saying, I wish I didn't want that, because I

11:45

really don't. I really

11:47

will regret having eaten

11:49

that candy bar, or

11:52

even having both of those ideas in your mind

11:54

simultaneously to be like I

11:56

hate candy bars, I love candy bars. That

11:59

kind of cognitive dissonance, yes,

12:01

Or like an Odysseus contract

12:03

where one you know where, where Odysseus

12:06

straps himself to the mast

12:08

of the ship so that he won't be um

12:11

won over by the Sirens' song. He

12:13

knows that future Odysseus um

12:16

is a horndog and will just be dragged to his

12:18

death by the Sirens. So current

12:21

Odysseus straps himself to the

12:24

mast. Yeah. So I'm

12:26

thinking back now to the other podcast that we just

12:28

recorded about robots

12:30

and art and the one part about

12:33

how they're teaching Macbeth to the

12:35

machines in order to get

12:37

these nuances and understand problem

12:40

solving. So why not just run the

12:42

same code, um,

12:44

you know, this Odysseus pact, through

12:47

machines. Do you think that by virtue

12:49

of carrying out the code that it becomes

12:52

the thing, that it becomes real, that that

12:54

understanding of

12:56

nuance and cognitive dissonance

12:59

could manifest itself in the machine? No,

13:03

yeah, it's an open question. I

13:05

mean, certainly it's the domain of a lot of scientific

13:08

work, you know, not scientific, science fictional

13:10

work. Rather, um, you know, in

13:12

the future, will ais have rights?

13:15

Will they have will they be liable

13:17

for this that or the other? Um?

13:19

Richard K. Morgan's Altered

13:21

Carbon novel. In that one, I

13:24

remember that they had AIs, and

13:26

he specifically mentioned that AIs had

13:28

the same level of personhood as corporations,

13:31

which would make sense, because

13:34

an AI is immortal, it doesn't

13:36

it doesn't necessarily have a body um

13:40

and uh and and it has

13:42

but it does have interests in the world, and uh,

13:45

and it could conceivably need to be protected

13:47

from other entities. Well, and

13:49

I should mention too that again, this personhood,

13:51

this idea, especially in the legal terms, is

13:54

something that can be interpreted many

13:56

different ways and has been by the law. So

13:59

sometimes corporations have been more personhood-y

14:02

than others,

14:04

particularly right now, So when we talk

14:06

about AI, we know that there probably would be a particular

14:08

set um that will be unique

14:11

to AI personhood if that in fact

14:13

comes um into the

14:15

discussion. How about this, how

14:17

about personhood for fictional

14:20

characters? Do you think

14:23

could you have personhood status

14:25

for, say, Sherlock Holmes? I

14:29

can see a number of

14:33

corporations or lawyers who might really

14:35

like that because it's sort of an extension of trademarking.

14:37

Right, yeah. But, like, if you reach

14:40

the point where not only is it

14:42

a brand, and you see a certain amount of this, because

14:44

I remember, Um, I remember

14:46

reading about how there was some parody

14:49

movie that ended up using

14:51

Darth Vader and they had

14:53

full cooperation from Lucas uh.

14:56

And so they had a guy in a Darth Vader

14:58

suit on set from Lucas, it's true, and

15:00

they also had an adviser there that was

15:02

there to shoot down any ideas regarding

15:05

things that Darth Vader would not say or do.

15:08

Um and UM, I think it was

15:10

like Night at the Museum 2 or something. I didn't see it,

15:12

but but uh,

15:15

but I kind of love that idea

15:17

because it kind of hints

15:19

at this idea that a fictional character is something

15:21

that exists outside of human

15:24

influence. Like the character is

15:26

this thing that has certain rights and

15:28

it shouldn't be stepped upon. So

15:32

I'm... this is far-fetched and

15:34

just sort of an open question just to get

15:36

everyone thinking. But yeah, what if you did reach a point

15:39

where... and certainly you could do it through AI as well.

15:41

So, like, what if we took everything that we knew about

15:43

Sherlock Holmes from the work of Sir Arthur Conan

15:45

Doyle and some of the more um

15:48

uh, canon and accepted

15:51

visions of Sherlock Holmes in cinema

15:53

and what have you, and we use that to

15:55

generate an AI that then was

15:58

the spokesman and defender of Sherlock

16:00

Holmes' personhood. That

16:03

would be yeah,

16:06

and then what have you? What if Sherlock Holmes the

16:08

AI slash fictional character

16:10

personhood thing then

16:13

ran for public office and

16:18

... I'd do an impression, but I can't. Yeah, it's

16:22

kind of fun. You've gotta... there's

16:24

a lot to flesh out in that idea there. So

16:28

anyway, those are just some open questions there at the end

16:30

to get everyone thinking a little more about personhood.

16:33

Uh. It's it's a fascinating, elusive,

16:37

deeply philosophical question. Uh.

16:40

that is not going to be going away anytime soon. Uh,

16:43

and thank you too to everybody who wrote in about

16:45

it, especially Kelly, um, who gave us

16:47

some really good things to think about. Um.

16:49

I should mention just real quick, if

16:51

you do a quick search for personhood

16:53

corporations on the HowStuffWorks website,

16:56

you'll find an article from Josh Clark of

16:59

Stuff You Should Know; he did an article

17:01

on that that is pretty cool. And, uh, I

17:03

also found a Scientific

17:05

American blog entry by Eric

17:07

Michael Johnson, really interesting. It's called Nonhuman

17:10

Personhood Rights and Wrongs,

17:12

and a lot of that focuses in on great

17:14

apes, because that's Johnson's area

17:16

of study, but he gets into issues of AI

17:19

as well, and it's a fascinating read.

17:21

All right, do you have time to bring over

17:24

the robot? Yeah? I think we have time

17:26

for a quick one. The

17:29

robot here does not have personhood status.

17:32

He has applied for it. We are reviewing

17:34

it. All right. Here's a little bit of listener mail from

17:36

Sarah. Sarah writes in and says, dear Robert

17:38

and Julie, after hearing my fellow

17:40

otaku anime junkies tell you

17:43

about Dennou Coil. I

17:45

don't know if we ever got the pronunciation on that right. Yeah,

17:48

um, I thought I would tell you about

17:50

the anime, um, Planetes,

17:53

which is spelled P-L-A-N-

17:56

E-T-E-S, which

17:58

is a sci fi anime about astronauts

18:01

whose job it is to pick up space debris

18:03

in order to make space safer for flight.

18:06

There are stringent laws about leaving anything out

18:08

there, and the show deals with the dangers of space,

18:10

including space madness, as

18:13

well as a little romance. Uh, got

18:15

to go, but definitely check it out. Sarah.

18:17

So, I looked this up. I haven't watched it,

18:21

but it does look really interesting. It's like just the

18:23

idea that this is, like, a TV series

18:25

that's essentially about

18:28

space junk, you know, and about

18:30

dealing with space junk, and you know, in the various

18:32

important narrative levels. Well that's good, right,

18:34

I mean, that just gives me hope that maybe

18:37

it's starting to get into the mainstream

18:39

somewhat. I don't know if you could say the mainstream, but at

18:41

least it's getting into the conversation.

18:43

Yeah. And it's it's an interesting reminder too

18:45

about how rich anime can

18:48

be, because, I mean, I've seen some

18:50

of the iconic anime flicks, you

18:52

know, like, you know, I've seen Akira, you

18:55

know, I've seen, um,

18:58

and certainly I've watched, like, the Miyazaki films

19:00

and, uh... what's

19:03

the other one really... Ghost in the Shell?

19:05

I've seen so some of those. But

19:07

there's a whole lot of anime that I just don't

19:09

even know what's going on with. And it's easy

19:11

to fall into the trap of just thinking of anime

19:14

as, you know, whatever

19:16

the most uh visible

19:19

thing happens to be at the time, you know. But

19:22

but there's really a lot of diverse and,

19:25

you know, a lot of times fantastic storytelling

19:27

going on there that's certainly worth looking

19:29

at. So I think it's really awesome that somebody

19:31

out there has managed to

19:33

produce a show that is based

19:35

more or less on the skeleton

19:38

of a space junk uh

19:40

theme. Um.

19:42

And then we also heard from our listener

19:45

Murphy. Murphy writes in and says, hi, Robert and Julie.

19:47

My name is Murphy, hailing from Hawaii. But

19:50

that is irrelevant. I just finished listening

19:52

to your podcast about the Higgs particle. Pretty

19:54

cool. Remember that joke about the Higgs particle.

19:56

I've electronically arrived to point out

19:58

that this is the only humorous

20:01

Higgs joke because of its clever and commonly

20:03

overlooked double entendre. Not only

20:05

can you not have mass like church mass

20:08

without the Higgs particle, because the Higgs

20:10

particle gives stuff mass like stuff

20:12

mass, but also because you cannot

20:14

have church mass without

20:16

the God particle. I've always

20:18

been looking for some recent topic

20:20

to chip in about. I listen

20:23

to your podcast retroactively, so

20:25

usually my two cents are

20:27

a couple of years overdue.

20:29

I love, love, love the podcast. I still have a

20:31

few hundred more to listen to, so I have to go

20:34

Aloha, Murphy. Murphy.

20:37

Yeah. I keep forgetting that we've done that many

20:40

but yeah, apparently we have. Two

20:42

a week. Yeah, and um, I've

20:44

got time for one more, uh listener mail here,

20:47

uh, listener Matt writes in and says,

20:49

hey, guys, I'm slowly working my way through your older

20:52

March, April, May 2012 podcasts,

20:54

and you're talking about space exploration, tech,

20:56

living in a micro-g environment, and the tech of

20:58

Prometheus. In regards to these podcasts,

21:01

you have to check out the movie Pandorum. It goes

21:03

further into the detrimental mental

21:05

problems long term space

21:07

travel can have on the human species. I don't

21:09

think I'm giving away too much of a spoiler when

21:11

I tell you that three of the finer points in the movie

21:14

take a look at psychotic episodes,

21:16

cabin ship fever, memory loss

21:19

due to extended hyper sleep, and biological

21:21

evolution brought on by trying to adapt to

21:23

a spaceship environment. While it didn't

21:25

play out very well at the box office, it is certainly

21:27

worthwhile to DVR on

21:29

Sci Fi or Netflix if you don't want the commercials.

21:32

I also wrote a long time ago

21:34

about thinking as a kid that mirrors

21:36

were not just a reflective surface, but a

21:38

window into the nearest adjacent multiverse.

21:41

My reflection was really

21:43

just another me who happened to check out the

21:45

mirror at the same time, and that

21:47

at any time I might turn

21:50

a corner and run into my doppelganger. Well,

21:52

I finally did meet him. His name is

21:54

Nega-Matt, and he's a pretty cool dude.

21:56

We exchanged numbers and are scheduled to meet for brunch

21:59

next Saturday. Thank you again for the great

22:01

show, Matt. Well, it was a

22:03

fun email. Certainly the mirror thing.

22:06

It's that universe is just like ours, except

22:08

they spell everything backwards and

22:10

um and certainly Prometheus

22:13

uh, not Prometheus, Pandorum,

22:15

uh, is a film that I know

22:17

people who just seemed to hate, but

22:19

I really enjoyed Uh Pandorum.

22:21

I thought it was it was a really fun

22:23

kind of space horror, like a German-

22:26

US co-production, I believe, uh,

22:28

some good performances in it. A lot of money went

22:30

into the the set design and the creatures,

22:33

and there are a number of cool little scientific ideas

22:35

going on, you know, and including the three

22:37

that Matt mentions here. So, but do they

22:39

have the space punis? No,

22:41

no space punis. But

22:43

but but it does involve a scenario. Okay,

22:46

what if what if you have colonists on a seed ship

22:48

and uh, and you have genetically

22:51

engineered them to rapidly advance,

22:53

rapidly adapt to a new environment so

22:56

that when they land on an exoplanet um

22:59

they can get out, they can rapidly

23:01

adapt to this environment. But then what if that

23:03

ship doesn't make it to

23:05

its destination and what if

23:07

the crew members come out and then they rapidly

23:09

adapt to the environment

23:11

of the ship. What would that look like?

23:14

So that's one of the fun ideas that this film

23:16

has. I mean, you know, don't go into

23:18

it expecting 2001. But as

23:20

we've discussed in the past, if you go into any movie expecting

23:23

2001 and it isn't 2001, you're

23:25

gonna be a bit disappointed.

23:27

so thank you guys for all that wonderful listener

23:29

mail. We get a lot of it. I wish we had time to

23:32

personally respond to all of it and

23:35

or read all of it, but uh,

23:37

it's just hard, and sometimes I mean

23:40

to read something and then I print it out and then I

23:42

forget it. So some of it is user

23:44

error on my part. Yeah.

23:46

So, uh, let us know what

23:49

you think about personhood. Are

23:51

corporations people? Are plants,

23:53

are plants people? Are

23:55

animals people? Are dolphins people?

23:57

Are artificial intelligences

23:59

people? Um. How about that guy

24:01

that you just shot in that video game? What

24:04

about that blade of grass that just got cut

24:06

and released green leaf volatiles, screaming

24:08

for help? Yeah, all these

24:10

questions on the board, so let us know what

24:13

you think. You can find us on

24:15

Facebook, where we are Stuff to Blow Your Mind, and you can

24:17

find us on Twitter where our handle is blow the Mind,

24:19

and you can always email us at blow the

24:22

Mind at discovery dot com

24:29

For more on this and thousands of other topics,

24:32

visit HowStuffWorks.com.
