S1 Episode 2: The Future of AI and Ethics with Minter Dial

Released Monday, 3rd June 2019

Episode Transcript

0:03

Welcome to the practical futurist podcast,

0:06

a show all about the near term future with

0:08

practical tips and tricks from a range

0:10

of global experts. I'm

0:13

your host Andrew Grill. You'll

0:16

find every episode full of practical ideas

0:18

and answers to the question, what's the

0:20

future of, with

0:22

voices and opinions that need to be

0:24

heard, but beware,

0:28

I'm no ordinary futurist and my

0:30

guests will give you things you can use in your business

0:32

next week, not next year. So

0:35

let's jump into it.

0:37

In this episode. What's the future of

0:39

AI and ethics? I

0:42

launched this podcast a few weeks ago and the feedback's

0:44

been extremely positive. Thanks everyone

0:46

for listening or welcome if it's your first time here.

0:49

Before I introduce my next guest, I wanted

0:51

to outline my own point of view about this

0:53

very interesting area of empathy and

0:55

ethics when it comes to artificial intelligence.

0:59

With the rise of artificial intelligence across

1:01

all industries, commentators

1:03

and business leaders are now questioning the ethics

1:05

around these AI systems. While

1:07

existing AI systems are a long way

1:09

from being able to simulate human behavior

1:12

or general AI, as it's being called. Many

1:14

are worried about how we will program these machines

1:16

to work for us instead of

1:18

against us. In almost every one of my

1:20

keynotes, I'm asked about AI: specifically,

1:24

will we lose our jobs and can we trust

1:26

these systems?

1:28

In each case, I explain that AI systems need to be

1:30

trained by humans initially and

1:32

how we train these systems will direct how

1:34

empathetic they might be.

1:36

At the end of the day, I do believe that a machine will

1:38

be able to perceive other

1:40

human beings, well, sometimes

1:44

better than us.

1:46

That's the voice of today's guest, long-term friend

1:48

of mine, Minter Dial who has just written

1:50

a new book called "Heartificial

1:52

Empathy" where he tackles this very topic. He

1:54

argues that as humans we need to become more

1:57

empathetic before we can hope to train these

1:59

new AI systems and that empathy is

2:01

the superglue for high performing teams.

2:04

So who is coding our AI and do they

2:06

have real empathy and ethics in their approach?

2:08

We also need to have more empathy to be better

2:10

managers and learn to listen better.

2:13

How can we create empathy in machines?

2:16

Minter argues that empathy and ethics

2:18

are linked. Welcome

2:25

to the practical futurist podcast, episode

2:27

number two, where I'm joined by bestselling

2:29

author, storyteller, filmmaker,

2:31

blogger, keynote speaker, brand

2:34

strategist, podcaster, and also my friend

2:36

Minter Dial. Minter, welcome.

2:39

Andrew. Thank you so much for having me on the show.

2:41

Heartificial Empathy, putting heart into business

2:43

and artificial intelligence, amazing title,

2:45

amazing book. The book is an in-depth

2:47

look at empathy, how it's created, why

2:50

and how to increase empathy in people, organizations

2:53

and machines, and the flaws to be avoided.

2:55

What drove you to write it?

2:57

Let's say the topical

3:00

answer is that I think empathy has

3:03

long been an interesting topic for

3:06

business. It's not something that we've

3:08

regularly talked about. It's certainly not something that you practice

3:10

or teach in business schools and

3:13

yet it's, it's fundamental

3:16

to so many parts of the business starting

3:18

with the way we manage our

3:21

people. It's just startling

3:23

how the idea that you put on a

3:25

tie, you go to the office and you treat people

3:28

differently. You don't have time to listen -

3:30

rush, rush, got to get everything done. And

3:33

by doing everything so fast, we forget to

3:35

listen. We forget to understand that people

3:37

have personal motivations and personal

3:39

issues. My

3:42

experience said that, well, being empathic

3:44

can be a really magical skill.

3:46

Let's say a topic that I really felt I wanted

3:49

to put on the page and

3:51

make it not just a touchy feely thing

3:53

or a soft skill. Something that really actually

3:56

materially will change the course

3:58

of your business if you learn the power of empathy.

4:01

So that was, let's say, the topical answer. The

4:03

underlying answer actually

4:05

was that I started

4:08

to look in the mirror about how empathic I truly

4:10

was and said, well, can I, could

4:12

I do more? Can I be better at

4:14

being empathic? And then the

4:16

irony of the story is that once I really

4:18

learned about it, I did understand that I wasn't always

4:20

being empathic,

4:21

You didn't like yourself ...

4:22

Well, I certainly recognise that I could be more empathic.

4:25

And now that I've written a book, the challenge

4:27

is holding myself up to that standard.

4:30

Read your book, Minter! So, do you think artificial empathy is an oxymoron? How can we create empathy in a machine?

4:38

Right? So there's an oxymoronic

4:41

element to it. But the

4:44

reality is empathy is

4:46

about perception and

4:48

machines are increasingly

4:52

tremendously capable of

4:54

perceiving. So whether

4:56

it's vocal, visual,

5:00

aural, we can now perceive

5:02

emotions. We can perceive what's

5:04

happening a lot better. And so

5:06

at the end of the day, I do believe that a machine

5:08

will be able to perceive other

5:10

human beings, well, sometimes

5:14

better than us. You take

5:16

the case of a doctor and their ability to

5:18

detect depression in an

5:20

individual. As individuals, we tend to

5:22

sort of sometimes say what the other

5:24

person wants to hear. And the same thing actually

5:27

happens with doctors. I mean doctors have their own

5:30

filters and they're not quite as able to

5:32

pick up the signs of depression. For example, take

5:34

the question: does a

5:37

depressed person laugh? And the

5:39

answer is yes. And so

5:41

you might miscue laughter

5:43

with happiness, but it turns out

5:45

that when a depressed person

5:48

laughs, the length of their laughter is

5:51

shorter. If you can cue a machine to

5:53

detect what's the difference between, you know,

5:55

a hearty laugh and a depressed

5:58

laugh, that kind of sensitivity is

6:00

something that machines do. That's not empathy

6:03

per se, but it does show

6:05

the detection. This ability to perceive

6:09

and where people get confused

6:11

is that you don't need to show the

6:13

empathy per se. At one level, it's really just

6:16

about understanding the other person's context, at least in

6:18

a cognitive manner.

6:19

Almost having ... the machine could

6:21

say this person is less happy than normal,

6:23

so the human then goes, okay, I've got to treat

6:25

them differently. I wasn't aware. They look less happy than

6:28

normal. So that's an aid to then have

6:30

me as a human turn on more empathy.
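
As a toy illustration of the laugh-length cue just described, here is a minimal Python sketch, with invented function names, thresholds, and data; it is not from Minter's book or any real system. It only measures laughter duration against a baseline and surfaces a flag, leaving the empathic response to the human.

from statistics import mean

def flag_short_laughter(laugh_durations_s: list[float],
                        baseline_s: float = 2.0,
                        ratio: float = 0.6) -> bool:
    """Flag when someone's laughs run markedly shorter than their baseline.

    laugh_durations_s: measured lengths of laughter episodes, in seconds.
    baseline_s: this person's (or a population's) typical laugh length.
    ratio: how much shorter than baseline counts as a cue (made up here).
    """
    if not laugh_durations_s:
        return False  # no laughter observed, so no signal either way
    return mean(laugh_durations_s) < ratio * baseline_s

# The machine surfaces a cue; a human chooses what to do with it.
if flag_short_laughter([0.8, 1.1, 0.9]):
    print("Cue: laughter shorter than usual - consider checking in.")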

6:32

Well, it is about helping, prompting

6:34

the human to act. I

6:37

tend to circumscribe

6:41

the idea of empathy to the perception

6:43

component. There are two elements. Let's

6:45

say two definitions broadly speaking of empathy.

6:47

One is feeling, affective

6:49

empathy and the other one is cognitive empathy, thinking

6:52

empathy if you will. And the the feeling

6:54

one is not something that I think machines

6:57

are going to get where if you start crying I

6:59

cry, you know, or you

7:01

know, I feel your sadness. That is

7:03

not the domain of machines,

7:06

but in the cognitive space, the ability to say,

7:08

Andrew, you look sad

7:11

or at least to perceive your sadness, that

7:13

the machine is able to do. Then the

7:15

question is what are you going to do with it? And

7:18

that totally depends on the context

7:20

of Andrew because Andrew may not

7:22

be looking for sympathy. He may just be sad

7:24

because his team lost and that's

7:27

it. And, well, inshallah,

7:29

or he might be sad for another reason,

7:32

but he knows the solution. So

7:34

he's just not looking for me to give him advice. He

7:37

just wants to have somebody

7:39

listen to him. Yeah, yeah. Yesterday I was doing a keynote

7:42

and I hung around for lunch and before

7:44

lunch there were these stations where people were talking about things, and I was almost

7:46

opposite the wellness station. So I heard the

7:48

same pitch three or four times. And what was fascinating,

7:50

the ladies were saying, someone comes into work

7:52

and you ask them how their weekend was,

7:54

or how they are, and they're conditioned:

7:57

"Oh, fine." She said, are you

7:59

ready for when someone says, no, I actually had a really

8:01

bad weekend. What do you do?

8:03

I don't think we're trained in a work environment to do

8:05

that. Oh, that's a bit uncomfortable. Um,

8:07

Minter's had a bad day or Andrew's had a bad day, what

8:10

do we do? And then we just say, oh, it'll

8:12

be okay. And sometimes, especially

8:14

in a work environment, you just want someone to listen. That's

8:17

right. I think actually in all environments

8:19

we could do with a lot better listening

8:22

skills. I mean, the reality is

8:24

that we all have 24 hours and

8:26

there is this perception that time has

8:28

accelerated and yet it hasn't.

8:31

There are different ways to be more efficient.

8:33

We can do so many more things; we have digital tools,

8:36

but in the advancement

8:38

of our technologies, we've kind of lost

8:40

our ability to sit and listen to ourselves:

8:44

our heartbeat,

8:46

the breathing of our lungs

8:48

and to listen to other people. And so

8:50

the first part is actually listening to

8:52

yourself, self-empathy, a self-

8:55

awareness, and the other one is around the

8:57

important people around you, whether it's

8:59

at work or at home, the ability

9:02

just to say, hey,

9:04

you want to talk? Let's just go in

9:06

and be quiet. When you know

9:08

that every minute is a dollar,

9:11

then we tend to equate that with

9:13

productivity and that just flushes out and pushes out

9:15

any desire to listen.

9:18

I want to talk now about ethics because when we talk

9:20

about AI I get asked all the time about

9:22

the ethics and I have a set of standard

9:24

responses that I give, but I'm keen to learn more.

9:27

You know, what should companies do when they're thinking about

9:29

the ethics of AI?

9:31

Well, so empathy and ethics,

9:33

it turns out, are extremely linked. And

9:37

if you want to encode

9:39

AI with empathy, for example,

9:41

you really need to have it yourself. Let's say, if you want

9:43

to look at your ethics, how about

9:45

taking a check on how empathic you are as

9:47

an individual, as a C-suite

9:50

and as an organization? And if that empathy

9:52

is there, then let's say that you're

9:54

in a better state to create an ethical framework

9:56

before you go forward. Afterwards,

9:59

the issue is understanding

10:01

the pressure you have to perform and whether

10:04

you're able to defray that for

10:06

the sake of a stronger ethical

10:08

line. The issue with ethics

10:10

is that it's a very personal story. The

10:12

difference between what is good and what is bad,

10:15

and when you have a large team,

10:17

even a small team for that matter, your

10:20

ability to coalesce

10:22

and to agree ensemble about

10:25

an ethical line can be very deeply

10:27

personal. And so when

10:29

you have empathy, it's going to be easier for

10:31

you to understand each person's zones

10:34

and think also more importantly

10:36

about other people's zones. Because

10:38

when you're a bunch of white men sitting

10:41

around the table, chances are you're going to think

10:43

white man's stuff and, as white men, you know, whatever

10:45

I've had is my privilege. And yet

10:47

we might have a customer base that is deeply

10:49

different in terms of background

10:51

or sex, or gender,

10:54

and so the notion of empathy

10:56

is a key consideration, afterwards,

10:59

in terms of ethics. The

11:01

reality is a lot of the ethical

11:04

conundrums we're going to face, there

11:06

are no laws to understand

11:09

or run by. And so we're

11:11

going to have to be in a constant mode of, you

11:13

know, adapting and rethinking our

11:15

ethical frameworks, which is probably

11:17

why I mentioned the notion of privacy before.

11:20

I think today we can do

11:22

so many things, but it's not because we can,

11:24

that we should do them.

11:26

When we talk about ethics and AI, often

11:29

the notion of conscious bias comes up: that if you have

11:31

to program and train a machine, you're

11:33

going to train it in a certain way. So where does

11:35

conscious bias fit with empathy

11:37

and ethics? And there are no laws

11:40

at the moment, but all the people that are developing AI platforms

11:42

and consumers also are going to start asking

11:45

who programmed my machine?

11:47

Well, let me also just add

11:50

that AI is

11:52

going to be used by criminal

11:54

organizations, by states

11:57

in different ways, as well as by

11:59

companies and cities for that matter.

12:01

So there are many different organizations that are going

12:03

to be using it. Putting

12:06

empathy into the way you

12:09

approach your AI,

12:11

or this bias

12:13

you have is going to help you to

12:15

look at it from other perspectives and

12:17

put yourselves in the shoes of the others. I'm

12:20

hard pressed to say that there's one

12:23

route in order to do this. The

12:25

challenge is, as you each perform,

12:28

you want to get the data sets,

12:30

getting the data set is proprietary; it can

12:32

take a long time. There are no

12:34

shortcuts. You're going to screw up along the way.

12:36

But if you just keep your eye out

12:39

for what you think is doing good

12:41

for society and be vigilant

12:43

about that ongoing, it'll be important.

12:46

An example would be, as you look

12:48

at programming your AI,

12:52

who on the team is doing it? Because

12:54

you can have coders and coders have

12:57

many talents and one particular skill.

13:00

But usually within that there is not a strong

13:03

emotional quotient. So

13:06

make sure you try to compensate

13:08

or complement anyway with

13:10

people who have maybe a stronger humanitarian

13:13

approach, more sociological understanding,

13:15

maybe stronger emotional quotient

13:18

and that might be a good way to make sure you

13:20

have, alongside the lawyer,

13:22

good ethics, a diversity of ethics.

13:25

In the book you make the case for why empathy is not only

13:27

teachable but a requirement for success in

13:29

business and in life. So how can we

13:31

teach empathy?

13:32

Well, so I don't actually believe

13:35

that empathy is teachable per se.

13:38

It must be learned. So

13:41

the key is to create an environment

13:44

where people want to become empathic.

13:46

So the first part of that is making sure that empathy

13:49

is modeled as the behavior up top,

13:52

because it's no good telling

13:54

the rest of your team to be empathic when you are

13:56

being a dick. And

13:59

that means having self-awareness and

14:01

evaluating stuff from the top. Secondly, empathy

14:04

is a great way to be with your customers.

14:06

However, if

14:08

as an organization you're unempathic internally, it's

14:11

quite unlikely that the empathy will continue to

14:13

manifest itself towards the customers. So creating

14:16

the environment means modeling it from the top

14:18

and then there are different ways according to

14:20

the amount of empathy you think you have to foster

14:23

more empathy. One of

14:25

the ways, and I strongly encourage

14:28

you obviously as an author is read great

14:31

novels by reading

14:34

great novels, it's been proven that you

14:36

are going to step into the shoes of other

14:38

people. The character's going to be this crazy

14:40

man or woman, and you're

14:43

going to learn through great writing the

14:45

psychology of that person and that

14:47

gets you into their shoes. So I personally now

14:50

alternate, having every other book I read

14:52

be a novel, which gets me into

14:54

another space. It's giving

14:56

me quiet time as well, but it's also

14:58

making my brain expand into

15:00

other people's worlds.

15:02

I hadn't thought about that, and that's so true that I read

15:04

so many business books and I

15:06

need some escapism, and you're right, you need to get into characters.

15:08

That is probably the best bit of advice I've heard

15:10

all year. Why will empathy be a key competitive advantage? I think you've almost answered that, but for people out there that are not convinced, who say you can't put a dollar figure on having better empathy, why should they bother?

15:13

Right?

15:23

So at the very least customers

15:26

are going to want it. When you

15:28

hear the number of customers complaining about the

15:30

automated this, automated that, and

15:32

the inability to code for the right

15:34

user experience. Empathy is

15:36

knowing how to design. Any great

15:38

designer has strong empathy. But

15:40

if that designer is surrounded by rational,

15:43

hard-nosed, unempathic individuals,

15:45

it's not going to be good. So for customer

15:48

facing components, whether it's customer service,

15:50

design of a product, managing

15:53

the sales experience, being empathic,

15:56

understanding the situation is going to be

15:58

a material benefit to your bottom

16:01

line. After that, and I think, probably

16:03

actually in chronological order before

16:06

that, if having

16:09

good talent is important to you and

16:11

keeping them there with you, I think

16:13

empathy is the superglue. It's

16:16

the thing that's going to help you

16:18

identify when your employees are unhappy.

16:21

It's going to help you to understand

16:23

what their motivations are and then play

16:25

towards those emotions, their

16:28

motivations, and ultimately make a better environment

16:30

where people want to continue to work for you.

16:33

I'm just dreaming about an organization I've

16:35

worked at before, knowing that they would probably send people

16:37

to "empathy school" and they would tick

16:39

a box that they've done their empathy training, and they

16:41

come out and be dicks all over again.

16:43

Well, I mean at the very least they're

16:45

trying, you know, maybe they're also recognizing that they're not,

16:47

and let's say that's the good

16:49

news. But the bottom line is empathy

16:52

is something that has to happen in

16:54

the small details every

16:56

day. And so it

16:58

actually, it's quite tiring because it means

17:00

sometimes taking the time to listen, say, hey,

17:02

how are you doing, Andrew? I'm not doing well.

17:04

Oh wow, yeah, let's stop what we're doing.

17:06

Going to have a coffee or take the day off.

17:09

Then someone, my boss, yells, what are you doing? I'm like,

17:11

well, you know, I'm trying to

17:13

do this. Oh, okay. And

17:15

so in the book I try to tease

17:17

out a few of these typical situations

17:20

that happen in business and how we

17:22

typically do things and how we could do them

17:24

differently.

17:26

I remember saying to one of the team I was managing one

17:28

day,

17:30

"Let's go to the Science Museum." We spent three hours there, and over that time, because we weren't focused on what our work package was that day and we were looking at interesting exhibits, we had a chance to talk, and I think both of us really found that rewarding. Now, if my 1-up manager knew that I'd taken one of my team to the Science Museum, they may have thought, why? But an empathetic manager would go, "What a great idea, I'm going to take my team to the Science Museum too." Do you think we'll start to see the rise of an empathy index, so consumers can see if one company is more empathetic than another?

18:01

Well, so there is an organization in the

18:03

south of England that created,

18:06

a few years ago, an empathy index.

18:09

The challenge with that is really measuring empathy.

18:11

Well, that was another question: can you measure

18:13

it? It's very difficult. CVS in the United

18:15

States, Norman de Greve, who

18:18

was just identified as one of the

18:20

top hundred courageous leaders in the

18:22

United States. They really tried to put empathy

18:25

into their

18:27

customer experience. And what they've done is they've measured it, but the only way they

18:30

measured it is by asking individuals that come into their

18:32

stores, the CVS drugstores:

18:34

Did you feel that

18:37

the salesperson demonstrated empathy

18:39

towards you? And each person has a different

18:41

interpretation of what empathy is, and so on. So it's very hard. That

18:43

was sort of the first level. Then you

18:45

try to surround-sound it with

18:48

different characteristics that would show that

18:51

the person's being empathic. It's hard

18:53

to do. So a) I would

18:56

say, why not have an empathy index? Because at least it puts

18:58

it on the table and challenges what you are trying to do

19:00

behind it and how scientific it is. But I will say

19:02

this: this index that was created

19:04

in 2015

19:07

identified 170 companies

19:09

on the empathy index

19:11

that were publicly traded, and they were able

19:13

to qualify them.

19:16

They used some large number, maybe 50 different

19:18

types of criteria in order to try to establish

19:20

the empathy level of that organization.

19:23

The top 10, versus the bottom 10, outperformed

19:26

on the stock market by two times. So

19:28

that would be some kind of indication.

19:31

So if you're still curious

19:33

or dubious about whether

19:35

empathy can be useful for you, there

19:38

seems to be material proof that will help you on the bottom line.
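
As a rough sketch of how such an index might be computed, the toy Python below scores companies on a handful of criteria, ranks them, and compares the top of the list against the bottom, which is the shape of the "top 10 versus bottom 10" comparison Minter cites. The criteria, weights, and companies are entirely invented; the real index's roughly 50 criteria are not spelled out in this conversation.

def empathy_score(criteria: dict[str, float]) -> float:
    # Simple average of per-criterion scores (0-10 each); a real index
    # would likely weight its ~50 criteria rather than average them.
    return sum(criteria.values()) / len(criteria)

# Hypothetical companies and scores, for illustration only.
companies = {
    "AcmeCo":   {"listening": 7, "cx_design": 8, "retention": 6},
    "BetaLtd":  {"listening": 3, "cx_design": 4, "retention": 5},
    "GammaPLC": {"listening": 9, "cx_design": 7, "retention": 8},
}

ranked = sorted(companies, key=lambda c: empathy_score(companies[c]),
                reverse=True)
top, bottom = ranked[:10], ranked[-10:]  # deciles of a 170-company list
print("Most empathic:", top[0], "| Least empathic:", bottom[-1])
# One would then compare average stock returns of the top group vs the bottom.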

19:41

So great book. What are the top three things

19:43

that you want people to take away from the book?

20:04

Well, so the top three things. The first is think about your own level of empathy and start with that self-awareness. The second is think about how empathy can be a useful thing in our divisive society. And today we have so

20:06

many issues out there in every country,

20:08

different political problems

20:10

and societal problems, immigration,

20:14

employment and so on.

20:16

And I think that empathy is something that could be

20:18

very useful for us, not just in business

20:20

but in society. And the third is,

20:23

as you look towards the idea

20:25

of encoding artificial intelligence,

20:27

where you might consider emotions and

20:29

more specifically empathy, it's

20:32

a great opportunity to reflect on what is empathy,

20:34

what is your definition of empathy? Because at

20:37

the end of the day, you need to know that before

20:39

you start coding it. So it ends

20:41

up being a mirror for who

20:44

you are and what you're trying to achieve.

20:47

And look at that as a good reflective moment because

20:49

ultimately, while you might try to delegate

20:52

empathy, the reality is it has to

20:54

start with you.

20:55

So, as this is the Practical Futurist

20:57

podcast, what can listeners

21:00

do next week? What are three things they can do next week to be more empathetic, or on that journey?

21:05

All right, so the first is try

21:07

to find a stranger and ask

21:09

them a few questions about who

21:11

they are and what their lives are like. It can be a bus driver

21:13

or someone manning the till.

21:16

Second thing is break out a novel.

21:19

Read a good classic novel. Something

21:22

you haven't read; you'll find it, hopefully,

21:24

rather entertaining. And the third

21:26

thing is look inside your business

21:29

and in what you're doing in your business practice

21:31

and see where you can strategically

21:33

try to be more understanding of people

21:36

that are different from you, specifically your

21:38

customers. Try to be in the shoes

21:40

of your customer. For example, call

21:43

your customer service, not with your

21:45

telephone number, but with someone else's telephone number

21:48

so that they can't recognize that you're an employee of

21:50

that company and ask, "Hey, listen,

21:52

I'd like to have a customer service problem solved"

21:55

and see how that experience is. Put yourself

21:57

in the shoes of the customer, legitimately

22:00

walk into a store, or if

22:02

you're in a retail place or order

22:04

from your e-commerce site, do

22:07

something that puts you in the shoes of the

22:09

customer and feel their

22:11

pain,

22:12

But also do it for your competitor to see if they're

22:14

any better than you are, and learn. Amazing discussion. You've done 328 podcasts since November 2010; can we have you back to talk about podcasting?

22:26

Sure.

22:27

Look, thank you so much for your wisdom on all of these topics

22:29

today. Where can people find out more about you and

22:32

your work?

22:33

So my general toy land is

22:35

on my own site, Minterdial.com

22:38

I enjoy trying to do things

22:40

on Twitter as well

22:43

@mdial. My books are at heartificialempathy.com,

22:46

futureproof.ly and

22:48

then I've also done this other book on

22:50

the Second World War; it's

22:52

a personal family story called thelastringhome.com.

22:55

Fantastic. This

22:58

has been the Practical Futurist podcast.

23:00

You can find all of our previous shows

23:03

at futurist.london and if you like what you've heard on the show, please consider

23:05

subscribing via your favorite podcast

23:07

platform. You can also hire me to speak

23:09

at your next management offsite or customer event.

23:12

More details on what I speak about with video replays

23:14

can also be found at futurist.london.

23:17

Until next time, I'm the

23:19

Practical Futurist, Andrew Grill.
