Confronting bias and privacy for better tech tools

Released Friday, 23rd February 2024

Episode Transcript


0:03

Hey, I'm Tom Power. I'm the host of the

0:05

podcast Q with Tom Power, where

0:07

we talk to all kinds of artists, actors,

0:09

writers, musicians, painters. We had Green Day on

0:11

the other day talking about their huge album,

0:13

American Idiot. Nicole Byer came on to talk

0:16

about ADHD and comedy. And then

0:18

there's Dan Levy. While we were talking about filmmaking,

0:20

we talked about his insecurities. I

0:22

sometimes feel like I have this desire

0:24

to perform, to be a

0:26

version of myself that people might like. Listen

0:29

to Q with Tom Power to hear your favorite

0:31

artists as they truly are wherever you get your

0:33

podcasts. The

0:59

number of telehealth apps grew rapidly during the

1:01

pandemic to meet a new dimension of

1:29

the world. There's a huge demand for virtual

1:31

healthcare services. Instead of going into a clinic

1:33

or a doctor's office, access to a physician

1:36

or mental health services looks like downloading an

1:38

app or visiting a website or using

1:40

messaging apps, phone calls and video chats. But

1:43

this convenience may come at the cost

1:45

of data privacy. Many of

1:47

the commercial companies that run virtual care

1:50

platforms collect, share and use information uploaded

1:52

by patients, according to a 2022 report

1:55

on the business practices of commercial

1:57

virtual healthcare services in Canada. Dr.

2:00

Cheryl Spithoff was the project lead. I'm

2:03

a family doctor and a researcher at

2:05

Women's College Hospital and an assistant professor

2:08

at the University of Toronto. So

2:10

we received a grant from the

2:12

Office of the Privacy Commissioner of

2:14

Canada to explore how virtual care

2:16

platforms are handling the data they

2:19

collect through their platforms. So we

2:21

interviewed 18 individuals

2:23

who were affiliated with these platforms,

2:26

largely as employees, some were

2:28

contractors or consultants, and two

2:30

academics. And then

2:32

we asked some questions about how

2:34

the platform gathered data, how they held it,

2:37

what they did with the data, what they

2:39

saw as the benefits of these different data

2:41

uses, and what kind of concerns they had.

2:44

Healthcare data is supposed to be some

2:46

of the most protected, so what's happening

2:48

with these virtual care platforms? As

2:51

far as we can tell from our

2:53

interviews and the study that we've done,

2:55

they're not selling your personal health information

2:57

as they define it. So

3:00

they define this pretty narrowly. It's the

3:02

information that they collect when you speak

3:04

to a nurse practitioner or to a

3:06

physician for your health care issue. So

3:09

the other forms of data that they

3:11

collect, one is sign up and registration

3:13

information, so this might be your name,

3:16

email addresses, phone numbers, things like this,

3:18

when you first register with the platform.

3:21

And according to participants in our

3:23

study, they define this as personal

3:25

information, but not personal health information.

3:28

Same thing with user

3:30

data like IP addresses, device

3:32

identifiers, things like that, and

3:35

then also de-identified health

3:37

information. Personal health

3:39

information was generally only used to

3:41

provide health care, although there are

3:44

a couple exceptions to that, whereas

3:46

the other forms of data that

3:48

they defined as not PHI are

3:50

not personal health information, which is

3:53

arguable. So they can use these data for things like targeted

4:00

advertising, promoting different products and

4:02

services through their platforms, promoting

4:05

third-party products and services. So,

4:07

but they're not selling sort of

4:09

information about medical conditions per se.

4:11

They're selling things like contact information.

4:14

Now, you mentioned that there were

4:16

exceptions. What are those exceptions? So,

4:18

what we did find in

4:21

some situations that they were

4:23

using personal health information to

4:26

promote pharmaceutical products by adjusting

4:28

patient care pathways. So,

4:31

they didn't seem to be providing

4:33

that information to other companies, but

4:35

participants described how pharmaceutical companies paid

4:37

their platform to analyze

4:40

data, adjust patient care like the

4:42

timing of visits, timing of lab

4:44

tests, frequency of reminders, all

4:47

with the goal of increasing uptake of

4:49

a drug or a vaccine. And

4:51

then continually running analyses and seeing

4:53

if this made a difference and

4:55

trying to optimize that for the

4:57

pharmaceutical company partner. So, they

4:59

weren't sharing the data with the

5:02

company, but they were using the

5:04

data to essentially promote a product

5:06

by adjusting patient care, affecting

5:09

clinical decision making. How,

5:11

that sounds like something that you ought not

5:13

to be able to do. Is that legal?

5:16

Good question. It's not entirely clear to

5:18

our team whether this is legal or

5:20

not. In the US,

5:22

there was an example a couple of

5:24

years back when the electronic

5:26

medical record company there took money from

5:28

a pharmaceutical company,

5:31

Purdue Pharma, and

5:33

they put prompts in the

5:35

EMRs there to encourage physicians

5:37

to prescribe more long-acting opioids.

5:40

And as you know, Purdue Pharma produced

5:42

a long-acting opioid, OxyContin. And they got

5:44

in trouble for this because they

5:46

were interfering with clinical decision making, and

5:49

were fined, I don't know, $160 million

5:51

or so, and told, no, you

5:53

can't do this. Can you walk me

5:55

through a scenario of just registering for your

5:57

average telehealth app? Right. So...

6:00

there's some that are phone apps and other

6:02

ones that are a website. The majority that

6:04

we came across were website applications. You go

6:06

to the website, there would

6:09

be a button there that would say sign

6:11

up, and then you could put in your

6:13

information in there. And then the next step,

6:15

we didn't do this ourselves, the next step

6:18

generally would be request an

6:20

appointment with a physician or nurse practitioner.

6:22

I see. So once I've granted

6:25

permission for a virtual care platform

6:27

or a telehealth site or app

6:29

to access and collect my data,

6:32

how does it get packaged for sale?

6:35

It doesn't always get packaged for sale.

6:37

But generally what was explained to us

6:40

is there was the registration data,

6:42

and that was stored in one

6:44

database with names, email addresses, phone

6:46

numbers, things like that. And

6:49

that data, some companies would just use

6:51

internally to market to you. There

6:53

were ways that you could opt out of that marketing,

6:55

but it didn't appear to be any ways that you

6:58

could opt out of having your data used in the

7:00

first place to design those marketing campaigns, according

7:03

to participants and from what we read through

7:05

the privacy policies. The other form

7:07

of data was the user data. So a

7:10

lot of companies collected or almost

7:12

every company seemed to collect this

7:14

according to participants was your

7:16

browsing information, so your IP addresses

7:18

and your cookie history, things like

7:20

that. Now, companies didn't

7:24

appear to link that to your identified

7:26

information that they had on you, but

7:28

they would share it with Google, Facebook,

7:30

large analytics companies. And in

7:32

return, you would get information on who was

7:35

visiting their site, not names, but just kind

7:37

of like demographic breakdown. And

7:39

then Google, Facebook and analytics companies

7:42

are able to link that information

7:44

to a uniquely identified user in

7:46

their database, where they have big

7:48

profiles on people, and that's what

7:50

they use for targeted marketing to

7:52

you. That's essentially what Google and

7:54

Facebook are all about, is that's how

7:56

they make their money, is that targeted

7:58

advertising. And what's

8:00

concerning to me is you're sharing, you

8:02

know, information that this person accessed a

8:05

health website with these

8:07

outside companies. Some of these

8:09

virtual care platforms only provide one

8:11

type of service, so some are focused

8:13

on HIV prevention services, some are focused

8:15

on mental health services, and then

8:17

when this is being shared with

8:19

an analytics company, they have

8:21

insight into the nature of someone's

8:23

top concern. Is that right? Yes.

8:26

So in the cases where it's shared with an

8:28

analytics company, is there any kind of anonymization

8:30

of the data that happens? Certainly

8:33

on the end of the virtual care

8:35

platform, they don't know, as one participant

8:37

described, who's there. Okay. But when

8:39

it goes to Facebook and Google, they're

8:41

able to link it to your profile or

8:43

my profile, and it may not be

8:46

organized by my name, but it has

8:48

enough information on there that, you know,

8:50

if someone were to break into

8:52

it or hack it, they could clearly

8:54

know who that was. And that

8:56

kind of information is used for targeted

8:59

advertising as well as for political advertising

9:01

and things like that. I thought

9:03

this health data was supposed to be particularly

9:07

better protected. But I guess the

9:09

issue here is that this is

9:11

not, strictly speaking, health data.

9:13

Yes, it's unclear from legislation

9:15

whether this data should be

9:17

called personal health information. We

9:19

argue that it is, because

9:21

it's gathered in the context

9:23

of providing a health service. For

9:26

me it's also a little strange. Like, if someone

9:28

comes to my clinic to see me, I'm

9:30

not going to take their name and email

9:33

addresses and phone numbers and say, oh, this

9:35

is personal information because I separated it

9:37

from the health information and I can use

9:39

it differently. Like, yes, to me that doesn't

9:42

make any sense. I think most people

9:44

would agree with that, and that

9:46

seems, in our interpretation, to be how the

9:48

legislation is intended as well. Yeah.

9:51

Speak to me a little bit about what

9:53

the potential risks are here

9:55

for somebody whose

9:57

information is being collected.

10:00

Yeah, so one:

10:03

people are vulnerable when they seek a health service,

10:05

and they're trusting the site they're

10:07

getting a health service from, so they're,

10:10

like, more susceptible to the marketing

10:12

messages that are coming from these platforms.

10:14

That's one thing. It may interfere with

10:16

their ability to make decisions that

10:18

are in their own self-interest. And

10:21

then, we are particularly

10:23

concerned that the patient care

10:26

pathways may be influenced by the

10:28

pharmaceutical industry for commercial gains, interfering

10:30

with clinical decision making. But

10:32

this concern, that

10:34

maybe it's

10:36

affecting care, isn't the only thing.

10:39

There's privacy. A lot of companies,

10:41

if they didn't share it externally,

10:43

they were sharing it with other

10:45

subsidiaries in the larger

10:47

corporations they're in. Some were sharing

10:49

it externally as well. And de-

10:51

identified data as well can also

10:54

cause harms. Even when identifiers like

10:56

names and postal codes and dates

10:58

of birth are removed,

11:00

these data are often used

11:03

to create algorithms, artificial intelligence,

11:05

some kind of automated decision-making

11:07

systems, and these can incorporate

11:09

social biases, and then when

11:12

they're used can cause harm and

11:14

discrimination, particularly harm to socially

11:16

marginalized groups. Can you expand

11:18

on that last point a little bit? Yeah, so

11:20

there's an example that I can provide,

11:23

something that happened in US

11:25

hospitals. They used a particular

11:27

commercial algorithm, and this algorithm

11:29

assigned patients a risk score based

11:31

on their health conditions and other

11:34

factors as well, and it was

11:36

used to distribute resources, so people at

11:38

higher risk got more resources,

11:40

like home care and other things

11:43

like that. And then when researchers

11:45

looked at this algorithm, they found

11:47

that for a similar

11:49

risk score, Black people were a lot

11:51

sicker than white people, so they just

11:53

weren't receiving the appropriate resources for their

11:55

health condition. And when they dug in

11:57

and looked at it further, it

11:59

appears the algorithm

12:01

was using past use of

12:04

healthcare resources to determine who should

12:06

get resources for their future needs,

12:08

and the thing was that

12:11

Black people, because of structural

12:13

racism, exclusion and some other factors,

12:15

hadn't been using a lot of healthcare

12:18

resources in the past. Now,

12:20

the authors argue that this was

12:23

inadvertent, it wasn't intentional, but it could

12:25

happen all the time with these

12:27

algorithms, and they're commercial, proprietary.

12:30

Then there's no oversight. People might not

12:32

even know where they're

12:34

being used, and there's no recourse and no

12:36

ability for people to tell what's

12:38

happening. And of course

12:40

this can occur in the public system

12:42

too; our research studies use them. But

12:44

then at least there's a bit more

12:46

transparency around it and oversight.
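What Dr. Spithoff describes here matches a widely reported pattern in commercial risk-scoring tools: when past healthcare spending stands in for health need, a group that has historically received less care ends up with systematically lower scores. The short Python sketch below is a toy simulation of that mechanism only, not the actual algorithm discussed in the interview; the two groups, the 20 per cent access gap and every other number are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, size=n)      # 0 = group A, 1 = group B (labels are invented)
illness = rng.normal(50, 10, size=n)    # true health need, identically distributed in both groups

# Structural under-provision: group B has historically received about 20% less
# care (and so generated less spending) at the same level of illness.
spending = illness * np.where(group == 1, 0.8, 1.0) + rng.normal(0, 5, size=n)

# The "risk score" is simply past cost; extra resources go to the top decile.
flagged = spending >= np.quantile(spending, 0.90)

for g, name in [(0, "group A"), (1, "group B")]:
    sel = group == g
    print(f"{name}: share flagged = {flagged[sel].mean():.1%}, "
          f"mean illness among flagged = {illness[sel & flagged].mean():.1f}")
# Group B is flagged far less often, and those who are flagged are noticeably
# sicker -- the same pattern the researchers found when they audited the tool.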

13:08

You are listening to Spark from

13:10

CBC. I'm

13:12

Nora Young, and we're talking about

13:14

the trouble with data: data

13:16

privacy and data bias in our digital

13:18

tech. Right now my guest is Dr.

13:20

Cheryl Spithoff, a family physician and

13:22

assistant professor at the University of Toronto.

13:25

Her research looks at the impact of

13:27

commercial interests on health and the health

13:29

care system. Do you

13:32

get the sense that providers who use

13:34

virtual health platforms to reach their

13:36

patients are aware of this,

13:38

are aware of the privacy risks? I

13:41

don't think so. There was a

13:43

case in Alberta; their privacy

13:45

commissioner, they were investigating what was happening

13:48

with a virtual care platform called

13:50

Babylon, which has now become a

13:52

Telus platform, and was concerned with

13:55

how data were being used, and

13:57

that patients weren't being properly informed about

13:59

the different ways that data

14:01

were being handled. One of

14:03

their critiques was that the physicians

14:05

who were involved with these platforms

14:08

weren't taking proper responsibility for how data

14:10

are being handled. As the health data custodians,

14:12

that's really their responsibility, and I

14:14

don't think a lot of physicians

14:16

and nurse practitioners that work at

14:19

these platforms have that understanding.

14:21

There is actually a move here

14:23

in Ontario to make the

14:25

companies that provide these services, whether

14:27

they're electronic medical record vendors or,

14:30

as we've been discussing, a virtual care platform,

14:32

more responsible as well

14:34

for what happens with data. Currently

14:36

it is really the responsibility of the

14:38

health data custodian. I mean, it

14:40

seems like we're often told that

14:43

these technological solutions are a way

14:45

of bringing medical services you know,

14:47

rolling them out more broadly for a

14:49

lower cost. So what improved data

14:51

privacy protections would you like to

14:53

see rolled out? I

14:55

think virtual care is an important

14:57

tool, but yes, there's definitely

14:59

changes that we need to make

15:01

to ensure that privacy

15:03

is protected, and one of them

15:06

would be clearly defining all the

15:08

data that's collected on these platforms as

15:10

personal health information. It's collected in the

15:12

context of providing health care, so it

15:14

deserves the same protections. Also, it

15:16

seems it's difficult for patients to

15:18

opt out of many of the commercial

15:20

uses of the data. The data uses aren't

15:22

essential to the provision of

15:24

health care, and it should be very

15:27

easy to say, I don't have

15:30

to agree to these different uses in

15:32

order to access the health service. Another

15:34

thing that's important is providing protection for

15:37

de-identified data. So right now,

15:39

under most Canadian legislation, once data are

15:41

de-identified, you can essentially do

15:43

whatever you want with them. So both

15:46

the newly proposed federal privacy legislation,

15:48

as well as Ontario, which is considering new

15:50

privacy legislation for the private sector, both

15:52

of them are hoping, are planning on

15:55

bringing de-identified data within the

15:57

scope of the law, to give

15:59

appropriate protections and make sure

16:01

that re-identification of

16:03

individuals is unlikely. And what we're hoping for

16:05

is that they're also addressing the uses

16:07

of the data. So the way

16:10

that I outlined earlier how de-

16:12

identified data can cause harm even without

16:14

re-identification, we need to think about how

16:16

that data is to be used, so it's in

16:18

line with what the public supports. In

16:20

patient surveys and

16:22

studies, they clearly say that, you know,

16:25

I'm happy to have my data used

16:27

for research, for health system improvement, but

16:29

they're very reluctant to have it used

16:31

for commercial reasons without their explicit consent.

16:34

But will cracking down on patient privacy

16:36

curb commercial interest in telehealth at a

16:38

time when we're looking at investing more

16:40

into these sorts of technical solutions? It's

16:43

possible. I mean, I can't say for

16:45

sure, but it does seem like companies use this

16:47

as a revenue stream. So they're

16:49

getting paid some money from the

16:51

government, or from people paying out of

16:54

pocket to access the services, but then

16:56

in addition they're using that data a lot

16:58

for marketing their products or the

17:00

company's products. So it's possible there will be

17:02

less of an incentive if they're not

17:04

able to use data for those reasons.

17:07

But that more, you know, raises

17:09

the question then: is it more appropriate

17:11

to have nonprofit or public models of

17:13

virtual care, or a model where virtual

17:15

care is really integrated into ongoing care? Yeah,

17:18

so that's what I mean. Right now

17:20

it's difficult: we don't have enough primary

17:22

care providers, physicians and nurses, so a lot

17:24

of people are turning to these services.

17:26

But if we had a situation like the

17:29

Netherlands, where ninety-nine percent of people

17:31

have a family doctor, then everybody could

17:33

access virtual care through their family doctor,

17:35

and in that situation it's the clinical

17:38

team that is making decisions about how the

17:40

data are used, whether or not to

17:42

promote a drug or vaccine; it

17:44

is not the pharmaceutical industry. And so that

17:47

addresses a lot of those issues. My

17:49

clinic is not, you know, monetizing our

17:52

data. Sure, I imagine that very few

17:54

clinics are. You

17:56

know, I'm just thinking about the explosion of

17:58

what we might call quasi-medical

18:00

data, like all the things that track

18:02

your fitness, your sleep, your periods, and

18:05

so forth. Even if your phone has

18:07

robust privacy protections, you might be using

18:09

third-party apps. Do you think we need

18:11

to think more broadly about, I guess,

18:13

what constitutes sensitive data, or what constitutes

18:16

health data, given the explosion of new

18:18

technology out there? We do, because we've

18:20

historically thought about health data as

18:23

that narrowly defined definition

18:25

of data collected by a hospital

18:27

or nurse practitioner or a family

18:29

physician. Whereas now, like you're saying,

18:31

it's clear it's much broader, and

18:34

that information can be just as sensitive.

18:36

And just finally, if privacy

18:38

can be improved in the virtual care

18:40

space, would you like to see tele-

18:43

health become a more central part of the healthcare

18:45

system? I think it's an essential part of our

18:47

healthcare system. I would say for me,

18:49

about twenty percent of the visits that

18:52

I have with patients are virtual; maybe

18:54

with video it would be a bit higher, thirty

18:56

percent, and then for the urgent care that

18:58

we run in our clinic, it's

19:00

about ten percent. So it definitely has

19:03

a role, and we know that in appropriate

19:06

situations there's a lot that can be looked

19:08

after over the phone or through

19:10

video. Patients prefer it a lot of the time.

19:13

Sometimes it just doesn't make sense: I'm going to

19:15

listen to someone's lungs, I'm going to look

19:17

in their ears. But for the cases of,

19:19

say, follow-up for, you know, lab tests, or

19:22

for mental health reasons, like, doing it over

19:24

video or the phone is just as

19:26

appropriate a way to provide that health care

19:28

service. And not really everybody has access

19:31

to that, so we need to find

19:33

ways to ensure that everybody does. Thanks

19:35

so much for your insights on this, and I

19:37

look forward to having you on again. Dr.

19:40

Cheryl Spithoff is a family physician

19:42

at Women's College Hospital and an assistant

19:44

professor in the Department of Family Medicine.

19:58

Data is part

20:00

of the healthcare experience, from research

20:02

to diagnoses to treatments. And

20:05

the data that's collected along the way sets the

20:07

course for future care. But

20:09

what if it wasn't a human that used research data

20:12

to come up with a treatment plan? What

20:14

if it was artificial intelligence? Cajun

20:17

Gainti wrote about the history of data

20:19

collection in healthcare and the role machine

20:21

learning played in an article for The

20:23

Conversation. It's called, From a Deranged

20:26

Provocateur to IBM's Failed AI

20:28

Superproject, the controversial story of

20:30

how data has transformed healthcare.

20:33

Caitjan is a senior lecturer in the history

20:35

of science, technology, and medicine at King's

20:37

College London. Data

20:40

collection and medicine began really in the

20:43

early 20th century and it really was

20:45

about amassing lots and

20:47

lots of records from

20:49

various medical institutions or asking different

20:52

medical practitioners to send in cases

20:54

that looked as though they belonged

20:56

to a particular category, a particular

20:58

diagnostic category. The idea is that

21:00

if you can amass enough information

21:03

about a particular disease

21:05

and then sort of go through and analyze that

21:08

information, you'll be able to be more specific

21:10

about both diagnosis and

21:12

treatment. And

21:21

this was incredibly revolutionary from medicine

21:23

because prior to that point it

21:25

really operated as a kind of system

21:27

where a doctor would see patients and the

21:29

patient would say, these are the symptoms I

21:31

have, and the doctor would draw on their

21:33

own sort of anecdotal experience and say, oh,

21:36

I think you've got X disease or something

21:38

like that. Whereas after

21:40

this kind of data compilation

21:42

really starts to happen, you

21:45

start to see the kind of shift sort

21:47

of across the board for everybody who's diagnosed

21:49

with a particular kind of disease. So it's

21:52

really, really critical to the foundation of modern

21:54

medicine. Computers start to

22:00

come into medicine for all sorts of reasons, sort of in the

22:02

1960s, 1970s, as

22:05

they also are entering into other kinds

22:07

of scientific enterprises. And a lot of

22:10

that has to do with the big

22:12

science movement of the Cold War period,

22:14

and the kind of growing acknowledgement that

22:16

data is the right way to sort

22:19

of understand our lives. The

22:22

AI really comes in quite a bit later, sort

22:24

of in the 1990s, and

22:26

you'd see this real understanding, and

22:29

that period that medicine

22:31

is actually an information management

22:33

system, more than it is sort of anything

22:35

else. And what's better

22:37

for an information management system

22:39

than a computer, right? And

22:41

so then increasingly you see

22:43

more and more applications for

22:45

computers and then for AI

22:47

within medical context. But

22:50

I think really the explosion of sort of AI

22:52

and machine learning comes really in the 2010s, I

22:55

would say. It's

23:00

easy to understand why people were

23:02

so excited about the prospect of

23:04

AI in medicine. What patterns or

23:07

surprising findings might be revealed by

23:09

applying machine learning to huge amounts

23:11

of medical data? And

23:13

in particular, could machine learning lead

23:15

to personalized medicine? My

23:19

sense is that people who are really hopeful

23:21

about AI, I think it's really going to

23:23

resolve all sorts of issues,

23:25

and particularly the very thorny and

23:28

long-standing issue of how

23:30

to personalize care for individual

23:32

patients, which is a

23:34

very large and thorny sort

23:37

of problem. But what it has

23:39

done, I think really successfully, is

23:41

do a lot of the kind

23:44

of information management that we used

23:46

to do in this very analog

23:48

sort of way much more quickly

23:50

and much more effectively than humans

23:52

can. For example, one

23:55

of the first machine learning applications

23:57

was something called OsteoDetect, and the

23:59

idea behind this was something that

24:01

would teach a computer how to identify

24:04

wrist fractures and then allow it to

24:06

do that identification so that the doctor

24:08

doesn't have to, and it turned out to be a

24:10

task that they're very good at.

24:17

So the way that AI is

24:19

currently being deployed in healthcare is very

24:21

similar to the pattern recognition that was

24:23

the basis of the OsteoDetect program. So

24:26

in other examples it's been used to

24:28

identify the symptoms and potentially useful therapies

24:31

around long COVID, for example.

24:33

So that's a very recent usage

24:35

of AI and machine learning. And

24:37

it's very good at sort of

24:40

recognizing patterns in scans as well.

24:42

I think one example is about

24:44

cancer tumors being able to identify

24:46

tumors and then therefore being able

24:49

to suggest sort of appropriate treatments

24:51

for a particular kind of tumor

24:53

in a particular kind of body.

24:56

Or it's been used to do something

24:58

that's really classic in the history of

25:00

medicine, which is to look across many

25:02

different factors and to see sort of

25:04

the pattern that unites some of these

25:07

factors together. For example, determining the recurrence

25:09

of lung cancer: it can look at all

25:11

these factors and say, these seem to be

25:13

the factors that determine lung

25:15

cancer recurrence. So that's a lot of

25:18

the ways in which machine learning has

25:20

been deployed recently in healthcare settings.

25:24

But what that means is that while they're

25:26

very good at doing that kind of

25:28

work, sort of this background data work

25:30

that has always been really essential to

25:32

the way that we do medicine, that's

25:34

a very different enterprise from the kind

25:36

of machine learning work that would be

25:38

applicable to patient care in very

25:40

real and specific sorts of ways.

25:43

And one of the

25:45

biggest flops in the history

25:47

of AI is IBM's Watson computer,

25:49

which was meant to kind of

25:51

revolutionize especially cancer care. It had a very

26:02

exciting beginning as the Watson computer

26:04

that appeared on the

26:06

game show Jeopardy! and beat everybody.

26:08

It had this very celebrity

26:10

start on Jeopardy! And

26:12

then after that, one of the places

26:14

where IBM saw machine

26:17

learning as having real and immediate application

26:19

was in oncology. One of

26:21

the big problems that they ran into seeing this

26:23

was that, first of all, the kinds of data

26:25

that that computer was meant to learn from were

26:27

very, very different from each other and very hard

26:29

to square, very hard to create the patterns that it

26:31

needed. But then, second of all,

26:34

cancer care is not something that

26:36

is universalized across everywhere where people

26:38

have cancer across the world, so that

26:40

in its application it may be right

26:43

for the very particular hospitals where

26:45

the data were gathered, but it

26:47

wouldn't be right for other hospitals in

26:49

other parts of the world, say, where

26:52

those diagnoses and available treatments might

26:54

look a little bit different. So it

26:56

can't take into consideration local factors, and

26:58

that's one of these big, really difficult

27:00

issues that also prevented it from

27:03

being able to do the kind of

27:05

personalized care that we sort of dream

27:07

of as the future of medicine

27:09

more generally. You

27:18

know, in the last couple of years there have been all of

27:20

these stories about how Watson was sold off for parts,

27:22

and really the failure of Watson

27:25

to do the thing it was meant to

27:27

do in healthcare is kind

27:29

of the thing that really

27:31

articulates sort of the problems around AI

27:33

in the context of healthcare. One

27:39

of the really important challenges for

27:41

healthcare right now is

27:43

this question about personalized medicine. Can

27:45

we make personalized medicine work? So

27:47

this idea that we are all

27:50

individuals and our bodies need individual

27:52

things: can machine learning help us

27:54

to understand individuals and then be

27:56

able to tailor diagnoses and also

27:58

therapies to individual bodies? And

28:01

I think that's one of the

28:03

things that people are really hopeful

28:05

it will potentially help us

28:07

to resolve within health care. But

28:09

it can only be as good as the data

28:11

sets that it has to work with.

28:28

Machine learning

28:32

really

28:37

isn't necessarily about individuals; the data are

28:39

about populations generally, and I think

28:41

at best it largely

28:48

gets you

28:51

a subset, which

29:03

is a slice of the

29:06

population, and that could still be a large

29:08

number. And if you think

29:10

about, you know, the reasons why

29:12

you even need individualized care,

29:14

it won't be able,

29:16

at the end of the day,

29:18

to get at what is a pretty unique,

29:21

very particular individual person. In a

29:23

sense, what we need is a new

29:25

model, you know, that will say, like,

29:28

okay, if we really want to focus on individuals

29:30

and we're really serious about this kind of

29:32

personalized care, then we need to move away

29:35

in some ways from this way of thinking

29:37

about the use of data, rather than trying

29:39

to refine data use further and further and

29:41

further to the point where we can get

29:44

to each individual, which is really sort of

29:46

an impossibility. It's

29:52

hard, as I said, to talk about what the

29:54

future of anything might look like,

29:56

and in this case it's very hard

29:58

for me to imagine what a future

30:00

of sort of computers in medicine or

30:02

AI in medicine might potentially

30:05

be. You

30:07

know, I think there's a lot to

30:09

be said for the way that data

30:11

has helped to make modern medicine very,

30:13

very successful across the board, but it's

30:15

not going to be the thing that

30:17

turns the corner and allows us to

30:19

attend to everybody's individual health needs completely,

30:21

successfully, all the time. And if that's

30:23

what we want, then we have to

30:25

sort of rethink how we

30:28

do medicine. And I

30:30

really think the questions that we ask around sort

30:32

of tech and AI and

30:34

machine learning really ultimately come back

30:36

to this question about, well, is

30:38

this data-driven way of doing medicine

30:40

the way that we want to

30:43

keep doing medicine, or do

30:46

we want to do something different? Cajun

30:54

Gaite is a senior lecturer in the

30:56

history of science, technology, and medicine at

30:58

King's College London. Have

31:00

you ever wondered why you see what you

31:02

see when you're online? I'm

31:05

Jamie Bartlett, and in the gatekeepers

31:07

from BBC Radio 4, I'm

31:09

telling the story of how social

31:11

media accidentally conquered the world. Mark's

31:14

explaining to me he's going for a billion users.

31:16

I'm going for what? I'm sorry,

31:18

what is it you're going to do? They

31:21

can give us a voice or silence

31:23

us, whoever we are. At real Donald

31:25

Trump, it says, account suspended. To

31:27

understand how we got here and

31:30

where it's taking us, listen to

31:32

the gatekeepers available wherever you get

31:34

your podcasts. I'm

31:36

Nora Young, and this is an episode of Spark that first

31:39

aired in March, 2023. We're

31:41

talking about data in healthcare, where the

31:43

impacts of the use of our data

31:45

are so personal and consequential. During

31:48

the pandemic, I had breast cancer,

31:50

I was really desperate for information. So

31:53

one of the places it took me

31:55

was into my own electronic medical record.

31:58

This is Meredith Broussard. She's a data scientist who

32:00

also worked in software development and journalism.

32:03

And as a data journalist, you do things like

32:05

read all of the boring stuff

32:07

you know, and read the manual. And

32:10

so I saw a little note in my file

32:12

that said, this scan was read

32:14

by an AI. I thought,

32:16

oh, that's really strange. I

32:18

wonder why the AI read

32:21

my scans. This took

32:23

me into the wide world of

32:26

AI-based cancer detection. So

32:29

I devised a study in scientific

32:31

terms. It's a replication study with

32:33

an N of one where I

32:35

took my own mammograms and ran

32:37

them through an open source AI

32:40

to detect cancer in order to

32:42

write about the state-of-the-art AI-based cancer

32:45

detection. So the big

32:47

takeaway that I found was

32:49

that the software is

32:52

extremely impressive and

32:54

also not necessarily

32:57

ready for prime time. In

32:59

2023, Meredith released her book, More

33:01

Than a Glitch, confronting race, gender,

33:03

and ability bias in tech. In

33:06

it, she argues that the ways we

33:08

think about tech design create deep-seated problems,

33:11

not just in healthcare, but in our

33:13

data-driven future. We

33:16

have situations like the US

33:19

kidney transplant list where for

33:21

many years, if you were

33:23

white, your kidney numbers would be measured in

33:25

a certain way. And if you were black, your

33:28

kidney numbers would be measured in a

33:30

different way so that

33:32

white people would get onto

33:34

the kidney transplant list earlier

33:37

than black people. It

33:40

was called race correction in medicine.

33:43

And this to me is a really good illustration

33:45

of why we need to really

33:47

look at the underlying diagnostic systems

33:50

before we start implementing

33:52

them as algorithms, because

33:54

obviously it's really unfair to

33:57

put people onto the... transplant

34:00

eligible list earlier based

34:03

on their race. That's

34:05

just horrible. And in fact,

34:08

medicine broadly recognizes now,

34:10

oh, this is extremely unfair.

34:13

The American Kidney Foundation has changed

34:15

their formula for recommending. The UN

34:17

has said, all right, we need

34:20

to rearrange people's spots

34:22

on kidney transplant lists

34:24

globally. Probably it is

34:26

something that is happening, but it is a race-based

34:30

problem in medicine that

34:34

has been with us for a very long

34:36

time. And so then is the concern that,

34:39

for example, the kidney thing ends up getting

34:41

encoded into these systems and perhaps even not

34:43

being recognized down the road? Exactly.

34:46

Exactly. Because when something that

34:48

is unfair is encoded in

34:50

an algorithm, it becomes very difficult

34:52

to see and almost

34:55

impossible to eradicate. Yeah.
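The race correction Broussard describes can be made concrete with a small sketch: the same underlying kidney function yields a different reported eGFR depending on a race flag, which changes when a patient crosses the listing threshold. This is not a clinical calculator; the multiplier and cutoff below are approximations (the race coefficient used by the 2009 CKD-EPI equation and the eGFR-of-20 point at which waitlist time typically starts), included only to show the mechanism.

# Illustrative numbers only -- not a clinical calculator.
RACE_COEFFICIENT = 1.16   # roughly the race multiplier the 2009 CKD-EPI eGFR equation applied
LISTING_THRESHOLD = 20.0  # eGFR (mL/min/1.73 m^2) at which transplant waitlist time typically starts

def reported_egfr(base_estimate: float, race_corrected: bool) -> float:
    """eGFR a clinician would see for the same underlying kidney function."""
    return base_estimate * (RACE_COEFFICIENT if race_corrected else 1.0)

for base in (22.0, 19.0, 17.5):
    plain = reported_egfr(base, race_corrected=False)
    corrected = reported_egfr(base, race_corrected=True)
    fmt = lambda v: f"{v:4.1f} ({'listed' if v <= LISTING_THRESHOLD else 'not yet'})"
    print(f"underlying {base:4.1f} -> uncorrected {fmt(plain)}, race-corrected {fmt(corrected)}")
# At an underlying value of 19.0 the uncorrected patient qualifies for listing,
# while the race-corrected report (19.0 * 1.16, roughly 22) does not: the same
# kidney function, a later spot on the list.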

34:57

So your book goes beyond healthcare. You're looking

34:59

at how bias in technology affects the

35:02

justice system, education, disability rights

35:04

and more. And part of

35:06

the root cause is this underlying notion

35:09

of technochauvinism, which I believe

35:11

is a term you coined. So what's technochauvinism?

35:14

Technochauvinism is the idea that

35:16

technological solutions are superior to

35:18

others. What I would

35:20

argue is that it's not a competition. Then

35:23

instead we should think about using the right

35:25

tool for the task. So sometimes

35:27

the right tool for the task is absolutely

35:29

a computer. You will pry my smartphone out

35:31

of my cold, dead hands. But

35:34

other times, yeah, other

35:36

times it's something simple like a book

35:38

in the hands of a child sitting at a parent's

35:40

lap. One is not

35:42

inherently better than the other. So we

35:45

don't win anything by doing

35:48

everything with computers instead of doing it

35:50

with people. We need to think about what

35:52

gets us toward a better world. Yeah. So

35:55

regular listeners to the show will know we've talked

35:57

about the problem of bias in the training data

36:00

used for machine learning. Can you just talk

36:02

a little bit for me about how biases

36:04

manifest in machine learning? So

36:06

the biases that we see in machine

36:08

learning systems are the biases that exist

36:10

out in the real world. One of

36:12

the things people often say is that

36:14

AI is a mirror. And so we

36:16

really shouldn't be surprised when bias

36:19

pops up in AI systems,

36:22

because we know that we live in an

36:26

unequal world. One of

36:29

the things that I write about

36:31

is an investigation by the Markup,

36:33

an algorithmic accountability reporting organization. And

36:35

they found that in the US,

36:37

mortgage approval algorithms were 40 to

36:39

80% more

36:42

likely to reject black

36:44

borrowers as opposed to their

36:46

white counterparts. Now, this

36:49

might be surprising. But

36:51

then when we look at the

36:53

system, it becomes less surprising. Because

36:56

what is a mortgage approval algorithm doing?

36:58

Well, it's making the same kinds of

37:00

decisions that it sees in the

37:02

data that it was trained on. What is

37:05

it trained on? Who has gotten

37:07

mortgages in the past? Well,

37:09

we know that there's a history

37:11

of financial discrimination in lending. So

37:14

it's really unsurprising that

37:16

we should be seeing bias in

37:18

a mortgage approval algorithm. Yeah, you

37:20

have a great terse formulation, which

37:22

is tech is real life. So

37:25

whatever we find in real

37:27

life, we're going to find in tech at the

37:29

same time. But to me, one of the really

37:31

interesting things about the book is that it goes

37:33

beyond the bias in the technology itself, because

37:36

you're also exploring how the problem

37:38

lies in the interaction of the

37:40

bias technology with the bias

37:43

culture that's using the technology.

37:45

Is that a fair characterization?

37:47

Yeah, absolutely. We've got the

37:49

bias technology. And then we've

37:52

got this pro technology bias

37:54

operating out in the

37:56

world. And then we've got a lack

37:59

of diversity in Silicon Valley, we've

38:01

got a lack of diversity

38:03

in tech reporting. So it's all of

38:05

these factors that are interacting with

38:07

each other and kind of making

38:09

a mess.
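The mortgage-approval example above comes down to a simple dynamic: a model trained on historically biased lending decisions reproduces the double standard it saw. The sketch below is a toy illustration on synthetic data, not the Markup's analysis; the group labels, thresholds and features are invented, and scikit-learn is used only for convenience.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

group = rng.integers(0, 2, size=n)            # 1 = historically disadvantaged group (invented label)
creditworthiness = rng.normal(0, 1, size=n)   # identically distributed in both groups

# Historical approvals applied a stricter bar to group 1 -- discrimination baked
# into the training labels, the way redlining shows up in lending records.
historical_approved = creditworthiness > np.where(group == 1, 0.8, 0.0)

# A "neutral" model trained on those labels (group, or a proxy like postal code,
# is visible to it) simply learns to reproduce the double standard.
X = np.column_stack([creditworthiness, group])
model = LogisticRegression().fit(X, historical_approved)
approved = model.predict(X)

for g in (0, 1):
    print(f"group {g}: approval rate {approved[group == g].mean():.1%}")
# The learned approval rates mirror the biased history, not equal treatment.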

38:36

I'm Nora Young. Today on Spark, we're

38:38

talking about the limits of data-driven

38:40

machine learning. Right now my guest is

38:42

Meredith Broussard, a data scientist and the

38:44

author of the book More Than a

38:46

Glitch: Confronting Race, Gender and Ability Bias

38:49

in Tech. One of the

38:51

reasons I wrote the book is I feel

38:53

like we can do better. When

38:55

you just see an article

38:57

every few months about facial recognition

39:00

failing, it might seem like, I mean,

39:02

that's, you know, happening every so

39:04

often, it's not really a huge

39:06

problem. But when you see all

39:08

of these stories piled up together,

39:10

you really get a better sense

39:12

of what are the real harms

39:14

that people are suffering right now

39:17

at the hands of algorithmic systems.

39:19

And another thing that I do

39:21

in the book is I

39:23

try and point readers to the

39:26

thinkers who are doing really

39:28

amazing work in this regard.

39:30

It really creates a road

39:32

map to, okay, how can

39:34

we stop messing things up,

39:37

and how can we do a better job with

39:39

our technical world? Do you think

39:41

these technological tools are always fixable or

39:43

are there just cases where we should just

39:45

be categorically saying, no, we're not

39:47

using this? I think we really

39:50

need to make space for refusal.

39:52

I think we need to make

39:54

space to say, hey, this

39:56

thing is not working as expected

39:58

and we're going to throw it

40:00

away. And that's

40:02

really hard to do, especially when

40:04

you've invested millions in developing the

40:06

system or when you've spent months

40:08

of your life just

40:11

trying to make something work. You

40:14

know, it's, it's difficult writing too. I mean,

40:16

we have this expression in writing Kill Your

40:19

Darlings, like, and it's about, okay, let's get

40:21

rid of the words that you love most.

40:24

It's really, really hard. But sometimes that's what

40:26

you have to do in order to know

40:30

make your writing good. And sometimes what we need

40:32

to do in order to make

40:34

a better world with technology is we need

40:36

to not use bad technology. Yeah.

40:38

Or are there ways that we need to think about constraining

40:40

the uses of the technology, right?

40:42

So that maybe machine

40:45

learning, for example, is useful

40:48

for certain population level things, but we don't

40:50

use it where it determines, you

40:52

know, when it has an actual impact on

40:54

an individual person's life, for example. Yeah.

40:57

Like, let's not use it

40:59

to grade student papers. One

41:01

of the things that I read about the book is a case a

41:04

few years ago where the international

41:06

baccalaureate decided to give

41:10

real students imaginary grades

41:12

assigned by a machine learning

41:14

system, which of course was

41:17

a huge disaster because what the

41:19

machine did was it said, Oh, the poor

41:21

kids, we predict they're going to get bad

41:24

grades and the rich kids, oh, we

41:26

think they're going to get good grades. Well, that's

41:28

completely counter to everything

41:31

that we want out of

41:33

education, right? Education is supposed to be

41:35

the kind of thing where it's about individual effort.

41:37

You get out of it what you put

41:39

into it. You're not constrained by your

41:42

background. Yeah. I mean, this is a

41:44

case that happened because there were COVID

41:47

restrictions on students being able to take,

41:49

actually take exams in person. And when I was

41:51

reading a description of it, I mean, it's so nutty

41:54

that they would have thought this was a good

41:56

idea. And it really made me think like, what,

41:58

how did that happen? people thought that that would

42:01

be a good idea as

42:03

a way of predicting individual students'

42:05

success or failure at these exams.

42:07

It's quite extraordinary. It

42:09

really is. I mean, we

42:11

all made some baffling decisions

42:13

during the pandemic, but that

42:16

one really sticks out to

42:18

me as misplaced faith in

42:20

algorithmic systems. Yeah. How

42:23

much of a problem do you think it is that people

42:25

just don't actually understand the technology? That

42:28

it seems like the machine learning, say,

42:30

is spitting out the capital T truth

42:32

rather than dealing with probability

42:34

or pattern matching? Oh,

42:37

absolutely. Absolutely. That's a factor

42:39

because these systems are

42:41

really hard to understand. So

42:44

I have had about a billion conversations about

42:46

chat GPT in the past couple of months.

42:48

As you can imagine, I have

42:51

explained a number of times, this is

42:53

how chat GPT works. And it's

42:56

kind of always a surprise

42:58

to people because you use

43:00

technology without thinking too hard

43:03

about how it's constructed. The

43:05

way I've thought about it is like, oh, I drive my

43:07

car, but I don't really think about the spark plugs

43:10

or the axles when I'm

43:12

driving my car. I just want to get my car and

43:14

go about my business. So I'm

43:16

the kind of person who I look

43:18

at technological systems and I think, oh,

43:21

well, the data is coming from here and the

43:23

data is coming from here and like this user

43:25

interface design decision was made and

43:27

oh, the output is going to be flawed because blah,

43:29

blah, blah. I don't know. That's just how my mind

43:32

works. And I think

43:34

that if more people start thinking

43:36

about about what goes

43:38

into a computational system, we're going to

43:40

make better decisions about what comes out

43:43

of a computational system and we'll have

43:45

less space in them when we need to

43:47

be skeptical. And I want to feel

43:49

empowered to push back

43:51

against algorithmic systems or algorithmic

43:54

decisions that are

43:56

bad decisions. You

44:10

are listening to Spark. What

44:12

we thought of as cyberspace

44:15

colonized and then effectively has

44:17

become what we think of

44:19

as the real world. This

44:21

is Spark on CBC Radio.

44:27

I'm Nora Young, and right now my guest is

44:29

data scientist Meredith Broussard, author of

44:32

More Than a Glitch. In it, she argues

44:34

we need to go beyond critiquing tech to

44:36

look at ways of designing it in the

44:38

public interest. So,

44:41

I am pro technology in

44:44

general. I sometimes like to

44:46

make that clear: like, building

44:48

technology, using technology, I

44:50

think, I am for it. I'm not

44:52

a naysayer. What I'm acknowledging,

44:55

what I really think, is

44:57

that we need to think

44:59

about the complex interplay between

45:01

society and technology, and so

45:03

we need to not

45:05

use computers for things that

45:08

they're inappropriate for. So there's this

45:11

kind of a fantasy of a

45:13

fully autonomous world, where

45:15

algorithms, like, govern everything that is

45:18

on social media, and I

45:20

use an app to summon a

45:22

car and the car drives itself

45:24

to you and then drops you

45:27

off and then, like, disappears into

45:29

the ether, right? Like, there's

45:31

this fantasy. What I

45:34

would argue instead is we

45:36

need to think about human-

45:38

in-the-loop systems, that

45:40

actually having a taxi driver

45:43

is great, because there are other

45:45

humans in the other cars

45:47

out there, and humans are

45:49

really good at not getting

45:51

into crashes with each other, despite

45:53

what the self-driving car folks

45:56

would like you to depict. The amount of time

45:58

that we don't crash is actually greater

46:00

than the number of times that we do crash.

46:03

So sometimes we can

46:05

have autonomous systems, but, um,

46:09

most of the time it's a human-in-the-loop

46:11

system. And we're better off thinking

46:13

about that. And we're

46:15

better off thinking about what are

46:18

the human problems that we're bringing

46:20

to the table. So beyond AI,

46:22

does bias manifest in tech in other

46:24

ways? In one chapter you talk about

46:27

gender and how we came to have

46:29

the sort of binary male/female

46:31

gender options on forms. Tell me a

46:34

bit about that. That story came

46:36

about because I was trying to

46:38

use my husband's transit pass in

46:41

Philadelphia. The transit passes used

46:43

to be marked with an M

46:45

or a W. And I was,

46:48

you know... his pass said

46:50

M, and I was clearly W, and

46:52

I wasn't gonna be able to use his pass.

46:54

But one of the things that's

46:56

important as a designer, as a reporter,

46:58

whatever, is to think about the

47:01

experiences of people who are not like

47:03

you, to have

47:06

empathy when we're designing

47:08

technological systems. And I started

47:10

wondering, okay, well, what is

47:12

this like for somebody who is

47:14

non-binary, who's trans, at the train

47:16

station when you have to go up

47:19

and get your sticker? Like, is that

47:21

an experience where people get

47:23

misgendered and are experiencing micro-

47:25

aggressions? And so I started talking to

47:28

people, and yes, this was a big

47:30

issue. And then I realized, like... Why?

47:33

Does there have to be a

47:35

gender sticker on the train pass?

47:38

And then I wondered, okay, well, this is

47:40

a transit pass. But what

47:42

about the databases

47:45

that all of

47:47

our information is entered into? What does

47:49

it feel like not to have a

47:51

box in the database

47:53

that, in fact, represents your actual

47:55

gender identity? And

47:57

so there are all these situations

48:00

that trans or non-binary

48:02

folks or intersex folks, like,

48:04

experience out in the world

48:06

that are gender-violent situations

48:09

at the hands of computational

48:11

systems. Going through the airport,

48:13

for example, there is

48:15

a pink or a blue button that

48:18

the TSA agent presses when

48:20

you go into the X-ray

48:22

machine, and if your gender presentation

48:25

does not match up to what

48:27

the computer thinks should be there,

48:29

then you get pulled aside

48:31

and, you know, pulled

48:33

into this very invasive exam. I

48:36

mean, this is terrible.

48:39

We should do better. Yeah. Yeah, we should

48:41

do better. It also suggests

48:43

to me the kind

48:45

of problem of legacy choices, right?

48:47

That it's not just that you

48:49

make technological choices for this particular

48:51

technology, but that those choices,

48:53

you know, M or W or

48:55

whatever, end up having implications for

48:58

technologies further down the road

49:01

that are built on top of

49:03

those earlier technologies? Absolutely. Absolutely. And

49:05

so, actually, nineteen-fifties

49:07

ideas about gender are

49:09

encoded into our databases. I think about

49:12

the way that I was taught

49:14

to write databases in college back in

49:16

the day. You had to be really stingy

49:18

with storage back then, because storage was

49:21

expensive, and so one of

49:23

the ways you would make your

49:25

programs smaller and run faster is you

49:27

would use the smallest variable possible.

49:29

Well, a binary value,

49:32

a zero or a one, takes

49:34

up a very small unit of space inside

49:36

the computer. And so I was taught to

49:38

include gender as a binary. And,

49:41

well, when you're thinking about

49:43

gender as being just

49:46

male or female, I guess it fits very neatly

49:48

into a zero or a one. But

49:50

now we understand that gender is

49:52

a spectrum. We understand that gender

49:55

needs to be an editable field.
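A minimal sketch of the legacy pattern Broussard describes, gender stored as a one-bit flag, and the kind of renovation she argues for, an editable self-described field. The table and column names are hypothetical, and SQLite is used only because it ships with Python.

import sqlite3

conn = sqlite3.connect(":memory:")

# Legacy schema: one bit for gender, two options, nothing a person can correct.
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, gender INTEGER)")
conn.execute("INSERT INTO students (name, gender) VALUES ('Sam', 0), ('Alex', 1)")

# Renovation: add an editable, nullable free-text field and backfill it only as
# a starting point that the person can change (or leave blank) themselves.
conn.execute("ALTER TABLE students ADD COLUMN gender_identity TEXT")
conn.execute("""
    UPDATE students
    SET gender_identity = CASE gender WHEN 0 THEN 'woman' WHEN 1 THEN 'man' END
""")
conn.execute("UPDATE students SET gender_identity = 'non-binary' WHERE name = 'Alex'")

print(conn.execute("SELECT name, gender_identity FROM students").fetchall())
# [('Sam', 'woman'), ('Alex', 'non-binary')]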

49:58

But that's not what people did in

50:02

the 70s when, say,

50:04

university student information systems were

50:06

originally set up. So

50:08

the modern university really has

50:10

to go in and renovate

50:13

their systems to make gender

50:15

editable. Yeah. But are

50:18

there ever trade-offs associated with

50:20

opening up these systems and making them more inclusive,

50:23

in the sense of, I'm just thinking that one of

50:26

the things that you hear sometimes about government

50:28

websites is that they're clunky to use. They're

50:30

not as elegant as commercial websites. And that,

50:32

at least part of that, I think, is

50:34

because they're designed to be accessible. They don't

50:36

necessarily need fast

50:38

internet connections. They don't necessarily, you know,

50:40

they're designed so that low

50:42

vision and blind users can use them. So do we ever have

50:44

to deal with trade-offs in this regard?

50:48

I learned a lot about accessibility

50:50

and designing for different disabilities

50:53

as I was researching this book. And

50:56

one of the concepts that was

50:58

really important for me was the

51:00

concept of the curb cut effect.

51:03

So the curb cut is the part

51:05

at the edge of the sidewalk that

51:07

slopes down into the street. And

51:10

they didn't used to make sidewalks with curb

51:13

cuts, but it was

51:15

something that was implemented as a

51:17

result of just ages of

51:19

work by disability advocates. And

51:23

curb cuts don't just

51:25

benefit people in wheelchairs,

51:28

right? They benefit people who

51:31

are using walkers. They benefit

51:33

people who are pushing babies

51:35

in strollers. They benefit

51:38

people who are wheeling a dolly

51:40

down the sidewalk. You know, it

51:42

makes it easier. And

51:44

so everybody benefits from

51:46

a curb cut.

51:49

It's not just something that

51:51

benefits people with, you know,

51:53

specific disabilities. It's

51:55

something that benefits everybody. So when we

51:57

design for accessibility.

52:00

we are actually designing for

52:03

the benefit of everybody. The

52:06

book ends on really

52:08

an optimistic note about the possibility

52:10

of us as citizens being

52:12

more activist about the possibility of public

52:15

interest technology. So can you talk to

52:17

me about this idea of how we

52:19

actually go about fixing these problems? So

52:22

there are two things that really make

52:24

me optimistic about the future right now.

52:27

The whole book is not a bummer. So

52:33

one thing I'm really optimistic about is

52:35

algorithmic auditing. For a very

52:37

long time, we kind of looked at algorithmic

52:39

systems as being black boxes. And

52:42

we thought, oh, we can't possibly understand

52:44

what was going on inside. Well, now

52:46

we have better tools for

52:48

cracking open the black boxes, for

52:50

looking at the training

52:53

data, the model file, the

52:55

code used to construct the

52:57

system. And we have

52:59

tools, mathematical tools, for

53:01

measuring bias in

53:03

these systems. So I'm very optimistic about that.
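One small example of the measurement an algorithmic audit typically starts from: comparing a model's favourable-decision rates across groups and checking the disparate-impact ratio. The numbers below are invented, and a real audit also examines the training data, the model file and error rates, as described above.

import numpy as np

def selection_rates(decisions: np.ndarray, group: np.ndarray) -> dict:
    """Share of favourable decisions (1s) each group received."""
    return {str(g): float(decisions[group == g].mean()) for g in np.unique(group)}

# Invented audit sample: 1 = favourable decision (loan approved, flagged for extra care, ...)
decisions = np.array([1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0])
group     = np.array(["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"])

rates = selection_rates(decisions, group)
ratio = min(rates.values()) / max(rates.values())
print(rates)                                   # selection rate per group
print(f"disparate impact ratio: {ratio:.2f}")  # values below ~0.8 commonly flag a disparity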

53:05

I think you're going to be hearing a

53:07

lot more about algorithmic auditing in the

53:10

coming years. And the other thing I'm

53:12

really excited about is a

53:14

new field called public interest

53:16

technology. It's exactly what it

53:18

sounds like. It's about making technology in the

53:21

public interest. So a public

53:23

interest technologist might audit

53:26

an algorithm for bias. They

53:28

might work on making a

53:30

website more accessible or make

53:33

a website kind of more

53:35

stable so that when there's

53:37

the next global pandemic and

53:39

a million people file for unemployment

53:42

at the same time, the website won't go down,

53:44

right? Like these

53:46

are really important infrastructure projects

53:49

that we don't think a lot about, but

53:52

are really crucial

53:55

to an effective functioning democracy.

53:57

Yeah. Thanks so much for talking to

53:59

us about it. It's a great book. Thank you, Nora.

54:03

Meredith Broussard is a data scientist

54:05

and the author of More Than a

54:07

Glitch, Confronting Race, Gender and Ability Bias

54:09

in Tech. You've

54:16

been listening to Spark. The show is

54:18

made by Michelle Parise, Samraweet Yohannes, McKenna

54:20

Hadley-Burke, and me, Nora Young, with

54:22

Cheryl Spithoff, Caitjan Gainty,

54:24

and Meredith Broussard. Subscribe

54:26

to Spark on the free CBC Listen app

54:28

or your favourite podcast app. I'm Nora Young.

54:31

Talk to you soon.
