Episode Transcript
0:08
Hello and
0:08
welcome to Flash Forward. I'm Rose,
0:11
and I'm your host. Today,
0:13
we are going to talk about the technology piece
0:15
of Welcome to Vanguard Estates. Again,
0:17
this is one of five episodes about the
0:19
real stuff that inspired the story. So if you
0:21
haven't listened to it and you care
0:23
about having certain elements of the story spoiled,
0:26
now is the time to hit pause on this episode
0:28
and go listen to the series.
0:31
Okay. Great. Now, let's
0:33
talk about tech. And I wanna start
0:35
with Missy. Hello, Marcus.
0:37
Nice to meet you. Missy
0:40
was loosely inspired by a real service
0:42
called care dot coach.
0:44
We
0:44
coach people to improve self
0:47
care and health care outcomes. And
0:49
the dot means that
0:51
we do it through technology.
0:53
That's
0:53
Victor Wang, the CEO of Care dot
0:55
Coach. You have heard Victor on the show
0:57
before back in twenty eighteen.
1:00
care dot coach is not robots.
1:03
Their system works via an app on
1:05
a tablet. Well,
1:06
it looks like a little dog
1:08
or cat most of the time. The avatar
1:12
is the face for an
1:14
entire global team of
1:16
empathic and intelligent
1:18
and caring people who we
1:20
call health advocates. And we
1:22
hire them, like, in the Philippines and Latin
1:24
American countries for Spanish, and staff
1:27
the avatar with all these great people
1:30
and thereby partially
1:33
solve the caregiver crisis. Basically,
1:36
what the app does is it consolidates
1:38
a whole team of people into one
1:40
avatar that the user engages
1:42
with. And that cat or dog
1:45
does everything from medication reminders
1:47
to talking people through physical therapy
1:49
to just general social
1:51
conversations. And they
1:53
picked the dog and cat avatars for
1:55
a couple of reasons. For one thing, there is
1:58
evidence that animal therapy has benefits
1:59
even if that animal is a
2:02
robot. But there is another reason
2:04
too. People
2:05
also tell their dog all sorts of stuff they
2:07
don't even tell their own family members.
2:10
We had a guy wake up
2:12
his little dog avatar. I
2:14
think he called him Buddy. People
2:16
call them all sorts of
2:18
things, like Sparkle and
2:21
El Capitan, and things like that. I
2:23
think it was Buddy. He's
2:25
like, hey, buddy. I I fell in the shower.
2:28
Kinda banged up. I think I'm okay, but
2:30
can you stay with me and make sure I'm okay?
2:32
But don't tell my daughter because
2:35
she's gonna put me in a nursing home if
2:37
she finds out I fell in the shower. Now
2:40
I asked Victor, okay, like in that
2:42
situation, do you tell the
2:44
daughter or does Buddy keep
2:46
the secret? And the answer
2:49
is kind of complicated.
2:51
A lot of those decisions are coded
2:53
into the system. Care
2:55
dot coach isn't something you can easily
2:57
just go buy at the store or download
2:59
on your phone. They usually
3:02
work with specific health plans and
3:04
insurance programs. And each of those
3:06
health plans has specific rules
3:08
about what gets reported to who.
3:11
So there
3:11
there'll be certain types of things that
3:13
they want to escalate, certain types of things they want
3:15
to go to, like, an after-hours,
3:17
like, nurse line. They might have,
3:19
like, a suicide prevention line they want us
3:21
to direct certain things towards.
3:23
They might have, like, a social
3:24
determinants of health type
3:27
of pathway. And certain
3:29
organizations like to leverage
3:31
family members as much as possible,
3:35
and so they
3:36
might actually want us to directly
3:38
go to a family member.
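To make the shape of those rules concrete, here is a minimal sketch of how a per-plan escalation policy could be represented. It is purely illustrative: the plan names, event labels, destinations, and the route_event helper are all assumptions for this sketch, not care dot coach's actual implementation.

# Illustrative only: a toy per-plan escalation policy. All plan names,
# event labels, and destinations here are hypothetical.

DEFAULT_DESTINATION = "health_advocate_followup"

# Each health plan configures where different kinds of events get directed.
PLAN_POLICIES = {
    "plan_a": {
        "fall": "after_hours_nurse_line",
        "suicidal_ideation": "suicide_prevention_line",
        "food_insecurity": "social_determinants_pathway",
    },
    "plan_b": {
        # Some organizations prefer to involve family members directly.
        "fall": "notify_family_member",
        "suicidal_ideation": "suicide_prevention_line",
    },
}

def route_event(plan: str, event: str) -> str:
    """Look up where a given event type should be escalated for a plan."""
    return PLAN_POLICIES.get(plan, {}).get(event, DEFAULT_DESTINATION)

print(route_event("plan_b", "fall"))  # -> notify_family_member

In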
3:40
our story, one of the things that
3:42
the dad confesses to Missy is technically a
3:45
crime. And I asked Victor in
3:47
real life if someone told their
3:49
care dot coach avatar that
3:51
they had committed a crime, what
3:54
would actually happen? I
3:56
think that depends on
3:58
the organization's policies
3:59
and, like, the nature of the supposed
4:02
crime. There is
4:03
an element of common sense. You know?
4:05
Like,
4:07
like, are we
4:08
talking about, like, somebody smoked marijuana
4:10
in a state that doesn't
4:13
allow marijuana, or are we talking about,
4:15
like, they're in
4:16
the process of burning the building down?
4:18
So in
4:21
a lot of cases, it's gonna be a judgment
4:23
call.
4:24
In our story, Marcus, the dad
4:27
gets really connected to the person
4:29
on the other end of his robot cat.
4:31
And that is a thing that
4:33
really does happen with people who use
4:36
these kinds of systems. A
4:38
couple of people completely bypass
4:41
the interface.
4:42
This is doctor
4:44
Amanda Lazar, an assistant professor at
4:46
the University of Maryland College of
4:46
Information Studies. And a few
4:50
years ago, she did a study on
4:52
these kinds of avatars. In
4:54
the study, one participant said that
4:56
they could tell who was on the other
4:58
end based on the sense of humor that
5:00
was coming through the little dog
5:03
avatar. Another participant said
5:05
that she would ask certain questions
5:07
that she knew some teleoperators wouldn't
5:10
answer so that she could get to the one
5:12
that she wanted to talk to. I
5:14
think this is a really interesting thing
5:16
to think about because often
5:18
we assume that older people are
5:20
just going to simply sit back and
5:22
use the devices sort of exactly
5:24
the way they're told to.
5:27
But of course, nobody does
5:29
that. Right? We all find ways
5:31
around certain elements of our technology
5:33
to try and get what we want. And
5:36
often, what older folks want
5:38
is relationships, contact,
5:41
social interaction, and
5:43
they will use technology to get that
5:45
in all kinds of interesting ways.
5:47
I'm socially isolated and I need
5:49
some connection. So I'm gonna turn my
5:51
temperature in my apartment up
5:53
above ninety two because I understand that
5:55
that's the point at which a telecare operator
5:57
calls me, and I'm gonna
5:59
have a social chat with that telecare
6:01
operator. This is doctor Clara Berridge,
6:03
an associate professor at the University of
6:05
Washington School of Social Work.
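As a sketch of why that workaround is possible: a telecare rule like the one she describes can be as simple as a fixed threshold, which is exactly what makes it gameable. The cutoff and the function below are illustrative assumptions, not any vendor's real logic.

# Hypothetical fixed-threshold telecare rule, for illustration only.
# A resident who learns the cutoff can trigger a social call on demand
# by nudging the thermostat past it, as in the anecdote above.

TEMP_ALERT_F = 92.0  # assumed cutoff taken from the story

def operator_should_call(apartment_temp_f: float) -> bool:
    """A telecare operator calls whenever the apartment runs too hot."""
    return apartment_temp_f > TEMP_ALERT_F

print(operator_should_call(93.0))  # True: the operator calls, and a chat ensues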
6:07
I asked Victor what would happen if
6:09
someone, like the dad in our story,
6:12
asked for a worker's name or
6:14
location. Yeah.
6:15
Our health advocates are trained
6:17
not to go there.
6:19
And this was something that the users in
6:21
Amanda's study found kind
6:23
of frustrating. Some
6:25
people were just so focused on that
6:27
human interaction that they could kind
6:29
of play around with the
6:31
system so they could get to those people that
6:33
they developed relationships with, or
6:36
be less interested in it because they
6:38
knew there were people in there
6:40
that were not giving them what they
6:42
felt was an authentic
6:45
interaction, but they were expected to give an
6:47
authentic interaction of their own. One
6:50
participant in Amanda's study said, quote,
6:52
the digital pet can't really be a
6:54
friend to me because the people that
6:56
I talk to on the other end can't tell
6:58
me anything about their personal
7:00
lives. When someone asks me a question,
7:02
I answer the question. Then I ask
7:05
back, well, how about you? And what
7:07
do you do? And how do you feel? And
7:09
what do you like? Well, that didn't
7:11
go over very big because they're not supposed
7:13
to tell me who they are, where they are,
7:15
or what their families are like, how many children
7:17
they have, and all of that.
7:21
We forget, I think, when we
7:23
design technologies
7:23
for older adults that people
7:26
don't want to
7:26
just be cared for in relationships.
7:30
Many people do not want the relationship just
7:32
to be comforted or reached
7:34
out to in their social isolation. Right? They also
7:36
want to make genuine connections
7:38
with other people and help them and care about them. Have
7:40
some questions about their lives and their families and
7:42
things like that. So that was also something with
7:44
robotic pets we found where people some
7:46
people described how they wanted to be able to,
7:48
like, care for it,
7:50
too. Right? Whether that's, like, changing the
7:52
batteries, in,
7:52
like, a more meaningful way,
7:54
they were happy they didn't have to clean the
7:56
poop anymore. Like, there were certain things
7:58
that were like, great, I'm happy
7:59
not to be doing that.
8:01
Of
8:02
course, avatars are
8:04
not the only kind of technology that
8:06
exists to help seniors.
8:09
Communication
8:09
and engagement is the first
8:11
one I talk about.
8:12
Health and wellness,
8:14
learning and contribution, and
8:16
safety and security. So those are
8:19
the four
8:19
big categories.
8:21
This
8:21
is Laurie Orlov, the founder of Aging
8:23
and Health Technology Watch. She's
8:25
been tracking technologies related to aging
8:27
for over a decade. I'm probably the
8:29
first one who identified it as such.
8:31
Her four categories of technology
8:34
encompass a whole bunch of different
8:36
types of devices, apps, and
8:38
services.
8:39
Email, virtual reality,
9:41
software applications, games, video,
8:43
cell phones, smartphones, tablets, smart
8:46
speakers,
8:46
voice assistants, and hearables. Mobile
8:48
health applications, telehealth,
8:51
medication management, disease
8:52
management, fitness trackers, voice assistants,
8:55
and health related wearables.
8:57
Shall I
8:57
keep going? Home
8:59
security systems, which is sort of the basics
9:01
in technology
9:02
for people who want to age in their own
9:04
home, voice
9:05
enabled health capabilities from smart
9:07
speakers,
9:08
cameras which are increasingly smart,
9:11
fall
9:11
detection technologies, home and
9:13
activity monitoring sensors
9:15
and radar.
9:17
Lots of stuff. From her
9:19
perspective, as an industry analyst,
9:21
Laurie is watching a few sectors
9:23
in particular. One of them is
9:25
sensors. This can be everything from
9:27
a basic motion sensor light that
9:29
turns on when you walk by, all the way
9:31
up to much more complicated
9:33
detection devices.
9:35
Sensors that can tell you about
9:37
whether the stove has been
9:39
left on, sensors
9:40
about your health, your heart
9:42
rate, blood pressure, your
9:44
personal temperature, and
9:46
camera based sensors
9:47
that can
9:49
detect what you're doing,
9:51
match
9:51
it up to some information about
9:53
you and make a suggestion, you know,
9:55
through voice, for example, make a suggestion
9:58
about something you should change in your
9:59
behavior. In the first
10:01
episode, you heard Nikki talking about
10:04
using an Amazon Echo to set up the
10:06
lights and music to keep her
10:08
mom calm and relaxed during
10:10
sundown. She also uses
10:12
other sensor technology. So
10:14
the camera's phenomenal.
10:15
I love it because
10:18
when when my mom was in
10:20
her natural space,
10:22
she would do things that I would
10:25
my mind would be blown, like, why
10:27
are you hiding cookies
10:29
in the closet? What, do you think I'm
10:31
not gonna give you more? Like, you
10:33
know, so when I would when I
10:35
was able to find her doing those
10:37
things, it kinda helped me with assessing
10:39
how I should run her day to day. Like,
10:41
maybe give her an extra snack
10:43
because obviously she's still hungry, but doesn't know how
10:45
to communicate that she's hungry. I do
10:47
have cameras that
10:49
also assist with caring for her
10:51
because they let me know. They
10:53
notify me through the app
10:55
when her body temperature has gone
10:57
down, or when she has
10:59
moved a little bit too much. So
11:02
I'm able to use
11:04
technology to kind of ease
11:06
all of the heaviness and
11:08
the weight of caring for someone who
11:10
is actively dying.
11:13
Another thing Laurie is watching along with
11:16
almost everybody else in the tech industry
11:18
in general is AI.
11:20
Software that can actually learn something
11:22
about your behavior, assemble the
11:24
data, and predict some possibility
11:26
of change in the future. I
11:28
think
11:28
that's the most interesting thing. And
11:30
it's not all that well evolved this year,
11:33
but hopefully next year, it will
11:35
be more well evolved and will be increasingly part
11:37
of technology offerings that serve
11:39
older adults. One
11:41
potential application is helping to
11:43
decode and understand nonverbal
11:46
kinds of communication.
11:48
Many folks with dementia struggle to
11:50
verbalize at some point or another.
11:52
But there are lots of ways a person
11:54
can show what they do or
11:56
don't like. The
11:57
application I think for AI is most interesting
11:59
is
11:59
actually kind of detecting people's
12:02
nonverbal signals because
12:04
we think if someone can't say, yes, I would
12:06
like to do that, please.
12:07
Or I'm having so much fun right
12:09
now that you know,
12:11
maybe they're not able to understand anything,
12:13
but then if you look at like the clinical
12:15
literature, there's all these observational measures of
12:17
engagement. Like, is someone leaning forward during the
12:19
activity?
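As a toy illustration of what observational measures of engagement might look like in software, here is a sketch that scores a few nonverbal cues. The cues and weights are invented for illustration; this is not a validated clinical instrument or any researcher's actual model.

# Hypothetical engagement scorer built from simple nonverbal cues.
# All features and weights are made up for this sketch.

def engagement_score(leaning_forward: bool,
                     gaze_on_activity: bool,
                     smiling: bool,
                     turned_away: bool) -> float:
    """Combine simple observed cues into a rough 0-to-1 engagement estimate."""
    score = 0.0
    if leaning_forward:
        score += 0.4
    if gaze_on_activity:
        score += 0.3
    if smiling:
        score += 0.3
    if turned_away:
        score -= 0.5
    return max(0.0, min(1.0, score))

print(engagement_score(True, True, False, False))  # 0.7: likely engaged

For his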
12:20
part, Victor and his team are already
12:23
incorporating AI into care
12:25
dot coach so that the system can
12:27
work faster and more autonomously.
12:29
Like,
12:29
for example, if somebody
12:32
says something to you, it takes you a
12:34
moment to be like, what is
12:36
a thoughtful, empathic
12:38
sort
12:38
of thing that I should say?
12:42
Okay. Let me say it and then you're able to
12:45
type it out. And then
12:46
you go, oh, I typed it wrong. You just fix
12:48
that. And then you have to hit enter,
12:50
and meanwhile, your client
12:52
or, you know, this person on the
12:54
other side is, like, waiting
12:56
for you and their avatar to
12:58
respond. So we're leveraging
13:01
some really cutting edge techniques to
13:04
take all the training data that we've built
13:06
and automate a lot of that and make that
13:08
faster.
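A minimal sketch of that human-in-the-loop speedup, under stated assumptions: software drafts a candidate reply and the human advocate edits or approves it before the avatar speaks. The draft_reply stub below is a placeholder, not care dot coach's actual model, which the episode says is trained on their own data.

# Hypothetical human-in-the-loop reply flow, for illustration only.

def draft_reply(message: str) -> str:
    """Placeholder suggestion model; a real system would be trained on
    past advocate conversations."""
    if "fell" in message.lower():
        return "I'm so sorry to hear that. Are you hurt? I'll stay right here with you."
    return "Tell me more about that."

def respond(message: str, advocate_edit=None) -> str:
    """The human advocate stays in control: accept the draft or rewrite it."""
    suggestion = draft_reply(message)
    return advocate_edit if advocate_edit is not None else suggestion

print(respond("Hey buddy, I fell in the shower."))

But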
13:09
in the future, there might be apps and devices
13:12
where there isn't a human involved
13:14
at all. How
13:15
do you leap to that future where
13:18
AI is actually able to do this
13:20
kind of thing and build this type of trust
13:22
and, like, completely solve
13:24
the caregiver shortage? There are
13:25
some big questions here that
13:27
involve discussions of algorithmic
13:30
bias and what it really means
13:32
to care for someone. If
13:34
the users in Amanda's study wanted
13:36
a real authentic connection,
13:39
they're not going to get one
13:41
from an app. You
13:43
can't have a two sided conversation
13:45
about your kids or your lives
13:47
with an algorithm. And what
13:49
about all those sensors we talked about?
13:51
What are the ethics when it comes to
13:54
things like cameras installed in
13:56
people's spaces? Can
13:58
someone with dementia consent to
13:59
something like that? How do you
14:02
have that conversation?
14:04
So, like,
14:04
let's say your husband was like, I want to install
14:06
a camera in our house to, like,
14:08
keep track of, like, safety wise,
14:10
making sure, like, the stove's not on. How would
14:12
you feel about that? I'd tell him to buzz
14:14
off. Yep. Yeah. And I think there's
14:16
been a lot of tracking
14:19
implemented that's being
14:21
called for
14:22
health that's
14:24
actually being used for surveillance.
14:27
We are going to talk about that
14:29
and how to ethically design and
14:31
deploy some of these things when
14:33
we come back. This
14:36
episode is sponsored in part by Tab for
14:38
a cause. Tab for a cause is a
14:40
browser extension that lets you raise money
14:42
for charity while doing your thing online.
14:44
It is incredibly simple.
14:46
Whenever you open a new tab, you will
14:48
see a beautiful photo and a
14:50
small ad. Part of that ad
14:52
money goes towards a charity of
14:54
your choice. That's
14:56
it. That's how it works. You can join Team
14:58
Flash Forward by signing up at tab
15:01
for a cause dot org
15:03
slash flash forward.
15:06
Okay. So technology can be
15:08
really useful in some situations. Nobody
15:11
is arguing that it can't.
15:13
But what's the right way to
15:15
design and use this stuff?
15:17
Let's start with design. A lot
15:19
of technology simply forgets
15:21
that seniors even exist
15:23
as a market or user base.
15:25
What does
15:26
our tech industry look like?
15:29
Right? Like who's designing them? And
15:30
what are their kind of mental models informing
15:33
things? Who are they designing
15:35
things for?
15:36
That's Dr. Amanda Lazar again. And in
15:38
her work, she does a lot of thinking about what
15:41
older adults actually want
15:43
out of their technology. And Laurie
15:45
Orlov, our industry analyst, says that there
15:47
are so many examples of tech
15:49
that is clearly not designed with
15:51
seniors in mind.
15:53
Where to tap on an iPad? I
15:55
would
15:55
say that's my best example. There's
15:57
a lot of screen area on an iPad. The other one
15:59
is the
15:59
Apple TV remote. There's
16:02
no clue on the remote where to
16:04
touch.
16:04
Or what it means. What does that big
16:06
circle button
16:07
thing mean?
16:09
And an iPad is
16:11
another one where there's a lot of blank
16:13
screen. And
16:14
I've seen older people
16:16
pounding at various parts of it, trying to figure
16:19
out which part is gonna wake it
16:20
up.
16:21
When we get older, the conductance of
16:23
our skin actually changes,
16:25
which makes it physically harder
16:27
to use touch screen devices.
16:29
And this kind of thing contributes to the
16:32
idea that older adults are bad at
16:34
technology or aren't interested in
16:36
technology or can't understand technology.
16:39
But is
16:39
that really true? Older
16:41
adults. Like, the people
16:44
we
16:44
are considering elders have experienced the most
16:46
technological change of, like,
16:48
anyone ever. Right?
16:50
Like, advances that they've seen and
16:52
and kept up with in technology
16:54
are, like, a ridiculous
16:56
amount of change.
16:58
Many
16:58
of today's seniors were born before
17:00
credit cards, before commercial television,
17:03
before flu shots. They've
17:05
learned a lot about a lot of new technologies
17:07
over their lives. So the idea that
17:09
they simply can't learn
17:11
probably isn't true. Right?
17:14
And yet, I'm sure a lot of you have probably
17:16
had the experience of trying to walk
17:18
an older family member through tech
17:20
support to varying levels of
17:22
success. Right? So what is going
17:24
on here? There are a couple of things to say
17:26
about this. The first is baseline education.
17:29
So people who are younger have been taught
17:31
either in schools or in their workplace
17:34
how to use a lot of these
17:36
things. Younger people have access not only
17:38
to actual classes in school on
17:40
how to use computers and the internet and all
17:42
that jazz, but also to like tech
17:44
support teams in offices. For a
17:46
lot of you listening, if there
17:48
is a new bit of technology that you need
17:50
to use for work, you have an actual team of
17:52
people whose job it is to teach you
17:54
how to use it at your company. Older
17:57
folks often have none of that.
17:59
Since new
18:00
technologies are entering the market at all
18:03
times and old technologies become
18:04
obsolete, the question
18:07
really is how to
18:07
stay current, how motivated are people
18:10
to stay current. What is the
18:12
training
18:12
cycle, for example, to learn a new smartphone?
18:14
Is it even worth it to get
18:17
one? That last
18:18
question, is it even
18:20
worth it, is also a good one.
18:22
Because honestly, sometimes it's
18:25
not. For some people who have lived through
18:27
a hundred different new bits of technology,
18:30
staying up on the latest cell phone
18:32
is just not that
18:34
interesting. The COVID-nineteen pandemic
18:36
actually provides a pretty good example of
18:38
this when it comes to video chatting.
18:40
Older people started using Zoom
18:43
and, you know, are still using it to connect with family members. And
18:45
that's
18:45
because the old ways of doing things weren't working,
18:48
but maybe the reason they didn't,
18:49
like, know how to use it before
18:52
end quote, was because they didn't have to.
18:53
Right? Everything was working. Why do you have to learn
18:56
this big thing? On top of
18:58
all of that, you have the fact that
19:00
often, when seniors do try to learn
19:02
something new, they are treated like they
19:04
are incompetent babies, which isn't
19:08
fun. So why even
19:10
bother? Then you add
19:12
in dementia and you get another layer of
19:14
assumptions that those who
19:16
have cognitive decline definitely cannot understand
19:18
what is happening with technology.
19:20
I asked Nikki for example if
19:22
she had ever asked her mom about
19:24
the cameras and how she felt about
19:27
them. I
19:27
don't know if she would understand what
19:30
it meant, having a camera
19:32
there. And, you know, having a
19:34
camera around, I I had to
19:36
do it in
19:37
the middle
19:39
stage. So
19:40
that was, like, from stage four to five
19:44
because that's when things started getting,
19:46
it started picking up. You know, she's moving
19:49
things and she might
19:52
be defecating on herself and
19:54
and and I don't know or
19:57
she might be wandering into a place that
19:59
is
19:59
not technically safe for
20:02
her. So I don't know
20:04
if she
20:04
understood what camera
20:07
meant or videotaping meant at
20:09
that time? I didn't start
20:11
off studying dementia. I worked with older adults,
20:13
not living with dementia. And when I would
20:15
present findings, the most common question is,
20:17
yeah, but this is all out the window when it's
20:20
dementia. And so that's sort
20:22
of why I started to turn towards dementia.
20:24
That's Dr. Clara Berridge
20:26
again. When Clara did start looking
20:28
at people with dementia, she found
20:30
that actually, often, that's not
20:33
true. And in my own research, I
20:35
found that people, adult
20:38
children, for example, would say,
20:40
no, I probably wouldn't involve
20:42
my my mother, for example,
20:44
in the decision about putting a camera
20:46
in or a sensor or location tracking because
20:48
I don't think she'd understand it.
20:50
I would then interview the parent,
20:52
you know, the the older adult.
20:55
And, you know, everybody understood it. Everybody
20:57
was capable of comprehending
21:00
the basic function of these technologies. Of
21:02
course, Clara has an advantage here.
21:04
Right? Part of her research is about
21:06
finding the best ways to
21:08
explain these tools to older folks and folks with
21:11
dementia. Not everybody has that
21:13
expertise. It's not necessarily easy.
21:16
But that is something that she's hoping to help
21:19
change. In the last couple of years, Clara has been
21:21
working on something called Let's
21:23
Talk Tech, which is essentially a method of walking
21:25
both people with dementia and
21:27
their care partners through various
21:30
technologies. So we piloted
21:32
it with twenty nine people
21:34
living with mild Alzheimer's disease
21:36
and their care partner. Most
21:38
of them were spouses, we had
21:40
one adult daughter, and they lived
21:43
together. And so we actually
21:45
found that it was successful on all
21:47
of our measures and it was really
21:49
great findings. We
21:50
were able to
21:52
significantly improve the care
21:55
partners' knowledge of what the person
21:57
with dementia wanted. We were
21:59
able to significantly
21:59
improve their comprehension
22:02
of the technologies, both
22:04
the care partner
22:07
and, on a couple of the technologies,
22:09
the person with dementia. We talk
22:11
on this show all the time
22:13
about how important it is to involve
22:15
users in your design practice. And that's
22:17
true of folks with dementia too,
22:19
especially if your app or service or
22:21
device is supposed to be for
22:24
them. In fact, people with dementia have probably already
22:26
worked out some cool uses of
22:28
technology that you didn't even know
22:31
about. Amanda Lazar did one study talking to
22:33
people with dementia about their
22:35
technology use, and the participants described
22:37
their own bespoke, often
22:40
very clever systems. The study
22:42
also included ideas that these folks
22:44
had for technologies they would
22:46
actually like to use. Take getting ready
22:48
for the day. How do you know what
22:50
to wear? You or I might look
22:52
at the weather or think about the
22:55
context of an event. Is it
22:57
work? Social? Some combination? But those
22:59
things can be really hard to do
23:01
when you have dementia. So
23:04
one participant wished for a device that could provide what
23:06
she called social background information,
23:09
including, quote, how I need to be
23:11
presented so that I can feel I
23:13
can participate like everybody
23:15
else. And the study also
23:17
showed that sometimes folks with
23:19
dementia actually tailor the
23:21
technology they use not
23:23
based on their own desires, but
23:25
based on what they think their
23:27
loved ones want and
23:29
need. One
23:29
participant in the study said, she was
23:32
comfortable with like a geofencing
23:34
application. She wanted to use it,
23:36
but she thought it'd be
23:37
too hard for her daughter,
23:39
so she didn't, because,
23:41
like, emotionally for her daughter. Right? Her daughter wasn't ready
23:43
for that. The consequences
23:45
of these assumptions, the assumption
23:47
that people with dementia or just
23:50
older people in general can't possibly understand
23:53
questions about technology are very
23:55
real. Often in
23:57
facilities, stuff is installed without
23:59
talking to the residents at
24:02
all. I interviewed
24:03
residents once at a high end nursing
24:05
home. This was a few years ago, and
24:08
they described to me this
24:10
device above their beds
24:12
and they did not know what
24:13
it was. They had not been consulted about
24:16
it or informed if it was even in use and they
24:18
found it really disturbing. And from their description,
24:20
I think it was probably
24:21
a sensor over their bed.
24:22
Now, often, cameras and
24:25
sensors are installed in facilities
24:27
as a money saving effort,
24:29
not necessarily because they are best
24:32
for care. Residential care
24:34
agencies primarily at that time in the world of
24:36
intellectual and developmental disability services
24:38
in the home, like
24:39
adult family homes,
24:42
for example, were putting cameras in residents' bedrooms
24:44
and removing their staff from the building
24:46
at night, so they can monitor the
24:48
feed from multiple
24:49
cameras all at once. And
24:51
then ideally rush in
24:53
if there's a problem, send somebody.
24:55
And then the drive, of course, to do
24:57
that was cost savings and workforce shortages.
24:59
We're gonna
25:00
talk a lot more about this and the ways
25:02
that surveillance in the workplace impacts
25:05
care next week when we talk about the
25:07
economics of this world. But
25:09
it's not just facilities that make
25:11
executive decisions about technology
25:13
on behalf of people. Other
25:15
times, it's the families that are making
25:18
decisions without their loved ones' consent. One
25:20
woman in
25:20
particular that really stands out for
25:23
me, she had presented the idea
25:25
to her mother of
25:27
using a sensor system
25:28
that her HUD senior housing
25:32
program was offering, and her mom said,
25:34
I don't think I'd like that, and
25:36
her sisters, the adult
25:38
daughter's sisters, agreed with their mother and
25:40
said, absolutely, that's an invasion of
25:42
our mom's privacy, but because
25:44
this adult daughter was the power of attorney,
25:46
she decided to use it regardless.
25:48
So she decided to get a
25:51
three sixty degree web camera and put that in her mom's
25:53
apartment. And then she, you know, showed me
25:55
you know, she pulled out her phone and showed me the interior
25:57
of her mom's apartment.
25:59
And so it just made
26:01
me realize, wow, her mom didn't even want the
26:03
sensors, and she ended up with this this
26:06
camera. We talk
26:07
on this show all the time about
26:09
the importance of consent and privacy and
26:12
being able to make your own
26:14
informed decisions about what
26:16
kinds of information is being gathered and shared
26:18
about you. So why
26:20
are so many people quick to
26:22
throw all of that out the
26:24
window when
26:24
someone is older? And
26:26
the potential drawbacks here aren't just
26:28
some kind of theoretical violation
26:30
of someone's wishes. The
26:33
consequences of someone else making
26:36
decisions for you are
26:38
really steep in dementia because it
26:40
could involve, for example, you
26:43
no longer living in your home
26:45
and being sent to an assisted
26:47
living facility.
26:48
Period.
26:50
So for example,
26:50
there are sensor systems that claim to
26:52
be able to help track someone's
26:55
progress as they age. These
26:56
sensors could detect changes
26:59
in someone's movements or voice
27:01
or habits, and decide that
27:03
they now fall into a
27:06
new category, either a new
27:08
diagnosis or into a new
27:10
stage of dementia, for example.
27:12
And that might change what they are
27:14
allowed to do, or where they're
27:17
allowed to live. We're
27:19
like, oh, AI,
27:19
this is gonna detect progression of dementia. It's
27:21
gonna be so
27:22
great, you know. We can detect the presence
27:25
of dementia. Which is, like, great in
27:26
terms of, you know, maybe looking at
27:29
pharmacological treatments and
27:31
understanding more. But it's
27:33
very very charged when you're a person with
27:36
dementia, kind of like coping with
27:38
life.
27:38
Right? Sometimes we don't talk
27:40
to people in early stages of dementia, but
27:41
they're actually in a living
27:43
situation where if
27:44
they have, you
27:46
know, advanced cognitive impairment, or
27:48
dementia, they're gonna be kicked out and, you
27:50
know, sent somewhere else. Or
27:52
what if
27:53
the person in question has
27:55
secrets that they don't want their kids to
27:58
know? So I remember
27:59
and I think she's about eighty five, an eighty
28:02
five year old woman, who was
28:04
living with acquired
28:06
disabilities and she needed significant assistance
28:08
from her daughter with whom she was actually very
28:10
close. And when
28:12
I was talking to her, I think it was about the
28:14
sensors. I interviewed her about various
28:16
technologies. And she said, well, what if
28:18
this hypothetical person you're telling me
28:20
about, this hypothetical adult who needs
28:22
these sensors. What if she's in love? And what if she doesn't
28:24
want her daughter to know? And then
28:26
at the
28:26
end of my interview with this woman,
28:30
she indicated that she's very much in
28:32
love with another woman, and she feels like,
28:34
she said, I remember because it was such a
28:36
random, specific age, she's like, I feel thirty
28:38
six again. And she
28:40
was beaming and so happy
28:42
talking about how being in love makes her
28:44
feel. So it became clear to
28:46
me that she didn't want her
28:47
daughter to know. And she was close to her daughter,
28:49
but there there are things that older adults still,
28:52
for whatever reason, their own reasons,
28:54
want to keep private. In
28:55
our series, there is a storyline in which our
28:57
narrator accidentally snoops on
28:59
their father having sex. And
29:02
this is a thing that
29:04
happens. And when it does, it
29:06
brings us back to those questions
29:08
of agency that we talked about on
29:10
the very first episode.
29:12
If you
29:12
are able to express
29:14
preferences about sexual behavior,
29:18
then you're an adult, and
29:20
we should really kind of stay out of your way.
29:22
That's obviously not true towards the end,
29:24
and it could easily be exploitation of
29:26
people with dementia. But
29:28
our traditional default
29:31
in nursing homes and for older
29:33
people was just no,
29:34
which is a very
29:35
kind of Calvinist sex
29:38
negative
29:39
idea about
29:41
what it means to
29:42
care for somebody and
29:44
protect them. That's
29:45
Dr. Tia Powell again. You heard
29:47
her last week. People are entitled
29:49
to their secrets, to keep certain
29:51
things to themselves for
29:53
whatever reason they choose. And when
29:56
people know they are being watched,
29:58
they change their behavior.
30:00
I learned about people rushing in
30:03
the bathroom, deciding
30:04
not to take long afternoon
30:06
naps anymore because too
30:08
much inactivity might be detected.
30:11
having to account for behavior that
30:13
deviates from their routine, being
30:16
found out that you own a pet you're not supposed
30:18
to, being found out that you're dealing with
30:21
incontinence, or that you like to take long
30:23
baths. Right? And just having to answer
30:25
to somebody and, you know, whether that
30:27
be a building
30:27
social worker, you know, frontline
30:30
staff or a family member about it. That's something
30:32
that most people don't expect
30:34
to have to do.
30:36
I
30:37
think it's
30:40
really not hard to imagine
30:42
the ways
30:42
living under constant surveillance could
30:44
impact somebody's well-being.
30:46
I mean, you can imagine this. Right? Imagine
30:48
if every single thing you did was
30:50
recorded and transmitted to
30:53
your parents, or your friends
30:55
or a doctor that is supposed to watch
30:57
you. Again, there are reasons why
30:59
cameras could be useful and could make
31:01
people safer. But we also have to
31:03
weigh the trade offs too.
31:05
Right? And there's
31:06
some things, including
31:08
with technology, that we can really look
31:10
at, that would help a person
31:11
stay home. Like, what about these cameras?
31:13
If you live
31:14
alone, can we set up a camera that
31:17
just takes photographs like
31:19
only films like the bottom twelve inches
31:21
of a room. So if you're lying on
31:23
the floor, we see you. If your feet are walking
31:25
around, that's all we see.
31:28
but you actually
31:29
need that. And I agree
31:31
that privacy is important, but privacy
31:33
that puts you faster in
31:35
the nursing home is
31:37
probably not what people are asking for.
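A rough sketch of Tia's "bottom twelve inches" idea, with assumed numbers: every frame is cropped to a floor band before anything else happens, so a standing person is never captured, and a crude check flags when something large occupies the band. The band fraction and thresholds are invented, and a real system would use a trained detector rather than a pixel average.

import numpy as np

FLOOR_BAND_FRACTION = 0.15  # assume the bottom ~15% of the frame covers the floor

def floor_band(frame: np.ndarray) -> np.ndarray:
    """Keep only the bottom strip of the frame; everything above it is
    discarded before any analysis or storage, so standing bodies are never seen."""
    h = frame.shape[0]
    return frame[int(h * (1 - FLOOR_BAND_FRACTION)):, :]

def possible_fall(frame: np.ndarray, occupancy_threshold: float = 0.3) -> bool:
    """Crude heuristic: if a large share of floor-band pixels differ from an
    assumed empty-floor baseline, someone may be lying down."""
    band = floor_band(frame).astype(float)
    baseline = 20.0  # assumed brightness of the empty floor
    occupied = np.mean(np.abs(band - baseline) > 30.0)
    return bool(occupied > occupancy_threshold)

frame = np.zeros((480, 640), dtype=np.uint8)  # stand-in for a camera frame
print(possible_fall(frame))  # False: nothing is on the floor

The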
31:39
point here is
31:40
not that you should never ever use
31:43
cameras or sensors. Plenty
31:45
of people have great experiences using these
31:47
devices to make their lives and their
31:49
loved ones' lives better. The
31:51
point is that their use needs to
31:53
be something that everybody understands and agrees
31:56
upon, which again is totally
31:58
possible. Tia suggests a
31:59
kind of technology genius bar
32:02
for older adults and their
32:04
families. And asking
32:05
every family again
32:07
and again, what's
32:08
what's hard for you? What do you need
32:10
help with? Because we have, like, a genius
32:13
bar, we have a sort of genius bar over here where you can say, well,
32:15
he's figured out how to undo the locks and goes out
32:17
of the house in the middle of the night. So, okay,
32:19
how can we figure that out? Give me give
32:21
me some solutions, what's affordable, what
32:24
doesn't make him feel like he's in prison, what's
32:26
not gonna be a
32:26
fire hazard, all that kind of stuff. But
32:28
you need, like, a genius
32:30
bar for everybody where they can go and do
32:32
some problem solving and figure out, we'd like
32:35
you'd like to stay home. We
32:37
can't go and live with them. What do you have
32:39
for us?
32:40
It is
32:41
tricky, and I have heard loads
32:43
and loads of positive stories about
32:46
that
32:46
type of
32:47
use of cameras and and
32:50
a friend of mine in Canada who
32:52
wasn't able to be an
32:54
in person support or care
32:56
partner for his one of his parents
32:58
or family members back in Asia,
33:01
Malaysia somewhere. He set
33:04
up a
33:04
camera and that was
33:07
incredibly successful.
33:08
That's Kate Swaffer
33:09
again. And Nikki says that for her
33:11
part, she really tried to respect her mom's
33:14
privacy and agency here. It's
33:17
about us. What
33:18
are we gonna do to make this work?
33:20
How are we going to
33:22
have this lifestyle? You know,
33:24
I wanna make sure that we're both feeling seen and
33:26
we're both feeling safe. I think
33:29
that's
33:29
what cameras are about.
33:31
Having the conversation
33:32
earlier rather than later
33:34
is a key part of this, talking
33:36
to someone about what they want,
33:38
what they don't want, and what would
33:40
work best maybe as a compromise. In
33:44
the series, we tried to explore some
33:46
of these concerns and
33:48
questions, from kids making decisions
33:50
for their parents to
33:52
the complicated conversations you might
33:54
have with a parent or a partner
33:56
about what they do or don't
33:58
want to know about
33:59
technology. This stuff is
34:02
complicated and it's hard to
34:04
navigate. I don't have all the answers here,
34:06
but I do think that we can begin to get
34:08
to a better possible future
34:11
by reconsidering our assumptions about the capacity
34:13
of older adults, including those with
34:16
dementia. I'm really interested in
34:18
thinking
34:18
about technology to change attitudes
34:21
to dementia. I do think there's space there.
34:23
Right? Because then, instead of focusing on the person
34:25
who's affected by all this, like, stigma
34:27
and layers of discrimination,
34:29
you know, we're thinking about the people
34:31
who, if they think about things
34:33
a little differently, we might open up
34:35
some more
34:37
space. Instead
34:38
of jumping to invent a million devices to
34:41
solve one problem or
34:44
another, we could talk to people and ask them what
34:46
they want and need and then
34:49
think about why