Episode Transcript
0:00
This message comes from NPR sponsor
0:02
Spectrum Business, made to work just
0:04
like your small business. Fast,
0:06
reliable internet, phone and mobile services
0:09
made to keep up with
0:11
demand and made to deliver on
0:13
a small business budget. Discover
0:15
more at spectrum.com/work. This
0:18
message comes from NPR sponsor the
0:20
Healthy Spaces podcast by Trane Technologies.
0:22
Did you know that 99% of
0:25
people worldwide are breathing unhealthy air?
0:28
On Healthy Spaces, experts and disruptors
0:30
explore the power of technology to
0:32
transform the health of people's homes,
0:35
workplaces and shared common spaces. Discover
0:37
where climate tech fits into the
0:39
balance between human and planetary health.
0:41
Listen and subscribe to Healthy Spaces
0:44
on your favorite podcast platform. This
0:49
is the TED Radio Hour. Each
0:52
week, groundbreaking TED Talks. Our
0:54
job now is to dream big. Delivered at TED
0:56
conferences. To bring about the future
0:58
we want to see. Around the
1:00
world. To understand who we are.
1:02
From those talks, we bring you
1:04
speakers and ideas that will surprise
1:06
you. You just don't know what you're going
1:08
to find. Challenge you. We truly have to ask
1:10
ourselves, like, why is it noteworthy? And even change
1:13
you. I literally feel like I'm a different
1:15
person. Yes. Do you feel
1:17
that way? Ideas worth spreading.
1:20
From TED and
1:22
NPR, I'm Manoush
1:25
Zomorodi. In
1:27
2017, Nita Farahany and her
1:29
family went through
1:31
a devastating experience. And
1:34
it's something she's OK talking about
1:36
now. Yeah. So we
1:38
suffered the loss of our second
1:41
daughter, Kalista. She
1:43
fell ill with
1:46
RSV. And
1:49
ultimately, after
1:51
a prolonged hospital stay, she
1:54
died from complications of it on Mother's
1:56
Day in 2017. The
2:00
day-to-day became almost unbearable
2:02
for Nita. From
2:05
the first moment that I took
2:07
her into the emergency room and
2:10
the weeks and months in the
2:12
hospital, there were
2:15
just so many vivid images that
2:17
really left me with
2:19
so much trauma. And
2:22
I ended up with PTSD because
2:25
some of the, you know, kind of
2:28
harrowing cries and moments just seared themselves
2:30
on my brain. And so I got
2:33
to the point where I couldn't sleep and I was
2:35
pretty dysfunctional. Nita tried
2:37
traditional therapy, but it
2:39
didn't do much. And then eventually
2:42
a psychologist
2:44
helped me through it and was
2:46
able to help me using exposure
2:48
therapy, which was also
2:50
really dramatic. Basically, I would
2:52
clench up every time I, you know, tried
2:55
to go through the exposure therapy and not allow
2:57
myself to, you know, kind of
2:59
re-experience the memories. But
3:02
that's a really painful process to go through.
3:04
At the time, Nita thought, if only
3:06
there was a way to reprogram her brain
3:09
so she could stop having these
3:11
intrusive thoughts and feelings. And
3:14
now there is. Today,
3:17
there are more
3:19
innovative neurotechnology procedures
3:21
that are available that had they been available to
3:23
me, I would have opted for in a heartbeat.
3:26
Neurotechnology devices that let
3:28
us peer into the
3:30
brain, monitoring, even guiding its
3:33
activity. In Nita's
3:35
case, she believes a treatment called
3:37
decoded neurofeedback might have helped her.
3:40
Basically, you know, you can go into
3:43
an fMRI machine, a functional magnetic resonance
3:45
imaging, where, you know, when
3:47
you recall the memory, it's
3:49
mapped in your brain. Once
3:52
doctors know which path your brain is
3:54
using to recall that memory, they
3:56
can start using it to do something else. Take
3:59
it over. That mapping
4:01
allows for the development of
4:03
something like a game, like
4:06
playing basketball, to implicitly reactivate
4:09
those same pathways, where by
4:11
playing that game, which is a much
4:13
more pleasant experience than remembering
4:15
the trauma, you can
4:17
essentially overwrite those memories with
4:20
more positive associations instead. So
4:22
when that path in her brain was activated, instead
4:25
of feeling total panic, Nita
4:27
would have felt more relaxed. And
4:30
so it's a different and,
4:32
you know, in many ways, less
4:34
traumatic way of working through PTSD,
4:36
or working through
4:39
any kind of traumatic memory or
4:41
experience. So that's just
4:43
one of the different techniques that people
4:45
have been working on to, you know,
4:48
have innovative uses of neurotechnology to enable
4:50
people to work through trauma or to
4:52
work through different neurological
4:55
diseases and disorders. So
4:57
the memory is still there, but the
4:59
visceralness of that thought is
5:02
lessened. That's right. Yeah. So
5:04
it's like any other memory, it's faded
5:07
more now. It's not literally reliving it
5:09
and re-experiencing it. And PTSD, for me
5:11
at least, and I think for a
5:13
lot of people who experience it, you're
5:15
back there. You're feeling it all again.
5:18
All of the overwhelming, you know, fear
5:20
and trauma that you're experiencing in the
5:22
moment, or re-experiencing as the memory kind
5:25
of takes over again, versus
5:27
how we usually remember, which is,
5:29
you know, kind of at
5:31
a distance with reflection. It may hurt
5:33
still, but it doesn't grip
5:36
us and take us over and put
5:38
us back there in that same kind of way. So
5:41
that does sound amazing. And
5:43
if there was technology that could help
5:46
with that more easily than the years
5:48
of therapy that you went through, that
5:50
would be great. But also, it
5:53
makes me think there are some
5:55
things that people would like to forget. And
5:57
are we going to get to the point where, you
5:59
know, like... that movie, Eternal Sunshine of
6:01
the Spotless Mind, there will be technology
6:04
that could maybe help
6:07
us do that. I mean, it could be.
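The decoded-neurofeedback approach Farahany describes above is, at its core, a closed loop: map the brain activation pattern tied to a memory, then reward the participant whenever that pattern implicitly reappears during something pleasant, like a game. Here is a minimal Python sketch of that loop; the voxel count, the similarity measure, and the simulated trial data are illustrative assumptions, not the actual clinical procedure.

```python
# Minimal sketch of a decoded-neurofeedback loop (illustrative only).
# The "target pattern" stands in for the activation map that would come
# from an fMRI mapping session; the trial data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 200   # hypothetical number of voxels in the mapped region
N_TRIALS = 5

# Pattern identified during the mapping phase (random placeholder here).
target_pattern = rng.standard_normal(N_VOXELS)

def similarity(trial: np.ndarray) -> float:
    """Correlation between the current trial and the mapped target pattern."""
    return float(np.corrcoef(trial, target_pattern)[0, 1])

for t in range(N_TRIALS):
    # Simulated trial: the target pattern is partially present, plus noise.
    engagement = rng.uniform(0.0, 1.0)
    trial_data = engagement * target_pattern + rng.standard_normal(N_VOXELS)

    # Reward in the "game" grows with implicit reactivation of the pattern;
    # the participant is never asked to recall the memory explicitly.
    score = similarity(trial_data)
    reward = max(0.0, score)
    print(f"trial {t}: pattern similarity {score:+.2f} -> reward {reward:.2f}")
```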
6:10
So, Nita Farahany knows a lot
6:12
about neurotechnology and its pros and
6:14
cons because she's a law
6:17
professor at Duke University studying the
6:19
ethics of how we should and
6:21
shouldn't use this technology in the
6:23
future, which she says is
6:25
coming faster than we might think. Neurotechnology
6:28
in headbands, smartwatches,
6:30
earbuds will be able
6:32
to track our health, including,
6:35
eventually, our thoughts and
6:37
emotions. Yeah. So, I mean,
6:39
there are brain sensors that pick up electrical
6:41
activity in the brain. Major
6:43
tech companies are really racing to
6:46
embed these brain sensors to
6:48
be like the heart rate sensors and other
6:50
sensors we have in everyday objects, but tracking
6:52
something very different, which is being
6:54
able to tell if a person is happy
6:56
or sad or if they're paying attention or
6:58
their mind is wandering or if they're bored
7:01
or engaged, tired or falling asleep
7:03
at the wheel, for example. There
7:05
are companies like SmartCap that have been
7:08
selling these EEG headcaps
7:10
that allow a driver
7:13
or a pilot or somebody who's in
7:15
mining, for example, to track their fatigue
7:17
levels and give more accurate data about
7:19
whether they're starting to get to dangerous
7:22
levels of sleep. I
7:24
mean, this sounds cool,
7:26
easier, but where do
7:28
we start to cross
7:30
the line between technology
7:32
understanding our intention based
7:34
on tiny movements or
7:36
brain activity to understanding
7:38
our thoughts and ideas and
7:41
our feelings? Yeah, it's a great
7:43
question. So, you know, there
7:45
was this amazing study that came out in, I
7:47
think, April of
7:49
last year where using more sophisticated
7:51
neurotechnology, which is like these giant,
7:54
you know, MRI machines that a
7:56
person goes into, they, you know,
7:58
had people listen to podcasts
8:01
and then trained a generative
8:03
AI classifier using GPT-1 to
8:07
say like this is what the person's listening to, this is what
8:09
their brain activity looks like, this is what they're listening to, this
8:12
is what their brain activity looks like. And
8:14
then had them listen to something and
8:16
have the classifier try to decode what
8:18
that was. And it was with a
8:20
really high degree of accuracy able to
8:23
decode a lot of what the person
8:25
was hearing or what they were imagining
8:28
just based on brain activity. And
8:31
that again is more sophisticated technology than
8:33
EEG and you know
8:35
it's peering more deeply into the brain but other
8:38
researchers are applying that same concept
8:40
to try to decode EEG activity. And
8:43
so the question of where does it
8:45
cross the line, as you're starting to
8:47
wear everyday devices, you know
8:49
earbuds that are tracking your fatigue
8:51
levels or picking up your intention
8:54
to type or to swipe, but
8:56
it's recording that brain activity at
8:58
all times and these
9:01
models are getting more and more sophisticated
9:03
at being able to decode what that
9:05
means. It's not that
9:08
hard to see that we're quickly
9:10
moving into a world where what
9:12
you're thinking and feeling is just
9:14
as transparent and can be just
9:17
as easily decoded using AI and
9:19
neurotechnology. Over
9:21
the past decade there have been
9:23
incredible advances in understanding how
9:25
the brain works. Now
9:28
we are on the cusp of a new
9:30
era of brain monitoring and
9:32
enhancement and the
9:34
exciting potential and pitfalls
9:36
of merging our minds with machines
9:39
and manipulating the brain, well
9:42
they're mind-boggling. And
9:44
so today on the show, brain hacks, neurotechnology
9:47
that could treat devastating
9:49
cognitive diseases, mental illness,
9:51
and brain injuries. But
9:54
as legal scholar Nita Farahany
9:56
warns, also put our
9:58
most private thoughts and emotions
10:00
in jeopardy. And that's
10:02
the part that I fear that
10:04
when you know we get
10:07
to this world of brain transparency if we
10:09
don't have the right kind of safeguards in
10:11
place that that which I
10:13
think is so fundamental to what it means
10:15
to be human what it means to flourish
10:17
as a human may suddenly not
10:19
be our own. Here's Nita
10:22
Farahany on the TED stage. This
10:25
new category of technology
10:27
presents unprecedented possibility both
10:30
good and bad. Consider
10:33
how our physical health and well-being
10:35
are increasing while neurological disease and
10:37
suffering continue to rise. 55 million
10:41
people around the world are struggling with
10:44
dementia with more than 60 to 70
10:46
percent of them suffering from Alzheimer's disease.
10:50
Nearly a billion people struggle with
10:52
mental health and drug use disorders.
10:55
Depression affects more than 300 million.
10:59
Consumer neurotech devices can finally enable
11:01
us to treat our brain health
11:03
and wellness as seriously as
11:05
we treat the rest of our physical well-being.
11:09
Regular use of brain sensors could
11:11
even enable us to detect the
11:13
earliest stages of the most aggressive
11:15
forms of brain tumors like glioblastoma
11:17
where early detection is crucial to
11:19
saving lives. The same
11:22
could hold true from Parkinson's
11:24
disease to Alzheimer's, traumatic brain
11:26
injury, ADHD, and even depression.
11:29
But all of this will only be possible
11:31
if people can confidently
11:33
share their brain data without fear that
11:36
it will be misused against them. You
11:39
see the brain data that will be
11:41
collected and generated by these devices won't
11:43
be collected in traditional laboratory
11:46
environments or in clinical research
11:48
studies run by physicians and
11:51
scientists. Instead, it'll
11:54
be the sellers of these new devices,
11:56
the very companies who've been commodifying
11:58
our personal data for years, which
12:02
is why we can't go into this
12:04
new era naive about the risks or
12:06
complacent about the challenges that the collection
12:08
and sharing our brain data will pose.
12:12
Brain sensors provide direct access to the
12:14
part of ourselves that we hold back,
12:16
that we don't express through our words
12:18
and our actions. Brain data,
12:21
in many instances, will be more
12:23
sensitive than the personal data of
12:25
the past, because it reflects our
12:27
feelings, our mental states, our emotions,
12:30
our preferences, our desires, even our
12:32
very thoughts. I
12:34
would never have wanted the data
12:37
that was collected as I worked through
12:39
the trauma of my personal loss to
12:42
have been commodified, shared, and
12:44
analyzed by others. These
12:46
aren't just hypothetical risks. Take
12:49
Entertech, a Hangzhou-based company that has
12:51
collected millions of instances of brain
12:53
activity data as people have engaged
12:56
in mind-controlled car
12:58
racing, sleeping, working, even
13:00
using neurofeedback with their
13:02
devices. They've already
13:04
entered into partnerships with other companies
13:06
to share and analyze that data.
13:11
I mean, what's your biggest fear here, Nita? What
13:13
is the worst case scenario if we
13:15
don't start to put some legal
13:17
safeguards around this tech? I
13:20
think, honestly, that human flourishing
13:22
is at risk here. Like, fundamentally, what it
13:24
means to be human is at risk. In
13:27
a minute, Nita Farahany explains
13:29
how our ideas of civil liberty
13:32
need to change in this new
13:34
era of neurotechnology
13:37
to protect our most private ideas
13:39
and thoughts. On the
13:41
show today, Brain Hacks. I'm
13:44
Manoush Zomorodi, and you're listening to the TED
13:46
Radio Hour from NPR. We'll
13:48
be right back. This
14:00
message comes from NPR sponsor
14:02
Spectrum Business made to work just
14:04
like your small business. Fast,
14:06
reliable internet, phone and mobile services
14:09
made to keep up with
14:11
demand and made to deliver on
14:13
a small business budget. Discover
14:15
more at spectrum.com/work. This
14:18
message comes from NPR sponsor
14:20
Spectrum Business made to work just
14:22
like your small business. Fast,
14:24
reliable internet, phone and mobile services
14:27
made to keep up with
14:29
demand and made to deliver on
14:31
a small business budget. Discover
14:33
more at spectrum.com/work. This
14:35
message comes from NPR sponsor Intercom.
14:38
Intercom is a complete AI and
14:40
human customer service platform and the
14:42
only platform to combine a fully
14:44
featured inbox, tickets, help center, AI
14:47
chatbot, phone and more. Making it
14:49
not only complete, but also one
14:51
of the most powerful support platforms
14:53
out there. Intercom, every single tool
14:55
you need all in one place
14:58
enhanced with AI so you can
15:00
give every single customer the fastest,
15:02
most personalized experience imaginable. Learn more
15:04
at intercom.com. This
15:06
message comes from Apple Card. Reboot
15:08
your credit card with Apple Card.
15:10
It gives you unlimited daily cashback that
15:13
can earn 4.35% annual
15:16
percentage yield when you open a savings account.
15:19
A high yield, low effort way to grow
15:21
your money with no fees. Apply
15:23
for Apple Card now in the Wallet
15:25
app on iPhone. Apple Card subject to
15:28
credit approval. Savings is
15:30
available to Apple Card owners
15:32
subject to eligibility. Savings accounts
15:34
by Goldman Sachs Bank USA.
15:36
Member FDIC terms apply.
15:39
Hi, it's Terry Gross, the host
15:41
of Fresh Air. We bring you
15:44
in-depth long form interviews with actors,
15:46
directors, musicians, authors, journalists and more.
15:49
Listen to our Peabody Award winning
15:51
Fresh Air podcast from WHYY and
15:53
NPR. Hello
15:56
lovely listener. I just want to let
15:59
you know that our friends... over at NPR's
16:01
Through Line are also doing an
16:03
episode about the brain, but from
16:05
a completely different angle. Their
16:08
show is about the history
16:10
of understanding how smell changes
16:12
the way we think. It's
16:14
a good one. The same part of the brain that's
16:17
giving us the experience of emotion is also
16:19
giving us the experience of sense. And
16:21
so instantly that we are consciously registering
16:24
a sense, we are also to
16:26
some degree experiencing an emotion. Through
16:29
Line's episode on smell coming
16:31
February 8th, wherever you get your
16:33
podcasts. On
16:36
the show today, Brain Hacks
16:38
and the Future of Neurotechnology.
16:41
We were just talking to Duke Law
16:43
Professor Nita Farahany. She's the author of
16:45
the book, The Battle for Your Brain.
16:48
She says we need to rethink what
16:51
our right to privacy even means in
16:53
this new era. So
16:55
for me, you know, what I've
16:57
been talking about is our right to
17:00
cognitive liberty, our right to self-determination over
17:02
our brain and our mental experiences. And
17:04
what I mean by that is, like
17:07
fundamentally, your right
17:09
to develop your own identity, your
17:11
own thoughts, like to have a
17:13
space where you're able to
17:15
reflect, to think, to, you know, think about
17:18
being a child and trying to sort through
17:20
who you are. I have a third
17:23
grader, so an almost nine-year-old, and
17:26
watching her, you know, more and more come into
17:28
her own and trying to figure out who she
17:30
is, all of that happens
17:32
in this safe space, the safe space
17:34
of mental reprieve. You know,
17:36
in one of the most jarring things,
17:38
I think, in researching my
17:41
book I came across was, you
17:43
know, the classrooms in
17:45
China where children were being required
17:47
to wear headsets to track
17:49
their brain activity, to track
17:52
their attention levels throughout the workday. And
17:54
the thought of a child, kind of
17:56
no matter what that headset
17:59
can or can't pick up, but
18:01
the chilling effect that that has on
18:04
the ability to think freely, to be
18:06
able to develop your own internal sense
18:08
of self, to dare
18:11
to think differently at an age where
18:14
it's so hard already to go
18:16
against the norm, that's
18:18
the space that I worry about us eroding,
18:21
getting to a place where people are afraid
18:24
to even think. And
18:27
if we are afraid to even think
18:29
freely, the capacity to be able
18:31
to figure out who you are
18:33
and to dare to dream big
18:35
or to dare to help
18:38
us change the path in the course
18:40
of humanity or your own path in
18:42
life is compromised. When
18:45
I hear you say that, it makes
18:47
me particularly worried because I think we're
18:49
talking about technology that will be very
18:51
subtle, right? You know when I tap
18:53
something or I swipe something, but if
18:56
I think something
18:58
or if someone is collecting a
19:00
brainwave, that
19:02
seems very discreet. It's
19:04
very discreet and hidden and invisible. So
19:06
one of the risks I worry about
19:09
is a lot of times
19:11
with an emerging technology, especially a
19:13
whole new class of technology, it
19:16
becomes normalized and hidden in ways
19:18
that we can't see. So
19:21
just to give you an example of that, some
19:24
of the places in which these EEG
19:26
headsets have been introduced already are places
19:30
like you go into an IKEA
19:32
store to look at a series of
19:34
rugs and you're given a headset and
19:37
told like, only if your brain can prove that
19:39
you love the rug, are you going to be
19:41
able to take it home? This actually happened. What?
19:44
Yeah, they had this marketing gimmick in
19:46
Brussels where they had these limited edition
19:48
rugs. They were
19:51
worried that the kind of idea
19:53
of these limited edition rugs was to bring art
19:56
to people at a reasonable price and people
19:58
were buying them and re-selling them
20:00
on places like eBay. And so they
20:02
ran this marketing campaign where you had to
20:04
wear an EEG headset while
20:07
looking at the rugs to prove you loved the
20:09
rug. And if you did, you could take one
20:11
home. And if you didn't, you couldn't. Like, we
20:13
can laugh at that in so many different ways.
20:15
Like it was clearly a marketing gimmick, but
20:18
it normalizes it, right? You
20:20
encounter technology in situations, novel
20:23
technology in situations that are
20:26
non-threatening, that are invisible, you
20:29
don't realize what's being collected. And
20:31
suddenly we've breached this
20:33
category of giving away, you
20:35
know, our most intimate selves without even
20:37
realizing that we're doing so. And, you
20:40
know, I don't think people realize what that
20:42
world might look like. It's not just
20:44
that you've given up your mental privacy,
20:46
you've given up the keys to who
20:48
you are to be able to mentally
20:50
shape and change you. That's happening already
20:52
with algorithms, but the ways in which
20:54
this can happen so much more precisely
20:56
and so much more in a customized
20:58
way, I worry about
21:00
a world of more
21:02
almost brainwashing of people
21:05
in ways that really limit our ability to
21:07
think freely. And
21:10
unless people have individual control over
21:12
their brain data, it will be
21:14
used for micro-targeting or worse, like
21:17
the employees worldwide who've already
21:19
been subject to brain surveillance
21:21
in the workplace to track
21:23
their attention and fatigue, to
21:26
governments developing brain biometrics to
21:28
authenticate people at borders to
21:30
interrogate criminal suspects' brains, and
21:34
even weapons that are being crafted
21:36
to disable and disorient the human
21:38
brain. Brain wearables
21:40
have not only read but write
21:43
capabilities, creating risks that our brains
21:45
can be hacked, manipulated, and even
21:47
subject to targeted attacks. We
21:50
must act quickly to safeguard against
21:52
the very real and terrifying risk
21:55
to our innermost selves. Okay,
22:00
I don't want that future. Tell
22:03
me what to do right now. What would you
22:05
say? Well, I'd say the first
22:07
place that I've been advocating that we
22:10
address is like a system of laws.
22:12
And that's not just because I'm a law professor. I
22:14
think it's because we're starting
22:16
in a world in which the
22:18
balance is not in favor of individuals, right?
22:20
We've gotten to this place where like the
22:23
collection of data is the norm. And so
22:25
the balance is in favor of the tech
22:27
companies. And we have to reclaim some of
22:29
that power for individuals. And in the past
22:31
year, a lot has happened.
22:33
So UNESCO has launched a new project
22:35
around trying to
22:39
develop ethical guidelines around neurotechnology.
22:41
The UN has a committee
22:44
that's putting together a report on
22:46
neurotechnologies and its impact on human
22:48
rights. And a lot of
22:51
the AI legislation that's been happening and
22:53
the conversations that have been happening, some
22:55
of those have started to recognize
22:58
the convergence of these fields and
23:00
include some specifics
23:02
around the processing of biometric and
23:04
personal data. But there needs to
23:06
be a lot more convergence between the
23:08
AI conversations and the neurotechnology conversations. And
23:11
so I've been advocating for a global
23:13
right to cognitive liberty, a right to
23:15
self-determination over our brain and mental experiences.
23:18
And that really just, that's a
23:21
framework to update three existing human
23:23
rights, which is our right
23:25
to privacy, to explicitly include a right
23:27
to mental privacy, the right
23:30
to freedom of thought, to more explicitly
23:32
cover a right against
23:34
interference and manipulation and
23:36
punishment for our thoughts, and the
23:38
right to self-determination, which has been recognized as
23:41
a collective and political right to also be
23:43
an individual right. It
23:45
feels like we are hurtling towards
23:47
an era, and this
23:50
sounds so dramatic, but
23:52
where man and machine are merging
23:54
in many ways. I
23:56
agree with you. How do you think about it?
23:59
So, you know, I think of it
24:01
as kind of
24:03
co-evolution. The way I
24:05
see that is human thinking
24:08
is relational. And
24:10
so, you know, our
24:12
technology and our dependence and
24:14
interdependence on technology is increasing,
24:18
which means our relational thinking
24:20
with respect to technology is being
24:22
shaped and changed with that technology.
24:25
I want us to be more in the driver's seat of that.
24:29
Actually knowing what's happening with that
24:31
co-evolution and to be able to
24:33
drive that co-evolution rather than the
24:35
technology in the hands of
24:37
a few powerful people deciding
24:40
how our brains are relationally gonna change
24:42
with respect to that technology. And
24:45
I think this is a whole category where
24:48
it has such a huge implication for how
24:50
it could reshape what it means to be
24:52
human that it's so important that we get
24:54
it right. That's Nita
24:57
Farahany. She's a law professor at
24:59
Duke University and the author of
25:01
The Battle for Your Brain, Defending
25:03
the Right to Think Freely in
25:05
the Age of Neurotechnology. You
25:07
can see her full talk at ted.com. So
25:15
to get to a future where
25:17
all this neurotechnology is even possible, a
25:20
lot of technical work is being done in
25:23
labs all over the world. Like
25:25
one that I visited in Brooklyn at
25:28
a company called OpenBCI. So
25:30
wait, what am I gonna put on my head? Yeah, so
25:32
the new Galea headset is a crazy beast. I was
25:35
there to try out a brain
25:37
sensing virtual reality headset called
25:39
the Galea. I'm just gonna
25:41
adjust this a little back here. The
25:43
helmet-like contraption that is at the
25:45
bleeding edge of neurotechnology and costs
25:47
about $25,000. And one
25:49
in the back also has three active
25:51
EEG electrodes. So what are they collecting
25:53
then? They're collecting brain activity.
25:56
So it's mostly academics, medical
25:58
researchers, and other tech. developers
26:00
who are using the hardware as a
26:02
starting point for their own projects.
26:05
Founder Conor Russomanno walked me through
26:07
what was happening as I looked
26:09
at a screen and
26:12
all the sensors attached to
26:14
my scalp and earlobes began
26:16
processing lots of data and
26:19
building a very rough profile of
26:21
how my mind works. It kind
26:23
of records your EEG, your eye tracking, your
26:26
heart rate, your heart rate variability, all these
26:28
things against the stimuli. The
26:30
result is a display of all my brain
26:33
waves. Delta, theta, alpha, beta, and
26:35
gamma. It's like a rainbow of
26:37
brain activity that I'm looking at. I
26:40
feel like naked because I feel like you
26:42
can look at these brain waves and be
26:44
like, wow, she's a nervous wreck inside, but
26:47
she presents as like a normal human being.
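For a rough sense of how a display like this arrives at that "rainbow" of delta, theta, alpha, beta, and gamma activity, here is a small Python sketch that estimates power in those conventional bands from one simulated EEG channel. The sampling rate, band edges, and synthetic signal are assumptions for illustration, not OpenBCI's actual processing pipeline.

```python
# Sketch: estimate power in the classic EEG bands from one channel of signal.
# The signal here is synthetic (a 10 Hz "alpha" rhythm plus noise); real EEG
# would come from the headset's electrodes at its native sampling rate.
import numpy as np

FS = 250                       # assumed sampling rate in Hz
BANDS = {                      # conventional band edges in Hz
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def band_powers(signal: np.ndarray, fs: int = FS) -> dict:
    """Mean spectral power per band from a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

rng = np.random.default_rng(1)
t = np.arange(0, 4, 1.0 / FS)                     # 4 seconds of data
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

for band, power in band_powers(eeg).items():
    print(f"{band:>5}: {power:.3f}")
```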
26:49
Conor could not read my mind,
26:52
but he and the computer could
26:54
quickly make sense of the electrical
26:56
pulses my brain sent out to
26:59
move tiny muscles. Let's try
27:01
flexing or like kind of
27:03
winking your left eye. And
27:05
then an incredible thing happened with
27:07
barely a wink of an eye. I
27:10
was able to save a virtual
27:12
cat from evil rats. All
27:15
right, here I go. There's left, try out
27:17
right. There
27:19
it is. Wow. Nice. Try right again. Then you
27:21
want to collect those bones, but you've got to
27:23
avoid those pesky rats. Yeah. All
27:26
this incredibly impressive, complicated,
27:29
starting a UDP stream now,
27:31
and expensive technology to
27:33
play a video game. But
27:36
it's early days and the goal is
27:38
to get to the point where we
27:40
can just think something and have
27:43
it happen on a screen. You know,
27:45
at OpenBCI, we're really focused on the
27:47
path of least resistance to solving the
27:50
problem, which is decoding the mind,
27:52
intention, emotions, what
27:54
makes us who we are, and then how do we augment that?
27:56
How do we improve that? To do
27:58
that, Conor and his team are piecing together how
28:00
the mind works with the
28:03
data that's easiest to collect, like
28:05
the brain waves that show up when I twitch
28:07
my eye. The computer begins
28:09
to understand that an eye twitch
28:11
shows a specific wave pattern, and
28:14
that means move the cat. Eventually,
28:17
with a lot of repetition, I could
28:19
just think about moving the cat,
28:22
and it would know to move it. So
28:25
this is maybe the beginning
28:27
of mind reading technology. If
28:30
we figure it out the right way, it becomes the
28:33
greatest tool that humanity has ever
28:35
created. When you say the greatest tool, tell me,
28:37
give me an example. To somebody listening, like, what do
28:39
you mean the greatest tool? How is that going to
28:42
help me? Okay. So
28:44
if you knew your
28:47
computer was keeping track of
28:49
your emotions, your deepest darkest
28:52
secrets, but also just like, are
28:55
you focused? Do you need a nap? Do you
28:57
need to go for a walk? Are there these
28:59
kind of like basic things that you don't even
29:01
know about yourself, but I do because I
29:04
now understand your subconscious better than you do. We're
29:08
already there in a lot of ways, but
29:10
it's, you know, we have the power,
29:12
the potential with modern neuro technology and
29:15
computers to make it even more powerful,
29:18
the understanding of our subconscious. And
29:20
so when I think about in
29:22
a real world context, like I do want the
29:24
ability to walk down the street and not have
29:26
to bend my neck down to look at my
29:28
emails and not have to be like, man, I
29:30
got to like open up Google maps and copy
29:32
and paste that address
29:35
from my text thing and then
29:37
like make sure that it's Northeast,
29:39
not Northwest, you know, like it
29:41
could be turning
29:43
down the brightness of the
29:45
display in your sunglasses because
29:48
it's noticing that you
29:51
are getting too much light or you're being stimulated
29:53
too much and it knows that. So instead
29:56
of giving you more of what it thinks you
29:58
want, it knows that
30:01
it should be prioritizing what you need and not what
30:03
you want. But that's not
30:05
what Conor thought the purpose of this technology
30:07
was when he first started the company in
30:09
2013. He
30:11
learned a lot just watching how people
30:13
in other fields used his hardware. In
30:16
the beginning, our goal was to build
30:18
an inward-pointing telescope and to share the
30:20
blueprints with the world so that anybody
30:22
with a computer could begin peering into
30:24
their own brain. Conor Russomanno continues
30:26
from the TED stage. At
30:29
first, we were an EEG-only
30:31
company. We sold brain sensors
30:33
to measure brain activity. But,
30:36
over time, we discovered people doing very
30:38
strange things with our technology. Some
30:41
people were connecting the equipment to the stomach to
30:43
measure the neurons in the gut and
30:45
study the gut-brain connection and the microbiome. Others
30:48
were using the tools to build new
30:51
muscle sensors and controllers for prosthetics and
30:53
robotics. What we learned
30:55
from all of this is that the brain, by
30:58
itself, is actually quite boring.
31:03
Turns out, brain data alone lacks
31:05
context. And what we ultimately
31:07
care about is not the brain, but
31:09
the mind. Consciousness, human cognition.
31:12
When we have things like EMG sensors
31:14
to measure muscle activity or ECG sensors
31:16
to measure heart activity, eye
31:18
trackers, and even environmental sensors to measure the
31:21
world around us, all of
31:23
this makes the brain data much more useful. But
31:26
the organs around our body, our sensory receptors,
31:28
are actually much easier to collect
31:30
data from than the brain and also arguably
31:32
much more important for determining the things that
31:34
we actually care about: emotions,
31:37
intentions, and the mind overall. Additionally,
31:41
we realized that people weren't just interested
31:43
in reading from the
31:45
brain and the body. They were
31:47
also interested in modulating the mind
31:49
through various types of sensory stimulation.
31:51
Things like light, sound,
31:53
haptics, and electricity.
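Russomanno's point above, that brain data alone "lacks context" until it is combined with muscle, heart, eye, and environmental signals, is essentially a sensor-fusion argument. Here is a hedged sketch of what that fusion might look like, with entirely invented feature names and thresholds:

```python
# Sketch: fusing brain data with body and environment signals for context.
# All feature names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Sample:
    eeg_alpha: float      # relative alpha-band power (relaxation proxy)
    emg_activity: float   # muscle tension from an EMG sensor
    heart_rate: float     # beats per minute from an ECG/PPG sensor
    gaze_still: bool      # eye tracker: gaze roughly fixed?

def interpret(s: Sample) -> str:
    """Same EEG reading, different meaning depending on body context."""
    if s.eeg_alpha > 0.6 and s.heart_rate < 70 and s.gaze_still:
        return "calm, focused rest"
    if s.eeg_alpha > 0.6 and s.emg_activity > 0.5:
        return "high alpha, but tense muscles: likely artifact or strain"
    return "no confident interpretation from brain data alone"

print(interpret(Sample(eeg_alpha=0.7, emg_activity=0.1, heart_rate=62, gaze_still=True)))
print(interpret(Sample(eeg_alpha=0.7, emg_activity=0.8, heart_rate=95, gaze_still=False)))
```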
31:55
It's one thing to record the
31:58
mind. It's another to modulate. The
32:01
idea of a combined system that can
32:03
both read from and write to the
32:05
brain or body is referred to as
32:07
a closed loop system or bi-directional human
32:09
interface. This concept is
32:12
truly profound and it will define
32:14
the next major revolution in computing
32:16
technology. It
32:19
sounds like you think that we are at a
32:21
moment where on and off are
32:24
not going to be the way we live
32:26
anymore. Yeah, there's all these moments
32:28
where there's friction and there doesn't need
32:31
to be. The computer, if we trusted
32:33
it, would just access that information for
32:35
us and it would understand that it put the wrong
32:37
thing in and that we're frustrated and it would change
32:39
it without us changing it for it. And
32:43
we're already there, right? Most
32:45
people feel uncomfortable when their
32:47
phone is not with them. Most
32:50
people feel like there's a piece of them missing when they're
32:52
like, where's my phone? Where's my phone? Oh my gosh, did I
32:55
lose it? Oh, what does that mean? I've
32:57
lost part of my brain. There's
33:00
memories in there that I don't have without my
33:02
phone. And this
33:05
is part of our mind right
33:07
now and it changes us,
33:09
it manipulates us. And I
33:11
think we're right at this kind of inflection
33:14
point where we're figuring out how
33:16
to put real human emotions directly
33:18
into the computing loop. When
33:22
you have products that
33:24
not just are designed for the average
33:26
user but are designed to actually adapt
33:28
to their user, that's something
33:30
truly special. When we know what
33:32
the data of an emotion or a feeling looks like and
33:34
we know how to make that data go up or down,
33:37
then using AI, we can
33:39
build constructive or destructive interference
33:41
patterns to either amplify or
33:43
suppress those emotions or feelings.
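The read-and-write "closed loop" described here, where a system measures a decoded signal and adjusts stimulation to push it up or down, behaves like a simple feedback controller. A minimal sketch under invented numbers, not OpenBCI's implementation:

```python
# Sketch: a closed-loop (read/write) controller that nudges a measured
# "feeling score" toward a target by adjusting stimulation intensity.
# The plant model, gain, and noise are invented for illustration.
import random

random.seed(0)
target = 0.8          # desired level of the decoded feeling/metric
stimulation = 0.0     # write channel (e.g., light/sound/haptic intensity)
state = 0.2           # current decoded level (read channel)
GAIN = 0.5            # proportional controller gain

for step in range(10):
    error = target - state
    stimulation = max(0.0, min(1.0, stimulation + GAIN * error))  # write
    # Toy model of how the person responds to stimulation, plus noise.
    state += 0.3 * (stimulation - state) + random.uniform(-0.02, 0.02)
    print(f"step {step}: state={state:.2f} stimulation={stimulation:.2f}")
```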
33:46
In the very near future, we will have computers
33:48
that we are resonantly and
33:50
subconsciously connected to, enabling
33:53
empathetic computing for the very first time. Okay,
33:57
so tell me what that looks like to you.
34:00
If everything goes according to plan for
34:02
you, what is the potential that you
34:04
see coming from the technology that you're
34:06
building? I think
34:08
the potential is immense. And so
34:10
in my mind, success
34:13
is building a new type of
34:15
personal computer where the owner, the
34:18
user of the computer, has
34:20
total agency over their own data
34:23
and how the computer is augmenting their mind. We're
34:27
working on what I believe is the
34:29
greatest challenge of this century
34:31
and maybe the greatest challenge that humans
34:33
have ever faced, which is understanding intelligence,
34:36
human intelligence. That's
34:40
Connor Russomano, CEO of OpenBCI.
34:43
You can watch his full talk at ted.com.
34:47
On the show today, brain hacks. I'm
34:50
Manoush Zomorodi, and you're listening to the TED
34:52
Radio Hour from NPR. Stay
34:54
with us. This
35:09
message comes from NPR sponsor, BetterHelp.
35:12
Around New Year's, people can obsess
35:14
with changing and forget what they're
35:16
already doing right. Therapy can help
35:18
you recognize your victories and continue
35:21
them this New Year. Try New
35:23
Year Same You with BetterHelp's online
35:25
therapy. Visit betterhelp.com/NPR today to get
35:28
10% off your
35:30
first month. That's
35:33
betterhelp.com/NPR. Support
35:35
for NPR and the following message
35:37
come from State Farm. As a
35:39
State Farm agent and agency owner,
35:42
Lakeisha Gaines is passionate about empowering
35:44
other small businesses. In the
35:46
last several years, there are more business
35:48
owners than we can count. Businesses
35:51
are opening up quite frequently. And
35:53
I think that shows the need, the
35:55
dreams, and the desires of the community
35:57
to have the independence and to have.
36:00
the financial freedom that's important to them. The
36:02
reason why it's so important to me
36:05
to be out there to share information
36:07
and to educate the community is because
36:09
I know that a dream doesn't always
36:11
help you to be successful. You need
36:13
the competency, you need the wisdom, you need
36:16
the knowledge. That's where we come in
36:18
and thank our agents, our ability to be able to
36:20
teach over 100 years of
36:22
experience in this world to say, hey,
36:24
we got you, you got this and we
36:27
got this. Let's do it together. Talk
36:29
to your local agent about small business
36:31
insurance from State Farm. Like a good
36:34
neighbor, State Farm is there. This
36:37
message comes from NPR sponsor Capella
36:39
University. Sometimes it takes a different
36:41
approach to unlock your true potential.
36:43
Capella University's game-changing FlexPath learning format
36:46
is designed to help you learn
36:48
relevant skills at your own pace,
36:50
so you can earn your degree
36:52
on your terms and apply what
36:54
you learn right away. Imagine your
36:57
future differently at capella.edu. Before
37:01
we get back to the show, I want to
37:03
tell you about what's coming up on our next
37:05
bonus episode for TED Radio Hour Plus. It's
37:08
more with technologist Conor Russomanno. You were
37:10
just hearing my conversation with him. Well,
37:13
on this plus episode, Conor and
37:15
I really geek out about the
37:17
ethics and business of
37:19
neurotechnology and how devices coming
37:21
on the market right now,
37:23
like Apple's new AR VR
37:25
headset, well, they are just
37:28
stepping stones to a new
37:30
reality where Big Tech
37:32
has access to our innermost thoughts and
37:34
feelings. That comes out in
37:36
a few days for TED Radio
37:38
Hour Plus listeners. If you are
37:40
not a subscriber yet, check it
37:43
out. Join your fellow listeners to
37:45
get all kinds of bonus content
37:47
and all of our episodes sponsor
37:49
free. Just go to plus.npr.org/TED or
37:51
sign up right now in the
37:53
Apple Podcasts app. And thank you.
37:57
It's the TED Radio Hour from NPR. I'm
38:00
Manoush Zomorodi. On the
38:02
show today: brain hacks. So
38:05
far we have talked about technology
38:08
augmenting the brain. But
38:10
What if there was a
38:13
way to diagnose and even
38:15
treat severe neurological diseases at
38:17
the molecular level? For
38:20
these really severe conditions
38:22
for which we have very
38:24
poor understanding of the biology and
38:27
no therapeutic insights. So those
38:29
are the conditions that we're focusing
38:31
on. This is Dr.
38:33
Sergiu Pasca. He's a professor
38:36
of Psychiatry and Behavioral Sciences
38:38
at Stanford University. He
38:40
also directs the Stanford
38:42
Brain Organogenesis Program where
38:44
he researches neuropsychiatric
38:46
disorders, including the most
38:48
debilitating forms of Autism.
38:50
I've been very interested in
38:52
autism. Early on when I
38:55
was actually in school and met
38:57
my first patient with autism and
38:59
was just like struck by just
39:03
like how little we could do for patients
39:05
with autism, which is still the case today.
39:07
We can still do very little. We
39:10
still don't understand the biology very well.
39:12
We still diagnose these disorders the way
39:14
we were doing in the nineteenth century.
39:17
Psychiatry, and of course neurology,
39:20
are to a large extent the only branches
39:22
of medicine where the organ of interest,
39:24
the brain, is inaccessible.
39:28
So think of a patient with
39:30
cancer. You remove that tumor, right? Or
39:32
you take a biopsy? You bring it to the lab.
39:35
And you can directly study those
39:37
cells in a dish and identify
39:39
what are the molecules in the
39:41
pathway that are off and come
39:43
up with treatments. In psychiatry, we
39:45
can't just take the human brain
39:48
out. You know what the past
39:50
few decades in all of medicine have
39:52
indicated is that to find new
39:54
treatments that are biologically inspired, you
39:56
need access to the tissue. And
39:58
really, this has become my
40:01
mission. Sergiu Pasca
40:03
continues from the TED stage. Today,
40:07
most of what we know about the human
40:09
brain comes from studies in animals, typically
40:11
mice. And while we've
40:13
learned a lot from this animal brain,
40:16
the characteristics that make the human
40:18
brain unique and uniquely susceptible to
40:20
disease remain mysterious.
40:23
Dysfunction in the human brain causes
40:25
brain disorders such as autism and
40:27
schizophrenia and Alzheimer's disease, devastating
40:30
conditions that are poorly understood.
40:33
Nearly one in five individuals
40:35
suffers from a psychiatric disease. What is
40:37
even more striking is that the lowest
40:39
success rate for finding new drugs is
40:41
in psychiatry, out of all the branches
40:43
of medicine, likely because until
40:46
now we couldn't really access
40:49
the human brain. So
40:52
you have found a way of
40:54
studying brain tissue, but take us
40:56
back to how you went from
40:59
being a clinical psychiatrist to one
41:01
who is researching disorders at the
41:03
genetic, even molecular level. Yes,
41:05
absolutely. So as I was finishing my
41:07
clinical training about 15 years
41:10
or so ago, there was
41:12
a remarkable breakthrough made
41:14
by a Japanese scientist, Shinya
41:16
Yamanaka, who reported
41:18
at that time that under some
41:21
condition, you could take a cell
41:23
that is already fully formed and mature and
41:26
push it back in time through
41:28
a genetic trick to
41:30
look pretty much like the stem cells
41:32
from which the entire organism is
41:34
built. So this was
41:37
called cell reprogramming. He received
41:39
a Nobel Prize for this remarkable discovery a few
41:41
years after. And
41:43
it essentially involves taking, for instance, skin
41:45
cells from any individual, take a few
41:48
skin cells or blood cells, putting
41:50
them in a dish and then inserting
41:52
in those cells a few genes, just
41:55
briefly. And the dream
41:58
was, at that time, that
42:00
you would be able actually to make human
42:03
neurons at the bottom of a dish
42:05
from those patients in a non-invasive way
42:07
that would not involve taking any neuron
42:09
from anybody's brain, but
42:11
rather reconstructing or reverse engineering
42:14
this process in the
42:16
laboratory. I'm here to tell you
42:19
that we can finally grow parts of
42:21
the human brain from any individual and
42:24
then build functioning human circuits in
42:26
a laboratory dish. We
42:29
start by asking a patient to
42:31
provide a small skin sample. We
42:34
then take those skin cells, reprogram
42:37
them by putting a series of genetic
42:39
factors, and push them back
42:41
in time so that those skin cells become
42:44
stem cells. It's like cellular
42:46
alchemy. These
42:48
stem cells have almost magical abilities
42:50
to turn into any other cell type.
42:53
So what do we do? We take the
42:55
stem cells, we dissociate
42:57
them, we then aggregate
42:59
them so that they form spheres or
43:02
tiny balls of cells. We
43:04
then take those, move them into
43:06
a special place where there is a kind
43:08
of chemical soup, and that
43:10
chemical soup will allow them to
43:12
grow and transform and turn into
43:14
a brain organoid. And
43:17
let me be clear. These are not
43:19
brains in a jar. These
43:22
are parts of the nervous system in
43:25
a laboratory dish. Each
43:28
of them contains millions of cells, and
43:31
we can even listen as
43:33
they fire electrical signals. Or
43:38
we can watch them as
43:40
they sparkle with electrical activity. Or
43:43
we can image inside and watch the
43:45
cells as they communicate with each other. Isn't
43:48
it remarkable to think that just a few
43:50
months ago, these cells were skin cells in
43:52
a patient, and now they're neural cells
43:55
at the bottom of a dish that we can study
43:57
at ease. Wow.
44:00
So you can figure out
44:02
how your patients' brain cells
44:04
evolved by using stem cells
44:06
to grow neural cells that
44:08
you call brain organoids. I
44:11
mean, it's amazing. Can you
44:13
give me an example of what you have
44:15
learned from this process? I know you specialize
44:18
in a particular form of autism called Timothy
44:20
syndrome. Yes. So
44:22
for Timothy syndrome, which is very
44:24
rare, the patients will
44:27
not just have autism and
44:29
epilepsy, but later on
44:32
they'll have a heart problem. They're
44:35
more susceptible to infections, but
44:37
the patients have literally just
44:39
a letter in their genome that
44:42
has been modified. And
44:44
that tiny, tiny mutation in a
44:46
calcium channel has devastating
44:49
effects. And this
44:51
mutation that causes Timothy
44:53
syndrome was predicted
44:56
to cause that channel to
44:59
stay open slightly longer so
45:01
that more calcium would get inside the cell. But
45:03
of course, nobody has ever seen it. So
45:06
the first experiment that we've done, which
45:09
I still vividly remember even today, the
45:12
night when it actually worked for the first
45:14
time. And I
45:16
was in the microscopy room
45:19
and we finally had neurons
45:21
and could actually see that there was more calcium
45:24
going inside the cell. And
45:26
that was really a very
45:28
exciting moment because it actually showed us that
45:30
you take the cell from patients and
45:32
you can see a defect, so a
45:35
biological consequence of
45:38
a mutation. We now have
45:40
literally hundreds of genes that
45:42
are associated with autism,
45:45
intellectual disability, ADHD, schizophrenia.
45:48
And the reason why that is important is
45:51
because genes are closer
45:53
to the molecular pathways. So
45:56
they offer a great entry point. And
45:59
of course, I am convinced that in understanding the
46:01
molecules that are important for that, we're going to discover
46:04
very important pathways that are causing disease. So
46:10
forecast for me, you know, if you
46:12
had a fantasy come true, what would
46:14
that look like? Would it mean that
46:16
you would be able
46:18
to stop neurons from having
46:21
dysfunction or problems before they got
46:24
there? Would it mean that you
46:26
would do genetic testing? I want
46:28
to understand the implications of what
46:30
this might mean for future generations
46:32
of people. So
46:35
we've helped close to 250 labs
46:37
around the world to learn this technology.
46:39
We're going to start to understand what
46:41
goes wrong in those cells. And
46:44
I can tell you that for
46:46
Timothy syndrome in particular, we
46:48
got such a good understanding that
46:50
a therapeutic opportunity just arose pretty
46:52
much naturally out of those experiments.
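To picture the mechanism Pasca describes above, a channel that stays open slightly longer per opening lets proportionally more calcium into the cell, here is a toy back-of-the-envelope comparison in Python. The open times, opening rate, and influx rate are made-up placeholders, not measured CaV1.2 biophysics.

```python
# Toy comparison: a channel that stays open slightly longer per event
# lets in proportionally more calcium. All numbers are illustrative only.
OPEN_TIME_TYPICAL_MS = 1.0    # assumed mean open time, typical channel
OPEN_TIME_MUTANT_MS = 1.3     # assumed slightly prolonged open time
EVENTS_PER_SECOND = 50        # assumed channel opening rate
INFLUX_PER_MS = 100.0         # assumed ions entering per ms the channel is open

def calcium_influx(open_time_ms: float) -> float:
    """Ions per second = openings/s * open duration * influx rate."""
    return EVENTS_PER_SECOND * open_time_ms * INFLUX_PER_MS

typical = calcium_influx(OPEN_TIME_TYPICAL_MS)
mutant = calcium_influx(OPEN_TIME_MUTANT_MS)
print(f"typical: {typical:.0f} ions/s, mutant: {mutant:.0f} ions/s "
      f"(+{100 * (mutant / typical - 1):.0f}%)")
```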
46:55
And so we designed a strategy
46:58
where we essentially destroy that RNA
47:00
that carries the mutation. And
47:02
this would likely involve an injection of
47:05
this therapy. We're still far away from
47:07
that. But in principle, it would
47:09
involve injecting a piece of DNA
47:11
that will
47:14
modulate that. So in theory,
47:16
you could stop the mutation in its
47:18
tracks. But how old would a
47:21
person have to be to get a treatment like this?
47:23
I mean, I guess in utero? Certainly
47:25
for neurodevelopmental disorders, you want to
47:27
intervene as early as you possibly
47:29
can. Because, you know,
47:31
once damage has been done to the
47:34
nervous system, it is going to be probably
47:36
hard to reverse. And
47:38
so it is foreseeable that
47:40
some of these patients actually will be diagnosed
47:43
within the first few weeks after being born, which
47:46
means that one could intervene very early
47:48
on in the course of
47:50
this disease. Where
47:53
do we start to think about what
47:55
the ethical implications are of when we
47:57
intervene in the development of someone? This
48:00
is a great, this is a great issue, something
48:02
that actually preoccupies me quite a lot. I
48:04
spend a lot of time, you know,
48:06
thinking about the ethical societal implications of the
48:09
work that we do both in terms of
48:11
delivering some of the therapies and how early
48:13
you would intervene. For Timothy
48:15
syndrome, it's likely to be
48:17
justified to intervene very early with
48:20
a therapy like this, especially because these
48:22
children are very sick, very
48:24
debilitated by this condition. But there will
48:26
be other neuropsychiatric conditions that are perhaps
48:28
not as severe. We're
48:30
not as good at predicting, and I think we're
48:32
going to have to think very carefully about when
48:34
do we want to intervene and what are the
48:36
risks for intervention. I think that's
48:39
important to keep in mind because I'm sure there
48:41
are some people listening who think you know we've
48:43
just gotten to a point in society where we've
48:45
started to appreciate that people's brains work differently. The
48:48
word neurodiversity is used often
48:50
that we start to
48:52
understand that, you know, people come
48:54
in all sorts of shapes and sizes and
48:56
ways of seeing the world. Absolutely and
48:58
certainly our goal has never been to
49:01
try to either change or
49:03
cure anybody who doesn't want
49:05
to change in any way.
49:08
Most of our work has
49:10
been focused on addressing these
49:12
devastating conditions of the human
49:14
brain, severe intellectual disability, where
49:17
any improvement, even a very small improvement there, will make
49:19
a huge difference because there are no therapeutic
49:22
approaches that are even close to
49:24
being curative in those conditions. I
49:28
think it's going to be a very exciting
49:30
time for human neurobiology and for
49:32
psychiatry and my
49:34
hope is that slowly but
49:36
surely psychiatry will be
49:39
moving into a molecular era.
49:42
That's Dr. Sergiu Pasca. He's
49:44
a professor of psychiatry and
49:46
behavioral sciences at Stanford University
49:48
where he directs the Stanford
49:51
brain organogenesis program. You
49:53
can see his full talk at ted.com.
49:58
So we have talked about all kinds of
50:01
brain hacks. And we want
50:03
to end our show with a very personal
50:05
story. So
50:07
for people with neurological conditions
50:10
like epilepsy, the lack
50:12
of control over their brain can
50:14
be terrifying. It's very, it's
50:17
a crushing limitation for sure. This
50:20
is Kate Faulkner. She's
50:22
a sous chef in Colorado.
50:24
She's also the sister of
50:26
TED Radio Hour producer Rachel Faulkner-White.
50:29
And when Kate was a teenager, she
50:32
started having tiny seizures
50:34
called absence seizures. Where I
50:36
would shake a little bit and
50:38
drop whatever I was holding, but I wouldn't black
50:40
out or fall over or
50:42
lose consciousness or anything like that. At
50:45
the time, she didn't know what they were. And
50:47
she wasn't formally diagnosed with epilepsy until
50:50
a few years later. My
50:52
first big seizure was in the summer of
50:54
2017. And luckily it was with other people,
50:59
but I have no memory of it happening. It was one
51:02
minute I was sitting at the kitchen table and the next minute
51:04
I was with a bunch of
51:06
EMTs who were in the living room with me. After
51:09
that, Kate had started on medications that
51:11
were supposed to prevent these seizures. And
51:14
they seemed to help at first. But
51:17
a few years later, she had another big
51:19
one. I was driving in my
51:21
car, which was terrifying. Luckily,
51:23
it was the only place on the
51:26
route that I was driving that didn't
51:28
have any trees or guardrails. And I just kind of
51:30
drifted across the field. After
51:32
that one, I stopped driving. This
51:35
kept happening more and more
51:37
frequently. I was
51:39
having seizures alone at home. I
51:42
was having seizures at work. I
51:45
eventually had to leave that
51:47
chef job because open
51:49
flames and sharp knives and seizures
51:52
are not a great combination. It's
51:55
devastating to feel so limited
51:57
in what I can do.
52:00
It's the idea of
52:02
hurting someone else if I was driving a
52:04
car. Like, I don't
52:06
know how I managed to not hurt anybody.
52:09
I am haunted by the idea of what
52:11
could have happened. And I really
52:13
missed having the freedom to go where I
52:15
wanted and to go hiking
52:17
by myself or go to the store by
52:20
myself or go swimming or take a bubble
52:22
bath. But in
52:24
2023, Kate's neurologist had
52:26
a new idea. She surgically
52:29
implants a vagus nerve stimulation device.
52:31
Which is a battery that goes
52:33
in your chest on the left
52:35
side, just underneath the skin. And
52:38
it's connected to a wire that wraps around
52:40
the vagus nerve, which is part of the
52:43
parasympathetic nervous system. And
52:46
the battery is programmed
52:49
by the neurologist to emit
52:51
small electrical pulses. It
52:54
sends an extra burst of electricity
52:57
into the brain and can increase
52:59
blood flow to certain areas of the brain and
53:02
can prevent seizures from
53:04
happening. And right now my
53:07
battery is programmed to send a pulse every
53:09
five minutes for a 30 second
53:12
interval. And the
53:15
interesting side effect is that whenever the
53:17
electrical pulse goes off, there
53:19
it goes, so yeah. So as you can hear it
53:21
happening right now, the battery is going off. The
53:24
vagus nerve also works around
53:26
your vocal cords. And so your voice goes
53:29
a little strange. It's
53:31
not painful at all. I remember
53:33
for the first couple weeks, it
53:36
felt like it was really hard to catch my breath. Even
53:39
though I was breathing normally and breathing fine, it's
53:41
kind of that same feeling. So that
53:43
was hard to get used to. And after
53:46
30 seconds, then it goes away. And
53:48
my voice goes back to normal. It's
53:51
not a cure for epilepsy. And
53:53
it's not effective for everyone. But
53:56
for Kate. I haven't had a seizure since
53:58
I got the device put in. I
54:01
now have control to some
54:03
extent over a part of my brain that
54:05
I didn't have before. The
54:08
possibility of the freedom that
54:10
this could potentially bring. I might be
54:12
able to start titrating off
54:14
the epilepsy drugs that I'm on, which
54:16
those have some not fun side effects.
54:20
I want to be able to go
54:22
through a day
54:24
without having the intrusive
54:26
thought of I'm carrying
54:28
a 40 gallon pot of hot
54:30
soup and thinking on today's episode
54:32
of Bad Times to Have Seizures,
54:36
it'd be wonderful to have that kind of
54:38
freedom. I mean it's an incredible
54:40
piece of technology and then even
54:43
though I have a weird voice now
54:45
I'm optimistic that
54:47
this will make life easier and open
54:50
up possibilities. That's
54:53
Kate Faulkner. We are so grateful to
54:55
her for telling her story. And you
54:57
can learn more about her condition and
55:00
treatment at ted.npr.org. Thank
55:06
you so much for listening to our show
55:08
Brain Hacks. This episode was
55:10
produced by Rachel Faulkner-White, Katie Monteleone,
55:12
and Fiona Gearan. It was edited
55:15
by Sanaz Meshkinpour, James
55:17
Delahoussaye, and me. Our
55:19
production staff at NPR also includes
55:21
Matthew Cloutier and Harsha Nahata. Irene
55:23
Noguchi is our executive producer. Our
55:27
audio engineers were Robert Rodriguez,
55:29
Margaret Luthar, and Ted Mebane.
55:32
Our theme music was written by Ramtin
55:35
Arablouei. Our partners at TED are
55:37
Chris Anderson, Michelle Quint, Alejandra
55:39
Salazar, and Daniella Balarezo. I'm
55:42
Manoush Zomorodi and you've been listening to the
55:44
TED Radio Hour from NPR. This
55:47
message comes from NPR sponsor Spectrum
55:49
Business, made to work just like
55:51
your small business. Fast, reliable internet,
55:53
phone, and mobile services made to
55:55
keep up with demand and made
55:57
to deliver on a small business
56:00
budget. Discover more
56:02
at spectrum.com/work. Support
56:04
for this NPR podcast and the
56:07
following message comes from ezCater,
56:09
committed to helping companies from nonprofits
56:11
to the Fortune 500 find
56:14
food for meetings and company events
56:16
with online ordering and 24/7
56:18
live support. Learn more at
56:20
ezcater.com. Every weekday,
56:22
NPR's best political reporters come to you
56:25
on the NPR Politics Podcast to explain
56:27
the big news coming out of Washington,
56:29
the campaign trail, and beyond. They don't
56:31
just tell you what happened, they tell
56:33
you why it matters. Tune in to
56:35
the NPR Politics Podcast every afternoon
56:37
to understand the world through political eyes.