Episode Transcript
0:02
Every day you're generating data about
0:04
your health. You might not even be aware of
0:06
it. Maybe your phone counts how many
0:08
steps you take. Maybe your watch
0:11
measures your pulse or your heart rhythm,
0:14
or you use an app to track your exercise
0:16
or diet, and that doesn't even count
0:18
your medical data, the records
0:20
that doctors and insurance companies and
0:22
pharmacies keep about all of us. All
0:25
that data goes somewhere, and it's valuable
0:28
to someone. Welcome
0:33
to Prognosis, Bloomberg's podcast
0:35
about the intersection of health and technology
0:38
and the unexpected places it's taking us.
0:41
I'm your host, Michelle Fay Cortez. The
0:44
amount of health data is increasing fast,
0:47
from medical records, to health apps and
0:49
devices, to our shopping habits
0:51
and online browsing. Every day
0:53
we leave digital footprints revealing
0:56
intimate aspects of our lives. That
0:58
comes with benefits and risks.
1:01
But no one has sorted it all out yet and
1:04
laws protecting people haven't caught up with the
1:06
advances in technology. Having
1:09
all that data promises to help researchers
1:11
come up with new treatments, and it can
1:13
improve doctors' care. But the
1:15
risk is that personal information you'd rather
1:17
keep to yourself could be exposed.
1:20
Here's Bloomberg's health reporter John Tozzi.
1:26
Good afternoon, Thank you for calling Anthem Member
1:28
services. My name is Kathy. How
1:30
can I help? Hi, Kathy, my name
1:32
is John Tozzi. I'm a reporter
1:34
with Bloomberg News and I'm recording this for a story
1:37
about medical data and privacy. Um.
1:39
I'm an Anthem member and I'd like to request
1:41
a list of who Anthem has shared my personal
1:44
information with. So you're a reporter
1:47
with Bloomberg, correct? Okay,
1:50
and you are inquiring... I recently learned that
1:52
I have the right to ask my health insurance
1:54
company what they're doing with my data.
1:57
It's one of the rights given to me under
1:59
a law called HIPAA. HIPAA
2:01
stands for the Health Insurance Portability
2:04
and Accountability Act. It was
2:06
passed in 1996, and it's
2:08
the main law in the United States that governs
2:11
what medical providers and insurance companies
2:13
can do with our healthcare data. HIPAA
2:15
determines how medical data can be shared
2:18
and what happens if it's shared improperly.
2:21
It also gives people rights over their data,
2:23
like the right to get a copy of your medical record
2:26
or to find out how your data has been shared with other
2:28
parties, but HIPAA doesn't
2:30
cover everything. There is the idea,
2:33
and it is extremely widespread
2:36
that health information is
2:38
inherently going to be protected by
2:41
some law somewhere, but it's not true,
2:43
not at all. This is Pam Dixon.
2:46
I am the executive director of the World Privacy
2:48
Forum, we're a public interest research
2:50
group. She's been a privacy advocate
2:52
for twenty years. She told me
2:55
that people often assume there's some kind of automatic
2:57
protection for health data. That's
3:00
not the case. People universally
3:02
believe that their health data, no
3:04
matter where it is, has some form
3:07
of legal protection and is
3:09
somehow magically confidential.
3:12
HIPAA applies to the records that your doctor,
3:15
other medical providers, and your insurance
3:17
plan hold, But more and
3:19
more data about our health isn't just
3:21
in medical records. HIPAA-covered data
3:23
is a smaller and smaller percentage
3:26
of all of the health data
3:28
that's out there now, And it
3:30
is so so important for
3:32
folks to understand this because much
3:35
of the health data that
3:37
we're working with today is not covered
3:40
under HIPAA protections. Here's
3:43
one famous example. Journalist
3:45
Charles Duhigg reported that Target
3:47
used detailed profiles of customers
3:50
to predict when women became pregnant,
3:52
and then the company sent them promotions for baby
3:54
clothes or diapers. The result
3:57
was creepy in the best case, and in the worst
3:59
case, could have revealed information they
4:01
may not have wanted public. Increasingly,
4:04
health data is being collected by technology
4:06
companies, data brokers, advertisers,
4:09
and other entities that are not subject
4:11
to HIPAA, and it's being used
4:14
and may be misused in ways that
4:16
a lot of people don't understand. Think
4:19
about the apps on your phone. Maybe
4:22
you have something to track your steps or
4:24
to log what foods you eat or when you exercise.
4:27
Unless those apps come from your medical provider
4:30
or health plan, they're not covered by HIPAA,
4:32
and that means the companies collecting your data
4:35
are far less restricted in how they use it,
4:38
and how they use it may not always be transparent.
4:41
A study published in the journal JAMA
4:43
Network Open in April looked
4:46
at thirty six top apps to help
4:48
people with depression and quitting smoking.
4:51
Most of them were sending data to Google or Facebook
4:54
for marketing, advertising, or analytics,
4:57
But less than half of those apps disclosed
5:00
that. The authors wrote most
5:02
apps offered users no way to anticipate
5:05
that data will be shared in this way. As
5:07
a result, users are denied an
5:10
informed choice about whether such sharing
5:12
is acceptable to them. This
5:14
is the kind of risk that has some people really
5:17
worried. Even though some privacy
5:19
advocates think HIPAA's protections
5:21
should be stronger, they're a good
5:23
start. It's the world of data
5:26
beyond HIPAA's reach that we need to
5:28
pay a lot more attention to. Because
5:31
of the lack of, uh, sort
5:33
of a uniform standard
5:35
across the country with regard to data
5:37
that isn't protected by HIPAA,
5:40
um, there are concerns about
5:42
the privacy, particularly of health
5:44
data. This is Iliana Peters.
5:47
I'm currently a shareholder at Polsinelli,
5:49
which is a national law firm. Iliana
5:51
worked for the federal government for about twelve years.
5:54
She wrote and enforced HIPAA regulations
5:56
before she went to work for a private law firm.
5:58
Like Pam,
6:01
she's concerned about the growing volume of health
6:03
data that HIPAA doesn't cover. The
6:05
information that your employer holds about
6:07
you related to your health would not be
6:10
protected by HIPAA, um. The information
6:12
that you share with social media about your
6:14
health or the groups that you participate
6:17
in on social media
6:19
about health issues is not protected.
6:22
There are applications that are direct to
6:24
consumer. That means they are marketed directly
6:26
to consumers and have everything
6:29
to do with, you know, weight
6:31
loss, to disease
6:33
management, um, to disease
6:35
prevention. Because they're marketed
6:37
directly to a consumer and don't ever interact
6:40
with a healthcare provider on their behalf or
6:42
with a health plan, that would not be covered
6:45
by HIPAA, um. So there's a
6:47
there's a huge amount
6:49
of healthcare data, um, out
6:51
there that isn't actually covered by
6:54
a standard set of legal requirements.
6:57
Here are some of the ways you might be revealing data
7:00
without knowing it. You use a credit card
7:02
to buy a pregnancy test at a retail drug
7:04
store. You order new pants online,
7:07
revealing your waist size. You
7:09
search Google for symptoms of anxiety.
7:11
You subscribe to a magazine about diabetes.
7:14
You use an app to track your morning runs.
7:17
You take a direct to consumer DNA test,
7:20
You take an Uber to your therapist's office
7:22
at the same time each week.
7:25
Just because information about your health could
7:27
be gleaned from these activities doesn't
7:29
mean it will be. The problem
7:32
is, we often don't have a very good idea
7:34
of where this data ends up after
7:36
it's collected. Some of it could
7:39
end up in the hands of data brokers. Data
7:41
brokers are a multibillion dollar industry
7:44
made up of thousands of companies that you've
7:46
probably never heard of. They compile
7:48
information about people and sell it to marketers.
7:51
They collect information from public records
7:53
and even data that you might not realize
7:55
you're generating, like your retail purchases,
7:58
what groups you belong to, online magazines
8:00
and services you subscribe to, and information
8:03
you fill out in surveys or online registrations.
8:07
They take all of this information and make lists
8:09
of people for marketers to target. In
8:12
testimony before the Senate Commerce Committee,
8:14
in 2013, Pam, the privacy
8:16
advocate, described how the data broker
8:18
industry tracks people by the diseases
8:21
they have and the medicines they take.
8:23
There are lists of millions
8:25
of people that are categorized by the
8:28
diseases that they have, ranging
8:30
from cancer to bedwetting,
8:33
Alzheimer's, terrible
8:35
diseases, some of them benign, some of them
8:38
relating to mental illness. There are lists
8:40
of millions of people and what
8:43
prescription drugs that they take, and
8:47
these lists exist entirely
8:49
outside of HIPAA, outside
8:52
of any
8:54
kind of federal health protection. Pam
8:57
told Congress about some lists that show
9:00
the darker sides of this business model.
9:02
They included lists of rape victims
9:05
and people with genetic diseases.
9:07
She found lists for sale of people
9:09
who had HIV and aids, of
9:11
people with dementia, and of people with alcohol
9:14
or drug addiction. There were lists
9:16
of domestic violence victims and police
9:18
officers home addresses. The
9:21
list of rape victims cost less
9:23
than eight cents per name. Pam
9:26
said that some of these lists were taken
9:28
down within an hour or two of her testimony,
9:31
but most of them have reappeared at some point,
9:33
and six years after her testimony,
9:35
she says not much has changed.
9:38
The data broker dossiers are often described
9:40
as marketing lists, but Pam said
9:43
that doesn't necessarily mean the buyers
9:45
are marketers, and it also doesn't
9:47
mean that the lists are used as they're
9:49
intended. For example, employers
9:52
or insurance companies could also be buying
9:54
and using this data. There's no law
9:56
against this, So all
9:58
of this points to a need for more protection.
10:01
The laws we have just don't reach far
10:04
enough. But despite its limits,
10:06
HIPAA does provide a good framework
10:09
for where to start. Here's
10:14
the good news. When data is covered
10:16
by HIPAA, the law gives people important
10:19
protections. Healthcare providers
10:21
and insurance plans are barred from disclosing
10:23
individually identifiable data under
10:25
HIPAA, and it goes further. As
10:28
you might remember, the law also grants people
10:31
rights over their data. It gives
10:33
people seven different rights,
10:36
and the rights are really important
10:38
because before HIPAA there were huge
10:40
problems. Pam says, it was really
10:42
difficult to get a copy of your own medical
10:44
records before HIPAA. Before HIPAA,
10:47
good luck getting a consistent copy
10:50
of your health file. It wasn't a legal
10:52
requirement anywhere, so you can predict
10:54
what was happening prior to HIPAA. It was a
10:56
disaster trying to get your
10:58
health information. It also gives you
11:00
the right to know if someone has subpoenaed your medical
11:03
records, which might happen in a nasty
11:05
divorce case, for example. And
11:07
it gives you the right to request an accounting
11:09
of disclosures that's the list
11:11
of who your doctor or health plan has shared your
11:13
medical records with. The list that I'm
11:16
trying to get from Anthem.
11:18
HIPAA also sets the rules for what
11:20
those entities can do with your data.
11:23
They can't just make it public. They can't
11:25
tell a reporter or your employer or
11:27
a family member about your diagnosis, your
11:29
treatment, or any other private information
11:32
without your permission. HIPAA
11:34
does allow medical providers and health
11:36
plans to release data if it's
11:38
de-identified. That means
11:40
removing information like your name, address,
11:43
precise zip code, and other details.
11:46
This de-identified data can be used
11:48
for research. It can also be sold.
11:51
For example, when drug companies want to know
11:53
which doctors are writing the most prescriptions
11:56
for their medications, they pay data brokers
11:58
who collect that information. Then pharmaceutical
12:01
companies can send their salespeople to doctors
12:03
who are the highest volume prescribers. The
12:07
data they're buying doesn't have your name on
12:09
it, but it does represent you
12:11
aggregated with other people, and
12:14
once it's de-identified, it's
12:16
no longer bound by HIPAA's protections.
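To make that de-identification step concrete, here's a minimal sketch of the idea in code. It loosely follows HIPAA's "Safe Harbor" approach of stripping or coarsening identifiers like names, addresses, and precise zip codes; the record fields and exact rules below are illustrative assumptions, not any insurer's actual pipeline.

```python
# Minimal, illustrative sketch of Safe Harbor-style de-identification.
# The field names and coarsening rules here are hypothetical assumptions.

record = {
    "name": "Jane Q. Patient",               # direct identifier: removed
    "address": "123 Main St, Springfield",   # direct identifier: removed
    "zip": "10023",                          # coarsened to a 3-digit prefix
    "birth_date": "1984-03-14",              # coarsened to the year
    "diagnosis_code": "E11.9",               # clinical data: kept
    "prescription": "metformin",             # clinical data: kept
}

REMOVE = {"name", "address"}  # fields dropped outright

def deidentify(rec: dict) -> dict:
    """Return a copy of the record with identifiers stripped or coarsened."""
    out = {}
    for field, value in rec.items():
        if field in REMOVE:
            continue                       # drop names, street addresses
        if field == "zip":
            out[field] = value[:3] + "XX"  # keep only the zip prefix
        elif field == "birth_date":
            out[field] = value[:4]         # keep only the birth year
        else:
            out[field] = value             # pass clinical fields through
    return out

print(deidentify(record))
# {'zip': '100XX', 'birth_date': '1984',
#  'diagnosis_code': 'E11.9', 'prescription': 'metformin'}
```

The clinical content survives for research or sale; only the fields that point directly at a person are removed or blurred.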
12:19
Some privacy advocates I talked to described
12:22
this as a violation of privacy.
12:25
The fact that you can't control de-identified
12:28
versions of your data is really
12:30
troubling to some people. It's
12:32
especially concerning because of the risk
12:34
that some de-identified data could
12:36
be re-identified, that
12:38
it could be matched back to you as an individual.
12:42
Most experts I talked to said that
12:44
this risk is real, but small. Still,
12:47
the odds of being re-identified
12:49
have increased since HIPAA was
12:51
first passed in the nineties. Here's
12:54
Pam. The world has changed.
12:57
So back then, I mean,
12:59
the statistical chance of re-identifying
13:01
records was enormously low. Now
13:04
the chance of re-identifying records is
13:06
a little bit easier because computing power
13:09
has advanced so much and there's
13:11
so many more data sets
13:13
that allow for more identifiability.
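To illustrate what Pam means, here's a toy sketch of the classic "linkage attack": joining a de-identified record to a public data set, such as a voter roll, on quasi-identifiers like zip prefix, birth year, and sex. All of the data and field names here are made up for illustration, not drawn from any real attack.

```python
# Toy linkage attack: re-identify a "de-identified" record by joining it
# to a public data set on shared quasi-identifiers. All data is made up.

deidentified_claims = [
    {"zip3": "100", "birth_year": "1984", "sex": "F", "diagnosis": "E11.9"},
]

public_roll = [  # e.g., a voter file or a people-search site
    {"name": "Jane Q. Patient", "zip3": "100", "birth_year": "1984", "sex": "F"},
    {"name": "John Doe", "zip3": "100", "birth_year": "1971", "sex": "M"},
]

KEYS = ("zip3", "birth_year", "sex")

for claim in deidentified_claims:
    # Find everyone in the public data set who shares the quasi-identifiers.
    matches = [p for p in public_roll
               if all(p[k] == claim[k] for k in KEYS)]
    if len(matches) == 1:  # a unique match ties the diagnosis to a name
        print(matches[0]["name"], "->", claim["diagnosis"])
```

The more overlapping data sets exist, the more often a handful of quasi-identifiers narrows to exactly one person, which is why re-identification keeps getting easier.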
13:16
But there are also benefits to making
13:18
de-identified data available. Medical
13:21
researchers rely on it to learn about
13:24
how to improve care, public
13:26
health officials use it to track epidemics
13:28
and trends in population health, and
13:31
as a journalist, I often cite research
13:34
or findings based on this kind of data,
13:36
from how common certain medical procedures
13:38
are to how often a new drug is
13:40
prescribed. I work
13:42
in privacy and I definitely have an
13:44
opinion on privacy. I'm, I'm for privacy
13:48
and something that was very hard
13:50
for me to learn, and it took years,
13:53
um, was the value
13:55
of releasing data. Pam
13:58
said she's come to realize there are trade-offs
14:00
between keeping data totally private
14:03
and using some de-identified pieces
14:05
of it. If you want to cure diseases,
14:07
you're going to have to study the disease,
14:10
and you can't do that without information
14:12
about the disease. Information about
14:14
that disease resides in people's experience
14:17
with that disease as
14:19
patients. We might also benefit directly
14:22
from having more of our healthcare data digitized.
14:25
To learn about these benefits, I paid
14:27
a visit to the Commonwealth Fund. Welcome,
14:30
Thank you. I
14:33
was there to see a man named David Blumenthal.
14:36
I'm president of the Commonwealth Fund,
14:39
which is a national healthcare philanthropy
14:42
based in New York City, and
14:44
our goal is to create a high-performing
14:46
healthcare system in the United States. David's
14:49
office is in a landmarked hundred-and-eleven-year-old
14:51
mansion on Manhattan's Upper East
14:53
Side, overlooking Central Park. It
14:56
used to belong to the Harkness family, which
14:58
endowed the Commonwealth Fund a century ago
15:00
with money they made as investors in John
15:03
D. Rockefeller's Standard Oil Company.
15:05
David Blumenthal is a big name in healthcare.
15:08
He worked as a primary care doctor at Massachusetts
15:11
General Hospital. He advised Senator
15:13
Ted Kennedy on healthcare and later
15:15
worked for President Barack Obama as the
15:17
country's top health I T official. He
15:20
helped implement a law called the HITECH
15:22
Act, which updated some HIPAA
15:24
rules. It also gave medical providers
15:27
billions of dollars in federal subsidies
15:29
to digitize paper records. The
15:32
HITECH Act was intended to modernize
15:34
America's paper based healthcare system.
15:37
As recently as ten years ago, a
15:39
majority of doctor's offices in the United
15:42
States still used paper records.
15:45
David's a big believer in how the accumulation
15:47
of digital healthcare data can help people.
15:50
As it grows, it begins
15:52
to represent the healthcare experience
15:56
of millions,
15:58
or even billions of people,
16:01
and that is incredibly
16:04
valuable. He says apps
16:06
that draw on patients' data could help them
16:08
take better care of themselves. They
16:10
could prompt people to get flu shots or
16:12
alert diabetics when their blood sugar gets
16:14
out of whack. David
16:17
says he sees the benefits of greater access
16:19
to medical data as a physician and
16:21
as a patient. Though he works in New
16:23
York, he lives in Boston, and he still
16:26
sees doctors at Mass General where
16:28
he used to work, and its affiliated hospitals
16:30
in the Partners HealthCare system. He
16:32
finds it comforting that he can walk into any
16:34
of the dozens of clinics or hospitals
16:36
in the system and they'll still have his records.
16:39
I have seen and used that,
16:42
that connectedness, with
16:45
my own care, and it's
16:47
enormously reassuring, um,
16:51
that you don't have to, you know... your
16:53
medicines will be known, your results
16:55
of all your tests will be known, and all
16:58
that. That
17:00
could solve some big problems in the US health
17:02
care system. There's a lot of evidence
17:05
that patients are harmed all the time because
17:07
their care is fragmented and not coordinated.
17:10
A specialist who doesn't know all the medications
17:12
you're on might prescribe a new drug that
17:15
has a bad interaction with one you're already
17:17
taking. One study of more
17:19
than half a million patients with chronic illnesses
17:21
like diabetes or heart disease found
17:23
that people who had more fragmented care
17:26
had higher costs, lower quality
17:28
care, and more preventable hospital
17:30
visits. This is a real problem
17:32
that a lot of people in healthcare would like to solve.
17:35
Policymakers are trying to make the whole country's
17:37
health care system work better together. They're
17:40
trying to encourage different electronic medical
17:42
record systems to talk to each other.
17:44
They're also making it easier for patients
17:47
on government health insurance like Medicare
17:49
to get access to their health data. The
17:51
goal is a health care system that seamlessly
17:54
relays important information that could
17:56
save your life. David gave
17:58
me a classic example. You live
18:00
in Austin, but you get into a car accident
18:02
in Chicago. Once you get to the
18:04
emergency room, maybe you're dazed or
18:06
unconscious, or you forget to tell the physician
18:09
about an allergy. But if
18:11
digital records were more widely accessible,
18:13
that might not be an issue. The merger
18:15
room physician finds
18:18
your Apple phone and everything is on the
18:20
Apple phone, or they can access
18:22
your record in a cloud because there's an
18:24
agreement to share that information, and so
18:27
that increases the reliability of
18:29
your care, reduces the chance of an error,
18:31
reduces the chance of a of
18:34
a bad outcome. That's
18:36
the benefit. The risk is mostly
18:39
privacy, and the
18:43
sicker people are, the
18:45
less concerned they are about privacy. But
18:48
there are also downsides. Just
18:50
as with app-collected data, more
18:52
traditional medical data sharing has its
18:54
drawbacks. The risk is that nothing
18:57
is ever truly private. As
18:59
soon as your information is available
19:03
in electronic form, either in a server
19:05
or in the cloud, it
19:08
is potentially vulnerable. David
19:12
has experienced this firsthand as
19:15
a federal employee. His data
19:17
was breached in a hack of the government's employee
19:19
database. I've given up on
19:22
the idea of privacy. It's just
19:24
not feasible anymore. It
19:27
hasn't, that
19:29
I know of, happened to my health data, but
19:32
it could, um, and I expect it
19:34
might. Once data is digitized
19:36
and stored, there's a risk it might end up
19:38
somewhere you don't want it to. HIPAA
19:42
requires medical providers and health plans
19:45
to tell you when your data has been breached,
19:47
and under the High Tech Act, if
19:49
a breach affects more than five hundred people,
19:52
the companies have to report it to the federal
19:54
government, which publishes a list.
19:57
Since two thousand nine, when the reporting
19:59
requirement went into effect, HIPAA-covered
20:01
entities have reported more than two
20:04
thousand, five hundred breaches that
20:06
affected almost two hundred million
20:08
individuals' health records. Health
20:11
data breaches happen so frequently now
20:13
that they rarely make the news. They're
20:15
routine. On average, there's a
20:17
breach of HIPAA-protected health data every
20:20
thirty-one hours, and that's
20:22
only the data breaches that companies have detected
20:25
and that we know about.
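As a rough back-of-the-envelope check on that rate, using the approximate figures above (so this is a consistency test, not an exact count):

```python
# Back-of-envelope check on the breach rate, using approximate figures:
# "more than 2,500" reported breaches over roughly a decade since 2009.
breaches = 2500                 # approximate count of reported breaches
hours = 10 * 365 * 24           # ten years expressed in hours (~87,600)
print(round(hours / breaches))  # ~35; a breach count somewhat above 2,500
                                # gives the "every thirty-one hours" figure
```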
20:27
We know about them because the law requires entities covered
20:29
by HIPAA to tell us, but
20:32
under federal law, entities not covered
20:34
by HIPAA generally don't have to tell
20:36
us when a data breach happens, though state
20:38
laws may require them to report breaches.
20:41
They also aren't bound by any of the other
20:44
requirements of HIPAA. They're mostly
20:46
bound by the promises they make to you in
20:48
their terms of service, those long
20:50
passages of legalese that you click
20:53
through after you download an app or sign
20:55
up for a new service, and that's
20:57
where a lot of the privacy concerns about
20:59
health data are growing. There's not only
21:01
the risk that your data might get breached in an
21:03
illegal hacking operation or stolen
21:05
by a crooked employee. There's
21:08
also the risk that it might get shared or sold
21:10
in a way that's not necessarily illegal
21:13
but isn't completely transparent. Either.
21:16
Facebook and Amazon can do anything
21:18
they want with your data or
21:20
any other company that's not covered
21:23
can do anything they want unless
21:26
they have assured you in that fine
21:28
print that they won't. But
21:30
since none of us read that fine print,
21:33
we'll never get around to suing. So
21:35
under HIPAA, we have certain rights: the
21:37
right to get a copy of our data, the right
21:39
to know how it's being shared and when
21:41
it is shared improperly, and
21:44
it requires healthcare providers to keep
21:46
our identifying data close, to
21:48
not disclose it without our permission. We
21:51
don't have those rights over the data we give to
21:53
some app we download, or a new fitness
21:55
device or a social media service. We
21:58
don't have those rights over what happens with
22:00
our credit card purchasing data or
22:02
our online searches. Partly
22:04
because we don't have those rights, sometimes
22:07
our names and contact details wind
22:09
up for sale on data brokers lists
22:11
labeling us as diabetics or
22:13
dementia sufferers or victims
22:16
of domestic violence. Right
22:18
now, the law doesn't do a very
22:20
good job of making companies be really
22:23
clear about what they're doing with our
22:25
data and making sure customers
22:27
are okay with it. So what
22:29
should we do? I think it's a really
22:31
good question, and it's a tough question. Here's
22:34
Iliana Peters, the attorney and former
22:36
HIPAA official. Trying to decide
22:39
what's best for all industries
22:41
with regard to the privacy and security of data
22:43
is extremely difficult. I
22:45
think, certainly there are some things we can
22:47
all agree on, and maybe that's where we need to start.
22:50
Certainly, I think individual rights is one
22:52
of those things, you know. I think everybody
22:54
should have rights to their own data and should
22:56
be able to be at least participatory
22:59
in how their data may be, um,
23:01
used or disclosed, why it should be deleted,
23:04
how that should happen? Um, you know,
23:06
when they can get copies of it, how that should
23:08
happen. One
23:11
possible model for people looking to improve
23:14
privacy policy in the United States
23:16
is a new law that recently took effect
23:18
in the European Union. It's called
23:20
the General Data Protection Regulation, and
23:23
it strengthens privacy protections for consumers.
23:25
It covers all sorts of personal data, not
23:28
just healthcare. The law makes companies
23:30
get more explicit consent from people about
23:32
the data they want to collect. It also
23:34
gives people a right to get a copy of their data,
23:37
and it's supposed to give them more control over
23:39
what happens to it. The United
23:41
States doesn't have anything like it yet,
23:44
and there's no clear path to passing a
23:46
new umbrella privacy law in the US
23:48
anytime soon. That means
23:51
that even companies trying to do the right
23:53
thing don't have good standards to
23:55
follow. Pam Dixon,
23:57
the privacy advocate, said we could start
24:00
by creating a set of standards that companies
24:03
adhere to voluntarily. That
24:05
would give consumers more trust in how
24:07
their data is being used. So ideally,
24:10
what I'd like to see at a minimum,
24:13
is some kind of structure that allows
24:15
for, um, privacy standards
24:17
to be built. Is there a privacy
24:19
standard we could write for health
24:22
data outside of HIPAA? I think there
24:24
is, and I think we could find a lot of agreement
24:27
amongst the stakeholders. As
24:29
I said, I think there's a lot of people who
24:31
want to do the right thing. It's just there's not a standard
24:33
yet. In the meantime, what can we
24:36
do as individuals to have more
24:38
control over our data? First,
24:40
you can exercise the rights you already
24:43
have under HIPAA. Pam recommends
24:45
everyone get a copy of their medical records
24:47
from their providers. If someone
24:50
tries to steal your identity later on,
24:52
it will be important to have your original files.
24:55
If you have kids, get copies for your kids
24:57
too. You can also pay attention
25:00
to what you're agreeing to when you start
25:02
using a new app or service. Here's
25:04
Iliana. I read everything before
25:07
I click "I accept," but I
25:10
realized that I may not be the typical user.
25:12
PAM also recommends simply asking
25:14
companies what data they're collecting and
25:17
what they're doing with it. You know, sending
25:19
an email to, um, an app
25:21
developer and asking what happens is always
25:24
a great idea. I do that all the time.
25:26
If they don't email me back, I delete the app.
25:29
I'm a reporter, so maybe I'm
25:31
biased about this, but I think asking
25:33
questions is a good way to show the
25:35
people we're trusting with our data, that we're
25:37
paying attention, that we care about
25:40
what happens to it, and that we want some
25:42
control. I spent about twenty
25:44
minutes on the phone with my insurance company.
25:46
Most of the time I was on hold. Hello?
25:56
Yes, thank you so much
25:58
for patiently waiting. I do apologize.
26:01
I just wanted to make sure that...
26:04
She was really friendly, and eventually she
26:06
gave me the address of the privacy office
26:09
where I could send an email to request
26:11
an accounting of disclosures, one of
26:13
my rights under HIPPA. I
26:15
wrote to them in April. At the end of
26:17
May, they sent me a letter that described
26:19
how my health information was released. Anthem
26:22
said they're required by law to send my claims
26:24
records to a database run by the state
26:26
Health Department. The letter also
26:29
said that my name, date of birth,
26:31
and contact information were exposed
26:33
in a cyber attack in 2015. Anthem
26:37
was hacked in a breach that compromised
26:39
data on seventy nine million people.
26:41
It was the largest recorded health data
26:44
theft in US history. Anthem
26:46
paid a sixteen-million-dollar settlement
26:48
last year over potential HIPAA violations
26:51
related to the breach. The company
26:53
did not admit liability as part of the settlement,
26:56
and just in May, two Chinese
26:58
nationals were indicted in the crime. The
27:01
Justice Department called them part
27:03
of an extremely sophisticated hacking
27:05
group operating in China that targeted
27:08
US businesses. We got in touch
27:10
with Anthem about this. A spokeswoman
27:12
there said the company is committed to safeguarding
27:15
customer data and there's no evidence
27:17
that the information stolen in the cyber
27:20
attack resulted in fraud against customers.
27:23
So I know my data is out there, along
27:26
with millions of other people's. I
27:28
don't feel great about it, but at least I know.
27:31
I'm more worried about what I don't know.
27:44
And that's it for this week's prognosis. Thanks
27:46
for listening. Do you have a
27:48
story about healthcare in the US or around
27:50
the world? We want to hear from you. Find
27:53
me on Twitter @mfcortez or
27:55
email mcortez at bloomberg
27:57
dot net. If you're a fan of this episode,
28:00
please take a moment to rate and review us
28:03
It really helps new listeners find the show, and
28:05
don't forget to subscribe. This episode
28:08
was produced by Lindsey Kratochwill. Our
28:10
story editor was Rick Schine. Special
28:12
thanks to Drew Armstrong. Francesca
28:15
Levy is head of Bloomberg Podcasts. We'll
28:17
be back in June with our next episode.
28:20
See you then.