Episode Transcript
0:00
This episode is brought to you by Shopify.
0:03
Forget the frustration of picking commerce
0:05
platforms when you switch your business
0:08
to Shopify, the global commerce platform
0:10
that supercharges your selling wherever you
0:12
sell. With Shopify, you'll harness the
0:14
same intuitive features, trusted apps, and
0:16
powerful analytics used by the world's
0:18
leading brands. Sign up today for
0:20
your $1 per month
0:23
trial period at shopify.com/tech,
0:25
all lowercase. That's shopify.com
0:28
slash tech. Girl,
0:31
real talk. This whole "it's a
0:33
new year, time to reinvent myself" trash is
0:35
not the vibe for 2024. You
0:38
can find someone who loves you for you, as you
0:40
are. You don't need
0:43
to read a stack of self-help books,
0:45
only eat sad salads, or like start
0:47
meditating at 5 a.m. to
0:49
be ready for dating. So, yeah, my
0:51
advice is to download Bumble and
0:53
find someone who embraces you the way you are right
0:55
now. Let me know how it goes.
1:01
Guys, you ever wish that in the middle
1:03
of a conversation that you had a lifeline?
1:05
Oh, most definitely. Yeah, kind of. Yeah, I
1:07
do. The good
1:09
news this week, we have the Attorney General
1:11
of the state of Vermont, Charity Clark, who's
1:13
going to be our lifeline when it comes
1:15
to the scams we all face. And
1:18
with that, welcome to What the Hack, a
1:21
true cybercrime podcast. I'm
1:23
Adam Levin. I'm Beau Friedlander. And
1:25
I'm Travis Taylor. Welcome
1:35
to our show, Attorney General Charity Clark
1:37
from Vermont. Thanks for having me. It's really
1:39
exciting to be here with you today. General
1:41
Clark, how would you like us to address
1:43
you during this podcast? You can call
1:46
me Charity. Excellent. You
1:48
were an attorney at a private firm, you know, a big
1:51
firm in New York. What
1:53
led you to pivot to public service? I
1:55
worked in the same building where
1:57
the Attorney General's office is today, because
2:00
I worked in the Governor's office in Vermont, so
2:03
my career really started out as a public servant.
2:05
I made my way to law school, and
2:07
then private practice, including a stint in
2:09
New York City. And then I made my
2:12
way back to the Pavilion office building.
2:14
When I arrived for
2:16
my interview, my first shot back at the
2:18
Attorney General's office, I walked into the same
2:20
building where I'd been when I was in
2:22
the Governor's office, and saw some of the same people.
2:24
A lot had changed, and a lot was the same.
2:26
That was almost ten years ago now,
2:28
and it's been awesome. Charity,
2:32
I remember when you were in New York City. That's where we
2:34
met. We lived in the same neighborhood; we
2:36
had a mutual friend. When you came
2:38
back, I was happy about that. You had just become
2:40
a big city person, you know? Because I was
2:42
like ten years older. And
2:46
did you go back specifically
2:48
to pursue the Attorney General's office, or did
2:50
you go back to practice in
2:52
Vermont? So I went back because I
2:54
was seven months pregnant. I wanted
2:56
my child to grow up in Vermont because
2:58
it's the best place on earth. So I
3:01
packed up my cases, rented a
3:03
car, and drove back to Vermont in a
3:05
snowstorm. I
3:07
didn't have a job. I decided I
3:09
was gonna buy my own maternity leave.
3:12
And then, when my daughter was five
3:14
months old, I called the Attorney General's
3:16
office, and I got a job down there. So
3:21
I remember when you were floating the
3:23
idea of running for Attorney General. How
3:27
was running that campaign in Vermont? It
3:29
was pretty wild. That was the summer
3:31
of 2022. Vermont
3:34
is a very liberal place, so
3:36
the real elections in Vermont
3:38
generally are the primary. The
3:40
primary in Vermont is, I believe,
3:42
the second Tuesday of August. I
3:44
found out there was gonna be an open
3:47
seat when my then boss, the attorney
3:49
general, announced he wasn't gonna run
3:51
again. So
3:53
I had to quit my
3:55
job, launch my campaign, let
3:58
Vermonters know who I was and what I was about,
4:00
convince them to vote for me, and
4:02
raise a ton of money to do that, all before August
4:04
9th. So it was the
4:07
wildest summer of my life. Sometimes
4:10
I still dream about it because it was
4:12
such a whirlwind. When you only
4:14
have like two and a half months to
4:16
win a statewide election, you can't plan everything
4:19
out perfectly. You have to roll in and
4:21
do your best. And that's not
4:23
a comfort spot for attorneys. We
4:25
like to be prepared and we like
4:27
to have vetted everything. Adam's nodding. It
4:30
was truly a whirlwind. But it was
4:32
also really fun. One of the reasons
4:34
why it was fun is because there were so many people
4:37
running for office. There were a ton of
4:39
open seats. So everywhere
4:41
I went, I would run into other people
4:43
who were all running. In Vermont, we all know each
4:45
other. So it was really, really fun. Like
4:47
we would march in a parade and it would be like marching
4:49
in a parade with all your friends. And
4:52
we had a ball. All
4:55
right. So Charity, we talk a
4:57
lot about scams on this show and we
4:59
know you're a fellow traveler with us. This
5:02
is so true. You're a scam
5:04
fighter and we appreciate that. In fact,
5:06
you just published a list of the
5:09
top scams to watch out for in
5:11
Vermont. Can you tell our listeners
5:13
a little bit about the top scams that
5:15
you've seen? I want to preface this
5:18
by saying I could literally talk to you
5:20
all day about scams. I feel passionately about
5:22
scams. One of the reasons
5:24
is because we know that education is the
5:26
best means of fighting scams. So you're
5:28
going to hear me talking a lot about scams,
5:31
because I'm trying to tell people about them.
5:34
Scammers are largely overseas. We cannot track
5:36
them down. We can't shut them down.
5:38
Our best tool is
5:40
educating each other about scams.
5:45
The number one scam we encountered
5:47
at our consumer assistance program
5:49
in Vermont was the tech support
5:51
scam. We got twice
5:53
as many complaints about that scam as
5:56
number two. So it was by
5:58
far the hottest scam. One of the
6:01
reasons why is because it's very effective. So
6:04
that scam looks like this. You're on your
6:06
computer minding your own business when a pop-up
6:08
comes up and it says,
6:10
hey, you have a virus. Click
6:12
here. And either it's a phishing scam and
6:14
then they're going to try to get a bunch of information from
6:17
you that they can then
6:19
steal your identity with. Or they
6:21
say, I can help you with your
6:24
virus. Let me remotely
6:26
access your computer. And
6:29
once you allow someone to
6:31
access your computer remotely, they
6:33
can find your bookmarks, your bank
6:35
accounts, your emails, all of it.
6:38
We've heard chilling tales from consumers who
6:40
called our office saying they watched
6:43
helplessly as the money was
6:45
getting drained from their accounts and transferred
6:47
to a scammer's account. We've
6:49
also heard a tale of someone who tried
6:51
to click out of the
6:54
pop-up and the scammer hid
6:58
the link that you would click
7:00
on to allow them to access your computer remotely under
7:03
the X. So when I
7:05
get one of those pop-ups, I turn my computer off.
7:07
Even though it's probably legitimate most of the
7:09
time. When I have a problem with my
7:11
computer, I drive it to Best Buy.
7:13
I never even allowed Best Buy to remotely
7:15
access my computer because I hear these chilling
7:18
tales, people losing hundreds of thousands of
7:20
dollars this way. How
7:24
many cases of the tech support scam have you seen?
7:28
Last year it was 531 reports of the tech support scam. And
7:34
that's just within Vermont? Yes, and
7:36
that's just what was reported. We had a person
7:38
within the past few years who
7:41
lost about $600,000 in the tech support scam
7:45
that was the largest one that year. So
7:47
it really sticks in my head, and it
7:49
was a sophisticated person and it was crypto
7:51
that they lost. We see a lot
7:53
of crypto in the top 10 scams when
7:56
we calculate the magnitude of loss. As
7:58
hard as it is, there's no consumer
8:00
protection with cryptocurrency. We
8:04
recently did a program with
8:09
a woman named Marsry, a former government attorney,
8:11
who fell victim to a
8:14
tech support scam. Supposedly another
8:16
member of the scam team was
8:19
affiliated with the bank that she
8:21
banked at, and asked
8:23
whether she had
8:25
noticed fraudulent activity occurring on
8:27
her accounts, or had
8:30
received an email about it, which she never had. They
8:30
basically convinced her to allow them
8:32
to open a Coinbase account in
8:35
her name and that it
8:37
would be much more secure while they were sorting
8:39
out the fraud if she could
8:41
move her assets over to cryptocurrency. She ended
8:43
up moving over $660,000 including her retirement accounts.
8:50
Thankfully this person was a
8:52
former federal employee, an
8:54
attorney who had been involved in many
8:57
different legal situations. So she did know
8:59
how to navigate it and she also
9:01
happened to have the funds to
9:04
pay an attorney to help her and
9:06
she came out relatively whole minus
9:08
the amount of money she had to spend
9:10
on legal fees to rectify
9:13
the situation. Tech
9:15
support scams are shockingly
9:18
effective. If they
9:20
hack into the right person they're getting
9:22
a lot, maybe millions of dollars. So
9:25
what else do you have on your list that maybe
9:27
our listeners aren't aware of? I
9:30
have my, I call it my Charity Clark
9:32
red flag formula of what a scam is.
9:35
It's when someone you don't know is trying
9:37
to convince you to give them money urgently
9:40
and there's a sense of emotion.
9:43
So in that case the emotion is concern,
9:45
and that's part of why it's so effective.
9:47
It's like oh my gosh I have a
9:50
computer virus. I don't know anything about computers
9:52
and you're all worried. You're not thinking, you're
9:55
feeling. So you see
9:57
that all the time with scams people are worried
9:59
they're concerned. or the fake package scam.
10:01
You know, when someone's doing it as a phishing
10:03
scam, they say like, oh, you're getting a package.
10:06
You're excited. You're like, whoa, someone mailed me something.
10:08
Or I ordered something and I forgot. So those
10:10
emotions, that's the starting point
10:12
with an effective scam: it gets you in your
10:14
emotions. So we
10:17
have a variety of
10:19
scams, but they're all really familiar
10:22
to me, except there's one new scam
10:24
on our top 10 list this year.
10:26
It's the rewards credit scam. We're
10:29
hearing about this. It comes via email
10:31
from a supposedly legitimate retailer saying, great
10:34
news, you've got rewards credit. And it's
10:36
a phishing scam. Click here and fill
10:38
out this information so that we can
10:40
get you your rewards credit. But there's
10:43
no rewards credit. It's not really that
10:45
major retailer. We got a lot of
10:47
reports on that scam, 132 reports.
10:51
Our largest was 531, but that's a lot. So I
10:54
wanted to flag that because I think that might be an
10:56
up-and-coming scam and it's on the rise. And
10:59
what do you think they're after in that scam? I
11:01
think they want your social security number and your
11:03
address, your name, the things that they could use
11:06
to open up a bank account, credit card and
11:08
steal money that way. Which
11:10
is why we always say freeze your credit
11:13
because even if they do have your social
11:15
security number, it's gonna stop a lot of
11:17
problems from happening. As
11:19
attorney general, what is it exactly that you do
11:22
about scams? I would say that
11:24
our real focus when it comes to scams is
11:27
our consumer assistance program. And every
11:29
state has essentially a consumer assistance
11:31
program. But in Vermont, it's located
11:33
at our university. The
11:36
people who are operating phones
11:38
and answering emails are students
11:40
who take a class called
11:43
consumer law in action. And
11:45
we have other consumer classes that we teach.
11:47
I've taught these classes before at UVM. And
11:50
we train them to be
11:52
little consumer lieutenants going out in
11:54
the world, knowing all about consumer
11:56
law and rights. They also are
11:58
helping people with their consumer concerns.
12:01
Many of those concerns are not scams.
12:03
It's legitimate companies who did something
12:06
wrong, there was a miscommunication, you didn't get
12:08
what you were expecting. We call those complaints.
12:11
Then there's the scam reports. We
12:13
are using the scam reports to
12:15
educate people like I just did
12:17
telling you about the new rewards
12:19
credit scam. But also
12:21
we have a team
12:23
in place at the Consumer Assistance Program
12:26
who focuses on robocall scams. I'd love
12:28
to tell you a little bit about
12:30
our robocall team. Robocall
12:32
scams, just like most
12:35
scams, occur overseas and you feel
12:37
kind of helpless. How do you go after
12:39
them? We figured out a way. We
12:41
have gone after the telephone
12:44
carriers who are located here in
12:46
the United States and who facilitate
12:48
the scam. With
12:50
our friends in the federal government,
12:52
during the Social Security scam, they
12:55
operated with us as a team,
12:57
and they did what's called a
12:59
traceback, where you trace back
13:01
the number, and,
13:03
you know, within 24 hours you can see where
13:05
the number came from. We have now settled with
13:07
a few companies. One of them just went out
13:09
of business after we settled. We also have some
13:12
investigations going with those companies, some of the
13:14
worst actors that we see. The way that
13:16
we can tell it's a scam is the quantity
13:18
of calls that are less than a minute.
13:21
That's kind of our first clue; then we do
13:23
our own investigation and learn more.
13:25
These scams are so prevalent, but
13:27
it feels good to be doing
13:29
something, to be addressing the
13:31
harm that we know is there, and
13:33
to be saying we don't have a tolerance for
13:35
you directing scam robocalls into Vermont.
13:44
So, Bo and Adam, you guys know I'm
13:46
a bit of a privacy geek, if you
13:49
will. Oh yeah. Yeah, you are. Yeah, totally.
13:51
I really just don't like the idea that
13:53
just about anyone can find you online, can
13:55
find out where you live or your email
13:57
address, your phone number, anything. I just think
13:59
that entire idea is super creepy. There's so
14:01
much of my data already out there, but
14:03
is there something that you can do? Yeah,
14:05
actually you can use delete me. Delete me
14:07
is a service that pretty much does the
14:09
heavy lifting for you where they go to
14:11
all the data brokers that they have on
14:13
file and just pull your data
14:15
and delete it on a regular basis. I
14:17
use it, I like it, and they make
14:19
it quick, easy, and safe to remove your
14:21
personal data online. These data
14:23
brokers can accumulate huge amounts of your
14:25
personally identifiable information. And if all that information
14:27
gets into the hands of a bad actor,
14:29
that opens you up to a lot of risk.
14:32
And if you act now, you can get 20% off your delete me
14:35
plan when you go to
14:37
joindeleteme.com/WTH and use promo code
14:40
WTH. The only way to
14:43
get 20% off is to
14:45
go to joindeleteme.com/WTH
14:47
and enter promo code WTH
14:50
at checkout. Just
14:52
joindeleteme.com/WTH, promo
14:54
code WTH, which stands
14:56
for What the Hack. And we thank
14:58
you for supporting DeleteMe and What
15:01
the Hack. Lauren.
15:04
Mike. So we host a podcast for
15:06
Wired called Gadget Lab. We do. We
15:09
do. Yes, that is
15:11
correct. Tell the good people some more about
15:13
it. Well, I think the good people should
15:15
definitely tune in every week because they get
15:17
to hear me roasting you. I know. All
15:20
right. No, really what Gadget Lab is, is
15:22
Mike and I tackling the biggest questions in
15:24
the world of technology. I like to think
15:26
of it as the best of Wired's journalism,
15:29
but in audio form. We cover the big
15:31
news of the week in Techland, but we
15:33
also offer our expert analyses and opinions on
15:35
all things consumer tech, whether that's mobile apps,
15:38
hardware, startups, cryptocurrency. Mike, what's
15:41
been a recent highlight episode
15:43
for you? We did a deep
15:45
dive on the group behind the massive Okta hack.
15:47
We also had a great conversation about Web3 and
15:50
the Metaverse. What stands out for
15:52
you? Never Metaverse you didn't like. I
15:55
really enjoyed our recent podcast about Peloton.
15:58
And recently, the legendary tech journalist Kara
16:00
Swisher joined us to talk all about Elon Musk
16:02
and the future of Twitter. So I guess we
16:04
should tell people how they can listen to our
16:06
pod. We release a new episode of Gadget
16:08
Lab every week, and you can listen and
16:11
follow us on Apple Podcasts, Spotify, or
16:13
wherever you get your podcasts.
16:20
We talk a lot on the show about
16:22
facial recognition and the
16:24
issues with facial recognition technology, and
16:26
the Vermont Attorney General's
16:28
office is taking on a lot
16:31
of privacy issues associated with facial
16:33
recognition. Tell us about that. I
16:36
feel strongly about this issue. There's
16:38
two areas to think about. One
16:40
is legislation that I'm advocating for, and
16:42
I'm hopeful it'll get across the
16:44
finish line during the current legislative
16:46
session. And for Vermont, the second is a
16:48
lawsuit that we have filed against a
16:50
company in facial
16:53
recognition, which is called Clearview AI.
16:56
Clearview AI is a startup company.
16:58
They screen scraped the internet
17:01
for photographs. They use
17:03
an algorithm to create a
17:05
facial recognition system that can
17:07
identify people. They have now
17:09
amassed tens of
17:11
billions of photographs, and apparently
17:14
are trying to faceprint everybody on earth.
17:17
We view this as
17:19
a violation of privacy
17:21
and have sued through
17:23
the unfairness prong of the
17:25
Vermont Consumer Protection Act. We
17:28
sued in March of 2020. Since that
17:30
time, the pictures in the Clearview database
17:32
have just grown and grown, and one of
17:34
the things I want to emphasize is someone
17:37
listening to us may be thinking, well, you put
17:39
your profile picture on your Facebook page and it's
17:41
public. That's not all.
17:44
Their algorithm does not discriminate and say,
17:46
oh, this is a Facebook profile. It
17:48
literally just looks for pictures. So say
17:50
I am sitting on a bench in
17:52
a park eating a sandwich, in my imagination
17:54
it's like a sloppy egg salad sandwich
17:56
and I look ridiculous sitting on the
17:58
park bench, and I'm in the back of
18:01
a tourist photo. I don't know my photo's
18:03
being taken, and yet it's being screen scraped in
18:05
Clearview's database. I have no idea it's there. I
18:07
think I'm having a sweet moment with my family,
18:09
having a picnic or something. And the other thing
18:12
about it is they're making money off of my
18:14
face. And I'm not getting a cut of
18:16
that money. That bugs me too. The
18:19
whole thing bugs me. Now, when I
18:21
think about what Clearview AI is doing,
18:23
I have different issues with it. Even
18:26
my Facebook profile picture. When I
18:28
posted it, I did not read
18:30
the privacy policy that Meta, then
18:32
Facebook, published about the use of
18:35
my photograph. I was
18:37
squeamish about seeing it pop up in
18:40
Google searches. We live
18:42
in a wild west of digital
18:44
commerce that is based
18:46
on our data. And
18:48
that is deeply problematic, not only because I
18:51
agree with you, I should get a cut.
18:53
If you're gonna use my picture to bolster
18:55
a database, you better give me some money
18:57
for it. But
18:59
the other thing is privacy and the
19:01
expectation of privacy. That's the strongest constitutional
19:03
way to go after these companies. But
19:05
the second one is intellectual property. Because
19:08
the photograph's mine. That's my
19:10
photograph. I have in theory the copyright
19:12
to that photograph. So how
19:14
about that? How about just sending them a big
19:17
bill? Can you do that, Charity? I'll
19:20
file an amicus brief in your case, Bo. Thanks,
19:23
I want at least 10 bucks. And
19:26
you can refer to the NIL
19:28
program for the NCAA. Yeah,
19:31
I was thinking name, image and likeness. And it's sort
19:33
of, you know, so it does come down to that.
19:36
There's another program that we learned about through
19:38
Kashmir Hill, the one where you could input
19:40
a picture of yourself and see how many
19:42
times it pops up. Yeah, that one was
19:45
pretty spooky, I think. That was Pim Eyes.
19:47
Correct. And you know what, the funny thing
19:50
is, I put you on Pim Eyes and I put Travis on
19:52
Pim Eyes, I put me on Pim Eyes, everyone. And
19:54
some of it's dumb, right? Because
19:57
I live with somebody who has their picture out
20:00
there, and when it popped up it was like,
20:02
yeah, that's this person, whatever. With you, same thing:
20:04
it was just like how many times you've been
20:06
on TV or whatever. With me it was a little
20:08
more concerning, and with Travis
20:10
it was deeply concerning, because Travis is
20:12
not, I'm like, I've been in
20:14
and out of public this or that, but Travis hasn't,
20:16
and so he was all over the place.
20:18
I could figure out who he was. I don't know
20:21
why, I think I may have just been insulted, but
20:23
okay. It got you. It didn't
20:28
get you very well, but I
20:30
found you on there. What does
20:32
the Attorney General of the state
20:34
of Vermont think about that? I am
20:36
disturbed by this, and I would just
20:38
say, you know, in Vermont privacy is
20:41
part of our ethos here.
20:45
About a hundred years ago, Vermont was
20:47
given the opportunity to develop our
20:49
Green Mountains for tourism, and town
20:53
meetings across the state were like, no
20:55
thank you, we're not gonna spoil our
20:57
beautiful Green Mountains. And maybe
21:00
50 years after that, we were
21:03
probably the first state, I don't know if we're
21:05
the only state, to ban billboards, because
21:07
our beauty is not for sale, our communities
21:09
are not for sale, and our privacy should
21:11
not be for sale. That's
21:14
just kind of where we are in Vermont.
21:16
So I feel like you asked me for
21:18
my opinion, but I think the important thing
21:20
is my opinion is just reflecting a
21:22
Vermont value, which is we all deserve
21:25
our privacy and we deserve to control
21:27
how our faces are used. It
21:30
really bugs me that someone would
21:32
be making money off my data or
21:35
my face, because it feels very
21:37
exploitative and unfair. Yeah, also, just,
21:39
you know, from the state that brought you
21:41
the floating bridge of Brookfield. I
21:45
went to Mount Mansfield, to the top of it, because
21:47
it's so beautiful. I'm obsessed with Mount
21:50
Mansfield. It's so beautiful and majestic.
21:57
now a consumer group is calling on California to
21:59
take action against the company for its
22:01
facial recognition technology. The nonprofit Consumer
22:03
Watchdog sent letters to State Attorney General
22:06
Rob Bonta and the California Privacy Protection
22:08
Agency. Watchdog accuses
22:10
the Clearview company of creating
22:12
an artificial intelligence facial recognition
22:14
software that violates privacy laws.
22:17
Concerns include mining personal data of
22:19
children and inaccurate facial recognition of
22:22
African Americans. Clearview promotes its technology
22:24
as high quality algorithms that are
22:26
accurate and help monitor... What's
22:31
the upshot of your case against Clearview? We
22:34
are in motions practice right now and
22:37
trial is set for later this year. I
22:40
believe very strongly that our
22:42
Consumer Protection Act protects Vermonters
22:44
against this. I think
22:46
it's important that we advocate in the legislature
22:49
for changes to our laws
22:51
that will expand and address some of
22:54
these changing technologies in a way that's
22:56
maybe more specific than the Consumer Protection
22:58
Act. And it also
23:00
is important to raise awareness about these
23:02
issues. I don't think a lot of
23:05
people realize how much their data is
23:07
being collected. I don't think they know
23:09
what a data broker is. I don't think they realize
23:11
when they get a free app on their phone, it's
23:13
not free, they're trading their data for it. All
23:16
of that, I think, is something that we should be
23:18
continuing to talk about so people can realize what's happening
23:21
and what the potential impacts are because the
23:23
potential impacts are really disturbing as well. Not
23:25
only is it your data, there could
23:27
be a data breach and now your data is in
23:29
the hands of a bad actor, but
23:32
there's also a history
23:34
of outcomes that are discriminatory
23:37
for facial recognition software, some
23:39
of which are incredibly disturbing, people
23:41
getting arrested for being mistaken for
23:44
another person that they very plainly are
23:46
not. So it is
23:48
really disturbing. And as a woman,
23:50
the potential misuse of this
23:53
kind of technology for women
23:55
who are experiencing domestic violence or
23:57
intimate partner violence, and so
23:59
forth, also are very disturbing to
24:01
me. Well, just to start off,
24:03
I agree with you completely about the
24:05
privacy concerns here, but just to play
24:08
devil's advocate, are there actually legitimate uses
24:10
for Clearview technology? Well,
24:12
the use that is
24:14
the most compelling is that
24:16
law enforcement can use
24:18
the facial recognition software to
24:20
find criminal actors. Like anything
24:22
in public policy, we need
24:24
to find out where the
24:26
right balance is. Should all
24:28
of us have our
24:30
privacy sacrificed so that it takes
24:33
less time to find a
24:35
criminal actor? I don't
24:37
think we've really had an opportunity to have
24:39
that robust conversation because Clearview
24:43
started acting before really
24:45
investigating what the outcomes might be.
24:47
And in fact, when they
24:49
started, they weren't focused on law enforcement.
24:51
They were focused on billionaire potential
24:54
investors. They were trying to convince to
24:56
invest in their startup, and
24:58
they weren't focused on the public good. They
25:01
were focused on their bottom line. They
25:03
realized they'd better just focus on the convincing
25:05
thing. But when you look around the country
25:08
at the lawsuits against them, it's really disturbing.
25:10
It's actually so fascinating, too. When you
25:12
think about other countries' experience
25:14
with privacy and how they're responding
25:17
to things like this, this threat,
25:19
it's really fascinating to think about
25:21
their history and their context, like
25:23
in Germany, in Europe, and how they're approaching
25:25
this compared to our country. It's almost
25:27
like people are being naive and they're not
25:30
as worked up as they could be because
25:32
they don't have the historical context that
25:34
they have in other places. Greenlight
25:48
is a debit card and money app made for families. It gives
25:50
kids and teens an easy and fun way to
26:00
financial literacy while giving parents,
26:03
boy do we need it, peace of mind.
26:05
You can send instant money transfers, automate allowance,
26:07
and keep an eye on kids spending with
26:09
real-time notifications. And meanwhile, your kids can begin
26:11
their journey toward financial autonomy by learning how
26:13
to save, invest, and spend wisely. So that
26:15
means they will not be living in your
26:17
basement. It's the easy, convenient way to get
26:19
kids on the right path. Sign up for
26:22
Greenlight today and get your first month free
26:24
when you go to greenlight.com slash
26:26
Adam. That's
26:29
greenlight.com/Adam to try Greenlight
26:31
for free. That's greenlight.com
26:34
slash Adam. Girl,
26:38
real talk. This whole
26:40
"it's a new year, time to reinvent myself"
26:42
trash is not the vibe for 2024. You
26:46
can find someone who loves you for you as you
26:48
are. You don't need
26:50
to read a stack of self-help books,
26:52
only eat sad salads, or like start
26:54
meditating at 5 a.m. to be
26:56
ready for dating. So yeah, my advice
26:58
is to download Bumble and find someone
27:00
who embraces you the way you are
27:02
right now. Let me know how it goes.
27:09
Charity, I know that Vermont is
27:11
moving forward. You've been a big
27:13
advocate of more stringent privacy regulation
27:15
in the state. There are a
27:17
number of states that have fairly
27:19
strict privacy laws, California being the
27:21
strictest, which is modeled
27:24
after the GDPR in Europe. But
27:27
what I worry about is we have this
27:29
patchwork of privacy laws in this country where
27:33
ultimately would it be better if there
27:36
were a federal law? Of course, the only
27:38
issue is that a federal law tends to get
27:40
so watered down by the time it shows
27:43
up. Bingo, bingo. That is
27:45
the problem. If California's laws could
27:47
be national, that would be wonderful.
27:50
The beauty of federalism is every state
27:52
has its own philosophy and ethos, and
27:55
this federalism that we have allows for that
27:57
diversity. So yes.
28:00
There's some good to having a patchwork,
28:02
but I don't think the industry,
28:05
the privacy advocates, or anyone else
28:07
thinks it's better to have this patchwork.
28:09
It's very cumbersome for all, but
28:12
it does allow for that diversity of
28:14
viewpoint. And so I want
28:16
to acknowledge that the bigger problem is
28:19
our Congress struggles to
28:21
pass legislation. It's not just that
28:26
it gets watered down; it never happens. That is,
28:26
I think, a larger problem. I
28:29
would never say to any state
28:31
legislature, don't worry, Congress will take care of
28:33
this. No, you should act if you care
28:35
about something. So honestly, that's a bigger problem.
28:37
Well, we know that Congress can't agree on
28:39
the day of the week. And a few
28:42
years ago, I was in Washington, and
28:44
I said to them, where are we on
28:46
privacy? And the response was, we'll get some
28:48
activity on security. But nobody
28:51
can seem to agree what privacy
28:53
is here in Congress. So that's
28:55
pretty disturbing. One thing
28:57
I've heard you mention before is data minimization.
28:59
Can you explain what that is? Well,
29:02
the idea of data minimization is, it
29:06
begins with an acknowledgement that data breaches
29:08
are very common. We have a data
29:10
breach notification act in Vermont that's been
29:12
in place for 16 years, and the
29:14
notifications come to my office. Little
29:17
Vermont with our 640,000 people,
29:19
we get a data breach notification pretty much
29:21
every day. It's challenging to
29:23
protect yourself from a data breach. So
29:26
acknowledging that, doesn't it
29:28
make sense to minimize the data that you
29:31
have? So if there is a data breach,
29:33
there's less harm done. Data
29:35
minimization is a best practice
29:37
to protect our data from
29:39
potential data breaches. And
29:41
that means you should only collect data
29:44
you need, you should only use
29:46
it for the purpose you were collecting it for, of course,
29:48
and you should only store it as long as you need
29:50
it. When I'm filling out forms for my
29:53
daughter's school and they ask me for her
29:55
social security number, I'll just leave that blank. And
29:57
I think they're gonna have to beg me for it.
29:59
Why do they need her social security number?
30:01
So I don't give data when people
30:03
ask me. When I log on
30:05
to the American Girl Doll website to buy a
30:08
Christmas present and they ask me for my birthday
30:10
to make sure I'm a grown-up, I just make
30:12
up a date. And I always make
30:14
sure I say I'm younger than I am.
30:16
But yeah, I don't give data over and
30:18
then sometimes I'll get an email on some
30:20
random day that's now my birthday. Happy birthday,
30:23
Charity! Would you like to... There's a sale
30:25
on the American Girl Doll. Anyway, so data
30:27
minimization is basically that philosophy. It was codified
30:29
in California a few years ago and I'd
30:31
love Vermont to do the same. I think
30:34
that it helps us all if we
30:36
are acknowledging that best practice, putting it
30:39
in law and then following it.
30:41
When you go to the office of a
30:43
professional, in particular doctors and dentists, they'll ask
30:45
you for your social security number. We
30:47
don't give it. Well, I say you don't need it
30:49
and they say we do and I say no, no,
30:51
no. You need it to find me if I don't
30:53
pay my bill and there's other ways to find me.
30:56
So you have to be Clark
30:58
Kent when it comes to these
31:00
digital entities that want your information. Put the
31:02
glasses on, put the suit on, hide the
31:04
cape. It's none of their business who you
31:06
are. They don't need to know. We've
31:11
heard a lot of stories of people's
31:13
identities being used against them. Everything from
31:15
deep fakes to sextortion. People
31:18
feel scared and people feel like there's no way
31:20
to protect themselves. I
31:22
mean, this is something
31:25
that I have been really
31:27
concerned about. Sextortion, yes,
31:30
but in general there could be
31:32
so many ways that this could
31:34
be damaging: non-consensual pornography in
31:36
that sphere, but also in the
31:38
commercial sphere as well. In
31:40
my opinion, there's
31:42
been a lot of focus on the use
31:45
of deep fakes in a political context,
31:47
for misinformation, election tampering, stuff
31:49
like that. But I'd love to
31:51
touch on the commercial impacts. I am
31:54
concerned about the potential
31:56
impacts that deep fakes are going to have on
31:58
the marketplace. Remember when we were
32:00
younger and people bought
32:02
and sold things on the phone? Someone
32:04
would call you and try to sell
32:06
you something, and then robocalls
32:08
came along with their scams, and now
32:11
no one even answers the phone. In
32:13
fact, the effect that robocalls had
32:15
on buying and selling
32:18
on the phone was
32:21
devastating. Can you imagine
32:23
the use of deep fakes
32:26
like this? Recently The New York Times
32:28
reported that there was a Taylor Swift
32:30
deep fake being used in a phishing scam,
32:32
a fake giveaway for Le Creuset cookware.
32:35
Did you read that? A month or two
32:37
ago, Tom Hanks had to issue a
32:39
statement saying, this dental company is
32:41
using my image to make a deep fake
32:43
to try to sell their dental product.
32:46
That's not me, I didn't consent to this,
32:48
I'm not promoting this product. Imagine if
32:50
stuff like that keeps happening and it
32:52
erodes the sense of trust among
32:54
consumers about buying
32:57
things online. Imagine if
32:59
this deep fake technology had a
33:02
destabilizing impact on the buying and
33:04
selling of things on the internet.
33:07
So I think we
33:09
should be, you know, throwing down
33:11
the red flag to say this needs
33:13
to be addressed, because
33:16
the impacts are potentially devastating.
33:20
What can an attorney general do
33:22
to pump the brakes on this kind
33:24
of behavior? Is it
33:26
lawsuit-based? What is it?
33:30
I would say it's two things.
33:32
One is, at least
33:34
in Vermont, we work really closely
33:36
with our wonderful legislature. And
33:38
in part that's because they are
33:40
a part-time citizen legislature. So they
33:42
are there a few days a week
33:44
from January to May, maybe June. And
33:47
they don't all have a staff of their
33:49
own. Instead, they share a
33:52
bunch of lawyers, and each committee has
33:54
a legal assistant. They don't have their own team,
33:56
they don't have chiefs of staff, they don't have a
33:58
policy director, they have no one like that. So
34:00
we often will be called in, testify or
34:02
weigh in on bills, have a lot of
34:04
subject matter expertise. So we kind of support
34:06
them in that way. The other is
34:09
trying to pursue action
34:11
against bad actors. Scammers
34:14
are very hard to find. But
34:16
there's a lot between a scammer
34:18
and a legitimate company. We have
34:20
a consumer unit with lawyers who
34:22
pursue actions against bad
34:24
actors, largely using the Consumer
34:27
Protection Act, which forbids unfair
34:29
and deceptive acts in commerce. And
34:32
that is a main focus of
34:34
our consumer unit and a tool that we have to fight
34:36
this. There's one other thing that disturbs
34:38
me when you talk about deep fakes. And
34:41
that is a competitor can
34:43
destroy another company by
34:46
simply creating a deep fake of
34:48
the CEO of that company making
34:50
some outrageous statement that
34:53
could cause real harm.
34:55
And how do you fight that? I
34:58
mean, it's not even just having a destabilizing
35:01
impact on like the marketplace, it could literally
35:03
have a disabling impact on the stock market
35:05
for a reason like you just articulated.
35:07
And what guardrails are in place? Because the
35:10
Consumer Protection Act, it's a great
35:12
law and we use it all the time.
35:15
But it doesn't have those very
35:17
clear guidelines. Thankfully,
35:19
President Biden has put in place his
35:21
executive order relating to AI
35:24
that is going to hopefully do work
35:26
in this arena. And he's also called
35:28
on Congress to take steps in this
35:30
arena. But I don't think any state
35:32
legislature should wait. We should all be
35:34
educating ourselves and acting so that we're
35:36
following the best practices. We're talking about
35:39
the visual deep
35:41
fakes. But there's also audio deep
35:43
fakes, which I find concerning in
35:45
a way, they're easier because it's
35:47
just a voice, not a voice
35:49
and a face. And it
35:51
seems like they're so easy to
35:53
make now that it's really disturbing.
36:03
Listen, I know we've taken an enormous amount
36:05
of your time. We deeply appreciate the time
36:07
you spent with us. Charity, any
36:09
final thoughts before you go? You guys
36:11
probably already talked about this, but I want to talk
36:13
about gift cards because
36:15
gift cards are
36:17
a very common tool that scammers use.
36:19
I'm not talking about an L.L. Bean
36:22
gift certificate. I'm talking about those cards
36:24
you get at supermarkets that are like
36:26
Visa. But if someone is
36:28
asking you for a gift card, it's a scam. A
36:31
payment via gift card is absolutely a scam.
36:33
Gift cards are for gifts, that's the motto.
36:35
Your listeners probably know that already, but you
36:37
have it there if it's useful to you
36:39
because I think it is really important. It's such
36:41
a common way that people lose money. Again,
36:48
we cannot thank you enough for this. This was
36:50
great. We'll be looking out for all the great
36:52
things that you're going to continue to do in
36:54
Vermont. Well, thank you guys so
36:56
much for having me. This has truly been really fun.
36:58
I love talking about this stuff with people
37:01
who are so knowledgeable about it. It's fun. Great.
37:04
And run for governor, please, so that we
37:06
can get you into the White House. Oh,
37:08
God. I'm not leaving Vermont. It's too nice
37:10
here. See you later. Bye, guys.
37:12
Bye, Charity. Thanks so much. Thank
37:15
you. Bye. Okay,
37:17
guys. That was phenomenal. Now
37:19
it's time for the Tin Foil Swan,
37:22
our
37:29
paranoid takeaway to keep you safe
37:31
on and offline. Okay. Peter
37:34
Thiel, not my favorite person. I can't believe
37:36
I just said that. Now I'm a dead
37:38
man. Clearview AI is certainly creepy
37:40
technology. I mean, it does have
37:42
its legitimate uses, but the thought of having your likeness
37:45
be indexed and cataloged just it opens up a whole
37:47
lot of cans of worms. Privacy
37:49
worms? Okay, guys. You're
37:52
both grossing me out. But let's just call
37:54
them privacy concerns. I like privacy worms, regardless.
37:56
I think we should come up with some
37:58
tips for foiling Clearview
38:01
AI and other facial recognition
38:03
systems. What do you got? Well,
38:06
you could cover your face with a
38:08
burqa or a balaclava, can't you?
38:10
No. Remember Bertillon? So Bertillon was
38:13
this French guy who was an inspector or
38:15
something who figured out how to catalog criminals.
38:17
And it was all just measurements, like the
38:19
length of your elbow to your wrist and
38:21
the distance between your eyes. And these facial
38:23
recognition things work the same way. So if
38:25
your eyes are visible and your mouth is
38:27
visible, they may still be
38:29
able to get you. Strictly speaking, we're sort
38:31
of covering two different questions here. One
38:34
is how to foil facial recognition systems in
38:36
person, and the other is how not to show up
38:38
in Clearview's database. All right, so I'm
38:41
still thinking of this like traffic cameras, can't
38:43
you just like glue a leaf to
38:45
your license plate? Yeah, I mean, for instance, there's
38:47
some really weird makeup you can put on your
38:49
face that can throw off like your phone's facial
38:51
recognition. When it comes to systems like
38:53
Clearview, it's another matter entirely. Like they are very good
38:55
at picking things up. And
38:57
if you're not trying to raise suspicion,
39:00
cubist makeup and beads isn't the way
39:02
to go. I agree. What
39:04
else you got, Travis? Well, there are
39:06
sunglasses that block infrared cameras. And there's
39:08
clothing with really weird patterns that can
39:10
throw off facial recognition by breaking up
39:12
your image. So they look hideous,
39:15
but they do have the effect of making a
39:17
camera think they're seeing a giraffe or a dog
39:19
rather than a person. This brings us back to
39:21
the problem of what to do if you're already
39:23
in a facial recognition database. One of the primary
39:25
data sources that they use is from photos posted
39:27
online. So if you have a social media account or
39:30
if you're just in the background of a picture like
39:32
Charity mentioned, and that's been posted online, it's safe to
39:34
assume that you're on the grid. And
39:36
you can't opt out, I'm guessing? Well,
39:38
the EU and states like California
39:40
with privacy legislation require you
39:42
to be able to do it. Yeah, and Clearview.ai does
39:44
have a form you can fill out to get your
39:47
picture removed, but you need to supply an image of
39:49
yourself. And again, if you happen to be caught on
39:51
camera, you go right back to being on their database.
39:54
Also, I don't like
39:56
the idea of having to send a picture of myself to
39:58
get a picture of myself removed. Right.
40:00
What's the message here? People are just
40:03
shit out of luck? Scandalous.
40:05
I can't help it.
40:08
It's a shit out of luck moment. Well,
40:10
the message is that it's important for people
40:13
in government like charity to keep working on
40:15
greater privacy protections. And you can do a
40:17
few tricks like the ones we talked about.
40:20
But it's too big for one person to do alone. Also,
40:22
Elizabeth Warren, your buddy, is
40:26
trying to pass a digital version
40:28
of the CFPB which would protect
40:30
people. People's privacy specifically. But right
40:33
now, I think
40:35
we're kind of shit out of luck. And
40:38
that's our tin foil swan. What
40:50
the Hack with Adam Levin is a production
40:52
of Loud Tree Media. You can find us
40:54
online at adamlevin.com and on Instagram, Twitter, and
40:56
Facebook at AdamK11. Loud
41:07
Tree.