Chinese Surveillance: The Josh Chin and Liza Lin Interview

Released Wednesday, 9th August 2023

Episode Transcript

0:00

Hello and welcome to the Kroger

0:02

Show. At Kroger, everyone wins

0:04

when it comes to saving big. Because when you order online

0:06

through the Kroger app, you get the same great

0:09

prices, deals, and rewards on

0:11

pickup or delivery that you do in-store, with no

0:13

hidden fees or markups. So no matter

0:15

how

0:15

you shop, you'll always save big at Kroger.

0:18

Kroger, fresh for everyone. This

0:20

Thursday through Saturday only, during our three-day

0:22

sale, save $10 on a total purchase

0:25

of $75 or more with your digital coupon.

0:27

Kroger,

0:28

fresh for everyone.

0:41

Welcome to Gaslit Nation. I am your host,

0:43

Andrea Chalupa, a journalist and filmmaker

0:46

and the writer and producer of the journalistic

0:48

thriller, Mr. Jones, about Stalin's

0:50

genocide famine in Ukraine. The film the

0:52

Kremlin doesn't want you to see, so be sure to watch

0:54

it.

0:55

We are running a very special summer series

0:58

called The Future of Dictatorship:

1:00

What's Next and Ways to Resist.

1:02

The series features leading voices on the front

1:04

lines of understanding AI, corporate

1:06

surveillance, Silicon Valley greed, and more.

1:09

Because the dictator's playbook remains the same, the

1:12

technology changes, and we wanted to talk

1:14

to some of these big leaders trying to

1:16

understand these changes about how

1:18

to protect ourselves.

1:20

You can learn more about the dictator's playbook

1:22

from the graphic novel from Gaslit Nation,

1:25

Dictatorship: It's Easier Than You Think.

1:27

And guess what? We're having

1:30

a new Gaslit Nation night out. Thank you to

1:32

everyone who joined us at Caveat, but we've got

1:34

an all new event coming up, and that will be September

1:36

18th at P&T Knitwear,

1:39

an independently owned bookstore

1:41

on Orchard Street in Manhattan.

1:44

This time, our wonderful friend, Russian

1:46

mafia expert, Olga Lautman, will be joining

1:49

me for a live taping of Gaslit

1:51

Nation at P&T Knitwear.

1:53

You can join us Monday, September

1:56

18th at 7 p.m. The event

1:58

is free. For details, go

1:59

to gaslitnationpod.com and you'll

2:02

see the link right on our homepage at

2:04

gaslitnationpod.com. So that's 7

2:06

p.m. September 18th, P&T

2:09

Knitwear on Orchard Street for a

2:11

live taping of Gaslit Nation with Olga

2:13

Lautman.

2:14

We'll be back with all new episodes of Gaslit

2:16

Nation in September, including a live

2:19

taping with Terrell Starr of the Black

2:21

Diplomats podcast, reporting from

2:23

Ukraine on Tuesday, September

2:26

12th at 12 p.m. Eastern for our

2:28

supporters at the Truth Teller level and

2:30

higher on Patreon. Come join us

2:32

for that and drop questions in the chat and

2:34

hope to see as many of our listeners as

2:36

can make it on September 18th in New York

2:39

at P&T Knitwear for a fun night out.

2:41

There won't be a live stream for this, but we'll record

2:43

what we can and hope to share it with you on

2:45

the show if it's any good. And

2:49

before we get to this week's guest, here's a quick

2:51

word from our sponsor, Judge Lackey,

2:53

the narrator of the new Gaslit Nation

2:55

graphic novel, Dictatorship: It's Easier

2:58

Than You Think.

3:02

Want to live forever? Achieve immortality

3:05

thanks to symbols and slogans. From swastikas

3:08

to MAGA hats, every dictator needs to

3:10

have consistent branding. Create

3:12

a sense of belonging, targeting your scapegoats.

3:15

Learn more from the hottest branding guide in

3:17

town. Dictatorship. It's easier than

3:19

you think. Almost too easy.

3:23

This week's guests are Josh

3:26

Chin and Liza Lin. Josh Chin

3:28

is the Deputy Bureau Chief in China

3:30

for the Wall Street Journal. He previously covered

3:33

politics and tech in China as a reporter

3:35

of the newspaper for more than a decade.

3:37

He led an investigative team that won the Gerald

3:39

Loeb Award for international reporting in 2018

3:42

for a series exposing the Chinese government's

3:44

pioneering embrace of digital surveillance.

3:47

He was named a National Fellow at

3:50

New America in 2020 and is

3:52

a recipient of the Don Bolles Medal, awarded

3:54

to investigative journalists who have exhibited

3:56

courage in standing up against intimidation.

3:59

Surveillance State is his first book. Born

4:02

in Utah, he currently splits time between

4:04

Seoul and Taiwan. Liza

4:06

Lin works as a journalist covering

4:09

data use and privacy for The Wall Street Journal

4:11

from Singapore. Liza was part of

4:13

a team that won the Loeb in 2018. Prior

4:16

to The Wall Street Journal, Liza spent nine years

4:18

at Bloomberg News and Bloomberg Television. Surveillance

4:21

State is her

4:22

first book.

4:29

So we are here to talk about Surveillance

4:31

State: Inside China's Quest to Launch

4:33

a New Era of Social Control

4:35

with the authors Josh Chin and Liza

4:38

Lin. Thank you so much for this incredible

4:40

book. I was raised

4:42

by a grandfather who

4:45

lived through Stalin's purges. He spent

4:47

about a year in a Soviet

4:49

prison, a communist prison where he was tortured.

4:53

And the surveillance state, the big brother Orwellian

4:55

nightmare that he lived through, so

4:58

many of those family stories are coming

5:00

back, reading your incredible book. It's

5:03

just fascinating and complex, and

5:05

I'm so grateful that you wrote it. And I'm sure

5:07

it was a difficult, very challenging

5:10

book to write, given the sensitivity

5:12

of it and the people that you had to talk to. Could

5:15

you talk just a little bit about that, like how

5:17

you sort of navigated the process of

5:19

covering China's surveillance state and

5:22

the sources you needed to depend on and

5:24

how you had to protect those sources and protect

5:26

yourselves?

5:27

Yeah, the story of reporting

5:29

out this book is kind of interesting because there are a couple of different phases

5:32

to it. It actually began as a series

5:34

for the Wall Street Journal where we both work

5:37

as reporters. We were both in China at the time. I

5:39

was in Beijing and Liza was in Shanghai. And

5:42

we first started reporting on this in 2017. And

5:46

at the time, it was actually the reporting

5:48

was remarkably easy. There were all these

5:51

Chinese facial recognition and AI

5:53

startups. They had this kind of gee whiz

5:55

technology and they were raising money. And

5:58

so they sort of wanted to talk to us, and they were happy

6:00

to let us in, and they brought us into their showrooms,

6:02

and they kind of told us everything they were doing, including

6:05

how they were working with police and installing

6:08

these camera systems in various cities and using

6:10

them to track people. And it was,

6:12

you know, I think we were actually a little bit shocked at first how

6:14

open they were. Of course, after we did

6:17

our first few stories and other people also

6:19

started writing about it, they started

6:21

to close off and it started to get quite a bit more difficult.

6:24

Over time, particularly, we did

6:26

some reporting in Xinjiang, which

6:29

is in the northwestern part of China, and it's home

6:31

to some Turkic Muslim minorities where

6:33

the Chinese government is really

6:36

using this technology in some very Orwellian

6:38

ways. And our reporting there

6:40

was incredibly sensitive and difficult, and it was

6:43

hard to ever really, because of the surveillance,

6:45

because there was so much tracking of people, it was

6:47

hard to talk to people,

6:49

it was dangerous for them to talk to us. So

6:51

we kind of had to get snatches of conversation

6:54

here and there in cars or in back alleys

6:56

and that sort of thing. We did have to be quite careful

6:59

about protecting their identities and making sure

7:01

that our devices were as secure

7:03

as they could be and that we weren't using

7:05

platforms that the Chinese government could easily listen

7:08

in on.

7:09

I think one big thing that Josh missed out

7:11

on was halfway through

7:13

the book, Josh himself

7:15

was kicked out of China. Yeah, that's

7:18

kind of a...

7:18

Minor detail. So

7:21

it became a book that we began

7:24

reporting on the ground in China, and

7:26

then the two of us ended up finishing

7:28

up the book outside of China. So

7:30

that entailed a lot of trying

7:33

to evade the authorities ourselves

7:36

when we were doing reporting.

7:37

As much as we could, we tried not

7:40

to use internet chat apps

7:42

that we knew were under close

7:44

scrutiny. And there is

7:47

a degree

7:48

to how much surveillance is

7:51

done on the various platforms. So you try and

7:53

use chat platforms where

7:55

you know maybe the AI surveillance

7:57

by both the companies and authorities might not

7:59

be as good, or maybe just

8:01

a simple telephone call because voice surveillance

8:04

is a lot harder than just text

8:06

and image recognition surveillance on chat apps.

8:09

It was a very fine line and

8:11

quite tricky trying to get in touch with people

8:13

in China especially as China got more

8:16

and more closed up as

8:18

the coronavirus kind of raged

8:20

on. I think the other thing

8:22

to add is you know we didn't just rely

8:25

on interviews. One thing that

8:27

really I think helps our book stand out is

8:29

how we use

8:29

open source material to do things. So

8:32

for example we have a chapter in the book looking

8:34

at how western companies have played a part

8:37

in helping kind of nurture China's

8:39

AI surveillance state and to do that

8:42

we went through government contracts to

8:44

figure out which were the companies these

8:48

state security agencies were buying products from.

8:51

We did that because it's just not

8:53

a topic that would be talked about

8:55

openly in China itself.

8:58

Josh could you speak a little bit about that experience

9:00

of getting kicked out of China? Like when did you realize

9:03

it was over for you and you had to go?

9:05

Like how did they approach you? You

9:07

know actually it was a real surprise. It

9:10

happened in the beginning of 2020 right as the

9:13

pandemic was actually getting underway

9:15

as the virus had started to spread out of Wuhan

9:19

and I actually was kicked out along with two of my colleagues

9:21

from the Wall Street Journal and you know the

9:23

background of this is the Chinese government at the time

9:26

had made a big deal about an opinion column that

9:28

the Wall Street Journal had run with a headline that they had disagreed

9:31

with and they sort of knew that

9:33

we on the news side of the Wall Street Journal had

9:35

nothing to do with the opinion side but they kind of were using

9:38

this headline and this column as

9:40

a way to sort of criticize the journal and they'd

9:42

really been making a big deal out of it. I

9:44

didn't know what to make of that it did seem

9:46

strange and then it sort of all became

9:49

clear one day when they called

9:51

our bureau chief into the foreign

9:53

ministry for a meeting and

9:55

he asked me to go along I was the deputy bureau

9:58

chief at the time still am. And so

10:01

I went with him, but they wouldn't let me into the meeting.

10:03

They made me wait outside. And when

10:05

they came out, his face was sort

10:08

of pale. And I was like, what? I

10:10

was just joking. I was like, did they kick you out?

10:12

And he said, no, they kicked you out, along

10:15

with these two other colleagues, one of whom was in Wuhan

10:17

at the time, one of the few Western reporters

10:20

reporting on the ground in Wuhan at the epicenter.

10:23

And so it was the first time

10:26

that China has kicked out multiple reporters

10:29

from the same news organization since the

10:31

Mao era. So totally unexpected.

10:34

And I was just kind of in

10:36

shock. They gave me five days to get out. We

10:38

left, then the US responded by

10:41

kicking out a bunch of Chinese journalists. And then China

10:43

responded by kicking out even more American

10:45

journalists. And so it sort of became

10:48

this media war that ended

10:50

up reducing the number of reporters

10:52

on both sides.

10:54

Wow. And I'm just fascinated by this

10:56

because I've done a lot of research in

10:58

journalists getting kicked out of Hitler's

11:00

Germany and Stalin's Russia. So if

11:02

you don't mind, I'll just stay on this for a little bit longer. How

11:05

did you wrap up your life there? Like what, how did

11:07

you spend your final days? Like who did you say bye

11:10

to? Like you, what was that like thinking you might

11:12

never be able to return again?

11:14

Yeah, you know, it was, like I said, I was sort

11:16

of in a daze. In some ways,

11:19

I was lucky because I had actually just moved back

11:21

to Beijing. I'd been spending

11:23

most of my time in Hong Kong where my wife

11:26

was. And so I didn't have a ton

11:28

of stuff there. I mean,

11:30

it seems mundane, but it's one of these things when you get kicked

11:32

out, if you've got a ton of stuff, five days

11:34

is not a lot of time to kind of deal with all of that.

11:36

So luckily I didn't have to do that. But yeah,

11:39

I know it was a kind of a dilemma actually

11:41

to sort of who to see,

11:43

because at that point I was sort of toxic. I

11:47

was in the news and there

11:49

were a lot of Chinese friends who I'd known for

11:51

years. Some of them, most

11:53

of my life who I wanted

11:55

to see, but I just felt like I put them

11:57

at risk. And so I didn't. I just sort of wrote

12:00

to them and said, see you later. And

12:03

I went to a few of my favorite restaurants, places

12:05

that serve the kind of food I knew I

12:07

wouldn't be able to get outside of China. But

12:10

I don't think it really hit me that I was being kicked out until after

12:12

I was gone. I arrived in Japan. It

12:15

was the only country that had open borders at the time because

12:18

of the pandemic. And I stepped off the plane

12:20

and I just felt different not

12:22

being in China. And that's when I think it really hit

12:24

me that I'd been kicked

12:25

out. And you're based in Taiwan now.

12:28

Well, I was in Taiwan. I went to Taiwan for a while. And

12:30

now I'm based in Seoul because my wife works

12:33

for the New York Times. Moved up here

12:35

with them.

12:37

China is in this

12:40

race with the rest of the world

12:42

in terms of wanting to become a leader in AI. How

12:46

is it innovating AI? How is

12:48

AI in China being used for surveillance?

12:52

Right. So

12:53

I think there are two ways to understand that question.

12:55

There's this sort of broad view and then there's

12:58

the view from the ground. And I think if you take the broad

13:00

sort of 30,000 foot view, what

13:02

you see is that China's Communist Party is using

13:05

data and AI surveillance

13:07

together to sort of reboot the way

13:09

that governments impose control on societies.

13:12

And they're doing this basically

13:14

by taking a page from Silicon Valley. So

13:17

companies like Google, Amazon,

13:20

and Facebook pioneered

13:22

technologies and techniques for harvesting

13:25

huge amounts of behavioral data and

13:27

analyzing patterns in that data to predict

13:30

future behavior. So in the case of

13:32

these companies, they did it basically to make advertising

13:34

more effective and more lucrative. The

13:36

Communist Party is basically doing the exact same thing

13:39

but with government. So they're using the same techniques and a

13:41

lot of the same technologies, collecting

13:44

huge amounts of data on their

13:46

people, on Chinese people, so that they can predict

13:49

and eradicate problems before they happen. And

13:52

that could be anything from like a health crisis

13:54

to a political protest to a traffic jam. And

13:57

what that means on the ground is that,

13:59

you know, basically every street, every stretch

14:01

of public space is being monitored by high definition

14:04

cameras. They have something like 400

14:06

million surveillance cameras installed,

14:11

large numbers of which can identify you by

14:13

scanning your face or even examining

14:16

the way you walk because people have unique

14:19

gaits. It also means that the government has access

14:21

to your entire digital life. They

14:23

know who you talk to on social media.

14:26

They know what you say to those people. They know where you work

14:28

and where you sleep and what you buy online.

14:31

In certain circumstances like during the pandemic, it

14:33

means that the government is using data it has on you

14:36

to analyze and judge your

14:38

behavior and give you a rating based on

14:40

the threat you pose to public

14:42

health or social stability.

14:44

Wow. I think that was a Philip

14:46

K. Dick novel that was turned into a popular

14:49

film, blockbuster film where they could arrest

14:51

you before you've committed

14:53

the crime.

14:54

There's a futuristic movie

14:56

on that. Do they

14:59

actually arrest people before

15:01

the crime is committed?

15:03

The one place that they have essentially been doing this

15:05

is in Xinjiang. In

15:08

the northwest of China where they've been targeting

15:10

a group of Turkic Muslim minorities

15:13

known as the Uyghurs. The

15:16

Uyghurs have long

15:18

resisted Chinese rule. They

15:20

are culturally, linguistically, religiously

15:23

distinct from China. They don't really see

15:25

themselves as Chinese. This is

15:28

a conflict that goes back centuries.

15:30

It's always been a contentious place. They've

15:34

always resisted Communist Party rule. Starting

15:37

in 2017, the Communist Party started rolling out

15:40

these technologies at a massive scale

15:42

in Xinjiang in a really suffocating

15:45

way so that they had an almost 360

15:47

degree view of what nearly

15:50

every Uyghur was doing. They

15:53

used that data on where people went, how much

15:55

gasoline they bought, whether

16:00

they had been abroad or they'd been to Muslim majority

16:02

countries, whether they went to the mosque,

16:05

how often they went to the mosque and how often they prayed,

16:07

if they had a Quran on their phone

16:09

and all this sort of data, and they

16:11

would collect it on this sort of central platform,

16:13

this data fusion platform that was developed originally

16:16

for the military to do sort of counter terrorism

16:18

operations. And then they analyze

16:21

that data to sort of give people a rating based

16:24

on the future threat they might pose

16:26

to the Communist Party's order in Xinjiang. And

16:29

people who were deemed threatening were

16:32

taken and sent to this network

16:35

of internment camps where they were subject to political indoctrination

16:37

without any legal due process. So in that sense,

16:39

they were sort of, they were doing pre-crime,

16:42

they were arresting people and punishing them before

16:44

they had done anything.
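[Editor's aside: the kind of rule-based "threat rating" over fused data described here can be sketched in a few lines of Python. Every feature name, weight, and the threshold below is invented purely for illustration; the actual scoring logic of the Xinjiang platform has not been made public.]

```python
# Illustrative sketch only: a rule-based "threat score" over fused data,
# of the general kind described in the interview. Every feature name,
# weight, and the threshold is a made-up placeholder, not a real rule.

WEIGHTS = {
    "traveled_abroad": 2,
    "frequent_mosque_visits": 3,
    "religious_app_on_phone": 3,
    "unusual_fuel_purchases": 1,
}
THRESHOLD = 4  # hypothetical cutoff for being flagged

def threat_score(profile):
    """Sum the weights of every feature the profile flags as True."""
    return sum(w for feat, w in WEIGHTS.items() if profile.get(feat))

def flagged(profile):
    """A person is flagged once their score reaches the threshold."""
    return threat_score(profile) >= THRESHOLD
```

A profile with a single low-weight feature stays below the threshold, while a combination of ordinary behaviors tips it over, which is part of what makes this kind of scoring so opaque to the people being scored.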

16:45

I was reading in your book the stories

16:49

of torture and the Uyghur genocide and

16:51

the prison camps, the

16:54

silence, what really stood out to

16:56

me, because my grandfather wrote about his time

16:58

in a Communist prison, and

17:00

they would actually chat very openly with

17:02

the other prisoners and exchange gossip and news

17:05

about the outside world and

17:07

even pray. They would even pray in prison, which

17:09

of course was an atheist dictatorship.

17:12

But what really jumped out at me is in

17:14

the Uyghur prisons, there's silence

17:17

because of advancements in technology,

17:20

the surveillance, they always are

17:22

listening to you.

17:23

Yeah. And I think it's a peculiar

17:27

sort of state to be in. I

17:29

mean, obviously it's terrifying, but then there's something

17:31

other, some other just very difficult to describe

17:34

feeling, I guess it's sort of like suffocation when you're there.

17:36

I mean, I was, as a journalist, just as

17:38

soon as I went to Xinjiang the first time, I could

17:40

feel it, I could sort of feel my heart

17:43

pumping a little bit faster and just, I was

17:45

constantly aware of being

17:47

watched, the possibility that I was always being watched.

17:50

And I can only imagine what that was like for Uyghurs who

17:53

were subject to much heavier surveillance. And there

17:55

were some Uyghurs who believed that their homes were

17:57

bugged, right? And that even in their most sort of private

17:59

space, they were being watched and some

18:01

of them almost certainly were. It's

18:04

kind of an out of body experience in some ways and

18:07

I think it had devastating effects on the

18:09

people who were sent to those camps. Of

18:11

course.

18:11

And in terms of

18:13

resistance, because we saw a lot of pushback

18:16

with the draconian measures

18:18

that China took, you saw acts of resistance,

18:21

brave acts of resistance in the general Chinese

18:23

population.

18:24

Could you speak about the ways citizens

18:27

are reacting to the surveillance system?

18:29

Are there any sort of

18:31

workarounds of these systems both

18:33

in the general population and how

18:36

are

18:36

the Uyghur minority, how are they,

18:39

do they have any resistance

18:41

system, anything that they're doing that helps

18:43

them

18:44

push back or protect each other? Could

18:47

you just speak about any strategies or any

18:50

loopholes essentially in China's growing

18:52

surveillance systems?

18:54

I can answer on the Uyghur side and then maybe Liza

18:56

can weigh in on the rest of China because in

19:00

Xinjiang there really isn't much you can

19:02

do. I think the system there is so

19:05

pervasive and it's backed up with both

19:07

human surveillance and also manpower

19:09

and weaponry and government power.

19:13

So it is really difficult in Xinjiang for

19:16

anyone to resist. You can find pockets where you

19:20

may be able to have a private conversation, but those I think

19:22

are fewer and fewer.

19:23

And I guess to weigh in

19:25

on ordinary Chinese and how

19:27

they feel about the surveillance system, watching

19:30

them, when we started

19:32

reporting this, most Chinese actually

19:34

felt very resigned to the fact that

19:36

they were being watched. Because if you walk

19:39

down the streets of Shanghai or any large city

19:41

in China, the surveillance cameras are everywhere

19:43

and they're filming you from every

19:46

angle. So even if you were wearing a cap,

19:48

they would still have a

19:51

clear shot of your face from the side profile

19:53

or a front profile

19:56

if you were walking towards a traffic intersection

19:59

with a camera mounted

19:59

on the traffic light. So, you

20:02

know, most Chinese that we spoke to just kind

20:04

of worked on the notion that

20:06

if you didn't do anything wrong, you had nothing

20:08

to fear. I think that viewpoint

20:11

was definitely challenged over the coronavirus.

20:14

What we saw over the last two to three years was

20:17

the use of digital surveillance, not

20:19

specifically AI, but you know big data style

20:21

surveillance in which the Chinese government

20:25

tracked each and everybody's mobile phones

20:27

to figure out where they went and

20:30

use that location tracking data

20:32

to assign them a health risk. If

20:35

you were, for example, in the past two

20:37

weeks in a place that

20:40

happened to be a COVID hotspot, you would be

20:42

given a health code, a red health code,

20:45

which meant that you were a health risk and

20:47

you couldn't be out walking the streets in public.

20:49

You had to stay home and quarantine for two weeks. So,

20:52

this was very limiting because everywhere you went,

20:55

you know, be it a mall or taking a subway,

20:57

you had to flash that health code. And that

21:00

was when people realized that being in the wrong place

21:02

at the wrong time could actually

21:04

lead to like a quarantine of two weeks

21:06

or you're being locked away against your will. So,

21:09

that was I think the moment when people

21:12

in China began to realize how much surveillance

21:14

there was in the country, how much the government

21:16

knew about them and what it felt like

21:19

to be in Xinjiang, where

21:21

every movement of theirs was

21:23

being watched. And to add to that as

21:25

well, I guess there was one more aspect

21:27

of AI surveillance that Josh might not have touched

21:30

on. We touched on the facial recognition

21:32

and the surveillance cameras, but AI surveillance

21:35

is also heavily used in censorship

21:37

within China. The Chinese

21:39

internet companies are the conduits for

21:42

this AI surveillance because if you had used

21:44

a chat messenger in China and the

21:47

most famous one of them is called WeChat, most

21:49

Western chat apps are banned in China

21:52

or cannot be downloaded. So, you have

21:54

to resort to a Chinese chat

21:56

app in order to get in touch with friends or you know

21:59

social media. Generally, they're all Chinese

22:02

companies. Companies such as Tencent, which

22:05

runs WeChat, use digital surveillance;

22:07

they use text recognition to

22:10

figure out what you're saying to your

22:12

friend over chat messenger and if you

22:14

say certain sensitive terms in

22:16

quick succession, you know, that message

22:19

never gets sent. It looks like

22:21

it gets sent on your mobile phone but

22:23

it doesn't actually turn up on the other side.

22:26

So the other party doesn't see the message at all and

22:28

it's not just day-to-day messages. It's

22:30

not like, hi Andrea, did

22:33

you see Xi Jinping on the street today? It

22:35

wouldn't just be that. PDFs,

22:38

for example, would be blocked if

22:40

you tried to send someone a file of

22:43

a news story and in

22:45

this case someone tried to send me a news

22:47

story. It was a Wall Street Journal story

22:50

all about Xi Jinping and you know

22:52

his increasing grip on power and

22:54

that never got through to me.

22:57

It looks like it was sent on the other person's phone

22:59

but it never actually got through to me. So there's

23:01

a lot of AI surveillance that happens

23:03

through chat apps and

23:06

the various mobile phone apps that you have within

23:08

China as well. What

23:10

a creative form of censorship.
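[Editor's aside: the silent message filtering Liza describes, where the sender sees the message as sent but the recipient never receives it, can be sketched in Python. The term list and function are hypothetical stand-ins, not how WeChat actually implements it; real systems use machine-learning text and image recognition at scale.]

```python
# Illustrative sketch of silent keyword filtering on a chat platform.
# The term list is a made-up placeholder; real systems use ML-based
# text and image recognition rather than simple substring matching.

SENSITIVE_TERMS = {"protest", "banned-topic"}  # hypothetical terms

def deliver(message, inbox):
    """Always report success to the sender, but only place the message
    in the recipient's inbox when no sensitive term matches."""
    text = message.lower()
    if any(term in text for term in SENSITIVE_TERMS):
        return True  # sender sees "sent"; recipient gets nothing
    inbox.append(message)
    return True
```

The key detail from the interview is that the function never signals failure: the censorship is invisible from the sender's side.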

23:13

This is scary. So I want to touch on what

23:15

you

23:16

said earlier. I've heard

23:18

people in the West say, what's the

23:20

problem with the government having

23:22

these secure measures of keeping us safe?

23:25

If you're not doing anything wrong, what's the

23:27

problem?

23:28

What would you say to that? Because this sounds

23:30

very innovative and creative what China is

23:32

doing. I would love for you to both just comment on

23:35

that.

23:35

It's interesting when you say that because

23:38

one of the experiences I had when I was writing this book

23:40

is I went to the US to sort of see what was happening

23:42

with surveillance in the US and there is actually quite a bit.

23:44

But on my way back I was

23:47

standing in line at JFK Airport

23:50

in the security line and there's like one American

23:52

couple standing right in front of me and the

23:54

woman was discussing a story

23:57

she'd read in the news about Chinese surveillance and

23:59

she's like, it sounds so crazy and

24:01

her husband turned around and said, oh, yeah, well, if you

24:04

haven't done anything wrong, then you don't have anything to worry about.

24:06

So that actually is not just a Chinese

24:09

attitude. It is very prevalent in China, or it was.

24:11

Very prevalent in America too.

24:13

It is. It is. And I

24:15

think what China really illustrates very

24:18

clearly, I mean, you see this everywhere, but you see it really,

24:20

really clearly in China, is that

24:22

these systems, they tend

24:24

to work for you and make your

24:27

life better and more convenient and easier, as

24:29

long as you are behaving in

24:32

the way that whoever's running those systems wants

24:34

you to behave. So if

24:36

you're in China and you are

24:39

living in a wealthy city and you're Han Chinese and

24:41

you're part of the majority and you're kind of a law

24:43

abiding citizen who doesn't talk about politics

24:46

and doesn't raise a ruckus, then mostly

24:48

what these technologies do is make it easier for you to move

24:50

around the city. You can scan

24:52

your face to ride the bus. Your life is optimized using

24:54

data for you. That changes as soon

24:59

as you cross the

25:01

authorities. So this often happens when, for

25:04

example, a common thing that happens in Chinese

25:06

cities is that the government will knock down old apartments

25:09

to make way for a highway or

25:11

for new apartments. And it's eminent

25:13

domain and there's nothing really you can do about it in China.

25:16

And often they won't pay full market rates.

25:18

And so you often have this situation where people who were

25:21

living a good life in a nice city suddenly had their

25:23

house knocked down and

25:25

they turned into petitioners and they turned

25:27

into protesters because they're trying to get their money back.

25:29

And suddenly the systems that were making their lives

25:31

better are now being turned on them. And

25:34

then as Liza said, this definitely

25:36

happened during the pandemic to people who got

25:38

fed up with zero COVID controls.

25:41

The issue with China as well is, and

25:44

with many of these systems, is you need

25:46

checks and balances. So

25:48

what we saw in China in recent

25:51

years as well is how technologies

25:53

that were put in place for public

25:55

good were gradually used

25:58

and abused.

25:59

So for example, the health code that

26:02

I alluded to earlier, China

26:04

had used big data to try and assess

26:06

the health risk of people moving about

26:08

in order to stop the spread of the coronavirus.

26:11

What happened in the middle of last year, though, was

26:14

that there was a small protest in a

26:16

central Chinese city in Henan

26:19

province. And authorities

26:21

there were so desperate to stop the protests. They

26:23

turned the health code of everyone who they

26:26

expected to be at the protest red. So

26:29

that meant, as I mentioned earlier,

26:31

that if you had a red health code, you couldn't move around.

26:34

So these people who wanted to turn up at

26:37

the gates of the central bank in the

26:39

region to protest, they had

26:41

their health codes

26:41

turned red, and they were taken away at the

26:43

train station in quarantine for two weeks. So

26:46

it's just little incidents like that

26:48

that show you, it's so easy

26:50

for technology to be

26:53

subverted and used in

26:55

a perverted way, even though it started out for good.

26:57

Who is profiting off of this?

26:59

Who are the Elon Musk's of

27:02

China? And how are American

27:04

Silicon Valley companies, who's raking

27:06

in money off of what China is doing inside

27:09

China and outside?

27:11

So this is really interesting because on the

27:13

surface, it felt to us

27:16

initially like it was Chinese startups

27:18

that were providing these AI algorithms

27:21

to the police that were making the money. But

27:23

if you look under the hood, you're realizing

27:26

that these

27:27

Chinese startups did provide the algorithms,

27:30

but the hardware, the hard drives

27:32

that were used to store video footage when you

27:34

have so many cameras, millions of cameras,

27:37

or the chips that were used

27:40

by cameras and the networking

27:42

systems to power

27:44

these surveillance systems and to power

27:46

the facial recognition and to train the algorithms

27:49

even to do the facial recognitions. These

27:52

were provided by Western companies

27:54

and specifically American companies. In

27:56

the case of chips, imagine your

27:59

typical...

27:59

I guess the typical names that come to mind

28:02

would be Intel, Nvidia,

28:04

they would be providing like the high

28:07

performance chips that China couldn't make on

28:09

its own. And these high performance chips

28:11

were used by Chinese AI

28:13

startups to train facial recognition

28:15

algorithms for the police. And in turn,

28:18

the same chips, same high

28:20

performance chips were also used in the

28:22

backend systems to help process

28:24

large amounts of data very quickly. And if

28:27

you really kind of go beyond that, you think

28:29

about the hard drives that are used, the

28:31

hard drive industry globally is dominated

28:34

by three companies, Western Digital,

28:36

Seagate, both American companies, and Toshiba.

28:40

So really, the people who were raking

28:42

in a lot of money from this weren't just the Chinese

28:44

companies, American companies were very complicit.

28:47

Of course, they are. And is there any

28:50

global movement to push back

28:52

against this? Because it's very much a genie

28:55

coming out of the bottle. What China is doing

28:57

could easily spread around

28:59

the world. And do you see it spreading?

29:02

You definitely do see it spreading. There's a sort of important

29:05

thing to keep in mind when you think about the way this is spreading.

29:07

I think a lot of people, they assume,

29:10

and not without reason, they assume that China is trying to

29:12

replicate itself around the world. It's got

29:14

growing influence, and

29:16

it is exporting these technologies to dozens

29:19

of countries around the world. And the assumption a

29:21

lot of people have and people have written about is, oh, like

29:23

China wants to make a lot of little mini-Chinas. It

29:25

wants to remake the world order so that

29:27

China is the model for everyone. And I

29:29

think that's not exactly

29:31

true. What China is doing is

29:35

exporting these technologies and advancing

29:37

an argument that it should

29:39

be okay for governments to use these technologies

29:42

in whatever way they want. And it's

29:44

similar to the Chinese government's approach to the internet.

29:47

China was one of the first countries to argue for something,

29:50

a notion called internet sovereignty, which

29:52

is that governments should decide how the

29:55

internet within their borders is run. And this

29:57

sort of Western, US-led

29:59

notion of an open, free

30:01

internet in which multiple people have a

30:03

say in how it's governed, they argue we should get rid of that.

30:05

It should all be up to governments. And they

30:08

basically make the same argument with surveillance technology.

30:11

And the reason that's a concern is that one,

30:13

it's kind of hard to argue against. It's an argument that a lot

30:15

of governments like, right? It's very attractive

30:17

to them. And it's particularly attractive

30:19

to governments that are authoritarian

30:24

or sort of have authoritarian tendencies.

30:26

So the example we have

30:28

in the book, the place I went to is Uganda, where

30:31

both the United States and China are

30:33

sort of, they have influence. And for a long time,

30:36

the U.S. thought it was going to be a model for democracy

30:38

in Africa. They thought the leader there, Yoweri

30:41

Museveni, was this new generation of African

30:43

leaders who was gonna bring democracy

30:45

to the continent. It turns out he's much

30:47

more of a strong man. And recently

30:50

he faced a fairly stiff challenge from a

30:52

young kind of upstart politician who's

30:54

leading a really strong opposition movement. And he

30:57

turned to China, to a Chinese company

30:59

called Huawei that people may have heard

31:01

of. And they sold

31:03

him a sort of state surveillance starter

31:06

kit, which he installed and he

31:08

used to track the opposition and

31:11

to sort of shut down their ability to campaign.

31:14

And he won. He won the most recent election, even

31:16

though some people thought he should have lost. So it

31:18

is effective and China is pushing it around the world.

31:21

And I think the challenge right now is

31:23

that China has this really simple and

31:26

easy to understand and attractive authoritarian

31:28

vision for the use of AI and

31:31

big data surveillance, but there isn't really

31:33

yet a similar

31:36

democratic vision. There's a democratic

31:38

argument about how you use these technologies. There

31:40

are, in Europe, there

31:42

are arguments about how to regulate it, but those are different from

31:44

the ones in the U.S. And basically

31:47

democratic countries are all sort of, they're a

31:49

little bit schizophrenic about this, in particular in the U.S.

31:51

Because as much as Americans

31:55

think they love privacy, they also don't

31:57

like regulation. And Silicon Valley is

31:59

also a really powerful lobbyist against

32:02

rules that might regulate these sorts of technologies.

32:04

So yeah, it's a bit of a mess on the

32:06

democratic side.

32:08

I would, however, add to that, that

32:10

the US government has probably put out the biggest

32:13

effort to fight against the Chinese surveillance

32:15

state. And they've done that by putting dozens

32:18

of Chinese companies linked

32:20

to AI surveillance on a

32:22

trade blacklist called the entity list. So

32:25

this means that if you were an American company and

32:27

you wanted to sell certain technologies

32:29

to such companies, it has to get approval.

32:32

You have to get approval from the Commerce Department. And

32:35

beyond that, I think the US government also dealt

32:38

the biggest hit to China's AI industry

32:40

in general last October by cutting

32:43

Chinese companies off from high performance chips.

32:46

And part of the reason was because they didn't

32:48

want China to develop AI-enabled

32:51

weapons, but they also wanted to

32:54

push back against the development of digital

32:56

surveillance in the country itself.

32:58

And what impact has that had?

33:00

So I think the impact of that is

33:03

that China's AI development has

33:05

been stymied because China as a country

33:07

has been unable to catch up with the West despite

33:10

decades of trying on the semiconductor

33:13

front. And they've not been able

33:15

to, at least the homegrown companies have

33:17

been unable to produce and

33:19

design high performance chips that

33:22

the likes of Intel and Nvidia have been

33:24

able to design. So when you're cut off

33:26

from access of such chips, that means

33:29

you need to find alternatives. So what we're seeing

33:31

in China now is Chinese AI

33:33

companies have either been rationing the

33:36

high performance chips they already have, or they've

33:39

been trying to buy them on the black market, or they've

33:41

been trying to tap technologies such

33:44

as packing older

33:46

generations of chips together in order to

33:48

get the same performance that a high performance chip

33:51

would be able to deliver. So

33:53

there are various alternatives that the Chinese have

33:55

been trying, but none of it has been as

33:58

effective or as cost-effective

33:59

as just buying

34:02

chips from American companies themselves.

34:05

The innovation, the

34:07

space race, if you will, has been slowed down.

34:09

I think that's safe to conclude.

34:11

When you both look to the future with

34:14

this AI genie and the surveillance

34:17

genie being out of the bottle,

34:18

what do you see in the future?

34:21

What nags at you in terms of where all of this

34:23

is headed? I think China,

34:25

it's really hard to see

34:27

much resistance there. We did see

34:30

a brief flurry of resistance at

34:32

the end of last year where there were these really large zero

34:35

COVID protests that spread across the country and were actually

34:37

quite remarkable. But then what you also saw

34:39

shortly after that was the government using

34:42

its surveillance systems to track down the people who had

34:44

participated and to really lock down

34:46

the country even tighter than before.

34:49

On the evidence, the Communist Party

34:51

has more control now than it ever

34:54

has, and it's not going to let up anytime soon.

34:56

At least it doesn't seem like it will. I think

34:58

the real question is in democratic

35:00

countries and in other places. One of

35:02

the most astonishing things about writing

35:04

this book and then traveling around and talking to people and

35:06

trying to promote it is actually how little

35:09

some people seem to care about privacy or think about

35:11

it. There are police departments

35:13

across the United States that are using facial recognition.

35:16

They are just as vulnerable to the

35:18

attractions of these systems as police in Uganda

35:21

or China, right? Because it makes their jobs easier. It

35:23

makes sense. You can't really blame them. It's

35:25

just quite interesting to me that that hasn't really

35:27

become a public discussion in

35:29

a serious way. I think it could change in

35:32

some ways with TikTok because

35:34

so many people use TikTok

35:36

and there's this huge debate now in the US about the

35:38

data that the Chinese government collects through TikTok.

35:41

And then by extension, there's

35:44

been more conversation about the data that all

35:46

of these tech companies collect

35:48

and more conversation about whether or not there should be

35:50

sort of universal privacy rules

35:52

put in place in the US to prevent that. If

35:55

you care about privacy, that's a hopeful thing, but

35:57

we'll just have to see if it actually comes through.

35:59

For my part, you know, very similar worries

36:02

to Josh. And I live in Singapore

36:04

right now, and over

36:06

the last four years that I've been here, I

36:09

just moved back pre-COVID

36:11

and have been unable to leave. And I've

36:14

really gradually started to notice

36:17

how many surveillance cameras have been popping up

36:19

in the subways, on the roads.

36:22

It really kind of hit home how attractive

36:25

this AI surveillance model is.

36:28

And it was interesting because even in a place

36:30

like Singapore, you do find that

36:32

these systems have been abused.

36:35

So for example, a couple of years back, just

36:37

as the digital surveillance in Singapore

36:39

had been ramping up, we introduced

36:42

some things very similar to China's health

36:44

code,

36:44

where we had

36:46

to scan our mobile phones in order

36:48

to enter public places such as malls. And

36:51

it came to light a

36:54

couple of months after that, even

36:56

though this was meant for only

36:58

public health reasons, the Singapore

37:00

police had used data that

37:03

was collected by this health code to

37:05

solve a murder. And this

37:08

wasn't made public until an

37:10

opposition politician had asked about

37:12

it in parliament. So it's one

37:14

of these things that really hit home, how

37:16

you need checks and balances when

37:20

a government uses such systems, because

37:22

it's just such a fine line, and it's so

37:24

easy for these technologies to be

37:26

abused. Tell us about

37:28

the rise of Xi Jinping. He's

37:31

been described as a Trumpian

37:33

sort of leader, thin-skinned, he

37:35

hated the Winnie the Pooh memes about him.

37:38

Where did he come from? What does he

37:40

want? And how is he different

37:43

than recent Chinese leaders? Where is he

37:45

taking things? You

37:46

know, it's hard to say exactly how thin-skinned

37:49

Xi Jinping is because he's very unlike

37:51

Donald Trump. He never talks to the media, and

37:54

it's very hard to actually hear from

37:56

him directly, at least in an unfiltered way.

37:58

But he is... I think objectively

38:01

just looking at where he came from and

38:03

his record, I mean, he is

38:05

by far the most powerful

38:07

leader that China has had probably

38:10

since Deng Xiaoping, maybe since

38:12

Mao, just in terms of the amount

38:14

of the Chinese bureaucracy that he controls

38:17

directly. Before he came

38:19

to power for many years, China was sort of

38:21

governed by consensus. So there was a leader

38:23

of the party, but there were, he had to

38:25

sort of negotiate policy

38:28

with five or six, seven other top

38:30

leaders in the Communist Party, and they would get

38:32

together and sort of agree on any major decisions.

38:34

Xi has totally changed that. He now is the

38:37

decider on everything in

38:39

a real way, in a way that I think Donald Trump wished he

38:41

could be. And he now, as

38:44

of the end of last year, has basically sort

38:46

of put his allies at powerful

38:48

positions throughout the government. And

38:50

it's hard to say what he really believes without being able to

38:52

read his mind. He grew up in China. He never

38:54

really spent much time outside of China. Grew up

38:56

during the Cultural Revolution. If you look

38:58

at what he says, what

39:01

he reads, what he writes, he feels

39:03

like a true believer in the Communist

39:05

Party. He believes that the Communist Party is

39:07

the only force that can sort

39:09

of return China to its previous

39:11

glory. He wants to make it so

39:13

that China is at least the dominant

39:16

power in Asia. There's debate about

39:18

whether he wants China to sort of be the top

39:20

power globally and to replace the United States.

39:23

Some people believe he does. Others think

39:25

he's just content to sort of erode

39:27

US dominance. But either way, he's

39:30

very ambitious.

39:30

And his overwhelming

39:33

priority is control, security and control.

39:35

And you've really started to see this in the last few

39:37

years with the pandemic. He kept China locked

39:40

up under sort of zero COVID measures for much

39:43

longer than any other country. And

39:45

even now that they've opened, they are sort of

39:47

cracking down on foreign

39:49

businesses and on information in ways that are sort of not

39:52

good for the economy, but they are good for China's

39:54

security.

39:55

And when it comes to digital surveillance,

39:57

Xi Jinping is a big proponent of it. It

40:00

really is under Xi Jinping's rule that

40:02

we've seen all these surveillance innovations,

40:05

so to speak. He was the one

40:08

to start pushing for what you would call

40:10

safe cities, so cities

40:12

in which the surveillance cameras and AI

40:15

were meant to keep people of interest off the street.

40:18

And it's also under Xi Jinping that during

40:20

the coronavirus crisis, we saw

40:22

the health codes and mobile

40:24

phone tracking of every individual. And

40:27

more recently, in November and December,

40:29

when large protests had broken out

40:32

in many Chinese cities, he

40:36

immediately reached for digital surveillance

40:38

as well to track who was at the protests,

40:41

for example, by using your mobile phone signals to

40:43

figure out if you were in the area where protests

40:45

had taken place. So with respect

40:47

to digital surveillance, he has been

40:49

a big promoter of it.

40:51

And one thing you write in your book is the

40:54

Uyghur genocide, how China

40:58

has shown some restraint.

41:01

It doesn't sound like it's like the Russian

41:04

genocide of Ukraine, which is extermination,

41:06

extermination. But you're writing

41:08

the book that there is some restraint because China

41:11

does want to still be seen as

41:14

a power on the global stage. Could

41:16

you speak a little bit about that, sort of what

41:19

leverage we,

41:21

the rest of the world, could have over China? Because it

41:24

doesn't seem like they have this Russian way

41:26

of just scorched earth. We're

41:29

going for whatever we want to do. You're not going to stop

41:31

us and you're going to just take it.

41:33

And we're going to just grab Ukraine and that's it. China

41:36

does seem to want to stay,

41:38

have some sort of air of respectability. Could

41:40

you talk a little bit about that?

41:42

It's interesting because people talk a lot about

41:44

the relationship between Xi Jinping and Putin

41:47

and how they have this sort of bromance going on. And

41:50

recently, just before the Ukraine invasion,

41:52

the two of them had sort of declared this

41:54

no-limits partnership between China

41:57

and Russia. And so people group them together.

42:00

often and for good reason, but there is a very

42:02

important distinction between the two leaders

42:04

and between the two countries, and it is this:

42:07

His ambition for China is to be

42:12

powerful and to be respected and to have a real

42:14

say in the way that global

42:17

affairs are run. Putin is

42:19

sort of a chaos agent. He

42:21

just self-evidently, by the

42:23

way he invaded Ukraine and other

42:26

ways he acts, his interest is in sowing

42:29

confusion and chaos. China is

42:31

not in that game. They benefited

42:34

from the current global system and

42:36

they want to make sure that they continue to benefit

42:38

by changing that system in ways that line up with

42:40

its interests. And so for that reason, China needs

42:43

to be respected globally

42:45

to a certain degree. I mean, it's willing to fight.

42:47

It's willing to confront

42:50

the US and other powers, but not to

42:52

the degree that it's seen as a rogue

42:55

nation. And so in Xinjiang,

42:59

people can argue about this and it's difficult to know for certain,

43:01

but the immense amount of attention

43:03

that was paid to Xinjiang once it was discovered

43:05

what was happening there, I personally

43:08

believe I think it had an effect. They built

43:10

this entire camp system in a way that looked permanent.

43:13

We were there and then sort of were

43:15

some of the first people to film one of these camps.

43:18

They weren't temporary facilities. Initially,

43:20

we thought that this was going to be a permanent state

43:22

of affairs. And then in 2019, they started letting some

43:25

people out of the camps

43:27

and started shutting down some of the camps. I mean, they still

43:29

have them. There's still an immense

43:32

amount of control in Xinjiang, but it's

43:34

less than it was. And it's

43:36

a counterfactual. I mean, if there hadn't been that, the

43:39

attention paid to it, would they have done

43:41

things differently? It's hard to say, but I believe it had an

43:43

effect.

43:44

So public pressure worked from

43:46

the outside world.

43:47

To a certain degree, yeah. Yeah. I think China does respond.

43:50

For years, I covered human rights in

43:53

China. And there was always a question

43:56

for human rights activists and people who wrote about human

43:58

rights in China, whether it had any effect. Right? Because

44:00

obviously China was not going to become more democratic.

44:03

It was moving towards less respect

44:05

for human rights, not more respect for human rights. And

44:07

so people often question themselves,

44:09

but I had multiple instances in which

44:12

I'd talked to

44:13

individual

44:15

activists and dissidents who'd been imprisoned in China

44:17

who said that they noticed when they were

44:19

being written about because their lives would improve.

44:22

Like their conditions inside prison would improve.

44:24

They would get softer pillows or they'd be given a

44:26

nicer bed or they would be given better food when

44:29

they were in the news. So there is some effect.

44:31

It may not be this sort of liberating effect

44:34

that people hope for, but China does pay attention.

44:36

That's really interesting. Do you think, especially

44:38

given the disaster that Putin has

44:41

created for himself in Ukraine, do you think China

44:43

will invade Taiwan? I

44:45

mean, that is the big question. I think

44:47

it's impossible to say at this point.

44:49

I mean, certainly China has been paying a lot of

44:51

attention to Ukraine. They've

44:54

been taking a lot of lessons. Taiwan

44:56

has also been taking a lot of lessons from Ukraine. So

44:58

is the US. And so I think that

45:00

conflict has really made people

45:02

think a lot about how a conflict

45:05

in Taiwan would play out. And of course that would be a much more

45:07

devastating conflict because it would potentially involve

45:10

the two strongest militaries and two

45:12

largest economies in the world confronting each

45:14

other. I think for that reason, I don't

45:16

think either side would enter a conflict lightly.

45:20

And I hope they never do.

45:22

I think that's the million dollar question that

45:24

everyone's trying to speculate. But I do

45:27

feel that unless you have access to Xi Jinping

45:29

himself, you will never

45:31

know the answer. I

45:33

want to close by asking,

45:36

what advice do you have for people

45:38

in terms of taking measures

45:41

to protect themselves or just be aware of

45:43

surveillance as they go about their lives

45:45

in a Western democracy or going

45:47

into a country with sliding democracy?

45:50

If you don't mind sharing or whatever you

45:52

can share, what tips

45:54

do you have in terms of your mental checklist

45:56

of what to be aware of and how to operate?

45:59

Yeah. It

46:01

kind of depends on who you are

46:03

and what you're doing. I think

46:05

a general principle is that if

46:07

a government really wants to surveil

46:10

you to break into your devices to find out about

46:12

you, it can. Governments

46:14

just, they have those tools, they have that ability. What

46:17

you can do as an individual is to make it a

46:19

pain in the ass for them to do it. Governments

46:22

have limited resources, even with AI, even with

46:24

big data. They still have only

46:26

so many resources they can devote to tracking people. If

46:29

you make it a pain to track

46:31

you, then they may decide that it's not

46:33

worth it. That can be things like using, definitely

46:36

using a password manager, making

46:39

yourself hard to hack, using encrypted chats

46:42

if you're having sensitive conversations. Again,

46:44

a lot of this applies to journalists and maybe not

46:46

applied to other people. But if you have conversations

46:49

you want to keep private, use encrypted chat

46:52

apps and just be aware of what

46:55

sort of systems exist in your area. The

46:59

other thing I would say is, especially if you're

47:01

an American,

47:02

is lobby

47:04

your

47:05

local lawmakers to pass privacy

47:07

legislation. Because I really think it's the sort of thing that ultimately

47:10

can only really be solved at the

47:12

legislative level. You just need to have, as

47:15

Liza said, checks and balances in place. The

47:17

most important and powerful thing you can do is

47:19

to make sure that your representatives know

47:22

that that's a really important issue to you.

47:24

Yeah, and I guess to add to that, make it

47:26

known that you're monitoring them. If

47:28

you see surveillance cameras coming up

47:31

and there's no explanation, ask,

47:33

why are they there? Why are they being installed?

47:36

How long is the data being kept? What

47:38

is it being used for? What are your rights?

47:41

These are just very simple questions, but just

47:43

knowing that people care about them is

47:46

more likely to make law

47:48

enforcement agencies more inclined to

47:50

be responsible with them. The

47:53

most ideal, obviously, would be a situation

47:55

where you have an agency that is

47:57

not linked to the police

48:00

themselves to serve as some

48:02

sort of watchdog, just to make sure that

48:04

every year they hold the police

48:06

responsible and they ask what

48:08

these systems are being used for and how, and if

48:11

these systems are being effective at all.

48:14

That's obviously the Holy Grail. But

48:16

in the absence of that, it's important to

48:19

make it known that you're watching this. And

48:21

the other thing that's really important

48:23

is

48:24

to support a free press, to

48:26

support democratic institutions, reporters

48:29

who dig under the hood to figure

48:32

out what these systems are being used for or to

48:34

point out any abuses. I think

48:37

that's very, very critical.

48:39

What about TikTok? Should Americans,

48:42

Europeans, the rest of the world, should we be on TikTok

48:44

or should we stay away from it?

48:46

This is my personal view. It's right

48:49

to be asking questions and not

48:51

just of TikTok, but obviously of other

48:53

social media companies as well, Meta,

48:56

YouTube, Twitter, asking

48:59

how their algorithms work, because algorithms

49:01

are such opaque things and

49:03

the ordinary person on the street will never

49:06

be able to understand how they work.

49:09

It's always good to push companies for transparency.

49:12

It shouldn't just be limited to a Chinese company.

49:14

I think it should apply to all companies,

49:16

even Western social

49:18

media companies.

49:27

Our discussion continues and you can get access

49:30

to that by signing up on our Patreon at the

49:32

truth teller level or higher. We

49:34

encourage you to donate to help Rescue

49:36

and Recovery efforts in Turkey and Syria

49:39

following the devastating earthquakes in early

49:41

February. To help people in

49:43

Turkey, visit the T.P.F.

49:45

Turkey earthquake relief fund at

49:48

tpfund.org. To

49:50

help Syrians in need, donate

49:52

to the White Helmets at

49:55

whitehelmets.org. We also

49:57

encourage

49:57

you to help support Ukraine by donating

49:59

to Razom for Ukraine at razomforukraine.org.

50:03

In addition, we encourage you to donate to the International

50:05

Rescue Committee, a humanitarian relief

50:08

organization helping refugees from Ukraine,

50:10

Syria, and Afghanistan. Donate

50:12

at rescue.org. And if you want to help

50:15

critically endangered orangutans already under

50:17

pressure from the palm oil industry, donate

50:19

to The Orangutan Project at theorangutanproject.org

50:23

and avoid products with palm oil. Gaslit

50:26

Nation is produced by Sarah Kendzior and Andrea

50:28

Chalupa. If you like what we do, leave

50:30

us a review on iTunes. It helps us reach

50:33

more

50:33

listeners. And check out our Patreon. It

50:35

keeps us going. Our production manager

50:37

is Nicholas Torres and our associate

50:39

producer is Carlin Daigle. Our

50:42

episodes are edited by Nicholas Torres

50:44

and our Patreon exclusive content

50:46

is edited by Carlin Daigle.

50:48

Original music in Gaslit Nation is produced

50:50

by David Whitehead, Martin Wissenberg,

50:53

Nick Farr, Damien Ariaga, and

50:55

Carlin Daigle. Our logo design

50:57

was donated to us by Hamish Smith of

50:59

the New York-based firm Order. Thank

51:01

you so much, Hamish.

51:03

Gaslit Nation would like to thank our supporters at the

51:05

producer level on Patreon and higher with

51:07

the help of Judge Lackey, the

51:09

narrator of the new Gaslit Nation graphic

51:11

novel, Dictatorship. It's easier

51:14

than you think. If you're supporting the

51:16

show at the producer level or higher on Patreon,

51:19

thank you so much. If you do not

51:21

hear your name, you may need to go into your Patreon

51:23

account and just select your

51:25

level of support manually to make sure

51:27

that your name gets into our end credits.

51:30

So one person that needs to do that is my mother,

51:32

Tanya Chalupa.

51:33

I had to find her. She was getting

51:35

annoyed that she wasn't hearing her name in end credits. Mom,

51:37

you gotta change your manual level of

51:39

support on Patreon. Thank you for supporting

51:41

me all these years. I love you. Withrow

51:44

C. Newell.

51:45

Chiquin. Lily

51:47

Wachowski. Sean Rubin. Todd

51:49

S. Pearlstein. Kenny Main. John

51:52

Schoenthaller. Ellen McGirt.

51:55

Joel Farron. Larry Gasson.

51:58

Karen Adiel. Nico Phillips.

52:01

Brian E. Castor. Tatiana

52:03

Bursch. Karen Heisler. Ann

52:06

Bertino. Chris Bravo. Ruth

52:09

Ann Furnish. John Millett. David

52:12

East. Aida. Joseph

52:15

Marra Jr. Diana Gallagher. Julie

52:18

Matthews. Meganopoulos. Mark

52:21

Mark. Barbara Kittridge. Matthew

52:23

Womack. Sean Berg. Kristen

52:26

Kuster. Ty Gillis. Sharon

52:29

Hatrick.

52:30

William Barry Reeves. Richard

52:32

Smith. Emmy. Kevin

52:34

Gannon. Mike Christensen.

52:37

Sandra Calnans.

52:38

Katie Messeris. Emily

52:41

Fennig. James D. Leonard.

52:44

Leo Chalupa. That's my dad. Leave

52:46

him alone. Oh, wow. Gerald

52:48

Gulstad. Jason Benke. Marcus

52:51

J. Trent. Joe Darcy. Ann

52:54

Marshall. Jeremy Lewis. Trigve.

52:57

D.L. Singfield. Matt Perez.

53:00

Nicole Spear. Abby Rode. Margaret

53:03

Romero.

53:04

Yans Aldrup Rasmussen. Uh,

53:07

I don't know how to pronounce this. ZW. OK,

53:10

ZW. Kathy Cavanaugh.

53:13

Sarah Gray.

53:14

Jennifer Ann Lutter. John Ripley.

53:17

Ethan Mann. Leah Campbell.

53:20

Jared Lombardo. You are all to report

53:22

to the Ron DeSantis Re-Education Center, formerly

53:24

known as Disney's Animal Kingdom, immediately.
