Democracy and national security in a fast-moving digital age

Released Friday, 22nd March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

Washington Post Live's futurist summit, The

0:03

New Age of Tech, is presented

0:05

by Mozilla. You're

0:07

listening to a podcast from Washington Post

0:10

Live, bringing the newsroom to you

0:12

live. Good morning

0:14

and thank you for joining us at

0:16

Washington Post. I am Elahe Izadi, a

0:18

media reporter and the co-host of the

0:20

Post Reports podcast. And I'm

0:23

thrilled to be here, joined with my

0:25

colleagues. We're joined on stage

0:27

by Sarah Ellison, to my left,

0:29

a national enterprise reporter. Pranshu

0:32

Verma, he is an innovations reporter,

0:34

and Cat Zakrzewski, who is a

0:36

national technology policy reporter. Thank

0:38

you three for joining us. Thank

0:41

you. As we saw, this

0:43

is a very important year. And Cat, I

0:45

wanted to start with you. Maybe

0:47

it's easy for those of us in the United States to think

0:49

about 2024 being a pivotal year. Actually,

0:52

this is a year in which nearly

0:55

half of the global population will be

0:57

going to the polls. And actually, just

0:59

on Monday, Secretary of State Blinken warned

1:02

of a, quote, flood of

1:04

falsehoods that could suffocate serious

1:06

civil debate during this pivotal

1:08

year. So I wanted to first

1:10

start and ask you to dig in a little

1:12

deeper into what makes this year so different

1:14

than prior election years. Not

1:16

just the fact so many people are going to the

1:19

polls, but what is the landscape that

1:21

people are going to the polls within? Well,

1:23

so I think there's three things that

1:25

make this year really different. The first

1:28

is scale. The tech companies have cut

1:30

their trust and safety teams that are

1:32

responsible for combating disinformation. But at the

1:34

same time, they're dealing with more elections

1:36

around the world than ever before. And

1:39

so that creates a real tension within the

1:41

companies about where do you place

1:43

your resources in order to protect

1:45

democracies. The second is

1:48

conservative backlash. We've seen a series

1:50

of lawsuits from Republicans who are

1:52

concerned that efforts to fight misinformation

1:54

within the government amount to censorship

1:56

of their views. And that

1:58

has really had a... chilling effect on

2:01

a number of government and

2:03

civil society efforts to

2:06

address misinformation. And

2:08

the third I would say is the tech.

2:10

We've been talking all day about artificial intelligence.

2:13

And that creates a host of new challenges.

2:15

Yeah, it's like the tech is evolving so

2:17

quickly at the same time that

2:19

all these other social and political dynamics are

2:22

taking place. And you mentioned misinformation and disinformation.

2:24

And Pranshu, I wanted to ask you about

2:27

the idea that misinformation during election

2:29

years, that's not new. I mean, that's something we

2:31

saw in 2016, 2020. And

2:35

I'm curious what makes it distinct this

2:37

year. And also disinformation,

2:39

which is, I don't know if you would call it

2:41

this, like I think of it like a cousin to

2:44

misinformation. It's not the same. So

2:46

what examples have you seen lately of this?

2:49

Yeah, I think we're still figuring

2:51

out what the broad scale impact

2:53

is. For example, artificial intelligence,

2:55

like Cat mentioned. But

2:58

we are already seeing some very clear

3:00

examples that, yes, AI is better. And

3:02

it's better able to clone your voice,

3:04

your image, your entire likeness. But

3:07

what's happening is that there are really

3:09

cheap and really easy tools, so that

3:11

anyone can now create a deep fake. And

3:14

so we're seeing specific examples like we

3:16

just talked about. There was the robocall

3:19

of President Biden urging voters not to vote

3:21

ahead of the New Hampshire primary. And

3:24

then if you look abroad, there were

3:26

examples in Moldova. There

3:28

was a liberal party politician.

3:31

And she was deep faked to say that

3:34

she supports a pro-Putin party ahead of an

3:36

election. Now if you go to

3:38

Bangladesh, a conservative Muslim country, there was an

3:40

opposition lawmaker that was deep faked to be

3:42

in a bikini, right? Obviously

3:45

trying to get at some cultural issues there.

3:48

And then of course in Slovakia, right?

3:50

There was a

3:52

fake audio of a politician talking about raising

3:54

the price of beer, right? Raising the

3:56

price of beer. Clearly something that voters

3:58

would care a lot about. Yeah, very controversial. Yeah,

4:00

well, if you want to take somebody out of a career, tell

4:02

them they're going to raise the

4:05

price of beer. Oh, it's a wrap. That's

4:07

freaking really genius. But none

4:09

of these are like, this isn't some

4:11

massive campaign that is orchestrated that we

4:13

know of by some nation state actor

4:15

to sow discord, though it could be.

4:18

Oftentimes it's a lone political operative. It

4:21

could be a rogue individual or we're

4:24

not sure, it could be a teenager that's just

4:26

bored and then can post this stuff. And

4:28

if it's savvy enough and it's timed well

4:30

enough, we can see that actually it's hard

4:33

to fact check it in the moment where

4:35

systems are not built to fact check it in the moment.

4:38

And are the companies able to hold

4:40

to their promises and stop the

4:42

ability for political propaganda to be

4:44

made through the tools that they

4:46

create? Yeah, and I want to

4:48

get back to that point of whether the tech

4:50

companies are equipped this year to deal with this.

4:54

But as you were talking, it made me

4:56

think, and Sarah, I wanted to ask you

4:58

this, it made me think about the promise

5:00

of social media and a lot of this

5:02

technology was to democratize information and actually help

5:05

democracies around the world. And I mean, we

5:07

can talk about the Arab Spring and how

5:09

lasting that effect was, but it

5:12

feels like the promise of social media was

5:14

to be in support of democracy and create

5:17

democracies. And I know that you and

5:19

our colleague, Naomi Nix, reported recently about

5:22

how big tech is actually surrendering to

5:24

disinformation. So it raises this question of

5:27

what impact actually these companies and these

5:29

platforms are having to undermine democracy. And

5:32

can you explain how we got there

5:34

from the promise that was there to

5:36

the current conversation we're having

5:38

today? Sure. I

5:40

mean, Cat touched on two big points, which was that

5:43

there were these mass layoffs earlier

5:45

where people really gutted their trust and safety

5:47

teams. And so the people whose job it

5:49

was not to accumulate

5:51

new users, I mean, these

5:53

are big companies and there are lots of different

5:56

teams. The people who were responsible for policing this

5:58

kind of thing were really eliminated from

6:00

the company. And then

6:02

there was this massive pressure campaign where

6:05

right-wing

6:07

and conservative politicians were saying, this

6:11

isn't misinformation, this is my belief.

6:13

This is something that I legitimately

6:15

believe, so why is it that

6:17

you're, like stifling

6:19

that. Why is it you're censoring me? And

6:22

then the last thing that we really found, and

6:24

the peg for our story, was actually when Elon

6:26

Musk bought Twitter, he sort of opened the Overton

6:28

window in social media to

6:31

say we're going to allow all

6:33

these things that had been previously

6:35

disallowed, come back in, you can,

6:37

I mean, specifically, you can misgender

6:39

people, you can do all of

6:41

these things that previously would have gotten you suspended

6:43

from the platform. And

6:45

what that just means is that at this

6:47

point, these social media companies all decided to

6:50

kind of retreat from this effort because they didn't

6:52

want to be the quote unquote arbiters of truth.

6:54

That was sort of a famous quote from Mark

6:57

Zuckerberg. So now what we have is

7:00

the idea that misinformation isn't new, it's really

7:02

in the bloodstream, people really believe certain things,

7:05

and so as a social media company, they

7:07

don't want to be in the position of

7:09

policing those beliefs. And they've really backed away

7:11

from some of these efforts

7:15

with the result that democracy

7:17

can thrive on social media and

7:19

anti-democracy can thrive on social media.

7:21

Right, right. That

7:24

makes me wonder, Cat. I'm

7:26

wondering about the incentives too for

7:28

these tech companies to tamp down

7:30

on misinformation and disinformation. If

7:33

the incentive is to have a profitable business or

7:36

what incentives are in place to

7:39

make sure that these sorts of

7:41

pieces of data content aren't spreading,

7:44

like the deepfakes that Pranshu was

7:46

mentioning, and if actually the incentives

7:48

are misaligned there. There

7:50

have been many criticisms that the

7:53

incentives are horribly misaligned with social

7:55

media and misinformation because these types

7:58

of videos that go viral

8:00

on the platform that might be false can

8:02

cause people to spend more time looking

8:05

at Facebook, looking at TikTok, and at the

8:07

end of the day they are trying to

8:09

get that type of engagement in order to

8:11

boost their own profits. One of

8:14

the interesting things to watch as we head

8:16

into this election season is what role advertisers

8:18

might play in putting pressure on the companies.

8:20

We have seen in the past at heated

8:22

moments, if you think back to the 2020

8:24

election or when Elon

8:27

Musk bought Twitter, where the advertiser

8:29

said, hey we don't want our

8:31

ad to appear next to these lies

8:33

about the election or hateful content and

8:35

so I'm closely watching this year what

8:37

role they might play again. That's interesting

8:40

because it in some ways reflects some of

8:42

the things we've seen in traditional media

8:44

and especially broadcast media in the wake

8:46

of the 2020 election of advertisers

8:49

pulling back but that's a totally different

8:51

business model because they rely on fees

8:54

from cable companies and that sort of thing. Pranshu,

8:57

going back to and picking

9:00

up on this conversation around deepfakes and

9:02

AI generated fake

9:04

content, it's

9:07

creating this environment in which politicians

9:10

are more easily able to dismiss something

9:12

that's real as fake. It's sort of

9:14

altering our perception of reality. This has

9:16

come up with former President Donald Trump.

9:18

I believe there was a recent example

9:21

of that with him. So have

9:24

you seen other politicians around the world take

9:26

that similar approach? Is this the same sort

9:28

of, you know, the fake news 2.0 of

9:30

being able to dismiss things that are actually

9:33

true like a politician going up and giving

9:35

a speech and saying this has been manipulated

9:37

and fake? Yeah, this is kind of like

9:39

my favorite rabbit hole to go down and Drew

9:42

Harwell, a wonderful tech reporter told me

9:44

the term is called the liar's dividend.

9:46

The liar's dividend. I love it. As

9:50

a concept, but like you mentioned, I do not

9:52

love it in actuality. But as a concept, it's

9:54

really fascinating because you

10:00

mentioned former President Trump, there was

10:02

an ad generated of him by the Lincoln

10:04

Project of real gaffes that he had made

10:06

on the campaign trail. And the Lincoln Project

10:09

is an anti-Trump organization. Exactly.

10:11

And these were real events that happened.

10:13

Reporters were there covering it. This got

10:15

mainstream coverage. And his response was, this

10:18

is AI generated. So imagine

10:21

in 2020, the Access Hollywood

10:24

tape breaks, Trump's grabbing a woman

10:26

by the genitals, and

10:28

he has the excuse to say, that wasn't real.

10:30

That was AI generated. And so

10:32

what happens is it offers a

10:35

very potentially potent excuse

10:37

to dismiss away real damning information

10:39

that comes out about you. Because

10:41

like we were saying, you used

10:44

to be able to trust that you could hear something

10:46

or you could see something and that something happened. But

10:49

now with AI getting better, there's this kind of convenient

10:52

and cozy excuse here. And

10:54

we've seen it abroad too. In Taiwan, ahead of

10:56

their elections, there was an opposition

10:58

party lawmaker that was seen

11:01

to be depicted going into a hotel with a

11:03

woman that wasn't his wife. And

11:05

so the opposition lawmakers kind of rallied around him to

11:07

say, no, that's an AI deep fake. That actually wasn't.

11:10

We don't know if it's true. We don't know if

11:12

it's an AI deep fake. But what we know now

11:14

is that there's a question about the truth of it.

11:18

And then same thing in India happened

11:20

where there was an audio deep fake,

11:22

potentially of a

11:24

politician of the ruling party saying that there was

11:26

a billion dollar scam that he was a part

11:28

of. Again, the

11:30

excuse was, well, no, that's an AI generated piece

11:32

of audio. And yet none of that,

11:35

it dismisses away the

11:37

scrutiny enough that you're

11:40

now finding a repetition

11:42

of this type of excuse coming through. And

11:44

so it kind of gets to that point

11:46

of like muddying the waters of reality and

11:48

kind of dismissing and easily allowing you to

11:50

kind of play with the concept of truth.

11:52

And I think that that's actually probably going to

11:54

be in this election cycle, one of the biggest

11:58

things. I mean, yes,

12:00

it's going to matter how good the deepfakes are. Once

12:03

people start to believe or

12:05

not believe what they're seeing, or

12:08

just to create the doubt. Say, well, we don't know

12:10

if it's true or it's not true. People

12:12

can throw up their hands and say, well, they're

12:14

all lying. Or there's this sense of the way

12:16

people are

12:19

going to receive this is just to

12:21

kind of zone out and say, you

12:23

can't believe anything. Right. It almost mirrors

12:25

the way that the traditional press or

12:27

the more mainstream press was villainized in

12:29

previous election cycles as well. That

12:31

you can just sort of throw your hands

12:33

up or like, well, I don't know what to

12:35

believe. Or I certainly don't believe these reputable

12:37

people, because they're disreputable now. And I've been told

12:39

not to believe them. And so if

12:42

you're looking for a touchstone as a voter, it's

12:45

much harder to find one. Well, it's

12:47

interesting because in the introduction,

12:49

we saw Meta's

12:52

Nick Clegg, Google's Sundar Pichai, and OpenAI's

12:54

Sam Altman. They were all saying that

12:56

they want to be part of the

12:58

solution, not part of the problem, when

13:00

it comes to election deepfakes specifically. But

13:02

given what we were talking about, a

13:04

lot of these tech companies have gutted

13:06

these teams that dealt with a lot

13:08

of the misinformation and making

13:10

sure that at least what's on their platform

13:12

isn't totally false. That

13:14

those safety rails aren't there the way they were before.

13:18

And it's now the technology is advancing so

13:20

fast. So is the cat almost out of

13:23

the bag? Where can these tech companies do

13:25

anything to control this problem at this point?

13:28

I mean, I would be interested to what you guys

13:30

think. But it seems like

13:32

theoretically, the technology

13:34

would be there to identify,

13:36

in most cases,

13:38

a deep fake. But the

13:41

will to do that, also what exactly

13:43

constitutes a deep fake that needs to

13:45

be policed. That's like

13:47

a philosophical conversation. Yeah, and these

13:49

companies have lots of different priorities.

13:53

If you want to show a deep

13:56

fake of Joe Biden drooling or of Donald Trump,

13:58

I mean, some of that you could say

14:00

that's political satire. We're just, it's like a cartoon.

14:03

And then you have a whole debate about whether

14:05

or not that's an appropriate use of AI. And

14:09

you would have, I'm sure that Meta and all

14:11

those companies would be debating that at the policy

14:13

level. And we've seen them sort of fall down

14:15

and be really susceptible to political

14:17

pressure when it comes to those questions. They are

14:20

companies. They want to make a profit. And they

14:22

want to stay out of controversy. Yeah. There's

14:24

one other company we haven't talked about,

14:27

or platform, and that's TikTok. Yeah.

14:30

And I want to ask you this, Cat, that

14:32

there are roughly, give

14:34

or take, 150 million Americans on TikTok.

14:36

It's a huge number. Maybe you're not

14:38

on it. Maybe your mom is on

14:40

it. My mom is on

14:42

it. I think moms and teenagers are on TikTok.

14:46

Big populations of people are on TikTok.

14:48

And then, you know, Congress, the House

14:51

of Representatives, just passed a bill that

14:53

could lead to a ban on TikTok.

14:55

And I'm curious about the power of

14:57

TikTok within politics to shape our politics,

15:00

because there's also been reporting

15:02

that Trump's allies

15:04

have told him and have counseled him

15:06

that going against TikTok would hurt him

15:08

politically. Meanwhile, we do have the White

15:11

House and President Biden backing this effort.

15:13

So what is the power of TikTok

15:15

right now in our politics? So TikTok's

15:17

power has only grown in our politics

15:19

with every election cycle. It's become more

15:22

and more popular. And as you just

15:24

mentioned, this initially was looked at as

15:26

an app that was just for teen

15:28

dance crazes. But now, you have moms.

15:31

You have significant voting blocs that are

15:33

spending time on TikTok. So that makes

15:35

it more important for campaigns. And I

15:37

think it's putting politicians in a really

15:39

tricky spot. We're seeing this with the

15:42

Biden administration right now. President

15:44

Biden's campaign has joined TikTok. And

15:46

they've made videos there. They've also

15:49

developed partnerships with TikTok influencers in

15:51

order to promote their policies. But

15:54

at the same time, we see

15:56

this growing concern about the app's

15:58

ties to China, and

16:00

politically a need to show we're tough on

16:03

China, we're going to take a stand against

16:05

this app. And so you see these politicians

16:07

kind of weighing both sides of

16:09

that coin. Yeah,

16:12

it's interesting because at the same time, I

16:14

don't know if you were

16:16

tracking this, but when this bill was

16:18

being debated in Congress, the app had

16:20

these pop-up messages where users were being

16:23

told to contact their representatives and members

16:25

of Congress, their offices were being flooded

16:27

with all these calls. And it's also

16:29

putting them in this, I wonder if

16:31

this is a year where that more

16:34

than Meta, more than X

16:36

is going to be the topic of conversation

16:38

among politicians and to be put on the

16:40

spot of where they stand. I don't know,

16:42

Cat, if you're already seeing indications of how

16:44

this is going to play out. We are

16:46

seeing that. I mean, I was talking to

16:49

congressional aides that day that the pop-ups were

16:51

going out, and they were saying, we have

16:53

to shut down our phone lines because we're

16:55

getting so many calls about TikTok. And so

16:57

we know that this is an issue that

16:59

motivates young voters. I think even

17:01

dating back to more than a year ago

17:04

when the House of Representatives was having hearings

17:06

on this, the Secretary of Commerce came out

17:08

and said, this is a sure way to

17:10

lose voters under the age of 35 if

17:13

we support this legislation. And so I

17:15

think the Trump campaign is realizing that, and

17:18

that's why we've seen them back away from

17:20

some of the rhetoric we had in 2020.

17:23

Right. Pranshu, I wanted

17:25

to also bring up a piece of

17:27

reporting that you've done as well, a

17:29

series called Rising India, Toxic Tech. And

17:32

that detailed the vast campaign

17:34

online, the digital campaign by

17:36

Hindu nationalists to stoke division

17:39

and discord in India. If

17:41

you were to sum up what your big takeaway is

17:43

walking away from that reporting and what you found,

17:46

and also given the Indian

17:48

election is upon us as well, how

17:50

do you expect AI to be involved

17:52

in that country's elections? Yeah,

17:55

the Rising India series, which was led by

17:57

our bureau chief Gerry Shih, Joseph Menn,

18:00

Karishma Mehrotra, Anant Gupta, and

18:02

Smurfs, it was a pretty

18:04

broad scale finding that the

18:08

Bharatiya Janata Party, which is the ruling party in

18:10

India, which Modi heads, is uniquely

18:13

adept at using social media

18:15

and artificial intelligence to sow

18:18

discord. There is

18:20

content that is hateful. There is content

18:22

that targets minorities. There is content that

18:25

is blatant lies and propaganda. It

18:27

comes in vast amounts, and there's an orchestration

18:29

within the government to spread this on

18:32

various platforms. But the

18:34

market of India itself is so lucrative

18:37

to tech companies that you often

18:39

find the policing of this content

18:43

falls short. Either there's a blind

18:45

eye that's turned. It's willful ignorance.

18:47

Or there are actually demands that

18:49

some tech companies have made, or

18:51

the Indian government has made of tech companies that they

18:53

cater to. And so you see

18:56

India kind of charting this course of when

18:58

you are a powerful market with hundreds of

19:00

millions of users for social media tools, how

19:03

to use that to get your way to stifle

19:05

some sorts of messaging but let others kind of

19:08

grow is kind of the

19:10

big takeaway. And whether it's WhatsApp messages

19:12

or whether it's the type of Bollywood

19:14

movies that get made by Netflix and

19:16

Amazon, if they're Bollywood movies that are

19:18

critical of the Modi administration, we've found

19:20

that those scripts just kind of go away. So

19:23

it's in various elements. But

19:26

now, from I think April 19 to June 1, we

19:30

have the Indian elections coming up. And

19:32

it's a very interesting hotbed for AI

19:35

because India has many languages. And

19:39

so you're seeing AI audio kind

19:41

of allow Modi's voice to be translated into

19:43

multiple languages that he doesn't speak. So that's

19:45

a utility that they've acknowledged. You're

19:47

also finding that there's a full scale

19:49

acceptance of AI amongst some political parties

19:52

to create memes, to create really crazy

19:54

and outrageous memes that otherwise might

19:56

not be created, but it's a way to cut

19:58

through the political messaging. Old-school memes don't work,

20:00

but these AI-generated memes kind of do.

20:02

And so you're seeing now, you're going to

20:04

see a flood of content, I think, that

20:06

shows like the full-on acceptance of it.

20:09

And then some of these other things that we might

20:11

see, these rapid deep fakes that come up and how

20:14

are these things in the moment fact checked, is it

20:16

even possible? Yeah. So

20:18

before we leave, because we only have a couple

20:20

of minutes left and thinking about, you know, coming

20:22

up these few months that I'm imagining is something

20:24

you're going to be monitoring Pranshu. And I wanted

20:26

to ask Cat and Sarah as well, looking into

20:29

the coming months and in this coming year, what

20:31

is one big unanswered question that you have or

20:33

something that you're going to be paying very close

20:35

attention to in this space? Cat, I'm wondering if

20:37

you can start. I would say

20:39

the Supreme Court. One thing we're

20:42

seeing right now is there is

20:44

this litigation challenging the communication that

20:46

governments have with the social media companies.

20:49

And that could really disrupt a lot

20:51

of the work and planning we've seen

20:53

around elections and past cycles. And so

20:55

I'm watching to see where the Supreme

20:58

Court draws a line to see if

21:00

those types of efforts to fight misinformation

21:02

can continue. Yeah, Sarah. I

21:06

think it's proper that we're all talking

21:08

about the supply of misinformation and disinformation

21:10

and how it gets made, who's making

21:12

money off of it. But I've been

21:14

thinking a lot about the demand for

21:16

misinformation, why it hits and who, why

21:20

people are susceptible to it and

21:22

why they want to consume it. And

21:24

I think that it's not just because people

21:26

are fully disaffected and sitting in diners in

21:28

middle America, which was sort of our

21:30

2016 answer to that kind of

21:32

question. I think that there's something that we can

21:34

really think about, like what really

21:37

gets people to believe a piece

21:40

of misinformation. And you and I worked together

21:42

for a long time in the media group

21:44

looking at traditional media. And I've

21:46

been talking to misinformation researchers who say that

21:48

social media is clearly very important.

21:51

But the way that the messages that

21:53

surface on social media get translated and

21:56

amplified and used in traditional

21:58

media is in some ways

22:01

as important, if not more so. And so I'm

22:03

really also looking at the way these two

22:05

different worlds kind of communicate with one another.

22:07

Yeah, that's fascinating. Well, Sarah, Pranshu, and Cat,

22:10

thank you so much. We'll have to leave

22:12

it there, but thanks for joining us today.

22:23

Good afternoon. I'm Jonathan Capehart, Associate

22:25

Editor here at the Washington Post

22:28

and here with me Adam Bry,

22:30

CEO of, we're gonna talk about

22:32

this in a minute, of

22:34

Skydio, the leading drone manufacturer

22:36

in the United States. Adam, welcome to the

22:39

Washington Post. Thank you, great to be here.

22:41

So I sort of giggled in saying the

22:43

company name because I asked you how do

22:45

you pronounce it? So how do you pronounce

22:47

the company name? Skydio. Skydio. I

22:49

speak Italian, so when I saw it I said, oh, Skydio.

22:53

And in Italian, Dio means God. So

22:55

I thought, oh, how clever. Sky God.

22:58

Was that on purpose? That was

23:00

unintentional. So Skydio comes from Sky

23:02

Studio. We were thinking

23:04

Sky Studio and you Google Sky Studio and you get a

23:06

million hits and when we were a three-person startup there was

23:08

no way we were gonna be at the top of that.

23:10

But Skydio was more unique and we found

23:13

out three months later that it also meant Sky God.

23:16

And so are you in the

23:18

Italian market? Not yet. Take

23:21

it from me. If you go to the Italian market,

23:24

Skydio. Yeah, we already, we know what our launch campaign

23:26

is gonna be there. So you co-founded

23:29

Skydio ten years ago, first producing

23:32

consumer drones and then designing

23:34

drones for military and corporate use. Talk briefly

23:36

about this evolution and how quickly was that

23:39

evolution? So the big bet that we made

23:41

when we started the company was that drones

23:43

can be useful for a really wide range

23:45

of tasks but a fundamental

23:48

limiter at the time and largely still

23:50

today is needing to have an expert pilot there

23:52

flying the drone. So when we

23:54

started we went all in on computer vision

23:56

and AI and this was in 2014 before

23:59

most people thought these things were cool. We thought they were

24:01

cool. We thought that was gonna be a really big deal. So

24:03

the big bet that we made is make the thing

24:05

smart enough to fly itself, build in the skills of

24:08

an expert pilot, which just makes it more useful to

24:10

more people in more ways. And

24:12

we felt like the consumer market was probably gonna

24:14

develop first. That turned out

24:16

to be true. So our first products were consumer

24:18

oriented, but we always felt like the consumer product

24:21

foundation was kind of the right

24:23

starting point for other kinds of applications. The

24:25

fundamental thing that our products do is put

24:28

sensors in important places to capture

24:30

useful information. You can think of it like a flying

24:32

camera. So the consumer platform did

24:34

in fact turn out to be a really useful starting

24:36

point for all the things that we do now across

24:38

a really wide range of industries. Construction,

24:41

public safety, energy utilities, all

24:44

the way up to national security and defense

24:46

and having soldiers use these things on the

24:48

battlefield. So since we have this here, so

24:52

this is a Skydio drone. Yeah. And

24:55

this is the closest I've ever come to actually

24:57

being near a drone. Is it heavy? Pick

24:59

it up. It's about four and a half

25:02

pounds. Huh. And

25:04

how many different types of drones do

25:06

you have? So we have a

25:08

few different models, but this is our flagship drone. This

25:11

is the Skydio X10. We launched this at

25:13

the end of last year. And

25:16

this brings together a lot of really amazing

25:18

stuff. I sometimes say that drones are like

25:20

the Mount Everest of technology, but they have

25:22

every piece: wireless, cameras,

25:25

optics, vibration, thermals, aerodynamics. And

25:28

then for our drones, a bunch of artificial intelligence

25:30

and machine learning built in as well. So

25:32

DJI, a Chinese company, it controls

25:35

more than 70% of

25:37

the global drone market. But in 2021, the

25:40

US government put the company on

25:42

an investment blacklist. How

25:44

has that impacted Skydio's growth? So

25:46

I would say that there's two big themes

25:48

in the market that we think are really

25:50

important. The first is the transition to AI

25:53

and autonomy. And with a lot of organizations,

25:55

drones have reached a point where they've proven

25:57

that they can be really valuable. An energy

26:00

utility might have set up a drone program. They've

26:02

got somebody who's an expert drone pilot who started

26:04

bringing their drone to work. They're proving that they

26:06

can capture useful data. But then the question is,

26:08

how do you scale that? And

26:11

training everybody to be an expert pilot or

26:13

hiring expert pilots is very difficult. And so

26:15

AI and autonomy makes these things more useful

26:17

to more people. The other big

26:19

trend, I think, is cyber and national security.

26:22

And most of the drones in the market today are

26:24

made by Chinese

26:27

companies. And I think it's just become clear over

26:30

time that that's fundamentally

26:32

untenable for a lot of critical

26:34

industry applications to have drones that,

26:37

at the end of the day, are going to be

26:39

beholden to Chinese national security policy, which is mostly

26:42

in opposition to our national security. And

26:46

so there's a very strong

26:48

need in the market for secure

26:50

alternatives. And that

26:52

benefits us. But

26:55

I would say more than that, we

26:57

feel a responsibility to deliver because

27:00

it's important for our customers.

27:02

And I think it's important for our country. So

27:05

right now, your drones are being

27:07

used in real world situations. In

27:09

fact, there are hundreds of your

27:11

drones in Ukraine right now. And

27:14

we have some photos that were taken from a

27:16

Skydio drone in Ukraine.

27:19

I don't know which monitor, right behind

27:21

us. Can you explain

27:23

what we're looking at? So this

27:25

is work that we've done

27:28

with USAID in collaboration with

27:30

Ukraine's Office of the Prosecutor General. And

27:33

it turns out, over the last few months,

27:35

Skydio drones have really become a critical tool

27:37

for documenting Russian war crimes. So we have

27:40

a product called 3D Scan, where you

27:42

basically just tell the drone, here's the thing that I

27:44

want to create a digital copy of. And it will go

27:46

off and do it. And

27:48

at this point, they've scanned over 100 civilian structures

27:50

that have been attacked by the Russians.

27:52

And they're using it to basically gather

27:54

evidence and make a case. And

27:57

within the last couple of weeks, they've gotten a couple of

28:00

indictments in the International Criminal Court based

28:03

on evidence collected with our drones. So

28:05

this is just one application. I

28:08

was actually in Ukraine two weeks ago. It's

28:12

really a window into the future when it

28:15

comes to drone use. And

28:18

I think it's very clearly the

28:20

future of conflict and war, but I

28:22

think that it's also the future for

28:24

other industries because the stakes are so

28:26

high there, they have to use drones.

28:29

I mean, on the front lines, they don't do anything without putting

28:31

a drone in the air. They're using them to find targets. They're

28:33

using them to keep their own troops safe. They're

28:35

using them at just massive, massive scale, consuming 10,000

28:38

plus drones a month. So

28:42

as we were showing the pictures there, you

28:44

notice, everyone noticed that we stopped on this

28:46

one particular picture. And this one looks very

28:48

different from all the other ones. Can you

28:50

explain what we're looking at here? Yeah,

28:53

I mean, so this was a civilian structure that

28:55

was attacked by the Russians. And

28:57

basically, they're using... This is a 3D

28:59

model that was created. So the drone

29:01

went out and captured hundreds

29:03

of photos of the structure. And

29:06

then there's a software process called photogrammetry where

29:08

those photos are stitched together to create a

29:11

metric 3D model that you can use to

29:13

take measurements, you can use to capture different

29:15

angles.
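To make the photogrammetry step concrete, here is a minimal two-view sketch in Python using OpenCV. It illustrates the general technique, not Skydio's pipeline: the file names and the intrinsic matrix K are assumptions, and a real product would use hundreds of images, GPS for metric scale, and bundle adjustment.

```python
# Match features between two overlapping drone photos, recover the relative
# camera pose, and triangulate the matches into 3D points. Production
# photogrammetry repeats this across many photos and refines the result;
# this shows only the core geometry.
import cv2
import numpy as np

# Assumed camera intrinsics (focal lengths and principal point, in pixels).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

img1 = cv2.imread("photo_001.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical files
img2 = cv2.imread("photo_002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Keep only confident matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Estimate relative camera motion, then triangulate into a point cloud.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T  # N x 3 points, up to an unknown scale
print(f"Reconstructed {len(cloud)} 3D points")
```

And so in their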

29:17

case, they're using it to figure out what

29:19

kind of munitions were used, which angle those

29:21

munitions came in from as part of

29:23

building a case. And it's actually worth noting that

29:25

this kind of thing is a major use case

29:28

for public safety agencies in the US. They use

29:30

these things for crime and accident scene documentation. This

29:33

is just a very, very high stakes and really

29:35

tragic example. So as you just mentioned a moment

29:37

ago, you were in Ukraine. What's

29:40

your takeaway after having seen the front

29:43

lines and how your drones are being used in

29:45

person? So as

29:47

I said, I think it's really a

29:50

window into the future. And I think

29:52

it unfortunately highlights this tension

29:55

and the necessity of having secure

29:58

alternatives to Chinese drones. So the

30:00

Ukrainians are still heavily dependent on

30:03

Chinese drones and Chinese drone

30:05

technology. They're still using them at massive

30:07

massive scale but it's a super fragile

30:10

dependency because these drones and the companies

30:12

that make them are ultimately hostile to

30:15

their interests. So you

30:17

know China has put in place export controls to

30:19

try to make it harder for the Ukrainians to

30:21

get these drones. The companies themselves

30:23

have put in place firmware things that make them

30:25

less useful to the Ukrainians so they constantly have

30:27

to hack the firmware to get them to do

30:29

what they need. And

30:31

you know,

30:33

I see this as just a significant

30:36

challenge to us and the rest of

30:38

the US industry to

30:40

step up our game and

30:43

deliver products that are useful

30:45

for them. And, you know, the impact isn't

30:47

zero today but it's not

30:49

as much as it should be. And this

30:51

is becoming a major major focus for myself

30:55

and Scott. You know, we've delivered thousands

30:57

of our drones to the US and

30:59

Western allies but where it matters

31:01

most, on the battlefield in Ukraine, I

31:03

think that's, you know, that's the proving ground,

31:06

and that's really what we need to be focused on

31:09

both for their sake, but I think ultimately for ours as

31:11

well. So this drone here

31:13

and the drones being used in

31:15

Ukraine are, as you mentioned before, being

31:17

used to, in addition to

31:19

all the other things you talked about, document war

31:21

crimes. But there are also other drones

31:24

that are being used as weapons of

31:27

war. And America has a storied history

31:29

with the use of armed drones to

31:31

kill enemies in foreign countries. How do

31:33

you reconcile the products you're developing with

31:36

the moral and ethical concerns of drone

31:39

warfare? Yeah, it's a great question and something that

31:41

we think a lot about. So the

31:44

Ukrainians are using all different kinds of

31:46

drones. I mean, they have fixed-wing

31:48

drones, quadcopter drones, multirotor drones, surveillance drones,

31:50

decoy drones. And then

31:52

they also have a category that they

31:55

call them FPV kamikaze drones. So FPV means

31:57

first-person view. It means you fly the drone

31:59

by wearing goggles that show you what

32:01

the drone sees. And this

32:03

is one of the primary methods that they're using

32:05

to deliver strikes now, where you fly an FPV

32:08

drone, you put a munition on it, and

32:10

you use it to go and find a target. So

32:12

that is not something that we make. We are very

32:14

focused on surveillance

32:17

drones. So this is in the military, they

32:19

call it ISR: intelligence, surveillance, reconnaissance. So our

32:21

drones are really designed to make it really

32:23

easy to see what's happening, get better information,

32:26

and make better decisions. And

32:28

I fundamentally believe

32:31

in the mission of the US military and

32:33

our allies, they have very difficult jobs to

32:35

do. And I think that giving them tools

32:37

that give them better information to

32:39

make better decisions is really important, good work.

32:42

I think there are tricky

32:44

questions as you start getting more and

32:46

more automated systems that

32:49

can deliver strikes, lethal strikes.

32:52

And this is something that the world is going

32:54

to have to grapple with. It's

32:57

not something that we're focused

33:00

on, because we're focused on the information gathering drones.

33:02

But that is still part of the kill chain.

33:04

I mean, that's part of the equation of having these

33:06

automated systems. And it's something that

33:08

we're getting increasingly thoughtful about as a company. But we're

33:11

not alone in it. The US military thinks about this

33:13

stuff a lot. And

33:15

they're increasingly refining their policies around

33:17

weapon systems and weapon systems with

33:19

AI. And I think it's

33:22

going to be a project for the world,

33:24

really, to grapple with. So Adam, let's talk

33:26

about police departments. You work

33:28

with police departments across the United States.

33:30

Explain how drones are being used in

33:32

those departments. So they're used for all

33:35

kinds of things. I

33:37

mentioned crime and accident scene documentation. If

33:40

you have a crash on a highway, it could

33:42

take three or four hours using

33:44

conventional methods for somebody on the ground to walk

33:46

around and perform all the measurements that they need

33:49

to gather the evidence that they care about. With

33:52

a drone, you can do that in 20 minutes.

33:55

You can clear the crash scene faster. It's just

33:57

a phenomenal thing for everybody. They're used for search

33:59

and rescue missions. I think we're seeing a search

34:02

and rescue mission sort of training exercise

34:04

here with Oklahoma City PD using

34:06

one of our drones. Hold on. Are

34:08

those voices coming from the video or is

34:11

something happening? Okay, it's coming from the video. We

34:13

could probably turn off the audio on the video.

34:16

Go ahead. Sorry, Adam. But

34:20

they're also used in the highest stakes

34:22

scenario. You might have a

34:24

situation where you have an armed suspect loose

34:26

in a neighborhood and conventionally

34:28

you'd have officers on an

34:30

eight hour manhunt going door to door, guns

34:33

drawn, looking for somebody that could

34:35

be dangerous. And we have

34:37

examples in our customer base increasingly on a daily basis

34:39

where you put a drone in the air. You

34:42

can find the person. You can see if they're armed or

34:44

not. You can guide the officers in. And

34:47

I think that's fundamentally better for everybody involved. You

34:49

know, it keeps the officer safer. It keeps the

34:52

community safer. It even keeps the suspect safer because

34:54

the officers don't have to guess about

34:57

what they're walking into. So

34:59

the notion of drones in policing and especially

35:02

AI enabled drones in policing, I think, can

35:05

get folks rightfully concerned because there's

35:07

certainly potential for misuse,

35:10

abuse, privacy invasion. You

35:13

know, what I would say having worked with a

35:15

lot of agencies over the last few years is

35:17

that our most sophisticated, best

35:19

customers take transparency incredibly seriously.

35:21

You know, a lot of them release

35:23

the footage from their drone flights quickly

35:26

after them. And

35:29

in a lot of ways, drones are like

35:31

a flying body camera. Like it creates this

35:33

sort of objective picture of whatever happened. But

35:37

you know, body camera typically can only document

35:39

what happened. Drones can actually affect the outcome

35:41

because you can put a drone into a

35:43

dangerous situation that you wouldn't want to put

35:45

a person to get that information proactively rather

35:47

than reactively and make better decisions. Hmm.

35:53

Yeah, let's talk about it.

35:55

Well, we only have

35:57

four minutes and 30 seconds, so I can't even

35:59

go down the rabbit hole I wanted to

36:01

go down. But what do you say to critics

36:04

who believe drones enable police departments to behave like

36:06

big brother? So I don't

36:09

want to live in a world where we have drones

36:11

randomly flying overhead surveilling our houses. I don't want

36:13

to live in that world. I don't

36:15

want to build that world. And

36:17

we have been very proactive on

36:19

this front. We've engaged

36:21

with public safety community organizations to release

36:24

a set of principles that

36:26

we think lead to the best outcomes

36:28

for their communities, things like

36:31

transparency, protecting privacy. And

36:34

there's a lot of things you can do from a

36:36

technology perspective to help with this. I mean, the

36:38

drone knows where it is. It knows what it's filming.

36:41

It can not point its camera at things that it

36:43

shouldn't be pointing its camera at. It can focus on

36:45

whatever task the officer cares

36:48

about.
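The behavior described here, a drone declining to record inside declared areas, can be sketched as a simple geometric check. This is a hypothetical toy, not Skydio's implementation; the zone, coordinates, and function names are invented for illustration.

```python
# Toy privacy check: a drone that knows its position and what its camera is
# aimed at can decline to record inside declared no-record zones.
from dataclasses import dataclass

@dataclass
class Point:
    x: float  # meters east of a local origin
    y: float  # meters north

def point_in_polygon(p, poly):
    """Standard ray-casting test: count how often a ray from p crosses edges."""
    inside = False
    for i in range(len(poly)):
        a, b = poly[i], poly[(i + 1) % len(poly)]
        if (a.y > p.y) != (b.y > p.y):
            x_cross = a.x + (p.y - a.y) * (b.x - a.x) / (b.y - a.y)
            if p.x < x_cross:
                inside = not inside
    return inside

# A declared no-record zone, e.g. private yards next to a search area.
no_record_zone = [Point(0, 0), Point(50, 0), Point(50, 30), Point(0, 30)]

def camera_allowed(target):
    return not point_in_polygon(target, no_record_zone)

print(camera_allowed(Point(25, 15)))  # False: inside the zone, don't film
print(camera_allowed(Point(80, 15)))  # True: outside the zone
```

And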

36:50

so I think

36:52

there's some technology things that can be done. I

36:55

think there's practices and best

36:58

practices within public safety

37:00

agencies that can be set up

37:02

and followed. And ultimately,

37:04

these agencies are accountable to

37:06

the communities that they serve.

37:09

There's always some democratic chain up

37:12

through that. And I think

37:14

that that's a really important mechanism.

37:16

And the

37:20

best agencies, I think, really lean into the transparency

37:22

element of it, which I think is really the

37:24

best answer. And what we've seen is that the

37:26

more that a community understands what their officers are

37:28

doing with drones, in general, the more

37:30

supportive of it they become because they can see the

37:32

impact on community safety. When

37:34

you say democratic chain, you mean small

37:36

d, democratic chain, meaning people have a

37:38

say. Yeah, I mean like the sheriff.

37:42

We sell to a lot of sheriffs, county sheriffs,

37:44

who are elected directly. So

37:47

you lengthened it earlier to sort of

37:51

like a body cam. And we've seen

37:53

instances where we thought body cams were

37:56

going to be the thing that solved

37:58

everything until we got there. We had

38:00

police involved shootings where, oh my

38:02

god, the body cam wasn't working,

38:04

or the body cam wasn't turned

38:06

on. So in the end,

38:09

who controls the footage that

38:12

the drone picks up? And

38:14

is there any possible way

38:16

that we could see

38:18

the, oh, the drone wasn't recording

38:20

in a high profile situation? So

38:24

this is a newer product, but

38:26

I think it's going to become the

38:28

dominant operating paradigm. We have a capability

38:30

with X10 called Remote Flight Deck, where

38:32

the drone has an LTE modem in

38:34

it. It can be flown remotely. So

38:36

you can have an officer in a

38:38

remote flight center who is, through

38:40

a web browser, controlling the drone.
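As a rough sketch of what such a remote-teleoperation loop could look like, the snippet below has a drone-side process receive steering commands over a WebSocket and stream telemetry back. The relay URL, message format, and command set are invented for illustration; this is not Skydio's Remote Flight Deck protocol.

```python
# Hypothetical drone-side teleoperation loop: commands arrive over an LTE
# link via a WebSocket relay; telemetry flows back for the operator's browser.
import asyncio
import json
import websockets  # pip install websockets

def apply_velocity(vx, vy, vz):
    print(f"velocity command: {vx}, {vy}, {vz}")  # stand-in for the flight controller

def point_camera(pan, tilt):
    print(f"camera command: {pan}, {tilt}")

def read_telemetry():
    return {"lat": 0.0, "lon": 0.0, "alt": 0.0, "battery": 0.95}  # placeholder values

async def drone_side(relay_url="wss://relay.example.com/drone/42"):
    async with websockets.connect(relay_url) as ws:
        async for message in ws:
            cmd = json.loads(message)
            if cmd["type"] == "velocity":
                apply_velocity(cmd["vx"], cmd["vy"], cmd["vz"])
            elif cmd["type"] == "camera":
                point_camera(cmd["pan"], cmd["tilt"])
            await ws.send(json.dumps({"type": "telemetry", **read_telemetry()}))

if __name__ == "__main__":
    asyncio.run(drone_side())
```

And one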

38:42

of the things that that does is

38:45

it gives some time and distance between

38:47

the person who's in the heat of the moment

38:49

on the ground. You've got somebody who's removed from

38:53

that who can be more objective

38:55

and be a guide to the people who are actually on

38:57

the ground. So increasingly, I think

39:00

you'll see the center of control shift

39:02

from the officers who are actually on

39:04

the scene in the moment to folks

39:07

who are helping them and on their

39:09

team. They're police officers themselves, usually, but

39:11

who are operating remotely. And I think

39:14

that separation can be quite valuable in

39:18

terms of the outcomes. And it's all digital stuff.

39:20

I mean, the drone is recording the video. We

39:24

have a partnership with a company called Axon that

39:26

has a digital evidence management system where

39:28

the video gets uploaded. It tracks chain of

39:30

custody.
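The chain-of-custody idea can be illustrated with a hash chain: each handling record is hashed together with the previous one, so later tampering is detectable. This sketch is a conceptual toy, not Axon's actual evidence-management system; all names and fields are invented.

```python
# Each log entry commits to the previous entry's hash, so editing any past
# record invalidates every hash that follows it.
import hashlib
import json
import time

def add_entry(chain, actor, action, file_hash):
    prev = chain[-1]["entry_hash"] if chain else "genesis"
    entry = {"actor": actor, "action": action, "file_hash": file_hash,
             "timestamp": time.time(), "prev_hash": prev}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(entry)

def verify(chain):
    for i, entry in enumerate(chain):
        if entry["prev_hash"] != (chain[i - 1]["entry_hash"] if i else "genesis"):
            return False
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["entry_hash"]:
            return False
    return True

chain = []
add_entry(chain, "drone-X10-042", "recorded", "sha256-of-video")
add_entry(chain, "officer-jones", "uploaded", "sha256-of-video")
print(verify(chain))  # True; altering any field above makes this False
```

So there's good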

39:32

controls in place. And I

39:34

think that the availability of this objective

39:36

aerial video is fundamentally a

39:39

good thing for transparency and

39:41

accountability of police officers.

39:43

Don't be fooled by the clock, because it says we

39:45

have 45 seconds. But I've got two questions.

39:47

So we're going to go over time. And

39:50

you've touched on AI technology. And

39:52

I want you to expand a

39:54

little bit in

39:56

reaction to something that retired Admiral

39:58

James Stavridis recently wrote in

40:01

the Wall Street Journal. And I'm quoting here,

40:03

the winning side will

40:05

be the one that's developed the

40:08

AI-based decision making that can outpace

40:10

their adversary. This is using

40:13

AI technology in drones. My question

40:15

is, what happens when the country

40:17

that's developing those AI drones en

40:21

masse is Iran or

40:24

China? Do you believe the US is

40:26

currently outpacing other countries

40:28

in that space? So

40:32

I basically wholeheartedly agree with the

40:34

sentiment there. And

40:39

I look at things through a drone lens. We're a

40:41

drone company. If you look

40:43

at the last decade of drones, there's no

40:45

question it has been dominated by Chinese companies.

40:48

But I think the shift to

40:50

AI and autonomy presents an opportunity for the

40:53

US to really stage a comeback. I do

40:55

believe that by and large,

40:57

the US is leading the way in

40:59

AI today, like the most cutting edge

41:02

stuff is happening here. And

41:04

this is fundamentally why we founded the company, because

41:06

we believe that AI and autonomy were the future

41:08

of the industry. And we

41:10

have a technology lead there over our Chinese counterparts.

41:13

And I think that that stuff is going to

41:15

become more and more important. So we

41:18

can't stop other countries from developing

41:20

it. But what we can do is move

41:22

as fast as we possibly can ourselves. And

41:25

I think that that really matters. The stakes are very

41:27

high. So we've talked

41:29

about the power and possibilities of

41:31

drones in warfare and public safety.

41:33

But is there anything that gives

41:36

you pause? Pause

41:40

in terms of how it's used? Yeah,

41:42

I mean, yeah. So we watch the Terminator

41:44

movies. That's what I

41:46

mean. Well, the drones, there's a humanity.

41:48

Yeah, there might be some software classes

41:51

inside Skydio called Skynet. Look,

41:55

I'm

41:57

a technology optimist. I think that if you look

42:00

at what our products are doing today. The

42:03

most exciting thing to me, we're building this

42:06

incredibly cutting edge sci-fi level technology. But

42:08

we're really deploying it to the core industries

42:10

that our civilization runs on. We're giving it

42:12

to energy utilities and construction workers and departments

42:14

of transportation. And they're using it

42:17

to do things where the alternative

42:19

is like a person hanging

42:21

off a rope to go inspect something,

42:23

or not knowing the state

42:26

of an insulator in a substation that could lead to

42:28

an explosion. And so there

42:31

are certainly things that you could be concerned about. And

42:33

we try to think through this stuff. We try to

42:35

be proactive as a company. We try to engage with

42:37

external experts to help us think about our product roadmap

42:39

and whatnot. But I think that

42:42

the impact today is overwhelmingly

42:45

positive. And I still think we're just scratching

42:47

the surface of what's possible. So

42:50

I'm generally very excited and optimistic about

42:52

the impact that technology is having now and where

42:55

it's going. OK, so I lied. I have one

42:57

more question because I got to ask it before

42:59

the last question. And you've mentioned this a couple

43:01

of times. And for those of you who know

43:04

drones and the technology, I

43:06

apologize for the simple-minded question. But a

43:08

couple of times, you talked about expert

43:10

pilots. Yeah. How do

43:12

you become an expert pilot? I

43:15

mean, if I flew airplanes, would I be

43:17

a good candidate to become a drone pilot? This

43:20

is a great question to end on. So

43:22

I'm an expert pilot. An

43:24

expert drone pilot? I am, yeah. So I agree. Just to

43:26

be clear. Yeah. Which

43:28

is actually harder than being a real airplane

43:30

pilot. We could talk about that. Because

43:32

when you're flying an airplane, you're in it. You

43:34

see exactly what it sees. You feel what it

43:36

feels. Whereas when you're on the ground trying to

43:38

fly a drone, you have to sort of project

43:41

yourself into it. So I

43:43

grew up flying radio-controlled airplanes. I took it

43:45

way too seriously as a kid. I actually

43:48

won a couple of national championships. There's competitions

43:50

for drone flying. Most

43:52

people don't know about that. I did not. It is

43:54

as nerdy as you would expect. Absolutely. So

43:59

it takes years of practice, like anything else,

44:01

like you can spend as much time as you want on it,

44:03

and the more time you spend, the better you get. And actually,

44:05

so I grew up flying this stuff, and

44:07

then I was starting grad school at MIT around the

44:09

time when you could first take radio-controlled airplanes and

44:11

put computers and sensors on them, and write

44:13

software to get them to do smart stuff. So

44:15

my origin into this was basically trying to take

44:18

the things that I could do as an

44:20

expert pilot and codify them into

44:22

algorithms to get the drone to

44:24

do it on its own.
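As a toy illustration of what codifying pilot skill into an algorithm can mean, here is a proportional-derivative controller that steers a simulated drone toward a waypoint, the kind of low-level primitive an autonomy stack builds on. The gains, limits, and dynamics are made-up assumptions, not Skydio's control code.

```python
# PD rule: accelerate toward the position error, damped by current velocity.
import numpy as np

def pd_acceleration(pos, vel, target, kp=1.2, kd=0.8, a_max=5.0):
    cmd = kp * (target - pos) - kd * vel
    mag = np.linalg.norm(cmd)
    if mag > a_max:
        cmd *= a_max / mag  # respect an acceleration envelope
    return cmd

# Fly to a waypoint with simple Euler integration.
pos = np.array([0.0, 0.0, 10.0])       # start position, meters
vel = np.zeros(3)
target = np.array([30.0, 20.0, 15.0])  # waypoint
dt = 0.1
for _ in range(600):                   # 60 simulated seconds
    vel += pd_acceleration(pos, vel, target) * dt
    pos += vel * dt
print(np.round(pos, 2))  # settles near [30. 20. 15.]
```

And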

44:27

that's sort of the genesis of the technology

44:29

stack that's in our Skydio drones

44:31

now. Hmm.

44:35

After hearing that last answer, I really do

44:37

think you should change the pronunciation to Skydio.

44:41

Adam Bry, thank you so much for joining us

44:43

here today. Thank you. Thank you. Hold

44:46

on one second. Yeah,

44:48

this is the closest we'll get to

44:51

TV at a newspaper. That concludes today's

44:53

Futurist Summit. Whether you joined us here

44:55

at the Washington Post or online, thank

44:57

you for being part of a fascinating

45:00

day of conversation. For those in

45:02

the audience, please join us in

45:04

the lounge for lunch. I'll

45:06

be out there with a number of my colleagues from The

45:08

Post across the newsroom. So

45:10

look for the sign Meet The Post. We

45:12

look forward to meeting you. Thank you again.

45:18

Thanks for listening. For more information

45:20

on our upcoming programs, go to

45:22

washingtonpostlive.com. Is

45:25

there anything more satisfying than finding something that

45:27

perfectly lines up with your taste and checks

45:29

all the boxes? Like getting

45:31

the perfect fit with a suit from Indochino.

45:34

Their suits are made to measure and

45:36

totally customizable with endless options. From

45:38

timeless classic to bold statement, you

45:40

can express your style exactly how

45:42

you want. Choose your own cut,

45:45

fabric, lining, buttons, lapels,

45:47

and more, all

45:50

at a surprisingly affordable price. They

45:53

also offer fully customizable blazers,

45:55

pants, outerwear, women's wear, and

45:57

more. Every Indochino piece

46:00

is made to your exact measurements, and they make

46:02

getting measured easy. Simply set

46:04

up your measurement profile in less than 10 minutes.

46:07

You can send your measurements online from the comfort of

46:09

your home, or make an appointment at one of our

46:11

showrooms. Find the perfect

46:13

fit with Indochino. Go to

46:15

indochino.com and use code PODCAST to

46:17

get 10% off any purchase of

46:19

$399 or more. That's

46:22

10% off with code PODCAST.
