Hertzbleed

Released Saturday, 18th June 2022

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:11

Are you starting or am I starting?

0:13

Ah, that's a great question. Welcome

0:15

to security, cryptography, whatever.

0:18

My name is David. I'm here

0:20

with Deirdre. Who's, uh,

0:24

who's been having a rough day. We'll put it at that.

0:29

Um, we are unfortunately, uh,

0:31

Thomas-free this week. So

0:34

you're stuck with just the two of us, and

0:37

we're gonna do our usual game where

0:40

Deirdre understands math and then

0:42

I ask questions and

0:45

we will be playing the Hertzbleed

0:48

version of that game

0:49

yeah. Uh,

0:51

talking about some other stuff too. So

0:54

yes,

0:55

what is Hertzbleed?

0:56

Hertzbleed is a new, uh,

0:58

side channel attack. a

1:00

physical side channel attack paper against

1:03

one of my favorite things. Uh, an

1:05

implementation of SIKE, which is super

1:08

singular isogeny key encapsulation,

1:11

um, and TLDR.

1:15

If you have turbo mode activated

1:17

on your processor, AKA

1:20

frequency scaling, depending on

1:22

how much power it's using it

1:24

can cause, uh, variation,

1:27

uh, in the processing time

1:30

of your crypto. Uh, and it varies

1:32

according to the value of your secret. Therefore,

1:35

if you can measure this, uh, you

1:37

can extract your secret and

1:39

they were able to, uh, leverage this against,

1:42

um, my favorite isogeny based

1:44

crypto system, which is not making

1:46

it to the, uh, final round of

1:48

the NIST post quantum competition. But

1:50

it is still, uh, you know,

1:53

everyone's favorite redheaded stepchild

1:56

of the post-quantum crypto world. Um,
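A toy illustration of the leakage chain Deirdre just described: identical instructions, data-dependent power draw, hence data-dependent frequency and wall-clock time. Whether any effect shows up depends entirely on the processor and its scaling behavior; this is a sketch of what you would measure, not the paper's methodology.

```ts
// Time the same 256-bit squaring loop over low- vs. high-Hamming-weight data.
// On processors where frequency scales with power draw, wall-clock time can
// differ even though the instruction stream is identical. Effect size (if
// any) is machine-dependent.
const MASK = (1n << 256n) - 1n;

function timeLoop(x: bigint): number {
  let acc = x;
  const start = performance.now();
  for (let i = 0; i < 2_000_000; i++) {
    acc = (acc * acc + x) & MASK;
  }
  const elapsed = performance.now() - start;
  if (acc === -1n) console.log(acc); // keep the work from being optimized away
  return elapsed;
}

console.log("low Hamming weight: ", timeLoop(1n), "ms");
console.log("high Hamming weight:", timeLoop(MASK), "ms");
```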

1:58

uh, I feel like this is like a targeted

2:00

Deirdre bug because

2:02

a little,

2:03

SIKE, and then it's an architecture,

2:06

uh, side channel. Um,

2:08

so your partner who works in architecture,

2:12

um,

2:12

Oh God.

2:13

it's now his problem.

2:14

Yeah. And I literally was like,

2:17

there's a new paper. It has to do

2:19

with turbo scaling. And he's like, I already

2:21

know what the attack is. Stop talking to

2:23

me about it. um,

2:26

and so there is already

2:28

a known attack. Uh,

2:30

so the way that SIKE basically

2:32

works is that instead of doing this ephemeral

2:35

key exchange, kind of like elliptic

2:37

curve Diffie-Hellman, but with, instead

2:39

of exchanging points on curves, you exchange curves,

2:42

but also some points on those curves so that you

2:44

can do the final thingy and get to a shared

2:46

secret. Uh, it's kind of doing,

2:49

um, it's almost,

2:51

it's, it's solving the answer before

2:54

and then encrypting with it

2:56

and then giving you your key.

2:59

And then this is the whole KEM,

3:01

this key encapsulation mechanism.

3:03

So it's

3:04

So they, they tweaked the API to fit

3:06

into the NIST KEM, uh,

3:09

uh, format.
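For reference, the KEM shape being fit into here, key generation, encapsulation, decapsulation, looks roughly like the sketch below. The type and function names are illustrative, not the actual SIKE reference API.

```ts
// Rough sketch of the generic KEM interface the NIST competition expects.
interface KeyPair { publicKey: Uint8Array; secretKey: Uint8Array; }
interface Encapsulation { ciphertext: Uint8Array; sharedSecret: Uint8Array; }

interface Kem {
  keygen(): KeyPair;
  // Sender: derive a fresh shared secret plus a ciphertext encapsulating it
  // under the recipient's public key.
  encapsulate(publicKey: Uint8Array): Encapsulation;
  // Recipient: recover the same shared secret from the ciphertext using the
  // secret key.
  decapsulate(ciphertext: Uint8Array, secretKey: Uint8Array): Uint8Array;
}
```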

3:09

yeah. And so,

3:11

addressed, if I recall correctly,

3:13

some of the "oh no, it's a static key, and

3:15

therefore the out-of-group attacks apply."

3:17

Yes. Uh,

3:19

SIDH, uh, you should not use

3:22

it in a static, uh, ephemeral

3:24

setting. There is a, uh, you know,

3:26

active, adaptive attack that you can use

3:28

against that, but SIKE

3:30

turns it into, uh,

3:33

basically, something that would allow you to,

3:35

uh, in theory, use it

3:37

for static, um,

3:39

settings. So for example, if you have a

3:41

long term identity key, like in the signal protocol

3:44

or classic signal protocol, um,

3:47

you, that would live for a long time,

3:49

um, you would not want to use pure

3:51

SIDH to replace that, because you

3:53

could just run this attack against that static key

3:55

over and over again, and you would get,

3:57

uh, you would be able to get the secret out. In

3:59

theory, you should be able to do

4:01

this with SIKE instead; the

4:03

API would be different and not as

4:06

pretty, but you could, in theory, do it. This

4:08

attack, uh,

4:11

if it works, does a thing

4:13

that would let you extract the key,

4:15

the secret key from your SIKE key pair,

4:17

but it's a side channel

4:20

power analysis attack.
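As an aside, the static-key attack pattern mentioned above is an adaptive oracle loop, roughly of the shape sketched below. This is a schematic of the idea (in the spirit of the known adaptive attacks on static SIDH keys), not working cryptanalysis; the oracle and the input-crafting function are placeholders.

```ts
// Schematic of an adaptive attack on a static key: craft a malformed input
// per guessed bit, observe whether decapsulation "succeeds," and learn the
// secret one bit at a time. Real attacks need carefully constructed inputs.
type Oracle = (craftedInput: Uint8Array) => boolean;
type Craft = (bitIndex: number, guess: 0 | 1) => Uint8Array;

function recoverStaticKey(bits: number, craft: Craft, oracle: Oracle): number[] {
  const secret: number[] = [];
  for (let i = 0; i < bits; i++) {
    // Query with an input whose behavior depends on bit i of the secret.
    secret.push(oracle(craft(i, 0)) ? 0 : 1);
  }
  return secret;
}
```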

4:22

the other thing that was you

4:24

know, novel about it is that you can do it remotely.

4:27

So you would make queries either

4:29

like over some, you know, web

4:31

app request on your local

4:34

network. Um, rather

4:36

than, like, break out the multimeter

4:38

and plug it into the processor

4:40

and

4:40

Yeah.

4:41

going on.

4:41

And I think, I think the way that they

4:43

actually leverage that is they are able,

4:46

so we'll, we'll get into to my

4:48

quibbles about this paper, uh,

4:51

next, uh, they're able to measure

4:53

it in the timing. So, so

4:55

kind of, I won't get super detailed

4:57

into the, the physical thing, but basically for

4:59

certain processors, if you're

5:01

compute and— out of the box,

5:04

they're configured to, uh, turn

5:06

on like turbo mode. Sometimes if your

5:08

compute is taking too long or taking too much

5:10

power, the processor can decide

5:12

to just go into like another gear and

5:14

tweak with the processor cycle

5:17

rate, the frequency. Um,

5:20

and this attack is leveraging

5:23

that change in signal, uh,

5:26

to extract information about the secret

5:28

and they get enough information— and they're able

5:30

to, they're able to measure

5:33

that in the timing of

5:35

the computation. So in

5:37

theory, you could also measure this by like slapping

5:39

a, like a multimeter on the machine

5:41

and measure the power or slapping

5:44

an EM, uh, radio, uh,

5:46

on the wall next to the computer. But

5:48

if you can measure this stuff over

5:51

like network latency, if

5:53

it's large enough, and this is in the milliseconds,

5:56

not microseconds, but milliseconds,

5:59

um, that's something you can measure over the network

6:02

to a degree. So that's why that's remote

6:04

and that's, that's kind of scary for people and stuff

6:06

like that.
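A sketch of what "measuring this over the network" amounts to: hammer one endpoint with identical requests and compare latency distributions. The endpoint and sample count below are made up; the point is that a millisecond-scale, secret-dependent slowdown can survive network jitter if you take enough samples and compare medians.

```ts
// Collect request latencies for a fixed probe input and summarize them.
// The URL is hypothetical; repeat per candidate input and compare medians.
async function sampleLatencies(url: string, n: number): Promise<number[]> {
  const samples: number[] = [];
  for (let i = 0; i < n; i++) {
    const start = performance.now();
    await fetch(url, { method: "POST", body: "fixed-probe-ciphertext" });
    samples.push(performance.now() - start);
  }
  return samples.sort((a, b) => a - b);
}

const latencies = await sampleLatencies("https://target.example/decapsulate", 500);
console.log("median latency:", latencies[Math.floor(latencies.length / 2)], "ms");
```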

6:07

That's one of the reasons that they're they're targeting,

6:10

uh, SIKE right, is because all the

6:12

post quantum stuff is bigger and slower

6:14

than where we've gotten elliptic

6:16

curves to these days by virtue of

6:19

constantly redesigning them for performance,

6:22

and then applying the Bitcoin

6:24

community to performance improvements

6:26

in exchange for money.

6:27

yes. This is where I get into

6:30

sort of like, cool. This is interesting

6:32

and novel, you know, this is a cool,

6:34

interesting thing. Uh,

6:37

good, good job. You're able to mount this attack.

6:39

However, your target

6:42

is the slowest computation

6:45

in all of the post quantum crypto systems.

6:48

The one of the largest parameter

6:50

sets of that crypto system that

6:52

is available. This is using the

6:54

p751 parameter set; that is a 751

6:58

bit prime field, uh,

7:01

over an extension field. So it's not

7:03

just all the, the SIDH's

7:05

and the SIKEs are not just doing a,

7:07

you know, finite field arithmetic, from

7:10

zero to a 751

7:12

bit number they're doing quadratic

7:14

arithmetic. So they're doing, you

7:16

know, whatever the scaling number

7:18

of computations that you have to do, you're doing,

7:21

more. Um, that

7:24

used to be a good middle of

7:26

the road parameter set... eh,

7:28

let's say, six years ago for

7:31

SIDH and SIKE. Since

7:34

then the security analysis,

7:36

cryptanalysis, uh, and,

7:38

and everything else about S I D H and SIKE,

7:41

um, at like the fundamental computational

7:43

level, not like this kind of protocol level,

7:45

like, you know, whatever vulnerability

7:48

stuff, has brought the

7:50

like security level up

7:52

of the protocol. And that lets us bring the

7:54

parameter sets down and make

7:56

the protocol faster without losing

7:59

security. So they should—

8:01

I asked the, oh, the co-author or one

8:03

of the co-authors online: did you try this against

8:05

the 434 parameter set?

8:08

So that's a 434 bit

8:10

base prime field. And that

8:12

parameter set is, I

8:14

don't know, like a hundred percent

8:17

faster, usually just to do computation

8:19

than the one that they targeted. Um,

8:22

in theory being, this

8:25

would go fast enough that

8:28

this turbo mode that's on by default on

8:30

these processors would not kick in as

8:32

quickly. And you would not be able

8:34

to get as good of a signal out

8:37

of the target that you're trying to attack

8:39

as you do for the p751 that they did

8:41

in their paper. They

8:44

said, they're gonna go try it. They haven't done

8:46

it yet. They haven't done it for elliptic curves

8:48

yet just pure classical elliptic curves yet;

8:50

I look forward to their results.

8:54

Well for, for elliptic curves, it's still,

8:56

I think, seems to be an open question as

8:58

to like, if it's even possible because of the speed,

9:01

but, um, the turbo scaling. Do

9:03

you, uh, how does that, when

9:05

does that kick in and why

9:07

does making something faster or

9:10

shorter, make

9:10

oh, so the idea being,

9:13

um, like how would you mitigate this?

9:15

Basically, if your compute

9:17

is faster and more efficient,

9:20

it's less likely that your processor is

9:22

going to think, uh, or learn

9:24

or use a heuristic, or whatever, like,

9:27

oh shit, you're using a ton of power. You're taking

9:29

a long time. I gotta like tweak my

9:31

clock cycles and, you know, change

9:33

how I'm allocating my compute

9:35

in order to like, switch it over

9:38

here or, you know, reallocate it from

9:40

this process over there or whatever it's doing.

9:42

If you're faster and more efficient,

9:45

it's less likely to get into this

9:47

side channel eliciting

9:50

behavior. That's the theory.
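Worth noting: the classic defense in this space is constant-time code, where branches and memory accesses don't depend on secrets, along the lines of the sketch below. Hertzbleed's whole point is that this is no longer sufficient on its own, because power draw, and therefore frequency, still depends on the data values being processed.

```ts
// Classic constant-time selection: no secret-dependent branch, the same
// instructions execute regardless of the secret bit. Hertzbleed's observation
// is that the *values* flowing through still influence power, and power
// influences frequency, so timing can leak anyway.
function ctSelect(bit: number, a: number, b: number): number {
  const mask = -(bit & 1); // all ones (as a 32-bit value) if bit is 1, else 0
  return (a & mask) | (b & ~mask);
}
```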

9:52

Versus the usual state of a processor,

9:54

which is just blocked on IO or blocked

9:56

on memory, um, or both,

9:59

uh, blocked on JavaScript.

10:01

Oh God damn JavaScript. Unless

10:03

it's, unless you're doing this...

10:06

like, God, if people are doing

10:08

751-bit SIKE in JavaScript,

10:10

I'm not gonna give anyone ideas. Someone's probably done

10:12

it already. Uh, yeah. It's interesting.

10:15

It's interesting. But I would like to see them try

10:17

to mount it against a less vulnerable

10:20

target. How about that? Hmm, love

10:22

that side

10:22

I've been thinking about, you know,

10:24

Spectre a little bit recently,

10:27

um, because, in

10:29

my day job at Chrome, which

10:32

I'm not speaking on behalf of, um,

10:35

you know, Site Isolation feature

10:38

was a big deal. Um,

10:40

and the basic idea behind Site

10:42

Isolation is: when Chrome launched, it was one process

10:45

per tab, and everyone thought that was cool because

10:47

websites and browsers used to crash all the time.

10:50

And now just the tab would crash

10:52

and not the whole browser. And you would get the

10:54

sad tab, uh, little face. And

10:57

that was cool and that had security benefits as well.

10:59

Um, and that, you know, the stuff in one tab

11:01

couldn't leak into another. But

11:05

the fact of the matter remained that like, if you somehow

11:07

got code execution in a tab

11:09

in the renderer process, you could then just be

11:11

like, let me load up Facebook. And then

11:13

suddenly Facebook and all of its data is in

11:15

that tab. And, uh, you

11:18

can get data out that way. And so

11:20

Site Isolation was developed

11:22

as an idea of like, you know, let's actually do one process

11:24

per site rather than one process per tab to

11:27

mitigate some of those things. And it

11:29

had the side effect of— Spectre

11:32

gave you basically, if

11:34

you have like JavaScript

11:37

access because of the timing side

11:39

channels, you have, um, the ability to

11:42

read any other memory in the process

11:45

without a bug simply,

11:47

uh, by virtue of controlling the JavaScript,

11:50

even if like by normal course of execution,

11:53

Like that data was guarded by like,

11:55

if statements and so on, like you aren't supposed to be

11:57

able to read it. Um, and so you

11:59

could leak data out. And so that kind of like

12:01

accelerated some of the deployment of Site

12:03

Isolation, but without doing

12:05

a whole history of that, of which many other people are

12:07

more qualified to discuss, like

12:09

the big fear with Spectre

12:11

in terms of the web was like that you

12:13

would take one of these side channel based exploits

12:16

that gives you some form of memory

12:18

read and then like, uh,

12:21

weaponize and, or, you know,

12:24

like one click the exploit so that it

12:26

works in more situations. And

12:28

so at the time,

12:31

like there's a chance that like

12:33

this could have been really existential, uh,

12:35

to the web and then, you know,

12:37

Site Isolation more or less mitigated

12:39

it, uh, that took some time. But

12:41

also on the flip side, um,

12:44

like that just didn't happen. Like

12:46

it turned out, like, I don't know of anybody

12:48

that used Spectre to actually exploit anything.

12:51

It's not like something in your standard Metasploit

12:53

toolkit and so on. You don't

12:55

see like N-day Spectre exploits

12:58

it's like so targeted. Uh,

13:01

it's like hard to have a generic

13:03

exploit using that.

13:06

I think.

13:06

And so, as a result,

13:08

like, uh, not

13:11

to say, like, it was, it was overblown

13:13

or anything, but just like it didn't, um,

13:16

it could have been much, much worse and

13:19

you know, every time you see one of these, there's

13:21

like the spectrum of like, you know: this

13:23

is cool, but then I see like, oh,

13:25

reading out secrets then like the real

13:28

trick for these is the closer you can get to just

13:30

like arbitrary memory read, um,

13:33

the like higher and higher and higher the

13:35

impact is. Um, and

13:37

if you're like writing or finding

13:39

these exploits, the like goal

13:41

for the mega exploit is more

13:44

or less arbitrary memory read.
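For context on the JavaScript angle above: Spectre-style attacks in the browser lean on a timing primitive that distinguishes cached from uncached memory by how fast a read completes. A much-simplified sketch follows; note that browsers deliberately coarsened performance.now and gated SharedArrayBuffer-based timers precisely to blunt this.

```ts
// Simplified cache-timing probe: time an array access; a fast read suggests
// the cache line was already touched (e.g., by misspeculated code). Real
// Spectre PoCs need a much higher-resolution timer than browsers now expose.
function probe(arr: Uint8Array, index: number): number {
  const start = performance.now();
  const value = arr[index]; // the access being timed
  const elapsed = performance.now() - start;
  if (value === 256) console.log(value); // impossible for a Uint8Array; keeps the read live
  return elapsed; // small => likely cached, large => likely a cache miss
}
```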

13:45

yeah, but like Spectre and

13:48

Meltdown had a lot of vendor

13:52

coordination to deploy stuff,

13:54

like retpolines and

13:56

all that stuff, before it

13:58

was disclosed in 2017. Right.

14:01

Yeah. Some of the retpolines

14:03

are gone. And now, like for example,

14:06

V8 in Chrome, the JavaScript compiler,

14:08

had some of those types of mitigations in

14:10

its code generation that have since

14:12

been removed, um, because

14:15

of Site Isolation.

14:16

Awesome.

14:17

now, like what your operating system chooses to

14:19

do about this, you know, that's a different question. That's like Spectre

14:22

in the web. Um, versus,

14:24

uh,

14:25

the

14:25

you know, Spectre in your hypervisor, uh,

14:27

which is like the same problem, different shape. Um,

14:31

I'm not sure what they're doing there.

14:32

yeah. I wonder I,

14:35

I would ask this to Thomas, but what do you think, do

14:37

you think, like enough

14:39

mitigations getting deployed

14:41

at announcement time kind of blunted

14:44

the, the drive to

14:46

try to exploit this? Or is it

14:48

just too hard for a 'sploit

14:50

kit to leverage?

14:52

I don't know. I suspect

14:55

the, uh, the

14:57

mitigations stopped some of the, like,

15:01

Heartbleed-esque exploitation

15:04

in the sense of just like, let's see, who can read

15:06

what type of things that happened after

15:09

Heartbleed happened. Um, but

15:12

I, I don't know, uh, I

15:15

mean, you know, uh, hindsight

15:18

is 20/20. Um, I don't, I

15:20

don't know how much we were. Um,

15:22

I, I think in retrospect, and

15:25

I've heard even other people say that like Site

15:27

Isolation wasn't worth it relative to Spectre. Well,

15:30

you know, in my personal opinion Site Isolation was

15:30

worth it just for the UXSS improvements.

15:34

Yes.

15:35

but, um, and Site Isolation's

15:37

definitely worth it, you know? Um, right.

15:40

Like had you been wrong? Had we been wrong about,

15:42

uh, Spectre, had it been mass exploited like

15:44

that would've been really bad, um,

15:47

an extinction level event, if you

15:49

will

15:50

Yeah, Jesus Christ. Um, that

15:52

was another vague annoyance. When

15:54

I, when I saw the, the name and

15:57

branding of this, uh, exploit

15:59

this paper as Hertzbleed. And I

16:01

was like, what? It's not

16:03

related to Heartbleed. Heartbleed was like

16:05

memory safety, right. It was a, it

16:07

was a bug, but also memory safety

16:10

that resulted in Heartbleed. This is,

16:13

were memory reads...

16:15

there were memory reads, but like in

16:17

a very different way. And so, yeah, that's the

16:19

bleed of like, you're leaking

16:21

some information that you're not supposed

16:23

to, but like in a very different manner.

16:25

And it, and it

16:27

was like, Heartbleed was just like, you

16:29

ask it to do something. And it's like, here,

16:32

I'm gonna read these

16:33

I would— hi, I'd like 65 K of memory,

16:35

please

16:36

yeah. Like, and it's like, here you go. Here's the

16:38

private key of this RSA

16:41

cert, or whatever the fuck. Um,

16:44

but, uh, yeah. God,

16:47

like Heartbleed was... my

16:50

running theory is that there's enough

16:53

just dumb bugs

16:55

in memory-unsafe code

16:58

in the world that the

17:00

exploit kits, and even

17:02

a lot of adversaries

17:04

that are trying to exploit you don't

17:07

need to turn to attacks like this.

17:09

Like there are probably people out,

17:11

out there in the world who are like, I'm keeping this

17:14

proof of concept in my back pocket

17:16

just for funsies, but

17:19

I think there's enough low hanging fruit

17:21

that they can get a lot of bang for their buck

17:23

for, without turning to

17:26

microarchitectural attacks like Spectre

17:28

and Meltdown, or, uh,

17:30

side channel attacks that leverage

17:33

physical things like Hertzbleed

17:35

or, or any of these other things. They're

17:38

interesting whiz bang and

17:40

cool proofs of concept. But,

17:42

I mean... people

17:44

who work with those sorts of people

17:46

could probably tell me otherwise,

17:49

but I have a feeling that it's like a, you know, return

17:51

on investment sort of thing. There's plenty of other software

17:54

bugs that they could leverage instead of these.

17:57

I mean, I think that's even broadly true of software

17:59

bugs. Um, my, my role on this

18:01

podcast is to basically recapitulate other people's

18:03

opinions that I've talked to recently. Um,

18:06

and so I'll say I was talking to,

18:08

uh, uh, someone else recently

18:10

and he was like, look, my priorities

18:12

are like single sign-on, malware,

18:16

a bunch of other stuff, like

18:19

zero-days in memory safety bugs, and

18:21

use-after-frees in stuff that I'm using.

18:24

Right. Um, and then, you

18:26

know, you probably have another giant gap

18:28

and then like hard, hard

18:30

to exploit side channels. And

18:33

if you're, if you're a platform, um,

18:35

right. These matter a lot more, um,

18:38

if you're a cloud platform or an operating

18:40

system, right, you need, you need to be thinking about

18:42

these things cause you're responsible for like everybody.

18:46

Um, but uh,

18:49

if you are simply, you know, using

18:51

elliptic curves or, or running

18:54

a, a TLS server, you probably,

18:57

um, have other things to worry

18:59

about.

18:59

yeah. Um, I

19:01

think Mike Hamburg was talking

19:04

about this on a mailing list and he

19:06

basically, he works on

19:08

stuff, I think crypto

19:10

co-processor or something like that. And

19:12

so he, he both has a,

19:15

a vested interest, but also domain expertise

19:17

and basically being like: if you

19:19

can, and you care, just have

19:21

a crypto processing unit,

19:24

like a, you know, trusted enclave sort

19:26

of dealie, or, you know, a crypto chip

19:28

in your, you know, your iPhone

19:31

or your server cloud server

19:33

rack. If that's what you do. And

19:36

it's just very well constrained,

19:39

very well specified,

19:41

you know, performance characteristics. It does

19:43

not do variable scaling.

19:46

It does not do, you know,

19:48

all these like side channels that you

19:50

really, really care about. It does not do

19:52

Spectre. It does not do speculative execution.

19:54

It doesn't do all this stuff. And then you

19:57

can just trust it to either

19:59

execute a very narrow set of cryptographic

20:01

operations. Or you can just trust it to compute

20:03

over secret data. And it's not gonna leak

20:05

its ass out and then you're

20:07

done-ish. Or, uh,

20:10

that's a great way for,

20:12

for doing the actual crypto operations.

20:14

I would caveat it a little bit with like, um,

20:17

like last year I was doing some

20:20

work with, like, the Keychain

20:22

and Secure Enclave APIs on iOS,

20:24

um, for an app, um,

20:26

in the course of my day job, you know, we had some keys in

20:28

there and they were elliptic curves and we were signing

20:31

stuff. And let me tell you those

20:33

APIs are, like... the

20:35

device might be very well specified; the

20:37

software APIs for interacting with it are not.

20:40

Um, and, uh, they're also

20:43

like in my experience, haven't been particularly reliable

20:45

and I've heard this from other people where just like you,

20:48

when you make a Keychain call and sometimes it just doesn't

20:50

work. Um, it doesn't necessarily return

20:53

an error code. It just, like, doesn't work. You get

20:55

like a 404, you don't have a key or something,

20:57

even though you do. Or you have, like,

20:57

multiple seconds of,

21:01

uh, of delay, like

21:03

asking, you know, what are the root certificates,

21:06

like Keychain, that type of stuff. Um,

21:09

it, like, just doesn't return,

21:11

you know, you just have to try again or like

21:14

you know, um, so

21:17

fun.

21:18

all that not to, like, specifically knock

21:20

on Apple or anything, but just to say that, like,

21:22

there's two aspects of that: there's well-specifying

21:25

the hardware so that it does these crypto

21:27

things well in hardware, which I think we're pretty good at. You

21:29

don't see that many, like attacks

21:31

against the hardware in Yubikeys and so on.

21:34

Um, but what you do see is like,

21:37

oh, this HSM has an API that

21:39

like, lets you do something dumb or like

21:42

is hard to use or...

21:44

HSMs, someone

21:46

needs to disrupt that market. They

21:48

suck. Their APIs suck. I'm gonna

21:50

blame FIPS for a lot

21:52

of that trouble. Like, it's

21:55

it, uh, like there's, there's iOS

21:57

APIs in one bucket and HSMs

22:00

and what they, how they behave and their

22:02

APIs and using them in

22:04

a whole other bucket on the other side of the

22:06

ocean, uh, they

22:08

can be so much better.

22:10

I think the, the market for, like, you know, secure enclaves

22:12

and phones and so on is clearly very large.

22:15

However, I would posit that the market

22:17

for actual like HSMs that you would

22:19

put in a rack is roughly what you would call

22:21

a zero billion dollar market.

22:23

yeah.

22:24

so you might not, you might

22:26

be waiting a while on that

22:27

There's like two companies. And like the people who need them are

22:29

like, they need it to be FIPS certified. They need

22:31

this, they need that, whatever. Like this was part

22:33

of the reason that, uh, Zcash,

22:36

uh, shielded transaction adoption was so

22:38

difficult is that the places that we

22:40

wanted to deploy these things are like, cool.

22:42

Uh, we do all these things to

22:44

be, like, conformant with these regulations for

22:46

storing stuff. Like, give

22:48

me an HSM that supports your weirdo

22:51

bespoke elliptic curve so that I can

22:53

do elliptic curve signatures

22:55

for these shielded transactions. Like let alone the proofs.

22:58

And we're like, ah, here's a

23:00

smart card software. Like just getting

23:02

it there is like there's

23:04

enough friction of getting stuff like

23:06

that deployed that it was definitely

23:08

like, uh, like slowed

23:11

the adoption of shielded Zcash

23:13

for ages. And it's just like

23:15

the things that you don't think about until like

23:17

the rubber meets the road of like, yeah, I need

23:19

a, a box that can call

23:21

an API that's like "sign this," and

23:24

it needs to, like... Do you have a box?

23:26

And you're just like, no. And they're like,

23:28

do you have to do a custom thing to support it? And they're

23:30

like, uh, I don't care

23:33

enough. and

23:35

I blame the HSMs.

23:37

Mm-hmm right.

23:39

don't really

23:40

right. It's like databases

23:42

for scientists and researchers. It's a zero

23:44

billion dollar market.

23:45

yeah. Yeah. Unless

23:47

Andy Pavlo disrupts the databases. He's

23:50

a database researcher guy.

23:53

I, I believe it was Stonebraker that first called

23:56

scientific databases, a zero billion

23:58

dollar market, or at least that's where I first heard

24:00

the term.

24:01

Was that before or after he won the Turing

24:03

award?

24:04

uh, around that time, I believe I

24:07

saw him actually give, he gave a talk at, uh, at

24:09

Michigan somewhere around then.

24:10

cool.

24:11

and that was when I first heard the term zero billion

24:13

dollar market. And I'm like, I'm gonna store

24:15

that one for the future. So

24:18

you were gallivanting around Europe,

24:20

doing cryptography in the past two months

24:22

is my understanding,

24:23

I

24:23

um, living the dream and

24:26

twice, um, twice,

24:28

and no COVID either time.

24:29

twice, and no COVID, uh,

24:32

you know, crossing myself, um,

24:34

was there for Real World Crypto in Amsterdam.

24:37

Super awesome. We did our episode live from

24:40

Amsterdam. Um, you got

24:42

the gist from, from that. And

24:44

then I went back to Europe a

24:46

couple weeks ago to go to Eurocrypt

24:48

in Norway, which was very much

24:50

like I finally got to go to like a, you

24:53

know, IACR flagship

24:55

academic crypto conference, cuz Real World

24:57

Crypto is like a symposium. They do a lot of talks. There's

25:00

a lot of industry folks, Eurocrypt is

25:02

like proper. Like, do you

25:04

want three hours on universal

25:07

composability of security

25:09

proof simulation? Here's

25:12

the conference for you. Um,

25:14

do you care about the Fiat-Shamir

25:16

transform? I'll talk to you for, you

25:19

know, three talks in a row about it today,

25:21

tomorrow and the day after. Uh, no, it was good.

25:23

Um, we were, it was in Trondheim.

25:25

Norway, if you have a,

25:27

a good excuse to go to Norway

25:29

in the spring or the summertime highly

25:31

recommend it's lovely, uh, beautiful weather.

25:34

The sun rises at 2:30 and sets at

25:36

10:30 at night. It's weird. Um,

25:38

but it was beautiful. Um,

25:41

yeah. Lots of universal

25:44

composability, lots of

25:46

multiparty protocols, which

25:48

is, uh, fun for someone who's doing

25:50

a, a very small scale multiparty

25:53

protocol in, in terms of threshold signatures,

25:55

which is FROST; shout out to Chelsea

25:58

Komlo, who came up with FROST. Um,

26:00

I, I implement everything that she tells

26:03

me is, is secure, based on all these

26:05

security proofs that she comes up with,

26:07

um, and yeah,

26:10

Fiat-Shamir stuff, uh, zero

26:13

knowledge proof stuff. Um,

26:15

A little bit of isogeny stuff.

26:18

There was at least, I think I, I missed

26:20

the last day, but there was a nice

26:22

talk about like, okay, how

26:24

do all of these supersingular isogeny

26:27

assumptions relate to each other, stuff

26:29

about endomorphism rings, and

26:32

stuff about computing, um,

26:34

you know, ideals in quaternion algebras

26:37

and crap like that. But I only barely

26:39

understand, from Benjamin

26:42

Wesolowski I think, um,

26:45

cool stuff like that. yes,

26:49

this is, uh, one of those conferences where

26:51

the submissions are all peer reviewed papers,

26:54

right. With the program committee and then,

26:56

you know, academic papers and so on. Whereas

26:58

that is, you know, not gonna be the case at, uh,

27:01

at, uh, RWC,

27:04

which is like people submitting talks, or,

27:06

or, or a Black Hat, of

27:08

which you and I have both reviewed for,

27:10

which does, you know, these still have review boards

27:13

that pick what's coming in, but it's not the same

27:15

type of peer review that you

27:17

might see at an academic conference.

27:19

Yes. Uh, Eurocrypt and other, uh,

27:21

academic cryptography conferences are very much,

27:24

you submit a paper, the program

27:27

committee figures out multiple people

27:29

to blind review your paper.

27:32

Um, and then they will either,

27:34

uh, tell you just flat

27:36

out no, or we

27:38

want you to make some changes, uh,

27:41

or just straight up. Yes, we

27:43

accept. Um, and usually you

27:45

will go back and if they, if

27:47

you make changes, uh, you, you

27:49

get, like, a camera-ready. Um,

27:52

and then you make the very fancy paper show up, also

27:55

on a website. It used to be published

27:57

in a, in a volume called the proceedings.

27:59

And you can still get, you know, pay $20

28:02

to get the proceedings of a conference.

28:04

But it's mostly to, just to like, get the final version

28:07

of the thing. And it gets posted on the conference website and

28:09

that is the one that got published at the

28:11

conference. And then they pick a best paper

28:13

and they pick, you know, you know, test of

28:15

time awards. There were some cool test of time awards

28:18

at, uh, Eurocrypt this year. I think

28:20

one of them was literally the

28:23

Curve25519

28:25

paper from, uh

28:28

DJB. Uh, and I don't remember

28:30

if it was co-authored, but with Tanja Lange, um,

28:33

it, I'm pretty sure it was that paper. It was

28:35

basically how to do fast elliptic

28:37

curve math, um, from

28:39

DJB and it kind of like kickstarted the

28:41

like, not just, uh,

28:43

incomplete short Weierstrass, uh, prime

28:46

order curve rage

28:48

that we've been on for at least, uh,

28:50

12 or 15 years now.

28:52

just an absolute bender

28:54

Just an absolute bender of cofactor

28:56

curves that are fast as hell, but they just

28:58

bite you in the ass. Yes

29:02

um, yeah, DJB uh,

29:04

got, got one of those test of time awards,

29:06

which I, I think, you know, uh, that,

29:09

that definitely has stood the test of time. And I don't

29:11

remember what the other two were. There were two other test time

29:13

awards. Um,

29:15

I would say the big difference between like those

29:17

academic conferences and other conferences is like

29:19

the point of like point number one

29:21

is to like publish the papers. Like, and then the

29:23

conference is like a side effect of

29:25

Yes. Um, the conference

29:27

is very much an opportunity

29:29

for the grad student

29:32

who helped work on the paper, but

29:34

who's not, like, the big

29:36

name, the advisor, the professor, the first

29:38

author, whatever, to go to the

29:40

conference and present their work

29:43

and get their face out there and

29:45

explain it, you know, present it to colleagues

29:48

and people in, in their area, um,

29:50

and get their face out there so that they can get known as

29:53

they come up through, you know, research

29:55

and academia. And then if they want

29:57

to go on and become a professor or go tenure track

30:00

or do research, like that's the start

30:02

of them getting their face out there. Um,

30:04

there's been a lot of debate, especially around COVID,

30:06

of, like... Especially

30:08

for the big one, which is called

30:10

Crypto and has been always held

30:13

in Santa Barbara, California for like 25

30:15

years or something like that. There's Crypto, Eurocrypt,

30:18

Asiacrypt. Those are like the big three. Um,

30:20

and they are held approximately where

30:22

they're named for. Um, you

30:24

are often required to send

30:27

someone to go present

30:29

your paper for a, like a

30:31

20 minute, half hour presentation,

30:34

uh, as like terms

30:36

of acceptance, of your paper

30:38

getting accepted. And this

30:40

has been a point of contention because like, if

30:42

you're from the other side of the world and

30:45

to get your paper accepted at

30:47

the top tier cryptography academic

30:49

journal, uh, conference they're, they're not

30:51

big on journals. They're big on conferences. You

30:54

have to pay for a round-trip

30:56

international ticket. You have to pay

30:58

for them to stay. You have to deal with visas.

31:00

You have to deal with room board. You have to deal with meals.

31:02

You have to deal with all that stuff. That

31:04

can be exclusionary, even to very good

31:06

research and very good students. Um,

31:09

and this is one of the reasons that kind of like

31:12

the remote, uh,

31:14

conferences have been like appealing to

31:16

a lot of people because they can present remotely.

31:18

They can present online, they can be present

31:20

online, uh, in a way that they might have

31:22

been excluded from before. So now there's been a bit

31:25

of debate about like, can we

31:27

change this up a little bit? And like

31:29

there's old guard that don't wanna change things

31:31

they want, you know... like, there's, there's

31:35

been decent reasons for this

31:37

sort of stuff in the past, but also,

31:39

like, yeah. Anyway, um,

31:43

part of this also brought in

31:45

a proposal that was brought up

31:47

at Real World Crypto to introduce a new

31:50

proceedings of crypto papers.

31:52

There's the three major conferences. There's some other

31:54

IACR conferences. Um,

31:56

there's Real World Crypto as a symposium. Um,

31:59

and there's like a journal for sort of like

32:01

best of the best papers. They kind of get

32:03

elevated and elected to like this journal.

32:06

Um, but this other thing is basically

32:08

trying to be something

32:10

to give more access and more of

32:12

a venue than just like

32:16

the big three or some other conferences

32:19

or like nothing, or just a, just a

32:21

PDF on eprint. They're trying to have something

32:23

that's a little bit more accessible and

32:25

that's also been, uh, an area of debate. It

32:28

seems like it's going forward, but, uh,

32:30

we'll see how it goes.

32:32

The rest of like the academic world

32:34

is the journal model instead of the conference model,

32:36

which, um, has

32:39

its pros and cons, but the,

32:41

as the conferences have grown, they're all kind

32:43

of starting to approach journals.

32:45

Like, the idea behind journals is you submit

32:47

over time and slowly, then eventually

32:50

you get in, versus the strict deadlines

32:52

associated with conferences. And now

32:55

more and more conferences are having 2, 3, 4 submission

32:58

periods per year. And it's like, well,

33:01

yeah.

33:01

um, as, as you approach

33:03

the limit there, the, the sums turn

33:06

into the integral, like.

33:09

oh God, uh, algebra jokes,

33:11

um, USENIX Security

33:14

and Oakland and these other things, they're still conferences,

33:16

right?

33:17

Correct. So the, the big four in

33:19

like computer security, like non

33:21

cryptography or, uh, or

33:23

maybe closer to the applied side of cryptography

33:26

are USENIX Security, IEEE

33:29

Security and Privacy, which is also called Oakland,

33:32

um, CCS, uh,

33:34

which is the ACM's conference on

33:37

communications security, I think, I don't know.

33:40

Um, and NDSS, which is,

33:41

dunno. Mm-hmm.

33:42

always in San Diego and run by the Internet

33:45

Society.

33:46

Hmm.

33:46

Um,

33:47

And they're, they're conference conferences,

33:49

yes. Um, but at least,

33:52

uh, USENIX and CCS,

33:55

um, have multiple submission

33:57

deadlines, and Oakland, I think, might as well.

33:59

right. Cool. Yeah.

34:02

so, historically,

34:04

Oakland had been like the one that

34:06

everybody wanted to stop going to, because

34:08

for years IEEE wasn't open access,

34:11

and IEEE is still not open access,

34:14

but S&P is, um,

34:16

and so, uh, people have started

34:19

going to that again. And now, like, I

34:21

think CCS is kind of starting to go out

34:23

of favor just cuz it has gotten very, very

34:25

large and

34:27

uh, for lack of a better term,

34:29

some of the cool kids have been

34:31

like, well, if I can now submit to USENIX when

34:33

previously, at that time of year, I had to submit

34:36

to CCS, I may as well just submit

34:38

to USENIX. Um, depending on

34:40

your thoughts on like how quickly do you want

34:42

the conference to be after the submission?

34:44

Yeah,

34:45

Um,

34:46

for, for people. So the, the

34:48

crypto conferences I was talking about, they are very,

34:51

uh, besides Real World Crypto, which is talks,

34:53

uh, those are very academic,

34:56

the, a lot of theoretical cryptography stuff.

34:58

If you're more interested in applied cryptography,

35:01

computer security, implementation

35:03

stuff, attack stuff, uh, all

35:06

those conferences David mentioned are much more

35:08

likely to publish that sort of stuff. So like

35:10

attacks on the Signal protocol or,

35:13

or things like that. Those things tend to show

35:15

up in conferences like that. So those

35:17

have some cool publications

35:19

as well.

35:21

I've always enjoyed, like, USENIX the,

35:23

the most. Um, but there's

35:25

definitely a level of personal preference. USENIX

35:27

is usually the week, like, after Black

35:29

Hat; it's like the second or third week

35:31

in August. And it moves around where it

35:33

is. It's usually in the US, or at least in North

35:35

America. And...

35:37

no. Um, so

35:39

recently we had Apple's worldwide

35:41

developer thingy. And we

35:43

had, I, I don't remember if it was before

35:46

or after that. I think it was before that, uh, Google's

35:48

Google I/O, whatever their equivalent

35:50

of worldwide developer thingy that apple does.

35:54

And I'm pretty sure both of them announced

35:56

that they are supporting and working

35:59

on the industry interoperable

36:02

collaboration of passwordless

36:05

authentication. And

36:07

for those of us nerds

36:09

who love, uh, unphishable,

36:13

uh, authentication mechanisms,

36:15

all of our ears perked up. And if you,

36:17

if you could see me, I've been, I'm making big ears

36:19

right now. Um, because

36:22

we're all pretty sure that this

36:24

is, uh, Apple and Google,

36:27

the, you know, two biggest

36:30

maintainers and developers of

36:32

browsers, of devices, of

36:35

authentication things, period,

36:37

saying they're going to interoperably

36:40

support, um, a

36:42

new kind of FIDO credential.

36:46

Uh, that's very similar to,

36:48

U2F, the kind of old thing,

36:51

uh, and, and now is

36:53

supported in the web API called WebAuthn,

36:56

that Apple has branded pass-

36:58

keys, uh, but it

37:00

has a much more convoluted, official

37:02

kind of generic non-Apple-branded name,

37:05

but it's basically instead of

37:07

a password, which is just a character

37:09

string that you type into just a random

37:11

fucking field, or, you know, a query

37:14

param in a URL. Um,

37:16

you have a key pair and usually

37:19

what it is is, uh, for WebAuthn or

37:21

for, uh, FIDO2 or whatever,

37:23

uh, or old school U2F, um,

37:26

You have a key pair and it lives on your Yubikey

37:28

or it lives on your authenticator device.

37:30

Like the, you know, the trusted enclave

37:33

in my MacBook or something like that. The

37:35

private key stays there. The

37:37

website issues you a challenge

37:39

when you're trying to log in, uh,

37:41

you do, like, basically

37:43

what amounts to a signature with your

37:46

private key that you previously registered

37:48

on the thing that you're challenging, and you send

37:50

it back and it verifies you. And part

37:52

of the thing that's happening in

37:54

this protocol is you're

37:56

bound to the website origin,

38:00

that's the technical term, the origin, that

38:02

the request is coming from. It has

38:04

to be, uh, you know, you have to verify

38:08

the response, uh, against

38:10

your previously registered credential,

38:12

which means you can't phish it.

38:15

Um, you, and like it comes back

38:17

with all of these things bound together,

38:20

and this was great. Uh,

38:22

for second factor, um,

38:25

so that you cannot just copy paste

38:27

this thing into a, you know,

38:29

a field or a query param

38:32

because it wouldn't work. It has to, if

38:34

it was a phishing website, it wouldn't

38:36

work against google.com.

38:39

If someone is at, you know, like

38:42

g00gl3.com, with zeros for the Os

38:44

and threes for the Es and

38:46

things like that, it just wouldn't work because

38:48

it would not compute correctly. That

38:51

was great when you just had it on your Yubikey

38:54

or your phone or whatever, but what if you need

38:56

to back them up and the answer has

38:58

been for a long time that you

39:01

just register a ton of keys.

39:03

So like I have six Yubikeys

39:05

and I have, you know, multiple devices and

39:07

things like that. And the answer is you

39:09

don't migrate the key pair.

39:12

You just have lots of key pairs and they all work

39:14

and okay, fine, binders

39:17

full of Yubikeys.
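The browser side of the challenge-signature flow described above looks roughly like this. A minimal sketch: the relying party ID and challenge handling are placeholders, since a real deployment gets the challenge from the server and verifies the signed response there.

```ts
// Minimal sketch of requesting a WebAuthn assertion in the browser. The
// rpId/origin binding is what makes the credential unphishable: the browser
// refuses to use it for a lookalike domain.
const assertion = (await navigator.credentials.get({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // placeholder; server-issued in practice
    rpId: "example.com", // hypothetical relying party
    userVerification: "preferred",
  },
})) as PublicKeyCredential;

// Send assertion.response back to the server, which verifies the signature
// against the public key registered earlier.
console.log(assertion.id);
```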

39:19

I, uh, I wanna, I, I wanna

39:21

make one kind of comment about the phishing,

39:23

because I think there's, there's a subtle point

39:25

that people miss that also makes the like

39:28

whole phone thing a lot more complicated.

39:30

So if you think about, uh,

39:33

right, the Yubikey and when you're signing in with WebAuthn,

39:36

right, you're more or less signing over the website

39:38

that you're trying to, um, log into

39:40

and the challenge that they give you on your computer.

39:42

And then it gets sent back up and everything is great, but

39:45

there's like another component

39:47

that happens automatically

39:50

when you're using say the Yubikey or

39:52

using the, uh, touch bar,

39:55

um, on your MacBook, which is that

39:57

like, the thing that you are using to authenticate

39:59

to is physically connected

40:02

to the same computer that is running

40:04

the web browser. And this is like enforced

40:06

by virtue of like the browser only knows

40:08

how to talk to the Yubikey that's currently plugged into

40:10

it or, or the enclave

40:13

or whatever. Um, when you take

40:15

that, and you're like, okay,

40:17

you know, my phone has a secure enclave and

40:19

it can sign things or like I

40:21

have a go program and I implemented the same spec,

40:23

whatever. Right. Um, like it's just a signature.

40:26

It doesn't need to be in hardware. Um,

40:28

as soon as you move that to a different device

40:31

or like the communications channel

40:33

to something that isn't obviously directly

40:35

connected, um, you end up with this problem

40:38

where when you go to like, approve

40:41

on your device in some way to click "yes,

40:43

I wanna log in" on your phone. Like,

40:46

let's imagine a crypto game, you've got two laptops.

40:49

Um, you can't see the screen on either.

40:51

Um, and you win the game and,

40:53

and you have a phone and you get a login

40:55

prompt and you win the game. If you can have

40:57

an over 50% accuracy

41:00

when saying which laptop is logging in,

41:02

right. You don't actually know that the laptop

41:04

or the computer that you're next to is the one that

41:06

you're logging into. So while it prevents

41:08

you from being phished on like, you know,

41:11

g00gle.com

41:13

or whatever. Um, it doesn't prevent

41:16

you from being phished in the case where the

41:18

attacker is trying to log in at the same time

41:20

as you and sends you a prompt,

41:21

Yeah, yeah. Um, and, uh,

41:22

unless you go through the effort

41:25

of doing a preregistration

41:27

step between the browser that you're using.

41:30

Um, and, uh, and

41:33

then like the phone, you need some sort of out of band

41:35

thing to pair them together. And there's a whole

41:37

host of ways to do this. People have proof of concepted

41:40

it with like relay servers, a QR code,

41:42

and a Chrome extension. Um,

41:44

you could do it through, uh, like

41:47

Chrome sync, um, you

41:49

or iCloud sync, which is more or

41:51

less, I think what these proposals are doing.

41:54

Um, you can do stuff over Bluetooth,

41:56

combined, usually with a sync mechanism

41:59

as well to add even more. Um,

42:01

uh, but the,

42:04

it, the world gets a lot gnarlier,

42:06

um, and the user experience gets slightly

42:08

worse when you, um, for

42:11

some definition of worse, when you switch

42:13

from a Yubikey to a phone, the benefit

42:15

is you don't need to, to, to

42:17

like, uh, explain to somebody,

42:19

um, uh, or buy

42:21

them a Yubikey, right? It's feasible for an enterprise

42:23

to buy all their employees Yubikeys. It's

42:26

not feasible for like the IRS to buy

42:28

everybody that needs to file taxes a Yubikey

42:30

and expect them not to lose it.

42:32

And all of this is for, uh,

42:35

using FIDO credentials

42:37

as second factor or

42:39

multifactor, not your quote

42:42

unquote primary factor, which has

42:44

been, and probably will be for a long time,

42:47

passwords. Right.

42:49

Yeah, but I mean, fundamentally like these

42:51

end up being the same thing.

42:53

Right. So the new thing

42:55

is: the passwordless

43:00

authentication stuff is

43:02

taking the same challenge,

43:06

response sort of protocol. And

43:08

using that in addition

43:10

to these, uh, device

43:13

bound credentials, the classic

43:15

FIDO2/U2F, um,

43:18

WebAuthn multifactor,

43:20

uh, thing. And instead

43:22

of a password, you have this

43:25

key pair that can

43:27

be synced because it's not device

43:30

bound such as through iCloud

43:32

Keychain. And that, that's like the

43:35

big proposal, is that Apple

43:38

has kind of like created

43:40

this end-to-end implementation

43:43

of how you do this. And

43:45

everyone else is like, oh yeah, we were

43:47

thinking that. And then like, you've basically

43:50

done the proof of concept. And they're trying

43:52

to, they're, they're trying to get

43:54

everyone on board because if just Safari

43:56

supports this and just iOS supports

43:58

this, it's gonna be... You need

44:01

websites to support this

44:03

API for it to work at all.

44:06

It doesn't matter if you're Apple and you own the

44:08

entire stack top to bottom.

44:10

If you go to google.com

44:13

and it doesn't work, if Apple pass-

44:15

keys don't work... yeah, they're supposed to be magic.

44:17

So there, Google's on

44:19

board, uh, Apple's on board.

44:21

This is a W3C

44:24

WebAuthn update

44:26

that's coming. But, uh,

44:29

Apple's kind of gotten out of the gate and, and

44:31

started with the great branding. Like, there's

44:33

like a four-word name for these

44:35

things, and instead Apple's like, let's just,

44:37

instead of passwords, they're passkeys. It's,

44:39

we're just calling them pass-keys.

44:40

It's like, that's great. Dammit, Apple. Like,

44:43

so.

44:45

and I think in the future, we'll, we'll try and get

44:47

one of the authors on to, like, go

44:49

in more depth on this, but we just wanted to, to

44:52

call out that this was in fact happening.

44:55

Um, the authors know who they are

44:57

and may or may not be surprised by me

44:59

saying this without emailing them first.

45:02

uh, yes. Uh, we want to go

45:04

into a lot of depth about

45:07

this because I have more like very

45:09

specific questions. Like I was looking on the FIDO,

45:11

like website and I was trying to get very specific

45:13

technical details about this. And then people were

45:15

like, ah, they're not there. They're like over in

45:17

this other place. And also these

45:19

people have their names all over it, so we should go

45:21

talk to them. So that's a thing that's happening.

45:23

That's very exciting for security. It's

45:26

we really hope to see this

45:29

getting rolled out well

45:31

and nicely, and in a way that,

45:33

you know, human users can understand

45:35

the benefits to their security. Um,

45:38

so early days, very exciting.

45:41

Uh, God speed.

45:43

Um, and then one last thing we wanted to touch,

45:46

um, or mostly I just wanted to touch.

45:48

And then for Deirdre to talk about was,

45:50

uh, uh, the discourse

45:53

from like, uh, a month ago or so

45:56

when, when,

45:58

um, Elon Musk

46:00

in the process of buying Twitter,

46:03

um, which, we'll just move past that. Um,

46:06

but, uh, was like, Twitter DMs

46:09

should be E2E, end-to-end encrypted.

46:12

Um, and then the end

46:15

to end encryption on the web discourse

46:17

started again, um, of

46:19

like, is this even po... is it even possible

46:22

to have a web app that does end to end encryption

46:24

and has the same like threat model? Um,

46:28

good.

46:29

Uh, and so, uh,

46:32

Deirdre, you've thought about this a lot, I think.

46:34

And so maybe you could explain some of,

46:36

uh, why this is a little harder than it's

46:38

just like, generate a key and, you know, call

46:40

encrypt, and like, Signal

46:42

does this, we know how to do this, we know a lot about, like,

46:45

messaging protocols, just do it in

46:47

a web browser. JavaScript's Turing

46:49

complete.

46:50

oh God. Unfortunately, just

46:52

slapping the Signal protocol on it

46:54

does not necessarily give you the

46:57

same, uh, security guarantees

47:00

in every deployed software

47:03

environment. Um, and

47:05

by every, I mean, there's the

47:07

web and there's everything else or there's

47:09

everything else. And then the, then there's the web,

47:12

um, I'm being pulled into this

47:14

because people are like, yeah, they should

47:16

totally do that. They should end-to-end encrypt

47:18

Twitter DMs. And if Twitter

47:21

was only supported on, um,

47:25

iOS, Android, and,

47:27

you know, a desktop application

47:29

that could be very doable. Unfortunately,

47:32

twitter.com is a web app. That's

47:34

what the .com stands for, right?

47:37

Um, it is primarily,

47:39

yeah, the guy who works on

47:42

the, the web browser tells me that I, I got

47:44

that one right. Awesome. Um, it

47:46

has, it was primarily a web

47:49

service for a very long time. And then

47:51

it got mobile apps and then it got,

47:53

you know, like TweetDeck or, you know, Tweet-

47:55

bot or, you know, whatever the fuck. It had an API,

47:58

and then it shut it down. Um, for

48:00

a long time. It like the reason it used to be

48:02

only 160 characters is because it was like

48:04

you could SMS a,

48:07

a, a code and it would tweet it on the internet.

48:09

So it was like an SMS based plus

48:11

web service for most of its lifetime.

48:14

And then it slapped on a bunch of mobile clients.

48:17

Um, end-to-end encrypting

48:19

anything where you have to support that

48:21

end to end encryption with a web client

48:24

is a fundamentally harder thing to

48:26

do than if you just

48:28

have mobile clients. And the,

48:30

the primary, uh, comparison is

48:32

WhatsApp. Uh, WhatsApp

48:36

started as mobile only and

48:39

eventually added a, uh,

48:42

a mirror of a web

48:45

client to a mobile

48:47

app that it was paired to. And now it has

48:49

a more evolved kind of web

48:51

service to go along with it. But

48:54

WhatsApp's deployment

48:56

for almost all its history was

48:58

iOS and, and Android clients. It

49:01

was able, it was able to slap the

49:03

Signal protocol onto that service

49:05

because all of its clients were mobile

49:07

clients and they were compiled

49:11

apps in very constrained environments

49:13

that didn't auto load, uh,

49:15

content and scripting

49:17

from the internet every time that

49:19

you open the app.

49:21

that, that's the thing that makes it hard, right. Is

49:23

the fact that like you have,

49:25

uh, you have the security boundary

49:28

of an app store,

49:30

which like

49:31

Yeah.

49:31

the security boundary of, or even if you take

49:34

away the app store, you have like, you know, a

49:36

piece of software. And if you make the assumption

49:39

that like, people are able to get the piece of

49:41

software right once, and then it more or less

49:43

stays, stays like that. Um,

49:45

now you're talking about like a malicious update

49:48

through the store, whereas on the web,

49:50

it's like, you know, you can just load

49:52

resources from other sites. That's like

49:54

the whole value proposition of web.

49:57

Um, that's why it's a web and not like

50:00

an app, right. Connected

50:02

like a web

50:04

Like you can pull in images and they get

50:06

rendered and you hope that your browser renderer is,

50:08

uh, you know, decoder is great

50:10

or you just literally pull in a whole nother

50:12

website. Uh, and you can either

50:15

do that in an iframe and you hope that the,

50:17

the isolation is good or you just literally

50:19

do an HTTP GET

50:21

and you get a blob and then you could

50:23

do whatever the fuck you want with it as a blob. Like

50:25

the dynamic nature of

50:28

the web, a web page,

50:30

a web app is like the point.

50:33

And it's just fundamentally different

50:35

in terms of the security application

50:38

platform that you write software for

50:40

than a mobile app than a

50:43

desktop app, even. Um,

50:45

and it, and therefore makes the

50:48

traditional design

50:51

of end-to-end encrypted protocols,

50:53

especially Signal, which relies

50:55

on long-term identity keys being stored

50:57

and secure, and that you, uh,

51:01

tie all of

51:03

your, um, security back

51:05

to in this like, you know, double ratchet

51:08

way. It's, um, fundamentally

51:11

harder and just a different,

51:13

difficult proposition.
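For what it's worth, the closest the web platform gets to the device-bound keys being described here is a non-extractable WebCrypto key, sketched below. But note the catch Deirdre is pointing at: whatever JavaScript the server ships on the next page load can still call sign() with it however it likes.

```ts
// Sketch: a non-extractable WebCrypto signing key, the nearest web analogue
// to a device-bound identity key. extractable=false means page JavaScript
// can't export the private key bytes, but any script the server serves can
// still *use* the key, which is exactly the web-trust problem discussed here.
const keyPair = await crypto.subtle.generateKey(
  { name: "ECDSA", namedCurve: "P-256" },
  false, // not extractable
  ["sign", "verify"],
);

const signature = await crypto.subtle.sign(
  { name: "ECDSA", hash: "SHA-256" },
  keyPair.privateKey,
  new TextEncoder().encode("challenge-from-server"), // placeholder payload
);
```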

51:16

If Twitter only had mobile

51:18

clients and, you know,

51:20

pieces of software that they signed and released

51:23

and only changed when they pushed a new update

51:25

and didn't get reloaded

51:28

from the server every single time that you,

51:30

you loaded the app, like you

51:32

basically do on twitter.com or TweetDeck

51:35

or whatever it

51:37

would be. And, and this is not even touching,

51:39

like, the APIs that you get from

51:42

iOS or the, the, the hardware-backed credentials

51:44

that you get from an Android device or, or any

51:46

of that stuff. It's just the fact that like

51:48

you could just store your long-term

51:50

keys in your app memory with

51:53

the guarantee that like, it's

51:55

just me and my, you

51:57

know, sandboxed, namespaced

52:01

memory space for my, you know,

52:03

end-to-end encrypted twitter.com or, you

52:05

know, Twitter app. It's

52:07

just so fundamentally different for web apps.

52:10

hell is other people's JavaScript.
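
[Editor's note: a hedged sketch of the closest web analogue to "keys only my app can touch": a non-extractable WebCrypto key. Marking the key non-extractable stops script from exporting the raw key bytes, but any JavaScript running in the origin can still use it to sign, which is exactly the "other people's JavaScript" problem. Variable names are illustrative.]

    // Sketch: a long-term identity key that page scripts cannot export.
    const { privateKey } = await crypto.subtle.generateKey(
      { name: "ECDSA", namedCurve: "P-256" },
      false, // extractable: false -- raw private key bytes never leave the browser
      ["sign", "verify"],
    );
    // ...but any script running in this origin can still *use* the key, so a
    // malicious update to the page's JavaScript can sign (impersonate) at will.
    const signature = await crypto.subtle.sign(
      { name: "ECDSA", hash: "SHA-256" },
      privateKey,
      new TextEncoder().encode("hello"),
    );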

52:12

So, you know,

52:14

my basic argument is like, people

52:17

are like, you should end-to-end encrypt Twitter DMs. I'm

52:19

like, okay, you're probably

52:21

going to achieve that for

52:24

all these mobile clients and

52:26

web clients would not get end-to-end encryption.

52:30

And the only secure

52:32

way to make sure of that for everybody

52:34

who can log in

52:37

to the Twitter service via twitter.com

52:40

is that web users don't get DMs

52:42

at all, because otherwise

52:44

you open yourself up to downgrade attacks.

52:46

Yeah. Yes,

52:47

Yeah. Now, if you back up

52:49

the threat model some, so, like, what you're describing

52:52

is like what you would need to be close to signal

52:54

in a world where you really don't want to trust

52:56

the provider at all. If you weaken the

52:58

threat model to something like, I

53:01

don't want the engineers or the ops team,

53:03

or whoever to just be able to

53:05

like, run select star from

53:08

DMs in their SQL database,

53:11

um, you can do pretty well right now. Um,

53:14

but then you run the risk of,

53:16

you know, all the code could just be swapped out or they

53:18

could leak it in other ways. And,

53:21

um, you can have a healthy

53:23

and vigorous debate about how different that

53:25

is from, from native apps that could also just

53:27

post, you know, signal or WhatsApp could just

53:29

post the plain text, um,

53:32

over HTTP to a server. But as

53:34

far as we know, they don't, and we have no reason to

53:36

believe that they do. And presumably

53:38

somebody would notice if that happened.

53:40

Yeah, you can watch, uh, all the connections

53:42

that your app makes, and

53:45

you can measure how

53:47

much data is in there, and the frequency,

53:50

and see if there's anything going on. And, you

53:52

know, you can inspect both

53:54

the decompiled binary

53:56

and the behavior of the compiled

53:58

binary, um, to see

54:01

if it's doing something funny. Um,

54:03

and we have no evidence that any of these apps

54:05

are doing anything funny. Although, you

54:07

know, Apple, on iOS,

54:09

was saying, oh, we're going to, we're gonna run

54:12

these, like, image detection models on

54:14

your phone, and we're gonna snitch on you when

54:16

you upload them to iCloud. Um,

54:18

so that we could say that we deployed end-to-end encryption

54:20

on iCloud, but we're gonna detect

54:22

if you have child porn on your computer.

54:25

Um, but

54:26

for more context, see our episode

54:28

with Matt Green.

54:29

Yes. Uh, but they say they haven't rolled that

54:31

out. And I believe that they haven't rolled that out because everyone

54:34

was very mad at them when they said they were gonna roll that out.

54:36

Um, you can detect

54:38

if these apps are doing

54:41

something that you think they shouldn't be doing. We have

54:43

no evidence that signal, WhatsApp,

54:46

end-to-end encrypted Facebook Messenger, any

54:48

of these other end-to-end encrypted messengers, uh,

54:50

are doing anything like that. Um, they're much

54:52

more likely to literally have a report

54:54

button, like WhatsApp,

54:57

or, uh, you know, using Facebook message

54:59

franking or whatever it is, um,

55:02

to explicitly get information

55:05

willingly, actionably, from

55:07

users, um, which is a whole nother

55:09

discussion about like how

55:11

do you deploy end-to-end encryption on

55:14

systems and still

55:16

run those systems and run those

55:19

communities that are made up of humans

55:21

that may or may not treat each other well or may

55:23

or may not abuse these report

55:25

capabilities. Um, how

55:27

do you do that in an end-to-end encrypted context?

55:29

And like it's not necessarily harder.

55:32

It's just different. And see Riana

55:34

Pfefferkorn's paper about content-

55:36

oblivious moderation tools, uh,

55:38

for more on that. Um,
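
[Editor's note: a simplified sketch of the committing-MAC idea behind message franking, mentioned above. This is a toy illustration, not Facebook's actual scheme, and the function names are made up: the sender commits to the plaintext under a fresh key, both travel inside the end-to-end encrypted message, the server records only the commitment, and on report the recipient reveals the message and key so anyone can check them.]

    import { createHmac, randomBytes, timingSafeEqual } from "node:crypto";

    // Sender side: commit to the plaintext under a fresh, random franking key.
    // Both the key and the message travel inside the E2E ciphertext; only the
    // commitment is visible to (and recorded by) the server.
    function frank(message: Buffer): { frankingKey: Buffer; commitment: Buffer } {
      const frankingKey = randomBytes(32);
      const commitment = createHmac("sha256", frankingKey).update(message).digest();
      return { frankingKey, commitment };
    }

    // Report side: the recipient reveals the message and franking key, and anyone
    // can check them against the commitment the server recorded at delivery time.
    function verifyReport(message: Buffer, frankingKey: Buffer, commitment: Buffer): boolean {
      const recomputed = createHmac("sha256", frankingKey).update(message).digest();
      return recomputed.length === commitment.length && timingSafeEqual(recomputed, commitment);
    }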

55:42

I think that's about all we got for today. Um,

55:47

I think we've got a few plugs. Um,

55:49

the first one is we continue

55:51

to have merch available

55:54

at merch.securitycryptography

55:56

whatever.com.

55:57

Mm-hmm.

55:58

And we now have merch that

56:00

isn't even all black.

56:02

Yes.

56:03

so we think that's great.

56:05

You're welcome. It's got these cute little guys

56:08

that you might have seen on some of our, you know,

56:10

headers and images and stuff. Um,

56:13

if you like things that aren't all black, we have

56:15

them, um, and also mugs

56:17

and stickers with cool stuff. We

56:19

just like, we wanted merch.

56:21

We have merch. Now you can have merch too.

56:23

So that's what that is.

56:26

Um, several or maybe

56:28

even all of us, uh, will be

56:30

around black hat and we are tossing

56:32

around the idea of doing some sort of event.

56:35

So if that sounds appealing, please let

56:37

us know because that will motivate us to actually

56:39

think about it. Um,

56:42

and you can figure out the best channel

56:44

with which to do so.

56:46

I will be in Vegas for Zcon,

56:49

which is a Zcash Foundation event

56:51

about privacy. Uh, so

56:53

I'll be there for that. I probably will

56:55

be around black hat,

56:57

but not at a lot of black hat. And

57:00

then I will be trying to go to DEFCON,

57:02

but I'll be there that whole week of Vegas.

57:05

Let us know if you wanna come. One more thing,

57:09

and now Deirdre, I'm gonna make us a real podcast

57:12

by doing something that, um, all

57:14

podcasts must do eventually, which

57:17

is plug another podcast.

57:19

oh my God.

57:20

uh, on, on, uh,

57:23

behalf of my partner, I have to plug the

57:25

podcast A Star is Trek. Becca

57:28

Lynch, who has never seen any star

57:30

Trek, is taken on a tour through every

57:32

single series with two handpicked

57:34

episodes by her friend, Jess,

57:37

who has seen all of them. And

57:39

they are doing a whirlwind tour of

57:42

all of star Trek. Um, I'll correct

57:44

myself.

57:45

were doing this.

57:46

I'll correct myself slightly. For some reason, she'd

57:48

seen the season one finale

57:50

of Picard. That was the first star Trek

57:53

that she saw.

57:54

Oh no. God help her.

57:55

um, A Star is Trek, available

57:58

wherever you get your podcasts.

58:00

I'm subscribing right now, actually.

58:03

Um, I watched

58:06

a ton of TNG reruns

58:08

back in the day when I would get home from school.

58:10

So I'm looking forward to this

58:14

subscribed. Cool.

58:15

Right. And find us on Twitter at @scwpod.

58:18

And I think

58:20

that's it.
