Hong Kong Deepfake Heist + Three Million Toothbrush Botnet + Hacked Canada

Released Friday, 15th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

I think it's time for a little trivia. At

0:03

me. Was

0:07

there a botnet made up

0:09

of 3 million internet connected toothbrushes

0:11

that were terrorizing the internet when

0:13

they weren't terrorizing plaque? That

0:17

sounds so far fetched that I can't imagine you just

0:19

made it up so I'm gonna go with true. It's

0:24

a double bluff. No,

0:27

but it sounded so far fetched that a lot of

0:29

people thought it was true. Question

0:31

number two. Was there an elaborate deep

0:33

fake theatrical production used to stage a

0:36

massive 200 million Hong Kong dollar corporate

0:38

heist? That sounds like a for sure.

0:41

Yeah, that definitely did happen. Question number

0:43

three. Is Craig Wright Satoshi

0:47

Nakamoto? No, the answer is no,

0:49

but true. That

0:51

is the trivia thing that you asked. I

0:55

have framed this all as trivia. Maybe

0:58

for legal reasons, but if he is, boy is he

1:00

not making a great case and the stakes

1:02

of the case he should be making are

1:04

very high. No,

1:07

yes, maybe we've got a bunch of fascinating stuff

1:09

to talk about, but do you know the thing

1:11

I'm most excited to talk about Scott? I'm

1:14

excited to take all our listeners on a

1:16

tour of our homeland. It's the

1:19

hacked Canada tiny stories about Canada tour. Oh,

1:21

well, there's some big stories about Canada that

1:23

we're working on that we're coming out with

1:25

the future. Yeah. Yeah.

1:29

True North strong and hacked three stories from

1:31

the north and oppressed. Flipper

1:35

Zero ban, Air Canada chatbot

1:38

weirdness, and a strange identification

1:40

system proposed for visiting

1:43

adult sites. Who knows what's even

1:45

going on up here. The fog of war

1:47

is thick, but one thing is for certain. You're

1:49

listening to Hacked. I'm

2:05

working on my broadcast transitions. Did you enjoy that?

2:07

I did. That was

2:09

perfect. You like that? That was

2:11

good. Amazing. But the

2:13

question you kicked it all off with, my friend,

2:16

how you doing? I'm good. I'm good. I

2:18

just got back from a little week of surfing in

2:20

Nicaragua, which is why I was absent last

2:22

episode. Apologies, and I will be absent

2:24

actually, I believe in the next episode because you

2:26

did that interview while I was away, and

2:29

the internet in Nicaragua maybe isn't

2:32

broadcast quality per se. Well,

2:34

we're happy to have you back, man. Yeah, excited for a

2:36

few of the interviews and things we got coming up with

2:38

the show in the next few episodes. Finally

2:41

getting to the Scott's Crypto

2:43

Corner book review. I

2:46

believe we're going to be doing an episode

2:48

talking about Douglas's Going

2:51

Infinite and Zeke Faux's Number

2:53

Go Up. So that should

2:55

be a good one. Zeke is coming on

2:57

the show, which is exciting. We

3:00

haven't done the interview yet, but it's

3:02

coming up. But really, yeah, I

3:05

will keep all my thoughts until

3:07

that episode. No, crypto doesn't.

3:10

I want a mild correction. I

3:12

think the newest name is Scott's

3:15

Crypto Rage Cage was the

3:17

most recent. It's the most recent. Bitcoin

3:21

is up to 73,000 after ... No,

3:23

you have egg on your face. Apparently,

3:28

the world has found new value for

3:31

it and has shot its price up.

3:33

So excited to hear any theories on

3:35

what that value is. Please

3:37

draw me a chart on Twitter

3:40

at Hack Podcast. I'm

3:42

excited to hear from you. Almost as

3:44

excited as we are to introduce some

3:48

of our newest patrons on Patreon. That's right. hackpodcast.com

3:51

redirects to our Patreon. And

3:53

boy, do we appreciate all the support. Absolutely.

3:57

You know who I support? Tell me. I'm

4:00

gonna do this in like four episodes. I'm gonna

4:02

make fucking four episodes. I'm

4:04

gonna do this. Daniel's son, my

4:07

favorite karate kid. Yeah. Daniel's

4:09

son, thank you so much. Smokey Oni, that's a fun one

4:11

to say. I'm glad I got that one. Smokey Oni, thank

4:13

you. And Brad,

4:16

everybody loves a Brad. It's all about

4:18

Brad. It's all about Brad. Noib, thanks

4:20

Noib. Really do appreciate

4:22

it. Andrew Naylor. Andrew Naylor. Nailed

4:25

it. Love it. Waksara. Nailed

4:27

it. Nailed it. Waksara,

4:30

thank you so much for your support. Topher the

4:32

gopher. Also known as just

4:34

Topher. Just Topher. Sorry, I had to do it.

4:36

Just Topher. Again, too loose with

4:38

it. Ruru Day, thank you so much for

4:40

your support. And last but not least, Scott, take

4:43

it across the finish line. Hackle.

4:46

Hackle. Hackle. Thank

4:48

you everybody. It means a

4:50

lot to us. We haven't done a

4:52

Patreon shout out in a little bit, but

4:54

does mean the world to us. If you

4:56

wanna support the show, hackpodcast.com

4:58

redirects to our Patreon and it

5:01

means a lot. Definitely,

5:03

definitely. Merch

5:06

store.hackpodcast.com. That's

5:09

some stuff if you want it. If you don't want it,

5:11

totally understand. I'm not here

5:13

to pressure you. This is not a

5:15

high pressure. Maybe you don't

5:17

need a bucket hat, but you probably

5:19

do. Hey, visor season is coming soon.

5:22

It is March. Visors will

5:25

be needed by like May at the latest.

5:27

Yes, it's grey here now. You

5:30

can find all that stuff if you

5:33

just go to hackpodcast.com. I think hackpodcast.com/store

5:36

is where you can purchase that hat.

5:38

Nope. Mm. Store,

5:42

store.hackpodcast.com. That's

5:44

why I keep you around. hackpodcast.com

5:46

goes to the Patreon. store.hackpodcast.com goes

5:48

to the store. The logic tracks.

5:50

Subdomains, who knew? Who

5:53

knew? Who knew? It's

5:56

been weird up here in Canada, my friend. Spicy

6:00

times. Spicy times. The first

6:03

one I want to talk about, so chatbots.

6:06

Chatbots. So a Canadian guy named

6:09

Jake Moffat successfully sued Air Canada

6:11

after being misled by the airline's

6:14

chatbot policy about their bereavement travel

6:16

terms. So airlines have

6:18

policies to provide discounts for people

6:20

urgently flying because somebody died. These

6:23

are very important policies. After

6:26

his grandmother's death, Moffat books a

6:28

flight from Vancouver to Toronto and

6:31

goes looking for information on the website about

6:33

the bereavement rates where

6:35

he was, you know, purchasing his ticket. Speaks

6:38

with the chatbot on the website to find out

6:40

what the terms are and the chatbot inaccurately instructed

6:43

him to book his flight immediately and request a

6:45

refund within 90 days.

6:47

This is importantly not how Air Canada's

6:50

bereavement policy works. Jake

6:52

files the claim, gets denied, then

6:55

presents a screenshot of the chatbot's advice

6:58

and his refund request is rejected.

7:01

In this rejection, Air Canada argues two major points.

7:05

First is that while the chatbot provided incorrect

7:07

info, it also provided a

7:09

link to another page on their website

7:11

that did contain the

7:13

correct information. So it was like

7:15

a truth and a lie situation. And

7:18

then they made a very weird abstract argument about

7:20

it being this sort of separate entity that was

7:22

not their responsibility. Both stances were

7:24

dismissed by the tribunal and

7:26

Moffat's sort of persistence in this led

7:28

to a ruling in his favor granting

7:30

him this partial refund and

7:32

additional damages. And

7:34

as of the time we're recording, I

7:36

checked this morning, the chatbot is disabled on

7:39

Air Canada's website. This is great. This

7:42

is, if this had come out any

7:45

other way, we'd be in for a world of

7:47

hurt with random AI chatbots

7:49

telling us random things that weren't actually available. It's

7:51

actually bright. So

7:54

I'm so happy that this small

7:56

lesson, I'm so happy that this person took it

7:58

to court. The two

8:00

or four thousand dollars or whatever he was fighting

8:02

for in regards to his ticket refund is probably

8:05

nothing compared to what his legal bill was. So

8:08

so kudos to you, my friend, from the

8:11

world and your favorite, or least favorite, Canadians, for

8:13

setting the precedent that these chatbots can't just

8:15

make stuff up. Yeah, yeah, for a bunch

8:18

of reasons there should be a penalty for

8:20

the race to replace customer service people with

8:23

chatbots that have no internal model of

8:25

the world. Like the idea

8:27

that a representative of the company can just tell

8:29

you incorrect stuff that you can then act on

8:31

that the company is not liable for is like

8:34

that's a. We can all

8:36

immediately see why that's not a

8:38

great idea that's not what this technology

8:40

is for and the fact that it was

8:42

being used on Air Canada's website this quickly

8:45

is pretty shocking to me to be honest.

8:47

What's also shocking that they

8:50

must have trained the chatbot on air

8:52

Canada's policies and procedures and that they

8:54

got it so wrong which is wild

8:56

to me. So I'm

8:58

not sure if that's indicative of just bad

9:00

training or whether it's indicative of them

9:03

not setting the right boundaries for what the chatbot

9:05

was allowed to do, but it's just

9:07

bad stuff like it actually it reminds me of

9:09

the Watsonville Chevrolet and I think we chatted about

9:11

this in a previous episode but like one of

9:14

the first big chatbot headaches.

9:17

Was some Chevy dealer in some place

9:19

called Watsonville which I do not know

9:21

where it is. I'm going to assume

9:23

Kentucky or Wyoming the

9:27

anyway they put a chat GPT chatbot on their site

9:29

and it said powered by chat GPT and all the

9:31

rest of it and people just started training it to

9:33

say yes to everything, and then to parrot

9:35

it back that it was legally binding. So people started

9:38

like buying Chevy Tahoes for a

9:40

dollar and like setting all

9:42

these I'm pretty sure the

9:44

people that were trolling it on the Internet weren't taking

9:46

them to court being like no you owe me

9:48

a Chevy Tahoe but that would be really funny

9:51

if they actually had taken into court. I feel

9:53

like what's happening here is there's some enterprising

9:56

folks out there that

9:58

realized very quickly. Hey, if

10:01

we show up to these companies and

10:03

saying, we figured out how to plug

10:05

the open AI chat

10:07

GPT API into a

10:09

chat bot, you can replace a lot of your

10:11

customer service people with this. It's

10:13

going to save you this much money and look at how good the results are.

10:17

And they've just been on a sales tour for the last

10:19

year and a half. And I'm

10:21

hoping these stories are sort of a

10:23

big megaphone blast into the world. Like

10:26

this is not an

10:28

appropriate application of this technology. That's

10:30

not what this should be for. Because

10:32

people will figure out, you

10:35

can compromise this thing with plain language, which means

10:37

if you just put it on the internet, you're

10:39

going to get a chat bot on your site

10:41

telling people, yeah, this Chevy Tahoe costs a nickel. And

10:44

yeah, you can just request a refund on

10:46

your airplane ticket. It's like, it's not, it's

10:48

not a good idea. Yeah,

10:51

it's, I feel, maybe

10:54

this is my own bias against these AI

10:56

bots, but like, I feel like they become

10:58

really good at conversation. Like, like the, like

11:01

the old Turing test to be like, whether you can

11:03

identify it as... a Turing test? Or what's

11:05

the test called for AI? Yeah, Turing test. The

11:08

Turing test, yeah, of like whether you can identify whether

11:10

it's human or an AI, I

11:13

feel like they're, they're crushing that thing. But

11:16

the part of them then

11:18

being trustworthy and having the right information,

11:21

I feel like they're not crushing as

11:23

much. So I'm

11:25

sure it's only a matter of time,

11:27

but I rarely have discussions with chat

11:30

GPT to get answers for

11:32

questions that I want answers to, right?

11:35

Where the answers are actually the answers.

11:37

I feel like whether or not chat

11:40

GPT can pass a blind

11:43

kind of conversational Turing test with someone is like,

11:45

yeah, probably in a lot of cases it can.

11:48

But the difference is that a company employs a

11:50

human being, they kind of become liable for a

11:53

lot of that human being's actions. And it is

11:55

not established that a company is liable

11:57

for the actions of a chat bot. And

12:00

how you train a chatbot is

12:03

just fundamentally different than

12:05

a human being and also like

12:09

You can fire a human being you can get angry

12:11

at a human being there's penalties and incentives for a

12:13

human being that just don't exist For a chatbot. Yeah,

12:16

it can probably pass that test in a

12:18

lot of situations, but when it fails to

12:20

you got no move whatsoever

12:24

Well the like like customer service agent like

12:26

the word agent is actually like a

12:28

like a powerful term like that to

12:31

like So it's like

12:33

in a legal sense. It's a powerful term

12:35

Yeah, like an agent is essentially the spokesperson

12:37

for a company in that regard and once

12:39

you have a chatbot agent Like

12:42

you need to be held liable for what it says

12:44

if people are using the information it's providing to make

12:46

decisions, then you should be liable for the information it's

12:48

providing. I I

12:50

100% agree. I have like

12:53

clawed back money from large corporations

12:56

I've clawed back money because a person from

12:58

the company on the phone on a recorded

13:01

call told me something, I took action based

13:03

on that and Then

13:05

something about what they told me turned out to

13:07

be wrong and the call was recorded and we

13:09

were able to reconcile and I got The money

13:12

back like that's actually happened to me and it

13:14

concerned air travel, weirdly enough, similar to

13:16

this story. No, it wasn't with Air Canada. But

13:19

anyway, it's like it matters It matters that there

13:21

is an accountable person

13:23

because otherwise it's just this like if air

13:25

Canada had won this It means that companies

13:27

could just shrug off Basically

13:30

everything they tell their customers. Oh,

13:32

that was a chatbot. Sorry separate entity than

13:34

us Yeah, you know, what can you

13:36

do these things suck? Like and

13:39

why do you have I don't know why is it telling

13:41

people to do things? Yeah, totally. I feel like we could

13:43

bang on this drum all day long, but

13:45

I've had the same thing where I've had to go back

13:47

to recorded phone calls to get refunds on things and Yeah,

13:51

there's a reason why they record those calls and it's

13:53

pretty amazing mine was insurance related which was

13:55

even better Weird

13:59

that sounds like fun. Yeah, cool tech, don't use

14:01

it this way. Anyway, actually speaking of

14:03

cool tech, you shouldn't think about a

14:06

certain way. There's another story coming out

14:08

of Canada, and it concerns a device that

14:10

I know holds a special place in

14:12

your heart, Scott, the Flipper Zero. Yeah,

14:15

yeah, I definitely don't own one, seeing as they're about

14:17

to be banned. Yeah, what else do I need to

14:19

say about that? Take that episode down about how you

14:21

bought and love yours. I

14:24

bought mine because I knew that they were probably going to

14:26

be banned at some point, and then now I'm- Literally. I

14:29

definitely don't have it. It's definitely not sitting right beside

14:31

me. No. I'm probably

14:34

not holding it at the present moment. Yeah,

14:37

the Innovation, Science, and Economic Development Canada

14:39

Agency has put forward a proposed ban

14:41

on the importation, sale, and use of,

14:45

amongst other devices, the Flipper

14:47

Zero. So let

14:49

me- I just need to go off a bit on this

14:51

because it's- this

14:53

thing is getting such a bad name for just being

14:55

configurable. You know what I'm saying? You

14:58

can do things on it, like

15:00

run a small Wi-Fi web

15:02

server, and therefore we should ban

15:04

it because that small web server

15:07

can expose a security hole in

15:09

Tesla's key system. It's like, well,

15:11

you know, I could buy a micro PC

15:14

off of AliExpress for like 80 bucks and

15:16

do the exact same thing. So- or a

15:18

Raspberry Pi or any number of

15:20

other things that has the ability to run a Wi-Fi

15:22

server. So why

15:24

is the Flipper Zero getting

15:27

a bad name? Just because it's kind of marketed

15:29

as a tool, and by kind of, I mean

15:31

it's- Exclusively. Exclusively. Yeah.

15:35

I'm not going to sand that edge off. It's pretty exclusively

15:38

marketed that way. That's not a good reason to get rid

15:40

of it. But it's- yeah,

15:42

it's like it's doing its job.

15:44

It's proved that there are security

15:46

vulnerabilities in certain car manufacturers' key

15:48

systems. It's like, great. Like, that's

15:51

good. Fix those problems. Don't

15:54

ban the device. Don't ban a security research

15:56

device if you're worried about the security of

15:58

other devices. It's just extraordinarily backwards.

16:00

For anyone that doesn't know, a flipper

16:03

zero is, it is marketed

16:05

as like kind of a hacker tool.

16:07

But what it really is, is a

16:10

small beginner friendly device that lets you

16:12

interact with wireless signals. RFID, NFC, Wi-Fi,

16:14

as you mentioned, Bluetooth, standard radio. You

16:17

can do all sorts of fun little hackery projects

16:19

with it. You can change TV channels, you can

16:21

clone a hotel keycard, you can

16:23

read a pet's RFID chip. It's a little

16:26

wireless signal receiver. Yeah. Yeah, it's an extensible

16:29

platform that allows you

16:31

to pretty much do anything. There's an entire like circuit

16:33

interconnect on it where you can put in custom boards.

16:35

We did a whole episode about it. If you have

16:38

any interest in it and any interest in buying one

16:40

before they get banned, I recommend you move fastly or

16:43

quickly. We

16:45

did an episode about it. We had a

16:47

great guest, Talking Sasquatch, a big YouTuber, on about it.

16:49

Go back a few months and give it

16:51

a listen. Great episode. But very cool little

16:54

devices. It's like a pre-made

16:57

micro computer to do this stuff. It's

16:59

not like my cell

17:02

phone is running Unix. So it's like I could just

17:04

do it on my cell phone. But

17:06

it's like this is just its own kind

17:08

of little pre-made cutesy toy device for it.

17:10

Yeah, kind of great and people have really

17:13

adopted it, and a community has developed that

17:15

is extending it and I don't

17:17

know. Yeah, it's nice. I think it's

17:19

worth digging into where this is coming from. So

17:21

car thefts are admittedly a pretty

17:24

disproportionate problem in Canada. Just seems

17:26

to be a thing. Statistically disproportionate.

17:30

A lot of complicated reasons why that is. Despite

17:33

all of the versatility that we've discussed,

17:36

the flipper zero does lack a lot of

17:38

the capabilities necessary

17:40

for actually bypassing modern

17:43

car anti-theft protections.

17:45

Yeah. Signal amplification relay devices

17:47

are kind of widely understood. It's like what if

17:50

you're going to buy a thing to steal cars,

17:52

you're probably buying that flipper zero doesn't let you

17:54

do that. I was just gonna say rolling key

17:56

generators and stuff like that like that you can

17:59

buy a specific device. I can go on

18:01

the internet right now and buy a device that

18:03

is meant to hack rolling key like

18:05

automotive keys. I can buy

18:07

that right now and have it shipped to my house.

18:09

That's not banned. No. But

18:12

Flipper Zero's banned because in

18:15

some situations it can be used to run

18:17

a phishing or like a man in the

18:19

middle attack, et cetera, et cetera, and it

18:22

is what it is. In Canada, you

18:25

kind of think of the geography in Canada.

18:27

You drive up, not a lot of buyers

18:30

in Alaska. You can

18:32

drive down, but primarily the buyers for stolen

18:34

cars exported from Canada aren't in the United

18:36

States. You go through the

18:38

land borders, they're extraordinarily well protected. You can

18:41

get across in other parts. It's a massive

18:43

open border, but that's not where the buyers

18:45

for these cars are. The

18:48

buyers for these cars are

18:50

primarily across oceans, let's

18:52

just call it. Is that your political

18:54

wave? Yeah, they're across ocean. They go

18:56

into sea cans and then go on

18:58

shipping freighters that

19:01

then take them across oceans, notably the

19:03

Pacific Ocean. Funny thing

19:05

about sea cans, now that you mention it,

19:07

a lot of those in our port systems.

19:10

Yeah. A ton of those in

19:12

our ports. There's tons

19:14

of things you could do to prevent car theft.

19:16

You could invest more money in security in

19:19

our ports. You could create stricter

19:21

regulations about the anti-theft measures that go into

19:23

these cars that make them prohibitively difficult to

19:25

steal. We talked about that a ton in

19:28

the Kia Boys episode. There's a lot of

19:30

really cool meaningful actions you can take. Banning

19:33

a hacking gizmo is

19:35

just a regrettably performative gesture that,

19:37

if anything, is going to hold

19:39

back meaningful security research in a

19:41

country that is saying it is doing

19:44

this because there is a security

19:46

problem with cars. The only thing I

19:48

can think of, and maybe if

19:50

there's some bureaucrat at the GOA or GOC,

19:53

Government of Canada listening to this, is

19:56

there something that we just don't know

19:58

that's not reported in the news? Maybe

20:00

these things are being... being used to

20:02

steal Honda Civics everywhere, like they're push

20:05

button, script-kiddie, car theft devices. Because,

20:08

yeah, I agree with you, it

20:10

does seem performative if it's just exposing

20:12

security flaws, and especially when

20:14

it comes to Tesla, because one of the things that

20:16

I keep referencing is that you can

20:19

kind of use them to trick people into generating

20:21

a spare key, and then

20:23

making the Flipper Zero essentially a web

20:27

hotspot. Anyway, yeah,

20:29

unless there's something that we just don't know about

20:31

that's not being reported, because they just don't want

20:33

people to know about it and talk about how

20:35

easy it is to just, you know, steal Toyota

20:38

RAV4s or something, then

20:40

yeah, it does seem performative to me for

20:42

sure. Good times,

20:45

good times. Good

20:47

times. So here's one,

20:50

there was one last Canadian story. It

20:53

was kind of a ranty Canadian episode.

20:55

Do we? Let's attack it. There's

20:57

a bill currently in the committee in the House of

20:59

Commons here up in Canada that

21:02

would make it so if you want

21:04

to view adult content, you either have

21:06

to, so how do I get

21:09

into this? That's not the best way to do

21:11

this. Yeah, they've proposed

21:13

a bill trying to mandate age

21:15

verification on explicit websites. The

21:17

argument, I understand the argument, it

21:20

is to protect minors. However, the

21:22

bill importantly doesn't specify a method for

21:24

verifying users' ages. And

21:27

looking at sort of some of the available

21:29

systems in other jurisdictions, the two big things

21:31

that come up would be either a digital

21:33

identification system that you have to, you know,

21:36

plug in to access these sites or

21:38

a facial recognition software, which

21:40

has intuitively raised concerns

21:42

about anonymity and privacy on the

21:44

internet up here in Canada. I

21:49

don't think it, I think this

21:52

seems like people go into the grocery store

21:54

and getting all of the ingredients for a

21:56

ginormous catastrophic data breach and putting them in

21:58

the basket and walking them up to the

22:00

self-service till. This like, what if

22:02

we had a giant database of identities

22:05

of people that visited a porno site?

22:08

Um, seems like the biggest target in the

22:10

world. I can imagine the episode two years

22:12

from now, where we talk about the data

22:14

breach. It just seems like such a bad

22:16

omen. I 100% agree with you. The

22:21

other thing, unless they figured out a way to really

22:23

like, I'm just thinking it through right now, to really

22:25

like multi-tier,

22:28

you know, unconnected key

22:30

systems with, I

22:33

don't know how they do it, but I agree

22:35

it would be ripe, especially if it

22:37

was a government contract and built by government contractors,

22:40

probably be ripe for data breaches. I'm

22:43

sure that they would take their best crack. I

22:45

don't get the sense. I don't

22:47

really have a tinfoil hat about this one.

22:50

I don't think this, this is the first

22:52

step towards creating a digital identification system and

22:54

a social, I'm not, I'm not meaningfully worried

22:56

about this. I think this is starting from

22:58

a good instinct to try and keep minors

23:00

off of adult websites, which is a good

23:03

instinct. Um, but I just

23:05

think that this is a solution where the

23:07

actual solution, which is a technical one, is sort

23:10

of being shrugged off. And I think until you

23:12

can propose that in

23:15

a secure, meaningful way,

23:17

you shouldn't, this bill has to,

23:19

uh, S-210, you shouldn't be bringing this forward.

23:21

I got a, I got a bigger challenge for

23:23

you. And in regards to

23:25

the fact that adult content is just

23:27

everywhere on the internet now. So you

23:30

literally can't just, if

23:32

your concern is minor exposure

23:34

to adult content, then

23:36

you shouldn't just let minors on the internet because

23:38

I don't know when the

23:40

last time you were on a social

23:43

network was Reddit, Twitter

23:45

or X literally

23:47

any trending post on X is

23:50

immediately followed by the top

23:52

reply, which is an OnlyFans

23:54

person promoting their OnlyFans with

23:57

explicit content. It's like,

23:59

it's their marketing scheme. Same thing on

24:01

Reddit. If something's trending, there's

24:03

OnlyFans people marketing themselves in

24:05

the comments. And it's

24:07

just like, there's porn everywhere. I

24:10

don't know. Unless, unless we

24:12

start doing, unless they're marrying

24:14

it to like image identification

24:16

technologies. So like your

24:20

web browser will then filter all that stuff

24:22

out if you haven't verified your ID, like,

24:25

which is probably a likely solution to that.

24:28

I just can't see how

24:31

mandating identity and facial recognition

24:33

for explicit adult content on the internet, or

24:35

tagged adult content on the internet is going to

24:37

help because it's just so much of it at

24:39

this point. Yeah. I

24:42

think you kind of drove past the solution there,

24:44

which is that like, this

24:46

is a hardware level problem.

24:48

Yeah. Software, local. Hardware

24:52

platform combination level. Yeah. The simplest

24:54

version of this is that like

24:57

most kids don't have, most 11 year olds don't have

24:59

a sufficient side hustle to purchase an iPhone. It's probably

25:02

being bought for them by a parent. And

25:05

when the parent gives it to them, they can put controls

25:08

on that device because they're handing

25:10

an internet connected device to a minor. Totally. If

25:13

you don't want that minor to see something,

25:15

that security should largely

25:17

be occurring at a hardware level. I think

25:19

there's tons of things that platforms can do

25:21

to strengthen that. And to keep minors from

25:23

seeing things they shouldn't be seeing and should

25:25

be. That would

25:27

be a great place for a well-intentioned law

25:30

passer to start looking at is, well, what can

25:32

we be asking these platforms to be doing? There's

25:37

some stickiness there, but that, those two

25:39

solutions, large platform and hardware level protection

25:41

seems like a way better approach to

25:43

this. Let me

25:45

turn on your, it

25:48

seems you've gone to an adult website, turn on your webcam.

25:50

It's like, that's a nice, you're going to create a giant

25:52

underground for something that a lot of people access. It's

25:55

not a good idea. Yeah, totally. If your intention

25:57

is to like, you know,

26:00

think the adult porn industry,

26:02

the legitimate porn industry that has

26:04

rules and regulations and, you

26:07

know, brings structure and probably, I

26:10

don't know, I don't know the right words I'm looking

26:12

for here. But you know, better than the underground scene

26:14

in regards to a number of, you

26:16

know, rights and

26:18

non human trafficking things that

26:22

yeah, I think that any kind of system

26:25

like this, I do I do think

26:27

that that might be the solution, like

26:29

a good solid platform like iPhone, Microsoft,

26:31

OSX, you enable child accounts,

26:34

the computer or the browser has an extension

26:36

that auto identifies that all content and immediately

26:39

like removes it from the page. I think

26:42

that's the real solution here is

26:45

allowing parents to put the

26:47

boundaries on what their children are allowed to do on

26:49

the internet. Maybe there's an

26:51

issue there in the sense that maybe there's not so

26:53

many technically savvy parents, but I feel like as

26:56

the millennial generation and below becomes

27:00

the new parents, I feel like that's gonna

27:03

quickly change. I'm not sure how many

27:05

millennials exist, besides my wife, love her to

27:07

death. Yeah, that aren't

27:09

technically savvy. Wasn't

27:15

sure where that was going at the beginning. Yeah,

27:18

yeah, yeah. She's

27:20

a she's a power iPhone user. But the second

27:23

you put a computer in front of her, she's,

27:25

she doesn't love it. Says that

27:27

they're unwieldy. Yeah, put protective

27:30

barriers around the kids not necessarily around

27:32

the content if you don't want to

27:34

drive legitimate sex work underground. It's

27:36

just not Yeah, which is not a

27:38

good place for it to be. Yeah, agreed. Ah,

27:41

well, anyway, we raged against the machine

27:43

on a bunch of Canadian stories.

27:47

Let's kick it over to some of

27:49

our fine sponsors. And then when we come back, we'll talk

27:51

about a pretty wild heist in

27:53

Hong Kong. For

28:28

the very first time, Arctic Wolf,

28:30

the industry leader in managed security

28:33

operations, is offering you access to

28:35

the most forward-thinking ideas from their

28:37

most knowledgeable experts. You

28:40

and I, because I did

28:42

do this, can discover the top 2024 predictions

28:45

developed by Arctic Wolf Labs, their

28:48

teams of elite security researchers, data

28:50

scientists, and security engineers, derived

28:53

from the intelligence and insights gained

28:55

from trillions of weekly observations. That's

28:57

trillions with a T, within

28:59

thousands of unique environments. These

29:02

predictions trace the development of several

29:04

trends based on their earlier, simpler

29:06

iterations and anticipate which ones are

29:08

poised to take significant steps forward

29:11

in the coming months. Trillions

29:13

with a T. Learn what the

29:15

new year holds for ransomware as a service,

29:17

active directory, artificial intelligence, and more when you

29:19

download the 2024 Arctic Wolf Labs Predictions

29:23

Report today at

29:26

arcticwolf.com/hacked. That's arcticwolf.com

29:31

forward slash hacked. This

29:38

episode of hacked is supported by

29:40

Compiler, an original podcast from Red

29:42

Hat discussing tech topics, big,

29:44

small, and strange. Compiler comes

29:46

to you from the makers of Command

29:48

Line Heroes and is hosted by Angela

29:50

Andrews. The show closes the

29:53

gap between those who are new to

29:55

technology and the people behind the inventions

29:57

and services shaping our world. compiler

30:00

at Red Hat. But yeah,

30:03

learn more about compiler at redhat.com/compiler

30:05

podcast. Great episode you should check

30:07

out is the great stack debate

30:09

with episode 25. A

30:12

software stack is a lot like an onion or is

30:14

it a sheet cake or a lasagna or is it

30:16

that metaphor at all? It's often described

30:18

as having layers that sit on top of each

30:20

other and the reality is much more complicated and

30:22

learning about it can help any tech career. You

30:25

had me at a great cake. You had me at a sheet cake.

30:28

It's because we're both hungry because it's the

30:30

morning presently. I haven't eaten. The

30:34

great stack debate is the first episode in the

30:36

compiler series on the software stack. We'll call it

30:38

Stack/Unstuck, where they explore

30:40

each layer of that stack, what it's like to

30:42

work on them and how they come together into

30:44

the whole application. I did

30:46

listen to it. It's very good. You should

30:48

check it out. It's a good time. I will. And

30:51

you can listen to compiler in your favorite podcast

30:53

player. We will also include a link in the

30:55

show notes. My thanks to compiler for their support

30:57

for our show. From

31:00

dozens of spreadsheets to fragmented tools

31:02

and manual security reviews, managing the

31:04

requirements for modern compliance and security

31:07

programs is increasingly challenging.

31:09

It is. And that's

31:11

why Vanta is the leading trust management platform

31:13

that helps you centralize your

31:16

efforts to establish trust and

31:19

enable growth across your organization. Do you know

31:21

how much of your compliance you can automate?

31:23

No. You can automate

31:25

up to 90 percent of your compliance.

31:27

You can strengthen security posture. You can

31:29

streamline security reviews and you can reduce

31:32

third party risks. Whoa. And

31:34

speaking of risk, Vanta is offering

31:36

hacked listeners. That's you. A

31:39

free risk assessment when you go to

31:41

Vanta, V-A-N-T-A

31:44

dot com slash hacked. You

31:46

can generate a gap assessment of your

31:49

security and compliance posture. You can discover

31:51

shadow I-T and understand the key action

31:53

to de-risk your organization. You want all

31:55

that good stuff? You're going to go

31:58

to vanta.com. That's vanta.com/hacked

32:00

for your risk-free

32:02

assessment. No, your

32:04

free risk. Free

32:07

risk assessment. A

32:09

risk-free assessment. For your free risk assessment.

32:15

International news. So,

32:18

there's an interesting one. We don't

32:20

know the name of the company. It has not

32:22

been included in, based on

32:24

my research, a single piece of coverage about

32:27

this story. So, we're just going to call

32:29

it a large multinational company. An

32:32

employee at said large multinational company joins a conference

32:35

call. This was a couple of weeks ago.

32:38

They got on the call and a

32:40

bunch of their coworkers are there. Camera's

32:42

on. And the result of

32:44

that call is the person agreeing to go ahead

32:46

with a transfer of 200 million Hong

32:48

Kong dollars. It

32:51

turns out the entire call was a

32:53

deep fake theatrical production. Person

32:56

got looped into the call through a phishing scheme. Their

32:58

coworkers who were again on camera were

33:01

deep faked based on publicly available video

33:03

and photography. And the entire thing

33:05

was a scam to get them to go ahead and

33:07

transfer this money to the hackers who

33:10

took the money and ran. Case

33:12

is the first of its kind in Hong Kong involving

33:14

deep fake technology. No arrests have been made yet. The

33:16

cops are still looking into it. And

33:20

the story went wide because they were trying

33:22

to get the word out that this technology has reached

33:24

a point where you can

33:26

be looking at a person on a Zoom call and

33:29

this is possible. Yeah. Yep,

33:35

had to come. Had to be, it was coming at some

33:37

point. The thing that surprised me

33:39

most is that it wasn't just one deep

33:41

fake person that they deep faked an entire

33:44

team of people. That to me

33:46

is crazy. Like

33:49

it's very sophisticated. Like I'd say

33:52

that this is, I

33:54

would say that if they're at that point

33:56

where they're like, you know what's gonna make this

33:58

more convincing? If we bring six colleagues

34:00

to the chat too. If

34:02

they're at that level of sophistication, I think

34:05

that we are in trouble and you're going

34:07

to hear more and more and more about

34:09

this. Yeah. There was a reason

34:11

I used a theatrical production because there's

34:13

something different to me about one person

34:15

doing this versus a whole bunch of

34:17

people getting together and casting parts and

34:19

figuring out who's going to say what

34:22

and scripted it all out and then

34:24

putting on their deep fake masks and

34:26

going into it. It's very theater kids

34:28

do cybercrime energy. I'm sure they're not.

34:30

I'm sure they're very dangerous hackers, but

34:33

it is just sort of a different tenor for these

34:36

types of corporate hacks. For context, $200

34:38

million Hong Kong dollars is about $25

34:41

million US dollars. This is a large

34:44

corporate heist. It

34:46

was a phishing scheme and a

34:48

Zoom call. It's crazy. I

34:53

mentioned to you that while we were in Nicaragua,

34:56

my parents-in-law got defrauded.

34:59

Yeah. Somebody

35:02

was pretending to be my wife, same

35:05

name, set up their WhatsApp profile, messaged

35:08

her mother, gave

35:11

her some lie about how her phone had broken,

35:13

her touchscreen wasn't working, but she was somehow still

35:15

able to use WhatsApp. Her sim wasn't ready. She

35:17

couldn't call her, et cetera, et cetera. She

35:20

needed to pay some bills right away and needed her

35:23

to send some money on her behalf and she couldn't

35:25

do it because her phone was busted. Of course, loving

35:27

mother. Yeah, I'll help my daughter out.

35:30

So brutal. Thought she was just being

35:33

independent, didn't want to call me to verify. Next

35:35

thing you know, $4,200 is on its way to Montenegro. Apparently,

35:39

the police tracked it to Montenegro. We're

35:42

talking about a tiny WhatsApp call. As

35:48

far as checks and balances go, it would

35:50

have been pretty easy to see through it.

35:53

If she'd looked at the contacts phone number,

35:55

she would have noticed that the area code

35:57

was definitely not something that Michaela would have

35:59

or my wife would have. Yeah, anyway,

36:03

so you think about

36:05

that level of sophistication probably

36:08

being more successful than you would imagine.

36:11

Like it might seem like something that you would immediately

36:13

identify as fraud in the scam. Imagine

36:16

if you were looking at

36:19

your son on Zoom who

36:21

was saying, hey, mom, like I need

36:26

you to wire $6,000 to pay my rent to this

36:28

woman because my bank account's been hacked and I can't

36:30

have access to my money and I'll pay you back

36:32

in 12 days, etc. Imagine

36:37

what's about to start happening. Totally. Like

36:40

on a recreational level. Like the corporate

36:42

sophistication side will kick in and there'll

36:45

become tons of policies and checks and

36:47

balances. But if you start thinking about

36:49

applying this technology to everyday

36:51

people and boomers who love

36:53

their kids, that's

36:56

a billion dollar industry right there.

36:58

Yeah. I remember about

37:00

a year ago we did an episode on pig butchering

37:02

scams, which are basically what happened

37:04

to your in-laws and that sucks and I hope

37:06

they're kind of okay. Yep. And

37:09

so much of that is about exploiting

37:13

the emotional vulnerability that emerges when

37:15

a person is concerned about a

37:17

loved one. Totally. You

37:20

need that emotional hook. And it

37:22

is bizarre to say, but there

37:25

are emotional vulnerabilities in a corporate

37:27

context. The desire

37:29

of a person to not

37:31

mess up in front of their peers,

37:33

to not suffer embarrassment in a potentially

37:35

ruthless corporate culture. It's

37:38

not the same as concern about a loved

37:40

one, but it is the same kind of

37:42

identification of an emotional vulnerability and

37:45

setting up a lot of work

37:47

to exploit that emotional vulnerability to

37:49

catastrophic ends. It's the

37:51

same basic kind of like the

37:53

social engineering is conspicuously similar. I'm

37:56

just keep running through this in my head. Like

38:00

if you got a FaceTime call from your mother. Totally.

38:05

And she's like, your father's in the hospital and

38:07

blah, blah, blah, I need you to do this,

38:09

blah, blah, but like can you, like, eh, eh.

38:11

Yeah. It's just, it's

38:13

gonna be insane unless

38:16

they can figure out how to stop that stuff. Because

38:19

like, we were talking with

38:22

our in-laws about how like there needs to be

38:24

a, like if anybody's asking for money, if you're

38:26

about to send money, you have to at least

38:28

speak to somebody on the phone. Which is still

38:30

very fakeable. But

38:33

imagine if you had a FaceTime call and you could

38:35

see your daughter and she was like, or your mother

38:37

or your son or whatever, somebody in your family and

38:39

they were just like, yeah, I need this thing, blah,

38:41

blah, can you help me? Of

38:44

course. It's gonna be,

38:46

yeah, I don't know. I'm

38:48

hoping this is another thing that's gonna

38:50

need a technological solution. Like WhatsApp phone

38:52

calls. Like it seems to me like

38:56

every messaging service that I have

38:58

an account on, I get flooded

39:00

with garbage, including

39:03

like PlayStation network. Like

39:05

I'm constantly getting scams from everywhere

39:08

and just deleting and banning and

39:10

blocking and reporting. Pretty

39:12

much every time I log into a messaging service, I have to

39:14

report and block at least one account. So

39:18

they're gonna need to get better at identifying that

39:20

stuff. And that's probably gonna be an AI solution,

39:23

I would assume that they're gonna need,

39:25

just like we dealt with email spam, we're gonna

39:27

have to start dealing with messenger spam. It's

39:30

tough because like so much, so many of

39:32

the genuinely good

39:35

solutions that center around, okay,

39:38

if someone calls you with an urgent reason that you

39:40

need to send money, hang up

39:42

and call the person back. Yeah, totally.

39:44

Don't hit reply but call the phone number.

39:46

Like these really basic things, but those

39:48

aren't, that's

39:51

not really how we interact. Your

39:54

coworker calls you up on Zoom, your family member

39:56

calls you up. Sorry, just one second, let me

39:58

hang up on you and call you back. It's

40:00

a really unintuitive thing to do.

40:03

It's smart, it's good personal security,

40:05

but it is not intuitive

40:08

to how we communicate with the people that we

40:10

know in our lives. So if a person can

40:12

get past that filter where you

40:15

think you're talking to the person you think

40:17

you're talking about, those kinds

40:19

of personal security policies, call

40:21

them, are really, really hard

40:24

to lean on. And

40:26

I think that software level stuff to

40:28

sort of back you up a little

40:31

bit. You're like, hey, this person's

40:35

video looks a little weird. Hey, we've

40:37

done a little bit of work to figure out that

40:39

we think this phone number is being spoofed. I

40:42

don't know what those technical solutions are, but we

40:44

got to give people a little bit of backup

40:46

in these situations because the

40:49

thing that we ask them to do is really

40:51

unintuitive socially, I guess you

40:53

could say. The other issue is, and we talked about

40:55

this in the pig butchering one, the upside here is, these

40:57

people got away with $25 million in one hack. And

41:01

it's like, can you imagine what the

41:03

global market value for scamming is? Just

41:05

given how many people are employed and

41:07

are human trafficked and are, et cetera,

41:09

et cetera, across the globe to

41:12

become scammers and to execute scams?

41:15

It's got to be billions of dollars.

41:18

Organized crime. Yeah,

41:21

I don't know. But humans,

41:24

expecting humans to be smart enough to identify it, I

41:26

don't think is going to be the answer here. We've

41:28

had that problem with passwords since

41:32

passwords existed. Go

41:34

listen to a problem with passwords. I think it was like episode

41:36

three. Yeah, I really wanted to. But

41:40

yeah, so I think that the technical platforms and

41:42

the solutions, they're going to need to do something.

41:45

Actually, I think we have that contact

41:47

at Interpol. If we're going to have a conversation with

41:49

them about something, I think scamming would be an amazing

41:51

episode. Yeah. Talk about

41:53

the global size and scale of scamming. For sure.

41:56

There's a story that we're going to be looking into relatively soon

41:58

and concerns. For lack

42:00

of a better term, the Chinese mob and a

42:02

200,000 person scam factory operation that

42:06

has been likened by experts to modern day

42:08

slavery. It gives you a pretty

42:12

gnarly sense of the scale

42:14

of what is behind a lot of these

42:16

things. We really don't know who's making these

42:18

calls. In a lot of cases, it doesn't

42:20

look like what you think it looks like.

42:23

Yeah. Speaking of weird pop culture scammy

42:25

references, they're making their way into Hollywood

42:28

cinema now. In a recent

42:30

episode of True Detective, one of

42:32

the police officers, I don't know if you saw the new

42:34

episode. You did watch the new season, didn't you?

42:38

I did, yeah. Yes. I

42:41

don't remember what you're talking about. One

42:43

of the police officers in Alaska, it was

42:46

Alaska, right? Yeah, Alaska. He was

42:48

sending money and stuff and paid for a

42:50

plane ticket and all those things for some

42:52

woman that he was WhatsApp-ing with. She

42:55

never showed up. He'd send her

42:57

money. I was like, oh. Someone's

42:59

catching up on this trend. They're

43:01

into the love scams and things. It

43:04

was just good to see in pop culture and make it

43:06

a little bit more known to people that

43:09

these are going on. It's

43:11

so common that if you want to make a

43:13

character seem relatable, you have them fall for a

43:15

giant internet grift. Exactly. With

43:17

a five-figure penalty. Yeah.

43:22

But you know what's not causing trouble,

43:24

Scott? Toothbrushes? Three

43:27

million of them. We'll

43:29

keep this one real quick because it's

43:32

more just a bizarre one. It's

43:35

sort of a one-said-the-other-said

43:37

thing between a Swiss newspaper and

43:39

a security firm. Argao

43:42

or Zetung, a Swiss newspaper,

43:44

publishes this very sensational

43:47

story about three million internet-connected

43:50

toothbrushes being hacked and used to

43:52

do cyber attacks, kind of a

43:54

DDoS story. And the

43:57

report claims that the attack caused a website to go

43:59

down for four hours, resulting in millions of dollars

44:01

in damages. And the story

44:03

was sourced from cybersecurity firm Fortinet

44:05

and was widely circulated and republished

44:07

by global news outlets. The story

44:09

goes very, very viral. It's

44:12

remarkable. Three million internet connected

44:15

toothbrushes. It paints this picture

44:17

of a very mundane technology being used

44:19

for very malicious purposes. Great

44:21

story. Unfortunately,

44:24

cybersecurity experts quickly challenged the

44:27

report foundationally

44:30

on a lack of evidence, but really just sort of

44:32

the implausibility of the whole thing. The Mirai

44:35

botnet, which was one of the

44:37

largest botnets ever at its peak infected 650,000 devices,

44:42

far fewer than the 3 million

44:44

toothbrushes claim. So

44:48

what happened here? A

44:50

lie went viral. That's

44:52

the story. Or falsehood, rather. And

44:56

at this stage, it's kind of come down to

44:58

a disagreement between Fortinet and Aargauer Zeitung. Fortinet

45:02

issues a clarification stating that the

45:04

story was a result of a

45:06

misinterpretation and translation issues leading to

45:08

a mix up of a hypothetical

45:10

situation and an actual

45:13

situation. Fortinet

45:15

says we put out this hypothetical, this

45:17

Swiss newspaper, mistranslated it

45:20

and then published that to the world. Aargauer

45:23

Zeitung, the Swiss newspaper,

45:25

responds maintaining that Fortinet provided

45:27

detailed information about the attack

45:29

and had reviewed the article

45:31

before publication. And

45:34

at this point, a lot of

45:37

people who read this story about a 3

45:39

million toothbrush botnet did not read the

45:41

corrections and the responsibility for this giant

45:43

misinformation explosion is still

45:45

contested between the newspaper and Fortinet. But

45:49

what we did get out of it is a

45:51

whole bunch of memes, a whole bunch of fun

45:53

chatter about misinformation in the

45:56

cybersecurity space. So that was

45:58

fascinating to read just how quickly. a

46:00

mistranslation or a misrepresentation turned into

46:03

a viral story that burst out

46:05

into the world. The

46:07

thing that I'm Googling in

46:09

the background here is does somebody make an

46:12

internet connected toothbrush? Right? Like,

46:17

is that a real thing? Does it exist and

46:19

why? What's that? What

46:22

is it for? Why do you

46:24

need the... I love internet connected stuff. I

46:26

spend so much of my time on the

46:28

internet. Why do you need a toothbrush to

46:30

be internet connected? Yeah, apparently there is one.

46:34

Or multiple. So I am... Oh

46:36

yeah, sorry. Answer to that. Yes,

46:39

there are. There are internet connected toothbrushes. Crazy.

46:42

Bizarre. Crazy. I kind of

46:44

want to buy one. I'm reading about a toothbrush right now that

46:46

3D maps your teeth and tells you when you've

46:49

missed places. Like that sounds

46:51

amazing. Maybe that's what

46:53

I'm missing in my life. Oh no.

46:56

Is it internet connected toothbrush? Maybe that's

46:58

why I'll bring it all together. What

47:00

if this was... Oh dang. What if

47:02

this was a viral ad for internet connected toothbrushes? Oh

47:04

shit. Dude. Might

47:07

be onto something. I wouldn't... I

47:10

would be in a sense furious and in

47:12

another sense deeply impressed. Yeah, I'd

47:15

be mostly impressed. I

47:17

think I'd be mostly impressed. I think

47:19

I might buy that toothbrush. If they're

47:21

half as good at making toothbrushes as they

47:23

are at promoting them, sold. We just

47:25

unwittingly promoted a toothbrush that does 3D mapping

47:27

of your teeth and tells you when you

47:29

miss spots to like 100,000 people. Oh

47:32

no. So

47:36

if this was a marketing campaign, add

47:39

that to your KPI. Congratulations. You did

47:41

it. In

47:44

an attempt at covering a story about

47:46

misinformation in the tech and security space

47:48

we have inadvertently participated in. Oh man.

47:54

Last thing I want to talk with you about because this

47:56

goes back to an idea that we've wanted to make

47:58

something up for a long time. concerns the identity

48:01

of one Satoshi Nakamoto. This

48:04

past week, the Crypto Open Patent

48:07

Alliance and self-claimed

48:09

Satoshi Nakamoto, a man named Craig

48:12

Wright, will be presenting their closing

48:14

statements in a trial, in a

48:16

sense determining if Wright

48:19

is Satoshi. There's

48:22

been a really fascinating court

48:24

case to be following. The

48:27

justice in the trial, a guy named James Mellor, has

48:29

not said whether or not a decision is going to

48:31

be coming out at the end of this. But

48:34

the outcome of this case that

48:36

COPA, this patent alliance, is bringing against

48:39

Wright, could have huge implications on a

48:41

bunch of other ongoing cases that center

48:43

around Wright's claim that he is the

48:45

creator of Bitcoin.

48:48

I'm not sure that anyone

48:51

listening to this requires a rundown of

48:53

who Satoshi Nakamoto is. What

48:55

do you think about that? I don't think so. I

48:57

think we could summarize it by saying Satoshi

48:59

Nakamoto was on internet forums and

49:02

is believed to be the creator

49:04

of blockchain and the

49:06

Bitcoin. The Bitcoin. The

49:09

interesting, to

49:13

me it seems, my opinion is, I'm trying to

49:15

think of very good ways to present this. This

49:18

is being done for clout. I

49:22

don't know how much of

49:24

the database rights, trademark stuff,

49:27

patent issues there's going

49:29

to be. I'm not

49:32

sure what value he's

49:34

going to get out of it if he wins. Because

49:39

a lot of this stuff was

49:41

shared publicly. It's open source technology,

49:43

et cetera, et cetera. It's mostly

49:45

just being done for, hey, I'm

49:47

the guy. Seems like

49:49

to me. The other thing I will

49:52

say is that if

49:54

you are Satoshi Nakamoto

49:56

accessing the initial

49:58

wallets and blockchain pieces that

50:00

you use to create the

50:02

coin and accessing all of

50:04

the money, i.e. Bitcoin

50:08

that are sitting in Satoshi Nakamoto's accounts

50:10

should probably be the number one piece

50:12

of proof to prove that you're them.

50:14

Just saying. Like

50:18

if you can sit down in court

50:21

and log into the origin wallet and

50:23

move some Bitcoin around I'm sure people

50:25

will then believe you. Yes,

50:28

it does seem like there would be a pretty

50:31

easy set of ways to prove that you're Satoshi

50:33

Nakamoto. So

50:36

to add a little bit to that, Craig

50:39

Wright, Australian computer scientist, has

50:41

claimed since 2016 that he

50:43

is Satoshi Nakamoto beyond

50:45

clout, which is certainly,

50:48

that's a reasonable supposition.

50:53

Wright is engaged in a series

50:55

of copyright events that

50:58

he has embarked upon with the presumption

51:02

that he is Nakamoto. He is

51:04

suing people as Nakamoto, the pseudonymous

51:06

creator of Bitcoin. This

51:10

lawsuit from COPA against him is essentially

51:13

attempting to set a precedent that he

51:15

is not. They're

51:18

arguing this case,

51:21

they're trying to make the argument that he is not Satoshi

51:23

Nakamoto so that in future cases in the UK, high

51:26

court, he can't start from

51:28

that presumption. That's

51:30

really what this is interrogating. For

51:34

Wright, those

51:37

other legal exchanges

51:39

he's in the middle of against Coinbase,

51:42

Kraken, a bunch of other cryptocurrency

51:45

platforms, and Blockstream,

51:48

it's a real leg up if

51:50

he wins this one and it's a

51:52

real setback if he doesn't. His

51:56

case, I'm stammering because

51:58

we're talking about an active, relatively unfolding

52:00

court case concerning litigious

52:03

participants. But

52:06

the case he's made so far has been

52:08

interesting to say the least. He

52:10

had his sister on the stand who tells

52:13

a story from, I think he was 18 or 19 and she

52:16

saw him dressed up as a ninja. So

52:19

that when she heard the name Satoshi Nakamoto, she

52:21

put two and two together and thought surely that

52:23

must be my Australian brother, Craig Wright, because she

52:25

saw him in a ninja outfit one time. It

52:28

is a series of strange

52:31

anecdotal defenses

52:35

to this claim that he is Nakamoto that to me

52:37

from everyone around, I'm like, oh man, I feel like

52:39

there's a really short distance to you proving this and

52:41

it's just you crack it open those wallets and move

52:43

it some stuff around. But so

52:46

far that hasn't happened yet. Big, big news for

52:48

you, Jordan. When

52:50

I was a child, my brother and I

52:52

used to dress up as ninjas all the

52:54

time, actually. So new

52:58

announcement. I am

53:00

Satoshi Nakamoto. We're

53:02

on here first. We're on episode 87.

53:05

And my hope is that the way this podcast

53:08

is like, I don't want it to end and I really

53:10

enjoy making it. But my hope

53:12

is the way it ends is on episode 100. You

53:15

prove that after all of the

53:17

crypto shit talking, you were Satoshi

53:19

Nakamoto. That's the last episode we're

53:21

done. That would be the

53:23

perfect way for this to all wrap up. Could

53:27

you imagine? There's some untold backstories

53:30

here. Jordan

53:32

and I in our brief hack

53:34

was becoming a TV show period.

53:39

Pitched an entire idea called solving

53:41

Satoshi. Are seeking Satoshi?

53:43

It's all I want. We were solving

53:45

it. And we were going to make

53:47

an entire docudrama series about looking for

53:49

the real Satoshi Nakamoto. So we would

53:51

have met Craig, right? If we

53:54

had had the opportunity to make that show. And

53:56

since somebody else has made that show. That's

53:58

true. That's true. I would

54:01

honestly like I'm not. I

54:03

would honestly really love to interview Craig

54:06

Wright. I would be fascinated

54:08

to hear the story

54:10

from him because I'm not sitting in a

54:12

courtroom I'm not listening to. I mean, I'm

54:14

not reading transcripts. I'm reading secondary coverage. I

54:17

would love to understand, you

54:19

know, that argument and those claims. But it

54:21

has been a bizarre. I

54:23

think even people firmly in his

54:26

camp would agree this court case has been

54:28

extremely odd and maybe the

54:30

case was not made as well as it could have been. But

54:33

it is a fascinating one. He is

54:36

more than welcome to drop us a note. Love to have

54:38

him on the show. Love to chat about it. Maybe

54:40

we could dress up as ninjas and do a video

54:42

stream. Be great. Yeah,

54:49

I don't, I think it's an interesting.

54:53

The thing for me is that if you're going to build something

54:56

like blockchain and Bitcoin,

54:58

chances are you've

55:01

put an Easter egg in it somewhere like

55:03

no developer is immune from putting an Easter egg

55:05

in things, which we see all the time.

55:07

If you just Google, you know, any piece

55:10

of software and the term Easter egg, you'll

55:12

find Easter eggs laden

55:14

in pieces of software. I

55:17

can't remember which Microsoft product it was, but

55:19

it had a Microsoft Flight Simulator as an

55:21

Easter egg inside of it. So you could

55:23

like go into a special menu, push some

55:26

key command and boom, you were like in

55:28

Microsoft flight simulator, which to me is just

55:30

amazing. That rips. So

55:34

there has to be some fingerprints

55:36

on the software and Easter eggs

55:38

that only the creators would know

55:40

about. Granted, like we're fighting over

55:42

white papers and things like that,

55:44

which is less sophisticated and less

55:46

potential for that. But, but

55:49

yeah, I don't know, even wherever

55:52

the origin code is, having the original pieces

55:54

of code and the original proof of concept

55:56

for it, like that stuff has to exist.

55:58

And if you have that, then that would

56:00

probably strengthen your case too. So

56:04

yeah, it's an interesting claim.

56:06

I don't think that my

56:09

personal opinion is that I don't think that

56:11

we will ever have, even if somebody, even

56:13

if Craig Wright is Satoshi Nakamoto, I

56:16

don't think any court will ever rule that they

56:18

are as it will

56:21

likely be unprovable. So

56:24

unless some definitive evidence shows

56:26

up like you log into the Origin Wallet,

56:29

I don't think, I think

56:33

everybody's SOL. So.

56:37

Yeah. And as much as I

56:39

love crypto, I

56:43

would prefer it be open source. That's why you invented it.

56:45

I invented it. I invented it and gave it to the

56:47

people. You're all welcome. Congratulations

56:50

on your speculative gains. I

56:52

hope you enjoy all your free money that you've generated

56:54

out of nowhere. Well,

56:59

this was fun. We haven't done one of these in a minute. Thanks

57:02

for going on a tour of our

57:04

Canada Cyber Crime and Tech gripes with

57:07

us. That was a lot of fun.

57:10

Hong Kong heists. Thanks again for listening.

57:13

This was a fun one. And

57:16

yeah, we'll catch you in the next one. Take

57:18

care, everybody.
