Episode Transcript
0:01
So in the last episode we talked about my hobby,
0:03
which is mountain climbing. In this episode
0:05
we're gonna talk about your hobby, which is money laundering.
0:07
So how do you wanna get started on this topic?
0:09
<Laugh>, I'd, I'd like to get started by clarifying
0:12
that I'm not a hobbyist money launderer,
0:14
Oh you-, Sorry? Is it like semi-professional, professional?
0:17
Is that what we-, I didn't mean to patronize
0:18
or talk down to you.
0:24
<Sigh>
0:24
As an expert in the field of money laundering,
0:27
how do you get away with it?
0:29
<Laugh> You're
0:31
gonna get me sent to prison. I've never laundered money.
0:34
I just worked in the finance sector for a few
0:36
years.
0:37
That is what somebody who laundered money would
0:39
say though, right?
0:41
I guess it depends who they're talking to.
0:43
Like if you're selling money-laundering-as-a-service, you're
0:46
not gonna go, I've never laundered money.
0:48
You know what a Kafka trap is?
0:51
No.
0:52
A Kafka trap is where you set up
0:54
a situation so that no matter what a person
0:56
says, they sound guilty. So
0:59
you would say something like, Oh, alcoholics always
1:01
deny it and then either
1:03
you admit to being an alcoholic or you deny it, thereby
1:05
admitting.
1:08
Okay. Well...
1:09
It sounds like that when you, when you say, you
1:12
know, "Honestly I've never been into
1:14
money laundering", it's like that's what somebody who's really into
1:16
money laundering would say.
1:17
I think it's interesting. I've
1:19
never done it and I would never have a reason to do it cause I don't
1:22
routinely do crime.
1:23
When I worked, uhhhh, with
1:25
the gaming industry, there was a
1:28
like a lecture, like a guest lecture that
1:30
was put on , uh, about money laundering.
1:32
And this was five or six
1:34
years ago. So it was very new to me. And
1:37
uh , I realized that I think that was the point in
1:39
which I was first put on a list because I
1:42
went to a lecture about money laundering and I was the only person
1:44
who took notes.
1:46
<Laugh> That's
1:49
such a Holly move. I
1:51
quite enjoy that actually.
1:53
If I go to a lecture to learn something, I'm gonna take
1:55
notes. As you well
1:57
know. I have a terrible, terrible
1:59
memory. I have a hilariously bad memory.
2:02
You-, you do have a pretty terrible memory. Okay.
2:06
Um, I suppose we should start by defining
2:09
what money laundering actually is for people who maybe
2:11
aren't familiar with it.
2:13
Yeah.
2:14
Yeah. So money laundering is
2:16
the illegal processing of the proceeds
2:18
of crime to disguise their origin. So you,
2:21
you commit a crime, how do you then
2:23
use the money that you've got from that crime without
2:26
being picked up by the police?
2:27
The reason that this interests me is because
2:29
, uh, so I work in cyber
2:32
security, right? In particular offensive security with penetration
2:34
testing. And very often people
2:36
like to compare the work that we do
2:38
as penetration testers with the actions
2:40
of cyber criminals. And of course that
2:43
is the point at which we actually have
2:45
no experience and it's just not a thing that we work
2:47
on. So I will break into
2:49
computers and then demonstrate those vulnerabilities to organizations.
2:52
Whereas an actual cyber criminal breaking
2:54
into the computer is just the first step, right?
2:56
You've got a whole bunch of activity that has to come
2:58
after that in terms of monetizing
3:01
that attack and then getting
3:03
that money into some form
3:05
that you can actually spend it without being arrested, right?
3:07
So we're gonna focus in this episode on the
3:09
back end side of that, right? It's like, hey, you
3:11
have committed some kind of crime. Maybe
3:14
a cybercrime, or maybe otherwise how
3:16
does getting that money into usable
3:18
form work? And of course importantly
3:21
how come so many criminals are
3:23
really bad at it and keep getting caught?
3:25
And then we're gonna talk a little bit about cryptocurrency,
3:27
which is Holly's favorite thing.
3:29
We'll talk a little bit about crypto. I do think crypto
3:31
is important in these discussions.
3:33
Sorry, sorry, sorry. We're gonna talk about cryptocurrency.
3:36
That's what I said. Crypto.
3:38
No,
3:39
Like Doge, Ethereum...
3:41
No , Holly crypto means
3:43
cryptography.
3:44
Are you team Bitcoin or Bitcoin Cash?
3:47
Neither. <Sigh>
3:48
Are you team proof of stake or proof of work?
3:51
Please. Stop.
3:51
Okay, we'll get to that. We'll get to that. So
3:54
you said we should define money
3:56
laundering and I think everybody kind of like implicitly
3:59
knows what money laundering is, right? It's like that taking
4:02
the money from the point at which you've stolen
4:04
it to making it usable. But are there
4:06
stages that are involved? Are there
4:08
like common steps that an attacker would go
4:10
through for money laundering or is it different
4:13
every time?
4:14
Yeah, so broadly speaking, there
4:16
are three phases to money laundering.
4:19
Um, three steps. So
4:21
the first stage would be , uh, placement,
4:23
which is where you place or deposit the money into
4:26
a financial system. Um, the
4:28
next stage is layering where you create a
4:31
complicated web of transactions or
4:33
a trail of activities so that you can obscure
4:35
the initial source or deposit. Um,
4:38
and then the , the third and final stage
4:40
is integration and that's when the funds are
4:42
integrated into the economy , um,
4:44
returned or accessible to the person who's
4:47
been doing the laundering.
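The three stages just described can be restated as a tiny pipeline sketch. This is purely a summary of the conversation in code; the stage names are standard, but the one-line descriptions are my paraphrase:

```python
from enum import Enum

# Purely illustrative: the three phases of money laundering as named stages.
# The descriptions paraphrase the conversation; nothing here is operational.
class Phase(Enum):
    PLACEMENT = "deposit the illicit funds into a financial system"
    LAYERING = "obscure the origin through a complicated web of transactions"
    INTEGRATION = "return the funds to the launderer as apparently clean money"

def describe_pipeline():
    """Return the phases in the order they were described."""
    return [f"{phase.name.title()}: {phase.value}" for phase in Phase]

for line in describe_pipeline():
    print(line)
```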
4:50
Okay , so you have some funds. So for example,
4:52
we are talking about cybercrime
4:54
and we're talking about the , like the theft of cryptocurrency.
4:57
So for example, you've hacked a cryptocurrency
5:00
exchange, something like that. And when
5:02
you first take the funds from
5:04
their original wallet or
5:06
source and you put it into a wallet that you
5:08
control, that's placement, right?
5:11
Yeah.
5:12
Okay. So you then mentioned
5:14
layering and you described that as like a
5:16
mixing activity, right? So it's several actions
5:18
where you're trying to obfuscate
5:20
the source. So you're trying to effectively
5:23
put some logical distance between
5:25
where the money was stolen from and when
5:28
the money enters an account that you can use it from, right?
5:31
Yeah. So you're basically just , um,
5:33
trying to make that as complex as possible. Um
5:35
, and there are a few ways that you can do that, which we'll get onto.
5:38
This is the thing that, that has always confused me
5:40
because very often you hear people misunderstanding
5:43
cryptocurrency, or generally misunderstanding
5:45
the blockchain and those kinds of things. And
5:48
one, one of my great frustrations is technologies
5:50
like the blockchain, the , the feature that
5:52
they bring is they are distributed and
5:55
or I guess, uh, decentralized is a better term in
5:57
this context. Uh, the blockchain is decentralized and
5:59
people hear that and think
6:02
it means "anonymous", right?
6:04
Because the problem that I have in my head when we start our conversation
6:06
about money laundering is if you steal some cryptocurrency
6:09
from an exchange and then you move
6:11
it through some wallets and then you put it into
6:13
a wallet that you control, that you can withdraw into
6:15
cash or some other form of fungible currency
6:17
that you can exchange for goods and services. If
6:20
you wanna know where that money came from, right? You
6:23
just look at the blockchain, right? It's a published source
6:25
of this is all of the transactions that
6:27
have taken place. So just moving it
6:29
between one or two or three wallets that you've
6:32
made doesn't sound like it's really hiding anything.
6:37
I guess having , um, a public ledger
6:39
where all of your transactions are recorded for
6:42
all of time is , is probably not , um, a
6:45
great place to be laundering money, but there
6:47
are online services and bitcoin blenders
6:49
and cryptocurrency tumblers and things like that that people
6:51
use to swap a certain
6:53
amount of cryptocurrency with somebody
6:56
else. Uh, so I guess like the,
7:01
the tokens, what
7:04
am I trying to say right now?
7:05
Tokens. You can, you can say tokens. Yeah.
7:08
So you would be swapping an amount of
7:10
cryptocurrency with somebody else, the
7:12
same volume I suppose, which
7:14
would disguise more easily
7:17
the source of those funds. But it does mean that that
7:19
other person would potentially now have the
7:22
cryptocurrency that you had that came from
7:24
that criminal source initially.
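The point about public ledgers being traceable can be illustrated with a toy transaction graph. Everything here is invented (the addresses, the hops); it's just a sketch of the breadth-first walk an investigator can do over published transactions when funds only move between a few wallets:

```python
# A toy illustration of why moving stolen coins between a few wallets
# doesn't hide anything on a public ledger: every hop is recorded, so an
# investigator can walk the transaction graph from the flagged address.
# All addresses and transactions below are invented for the sketch.
from collections import deque

transactions = [
    ("exchange_hot_wallet", "thief_wallet_1"),
    ("thief_wallet_1", "thief_wallet_2"),
    ("thief_wallet_2", "thief_wallet_3"),
    ("thief_wallet_3", "cashout_wallet"),
    ("unrelated_a", "unrelated_b"),
]

def trace_tainted(source):
    """Breadth-first walk over the public ledger from a known-bad address."""
    tainted, queue = {source}, deque([source])
    while queue:
        addr = queue.popleft()
        for src, dst in transactions:
            if src == addr and dst not in tainted:
                tainted.add(dst)
                queue.append(dst)
    return tainted

print(sorted(trace_tainted("exchange_hot_wallet")))
```

Mixers and tumblers exist precisely to break this kind of simple graph walk, by swapping coins between unrelated parties.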
7:27
So before we dive more into cryptocurrency
7:29
and and how it works, and I've got some notes on a
7:31
, a recent money laundering story which went
7:33
hilariously - should we talk about the basics
7:36
of the foundations of how do we stop
7:38
money laundering? So what is anti-money laundering?
7:42
Um, so anti-money laundering is the name
7:44
given to like collective legislation
7:47
and regulation that requires
7:50
banks and financial institutions to
7:54
uh, have processes in place, uh, for things
7:56
like know your customer processes
7:58
and to report suspicious transactions,
8:01
via suspicious activity reports if
8:03
they suspect that somebody is involved in
8:05
money laundering or in, uh, anything
8:08
surrounding the proceeds of crime or terrorism
8:10
financing and similar. So
8:12
anti-money laundering is both a process
8:14
and also, um, a collective of legislation.
8:18
So you mentioned know your customer. What, what is
8:20
know your customer?
8:21
Know your customer is the requirement
8:24
for banks and financial organizations
8:27
to understand their
8:29
customer. What is usual behavior for that customer?
8:31
What are normal amounts of
8:33
cash or volumes of cash to be
8:35
moving in and out of their accounts? So their regular deposits
8:38
that go into that account? How much do they earn? Where
8:40
do they live? So say if
8:42
somebody's had, you know, quite a, a low
8:45
paid job for their entire life and then suddenly comes into
8:47
a large volume of money, that
8:49
would flag up, a fraud team would detect
8:51
that and, and wonder where the source of
8:53
that original money was. And
8:56
also it requires a vetting
8:58
process. When you open an account with
9:00
a bank or a finance institution,
9:02
they'll have to sort of screen their prospective customers
9:05
against a list of politically exposed
9:07
people and make sure that they're not involved
9:09
in like crime or terrorism financing and
9:11
similar things. It's almost like a, a bit
9:14
of a watch list I suppose.
9:16
The know your customer stuff just sounds like
9:19
the bank being aware of like what is
9:21
normal for your account, right? So if you get paid
9:23
by the same company every month, it's just like,
9:25
oh, we don't need to investigate this transaction
9:27
because it's just a normal activity for
9:29
this user. And then anytime you get abnormal
9:32
activity, so like a large deposit or you
9:34
use the term volume there, I presume you're talking about
9:36
not necessarily a large deposit, but maybe just
9:38
a large number of transactions. It's
9:41
just them being able to pick up on that as potentially
9:43
suspicious, right?
9:44
Yeah. So there's a , a couple of things here. The
9:46
first being if you've had an account with
9:48
a bank for 10 years but you've hardly ever used it and
9:50
suddenly you start using that really heavily, that looks
9:53
suspicious. And secondly,
9:55
know your customer also links to things
9:57
like detecting modern slavery and
9:59
people trafficking. If somebody
10:01
comes into a bank and they're potentially
10:04
being scammed or being taken advantage of
10:06
or you have it with foreign nationals,
10:08
where they'll sometimes go in to open a bank account
10:10
or withdraw some money and someone else is
10:13
with them and has their passport, for example,
10:15
that could be evidence
10:18
or, or a sign that they're being coerced.
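The "usual behaviour" side of know your customer described above can be sketched as a baseline check: flag a deposit that sits far outside a customer's history. This is deliberately naive; all the numbers and the three-standard-deviations threshold are invented, and real fraud systems model far more than deposit size:

```python
import statistics

# A deliberately naive sketch of a KYC baseline check: flag a deposit
# far outside a customer's historical pattern. Threshold and figures
# are invented for illustration; real systems are much richer.
def is_suspicious(history, amount, k=3.0):
    """Flag amounts more than k standard deviations above the historical mean."""
    if len(history) < 2:
        return True  # no baseline yet, so worth a human look
    mean = statistics.mean(history)
    spread = max(statistics.stdev(history), 1.0)  # floor avoids a zero spread
    return amount > mean + k * spread

salary_deposits = [2000, 2050, 1980, 2020, 2000]
print(is_suspicious(salary_deposits, 2050))    # within the normal range
print(is_suspicious(salary_deposits, 250000))  # a sudden large windfall
```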
10:20
Oh, so it's not, it's not only like
10:22
bank account activity, but it's
10:25
like behavior when interacting with the financial
10:27
institution.
10:28
Yeah, absolutely. So , uh,
10:30
an interesting thing is as a,
10:33
a consultant who works in a bank , um, or
10:35
a teller, whatever you call them, wherever
10:37
you are, you are required in
10:39
the UK to report
10:42
any suspicious activity. So even
10:44
if you don't have hard evidence that somebody is laundering
10:46
money or that something criminal is , is occurring,
10:49
you still have to submit a suspicious activity report
10:51
so that that can be investigated by say the police
10:54
, um, or the National Crime Agency
10:56
who sometimes work in, um, conjunction with
10:58
police forces. What that means
11:01
is usually if you submit a suspicious
11:03
activity report, the fraud team at
11:05
a bank will freeze somebody's accounts,
11:07
which means that they're not able to transact, they can't
11:09
access any money that they might have deposited, they
11:12
can't withdraw any money, things like that. But
11:14
you're also not allowed to tell the customer
11:16
why that's happened because that constitutes
11:19
tipping them off, which makes you an accomplice.
11:22
So you can say if somebody goes into a bank, talks
11:24
to a teller and the teller thinks it's suspicious, they could
11:26
say, Oh, I'm sorry we're not able to serve you
11:28
at this time. But they can't say, Oh
11:31
hey dude, you got a fraud warning on your account so
11:33
like, can't give you any money today. You know, you probably
11:35
should, should check into that. Is that what we're talking about?
11:37
<laugh> Yeah, basically...
11:39
It just says call FBI on
11:41
your account notes, <laugh> like ...
11:44
They can, they can say anything really. I suppose
11:46
they could tell them that there's a hold code on the account
11:48
but they couldn't tell them why. But I
11:50
think even if you act in a way
11:53
that gives the customer or the person attempting
11:55
to launder the funds , uh, the impression
11:57
that they're being investigated for fraud , um,
12:00
you can be fined or you can go to prison for that.
12:02
So you can't tip them off. And this is interesting
12:05
cuz we were talking the other day, weren't we when we were doing the show notes
12:07
and things for this episode and I told
12:09
you that, so I do all of my banking
12:11
through a mobile application
12:13
and on the mobile app for my bank, it
12:16
doesn't have any of the details that
12:18
I would expect to be like KYC details,
12:20
right? So like this, this know your customer thing, knowing
12:23
who you're employed by, how long you've
12:26
been employed, what your income is, those kinds of things like
12:28
these details that I would expect my bank to know
12:30
so that it can tell if a transaction is unexpected
12:33
or unusual. On my banking app, there
12:35
are none of those details. There is a space
12:37
where those details are supposed to be, but when
12:39
I log in: occupation, unknown;
12:42
employer, unknown; start date the
12:45
1st of January, 2010. That definitely
12:47
feels like a null value, doesn't it? Just
12:49
like the dropdown starts there or
12:51
something. Income frequency, unknown;
12:54
income, unknown; home address,
12:56
country of residence, unknown;
12:58
country of birth, unknown; town
13:00
or city of birth, unknown. This
13:03
is my actual bank account.
13:05
Maybe you're just really good at opsec.
13:08
So the the really funny thing was a
13:10
little while ago, my , my bank rang me in
13:12
what was at the time somewhat
13:14
of a suspicious phone call when your bank,
13:17
"your bank", like, inverted commas here, was
13:19
like your bank rings you up and says, Hey, there's
13:21
a problem with your bank account. We need some details
13:23
about you, like your name , your address,
13:26
all of these things. Like you immediately think
13:28
Phishing and then you, you do the thing where
13:30
you're like, okay, what, what are the steps for
13:32
for validating this, uh, this here?
13:34
And I think that's one
13:36
of those activities where some people can get caught
13:39
out. There used to be a problem I'm told
13:41
with, with landline phones where a
13:43
scammer would call you up on the landline, they
13:45
would tell you there's some problem with your account, whatever. And say, oh, for
13:48
security reasons, you know, you should hang up now and then you
13:50
should call your bank or you should call whatever
13:52
company says there's a scam alert on
13:54
your account and you know , you thank the person for
13:56
the call, you would hang up, you pick the phone back up, you
13:58
dial the number, you speak to your bank. But what would happen is
14:00
the scammers wouldn't hang up the line
14:02
at their end and on some landline systems
14:05
that wouldn't disconnect the call. So
14:07
when you start pressing buttons in your phone, it's just
14:09
sending the tones but the line hasn't been disconnected.
14:12
So you're still talking to the scammer.
14:14
And I saw this recently, in fact you shared this
14:16
with me on Twitter, didn't you? Somebody who
14:18
got scammed, an NFT scam, where
14:21
their mobile phone rang and
14:23
the caller ID, which by the way, if you didn't know, a caller
14:25
ID can be easily spoofed, the caller ID
14:27
on their mobile phone said it was from Apple
14:30
Inc. So they trusted them.
14:32
I'm missing a couple of steps here, but they trusted them
14:34
because their phone in part said
14:36
it was a legitimate call from Apple. Um,
14:39
so yeah, the scammers
14:41
are everywhere and my bank rings me up and says, hey, we need to
14:44
validate some, some personal details. And it was,
14:46
it was those kinds of details that
14:48
they needed to validate. Um, which makes it
14:50
even weirder that my account still
14:52
does not have those details showing on
14:54
my, on my account.
14:55
Yeah, it's very strange that they've not tried to capture
14:57
those at any point. If you use that regularly
15:00
or if you've spoken to anyone in , in a contact center
15:02
or gone into a branch, they would try and
15:04
collect that data while you're there for that exact
15:06
purpose. But it depends how long you've had the account
15:08
really. Because if you had that account, say
15:10
when you were a child or you had it
15:13
prior to the introduction of some of this
15:15
legislation, then they wouldn't
15:17
have collected that data at the time there
15:19
, there would've been no obligation for them to , to do those
15:21
things.
15:22
I've had, I've had the account for
15:24
something in the region of 19 years.
15:26
It's quite a while.
15:27
Yeah, it's a little, little while. Um, I remember
15:30
I've had, I've had some interesting problems with
15:32
my bank , um, but one of them being I did
15:34
open the account when I was what they call a young person,
15:36
so a child. I don't remember exactly
15:38
how old I was. And when you get
15:40
to 18, obviously the details of
15:42
your account change because you're now legally eligible
15:45
for, for credit. So you can have an overdraft and things
15:47
like that. And on a young person's account you can't have
15:49
any of those things. So it's supposed to be when
15:51
you turn 18 the account just ticks over
15:53
and becomes an adult account or what they , I
15:55
think they just call it a "current account" as opposed
15:58
to a "young person's account". Like none of those things
16:00
worked. I remember back when I turned
16:02
18 it just didn't happen and I had to go into the
16:04
branch and be like, hey, it still
16:06
hasn't done this thing. And so I've had some
16:08
funny issues with my, with my bank, um,
16:11
over the time. But yeah, that one made me laugh where we're
16:13
, we're kind of preparing to have this discussion about,
16:15
about know your customer. And I'm like, oh, I'll
16:17
see what information my bank holds about me. And it's
16:20
like apparently none. Apparently no
16:22
information.
16:23
Yeah, it's, it's really interesting now because
16:25
when you try and open a bank account online or
16:27
anything like that, they'll take you through an electronic ID check.
16:30
So you'll need to provide
16:32
like a scan or something similar of
16:34
um, potentially a utility bill.
16:37
Um, or uh , another piece
16:39
of correspondence from like an official body. Um,
16:42
a copy of at least one piece of
16:44
government issued ID potentially two depending
16:46
on what kind of account it is , um, and who
16:48
you're opening that with. And then they'll
16:51
also check to see if you are on the electoral
16:53
roll for where you live as well. So
16:55
if you are trying to open a bank account in London but
16:57
you actually live in Scotland, that might raise some
17:00
alarm bells and often for say
17:02
, um, an an ISA savings account
17:04
for tax purposes, they'll need things like your national insurance
17:06
number. Um, if you are not registered
17:08
to vote where you say you
17:11
are opening the account or where you live, you
17:13
will fail the electronic ID check for opening
17:15
that account.
17:16
I also had this recently so um,
17:19
so I did open an account with a challenger bank just
17:21
to see how different that is. I
17:23
mean based on the competence of my actual
17:26
bank, it's a really, really low bar. So
17:28
I thought maybe a challenger bank could be worth investigating.
17:30
But also more recently , uh, I opened
17:32
an account on a crypto exchange and
17:35
they have very similar kind of
17:37
KYC processes and with it being
17:40
all mobile app based , you know, you install the app but when you go
17:42
to open your account, it starts: take a photograph
17:44
of your driving license, you
17:46
have to take a selfie holding
17:49
your driving license and things like that. I think
17:51
if I remember correctly, to open
17:53
the bank account I had to take a video but
17:56
to open the crypto account it was just a
17:58
selfie with me holding my ID.
17:59
I've done that before and I always think it's, it's so
18:01
funny because it's like a hostage situation, where
18:03
you, like, you have a hostage holding a newspaper. It's
18:06
like, it feels like that... to open a bank account.
18:08
It's like how much, how much duress
18:10
can you appear to be under looking, looking
18:12
at...I am trying to
18:14
open an account <laugh> I am
18:16
not under duress and see how much suspicion
18:18
you can raise whilst like... Anyway I
18:21
tried to open this , this crypto account and the
18:23
verification failed. Now I mentioned
18:25
this I think in a previous episode I've had this recently
18:28
where my driving license
18:30
is the new style of driving license
18:32
and some of these systems don't recognize it cuz
18:34
it looks slightly different. So it may have just
18:36
been that. I had it with my exams, and
18:39
I opened the crypto account about the same
18:41
time, so, so it could have just been that, but I
18:43
just kept trying and on like the third
18:45
or fourth attempt it just worked.
18:48
So that makes me feel like, you know,
18:50
maybe there is a machine learning artificial
18:52
intelligence system that's verifying these photographs
18:55
or maybe it's just a PRNG
18:58
and it just rolls the dice and if you get a
19:00
six you can open an account and I obviously
19:02
didn't roll a six on the first three attempts
19:04
but I got there in the end.
19:05
<laugh> Statistically you're gonna brute force it
19:08
in the end is what you're saying.
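The "statistically you're gonna brute force it" joke is just the probability of at least one success in n independent attempts. Assuming, purely for the sake of the joke, a one-in-six chance per attempt:

```python
# If ID verification really were a dice roll (it isn't; that's the joke),
# the chance of at least one success in n independent attempts with
# per-attempt success probability p is 1 - (1 - p)**n.
def p_success_within(n, p=1/6):
    return 1 - (1 - p) ** n

for attempts in (1, 4, 10):
    print(attempts, round(p_success_within(attempts), 3))
```

By the fourth attempt you're already past a coin flip, which lines up with the experience described.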
19:09
Well it gave me no other option
19:11
so it just, on the mobile app it just said
19:13
failed and then a couple of minutes later I got an email
19:16
and the email basically just
19:18
said, sorry that we couldn't verify
19:20
you, please try again. There was no
19:22
like contact us here or um,
19:24
make sure that, you know, you
19:26
know, some of the details you might expect, like make sure you're in good lighting,
19:29
make sure that the ID is clearly visible
19:32
or any, any of those kind of details you might expect to
19:34
like assist the customer through. As far as I
19:36
remember it was just like that didn't work, try again. So
19:39
I did just keep trying again
19:41
and then it let me through in the end.
19:44
Amazing. Truly iconic.
19:45
And that's where I put all of my Doge.
19:48
Oh why...are
19:51
you doing what ? Why?
19:54
Why Doge?
19:56
What? No, why you do this to me?
19:57
Because it's going to the moon! Rocket emoji.
19:59
Oh for God's sake .
20:03
Ironic- Ironically I can't log into
20:05
my crypto exchange cuz it says I need to verify
20:07
my ID.
20:08
Again?
20:10
Yeah I've gotta do it every seven days or something. It's
20:13
not, not the full process. Just like I
20:15
can log in with Face ID but then there is
20:17
an absolute timeout, I think it's seven days and
20:19
then Face ID stops working and I've gotta log in with a password.
20:22
That seems excessive.
20:24
So the way that it works, I open the app so obviously
20:26
my phone has to be unlocked, I unlock my phone first, I open
20:28
the app, it uses Face ID to log me into my account, but
20:31
then about every seven days it will log
20:33
me out completely and then I have to
20:35
enter my password and it will text
20:37
me a PIN number. I would
20:39
much prefer it just uses Face ID.
20:43
Actually, now that you mention it, my current
20:45
pension provider where I work at the moment, they
20:48
have an app and it's atrocious. You
20:51
can't just use biometrics whenever you wanna
20:53
log in. It doesn't ask you for permanent
20:56
permission. You have to, it redirects
20:58
you to a browser to log into
21:00
the website and then authenticates the
21:02
app using that. And that
21:05
only lasts for a certain period of time I suppose
21:07
cuz I don't check my pension account with
21:09
any great regularity every couple
21:11
of months or so. So maybe it's just a period of
21:14
time that it expires, but
21:16
I always thought that it was just a terrible app because
21:18
I have to repeatedly log into the browser even
21:20
though it's given permission to use
21:23
Face ID.
21:25
It's annoying isn't it?
21:27
It's very annoying. Bad customer experience.
21:29
Just gonna buy a couple of Doge whilst we're
21:31
here.
21:32
Why?
21:36
Cool. So we've talked about like what KYC
21:39
is supposed to be this know your customer aspect
21:41
of like knowing details
21:43
of who the person is, maybe who
21:45
they're employed by, the sources of
21:48
the funding and then also like something
21:50
around activity. I've also seen
21:52
previously as well artificial intelligence
21:54
and machine learning being used within payment systems
21:57
to just better detect what activity
21:59
is maybe suspicious. Cuz you mentioned
22:01
earlier didn't you like this difference between like
22:04
a large deposit and then a large number of
22:06
deposits either could be suspicious
22:08
depending on context . So I've seen AI
22:11
and ML being used to determine whether activity
22:14
is unusual or not. Um, that seems
22:16
infallible. Machine learning seems a good fit
22:18
to, to that problem. Can you, can you imagine
22:21
any problems with that?
22:22
Absolutely not, no.
22:25
It seems like a , like a perfect fit. I
22:27
think that to be honest, if
22:29
there's one thing that people do as
22:32
rational agents , it is figuring
22:35
out what the rules are and circumventing them when
22:37
it serves them.
22:38
I imagine this is one of the reasons why
22:41
tipping off a
22:43
potential money launderer is bad. Cuz
22:45
not only could they take action to evade
22:47
prosecution or maybe flee, but also
22:49
like if they perform an action
22:51
and then you tell them, hey, this thing that you did
22:54
that was suspicious, it's like they might stop doing
22:56
that in the future and make an investigation more
22:58
, more difficult. Right?
22:59
Yeah.
23:00
I just wanted to mention on the machine
23:03
learning thing though, uh, the reason that I'm, I'm being mean
23:05
about this is because in a prior episode I
23:07
mentioned this story of the Americans
23:09
trying to detect Russian missile silos
23:12
by feeding in, uh, images. And it
23:14
, and the story goes that the dataset that they
23:16
used was bad and the machine learning algorithm
23:18
optimized for a different problem than
23:20
the problem they wanted to solve. And I , I recently
23:22
saw some articles of this being used
23:25
for COVID research and for using
23:27
machine learning to detect based
23:29
on images. So, so scans
23:31
of, uh, patients, chest
23:34
scans of patients, whether that patient was COVID
23:37
positive or not. And again, it was just another really
23:39
good example of machine learning being
23:41
really good for some problems and not so much for
23:44
others. A good example of this being the
23:46
images that were fed into the system: some
23:49
of the patients were sitting up and some
23:51
of them were lying down. The reason for
23:53
this was the patients who were scanned
23:55
lying down were COVID positive
23:57
and were seriously ill. That was why they
23:59
were lying down. And so this particular
24:01
machine learning algorithm learned to
24:03
detect whether the patient was lying down or not, not
24:06
to detect whether they were COVID positive or not.
24:08
Another example being the
24:10
dataset was bad and it contained
24:12
images, chest scans of children
24:15
who did not have COVID. And
24:17
then obviously the test set was just
24:19
a mix and the AI learned
24:21
to identify children, not COVID.
24:24
So yeah, machine learning again being
24:26
absolutely fantastic but not necessarily
24:28
fantastic at the thing you want it to be.
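The posture story can be reproduced with a toy dataset: if "lying down" tracks the label perfectly in the training data, a rule built on posture looks flawless in training and collapses as soon as posture and disease decouple. The records and the one-feature "classifier" here are invented to illustrate the failure mode:

```python
# Toy illustration of the confounding failure described above. Each record
# is (has_pathology, lying_down, label_positive); all data is invented.
train = [
    (True,  True,  True),
    (True,  True,  True),
    (False, False, False),
    (False, False, False),
]
# In the test data, posture no longer tracks the disease.
test = [
    (True,  False, True),
    (False, True,  False),
]

def rule_from(feature_index):
    """A one-feature 'classifier': predict that the label equals the feature."""
    return lambda record: record[feature_index]

def accuracy(predict, data):
    return sum(predict(record) == record[2] for record in data) / len(data)

posture_rule = rule_from(1)    # learned the confound (posture)
pathology_rule = rule_from(0)  # learned the real signal

print("posture rule on train:", accuracy(posture_rule, train))    # 1.0
print("posture rule on test:", accuracy(posture_rule, test))      # 0.0
print("pathology rule on test:", accuracy(pathology_rule, test))  # 1.0
```

Both rules are indistinguishable on the training set, which is exactly why the confound goes unnoticed until deployment.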
24:32
It's like that Batman, isn't it? It's
24:35
what you deserve but not what you need right
24:37
now.
24:38
This is like the Shakespeare thing. I swear.
24:44
What Shakespeare thing?
24:45
I'm over here on a technology podcast saying here
24:47
are some really interesting applications of machine learning
24:49
and then Morgan's over here like, reminds me
24:51
of Batman.
24:53
Batman's really cool. I
24:57
don't know if it's the third one where, um, I
24:59
said Luci-, Lucious Fox. Lucian Fox.
25:02
It's like the, the first three were really popular
25:04
and then like the second three, the
25:07
prequels weren't so good, right? Because they introduced...
25:09
Okay, I'm talking about, I'm talking about
25:12
<laugh> , I'm talking about the Christian Bale , Christopher
25:15
Nolan trilogy. So Dark Knight
25:17
Rises, I think it is. Morgan
25:19
Freeman's character has built
25:22
this system and it's like mass surveillance. It
25:24
uses, um, people's phones to
25:26
track like location , um, and
25:29
like detect crime and where that's occurring
25:31
in this city, like microphones and stuff.
25:33
And then he shuts it all down at the end cuz he thinks that
25:35
nobody should have that much power, which I think is really cute
25:37
and utopian.
25:38
Robin, I am your father.
25:41
<laugh> . Cool.
25:45
Okay, so yeah, there's
25:47
lots of legislation around this. Some of it's pretty cool,
25:50
some of it's not so cool. Um,
25:51
What makes it not so cool? Can you give us
25:53
some?
25:55
Personal bias? I don't really like 500
25:57
page documents that, you
26:00
know, say what you could say in like 30
26:02
pages.
26:03
Oh, it's just a, just a general, some
26:05
regulation is not well written. No, I can, I
26:07
can get down with that. I can,
26:08
But interestingly like it , it features
26:11
in some other areas
26:13
too. So part three of
26:15
the Terrorism Act , um,
26:17
which I think was written around 2000 , um,
26:19
makes it illegal to use, po-, possess
26:22
or raise funds to finance terrorism
26:24
or to form an agreement to do that. So
26:26
all they have to do is find evidence
26:28
that you've agreed to fund a
26:31
terrorist organization and you're in contravention
26:33
of that. You don't even have
26:35
to have actually successfully laundered the money.
26:37
Yeah, this is like the US idea of, of
26:40
conspiracy, right? Where you , you have
26:42
to have planned an activity and then taken a
26:44
step towards that activity. You don't necessarily
26:46
have to have been successful in it. I'm
26:48
not a lawyer, we should probably point this out, but um,
26:50
yeah I remember reading the Terrorism Act, having
26:53
an interesting clause in there where
26:56
distinct from things like the Computer Misuse Act,
26:58
where the Computer Misuse Act is all about, you know, unauthorized
27:00
access to a computer program or data. So
27:03
the implication there is it , it must
27:05
have been, you must have achieved it. Whereas with
27:07
the Terrorism Act it's worded differently and it's a seriously
27:10
planned attack. The attack doesn't
27:12
have to have been successful, it just has to have been
27:14
seriously planned.
27:15
Yeah, I think that's an important provision. Maybe
27:17
we need some examples, ways of
27:19
laundering money to con-
27:21
That's not where I thought you were going with that. Well
27:23
I thought you were gonna say, Oh, I think we should have some
27:25
examples. Like is there any, you know , uh,
27:27
court cases, case law about like
27:30
money laundering or like the , the prosecution
27:32
of money laundering that we can talk about And you're
27:34
all over here like Yeah . So we should probably finish
27:36
this episode by telling people how to get away
27:38
with it. Is that where you're going?
27:40
Who's finishing the episode?
27:43
We're like 20 minutes in. I'm absolutely
27:45
not finishing the episode right
27:46
Now. I presume that money laundering's quite difficult and
27:48
I just thought it would take you a while to get out there like, you
27:51
know, if I was gonna do it, this is how I would do it, story.
27:53
<laugh> . No, I just think it's , um, it's
27:56
interesting to look at ways that this has been done
27:58
historically and then we
28:00
can look at , um, the example that
28:02
you mentioned earlier. Cause I think you got
28:04
some, some interesting notes on that.
28:07
Interesting is carrying a lot of weight there. I'm excited. <Laugh>
28:14
I-, I'm also excited. Yeah, so
28:16
some some ways of laundering money. Then I'm
28:19
gonna let you cover the first one due
28:21
to the fact that it is called Smurfing.
28:24
I'm not familiar with Smurfing.
28:27
Are you not?
28:28
No, I don't think so.
28:29
Oh. Um...
28:31
Am I?
28:31
Okay, I thought we discussed this previously. It's
28:34
where you, you deposit it in really tiny amounts
28:36
so-
28:36
Oh, so I would call that structuring. I've
28:39
not heard the term smurfing for that.
28:40
Oh.
28:41
So structuring to me is when you, when
28:43
you hear the , I
28:45
think it's fairly well known that there is a limit
28:47
after which a transaction has to
28:49
be reported, right? So in the US you hear
28:51
people very often saying if you deposit more
28:54
than $10,000 you have to report that. And
28:56
then everybody thinks they're really clever cuz they go, ha
28:58
ha , I'll just deposit $9,999
29:01
and I'll just keep doing that. And it's like, yeah , but
29:03
they're looking out for that as well. And that activity
29:05
I would know as structuring, I've never heard it referred
29:08
to as smurfing though. Is smurfing a specific
29:10
kind of structuring or is it just a different name for the same
29:12
thing?
29:13
It's-, It's pretty much the same thing. Yeah. Um, it's
29:15
just , uh, another name for it which I thought that
29:17
you would enjoy actually. So I'm surprised you hadn't
29:19
found that. Um...
29:20
I do, I do enjoy that term. I do.
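The pattern they're describing can be sketched in a few lines of Python. This is purely illustrative: the threshold, margin, and count here are assumptions for the example, not real AML rules or any bank's actual detection logic.

```python
# Illustrative sketch of flagging "structuring"/"smurfing": repeated
# deposits that land just under a reporting threshold look suspicious,
# even though each one on its own is below the limit.

THRESHOLD = 10_000  # assumed reporting threshold (e.g. the US $10,000 figure)
MARGIN = 500        # "just under" band: deposits from 9,500 up to 9,999
MIN_HITS = 3        # assumed number of near-misses before we flag

def flag_structuring(deposits):
    """Return True if enough deposits fall just under the threshold."""
    near_misses = [d for d in deposits if THRESHOLD - MARGIN <= d < THRESHOLD]
    return len(near_misses) >= MIN_HITS

# A single $9,999 deposit looks innocent; a run of them does not.
print(flag_structuring([9_999, 9_800, 9_950, 120]))  # -> True
print(flag_structuring([4_000, 9_999, 250]))         # -> False
```

This is why the "ha ha, I'll just deposit $9,999" plan fails: the pattern of near-misses is itself the signal.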
29:23
<laugh> Yeah. Um
29:25
, also interesting, on things
29:27
like Monzo where you've got like a challenger
29:29
bank, if you go into the app, it'll
29:32
give you limits for a specific
29:34
period of time. So if you
29:36
plan on flying under the transaction
29:38
limit radar and saying, actually I'll just deposit 9,999
29:42
pounds or dollars.
29:43
I've got a good story for this that, uh , probably
29:45
won't make it into the podcast. But , um, a
29:47
little while ago I went out and bought a car and um,
29:50
the- , the way that this worked was , uh, I
29:52
, I went into the dealership-
29:54
Which car is this?
29:55
Uh this is the Juke. So I went into the dealership
29:57
and I was like: "Hi, can I have that car but
29:59
in white please?"
30:00
<Laugh>
30:01
And so I,
30:03
being a millennial have no idea how
30:05
like large money transactions work, right? Cause I've
30:07
like never put a deposit down on a house or something like that. Owning
30:09
a house is just an
30:12
alien concept to me. My parents did it, but I
30:14
don't really understand how it would ever financially work in
30:16
this economy. But I did go
30:18
in and I bought a car and when I buy everything else it's
30:20
just Apple Pay, right? Don't I just like rub my phone
30:22
on your payment machine or whatever. And they were like, Oh
30:24
we can , we can take debit. That's cool. So
30:26
I bought the car with my Monzo card and,
30:29
and then I went, they gave me the keys. They were like, you know , thanks for your business.
30:32
I went outside and I sat in my car and obviously
30:34
I needed to insure the car before I could drive
30:36
it home. So I sat in the car, I rang my insurance
30:38
company, I'd had a quote ready and everything like
30:41
that. I had the number, just had to ring the guy up
30:43
and pay for the insurance cuz I had bought the car,
30:45
rang the guy up, my card wouldn't go through, I'd hit my
30:47
daily payment limit.
30:49
<Laugh>
30:49
So I now owned a car that was just parked
30:51
in the garage's car park that I couldn't legally
30:53
drive. And then you start doing that, like, what
30:56
do I do <laugh> ? Like I don't have
30:58
another bank account . I'm just like, what
31:00
do I do? What do I do? And uh...
31:02
That was both a rookie move
31:04
on your part and an oversight on Monzo's
31:06
part. And also I would expect the garage to
31:08
give you like drive away insurance at
31:10
least so that you could do the paperwork at home.
31:13
They probably would have. And , and , and maybe that is a standard
31:15
thing and I just wasn't aware of it cuz the guy
31:17
like, you know, they run you through all of these things, you
31:19
know, have you considered insurance? Do you have
31:21
this, do you have that? Breakdown? All of those things.
31:24
Cuz I'm sure there's upsells and includes that they
31:26
can , they can add. And when he said insurance, I was
31:28
just like, Yeah, like I've got it covered, don't worry. You
31:30
know , I've got my, my quote number with me and stuff like that.
31:32
And then anyway, I just rang a friend and I was like, Hey,
31:34
can I borrow 70 pounds.
31:36
< Laugh >
31:37
But I need you to give me your , uh, debit
31:39
card number over the phone. And then I ring
31:42
the guy and the, the insurance company when,
31:44
when we worked out that this is happening and I'm on the
31:47
phone to the insurance company and my card won't
31:49
go through. I said to the guy was like, I'm really, really sorry.
31:51
I , I don't know what to do. Let me ring
31:53
a friend and see if he can help me. Uh
31:55
, I've obviously like fallen into a mistake
31:57
here that I wasn't expecting. So I don't , I don't have another bank
31:59
card or anything I can use, you know, like I've got a
32:02
credit card, but I didn't have my credit card with me so
32:04
I didn't know the number. Like I couldn't, you know, I did
32:06
have another card, I just didn't have it on me. So
32:08
I was like, Oh, I'm just gonna ring a friend and see if he can help me.
32:10
And the insurance company guy was just like, Oh
32:12
yeah, don't worry, I'll just wait . And he just sat
32:15
on hold. Like I put the insurance card, the
32:17
insurance company on hold for like, however long
32:20
it took me to ring a friend, explain to them that,
32:22
well essentially without explanation I
32:25
need 70 pounds. And also the
32:27
number of-, from your bank card <laugh> got
32:30
back on the phone to the insurance company. And I was like, Yeah, I've got it, don't
32:32
worry. And then that went through, we're talking suspicious
32:34
transactions.
32:36
< Laugh > That is such a
32:39
Holly story.
32:40
The guy's like, is it , is it the same number
32:42
on this bank card? I'm like, No, I've just told you. I've just rang
32:44
a random person up and said, hey, I need some money.
32:47
<Laugh> I'm
32:50
not even sure how they're allowed to do that. That's
32:53
wild. That's, that's such a Holly story.
32:55
Yeah, I remember saying there
32:58
was something like, can you not just do a direct debit?
33:00
Right. Can you not just cause
33:02
I pay my car insurance by direct debit usually, but
33:05
there's some like deposit or the first
33:07
transaction has to be by debit or something like
33:09
that. I guess for the insurance to be active
33:11
from right now, you have to presumably
33:14
pay an amount. But yeah, I had to pay on debit and
33:16
I couldn't cuz my-, I'd hit my limit.
33:19
So , uh, that's, that's the advice
33:21
kids , uh, if you wanna-, wanna
33:24
make a large transaction, hit
33:26
your limit, that's it for 30 days.
33:29
It's daily, daily limit.
33:31
You have a-?, I,
33:33
I don't wanna ask how much your car was, but
33:36
it's, it's also a very Holly move.
33:38
Well everybody knows how much my car was cuz they'll just look
33:40
at what the daily limit on Monzo is.
33:42
I dunno what else you bought that day. You
33:45
might have spent like a hundred quid on snacks before you
33:47
bought the car.
33:48
One of my weird car buying stories.
33:50
< Laugh >
33:56
So structuring then is the activity
33:59
of a series of smaller deposits
34:01
to try and mask the fact that you really are
34:03
making a larger deposit. Are
34:05
there any other kinds of named activities?
34:08
Yeah, so , uh, something that
34:11
commonly impacts students, vulnerable
34:13
people and in cities typically
34:15
is money muling. So
34:18
that's where you would ask either
34:20
potentially someone that , that you know,
34:22
or a complete stranger to
34:25
allow you to use their bank account. Ironically,
34:28
<laugh> temporarily they'll say, you know, something like,
34:30
can I put five grand in your bank account?
34:33
I'll let you keep like 500 quid if
34:35
you let me do this, transfer five grand in
34:37
and then ask them to transfer four
34:40
and a half out or something like that to
34:42
another account. It's part of
34:44
like the layering phase. So creating like
34:46
that web of transactions to obscure the initial deposit.
34:49
If it passes through what are considered
34:51
to be legitimate bank accounts, then
34:54
it's less likely to be considered
34:56
suspicious by banks, provided
34:59
that that is , uh, usual activity for
35:01
that person. So banks
35:04
may or may not have like a fraud engine
35:06
that would flag up a
35:08
large transaction into a student's account, for example,
35:11
because they get paid a few thousand pounds
35:13
from Student Finance every few months.
35:15
So potentially large transactions wouldn't, wouldn't
35:18
flag up on their system.
35:19
I presume there's quite a lot of data there. Like
35:21
I presume that banks know who Student
35:24
Finance is , right? And they can tell if this is like, this
35:26
is a , the sender is a
35:29
known account used by Student Finance and it's at the right
35:31
time of the year and all of those kind of things.
35:33
Yeah. So some banks are really good at
35:35
that and some just aren't. Some haven't
35:38
historically invested in like a
35:40
, a financial crime team or in the
35:42
technology that they need to detect those sorts of things,
35:44
but it's quite data rich . So there was actually
35:46
an example a few years back that
35:48
was really cool. Monzo detected
35:51
that Ticketmaster had been breached before
35:53
Ticketmaster knew because a lot
35:55
of people who had previously made a purchase
35:58
on their Monzo accounts to
36:00
Ticketmaster were victims of
36:02
fraud. So they basically aggregated
36:05
all the transaction data for the people who were being
36:07
impacted by fraud. And Ticketmaster was
36:09
like the underlying factor. So they reported it
36:11
to Ticketmaster who said, Oh no, like
36:14
this isn't true. We haven't been breached. And then
36:16
discovered later that they actually had, but Monzo had already
36:18
taken steps to prevent any further
36:20
transactions across the board to like
36:22
all of their customers, to the people who were
36:24
committing the , the crime
36:27
<laugh> .
36:27
Yeah, this is something that, that I've known about
36:29
for a while actually, which is an interesting statistic when
36:31
it comes to the statistics behind:
36:33
how are breaches detected?
36:36
Because I think very often people think
36:38
that breaches are detected
36:40
by internal security teams , uh,
36:43
internal systems administrators and those kinds of things.
36:45
And they are, that happens, you know, there's examples
36:47
out there of like Sys admin looking
36:49
through a log file to fix an unrelated
36:51
bug and detecting suspicious activity or
36:53
something like that. Or an organization paying
36:55
for an intrusion prevention system or an intrusion
36:57
detection system that is successful
36:59
in detecting the activity of data exfiltration
37:02
or something like that. But a huge number of
37:04
breaches are actually detected, as you say, by
37:06
financial organizations when they
37:08
see that a large number of accounts
37:11
are tied to fraud activity. And also
37:13
there is a single point of commerce. So it's
37:16
like, oh, all of these accounts, there
37:18
is fraud activity and we know they all shopped
37:20
at the same store. It's surprising, if
37:23
you've not looked into it, how frequently actually
37:25
, uh, breaches are detected by third parties.
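The aggregation they describe, finding the one merchant shared by all the fraud-hit accounts, is easy to picture in code. This is a hedged sketch with made-up data shapes, not Monzo's actual system:

```python
# Sketch of "common point of purchase" analysis: given a set of accounts
# hit by fraud and each account's merchant history, find the merchant
# shared by the most victims -- the likely breach point.
from collections import Counter

def common_point_of_purchase(fraud_accounts, merchant_history):
    """Return (merchant, victim_count) for the most-shared merchant."""
    counts = Counter()
    for account in fraud_accounts:
        # Count each merchant once per victim, not once per transaction.
        counts.update(set(merchant_history.get(account, [])))
    return counts.most_common(1)[0]

# Hypothetical example: three victims, one shared merchant.
history = {
    "acct1": ["Ticketmaster", "Grocer"],
    "acct2": ["Ticketmaster", "Cafe"],
    "acct3": ["Ticketmaster", "Grocer", "Cafe"],
}
print(common_point_of_purchase(["acct1", "acct2", "acct3"], history))
# -> ('Ticketmaster', 3)
```

A real fraud engine would obviously weight this by time windows and baseline popularity (everyone shops at big merchants), but the core idea is just this intersection-counting.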
37:28
Yeah. Um , and for anybody who wants to
37:31
read more about that, that particular case, Monzo
37:33
actually wrote a blog , um,
37:35
about the Ticketmaster case. Um
37:38
, so you can check out on their website
37:40
We'll link it in the show notes .
37:42
So we've covered smurfing, structuring
37:45
, um, money mules. There's also
37:47
things like blending where you use a
37:49
cash heavy business and this is something
37:51
that you see actually on Breaking Bad. So
37:53
there's, I think it's Gus who runs the-,
37:56
the chicken places. He uses restaurants
37:59
that are quite cash heavy businesses to
38:01
blend the proceeds of his like side
38:03
hustle in dealing methamphetamines. And
38:06
then <laugh> is it , is it , I
38:08
dunno if it's Better Call Saul, or Saul, tries
38:10
to help , um, Walter and
38:13
Jesse with laundering their money
38:15
by opening a chain of like nail
38:17
salons or something like that because they're cash
38:19
heavy businesses. So it's more
38:22
difficult to trace the transactions and the
38:24
source of those funds. You could just say we've
38:26
done X amount of business and potentially
38:29
pay variable prices for
38:31
the goods that you need in order to provide that service. And
38:34
I think there's, there's one of these in every town: there's
38:36
a restaurant that never does any business. Like
38:38
it's always empty every time you walk past, but it
38:40
seems to do really well and it's open for years and
38:42
years. Doesn't even do takeout. Like how
38:44
is it still running?
38:45
I think one of the things there as well is the
38:47
business doesn't necessarily have to be like completely
38:49
dead. There's definitely no doubt examples of that where
38:52
there's just like, that business never seems to do
38:54
any business, but it's still here. But it- , it could
38:56
also be that you have a cash rich business
38:58
that is seen to be busy, but
39:00
it's actually recording like
39:02
disproportionately high profits. So something like
39:05
a carwash where you do see cars coming
39:07
and going, everybody's paying in cash, but they're
39:09
, you know, supposedly washing 50
39:12
cars an hour and in actuality they have capacity
39:14
for five, and things like that.
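That carwash plausibility check is simple enough to write down. A minimal sketch, with purely illustrative numbers:

```python
# Sketch of the plausibility check described above: a cash business
# reporting more sales than its site could physically handle looks
# like blending. Figures below are illustrative assumptions.

def exceeds_capacity(reported_washes, hours_open, washes_per_hour_capacity):
    """True if the claimed volume is physically impossible for the site."""
    return reported_washes > hours_open * washes_per_hour_capacity

# Claiming 500 washes over a 10-hour day with capacity for 5 per hour:
print(exceeds_capacity(reported_washes=500, hours_open=10,
                       washes_per_hour_capacity=5))  # -> True
```

In practice an investigator would compare reported takings against utilities, supplies bought, staffing and so on, but the shape of the test is the same: claimed business versus observable capacity.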
39:16
Casinos have done things like this before
39:18
as well. There's been certain cases where casinos
39:20
have been involved in money laundering. Again,
39:22
quite cash rich and
39:25
I think it would be difficult to
39:27
be an accountant, at a casino personally. So
39:30
one of the other examples that's uh , I think
39:33
when you think about it, it makes so much
39:35
sense. It's things like, oh, maybe in like Bad
39:37
Boys or another nondescript crime
39:40
movie, um, trade-
39:42
based. There's usually like some,
39:45
because they're , they're narcotics police, there's usually
39:47
some high profile South
39:50
American drug dealer, he's like trafficking
39:52
cocaine or something and
39:55
he has like stacks and stacks of cash
39:57
in the attic that's like being eaten by rats
39:59
or something like that, that he needs to launder. And
40:02
there are various ways that they do this, but
40:04
something historically that has been picked up
40:06
is trade-based money
40:08
laundering. So where you'll pay, like
40:11
you said previously, a disproportionately
40:13
high amount for something like a piece
40:15
of art.
40:15
Oh yeah.
40:16
...that might be, they'll sell it
40:18
for millions and it's definitely not worth that
40:20
much, but it's like a cover so that they can then have
40:23
a legitimate transaction. They don't have
40:25
to launder the money afterwards. There , there are taxes
40:27
and things associated with that depending on where
40:29
you live.
40:30
Related activities as well. So things like
40:33
buying wine and then
40:35
claiming that the wine was consumed and
40:37
things like that as
40:39
a- , as a means of hiding money, leaving
40:41
an account.
40:43
How would that work? What do you mean?
40:45
If you want to, for example, move
40:48
a large amount of money between countries
40:50
and you can only travel through
40:53
an airport with a certain amount. So say you're
40:55
not allowed to travel with more than $10,000 in
40:58
cash, you can purchase very,
41:00
very expensive vintage wines, travel
41:02
with those because they're not viewed in
41:04
the same way. The limits , uh, might
41:07
not be picked up in the same way as literally traveling
41:09
through an airport with a bag of cash. When you get
41:11
to the far side, you could sell the wine and
41:13
therefore you have successfully transferred currency from
41:15
one nation to another . But on your records
41:18
you could just claim that you consumed it and
41:20
then that money disappears from the system.
41:22
Of course. Okay . Yeah.
41:25
Um, that's another thing that people do , um,
41:28
commonly is they'll-, they'll try and transfer cash
41:30
from one geographical
41:32
area to another that has more lax
41:34
money laundering regulation and anti-money laundering
41:37
regulation.
41:39
Uh , sometimes also just as well, like the , the criminals
41:41
might physically be located in a different territory
41:43
to where they steal the money from, right? I mean, steal
42:45
the money from an organization in the UK, and if
42:47
you're based in, you know, Eastern Europe, Russia
41:50
, uh, some other territory like
41:52
that, like there might be this , um, international
41:54
transfer hurdle that you have to get over just because of
41:57
where the attacker is physically
41:59
located,
42:02
That opens you up to more risk because
42:04
depending on where you're doing the crime and
42:06
where you're doing the laundering, you
42:09
are potentially exposing yourself twice to two
42:11
different investigatory powers
42:13
and, and different pieces of legislation
42:16
that could catch you out there. So
42:18
there was quite a high profile case in 2019
42:20
where a banker from Azerbaijan went
42:22
to prison after he failed to explain the source
42:24
of his income when the National Crime Agency
42:27
served him with an unexplained wealth order. Now
42:29
in an attempt to get his wife to admit to what
42:31
they suspected, which was that he'd embezzled
42:33
the money from the bank he'd worked at, they
42:36
also issued her with an unexplained wealth order
42:38
because of their shared assets. And her
42:40
defense was quite funny. It was basically: "you know,
42:42
there are couples that live under the same roof who
42:44
don't speak to each other and he's actually in
42:46
custody at the minute. So I'm not sure what use you expected
42:49
me to be." Um , but she had effectively
42:51
been spending on average £4,000 a
42:53
day at Harrods for over 10 years. She
42:56
had like over 50 credit cards, really
42:58
expensive jewelry. So she had uh a
43:00
1.1 million pound Cartier ring,
43:03
the couple owned a property in Knightsbridge,
43:05
they owned a golf course. So
43:07
really not subtle on the money laundering
43:10
front. Where did all of this
43:12
money come from? How can you afford to live
43:14
like this when you have no income or
43:16
, or no legitimate income?
43:18
That's one of the things that that often comes up
43:20
with like those major criminals. Isn't it that-, this idea
43:22
that you get caught out on tax evasion
43:24
as opposed to the actual crimes that you're committing. Cuz
43:26
potentially it's easier
43:28
to prosecute based on that, where it's like , hey , you have
43:31
this money and you , you can't show that you've paid tax on
43:33
it. But yeah, I think that's, that's one of the things that
43:35
very often comes up when you read the
43:37
stories of criminals getting caught is these
43:39
unexplained wealth orders, right? Where it's just like somebody
43:42
allegedly like works in the post office
43:44
or works in the corner shop or something like that, but then
43:46
drives a Jaguar.
43:48
Yeah. It-, it's literally that.
43:50
A lot of people would claim, like if you were
43:52
to talk to them about like, hey, you know, if you, if
43:55
you were a successful criminal, how would you get away with it? Like,
43:57
oh, you know, I'd never spend past my means and I
43:59
would hide the money and those kinds of things. Then as soon as
44:01
it actually happens, they're all like: "I bought a Lamborghini".
44:04
It's like-
44:04
<laugh>.
44:04
Don't you work in the Post Office?
44:07
<Laugh> Yeah, no, it is
44:09
that, and that also interestingly, if
44:11
there is somebody committing fraud who works in
44:13
the finance sector, that's how they typically get
44:15
caught because they'll do
44:18
training , uh, mandatory training for everybody in the organization.
44:20
So I've done it a few times. Even if you
44:23
work in a completely unrelated back office
44:25
function and you don't serve customers and you
44:27
don't interact with the teams that serve customers,
44:30
you need to notice , uh, you
44:32
need to notice and report if someone starts
44:34
coming into work suddenly with
44:36
like expensive designer handbags
44:39
or they're going on holiday all the time to like luxury destinations
44:42
that you don't think that they could afford based on
44:44
what they are probably earning. And
44:46
you are obliged to report that internally. So
44:49
there's usually a nominated
44:51
money laundering lead, like a , a financial
44:53
crime officer or
44:55
similar , uh, that handles those kinds of reports. But
44:59
I think the limit for that, the- , the lower limit
45:01
is it has to be at least £50,000
45:04
according to the legislation before
45:07
uh , an unexplained wealth order can be considered. And
45:09
they typically don't pursue it in cases where it's
45:12
low amounts because it's not a
45:14
good use of like court time.
45:15
Yeah. Or like a single thing,
45:17
right? It's like maybe a family member died
45:20
or something and you've paid for a holiday. Like those
45:22
things happen . So we're not just talking about like
45:24
yeah, we went down the pub and he bought one
45:27
extra round than anyone else. You know, he
45:29
seemed a bit loose with his capital . No,
45:32
it's , it's like either a significant amount
45:34
or a significant number.
45:36
Yeah.
45:36
I'll give you, give you an example where, where this could,
45:38
could occur that that might
45:41
be seen as unfair to some people. Very
45:43
often when you talk to like founders of companies and
45:45
things like that. And I'm not necessarily talking about like founders
45:48
of like VC-backed startups and those kinds
45:50
of things. I just mean people who went from having
45:52
a job to being self-employed, maybe
45:55
they do crafts on the side, they open up an Etsy
45:57
store and then eventually over time that becomes their job. Very,
46:00
very often you talk to people about like, when
46:03
did you make the leap kind of thing. Like when
46:05
did you think this, this job
46:07
might, might be able to support you? And
46:09
I think a lot of people like it has
46:11
to be pretty long term and it has to be quite
46:13
a lot of money. You know, if not matching
46:15
your salary, then at least demonstrating that it has
46:17
the potential to match your salary before you would make that jump to
46:19
being self-employed. And that is
46:22
a perfectly legitimate way that
46:24
somebody might have a bit more money than you'd expect
46:26
them to, you know, cuz they're making money
46:28
through another gig like through a-,
46:31
a side project or a new company
46:33
that they might not necessarily talk
46:36
about at work. And I think this, this
46:38
might be strange to some people cuz they would think, well if, you know, if
46:40
you're opening a craft store, you're obviously crafts-
46:42
passionate, you would talk to everybody about that, right? And, and
46:45
you might talk to friends and family about that because that's
46:47
something you're really into and you've got this goal, you want
46:49
to be self-employed. But one thing that
46:51
I would, I would say is for most people not
46:53
a great idea is telling the people that
46:56
you work with that you're intending on leaving.
46:58
But also on the , on the , the flip side of that, that
47:00
is perfectly explainable. So if
47:02
somebody were to contact a whistle blowing
47:04
hotline or report you to their , their money
47:06
la-, anti-money laundering officer, nominated
47:09
person, whatever, you could demonstrate
47:11
really easily. You could say like, I have a side
47:13
hustle. Yeah , I-, I've an Etsy or
47:15
something like, here's my Etsy profile.
47:18
Yeah. And what we're talking about is where
47:20
people have millions in undisclosed
47:22
illegitimate income and then they're
47:24
splashing that in like Harvey Nichols on like
47:26
Chloe handbags and stuff. It's, it's really
47:29
obvious it sticks out.
47:30
Yeah. I was just giving-, giving an example
47:32
of where like one of the reasons
47:35
why they might not investigate
47:37
every single, every single
47:39
time somebody spends outside their means is that like,
47:41
people might have money that you don't know about and that
47:44
is entirely legitimate.
47:45
Yeah. And also on the flip side,
47:48
in some organizations, especially in the finance sector,
47:50
and I think in certain government organizations too,
47:53
the credit checking and fraud checking and
47:55
stuff and sanctions checking that you have to go through to
47:57
get the job is pretty robust. So
48:00
you need typically slightly more references
48:02
than if you were going to work somewhere else. So I
48:04
think the standard is about two years of work history.
48:07
Some finance organizations will
48:09
ask for three or five years of
48:11
references instead depending. And so
48:14
I guess a bit more similar to if you're applying
48:16
for like security clearance rather than a
48:18
standard job, and then they'll do fraud,
48:20
DBS, and sanctions checking to make sure that you're
48:23
not affiliated with crime and you haven't previously been
48:25
prosecuted for fraud. They'll check
48:27
what your , your credit score is to see if you've potentially
48:29
got a bit of a gambling problem or if you don't pay your bills on
48:31
time. Because that puts them in
48:33
an exposed position where if you are serving customers
48:36
or you have access to accounts, you are more likely to
48:38
commit fraud.
48:39
Here's something that comes up on my background checks
48:41
that , that you're aware of. And it's, it's a purely
48:43
human thing cuz a computer wouldn't care, but
48:45
it catches humans out. I I once
48:47
lived in an apartment and it was fantastic and
48:50
then I decided to move closer to
48:52
work. So I left that apartment and moved to an apartment
48:54
that was closer to work that I hated
48:56
and was awful. And a year
48:58
or two later I moved back. I
49:01
didn't move back into the exact same apartment, but I
49:03
moved into same building. It was, it was effectively next
49:06
door . So, you know, you could imagine me
49:09
for , for two years living in Apartment 19,
49:11
moving away for a year and then moving back
49:13
into Apartment 20. And it's, it's,
49:16
there's a completely sensible story behind it. And it
49:18
was just because the , the , the other location,
49:20
the, the other apartment was way nicer, way nicer
49:22
area, way bigger apartment, those kinds of things. But
49:25
it's a thing that stumbles humans where they
49:27
look down your address history and they're like, That's
49:29
weird
49:30
<Laugh>
49:31
It's not that weird. Really. Like-
49:33
Yeah. I don't think it's that weird. Like having
49:35
been to both of those apartment complexes.
49:38
Yeah . One of them, well they're
49:40
both pretty cute. I like both of them, but one of
49:42
them is definitely like-
49:44
It was way nicer. Yeah.
49:44
Yeah.
49:46
We're not talking like a , like a , you
49:48
know, a studio apartment versus a mansion here. I'm
49:50
just saying like, one of them was a
49:52
bit nicer than the other one. I moved, yeah,
49:54
regretted the decision and moved back. Um
49:56
, but that does come up on, you know, when you've got to demonstrate
49:59
your address history and things like that. It comes up on
50:01
that. But only ever with , with people when they
50:03
notice that the numbers are next to each other.
50:06
You do move a lot though. So I imagine that raises-
50:07
I'm agile.
50:07
You're one of
50:10
those people like- Agile! <Laugh> move
50:14
quickly and break stuff. <laugh>
50:17
cool. I think that's about all of
50:19
the , um, immediate historic examples
50:22
that I've got. There are other things like you can use
50:24
gift cards , um, you can use in-game
50:28
currency and rare items on the likes
50:30
of World of Warcraft. You could buy like a
50:32
super rare item, sell it on eBay,
50:34
Real money transactions within video games.
50:36
So things like Entropia Universe that has real money
50:38
transactions.
50:39
Yeah.
50:40
Yeah, I think, I think that's pretty much a good summary
50:42
so far of, like, money laundering:
50:44
it's the attempt to obfuscate
50:46
the source of illegitimately gained funds
50:48
and anti-money laundering is
50:51
investigators attempting to prevent that.
50:53
And also, anytime you think you've
50:55
come up with a really cool new way of doing money laundering,
50:57
you haven't, they have already thought of
50:59
that. You're not as clever as you think you are. Like
51:02
this whole idea of-
51:02
It's been done.
51:03
Yeah, this whole idea of like, No, no , I'll just put in
51:05
loads of small transactions. Yes, that
51:07
has a name, it is structuring. <Laugh>
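As an illustration of why structuring gets flagged rather than hidden, here is a minimal Python sketch. The $10,000 threshold mirrors the US currency-transaction-report limit, but the window, the "just under" band, and the transaction data are all made up for the example:

```python
from collections import defaultdict

# Hypothetical reporting threshold (US CTRs trigger at $10,000).
THRESHOLD = 10_000

def flag_structuring(transactions, window_days=7, min_count=3):
    """Flag accounts with several just-under-threshold deposits
    close together in time - the classic structuring pattern."""
    by_account = defaultdict(list)
    for account, day, amount in transactions:
        # "Just under" here means within 10% below the threshold.
        if THRESHOLD * 0.9 <= amount < THRESHOLD:
            by_account[account].append(day)
    flagged = set()
    for account, days in by_account.items():
        days.sort()
        # Slide a window over the deposit days.
        for i in range(len(days) - min_count + 1):
            if days[i + min_count - 1] - days[i] <= window_days:
                flagged.add(account)
                break
    return flagged

# Made-up example: acct-1 splits ~$28k into three sub-threshold deposits.
txns = [
    ("acct-1", 1, 9_500), ("acct-1", 2, 9_400), ("acct-1", 3, 9_600),
    ("acct-2", 1, 2_000), ("acct-2", 5, 15_000),
]
print(flag_structuring(txns))  # {'acct-1'}
```

Real transaction-monitoring systems are far more elaborate, but even this toy rule catches the naive "loads of small transactions" approach.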
51:10
It's just the, the funniest one for me
51:12
I think is like trying to blend trade-based
51:15
money laundering of like a
51:17
high-priced art with the
51:19
blockchain and creating some mangled
51:23
NFT thing, it's-. Oh, I hate
51:25
it.
51:26
You know the story of the first tweet, right?
51:28
The first tweet NFT.
51:30
I saw that. Yeah. It was ridiculous.
51:32
And it's especially funny that they tried to offload
51:34
that after Jack had resigned from Twitter.
51:37
<Laugh>
51:38
Were they offloading it because Jack left
51:40
or were they offloading it because Elon arrived.
51:43
Ehhh.
51:44
Yeah, for those who don't know that story, there
51:46
was an NFT made of the first tweet. It was sold
51:48
for the ridiculous figure of
51:50
$2.9 million, and then
51:53
recently it went up for auction and it
51:56
did not do very well at auction.
51:59
It's almost like people are realizing that
52:01
there's no legitimate purpose
52:03
for NFTs outside of having
52:05
a hexagonal profile picture on Twitter
52:07
and looking like a dweeb. You
52:09
can cut that if you want to. <Laugh>. I think for your
52:11
benefit I might cut that. We'll see. We'll
52:14
see how many angry
52:16
NFT owners...uh... <Laugh> Yeah, I
52:18
think, um, in-game currency is probably,
52:21
I wanna say my favorite. I'm not an advocate
52:23
for money laundering at all, but I think like
52:26
there are certain kinds of crime
52:28
that are-, that they take a lot of skill
52:30
to pull off and they're a bit of an art form and
52:32
in-game currency or if you
52:34
compromise someone's Blizzard account and
52:37
then sell all of their gold on Warcraft for
52:39
real money, that's really interesting to
52:41
me.
52:42
Yeah, it's definitely an interesting field and I think
52:44
it's one of those things where it's difficult, right?
52:46
Like successfully laundering money is
52:48
difficult and it must be because there's so many stories
52:51
of people getting it wrong. Either,
52:53
they made mistakes when they were new.
52:56
So when they're first starting out with criminal
52:58
activities, you know, they make mistakes that get them caught
53:00
in the future or you know, they're
53:02
maybe really good at hacking but not necessarily
53:04
really good at, you know, the actual money
53:07
laundering aspects, those kinds of things. Or maybe
53:09
there's also examples where groups
53:11
get together, the groups trust each other
53:13
and then one of them sells them out for whatever
53:16
reason, you know, they fall out or
53:18
maybe one gets caught and then there's a plea
53:20
deal and those kinds of things. I think I wanna kind
53:23
of bring in here the, the story that I was talking to
53:25
you about earlier in terms of just like really
53:27
good example of some crypto
53:29
theft from an exchange that led
53:31
down a really difficult example
53:33
of money laundering and how,
53:35
you know, whilst you might watch movies, you
53:38
might read books about money laundering. There's a lot of criminals out
53:40
there who it turns out are just not very good at it.
53:44
I wanna hear the story. I've, I've heard
53:46
it already, but I wanna hear it again cuz it was hilarious.
53:49
So this is the story of Ilya 'Dutch' Lichtenstein
53:52
and, and Heather Morgan, who I
53:54
will just summarize as saying both very
53:57
interesting characters for various reasons.
53:59
And they were recently charged
54:02
with money laundering. The beginning of
54:04
this story, there's some details missing from
54:06
this story, but they were charged with laundering
54:09
the money that was stolen from Bitfinex.
54:12
The reason that I word it like that is so
54:14
far they have not actually been charged with
54:17
hacking Bitfinex, it's just that
54:19
they came into possession of the
54:21
funds and then they , they have
54:24
been laundering the funds and it's a huge figure.
54:26
So when Bitfinex were originally
54:28
compromised 119,754
54:32
bitcoins were stolen, which at the time,
54:34
this is 2016, was about
54:37
$71 million. Another way of putting that was 0.75%
54:41
of all Bitcoin in circulation. So
54:43
a huge, huge amount of coins. But
54:46
of course over the years that has increased
54:48
greatly in value. And when
54:51
I was writing the show notes a couple of weeks ago, uh,
54:53
I took a look at what the value was and it
54:55
was $5.2 billion. So
54:58
this has slowly been increasing,
55:00
but it's uh , a huge amount of money
55:02
that they had in their possession
55:05
as cryptocurrency that they,
55:07
it is alleged, tried to
55:10
launder so that they could access those funds
55:12
and, and use them illegitimately. This is the
55:14
biggest law enforcement seizure
55:17
of all time. The law enforcement
55:19
were able to seize 94,636
55:22
of the Bitcoin, which at the time
55:25
of the seizure was valued at $3.6
55:27
billion. So it's just a crazy amount
55:29
of money. But the story
55:31
is interesting not only because a
55:33
huge amount of Bitcoin in terms of value
55:36
was stolen, that Bitcoin was then
55:38
seized by law enforcement, but
55:40
it's interesting just because of like the difficulty
55:42
that the two suspects
55:45
had in actually laundering the money. For
55:48
one thing, some of their activities
55:50
did tip off private
55:53
industry organizations. So when they
55:55
were trying to access the funds, for example,
55:57
they were bumping into crypto exchanges'
56:00
know-your-customer processes. So
56:02
some of the crypto exchanges were asking
56:05
them about the sources of these funds
56:07
and they were giving, uh, either no
56:09
answers. So when they were challenged, they would
56:11
simply just not use that account anymore and the account would
56:13
be frozen or they would come up with
56:15
some story which was, was
56:17
not believed or was not detailed enough for the crypto exchange.
56:20
So saying things like the cryptocurrency was
56:23
previously gifted to them and they'd been holding them
56:25
in cold storage and those kinds of things, things that
56:27
we now know are not
56:29
true. It's alleged that law enforcement
56:31
can demonstrate these funds came from, from
56:33
the Bitfinex breach, right? Because they
56:36
can track those transactions through the blockchain.
56:38
They did take actions to obfuscate
56:40
the sources. They used layering, where
56:43
they are moving the funds between
56:45
a large number of crypto wallets
56:47
and they were allegedly doing that using
56:49
automated software. So they're not just moving it
56:51
by hand, you know, the same amount of money,
56:53
not just like drag and drop $3.6
56:56
billion worth of crypto from one wallet to another. They were
56:58
doing multiple transactions between multiple
57:00
wallets of varying values. This
57:03
activity we would call layering. And it's a, it's a money
57:05
laundering technique and one
57:07
of the things that occurred from that is they
57:10
therefore had a large number
57:12
of crypto wallets that had some
57:15
funds in. So they needed a way
57:17
of tracking not only the wallets where
57:19
the funds were, uh, but they were tracking of course
57:21
if some of those funds had been frozen, they
57:24
were tracking those and they also had to keep track of
57:26
the, the passwords, or in this context it would of course be private
57:29
keys for those crypto wallets. So
57:31
, uh, it is alleged that Dutch did the...only
57:34
sensible thing you would do if you had a whole bunch of passwords
57:36
you needed to track. He kept them in a spreadsheet
57:39
on his cloud storage, and
57:42
law enforcement, as part of their investigation,
57:45
obtained a search warrant for the cloud storage. It
57:47
is said that law enforcement were able to decrypt
57:49
these files. It's not said in the documentation
57:52
that I've read how that was possible, but they
57:54
were able to decrypt these files and they were therefore
57:56
able to access this spreadsheet
57:58
of all of the Bitcoin addresses and all of
58:00
the private keys. And that was how they were able to
58:02
seize the funds, was they just had access to
58:04
those wallets so they could transfer the money out. So
58:07
this is, you know, calling back to earlier of
58:09
, even if you don't tell the person
58:11
that they're under suspicion of
58:13
money laundering and their account has been frozen, if
58:16
you go into the bank and the teller says, Sorry, you
58:18
know, there's a hold on your account, we can't process this transaction
58:20
at this time, that might tip you off
58:22
that the funds have been frozen. Imagine when
58:25
Dutch logged into his account and all of his funds
58:27
had gone, that might have been an indicator that they
58:29
were on him.
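The layering pattern described above can be sketched in a few lines of Python. This is a toy model, not anything the pair actually ran: it just shows how automated, randomized transfers fan funds out across many wallets, and why you end up needing exactly the kind of address-and-key ledger that Dutch allegedly kept in a spreadsheet:

```python
import random

# Toy model of layering: funds are split across many wallets in
# randomized hops, which is why the launderer ends up needing a
# ledger of addresses and private keys. All values are illustrative.
random.seed(7)

def layer(amount, hops=4, fanout=3):
    """Split `amount` through `hops` rounds of randomized transfers.
    Returns the final wallet balances and the key ledger accumulated."""
    wallets = {"w0": amount}
    ledger = {"w0": "key-w0"}          # address -> private key
    counter = 1
    for _ in range(hops):
        nxt = {}
        for _, balance in wallets.items():
            # Random split of this wallet's balance across new wallets.
            cuts = sorted(random.random() for _ in range(fanout - 1))
            parts = [a - b for a, b in zip(cuts + [1.0], [0.0] + cuts)]
            for p in parts:
                name = f"w{counter}"
                counter += 1
                nxt[name] = nxt.get(name, 0.0) + balance * p
                ledger[name] = f"key-{name}"
        wallets = nxt
    return wallets, ledger

wallets, ledger = layer(1_000_000.0)
print(len(wallets), round(sum(wallets.values())))  # many wallets, same total
```

Note that the total is conserved at every hop, which is also why layering alone does not defeat blockchain analysis: the transaction graph connecting these wallets is all public.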
58:30
<Laugh> That's actually really interesting though because
58:32
there's, there's some pieces of legislation
58:35
and there's some quirks in how you're
58:37
able to obtain evidence for things like that. And
58:40
I think when you're looking at e-crime
58:42
or digital fraud,
58:45
the Computer Misuse Act and how
58:47
you obtain evidence in digital forensics is
58:50
a really difficult thing to navigate. So I'm wondering how
58:52
they, how they achieved that. I'd like to
58:54
know more about that. Maybe we can cover that in like a digital forensics
58:57
episode.
58:57
We could also, no doubt, talk a little bit about how
59:00
search warrants work for electronic systems and
59:02
those kinds of things. How Law Enforcement
59:04
are able to, to lawfully
59:06
access these things. And we could maybe talk about balancing
59:09
that against privacy rights as well. You
59:11
know, there's a lot to be said about things like law enforcement's
59:13
access to messenger platforms, foreshadowing
59:16
for a future episode there.
59:18
We also need a whole episode to
59:21
discuss the myriad ways
59:23
in which Microsoft Excel has served humanity.
59:26
It's the second best tool, <laugh>, it
59:28
doesn't matter what you're doing. It is the second best tool.
59:30
It's the best password manager, it's the best
59:33
place to store all your crypto keys. Sold.
59:36
So, so far in the story, we know that
59:39
Dutch and, and Heather
59:41
had access to these
59:43
funds that are alleged to have come from the Bitfinex
59:45
breach. We know that it is alleged that they attempted
59:48
to launder those through layering, through obfuscating where
59:50
the source of those funds was. But
59:53
the last step that we haven't talked about is this integration
59:55
step, right? Of once you've done the layering,
59:57
you've obfuscated the source of
59:59
those funds, how do you get it
1:00:01
so that you can actually access them? So
1:00:03
how do you turn illegitimately
1:00:06
gained cryptocurrency into cash
1:00:09
or some of the mechanisms that you can use?
1:00:11
So you've talked about things earlier, you mentioned, you
1:00:13
know, using gift cards and those kinds of things and maybe that's
1:00:15
one activity, using crypto to buy
1:00:17
gift cards and then buying things with the gift cards, those kinds of
1:00:19
things.
1:00:20
In this particular instance, it
1:00:22
is alleged, what Dutch did was he
1:00:24
used the crypto funds with an exchange
1:00:27
that allowed you to purchase gold and
1:00:29
then had the gold sent to an address that
1:00:31
he had access to. So of course that
1:00:34
is one mechanism for turning the crypto funds
1:00:36
into a source of currency, a form
1:00:38
of currency that you could then, you know, sell
1:00:40
and use to buy products and
1:00:42
services. The problem is the
1:00:46
exchange that they used, allegedly
1:00:49
they signed up for using their
1:00:51
actual email address and
1:00:53
validated it with their actual driving
1:00:55
license and then had the gold sent
1:00:57
to their actual home address. That's
1:01:00
incredible.
1:01:02
Yes .
1:01:02
Incredible.
1:01:03
So, you know, tracking that
1:01:05
down was relatively easy,
1:01:08
and of course you might, you might say
1:01:10
that, oh well if you're
1:01:12
doing these activities, you know, you need to worry about
1:01:15
OPSEC, operational security, and
1:01:17
effectively protecting your privacy. If you are the criminal
1:01:19
, you need to make sure that law enforcement can't track
1:01:21
you. And you know, we're talking about like fake
1:01:23
IDs, fake passports, making
1:01:25
it so that your communications between your , your
1:01:28
co-conspirators are protected
1:01:30
from law enforcement. And of course it's
1:01:32
alleged that Dutch also performed
1:01:35
some of those actions. For example, when they searched
1:01:38
his apartment, they found a bag
1:01:40
labelled 'burner phones'.
1:01:42
<laugh>
1:01:43
That's one thing. And also, um, on
1:01:46
the cloud storage there were other files, for
1:01:48
example, you know, a file giving ideas
1:01:50
for where they could get other passports
1:01:52
from and those kinds of things. So
1:01:55
I mean it's, it's pretty nice
1:01:57
of them to, you know, have a folder on
1:01:59
their cloud storage called 'passport ideas' and a
1:02:01
bag in their apartment called 'burner phones'. If
1:02:03
you're gonna get Law Enforcement searching your apartment, I
1:02:05
mean, you know, neatly categorizing the evidence for
1:02:08
them is only polite.
1:02:09
Yeah, it's like when you check out of an Airbnb and
1:02:12
you take the sheets off, right?
1:02:14
When you check out of an Airbnb, you take the
1:02:16
sheets?
1:02:17
You take the sheets off. What
1:02:19
just like, and all of the light bulbs, just like,
1:02:21
Oh my god.
1:02:23
So that's a nice throw pillow. I might keep it.
1:02:26
You leave the sheets on, don't you? Oh
1:02:28
my goodness.
1:02:29
I don't stay in Airbnbs. That's the problem
1:02:31
that we're coming up against here .
1:02:34
The funniest part about all of this for me
1:02:37
is what did he think
1:02:39
he was going to be able to do with
1:02:41
the gold.
1:02:43
To, to be clear, the gold is not the only thing that
1:02:45
they bought. There were some gift cards. Gift cards
1:02:47
were used to purchase things like a PlayStation
1:02:50
and that kind of thing. So they did
1:02:52
gain access to some of the funds. A huge
1:02:54
amount of the funds was frozen. Something like
1:02:56
$180,000 was, uh, restricted
1:02:58
by the crypto exchanges. Not necessarily
1:03:01
because of a specific
1:03:03
suspicion, but because, like I said, they
1:03:05
had difficulty with the know your customer protocols
1:03:08
because they couldn't validate or chose not
1:03:10
to validate the accounts, the exchanges
1:03:12
therefore locked those accounts by
1:03:14
default until a validation process was followed
1:03:17
through. So that did cause them some problems
1:03:20
and honestly by reading it,
1:03:22
it just sounds like they had a real difficult time
1:03:24
with these anti-money laundering and know
1:03:26
your customer protocols. And it may have just been
1:03:28
the case that on one hand maybe they made some
1:03:30
mistakes, they made those mistakes when they were
1:03:32
less knowledgeable of the processes or maybe
1:03:34
it was frustration that they couldn't
1:03:36
access these funds. You know, they've got a bank account here
1:03:39
with $4 billion in it and they can't get to it.
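The freeze-by-default behaviour described there can be sketched roughly like this. It's a simplified, hypothetical model of an exchange's know-your-customer check, not any real exchange's API; the point is just that an unsupported source-of-funds story leaves the account locked until validation completes:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a freeze-by-default KYC flow: the exchange
# challenges the source of funds, and the account stays locked until
# a validation process completes. All names here are illustrative.

@dataclass
class Account:
    owner: str
    balance: float
    locked: bool = False
    notes: list = field(default_factory=list)

def kyc_challenge(account, source_of_funds):
    """Lock the account unless a verifiable source-of-funds answer is given."""
    if not source_of_funds or not source_of_funds.get("documents"):
        account.locked = True
        account.notes.append("KYC: no verifiable source of funds; locked by default")
    else:
        account.notes.append("KYC: documentation received; under review")

acct = Account("dutch", 180_000.0)
kyc_challenge(acct, {"story": "gifted crypto, held in cold storage"})  # no documents
print(acct.locked)  # True
```

A story with no supporting documents fails the check, which matches the pattern described in the episode: accounts locked by default until the validation process was followed through.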
1:03:41
Well, in, in fairness, it didn't have
1:03:44
$4 billion in it when they committed the crime, it was only, you
1:03:46
know, tens of millions.
1:03:49
I mean as far as we know, they, they didn't commit the
1:03:51
crime, we just know it is alleged that they
1:03:53
had access to the funds. It could be that
1:03:55
someone else, an associate or
1:03:58
somebody else actually did the Bitfinex breach
1:04:00
and then maybe they themselves
1:04:03
could not access the funds. You know, this
1:04:05
is one of the things when it comes to cybercrime in
1:04:07
general, if you have some technical
1:04:09
skill but not skill in things like money laundering
1:04:12
as a cybercriminal, maybe you just
1:04:14
sell that service, right? So you hear things like ransomware-as-a-service
1:04:17
and those kinds of things, you know, maybe you, you
1:04:19
just sell a capability. So whilst
1:04:22
it is true that at this time they've not been charged with
1:04:24
the Bitfinex breach itself, it, it could literally
1:04:27
be that they just did not commit that crime. They
1:04:29
were just given the funds through some other means.
1:04:32
I've been studying cyber crime recently as
1:04:35
part of my master's degree and something that comes up quite often
1:04:37
and that they reiterate to you throughout the module is that
1:04:39
it's easier to prosecute fraud than
1:04:41
it is to prosecute cyber crime, because
1:04:44
attribution is more difficult and there's also
1:04:46
less maturity in the regulation around
1:04:49
cybercrime. So there are
1:04:51
interpretation quirks and
1:04:53
because it's also such a , a difficult thing
1:04:55
for lay people to understand, it's
1:04:58
easier to induce doubt into the
1:05:00
minds of like a potential jury whether
1:05:02
a crime was actually committed or-, or the person in
1:05:05
question committed the crime. Whereas fraud is much
1:05:07
easier and more widely
1:05:09
understood. So it's easier to prove.
1:05:11
And the-, the actual charges that they have against
1:05:14
them are laundering of monetary
1:05:16
instruments, fraud by wire, radio,
1:05:18
or television and conspiracy to commit
1:05:20
offence or to defraud the United States. So
1:05:23
they, you know, they are being charged effectively with,
1:05:26
with the laundering of the money. Not
1:05:28
at this time, at least with the actual
1:05:31
original breach of Bitfinex.
1:05:35
I think the, the volumes, um, in
1:05:37
terms of money laundering and potential fraud that they've
1:05:39
committed there though, do they really need to
1:05:41
be charged with the attack
1:05:44
itself, the breach itself, on top of that? I
1:05:46
think like the sentence is gonna be pretty
1:05:48
huge given the volume of
1:05:50
cryptocurrency involved.
1:05:52
This is like a societal and
1:05:54
like moral question, isn't it?
1:05:56
It's-, does it matter if
1:05:58
they are charged with all
1:06:00
of the offenses that they've committed, or does it matter
1:06:02
that, you know, they're appropriately punished? It, it
1:06:05
depends entirely on how you feel about crime
1:06:07
in general and should prison be
1:06:09
a punishment or should it be aiming for rehabilitation
1:06:12
and all of those kinds of things. I think there's like a, a
1:06:14
massive moral hole you could fall down
1:06:16
there in terms of like should we
1:06:18
aim for charges that are easy to get through the
1:06:21
courts or should we aim for the charges that most
1:06:23
accurately reflect the crimes committed and those
1:06:25
kinds of things. And I think this
1:06:28
podcast is maybe not the right place to have that kind
1:06:30
of moral discussion. And instead we should maybe just close
1:06:32
out with a fantastic summary
1:06:34
of this particular case that I saw by
1:06:37
Matt Levine, who's a Bloomberg columnist,
1:06:39
and I think he summed up this case really
1:06:42
quite accurately. He said: "If you rob
1:06:44
a bank and steal a sack of money and the bills
1:06:46
are sequentially numbered and a dye pack explodes
1:06:48
in the sack and you drive directly to another bank
1:06:51
and hand them the dye-stained sack and say, I'd
1:06:53
like to make a deposit, please, you will totally get
1:06:56
arrested and you will probably be charged with money laundering.
1:06:58
But in no meaningful sense did
1:07:00
you launder the money, it still has dye on it.
1:07:02
That happened here."
1:07:05
<laugh> . Yeah, I think people often make
1:07:07
the mistake of thinking that cryptocurrency is
1:07:09
perfect for money laundering. That
1:07:12
is, I think, largely due to
1:07:14
a lack of understanding around the technology and
1:07:16
also this impression that it's not
1:07:18
at all regulated. It
1:07:21
still has to be regulated to an extent
1:07:23
because it still touches the legitimate financial
1:07:25
system.
1:07:26
You could argue here what, what does
1:07:28
regulation mean and those kinds of things. But yeah,
1:07:30
the fact of the matter is there's a lot of people
1:07:32
out there who are committing these crimes and they are
1:07:35
being successfully charged with
1:07:37
those actions. So I
1:07:39
think the first thing is don't confuse
1:07:42
bitcoin or, or I guess more accurately the
1:07:44
blockchain being decentralized with
1:07:46
it being anonymous or untraceable. It is
1:07:48
definitely not that. There are anonymity-
1:07:52
enhanced currencies out there, but
1:07:54
money laundering is a difficult
1:07:56
topic. And even when you take away things like these
1:07:59
people making basic mistakes like using
1:08:01
their actual ID, using their actual home address, all
1:08:04
of those kinds of things, it turns out that
1:08:06
anti-money laundering and know your customer protocols
1:08:08
can work.
1:08:10
But I think on the other side, like as
1:08:12
an industry, the finance sector has
1:08:15
a long way to go in terms of maturity. I
1:08:17
always think it's funny when people report at
1:08:19
the end of a, a financial year on statistics
1:08:21
and on, on criminal statistics and things like
1:08:24
that and they'll say that
1:08:26
there was like zero pounds of successful
1:08:28
fraud. I've seen a stat before: it was zero pounds
1:08:30
of successful mortgage fraud in a financial year. Like
1:08:32
if it was successful, would
1:08:34
you have detected it? That
1:08:36
stat's meaningless to me because I, I don't
1:08:39
think that you have the maturity or the capability to
1:08:41
detect it if it's successful and it's well executed.
1:08:45
And I think actually we have a gap as a
1:08:45
security industry in testing fraud detection
1:08:48
systems. So we might approach things in
1:08:51
the way that we approach penetration
1:08:53
testing or site reliability engineering
1:08:55
or something like that where we test the, the
1:08:57
confidentiality, integrity, and availability of information
1:08:59
systems. But are we testing fraud
1:09:02
detection processes and anti-money
1:09:04
laundering processes? How easy
1:09:06
is it to circumvent those? And I suppose it
1:09:08
sort of falls into fraud-
1:09:11
adjacent, social-engineering-type
1:09:13
work. But I think that there's a lot that we
1:09:15
could do in that space and I'm really interested to see how it
1:09:17
evolves over the next decade or so.
1:09:19
There's a big thing there to be said as well about
1:09:22
just like statistics in general. I-
1:09:25
, I operate obviously on the cybersecurity
1:09:27
side of things more than the, the financial side
1:09:29
of things but I see a huge number
1:09:31
of statistics within cybersecurity that are
1:09:33
just wrong, incomplete,
1:09:35
or intentionally misleading. I'll give you an
1:09:37
example from this week. I saw a statistic
1:09:40
that said, and I quote, "phishing
1:09:42
attacks increased from 72% to
1:09:44
83% in the last 12 months". What
1:09:47
does that mean? Did the efficacy
1:09:49
of phishing increase? Did the number of threat groups utilizing
1:09:52
phishing increase? Did the number of phishing emails
1:09:54
sent increase? Did the number of companies
1:09:56
targeted increase? What increased?
1:09:59
But we see these statistics all the time.
1:10:01
Yeah, I would, I
1:10:04
would interpret that as business compromise
1:10:06
originating from phishing emails
1:10:09
or where you can attribute the
1:10:11
origin to a phishing email
1:10:13
has increased from like 72 to
1:10:15
80 whatever percent. Like that's what I would assume
1:10:17
that meant.
1:10:18
Yeah, but it doesn't have to mean that, right? So
1:10:20
one of the things that I see very often in the context of
1:10:23
phishing emails is people saying the
1:10:25
frequency of phishing attacks is increasing.
1:10:28
Um, that doesn't necessarily matter. It
1:10:30
might with phishing but certainly with other attacks
1:10:32
it might not. For example, if password guessing
1:10:34
attacks are increasing but your organization
1:10:37
effectively defends against password
1:10:39
guessing attacks, you know, you've got multifactor authentication,
1:10:41
let's not worry about the details for now. Something like that.
1:10:43
You have a protection that mitigates the
1:10:45
risk of password guessing attacks. It doesn't matter
1:10:47
how frequent they are, if you effectively
1:10:50
defend against them. So the statistic can
1:10:52
become meaningless, you can also hide
1:10:54
some things. So in the same article I saw another example
1:10:56
of, um, details getting like conflated.
1:10:59
So the article said 47.3%
1:11:02
of emails sent or received are
1:11:04
spam emails. And then I saw another
1:11:07
article citing that original
1:11:10
source saying almost half of emails
1:11:12
sent in 2021 were phishing. Now,
1:11:15
spam email and phishing email
1:11:17
are not synonymous.
1:11:19
Yeah, I think there's probably
1:11:21
another episode in there where we
1:11:23
talk about things like phishing simulations internally
1:11:26
, um, awareness campaigns and
1:11:28
so on.
1:11:29
We definitely need to talk about, um, phishing
1:11:31
simulations cuz there's so many cybersecurity
1:11:33
professionals out there who based
1:11:35
on their experience or based on their bias will say
1:11:38
a certain defensive mechanism should be used
1:11:40
without backing that up with data. And
1:11:43
like I'm seeing research that suggests
1:11:45
that active phishing campaigns against your
1:11:48
own staff can in some instances be actively
1:11:50
damaging, but we still see it constantly,
1:11:53
constantly advocated for in the same way that
1:11:55
we still have organizations enforcing
1:11:57
things like password rotation, that is
1:11:59
arbitrarily changing passwords after a
1:12:01
number of days even though there's so
1:12:03
much research that says that it's actively damaging
1:12:05
for defense.
1:12:07
I think it's in keeping with outdated
1:12:09
regulation or standards that haven't been
1:12:11
updated though. So
1:12:13
I think best practice is typically to follow
1:12:16
things like NCSC guidance or, or
1:12:18
NIST guidance depending on a
1:12:20
given security control, but that might
1:12:22
not always be appropriate for your organization. So,
1:12:25
I think like being too prescriptive with
1:12:27
like global security controls is
1:12:30
a flaw and that's like something that
1:12:32
I would like to explore more widely in a-, a different episode
1:12:34
but I don't think that you can just set
1:12:36
it and leave it and forget about it. With security,
1:12:39
you need to constantly review
1:12:41
your controls and make sure that they're still appropriate as
1:12:43
your organization evolves and as the industry
1:12:45
evolves and the behavior of the sorts
1:12:47
of threat actors and attacks that you're seeing evolves.
1:12:51
There's that side of it as well. But there's also just technology
1:12:53
improves and things become better. And
1:12:55
even if you are using a control because it
1:12:58
effectively mitigates a risk, maybe
1:13:00
there is a similar but different control
1:13:02
that will offer the same level of protection
1:13:04
but it's maybe easier to use for your staff and
1:13:06
those kinds of things. So you should
1:13:09
constantly review those things not only from a security point of view
1:13:11
but just like for other reasons. Operational
1:13:13
excellence.
1:13:15
A high-level UN panel estimated
1:13:17
that annual money laundering flows globally
1:13:19
are at $1.6 trillion or
1:13:22
about 2.7% of global GDP
1:13:24
in 2020. I'll add the
1:13:27
source for that into the show notes cuz I know that
1:13:29
you love stats.
1:13:31
I actually do love data-driven
1:13:33
arguments. What I hate is
1:13:35
, um, people misrepresenting statistics,
1:12:38
doing things like conflating spam
1:12:40
and phishing. They're different words. They're not synonymous.
1:13:43
So yeah, I'm not against statistics, please
1:13:45
don't interpret that as me being against statistics.
1:13:47
I'm against statistics being used for the benefit
1:13:49
of lying,
1:13:51
Um, <laugh>. And secondly, we've
1:13:53
just talked about money laundering for about an hour and a
1:13:55
half. What is your favorite
1:13:58
money laundering mechanism?
1:14:00
I'm not recommending money laundering though. Can we just close this episode
1:14:03
out by saying that in that particular case they're
1:14:05
facing charges with a maximum sentence of
1:14:07
up to 25 years' imprisonment.
1:14:10
That's pretty wild. Also
1:14:12
not advocating money laundering, I just think it's really interesting.