Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements may have changed.
Use Ctrl + F to search
0:00
It's time for Security Now. Steve Gibson is
0:02
here. We'll talk about how
0:04
Dropbox properly handled a
0:06
minor breach, and ask the question of whether
0:08
you should ever trust a managed
0:10
service provider. More on the
0:12
OpenSSL flaws, the
0:14
FTC going at it
0:17
with Chegg. I'm glad to see
0:19
this. And is China
0:21
cheating with zero days? That and a
0:23
whole lot more, coming up next on
0:26
Security Now. Stay tuned.
0:30
Podcasts you love from
0:32
people you trust. This is
0:35
TWiT. This
0:40
is Security Now with Steve Gibson, episode
0:43
eight hundred ninety six recorded
0:45
Tuesday, November eighth, twenty
0:48
twenty-two: something for
0:50
everyone. Security
0:53
Now is brought to you by Thinkst
0:55
Canary. Detect attackers on your
0:57
network while avoiding irritating false
0:59
alarms. Get the alerts that
1:01
matter. For ten percent off and a sixty
1:03
day money back guarantee, go to canary
1:06
dot tools slash twit, and then
1:08
enter the code TWIT in the how did you hear about
1:10
us box. And by Drata,
1:13
security professionals are undergoing
1:16
the tedious and arduous task
1:18
of manually manually
1:20
collecting evidence. With Drata, say
1:22
goodbye to the days of manual evidence collection
1:24
and hello to automation. All
1:26
done at Drata speed. Visit
1:28
drata dot com slash twit to get
1:30
a demo and ten percent off
1:33
implementation. It's
1:35
time for Security Now, the show where we
1:37
cover you, your privacy, your security,
1:40
how it all works, how computers work
1:42
with this guy this genius right
1:44
here, mister Steve Gibson. Hello,
1:47
Steve? Yo, Leo.
1:49
Great to be with you. This
1:52
is a Patch Tuesday. It
1:54
hasn't fallen
1:56
on an election day since twenty
1:58
sixteen. Just a little bit of trivia
2:01
there for those who are -- Yes. -- following
2:03
along. And I owe
2:06
today's show title to my wife.
2:08
Lorrie and I were out walking yesterday, and
2:11
and, you know, I was telling her
2:13
what progress I had made so far.
2:15
She said, do you have a topic? And I
2:17
said, you know, I don't, so
2:19
far. I said, but that's okay. Sometimes
2:23
nothing really jumps out or stands
2:25
out or needs special attention. And so
2:27
I just call it, you know, like a
2:29
busy news week or something. And
2:31
she said, how about calling it
2:33
something for everyone? Oh. And I said,
2:36
I like that. I like it. So
2:38
that's today's title. Something
2:41
for everyone because we just have
2:43
all kinds of stuff. We've got you know,
2:45
one of our pure news weeks we've
2:47
got Dropbox's handling of
2:49
a minor breach. We follow-up
2:51
on last week's open SSL flaws.
2:54
the FTC has had it
2:56
with a repeat offender, and
2:59
we're gonna find out how much total
3:01
reported ransom was
3:04
paid last year to
3:06
the ransomware denizens
3:08
-- Akamai has
3:10
reported on phishing kits, and
3:12
some of it's, like, frightening. We've
3:15
got some stats about what initial
3:17
access brokers charge and
3:20
we look at the mechanics of cyber
3:22
bank heists, like
3:24
how that's actually pulled off in the real
3:26
world. we've got several more
3:29
DeFi platforms, defying
3:31
belief. Russia is
3:34
forced to move to Linux. Finally,
3:36
the Red Cross wants a please
3:38
don't attack us cyber seal.
3:41
We've got nutty Floridians who
3:43
have gotten themselves indicted in
3:46
a bold tax fraud scheme,
3:48
that you just can't imagine they could have
3:50
possibly thought they could have gotten away with.
3:53
And, well, because of indictments,
3:55
they didn't. We've also you
3:57
know how that is? Yeah. That's right.
3:59
Also, the
4:02
question has been raised by Microsoft whether
4:04
China is cheating with zero
4:06
days and in
4:09
what I think is a fabulous idea
4:11
that I hope the the US might adopt,
4:14
the NCSC will
4:16
be scanning the UK's
4:18
citizenry for vulnerabilities
4:21
and working with them to remediate them,
4:23
and That's not all. There's more.
4:25
We got a great picture of the week. I've got
4:28
some feedback from our listeners
4:30
and a brief update
4:32
on where SpinRite stands. So -- Oh, as
4:34
I said -- Sounds like forever. --
4:36
everyone. That sounds like
4:38
an excellent show. I'm looking forward to it.
4:40
I always do. And
4:42
I also am happy to say so
4:44
to our sponsors, especially the guys at
4:46
Canary. They are big
4:48
fans. And I should
4:50
say that this is a group
4:52
of people who know what they're talking about.
4:55
The folks who invented
4:57
the Thinkst Canary
5:00
have trained companies and militaries and
5:03
governments on how to break into
5:05
networks. And it's with that knowledge
5:07
that they built this, the
5:09
Thinkst Canary, maybe the
5:11
best little honeypot money
5:14
can buy. It's a honeypot
5:16
that requires no extreme
5:18
honeypot coding skill. Remember when
5:20
we had Bill Cheswick out in
5:22
Boston for our last event? He
5:24
wrote the first honeypot,
5:27
and he was talking about that. They
5:29
found somebody in the network. And
5:31
so he created a little attractive
5:34
program that they would most likely run
5:36
into a trip wire, if you will,
5:38
for intruders on the network. And that's
5:40
now here we go fast forward many
5:42
decades later. That's what this is all about.
5:44
The Thinkst Canary looks
5:47
about like a portable USB
5:49
drive. It's not very big. It connects
5:51
to power, and then
5:53
to your network, and then that's it, and then
5:55
you let it sit. Actually, you will wanna
5:57
go into your canary console and configure it.
5:59
This one's configured to
6:01
look exactly like a Synology NAS.
6:03
It's got the login page. It's
6:05
got the proper MAC address. All of that is
6:07
completely indistinguishable from
6:09
a NAS, but it's not. It's
6:12
a honeypot. You can make it look like a
6:14
SCADA device. You can make it look like a Windows
6:16
Server, a Linux Server. You can
6:18
have all the
6:20
services turned on like a Christmas tree,
6:22
or you can have, you know, just some select
6:24
services. Just maybe a little port
6:27
one thirty-nine action opened up.
6:29
But the point is they don't look
6:31
like vulnerabilities or traps on
6:33
the network. They look like valuable information.
6:37
But the minute a bad guy touches
6:39
it, you will
6:41
know. And that's the key to the
6:43
Thinkst Canary. It's a canary
6:46
in your coal mine in effect.
6:48
The Thinkst Canary solves
6:51
this problem that we
6:53
all should be aware of. You know, we all
6:55
have perimeter defenses. You know, you build a
6:57
fort. You know, you keep everybody
6:59
out. But, unfortunately, nothing's
7:01
perfect. Bad guys get
7:03
in. And then once they're in, they
7:05
often have free rein. Right? They have
7:07
complete access to everything. because
7:09
we just say, oh, well, they couldn't possibly
7:12
get in. On average, this
7:14
is a terrifying statistic. On
7:16
average, it takes a hundred
7:18
and ninety one days. It's
7:21
more than six months for
7:23
a company to realize there's been a data breach.
7:26
Canary solves that problem. Attackers
7:30
are sneaky. You know? They
7:32
don't necessarily wanna signal their presence.
7:34
In fact, most of the time, they don't. They
7:36
wanna prowl around. They wanna find resources
7:38
they can exfiltrate, valuable
7:40
stuff. If they're planning a ransomware
7:42
attack, they're gonna look at where your backups go,
7:44
where everything goes before they trigger the
7:46
ransomware attack. They want maximum
7:48
damage. And now, as we know, the
7:51
latest is they also wanna blackmail
7:53
you afterwards with data they've exfiltrated from
7:55
your server. This
7:57
is the key, the Thinkst Canary. You
7:59
can also
7:59
use your canary, and you wouldn't have
8:02
just one. You'd probably have several spread
8:04
around. Big banks might have hundreds, you know; some
8:06
big operations have many, many, many canaries.
8:08
Because you want them in every nook and cranny. You
8:10
don't want it to stand out. You could put them in
8:12
Active Directory, so they're easy to find, which
8:14
is a nice feature. And
8:17
you can make them look like absolutely anything,
8:19
but you can also use the canaries to create
8:21
canary tokens. These are
8:23
files that
8:25
look like spreadsheets or PDFs
8:27
or documents that you also
8:29
can scatter around, you could create as many
8:31
as you want. And when they're opened, or
8:33
somebody tries to open them, they phone home
8:35
and the canary alerts you.
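The token mechanism Leo describes, a decoy file that phones home when opened, can be modeled in a few lines. This is a toy sketch of the general idea, not Thinkst's actual implementation; the beacon host and labels here are made up:

```python
import secrets
from typing import Optional

# Registry of issued tokens: token -> human-readable decoy label.
tokens = {}

def make_decoy(label: str) -> str:
    """Register a decoy and return the unique beacon URL to embed in it."""
    token = secrets.token_hex(8)
    tokens[token] = label
    # Hypothetical beacon host: any request for this URL means the decoy fired.
    return f"https://alerts.example.invalid/{token}"

def on_beacon_request(url: str) -> Optional[str]:
    """Called when the beacon host sees a request: which decoy fired, if any?"""
    return tokens.get(url.rsplit("/", 1)[-1])

url = make_decoy("finance-share/payroll.xlsx (decoy)")
assert on_beacon_request(url) == "finance-share/payroll.xlsx (decoy)"
assert on_beacon_request("https://alerts.example.invalid/unknown") is None
```

The point of the design is that the decoy itself contains no logic; all the alerting lives behind the unique URL, so any document format that fetches a remote resource on open can act as a trip wire.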
8:37
So this is a really good way of
8:39
kind of spreading these
8:41
trip wires around your network.
8:44
And then hackers fall into them,
8:46
and then you are alerted. It's designed to
8:48
be configured in minutes. It's so easy to do.
8:50
In fact, the only thing that kept me at it so long
8:52
is because I was trying all different things. You
8:54
know? I mean, it's so fun.
8:56
You can make it be anything. You won't have to think
8:58
about it again. If an alert
9:01
happens, Canary notifies you, and they do it in any
9:03
way you want. You won't be
9:05
inundated with false alarms. It's dead
9:07
silent until there's a bad
9:09
guy snooping around. You can
9:11
get an email, a text message, or all of the
9:13
above.
9:15
There's a Canary console; it'll show up there.
9:17
They support Slack. They also support
9:19
webhooks, which means you can attach it to all sorts
9:21
of stuff. There's syslog. If you use
9:23
syslog, that's a nice way to do it. They have a
9:25
full API, so you can write a little Python
9:27
script, if you want, that'll ping you. You can do
9:29
anything you want. Data
9:31
breaches typically happen
9:33
kind of through social
9:35
engineering, a backdoor. You
9:37
may not know what's happened. No
9:39
alarms may go off unless you have
9:41
the
9:42
canary.
9:43
You'll find canaries deployed all over the world
9:45
on all seven continents. It's one of the
9:47
best tools against data breaches. You
9:49
can read about all the love
9:52
for canaries on their site at
9:54
canary dot tools slash
9:56
love. But when the
9:58
time comes and you think, I wanna try these
10:00
out, go to canary dot tools
10:02
slash twit. So that way,
10:04
they know you heard it here. Right? And use
10:06
the offer code TWIT in the how did you hear
10:08
about us box to get
10:10
this ten percent off the
10:12
Canary for life. You know,
10:14
every year, right, for life. What's
10:16
the pricing? Well, I, you know, I like to be transparent
10:19
upfront. And so do the Canary folks;
10:21
that's what they're all about. Let's say you
10:23
wanted five of them. Five
10:25
canaries, you could put one in, you
10:27
know, every corner of your subnets
10:29
and every
10:31
VLAN. That'd be five hundred bucks for five of them
10:33
per year. You'd also
10:35
get hosted console. You
10:37
get the upgrades. You get
10:39
support, you get maintenance if somebody sits on a
10:41
canary or steps on it or, you know, it breaks
10:43
for any reason, they just immediately, no questions
10:45
asked, send you another one. Ten percent
10:48
off for life when you use TWIT in the how did you
10:50
hear about us box? And they know you're
10:52
gonna love it. I know you're gonna love it. But if
10:54
for any reason it doesn't suit, you got
10:56
two months, a two-
10:58
month money-back guarantee, full
11:00
refund. These guys are
11:02
confident. They know this is something awesome that
11:04
you're gonna really like, but don't be fooled if you hear
11:06
nothing from it. That's a
11:09
good thing. That's a good thing.
11:11
It's when you hear from it. That's when you wanna go,
11:13
uh-oh. That happened to us
11:15
once. It was I've told the story before,
11:17
boy, that that was like, wow.
11:19
canary dot tools slash twit.
11:21
Offer
11:21
code twit. Thank
11:23
you, canary. They're great sponsors. They love
11:25
this show. They say we
11:27
always wanna be on this show because
11:29
Steve's the best. They like to support your
11:31
mission. canary dot tools
11:33
slash twit. You support our mission
11:35
when you use that address. Do
11:37
we have a picture? I didn't even look. We
11:40
have a wonderful picture.
11:42
I will tee it up while
11:45
you're getting it ready. Now
11:46
this I'm
11:49
tempted to call the dumbest thing I've
11:51
ever seen, except that
11:53
we've got two previous
11:55
occupants for that slot.
11:57
One is the locked
11:59
gate standing alone out in the
12:01
middle of a meadow with
12:03
a, you know, a path running up to
12:05
it. And it's like, what is this
12:07
locked gate doing out in the middle of nowhere? Who's not
12:09
gonna walk around it? And sure enough, you know,
12:11
there's like a dirt-trod
12:13
path on either side. The other
12:15
dumbest thing was that generator
12:18
that had to be grounded. So
12:20
someone stuck this
12:22
rebar into a
12:24
pail of dirt and hooked the
12:26
ground wire to the rebar. It's like,
12:28
okay. I don't think that's
12:30
quite what they had in mind when
12:32
they said you need to ground this
12:34
generator. Okay. Here,
12:36
we've got a very
12:38
tall gate, which looks like
12:40
it's an electric, you know, gate.
12:42
It's a good one. I would say it's very nice.
12:44
Nice looking gate. Yeah. Got an intercom
12:46
on the side. so you, you know, buzz the person.
12:48
It looks like maybe three
12:51
different units are back there somewhere.
12:53
And you are not supposed to get
12:55
in or out, presumably.
12:57
Uh-uh. No. The problem
12:59
is that the genius
13:01
who designed
13:04
this gate used a
13:06
series of horizontal bars.
13:08
Wow. And so I gave this the caption,
13:11
can't get in? How
13:13
about use the built in ladder?
13:18
Because you I mean, it's like
13:20
designed for scaling the
13:22
gate. It's you
13:24
You just -- I
13:26
can't get in. What should I do?
13:28
Oh, look, it's a ladder. How
13:30
handy. A
13:34
convenient piece. If
13:36
if if all they had to do is make them
13:38
vertical, and then you'd just be
13:40
stuck. You'd be looking at, you know,
13:42
like, you know, prison
13:44
bars. But no, they built
13:46
a ladder from the gate, and so it's
13:48
quite easy. So this goes down. This
13:50
is maybe the third dumbest thing that we've
13:52
seen on the podcast. In the list.
13:54
Yeah. We are
13:56
acquiring them over time.
13:58
Okay. So
14:00
last Tuesday, which was the first
14:02
of November. Dropbox posted
14:05
an account of their own experience
14:09
titled how we handled
14:11
a recent phishing incident
14:13
that targeted Dropbox. And
14:16
the short version is I think they handled it
14:18
pretty well. But there
14:20
are some lessons to be had surrounding the
14:22
event. Their announcement
14:24
began with sort of the, you know, the
14:26
required do not worry
14:28
disclaimer. They said, we
14:30
were recently the target of
14:32
a phishing campaign that successfully
14:35
accessed some of the code
14:37
we store in
14:39
GitHub. No one's
14:41
content, passwords, or payment information,
14:43
was accessed, and the issue
14:45
was quickly resolved. Our
14:47
core apps and infrastructure were
14:49
also unaffected as
14:51
access to this code is even
14:53
more limited and strictly controlled.
14:55
we believe the risk to customers is
14:57
minimal. Because we take our commitment
14:59
to security privacy and transparency
15:01
seriously, we've notified
15:03
those affected and are sharing
15:05
more here. Okay. But then I'm I've
15:07
skipped over a bunch of background
15:09
And the part I wanted to share with our listeners
15:12
was this. They said,
15:14
at Dropbox, we use
15:16
GitHub to host our
15:18
public repositories as well
15:20
as some of our private repositories. We
15:23
also used Circle CI
15:26
for select internal
15:28
deployments. CI is
15:30
some automation technology,
15:33
CI standing for continuous
15:36
integration. So they said
15:38
in early October, multiple
15:40
Dropboxers received
15:42
phishing emails, impersonating
15:45
Circle CI with the
15:47
intent of targeting our GitHub
15:49
accounts. A person
15:51
could use their GitHub credentials, they
15:53
explained, to log in to
15:55
Circle CI. They said,
15:57
while our systems automatically quarantine
15:59
some of these emails,
16:02
you know, phishing emails.
16:04
Right? Others landed
16:06
in Dropboxers' inboxes.
16:09
These legitimate looking emails
16:11
directed employees to
16:14
visit a fake Circle CI
16:16
login page, enter their
16:18
GitHub username and password
16:20
and then use their hardware authentication key
16:24
to pass a one time password
16:26
to the malicious site.
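What makes that attack flow work is that a one-time password proves only that whoever types it holds the shared secret right now; nothing binds the code to the site it's typed into, so a phishing page can relay it to the real site in real time. Here's a minimal sketch of RFC 6238 TOTP illustrating the point; the shared secret is hypothetical:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second counter."""
    counter = int(at) // step
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = (int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

secret = b"hypothetical-shared-secret"
now = time.time()

# The victim types the current code into the phishing page...
code_typed_on_fake_page = totp(secret, now)
# ...and the attacker immediately replays it to the real site, where it
# verifies, because the code carries no information about the site.
assert totp(secret, now) == code_typed_on_fake_page
```

This is exactly why the discussion that follows distinguishes one-time-password authenticators from origin-bound schemes like FIDO2/WebAuthn, where the browser signs the real site's origin into the response and a relay from a look-alike domain fails.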
16:28
And as we know, all of
16:30
this bypasses -- you know, I
16:32
mean, this approach
16:35
will get around
16:37
the use of one-time password
16:39
authenticators. So they said this
16:41
eventually succeeded, giving
16:43
the threat actor access to one
16:45
of our GitHub organizations where
16:47
they proceeded to copy
16:50
one hundred and thirty of
16:52
our code repositories. Whoops.
16:55
They said these repositories included
16:58
our own copies of third party
17:00
libraries slightly modified
17:02
for use by Dropbox, internal
17:05
prototypes, and some tools and
17:07
configuration files used by the security
17:09
team. Importantly, they
17:11
did not include code
17:13
for our core apps or
17:15
infrastructure. Access to those
17:17
repositories is even more limited
17:19
and strictly controlled. And
17:21
finally, on the same day, they said we
17:23
were informed of the
17:25
suspicious activity. They don't
17:27
indicate how, but this
17:29
is why, you know, you
17:31
need to do network monitoring like, Leo,
17:33
you were just talking about with that previous sponsor.
17:35
They said the threat
17:38
actor's access to GitHub was
17:40
disabled. Our security teams
17:42
took immediate action to coordinate the
17:44
rotation of all exposed
17:46
developer credentials and
17:48
determine what customer data if
17:50
any was accessed or
17:52
stolen. We also reviewed our
17:54
logs and found no evidence of
17:56
successful abuse. To be sure,
17:58
we hired outside forensics
17:59
experts to verify our findings
18:02
and reported this event to the
18:04
appropriate regulators and
18:06
law enforcement. Okay.
18:08
So there are three points that
18:10
I wanted to highlight from this report.
18:14
The first is that we have yet
18:16
another instance of
18:18
a major security-
18:21
savvy, network-
18:23
savvy organization, you know,
18:25
Dropbox. Right? I mean, they know their way
18:27
around or they wouldn't still be
18:29
around, being successfully
18:32
attacked and breached even
18:34
in the face of knowing
18:37
that this is going on. Their
18:39
email filters worked to
18:41
prevent their employees from being
18:44
subjected to this error-prone
18:46
event, mostly. But
18:48
those filters also failed
18:50
just enough to allow
18:52
bogus phishing attacks to
18:54
reach their employees. And notice that
18:57
these were code developing employees,
19:00
you know, not for example,
19:02
less sophisticated, clerical, or
19:05
office workers who you might
19:07
have in a huge organization that
19:09
wouldn't be expected to be up
19:11
to speed on computers. You know, these
19:13
are people who, like, log in to
19:15
Circle CI and GitHub
19:18
and they were fooled. The
19:20
point is phishing.
19:22
And we'll be talking about
19:24
that several more times before the end of
19:26
today's podcast. The second point I
19:29
wanna make is the introduction
19:31
of a new concept, which
19:33
I would term -- I
19:35
would term -- the phishing
19:37
email attack surface. We're all familiar
19:39
with the traditional concept of an attack
19:42
surface. Right? The idea being
19:44
that the more potential points
19:46
of entry that exist,
19:49
the greater the threat that any
19:51
one of those might be
19:54
inadvertently left open or
19:56
somehow breachable. So
19:58
this new concept that I would
20:00
call the phishing email
20:02
attack surface uses this
20:04
recent Dropbox experience as a
20:06
perfect example, noticing that
20:08
the more complex an
20:11
organization's setup is,
20:13
which is to say The
20:15
greater number of ancillary services
20:18
an organization employs,
20:22
the greater is their phishing
20:24
email attack surface. There are
20:26
just more things that
20:28
have log ons and authentication requirements,
20:32
and, again, more points of entry.
20:34
The modern trend is
20:37
products as
20:40
managed services, where companies
20:42
are increasingly contracting
20:44
out for an increasing
20:47
number of services rather
20:49
than rolling their own in house.
20:51
You know, the theory of this
20:53
is sound. Why reinvent
20:55
the same wheel over and
20:58
over, especially when there's little
21:00
additional value to be added
21:02
by doing so. Just
21:05
contract for this or
21:07
that service while focusing
21:09
upon the company's core mission,
21:11
rather than wasting time on developing
21:13
and running all of those other
21:16
things that are common to all companies.
21:18
Sounds great. But
21:20
recall, all of the
21:22
downstream damage that
21:24
the breach at SolarWinds created.
21:28
SolarWinds was a provider of
21:30
exactly this sort of
21:32
outsourced services model.
21:34
And also remember all of those dental offices
21:36
that were being breached and
21:38
the hospital services that were
21:40
hit by crippling ransomware
21:43
when their MSP, their managed
21:45
service provider, was breached.
21:47
The danger represented by
21:50
managed service providers is exactly what
21:52
I'm referring to here. So
21:54
I wanted to observe that we
21:56
as an industry still have
21:59
a serious problem with
22:01
remote network services authentication.
22:04
The very fact
22:06
that phishing emails even
22:09
exist as a security issue
22:12
demonstrates that this serious problem has
22:14
not yet been solved. So
22:16
the more remote network
22:19
MSP services an
22:21
organization maintains, the
22:23
greater their phishing email attack
22:26
surface will be. The
22:27
third and final point
22:29
I wanted to make,
22:31
was where Dropbox wrote.
22:34
They said on the same day we were
22:36
informed of the suspicious activity the
22:38
threat actor's access to GitHub was
22:41
disabled. Our security teams took immediate
22:43
action to coordinate the rotation of
22:45
all exposed developer credentials
22:47
and determine what customer
22:49
data, if any, was accessed or stolen. We also
22:52
reviewed our logs and found no evidence
22:54
of successful abuse. To
22:56
that, I say bravo.
22:59
When we were all growing
23:01
up, our elementary schools
23:03
conducted periodic fire
23:06
drills. Without warning,
23:08
alarms would sound throughout the
23:10
school and the entire school
23:12
class by class would file
23:14
out in an organized manner to
23:16
previously designated locations.
23:19
While I was in school, those alarms
23:21
never went off except
23:23
during drills. But if someday they
23:25
were to, the entire school
23:27
was prepared. My point
23:30
is every organization must
23:32
now be prepared for
23:35
the possibility of a network breach.
23:38
So breach drills should
23:41
become a thing that all
23:44
responsible organizations conduct
23:46
just as fire drills
23:48
once were
23:49
when we were in elementary school.
23:52
Just as when a school might be on
23:55
fire, after a network intrusion, we've
23:58
seen the stats showing
24:00
that time really
24:02
can be of the
24:05
essence. So planning for
24:07
a breach including having
24:09
some drills, should be something
24:11
that responsible organizations do.
24:14
Dropbox's immediate response
24:17
showed that they were ready and prepared
24:19
for that eventuality.
24:21
And again, I think that
24:23
is of crucial importance. I think
24:26
it's also important to point out
24:28
that that's
24:31
probably why in many cases it's
24:33
better to use an MSP than do it on
24:35
your own. I mean, if if we were to
24:37
count all the flaws that people introduced
24:39
themselves by trying to do
24:41
it themselves, that's gonna far
24:43
outweigh the number of exploits
24:45
because an MSP was taken advantage
24:48
of. Right? I
24:51
mean, I I think that's a
24:53
useful
24:55
consideration. The
24:57
problem with an MSP is the
24:59
single point of failure. So,
25:01
you know, a breach at SolarWinds gets
25:03
everybody devastated. Yes.
25:05
So many clients. When I think about
25:08
BitWarden, for instance, and some people, again, with
25:10
BitWarden, one of our sponsors and password
25:12
manager, host their own. And they often
25:14
say, well, why do you let BitWarden host? And I
25:16
always say, because -- yeah, I could
25:18
host it myself -- I think they're more
25:20
likely to keep it locked down than I am.
25:22
You know? and and
25:24
and backed up, you know -- Yeah. -- you
25:26
don't risk, you know, losing the
25:28
the the cloud presence. Mhmm. I mean, it
25:31
certainly is a consideration. I guess the thing
25:33
to do would be But you
25:35
gotta trust them. As
25:38
always, yeah, find find some
25:40
balance point. You know, for example,
25:42
you know, don't
26:44
give no consideration
25:46
to the security of
25:48
the services that you're hiring,
25:50
at least, you know,
25:52
have
25:53
them run the gauntlet and
25:56
demonstrate that, you know, it
25:58
makes sense for you to
25:59
put some portion
26:01
of your security in their
26:03
hands because you are, you
26:05
know, you are when you're
26:07
outsourcing a service, you're
26:09
outsourcing the security of that
26:12
service and that services access
26:14
back into your organization. And
26:16
that's what bit the hospitals and bit all those --
26:19
Yeah. -- dental practices when
26:21
when their, you know, their common
26:23
MSP got hacked. So
26:25
I just sort of wanted to put it on people's
26:28
radar to consider that, you
26:30
know, if,
26:32
yeah,
26:32
Dropbox
26:34
hadn't been using
26:36
Circle CI, well, they
26:38
wouldn't have been prone to the
26:41
Circle CI phishing emails.
26:43
And so that couldn't have
26:45
happened. Maybe something else would have
26:47
happened. They would have gotten in some other
26:49
way, but that's the way it happened.
26:51
So it's very much like, you
26:53
know, having exposed ports. Each of those
26:55
things represents some exposure
26:58
and that
26:59
means, you know, an expanded attack
27:02
surface. Two weeks
27:06
ago, as we talked
27:08
about last week when it was
27:10
one week ago, now it's two
27:12
weeks ago, the OpenSSL
27:14
project maintainers told the entire
27:16
world that one week from
27:18
then, a critical
27:20
vulnerability would be patched and
27:23
necessarily revealed to the world.
27:25
So last week, the
27:27
good news was the severity was downgraded from
27:30
critical to high. Since there's
27:32
some possibility that
27:34
one of the two problems could be
27:37
weaponized, the advice remains
27:39
that everyone using any version
27:41
three point x point
27:44
x of OpenSSL
27:46
where those x's aren't zero
27:48
and seven -- which is to say, if
27:50
you're using anything
27:53
before three point zero point
27:55
seven, which contains the two
27:57
fixes, that should
27:59
be looked at. So, okay, here's what
28:01
we know now. As I
28:04
suspected last week, you
28:06
know, we would find out what was
28:08
going on. Here's what the project maintainers wrote
28:11
about the most serious of the two
28:13
problems. It's
28:15
got a CVE, twenty twenty-two
28:18
thirty-six-oh-two, now rated
28:20
at high severity. They said
28:22
a buffer overrun, which
28:24
is, of course, where most of
28:26
these problems begin, a buffer
28:29
overrun can be
28:31
triggered in the x
28:33
dot 509 certificate
28:35
verification, specifically
28:38
in name constraint
28:40
checking. Note that
28:43
this occurs after certificate
28:45
chain signature verification and
28:48
requires either a
28:50
CA to have signed the
28:52
malicious certificate or for the
28:54
application to continue certificate verification
28:57
despite failure to construct
28:59
a path to a trusted
29:01
issuer. That meaning if it
29:03
hadn't been signed. An
29:05
attacker can craft a
29:07
malicious email address
29:10
to overflow four
29:13
attacker controlled bytes
29:15
on the stack. This
29:18
buffer overflow could result in a
29:20
crash causing a denial
29:22
of service, meaning, you know, it's
29:24
your services denied because the thing
29:27
crashed, or potentially remote
29:29
code execution. Many
29:32
platforms implement stack
29:34
overflow protections, which would
29:37
mitigate against the risk of remote code
29:39
execution. The risk may be further
29:41
mitigated based on stack
29:43
layout for any given platform
29:46
and compiler. Preannouncements
29:48
of the CVE described
29:51
this issue as critical. Further
29:54
analysis based on some of the mitigating
29:56
factors described above
29:58
have led this
30:00
to be downgraded to high.
30:03
Users are still encouraged to upgrade
30:05
to a new version as soon as possible.
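As a practical matter, the affected range described earlier, any 3.x release before 3.0.7, comes down to a simple version comparison. A rough sketch (not official OpenSSL tooling):

```python
def is_affected(version: str) -> bool:
    """True if this OpenSSL version falls in the vulnerable 3.0.0 <= v < 3.0.7 range."""
    major, minor, patch = (int(part) for part in version.split("."))
    # Tuple comparison gives us lexicographic major/minor/patch ordering.
    return (3, 0, 0) <= (major, minor, patch) < (3, 0, 7)

assert is_affected("3.0.6")
assert not is_affected("3.0.7")   # contains the two fixes
assert not is_affected("1.1.1")   # the 1.1.x branch was never affected
```

Note this naive parser assumes a plain `major.minor.patch` string; real OpenSSL version strings can carry suffixes, so a production check would need more care.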
30:08
A TLS client I'm
30:11
sorry, in a TLS
30:13
client, this can be
30:15
triggered by connecting to a
30:17
malicious server. In a TLS
30:19
server, this can be
30:21
triggered if the server requests
30:23
client authentication and a
30:26
malicious client connects.
30:27
Okay. So the
30:30
second of the two problems, there were
30:32
two that were related. The second one
30:34
is quite similar, but it
30:36
only allows the attacker to
30:39
overflow the stack with
30:41
an arbitrary number of
30:44
dot, you know, period characters.
30:46
I think that's ASCII forty-six.
30:48
So the attacker's inability
30:51
to overflow the stack with their own
30:53
provided data --
30:55
the attacker's
30:58
inability to overflow the
31:00
stack with anything but dot characters --
31:02
limits
31:04
the practical danger to
31:06
a denial of service
31:08
due to a crash
31:11
in OpenSSL.
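The difference between the two flaws can be modeled with a toy stack frame: in the CVE-2022-3602 pattern the attacker writes four bytes of their choosing past the end of the buffer, while in the CVE-2022-3786 pattern the overflow consists only of '.' bytes. This is a simplified illustration of the bug class, not OpenSSL's actual code:

```python
# A simplified model of a stack frame: a fixed buffer followed by 4 bytes
# of saved state (think: a saved return address living just past the buffer).
BUF_SIZE = 16

def make_frame() -> bytearray:
    return bytearray(BUF_SIZE) + bytearray(b"RETN")  # saved state after the buffer

def unchecked_copy(frame: bytearray, data: bytes) -> None:
    # The bug pattern: copying without checking len(data) against BUF_SIZE.
    frame[:len(data)] = data

# CVE-2022-3602 pattern: four attacker-chosen bytes land past the buffer,
# so the saved state becomes attacker-controlled.
frame = make_frame()
unchecked_copy(frame, b"A" * BUF_SIZE + b"\xde\xad\xbe\xef")
assert bytes(frame[BUF_SIZE:]) == b"\xde\xad\xbe\xef"

# CVE-2022-3786 pattern: the overflow still clobbers the saved state, but
# only with '.' bytes, so the attacker can crash the process, not steer it.
frame = make_frame()
unchecked_copy(frame, b"A" * BUF_SIZE + b"." * 4)
assert bytes(frame[BUF_SIZE:]) == b"...."
```

That distinction, controlled bytes versus fixed filler, is exactly why the first flaw was initially rated critical and the second was only ever a denial-of-service concern.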
31:13
But
31:16
the reason that the
31:18
more serious of the two
31:20
was initially felt to be
31:22
critical is that the
31:24
stack overflow can
31:27
be of attacker-
31:30
provided bytes -- four attacker-
31:32
provided bytes -- which could be
31:35
a jump or, you know,
31:38
like, a, you
31:40
know, some just
31:42
enough code, for example,
31:45
to elevate this
31:47
task if it weren't already
31:49
or to bypass
31:52
security checks. You know,
31:54
you know, whatever. So what
31:56
remains to be seen is
32:00
whether anyone ever
32:03
arranges to exploit this
32:06
attack. There's no doubt
32:09
that many vulnerable instances
32:12
of OpenSSL version
32:15
three previous
32:17
to zero seven will remain
32:20
out in the world for the foreseeable
32:22
future. They will have already
32:24
been built into appliances
32:26
that will never be updated.
32:28
It's a relief that
32:31
the trouble cannot be induced
32:33
in an OpenSSL-based
32:36
TLS server without the
32:38
server first requesting a
32:40
certificate from a client. That's
32:42
unusual enough so
32:44
as not to be a big issue. But
32:46
if an OpenSSL based
32:49
TLS client were to
32:51
be induced into visiting a
32:53
malicious server, after
32:55
this flaw were weaponized,
32:57
that could result in the execution
32:59
of code on the visiting
33:02
client, thus compromising somebody who
33:04
connects to a malicious
33:07
server. And that is the
33:09
potential of it. It could be
33:11
sufficient
33:13
inducement to
33:15
cause major
33:18
exploit-creating
33:21
players to
33:23
investigate its weaponization. So
33:26
we'll see if a year or two from
33:28
now, we're not talking about
33:30
whoops, remember that
33:32
OpenSSL vulnerability that
33:34
was downgraded to high, but, you know,
33:36
that should have been fixed wherever
33:39
possible? Well, you
33:41
know, we'll see if that ends up happening.
33:43
It could. Okay.
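For the curious, the class of bug Steve is describing can be sketched in miniature. This is a toy model in Python, not OpenSSL's actual punycode code: the unpatched pattern writes into a fixed-size buffer without an up-front length check (in the real CVE-2022-3602 the four escaping bytes were attacker-chosen; in CVE-2022-3786 they could only be dots), while the patched pattern rejects oversized input before writing a byte.

```python
# Toy illustration only -- NOT OpenSSL's real code. It models the difference
# between writing first and checking length first.

BUF_SIZE = 32  # stand-in for the fixed-size on-stack buffer


def decode_unpatched(label: bytes) -> bytes:
    """Flawed pattern: writes output without first checking that it fits."""
    buf = bytearray(BUF_SIZE)
    overflowed = bytearray()
    for i, b in enumerate(label):
        if i < BUF_SIZE:
            buf[i] = b
        else:
            overflowed.append(b)  # in C, these bytes would land PAST the buffer
    return bytes(overflowed)      # the bytes that escaped the buffer


def decode_patched(label: bytes) -> bytes:
    """The fix: reject anything that cannot fit before writing a single byte."""
    if len(label) > BUF_SIZE:
        raise ValueError("label too long for buffer")
    buf = bytearray(BUF_SIZE)
    buf[: len(label)] = label
    return b""  # nothing escapes the buffer
```

A 36-byte input of dots against the unpatched version leaves four bytes past the end of the buffer, which is exactly the denial-of-service-versus-code-execution distinction discussed above.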
33:47
We're
33:48
gonna begin
33:49
hearing of
33:51
more instances of these sorts
33:53
of reactions from the US
33:56
federal government. And over
33:58
time, it will become widely
34:00
known that companies cannot
34:02
simply ignore their security
34:05
responsibilities with impunity.
34:08
On Halloween, the
34:10
FTC's business blog post
34:13
was titled, multiple
34:16
data breaches suggest educational technology
34:20
company Chegg,
34:22
C H E G G,
34:24
didn't
34:26
do its homework, alleges the
34:28
FTC. Now, we'll forgive the FTC
34:30
for being cute about an educational
34:32
company not doing its homework. But
34:35
the points made in their blog post were instructive. The
34:38
FTC wrote
34:40
Chegg Inc.
34:42
sells educational products and services directly
34:45
to high school and
34:48
college students. That
34:50
includes renting textbooks, guiding
34:52
customers in their search for
34:54
scholarships and offering online tutoring.
34:58
But according to the FTC, the ed tech
35:00
company's lax security practices
35:04
resulted in four
35:06
separate data breaches in a
35:08
span of just a few years leading
35:12
to the
35:14
misappropriation of personal information
35:17
about approximately forty
35:20
million consumers.
35:22
The FTC complaint and some notable provisions
35:24
in the proposed settlement suggest
35:27
that it's time for a
35:29
data security refresher course
35:32
unlike the approach at
35:36
Chegg. Are
35:38
there lessons your company could learn,
35:42
the FTC posits,
35:44
or wonders, from
35:47
where the FTC says
35:50
Chegg failed to make the grade?
35:52
Okay. Okay. In the course of
35:54
its business, so here's what happened,
35:58
California based
36:00
Chegg collected, the
36:02
FTC said, a treasure trove of
36:04
personal information about many
36:06
of its customers, including their
36:09
religious affiliation, heritage,
36:12
date of birth, sexual
36:16
orientation, disabilities, and parents
36:18
income. Why do they have
36:20
my sexual orientation in the first
36:22
place? Exactly. What the
36:24
hell is that? Exactly. They're
36:27
doing textbooks. Yes.
36:30
I
36:30
know. Even the
36:31
Chegg employee
36:34
in charge of cybersecurity described the data
36:36
gathered as part of its
36:38
scholarship
36:40
search service as, quote, very
36:44
sensitive. Yeah. So you might there
36:46
might be a scholarship
36:48
for a
36:50
queer scholar, something like that. So you'd have to give them that information, I
36:52
guess, to find those, in order to
36:54
qualify. Right. Right. It is. It's
36:56
very sensitive. Yes.
36:58
Yeah. And
37:00
four breaches. I mean, it's very
37:02
sensitive and they're not treating it
37:04
responsibly. But wait but wait till you hear Leo.
37:06
It's unbelievable. A key
37:08
component of Chegg's
37:10
information technology infrastructure was
37:12
Simple Storage Service, S3
37:14
-- Oh, the cloud service. Uh-huh.
37:17
S3 buckets can be
37:20
secure, but they're a
37:22
cloud service offered by
37:24
Amazon Web Services,
37:26
AWS, that Chegg used to store a substantial
37:28
amount of customer and
37:30
employee data. The full
37:32
complaint provides all
37:34
the details but
37:36
the FTC cites a number of examples of what Chegg
37:38
did and didn't do that
37:40
were indicative of the company's
37:43
lax security practices. For
37:46
example, the FTC alleges
37:48
that Chegg allowed employees
37:51
and third party contractors
37:54
to access the S3 databases with
37:57
a single access
37:59
key that provided
38:02
full administrative privileges over all
38:06
information. Chegg
38:10
did not require multi factor
38:12
authentication for account access to
38:14
the S3 databases.
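To make the "single access key with full administrative privileges" failure concrete, here is a sketch of the alternative, expressed as an AWS IAM policy document built in Python. The bucket and prefix names are hypothetical, invented for illustration: each contractor's key would be scoped to read-only access on just the objects they need.

```python
import json

# A least-privilege IAM policy document (hypothetical bucket/prefix names),
# the opposite of one shared key that can do anything to everything.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadOnlyOneProjectPrefix",
            "Effect": "Allow",
            # Read and list only -- no write, delete, or admin actions
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-student-data",
                "arn:aws:s3:::example-student-data/project-x/*",
            ],
        }
    ],
}

policy_json = json.dumps(least_privilege_policy, indent=2)
```

The point of the shape, not the specific names: a leaked key governed by a policy like this exposes one read-only slice of one bucket, instead of every record the company holds.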
38:17
Rather than encrypting the data
38:20
-- Oh. -- Chegg stored users' and
38:22
employees' personal information in
38:24
plain text. Until
38:26
at least April of twenty
38:28
eighteen, Chegg protected, they
38:30
had that in air quotes, passwords
38:33
with outdated cryptographic hash functions.
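What "outdated cryptographic hash functions" versus doing it right looks like can be sketched with nothing but Python's standard library. This is a minimal illustration, not Chegg's actual code: an unsalted fast hash that is trivially cracked in bulk, next to a salted, deliberately slow key-derivation function.

```python
import hashlib
import hmac
import os


def hash_password_outdated(password: str) -> str:
    # The "outdated" pattern: unsalted, fast MD5. Identical passwords produce
    # identical hashes for every user, so a leaked table cracks in bulk.
    return hashlib.md5(password.encode()).hexdigest()


def hash_password(password: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    # Salted, deliberately slow PBKDF2-HMAC-SHA256 with a random per-user salt.
    salt = salt or os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

With per-user salts, two users sharing the password "hunter2" end up with different stored digests, which is exactly what defeats precomputed rainbow tables.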
38:36
Until at least April
38:38
twenty twenty, Chegg failed to
38:40
provide adequate data security training for
38:43
employees and contractors. Chegg
38:46
didn't have processes in
38:48
place for inventorying and
38:50
deleting customers' and employees'
38:52
personal information once there was no
38:54
longer a business need to
38:56
maintain it. In other words,
38:58
it just kept accruing
39:00
the data, ad infinitum. Chegg failed to
39:02
monitor its networks adequately
39:04
for unauthorized attempts to
39:08
sneak in and illegally transfer sensitive data out of its
39:10
systems. In other words,
39:12
across the board, your basic
39:16
do the minimum possible
39:19
laziness. The report
39:21
continues. Should it come
39:23
as a surprise that the
39:26
complaint recounts four separate
39:28
episodes that led to the
39:30
illegal exposure of
39:32
personal information? Incident one
39:34
stemmed from a Chegg employee
39:37
falling for
39:39
a phishing attack that allowed a data
39:41
thief access to the employee's direct deposit
39:44
payroll information.
39:48
Incident two, involved a former contractor who used
39:50
Chegg's AWS credential,
39:52
the one credential, to
39:55
grab sensitive material from
39:57
one of the company's S3 databases,
39:59
information that ultimately found its
40:02
way onto a
40:04
public website. Then came
40:06
incident three, a phishing
40:08
attack that took in a
40:10
senior Chegg executive that
40:12
allowed the intruder to bypass the company's multi factor
40:14
email authentication system. Once
40:16
in the executive's email
40:18
box, the intruder had access to personal
40:20
information about
40:22
consumers, including financial and medical information. And
40:25
incident number four, a
40:27
senior employee responsible for
40:30
payroll fell for another phishing attack thereby giving
40:32
the intruder access to the company's payroll
40:34
system. The intruder left with
40:38
the w two information of approximately seven hundred
40:40
current and former employees, including
40:42
their birth dates and Social
40:45
Security numbers. God. In
40:48
each of the
40:49
four incidents cited in the complaint,
40:51
the FTC alleges that Chegg had
40:53
failed to take simple
40:56
precautionary steps that would have
40:58
likely helped prevent or
41:00
detect the threat to consumer
41:02
and employee data.
41:04
For example, requiring employees to take data security
41:06
training on the telltale signs
41:08
of a phishing attempt. Because
41:10
they fell for it
41:12
four times. and nobody
41:14
ever learned any lessons. No
41:16
actions were taken as a consequence of
41:18
those. To settle the case,
41:20
and boy, had they gotten
41:22
off easy, Chegg has agreed to a comprehensive
41:24
restructuring of its data protection
41:26
practices as part of the
41:28
proposed order, Chegg must
41:30
follow a schedule that sets out the
41:32
personal information it collects, why
41:34
it collects the information, and when it
41:36
will delete the data. In
41:38
addition, Chegg must give customers access
41:40
to the information collected about
41:42
them and honor requests to
41:44
delete the data, provide
41:46
customers and employees with two factor
41:48
authentication or other authentication method
41:50
to help protect their accounts. So
41:53
it's gonna get
41:55
better. But
41:56
this is just, you
41:57
know, this is just a
42:00
toothpick in a
42:02
haystack. Right? In
42:05
this largely, still unregulated industry,
42:08
we're operating in a
42:11
wild west mode with non
42:14
existent oversight until
42:16
failures are egregious enough
42:19
to bring governmental scrutiny and
42:21
how many of these incidents were caused by
42:24
employees falling
42:26
for phishing schemes? All four of
42:28
them, even an exec, did.
42:30
Yet there was no training provided. The reason is none
42:33
of those breaches directly affected
42:36
Chegg's bottom
42:38
line. Oh,
42:40
forty million of their customers
42:42
had highly sensitive
42:45
data revealed? Well, we're
42:47
very sorry about that. Well,
42:50
okay, right. Well, I'm not one
42:52
who believes in government overreach
42:54
and having Uncle Sam rummaging around
42:56
in our private corporate businesses,
42:58
but self regulation isn't
43:00
gonna work here. One solution would
43:02
be to only provide tools
43:05
that provide security. Then at least security would not need to be added
43:07
on as an optional afterthought. But as
43:10
we all well know, we're not
43:12
there yet. Where
43:14
I am, Leo, is in need of a break. Oh,
43:16
wow. That I can help you with. I can't help you
43:19
with any of the other stuff, but
43:21
we're gonna talk about the
43:23
amount of ransomware payments, which was made
43:26
public last week. Oh, yeah. Yeah. And it's a
43:28
big number. I bet it is. Well, let me talk
43:30
before we do that. Let me talk about
43:32
our sponsor. Drata. Drata, I think
43:34
many of you who listen to the
43:36
show run, you know,
43:38
organizations run IT, where
43:40
you have
43:42
compliance requirements. Right? We were just talking about the FTC
43:44
stepping in with Chegg. And,
43:46
you know what? I'm proud of the FTC for
43:49
getting that enforcement. and
43:51
doing it right. And, of course, there's a lot
43:54
of there are a lot of compliance
43:56
issues that every company has these
43:58
days because you've got
43:59
data. Right? If your organization's finding it difficult to
44:02
achieve continuous compliance as
44:04
it grows and scales,
44:06
and manual
44:08
evidence collection is slowing your team down,
44:10
I can tell you
44:12
about Drata, G2's highest rated
44:15
cloud compliance software. They
44:18
streamline your compliance. If you're
44:20
doing it by hand, this blew me away that
44:22
so many companies actually do this
44:24
by hand. If you
44:26
need SOC 2, ISO
44:28
27001, PCI
44:30
DSS, GDPR, HIPAA, or
44:32
you have other compliance frameworks, Drata
44:35
gives you twenty four hour
44:38
continuous control monitoring
44:40
so you could focus on
44:43
scaling securely. And Drata can collect the information you need
44:45
to prove compliance with a suite
44:47
of seventy five plus integrations.
44:49
Drata easily integrates with
44:51
your tech stack through applications like
44:54
AWS, Azure, GitHub,
44:58
Okta, and Cloudflare. Countless
45:00
security professionals from companies including
45:02
Lemonade and Notion.
45:04
Love both of those companies. Bamboo HR,
45:06
they have shared how critical it has been
45:08
to have Drata as a trusted partner in the compliance process.
45:11
Drata's deep native integrations
45:13
provide instant visibility into
45:16
a security program and continuous monitoring to ensure
45:18
compliance is always met.
45:21
It's not just point-in-time.
45:23
It's always met. Drata allows
45:26
companies to see all their controls, to
45:28
easily map them to compliance frameworks so
45:30
you know immediately if you've got, well,
45:32
for instance, framework overlap or gaps,
45:35
Right? Companies can start building a
45:37
solid security posture from day one with
45:39
Drata, achieve and maintain compliance as your
45:41
business scales, expand your
45:43
security assurance efforts using the Drata
45:46
platform. This is more and more
45:48
important all the time. And if you listen to
45:50
the show, you know that. Drata's
45:52
automated dynamic policy template
45:54
support companies new to
45:56
compliance and help alleviate
45:58
hours of
45:59
manual labor.
46:00
And the problem
46:01
with manual labor, besides it
46:04
being time consuming and laborious,
46:06
is mistakes. Not
46:08
with Drata. And their integrated security awareness training
46:10
program and automated reminders
46:12
are great for employee
46:14
onboarding. They're the only player in the industry
46:16
to build
46:18
on a private, and this is just what you were talking about earlier, a
46:20
private database architecture from day
46:22
one. That means your data can
46:25
never be accessed by anyone outside your
46:28
organization. Right?
46:30
Outside your organization, Drata
46:33
can't even do it. All customers receive a
46:35
team of compliance experts, so you're never alone. You
46:37
get a designated customer success
46:40
manager. They also really give
46:42
you some assurance. They have
46:44
a team of former
46:46
auditors, people who have conducted five hundred plus
46:48
audits available for support
46:50
and counsel. So you kinda know ahead of time what you need
46:53
to do, and they can check what you're up to, make sure
46:55
it's compliant. Your success is their
46:57
success. Drata knows that with a
46:59
consistent meeting cadence, you're never
47:02
in the dark. They'll keep you on track and
47:04
ensure there are no surprises, no barriers,
47:06
and you're gonna love the pre audit
47:08
calls which ensure you're set up for success when those
47:10
audits begin. Drada is actually was kind
47:12
of created by and invented by and
47:15
supported by backed by
47:18
a syndicate of CSO angel investors who know intimately
47:21
what needs to be done. SVCI, you might
47:23
have heard of them.
47:26
These CISOs from some of the world's most influential companies said,
47:28
yeah, this is something we need.
47:30
This is Drata. Say goodbye to manual evidence
47:32
collection. Say hello to automated
47:36
compliance, please go to drata dot
47:38
com slash twit, D R A T A
47:40
dot com, and add the slash twit if
47:42
you would so they know you saw it here.
47:45
That's really important to Steve and me.
47:47
drata dot com slash
47:50
twit. Drata, bringing automation
47:52
to compliance at Drata speed. It's
47:55
everything you talk about on the show, Steve,
47:57
is really a cautionary tale. And I
47:59
just imagine
48:02
these CSOs and CIOs and IT folks listening going,
48:04
oh, boy.
48:06
Oh, boy. Did we secure our S3
48:08
buckets today? You know? This
48:11
is -- Well, we talked a
48:14
couple weeks ago, there was some survey, I
48:16
think it was IBM who did the survey,
48:19
of the stress that CISOs are under. I mean,
48:22
it's just, it's
48:24
tough. It's a tough job, yes.
48:26
Yeah. But a good job,
48:28
an important job. Thank you for doing
48:30
it. It needs doing. And we're glad
48:32
you listen to Security Now, because that gives me
48:34
some confidence that you've paid
48:36
attention,
48:37
which is good. Okay.
48:41
FinCEN, which
48:43
is the US Financial Crimes
48:46
Enforcement Network unit, which is
48:48
part of the US Treasury Department, published
48:51
a ten page report detailing ransomware
48:53
related events as reported by banks and
48:55
other financial institutions through the
48:58
Bank Secrecy
49:00
Act, also BSA.
49:03
FinCEN said that in
49:05
twenty twenty one,
49:08
filings related
49:10
to suspected ransomware payments
49:14
substantially increased from
49:16
twenty twenty. Okay.
49:18
So we're nearly a year behind. Right?
49:20
because that's the way these reports
49:22
go. Takes a while for them
49:25
to filter through. So not like
49:27
this year. We know this year
49:29
was like a bang up year
49:32
more so than even in twenty twenty
49:34
one. Anyway, twenty
49:36
twenty one substantially increased over
49:38
twenty twenty. Twenty twenty
49:40
one saw
49:42
a reported one
49:45
point two billion dollars
49:47
in known
49:50
ransomware payments paid
49:52
out. The agency, FinCEN,
49:54
estimates that roughly three quarters of
49:56
these payments were made to
49:58
ransomware
49:59
gangs located in Russia.
50:02
And of course, that's all the
50:04
ones that we're talking about. The big guys,
50:06
all of this is is
50:09
Russian to a large degree. I've got a graph of
50:11
the last few years
50:13
of this, but basically,
50:15
it is,
50:18
it's not quite exponential, but it's more than
50:20
linear. You know? It's
50:23
more, yeah, it looks
50:25
like a hockey stick. A little
50:27
hockey sticky. Yeah. It's not good.
50:30
It's going up fast. So
50:32
boy. Yeah. We don't want Russia
50:34
to be receiving our
50:36
money. And the problem is, well, there's this
50:38
much money behind it. One point two
50:40
billion dollars in, you know,
50:42
in cryptocurrency
50:44
transfers, That's called incentive.
50:46
Yep. And, you know, it's
50:48
not what we want. That's why
50:50
it's so low, you know, in
50:52
the left hand side of the chart. You
50:55
can really trace the success of ransomware to the
50:57
rise of crypto. Yes. Unless
51:00
you could get
51:02
paid without getting caught, there was
51:04
really no way to
51:06
make this happen. Remember, it was
51:08
Western Union transfers. That was the
51:10
way it was being done. Or you'd go down
51:12
and buy money cards from
51:14
the seven eleven. Right?
51:16
Right. Right. Sorry. No.
51:18
It's absolutely, it's been
51:20
like the perfect storm,
51:23
where the bad guys realize, hey,
51:25
this is great. We love this
51:27
cryptocurrency stuff. Let's just
51:29
ask for some Bitcoin.
51:32
Akamai published their
51:34
third quarter, their
51:36
q three threat report for
51:38
this year, twenty twenty
51:41
two, which they released, right, smack dab
51:43
on Halloween.
51:46
Since phishing has grown to
51:49
become by far, I mean, how many times have we
51:51
spoken about it already in these forty six
51:54
minutes, the most frequently
51:56
detected first step in
51:58
most successful attack
52:00
scenarios, what Akamai's report had
52:02
to say about phishing I thought
52:04
was telling. They said as covered in the
52:06
q two, that is their previous
52:08
quarter's twenty twenty two report,
52:10
the overwhelming
52:12
phishing landscape, scale, and magnitude is
52:15
being enabled, and this is
52:17
news, by the existence of
52:19
phishing toolkits. Phishing
52:22
toolkits support the deployment and
52:25
maintenance of phishing websites,
52:27
driving even nontechnical
52:32
scammers to join the phishing adversary
52:34
landscape and run and
52:36
execute phishing
52:38
scams. And anyone
52:40
who's been listening to this podcast for long knows, that's
52:42
like the worst thing that we could
52:44
hear. Right? Is you don't have
52:47
to know anything now, increasingly, in order to
52:50
pull this off, which is
52:52
why there's so much of it.
52:54
They
52:56
wrote according to Akamai research that tracked two
52:58
hundred and ninety nine
53:01
different phishing tool
53:04
kits being used in the wild to launch
53:06
new attack campaigns during
53:09
the third quarter of
53:11
twenty twenty two: two
53:14
point zero one percent of the tracked kits
53:17
were reused on at
53:19
least sixty three
53:22
distinct days. Fifty
53:24
three point two percent, so a little over
53:26
half of the kits were reused to
53:28
launch a new attack campaign on
53:30
at least five
53:32
distinct days. And
53:34
all one hundred percent of
53:36
the tracked kits were used on
53:38
no fewer than three
53:40
distinct days with the average toolkit
53:42
reused on nine days during the third quarter of twenty
53:45
twenty two.
53:47
So
53:48
the bad guys are being fickle about
53:50
their toolkits. They're jumping around
53:52
trying different ones. And they're not
53:55
these are not long lived
53:58
campaigns. They're they're setting them up,
54:00
sending out a bunch of emails,
54:02
you know, waiting for how long they would expect the
54:04
email to take before
54:06
somebody opened it and clicked on
54:08
it. And you know, they
54:10
wait five, six, seven, eight, nine days, and then
54:12
they go, okay, well,
54:14
time to do a
54:16
different campaign. They wrote further
54:18
analysis on one of the most
54:20
reused kits in the third quarter,
54:22
counting a number of different domains
54:24
used to deliver each kit shows that
54:26
kits that abuse Adobe
54:28
and M and T Bank are
54:30
top leading
54:32
toolkits. Adobe with more than
54:34
five hundred domains
54:36
just during Q3, you
54:38
know, and M and T Bank
54:40
with more than four hundred domains. They
54:42
said the reusing behavior of phishing toolkits is
54:45
more evidence of the trend
54:47
of the phishing landscape that
54:49
continues to scale, moving
54:51
to a phishing as a service
54:54
model and utilizing free Internet
54:56
services. Phishing attacks
54:58
are more relevant than ever.
55:01
And it's interesting because their their
55:04
mention of utilizing free
55:06
Internet services. Remember, that was
55:08
the one thing that
55:10
the guy, the
55:12
technical director of NCSC,
55:14
who was the subject of last week's
55:16
podcast, one of the things he
55:18
said was I wish something could be done
55:20
to limit free
55:22
hosting services. That is
55:24
where so much of the
55:26
problem is. And
55:28
at the same time, he said, but what can you do in
55:30
a, you know, in an
55:32
open Internet? Yeah.
55:36
Exactly. But
55:38
here, you know, utilizing free
55:40
Internet services, the ability to just, you
55:42
know, spin up free
55:46
hosting and create a free Internet service? That's a problem. So
55:48
but think about that. Two hundred
55:51
and ninety nine distinctly different
55:54
phishing toolkits. And as I said,
55:56
what we've learned from observation is that
55:59
the easier something
56:01
is to do, the more it
56:03
will be done. The Log4j vulnerability never swept the
56:05
world as was originally
56:07
feared because it turned out that
56:09
the nature of the vulnerability meant that there was
56:12
no one size fits all
56:14
exploit for
56:16
it available. And if the script kiddies can't use
56:18
something, then it's usually
56:20
significantly curtailed. Yep. But
56:23
if they can use something,
56:26
then a feeding frenzy is
56:28
the result. So
56:30
on the on the, you know,
56:32
on the front end, it has never been
56:34
easier to get into the phishing
56:36
business. And on the
56:38
back end, there's a huge market for the
56:40
services of the so
56:42
called initial access brokers.
56:44
Right? They're the
56:46
ones who perform
56:48
this, who develop initial
56:50
access, and then resell
56:52
it. So any credentials that
56:54
a phishing campaign can manage
56:56
to obtain will find a ready market among
56:58
those who can turn them into
57:00
devastating network
57:02
attacks. I
57:04
do have one little bit of
57:06
news before I talk about initial access
57:08
brokers, and that is that
57:11
Akamai reported seeing although
57:14
this was in their admittedly
57:16
very skewed
57:18
sample set, which I'll explain, a forty
57:20
point increase, from twenty
57:23
five percent to sixty
57:26
five percent in the use
57:28
of DNS over TLS.
57:30
But that's not global.
57:32
That's their enterprise and
57:34
their own small and medium sized business customers. But
57:37
still, you know, although this doesn't
57:39
represent the world at large.
57:42
You know, currently, more than seventy
57:44
percent of all DNS
57:47
remains over UDP.
57:50
And what I think will
57:52
happen is this will be a very gradual
57:54
change as new systems
57:56
are engineered from scratch. It's
57:59
more likely that those
58:02
new solutions will probably
58:04
choose one of the encrypted
58:06
forms of DNS rather than
58:09
old school UDP, so we can hope.
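For a sense of what actually changes on the wire when DNS moves from UDP to DNS over TLS, here is a small standard-library sketch. The DNS message itself is identical; DoT (RFC 7858) just carries it inside a TLS stream to port 853, using the same two-byte length prefix that DNS over TCP uses, so the query is encrypted rather than readable in a plain UDP datagram. The transaction ID below is an arbitrary example value.

```python
import struct

def build_query(name: str, qtype: int = 1, txid: int = 0x1234) -> bytes:
    """Build a minimal DNS query message (RFC 1035 wire format)."""
    # header: id, flags (recursion desired), 1 question, 0 other records
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode() for label in name.split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN

udp_payload = build_query("example.com")  # sent as-is inside a UDP datagram
# Over TCP or TLS, the identical message is framed with a 2-byte length prefix:
tls_payload = struct.pack(">H", len(udp_payload)) + udp_payload
```

Everything after the length prefix is byte-for-byte the same message; the privacy gain comes entirely from the encrypted transport around it.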
58:11
And it certainly says something that, you
58:13
know,
58:15
Akamai's own enterprise and
58:20
small and medium sized business
58:22
customers really have started
58:24
to adopt DNS
58:26
over TLS. Okay. As for initial access
58:28
brokers,
58:31
another third quarter
58:33
report came out from
58:35
a threat intelligence firm, Kela,
58:38
K E L A.
58:40
They
58:41
published a report on the
58:44
initial access broker side
58:46
of the network intrusion
58:49
marketplace. Kela's report
58:51
stated that during just this third quarter,
58:53
this past third quarter that just ended this year, they
58:55
found over five hundred
58:58
and seventy unique
59:00
network access listings for
59:03
sale, with a
59:05
cumulative requested price of
59:08
approximately four million
59:10
US dollars. Okay. So
59:12
just to be clear, someone responding
59:16
and agreeing to purchase one of these five
59:18
hundred and seventy listings
59:20
would be receiving, and this is something
59:22
that's done, you know, through
59:26
a Tor hidden service on the so-called dark web, they
59:28
would be receiving the means to
59:30
log in to an unsuspecting
59:34
company's network
59:36
with useful network privileges.
59:38
Within that set of five hundred
59:41
and seventy listings, the average
59:44
price to purchase access was
59:46
two thousand eight hundred dollars and
59:49
the median price was thirteen hundred and fifty
59:52
dollars. And prices have
59:54
been rising since the second quarter. The
59:56
total number of listings remained
59:59
almost unchanged between the second quarter
1:00:01
and the third quarter appearing at
1:00:03
the rate of around one
1:00:05
hundred and ninety new access
1:00:08
listings per month. So
1:00:10
think about that. So
1:00:13
there's a marketplace
1:00:14
where
1:00:15
people could go. And in fact,
1:00:18
as we'll get to it later, remember the
1:00:20
numbskull Floridians, they
1:00:22
actually went here and they
1:00:25
asked for access to
1:00:28
CPA and
1:00:28
tax preparer networks.
1:00:31
I
1:00:31
mean, this marketplace is that
1:00:34
specific. You can go there and
1:00:36
you can say, I
1:00:38
want to
1:00:40
get into the networks of these types of businesses
1:00:42
and you can purchase credentials
1:00:46
credentials that do that. And
1:00:49
though new credentials are appearing
1:00:51
at the rate of one
1:00:53
hundred and ninety listings per
1:00:56
month. That's six and
1:00:58
a quarter new listings per day, by
1:01:00
the way.
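The gap Kela reports between the average price ($2,800) and the median ($1,350) implies a skewed market: many cheap listings and a few very expensive ones pulling the average up. The prices below are hypothetical, invented only to show how that shape produces exactly that kind of average-versus-median gap.

```python
from statistics import mean, median

# Hypothetical access-listing prices in dollars (NOT Kela's actual data),
# chosen to reproduce the reported average/median shape.
listing_prices = [300, 500, 800, 1200, 1300, 1400, 2000, 3000, 6000, 11500]

avg = mean(listing_prices)    # pulled upward by the few big-ticket listings
med = median(listing_prices)  # the price of the "typical" listing

# A handful of expensive network accesses dominate the average:
assert avg > 2 * med
```

Run against these made-up numbers, the mean comes out to 2800 and the median to 1350, matching the reported figures: the median describes the typical cheap credential, while a few high-value corporate accesses drive the average.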
1:01:02
So, anyway,
1:01:04
at the average price, twenty eight hundred dollars
1:01:07
to purchase access to
1:01:09
somebody's network. Mhmm. And
1:01:12
typically, there's five hundred and seventy
1:01:14
of them up at
1:01:16
any one time. Wow. Okay.
1:01:18
We will get to Florida in
1:01:22
a bit. I found an interesting little bit that
1:01:24
shared some details about
1:01:26
how bank heists work. Although
1:01:30
they don't receive a lot of coverage, over the past decade,
1:01:33
banks have not escaped
1:01:35
ever increasingly sophisticated
1:01:38
cyberattacks. Many banks
1:01:40
have been hacked and have collectively
1:01:43
lost billions of
1:01:45
US dollars in
1:01:48
serious intrusions. The two most notorious and successful
1:01:50
threat actors that pulled off
1:01:52
successful bank heists were
1:01:54
a group
1:01:56
called Carbanak, and
1:01:58
also North Korea's Lazarus Group, which
1:02:01
is an APT, an advanced
1:02:03
persistent threat group, Lazarus, we've
1:02:05
talked about before. The
1:02:08
attack geography interestingly enough
1:02:11
has been evolving over time.
1:02:14
Initial cyber
1:02:16
heists tended to target organizations in North America and
1:02:18
in Europe. Once those
1:02:20
regions were fully explored and
1:02:24
security began tightening up, there
1:02:26
was a move into Asia and
1:02:30
Latin America. But as
1:02:32
those banks also began to seriously
1:02:34
upgrade their network defenses and
1:02:36
security, movement has been now more
1:02:38
recently in the direction of Africa, a region that
1:02:41
is until now been left largely
1:02:43
unscathed, but a
1:02:46
joint report published this week by Security Firm Group
1:02:48
IB and Orange's CERT
1:02:50
Team, a French speaking cyber
1:02:53
group tracked as, there,
1:02:55
okay, we'll pronounce them
1:02:58
Opera1er, although the one
1:03:00
is a numeral. So O P E R A,
1:03:03
numeral one, E R, also
1:03:05
known as Common Raven or the
1:03:08
DESKTOP group. They've recently
1:03:10
been wreaking havoc across
1:03:13
the African continent. Well, recently, from twenty eighteen
1:03:16
through twenty twenty one,
1:03:18
this report
1:03:19
covers nothing since
1:03:22
then, but, you know, actions that
1:03:24
have continued. The researchers
1:03:26
said they linked this
1:03:28
operator group to thirty
1:03:32
five different intrusions at different organizations
1:03:35
across fifteen countries
1:03:37
in Africa with most of
1:03:39
the targets being banks.
1:03:42
Group IB and
1:03:44
the Orange researchers said that while
1:03:47
the group used basic phishing
1:03:50
attacks and off-the-shelf remote access trojans
1:03:52
to gain an initial foothold
1:03:54
in their victims networks.
1:03:56
Once inside a network, this
1:03:59
operator group has exhibited
1:04:02
both restraint and patience.
1:04:04
Some intrusions lasted for
1:04:07
months as the group moved laterally
1:04:10
across banking systems, observing,
1:04:13
mapping the internal network
1:04:16
topology and patiently waiting before springing their attack.
1:04:18
The group's target was
1:04:21
banking systems that handled
1:04:24
money transfers. And this is I
1:04:26
found so interesting. The
1:04:28
report explained once their network penetration
1:04:31
had reached those
1:04:33
most sensitive systems where
1:04:35
the actual money transfers are managed, the
1:04:38
group would set a time
1:04:40
for the
1:04:42
heist,
1:04:42
and working
1:04:43
with a large
1:04:45
network of some four
1:04:48
hundred money
1:04:50
mules, would orchestrate a synchronized,
1:04:54
coordinated transfer of funds
1:04:56
from the bank's larger legitimate
1:04:59
accounts into the four hundred
1:05:02
mule accounts. With the money
1:05:04
mules immediately
1:05:07
withdrawing the stolen funds
1:05:09
from their accounts via
1:05:12
ATMs. In a coordinated
1:05:14
ATM cash out, before the
1:05:16
bank's employees had the
1:05:18
opportunity to react. The
1:05:20
mules would refresh
1:05:23
the ATM screens at the appointed
1:05:25
time, waiting for the account balance to suddenly jump
1:05:28
up, then they would drain
1:05:30
the account
1:05:33
for cash and quickly leave the
1:05:35
area. Thus, of course, bringing
1:05:37
new meaning to
1:05:40
the term decentralized
1:05:44
finance. The Group-IB
1:05:46
researchers said they had linked
1:05:48
Opera1er intrusions to
1:05:50
bank heists totaling eleven million dollars, but
1:05:52
the group is suspected of stealing more
1:05:54
than thirty million total,
1:05:56
though not all the incidents have been
1:05:58
formally confirmed.
1:06:00
So anyway, I thought that was interesting. The
1:06:02
the bad guys get in using
1:06:04
phishing or remote access trojans,
1:06:08
set up a presence in the networks,
1:06:10
explore the networks, being
1:06:13
quite patient, sometimes taking
1:06:15
months until they
1:06:18
determine what is there
1:06:20
and get into a
1:06:22
position where they're able to
1:06:24
actually perform
1:06:26
account funds transfers. They then
1:06:28
reach out to their network,
1:06:30
obviously, a pre established
1:06:34
network of four hundred
1:06:36
individuals, I
1:06:38
mean, you know, individuals who
1:06:42
then
1:06:43
at a prescribed time, go
1:06:45
to ATMs where their own
1:06:47
mule accounts have
1:06:50
suddenly become
1:06:52
wealthy and dump all the cash
1:06:54
out of the ATM that they
1:06:56
can and then
1:06:58
take off and
1:07:00
head somewhere else.
1:07:01
Wow. Just to
1:07:03
sort of keep an
1:07:05
eye on De
1:07:08
Fi, not to anyone's surprise.
1:07:10
The DeFi platform
1:07:12
Skyward Finance confirmed last
1:07:16
Wednesday that a clever hacker had exploited a vulnerability in
1:07:18
its smart contract system and made
1:07:20
off with three million dollars of cryptocurrency
1:07:25
You know? And I guess at this point
1:07:27
for us, the proper expression would be or
1:07:29
the response would be a
1:07:31
yawn. And the
1:07:33
DeFi platform Solend, S-O-L-E-N-D,
1:07:35
said it lost one point two six million
1:07:38
worth of cryptocurrency following
1:07:40
an Oracle attack on
1:07:42
its platform which
1:07:44
targeted the Hubble USDH
1:07:46
stablecoin. So
1:07:50
It's hard to keep track
1:07:52
of all these things these
1:07:54
days. Leo, you're gonna love this one.
1:07:57
In a big "what
1:08:00
in the world took them so long" bit of news, the
1:08:03
Russian Ministry of
1:08:06
Digital Development surveyed
1:08:08
the country's largest IT
1:08:10
firms, you know, Russia's
1:08:12
largest IT firms to
1:08:15
obtain their recommendations for
1:08:18
the best replacement for Windows across Russian government and
1:08:23
private sector networks. The
1:08:25
three contenders are all Linux based operating systems because
1:08:28
what else could
1:08:31
they be? Yeah. They
1:08:35
are. I mean, you're right. There is nothing else. Yeah. You're
1:08:35
gonna make it a Mac? No. Of course not. No. No. So they
1:08:38
are Astra Linux, Alt
1:08:42
OS -- Mhmm. -- and Red OS.
1:08:45
Red OS, that's the Chinese
1:08:47
one, isn't it? China
1:08:49
has its own Linux distribution that the Chinese
1:08:51
Communist Party recommends. Yep. Yep. It would certainly
1:08:53
make
1:08:54
sense that it
1:08:54
was Red OS brand Linux.
1:08:58
Yeah. And again, how many times have
1:09:00
we, like, wondered,
1:09:02
like, what has taken
1:09:05
them so long? Like,
1:09:07
how could Russia be
1:09:09
using Windows? Yeah. It's just astonishing. They're
1:09:11
often using pirated copies of
1:09:16
Windows, and often using end
1:09:18
of life pirated copies of Windows, so it's hideously insecure. The Chinese
1:09:20
Linux is Kylin
1:09:23
Linux, K-Y-L-I-N.
1:09:26
And it's specifically for the mainland China
1:09:28
market. Well, and get
1:09:30
this. It turns out
1:09:35
that Russia would not have moved away
1:09:37
from Windows but for their
1:09:40
attack on
1:09:42
Ukraine. Reportedly, the Russian government is
1:09:45
seeking a replacement only
1:09:47
now after Microsoft pulled
1:09:49
out of Russia stopped delivering
1:09:51
security updates to Russian systems,
1:09:53
and started blocking Russian's access
1:09:56
to Windows installation files. In
1:09:58
other words, Microsoft left them with no
1:10:00
choice. Yeah. And so --
1:10:02
Yeah. -- okay. Yep. Linux. Linux
1:10:06
again. I don't --
1:10:08
I
1:10:10
wonder -- I wonder if --
1:10:13
I guess I don't, because they're moving
1:10:15
to an open source operating system. Our NSA knows all
1:10:20
about Linux,
1:10:22
you know, just as well as it does Windows. So -- Yeah. -- it probably doesn't really make a difference
1:10:25
one way or
1:10:28
the other. Okay. Leo,
1:10:30
this one. Wow. We've all seen war
1:10:36
stories where, in the midst
1:10:38
of battle, prominently marked Red Cross trucks come barreling in, carrying
1:10:41
non combatants wearing
1:10:44
wide red cross
1:10:46
armband emblems with the
1:10:49
hope and expectation that all combatants
1:10:51
in the area, no matter
1:10:53
whose side they're on, will respect
1:10:55
the Red Cross's global neutrality and allow
1:10:57
them to care for the
1:11:00
wounded. In
1:11:02
a bizarre and --
1:11:04
okay, I
1:11:07
was gonna say interesting, but I think bizarre wins --
1:11:12
move --
1:11:13
they're
1:11:15
trying to do
1:11:17
this in cyberspace. After two
1:11:19
years of study, last Thursday,
1:11:21
the International Committee for the Red Cross, the ICRC,
1:11:23
has published their resulting
1:11:28
report -- again, it took them
1:11:30
two years, titled digitizing the Red Cross, Red Crescent and
1:11:35
Red Crystal Emblem: Benefits,
1:11:38
Risks, and Possible Solutions. Okay? In explaining their intention,
1:11:44
they wrote: As
1:11:47
societies digitize, cyber operations
1:11:50
are becoming a reality
1:11:52
of armed conflict.
1:11:54
A growing number of states are
1:11:57
developing military, cyber
1:11:59
capabilities, and their use
1:12:02
during Armed Conflict is likely to
1:12:04
increase. The ICRC, the
1:12:07
International Red Cross, has
1:12:10
warned against the potential human cost of cyber operations and, in particular,
1:12:16
the vulnerability of
1:12:18
the medical sector and humanitarian
1:12:20
organizations to cyber operations, both having
1:12:22
been targeted in recent years. Against
1:12:27
this background, the ICRC decided
1:12:30
to investigate the idea
1:12:34
of reflecting the internationally recognized distinctive
1:12:37
red cross, red crescent
1:12:39
and red crystal
1:12:42
emblems in information and communication
1:12:45
technology, i.e., a
1:12:48
digital emblem.
1:12:50
Since twenty twenty, the ICRC
1:12:53
has partnered with research
1:12:55
institutions to explore the
1:12:58
technological feasibility of developing a digital emblem and
1:13:01
convened a global group
1:13:03
of experts to assess
1:13:06
its potential benefits and
1:13:08
risks. The idea and objective of
1:13:10
a digital emblem was straightforward. For over
1:13:13
a hundred and
1:13:16
fifty years, The distinctive emblems have been
1:13:18
used to convey a simple message. In times
1:13:20
of armed conflict,
1:13:23
those who wear them or
1:13:26
facilities and objects marked with
1:13:29
them must be protected
1:13:31
against harm.
1:13:32
Well,
1:13:34
well -- good luck.
1:13:35
I wonder whether
1:13:36
during these past two
1:13:39
years of study, those
1:13:41
working on this
1:13:44
have noticed how many hospital networks
1:13:46
have been cyberattacked? Yeah. You know, we're not dealing with declared
1:13:49
hostilities in a
1:13:52
battle theater where there's any sense of
1:13:54
honor and conventions, Geneva or otherwise, I'll be interested to see how
1:13:58
this one plays out. You know, I mean, what would prevent
1:14:01
non Red Cross
1:14:04
organizations from,
1:14:06
you know, putting up a red cross seal in order
1:14:09
to protect themselves from attack. I
1:14:11
mean, it's just loony. You
1:14:14
know? Okay.
1:14:15
Okay. Last
1:14:17
Tuesday,
1:14:18
the Department of
1:14:20
Justice's U. S. Attorney's Office
1:14:22
for the Middle District of Florida
1:14:25
posted a press release with the title,
1:14:27
band of cyber criminals responsible
1:14:30
for computer intrusions nationwide
1:14:34
indicted for RICO conspiracy
1:14:38
that netted millions.
1:14:40
Okay? And
1:14:42
now -- that's thirty six million,
1:14:45
to be precise.
1:14:48
Okay.
1:14:48
1:14:50
The alleged tax fraud crimes took place from
1:14:52
twenty fifteen through twenty
1:14:55
nineteen. DOJ officials
1:14:58
said the group first purchased credentials
1:15:00
from the dark web,
1:15:02
allowing them to gain
1:15:04
access to the internal
1:15:07
networks of several certified public accounting
1:15:09
and tax preparation firms
1:15:11
located across the
1:15:14
US. The group accessed
1:15:16
the CPA and tax
1:15:18
prep networks, stole the
1:15:21
tax returns of thousands of
1:15:24
taxpayers, created six
1:15:26
tax preparation businesses
1:15:29
in Florida and then set up bank accounts and
1:15:32
everything. I mean, full working
1:15:34
businesses and used those companies
1:15:37
to those six tax
1:15:39
preparation companies to file
1:15:41
more than nine thousand
1:15:44
fraudulent tax returns
1:15:46
in the victims' names, and
1:15:49
hijacked tax refunds, directing
1:15:51
them towards their
1:15:55
own accounts. And surprise, surprise.
1:15:57
Somehow, this was
1:15:59
detected
1:16:00
and they didn't
1:16:03
get away with it. Now
1:16:05
they're all facing on the
1:16:07
order of twenty years behind bars for RICO charges and fraud and
1:16:09
money laundering and,
1:16:12
you know, you
1:16:15
know, interstaced felonies and
1:16:17
you name it. I think
1:16:19
what was most interesting
1:16:21
and illuminating about this was the
1:16:23
idea that -- well, on the
1:16:28
dark web -- it's
1:16:30
literally possible to search for network access by entity
1:16:33
type. It's
1:16:36
like, yeah, I'd
1:16:38
like to purchase network access credentials for CPA and tax prep firms in the
1:16:40
US. How much
1:16:43
for how many?
1:16:48
Wow. This piece
1:16:50
from
1:16:54
Microsoft -- I'm not sure about this. Seems a little
1:16:56
specious to me.
1:16:59
It appears to be the
1:17:01
month for reporting, and Microsoft
1:17:03
is also out with their
1:17:05
annual digital defense report. The
1:17:08
report contained a great many
1:17:10
interesting tidbits and buried among them
1:17:12
was Microsoft's observation of an
1:17:14
interesting change in China's profile. The
1:17:17
observation begins with
1:17:20
Microsoft noting that
1:17:22
China's advanced persistent threat actors have leveraged significantly
1:17:28
more zero day vulnerabilities
1:17:30
during the past year than anyone else.
1:17:33
Now,
1:17:36
although most, you
1:17:37
know, if
1:17:38
not all APT groups rely upon zero day vulnerabilities for
1:17:44
their exploits, Microsoft said that it had
1:17:46
noted Chinese threat actors had an increased
1:17:48
number of
1:17:51
zero days over the past
1:17:53
year. And most interestingly,
1:17:56
Microsoft believes
1:17:58
this sudden spike in zero day exploits
1:18:00
exclusively by Chinese
1:18:02
threat actors is the
1:18:06
direct result of a new law passed by the Chinese
1:18:09
government last year. We talked
1:18:11
about this last summer. The
1:18:13
new law was passed
1:18:16
in July of twenty twenty one,
1:18:18
and it entered into effect in September of last year, twenty twenty one.
1:18:24
It requires all Chinese security
1:18:26
researchers to first report
1:18:29
any new vulnerabilities
1:18:32
they find to
1:18:34
a state security agency. And yes, at the time, this did
1:18:36
indeed raise
1:18:40
some eyebrows. It was
1:18:42
roundly criticized within the security industry while the Chinese government claimed that it only
1:18:47
wanted to maintain an accurate
1:18:50
catalog of vulnerabilities for the sake of making sure that local companies
1:18:53
would not
1:18:56
dodge responsibility for
1:18:58
failing to patch vulnerabilities in time, thus leaving Chinese government
1:19:01
networks exposed to
1:19:04
attacks. Uh-huh.
1:19:06
Hello? Right?
1:19:07
And
1:19:08
that sort
1:19:08
of sounds like a reverse
1:19:11
engineered rationale. To put a
1:19:13
point on it, the new
1:19:15
law also contains several generically
1:19:18
worded clauses that could
1:19:20
be interpreted to suggest that
1:19:22
that the Chinese government was setting
1:19:25
up a secret process through which
1:19:27
its offensive cyber units would
1:19:29
have access to
1:19:32
this trove of privately reported,
1:19:34
at-the-time-unknown vulnerabilities while simultaneously suppressing
1:19:36
the work of
1:19:39
the InfoSec community for
1:19:42
the benefit of the country's espionage operations. Although no solid evidence has come to support
1:19:48
these theories, Microsoft appears to
1:19:50
be sold on this narrative in its latest report. They wrote:
1:19:52
This new legislation --
1:19:55
this new regulation might --
1:19:59
this is Microsoft writing. This new
1:20:02
regulation might enable elements
1:20:04
in the Chinese government to
1:20:07
stockpile reported vulnerabilities
1:20:10
toward weaponizing them.
1:20:12
The increased
1:20:14
use of zero days over
1:20:17
the last year from
1:20:19
China based actors likely reflects the first full year of
1:20:23
China's vulnerability disclosure requirements for
1:20:26
the Chinese security community
1:20:29
and a major step
1:20:31
in the use of zero day exploits as
1:20:33
a state priority.
1:20:36
To put a little more
1:20:38
meat on the bone, Microsoft listed
1:20:40
five specific zero days
1:20:42
as possible examples of abuse: two
1:20:45
in Zoho
1:20:48
ManageEngine, and one
1:20:50
each in SolarWinds Serv-U, Atlassian Confluence, and
1:20:56
Microsoft Exchange. Were
1:20:58
exploits of these five zero days developed
1:21:01
by Chinese APT threat actors
1:21:03
after they were reported through
1:21:06
Chinese in house vulnerability disclosure rules? We don't know. Maybe. On
1:21:11
the other hand, would anyone
1:21:13
be surprised to learn of zero days in those
1:21:15
applications? Hasn't all of
1:21:18
that software been repeatedly
1:21:20
plagued by
1:21:22
major vulnerabilities and zero days found by other
1:21:25
threat actors?
1:21:28
Of course. Of
1:21:30
that, there could be no doubt. So perhaps
1:21:32
a more accurate and rounded assessment
1:21:34
would be that we cannot
1:21:37
blame Chinese APT actors for looking at what
1:21:39
everyone else is looking at and discovering the same zero days
1:21:42
that others are finding.
1:21:45
Could they be getting a
1:21:48
little help from the state's mandatory disclosure law
1:21:50
again? Maybe, but public evidence seems to be sorely
1:21:52
lacking. What
1:21:55
I wondered, like, maybe reading between
1:21:58
the lines, is whether Microsoft
1:22:00
actually knows more than
1:22:03
they're able to disclose without
1:22:06
revealing well, without revealing their own sources and methods which they need to
1:22:08
keep secret. Maybe this is a little
1:22:10
bit of a shot across the bow
1:22:15
saying, you know, read between
1:22:17
the lines
1:22:18
China because here's
1:22:20
five zero days that
1:22:23
we think are suspicious. Maybe
1:22:25
they have grounds and they
1:22:27
just can't talk about it. And
1:22:29
Leo, I need a sip while --
1:22:31
Absolutely. -- to continue
1:22:34
talking about this. Well, you'd
1:22:37
I'm gonna just take a little moment
1:22:39
then while you're drinking water.
1:22:42
Although maybe later tonight, you'll be
1:22:44
drinking a little more red wine, I
1:22:47
think. I think so. To talk
1:22:49
about Club TWiT. A lot of
1:22:51
what you hear on the network,
1:22:54
of course, is ad supported. But
1:22:56
ads don't cover all
1:22:58
the costs. And increasingly, they're covering fewer and fewer of the costs.
1:23:00
I haven't mentioned this
1:23:02
before, but it costs us
1:23:06
Lisa told me this recently, and I kinda almost fell
1:23:09
over. About three and a half
1:23:11
million dollars a year to
1:23:13
run TWiT. That's for salaries, rent, you
1:23:15
know, technology, all of that stuff. It
1:23:18
that that excludes anything I get paid
1:23:20
or Lisa gets paid.
1:23:22
That's a lot of money. And advertisers have
1:23:24
been, you know, great for us and they've covered that
1:23:27
for a long time. That
1:23:29
may not continue. It's starting to look
1:23:31
like it might be a
1:23:34
little bit of a desert next
1:23:36
year. And that means
1:23:39
Club TWiT's gotta fill the gap
1:23:42
because, honestly, I'm skint. I
1:23:44
I can't do it. So that
1:23:46
means you, and it means seven
1:23:48
bucks a month, less than what you'd
1:23:51
pay for a couple of venti, tall, twenty
1:23:53
shot lattes, but it it
1:23:56
also makes such a
1:23:58
big difference in our operating
1:24:00
budget. What do
1:24:02
you get for seven bucks? I'm not gonna I'm not gonna
1:24:04
ask you for money. It's less than Twitter Blue, I
1:24:06
might add. I'm not gonna ask you for money
1:24:08
without telling you what you get. You do get
1:24:10
something. You get ad free versions of this show and every show we do. You get shows we don't normally
1:24:13
put out
1:24:16
in public. Like, we put one out tomorrow.
1:24:18
We put a Hands-On Mac or Hands-On Windows out. But there are many more of those. Paul Thurrott, Mikah Sargent,
1:24:20
do those inside Club
1:24:22
TWiT, for the most part.
1:24:26
Partly that's because Club TWiT launches these shows
1:24:28
just as it did with This Week in
1:24:30
space. And then as they gain legs, we
1:24:33
could put them out as
1:24:35
real grown-up podcasts. We also do the Untitled Linux Show, which Jonathan Bennett does every Saturday
1:24:37
afternoon. Right after the kids
1:24:39
visit with Dick DeBartolo, he
1:24:43
does a great job. Stacey's Book Club every other month.
1:24:45
There's a lot of content. There's a
1:24:47
great Discord server, which
1:24:49
is always very active. You're also supporting -- I don't usually
1:24:52
mention this, but you're also supporting
1:24:54
things that are open to all
1:24:56
as a Club TWiT
1:24:58
member: our IRC, for instance, which
1:25:01
doesn't cost a lot,
1:25:03
our community TWiT forums,
1:25:08
and -- which places have become really busy
1:25:10
of late -- our TWiT Mastodon instance, twit dot social. I
1:25:15
recently just had to subscribe
1:25:18
to the top level tier with
1:25:20
our host because there was so much traffic. We
1:25:23
I think we had seven thousand
1:25:25
percent increase in users over the
1:25:28
last five days, and I think
1:25:30
it's gonna continue, about which I'm
1:25:32
thrilled. But all of that comes
1:25:34
from Club TWiT. So you're really supporting,
1:25:36
I think, a a great effort, an
1:25:38
important thing. You get ad free shows,
1:25:40
you get the Discord, you get
1:25:42
new shows and other shows that we
1:25:44
don't put out in public, seven bucks a month.
1:25:47
So this is my pitch to you. If you
1:25:49
like what we do at TWiT. And
1:25:51
you know what? It's completely fine. If you
1:25:53
can't afford it, I understand completely.
1:25:55
There's no pressure, you
1:25:58
know, no peer pressure to do this, but it
1:26:00
sure is nice if you do.
1:26:02
twit dot tv slash club twit. It's
1:26:04
a way to show Steve and all of
1:26:06
our hosts that you appreciate what they're
1:26:08
doing. And by the way, if you just
1:26:10
wanted an ad free version of SecurityNow,
1:26:12
we do offer that. Same page, twit dot t
1:26:15
v slash club twit. By
1:26:17
itself, that's just two dollars and ninety nine cents a month.
1:26:19
I would say spend the extra four bucks and get it all, but that's
1:26:21
up to you, and
1:26:23
I thank you in
1:26:26
advance. And now I return
1:26:28
you to your fully
1:26:31
hydrated host of security
1:26:33
now, Steve Gibson. Yes.
1:26:35
Okay. So I
1:26:38
love this idea. I'll
1:26:40
be able to see what feedback I get from
1:26:42
our listeners because not everyone might like it.
1:26:45
but it's interesting. The UK's
1:26:47
cyber group, the NCSC, will
1:26:52
be scanning its
1:26:54
public network space looking
1:26:56
for known vulnerabilities. I think
1:26:59
this is an interesting trend.
1:27:01
We were, of course, just
1:27:03
talking about the UK's GCHQ
1:27:06
NCSC Cyber Division
1:27:08
last week when we covered the
1:27:10
retirement of its technical director after his twenty years of service, and
1:27:12
he certainly knew this was happening because this, you know,
1:27:14
had to have been in the works for
1:27:18
a while. So it was with interest that I noted,
1:27:20
what I think is
1:27:22
the NCSC's excellent plan
1:27:26
to periodically scan its own UK IP
1:27:28
space searching for known
1:27:30
vulnerabilities which are accessible
1:27:33
on the public Internet and
1:27:35
reporting them for remediation to the owners
1:27:37
of those IP addresses. I think this
1:27:39
is a terrific idea.
1:27:42
Okay. So they have an information page, which
1:27:45
they titled NCSC scanning
1:27:48
information. It's
1:27:50
not too long. I'm just gonna share this because it's sort of in a
1:27:52
Q and A fashion. They said
1:27:54
this page provides information on the
1:27:58
NCSC scanning activities. You may have been referred here
1:28:00
by information left by
1:28:02
one of our scanning probes.
1:28:04
If a system
1:28:07
you own or administer has been
1:28:09
scanned. So they ask, why is the
1:28:11
NCSC carrying out scanning activities?
1:28:16
They say, As part of the NCSC's
1:28:18
mission to make the UK the safest place to live and do business online,
1:28:23
we are building a data driven view of
1:28:25
the, quote, the
1:28:27
vulnerability of the UK,
1:28:31
unquote. This directly supports the UK
1:28:34
government cybersecurity strategy relating
1:28:37
to understanding UK
1:28:40
cyber risk. This will help
1:28:42
us do three things: better understand the vulnerability and security
1:28:47
of the UK; help system owners understand
1:28:49
their security posture on a day to day
1:28:52
basis; and
1:28:55
respond to shocks like a widely exploited
1:28:57
zero day vulnerability. That's interesting. So they'll be on top
1:29:00
of that when
1:29:03
they find out something new, like Heartbleed,
1:29:06
for example, they would immediately scan the UK's web servers
1:29:08
and be proactive
1:29:11
rather than passive. Next
1:29:14
question. How does the NCSC determine
1:29:16
which systems to scan? They answer:
1:29:18
These activities cover any Internet accessible system
1:29:23
that is hosted within the UK and vulnerabilities that
1:29:25
are common or particularly important due
1:29:27
to their high
1:29:30
impact. The NCSC uses the data
1:29:32
we have collected to create an
1:29:34
overview of the UK's exposure to
1:29:38
vulnerabilities following their disclosure and remediation over
1:29:40
time. Boy, this just sounds
1:29:42
wonderful to me. Next question, how
1:29:46
is scanning performed? To identify whether a vulnerability exists
1:29:48
on a system, we first need
1:29:50
to identify the existence of
1:29:53
specific associated protocols
1:29:56
or services. We do
1:29:58
this by interacting with the system in much the same way a web browser or
1:30:00
other network
1:30:03
client typically would. and
1:30:06
then analyzing the response that is
1:30:08
received. For example, we may be
1:30:10
able to determine the existence of
1:30:13
a vulnerability known to exist in version
1:30:15
x of a type of
1:30:17
commonly used web server software
1:30:19
by making a web
1:30:21
request to the URL and then they give an
1:30:23
example dot slash login dot HTML
1:30:26
and detecting the value
1:30:31
version x in the content of the page that's returned. If the
1:30:33
vulnerability is then remediated in
1:30:35
a subsequent version
1:30:38
y, we can identify this by similarly detecting
1:30:41
the value version y
1:30:43
in the response. By
1:30:46
repeating these requests, on a regular basis,
1:30:48
we maintain an up to
1:30:50
date picture of vulnerabilities across
1:30:53
the whole
1:30:55
of the UK. Wow.
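The version-fingerprinting approach the NCSC describes can be sketched in a few lines of Python. This is a hypothetical illustration, not the NCSC's actual tooling: the product, version numbers, and page contents are all invented, and the response body is supplied as a string rather than fetched over the network.

```python
import re

# Assumption: these version sets are made up for illustration.
VULNERABLE_VERSIONS = {"2.4.49", "2.4.50"}   # versions known to carry the flaw

def classify_response(body: str) -> str:
    """Classify a server response body as 'vulnerable', 'patched', or
    'unknown' based on a 'version X.Y.Z' marker found in the page content."""
    match = re.search(r"[Vv]ersion\s+(\d+\.\d+\.\d+)", body)
    if not match:
        return "unknown"          # no version marker detectable
    version = match.group(1)
    if version in VULNERABLE_VERSIONS:
        return "vulnerable"
    return "patched"              # a later, remediated version was detected

# Example: the same hypothetical login page before and after the fix.
before = "<html><body>ExampleServer version 2.4.50 - login</body></html>"
after  = "<html><body>ExampleServer version 2.4.51 - login</body></html>"
print(classify_response(before))  # vulnerable
print(classify_response(after))   # patched
```

Repeating such a request on a schedule, as the NCSC describes, is what turns a single check into an over-time picture of exposure and remediation.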
1:30:56
What information does the NCSC
1:30:58
collect and store? We
1:31:01
collect and store
1:31:04
any data that a service
1:31:06
returns in response to a request. For web servers, this includes the full
1:31:09
HTTP response
1:31:12
including headers, to
1:31:14
a valid HTTP request. For other services, this includes data that is sent by the server
1:31:16
immediately after a
1:31:19
connection has been established or
1:31:23
like, you know, the SMTP headers, for example,
1:31:25
or a valid protocol handshake
1:31:27
has been completed. We
1:31:30
also record other useful information for each
1:31:32
request and response, such as the time
1:31:34
and date of the request, and
1:31:36
the IP addresses of the
1:31:39
source and destination endpoint. We design
1:31:41
our requests to collect the smallest amount of
1:31:43
technical information required to validate
1:31:46
the presence slash version and
1:31:50
or vulnerability of a piece
1:31:52
of software. We also determine
1:31:55
I'm sorry -- we also design
1:31:57
requests to limit the amount of
1:31:59
personal data within the response. In the unlikely event we do discover information
1:32:02
that is personal or otherwise
1:32:04
sensitive, we
1:32:07
take steps to remove the data and prevent it
1:32:09
from being captured again in
1:32:11
the future. Question.
1:32:15
How can I attribute activity on
1:32:18
my systems to NCSC
1:32:20
scanning? They
1:32:23
answer: all activities are performed on a
1:32:26
schedule using standard and
1:32:28
freely available
1:32:31
network tools running within a dedicated cloud hosted environment.
1:32:33
All connections are made
1:32:36
using one of
1:32:39
two IP addresses: 18 dot
1:32:42
171 dot 7
1:32:45
dot 246,
1:32:49
or
1:32:50
thirty five dot 177 dot 10 dot 231.
1:32:52
And
1:32:56
they said, note that these IP
1:32:58
addresses are also both assigned to
1:33:04
scanner dot scanning dot service dot NCSC
1:33:07
dot gov dot u k, with
1:33:11
both forward and reverse DNS
1:33:13
records. So that's very
1:33:16
cool. That means you
1:33:18
could do you know, a a DNS lookup
1:33:20
on scanner dot scanning dot service
1:33:22
dot NCSC dot
1:33:25
gov dot u k, and it
1:33:27
would return those two IPs or if you
1:33:29
did a a reverse lookup on either of those IPs, that's the DNS
1:33:31
that you would get to know what that
1:33:35
was. They said scan probes will also attempt to
1:33:37
identify themselves as having
1:33:39
originated from NCSC
1:33:43
where possible. For example, by including the following
1:33:45
header within all HTTP
1:33:48
requests, and the header
1:33:50
is x hyphen NCSC,
1:33:52
colon, NCSC scanning agent,
1:33:55
and then they provide a
1:33:57
URL to the page that
1:33:59
I've been sharing, so
1:34:01
people can find out what's
1:34:03
going on -- what that's about. Question: what precautions or safety measures does the NCSC
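Putting the pieces of that attribution answer together, here's a hypothetical Python sketch of how a server operator might decide whether a probe came from the NCSC: check for the X-NCSC request header, then do a forward-confirmed reverse DNS check against the published scanner hostname. The resolver functions are injectable so the logic can be exercised without network access; the stub values in the usage example mirror the IPs and hostname quoted in the transcript.

```python
import socket

# Published hostname for the NCSC's scan probes (from their information page).
NCSC_HOST = "scanner.scanning.service.ncsc.gov.uk"

def is_ncsc_probe(source_ip, headers,
                  reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                  forward_lookup=lambda host: socket.gethostbyname_ex(host)[2]):
    """Return True if a request plausibly came from an NCSC scan probe."""
    if "X-NCSC" not in headers:
        return False
    try:
        # Reverse DNS: does the source IP claim to be the NCSC scanner host?
        if reverse_lookup(source_ip) != NCSC_HOST:
            return False
        # Forward confirmation: does that host resolve back to the same IP?
        return source_ip in forward_lookup(NCSC_HOST)
    except OSError:
        return False

# Usage with stub resolvers, so no network is needed:
rev = lambda ip: NCSC_HOST
fwd = lambda host: ["18.171.7.246", "35.177.10.231"]
print(is_ncsc_probe("18.171.7.246", {"X-NCSC": "NCSC Scanning agent"}, rev, fwd))  # True
print(is_ncsc_probe("203.0.113.9", {}, rev, fwd))                                  # False
```

The forward-confirmation step matters: anyone can set a reverse DNS record claiming to be the scanner, but only the real operator controls what the published hostname resolves to.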
1:34:07
take when scanning? The
1:34:10
answer: the NCSC is committed to conducting scanning activities in a safe and responsible manner.
1:34:16
As such, all our probes
1:34:18
are verified by a senior technical professional and tested
1:34:20
in our own
1:34:23
environment before use. We also
1:34:25
limit how often we run scans to ensure we don't risk disrupting the
1:34:27
normal operation of
1:34:31
systems. And finally, can
1:34:34
I opt out of having servers that I own or maintain being scanned?
1:34:40
Answer: yes. Please
1:34:42
contact scanning at ncsc dot gov dot u k with
1:34:44
a list of
1:34:47
IP addresses that you wish
1:34:50
to exclude from any future
1:34:52
scan activity, and we will endeavor to
1:34:55
remove them as soon as
1:34:57
possible once
1:34:58
validated. So as
1:34:59
I said, sign me up as
1:35:02
a fan of this
1:35:04
concept. Given the sad
1:35:06
and sorry state of so
1:35:08
much consumer
1:35:10
crap. And unfortunately, the patch latency
1:35:13
of so
1:35:16
many enterprises all of
1:35:17
which are hung out on
1:35:18
the Internet to be attacked. I think this makes
1:35:20
a, you know, a
1:35:23
huge amount of sense. I
1:35:26
mean, it's not like we're not
1:35:28
all being scanned all over the place
1:35:30
all the time anyway. I mean, you know,
1:35:33
I referred to it. It
1:35:35
was one of the first acronyms that
1:35:37
I or abbreviations that I coined, and that
1:35:40
was IBR, because,
1:35:42
you know, I started getting
1:35:44
involved in Internet security, and I thought,
1:35:46
what is all this packet noise? And
1:35:49
so it's Internet background radiation. It's just random crap out on the
1:35:51
Internet that hits all of our IPs, you know,
1:35:54
from time to time.
1:35:56
So
1:35:59
I think it would be great if the if the
1:36:01
US could take up
1:36:03
similar responsibility and do
1:36:05
something like this. Or maybe defer to
1:36:08
individual ISPs to, like,
1:36:10
police the the traffic
1:36:12
on their own networks.
1:36:15
and inform their customers. Well,
1:36:17
this was, you know, this was the
1:36:19
big argument some years ago, when
1:36:22
spam -- well, still a problem -- but when
1:36:24
it was really a problem,
1:36:26
all an ISP would have
1:36:29
to do is block port twenty
1:36:31
five, the SMTP port, and they would effectively
1:36:33
kill spammers on their network. And for a long time, companies
1:36:35
like Comcast, the biggest ISP in the
1:36:37
in the US, wouldn't
1:36:40
do it, because they were afraid of the huge
1:36:42
cost of tech support calls from people saying, well, look, I can't send my email anymore.
1:36:44
And they eventually
1:36:47
did do it. So ISPs,
1:36:49
we've talked about this before. ISPs could, without doing the
1:36:51
scanning that the British are
1:36:54
doing, do a lot to
1:36:56
police
1:36:58
outbound traffic from their networks. Yep. And they
1:37:01
because they don't have to, they haven't done
1:37:03
it. Yeah. They they have not
1:37:05
been made to do it. Yeah. Yep.
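The port blocking Leo describes is easy to check from the client side. Here's a small, hypothetical Python probe (the mail server hostname in the comment is a placeholder) that reports whether an outbound TCP connection to a given port succeeds; a timeout or refusal suggests the port is filtered somewhere along the path.

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within
    the timeout; False on refusal, timeout, or any other socket error."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (replace the host with your own mail server to test whether
# your ISP blocks outbound port 25):
# print(can_connect("mail.example.com", 25))
```

A False result from a host you know is listening on port 25 is a strong hint that your ISP, like Cox in Steve's case, is filtering outbound SMTP.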
1:37:07
And I think that the port twenty five blocking was also self
1:37:09
interest because they were getting complaints --
1:37:12
Right. -- from, you
1:37:14
know, a lot, like, that
1:37:16
their network was sending all the spam. Right? And
1:37:18
it was, yeah, it was a customer in their network. Cox,
1:37:21
my
1:37:24
cable provider, blocks
1:37:26
port twenty five. So I have to work around that in order to contact my server at
1:37:28
GRC, but, you know, it's
1:37:30
something has to be done. Just
1:37:35
a quick note about Twitter, since I'm
1:37:38
about to share two
1:37:40
listener feedback
1:37:43
tweets, as my followers probably know, I have
1:37:45
the blue verified check
1:37:47
mark seal. And like
1:37:49
so many others who
1:37:52
have commented, I'm not going to
1:37:54
be paying anything for it. I don't need any advanced features.
1:37:56
I'm not paying
1:37:59
anything for it now. and
1:38:01
I'm certainly not gonna be paying a
1:38:03
hundred dollars per year to keep it. Well, it would also devalue it, because anybody who pays eight bucks, regardless,
1:38:07
would get it. So
1:38:09
it no longer verifies that you are who you say you are; it only means you paid eight bucks.
1:38:12
So it's completely devalued. It's
1:38:14
not -- it doesn't mean verified
1:38:16
anymore. Yeah.
1:38:19
So if it's taken away, I'll still be me. I'm not
1:38:21
paying either. In fact, I got off Twitter. I'm
1:38:23
done with that. I did note
1:38:25
one thing in passing, which I
1:38:27
thought was interesting. The Twitter
1:38:29
alternative Mastodon reported that it had recently
1:38:32
reached not surprisingly
1:38:34
an all time high of
1:38:38
six hundred and fifty
1:38:40
five thousand active users after
1:38:42
an influx of -- get this -- two
1:38:46
hundred and thirty thousand new users just last week alone.
1:38:49
It's up to a
1:38:52
million now. Wow. And
1:38:54
I -- oh, our server has a seven thousand percent increase in users,
1:38:56
a two thousand percent increase
1:38:58
in interactions. You should join. Can
1:39:03
I put a plug in for twit dot social? I would love to have
1:39:05
you. We even have, you know, on twit
1:39:07
dot social, we have
1:39:09
a custom icon that's
1:39:11
your head. So I think you need to --
1:39:14
well, I shouldn't know. All I really do with with my
1:39:16
Twitter account is tweet the
1:39:18
link every and you You
1:39:22
don't have to give up Twitter
1:39:24
to do that, but I suspect if you joined Twitter's
1:39:26
social, you would probably get in some very interesting conversations
1:39:28
because people
1:39:31
who listen to our show, or many of them, are
1:39:33
there. And the thing to understand about
1:39:35
Mastodon is, you know, it's
1:39:38
federated. So it's like I'm
1:39:40
running a server, but you can follow, and people
1:39:42
can follow you, from all over the fediverse. Right? If
1:39:44
you were, and I will
1:39:47
give this to you, at Steve
1:39:49
at twit dot social, everybody would know to follow you. Or if you wanna be
1:39:51
SGGRC, whatever you wanna be,
1:39:55
you can be. Well,
1:39:57
I shouldn't. I don't wanna get engaged in conversation. That's not what
1:39:59
You don't have to. It
1:40:02
doesn't require
1:40:04
it. It's
1:40:07
up to you. I'm not gonna push you into it,
1:40:09
obviously. In fact, one of the great things about
1:40:11
Mastodon, I'm a little reluctant
1:40:13
to promote that we do this because I don't want a whole
1:40:16
influx of Twitter people in
1:40:18
here. I want people who,
1:40:21
you know, are nice people. Well, the good
1:40:23
news is, Leo, the only people who are hearing this are the
1:40:26
pretty nice people you want. Yes. Nice
1:40:28
people. That's a very good way
1:40:30
of putting it. Yes. It's a safe space
1:40:32
here. Yeah. And and that's
1:40:34
how I feel about GRC's newsgroups. It's just a fabulous place where I'm able to get real
1:40:36
work done. Yeah. I should
1:40:38
mention that I will be finally
1:40:43
firing up a mailing list. Finally,
1:40:45
I have to do it in
1:40:47
order to announce SpinRite six one
1:40:49
to all of SpinRite six o's
1:40:51
owners, right? So that
1:40:53
has to happen. And I'll create a number of different, you know,
1:40:55
sublists and so
1:41:00
forth. And I'm thinking, as Twitter
1:41:02
becomes sort of an uncertain deal, and, frankly, there
1:41:04
are an awful lot of
1:41:06
our listeners who, you
1:41:08
know, have always refused
1:41:10
to be on Twitter. So probably one of the
1:41:14
things that I'll do once I get the mailing system running
1:41:17
is to just send out a
1:41:19
short note every week,
1:41:21
you know, containing the show notes link. Oh,
1:41:24
that's a great idea. Yeah. Yeah.
1:41:26
That's a great idea. That way,
1:41:28
everyone will be able to get it. So
1:41:30
Okay. Closing the loop. Two
1:41:33
bits of feedback as
1:41:35
I said. I wanted
1:41:37
to note that it was fun to receive all of
1:41:39
the feedback from my discussion of my
1:41:42
preferred keyboards last week. Leo, not surprisingly,
1:41:44
lots of people had opinions
1:41:46
about keyboards. There's lots of discussion going on in various places now. So
1:41:48
it turns out that I'm
1:41:50
far from the only one who
1:41:53
cares passionately about, you know, basically, the way their primary device
1:41:56
feels under
1:41:59
their fingers. David
1:42:02
Stryker said, this week
1:42:05
you talked about alt
1:42:07
tab acting as MRU,
1:42:09
right, most recently
1:42:11
used, but
1:42:13
control tab as round robin. Firefox has
1:42:16
an option to
1:42:19
set control tab to
1:42:22
act in MRU and it's one of the reasons I use it over Chromium based browsers.
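(For readers of the transcript: the MRU-versus-round-robin distinction David is describing can be modeled in a few lines. This is a toy sketch, not Firefox's or Chrome's actual code; the class and tab names are made up.)

```python
class TabSwitcher:
    """Toy model contrasting the two Ctrl+Tab behaviors discussed:
    round-robin walks tabs in their fixed order, while MRU jumps to
    the most recently used tab first."""

    def __init__(self, tabs):
        self.tabs = list(tabs)   # fixed (round-robin) order
        self.mru = list(tabs)    # most-recently-used first
        self.current = tabs[0]

    def activate(self, tab):
        # Every activation moves the tab to the front of the MRU list.
        self.current = tab
        self.mru.remove(tab)
        self.mru.insert(0, tab)

    def next_round_robin(self):
        # Ctrl+Tab as round-robin: the tab after the current one in fixed order.
        i = self.tabs.index(self.current)
        return self.tabs[(i + 1) % len(self.tabs)]

    def next_mru(self):
        # Ctrl+Tab as MRU: the most recently used tab other than the current one.
        return self.mru[1]
```

With tabs A, B, C, D and C active (having come from A), round-robin Ctrl+Tab lands on D, while MRU Ctrl+Tab bounces back to A, which is why MRU is so handy for flipping between two tabs.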
1:42:24
He said, I opened
1:42:27
a bug with Chrome to
1:42:30
allow MRU, and their response
1:42:32
was simply, won't fix. So
1:42:35
he said, FF,
1:42:37
FTW. So
1:42:39
anyway, I just wanted to share with our listeners something I never
1:42:41
knew, which is that there was an
1:42:43
option in Firefox that would allow
1:42:45
you to change the behavior
1:42:47
of control tab so that
1:42:50
it is not round robin, but MRU. And I would
1:42:52
think you'll find that
1:42:56
much preferable. PC owner, he
1:42:58
said, Steve, what is the best commercial
1:43:02
cloud storage,
1:43:05
secure, encrypted, question mark. Okay. Well, I know that there are
1:43:07
many choices, but I did
1:43:10
wanna mention, I am still,
1:43:15
just to renew my endorsement, a fan
1:43:17
of sync dot com, who I
1:43:19
haven't talked about in
1:43:21
a while, I've set up
1:43:23
sync to completely manage the file
1:43:25
synchronization between my two
1:43:28
locations and it has
1:43:30
never failed me. It's completely TNO, you
1:43:32
know, trust no one end to
1:43:34
end encrypted. It has apps for
1:43:38
iOS and Android, of course, runs under Windows and
1:43:41
Mac, presents a sync
1:43:44
directory under Windows
1:43:46
and Mac, and allows
1:43:48
for managed public link sharing despite
1:43:50
the fact that it's end to end encrypted. So it has all the features
1:43:52
that you would expect
1:43:54
from a mature, secure encrypted
1:43:59
commercial cloud storage provider. What
1:44:01
I did was to
1:44:04
move a bunch of
1:44:06
sub directories that already existed on my
1:44:09
system under
1:44:12
sync's
1:44:16
automatically synchronizing sync directory. So for example,
1:44:18
I have c colon backslash s m where
1:44:20
all of my
1:44:23
assembly code work lives. So
1:44:26
I moved that entire directory
1:44:29
under the new sync
1:44:32
directory, then I used
1:44:35
Windows' command, make link,
1:44:37
MKLINK,
1:44:39
which creates what's known as
1:44:42
a junction point, you know,
1:44:44
Linux refers to them as
1:44:46
symbolic links or hard links. This creates a junction point where the
1:44:52
relocated directory used
1:44:54
to be. At c colon backslash
1:44:57
s m, this puts
1:44:59
a link there so
1:45:01
that all of the existing
1:45:04
automation and batch files and,
1:45:06
you know, everything that I
1:45:08
have that expects my assembly
1:45:10
language stuff to be at c colon backslash s m,
1:45:12
it's still there. As far
1:45:14
as it's concerned, although it's
1:45:17
actually under the sync directory, and
1:45:19
now automatically synchronized between my multiple
1:45:21
locations and available, you know, wherever
1:45:23
I am. The
1:45:26
only feature missing, and they are painfully aware of it,
1:45:28
is Linux client support.
1:45:30
But I expect that
1:45:33
their evaluation of the
1:45:35
market for Linux, I understand it's
1:45:38
a skewed demographic here in this podcast audience, but
1:45:41
Windows and
1:45:44
Mac have such a high percentage of
1:45:46
the total desktop share that they don't seem to
1:45:48
be making much headway
1:45:51
on a Linux client. No.
1:45:53
because this
1:45:54
has been going on for years. Yes. I mean, I did wanna mention, without question, for
1:46:00
me, the best feature,
1:46:02
which I have used many times, is that everything that is synchronized
1:46:08
has full incremental versioning behind it
1:46:10
without the user ever needing to do anything.
1:46:13
Boy, is that
1:46:16
a win, and it has saved my
1:46:18
bacon a couple times. I was once doing file versioning myself locally,
1:46:20
but now
1:46:23
it's just all built into the system that I'm using to
1:46:25
synchronize my locations,
1:46:27
and it's great. They
1:46:31
have multiple plans, including a free
1:46:33
five gigabyte plan that you
1:46:35
can use to get your
1:46:37
feet wet and you can bump
1:46:40
that as I mentioned before when I
1:46:42
talked about sync, to a free six
1:46:44
gigabytes, if you go
1:46:46
to sync dot com but use my affiliate code. Actually,
1:46:48
you could just go there in one jump.
1:46:50
It's GRC dot s c
1:46:53
slash SYNC.
1:46:57
GRC dot s c slash SYNC. And that'll give you an
1:46:59
extra one gig and I get one added to
1:47:02
my account too. So anyway,
1:47:04
still bullish about sync. Again, I know
1:47:06
that whenever I mention this, I get like fifteen people all
1:47:11
with different cloud sync providers. So I get
1:47:13
it that there are alternatives, but, you know, this is the one that I can vouch for.
1:47:16
And as I said,
1:47:18
I use it
1:47:21
every day and it's never let me down.
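(Transcript aside: the relocate-and-link trick Steve described, moving a directory under the synced folder and then leaving a junction at its old path with MKLINK, can be sketched portably. The Python below is a hypothetical illustration that uses os.symlink as a stand-in for a Windows junction; on Windows the equivalent would be something like mklink /J c:\sm followed by the new location under the sync directory.)

```python
import os
import shutil

def relocate_and_link(old_path, sync_root):
    """Move a directory under the sync root, then leave a link at its old
    location so every tool that expects the old path keeps working.
    (On Windows, `mklink /J old_path new_path` creates a junction instead.)"""
    new_path = os.path.join(sync_root, os.path.basename(old_path))
    shutil.move(old_path, new_path)   # relocate the real directory
    os.symlink(new_path, old_path)    # the old path now points at the new home
    return new_path
```

The payoff is exactly what Steve describes: batch files and automation keep using the old path, while the real data lives under the sync directory and gets replicated everywhere.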
1:47:23
And lastly, oh, boy, this is getting
1:47:26
exciting. A quick update on
1:47:27
where I am,
1:47:28
and
1:47:30
what I'm doing when I'm not doing this podcast.
1:47:33
I finished all of
1:47:35
SpinRite's data recovery
1:47:40
driver testing. All of it is working. The oldest
1:47:42
drivers for BIOS interface drives ended up needing a bunch of updating.
1:47:45
That's all finished
1:47:48
and tested. As the final piece
1:47:50
of work, I turned my attention to Spinrite's command line interface
1:47:52
and its built
1:47:55
in command line help. I
1:47:57
updated everything in the online help
1:47:59
with the new design. The redesign of the
1:48:01
way it's gonna work
1:48:03
is finished. So
1:48:06
the help guide is
1:48:08
updated to reflect that. Now I'm
1:48:10
in the midst of rewriting much
1:48:13
of SpinRite's command line
1:48:15
processor to make it, well,
1:48:16
to
1:48:17
bring it up to speed with all of
1:48:19
the other changes that
1:48:22
SpinRite has undergone.
1:48:25
In the process of
1:48:27
that, there's a list command, which causes SpinRite
1:48:33
to exit immediately after discovering and characterizing all
1:48:35
of the system's mass storage devices which
1:48:38
are accessible to it.
1:48:41
It dumps that list in
1:48:43
tabular ASCII text to the DOS console. For
1:48:46
this new SpinRite, we also
1:48:49
need a way of selecting drives through the command line.
1:48:51
I could have just used the old way of indicating
1:48:53
which line item in the
1:48:55
listed table we wanted. But
1:49:00
SpinRite power users use the
1:49:02
command line to automate SpinRite,
1:49:04
and the ordering of
1:49:06
drives could change over time if
1:49:08
a drive was unplugged or it went offline or
1:49:10
if a new drive was plugged into a lower numbered
1:49:13
port, which would then
1:49:15
get enumerated sooner and appear
1:49:18
earlier in the table. So a much more robust way of selecting drives is to allow a
1:49:20
text match on any
1:49:23
field in the table. Since
1:49:27
that includes the drive's model number
1:49:29
and its serial number, it'll be
1:49:31
possible to positively lock
1:49:34
selections to specific drives. It'll be possible to select
1:49:36
multiple drives by class. For example,
1:49:38
since one of the table's columns
1:49:43
is type, it'll be possible to give SpinRite the command type
1:49:45
AHCI, which will cause
1:49:48
SpinRite to
1:49:50
pre-select all of the system's AHCI drives, but
1:49:53
none others. So that's
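(Transcript aside: the field-matching selection scheme Steve just described can be sketched as follows. This is a hypothetical model, not SpinRite's actual code; the table columns and drive entries are invented for illustration.)

```python
# A made-up discovered-drives table, standing in for what the list
# command would dump: one row per drive, one column per field.
DRIVES = [
    {"item": 1, "type": "AHCI", "model": "Samsung 860 EVO", "serial": "S3Z8NB0K"},
    {"item": 2, "type": "AHCI", "model": "WDC WD40EFRX",    "serial": "WCC4E1234"},
    {"item": 3, "type": "NVME", "model": "Samsung 970 PRO", "serial": "S462NF0M"},
]

def select_drives(token, drives=DRIVES):
    """Select every drive where any field matches the token,
    case-insensitively. Matching on a serial number locks the selection
    to one physical drive no matter how enumeration order changes between
    boots; matching on the type column selects a whole class at once."""
    token = token.lower()
    return [d for d in drives
            if any(token in str(v).lower() for v in d.values())]
```

Here select_drives("AHCI") picks items 1 and 2 but no others, while a serial-number token pins exactly one drive, which is the robustness argument over plain line-item numbers.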
1:49:55
where
1:49:55
I stopped working
1:49:56
Sunday evening to
1:49:59
put the podcast together. Tonight?
1:50:01
Oh, well, probably not tonight, because this is
1:50:03
election night. So I will be enthralled. But
1:50:08
tomorrow morning, first thing in the morning,
1:50:10
I'll probably still have the election on in the background, but I'll be working on
1:50:13
SpinRite, getting
1:50:16
that finished and tested and
1:50:18
then out into the hands of our group. So anyway, as I
1:50:20
said to Laurie during
1:50:22
our walk yesterday, it's getting
1:50:26
exciting. And I think we're up
1:50:28
to four hundred and six registered testers in
1:50:30
our GitLab instance. So we'll have a lot
1:50:32
of people pounding on it, and we
1:50:34
will move it as quickly as
1:50:36
possible from alpha into beta, at which point I
1:50:38
should be able to make it available widely.
1:50:42
Yay. Is
1:50:46
that it?
1:50:52
That's it. There was
1:50:54
literally something for everyone. I was waiting for you to talk about the guy who had a billion dollars in
1:50:56
crypto in
1:50:59
his coffee can in his backyard. Did you
1:51:02
see that story? I missed it.
1:51:05
He had
1:51:08
stolen it. Let me see if
1:51:10
I can find the details. Oh, I did. I didn't realize it was stolen. I did see something about
1:51:12
someone who stole a bunch of crypto.
1:51:14
Yeah. He stole a bunch of crypto
1:51:19
so. And he put it
1:51:21
on a little, you know, board
1:51:24
because, you know,
1:51:26
it strikes me you could just
1:51:28
write down the number of your wallet. You
1:51:30
don't need to, you know. Yes. Yes, you
1:51:32
could. But for some reason he
1:51:34
decided to put it
1:51:36
on a board, and maybe he wasn't
1:51:38
that sophisticated. Anyway, he had a,
1:51:41
yeah, a billion dollars worth
1:51:43
of Bitcoin. And it
1:51:45
wasn't a coffee can, or it was
1:51:47
it was hidden. And he
1:51:49
got found. Oh, yeah.
1:51:52
Oh, yeah. He got caught, and I
1:51:54
think he's been arrested. Yeah. Anyway, I don't have the
1:51:56
we'll probably talk about it on TWiT
1:51:59
this Sunday because it's just a
1:52:02
great story. Yeah. Mister g, if you like what you hear here,
1:52:04
you gotta check out
1:52:06
his website, GRC
1:52:11
dot com. Yes. SpinRite's there, the world's finest
1:52:13
mass storage maintenance and recovery
1:52:16
utility. Six point o is the
1:52:18
current version; six point one, as you
1:52:20
heard, is, like, just around the
1:52:22
corner. You'll get it for free if you buy six o now; you get an automatic upgrade. So it's worth doing that. You
1:52:24
will wanna have this software. If
1:52:27
you have a hard drive, or
1:52:31
an SSD, you gotta have SpinRite. While
1:52:33
you're there, check out the show.
1:52:35
Steve has two unique versions
1:52:37
of the show, a sixteen
1:52:39
kilobit audio version, and a transcript written by
1:52:41
an actual human, so they're actually
1:52:44
legible. And you can use those to
1:52:46
search or just read along as you're
1:52:48
listening. He also
1:52:50
has sixty four kilobit audio. GRC
1:52:52
dot com. You can leave him comments there. As
1:52:54
you heard, he doesn't really wanna talk to you.
1:52:57
But if you wanna leave a comment, go to
1:52:59
GRC dot com slash comments.
1:53:02
I'm sorry. Feedback.
1:53:04
Yeah. I don't blame you. I
1:53:06
never read replies either.
1:53:08
Or you can go to Twitter, SGGRC
1:53:11
Sure. Can't I just sign you up, Steve,
1:53:13
at twit dot social? It'll be
1:53:15
so much easier. I
1:53:18
I do reply to DM's. I I try to you know, I mean, I'm I'm I'm I'm present, but, you know,
1:53:20
extended conversations, everyone
1:53:23
would rather have spin
1:53:27
right than than me. So Yes. Get
1:53:29
to work. We have sixty
1:53:31
four kilobit audio. We have video,
1:53:33
too, at our website, twit dot
1:53:35
tv slash s n, and there's a YouTube
1:53:37
channel. You could subscribe in your favorite podcast client as well and get it automatically, the minute it's
1:53:40
available. Some people like to
1:53:42
watch live, like get the very
1:53:44
freshest, hot
1:53:46
off the podcast griddle version.
1:53:48
We do the show Tuesdays.
1:53:50
The time varies depending on how
1:53:53
long MacBreak Weekly goes; somewhere
1:53:55
one thirty to two PM Pacific is what we're
1:53:57
shooting for, five PM eastern, twenty
1:53:59
two hundred UTC. live
1:54:02
dot twit dot tv is the
1:54:04
stream. There's audio and video streams there. It's a nice thing
1:54:06
to have in the background while you're working or whatever.
1:54:08
And if you're doing that, you might as well
1:54:11
chat with us at irc dot twit dot tv. Club members
1:54:13
can also chat in
1:54:15
the discord. And I guess, you
1:54:17
know what? You could also comment on
1:54:19
twit dot social there. Steve won't see
1:54:21
it, but I will. Or on our
1:54:24
discourse, our forums
1:54:26
at twit dot community. So there's quite a few ways to
1:54:28
interact either synchronously or asynchronously
1:54:30
with me and other listeners.
1:54:34
Don't expect Steve to get involved. He's got something better to
1:54:36
do, more important. Steve,
1:54:39
I hope you
1:54:41
have a calm
1:54:44
relaxing night, and we will see
1:54:46
you next Tuesday on Security Now. Bye.
1:54:49
Don't miss All
1:54:52
About Android. Every week, we talk about
1:54:54
the latest news, hardware, apps, and now all the developer-y goodness happening
1:54:57
in the
1:55:00
Android ecosystem. I'm Jason Howell, also joined
1:55:02
by Ron Richards, Florence Ion, and our newest cohost on the panel, Huyen Tu Dao,
1:55:05
who brings her
1:55:08
developer chops. Really great stuff. We also
1:55:10
invite people from all over the Android ecosystem to talk about this mobile platform we
1:55:12
love so much.
1:55:15
Join us every Tuesday, All
1:55:18
About
1:55:20
Android, on
1:55:22
twit dot tv. Security
1:55:26
now.