Talkin’ About Infosec News – 11/30/2023


Released Wednesday, 29th November 2023

Episode Transcript


0:01

All right, we're on the air. All right,

0:03

so let's stop talking about what we're just now. We're on

0:05

the water. We're on the water. We're

0:07

on the tubes. I

0:09

think the firmware update fixed the camera. So

0:11

it actually kind of looks like it did.

0:14

Yeah. Yeah. Good job. Wow. And

0:16

update that actually worked well in

0:18

the fence. Oh, great.

0:24

Now I got to install Ralph's firmware. Oh,

0:27

I was like four major versions

0:30

behind. So I think it was

0:32

it definitely needed to be done.

0:34

Whoa, you did your upgrade and

0:36

it didn't break anything in the

0:38

process. Although it was

0:41

a wonderful and actually fixed. Yeah. Well,

0:43

I'd already tried three different USB ports. I'm just going to

0:45

rock with the glitches. I kind of like them. I feel

0:47

like they're a feature. Yeah, they're feature.

0:50

Not a bug. No, undocumented

0:52

features. That's an undocumented

0:54

feature, right? They're added bonuses. Yeah,

0:57

I mean, documentation is

0:59

overrated. I'm telling you, Corey, I'm going to get

1:01

a huge screen behind me so that I can

1:03

like put up other things, right? You know, you

1:05

have like little lights. I'm just going to get

1:07

a TV. I think if you're going to do

1:09

anything, you got to go green screen and then

1:12

let Ryan Ryan just set your background to whatever

1:14

he wants. Yeah, just get a little macro pad

1:16

that Ryan can control. Yeah, no, I'm going to

1:18

do like put other stuff into our better. Let's

1:20

have it so the Discord can change it. Oh,

1:22

yeah, I get so like people can vote. No,

1:26

no, no, no, no. I'm not saying they can change it to

1:28

whatever image they want. I'm saying they can choose different

1:30

modes. They can go like Rolf's in

1:32

a gator swamp. We're off to the

1:34

data center. Yeah, yeah, yeah. Ralph's

1:37

running a marathon. Doesn't

1:39

go off the rails enough. Exactly.

1:42

Yeah, like we need more. All

1:44

right. Are we ready to take this thing off? Oh,

1:47

we were born ready. Let's do it. Last

1:49

night. All right. Hello

2:10

and welcome to another edition of Black Hills

2:12

Information Security talking about news. I'm your host,

2:14

John Strand. In this particular edition, we're going

2:16

to be talking about all kinds of very

2:18

weirdly named things. General

2:20

Electric is investigating a breach. We got a

2:23

phishing attack. It's now using AI. We've

2:25

got fidelity. Financial

2:28

mortgage network is compromised. This has got a

2:30

new exploit of privilege escalation on Linux. We

2:32

have tons and tons of things. But

2:35

today I am joined once again by

2:37

an illustrious cast of InfoSec professionals. Otherwise,

2:39

we have Ryan once again always making

2:41

us look good and sound good. Ryan,

2:44

how are you doing, sir? Doing

2:46

pretty good. All right. Good

2:48

to hear. We have Kelly is with

2:50

us today. Yay. Welcome back, Kelly.

2:52

We missed you. I missed everybody

2:54

else too, John. Good to see you as well. It

2:57

sounds like some people have work to do or something.

2:59

It's weird. We have Mike is

3:01

with us as very regularly. Mike, how are you

3:03

doing, sir? Doing well. I

3:06

hope you and your family had a good holiday

3:08

weekend. Did. It only

3:10

involved getting stuck in airports overnight twice.

3:12

I consider that a win. We

3:16

have Florida Man, by the way, Raul. Just

3:19

so you know, we have a story today that's

3:21

not a Florida Man. It's a Georgia Man story.

3:25

That's literally a disappointment. Honestly, we could

3:27

try harder. We can't. You

3:29

can. Florida's got to raise their

3:31

game. Speaking of the game, Batman will is with

3:33

us, sir. How's the watch working out for you?

3:35

Oh, it's it's definitely gave me a lot to

3:38

talk about with the doctor today when I went.

3:40

It does. It does. Welcome to the

3:42

club. And I'm sorry. I'm sorry. Watch

3:44

is give you shit. But if you don't work out, you're like, you need to

3:46

go do something. You know what? I need it though. I

3:49

had it. So awesome. Oh,

3:52

somebody just said it's hard to recognize

3:54

Raul from this angle. Yeah, I fully

3:56

agree. I

4:00

I felt the exact same way then I

4:02

thought I said this same thing. Yeah,

4:04

I'm in anti streamer mode There

4:07

you go. And then speaking of angles and

4:09

cute puppies in the background Bronwyn is with

4:11

us Hey Bronwyn see the puppies there say

4:14

hi to the puppy for us Absolutely,

4:17

he's my little man Louis

4:20

and then we have what is

4:22

it the coffee that coffee guy

4:24

Cory is with us as usual

4:27

All right, everybody. I'm just gonna grab a story

4:29

at random. Is that okay? Oh, here we

4:31

go That's the whole show.

4:33

Yeah Oh,

4:38

by the way folks I have to bounce a little early so if

4:40

I leave Cory

4:42

approved Approved

4:46

hey, that's my job. I

4:48

saw your line. I saw I stole his line Okay,

4:51

approve. It's one of those words I

4:54

can spell correctly So

4:56

general electric is investigating a claim of

4:58

a cyber attack and data theft. Yeah, this

5:01

in and of itself we had someone that said

5:03

whoa, this general electric breach, that could be bad

5:06

It could be but my

5:08

favorite thing about this particular

5:10

story is the threat actor

5:13

Intel broker attempted to sell

5:15

access to general electrics development

5:17

and software pipelines for $500

5:21

on a hacking forum and

5:23

it didn't sell Well,

5:26

I do like how their profile just their

5:28

their sub line is just the racist. That's

5:31

yet It's

5:35

like you've got to kind of feel bad for

5:37

general electric a little bit like you can see

5:39

the CEO or stuffs for sale online How

5:41

much are they how much are they charging for it? 500

5:44

500 million dollars those bastards I

5:51

Don't know Yes, yes Sir,

5:54

it's lower. It's all lower

5:57

or what 500 Yeah,

6:00

that's it. Not any higher

6:02

than that. No

6:04

one bought it? Yeah. It

6:06

was not on sale yet. It was on sale. Yeah. It

6:09

was on sale. Yeah. As

6:12

someone who spends a lot of time on

6:14

these kinds of websites, no one ever believes

6:16

that anything is real ever. So

6:19

like actually getting... I will

6:21

also say a lot of the times they

6:23

will advertise like a single sale and I

6:25

think those get more... Usually

6:28

from my perspective, it's kind of a standoff where

6:30

it's like the first person that buys it is

6:32

then going to publish it or try to resell

6:34

it. So like once one person buys it and

6:36

distributes it, it's free for me after that. But

6:39

the one tactic I've seen that works is

6:41

if they advertise it as, this is a

6:44

one-time sale. I'm never going to disclose

6:46

this ever again. Then typically

6:48

companies like Mandiant or other entities

6:50

will maybe step in. I mean, who knows, right?

6:52

I don't know. It's not like I work for Mandiant.

6:55

My thought is we talk about

6:57

development pipelines, we talk about breaches. I always

6:59

think of token theft. That's where my brain

7:01

goes. We've seen it multiple times. Someone's

7:04

development keys for GitHub or keys for

7:06

GitLab or something are compromised and then

7:09

someone just pulls down the entire repo.
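The pattern being described here — one leaked personal access token exposing an entire development pipeline — is also why basic secret scanning matters. A rough, illustrative sketch in Python (the patterns are simplified stand-ins for what dedicated scanners such as gitleaks or trufflehog match far more thoroughly; the token prefixes shown are the commonly documented ones):

import os
import re
import sys

# Simplified token patterns; real secret scanners cover many more formats.
TOKEN_PATTERNS = [
    re.compile(r"ghp_[A-Za-z0-9]{36}"),           # classic GitHub personal access token
    re.compile(r"github_pat_[A-Za-z0-9_]{22,}"),  # fine-grained GitHub token
    re.compile(r"glpat-[A-Za-z0-9_-]{20,}"),      # GitLab personal access token
]

def scan(root: str) -> None:
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as handle:
                    text = handle.read()
            except OSError:
                continue
            for pattern in TOKEN_PATTERNS:
                for match in pattern.finditer(text):
                    # Print only a prefix so the report itself doesn't leak the secret.
                    print(f"{path}: possible token {match.group()[:12]}...")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")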

7:11

That's what happened to Twitch a few

7:13

years ago, I think. Yeah. That's

7:16

right. But still, I mean... I think

7:18

it's just a cyber Monday deal because

7:20

my end of the summer cyber Monday

7:22

deals, that's what the attackers were doing.

7:26

Honestly, if it's like the way that you can get

7:28

your GE washer and dryer to not just die after

7:30

three years, I definitely could seek... That would be worth

7:32

$500. I would buy it for that.

7:34

I would buy it for that. For

7:38

me, if I was on these forums, it would

7:40

be like buying domains. For

7:42

a lot of us, we have an addiction

7:44

of buying random domains. Someone's like,

7:46

"$500 for access to... Yeah, I'll

7:48

pay that." It's

7:51

worth a shot. But don't taze me, Doc Bro.

7:54

You kind of get the Doc Bro's, man. They're

7:56

really good. Yeah. General

7:58

Electric next week. hiring entry

8:00

level cybersecurity analyst eight years plus

8:02

experience with AI-leveraged SIEM, 25

8:05

years experience with Office suite. 10

8:07

plus years with no. Nice.

8:13

Nice. Oh my god. It

8:16

hurts so bad because it's

8:18

true. Yeah.

8:22

So I mean, overall, like the

8:24

data that they teased was like,

8:26

they just say DARPA related military

8:28

information, files, SQL files, documents, etc.

8:31

That's all I had to know, etc. I was in. Yeah,

8:34

right. That's $500 straight up. Once

8:37

again, I can still see the CEO

8:39

and CFO being like, no one, no

8:42

one wanted to buy that at all. Like,

8:45

okay, I guess we're sad, you

8:47

know, I'm gonna go home and I'm gonna

8:49

wax my my Porsche for a little while.

8:52

Still,Bel fishes already rolled back. Freedom

8:57

check. Phantomile all over

8:59

the world. I'm scared rip. Here

9:04

we got transphobic Lund here.

9:06

Ian. You already at APT. Ian

9:09

Within Atrain, he's

9:11

a wellRecordedwhatever. I

9:15

can say is from determine. Ian

9:19

here. Now batting

9:21

me. It's not me. It's

9:23

not totally happened when he is

9:25

my defects. What? No,

9:27

no, no. Let's see if

9:30

it was Grayson. Grayson.

9:34

Oh, wait. Really? It's

9:36

where the echoes come back on.

9:39

No, it's it. We're

9:42

waiting. I'm sorry, Ian. It's easy.

9:44

It's okay. One, two, three. Not it. Not

9:47

it. I think it's fixed it. What

9:50

does your echo mean? If you're

9:53

wondering how to cause a denial of service on a podcast,

9:55

that's how you do it. Either

9:57

that or you just start dropping pictures of tits. And

10:00

then that's okay. Let's get let's get

10:02

the show off on track off track

10:04

So this let's talk about looney tunables.

10:06

I mean I feel like we have

10:09

to. This thing

10:11

in glibc, I get all excited.

10:14

Tunables, it's such a good name. I feel like

10:16

ten out of ten for the name. Who's got

10:18

me. I don't know who named it But 100%

10:22

absolutely fantastic. That should be trademarked. So it's a

10:24

flawless. Do you still want to keep that ten

10:26

out of ten rating? I

10:28

think we've got to drop it down. Speaking

10:32

of speaking of vendors come to our

10:34

snake oil summit. Qualys,

10:38

yes, the snake oil summit. If you want to know vendors

10:40

and snake oil We'll talk more about it a little

10:42

bit later. But so the

10:44

tunables glibc library is

10:47

designed So you can link

10:49

and then you can adjust malloc

10:51

memory allocation and

10:53

CPU timing so you can actually tune

10:55

the performance of your app

10:57

as it relates to the CPU And

11:00

I've heard it's been deprecated. I don't know if it's

11:02

been deprecated or not. Let me check that I've

11:05

heard that there's been a lot of newer ones that

11:07

are out there It

11:11

says the full set of tunables may vary

11:13

between distributions; the tunables feature allows distributions

11:15

to add their own. For some reason I thought that

11:17

they were working on deprecating this particular one

11:19

for something more efficient but

11:22

but seriously when you look at what this

11:24

has access to Inside the

11:26

kernel like if you list out

11:28

tunables it has access to at

11:31

malloc, which is memory allocation, CPU

11:33

shared cache sizing PT threads or

11:36

pthreads, CPU prefer map, mem tagging —

11:39

it's like literally a who's who

11:41

and what's what of very, very

11:43

sensitive CPU functions. John,

11:45

that's privilege escalation here. Who would

11:47

have thought? John, that's

11:50

really technical. The scary words that you need

11:52

to know from this one is in

11:54

their default configurations. Oh,

11:57

that is the scary phrase. Yeah, that is

12:00

The scary phrase of this one, because it

12:02

sounds like, oh, tunables never heard of it.

12:04

I'm not Richard Stallman. I don't know how tunables

12:06

work, but it doesn't matter because it's exploitable in

12:08

the default configuration. That's the scary part. Wait

12:11

a second. Nothing's ever, ever

12:14

bad in the default configuration. Ever

12:17

what? We're supposed to change configurations?

12:19

Nobody's ever done anything with that. I'm

12:21

going to push back, Mike. I'm going to

12:23

push back because this is a library on

12:26

your Linux system. This isn't like an app

12:28

that you tune. These

12:31

are shared object libraries, glibc.

12:34

You would not tune this. This is not

12:37

a default configuration. You're like, well, you're a

12:39

moron. What do you mean you didn't modify

12:41

your glibc, tunables, environmental

12:44

variables? No. No

12:46

one ever in the history, except for Bill Stearns.

12:48

Bill Stearns. Yeah, it's filmmakers

12:50

with everything. It's gripped to tune. What

12:54

I thought was funny about this attack,

12:56

too, is it's just an environment variable

12:58

that you modify to do the injection

13:00

and then you run the setuid, the

13:02

set privileges. That's it. Yeah.

13:05

Yep. Yeah. Like,

13:07

that's wild. You just run the

13:10

SUID permissions. Dude, this

13:13

is trivial to exploit.
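To make that concrete, a minimal sketch of what "just an environment variable plus a SUID binary" means for Looney Tunables (CVE-2023-4911). This is not a working exploit — the real payload is a carefully malformed tunable string that overflows glibc's parser and is deliberately not reproduced here; the value below is only a placeholder:

import os
import subprocess

# The attacker controls an environment variable that glibc's dynamic loader
# parses before a set-uid binary ever drops privileges.
env = os.environ.copy()
env["GLIBC_TUNABLES"] = "glibc.malloc.mxfast=..."  # placeholder, not a real payload

# Launching any SUID-root binary (su, sudo, passwd, ...) makes ld.so parse the
# variable while it is still privileged; that parsing step is where the bug lives.
subprocess.run(["/usr/bin/su", "--help"], env=env)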

13:16

I personally come from the place

13:18

of like, a lot of the times what

13:20

we see nowadays are vulnerabilities by misconfiguration. Those

13:22

are like, I think, if you're

13:24

in a patch, like, mature company, that's the most

13:27

common vulnerabilities are like, wait, we

13:29

shouldn't just have everyone able to add

13:31

a new guest user to Azure or

13:33

have everyone the ability to create a

13:35

new user or something with every category.

13:38

We're going to make the exact same mistakes in cloud

13:40

computing that we made all the way up. By

13:43

the way, Ben from KC just

13:45

nailed it with a comment that

13:48

said all those people that are running Gen 2 are

13:50

like, good point, Ben. Good

13:55

point. But I think the

13:57

goal is to stay in persistence within the Kubernetes

13:59

environment. or what do you think they're

14:01

really trying to get after here? I don't think it

14:03

matters. Well, it's the same old. It's

14:06

the same old. They sell a web

14:08

shell and they sell it for $500. Yeah,

14:10

I mean, the basic kind of web shell now

14:13

is a root web shell. And

14:15

it's over. We'll talk about the Mirai botnet

14:17

and DOS. We got a different vulnerability

14:19

zero day that

14:21

came from Akamai. But in

14:23

this particular one, it's just a local privilege

14:25

escalation. And I think that there's proof of

14:27

concept code out for it already. I

14:30

think. Oh, yeah, no, it's hot and

14:32

spicy ready for. Yeah, it's ready to go. And

14:35

I think it's just an environmental variable. So

14:37

it's not like, like, I'm trying

14:39

to find the C code to exploit this. Yeah,

14:41

it's not. It's not. G

14:43

libc tunables. So like, this is

14:45

the environment variable, GLIBC_TUNABLES.

14:48

That's the environment variable. You modify

14:50

that. And you can,

14:52

when you launch a binary and

14:54

you can execute code. So I

14:56

mean, the current malware they mentioned

14:58

in the post is specifically targeting

15:00

the PHP testing framework, PHPUnit,

15:02

but there could theoretically be exploits

15:04

in a whole variety of other

15:06

units or other Linux things.

15:09

So big, bad fix. There's also

15:12

a BOD 22-01, which

15:15

everyone knows what that means, obviously. I

15:17

had three of those for dinner. No. So

15:20

let's pretend some of us don't know what that

15:22

is. I got a couple in the Instant Pot. If

15:25

you don't know what that is, that you don't need to

15:27

worry. You're already patched. Basically,

15:31

the BOD is like, I mean, this is

15:33

a CISA thing. So I don't really know,

15:35

but that's a binding, binding

15:37

directive, basically, meaning like you have to fix it

15:40

by the date or else you're, you

15:42

know, you're in double secret probation. Yeah.

15:47

The government, internal government auditors are going to

15:49

come after you. Is the stream, like,

15:51

dying for anyone? Or is it just me?

15:54

Restream's rocking for me. I feel, I'm feeling

15:56

good. I'm getting a lot of static, but you

15:59

know. If it works on me, it must be my

16:01

fault then. Ryan, as the

16:03

driver of this plane, are you flying

16:05

into a cliff or does everything look

16:07

good from your side? No, you're the

16:10

pilot, dude. If

16:12

we have a driver for the plane, we're

16:14

already in a real problem. On

16:18

my screen, the entire interface has been completely

16:20

frozen for like 10 minutes. We don't even

16:23

do titles, right? Oh, man. Let's

16:26

keep rolling. This next one, I don't

16:29

know if this is funny, if this is tragic, if

16:31

this is scary, or if this is stupid. There's a

16:33

lot going on in this SE Media, SE Magazine

16:35

article. It said, fishing

16:38

attacks spike attributed to

16:40

generative AI adoption. And

16:42

it starts with this sentence that is

16:44

just like, seems a bit off. But

16:47

SiliconANGLE reports that phishing attacks have

16:49

increased by 1,265 percent between

16:54

the fourth quarter of 2022 and the third quarter

16:56

of 2023. While

16:59

credential phishing has risen by 967 percent, those

17:01

are huge numbers, right? I

17:08

don't know how they're coming

17:10

to that. But at any rate, the

17:13

reason why they say that they're blowing

17:15

up so much is because generative AI

17:17

is going to be so

17:19

powerful at creating convincing profile

17:21

pictures and impeccable text, not

17:23

inventing the ability to code

17:25

malware. The threat landscape

17:27

is shifting incredibly fast now, as opposed to

17:29

last year or the year before that, with

17:32

the introduction of AI to the name of the game. But

17:35

the good news is that AI can also

17:37

be used to defend against sophisticated attackers. This

17:39

sounds just stupid. But

17:42

at any rate, yes. I think we

17:44

found that this media outlet is just

17:46

AI generated. Yes, it's all on the

17:48

phone. I was just saying, as the

17:50

staff, do we get like bot bot

17:52

2000? Like, I'm not getting that to

17:54

you, but it was so painful I

17:56

had to share. I love that you

17:58

were reading it. like, I

18:01

don't like this.

18:04

This whole article just screams scare tactic

18:07

to me. Also, there's no

18:09

there's no, like by name, it

18:11

just says SC staff, aka the

18:14

AI API key. Right. Even even

18:16

somebody is fairly new to this

18:18

industry and things. Those numbers seem

18:20

outrageous to me, that it

18:23

can increase that much in a year. Well,

18:25

I mean, we should all have gotten at least

18:27

three phishing emails today. Absolutely.

18:31

And I'm only at one. So let's

18:33

talk about the AI thing. Hold

18:38

on. I got you. Just left

18:42

out. So let's

18:44

go through this. We read this article

18:46

says Silicon angle reports that phishing attacks

18:48

have increased by 1265. And Silicon angle

18:51

is not a company.

18:53

It's another website. That

18:56

is a news website. And

18:58

new report released by phishing protection

19:00

companies SlashNext Incorporated. So

19:03

it's literally referencing like

19:05

another article from another

19:07

like internet. I

19:09

SlashNext. Does anyone know them?

19:12

They're forwarding. Next

19:15

is a complete generative AI

19:17

security for email, mobile and

19:19

browser. So basically, okay,

19:21

John, I figured it out. I know why

19:24

the numbers went up because this company didn't

19:26

exist. Dare

19:31

everybody from their competition. So they can

19:33

hurry up and grab it. They

19:35

got reference to dark web stuff in here,

19:37

too. I've definitely made phishing

19:40

sites that look just like this. If you

19:42

go to business GPT net, which is one

19:44

of our phishing sites, it's basically the same.

19:47

They have a

19:49

console. Hey, that looks like

19:52

my watch on display. Oh my god. Really? You're

19:54

getting 30,000 fishes

19:56

on your watch. You really should replace them

19:58

with Different words and

20:00

that's like totally what like the phone

20:03

John the Garmin phone app doesn't it?

20:06

Yes But

20:08

they have no it's okay guys they're trusted by

20:10

their customers it says they're trusted so excited Like

20:16

this is so weird, like this whole article.

20:18

go to slash WP dash login login with

20:20

admin admin we all know how it goes

20:24

Anyway, I like Proxmox. Hello. My name

20:26

is Ship

20:31

no, I am NOT an AI So

20:34

I just thought as you were

20:36

clicking through the article at some point We were

20:38

just gonna dig deep enough until we found a

20:40

Furby that had been hooked up to Twitter It's

20:49

like a Furby skeleton now right sad to

20:51

say that cuz those things are back I'm

20:54

more powerful than ever. Yes, they've

20:56

come back. Yeah, I

20:58

scared It's like it's like Stephen King's

21:00

pet cemetery, but for like electronic pets

21:02

I keep I guarantee you we're gonna

21:04

read an article on here one week

21:07

Oh, probably because when they first came

21:09

out It was a huge thing in

21:11

the government because people kept bringing them

21:13

into their offices like idiots I

21:15

gotta see how much they cost you know, how much is

21:17

a Furbie? They're like under a hundred bucks or something like

21:19

that What a Furby is Oh Wow

21:23

for you for those of you who didn't get

21:25

to experience these fun level It's

21:35

a mixture of a gremlin and My

21:38

boy and and a Teddy Ruxpin

21:40

like and when one of those creepy

21:43

things like those Disney Display people don't

21:45

remember Furby and then reference Teddy Ruxpin

21:50

I went the wrong way

21:52

there You

21:55

fell straight into I draft you In

22:00

this defense Teddy Ruxpin did make a comeback

22:02

a few years ago, so people might understand

22:09

that reference. I'm going to smash it on

22:11

this a bit. There's

22:17

one thing that if anyone leaves here and they

22:19

do like the critical analysis of this article and

22:21

you say, when should I look at this and

22:23

say, this is made of

22:25

snake oil and fake unicorn tears and whatnot. It feels

22:27

like snake oil. It feels like snake oil. It does.

22:30

It's the 1000. Isn't that what we tell people when they're

22:32

looking at phishing? That

22:34

is not a quantifiable number. There's

22:37

no anything to give you any sort of

22:40

reason to believe that they've any sort of

22:42

metric to base it off of. And even

22:44

more than that, I actually

22:46

don't care if there's been a 40 bajillion

22:48

the 11 d4%

22:51

increase in phishing attacks, because that's

22:53

what it says, right? Increase in

22:56

phishing attacks. What is successful? Tell

22:59

me that tell me that generative AI

23:02

is making the phishing attack more successful

23:04

in reaching actions on objective. And now

23:06

I'm concerned. And also I got a

23:08

question. Now there's some firms that I

23:10

trust their data, right? Like Verizon comes

23:12

out with the data breach report. I'm

23:15

going to read it. They've been

23:17

like semi trustworthy actors in the field

23:19

for a long, long, long time. Mandiant

23:22

comes out with a report. I'm probably going to

23:24

trust it. But like

23:26

there's a bunch of different companies, right, that

23:28

have been around. They come out with these

23:31

reports and they seem pretty legit.

23:35

Setting aside some of those large vendors that have been

23:37

around for a really, really, really long time. If

23:39

you're a new vendor and your

23:41

whole thing is selling AI to

23:43

protect email, and you're talking

23:46

about generative AI is creating this huge

23:48

spike and all of this. And

23:51

none of us on this

23:53

show have heard of your

23:55

company. Like immediately legit security

23:57

professionals are probably not going to listen to you. Especially

24:01

because I was plugging around on this website, I

24:03

can't see where their data is coming from. Like

24:05

they're not sharing the raw data at all. It's

24:07

just like Ian said, they're just coming up with

24:10

numbers and throwing it at a wall. So,

24:12

I don't want to throw any previous employers into

24:14

the bus here, but having been involved as the

24:17

source data for articles

24:19

like this before, usually it's one marketing person

24:21

has a really good idea and they talk

24:23

to one person and then someone crunches the

24:25

numbers and then that's the report. Yeah,

24:27

yeah. I

24:30

mean, supposedly this company does their own

24:32

email filtering and security, so that's how

24:34

they basically are sourcing the data. So

24:36

it's like, yeah, but I

24:38

don't know. All right, let's get to

24:40

something that is kind of like true and real and

24:42

scares the living hell out of me. The

24:45

Fidelity National Financial being shut down

24:47

in wake of a cybersecurity incident.

24:51

So Fidelity National is a Fortune

24:54

500 company that provides title

24:56

insurance and settlement services for mortgage

24:58

and real estate industries and they

25:00

are currently shut down. And

25:03

if you're thinking that Fidelity, it's the other

25:05

Fidelity. It's the other Fidelity,

25:07

Fortune 500 company. But

25:10

it seems

25:13

bad. Like based on

25:15

our investigation to date, FNF has determined

25:17

that an unauthorized third party access certain

25:19

FNF systems and acquired certain credentials, the

25:22

investigation remains ongoing. I

25:24

think one of the biggest things that's

25:26

terrifying about this is there's two

25:29

things, right? One, and one

25:31

leads to the other. One we're getting limited information,

25:33

but two, the information we are getting out of

25:35

this leads me to believe that they have no

25:37

freaking clue how bad it is. And

25:40

that's probably the most terrifying

25:42

thing. Allow me to

25:45

attribute this using my next generation AI

25:47

technology. Go for it. I

25:49

bet you this was Scattered Spider. I

25:53

will almost confirm that for you, at least

25:55

at this point. The register went ahead

25:57

and reported last week about this and they said that it

25:59

was BlackCat, which is supposed to

26:01

be Scattered Spider under the covers. And

26:04

it looks like, according to

26:06

a scan from GOCI, it

26:08

looks like this might be another case

26:10

of Citrix bleed. Oh, yep.

26:13

Oh, again. Citrix bleed plus

26:15

post-ex plus maybe some SE. I

26:17

could see it. Well, I'll

26:19

be honest with you. I watch the financial stuff

26:21

because of where I work and

26:24

there have been more and more

26:26

since Citrix bleed hit, there

26:28

have been more and more of these

26:30

ransomware attacks against financial companies that

26:33

a lot of people might not have heard of,

26:35

but are big on the back end. So

26:38

it doesn't surprise me at this point.

26:41

Yeah, we got paranoid and checked all of our CPT

26:43

customers for Citrix bleed because I was like, I didn't

26:45

think this is going to be a thing. And then

26:47

it was still being a thing. And I was like,

26:49

OK, we should probably double check. Everyone was patched, luckily.

26:51

But yeah, it was scary. What's the

26:53

CPT? What's that? Oh,

26:56

sorry, continuous penetration testing. Yeah, it's

26:58

very, very, very tough. Customers. It's

27:00

our customers that hate themselves the

27:02

most and want to bring as

27:04

much pain as possible on their

27:06

security teams by having us continuously

27:08

hack them for just forever. Remind

27:12

me to keep you out of marketing,

27:15

Corey. No, no, no. There's

27:18

nothing for me. If

27:20

you're a glutton for a punishment, and you're on

27:23

a main screen today, that's all right. Listen,

27:25

if you want marketing attacks are

27:27

up 1,263% source. I

27:31

Google this. Look

27:33

at Corey right now. Corey's hair is all over

27:36

the place. He just did that amazing marketing pitch.

27:38

And he's wearing a Windows 3.1 for

27:40

work groups t-shirt. Thanks. Actually,

27:42

it's Vaporwave95. Vaporwave95.

27:48

Also, let's talk about F&S stock. Now, this

27:50

is the other thing I wanted to ask

27:52

all of you. Their stock is

27:54

only down 0.4% or, wait, no, 0.47%. Like

28:01

how, like, is there any relation

28:03

between stock and hack? Like,

28:06

and this is one of those things, Ian, I'm going

28:08

to call you into this. And I'd

28:10

like to get Kelly in as well. Like

28:12

seriously, y'all, we've been talking about this

28:14

for years. Like that you're always

28:16

like, well, how are we going to get the board of

28:18

directors to take this seriously and take security seriously and take

28:20

the CDash shows, make it take, it didn't

28:23

in fact, that's their stock price. They

28:25

don't care, right? So fidelity

28:28

next week. Yeah. Yeah.

28:33

You're his hand, leverage, 25 yards, 30 yards. And

28:37

starting out. I don't know

28:39

who you are, gooshed, but you need to come

28:41

on the show now. Yeah. Because you're hilarious. All

28:46

right. So like, I really want to open up to Ian

28:48

and Kelly on this because they're the two people that I

28:50

talked to the most about this whenever I get down and

28:52

sad and troubled and I need a helping hand. All right.

28:54

Take it away, Ian. What do you think? Oh, yeah. So,

28:57

I mean, I opened up the 8K over here.

28:59

So somebody in the comments, I'm not logged

29:01

in where I can pull up the comments, but something in the comments

29:03

put up, did they file with the SEC? I think it was Brian.

29:05

They did. It is in the

29:08

SEC. It did show up and

29:10

the entire SEC report is like

29:12

three sentences. That's it. Yeah.

29:15

And the boilerplate stuff, which

29:17

we stuff up top. Yep.

29:19

Right. So part of the reason why

29:21

one, to answer your question simply, no,

29:24

it doesn't matter anymore. After the after

29:26

target and Equifax, basically

29:28

their board saw that their stock

29:30

price would rebound in a year.

29:32

That just became part of the

29:35

playbook. They say, okay, if we

29:37

ride this out for a year and we

29:39

solve the problem, it's fine. They treat it

29:41

like nothing more than we had a bad

29:43

sales year. How would we, how would we

29:45

address this? So it's actually in

29:48

my mind that argument has actually caused

29:50

more problems. It seemed perfectly logical, right?

29:52

You're going to see a hit on

29:54

dividends, which are the money that's paid to

29:56

you as a stock owner if they pay

29:58

dividends, extra money there. And then it's going

30:00

to take a hit on the stock price, which we just haven't

30:03

seen happen. If to use

30:05

the terms the kids use, if you're

30:07

diamond hands on the stock. Diamond.

30:09

Right. The

30:12

real issue that I think is also

30:15

the reason why this doesn't impact this

30:17

is the industry that they're in. This

30:19

is a highly, highly, highly

30:22

regulated industry. No investors losing

30:24

their money after

30:26

this. They're not losing their investments.

30:28

They're not losing any of that.

30:30

All of that will not matter.

30:32

Now, fidelity, how they come out the

30:34

other side, how much money they have to spend,

30:37

maybe that's it. But that's the

30:39

reason it's not hurting the stock is the

30:41

actual core assets, barring them doing something

30:43

absolutely bananas, which could then be reversed

30:45

in a highly regulated industry. I

30:47

just don't see why it would impact them. And

30:50

Michael Allen on the Discord just said, oh, I

30:52

get that. But I think investors are also becoming

30:54

numb to it. Right? Oh yeah. You've

31:27

got buyers and sellers of real estate.

31:29

You have real estate attorneys who could

31:31

be one, two, three person real

31:34

estate firms. You have title

31:36

insurance. The real estate

31:38

industry is not regulated. Now, the

31:40

National Association of Realtors does have

31:43

a data and privacy toolkit,

31:45

but they really aren't that regulated.

31:47

And we've seen these types of

31:50

attacks against the title

31:52

companies, the insurance companies, the attorneys,

31:54

even the buyers and sellers has

31:56

been happening for years. And

31:58

quite frankly, I've seen. a lot

32:00

of buyers and sellers, especially here in

32:02

Florida, lose money because of phishing

32:05

attempts, of bad URLs.

32:09

Some of the clients that we've had, we've

32:12

actually said, listen, you've got to go to

32:14

out-of-band communication. You've got to come up with

32:16

your own procedure to say, this is actually

32:18

from the buyer, here's the transfer of money,

32:21

here's the special code, the one-time code, and

32:23

I'm only going to tell you it over

32:25

the phone at a particular time. I

32:28

don't think it's as regulated as

32:30

we all think it is. Honestly, I

32:32

think these small, medium-sized businesses really need

32:34

more help because they aren't as regulated

32:36

as we think they are.
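One illustrative way to build the "special code" Kelly describes above: a short confirmation value derived from the transfer details and a secret agreed out of band, read back over the phone at the scheduled time. The helper name and field layout here are invented for the example; the point is only that the code is bound to one specific transaction and never travels over the (possibly compromised) email thread.

import hmac
import hashlib
import secrets

def confirmation_code(shared_secret: bytes, transaction: str) -> str:
    # Bind the code to the exact transfer so it can't be replayed for a
    # different amount or account.
    digest = hmac.new(shared_secret, transaction.encode(), hashlib.sha256).digest()
    # Truncate to a short, phone-friendly six-digit code.
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

secret = secrets.token_bytes(32)  # agreed in person or by phone, never by email
details = "buyer=J. Doe;amount=412500.00;closing=2023-12-01"
print(confirmation_code(secret, details))

Well,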

32:38

in going back to the SEC filings,

32:40

they did pretty much all. I

32:44

hate to say that their filing they put out

32:47

is like, oh, they're going to be fine as

32:49

far as the SEC is concerned because it's very

32:51

sparse. But they're still in the middle

32:53

of working in the incident. I guess I give them a little bit of a

32:55

pass. Now, was there a

32:57

press release for this or was it just their SEC

32:59

filing? Were they notified? I don't think you get to

33:01

the email to do the press release. They're going to

33:03

their website right now. I did. Yeah.

33:05

No, this is how- Little error. Yeah, I

33:07

know. This is how ransomware works. My hot

33:09

take on this is that day one,

33:11

it's not going to impact. Here's my

33:14

theory. The stock market knows

33:16

about ransomware and it just depends

33:18

on how fast they recover. If

33:20

they can recover in 48 hours,

33:22

you get a free pass. If they're down

33:24

for a week, a month, like

33:26

Kelly said, these transactions, these real estate transactions, there

33:28

are two people sitting in a room saying, we're

33:30

trying to send you money and you're trying to

33:33

get the money. And if they can't use this

33:35

company, they'll use another company or figure out another

33:37

way. I think when they start losing, if

33:39

they're down for a while and they start losing revenue

33:41

and competence, this could be actually really

33:43

bad. That's my prediction is like, they don't

33:46

care- If they can place else, then those

33:48

firms just continue using that stuff. Exactly. Yes.

33:51

It's not like it's going to impact their

33:53

stock price from the investment perspective on day

33:55

one. But when this industry realizes they have

33:57

lots of alternatives, they have lots of other

33:59

options, like Kelly said, it's not as regulated

34:01

as we might think, then they're just

34:03

going to be like, okay, well, we'll just use a

34:05

different company. And, you know, now this is the,

34:07

the inertia is here. And we're just going to stick

34:10

with this. I mean, I don't know. That's total

34:12

speculation. We're not financial advice. Oh,

34:16

the announcement did come through the

34:18

AK filing. They have not really

34:20

given any word from the company.

34:22

Yeah, because they can't email each

34:24

other like legitimately. Like I'm

34:26

not joking. Like they, they, there is no,

34:28

they're off the map. Like genuinely

34:30

these customers probably have no door or these,

34:33

you know, entities, no domain controller, no,

34:35

like no email, no exchange server, not

34:37

like none of that. Their VMware is

34:39

probably completely locked out. So the question

34:41

is, is how much more are they

34:43

going to wind up making because people

34:45

are delinquent in their mortgage payments? Looking

34:48

at some of the articles, people, people have

34:50

been going ahead and trying to

34:52

make payments and they can't get it all down.

34:58

Yeah, maybe it's Mr. Robot. Yeah. I

35:00

would kind of have to agree with

35:02

Corey a little bit too. Oh, oh, well,

35:05

go ahead. Yep.

35:07

I was just going to say, I would agree with Corey

35:09

that businesses like this and the way people are, we're, we're

35:12

creatures of habit. So once we, we

35:14

do something, we find out it doesn't work and then we

35:16

move on to the next thing. We're not generally

35:18

just going to go, Oh, let me check back on this old

35:20

one. Yeah. So I think that

35:23

even though, yeah, their stock price is unaffected

35:25

right now, if they don't recover

35:27

quickly, it will be very quickly. Yeah.

35:29

And I want to go back to some of the other side real

35:31

quick. Radice just brought up a great point.

35:33

When Mr. Cooper got locked out, they

35:36

gave a waiver on penalties. Is Mr.

35:38

Cooper back? Are they back and

35:40

running? Or are they still, is

35:42

that like Ask Jeeves? What's

35:44

Mr. Cooper? Mr. Cooper

35:46

is a mortgage company. So they are very, very, very large

35:48

one. And I don't know

35:53

if they're back up and running. Said

35:57

customer data was compromised. Oh my gosh, we

35:59

talked about this I think. No, we

36:01

didn't. There are other things that are important.

36:04

Yeah, they're hit. I think

36:06

legally, if they still charge people the

36:08

interest, they could get in trouble because

36:10

people had no way to pay. So

36:12

like the interest is not on the

36:14

customer's fault. Yeah. So but that's

36:16

a good point. I don't know if Cooper be hanging

36:18

with Mr. Cooper. I don't know if

36:20

they're up and running yet, but they had to go through and you

36:22

had to call in and they had

36:24

to get all kinds of weird ways to

36:27

process payments, which is strange. I want to

36:29

go back to what Kelly said real quick.

36:31

She said, I'm going to disagree with you.

36:33

I actually don't think we disagree in principle.

36:35

I agree 100%. That's not a word. That's

36:37

a word. That's title insurance. Seriously, hear me

36:39

out. I agree

36:41

that the title insurance agency is weird. More

36:44

what I was saying is that the

36:47

assets themselves, the titles being transferred, those

36:50

are highly regulated. At

36:52

least in my experience, it's been very difficult,

36:54

about a million papers to transfer a

36:57

real estate title in a

37:00

mobile notary that has to come and do that. And

37:02

then they scan or they do it digitally and all

37:04

that stuff. So what more I was

37:07

going with asset loss is it's not like, and

37:09

someone can correct me if I'm wrong, this service being

37:11

down and suddenly a bunch of people are like, oh

37:14

crap, I don't own a bunch of commercial properties and

37:16

I have no idea how to get them back. I

37:19

don't think that's going to happen, but I'd

37:21

be. I

37:24

think it depends upon whether you're talking

37:27

commercial or you're talking

37:29

individual. A couple of years

37:31

ago when my mom passed away, I wound up

37:33

taking control over some

37:35

real estate and it took me $200 in

37:37

one day to go ahead and get that

37:39

title changed into my name. Yeah,

37:42

but you were the next of Ken. Yeah. Barbie?

37:45

Barbie? Let's just agree

37:47

it's bad. It

37:52

is bad. Yeah. I

37:56

don't think I don't like let's move on because I

37:58

have to leave and I really want to to

38:00

this. When I mentioned it earlier,

38:02

Florida man is slipping. Oh

38:04

no. Why? Because a Georgia man

38:09

was arraigned today on charges for the

38:11

cyber attack that they conducted against

38:13

the medical center in 2018. According

38:15

to this incident,

38:18

Vikas Singla, 45, of Marietta,

38:20

a chief operating officer of

38:23

a Metro Atlanta network security

38:25

company that served healthcare industry,

38:27

allegedly conducted a cyber attack

38:29

against the medical center, disrupting

38:31

their phone service, obtaining information

38:34

from a digitizing device, disrupting the

38:36

network printing server. And they

38:38

did all of them as

38:41

their marketing plan to try to

38:43

get that hospital to work with

38:45

them. Are you serious?

38:47

You attack them first so they're scared.

38:49

Okay. I'm calling it right now. We

38:51

jumped the shark. My security jumped the

38:54

shark. Someone went to a CSO conference

38:56

and was like, yeah, what

39:00

you got to do is hack into the company. And

39:02

then it's like, it works. It's like the mob. What?

39:07

Be a real shame if something happened to

39:09

that perimeter firewall. We can't help you

39:11

protect it. I'm going to bring

39:14

Kelly in on this because Kelly,

39:16

you were at SANS conferences for

39:18

a long time with me. We

39:20

go way, way, way, way back.

39:23

This exact thing was like a lot of

39:25

people that were in the industry, they talked

39:27

about doing this. Like, wouldn't it be illegal

39:30

if I actually hacked the company and then

39:32

told them that they were vulnerable? I think

39:34

that's gray hat hacking. I'm going through finding

39:36

vulnerabilities, and then I'm showing them the vulnerabilities,

39:39

and then I can help them fix it.

39:41

And it's like, no, that's breaking the law.

39:43

That's illegal. Yes.

39:45

Are you sure? Like I would have been

39:48

in Florida. People even in Florida. Oh

39:54

my God, it gets worse, guys.

39:56

The agent in charge is named

39:58

Chris Hacker of the FBI Atlanta

40:01

field office. Yes. Yes. So

40:03

his name actually comes up

40:05

a lot because he has

40:07

very little office, a huge

40:09

field office. He does the

40:11

FBI equivalent of FBI staff

40:14

agents. It's like, yeah, yeah.

40:16

Yeah. Wow. He

40:18

was a detective. How many

40:21

times has he asked for a transfer? Oh,

40:24

how popular is the last name hacker? Actually,

40:27

but John, your point though, there was a

40:29

whole bunch of people with responsible disclosure that

40:31

were trying to do responsible disclosure proper. And

40:33

then there was a whole bunch of people

40:35

that were doing it improper. Right. And so

40:37

like, what's that balance? And I don't know

40:39

that even it is today. Great point,

40:41

by the way. We

40:45

talked about this a number of times on the show, but bring it

40:47

up here. If you're going

40:49

through and you're doing good faith security research,

40:51

if you just Google that Department of Justice,

40:53

good faith security research, you'll

40:56

come across the guidelines for the Department of

40:58

Justice and how they handle good

41:00

faith security research and what they consider

41:03

to be a differentiation between that and

41:05

malicious. So if you're doing good faith

41:07

security research and you find a vulnerability

41:09

and you go to a vendor and

41:11

you share that vulnerability with the vendor

41:14

and your primary goal is improving the

41:16

security in that organization or the industry as

41:18

a whole. And I'm cutting this

41:20

down quite a bit. Then you're a

41:22

good faith security researcher. If you're hacking

41:24

companies and accessing data and

41:26

pulling that data down, you've crossed the line,

41:28

number one. And number two, if you're doing

41:31

that and you go to them and you

41:33

say, Hey, I found the security vulnerability, you

41:35

got to pay me. That's

41:38

not good faith security research. So

41:40

the Department of Justice does have

41:42

guidance that actually breaks down what

41:44

those two things are. And for the

41:46

record, this is not that. This is

41:49

not good faith. This

41:51

one's pretty solidly on that bad side of the line.

41:54

So should we tell CJ not to do this then? Wait,

41:57

I want to close out one call. This

42:00

came out last year, Grayson. Yeah.

42:03

All right. Go ahead. Sorry, Corey. Oh,

42:05

you just said we should tell CJ not to do that.

42:08

A port plan bad. I

42:11

think operation viral marketing

42:13

version two. Well,

42:15

yeah, port blends showed

42:17

in jackpot. Yeah. Yeah.

42:20

Like Ben just said, that's extortion. Wait,

42:25

wait, hold on. We have a crime for that already.

42:27

Hold on. I sure said you

42:30

just got to put a marketing spin on it. That's. Yeah.

42:33

So, okay. Hold on because it gives

42:35

it, I just can't, this just keeps

42:37

getting worse. The assets identified

42:39

in the indictment paper, which I'm looking

42:41

at are just all Lexmark printers. Like

42:44

where, where are you getting your TTPs?

42:47

Went after the Lexmark printers, are you

42:50

serious? Also, you can imagine being like

42:52

recently here, here's the other crazy thing.

42:54

Each one of these printers is a potential 10

42:56

year sentence. Can you imagine being in jail and

42:59

being like, if I hadn't gone after that printer

43:01

in the break room, it would only be 70

43:03

years. Icing

43:07

on the cake would have been if they actually just

43:09

put hire, whatever their company name was

43:11

on the printers and started printing out. Just put it

43:13

on the cake. Looks

43:16

like you've been hacked. You should hire this company. Oh

43:19

my God. Good. He did it. He

43:21

did it. He did it. Very serious

43:24

cyber operating. And

43:29

I love the commitment that he's keeping them fresh

43:31

and not reusing. I know. You

43:35

say AI. 25

43:39

years with Lexmark printers. Specifically. No,

43:43

this is like I said, you know, this

43:45

is stuff I remember kind of when pen

43:47

testing first was starting as a thing, you

43:49

would always get in conversations like, well, if I

43:51

did this, why would that be illegal? I'm just

43:53

telling him. It's like, no, no, no, that's, that's

43:55

illegal. You're breaking a ton of laws. But am

43:57

I? Yes, you are. The Computer Fraud and Abuse Act.

44:00

You will be going to prison for that. I

44:03

just think it's interesting that he actually tried

44:05

it. And I would love it. If somebody

44:07

out there works at this medical center, what

44:09

did he print? Because he-

44:11

What did he print? Right? It

44:15

was money. He printed money. That's how the

44:17

FBI got involved. Yeah. Oh, go

44:19

ahead. Let's be honest. Our imaginations can

44:22

run wild. Inquiring minds want to know.

44:24

And us wondering is worse than us

44:26

knowing because could it be an advertisement

44:28

for a cybersecurity company? Okay. You've been

44:30

hacked. Could it be Goutzi? We don't

44:32

know. It could be any of those

44:34

things. I would threaten to take out

44:37

P2. I'd like to request,

44:39

you know what? We need to make a screenplay. Let's get

44:41

it in front of Netflix. Let's. This

44:45

is so stupid Mr. Robot. Yeah.

44:48

This does the dumbest shit

44:50

ever. Mr. Gobot. Mr. Gobot.

44:52

Yeah. It's

44:55

like the sci-fi version of Mr. Robot

44:57

where like- That's good. Science.

45:00

I gotta start the- Corey, I'm leaving

45:02

you in charge. Oh God. No, no.

45:06

Look at this, this show. Oh man. All right. Hello.

45:10

My name's John Strand. Nothing can

45:12

get done. Yeah. I can't, like that

45:14

article is just peak. Like it's everything

45:17

that like you should never do ever.

45:20

Beautiful. Beautiful. I think it's important though

45:22

that, you know, when these things happen,

45:24

I'm not defending that situation in any

45:26

way, but it does set a precedent

45:29

sometimes for non-technical leadership in law making

45:31

and all kinds of stuff that could

45:34

have impacts on this community,

45:36

right? Yeah. In negative

45:38

ways, not necessarily in positive ways. So-

45:41

Really? Be mindful of that just

45:43

to make sure that, you know, that

45:45

all the people that make decisions

45:47

know what decisions they're making if

45:49

these types of things continue to happen. So

45:52

are you conjecturing then that this person

45:54

just told people to go do this

45:56

and didn't actually do it themselves? No.

46:00

No, I'm saying that it's important to just

46:02

make sure that you have a proper... I'm

46:05

not saying that this person had a scope, but it

46:08

just goes to echo that if you're doing

46:10

any type of pen testing or anything that

46:12

you're supposed to have, certain scopes, you're supposed

46:14

to be doing certain things because you can

46:16

really easily get into hot water and get

46:18

yourself into some legal trouble. I

46:21

don't think that's what happened. No,

46:24

that's not what happened. I'm saying, when

46:27

this type of stuff happens, it does

46:29

potentially set a legal precedent in the

46:31

future for other non-malicious

46:33

actors. Yeah, but this

46:36

is just a precedent that if you hack computers,

46:38

that's illegal. That's not a new precedent. Let

46:41

me add to what you're saying there.

46:43

This was the chief operating officer or

46:46

chief something officer,

46:48

right? When you

46:50

have that title, you have

46:52

a fiduciary responsibility to the

46:54

organization and you are bound

46:56

by certain ethical restraints, simply

46:58

by having that title. Also,

47:00

you're covered by damages

47:03

and liability officers insurance. I

47:07

think this guy was just a bad apple. Yep,

47:10

absolutely. It's

47:12

important that we know how those

47:14

bad apples can influence not only

47:16

that organization, but the cybersecurity community

47:20

in whole, right? Because bad

47:23

apples need to be held to bad apple

47:26

laws, right? Other people don't that

47:28

are potentially doing it for non-malicious needs

47:31

and intent. I think I picked

47:33

up on what you're putting down there. I think

47:35

it's important to say, even

47:37

though we do already have the

47:40

laws in place to say, hey, we already have

47:42

this. This is a computer crime. It has

47:44

been listed since the 80s. No

47:47

problem. But when you go through

47:49

and I'll tie it back to what Corey

47:51

said about the printers, when you say, oh,

47:53

they hacked this many printers and you establish

47:55

a precedent for prosecutors to say we were

47:58

able to put this person in jail. jail

48:00

for this specific type of thing, I

48:02

think it all ties together with what

48:04

everyone said is that these lawmakers and

48:06

whatnot, they don't know that a Lexmark printer

48:08

is, you know, maybe it's not the

48:11

hardest target in the world. But

48:13

they do know that they've got some 20-year-old

48:15

on a college campus that did something, and

48:18

they can get another win in the prosecution

48:20

column because the precedent and the case law

48:23

for how they got this other person in jail

48:25

is exactly the same. So I agree that although

48:28

it's already there, prosecuting that

48:30

sets up a kind of a dangerous

48:32

precedent to take people who would be

48:34

good faith researchers and tie them

48:36

to something that was clearly not good faith. I

48:39

kind of think the opposite. I think this

48:41

is actually a demonstration that, because it says

48:43

specifically in the indictment that it

48:45

says, aided and abetted by others unknown to

48:47

the grand jury. So the way I read

48:50

that is, if your COO tells

48:52

you to go hack this company and you're like, I

48:54

don't think this is a good idea, but you do

48:56

it anyway, as long as you flip when the FBI

48:58

comes knocking, you won't be indicted. The COO will. That's

49:00

how I read it. But I mean,

49:02

I'm not a lawyer, and I don't know if

49:04

that's actually the indictment, but the fact that it

49:06

rolled uphill instead of the COO probably would want

49:09

it to roll downhill to the actual person that

49:11

hit enter on the Lexmark printer and hacked it.

49:13

Like, that's how I read this. Yeah,

49:15

that's the standard RICO stuff. They're like, we want the

49:18

big charges and we can get that with the

49:20

C-level person. So let's turn the people below them. Yeah,

49:22

which is how it should be. If someone comes to

49:24

you and says, go hack this company, we're going

49:26

to get their business because they're going to think they're

49:28

a breach. And I'm like, that sounds like a

49:30

bad idea, but I guess I'm just rank and file.

49:33

So whatever you say, I don't

49:35

want that charge. I want that charge to go to the

49:37

person that told me. That's how I see it. I don't

49:39

know if that's the details of the case or not, but

49:41

I'm just saying, interpreting it through

49:43

that lens, I feel good about the precedent

49:45

versus I'm glad

49:47

it rolled uphill and not downhill personally. Usually you

49:49

would expect it to go the opposite way. Okay.

49:54

Anyway, I guess, Grayson, do you have

49:56

a take on the... SolarWinds

50:00

situation where the leader or I forget

50:03

if I think it was the CSO

50:06

Was it like under criminal proceedings or

50:08

whatever by the SEC or you're aware of

50:10

that situation? I haven't been following

50:13

it most recently, but you know, I'm not an attorney.

50:15

I happen to be married to one That's a pretty

50:17

darn good one And so she knows a lot more

50:19

about what's going on in that that world than I

50:21

do But you know, it's just

50:23

interesting what you know to end to your

50:25

point a minute ago That some of these

50:28

people that are non-technical that are making laws

50:30

can oftentimes misinterpret or cite case

50:32

law that then Is

50:34

a precedent for other things right? And

50:36

so we want to make sure that

50:38

they're not doing that so I don't

50:40

know as it pertains to the situation

50:43

with with the the

50:45

most recent Disclosure

50:47

around the SolarWinds situation.

50:50

I don't really know one way or the

50:52

other if I haven't read any of that Yeah,

50:55

it was basically yeah We talked about on a

50:57

previous show that just it was basically he told

51:00

investors that the security was great and you know

51:02

best In the industry and it was he also

51:04

was getting emails at, like, the same time that said

51:06

the security is not great, we know it's not

51:08

great. So it was basically, like, fraud. Yeah,

51:12

Like it was mostly that he said it was

51:14

good even though it was terrible. I think it

51:17

was less about, like, any specific cyber stuff, whereas

51:19

this is much more like in the

51:21

you know, it's a single edge Yeah,

51:23

Kelly was actually talking about exactly what

51:26

that qualifies under: errors and, the

51:28

other word is, omissions insurance,

51:30

and so if officers of the company

51:32

knew something that investors should have known

51:35

Yes, or they weren't told something that

51:37

they should have known, that

51:39

insurance is meant to cover that, and it

51:41

kind of is exactly that, only they took

51:43

it a step farther because

51:46

it wasn't even ambiguous. It

51:48

was, we knew, we

51:51

knew. Yeah, and, hey, no, everything's great,

51:53

guys. Yeah, we joked that, like, the

51:55

new program at SolarWinds is they just

51:57

don't tell the CSO anything? But

52:01

yeah, no. I will keep

52:03

an eye on that article for the show and see if

52:05

there's any... I would love to

52:07

hear the take of someone who's actually practiced

52:09

law. Like obviously

52:12

none of us really know, so it'd be interesting

52:14

to hear from someone who's like, I'm going

52:16

to use this for this purpose or judges

52:19

are going to use this for this purpose or that kind of thing. So

52:22

we'll keep an eye on this article. Plus,

52:25

we want to hear all the juicy details for our screenplay

52:27

that's going to go to Netflix next year. It's

52:29

going to be huge. Ryan, can we pull up

52:31

a Rauner's... Rainer, if I'm saying

52:34

your name right, their quote on the screen.

52:36

The SolarWinds CISO indictment is not an

52:38

infosec indictment. It's a charge for violating

52:40

the cardinal sin of capitalism. They lied

52:42

to the shareholder. No, Ryan, that's just

52:44

good business. What are you talking about?

52:47

You know the rule in America, right?

52:49

You never steal from the rich. That's

52:51

a fact. You go to prison. You

52:53

steal from the poor and

52:55

you're good. You steal from the rich and they will get

52:58

you. There's been

53:00

plenty of cases just recently, you

53:02

know, the CEO of that

53:04

one exchange that... Yeah, he stole

53:06

from the rich. Yes, Bankman-Fried.

53:09

Yes, Bankman-Fried. Sam Bankman. Sam

53:12

Bankman. Sam Bankman-Fried. Anyways, he is

53:14

going to prison for lots and lots of

53:16

time because he stole from the rich. Anyways.

53:20

Yeah, that's exactly right. It's exactly the joke

53:22

I was going to make and my point, I guess, is

53:24

that he made the cardinal sin of stealing from

53:26

the rich, because that's what Madoff did.

53:28

I just can't wait for all the juicy

53:30

hacking scenes of the Lexmark printers. Like

53:33

foggy room. Foggy room. Lexmark

53:36

printers are going down left and right. He's

53:38

printing. He's faxing. He's using

53:40

all the functions of the multifunction printer. They

53:42

pull it off. What does that have

53:44

to say? Okay. How can you even do

53:46

this? And it says, Cyber Monday

53:48

deal on pen tests. Call now.

53:51

I don't know. Yeah. Flex, flex,

53:53

flex, flex. Flex, flex, flex. Flex,

53:56

flex, flex. Yeah. Yeah. Flakvist

53:58

has it right up front. The SolarWinds

54:01

guy goes, the

54:03

lying was not the crime. Getting caught

54:05

lying was the crime.

54:07

Getting caught, always the crime.

54:09

Come on. Yeah, that's always true. I'm

54:13

telling you, the people that you should fear the most catching

54:16

you are the people who have lots

54:18

of money. Okay, because they can bring

54:21

the resources to continuously look

54:24

for that, right? None

54:26

of them are vindictive or

54:28

malicious or, you know,

54:31

poorly educated or anything like that. No

54:36

It's interesting. I think, Kelly, I'd

54:38

love to get your feedback on this too,

54:40

you know, there's a big difference between even

54:44

knowing that you have a problem and,

54:46

whether you're regulated or unregulated or anywhere

54:48

in between as an organization, whether you

54:50

have money and high budgets to fix

54:52

problems or no budgets to fix problems?

54:55

Just having an understanding that the problems are

54:57

there is very different than buying down

55:00

that technical debt to actually solve and fix

55:02

those problems, right? So whether you're doing active

55:04

protection or you're doing pen testing or you're

55:07

doing whatever. Whatever you're trying to

55:09

do, and you've got this big bucket

55:11

of money, you also have this big list

55:13

of problems. You're trying to figure out what

55:15

that alignment is. What do

55:17

you think is the hardest thing for people to

55:20

get started to understand?

55:22

like I mean we're perhaps Orleans as

55:24

the example, but just balancing risk, basic

55:26

risk, right? Well, I don't

55:28

think people understand risk to

55:30

begin with. One of the analogies I like to

55:32

use is I used to live

55:34

in Wisconsin, so my house insurance would deal

55:37

with things like a roof that collapsed from too

55:39

much snow on it or ice. Down

55:41

here, I've got hurricane insurance, well, cuz I'm

55:44

in Florida. Understanding

55:46

the environment you're in and the

55:48

risks that lead to threats

55:50

and vulnerabilities, I think

55:53

we've done a poor job teaching

55:55

CISOs or upper-level leadership

55:57

at the board level how

55:59

to understand risk, because it makes

56:01

no sense for me to put

56:03

a fortified snow roof on

56:05

a house in Florida. But if

56:07

I'm a mortgage company with title

56:09

insurance with large monetary transactions, maybe

56:11

I should invest more in my

56:14

perimeter defenses, my zero trust defenses.

56:17

I just don't think we really quantify it

56:19

or, that's not the right word,

56:22

understand risk well. I agree. Yeah,

56:24

I mean, it's hard and it doesn't really

56:26

matter how big of an organization or small

56:28

of an organization. I think in general, most

56:31

people have a difficult time, you know,

56:33

understanding risk and cyber is such

56:35

a black box that for most folks,

56:37

it's difficult. I understand how they

56:39

don't, they don't know how to

56:41

balance it. So I'm not saying I agree with

56:43

getting caught lying or not understanding, but there's

56:45

a lot of people that just don't get

56:47

it, right, too. So CISOs certainly need to

56:50

have a better understanding of their risk. Well, the

56:53

risk thing, the other thing I would add in

56:55

about that is it's

56:57

not only that they don't understand the risk

57:00

necessarily. A

57:02

lot of big organizations, and this is something that engineers

57:05

deal with all the time, they want to get in

57:07

the weeds. They want you to understand, and this was

57:09

something that John Portse, if he's

57:11

listening, hey, what he used to say all

57:13

the time, you do not

57:15

understand the downstream impacts. Every

57:18

single cost saving measure, every single

57:20

we're going to switch to this

57:22

outsourcer and it's going to save

57:24

us so much money, they

57:26

only look at the primary

57:29

impacts, the money savings, the

57:31

glossy marketing material, whatnot, because

57:33

they don't get down into the weeds on it

57:35

because it's not in their area of concern. Their

57:38

area of concern is trimming budget and whatnot. So

57:40

when they say that they don't understand

57:43

it, well, they understand the risks that

57:45

are important to them and what they

57:47

do, saving money, bringing the payroll to

57:49

manageable levels, whatever it is. But

57:51

when you say, hey, when you do this, to

57:53

save this money, when you go to the outsourcer and

57:55

you do this, here's all the downstream impacts, the

57:57

talent loss, the this, the that, the... the

58:00

one person who runs that box, that runs

58:02

the scripts that power the MOVEit

58:04

server, which is actually the thing that makes

58:06

us money, you're about to fire them. You

58:09

don't understand that and then this stuff plays out and

58:11

they go, well, why didn't anybody tell me that was

58:13

a risk? We did. You just

58:15

didn't want to hear the details. Well, and when

58:18

I do get that understanding of risk, I'm sorry,

58:20

I'm gonna cut you off. No, you're

58:22

good. It takes a long time to get budget

58:24

for something. Like you could understand a risk and

58:26

then you don't have that aligned budget in that

58:29

year's budget at all. And

58:31

then you've got to go ask leadership

58:33

to give you that and understand how to

58:35

translate that information so that they get it.

58:38

And sometimes it's too late. And

58:40

for that, if we can come back to my camera,

58:43

I have this lovely device. It

58:45

is my 'management accepts the risk'

58:47

stamp. No, I don't wanna use

58:49

this in meetings when I got fed up. I

58:51

was like, fine, you know what, my advice? Here

58:53

you go, right here. There's a stamp. I

58:58

mean, I think, to that point, one of the things that

59:00

I do a lot when I'm editing reports is I

59:02

try to act like

59:05

an advocate for whoever's gonna be on

59:07

the receiving end of our report. And

59:10

I think in two reports today already,

59:12

I said, okay, we need to

59:15

explain why this is bad. I get it.

59:17

Others are not going to. And especially this

59:20

has to be done in the executive summaries

59:22

and it has to be done in a

59:25

non-technical practical manner. Because

59:29

if you're not a CIO, if you're

59:32

not a CTO, you're not a CISO,

59:34

you probably don't have the

59:36

technical chops to understand

59:39

detailed forensics speak or

59:43

getting down into the weeds about this,

59:45

that, and the other thing, even explaining

59:47

something like good password policies,

59:50

it has to be simplified. And

59:52

that's one place that we still continuously

59:55

fail. Amen, sister. Preach

59:57

it. Another thing about the SolarWinds

59:59

thing, talking about it and this will be

1:00:01

the last thing we talk about, but basically, the people

1:00:05

who operate in these spaces, like

1:00:07

chief officers, chief operating officers of

1:00:09

large companies, they kind

1:00:12

of are very, they're in

1:00:14

rarefied air. Like, there's a very small job

1:00:16

market for them, but they're very expensive and

1:00:18

hard to find and all that good stuff.

1:00:20

But when you've been charged

1:00:22

with fraud and internal control failures, you're

1:00:24

basically out of the job market. So

1:00:28

I think like CSOs take heed of this,

1:00:30

like don't just accept the risk. That's

1:00:32

the precedent this sets, in my book, is like, you're

1:00:35

not going to, no one's going to be like, well, we hired

1:00:37

a new CSO and here's the press release. He was charged

1:00:39

with internal control failures and fraud and

1:00:42

he's great. So basically, for the CSO it's like,

1:00:48

oh, it could be a career

1:00:50

ending move, like quite literally. Well, for a publicly

1:00:52

traded company, for sure. True.

1:00:54

I mean, true. But like, yeah, I mean,

1:00:56

I'm not, not to say that this person,

1:00:58

it could be false charges or, you know,

1:01:00

unfair or whatever, but assuming the charges end

1:01:02

up sticking, I feel like you're untouchable. Right.

1:01:04

I mean, I don't know, I could be

1:01:06

wrong, but I know for,

1:01:09

I mean, Kelly brought it up, the fiduciary

1:01:11

responsibility. If you've got SEC charges,

1:01:14

no board can possibly

1:01:16

say, no, we

1:01:18

did our due care and due

1:01:20

diligence by hiring someone who had

1:01:22

SEC charges against them for fraud.

1:01:25

That's not going to happen at a public company.

1:01:28

There are minor fraud charges. There are minor fraud

1:01:30

charges. Light treatment. Don't worry,

1:01:32

you can never be indicted for the same

1:01:34

crime twice. So I'm now free to go.

1:01:36

Let's mark it. Yeah, let's darn it. That's

1:01:38

the new thing. Once you hit double jeopardy, I

1:01:40

got a couple charges. Then you go through

1:01:43

and you're like, I can commit all the

1:01:45

public securities fraud I want. Our security is

1:01:47

amazing. It worked that

1:01:49

way. Right. Right. Yeah. So

1:01:51

anyway, we should wrap up the show.

1:01:53

Come to our Snake Oil Summit. Ian, you want to talk about

1:01:56

the Snake Oil Summit? Not really. I'm putting you on the spot.

1:01:58

Okay. No, I will. I will. well.

1:02:03

Come to the Snake Oil Summit on the

1:02:05

6th and 7th. We've got a low-cost training.

1:02:07

We've got lots of talks. It's all online,

1:02:09

all virtual. So in about a week, you

1:02:11

know, come check it out. You can go

1:02:13

to antisyphontraining.com and check out the schedule of

1:02:15

speakers and other things going on. So we hope

1:02:17

to see you there. And

1:02:20

on the final note, have a fantastic week. Indeed.

1:02:24

And I guess I'll agree. I'm a

1:02:26

fire Orion. I'm scared.
