Talkin’ About Infosec News – 11/22/2023

Released Wednesday, 22nd November 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:01

I went and saw some family that I hadn't seen for

0:03

a long time and everyone commented

0:05

on the mustache and I was just like Every

0:08

time someone gives me a good compliment, it's just like another

0:10

reason not to shave it off But every

0:13

other day, I'm just like I'm gonna get rid of it. I hate this thing

0:16

And everyone's like no, you look so good. There was there

0:19

was a

0:21

Thought of shaving it on stage at

0:23

Way West. That was an ask.

0:26

I was like, I would have done that but

0:27

I'd do it— do a bounty

0:30

for charity, like the Mohawk-Con does.

0:34

Clean shaving for charity. Yeah,

0:36

people do it. Actually, I'll be honest

0:39

I used to be a streamer and I had people

0:41

that came into my stream and was like hey for

0:43

a donation Would you shave your beard and hair

0:46

when I had a beard that was like down to here and

0:48

hair that Yeah, straight streamer

0:51

or streaker streamer I

1:01

Just imagine like a streaming

1:03

streaker you wear you guys on you as

1:05

you're running like you have the drone that

1:08

follows you Fundraiser

1:15

for that I

1:19

Will

1:23

break into your broadcast Going

1:27

straight into the quad I

1:29

shaved my head once as a sophomore

1:32

high school I was stupid

1:34

and did it in the middle of winter Never

1:38

never shave your head in the middle of winter stuff gets

1:40

real cold real quick I did

1:42

that on swim team and I did also

1:44

did a Mohawk for

1:46

For Defcon for my Mohawk con I'm

1:49

pretty sure the swim team one isn't out there. But the other

1:51

Mohawk-Con one is out there. It's

1:54

the way it looks like you're saved cuz

1:56

uh, crypto ham is here Listen

2:00

I'm gonna tryptotize this Listen,

2:02

don't ask questions. That's fair.

2:05

We're live by the way after yeah, we're live We

2:07

thought I was gonna have to run the news that was kind

2:09

of it's gonna go straight downhill real quick

2:11

Why don't we just finish each other's

2:14

sentences?

2:15

great job Who's

2:20

playing violin Yeah,

2:23

let me let me send a DM. That's

2:25

all like I live with the professional viola player

2:28

or something

2:29

That was that was some good. Oh sent

2:31

you just released that's gonna They

2:34

already know. The people who are good at OSINT

2:36

already knew. So

2:39

did anybody actually read the news there was there was some

2:41

good stuff some crazy stuff Yeah,

2:43

well not all of it, but I like

2:46

whatever was like the top stories They

2:51

weren't in there Where

2:55

where's our traffic weather and sports Yeah,

2:58

we have your

3:00

password. Kelly, you take

3:02

the scores

3:05

Maybe Don't

3:09

pick any Hawkeye games that's for sure. Oh

3:11

I'm a badger fan and it's it's

3:14

been a season that tests our loyalty.

3:17

All right. I'd just keep taking the under, Brian.

3:20

I got tired of you not letting me borrow your time

3:22

machine So I had to go steal another one Is

3:26

it actually a time machine? Okay, so

3:28

not a time machine, but I got the space thing down

3:31

if you operate it wrong I

3:32

thought oh you're gonna swear I thought you're gonna like to

3:34

a TARDIS right when you said that or something I know we wait

3:38

Give me a sec

3:42

You know theory that if I fly

3:44

it fast enough in the right direction I could

3:46

go back in time But

3:48

you're already oh you're

3:49

you're in a galaxy and along Yeah,

3:52

you're going to galaxy. I'm gonna part

3:54

so by the time I fly to you... Yes,

3:57

time travel might have happened. We'll

3:59

figure it

3:59

out. Alex, I'm really glad

4:02

you're here today because you and

4:04

I are the clean-shaven club today. Oh

4:06

yeah, yeah. Well

4:08

and the mannequin. And

4:10

the mannequin will remain our

4:12

one holdout for

4:13

facial hair.

4:15

Is it hunting season?

4:17

Is that why we got so much facial hair going on

4:19

today?

4:21

I can't speak for everybody else but I know for

4:23

us chubby people, it hides the rolls. I

4:26

agree with

4:28

the mustache. I agree with

4:30

the mustache to look like Ted Lasso

4:33

for Halloween. Great job. And to also look like

4:35

Ted Lasso for the... Kiss

4:38

for everyday life. Everyday life,

4:40

yeah. Just looking like Ted Lasso. My

4:42

dream, just being happy. I'm reading a bunch of

4:44

Ted Lasso-esque books so I'm always happy

4:46

and not criticizing people trying to give them

4:48

their best life, right? But remember,

4:51

the key point was you can't always be that

4:53

way

4:54

or you go nuts.

4:55

Yeah, sometimes you need a little bit of what it

4:57

looked like. What, did you actually watch the show?

4:59

I just watched the trailer. Oh yeah, dude,

5:01

the show was amazing. I've

5:05

also been watching that, or I watched that Severance.

5:07

Oh, Severance is rad. Shrinking

5:10

is like the second version of Ted Lasso.

5:12

It's like, you know,

5:15

it came out on that one, I think. It's Jason Segel

5:17

and a few others — Harrison

5:20

Ford's in it. Did anybody check

5:22

out Monarch yet? No,

5:25

it's pretty good. The first two episodes are

5:27

pretty good. I

5:29

was watching Silo and I just like can't

5:31

bring myself to care about it anymore. Like I

5:33

watched the first two episodes and I was like, I

5:35

don't care if these people live or die. I'm over

5:38

it. This

5:42

is so hard for me to watch shows

5:44

when they come out. I'm the kind of person

5:46

I'm like, I gotta know what happens. You

5:48

want to binge it? Like you want to know what's bingeable?

5:51

Yeah, there are very few shows that get

5:53

me to watch as they come out. Dude, I

5:56

missed that though. Like back in the day where you

5:58

watched a show and nobody could skip ahead. You had to

6:00

watch it an episode at a time; there weren't reruns, like

6:02

the one that always comes to mind is lost right

6:05

like no one knew What was going on in that show?

6:07

Not one of the creators knew what was going on. Yeah. But

6:13

when you got to school the next day and, you're like, everybody's

6:15

like, polar bear. That's

6:20

why everybody had VCRs You know

6:26

That was when I figured out my computer could do

6:29

TiVo. Did anybody have a cable

6:31

capture card on their computer that was We

6:40

got one minute till we start this thing for real

6:42

you're watching us you're like what the hell

6:44

they talking about? Where's the news? That's cuz

6:47

we haven't gotten there yet

6:48

Pre-show

6:50

why are we doing pre-show Ryan? Did you did you accidentally

6:52

click the button just be honest? No, he's

6:54

he definitely— he told us he was doing it and everything

6:57

Wow Over here with a

6:59

big pile of firewood and a big lighter and

7:01

just gonna light it on. Yeah starting

7:04

to flame We

7:06

started that flame at 420

7:08

not that that means anything You

7:13

just had to bring it up. Snoop Dogg is

7:15

coming this week We

7:18

get Snoop Dogg on the card. That's my that

7:20

would be an interesting podcast. He'd

7:22

be like security. I don't know We

7:26

don't talk about security anyway, it works out Be

7:30

gangster security. That's

7:32

true. That's true. We don't talk about security.

7:35

We talk about the new

7:38

Are

7:42

you taking this or are you gonna hand it off to Wade again? I don't

7:45

care you can hand it to me You

7:48

got them one. It's all good

10:00

get stuck across the middle of it. That's

10:02

the first one is great. The ransomware

10:04

SEC filing? Yeah, so for anyone

10:07

that doesn't do that, this one hit a lot

10:09

of news like this hit my feed without

10:11

me coming to the show which nothing else ever does.

10:14

But basically, ALPHV

10:16

ransomware group filed an SEC complaint

10:19

against a company called MeridianLink for

10:22

not disclosing their breach to the SEC.

10:24

So it's like,

10:25

honestly, it's pretty funny

10:27

and kind of like in a way where it's like,

10:30

hey, we already made you mad. Let's make you even more mad.

10:33

I guess my question is, what is

10:35

like the motive? Why? Why do that?

10:39

How much does the SEC fine

10:42

people? What did they say? Does anybody

10:44

remember? 350? 350 grand? No,

10:47

I'm just kidding. 350? I

10:51

think it depends on the scope

10:53

and all that stuff. So what

10:56

I first thought is what if this just causes like

10:58

everyone who's publicly traded just immediately

11:00

get hacked? It's like and then

11:03

right and then everyone's just like pay me or else you're gonna

11:05

have to pay this amount? Like will security

11:08

get a rise real quick because of this? Is

11:10

it supposed to be like they won't submit you to the SEC

11:13

if you do pay the ransom? Yeah,

11:15

yeah. Yeah. That's what it's

11:18

about. It's my whole patching as a service

11:21

model, right? We hack you and then we

11:23

tell you you're hacked and we're gonna tell

11:25

the SEC and unless you pay us

11:28

and if you do pay us, we'll at least patch you. So when we

11:31

get out, you won't get hacked that way. So do

11:33

we think that the default defense of most of

11:35

these companies, which is we didn't know, we

11:37

didn't know we were hacked, we're still working on the details,

11:39

we're investigating, like do you think that's gonna work in this

11:42

case? I also noticed the SEC

11:44

has, like, a disclosure window that's really

11:46

short, right? It's only like four days.

11:49

Four days. Four days to report it. So

11:51

but you still have to report the breach. Kelly, do

11:53

you know what it means

11:56

to, say, report a breach? Like, that's

11:58

what the SEC says, but what's, like, the

11:59

definition for it, if you know? If

12:02

anyone

12:03

is playing

12:10

this grey area right now, where

12:13

we used to say, is it an incident,

12:15

is it a breach, while the SEC

12:17

is getting tired of companies waiting until

12:19

it says it's an actual breach. So

12:22

anything that seems suspicious and

12:25

that's a big umbrella and there's

12:27

a few more specifics but they're trying

12:29

to close down that grey area between

12:32

something suspicious, an incident and a breach.

12:35

Hmm, interesting.

12:36

Click the link, it's a breach. Yeah,

12:39

really, every time we get the password on a customer, we're

12:41

going to start reporting them to the SEC. No, I'm

12:44

just kidding. I don't know. What is it

12:46

about it? As a pen test

12:48

team, could you be like, oh, look,

12:50

we saved you $350,000 because now you're not

12:52

going to get fined. I like how you're

12:54

sticking with that number that I just completely

12:56

made up. Right. That

12:59

sounds like a good round number, right? For like most smaller

13:01

companies, it's a decent chunk of change. The bigger corporations,

13:03

not going to matter but... All I know, what

13:06

I know about the SEC, I mostly know from watching

13:08

the documentary about Bernie Madoff which is that

13:10

as long as you look like you're fine,

13:14

they'll just let you off the hook. That's

13:16

what I've learned. So maybe the fine is like actually $3.50

13:19

as long as you have taken the

13:21

right people out to dinner or whatever. Yeah,

13:23

and I think Corey, you're right with

13:25

the, it's not just the fine, it's the scrutiny.

13:28

So there may be a lot of companies that are like, well,

13:30

the fine we can pay but it's the SEC

13:33

turning over every rock that we go,

13:35

okay,

13:35

that's a problem because... Audits

13:38

are expensive. Audits are expensive

13:40

and then you just, it's a bad look and

13:42

companies that are being

13:44

hit by ransomware

13:46

might also have a few other things that

13:49

aren't getting done that they don't really want the SEC

13:51

looking into.

13:52

So what we should do, let's just start

13:54

this service called SEC. We'll

13:57

sell this to the SEC where we just keep an eye

13:59

on the... keep an eye on the ALPHV ransomware group and

14:01

anytime they breach someone we'll just notify

14:04

them. So then we're just cutting out the middleman. Our

14:07

friends in the chat

14:09

that character mentioned

14:11

that he said the magic

14:14

words material impact and

14:16

I think that's kind of what's got everybody worked

14:19

up about this is now

14:21

companies have to determine if there's material

14:24

impact and it may actually

14:26

I know sometimes it feels kind of

14:28

hard on the security engineers and honestly

14:30

now we have to kind of send

14:33

it over the wall to the legal team and say okay

14:36

there may or may not be material impact

14:38

and at that point they have to determine

14:41

if it's actually a breach or an incident. And

14:44

I think that's kind of what makes this such a sticky

14:46

wicket: material impact.

14:48

And again, I always have to say this. I'm not a lawyer

14:51

and I don't

14:51

play on TV either Well,

14:54

let's just here's some nice little rule of thumb

14:56

in the chat Here's it. Here's a little rule

14:58

of thumb for our friends

15:00

out there if you're negotiating with ransomware

15:03

actors It's a material impact

15:06

I think it's safe to say that If

15:09

you've gotten that, like, Telegram invite

15:11

to that ALPHV Telegram,

15:14

you're gonna— yeah, that's material, you need

15:16

to... Well

15:19

Whenever I look at this article, I just like

15:22

and just the whole thing It's like the

15:24

two kids that get in trouble

15:27

But— and the one's like, yeah, but he did something way

15:29

worse. Like, look at this— like, don't look at

15:31

what I did, look what he's doing

15:34

Do

15:37

you

15:37

remember that old cliche when we were younger snitches

15:40

get stitches

15:41

Yeah,

15:43

I get what the SEC is trying to do

15:46

by making the companies report it,

15:48

to give, like, openness of

15:50

everything. But at the same time, it seems

15:52

all they're doing is giving the attackers another

15:55

way to extort the companies. Yeah

15:59

You don't want to admit it then give us more money

16:02

now Oh,

16:02

but it's a fake— it's not really a

16:04

way to extort them, because at the end of the day, if they're considering

16:07

paying, then it's like, they're already—

16:10

the SEC is gonna find out if they

16:12

pay, for sure. You

16:16

have any— do they have any whistleblower,

16:19

like, payouts? Oh. Totally

16:26

not for him. So they do—

16:28

someone has— Graham in chat, or

16:31

Cloth Boy. I just know that's— that's a "Graham-y,"

16:33

it's "Graham." I just, like— No,

16:35

it's fine. Thought

16:38

boy says how do you even report a breach

16:40

to the SEC? Do you just add them on Twitter? SEC

16:43

probably not gonna pay for the Twitter API. So

16:46

I'm gonna guess that you have to be a ransomware

16:49

group to report things to them. They probably

16:51

have like an intern. They have a form. It's like the FBI

16:54

tip hotline or whatever. It's a Google Sheet.

16:59

Yeah, when I saw this article

17:01

like and I know I shared this in the

17:04

in the chatter in our Notion there— is that I was

17:06

remembering an article that I shared in

17:09

July, brought it up in July.

17:12

Yeah, I

17:14

posted that it's like, okay,

17:15

you know 2022 North Carolina says

17:17

we're not gonna pay ransomware anymore; 2023, they

17:20

get shut down by ransomware and there's

17:22

a line in there that says,

17:24

you know When the law was passed last year

17:27

some worried it would effectively serve as an additional

17:29

way for the gangs to extort victims.

17:32

That they'd say hey you paid

17:34

you paid us

17:36

You're not allowed to pay us so now you're in trouble and

17:38

I know at the time it was

17:40

the, well, are ransomware groups really going to

17:43

leverage the law

17:45

in order to put further pressure

17:48

on their victims. And I guess

17:50

that was up for debate then, but now

17:52

we're seeing that. Yeah, they are going to leverage

17:55

the

17:56

you know, the legal arm of what's

18:00

out there in order to put additional

18:14

mustache

18:18

and put the mustache in charge of the office of

18:20

the whistleblowers.

18:22

So

18:27

they do have a whistleblower program, and

18:30

the max amount of money you can get is five

18:32

million. It's

18:35

established presumption, it's lawyer

18:37

speak so I may be absolutely reading it wrong.

18:40

But they can pay you whatever they want. Yeah

18:42

pretty much. This is

18:44

a quick Google-fu whistleblower

18:47

SEC stuff. But at

18:49

what level do you have to get a t-shirt? You

18:52

get a free t-shirt or

18:54

what about like a nice little like commemorative

18:58

whistle? If you got a special edition t-shirt

19:00

for turning someone in and you wear that like to Defcon

19:02

like only like five people. That would

19:05

be awesome. You're just like where'd you get that shirt?

19:07

Oh, if you're a whistleblower you

19:09

get this shirt. Oh okay. That's

19:12

a lot of secret. I

19:14

know that was a

19:17

lousy t-shirt. Yeah.

19:20

So in that North

19:22

Carolina town, I mean we already like touched

19:24

a little bit on it but like basically,

19:27

I mean, I thought ransomware groups were better than this. I thought

19:29

they don't target local and government

19:32

stuff. I mean— nah, depends on the ransomware

19:34

group. Yeah

19:37

also North Carolina last year banned all

19:39

government entities from paying ransoms. That's gonna make

19:41

this a little awkward. That was the thing that

19:43

I said is that it's like they banned making the payments

19:46

and then

19:47

got completely shut down the following

19:49

year.

19:50

Well, only Cornelius got shut down. There's at

19:52

least three more cities in North Carolina. We

19:56

don't know of them but they're there. Yeah I

19:59

mean I will say I am a little bit,

20:01

this is one of the things where I get when the companies have

20:03

big PR departments that basically make it so that

20:06

they say all the words, but nothing is actually

20:08

indicated by what they're saying. This is like the

20:10

opposite. They're like, anything that requires

20:13

staff to access files located on our

20:15

servers, which are in the process of being scanned.

20:17

I have more questions than answers.

20:20

What do you mean being scanned? What are

20:22

you scanning for? And also, what

20:25

files do people need to access on servers? I'm

20:27

concerned. So, yeah,

20:30

ransomware is bad.

20:31

Okay.

20:32

Can we go, how about we shift

20:35

this over to a faster news

20:37

article about some Swifties shutting

20:39

down the internet? Oh, yeah, I thought this was good. So

20:42

this is gonna be Taylor Swift talking in the Misinformation

20:44

village next year at Def Con. No, I'm just kidding. But

20:48

really though, kind of, right?

20:51

So Wade, you know this story? Yeah, so pretty

20:53

much what happened. So Taylor Swift is in Brazil

20:55

right now doing the Eras Tour,

20:57

right? Making that money. And her

21:00

location of the hotel got leaked. So

21:03

in order to try to hide the leak, the Swifties

21:05

took action and started posting all over

21:08

Twitter several different tags. So we can

21:10

see one here. Trying to make it so

21:12

if you were looking for the leak, it would be incredibly

21:14

hard. You wouldn't be able to do it on Twitter, but

21:16

I'm sure there's other ones to find it. Yeah.

21:20

But this is great. I love

21:22

it. Oh, man. And I

21:24

was thinking that like taking the strategy for

21:26

it's like for information security, it's the

21:29

look, if you do something malicious and create a log

21:31

entry, just fill up more log

21:33

entries, because the blue team will

21:35

never find it. So I was

21:37

thinking about there was a tactic, I

21:40

believe— it was

21:43

in France. It was Macron's

21:45

campaign filled their

21:47

emails up full of fake emails. And

21:50

then they got hacked by Russia and

21:53

they were trying to leak stuff. Yeah. You couldn't

21:55

tell what was real and what was fake, right? It is

21:57

a tactic I've heard about a lot before, but it is

21:59

a great. tactic and I love it. Little,

22:01

little. Okay the question is did this work? Did

22:04

this actually, like, did this protect her? I mean, obviously

22:06

we know that she's hopefully safe and sound. She probably just

22:09

bought the hotel and I was like no one else is gonna stay here

22:11

now. Yeah I think all the billionaires

22:13

toured out the Eras, but anyway. Yeah

22:16

I feel like this is an interesting tactic

22:18

though. I guess I'm kind of like,

22:21

well they're trying to prevent people from like searching

22:23

for like Taylor Swift location.

22:26

Yeah it'll

22:28

get the low level stuff but if you start

22:31

like time binding it and going

22:33

okay well let's just go back to

22:35

the first instances of it

22:37

from the 18th, search around

22:39

there and you're going to like you know unless they

22:42

deleted it in which case go to like an archive,

22:44

see if it got archived. You know they're gonna find

22:46

that there which is what made me think of

22:49

you know being the blue team in the logs

22:51

that if you just fill up the log with additional entries

22:54

the blue team is going to go back to, like, that earliest point

22:56

of noise and start there

22:58

and go okay here's the thing that initially happened that

23:00

they were trying to cover up with additional noise.
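
That "go back to the earliest point of noise" idea is easy to turn into an actual triage script. A minimal sketch in Python — the bucket size, spike threshold, and sample timestamps are illustrative assumptions, not from any particular SIEM:

```python
# Minimal sketch: find the earliest point where log volume suddenly spikes,
# on the theory that attackers flooding a log to bury one event give away
# the moment the flood started. Thresholds here are illustrative.
from collections import Counter
from datetime import datetime

def earliest_noise_point(timestamps, bucket_seconds=60, spike_factor=5):
    """Return the first time bucket whose event count jumps well above
    the median bucket volume (a crude 'the flood started here' marker)."""
    buckets = Counter(
        int(ts.timestamp()) // bucket_seconds for ts in timestamps
    )
    counts = sorted(buckets.values())
    median = counts[len(counts) // 2]
    for bucket in sorted(buckets):  # walk forward in time
        if buckets[bucket] > median * spike_factor:
            return datetime.fromtimestamp(bucket * bucket_seconds)
    return None

# Example: ten quiet minutes, then a flood of entries in minute eleven.
quiet = [datetime(2023, 11, 18, 9, m, s) for m in range(10) for s in (0, 30)]
flood = [datetime(2023, 11, 18, 9, 11, s) for s in range(50)]
print(earliest_noise_point(quiet + flood))  # -> 2023-11-18 09:11:00
```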

23:03

It's

23:03

only gonna prevent the lazy people

23:05

from actually tracking. It prevents the lazy people, exactly. There

23:08

were K-pop groups. Yeah

23:11

there were K-pop groups during the last

23:14

administration that would

23:16

completely stomp these trending hashtags

23:19

that were negative towards

23:21

somebody and

23:23

just that that would become

23:26

now just an excuse to post all these K-pop

23:28

people on the Twitter

23:31

and you wouldn't actually see why

23:33

that hashtag was originally trending. So it does

23:35

work and obviously it's

23:37

going to create an additional

23:39

barrier to somebody to dig

23:41

through all that, find the actual location and

23:44

then action on it. Because I mean

23:46

unless you're in Brazil right there, you're

23:48

not

23:48

gonna catch up with her. Yeah

23:51

man you think don't mess with cats was bad,

23:53

don't mess with the BTS army. That's bad.

24:00

Really? I guess like

24:01

how do you get this ravenous of fans

24:04

like in your support or is it all just like you

24:06

just have a Botnet like I just really

24:08

hope we never know I'd rather not

24:10

know that I do

24:12

imagine have you seen Taylor Swift fans?

24:15

It's a good possibility They're just all

24:17

that obsessed and jumped all over it. Sure.

24:19

Certainly shot up NFL

24:23

interest and jersey sales

24:25

and stuff

24:26

What we're saying is we need to release like a

24:28

Christmas album of all, like, the BHIS

24:30

greatest hits And

24:33

then maybe yeah, we're ravenous fans.

24:36

I think it's all yeah, they can't be

24:38

fans of us unless we have stadium

24:40

tours, so I feel like we just gotta work toward the final

24:43

goal of a stadium tour. Are we gonna start doing

24:45

I saw gas places like we like we did

24:47

at Way West We just go to different cities and start

24:49

doing the news from different It's

24:52

our eras tour

24:55

I do think it's an interesting question though.

24:57

How much of that additional

25:00

information was a real person or how much

25:02

of it was a bot and The

25:04

other question is can we build our

25:06

own bot army to have those ready in case

25:08

we are in a situation like that

25:10

Yes, highly frowned upon but I like

25:12

the idea. I also thought of it earlier and didn't get a chance

25:15

to bring it up. But the whole

25:17

like doxing and giving over real-time locations

25:20

on Twitter? Wasn't there some high-profile

25:23

individual that was really upset about real-time

25:26

location data being posted on Twitter? To the extent

25:29

that they got this, like, jet

25:31

tracking

25:33

taken offline? Like— and they

25:35

own, you could say, the whole of the network. Yeah,

25:39

it's just like, this is a dream— a thing to get rid

25:42

of it. And it's like, clearly that

25:44

worked, and there's no real-time doxing

25:46

going on

25:47

on like X anymore

25:49

I have terrible news for everyone Put

25:51

a message in Discord that says react to this message

25:53

if you're part of our bot army, and we got only three reactions.

25:56

So we have a lot of work to do So

26:01

okay, I think I need to go back to school

26:03

and I need to learn how to

26:06

recognize fake news. Apparently this is now

26:08

required for all students in California

26:10

schools. Let's say you're teaching

26:16

this class.

26:18

Give me the PowerPoint like syllabus.

26:22

What is the first section? Where

26:24

is it from? Credible source, right? That's

26:27

first. That's wrong. I don't know. Don't trust American

26:30

news, right? Try

26:35

to go to an external source. I

26:39

like the idea

26:42

of this class but I know it's gonna be done

26:45

horribly wrong.

26:46

I think of this more as like if you see someone

26:48

shitposting all over LinkedIn, right? And then

26:50

people are commenting on it and making it big and

26:53

then you go to their LinkedIn and you realize they've just been shitposting

26:55

like, oh that's a bot. Okay, but you found it. My

26:57

biggest concern about this is that

26:59

it's gonna be covering only media that the

27:02

kids never use anyway. So we have several

27:04

newspapers here and we're gonna have to read each one. The

27:15

other thing that's just like kind

27:16

of hard facts is that like I would

27:18

personally say as a whole like the

27:20

whole world of like vertical video and

27:23

most like short-form video content in general

27:26

is pretty hard to say is like reliable

27:28

sources at all and that's just gonna

27:30

be a hard fact for them to swallow of like,

27:33

yeah, basically anything on TikTok probably

27:35

not reputable. When

27:38

someone sends me something on TikTok, I'm like, why

27:40

are you believing TikTok? Like, give

27:42

me a professional with like a PhD at the

27:44

end of their name or at least like... Not to say

27:46

they're hard. Get somebody from YouTube. Not

27:50

to say there aren't necessarily... Not

27:52

to say there aren't reputable creators on TikTok. I

27:54

mean, I guess that would be the section is like investigate

27:57

the creator themselves.

27:59

What are their interests?

27:59

What are their— Like, honestly,

28:02

you know how we have, like, for elections, like,

28:04

for politicians? We have, like, transparency reporting

28:06

data on, like, who lobbies

28:09

for them. Where do they get their funding like all this stuff? We

28:11

might need to start doing that for youtubers It's like

28:14

this guy— does he get paid

28:16

by, you know, this entity? Like, trying

28:18

to uncover that kind of stuff because I feel

28:21

like that's really where it's going is who's taking the biggest

28:23

payout to say whatever opinion, you know,

28:25

that's that's where it's going, right? This class

28:27

would have done a lot better if it was just like how to

28:29

identify biases And how to

28:31

like not just in yourself, but in others

28:33

right? I

28:38

don't know But uh, there's

28:40

a lot of good resources out there and I think once you

28:42

usually can do that and try to see where they're trying to push

28:44

you towards right and That's

28:47

probably where this class is going. Does it say how

28:49

old the students are going to be? K through 12. Like,

28:52

yeah Freaking

28:54

all the way down to like hey, it's

28:56

not gonna be oh, I'm sorry. Oh, I

28:59

it's like I have

29:00

Like, when one of my kids

29:02

was in kindergarten, they said that that was one

29:04

of the new things they're doing Is

29:06

the distinction between fiction and non-fiction?

29:09

um, and now I have like my daughter at home

29:12

being like— she's, like, watching stuff on TV and is

29:14

like

29:15

That's fiction like because dogs can't

29:17

talk and it's like yeah, you're right. And

29:19

I was also like What

29:22

they learned they learned It's

29:25

like I was like, you know, like this is just gonna completely

29:27

wreck that whole, like, Santa

29:29

Claus, Tooth Fairy, Easter Bunny

29:31

type of narrative there because the kids get to go like

29:34

We're playing cool. Yeah, like we're

29:37

we're questioning like our news sources there. It's

29:39

like I I really don't think that somebody

29:41

Let's talk about the bible. Oh wait, no

29:43

too soon Yeah,

29:46

no, I also do think uh, like

29:49

my other reaction to this is like let's be real

29:51

Are the kids really the ones who need

29:53

this? No, they're not.

29:56

Well, we gotta start somewhere, right? Like,

29:58

I guess like, take

30:00

the scenario, like: to get your driver's license,

30:02

you have to look at two news articles, okay?

30:06

The US has recently invested

30:08

highly into public transportation now because

30:10

no one can get their driver's license. Then shortly

30:13

after they do: every time your driver's license

30:15

expires, you must take a class on

30:17

media literacy and like, yeah,

30:19

it's all, yeah. I mean, it gets into like,

30:22

I genuinely don't think that most younger

30:24

people would have trouble identifying this kind of stuff

30:27

is on

30:28

being, but like, really, I

30:30

mean, I don't know, this article does

30:32

quote some study that like, apparently under

30:34

the age of 30, there's a higher likelihood

30:37

of identifying things. I don't know. Well,

30:40

there's...

30:41

Go ahead, wait.

30:43

In college, I think like, my second or

30:45

first year at junior college, I actually took a media

30:47

studies class and they talked a lot

30:49

about not just like, biases in the news,

30:51

but how to like, look past and what they're trying to

30:54

actually talk about. And it's not just news articles,

30:56

but actual movies and stuff like, if

30:58

you look into this movie, I remember we used

31:01

the Dark Knight as a reference. And someone

31:03

wrote a whole essay on how the Dark Knight directly

31:06

relates to the Iraq War. And I was like,

31:08

how the heck did you get that? And

31:10

but looking, doing that practice

31:12

and trying to understand that type of stuff, even

31:15

if it is using works of fiction where dogs

31:17

talk and try to understand where they're going with it

31:19

is pretty important, I would say, at that lower

31:21

level. But yeah, the Santa Claus is going

31:23

to go right out the window. They're going to be like, everyone

31:25

but Santa. And so, okay.

31:29

This is interesting because like,

31:31

apparently, so they're quoting Ormsby,

31:34

which I'm sorry, I don't have the person's full name, but

31:36

basically,

31:38

they teach concepts like lateral reading,

31:40

which is comparing an online article to other sources,

31:43

reverse image searching — searching online

31:45

to trace a photo to its original source. Wow,

31:47

that's some OSINT stuff. I mean,

31:49

that's like real,

31:51

you know, lesson plans and things.

31:53

I think that like, so the other example

31:56

they give is like, they were students

31:59

during this is like during COVID, but I guess students

32:01

were presented with three articles, a

32:03

tabloid New York Post article and a scientific

32:05

article, and they were asked to choose which is

32:08

the most reputable and they chose the New York Post article 90%

32:10

of the time. Like, oh gosh.

32:14

I'd also like to point out like Corvus

32:16

is saying in chat,

32:17

a lot of kids actually do need to hear this

32:20

these days. I have to admit my

32:22

young daughter doesn't

32:24

know that— the thing is, like, until I said

32:26

it to her, she didn't know that the things she saw on TV

32:29

were not real. She thought all

32:31

that stuff was actually real.

32:33

So

32:34

we think it's common sense at this point, but

32:36

it kind of isn't unfortunately.

32:40

And I think it is actually

32:42

important that kids become aware of,

32:44

hey, there are people that are going

32:46

to lie to you on a regular basis out there.

32:49

Yeah, I don't have any question.

32:52

Yeah. I completely agree

32:54

with you. Like with my oldest daughter,

32:56

like I'm always talking to her about

32:58

like different things she sees on YouTube or whatever,

33:00

but they speak with such

33:03

like

33:04

emphasis and like

33:06

they know what they're talking about. Yeah.

33:09

Anybody watching it is like,

33:11

oh yeah, well, they look— they're so

33:13

confident. They know what they're talking about. It must be true.

33:16

Right.

33:17

And especially with the younger generation, they

33:19

don't understand like, go

33:21

research. That's what I tell her.

33:23

I said, go look it up. See if it's

33:25

real. Like check to find out. And

33:28

so, yeah, they just see this person

33:30

as.

33:37

Yeah, the class will not differentiate whether

33:39

Santa is misinformation or you

33:41

know, fake news. Oh, that's yeah, that's always. It's

33:43

almost as if some of these people are snake

33:45

oil salesmen.

33:46

Oh, oh, we could

33:49

do a conference about that. You

33:53

do have the gap here that I was coming up with

33:56

wanting to make mention of is like, you have the gap

33:58

here of

33:59

homeschooling.

33:59

So if you've ever like if you've seen

34:02

some of these documentaries, I know

34:04

It's a red flag.

34:05

Yeah, I mean, you sit there and see, it's like,

34:07

a lot of them, they just homeschool their

34:09

kids because they're like, oh, the school's not teaching them the right

34:11

stuff. That's cool— you're cramming their heads with a bunch of misinformation.

34:14

They're homeschooled just watching stuff

34:17

off of YouTube So these

34:19

programs are great, but they're not going to really,

34:22

you know, reach the people that need it. No— homeschooling

34:24

is really good for getting your kids to

34:26

believe in your cult though Hey, I'm

34:28

gonna speak on this one too because I did

34:30

homeschool my kids at one point. Um. Homeschooling

34:35

Has a bad rap

34:37

and I will agree that there are

34:40

definitely benefits to sending your kids to school My kids

34:42

do go now The social aspect

34:44

is definitely harder to take care of if you homeschool

34:46

them But not everybody that homeschools

34:49

their kids just has them sit in front of a TV and

34:51

watch YouTube to do it. I said—

34:55

what you saw, like— yeah. I had

34:57

to spend— Andrew Callahan did

35:00

a, did a series on this

35:02

for, like, I think, This Place Rules or something

35:05

and you saw that with the kids just

35:08

There are some really crappy ones out there I

35:10

Like, I was in a program where I had to spend

35:12

like two grand every year to buy the curriculum

35:15

and then learn it Unfortunately, we don't have

35:17

an Antisyphon subscription level for

35:19

homeschooling

35:22

I don't know anything about English, but I

35:24

know how to reverse engineer a packet. Okay

35:29

Anyway, let's let's get this train back on

35:31

track. Want to talk about breaches?

35:35

Let's talk about

35:37

ScreenConnect remote access. Let's talk about this cuz

35:39

I actually want this is something that like

35:41

is personally relevant to me

35:43

So basically the article is that hackers

35:45

have been breaching healthcare organizations using

35:47

ScreenConnect,

35:48

which basically is, you know, a RAT —

35:50

a remote access tool — but

35:52

One that I would consider more reputable

35:55

or like there could be legitimate uses for

35:57

it, right?

35:58

So

35:59

essentially, I mean Yeah, basically

36:01

this is an article published by Huntress.

36:03

They spotted the attacks, seeing them on endpoints.

36:06

Essentially, I mean does anyone know a lot about

36:08

ScreenConnect by chance?

36:09

We use it.

36:11

Okay.

36:11

Yeah, that's fine mean Yeah,

36:14

the one— one bad thing about ScreenConnect and a lot

36:16

of the other ones is that you can have multiple

36:19

instances installed to the same

36:21

machine so We

36:24

we actually it's it's kind of tricky

36:26

because obviously we're looking for detections

36:29

of those kind of tools But we

36:32

also have to kind of have had a conversation

36:34

of do we allow or ignore

36:37

ScreenConnect? How can you help

36:39

me tighten that up? But, right, also

36:42

that you know, it does kind of come

36:44

down to Okay, how did they get on it?

36:46

You know, who did they talk into it? there's

36:49

the there's the human element because

36:52

it's not like they're just dropping a Dropping

36:55

the agents on a on an endpoint

36:57

and boom, you're hacked. It's, you know, hey,

37:00

I need you to go to this website for me, and you get

37:02

and type in this code, and

37:03

And now I'll fix your computer

37:06

or whatever they're working on. So, one

37:08

interesting thing about this attack in particular is that they're

37:10

actually using the product as licensed

37:13

by another company that they compromised, right?

37:15

So it's like instead of going out and like

37:18

purchasing the product for themselves — because, you know,

37:20

KYC and money and all that stuff there —

37:22

they did, like, they took over a company's, you

37:25

know, ScreenConnect instance, and then they're deploying

37:27

it maliciously But it looks legitimate because

37:30

I guess it looks semi legitimate because it's related

37:32

to one of their you know potential business

37:34

partners

37:35

The reason it's relevant to me is because on a

37:37

tactical level This is what threat

37:40

actors or advanced threat actors are using now

37:42

more than ever They're using these like living

37:44

off the land, available legitimate

37:46

things that would blend in, that don't really fit

37:49

into the category of like shell code

37:51

or c2 or those kinds of things and Potentially

37:54

could lead to long-term persistence

37:57

because companies go, oh, scan

37:59

all the servers for malware — okay, we're good. Turns

38:01

out the ScreenConnect, the tool that you thought you were supposed

38:04

to have, is the actual malware. And

38:06

this is actually, we've seen this go as far as

38:09

being done using EDRs

38:11

as, like, a persistence mechanism.

38:13

Like having malicious either

38:16

creating API keys in EDRs

38:18

and using them to run commands or do whatever. Or you

38:22

know, just being in the console

38:24

running commands on every system, even air gapped

38:26

or I guess whatever, not air gapped, but you

38:28

know, out of band systems you can't access,

38:31

you could just run commands on them, right? So

38:33

there's

38:34

any that's gives the beginning but...

38:36

There is the recent CyberWarCon —

38:39

that is a good threat intel conference

38:41

if anyone's looking. It just recently happened though.

38:43

And one of their big subjects was the use

38:45

of living off the land tools and tactics,

38:48

right? And how that's growing bigger and bigger

38:50

and as a detection engineer, that stuff's usually

38:52

pretty hard to detect. You have to know

38:54

your internal, like your

38:56

internal team and the tools they're using

38:58

very well. And like you're saying,

39:01

if they were actively using ScreenConnect, there's

39:03

no way you're going to know what's different than a false

39:05

positive and a true positive with that. So

39:08

you really have to shore up, and maybe

39:10

block all the connections to the other ones if you can or

39:12

periodically scan if you don't have

39:14

some type of application whitelisting for any

39:16

of those tools, right? But I was...
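
One low-lift version of that periodic scan on Windows endpoints: diff the installed ScreenConnect client instances against the one you actually deploy. A rough sketch (assumes Windows, rights to read HKLM, and only checks the 64-bit registry view — a WOW6432Node pass would be analogous):

```python
# Minimal sketch: enumerate the Uninstall registry keys and flag every
# installed ScreenConnect client instance. Since multiple instances can
# coexist on one machine, any instance you didn't deploy deserves a look.
import winreg

UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def find_screenconnect_instances():
    hits = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]
        for i in range(subkey_count):
            subkey_name = winreg.EnumKey(root, i)
            try:
                with winreg.OpenKey(root, subkey_name) as sub:
                    name, _ = winreg.QueryValueEx(sub, "DisplayName")
            except OSError:
                continue  # key has no DisplayName value
            if "screenconnect" in str(name).lower():
                hits.append((subkey_name, name))
    return hits

for key, display_name in find_screenconnect_instances():
    # Compare each hit against your approved instance list / allowlist.
    print(f"Found: {display_name} ({key})")
```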

39:18

But I've got to step in here guys.

39:20

If you look at the bottom of the article, it's an

39:22

unmanaged on-prem device.

39:25

ConnectWise weighed in on the

39:27

article. So I

39:30

appreciate the technical

39:32

details we're talking about here but we also

39:34

have to look at cyber hygiene and

39:36

the healthcare organization wasn't practicing

39:39

good cyber hygiene.

39:40

That's a good point. That's

39:43

almost required in most healthcare environments

39:45

though. Like I mean not necessarily

39:47

practicing bad hygiene but like having unmanaged

39:50

devices. You're not gonna put EDR on

39:52

your like MRI machine and void the warranty. It's

39:54

like ten million dollars or whatever. I

39:55

don't know, right? I think the bad part is more

39:58

the it-hasn't-been-updated-in...

39:59

four years part. Well, that voids the

40:02

warranty too That's the thing these

40:04

also people are stuck in but like

40:06

if you update the MRI machine that breaks it and

40:08

that's on you. Like, you just voided the warranty. So

40:11

I mean that's like a tricky thing where like

40:13

they're stuck Yeah,

40:16

these defenses aren't gonna go in at the endpoint level

40:18

They have to go in above that, because we know that within

40:21

most healthcare environments the endpoints are off-limits

40:23

for patching and most like Obviously

40:25

it'd be great if the vendors allow you to patch or

40:27

provide patches, but that isn't always the case

40:30

It's almost never the case So that's

40:32

how I see it is like you really have to put these You

40:34

have to put these protections in at the higher level

40:37

and that's where it gets scary of like it's

40:39

a known app that we use I guess

40:42

it's like there might be some burden

40:44

on screen connect to go and investigate malicious

40:46

usage of their tool

40:47

Especially like

40:48

ripping out these accounts that are being abused

40:51

like that really is where that falls. I think is on

40:53

the vendor Acceptable, but on

40:55

the plus side, at least the IT team found that

40:57

machine that they've been missing for the last four years

41:01

You know, they were like, oh we thought

41:04

we got rid of that we thought it was a lease return

41:06

But Corey you're exactly right there. There

41:09

is a balance there because you're right. They're

41:11

not going to be patching machines that are

41:13

in surgical theater But

41:15

the organization I would like to see

41:18

Look at more compensating controls. Maybe

41:21

it's The surgical theater

41:23

is off the network or it's on a different

41:26

network I mean, we don't know

41:28

the whole story here and perhaps the

41:30

organization did have compensating

41:32

controls Maybe they didn't but I

41:34

would at least like to know that that machine is is

41:37

on a risk register Or as part of

41:39

some risk management process

41:42

at the organization as a whole for

41:44

sure I mean, it's also a supply chain Like

41:46

if you have if this machine is managed

41:48

by transaction data systems or whatever and they

41:51

get compromised You have to now

41:53

assume that all those systems across all their customers

41:55

are compromised not that I'm not I'm

41:57

not saying it's not interesting but Really,

42:01

I think it's more interesting to look at this as

42:03

a way that attackers are shifting their tactics

42:06

more than specific things that

42:08

the company victim could have done to prevent

42:10

it. It's more just like, this is what

42:13

attackers are doing now. They're going in, they're

42:15

trying to compromise companies that would

42:17

have — like, the MSSPs fall

42:20

into this group of companies that

42:22

would have legitimate access

42:24

to lots of other companies through

42:26

channels that they might not be monitoring. That's

42:29

why all the MSSPs are terrified

42:32

of this fact. So,

42:34

yeah, I don't know. It's just an interesting tactical shift

42:36

of like,

42:37

now we're not going to go after individual companies.

42:40

We're going to go after companies that manage other companies and then go

42:42

after everything they have.

42:43

So basically don't trust that vendor device in your network,

42:45

I guess is the... Never. Never.

42:48

Oh, the story. Never. Don't

42:50

trust anyone. F-E-T-A. Does

42:54

anyone have any articles they want to

42:57

bring to the table?

42:59

There's one about ngrok and WinRAR.

43:01

Is that how you say it? N-

43:04

G-R-O-K? "Ngork," man? "Nork."

43:07

It's an okay one just because I've written like,

43:09

detections for it. So

43:14

I, like, profiled the shit out of ngrok. And

43:16

I'm like, wait a sec, I can just block this at the firewall. And

43:19

I'm like, all right, I'm done. Oh, wait. Oh, you mean

43:21

N-G-R-O-K? You're talking about N-G-R-O-K. N-G-R-O-K,

43:23

"Nork," I don't know. I

43:25

got... No. No, no.

43:29

No. It's the price

43:31

interpretation. Okay,

43:35

so N-G-R-O-K, as it's known

43:38

colloquially — or "Nork," as it's known

43:40

only by Wade — is basically

43:43

it's a tool that people might be familiar with. It's essentially

43:45

like a reverse tunnel into your network. Essentially,

43:48

it's... So let's say you're a developer, you're running a local

43:51

node server. If you're a developer

43:53

and you're, let's say you're developing a local

43:55

application and you

43:58

want someone to be able to browse that application... on your

44:00

laptop, but you can't just open a port up. You

44:02

can have ngrok open up a port for

44:05

you and the traffic will be redirected through it. So,

44:07

essentially kind of a developer-y focused

44:11

tool, but we've actually used it for years, yeah,

44:13

for similar things to this. Looks

44:16

like it's being attributed.
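
For anyone who hasn't touched ngrok, the developer workflow described above is only a few lines. A minimal sketch using the third-party pyngrok wrapper — the package, the authtoken setup, and port 8000 are assumptions for illustration:

```python
# Minimal sketch of the developer workflow: expose a local dev server
# through ngrok's infrastructure. Assumes `pip install pyngrok` and that
# an ngrok authtoken is already configured on the machine.
from pyngrok import ngrok

# Ask ngrok to expose local port 8000; traffic to the returned public URL
# is relayed through ngrok's edge back to this laptop.
tunnel = ngrok.connect(8000, "http")
print(f"Local port 8000 is now reachable at: {tunnel.public_url}")

# ... share the URL, demo the app ...

ngrok.disconnect(tunnel.public_url)  # tear the tunnel down when done
```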

44:17

Wade, do you wanna talk about what you mentioned?

44:19

You wrote a specific section for it? What is it like?

44:22

Yeah, well, because when you, first you have to see if

44:25

you can install it on your network, yeah, right? Like what type of

44:27

application whitelisting you have? Hopefully they

44:29

can't even install it. But then

44:31

there's specific network traffic that you

44:33

can actually track. It's usually going to their domain.

44:36

So you could say anything going out to that domain. And

44:38

then there's a couple really good threat intelligence lists out there

44:41

that I can't necessarily find right now. That

44:43

have, because NGROC

44:46

is, there's a bunch

44:48

of different versions of this, right? So then that

44:50

opens up the whole like, okay, are we detecting

44:52

all of them? Are we blocking all of them? And

44:55

kind of,

44:57

you need application whitelisting at the long run,

44:59

but

45:00

it's not too hard, but it

45:02

felt like an easy win, at least for the detection

45:04

side. You use it a lot, Corey? So,

45:07

I mean, I don't use it as much nowadays

45:09

because like there's, honestly, from my

45:11

perspective, as like a pseudo, not really

45:13

developer, but someone who has similar situations

45:16

to this, like CloudFlare workers or

45:18

things like, or like the Azure Dev tunnels,

45:20

like there's lots of other better

45:22

replacements for this. And then, like, ngrok is kind

45:24

of a third party, like,

45:27

less trusted than maybe like Azure would

45:29

be. Although I guess you could argue that's even

45:32

less trusted, but basically,

45:35

I mean, how does this, they're using this

45:37

flaw and they're just tunneling. Yeah, so it's basically

45:39

just, they're using ngrok as a C2. So

45:41

the detections for this would be, like, relatively easy,

45:44

right? Like, you use— because ngrok

45:46

has their own infrastructure.

45:48

Yeah, so you just look for anything based off of them and

45:51

then it's not too hard, but. Of course,

45:53

the Russian hackers were using a free

45:55

version of ngrok, they were not

45:57

paying for it.

45:59

I signed up for the free version and then

46:01

did it myself and then in my own detection,

46:03

I'm like, oh, this is easy. I just went right through

46:06

it. So hopefully, it'll be different.
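
Since ngrok tunnels terminate on ngrok's own infrastructure, the domain-match detection described here can be very simple. A sketch with an illustrative domain list and log format — verify the current domains against a threat-intel feed, since ngrok has used several over the years:

```python
# Minimal detection sketch: flag DNS query logs that resolve ngrok's
# known tunnel domains. The domain list and the CSV log format here are
# illustrative assumptions, not an authoritative feed.
NGROK_DOMAINS = ("ngrok.io", "ngrok-free.app", "ngrok.app", "ngrok.dev")

def is_ngrok_lookup(queried_name: str) -> bool:
    name = queried_name.rstrip(".").lower()
    return any(name == d or name.endswith("." + d) for d in NGROK_DOMAINS)

# Example over a simple "timestamp,host,queried_name" DNS log:
dns_log = [
    "2023-11-20T10:01:02,WKSTN-7,update.microsoft.com",
    "2023-11-20T10:01:09,WKSTN-7,a1b2c3d4.ngrok-free.app",
]
for line in dns_log:
    ts, host, qname = line.split(",")
    if is_ngrok_lookup(qname):
        print(f"[ALERT] {host} resolved ngrok tunnel domain {qname} at {ts}")
```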

46:07

Yeah, this one isn't... I

46:09

guess what it is more than anything is just proof

46:12

of the random things

46:14

if you know your exclusions, I

46:16

think is what I would take away from this. If

46:19

you have legitimate uses for ngrok, now

46:21

you need to monitor that more

46:24

closely because we

46:26

know that it's being used for C2. So

46:28

but I will say like from my perspective as the red

46:30

actor or the red teamer, like we

46:33

can tunnel through your network. I promise.

46:35

Like I promise you I can tunnel my way

46:37

out of your network. You

46:40

sound like the guy from Men's Wearhouse right now. We can tunnel

46:43

through your network. I guarantee it. There's

46:45

a way. Like there is a way. So

46:50

I see this as more like an interesting

46:53

thing to talk about, but also like defensive

46:55

depth, right? Like you're... You can

46:57

block all the tunnels. It's not going to happen. No, you

46:59

can't block all the tunnels. It's a low level win if you're

47:01

doing some type of defensive work that isn't

47:04

a hard lift. And honestly,

47:06

nation states are using it. So it's a little

47:09

bit worth it in the long run, even though if

47:11

they... Nation states were using it against,

47:13

I believe it was the Ukraine. So like hopefully

47:15

they're not using it against you. But if they are,

47:18

good luck.

47:19

Let's talk about some botnets getting

47:21

nuked. Isn't

47:22

that the FBI taking out the botnet? Yeah, yeah. There's

47:25

a couple of them, right?

47:26

Or is there just... A really big one that

47:28

they were talking about. Just the IP storm botnet.

47:30

Yeah. Or if storm, I

47:32

don't know. I'm assuming. Or IPS

47:35

storm. IP storm sounded good. IP storm. IP

47:37

storm. Yeah. Yeah, basically the

47:39

US government announced that they took down the IP storm botnet

47:42

and

47:43

they got a guilty plea for the person who was running

47:45

it.

47:46

So I

47:47

guess this is an FBI notice.

47:49

They delivered malware to thousands of Windows,

47:52

Mac and Android devices and Linux.

47:54

Oh my gosh. I thought Linux was free of malware. Why this is impossible.

47:58

Basically using that as a... proxy

48:00

service these are terrifying we've talked about these

48:03

previously on the news they would that

48:05

proxy service then could be used this

48:08

is kind of the new I Guess

48:10

this is like the new definition of what I would call a

48:12

botnet right like

48:13

it's not it's not really like

48:16

It's basically like a DDoS-er. It's more— you—

48:19

it has more utilities than that. Yes, it's

48:21

more about attribution in like IP

48:23

source

48:25

allocation than it is about DDoS.

48:27

I feel like— I don't know what IPStorm— I

48:29

feel like I should know this but do any of these threat

48:32

actors like offer That

48:34

as a service right like if you had a huge botnet

48:37

and you wanted to masquerade your IP Does

48:39

anyone be like yeah, you can buy our service and go

48:41

through any of our bots? Oh, that was— Okay,

48:47

okay

48:50

Sergei Makinin —

48:52

Makinin — Sergei, a Russian and Moldovan national

48:55

who admitted to creating it between June 2019 and December 2022. He

48:59

pleaded guilty in September. Yeah, it's

49:01

basically apparently can be up to 10 years in prison

49:04

and He apparently earned $550,000 from

49:08

it. So that's it. I guess I

49:10

thought yeah I mean He said he

49:12

made at least that and he has to give up as

49:14

part of the deal and pleading guilty has to give

49:16

up all his Bitcoin he got from it. Also

49:19

What do you think

49:20

Morris and SEC whistleblower though?

49:22

Maybe

49:25

Maybe yeah, what he should be doing

49:27

and this is what, I bet, some of these services do: monitor—

49:31

monitor all the traffic going through the proxy

49:33

and then whistleblow when it's breach traffic. Provide

49:38

the service and then whistleblow on the users.

49:41

And yeah for those you

49:43

know someone Graham mentioned this in the chat, but

49:46

for those wondering how it worked

49:48

It caught— it sort of made news in 2019 because

49:50

it used IPFS for peer-to-peer So

49:53

it's like there's no kill switch for this IPFS

49:56

is like essentially for those

49:58

that don't know it's like a blockchain based file sharing

50:00

protocol that can have many

50:02

different proxies and nodes and things —

50:04

it's essentially designed to be peer-to-peer. But

50:07

it's, like— it can be based on HTTPS, so it

50:09

can be masked. We've used IPFS —

50:12

when, you know, when I said I could penetrate your network, that

50:14

would be one thing I would try. There's, like,

50:16

a million different

50:18

web front ends for IPFS, that

50:20

one of them is gonna be allowed in most networks.
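
That "one of the web front ends will be allowed" claim is easy to test from inside a network. A quick egress-check sketch — the gateway list is a small illustrative sample, and the CID is the well-known IPFS example directory (treat both as assumptions):

```python
# Minimal egress-test sketch: fetch one known CID through several public
# IPFS web gateways and see which ones this network allows out. Blocking
# one gateway doesn't block them all -- that's the whole point above.
import urllib.request

CID = "QmYwAPJzv5CZsnA625s3Xf2nemtYgPpHdWEz79ojWnPbdG"  # example CID (assumption)

GATEWAYS = [
    "https://ipfs.io/ipfs/{}",
    "https://dweb.link/ipfs/{}",
    "https://cloudflare-ipfs.com/ipfs/{}",
]

for template in GATEWAYS:
    url = template.format(CID)
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"ALLOWED  {url} -> HTTP {resp.status}")
    except Exception as exc:  # DNS block, proxy denial, timeout, etc.
        print(f"BLOCKED? {url} -> {exc}")
```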

50:23

So, all right, now I gotta go down the deep end on IPFS.

50:25

I've never heard of that Thank you

50:27

Wade, thank you

50:33

We can dive into— let's go into James' article

50:35

because I actually have no clue about this and

50:37

hopefully James you read it It

50:41

intrigues me — it, it

50:44

kind of, kind of kept me awake all

50:46

weekend. So, Microsoft

50:48

finished up Ignite last week

50:51

and one of the things they announced on the, kind

50:54

of blue team side is that you can

50:56

now use cyber deception functions

50:59

inside Microsoft Defender for Endpoint.

51:02

so right now they're actually

51:04

Have the ability where you can create an

51:07

alias account that would sit

51:09

inside LSASS, and if

51:11

it's tickled or anything like that

51:14

would alert

51:14

It also adds

51:17

host file

51:18

information to fake—

51:20

to masquerade

51:23

as a different device,

51:24

and they're gonna roll out some more things

51:27

related to file

51:29

share Other

51:31

other tactics like that the

51:33

one catch and the one thing that's kind of really frustrating

51:36

Especially on the small business side. This

51:38

is only available inside Defender

51:41

P2. So

51:43

if your organization has like

51:45

E5 or whatever, you're fine. But if you're

51:47

a smaller organization this

51:50

along with some of the other features of Defender

51:53

P2, might be worth the upgrade

51:55

for it.

51:57

And yeah, it's super simple to turn on. They

51:59

actually— [inaudible] you could

52:01

[inaudible] things to, like, create

52:07

[inaudible]

52:11

[inaudible]

52:54

but

52:59

they could be early warning or just potentially

53:01

wasting people's time. But

53:03

yeah, I mean, mostly I just feel bad

53:05

for any company whose business model was building

53:08

and selling this deception technology, because they

53:10

just got Microsofted.

53:11

Microsoft did. Well, they got Sherlocked.

53:15

Sherlocked. So,

53:15

the Canary tokens

53:17

system is great, but it is

53:20

a pain in the butt to deploy at scale. Even if you

53:22

have the paid version, there's

53:24

no quick way to just say, I'm

53:26

going to put a fake password Excel file

53:29

on every computer. You have to do

53:31

some management of those and then the

53:34

honey pots themselves, you have to decide,

53:36

well, where do I put them? So

53:38

now if you have machines that are, you know, work

53:40

from home or remote users,

53:43

they're bringing the honey pot with them at the same

53:45

time. So

53:46

that Gomer that's sitting there logging

53:49

into the Starbucks Wi-Fi,

53:52

you know, maybe somebody will

53:54

trip the alert from that.
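
To make that deploy-it-yourself pain concrete, here's what the homegrown version tends to look like: a stdlib-only polling sketch (mine, not Thinkst's product) that watches a decoy file's access time. The path is a placeholder, and it's best-effort only, since volumes mounted noatime, or Windows with last-access updates disabled, won't trip it.

```python
# Homegrown canary-file sketch: poll the decoy's access time and "alert"
# when someone opens it. Real products hook kernel-level auditing instead.
import os
import time

DECOY = r"C:\Shares\HR\passwords.xlsx"  # placeholder decoy path

last_atime = os.stat(DECOY).st_atime
while True:
    time.sleep(5)
    atime = os.stat(DECOY).st_atime
    if atime != last_atime:
        # Swap the print for email/Slack/SIEM in real life.
        print(f"ALERT: decoy {DECOY} accessed at {time.ctime(atime)}")
        last_atime = atime
```

Now multiply that by every endpoint and file share, plus the false-positive handling described below, and the appeal of having it built into the EDR is obvious.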

53:57

So yeah, how many people are going to trip the alert on

53:59

their own? Because

53:59

They're looking for their own password, so

54:02

I'm gonna say. I

54:04

had a couple of Canary tokens up

54:06

in shared drives, and all of a sudden, one started

54:08

going off. And it was a regular

54:11

user. I'm like, hey, why are you opening this? He's like, oh,

54:13

I was looking at how to try to reset my password.

54:15

And I found this document that said password, and this

54:17

dude's password's in it, so I tried to log in. I'm

54:19

like, what are you doing? Like, I

54:22

isolated his host right off the bat, just because this

54:24

is too big. Like, this is what you get

54:26

for doing stupid things and looking at other people's passwords. Bad,

54:29

go into timeout. Come out when you have

54:31

a good explanation. That is so tough. So

54:33

one thing that I am interested to see how they do,

54:36

where they don't really mention this a lot, is the lures,

54:38

or the digital breadcrumbs, right? That

54:40

is what I find is the difficult thing to do,

54:43

is to get the attackers to build

54:45

that path to your larger

54:48

deception stuff. Because like James said,

54:50

the Canary tokens are great, but I received

54:53

a large amount of false positives sometimes with those.

54:55

So having that be the first stage, and

54:58

then having that either server or domain

55:00

admin, or something be at the second stage, and

55:02

then just luring them in further and further. How

55:06

often do you guys, do you even if you want

55:08

to say this, Corey, do you guys actually see deception

55:10

out there?

55:11

A lot.

55:12

A lot, okay. I mean, again, I'm

55:14

biased because the customers that I work with at Black

55:17

Hills are typically the most mature customers, but

55:19

yes, we see it a decent amount. I

55:21

would say if we don't see it, we recommend that

55:24

customers implement it, even if it's just

55:26

a Kerberoastable account that looks really juicy and

55:28

sets off a break-glass alarm. It's my favorite.
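
One way to wire that alarm up (a sketch of my own, not something from the episode): on a domain controller with Kerberos auditing enabled, watch Security event 4769, "a Kerberos service ticket was requested", for the decoy account. The account name is invented and the text matching is deliberately crude.

```python
# Sketch: alert when anyone requests a service ticket for (Kerberoasts)
# the decoy account. Run on a DC; event 4769 must be audited.
import subprocess
import time

DECOY_ACCOUNT = "svc-sql-legacy"  # made-up honey account with a juicy SPN

seen = set()
while True:
    out = subprocess.run(
        ["wevtutil", "qe", "Security",
         "/q:*[System[(EventID=4769)]]", "/f:text", "/c:50", "/rd:true"],
        capture_output=True, text=True,
    ).stdout
    for block in out.split("Event["):
        if DECOY_ACCOUNT.lower() in block.lower() and block not in seen:
            seen.add(block)
            print("ALERT: decoy account Kerberoasted:\n", block[:400])
    time.sleep(30)
```

No legitimate process should ever request a ticket for that account, so any hit is worth waking someone up for.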

55:31

Basic easy stuff, I would

55:33

strongly recommend to customers. And we

55:35

typically, if we don't see it, when you're talking

55:38

about endpoint-based deception, I would

55:40

say that's maybe one in five, one in 10. You

55:43

don't see that many customers

55:45

that are using like, and

55:48

they are really annoying because we'll be running, the

55:51

ones that we've seen, I don't know the specific product, but like

55:53

if you run BloodHound from a host, it'll just be completely

55:55

fake results. Like it'll be like, it'll

55:57

be completely, it'll hook that

55:59

process or whatever and just return completely fake

56:01

results to BloodHound or SharpHound or whatever.

56:04

Ooh, I don't even know. So that,

56:06

I don't know what product is doing that. We've seen that one before

56:08

and it is funny because what we end up doing is just doing

56:10

it. We just run SharpHound through

56:12

like a proxy, you know, like we tunnel

56:15

it in. And I will say like,

56:16

to be perfectly honest, again, we've talked

56:19

about tactics, whatever, we

56:21

are moving to the point of like, essentially,

56:24

me as the threat actor, I am doing

56:26

my best to never

56:29

touch a managed endpoint at

56:31

any cost. Like I am

56:33

either deploying VMs in Azure, deploying

56:35

VMs in ESXi, putting VMs

56:38

on existing hosts, like guess what, you can just download

56:40

a portable QEMU and just

56:42

run a

56:43

VM.

56:44

Or tunneling traffic, like I don't

56:46

want to touch the managed hosts. The managed hosts are

56:48

the most,

56:49

that is where the security effort has gone

56:52

in for the last like five, 10 years. So

56:54

people, like, companies detect

56:57

everything on their managed hosts pretty well. I haven't done

56:59

a pentest in probably five years where they didn't see any

57:01

traffic on a managed host. But when

57:04

I get into their Azure and deploy a new VM,

57:06

just a completely clean slate, they

57:08

probably won't notice that. And that's also

57:11

what real threat actors are doing too. That's one of Scattered Spider's

57:13

techniques is like, get into ESX or

57:15

whatever, deploy a VM, that's

57:18

not your beachhead. Like that is where you're operating

57:20

for the rest of the engagement. So just something

57:22

to keep in mind, like deception is really great,

57:24

especially for insider threats or other like, you

57:27

know, it's worth doing, but it isn't

57:29

going to protect you always.

57:31

Even then. Do I have anything else? Another good point

57:33

is, I haven't seen a lot of good deception tactics

57:36

based around cloud stuff. Like

57:38

I just haven't seen it set up, or any blogs about

57:40

it either. So if anyone out there is like a deception

57:42

guy, like if you'd throw a blog about how to like,

57:45

set up deception in AWS, I'd love you.

57:47

The one that I don't know if anyone's

57:49

done this, but the one that would be really awesome would

57:51

be if you did a deception product

57:54

that would

57:54

fake the metadata service in AWS instances

57:57

or other instances. So like it would just give you fake information.
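
As a toy sketch of that idea (invented here for illustration, not an existing product): a fake IMDSv1-style responder that serves decoy values and logs every probe. The paths and honey credentials are made up, and routing 169.254.169.254 to the listener (e.g., with iptables) is left out.

```python
# Toy honeypot sketch of the AWS instance metadata service (IMDSv1-style).
# Every query is an alert; the credentials are honey values you'd also
# alert on if they are ever used against the real AWS APIs.
from http.server import BaseHTTPRequestHandler, HTTPServer

FAKE_PATHS = {
    "/latest/meta-data/instance-id": "i-0deadbeefcafe0000",
    "/latest/meta-data/iam/security-credentials/": "decoy-admin-role",
    "/latest/meta-data/iam/security-credentials/decoy-admin-role":
        '{"AccessKeyId": "AKIAHONEY0000EXAMPLE", "SecretAccessKey": "fake"}',
}

class FakeIMDS(BaseHTTPRequestHandler):
    def do_GET(self):
        print(f"ALERT: metadata probe from {self.client_address[0]}: {self.path}")
        body = FAKE_PATHS.get(self.path, "")
        self.send_response(200 if body else 404)
        self.end_headers()
        self.wfile.write(body.encode())

HTTPServer(("0.0.0.0", 8080), FakeIMDS).serve_forever()
```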

57:59

In the metadata or the yeah,

58:03

it's a Windows 98 computer. You're like what? Yeah,

58:11

you have to make them not too obvious,

58:15

but there are some, especially the paid versions,

58:18

where the advantage to those is you

58:20

can change the

58:21

characteristics of

58:23

that honeypot pretty quickly.

58:26

We did an exercise with

58:30

a National Guard unit and they

58:32

kept scanning the VIP, and then we would

58:34

change its personality. And

58:37

they'd come back and scan it and then maybe this

58:39

time they were like, oh, hey, it looks

58:41

like HVAC. Why would HVAC be in the DMZ?

58:44

I don't know, but I'm gonna try to run some

58:46

weird exploits and

58:47

and then nothing happens, and then they come back

58:50

around like, wait, wasn't this the IP? Like

58:52

now it's a remote desktop server. What the heck? And

58:55

so, I mean, if you could sit

58:58

there and mash on the buttons

59:01

and keep them busy, that's, you

59:03

know, that kind of goes into that annoy and

59:05

attribute type, you know, tactics

59:07

that you want to employ against

59:10

the threat actor

59:11

Yeah, for

59:12

sure. It's good. Anything

59:14

that's integrated into Microsoft products, we generally

59:16

see a lot of benefit for

59:18

our customers, because where

59:20

everyone's at everyone has some Microsoft

59:22

licensing whether or not they use, you

59:24

know, everything to its fullest potential, maybe they don't,

59:27

but most people do have some

59:29

Microsoft licensing. So I

59:31

do think it'd be nice if they rolled this out to lower

59:34

tiers, for sure. And maybe they will

59:36

in the future but for now, it's

59:38

a good start. But yeah, I

59:40

think that's uh, I

59:41

think that's a good place to call it, everyone. That's

59:45

it. Anyone have anything? Last

59:47

call

59:50

The FCC one was interesting, but I don't

59:52

think it's too important unless anybody wants to talk

59:54

about it. Well, so, Ryan,

59:56

let's cover some, you know, what's happening

59:59

at Black Hills

59:59

What's going on? We got any summits coming

1:00:02

up? Any snake oil? We do. Anything?

1:00:04

We got our Snake Oil (question

1:00:07

mark) Summit, where we

1:00:10

talk about topics that may be snake oil

1:00:12

or may not be snake oil.

1:00:14

The discussion is,

1:00:16

is it snake oil or

1:00:18

not? We're not trying to sell snake oil. We're

1:00:20

trying to decide is it snake

1:00:22

oil? Just

1:00:25

like that K-12 class, that's what we're

1:00:27

gonna do for us. Yes. So

1:00:30

I don't know if, yeah, we did, we saw

1:00:32

a pretty funny comment. I don't know if it made it outside of like

1:00:34

our internal BHIS chats, but some person like

1:00:36

messaged us and was like, thank you so much

1:00:38

for trying to sell snake oil. We were like, wait, what?

1:00:43

Finally, someone being honest about what they're

1:00:46

selling. Yeah,

1:00:50

no, that's not the intention. The intention is

1:00:52

to give people a platform to talk about certain

1:00:55

topics that they either think are snake oil

1:00:57

or aren't snake oil. So I think

1:00:59

it'll make a lot more sense once the talks come

1:01:02

out. Are the talks released yet, Ryan?

1:01:04

We're starting to get them filled out. We

1:01:07

have them picked out. We're

1:01:09

starting to get the schedule filled out this week. Once

1:01:12

the schedule comes out, it'll make a lot

1:01:14

more sense what the summit is really about. But looking

1:01:16

through some of the submissions we got, there were

1:01:18

things about, this vendor's

1:01:20

good or this vendor's not good or this whole

1:01:23

part of the cybersecurity industry is bad. That

1:01:25

was some of the good stuff. So

1:01:27

it's more about just having open and honest

1:01:29

conversations about the way that we do things

1:01:32

and whether they're, are they good or

1:01:34

are they bad, opening that discussion.

1:01:36

So it's not about vendor bashing necessarily.

1:01:38

It's just about

1:01:40

opening the discussion from the perspective

1:01:42

of

1:01:43

a

1:01:43

rank and file employee or a CSO or

1:01:46

whatever. What is snake oil and what

1:01:48

isn't snake oil? I'm really looking

1:01:50

forward to Graham's, or Slothboy's.

1:01:52

It's funny he says Slothboy in the chat to the right, but he actually

1:01:55

says Graham in Discord, so whatever. But

1:01:57

I'm really looking forward to his and it's pretty

1:01:59

much based around

1:01:59

cyber education. And as

1:02:03

someone who's done a lot of education and

1:02:05

is like part of a boot camp that

1:02:08

doesn't really like the boot camp,

1:02:10

but I'm really

1:02:12

interested in what he has to say about it, because I really

1:02:14

wanted to do something like that. So you better,

1:02:16

you better knock it out of the park, Graham. If you don't, wow.

1:02:19

Wait, setting the expectation right

1:02:23

in that talk. I want to get on that.

1:02:26

If you're anything like me and you write your talk three days

1:02:28

before it is. But yeah. Um,

1:02:31

I think that's a good place to call it. Thanks everyone for coming

1:02:33

And I guess if you're in the US have a good holiday

1:02:35

week. Otherwise, have a

1:02:37

good week.

1:02:38

No, no bummer news articles this week.

1:02:40

It was great. Like, we kept it at least

1:02:42

like a level playing ground the whole way. I'll

1:02:45

say one more thing about what we're doing this

1:02:47

week. We have an Anti-Cast happening

1:02:50

on Wednesday, but nothing on Thursday,

1:02:52

because of course here in the US it's

1:02:54

turkey day. It's Thanksgiving. So we're

1:02:56

not working.

1:02:57

But so if you want to check out a webcast from

1:02:59

us, we have one on Wednesday, and I'll put

1:03:02

that link in the chat as well.

1:03:03

All right, and with that, everyone, see

1:03:06

you!

