Unlocking the Secrets of Effective Insider Threat Management With Joe Payne From Code42

Released Monday, 25th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:54

Joe , it's great to finally get

0:56

you on the podcast . You know , I feel like we've

0:58

been trying to get this thing scheduled

1:00

for quite some time now

1:02

, but you know , with our

1:05

schedules being so hectic , you know , I'm

1:07

glad that we could finally get on .

1:09

I think it's you know , you and I spent a lot of time

1:12

with the hairdresser , I think , and

1:14

so getting our appointments all lined up

1:16

without a podcasting , that was probably the difficulty

1:19

here . Yeah .

1:20

Yeah , and this

1:22

head of hair . You know it doesn't maintain itself

1:24

, it is necklace .

1:26

Yeah .

1:28

So , Joe , you know , why don't we start

1:31

with , you know , your background

1:33

, right , how you got into it , how

1:35

you got into you know kind

1:37

of the security side , what made you

1:39

want to go down this route , was there , you

1:42

know , something that you maybe discovered

1:44

earlier on in your life that you

1:47

know kind of took you down this path . When you're looking

1:49

back on it , or what was it like

1:51

?

1:51

So my story is a little different from probably a lot of

1:53

your guests' , because I

1:56

my first job out of school was actually

1:58

the soft drink business and I

2:00

was working for Coca Cola

2:02

in marketing and

2:05

you know I was

2:07

having a lot of fun and soft drinks , you know

2:09

commercials and all

2:11

kinds of crazy stuff . But I remember

2:13

at the end of one year we grew

2:16

. It was like we had a great year and we

2:18

grew 6% and I

2:20

was like , hmm , it got

2:22

like in a different industry it grows a little faster

2:25

than beverages and I

2:27

was always interested in tech , and this was

2:29

the mid-90s , and tech wasn't big

2:31

back then , if you think about it . You know there

2:34

was no social media . There was very

2:36

little internet . Very few people

2:38

had email addresses . When I first

2:40

joined in tech , I'll just tell you a funny

2:43

story about myself my first

2:45

email address was Joe

2:47

and Karrosh , which is my wife's

2:49

name , at AOL.com , and at

2:51

the time we thought , well , every family would just have

2:53

one email address , you know , like you have one

2:56

physical address , and so we had

2:58

a shared email address . So that's how far

3:00

back I go in tech . But

3:02

then I got interested in security

3:04

in the early 2000s . It

3:06

was an emerging space , an exciting space and

3:09

I was the CEO of a company called eSecurity ,

3:11

which made one of the first SIEM products

3:13

way back in the day . And

3:16

what really turned me on to security

3:18

was when I worked at iDefense . At

3:21

iDefense we worked doing

3:23

threat intel and our customers were

3:25

all of the three letter agencies in

3:27

the government , all the largest banks

3:30

in the world , and

3:32

at the time I also got security

3:34

clearance . And so when you get

3:36

connected to

3:39

the three letter agencies and you're looking at

3:41

all of the things

3:43

that are happening in cybersecurity

3:46

, it's incredibly exciting

3:48

. And iDefense

3:50

was a lot of fun because we had

3:52

white hat hackers . We were also

3:55

tearing

3:58

apart software around the globe

4:00

with our white hat team just

4:02

trying to get ahead of the bad guys . So it

4:04

was a lot of fun back then , and I've been

4:07

interested in security ever since .

4:10

Yeah , it's interesting

4:12

working with the government . Earlier on in

4:14

my career I worked for a very

4:17

small company that

4:19

did Enhanced 911

4:21

software and so

4:23

the government was a huge consumer

4:28

of our product and somehow

4:30

I finagled my way into leading

4:33

the entire tech side for our government

4:35

clients and

4:37

I mean it really

4:40

does just like open up

4:42

your mind into

4:44

what's possible . And I feel

4:46

like it's different from being a government

4:48

employee , because the government employees

4:50

are typically very siloed and

4:54

as a contractor you can come in and say

4:56

no , I need to know what's

4:58

going on over here . I know you normally don't

5:00

tell people , but this impacts my work

5:02

of how I provide this product , and

5:04

so I need to know these different pieces . And

5:06

so you learn probably way more

5:09

than what you should actually learn in

5:11

some cases . And

5:14

it's always interesting because I

5:17

was going to a facility for maybe

5:20

a year or two and I never

5:22

knew what they did . I

5:24

mean , you're in the middle of the mountains West

5:26

Virginia , nothing

5:28

else is around . A town was built

5:30

around this thing to support

5:33

contractors coming in and

5:36

I have no clue what we're doing . And

5:41

one day I guess my handler that's what I

5:43

called him my handler at the facility

5:45

. He was like Joe , do you not

5:47

know what we do here ? I said I

5:50

have no clue . I literally have no clue

5:52

. And he

5:54

pointed to these satellites that were

5:56

, these giant satellites on the on

5:58

premise and told me one

6:01

of the small things that they did . And

6:03

I'm like my God , where am I ?

6:06

Well , Joe , you just made a couple of

6:08

great points to touch on first . So

6:11

I'll say my area of expertise now in

6:13

cyber is running cyber threat and insider

6:15

risk , and so you just identified

6:17

a couple of things around why

6:20

it's so important . Because , to your point , like

6:22

contractors in particular do get a

6:24

lot of knowledge around things , and so

6:26

contractors end up being the biggest risks

6:28

on the federal side . But

6:30

the other side I think it's just as important to touch

6:32

on and I often have to talk to

6:34

the old people on my teams about

6:36

this and that is that the United States

6:39

government is , in

6:41

my opinion , the best in the

6:43

world in cyber . I

6:45

mean the best in the world . What's interesting

6:47

about that is that when most

6:50

people think of government employees , they

6:52

think of people that work in agencies

6:54

and the stereotypes come out

6:57

: oh , these government employees , they don't work

7:00

that hard , they just shuffle

7:02

along . It takes a long time to get the government

7:04

to do anything . You know all these stories et cetera

7:06

, but what I learned in

7:08

working with our Intel

7:10

community , our law enforcement community

7:13

around cyber , is that we

7:15

literally have the best people

7:17

in the world at the NSA

7:20

in particular , at the

7:22

FBI , a little bit at the CIA

7:24

, in the military , we

7:27

have the best people in the world in cyber .

7:29

Nobody's better than us , and

7:33

that level of understanding and respect

7:35

is really important for all

7:37

of us that work in a cyber community , and

7:40

so I'm glad you brought that up

7:43

, something that I think we can be really proud of , and

7:45

it's something , honestly , that keeps us very safe

7:47

as a country .

7:49

Yeah , you know , it's

7:51

very interesting that you bring that up , you know

7:53

, because sometimes

7:56

, as from

7:59

the outsider perspective , right and I'm

8:01

talking about people that aren't even in cybersecurity

8:03

it looks like to them that

8:07

we're so far behind everyone

8:09

else that we're not

8:13

capable . There's too much bureaucracy

8:16

. And I will say this

8:18

there probably is too much bureaucracy

8:20

, right , but at the same time , I

8:22

know the crowd of

8:24

security professionals and

8:27

we need those rules , because if we don't

8:29

have those rules and you just say , you

8:32

point us towards China and you're like

8:34

hey , you've got the power grid , you

8:36

know go have fun , right , like

8:39

yeah , it's going down and they

8:41

won't be able to recover from it , but

8:43

like that's something you know you want to really

8:45

tie it , tie into other things

8:48

and really control , right , and

8:50

it's a . It's

8:52

fascinating that you bring up . You know that we have

8:54

that talent pool there , because I

8:56

actually had on someone

8:58

maybe two years ago at this point and

9:01

the episode was never released because

9:03

the military asked me to not release

9:05

it and you know I do not want

9:08

the United States government showing up at my door

9:10

for any reason

9:12

. And

9:15

you know he

9:17

talked about how the

9:19

training that he was going through is a two

9:21

year program , for you

9:23

know , one of the three letter agencies . It was a two

9:25

year program and he said , literally

9:28

every single day , at any

9:30

point in time , you can be cut . If

9:32

you write an inefficient line of code

9:35

, if you , you know , do something that you

9:37

shouldn't have done , if you put

9:39

it in the wrong language , whatever it might be

9:41

, you know you can be cut right there . That's

9:44

the standards . And he said you know

9:46

only a very small amount of people

9:48

actually make it through and you typically have

9:50

like a six to 12 month

9:52

ramp up time before you even go

9:54

to the school and

9:57

you know those standards . They

10:00

come into play in the real world because now

10:02

you know what the bar is , you know you . You

10:04

kind of know what you're expected

10:07

to do and what you're able to perform at

10:09

and whatnot , and that's expected

10:11

of you . You know day in and day out , and I

10:13

feel like I've always

10:15

felt like we've had the best cyber

10:18

capabilities , for sure . And

10:21

if you just look at the things that have leaked , you

10:23

know , though , like EternalBlue and

10:25

how many zero days

10:28

came out of EternalBlue , right

10:30

, and that's , you know . That's

10:32

10 years ago at this point , Well , it's

10:34

interesting .

10:35

So you know , first of all , I

10:37

will tell you that in working with the NSA , I

10:39

interacted as much

10:41

with lawyers at the NSA as

10:44

with the , with the cyber experts

10:46

. That's how careful they are at

10:48

following our laws and

10:50

so you know they're not , you

10:52

know , they're not doing things that

10:55

they shouldn't . They are very , very

10:57

, very careful . That's

11:00

something people should know . But

11:02

you mentioned EternalBlue and

11:04

stuff like that . All the leaks

11:06

we had in the US have really

11:08

been because , post-9/11 ,

11:11

the cyber

11:13

community , you know

11:15

, created tools to share

11:18

broadly the intelligence and

11:20

there's over a million people that have secret

11:22

clearances , that have access to

11:24

some of our nation's most you

11:27

know , sensitive information , and

11:29

, you know , the Jack Teixeira

11:31

leak recently is because

11:34

it just takes one of those people

11:36

to use that information the

11:39

wrong way out of a million people

11:41

. So

11:43

the price that we're paying right now on those

11:45

leaks is that we give

11:47

a lot of access to people

11:49

and because we want to share

11:52

that data , and , you know

11:54

, one bad actor can make a choice

11:56

and can cause that damage . So it

11:59

is an interesting balance . I know there's

12:01

a big back and forth in the intel community

12:03

about that at all times . But

12:05

honestly , I don't want to

12:07

dwell on it . I just wanted to point that

12:10

out to your listeners :

12:12

if you're young and you're

12:14

not sure about these things , just know

12:18

that

12:20

the people that are

12:22

protecting us from serious threats are the best in

12:24

the world , and that's

12:26

good to know and it's also important to respect

12:28

.

12:30

Yeah , absolutely , you

12:32

know to dive into the insider

12:34

threat a little bit more , right

12:37

, do

12:39

you think that these Snowden

12:41

leaks kind of set the stage

12:43

for you know all

12:45

the other leaks to follow . Right , how

12:49

big was that , how meaningful was that

12:51

from a you know quasi

12:53

insider , you know expert

12:55

knowledge , right , like you probably have a little

12:58

bit more knowledge on that leak than anyone

13:00

else . Right , like , what

13:02

was ?

13:03

that like that was a

13:05

devastating leak . I mean , that was

13:07

a devastating leak for the United States . You

13:11

know , people lost . Some people lost their lives over

13:13

that leak because there was so much information

13:16

that ended up in

13:18

hands that it shouldn't have ended up in . And

13:21

I think it was also devastating for Booz

13:23

Allen and their reputation , because he was a Booz

13:25

Allen contractor and it

13:28

was so basic

13:32

. I mean , all

13:34

thumb drives were supposed to have

13:36

been , you know , all

13:38

external drives were supposed to have

13:40

been blocked , you know , and

13:43

they just hadn't gotten to that facility in

13:45

Hawaii to put those

13:47

controls in place , and so it was easily

13:49

preventable . Keeping with policy

13:52

would have prevented it . So

13:55

you have to be careful about holding up

13:57

that level of control . Look , here's the thing about

13:59

insider threat and insider leaks

14:01

. We're always going to have them and

14:05

so you have to really put controls in

14:07

place . But

14:09

you know , back to this conversation about sharing data

14:11

, you've got

14:14

to balance that with the need to

14:16

share and collaborate and

14:19

if you step outside the government for just a

14:21

second and go into corporate , which is probably most

14:23

of your listeners are corporate . I

14:27

think what surprises people is

14:29

how pervasively data

14:31

is moving out of organizations today

14:33

. So we all might say , oh , you

14:35

know , so-and-so wouldn't have done anything so

14:38

stupid , or how could we be so stupid

14:40

? But it happens ,

14:42

so look in your own backyard .

14:44

I mean , when our

14:46

company , which specializes in helping organizations

14:49

with insider threats , first comes

14:51

in , what surprises

14:53

our clients is that the

14:55

amount of data moving out of the organization

14:58

is incredible . Literally every single

15:00

person that leaves your organization

15:02

for a new job takes a

15:04

lot of critical data . Literally

15:06

every single one . 60% of

15:08

people admit they took data from their

15:10

last job and are using it in their

15:12

current job . That's

15:15

the people that admit it . So I'd say

15:17

it's probably more like 90%

15:19

, and so it's

15:21

really . You know , what's interesting

15:24

right now is that the whole phenomenon of AI

15:26

and the datasets around AI people

15:29

are finally starting to pay attention and say

15:31

, oh , wait a minute , do I need to be worried

15:33

about my source code leaving

15:35

or my large

15:37

datasets leaving , or , you

15:40

know , your customer lists ,

15:42

et cetera . So we're seeing a real awareness

15:44

in people because

15:46

of the focus on AI datasets

15:49

. But , to

15:51

be fair , that's just a very

15:53

small tip of the iceberg for what's actually

15:55

leaving in most organizations

15:57

. We see a lot of source code

16:00

, customer lists , HR

16:02

data . We see that all

16:04

the time . So it is an area

16:06

I'm glad is

16:09

finally getting some attention . We

16:11

can talk about sort of different stories around that

16:13

and how it's happening , why , but

16:15

on the corporate side , just to touch on the , you

16:18

know , on the stone side , on

16:20

the corporate side , the

16:23

breach that really sort of woke everybody

16:25

up in the last few years is Anthony Levandowski

16:28

, who took data from Waymo

16:30

which is all their self-driving

16:33

car data and , just like Snowden

16:35

, took just a trove of the information . Levandowski ,

16:38

in the last week of his

16:40

time at Google , or

16:42

Waymo , downloaded

16:45

all that data , and he just took it on an

16:47

external hard drive and then

16:50

used it for a year before

16:53

they realized it . And they only realized

16:55

it because one of the suppliers

16:57

accidentally emailed

16:59

Waymo and said , hey

17:01

, here's the confirmation on this part

17:04

. And the people at Waymo were like , wait , that's our

17:06

part , we didn't order this , and

17:08

it turns out he had

17:10

just literally taken the exact schematic

17:12

for his new company and was making that

17:15

part , and so you know it was all

17:17

discovered accidentally by the

17:19

folks at Google . That was

17:21

sort of the biggest , that's

17:23

the Snowden case on the corporate

17:25

side . And you know ,

17:27

what was interesting in

17:29

the end , for your listeners , is he

17:32

went to prison for

17:34

that and was serving time

17:36

in jail when , in the

17:38

last days of his presidency

17:41

, President Trump pardoned

17:43

Anthony Levandowski with

17:45

little to no explanation

17:47

. So it's an interesting

17:50

sort of story that kind of keeps

17:53

going .

17:56

Yeah , that's really fascinating

17:58

. I actually I didn't . I

18:01

don't think I knew about that . You

18:04

know , is Waymo still around ?

18:06

after that . Yeah , Waymo's still around . It

18:08

was a massive issue between

18:11

Waymo and

18:13

then Uber because Uber

18:15

. It looks as though Uber basically

18:17

encouraged Lewandowski to take that data

18:20

, stand up his own company , and then Uber

18:22

bought that company six months later . And

18:25

so Anthony Lewandowski was at Uber

18:27

when they discovered this and Uber was working

18:29

on all the self-driving car stuff . So

18:32

it was a long lawsuit

18:34

. They ended up settling in the

18:36

end , but then ,

18:38

after they settled the lawsuit , the

18:41

criminal trial came , and

18:43

so it was a big deal . Some people

18:45

think it's , you know , one

18:47

of the reasons Travis lost his job .

18:49

It was all around the same time . So

18:53

it was a big deal . It

18:56

was very controversial at the time .

18:59

And that's really fascinating because

19:02

you know , like

19:04

, how does that even , how does that even

19:06

come about ? Right , like

19:09

you know , you're working for Google on this project

19:11

. A competitor

19:13

, a direct competitor , like

19:16

they , must have approached him at some point

19:18

during his tenure at Google to

19:20

approach him with this idea . You

19:24

know , I maybe I'm just not in the insider

19:26

threat space , right , but

19:29

I mean that's a , that's a kind of corporate espionage

19:32

, that that

19:34

I I never would have thought

19:36

of , I never would have imagined

19:39

that a competitor would go to that

19:41

length . You know , I could see

19:43

them , you know , buying

19:45

the company , right , but the for

19:47

the idea to come into place

19:50

with this guy , to take

19:52

the code form a company , start

19:55

, you know , probably develop it with it , try

19:57

and order the parts that were proprietary

20:00

, you know , for what he was

20:02

using at Google . I mean ,

20:04

that's just a different level .

20:07

That , I feel , happens every day .

20:08

We had a new client just

20:11

a couple of weeks ago at code 42

20:13

where they had an employee

20:16

that left and stood up his own company

20:18

with their source code . The way

20:20

it happens is , people that are

20:22

smart and people that create new

20:24

things are all valuable to companies

20:27

. A lot of them believe ,

20:29

I created this design , so

20:32

you know , I'm gonna take it with me to

20:34

my next job . And

20:36

that's not the way intellectual property works

20:38

. When you work for a company , when you

20:40

join , you sign a document that says anything I create

20:42

for this company , the company owns , and

20:45

so we see this every day with

20:47

source code right now . Most software

20:50

developers , they have

20:52

their own GitHub

20:55

, GitLab accounts , their own

20:58

repos . You go talk to a developer

21:00

friend : hey , do you have your own repo ? Oh , yeah , I

21:02

have a repo of the projects I've worked

21:04

on my whole career . Okay , well , those projects were

21:06

for companies , so you

21:09

know , that's those companies' source code you

21:11

have . I mean , they'll say , it's not like I took it , I

21:13

left the copy , the company's using that copy

21:15

, I just took a copy of the source code with

21:18

me . And we see that time

21:21

and time again and it's

21:24

interesting , and this ties back to the AI

21:26

stuff I was talking about . So the

21:28

way most developers

21:30

take

21:33

source code is kind of interesting

21:35

and very straightforward . Essentially

21:38

, you know , every day when the developer pulls their code

21:40

down from the corporate repo , you know that corporate

21:42

code is locked up tight . You know ,

21:44

you gotta have the right credentials on your machine to get into

21:46

it . Sometimes it's multi-factor authentication

21:49

, but like , you're not getting in that repo

21:51

unless you're authorized to . Right , most companies

21:53

are really good at that now , and

21:55

so , once the developer takes the code down

21:57

to their local machine , that's when they start writing

21:59

code . They got their test cases running , they

22:02

got a bunch of dependencies , etc . They

22:04

write code , and then they check it back into

22:06

the corporate repo . Well , you

22:08

know , in order to check it back in , you

22:10

use git commands to push

22:12

that code . It's called a git push

22:14

. You push that code back up to the repos .

22:17

What a lot of developers do is , at the same

22:19

time , they're gonna do that git push

22:21

to their own personal repo , and

22:24

you know if they're sitting at home and

22:26

they're off the corporate network , which most developers

22:28

are these days , or wherever they are

22:30

. So , wherever they are , most organizations

22:33

are completely blind to

22:35

that git push to the

22:38

personal repo , and we

22:41

built technology at Code42 so we could see

22:43

that . And once we did , we started

22:45

seeing some serious source code exfiltration . We

22:47

were literally like , oh my gosh ,

22:49

we didn't even know that this was happening .
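For readers who want a concrete picture of what "seeing" that second push could look like, here is a minimal, illustrative sketch. It is not Code42's product or method: it assumes git is installed, and it only inspects which push remotes a local repo has configured, checked against a made-up corporate allowlist.

# Illustrative sketch only -- not Code42's product. Assumes git is on PATH and
# that "corp-git.example.com" (a made-up name) is the only sanctioned push host.
import subprocess
from urllib.parse import urlparse

SANCTIONED_HOSTS = {"corp-git.example.com"}

def push_hosts(repo_path: str) -> set:
    """Return the set of hostnames this local repo is configured to push to."""
    out = subprocess.run(
        ["git", "-C", repo_path, "remote", "-v"],
        capture_output=True, text=True, check=True,
    ).stdout
    hosts = set()
    for line in out.splitlines():
        if "(push)" not in line:
            continue
        url = line.split()[1]
        if url.startswith("git@"):            # ssh shorthand: git@host:org/repo.git
            hosts.add(url.split("@", 1)[1].split(":", 1)[0])
        else:
            hosts.add(urlparse(url).hostname or url)
    return hosts

def unsanctioned_push_hosts(repo_path: str) -> set:
    """Hosts a developer could push this repo to that are not on the allowlist."""
    return push_hosts(repo_path) - SANCTIONED_HOSTS

if __name__ == "__main__":
    print(unsanctioned_push_hosts("."))

A real insider-risk tool would watch the push events themselves rather than just the configured remotes, but the idea is the same : the corporate repo can be locked down tight while the copy on the laptop can still go anywhere.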

22:52

At the same time , the

22:54

same capability also

22:57

sees when you move large datasets . You

23:00

also use the git commands to move

23:02

large , large datasets . So

23:04

we also started seeing large datasets

23:07

getting moved around , and that's where the AI

23:10

piece comes into play , with

23:12

people moving large datasets to train

23:14

LLMs for

23:16

AI usage and

23:19

you can see that with the same capability

23:21

that we have

23:24

out there today

23:26

. We launched it about two

23:28

years ago , and so the

23:31

timing was just lucky . We weren't anticipating

23:34

that half of it , with

23:36

the AI and the large language models ,

23:38

which nobody knew much about

23:41

a few years ago , but

23:43

again we see this kind of huge

23:46

movement of data . So

23:48

back to the Anthony Levandowski story : yes

23:50

, he took his designs , yes , he

23:52

took his files . I

23:55

mean , he went to prison for this , so he obviously had

23:57

to pay the price . But in his mind he

24:01

probably was like , well , this is fine

24:03

, this is my stuff , like , I

24:05

made this . What he forgets is that a

24:07

whole team of people worked on it , and

24:10

the damage done to the people you leave behind

24:12

as an insider , when you take

24:14

everybody's work , you

24:17

know , the work that everybody did , and

24:19

then you go take it somewhere else , is real

24:21

. And part of what

24:24

we often do as an insider risk company

24:26

is we help educate

24:28

the organization , like , hey

24:31

, we're not allowed to do that . And you'll ask ,

24:33

why not ? You know , well ,

24:35

why is it important ? Is it

24:37

the company being mean , asking for your

24:39

information back ? But people

24:41

often don't sort of understand that the impact

24:44

that they have is on their old peers . That's

24:47

the biggest impact that you know

24:49

occurs on insider risk .

24:52

, yeah

24:54

, it's .

24:57

I feel like it's almost like a

24:59

, it's a miss in the brain or

25:01

something like that . You know , because from

25:03

my perspective you

25:05

know I'm in it right , I

25:08

work a day job in cybersecurity

25:10

there's a lot of Potentially

25:13

you know , proprietary information

25:16

that I have taken down in notes , in

25:18

my , in my one note . But

25:20

there's a lot of Things

25:24

that I'll that I'll learn right for the very

25:26

first time and I'll take note of it and it'll be , you know , not

25:31

proprietary information , right

25:33

, it's a configuration in AWS or whatever might be , and

25:38

I'll take note of it so that I remember it . And

25:40

when I leave the company , if I don't know it by heart , I'll

25:43

just take that little , that little piece of information

25:46

and put it into my personal one note , right , but it's

25:48

not proprietary , it's not

25:50

anything that's

25:53

you know , customer , the company or anything

25:55

like that . And

25:57

for me to do the mental

26:00

gymnastics , right , to kind

26:02

of rationalize taking proprietary

26:05

information from company

26:07

, I mean I don't want to go to prison .

26:13

And it doesn't usually end up in prison . I think part

26:15

of what's going on here is

26:17

that most

26:19

organizations today have

26:22

some form of data protection but

26:24

are Are . You

26:26

know , our ability to exfiltrate data

26:28

from our company is expanded so rapidly

26:31

in the last five or six years that people

26:33

just don't think Daniel is paying attention or

26:35

they don't think that any of that car works . And

26:37

it is probably more the former than the latter

26:39

, because if they really think about any car and they

26:41

feel guilty about it and they and awesome

26:43

, they won't do it , which is good . But most

26:46

companies today just can't see

26:48

. If you upload a document to Dropbox

26:51

from your house , you know you're not on

26:53

the company network , or

26:55

you know , you just Gmail it

26:57

to yourself . These are the most common ways

26:59

that people take data

27:01

and they're free and most people are like , well , no

27:03

, I don't think my company sees that , you

27:05

know I can still print and all kind of stuff , and

27:08

so , you know , our

27:12

philosophy as a company , and we

27:14

coined this term , is to secure

27:16

the collaboration culture , and

27:19

the collaboration culture is how we all live

27:21

today . We use Slack , we use Teams , we

27:24

use Google Drive and OneDrive . We have all

27:26

these tools like Salesforce that are cloud

27:28

based , that let us share information , and our work

27:30

flows are shared and we can collaborate

27:32

on presentations together , and that's awesome

27:35

, like we're so much more

27:37

productive as organizations , right

27:39

. And so the key

27:41

in security is not to get in the way of

27:43

that , but to just bring a little

27:46

bit of security around it so that people don't feel

27:48

like , hey , nobody's watching , nobody cares

27:50

. And one of the ways

27:53

that we do that , where the

27:55

impact really is , is when

27:57

a good employee , a normal employee

27:59

, who's been working , you know , in their

28:01

job , does something that they're not

28:03

supposed to do .

28:06

The system automatically sends them a video

28:08

that says , hey , we noticed that

28:10

you uploaded a file to Dropbox . We

28:13

don't use Dropbox as our sharing platform

28:15

here at X company

28:18

, you know , so please don't . So what

28:20

that does is it puts the employee on notice

28:22

that well , somebody's actually paying

28:24

attention , but it does it in a very nice , polite

28:26

way . And if

28:28

you let people know that you're watching

28:30

the store , most people aren't going to

28:32

steal from you , and most

28:35

people behave the right way .
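As a rough sketch of the kind of automated nudge being described here (a hypothetical example only; the trusted-destination list and the message wording are invented, not Code42's actual rules):

# Hypothetical sketch of an automated course-correction notice.
TRUSTED_DESTINATIONS = {"corp-onedrive", "corp-google-drive", "corp-git"}

def nudge_for(user: str, filename: str, destination: str):
    """Return a friendly notice if a file moved somewhere the company doesn't sanction."""
    if destination in TRUSTED_DESTINATIONS:
        return None  # normal collaboration, say nothing
    return (f"Hi {user}, we noticed {filename} was uploaded to {destination}. "
            f"We don't use {destination} as our sharing platform here -- "
            "please watch this short training video.")

print(nudge_for("joe", "customer-list.xlsx", "dropbox"))

The point is that the default response is a polite , automatic reminder rather than a security incident.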

28:38

And I'll say , just last week I was doing

28:40

a presentation in San Francisco

28:42

and I got there with my laptop

28:44

at the conference and they were like , oh

28:46

, you can't use your laptop for the talk ,

28:48

you have to use ours ,

28:51

and we need your presentation

28:53

. And the only way I could get it

28:55

done was with a thumb drive . So I

28:57

put my thumb drive in ,

28:59

using adapters , of course ,

29:02

and got the

29:04

presentation over to them , and

29:06

you know , it was one minute

29:09

before my kickoff , so it was fast

29:11

. Anyway , I did my presentation

29:14

that afternoon , and

29:16

afterwards , I got an email

29:18

a little while later

29:20

from my security team with the little video attached , and it

29:22

said , you know , you used a

29:25

thumb drive , and here at Code42

29:27

we treat that as a

29:29

risky way to exfiltrate data . Please

29:31

watch this video . I watched the video

29:34

all the way through , you know ,

29:36

I did the attestation and checked that I

29:38

had watched the video and all that kind of stuff . So

29:40

that's the controls you put in place working in

29:42

real time , telling you , like , don't

29:44

do that . And just by doing

29:47

that , now I know , if

29:49

I were to leave the company ,

29:51

if I moved a bunch of data , they would

29:53

notice and they would pay attention . So that

29:55

kind of thing really reinforces

29:58

a way to keep people from actually doing

30:00

something .

30:02

So I mean , that's a great . That's

30:05

a great like reinforcement of

30:08

a , you know , a learned

30:10

behavior right

30:12

. But how do you , how do you stop

30:15

that developer from uploading

30:17

to their own personal you

30:19

know , get repository right ? Because

30:21

I don't , I actually don't know .

30:24

Yes . Well , from a technical standpoint you

30:26

don't want to try to solve that . And

30:28

here's the other thing : if

30:30

you get in the way of a software developer as

30:33

a security person , which

30:35

one of you is losing their job

30:38

? And I'm telling you , this is just

30:40

right . You do not want to get in the way

30:42

of the 300 software developers you've got

30:44

at your company , or 500 or 1000

30:46

or whatever . Number two : don't

30:48

slow down their machines . If you slow down

30:50

your developers' machines , there's going to be hell to pay :

30:53

the CTO is going to go to the CEO and

30:55

complain . And

30:57

also , you don't want to get in the way of them

30:59

using git commands . That's

31:01

how they do their job every day . So

31:04

that's why the approach we

31:06

take is not to

31:08

stop that , but to

31:10

monitor it and to

31:14

course-correct behavior . So ,

31:16

the first time a developer moves

31:18

something to an open source project

31:20

that's unsanctioned , which

31:22

is something we see a

31:24

lot too , which is , hey , at work we're

31:26

contributing to 70 open source projects , but

31:28

it turns out that when we actually start monitoring

31:31

, we see code moving to hundreds

31:33

of open source projects . It's

31:36

a very regular thing that we see at Code42

31:38

when we work with our clients . The

31:42

first thing you do is you send

31:44

that video to say , hey , don't do that . And

31:46

then the second time the developer

31:49

does it , you have a conversation , and

31:52

then if it happens again , then you go

31:54

to their boss and say , hey , your developer

31:56

is doing X , Y and Z .
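That graduated response can be pictured as a tiny policy ladder; the thresholds below are assumptions for illustration, not a vendor's actual escalation rules:

# Illustrative escalation ladder for repeated unsanctioned code pushes.
def respond(prior_incidents: int) -> str:
    """First offense: training video. Second: a conversation. After that: the manager."""
    if prior_incidents == 0:
        return "send the training video"
    if prior_incidents == 1:
        return "schedule a conversation with the developer"
    return "escalate to the developer's manager"

for n in range(3):
    print(n, "->", respond(n))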

31:58

What you don't do is overreact

32:01

and , say , flag that developer

32:03

, shut down their machine , cut

32:05

off all software developers from

32:07

pushing code around with git commands

32:09

. That's exactly the wrong

32:11

response for a security

32:14

team to have .

32:16

And that's one of the big differences

32:18

between insider

32:20

threat and external threat

32:23

, because when you have a piece of malware

32:25

come in , you know the first thing that

32:27

you do is

32:30

it's killed by the sledgehammer . You don't isolate

32:32

that machine , you're going to disconnect it , you're going to

32:34

quarantine it . That is not

32:36

what you do with insider

32:38

threat , because you really wanted

32:41

to investigate and understand what is the person

32:43

doing ? Why are they using Dropbox ? Why did

32:45

Joe use a thumb drive ? You know , and

32:47

it turns out Joe used a thumb drive because he was giving

32:49

a presentation and he had to in order to get

32:51

it done . Even though it's against our policy , he used

32:53

a thumb drive . We understand why he made

32:55

that decision in that situation and

32:58

we didn't come down on Joe with a sledgehammer

33:00

right , because it's

33:02

always a bad actor when

33:05

malware gets in . You know it's

33:07

always a colleague when

33:09

we're talking about insider threat

33:11

. The threat is fast

33:14

moving and propagating

33:16

when it's from the outside . But

33:19

insider risk doesn't spread . Just

33:21

because you , Joe , move files around doesn't

33:23

mean some other Joe is going to move files around

33:25

. That's not how it works . You have time to investigate

33:28

and to understand what's happening

33:30

. Right , when

33:32

it's malware , you know the

33:34

security team can handle it on its own every

33:37

time . But when it's

33:39

an insider , you're going to need HR

33:41

, you might need legal , you're going to need a little

33:43

bit of help to do that . So these are very

33:45

different situations . It's

33:47

never accidental when you get malware

33:50

in , but it's

33:53

sometimes accidental with insiders .

33:56

In fact , about 50%

33:58

of the time an insider risk

34:00

event is just an accident

34:02

. Like you never need to educate

34:05

the hackers , the

34:07

bad guys . Like that's a waste of your

34:10

time and money when you always

34:12

want to educate your own employees , because

34:14

that brings down the number of events

34:16

for you last time

34:19

in it . And then , last thing , I think the interesting

34:21

one is you can't

34:24

. You can get fired

34:26

for not reacting fast

34:28

enough to a malware threat

34:30

in your environment , but with

34:32

insider risk you can get

34:34

fires for reacting too fast

34:37

and so it's like it's a totally

34:39

f*****g-less-it sign . So

34:41

one of the things that we

34:44

always work with our clients on , if you're in security

34:46

, you really need to process that is how

34:48

is this ? Is that the

34:50

folks that are running CrowdStrike

34:53

and look at all the internal , the

34:56

external activities , are not necessarily

34:58

the same people that you want to have running

35:00

your internal , because they can

35:02

be , but it's a . You have

35:05

to turn your brain around and

35:07

understand which problem that

35:09

you're working on external or internal

35:11

because your response is and the way

35:13

you handle it are so very , very different

35:16

.

35:18

Yeah , that's a really good point

35:20

that I was going to bring up

35:22

. It sounds like you almost need two separate

35:25

teams because to

35:27

that normal security team everything

35:30

looks like a nail when you're used to being the

35:32

sledgehammer . I'll

35:35

tell you right now if I saw something

35:37

like insider threat or whatever , I would

35:40

have immediately , without question

35:43

, quarantined the machine , started

35:45

doing a forensics investigation into

35:47

the guy or whoever

35:49

it is . I would immediately

35:52

go the nuclear route

35:54

. Okay , we're going to make sure this person

35:56

can't do anything .

35:58

Unfortunately , that's an exact story from

36:00

my customer base . One of my customers'

36:02

CISOs called me and

36:05

said Joe , I'm really angry with you

36:07

. Your team handed my team a

36:09

gun and didn't teach us how

36:11

to use it . And it was the very

36:13

first incident they had

36:15

was with the senior vice president at

36:17

this organization , and the

36:19

security team did exactly what you described

36:22

: they quarantined the machine , they

36:24

took the machine away , and then they interrogated

36:26

the senior vice president and

36:28

was hugely embarrassing for that executive

36:30

. But it turns out the executive was just

36:32

actually doing their job

36:35

and there

36:37

was a little bit of a misunderstanding

36:39

about why they were supposed to be sending information

36:42

to that particular vendor in this

36:44

case . And the security team said , oh

36:46

, that's an untrusted vendor .

36:48

Had

36:50

they handled it the right way , it would have just

36:52

said oh great , we've learned from that and we'll add that to

36:54

our trusted list or wherever . But they

36:56

quarantined the machine , they isolated it

36:59

and then they interrogated a person and

37:01

so , yeah , that happens . And so

37:03

part of what we've learned is , when we

37:05

go in with our customers , we have to have

37:07

a lot of conversations . And

37:10

the funny thing is , most of our customers

37:12

are , you know , security

37:14

folks who've worked in this for years , and they're all

37:16

like , we know the message you

37:18

want to tell us , we're very familiar with all this stuff . And

37:21

we have to tell them , listen to us

37:23

just a little bit , because the

37:25

methodology is so different . So

37:28

many people in security

37:30

would like to think , like ,

37:33

let's just block , let me just block all

37:35

this , you know , exfiltration

37:38

, and if you just block it , then you know

37:40

we don't have to worry about it . And it

37:42

doesn't really work with insiders

37:45

, right ? The first question is ,

37:47

well , so you're gonna block external Gmail

37:49

? Good luck with that . So

37:51

how are your employees going to

37:54

react ? You know , are they going to carry a

37:56

second machine around with them to

37:58

do it ? How will that go ? Recruiting is

38:00

tough , tough today

38:02

, but okay , well , you

38:05

know , let's see if that plays out

38:07

well in terms of your employees . What you find is

38:09

that they've got a Gmail

38:11

or a ProtonMail and they start , you know

38:13

, going around you . Water always finds a way , and

38:15

so , we tell our customers

38:18

, look , your employees , in

38:20

most of the organizations we work with , your employees

38:22

already have a

38:24

Gmail account . They might have a Dropbox account

38:26

because they do something with their church or their school

38:29

soccer team , or a thousand things like that . The

38:31

important thing is that you

38:33

identify them , that you watch for

38:35

important company

38:38

data moving to those things , and

38:40

then you course-correct them if you see those things

38:42

happening , and you always have

38:44

the information available to you

38:46

if it's necessary to

38:48

crack down or whatever , but usually what

38:50

you just need to do is course-correct and

38:52

let them operate the

38:54

way they need to operate

38:57

and collaborate and work together . You cannot

38:59

get in the way , because if you start putting all these

39:01

rules on people's machines

39:03

, it slows down the machines for everybody

39:05

, and so it tends

39:08

not to be a great way to do things .

39:12

Yeah , it reminds

39:14

me of a time a little bit earlier

39:17

on in my career where , you

39:19

know , my manager was walking me through

39:21

, like our , our data loss prevention

39:23

solution that we had

39:26

, you know , and he was saying how , you

39:28

know nothing , nothing is going to leave

39:30

that we don't know about , or anything like that

39:33

, and that you know it's a perfectly

39:35

locked down and everything like that . You know , and

39:38

my security mind , I immediately

39:40

and I did this subconsciously

39:42

, right , I didn't even realize that I

39:45

did this at the time , I took it as

39:47

a challenge Like you think

39:49

I can't get around this , like

39:51

I'm not the admin on this , I don't own the

39:53

tool , but you think

39:55

that I can't get around this . And then I immediately

39:58

started discovering ways of

40:00

getting around it . And when

40:02

he didn't believe me , I showed him right

40:05

there . I was like , well , I just did

40:07

it . You know , like it's , it's on my phone

40:09

now , it's on my personal cell phone now

40:11

. Like what are we ? What are we doing

40:13

? You know , and it's

40:15

, it's a different

40:17

, it's a different mentality that

40:19

security people have , you know

40:22

, because I feel like when you tell us

40:24

that we can't do something , we

40:27

immediately make sure that we can

40:29

.

40:30

Well , it's funny . Even

40:32

as comprehensive as our product

40:34

is today , the

40:36

thing about traditional DLP is

40:39

there's so much it doesn't

40:41

see . The way

40:43

we talk about it is this : traditional DLP .

40:46

. You write specific rules to cover things

40:48

that you want to cover and so

40:50

as long as you know what all the risks are , you

40:53

know , you'll be fine . Of course , you don't

40:55

know all the different ways that people can exfiltrate

40:57

data and move data around , and so they

41:00

tend not to be very effective

41:02

solutions . In fact , when we

41:04

go , when we sell today , 85%

41:07

of our customers and Greenfield mean

41:09

it may not have anything . Because people

41:12

gave up on DLP 15 years

41:14

ago . They said these systems don't work , they

41:16

keep us from collaborating , et cetera

41:18

. And so the

41:21

space , we call the space insider risk management

41:23

today , and Microsoft

41:26

calls it the same thing . So the

41:28

idea is really like , hey

41:31

, we will block people from

41:33

doing things only in rare circumstances

41:35

; mostly the focus is on monitoring

41:38

and then scoring

41:40

things in the cloud so we can see where

41:42

did the data come from , who had

41:44

it , where did it go ? What do we know

41:47

about that ? Is this the first time that user has

41:49

done this ? We can do all that kind of analysis in the

41:51

cloud and

41:53

do it efficiently , so it doesn't slow people's

41:55

endpoints down , and then we can score and then we

41:57

can tell security : well , you need to pay attention to Joe

41:59

and in particular

42:02

, when Joe is leaving

42:04

the company , this is the biggest

42:06

risk you have . We can look back 90

42:08

days and say , hey , Joe , 40

42:11

days ago , took a lot

42:13

of information .
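To make the contrast with rule-based DLP concrete, here is a toy version of that kind of contextual scoring; every field name, weight, and threshold is invented for illustration and is not Code42's scoring model:

# Toy risk score for a file-movement event; all names and weights are assumptions.
from dataclasses import dataclass

@dataclass
class MoveEvent:
    source: str            # e.g. "corporate-git", "salesforce"
    destination: str       # e.g. "personal-github", "corp-onedrive"
    file_category: str     # e.g. "source_code", "customer_list", "other"
    user_departing: bool   # is this user in their notice period?

UNTRUSTED = {"personal-github", "dropbox", "personal-gmail", "usb"}
SENSITIVITY = {"source_code": 40, "customer_list": 30, "hr_data": 30}

def risk_score(e: MoveEvent) -> int:
    score = 0
    if e.destination in UNTRUSTED:
        score += 30                              # where did it go?
    score += SENSITIVITY.get(e.file_category, 5)  # what kind of data was it?
    if e.user_departing:
        score += 30                              # departing users get the look-back
    return score

# A departing developer pushing source code to a personal repo scores 100.
print(risk_score(MoveEvent("corporate-git", "personal-github", "source_code", True)))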

42:16

And one thing is , in the same way that you

42:18

, you know , are very careful to make sure that

42:20

we get our badges back so they can't physically

42:22

get into buildings , and very careful to get

42:24

the laptop back , so they can't get on our

42:26

networks . But Only

42:29

10% of companies ask people about their

42:31

data . Only 10% ask

42:33

you on the way out hey , did you take the data ? And

42:35

so one of the things that we have

42:37

is a process then say , go back

42:39

and look at every single person who's departing

42:42

the company and just double check that they

42:44

haven't taken all their data , because

42:46

if you think about it out while

42:49

they're in the process of leading her , before they've left

42:52

, it's really easy to get

42:54

it back at that point and

42:56

I know that sounds like well , how is it ? You know the

42:58

security will . You'll never get that

43:00

data back , but if it's someplace , I'm like , look

43:03

, here's how you get it back . You

43:05

basically tell them we need to get back

43:07

, you need to do this , all copies , and

43:09

we need to be confident that you're doing

43:11

that . Because when I'm confident you're doing that

43:13

, you're gonna sign in attestation that you did

43:15

that the total , full bill of

43:17

the process . We're gonna notify your

43:19

next employer that you're in possession

43:21

of our data . If we notify your

43:24

next employer that you're in position of our data , that actually

43:26

puts them on the hook legally for

43:28

it . They're gonna also send a pretty strong message

43:30

to them that , hey , you're hiring a new person and

43:33

new person is never worth the day of your company . Is

43:35

the process of stealing the data from their last

43:37

company . Do you really want to hire that

43:39

person ? So it's what most people

43:41

come to understand very quickly is

43:44

that they're putting their next job at risk if

43:48

they don't return the data and

43:50

up like when an empty lot of dollars you left

43:52

, they end to start it for a year . By

43:55

that time they had taken all that

43:57

data and they were using it . All

44:00

that information was being used across

44:02

Uber in this case and so

44:04

unwinding that was really complicated . If

44:06

they had found it two days before

44:08

he left and figured it out , it would have been a completely

44:11

different situation . He would have had to return it

44:13

, and then he would have taken his

44:15

mind , his knowledge

44:18

and his ideas , and

44:20

worked at the next company and

44:23

maybe recreated that information

44:25

and done some different things . But he

44:27

wouldn't have had his actual designs

44:29

, source code , intellectual property , et

44:31

cetera .

44:35

With the evolution

44:37

of AI and LLMs

44:39

? Are you seeing this as

44:41

a new , I guess , attack

44:44

avenue or attack path for

44:46

insider threat ? Because I'm

44:48

thinking about it from the just

44:50

a simple chat GPT side

44:53

of it . Like most of the time

44:56

the company isn't going to block chat GPT . You

44:58

could just upload whatever you want to chat

45:00

GPT and

45:02

one . You don't really know what they're doing with that

45:05

data on the back end . So that's obviously

45:07

a huge risk , but

45:09

at the same time you could save that

45:11

conversation and then you can go and access

45:14

it from your personal computer . Are

45:16

you seeing newer attack

45:19

paths that you haven't seen before or

45:21

come up ?

45:22

Yeah , and let's talk about sort

45:26

of , you know , the actual threat and how

45:29

to act on it . So first , I'm a realist . Most

45:31

of the people that are moving data out of the company

45:33

, putting it into

45:36

large language models like ChatGPT , they're

45:38

not malicious . And what are they doing ? They're

45:41

just doing their jobs , like they're literally trying to do

45:43

their jobs . They're probably not thinking , I'm

45:45

exfiltrating the data , when they're putting

45:47

it up into ChatGPT so that they can

45:50

steal it ; in most cases it's just like ,

45:52

hey , I'd like to get an answer to this

45:54

or something . So one , that whole video

45:57

education part that we talked about is

45:59

super important , right ? So just

46:01

remember , why are they doing

46:04

what they're doing ? And can we educate

46:07

them and , you

46:09

know , help them not do that ? That'll solve

46:11

most of the problems , because they

46:14

weren't actually trying , in

46:16

most cases , to steal data . They're certainly

46:18

not trying to steal data in ChatGPT . How

46:21

would that benefit them ? I think

46:23

what's newly been interesting is

46:25

the board attention

46:27

. So about seven months

46:30

ago , when the supply chain attacks started

46:32

happening , you know , some

46:37

of the board members

46:39

would ask security , hey

46:41

, how are we on our supply chain ? But

46:44

nothing has

46:46

been quite as loud as

46:48

the boards of directors specifically

46:51

asking security leaders , are

46:53

we covered for AI ? Like , how

46:56

do we not end up in ChatGPT ? It's

46:58

one of those things that's been in the press so much that

47:00

board members , who usually aren't very technical and

47:02

aren't very security savvy ,

47:05

boy , do they worry about this issue

47:07

. You know , the hype cycle is really high

47:09

and so they're freaked out . So security

47:11

people are hearing a lot of , like , are we

47:13

covered for that ? So on

47:16

the one hand , it's one of those times in security

47:18

to be like , hey , use this to get more money

47:21

and some resources , because the board

47:23

really cares about this . But

47:25

the actual threat ,

47:29

again , is relatively

47:32

easily mitigated and

47:34

controlled by controlling ,

47:36

you know ,

47:38

from a network security standpoint , what

47:41

people get access to . You know

47:43

, like , for us , we have our own internal

47:45

ChatGPT , so it's our

47:47

data , and we restrict

47:50

corporate access to the public ChatGPT . As

47:53

I mentioned earlier , we also watch , you know

47:55

, for large file

47:57

uploads as well , and

48:00

then I think that's an

48:02

area that we've

48:04

got to keep working on . One last example

48:07

, sort of to your point : our CTO

48:11

did an experiment

48:13

and he just asked ChatGPT

48:15

to go

48:17

tell us what the product roadmap

48:20

plan is for our

48:22

competitor , and

48:25

ChatGPT got all of their stuff

48:27

, so somebody over there had

48:29

uploaded enough information

48:32

. I guess they were working on their , you know ,

48:34

product plan and how

48:36

they'd been , you know , creating it . And

48:39

we got the inside of the product plan just by asking

48:41

ChatGPT . So , yes

48:43

, the threat is real . It

48:45

is usually accidental , so you've got

48:47

to handle it knowing that it's usually accidental ,

48:50

and boards of directors are paying

48:52

a lot of attention to it right

48:54

now . So it's something you really

48:56

need to pay attention to , even

48:59

though I would say the

49:01

bigger threat to most organizations right now

49:03

is source code exfiltration

49:06

from your own software engineers

49:08

, because you

49:11

know , you think about , like , hey , a hacker

49:13

got in and stole our source code . That's

49:16

bad , because , you know , they can look for vulnerabilities

49:18

in your source code . But that actor

49:20

has a hard time taking that source

49:22

code and making a product . The

49:24

person that wrote that source code

49:26

knows exactly what it does and how it's written

49:28

and what business problem it's solving

49:31

because they wrote it . That's

49:33

the biggest risk to your source code

49:35

today and keeping those

49:37

people in line and in

49:39

check from doing that is

49:41

a much bigger risk today and

49:43

I think you'll start to

49:46

hear a little bit more about this in

49:48

the coming months . There are a few

49:50

big cases that are happening right now . It's

49:53

probably a bigger risk than what's happening

49:55

on the AI side .

49:59

Wow , that product roadmap

50:01

that is wild . You

50:05

know , I mean the person that probably uploaded

50:07

it right was just trying to get chatGPT

50:09

to write it better . You know , I mean I

50:12

do that with everything you

50:14

know .

50:14

I mean I know people that do that with all their emails

50:17

to executives , right , you

50:19

know , it's a very valid use , and again

50:21

, I get it , you're not malicious ,

50:23

you think , right , you're not trying to hurt

50:25

the company when you do it , right ? So

50:28

because of that , it's

50:30

one of those things that can be largely

50:32

solved through education . But

50:35

you have to take it on , you have to address it , and then you

50:37

reinforce it as well . So

50:40

most of us , all of us , go through training

50:42

. You know , at every company

50:44

, every year we have to do these mandatory trainings and none

50:46

of us pay attention to it . We all push 2X

50:49

on the video thing . You know

50:51

that's how we do it . What

50:53

we found is really that the timing matters :

50:55

the moment you do something wrong , within

50:58

a few hours , that's when we

51:00

send the video , at that time , because you

51:02

know , oh , I did do something

51:04

wrong , I did upload something to ChatGPT

51:07

, I shouldn't have done that , and they noticed . And

51:09

so the timing of those things is

51:11

just so important . So I would say that right

51:15

now , insider threat in AI

51:17

is an education problem and

51:19

then a course-correction

51:22

problem , because a lot of people will suddenly be like

51:24

look , I know I'm not supposed to do it

51:26

, but I'm writing this email to my CEO . It

51:28

needs to be really good and so I'm going to put

51:30

it in there and ask for better language and

51:32

stuff . Well , if you know you might get caught

51:34

doing that , then you're less likely to do

51:36

that , but if you don't think anybody's paying attention

51:39

, then you're going to continue to do that . So I

51:41

think it's that . Of course the education is

51:43

important , also the enforcement of that

51:45

education , but that's different

51:47

than just sitting someone down with flash cards . No

51:49

.
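
As a rough illustration of the just-in-time education described above, here is a small sketch that sends a short refresher within hours of a risky upload being detected. The event shape, the training URL, and the send_email helper are hypothetical placeholders, not any particular product's API.

```python
# Hypothetical illustration: a timely education nudge after a risky upload.
import datetime

TRAINING_VIDEO_URL = "https://training.example.com/gen-ai-data-handling"

def send_email(to: str, subject: str, body: str) -> None:
    # Placeholder delivery; a real deployment would use a mail or chat gateway.
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

def nudge_on_genai_upload(user_email: str, file_name: str,
                          event_time: datetime.datetime) -> None:
    """Send a short, friendly lesson shortly after a risky upload is detected."""
    body = (
        f"Earlier today ({event_time:%H:%M}) you uploaded '{file_name}' to a "
        "public generative-AI tool. That's usually an accident, so here's a "
        f"two-minute refresher: {TRAINING_VIDEO_URL}"
    )
    send_email(user_email, "Quick reminder about company data and AI tools", body)

if __name__ == "__main__":
    nudge_on_genai_upload("sam@example.com", "product_roadmap.docx",
                          datetime.datetime.now())
```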

51:51

Yeah , you're playing more into

51:53

the psychology , right

51:56

, it's more of the psychology of the

51:58

person . How do we change that thought process

52:00

? How do we change how they view

52:02

this or whatnot ? Right , which

52:04

is , it's the better approach

52:07

than the sledgehammer that

52:10

I would have used .

52:12

And Joe , you pointed this out . It's so

52:14

different than most cybersecurity

52:16

.

52:17

Yeah , it's completely opposite .

52:19

We should talk to Fancy Bear and tell them

52:21

what they're doing is not ethical . You

52:24

know what I mean . There's nobody

52:26

at CrowdStrike that's sitting

52:28

around and worried about educating the attackers

52:30

. You

52:33

know , one of the things that we

52:35

always get raised eyebrows from security

52:37

people is we say , look in

52:40

security , we don't want anybody to know our methods

52:42

and practices . Right , we don't want anybody

52:44

to know that stuff . But with insider

52:46

risk at companies , we actually say ,

52:49

why not tell people ? A lot

52:51

of our customers tell their people : we're monitoring

52:53

you . We care about your

52:56

job . We're not monitoring your productivity , we're

52:58

just monitoring when you move data to an untrusted

53:01

location . You know like we want you

53:03

to know that we pay attention to that . And at first

53:05

, when you tell security people that , they say no ,

53:07

you don't tell people that , you

53:09

want to catch them . But

53:11

the rest of security is more attack stuff . It's

53:14

like you just don't want them to do it ,

53:17

is what it comes down to . And

53:19

you know , trying to educate

53:21

the hackers of the world and

53:23

you know the

53:26

crime syndicates who are really doing these

53:29

things today , it's a waste of time and

53:31

we see it when they hack hospitals and

53:33

they're literally causing people

53:36

life and death problems

53:38

and they don't care . You

53:41

know , and so clearly you

53:43

know , we're wasting our breath if we try

53:45

to educate those adversaries .

53:48

Yeah , it's a great point . You know , recently

53:51

I'm not sure if it was in the

53:53

news or anything like that , I don't watch the news but

53:56

you know , lurie's Children's Memorial

53:59

was

54:01

attacked . You know , it wasn't like

54:03

a small attack , like they

54:06

took them offline , you

54:08

know , and this is a children's hospital that

54:10

treats only the sickest

54:12

of sick kids , that doesn't

54:15

even charge families if they can't

54:17

afford the treatments , they don't even charge

54:19

them . You know , I

54:21

had a very personal experience with that right , like

54:24

my sister was a patient there

54:26

and there was no way , no

54:28

way my family would ever

54:30

be able to , you know , shell out the

54:32

half a million dollars to

54:35

get her , you know , treated for the

54:37

next three years or whatever it was

54:39

, you know , and they

54:42

didn't send us a bill or anything like that

54:44

. You know , like that was a huge weight

54:46

lifted off of us . And now these attackers , you

54:49

know , they go in and

54:51

they knock them offline . It's like , guys

54:53

, you know , at least do

54:55

an adult hospital , you

54:57

know what I'm saying ? Like a children's hospital

55:00

is just like the most heartless

55:03

thing that you can do , but these attackers

55:05

they don't care , they know .

55:07

They know . And that's , again , very

55:09

different than your insiders . I

55:11

mean insiders do care , they work at your companies

55:13

and , for the most part , I mean

55:16

there's only a few people doing

55:18

, you know , real uh

55:20

, espionage at organizations . Those

55:23

are few and far between . So like , yeah , we're going to help you

55:25

find those too , but like , the most important

55:27

thing is to help you reduce

55:29

the amount of events you see

55:31

that are non-malicious , that are

55:34

accidental , that would just

55:36

require a little bit of , you know , a

55:38

course correction , so you

55:40

can spend your time going after some of the others . You

55:42

know , this goes back to

55:45

my first

55:47

job , in my early

55:49

twenties , when I was the manager of a small

55:51

retail store and I had like seven

55:53

people working for me and

55:56

, um , and somebody was taking

55:58

from the till and

56:01

it came as a complete surprise , you

56:03

know , and we caught them dead to rights ,

56:05

and my owner , the boss

56:07

, said look , Joe , 10%

56:10

of the people , uh , will

56:13

never steal from you , and

56:15

10% of people will always steal from you

56:17

. And 80% of the people

56:20

will steal from you if you give them the

56:22

opportunity . So your job is

56:24

to set the systems and the processes

56:26

and the controls so that 80%

56:28

of them don't steal from you , and

56:31

you go focus and figure out who the 10%

56:33

that will always steal from you is . And

56:36

um , that's sort of how retail thinks about it

56:38

. If

56:41

you ask a retailer ,

56:43

where's most of your leakage from ? And it's not

56:45

from , um , people coming

56:47

in . Well , in some cities it is , but it's

56:49

mostly not people coming in from the outside

56:52

. It's your employees taking it . And

56:55

I think the information services businesses

56:57

and professional services businesses have been

56:59

so slow to realize that . Um

57:01

, uh I tell that story

57:03

often now because what we try

57:06

to really help our clients do is get that

57:08

80% figured out , so

57:10

that they can then focus

57:12

on the 10% that

57:14

are going to take data no matter what , um

57:17

, and catching that is

57:19

really important because they probably have malicious

57:21

intent with what they're going to do with the data .
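
The 10/80/10 rule of thumb also suggests a simple triage split: nudge the occasional, likely-accidental offender and escalate the small group that keeps taking data. The toy sketch below only illustrates that idea; the repeat threshold and the event counts are made-up assumptions.

```python
# Hypothetical illustration of 10/80/10 triage: light touch for most, focus on repeat offenders.
from collections import Counter

NUDGE = "send training nudge"
INVESTIGATE = "open an investigation"

def triage(events_by_user: dict[str, int], repeat_threshold: int = 3) -> dict[str, str]:
    """Map each user to a response based on how often they've tripped a rule."""
    return {
        user: INVESTIGATE if count >= repeat_threshold else NUDGE
        for user, count in events_by_user.items()
    }

if __name__ == "__main__":
    counts = Counter({"casual_mistake": 1, "serial_offender": 5})
    for user, action in triage(counts).items():
        print(f"{user}: {action}")
```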

57:24

Hmm , well

57:26

, you know , Joe , we're , unfortunately

57:29

, we're at the top of the hour here . We're

57:31

at the end of our time . You know , I feel like we

57:33

could keep talking about this forever

57:35

. Um , you know

57:38

, yeah , it was a fantastic conversation

57:40

. I really appreciate you coming on . Um

57:43

, you know , before I let you go , how about you tell my audience

57:45

, you know , where they could reach out to you if they

57:47

wanted to reach out , and where they could find your company

57:49

?

57:50

Yes , so we're Code42 and that's code

57:52

42.com and you can

57:54

find me , Joe Payne , on

57:57

social media , and you

57:59

can get me at Joe Payne , P-A-Y-N-E ,

58:02

at code42.com and happy

58:04

to talk to any of you . And um , you

58:07

know we try to give back to the community

58:09

. So we have a research report coming

58:11

out , um , I think in a few

58:13

days , uh , called the Data Exposure

58:15

Report . We do it every year , so look

58:18

for that . There's lots of information in there about

58:20

insider threat and insider risk .

58:23

Awesome . Well , thanks everyone

58:25

. I hope you enjoyed this episode

58:27

.
