Ethical Tech and Inclusivity in the Age of AI With Jackie McGuire From Cribl

Released Monday, 11th March 2024

Episode Transcript
0:53

How's it going , Jackie ? It's great to get you

0:55

on the podcast finally . I feel

0:57

like it's been a while , but I'm

0:59

really excited for our conversation today .

1:02

Yeah , super happy to be here . It's

1:05

beautiful out in Arizona , so it's

1:07

a nice day . I'm in a good mood .

1:10

Oh , I know . A good friend

1:12

of mine moved to Arizona maybe

1:15

a year and a half ago and now he

1:17

always gives us updates like oh

1:19

, it's a little bit chilly , today it's 60

1:21

degrees . You know , like

1:24

man , you can leave me

1:26

alone , it's negative 20 where

1:28

I'm at .

1:31

I grew up in New Hampshire . For the first

1:33

10 years I lived in New Hampshire I lived in a little town

1:35

called Berlin , which is about 45

1:37

minutes from Canada . So I have definitely

1:40

walked to school in like

1:42

negative 10 , negative 15 degrees a few times a week , so

1:44

I empathize with you .

1:46

Yeah

1:50

, when I was in college actually

1:52

, you know , I worked for the police

1:54

department , the campus police department , and

1:57

it was like negative 40 . And

2:00

I was one of like five

2:02

or 10 people that had to actually

2:04

go in as like essential personnel and

2:07

which was crazy , because they had a

2:09

student be an essential personnel and

2:13

the police chief actually sent a squad

2:15

car to my dorm to pick me

2:17

up because he said that it was too unsafe

2:19

for me to walk a quarter of a mile .

2:22

Yeah , yeah . I believe

2:24

that the thing that finally

2:26

broke me was we had an ice storm in 2008

2:29

in New Hampshire that left an inch and a half of

2:31

ice on everything in New Hampshire

2:33

and I didn't have power for six

2:35

or seven days . I lived in Manchester

2:38

. I didn't live in like nowhere in New Hampshire . I lived in

2:40

the biggest city in New Hampshire

2:42

and I was just like this place is not inhabitable

2:44

. Nobody should live here , and I literally

2:46

just sold all of my stuff and moved to California

2:49

. I've never been there . I didn't know anything

2:51

about it . I found a roommate on a

2:54

forum for motorcycles and I

2:56

ride motorcycles , rode my way out there , and I was just sort

2:58

of done ever living anywhere that does this .

3:01

Wow , that is so

3:03

crazy . An inch and a half of ice

3:06

.

3:08

It was insane . I've never seen anything

3:10

like that before . It was so crazy

3:12

.

3:14

Yeah , I would move too at that

3:16

point . I would move too , my gosh .

3:18

But

3:22

it's beautiful in Arizona today , so I'm

3:24

happy . Yeah , it's

3:27

very tempting to go to Arizona . My

3:31

wife and I we like

3:33

colder weather I

3:36

wouldn't say insanely cold weather

3:38

, but we prefer a good

3:42

variation in the seasons and

3:44

whatnot . My buddy was telling me that Flagstaff

3:46

Arizona is a good

3:49

mix of the seasons , and

3:51

so now I'm

3:53

on the mission of convincing my wife . Hey

3:56

, we should go check out Flagstaff Arizona

3:59

and see how it is . Now I'm

4:02

on that 10-year mission .

4:04

I can see snow from where I'm standing , so

4:06

there's a mountain outside . I live in

4:08

Tucson . There's a mountain just outside Tucson called

4:11

Mount Lemmon . It actually is a ski area . You

4:13

can ski for a couple months out of the

4:15

year , and I can see it from here . So

4:18

it's less than an hour from my house to the

4:20

mountain , so you can get to

4:22

cold weather there if you really

4:25

want to , you just don't have to live in

4:27

it . I hate scraping my windshield . It's

4:31

one of those stupid things that you're like it's such a small

4:33

thing to be mad about , but

4:35

it's just one of those things where it's like

4:37

7:15 in the morning and you're exhausted

4:40

and you only had a cup of coffee and you're trying to get

4:42

to work and you're just like I

4:44

hate this , I hate everything . But then again

4:46

, I've never burned my ass on

4:48

a frozen windshield . In

4:51

Arizona , half the year I literally have

4:53

to keep a towel on my car seat so

4:55

that I don't get second degree burns on my butt

4:58

.

4:58

So there's trade-offs to everything

5:00

. Yeah , yeah

5:03

, absolutely . Well , Jackie

5:05

, I'm really interested in

5:07

hearing about how you got into IT

5:09

, what made you want to go into IT

5:12

overall and then what made you want

5:14

to maybe make a I guess

5:16

maybe a slight switch into

5:19

security and focus more on security

5:21

.

5:22

Yeah , so I had a really weird , like

5:24

meandering career . I've dropped out of college

5:26

three or four times at

5:28

least .

5:29

Oh geez .

5:30

Yeah , so I started with psychology and

5:33

dropped out because I was too poor

5:36

to afford tuition . And I actually became

5:38

a stockbroker when I was 20 . So I

5:40

went through an interview process at Fidelity

5:42

. I worked as a stockbroker

5:44

through the financial crisis , moved

5:47

over to SVB and managed

5:50

cash for companies , and

5:52

so it was like I

5:54

had kids and I was doing the whole like quarter-life

5:56

crisis thing everybody does and like doing

5:59

psychedelics and like questioning my place

6:01

on the planet . And I

6:04

got an economics and finance degree because

6:06

I was working in finance and

6:08

I really wanted to learn how to write Python . So

6:10

I was like , well , what's the easiest way to learn how to write Python

6:12

? I was like , well , I already understand economics and

6:15

statistics , so maybe data science

6:17

would be a good idea . It seems to be an

6:19

up and coming field right now

6:21

. This is like 2017

6:24

. It was like you know , also

6:26

, it would be applying Python to something I already

6:28

understand . So I did a data science boot camp

6:31

and I pushed my final

6:33

project to my GitHub and

6:36

somebody found it and said hey , I am

6:38

the data science person at this SIEM

6:41

startup and we're looking for somebody

6:43

to write algorithms for like anomaly detection

6:46

and user behavior analytics

6:48

, are you interested ? And I'd never

6:50

really thought about working in cyber

6:52

security , which is weird because

6:55

I was always like my mom used

6:57

to drop me off at Radio Shack when she grocery

6:59

shopped , so I've been using computers

7:01

since I was like three , and I

7:03

always liked it , I don't know , it's so strange . And

7:05

like I actually grew up in the military on

7:07

a couple of intelligence bases , so

7:09

I've always been kind of weirdly adjacent

7:11

to security , those types of things

7:14

. That's how I initially got into it

7:16

and then I

7:19

think , personally , one

7:22

of the things I love about security is , well

7:25

, the gender disparity isn't a good thing

7:27

. The bathrooms are always clean for me and there's

7:29

never a line , so that's a plus . No

7:32

, no , I find more

7:34

than anything , more than any industry I've worked

7:36

in , security is very much a meritocracy

7:39

, right , it is a hundred percent a meritocracy

7:41

and it's a really interesting industry

7:43

in that people don't actually need to like

7:45

you , they just need to trust you . And

7:47

so you have all of these crazy like neurodivergent

7:50

, like not necessarily super socially

7:52

adept people . But

7:54

I did like it . It just felt

7:56

like home as soon as I went to my first

7:58

like DEF CON . You know my first DEF

8:01

CON was mind blowing . It was like , oh my God , these are my

8:03

people Like , and it's

8:05

that everybody kind of wants to solve the same problems

8:08

and I've

8:10

always had problems in regular industries

8:12

where people think I'm crazy because

8:15

I don't think the way that neurotypical

8:17

people think , whereas in security people are

8:19

like , oh , your brain works differently than mine

8:21

. Come help me with this problem , because of the

8:24

way your brain works . So it just seems

8:26

to be an industry that's significantly

8:28

more open to like

8:30

everybody brings their own special talents

8:33

. Yeah , and so I ended

8:35

up running a

8:38

random elementary school in my living

8:40

room during the pandemic because I have three kids

8:42

. And then after that I was like

8:44

you know how do I combine cybersecurity

8:46

with my finance background ? And

8:49

I ended up becoming an industry analyst . So I covered , like

8:53

SIEMs , XDR , all

8:53

those kind of like analytical platforms for

8:55

S&P slash

8:57

451 Research , and that was

9:00

how Cribl found me . They pitched

9:02

the company to me and I was like , I love

9:04

this . So I hate regex

9:06

. It's like the bane of my existence , because

9:08

when I became a data scientist nobody told

9:10

me that before you can write an algorithm , you

9:12

have to get this like beast

9:15

of a syslog file into something that you

9:17

can actually use , and there were no good tools

9:20

for doing that at the time . So I literally spent like

9:22

weeks on regex when I was a data

9:24

scientist . So when I saw Cribl I was

9:26

like this is amazing . But at the time they

9:28

were kind of calling it an observability pipeline

9:31

, which is what it was , and it's great for that

9:33

, but I was like , I don't know , I'm sorry , but it really needs

9:35

to be looking at security . The

9:39

thing we see is , like , other people know

9:41

exactly how important their jobs

9:43

are to security , but security people are

9:46

trying to work in observability .
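The weeks-of-regex pain described above — turning raw syslog lines into structured fields before any analysis can happen — looks roughly like this minimal sketch. The log line, pattern, and field names here are invented for illustration; they are not from the episode or from Cribl's product:

```python
import re

# A hypothetical syslog-style line, just to illustrate the kind of
# hand-written parsing described above.
line = ("Mar 11 09:15:02 host1 sshd[4242]: "
        "Failed password for admin from 10.0.0.5 port 51234 ssh2")

# One regex with named groups to pull out timestamp, host,
# process name, pid, and the free-text message.
pattern = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]+) "   # e.g. "Mar 11 09:15:02"
    r"(?P<host>\S+) "                # e.g. "host1"
    r"(?P<proc>\w+)\[(?P<pid>\d+)\]: "  # e.g. "sshd[4242]:"
    r"(?P<msg>.*)$"                  # everything after the colon
)

match = pattern.match(line)
fields = match.groupdict()
print(fields["host"], fields["proc"], fields["pid"])
```

Multiply this by every log format in an environment and the appeal of an off-the-shelf pipeline that does the parsing for you becomes obvious.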

9:49

Isn't that like right ?

9:51

Yeah , so I know that was

9:53

a very long-winded story , but I think it's important

9:55

because I have not met many people in

9:57

security who came here directly , right

10:00

, like , we all have these like weird backgrounds

10:02

and like the diversity of background is

10:04

usually what makes you good in

10:06

security , because when you're dealing

10:08

with , say , like , a financial services client

10:11

, you need to have some background in that to

10:13

really understand the nature of the threats that you're dealing

10:15

with .

10:15

So , yeah , yeah , it's

10:18

a really good point . You know I

10:21

have a lot of people reaching out to me , you

10:23

know , constantly asking me you know how

10:25

do I get into security and that

10:27

sort of thing , right , and they kind of want that

10:29

you know , 12 , 18 month path

10:32

into security . And

10:35

you know I always tell them you

10:37

don't want that path because security

10:39

is so stressful and it requires so

10:41

much context outside of security

10:44

that you just simply

10:46

wouldn't get . You know

10:48

, if you went straight into security , yeah

10:51

, you

10:54

know , like you said , right , you kind of have to know

10:57

how the financial industry works

10:59

, the kinds of systems that they have

11:01

in place , the methods and all

11:03

that to really , you

11:05

know , kind of understand

11:08

like , oh okay , we're going to do security

11:10

this way because we

11:12

have this huge compliance standard with

11:14

NYDFS that just came out , that

11:17

you know they're going

11:19

after companies for it , right ? So , like this is

11:21

a hot priority item

11:24

. You know we need to get through it like this

11:26

rather than this other , you know

11:28

, industry recognized method that we've done

11:30

forever . You know .

11:32

Yeah . Yeah , I think my job is to be

11:34

able to actually understand underwriting . I understand

11:36

the statistics that an actuary uses

11:39

. You know and I can form opinions

11:41

about this is the way the market is going

11:43

. You know right now we're underwriting enterprise value

11:45

. I actually think we're going to go to a model where we

11:47

underwrite the value of the actual data

11:49

that's at risk , and

11:52

part of that is bringing that background . And I

11:54

agree . Like I always tell people , the

11:56

non-security background

11:59

that you bring is what's really important and

12:01

I think we in the security profession need to do

12:03

a lot better job of building those bridges . Because

12:05

I agree with you , I hate the like . Here's your

12:07

18-month blueprint into a

12:10

tier one analyst role . That's going to suck

12:12

your soul out of you , right

12:14

? Because , like , that path

12:16

is not going to get you into a really cool , interesting

12:19

security gig . It's going to get you into a tier

12:21

one SOC , which is not

12:24

the best way to start out in

12:26

security . A lot of times those are kind of

12:28

burnout-assured roles . They're

12:30

really stressful . Yeah , like

12:33

whereas you can be like okay , well , I come

12:35

from a you know a manufacturing

12:39

background , right , but I really understand

12:41

the physical security aspect of manufacturing

12:43

. That's an entire like subset

12:46

of security that is actually desperately

12:48

in need of modernization . So

12:51

go into it that way , you know

12:53

, develop opinions on it . Like ,

12:55

I think a lot of times people are afraid to

12:57

have an opinion , but that's gotten me most of the

12:59

best jobs I've ever had , right Is having

13:01

like a contrary

13:03

opinion on something .

13:06

Yeah , I feel like that's a that's

13:08

a pretty common , you

13:10

know , attribute of security professionals is

13:13

having an opinion on something that isn't

13:15

isn't the norm , isn't the

13:18

expected , you know opinion

13:20

and you know

13:22

I . I kind of go back to like

13:24

when my wife and I we were building

13:27

our house and you know figuring

13:29

out where we wanted the rooms and everything like that

13:31

, right , she

13:33

wanted to have a more open , you

13:35

know , floor plan and I'm thinking to myself

13:37

, well , like that makes things too easy , you

13:39

know , for for a potential attacker

13:41

, right . Like I'm thinking from a physical security perspective

13:44

and I'm thinking like you

13:46

know , oh , I don't want a wall here because I want to put

13:48

a camera there so I can have a wider range of view

13:51

, right , Like like all

13:53

of these things . And you know , when we're finally

13:55

in the house , she's like oh well , I like

13:57

you know , no , like

14:00

no shades on the on the window because it blocks

14:02

the natural sunlight from coming in , and whatnot

14:04

. Like one . We live in Chicago , so we

14:06

get natural sunlight like four months

14:08

of the year . And two , you

14:10

know , we're just opening our windows to attackers

14:13

. And she's like where do we live ? Like we

14:15

live in a place that has literally

14:17

zero crime ? Yeah , who

14:19

are we protecting ourselves from , you know

14:21

? But like , my mind works totally different

14:23

. My mind is like like no , you

14:26

know , worst case scenario , we already expect

14:28

them to be here , you know that sort of thing . And

14:31

she has to like dial me back .

14:33

Yeah , it's interesting to think about , like

14:36

how we think about security because

14:38

, like what you're talking about , like some

14:40

of that is actual

14:43

security versus security theater ,

14:45

and it's really interesting to think about in a post-9/11

14:48

world , right , like a lot of people under

14:50

. It's so crazy . I think people under 25

14:52

have never known a world without security

14:54

theater , I mean you didn't use to have a lot

14:56

of that , and so I think some of that is like

14:58

there are things you do because they actually

15:01

make you more secure , and then there are things

15:03

you do because they present the illusion

15:06

of security , right ? Like

15:08

TSA is security theater

15:10

, because , realistically , if someone

15:12

wants to run straight through those checkpoints

15:14

with something , they can , and

15:17

you know . And so it's interesting

15:19

to think about , because I'm in the same position and , like ,

15:21

I shouldn't say this , but I

15:24

might not lock my doors at night . Yes

15:26

, yeah . Because

15:28

I think , like , I've had my car broken into . So

15:31

, for example , I used to have a convertible , right . I

15:33

had my car burgled three weeks in ,

15:35

and somebody just sliced the top

15:38

open and broke into

15:40

it , and it's like two grand for a new

15:42

convertible top . So you know what I started doing

15:44

. I just didn't lock my doors . I

15:46

just left my car unlocked all the time , and put

15:49

anything really important in the trunk , I figured ,

15:51

you know . And the top never

15:53

got cut again . It's

15:56

interesting to think about , like what things are actually

15:58

secure , what things aren't . And

16:01

I was reading a book the other

16:03

day about , also , like our perception

16:05

of security and how much

16:07

more dangerous the world's become ,

16:09

when actually , statistically , the world's become quite

16:11

a bit safer , you know , and the things that are

16:14

more dangerous than they used to be are things

16:16

like our diets .

16:18

Yeah , yeah .

16:20

So it's like , I don't know , we don't let our kids walk to school

16:23

. Yeah , there's this perception

16:25

that like there's all these people out there

16:27

who are going to snatch your kids , but realistically

16:30

, kidnappings are like down

16:32

by more than 50% since 25 years

16:34

ago . So , it's a

16:36

yeah , but I mean the same thing

16:38

in cybersecurity . Right , it's like we

16:41

all spend billions

16:43

of dollars on all this AI

16:45

and sophisticated detection and

16:47

it's literally just some dude in your mailroom who

16:49

clicks on the wrong link . Or

16:52

your HVAC system

16:54

. You're using the default password for

16:56

the system that operates all of your air , and

16:58

somebody gets in . It's

17:00

like you can spend all the time in the world

17:02

and you're spending all this money on stuff , but

17:04

the end of the day , it's usually the little things

17:06

that are going to

17:08

screw you over .

17:10

Yeah , you know

17:12

, that reminds me of , like the

17:14

Okta breach that

17:17

recently happened , right , where you

17:19

know someone just dialed into support or

17:22

you know whatever it was the help desk and

17:24

they got access via that

17:26

. And you

17:28

know , Okta was known

17:31

for having top-notch

17:33

security , but never really dealt with

17:36

a breach like this before

17:38

you know , or anything like that . And

17:41

I think that they I think they handled

17:43

it fairly well right , because

17:46

I felt like I was getting the information

17:48

, like I felt like I was getting the updates

17:50

as they were getting them . You know , like , oh

17:52

, we just found out 100%

17:54

was breached . Yeah , Sorry

17:57

, yeah , you know , we just learned of it . You

17:59

know , not like whatever

18:02

that breach was at Last

18:04

Pass , right , that , like , yes , really

18:07

frustrated me . Yes , where it's like oh

18:09

, you know , they don't have anything . They got

18:11

in , but they didn't get anything . Oh

18:14

well , I got some stuff . You know

18:16

, some of the stuff is unencrypted somewhere

18:18

. Right , they got some stuff , but you're fine . Oh

18:21

, it turns out they got everything and

18:23

your master password to your , you

18:25

know , to your vault . Like guys , you

18:27

should have told me this six months ago .

18:30

Yep , yeah . How you handle a data breach

18:32

matters a lot . Like , obviously , the kind

18:34

of data that's breached is really important , but

18:36

so is how you handle it . Like I think about

18:38

this going through the 23andMe , you know ,

18:40

breach , and I think that's

18:42

kind of like on the polar opposite end where

18:44

I haven't really heard a whole lot from them and

18:47

I just keep , like you know , to

18:49

your point about LastPass , it's like I haven't

18:51

heard anything from them . Like ,

18:53

the only thing I've seen in the press from them

18:55

is that they still think they can be profitable

18:57

as a company , and I'm like , I don't care if you can

18:59

be profitable , I want to recover my

19:01

DNA .

19:03

Right yeah .

19:06

You know , so I actually wrote , when I was

19:08

an industry analyst a couple of years ago , I wrote

19:11

a paper on

19:13

zero trust , and that , like , with zero

19:15

trust you're still creating a single

19:17

point of failure . And it's

19:19

like these

19:21

push and pull forces

19:24

of convenience and security . Right , everybody

19:27

wants to be super secure , but also

19:29

we can't inconvenience people , because if you inconvenience

19:31

people with security measures , they'll find ways to circumvent

19:34

them . So it's this constant

19:36

battle . And Okta seems

19:39

like a great idea , right Cause it's like , oh , it's

19:41

all encrypted , but again , single

19:44

point of failure . And so you know

19:46

, I see

19:48

these things , and I think about this

19:50

most recent

19:52

kind of takedown of LockBit . If

19:54

you've read through any of the documents about the

19:57

US , they're basically like , hey

19:59

, the US has single points of failure

20:01

all over infrastructure . The

20:03

AT&T outage the other day really kind of

20:05

drove that home for me , you

20:08

know , because it wasn't just AT&T . Because if

20:10

an AT&T downlink satellite goes down in an area , it's

20:13

all cell providers . People

20:15

don't realize that , like , cell

20:17

providers don't each have a tower , it's

20:19

like they kind of share each other's , and

20:21

yeah . So it's

20:24

a really interesting thing to think about and how we

20:26

go about managing that the kind

20:28

of trade off . And to me

20:30

I've always said you know , security is really a culture

20:33

, and so I think what we need to

20:35

focus a lot more on is how

20:37

to just build security practices

20:39

into your culture at your company . Because

20:42

I can talk smack about

20:44

them now because they went away , so they can't

20:48

sue me , but I went from Fidelity

20:50

to SVB , and Fidelity is one of

20:52

the most conservative financial companies that exists . Right

20:54

, they're Boston-based

20:56

, they're super conservative . When you're hired , like your first day there

20:58

, they're like , hey , FYI , compliance

21:01

is your best friend . Like they are here to

21:03

save your ass , they are not here

21:05

to ruin your day and not here to make your job harder . And

21:09

at SVB it was kind of like , compliance ?

21:12

They had these two people running all of compliance and

21:15

when I got there , like , lots of the stuff they were doing , I

21:17

had a supervisory license and my supervisor

21:19

didn't , and so I was like we can't do

21:21

most of this stuff . But it was like check-the-box there . And

21:26

security is the same way . Security and compliance can go hand

21:28

in hand , right , and it has to be a

21:30

culture , because

21:33

if it's not just baked into everything everybody at your company does

21:37

and if they don't unquestioningly

21:39

trust security to have their back and

21:42

to ask stupid questions , to be able to send suspicious emails over before

21:44

clicking , because a lot of times I think people make bad

21:46

decisions because they're scared to ask questions

21:49

. They don't want to look stupid or admit that they don't

21:51

know whether it's safe to click on an email or not , right

21:53

, and so maybe what we should look

21:55

at a lot more is like how do we make security

21:57

more accessible to non-technical people and

21:59

how we just bake it into the corporate culture

22:03

, and most of the time we just need to remake the culture in most places

22:05

. What I focus more on is like , so

22:11

and this is , I have this argument with people a lot :

22:13

if having security policies

22:17

in place prevents you

22:19

from doing your job effectively , that's probably

22:22

a procedure issue , not a policy issue . Right , like

22:24

, if the policy is really prohibitive , change the policy

22:27

, but usually it's the way the policy is being

22:29

implemented that people have a problem with . What I'm

22:31

really focused on is how do we separate

22:35

policy from procedure and acknowledge

22:37

that , yes , some of these things might add some more work

22:39

, but we can really optimize the procedure

22:42

by which we do that , so that the policy

22:44

is not prohibitive to your day-to-day work . If

22:46

that makes sense . And I think , like , we're

22:48

not design architects , us security people , so we

22:50

don't think about these things , but they have to start

22:53

kind of going together the same way you would design

22:55

UI products , because

22:57

it's it's not just technical people that

22:59

get hacked , right , oh , yes , yeah

23:03

, that's , um , that's .

23:04

It's an interesting balance that , well

23:07

, one you probably wouldn't get if you

23:09

went directly into security , right . So it

23:11

kind of circles back to that , yeah

23:13

, but you know that that's . It's

23:15

a balance that actually I'm having

23:17

to deal with right now , right

23:19

, where I'm trying to deploy and enhance security

23:22

controls , and all in line

23:24

with security policies that my architects

23:26

have created , but at

23:28

the same time , I need

23:30

to not create something so

23:32

restrictive or

23:35

enforce something so restrictive within these

23:37

applications that my devs can't do their work

23:39

, and so I

23:41

have to actually work , you know , very

23:43

closely with the business , that

23:46

, with people that are much smarter

23:48

than me , that

23:50

you know , speak other languages , right , you know

23:52

like there's compliance , you

23:56

know , basically everyone . So just

23:58

to make sure that the organization is not just

24:00

secure but that everyone in the org

24:02

can do their job as they expect

24:04

to do it , you know , and that they've been doing it

24:06

that way , and so

24:08

it's a challenging balance

24:11

, for sure .

24:12

Yeah , do you find it also challenging

24:14

to have to tie the stuff you're doing to broader

24:17

corporate initiatives to

24:19

keep yourself relevant ?

24:21

Yeah , that's . You know

24:23

that . That's like , um , I guess

24:25

in my most recent role , you

24:28

know , that's been a more of a focus right

24:30

Of me taking

24:32

more and more ownership . I'm

24:35

basically a manager or director without

24:37

the title , right , like the

24:40

title is engineer . But all

24:42

the stuff I'm doing like my manager even

24:44

say , is like , yeah , all the stuff you're doing is , you

24:46

know , director level role stuff , right

24:48

, like I'm managing my budget , I'm

24:50

, you know , putting out , you know

24:52

company wide notifications and things

24:55

like that , right , all the people I'm communicating

24:57

with . And it's a learning curve for

24:59

sure , that is . I

25:01

mean I just spent like the last four , five

25:04

months trying to figure it out .

25:06

Yeah , yeah , and I think

25:08

that's kind of where everybody

25:11

, I think , who rises in their career

25:13

goes through this process , where all of a sudden you

25:15

realize that like you have to think like

25:17

the CEO , even if you're working

25:19

in security , because if you want to get

25:21

something funded , if you want to get people

25:23

to pay attention to it , if you want you

25:25

know , if you want it to be more than just your pet

25:27

project , like it has to tie into these

25:29

kind of broader company initiatives . And

25:32

so I was actually talking with one

25:34

of my friends about setting up like a CISO training

25:36

thing at one of our corporate

25:39

events that we're doing , and I said you know , I think

25:41

you should do improv comedy , like

25:44

do an improv comedy class , because one of

25:46

the things I find is that people in security are

25:48

really not that great at public speaking , and

25:50

so , like you not only have to be able to

25:52

understand what you're doing as

25:54

a security leader , you have to be able to articulate

25:57

it to vastly different

25:59

audiences . Right , the way you explain what you're doing to

26:01

the CFO is different than the CEO

26:03

, is different than the CTO . And

26:05

then you also have to be able to get up in front of people

26:07

who are going to pepper you with questions and

26:09

you'll have to answer those questions . And

26:11

I like I don't think that's necessarily something

26:14

that people anticipate when they go into security

26:16

that when you get to a certain point

26:18

in your career , all of a sudden it almost

26:21

seems like you have to become a significantly

26:23

more robust professional than you

26:25

did when you were just doing like detection

26:28

and response .

26:29

Yeah , yeah , it's a really

26:31

good point . You know , I

26:34

always talk about , or I try to

26:36

on this podcast , talk about the things

26:38

that kind of separate you , right , from

26:41

other people . And the reason why

26:43

I do that is because those separation

26:46

, those I guess those separation points

26:48

, you know make you stand out more . And

26:51

when you stand out more , hopefully it's in a

26:53

good way . You know you get promoted , you

26:55

get the opportunities that others don't get and

26:59

that , you know , you approach things differently

27:01

. You know , let's do improv ? To

27:03

me , improv would be very scary

27:05

. I think I'm a funny person but I'm

27:07

not improv funny , you know

27:10

. That would be terrifying

27:12

.

27:14

But anyway , you'd just have a bunch of people

27:16

in , like , the fetal position crying on

27:19

stage .

27:19

Right , right , but

27:22

you know , one of the things you

27:24

said — you know , they struggle with

27:26

public speaking , right

27:29

, or speaking to other people that they don't know

27:31

. Right ? And

27:33

this was also an issue for me , you

27:36

know , several years ago , before I started this podcast

27:38

, and somehow I got this idea to start

27:40

a podcast and I couldn't get it out of my head . So

27:43

here I am , right , like over 150

27:45

episodes in . And you

27:48

know you came on here , there was no prep

27:50

. It was like , hey , this person's name

27:52

is Joe , he runs this podcast

27:55

, you know . And then like the same thing for

27:57

me , like this person's name is Jackie

27:59

, she's from Cribl , this

28:01

is what they do , there's no questions

28:04

, you know anything like this . If you go

28:06

back five years ago , the

28:08

thought of this conversation taking place

28:10

would have given me a lot of anxiety , but

28:12

now you know it's

28:15

nothing right , like we're just having a conversation

28:17

.

28:18

No , it's a muscle . And that's like — so improv

28:21

was terrifying for me too , like when

28:23

I first did it . The number one

28:25

rule of improv is yes , and

28:27

which is basically no matter what

28:29

the person before you says , you

28:32

have to agree with it and

28:34

add to it . So , and it's actually

28:36

a really good lesson for how you should live your life , because

28:38

I do some crazy stuff in my life , like I

28:40

love music , festivals , traveling , and I've

28:42

done all kinds of crazy stuff on

28:44

purpose and accidentally , because when somebody's

28:46

like , hey , do you want to do this thing , I'm

28:48

like , yes , and we should also

28:51

do this , which would make it even more epic , right

28:53

, and so that's kind

28:55

of like — so at Cribl , we have —

28:58

my team is very small , but we're the people who

29:00

make content . We call our own

29:02

team 'eff it , we'll do it live'

29:04

, because same thing , like on almost every

29:06

live stream we do , I'm usually finishing

29:08

the slides for whatever thing we're about

29:10

to do as we're starting the introduction

29:13

on the recording . But

29:15

I think to your point , it's a muscle and it's

29:17

a muscle you have to exercise . And

29:19

the other thing , like the thing to figure out , is that

29:21

we all , we all have this like crippling

29:23

fear of failure and

29:26

I guess in my career I've been really fortunate

29:28

that I have screwed up so badly , so

29:31

publicly , a few times that

29:33

I have failed in the most epic ways you can imagine

29:35

. And it turns out that , like , none

29:38

of your family stops loving you , none

29:40

of your friends stop hanging out with you . Nobody

29:44

thinks you're , like , a truly

29:46

worthless person . So you

29:48

know , you fall on your face a couple of times and you're

29:50

like , oh , it's not that bad . You know this

29:52

podcast wasn't the best one I ever recorded , but maybe

29:54

next week's will be better . You know , like

29:56

everybody in your life is , like nobody

29:59

cares , and that's the kind of thing that I

30:01

figured out is like the

30:03

work you do is extremely important , right , having

30:05

this podcast , having a resource of people who really need

30:08

it , is both extremely important and

30:10

extremely unimportant at the same time . So

30:14

once you figure that out , it makes life a lot

30:16

easier .

30:18

Right , yeah , you bring up

30:20

a lot of really good points there

30:22

. You

30:24

know , with having you

30:27

know , I feel like it's so important to have

30:29

I don't want to call it a safe space

30:32

, but you need to have a space where you can fail

30:34

constantly in tech . You

30:37

know , like one of my first jobs out of college

30:39

, I mean I dropped

30:41

a bank's database , like

30:44

I didn't even know the term

30:46

drop right , like I accidentally

30:49

deleted this customer's database and

30:51

they were a bank , and then I spent

30:53

the next you know two

30:55

days , right , fixing it and

30:57

restoring it from logs that

30:59

I didn't even know could be restored from , and

31:02

you know all this sort of stuff . They didn't lose any data

31:04

, they had no downtime , right

31:06

, but I still dropped their production database

31:09

and you know

31:11

in a lot of companies on a lot of teams . That's

31:13

like immediate termination . Like , okay , you don't know

31:15

what the hell you're doing , like , get out of here . You

31:18

know . But I was also very open in the interview

31:20

. Like , hey , I don't know what the hell I'm doing . Yeah

31:22

, like , I need to be taught , you

31:25

know , and that was a huge learning

31:27

moment . But like , and

31:29

that's just one of like

31:31

a hundred , you know situations

31:33

that I was in at that company alone , right

31:35

, and so I developed , you

31:38

know , the greatest troubleshooting

31:40

doc ever at that

31:43

company . It's literally still in use . Where

31:45

you know , when someone encounters a random

31:48

problem , they just go look at my

31:50

doc because I guarantee you I've

31:52

encountered it , and there's a whole section on

31:54

SELinux . And before I encountered

31:57

SELinux there , I never touched it

31:59

, I didn't know it existed , literally

32:02

. One of my customers was a federal agency and he

32:04

said , hey , we need to turn on SELinux , you

32:07

know , on this server . And I was like , okay

32:09

, turn it on . What's the problem ? He goes

32:11

, no , it breaks everything . I was like , whoa

32:13

, that's weird , you know . And then that

32:15

was a rabbit hole for three months of

32:18

knowing way more about SELinux

32:20

than I ever wanted to , you

32:22

know it's , it's

32:25

really interesting . And then you know you bring

32:27

up the the yes and perspective

32:29

from improv and I actually

32:31

do that with like all of my trips . You

32:34

know that I'm planning a trip to

32:36

London for my first time in

32:39

the fall and

32:41

I'm going with a friend . I'm bringing my

32:43

wife and my one year old , and

32:46

you know he

32:49

comes from a different background , I guess , of doing trips

32:51

right , where they kind of plan everything around food

32:53

and you know everything else

32:55

kind of like falls into place . I guess I

32:58

am the complete opposite . I am like

33:00

like no , like we're going

33:02

on this bus tour , we're going to get off

33:04

, we're going to go have drinks here , like

33:06

all of it . You know , because when I

33:08

go somewhere new it's like well , let's

33:11

do everything , like I'm

33:13

not here to sleep , like if they're

33:15

open at 4am , like let's go at 4am

33:17

, Like I do not care . You know , yeah

33:19

. It's the same thing I did with my Germany trip last

33:21

year was , you know , every day

33:23

was another adventure , like one day

33:25

we were in the mountains going through castles

33:28

for the entire day , walked an entire marathon

33:30

. I was dead tired

33:33

at the end of it . And you

33:35

know , the next day was a football game . Right

33:38

, like , went to the football game , did a full day of

33:40

drinking . You know , I got to see

33:42

my buddy not keep up . That was fantastic

33:44

, you know , like the

33:46

whole thing , you know it's . It's

33:49

that ability to just

33:52

want to keep going , you know , want to keep

33:54

exploring and pushing and seeing

33:56

what else is out there . I guess .

33:59

Yeah , yeah , and I've just kind of always

34:01

been like that . I think some of it is that I grew

34:03

up super sheltered , and

34:05

we were poor , and I was homeless

34:08

when I was 19 , like

34:10

, so I never got

34:12

to go anywhere . Like we went on like

34:14

two trips that I can remember as a

34:16

kid . So when I was finally an adult and

34:18

making a lot of good money because I was working in

34:20

finance , it was like I want to do all

34:23

the things , I want to do everything

34:26

, like there's no reason not to try

34:28

anything because I didn't get to do anything

34:30

when I was a kid and I

34:32

always had like I've always had health

34:35

issues , so I've also always had this

34:37

kind of like my clock might be ticking faster

34:39

than other people , so I need to do all the stuff

34:41

you know before you know

34:43

before I run out of time and health to do

34:45

it . So that I just I

34:48

feel like everybody's . So

34:50

one of the interesting things I've also found is that

34:52

, like I used to think that everybody's

34:55

idea of happiness was roughly the same

34:57

, and I think that as I've grown as

34:59

an adult , I figured out that we're

35:01

all wired completely

35:04

differently . Like everybody's brain is wired

35:06

differently , and what brings people happiness and joy

35:08

is completely different from one person

35:10

to another . It's like growing up in New Hampshire

35:12

. I have a lot of friends who live within

35:15

10 miles of where we graduated from high school

35:17

. They're married to somebody that

35:19

they went to high school with , and you

35:21

know , they've literally never been . I have

35:23

a friend who's never been west of Tennessee because

35:26

they live in New Hampshire , like , doesn't have a passport

35:29

but they're happy and

35:31

or happy-ish . And so

35:34

like , if I don't know , I just think that I

35:36

know I'm ADHD , I know I'm autistic

35:38

, so I know that my brain has like a 40%

35:41

higher need for stimulation and activity

35:43

than most people's . Yeah

35:45

, I'm all about , like , maximizing the value

35:47

of every moment I'm awake because

35:49

, like , I think that's just how ADHD

35:52

brains work , right ? We're basically human optimization

35:55

machines .

35:57

Yeah , yeah , you got to . I

35:59

don't know like I'm , I don't know if

36:01

I'm ADHD , but you

36:03

know I find that I have to

36:06

at the minimum . I have to have a goal

36:08

, you know , at all times , right , and

36:10

I need to be making progress towards it

36:12

and I have different ways of kind

36:14

of tracking that progress and whatnot . Right

36:17

, because when I don't have that , I

36:19

start I don't know , I like

36:21

start going off the deep end , right , and I'm like

36:23

no longer focused . It's very easy for

36:25

me to get into that spiral , right

36:27

.

36:28

You don't know if you have ADHD right

36:30

, right .

36:31

You know , I guess I've

36:33

never been tested , or whatever .

36:36

I didn't get tested until like four years

36:38

ago . It was

36:40

crazy too , cause it's like a list of 45

36:43

things that you thought were , like , character flaws

36:45

about yourself , and all of a sudden

36:47

you find out like , oh , it's actually

36:49

not that I'm human garbage , it's

36:51

actually just that my brain is wired differently

36:54

than other people's .

36:57

Right , yeah , it's

37:00

interesting , I feel like , as , as

37:02

time goes on , I just figure

37:04

out how , or like

37:06

, different everyone is

37:09

, you know , and how different everything

37:11

is , you know , and how to appreciate that

37:13

it's . Um , it's

37:16

an interesting thing that I

37:18

kind of recently went down , I guess , but

37:22

you know .

37:23

so it's really important with regard

37:26

to AI , to understand how different people are and

37:29

so to actually talk about something technical here

37:31

, one of the really interesting things I've been thinking and

37:33

researching a lot about

37:35

is diversity as it relates to artificial

37:38

intelligence and as it relates to technology

37:40

in general . Um , and so in

37:42

modern times , people — and I'm going to go off

37:45

on kind of a tangent — people treat DEI as like a

37:47

PR thing or like it's a moral

37:49

issue and like , yes , morally we should

37:52

all hire diversely and consider every candidate . But it's

37:54

actually also just a technology

37:57

usability issue , because

37:59

if you don't want a world that's primarily

38:01

built to serve mostly

38:04

male , mostly white men , then

38:06

you can't have mostly male , mostly white

38:09

men building all of the technology and

38:12

this . That sounds kind of , you know , like a political

38:14

stance , but it's really not in that

38:16

. So if you

38:18

take the politics out of pronouns

38:20

, right — and , you know , we

38:23

want to

38:25

argue over whether there should be more than two pronouns

38:27

. Well , guess what , in the Thai language there's

38:29

like 20 something , because in the Thai

38:32

language your pronoun encompasses what you were born

38:34

as , what you currently identify as , and

38:36

who you like to date . And so when we

38:38

talk about something like generative AI

38:41

. When we're trying to draw

38:43

inferences and being able to understand

38:46

text , if we don't want generative

38:48

AI to only be a utility

38:51

and helpful for English speaking

38:53

people , then it can't just be

38:55

written by English speaking people . And

38:57

so another example is like English

39:00

and Spanish are both Romance languages

39:02

, but the way you say I love something and I want

39:04

something are the same in Spanish . So

39:06

this comes into play . When you're

39:08

talking to a generative model — like , I'm putting

39:10

prompts into gen AI . If

39:13

I say , let's say

39:15

, quiero tacos , how does that

39:18

generative AI know whether I'm

39:20

giving it a piece of factual information and

39:23

saying I love tacos — because

39:25

te quiero means I love you

39:28

, romantically , right ? Or quiero tacos

39:30

meaning I want tacos , and

39:32

that actually makes a big difference too , because one of those

39:35

is input and one of those is requesting

39:37

help , right ? If you say I want

39:39

this , you may be requesting to get that

39:41

thing back . So diversity

39:44

another place this comes into play

39:46

is hardware . So

39:49

hardware is predominantly built for male frames

39:51

. So when you think about something

39:53

like the Apple Vision Pro , causing a

39:55

lot of women and

39:57

smaller framed people massive migraines , it's

39:59

probably because the people who designed , built and

40:02

tested it were all skewed

40:05

towards a specific population . So

40:07

this is one of the things I think is really

40:09

interesting to think about , in that diversity

40:12

in technology is not just important because

40:15

in a utopian world , that's

40:17

how it would be . It's important because it's

40:20

going to determine whether

40:22

or not technology is

40:24

only useful for a

40:26

small group of people , and that's important

40:29

, right ? So , Hugging

40:31

Face — shout out . Hugging Face is

40:33

a company that's super dedicated to

40:35

democratizing machine learning and AI

40:38

and those

40:40

are things that I'm really passionate about , because we

40:44

have a technology that has the ability to fundamentally

40:46

transform the way humans live

40:48

and to provide benefits that a large

40:51

percentage of our population has never

40:53

had before . But we can

40:56

only get there if we build it so that it works for

40:58

all people , right .
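The "quiero tacos" ambiguity discussed above can be made concrete with a tiny sketch. Everything here is illustrative — the phrase table and the `classify_intent` helper are hypothetical stand-ins, not any real model's API — but it shows why a system that collapses two readings of the same phrase into one English gloss loses the distinction between stating a fact and requesting help:

```python
# Hypothetical sketch of the ambiguity discussed above: the same Spanish
# phrase can be a statement of preference or a request for action.
# A pipeline built only around English glosses can lose that distinction.

# Toy "intent table": one surface form, two candidate readings.
READINGS = {
    "quiero tacos": [
        ("statement", "I love tacos"),   # declarative: a fact about me
        ("request",   "I want tacos"),   # imperative: please get me tacos
    ],
}

def classify_intent(phrase: str, context: str) -> tuple:
    """Pick a reading using conversational context — a stand-in for the
    disambiguation a real language model would have to do."""
    candidates = READINGS.get(phrase.lower(), [("unknown", phrase)])
    if context == "ordering":
        for label, gloss in candidates:
            if label == "request":
                return (label, gloss)
    # Default to the first (statement) reading.
    return candidates[0]

print(classify_intent("quiero tacos", "ordering"))    # ('request', 'I want tacos')
print(classify_intent("quiero tacos", "small talk"))  # ('statement', 'I love tacos')
```

The point is not the lookup table itself but that the disambiguating signal (context, speaker intent) has to be represented somewhere; a model trained mostly by and for English speakers may never learn to ask for it.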

41:01

Yeah , it's a really good point , and

41:03

I've had , like , AI

41:05

researchers on before and

41:07

I talked about this where , like

41:11

, how do you ensure , right , that

41:13

the AI has enough diversification

41:16

of its data and how

41:18

it's making

41:20

its choices , and

41:23

if it hurts a certain group of people or just advantages

41:25

a certain group of people , or whatever

41:28

it might be right , like

41:30

, how do you protect against that and how do you

41:32

have , potentially

41:34

, I

41:37

don't know , like a base

41:40

set of language or a base AI

41:42

model , right , that this other

41:44

AI model can check itself against is like

41:46

, oh , did I make the right decision here ? Like

41:49

that's where the people come in , I guess . But it's

41:53

a really it's a fascinating area

41:56

because , as humanity

41:58

has evolved , we've never encountered something

42:00

like this before . It's never been

42:03

. It's never been a thing

42:05

that anyone ever really thought

42:07

about . It's never been a thing where

42:09

we thought about , like , is Google

42:12

serving me the right search

42:15

results , right , based on I

42:17

don't know where I live , or whatever , right , those

42:21

things have never come up before . It's really

42:23

interesting where we take it , because this

42:25

will , like you said , this will really

42:28

have the capability of advancing

42:31

civilization as a whole . This

42:34

can either go really well or it could probably go

42:36

really bad .
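One concrete way to check whether a model "hurts a certain group of people," as discussed above, is to compare its error rate per group. This is a minimal sketch with made-up data and a made-up disparity threshold — real audits use dedicated fairness tooling and far more careful statistics — but it illustrates the kind of check that someone has to actually run:

```python
# Minimal sketch: measure per-group error rates of a model's predictions.
# The data and the 2x disparity threshold are illustrative only.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy predictions: the model is wrong far more often for group "B".
sample = [
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 1),
    ("B", 1, 0), ("B", 0, 1), ("B", 1, 1), ("B", 0, 1),
]
rates = error_rate_by_group(sample)
print(rates)  # {'A': 0.25, 'B': 0.75}

# A simple disparity flag: any group whose error rate is more than
# twice the best group's rate gets surfaced for human review.
best = min(rates.values())
flagged = [g for g, r in rates.items() if best > 0 and r > 2 * best]
print(flagged)  # ['B']
```

The check itself is trivial; the hard part, as the conversation notes, is making sure it is someone's explicit job to run it before the technology ships.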

42:37

Hopefully , it goes really well . Yeah

42:40

, for sure , and that's it's something

42:42

that I think that I

42:45

don't tend to be as much of a doomsdayer as

42:48

a lot of

42:50

AI people are . I

42:53

do see the potential for things to get out of control

42:55

. But also , it's just like a bucket of water , right

42:57

? Like , you can always

43:00

just throw a bucket of water on it . No

43:02

, but seriously , I think it's

43:05

something that we

43:07

need to — there's this phenomenon

43:09

that I've always encountered in tech where everybody assumes

43:11

that somebody else smarter than them is focused

43:13

on this problem , on any problem

43:15

, on any equity problem that

43:17

you bring up in tech . Like I think most people

43:20

always assume there's someone else

43:23

who's gonna deal with that . Like

43:25

because somebody smarter has already thought

43:27

of that . A lot of times , like , seriously

43:30

, nobody's brought it up . Like there are some large

43:32

technologies that have been released that people are

43:34

like , oh , what about this ? And oops

43:37

, like I remember the

43:39

it was one of the Apple Watches

43:41

that was released . Like , the Apple Watch — the

43:44

face of it was too big for like 40%

43:46

of women's wrists .

43:48

Yes .

43:49

And actually I have comically small

43:51

wrists anyway , but even I've seen the issue of ,

43:53

like , this is not what this watch was

43:55

intended to look like on a wrist , right ? I really

43:58

have . But

44:00

yeah , I think everybody was assuming somebody

44:02

else is doing this and as the

44:04

world becomes more complex , that

44:06

phenomenon will probably increase . So I think

44:09

one of the big questions we have to have is how

44:11

do you put in place checks and balances

44:13

to make sure that someone actually

44:15

is thinking about these things , and

44:17

how do you try to control

44:20

like we're also trying to make regulations

44:22

at like a state and even country level

44:24

, and data and technology is

44:26

global . So the other thing is like there's all these

44:29

different NGOs that are trying to do things

44:31

. So we have to come to a place where we're

44:33

coordinating these things a lot more closely

44:35

so that everybody is kind of aware of the

44:38

state of technology , the ideals

44:40

, you know what we're working towards , cause it seems

44:42

like a lot of this stuff , some

44:44

of these decisions about you know , do we make it

44:46

more equitable , or do we make more profits

44:48

, or are being made behind closed

44:51

doors . So I think there needs to be clearer expectations

44:53

so that when one of these paradoxes

44:56

comes up that has the potential

44:58

to impact a large number of people . Those

45:00

decisions aren't just being made by a small

45:02

group of people , and they're being made

45:04

in a public way .

45:07

Yeah , I think you bring up a great point , right

45:09

Is that it's very easy

45:11

for us to kind of assume

45:13

or think that , you

45:15

know , someone has already thought

45:18

of this , they're already working on this , they're

45:20

already doing acts , right , which

45:23

really isn't always

45:26

the case , and it's probably

45:28

happening a lot less than what you would expect

45:30

. And the difference , right

45:32

, someone may even have the same exact idea

45:34

, but the difference is , if you act

45:36

on it — if you actually

45:38

do something with it , and that's , you

45:40

know , that's the important part . That's

45:42

, honestly , that's what separates , you

45:45

know , I would say , you

45:48

know the people that you hear about . That's what

45:50

separates them from everyone else is

45:52

that they get an idea and then they find a way

45:54

to make it work . Like , however , whatever

45:56

that looks like , whatever that takes

45:59

, you know , they just find a way to make it work . And

46:02

I feel like , as technology professionals

46:04

, we kind of like got to get out

46:06

of our own heads , you know , with

46:08

that , because we're so analytical by

46:11

default , right , that you

46:13

know we'll overthink something for

46:15

years before actually moving , when it's

46:17

like , hey , you should have done this , like 10 years ago

46:19

.

46:20

Yeah , yeah , I mean like , how many times

46:22

does the technology come out ? You're like , oh man

46:24

, I had the idea for that like 10 years ago

46:26

and it's like , yeah , but you didn't do anything about it . It's a issue

46:29

, yeah , great , we

46:32

at Adobe always assume and that's like

46:34

people so many people underestimate their

46:36

power , and this is the thing when I so

46:38

when I came into security . Here's

46:40

the thing we're all talking about how

46:43

great it is to get into security from other

46:45

industries . But we should acknowledge

46:47

that when you do get into security

46:49

, if you're new in the industry , it's really

46:51

easy to feel like an imposter or feel like an

46:53

outsider or feel like you're faking it because

46:55

you're moving somewhere . But that goes

46:57

back to what I was talking about — in security

47:00

, I found that people don't really necessarily need

47:02

to like you . They just need to trust

47:04

you , and I've earned

47:06

a lot more respect from my peers by

47:09

being really clear on where my

47:11

skills end than by

47:13

demonstrating those skills themselves

47:15

, because I come into a room full of security

47:17

engineers and these people can hack your router

47:20

in four minutes with a

47:22

Flipper Zero . Like

47:24

. I'm not that . I'm a data scientist

47:27

and I only worked as a data scientist

47:29

applying it to security for a little over a year

47:31

, so but I know

47:33

that what I lack in

47:35

actual technical ability to pop

47:38

your Tesla's gas door , like

47:40

I made up for in my ability

47:42

to communicate , so , like

47:44

my superpower , is communication and translation . I

47:47

can take . I can sit down with your security engineers and

47:50

they can dump on me all the technical stuff that they're

47:52

doing , and I can take that and make it into a story you

47:55

can tell your customers . I

47:57

can make it into a story you can tell your marketing team , your sales

47:59

team and so

48:01

like . There's a lot of different skills required

48:03

around security

48:06

to make a security program

48:08

successful . Like communication , like marketing

48:11

, like training — because there's

48:13

a big difference between knowing how to do something

48:15

and knowing how to teach somebody else how to do

48:17

it . So I was a facilitator for a long time another

48:19

communication job , right . So

48:22

it's important to acknowledge that you

48:24

may feel like a fish out of water

48:26

if you get into security or you join

48:28

a new industry , but you need to understand

48:31

that , your ability to know

48:33

your limits and to say

48:35

, hey , I've actually never had experience

48:37

with that , but it's something I'd like to learn more about

48:40

. Where could I read about that ? Like people will

48:42

respect you 10 times more for doing that

48:44

than for immediately making up an

48:46

answer because you feel like you should have one .

48:49

Yeah , I've always found there

48:51

to be a lot of value when you're more

48:54

honest , more open , more upfront

48:57

about your own limitations . You

48:59

know , because people

49:01

will keep , I guess

49:03

, kind of grilling you and grilling

49:05

you , especially in security , in the security world

49:07

. You know , as soon as you

49:09

say like oh yeah , I've done this for

49:11

10 years , or I'm an expert in this

49:14

, I built this , I mean

49:16

in security , it's like okay , well

49:18

, guess what ? I understand what that is

49:20

and let's talk about it . You know , like

49:22

we're gonna talk about it at a level that like if you

49:24

didn't build it yourself , you're not gonna

49:26

know . You know , and

49:29

I've been on both sides

49:31

of that interviews , right when

49:34

I've said you know I'm an expert

49:36

in something and they just completely grill

49:38

me on it . And you know , thankfully , like I've

49:40

gotten past it , because you know

49:42

what I put on my resume is the stuff

49:44

that , like I have done . You know , I'm not

49:46

like bluffing it . I may use words

49:49

that I may like rarely use , you know

49:51

, because , like you know , you don't wanna use

49:53

the same like verb or adverb or whatever it is

49:55

. You know , agitate to describe

49:58

something right , but like when I

50:00

say like hey , I built this thing , it's like . No , I

50:02

actually built it , like you know , because

50:05

I really don't wanna be in a situation where

50:07

someone presses me on a point and I can't answer

50:09

it at length , you know .

50:11

Yeah , oh , we've all been through the experience

50:13

of seeing , like an ex-co-worker's LinkedIn

50:15

and seeing all the shit that we did that they're taking

50:17

credit for and you're like , oh really , you made that

50:19

happen , huh .

50:21

Yeah .

50:23

Yeah .

50:24

I know . I don't remember you on that project

50:26

.

50:27

Yeah , yeah , and

50:30

do they call that the George Santos effect

50:32

now ? Yeah , I

50:34

mean , I think that's true Like there's a lot of

50:36

power in saying I don't know , like

50:39

there's a lot of risk in saying it if you're the one who's

50:41

supposed to know . But in a lot of circumstances

50:43

you're not the person who's supposed to know and

50:46

I'm , you know , like people aren't looking to

50:48

you to have all the answers , they're looking to you

50:50

to know where to go to find it . Yeah

50:53

, and that's kind of what your

50:55

utility as a security professional usually

50:57

is . It's like I don't know everything

50:59

, but I have a process that I can go through

51:01

or I can quickly get to the information

51:04

that I need , process it and get it back

51:06

in the form we can use .

51:10

Yeah , that's a key distinction there , you

51:12

know , being able to say I don't know and

51:14

then also following up with what I can find

51:17

out . Yeah , In

51:19

today's , you know , modern age

51:21

, right , 2024 , like , you

51:23

can absolutely find something out

51:25

. If you don't know it , you know , by a

51:27

simple Google search you don't have to go to the library

51:30

anymore and hopefully they have a book on

51:32

it , right ? So

51:34

, like , there's no reason why you can't , you

51:36

know , say that and actually follow up with it

51:38

with the real information . Well

51:41

, you know , Jackie , we've gone

51:44

this entire time and

51:46

definitely doesn't feel like 50 minutes , that's

51:48

for sure . But you know , I'm

51:50

I try to be very conscious of everyone's time

51:53

and you know I don't want to go over because I

51:55

know that we're all booked . You know , meeting after

51:57

meeting . So you know , before I let

51:59

you go , how about you tell my audience , you know where

52:01

they could find you , where they could find Cribble

52:03

, if they want to learn more and maybe they want

52:05

to reach out .

52:07

Yeah , absolutely . I've been on LinkedIn

52:09

, so LinkedIn slash Jackie's in security

52:11

. You can interpret

52:13

that whichever way you want to . Yeah , and

52:18

Cribl . You know , we're at Cribl.io , or

52:21

you can also follow Cribl on LinkedIn . We

52:23

have a fantastic social media manager who

52:25

makes pretty high-quality memes . You

52:29

know we didn't talk a ton about what Cribl

52:31

does , which is my preference

52:33

, because I think that it's a product

52:35

that is much better for people to see than to hear about . But

52:38

if you have questions about moving

52:40

data , making use of data , any of those

52:42

things , you're more than welcome to reach out to

52:45

myself or anybody else on the Cribble team .

52:48

Awesome . Well , thanks everyone . I

52:50

hope you enjoyed this episode .
