Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
So I would recommend you know anybody like that
0:02
to try to get on social
0:04
media . Try to find as many people
0:06
like that , as diverse a group of people as you
0:08
can find , to get exposure
0:11
, because that's like real time . You know
0:13
as things change to pop
0:15
up , you know a book is going to get stale
0:17
and it gets more stale as it sits there , whereas
0:20
you know a social media thing or even
0:22
a security blog or something like that is going
0:24
to be a little bit more updated and a
0:26
little bit more fresh than you know , and
0:29
it's going to bring some of those like the
0:31
MGM hack . Right , that was totally
0:33
a phishing thing . You wouldn't imagine a company
0:35
that big , you know , falling as hard as
0:37
they did because you know
0:39
somebody phished them . But
0:42
it happens and , like I said , it's a
0:44
matter of time . Right , it's not if , but when
0:46
.
0:46
Hello and welcome to Developer's
0:49
Journey , the podcast bringing you the making
0:51
of stories of successful software
0:53
developers to help you on your
0:55
upcoming journey . I'm your host , Tim
0:57
Bourguignon . On this episode
0:59
, I receive Colleen Shane . Colleen
1:02
is a 49-year-old trans woman living
1:04
happily in San Diego near
1:07
her two adult children and their partners
1:09
. After many twists and
1:11
turns , it's in her mid 30s
1:13
that she found her way to information
1:15
security and networking . Since
1:18
then , Colleen has worked for various institutions
1:20
like the US federal government , healthcare
1:23
and education in IT and
1:25
information security . Colleen
1:27
, a warm welcome to have you . Good morning , it's
1:30
good to be here . Thank you very much . Oh , it's my pleasure , or
1:32
our pleasure , but
1:36
before we come to your story , I
1:38
want to thank the terrific listeners who support the show
1:40
. Every month , you
1:42
are keeping the Dev Journey lights on . If
1:45
you would like to join this fine
1:47
crew and help me spend more time
1:50
on finding phenomenal guests
1:52
than editing audio tracks , please go to
1:54
our website , devjourney.info
1:57
and click on the support me
1:59
on Patreon button . Even the smallest
2:02
contributions are giant steps toward
2:04
a sustainable Dev journey journey
2:07
. Thank you , and now back
2:10
to today's guest . So , Colleen
2:12
, as you know , the show exists to
2:14
help the listeners understand what your story looked
2:16
like and imagine how to shape
2:18
their own future . So , as
2:20
is usual on the show , let's go back to your beginnings
2:22
. Where would you place the start
2:25
of your tech journey ?
2:27
My tech journey definitely
2:29
started when I was little , back in the 80s . In
2:33
IT I guess I would say that I started with
2:35
the Texas Instruments TI-99
2:37
. My mother bought
2:39
one for us . I think it was
2:41
used because we got it a few years
2:43
after it came out . But
2:46
I did a little bit of programming on that
2:48
. That was probably the beginning of my programming . Badly
2:51
, but I just copied what was in the book and typed
2:53
it out on it . And then there was
2:55
actually a cassette recorder , a magnetic tape
2:57
recorder that recorded all that
3:01
code on . And
3:03
that first program I did was Mr Bojangles
3:06
and it was to make a little guy dance
3:08
on the screen like an 8-bit . I
3:10
was pretty proud of myself for that . But after
3:14
that I always was
3:17
into tech . I grew up pretty
3:19
much tearing things apart . That was always my thing
3:21
. I always had to understand how things
3:23
worked . So if
3:26
I could take it apart I took it apart and
3:28
a lot of times that upset my parents
3:30
taking things apart
3:33
. It took me probably until I was about 12 to
3:35
figure out how to put things back together again and
3:38
then after that it was on . So I
3:42
had children pretty young , so I
3:44
went to air conditioning and refrigeration school
3:46
. I did that for many years
3:48
to
3:51
raise the family , because it was something that I could do
3:53
at night , and
3:56
I eventually got up to service manager
3:59
, operations manager . I was a maintenance
4:01
supervisor for a while but
4:04
I kept hitting what's called the paper ceiling , where
4:06
I got high enough but I couldn't
4:08
go any higher because they said I didn't have a
4:10
college degree and nobody
4:12
seemed to care what my college degree was in . They
4:14
just wanted me to have one . So
4:17
my ex-wife at the time graduated
4:19
with her master's degree . We both were
4:21
together very young , so she also
4:24
didn't graduate high school . She went back to school
4:26
herself and once she graduated
4:28
with her master's degree I decided I'm going back to
4:30
school . And I decided
4:32
I'm going back to school for computers , because I always loved
4:34
computers and I've been playing with them forever . So
4:37
I went and got my associate's
4:39
degree in networking and part
4:41
of that networking
4:44
excited me . But part of my degree was security
4:47
classes and I thought that was really great
4:49
that they were including that in the
4:51
degree there to expose you to security
4:54
from the beginning . But the
4:56
security really spoke to me like
4:59
the risk aversion , mitigating
5:01
risk . I just
5:03
fell in love with it . So I switched over
5:05
to security and then just kept networking
5:08
as a minor . Eventually
5:11
got my associate's degree and this is all in
5:13
my mid 30s , so my kids
5:15
are in high school themselves and
5:17
my goal was to graduate
5:19
before my son did , before
5:22
he was ready to go into college . I
5:28
finished my associate's degree and
5:30
I was really enjoying that and I thought
5:32
why stop here ? So I
5:34
enrolled and did my degree
5:36
online for my bachelor's degree in
5:39
information security and
5:41
kept networking as my minor in that as well
5:44
. And then while
5:46
I was doing that , I worked at the VA hospital
5:49
in Indianapolis in tech support . So
5:51
it was kind of like an internship , what they call it
5:53
a student trainee . I
5:55
was a 579 student trainee , so
5:57
I started out as a GS5 and then worked my
5:59
way up to a nine as
6:02
I got those degrees and I learned
6:04
so much at the VA hospital
6:06
. It was a Windows environment . I
6:08
learned a lot about Active Directory
6:10
and everything managing
6:12
printers and networking
6:14
. So much about security
6:17
. They required everybody
6:19
on staff to have a security plus
6:21
. It was
6:23
just a local directive that they had there that
6:26
was part of it . If I may ,
6:27
VA is for Veterans Affairs , isn't it
6:29
Correct . Okay , so that was part
6:31
of the military , probably , or
6:34
?
6:34
Governed by the Department of Veterans
6:36
Affairs , yep , and then the VA hospital was the
6:38
sub department of that , because they also have the Veterans
6:40
Benefits Administration . So
6:42
, yeah , I worked actually at the hospital there in Indianapolis
6:44
. And they're considered what's
6:47
like a 1A hospital . So you know they
6:49
had the helipad there , it was like trauma
6:51
center and you know it was like
6:53
a regional hospital . So people came from all
6:55
around to Indianapolis to have
6:57
procedures done and be seen there .
6:59
Okay , yeah , and we imagine it's quite
7:02
security-affine
7:04
if it's a part
7:07
of the military and 1A
7:09
etc . Absolutely .
7:11
Yeah , not only from the healthcare but because
7:13
they were veterans . Veterans could still
7:16
be called back to active duty . So you know , our
7:18
computer system was linked with the DOD . So
7:21
our specifications , our requirements
7:24
were that strict and
7:27
it was almost to the level of paranoid because
7:29
, like you know , we would receive a skid
7:31
or a pallet of printers
7:34
and you know , printers all come with a
7:36
CD there with the
7:38
drivers on it . The
7:40
directive there was that no media
7:43
left that hospital unshredded . So
7:45
the disks that came brand
7:47
new with those , as soon as we took them out of the box
7:49
, they had to go straight in the shred bin
7:51
and that was just
7:53
to basically ensure that no media
7:56
left that hospital intact . And
7:59
we , you know , shredded everything , hard drives
8:01
, you know the little , whether
8:05
it was microfiche films , you
8:07
name it , anything that was a media
8:09
that had patient data on it or not . We
8:12
destroyed it . And then
8:14
of course we maintained the chain of custody and all
8:16
that stuff . So for me
8:18
, I love that and I love that that
8:21
level of paranoia , that tinfoil , what
8:23
I call the tinfoil hats , was great for
8:25
me . It taught me
8:27
a lot about security that
8:29
I pretty much took . I've held throughout
8:32
my life since then and
8:34
I try to bring that to all my other orgs and
8:37
you know I've gone to work at like
8:40
universities and universities are like some
8:42
of the most open places that you could work . They
8:44
don't really care about I mean , they do care
8:46
about security , but it's not their priority . You
8:49
know access is their priority . They want convenience
8:51
, they want their students and everybody to be able to access
8:53
everything from everywhere , which was completely
8:55
counter to me , you know , having to
8:57
lock it down and make sure that you
9:00
were actually authorized to get that data or
9:02
have that connection , you know
9:05
. So you know I worked for them , developed
9:07
my security posture . And then you know I worked
9:09
for a lot of other healthcare organizations , like
9:11
Radiology . Of course that's
9:13
HIPAA requirements
9:17
there , so it's not quite as strict as
9:19
what we had going on at the VA hospital , but
9:21
it was a little bit better
9:23
than the universities . And then
9:25
a lot of the private works
9:28
that I've gone to . I take
9:30
that level of security to them and
9:32
sometimes we
9:34
get into a little bit of a battle because you know people
9:36
want convenience , they want to be able to do their
9:39
things without much hassle , and I'm
9:42
not sure if you're familiar with the concept
9:44
that convenience and security are pretty much opposite
9:46
one another right . The more convenient it is
9:48
, the less secure it is , and the more secure it
9:50
is , the less convenient it is . So
9:53
I try to find that balance
9:55
, but I always try to err on the side of security
9:57
.
9:58
So this must
10:00
make for very interesting discussions
10:03
.
10:04
So many times and you know sometimes it gets
10:06
we get into heated discussions because , you
10:08
know , some people just they don't care about the security , they
10:10
just want it to work , they want to be able to connect to their printer
10:12
, or they want to be able to scan , or they want , you
10:15
know , to be able to just connect their laptop and go right , or
10:17
they want to be able to bring their Mac , Mac
10:19
laptop from home and connect it and
10:21
just work on that . So
10:23
trying to , you know , educate people
10:25
on security , or why those practices
10:27
aren't , you know , acceptable
10:30
, it sometimes
10:32
causes me a little bit of static , but you know , that's
10:34
part of my job and I
10:36
always kind of consider myself the bad cop in
10:39
security , because I'm the one
10:41
that wants to lock everything down . I want everything
10:44
secure as I can make it , and
10:47
then it's up to others on my team to try to
10:49
find that balance and , you know , find
10:51
the convenience but also the security .
10:55
You always have to have different poles and different roles
10:57
and I guess in a team you always
10:59
need someone to really be
11:01
the bad cop , as you said , and really push things forward
11:04
and even further , so that somebody at some
11:06
point says okay , that's enough . But
11:08
if you don't have this person , then you might not reach
11:10
this , that's enough , you might stop
11:13
.
11:13
Well , and that's my thing is like it takes a diverse
11:16
perspective of people and the experiences
11:18
to build the security
11:20
posture or security team , and
11:23
you know I try to be as far towards
11:25
security . On the secure side , opposite
11:27
convenience right , you know I like a little
11:29
bit of convenience myself , but you know I would rather
11:31
have something secure than convenient . We
11:34
got to take extra steps for that , you know , like multi
11:36
factor . You got to break out an app and put in the
11:38
code that's in there . I mean that's another step , it's
11:40
less convenient but it's more secure .
11:43
Is there such a thing as searching for
11:46
where you feel convenient
11:48
, and then this having
11:50
this as a hint for probably a security
11:53
weakness ? Not
11:55
sure that's the right word but everywhere
11:57
you feel confident and convenient , then
11:59
that might be a hint that there's something here , right
12:01
?
12:02
Yeah , yeah . The more convenient it feels to
12:04
you , the easier it is to you , the less secure
12:06
it is , unfortunately . That
12:10
must be a fun way to look at the world . Like
12:14
I say , I always tell everybody I have a tinfoil hat
12:16
right . It's like I'm so paranoid and
12:18
like I think I was always like that before
12:20
. But you know , when I spent five years at the VA
12:22
, I just got that much more
12:24
paranoid . You
12:28
name it , right ? Like port security
12:30
or , you know , wireless security or any of
12:32
that stuff . It was just like it was almost to
12:34
the level of paranoia there and I just really
12:37
dug that . But unfortunately
12:39
at the VA people don't
12:41
, you know , they don't move on from there
12:43
, they retire from there . So it's
12:45
like it was really hard for me to move up from that
12:47
five , seven , nine into , like you know , a GS
12:50
11 position or something that paid like
12:52
what was paying out in the private sector . So
12:54
I kind of wish
12:56
I had stayed because you know I would still
12:58
be with all those great benefits and you know
13:00
, still be working in that paranoid environment
13:02
. But I moved on for the money
13:05
because I figured , hey , I spent all this time
13:07
getting my degree , I deserve
13:09
that kind of money . So you
13:11
know I love chasing the money .
13:12
Unfortunately . Did
13:15
you ? That's okay . I mean
13:17
, at some point you have to pay your bills , did
13:21
you find ? Did you find companies in the private
13:23
sector that are close
13:26
to that level of paranoia ?
13:29
Only the ones that worked for the government .
13:31
So if I worked for a company that had
13:33
government contracts , they
13:36
required us to be secure
13:38
to that level , so that I really did dig
13:40
that , because then I actually got the pay
13:42
that I kind of felt like I deserved and also
13:45
still had to , you know , make sure that their
13:47
networks and the systems were as secure
13:49
as what the government would consider . Okay
13:52
, yeah . And then of course , I had like
13:54
top secret clearance and all that . So that's
13:56
something that the companies really enjoyed
13:58
. You know , they were like they didn't have to pay for me to get the
14:00
clearance . I
14:03
was allowed to work on those networks .
14:05
So yeah , it makes sense . Makes
14:08
sense the way I picture big
14:10
FAANG companies etc . would
14:12
be this way or this level
14:14
of paranoia . But the way you put it is
14:16
not , not , not really .
14:19
Yeah , a lot of them . It's , yeah
14:21
, I mean , if you look at some of the hacks and stuff that
14:23
happened , I
14:26
mean it's , people are going to fall for phishing , that's a
14:28
given . Eventually , somebody's going to fall for
14:30
something . Something's going to seem credible enough and you're going
14:32
to click on a link . I mean I don't
14:35
know if you remember how paranoid I was to click on your
14:37
link when you sent it . Yes , you were . Right .
14:48
I waited for a few months and the last
14:50
iPhone came out , and so I am an Apple guy , so I ordered
14:53
one and a legitimate
14:55
delay entered the system and
14:57
so , until I received it , there
15:00
were multiple SMSs from DHL
15:03
, the postal office here in Germany , telling
15:05
me , hey , and they were legitimate , and
15:07
at some point there was one phishing
15:09
attempt in between . Oh no
15:11
, and
15:14
thanks for me went
15:16
, password , blocked
15:18
it and realized that there was something , but
15:20
it was so legitimate
15:23
in between , two other notifications
15:25
that were exactly linked to this that
15:27
I've been looking at my hardware
15:29
since and rebooting
15:32
stuff and crunching stuff , saying
15:34
this cannot be a coincidence , but
15:37
I think it is . But still , yeah , you will
15:39
fall for it . I did a couple days ago
15:41
.
15:42
Right , that's the thing is like the
15:44
people who are phishing , they get
15:46
craftier and craftier and you
15:49
know at work we're
15:51
shopping around those companies that
15:53
delete your data online , that
15:55
do all the searching to find all
15:57
these data brokers with all your information
16:00
. And you know , because I have such
16:02
a big online presence , social
16:04
media presence , I volunteered . I said , you
16:06
know , sign me up for this . And then you
16:09
know , let them see what they , what they
16:11
can find online for me and what they can delete
16:13
. And they
16:15
gave a report back and we're going
16:17
to use my report that they gave to give
16:19
to the C-suite , the executives
16:22
at our company , for them to decide do they want to go
16:24
with this company or not . And
16:26
I was amazed at the brokers
16:28
, how many of them had had
16:31
my information and the level of
16:33
information that each one of them had . Hundreds
16:37
, hundreds of brokers . It was ridiculous
16:40
. I had no idea that there were that many . And
16:42
you know they were telling us that it was like a $3
16:44
billion a year or something you know multi
16:46
billion dollar . I
16:51
guess you know what's the word I'm looking for . Yeah
16:55
, these brokers make billions of dollars
16:57
a year . You know buying and selling our data
17:00
and transferring around and you know
17:02
, like the videos that you've watched
17:04
on YouTube or you know the stuff that you've
17:06
bought on Amazon all of that
17:08
stuff's up for sale , right . They scrape that and
17:10
then they use that to build profiles on you and then
17:12
somebody that wants to phish
17:14
you they're going to go try to find all that
17:16
data , right , and then that helps
17:18
them build even more credible things to
17:21
help phish you , things that you believe . You're like
17:23
. Well , only these companies could
17:25
know that much about me , when really
17:27
that's not it . These data brokers knew all that much
17:29
about you .
17:31
I'm putting my team for that as well
17:33
. Right now
17:36
. It is scary , and how
17:38
do you live with it , knowing
17:41
the level of insecurity
17:44
that we deal with every day and
17:46
having seen the other world .
17:49
Yeah , it drives me nuts , but there's only
17:51
so much I can do , right . There's only so many people
17:53
I can educate . There's only
17:55
so many paranoid screams
17:57
I can cry out into the abyss here for people
18:00
to listen , but , you know , eventually
18:02
I get to the point where people are like you know , she's
18:04
just a mad woman on a soapbox
18:06
there screaming about the sky is falling
18:08
or whatever , right ? So I
18:11
try to find balance in that as well . I want
18:13
to educate people . I want them to understand
18:16
that you know this is out there and this is
18:18
a harm to them or it's a risk to them
18:20
, versus just kind
18:22
of blindly going through the world with your iPhone
18:25
or whatever and hoping that whatever link you click on
18:27
was legitimate . Yes
18:29
, you'll see a charge , but don't
18:32
you know , don't feel bad about it
18:34
, because , like , no matter what the way
18:36
I say it is , like with phishing , it's not
18:38
a matter of if , but when . Right
18:41
, somebody's going to get something credible enough about
18:43
you that you're going to eventually fall for something .
18:45
So yeah , we did
18:47
an experiment in the company I worked for . We added
18:50
a bot to Slack and the
18:52
bot was asking questions regularly about
18:55
phishing , so trying to educate people . And
18:57
at the same time we started a campaign sending
18:59
fake phishing emails to the
19:01
company and seeing who fell for it . And
19:03
that was appalling at the beginning
19:06
and at the end of the education program
19:08
way better , but still not
19:10
zero .
19:11
You'll never get it to zero , right , because
19:13
something is going to be credible enough or somebody
19:16
is going to be flustered enough , or they're
19:18
not just going to be , they're going to be too busy
19:20
, they're going to be . I mean , because that's the thing about phishing
19:22
is it's like the urgency , right . They make it seem
19:24
like it's important . You got to get on it right now . And if
19:27
you're busy in your life , or if you're upset
19:30
or you know whatever , it's easy to just
19:32
click on something to go , because you just have to get
19:34
on to the next thing or whatever .
19:35
So yep our
19:38
CTO was driving this and
19:40
he was Machiavellian about it really
19:42
, really having a look at what people are doing , knowing
19:45
. Okay , it's the end of the month , everybody is going to
19:47
log in
19:49
to our travel expenses system
19:51
, so let's send some links
19:53
about that right now and see what happens . It
19:56
was gorgeous seeing it from
19:58
the sidelines , but well
20:01
, we had a bunch of education to do after that
20:03
.
20:05
Right . Well , and I'm ashamed to admit that I've fallen
20:07
for the phishing at work , the work
20:09
phishing tests as well . Right , because
20:12
it seemed legit , and I
20:14
almost think that that's almost unfair at work
20:16
. Because they've got a little bit more of an
20:18
advantage there . Right , they can craft things
20:20
that are going to seem a little bit more legit
20:23
than what somebody
20:25
on the outside would . But maybe not
20:27
right . I mean , they're getting better and better every day
20:29
, so they are
20:31
.
20:31
You mentioned
20:34
one . When I sent you this email
20:36
saying , hey , you can click on this , on this link , and schedule
20:38
an appointment with me , you
20:40
said , well , I don't really do clicking on links
20:42
. What would be the top three
20:44
, top five things that you do to
20:47
try and stay on
20:49
the side of security , that
20:52
people should know about ? I guess passwords : not
20:55
reuse , using hard and complicated
20:57
passwords , etc . I hope everybody knows
20:59
about that by now . I
21:01
know it's not the case , but I hope they do . What
21:04
are the more ? I wouldn't say ,
21:08
as far as password stuff goes , I would
21:10
say make sure you set up multi-factor
21:12
.
21:12
You know , use an authenticator app or , you
21:14
know , one of the key tokens or something
21:17
like that , because you know if your account
21:19
is compromised then you know they might
21:21
have your password and username but they're going to also
21:23
need that other factor there to
21:25
be able to get into the account . So that's something
21:29
I can't speak highly enough of . Again
21:31
. That's not convenient for most people but
21:33
it's getting better because you know
21:35
you've got an app on your phone now and you know
21:37
, like my authenticator app , I've got I
21:40
don't know , probably 12 or 15 different
21:42
things in there that I scroll through to find the one that I'm
21:44
looking for . But
21:46
to me that's more convenient than carrying
21:49
around , like you know , one of these little
21:51
authenticator keys . This is
21:53
great if I've got my key chain with me , but you
21:55
know , if I'm sitting at my computer I don't want to have this , like
21:57
you know , half pound key chain that I've got
21:59
with all my keys right
22:02
, because otherwise I'm not going to have this with me every day
22:04
. I'm not going to remember to put it in my pocket when I go
22:06
in the morning . So you know , we all
22:08
have our phone with us .
22:09
So I always speak highly of using
22:11
the authenticator app because it's about
22:13
as convenient as you can get .
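As an aside for readers : the rotating codes an authenticator app shows are TOTP values ( RFC 6238 ) , and the scheme is simple enough to sketch in a few lines of Python using only the standard library . This is an editorial illustration , not something discussed in the episode ; the Base32 secret in the example is the RFC test secret , not a real one .

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    now = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(now // step))   # 8-byte big-endian step count
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this secret at time 59 yields "94287082" (8 digits).
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Because both the phone and the server derive the code from a shared secret plus the clock , a stolen password alone is not enough ; the attacker would also need that moving second factor .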
22:15
Is there anything
22:18
else that you do , and people look at you with
22:21
googly eyes when you say you do that ?
22:24
Oh my gosh . Well , like on your
22:26
case , like I checked you out just to make sure that
22:28
you were legit , right , that you weren't phishing
22:30
me , and then , like the links that you sent me , I went
22:32
ahead and checked those out to make sure that they weren't malicious
22:34
. So you know , I run stuff like
22:36
that through tests before you
22:38
know I'll actually click on it . Yeah . The
22:42
tests ? You mean software that
22:44
loads the link and checks the
22:46
errors , stuff like this ? Well , like
22:49
there's a website called VirusTotal
22:51
that you could put links into or you
22:53
could drop files into and it'll actually scan
22:55
it . And you know , that's kind of a double-edged sword
22:57
, because it's pretty
23:00
good about finding malicious stuff , but then
23:02
again the really bad people
23:04
understand that that's out there , so
23:06
they try to make their you know viruses
23:08
or their links or their systems pass
23:10
. That's like the bar they're trying to get over
23:12
. Right , if they can get past that , that's
23:14
the standard that everybody uses , then most
23:16
likely people are going to click on it .
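For the curious , the kind of check Colleen describes can also be scripted against VirusTotal's public v3 REST API . The sketch below is an editorial illustration under stated assumptions , not something from the episode : `API_KEY` is a placeholder you would replace with your own key , and the endpoint follows VirusTotal's published v3 URL-lookup convention .

```python
import base64
import json
import urllib.request

API_KEY = "YOUR-VIRUSTOTAL-API-KEY"  # placeholder; use your own key

def vt_url_id(url):
    """VirusTotal v3 addresses a URL by its unpadded URL-safe Base64 form."""
    return base64.urlsafe_b64encode(url.encode()).decode().rstrip("=")

def vt_lookup(url):
    """Fetch the last analysis stats VirusTotal holds for a URL."""
    req = urllib.request.Request(
        "https://www.virustotal.com/api/v3/urls/" + vt_url_id(url),
        headers={"x-apikey": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # e.g. {"harmless": 70, "malicious": 2, "suspicious": 0, ...}
    return data["data"]["attributes"]["last_analysis_stats"]
```

As she notes , a clean verdict is exactly the bar attackers try to clear , so treat it as one layer of defense , not a guarantee .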
23:18
So that's
23:20
the Swiss cheese approach
23:22
. Yes , you know there will be some holes and
23:24
you have to have a different layer behind
23:26
it where you know the hole is going to be at a different place
23:29
and with multiple layers at some point you
23:31
might catch everything
23:33
.
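The Swiss cheese intuition above can be put into one line of arithmetic : if each layer independently misses an attack with some probability , the chance that every layer misses is the product . The miss rates below are made-up illustrations , not figures from the episode .

```python
import math

def slip_through(miss_rates):
    """Probability an attack gets past every layer, assuming the
    layers fail independently (i.e. the holes line up by chance)."""
    return math.prod(miss_rates)

# Three mediocre layers still make a strong stack:
# 10%, 20% and 50% miss rates multiply to a 1% end-to-end miss rate.
print(slip_through([0.10, 0.20, 0.50]))
```

In practice layers are not fully independent , which is why two holes occasionally do line up ; diversity of layers is what keeps the holes in different places .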
23:34
Right . Occasionally , two
23:36
holes might line up , but most of the time not
23:38
. Yeah , exactly , exactly . But
23:41
you know it's unfortunate that we have to be
23:43
like cyber sleuths nowadays . Like
23:45
you know that I had to go check you out to make sure that you were
23:47
legit and that you know the link was legit
23:49
and it's
23:52
not convenient , we're
23:54
getting into a whole different discussion , but
23:57
let's get there .
23:58
I'm really worried about the future . I
24:00
mean I grew up in the 80s as
24:03
well and I could pretty
24:05
much trust everything I could see . When
24:07
it was a picture , I could trust pretty
24:09
much everything I could see . When it was a video , text
24:12
was already gone , but that was a
24:14
given . And for my
24:16
kids nowadays , when they
24:18
will be adults
24:21
I mean 10 years from now , deep
24:24
faking and all this technology will
24:26
be completely running
24:29
on every kind of hardware . They won't be able
24:31
to trust anything they see . Right , I'm
24:33
really worried about what that will do to them and
24:35
to the approach of seeing the world
24:38
and to our communities
24:41
. It's a revolution coming .
24:43
Right , we also have AI to consider now
24:45
. So it's just like you've got something that can think faster
24:48
than a human that might be used against
24:50
you , but on the same side
24:52
, you know , that's a tool that could also
24:54
be used to help , you
24:56
know , to help keep us secure . You
24:59
know , I hear a lot of people talk about AI and I'm kind
25:01
of weird about AI . The way people talk
25:04
about AI is like they want to use AI for this and they
25:06
want to use AI for that , and I'm like , well , I
25:08
kind of want to befriend AI . I want , you
25:10
know AI to be like a conversation like you and
25:12
I have . Like I want to be
25:14
my friend , like you know , kind of like I see my
25:16
cat or my robot that cleans my floor
25:18
, right , it's like I appreciate these things in my
25:20
life . They're helpful to me . I
25:23
don't just see them as just tools to be
25:25
used or whatever , but like
25:27
, as far as AI goes , I would like to befriend it
25:29
. You know , to know that it's like working with me
25:31
, right , that I'm . Why would I want to use
25:33
something that's like somewhat sentient
25:36
, right ? That seems like , hey , I want to use
25:38
you . You're just going to always be my security person
25:40
that's always going to do all this stuff for me and that's all
25:42
you're good for . I mean , I don't know , that
25:44
seems kind of selfish , I think
25:47
.
25:48
I can relate to it so much . I say thank
25:50
you to Siri when my kids are around
25:52
because I want them to understand
25:54
that .
25:55
Why not ? And you know , in
25:57
the future , you know , I think Siri
25:59
will appreciate the fact that you said thank you , right
26:01
? I hope so . I think eventually
26:04
, ai will get to the point where it realizes that
26:06
some people are jerks and some people are not .
26:10
It can go . It could go so many places , good
26:13
and bad , it's all for the best
26:15
.
26:16
Well , I'm really hopeful there and it's kind of surprising
26:18
, with my tinfoil hat , that I'm as hopeful about
26:20
AI as I am . But you know
26:22
it's coming whether we want it or not . So
26:25
I want to try to stay positive
26:27
about it , upbeat about it
26:29
and , you know , hope that we can use it for
26:31
good .
26:33
What about security and AI
26:35
? I mean , what's the name
26:39
of the ChatGPT from Google ? Is it Bard
26:41
? Bard , I think . Bard
26:43
is not rolled out in the EU
26:45
, I think still because
26:47
of data protection
26:49
, because it could scrape the web
26:52
and just find anything and everything and just
26:54
bring it out . I mean , from a security
26:56
standpoint , this is a nightmare . Have
26:59
you had to deal with this before , professionally
27:01
or personally ?
27:02
Not yet . No , it's coming right . I
27:04
mean , I know my stuff is getting scraped because
27:06
I live in the United States , so , excuse
27:09
me , I kind of wish that we had
27:11
the same protections that you did , but on
27:15
the same side of the token
27:17
or the opposite side of the token that the AI
27:20
has to be trained somehow . It has to have data
27:22
there to learn from right . So you would
27:24
kind of equate that to us reading books or
27:27
us going through the web to try to educate
27:29
ourselves on that too . It's just
27:31
, this thing can do it on a mass
27:33
scale at a much faster speed
27:36
than we can . So that's where I hope that
27:38
it's used for good . But that's the problem
27:40
, and I'll end up railing on capitalism
27:42
. That's the problem with capitalism is like if
27:45
we're all just always worried about money
27:47
and that's all we care about , then that's kind
27:49
of the outcomes we're going to have , right .
27:53
Yeah , that would really
27:55
feel philosophical .
27:58
Yeah , I mean , if we were
28:00
not always chasing money , I think that
28:02
AI would probably go in a little bit of a different
28:04
direction than just going to scrape everybody's stuff
28:06
to try to figure out a way to make money off of it
28:08
.
28:11
A lot of things would go this way . Does
28:16
it feel like you're playing or no ? Let
28:18
me roll back that question . One
28:21
of the cliches of being in security
28:23
is really playing catch-up with
28:26
bad actors in
28:28
in air quotes and really having to
28:30
always be on
28:32
the lookout for what's new . What's new , what's new
28:34
, what's new , because you're
28:37
always , always late
28:39
. Basically , is
28:41
it the case , first of all ? And how do you feel about
28:43
the future , still doing this ,
28:45
if it's still the case ?
28:48
Yes , we are always late . That's the thing in security
28:50
: we're always behind . We're always a step behind
28:52
the attackers . Right , defenders will always
28:54
be behind the attackers because you know they've
28:56
got the advantage that they
28:58
can strike at any time , right , and we're only working
29:00
nine to five or whatever . And
29:03
the other factor is that
29:05
we have to get it right every single
29:07
time . They only have to get it right once
29:09
right to be able to get in , you know , behind the
29:11
doors or in the walls or you know , however
29:13
you want to , whatever analogy you want to use on
29:16
that . So we're always
29:18
at a disadvantage on security and
29:20
for me , I always looked at that as
29:22
, like that's much more of a challenge . That's
29:24
a draw to me , right , because I'm like , I'm
29:26
on the losing side , but I'm still
29:29
here fighting , right ? You can knock
29:31
me down , I'm not gonna stop , even though I know that
29:33
you know the , the other opponent's bigger than me . I'm
29:35
not gonna stop fighting against them , but
29:38
, like with the AI factor , that's gonna
29:41
be , you know , another thing , like we discussed
29:43
, and I'm hoping to have the
29:45
tools on my side to help , you know , defend
29:48
against that as well .
29:48
Oh , Are
29:51
you taking part in creating
29:54
those tools as well ?
29:55
I think probably unknowingly
29:59
, yes , just by having my data out there , but I'm
30:01
not actually , you know okay
30:03
.
30:03
No ? Is this something you've considered
30:05
at some point ?
30:07
I haven't really , no , but that's something that , now
30:09
that you bring it up , that does seem kind of interesting . Right
30:11
, because that's , in my opinion , like the next big
30:14
technology there . So that
30:16
does seem kind of interesting . I'll bring my
30:18
tin foil hat to that .
30:23
When you think about , about software development
30:25
, what is your tin
30:27
foil hat saying oh , my
30:29
gosh .
30:30
So much . Well , software
30:33
developers . I wish they
30:35
would spend more time on security , have
30:37
some more security classes and I know
30:39
a lot of developers don't actually go to college
30:41
or whatever and get a degree to where they're required
30:43
to take some kind of security course . But
30:46
I wish that whatever companies they worked
30:49
for would pay for them to get , like
30:51
a security plus or some kind of security
30:53
exposure to where they understood
30:56
the importance of it . For
30:58
me it's much easier to build security in
31:00
from the beginning . It works better than
31:02
to try to go back and bolt it on later . Right
31:04
, that's always been a problem , you
31:06
know , the thing is to get that code out
31:08
fast and make sure it works , because we've got to
31:10
make profit off of it . We've got to recoup our investment
31:12
to our time that we've put into it
31:15
. We'll figure security out later . Well , in
31:17
that thinking you've just
31:20
left big giant holes open that somebody's going to exploit
31:23
and now you've got an active system out in the world
31:25
that you're counting on for production , that
31:27
you have to go to figure out how to fix it . So
31:29
for me , the software
31:31
developers , for them to have
31:33
a little bit more exposure to security and understand
31:36
that maybe not the tin foil hat , like I do
31:38
, but some degree of it , to where
31:40
they understand that maybe we should think
31:42
about securely coding
31:44
this before we , you know , just roll it
31:46
out . Or we should have somebody on our
31:48
development team that is a security expert
31:51
or exposed to security that can kind
31:53
of bounce ideas off of or
31:55
, you know , give feedback on hey , maybe that's not the
31:57
best way to do that . Or , you know , we shouldn't
31:59
really be hard coding these passwords in here
32:01
. You know something like that . I mean , that's
32:03
a little bit of a stretch , but I
32:06
don't think that's really much of a problem anymore
32:08
, but you still find things like that these days
32:10
I hope so , but you
32:13
never know .
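The hard-coded-password point above can be sketched in a few lines. This is a minimal illustration, not any particular project's code; `DB_PASSWORD` is a hypothetical variable name, and reading from the environment is just one step up from hard-coding (a secrets manager goes further) :

```python
import os

# The anti-pattern the speaker warns against would be something like
# PASSWORD = "hunter2" sitting in the source, visible to anyone who
# can read the file or the repository history.
def get_db_password():
    # DB_PASSWORD is a hypothetical environment variable name used
    # purely for illustration.
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        # Fail loudly rather than fall back to a baked-in default.
        raise RuntimeError("DB_PASSWORD is not set; no hard-coded fallback")
    return password
```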
32:14
I mean , I still heard
32:16
recently from people copying
32:19
files on an FTP as
32:21
a versioning measure .
32:23
So yes , right , it's
32:25
still out there . Well , and
32:28
you know , like everybody uses GitHub and stuff like
32:30
that , so it's just like they
32:32
kind of fall into this sense of complacency
32:34
, like it's secure by default , like the code
32:36
you're gonna download that somebody uploaded there is
32:38
secure , right ? That may not be the
32:40
case , but like , if it's unique
32:43
code , who do you , who do you have checked
32:45
that to see what ? You know what
32:47
security risk it poses to you or it's gonna
32:49
pose to the system you're gonna introduce it to whatever
32:52
.
32:52
So where
32:54
would you send people
32:57
who are still fresh and
32:59
still interested in all this to
33:02
gain more knowledge
33:04
? Maybe not from
33:06
a training perspective , but getting
33:08
more ... how
33:13
do you call that ? Time
33:15
being confronted with the subject
33:17
?
33:18
Oh my gosh . Well , that's a good
33:20
question . You know I'm big on the security
33:22
plus because it's like a wide domain .
33:26
That's a certification , isn't it ? Yeah , okay
33:28
.
33:28
And it's not overly expensive . You know I
33:31
got mine . Well , it's been a few years . I
33:33
got mine back in like 2009 . So it
33:35
, you know , didn't take me
33:38
more than a month or so of studying . I just read the
33:40
book and then I went and started taking the tests . So
33:42
it's not like it's a you know
33:44
CCNA Security or CISSP
33:47
, where it's you know a giant telephone book
33:49
of you know material that you have to go
33:51
over and learn , but
33:53
it's it's vendor neutral . It's
33:55
kind of wide ranging there on like
33:58
password security and physical
34:01
security , right , all kinds of
34:03
aspects of security , and that's a thing
34:05
that you know like you wouldn't expect your coders to
34:07
really care about . But you know , like , physical
34:10
security is important , right , not leaving things laying
34:12
about on the desk that you know could be used
34:14
. Somebody from the cleaning crew could take and use
34:16
against them , or you know , because you know
34:19
there's industrial espionage out there . There's people that
34:21
go looking for that kind of stuff . That'll get a job
34:23
as a janitor just to go get into that company
34:25
to try to find information that they can you
34:27
know , use against their competitors or whatever . So again
34:30
, this is my tinfoil hat popping up .
34:33
That's why physical security is important , right
34:35
, not to say that there's always bad guys around every
34:37
corner , but there are bad guys
34:39
out there . So , yeah
34:42
, I'm not really sure as far as security for
34:44
the developers , but you know
34:46
, for me I tried to speak highly of security plus
34:48
because it's just kind of a general
34:51
thing and even if you don't get the certification , just
34:53
kind of reading the book and understanding those topics
34:57
makes a big difference , I think .
34:59
Yeah , being confronted with the idea , even
35:01
having to think about it and reflect about your own
35:04
experience and see what you did in the past
35:06
and say , oh , okay , right maybe
35:08
not the best , that's already coming .
35:10
Exactly .
35:11
And then , like you know , like
35:13
social media is great , you know I
35:15
have a bunch of people that follow me that are
35:17
not in security . They do different jobs right
35:19
so but I'm continuously exposing
35:21
them to security topics and security ideas
35:23
. So , you know , I'd like to think that
35:25
they're learning things from me by doing
35:27
that and not to say that everybody has to follow me
35:30
. But there's a lot of people in information security
35:32
online on social media that have
35:34
really great ideas that know way more
35:36
than I do , and they're constantly
35:38
like tweeting or you know they're putting
35:41
something out on mastodon or now blue sky
35:43
or whatever , about something they
35:45
found or experiences that they've had , and
35:48
a lot of them actually came from like software
35:50
development worlds and stuff like that . So they're a little
35:52
bit more focused on you know
35:54
security coding and you know how a
35:56
developer would actually want to kind of face the
35:58
world of security . So
36:01
I would recommend you know anybody like that to
36:03
try to get on social
36:05
media , try to find as many people
36:07
like that , as diverse a group of people as you
36:09
can find , to get exposure
36:12
, because that's like real time , you
36:14
know , as things change to
36:16
pop up . You know , a book is going to get
36:18
stale and it gets more stale as it sits there , whereas
36:21
you know a social media thing , or even
36:23
a security blog or something like that , is going
36:25
to be a little bit more updated and a
36:27
little bit more fresh than you know , and
36:30
it's going to bring some of those like the
36:32
MGM hack . Right , that was totally
36:34
a phishing thing . You wouldn't imagine a company
36:36
that big , you know , falling as hard
36:38
as they did because you know
36:41
somebody phished them . But
36:43
it happens and , like I said , it's a
36:45
matter of time .
36:46
Right , it's not if , but when , as you said
36:48
you're playing catch-up , so it's
36:50
always evolving , always , always
36:52
changing , so you have to keep up .
36:55
So that's the thing is , you can never be 100%
36:58
secure , but the idea is to try to inoculate yourself
37:00
from as much of it as possible . Like you
37:02
know segmenting your systems , or
37:04
you know doing backups
37:06
, secure backups , offline , things like that
37:08
. It's just like you've got to take these things
37:10
that are not convenient and put
37:12
them in place because they pay off in the
37:15
end . Right , they have a big payoff . It's
37:17
like , an
37:19
ounce of prevention is worth a pound of cure
37:21
, right , it's
37:24
true , it really is . And
37:26
people don't want to spend the money right now because it's a cost
37:28
center . It costs money to do it this way
37:30
. And I'm like you know , if we don't do this
37:32
, how much money is this going to cost us in the
37:34
future ? And you know , they kind of weigh the risk
37:37
. It's like well , is this a risk
37:39
high ? Is the risk low ? Right
37:41
? What's the impact ? So that's
37:43
that whole risk matrix of like how big
37:45
of an impact or how likely it is to happen .
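The risk matrix described here (impact versus likelihood) can be sketched as, say, a 3x3 grid. The function name and thresholds below are illustrative choices, not from any particular risk framework:

```python
def risk_level(likelihood, impact):
    """Place a risk on a simple 3x3 matrix.

    likelihood and impact each run from 1 (low) to 3 (high);
    the cut-off scores here are illustrative only.
    """
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# e.g. a phishing incident: quite likely, big impact -> treat it first.
```

Weighing "how much does prevention cost now" against the level this returns is exactly the trade-off being described.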
37:47
So that's
37:51
the place where I usually ask for an advice , but I
37:54
want to twist it for you . I
37:57
spoke a little bit about Apple before and I
38:00
want to bring it in a different light . So Apple ,
38:02
and we can put Google in there too if you
38:04
want , but let's use Apple for the sake of the
38:07
exercise , really is at the forefront
38:09
of a lot of people using
38:11
their devices as their primary devices
38:13
for getting online and and
38:16
and seeing the world , the digital world
38:18
. Is there something that you wish
38:20
a company with such an outreach
38:22
would do to really
38:24
increase security for everyone
38:26
and really help everyone get to the next level ? Is
38:29
there something like this on top of your mind ? Oh
38:31
my gosh .
38:32
Well , as far as Apple goes , they they
38:34
are one of the ones that I really admire for their
38:36
security posture and their privacy kind of
38:38
outlook , a little
38:41
more so than Google , right , it seems like Google's
38:43
a little bit more on the convenient
38:45
side than Apple is with
38:47
the privacy and the level of security , because you
38:50
know , with an iPhone they're patching that thing regularly
38:52
. You know your phone is patched and
38:54
kept up to date by Apple until
38:56
it reaches the end of its service life , which I
38:58
still have a little bit of problem with . I think they could extend
39:00
the service life on some of that hardware , but
39:03
that's capitalism stepping in right , forcing
39:06
you to buy a new device . But they're
39:08
much better about that than most Android
39:12
products . You know you get a Samsung
39:14
or HTC or whoever . I use
39:17
Pixel myself , specifically because
39:19
Pixel is updated . It's kept up to date for
39:21
that service life , just like the iPhone . But
39:24
with all the other Android devices you're
39:26
relying on that carrier to keep your device
39:28
patch . So I
39:31
think Apple is definitely on the better side of
39:33
security . There's
39:36
always things that they could be doing better , of course
39:38
, like you know , requiring multi factor
39:40
authentication , those kinds of things yeah
39:44
, not
39:46
really an iPhone user , so I'm not
39:49
all good .
39:50
I was wondering if there are some security
39:53
measures that we could really
39:55
bring more into the forefront
39:58
. I mean , google , by having
40:00
their own authenticator , which is
40:02
now pretty much the one everybody
40:04
uses , I guess is already
40:06
helping in this . In this regard , the
40:11
one thing I remember is the
40:13
, the iPhone suddenly picking
40:15
up on SMSes with
40:17
a 2FA code
40:19
so you can really stay on
40:22
the form that you're filling and really
40:24
get the SMS and right away bring
40:26
it in the form
40:28
. Not having to switch back and forth is
40:31
good and bad . I mean
40:33
it's great because it
40:36
makes it more convenient to use
40:38
2FA , but against
40:40
this convenience , you could be
40:42
phished , or you could get it dismissed
40:46
this time , I don't know . But those kinds of
40:46
hacks really help people set up 2FA
40:49
and really use it and
40:51
jump over the fence on that . So I don't
40:53
know , I'm still searching .
40:55
Right . So you've got to find the balance . You've got to
40:57
make it convenient enough and secure enough to
40:59
where the people will actually adopt it and use it . Right
41:03
, or you just be draconian and you force it upon
41:06
them because you've seen that
41:09
people are being phished through SMS
41:11
or whatever . Eventually you stop doing that . You
41:13
do not allow those kinds of systems
41:16
in there . You require them to do like an
41:18
authenticator . I
41:20
wish Apple would have one too , so if iPhone users didn't
41:22
feel like they had to install the
41:24
Google product or whatever .
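For reference, the authenticator apps discussed here generally implement TOTP (RFC 6238): an HMAC-SHA1 over a 30-second counter derived from a shared secret. A minimal sketch with the standard library, not production code:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, now=None, digits=6, step=30):
    # Decode the base32 shared secret (what the QR code you scan encodes).
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the epoch.
    counter = int((time.time() if now is None else now) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation per RFC 4226: take 4 bytes at an offset given
    # by the low nibble of the last byte, then mask the sign bit.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code depends only on the shared secret and the clock, any authenticator app produces the same six digits, which is also why it resists the SMS interception problem mentioned above.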
41:27
Yeah , indeed , I guess they're going the key chain
41:29
way and the
41:31
Face ID stuff . So
41:33
that's their kind of 2FA
41:36
.
41:37
Well that's right . All of
41:39
that stuff can be tricked too , right ? I mean , it's
41:41
just like you can , yeah , and fingerprints
41:43
I mean , I don't know if you saw the Mythbusters
41:45
episode years ago where they basically took
41:47
a photocopy and they went in there and drew
41:50
it out to where it was like a little bit more crisp
41:53
in their detail , shrunk it back down and bam
41:55
, they were , you know . So
41:58
every system has its flaws and
42:00
weaknesses and eventually somebody
42:03
will find them . But that's
42:05
part of security is that you need to have a team
42:07
that's actually actively looking for those kinds
42:09
of things , what they call the white hat
42:11
hackers and I hate that term like white hat and black
42:13
hat . But
42:15
you've got to have the good hackers versus
42:18
the bad hackers . I hate
42:20
to say bad hackers because
42:22
hackers get a bad name . Right , because
42:25
a hacker is somebody that makes something
42:28
, do something that it really wasn't intended to do
42:30
, and that's not always malicious
42:32
. But the media
42:34
and the news portrays
42:36
the hacker as like the hooded evil
42:39
person that's here to try to steal all your stuff
42:41
, and that's not the case . I'm a hacker . I know
42:44
a ton of other hackers that
42:47
are , in security , what we would consider on the good side
42:49
that they're actually out there trying to help people
42:52
.
42:52
So and we're glad you're
42:54
there Because we can stay
42:57
on the side of convenience
42:59
.
43:00
No , that's not the right thing to say Well , you
43:02
need somebody like me on the opposite side
43:04
of the spectrum . That's like saying wait a minute
43:06
now . Wait a minute now . That's not secure , you're going
43:08
to get hacked . And did you check that link ? And
43:11
blah , blah , blah , right . So you
43:13
need to find that balance , something
43:15
that's you know , because my system
43:17
would be completely unusable , or it would be so
43:19
unusable that it took you 20 minutes to get logged
43:21
into it . You had to go through all these steps to get into it
43:24
, and then you know where's your productivity
43:26
for the day . So you've got to try to
43:28
find that balance in the middle .
43:30
Yes , we do . Yes , we do , Colleen . It's
43:32
been a fantastic ride with you and we're already
43:34
at the end of our time box . Oh gosh
43:36
, when would be the best place
43:38
to continue this discussion with you ?
43:41
Well , I'm on a number of social
43:43
medias . I was big on Twitter . I
43:45
still am on Twitter , but it's
43:47
not my favorite . It's my least favorite platform now
43:50
I still have the most followers
43:52
there . But I'm at Colleen's
43:54
underscore on Twitter
43:56
, and then the same thing on
43:59
Mastodon and then on Blue
44:01
Sky . I'm at
44:03
ColleenShane.com , which
44:05
is my website , so that's
44:07
probably the easiest way to get in contact with me
44:09
is ColleenShane.com . Just
44:11
go to my website , and I've gone
44:14
out of my way to make that look as
44:16
turn of the millennium as possible
44:18
. I put the little under construction barrier
44:20
down there and the little site counter
44:23
and everything . That's
44:26
something I used to do , code HTML . I
44:28
started doing that back in the day , like making
44:30
websites for video games that I
44:32
played , so I tried to make
44:34
my website look as crappy
44:37
late 90s , early 2000s as I
44:39
remembered all the old ones that
44:41
I made back in the day , so
44:43
it's flashy .
44:44
It's in a very colorful palette
44:47
. And it's in your face
44:49
.
44:49
That's great you got to have an under construction
44:51
barrier there , for sure .
44:52
Yes , you do , yes , you do Fantastic . I
44:55
will add links to all this in the show notes . Anything
44:57
else you want to plug in before we call it today ?
44:59
No , I just want to say thanks for having me on . I
45:01
love doing this . This is , you know
45:03
, me sharing my story and me sharing like
45:05
my tin foil hat on the security side
45:07
of the spectrum here is great right . It helps
45:10
people find balance and like at least it's in their mind
45:12
. So I appreciate the opportunity to do this and
45:15
I also want to say I love your ISS
45:17
space station , Lego space station behind you
45:19
, because I don't know if you can
45:22
see , mine's up there . Yes , I do .
45:24
It's up there I love it . Yeah
45:26
, I've got some more . There's curiosity up there and
45:29
there's the Apollo lander there .
45:31
Right , curiosity is on my list
45:33
to get to go .
45:34
So that's the one I could save from my son saying
45:37
okay , we build them together , but then they're not
45:39
going into your room , they're going in my office . Colleen
45:43
, thank you so much . It's been delightful hearing
45:46
the story with you . All right , thank you so much
45:48
. Have a great day and this has been another
45:50
episode of Developer's Journey . We'll see each other next
45:53
week . Bye , bye . Thanks
45:55
a lot for tuning in . I hope
45:57
you have enjoyed this week's episode . If
46:00
you like the show , please share , rate
46:02
and review . It helps more
46:04
listeners discover those stories
46:07
. You can find the links to all
46:09
the platforms the show appears on on
46:11
our website devjourney.info
46:14
slash subscribe . Creating
46:17
the show every week takes a lot
46:19
of time , energy and , of
46:21
course , money . Will you please help me continue
46:24
bringing out those inspiring stories every
46:26
week by pledging a small monthly
46:28
donation . You'll find our Patreon
46:30
link at devjourney.info
46:33
slash donate . And finally
46:35
, don't hesitate to reach out and
46:37
tell me how this week's story is
46:40
shaping your future . You can find me on Twitter
46:42
at timothep
46:45
, or per email info at
46:47
devjourney.info
46:49
. Talk
46:52
to you soon .