Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:54
How's it going , Mathieu ? It's really good to have
0:56
you back on the podcast here . It's been
0:58
I mean , it's been probably
1:00
18 months since you were last on
1:02
the podcast . I'm really excited , you
1:04
know , for what we , what we have in store today
1:07
.
1:08
Yeah , great . Thank you very much for having me back . Time
1:11
flies when you're having fun , I guess yeah
1:14
.
1:14
Yeah , it doesn't seem like it was that
1:16
long ago , but it's just
1:18
like one thing after the next . You know in our
1:20
lives where it's like traveling
1:23
and just constant , constant
1:25
, go on different topics .
1:28
Yeah , indeed , and 2023
1:31
has been particularly busy
1:33
on many , many fronts , everywhere
1:35
.
1:37
Yeah , absolutely so you know
1:39
before we dive into you
1:41
know 2023 recap . Why
1:44
don't you tell my audience you know
1:46
who you are , what your expertise
1:48
is and all that good information ? Right
1:51
, because you know , maybe maybe there's
1:53
some new listeners that haven't heard you before
1:55
and I just want to make sure that everyone knows
1:57
. You know who you are and what you provide
1:59
in the field .
2:01
Yeah , sure . So my name is Mathieu
2:03
Gorge . I'm the founder and CEO
2:05
of VigiTrust . We're
2:07
a software provider of
2:09
GRC solutions
2:12
and we've got an award-winning solution
2:14
called VigiOne that allows you to prepare
2:16
for , validate and manage continuous compliance
2:18
with about a hundred security frameworks worldwide
2:21
, specifically anything
2:23
that has to do with data privacy , information
2:25
, governance and compliance . So
2:27
, as you can imagine , all the usual suspects PCI
2:30
, HIPAA , GDPR
2:32
, NIST , ISO and
2:34
so on and so I've
2:37
been in cybersecurity for longer than I
2:39
care to admit ; it's probably been about
2:42
25 years . I started when cyber
2:44
was not called cyber , it was called network
2:46
security , and then it became content
2:49
security , internet security , data
2:51
security , privacy , and now we're
2:53
in the area of global
2:55
compliance and global security . I'm
2:58
involved with a number of security think
3:00
tanks , including the VigiTrust Global
3:02
Advisory Board , which is a not-for-profit
3:06
think tank with about 1,350
3:09
members from 30 countries , and
3:12
we talk about what's happening in the industry
3:14
under Chatham House rules . One of the things I would
3:17
say straight off is that
3:19
my view is that , if
3:21
you work in security and you
3:23
do your job correctly
3:26
, nobody knows your name , but if something
3:28
goes wrong , you become public enemy number
3:30
one very quickly within the organization
3:32
and it carries a stigma in the industry moving
3:34
forward , and I think
3:36
that as a community , we need to look after each
3:38
other and we need to make sure
3:41
that we share best practices , not just by saying
3:43
this is what you should do , but
3:45
also saying you know what ? This is
3:47
where I make mistakes . I'm going to share that with you so
3:50
that you don't have to make my mistakes , and hopefully you will share
3:52
your mistakes with me so
3:55
I don't have to make them .
3:57
Yeah , that's a really good point and I think that
3:59
that kind of experience
4:01
is often overlooked . I
4:05
used to work for a company and they were bringing
4:07
in a new VP of
4:09
security and he was recently
4:11
at , I think , two or three other
4:14
places back to back that were breached , and they were
4:16
big , huge breaches , like the
4:18
Target breach and a couple of other
4:20
ones like , I think
4:22
, Home Depot . It could have been a
4:25
very unfortunate situation , right , like this guy just came
4:30
into the role , they get breached
4:32
and it's pinned on him and whatnot , right
4:34
. But everyone internally
4:37
was like , ooh , are we sure we want
4:39
to hire that person ? And the only person
4:41
that actually
4:43
stood up for him was the senior director
4:46
that would be reporting to him and said , you know like
4:48
, well , why wouldn't we want that experience in
4:51
house ? We haven't been breached , he's gone through
4:54
it , he knows what happens , what can happen
4:56
and how to handle that situation .
5:00
Yeah , and you know , the reality is that
5:02
there's only two types of companies out there
5:04
the ones that have
5:06
been breached and the ones that don't know
5:08
they've been breached . And it's nearly better to
5:12
understand that you have been breached so you can
5:14
better the systems , the processes , the security awareness
5:18
, the culture of security , so
5:20
that you can make that an ongoing journey . I always say that security
5:24
is a journey and
5:26
not a destination . So by the time you reach
5:28
compliance with regulation one
5:30
, two , three or XYZ , your ecosystem has evolved
5:32
. You know , maybe people have left , maybe you've
5:34
acquired
5:37
a business , maybe there's a new system that
5:39
has been rolled out , and so your risk surface changes
5:43
all the time . And so to
5:45
say that we are secure right now , you
5:47
might be secure for a millisecond , but
5:50
everything is dynamic , and so you
5:53
need to work with people that understand that , and somebody that's
5:56
already dealt with a breach most likely
5:58
will understand it better than somebody that hasn't
6:01
. That doesn't mean that the skills are not
6:03
equal . I'm just saying that having to deal with a breach
6:07
from a PR perspective , a
6:09
technology perspective , a legal perspective
6:11
, you know , an internal
6:14
perspective is something that
6:16
, unless you've lived it , it's difficult
6:18
to grasp . Now you
6:22
can get amazing training for it and
6:24
you will be much better at dealing with it if you've
6:26
had the training . But
6:28
unfortunately
6:30
you won't really grasp
6:32
it until it happens to you , if that makes sense .
6:37
Yeah , that's a really good point . You
6:39
know you bring up there's two kinds of companies companies that know
6:41
that they've been breached and ones that don't know that
6:43
they've been breached yet . Right
6:47
I do have
6:49
an interesting question . That's probably a loaded
6:51
question . When companies
6:53
have , you know , subsidiaries
6:56
or , let's say , branches
6:59
in China or more
7:01
adversarial countries , right , do
7:06
you assume that they're already
7:08
breached and it's
7:10
more of an internal breach at that point ? Does
7:14
that make sense ? Because
7:16
if , to just throw out a couple of
7:18
adversarial countries out there , China and
7:20
Russia , how
7:22
they typically operate is that
7:24
when you operate in their country , they own
7:26
all the IP that you create
7:29
there . So , do
7:31
you see it that way or not necessarily
7:33
?
7:33
So I don't necessarily think that they've
7:35
been breached or spied on , but
7:38
what I would say is that you
7:40
need to understand your ecosystem
7:42
and what I mean by your ecosystem is anything
7:45
that's behind your firewalls , your hybrid
7:47
workforce , your applications , your
7:50
third parties , fourth parties
7:52
, anybody that interacts with your systems
7:55
, even from
7:57
time to time , even sporadically and
7:59
if some of those subsidiaries
8:02
or branches or people are based in
8:04
a country that is at risk , then
8:07
you need to run a tabletop exercise
8:10
as to what it would mean if
8:12
you could no longer get to that data
8:14
, if you could no longer get
8:16
the physical assets , the hardware , for
8:18
instance , back into your own country , if
8:21
you could no longer talk
8:24
to the regulator , the local regulator
8:26
, because what might happen and that happens
8:28
specifically with Russia and Ukraine
8:31
is that from one day
8:33
to the next , suddenly
8:35
it became super difficult to get your
8:37
data and even if you have a backup
8:39
of your data , because of nationalization
8:42
of Western assets in Russia , for instance
8:44
, you will never get that data
8:46
back and you can be sure that at that stage , that
8:49
data is going to be analyzed . So
8:52
there are currently about 10
8:55
plus really
8:57
bad conflicts worldwide and of course
9:00
, you hear about Russia and Ukraine , and you
9:02
hear about Israel and Gaza , but there are
9:04
a few others that don't really make the
9:06
headlines the same way . You need
9:08
to map out where you do business
9:10
and the impact on your business
9:12
and I think that's where the
9:15
boards are really starting
9:18
to wake up suddenly in 2023 and into
9:20
2024 , that
9:22
you can't just assume
9:25
that because you do business in a country
9:27
right now and it's all solid
9:29
, there's the right policies
9:32
and the right backups and so on , that you don't have
9:34
to actually plan
9:36
for the worst . And
9:38
I believe that , as humans
9:40
, we are , generally speaking , optimists
9:44
, sometimes too much , and
9:48
it's a case of understanding what
9:50
am I ready to lose ? Have
9:52
I trained for that ? Because , as I said
9:54
, if you've trained for it , even
9:57
if you haven't really experienced it , but if you've
9:59
done a tabletop exercise as to oh
10:01
, from tomorrow onwards , we can no
10:03
longer do business in Taiwan , you
10:05
know what that means and you've prepared
10:07
for it . And I think that it
10:09
leads me to the
10:11
World Economic Forum's Global
10:14
Cybersecurity Outlook 2023
10:16
report , where , essentially , they
10:18
list all of the top risks
10:20
that organizations need to
10:23
deal with , and the first risk
10:25
is not the advent of AI
10:27
, it's not the rise in
10:29
security breaches , it's
10:31
the geopolitical fragmentation
10:34
. So , in other words , what's happening
10:36
in countries where you may or may not do business
10:39
will actually impact your business . Again
10:41
. I go back to Russia invading
10:43
Ukraine . You can see a
10:46
huge rise in ransomware
10:48
attacks coming from Russia into countries
10:50
that are openly supporting
10:52
Ukraine . It's
10:55
reasonably well documented
10:57
that there were a number of critical infrastructure
11:00
attacks on the Ukrainian
11:02
critical infrastructure assets
11:04
in the nine months leading up to the physical
11:07
attack and then that kind of dropped about
11:10
two weeks before the physical attack and now
11:12
it's back up and so
11:14
we are monitoring , as an industry
11:16
and threat intelligence community , the
11:18
countries that are getting the most attacks
11:21
right now , because it could
11:23
be it's not guaranteed , but it
11:25
could be a sign of physical attacks . And
11:27
I think that you
11:29
know I'm not telling anyone to forget about
11:31
privacy and so on , absolutely not
11:33
. You need to continue working on
11:35
that . But I do think that right
11:38
now is a good time to go back
11:40
to basics and to say what
11:42
is my ecosystem ? What am I protecting
11:44
? What am I willing to lose
11:47
in 2024 ? What can
11:49
I absolutely not afford
11:51
to lose in 2024 ? And that will
11:53
drive your threat intelligence
11:55
and your protection
11:57
strategy into the new year , I
11:59
guess .
12:01
Yeah , it makes a lot of sense . You know
12:03
, I actually
12:06
took part in a tabletop exercise
12:08
before and those are
12:11
extremely good at identifying
12:13
the areas of improvement . It's
12:16
really interesting . You know , they'll come up
12:18
with a different scenario and you have to work through
12:20
it and everyone on the call , you know
12:22
, has a role . I've
12:24
seen it from both ends , where everyone knew what
12:26
they were doing and then the other side was , you
12:29
know , no one really knew what they were doing and
12:31
, you know , in this tabletop exercise , the company
12:33
was breached for an entire
12:36
week before security even knew about it , right
12:38
, and
12:41
it's a really good
12:43
tool that organizations
12:46
should , and typically do , use
12:48
to really , you know , identify those gaps
12:51
and actually it's really important to shore
12:53
them up once you identify them . You
12:56
know , now , looking back into 2023
12:59
, what were some
13:02
of your top items I
13:05
guess that happened in 2023
13:07
that you think may be setting the stage for
13:09
2024 ?
13:12
Well , from a technical perspective
13:15
, the rise of ransomware attacks
13:17
, the
13:19
scaling
13:22
in the number of attacks
13:24
against CEOs and the C-suite
13:27
. From a social
13:29
engineering perspective , that was extremely visible
13:31
. What we saw as well
13:33
is a number of key executives being
13:36
prosecuted and
13:38
, in very limited cases
13:40
, being jailed
13:43
for not doing the right thing with
13:45
regards to privacy , and
13:47
that can be a game changer . We
13:51
saw a number of new
13:54
regulations coming out in
13:56
specific areas and we saw
13:58
obviously the advent of NIS-2
14:01
in Europe with regards to critical infrastructure
14:04
protection . We saw a number
14:06
of new data privacy
14:08
regulations in I think about six states
14:10
in the US , which is good , but
14:13
they're not all exactly going the same
14:15
direction . So I think we're unfortunately still
14:17
a long way away from the
14:19
federal equivalent of GDPR
14:21
. So
14:24
we obviously saw the
14:27
Ukrainian-Russian
14:29
conflict going on and
14:31
the impact of that . We
14:34
now have the conflict
14:36
between Israel and Gaza and
14:38
, ironically , a lot
14:41
of cybersecurity funding comes out of
14:43
Israel every year , not
14:45
just to the US , but also to Europe
14:47
and to Asia , and that funding
14:49
is probably going to slow down , meaning
14:52
less money to invest in cyber , also
14:55
meaning more attacks on Israel . Also
14:58
meaning potentially another equivalent
15:00
of shadow IT coming out of
15:03
Gaza and people
15:05
supporting Gaza . So
15:07
it's a very dynamic environment
15:10
. But
15:13
we also saw a rise in
15:15
that idea of security culture and
15:18
that is mentioned as well in the World
15:21
Economic Forum report where we
15:23
see more and more business people
15:25
trying to engage with
15:27
security and compliance people to understand
15:29
what they can and cannot do and for them
15:31
to work together . And
15:34
then we go back to what we were saying at the beginning , that
15:36
if you work in security , nobody wants to talk to
15:38
you and , generally speaking , it's
15:40
because you're either telling the
15:42
business no , you can't do this because you're going to
15:44
put us out of compliance or you're going to increase
15:47
our risk surface beyond what we can
15:49
accept , or you're
15:52
like , hey , you go to the board and say
15:54
, hey , the business wants me to do that , can I get another
15:56
million dollars to make it happen securely
15:58
? So it's a difficult one . But now we're
16:01
seeing that trend where more
16:03
and more business people are talking
16:05
to security and compliance and we're going to
16:07
see a bit more of that in 2024
16:09
, into the next two to three years .
16:14
Yeah , it's a lot to unpack
16:16
there . One of the things
16:19
that you brought up previously
16:21
that I have talked about is
16:23
the fact that now we're
16:25
seeing a lot of digital
16:28
attacks or cyber warfare attacks , before
16:30
kinetic attacks ever take
16:33
place . Do
16:35
you see that ramping
16:38
up at all ? Because
16:41
I feel like there should almost be , like you
16:44
know , a watch group that
16:46
is saying like , oh , we're seeing an increase in , you
16:48
know , specialized attacks in
16:50
you know Europe or wherever it
16:52
might be , and you know
16:54
, kind of like , put out the watch on that
16:56
. Because I feel like everyone in cybersecurity
16:59
is aware of that , they
17:01
understand that and they know the implications
17:04
of that . But it's
17:06
much more difficult to
17:08
get people outside of cybersecurity to fully
17:11
grasp the concept of
17:13
, oh , like , they're going to take down my
17:16
phone network before they
17:18
, you know , send troops in , right ?
17:21
Well , so you know , the issue
17:23
with critical infrastructure assets is
17:25
that as citizens , as
17:28
Joe public , we believe that
17:30
this is the responsibility of
17:32
the government , and what
17:34
we do not understand is that , depending on
17:36
the survey you look at , but generally speaking
17:38
, it's between 70 and 80% of
17:40
critical infrastructure assets
17:42
like electricity , power
17:44
, food , transportation
17:47
and so on are actually owned
17:50
and operated by
17:53
the commercial sector , by
17:55
private companies , and the
17:58
part that is actually managed purely
18:00
by the government is
18:03
, generally speaking , only the army and
18:05
the police systems , because even
18:07
hospitals and specifically
18:10
in the US , you know
18:12
, half of the hospital systems
18:14
are actually private systems . Not
18:16
so much in Europe , but still
18:18
parts are still actually private
18:21
. And so what you want
18:23
to do is you want to bring
18:25
up the awareness level with Joe public
18:28
, that everything
18:30
starts with them , which actually leads
18:32
me on to another . I
18:35
suppose another issue
18:37
here and we are seeing that more
18:39
and more over the last few years is
18:41
that concept of your own critical
18:44
infrastructure . So right
18:46
now , most of you have three
18:48
, four connected devices on you , you
18:50
know , a smartwatch and maybe
18:53
a personal cell phone , a business cell phone and
18:55
an iPad or whatever , and that's before you even get
18:57
into your car , which is completely connected , and
19:00
then you get to your house and so on . And so
19:02
if I educate
19:05
you , either as the industry
19:07
and/or the government , as to the
19:09
value of that , and I
19:11
can say , well , if you take care and
19:13
if you are careful , nobody's
19:16
going to be able to drive by and
19:18
order whatever they want
19:20
by hacking into your phone that
19:23
is linked to your fridge
19:25
, that has a system that allows you to
19:27
connect to Walmart
19:30
or wherever , to replenish everything , and
19:32
now I can buy different things and get
19:34
them sent to my home instead of yours , and
19:37
that is the problem . But it's not like threatening . But
19:39
let's say I hack
19:42
into your HVAC system , your air conditioning
19:44
system , where it depends on where you live
19:46
, but like , if you live in Michigan in
19:48
the middle of summer and you
19:50
can't get cool air
19:53
or in the middle of winter and you can't get
19:55
heating , that will become critical
19:57
. And so I think that what we need
19:59
to do is we need to like if people do that
20:01
at home , they're more likely to pay attention
20:04
at work , and vice versa . So it needs
20:06
to be a continuous cycle of educating
20:08
them on both sides . I
20:11
do think that again
20:13
it goes back to that idea of your
20:16
risk surface . So my risk
20:18
surface before , when I was
20:20
walking , was just social
20:22
engineering . My watch was not connected
20:24
, I didn't have a cell phone , or my cell phone
20:26
was so dumb that you couldn't even hack into
20:29
it . Now I'm
20:31
like a walking attack
20:33
surface and everywhere
20:36
I go it keeps growing . So
20:38
I need to train
20:40
people to understand hey , do I
20:42
really need that connected
20:44
wallet ? Do
20:47
I really need this ? Do I really need that ? Do
20:49
the benefits outweigh the
20:52
risks ? So if I have a connected wallet and I
20:54
lose it , I can connect to
20:56
it . That's great , I can see where it is . But
20:59
if , for
21:01
some reason , there are
21:04
only default settings on it , I might be
21:06
able to connect to your , to your wallet , and
21:08
then , once you're home , I use the wallet to
21:11
piggyback onto your computers
21:13
, and then to piggyback onto your
21:15
VPNs that go from your computer
21:17
to your workplace . You
21:19
can see where I'm going and it's not that far fetched
21:21
, to be honest . I mean , I'm not that technical and
21:24
I think I could do
21:26
a demonstration reasonably easily . Not that I
21:28
would do that , by the way .
21:31
Yeah , it's actually a lot easier
21:33
than what people would assume
21:35
, in my opinion . You know , like , I'm
21:38
not a hacker by any means , and I
21:40
could absolutely pull something off like
21:42
that , especially in
21:44
2023 , where these exploits
21:47
and packages are kind of already pre-made
21:49
and you just kind of find the right one and get
21:52
in .
21:53
You raised a good point . You know , 2023
21:58
has seen a huge increase in
22:00
hacking as a service where
22:02
you go to the deep web and it's not even digging
22:05
too deep and you can buy
22:07
a kit where you create your
22:09
own ransomware or your own DDoS
22:12
and you literally
22:15
configure it the same way as you configure
22:17
your iPhone . And
22:19
you know , for some of them , they actually have a customer
22:21
service line where they provide better customer
22:23
service than
22:26
normal companies , and
22:28
so I think that you
22:30
know the level of skills that an
22:32
attacker needs to have keeps going down
22:34
whilst the attack surface keeps
22:36
going up , and so you can see
22:39
that's creating a huge vacuum
22:41
, and as an industry , we need to
22:43
work together , and I think that
22:45
I applaud all of
22:47
the work that's been done in
22:49
2023 around teaching
22:52
kids how to code , teaching
22:55
them cybersecurity or the
22:57
sense of security from primary school
22:59
up to , you know , up to college
23:01
, because if we don't catch them
23:03
now , they're gonna be
23:06
our next security people or our next
23:08
head of IT or a head of database
23:10
in like five years or 10 years , and
23:12
they're just gonna be walking targets with
23:15
my name on it , you know , at the back
23:17
, my company name , and so I don't want that
23:19
to happen . So not only do I have a
23:21
duty , but there's definitely something
23:23
in it for me to do that
23:26
, which actually leads
23:28
me on to another point . I'm
23:31
in the process of writing my
23:34
second book around the life of CSOs
23:36
, but not , generally
23:38
speaking , around the
23:40
certifications that they have , but
23:43
I asked them all the same 15
23:45
questions in the same order , about work-life
23:47
balance and the threats that they see out there
23:49
. And one very interesting question
23:51
that I asked them is
23:54
do you think we are creating
23:56
the right succession plan for when you
23:58
get out of the industry ? Either you're
24:00
gonna go do something else or , you know
24:02
some of you have been in IT or
24:04
in cyber for 20 years . Maybe you're gonna want
24:06
to retire . How do we extract
24:09
the level of experience that you have
24:11
so that we can document it and pass it
24:13
on to new people , and are
24:15
we actually creating people with the
24:18
right skills ? Is
24:21
the curriculum out there too
24:23
outdated for the new threats ? And
24:26
I see a divide
24:28
. So I set out to do a hundred interviews
24:31
. I'm about three quarters into
24:34
it right now , but I see a divide
24:36
. Some people say , no , actually we are doing
24:38
the right thing . Others are saying I
24:40
don't think we are and I mostly don't think
24:42
that we
24:45
have the ability to pass on our knowledge
24:48
, which I think is an interesting point , because if you think
24:50
about it , you
24:52
know people that became network
24:54
security managers in
24:56
and around the early 2000s
24:59
would have had maybe five
25:01
to 10 years experience in IT already
25:03
. So these people are all coming
25:05
up to retirement in the next five to
25:07
10 years . So we're gonna
25:10
have that cliff of skills going
25:12
down . I'm not saying that new people don't have skills
25:14
. Some of them absolutely do , and
25:16
in fact they're probably faster at
25:19
some things than we are . You
25:21
know we take time to process , but
25:24
we understand the value of process , whereas
25:26
younger generations , they want
25:28
everything faster because they never grew
25:30
up with the idea of , you
25:32
know , waiting for a file to download . That's
25:35
unknown to them . So why
25:37
would you wait five days to do the
25:39
right thing , to find out where the breach came
25:41
from ? You have
25:43
a hunch , you go after it , and
25:46
in going after it quickly you
25:48
actually destroy all the legal evidence
25:50
that an older person would
25:52
have found within 10 days but would have been able
25:55
to use . I think we have a bit of a
25:57
challenge there in the next
25:59
five to 10 years .
26:03
Yeah , everyone always talks
26:05
about the talent shortage
26:07
or the talent gap , and
26:09
not a lot of people are bringing up
26:11
the fact that a lot of the people that
26:14
are in leadership roles or have
26:16
been , you know , very experienced
26:18
in their job for the past
26:21
you know 10 , 15 years they're all retiring
26:23
fairly soon . You know , I actually
26:26
got brought on at a company to
26:28
replace someone
26:30
as their security expert that had
26:33
been at the company for 25 plus years
26:35
. They were retiring in the next
26:37
, you know , six or nine months , something
26:39
like that , and you
26:42
know that knowledge
26:44
dump , right , that we had to go through . I mean
26:46
, it's every day for you know nine
26:48
months . Why did you make this choice ? What was
26:50
this situation ? Who did you work with on
26:53
this ? Who do you trust within the organization
26:55
? All of those sorts of things , you
26:58
know . And now
27:00
this company that I
27:02
came and worked for , they had a very
27:05
forward-thinking view . You know they
27:07
were very good at thinking ahead
27:09
, planning ahead , and so things
27:11
like that were always on their roadmap of
27:13
who's retiring when . What skill
27:16
sets do we have to pick up ? What skill sets
27:18
do you know we need to augment and
27:20
replace and things like that . But
27:22
not every organization is
27:24
thinking like that . That's
27:27
a huge challenge that's gonna be coming up very
27:29
shortly . It's almost like a different
27:31
problem
27:33
from the talent shortage that we already have .
27:37
Yeah , and I think you know there
27:40
are a lot of talented people that are
27:42
coming on the market . That's not exactly
27:45
the problem . The problem is
27:47
did we give them
27:49
, as an industry , the right pointers
27:51
so that they can either learn
27:54
what we need right now or have a
27:57
basis that's good enough that they
27:59
can be molded into what we need
28:01
? Because obviously there's
28:03
no point in creating an expert in
28:06
forensics if we have enough people
28:08
in forensics . But equally , if
28:10
you know forensics really well , you'll
28:13
be able to add value in incident response
28:15
, in purple team and so on , so
28:17
you'll be able to reshape your knowledge
28:19
. But I think the worry
28:21
is more about are
28:24
we creating people that have too narrow
28:26
of a scope and that scope
28:28
is valid today but may not be valid
28:30
tomorrow , and will they manage to retrain
28:33
? I'll give you an example
28:35
very
28:37
topical . Let's
28:40
go back to 2018 for
28:42
just one second . 2018
28:45
, GDPR was enacted and
28:48
so , overnight , millions
28:51
of people were GDPR experts . They
28:53
just added that to their LinkedIn or
28:56
their resumes or whatever . Today
28:58
, everybody is an AI expert and
29:00
, more worryingly , a lot of people
29:02
are AI security experts . So
29:06
that's great . At least there's
29:08
an interest . But the challenge is
29:10
not really just in AI security
29:12
, as in securing the code
29:15
and securing the LLMs and so on
29:17
, because there's emerging technology
29:19
on that . It's about
29:21
AI governance , and
29:23
there are very few real
29:25
AI governance courses
29:28
out there that allow
29:30
you to grasp the real risks
29:32
and the way to embrace
29:34
AI in a way that
29:36
allows you to govern the process and to deal
29:39
with issues , and so what I
29:41
wouldn't want to see is tens
29:44
of thousands of AI cyber
29:46
experts being
29:49
born over the next 12 months
29:51
and they're actually not trained
29:53
the right way and they actually add no value , but
29:55
they think that they're going to be able to get
29:57
jobs , because they're probably not going
30:00
to be able to get the jobs they want and
30:02
they may actually not add value or
30:04
not add as much value . So I think it's
30:06
really important for us , as the industry
30:08
, to work with third-level
30:11
universities , to go and do guest lectures
30:13
. I do guest lectures for various universities
30:16
. It allows me to keep my finger on the pulse
30:18
, to understand how younger
30:20
people think , what they want to learn , the
30:22
questions that they ask and so on , as
30:24
opposed to saying oh , the next big
30:26
thing is AI
30:30
risk management . Maybe it is
30:32
, maybe it isn't . I mean , the thing with AI
30:34
is we don't exactly know as an industry
30:36
, and anybody that tells you they know , take
30:40
it with a pinch of salt , because it's such
30:42
a fast-moving target that we
30:44
don't exactly know just yet .
30:48
Yeah , that's a really good point that you bring
30:50
up . It's a lot easier
30:52
to add these key terms
30:55
to your LinkedIn or to your resume than it
30:57
is to actually build
30:59
the skills that are
31:01
needed to actually fulfill
31:03
that AI security title . I
31:08
feel like the only people
31:10
that they're harming are
31:12
themselves . Because they get a
31:14
job . Maybe they fool someone
31:16
at the job because they know a little
31:19
bit more than what the
31:21
person interviewing them does , so they get the
31:23
job , and once they're in it ,
31:25
they fail at it . It's just one failure
31:27
after the next and they're constantly trying to play catch
31:30
up , especially in an advanced
31:32
area like AI security . That
31:35
really isn't even defined
31:37
right now . One , we
31:39
don't know where AI is going . Two
31:42
, AI security is something that
31:44
we're just starting to talk about now .
31:48
Right and I think it's great
31:50
to have an interest in AI . It's great
31:52
to understand ChatGPT
31:54
, but AI is not
31:56
ChatGPT . It's way bigger than
31:58
that . I
32:03
think that right now there's good
32:05
expertise in the market around the
32:07
data that you can feed AI and
32:10
the risks that you take and how to mitigate
32:12
those risks and how to classify
32:15
the data and maybe have a filter and
32:17
train people and so on , but in terms
32:19
of the
32:22
full architecture of AI , the coding
32:24
that goes in the AI , the coding that goes into
32:26
your standard code , and how to keep
32:29
track of that and
32:32
actually manage that process , it's
32:35
still early days . Now , that said , in
32:37
the last two years , there's
32:40
been about 35 new AI-related
32:43
regulations and standards that
32:45
came out . There's
32:49
been stuff like , for instance , the EU
32:51
AI Act . There's been other
32:54
things coming out from the industry and
32:56
it reminds me of the beginning of the cybersecurity
32:59
industry where , believe
33:01
it or not , back in 2005
33:05
, there were a lot of industry
33:07
standards that came out . Some of them were
33:09
driven by vendors
33:11
, some of them were driven by associations
33:13
and so on , and we're seeing that right now . But
33:16
you have to remember that if you dial back today
33:19
, according to the UCF , the
33:21
Unified Compliance Framework , there's about four
33:24
and a half thousand regulations around
33:26
privacy , data and security , but
33:29
the reality is they all dial back to
33:31
about 20 , and then when you look
33:33
at those 20 , they really dial
33:35
back to ISO , NIST
33:37
, CIS , GDPR
33:39
, potentially PCI as a
33:41
restricted one , and a few on the
33:43
software security side . So it's
33:46
very likely that we will have the same with
33:48
regards to AI . So
33:51
I would keep a watch on that if I was
33:53
interested in working in risk
33:55
management for AI .
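The "dial back" idea described here — thousands of regulations collapsing onto a handful of core frameworks — can be sketched in a few lines of code . This is a purely illustrative toy : the mapping data below is hypothetical and not taken from the UCF , but it shows how tracking the core frameworks lets you check which derived regulations you already largely cover .

```python
# Toy sketch of mapping derived regulations back to core frameworks.
# The mapping below is hypothetical and for illustration only.
CORE = {"ISO 27001", "NIST CSF", "CIS Controls", "GDPR", "PCI DSS"}

# Hypothetical "dial back" mapping: regulation -> core frameworks it draws on
DERIVED = {
    "HIPAA Security Rule": {"NIST CSF", "ISO 27001"},
    "UK GDPR": {"GDPR"},
    "SOC 2": {"ISO 27001", "CIS Controls"},
}

def covered(implemented):
    """Return the derived regulations whose core frameworks are all implemented."""
    return [name for name, cores in DERIVED.items() if cores <= set(implemented)]

print(covered(["GDPR", "NIST CSF", "ISO 27001"]))
```

Under this toy mapping , implementing GDPR , NIST CSF and ISO 27001 would already cover the hypothetical HIPAA and UK GDPR entries , while SOC 2 would still be missing its CIS Controls dependency .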
33:58
So where do you think ? What are
34:00
some key areas that you think
34:03
are going to be really
34:05
booming , that people need to pay attention
34:07
to in 2024 ?
34:10
I definitely think we're going to see some attacks
34:13
on personal infrastructure
34:15
. So there are already
34:18
vendors coming out with ways
34:20
to help you secure your infrastructure
34:23
at home all of your stuff that's connected
34:25
. I think
34:27
we're going to continue to see ransomware
34:30
. I have absolutely no doubt
34:32
there's going to be a few new
34:34
zero day attacks every year . That's
34:37
what happens . We
34:40
are seeing , as
34:42
always , attacks on government
34:45
, but mostly financial institutions
34:48
. It's also interesting
34:50
to see what's happening in the UK with regards
34:52
to PSD3 , and everything
34:54
that has to do with authentication , strong authentication
34:56
and identification
34:59
. So I would suspect there's
35:01
going to be continued investment in that . I
35:05
think we're also going to see ridiculous
35:08
things being connected . I
35:12
heard that example the other day of
35:14
a vacuum cleaner , completely connected , that actually
35:17
goes and
35:19
vacuums on a regular basis
35:21
but actually maps out your property . So
35:24
now you know that Matthew has a two bedroom or three bedroom
35:27
apartment on one floor or two floors , can
35:30
you imagine where this is going ? I
35:34
do believe that a number of attacks are going to be
35:36
automated , but
35:38
I also do believe that a number of
35:40
counterattacks are going to be automated using
35:43
AI . That's the good side of AI . That's
35:47
good because , whilst a system
35:49
is able to deal with the noise , the
35:52
actual analysts can deal with the real
35:54
attacks or the attacks that require
35:56
more thinking . We
36:01
are going to see more regulation , of course . We're
36:04
going to see some new AI regulation . We're
36:06
going to see some updates to EU
36:09
GDPR . There's a chance
36:11
that the UK is going to lose their adequacy
36:14
because the UK
36:16
GDPR currently is recognized
36:20
as being equivalent to European
36:23
GDPR , but the ICO , the Information
36:25
Commissioner's Office , has already taken
36:27
steps to go a different direction than the
36:29
European Data Protection Board . So if
36:32
they lose their adequacy , that will mean that
36:34
from an EU perspective , transferring
36:36
data to the UK will be the same as
36:38
transferring it to the US or Mexico or
36:40
Australia , and you
36:42
can see that , the evolution of that . So
36:45
are we going to see a digital Pearl
36:47
Harbor , like we
36:50
all think might happen at some stage ? I
36:52
don't know that 2024 is the right year
36:54
for that , but I do believe that
36:56
the geopolitical fragmentation
36:58
is not going to go away and
37:01
we're just going to have to learn how to deal
37:04
with it . So I
37:06
wouldn't be surprised if people offering
37:08
red teaming and purple
37:10
teaming and tabletop exercises
37:13
will make
37:15
a fortune in 2024 . And
37:18
it probably wouldn't be a bad thing for the industry
37:20
.
37:24
So what are some areas
37:26
that our current AI policy
37:28
and governance is lacking in ? Because
37:31
I feel like this field
37:34
is advancing pretty rapidly and
37:37
, per usual , the governance
37:39
of it and the policy behind it is
37:42
lagging behind . So what are some areas
37:45
that we need to pick up and pace it ?
37:49
Well , there are a number of
37:51
best practices and checklists that are
37:53
available . So the IAPP
37:55
the International Association
37:58
for Privacy Professionals came
38:00
out this year with a very
38:03
good document that has , I think , about
38:06
65 keywords and
38:08
key topics that you need to look at in your
38:10
AI initiatives from
38:13
a technical and a policy and a training perspective
38:15
. So you're going to see
38:17
more of that . There are , as
38:19
I said , a number of vendors coming out
38:21
with interesting technology
38:23
about how
38:26
to make sure that whatever you do
38:28
using AI doesn't actually impact
38:30
on the generic code of your
38:32
software . So I think we're going to see some
38:34
more of that . I
38:36
think
38:38
what we need is like an OWASP Top
38:41
10 and a SANS Top 25 for
38:43
AI , and
38:46
it's coming . I think there are a few out
38:48
there that are just industry driven , but
38:50
there's going to be some more . I
38:55
would urge people to try and grasp
38:57
the idea of AI
39:00
governance . There are
39:02
some very good AI governance forums
39:04
coming out right now . I
39:06
spend a lot of time attending those
39:09
events and I'm fascinated at
39:11
the convergence of the
39:17
two , cybersecurity and AI , trying
39:20
to meet somewhere in the middle . It's
39:22
an interesting thing to watch , because
39:24
cybersecurity is very much , at
39:27
this stage , there's a risk
39:29
, or there isn't a risk , we can mitigate the risk
39:31
, or we can't , because we
39:33
understand it reasonably well but we don't
39:35
really understand AI . I think another
39:38
thing to keep in mind is if
39:40
you're familiar with the Cloud Security Alliance
39:42
, the CSA , they
39:46
are basically saying that protecting
39:49
your AI
39:51
systems and infrastructure will
39:53
follow a similar trajectory
39:55
to what we've learned about the Cloud
39:57
. Initially , everybody was saying , well , I'm not
39:59
moving to the Cloud , too dangerous , I
40:02
don't know what's there . Then , eventually , you
40:04
see , you have no choice but to move some
40:06
critical elements of what you do to the Cloud . But
40:08
now there's good practice
40:11
, there's ways to protect it
40:13
, there's continuous compliance . I
40:15
think that the CSA
40:17
says that it's going to follow
40:19
a similar path , and
40:22
they may well be right on that . We're
40:24
not going to be able to not embrace AI , but
40:27
we need the right structure that
40:29
organizations can use
40:31
. If you think about it , very
40:34
small organizations or mid-sized
40:36
organizations are well able to embrace
40:38
the Cloud now because there's
40:41
so much expertise out there . That's where
40:43
we need to get to with AI , or
40:45
at least with mainstream
40:48
AI . I'm not talking about Terminator
40:50
and that type of stuff . I'm talking about what
40:52
we're trying to do right now , which is to
40:54
use AI to automate the
40:56
mundane and other
41:00
tasks , so that our
41:02
time would be better used to do something else
41:04
.
41:07
Yeah , that makes a lot of sense . You
41:10
brought up the prospect
41:13
of potentially a cyber
41:15
Pearl Harbor or something
41:18
like that . From
41:24
my perspective , when I think about
41:26
that , I'm
41:29
thinking of an attack that is
41:31
very large in scale , that
41:34
changes the world forever
41:36
, right in a very tangible
41:39
way . Is that how you see it
41:41
? What do you think it would take for
41:43
something like that to happen , like a power grid
41:45
going down for a month , or
41:47
what does that look like to you ?
41:50
Pretty dark actually , excuse
41:52
the pun , but
41:55
it could happen depending
41:57
on where you're based . I
42:02
get up in the morning and I'm happy to
42:04
be alive , and I'm happy that I have electricity
42:06
and I have water and so on , and I
42:08
don't want this to change . And
42:14
so I believe that also
42:16
, we've gone from just
42:20
pure critical infrastructure protection
42:22
to critical infrastructure resilience
42:24
. You look at DORA , for instance
42:27
, for the banking industry in Europe
42:29
, the Digital Operational
42:31
Resilience Act , if
42:33
I got that right . But anyway , DORA
42:36
is all about making sure your
42:38
critical systems are resilient
42:40
. So will they
42:42
go down for a day ? No problem , it'll be
42:45
a pain , but it's okay . For a week , it'll be a major
42:47
pain , but it'll be okay for a month . That
42:50
will have societal effects
42:52
, that will have issues with
42:54
potentially , after a while , riots
42:57
and social unrest and so on , and
42:59
so we can't really afford to do that
43:01
. So it's
43:03
interesting , that idea . You
43:07
know , we all understand now that we need to
43:09
protect the critical infrastructure of
43:12
the cities , of
43:14
nations . Now we need to understand
43:16
that we need to protect our own critical infrastructure
43:19
because it's a backdoor to
43:21
the rest . But we need to talk
43:23
about resilience , right , how
43:26
do I make my way of living resilient
43:28
? How do I make my
43:30
way of doing business resilient ? If
43:33
my e-commerce site goes down ? Am
43:35
I out of business ? So , am I 50%
43:38
out of business , and for how long ? And how long
43:40
can I sustain that ? And , by the way , that
43:42
is why some organizations decide to
43:44
pay ransoms . And
43:47
you should never pay a ransom by default
43:49
, because if you pay
43:51
it , you may not get the right
43:53
information back or the key back . It
43:55
may not work . But also , you advertise yourself
43:58
as somebody who's going to pay , so you're going to remain
44:00
a target right . But the reality is
44:02
some companies are like ah , do you know
44:04
what ? In the grand scheme of things , we're better off paying
44:06
, and
44:09
so with critical assets
44:11
, you can't always think
44:13
like that , you know . So
44:15
I think we need to move
44:17
towards resilience . We've
44:19
spent enough years developing good
44:21
risk assessment methodologies
44:23
and looking at all of that . Now we need to
44:26
get to the next level . How do I make this a continuous
44:28
proactive thing and I make my
44:30
ecosystem resilient and my
44:32
staff resilient and myself resilient
44:35
?
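The "a day is a pain , a week is a major pain , a month has societal effects" reasoning above is essentially a downtime-tolerance calculation . Here is a toy sketch of that idea ; every figure in it is hypothetical and purely illustrative , not a real costing model .

```python
# Toy downtime-tolerance sketch: all figures are hypothetical.
# Loss grows with outage length, and recovery cost jumps at each tier,
# mirroring "a day is a pain, a week is a major pain, a month is critical".
DAILY_REVENUE = 100_000                                  # hypothetical lost revenue/day
RECOVERY_COST = {1: 10_000, 7: 100_000, 30: 1_000_000}   # extra cost by outage tier

def outage_loss(days):
    """Lost revenue plus the recovery cost of the highest tier reached."""
    tier = max(t for t in RECOVERY_COST if t <= days)
    return days * DAILY_REVENUE + RECOVERY_COST[tier]

for days in (1, 7, 30):
    print(f"{days:>2} days down -> estimated loss {outage_loss(days):,}")
```

The point of a sketch like this is the comparison , not the numbers : once the estimated loss for a realistic outage length exceeds what the business can absorb , investing in resilience ( or , as some organizations conclude , paying a ransom ) starts to look cheap by comparison .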
44:36
Hmm , yeah , you
44:38
bring up a really interesting point and that
44:40
is something that I myself
44:43
even see as being often overlooked
44:45
is the resilience
44:47
factor of deploying
44:51
this revenue-generating application
44:53
that is generating
44:55
I don't know a million dollars a day . Well
44:58
, what happens if that web
45:00
app goes down ? You know , do we have HA
45:02
set up ? Is it failing over to the
45:04
same location ? Because if it's
45:06
failing over to the same location , it's probably
45:08
not a good idea . All
45:11
of these things are often overlooked
45:13
or put on the back burner , and so that we'll
45:15
get to it , you know , eventually . Well
45:18
, in the meantime , when eventually is
45:21
coming , you know you can have an attack
45:23
that takes it down completely and it's like oh
45:25
, that thing that we said we were going to get
45:28
to eventually never came
45:30
because it was already at risk
45:32
. You know , well , you
45:34
know , Matthew , we're coming to the end of our time
45:36
here , unfortunately , but you know
45:38
, before I let you go , how
45:40
about you tell my audience , you know , where they could find you
45:42
, where they could find Vigitrust if
45:45
they want to learn more .
45:47
Yeah , sure . So , first of all , thanks again for the opportunity
45:49
to talk to you today . So
45:52
you can find information about
45:54
VigiTrust at VigiTrust.com , V-I-G-I-T-R-U-S-T dot com
45:59
, you can find information about
46:01
myself at MatthewGorge.com
46:03
, in one word . I've
46:05
also published a book called the Cyber
46:07
Elephant in the Boardroom , published
46:09
by Forbes and a best seller
46:12
on Amazon , and you'll find it
46:14
on Amazon and it's all about translating
46:17
cyber risk into business risk , primarily
46:20
for non-technical people . And
46:23
, of course , I'm very easy to find on LinkedIn
46:25
and I actually love networking
46:27
. I love meeting people from the industry
46:29
. There is not a day that
46:31
I don't learn something new about cyber
46:34
, and I've been at it for 25 years and
46:36
it's a great industry that way .
46:39
Awesome . Well , thanks , Matthew . I really appreciate
46:41
you coming on and I hope everyone listening
46:44
enjoyed this episode . See you everyone
46:46
.