Episode Transcript
0:05
You don't have to be a parent to worry about
0:07
what kids get up to online, but
0:09
believe me, it certainly helps. The idea
0:11
of my son perusing the darker corners of
0:13
the internet without me sitting directly behind him
0:16
tends to send my heart racing. And
0:19
little wonder. Pornography, violent imagery,
0:21
incitement to self-harm, and much
0:23
more, all available with just
0:25
the swipe of a screen.
0:28
Having been given a bit of a
0:30
push by the court of public opinion,
0:32
tech companies have instituted some measures, but
0:35
the Online Safety Act passed last
0:37
October gave new powers to communications
0:39
watchdog Ofcom, which has just published
0:41
its draft children's safety code. Robust
0:44
age checks, improved complaints procedures,
0:46
a commitment from platforms to
0:49
take action to tame the
0:51
algorithms recommending harmful content to
0:53
children. It all sounds very positive,
0:56
even if it does make you wonder why we weren't doing
0:58
it before, but some parents remain to
1:00
be convinced. This law has been put
1:02
in place, but there's
1:04
nothing really changed as such,
1:07
which is frustrating for us as parents. It's
1:09
almost like an insult to us as well,
1:12
because we've put in so much work. I'm
1:14
Niall Paterson, and on this edition of The
1:16
Sky News Daily, we'll be asking if those
1:18
concerns are valid, or if the
1:20
internet is about to be made that much
1:22
safer. Later,
1:25
we'll be speaking to one online
1:28
safety expert about the tech firms
1:30
behind the almost ubiquitous social media
1:32
platforms on which dangerous content can
1:34
be viewed. But let's start
1:36
with a few explanations. Cristina Criddle
1:38
is technology correspondent for the Financial Times.
1:40
Cristina, great to have you with us. Just
1:42
to explain exactly what we have heard
1:44
from Ofcom today, and how these
1:47
draft guidelines play into
1:49
the Online Safety Act. Ofcom has published its draft children's safety code, which
2:00
basically says: these are some things we think
2:02
that you should do, and if you don't
2:04
do them, you might not be meeting your
2:06
legal requirements under the Online Safety Act.
2:09
And these are things like age
2:11
verification but also making sure that
2:13
they're checking their algorithms aren't serving
2:16
any harmful material, especially to children.
2:18
The first time this was proposed
2:20
was in Theresa May's government. Ofcom
2:23
has been publishing guidance and different
2:26
updates over the past few months
2:28
but again this set of codes
2:30
of conduct has to be consulted
2:32
on. There's then a process before
2:34
they become public and then before
2:36
companies have to comply with them.
2:38
So I don't think we're going
2:40
to see Ofcom fining anyone for
2:42
quite some time but it's really important that they
2:44
do this because they have to be legally sound
2:47
because if they do want to take a big
2:49
tech firm to court they need to make sure
2:51
that these rules are very robust and
2:53
can stand up. So what are
2:55
the problem areas that were identified by
2:57
campaigners, by the government and codified in
3:00
the Online Safety Act? What are the
3:02
specific things that these social media platforms,
3:04
these tech companies will have to do
3:06
at some point in the near future?
3:09
The core theme is children who
3:11
are under age should not be
3:13
on these platforms and they definitely
3:15
shouldn't be seeing very harmful content
3:17
and then there's a second layer which
3:20
is these companies often have algorithms that
3:22
drive the content that you come across
3:24
and that these algorithms should not be
3:27
promoting harmful material which can often create
3:29
more engagement, it can make you spend more
3:31
time on the app, it might make you
3:33
share things or like them but it's not
3:35
necessarily good for children on the platform or
3:37
even adults on the platform. But specifically
3:40
on the children we already have
3:42
tiers of content that's very harmful,
3:44
things like pornography but then there's
3:47
that lower layer as well so
3:49
bullying content or those viral challenges
3:51
which can sometimes be quite dangerous. Just explain what an algorithm actually is, because I'm not sure everyone understands
4:00
what it means. Algorithms are
4:02
technology, usually AI, which sits behind
4:05
what you're seeing. So I'm going
4:07
to use Instagram as an example
4:09
but TikTok has this too, where
4:11
it basically decides what you're going to see
4:13
on the platform. And as you use
4:15
the platform more it will get a better
4:17
understanding of the sort of thing you're interested
4:19
in. So I love cats. My platforms
4:22
are always full of cats. If I click like
4:24
on a post featuring a cat then that algorithm
4:26
is going to know, okay she likes cats, I'm
4:28
going to give her some more. But
4:30
then it might start to test out different types of content.
4:32
So it might think, oh she really likes cats, why
4:35
don't we give her a dog and see what
4:37
happens. And I might like it, then it might
4:39
give me more dogs, or if I don't like
4:41
it then it'll discard that. And so these algorithms
4:44
learn from our behaviors and then can
4:46
start to drive what we're
4:48
seeing to encourage certain behaviors as well.
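To make that feedback loop concrete, here is a minimal illustrative sketch in Python. It is a toy model of the reinforce-and-explore pattern Criddle describes, not any platform's actual recommender; the topic names and scoring rule are invented for the example.

```python
import random

def recommend(scores, catalogue, explore_rate=0.1):
    """Usually serve the user's highest-engagement topic; occasionally
    test a random topic to probe for new interests (e.g. offer a dog
    post to a cat lover)."""
    if random.random() < explore_rate:
        return random.choice(catalogue)
    return max(catalogue, key=lambda t: scores.get(t, 0))

def record_engagement(scores, topic, liked):
    """A like reinforces the topic; being ignored decays it."""
    scores[topic] = scores.get(topic, 0) + (1 if liked else -1)

scores = {"cats": 5}                    # the user already engages with cats
catalogue = ["cats", "dogs", "fitness"]
for _ in range(20):
    topic = recommend(scores, catalogue)
    record_engagement(scores, topic, liked=(topic == "cats"))

print(scores)  # "cats" keeps climbing: engagement begets more of the same
```

The loop serves ever more of whatever earns engagement, which is exactly why a harmful topic, once engaged with, tends to get amplified.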
4:50
And presumably this applies in exactly the same way
4:52
when it comes to the more harmful types
4:54
of content. You know I've seen so many
4:56
stories over the past couple of years about
4:59
young people seeing self-harm content on
5:02
platforms like Instagram and elsewhere. And
5:04
I've never come across it myself.
5:06
Just explain how we get to
5:08
a position where anyone
5:11
is receiving content of that type
5:13
on their devices. With all due
5:15
respect this tends to happen with teenagers and
5:17
teenage girls in particular who are already potentially
5:19
feeling quite low or depressed or having issues
5:22
with their body image. And so they might
5:24
be engaging with content on there that is
5:26
about body image. It might be quite innocent
5:28
and harmless to begin with. It might
5:31
be somebody in a bikini or
5:33
it might be tips on getting healthy.
5:35
But then as the algorithm realizes you're
5:37
more interested in that it might try
5:39
and show you a more extreme version
5:41
of that. Now platforms have gone quite
5:43
far in trying to make sure that
5:45
they're not showing very harmful material on
5:47
their platforms but stuff is still getting
5:49
through. It's the joining of
5:51
the dots then that happens through the
5:54
algorithm. So do we have any
5:56
confidence though given how central these algorithms
5:58
are to these platforms, that
6:01
Ofcom, who've been given these powers of enforcement,
6:03
are going to be able to poke their
6:05
heads under the bonnet? At the moment
6:08
Ofcom are really relying on these companies to
6:10
provide them with information. So once these codes
6:12
of conduct come into force, companies
6:14
have to conduct risk assessments on their
6:16
algorithms. So that's giving quite a lot
6:18
of power to these companies to basically
6:20
mark their own homework. But what Ofcom
6:22
can now do is request more and
6:24
more information, they can really dig into
6:26
it, and if they suspect something
6:29
to be a little awry, they can open an investigation.
6:32
Given that these platforms of their own
6:34
volition have not taken steps to guard
6:36
young people against this type of content,
6:39
or not satisfactory steps, according
6:41
to the government and Ofcom, why on
6:43
earth would we ever believe that they
6:45
would be entirely honest when pushed by
6:48
Ofcom for this type of information? It
6:50
strikes me that given that they've landed themselves
6:52
in this position in the first place, we
6:54
probably shouldn't be trusting them as much as
6:56
we are. They are absolutely taking
6:59
steps, they already have. And this is
7:01
before the legislation has come into force.
7:03
But I think when this legislation does
7:05
come into force, there are quite hefty
7:07
penalties in terms of financial fines for
7:10
these companies, if they are not complying,
7:12
and they even have to name a
7:14
person at the organisation who is personally
7:17
responsible for anything that goes wrong there. And
7:19
they can even have criminal liability. So they
7:21
might even go to jail if
7:23
the company is found to be flouting the rules
7:26
on child safety. I get the point that the
7:28
tech companies have gone some distance towards
7:30
putting their own house in order, and that
7:32
is what it is. But the parents aren't
7:34
happy about it. And it strikes me that
7:36
the tech companies have only done this because
7:39
people have pointed out to them
7:41
that children are seeing harmful material on
7:43
their platforms. So often with technologies,
7:46
you don't know what the impacts are going
7:48
to be when it's out in the real
7:50
world. Certainly we know now, and these tech
7:52
companies should be doing something about it. But
7:54
I also think that we as parents, we
7:56
as users of this technology, need to have
7:58
some agency as well, and
8:00
need to work out, you know, we
8:02
can control these algorithms. There are things
8:04
that we can do to make sure
8:07
we're not seeing this kind of material.
8:09
And if you're a parent, you really
8:11
need to be involved in how your
8:13
child uses these platforms. These companies are
8:15
trying quite hard to comply. And I
8:17
also think beyond the sort of financial
8:19
consequences for not complying, there is a
8:21
real reputation management issue. And it was
8:24
once thought that Facebook, now Meta, would
8:26
own the whole social media network space.
8:28
And that's not the case. We've seen
8:30
TikTok come up recently. And it's been
8:32
hugely popular, so there is room for
8:34
competitors. And so if you're seen as
8:36
a platform that isn't looking after
8:38
children, then you might quite
8:40
quickly lose your users. Isn't
8:43
the genie out of the bottle on all of
8:45
this, that the social media platforms are
8:47
all-pervasive? Everyone has them on their phone, particularly
8:49
if you are a young person and they're
8:51
engaging in this manner. There's definitely
8:53
an argument for that. And I think we're
8:56
seeing that play out now with AI and
8:58
all of the hype around AI, these products
9:00
are often created without fully knowing the consequences.
9:02
And they are just released to consumers in
9:05
the wild. Perhaps there should
9:07
be some intervention earlier. I
9:09
don't know when that's going to happen. I don't know when
9:11
the regulation is going to come in full.
9:13
But it's certainly something to think about.
9:16
Certainly is. Cristina Criddle, great to talk to you. Thanks for
9:18
your time. Stay
9:20
where you are. And please don't swipe left.
9:22
When we return, we'll look a bit more
9:24
closely at the tech companies themselves and how
9:26
likely they are to fall into line. Welcome
9:32
back. The past few years have
9:35
seen many young people lose their lives
9:37
or choose to end them because of
9:39
material found online. And whilst
9:41
parents would wish to be able
9:43
to supervise their child's every action
9:45
online, that clearly isn't possible. So
9:47
there is considerable merit in getting
9:49
the tech firms themselves to ensure
9:51
that kids don't have unfettered access.
9:55
John Carr is an online safety expert.
9:57
He is secretary of a coalition of UK children's
9:59
charities dealing with internet safety and
10:01
has advised the United Nations, the
10:03
European Union and the British government.
10:06
We're the first country in the world
10:08
to attempt anything this ambitious or this
10:10
comprehensive. The whole of the
10:13
world's eyes are now on us. We've got
10:15
to make sure it works. It's been far
10:17
too long coming, but we've got it now.
10:19
I mean, I'm glad it's here. It's a
10:21
good piece of law and now we're going
10:24
to make those tech companies behave better. John,
10:26
you're making it sound as if these big
10:28
tech firms don't have children's safety as their
10:30
overwhelming priority. They absolutely do not.
10:33
If they did, we wouldn't be where
10:35
we are today having to legislate for
10:37
it. They only act if they see
10:39
a threat to the continued profitability or
10:42
the continued sort of smooth operation of
10:44
their business. This new piece of law
10:46
represents that threat, so they
10:48
are going to have to comply. In
10:50
Britain, because we're such a big and
10:52
important and rich market, they
10:55
will want to stay operational here and that
10:57
means complying with the law. But I also
10:59
wonder what it says about our politicians, that
11:01
it's taken them this long to realise what's
11:03
been going on. Well, on the
11:05
specific point about pornography, we had a law
11:08
that Parliament passed in 2017. The
11:11
then regulator, which was the BBFC,
11:13
brought regulations to Parliament, which
11:15
Parliament approved. And then at the very
11:17
last minute, I mean, literally the week
11:19
before those new laws were due to
11:21
go into effect, Boris Johnson pulled the
11:24
plug on them. And Parliament was very angry, I
11:26
have to say, even within the
11:28
Conservative Party, there was a great deal of anger.
11:30
But anyway, look, that's old history now. We've
11:33
got a new piece of legislation.
11:35
We've got to make it work.
11:37
The way in which tech companies
11:39
look after their products, they don't
11:41
like people, particularly governments, particularly intelligence
11:43
services, for example, poking around in
11:45
the inner workings of their platforms.
11:47
So I just wonder, when it
11:49
comes to Ofcom wanting to see
11:51
action taken on the algorithm, will
11:53
Ofcom have enough access to be able
11:56
to ensure that the correct action is
11:58
being taken? There are substantial powers in
12:00
this new legislation to levy significant
12:03
fines against companies that are found not
12:05
to be complying with the law: 18
12:08
million pounds or ten percent of
12:10
global revenues, whichever is
12:12
the greater.
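As a quick worked example of how that ceiling operates, here is a short sketch. The revenue figures are hypothetical; the greater-of rule (18 million pounds or ten percent of qualifying worldwide revenue) is the one the Act sets out.

```python
def max_osa_fine(worldwide_revenue_gbp: float) -> float:
    """Online Safety Act penalty ceiling: the greater of GBP 18 million
    or 10% of qualifying worldwide revenue."""
    return max(18_000_000.0, 0.10 * worldwide_revenue_gbp)

# For any firm with worldwide revenue above GBP 180 million,
# the 10% figure is the binding cap.
print(f"£{max_osa_fine(50_000_000_000):,.0f}")  # £5,000,000,000
print(f"£{max_osa_fine(100_000_000):,.0f}")     # £18,000,000
```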
12:14
But you know what? Fines, schmines. For these big
12:16
businesses, fines are just part
12:19
of the cost of doing business. Fines won't
12:21
do it. But what we've got uniquely
12:23
in this piece of legislation are
12:26
criminal sanctions. So what Ofcom
12:28
will have, and this is the most
12:30
powerful stick, they have a legal
12:32
power to request information from these
12:34
tech companies. And the tech companies
12:37
are under a legal obligation to
12:39
answer the questions truthfully. And
12:41
if they don't, their senior
12:43
executives will be subject to criminal
12:45
sanctions. So fines they
12:47
can pay, criminal sanctions they
12:49
cannot avoid. And so those
12:52
transparency powers will work. We
12:54
will be able to see, the public
12:56
will be able to see, parents will
12:58
be able to see, everybody will be able
13:00
to see what is the truth of what
13:02
these big tech companies are actually doing to
13:04
protect children and make the internet a safer and
13:06
better place. Yeah, I mean, to see one
13:08
of the masters of the universe, one of
13:10
the big tech firms, you know, going into
13:12
prison for something like this, I think that
13:15
would be very warmly welcomed by
13:17
a group that you've already mentioned, the
13:19
parents. You've spoken to an awful lot
13:21
of them over the years, I'm sure,
13:24
John, and with regard to this
13:26
draft code of practice, it
13:28
strikes me, and I've seen a
13:30
comment from some of them, that
13:33
not everyone thinks this is going
13:35
far enough. Yeah, they're impatient.
13:37
And I understand they're impatient. And
13:39
I'm going to say or do
13:41
nothing to try to talk them
13:44
out of feeling frustrated and
13:46
angry at the way things have
13:48
turned out. Ofcom have to go
13:50
through these processes very carefully and
13:52
deliberately to draw up and describe
13:54
the rules, or rather how
13:56
they're going to implement the laws. I'm not
13:58
yet dismissing Ofcom. Absolutely not. They
14:01
want it to succeed. It's in their
14:03
professional interest for it to succeed. And
14:05
I'm sure they'll give it the
14:07
best go they can. They have to. There
14:10
are no alternatives. And
14:12
this is the first time anywhere in
14:14
the world this approach has been tried. So
14:16
we all have a big investment in making
14:18
sure it works. And we'll be holding
14:21
Ofcom to account; the children's organizations, the parents'
14:23
organizations, they're not going to go away. The
14:25
tech companies themselves would say,
14:27
look, we have taken steps of
14:30
our own volition before any compulsion
14:33
was placed upon us. I suspect that it
14:35
has been the outspoken parents of children who
14:37
have suffered or who have lost their lives
14:39
that have compelled them to go as far
14:41
as they have. I
14:44
absolutely agree with you. And there's no doubt they have
14:46
done things. Yeah, they have done odd things, but
14:48
they've all been marking their own homework. That
14:50
ship has sailed. What advice, John,
14:53
would you give to, let's just say,
14:55
a kind of a middle-aged white Scotsman
14:57
who has a small child under the
14:59
age of 10 who is not
15:01
getting a mobile phone until he grabs
15:03
it, pries it from my cold,
15:05
dead hands. Frankly, what advice would
15:07
you give to someone in that
15:09
position who, frankly, is
15:11
genuinely really scared about what
15:13
lies ahead for my child and
15:15
his activities online? Kids need
15:18
to learn about technology. They need to
15:20
learn how to use it because it's
15:22
with us forever. Now
15:24
it's part of the way the modern world works.
15:27
But parents need to get
15:29
engaged. But you know what? The
15:31
companies, we can't put it all on
15:33
parents. These companies have changed the culture.
15:35
They spent billions and billions of pounds,
15:37
billions and billions of dollars creating a
15:40
whole new world that's really great in
15:42
lots of ways. But kids feel they
15:44
have to be there. They have to
15:46
be part of it. It's a great
15:48
way to learn. But when those products
15:50
are put into a child's hands, be
15:52
it a smartphone or a tablet or
15:54
a computer, it should be as safe
15:56
as it possibly can be at the
15:58
point of first use. With
16:00
any other consumer product, like
16:03
a toaster, a TV, a washing machine, anything
16:05
that you would buy for use in
16:07
a family home, it wouldn't be
16:10
like that, because we've had many, many years of
16:12
experience. You've got rigorous regulations and strong enforcement
16:14
to make sure that when you take something
16:16
home with you, when you take something into
16:19
a family home, it's as safe as it
16:21
possibly can be. That's the position
16:23
we need to be in with tech so that
16:25
when you do give your 10 year old child
16:27
a piece of tech, you don't have to fret
16:30
and worry about being a bad parent. What have I
16:32
done? What have I missed or something of that kind?
16:35
John, many thanks. And
16:38
I should point out that both Snapchat
16:40
and indeed Meta, which obviously owns Facebook
16:43
and WhatsApp as well, say they
16:45
have extra protections for under-18s and
16:47
can offer parental tools to control what
16:49
children can see. Snapchat added that
16:52
it supports the aims of the Online
16:54
Safety Act. Parents
16:58
have for decades been telling their kids that
17:01
they should go outside and kick a ball
17:03
around, to step away from the screen before
17:05
it rots their brain. And thanks
17:07
again for that, mother. But for those
17:09
of us of a certain vintage, it
17:12
can be difficult to conceive of how
17:14
much of our kids' lives take place
17:16
online, and, unlike in the past, there
17:18
is no stepping away from the screen.
17:20
It goes where they go, and so
17:22
too the harmful content. Parents
17:24
can only do so much; the
17:27
tech companies, though, perhaps they
17:29
can do much more. That's all
17:31
for this edition of the Sky News Daily. We'll see you again
17:33
soon.