Speech

Security Minister's PIER annual conference speech

Security Minister Tom Tugendhat spoke at the Policing Institute for the Eastern Region (PIER) annual conference, ‘Tackling online harms – A whole system approach’.

The Rt Hon Tom Tugendhat MBE VR MP

Good afternoon.

It’s a pleasure to be here with you today.

Before I begin, allow me to say a few words about the response to the Independent Inquiry into Child Sexual Abuse.

For those of you who took part, thank you. I know a lot of different experts have contributed in different ways to help our understanding, and I hope you too understand the situation that we're facing.

This independent inquiry was indeed a wake-up call. It was extraordinarily important in shedding light on the unimaginable abuse suffered by children over many, many years. It found quite simply appalling examples of organisations placing their own interests ahead of children's safety: either by turning a blind eye or by covering up the abuse.

Frankly it is deeply dispiriting to see.

I deeply admire the courage of those survivors who came forward.

We owe it to them – as well as to future generations – to ensure that it never happens again.

Later today I’ll be speaking to a group of students.

They are going to be asking the questions that students so often ask, I’m sure.

They'll be asking questions relevant to today, and about the gossip we've been hearing in the media.

They’ll be asking about the challenges we face, and yes, occasionally heckling me…

One thing I can guarantee I’ll be asked is whether I would recommend politics as a career.

It's a difficult question, and I've never really known how to answer it.

I’ve never thought of politics as a profession, at least not in the traditional sense.

For me it’s a form of service.

Now, having already met a few of you I’m very aware that I’m speaking to an audience of professionals.

Many of you are at the top of your fields.

But I also understand that for many of you protecting children online isn’t just a career.

It's more of a calling, every bit as personal as it is professional.

The reason I’m here today is that for me keeping children safe isn’t just another issue, or even just the right thing to do.

It’s personal, and every bit as important as my role’s traditional focus on terrorism and state threats.

Let me explain why.

Earlier this year I visited the US to meet my counterparts in intelligence and homeland security.

While I was there I had the opportunity to visit the National Center for Missing & Exploited Children, NCMEC for short, a heroic organisation in the vanguard of global efforts to keep children safe online.

NCMEC receives reports of suspected cases of child sexual exploitation from US-based tech companies including enticement, where children are lured into sharing explicit images and videos of themselves; sextortion, when predators target their victims using blackmail; and the online distribution of child sexual abuse material.

I’ll be straight with you.

I wasn’t prepared.

I wasn’t prepared for the depravity of some of the examples of offending they gave.

I wasn’t prepared for the scale of the threat that our children face.

And, as the father to a wonderful son and daughter of my own, I wasn’t prepared for the horror that children just like them are made to suffer every day.

The thing that struck me was how vulnerable they are.

To predators, social media sites like Facebook and Instagram are a one stop shop.

Without leaving Meta’s ecosystem they can choose their target…do their research…start a conversation with them…and transfer that conversation onto a private messaging service.

And that’s exactly what they do – in their thousands.

In 2022 NCMEC received over 32 million reports of suspected child sexual exploitation and abuse.

21 million of these came from Facebook alone, which not only speaks to the severity of the issue they face.

It also leads me to suspect that other companies are significantly under-reporting.

I want to be clear – this isn’t a US issue.

We face exactly the same problem right here in the UK.

The NCA estimates that there are up to 850,000 people in this country who pose a sexual risk to children, including both contact offending and offending online.

Of course, in reality the scale of the threat our children face is much larger.

We mustn’t forget that the computers in our children’s homes, and the mobile phones in their pockets don’t just make them accessible to people here in the UK.

They connect them to the world.

That works both ways of course.

I’m appalled by the increase in so-called live streamed abuse, where predators pay to victimise children remotely – and often in other countries – via webcam.

The UK is one of the top 3 consumers of livestreamed child sex abuse from the Philippines.

Equally, in addition to the threat they face domestically, our children are also the targets of predators and offenders overseas.

It’s clear, then, that this is a threat of immense scale and complexity, and I’m grateful for the valiant efforts of our law enforcement agencies.

Every month UK law enforcement agencies arrest 800 people and safeguard 1,200 children.

Last week, for example, Bernard Grace was sentenced to 8 years in prison for making 600 payments to direct and livestream the sexual abuse of children in the Philippines.

In March earlier this year, Christopher Manning was jailed for 25 years for using a chat platform to distribute child sexual abuse material and encouraging others to do the same.

And in 2021 the NCA caught David Wilson, one of the most prolific child sexual abuse offenders the UK has ever seen.

Wilson posed as a teenage girl on Facebook to manipulate his victims into sending sexually explicit material of themselves before using it to blackmail them into abusing their friends and siblings.

His case is the perfect illustration of why our partnerships with tech companies and organisations such as NCMEC are so important.

He was brought to justice because law enforcement were able to access the evidence contained in over 250,000 Facebook messages.

And he’s far from alone.

NCMEC sends suspected cases of child sexual abuse in this country straight to the NCA, who process them before sending the resulting intelligence to the police.

In 2021, they contributed to 20,000 criminal investigations across the UK.

For predators that’s a significant deterrent.

And for their victims, it’s a lifeline.

That lifeline is now under threat.

Despite its past record of dedicated protection, Meta is planning to roll out end-to-end encryption on Messenger and Instagram Direct later this year.

Unless they build in robust safety measures, that poses a significant risk to child safety.

Let me be clear.

Privacy matters.

The UK government is in favour of protecting online communications.

And it is possible to offer your customers the privacy they expect…while also maintaining the technical capabilities needed to keep young people safe online.

Meta are simply choosing not to; many others have already taken the same path.

The consequences of that decision are stark.

Facebook and Instagram account for over 80% of global NCMEC referrals, meaning that 20 million suspected cases of child sexual abuse a year will go unreported.

Meta will no longer be able to spot grooming – including cases like David Wilson’s – on their platforms, leaving tens of thousands of children in the UK, and around the world, beyond our help and in danger of exploitation.

Faced with an epidemic of child sexual exploitation and abuse Meta have decided to turn a blind eye, and are choosing to allow predators to operate with impunity.

This is extremely worrying.

But it also raises questions for parents like myself right across the country.

Questions about big tech, and the balance of power and responsibility enjoyed by social media companies.

My children love going to a playground near where we live.

While they’re there it’s clear who’s responsible for their safety.

Me of course, as their parent – but also the council, who have a duty to ensure the environment is safe and well-maintained, and our local police force, who have a duty to make sure nothing dangerous or illegal is taking place.

Both have clear lines of accountability to me and to our local community.

My children are currently too young to have social media profiles.

But what happens when they do go online?

Who’s responsible for their safety?

And is anyone accountable to them – or to me?

In my view it’s clear.

Companies like Meta enjoy vast power and influence over our lives.

With that power should come responsibility.

It's not acceptable for tech executives to make vast profits from their youngest users, only to pass the buck when it comes to protecting them from the dangers their own platforms create.

The first duty of government is to protect its people, and none are more precious than our children.

In that sense, the stakes couldn’t be higher.

Over the past few minutes we’ve covered a lot of frightening statistics.

But we must never forget that behind every NCMEC referral, flagged image and police investigation is a real child being hurt in the real world, for whom the consequences of their victimisation are devastating.

However it’s not just these children’s futures that are at stake.

I personally believe you can judge a society by how it looks after its most vulnerable members, and that, in a nutshell, is why this is so fundamental.

Because the importance we place on protecting our children isn’t just a policy issue.

It speaks to the health of our society.

This is a test for governments and tech companies alike.

For governments: one of resolve, and standing up for what we believe in.

For tech companies: one of priorities, and making sure they do no harm.

As with many issues we’re not facing this alone.

All around the world, governments are in a similar position.

And each of us has a choice.

To lean in or to look away.

Well, I can tell you very clearly:

This government will not look away.

Some will have heard the words I have used today as particularly critical of one company. They are right: I am speaking about Meta specifically, and about Mark Zuckerberg's choices in particular. These are his choices; these are our children. He is not alone in making these choices; other companies have done so too.

Let me be clear again: this government will not look away.

We will shortly be launching a campaign. A campaign to tell parents the truth about Meta’s choices, and what they mean for the safety of their children.

And a campaign to encourage tech firms to take responsibility and to do the right thing.

We’ll set out our case in the papers, in magazines, over the airwaves and online.

We’ll work with law enforcement agencies, children’s safety organisations, like-minded international counterparts through bodies such as the G7 and Five Eyes, and tech experts with authority on technical solutions and their feasibility.

We will not stop until we are satisfied that Meta and others are serious about finding a solution, and until they have strong safety systems in place to protect children.

I hope that, like me, you're not prepared to lose this fight, and I hope you'll join us.

Our voices are louder when we speak together.

Published 23 May 2023