News story

Protecting young people online at the heart of new VAWG strategy

Government pledges to make it impossible for children to take, share or view a nude image – partnering with tech companies.


More women and girls will be protected from deepfake abuse as new laws will ban ‘nudification’ tools, which use generative AI to turn images of real people into fake nude pictures and videos without their permission.

As part of the government’s strategy to tackle violence against women and girls (VAWG), ministers have pledged to make it impossible for children in the UK to take, share or view a nude image using their phones. The government will join forces with tech companies to make this a reality and better protect young people from grooming, extortion, bullying, harassment and sexual abuse.

Research found 276,149 sexual deepfakes on a single dedicated site in 2023, with 96% depicting women. Worldwide, 24 million people visited ‘nudification’ services in September 2023 alone, and 9 in 10 reports to the Internet Watch Foundation of child sexual abuse online sadly contain images taken by children themselves, often coerced by predators, with potentially devastating consequences.

At 13 years old, Roxy Longworth, founder of the campaign Behind Our Screens, was coerced into sending intimate images, which were then shared without her consent. She faced widespread humiliation, bullying, and a severe mental health crisis. Her story shows how quickly image-based abuse can escalate and why strong protections are urgently needed.

Preventing the harm from happening in the first place is key. To protect more young people like Roxy, the government will work with tech companies to develop solutions to image-based abuse, expanding on technology already being implemented by British safety tech company SafeToNet, and nudity detection filters already on smartphones.

Minister for Safeguarding and Violence against Women and Girls, Jess Phillips, said: 

We must stop these images being created and shared while tackling the root causes of negative influences on young men in their schools, homes and online. That’s why we will join forces with tech companies to stop predators online and prevent the next generation from being exploited by sexual extortion and abuse.

‘Nudification’ apps are not used for harmless pranks. They devastate young people’s lives, and we will ensure those who create or supply them face real consequences. Every child deserves to grow up safe, and we will do whatever it takes to make that a reality.

The creation and supply of so-called ‘nudification’ apps or tools that generate deepfake nude images of real people will also be banned under plans announced today, building on existing offences that criminalise sharing these deeply damaging images. The new legislation will allow the police to target the firms and individuals who design and supply these tools.

Worryingly, these apps allow users to digitally strip clothes from images and produce intimate videos without the consent of those depicted – with devastating and long-lasting consequences for victims. Because the results are highly realistic, this technology has led to a scourge of financially motivated sexual extortion and, in some cases, even suicide. In Spain, the town of Almendralejo was devastated after several perpetrators used these apps to create nude images of 20 children walking to school.

The government has also already legislated to criminalise the creation of non-consensual sexual deepfakes, ensuring that offenders face the appropriate punishments for this atrocious harm.

Technology Secretary Liz Kendall said:

Women and girls deserve to be safe online as well as offline. We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes.

I am introducing a new offence to ban nudification tools, so that those who profit from them or enable their use will feel the full force of the law – so that together we end this abuse of women and girls.

Our priority is protecting victims and ensuring the internet is a safer place for women and girls.

With the government, technology companies and all of society committed to this cause, we can turn the tide on the torrent of child sexual abuse material currently flooding the internet.

Stopping images from being shared in the first place will stop the business models of financially motivated sexual extortion schemes and paedophile rings in their tracks. The latest data from the Internet Watch Foundation shows that AI tools and apps like these are being used more and more each year.

Report Remove, which is run by the Internet Watch Foundation and Childline, recorded that between January and September this year, 19% of reporters stated that some or all of their imagery had been manipulated in some way using AI.

We will deploy the full power of the state in the largest crackdown on violence perpetrated against women and girls in British history. We must stop violence before it starts, by focusing on prevention and tackling the root causes of negative influences on young men in their schools, homes and online.

Founder of the Behind Our Screens campaign, Roxy Longworth, said: 

If device controls like these had existed when I was 13, my life would have been completely different. I would not have been coerced, blackmailed or abused, and I would have been spared the devastating humiliation and mental health crisis that followed.

It’s so important that technology is used to protect young people, not harm them. I’m also relieved to see nudification apps being banned and that the government is taking action to protect the next generation from new technological threats. 

Chief Executive of the IWF, Kerry Smith said:  

At a time when children are facing unprecedented new harms and challenges online, we are pleased to see the commitment to bringing in on-device protections.

Safety mechanisms to protect children from unsolicited nude imagery, and from being coerced into sending sexually explicit material, are an important step – we must now see these measures made mandatory and applied across the board.

We are also glad to see concrete steps to ban these so-called nudification apps which have no reason to exist as a product. This is not some abstract threat. Apps like this put real children at even greater risk of harm, and we see the imagery produced being harvested in some of the darkest corners of the internet. 

It is crucial any interventions are rolled out in consultation with child development experts to make sure children and victims stay at the heart of the response, and we look forward to playing our part in making these measures a success.

Published 18 December 2025