Government collaborates with Microsoft and other world-leading technology companies to create a framework that will identify gaps in deepfake detection.
Technology Secretary Liz Kendall calls for swift action after reports that xAI's Grok tool continues to allow generation of intimate deepfake images.
DSIT Secretary of State statement after concerns over Grok AI.
A global review of tools and strategies to prevent and respond to intimate image abuse in the age of AI, authored by Humane Intelligence.
New legislation sees government work with AI industry and child protection organisations to ensure AI models cannot be misused to create synthetic child sexual abuse images.
The Global Partnership for Action on Gender-Based Online Harassment and Abuse calls for urgent action on non-consensual intimate image abuse and support for survivors.
Social media graphics and animations to encourage engagement with the Domestic Abuse Bill consultation and raise awareness of domestic abuse.