Case study from the Alan Turing Institute and University of York.
Understand how preventative design measures can reduce the risk of harm occurring on your online platform.
If you manage an online platform that allows user-generated content, these 7 steps will help you keep your business and your users safe.
Find out what online harms are, how to manage them, and where to report them if they occur.
Practical steps to manage the risk of online harm if your online platform allows people to live stream, or view live streams by others.
Learn about your responsibilities if you own or manage an online platform or service.
Practical steps to manage the risk of online harm if your online platform allows people to interact, and to share text and other content.
Practical steps to manage the risk of online harm if your online platform allows people to create anonymous or multiple accounts.
Practical steps to manage the risk of online harms if your online platform allows people to search user-generated content.
How safe platform design can protect your users from online harms and prepare your business or organisation for future legislation.
Practical steps to manage the risk of online harms if your online platform makes users’ account details and activity visible to others.
Case study from Credo AI.
Requirements for organisations wanting to provide or consume digital identity products and services.
Case study from Logically AI.
This playbook explains how we use social media at GDS. In it, we share our best practice, what we've learned and what we're planning to do.
How to choose data tools and infrastructure that are flexible, scalable, sustainable and secure.
Lumenova AI's AI Governance, Risk Management, and Compliance Platform aims to simplify and streamline AI risk management. It provides visibility into AI models and helps ensure consistent adherence to the latest regulatory standards and industry best practices.
We’re working to help people securely prove who they are without having to rely on physical documents.