Guidance on building and using artificial intelligence in the public sector.
Understand how to use artificial intelligence ethically and safely
The Centre for Data Ethics and Innovation (CDEI) is now called the Responsible Technology Adoption Unit (RTA), part of the Department for Science, Innovation and Technology (DSIT). Content from RTA can be found on DSIT’s GOV…
Guidance to help you assess if artificial intelligence (AI) is the right technology for your challenge.
Guidance to help you plan and prepare for implementing artificial intelligence (AI).
The CDEI has published a report detailing the findings from the third wave of its Public Attitudes Tracker Survey, which monitors how attitudes towards data and AI vary over time.
Weights & Biases provides an MLOps platform that helps organisations build auditable and explainable end-to-end machine learning workflows for reproducibility and governance.
New plan for self-driving vehicles plus a consultation on a safety ambition.
A system-level AI Impact Assessment (AIIA), developed by the Responsible Artificial Intelligence Institute (RAI Institute), informs the broader assessment of AI risks and risk management.
The Centre for Data Ethics and Innovation (CDEI) is changing its name to the Responsible Technology Adoption Unit (RTA) to more accurately reflect its mission.
Oliver Wyman's AI validation framework demonstrates how a rigorous validation framework can be executed efficiently in practice.
Case study from the British Standards Institution.
The CDEI has published a report that sets out proposals for a trustworthy approach to the regulation and governance of self-driving vehicles.
A facial recognition service provider used Credo AI’s platform to provide transparency on fairness and performance evaluations of its identity verification service to its customers.
Case study from Mind Foundry.
Research led by the Centre for Data Ethics and Innovation into recommendation algorithms used by streaming services.
The CDEI has published the first roadmap of its kind, setting out the steps required to build a world-leading AI assurance ecosystem in the UK.
The CDEI has published a report on approaches to accessing demographic data for bias detection and mitigation.
Case study from Holistic AI.
An introduction to understanding and using artificial intelligence in the public sector.