Corporate report

Evaluation Academy - Final Evaluation Report

The Evaluation Task Force has published its evaluation of the July 2025 Evaluation Academy training programme.

Documents

Evaluation Academy - Final Evaluation Report 2025 (PDF)

Evaluation Academy - Evaluation Summary (PDF)

Details

Ministerial Note

“Strong evaluation evidence is essential for making effective government decisions and ensuring public money is spent wisely. This evaluation report demonstrates our commitment to improving delivery and strengthening our ability to make evidence-based decisions. This work should serve as best practice for teams across government looking to produce an excellent evaluation.”

Josh Simons MP - Parliamentary Secretary for the Cabinet Office

Executive Summary 

  • The Evaluation Academy is a capability-building course, developed by the Evaluation Task Force (ETF), which uses a train-the-trainer model to improve civil service analysts’ ability to teach others about evaluation methods. The ETF conducted an impact evaluation (IE) and an implementation and process evaluation (IPE) of the first stage of the Evaluation Academy’s train-the-trainer model in July 2025.
  • The impact evaluation employed a randomised controlled trial (RCT) which used a two-arm, individual-level waitlist design to investigate the impact of the Evaluation Academy on two primary outcomes – participants’ confidence in delivering evaluation training and the size of participants’ evaluation network – and one secondary outcome – participants’ knowledge of evaluation methods and processes. 
  • The impact evaluation demonstrated large and statistically significant effects of the Evaluation Academy on both primary outcomes. Compared to those in the control group, participants reported substantially higher levels of confidence in delivering evaluation training (an average increase of approximately 1 point on a 5-point scale) as well as substantial increases in the size of their cross-government evaluation networks (an average increase of 8 people).
  • By contrast, the impact evaluation found more limited evidence of impact on participants’ knowledge of evaluation methods and processes. Although treatment group participants performed better on average than the control group on a post-intervention set of evaluation knowledge questions, this difference was not statistically significant. The absence of large knowledge effects reflects the fact that civil servants participating in the first stage of the train-the-trainer model had reasonably high levels of pre-existing evaluation knowledge. 
  • The implementation and process evaluation aimed to understand participant and teacher experiences of the Evaluation Academy to inform future delivery. Data was collected from surveys, participant and teacher feedback, a focus group, and observations. Key findings revealed that the Academy was well organised, and that the in-person format, mixed groupings, and external expert support enabled networking and confidence building. Barriers included the intense pace, assumed prior knowledge, and text-heavy slide packs. Participants recommended refreshing materials, setting clearer expectations, and providing more support for online/hybrid delivery in the future.

Transparency 

For transparency, the Evaluation Task Force will publish the code and data used in this impact evaluation on the UK Data Service.
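
Ahead of that publication, the sketch below illustrates, in general terms, how a headline estimate from a two-arm, individual-level waitlist trial of this kind might be produced. It is not the Evaluation Task Force's analysis code: the dataset, file name and variable names are hypothetical, and the baseline-adjusted regression shown is only one standard approach to estimating an intention-to-treat effect.

```python
# Purely illustrative sketch - not the Evaluation Task Force's analysis code.
# Assumes a hypothetical per-participant dataset with columns:
#   treatment        - 1 = Evaluation Academy group, 0 = waitlist control
#   confidence_pre   - baseline confidence in delivering evaluation training (1-5)
#   confidence_post  - post-intervention confidence (1-5)
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("evaluation_academy_trial.csv")  # hypothetical file name

# Intention-to-treat estimate for a two-arm, individual-level waitlist RCT:
# regress the post-intervention outcome on treatment assignment, adjusting
# for the baseline measure to improve precision.
model = smf.ols("confidence_post ~ treatment + confidence_pre", data=df).fit()

print(model.params["treatment"])          # estimated average treatment effect
print(model.conf_int().loc["treatment"])  # 95% confidence interval
print(model.pvalues["treatment"])         # p-value used to judge significance
```

In this illustration, the coefficient on the treatment indicator corresponds to the kind of effect reported above, such as the increase of approximately 1 point in training confidence on a 5-point scale.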

Updates to this page

Published 29 October 2025
