Policy paper

MOJ Evaluation and Prototyping Strategy

Published 25 April 2023

Foreword

The justice system is an essential public service, relied upon by millions of victims, households and businesses across our country to deliver justice outcomes that matter to them. The legal system also underpins growth and prosperity in the UK.

If we are to create a world class justice system that works well for everyone in society, it is essential to develop a robust understanding of which policies and interventions work, which do not, and how we can improve them.

At the core of this Strategy is a simple message: better evidence enables better decision making which delivers better outcomes, maximising our positive impact within the resources available to us. Improving our evaluation practice and making greater use of early prototyping to develop our ideas will help us to:

1. Maximise our impact

By developing our understanding of what works, we can use this insight to make informed choices about how to improve outcomes for victims, offenders, and society.

2. Identify innovative approaches to improve the way the justice system delivers in the future

Through understanding the relationship between our activity and our outcomes, evaluation and prototyping helps identify new ideas for incremental and fundamental changes to the justice system that can be developed and tested.

3. Make the best possible use of public money

Evaluation establishes which policies and programmes deliver the best value for money and enhances Departmental accountability by demonstrating that we have used taxpayers’ money effectively.


This Evaluation and Prototyping Strategy outlines the practical steps we will take to realise these benefits. These include ensuring evaluation and prototyping is considered from the outset when new policy is being designed, building our capability to deliver quality evaluation and prototyping and taking steps to ensure our evidence is available in a timely and accessible manner to inform decision making.

For us to deliver effectively, it is vital that all parts of the justice system embrace a test-learn-adapt mindset. We want to build a culture where we’re continually seeking out innovative new approaches that help us to deliver better for the public, and using evaluation to ensure they are improving outcomes.

This is an ambitious Strategy, and we know the change won’t happen overnight. Although pressures on the justice system remain high, I’m excited about the role evaluation and prototyping will play in the years ahead to help improve outcomes for victims and offenders and create a more efficient and effective justice system.

Richard Price

Director General, Performance, Strategy and Analysis Directorate

Executive summary

The Ministry of Justice (MoJ) sits at the heart of the justice system. To protect and advance the principles of justice, the MoJ works with 34 agencies and public bodies which collectively employ over 88,000 people working across courts, prisons and probation.

As a Department, we are committed to enhancing the way data and evidence is used to shape policy and operational decisions and drive improvements in justice outcomes. This Strategy covers how the MoJ will improve its evaluation (the systematic assessment of a policy’s design, implementation and outcomes) and use prototyping (a low-cost, low-risk way of developing, testing and improving ideas at an early stage) to help deliver a world-class justice system that works for everyone in society.

The ambition of this Evaluation and Prototyping Strategy is to establish a test-learn-adapt culture where robust and timely evidence on the effectiveness of policies and programmes sits at the heart of policy and operational decisions. The specific aims of this Strategy are to:

  1. Use evaluation and prototyping evidence to inform decisions and improve justice outcomes.
  2. Increase the quality and coverage of evaluation and prototyping across the justice system.
  3. Progress our understanding of cost-effectiveness to ensure we deliver value for money for the taxpayer.
  4. Improve the timeliness of our evidence, which includes creating a new prototyping function.

This Strategy outlines a set of twelve specific actions that the MoJ and its main Executive Agencies will take to achieve these aims. These twelve actions are grouped into the following three pillars of activity, which form the basis of this Strategy:

1. Pillar One: Establish processes to ensure proportionate evaluation and prototyping

These new processes will ensure robust evaluation and prototyping is considered early, so that proportionate learning can be built into the design and delivery of new policy from the outset.

2. Pillar Two: Build capability to deliver quality evaluation and prototyping

The activity in this Pillar will guarantee that appropriate methodological support and scrutiny exist when evaluation and prototyping activity is being designed, so that decision makers have confidence in the evidence we produce.

3. Pillar Three: Produce timely and accessible evidence to improve decision making

This Pillar is about maximising the impact of our evaluation and prototyping activity, by making sure evidence is comprehensible, policy relevant and available when decision makers need it.


A specialist Evaluation and Prototyping Hub has been established to deliver this Strategy and support evaluation and prototyping across the justice system. To successfully deliver these three pillars of activity, the Evaluation and Prototyping Hub will work collaboratively with partners across the justice system, Civil Service and academia. The MoJ will report on progress in delivering this Strategy through its Outcome Delivery Plan.

Introduction: The importance of an Evaluation and Prototyping Strategy

The Ministry of Justice (MoJ) sits at the heart of the justice system. To protect and advance the principles of justice, the MoJ works with 34 agencies and public bodies which collectively employ over 88,000 people working across courts, prisons and probation. This Strategy applies to the Ministry of Justice and its Executive Agencies.

Our vision is to deliver a world class justice system that works for everyone in society. Central to achieving our vision is the delivery of four strategic outcomes:

1. Protect the public from serious offenders and improve the safety and security of our prisons

Through better sentencing, more prison places, safer prisons and strong action on extremism.

2. Reduce reoffending

Reducing crime through breaking the cycle of reoffending by focusing on proven interventions: a home, a job and access to treatment for substance misuse.

3. Deliver swift access to justice

Making the courts and tribunals system stronger and smarter so that it works to support victims, tackling sexual and domestic violence and making sure the vulnerable are supported in the justice system.

4. Progress constitutional reform

A reformed human rights framework for the UK will protect people’s fundamental rights, while safeguarding the broader public interest and respecting the will of elected representatives in Parliament.


Improving the quality, timeliness and accessibility of our evidence is essential to realise these strategic outcomes and deliver a justice system that works for everyone in society. Alongside the MoJ’s published Areas of Research Interest (2020) and Data Strategy (2022), this Strategy demonstrates our commitment to putting the insights from data, evidence and collaboration at the heart of decision making to improve outcomes across the justice system.

Evaluation at the Ministry of Justice

Evaluation is the systematic assessment of a policy’s design, implementation, and outcomes. In this document, policy refers to the broad range of activity, interventions and programmes delivered across the justice system. Evaluation allows decision makers and stakeholders to understand whether our policies are having their intended effect (impact evaluation), being delivered as planned (process evaluation), and represent a worthwhile use of public funds (value for money evaluation).

Good quality evaluation and prototyping supports us to:

Maximise our impact

By developing our understanding of what works, we can use this insight to make informed choices about how to improve outcomes for victims, offenders and society.

Identify innovative approaches to improve the way the justice system delivers in the future

Through understanding the relationship between our activity and our outcomes, evaluation and prototyping helps identify new ideas for incremental and fundamental changes to the justice system that can be developed and tested.

Make the best possible use of public money

Evaluation establishes which policies and programmes deliver the best value for money and enhances Departmental accountability by demonstrating that we have used taxpayers’ money effectively.

For these reasons, the MoJ has an established track record of evaluating its activity. Since 2013, the MoJ’s Justice Data Lab has evaluated the effectiveness of a wide variety of interventions aimed at reducing reoffending, publishing over 175 reports. Our Outcome Delivery Plan provides a summary of the main evaluations currently underway across the justice system.

Our ambition for Evaluation and Prototyping

We are publishing our first Evaluation and Prototyping Strategy because we recognise there are clear benefits to improving our evaluation practice. It is a statement of our commitment to improve our evidence base and, ultimately, provide effective and efficient services for the public.

The ambition of this Strategy is to establish a test-learn-adapt culture where robust and timely evidence on the effectiveness of policies and programmes sits at the heart of policy and operational decisions. This requires adopting a proportionate approach to our learning, balancing decision makers’ needs for both robust and timely evidence. The specific aims of this Strategy are to:

Aim one: Use evidence to inform decisions and deliver impacts for Ministers, the MoJ and the public.

To realise this ambition, our activity will be informed by the existing evidence base and build on what we already know. Where the evidence base is inconclusive, such as when we are proposing new solutions, we will test policy to generate insight and use that learning to inform decisions about which of our policies should be continued, improved, or stopped. We aim to build learning opportunities into the implementation of policy from the very beginning, by altering policy design to create robust counterfactuals and/or allow time to prototype at small scale.

Aim two: Increase the quality and coverage of our evaluation across the justice system.

Due to the breadth and contextual complexity of the justice system, there is no single evaluation approach that will be suitable for all circumstances. To learn effectively, it is essential we use a wide range of methods. Where appropriate, we will strive to use a greater range of experimental and quasi-experimental methods in our evaluations to better understand the impact of our policy against a counterfactual. Where complexity does not allow for experimental approaches, we will strive to use Theory-Based Evaluation to establish what works, for whom, in what circumstances.
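
To make the counterfactual logic concrete, the sketch below shows a minimal difference-in-differences calculation: the change in an outcome at sites receiving an intervention is compared with the change at comparison sites over the same period. It is an illustration only; the sites, outcome rates and figures are hypothetical, not MoJ data.

    import pandas as pd

    # Illustrative difference-in-differences estimate. All figures are hypothetical.
    data = pd.DataFrame({
        "site":    ["A", "B", "C", "D"],
        "treated": [True, True, False, False],
        "before":  [0.42, 0.39, 0.41, 0.40],  # outcome rate before the intervention
        "after":   [0.36, 0.34, 0.40, 0.38],  # outcome rate after the intervention
    })

    data["change"] = data["after"] - data["before"]
    treated_change = data.loc[data["treated"], "change"].mean()
    control_change = data.loc[~data["treated"], "change"].mean()

    # Counterfactual assumption: without the intervention, the treated sites
    # would have followed the same trend as the comparison sites.
    impact_estimate = treated_change - control_change
    print(f"Estimated impact on the outcome rate: {impact_estimate:+.3f}")  # -0.040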

Aim three: Improve our understanding of cost-effectiveness to ensure we deliver value for money for the taxpayer.

We will use evidence from our evaluation and prototyping activity to help allocate our resources where they will deliver the best outcomes. To achieve this aim we will conduct more good quality value for money evaluation: this will assess whether a policy represents a good use of resources by comparing the costs with the benefits, and will identify ways in which its value for money can be improved.

This evidence will help decision makers to select the policies that deliver the best value for money and support the Permanent Secretary in their duties as the Principal Accounting Officer by demonstrating whether taxpayers’ money has been used effectively.

To this end, we will systematically apply approaches that maximise consistency in comparing costs and benefits across programmes and interventions. Similarly, we will seek consistency with comparable value for money analysis across the justice system, and with interventions by other bodies that contribute to the MoJ’s objectives (such as reducing crime).
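
As a minimal illustration of comparing costs and benefits on a consistent basis, the sketch below computes a benefit-cost ratio and net benefit for two programmes. The programme names and figures are invented for illustration and do not represent MoJ analysis.

    # Value-for-money comparison sketch, assuming benefits have already been
    # monetised on a consistent (Green Book style) basis. Figures are hypothetical.
    programmes = {
        "Programme X": {"cost": 2_000_000, "monetised_benefit": 5_200_000},
        "Programme Y": {"cost": 3_500_000, "monetised_benefit": 4_900_000},
    }

    for name, p in programmes.items():
        bcr = p["monetised_benefit"] / p["cost"]          # benefit-cost ratio
        net_benefit = p["monetised_benefit"] - p["cost"]  # benefits minus costs
        print(f"{name}: BCR = {bcr:.2f}, net benefit = £{net_benefit:,.0f}")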

Aim four: Improve the timeliness of our evidence, including through the implementation of a new prototyping function.

The MoJ will make greater use of prototyping to meet decision makers’ needs for timely insights to inform the design of their policy. Prototyping, a low-cost, low-risk way of developing, testing and improving ideas at an early stage, is based on the premise that no solution will be designed perfectly at the outset: ideas need to be tried out at small scale whilst collecting insight from the individuals who deliver or receive a service. Prototyping is particularly valuable during the early stages of the policy cycle. By testing earlier, an idea that does not work, has limited demand and/or is unlikely to scale can be stopped before large costs are incurred. This approach is common in design, engineering and digital development but has not been widely applied to policy development.

Prototyping complements and enhances traditional evaluation approaches. An intervention or policy that has been through rapid cycles of prototyping is more likely to be optimised when it is subjected to longer term evaluation. By identifying ideas with limited feasibility much earlier than traditional evaluation approaches can, prototyping will enable longer term evaluations to be better targeted on the policies that have the greatest potential to improve justice outcomes.

Challenges to evaluation in the justice system

To develop this Strategy, the Evaluation and Prototyping Hub undertook a comprehensive review and assessment of evaluation practice in the justice system. This review found that while there was a strong desire to evaluate, the following barriers were limiting the impact of our evaluations:

  1. Short timescales meant potential opportunities to build learning into the design and delivery of policy were being missed.
  2. There was limited central oversight to monitor evaluation activity and prioritise analytical resource. Factors limiting the quality of our evaluation were not being identified and acted upon early enough.
  3. Competing pressures to deliver, combined with variable experience of evaluation, meant evaluation coverage was inconsistent across the justice system.
  4. In some areas, limited access to data of the right quality restricted what could be evaluated.

A range of specific contextual factors that make evaluation in the justice system challenging were also identified. Compared to other areas of public policy, sample sizes can be small, making it difficult to calculate precise estimates from our impact evaluations. Randomised Controlled Trials (RCTs) can be difficult to run due to operational, ethical and safety concerns.
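
The effect of small samples on precision can be illustrated with a standard power calculation. The sketch below approximates the minimum detectable effect (MDE) for a two-group comparison of proportions at conventional significance and power levels; the baseline rate and sample sizes are hypothetical, chosen only to show how the MDE shrinks as samples grow.

    from scipy.stats import norm

    # Approximate minimum detectable effect for comparing two proportions.
    # All parameters are illustrative.
    alpha, power = 0.05, 0.80
    p_baseline = 0.40  # assumed baseline rate (e.g. a reoffending rate)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)

    for n_per_group in (100, 500, 2000):
        variance = 2 * p_baseline * (1 - p_baseline)
        mde = z * (variance / n_per_group) ** 0.5
        print(f"n = {n_per_group:>4} per group: MDE ≈ {mde * 100:.1f} percentage points")

With 100 cases per group, only very large effects (here around 19 percentage points) could be detected reliably; detecting smaller effects requires samples that justice settings often cannot provide.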

To help overcome these barriers, this Strategy has been informed by the recent Evaluating Government Spending report published by the National Audit Office and the updated Magenta Book, which outlines guidance on conducting evaluation in government.

The Evaluation Task Force (ETF), set up following the 2020 Spending Review to ensure robust evidence sits at the heart of government spending decisions, provided detailed advice. The ETF helped to identify the most important recommendations, share examples of best practice and ensure this Strategy is both deliverable and ambitious.

Pillar One: Establish processes to ensure proportionate evaluation and prototyping

This Pillar outlines four actions to give evaluation and prototyping greater prominence across the breadth of the MoJ:

  • Action One: Introduce common principles for prioritising our evaluation resource
  • Action Two: Embed evaluation requirements in business cases
  • Action Three: Establish a Strategic Evidence and Evaluation Committee to prioritise analytical resource
  • Action Four: Systematically track and monitor evaluation and prototyping activity

Combined, these new processes aim to ensure timely and robust evaluation is considered from the outset, so that learning is built into the design and delivery of new policy.

Action One: Introduce common principles for prioritising our evaluation resource

The breadth, size and complexity of the justice system means there is a considerable demand for evidence to support decision making. However, generating robust and timely evidence for decision makers requires considerable analytical resource and can place additional pressures on operational staff.

This means it is not proportionate, necessary or feasible to evaluate everything. Attempting to do so would compromise the quality of the evidence generated. Instead, it is essential to establish principles to identify and prioritise areas for evaluation and prototyping, ensuring they are adequately resourced to deliver timely and robust evidence to inform decision making.

In many cases, a light touch monitoring exercise, where data is collected during and after implementation to improve current and future decision making, will be sufficient. Monitoring enables decision makers and stakeholders to understand a policy’s:

  • Inputs - the resources committed to an intervention.
  • Activities - the actions and processes undertaken with the resource.
  • Outputs - what is delivered and produced because of the activities.

Monitoring is essential to demonstrate what the MoJ has achieved with its spending of taxpayers’ money. It will also ensure that policies and programmes have been delivered correctly, and identify at an early stage when this is not the case. The MoJ already monitors a range of its policies through its dashboards and routine statistical publications.

Action Seven outlines the steps the MoJ is taking to improve its data and ensure data of the right quality is being brought together effectively and shared at the right time, in the right format.

Compared to inputs, activities and outputs, it is considerably more difficult and resource intensive to identify and understand policy outcomes (the early or medium-term results) and impacts (the long-term results).

Due to this, the MoJ will use the following principles to identify where it should prioritise its resource to understand the outcomes and impacts resulting from its policies:

1. Strategic relevance

Learning will be prioritised for activity that has the greatest potential to have large positive or negative impacts on the MoJ’s four strategic outcomes. A particular focus will be where there is potential for the resulting evidence to inform a strategic Ministerial, policy or operational decision.

2. Evidence gap

Our evaluation and prototyping activity will proactively build our understanding of what works by addressing current and future evidence gaps. The MoJ’s Areas of Research Interest (ARI), which summarises the MoJ’s medium term evidence needs, will be used to inform the planning of our activity. Addressing evidence gaps is a priority as, when there is no comprehensive existing evidence base to draw on, there is greater uncertainty about any potential impacts and heightened risk of policy and operational activity leading to negative impacts on justice outcomes.

3. Size of investment

High-cost programmes and policies will be prioritised for evaluation to help demonstrate accountability for the use of taxpayers’ money. Activity that requires substantial resource is also more likely to attract stakeholder interest. Where possible, we will seek to estimate the potential impacts of high-cost programmes early and at the smallest possible scale. Evidence can then be available to inform larger scale investment decisions before costs become considerable.

Policies assessed as meeting these criteria by the Strategic Evidence and Evaluation Committee (Action Three) will be expected to include a process, impact, and value for money evaluation. These principles will also be used to provide focus for the evaluations of our large-scale reform programmes, which often contain multiple strands of activity.

Action Two: Embed evaluation requirements into business cases

Common prioritisation principles will help the MoJ and its Executive Agencies decide what activity should be evaluated. To ensure the prioritisation principles are being applied, and to set an expectation for proportionate learning, a new monitoring, evaluation and prototyping section will be introduced into the MoJ’s business cases.

Policies and programmes will be required to outline in their business case:

  • The benefits expected to result from the activity proposed.
  • A Theory of Change, which identifies the causal pathways that theoretically link the inputs and activities of a project to the desired outcomes, allowing the evaluation to develop appropriate approaches to test them.
  • What evidence exists to suggest these benefits are likely to be achieved more effectively than under an existing, or other potential, approach.
  • How we will learn from the policy, including whether there are any alternative ways of implementing the policy to enable learning, for example by phasing the roll-out and/or prototyping at small scale.
  • An estimate of the analytical resource required, including any new data collection requirements, and how it will be funded.
  • The Senior Policy, Analytical and Operational colleagues who will be jointly responsible for successful delivery of the evaluation.
  • A rationale for why evaluation is not required if none is proposed.

The MoJ has a Keyholder process where business cases are assessed by a panel of specialists and experts who provide an objective and robust appraisal of business cases prior to submission to the relevant approvals board. A new Evaluation Keyholder will be created to review and comment on the evaluation section of business cases to ensure they are sufficient. These changes align the MoJ with the Treasury Green Book which stipulates that “all proposals must as part of the proposal contain proportionate budgetary, and management provisions for their own monitoring and evaluation.”

This new business requirement will prevent learning opportunities from being missed, as evaluation will be considered during the design of new policies and programmes, when it is still possible to embed it. This requirement will also ensure sufficient resource for learning has been incorporated into budgets for new policies and programmes, providing the funding certainty required. Pillar Two outlines the support that will be available to assist teams to include proportionate monitoring, evaluation and prototyping plans in their business cases.

Activity that does not go through the Keyholder assurance process will still be required to meet minimum evaluation requirements (see Action Five).

Action Three: Establish a Strategic Evidence and Evaluation Committee to prioritise analytical resource

New structures will be created to identify priority evidence gaps across the justice system and ensure successful delivery of an analytical programme to address them. In addition to enabling prioritisation, this senior oversight will streamline decision making for priority evaluations by establishing clear responsibilities and escalation routes.

A new senior Strategic Evidence and Evaluation Committee (SEEC) will have ultimate responsibility for ensuring the MoJ evidence base is sufficient to support decision making. The SEEC will identify priority policies and programmes that should be subject to greater evaluation and provide additional oversight to ensure the successful delivery of a small number of priority evaluations, which are strategically important, high profile and/or complex.

The SEEC will be chaired by the Director General for Performance, Strategy and Analysis and have senior representatives from the analytical, policy, strategy and operational functions. It will be integrated into the MoJ’s broader governance by reporting directly into the MoJ Finance Performance and Risk Committee with annual reporting to the Department’s Executive Committee.

Action Four: Systematically track and monitor evaluation and prototyping activity

We will maintain and share a comprehensive record of the learning activity planned or underway across the justice system. A bespoke tool will be created to collect and update this information, which will be aligned to existing business as usual processes where possible.

Collecting data on our evaluation and prototyping activity is necessary to facilitate effective governance and prioritisation across the MoJ. The data collected will be made accessible across the justice system to provide a snapshot of evaluation:

  • Coverage. The evaluations planned, underway or committed to, and what policies, programmes and interventions are not currently subject to evaluation.
  • Aims. The purpose of the evaluation, how the findings will be used and how they relate to the broader evidence base.
  • Quality. An assessment of the evaluation method being used to determine impact.
  • Geography. Where evaluations are taking place, what customer groups are in scope and which organisational entities are involved.
  • Delivery. An expected timeline for findings and an assessment of the issues and/or risks that may compromise the quality or timeliness of the evidence produced.
  • Resource. The spend and number of full-time equivalent staff currently assigned to each evaluation.

All staff wanting to conduct research, including evaluation, with staff and/or offenders in prison establishments, the Probation Service regions or within HM Prison and Probation Service (HMPPS) Headquarters are required to formally apply for research approval to the HMPPS National Research Committee (NRC). A complementary approval process aligned with the NRC for evaluations overseen by the MoJ’s Data and Analysis Directorate is outlined in Action Eight.

When new evaluations are being set up, this information will help prevent particular operational locations being overburdened and reduce the contamination risk of running two distinct evaluations in the same location. For counterfactual impact evaluations, understanding where evaluations are taking place will make it easier to identify a prison, court or probation area not currently involved in another evaluation that can serve as a control group for the purposes of the evaluation: a necessary step to disentangle cause and effect and improve our understanding of what works.
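
As an illustration of how such tracking data could support this, the sketch below filters a simple registry for sites with no active evaluations, which might then serve as uncontaminated comparison groups. The registry structure and site names are hypothetical.

    # Hypothetical registry of evaluation activity by operational site.
    registry = [
        {"site": "Prison A",         "active_evaluations": 2},
        {"site": "Prison B",         "active_evaluations": 0},
        {"site": "Court C",          "active_evaluations": 1},
        {"site": "Probation area D", "active_evaluations": 0},
    ]

    # Sites with no active evaluations are candidate control locations.
    candidates = [r["site"] for r in registry if r["active_evaluations"] == 0]
    print(candidates)  # ['Prison B', 'Probation area D']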

Pillar Two: Build capability to deliver quality evaluation and prototyping

Pillar Two sets out how we will build our evaluation and prototyping capability across the breadth of the MoJ and its Executive Agencies. To do this we will:

  • Action Five: Invest in a centre of expertise to support systematic learning.
  • Action Six: Harness expertise through a new Evaluation Support Group.
  • Action Seven: Improve our data to deliver better evaluation and prototyping.
  • Action Eight: Strengthen methodological scrutiny of evaluation plans.

Combined, the activity outlined in this Pillar will guarantee that appropriate methodological support and scrutiny exist when learning activity is being designed and implemented. This will ensure quality across the evaluation and prototyping portfolio and give decision makers confidence in the evidence we produce.

Action Five: Invest in a centre of expertise to support systematic learning

A small Evaluation and Prototyping Hub was established at the heart of the MoJ in 2021 to support colleagues to generate evaluation and prototyping evidence that meets decision makers’ needs.

The purpose of the Evaluation and Prototyping Hub is to provide intensive support during the early stages of evaluation and policy design, to build in proportionate learning from the outset.

Produce evaluation and prototyping guidance tailored to the Justice context

The Evaluation and Prototyping Hub will set minimum standards for evaluation and prototyping in the MoJ. Alongside tailored project-specific advice, a range of universal guidance will be produced to support analytical, policy and operational teams to meet this minimum standard and deliver quality evidence from their evaluation and prototyping activity.

Guidance will include the need to develop a Theory of Change for the policy or programme, which links elements of the intervention with expected outcomes. Development of a Theory of Change is an essential minimum standard for ensuring quality in prototyping and evaluation, as it is used to understand the underlying logic of the intervention, build consensus around measurable indicators and identify priority areas for evaluation.
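
As an illustration of how a Theory of Change can be captured in a structured, comparable form, the sketch below represents one as a simple record linking inputs and activities to outputs, outcomes and impacts, with measurable indicators attached. The structure and content are a hypothetical example, not MoJ guidance.

    from dataclasses import dataclass, field

    # A minimal, illustrative representation of a Theory of Change.
    @dataclass
    class TheoryOfChange:
        inputs: list[str]      # resources committed to the intervention
        activities: list[str]  # actions undertaken with the resources
        outputs: list[str]     # what is delivered and produced
        outcomes: list[str]    # early or medium-term results
        impacts: list[str]     # long-term results
        indicators: dict[str, str] = field(default_factory=dict)

    toc = TheoryOfChange(
        inputs=["funding for employment key workers"],
        activities=["one-to-one employment support sessions in custody"],
        outputs=["support sessions delivered per prisoner"],
        outcomes=["employment on release"],
        impacts=["reduced reoffending"],
        indicators={"employment on release": "% in work six weeks after release"},
    )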

The guidance will also include a series of applied working papers, covering the main evaluation approaches and key considerations to practically deliver these approaches in the justice system.

This guidance will be supplemented by a series of applied tools, starting with a costing tool to estimate the cost of commissioning different types of research and to support colleagues in building sufficient evaluation and prototyping resource into their business cases.

Develop a new MoJ evaluation and prototyping training programme

The Evaluation and Prototyping Hub will develop a new evaluation and prototyping training programme, tailored to the justice context, to ensure evaluation and prototyping is properly understood and embraced across the MoJ.

For non-analytical audiences, a new introductory training course will be developed in collaboration with evaluation and prototyping champions recruited from across the MoJ. This course will run regularly so there is a sufficient foundation of evaluation and prototyping knowledge across the organisation.

To meet more advanced analytical needs, a modular course tailored to the MoJ will be developed in partnership with the joint Cabinet Office HM Treasury Evaluation Task Force and the Behavioural Insights Team. This intermediate course will cover the whole evaluation process, from developing a Theory of Change to selecting and implementing the appropriate evaluation approach and using findings to deliver impact.

Action Six: Harness expertise through a new Evaluation Support Group

A new Evaluation Support Group will harness the applied evaluation expertise that exists across the justice system. The Evaluation Support Group will enable individuals to access a range of methodological experts and receive practical advice on how to apply different evaluation approaches across justice settings.

Members of the Evaluation Support Group will also be able to access the wealth of academic expertise that exists beyond the MoJ. The Evaluation and Prototyping Hub will work with academics to create a faster and more coordinated pathway to access academic expertise. This builds on work led by the MoJ’s Evidence and Partnerships Hub to enhance our use of academic expertise and collaborations, including through embedded secondments, fellowships and effective knowledge exchange.

The justice-applied Evaluation Support Group will supplement other sources of evaluation support that exist both within the MoJ and across government, including the MoJ Ethics Advisory Group, the MoJ and HMPPS National Research Committee, the Evaluation and Trials Advice Panel and the Cross-Government Evaluation Group.

Action Seven: Improve our data to enable better evaluation and prototyping

The learning from our evaluation and prototyping activity is only as good as the data upon which it is based. Delivering timely and quality insights depends on data of the right quality being brought together effectively and shared at the right time, in the right format.

We recognise that getting data of the appropriate quality to the people who need it is essential. With improved data coverage and quality, we can understand how services are performing and what needs to be done to improve them. Better data facilitates the rapid testing of ideas and enables robust evaluation.

We have set up a Data Improvement Programme across the MoJ and with our partners across the Criminal Justice System, which will drive data transformation. The Evaluation and Prototyping Hub will continue to work closely with the Data Improvement Programme to ensure evaluation data needs are fed into the programme as it develops over the next two years, participating in relevant pilots and inputting into the longer term improvement roadmap.

Action Eight: Strengthen methodological scrutiny of evaluation plans

All evaluation leads within Data and Analysis Directorate will need to complete an updated Evaluation Template as part of the MoJ’s Analytical Quality Assurance Process. The Evaluation and Prototyping Hub will store all the Evaluation Templates and conduct an annual audit to review quality.

Director of Analysis clearance will be required for all priority evaluations identified by the Strategic Evidence and Evaluation Committee.

Early independent review of evaluation plans will allow for methodological and delivery risks to be identified. Appropriate mitigations will then be put in place to ensure quality before financial costs are incurred. Following its review, the Evaluation and Prototyping Hub will share its written assessment of whether the proposed evaluation is proportionate, methodologically feasible and delivers value for money with the Director of Analysis and relevant Deputy Director.

Evaluation reports produced by the MoJ are already independently peer reviewed by external experts as part of the quality assurance process. We want to use this independent expertise earlier when evaluations are being designed.

For our highest profile and/or complex evaluations, we will utilise academic expertise by requiring external ‘pre-review’ of evaluation plans to identify methodological improvements and deliver quality from the outset.

Pillar Three: Produce timely and accessible evidence to improve decision making

Evidence needs to be accessible and timely to deliver policy impact. In a recent National Audit Office report, the heads of the policy profession across government reported that a lack of timely evidence and an inaccessible knowledge base were by far the biggest barriers to using evaluation to inform policy decisions. This Pillar outlines the following actions to overcome these barriers:

  • Action Nine: Utilise prototyping to quickly gain early insights.
  • Action Ten: Aid decision making through consistent, timely and monetised outcome measures.
  • Action Eleven: Make our evidence accessible.
  • Action Twelve: Improve the generation and use of evidence by encouraging a positive learning culture.

Combined, these activities will improve the impact of our evidence by making it more timely and accessible for decision makers.

Action Nine: Utilise prototyping to quickly gain early insights

Delivering impact with evidence requires it to be available when decision makers need it. Alongside our longer-term policy evaluation, we will make greater use of prototyping to generate learning in a timely manner. Compared to traditional evaluation approaches, prototyping places greater emphasis on the rapid generation and use of learning to inform policy development.

At its core, prototyping accepts that no solution will be designed perfectly at the outset. For a policy or intervention to achieve its outcomes it is essential to understand the context in which it is delivered. For this reason, prototyping relies on putting people who will deliver the intervention, or be impacted by it, at the centre of its design in order to:

  1. Understand their perspectives about the problem and identify the barriers and enablers for any intended policy change.
  2. Co-design change initiatives with them, as those close to the problem are often closer to the solution.
  3. Try things out in context and get rapid feedback from the people involved on whether it shows signs of delivering outcomes as intended.

Early prototyping takes a policy idea and makes it tangible, to allow staff and users to understand how it might work and provide initial feedback on design. Later stages of prototyping test either the whole policy at small scale or specific elements of the policy. It is critical to test the riskiest assumptions about how the policy will achieve its outcomes, to see if they hold. This may identify that a policy is unlikely to achieve its outcomes at an earlier stage than conventional evaluation approaches.

An additional benefit of prototyping is that it collects insight from those who will deliver and/or be impacted by a policy early, when it is still possible to use these insights to inform and refine the development of policy.

Prototyping will enable decision makers to assess three fundamental questions when policy is being designed:

  1. Is there demand? Assess whether the people who deliver or receive the intervention require it, to avoid rolling out policies with low take-up rates.
  2. Does it show promise? Learn quickly whether the intervention shows signs of working and identify any potential concerns. In many cases, this means we will refine the prototyped intervention to create another prototype. At this stage, it is not possible to definitively conclude that an intervention works, as it would be with a full evaluation, but it is possible to get a strong signal that the intervention will not work.
  3. Can it scale? Prototyping can identify the critical elements of the intervention that would need to be in place for it to be scaled more broadly. Scalability is an important consideration for determining whether an intervention is technically feasible and could represent good value for money.

Early prototyping projects include exploring the feasibility of detecting drug use in wastewater so it can be used as a drug surveillance measure in prisons, trialling fast-track methods for dealing with prisoner misconduct, and testing a new method for quantifying and communicating risk scores with the Probation Service to help keep the public safe.

By supporting decision makers to rapidly try out their ideas in situ, prototyping will provide vital early insights and learning to inform policy design when it is still possible to make changes without incurring large costs. Some ideas will need a second round of prototyping, some will show sufficient promise that they need to be evaluated more robustly and some will be stopped.

Action Ten: Aid decision making through consistent, timely and monetised outcome measures

Decision makers are routinely required to compare different policy options to achieve the MoJ’s outcomes. To facilitate effective decision making, where possible we will work with internal and external experts to develop outcome measures to be used in impact evaluation across each of the MoJ’s priority outcome areas.

These outcome measures will make it easier to quantify and compare the impacts of different policies, allow decision makers to select the policies that deliver the best value for money, and support the designers and deliverers of programmes to improve the value for money of existing and future interventions. This requires value for money evaluation, which assesses whether a policy is a good use of resources by comparing the costs with the benefits.

To support decision makers to deliver value for the taxpayer and to improve outcomes for the public within the resources allocated to us, we will use cost and benefit data from our interventions, in line with HM Treasury Green Book guidance, to estimate and compare the cost-effectiveness of different interventions. We will standardise measures of costs and benefits so that different interventions and programmes can be compared on a consistent basis.

Where possible, we will also make comparisons with alternative ways of delivering our objectives: for example through other parts of the justice system, or through interventions in other areas which improve outcomes such as crime reduction. To accelerate the development of value for money insights, we will explore the scope to incorporate cost data into previous evaluations so that cost-effectiveness can be assessed. In doing so we will assess how far this provides reliable assessments for use in decision-making.

Alongside the activity to develop monetised outcome measures across the MoJ’s strategic outcomes, we will take steps to improve the timeliness of our evidence. One way we will achieve this is by developing proxy or interim outcome measures to complement longer term outcome measures. This will be particularly important for reducing reoffending, one of the MoJ’s priority outcomes, where there is an inherent delay before reoffending data becomes available. The identification and validation of a reoffending proxy will give an earlier indication of policy impact whilst we wait for longer term reoffending outcome data.
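
As an illustration of the validation step, the sketch below checks how strongly a candidate interim proxy tracks the long-term outcome. The data are simulated, not real MoJ figures; in practice, a strong and stable relationship across cohorts would support using the proxy for an early read on policy impact.

    import numpy as np

    # Simulated validation of a candidate proxy (e.g. employment six weeks
    # after release) against the long-term outcome (e.g. proven reoffending
    # within 12 months). All data below are synthetic.
    rng = np.random.default_rng(42)
    n = 1000
    proxy = rng.normal(size=n)                    # candidate interim measure
    long_term = 0.6 * proxy + rng.normal(size=n)  # synthetic long-term outcome

    correlation = np.corrcoef(proxy, long_term)[0, 1]
    print(f"Proxy vs long-term outcome correlation: {correlation:.2f}")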

Action Eleven: Make our evidence accessible

We remain committed to the prompt publication of all badged Government Social Research Reports and analysis that is judged to be of acceptable quality, as stated in the Government Social Research Publication protocol.

We believe making our evaluation evidence accessible is the best way to support decision makers to learn from experience and feed findings from previous evaluations back into the design and implementation of future policies. Publishing our evaluations also strengthens accountability by demonstrating what outcomes have been achieved with the spending of taxpayers’ money.

A summary of our evaluation activity will continue to be published in the MoJ’s Outcome Delivery Plan. For the period covered by this Strategy, we will register our evaluations on the upcoming cross-government Evaluation Registry.

A specialist Evidence and Partnerships Hub has been formed within the MoJ to maximise the use and impact of evidence for decision making. To make best use of the existing national and international evidence base, an internal interactive Evidence Library has been developed, providing streamlined access to critical evidence across policy themes.

Analysts will be required to upload findings to the library as they emerge throughout the evaluation. The Evidence and Partnerships Hub is commissioning a series of evidence reviews to improve the evidence base across Departmental priorities, as outlined in the Areas of Research Interest (2020).

Action Twelve: Improve the generation and use of evidence by encouraging a positive learning culture

A key message from this Strategy is that generating and using evidence to deliver impact requires a culture where learning is seen as everyone’s priority. To develop a culture where findings are routinely used to inform decisions, we need to embed learning into business-as-usual processes and activity.

We have already started to promote the value of evidence through a monthly seminar series for academic researchers to present findings that address evidence gaps in the MoJ ARI. Seminars began in October 2021 and, at the time of writing, there have been 16 seminars with over 1,500 attendees, showing the appetite for this type of academic engagement.

To continue building a learning culture, we are working with Brink Consultancy to explore the barriers that need to be overcome, and the incentives required, if the MoJ is to become the leading Department for how it systematically learns and uses its evidence to improve outcomes.

This work will develop a set of system-level proposals to encourage a strong and safe culture for evaluation, prototyping and learning within the MoJ. These proposals will highlight areas where changes to the system can have a positive effect in enabling a learning culture to develop.

Next steps

This Strategy has outlined twelve actions to ensure robust and timely evidence sits at the heart of policy and operational decisions. This will help the MoJ to improve outcomes and deliver value for the taxpayer. To realise this ambition the MoJ will:

  1. Introduce common principles for prioritising our evaluation resource.
  2. Embed evaluation requirements in business cases.
  3. Establish a Strategic Evidence and Evaluation Committee to prioritise analytical resource.
  4. Systematically track and monitor evaluation and prototyping activity.
  5. Invest in a centre of expertise to support systematic learning.
  6. Harness expertise through a new Evaluation Support Group.
  7. Improve our data to deliver better evaluation and prototyping.
  8. Strengthen methodological scrutiny of evaluation plans.
  9. Utilise prototyping to quickly gain early insights.
  10. Aid decision making through consistent, timely and monetised outcome measures.
  11. Make our evidence accessible.
  12. Improve the generation and use of evidence by encouraging a positive learning culture.