Home Office Evaluation Strategy
Published 5 April 2023
1. Introduction
The Home Office has a challenging programme of work to deliver, covering crime reduction, policing, national security, immigration and border security. Evaluation allows us to robustly identify which policies and aspects of our delivery are effective. The resulting learning enables the development of more effective policies and programmes. Evaluations are therefore critical to effective decision making.
Our vision is to continue to embed evaluation practice within the Home Office to ensure it is an integrated part of our policy and operational culture. The aim of our strategy is reflected in this statement:
‘This means that evaluative and reflective practice is part of the “way we do things around here” where all colleagues seek, learn, and think critically about the evidence that underpins their actions.’ (The Magenta Book, GOV.UK, page 83)
This document outlines the Home Office strategy for the evaluation of the department’s policies and programmes and supports our published ‘Areas of Research Interest’ (Areas of research interest relevant to the Home Office, GOV.UK).
2. Our aims are to:
- ensure that evaluation is embedded as a core component of key Home Office programme and delivery plans
- ensure Value for Money (VfM) is considered at the design stage of all our evaluations
- maintain oversight of our evaluation activity across the department to ensure we are using analytical resource in the right places and share best practice
- ensure we use learning from previous evaluations when developing new policies and programmes
- train, upskill and maintain an appropriate level of evaluation knowledge among analysts in the department
- ensure that our policy and operational colleagues have a sound understanding of evaluation and monitoring and how this can help them deliver their policies and programmes effectively
- ensure that we have high quality evaluations that draw on the most appropriate methods to provide actionable insights
- encourage Public Sector Equality Duty (PSED) considerations at an early stage of evaluation planning
- ensure transparency of our evaluations through publication
The overall goal is to support the development of effective, evidence-based policies and programmes, thereby enabling the best possible investment of public money and outcomes for members of our society.
3. Delivering our evaluation strategy
Our strategy focuses on further embedding evaluation into the Home Office culture. This includes: training and upskilling analysts, policy and operational colleagues; ensuring we have the right enablers in place to drive change; and achieving impact by maintaining oversight and transparency of our evaluation activity.
Annex A provides a snapshot of our current evaluations as of December 2022.
4. Train, upskill and maintain
To ensure that the evaluation knowledge and skills of our analysts remain up to date we will continue to develop and refresh their expertise. Our induction academy will continue to provide introductory evaluation training for all new analytical starters. We will identify key evaluation training courses and actively encourage all our existing analysts to attend.
The focus and robustness of programmes are strengthened when work is done early in the design stage to clearly identify how activities relate to outcomes and how behaviour will be changed. We will ensure that our policy and operational colleagues understand the importance of using Theory of Change[footnote 1] and Logic Mapping[footnote 2] to deliver their policies and programmes effectively.
We are currently developing and piloting a new training toolkit to support more effective Theory of Change delivery with our internal stakeholders. One policy team will receive the training along with analysts, so they can lead this kind of activity in the future, instead of it being regarded as an ‘analyst only’ product. If this approach works well, the aim is to roll it out across the Home Office.
The Home Office Evaluation Network is made up of analysts knowledgeable in, and keen to learn more about, evaluation. It will run a session for the Home Office Policy Week on evaluation and Theory of Change so that we continue to grow awareness of the benefits of developing programmes and policies with evaluation built in from the start.
We will continue to ensure, through these training sessions, that colleagues are clear on the distinction between benefits realisation (which is focussed on project monitoring and delivery) and evaluation (which focusses on accountability and learning). Using both of these enables the department to continue to deliver effective programmes on schedule and know whether they achieved what they set out to, how the policy or programme was implemented, what effects it had and why.
We will also run a biennial skills audit of our analytical community. The resulting data will be used to commission evaluation training as necessary. This may then be extended to also cover policy and operational colleagues.
We will keep abreast of new developments in evaluation methodologies, drawing on our links with, for example, the Evaluation Society, cross-government evaluation networks, and the Evaluation Trial Advisory Panel. We will ensure that new learning is shared across the Home Office Evaluation Network and beyond.
5. Collaborative working
Our analysts will continue to collaborate with finance, policy, and operational colleagues to ensure that evaluation is embedded as a core component of programme or project delivery plans at the outset. This will help ensure financial provision for evaluation is included in the economic and financial cases for new policies and programmes, and that the right projects are evaluated, with the right people supporting and steering the evaluation at the right time. It will also keep evaluation at the front of colleagues’ minds so that key decisions on interventions remain evidence based.
We will also continue to encourage our analysts to build networks with external organisations and academics so that we continue to keep up to date with the evidence base and new methodological developments.
6. Value for money (VfM)
VfM and cost benefit analysis will be considered at the design stage of all our evaluations. Where appropriate, the methods used to address VfM and cost benefit analysis will follow those outlined in the Magenta Book (The Magenta Book, GOV.UK, page 49). This will ensure we can justify programme spend and make efficiencies in Home Office programmes using robust evaluation evidence. In addition, the evaluation training we provide to policy and operational leads will continue to raise their awareness of VfM and cost benefit analysis when developing new policies and programmes, as well as of drawing on existing evaluation evidence of what works.
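To make the arithmetic behind a cost benefit assessment concrete, the sketch below discounts a hypothetical programme’s costs and benefits to present values and derives a net present value and benefit-cost ratio. The 3.5% rate is the Green Book’s standard social time preference rate; the programme figures themselves are invented purely for illustration and are not drawn from any Home Office evaluation.

```python
# Illustrative value-for-money arithmetic (a sketch, not official method).
# 3.5% is the Green Book's standard social time preference rate; the
# cost and benefit figures below are hypothetical.

DISCOUNT_RATE = 0.035

def present_value(flows, rate=DISCOUNT_RATE):
    """Discount a list of annual flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical programme: £10m spent up front, £3m of benefits a year for 5 years.
costs = [10.0]                 # £m, incurred in year 0 (undiscounted)
benefits = [0.0] + [3.0] * 5   # £m, accruing in years 1 to 5

pv_costs = present_value(costs)
pv_benefits = present_value(benefits)

npv = pv_benefits - pv_costs   # net present value
bcr = pv_benefits / pv_costs   # benefit-cost ratio

print(f"PV benefits: £{pv_benefits:.2f}m, NPV: £{npv:.2f}m, BCR: {bcr:.2f}")
```

A BCR above 1 (here roughly 1.35) indicates discounted benefits exceed discounted costs; in practice an appraisal would also account for optimism bias, sensitivity analysis and unmonetised impacts, as the Green Book sets out.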
7. Learning from previous evaluations
We will ensure that lessons from evaluations are fed into new policy and programme designs, so we continue to learn from and build on our evaluation knowledge. As part of this, we will continue to start all evaluations with a Rapid Evidence Assessment or Systematic Review as appropriate.
We aim to ensure that our evaluation evidence is shared widely with all our stakeholders by publishing it, so that our evaluation activity has the greatest impact and we demonstrate accountability to wider society.
We will embed compulsory review points for our larger scale, long-term evaluations. The aim is to review whether the evaluation is still required, whether we are still getting value from it, whether the intervention has developed in a new direction, and whether we are still measuring the right things. We are aware that some longer-term evaluations can be overtaken by events and are no longer required. Nevertheless, we can still learn a lot from them to inform future evaluations.
8. High quality evaluations
As for other government departments, the Home Office approach to evaluation is shaped by HM Treasury’s Green Book guidance on appraisal and evaluation and its Magenta Book guidance on evaluation. Our research and analysis is informed by Government Social Research guidelines (the ‘GSR Code’, 2018, and ‘GSR Ethical Assurance for Social and Behavioural Research’, 2011), the Aqua Book and wider good practice.
Using this guidance and our audit results, we will continue to set expectations on what can be evaluated and maintain oversight of our priority evaluations. We will continue to sense check evaluation methodologies of high-profile evaluations to ensure that the proposed methodology is robust and proportionate. Ethics, Analytical Quality Assurance and General Data Protection Regulation considerations will continue to form part of that challenge process.
We expect a higher level of Quality Assurance for high profile, high-cost (>£250K) evaluations. We will continue to reflect carefully on how evaluation resource is best deployed using spend, profile and potential impact as criteria to consider.
Our analysts will be empowered to select the most appropriate approach for their evaluations. The use of experimental methods will be encouraged, but we are aware that these may be challenging or not feasible. We will continue to ensure we use the most appropriate and robust methods, and will aim for high scores on the Maryland Scientific Methods Scale (Farrington, D.P., Gottfredson, D.C., Sherman, L.W. and Welsh, B.C., 2003. ‘The Maryland scientific methods scale’. In Evidence-based Crime Prevention, pp. 13-21. Routledge) where possible for our evaluations. All evaluation reports will continue to be externally peer reviewed prior to publication.
In addition, we will ensure Public Sector Equality Duty considerations are addressed when designing evaluations.
9. Ensuring impact
To help ensure impact of our evaluation activity we will continue our annual evaluation audit. Future evaluation audits will be aligned to Spending Review/financial year timeframes and will continue to help ensure our evaluations:
- deliver value
- are adequately resourced
- have the right evaluation aims and methods
- deliver evidence at the right time, in the right places, to help inform key Home Office business
10. Transparency
We will share our evidence with stakeholders and the public to demonstrate what works and what is value for money. To do this, all our evaluations (both internally conducted and commissioned) will be published unless there are good reasons not to do so (for example, national security reasons). We will follow the Government Social Research Publication Protocol, which is consistent with the statutory Code of Practice for Statistics (UK Statistics Authority). Compliance with the GSR Publication Protocol will help ensure that Home Office evidence is released into the public domain in a manner that promotes public confidence and scientific rigour. By doing this we demonstrate accountability to members of society through publication of our evaluation evidence.
Table 1: Evaluation strategy summary table
Aim | Detail/Activity | Success Indicators
---|---|---
Train, upskill and maintain analyst knowledge of evaluation | We will conduct a skills audit every 2 years and use the results to develop and provide training. Our Home Office - Analysis and Insight (HOAI) Induction Academy will continue to provide introductory training on evaluation for all new analyst starters. We will identify key evaluation training courses that we will encourage all our existing analysts to attend to refresh their expertise. We will keep abreast of new developments in evaluation methodologies, drawing on our links with, for example, the Evaluation Society, cross-government evaluation networks, and the Evaluation Trial Advisory Panel, and ensure this is shared across the Home Office Evaluation Network | The skills audit results help to develop evaluation training package(s) for HOAI analysts. The Home Office has a reputation for evaluation expertise |
Ensure that our policy and operational colleagues have a sound understanding of how evaluation and monitoring can help them deliver their policies and programmes effectively | We will continue to provide ad hoc training on evaluation to our policy and operational colleagues | Our policies and programmes are evidence based – with evaluation and monitoring built in from the start |
Evaluation is embedded as a core component of programme or project delivery plans at the outset | Empowering our analysts to work collaboratively with finance analysts, policy, and operational colleagues across the department | Financial provision for evaluation is included in economic and financial cases for new policies and programmes |
Value for Money (VfM) and cost benefit analysis will be considered at the design stage of all our evaluations | We will do this through continuing to raise awareness with all our analysts in VfM and cost benefit analysis, so they are aware of the best methodological approaches to use for their evaluations. In addition, the evaluation training we provide to policy and operational leads will continue to raise their awareness of VfM to ensure this is at the front of their minds when developing new policies and programmes, as well as drawing on existing evaluation evidence of what works | We can justify programme spend and make efficiencies in Home Office programmes using robust evaluation evidence |
Use learning from previous evaluation evidence when developing new policies and programmes | We will continue to start all evaluations with a Rapid Evidence Assessment or Systematic Review. We will embed compulsory review points for our larger scale, long-term evaluations. The aim is to review whether the evaluation is still required, whether we are still getting value from it, and, if the intervention has developed, whether we are still measuring the right things | Our policies and programmes draw on up to date research and evaluation evidence. We continue to learn and build on our evaluation knowledge. Our evaluations remain relevant |
Ensure that we have high quality evaluations | We will continue to sense check evaluation methodologies of high-profile evaluations to ensure that the proposed methodology is robust and proportionate. Ethics/AQA/GDPR considerations also form part of that process. The use of experimental methods will be encouraged for evaluations where possible. We will continue to encourage colleagues to consider Public Sector Equality Duty considerations when designing their evaluations | We have high quality evaluations and the Home Office has a reputation for evaluation expertise |
Maintain oversight of evaluation activity across the Home Office | We will conduct an evaluation audit across HOAI each year | Our evaluations deliver value; are adequately resourced; have the right evaluation aims and methods; and deliver evidence at the right time, in the right places, to help inform key Home Office business |
Ensure transparency of our evaluations through publication | We share evaluation evidence with stakeholders by publishing our evaluation evidence to demonstrate what works and value for money | The evidence base for Home Office business areas further develops with knowledge sharing to our external stakeholders. We demonstrate accountability to members of society through publication of our evaluation evidence |
Annex A
Snapshot of externally commissioned evaluations
Evaluation Title | Description | Start Date | Expected End Date
---|---|---|---
Safety of Women at Night Fund | Determine the impacts and draw out learnings from the Fund | Q4 2021 | Q1 2023 |
Devolving Child NRM Decisions pilot evaluation | Evaluate pilot to devolve child NRM decisions to local authorities | Q2 2021 | Q1 2023 |
Independent Child Trafficking Guardians Evaluation | Evaluate changes to the Independent Child Trafficking Guardians service recommended by the independent review of the Modern Slavery Act | Q3 2021 | Q1 2023 |
Modern Slavery Prevention Fund Evaluation | Evaluate the extent to which the MS Prevention Fund has met its objectives | Q4 2021 | Q1 2023 |
Project ADDER Evaluation | Estimate the impact and value for money of Project ADDER (Addiction, Diversion, Disruption, Enforcement and Recovery) and generate learning about what works and why | Q1 2021 | Q3 2023 |
Evaluation of Operation Encompass and review of frontline officer response to domestic abuse incidents involving children | Build understanding of police responses to domestic abuse incidents involving children including use and effectiveness of the Operation Encompass school notification scheme | Q4 2021 | Q3 2023 |
Support for Migrant Victims Process Evaluation: Support for Migrant Victims Pilot Scheme | To evaluate a £1.4 million pilot scheme intended to support victims/survivors of domestic violence with No Recourse to Public Funds | Q2 2021 | Q1 2023 |
Evaluation of the Safer Streets Fund, Round 2 | To identify the barriers and facilitators to implementation of situational crime interventions and to measure the impact on police recorded crime and public perceptions of crime and safety | Q2 2021 | Q1 2023 |
Evaluation of the Safer Streets Fund, Round 3: Protecting public spaces | To identify the barriers and facilitators to implementation of situational crime interventions and to measure the impact on police recorded crime and public perceptions of crime and safety | Q3 2021 | Q1 2023 |
Evaluation of the Safer Streets Fund, Round 4 | To identify the barriers and facilitators to implementing SSF funded interventions and understanding their impact on neighbourhood crime, ASB and VAWG | Q2 2022 | Q2 2024 |
Evaluation of Violence Reduction Units | To understand how VRUs are implementing a public health approach to violence reduction and assess the combined impact of both VRU and Grip hotspot policing funding | Q3 2019 | Q1 2023 |
Creating Opportunities Fund Evaluation | To identify the barriers and facilitators to implementation of COF and to measure the impact on self-reported offending and improvements in employment | Q1 2021 | Q2 2023 |
Serious Violence Social Media Hub | To assess the feasibility of measuring any impact of the Social Media Hub on Criminal Justice System outcomes | Q1 2020 | Q1 2025 |
Serious Violence Duty Evaluation | To understand implementation of the Serious Violence Duty | Q4 2022 | Q1 2025 |
Serious Violence Reduction Orders (SVRO) pilot | To assess the impact that SVROs have on reoffending rates and serious violence before consideration is given to whether SVROs are rolled out nationally | Q4 2021 | Q2 2025 |
Desistance and Disengagement Programme evaluation | To assess the impact of the programme on desistance and disengagement from terrorism and to explore effectiveness of programme implementation | 2020 | Q1 2023 |
Channel Programme evaluation | To assess the impact of Channel interventions on programme recipients and to explore the effectiveness of programme implementation | Q1 2023 | Q2 2025 |
Protect Training Evaluation | Assess efficacy & improvements for core counter-terrorism operational training products, aiming to upskill public security behaviours and raise vigilance (linked to Martyn’s law) | 2020 | Q1 2023 |
New Plan for Immigration (NPI) evaluation | To understand the impact, VfM and implementation of NPI as a whole | Q2 2022 | Q1 2024 |
Future Border and Immigration System (FBIS) evaluation | Evaluate the efficiency and effectiveness of FBIS visa routes, supporting VfM assessments | Q1 2022 | Q1 2025 |
Evaluation of the Compliant Environment | In response to Recommendation 7 of the Windrush Lessons Learnt Review, the Home Office committed to a long-term evaluation of how the compliant environment measures operate, both individually and cumulatively, to ensure that both its rule and protections work effectively | Q3 2020 | Ongoing |
Qualitative evaluation of VPRS/VCRS resettlement schemes | Qualitative evaluation of VPRS/VCRS (Ipsos), focusing on delivery of schemes and integration outcomes. Reports finalised with ministerial approval to publish | Q3 2017 | Q1 2023 |
Evaluation of Home Office resettlement schemes (UKRS, ACRS, ARAP) | Mixed-methods evaluation of the Home Office’s new resettlement schemes (UKRS, ACRS, ARAP), involving secondary analysis of data collected by Home Office, and series of focussed pieces of qualitative research | Q3 2023 | Q4 2025 |
Refugee Transitions Outcomes Fund (RTOF) Evaluation | Mixed-methods evaluation of delivery and refugee outcomes from this programme | Q1 2022 | Q1 2024 |
Refugee Employability Programme (REP) Evaluation | Mixed method evaluation comprising a process and impact evaluation of the scheme. Also includes a rapid evidence review | Q2 2023 | Q3 2025 |
1. Theory of Change describes and illustrates how and why a desired change is expected to happen in a particular context (Theory of Change) ↩
2. Logic mapping is a systematic and visual way of presenting the key steps required to turn a set of resources or inputs into activities that are designed to lead to a specific set of changes or outcomes (Logic Mapping: hints and tips guide, GOV.UK) ↩