Research and analysis

DWP Evaluation Strategy

Published 21 February 2023

1. Foreword from the DWP Chief Analyst and Chief Scientific Advisor

I am delighted to introduce the evaluation strategy for the Department for Work and Pensions. DWP has a strong tradition of evaluation of its programmes and policies, going back to predecessor Departments. We are committed to building on that and improving further, as evidenced by new initiatives such as our Employment Data Lab, and the establishment of our Methods Advisory Group. The purpose of this document is to set out our current approach to ensuring that we plan, conduct and use our evaluation resources to maximum effect.

2. Introduction

As set out in DWP’s Outcome Delivery Plan, evaluation is a key enabler for delivering our priority outcomes. Evaluation evidence from previous policies and programmes is used to develop new initiatives to meet the objectives set out in the Outcome Delivery Plan. The evaluation of new and current initiatives will help to ensure they are supporting these objectives as intended, with that evidence going on to inform future design. This document sets out our approach for ensuring that our evaluation work delivers high quality evidence on the right questions at the right time. It covers all types of evaluation – process, impact and value for money.

Our approach is based on four key principles:

  • Strong and proportionate evaluation design, following best practice in the field, including the guidance of the Magenta Book.

  • Expert evaluation resource – evaluation work led and conducted by DWP analysts with a wide range of expertise, supported by a strong capability-building approach, and drawing on external expert advice and resource as needed.

  • Rigorous quality assurance.

  • An active approach to knowledge management.

All of this is done within a centralised governance framework in which the Chief Analyst, reporting directly to the Permanent Secretary, is ultimately accountable for all analysis conducted within DWP. In this the Chief Analyst is supported by some centralised functions, and by a Senior Analytical Leadership Team comprising Lead Analysts from all parts of the Department, and the heads of the analytical professions.

The following sections expand on each of these principles in turn.

3. Strong and proportionate evaluation design

The need for new evaluation work will normally be identified jointly by the Lead Analyst assigned to the relevant area of the Department’s work and their policy or operational counterparts. This may relate to new policies, programmes or processes designed to meet the objectives set out in DWP’s Outcome Delivery Plan, or to existing ones where there is a need for additional evidence (including for instance the need to update earlier evaluation work).

Having agreed the research questions which need to be answered – in most cases, supported by the development of a Theory of Change – analysts will identify options for answering them, including how the new provision is implemented, the collection of relevant monitoring information, and appropriate research. The approach chosen, in discussion with policy/operational colleagues, will need to be proportionate – taking into account the costs of the different options and of the intervention to be evaluated; the robustness of alternative approaches; the strength of existing evidence for the intervention; risks of unintended consequences; the timescale within which results are needed; and the availability of necessary resources (both internal and external).

Wherever appropriate, evaluations will exploit the extensive administrative data available to DWP, in particular data on benefit claims and, subject to permission from HMRC, data on earnings and employment. These data have considerable advantages for evaluation purposes in terms of sample size and granularity, as well as being far less expensive than collecting new survey data.

In addition to these internal processes, there can be external requirements for evaluation, either as a condition for HM Treasury approval for spending proposals, or from the Infrastructure and Projects Authority if the policy or programme is within their portfolio.

4. Expert evaluation resource

DWP has a large cadre of analysts drawn from all four analytical professions, many of whom have experience in the design and conduct of evaluation. This includes for example an In-House Research Unit, available to conduct qualitative studies as part of a process evaluation, and experienced econometricians able to conduct sophisticated impact assessments.

These analysts are supported by a range of mechanisms offered or coordinated by our Central Analysis and Science Directorate, including for example:

  • Initial advice and guidance on evaluation issues.

  • Referral to appropriate internal and external resources for further advice. Internally, for example, we have a Trials Peer Network, an impact evaluation advisory group, an ethics advice panel and a network of Theory of Change champions. Externally, we may reach out to members of DWP’s Methods Advisory Group or the Evaluation Task Force, or identify experts in specific topics or methods using either our own contacts or the Cross-Government Evaluation Group.

  • Curation of a range of useful resources, both internal and external, including guidance documents and examples of past evaluations.

  • Capability building exercises – for example, we have delivered training on developing a Theory of Change, and commissioned external experts to develop training in trial design and analysis tailored to the DWP context. Such exercises of course supplement the continuous professional development which all staff undertake.

Once the evaluation approach is agreed, the actual research and analysis is carried out by a combination of expert internal resource and external resource commissioned through competitive tendering.

5. Quality assurance

All evaluation outputs, whether externally commissioned or produced internally, are subject to rigorous quality assurance, following the principles set out in the Aqua Book and in the Analysis Function standards.

For internal work, quality assurance will usually include some element of checking by analysts outside the immediate team as well as those conducting the analysis. The overall methodological approach, and the emerging results, will be discussed with appropriate experts within the Department – including policy and operational colleagues with subject matter knowledge as well as analysts. For many projects, we will establish a formal steering/advisory group to provide this support throughout the lifetime of the project. As set out above, we can bring to bear expertise in a wide range of evaluation methodologies, both qualitative and quantitative. Where appropriate, external review will also be sought, usually from a knowledgeable academic.

For external work, having a rigorous approach to quality assurance is an essential qualification for any organisation bidding to conduct research for DWP. The project manager for any externally commissioned research will ensure that the documented approach has been followed, through close ongoing liaison with the research team. Where appropriate, this may be supplemented by setting up an external advisory panel to support the work.

6. Knowledge management

A vital component of our evaluation strategy is to ensure that all our evaluation work, alongside relevant external work, can be used to inform future policy development to deliver on the objectives set out in the DWP Outcome Delivery Plan. Our main ways of achieving this include:

  • Publication of outputs on GOV.UK, either as research reports or ad-hoc statistical publications.

  • Internal dissemination on our DWP Knowledge Bank which includes both published reports and internal analysis, and a search tool which operates simultaneously across our internal holdings and wider published literature.

  • Support from our Research Library to assist those looking for relevant evidence.

  • Organisation of knowledge sharing seminars to share methodological insights as well as findings – delivered by externally commissioned researchers, internal researchers and independent academics.