Policy paper

DCMS Monitoring and Evaluation Strategy

Published 10 November 2022

Foreword from the Permanent Secretary

Building a strong evidence base for all policy decisions helps DCMS in designing and delivering interventions to drive growth and enrich lives across the nation. This strategy seeks to further develop DCMS’s culture of robust and effective monitoring and evaluation. The renewal of this strategy comes at a crucial time when the pressure on public finances and economies worldwide is considerable, highlighting the importance of delivering interventions that are the best value possible for taxpayers’ money.

The strategy sets out how we will ensure that the department’s monitoring and evaluation activities are timely, proportionate and efficiently delivered. It will ensure that these practices are firmly embedded across the organisation, and not solely within the analytical profession. It will do this by ensuring that there are clear expectations for officials; processes to increase accountability; more high quality opportunities for learning and development; and mechanisms for policy to be continuously informed by the lessons learned on previous interventions.

The publication of this document is a statement of intent by me and my DCMS leadership team that we will take an active and personal role in reinforcing this culture and approach everywhere we can. I am pleased to introduce this strategy, building further on the good practice we have already put in place, to ensure that our policies continue to drive growth and enrich lives for all who need them.


Sarah Healey CB
Permanent Secretary of the Department for Digital, Culture, Media and Sport (DCMS)

Executive summary

Evaluation is the cornerstone of evidence-based policy making. It helps policy officials understand impact and make better decisions. The Department for Digital, Culture, Media & Sport (DCMS) needs effective evaluation practices to credibly demonstrate the impact of its efforts, make the best decisions and promote and defend its value.

This Monitoring and Evaluation (M&E) Strategy aims to ensure that DCMS embeds proportionate, robust and impactful evaluation, thereby increasing the volume, rigour and influence of evaluation across the department. This strategy aims to promote robust practices of monitoring and evaluation, so we can best inform decisions on future policy making.

The vision: Our vision is aspirational, ambitious and long term, enabling continuous improvement over the years. We envisage a department with a strong evaluation culture at its core, one committed to maintaining a robust evidence base.
The mission: Whilst the vision is the future aspiration, the mission is what we set out to achieve and is delivery focused. It enables us to build our objectives and plan of action: proportionate, high-quality and innovative monitoring and evaluation is produced across the department and used to inform policy design, development, implementation and legislative options.

Three core outcomes

To achieve DCMS’s vision, we will work towards 3 core outcomes:

  1. The evidence produced from monitoring and evaluation activity is timely, proportionate and needs driven: Evaluations are designed, commissioned and delivered at the right time and are strategically targeted towards areas of greatest impact (e.g. areas of greater evidentiary need).

  2. Monitoring and evaluation is consistently, reliably and confidently used in decision making: Evaluation findings and lessons learned are accessible, actively and clearly communicated, and firmly embedded into the policy making and policy delivery processes, with transparent quality assurance.

  3. Monitoring and evaluation is firmly embedded across the organisation: DCMS will have a strong standard of evaluation literacy, with policy officials and analysts in the department having access to regular and emerging evaluation knowledge through a well promoted positive learning culture.

Introduction

The Department for Digital, Culture, Media & Sport (DCMS) helps to drive growth, enrich lives and promote Britain abroad. Transformation in DCMS over the past few years (growing substantially in size as well as evolving in policy remit) has challenged the department to accommodate innovative and fast-growing sectors and to ensure we continue to deliver quality and cost-efficient policies and interventions.

Monitoring[footnote 1] and evaluation[footnote 2] (M&E) is vital in helping the government understand what works and what doesn’t work from its interventions and therefore improves the design and implementation of future ones. It enables the department to understand when and how its objectives and goals are achieved. Ultimately, M&E seeks to understand and improve DCMS’s work and impact on the ground.

There is an increased focus on evaluating effectively across government. The Evaluation Taskforce has been established at the Cabinet Office and HM Treasury to drive improvements in evaluation and how it informs future operational and spending decisions. There is greater emphasis and scrutiny from organisations on the quality and robustness of our evaluations, including from the National Audit Office.

DCMS’s M&E Strategy seeks to further develop a culture in which M&E evidence is integral to decision making for strategy, policy and programme development. It details how the department will continue to promote, prioritise, monitor and enforce stronger standards of quality in evaluation.[footnote 3]

An M&E strategy is needed to:

Enable better policy making by growing the evidence base: With high-quality and readily available evidence DCMS teams can put forward robust advice to ministers and make informed decisions about the best way to allocate resources.

Increase accountability: DCMS has a duty to be transparent to taxpayers. With robust evaluation, along with in-flight monitoring, the government can be accountable for departmental spend, not only at the end of a programme but throughout its lifespan.

Maximise and demonstrate DCMS’s value: DCMS can be bolder in demonstrating its success with credible evaluation evidence. This evidence also enables future funding by showcasing the department’s impact.

The vision

DCMS’s vision is to have a strong evaluation culture at its core and to continue to build and maintain a robust evidence base.

The mission is to ensure proportionate, high-quality and innovative monitoring and evaluation of interventions, produced across the department and used to inform policy design, development, implementation and legislative options.

To continue to drive cultural change there is a need to embed the strategy effectively over time. Therefore, DCMS will deliver this strategy over the next 3 years (2022 to 2025), at which point a full review will be conducted and this M&E strategy updated.

To achieve this vision, DCMS will continue to work towards 3 core outcomes:

  1. M&E evidence produced is timely, proportionate and needs driven: Performance metrics and evaluations are designed, commissioned and delivered at the right time and are strategically targeted towards areas of greatest impact (e.g. areas of greater evidentiary need).

  2. M&E is consistently, reliably and confidently used in decision making: Evaluation findings and learnings are accessible, actively and clearly communicated and firmly embedded into the policy making and policy delivery processes, with transparent quality assurance. Performance monitoring will also be embedded and available to strategic decision makers to inform risk.

  3. M&E is firmly embedded across the organisation: DCMS will have a strong standard of evaluation literacy, with policy officials and analysts in the department having access to regular and emerging evaluation knowledge through a well promoted positive learning culture.

We will track the delivery of these outcomes regularly. To achieve them, DCMS professions will continue to work collectively to create a culture of quality M&E as standard practice.

Current DCMS monitoring and evaluation practices

DCMS has implemented several changes to its M&E practices in the past few years. These include:

  • the recruitment of a Head of Evaluation to encourage join-up across the department and provide strategic coordination of evaluation activity across decentralised teams
  • increased central oversight and governance of high profile, big spend evaluations
  • increased central oversight for M&E in Comprehensive Spending Reviews,[footnote 4] budget and business case development processes
  • analytical support for metric identification and development for the DCMS Outcome Delivery Plan (ODP)
  • the development of a successful DCMS evaluation community, which meets every 6 weeks and provides a peer review function alongside wider peer support
  • the development of an evaluation tracker, updated quarterly, for monitoring current evaluations, and a log of completed evaluations
  • new guidance and teach-ins to support teams in the early stage planning of evaluation, including an evaluation ‘checklist’ of key considerations required during the business case development process

Through consultation across the department, 3 priority areas have been identified in which M&E practices can be improved:

  1. Roles and responsibilities: reinforcing the roles and responsibilities of analysts, policy officials, finance professionals and others to undertake and use M&E in a consistent and proportionate way.

  2. Processes: strengthening the close working between the central evaluation team and decentralised analytical teams. This will ensure alignment and embedding of M&E templates, guidance and tools in existing systems, making M&E a core and consistent component throughout policy making.

  3. Resourcing: further ensuring DCMS is strategic in where it targets evaluation expertise, firmly embedding a culture of learning, sharing and transparency.

This strategy

This strategy will provide accountability and a strong evidence base for future decision-making. M&E evidence will be key in building a compelling story about departmental purpose and impact, and the role of DCMS as a delivery department driving forward the government’s agenda.

DCMS will further develop an ecosystem where M&E can thrive effectively amongst the decision-making processes already embedded within the department. Evaluation will be needs driven and continue to support areas of weakness, such as evidence gaps, whilst also being ambitious. Monitoring data will be collected in real-time, enabling the department to respond to challenges at pace.

A collaborative approach will be adopted, creating a blended system where the majority of M&E activity and spend is decentralised across DCMS. Teams will be capable and confident in their ability to develop, design and deliver robust performance metrics and evaluation. Capability building, business case management, governance and feedback and learning will be supported by a central analytical team. Further governance, which crosses over the breadth of DCMS professions, will increasingly ensure that M&E is recognised as a responsibility for all and is embedded as standard practice.

To deliver this, there are 4 core objectives to the strategy:

  1. Prioritisation and proportionality.

  2. Operationalising performance monitoring and evaluation.

  3. Capability building.

  4. Evidence building and feedback.

DCMS’s approach to evaluation will continue to follow HM Treasury’s Green Book Guidance on Appraisal and Evaluation and its Magenta Book Guidance on Evaluation.

Core objective 1: Prioritisation and proportionality

Evaluation activity should be proportionate. There is an ongoing need to balance good quality M&E against other priorities to ensure that value for money is delivered for the taxpayer. It is also important that there is a greater focus on ensuring that priority evaluations are identified early and that teams have adequate resources and capability to deliver these.

To achieve this objective there are 3 areas of core focus:

Monitoring and tracking of M&E activity to develop a strong understanding of evaluation activity across the department, in order to support areas of greatest need and risk.

To achieve this, DCMS will:

Revise the existing evaluation tracker to ensure timely, accurate and useful monitoring of evaluations across all policy areas by December 2022. This will create a strategic dashboard, linking with the departmental ODP, to ensure DCMS can demonstrate the delivery of its priority outcomes. The dashboard will also capture what evaluations we are working on, their progress, and metrics to assess performance.

Conduct meetings every 2 months with policy and analytical teams, set up by December 2022. This will facilitate discussion of M&E challenges and risks, enabling support to be targeted where it is most needed.

Core principles to ensure ‘M&E standards’ are embedded within the policy making process from the outset and that the department has a consistent approach to M&E across all policy areas.

To achieve this, DCMS will:

Revise its ‘M&E standards’ by December 2022. These standards outline key principles that help to maximise quality, integrity and public value in evaluation practice. They will be produced through collaboration with internal and other government department experts to define expected levels of evaluation activity commensurate with the characteristics of each programme and its risks. The standards will be based upon core dimensions of ‘Useful, Credible, Ethical, Robust and Proportionate’.

Engagement across central and decentralised teams to ensure that expertise across teams is utilised to support areas of highest priority, identify areas for joint working and set a clear process for central and decentralised teams to engage and support one another.

To achieve this, DCMS will:

Formalise an M&E advice and support request process by March 2023. This will bring strategic consistency in allocation of resources and expertise.

Refresh the existing evaluation community across the department by February 2023. A clear Terms of Reference for membership of the group will hold members to account for their active participation. Meetings will have a core focus on identifying current or upcoming risks and challenges and identifying areas for collaboration and strategic cross-over. Engagement from teams across the department will be required.

Core objective 2: Operationalising monitoring and evaluation

In order to deliver a stronger culture on M&E, more mechanisms need to be in place to incentivise it. The requirement for M&E will be embedded into the framework for corporate governance and approvals so it is part of the key decision making processes in DCMS. This will ensure that M&E is automatically considered at the right stage of policy development. Close working across centralised appraisal teams will continue to ensure that concerns about any deviation from the requirement for the consideration of M&E are escalated.

Business Case (BC) development will have a strong requirement to clearly articulate what success will look like both during and after implementation of a policy or programme (within departmental BCs). These BCs are a requirement of all government spend allocated to DCMS through the Comprehensive Spending Review or budget process. This ensures good value in the use of public money and that policies and interventions are fully planned, with risks identified.

To achieve this, DCMS will:

Review and update BC templates and guidance by March 2023 to improve performance monitoring and evaluation plans for project development.

Implement (by March 2023) a requirement for M&E self-assessments from evaluation leads at early stages of planning, which will assess the level of involvement and oversight required (in line with Core objective 1, proportionality).

Impact assessments (IAs) are required to consider M&E and therefore, allow an early opportunity to influence the quality of evaluation plans. IAs are tools to assess the costs and benefits of regulatory proposals. Within DCMS, IAs are scrutinised by the Better Regulation Unit (BRU) and signed off by the Chief Economist or another senior analyst. Many of DCMS’s IAs are also independently scrutinised by the Regulatory Policy Committee.[footnote 5]

By utilising M&E considerations set out in IAs, DCMS will:

Ensure that (by March 2023) high-quality objectives, a sound theory of change or logic model and robust evaluation plans are a requirement in the BRU clearance and Chief Economist sign-off of DMAs and IAs.

Formalise (by March 2023) the regular cataloguing and sharing of post-implementation review (PIR) findings. This will help to ensure their use in the policy development process.

Implement a requirement for M&E self-assessments from PIR leads by March 2023, which will assess the level of involvement and oversight required.

Provide guidance and planning documents by March 2024 to support teams in their PIR planning, aligned with the PIR guidance of the Magenta Book.

Quality Assurance (QA) processes will be further developed to ensure effective assessment of quality throughout the evaluation lifespan, not just at final reporting.

To achieve this, DCMS will:

Develop guidance on the level of QA required for each specific evaluation by March 2024. The level of QA will be based on information collected in the self-assessment process (see points 1 and 2 above).

Implement a requirement by March 2023 for an effective and proportionate advisory/steering group for the design, delivery and reporting of high risk or contentious evaluations as well as those with the opportunity to generate greater learning.

Increase the scrutiny of high risk/profile evaluations by March 2023 via additional sign-off processes, based on information collected through the self-assessment forms.

Regular review processes of high risk/profile evaluations will hold teams more accountable to deliver as promised, enable the department to be an intelligent learner and allow for an adaptive and agile approach when evaluations are not being delivered as intended.

To achieve this, DCMS will:

Implement annual reviews for our ‘priority’ evaluations (including PIRs) by September 2023. The annual review will be conducted by lead policy officials and analysts and will identify where teams are delivering against specific programme evaluation commitments, identifying risks and areas for change.

Core objective 3: Capability building

Learning will happen at both an individual level and through the promotion of a departmental wide culture of learning and upskilling. Evaluation community members will collaborate further and ensure expertise is shared effectively. DCMS will continue to ensure that no policy or analytical team is ‘left behind’ and that analysts are continuously aware of advancing evaluation techniques. Alongside building upon internal guidance and tools and exploring impactful dissemination routes, key areas of focus are:

A learning curriculum to ensure that learning and development is readily available and easy to access.

To achieve this, DCMS will:

Develop a modular learning curriculum by March 2023 and ensure all analysts complete basic training (by March 2024) and are encouraged to attend refresher courses on a regular basis.

Embed training in policy profession learning opportunities (by March 2024) and encourage policy colleagues with BC, IA or evaluation roles to complete the training.

Technical standard development and resourcing considerations to identify gaps in evaluation expertise and efficiently target areas of risk (where central support may be required or upskilling is a priority).

To achieve this, DCMS will:

Develop a technical competency framework by March 2024, aligned to the Government Social Research (GSR) framework, to assess M&E and learning expertise across the department.

Ensure all decentralised teams have access to sufficient evaluation expertise which meets the technical standards by March 2025, flexing M&E expertise across the department to support in areas of strategic priority.

Membership of the evaluation community in order to continue to enable the sharing of experience and expertise.

To improve on this, DCMS will:

Continue the community-based approach to in-house support and capability building (by March 2023), drawing on existing experience and expertise.

Formalise membership of the community, recognising M&E analysts’ skill-sets, through the creation of ‘evaluation advisors’ by March 2024. This community will form a ‘call-down’ network where members can be approached to support projects or challenges where they can add specific value.

Drawing upon wider expertise outside of DCMS to drive up quality in M&E.

To achieve this, DCMS will:

Strengthen linkages with the GSR and Government Economic Service leadership by March 2023 to align learning offers and events to maximise reach.

Build links with the department’s newly formed ‘College of Experts’ function by September 2023, which brings a longer-term, systematic relationship with experts, providing easier access to expertise when needed.

Actively promote the use of the Evaluation and Trials Advice Panel by March 2023 during early discussions on challenging, complex and innovative impact evaluations.

Core objective 4: Evidence building and feedback

Building and embedding a stronger culture for M&E needs to be a sustained activity and requires collaboration across multiple teams within the organisation, as well as externally. Using evaluation findings and other evidence enhances DCMS’s impact on the ground, efficiency in how we work, and the value for money of DCMS investments. Feedback and learning will be at the core of the activities to achieve a stronger culture on this.

Promote and track the use of evaluation findings to maximise impact and learning.

To achieve this, DCMS will:

Develop a communications plan by March 2024 building on work to ensure the publication and promotion of evaluation reports, innovative methods, and evidence gathering, maintaining alignment with the GSR publication protocol.

Expand the repository in which information of completed evaluations is collected by March 2024 to also serve the purpose of spreading knowledge of what works and lessons learned. This will be achieved by making more accessible summaries of evidence available through a biannual newsletter and spotlight communications.

Implement (by March 2023) a timely debrief process for evaluations that have ended, to enable lessons learned to be captured and encourage dissemination and use of findings. Teams responsible for ‘priority evaluations’ will present their findings, lessons learnt and how they will utilise these findings to build on future decision-making to senior analytical leaders ahead of showcasing to the wider department through dedicated sessions.

Monitor project performance with metrics to enable continued scrutiny against intended outcomes.

To achieve this, DCMS will:

Feed performance metrics from project appraisal processes (see Core objective 1) through to departmental committees by March 2024 where they will actively be used to scrutinise and monitor the extent to which our interventions are achieving their intended outcomes. This improved ‘in-flight’ knowledge of the performance of DCMS’s policy portfolio will allow for timely course correction of policy implementation where needed.

Ensuring this strategy is implemented

Operationalising this strategy is the responsibility of the Head of Evaluation and the wider Analytical Leadership Team. However, the success of the strategy will rely on adherence to the strategy across the organisation, along with endorsement from our senior leaders. Table 1 details the core governance responsibilities of DCMS stakeholders.

Table 1: The core governance responsibilities of DCMS stakeholders

DCMS senior leaders

  • Advocating for appropriate and timely use of evaluation within their teams.
  • Encouraging their teams to undertake learning and development on evaluation.
  • Ensuring policy officials and analysts work collaboratively in evaluation development, especially during BC or IA processes.
  • Overseeing the implementation of, and adherence to, the strategy within their respective areas and escalating risks in a timely manner.

The Central Analysis Team

  • Ensuring the roles and responsibilities of teams and functions are regularly and clearly communicated.
  • Collecting and addressing regular feedback to improve processes.
  • Reporting challenges or concerns regarding the adherence to and implementation of the strategy to the Chief Economist and Director of Analysis to ensure timely action.
  • Working collaboratively across the M&E, BC and IA teams to ensure a streamlined approach to M&E development.
  • Engaging with other government departments to share lessons learned and ensure coherence across shared interests.
  • Monitoring the implementation and impact of this strategy.

Evaluation community members / evaluation advisors

  • Participating in evaluation community meetings, highlighting key pieces of work and sharing learnings and opportunities for collaboration.
  • Providing support to other analysts across DCMS through workshop attendance, steering/advisory group participation and peer reviewing documents, including tendering documents and M&E plans.
  • Acting as the evaluation point of contact for their team, disseminating requests and information when required, and providing the initial response to commissions, including timely updates to the new evaluation dashboard.

Analysts working on M&E and learning across the business

  • Using and promoting external resources within their teams, including the Evaluation and Trials Advice Panel and the College of Experts.
  • Utilising learning opportunities and sharing feedback within teams.
  • Engaging with policy colleagues to ensure awareness of relevant guidance and processes.

Policy officials

  • Developing an awareness of this strategy and understanding the importance of timely and well-considered M&E.
  • Utilising learning opportunities and sharing feedback within teams.
  • Ensuring awareness of the evidence standards and using them when scoping evaluation requirements.
  • Understanding how to develop a theory of change.
  • Engaging their analytical colleagues in a timely manner, highlighting any potential M&E risks as early as possible.
  1. Monitoring is a process for tracking progress in the delivery of an intervention by collecting data on its outputs and progress towards intended outcomes. 

  2. Evaluation is the systematic assessment of an intervention’s design, implementation and outcomes. 

  3. This strategy represents the core principles and deliverables for the central department (rather than the whole of the DCMS family, including agencies and arm’s length bodies). 

  4. At a Comprehensive Spending Review, the Chancellor of the Exchequer sets out the government’s plans for public spending. They are important fiscal events, with decisions made over hundreds of billions of pounds of public money. 

  5. The Regulatory Policy Committee is an independent body that assesses the quality of evidence and analysis used to inform regulatory proposals affecting the economy, businesses, civil society, charities and other non-government organisations. They do this by scrutinising government impact assessments.