Government analytical evaluation capabilities framework
Updated 2 July 2025
Introduction to the Evaluation Capabilities Framework for Government Analysts
What is this framework for?
- To provide a description for Government analysts of the knowledge and skills which enable the effective delivery of quality evaluations.
- It is a learning and development tool allowing people to assess their evaluation skills in order to identify gaps they might want to fill (e.g. through training).
- It promotes a shared understanding of evaluation roles and skills.
- It can be used when setting up evaluation projects and teams as a tool for considering the functions and roles required.
- It can also be used more strategically as a checklist to assess the balance of skills needed in a Department as a whole, and the need for training to support Government evaluation managers.
- This tool has been developed for analysts. Other Government professions, including the policy profession, may find it useful. The policy profession competency framework includes evaluation.
How do I use this framework?
Identify the part most relevant to your needs. The framework is broken down into four categories:
- Scoping
- Leading and managing
- Methods
- Use and dissemination
This document goes through each section in turn, or you can use the buttons throughout to go to the areas you would like to focus on.
What if I am unsure about my current skill level?
You can either review the full framework to see where you feel you want to improve, or use the accompanying Excel tool, which has a subset of questions to help you better understand your skills.
Where do I go if I want to develop further?
Each section in this document has its own set of resources; however, the cross-government Magenta Book and Green Book are good places to start.
Frequently Asked Questions
What is a capabilities framework?
This capabilities framework describes the skills, attitudes and practices which enable effective and high quality Government evaluations. This is for the purpose of learning, personal development and training.
How does it link with existing analytical competency frameworks?
It complements other specific analytical competency frameworks. Where a skill in the evaluation capabilities framework overlaps with a skill in an analytical competency framework, it focuses on the specific evaluation dimension of that skill. The evaluation framework also differs from the individual competency frameworks in its use and purpose as a learning and development tool. It is a tool for all analytical disciplines and can be used by teams, as well as individuals, to identify the skills required in an evaluation project team. In general, evaluation in government is not a distinct profession; it therefore draws on relevant professional guidance (e.g. GSR, GES, the Civil Service Code) where appropriate.
How does it link with other evaluation competency and evaluation capability frameworks?
This framework is designed to guide analysts working within government departments. Other frameworks exist for other audiences; a notable example is the UK Evaluation Society Framework of Evaluation Capabilities, which covers much of the same content but applies to the full UK evaluation community.
Does it mean I need all the capabilities listed in the framework to be an effective evaluation manager?
The framework outlines the skills which help ensure high quality, impactful evaluations, but these do not need to be held by a single evaluation manager. Evaluations will often be delivered by multidisciplinary analytical teams and/or external consortia. In terms of technical skills in particular, it is likely that different analytical disciplines (and individuals within them) will have different strengths and expertise. The framework will also be useful for identifying the additional skills needed in a team or project.
I am not a badged analyst, is this of use to me?
Currently the framework is focused on evaluation managers, who are often analysts. However, if you are interested in evaluation, or are a policy official looking to better understand these processes, there may be aspects of this that you can use.
Framework self assessment and action plan
- Accompanying this document is an Excel spreadsheet which includes a set of self-assessment questions. This allows you to get an indication of areas of strength, and areas that you may want to improve.
- The assessment is a shortened version of the full framework. For each of the described skills you can select your current capability in that area. Response options are:
- No knowledge or experience
- Basic or limited knowledge or experience
- Working knowledge and practical experience
- Detailed knowledge and significant experience
- Expert knowledge and experience
- On the results page you can see your scores in the four key evaluation areas (an illustrative sketch of this kind of scoring is given at the end of this list).
- These scores can help you identify areas you may want to develop and explore further in the full framework.
- The Excel sheet also has an area where you can write down your actions to develop your evaluation capabilities, using the more detailed competency lists and resources sections of this document.
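The Excel tool calculates these scores for you; the sketch below simply illustrates the kind of scoring logic involved. It is a minimal Python sketch, assuming responses map to scores of 0 to 4 and that an area score is a simple average of its question scores. Both assumptions are illustrative, not the tool's actual formulas.

```python
# Hypothetical sketch of self-assessment scoring; the actual Excel tool's
# formulas may differ. Responses are assumed to map to scores 0-4, and an
# area score is assumed to be the mean of its question scores.
RESPONSE_SCORES = {
    "No knowledge or experience": 0,
    "Basic or limited knowledge or experience": 1,
    "Working knowledge and practical experience": 2,
    "Detailed knowledge and significant experience": 3,
    "Expert knowledge and experience": 4,
}

def area_score(responses: list[str]) -> float:
    """Average score for one evaluation area (e.g. Scoping)."""
    return sum(RESPONSE_SCORES[r] for r in responses) / len(responses)

# Example: responses to three hypothetical Scoping questions.
scoping = [
    "Working knowledge and practical experience",
    "Basic or limited knowledge or experience",
    "Detailed knowledge and significant experience",
]
print(f"Scoping: {area_score(scoping):.1f} out of 4")  # prints 2.0
```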
Contents
Scoping
- Understanding and communication of the rationale for evaluation
- Understand the intervention and construct a theory of change
- Identifying the right evaluation design for a policy
- Producing a proportionate and appropriate evaluation plan
- Scoping resources
Leading and Managing
- Leading and managing
- Leading and managing resources
Methods
- Use of monitoring and administrative data
- Methods of primary research and analysis
- Process evaluation
- Theory based approaches to impact evaluation
- Experimental approaches to impact evaluation
- Value for money evaluation
- Research synthesis
- Methods resources
Use and dissemination
- Reporting and presenting data
- Considering policy implications and sharing the findings
- Use and dissemination resources
Scoping 1: Understanding and communication of the rationale for evaluation
Below are the descriptions of the skills and behaviours for this area
- Sells the concept of evaluation, championing the role of evaluation in policy making;
- Understands the value of evaluation and how evaluation can support: accountability, measuring impact, learning, and programme development;
- Understands the evidence, research and evaluation needs and priorities of policy customers;
- Understands the evaluation processes used within their department, and across Government;
- Ensures evaluation is considered early in the ROAMEF policy cycle and that sufficient thought is given to monitoring and evaluation requirements (including resources needed) at the appraisal stage, e.g. in the business case;
- Demonstrates knowledge and experience of a range of types and approaches to evaluating programmes and policies;
- Defines evaluation questions to address policy needs, and re-defines where necessary;
- Is aware of different types of evaluation questions: e.g. exploratory; descriptive; normative; causal; explanatory; developmental;
- Is able to provide a hierarchical structure for evaluation questions and prioritise amongst questions.
Scoping 2: Understand the intervention and construct a theory of change
Below are the descriptions of the skills and behaviours for this area
- Has an understanding of how evaluation questions relate to the underlying logic of the intervention;
- Can create a theory of change, linking elements of the intervention with outcomes (one way to represent this is sketched after this list);
- Is able to plan and run a logic/theory mapping workshop with key stakeholders;
- Uses the theory of change with stakeholders to identify priority areas for evaluation;
- Considers the original appraisal for the intervention (including any business case and/or impact assessment);
- Uses the theory of change to guide evidence collection, analysis and interpretation of findings (e.g. monitoring data and evidence gaps and using it as a tool to drive analysis and in particular synthesis);
- Can identify the key metrics or performance indicators which need to be measured and approaches to measuring them;
- Can define their role alongside that of benefits management approaches;
- Ensures that consensus is reached and there is a shared view regarding the underlying logic of the intervention.
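Theories of change are usually drawn as diagrams, but for readers who find a concrete structure helpful, below is a minimal sketch representing one as a set of causal links. It is illustrative only: the intervention (a hypothetical training programme) and every node name are invented, and real theories of change typically include assumptions and contextual factors alongside the causal chain.

```python
# Minimal sketch: a theory of change as directed links running from inputs
# through activities, outputs and outcomes to impact. The intervention and
# all node names are hypothetical.
theory_of_change = {
    "funding for training courses": ["courses delivered"],         # input -> output
    "courses delivered": ["participants gain skills"],              # output -> outcome
    "participants gain skills": ["participants move into work"],    # intermediate outcome
    "participants move into work": ["reduced local unemployment"],  # outcome -> impact
}

def causal_chain(start: str) -> list[str]:
    """Trace one causal pathway by following the first link from each node."""
    chain = [start]
    while theory_of_change.get(chain[-1]):
        chain.append(theory_of_change[chain[-1]][0])
    return chain

print(" -> ".join(causal_chain("funding for training courses")))
```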
Scoping 3: Identifying the right evaluation design for a policy
Below are the descriptions of the skills and behaviours for this area
- Works closely with clients to identify evaluation purpose and requirements and ensures conscious and informed choices are made about level and type of evaluation design;
- Understands the different roles of process, impact and value for money evaluation and can communicate and explain these to evaluation clients and stakeholders;
- Understands the range of approaches which can be used to assess whether an intervention was effective in meeting its objectives, and can determine the most appropriate approach and methods for a particular evaluation;
- Is aware of frameworks for assessing the complexity of a programme and the distinctions between ‘simple’, ‘complicated’ and ‘complex’;
- Devises appropriate evaluation questions and designs;
- Can make the case for investing in good quality/proportionate evaluation;
- Is aware of, and understands, the role of monitoring and evaluation as part of the life cycle of project and programme design;
- Where there is a programme delivered by other partners, ensures roles and responsibilities are clear for delivery of the evaluation, building these in from the start;
- Ensures arrangements for access to monitoring data are in place (via the necessary data sharing agreements) to facilitate use of the data for monitoring and evaluation (e.g. linking and re-contact);
- If designing an evaluation of a communications intervention, takes into account the Government Communication Evaluation Framework.
Scoping 4: Producing a proportionate and appropriate evaluation plan
Below are the descriptions of the skills and behaviours for this area
- Designs evaluations for in-house or commissioned projects and procures external evaluations where appropriate;
- Designs programmes of evaluation (including those containing multiple research projects over multiple years) including consideration of:
- evaluation questions,
- programme attributes,
- context,
- timescales,
- and resources needed;
- Able to develop, and consult on, a high quality specification/Terms of Reference to underpin the evaluation;
- Works with stakeholders/delivery partners to agree and align timeframes;
- Understands the trade-offs between timeframes for observing impacts and policy timeframes.
Scoping: Resources to improve your capability
Below are resources to improve your scoping abilities.
- Magenta Book & Supplementary Guidance
This is the key Government guidance on evaluation. Early chapters focus on designing an evaluation and the range of considerations for scoping.
- DfT Logic Mapping Guidance
Guidance produced by DfT to provide hints and tips for carrying out logic mapping (PDF, 208KB)
- HIVOS Theory of Change Guidelines
A detailed guide to Theory of Change
- Better Evaluation guidance on using logic models and theory of change
- NPC Theory of Change in ten steps.
A guide to developing a theory of change
- NPC Theory of Change beginnings.
An overview of theory of change and how to use it
- Sport England Logic Model.
A logic model worksheet for those beginning to look at this form of framework
- Choosing appropriate evaluation methods tool
- Public Health England introduction to evaluation
Guidance on the basics of evaluation, which includes links at the end to evaluation podcasts and videos.
Leading and Managing
Below are the descriptions of the skills and behaviours for this area
- Demonstrates leadership and champions evaluation to maintain momentum and impact;
- Works collaboratively, manages relationships and supports others (e.g. contractors and colleagues);
- Is able to proactively influence stakeholders, and ensure buy-in from less engaged evaluation users;
- Is aware of, and adapts to changing circumstances, feeding this into evaluation delivery;
- Displays integrity and resilience, and maintains standards when challenged;
- Builds a culture of evaluation, learning and transparency into the policy and analytical teams they work in;
- Works effectively across analytical disciplines;
- Identifies and sets up appropriate governance structures for input and quality assurance of evaluation products;
- Is aware of key departmental procurement/commissioning procedures;
- Ensures all parties are fully aware of their roles and responsibilities and key goals/deadlines are met;
- Considers and implements relevant government guidance that impacts on evaluations (including equality, transparency, ethics) and applies behaviours in the civil service code to evaluation management decisions;
- Is prepared to invest time and energy in personal development and is willing to work with new methodologies to meet challenges;
- Demonstrates Evaluative Thinking. In particular: the investigation of one’s own as well as others’ assumptions, motivations, and biases; understanding the relevance of context dependency; and the ability to navigate uncertainty (e.g. of outcomes), ambiguity (e.g. in data and evidence), and complexity.
Leading and Managing: Resources to improve your capability
Below are resources to improve your leading and managing abilities.
- Magenta Book & Supplementary Guidance
This is the key Government guidance on evaluation.
- GSR Professional Guidance: Ethical Assurance for Social Research in Government
Full guidance produced by the GSR and applicable to evaluation managers (PDF, 720KB), which covers the ethics involved in research and evaluation
- UK Evaluation Society: Good Practice in Evaluation
- Civil Service Code.
The core values (integrity, honesty, objectivity and impartiality) to which civil servants are committed in order to support good government and ensure the highest possible standards.
- Public Sector Equality Duty
- Evaluation Failures: 22 Tales of Mistakes Made and Lessons Learned
Described as a candid collection of stories from seasoned evaluators across a variety of sectors, sharing professional mistakes they have made in the past and what they learned moving forward.
Methods 1: Use of monitoring and administrative data
Below are the descriptions of the skills and behaviours for this area
- Recognises the importance of monitoring data and influences how programmes collect monitoring data;
- Understands the role of monitoring data in an evaluation, and when and how to use it;
- Works with others (e.g. delivery partners) to ensure that correct data sharing agreements are in place early in the process;
- Works with delivery partners to ensure high quality monitoring data is built into the programme;
- Is aware of legal, ethical and practical issues in ensuring programme monitoring (and other) data is collected and used effectively;
- Identifies sources and weighs up strengths and weaknesses of data and key metrics to help answer key evaluation questions;
- Considers the links with project management information and reporting especially key delivery metrics and any benefits realisation plan;
- Manages risks and issues with data monitoring and governance, escalating if necessary;
- Considers the long term potential of data sources;
- Engages data science experts to explore innovative use of data, for example approaches to data management, use of data technology and visualisation.
Methods 2: Methods of primary research and analysis
Below are the descriptions of the skills and behaviours for this area
- Has a robust knowledge of qualitative and quantitative data collection and data analysis methodologies, knowing how to design and apply specific methods;
- Understands strengths and weaknesses of a range of data collection methods, and when and how to apply them;
- Has good knowledge of ethical issues in research and evaluation and is able to apply these appropriately to evaluation specifications and projects;
- Collects and manages data (in particular personal data) in line with the GDPR/Digital Economy Acts and data handling protocols;
- Ensures necessary quality assurance processes are built into data collection and analysis methods at appropriate stages;
- Considers opportunities for the evaluation to analyse existing social media data, and to use social media as a tool to collect new data.
Methods 3: Process evaluation
Below are the descriptions of the skills and behaviours for this area
- Understands the value and role of process evaluation, and its link to monitoring and to impact evaluation, to understand how a programme is being delivered and why;
- Times process evaluation deliverables to coincide with best opportunities to have impact and identifies and builds in suitable interim products;
- Works collaboratively with programme managers to draw conclusions from process evaluation, alongside other insight and stakeholder engagement, and feeds these into policy design in a timely manner;
- Understands the wider learning and value from process findings, alongside impact findings, for appraisals, and to build the evidence base in an area;
- Can use results from process (and impact) evaluation in one project, to evidence potential optimism bias in other policies;
- Understands limitations of process evaluations in measuring and determining impact.
Methods 4: Theory based approaches to impact evaluation
Below are the descriptions of the skills and behaviours for this area
- Understands the role of theory based approaches to evaluating complex interventions;
- Recognises the limitations of experimental and quasi-experimental approaches in some evaluation contexts;
- Ensures theory based methods have robust approaches to testing hypotheses and causal claims;
- Is familiar with specific theory based methods (e.g. contribution analysis, realist evaluation, process tracing, qualitative comparative analysis) and their appropriateness for addressing different evaluation questions;
- Understands the value of a theory based approach in establishing and understanding causation, and enhancing other methodological approaches.
Methods 5: Experimental approaches to impact evaluation
Below are the descriptions of the skills and behaviours for this area
- Understands the conditions required for experimental and quasi-experimental approaches to be useful and applicable;
- Can assess the robustness of different impact evaluation designs (including consideration of the ‘Maryland Scale’);
- Is familiar with the ‘Test, Learn, Adapt’ framework and the positive case for Randomised Controlled Trials (RCTs), and understands the variants of RCT design;
- Appreciates the importance of the counterfactual and control groups in determining attribution (a worked sketch of a simple counterfactual comparison is given after this list);
- Can apply experimental or quasi-experimental approaches effectively to attribute impacts;
- Able to identify risks of using weaker designs (e.g. before and after) while being able to apply and interpret these effectively if they are the only approach available;
- Can make appropriate use of emerging impact data (including managing the risks of use of early findings);
- Understands the limitations as well as strengths of experimental and quasi-experimental designs and the challenges to implementing them.
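To make the role of the counterfactual concrete, below is a minimal sketch of a difference-in-means impact estimate from a hypothetical randomised trial. The data are invented for illustration; a real analysis would also report uncertainty (e.g. confidence intervals), check balance between groups and follow a pre-specified analysis plan.

```python
import statistics

# Hypothetical outcome data (e.g. test scores) from a simple randomised
# controlled trial; all values are invented for illustration.
treatment = [68, 72, 75, 70, 74, 71]  # group that received the intervention
control = [65, 69, 66, 70, 67, 68]    # randomly assigned comparison group

# Because assignment was random, the control group mean estimates the
# counterfactual: what would have happened to the treated group without
# the intervention. The difference in means therefore estimates the
# average impact attributable to the intervention.
impact = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated average impact: {impact:.1f}")  # prints 4.2
```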
Methods 6: Value for money evaluation
Below are the descriptions of the skills and behaviours for this area
- Has a good understanding of the HMT Green Book guidance on economic appraisal, cost benefit analysis, and requirements for consideration of monitoring and evaluation throughout the project cycle (a minimal discounting sketch follows this list);
- If not an economist, works jointly with local economists to scope and deliver value for money evaluation;
- Links impact evaluation back to objectives of the policy and initial appraisal, to ensure comparability;
- Understands economic approaches to project appraisal and evaluation, including different ways of measuring efficiency, equity and impacts;
- Ensures that programme costs are tracked to allow the necessary analysis of costs and benefits;
- Considers distributional impacts when assessing the value for money and impacts of the programme;
- Is aware of different methods which can be used in the valuation of social benefits and wellbeing;
- Uses value for money evaluation to help improve assumptions and methods in cost benefit analysis being developed for appraisal of new policies.
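As an illustration of the kind of calculation that underpins Green Book appraisal and value for money evaluation, below is a minimal sketch of a discounted benefit-cost ratio. The cash flows are invented; 3.5% is used as the Green Book's standard discount rate, but the current guidance (including declining long-term rates) should be checked before relying on it.

```python
# Minimal sketch of a discounted benefit-cost ratio in the spirit of Green
# Book appraisal. All cash flows are hypothetical; 3.5% is the Green Book's
# standard discount rate, but check the current guidance.
DISCOUNT_RATE = 0.035

def present_value(flows: list[float]) -> float:
    """Discount a list of annual flows (year 0 first) to present value."""
    return sum(f / (1 + DISCOUNT_RATE) ** t for t, f in enumerate(flows))

costs = [100.0, 20.0, 20.0, 20.0]   # hypothetical programme costs by year
benefits = [0.0, 60.0, 70.0, 80.0]  # hypothetical monetised benefits by year

bcr = present_value(benefits) / present_value(costs)
print(f"Benefit-cost ratio: {bcr:.2f}")  # above 1: discounted benefits exceed costs
```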
Methods 7: Research synthesis
Below are the descriptions of the skills and behaviours for this area
- Understands the range of approaches to synthesis and meta-analysis and the role of triangulation of evidence (a minimal meta-analysis sketch follows this list);
- Brings evaluation evidence from different sources together to identify overarching messages and findings;
- Quality assures evidence to assess how it has been used in producing an evaluation synthesis.
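As one concrete example of a synthesis technique, below is a minimal sketch of a fixed-effect, inverse-variance meta-analysis pooling effect estimates from three hypothetical studies. Real syntheses would also assess heterogeneity between studies, study quality and the risk of publication bias.

```python
import math

# Hypothetical effect estimates and standard errors from three studies;
# all numbers are invented for illustration.
studies = [(2.0, 0.5), (1.5, 0.8), (2.5, 1.0)]  # (estimate, standard error)

# Fixed-effect, inverse-variance pooling: each study is weighted by
# 1 / se^2, so more precise studies contribute more to the pooled result.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for w, (est, _) in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled estimate: {pooled:.2f} (standard error {pooled_se:.2f})")
```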
Methods: Resources to improve your capability
- Magenta Book & Supplementary Guidance
This is the key Government guidance on evaluation.
- BetterEvaluation
A collaboration that aims to share information and support evaluation worldwide. It includes a wide range of evaluation resources, including overviews of different evaluation methods.
- BOND Impact Evaluation: A Guide for Commissioners and Managers.
- Evalsed Sourcebook: Methods and techniques
Produced by the European Commission for looking at socio-economic development. It describes a range of methods and techniques for evaluation.
- Broadening the Range of Designs and Methods for Impact Evaluations
A report looking at existing and alternative Impact Evaluation methods, in particular applied to development – but also applicable to other policy areas. Covers theory based approaches.
- Treasury Board of Canada. Theory-Based Approaches to Evaluation: Concepts and Practices.
A useful overview of theory based evaluation approaches.
- Test, Learn, Adapt
- The Maryland Scientific Methods Scale (SMS)
- Research Methods for Policy Evaluation
National Centre for Social Research guide covering process and counterfactual impact evaluation methods.
- The Green Book
The central government guide to economic evaluations (PDF, 1,320KB)
- DfT guidance on how to strengthen the links between appraisal and evaluation (PDF, 723KB)
- Guidance for transport impact evaluations
This focuses on counterfactual approaches.
- Information Commissioner’s Office Guide to the Data Protection Act
A guide to better understanding the “day to day” principles and implications of the Data Protection Act
Use and dissemination 1: Reporting and presenting data
Below are the descriptions of the skills and behaviours for this area
- Has clear knowledge of the format and style required to report evaluation results; is able to report evaluation information clearly in various formats (orally and in writing);
- Can use charts, figures and infographics appropriately to add value to interpretation and presentation of data;
- Is clear about the limits of the evaluation and communicates these clearly and positively;
- Makes use of regular reporting where appropriate to enable ‘real-time learning’ for example using multiple evaluation outputs built around key reporting and decision-points.
Use and dissemination 2: Considering policy implications and sharing the findings
Below are the descriptions of the skills and behaviours for this area
- Builds use and impact of findings into the evaluation design from the beginning;
- Develops a use and dissemination plan based on which groups require what information at which point in time for what purpose;
- Identifies opportunities throughout the evaluation process for effective dissemination and enhanced impact;
- Builds relationships and demonstrates leadership in order that evaluation findings have impact;
- Tailors reporting and communication of findings to meet the needs of different audiences, improving the usability of findings by showing how they relate to specific audiences’ areas of interest;
- Identifies and explores creative ways of disseminating findings;
- Is familiar with a variety of communication modes and tools (e.g. one page summaries, video outputs, infographics, data sharing, newsletters, social media posts, conference presentations and seminars) to maximise impact;
- Identifies policy implications of results and feeds these into policy design including future appraisal to influence decision making and for future learning;
- Identifies how findings apply to areas beyond the specific scheme or policy being evaluated;
- Takes forward publication of evaluation outputs including datasets, technical annexes and research tools where relevant, in accordance with relevant guidance.
Use and dissemination: Resources to improve your capability
Below are resources to improve your use and dissemination abilities.
- Magenta Book & Supplementary Guidance
This is the key Government guidance on evaluation.
- GSS Communicating Statistics
A wealth of resources on communicating and disseminating official statistics, including guidance documents and case studies from across the Government Statistical Service (GSS).
A document that covers infographics and how to use them.
- GSR publication guidelines
The document outlining how research and analysis, including evaluation, should be published.