FCDO evaluation strategy 2026 to 2030
Published 10 March 2026
Foreword

In a world of constant change and growing complexity, the Foreign, Commonwealth and Development Office (FCDO) is committed to making decisions that are grounded in robust evidence. Evaluation is at the heart of this commitment, providing the insight and learning needed to ensure our choices across all areas of our work – development, diplomacy, and security – deliver the greatest possible impact.
Evaluation is more than a technical process; it is a story of learning, adaptation, and progress. Evaluation has helped us to understand what works, what doesn’t, and why. We care about this because it helps us do our job better. For example, our evaluations have taught us how to tailor our interventions to best support children with disabilities in Syria; furthered global understanding about what works to prevent violence against women and girls; informed how we spend our health research budgets to focus where there is strong uptake potential; and, helped us understand how diplomatic and programming efforts can combine to strengthen our support to free and fair elections.
The value of evaluation is in learning: embedding (and celebrating) successes, preventing or cutting poor investments, and better identifying and managing risks. To do this well, we need to embed evaluation at key decision points, creating opportunities for challenge and reflection.
Over recent years, there has been a growing appetite for evidence across the FCDO. Staff at all levels are increasingly seeking out evaluation insights to guide their work, and our systems are evolving to make this evidence more accessible and visible than ever before.
Looking ahead, our ambition is clear: to make high-quality, timely evaluation a foundation for strategic choices, whether they relate to our diplomatic or development aims. By strengthening our internal expertise, harnessing new technologies, and fostering a culture that values evidence, we will continue to improve outcomes, enhance value for money, and deliver on our commitments to the UK and its partners.
Evaluation is not just about measuring the past – it is about shaping a better future. Through this strategy, we reaffirm our commitment to learning, accountability, and excellence in everything we do.
Second Permanent Under-Secretary, Nick Dyer
Executive summary
In a fast-changing world, the Foreign, Commonwealth and Development Office (FCDO) must use evaluation evidence to inform strategic decisions, improve delivery, demonstrate impact and value, and manage risk for the UK and its partners.
This strategy provides a framework to address persistent challenges and evolve the evaluation function to meet the needs of the future FCDO. We want to be an organisation where high-quality, timely evaluations underpin strategic choices, help us improve outcomes and enhance value for money across FCDO priorities. To do this, the strategy sets out 4 shifts in the evaluation function:
From purely demand-led evaluation to a mixed model with greater central direction
A shift from a fully demand-driven evaluation system to one balanced by evaluations mandated from the centre will help build evidence in strategically important areas.
Increased focus on portfolio-level monitoring, evaluation and learning
Complementing individual programme evaluations with a greater focus on portfolio monitoring, evaluation and learning (MEL) will allow the FCDO to understand how different activities work together to deliver impact across countries and themes.
Wider use of internal evaluation expertise across the FCDO’s portfolio
Making better use of in-house expertise to deploy evaluative techniques that complement formal evaluation will support continuous learning and adaptation across the full breadth of our work – development, diplomacy and security.
Greater use of emerging technology to find and use evaluation evidence
Responsibly and ethically harnessing new technology, including AI, will make it quicker and easier to summarise evaluation evidence and identify gaps, focusing evidence generation where it is most needed.
The strategy is structured around 3 priority areas: demand, delivery and uptake:
- Strengthening demand by sharpening incentives for evidence use within planning and delivery processes and improving our ability to target evaluations where they are most needed.
- Improving delivery by evolving the methods we use, building staff capability to apply them, and maintaining high quality standards through clear policies and guidance.
- Enhancing use and uptake by making evaluation evidence more visible and accessible and by building staff capability to be intelligent customers for evaluation.
Introduction
As both the broader global context and our partner countries change, the most effective tools, policies and approaches available to the Foreign, Commonwealth and Development Office (FCDO) also need to evolve. Evaluation is central to keeping pace with these changes.
Evaluation helps the FCDO improve delivery, demonstrate impact, and make informed decisions to support the Government’s Plan for Change. Our aim is for high-quality, timely evaluations to underpin strategic choices, help us improve outcomes and enhance value for money across FCDO priorities. As we innovate and test new approaches in diplomacy and development, evaluation helps us manage risk, refine how we deliver, and make choices that increase our effectiveness. It also helps us tell an evidence-backed story about the value and impact of our work to the British people, Parliament, and our partners, serving both learning and accountability purposes.
Across our foreign policy, the UK is finding new ways to deliver. In development, we are shifting from donor to investor, from service delivery to system support, from grants to expertise, and from international intervention to local provision. International engagement and diplomacy are also evolving to reflect increased uncertainty. Managing these changes requires that we evaluate our impact.
An effective and impactful evaluation function will provide evidence to inform strategic decision-making and improve outcomes and value for money by supporting learning across all areas of the FCDO’s work. This involves supporting more of the organisation – whether development, diplomacy or security – to use evaluation evidence to strengthen delivery, using a broad range of evaluative tools to serve all areas of activity, and using technology to make it easier to use evidence for decision-making.
What do we mean by evaluation?
‘Evaluation’ at the FCDO refers to the systematic and objective use of robust research methods to assess the effectiveness, impact, efficiency, relevance, coherence or sustainability of our strategies, policy and programming – across development, diplomacy, security, strategy, policy and corporate activity.
‘Evaluative thinking’ or ‘small e’ evaluation refers to informal or smaller-scale efforts that support learning and adjustment (for example, a short survey after a pilot project). These can be ongoing, quick, often internal evaluation activities or feedback loops to understand how something is working and can be applied to all areas of the FCDO’s work.
Evaluation is stronger when it links with programme and policy monitoring as part of a coherent monitoring and evaluation function. Monitoring tracks delivery and reporting against departmental priorities, while evaluation investigates how and why outcomes are achieved. They are complementary but distinct functions. This strategy focuses on the FCDO’s evaluation function but recognises both are necessary to inform decision making.
Ambition
This strategy builds on the first FCDO Evaluation Strategy (2021 to 2025), which established a foundation for evaluation in a newly created department. Implementation of the strategy delivered programming to meet bottom-up demand for evaluation, strengthened internal capacity[footnote 1], and developed tools to evaluate diplomatic influence.
Despite progress, challenges remain. Expertise remains unevenly distributed; evaluation is undertaken solely at the discretion of the implementing team, leading to sub-optimal coverage; and cuts to Official Development Assistance (ODA) have often led to monitoring, evaluation and learning being cut from programme budgets, reducing oversight[footnote 2]. These factors limit the integration of evaluation evidence into decision-making. Even where the desire and ability to use evaluation evidence exist, knowledge management systems are unfit for purpose.
Building a culture that values evidence requires consistent senior engagement and strong organisational incentives. This strategy provides a framework to address persistent challenges and evolve the evaluation function to meet the needs of a dynamic, forward-looking FCDO.
An evolved approach to evaluation will need to tackle 3 challenges:
- Demand for evaluation: how do we evaluate the parts of the FCDO where it is most needed?
- Delivery of evaluation evidence: how can we deliver high quality, resource efficient evaluation evidence that meets evolving departmental needs?
- Use and uptake of evaluation evidence: how can we better support use of evaluation evidence and evaluative thinking to inform decision-making?
These challenges are interconnected. Addressing them requires using a broad range of evaluative techniques across all FCDO activities and fostering a culture where evidence informs how we think, plan, and act. This includes making space for challenge, learning, and reflection – through formal evaluations and everyday evaluative thinking – to strengthen delivery and effectiveness.
The following shifts will help us do this:
From purely demand-led evaluation to a mixed model with greater central direction
The FCDO’s decentralised evaluation model empowers teams to commission studies that respond directly to their evidence needs. However, this approach creates uneven coverage and risks important gaps in the evaluation evidence base. A shift from a fully demand-driven evaluation system to one that is balanced by evaluations mandated from the centre will help build evidence in strategically important areas, including areas not focused on development.
Increased focus on portfolio-level monitoring, evaluation and learning
Individual programme evaluations can be critical sources of learning. However, complementing this with evaluations at the portfolio level, which look across all our levers, will allow the FCDO to understand its strategic impact across countries and themes, and beyond individual programmes. Portfolio monitoring, evaluation and learning (MEL) can improve coherence and impact across activities, aid resource prioritisation and support accountability.
Data at the programme/intervention level will still be essential to inform meaningful portfolio analysis. Programme monitoring and evaluation should be proportionate and determined by clear learning needs. By shifting towards a balance between programme and portfolio analysis, we will strengthen our use of resources across development, diplomacy, programmes and policy.
Wider use of internal evaluation expertise across the FCDO’s portfolio
The FCDO’s work spans development, diplomacy and security. As our objectives and approaches change, we must learn continuously, course correct as needed and clearly evidence our impact. This means being able to support evaluative thinking and use in-house expertise to apply techniques that complement formal evaluation and are relevant for all areas of our work. We will build on existing expertise, deploy talent where it can have the most impact, and invest time in building analytical skills for the future, within FCDO’s resourcing constraints.
Greater use of emerging technology to find and use evaluation evidence
New technologies present opportunities for the FCDO to innovate, introduce efficiencies, and make evaluation evidence easy to find and use. A key focus for this strategy will be to responsibly and ethically harness new technology, including AI, to make it quicker and easier for analysts to summarise evaluation evidence and identify gaps so that future evaluations are focused on areas where there is clear added value.
This strategy will be supported by a detailed workplan with milestones, to help us achieve our ambition of a more effective and impactful evaluation function that supports strategic decision-making, improved outcomes and value for money across all areas of the FCDO’s work. We will continue to build on good practice from our most impactful evaluations.
Box 1: The impact of evaluation: mental health and learning outcomes in Syria
A team of MEL and education advisers worked with implementing partners in Syria to undertake a midline assessment of an education programme (Manahel), which reached more than 300,000 children. The analysis revealed a significant and previously undocumented finding: children with cognitive and psychosocial difficulties were progressing far more slowly than their peers. This provided strong statistical evidence of the impact of anxiety, depression, and related factors on learning outcomes in a conflict‑affected context, helping fill an important gap in the global evidence base.
The findings had direct operational and strategic impact. Delivery partners used the evidence to introduce more tailored interventions for children with disabilities, including one‑to‑one support. The results were also instrumental in securing additional United Nations funding for mental health and psychosocial support within the programme. The study was later published in the Journal on Education in Emergencies (March 2022) (PDF, 868 KB), where editors highlighted its contribution to the field. Donors have since cited the analysis as justification for increased investment in research and learning in Syria, demonstrating the tangible value of rigorous evaluative work in shaping policy and resource allocation.
Box 2: Using evaluative approaches to inform programming and political engagement
In 2025, the Integrated Security Fund conducted an evaluative assessment of activities in the Lake Chad Basin and Nigeria to understand their programme and political outcomes. This analysis had a significant impact on the direction and effectiveness of several programmes in the region, informing the design of new activity in Nigeria and providing critical evidence about which projects should be discontinued. For example, the assessment highlighted that some models, including the Borno State Development Unit (BSDU), were not optimal for the evolving context, resulting in the responsible discontinuation of UK support.
The lessons from the evaluation also prompted a reassessment of adviser deployment models and political engagement strategies. As a result, the programme recalibrated its advisory model to better align with the evolving political context and maximise impact: i) moving away from a reliance on externally contracted consultants, and shifting field-level engagement to Ministry of Defence and FCDO personnel; and ii) shifting our mine action approach towards a more comprehensive, blended offer at both Federal and State levels. Overall, the evaluation played a pivotal role in shaping funding decisions, programme design, and strategic approaches to political engagement in the region.
Demand for evaluation
How do we evaluate the parts of the FCDO where it is most needed?
There is strong demand for evaluation from parts of the department, but it is distributed unevenly. Between 2021 and 2025, the central Evaluation Unit received over 400 requests from FCDO staff for technical evaluation support, quality assurance and advice on evaluation issues. Between April 2022 and March 2025, the FCDO published 34 evaluation reports covering activity in specific country offices, 7 regional evaluations across sub-Saharan Africa, the Middle East, and Latin America and the Caribbean, and 18 evaluations with a global reach.
However, evaluations are concentrated in certain areas. Of FCDO-funded evaluations published between 2012 and 2024, the majority focused on sub-Saharan Africa (55%), followed by South Asia (24%). Other regions are sparsely covered, and evaluations of a small number of countries account for a large share. The number of evaluations published each year also declined between 2019 and 2025. The distribution of evaluations largely reflects ODA spend, which means there are gaps in other areas, such as diplomatic influencing, non-ODA spend and corporate activity. This limits our ability to learn whether what we do is working and poses a particular risk at a time when the organisation is undergoing significant change.
Key challenges include weak incentives for using evidence, a misperception of evaluation as a technical add-on rather than a key instrument for learning and delivery, an assumption that evaluative techniques are only applicable to development programming, and a self-selection approach to evaluation which makes it difficult to direct evaluation resources to areas that offer the most valuable learning.
Incentivising evaluative thinking and evidence-use
A key part of addressing these challenges is to create top-down incentives to evaluate, especially in priority areas that are currently under-evaluated. Incentives around the use of evidence are driven by a wide range of factors, from individual performance management to public accountability requirements; they are part of the broader organisational culture. While defining organisational incentives around the use of evaluation evidence is beyond the scope of this strategy, incentives can be strengthened by:
- embedding good practice into routine processes: we will seek to integrate explicit consideration of evaluation at key moments of the planning and delivery life cycle. We will provide technical support at these moments, in good time to influence outputs
- identifying evaluation ‘blind spots’: we will provide regular reports to the Programme Investment Committee to suggest areas where evaluation may be needed to support accountability and risk management. The Programme Investment Committee can request follow-ups from teams
Strategic targeting of evaluation
To strengthen our ability to target evaluations where they are most needed, we will:
- improve management information that allows us to track evaluations and identify critical gaps in evidence and expertise. The FCDO will continue updating the UK government’s Evaluation Registry so there is transparency about pipeline and published evaluations
- provide flexible technical assistance to support short-term strategic priorities. This will follow a clear commissioning process and may include defining outcomes, designing change processes, or establishing portfolio-level MEL frameworks
- support robust quantitative impact evaluation of the FCDO’s activities in strategically important areas, including through the Strategic Impact Evaluation and Learning (SIEL) programme
These efforts aim to strengthen internal processes and target central resources where they can add the most value.
Box 3: Using evaluation to make health research investments more effective
In 2025, the FCDO’s Global Health Research Team evaluated its £54 million Research Programme Consortia (RPC) portfolio, which consisted of 7 long‑term applied health research programmes operating across multiple low‑ and middle‑income countries (LMICs) and diverse themes – from sexual and reproductive health for refugees to infectious diseases. The evaluation assessed both programme impact and the effectiveness of the funding model.
The findings showed that the portfolio produced valuable, policy‑relevant research and built strong partnerships with LMIC institutions, strengthening prospects for long‑term impact. However, weaknesses were also identified: research uptake and policy influence varied, engagement with FCDO country offices and national decision‑makers was inconsistent, and complex commercial contracting processes created delays and administrative burdens. These issues reduced the model’s overall efficiency. As a result, the FCDO chose to reduce the portfolio budget and narrow its focus to programmes with strong research-uptake potential. The evaluation also informed a broader shift toward fewer, more focused, applied research programmes, improving impact and value for money.
Delivering evaluation evidence
How can we deliver high quality, resource efficient evaluations that meet evolving departmental needs?
Evaluation is applied unevenly across FCDO activities leading to gaps in our evidence base and a lack of understanding of how activities, such as diplomacy and development programming, work together to deliver impact. Evaluating the full breadth of the FCDO’s work requires different evaluative methods, staff with the skills and confidence to apply them, and relevant quality standards.
Enhance our tools to support portfolio level MEL
To strengthen our approach to portfolio-level MEL (PMEL), we will:
- build staff skills in delivering and using PMEL through creation of guidance and quality standards, supporting application of PMEL approaches through hands-on technical assistance and good practice toolkits
- undertake evaluations of FCDO portfolios to assess effectiveness at country or thematic level and use this approach to examine how development and diplomacy combine to deliver strategic objectives
Box 4: New portfolio evidence generated
The FCDO has produced evaluations to understand impact across a theme or portfolio. For example, the evaluation of FCDO support to improve climate resilience in the Caribbean (2024) examined the extent to which development programming between 2015 and 2023 improved resilience to climate shifts, natural disasters, and economic shocks.
The evaluation generated a theory of change which forms the heart of a new Caribbean resilience strategy. It also improved understanding of the different dimensions of resilience, highlighting the need for specificity about the type of resilience each programme is targeting. This addressed the challenge of measuring broad change and improved the quality of stakeholder conversations.
As well as informing resource prioritisation, evaluation findings shaped the design of a new programme which was better integrated with wider economic and social development initiatives.
Box 5: Portfolio approach to evaluating development and diplomacy
The FCDO conducted a thematic evaluation of the UK’s support for the Kenyan elections (PDF, 6.9 MB). It aimed to understand how the British High Commission in Nairobi supported Kenya’s 2022 elections and what lessons could be learned for posts across the global network.
The UK’s support included programming, diplomatic influencing, research and coordination among other diplomatic organisations. The evaluation looked at the important roles of programmes and diplomacy, and how the suite of interventions built on each other to contribute to the largely peaceful elections in 2022.
Good practice and recommendations from the evaluation are being used to strengthen the elections strategy for 2027, including more consolidated risk management processes, understanding interdependencies between work strands, and good practice around using programmes and diplomacy to build on each other. Key lessons have also been shared with other teams working on election issues. The theory-based evaluation methodology provides a useful example of how we can look systematically across a variety of levers – from programmes to influence and relationships – to learn.
Evolve our methods and build long-term expertise and capability to apply them
Over this strategy period, evaluative practice in the FCDO will evolve in step with changes in the organisation which emphasise a more integrated, whole-of-mission approach to delivering on the Government’s aims. We will:
- draw on a wide spectrum of evaluative techniques combining both traditional and innovative methods, including those suited to diplomatic and security contexts, and support evaluative thinking. We will evolve and curate guidance and tools for FCDO staff and prioritise capacity building activities to support use of these techniques
- harness innovation and technology. We will use emerging technologies ethically, including AI, to streamline evaluation processes, improve synthesis, and enhance accessibility of evidence
- invest in training and professional growth to build an expert, integrated evaluation/Government Social Research (GSR) profession with the skills and confidence to use a broad range of evaluative techniques and apply these in all areas of the FCDO’s work, including programming, policy, diplomacy and security
Integrate evaluation with other evidence and analysis
We will encourage multidisciplinary approaches where possible, working across analytical functions to provide the evidence decision-makers need. We will:
- deliver programmes of research which include evaluations that address critical evidence gaps in FCDO priority areas. Programmes of this nature will continue to generate robust evaluation evidence to inform decision-making, with findings made freely available to all through open access publishing
- use synthesis to draw lessons from across existing bodies of evidence, including Best Buy products, which synthesise robust cost-effectiveness evidence. The FCDO will pilot and, where appropriate, adopt tools for AI-enabled synthesis, both internally and with global partners
Maintain high quality standards for evaluation
Independent quality assurance will continue to form a central plank of the FCDO’s accountability and learning structure, helping to maintain consistent quality standards across a decentralised evaluation system. In addition, we will:
- set clear principles and minimum standards for FCDO-supported evaluations through the FCDO evaluation policy, including for activities beyond programme spend
- refresh guidance and templates to support high standards in delivery, including on emerging topics such as use of AI in evaluation and PMEL
- evolve our independent quality assurance offer by broadening access to external expertise, including experts with higher security clearance
- provide the department with access to high-quality suppliers and drive value for money, for example through the Global Evaluation and Monitoring Framework Agreement, which includes experts with higher security clearance
Work with partners
Working in partnership will be central to all that the FCDO does. In a rapidly changing and volatile global context, the need for coordination and collaboration is acute, not only for potential efficiency gains, but also to share lessons about what works. This strategy therefore seeks to emphasise partnership and collaboration. We will:
- actively contribute to the international evidence community through initiatives like the Global Evidence Commitment and the Multi-Donor Learning Partnership (MDLP)
- work closely with our Arms-Length Bodies (ALBs) to support them in strengthening their own evaluation systems
- refresh guidance and templates on participatory or inclusive approaches to evaluation and encourage co-creation of evaluation questions and evidence generation
- work across UK government departments to support monitoring and evaluation focused communities of practice and facilitate greater coordination on strategic themes
- identify opportunities for analytical diplomacy with partner governments and multilateral organisations
Workstreams will be designed as the FCDO’s organisational reform proceeds. These will be reviewed annually to maintain relevance.
Box 6: Furthering global understanding: prevention is possible
Impact evaluations supported by the FCDO-funded What Works to Prevent Violence research programme demonstrated for the first time that well‑designed interventions can reduce rates of Intimate Partner Violence (IPV) by up to 50% within just 2 to 3 years. This was significant because it showed that prevention is possible. Innovative violence against women and girls (VAWG)-prevention components, drawn from the findings of these evaluations, have been integrated into the design of the FCDO community resilience programmes in 5 countries in the Middle East and North Africa. The What Works research programme has also had a significant influence on the wider VAWG‑prevention field, shaping sectoral approaches and informing government policy and programming, as well as advancing the academic rigour and knowledge around how to prevent violence.
Use and uptake of evaluation evidence
How can we better support uptake and use of evaluation evidence to inform decision-making?
Effective use of evidence supports decision-making in a changing, and increasingly resource-constrained, environment. However, relevant evidence can be hard to find, and staff may not have the skills to make the most of evaluation evidence – problems often made worse by weak organisational incentives to use it.
As well as steps outlined above to address incentives, we will prioritise improving the availability of evaluation evidence and building staff capability to be intelligent customers for evaluation.
Make evaluation evidence more accessible
We will improve the visibility and accessibility of evaluation evidence by:
- developing a refreshed communications and visibility plan, including identifying ‘Evaluation Champions’ – highly networked individuals who can disseminate targeted evaluation insights to key stakeholders and help get the right evidence to the right people at the right time
- contributing to the development of an FCDO Knowledge Management Strategy to evolve how we save, access and share evidence so that learning and knowledge are preserved and accessible across the organisation. This includes regularly updating the FCDO’s internal repository for evidence and guidance – the Thematic Hub for Expertise, Analysis, Toolkits, Research and Evidence (THEATRE)
Strengthen staff capability to be intelligent customers of evaluation evidence
Monitoring and evaluation specialists, including from the evaluation and Government Social Research (GSR) profession, are embedded across FCDO teams to provide expert support and translate evidence into action. However, gaps in expertise limit access to technical support, making it harder to demonstrate the value of evaluation where awareness is low.
To address these issues, and support the work of embedded advisers, this strategy aims to:
- lead the Evaluation and Government Social Research cadre to support organisational reform. While decisions about investing in evaluation capability remain decentralised, the Head of Profession will advise on strategic deployment of advisers. They will also onboard and support new staff
- develop foundational training in evaluation skills, accessible to all staff. We will integrate key evaluation concepts into existing training, working with the College of British Diplomacy, and develop tailored content for key audiences. Central analytical teams and embedded evaluation advisers will continue to provide demand-driven training, alongside central training on impact evaluation delivered by world-leading experts
Box 7: Building evaluation skills
Between 2021 and 2025, nearly 1,200 staff engaged in MEL learning opportunities, including: tailored foundational training in 10 countries (reaching 240 staff from more than 45 FCDO posts), impact evaluation training delivered by world-leading experts for more than 160 staff, and training and support delivered to over 200 staff using the Monitoring and Evaluation of Influence Toolkit, a resource accessed by staff over 2,000 times.
Implementation of the strategy
Roles and responsibilities
The strategy depends on effective engagement and co-working across teams at the centre and in the network. Responsibility for coordinating implementation sits with the Head of the Evaluation Unit and the Head of the Evaluation and GSR Profession, with direct oversight by the Director of Analysis, and reporting to the Programme Investment Committee.
Monitoring, evaluation and learning advisers are central to delivering this strategy. They should actively use support from the Evaluation Unit and Head of Profession, who provide strategic direction, guidance and advocacy. At a minimum, advisers are expected to stay informed of central developments and engage in opportunities to build their capabilities.
In line with other FCDO cadres, advisers use their professional expertise and contribute up to 10% of their time annually to support priority projects in other areas. They also provide updates from their own areas of work, including contributing to central management information, sharing case studies and evaluation impact, and contributing to learning and dissemination activities.
Governance
The FCDO’s Director of Analysis is accountable for delivery of the strategy. The Director of Analysis and Evaluation Unit will report on progress to the Programme Investment Committee.
The committee will provide steers and approval for the approach, so it remains relevant and appropriate to organisational priorities. This may include signing off focus areas for centrally delivered portfolio evaluations each year and promoting the aims of the strategy across the organisation.
Monitoring, evaluation and learning
A MEL framework for the strategy will be developed, with delivery milestones for each workstream, based on available resource. The strategy and milestones will be reviewed annually.
Footnotes

1. An internal review of MEL-related training delivered during the previous strategy period found that participants viewed it as high quality and useful, leading to increased confidence and capability around foundational evaluation skills.
2. Aid budget reductions, the pandemic and the FCDO merger created a more favourable environment for fraud risk – Independent Commission for Aid Impact