International Science Partnerships Fund (ISPF): evaluation framework
Published 15 May 2025
The appendices for the evaluation framework are available in the PDF attachment. HTML versions are in development and will be published at a later date.
1. Executive summary
This document presents the evaluation framework for the International Science Partnerships Fund (ISPF), an international science, research, technology and innovation (SRTI) initiative, funded and managed by the UK Department for Science, Innovation and Technology (DSIT).
The Fund was created to support international research and innovation (R&I) partnerships between the UK and other countries, and brings together DSIT international funding – Official Development Assistance (ODA) and non-ODA – under a single structure.
ISPF is delivered by a consortium of research and innovation bodies (ISPF Partner Organisations), who work with international partners in the design, funding and delivery of ISPF activities.
ISPF in a nutshell
ISPF will be comprehensively evaluated over multiple stages. The current study (and this evaluation framework) establishes the foundations and plans for the subsequent evaluation work. Given the long timescales over which the full benefits of R&I are expected to be realised, DSIT considers it too early to specify the details of stage 5 (impact evaluation), and so the focus of the current evaluation framework is stages 2-4.
Summary of ISPF Evaluation Stages
The evaluation framework has the following key features and characteristics:
- It proposes a mixed-methods approach, underpinned by the Theory of Change (ToC) for the Fund (see Section 4). The ToC provides a “programme theory” that explains how an intervention (in this case the Fund) is expected to produce its intended results. It has a logic model as a starting point, which presents how the inputs to, and activities of, ISPF are expected to result in a series of immediate outputs. The outputs should then lead to a series of intended short-to-medium term outcomes, and in turn contribute to wider and longer-term expected impacts.
- All elements of the ToC have then been translated into a set of performance metrics for future monitoring and evaluation (M&E). This includes suggested indicators and evidence sources (qualitative and quantitative), plus recommendations for baselines and possible benchmarks (see Section 6.2). These performance metrics make maximum use of the existing evidence base and M&E efforts (i.e. data collected via the Annual Commission), while also identifying additional sources of information (data sources are described in Section 6.3).
- The evaluation framework also sets up 3 additional synthesis methods to estimate effectiveness and VfM: a Rubric-based Value for Money approach, a Qualitative Comparative Analysis (QCA), and a Return on Investment approach (see Section 6.4). Most of the performance metrics feed into these methods, while others will still be analysed and reported on within the wider effectiveness evaluation.
- The framework includes a recommendation to develop ~20 longitudinal in-depth case studies, each covering a specific ISPF programme and drawing on multiple sources to provide evidence across the ToC. The in-depth case studies will provide evidence to inform the effectiveness assessment following a Contribution Analysis approach (along with evidence on the mechanisms that led to the achievement of benefits and the contextual factors that hindered or enabled those). They will also provide the data needed for the VfM assessment and the QCA.
- The case study programmes have been selected using a multi-stage sampling framework to ensure good coverage across ODA/non-ODA funding, budget sizes, ISPF themes and primary activity types. The sample also includes 15 of the 22 POs, and programmes involving 13 ODA and 10 non-ODA partner countries (see Section 6.5).
Note that the approach set out in this document is intended to be iterative, and to evolve as the evaluation of ISPF progresses and more evidence becomes available. This could then lead to an update of the ToC and performance metrics (e.g. to capture effects not originally foreseen), and/or a change to the sampling strategy for in-depth case studies.
2. Introduction
2.1 The Fund
The International Science Partnerships Fund (ISPF) is an international science, research, technology and innovation (SRTI) initiative, funded and managed by the UK Department for Science, Innovation and Technology (DSIT). It was created in 2022 to support international research and innovation (R&I) partnerships between the UK and other countries, and brings together DSIT international funding – Official Development Assistance (ODA) and non-ODA – under a single structure, enabling strategic alignment and coherence. The Fund is delivered by a group of UK Partner Organisations (POs), working bilaterally and multilaterally with international partners in the design, funding and delivery of ISPF activities.
2.2 The study
DSIT is committed to embedding evaluation into every facet of its work, ensuring policies and programmes are driven by evidence and continuous improvement, that interventions deliver maximum impact, and that public funding is spent as effectively as possible. Monitoring, evaluation and learning (MEL) is important for the long-term success of ISPF. It ensures that there is robust evidence available to assess Fund performance and value for money (VfM), for learning and accountability purposes and to inform improvements to the design and delivery of this or other initiatives. The approach, as set out in the ISPF MEL Plan, includes:
- monitoring – regular and systematic data collection and reporting of performance data
- evaluation – fund-level, commissioned by DSIT and conducted by independent evaluators
- learning – dissemination of evidence and learning, to feed into continuous improvement
ISPF will be comprehensively evaluated across multiple stages, to understand how it was delivered and experienced (process), the extent to which it represents VfM, the extent to which it achieved its intended outcomes (effectiveness), and its wider impact.
Figure 1 - Summary of ISPF Evaluation Stages
The current study covers the preliminary stage (evaluation framework and baseline), establishing the foundations and plans for subsequent evaluation work. Given the long timescales over which the full benefits of R&I are expected to be realised, DSIT considers it too early to specify the details of stage 5 (impact evaluation), and so the focus of the current study is stages 2-4.
2.3 This report
This report presents the evaluation framework (including finalised Theory of Change (ToC), performance metrics and methods) that will ensure an appropriate and robust process for data collection, monitoring and evaluation is established, agreed, and planned from the start. It also provides first insights into the Fund itself, through an initial mapping of the ISPF portfolio.
The evaluation framework and portfolio analysis seek to address the following questions:
- have ISPF activities been designed to achieve the Fund’s objectives?
- are the right KPIs and reporting processes in place to effectively monitor ISPF performance and achievements, to assess VfM and answer evaluation questions?
- what does VfM look like in ISPF? How can DSIT and POs contribute to good VfM?
- how should ISPF evaluation in stages 2-4 be designed to answer the evaluation questions? What evaluation methods should be used?
The study will also include a separate baseline assessment of key indicators (and an updated mapping and analysis of the ISPF portfolio), to be delivered later in 2025.
2.4 The approach
This evaluation framework represents the culmination of a series of different workstreams that have been undertaken over the past year (as summarised in Figure 2). The report incorporates the intermediate outputs developed through these various activities and combines them to provide a comprehensive and cohesive framework for future evaluation activity.
Following an initial scoping phase, the main workstreams have involved the development of:
- a Theory of Change (diagram and narrative) for the Fund (see Section 4)
- an analysis of the current ISPF portfolio (programmes, awards, funding) (see Section 5)
- a set of performance metrics (indicators and evidence sources, plus recommendations for baselines and possible benchmarks) for future monitoring and evaluation (see Section 6.2)
- a proposed approach and methods, plus relevant data collection approaches and sources, for evaluation (covering effectiveness, process and VfM) (Section 6.3 onwards)
- a rubric (criteria, dimensions, standards and sources) for the assessment of Value for Money across the ISPF portfolio (introduced in Section 6.4.2 and presented in Appendix E)
Across the workstreams, we have combined desk-based research (of Fund documentation and data, plus wider literature), with stakeholder consultation (interviews, workshops, meetings, presentations and correspondence with DSIT analyst, policy and portfolio management teams, plus representatives from all Partner Organisations), and extensive development and analysis work by the study team, to arrive at the various outputs that now form part of this evaluation framework. Further details on the approach to each workstream are provided in Appendix A.
There are 2 further workstreams to be undertaken following the production of the evaluation framework. These are an updated analysis of the ISPF portfolio (based on latest data) and a baseline assessment of key indicators. These will be delivered later in 2025.
Figure 2 - Summary of activities
3. The International Science Partnerships Fund (ISPF)
3.1 The Fund
Through a global, partnership-based approach, ISPF seeks to deliver on 3 key missions within the Integrated Review (2021): establishing the UK as a science superpower; being a force for global good; and putting science and technology at the heart of UK international relations.
As summarised in Figure 3 (with further detail provided in Appendix B), the Fund aims to address global challenges best tackled collaboratively, by empowering individuals, institutions, and systems to deliver enhanced outcomes and impacts, as well as positive international influence and improved perceptions for the UK. It addresses 4 main themes: 3 relate to major challenges (‘resilient planet’, ‘healthy people, animals & plants’, ‘transformative technologies’), while the fourth (‘tomorrow’s talent’) supports the talent necessary to address these. It does so through equitable partnerships with partner countries selected for their geostrategic importance, alignment, and critical capacity to deliver against the Fund’s objectives and themes.
ISPF is designed to be a long-term fund, but with an initial £337m in the current Spending Review (SR) period (FY 2022/23 to 2024/25), of which £218m is ring-fenced for ODA to deliver research and innovation partnerships with low- and middle-income countries (LMICs) (and, within this, at least 20% delivered for the benefit of Least Developed Countries (LDCs)).
Figure 3 - Summary of ISPF objectives, scope and themes
3.2 Implementation
ISPF was announced during a ministerial visit to Japan in December 2022. The first phase was launched in April 2023, utilising £119m non-ODA funding over a two-year SR period. Then in July 2023 it was confirmed that ISPF would have up to £218m additional ODA funding, for research and innovation partnerships with LMICs. As such, the Fund combines ODA and non-ODA funded SRTI activities within a single portfolio. As shown in the portfolio analysis (Section 5), during the initial three-year period, 62% of budget allocations and 33% of programmes relate to ODA funding, with 38% and 67% respectively therefore relating to non-ODA funding, and with 15 of the 22 POs delivering across both.
The Fund is managed by DSIT, but implementation is decentralised to a consortium of leading research and innovation bodies (see Figure 4).
Figure 4 ISPF Partner Organisations
It is at this PO level that the Fund is translated into programmes, which are then managed as coherent packages of work. The Fund is designed to respond to priorities identified by Government, with DSIT setting the strategic direction through objectives, research themes and priority countries. However, ISPF POs are then empowered to design relevant calls, investments and other activities to reflect these priorities and any emerging demands they identify.
Across the resulting portfolio, ISPF supports all stages of research and innovation, from early stage, foundational research, through to applied research and commercialisation, as well as skill, talent and capacity development.
Figure 5 - Overview of the ISPF portfolio structure
In addition to research grant funding, and reflecting the “dual support” model of research funding in the UK for investment via UKRI, ISPF also makes available an equivalent amount of unhypothecated institutional support funding (also known as Quality Research funding). This funding (£55.8m over 2023/24-2024/25) is delivered through Research England, the Scottish Funding Council, the Commission for Tertiary Education and Research Wales (MEDR) and the Northern Ireland Department for the Economy. Though unhypothecated, the ODA-funded institutional support must be used for ODA-eligible outcomes, and is allocated proportionally to the level of ODA-eligible activity undertaken by a given Higher Education Partner.
4. ISPF Theory of Change
4.1 Introduction
A Theory of Change (ToC) is a “programme theory” that explains how an intervention (in this case the Fund) is expected to produce its intended results. It has a logic model as a starting point, which presents how the inputs to, and activities of, ISPF are expected to result in a series of immediate outputs, which should then lead to a series of intended short-to-medium term outcomes, which in turn should contribute to wider and longer-term expected impacts.
Figure 6 - Summary of Logic chain
A preliminary Logic Model (Appendix B.3) was developed by DSIT for the ISPF Business Case. This presented the objectives of the Fund, as well as planned inputs and activities, and the outputs, outcomes and impacts that these were expected to contribute to. This diagram needed to be reviewed, updated and further developed as part of the current study to reflect more recent changes to the design and intentions of the Fund, including updated objectives.
The approach to ToC development incorporated a desk-based review of relevant ISPF documentation (Business Case, MEL Plan, Fund Strategy, etc.), consultation with DSIT Policy and Analyst Teams and ISPF Partner Organisations (POs) via 3 workshops, and development work by the evaluation team. An early version of the new ToC diagram was shared with DSIT and Partner Organisations for comments and feedback, while the full draft ToC diagram and narrative that was developed was further iterated with DSIT before being finalised.
The new ToC (set out in the remainder of this section) has served as the foundation for many of the other elements then developed for this evaluation framework (e.g. the proposed indicators and value for money rubric). As such, it provides the framework against which the Fund itself will be evaluated in future (e.g. as the basis for assessing progress and achievements). It is also envisaged that the new ToC may be useful in communicating the Fund to different audiences, and in helping to increase understanding and engagement with ISPF and its intentions.
4.2 ISPF ToC diagram
Figure 7 presents the high-level ToC diagram for ISPF. This seeks to capture the main intentions and expectations for the Fund, in a structured way within a single diagram, and allow for these to be easily understood and communicated.
Given the complex nature of ISPF, not all elements of the ToC are presented in the diagram. Specifically, the diagram does not show how the different components (activities, outputs, outcomes and impacts) relate to each other. These connections (impact pathways and assumptions) are further discussed in the pages that follow.
4.3 ISPF ToC Narrative
4.3.1 Rationale for ISPF
Science, research, technology and innovation (SRTI) have long been identified as key engines of economic growth, prosperity and wellbeing. The UK’s future success as a strong, influential country, whose citizens enjoy prosperity, security, and fulfilled, healthy and sustainable lives, will depend on its ability to build on existing SRTI strengths. International collaboration is critical to ensuring a strong and growing UK SRTI sector; it can enhance the efficiency, effectiveness and quality of SRTI, and can deliver better outcomes. Citation impact is measurably higher for internationally co-authored papers, relative to national-only ones, while collaborating internationally produces outputs that are 1.1-1.8x more impactful than those from UK-only collaboration.[footnote 1]
Access to the best ideas, expertise and facilities internationally is beneficial for UK SRTI. With over 95% of R&I conducted outside the UK,[footnote 2] much knowledge, expertise and infrastructure sit elsewhere. Increasing access to global opportunities and talent will help the UK to remain at the forefront of cutting-edge SRTI, in particular providing access to:
- a range of opportunities for UK researchers and innovators to engage in international collaboration that would not otherwise occur
- a global talent pool and facilities to capitalise on and strengthen those of the UK
By pooling resources, the UK can do bigger, better science than it can alone; by sharing knowledge we avoid reinventing the wheel and have access to more expertise; and by working collectively we take more diverse approaches and deliver more creative solutions.
Another increasing driver of international collaboration is the nature of the challenges themselves, and the need to solve these together. Key issues such as carbon emissions and extreme weather, global pandemics, or new and emerging technologies do not respect national boundaries. They increasingly happen at a global scale, and require a global response. For this reason, it is important to make sure UK scientists, researchers and innovators can access not just other UK-based researchers and institutions, but can also act through global partnerships and networks. Through establishing these international partnerships, we are better positioned to address challenges head-on and systematically, as opposed to piecemeal. By working together on challenges we share with other countries, we are also better able to address these at home.
The UK is also committed to being a force for global good and to supporting the socio-economic development of LMICs. With the majority of the world’s population living in these countries, and the impacts of global challenges being disproportionately felt by those who have the least, it is imperative that the UK act to safeguard those most in need, and commit its fair share of funding and expertise to solving these challenges. As was highlighted in the October 2023 International Development White Paper there is a need for a collective global mobilisation of scientific expertise, research and innovation at the midpoint of the Sustainable Development Goals (SDGs) to accelerate progress to 2030. This includes enabling countries to leapfrog carbon-intensive phases of industrial development; using innovation and technology to tackle poverty, create jobs and sustainable economic growth; addressing biodiversity loss and harnessing nature-based solutions; and being better prepared for and more resilient to the impacts of climate change, including to avert future humanitarian crises.
Private investment in R&D is subject to many market failures, including economies of scale, information asymmetry, and positive externalities – where investors cannot reap the full value of investment, as knowledge spills over to competitors. The economic case for public support to address these failures (and sub-optimal levels of investment that result), is well established.[footnote 3]
With regards specifically to international collaboration in SRTI, there is the additional challenge of coordination failure, with substantial barriers to collaboration. A 2018 Technopolis study on the main drivers and barriers to international collaboration[footnote 4] identified that there was demand to do more collaboration with strategic partner countries, but that individual research organisations and UK businesses do less than they would wish to because multiple barriers exist to international SRTI. These include financial barriers and international resource constraints, but also the availability of collaboration frameworks and information about partners, regulatory issues, and the recognition and enforcement of IP, as well as issues related to researcher mobility and recruitment. Government intervention, removing barriers to collaboration through collaboration frameworks and dedicated international funding, is critical in addressing these.
Public sector funding for international R&D faces challenges, however. The ISPF Business Case noted that (in 2022) funding for international collaboration was stretched, with international non-ODA funding outside of Horizon Europe limited to relatively small elements of Partner Organisations’ core budgets or UKRI’s Fund for International Collaboration. Neither of these sources on its own was considered sufficient to keep up with both UK and international SRTI demands. The Business Case also noted that significant ODA budget cuts (a 65% reduction in the Spending Review 2020) had limited the ability to deliver against HMG priorities and were felt to have severely impacted the reputation of the UK as a reliable research partner.
There is a risk that valuable opportunities are missed due to a lack of available budgets. The ISPF Business Case notes that UK R&I Partner Organisations report increasing demand from Partner Countries to either grow existing relationships, or to create new partnerships, but that these opportunities are rarely captured due to a lack of available budgets. Examples and evidence were provided, covering both ODA and non-ODA funding. Not being able to invest in these opportunities may also mean ground is lost to other countries who are able to invest.
Finally, there is potential benefit from greater coherence, alignment and targeting of efforts. By centralising a large proportion of international SRTI funding under one funding vehicle, DSIT has opportunities to (i) ensure better coherence across all international R&D spend; (ii) create stronger alignment with government and ministerial priorities; and (iii) allocate funding according to where it can deliver the best value for money and most benefit society. It can also allow for a more balanced portfolio in terms of risk (i.e. greater risk taking than might otherwise be the case), thereby delivering higher, more impactful returns overall.
4.3.2 Aims and objectives
The overall aim of ISPF is to enable potential and foster prosperity by supporting international collaboration in Science, Research, Technology and Innovation (SRTI), with countries around the world, to address some of the major themes of our time. Four themes have been identified:
- resilient planet, including contributing to humanity’s efforts to reduce carbon emissions, mitigate climate change and adapt to its impacts.
- transformative technologies, including forming and strengthening industry-academia partnerships that bring forward emerging technologies and the business know-how to help them flourish.
- healthy people, animals and plants, including advancing innovative health technologies and deepening our understanding of pandemics, genomics, and pathogen detection, as well as improving our understanding of the socio-cultural mechanisms underpinning our relationship with vectors of health and disease.
- tomorrow’s talent, including connecting researchers and innovators, supporting their professional development and the translation of their ideas into businesses and products, and building global research networks.
ISPF has 6 high-level objectives relating to the overarching ambitions of the Fund:
- international partnerships with impact: Deliver better R&I together than we could alone, by developing long-term strategic international partnerships at every level to address shared priority areas
- addressing shared / global challenges: Support sustainable global development and address specific challenges facing low, medium and high income countries, by developing equitable partnerships and delivering targeted programmes and initiatives that contribute to government strategic priorities
- enabling potential: Strengthen R&I capacity for UK and international partners at individual, institutional and system level, by empowering talented individuals and teams, by promoting knowledge sharing and collaboration across borders, disciplines and sectors, and by supporting the development of new ideas
- collaborating at the forefront of SRTI: Strengthen the quality of UK SRTI, by collaborating with international partners at the forefront of SRTI, benefitting society and generating strategic advantage
- using our influence: Help the UK to shape and influence global standards and norms, by working closely with government agencies, international organisations, civil society, and others to advance a shared agenda on issues such as data protection, IP, open science, and privacy
- improving perceptions: Help improve the reputation of the UK and UK R&I by building long-term relationships, working in a fair and transparent way, and demonstrating the benefits of our international partnerships
4.3.3 Inputs and activities
Figure 8 shows the inputs and activities section of the ISPF ToC diagram. These elements are described in more detail below.
Figure 8 - ISPF ToC Diagram – Inputs and Activities
One of the main inputs to ISPF is funding to support international SRTI collaboration. This includes UK government funding (an initial £119m of non-ODA spend and £218m of ODA spend for FY 22/23-24/25), plus public and private co-funding (in cash and in-kind) from partner countries. Contribution levels will vary across the ISPF portfolio, but there are specific considerations for Least Developed Countries,[footnote 5] where there is no requirement for co-funding given the material pressures of LDCs, but without prejudicing their ability to contribute where they desire to.
The Fund is managed by DSIT and delivered by a consortium of leading research and innovation bodies, listed below.
- Academy of Medical Sciences (AMS)
- British Academy (BA)
- British Council (BC)
- Royal Academy of Engineering (RAEng)
- Royal Society (RS)
- UK Atomic Energy Authority (UKAEA)
- Universities UK International (UUKi)
- Met Office (MO)
- National Physical Laboratory (NPL)
- Connected Places Catapult (CPC)[a]
- Energy Systems Catapult (ESC)[a]
- Offshore Renewable Energy Catapult (OREC)[a]
- The Faraday Institution (FI)[a]
- Arts and Humanities Research Council (AHRC)
- Biotechnology and Biological Sciences Research Council (BBSRC)
- Economic and Social Research Council (ESRC)
- Engineering and Physical Sciences Research Council (EPSRC)
- Innovate UK (IUK)
- Medical Research Council (MRC)
- Natural Environment Research Council (NERC)
- Science and Technology Facilities Council (STFC)
- UK Research and Innovation (UKRI)
[a] = Associate POs
Alongside funding allocation, DSIT also provides guidance and steer on the scope and intentions of the Fund (e.g. objectives, themes and partner countries / territories). Wider policy steer for ISPF can also be found in other policy documents, including the Integrated Review and International Development White Paper[footnote 6] (alongside ISPF specific policies on areas such as equitable partnerships, and ODA eligibility as stated in the International Development Act).
The Fund is designed to respond to priorities identified by government, with DSIT setting the strategic direction. However, ISPF POs are then empowered to design relevant funding calls and activities to reflect these priorities and any emerging demands they identify.
The Fund also benefits from prior knowledge, skills, expertise and relationships of stakeholders involved (DSIT, POs, International Partners, FCDO, SIN and SRTI communities). This includes building on and learning lessons from past programmes and initiatives – including those supported through the Newton Fund, the Global Challenges Research Fund (GCRF) and the Fund for International Collaboration (FIC) – as well as existing agreements and ways of working.
ISPF supports all stages of research and innovation, from early stage, foundational research, through to applied research and commercialisation, as well as skill, talent and capacity development. There are 8 main types of activities currently supported across the portfolio (although individual ISPF programmes may involve a combination of these):[footnote 7]
- international collaborative academic research: These tend to be typical collaborative R&D and Innovation projects (with research plans and expected R&I outputs). They include multi- and interdisciplinary, challenge-driven and (in the case of ODA) partner-led activities
- translational research: Researchers turn scientific discoveries from laboratory-based research into real-world applications, developing new products and services. These activities might support discoveries maturing from basic research to clinical trials and commercial development, as well as the development of prototypes and patents.
- international mobility: Researchers (both from the UK and partner countries) participate in training and secondment activities. These vary in length and intensity, and include both short visits (e.g. to better understand available research resources and infrastructure) and longer stays that will usually involve working on a particular research topic, as well as access to personnel and infrastructure.
- institutional R&I capacity building: These activities focus on strengthening the ability of institutions (universities, research organisations, industry partners) to conduct SRTI activities. Funding might support the development of new interdisciplinary research programmes, doctoral training partnerships, support for knowledge and exchange activities and funding for early-career researchers.
- international collaborative business-led research, development & demonstration: These activities encourage businesses to collaborate with international partners, including to explore new markets or to develop new or improved products, processes and services.
- investment in access to infrastructure / facilities: These activities support access to and development of research infrastructure. They are expected to lead to the generation of knowledge and expertise within the context of the programme, as well as future research avenues (which may then take place outside of the ISPF programme).
- pump priming: These activities support initiatives for early-stage research, the exploration of new ideas or the development of future projects. For example, ISPF funds small projects or feasibility studies that are exploring new ideas or concepts to assess their viability for further research (which would be funded and conducted beyond the ISPF programme).
- networking and workshops: Activities such as visits, workshops, conferences and joint working that support idea generation and the exploration of common areas of research interest, partnership building, scoping and preparation of future research proposals. These activities are expected to generate new proposals, develop new partnerships between researchers, identify avenues for further collaboration and share research best practice.
As such, ISPF provides holistic support to international collaborative R&I, including the research itself, as well as enabling / adjacent activities such as partnership building and skills development, capacity building for future collaboration, and access to research infrastructure.
Assumptions
There are a number of assumptions (and associated risks and challenges) for the Fund and its ambitions overall. They relate to the design and delivery of the Fund, and so are mentioned here, but apply to the realisation of expectations across the ToC. They include:
- financial and political support for ISPF continues (at a sufficient scale, and beyond the current funding period). This mainly relates to UK Government / DSIT support and funding, but also applies to Partner Organisations in the UK and International Partners, whose ongoing interest and commitment to the Fund is important.
- current priorities are maintained (e.g. in relation to themes, countries, ODA/non-ODA). Again, this primarily relates to shifts in UK Government / DSIT priorities, but changes in other partners could also create misalignment with ISPF’s scope and activities.
- the design of the Fund is appropriate to effectively deliver against expectations. This includes key features of ISPF, such as: the devolved delivery model – i.e. a Fund managed by DSIT, but delivered by a consortium of POs; the blended ODA / non-ODA approach within a single fund; the choice of Priority Themes which are broad in scope rather than more specific challenges or missions; and a defined list of eligible Countries and Territories.
- the Fund has sufficient scale to contribute meaningfully to wider and longer-term effects. A critical mass may be required overall, but also across different themes, countries, partners and types of activity. Linked to this and the previous point, there is a risk from the devolved delivery approach of a lack of coordination or coherence across the portfolio, resulting in an inappropriate balance of funding or activities, or missed synergies.
- there is sufficient time and capacity to deliver (e.g. in terms of there being sufficient human resources within DSIT / POs and sufficient calendar time available to establish and deliver programmes), which may be further exacerbated by factors such as mis-matched funding cycles among partners, or the time and effort required to establish agreements and effective ways of working (e.g. around data sharing).
- other practical challenges to implementation in terms of e.g. mobility (visa issues) and international working (different norms, languages, cultures) are minimal or can be overcome.
4.3.4 Outputs and outcomes
Figure 9 shows the outputs and outcomes section of the ISPF ToC diagram, which are the Fund’s spheres of direct attribution and contribution, respectively. Outputs are expected to materialise as the ISPF projects and programmes progress, while outcomes are expected to emerge between 0-3 years after ISPF projects and programmes have ended.
These elements are described in more detail below.
Figure 9 - ISPF ToC Diagram – Outputs and Outcomes
By conducting the activities detailed in the previous section in collaboration with international partners, ISPF programmes are expected to leverage further resources for ISPF projects and activities (i.e. extra funding beyond the initial inputs, capacity, and other resources). These are outputs of ISPF (in that ISPF serves as a mechanism to attract international funding), but also serve as additional inputs to the Fund (alongside ISPF expenditure and partner co-funding).
In terms of other outputs and outcomes expected from ISPF activities, these have been grouped into the 5 broad areas below (although there are cross-overs and interlinkages between these different areas, with multiple pathways from individual outputs to outcomes).
Partnership outputs & outcomes
The design and implementation of many ISPF activities will support the creation of new partnerships, or the further strengthening of existing partnerships, between individuals, institutions and organisations, and countries, across borders and across sectors (academia, industry, third sector, policy, funders). This may be supported or recognised through the establishment of new or strengthened agreements or Memorandums of Understanding (MoUs) between these different parties.
It is anticipated that these specific partnerships will continue over time, beyond the life of ISPF, and more generally that there will be an increased ability for the UK and partner countries to collaborate, as a result of new knowledge and understanding, increased access, or enhanced ways of working. In both cases, this should result in an overall increase in joint activities in common areas of interest (be that joint research, investment, coordination, etc.). Improved connectivity between industry and academia is also expected (in particular for UK and ODA participants), as information and knowledge are shared in the context of ISPF and through further partnerships and collaborations that are enabled by the Fund. More generally, it is anticipated that ISPF partnerships and interactions will help increase or sustain the reputation of the UK as an R&I ‘partner of choice’ or as a destination of choice for talent and investment.
Assumptions
Partnerships and interactions have developed positively, with mutual benefits for those involved.
Partnerships have been institutionalised, either via formal or informal means, such that they can remain over time, regardless of whether the individuals involved change positions or organisations.
ISPF funded activities have enabled new partnerships between industry and academia that did not exist before, improving connectivity.
Research outputs & outcomes
Many ISPF activities – particularly collaborative R&D and activities relating to infrastructure investment / access – are expected to lead to (typical) R&I outputs. This includes high-quality peer-reviewed publications co-authored between UK and international researchers, but also other types of publications, including policy briefs, working documents and synthesis reports, that are tailored to audiences outside academia (including policy makers and industry). Other types of research outputs are also expected (new datasets, software, models, creative products and standards), depending on the nature of the specific activity.
Given the focus of ISPF funded activities on solving common challenges, it is also expected that research outputs will have a high degree of inter- and multi-disciplinarity, in addition to providing new or enhanced knowledge on areas related to the ISPF themes.
In line with the EDI commitments of ISPF, it is also expected that there will be a proportionate gender balance in authorship across these research outputs, as well as an equitable representation of UK and international researchers.
Furthermore, in the case of the research outputs emerging from ODA-funded activity, it is expected that they are focused on improving the socioeconomic development of LMIC/LDCs.
Dissemination of these various outputs will take place via various means (e.g. conferences and presentations, social media, teaching and training activities).
These research outputs are expected to help increase or sustain the quality and/or competitiveness of R&I in ISPF themes (in particular for UK and ODA participants), as well as exert influence on wider SRTI ecosystems, including through contributions to the development of standards, policies, research agendas and the strengthening of research cultures (with the latter being also influenced by knowledge and skills outputs, described below).
The use, uptake and application of solutions developed through ISPF are also expected to increase or improve the ability to tackle global and socioeconomic challenges (through e.g. their influence on policy and standards, or on the products and services available).
Assumption
ISPF-funded research tackles global and socioeconomic challenges and this is widely disseminated among (and accessible to) relevant end-users.
Innovation outputs & outcomes
Translational research and business-led innovation, in particular, are expected to lead to (typical) innovation outputs, including new and improved products, services and processes, as well as new and improved technologies (with an increase in the technology readiness level (TRL)), plus Intellectual Property or patents and new spin-offs or start-ups.
These outputs are expected to increase the ability to commercialise research and technology (in particular for UK and ODA participants), including access to global supply chains, trade opportunities, key infrastructure and skills. In the long-term, this should lead to increased income from commercialisation of research and technology (in particular for UK and ODA beneficiaries), including from new markets explored through ISPF activities. Innovation outputs may also help tackle shared challenges.
Assumptions
Support provided by ISPF, and progress made is sufficient to support commercialisation or to unlock further resources for developments towards commercialisation (de-risking).
There is also an assumption that successful innovation outcomes would outweigh the (inevitable) failures. Related to this, the extent and breadth of innovation outputs and outcomes will in part be dependent upon the share of the ISPF portfolio that is dedicated to translational and business-led research (where these outputs and outcomes are much more likely, at least in the shorter-term).
In the case of ODA programmes/projects, there is also the assumption that international partners are able to commercialise or benefit from the commercialisation of research and technology emerging from their joint ISPF activities with UK partners.
Knowledge and skills outputs & outcomes
Project activities (particularly those that focus on capacity building) are expected to lead to the development of new knowledge and skills, including new and improved understanding among researchers, managers and industry, of various aspects including:
- (research) user needs, not only in terms of topics but also how best to make evidence accessible to wider audiences
- research methods
- common challenges and priorities
- how best to manage domestic and international R&I projects
- how best to incorporate EDI in the research design and implementation
- how best to implement Responsible R&I, at institutional and project level
These various outputs should also support increased research capabilities, including research leadership, for both UK and ODA beneficiaries.
ISPF activities are also expected to lead to new and improved understanding of available research capacity, capabilities and infrastructure among partners. This is particularly relevant for some POs (e.g. STFC), where it is expected that this may lead to an increased demand for those facilities and future joint ventures.
Assumption
Learnings are socialised in such a way that they remain over time, regardless of whether individuals involved change positions or location.
Future research and innovation
Projects are expected to lead to the establishment of joint areas of interest and joint priorities, at the country, funder, and researcher or innovator level, as well as to the identification of new R&I ideas that may be pursued in future, possibly through international partnership.
As proposals are co-developed to take these projects and ideas forward, it is expected that researchers and innovators behind those ideas are able to leverage further funding (public & private, national & international), beyond initial ISPF funding and awards. This may take the form of further grants, or investments, or the use of internal resources.
Assumption
There is funding available (in the UK / internationally) to support projects and ideas emerging from ISPF funded activities.
4.3.5 Impacts
ISPF is expected to contribute to (influence) the attainment of impacts that relate closely to its high-level objectives (outlined in Section 4.3.2 above). These impacts are expected to materialise in the medium to longer term, between 4-10 years after ISPF projects and programmes have ended. They include:
- new, re-established or strengthened long-term strategic international partnerships, at Partner and Government level, that are advancing common strategic areas in R&I
- strengthened R&I capabilities in the UK and in international partner countries, at the individual researcher, institutional, and system level
- strengthened SRTI quality, which is leading to strategic advantage and socio-economic benefits for both UK and ODA beneficiaries.
These increases in SRTI capabilities, quality and joint-working are also expected, over time, to support progress towards other areas of impact, including the addressing of specific shared challenges / priorities; and the delivery of social and environmental benefits. This connects back to the original rationale for intervention (Section 4.3.1), and the fact that challenges such as carbon emissions and extreme weather, global pandemics, or new and emerging technologies are best addressed internationally, through shared ideas, expertise and facilities.
There is also the expectation that strengthening international R&I partnerships could support wider diplomatic efforts and enhance the UK’s soft power and influence, which in turn could also support better terms and conditions in economic policy (e.g. trade agreements).
There is also the expectation that innovation (and a strong SRTI sector) can deliver economic growth, by supporting increases in productivity and competitiveness.
Finally, additional expected impacts from ISPF relate to supporting the global strategic position of the UK, and the opportunities and advantage that this enables. This includes through:
- the shaping and influencing of wider SRTI ecosystems to (better) align with UK needs and ambitions (for example through norms, standards, culture, policies and regulations)
- improved international perceptions and reputation of the UK as a trusted (fair and committed) R&I partner for future joint-working and investment.
These 2 impacts will also have positive feedback loops to other impact areas described above, including in particular the advancing of common strategic areas and addressing of shared challenges, and support to wider diplomatic efforts.
Assumptions
The scale of ISPF (resources and duration) is sufficient to contribute to strengthening R&I capabilities and quality.
Research- and innovation-based solutions provide sufficient input to deliver economic growth, and social and environmental benefits.
There are multiple routes (as well as feedback loops) through activities, outputs and outcomes that could support any of these areas of impact. However, we have developed a series of ToC sub-diagrams that highlight the main and most significant pathways for each of the 6 high level objectives and impact areas (noting that other activities, outputs and outcomes may also have relevance). These diagrams and a narrative of the pathways are shown in Appendix B.4.
5. Portfolio analysis
5.1 Introduction
This section provides a high-level overview of the ISPF portfolio, including breakdowns by relevant dimensions. Detailed presentations of the portfolios for each of the 22 ISPF Partner Organisations (POs) are then provided in Appendix C.
The analysis is based on Level B Allocations data (maintained by the DSIT PMO team and provided in January 2025) and PO reporting through RODA (to Q4 2023/24). It is therefore important to note that the analysis provides a snapshot in time. Additional programmes will have been added, programme allocations changed, awards made, and funds expended over the subsequent months, which are not yet captured. The analysis will be repeated at the point of the baseline evaluation (based on the latest data), capturing more recent changes.
Details of the approach to undertaking the portfolio analysis are presented in Appendix A.5. However, there are some key points to note before reading the analysis:
- level B entries in RODA have been used to identify ISPF ‘Programmes’, with the Level B Title used for the ‘Programme Name’. Where the same programme appears in both ODA and non-ODA databases, 2 programmes are recorded (with ODA / non-ODA noted).
- the only exception to the above is where the Level B Title indicates that the RODA entry relates to Delivery Costs. These are presented separately in the analysis (as ‘delivery costs’, not ‘programmes’), with an indication of whether costs relate to a specific programme, or to the PO portfolio more generally (based on the information provided in the Level B Title).
- level C entries in RODA have been used to determine whether there are one or more ‘Rounds’ of funding being deployed within the programme [Information on the relevant round (e.g. a number or year) has been identified from the RODA field: Activity Title]. Only those rounds where at least one award is recorded in RODA (see below on awards) have been included in the current analysis. Other rounds will be incorporated within future iterations of the portfolio analysis as and when awards are reported.
- where entries are provided at Level D in RODA, these have been recorded as ‘Awards’ in the analysis. The number of awards presented is based on the number of Level D entries. Some Level C entries also include summary details on the number of awards [RODA field: Total Awards], but this has not been used in the analysis, due to inconsistencies in reporting.
- the analysis begins with information on Allocations to different partner organisations and programmes, split by ODA and non-ODA funding. This is based on the information presented within Level B Allocations data (provided by DSIT in January 2025), and covers 3 financial years (2022/23, 2023/24 and 2024/25).
- the analysis then focuses on the “current” portfolio (March 2024), based on whether any expenditure had been reported (against Level B, C or D entries) in RODA as of Q4 2023/24. For each programme, the analysis presents an overview of reported spend to date [sum of figures in field: Actual Net (all quarters up to and including Q4 2023/24)], as well as forecast future spend [sum of figures in field: Forecast (all quarters Q1 2024/25 to Q4 2024/25)], as illustrated in the sketch after this list. Note: (i) spend is reported differently for ODA programmes (near cash) and non-ODA (accruals basis); (ii) programme-level expenditure figures include amounts reported against Level B, C and D in RODA (combined), unless otherwise stated; and (iii) the analysis only includes financial figures for the initial 3 year period (and not any forecast spend beyond this).
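To make the processing steps above concrete, the sketch below shows how programme, award and spend figures could be derived in code. It is illustrative only: it assumes a simplified, flattened export of RODA with hypothetical field names (level, title, programme, actual_net, forecast); the real RODA data is structured and named differently.

```python
from collections import defaultdict

# Illustrative, simplified RODA-style entries (hypothetical field names; figures in £m).
# In the real data, Actual Net and Forecast are reported per quarter and summed.
roda_entries = [
    # Level B entries identify programmes (the Level B Title is used as the programme name)
    {"level": "B", "title": "Programme X (ODA)", "programme": "Programme X (ODA)", "actual_net": 1.2, "forecast": 0.8},
    # Level B entries whose title indicates delivery costs are treated separately
    {"level": "B", "title": "Delivery Costs - PO Y", "programme": None, "actual_net": 0.2, "forecast": 0.2},
    # Level C entries indicate rounds of funding within a programme
    {"level": "C", "title": "Programme X - Round 1", "programme": "Programme X (ODA)", "actual_net": 0.5, "forecast": 0.4},
    # Level D entries are individual awards
    {"level": "D", "title": "Award 001", "programme": "Programme X (ODA)", "actual_net": 0.3, "forecast": 0.1},
]

programmes = set()
delivery_cost_lines = []
awards = defaultdict(int)           # number of Level D entries per programme
spend_to_date = defaultdict(float)  # sum of Actual Net across Levels B, C and D combined
forecast_spend = defaultdict(float)

for entry in roda_entries:
    if entry["level"] == "B" and "Delivery Costs" in entry["title"]:
        delivery_cost_lines.append(entry["title"])  # presented separately, not as a programme
        continue
    if entry["level"] == "B":
        programmes.add(entry["title"])
    elif entry["level"] == "D":
        awards[entry["programme"]] += 1
    spend_to_date[entry["programme"]] += entry["actual_net"]
    forecast_spend[entry["programme"]] += entry["forecast"]

# The "current" portfolio is restricted to programmes with any reported spend to date
current_portfolio = sorted(p for p in programmes if spend_to_date[p] > 0)
print(current_portfolio, dict(awards), dict(spend_to_date), dict(forecast_spend))
```

In the actual analysis, the same logic would be applied to the full RODA export and combined with the Level B Allocations data (for Themes and partner countries), as described above.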
5.2 ISPF allocations
ISPF Funding Allocations by Type:
- a total of £347.0m is allocated to ISPF over 2022/23 to 2024/25.
- the majority (72% or £250.1m) is allocated to the 163 programmes currently planned across the different Partner Organisations (POs).
- the remainder relates to PO delivery costs (£34.6m), DSIT delivery costs (£6.5m) and Institutional Support Awards (£55.8m); a short arithmetic check of this breakdown follows the list. See Figure 10.
- just over two-thirds (69%) of all allocations relate to ODA funding. See Figure 11.
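As a quick consistency check on the headline allocation figures above (illustrative arithmetic only, using the values as stated):

```python
# Headline ISPF allocation figures (£m), 2022/23 to 2024/25, as stated above
programme_allocations = 250.1
po_delivery_costs = 34.6
dsit_delivery_costs = 6.5
institutional_support = 55.8

total = programme_allocations + po_delivery_costs + dsit_delivery_costs + institutional_support
print(f"Total allocation: £{total:.1f}m")                               # £347.0m, matching the stated total
print(f"Programme share: {100 * programme_allocations / total:.0f}%")   # 72%, as reported
```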
Figure 10 - ISPF – Total Allocations by Year (£m)
Source: Technopolis based on Level B Allocations Data 2024.
Figure 11 - ISPF – Total Allocations (£m, 2022/23 – 2024/25), by ODA and non-ODA
Source: Technopolis based on Level B Allocations Data 2024.
ISPF Funding Allocations by PO:
- allocations are spread across 22 Partner Organisations (POs). See Figure 12. Note that DSIT gives one allocation to UKRI, which is then divided internally between the different Councils, Innovate UK and UKRI itself.
- the scale of allocations varies between the POs, from £0.9m (the Faraday Institution, with 1 programme) to £55.6m (Innovate UK, with 9 programmes) (allocation figures include both programme and delivery costs)
- 62% of allocations to POs relate to ODA funding. See Figure 13.
- all 22 POs have non-ODA allocations, while 15 also have ODA allocations.
Figure 12 - ISPF – Total Allocations (£m, 2022/23 – 2024/25), by PO
Source: Technopolis based on Level B Allocations Data 2024. Bottom right boxes are OREC and FI.
Figure 13 – ISPF: Proportion of Total Allocations to POs (£m, 2022/23 – 2024/25), by ODA / Non-ODA
Source: Technopolis based on Level B Allocations Data 2024.
5.3 ISPF “Current” Portfolio (March 2024)
Not all of the POs, Programmes or PO Delivery Cost lines that are present in the allocations data appear (yet) in RODA data:
- allocations data shows 22 POs, with 163 programmes, and 65 PO delivery cost lines (some for specific programmes, some covering more than one programme, some covering the PO’s entire portfolio), with total allocations over 3 years of £284.7m.[footnote 8]
- by comparison, RODA data (as of Q4 2023/24) includes 21 POs, 137 programmes and 50 PO delivery cost lines. The programmes and delivery cost lines currently included relate to £265.5m of allocations (97.4% of the total allocations to POs).
There are several reasons why programmes present in the allocations data may not (currently) be captured through RODA. Missing items may include some or all of the following:
- programmes with allocation for FY 22-23 which were initially funded through the DSIT Tactical Fund mechanism
- programmes where there has not yet been any expenditure
- programmes which were originally agreed between a Partner and DSIT, but will not go ahead (flexibility is given to POs to transfer allocation between programmes)
Also note that the Faraday Institution (which appears in allocations data but not currently in RODA) is a newer PO with allocations / spend only commencing in 2024/25 Q1.
We have limited the remainder of the portfolio analysis just to those programmes where spend has already been recorded as of Q4 2023/24. This provides a view of the ‘live / underway’ programme portfolio at that point in time, which can then be updated in future iterations. Henceforth this group of programmes is referred to as the “current” portfolio (as of March 2024).
We have had to exclude PO delivery costs from the analysis, as these are often not tied to specific programmes.
The remaining analysis focuses mainly on information reported in RODA. However, information on ISPF Themes and Partner Countries is taken from the allocations data, where any changes to these fields over time have been recorded.
DSIT delivery costs and Institutional Support Awards are detailed in both the allocations data and RODA data, but as the following sections focus on the ISPF programme portfolio, they are excluded from the analysis.
ISPF Programmes:
- currently (March 2024), ISPF has a portfolio of 89 programmes that are live or underway (according to Q4 2023/24 RODA data).
- these are being delivered by 18 Partner Organisations (POs), who have between 1 and 12 programmes each (5 per PO on average).
- there are currently 65 non-ODA programmes (delivered by all 18 POs) and 24 ODA programmes (delivered by 9 of these POs). See Figure 14
Figure 14 - ISPF – Number of current ODA / non-ODA programmes by Partner Organisation
Source: Technopolis based on RODA Data Q4 2023/24.
ISPF Awards:
- 42 programmes (and 14 POs) have made awards so far (March 2024).
- these programmes have made 507 awards in total (sometimes across more than one round of funding).
- a similar number of ODA (257) and non-ODA (250) awards have been made.
ISPF Spend:
- past expenditure of £59.2m is already reported[footnote 9] across current programmes, up to Q4 2023/24 (with a further £96.2m forecast for these same programmes for the remainder of the 3 year period to Q4 2024/25).
- 55% of past expenditure on current programmes is ODA and 45% non-ODA.
- £34.1m of past expenditure is through awards (58% of programme total).
Figure 15 - ISPF – Past expenditure (£m) of current ODA / non-ODA programmes by Partner Organisation
ODA:
- Within the current portfolio, 24 programmes (27%) are with ODA countries, with total spend of £32.7m already reported[footnote 10] against these programmes.
- The most common partner countries[footnote 11] are South Africa (16 programmes), Kenya (13), Malaysia (12) & Thailand (12). Others include Brazil (11), Egypt (11), Indonesia (11), Jordan (10), Philippines (10), Vietnam (10) & Turkey (9)[footnote 12]. See Figure 16.
Figure 16 - ISPF – Number of current programmes by ODA Partner Country
Source: Technopolis based on RODA and Level B Allocations data (for partner countries), 2024
Non-ODA:
- Within the current portfolio 65 programmes (73%) are with non-ODA countries, with total spend of £26.6m already reported[footnote 13] against these programmes.
- The most common partner countries[footnote 14] are the United States of America (24 programmes), Japan (18), Canada (16) and India (14). Others include South Korea (12), Switzerland (11), Australia (10), Germany (8), Israel (7), Taiwan (6), China (5), Ireland (5), France (4), Netherlands (4), New Zealand (4), Denmark (1), and Latvia (1). See Figure 17 below.
Figure 17 - ISPF – Number of current programmes by non-ODA Partner Country
Source: Technopolis based on RODA and Level B Allocations data (for partner countries), 2024
ISPF Themes:
The number (and proportion) of current programmes tagged against each of the 4 ISPF Themes[footnote 15] is as follows (programmes can be tagged against more than one Theme, so the proportions sum to more than 100%; the split by ODA/non-ODA is shown in Figure 18):
- Resilient Planet (43 programmes, 48%)
- Transformative Technologies (39 programmes, 44%)
- Nurturing Tomorrow’s Talent (37 programmes, 42%)
- Healthy People, Animals & Plants (18 programmes, 20%)
Table 1 shows the breakdown by Theme across different POs.
Figure 18 - ISPF – Number of current ODA / non-ODA programmes by ISPF Theme
Source: Technopolis based on RODA and Level B Allocations data (for Themes), 2024
Table 1 - ISPF – Number of current programmes by ISPF Theme and Partner Organisation
PO | Resilient Planet | Transformative Technologies | Nurturing Tomorrow’s Talent | Healthy People, Animals & Plants | All Programmes |
---|---|---|---|---|---|
STFC | 3 | 10 | 6 | 1 | 11 |
IUK | 2 | 4 | 0 | 0 | 5 |
MRC | 0 | 0 | 2 | 4 | 4 |
EPSRC | 4 | 1 | 0 | 1 | 4 |
NERC | 3 | 1 | 1 | 0 | 3 |
BBSRC | 1 | 2 | 0 | 0 | 2 |
UKRI | 0 | 0 | 0 | 0 | 0 |
ESRC | 0 | 0 | 0 | 0 | 0 |
AHRC | 0 | 0 | 0 | 0 | 0 |
UKRI Total | 13 | 18 | 9 | 6 | 29 |
NPL | 6 | 6 | 1 | 0 | 12 |
RAE | 8 | 7 | 9 | 7 | 10 |
UKAEA | 5 | 4 | 1 | 0 | 8 |
AMS | 0 | 0 | 4 | 5 | 7 |
BA | 0 | 1 | 6 | 0 | 6 |
BC | 4 | 2 | 2 | 0 | 5 |
RS | 0 | 0 | 3 | 0 | 3 |
ESC | 3 | 0 | 0 | 0 | 3 |
MO | 2 | 0 | 1 | 0 | 2 |
OREC | 2 | 0 | 0 | 0 | 2 |
CPC | 0 | 1 | 0 | 0 | 1 |
UUK | 0 | 0 | 1 | 0 | 1 |
FI | 0 | 0 | 0 | 0 | 0 |
All POs | 43 | 39 | 37 | 18 | 89 |
Source: Technopolis based on RODA and Level B Allocations data (for Themes), 2024
Activity Types:
- current ISPF programmes most commonly include the following activity types:[footnote 16] International Collaborative Academic Research (66% of all programmes[footnote 17]), Networking and workshops (48%) and Institutional R&I capacity building (34%). International Collaborative Business-Led Research activities are taking place within 30% of programmes in the current portfolio. See Figure 19.
- Figure 20 then presents the portfolio by (single) main activity type.
Figure 19 - ISPF – Number of current ODA/non-ODA programmes that include each activity type
Source: Technopolis based on RODA and POs input, 2024
Figure 20 - ISPF - Percentage of current programmes by main activity type
Source: Technopolis based on RODA and POs input, 2024
5.4 Further reflections from the portfolio analysis
Forming an overview of the ISPF portfolio from RODA and Allocations data has been a considerable undertaking. The set-up of the Fund and portfolio means that reporting is complicated and multi-layered, and there are inconsistencies in interpretation and reporting across different POs. For instance:
- some POs have entered multiple ‘level C’ entries for the same programme (e.g. to differentiate between sub-parts of the programme, different partner countries, or different years or rounds), while the majority of POs report only one such entry per programme
- some information appears to have been entered at an early stage and has not been updated. This is clear in some of the summary descriptions, which are taken from proposals and include ‘areas to be determined at a later date’, but may also be true of other fields
- some POs report delivery costs as a separate entry or entries (sometimes for all activities combined, sometimes for ODA / non-ODA separately, and sometimes for some or all of the individual programmes). Some POs have not reported any separate delivery costs at all
- the total number of awards made is not always recorded against a programme or round, and where it is, this rarely tallies with the number of awards that are then detailed individually in the database
The database that the study team have created through this workstream, based on Allocations data, RODA data and PO input, is the first attempt to understand the ISPF portfolio as a whole. However, it is already ~10 months out of date, and revising it in future will require further effort, as well as additional input from POs (e.g. on the activity type of new programmes). There are also additional data fields that one might usefully include (with input from POs): for instance, the UKRI ISPF database consistently records, for each programme, information on match funding types and amounts, private investments, numbers of applications, ISPF sub-themes, and activity types. Similar data might usefully be collected for the remaining POs.
Further consideration should also be given to the different categories of PO and how their portfolio is best reported, such that there is consistency across the Fund. There are at least 2 broad groups:
- POs that act as funding intermediaries, mainly running calls and making awards to external individuals and organisations (this includes UKRI councils and IUK, plus the academies, learned societies and UUK); and
- POs that directly deliver R&I programmes / projects (the Public Sector Research Establishments, Catapults, and to a certain extent STFC as it is making investments in infrastructure alongside funding individual grants)
6. Effectiveness & Value for Money
6.1 Introduction and overall methodological approach
DSIT is committed to embedding evaluation. It aims to build a comprehensive evidence base to inform policy design, development, and implementation, ensuring that interventions deliver maximum impact, and that public funding is spent as effectively as possible. As such, the ISPF MEL Plan states that a comprehensive externally commissioned evaluation will be needed to provide evidence of the extent to which this Fund is achieving intended outcomes (effectiveness) and is delivering value for money in relation to the public investment.
We recommend using a mixed methods approach, grounded in a Theory of Change (ToC) for the Fund. The ToC has already been presented in Section 4, and this has then underpinned the selection of metrics and data sources. The evaluation would therefore provide an analysis of the progress and achievements made across the entirety of the ToC (from the delivery of early / immediate outputs to evidence of contributions towards wider and longer term impacts) as part of the overall effectiveness assessment.
Additionally, we also recommend using a holistic approach to Value for Money (VfM) that can be applied to arrive at judgements where traditional approaches (e.g. cost benefit analysis or return on investment analysis) are challenging. The approach uses evaluative reasoning and performance criteria to provide a transparent means to make robust VfM judgements from a wide range of qualitative and quantitative evidence. We suggest using a sample of 21 programmes to conduct the VfM assessment (1 in every 8 programmes in the portfolio), with evidence for this being collected via longitudinal case studies.
We recommend complementing this holistic approach with an assessment of the return on investment, based on monetisable / economic outcomes, with a focus on the parts of the portfolio for which it is possible (and relevant) to make such estimations (i.e. business-led collaborative R&I). We recommend conducting this assessment using a quasi-experimental design to take into account the counterfactual scenario and explore net effects.
Finally, we also recommend conducting a Qualitative Comparative Analysis, using the same dataset of 21 longitudinal case studies mentioned above, to explore the causal pathways that lead to the achievement of (long term) outcomes. This is not only a relevant Theory Based Evaluation (TBE) method in this context, but also one that allows for maximum use and analysis of data collected in the context of the VfM assessment.
Note that the approach set out in this document is intended to be iterative and to evolve as the evaluation progresses and more evidence becomes available. In particular, the framework should be revised and updated after the Interim Effectiveness evaluation. This could lead to an update of the ToC and performance metrics (e.g. to capture effects not originally foreseen), and / or a change to the sampling strategy for in-depth case studies.
Note also that the approach will rely on the support of ISPF partner organisations, who have an important role to play in providing accurate, timely and complete data and information, as well as other assistance and inputs throughout the different phases of evaluation.
Figure 21 - Overview of approach to ISPF impact assessment
The sub-sections that follow present the following:
- An overview of performance metrics proposed for effectiveness assessment (Section 6.2). The full list of metrics (with details on sources, baselines and benchmarks) is provided in Appendix D. The performance metrics cover all the elements of the Theory of Change.
- Details of the main evidence sources that will be used in implementing these methods and in populating the proposed performance metrics (Section 6.3). This includes details of relevant secondary data, as well as proposals for primary data collection activities.
- The main synthesis methods proposed for the evaluation (Section 6.4). This includes Longitudinal Case Studies, a Value for Money Assessment (based on a rubric approach), a Qualitative Comparative Analysis (QCA), and an assessment of Return on Investment. In each case, the relevant sub-section sets out a proposed approach and methodology for employing these techniques in the context of ISPF (further details of the VfM assessment rubric are also set out in Appendix E). A small number of performance metrics will not feed directly into these analysis / synthesis methods, but they will be analysed and reported on within the effectiveness evaluation.
- A recommended sampling strategy for the VfM, QCA and RoI approaches (Section 6.5).
Note that further consideration is then given to options around the scale, scope and regularity of future evaluation activities in Section 8.
6.2 Performance metrics (indicators & benchmarks)
6.2.1 Introduction and background
DSIT has already developed a suite of 23 Key Performance Indicators (KPIs) to measure the Fund’s performance and outcomes.[footnote 18] Evidence for most of these will be collected via one of 2 established monitoring systems: (i) the Annual Commission requests sent to ISPF POs for completion each year; or (ii) the quarterly data submissions by POs on spend and activities via the RODA system. A small number will be based on further DSIT analysis of information provided via the routes above, or (in one case) on evidence collected through evaluation.
The current list of 23 KPIs is briefly summarised in Table 2 (with further detail on each given in Appendix D.1). The current study was tasked with reviewing these KPIs and their alignment / relevance for addressing the newly finalised ISPF ToC.
Table 2 - Summary of existing ISPF KPIs
The study was also tasked with recommending additional indicators that should be included to better address the ToC. The evidence for these would need to be collected by future evaluators (rather than via existing DSIT / PO monitoring activities), and so the approach and source for doing so also needed to be established by the current study.
The ISPF MEL Plan defined a series of longer term outcomes for each of ISPF’s main objectives, and identified relevant indicators (from the KPIs above) that related to each outcome. It then began to establish a series of specific interim targets for these indicators and outcomes, with the intention that these would provide something for the Fund to work towards in the short term and help demonstrate progress towards long term objectives.
The objectives and intended outcomes of the Fund have evolved since this plan was written (as the Fund has evolved and as ambitions have been further defined through the ToC development process), as have some of the KPIs (which were still being agreed and finalised). As such, some elements of the original thinking on targets are no longer well aligned and relevant. The study team have also advised against the use of specific targets where these do not have a strong foundation in the original stated intentions of the Fund and do not have a good rationale and basis for being set at a certain level. Instead, to address the desire to compare progress and achievements against a standard or expectation, the study team have recommended considering where there are relevant benchmarks for the indicators now proposed (existing KPIs and additional indicators). This will allow for a less binary assessment of progress and achievement, as well as some consideration of contextual factors (e.g. differences between ISPF and the relevant benchmark) when making the analysis.
6.2.2 Proposed metrics
The full results of the performance metric assessment and development process described above are presented in Appendix D.3 (ISPF Performance Metrics). However, details for one element of the ToC have been extracted from this appendix into Table 3 below as an example, and this is used as a reference for the introductory text that then follows.
Table 3 - Example performance measurement assessment
Column | Name | Example |
---|---|---|
A | Type | Outcome |
B | Ref No. | OC8 |
C | ToC Element | Increased or sustained quality / competitiveness of R&I in ISPF themes (UK & ODA beneficiaries) |
D | Existing KPI? | Yes |
E | Existing KPI Detail | B9 Field-weighted citation impact (FWCI) (based on B7 List of publications) |
F | Recommended Indicators | As per existing KPI, plus: Citation impact - as measured by Average of Relative Citations (ARC) and HCP (Highly cited papers) - of ISPF publications (total, broken down by ISPF themes, gender and field/sector). Also bibliometric analysis of international co-publications, multi- and inter-disciplinary papers, and research novelty (based on unusual combinations of cited references). |
G | Sources | > DSIT analysis (KPI B9), based on Annual Commission (KPI B7) > Bibliometric data / analysis |
H | Included in VfM? | Yes |
I | Relevant VfM sub-dimension | 3.2.5 Attainment of outcomes: Strengthening SRTI quality |
J | Baseline | Citation impact (as described in the indicator) for ISPF researchers (before ISPF). |
K | Comments | Baseline assessment can be made at baseline evaluation stage, or retrospectively alongside the first interim impact assessment. |
L | Possible Benchmark | Comparison of ISPF quality (as measured by citation impact) with several counterfactuals: (1) the same researchers before ISPF funding. (2) the same researchers with non-ISPF funding (during ISPF period). (3) UK researchers collaborating with the same countries. [This benchmarking exercise is already proposed as part of VfM approach] |
Assessment / alignment of existing KPIs
- Columns A-C present all elements contained within the ISPF ToC (i.e. each input, activity, output and outcome that is presented in the ToC diagram). The example shown above concerns one of the outcomes of the Fund (increased R&I quality / competitiveness).
- Columns D-E identify any existing KPIs that are relevant to that part of the ToC, showing what is already being captured through RODA or the Annual Commission. In the example above, KPI B7 will collect a list of DOIs for publications emerging from ISPF, while B9 concerns additional analysis that will be undertaken by DSIT to determine the field-weighted citation impact in relation to these publications.
Note that not all existing KPIs map directly to a specific part of the ToC. The small number that do not align well are noted in Appendix D.4. These would still provide useful data for the evaluation, but are mainly contextual (or cross-cutting in the case of KPI B1).
Recommendations for additional indicators
- Column F presents additional indicators for parts of the ToC that are not already covered by KPIs. In most of the cases where there is an existing KPI in place, we also present an additional or extended indicator, which can more fully capture the relevant element of the ToC. This is the case for the example above, where it is proposed that 2 additional methods of assessing impact (ARC and HCP) are used, and that analysis is also undertaken for different sub-groups (by ISPF theme, by gender, etc.).
- Column G identifies the relevant source(s) of evidence for the indicator(s). In the example given, this includes the Annual Commission (which will collect DOIs), plus additional bibliometric data that will be obtained based on the identified ISPF publications.
Table 4 lists the different sources that are mentioned across all of the proposed metrics in the appendix and provides a summary of the number of indicators being addressed in each case. Note that the sources / methods are considered further in Sections 6.3 (evidence sources) and 6.5 (sampling strategy).
Table 4 - Summary of sources and indicators for performance metrics
Source | Input Indicators | Activity Indicators | Output Indicators | Outcome Indicators | Indicators Total |
---|---|---|---|---|---|
PO Consultation (info request) | 0 | 8 | 0 | 0 | 8 |
Programme lead consultation (written / interview) | 0 | 0 | 2 | 0 | 2 |
Survey of ISPF project participants (UK and international) | 0 | 0 | 9 | 8 | 17 |
Follow up interviews with ISPF project participants | 0 | 0 | 1 | 1 | 2 |
RODA (incl. Allocations data) | 4 | 8 | 0 | 1 | 13 |
Annual Commission data (including DSIT analysis of this data for 2x KPIs) | 2 | 14 | 3 | 17 | |
Bibliometric data / analysis & Overton | 0 | 0 | 2 | 2 | 4 |
VfM Case Studies (evidence from VfM assessment) [which draws on and triangulates multiple evidence sources] | 1 | 0 | 2 | 10 | 12 |
Any source | 4 | 8 | 21 | 22 | 55 |
Note that some indicators combine multiple sources, so columns cannot be summed to arrive at the total number of indicators.
- Columns H-I note where an indicator is also being used in the VfM rubric (introduced in Section 6.4.2). Where this is the case, the determination of indicators and relevant sources has sought to align with and make use of evidence collection that is already planned as part of VfM assessment (to reduce duplication of effort). In the example above, there is a VfM sub-dimension (3.2.5 Attainment of outcomes: Strengthening SRTI quality) that is relevant, and so the indicator proposed aligns with plans there.
Recommendations for baselining
- Columns J-K assess the relevance of establishing a baseline for each indicator (existing KPIs and additional indicators). For many of the indicators, the baseline position at the start of the Fund is “zero” and any outputs or outcomes delivered will be additional; therefore no initial measurement is required. Elsewhere, an approach is suggested for capturing the relevant baseline, along with any relevant comments on how / when this should take place. In the example above, the baseline would be the performance of ISPF researchers in the period immediately prior to ISPF, and could be assessed at the initial baseline evaluation stage, or retrospectively at the first interim effectiveness evaluation.
Assessment of possible benchmarks
- Column L provides an initial assessment of where and how results (evidence captured against indicators) might be compared with a relevant benchmark. A RAG rating is used to indicate availability of a suitable benchmark (with red indicating no suitable benchmark identified, amber indicating a possible benchmark requiring further exploration, and green indicating a feasible and suitable benchmark). In the example given above, 3 comparators are proposed (and rated green); this benchmarking exercise is already proposed as part of the VfM approach.
Note that in some cases where there is currently no suitable benchmark (rated red), the note in the column suggests that initial evidence / results from the indicator could however be used as the basis for an internal discussion as to whether there is a desire to establish more concrete ambitions for the future (which can then be monitored against). This is the case, for instance, in relation to indicators on the balance of the portfolio across different activity types.
6.3 Main evidence sources
As indicated in the previous section on metrics, the effectiveness evaluation (including the VfM assessment – which is explained further in Section 6.4.2) will draw on a number of different primary and secondary sources of evidence. Further details of each of the sources are provided below, with reference made to the indicators presented in the metrics table (Appendix D.3), as well as to the sub-dimensions presented in the VfM rubric (Appendix E).
6.3.1 Primary data collection / consultation
Partner Organisation (organisation-level) Consultation
Most basic information on the ISPF portfolio can be obtained by the evaluators from Allocations data and RODA (see secondary data below). However, this does not currently include a tagging of programmes by ‘activity type’, which is required for 8 of the activity indicators presented in the metrics table. For the current report (portfolio analysis), POs have tagged all programmes that were entered in RODA as of Q4 2023/24, but this process would need to be repeated in future to include any subsequent additions to the portfolio. This would involve sharing a template with each PO, with responses then aggregated in a single repository.
Programme Lead Interviews and Information Request
There are 22 sub-dimensions in the VfM rubric that require consultation with ISPF Programme Leads (i.e. the programme manager or equivalent for an ISPF programme, within the UK PO and in the equivalent overseas partners). The relevant sub-dimensions mainly relate to the achievement of outputs and outcomes (under the Effectiveness criteria).
Data sharing agreements are being finalised with each PO to allow the provision of relevant contacts to the evaluator/DSIT. Interviews would then be conducted online and individually. We recommend interviews are undertaken with at least the UK lead and one overseas lead for each of the programmes included within the VfM sample (see Section 6.5 on sampling). Depending on the programme in question, additional interviews with other organisations that are supporting implementation (perhaps across multiple countries) may also be relevant.
There are also 2 output indicators (relating to resources leveraged for ISPF programmes and examples of joint areas of interest / priority identified between funders through ISPF) which should also be addressed to programme leads. For those programmes included within the VfM exercise, this information can be sought as part of the same consultation exercise. For other programmes (outside the VfM sample), a written request / short survey of the relevant leads is recommended.
Online survey of ISPF Project participants (UK and International)
An online questionnaire survey will be directed at ISPF Project participants (i.e. award holders and direct beneficiaries of ISPF funding), including both UK and international participants. These individuals will be identified by the Partner Organisations, and their details will either be shared with the evaluator (where possible) or the participants will be approached directly by the Partner Organisation (where not). Data sharing agreements are being finalised with each PO to allow the provision of contacts. The survey should be directed to all participants where it is possible to access relevant contacts.
The survey will address 17 of the output and outcome indicators presented in the metrics table and 19 of the sub-dimensions presented in the VfM rubric (across all 4 criteria, but mainly focused on those relating to Effectiveness).
Follow-up interviews with ISPF Project participants (UK and International)
For 2 of the indicators presented in the metrics table (one output indicator, one outcome indicator), initial evidence will be collected through the participant survey (above), but we have also suggested that a small selection of examples could then be explored further through a follow-up interview. This relates to: (i) whether and how new R&I ideas (e.g. new research questions to be addressed) have been identified with ISPF funding and then taken forward (through further ISPF or alternative funding); and (ii) the extent to which ISPF participation has increased various aspects of a participant’s research capabilities (e.g. leadership skills). These interviews could be expanded to also explore other areas of particular interest in more depth, building on the initial survey responses and results.
6.3.2 Secondary data
ISPF Allocations and RODA data
ISPF Level B Allocations data (maintained by the DSIT PMO team) records the original allocations of ISPF ODA and non-ODA funding (separately) across the different ISPF Partner Organisations and their Programmes (or delivery costs) and across financial years. There is the flexibility for partner organisations to re-balance their allocations within their ODA or non-ODA portfolios over time (e.g. increasing or reducing the scale of particular programmes), and this is then reflected in future year allocations within this database (once the change has been notified to DSIT through the ISPF Change Management Process). It also includes information on ISPF Themes and Partner countries for each programme (both of which can then also be updated in this database as part of the ISPF Change Management Process).
ISPF POs are also required to make quarterly data submissions via the financial reporting system (RODA). The focus is information on past and forecast future expenditure; however, other information is also collected (e.g. a programme description, the number and value of awards made, or alignment with Sustainable Development Goals). ISPF reporting through RODA begins once programmes are launched (and so is already underway, but will expand over time).
For the portfolio analysis presented in this framework, these 2 sources of data have been combined. As an evidence source, they are also collectively referred to simply as “RODA” within the metrics table (where they support 15 indicators) and VfM rubric (supporting 2 sub-dimensions). For the evaluation, the latest available data will be obtained from DSIT and analysed to provide evidence against these indicators and sub-dimensions.
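To illustrate how these 2 sources might be combined in practice, a minimal sketch is given below, joining hypothetical programme-level extracts of the Allocations and RODA data. The file names and column names are placeholders rather than the actual export formats, which will differ.

```python
"""
Illustrative sketch only: combining Level B Allocations data and RODA quarterly
submissions into one programme-level dataset. File and column names are
hypothetical placeholders.
"""
import pandas as pd

# Hypothetical extracts: one row per programme in each file.
allocations = pd.read_csv("level_b_allocations.csv")  # programme_id, po, theme, partner_countries, oda_flag, allocation_gbp
roda = pd.read_csv("roda_q4_2023_24.csv")             # programme_id, spend_to_date_gbp, forecast_gbp, awards_made

# Join the two sources on a common programme identifier.
portfolio = allocations.merge(roda, on="programme_id", how="left")

# Restrict to the 'current' portfolio: programmes with spend already recorded.
current = portfolio[portfolio["spend_to_date_gbp"].fillna(0) > 0]

# Example summaries of the kind reported in Section 5:
print(current.groupby("oda_flag")["programme_id"].nunique())        # programme counts
print(current.groupby("oda_flag")["spend_to_date_gbp"].sum() / 1e6) # spend in £m
```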
ISPF Annual Commission data
Annual Commission requests are sent by DSIT to each ISPF PO for completion each year. The requests ask for a variety of data and information on the activities, outputs and outcomes that relate to their portfolio of ISPF programmes and projects (see Appendix D.1 for KPIs covered).
For the evaluation, the latest available Annual Commission data will be obtained from DSIT and analysed by the evaluation team to provide evidence against the relevant 15 indicators in the metrics table and 12 relevant sub-dimensions in the VfM rubric.
The first results on ISPF through the Annual Commission have become available in early 2025 (based on data for the Jan-Dec 2023 period, collected from POs at the end of 2024). Initial evidence will therefore be available for the baseline evaluation in 2025, although data is likely to be limited at this stage, as there was minimal ISPF programme activity during 2023.
Programme documentation
The 5 sub-dimensions under the Economy criteria of the VfM rubric all require complementing evidence from consultation activities (detailed above) with information contained within programme descriptions and call documentation. RODA data (above) includes some summary information on each programme (and some specific rounds / calls), but this can be expanded upon based on a review of publicly available information on relevant PO / programme websites, plus requests for additional information from POs / programme leads themselves (as part of the planned primary data collection activities).
Bibliometric data and Overton
There are 4 indicators and one VfM sub-dimension that will require bibliometric analysis, including citation analysis, on academic papers and on policy-related literature. This requires first identifying ISPF publications (from information collected via the Annual Commission), and then estimating the citations of these within other papers and in policy-related literature.
There are several data sources available for bibliometric analysis and citation analysis on academic papers, including Web of Science and Scopus (both proprietary data), and OpenAlex (an open source database). For citation analysis / uptake in policy-related literature the only available data source at the moment is Overton (also a proprietary data source).
Bibliometric analysis will also support benchmarking exercises to compare the quality of ISPF publications (as measured by citation impact) with several counterfactuals:
- the same UK researchers before ISPF funding
- the same UK researchers with non-ISPF funding (during ISPF period)
- UK researchers collaborating with the same countries. [This benchmarking exercise is already proposed as part of VfM approach].
The analysis could be expanded to focus on researchers from other countries, and this has been included as an optional exercise as described in Section 8.
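As an illustration of the kind of citation comparison described above, the sketch below retrieves raw citation counts for 2 sets of DOIs from the OpenAlex API and compares their means. The DOI lists are placeholders, and a full analysis would use field-normalised indicators (e.g. FWCI, ARC, HCP) rather than raw counts.

```python
"""
Illustrative sketch only: comparing mean citation counts of ISPF publications
against a comparator set (e.g. the same researchers' pre-ISPF papers), using
the public OpenAlex works endpoint. DOI lists are placeholders.
"""
import statistics
import requests

OPENALEX = "https://api.openalex.org/works/doi:{doi}"

def citation_counts(dois):
    """Fetch raw citation counts for a list of DOIs from OpenAlex."""
    counts = []
    for doi in dois:
        resp = requests.get(OPENALEX.format(doi=doi), timeout=30)
        if resp.status_code != 200:
            continue  # skip DOIs not indexed in OpenAlex
        counts.append(resp.json().get("cited_by_count", 0))
    return counts

def compare(ispf_dois, comparator_dois):
    ispf = citation_counts(ispf_dois)
    comp = citation_counts(comparator_dois)
    return {
        "ispf_mean": statistics.mean(ispf) if ispf else None,
        "comparator_mean": statistics.mean(comp) if comp else None,
        "n_ispf": len(ispf),
        "n_comparator": len(comp),
    }

# Hypothetical usage: the ISPF DOI list would come from the Annual Commission,
# the comparator list from a matched set of pre-ISPF publications.
# print(compare(["10.1000/exampleA"], ["10.1000/exampleB"]))
```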
ResearchFish
ResearchFish is a monitoring platform that tracks outputs and outcomes emerging from research projects. It is used by UKRI and other organisations. This data source has been suggested as a means to benchmark outputs and outcomes emerging from ISPF, to assess the extent to which they are in line with, below or above expectations. To improve this comparison, the benchmarking exercise would take into account the scale of the public investment (i.e. per £m invested). It is also possible to further improve comparability by selecting only those grants that included international collaborators (and even just ISPF partner countries) for the benchmark exercise.
Even though ResearchFish is not used across all POs, it is the only database available to be able to conduct this (quantitative) benchmarking exercise.
Longitudinal / VfM Case Studies
There are 12 output and outcome indicators in the metrics table that are based (at least partly) on the evidence and results being obtained through the VfM assessment process. The source is recorded as ‘VfM Case Studies’ to signify that the relevant metric will be captured for a sample of programmes that are included within the VfM assessment, and not all programmes. Each element of the VfM assessment itself will draw on and triangulate multiple sources of evidence, and these are already incorporated within the different evidence sources above.
Note that the longitudinal case studies will offer a synthesis method in their own right (in addition to providing the data needed for the VfM and QCA). They will draw on multiple evidence and data sources as listed above. Additional interviews might be considered (e.g. with government officials or SIN representatives, as appropriate) to further enhance the narrative evidence presented within individual cases. The presentation of the longitudinal case studies is further discussed in the section below.
6.4 Synthesis Methods
This section sets out the main synthesis methods proposed for the effectiveness evaluation, which include:
- contribution analysis
- a Value for Money Assessment (based on a rubric approach)
- a Qualitative Comparative Analysis (QCA)
- an assessment of Return on Investment.
In each case, the relevant sub-section sets out a proposed approach and method for employing these techniques in the context of ISPF. For completeness, we note first that the evaluation begins with an overall analysis and presentation of the evidence collected across all dimensions of the ToC.
6.4.1 Contribution analysis and overall assessment of effectiveness
We suggest following an adapted Contribution Analysis (CA) approach to the overall assessment of effectiveness, building upon the ToC of the Fund. This will entail testing and questioning the ToC by examining the evidence collected through multiple data sources. This approach is already embedded in the performance indicators in so far as these metrics have been set up to capture the contribution made by the Fund (with suggested benchmarks and comparisons, when possible, to further aid this analysis).
We suggest presenting this analysis against the main output and outcome categories of the ToC: Partnerships, Research, Innovation, Knowledge & Skills, and Influence & Reputation; and reporting on the qualitative and quantitative evidence emerging from different indicators and data collection tools. Other options / categories can be considered for structuring the presentation of results, but the important point is to make sure that the evaluation first presents an analysis across all the (ToC) evidence, before presenting the results from the synthesis methods (VfM, QCA, RoI) which will be more summative.
In addition to this high-level CA assessment, we also suggest developing a more in-depth Contribution Analysis using the Longitudinal / VfM Case Studies. In addition to providing evidence and results for the VfM assessment, these case studies will provide evidence to inform the overall assessment of effectiveness. As well as covering all (relevant) elements of the ToC, they will focus on collecting holistic evidence on:
- the mechanisms that lead to the achievement of outputs and outcomes, and a qualitative assessment of the contribution of the Fund
- the contextual factors that have enabled or hindered the achievement of outputs and outcomes
- unexpected outputs and outcomes not foreseen in the initial ToC
- lessons learned from the design and implementation
The full case studies could be presented as annexes, with specific evidence then mobilised within the corresponding sections of the effectiveness analysis (e.g. in the form of quotes, vignettes, and evidence boxes as needed).
Consideration of alternative approaches
We do not recommend conducting a full Realist Evaluation (RE) approach (preparing and testing a series of specific Context-Mechanism-Outcome configurations, CMOs), given the complexity of ISPF, the variety of contexts in which it is implemented, and the wide set of expected outputs and outcomes covered in the ToC. A RE approach would require pre-set, pre-conceived CMOs, which could be too restrictive for the ISPF context, and would require a disproportionate level of resources to implement. CMOs would also need to be developed for each programme, making it challenging to ‘aggregate’ results at the Fund level.
We have also discarded using Process Tracing, which focuses on applying specific types of tests to assess the strength of evidence (e.g. straw in the wind, smoking gun), and have instead advised a more straightforward framework for assessing the strength of evidence as part of the VfM assessment. This recommendation, again, stems from the wide set of expected outputs and outcomes covered in the ToC and the need to maintain the principle of proportionality.
6.4.2 Value for Money
6.4.2.1 Overall approach
Traditional VfM assessment methods (e.g. Cost-Benefit Analysis) tend to have a narrow definition of value and struggle when being applied to Science, Research, Technology and Innovation (SRTI) initiatives. For ISPF, challenges include: objectives, contexts and benefits that are highly diverse across the portfolio; benefits that will only be realised over long time periods; benefits (such as new knowledge, attitude change and capacity building) that can be intangible and difficult to quantify or monetise; and equity being an important consideration.
DSIT has been working on an approach to assessing VfM since 2019, initially in relation to the Newton Fund, and then for the GCRF evaluation. A ‘rubric-based’ assessment is used, adapted from a method developed by Oxford Policy Management (the main features of which are summarised below). This provides a more holistic approach to VfM, exploring dimensions tailored to capturing key aspects of value delivered by a Fund, and allowing for different types of investment to go through the same process in a transparent and fair manner. Importantly, it provides a robust framework that assesses value beyond just monetary considerations.
Overview of the rubric-based VfM Approach
The rubric-based VfM approach was originally developed for an international development context by Oxford Policy Management and adapted for GCRF and Newton evaluations.
It is based around defining a series of value for money performance dimensions and sub-dimensions which seek to encapsulate the main ‘value’ propositions of an initiative.
These dimensions are identified and organised under the “4 E’s” of Economy, Efficiency, Effectiveness and Equity, which are recommended as Value Criteria by the Foreign, Commonwealth & Development Office (FCDO) and National Audit Office (NAO). These have been defined (in an international development context)[footnote 19] as follows:
- economy – are we buying appropriate quality at the right price?
- efficiency – how well are we converting activities into outputs?
- effectiveness – how well are the outputs produced having the intended effect?
- equity – how fairly are the benefits distributed?
As shown in Figure 22 below, these 4 E’s (and the headline questions they seek to address) can be mapped against the structure of a Theory of Change. Evidence for different dimensions will therefore emerge over time. For instance, evidence on economy dimensions would be available at the start of an intervention / evaluation, while evidence on efficiency will start to emerge after the first years, and on effectiveness a little later. Evidence on equity will also evolve and increase over time (with analysis initially focusing on equity of design and implementation, then resource distribution, and then involvement in and access to benefits). This allows for an early analysis of VfM, as well as further, richer analysis as the years pass.
Figure 22 - The 4E’s mapped to the ToC structure
Source: Technopolis
For each sub-dimension, performance standards are then defined (i.e. explicit definitions of what the evidence would look like at different levels of performance - poor, adequate, good and excellent). A number is attached to each level (e.g. poor = 1, adequate = 2, and so on), such that a numeric rating can be given (with an accompanying narrative provided on the rationale for the rating, as well as a comment on the strength of evidence).
Figure 23 - Example Rubric and Assessment Template
The assessment itself is undertaken at the sub-Fund level (i.e. applying each part of the rubric to an individual programme or project, and then repeating across a sample). The scores on a particular dimension can then be aggregated across that sample. Therefore, whilst assessment and scoring take place at a sub-Fund level, the resulting analysis is at the Fund level.
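A minimal sketch of this scoring and aggregation step is given below, assuming the poor-to-excellent numeric scale described above. The programme names, sub-dimension references and ratings shown are illustrative placeholders only.

```python
"""
Illustrative sketch only: recording rubric ratings per programme and
aggregating them to Fund level, using the poor=1 ... excellent=4 convention
described above. All names and ratings are placeholders.
"""
from statistics import mean

RATING_SCALE = {"poor": 1, "adequate": 2, "good": 3, "excellent": 4}

# Each sampled programme receives one rating (plus a narrative, omitted here)
# per VfM sub-dimension.
assessments = {
    "Programme A": {"1.1.1": "good", "1.2.1": "adequate", "3.2.2": "excellent"},
    "Programme B": {"1.1.1": "adequate", "1.2.1": "good", "3.2.2": "good"},
    "Programme C": {"1.1.1": "excellent", "1.2.1": "poor", "3.2.2": "good"},
}

def fund_level_scores(assessments):
    """Average the numeric ratings per sub-dimension across sampled programmes."""
    by_dimension = {}
    for ratings in assessments.values():
        for dim, rating in ratings.items():
            by_dimension.setdefault(dim, []).append(RATING_SCALE[rating])
    return {dim: round(mean(scores), 2) for dim, scores in by_dimension.items()}

print(fund_level_scores(assessments))
# e.g. {'1.1.1': 3.0, '1.2.1': 2.0, '3.2.2': 3.33}
```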
The VfM assessment process itself is based on reasoned evaluative judgements, drawing on an array of qualitative and quantitative evidence. It is therefore important that the criteria and standards are clear, specific and transparent, and that evidence will be collected and available to support this assessment. The development of the ISPF Rubric was therefore tied closely to the work of the current study on performance measurement (looking at centrally collected KPIs, plus possible additional metrics to be addressed through evaluation fieldwork). It is important to establish these plans for data collection before launching into the evaluation, such that the necessary evidence will be in place for VfM assessment.
Figure 24 - Summary of the Rubric-based VfM approach
DSIT were keen to achieve continuity and build upon the past (and ongoing) Newton/GCRF work in the approach used for assessing VfM in relation to ISPF. However, the detailed dimensions and performance measures required adaptation and development for this new context (e.g. given differing objectives of the Funds and the blended ODA / non-ODA scope of ISPF). Workstream 4 therefore focused on the development of a bespoke VfM rubric for ISPF, including value criteria, performance standards, and a sampling approach to the assessment.
The rubric, as prepared in the context of the ISPF evaluation framework, will provide a basis for future stages of monitoring, evaluation and learning. It is primarily intended for internal audiences (for the evaluation team, the Fund team and Partner Organisations who will be central to data collection). However, the results and analysis developed in later phases, based on this rubric, will then be relevant to and of interest to a wider range of audiences. Robust VfM assessment is partly about accountability to funders (demonstrating that ISPF is maximising impact given resources) and is essential when making the case for additional funding. It can also help to drive improvements to future programme design and decision making through evidence-led adaptive management and dynamic portfolio management.
6.4.2.2 Key features of the ISPF VfM Rubric
Appendix E provides the full VfM Rubric for ISPF. There are important considerations that have informed the development of this rubric, as set out below.
A hybrid approach with programmes as the starting unit of analysis is more appropriate for ISPF. The VfM approach employed for the Newton and GCRF evaluations focused on awards as the unit of analysis. There are merits, but also drawbacks, to this - at least for the ISPF evaluation:
- Our initial analysis of the ISPF portfolio suggests that it is delivered through a combination of programmes, with and without calls for awards. With an award-based approach to VfM assessment, either an entire programme of activities would be treated as an ‘award’ or programmes without awards would be excluded entirely.
- An award focus may also create difficulties in assessing the types of initiative supported via STFC, NPL or ESC, which often involve providing individual researchers and innovators/businesses with access to support and facilities (meaning that each individual ‘project/award’ is too granular to be able to understand their activities).
- An important feature of some ISPF programmes is that they include a diversity of types of awards or activities (e.g. collaborative R&D, networking, training) and looking at an award in isolation would miss the potential synergies at programme level.
- Some relevant aspects of alignment and results do not materialise (and can therefore not be assessed) at award level. This is especially the case for the elements of the ToC that relate to strengthening partnerships at the institutional level, maintaining alignment with high level strategic objectives, or strengthening capacity at national or institutional level.
- A programme level approach should mean that a larger sample of the Fund’s portfolio can be assessed with the same resource (e.g. similar resources may need to be invested to analyse 50 awards, as are required to analyse 20 programmes containing, on average, 50-100 awards each). However, it is also important to note that some award-level evidence collection will still take place, though this will be analysed at the aggregate (i.e. programme) level. For instance, most Fund outputs (covered in the “Effectiveness” module of the Rubric) are expected to emerge at project level.
Given all of the above, we suggest implementing a hybrid approach, similar to that utilised in the Evaluation of the Fund for International Collaboration. This looked at 2 tiers of analysis: Tier 1 (funder / programme level) and Tier 2 (researcher and innovator / project level), and uncovered results materialising at each level, covering different aspects of the Fund’s ToC.
In drafting the ISPF VfM Rubric, we have therefore used a hybrid approach, whereby we use programmes as the main unit of analysis for most of the dimensions, and projects/awards (within those programmes) as a focus of analysis when relevant (e.g. for collecting information on outputs). We have also made a distinction between programmes with no calls/awards and programmes with awards when relevant. We also note that the rubric needs to apply across the entirety of the ISPF portfolio, including both ODA and non-ODA elements, and consultation and validation processes with the different POs have helped to ensure that this is the case.
The sampling strategy for the VfM approach is presented as part of Section 6.5.
The final analysis of “Value” is not performed at the programme level but at the Fund level. The VfM approach suggested here requires the evaluators to make (evidenced) value judgements for each individual unit of analysis, and as such, the initial assessment is done at programme level. However, the final analysis is intended to be presented in an aggregate form. This is important to stress, since the methodology will not be used to judge individual programmes. We do not expect each individual programme to score highly across all the VfM dimensions or sub-dimensions, but rather seek a view of the performance of the overall ISPF portfolio. Understanding these intentions helps in circumventing potential (valid) criticism that a particular programme did not intend to align with all the aspects covered in the ToC.
VfM is an element of the effectiveness / impact evaluation but does not cover all of its elements. It is important to note that the VfM rubric sits within a wider framework for evaluating the impact of the Fund (as shown in Figure 21 above). The dimension “Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: > Strengthening R&I capabilities (UK & ODA), at all levels”, is a case in point. The current Rubric provides a performance standard for ODA programmes, where we do expect to see dedicated activities in place to support this objective. The relevance to the UK could then be explored in a more qualitative way as part of the effectiveness / impact evaluation. Note that the UK and ODA countries’ perspectives are then captured when looking at the dimensions related to actual outcomes.
There is a need to maintain a balance between codifying assessment criteria and allowing evaluators to exercise judgement. The rubric needs to be detailed enough for the resulting analysis to provide useful and informative results. However, it should not be so granular and complicated that those implementing it or reading the results struggle to understand it. It should also not be so closely prescribed that it cannot be applied across the diversity of the portfolio.
Additionally, we expect evaluators to find various contextual, programme-specific issues which would mean adapting the assessment of a performance standard. To be more specific, we have in some cases used the phrase “Evidence suggests” and have listed relevant sources of information to allow for that degree of freedom, avoiding being overly prescriptive about what exact evidence this should be. Note that evaluators would need to justify their assessment, which will allow for further scrutiny and feedback on the approach taken in a particular case.
Table 5 lists the different evidence sources that are mentioned across the rubric and provides a summary of the number of sub-dimensions being addressed by each. Note that the sources for the VfM assessment were incorporated within Section 6.3 above (which provided further details on evidence sources) and are also considered further in Section 6.5 (sampling strategy).
Note that VfM analysis will be prepared and presented first as longitudinal case studies (one per programme in the sample), with each case covering all sub-dimensions of the analysis and drawing on multiple evidence sources. Each sub-section within a case study will then conclude with the scoring of the sub-dimension, which will then inform the VfM rubric assessment.
Table 5 - Summary of sources and sub-dimensions
Source | Economy sub-dimensions | Efficiency sub-dimensions | Effectiveness sub-dimensions | Equity sub-dimensions | Sub-dimension Total |
---|---|---|---|---|---|
Interviews with ISPF Programme Leads from UK and international funders / delivery organisations | 5 | 3 | 10 | 4 | 22 |
Survey with UK and international project participants | 1 | 3 | 11 | 4 | 19 |
RODA (incl. Allocations data) | 1 | 1 | 0 | 0 | 2 |
Annual Commission data | 1 | 1 | 9 | 1 | 12 |
Programme descriptions and call documentation | 5 | 0 | 0 | 4 | 5 |
Overton (references in policy documents) | 0 | 0 | 1 | 0 | 1 |
Any source | 5 | 3 | 13 | 4 | 25 |
Note that most sub-dimensions combine multiple sources, so columns cannot be summed to arrive at the total number of sub-dimensions shown in the final row.
6.4.3 Qualitative Comparative Analysis
6.4.3.1 Overall approach
To further assess the contribution of ISPF to the achievement of outcomes we suggest implementing a Qualitative Comparative Analysis (QCA). Different procedures within QCA are used to answer 3 related evaluation questions:
- what causal factors are needed for the outcome to occur?
- what causal factors are most effective (alone or in combination) for the outcome?
- what causal factors make the difference for the outcome, under what circumstances?
Overview of the QCA Approach
As stated in the HMT Magenta Book (Annex A[footnote 20]), QCA is a pragmatic method to compare different aspects of an intervention and contextual factors to understand the different characteristics or combinations of characteristics which are associated with outcomes. It enables systematic comparison based on qualitative knowledge. Rather than examining the factors causing a specific outcome in depth, as in a single case study, QCA focuses on identifying a variety of patterns. This allows for both complex causation (combinations of factors) and ‘equifinality’ (multiple causes of an outcome) to be accounted for.
Another characteristic of QCA is that it allows for the exploration of the effects of multiple conditions that could be correlated. This is particularly important for R&I studies where there is expected to be a high degree of interaction (correlation) among multiple factors. This is an advantage of QCA over regression analysis where correlation among explanatory variables brings a problem of ‘multicollinearity’. In QCA multicollinearity is called limited diversity, a feature of many naturally occurring phenomena.[footnote 21]
Data for a QCA are collated in a matrix form (sometimes called a ‘QCA matrix’ or a ‘truth table’), where rows represent cases, columns represent conditions, and the rightmost column indicates the presence or absence of the outcome for each case. For each case, the presence or absence of a condition or outcome is recorded numerically.
There are several options for coding results: a binary approach, which records the presence or absence of a condition (0 or 1); fuzzy coding, which denotes the partial presence or absence of a condition, using values between 0 and 1 to reflect the degree to which the condition is met (e.g. 0.25, 0.75); and multi-value coding, used where a condition can be present in more than one way (e.g. type of awards), represented by values of 0, 1, 2 or more.
The table below illustrates how several cases can be codified in the QCA matrix for “Outcome 1”.
Case | Condition 1 | Condition 2 | … | Condition N | Outcome 1 |
---|---|---|---|---|---|
1 | 1 | 0.25 | … | 0 | 1 |
2 | 1 | 0.25 | … | 2 | 1 |
… | … | … | … | … | … |
N | 0 | 0.75 | … | 1 | 0 |
The coding and analysis of the QCA matrix should allow evaluators to conclude, for instance, that “a combination of Condition 1 and Condition N leads to Outcome 1”.
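By way of illustration, the sketch below shows the core fuzzy-set calculation (consistency and coverage of a candidate combination of conditions) applied to a toy truth table like the one above. The cases, values and chosen combination are invented; in practice a dedicated QCA package (e.g. the R ‘QCA’ package or a Python equivalent) would be used.

```python
"""
Illustrative sketch only: Ragin-style consistency and coverage for one
candidate combination ('recipe') of conditions, using a toy truth-table.
"""

# Rows = cases (programmes); values = fuzzy/binary membership scores.
truth_table = {
    "Case 1": {"cond1": 1.0, "cond2": 0.25, "condN": 0.0, "outcome1": 1.0},
    "Case 2": {"cond1": 1.0, "cond2": 0.25, "condN": 1.0, "outcome1": 1.0},
    "Case 3": {"cond1": 0.0, "cond2": 0.75, "condN": 1.0, "outcome1": 0.0},
}

def consistency_and_coverage(table, recipe, outcome):
    """Membership in a combination (recipe) is the minimum of its conditions."""
    combo = [min(row[c] for c in recipe) for row in table.values()]
    out = [row[outcome] for row in table.values()]
    overlap = sum(min(x, y) for x, y in zip(combo, out))
    consistency = overlap / sum(combo) if sum(combo) else None
    coverage = overlap / sum(out) if sum(out) else None
    return consistency, coverage

cons, cov = consistency_and_coverage(truth_table, ["cond1", "condN"], "outcome1")
print(f"consistency={cons:.2f}, coverage={cov:.2f}")
# High consistency would support a claim such as "Condition 1 AND Condition N
# is (quasi-)sufficient for Outcome 1" across the sampled cases.
```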
There are 3 main reasons (advantages) for QCA being selected as an approach:
- it is a useful mechanism for exploring causality.
- it can be implemented using the information captured via longitudinal case studies and the VfM assessment, without the need for additional data / indicators or further complexity within the evaluation exercise (see Section 6.5 on Sampling).
- it adds value to and reinforces the VfM approach, by including an approach to causality.
There are 3 main limitations to this approach:
- This method simplifies complexity (of the intervention) as it assigns a numeric value to the conditions that could lead to results (and the achievement of results). To mitigate this limitation, the analysis should only be presented as a synthesis, accompanied by the wider analysis of effectiveness as suggested in Section 6.4.1 (which will summarise the evidence from different sources, including case studies, to provide more context and nuance to results, as well as the ability to document unexpected outcomes).
- Some outcomes cannot be represented as a numeric result and are more complex to capture and represent (e.g. increased reputation); these have been excluded from the QCA. Again, this limitation is mitigated by the wider analysis of effectiveness as suggested in Section 6.4.1.
- It requires completeness in data collection, meaning that if one indicator is not available for one case study, that case study will ‘drop’ from the sample. To mitigate this limitation evaluators are invited to consider excluding conditions for which there is not enough data / observations (to retain a high number of cases in the analysis).
6.4.3.2 A QCA for ISPF
To develop the QCA approach (and matrices) for ISPF we have drawn from:
- the ToC and pathways to impact (as described in Section 4.3 and Appendix B) which showcase how different outputs interact to deliver outcomes
- the performance indicator matrix (as described in Section 6.2 and Appendix D)
- the Value for Money Rubric (as described in Section 6.4.2 and presented in Appendix E)
In terms of sampling, we suggest using the same sample for the QCA as for the VfM assessment, and this is further explained in Section 6.5.
For each case, we suggest including 2 sets of conditions in the truth table:
- characteristics of the programmes, including whether they are ODA/ non-ODA, their budget, duration, state of implementation, type of implementation (with or without awards) and type of activity (see Table 6).
- outcome specific conditions, including outputs being achieved, and assumptions being met (in line with the ToC and pathways to impact) (see Table 7 to Table 11). The outcomes included have been selected because they can be more easily quantified / qualified as being achieved or not. Progress and achievements in relation to other outcomes will still be assessed as part of the wider effectiveness analysis.
In each of the 6 tables that follow we present a short description of the condition, the source of the information and the recommended numerical coding (binary, fuzzy or multi-value). It is important to note that the suggested values for the ‘fuzzy’ coding are indicative, and that we strongly recommend conducting sensitivity analysis to test the extent to which changing those values affects the final results and conclusions.
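A minimal sketch of the recommended sensitivity analysis is shown below: the same consistency calculation is re-run under an alternative set of fuzzy values for the rubric levels. The alternative thresholds and toy data are illustrative only.

```python
"""
Illustrative sketch only: checking whether QCA conclusions are sensitive to the
fuzzy values assigned to rubric levels.
"""

codings = {
    "baseline":    {"Poor": 0.0, "Acceptable": 0.25, "Good": 0.75, "Excellent": 1.0},
    "alternative": {"Poor": 0.0, "Acceptable": 0.33, "Good": 0.67, "Excellent": 1.0},
}

# Toy data: rubric level for one condition, and outcome membership, per case.
cases = [("Good", 1.0), ("Acceptable", 1.0), ("Excellent", 0.0), ("Good", 1.0)]

for name, mapping in codings.items():
    condition = [mapping[level] for level, _ in cases]
    outcome = [y for _, y in cases]
    consistency = sum(min(x, y) for x, y in zip(condition, outcome)) / sum(condition)
    print(f"{name}: consistency = {consistency:.2f}")
# If conclusions change materially between codings, the fuzzy thresholds
# (and the resulting QCA findings) should be revisited.
```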
Table 6 - QCA Truth table – all outcomes
Condition | Description | Source | Coding |
---|---|---|---|
Condition A (Characteristic) | ODA/ non-ODA | Annual Commission | Binary (Yes=1 / No=0) |
Condition B (Characteristic) | Budget | RODA / Allocations Data | Fuzzy (Quartiles = 0.25, 0.5, 0.75, 1.0, based on the distribution of budget across the sample) |
Condition C (Characteristic) | Duration | RODA (plus PO consultation if necessary) | Fuzzy (Quartiles = 0.25, 0.5, 0.75, 1.0, based on the distribution of duration of programmes across the sample) |
Condition D (Characteristic) | Stage of implementation | RODA (plus PO consultation if necessary) | Binary (Finished=1 / Ongoing=0) |
Condition E (Characteristic) | Type of implementation | RODA | Binary (With awards=1 / without awards=0) |
Condition F (Characteristic) | Type of award | Evaluation | Multi-value (e.g. Collaborative R&D=1) |
Table 7 - QCA matrix – Attainment of outcomes: Developing international R&I partnerships
Condition | Description | Source | Coding |
---|---|---|---|
Outcome | Strengthened equitable partnerships that continue over time (including via established ways of working) | VfM (Dimension 3.2.2 in VfM framework) | Binary (Yes=1 / No=0), where ‘Yes’ = ‘Excellent’ & ‘Good’ from Rubric (and ‘No’ otherwise) |
Condition A-F | See Table 6 | See Table 6 | See Table 6 |
Condition 1 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objective: Developing long-term strategic international R&I partnerships, at all levels | VfM (Dimension 1.1.1 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 2 (Co-funding) | Co-funding / contributions in kind for ISPF activities | VfM (Dimension 1.2.1 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 3 (Output achieved) | New & strengthened partnerships within / across sectors (academia, industry, third sector, policy, funders) | VfM (Dimension 3.1.3 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 4 (Output achieved) | New MoUs / Agreements established | Annual Commission (B5 Number of new partnership agreements (e.g. MoUs), plus detail on agreement level [B/C/D], reference number, gov-to-gov or international partner org, relevant Fund and calendar year signed) (O5 in indicator matrix) | Binary (Yes=1 / No=0) |
Note: The assessment of MoUs is included in the Rubric above, but this indicator isolates the effect of having MoUs or not.
Table 8 - QCA matrix – Attainment of outcomes: Delivering solutions to shared challenges
Condition | Description | Source | Coding |
---|---|---|---|
Outcome | Increased or improved ability to tackle global & socioeconomic challenges via use / uptake / application of solutions developed through ISPF | VfM (Dimension 3.2.3 in VfM framework) | Binary (Yes=1 / No=0), where 'Yes' = 'Excellent' or 'Good' from the Rubric (and 'No' otherwise) |
Condition A-F | See Table 6 | See Table 6 | See Table 6 |
Condition 1 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: Delivering solutions that contribute towards addressing specific shared challenges (that fall within at least one of the ISPF Themes) | VfM (Dimension 1.1.2 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 2 (Co-funding) | Co-funding / contributions in kind for ISPF activities | VfM (Dimension 1.2.1 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 3 (Output achieved) | Paving the way for the uptake / application of research outputs | VfM (Dimension 3.1.4 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 4 (Output achieved) | Paving the way for the uptake / application of innovation outputs | VfM (Dimension 3.1.5 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 5 (Assumption met) | ISPF-funded research tackles global and socioeconomic challenges and this is widely disseminated among (and accessible to) relevant end-users | Annual Commission (B6 Number of events / workshops / symposiums attended, hosted and presented at (+ published case studies)) (O8 in indicator matrix) | Binary (Yes=1 / No=0), where 'Yes' = activities taking place |
Condition 6 (Assumption met) | ISPF-funded research tackles global and socioeconomic challenges and this is widely disseminated among (and accessible to) relevant end-users | Annual Commission (B4 Number of instances of policy engagement or policy influence (+ description, link to case studies, details of partnership country) [in relation to the engagement part]) (O8 in indicator matrix) | Binary (Yes=1 / No=0), where 'Yes' = activities taking place |
Note: Indicators on the number / volume of R&I outputs are not included as conditions, since there is no reason to believe that the volume of outputs is itself a condition for attaining the outcome; what matters is that those outputs are used and / or taken up.
Table 9 - QCA matrix – Attainment of outcomes: Strengthening R&I capabilities
Condition | Description | Source | Coding |
---|---|---|---|
Outcome | Increased research capabilities, incl. leadership (UK & ODA beneficiaries) | VfM (Dimension 3.2.4 in VfM framework) | Binary (Yes=1 / No=0), where 'Yes' = 'Excellent' or 'Good' from the Rubric (and 'No' otherwise) |
Condition A-F | See Table 6 | See Table 6 | See Table 6 |
Condition 1 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: Strengthening R&I capabilities (UK & ODA), at all levels | VfM (Dimension 1.1.3 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 2 (Output achieved) | New & strengthened partnerships within / across sectors (academia, industry, third sector, policy, funders) | VfM (Dimension 3.1.3 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 3 (Output achieved) | New/improved understanding of user needs, research methods, EDI, Responsible R&I, research management, international collaborative research | > Survey with ISPF project participants (UK and international) (O14 in indicator matrix) | Binary (Yes=1 / No=0), where 'Yes' = 50% of respondents agreeing with statement |
Condition 4 (Spillovers) | Enhancement of the skills and capabilities of individuals or institutions to undertake R&I more effectively and efficiently in future | VfM (Dimension 2.2.1 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Table 10 - QCA matrix – Attainment of outcomes: Strengthening SRTI quality
Condition | Description | Source | Coding |
---|---|---|---|
Outcome | Increased or sustained quality / competitiveness R&I in ISPF themes (UK & ODA beneficiaries) | VfM (Dimension 3.2.5 in VfM framework) | Binary (Yes=1 / No=0), where 'Yes' = 'Excellent' or 'Good' from the Rubric (and 'No' otherwise) |
Condition A-F | See Table 6 | See Table 6 | See Table 6 |
Condition 1 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: Strengthening SRTI quality through international collaboration (UK & ODA) | VfM (Dimension 1.1.4 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Note that the VfM rubric already includes the attainment of outputs in its assessment so in this case the analysis will be done for 7 conditions (6 on characteristics and 1 on Relevance, which assesses the extent to which the programme clearly demonstrates that the partnership will deliver research of high quality).
Table 11 - QCA matrix – Attainment of outcomes: Shaping / influencing wider SRTI ecosystems
Condition | Description | Source | Coding |
---|---|---|---|
Outcome | Shaping / influencing wider SRTI ecosystems | VfM (Dimension 3.2.6 in VfM framework) | Binary (Yes=1 / No=0), where 'Yes' = 'Excellent' or 'Good' from the Rubric (and 'No' otherwise) |
Condition A-F | See Table 6 | See Table 6 | See Table 6 |
Condition 1 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: Developing long-term strategic international R&I partnerships, at all levels | VfM (Dimension 1.1.1 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
Condition 2 (Relevance) | Relevance / alignment of the activity (its scope, focus & intentions) with key ISPF objectives: Delivering solutions that contribute towards addressing specific shared challenges (that fall within at least one of the ISPF Themes) | VfM (Dimension 1.1.2 in VfM framework) | Fuzzy (Rubric: Poor=0, Acceptable=0.25, Good=0.75, Excellent=1) |
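To illustrate how the coding rules in the tables above could be operationalised, the sketch below (Python, using pandas) codes a small, hypothetical set of case studies into truth-table form, including fuzzy quartile coding for budget, crisp (binary) coding of the outcome from the rubric rating, and a simple sensitivity re-run with alternative fuzzy values. All case data, column names and the alternative fuzzy values are illustrative assumptions; the actual QCA would be run on the full set of conditions, typically using dedicated QCA software or packages.

```python
import pandas as pd

# Hypothetical case-study records (all values illustrative).
cases = pd.DataFrame({
    "programme": ["P1", "P2", "P3", "P4"],
    "oda": [1, 1, 0, 0],                                              # Condition A
    "budget_gbp": [800_000, 2_500_000, 450_000, 5_000_000],           # Condition B
    "relevance_rubric": ["Good", "Excellent", "Acceptable", "Poor"],  # e.g. Condition 1
    "outcome_rubric": ["Excellent", "Good", "Acceptable", "Poor"],    # outcome rating
})

# Condition B: fuzzy quartile coding based on the distribution across the sample.
cases["budget_fuzzy"] = pd.qcut(cases["budget_gbp"], 4, labels=[0.25, 0.5, 0.75, 1.0]).astype(float)

# Indicative fuzzy values for rubric-based conditions (as suggested above),
# plus an alternative set used to test the sensitivity of the coding.
FUZZY_BASE = {"Poor": 0.0, "Acceptable": 0.25, "Good": 0.75, "Excellent": 1.0}
FUZZY_ALT = {"Poor": 0.0, "Acceptable": 0.33, "Good": 0.67, "Excellent": 1.0}

def code_truth_table(df: pd.DataFrame, fuzzy_map: dict) -> pd.DataFrame:
    """Translate case records into a truth-table-style frame."""
    out = df.copy()
    out["relevance_fuzzy"] = out["relevance_rubric"].map(fuzzy_map)
    # Outcome coded binary: 'Yes' (1) where the rubric rating is Good or Excellent.
    out["outcome"] = out["outcome_rubric"].isin(["Good", "Excellent"]).astype(int)
    return out[["programme", "oda", "budget_fuzzy", "relevance_fuzzy", "outcome"]]

base = code_truth_table(cases, FUZZY_BASE)
alt = code_truth_table(cases, FUZZY_ALT)

# Simple sensitivity check: does any case cross the 0.5 membership point
# when the alternative fuzzy values are used?
crossed = (base["relevance_fuzzy"] > 0.5) != (alt["relevance_fuzzy"] > 0.5)
print(base)
print(f"Cases changing crisp membership under alternative coding: {int(crossed.sum())}")
```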
6.4.4 Return on investment
6.4.4.1 Overview
As explained in sub-section 6.4.2 (and visualised in Figure 21), we recommend implementing a holistic approach to VfM assessment for ISPF. This is because more traditional methods focused on Return on Investment (which require the monetisation of outcomes) would not fully capture the myriad benefits expected to emerge from ISPF. That said, we also recommend conducting a Return on Investment exercise that focuses on some of the economic outcomes from ISPF, specifically the potential benefits for business performance (due to increased innovation), with a focus on UK participants.
6.4.4.2 Approach
A return on investment assessment requires estimating costs and benefits.
Costs - In accordance with the HMT Green Book, the approach should include all costs, to all stakeholders involved, including:
- direct public funding: This will include the value of the ISPF grant
- public funding leverage from other sources: This will include any other source of public funding leverage to conduct or advance the ISPF project
- direct private funding: This will include funding provided by the private sector participant to conduct or advance the ISPF project, including (but not restricted to) match funding requirements
- private funding leverage from other sources: This will include any other source of private funding leverage to conduct or advance the ISPF project (including equity investment)
Benefits - We recommend focusing on 2 indicators: turnover and productivity. In line with the HMT Green Book, the focus should be on the latter, insofar as it is a better indication of additional economic activity / gains. However, productivity gains tend to take longer to materialise, so we suggest conducting the exercise on both indicators. Additionally, we also suggest conducting a similar exercise for employment. This will not feed directly into the RoI (as it will not be monetised) but provides a useful additional impact indicator. An illustrative sketch of how costs and a monetised benefit could be combined into an RoI ratio is shown below.
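By way of illustration only, the following sketch (in Python) shows how the cost categories above and a single monetised benefit estimate might be combined into a simple benefit-to-cost ratio. All figures, the choice of turnover as the monetised benefit, and the variable names are hypothetical assumptions; a real calculation would also need to follow HMT Green Book conventions (e.g. on discounting and additionality).

```python
# Illustrative only: hypothetical cost and benefit figures for a cohort of
# ISPF business participants (all values and names are assumptions).
costs = {
    "direct_public_funding": 5_000_000,     # value of the ISPF grants
    "public_funding_leverage": 1_200_000,   # other public funding leveraged
    "direct_private_funding": 900_000,      # participants' own / match funding
    "private_funding_leverage": 600_000,    # other private funding, incl. equity
}

# Hypothetical monetised benefit: net additional turnover attributed to ISPF,
# e.g. as estimated via the quasi-experimental analysis described below.
net_additional_turnover = 9_500_000

total_cost = sum(costs.values())
roi_ratio = net_additional_turnover / total_cost
print(f"Total cost: £{total_cost:,.0f}")
print(f"Benefit-to-cost ratio: {roi_ratio:.2f} (£ of benefit per £1 invested)")
```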
We also recommend following a quasi-experimental design (QED) approach, to capture the net effect of ISPF on business performance. This will entail capturing information on 2 groups:
- Treatment group: UK participants in ISPF projects
- Control group: There are 2 options for a control group:
- One option is to use unsuccessful applicants as a control group. The logic is that, by applying, unsuccessful applicants have expressed interest in developing a project for ISPF (which in turn demonstrates an interest in developing R&I projects within an international collaboration context). Unsuccessful applicants could of course have different characteristics to participants (successful applicants), and these differences can be addressed (or minimised) by implementing appropriate statistical techniques. This group can be used for programmes that have a competitive call for proposals, and where it is possible to get access to unsuccessful applicant details.
- Another option is to use businesses from the general population as a control group. This set of businesses is less comparable to the treatment group (and to unsuccessful applicants), and will require implementing appropriate statistical techniques to further ensure comparability, as well as an understanding of certain aspects (such as innovation behaviour). This group can be used for programmes that do not have calls for proposals and consequently, by definition, have no unsuccessful applicants.
In terms of specific QED methods, we suggest combining a Difference-in-Difference approach with Propensity Score Matching:
- Difference-in-difference (DiD): As stated in the HMT Magenta Book, impact is measured by studying the outcome of interest before and after the intervention for 2 groups; one subject to the intervention and the other not. First, the trend lines for the outcome of interest (turnover and productivity in this case) for the 2 groups are compared for the pre-intervention period. Where these trend lines move in parallel over time, a counterfactual trend can be estimated for the treated group, which is then used to estimate the impact of the intervention. Note that DiD works best when it is possible to get access to trend data; in practice, this would only work if it is possible to match participant companies to secondary data sources (such as IDBR), and it will be more difficult to apply using primary data collection (which tends to have a high level of attrition for financial questions).
Note that we have considered proposing a Regression Discontinuity Design (RDD), which would entail building a control group from unsuccessful applicants that scored close to the threshold needed to be successful. However, this may not be possible if not all programmes have set and known scoring systems to select participants, and / or if the sample size of unsuccessful applicants that only narrowly missed the required scores (or met them but were not funded due to unavailable budget) is small. As such, we do not recommend using RDD.
- Propensity score matching (PSM): As stated in the HMT Magenta Book, PSM is a statistical technique that enables evaluators to construct a counterfactual group to estimate the impact of an intervention. This is achieved by matching treatment observations to one or more control observations based on their probability of being treated (their propensity score). This is calculated using observable characteristics that determine the likelihood of participation and varies between 0 and 1 (where 1 is 100% likely to be treated). The technique allows the identification of comparable treatment and control groups. It will help to improve the robustness of the analysis if unsuccessful applicants are used as a control group, and it would be a requirement if businesses from the general population are used as a control group. An illustrative sketch combining PSM and DiD is shown below.
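To make the combination of the 2 techniques more concrete, the sketch below (Python, using pandas, statsmodels and scikit-learn; these library choices are our assumption) simulates a small firm-level dataset, matches treated firms to controls on an estimated propensity score, and then estimates a difference-in-difference model on the matched sample. The data are entirely simulated and the variable names (e.g. log_turnover, rd_intensity) are illustrative; this is a sketch of the general approach, not a specification of the eventual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)

# Simulated firm-level data: ISPF participants (treated=1) and non-participants.
n = 400
firms = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "employees": rng.integers(5, 250, n),      # observable used for matching
    "rd_intensity": rng.uniform(0, 0.3, n),    # observable used for matching
})

# 1) Propensity score matching: estimate P(treated | observables) and match
#    each treated firm to its nearest control on the estimated score.
X = firms[["employees", "rd_intensity"]]
pscore_model = LogisticRegression().fit(X, firms["treated"])
firms["pscore"] = pscore_model.predict_proba(X)[:, 1]

treated = firms[firms["treated"] == 1]
controls = firms[firms["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = pd.concat([treated, controls.iloc[idx.ravel()]], ignore_index=True)

# 2) Build a before/after panel for the matched sample, with a simulated
#    10% post-intervention turnover uplift for participants (the 'true' effect).
panel = matched.loc[matched.index.repeat(2)].copy()
panel["period"] = np.tile([0, 1], len(matched))
true_effect = 0.10
panel["log_turnover"] = (
    13.0
    + 0.002 * panel["employees"]
    + 0.05 * panel["period"]
    + true_effect * panel["treated"] * panel["period"]
    + rng.normal(0, 0.1, len(panel))
)

# 3) Difference-in-difference on the matched panel: the coefficient on the
#    treated:period interaction is the estimated net effect on log turnover.
did = smf.ols("log_turnover ~ treated * period", data=panel).fit()
print(f"Estimated DiD effect (log points): {did.params['treated:period']:.3f}")
```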
Finally, in terms of data sources (and as alluded to above), we recommend using secondary data for this exercise.
There are 2 options for this analysis:
- ORBIS/FAME. A proprietary database that collects information on the financial performance of companies based on publicly available information as contained in Companies House. The main drawback of this dataset is that, in accordance with the 2006 Companies Act, small and medium-size companies can prepare and file abbreviated accounts, which means data tends to be missing for those companies.
- Inter-Departmental Business Register (IDBR). The IDBR is a comprehensive list of UK businesses used by government for statistical purposes. The IDBR provides the main sampling frame for surveys of businesses carried out by the Office for National Statistics (ONS) and government departments. It is also an important data source for the analysis of business activities. It is possible to access this dataset via the ONS Secure Research Service (SRS), which would require having accredited researchers, and following an approval process with the ONS for both the project itself and the outputs that emerge. The main drawback is that the ONS approval processes tend to take a long time and can be unpredictable. Based on past experience, the process from beginning to end can take up to 1 year.
The main advantage of using secondary data is that it is possible to construct time-series i.e. information over a long period of time across all businesses. The main disadvantage of using secondary data sources is that information tends to be aggregated across all subsidiaries making it difficult to track which specific part of the company engaged with the programme.
In contrast, collecting financial data via survey (primary data collection) tends to lead to few data points and information for only a limited set of years. Also, this approach does not necessarily solve the issue of subsidiaries, as respondents seldom hold relevant financial information at subsidiary level.
In both cases (secondary data and primary data sources), this analysis is better restricted to SMEs.
6.5 Sampling strategy
As mentioned above, we suggest applying the same sampling strategy to the development of both the VfM and QCA approaches (see Section 6.5.1). Using the same sample offers 2 main advantages:
- it maximises the use of the data set produced in the context of the study
- it adds value to and reinforces the VfM approach, by including an approach to causality
We do not foresee any major limitations in using the same data set (though we do see value in expanding the sample of 20 longitudinal cases if resources allow).
A different approach to sampling is recommended for RoI analysis (see Section 6.5.2).
6.5.1 Approach to selecting cases for VfM and QCA
To select cases for the VfM and QCA analysis, we suggest a two-stage stratified approach.
The first stage focuses on 2 main aspects: (i) the focus of the programmes in terms of ODA or non-ODA; and (ii) budget (based on total allocations as reported in Section 5, which currently correspond to the period 22/23 – 24/25).
We are suggesting a total of 20 cases (programmes) to be developed as longitudinal case studies. This is fewer 'cases' than in other similar Fund evaluations (e.g. 50 cases were included for GCRF), but this is because the unit of analysis for an ISPF case is a programme rather than a project, which requires substantially more resources to develop. In fact, the suggested selection of programmes below has so far awarded 121 projects, i.e. around 2.5 times more than covered in prior evaluations. This number of awards may also increase over time if programmes continue to make awards.
The ISPF portfolio currently has a 61% / 39% split between ODA and non-ODA programmes (based on allocations), so we suggest having a similar split in the sample of cases (i.e. around 12 ODA programmes and 8 non-ODA programmes in a sample of 20).
The budget profile of ODA and non-ODA programmes is different, so for each part of the portfolio we have calculated key statistics (median, average, standard deviations, etc., as presented in Table 12) and used these to produce budget bands, show the distribution of the population (of programmes) across those bands, and estimate a distribution of 20 case studies (see Table 13). The final result, in terms of the recommended sample size, is 21 cases, due to rounding.
In terms of implementation, in this first stage, we have randomly selected cases from the programmes that belong to each of the 8 strata (4 budget bands x 2 types of programme). An illustrative sketch of this allocation and random selection is shown below.
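The sketch below (Python, using pandas) illustrates the first-stage allocation arithmetic and random selection described above, using the population counts reported in Table 13. The rounding rule shown (round to the nearest whole case, with a minimum of one case per stratum) is a simplification of our own; the framework's allocation also reflects further judgement, so the per-stratum figures will not exactly reproduce Table 13, although the group totals are consistent. The programme identifiers are placeholders.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Stratum population counts, as reported in Table 13 (programmes per band).
strata = pd.DataFrame({
    "group": ["ODA"] * 4 + ["non-ODA"] * 4,
    "band": ["<=£1m", "£1m-£3m", "£3m-£10m", ">£10m",
             "<=£500k", "£500k-£1m", "£1m-£3m", ">£3m"],
    "n_programmes": [26, 15, 9, 3, 56, 22, 25, 7],
})

# Target number of cases per group (20 cases split roughly 61% / 39%).
targets = {"ODA": 12, "non-ODA": 8}

# Proportional allocation within each group, then rounding to whole cases.
strata["share"] = strata.groupby("group")["n_programmes"].transform(lambda s: s / s.sum())
strata["raw_allocation"] = strata["group"].map(targets) * strata["share"]
strata["sample_size"] = strata["raw_allocation"].round().clip(lower=1).astype(int)
print(strata)

# First-stage selection: draw the allocated number of programmes at random
# from each stratum (identifiers below are placeholders, not real programmes).
selection = {}
for _, row in strata.iterrows():
    programme_ids = [f"{row['group']}-{row['band']}-prog{i}" for i in range(row["n_programmes"])]
    selection[(row["group"], row["band"])] = list(
        rng.choice(programme_ids, size=row["sample_size"], replace=False)
    )
print({stratum: len(picked) for stratum, picked in selection.items()})
```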
The second stage focuses on 2 additional aspects: ISPF themes and activity types. Since some programmes include more than one ISPF theme and more than one type of activity, we suggest contrasting and comparing the distribution of the stratified random sample from the first stage against the distribution of the population (of programmes). This may require finding replacements (again at random) to balance the distribution (e.g. replacing an ODA programme in one budget band with another ODA programme in the same budget band, but with a different thematic focus or a different primary activity type).
Final sample. The second stage review showed that the sample did not capture programmes whose primary activity is Translational Research, and that the final sample could benefit from the inclusion of additional POs. With that in mind, the sample was updated to cover those aspects, with 2 cases removed, and 2 added.
Table 14 to Table 17 present the distribution of the final sample against the distribution of the population, by theme, activity type, PO and partner country. We do not recommend attempting to mimic these distributions for ODA and non-ODA programmes, as this may prove impractical (or even impossible) to reproduce given the small sample.
Table 12 - Distribution of Total Allocations 22/23 – 24/25
Statistics | ODA | Non-ODA |
---|---|---|
Median | £1,156,000 | £500,000 |
Average | £2,893,184 | £879,538 |
1 standard deviation above average | £9,372,173 | £2,027,198 |
2 standard deviations above average | £15,851,162 | £3,174,857 |
Maximum | £45,186,141 | £6,250,000 |
Total | £153,338,746 | £96,749,198 |
As % of total allocations | 61% | 39% |
Table 13 - Sample distribution, based on first stage criteria
Strata (type) | Strata (budget band) | Population (number of programmes) | % | Sample size per stratum | Sample size per stratum (rounded) |
---|---|---|---|---|---|
ODA | Up to £1m | 26 | 49% | 5.887 | 6 |
ODA | Higher than £1m and lower or equal to £3m | 15 | 28% | 3.396 | 3 |
ODA | Higher than £3m and lower or equal to £10m | 9 | 17% | 2.038 | 2 |
ODA | Higher than £10m | 3 | 6% | 0.679 | 1 |
Total ODA | [z] | [z] | [z] | [z] | 12 |
Non-ODA | Up to £500k | 56 | 51% | 4.073 | 3 |
Non-ODA | Higher than £500k and lower or equal to £1m | 22 | 20% | 1.6 | 2 |
Non-ODA | Higher than £1m and lower or equal to £3m | 25 | 23% | 1.818 | 2 |
Non-ODA | Higher than £3m | 7 | 6% | 0.509 | 2 |
Total non-ODA | [z] | [z] | [z] | [z] | 9 |
Note: [z] = not applicable
Table 14 - Sample distribution – Primary type of activity
Primary type of activity | Population | % | Ideal distribution (rounded) | Sample |
---|---|---|---|---|
International Collaborative Academic Research | 68 | 55% | 11 | 6 |
Translational Research & Impact Realisation | 3 | 2% | 1 | 1 |
International mobility (incl. fellowships, secondments) | 5 | 4% | 1 | 1 |
Institutional R&I capacity building | 7 | 6% | 1 | 2 |
International Collaborative Business-led RD&D | 20 | 16% | 3 | 2 |
Investment in & access to infrastructure / facilities | 4 | 3% | 1 | 3 |
Pump-priming & Networking and workshops | 16 | 13% | 3 | 3 |
Note: 3 programmes not tagged against “Primary activity type” as information was provided after the tagging exercise.
Table 15 - Sample distribution – ISPF themes
ISPF Themes | Population | %* | Ideal distribution | Sample |
---|---|---|---|---|
Resilient Planet | 61 | 45% | 9 | 10 |
Transformative Technologies | 63 | 46% | 9 | 7 |
Healthy People, Animals & Plants | 36 | 26% | 5 | 5 |
Nurturing Tomorrow’s Talent | 50 | 36% | 7 | 10 |
Table 16 - Sample distribution – PO
Partner Organisations | Population | % | Ideal distribution | Sample |
---|---|---|---|---|
AHRC | 3 | 2% | 0 | 0 |
AMS | 9 | 6% | 1 | 3 |
BA | 9 | 6% | 1 | 2 |
BBSRC | 7 | 4% | 1 | 1 |
BC | 12 | 7% | 2 | 1 |
CPC | 2 | 1% | 0 | 0 |
EPSRC | 10 | 6% | 1 | 1 |
ESC | 7 | 4% | 1 | 2 |
ESRC | 2 | 1% | 0 | 0 |
FI | 1 | 1% | 0 | 0 |
IUK | 9 | 6% | 1 | 1 |
MO | 14 | 9% | 2 | 2 |
MRC | 10 | 6% | 1 | 1 |
NERC | 7 | 4% | 1 | 1 |
NPL | 15 | 9% | 2 | 1 |
OREC | 4 | 2% | 1 | 0 |
RAE | 11 | 7% | 1 | 1 |
RS | 3 | 2% | 0 | 1 |
STFC | 14 | 9% | 2 | 2 |
UKAEA | 10 | 6% | 1 | 1 |
UKRI | 1 | 1% | 0 | 0 |
UUK | 3 | 2% | 0 | 0 |
Table 17 - Sample distribution – partner countries
Partner country | Population | % | Ideal distribution | Sample |
---|---|---|---|---|
Australia | 11 | 7% | 1 | 1 |
Brazil | 25 | 15% | 3 | 8 |
Canada | 23 | 14% | 3 | 1 |
China | 8 | 5% | 1 | 0 |
Denmark | 1 | 1% | 0 | 0 |
Egypt | 22 | 13% | 3 | 6 |
France | 6 | 4% | 1 | 1 |
Germany | 12 | 7% | 2 | 1 |
India | 24 | 15% | 3 | 5 |
Indonesia | 22 | 13% | 3 | 7 |
Ireland | 4 | 2% | 1 | 0 |
Israel | 10 | 6% | 1 | 0 |
Japan | 27 | 17% | 3 | 4 |
Jordan | 16 | 10% | 2 | 5 |
Kenya | 23 | 14% | 3 | 7 |
Latvia | 1 | 1% | 0 | 0 |
Malaysia | 21 | 13% | 3 | 7 |
Netherlands | 5 | 3% | 1 | 0 |
New Zealand | 4 | 2% | 1 | 0 |
Norway | 0 | 0% | - | 0 |
Philippines | 20 | 12% | 3 | 7 |
Singapore | 13 | 8% | 2 | 0 |
South Africa | 25 | 15% | 3 | 9 |
South Korea | 14 | 9% | 2 | 0 |
Switzerland | 12 | 7% | 2 | 4 |
Taiwan | 9 | 6% | 1 | 2 |
Thailand | 17 | 10% | 2 | 7 |
Turkey | 14 | 9% | 2 | 5 |
USA | 28 | 17% | 4 | 1 |
Viet Nam | 19 | 12% | 2 | 7 |
Notes: The percentages do not sum to 100%, as programmes can be tagged against multiple partner countries.
The ideal distribution is based on a one-case-one-country distribution, so is not directly comparable with the sample (where multiple countries may be covered by one case).
Final remarks on recommended sample
Based on the approach and parameters described above, we have identified a sample of 21 programmes. The list is provided in Appendix F.
By definition, the list follows the distribution above in terms of ODA/non-ODA programmes and budget bands. In addition, it follows closely the expected number of cases given the distribution of activity types and ISPF themes in the population (shown in Table 14 and Table 15). The sample also provides a good representation across POs (with 15 organisations included[footnote 22]) and ODA and non-ODA partner countries (13 and 10 different countries respectively are included across the sample).
The suggested sample accounts for 31% of the total allocations (£78.2m / £250.1m). However, it is important to note that this is driven by the fact that the current sample includes one of the 2 'outliers' in terms of budget (the Energy Catalyst programme, with a budget of £45.2m).
It is also important to note that Energy Catalyst has been evaluated in the past (and may be evaluated in future), so one option is to exclude it from the sample and only retain 20 cases. However, if this programme is removed from the sample, the representation of the sample against the total budget (currently 31%) will inevitably decrease. This will still be the case if this programme is replaced with another, since the next largest budget is around £10m.
Also note that these cases are expected to be developed longitudinally. Section 9 provides costing options for including additional (new) cases at the final stage of the evaluation.
6.5.2 Approach to sampling for RoI
Our initial analysis of the ISPF portfolio reveals that only 16% of the portfolio is primarily focused on International Collaborative Business-led RD&D. As such, we suggest conducting RoI analysis with all businesses participating in ISPF rather than sampling. We think this is also appropriate considering our suggestion is to use secondary sources for this analysis.
7 Process evaluation
We recommend that a light-touch process evaluation be undertaken at an early stage (2026), focused at the Fund (rather than programme) level. Key areas to explore should include:
- processes to ensure that Fund intentions and objectives are sufficiently communicated, understood and interpreted / reflected in the design of the portfolio
- processes to ensure coherence, synergies and appropriate balance across the portfolio
- processes and arrangements that ensure the Fund has sufficient agility to adapt to evolving priorities and needs, or other significant external factors
- processes that ensure the Fund is learning adequately (from ISPF and other activities) and implementing that learning
- processes to ensure appropriate levels and means of central (DSIT) oversight and management, despite the decentralised implementation of the Fund
In each case, the evaluation would be tasked with addressing 3 broad questions:
- what processes are in place, how have these been implemented, and by whom?
- how well (how effectively, how efficiently) have these processes worked, what has worked more or less well, and what have been the important facilitators and barriers?
- what lessons have been / could be learned that would help with the future implementation of ISPF, or other similar initiatives?
We would suggest that some further consultation within DSIT may be useful to validate the above areas (their relevance and completeness), and to establish whether there are specific aspects of these processes that are of particular interest (and therefore should be a key focus). This process could take place before commissioning, or as part of early scoping activities.
Based on the current list of areas and questions, the evaluation should then consider several different evidence collection and analysis activities that might include:
- desk-based review of Fund documentation
- process mapping (including key processes, actions and stakeholders involved)
- consultation with key stakeholder groups identified in the mapping (mainly individuals within DSIT and the ISPF Partner Organisations). Depending on the number of relevant individuals identified, the consultation may take different forms for different groups (one-on-one / group interviews, workshops and / or surveys)
8 Baselining approach
Following the finalisation of this evaluation framework, the study will proceed to its final phase, the Baseline Assessment. According to the specifications, the baseline assessment report should:
- Contain a mapping of programme activities to provide a detailed picture of ISPF funded activities by output typology, theme, geography, scale and value (portfolio analysis).
- Set baselines for each outcome to show the position at the start of ISPF, against which subsequent progress and achievements can be evaluated.
In addition to these requirements, we also propose to pilot some elements of the VfM rubric for a subset of the recommended sample, as explained below.
8.1 Mapping of programme activities
This part of the baseline assessment was initiated at an earlier stage (in preparation for the evaluation framework) to support the development of the sampling strategy and further our overall understanding of the Fund and how it is being delivered. A first version of this analysis is presented in Section 5 of the current report and contains information reported up to end of Q4 2023/24 (i.e. to 31st March 2024).
The baseline analysis will include the following elements:
- An updated portfolio analysis, based on information reported up to end of Q2 2024/25
- An analysis of the ToC (and update if relevant) based on the portfolio analysis.
8.2 Baseline indicators
As explained in Section 6.2, Appendix D presents an assessment of the relevance of establishing a baseline value for each indicator. Our analysis reveals that only a limited number of outcomes have available baseline values, as in many cases the starting value is zero (e.g. Value of co-funding (cash or in-kind), per programme, total and as % of ISPF Funding).
There are 11 indicators for which we have identified the possibility of establishing a (non-zero) baseline value. These are presented in Table 18 below. These indicators correspond to 4 sources of data (including VfM). Below the table we provide further detail on how those data sources would be mobilised in the context of the baseline assessment.
In addition, there are 12 indicators that are marked as "Not applicable (starting value of 0)" in the appendix, but for which we have also indicated that an "initial assessment will be done as part of the baseline evaluation workstream". These relate to indicators emerging from the Portfolio analysis, and correspond to inputs and activities as described in the ToC. As such, they will be presented as part of the mapping of programme activities described above.
Table 18 - Baseline indicators
Type | Ref No. | ToC Element | Recommended (additional / alternative) Indicators | Sources | Included in VfM? | Relevant VfM sub-dimension | Baseline | Comments on the baseline |
---|---|---|---|---|---|---|---|---|
Output | O3 | Joint areas of interest / priorities est. (country, funder, researcher / innovator) | Examples of joint areas of interest / priorities identified (country, funder) | > Programme leads template / interviews > Survey with ISPF project participants (UK and international) | No | [z] | Number of existing MoUs (gov-to-gov or international partner org) before ISPF | [z] |
Output | O11 | New and improved technologies / increased TRL | Percentage of projects that advance one or more TRL levels due to ISPF funding | > Survey with ISPF project participants (UK and international) | No | [z] | TRL starting point at the point of application | [z] |
Output | O11 | New and improved technologies / increased TRL | Percentage of programmes /projects that have made progress in terms of market readiness as a result of ISPF funding. | > Survey with ISPF project participants (UK and international) | Yes | 3.1.5 Paving the way for the uptake / application of innovation outputs | MRL starting point at programme / project start | [z] |
Output | O14 | New/improved understanding of user needs, research methods, EDI, Responsible R&I, research management, international collaborative research, MEL (among researchers, managers, industry) | Percentage of ISPF participants for whom participation on the ISPF project has led to new/improved understanding of user needs, research methods, EDI, Responsible R&I, research management, international collaborative research, MEL (among researchers, managers, industry) | > Survey with ISPF project participants (UK and international) | No | [z] | Starting point at the point of application | [z] |
Output | O15 | New/improved understanding of available research capacity, capabilities & infrastructure among partners | Percentage of ISPF participants for whom participation on the ISPF project has led new/improved understanding of available research capacity, capabilities & infrastructure among partners | > Survey with ISPF project participants (UK and international) | No | [z] | Starting point at the point of application | [z] |
Outcome | OC4 | Increased ability of UK and partner countries to collaborate on R&I (incl. access to infrastructure | Percentage of ISPF participants for whom participation on the ISPF project has led to new/improved access to research infrastructures | > Survey with ISPF project participants (UK and international) | No | [z] | Starting point at the point of application | [z] |
Outcome | OC6 | Increased research capabilities, incl. leadership (UK & ODA beneficiaries) | Percentage of ISPF participants for whom participation on the ISPF project has led to increased research capabilities, incl. leadership + examples | > Survey with ISPF project participants (UK and international) + follow up interviews | No | [z] | Starting point at the point of application | [z] |
Outcome | OC7 | Improved connectivity between industry and academia (UK & ODA beneficiaries) | Percentage of ISPF participants for whom participation in the ISPF project has led to improved connectivity with industry / academia | > Survey with ISPF project participants (UK and international) | No | [z] | Starting point at the point of application | [z] |
Outcome | OC8 | Increased or sustained quality / competitiveness R&I in ISPF themes (UK & ODA beneficiaries) | As per existing KPI, plus: Citation impact - as measured by Average of Relative Citations (ARC) and HCP (Highly Cited Papers) - of ISPF publications (total, broken down by ISPF themes, gender and field/sector). Also bibliometric analysis of international co-publications, multi- and inter-disciplinary papers, and research novelty (based on unusual combinations of cited references) | > DSIT analysis (KPI B9), based on Annual Commission (KPI B7) > Bibliometric data / analysis | Yes | 3.2.5 Attainment of outcomes: Strengthening SRTI quality | Citation impact (as described in the indicator) for ISPF researchers (before ISPF) | Baseline assessment can be made at baseline evaluation stage, or retrospectively alongside the first interim impact assessment. |
Outcome | OC10 | Increased income from commercialisation of research & technology, incl. from new markets (UK & ODA beneficiaries) | Percentage of ISPF participants for whom participation in the ISPF project has led to increased income from commercialisation of research & technology, incl. from new markets, plus estimated value | > Survey with ISPF project participants (UK and international) | No | [z] | Starting point at the point of application | [z] |
Outcome | OC12 | Increased or sustained reputation of UK as: R&I partner of choice; destination for talent | Percentage of international funders / delivery organisations for whom participation in the ISPF programme has led to a significant improvement in their own organisation’s and other organisations’ perceptions of the UK as an SRTI partner. | > VfM Case studies (interviews with international funders / delivery organisations, plus survey with international project participants) | Yes | 3.2.7 Attainment of outcomes: Improving international perceptions and reputation | Starting point at the point of application | [z] |
Note: [z] = not applicable
Programme leads template
We suggest approaching all ISPF partner organisations with a simple template to collate information on existing MoUs with international partner organisations (including ISPF partners), before ISPF. We will pre-populate the template with existing information we may find online and in the Annual Commission.
Survey with ISPF project participants (UK and international)
We suggest conducting a short survey with all ISPF participants (UK and international) to collect baseline information. The survey will be distributed online and will contain 6 sections. A draft of the survey is shown in Appendix G.
Bibliometric data / analysis
We suggest collecting citation impact (as measured by Average of Relative Citations (ARC) and Highly Cited Papers (HCP)) for:
- the UK overall (i.e. for publications with at least one UK based author)
- the UK in collaboration with ISPF partner countries (i.e. for publications with at least one UK based author, and at least one author based in an ISPF partner country), by ODA and non-ODA. We suggest conducting this exercise in aggregate form rather than for each combination of UK and ISPF partner country (i.e. 25 times).
For both groups we also suggest producing indicators overall and by: ISPF Theme; field of research; and by gender of authors (papers that include at least one female author, in comparison with others). Finally, we suggest conducting this analysis for 10 years prior to ISPF, 2013-2023 (noting that citation data has 2-3 years of lags after publication, in addition to the 2-4 years that it can take for publications to emerge from specific projects and programmes).
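As an illustration of how these indicators could be computed once publication-level data are available, the sketch below (Python, using pandas) calculates an Average of Relative Citations and a Highly Cited Papers share from a small hypothetical dataset. The field-year baselines, the use of a top-10% citation threshold to define HCP, and all column names are assumptions made for the purpose of the sketch; the actual definitions would follow the chosen bibliometric data provider.

```python
import pandas as pd

# Hypothetical publication records: citation counts plus field-year baselines
# (in practice these would come from the chosen bibliometric database).
pubs = pd.DataFrame({
    "year": [2015, 2016, 2016, 2018, 2020],
    "field": ["Energy", "Health", "Health", "AI", "AI"],
    "citations": [12, 3, 40, 25, 1],
    "field_year_average": [8.0, 6.0, 6.0, 10.0, 4.0],         # expected citations
    "field_year_top10pc_threshold": [20, 15, 15, 30, 12],     # top-10% citation cut-off
    "uk_partner_copub": [True, False, True, True, False],     # UK + ISPF partner-country authors
})

# Relative citation score per paper, then the Average of Relative Citations (ARC).
pubs["relative_citations"] = pubs["citations"] / pubs["field_year_average"]
arc_all = pubs["relative_citations"].mean()
arc_copubs = pubs.loc[pubs["uk_partner_copub"], "relative_citations"].mean()

# Highly Cited Papers (HCP): share of papers at or above the top-10% threshold
# (the 10% convention is an assumption for this sketch).
pubs["hcp"] = pubs["citations"] >= pubs["field_year_top10pc_threshold"]
hcp_share = pubs["hcp"].mean()

print(f"ARC (all UK papers): {arc_all:.2f}")
print(f"ARC (UK co-publications with ISPF partner countries): {arc_copubs:.2f}")
print(f"HCP share: {hcp_share:.0%}")
```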
8.3 Testing the rubric
We also suggest using the baseline phase to test the Economy component of the VfM rubric, which contains 4 sub-dimensions on relevance (and alignment with ISPF objectives) and one on leverage. We also suggest collecting baseline information on international funders' / delivery organisations' perceptions of the UK as an SRTI partner. These are the parts of the VfM rubric that are most appropriate for undertaking a first assessment at this early stage.
We would conduct this exercise for 10 programmes (out of the sample of 21). Since this exercise is meant to be exploratory rather than representative, we suggest selecting them using purposive sampling, to cover 10 different POs and a diversity of main activities (see Table 19).
Table 19 - Suggested sampling for testing the rubric
PO | ODA/NON-ODA | Primary Activity Type |
---|---|---|
BC | Non-ODA | International mobility (incl. fellowships, secondments) |
UKAEA | Non-ODA | International Collaborative Business-led RD&D |
BBSRC | Non-ODA | International Collaborative Academic Research |
EPSRC | Non-ODA | International Collaborative Academic Research |
STFC | Non-ODA | Investment in & access to infrastructure / facilities |
AMS | ODA | Networking and workshops |
MRC | ODA | International Collaborative Academic Research |
MO | ODA | International Collaborative Academic Research |
BA | ODA | Institutional R&I capacity building |
IUK | ODA | International Collaborative Business-led RD&D |
9 Resourcing and timetable
Table 20 presents a recommended approach for implementing the evaluation framework. This includes the main evaluation elements and stages, the suggested timing of each and broad estimates of the costs. These figures are indicative and presented to give a sense of scale, thereby supporting future design and commissioning.
We also present 3 options within the table, which highlight differences in cost if different decisions are made with respect to 2 elements: the number of cases included within the sample for longitudinal case studies, and (the focus of) the bibliometric analysis.
- Option 1: Based on 21 longitudinal case studies, plus bibliometric analysis with a focus on the UK only.
- Option 2: Based on 21 longitudinal case studies and 5 new (additional) cases at the final stage, plus bibliometric analysis with a focus on the UK only.
- Option 3: Based on 21 longitudinal case studies and 5 new (additional) cases at the final stage, plus bibliometric analysis with a focus on the UK and 5 key partner countries.
The suggested budget ranges from £1.3m to £1.6m, which is in keeping with current guidelines from the DSIT Evaluation Strategy (which suggests that evaluation budgets in DSIT are expected to be proportionate to the programme and the relevant evidence base, constituting circa 1 to 10% of the programme budget, with the primary rule for budget allocation being proportionality).
As stated above, we recommend that the approach set out in this document is iterated and evolves as more evidence becomes available. In particular, the framework should be revised and updated after the Interim Effectiveness evaluation.
Table 20 - Costing and timetable (indicative)
Elements of the evaluation | Timing | Option 1 | Option 2 | Option 3 |
---|---|---|---|---|
Process Evaluation | 2026 | £50k | £50k | £50k |
Interim Effectiveness Evaluation: incl. portfolio assessment, indicator analysis, VfM assessment & QCA | 2026 | £600k (incl. 21 x £20k longitudinal case studies, plus VfM & QCA) | £600k | £675k (Option 2 plus additional bibliometric data, £15k-20k per country) |
Progress report (updated analysis of portfolio and annual commission data) | 2027 | £50k | £50k | £50k |
Final Summative / Effectiveness Evaluation: incl. portfolio assessment, indicator analysis, VfM assessment & QCA, and RoI | 2028 | £600k (incl. 21 x £20k longitudinal case studies, plus VfM & QCA) | £700k (Option 1 plus 5 x £20k case studies) | £775k (Option 2 plus additional bibliometric data, £15k-20k per country) |
Total | | £1,300k | £1,400k | £1,550k |
- HMG (2022). International comparison of the UK research base
- HMG (2014). The case for public support of innovation at the sector, technology and challenge area levels
- Technopolis (2018). Drivers and Barriers for Collaboration, prepared for BEIS (not published yet)
- Least Developed Country is a legally established term, with the definition of countries proposed by the Development Assistance Committee (DAC) of the Organisation for Economic Cooperation and Development (OECD). It refers to countries with low development scores across a variety of metrics, and is different to defining countries as Low-, Middle- or High-Income (purely a measure of per capita income).
- N.B. The IDWP was developed as a cross-party document and remains valid after the 2024 change in government; the Integrated Review was explicitly a policy document of the previous government and as such should not be considered current policy.
- The definition of the typology was based on desk research (a review of documentation on the Fund and programmes) and a process of consultation, iteration and validation with DSIT and POs (via meetings and workshops). ISPF represents a large (and growing) portfolio of varied activities (not all of which have yet been fully defined), being delivered by different partner organisations who use varying terminology and categorisations in their own operations. Arriving at a single Fund-wide categorisation was therefore not without challenges. However, POs were invited to tag each of their programmes using one or more of the proposed categories (and to indicate the main type for each programme), and all were able to do so.
- The allocations data also includes £62.3m in additional allocations for DSIT delivery costs and institutional support awards through the 4 UK higher education funding bodies. However, these are excluded from the remainder of the analysis presented in this section.
- Includes Programme Spend and Award Spend (but not PO delivery costs) to Q4 2023/24.
- Includes Programme Spend and Award Spend (but not PO delivery costs) to Q4 2023/24.
- One or more partner countries are listed against each programme. All have been included.
- Some ODA programmes specify "LDCs" instead of / as well as listing specific countries – these "LDC" entries are not included in the figure below.
- Includes Programme Spend and Award Spend (but not PO delivery costs) to Q4 2023/24.
- One or more partner countries are listed against each programme. All have been included.
- Note that figures sum to >100% as programmes can be tagged against multiple themes.
- Tagged by POs, based on a typology defined as part of ToC development. For definitions of each activity type see Section 4.3.3.
- Calculations based on 77 programmes where activity type is known.
- A further 8 ideas for KPIs were originally proposed in the ISPF Monitoring, Evaluation and Learning Plan, but required further thought and development. These KPIs are listed in Appendix D.2, along with a note of when and where it has been possible to take these forward through the recommended indicators proposed.
- Definition of the 4 E's taken from 'DFID's Approach to Value for Money – Guidance for external partners' (June 2020)
- HMT Magenta Book. Annex A. Analytical Methods for use within an evaluation.
- Haien Ding (2022). What kinds of countries have better innovation performance? A country-level fsQCA and NCA study, Journal of Innovation & Knowledge, Volume 7, Issue 4, 2022, 100215, ISSN 2444-569X
- POs not included within the sample are AHRC, CPC, ESRC, FI, OREC, RAE, RS, UKRI (itself), and UUK.