Digital Growth Grant evaluation 2023 to 2024 and 2024 to 2025: executive summary
Published 21 August 2025
1. Executive summary
This evaluation assesses the process, impact, and Value for Money (VfM) of the Digital Growth Grant (DGG), delivered by Barclays Eagle Labs (BEL) between the years 2023/2024 and 2024/2025.[footnote 1] The £11,694,000 fund aimed to support 16,000[footnote 2] digital startups and scaleups, particularly those outside London.
According to the guidance on the Digital Growth Grant, DGG had four main objectives:
- Deliver support services to the digital sector, particularly in transformative/emerging technologies. These services should address key challenges faced by a wide range of tech companies from the seed funding stage to series A/B, their first or second round of financing.
- Grow regional support networks for tech start-ups and scale-ups. Activities should deliver concrete improvements in three of the areas that were identified in DSIT’s Regional Ecosystems report as crucial to the success of the digital economy. These are Investment, Innovation, and Business Growth. Support should be tailored to local needs and tech specialisms and should be developed in partnership with local bodies and existing tech groups.
- Ensure founders and firms can access digital entrepreneurship and investment readiness training with tailored advice to help develop their skills to start and grow a tech business.
- Clearly signpost start-ups and scale-ups to existing initiatives in the private and public sectors across the UK that can help them access finance, talent and markets.
The DGG was awarded to BEL, which submitted a proposed portfolio of activities designed to meet the DGG objectives. This portfolio and the allocation of DGG funding across activities were agreed with DSIT. BEL covered all administrative costs it incurred in managing the fund; these costs were not included in the DGG expenditure.
1.1 Key findings
Reach and diversity: Over 5,300 participants engaged in direct support activities. The evaluation found that the activities provided a substantial number of business interventions (14,091), which suggests a significant demand for this type of support. Although this amount is just shy of the 16,000 target, it does not include the full FY2 activities. Considering the additional FY2 activities excluded from the analysis, it is likely that the target was achieved.
The programmes exceeded diversity goals, with over half of the participants being from underrepresented groups. However, disabled founders were underrepresented.
Impact on economic growth: Accelerator programmes generated a 6% increase in employment among participant firms. Participants reported improved business skills, access to networks, and enhanced investor readiness. This assessment is based on a subset of programmes that were completed in FY1. In addition, the analysis was conducted shortly after the end of the programmes (between 6 months and a year), which means the potential longer-term employment impacts were not captured.
Regional support: 80% of FY1 participants were based outside London, achieving the original target. Ecosystem Partnership Programmes (EPPs) were especially effective, delivering tailored regional support with high satisfaction scores and strong attendance.
Investment readiness and digital entrepreneurship: While many programmes covered investment readiness, direct and tailored support (e.g., engagement with investors) was seen as the most effective. Online content received mixed engagement, and mentoring offered value, but it was unclear whether this value was additive.
Value for Money: Accelerator programmes delivered approximately £1.86 in economic benefit for every £1 invested. This may be an underestimate, particularly for scaleups. In addition, as the analysis was conducted shortly after the completion of the relevant programmes, it does not include the potential longer-term impacts.
1.2 Funded activities
The DGG-funded activities, which BEL delivered over the two funding years, included:
- Growth programmes: Structured training programmes with various components, including lessons, workshops, panels, mentoring, and networking events. Each programme targeted different stages of development or specific populations. The programmes were delivered nationally using a hybrid delivery structure with online and some in-person aspects.
- Ecosystem Partnership Programme (EPP): Regional programmes delivered by local or specialised delivery partners in a matched funding structure. EPPs were diverse and grouped into three categories for this evaluation:
  - Business accelerator: Targeted at established businesses and provided structured programmes similar to the growth programmes.
  - Skills and learning: Provided targeted skills and support for specific audiences, including founders at the ideation stage.
  - Connection and knowledge sharing: Facilitated connections between different parts of the value chain with tailored activities like mentoring.
- Connectivity Initiative (Mentoring): Online mentoring for founders.
- Learning platforms: A set of online learning activities, including:
  - LifeSkills: Free educational resources aimed at upskilling 14- to 19-year-olds on tech entrepreneurship and employment opportunities.
  - Eagle Labs Academy: A digital learning platform to help tech founders start and scale their businesses, covering all aspects of launching and running a business.
  - Reports: Published studies on various topics and industries relevant to founders and entrepreneurs.

In addition, an Advisory Board (AB) that consisted of regional and subject matter experts provided insight, advice, and direction to BEL for the delivery of DGG-funded activities and, in particular, assisted in the allocation of EPP funding.
1.3 Methods
The evaluation of the DGG, as outlined in this report, combines quantitative analysis with insights from a small number of qualitative interviews with BEL staff, delivery partners, participants, and stakeholders. It covered the delivery process, impact, and VfM of DGG-funded activities. The evaluation faced a number of challenges in terms of what analysis was possible. Before the findings are reported, the methods used are outlined, including their strengths and limitations, as these factors influence the conclusions that can be drawn from the evaluation.
1.3.1 Impact evaluation
The impact evaluation assessed how well the DGG-funded activities met the four main objectives and their resulting impacts. DGG-funded activities differed in their scope and intended outcomes, as well as in the availability of data on their outputs and outcomes. As a result, the evaluation used a combination of different types of analysis to assess the impact of these activities, taking into account the nature of possible outcomes and the available data.
The evaluation was challenged by the fact that, while it was being conducted, some Financial Year 2 (FY2) activities were still ongoing. Conducted in Q1 2025, the evaluation coincided with the completion of FY2 activities, which concluded on 1 April 2025. As a result, the evaluation primarily focused on evidence from Financial Year 1 (FY1). Where feasible and appropriate, the evaluation considered available information on FY2 activities. However, since many of these were still being delivered in parallel with the evaluation, any analysis of FY2 is presented as indicative and reflects the status of activities at the beginning of Q1 2025.
Given the above, the impact analysis that was possible for each aspect of the DGG varied. The feasible analyses that were deployed are described below, as this has important implications for the conclusions that can be drawn from this work.
Delivery of business support
The evaluation assessed whether the DGG activities met this objective by analysing:
- The overall reach of the activities (the number of business interventions delivered through the DGG-funded activities). This was used to assess whether the DGG target of reaching 16,000 businesses was achieved. The total number of business interventions was estimated using monitoring data provided by BEL. This data had some limitations (e.g., due to limitations imposed by data-sharing agreements, the data shared with the evaluation team lacked unique identifiers that would have allowed an assessment of unique participants’ reach).[footnote 3]
- The diversity of participants that had been reached, to determine whether the DGG goal of reaching at least 35% diverse individuals was met.[footnote 4] This was measured using monitoring data provided by BEL. Sufficient diversity data was available for the growth programmes, EPPs, and mentoring.
- The impact on participants of the activities, which included:
  - A robust econometric analysis of the impact of selected DGG activities on business growth. This was measured using employment growth as a proxy. The analysis focused on structured programmes that were considered likely to lead to observable changes in employment, that is, the activities within growth programmes and EPPs that could be categorised as ‘business accelerators’. Hereinafter, those programmes are referred to as ‘accelerator programmes’. An econometric analysis is a robust approach that considers the impact of participation relative to the counterfactual (a scenario in which firms would not have participated in the accelerator programmes). Findings from this analysis likely underestimate the long-term impact of these activities due to the timing of the data and its limitations. In particular:
    - the econometric analysis only considered employment impacts related to FY1 participation
    - the estimated impacts only assess the short-term employment growth after the completion of the activities (six months to a year), whereas the employment impact might be further realised several years after training completion
    - the employment data was based on data from LinkedIn, as official employment data was not suitable for this analysis, given the short time that elapsed between the activities’ completion and the evaluation
    - a full assessment of employment growth can be conducted several years after completing the FY2 activities, reflecting the longer-term possible employment impact
  - Analysis of performance metrics. This included attendance and Net Promoter Score (NPS) data provided by BEL, as well as data collected through a survey of mentoring participants.
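The counterfactual comparison described above can be illustrated with a standard difference-in-differences specification. This is an illustrative formulation only; the report does not state the exact model used:

\[
\ln E_{it} = \alpha_i + \gamma_t + \beta \left( \text{Participant}_i \times \text{Post}_t \right) + \varepsilon_{it}
\]

Here \(E_{it}\) is the employment of firm \(i\) at time \(t\), \(\alpha_i\) and \(\gamma_t\) are firm and time effects, and \(\beta\) captures the additional employment growth of participating firms relative to the counterfactual (estimated in this evaluation at around 6%).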
Supporting regional ecosystems
The evaluation assessed whether the DGG activities met this objective by analysing:
- The regional reach of the activities: The DGG aimed to provide 80% of activities to non-London-based participants. The evaluation compared the regional reach of the activities against this target and against the regional distribution of UK tech startups and scaleups (the target population). Providing location data was not mandatory when registering for the learning platforms, so geographical information for that programme was incomplete; the regional reach analysis therefore covers only the growth programmes, EPPs, and mentoring. As with the total reach, due to data-sharing agreements, the evaluation was not able to assess the regional reach of unique participants.
- The regional distribution of estimated employment impacts: Results from the econometric analysis above were used to assess how many additional employees in each region and nation of the UK were likely to be the result of DGG-funded ‘business accelerators’.
- The impact of the EPPs: EPPs were specifically targeted at supporting regional firms. As such, the evaluation assessed their performance (attendance and NPS) in particular and compared it to the performance of national programmes to assess regional effectiveness.
Access to investment readiness training, digital entrepreneurship, and signposting to other public and private opportunities
Information to support the evaluation of the last two objectives was much more limited. As such, the assessment of these objectives involved analysing:
- Investment readiness: Assessed via curriculum review, NPSs, mentoring survey results, and engagement with online learning and reports.
- Access to digital entrepreneurship: Evaluated mainly through qualitative interviews due to limited quantitative data.
- Signposting to other public and private opportunities: Assessed to have been provided across the programmes, but it was not possible to assess the benefits of these activities.
1.3.2 Process evaluation
The process evaluation relied primarily on semi-structured qualitative interviews with BEL staff, delivery partners, and DSIT colleagues to understand how programmes were delivered and managed. Interviews included questions about the process of administering the DGG, selecting delivery partners, and monitoring the programmes.
1.3.3 Value for Money (VfM)
Assessing the VfM of DGG-funded activities required quantifying the benefits and costs of the activities in monetary terms. Robust quantification of the impact was only possible for accelerator programmes. As described above, this was achieved through an econometric analysis examining business employment; therefore, the VfM analysis focused only on the outcomes of these programmes. The benefit of the accelerator programmes was quantified as the additional economic output associated with the increased employment generated by these programmes. This additional output was estimated in line with HMT’s guidance on economic appraisal (the Green Book). The benefit was compared to the costs of these programmes to generate a benefit-cost ratio.
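The resulting benefit-cost ratio (BCR) can be sketched as follows. This is an illustrative formulation; the report does not state the exact appraisal model used:

\[
\text{BCR} = \frac{\text{monetised benefit}}{\text{programme cost}} = \frac{\Delta\text{Employment} \times \text{output per worker (discounted)}}{\text{programme cost}}
\]

On this basis, a BCR of 1.86 indicates approximately £1.86 of additional economic output per £1 of programme cost.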
Because the quantification of benefits relied on the econometric analysis conducted as part of this evaluation, the caveats of that analysis (limitations on the timing and quality of data available) also apply to the VfM. As such, the results of the VfM analysis should be considered indicative, rather than representative, of the complete value of those activities. In particular, it only reflects the short-term impact on employment from participation. However, it is likely that full employment impacts will be observable only several years after completion.
The VfM analysis only captures benefits that can be measured and monetised (converted into a monetary value). Additional benefits from the programme that cannot be measured or monetised are therefore not captured in the VfM.
1.4 Findings and recommendations
The evaluation results, detailed in the main report, led to the following conclusions regarding the delivery of the DGG. Based on the evaluation conclusion, recommendations are provided to the UK government and DSIT for the deployment of future interventions aimed at supporting local tech ecosystems.
The evaluation found evidence that these activities had a positive impact on their participants. The econometric analysis showed that participation in the accelerator programmes led to additional employment growth (a 6% increase, above and beyond the growth in the counterfactual). Stakeholder interviews revealed that EPP participants found the programmes to be well tailored to their needs. Two-thirds of mentoring survey respondents were satisfied with the programme, and the learning platforms provided a range of content that users engaged with. However, this evidence varies in its strength, partly due to limitations in the information available to the evaluation. As such, drawing inferences at this aggregate level should be done with caution.
The evaluation offers some insight into what works well and what works less well in addressing this demand for support. One key feature of the DGG-funded activities was the combination of several more standardised, centrally designed programmes (the growth programmes) with regionally focused programmes (the EPP), which allowed considerable variation between regions and delivery partners in the nature of support provided to local ecosystems.
The evaluation suggests that both approaches had advantages and disadvantages. On balance, the evidence on impact is slightly stronger for EPPs. For example, EPPs received higher NPSs (an average of 61 for EPPs compared to 45 for growth programmes). This comparison should be interpreted with caution, but it suggests that approaches similar to the EPP may be successful in reaching and positively impacting the UK’s digital ecosystems.
It was beyond the scope of this evaluation to assess how the impact and VfM of the DGG and the activities it funded might:
- complement (or not) other ways of supporting the UK digital ecosystem
- compare to alternative ways of supporting the UK digital ecosystem
Future analysis could consider this question in detail, drawing on the evidence from this and other evaluations as well as other sources of information.
The following subsections provide more detailed conclusions and recommendations.
1.4.1 Delivering support services to the digital sector
Findings
This evaluation estimates that the DGG provided 14,091 business interventions[footnote 5] across FY1 and FY2, which accounts for 16% to 28% of the estimated population of UK tech startups and scaleups. Given that the FY2 reach data did not include the number of participants from activities that were ongoing during this evaluation, it is not possible to conclude that the DGG target of 16,000 was met. However, it seems likely that the excluded programmes (three growth programmes, nine EPPs, and additional active users on the Eagle Labs Academy)[footnote 6] would have reached the additional 2,000 participants needed.
The accelerator programmes[footnote 7] had a measurable positive impact on employment growth, contributing a 6% uplift, suggesting a benefit of £1.86 in economic value per £1 invested. This is an estimate of the very short-term impact of the programmes. Given the limitations noted in the employment growth economic analysis, this estimated VfM is likely to represent the lower bound of the full economic value created.
The diversity targets were exceeded, with 52% to 71% of participants identifying with at least one underrepresented group (against the 35% target). However, founders from some hard-to-reach groups, such as those with disabilities, remained underrepresented in the programmes (between 9% and 11% across growth programmes and EPPs) compared to their representation in the UK tech startup and scaleup population (estimated at 25%).
Participant feedback highlighted that longer, more tailored programmes with smaller cohorts produced better outcomes in terms of satisfaction and learning.
Mentoring support was well-used and appreciated, as evidenced by the high uptake of the programme, the high NPS (78) that participants gave, and the overall high satisfaction (69% of survey respondents reported being satisfied). However, there was mixed evidence about the additionality of the programme. Two-thirds of the respondents said they would have paid for the programme had it not been offered for free, but 57% said they would have been able to receive mentoring from a different source.
Finally, engagement with the learning platforms varied significantly, with some content unused or only lightly engaged with.
Recommendations regarding future local tech startup and scaleup support interventions
DSIT should consider:
- Putting greater emphasis on accelerator programmes which offer tailored content for specific stages, founder characteristics, or geographies, given the success of the EPPs.
- Designing and incorporating more targeted strategies to reach founders in hard-to-reach groups. Approaches such as those used by BEL to identify local delivery partners for the EPPs led to programmes that were tailored to local needs; further development of these approaches could help address challenges in reaching intersectional founders.
- Including more in-person networking with experts and peers. This evaluation found that BEL participants consistently rated in-person activities as highly impactful.
- Assessing the quality and relevance of content across all delivery formats, including online platforms. BEL’s online platform offered a variety of content, but some of it did not resonate with the target audience.
1.4.2 Supporting regional startups and scaleups
Findings
In FY1, 80% of business interventions were provided to those based outside of London, meeting the 80% target set by DSIT, with two-thirds of the estimated employment growth impact located outside of London. Ideally, the evaluation would have assessed reach in terms of unique programme participants; however, this was not possible due to limitations in the data-sharing agreement. The proportion of unique participants based outside London could therefore plausibly be lower than 80%, but it is not possible to determine this with certainty.
Preliminary FY2 data indicated a decline in regional reach compared to FY1. However, since the FY2 data was not complete, it was not possible to assess this fully.
EPPs played a vital role in achieving regional reach and showed higher satisfaction (NPS) and attendance than growth programmes, but some questions about the consistency across regions remain.
Despite the success of BEL in achieving wide local reach, stakeholders suggested that some founders with the greatest needs were still unreached, as they were less likely to be in contact with local ecosystems.
Recommendations for future local tech startup and scaleup support interventions
DSIT should consider:
- Continuing to provide regional delivery, similar to BEL’s EPPs, through partnerships with local ecosystem leaders, ensuring deep local knowledge.
- Longer-term or multi-year funding commitments to allow for sustained impact in regional ecosystems. Although BEL’s delivery design, which was on a one-year basis, allowed changes between the years, stakeholders suggested that a delivery design with a longer horizon might have provided an opportunity for higher-impact programmes.
- Facilitating cross-regional collaboration to encourage peer learning and ecosystem-wide benefits. BEL’s EPPs ensured that local needs were incorporated in the programmes, but stakeholders indicated that a delivery design that allows cross-regional collaboration might further enhance the impact on local ecosystems through knowledge sharing.
- Monitoring and maintaining the proportion of regional participation across DGG-funded activities similar to the ones deployed by BEL.
1.4.3 Access to digital entrepreneurship, investment readiness, and wider opportunities
Findings
Access to investment readiness
Most activities provided investment readiness training content. Those components (e.g., resources and content provided as part of growth programmes and EPPs) had mixed performance outcomes. For example, ‘Funding Readiness: Cohort 1’ received an NPS of 25. A qualitative assessment of participant feedback suggested that the growth programmes might not have provided sufficiently impactful content in this regard.
In addition, participants preferred direct engagement with funders and tailored funding advice over generic training. Interviewed stakeholders highlighted that the main benefit, where available, was a direct connection with investors, which allowed participants to gain insight into what investors look for or, in some cases, to receive direct funding.
Finally, the mentoring programmes do not appear to have addressed investment readiness. Although one of the mentoring delivery partners specifically focused on this aspect, only 6 of the 101 interviewees mentioned this as a benefit of the mentoring programme.
Access to digital entrepreneurship
Online delivery was available across all programmes, which participants valued for the flexibility it offered. Although some in-person networking was provided, some participants felt they were missing out on the networking opportunities offered by in-person events, which suggests that a higher number of in-person events might have been beneficial.
Signposting to wider opportunities
Participants appear to have been directed to other private and public opportunities, but the impact of this was not possible to discern.
Recommendations for future local tech startup and scaleup support interventions
DSIT should consider:
- Increasing the amount of direct engagement with investors, where possible, and personalised investment readiness support in future programming. Participants suggested that when these opportunities were provided in the BEL activities, they were more impactful.
- Improving the quality control and targeting of the learning platform content. Although the learning platforms offered a wide range of content, additional processes to ensure its relevance and quality might lead to better outcomes.
- Mentoring offers that enable more sustained relationships over time. BEL’s mentoring programme allowed mentees to book sessions as needed and provided flexibility to book as many sessions as necessary. However, a more structured programme could create needed accountability and a sustained relationship between mentor and mentee.
- Standardising signposting to other activities across the programmes and developing a tracking system for referrals to other public or private initiatives, to help assess the level of signposting and the impact it might create.
1.4.4 Process and delivery model
Findings
BEL’s connections enabled a broad reach and the identification of capable delivery partners, particularly among regional and underrepresented groups. However, by relying on existing networks, particularly for the growth programme, mentoring, and learning platforms, some opportunities to identify new partners or build new relationships may have been overlooked.
The EPP delivery process, which involved outreach to wider delivery partners, was open in nature and heavily relied on the Advisory Board, which played a key role in EPP’s success.
BEL had good delivery processes that ensured the deployment and completion of the programmes. However, some aspects of financial management could have been improved.
Recommendations for future local tech startup and scaleup support interventions
DSIT should consider:
- Including advisory or steering boards, similar to the one deployed for BEL, to oversee funded programmes and ensure consistency in quality and alignment with ecosystem needs.
- Having a flexible programme design, similar to that deployed by BEL for the EPPs, collaborating with local stakeholders and experts to deliver programmes tailored to local needs while keeping a streamlined application and reporting process to reduce administrative burdens.
- Ensuring evaluation is designed at the set-up stage. This would include the design of data-sharing agreements and evaluation frameworks to maximise the impact assessment potential. It should also consider how to incorporate a longer post-programme observation period to better capture economic impacts.
1. This evaluation only covers the activities that were delivered by BEL and funded by the DGG (or through matched funding schemes). BEL ran similar activities in parallel, but those are excluded from this evaluation. ↩
2. The 16,000 target refers to activities directly funded by the DGG. Additional activities funded by BEL, which were conducted in parallel, were aimed at a total of 22,000 businesses (including the DGG-funded activities). ↩
3. Given these limitations, the reach was calculated by adding up the number of ‘business interventions’ (as defined by BEL) provided in the growth programmes and EPPs, the number of mentoring sessions delivered under the Mentoring programme, the number of unique active users (those who completed at least one lesson) of the Eagle Labs Academy, and the number of unique views for each of the reports. Beneficiaries of the LifeSkills programme were excluded, as the available data only counted the number of unique teachers who accessed this programme, rather than the number of ultimate beneficiaries (young people who attended lessons using the materials provided by the programme). ↩
4. Defined for this report as those who self-reported having at least one of the following characteristics: belonging to an ethnic minority; being female, non-binary, or another non-male gender; identifying as LGBTQ+; or having a health condition. Individuals with a health condition are defined, for this report, as those responding to the question “Do you have any physical or mental health conditions or illnesses lasting or expected to last 12 months or more?” with “A specific learning difficulty such as dyslexia or dyspraxia or AD(H)D”; “Deaf or hearing impairment”; “A long-standing illness or health condition such as cancer, HIV, diabetes, chronic heart disease, or epilepsy”; “A physical impairment or mobility issues, such as difficulty using arms or using a wheelchair or crutches”; or “Blind or visual impairment uncorrected by glasses”. ↩
5. Refers to non-unique participants, as data limitations do not allow for a unique participant assessment. ↩
6. FY2 programmes were not completed during the evaluation. As such, this report presents the FY2 activities’ outputs as indicative only. ↩
7. FY2 data for this evaluation was correct as of 14 February 2025. Three growth programmes were added late in FY2, for which monitoring data was not processed: Female Founder Pitch Deck, Female Founder Startup, and Women in Business NI. Nine EPPs were added late in FY2, for which monitoring data was not provided: Allia Impact, CyNam cyber, Raise Ventures, Sustainable Ventures, Tech South West: MarineTech, Tech South West: CleanTech, Tech South West: AI, TechSPARK, and Tramshed Tech. An additional 250 mentoring sessions were added late in FY2, for which monitoring data was not provided. ↩