Research and analysis

5GIR and SIPP process evaluation report (application process): executive summary

Published 13 May 2026

Executive summary

1.1 Introduction

To support the Department for Science, Innovation and Technology’s (DSIT) ambition to deliver world-class digital infrastructure across the UK, drive innovation and unlock opportunities for economic growth, DSIT is delivering 2 wireless infrastructure programmes as part of its Wireless Infrastructure Strategy [footnote 1]: 5G Innovation Regions (5GIR) and the Smart Infrastructure Pilots Programme (SIPP). Details of these programmes can be found in Section 2.1.2 and Section 2.1.3 of this report.

Government departments are expected to undertake comprehensive, robust and proportionate evaluations of their policy interventions in order to understand how policies and programmes are working and to ensure the best value for public money [footnote 2]. To that end, in February 2024, DSIT commissioned KPMG to undertake programme-level evaluations of 5GIR and SIPP.

The 5GIR and SIPP evaluations will comprise process, impact and economic evaluations. The evaluations for the 2 programmes will be conducted concurrently due to the synergies between the programmes and their evaluations.

In May 2024 KPMG produced a scoping and baseline report, signed off by DSIT, setting out: the research questions to be answered by each evaluation; the approach to the evaluations; and their associated timings. The process evaluation will consist of 2 elements: one element covering the application process, to be conducted between June and September 2024; and the other element covering the delivery of the programme, to be conducted between April and August 2025.

This report sets out the findings of the first element of the process evaluation – relating specifically to the application process.

In delivering this process evaluation we have worked closely with the DSIT evaluation project team [footnote 3] and the Steering Group [footnote 4] that has been set up as part of the 5GIR and SIPP evaluations (hereafter referred to as the DSIT evaluation team and Steering Group respectively).

1.2 Process evaluation research question (application process)

The agreed evaluation research question, as detailed in the scoping report [footnote 5], for this element of the process evaluation is:

  • What went well and what could be improved with regard to the 5GIR/SIPP programmes, specifically in relation to the application process (from the point of view of Local Authorities (LAs) and DSIT)?

1.3 Data and evidence collection

The approach taken to delivering the process evaluation, agreed as part of the scoping report, aligns to the principles of the HM Treasury Magenta Book [footnote 6].

The main research methods that were used to collect data/information for this evaluation are set out below:

  • Review of documentation:

    • A wide range of documentation was reviewed to help answer the evaluation research question. Documentation included: pre-competition engagement materials; competition application guidance documents; and DSIT scoring assessment spreadsheets, amongst other internal programme documentation.
  • Interview with DSIT officials:

    • An interview with 9 DSIT officials involved in the delivery of the 2 programmes was undertaken to gather primarily qualitative evidence to support the process evaluation. The interview provided insights into officials’ views on the application processes and design of the 5GIR and SIPP programmes. This insight was based on officials’ involvement in the design and subsequent delivery of the 2 programmes and their interaction and engagement with LAs through the pre-application and post-application periods.
  • Surveys:

    • 3 separate surveys were run over the course of the evaluation to help answer the evaluation research question. 2 of these surveyed applicants to the 2 programmes respectively: the 5GIR Survey, which surveyed successful and unsuccessful 5GIR applicants; and the SIPP Survey, which surveyed successful and unsuccessful SIPP applicants. A third survey was conducted of all LAs across the UK (the All LA Survey) to obtain input specifically from those who did not apply to one or both of the programmes.
    • Response rates to the surveys were good, providing confidence in the results derived from them. 68 LAs responded to the All LA Survey. For the 5GIR Survey we received similar response rates from successful applicants (70%) and unsuccessful applicants (65%). For the SIPP Survey we received a full response from successful applicants (ie a response rate of 100%). However, we received only 2 responses from the 5 unsuccessful SIPP applicants. As a result, some care is required when interpreting the results for unsuccessful SIPP applicants given the small number of responses involved.

1.4 Key findings

1.4.1 Summary of the 5GIR process evaluation findings

Effective advertising with room to improve further:

  • DSIT officials felt that the programme’s advertisement strategy, which included direct communication with LAs as well as national press releases and social media engagement, proved effective in reaching a wide audience. Indeed, officials noted that the 5GIR briefing event attracted the highest registration numbers of any previous DSIT programme event.

  • Respondents to the 5GIR Survey cited a wide range of means of becoming aware of the opportunity. An email from DSIT was reported as the most common means of becoming aware of the 5GIR opportunity (a finding replicated in the All LA Survey for those LAs who were aware of the 5GIR opportunity).

  • Nevertheless, the All LA Survey suggests that there is still room for improvement in outreach efforts to raise awareness of available opportunities. Three-quarters of LAs that responded to the All LA Survey and had not applied for the 5GIR programme were not aware of the opportunity.

  • Investing in such outreach (for example, by developing improved contacts at LAs) could be beneficial, as a significant proportion of LAs expressed interest in applying for similar programmes in the future. Around two-thirds of respondents to the All LA Survey that had not applied for the 5GIR programme stated they would be interested in applying for opportunities similar to 5GIR in the future.

High application interest:

  • The 5GIR programme received 36 applications, reportedly exceeding DSIT officials’ initial expectations. Moreover, applications were received from every region of the UK.

Strong application quality:

  • DSIT officials felt the quality of applications was high, with 21 shortlisted and 10 ultimately successful. They reported that successful applications demonstrated clear vision, ambitious goals, strong partnerships, and sustainable funding models.

Innovative funding method:

  • DSIT officials said that, based on their interactions with LAs during the application process, the innovative funding method employed for the 5GIR programme (which gives LAs increased flexibility and freedom in devising plans to meet the grant requirements) was an attractive feature for LAs and was generally well received by them.

  • Over a third of respondents to the 5GIR Survey tended to, or definitely, agree that the method of funding was an important factor in their decision to apply for the 5GIR programme.

  • The DSIT team that designed the programme reported that they had considered aspects that would make the programme attractive to LAs – particularly illustrated by the funding mechanism employed.

Areas for improvement:

  • Following their experience of the application process, officials identified areas for potential improvement, including: extending the application timeframe; increasing funding allocation; and strategically timing the competition outside of holiday periods. Whilst this provides useful insights for future programmes, DSIT officials noted that the timing of the application window for the 5GIR and SIPP opportunities was constrained by the spending review cycle, which meant the funds for these programmes needed to be disbursed by March 2025.

  • Timing was also identified as a challenge in responses to the 5GIR Survey:

    • Almost half (46%) of respondents ‘definitely disagreed’ that they had been given sufficient time to make the 5GIR application.
    • In free-text responses, a number of respondents highlighted that the timing of the competition (through the summer holidays) made the process of applying difficult.
    • Finally, a majority of respondents that were not successful in applying for 5GIR funding (10 out of the 17 unsuccessful applicants that responded to the survey) noted they had not received feedback from DSIT on their bids. Of these, 6 specified that they had requested feedback from DSIT (the application guidance stated that all applications would receive feedback upon request). However, DSIT officials noted that, according to their own records, 4 applicants had requested feedback but had not been sent it, and that this was an oversight on DSIT’s part.

1.4.2 Summary of the SIPP process evaluation findings

Targeted advertisement:

  • When compared with the 5GIR programme, SIPP employed a more limited and targeted advertisement approach. Consistent with the difference in the size of the budgets for the 2 programmes (5GIR accounting for over £36 million of central government spend compared with £1.3 million for SIPP), a narrower range of channels and methods was used to advertise the SIPP opportunity.

Moderate application interest:

  • Whereas the 5GIR opportunity gave applicants considerable autonomy to suggest how the funding might be spent, SIPP was narrower in scope – focused particularly on the use of advanced wireless technology on multi-purpose poles. SIPP received 11 applications, reportedly in line with DSIT officials’ expectations given this narrower scope. However, the All LA Survey showed that a significant proportion (84%) of LAs that did not apply to SIPP were not aware of the opportunity.

Mixed application quality:

  • DSIT officials felt the quality of SIPP applications was mixed, with some low-quality submissions. This was attributed to:

    • limited time and resources for completing applications – the SIPP competition was open for applications for around 4 weeks over the summer period, 2 weeks less than the period for the 5GIR competition; and
    • the lack of existing relationships between LAs and the external stakeholders needed to develop proposals, and little time for these to be developed within the application timeframe.

Limited geographical spread:

  • Successful applications were concentrated primarily in England, with one successful application from Scotland; there were no SIPP applications from Wales or Northern Ireland. Officials felt the limited number of applications from across the UK reflected the programme’s focus on specific types of project, which appealed mainly to LAs already engaged in similar initiatives. However, the All LA Survey shows that 84% of respondents that had not applied for SIPP were not aware of the opportunity, indicating that low levels of awareness may also have contributed to limited applications, particularly given that a majority of respondents to the All LA Survey reported they would be interested in similar programmes in the future.

Match funding requirement:

  • Officials felt that SIPP’s match funding requirement may have discouraged potential applicants.

  • This was reflected in survey responses, with one of the most common points raised in the free-text comments to the SIPP Survey concerning the lack of clarity over the match-funding element of the programme.

  • Respondents to the SIPP Survey were generally positive about the application process, particularly praising the funding mechanism used. Over 60% of respondents tended to, or definitely, agree with the statement, “This method of funding was an important factor in my decision to apply for SIPP”. Neither the match funding nor the funding mechanism employed in SIPP were raised as reasons for not applying amongst non-applicants.

Areas for improvement:

  • Areas for potential improvement identified by DSIT officials and LAs via the surveys include:
    • simplifying match funding requirements;
    • being clearer in the application guidance and associated materials about the specific outcomes expected from the programme, so that LAs have a better understanding of these;
    • extending the time period for applications; and
    • raising awareness of the opportunity more generally.
  1. See: UK Wireless Infrastructure Strategy 

  2. See: About us - Evaluation Task Force - GOV.UK 

  3. The DSIT evaluation project team has overall responsibility for the evaluations and provides ongoing input and direction in relation to the evaluation such that it meets DSIT’s requirements; the team is comprised of DSIT officials. 

  4. The Steering Group provides advice, guidance, scrutiny and challenge to the evaluations, with the aim of supporting the evaluations and ensuring the findings are robust and provide useful insights to build the evidence base in relation to digital infrastructure. It consists of DSIT officials covering the areas of policy, analysis, benefits realisation and technical expertise with respect to the 5GIR and SIPP programmes, together with a member of the Cabinet Office’s Evaluation Taskforce. 

  5. HM Treasury (2020), Magenta Book. 

  6. The Theory of Change identifies the changes an intervention is seeking to make, how it is expected to happen and the measurable outputs, outcomes and impacts associated with the intended change.