Research and analysis

Open Networks Programme: initial evaluation

Published 12 September 2024

Executive summary

In April 2023, the Department for Science, Innovation and Technology (DSIT) commissioned an initial evaluation (ahead of interim (2026/2027) and final (2029/2030) evaluations) of the Open Networks Programme (ONP) (‘the programme’). The study was undertaken by an evaluation team of ICF Consulting (ICF), Aetha Consulting and an evaluation expert, George Barrett.

The aim of the ONP was to support the development and deployment of open interface architectures, principally Open Radio Access Network (Open RAN). The £295.5 million programme consists of a suite of 11 sub-programmes (referred to as ‘interventions’) which have funded a diverse set of research and development (R&D) related projects, testing facilities and supporting initiatives. The first funded activities started in summer 2021, and the ONP is currently scheduled to finish in March 2025. This initial evaluation was conducted before completion of the programme so that any learnings can be acted upon while the programme is in-flight to optimise delivery, and so that outputs and short-term outcomes can be reported on before the Spending Review expected in 2024.

Following the machinery of government changes in February 2023, responsibilities for the ONP were transferred from the Department for Culture, Media and Sport (DCMS) to DSIT, retaining the ONP’s delivery model, governance structure and teams required for implementation. For the purposes of this report, DSIT refers to both DSIT and DCMS as one entity unless otherwise specified. See contextual information regarding the ONP interventions and funded projects.

Evaluation objectives and methodology

This initial evaluation consists of a combined process, impact, and economic evaluation. All ONP interventions were within scope of the process evaluation. The impact evaluation of ONP was largely qualitative, drawing on a theory-based evaluation approach and mainly focused on the interventions that had started the earliest, and where impact could be measured:

  • Future Radio Access Network Competition (FRANC)
  • NeutrORAN Project (NeutrORAN)
  • SmartRAN Open Network Interoperability Centre (SONIC)
  • UK – Republic of Korea Open RAN R&D Collaboration (UK-ROK R&D)

A quota sample was used to ensure a spread of projects in terms of project RAG rating, technology area, size of consortia, and the DSIT bid appraisal score. At the centre of a theory-based approach is a Theory of Change model, which articulates the logic underpinning an intervention, setting out the pathways through which funded activities are expected to lead to the desired outcomes and impacts. This evaluation analysed evidence as to whether the anticipated steps along the expected pathways have materialised, and whether any change can be attributed to the ONP. The short- and longer-term impacts of the projects and interventions – including whether there has been increased adoption of Open Radio Access Network (RAN) products – are yet to be realised and will be assessed as part of interim and final evaluations. At this point in time there was limited potential for a comprehensive economic evaluation; instead, the economic evaluation in this report focused on an initial assessment of the cost-effectiveness of the Technology Readiness Level (TRL) progression achieved by ONP-funded R&D projects.

Figure ES 1.1 Evaluative Theory of Change for the ONP

The evaluation methodology comprised a range of research activities:

Interviews with 38 leads and partners from ONP projects and 5 interviews with representatives from projects that bid unsuccessfully for ONP funding.

In addition, a survey was sent to all 14 unsuccessful FRANC bidders; 2 responses were received.

Interviews with 36 officials from DSIT tasked with ONP design and delivery, including 15 intervention leads and portfolio/project managers, 5 Technical Design Advisors (TDAs) and 16 other DSIT officials.

A review of documentary evidence including programme Business Cases, benefits realisation data, Grant Funding Agreements, etc., as well as desktop research to inform the impact evaluation and to update the Key Performance Indicator (KPI) baseline.

Interviews with 15 industry representatives, including 3 of the 4 major Mobile Network Operators (MNOs) who responded to our request for interview. This was supplemented by a survey sent to vendors of Open RAN and system integrators: of 20 organisations contacted, 10 responded.

9 case studies were produced to assess ONP processes and impact at project/intervention level. These were completed for 6 FRANC projects, NeutrORAN, SONIC and UKTIN.

UKTIN survey: Analysis of responses to a survey carried out by the UKTIN of its members (88 responses received).

Evidence gathering took place between September and December 2023, when most of the projects and interventions were live and had not yet completed. It is presently too early to make conclusive statements regarding the wider market and technological impacts of the ONP as most of the benefits realised are only likely to be observed once the interventions conclude. Therefore, our evaluation focused on an assessment of the intervention design and delivery processes and whether ONP interventions were on course to deliver intended outcomes based on the Theory of Change, presented above in Figure ES 1.1.

Key takeaways

The programme is generally run well, with opportunities for improvement identified.

Organisational learning took place and there were improvements to competition guidance and the design and delivery processes of later interventions. There are opportunities for further improvement set out in the Process Evaluation section.

In terms of impact, the ONP has accelerated the development of Open RAN solutions. However, more is required to support commercialisation of Open RAN solutions and deliver the intended outcomes of the ONP.

At this stage, it is too early to assess the long-term benefits of Open RAN investment as the benefits are not expected to materialise until at least 2030.

Process evaluation findings

The process evaluation assessed the design, administration, and delivery of the ONP and its 11 component interventions. At an intervention level this included an assessment of the processes specific to proposal development[footnote 1], mobilisation[footnote 2] and monitoring[footnote 3]. The process evaluation was informed by a document review and qualitative analysis. This included thematic analysis of interviews with a sample of DSIT team members, selected members of project consortia and a sample of unsuccessful grant applicants, triangulated with survey results, management information and documents shared by DSIT.

Generally, the processes of the ONP are working well, but there are opportunities for further improvement. The requirements of projects and the effectiveness of associated ONP processes varied by intervention. Processes used to deliver later ONP interventions (e.g., Open Networks Ecosystem Competition (ONE)) were generally more effective than early interventions (e.g., FRANC). DSIT applied lessons learned from past projects to enhance future ONP interventions, showcasing a progressive approach.

The key findings of the process evaluation are captured in Table ES 1.2. These findings focus on areas where the evaluation has concluded that there is scope for improvements to the delivery of the ONP, rather than celebrating the successes of the programme.

ES 1.2 Process evaluation summary findings

1. Proposal development: There was clear evidence of active engagement with the market, industry experts and stakeholders when DSIT was designing interventions and launching competitions. This led to a higher-than-expected number of quality bids received in the competitions. Overall, the proposal development processes worked well across the interventions.

A. Intervention design

- Earlier ONP interventions were more flexible in their requirements than later interventions, which were more specific in terms of their component / use case requirements. The intervention parameters of FRANC were more loosely defined than those of ONE in terms of their link to the overall ONP objectives.

- Eighteen months was typically considered by project participants to be too short for R&D projects. It was suggested that a minimum of 2 years is required to complete collaborative R&D projects.

- Project applicants felt they benefited from the DSIT briefing events, and overall found the competition guidance useful, but would have preferred greater clarity in certain aspects. Clarity was needed to address concerns regarding organisations participating in multiple projects under the same intervention, overseas working, and the percentage of funding that Research and Technology Organisations (RTOs) can receive.

- Timeframes for the bidding process for R&D project competitions were considered too short by project applicants (these were 8-10 weeks). It was suggested by applicant interviewees that 3 or more months would provide sufficient time for them to complete internal sign-off processes on project design and consortia agreements. This would reduce delays to project mobilisation (including signature of Grant Funding Agreements (GFAs) and collaboration agreements) and might encourage greater levels of participation from larger organisations.

- Application processes for R&D project competitions could benefit from refinement to enhance efficiency and user-friendliness, ensuring better accessibility for project applicants. For interventions that took place earlier within the ONP, applicants and DSIT officials noted issues with document processing and storage in the absence of an online portal, though improvements were made and the cross-government “Find a Grant” online portal was used for the ONE competition. Innovate UK’s application portal (Innovation Funding Services) was cited as an exemplary tool by some applicants for submitting applications.

- The ONP exhibited flexibility by allowing direct awards when justified, by exception, thus streamlining processes by excluding a ‘bidding’ stage. As a result, NeutrORAN, SONIC and UK Telecoms Lab (UKTL) interventions could focus efforts on the mobilisation processes. However, this did not result in GFAs being signed more quickly in comparison with projects selected through competitions.

B. Targeting and engagement

- Workshops and briefing events held during the early stages of intervention design were considered effective and were thought by DSIT officials to have been well attended. DSIT-run workshops were attended by a diverse range of stakeholders, including incumbent vendors, academics, and small and medium-sized enterprises (SMEs). For example, the UK-ROK R&D briefing event had over 80 attendees.

- Most applicants interviewed across all competition-type interventions became aware of the competition through existing networks and partnerships outside of their engagement with DSIT about the ONP. We therefore conclude that DSIT promotion of upcoming interventions and match-making events did not have a marked impact on competition awareness and consortium building.

- There were similar levels of applicant engagement for both early and later competition-type interventions. For example, there were 129 organisations involved in FRANC applications, of which 67 (52%) were successful and 62 (48%) were unsuccessful. This compared to 142 organisations involved in ONE applications, of which 116 (82%) were successful and 26 (18%) were unsuccessful.

C. Project selection for interventions delivered using competitions

- Interviews with applicants generally did not add much value to the selection process. Interviews did not change scoring and therefore did not change the outcomes. This requirement was dropped for later interventions.

- The benefits section of the application was considered by most project applicants to be the most challenging to complete. Benefits section submissions typically had the lowest quality answers, and applicants reportedly did not fully understand the requirements for a successful bid.

- Project applicants did not always feel that feedback on the reason(s) for the bid outcome was well communicated. Project applicants reported that DSIT feedback was limited, and a few unsuccessful applicants said they did not receive any feedback.

2. Mobilisation: Whilst DSIT has made improvements to streamline the required due diligence checks, these checks are still considered resource intensive and are seen as contributing to delays in signing GFAs, and there is scope for improvement. GFA delays continue to cause problems for delivery, as the target timescale given to projects to sign GFAs from notification of selection for award (4-10 weeks) is not being met. This has resulted in delays to project start dates; later end dates or shorter project durations; and/or organisations deciding to work at risk to complete before DSIT funding must end. Overall, the pre-funding (mobilisation) processes worked somewhat well across the interventions, but there is also room for improvement.

A. Pre-GFA due diligence

- Overall, the due diligence checks for ONP were thorough and necessary. DSIT did not report any misuse of public funds by ONP grant recipients to date, suggesting the checks were credible.

- Due diligence check processes were considered resource intensive and, in several cases, perceived to have contributed to delays in subsequent GFA signature. For example, due diligence requirements, checks and Q&A were perceived by some DSIT officials to have resulted in a 3-4 week delay in signing the GFAs for several projects.

- Acquiring financial statements and commercial information was considered a challenge to complete some DSIT checks. DSIT reported issues in receiving these types of documents, and noted challenges in accepting organisational cash flow profiles as in some instances these did not align with commercial information, or straddled 2 financial years and did not transfer well to the DSIT budgets or DSIT-generated templates.

- Despite DSIT improvements to streamline due diligence requirements and checks, project interviewees that had participated in earlier interventions and ONE thought that pre-GFA requirements had not improved. DSIT acknowledged that commercial checks for recent interventions (e.g., ONE) were slower than they were for earlier interventions (e.g., FRANC and FONRC). Potential reasons included: novel issues arising, for example with respect to community interest companies, dependencies on companies in several consortia or overseas suppliers; project and DSIT resourcing; and delayed inputs from some large companies in consortia. Competition guidance may need adjusting in future competitions so that streamlining efforts are not constrained as they were for the ONE competition.

B. Grant Funding Agreements (GFAs)

- The published DSIT target timescale of 4-10 weeks following notification of award to GFA was often not met in the context of the current requirements and processes to complete due diligence and sign the GFA. Nearly all the projects across the FRANC, FONRC and ONE ONP interventions did not sign their GFA within the published timescale, with many requiring 3 months or more.

- The information requested by DSIT pre-GFA was widely considered to be overly detailed and burdensome. The majority of project participants and some DSIT officials considered that DSIT lacked flexibility in capturing what is needed for projects of varying cost and complexity, and that the guidance and paperwork could be over-complicated.

- Changes to information management were perceived to be needed to address shortcomings in documentation review systems across DSIT teams; these changes have since been implemented through the introduction of new cloud-based programme management software. During the pre-GFA processes, most project participants reported issues with version control and having to submit duplicative documents to different teams across DSIT.

- Project participants signing GFAs after January 2023 encountered challenges in ensuring subsidy control compliance, following changes in legislation. This primarily impacted ONE intervention grant recipients as the interventions were made under the enacted Streamlined Research, Development, and Innovation Route. Some project consortia chose to use external solicitors at their own expense to navigate their subsidy control questions and to satisfy themselves of their compliance with the said rules.

- Agreeing project milestones in the GFA, specifically Annex 5 of the GFA, was sometimes a source of contention. Milestones were a prerequisite to signing a GFA and there was a perception that these milestones needed to align with quarterly grant claims. It was also difficult for projects to plan ‘end to end’ milestones for R&D, and post-GFA changes to the milestone schedule were often required.

- Participant organisations often worked at risk in the absence of a signed GFA. The majority of project participants reported working at risk to ensure project delivery within the timelines set by DSIT while completing requirements before the GFA. This delay with GFAs resulted in some partners not being able to start due to the way their organisations are structured.

- From interview responses, we have concluded that DSIT’s intended improvements to the pre-GFA requirements and processes have had little impact on time required to sign GFAs for recent interventions. Of the project consortia members interviewed and involved in both the FRANC and ONE interventions, none noticed any difference in the requirements/processes and were still experiencing similar or longer delays. However, the ICF evaluation team has not conducted additional secondary analysis to determine which consortia reached GFA and collaboration agreements earlier.

C. Collaboration agreements

- Similar to other R&D programmes, there was a lack of understanding from several project consortium partners about the Intellectual Property (IP) requirements associated with collaboration agreements, which led to project delays and substantial changes to consortia. This highlights the importance of understanding the licensing of IP amongst the project consortia from the onset.

3. Monitoring: Project management processes were considered helpful by project applicants, but there is a lack of a common approach or standard applied to project reporting, as observed in the benefits realisation templates (particularly with capturing data on TRL progression). Overall, the monitoring processes worked somewhat well across the interventions.

A. Project management

- Regular meetings between project consortia and DSIT officials and TDAs were considered helpful, and most project consortia reported good working relationships with DSIT. Many were impressed by the level of DSIT involvement in their projects and found DSIT to be supportive when unforeseen changes were needed. The regular meetings also provided an opportunity for projects to receive feedback from DSIT and the TDAs.

- The frequency of DSIT personnel changes was considered disruptive to working relations with DSIT. Most interviewed project consortia expressed frustration about the turnover of DSIT personnel, for example the project managers assigned to their projects. They noted this impacted the continuity of working relationships and felt that information was not consistently or effectively communicated within DSIT to replacement personnel.

- DSIT decision-making processes and a lack of project information from project consortia caused some delays to project delivery. Several project consortia members claimed that, due to DSIT governance structures, DSIT officials were unable to provide timely decisions and required further internal approvals (such as on grant claim payments, financial or commercial matters, project change requests, and project GFA extensions), which could result in project delays while awaiting key decisions. Overall, DSIT interviewees acknowledged the delays, but considered the decision-making levels and the change control requirements placed on projects suitable, for example because of due diligence obligations for public spending and DSIT’s level of risk appetite.

B. Change requests

- The process of change requests can take a long time. Project participants from FRANC and FONRC reported the process of agreeing a change request could take weeks, which they believed was too long. Delays were caused by factors such as timeliness and quality of project information provided by project participants, and the process of approvals from DSIT.

- Information required of change requests was considered by projects to be onerous and sometimes required resubmission of documents previously shared; however DSIT is reportedly attempting to improve this. Over 100 change requests had been submitted to DSIT as of March 2024, though most projects were ongoing at the time, and ONE projects had only just started in Autumn 2023. Project participants reported being asked to submit revised project delivery plans, commercial forms and cash flow profiles despite not, in their view, always requiring an update associated with the change request.

- Project change requests were often needed immediately after signing the GFA. For example, 18 change requests were required across all ONE projects in the first few months following signature of the GFA. Some project participants reported these requests were due to changes to GFA milestones. In contrast, DSIT officials reported these changes were because of cost/commercial reasons, consortium changes and time delays.

- DSIT has made changes with the aim of aiding projects and improving how it processes change requests. On 15 November 2023, DSIT introduced new ‘thresholds’ for change requests, such that many changes could simply be notified to DSIT. DSIT also improved the guidance for projects and moved to new cloud-based project management software in July 2023 to process future change requests.

C. Grant funding claims

- Delays in projects submitting grant claims and supporting evidence, and in DSIT paying projects, are an issue. Several project consortia members experienced delays in receiving payment following a grant claim. DSIT officials confirmed that payments sometimes take longer than the DSIT target timeline. This is due to factors such as projects not meeting the requirements for claims (e.g. evidence of eligible expenditure, financial forecast updates, and evidence of delivery progress), together with DSIT internal resourcing constraints and problems associated with the machinery of government organisational changes, such as the change of payment services provider.

D. Benefits realisation

- DSIT had made changes to the design and contents of the benefits realisation templates to reflect learnings from earlier interventions and similar programmes (e.g., the 5G Testbeds and Trials (5GTT) Programme). Benefits realisation templates were adapted from those of the 5GTT Programme and streamlined for later interventions (e.g., ONE) in light of learnings from the 5GTT Programme and earlier interventions (e.g., FRANC).

- There is an inconsistent level of benefits realisation reporting across projects/interventions, likely due to the complexity of benefits reporting in R&D programmes. The DSIT benefits realisation team provided advice and guidance to projects on how to complete templates so projects could demonstrate their benefits. However, a review of the benefits realisation templates by the ICF evaluation team found that the quality, completeness, and level of detail captured varied across the projects. This included incomplete lessons learned sections, inconsistent TRL reporting, and missing tabs (e.g., knowledge dissemination, project change log, investment stimulation). Some differences were due to iterations of the worksheet over time and others appeared to be due to older templates remaining in use. This impacted the consistency of what was captured.

- Enhanced engagement from DSIT project managers with the benefits realisation reporting could help to improve the completeness of the information submitted by projects. Currently, the DSIT project managers do not check the benefits realisation templates as this goes beyond the scope of their role. Sign-off is only required from the DSIT benefits realisation lead with TDA commentary and input.

E. Technology Readiness Level (TRL) reporting

- TRL reporting across projects and interventions lacked consistency and completeness, particularly for earlier interventions, where the requirement had not been sufficiently clear to projects. There were inconsistencies between TRLs reported in the benefits realisation templates and by project representatives during interviews.

- TRL tracking is difficult: assessments can involve an element of subjectivity and, at times, the TRL framework lacks relevance to the specific project or intervention. Reporting of TRLs varied across projects, such that in some cases a TRL was provided for a specific component of the project and in other instances a TRL was recorded for the overall project. In some instances, it was not appropriate to assign a TRL at all. Some projects would benefit from clearer guidance around TRLs.

Impact evaluation findings

The impact evaluation included a ‘bottom-up’ assessment of individual project/ intervention-level impacts and a ‘top-down’ programme-wide assessment to evaluate progress against overall programme aims. We evaluated the evidence as to whether ONP interventions have delivered the intended outcomes based on the Theory of Change, and if not whether they are on course to do so. We also considered the additionality of outcomes, such as whether any change can be attributed to ONP interventions or other factors.

Our review of individual projects/interventions, including the 9 case studies, considered their contributions to the 5 key outcomes outlined in the Theory of Change and found that, overall, the ONP has helped address technical barriers, but there is now more to do to support commercialisation of Open RAN products. To date, the ONP achievements include the following:

ES 1.3 Project/intervention-level impact evaluation summary findings

Accelerating the maturity of Open RAN products and solutions

ONP projects have successfully accelerated the development of several Open RAN products. Interviewees from FRANC projects reported that projects which started at a very low (early-stage) TRL (TRL 3, proof of concept) had progressed as far as a trial in a lab environment or other relevant environment (TRL 4 or TRL 5).

As expected at this early stage, most ONP projects have not (yet) developed Open RAN products to the point where they are market ready. Only NeutrORAN products had reached TRL 8 or TRL 9 when the evaluation was carried out, but testing had not been carried out on live operator networks. So, for now, it is uncertain whether they are truly commercial products.

Several FRANC projects sought, and in some cases secured, ‘follow-on’ funding under the ONE intervention to build on the progress they made. For example, the FRANC Accelerating RAN Intelligence (ARI) 5G project supported the development of a RIC platform (an innovative Open RAN technology) and 4 exemplar use cases. This has now progressed into the ONE Accelerating RAN Intelligence across Network Ecosystems (ARIANE) project which is investigating the operation of multiple applications running on multiple RAN Intelligent Controllers (RICs) on a network.

Evidence indicates that the additionality of the ONP outcomes is high, since funded projects are judged unlikely to have gone ahead in their current timescales without public funding. Project consortia reported that ONP funding enabled their organisations to assemble research teams and undertake R&D at a scale and pace that would not otherwise have been feasible.

Further evidence on the additionality of the ONP-funded projects comes from research with unfunded projects. Interviews were carried out with 5 unsuccessful bidders to the FRANC competition (there were a total of 20 unfunded FRANC projects). Two projects went ahead anyway with different designs, funded using internal resources and private equity investment. One project changed scope and was successful in the ONE competition, one did not go ahead, and the fifth was, when interviewed, unsure what would happen (it is our understanding that it has still not progressed).

Interoperability tests carried out on Open RAN products

- SONIC[footnote 4] has successfully provided Open RAN equipment vendors with a well-regarded and trustworthy environment where they can test Open RAN products. During the evaluation, 3 of the 4 planned test environments were operational and had been used by 19 vendors to test products.

- At the time of this evaluation, the absence of an outdoor testing facility had to some extent limited the validity of some of the tests that could be undertaken at the SONIC lab. During SONIC’s work between January 2022 and March 2024, only lab and indoor testing environments were available to users; the outdoor testing environment was planned during Phase 2 and will be available to use in 2024/2025.

- Having the SONIC facility, which is one of very few such facilities in the world, is perceived by some industry interviewees to have raised the profile of the UK. Some interviewees from within the telecoms industry – unprompted – reported that, in their opinion, the SONIC labs have helped to raise the profile and reputation of the UK as a location for Open RAN research.

- The additionality of the tests carried out at the SONIC facilities is judged by the ICF evaluation team to be high. In the absence of the funding and impetus provided through the ONP, it is likely that the SONIC lab facility would not have been built as no individual organisation that stood to benefit from SONIC would have successfully mobilised the resources to fund it.

- In the longer term, the UKTL will support security testing, security research and functional secure interoperability testing of equipment and software. The UKTL is still in its infancy, and the ICF evaluation team has not assessed its contribution to this ONP outcome.

Increased confidence in and adoption of Open RAN products by MNOs

- About half of ONP projects have developed products and undertaken trials with at least some degree of involvement from one or more of the UK mobile operators. Of the 2 R&D-focused interventions – FRANC and ONE – MNOs were full partners in 14 of the 34 projects. Projects have involved trials being undertaken at MNOs’ own laboratories or test sites or at the edge of their networks (e.g., a neutral host site or private network connected to the public network).

- To increase the confidence of MNOs in Open solutions and drive adoption, the next key stage is for operators to test Open RAN technologies on their live networks – ideally with actual customers. The products supported through the ONP have typically not (yet) reached this stage, mostly because the FRANC intervention focused on less mature, early-stage technologies. The ONE intervention will support later-stage technologies to reach market readiness (and thus adoption) (e.g. the ONE SCONDA project will deploy Open RAN small cells across Glasgow).

- Adoption of Open RAN technologies by MNOs requires a range of challenges to be addressed, which the ONP is contributing towards. Interviews carried out with MNOs for this evaluation highlighted that, when deploying Open RAN on their networks, MNOs want certainty that using equipment from different suppliers will work reliably with good performance.

Catalysing the UK Open RAN innovation ecosystem

It is too early to assess the impacts of the UKTIN on the innovation ecosystem. The network has been funded since September 2022 and was launched in April 2023. It has to date delivered a range of activities designed to support networking and collaboration, support policy development and disseminate information (including about ONP competitions).

The UKTIN has supported networking and knowledge sharing within the Open RAN innovation ecosystem. The network has supported the ONE competition, by hosting and sharing information and competition material, supporting network building, and giving a platform for the selected ONE projects to share information about themselves.

Until ONP-backed R&D projects finish and generate publishable results, the dissemination impacts of the UKTIN will be somewhat constrained. A future evaluation of the ONP should consider if and how these activities have impacted on the functioning of the ecosystem.

Enabling international collaborative research and development

Most ONP projects have fostered international collaboration through the participation of at least one international organisation in project consortia. For FRANC, FONRC, NeutrORAN, UK-ROK R&D and ONE projects, most project consortia included at least one multinational organisation.

International collaboration supported through the ONP has supported project development, and additionality is assessed by the ICF evaluation team to be high. Interviewed representatives from the UK-ROK R&D Flexi-DAS project noted that the project had supported the investigation of technologies which could provide energy and cost savings within Open RAN networks.

Whilst international vendors have participated in ONP projects, this has not yet translated into follow-on investment at scale. There was consensus amongst the industry representatives interviewed that the ONP has generated international interest in the UK that was not present before, but that this interest has yet to be translated into new vendors entering the UK telecoms market.

It is too early to comprehensively evaluate whether the programme is having its intended effect on delivering the medium- to long-term impacts. Instead, we present the results of primary research with industry representatives and analysis of published material, to assess whether there is evidence that the ONP is on track to deliver its intended impacts, drawn from the Theory of Change model. Our findings include:

Programme-level impact summary findings

Increase MNOs’ interest in deploying Open RAN on their networks

The level of interest from the MNOs in deploying Open RAN on their main network varies by operator.

UK mobile operators continue to have concerns about deploying Open RAN technology. These include challenges with responsibility and contractual liability, technology maturity, legacy technologies, and total cost of ownership.

The ONP has supported the development and testing of Open RAN products, often with the involvement of MNOs.

Moving forward, operators continue to consider whether Open RAN technology solutions can meet their requirements as part of future procurement processes.

Deploy 5G small cells, private networks and neutral host solutions

In the opinion of the ICF evaluation team, the ONP has helped promote the adoption of Open RAN technology in small scale deployments, which may help build MNOs’ confidence in Open RAN. Deploying Open RAN on small cells, on neutral host sites and on private network solutions enables MNOs to gain confidence in Open RAN technology and interoperability at a smaller scale prior to more widespread deployment across their networks. Through NeutrORAN, FRANC and ONE, the ONP has supported these deployments.

Diversify the supply chain

Supply chain diversification is underway, albeit slowly, and it will be several years before progress can be assessed; this is to be expected since the ONP is still live.

Several of the mobile operators are seeking one or 2 major systems integrators to provide a solution with overall responsibility for performance.

In the assessment of the ICF evaluation team, the ONP is playing a role in expanding the RAN equipment supply chain, contributing to diversification. Projects funded through the ONP have involved many participating organisations, and when interviewed, smaller organisations indicated that they had found that being part of the ONP had helped them ‘open doors’ in MNOs and other larger organisations.

Enhance the UK’s reputation for Open RAN investment

It is presently too early to assess changes in the UK’s reputation as a location for Open RAN investment, but there are signs that the ONP and other activities are starting to generate results. Examples of such signs include increasing product development and interoperability work undertaken in the UK, positioning the UK as one of the leaders in Open RAN, and creating collaboration opportunities.

Economic evaluation findings

The long-term benefits of Open RAN investment are not expected to materialise until 2030, therefore it is too early to assess the value for money of this programme. At present, the key finding of the economic evaluation is that the ONP has led to TRL improvements which show early positive signs of cost effectiveness, but this will be a key area to focus on for future evaluations. The level of TRL progression experienced to date compares favourably with analogous DSIT interventions such as the 5GTT Programme.

1. Introduction

In April 2023, the Department for Science, Innovation and Technology (DSIT) commissioned the first of 3 (initial, interim, final) evaluations of the Open Networks Programme (ONP) (‘the programme’). The study was undertaken by a team of ICF Consulting (ICF), Aetha Consulting and an evaluation expert, George Barrett.

Following the machinery of government changes in February 2023, the responsibilities of the ONP were transferred from the Department for Culture, Media and Sport (DCMS) to DSIT, retaining the ONP’s delivery model, governance structure and teams required for programme implementation. For the purposes of this report, DSIT refers to both DSIT and DCMS as one entity unless otherwise specified. Contextual information regarding the ONP interventions and funded projects is found here.

1.1 Open RAN and technical terminology

The UK Open Radio Access Network (RAN) market is still in its infancy. Mobile Network Operators (MNOs) are taking different approaches to deployment, and it is recognised that Open RAN technologies are more readily deployable in some contexts (rural areas) than others (high density urban areas). Open RAN deployment is also closely tied to MNOs’ plans for network upgrades and replacement, as RAN equipment typically has a 5-7 year economic life and is unlikely to be replaced whilst still functional (except as part of the replacement of Huawei equipment by 2027). DSIT identified 2025 as a critical date within its plan for wider deployment of Open RAN since this marks the next major procurement milestone for MNOs, when they will be seeking to replace 4G equipment and continue their rollout of 5G. The main rationale for the ONP, therefore, is that without government support, Open RAN solutions will not be sufficiently designed, developed, and proven for them to play a substantial role within the next phase of the MNOs’ network deployment plans.

Open RAN versus Integrated RAN

Open RAN is being standardised by the O-RAN Alliance and primarily relates to providing an open interface between the radio unit (RU) and baseband unit. Open RAN builds on a centralised RAN (C-RAN) architecture where the function of the baseband units can be disaggregated into a Distributed Unit (DU) and Centralised Unit (CU), see Figure 1.1.

Figure 1.1 Integrated RAN versus Open RAN

Open RAN is also implemented in conjunction with a virtualised RAN (V-RAN), whereby the signal processing is undertaken using software running on centralised general-purpose processors (e.g., based on the x86 family). A Cloud-RAN further involves the processing taking place in the cloud, which could be on an operator’s own private cloud (the most common model) or using one of the public cloud providers.

In the Integrated RAN model, the Radio Unit and Base Band Unit (BBU) are typically co-located at the base station site and the interface between the 2 units is known as the Common Public Radio Interface – this is usually proprietary to each equipment vendor and so a mix and match approach of equipment between different suppliers is not possible. By contrast, Open RAN aims to make use of non-proprietary specifications so network operators can ‘mix and match’ components from different suppliers whilst smaller equipment providers can specialise in different parts of the overall solution.

Technical terminology

The technical terms used in this report are defined as follows:

Coherent Passive Optical Networks (CPON)

An innovative technology for interconnecting servers in data centres using optical communications links.

Distributed Unit (DU) software stack

A collection of different software programmes which combine to perform the overall function of the DU.

Integrated RAN

Used to describe a RAN that uses software and hardware provided by one equipment vendor and uses proprietary interfaces.

Open RAN

Used to describe a RAN that uses non-proprietary interfaces which allows network operators to ‘mix and match’ components from different hardware and software vendors.

Joint Operators Technical Specification (JOTS)

Technical specifications defined by multiple mobile operators coming together to ensure common standards for RAN hardware and software that are shared between the mobile operators on mobile network sites.

Small cells

A mobile network cell site which uses a compact low power radio unit that has lower coverage than a usual radio unit but can be deployed to target specific areas to fill coverage holes in the mobile network or provide additional capacity.

RAN Intelligent Controller (RIC)

A software platform of the RAN which is responsible for the management of all the other components of the RAN. The RIC uses different applications to maximise the performance of the RAN.

Technology Readiness Level (TRL)

A score between 1 and 9 given to a piece of technology to define its maturity, ranging from 1 (an initial idea) to 9 (a technology used commercially at large scale).

1.2 The Open Networks Programme

The ONP is a ~£300 million programme[footnote 5] launched by DCMS (now DSIT) to support the development and deployment of open interface architectures, principally Open RAN. The programme has 3 objectives, paraphrased as follows:[footnote 6]

1. To accelerate open-interface products and solutions, by stimulating R&D tackling barriers to the development of Open RAN solutions.

2. To incentivise and de-risk the accelerated deployment of Open RAN in the UK by funding the testing and demonstration of open network solutions in a range of environments and by catalysing new commercial relationships between suppliers.

3. To develop an internationally recognised UK telecoms ecosystem which attracts investment and supports the development, testing and deployment of market-leading Open RAN solutions.

The ONP consists of a suite of sub-programmes (hereinafter ‘interventions’) which have funded a diverse set of R&D-related projects. Figure 1.2 provides an overview of the architecture of the ONP. A Theory of Change (ToC) model for the ONP is set out in Figure 3.1 in Section 3. This model articulates how the interventions are expected to deliver the programme objectives. The first funded ONP activities began in summer 2021 and the programme is currently scheduled to finish in March 2025. There are aspirations that UKTL and SONIC will be enduring facilities and continue beyond March 2025.

The ONP has been delivered in 2 tranches. The first tranche consisted of interventions launched before April 2023, and the second tranche was made up of interventions launched after April 2023. This division reflects the ONP’s status as an iterative ‘living programme’ that can adapt to changing market and technological conditions. The programme captured and acted upon learning from interventions in the first tranche to improve its effectiveness and efficiency in the second tranche.

1.3 Evaluation methodology

This evaluation was a combined process, impact, and economic evaluation. It commenced with a short scoping phase, during which the ICF evaluation team reviewed evidence about ONP design and delivery, interviewed officials from DSIT and refined the proposed evaluation methodology. The output of this scoping phase was an evaluation delivery plan (unpublished). Key elements of the methodology are discussed below, including an assessment of the strengths and weaknesses of the approach and the resultant robustness of the evidence base upon which evaluation conclusions are based.

It should be noted that this is an initial evaluation of the ONP, and that further evaluations are planned in the future. One of the main conclusions of the scoping phase was that, whilst this was an optimal moment for the process evaluation, it was too early for the impact and economic evaluations to draw any definitive conclusions about the programme. The methodology that was followed reflects this, particularly in relation to the economic evaluation where it was only feasible to undertake some preliminary assessments of the value for money of the ONP.

Figure 1.2 Overview of the ONP, including its component interventions

1.3.2 Data collection

Several different sources of evidence were assessed by the ICF evaluation team. Table 1.1 summarises the primary research activities that were undertaken, and Table 1.2 provides more detail on the research methods used. Evidence gathering took place between September and December 2023, when most of the projects and interventions were live but had not yet completed. For reasons of proportionality, given the scale and complexity of the programme, a quota sample was used for the interviews and some subject matter experts, e.g. the Subsidy Control Team, were excluded.

Table 1.1 Summary of the primary research activities completed

Primary research activity | Volume of research completed
Interviews with project applicants (leads and partners) | 43 interviews
  FRANC | 16 interviews (including 3 unfunded)
  SONIC | 5 interviews
  UKTL | 2 interviews
  NeutrORAN | 2 interviews
  FONRC | 3 interviews
  UK-ROK R&D | 1 interview
  UK-Japan R&D | 0 interviews
  ONE | 11 interviews (including 3 unfunded)
  UKTIN | 3 interviews
  Systems Integration Skills | 0 interviews
  Standards and Standard Essential Patents (SEP) | 0 interviews
Interviews with industry representatives | 15 interviews
  Mobile Network Operators (MNOs) | 3 interviews
  Industry representatives | 8 interviews
  Standardisation bodies | 1 interview
  Prominent relevant academics | 2 interviews
  Consultancies working in the field | 1 interview
Interviews with DSIT officials/advisors | 36 interviews (including 7 scoping interviews)
  Technical Design Advisors (TDAs) | 5 interviews
  ONP intervention leads, portfolio and project managers (at least one from each intervention except for the Standards and SEP intervention) | 15 interviews
  Other DSIT officials (e.g., business case or intervention development, finance, business operations, programme outcomes) | 16 interviews
Unfunded FRANC questionnaire (n=14) | 2 responses (14% response rate)
Online KPI industry survey of vendors of Open RAN and system integrators (n=20) | 10 responses (50% response rate)
UKTIN stakeholder survey (n=~3,400) | 88 responses (~3% response rate)
Review of secondary evidence, including desk-top research to inform the impact evaluation and to update the KPI baseline | 100+ key documents reviewed

Table 1.2 Commentary on the evidence sources that inputted into the evaluation

Review of documentary evidence

Review of documentary material, including programme information such as the business cases, DSIT programme reports, project management and monitoring data (e.g., benefits realisation templates), competition guidance, project applications and Grant Funding Agreements (GFAs). Open-source articles relevant to the current Open RAN ecosystem were reviewed.

Interviews

All interviews were conducted via Microsoft Teams using semi-structured topic guides to provide qualitative evidence through thematic analysis about the delivery and results of the ONP projects, and to gather insights on the possible impacts of the ONP. During interviews, responses were challenged and tested rather than being taken at face value.

Project applicants: Over 70 ONP lead and partner applicants were contacted (including both successful and unsuccessful applicants to competition-type interventions), of which 43 were interviewed. The distribution of interviews for each intervention was proportionate to the number of organisations involved. In some cases, we interviewed the only project under an intervention (e.g., UK-ROK R&D, NeutrORAN, SONIC, UKTL, and UKTIN). In other cases, we took a sample of projects and of leads/partners. The project selection criteria were based on current project RAG rating (which includes an assessment of overall performance, budget, timetable and benefits realisation), project focus/technology area(s), project delivery team (including size and participant profiles), and DSIT bid score.

Industry representatives: We aimed to sample 10 representatives and interviewed 15 in total. The interviews focused on i) perceptions of the reach, processes, design, and effectiveness of the programme within the ecosystem; ii) the changes which can be observed compared with the baseline and the extent to which these have been driven by the programme and other potential explanatory factors; and iii) possible impacts of the ONP and other Government interventions on the organisations involved and organisations’ perceptions of the current and likely future impacts on the ecosystem more generally. All 4 MNOs were contacted, and 3 were interviewed.

DSIT interviews: We conducted 36 interviews with DSIT officials (aiming for 25 interviews), including with intervention leads, project officers, Technical Design Advisors (TDAs) and team members responsible for various elements of programme and project delivery. These provided insights into the thought processes behind the design and implementation of each intervention.

Unfunded FRANC questionnaire

To understand what happened in the absence of DSIT funding, a questionnaire containing 5 open-ended questions was sent to all 14 unfunded FRANC consortium leads via email. 2 responses were received.

KPI survey

Replicating the methodological approach of the Baseline Study[footnote 7], an online survey was sent to 20 organisations consisting of Open RAN component vendors and system integrators, to update the Key Performance Indicator (KPI) baseline. 10 responses were received.

UKTIN survey

A survey carried out by the UKTIN to evaluate the early impacts of UKTIN was sent to all those engaged with its services (understood to have been sent to ~3,400 contacts). 88 responses were received.

1.3.3 Data analysis and synthesis

There were 3 elements to the data analysis and synthesis:

  1. an assessment of the programme processes
  2. an assessment of the projects funded by the programme to inform a ‘bottom-up’ impact evaluation and an assessment of the ‘macro’ trends in the Open RAN ecosystem to inform a ‘top-down’ impact evaluation, and
  3. a multi-faceted approach to the economic evaluation

Four-tier assessment system

To assess the processes of the programme and projects, the ICF evaluation team used a four-tier assessment system:

  • Strong performance, expectations for the programme met or exceeded.
  • Moderate performance, expectations for the programme partially met.
  • Weak performance, expectations for the programme barely met.
  • No performance, expectations for the programme not met at all.

These assessments reflect the opinions of the ICF evaluation team, drawing on evidence and data collected as described in the preceding sub-section.

To investigate the effectiveness of the programme processes and their impact on projects, as well as capture lessons learned, 9 case studies were produced. The case studies focused on interventions’ processes and project delivery, drawing on evidence collected from the interviews and analysis of project documentation. The case studies also include a discussion of the impact of interventions and projects on the wider telecommunications ecosystems.

Process evaluation. The review of documentation and management information, together with evidence collected from stakeholder interviews and analysed thematically, was used to inform an assessment of the effectiveness of each of the key phases of the programme: proposal development, mobilisation, and monitoring. The process evaluation also considers whether there are any learnings that might inform future delivery of ONP interventions or other DSIT/UK Government initiatives.

Impact evaluation. The impact evaluation deployed a theory-based evaluation approach, based on the ToC. The ToC (Figure 3.1) sets out how the different types of interventions are expected to lead to specific outputs, short- to medium-term outcomes (1-5 years), and medium- to longer-term impacts (5+ years). The impact evaluation combined 2 approaches, the project/intervention-level (‘bottom-up’) and programme-level (‘top-down’), which include an evaluation of funded projects and an assessment of changes in the Open RAN landscape. Targeted interviews with industry stakeholders, together with the small-scale industry and UKTIN surveys, were used to elicit evidence to assess key changes in the industry landscape based upon the ONP objectives and metrics and the role of the programme in generating recent changes. Further details on our theory-based approach are presented in Section 3.1.

Economic evaluation. The economic evaluation draws on evidence from the documentation review and open-source data sets to elicit data for the cost-effectiveness analysis of TRL progression. All findings were reviewed and accepted by the ICF evaluation team, including the technical experts.

1.3.4 Strengths and weaknesses of the methodology

This sub-section includes a discussion of the key strengths and weaknesses of the methodology. Every effort was made to identify methodological limitations in advance and where practical devise mitigation strategies.

1.3.4.1 Strengths in our approach to evaluation

Interviews held with a wide range of project participants, industry stakeholders and DSIT officials. Interviews were held with project participants across all the ONP interventions along with DSIT individuals involved across various delivery aspects of the programme. The breadth of coverage within our interview programme allowed us to gain depth of insight into areas where the programme succeeded along with areas for improvement moving forward. The careful design of our interview programme ensured we held conversations with consortia leads along with additional consortia members to capture perspectives across programme participants. Significant effort was also put into ensuring discussions were held with key additional stakeholder groups, such as leading MNOs, so that a more complete picture of the current UK Open RAN market could be formed. Similarly, with DSIT officials, we worked with the DSIT evaluation team to ensure that discussions were held with individuals across the seniority spectrum as well as ensuring that the various teams involved across the design and delivery of the programme were consulted. Conducting interviews across this breadth of respondents reduced the risk of bias, as thematic analysis was used and different perspectives were included.

Regular discussions held with DSIT team members to refine and adapt approach. Throughout the evaluation, we held regular discussions with the DSIT evaluation team across the various workstreams to ensure that there was visibility of the approaches being undertaken. Holding meetings on specific areas of the evaluation ensured that each stream received focused and regular reviews and that the most effective methods could be used to gauge impact.

Emphasis on theory-based methods to maximise impact findings. The timing of the evaluation along with the nature of the Open RAN market meant that impacts generated as a result of the programme were always likely to be longer term in nature. Due to this, more emphasis was placed on assessing the success of the process and early impacts workstream of the evaluation, with a more light-touch approach taken for the value for money assessment workstream. To reflect this, more emphasis was placed on qualitative theory-based approaches, such as carrying out thematic analysis of the interviews, to make the most of the information gathered as part of the evaluation. We also used monitoring information provided by DSIT, in the form of benefits realisation templates and application forms, to construct descriptive statistics of the composition and types of organisations within the ONP. These data were also synthesised with information gathered from interviews to provide early indications of the cost effectiveness of the programme and whether the grant funding provided was generating the anticipated improvements in technology maturity.

1.3.4.2 Limitations to data collection

Accuracy of recall, bias in responses. Recall from project participants on application and project set-up processes for earlier interventions such as FRANC was limited due to the time that had passed since these activities were carried out (over 2 years ago). This was also true for DSIT officials who were involved during the inception of the ONP and its earlier interventions. Retention and turnover of DSIT staff also presented challenges, as many officials had moved on, which resulted in poor recall of processes. As with all self-reporting, there may also have been an element of positive bias from both DSIT officials and project consortia with regards to the success of ONP processes and project performance, respectively. To address any knowledge gaps, evidence collected through interviews was considered alongside our document reviews, including process documentation and project reporting (e.g., benefits realisation templates).

Variance in compliance with DSIT expectations of project reporting. Inconsistent approaches to completing project reporting, particularly the project benefits realisation templates, were observed across the interventions. This included varying approaches to recording TRL data (e.g., in some instances a TRL was recorded for the project overall, and in other instances for specific products of the project). Where TRL data was incomplete, we used the interviews, the views of project TDAs and expert opinion (Aetha Consulting) to estimate the TRL where needed for the economic evaluation.

Lower-than-expected responses. Lower than expected response rates were observed across the unfunded FRANC questionnaire, Key Performance Indicator (KPI) industry survey and UKTIN survey. Additionally, we achieved lower than initially hoped for response rates for the ONP participant interviews, particularly among unsuccessful applicants (either because those involved in the application process had moved on from the applicant organisation or because contact details were no longer correct). Attempts were made by both the ICF evaluation team and DSIT to encourage participation, including contacting potential interviewees and survey respondents at least 3 times and accommodating interviewees’ schedules as much as possible.

Incomplete coverage of participating organisations. Across all 11 interventions, there are over 100 lead/partner organisations involved with delivery of funded projects, in addition to many other subcontractors and commercial partners. Rather than interviewing all participants, a quota sample was used and 43 organisations were selected based on the selection criteria described above. Efforts were made to avoid interviewing the same individuals where their organisation was involved in multiple projects. It is therefore possible that certain perspectives were missed, including the views of unfunded applicants from FONRC, UK-ROK R&D, and UKTIN. However, it is hoped that the documentation review and industry surveys provided an opportunity for all participants to express their views.

There was also limited evidence available regarding the Standards and SEP, Systems Integration Skills, and UK-Japan R&D interventions due to DSIT staff turnover, uncertainty regarding the future of these interventions and a shift in DSIT prioritisation to other initiatives.

1.4 Report structure

The remainder of this report is structured as follows:

  • section 2 presents the results of the process evaluation, focusing on the design, administration, and delivery of the ONP and its interventions
  • section 3 assesses the delivery of the projects funded by ONP, exploring whether the interventions are meeting the objectives of the programme. This also includes an assessment of trends in the Open RAN ecosystem since programme inception
  • section 4 presents the findings of the early-stage economic evaluation, recognising the limitations of what is possible to assess at this stage of the ONP
  • section 5 answers the process, impact and economic evaluation questions and provides considerations for future interventions
  • section 6 makes recommendations for future ONP evaluations

2. Process evaluation

This section assesses the effectiveness of ONP process delivery, including proposal development, mobilisation, and monitoring. These findings focus on areas where the evaluation has concluded that there is scope for improvements to the delivery of the ONP, rather than celebrating the successes of the programme.

The design, development, and delivery of the ONP interventions is based on learning from the implementation of the 5GTT Programme (5GTT)[footnote 8] and later interventions have learnt from earlier ones. Our overall assessment of the ONP processes is that organisational learning took place, which was reflected in ONP process design and delivery. There were improvements to competition guidance and the design and delivery processes of interventions.

Table 2.1 assesses the effectiveness of the ONP processes for selected interventions and provides summary commentary. Our overall assessment of the ONP processes considered the differing magnitude of these interventions. For example, our assessment of FRANC had greater influence on the overall process evaluation than that of NeutrORAN, which is a single project and for which, therefore, less evidence could be collected. This evaluation uses the assessment scoring system explained in section 1.3.3. The assessment is the ICF evaluation team’s own, based on the evidence collected and presented in the subsequent sections and case studies. Some columns have “*”, which means there was insufficient evidence/commentary available to provide a credible assessment.

Table 2.1 Assessment of ONP processes

Intervention: FRANC
  • Proposal Development: Moderate performance
  • Mobilisation: Weak performance
  • Monitoring: Moderate performance

Summary commentary

The scope of FRANC was purposefully broad. Its business case preceded the ONP business case and aimed to: help ‘disaggregate’ 5G supply chains and encourage open interfaces and security in network deployment; accelerate development of 5G Open RAN solutions; attract new 5G RAN suppliers to conduct R&D in the UK; and foster collaborations between potential new entrants into the UK’s public network. This resulted in some funded projects with indirect links to ONP objectives (e.g., not specifically Open RAN solutions), but still delivering department objectives (e.g., 5G supply chain diversification).

15 projects were selected in the competition. Due to delays in signing grant funding agreements (GFAs), the majority of participating organisations began their projects working at risk to ensure project delivery within the timescale set by DSIT (as reported by project participants) and, as a result, mobilisation costs were incurred by project consortia.

There was a perception that grant claims could only be made if there were quarterly GFA milestones and some milestones were “superfluous”. Later, this was a factor in the need for some project change requests.

Most projects were extended to 2 years to complete their baseline scope. Projects were near complete at the time of this evaluation, with DSIT having allocated grant for a small number of extensions into 2024/2025.

A total of 60 project change requests had been submitted across 14 FRANC projects at the time the evaluation took place. This includes delays, de-scoping, cost and commercial changes and project GFA extensions. The licensing of IP between partners was an area of contention in the collaboration agreements. DSIT staff turnover caused some disruption to working relationships between DSIT and projects.

Intervention: FONRC
  • Proposal Development: Strong performance
  • Mobilisation: Weak performance
  • Monitoring: No performance

Summary commentary

FONRC aimed to: enable UK universities to work with large RAN vendors and other telecoms organisations on R&D for openness and interoperability of future network architectures; increase UK influence in standards development organisations; strengthen the UK telecoms R&D ecosystem and telecoms capability; and aid the UK’s evolving future networks and 6G vision.

3 projects were selected and DSIT funding started in February/March 2023.

Timescales were challenging, and FONRC projects reported similar issues with pre-GFA requirements and processes and with collaboration agreements as those reported for FRANC.

Project participants thought that DSIT target timing to sign GFAs did not account for the time needed for internal sign-off amongst consortia, particularly for universities and larger consortia.

Intervention: UK-ROK R&D
  • Proposal Development: Strong performance
  • Mobilisation: No performance
  • Monitoring: Strong performance

Summary commentary

HMG had already established strong partnerships between UK and South Korean organisations, which ensured an effective launch of the intervention despite working with international partners.

Regular meetings and feedback from DSIT were well received by the project consortia which helped to improve quality of project deliverables.

1 UK project was selected (FLEXI DAS) and was near complete at the time of this evaluation.

Intervention: NeutrORAN
  • Proposal Development: Strong performance
  • Mobilisation: Moderate performance
  • Monitoring: Moderate performance

Summary commentary

This project was a direct award of a grant to the NEC Corporation. The project consortium reported stronger working relationships with the TDAs than with DSIT managers, due to the managers’ perceived lack of technical understanding and the turnover of DSIT staff.

The project ‘proved its concept’ but did not achieve some of its intended scope and benefits, e.g. with respect to live deployments.

The project found it challenging to obtain DSIT approval to move grant funding from one financial year to the next. The requirements and process for claiming DSIT funds created administrative overhead and were time and data intensive.

Intervention: SONIC
  • Proposal Development: Moderate performance
  • Mobilisation: Weak performance
  • Monitoring: Moderate performance

Summary commentary

SONIC had a specific business case for a direct award of grant to Digital Catapult for a new facility and services, building on the parties’ positive experience with a ‘pilot’ of the facility in 2021/22. Ofcom also participated.

Both parties were able to agree a focused set of objectives effectively and efficiently.

There were challenges setting up the project, for example agreeing ‘end to end’ project milestones and accompanying deliverables as part of the GFA.

There have been 9 change requests thus far. In 2023, a further business case resulted in a significant project extension (by 12 months) to complete the outdoor testbed and additional scope.

Intervention: UKTL
  • Proposal Development: Weak performance
  • Mobilisation: Moderate performance
  • Monitoring: No performance

Summary commentary

There were significant delays to the launch of the intervention, particularly during the construction phase when DSIT had project ownership. In autumn 2023, DSIT passed the contract management responsibilities to the National Physical Laboratory (NPL) to continue with project delivery; construction has not yet been completed.

The contract with NPL was signed, and the project lead never worked at risk; however, machinery of government changes from DCMS to DSIT caused some delays.

Intervention: UKTIN
  • Proposal Development: Moderate performance
  • Mobilisation: Strong performance
  • Monitoring: Strong performance

Summary commentary

The GFA was awarded competitively. Even though there was an incumbent of sorts (UK5G), DSIT received multiple bids and was satisfied with the choice available.

Interviews and the documentation review indicate that the parties handled the ‘transition’ from UK5G to UKTIN well.

The mobilisation to GFA was challenging but was completed within the time window set by DSIT.

UKTIN mobilisation from GFA signature to public launch took months due to the need to transition staff to the lead organisation and recruit new staff, set up a website, and launch multiple workstreams.

Insufficient evidence/commentary to provide a credible assessment.

2.1 Assessment of proposal development processes

Proposal development processes are those that were followed to define the scope, beneficiaries, and objectives of each ONP intervention. This also included the development of plans for the project initiation stage, such as the processes to select project bids and how DSIT communicated appraisal outcomes to bidders for competitive-type interventions. Our assessment of the effectiveness of proposal development applies to the FRANC, FONRC, UK-ROK R&D, ONE, NeutrORAN, SONIC, UKTL and UKTIN interventions.

Overall, the proposal development processes worked well across the interventions.

2.1.1 Intervention design

Earlier ONP interventions were more flexible in their requirements than later interventions, which were more specific in terms of their components / use case requirements. This is in part due to the timing of the development of the business cases, such that business cases for the earlier interventions (e.g., FRANC) were developed before the overall ONP business case. For example, the intervention parameters of FRANC preceded the ONP business case and were broad, which offered DSIT flexibility in selecting bids and awarding grant funding to competition applicants. Projects applying to the later ONE competition were expected to address barriers in one of 3 challenge areas. These corresponded to factors inhibiting the widescale adoption of open mobile networks, as identified by DSIT. Nearly all DSIT interviewees involved in the design of the ONE intervention believed that the strategic design of the ONE intervention ensured a clear alignment of the intervention parameters with ONP objectives.

Eighteen months was typically considered too short a duration for R&D projects by project participants. Nearly all interviewed ONP project participants, as well as most interviewed representatives from unfunded projects, shared this view. Unprompted, a few applicants claimed that the short timescale of projects discouraged others in the telecoms sector from applying to the ONP. The short timescales were commonly cited as the reason why potential project partners declined to join consortia when approached by these applicants. It was reported that R&D project development cycles are often 4-5 years, and most project applicants and participants suggested that a minimum of 2 years is required to complete these projects (in line with some of the extensions subsequently given to FRANC projects).

Project applicants felt they benefited from the DSIT briefing events, and overall found the competition guidance useful, but would have preferred greater clarity in certain aspects. Most interviewed applicants found the competition briefing events helpful for building their understanding as to what DSIT wanted and how to complete the application. Interviewees did not elaborate on what specifically from the guidance and briefing events eased the process for applying. A successful lead applicant from both the FRANC and ONE competitions noted that the guidance was clear for both interventions but found the briefing sessions to be more effective for the FRANC intervention. However, a few project applicants and DSIT officials attributed subsequent problems with the due diligence checks (which were completed as part of the GFA process) to a lack of clarity in the competition guidance, which led to substantial changes to consortia and/or project delays. Specifically, interviewees from projects and DSIT reported that greater clarity was needed regarding organisations participating in multiple projects under the same intervention, overseas working, and the percentage of funding that Research and Technology Organisations (RTOs) can receive.

Timeframes for the bidding process for R&D project competitions were considered too short by project applicants. The duration of the application window for ONP competitions ranged from 8 to 10 weeks, which is common across government R&D competitions and takes into account the time needed by DSIT to appraise bids and award grants within Spending Review timescales. Figure 2.2 shows that the 2-week difference in application windows across the competitions did not influence the number of applications received. However, the majority of those interviewed from larger organisations said that they struggled to get the internal sign-off required to submit an application within the 8-10 week timeframes set by DSIT. This then had knock-on impacts on timescales for signing GFAs and collaboration agreements if internal sign-off had not been completed during the bidding stage. Interviewed applicants believed that application windows should be open for longer than 10 weeks, which they thought would encourage greater levels of participation from larger organisations.

Figure 2.2 Application windows for ONP competition-type interventions

Application processes for R&D project competitions could benefit from refinement to enhance efficiency and user-friendliness, ensuring better accessibility for project applicants. For interventions that took place earlier within the ONP, nearly all DSIT officials and project applicants reported issues with document processing and storage in the absence of an online portal, resulting in an uncoordinated approach to application submission. DSIT has since improved how it receives applications. For the ONE competition, the cross-government “Find a Grant” online portal was used. Even so, applicants cited Innovate UK’s application portal (Innovation Funding Services) as easier to find online, more user friendly when there are multiple contributors, less onerous in its documentation requirements, and more familiar to some of the organisations applying for government R&D grants.

The ONP exhibited flexibility by allowing direct awards when justified, by exception, thus streamlining processes by excluding a ‘bidding’ stage. Rather than expending resources on competitions, project partners and DSIT officials for the NeutrORAN, SONIC and UKTL interventions could focus efforts on the pre-GFA processes from the outset of intervention design. For example, SONIC came about, to a large extent, because of the pre-existing relationship between the consortium lead and what was formerly DCMS. This meant that both parties could influence project design and effectively agree a focused set of objectives for the intervention. However, this did not result in GFAs being signed more quickly than for projects selected through competitions.

2.1.2 Targeting and engagement

Workshops and briefing events held during the early stages of intervention design were considered effective and were thought by DSIT officials to be well attended. Officials responsible for intervention design felt that these engagement activities were instrumental in designing interventions that resonated with industry and in testing ideas to attract high-quality bids. For example, a workshop held to discuss the design of FONRC was attended by all the incumbent vendors (Nokia, Ericsson and Samsung), academics and small and medium-sized enterprises (SMEs). DSIT took feedback on board and broadened the scope of the intervention as a result. This included using an Expression of Interest (EOI) phase, and nearly all those involved in the EOI phase (around 6 organisations in total) ended up as part of a successful project. DSIT also noted that, whilst there was a lot of interest at the UK-ROK R&D briefing event, which more than 80 people attended, only 2 applications were ultimately received. This was thought to be because the intervention was run in parallel with the FONRC competition, which had a much broader scope and therefore received more bids.

Most applicants interviewed across all competition-type interventions became aware of the competition through existing networks and partnerships outside of their engagement with DSIT about the ONP. We therefore conclude that DSIT promotion of upcoming interventions and match-making events did not have a marked impact on competition awareness and consortium building. None of those interviewed reported forming their consortium through the match-making events. Of the FRANC participants interviewed, a few were informed about the forthcoming intervention through their existing connections at DCMS (but outside of the ONP) and were therefore able to prepare ahead of the competition launch. Most of the other FRANC participants were made aware of the competition through existing commercial relationships. Most participants from later interventions reported that they became aware of funding opportunities and built consortia based on relationships that were developed during previous collaborations for DCMS funding (FRANC, UK-ROK R&D and 5GTT). FONRC and UKTIN interviewees also reported forming their consortia through existing networks and partnerships. In single reported instances, awareness of the ONP funding was also attributed to the Connected North Conference (hosted by techUK), DSIT social media, and the ‘Find a Grant’ search tool on GOV.UK. UKTIN was also cited once as helping to find partners to form a consortium through its supplier guidance.

There were similar levels of applicant engagement for both early and later competition-type interventions. Whilst the number of organisations receiving grant funding from the ONP has increased, the number of those applying has remained broadly similar (further analysis on the types of participating ONP organisations is detailed in Section 3). There were 129 organisations involved in FRANC applications, of which 67 (52%) were successful and 62 (48%) were unsuccessful. This compared to 142 organisations involved in ONE applications, of which 116 (82%) were successful and 26 (18%) were unsuccessful. A review of participating organisations across the ONP interventions has shown positive signs of increased MNO involvement in later interventions, indicating growing commitment and collaboration within the Open RAN ecosystem (e.g., Three’s involvement in the SCONDA ONE project). However, most project participants noted that MNO engagement remains a challenge. One MNO interviewee suggested that a central repository of potential projects and consortium members, available ahead of the bidding stage, would allow them to view all the opportunities and decide whether to join consortia, potentially encouraging greater participation.
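
As an arithmetic check of the application figures cited above, the short sketch below reproduces the success-rate percentages from the organisation counts in this paragraph; the counts are taken from this report, and the calculation itself is straightforward.

  # Reproduces the success-rate percentages from the organisation counts
  # cited above (counts taken from this report).
  competitions = {
      "FRANC": {"successful": 67, "unsuccessful": 62},
      "ONE": {"successful": 116, "unsuccessful": 26},
  }
  for name, counts in competitions.items():
      total = counts["successful"] + counts["unsuccessful"]
      share = counts["successful"] / total
      print(f"{name}: {total} organisations involved, {share:.0%} successful")
  # FRANC: 129 organisations involved, 52% successful
  # ONE: 142 organisations involved, 82% successful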

2.1.3 Project selection for interventions delivered using competitions

Interviews with applicants generally did not add much value to the selection process. There was a consensus amongst interviewed project applicants and DSIT officials that interviews with triaged applicants did not offer additional information beyond what was included in the written bid, and it was a costly process to have senior DSIT officials on interview panels. DSIT officials reported that interviews did not change scoring and therefore did not change the project selection outcomes. DSIT dropped the interview requirement for later interventions (e.g., the ONE competition).

The benefits section of the application was considered by most project applicants to be the most challenging aspect to complete. Of the application assessors interviewed, DSIT officials and TDAs confirmed that across all interventions, the benefits section typically had the lowest quality answers, and applicants did not fully understand the requirements for a successful bid.

Feedback on the reason(s) for bid outcomes was not always felt by project applicants to be well communicated. Several successful applicants from both the FONRC and ONE competitions reported that feedback from DSIT was limited. A few consortium leads across the competition-type interventions reported that they were informed by DSIT that they had been successful in securing grant funding, but that no further information was provided. One applicant, who participated in both the FRANC and ONE competitions, appreciated the detailed feedback provided for their FRANC bid and felt that this bid had been thoroughly assessed by a technically proficient assessor who understood the project intricacies. In contrast, they felt they received more general feedback on their ONE project application. A few of the unsuccessful applicants claimed that they did not receive qualitative feedback from DSIT regarding their applications.

2.2 Assessment of mobilisation processes

Mobilisation requirements and processes consist of activities undertaken up to and including the signature of the GFAs that were signed with all projects funded through the ONP. This includes the pre-GFA due diligence checks completed by DSIT, such as checks to ensure that there are no fraud or finance errors with the grant funding claims. This process starts once successful competition applicants are notified. Grant recipients are also asked to confirm their compliance with relevant regulations with respect to government subsidy and use of radio spectrum. The mobilisation phase also includes the process of finalising GFAs and collaboration agreements between project consortium members. The target for signing the GFA is 4-10 weeks following notification of award, and this involves the project consortium members, DSIT portfolio manager, DSIT project manager and DSIT technical leads. Based on the available evidence, our assessment of the effectiveness of the mobilisation requirements and processes applies to the FRANC, ONE, NeutrORAN, SONIC, UKTL and UKTIN interventions.

Overall, our assessment is that the mobilisation requirements and processes worked somewhat well across the interventions, but with scope for improvement (particularly needed for the GFA signing processes).

2.2.1 Pre-GFA due diligence

Overall, the due diligence checks for the ONP were thorough and necessary. This is evidenced by DSIT’s comprehensive review of vendor qualifications and financial stability, ensuring the selection of reliable partners for project execution. DSIT officials did not identify any misuse of public funds by ONP grant recipients to date, suggesting the checks were credible. However, there were some examples of successful applicants failing the DSIT due diligence checks (e.g., DSIT reported asking for financial guarantees where successful ONE applicant organisations had failed due diligence checks).

Due diligence check processes were considered resource intensive and, in several cases, were perceived to have contributed to delays in subsequent GFA signature. Whilst nearly all interviewees recognised the importance of these checks to ensure appropriate use of taxpayer funds, most felt the processes could be streamlined. Due diligence checks on participating organisations start after projects are notified that they have been successful, with a target completion date of 4-10 weeks from notification. Some DSIT officials involved in the due diligence check processes felt that some of these checks could start at bid stage to alleviate resourcing pressures later in the process. However, others noted that this could disincentivise consortia from applying and risk allocating unnecessary resource to unsuccessful projects. A few DSIT officials suggested that some of the less intensive checks could be completed earlier to help expedite the processes; these include a reputation check based on a general Google search and a Companies House check. Due diligence checks were perceived to have resulted in a 3-4 week delay in signing the GFAs for several projects.

Project and DSIT resourcing constraints also contributed to delays. Projects were working ‘at risk’ before their GFAs and collaboration agreements were signed, and sometimes did not meet target timescales for pre-GFA requirements. Some DSIT officials involved in specific checks said they did not always have the time to go through some review elements, for example at times when there was a ‘peak of work’. This meant that some DSIT queries and due diligence concerns were not flagged early enough, resulting in delays.

Acquiring financial statements and commercial information was considered a challenge in completing some DSIT checks. Interviewees from smaller organisations, especially start-ups, reported that balance sheets were not always available and that, because regular financial reporting is not a requirement for their organisation, they struggled to produce them.

DSIT finance teams also noted challenges in accepting organisational cash flow profiles, as in some instances these were delayed, did not align with commercial information, or straddled 2 financial years and did not transfer well to the DSIT budgets or DSIT-generated templates.

Despite DSIT improvements to streamline due diligence requirements and checks, project interviewees that had participated in both earlier interventions and ONE thought that pre-GFA requirements had not improved. DSIT finance teams acknowledged that resolution of some pre-GFA business for ONE was slower for some projects than it was for earlier interventions (e.g., FRANC, FONRC). Potential reasons included: novel issues arising, for example with respect to community interest companies, dependencies on companies in several consortia or overseas suppliers; project and DSIT resourcing; and delayed inputs from some large companies in consortia.

2.2.2 Grant Funding Agreements (GFAs)

The published DSIT target timescale of 4-10 weeks from notification of award to GFA signature was often not met, given the current requirements and processes to complete due diligence and sign the GFA. Nearly all the projects across the FRANC, FONRC and ONE interventions did not sign their GFA within the expected timescale, with many requiring 3 months or more (e.g., at least 9 FRANC projects took a minimum of 2 months longer than the target 4-6 weeks to sign GFAs; FONRC projects took about 14 weeks to sign GFAs rather than the target 4-6 weeks; and most ONE projects had not signed GFAs at the time this study was conducted but had already surpassed the target timeframe of 8-10 weeks following notification). The timing to sign a GFA was close to target for SONIC and on target for UKTIN.
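
As a simple illustration of how elapsed time to GFA signature can be compared against the published target window, the sketch below computes elapsed weeks from notification of award to GFA signature. The dates used are hypothetical; only the 4-10 week target window is taken from this report.

  # Illustrative check of elapsed time from notification of award to GFA
  # signature against the published 4-10 week target. Dates are hypothetical.
  from datetime import date

  TARGET_WEEKS_MAX = 10

  def weeks_elapsed(notified: date, signed: date) -> float:
      return (signed - notified).days / 7

  notified = date(2021, 11, 1)   # hypothetical notification of award
  signed = date(2022, 2, 14)     # hypothetical GFA signature
  elapsed = weeks_elapsed(notified, signed)
  status = "within target" if elapsed <= TARGET_WEEKS_MAX else "exceeded target"
  print(f"{elapsed:.1f} weeks elapsed ({status})")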

As part of their internal processes and governance, larger organisations reported that they required sufficient time to complete legal checks and seek approval from senior management, whilst smaller organisations (e.g., start-ups and SMEs) were often inexperienced in providing the requirements for signing GFAs and needed additional guidance from DSIT to work through its requirements. The ICF evaluation team has not conducted additional secondary analysis to determine which consortia reached GFA and collaboration agreements earlier.

The majority of project participants requested additional support from DSIT to help expedite the pre-GFA requirements processes, and suggested that in the future DSIT should deliver additional DSIT-led workshops explaining in detail how to complete the GFA requirements and provide example GFAs to follow. In contrast, DSIT officials reported that there were workshops for FRANC, FONRC and ONE which focused on: senior managers’ role in consortia; the overall pre-GFA requirements; finance/commercial requirements; and technical milestones. The lack of mention of these workshops by the project participants interviewed suggests that there could be an opportunity for DSIT to improve how it advertises workshops and perhaps place more emphasis on their link to GFAs; though it could also be the case that participants in the workshops were not represented in the interviews.

The information requested by DSIT pre-GFA was widely considered to be overly detailed and burdensome. The majority of project participants and some DSIT officials considered that DSIT currently lacks flexibility in capturing what is needed for projects of varying cost and complexity; and that the DSIT guidance and paperwork can be over-complicated. Many project participants and a few DSIT official interviewees made comparisons to the pre-GFA requirements and processes of Innovate UK and the Department for Energy Security and Net Zero (DESNZ) which were cited as being smoother, easier, and required less onerous paperwork. However, we cannot confirm whether the projects are comparable in terms of scope (e.g. collaborative R&D) or complexity (e.g. number of partners, amount of funding, etc.).

There was consensus among all project participant interviewees, and a few DSIT officials, that government departments should take similar approaches to pre-GFA requirements and processes, with a clear preference from participating organisations for DSIT to use other government sponsors’ requirements and processes rather than apply its own.

Changes to information management were perceived to be needed to address shortcomings in documentation review systems across DSIT teams, and were expected to be implemented with the introduction of new cloud-based programme management software. During the pre-GFA processes, most project participants reported issues with version control and being asked to submit duplicative documents to different teams in DSIT, suggesting a lack of coordination within DSIT in receiving and sharing documents internally. For example, a ONE project partner reported that despite sending an updated commercial form to DSIT multiple times, some DSIT teams continued to work off a previous version, which resulted in delays. DSIT officials advised that different teams were responsible for different stages of the process (e.g. competition, pre-GFA) and activities during due diligence, and that there were some challenges with version control and coordinating information. DSIT was seeking to rectify these issues through a cloud-based project management platform and improvements in internal record-keeping.

Project participants signing GFAs after January 2023 encountered challenges in ensuring subsidy control compliance, following changes in legislation. The changes are set out in law, and guidance is provided by the Department for Business and Trade. This primarily impacted ONE intervention grant recipients. The interventions were made under the new Streamlined Research, Development, and Innovation Route. While most did not comment on the specific challenges this caused, a few university project participants reported choosing to engage with external solicitors for help with answering the subsidy control questions and to satisfy themselves of their compliance with the new rules.

Inputting project milestones in the GFA was sometimes a source of contention, specifically agreement of Annex 5 of the GFA. Some project participants reported inputting milestones that were not necessary for their project management purposes to get the GFA signed and to claim grant each quarter; and that project change requests were inevitable in the context of the projects’ R&D scope and time constraints.

A GFA and collaboration agreements are a prerequisite to DSIT grant funding, therefore, there is pressure on both project participants and DSIT officials to complete mobilisation and sign commercial agreements (GFA and collaboration agreement) quickly.

For the ONE competition, 18 change requests were submitted by the 20 ONE participants (see Figure 2.3) in the first couple of months after the projects went live. During the interviews (which were conducted ahead of GFA signature), ONE project participants indicated that they would likely need to submit change requests after completing the GFA requirements due to the pressures of signing the GFA in time to make their first grant claim. However, DSIT officials advised that these change requests were due to project set-up delays, cost changes, and consortium changes.

One particular source of contention around inputting milestones in the GFA, reported by nearly all project participants, was the difficulty in fitting deliverables into 18 months (rather than the 2 years that projects viewed as necessary). Some included a long list of milestones in the GFA, aligned with quarterly grant claims, as they perceived this was necessary in order to receive payment from DSIT. For example, SONIC had nearly 50 milestones with around 100 deliverables over 26 months. DSIT intervention leads and project officers considered that flexibility was necessary, and in response DSIT sought in 2023 to clarify guidance for ONE to emphasise that GFAs should capture only necessary project milestones, whilst still ensuring that projects provided adequate information on status and progress for DSIT monitoring purposes. ONE project participants still felt that they were having to input ‘superfluous’ milestones in order to complete the GFA requirements.

Participant organisations often worked at risk in the absence of a signed GFA. The majority of project participants reported working at risk to ensure project delivery within the timelines set by DSIT, while completing the pre-GFA requirements. In some instances, the delay in signing the GFA meant that partners were not able to start, due to the way their organisations are structured and governed. For example, nearly all of the participants working with universities noted that universities will not hire anyone for the project until the GFA is signed, thereby creating delays in project resourcing and delivery. Working at risk and delays in signing the GFA were also flagged by project participants as a barrier for start-ups to participate in the programme, because start-ups can be nervous about working at risk given the cashflow challenges it can bring. Our analysis of private sector involvement in the ONP interventions by organisation type (see Section 3 for detailed analysis) supports this claim: the percentage of start-ups (considered to be ‘micro firms’ with fewer than 10 employees) involved in ONP interventions is much lower (ranging from 0-16%) than that of SMEs and large organisations. In general, the majority of project participants felt that while working at risk can be expected for R&D projects, DSIT could do more to alleviate financial pressures on organisations. This includes offering greater flexibility once the GFA is signed, such that payment for work can be claimed immediately, or amending the start and end dates of the project to take account of delays in signing the GFA. DSIT has taken this into account, and projects can now claim from the effective GFA date rather than the date it was signed, to cover working at risk.

From interview responses, we have concluded that DSIT’s intended improvements to the pre-GFA requirements and processes have had little impact on the time required to sign GFAs for recent interventions. Following an internal review of the requirements in spring 2023 and learnings from previous interventions, DSIT adjusted certain requirements to make improvements for the ONE competition. However, of the project consortia members interviewed who were involved in both the FRANC and ONE interventions, none noticed any difference, and they were still experiencing similar or longer delays in signing the ONE GFAs compared to the FRANC GFAs. However, there were positive signs for further improvement with the pre-GFA processes: DSIT officials reported that the machinery of government change had led to an increase in risk appetite compared to that observed under DCMS. This could lead to a further reduction in the amount of detail required from project leads and their partners.

2.2.3 Collaboration agreements

Similar to other R&D programmes, there was a lack of understanding amongst several project consortium partners about the Intellectual Property (IP) requirements associated with collaboration agreements, which led to project delays and substantial changes to consortia. This is evidenced through interviews with both project participants and DSIT officials, and highlights the importance of understanding the licensing of IP between the project consortia from the outset. A few project leads also raised concerns about the IP agreements and statements for subcontractors in projects, including voting rights within governance structures. At the time that interviews with ONE project participants were undertaken, collaboration agreements had not been signed, and several interviewees from larger organisations reported that their organisations were struggling to sign these documents due to concerns expressed by their internal legal teams, particularly around IP rights. No comparator information was available to understand project participant experiences with collaboration agreements in similar R&D programmes.

2.3 Assessment of monitoring processes

Monitoring processes include the project management, benefits realisation monitoring, and project assurance processes. This sub-section considers DSIT requirements and processes for project management, for example change requests, processing grant funding claims, and capturing benefits realised (including recording TRL progression). Based on the available evidence, our assessment of the effectiveness of monitoring processes applies to the more mature interventions where monitoring processes could be assessed. This includes FRANC, FONRC, UK-ROK R&D, NeutrORAN, SONIC and UKTIN interventions.

Overall, the monitoring processes worked somewhat well across the interventions.

2.3.1 Project management

Regular meetings between project consortia and DSIT officials and TDAs were considered helpful, and most project consortia reported good working relationships with DSIT. Many were impressed by how involved DSIT were in their project and how supportive they were when dealing with unforeseen challenges, such as changes to consortia (e.g., where a partner dropped out of the project consortium, where the principal investigator of one project switched employers). Meetings provided an opportunity for projects to receive regular feedback from DSIT and the TDAs, which most project interviewees noted have helped improve the quality of their project deliverables and understanding of DSIT’s requirements and expectations.

The frequency of DSIT personnel changes was considered disruptive to project working relations with DSIT. Most interviewed project consortia expressed frustration about the turnover of DSIT personnel assigned to their projects. They considered that this impacted on the continuity of working relationships.

Several project consortia did not feel that information was consistently or effectively communicated within DSIT to incoming personnel, and felt that project participants’ time and resource was needed to brief incoming DSIT staff. One project lead partner reported having worked with 3 DSIT project managers, 3 DSIT portfolio managers and 2 DSIT finance leads during their project (the project had been running for 18 months at the time of interview).

Nearly all the DSIT interviewees considered that both project and DSIT personnel changes had sometimes been challenging. Potential reasons for DSIT personnel changes included: DSIT’s resourcing model, for example the use of fixed-term contracts and contingent labour; uncertainties associated with the transition from the 5GTT programme to the ONP; individuals seeking new career opportunities elsewhere at the ‘natural end’ of the 5GTT programme; and the machinery of government changes.

DSIT decision-making processes and a lack of project information from project consortia caused some delays to project delivery. Several project consortia members claimed that, due to DSIT governance structures, DSIT officials were unable to provide quick decisions and required further internal approvals (such as on grant claim payments, financial or commercial matters, project change requests, and project GFA extensions), which could result in project delays while awaiting key decisions. Project interviewees also believed that DSIT teams often seemed misaligned and inconsistent in their messaging, particularly on finances and subsidy controls, or made duplicative requests for project information, ultimately delaying decision-making. It was suggested in a few interviews that there was a misalignment between DSIT officials and a few FRANC project participants regarding the maximum potential length of extension and the value of additional grant awarded to deliver the projects. DSIT interviewees acknowledged the delays but considered the decision-making model suitable given DSIT’s level of risk appetite. DSIT interviewees also believed that there were some issues with the quality and timeliness of project information, irrespective of DSIT decision-making timetables.

2.3.2 Change requests

The process of project change requests can take a long time. Project interviewees from FRANC and FONRC projects reported the process of DSIT agreeing a change request could take weeks, which they believed was too long. Delays were reportedly caused by factors such as the timeliness and quality of project information provided by project participants, and the process of approvals from DSIT. Lengthy timescales risked knock-on effects on project delivery, including projects proceeding without DSIT approvals to avoid delays to the delivery timetable. DSIT interviewees acknowledged that change requests took too long to approve and reported that action to streamline the relevant requirements and due diligence processes was underway.

Information required for change requests was considered by projects to be onerous and sometimes required resubmission of documents previously shared; however, DSIT reported attempting to improve this. Several project participants reported being asked to submit revised project delivery plans, commercial forms, and cash flow profiles despite, in their view, these not always requiring an update associated with the change request. DSIT project managers and TDAs challenged this, but acknowledged that change request ‘thresholds’ triggered too many change requests. The ‘thresholds’ were updated in November 2023. As of March 2024, around 100 change requests had been submitted to DSIT from the 42 projects funded through the ONP (see Figure 2.3), though it should be noted that most projects were still ongoing (and ONE projects had only started in autumn 2023) and that changes are common for R&D projects.

Furthermore, project interviewees suggested that DSIT could consider giving projects more flexibility in how they spend their budgets, within agreed financial year profiling, to further reduce the need for change requests.

In several instances, project consortia interviewees compared DSIT change request thresholds and requirements with those of Innovate UK, which were cited as having fewer changes requiring sponsor approval, and smoother and more familiar processes, which it was suggested DSIT could adapt. These project participants also claimed that the required information for change requests at Innovate UK only required a simple half page explanation of what was needed, how long it would take, and any budget impacts, whereas the DSIT process required more detail and supporting documentation.

Figure 2.3 Total number of change requests across ONP interventions (March 2024)

Total number of change requests
FRANC 60
ONE 18
SONIC 9
UKTIN 4
NeutrORAN 4
FONRC 3
UK-ROK R&D 2

Project change requests were sometimes needed immediately after signing the GFA. For example, 18 change requests were required across all ONE projects in the first 6 months following the funding start date in the GFA. Project participants reported these requests were due to changes to GFA milestones. In contrast, DSIT officials reported these changes were because of cost/commercial reasons, consortium changes and time delays.

DSIT has made changes with the aim of aiding projects and improving how it processes change requests. At the end of November 2023, DSIT introduced new ‘thresholds’ and guidance for change requests, such that many changes could simply be notified to DSIT. In July 2023, DSIT also began using cloud-based project management software to consider change requests. DSIT hopes this will help expedite the approval process, as the platform has functionality to keep all information received from projects in one place, allowing for a more traceable record of both change requests and notifications. DSIT has now also introduced more informal change notifications for a range of changes that previously required a change request, which should streamline the process.

2.3.3 Grant funding claims

Delays in projects submitting grant claims and supporting information, and in DSIT paying projects, are an issue. Several project consortia members reported that they experienced delays in receiving payment following a grant claim, though this evaluation has not conducted additional analysis of grant claim submission and payment receipt timelines. DSIT officials reported issues with the timeliness and quality of projects’ grant claims and supporting finance and delivery information, including the provision of evidence of eligible expenditure and updated cost forecasts. They noted that payments could also be taking longer due to internal resourcing constraints and organisational changes (the machinery of government changes from DCMS to DSIT included the introduction of a new payments service).

Expected timescales could be better communicated by DSIT to project consortia members. According to DSIT officials, the department is due to introduce internal changes to better manage and monitor this process. DSIT officials also noted that extensive financial changes and replanning from projects had resulted in DSIT having to resubmit financial lines to the programme Senior Responsible Owner (SRO) and HM Treasury. Projects often delayed costs, underspent and/or asked for funding to move between financial years. DSIT requires evidence for its responsibilities as sponsor, for grant management, payment and change approvals, and for audit trails, which can sometimes result in delays to paying projects when that evidence is received late.

2.3.4 Benefits realisation

DSIT had made changes to the design and contents of the benefits realisation templates to reflect learnings from earlier interventions and similar programmes (e.g., the 5GTT programme). DSIT officials from the benefits realisation team reported that the benefits realisation templates had been adapted from those of 5GTT and streamlined following 5GTT learnings. There are various benefits realisation templates in use across the different interventions. When comparing the benefits realisation templates for FRANC with those used for ONE projects, it is evident that further changes have been made to make them more user friendly (e.g., simplifying the forms, providing additional guidance) following feedback from the earlier interventions. It is too early to assess whether these changes have improved processes, as project reporting for ONE projects had only just begun at the time this evaluation was undertaken.

There is an inconsistent level of benefits realisation reporting across projects/interventions, likely due to the complexity of benefits reporting in R&D programmes. The ICF evaluation team observed varying levels of quality and completeness of benefits, lessons learned and TRL reporting coming from the projects, including missing tabs when comparing projects within the same intervention (examples of missing tabs include: knowledge dissemination, project change log, investment stimulation). Whilst DSIT has attempted to improve compliance with this reporting, several DSIT officials, including those from the benefits realisation team, reported that quality assurance processes could be improved and raised that DSIT processes do not consistently conform to a common set of standards. This was raised by several project consortia, who shared the view that DSIT processes should follow a common approach to ensure consistent project reporting. In response to these concerns, DSIT officials reported that more informal efforts have been made to ensure consistency in assessing the benefits realisation templates, which are signed off by the benefits realisation lead following a review by the TDA. These include internal monthly catch-ups attended by all benefits realisation leads to discuss best practice, monthly benefits realisation RAGs, and a recently introduced peer review mechanism. Despite these efforts, DSIT benefits realisation officers reported that project consortia continue to have difficulty completing the templates, as it can be challenging to articulate how benefits will be achieved and evidenced.

Furthermore, the majority of project participants and DSIT officials emphasised the importance of consistent approaches to defining the requirements of the project from the outset, to help ensure the intended benefits are appropriately captured in the templates, as these will be unique to each project. It was also noted by DSIT officials that benefits realisation reporting is a key tool for monitoring projects’ progress and gathering evidence towards the end objectives. DSIT provides support and guidance from kick-off and throughout the project lifecycle to assist in identifying appropriate benefits and associated measures; however, project consortia are ultimately responsible for defining and reporting on their project’s benefits. Despite this support, DSIT officials reported that there remains confusion over how projects define a benefit, as this could sometimes be interpreted differently across projects (also observed by the ICF evaluation team). Specifically, a distinction is needed between the milestones (outputs recorded in the GFAs) and the benefits arising from those outputs (reported via the benefits realisation templates).

Enhanced engagement from DSIT project managers with the benefits realisation reporting could help improve the completeness of reporting by project consortia. The majority of DSIT portfolio managers and intervention leads reported a gap in coordination between DSIT project managers and benefits realisation leads, such that project managers are primarily concerned with ensuring the project is within budget, on time and within scope. A few DSIT officials suggested this is due to the current delivery model of the ONP, in which multiple teams within DSIT hold responsibility for different elements of project management (e.g., budget, scope, benefits realisation, capturing lessons learned); this has, at times, resulted in a mismatch of DSIT official support for project consortia, with different emphasis on priorities. Currently, DSIT project managers do not check the benefits realisation templates, as sign-off is only required of the benefits realisation lead with TDA commentary and input. It was suggested by a few DSIT officials that further engagement from DSIT project managers with the benefits elements could help drive the behaviours needed to ensure completeness of the templates by project consortia. DSIT officials from the benefits realisation team felt that DSIT project managers and project consortia leads do not always understand the uses of the templates (e.g., for progress updates, as a tool for tracking success, for bi-annual reviews, closure activities and future evaluations), which may also contribute to template incompleteness.

2.3.5 Technology Readiness Level (TRL) reporting

TRL reporting across projects and interventions lacked consistency and completeness, particularly for earlier interventions, where the requirement had not been made sufficiently clear to projects. We found inconsistencies between TRLs reported in the benefits realisation templates and those given by project representatives during interviews. DSIT officials, in particular project managers and portfolio leads, raised concerns about the accuracy of TRL data. Some DSIT officials reported that, despite projects being encouraged by the DSIT benefits realisation lead and TDA to include TRLs in their benefits realisation templates since the start of the FRANC intervention, mid-way through the intervention benefits realisation leads had to prompt projects to ensure they were providing this measure and recording progress. However, DSIT project managers and the benefits realisation team reported that the requirement to record TRLs was not well communicated amongst DSIT teams working on earlier interventions until these projects neared completion. There was also consensus from both DSIT officials and project participants that there was a lack of clarity on how to score TRLs. The DSIT benefits realisation team has taken this feedback on board in its design of the ONE benefits realisation template for TRL reporting. This includes highlighting that TRL reporting is a key performance indicator (KPI) to be reported on, and placing greater emphasis in the initial benefits realisation workshop with projects on TRL reporting being a standard benefit that all projects need to complete. The benefits realisation template also now includes a link to the TRL definitions published by Horizon 2020, which was absent from previous versions of the templates.

TRL tracking is difficult: scoring can involve an element of subjectivity and, at times, the TRL scale lacks relevance to the specific project or intervention. This is a view shared by nearly all project consortia and DSIT officials, including TDAs. Reporting of TRLs varies across projects, such that in some cases a TRL was provided for a specific component of the project and in other instances a TRL was recorded for the overall project. In some instances it was not appropriate to assign a TRL at all, for example to supply chains or to technologies that are not actually products; TRLs should only be applied to products and solutions. Given the variability in project type across interventions, and based on the interviews, we share the view that how a TRL is recorded should be agreed at the outset of a project between the project consortia, TDA and benefits lead, recognising that there is no one-size-fits-all approach and that tracking TRL progression may not always be relevant. Having TRL guidance applied and agreed from the start, including what elements the projects should report on, will help to ensure a robust foundation for future evaluations.

3. Impact evaluation

This section describes our theory-based approach to assessing the impacts and includes our assessment of project/intervention-level impacts (‘bottom-up’) and ONP programme-wide impacts (‘top-down’).

3.1 Using a theory-based approach to assessing impacts

During a scoping phase of the evaluation, it was concluded that a theory-based approach was most suitable for assessing ONP impacts at this point in time. This was primarily because there are insufficient numbers of funded (treated) and unfunded (untreated) projects under the ONP to enable use of econometric analysis of differences in performance between the 2 groups. At the centre of a theory-based approach is a ToC model for the intervention in question. A ToC model articulates the logic underpinning an intervention, setting out the expected causal pathways through which the activities that are funded are expected to lead to the desired outcomes and impacts. As part of a theory-based impact assessment, evaluators collect and analyse evidence as to whether the anticipated steps along the causal pathways have materialised, and whether any change can be attributed to the intervention. A ToC model for the ONP is shown in Figure 3.1.

The main elements of this model are as follows:

Interventions, activities, and outputs (boxes 1 to 10). The ToC model lists the interventions and projects that have been funded through the ONP and shows the activities and outputs they are expected to deliver. The ICF evaluation team has analysed the extent to which funded projects have delivered against expectations, the results of which are presented throughout this section of the report.

Short- to medium-term outcomes (boxes 11 to 15). These outcomes will start to materialise over the course of the ONP, though it might only be after the programme has ended that they reach their full extent. This evaluation has assessed whether the ONP is delivering – or is on course to deliver – all 5 of these outcomes, noting that in some cases it is too early to make a definitive judgement. A theory-based evaluation approach has been used, whereby the ICF evaluation team has assessed whether there is evidence that these outcomes have been achieved, and whether there is evidence that the interventions funded through the ONP have contributed towards these outcomes (and what other factors might also have contributed). Where outcomes have not yet been achieved – e.g., because they are expected to materialise in the future – the ICF evaluation team has assessed whether there is evidence that ONP interventions have delivered precursor activities and outputs along the ‘impact pathways’ that lead to these outcomes. Section 3.2 contains the results of this analysis.

Medium- to long-term impacts (boxes 16 to 19). These impacts will likely only materialise 5-10 years after the programme has ended, since they follow on from the short- to medium-term programme outcomes. Their achievement is also dependent upon the wider diversification programme, of which the ONP is one element. This evaluation has not sought to systematically assess whether these impacts have been achieved, though Section 3.3 draws together the available evidence to identify any ways in which the ONP has already started to set the conditions for their delivery.

The analysis presented in this section is synthesised from a range of evidence sources, including ONP management information and benefits realisation data, interviews with programme stakeholders (project consortia members, industry representatives and DSIT officials), unpublished case studies of 9 ONP-funded projects, and surveys of project representatives (including unfunded projects). The strengths and weaknesses of the evidence base are discussed in Section 1.3.4.

Figure 3.1 Evaluative Theory of Change model for the ONP

3.2 Assessment of intervention-level outcomes

Our assessment of the results of the ONP interventions is structured around the 5 core anticipated programme outcomes shown in Figure 3.1. In the remainder of this section, we recap the impact pathways through which the ONP was expected to achieve its intended outcomes. We then evaluate the evidence as to whether ONP interventions have delivered these outcomes and, if not, whether they are on course to do so. We also consider the additionality of outcomes, i.e. whether any change can be attributed to ONP interventions or to other factors.

3.2.1 Accelerating the maturity of Open RAN products and solutions

ONP-funded projects have been tasked with demonstrating that Open RAN products and solutions have matured over the project lifetime. This outcome is most relevant to the FRANC, NeutrORAN, UK-ROK R&D and ONE interventions, though FONRC may also ultimately contribute. The impact evaluation has assessed progress towards this outcome, noting that most projects were still ongoing at the time the research was undertaken. Moreover, it is to be expected that products and solutions developed through ONP-funded projects will continue to mature even after the programme has ended. What follows is thus a preliminary assessment of the results of the ONP.

ONP projects have measured the status of the Open RAN products and solutions they are working on using the TRL scale[footnote 9]. Most TRLs map on to specific products and use cases, though there are also TRLs that track testbed technologies and equipment. For every product and solution that projects have tracked, they report the baseline TRL (i.e. the position as the project is starting) and progress towards a target TRL. They will ultimately report end-of-project TRLs, though these were not available when this evaluation was carried out as projects were still ongoing. TRL data are largely self-reported by projects, though there was some validation undertaken by DSIT technical advisors. The ICF evaluation team has not reviewed the accuracy of any reported TRLs.

Figure 3.2 summarises the change in TRLs between project start (‘before’) and the latest point at which data were available (noting that this is typically not the end of the project, and further progression is expected). Data for these 2 points in time were available for 28 products and solutions, of which 22 were from FRANC projects and 6 were from the NeutrORAN project. TRLs increased for most of the products that were tracked. The most common shifts were from TRL2 to either TRL4 or TRL5, and from TRL3 to TRL5; what this means in practice is discussed below. Also notable are the products that were at TRL2 at the start of a project and were still at TRL2 in the latest data available (4 products in total). Since projects were still ongoing when these data were collected there is still time for progression to take place, but this does highlight that the outcomes of R&D initiatives are contingent on a range of factors, and not all projects will be able to achieve their targets.

Figure 3.2 Self-reported change in TRLs of products and solutions funded through the ONP (FRANC and NeutrORAN only)

Notes: numbers show counts per TRL at each point in time (out of 28); colour scheme shows direction of change: green indicates increase in TRL between the 2 points in time, blue indicates no change.
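
To illustrate how a before/after comparison of the kind shown in Figure 3.2 can be tabulated, the following is a minimal sketch using hypothetical baseline/latest TRL pairs. The values are illustrative only and are not the actual benefits realisation data collected for the evaluation.

```python
# Minimal sketch of a Figure 3.2-style tabulation, using hypothetical data.
# Each pair is (baseline TRL, latest reported TRL) for one product or solution.
from collections import Counter

trl_pairs = [(2, 4), (2, 5), (3, 5), (2, 2), (5, 7), (3, 4)]  # illustrative only

baseline_counts = Counter(before for before, _ in trl_pairs)
latest_counts = Counter(after for _, after in trl_pairs)
unchanged = sum(1 for before, after in trl_pairs if after == before)

print("Products per TRL at baseline:", dict(sorted(baseline_counts.items())))
print("Products per TRL at latest report:", dict(sorted(latest_counts.items())))
print("Products with no TRL change so far:", unchanged)
```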

Qualitative research was undertaken with representatives from 7 FRANC projects and the NeutrORAN project. This research included analysis of if and how projects were advancing the maturity of Open RAN products, the results of which have been used to assess whether the reported TRL changes can be attributed to the ONP. Reading across the interviews, the following observations can be made.

ONP projects have successfully accelerated the development of several Open RAN products. Many of the products that form the focus of FRANC projects started at a very low (early-stage) TRL – as Figure 3.2 indicates. The mean baseline product TRL across all FRANC projects was 2.7, meaning that in most cases the technological concept had been formulated (Technology Readiness Level 2 (TRL2)) or experimental proof of concept had been achieved (Technology Readiness Level 3 (TRL3)). The latest mean TRL across all FRANC projects was 4.4; interviewees from these projects confirmed that, typically, these products had made it as far as a trial in a lab or other relevant environment (Technology Readiness Level 4 (TRL4) or Technology Readiness Level 5 (TRL5)). This position reflects the relatively early stage of many Open RAN technologies. As a result of the ONP, several projects were able to progress products to the trial stage, including in some cases through trials undertaken with MNOs. This is a key stage in the journey towards product maturity and acceptance by the MNOs, since it can give them confidence that products will meet their requirements – see section 3.2.3 for further discussion of the adoption by MNOs of Open RAN products. For example, the FRANC Proteus project enabled Parallel Wireless to develop DU software which can run on ARM chipsets as well as the x86 chipsets supported by the existing DU software, thereby helping to enable supplier diversity. Initial product testing has been undertaken in BT’s lab.

As expected at this early stage, most ONP projects have not (yet) developed Open RAN products to the point where they are market ready. As Figure 3.2 indicates, only a couple of the products for which data were available had reached Technology Readiness Level 8 (TRL8) or Technology Readiness Level 9 (TRL9) when this evaluation was carried out. These were from the NeutrORAN project rather than the FRANC projects, and testing was not carried out on live operator networks, so, for now, it is uncertain whether they are truly commercial products. For most FRANC projects, whilst the products being tested had matured, they were not yet market ready – though again, this evaluation was carried out before many had finished. They are, however, on the pathway to market readiness, depending on the outcome of further testing and development work (see also the next paragraph).

Several FRANC projects sought, and in some cases secured, ‘follow-on’ funding under the ONE intervention to build on the progress they made. There is progression in some cases between FRANC and ONE, with the latter intervention supporting some of the products that advanced under FRANC but where there was scope for further development. For example, the FRANC ARI 5G project supported the development of a RIC platform (an innovative Open RAN technology) and 4 exemplar use cases. This has now progressed into the ONE ARIANE project which is investigating the operation of multiple applications running on multiple RICs on a network. In some instances, DSIT officials and FRANC project participants reported that, rather than applying for new funding under the ONE intervention, FRANC projects secured grant extensions.

Evidence indicates that the additionality of the ONP outcomes is high, since funded projects are judged unlikely to have gone ahead in their current timescales without public funding. ONP funding enabled organisations to assemble research teams and undertake R&D at a scale and pace that would not otherwise have been feasible. Several interviewees from funded projects indicated that ONP projects had a larger R&D budget than would otherwise have been the case, meaning they could operate at scale. The ONP also required collaborative consortia-based R&D projects, which several project interviewees said typically would not have happened outside the programme. A few interviewees from smaller organisations involved in projects commented that participation in the ONP enabled them to secure the participation of MNOs in trials, which otherwise would likely not have been the case given their relative size and track record. As a result, products were further along the path to maturity than they would otherwise have been, and this gave them access to MNOs and an opportunity to demonstrate their products.

Further evidence on the additionality of the ONP-funded projects comes from research with unfunded projects. Interviews were carried out with 5 unsuccessful bidders to the FRANC competition (there were a total of 20 unfunded FRANC projects). Of these, 2 projects went ahead anyway – albeit with a different design – because the lead partner either chose to carry out the project internally or was able to raise private equity investment. One project changed its scope and was then successful in a bid to the ONE competition. Another project was a public infrastructure project and the private investor needed government support to invest – in the absence of such a partner the project did not proceed. For the fifth project the interviewee was, at the time, uncertain what would happen in the absence of funding (our understanding is that this has still not progressed). For unsuccessful ONE projects, it was too early for the applicant organisations to discuss next steps as, when this evaluation was carried out, they had only just been informed that they were unsuccessful. During interviews they suggested that they would consider other government funding routes or seek private investment, but noted that without DSIT funding the time to market would be longer. A future evaluation of the ONP should revisit these organisations to ascertain what happened in the absence of DSIT funding.

3.2.2 Interoperability tests carried out on Open RAN products

For Open RAN products to reach a sufficient level of maturity, they need to be thoroughly tested with other Open RAN products to check that they all work together (i.e. they are interoperable). The premise behind Open RAN is that components from different providers can be swapped in and out of networks, which would enable smaller organisations to focus on a particular product, rather than having to provide a full solution. However, an MNO requires a fully working solution and needs to be confident that any product it buys will be compatible with other products, including equipment it has already installed in the network. Facilities for such interoperability testing have not been readily available to Open RAN equipment providers because they need to have access to other providers’ products to test and then diagnose the cause of any problems. The ONP sought to address this problem primarily through the SONIC intervention. The UKTL and the Systems Integration Skills interventions are also relevant to this outcome but had not progressed far when this evaluation was carried out.

SONIC was tasked with creating test and trial environments where Open RAN equipment providers can test – and prove – the interoperability of different hardware and software combinations. Ultimately, testing and demonstration of products is part of their progress towards maturity and, potentially, market readiness. The SONIC testing facilities are open to organisations participating in ONP projects and any other organisation developing a relevant product. By creating a collaborative space, it was also intended that the SONIC testing facilities would become ‘incubation chambers’ where organisations could learn from each other, further enhancing the potential for product development.

SONIC has successfully provided Open RAN equipment vendors with a well-regarded and trustworthy environment where they can test Open RAN products. As discussed in Box 3.1, 3 of the 4 planned test environments are operational and thus far they have been used by 19 vendors to test products. This has helped equipment vendors demonstrate the interoperability of their products and – equally importantly – identify where compatibility issues exist, so the providers can work to improve their products, thereby taking further steps towards product maturity. As part of the SONIC project, data were collected on product TRL progression following the tests performed. Vendors in the first cohort estimated that the TRLs of their products – including Radio Units, Distributed Units and Centralised Units – increased from an average of TRL5-7 to TRL7-9 (i.e. almost or actually market ready).

At the time of this evaluation, the absence of an outdoor testing facility had to some extent limited the validity of some of the tests that could be undertaken at the SONIC lab. During SONIC’s work between January 2022 and March 2024, only lab and indoor testing environments were available to users; the outdoor testing environment was planned during Phase 2 and will be available to use in 2024/2025. This is a significant limitation as much Open RAN equipment will need to be deployed in outdoor environments if it is used widely in MNO networks. For some Open RAN products, the availability of an outdoor testing environment is key to the credibility of the tests and their usefulness to the MNOs.

Having the SONIC facility, which is one of very few such facilities in the world, is perceived by some industry interviewees to have raised the profile of the UK. Some interviewees from within the telecoms industry – unprompted – reported that, in their opinion, the SONIC labs have helped to raise the profile and reputation of the UK as a location for Open RAN research. This is in part because there are so few comparable facilities elsewhere. In April 2023, the O-RAN Alliance announced there were only 11 Open Testing and Integration Centres (OTIC)[footnote 10] globally, which increased to 17 by the end of 2023. Emphasising the importance of these facilities, other countries have also been investing in their development. In the USA the National Telecommunications and Information Administration (NTIA) has recently awarded USD 42 million to an industry consortium to establish Open RAN testing, evaluation, and R&D facilities, including a focus on interoperability testing.

The additionality of the tests carried out at the SONIC facilities is judged by the ICF evaluation team to be high. In the absence of the funding and impetus provided through the ONP, it is likely that the SONIC lab facility would not have been built as no individual organisation that stood to benefit from SONIC would have the resources to fund it. The main benefit of SONIC is to smaller product providers who would not have access to other product providers’ equipment or existing UK testbed facilities, such as those operated by the MNOs. They therefore would have been unable to prove their products were interoperable or identify and diagnose compatibility issues.

In the longer term, the UKTL will support security testing, security research and functional secure interoperability testing of equipment and software. When this evaluation was carried out the UKTL was still in its infancy, and the ICF evaluation team has not assessed its contribution to this ONP outcome. The UKTL is intended to provide a facility to enable vendors to assess the performance of Open RAN solutions against incumbent solutions, thus ensuring that Open RAN products are a viable alternative – on performance and price – to existing RAN products. When this evaluation was carried out, DSIT had only recently passed contract management responsibilities to the National Physical Laboratory to continue with project delivery of UKTL, and so it is too early to assess the role of the UKTL.

Box 3.1. Case study extract: The SONIC project – developing an Open RAN testing environment

The aim of the £20 million SONIC project was to create a testing facility to enable smaller organisations to evaluate the interoperability of their Open RAN network solutions and to test the compatibility of a complete configuration of mixed-vendor products to demonstrate a fully working end-to-end solution.

Phase 1 (SmartRAN Open Network Interoperability Centre 1)(SONIC-1) was a pilot phase that began in November 2020 which aimed to set up a testing environment and identify challenges in integrating Open RAN solutions. Phase 2 (SmartRAN Open Network Interoperability Centre 2)(SONIC-2) began in February 2022, since when Open RAN equipment providers or ‘vendors’ have been able to use the facility to test products. So far, SONIC has provided 3 test sites – lab-based and indoor. During SONIC-2, a total of 19 vendors – split across 4 cohorts – have used the testing facilities. Later, a fourth test site will be launched, enabling users to test equipment in an outdoor environment.

Other ONP projects’ involvement with SONIC has not been as extensive as it could have been. Interviewees from FRANC projects noted a variety of reasons for this, including insufficient capacity available at SONIC, initially limited awareness of SONIC at the start of FRANC projects, and the lack of availability of outdoor testing environments. The evaluation expects greater interaction between SONIC and the ONE projects – not least because of the greater awareness of SONIC’s facilities – which should be explored in any future ONP evaluation.

3.2.3 Increased confidence in and adoption of Open RAN products by MNOs

This outcome is likely to materialise in the medium-term, since ONP-funded projects were still in the process of developing Open RAN products during this evaluation. MNOs are a key purchaser of these finalised products, and a demonstration of the maturity of products is crucial to increasing adoption of Open RAN technologies. Even once these products are mature enough to be rolled-out in live networks, the cyclical nature of MNOs’ procurement rounds means there will be a lag between product readiness and adoption at scale within these networks. It should also be noted that the ONP is not the only intervention contributing to this outcome, and that there are other initiatives within the wider diversification programme that will drive Open RAN adoption.

About half of ONP projects have developed products and undertaken trials with at least some degree of involvement of one or more of the UK mobile operators. As Figure 3.3 shows, the extent of MNO participation in project teams varied between interventions, reflecting the intervention objectives. Of the 2 R&D-focused interventions – FRANC and ONE – MNOs were full partners in 14 of the 34 projects, showing the extent of their involvement in the programme. Projects have involved trials being undertaken at MNOs’ own laboratories or test sites or at the edge of their networks (e.g., a neutral host site or private network connected to the public network). For example, the NeutrORAN project involved the deployment of a neutral host Open RAN solution in South Wales in conjunction with 2 of the UK’s mobile operators. The project demonstrated the suitability of the vendor’s Open RAN product for deployment on rural sites shared between the mobile operators. For many ONP R&D projects, the extent of testing has been more limited than planned, often because technology development has taken longer than expected and/or due to delays in project initiation.

Figure 3.3 The extent of MNO involvement in ONP project consortia

Mobile Network Operator involvement by intervention

Intervention name Projects with MNO in consortium Projects with no MNO in consortium Total
FRANC 5 9 14
FONRC 3 0 3
NeutrORAN 0 1 1
UK RoK 1 0 1
ONE 9 11 20

To increase the confidence of MNOs in Open RAN solutions and drive adoption, the next key stage is for operators to test Open RAN technologies on their live networks – ideally with actual customers. As discussed in section 3.2.1, most of the products supported through the ONP have typically not (yet) reached this stage, mostly because the FRANC intervention focused on less mature, early-stage technologies. The ONE intervention largely aims to support later-stage technologies to reach market readiness (and thus adoption), in many cases building on use cases developed as part of FRANC. For example, the ONE SCONDA project, which had not started when this evaluation was carried out, will involve the deployment of multiple Open RAN small cells across Glasgow on Three’s live network. This includes integration with its legacy macro cell network, including testing of how customers seamlessly migrate between Open RAN small cell sites and legacy macro cell sites as they move around. This is an example of how MNOs have become involved in live network testing through the ONP, though it is at present too early to say if this will lead to wider adoption within the Three network.

The adoption of Open RAN technologies by MNOs depends on addressing a range of challenges, to which the ONP is contributing. One of the key barriers is the extent to which the products that are available – or in the pipeline – perform as expected and meet MNOs’ requirements. Interviews carried out with MNOs for this evaluation highlighted that, when deploying Open RAN on their networks, MNOs want certainty that using equipment from different suppliers will work reliably with good performance. These are valid concerns – in some of the ONP projects there were delays, and rescoping/replanning was needed, due to issues arising with integrating hardware and software from different suppliers. Observing such issues on smaller-scale projects suggests that operators’ concerns about the challenges they will face in deploying Open RAN at a larger scale are fair. Still, it is through this process of testing and demonstration – whether through FRANC/ONE or via interoperability testing at test centres such as SONIC – that MNOs will build confidence in the readiness of Open RAN technologies.

3.2.4 Catalysing the UK Open RAN innovation ecosystem

The ONP aims to build on and evolve supply chain relationships and networks within the Open RAN innovation ecosystem. The primary vehicle for achieving this outcome is the UKTIN. The UKTIN was set up to support UK telecoms through networking, coordination, and knowledge dissemination. These activities are intended to catalyse the ecosystem by bringing together organisations that may otherwise not have been aware of each other or thought to collaborate. They are also expected to educate, inspire, empower, and support organisations to innovate and invest in telecoms. The UKTIN continues and builds on the legacy of the UK5G Innovation Network that was set up by (then) DCMS as part of the 5GTT programme. That network similarly sought to develop the UK 5G ecosystem, manage information about 5G activities and learning, and promote the capabilities of UK 5G internationally. UKTIN has reused much of the information generated by the UK5G network and transferred across – and expanded – the database of UK5G members. However, its remit extends beyond Open RAN to encompass the UK telecoms ecosystem more broadly, and it considers itself ‘solution agnostic’ rather than fixed to a particular technology.

It is too early to assess the impacts of the UKTIN on the innovation ecosystem. The network has been funded from September 2022 and launched in April 2023, as it took several months to recruit staff and set up the project workstreams. In October 2023 it had over 6,000 registered users[footnote 11], representing an increase from the predecessor UK5G network (the interim evaluation of the UK5G network, published in May 2023, reported that the network had over 5,000 registered users). Since its launch, the network has delivered a range of activities designed to support networking and collaboration, support policy development (e.g., through expert working groups) and disseminate information (including about ONP competitions). This has included running and participating in events (12 events between July and September 2023). Events are reportedly the single most commonly used service provided by the network – in a survey of network members, 69% of respondents involved in Open RAN reportedly attended at least one UKTIN event[footnote 12]. Feedback was collected by the network for one of these events – an event run in connection with the ONE intervention (see next paragraph) – and there is evidence that it was useful for many attendees. Reportedly, around 60% of attendees that provided feedback said they had found partners they hoped to collaborate with to a “large” or “moderate” extent at this ONE event[footnote 13].

The UKTIN has supported networking and knowledge sharing within the Open RAN innovation ecosystem. The network has supported the ONE competition, by hosting and sharing information and competition material, supporting network building, and giving a platform for the selected ONE projects to share information about themselves. There is evidence that this was successful – as noted in Section 2.1, an interviewee from a ONE project reported that the UKTIN’s supplier guidance had helped them find partners to form a consortium (note the evaluation did not ask this question of all ONE projects, so we cannot be sure how often this happened). The UKTIN was not operational for the earlier ONP interventions – e.g., the FRANC competition – though the predecessor UK5G network played a similar role to that which UKTIN is now playing.

Until ONP-backed R&D projects finish and generate publishable results, the dissemination impacts of the UKTIN will be somewhat constrained. Still, several of the smaller companies involved in ONP projects said that they frequently attend UKTIN events and have found them very helpful for interacting with other members of the ecosystem and sharing knowledge. Dissemination of knowledge at the end of projects will enable further learning across the ecosystem. Several projects have been doing this at Mobile World Congresses whereas others have been making outputs available through project websites – the UKTIN will be a channel to amplify this activity. A future evaluation of the ONP should consider if and how these activities have impacted on the functioning of the ecosystem.

3.2.5 Enabling international collaborative research and development

The ONP aims to support UK organisations to partner with organisations in like-minded nations to maximise opportunities for collaborative R&D. Two interventions – NeutrORAN and UK-ROK R&D – had a specific goal to foster international collaboration. For example, the UK-ROK R&D intervention involved the creation of commercial relationships between partners in the UK and South Korea. FRANC and ONE also supported international collaboration in that global organisations could, and did, join project consortia, as discussed below.

Most ONP projects have fostered international collaboration through the participation of at least one international organisation in project consortia. As Figure 3.4 shows, for all 5 of the interventions considered, most project consortia included at least one multinational organisation. For example, 8 of the 14 FRANC projects involved international organisations, rising to 18 of the 20 ONE projects. The composition of ONP-funded project consortia has thus facilitated international collaboration on R&D.

Figure 3.4 The extent of international firm involvement in ONP project consortia

International firm involvement

Intervention name Projects with non-UK based firm in consortium Projects without non-UK based firm in consortium Total
FRANC 8 6 14
FONRC 3 0 3
NeutrORAN 1 0 1
UK RoK 1 0 1
ONE 18 2 20

International collaboration supported through the ONP has supported project development, and additionality is assessed by the ICF evaluation team to be high. Interviewed representatives from the UK-ROK R&D flexible Distributed Antenna System (Flexi-DAS) project noted that the project had supported the investigation of technologies which could provide energy and cost savings within Open RAN networks. Interviewees from the project believed that, without government funding, research on reconfigurable intelligent surfaces would not have been undertaken, in part because it is reportedly unlikely that participating organisations would have worked together in the way that they have. The outcomes of the Flexi-DAS project have been useful elsewhere; 2 projects in the ONE intervention are reportedly making use of the findings from the Flexi-DAS project, though it is too soon to assess the impacts of these spillovers.

Whilst international vendors have participated in ONP projects, this has not yet translated into follow-on investment at scale. The NeutrORAN project had a specific objective to facilitate investment by encouraging an international Tier 2 radio supplier[footnote 14] to enter the UK market – both in terms of undertaking R&D in the UK and in offering the company’s products and services to UK mobile operators. Whilst the project has helped to increase international interest in the UK market, it has yet to lead to a Tier 2 vendor entering the UK market; however, it is too early to assess this impact as investment decisions take place over the longer term. There was consensus amongst the industry representatives interviewed that the ONP has generated international interest in the UK that was not present before, but that this interest has yet to translate into new vendors entering the UK telecoms market.

3.3 Assessment of ONP impacts

This sub-section presents an early assessment of the impacts of the ONP, noting that these are expected to be medium- to long-term results and it is too early to comprehensively evaluate whether the programme is having its intended effect. Instead, this section presents the results of primary research with industry representatives and analysis of published material, to assess whether there is evidence that the ONP is on track to deliver its intended impacts. The impacts that have been assessed are drawn from the ToC model for the ONP (see Figure 3.1) and include the strategic priorities for the programme that were identified by DSIT officials during the evaluation. As part of the assessment of the impacts of the ONP, the ICF evaluation team collected data on a set of KPIs. DSIT commissioned a baseline assessment which measured the state of play within Open RAN prior to the implementation of the ONP. The evaluation has updated some of these KPIs to assess progress since the programme started. A summary of the findings has been shared with DSIT separately and should be considered supplementary to the evaluation given the limitations listed.

3.3.1 Increase MNOs’ interest in deploying Open RAN on their networks

The level of interest from the mobile operators in deploying Open RAN on their main network varies by operator. Industry interviewees believe that all operators are interested in the potential of Open RAN products and solutions. However, interviews with industry representatives and desk research have also indicated that MNOs’ propensity to deploy Open RAN technologies on their networks varies. To date, Vodafone has demonstrated the greatest interest, having started to deploy Open RAN technology (see Box 3.2 for details). Virgin Media O2 (VMO2) announced in April 2023 that it would deploy Open RAN technology using Mavenir as its main vendor. Information on the exact timescale and extent (e.g., number of sites) of VMO2’s planned deployment was not publicly available when this evaluation was carried out. Interviews and desk research indicated that the other 2 mobile operators are currently showing more limited interest in large-scale deployment, instead focussing on smaller-scale trials (e.g., BT’s Open RAN trial in Hull) and participation in the ONP. However, DSIT officials point out that this can change quickly once the business case makes sense for an operator’s position: for example, BT is becoming more open to Open RAN as a use case for neutral host or small cell deployment as soon as 2025.

Box 3.2 Deployment of Open RAN technology by Vodafone

Vodafone is starting to deploy Open RAN technology on 2500 sites in the South-West of England and Wales, to be completed by 2027. The 2500 sites have been selected from the total 6000 sites on which Vodafone is required to replace Huawei equipment. Vodafone has indicated that the target of 2500 sites has been developed in consideration of the Government’s ambition for 35% of traffic to be on open and interoperable networks by 2030. During an interview with Vodafone, representatives indicated that the ONP has not directly impacted on its own plans since it planned to deploy Open RAN in any case, but it considered the ONP to be important in helping to maintain industry momentum in Open RAN.

UK mobile operators continue to have concerns about deploying Open RAN technology. There have been advances in the maturity of Open RAN products, but industry and operator interviews as well as industry papers cited continued concerns over whether the maturity level of Open RAN technology meant that it was presently suitable for large-scale deployment. Interviewees identified the following challenges:

  • Responsibility and contractual liability: MNOs are concerned about who they can hold responsible for the performance of their network. In an integrated RAN, operators have one vendor who they can contractually hold accountable if the performance of the network is not meeting standards. In an Open RAN where an operator is using equipment from multiple vendors, it is less clear who they can hold responsible for failures in the network. A systems integrator could potentially act as the MNOs’ first point of contact if the network was not performing, but there are a limited number of credible integrators

  • Technology maturity: As noted, when deploying Open RAN on their networks, MNOs want certainty that using equipment from different suppliers will work reliably with good performance. This is a major concern for the operators. Operators also have concerns about the performance of Open RAN in High Density Demand (HDD) areas: dense urban areas such as airports, sports venues, tourist attractions, etc. which carry a large volume of traffic in a small area. Thus far, performance of Open RAN is not matching that of integrated RAN with Tier 1 vendors in HDD environments. Recognising this issue, optimising network performance in HDD areas is one of the goals of the ONE intervention

  • Legacy technologies: Most early large-scale Open RAN deployments have come from new entrant (‘greenfield’) operators. One reason for this is that brownfield operators need to deploy Open RAN alongside their existing integrated RANs. In the UK, operators are continuing to run 2G networks, and operators are concerned with how Open RAN will support or operate alongside these legacy technologies, as relatively few Open RAN vendors offer solutions which can support 2G. Some UK operators already face the challenge of having different vendors for 3G, 4G, and 5G, and have said they are reluctant to add the complexity of an additional 5G Open RAN overlay onto their existing networks

  • Total Cost of Ownership (TCO): A key benefit of Open RAN is supposed to be the reduced TCO of equipment. However, many of our interviewees suggested that equipment and operational costs have, so far, been higher (and in some cases much higher) than in existing integrated RAN solutions. One major issue is the lack of economies of scale for Open RAN equipment which can only be realised as and when the market develops

The ONP has supported the development and testing of Open RAN products, often with the involvement of MNOs. As discussed in section 3.2.3, the ONP has supported projects that aim to address some of the concerns of the mobile operators. At this stage, testing of Open RAN solutions through the ONP has mainly been limited to the laboratories of the operators; recognising this the ONE competition is intended to address this challenge by taking solutions developed as part of earlier ONP interventions and progressing them further along the maturity readiness scale. Fundamentally, however, the ONP can only play a limited role in maturing Open RAN technology given the budget available, when compared to the global expenditure on RAN R&D (estimated to be in the tens of billions of US dollars per annum).

Moving forward, operators continue to consider whether Open RAN technology solutions can meet their requirements as part of future procurement processes. During interviews, mobile operators were asked to express their level of confidence in Open RAN vendors. Whilst they had some confidence, they indicated that they are much more confident using Tier 1 vendors for radio units at this time. Confidence in Open RAN solutions more widely was mixed, ranging from absolute confidence, through some confidence (a solution may work in a rural area but not in high-density areas), to no confidence (no solution currently meets the operator’s needs). The mobile operators that have yet to announce plans for the deployment of Open RAN on their live networks have indicated they will continue to monitor Open RAN developments. Two of the mobile operators indicated that they had recently held procurement processes and are ‘in contract’ with suppliers, so realistically it is likely to be several more years (late 2020s onwards) before the next major network equipment procurement rounds are undertaken.

3.3.2 Deploy 5G small cells, private networks and neutral host solutions

In the opinion of the ICF evaluation team, the ONP has helped promote the adoption of Open RAN technology in small scale deployments, which may help build MNOs’ confidence in Open RAN. Whilst deployment of Open RAN on operators’ main macro cell network sites has been limited, industry interviews indicated there is growing interest in deployment on small cells, on neutral host sites and on private network solutions. These initiatives are important because they provide a means for operators to gain confidence in Open RAN technology and interoperability at a smaller scale prior to more widespread deployment across their networks. They can be seen as a means of increasing both the maturity of the underlying Open RAN technology and as an opportunity for the operators to gain experience and confidence with deploying and operating the technology. As discussed in section 3.2.3, the NeutrORAN intervention funded the deployment of Open RAN technology on a neutral host site in Wales, and this equipment was subsequently utilised by 2 MNOs. Elsewhere, the 5G DRIVE project within the FRANC intervention demonstrated an innovative method for securely integrating a 5G private network into a mobile operator’s main network utilising an innovative roaming solution. The ONE intervention will continue to support small-scale deployments, for example through the Navigate project (which aims to deploy an Open RAN solution for multiple neutral host small cell sites in the City of London) and Liverpool City Region HDD project (which aims to demonstrate at least 5 live deployments of small cells in HDD environments such as stadiums).

3.3.3 Diversify the supply chain

Supply chain diversification is slowly getting underway, but it will be several years before progress can be assessed, which is to be expected since the ONP is still live. As discussed in section 3.3.1, interviews with mobile operators, industry experts and Open RAN technology providers indicated that some of the mobile operators do not wish to acquire RAN equipment solutions from multiple providers separately because of the difficulties in clarifying which party (or parties) is responsible whenever any issues (e.g., faults causing network downtime) occur. Having made a vendor choice, MNOs can also be ‘locked in’ until their next procurement round. Vodafone has, however, procured an Open RAN solution from multiple vendors (including Samsung, Dell, Wind River) having undertaken interoperability testing and a live deployment. Vodafone has taken on most of the systems integration responsibilities itself[footnote 15]. Other exemplar opportunities to date for Open RAN suppliers have included the following:

  • as part of greenfield network deployments in Japan (Rakuten), the USA (Dish) and Germany (1&1)
  • as part of an Ericsson or Nokia overall contract with existing mobile network operators – for example AT&T’s recent contract with Ericsson includes deploying Fujitsu radios (on an unspecified number of sites)
  • as part of a modest deployment by existing mobile operators – for example Vodafone UK’s deployment of Open RAN on 2,500 sites in the South-West of England and Wales includes deploying Samsung radios using Samsung software, whilst Deutsche Telekom’s recent contract announcement with Nokia includes deploying Fujitsu radios initially in the Neubrandenburg area of Northern Germany. VMO2 in the UK will deploy Open RAN solutions from Mavenir, but the scale of this deployment is unknown

Several of the mobile operators are seeking 1 or 2 major systems integrators to provide the complete solution with overall responsibility for performance. At present these mobile operators consider that only Ericsson and Nokia offer that capability, although other organisations could develop this capability over time. From Aetha’s experience of working with over 50 mobile operators across the world, we observe that MNOs usually procure equipment from 2 or 3 major suppliers/systems integrators to provide a degree of equipment supply diversity. In the case of Vodafone’s Open RAN deployments in the UK, Vodafone is the systems integrator, and it plans to use the same combination of vendors detailed above (including Samsung, Dell, Wind River) on all 2,500 sites on which it plans to deploy Open RAN technology. However, some of the recent network contracts suggest a move towards a single major supplier/systems integrator. For example, AT&T’s recent contract with Ericsson effectively displaces Nokia from AT&T’s network.

In the assessment of the ICF evaluation team, the ONP is playing a role in expanding the RAN equipment supply chain, contributing to diversification. Projects funded through the ONP have involved many participating organisations which, as discussed in sections 3.2.1, 3.2.4 and 3.2.5, have worked together to develop Open RAN products, including radio technology. For example, under the FRANC Best of British RAN Development project, smaller UK firms have developed a 5G small cell. As noted, when interviewed, smaller organisations indicated that they had found that being part of the ONP had helped them ‘open doors’ in MNOs and other larger organisations. Involvement in a DSIT funded programme came with an element of credibility and helped smaller organisations to ‘get a hearing’ from the technical teams in MNOs (who regularly receive pitches by small technology providers). The ONP has provided environments in which individual organisations can meet and network and decide to collaborate towards creating more complete solutions for MNOs. This includes initiatives such as UKTIN, ONP public events (e.g., the announcement of the ONE project winners) and ONP private events (e.g., meetings between multiple project teams).

3.3.4 Enhance the UK’s reputation for Open RAN investment

It is presently too early to assess changes in the UK’s reputation as a location for Open RAN investment, but there are signs that the ONP and other activities are starting to generate results. Interviewees from operators and industry were mostly positive about the ONP’s impact on the UK’s reputation for Open RAN investment. They cited several ways in which the ONP has already helped the UK’s reputation within the telecoms industry:

  • Increasing the product development and interoperability work undertaken in the UK. ONP funding has encouraged larger organisations such as NEC and Parallel Wireless, who have a presence in multiple countries, to undertake R&D work in the UK rather than at other locations. For example, Parallel Wireless’s facility in Bristol has grown to 40 people at a time when the company has had to make around half of its staff redundant

  • Positioning the UK as one of the leaders in Open RAN. There was almost universal commentary by interviewees that through the ONP, the UK has signalled widely that it is serious about Open RAN. Several interviewees commented that when discussing Open RAN with international colleagues, they are asked about both UK developments in Open RAN and the ONP itself

  • Creation of collaboration and networking opportunities. As discussed in section 3.2.5, the ONP has created collaboration opportunities through the funded interventions and indirectly through events such as those organised by the UKTIN. The ONP was also seen by interviewees to have promoted greater interaction and sharing of information between industry and academia

4. Economic evaluation

This section sets out the initial findings of an early-stage economic evaluation and draws on evidence from the stakeholder interviews, documentation review, desktop research, expert opinion from the ICF evaluation team, and the ONP project case studies. Due to the long-term nature of the programme, it was always expected that it would not be possible to fully quantify the economic benefits of the programme at this stage. With this in mind, this section outlines some emerging economic impacts where evidence is available, and highlights areas which require additional focus as part of future economic evaluations.

4.1 Economic evaluation rationale and methods

The economic evaluation for the ONP comprised 3 workstreams:

1. Reviewing the underlying assumptions which underpinned the original business case and assessing to what extent these are still valid. This workstream explored whether the assumptions that were made as part of the initial funding rationale for the ONP still hold, and whether the intended benefits under the policy scenario are still likely to be generated. The ICF evaluation team’s initial assessment of the current state of the initial economic assumptions is set out in a separate document shared with DSIT. DSIT is also undertaking a comprehensive internal review of the assumptions, but the results were not available when this evaluation was carried out as this work will be completed once the ONP is nearer completion and its full impacts are evident.

2. A cost-effectiveness analysis of the planned and achieved TRL progression of use cases to date against the funding which different projects and interventions received. Tracking the TRL progression of each intervention and comparing this to the grant amounts provided by DSIT to each intervention gives an early indication of the funding required to progress certain use cases along the technology readiness scale. However, it is important to acknowledge that funding requirements may differ for various technologies, and later stages of scaling up may require more resources than earlier ones.

3. An assessment of the composition of the organisations involved within the ONP. This workstream consisted of analysis of a database of the funding amounts received, along with details of the distribution of organisation types involved within each of the competition-type interventions. This has been shared with DSIT as a separate document, including a discussion on the development of the database (which presents the currently available data on key economic variables of interest including employee turnover, revenue growth and stock price levels).

4.2 Limitations of the economic evaluation

Given the early stage of this evaluation of the ONP, this sub-section details the limitations of the economic evaluation, including the availability of evidence/data. This includes an explanation of the timing of the ONP and how it is too early to complete a value for money assessment because most of the interventions of the ONP are still in progress.

In addition to the limitations described in Section 1.3, the ICF evaluation team note the following limitations specific to the elements of the economic evaluation:

  • challenges in updating the quantitative economic assumptions which underpinned the original business case for the intervention. As the initial assumptions were derived from earlier discussions and opinions on the state of the telecommunications sector at the time, generating quantitative updates to these assumptions was difficult

  • inconsistent project data on TRL progression. As indicated, the way in which TRLs were tracked differed across various projects and this made it challenging to carry out consistent comparisons. This meant that whilst analysis could be carried out on whether interventions had achieved the level of TRL progressions intended, comparing the TRL progressions across ONP interventions needed to be caveated due to the varying nature of how TRL progression was assessed

  • insufficient evidence on market valuations of ONP participating organisations. The ICF evaluation team sought to compile financial information about participating organisations using open-source data. Whilst this enabled us to generate a lot of information, there were still gaps. Working alongside DSIT and using the private sector firm database Beauhurst, these gaps were reduced, but still existed for several participating organisations

4.3 Cost-Effectiveness Analysis of TRL Progression

This sub-section presents our cost-effectiveness analysis of how the costs of achieving specific TRL progressions through R&D interventions compare across the R&D projects funded through the ONP. This provides the potential to assess how progress achieved within the ONP compares to other analogous interventions such as the 5GTT programme.
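
As a simple illustration of the mechanics of such a comparison, the sketch below computes a grant-cost-per-TRL-step figure from hypothetical project data. The project names, grant amounts and TRL values are assumptions for illustration only, not figures from the ONP, and the actual analysis is also subject to the caveats discussed below.

```python
# Illustrative cost-effectiveness sketch: DSIT grant divided by realised TRL steps.
# All values below are hypothetical and for illustration only.
projects = [
    # (project, grant in GBP millions, baseline TRL, latest reported TRL)
    ("Project A", 5.0, 2, 5),
    ("Project B", 3.2, 3, 4),
    ("Project C", 4.1, 2, 2),
]

for name, grant_m, start, latest in projects:
    steps = latest - start
    if steps > 0:
        print(f"{name}: £{grant_m:.1f}m for {steps} TRL step(s) "
              f"= £{grant_m / steps:.2f}m per step")
    else:
        print(f"{name}: £{grant_m:.1f}m, no TRL progression recorded yet")
```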

A caveat with the TRL analysis is that the way in which TRLs were assessed varied across interventions. Interviews with project participants highlighted the varying levels of understanding of how to determine TRL progression. This meant that the accuracy of TRL determination was very dependent on the individuals working within the projects. Consequently, though analysis has been carried out to compare TRL advancement both across interventions within the ONP and with other DSIT interventions, the results should be interpreted with a degree of caution. An area of potential improvement for future evaluations would be to ensure that standard guidance on how to assess TRLs is given at the outset of the intervention, to allow more robust comparisons to be carried out. This lesson was already being implemented as part of later FRANC projects as well as the upcoming ONE intervention, so it is increasingly likely that more accurate comparisons will be possible in future evaluations.

In summary, whilst there are limitations in our ability to confidently cost TRL progression due to the inconsistent approaches to reporting on TRLs across ONP projects, our findings suggest that the ONP to-date has led to TRL improvements. The TRL progression realised suggests that increasing the amount of DSIT funding had a positive impact on the level of TRL progression achieved. The realised TRL progression to date is comparable to those of similar DSIT initiatives like the 5GTT programme.

Figure 4.1 indicates that ONP interventions exhibited promising initial TRL progression. This appears to be very similar to the level of progression achieved under the analogous 5GTT programme. Whilst the 2 programmes did differ, they share similar goals of increasing use case maturity and ecosystem development.

Figure 4.1 Comparison of TRL progression achieved under the ONP (FRANC and NeutrORAN) and the 5GTT programme

Intervention name Average starting TRL Average final TRL Average target TRL Average increase in TRL
FRANC TRL Overview 2.65 4.43 7.17 1.78
NeutrORAN TRL Overview 5.6 7 9 1.4
Summary of 5GTT Overview 4.2 5.9 7.2 1.7

Figure 4.2 shows that the range of intended TRL progression varied across the FRANC projects. Whilst most sought to progress their use cases by 4 or more TRLs, some projects had more modest progression targets of 2 or 3 steps along the TRL scale. Projects like Best of British RAN Development, Coordinated Multipoint Open Radio Access Network (CoMP-O-RAN) and UK 5G DU-Volution had very different intended TRL progression aims but received similar funding amounts from DSIT. This suggests that funding amounts did not increase in line with the level of technology development that use cases were intended to achieve. Developing different technological aspects may involve different scales or types of R&D, and it indicates that wider aims, such as seeking to involve strategic partners or developing a specific strand of the telecoms supply chain, were also critical for certain projects. Projects such as Secure 5G, which expressly aimed to develop the UK supply chain for base station components, and ORanGaN, which looked to develop a new semiconductor solution that would help reduce the cost of key supply chain components, had use cases looking to solve these wider challenges.

Figure 4.3 shows the TRL progression achieved to date across the FRANC projects. It demonstrates that the amount of funding projects received from DSIT has been a contributing factor in the level of TRL progression realised so far. This suggests that, for low maturity products, the amount of R&D funding that organisations can dedicate to developing their solutions contributes to how much progress those solutions can realise. It may also be that progression between lower TRLs is less costly.
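One way to read Figures 4.2 and 4.3 is in terms of an implied cost per TRL step (DSIT grant divided by the number of TRL steps targeted or realised). The sketch below illustrates that calculation; the project names, grant amounts and TRL figures are hypothetical placeholders, as per-project funding figures are not reproduced here. As noted above, progression between lower TRLs may be inherently cheaper, so this metric should be compared across projects with caution.

```python
# Illustrative cost-per-TRL-step calculation.
# Project names, grant amounts and TRL figures are hypothetical placeholders.
projects = [
    # (project name, DSIT grant in £m, starting TRL, TRL achieved to date)
    ("Project A", 5.0, 3, 5),
    ("Project B", 8.0, 2, 6),
    ("Project C", 3.5, 4, 5),
]

for name, grant_m, start_trl, achieved_trl in projects:
    steps = achieved_trl - start_trl
    if steps <= 0:
        print(f"{name}: no TRL progression realised to date")
        continue
    cost_per_step = grant_m / steps
    print(f"{name}: {steps} TRL steps realised, "
          f"~£{cost_per_step:.1f}m of DSIT funding per step")
```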

As many ONE intervention projects follow on from FRANC, these use cases will often start from a higher initial TRL than those developed under FRANC. As such, the same TRL progression in absolute terms is not expected. What is anticipated is that the funding will progress FRANC products to a level where, by the end of the ONE intervention, they can be deployed in live environments.

Figure 4.2 Intended FRANC TRL-cost progression (FRANC projects)

Figure 4.3 Currently achieved TRL-cost progression (FRANC projects)

4.4 Economic evaluation reflections

Ultimately, the early stage of the evaluation, coupled with the nature of the telecommunications industry, in which benefits tend to emerge over 5 to 10 year horizons, means that quantitative insight into the impact of the programme is difficult to ascertain at this point. Considering these limitations, we conclude that:

  • whilst there has been evidence of increased activity within the Open RAN ecosystem, there is not yet sufficient evidence that this has translated into the intended market benefits. With the ONP still running and not due to finish until 2025, we would not expect to see significant shifts within the Open RAN market. Therefore, at this stage, we have not suggested any significant changes to the underlying assumptions of the initial modelling upon which the programme business case was based

  • initial analysis of the ONP impacts (Chapter 3) suggests that the ONP is exhibiting some early positive signs of achieving the intended outcomes and impacts. These include accelerating technological maturity and improving collaboration across the current telecommunications ecosystem. However, despite these promising early signs, it is not possible at this stage to draw conclusions on the impact of the ONP on participating organisations’ market valuations, given the currently available evidence

5. Conclusions and future considerations

This section sets out the conclusions of the ICF evaluation team and answers the research questions set for the process, impact, and economic evaluation. The conclusions below reflect the assessments of the ICF evaluation team based on the evidence presented in preceding chapters, and incorporate perspectives from DSIT officials, project participants and industry representatives.

5.1 Process evaluation

How effective is the model for identifying, designing, and agreeing ONP interventions?

Overall, ONP processes were implemented effectively. DSIT has demonstrated a clear commitment to fostering competition and innovation through its initiatives to support Open RAN technologies, with evidence of active engagement with industry experts and stakeholders (section 2.1.2).

Future considerations: Project design and delivery teams should continue to interact with policy teams to ensure that the objectives of the interventions match those of the programme and logic model. This will also help to ensure that the milestones agreed as part of the GFAs and information requested by DSIT as part of project reporting are useful in informing wider Open RAN and telecoms policy across the department and considered in the design of future interventions.

How effectively did DSIT engage with the market when designing interventions and launching competitions?

DSIT engaged effectively with the market when designing interventions and launching competitions (section 2.1.1). By actively involving stakeholders and industry experts in the decision-making process, DSIT was able to tailor interventions to respond better to market needs. Through its engagement with stakeholders, including gathering feedback from project participants of earlier interventions, DSIT has evidenced that it has taken learnings on board to shape later interventions. The launch of ONP competitions was successful: the number of applicants met DSIT’s expectations, and the quality of bids received reportedly improved with later interventions. Overall, the pre-design workshops, briefing events and match-making events were well attended and viewed positively by both prospective applicants and DSIT officials (section 2.1.2).

Future considerations: Consortia of later interventions resembled those of earlier interventions and previous DSIT initiatives (e.g., the 5GTT programme). To encourage participation from new market entrants, DSIT should continue to test and promote interventions through social media, Gov.uk, UKTIN, and through Open RAN associations. To encourage greater MNO involvement in projects, DSIT could consider co-designing interventions directly with the MNOs.

How easy or difficult did bidders find it to participate in the ONP competitions? What were the enablers and barriers to participation, and to what extent were they the result of the ONP delivery model?

Competition guidance was industry-tested and published with the aim of ensuring accessibility for potential bidders. Following applicant and internal feedback from earlier interventions, DSIT simplified the application process, including removing the requirement for second-stage applicant interviews from later interventions (section 2.1.3). However, DSIT officials and project consortia members still reported challenges with the complexity of the application process (e.g., the amount of information required) and with gaps in the competition guidance (e.g., on organisations participating in multiple projects under the same intervention, the treatment of overseas working, and the percentage of funding RTOs can receive) (section 2.1.1).

Whilst DSIT’s approach to competitions did attract a diverse range of participants, some applicants made comparisons to Innovate UK processes, which they considered more familiar, less onerous and more user-friendly, including the use of its application portal (Innovation Funding Service) (section 2.1.1). The timelines for providing the level of detail currently required by DSIT for applications present a challenge for bidders, for example in obtaining sign-off across project consortia due to organisations’ internal processes, particularly when considering the legal implications of IP rights (section 2.1.1).

Future considerations: DSIT should review its competition guidance and provide clarity where gaps have been identified. The timing of when information is required from competition applicants along with the depth of information requested should be reviewed to ensure that the breadth of applications is maintained and that more consortia feel encouraged to engage. DSIT should also consider how its approach could adapt processes from other organisations (such as DESNZ, Innovate UK) to ease the administrative burden on bidders and offer greater flexibility in the formatting of documents (e.g., cash flow profiles).

Were ONP processes sufficiently flexible to accommodate differences between interventions and projects? Were they sufficiently adaptive to respond to changing circumstances, and was there sufficient learning between interventions?

ONP processes displayed a degree of flexibility in accommodating differences between interventions and projects. Whilst there was recognition from DSIT about the diversity of proposed initiatives – allowing for some adaptability across interventions and projects – certain constraints such as intervention scope limited the extent of this flexibility (section 2.1.1). R&D competition-type interventions demonstrated learning between early and later interventions (section 2.1.1). Direct award interventions demonstrated the greatest degree of flexibility, as grant recipients worked with DSIT to put in place a set of capabilities and shape the objectives of the intervention. DSIT’s ability to transfer learning between interventions and to accommodate differences between them is inhibited by current pressures on staff resourcing, whereby project managers often have multiple projects under their jurisdiction; in addition, the section of the benefits realisation templates capturing lessons learned by project participants was sometimes incomplete (section 2.3.4).

Future considerations: Given the diversity of intervention types, DSIT should allocate further resources to ensure that lessons learned can be shared across all interventions and that interventions respond adequately to market needs whilst achieving the programme objectives.

How effective were the processes for selecting delivery partners for the ONP interventions (via competitions and direct awards)?

Overall, the processes for selecting delivery partners for the ONP interventions were effective (section 2.1.1). For competition-type interventions, selection processes were clear and well communicated at bidding stage. For direct award interventions, specifically SONIC and UKTL, DSIT selected delivery partners based on their neutral commercial interests in the Open RAN ecosystem. However, not all delivery partners of the competition-type interventions are delivering outputs exclusive to Open RAN solutions. Projects such as ORanGaN and Secure 5G worked on use cases that addressed wider supply chain challenges rather than having a direct link to Open RAN (section 4.3).

Future considerations: DSIT should continue with its approach to selecting delivery partners for its interventions and give more attention to ensuring project consortia members are delivering outputs against the intervention and programme-level objectives rather than focusing solely on individual project outcomes.

How effective were project mobilisation processes including scoping and planning, and agreeing GFAs?

The mobilisation processes, specifically agreeing GFAs, proved ineffective in terms of completing GFA requirements within DSIT’s expected timeframe (section 2.2). There were challenges in defining project milestones (particularly Annex 5 of the GFAs), sometimes resulting in the agreement of superfluous project milestones against unrealistic timelines (section 2.2.2). These difficulties persisted throughout the GFA agreement phase, leading to prolonged deliberations, delays to project delivery and organisations working at risk (or not working at all). It is recognised that DSIT made improvements to streamline the process for later interventions, though it is evident from the experiences of the ONE intervention that some challenges persist. Grantees often took 3 months to sign their GFA following notification of award, and this timeline from project award to signed GFA appears to still be in place for projects within the ONE intervention. DSIT’s aim to move from project award to GFA within 4-10 weeks placed pressure on DSIT staff, who carried out a significant number of financial reviews and assessments within a relatively short space of time. Several project consortia experienced difficulties signing collaboration agreements due to the lack of clarity on IP rights. Again, DSIT officials and project consortia compared the GFA signing processes with those of Innovate UK and other government departments (e.g., DESNZ), which were often considered simpler, more familiar and requiring less onerous paperwork.

Future considerations: DSIT should review its project mobilisation processes and consider aligning its GFA signing processes with those of Innovate UK or other government departments. DSIT should also review the 4-10 week timescale it allows for signing GFAs and consider extending it to a minimum of 3 months to alleviate internal resourcing pressures, give grant recipients greater flexibility in project start dates, and avoid organisations working at risk. DSIT could also offer more support to organisations when signing collaboration agreements, specifically by providing additional guidance on addressing IP rights.

How effective have project delivery processes been in monitoring projects and ensuring they meet their objectives?

Overall, the processes for monitoring projects and ensuring they meet their objectives were somewhat effective (section 2.3). While the benefits realisation templates have the potential to provide good recordkeeping, the ICF evaluation team often found the project reporting documents to be incomplete, and project participants took inconsistent approaches to reporting (including reporting on TRL progression) (sections 2.3.4 and 2.3.5). The lack of a consistent approach to project reporting presented challenges for measuring intervention and programme success. Measuring the success of specific interventions and projects was evidently a source of contention: project consortia reported inputting superfluous milestones during the GFA signing processes to reach agreement with DSIT. This caused issues throughout project delivery, as consortia reported little flexibility from DSIT when claiming grant payments against agreed milestones. DSIT has recently changed how it responds to changing circumstances at project level, providing consortia with greater flexibility and lessening the administrative burden on organisations when submitting change requests, but it is too early to assess whether these changes have been effective.

Future considerations: While the balance between maintaining standards and accommodating unique intervention and project requirements remains a challenge, further refinement of project reporting processes is recommended to ensure that ONP processes can effectively respond to the evolving landscape of the Open RAN ecosystem. This includes agreeing a standardised approach to ensure consistency amongst DSIT officials when signing-off project change requests and project reporting, particularly the benefits realisation templates. The recent improvements to change requests should be monitored and DSIT should continue to seek feedback on whether the changes implemented have been effective.

What lessons can be learned from the ONP to inform the design and delivery of other initiatives?

The process evaluation did identify a few areas where DSIT could learn lessons that would improve future ONP interventions and other departmental initiatives. To supplement the recommendations presented throughout the report, DSIT could consider the following:

  • extending timeframes for the bidding process (for competition-type interventions), for pre-GFA due diligence requirements and processes (e.g., a minimum of 3 months) and for the duration of interventions (e.g., a minimum of 2 years for grant-funded R&D projects)

  • reviewing and adapting Innovate UK bidding and project reporting processes, which are familiar to potential applicants; this includes use of the Innovation Funding Service portal for bids and Innovate UK’s change request requirements and processes

  • giving attention to DSIT resourcing, including retention of staff and improved handover processes

  • improving technical capabilities within DSIT, particularly for project managers, as project consortia felt this would improve understanding when addressing issues as they arise (e.g., project delays, supply chain issues, change requests, signing off project milestones)

  • offering more flexibility in the GFA signing processes, specifically exploring ways to reduce the administrative burden on organisations, including a reduction in the amount of detail required from project leads and their partners

  • providing more guidance to participants on how to address IP in the collaboration agreements and how to report on TRLs (guidance has now been provided on how to score TRLs for ONE projects, but not on which elements of the project this should apply to)

  • improving information management and storage of files

  • implementing processes to ensure DSIT personnel follow the same standards and processes for receiving and approving documents

5.2 Impact evaluation

Has ONP led to additional activity that contributes to programme objectives?

The ONP has led to additional activity that contributes to the programme’s objectives (sections 3.2 and 3.3). Specifically, the ONP has demonstrated early signs that it has:

  • accelerated the development of several Open RAN products. This is evidenced by several FRANC projects that were able to progress their solutions from concept (TRL2 or TRL3) to trial stage (TRL4 or TRL5). As expected at this early stage, most ONP projects have not (yet) developed Open RAN products to the point where they are market ready

  • provided Open RAN vendors with an environment in which they can test Open RAN products and has helped to catalyse new relationships. This is evidenced by the creation of SONIC labs, which has resulted in more R&D and product development work being done in the UK, stimulated by the availability of an interoperability testing facility in the UK. It has also provided access for smaller product providers who would not otherwise have access to larger firms’ and MNOs’ equipment and testing facilities, helping to create new commercial relationships between suppliers. However, the absence of an outdoor testing facility has limited the validity of some of the tests that could be undertaken at the SONIC lab

  • supported international collaboration through participation of international organisations in project consortia. Industry interviewees commonly believed that through the ONP, the UK has indicated to the world that it is serious about Open RAN. Whilst it is important to caveat that the majority of these interviews were held with domestic stakeholders, it nevertheless provides a positive indicator of how the UK is perceived within this space. The ONP has also created networking opportunities, and several project consortia members highlighted that they have attended UKTIN events and found them very helpful for interacting with other members of the ecosystem and sharing knowledge

Future considerations: DSIT should continue to support SONIC labs with the deployment of their planned outdoor testing environment. DSIT should also continue to widely communicate its investment in Open RAN initiatives to showcase the UK as an attractive place for Open RAN R&D. DSIT should continue to create environments for networking and knowledge sharing to maximise the amount of collaborative opportunities between different actors in the Open RAN ecosystem.

What progress has been made in implementation of intervention objectives? Has ONP accelerated progress towards implementation?

The progress which has been made in the implementation of intervention objectives is detailed in section 3.2. In summary:

  • it is evident that the earlier interventions (e.g., FRANC, NeutrORAN) are accelerating the maturity of Open RAN products and solutions. However, products supported through these projects are typically not yet market ready, as expected at this early stage. See section 3.2.1
  • interoperability tests have been carried out on Open RAN products. SONIC labs has helped Open RAN equipment providers test the interoperability of different hardware and software combinations. SONIC labs plans to improve the availability of an outdoor testing environment, which is key to the credibility of tests and their usefulness to MNOs. See section 3.2.2
  • ONP-funded projects are still in the process of developing Open RAN products and therefore it is too early to assess whether this has increased confidence in and adoption of Open RAN products by MNOs. See section 3.2.3
  • UKTIN has been designed to help evolve supply chain relationships and networks within the Open RAN innovation ecosystem. While the UKTIN has supported networking and knowledge sharing within the ecosystem, it is too early to assess whether it has catalysed the UK Open RAN innovation ecosystem. See section 3.2.4
  • ONP projects have enabled international collaborative R&D through the participation of non-UK based firms in project consortia across the interventions (including FRANC, FONRC, NeutrORAN, UK-ROK R&D, and ONE). See section 3.2.5

Future considerations: Though the government can only have a limited impact in this area, as it requires global change, DSIT should, for the remainder of the programme, focus on projects that include later-stage R&D with higher TRLs and projects that involve working with MNOs deploying Open RAN solutions on their live networks – as is the case for some ONE projects.

Is ONP helping shape future telecoms networks through early signals to vendors and MNOs?

So far, the ONP has had some success in shaping future telecoms networks (sections 3.2 and 3.3). Discussions with programme participants suggested that the ONP, alongside other international government funding, is giving Open RAN vendors encouragement to continue development. It is hoped this will result in improved product offerings and trigger an increase in MNO confidence in Open RAN.

The evidence in the impact and economic evaluation sections suggests that the programme has helped to progress the maturity of individual Open RAN solutions; however, as expected at this early stage, the majority of solutions are not yet at a market readiness or deployment stage, and so cannot currently be adopted by mobile operators. Based on feedback from our interviews with the MNOs, 2 continue to consider commercial deployment of Open RAN to be a long way off and have not yet seen an impact from the ONP on their Open RAN plans. However, as anticipated at this early stage, most evidence comes from lower TRL FRANC projects, meaning most available solutions are not yet market-ready.

A key policy aim for telecoms diversification interventions is the adoption and deployment of open networks by MNOs. To achieve this, the ONP has ongoing HDD projects, which will provide crucial case studies for Open RAN technologies in live networks. Many of the ONE projects are more likely to have an impact closer to 2025. Vodafone, which has the most extensive plans to deploy Open RAN, indicated that its deployment plans have been unaffected by the ONP. VMO2 has announced it will deploy Open RAN with Mavenir, but the extent of this deployment is still uncertain. Based on the interviews and public knowledge, BT and Three are both still trialling Open RAN but have no plans for commercial deployments at scale, as they do not consider the technology and associated operational support to be sufficiently mature.

So far, ONP projects have mostly focused on technical research and laboratory testing rather than on live network trials that could lead on to commercial and scalable Open RAN deployments. Though the R&D in the projects listed above is useful, many of the completed or near-complete projects involve R&D that is too early-stage to impact MNO deployments in the short term.

Furthermore, the Memorandum of Understanding signed between the government and the 4 UK MNOs is a clear statement that future telecoms networks will be based on open architectures. However, early indications suggest that this might ultimately be achieved through the existing Tier 1 vendors adopting Open RAN technology to some degree.

Future considerations: As above, DSIT should continue to encourage MNOs to have greater involvement in the ONP projects through undertaking live trials of the developed solutions and providing useful real-world data to other projects. It is too early to judge the effect the ONP has had on future telecoms networks, but it is clear that the work of the ONP is being noticed by both MNOs and vendors and will hopefully over time prompt further investment and support for Open RAN.

Is the ONP creating opportunities for new technical and commercial partnerships and enabling development of the R&D ecosystem?

The ONP has been successful in creating partnerships between different players in the Open RAN ecosystem (sections 3.2.4 and 3.2.5). Our assessment, based on project participant interviews, is that participation in the ONP made the MNOs more open to meeting smaller organisations, since their role in the ONP gave these organisations more ‘credibility’ (section 3.2.3). Interviewees representing the universities involved in ONP projects appreciated the opportunity to work alongside MNOs and vendors to help guide and support their research. In terms of enabling development of the R&D ecosystem, the ONP has supported early-stage R&D. However, as discussed above, it is important that future funding moves on to projects with higher TRL targets. The ONE intervention has begun to include such projects.

Future considerations: Further involvement from MNOs is needed to help drive the direction of research. DSIT should ensure that any future projects include at least one MNO to help steer the product development towards developing practical solutions and use cases. Ideally, any future projects would also involve live deployments on the networks of the MNO involved.

To what extent has ONP created knowledge sharing and learning across the ecosystem?

The ONP has been successful in enabling knowledge sharing and learning across the Open RAN ecosystem (sections 3.2 and 3.3). The UKTIN was established by DSIT to facilitate collaboration across ONP projects and the wider Open RAN ecosystem. Several of the smaller companies involved in ONP projects said that they frequently attend UKTIN events and have found them very helpful for interacting with other members of the ecosystem and sharing knowledge. Dissemination of knowledge at the end of projects will enable further learning across the ecosystem. Several projects intend to do this at Mobile World Congresses whereas others intend to make outputs available through project websites.

Future considerations: DSIT should continue its support for initiatives such as UKTIN to enable collaboration amongst ecosystem players and ensure this effort reaches all areas of Open RAN. DSIT should further encourage each funded project to have a clear plan in place to disseminate its findings at the end of the project.

5.3 Economic evaluation

What is the estimated value for money of the programme at this early stage?

At this stage it is too early to assess the value for money of the ONP (section 4.2). The long-term benefits of Open RAN investment are not expected to fully materialise until at least 2030, so it is not surprising that no significant changes to the composition of the domestic telecommunications industry are yet visible. Analysis based on the available TRL progression data suggests that the ONP to date has led to encouraging TRL advancements and that increased DSIT funding has had a beneficial impact on the level of TRL progression realised (section 4.3). This progression is comparable to other DSIT interventions in this space, such as the 5GTT programme. It is not yet possible to confidently measure the impact of the ONP on participating organisations’ market valuations given the current lack of available evidence. As noted above, this was always likely to be the case given the early stage of the intervention, but such benefits are anticipated to materialise over the medium to long term; R&D projects by their nature tend to demonstrate economic benefits over a longer time horizon than other areas. Assessing whether these benefits for participating organisations materialise will be an increasingly key area of focus for subsequent evaluations.

6. Recommendations for future evaluations

With the ONP still in progress and due to complete in 2025, there are some areas where it is too early to gauge the effectiveness of the programme. In addition, changes have been made to the delivery of the programme which are being implemented in later stage ONP interventions. The scope of future evaluations will depend on when they take place, but we have laid out some thoughts on areas which should be considered as part of a later study:

  • there will be value in a ‘light touch’ process assessment focused on how far the issues identified in this evaluation, such as the weaknesses in the measurement of benefits realisation (including TRLs), have been addressed effectively. A future process evaluation could also usefully extend to cover those interventions where limited evidence was available for the current study
  • the primary focus of the next evaluation(s) will need to be on an impact evaluation, in particular:
    • updating the current impact assessment and extending this to interventions which could only be considered in a preliminary way, such as the ONE intervention. It could also apply to those which had not progressed to the point where they could be included, with emphasis on interventions focused on developing international links (e.g., South Korea and Japan)
    • considering the extent to which outputs in terms of developing Open RAN technologies are crossing the potential ‘valley of death’ to adoption and starting to or likely to realise associated expected benefits such as market disruption and enhanced competition
  • there will also need to be an enhanced focus on economic evaluation, considering in particular the value of the impacts which are being generated and those which can reasonably be expected to arise based upon the additional evidence then available. These can then be compared both with the projections within the business case and the expected final costs of the programme

Acronyms

5GTT 5G Testbeds and Trials Programme
BBU Base Band Unit
C-RAN Centralised Radio Access Network
CU Centralised Unit
DCMS Department for Culture, Media and Sport
DESNZ Department for Energy Security and Net Zero
DSIT Department for Science, Innovation and Technology
DU Distributed Unit
EOI Expression of Interest
FONRC Future Open Networks Research Challenge
FRANC Future Radio Access Network Competition
GFA Grant Funding Agreement
HDD High Demand Density
IP Intellectual Property
KPI Key Performance Indicator
MNO Mobile Network Operator
NeutrORAN NeutrORAN Project
ONE Open Networks Ecosystem Competition
ONP Open Networks Programme
RAN Radio Access Network
RIC RAN Intelligent Controller
RTO Research and Technology Organisation
RU Radio Unit
SME Small and medium-sized enterprises
SONIC SmartRAN Open Network Interoperability Centre
Standards and SEP Standards and Standard Essential Patents
TCO Total Cost of Ownership
TDA Technical Design Advisor
ToC Theory of Change
TRL Technology Readiness Level
UK-Japan R&D UK – Japan Research and Development Competition
UK-ROK R&D UK – Republic of Korea Open RAN R&D Collaboration
UKTIN UK Telecoms Innovation Network
UKTL UK Telecoms Lab
VMO2 Virgin Media O2
V-RAN Virtualised Radio Access Network

Annex: Acknowledgements

Prepared by:

  • Andrew MacKinnon
  • Ryan Harding
  • Amit Nagpal
  • Harry Madden
  • Jonathan Wall
  • Lee Sanders
  • George Barrett

Checked by James Leather and Matt Bassford.

This report is the copyright of Department for Science, Innovation and Technology and has been prepared by ICF Consulting Services Ltd under contract to Department for Science, Innovation and Technology. The contents of this report may not be reproduced in whole or in part, nor passed to any other organisation or person without the specific prior written permission of Department for Science, Innovation and Technology.

  1. Proposal development processes are those that were followed to define the scope, beneficiaries, and objectives of each ONP intervention. 

  2. Mobilisation requirements and processes consist of activities undertaken up to and including the signature of the Grant Funding Agreements that were signed with all projects funded through the ONP. 

  3. Monitoring processes include the project management, benefits realisation monitoring, and project assurance processes. 

  4. In order to create an end-to-end system or solution using open networking products from different vendors, SONIC Labs supports open network companies that have functional working products. It supports complete end-to-end integration, including support for performance and interface testing, as well as security in relation to standardised testing. 

  5. A total of £295.5 million of capital funding has been made available by HM Treasury, of which £50 million was made available at Spending Review 2020 and a further £249.5 million at Autumn Statement 2021. The programme also has a £27 million budget for resource funding, to cover the costs of programme implementation. 

  6. Source: DCMS (2022) Programme Business Case - Open Networks Programme (Unpublished). 

  7. Source: Frontier Economics (2023). Open Networks Research and Development Fund Baseline Study. 

  8. Source: DCMS (2022) Programme Business Case - Open Networks Programme (Unpublished). 

  9. TRLs are a technology management tool that provides a common measurement system to assess the maturity of evolving technologies. The scale goes from TRL1 (basic principles observed and reported) to TRL9 (at the point of commercialisation). 

  10. Source: Overview of Open Testing and Integration Centre (OTIC) and O-RAN Certification and Badging Program, White Paper, April 2023, p2. 

  11. UKTIN (October 2023) Unpublished UKTIN Quarterly Report (QR4) 

  12. Source: Unpublished UKTIN survey. Base: organisations involved in Open RAN (n=29). 

  13. Source: Unpublished UKTIN Quarterly Report. Based on feedback from 33 event attendees. 

  14. A tier 2 vendor refers to an organisation of intermediate size and influence within the telecommunications industry. These entities are important for adding competition to a market typified by a small number of large tier 1 vendors. 

  15. Source: Vodafone’s approach to Open RAN systems integration