Research and analysis

Local Growth Fund and Getting Building Fund: initial evaluation feasibility assessment

Published 11 January 2024

Applies to England

Executive summary

Overview

The Local Growth Fund (LGF) was announced in 2013 as a single £12 billion pot of devolved funding aimed at delivering a range of initiatives to support local economic growth. The Getting Building Fund (GBF) was a £900 million COVID-19 recovery fund, aimed at stimulating economic growth by investing in ‘shovel ready’ projects. In many ways, GBF was a continuation of LGF – it followed chronologically, made use of delivery processes that had been established under LGF, and delivered similar (sometimes the same) interventions, albeit with somewhat different objectives, which were driven by the COVID-19 pandemic.

The Department for Levelling Up, Housing and Communities (DLUHC) commissioned Steer Economic Development (Steer-ED) to examine the feasibility of conducting process, impact, and value for money evaluations for LGF and GBF. This report provides an initial feasibility assessment based on Steer-ED’s work. Given the similarities and sequencing of the two funds, they are considered with a view to conducting a joint evaluation.

The recent announcement of the withdrawal of central government support for Local Enterprise Partnerships (LEPs) is highly relevant to considerations of evaluation timing. LEPs are undergoing a transition up to March 2024, with some transitioning into local authority functions and others being dissolved. Initial scoping work suggests that key stakeholders may become harder to contact and data held by LEPs may become less accessible as part of this transition, meaning that it is important to conduct some key evaluation activity as soon as possible.

The methodology for undertaking this feasibility assessment involved a range of workstreams including workshops, scoping interviews, document reviews, logic model development, refinement of project typologies, a literature review, scoping of secondary datasets, and expert interviews.

Process evaluation

Steer-ED’s initial feasibility and scoping work has indicated a good level of knowledge and available documentation for both LGF and GBF. They conclude that it is feasible to proceed with a process evaluation, which will offer valuable insights for future programmes. They recommend completing this as a joint LGF and GBF process evaluation. This is for two reasons: first, due to the similarities in processes, delivery, and interventions and the chronological sequencing of the funds, a joint evaluation will allow for the gathering of shared insights on their interconnected processes. Second, a joint evaluation will enhance efficiency – minimising duplicated work and burden on stakeholders.

Their recommended process evaluation approach includes depth interviews, document reviews, tailored workshops, and consolidation of monitoring data to gain a comprehensive understanding of the funds’ processes, development and delivery.

Impact and value for money evaluation

Steer-ED’s initial feasibility and scoping work has highlighted many challenges with completing impact and value for money evaluations. These include, for example, the difficulty of identifying a counterfactual, as no LEPs went unfunded, and the presence of other confounding effects (e.g. other funding in the same areas). These make it difficult to identify comparison places for robust evaluation and to isolate the impacts of LGF and GBF from confounding activity.

A mixed-methods approach, if possible, is likely the most appropriate way to evaluate the funds. While some qualitative methods are likely to be feasible, these alone will not provide a robust and comprehensive evaluation of the programme. On the other hand, there are significant uncertainties and risks around whether robust quantitative methods will be feasible.

Value for money evaluation feasibility depends on whether impacts can be estimated robustly and can be said to have resulted from the funding. Quantifying or monetising observed impacts or benefits is a poor use of resources if the impacts are not attributable to the intervention. Even where impacts are attributable, it may still not be possible to monetise some or all of them.

Further investigation is needed to better understand the feasibility of completing impact and value for money evaluation.

Next steps for evaluation

DLUHC is progressing next steps for evaluating LGF and GBF swiftly. Key activities which require involvement from LEPs will be prioritised, given the risks to stakeholder engagement and data accessibility highlighted above regarding the LEP transition. Therefore, DLUHC has commissioned Steer-ED to:

  1. Conduct a joint process evaluation of LGF and GBF. This will be completed and findings made available in 2024.

  2. Collect available data from LEPs and identify, where possible, appropriate counterfactuals as part of further feasibility investigation for impact and value for money evaluation. An updated feasibility assessment will be published when this activity is completed.

1. Introduction and methodology

Chapter overview

Scope and purpose of this report

The Local Growth Fund (LGF) was announced in 2013 as a single pot of £12 billion of devolved funding aimed at delivering a range of initiatives to support local economic growth, running from 2015 until 2021. Of this, £7 billion was flexible funding to be allocated by local areas and managed by DLUHC (formerly the Ministry of Housing, Communities and Local Government and, before that, the Department for Communities and Local Government); this flexible funding is the scope of this evaluation.

The Getting Building Fund (GBF) was a COVID-19 recovery fund worth £900 million, aimed at stimulating economic growth by investing in ‘shovel ready’ projects, announced in 2020 and running until 2022.

The Department for Levelling Up, Housing and Communities (DLUHC) commissioned Steer Economic Development (Steer-ED) to undertake a study to explore and understand the feasibility of conducting process, impact, and value for money (VfM) evaluations of the Local Growth Fund (LGF) and the Getting Building Fund (GBF). This report, published in January 2024, provides an initial feasibility assessment based on Steer-ED’s work. DLUHC is committed to understanding the impact of local growth funding, as set out in the department’s evaluation strategy. There are, however, multiple challenges associated with conducting impact and value for money evaluation of funds such as LGF and GBF (which are heterogeneous in intervention, spatial in distribution, and devolved in administration), as highlighted, for example, by the National Audit Office.

The report outlines an initial feasibility assessment for process, impact, and VfM evaluation. It also presents the proposed research questions for evaluation and potential approaches to answer them. This initial feasibility assessment was commissioned by DLUHC to help us make an informed decision on how best to take forward the evaluation.

Methodology

Presented below is an outline summary of the methodologies deployed in undertaking the feasibility assessment completed by Steer-ED.

Process scoping

The fieldwork undertaken as part of process scoping included the following:

  • evaluation purpose workshop – a workshop to gather stakeholder views on the purpose of the evaluation and to inform lines of enquiry for the scoping calls

  • context scoping interviews (x4) – interviews with key civil servants to provide a comprehensive understanding of the two funds

  • document review – a review of documents provided by DLUHC, sourced following suggestions made by process scoping interviewees, or independently by Steer-ED. The review supported the population of Theories of Change, Process Maps, and a Timeline

  • logic model workshop – a workshop with stakeholders from DLUHC to test and refine draft logic models

  • process scoping interviews (x14) – semi-structured depth interviews with key stakeholders

Impact and VfM scoping

The fieldwork undertaken as part of impact scoping consisted of 8 workstreams, across 4 key stages, as set out in Figure 1.1.

Figure 1.1: Workstreams undertaken as part of impact scoping

Source: Steer-ED, 2023

Plain-text: Diagram showing the workstreams undertaken for impact scoping:

  1. Understanding LGF/GBF
  2. Development of monitoring data framework
  3. Methodological considerations
  4. Appraise evaluation methods

The workstreams comprised:

  • Development of programme and intervention logic models, which were developed through a collaborative and iterative process with the project steering group.

  • Development of project typologies – Steer-ED developed a typology of project objectives/impacts and mapped these against project spend details. This was achieved via review of monitoring data, a review of a sample of detailed project descriptions, and through desk review of relevant policy documentation. In particular, Steer-ED built upon a fund typology developed in 2016 as part of LGF business case development.

  • Preparation of monitoring data and scoping of secondary datasets – a detailed review of all LGF and GBF monitoring data held by DLUHC was undertaken. This data is compiled by DLUHC from individual LEP monitoring data returns, which were submitted on a quarterly or (in some cases) biannual basis. The review focussed on: understanding the data available; identifying what gaps exist in the data; and assessing the quality and reliability of the data. Data scoping also involved assessing relevant secondary datasets that may be of use for evaluation – at either the programme or intervention level. Consideration was given to the availability, timeliness and relevance of each dataset.

  • Scoping interviews with internal experts with a range of relevant expertise discussing, for example, best-practice evaluation approaches, similar evaluation commissions, relevant datasets (both at LEP and national level), and VfM approaches. Consultees included representatives from the Office for National Statistics (ONS), DLUHC, Department for Transport (DfT), Homes England, Department for Energy Security and Net Zero (DESNZ), What Works Centre for Local Economic Growth, one LEP and one local evaluator commissioned by a LEP. Steer-ED also drew on their expert panel including an academic econometrician and senior level economists spanning different aspects of the funds.

  • Literature review – a non-comprehensive review of relevant impact evaluation literature was undertaken to gather insights on methods used, relevant datasets, and key challenges encountered by other evaluations of large, capital-focused funds. In total, 27 documents were shortlisted for review, examining subjects including national funds, evaluation guidance, rail, skills, area-based initiatives, and others.

Structure of this document

The remainder of this document is structured as follows:

  • chapter 2 provides an overview of the funds, a timeline to show the different stages involved in development and delivery of the funds, and some detail on the projects supported through the funds
  • chapter 3 presents an initial assessment on the feasibility and robustness of conducting process evaluation
  • chapter 4 presents an initial assessment on the feasibility and robustness of conducting impact and value-for-money evaluation
  • chapter 5 outlines the next steps for evaluation activity

2. Understanding LGF and GBF

Chapter overview

LGF was announced in 2013 as a single £12 billion devolved funding pot aimed at delivering a range of initiatives to support local economic growth. GBF followed: a £900 million COVID-19 recovery fund aimed at stimulating economic growth by investing in ‘shovel ready’ projects.

This chapter provides a detailed description of the two funds, covering their origin, objectives, and how they were delivered to support local economic growth. It includes a timeline from 2013 to 2025, highlighting important dates for the funds and significant contextual factors which shaped them. Finally, the chapter takes a closer look at project typologies and funding distribution across the different project types.

Introduction to the Funds

Introduction to the Local Growth Fund

LGF was announced by the government in June 2013 and provided funding to local areas to support economic growth. It brought together funds from several government departments into a ‘single pot’ and gave local areas, through Local Enterprise Partnerships (LEPs), responsibility to determine how the funding should be best spent to meet local needs.

LGF can be seen in the context of the government’s move to devolve power to local areas and its desire to implement the recommendations of Michael Heseltine’s ‘No Stone Unturned’ report. The report advocated for the creation of a single flexible pot and the devolving of decision-making to areas that reflected natural economic geographies.

There were 3 rounds of LGF between 2015 and 2021, totalling £12 billion. Of this, £7 billion was flexible funding to be allocated by local areas and managed by DLUHC, and is the scope of this evaluation. The remaining £5 billion was managed by other government departments/agencies and comprised £2 billion delivered through the first round of Growth Deals and managed by the Department for Transport (DfT), £0.4 billion for a Housing and Skills Budget, £2 billion for a Home Building Fund, and £0.45 billion for Transport Majors Funding. Table 2.1 shows the breakdown of funding for the £7 billion flexible funding pot, that was managed by DLUHC, from the contributing government departments.

While the £7 billion flexible funding managed by DLUHC is the focus of this evaluation, there has been other evaluation activity on LGF funding. DfT has been overseeing evaluations of large LGF transport schemes that were retained by the department. According to the guidance set out in the 2012 Monitoring and evaluation framework for local major schemes, LGF retained transport schemes have been required to submit M&E plans for review and sign-off by DfT, along with 1-year-after and 5-year-after evaluation reports. The findings from LGF-funded and other similar types of highways, public transport and integrated schemes have been synthesised through periodic meta-evaluations, the most recent of which was published in 2022.

Table 2.1: Flexible LGF funding allocation, by department

Government Department Funding Amount
Department for Transport (DfT) £4,929m
Department for Communities and Local Government (DCLG) £1,113m
Department for Education (DfE) £980m
Department for Business, Energy and Industrial Strategy (BEIS) £50m
DCLG Broadband £9m
Total £7.1bn*

Source: DLUHC, 2023. *Paid directly to LEPs via DLUHC

Introduction to the Getting Building Fund

GBF aimed to support the economic recovery from the COVID-19 pandemic. It was designed to provide a short-term financial stimulus at a time when the Office for Budget Responsibility was predicting a recession and high unemployment, and when the construction sector was effectively halted due to COVID-19 lockdowns. It invested in ‘shovel ready’ projects, meaning that it was designed primarily to get existing projects or well-developed proposals moving.

GBF ran for 18 months from 2020 to 2022, totalled £900 million, and was delivered through LEPs using processes established under LGF. In many ways, GBF was a continuation of LGF – it followed chronologically, made use of delivery processes that had been established under LGF, and delivered similar (sometimes the same) interventions. However, it is important to note that the context and rationale for the two schemes were different, reflecting the differing economic drivers at the time of creation.

Documenting the funds

Given the similarities and sequencing of the two funds, this feasibility assessment was conducted jointly for LGF and GBF. The following materials were produced to document Steer-ED’s understanding of the funds and provide the foundation from which the proposed process evaluation research questions and methodology were developed:

  • Logic Models for each fund, developed using information from LGF and GBF business cases, as well as intervention-level Logic Models, which were adapted from those provided in the 2016/17 LGF business case. These Logic Models summarise the anticipated causal linkages from context and rationale through objectives, inputs, activities, outputs and outcomes to impacts. They will continue to be refined and tested as part of the process evaluation and further feasibility investigation on impact evaluation, and will be published as part of this activity.

  • A definitive timeline that outlines the key dates, contextual factors, changes, and events for the two programmes, from 2013 to 2025. The timeline is shown in Figure 2.1 overleaf.

  • High level process maps that outline the key processes and stages involved in programme development, delivery and monitoring of the funds.

Institutional memory is a challenge for LGF, with knowledge of the programme spread diffusely across a large number of current and former officials. This was less of an issue for GBF, where the delivery period was more recent.

Background research included reviewing a series of business cases for LGF that were developed and updated as part of securing LGF funding from the contributing government departments and for the creation of GBF.

Figure 2.1: Timeline for LGF and GBF

Source: Steer-ED, 2023

Plain text: Timeline diagram showing the stages of LGF and GBF from 2013 to 2025.

LGF begins in 2013 with the development of LEPs’ Strategic Economic Plans and ends in 2025 with the end of monitoring. Interim stages include Growth Deals being announced and monitoring arrangements being detailed or updated.

GBF begins in 2020 with the outbreak of COVID-19 and LEPs submitting their list of ‘shovel ready’ projects. GBF ends in 2025 with the end of monitoring.

LGF and GBF portfolios

Both LGF and GBF have supported a diverse range of projects, with different outputs, timeframes and beneficiaries. Although all projects are ultimately designed to lead to local economic growth impacts (measured in terms of changes in employment or productivity), these benefits are delivered via a range of different mechanisms, and intermediate outputs and outcomes will vary significantly between intervention types. These differences are important drivers of decisions to determine evaluation methodology, in particular for the design of project or intervention-level evaluation methods.

An initial typology of LGF projects was developed by ICF Consulting Ltd on behalf of the then Department for Business, Innovation and Skills, to categorise the projects funded in the first round of Growth Deals. Steer-ED have built upon the ICF typology and refined it by adding Regeneration/Public Realm, Culture/Tourism and Green Recovery as new categories to better reflect the projects that were ultimately funded. Table 2.2 provides a summary of the types of projects supported.

Table 2.2: Typology for LGF and GBF project categorisation and analysis

High level Sub-category Types of projects supported
Transport Road Improvements New roads and road resurfacing; Junction improvements; and Multi-modal improvements e.g. cycle or pedestrian routes.
Transport Urban Sustainable Transport Park and ride schemes and railway station car parks; Cycle routes or improvements; and Active travel schemes.
Transport Rail Transport Railway infrastructure (e.g. platform improvements); Station infrastructure and facilities; and Wider public transport improvements e.g. access at stations.
Skills Skills Capital Investment in the further education estate; Development of new courses; and Other skills projects including access to equipment.
Site development Employment Site remediation, site access and construction of new commercial space.
Site development Innovation Investment in innovation spaces and centres of excellence.
Site development Housing Site remediation and site access enabling future housing development.
Site development Regeneration / Public-Realm (Added to ICF typology) Community regeneration projects; Investment in public/green spaces; and Public realm investment supporting wider regeneration.
Site development Culture / Tourism (Added to ICF typology) Development of new cultural or tourist facilities; Repurposing of existing buildings for entertainment use; and Development/refurbishment of sporting facilities.
Economic development Business Support Capital grants to businesses to support equipment/fit-out; Non-financial business support/advice; and Investment in workspace designed for start-up/scale-up businesses.
Economic development Flood Management Investment in flood defence projects to protect homes or businesses or enable new development.
Economic development Digital / Broadband Infrastructure Expanding access to superfast fibre broadband; Supporting rollout of 5G infrastructure.
Green recovery (GBF) Green Recovery (Added to ICF typology) Investment in EV infrastructure; Retrofit schemes.

Source: Steer-ED, 2023

Project delivery

DLUHC is continuing to monitor projects with outstanding spend and those which, although financially complete, continue to capture outputs. Analysis of outputs recorded within monitoring data shows that some projects still have forecast outputs outstanding. This is especially relevant for some intervention types, with housing projects in particular likely to have outstanding outputs.

Funding breakdown across projects

This section provides a breakdown of projects across intervention types, including the value of funding and typical project sizes for each. As shown in Table 2.3, the two largest categories of LGF projects, in terms of number of projects and spend, were Transport and Skills Capital. The next most substantial categories, together making up just over 30% of the total, concerned the development of employment, innovation, housing, public realm and culture/tourism sites, and business support. Flood management and digital infrastructure projects between them were allocated only 3.6% of total LGF funding.

It should be noted that at this stage, due to the limitations of the monitoring data, Steer-ED found it was not possible to break transport projects down further into the road/rail/sustainable transport sub-categories. However, based on a sample of 75 randomly selected projects, they estimate that nearly half (48%) of the 775 transport projects concerned road improvements, a third (33%) were urban/sustainable transport projects, and around 8% were rail projects[footnote 1].

Table 2.3: Distribution of LGF funding for each category

Category Total number of projects Total allocation % of total allocation Average project size Project size range (lowest to highest)
Transport 775 £3,172m 44.5% £4.15m £8k - £68m
Skills Capital 561 £1,214m 17.0% £2.18m £1k - £40m
Regeneration/ Public Realm 148 £272m 3.8% £1.8m £25k - £20m
Business Support 141 £420m 5.9% £3.1m £14k - £43m
Innovation 191 £515m 7.2% £2.7m £20k - £20m
Employment 225 £689m 9.7% £3.1m £14k - £41m
Housing 79 £300m 4.2% £3.9m £70k - £20m
Culture/Tourism 68 £132m 1.9% £2.0m £31k - £12m
Flood Management 56 £168m 2.4% £3.0m £120k - £13m
Digital/Internet Infrastructure 47 £86m 1.2% £1.8m £47k - £11m
Other 74 £159m 2.2% £2.2m £1k - £26m
Total 2,365 £7,129m* 100% £3.27m £1k - £68m

Source: LGF monitoring spreadsheet, July 2022 & Steer-ED, 2023. Note: figures are rounded. For example, percentages are rounded to one decimal place. * includes previously ringfenced funding released by DfT directly to LEPs
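
The figures in Tables 2.3 and 2.4 (project counts, total allocations, shares of the total and average project sizes) are descriptive summaries of the project-level monitoring data. As a purely illustrative sketch of how such a summary could be produced, the snippet below computes the same statistics from a hypothetical project-level extract; the file name and column names are assumptions, not the actual DLUHC monitoring schema.

```python
# Illustrative sketch only: reproduce the descriptive statistics reported in
# Tables 2.3 and 2.4 from a project-level monitoring extract. The file name
# and column names ("category", "allocation_gbp") are hypothetical.
import pandas as pd

projects = pd.read_csv("lgf_monitoring_extract.csv")  # hypothetical extract

summary = projects.groupby("category")["allocation_gbp"].agg(
    total_projects="count",
    total_allocation="sum",
    average_project_size="mean",
    smallest_project="min",
    largest_project="max",
)
# Share of total allocation, as reported in the "% of total allocation" column
summary["share_of_total"] = summary["total_allocation"] / summary["total_allocation"].sum()

print(summary.sort_values("total_allocation", ascending=False))
```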

Table 2.4 sets out the distribution of funding for GBF. The rationale and context for GBF centred on expediting ‘shovel ready’ projects as part of the economic recovery from COVID-19. Table 2.4 reflects this different focus and suggests the following conclusions:

  • GBF funded a much lower proportion of transport schemes (44.5% for LGF compared to 16.9% for GBF).
  • GBF funded a lower proportion of road improvements projects (approx. 16% for LGF compared to 5.1% for GBF).
  • GBF supported a greater proportion of site development projects, especially those primarily focussed on creating employment space (9.7% for LGF compared to 16.9% for GBF).
  • the focus on green recovery for GBF is reflected in the allocations.

Although the proportions of projects in each category were different between LGF and GBF, the types of projects in each remained broadly similar, so evaluation methodologies for LGF remain applicable to GBF. Project categorisation has been undertaken by Steer-ED based on existing GBF categorisation and analysis of project descriptions.

Table 2.4: Distribution of GBF funding for each category

Category Total number of projects Total Allocation % of total allocation Av. Project size Project size range (lowest to highest)
Road Improvements 20 £46m 5.1% £2.40m £183k - £10m
Rail Transport 7 £59m 6.5% £8.37m £1m - £15m
Urban Sustainable Transport 20 £45m 5.0% £2.26m £350k - £6m
Skills Capital 56 £77m 8.6% £1.43m £3k - £12m
Employment 60 £152m 16.9% £2.62m £88k - £9m
Innovation 44 £126m 14.0% £2.93m £200k - £12m
Housing 9 £26m 2.9% £2.93m £500k - £6m
Business Support 27 £50m 5.5% £1.92m £200k - £7m
Culture/ Tourism 19 £59m 6.5% £3.09m £194k - £8m
Digital/ Internet Infrastructure 28 £44m 4.8% £1.61m £120k - £6m
Flood Management 2 £5m 0.5% £2.41m £500k - £4m
Regeneration/ Public Realm 33 £121m 13.5% £3.67m £564k - £23m
Green recovery 40 £72m 8.0% £1.94m £11k - £14m
Other 8 £5m 0.6% £671k £140k - £3m
Unallocated funds   £13m 1.5%    
Total 373 £900m 100% £2.45m  

Source: GBF monitoring data Q4 2022/23 & Steer-ED, 2023. Note: figures are rounded. For example, percentages are rounded to one decimal place.

3. Process evaluation

Chapter overview

Steer-ED conclude that a robust process evaluation of the funds can be undertaken. Process evaluation is analysis of the implementation and design of a programme, and could provide valuable lessons to inform the design of future programmes. This chapter sets out the proposed process evaluation research questions and their underpinning rationale. It then summarises Steer-ED’s proposed evaluation workstreams to respond to those questions.

Given the similarities in delivery model, stakeholders and processes of the funds, Steer-ED suggest that opting for a joint process evaluation, rather than conducting two separate evaluations, would be preferable. A joint process evaluation would bring the following benefits:

  • avoiding duplication of desk work – for example around understanding administrative processes and refining Logic Models
  • removing the need to speak to stakeholders more than once – reducing burden for stakeholders and saving study budget
  • allowing for a comprehensive consideration of the relationship between the two funds in terms of their aligned delivery models

A joint evaluation has the above benefits, but would also need to recognise the differences between the two funds in terms of scale, duration, economic context and objectives.

Research questions

The scoping interviews, document review and workshops provided an understanding of the level of information and knowledge that exists about the two funds. Key themes emerged from the scoping which support further investigation of the logic in the Logic Models, of key events in the Timeline, or of the implications of key processes. These form the basis for the proposed process evaluation questions.

The proposed questions have been grouped into three categories, with the focus and rationale of each as follows:

  • Design and delivery: questions focusing on the overall design and practical delivery of the fund. This includes, for example, the effectiveness of the design and delivery of the fund in achieving the fund’s objectives; aspects of delivery that have worked well or less well; and any barriers/ enablers in delivery and how the fund responded to them. These questions are important to generate learning for future schemes through understanding the delivery model and its effectiveness.

  • Governance, management and monitoring: questions which will focus on evaluation of the effectiveness of the programme’s governance, management and monitoring processes. This includes examining, for example, whether monitoring processes kept the fund on track and informed decision/ actions; how governance structures added value and contributed to impacts; and the extent to which risk strategies effectively mitigated risks. These questions can help us understand how governance, management and monitoring processes impacted on what was delivered and how lessons can be learned to help shape processes for future programmes.

  • Strategic: questions focusing on how the funds fit with or complement wider strategic activities. This includes, for example, the extent to which LGF and GBF aligned with or complemented other government initiatives; whether the funds effectively engaged and aligned with the wider economic development ecosystem; and lessons to inform future or similar funds. These questions are important for developing an understanding of the strategic context, including the economic context, the relationship between the two funds and with other government initiatives, and the extent to which lessons learnt from the two programmes can be applied to future programmes.

Evaluation questions and sub-questions for each category are presented below. Questions 1 to 6 cover design and delivery; questions 7 to 12 cover governance, management and monitoring; and questions 13 to 16 cover the strategic theme.

Process evaluation questions and sub-questions – design and delivery

1. How effective has the delivery model (via LEPs) been? What has driven this?

  • a. Which aspects of delivery worked particularly well, or less well, and why?
  • b. To what extent did the different intervention types e.g., transport, skills capital, business support etc., fit together into a coherent funding programme?

2. Were the rationale and objectives of the funds clear and well understood?

  • a. Were the rationale and objectives clear and well understood at a national and local (LEP) level?
  • b. Was guidance from government to local areas clear and well communicated?

3. To what extent did the delivery model affect the types of projects funded?

  • a. To what extent did the following affect the types of projects funded: (i) ‘single pot’ nature of the funding; (ii) role of LEP / of local business in prioritisation

4. To what extent did the funds align to local areas priorities?

  • a. To what extent did the funded projects align with and contribute towards the Strategic Economic Plans (SEPs)?
  • b. What local conditions supported successful delivery?
  • c. What worked well or less well in the delivery of local interventions?

5. To what extent did the delivery model support the engagement, capacity and collaboration of local partners?

  • a. What were the local benefits of any collaboration encouraged by the delivery model?
  • b. Did the delivery model create any longer-term benefits or challenges?

6. What barriers / enablers arose during fund delivery?

  • a. How did the programme team respond to these?
  • b. To what extent and how did COVID-19 influence the delivery of projects?
  • c. What lessons can be learnt for future interventions of similar schemes?

Process evaluation questions and sub-questions – governance, management and monitoring

7. How effective were governance structures (at national and LEP level)?

  • a. What lessons can be learnt from governance and process design for future delivery of schemes like this?
  • b. How effective were risk management strategies at a fund level in anticipating and mitigating against risks?
  • c. What were the implications for governance structures of changes to LEP policy, funding and assurance frameworks?
  • d. Was guidance about programme governance sufficient?

8. What was the role of each of the relevant government departments?

  • a. What role did they play and how effective was it?
  • b. Has learning been generated that can inform future cross-government collaborative funds?

9. How effective was programme management (at national and LEP level)?

  • a. How effective was the national programme management of the funds?
  • b. How effective was the local programme management of the funds?

10. To what extent were the processes for selecting projects and allocating funding effective?

  • a. Was the process for selecting projects fit for purpose?
  • b. How did the process for project selection evolve over time?

11. How effective were monitoring processes (at national and LEP level)?

  • a. Did LEPs provide timely, comprehensive and high-quality monitoring returns? What were the key barriers/ enablers to this?
  • b. How was monitoring information and LEP level evaluations used – e.g. were they used to inform decisions and actions?
  • c. Could improvements have been made to make monitoring processes more effective?
  • d. What were the monitoring implications of changes to LEP policy, funding and assurance frameworks?
  • e. Was there sufficient monitoring guidance to provide clarity and consistency?
  • f. What was the capability and capacity of LEPs to provide this data?

12. To what extent did the funds meet government’s spend targets?

  • a. To what extent did the Funds, and LEP delivery of them, meet the initial budgetary expectations?
  • b. Were there any unforeseen issues and/or hidden costs?
  • c. How were local freedoms and flexibilities used?

Process evaluation questions and sub-questions – strategic

13. To what extent did the funds align with other central government or local initiatives?

  • a. Did the funds complement the achievement of other government initiatives (e.g. Grand Challenges, Sector Deals, Net Zero)?
  • b. To what extent were the funds used locally to lever in additional funding (e.g. ESIF)?
  • c. How integrated were the funds with other local economic development provision?

14. How did the extraordinary circumstances, especially COVID-19, impact GBF? (GBF only)

  • a. How did COVID-19, and the government’s response to it, impact on the development and delivery of GBF?

15. How does the design and delivery of GBF relate to LGF?

  • a. To what extent was GBF a continuation of LGF?
  • b. How was learning from LGF applied to GBF?
  • c. How did the ‘shovel ready’ focus, or other factors, impact on the types of projects funded compared to LGF?
  • d. To what extent did ‘shovel ready’ projects exist and how long did projects actually take to deliver?

16. What lessons can be learnt for future delivery of similar funds?

  • a. Were there greater challenges with some types of projects?
  • b. What were the critical success factors for interventions?
  • c. How could the funding process be improved?
  • d. How did the delivery experience differ from schemes such as LUF, SPF, Towns Fund? What can we learn from this?

Feasibility assessment

Steer-ED’s initial feasibility and scoping work has indicated a good level of knowledge and available documentation for both LGF and GBF. Therefore, they conclude that it is feasible to proceed with a process evaluation, which will offer valuable insights for future programmes.

Process evaluations typically involve the triangulation of information across a range of research methods to determine whether activities have been implemented as intended, and to gather reflections and lessons learned. As illustrated in Figure 3.1, Steer-ED’s proposed approach will build on the information already gathered through the feasibility stage, will include the gathering of a range of new evidence through four process evaluation workstreams, and will conclude with triangulation and synthesis to corroborate and validate findings.

Figure 3.1: Process evaluation diagram

Source: Steer-ED, 2023

Steer-ED have proposed four process evaluation workstreams. These are as follows:

  • Workstream P1: programme of depth interviews. 25 semi-structured depth interviews with central government officials and 15 with LEP representatives (and, where appropriate, wider delivery stakeholders such as local authorities) to gain a better understanding of the processes involved in the funds. Steer-ED have recommended 25 central government interviews, which they assess will provide comprehensive coverage and be feasible to deliver. Amongst LEPs, they have proposed a sampling approach rather than speaking to the full population of 38 LEPs, because this is considered to be more proportionate and recognises the desire to minimise burden on LEPs at a time of transition in function and structure.

  • Workstream P2: document review. This will build on, and fill some gaps in, the knowledge developed from the scoping phase document review. It will include reviewing documents again in more detail, for example business cases, and reviewing newly uncovered documents such as The Composition of the Local Growth Fund.

  • Workstream P3: workshops. One ‘lessons learned’ workshop with central government stakeholders, and three area-focused workshops to include attendance from LEP Chairs, local government officials and DLUHC area leads. These workshops will help to develop consensus and accuracy across multiple stakeholders, helping to mitigate the limitations of individual memory. This will be particularly useful for LGF where there may be a lack of institutional memory.

  • Workstream P4: consolidation of monitoring data. Gathering and review of monitoring data to draw conclusions about activities undertaken by LEPs and financial expenditure.

Risks

Table 3.1 presents the outputs that will be produced by the process workstreams alongside key risks and mitigating actions. The most notable risks are related to the ability to complete comprehensive and useful stakeholder engagement and the quality of monitoring data. Steer-ED and DLUHC have put in place mitigations to reduce the likelihood and impact of these materialising. However, if they materialise, this will impact the quality of outputs and evaluation.

Table 3.1: Summary of outputs, risks to output, and mitigating actions

Workstream Expected outputs Risks to output Mitigation
P1: Depth Interviews Narrative summary of qualitative impact uncovered Challenges of limited recall and lack of institutional memory Proceed with process evaluation quickly, while LEP staff are still in post
P2: Document review Summary of findings from document review No significant risks to workstream n/a
P3: Workshops Narrative summary of qualitative impact uncovered Insufficient stakeholder engagement Recommended to proceed with process evaluation quickly, while LEP staff are still in post. Offer flexibility around date and format (hybrid or virtual)
P4: Consolidation of monitoring data Descriptive analysis of monitoring data, highlighting project delivery & expenditure Analysis is inconclusive or misleading due to gaps/poor quality of data Steer-ED to work closely with DLUHC data owners. Ensure additional information gathered from other workstreams is fed back into monitoring data, where possible

Source: Steer-ED, 2023

4. Impact and value for money evaluation

Chapter overview

Impact evaluation is an objective test of what changes have occurred, the scale of those changes, and an assessment of the extent to which they can be attributed to the intervention. Value for money (VfM) evaluation is a comparison of the benefits and costs of the intervention.

There are many challenges with completing impact and VfM evaluation. While some qualitative methods are likely to be feasible, these alone will not provide a robust and comprehensive evaluation of the programme. On the other hand, there are significant uncertainties and risks around whether robust quantitative methods will be feasible. Further investigation needs to be completed in order to better understand what will be feasible.

Research questions

Outlined below are potential impact and VfM evaluation questions that Steer-ED have proposed the evaluations should attempt to cover. These draw heavily on the HMT Magenta Book[footnote 2], which provides guideline research questions suitable for conducting a comprehensive evaluation. The questions focus on establishing the outputs, outcomes and impacts generated by LGF/GBF at the level of individual projects, intervention types, LEPs, and the overall portfolio. It may not be possible to answer some or all of these research questions, and hence they may be refined through further scoping work.

Potential Impact and VfM Evaluation Questions

1. For each funded project, what outputs were produced, and to what extent do these align with the outputs anticipated?

  • a. What outputs were proposed in the original project business case?
  • b. What outputs have been recorded to date?
  • c. How do the outputs recorded, and the progress compared to original business case outputs, vary across project types and geographies?

2. What outcomes occurred as a result of LGF/GBF: (i) at the project level, (ii) for each intervention type, (iii) at a LEP level, (iv) for the overall portfolio?

  • a. How do the outcomes compare to those anticipated in the original project business case?

3. What impacts occurred as a result of LGF/GBF: (i) at the project level, (ii) for each intervention type, (iii) at a LEP level, (iv) for the overall portfolio?

  • a. What are the net additional impacts?
  • b. What would have happened in the absence of the fund?
  • c. To what extent were there spillovers or displacement of impacts across LEP geographies?
  • d. How confident can we be that the fund caused the difference?
  • e. To what extent did the scheme result in greater impacts than could have been produced by individual project investments in isolation?

4. What lessons can we learn about the impact of LGF/GBF that can be applied to other policy domains?

  • a. What barriers and enablers to achieving impact were observed?
  • b. Which intervention types, areas, or other project characteristics were most likely to be successful in delivering impact, and why? Which were least successful, and why?

5. What was the overall VfM of LGF/GBF: (i) at the project level, (ii) for each intervention type, (iii) at a LEP level, (iv) for the overall portfolio?

  • a. How does the ratio of costs to benefits compare to that of alternative interventions?
  • b. Are there particular projects, interventions or LEPs that have delivered better VfM than others?

Feasibility assessment

There are a number of key challenges associated with impact and VfM evaluation of LGF and GBF. These challenges, and their implications for evaluation methodology, are set out below:

  • The number and breadth of intervention types being delivered through the funds is large and leads to a corresponding breadth of outputs. Individual projects are also relatively small (with some notable exceptions), with an average LGF project size just over £3 million and GBF just over £2 million. Steer-ED conclude that no single evaluation approach will be able to address all interventions, and a mixed-methods or ‘composite’ approach (i.e., combining both qualitative and quantitative research) will be required. Even then, some approaches may struggle to detect impacts (given the relatively small scale of funding per project) and so a multi-pronged approach using triangulation of evidence from different sources should be preferred.

  • Limitations in how monitoring data can be used for impact evaluation. Monitoring data is not, for example, collected for places or sites that do not receive funding, which limits its use in counterfactual analysis. There are also some inconsistencies in the way that monitoring data requirements have been interpreted and responded to across LEPs. Therefore, evaluation will require the use of secondary datasets (or primary data collection where possible).

  • Developing a counterfactual is challenging. Since every LEP received funding, there are no unfunded areas to serve as comparators. Furthermore, different interventions operated on different contexts or populations, implying differing units of analysis across intervention types. Steer-ED have therefore suggested that non-counterfactual designs (such as qualitative analysis, case studies, and theory-based approaches) should be considered to supplement any counterfactual-based analysis, and that a range of different counterfactual-based designs could be adopted, each addressing a different unit of analysis.

  • There are complex interaction effects, with LEP portfolios designed (in theory, if not always in practice) to deliver complementary sets of projects. Steer-ED have therefore suggested that an evaluation approach which examines projects or intervention types in isolation would fail to examine the potential for these interaction effects, and therefore a counterfactual which includes a range of different intervention types operating within a local area should be considered as part of the approach.

  • The COVID-19 pandemic caused a dramatic shock to local activity, over a time period that intersects with LGF investments (and was also the catalyst for GBF investments). This shock to activity poses a significant confounding factor to analysis, and means that before/after analysis, especially in the absence of a counterfactual group, is unlikely to produce meaningful results.

  • The presence of other confounding factors such as alternative funding sources and other local/central government spending which have also occurred within the target areas will make it challenging to isolate the impact of LGF/GBF. Therefore, Steer-ED recommend additional desk review as a means for uncovering other parallel funding schemes which should be considered within analysis.

Given these challenges, Steer-ED conclude there is unlikely to be a single method or research design that could be used to comprehensively assess the impact of the funds. They therefore advise that a mixed-methods approach is likely to be the most appropriate, as is common across most large programme evaluations. There are a range of qualitative and quantitative methods and types of evidence that may be feasible and could be used in the evaluation.

Qualitative methods, such as developing case studies, are likely to be feasible in this situation. These can be used to demonstrate some links between funding and impacts, without assigning a quantitative value or being able to confidently conclude that the funding caused these impacts. They may be able to provide a nuanced picture of the funds. However, on their own they are unlikely to be able to provide a robust and comprehensive evaluation of the programme.

Robust quantitative methods, such as quasi-experimental approaches, will be challenging to complete. There are several uncertainties and risks around the feasibility of conducting this work. For these methods to be feasible, there would need to be a suitable number of completed projects that are similar enough to group together, and a suitable comparator group for these would need to be identified, in addition to overcoming several practical challenges of conducting the analysis robustly, related to those discussed above, such as isolating impacts from confounding effects.
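
To illustrate what one such quasi-experimental design might involve, the sketch below sets out a minimal difference-in-differences estimate, which compares changes in an outcome for funded units with changes for a comparison group over the same period. This is an illustration only, not a design Steer-ED has committed to: the dataset, columns (unit_id, year, employment, funded, post) and outcome are hypothetical, and it assumes a credible comparison group and pre-intervention data have already been assembled.

```python
# Minimal sketch of a difference-in-differences estimate on a hypothetical
# panel of funded and comparison units. All names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("beneficiary_panel.csv")  # hypothetical dataset
# expected columns: unit_id, year, employment,
#                   funded (1 if the unit received LGF/GBF support, else 0),
#                   post (1 for years after funding started, else 0)

# The coefficient on funded:post estimates the average effect of funding on
# employment, under the parallel-trends assumption. Standard errors are
# clustered by unit to allow for correlation over time within units.
model = smf.ols("employment ~ funded * post", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["unit_id"]}
)
print(model.summary())
```

Whether anything like this is credible for LGF and GBF depends on the further feasibility work described below, in particular on whether comparator units can be identified and beneficiary-level data obtained.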

A crucial next step in determining feasibility of these methods would be to collect data from LEPs and their delivery partners, to the extent it is available, on beneficiaries and investigate whether a counterfactual for this data can be reasonably identified. Beneficiary data that would need to be obtained are, for example, lists of businesses supported through the funds, data on learner outcomes for supported education facilities, and lists of innovation centres and their occupants.

Value for money evaluation feasibility depends on whether impacts can be estimated robustly and can be said to have resulted from the funding. Quantifying or monetising observed impacts or benefits is a poor use of resources if this is not the case. And even if impacts can be causally attributed to an intervention, it may still not be possible to monetise them if there is no robust evidence of their monetary value. It may also be that only part of the policy’s impacts can be identified and monetised, in which case the work may conclude that a reliable comparison of benefits and costs cannot be provided.
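
Where attribution and monetisation are possible, a VfM assessment typically compares the discounted value of monetised benefits with discounted costs, often summarised as a benefit-cost ratio. The sketch below is a purely illustrative calculation with invented figures, assuming a 3.5% discount rate (the standard HM Treasury Green Book social time preference rate); it presupposes that benefits have already been robustly attributed and monetised, which is precisely the step that is uncertain here.

```python
# Purely illustrative benefit-cost ratio calculation with invented figures.
# It assumes benefits have already been robustly attributed to the fund and
# monetised, which is exactly the step that is uncertain for LGF/GBF.

def present_value(annual_values, discount_rate=0.035):
    """Discount a stream of annual values (year 0 first) to present value."""
    return sum(v / (1 + discount_rate) ** t for t, v in enumerate(annual_values))

costs = [4_000_000, 1_000_000]       # hypothetical capital spend profile
benefits = [0, 0] + [600_000] * 8    # hypothetical monetised annual benefits

bcr = present_value(benefits) / present_value(costs)
print(f"Illustrative benefit-cost ratio: {bcr:.2f}")
```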

DLUHC intends to complete further feasibility investigation, which will attempt to collect available data and identify, where possible, appropriate counterfactuals, in order to make an informed decision on whether it is valuable to complete any impact or VfM evaluation activity on these funds.

5. Next steps on evaluation

DLUHC has commissioned Steer-ED to:

  1. Conduct a joint process evaluation of LGF and GBF. This will be completed and findings made available in 2024.
  2. Collect available data from LEPs and identify, where possible, appropriate counterfactuals as part of further feasibility investigation for impact and VfM evaluation.

This will mitigate the risk of potential data loss and allow DLUHC to make an informed decision on whether it is valuable to complete this type of evaluation activity on these funds. An updated feasibility assessment will be published when this activity is completed.

DLUHC intends to progress next steps for evaluating LGF and GBF swiftly. Through initial scoping and feasibility work, a key risk highlighted was that stakeholders may become harder to contact and data held by LEPs may become less accessible as a result of the decision to withdraw central government support (core funding) for LEPs from April 2024 and transfer their functions to local and combined authorities. The gathering of this data by Steer-ED will help to avoid any potential loss of data which could be of value to impact evaluation work.

  1. These figures should be treated as an approximation because they are based on a sample, and there is also cross-over between the categories – for example it is subjective how a project to improve cycle access or bus routes around a railway station should be categorised. There are also similar issues around subjective allocation between housing, innovation, regeneration, and other site-based categories. Follow-up work to manually sort and classify projects is included as part of the recommended evaluation work in light of these challenges. 

  2. The Magenta Book, HM Treasury, March 2020. See Table 2.2: Evaluation questions and types of evaluation.