Research and analysis

Local Growth Fund and Getting Building Fund: Process evaluation - executive summary

Published 25 July 2025

Applies to England

Background to the commission and methodology

The Local Growth Fund was announced in 2013 as a ‘single pot’ of £12 billion in devolved capital funding to support local economic growth. Of the £12 billion total, approximately £7 billion was allocated as flexible funding, managed by Local Enterprise Partnerships (business-led partnerships between local authorities and local businesses) and overseen by the Ministry of Housing, Communities and Local Government. The Local Growth Fund was distributed through three funding rounds from 2015 to 2021.

In response to the COVID-19 pandemic, the Getting Building Fund was launched in 2020. The fund allocated £900 million to support ‘shovel ready’ projects that could address immediate economic challenges. It was a continuation of the Local Growth Fund in terms of timing, delivery processes, and types of interventions, and ran until 2022. It was delivered through one funding round.

This document is a process evaluation of the two funds, produced by Steer Economic Development and commissioned by the Ministry of Housing, Communities and Local Government. The purpose of the evaluation was to reflect on the design, delivery and governance of the two funds. A mixed-methods approach was deployed, with the research methods selected for their ability to capture lessons learned and gather detailed feedback from stakeholders. The main research methods were:

  • 19 in-depth interviews with central government stakeholders;
  • 35 in-depth interviews with Local Enterprise Partnership representatives;
  • 3 area-focused workshops with Local Enterprise Partnerships and area representatives from the Ministry of Housing, Communities and Local Government;
  • a round-table recommendations workshop with policy and delivery stakeholders from the Ministry of Housing, Communities and Local Government, which provided an opportunity to reflect on the evaluation’s findings and the implications for future policy;
  • 3 thematic case studies; and
  • desk research, comprising a review of monitoring data and Local Enterprise Partnership-commissioned process evaluations.

Despite some challenges (such as the loss of institutional memory; the dissolution of Local Enterprise Partnerships; the varying quality, depth and availability of Local Enterprise Partnership-commissioned process evaluations; and some concerns around the monitoring data), the evaluation successfully provided coverage of a range of intervention types and geographies. Furthermore, the qualitative approach was considered particularly valuable for capturing different perspectives and the contextual factors influencing delivery. Overall, the methods used were considered appropriate for the purpose of the evaluation and supported the collection of valuable insights and lessons learned.

Overview of the Local Growth Fund and Getting Building Fund

The Local Growth Fund supported a broad range of interventions according to local needs, with the overarching aim of promoting growth, rebalancing the economy, enhancing local accountability and attracting private sector match funding. While the Getting Building Fund was similar in terms of delivery, the context and rationale for the two funds differed. The Local Growth Fund aimed to support long-term economic growth by devolving power to local areas, while the Getting Building Fund was a response that targeted areas facing the biggest immediate economic challenges posed by the COVID-19 pandemic.

Local Growth Fund projects were financially complete by March 2021, with ongoing monitoring expected to continue until 2025. Getting Building Fund projects commenced in 2020 and were expected to be financially complete by March 2022, with ongoing monitoring expected to continue until 2025. At the time of writing, 58% of the 2,195 Local Growth Fund projects and 30% of the 378 Getting Building Fund projects were recorded as ‘complete’ (that is, all funding had been spent and there were no further outputs to report).

The Local Growth Fund’s largest investments were in road improvements (29% of overall Local Growth Fund budget) and skills capital (16% of overall Local Growth Fund budget). When considered at an aggregated level, transport projects (across road, rail, and other modes) accounted for a total of 43% of Local Growth Fund allocated funding. The Getting Building Fund, on the other hand, concentrated on employment and innovation projects, with a noticeably lower proportion of transport-related spend.

Programme design

The design of the funds involved three important elements:

  • funding was provided as a ‘single pot’, giving local areas the freedom to focus on local priorities rather than being tied to individual departmental objectives;
  • a competitive bidding process was introduced to encourage high-quality proposals; and
  • decision-making was decentralised, involving local businesses in project design and selection.

With regard to the ‘single pot’, stakeholders generally reported that the intention was welcomed. The evaluation found many examples demonstrating that projects did indeed straddle departmental boundaries, and Local Enterprise Partnerships made use of this flexibility in the variety and cross-cutting nature of the projects they selected. However, because the fund comprised existing budgets from four government departments (the Department for Transport, the Department for Education, the Ministry of Housing, Communities and Local Government and the former Department for Business, Energy and Industrial Strategy), some Local Enterprise Partnerships felt there was a certain degree of expectation that spend should align with these departmental allocations. This was broadly borne out in reality – suggesting that the ‘freedom’ of the flexible pot was perhaps less than originally intended.

With regard to the competitive bidding process, the evaluation found that the level of competitive tension was less than intended – the bidding process can best be described as ‘semi-competitive’. This was driven by a lack of systematic communication to Local Enterprise Partnerships regarding the amount of funding available and the criteria by which projects would be judged. A further contributing factor was a lack of resource within central government to conduct thorough assessments of projects. While there is evidence that some adjustments were made to account for the differing strengths of Local Enterprise Partnerships, Strategic Economic Plans and supporting governance processes, the final distribution did not differ greatly from ‘per capita’ allocations.

Finally, with regard to the decentralisation of decision-making, local areas largely felt that they were able to make decisions to meet local economic priorities and were well positioned to do so, though progress on delivery varied according to the strength of local project pipelines.

Programme delivery

The evaluation found that in some areas, the portfolio of projects delivered was highly strategic, crossing geographic and sectoral boundaries to deliver a well-coordinated and synergistic set of projects. However, in other areas the portfolio was more segmented, with a sense of each local authority receiving its ‘fair share’ of projects. The evaluation found that project portfolios were most likely to be strategic where Local Enterprise Partnerships were strongly engaged and had ‘ownership’ of the Strategic Economic Plan, where the Strategic Economic Plan itself had a clearly communicated purpose, and where geographic and political contextual factors did not get in the way of a truly strategic selection. The inclusion of the business voice (implemented via Local Enterprise Partnership Chairs and membership) was welcomed by stakeholders, with business contributors providing rigour to local project management processes, supporting more innovative delivery mechanisms, and contributing a commercial viewpoint to project selection.

The approach to project delivery developed over time as Local Enterprise Partnerships grew in capacity, maturity and expertise. Overall, consultees provided evidence of innovative ways in which the funding had been used, and explained that the flexibility afforded by the design of the funds allowed project delivery to be optimised. Nonetheless, some delays to project delivery occurred, linked to factors such as planning delays and development lags (often seen in complex capital projects), COVID-19 impacts, and wider macroeconomic factors.

For the Getting Building Fund, a clear focus was placed on proposing projects which were highly deliverable. Local Enterprise Partnerships’ ability to respond to this requirement largely depended on the quality of their existing project pipelines. The challenging delivery environment during COVID-19, and the resulting supply chain and inflationary pressures, led to unexpected delays across many projects and ultimately to the extension of the final deadline for completion of spending. While Local Enterprise Partnerships did propose projects which could reasonably be expected to be delivered within the 18-month delivery period, some noted that being genuinely ‘shovel ready’ required funding and planning permission to already be in place (or, in the case of planning, not to be required), which was not the case in many instances.

Governance, management and monitoring

Local Enterprise Partnerships, supported by a nominated local authority accountable body, were at the core of delivery of the Local Growth Fund and Getting Building Fund. The Local Growth Fund was first introduced at a time when there was strong political support for decentralisation and, in accordance with this, central government intervention to ensure accountability, transparency and fit-for-purpose governance was deliberately light-touch. However, a series of government inquiries and reviews revealed concerns around transparency and accountability, leading to a successful ‘tightening up’ of processes. In line with this evolution in the approach to governance, central government’s approach to performance management also became more involved over time. It was made clear that government could withhold future years’ allocations if it was not satisfied with the results of performance reviews (although in reality, this was reported to be difficult to implement due to the contractual mechanisms in place).

The system for monitoring the funds (and the supporting guidance documents) changed over time. Overall, Local Enterprise Partnership interviewees criticised the lack of clear guidance on how output metrics should be defined, which resulted in a monitoring dataset that, although fulfilling financial reporting requirements, lacks the completeness, comparability and robustness needed for other purposes.

Recommendations

The evaluation makes the following recommendations:

Programme design

  1. Extend and build upon the ‘single pot’ notion. Stakeholders welcomed the flexible ‘single pot’ as a mechanism for delivering local growth. Noting the potential challenges around coordination and allocation of responsibilities introduced by a ‘single pot’, the concept could be extended further – by including a broader selection of government departments amongst the contributors to the pot, with the Ministry of Housing, Communities and Local Government as the central coordinator;

  2. Dedicate sufficient time to thinking and set-up. The desire to deliver new policies at pace can sometimes result in insufficient time for considered design, engagement and testing. This can be detrimental to the quality of subsequent delivery – leading to changes in approach and guidance which can cause confusion and frustration amongst delivery partners;

  3. Engage early for collaborative development. The teams within the Ministry of Housing, Communities and Local Government which hold relationships with local government are a key resource for engaging with local areas. Greater use of this resource, and additional time taken to canvass the views of local areas, would help to reduce the need for pivots in approach after a fund has launched; and

  4. Set – and communicate – clear ‘rules of the game’. The evaluation found that Local Enterprise Partnerships were not able to compete effectively because they were not given a clear steer on how funding would be allocated or the quantity of funding available. Central government should agree ‘what good looks like’ for competitive funding bids prior to announcing the funding competition and should clearly communicate this to local areas.

Effective delivery

  1. Provide stability through use of review points. It may be beneficial to set review points in advance and agree that systems and guidance will be reviewed and updated at these agreed points in time – offering stability in the intervening periods;

  2. Build local capacity. For future funding streams, central government should consider how it can support local areas to build capacity and capability – for example, through provision of revenue funding alongside capital funding, and also through targeted interventions in local areas facing the greatest capability gaps; and

  3. Move beyond ‘shovel ready’. The evaluation identified some challenges around the concept of a ‘shovel ready’ scheme, noting that such schemes rarely exist in reality. A long-term, integrated strategy for delivering a pipeline of projects is encouraged.

Governance, monitoring and management

  1. Shift the emphasis from scrutiny to support. Building on the need to support local areas facing capacity and capability challenges, a mindset change is encouraged to place greater emphasis on providing support for areas, in particular those where processes are less mature or which face greater capacity constraints;

  2. Ensure the Ministry of Housing, Communities and Local Government has access to appropriate mechanisms for managing performance. Building clawback mechanisms into future funding agreements would enhance the ability to apply performance management, even if these mechanisms are rarely used; and

  3. Ensure fit-for-purpose monitoring systems, managing the trade-off between comprehensiveness and collection burden. Standardisation and digitisation of monitoring tools are recommended for the future, distinguishing between factors that are most important for performance management (such as project status and progress against spend targets) and factors that can be used to establish the extent to which project outputs and outcomes have been realised.