Guidance

UK Community Renewal Fund: further monitoring and evaluation guidance for project deliverers

This guidance sets out the further detail on the monitoring and evaluation requirements of the UK Community Renewal Fund.

This guidance was withdrawn on

The UK Community Renewal Fund Programme was closed in December 2023 following the publication of the UK Community Renewal Fund evaluation report.

1. Introduction

1.1. As set out in the Technical note for project applicants and deliverers, all projects will be required to submit evidence, either via the lead authority (in the case of Great Britain) or directly to the Department for Levelling Up, Housing and Communities (DLUHC) (for Northern Ireland), demonstrating progress towards achievement of project targets and investment profiles at regular intervals. This includes both quantitative and qualitative data.

1.2. It also explains the requirement that successful applicants must develop an evaluation plan, with between 1% and 2% of their award, subject to a minimum of £10,000, generally expected to be dedicated to that evaluation (see question 1.33 in the FAQ for clarification on flexibilities).
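
For illustration only, a minimal sketch (in Python) of the expected evaluation budget range described above, assuming the 1% to 2% range and the £10,000 minimum both apply directly to the evaluation budget:

  # Illustrative only; not part of the official guidance.
  def evaluation_budget_range(award):
      """Return the (lower, upper) evaluation budget generally expected for an award."""
      lower = max(0.01 * award, 10_000)
      upper = max(0.02 * award, 10_000)
      return lower, upper

  print(evaluation_budget_range(1_000_000))  # (10000.0, 20000.0)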

1.3. This guidance sets out the further detail on the monitoring and evaluation requirements of the UK Community Renewal Fund.

1.4. An overview of the national evaluation activity is also provided at Section 4.

2. Outputs and outcomes

2.1. A fundamental aspect of the UK Community Renewal Fund (UKCRF) was to enable projects to help local areas pilot innovative new approaches and programmes that unleash the potential of places, instil pride, and prepare them to take full advantage of the UK Shared Prosperity Fund when it launches in 2022.

2.2. To help facilitate such an approach, our initial Technical Note set out a small number of outputs and outcomes, both defined in broad terms. To enable us to better understand the nature of the investments being made under the Fund, we require a more detailed level of data collection; this detail is set out in Annex A.

2.3. To provide a robust and informative analysis of the Fund’s impact, it is vital that all parties collecting data use the same metrics and appropriate methods. Where these differ between nations, for example in relation to educational attainment, the measure appropriate to the nation should be used. The relevant measures are identified in Annex A.

2.4. As set out in the prospectus, the monitoring approach has sought to build on that taken by the Towns Fund, as is the case for the Levelling Up Fund. This means that, where possible, the same outputs and outcomes have been used, and where new material is required it is consistent in approach.

UKCRF outputs

End beneficiaries

2.5. For the UK Community Renewal Fund, outputs are directly linked to who has been supported, the end beneficiary, and the support they have received. In some cases, the end beneficiary may be the recipient of the award itself, for example, a local authority, higher education institute or an organisation representing a specific sector which may only be undertaking a feasibility study. In this instance, we still want to capture the nature of the organisation supported and as such they would report themselves as an output, i.e. “# of organisations receiving grants”, as well as reporting the “Feasibility studies developed as a result of support” outcome.

2.6. Where projects provide support to people or other organisations, such as businesses or those from the education, training and third sectors, these are the end beneficiaries in those circumstances.

2.7. Some successful applicants for UKCRF funding may use delivery partners to support implementation. Delivery partners are not end beneficiaries where they provide support to people, businesses, and organisations. Where this is the case, they should not be reported as outputs.

Definitions

2.8. You will note that in Annex A (output tab) the number of outputs has increased. Whilst the broad output descriptors from the Technical Note could have been retained, the approach taken enables both increased consistency with Towns Fund indicators and clearer guidance on the data collection requirement for each output.

2.9. The definitions for these outputs have been drawn from the outcome definitions already set out in the Technical Note. They are very similar, but the wording has been changed to reflect the intended outcome of the support. So, for example, the outcome “People gaining a qualification following support” is defined simply as:

People who have received support and who gained a qualification following that support.

In the guidance, there is a specific output, “# of people supported to gain a qualification”. This draws from the above outcome definition and is simply defined as:

People who have received support to gain a qualification.

2.10. So, a project which supports 10 people to gain a qualification records 10 outputs. If they are all successful, this also means 10 outcomes can be recorded; however, if only 8 are successful then only 8 outcomes should be recorded.
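
For illustration only, the counting rule above can be expressed as the following sketch (in Python); the record fields are hypothetical and are not part of the Annex A specification:

  # Ten people are supported; eight of them gain a qualification.
  participants = [
      {"id": i, "supported": True, "gained_qualification": i < 8}
      for i in range(10)
  ]

  outputs = sum(1 for p in participants if p["supported"])              # 10
  outcomes = sum(1 for p in participants if p["gained_qualification"])  # 8

  print(f"# of people supported to gain a qualification: {outputs}")
  print(f"People gaining a qualification following support: {outcomes}")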

Evidence

2.11. Project deliverers must retain sufficient evidence to demonstrate that outputs and outcomes have been achieved. The evidence column in Annex A sets out the information that you will need to retain related to the support provided.

Additional information

2.12. The additional information column in Annex A sets out the summary information required on the characteristics of people and businesses that have been supported; this will be reported at an aggregated, anonymised level. This material complements that required to be collected under the evidence heading; more detail on evidence is provided below. This section also sets out the detail of the support that needs to be provided. You will note that there are minimum thresholds of support required to constitute an output. This is intended to provide UKCRF projects with greater flexibility around the types of interventions that they provide.
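
For illustration only, a minimal sketch of reporting beneficiary characteristics at an aggregated, anonymised level; the characteristic fields shown are hypothetical examples, and the breakdowns actually required are those set out in Annex A:

  # Only summary counts are reported; no names or other personal data.
  from collections import Counter

  beneficiaries = [
      {"age_band": "25-49", "employment_status": "unemployed"},
      {"age_band": "25-49", "employment_status": "employed"},
      {"age_band": "50+", "employment_status": "unemployed"},
  ]

  age_summary = Counter(b["age_band"] for b in beneficiaries)
  status_summary = Counter(b["employment_status"] for b in beneficiaries)
  print(dict(age_summary))     # {'25-49': 2, '50+': 1}
  print(dict(status_summary))  # {'unemployed': 2, 'employed': 1}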

Multiple counting

2.13. Support for the same final beneficiary could result in multiple outputs if the nature of the intervention means that is appropriate. For example, support may be provided as a hybrid product that is a combination of grant and access to business advice. Such support should record both a count under “# of businesses receiving grants” and “# of businesses receiving non-financial support”, recognising that potentially the two outputs may only bring about one outcome such as “Businesses introducing new products to the market as a result of support”.

UKCRF outcomes

2.14. The outcomes and their definitions have been transposed across from the Technical Note.

2.15. The main additions for outcomes, as can be seen in Annex A (UKCRF Outcomes tab), are the sub-sets provided to better understand the detail of the outcome that has been delivered; these are set out in the Additional information required column. For example, if qualifications have been gained then the number of people achieving that qualification should be reported, or if connectivity is improved then the nature of that improvement should be reported.

2.16. In some instances, baselining will be required to accurately capture outcomes resulting in a change, for example, an increase in footfall or visitors. This is identified in the Baseline evidence column where applicable.

Multiple counting

2.17. An output can lead to multiple outcomes. For example, a business may be provided with a grant (# of businesses receiving grants) to successfully introduce a new product to the firm. This would lead to a count against the “Businesses introducing new products to the firm as a result of support” outcome but depending on the nature of the product could lead to others such as “Employment increase in supported businesses as a result of support” and/or “Estimated Carbon dioxide equivalent reductions as a result of support”.

2.18. As set out above, it is possible that multiple outputs may lead to a single outcome. When it comes to reporting this, you should avoid double counting. Using the example already set out above:

support may be provided as a hybrid product that is a combination of grant and access to business advice. Such support should record both a count under “# of businesses receiving grants” and “# of businesses receiving non-financial support”, recognising that potentially the two outputs may only bring about one outcome such as “Businesses introducing new products to the market as a result of support”.

In this case, only one count of “Businesses introducing new products to the market as a result of support” should be recorded.
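
For illustration only, the hybrid support example can be tallied as in the following sketch, so that two outputs are recorded but the shared outcome is counted only once; the indicator names follow Annex A, while the record structure is hypothetical:

  # One business receives a grant plus business advice (two outputs)
  # and introduces one new product to the market (one outcome).
  support_records = [
      {"business": "B001", "output": "# of businesses receiving grants"},
      {"business": "B001", "output": "# of businesses receiving non-financial support"},
  ]
  outcome_events = [
      {"business": "B001",
       "outcome": "Businesses introducing new products to the market as a result of support"},
  ]

  output_counts = {}
  for record in support_records:
      output_counts[record["output"]] = output_counts.get(record["output"], 0) + 1

  # Count each business at most once per outcome indicator to avoid double counting.
  businesses_per_outcome = {}
  for event in outcome_events:
      businesses_per_outcome.setdefault(event["outcome"], set()).add(event["business"])
  outcome_counts = {outcome: len(firms) for outcome, firms in businesses_per_outcome.items()}

  print(output_counts)   # two outputs recorded
  print(outcome_counts)  # one outcome recorded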

Verification evidence

2.19. As well as setting out the outputs and outcomes and their definitions, the tables in Annex A also set out verification evidence requirements, i.e. what needs to be retained to provide assurance that the outputs and outcomes claimed have been delivered.

2.20. You should ensure that you have supporting evidence to validate results for each indicator you report against. Given the innovative nature of the UKCRF programme and its relatively small budget, DLUHC is testing a light touch and proportionate approach to verification evidence. Part of the evaluation of the UKCRF programme will review the efficiency and effectiveness of this approach.

2.21. If you or your delivery partners already collect data on an indicator in a way that is inconsistent with our approach, you will need to bring your data collection in line with our criteria. DLUHC is unable to accept data that has not been collected using the metrics outlined in Annex A.

2.22. Where a delivery partner is implementing a project, you should ensure that their contract covers the requirement to report input, activity and output data to you for collation. As the recipient of the UKCRF award, you have primary responsibility for ensuring that the data is collected and is free from errors.

2.23. As set out above, some indicators you report on will require the data to be broken down by different factors (disaggregated). It is your responsibility to collect the data in this disaggregated form, so you should bear this in mind when planning data collection. The disaggregation required per indicator is outlined in Annex A.

2.24. For some indicators you will be required to provide baseline evidence, for example, photographic evidence of a site before project delivery commences. This is outlined in Annex A. For projects where baseline evidence is required, you will need to ensure it is collected prior to starting project delivery and that the same methodology is used for both the before and after measurements. For example, footfall is counted in the same way, at the same locations and over the same length of time.
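
For illustration only, a minimal sketch of a before and after comparison against a baseline collected using the same methodology; all figures and locations are invented:

  # Footfall counted the same way, at the same locations, over the same period.
  baseline_footfall = {"Location A": 1_200, "Location B": 950}
  post_project_footfall = {"Location A": 1_380, "Location B": 1_000}

  for location, before in baseline_footfall.items():
      after = post_project_footfall[location]
      change = 100 * (after - before) / before
      print(f"{location}: {before} -> {after} ({change:+.1f}% change in footfall)")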

2.25. To ensure the data provided is accurate and can be deemed reliable in our analysis, DLUHC, and lead authorities where appropriate, reserve the right to conduct:

  • Site verifications – this will involve visiting project sites to check whether observations can confirm the validity of data collected.
  • Data audit – this will involve visiting sites to check whether each data point can be evidenced.
  • Triangulation – this will involve comparing primary data sets to comparable external sources of data, or with qualitative feedback.

2.26. You are required to resource the data collection for activities, outputs and outcomes, and you should ensure this is accounted for in your planning. In instances where DLUHC requires your support as part of the wider M&E efforts (e.g. identifying stakeholders to interview), we will put any requests to the relevant officer and seek to minimise the administrative burden on your staff.

2.27. Where project deliverers or lead authorities collect personal beneficiary data, they must comply with the Data Protection Act 2018 and the UK GDPR. DLUHC does not require project deliverers or lead authorities to provide personal data, as set out in the Data Protection Act 2018 and the UK GDPR, in respect of end beneficiaries as part of the UKCRF monitoring and evaluation process; information regarding beneficiaries should only be reported on a summarised, anonymised basis.

2.28. Further guidance on data protection can be found on the Information Commissioner’s Office website.

3. Project evaluation

Timing

3.1. To ensure project evaluation can be supported through the UKCRF, the required project evaluation will need to be completed by 30 June 2022, accepting that the invoice for that work can be paid after that date. UKCRF funding cannot be used to fund evaluation work that takes place after 30 June 2022.

3.2. Projects may negotiate with the lead authority, or DLUHC in the case of Northern Ireland, to submit the final project evaluation up to 6 months after the final claim. However, in these cases, the final claim must be accompanied by an interim report, and all spend related to evaluation activity that takes place after 30 June 2022 will be at the project’s own cost.

Approach

3.3. A key priority for the UKCRF was to bring forward innovative projects; as such, we are not looking to be too prescriptive in our guidance relating to project evaluation. Given the variety of projects expected, it would also be challenging to provide guidance which covered all possible scenarios.

3.4. There are, however, plenty of publicly available M&E resources that you may find useful when considering your approach, including HM Treasury’s Magenta Book and Green Book referred to below.

3.5. We would draw specific attention to section 2.2.1 of the Magenta Book:

“Good policy-making necessitates a thorough understanding of the intervention and how it is expected to achieve the expected outcomes. Good evaluation also requires this understanding. Thoroughly examining the proposed intervention ensures:

  • an understanding of how the intervention is expected to work in practice, e.g. the problem the intervention aims to address; the change it aims to bring about; the causal chain of events that are expected to bring about the change; the main actors; the groups expected to be impacted; and the expected conditions required for the intervention to succeed;
  • exposing the assumptions upon which the intervention is based and the strength or weakness of the evidence supporting these assumptions;
  • an examination of the wider context, such as other policy changes or changes in economic, social and environmental factors;
  • designers and implementers of the intervention have the opportunity to stress-test the intervention design and ensure they agree on how the intervention is expected to work.

Understanding the intervention is typically done through synthesising existing evidence and producing a Theory of Change (ToC).”

3.6. Applications will have already set out the intended project outcomes and impacts. We would encourage projects, if they have not already done so, to build this into a ToC. Having a ToC will enable you to better understand what evaluation approach you want to take and what, if any, additional data you may want to collect to undertake your evaluation.

3.7. In relation to additional data you may wish to collect, the data covered in Annex A is relatively comprehensive. However, there may be instances where more granular information on end beneficiaries is required. For example, you may wish to target specific ethnic groups under your project and as such you may want to ensure you are collecting data that confirms you have successfully targeted that specific group. Other examples include the long-term unemployed or people from jobless households.

3.8. It is recommended that the evaluation is undertaken by someone independent of the project who has the relevant skills to undertake the task. It is also recommended that the evaluator be brought in at the beginning of the project, to ensure that the data required to undertake the evaluation is identified early, enabling it to be collected during project implementation as required.

3.9. Due to the limited timescales for UKCRF project evaluation, it may be necessary for some outcomes and impacts of a project to be forecast, as they will continue to accrue after the completion of the evaluation. If this is the case, it is important that there is a clear distinction between the outcomes and impacts which have been realised and those which are predicted to arise in future years. For quantitative forecasts, the estimation method will need to be clearly explained in the report.

Report content

3.10. As set out in the UK Community Renewal Fund: assessment criteria, an effective evaluation will cover the:

  • appropriateness of initial design
  • progress against targets
  • delivery and management
  • outcomes and impact
  • value for money
  • lessons learnt

3.11. Further detail is set out below regarding what should be considered for each element, accepting that each project will need to tailor its approach to its specific needs.

Appropriateness of initial design

3.12. This element should be based around the project’s planned outcomes and impact and include critical analysis about the appropriateness of the project’s design given these objectives.

3.13. Drawing on the available evidence, this part of the report should discuss whether there has been a change in the context within which the project was originally planned and, if there have been changes, whether these have had any implications for the practical delivery of the project and the benefits which could be realised for beneficiaries and the local economy. The key questions that need to be explored here are:

  • What was the project seeking to do?
  • What was the economic and policy context at the time that the project was designed?
  • What were the specific market failures that the project was seeking to address? Was there a strong rationale for the project?
  • Was it appropriately designed to achieve its objectives? Was the delivery model appropriate?
  • Were the targets set for the project realistic and achievable?
  • How did the context change as the project was delivered and did this exert any particular pressures on project delivery?
  • Bearing in mind the project design itself and any changes in context could the project reasonably be expected to perform well against its targets?

Progress against targets

3.14. This element should consider the progress with the implementation of the project, drawing in particular on annual and lifetime performance against the expenditure, activity and output targets. Variations from the targets should be carefully explained and supported by the available evidence.

3.15. The key questions here are:

  • Has the project delivered what it expected to in terms of spend and outputs?
  • What are the factors which explain this performance?
  • When the project draws to a close, is it expected to have achieved what it set out to?

Delivery and management

3.16. This element of the report will need to provide a more qualitative analysis of the implementation of the project. As appropriate this could cover procurement, selection procedures, delivery performance, governance and management.

3.17. The key questions that the evaluation will need to explore here include:

  • Was the project well managed? Were the right governance and management structures in place and did they operate in the way they were expected to?
  • Has the project delivered its intended activities to a high standard?
  • Could the delivery of the project have been improved in any way?
  • For projects with direct beneficiaries: did the project engage with and select the right beneficiaries? Were the right procedures and criteria in place to ensure the project focused on the right beneficiaries?
  • How are project activities perceived by stakeholders and beneficiaries? What are their perceptions of the quality of activities / delivery?

Outcomes and impact

3.18. The analysis under this element will need to set out the progress that the project has made towards the project’s intended outcomes and impacts. Any analysis in this element of the report would benefit from forecasts of lifetime outturns, where it is possible to calculate realistic forecasts.

3.19. The overarching question that this section will need to explore is whether or not the project has made a difference. In answering this critical question, projects will need to consider:

  • What progress has the project made towards achieving the intended outcome and impacts?
  • To what extent are the changes in relevant impact and outcome indicators attributable to project activities?
  • What are the gross and net additional economic, social and environmental benefits of the project (where relevant and applicable to project activities)?
  • Can these benefits be quantified and attributed to the project in a statistically robust way?
  • How has the project contributed to the wider strategic plan under which it was developed?

Value for money

3.20. This element of the report will need to provide a clear analysis of the value for money that the project has provided. As a minimum, reports should provide cost per output analysis.

3.21. Where appropriate, this can also be supplemented by benefit cost ratio analysis to provide additional insight. Various methods can be used to assess the benefits and costs of an intervention from the perspective of society or of the government which has helped to fund the activity. The Green Book provides a fuller explanation of these methods.
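
For illustration only, a minimal sketch of a cost per output calculation and a simple benefit cost ratio; all figures are invented, and any monetisation of benefits should follow Green Book methods rather than this single ratio:

  project_cost = 250_000        # total project expenditure (£)
  outputs_delivered = 125       # e.g. # of people supported to gain a qualification
  monetised_benefits = 400_000  # estimated present value of benefits (£)

  cost_per_output = project_cost / outputs_delivered      # £2,000 per output
  benefit_cost_ratio = monetised_benefits / project_cost  # 1.6

  print(f"Cost per output: £{cost_per_output:,.0f}")
  print(f"Benefit cost ratio: {benefit_cost_ratio:.1f}")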

Lessons learnt

3.22. It is suggested that the lessons learnt element of the report is structured around identifying the strengths and weaknesses of the project. It should also highlight specific lessons for the following audiences:

  • The grant recipient / project delivery body
  • Those designing and implementing similar interventions
  • Policy makers

3.23. Lessons should be objective and constructive, and wholly evidenced by the analysis within the report.

4. National evaluation

4.1. As set out in the UKCRF prospectus, in relation to a national evaluation, the UK government will:

  • Undertake a comprehensive process evaluation to understand how efficient the delivery structures and business processes are, including the impact of capacity funding.
  • Undertake evaluations which consider both the impact of funding on place and on investment themes.

4.2. In doing so we will look to build on the Towns Fund Evaluation Framework ensuring synergies are achieved where practical. In addition, the work undertaken will, where timescales allow, inform the development and implementation of the UK Shared Prosperity Fund.

4.3. The options for the national evaluation are still being worked through; however, the monitoring data sought in Annex A alongside existing administrative data available to DLUHC should mean no additional requests for data will be made by the Department at a later date.

4.4. As set out above, where DLUHC requires your support as part of the wider M&E efforts (e.g. identifying stakeholders to interview), we will put any requests to the relevant officer and seek to minimise the administrative burden on your staff.

Published 3 November 2021