Research and analysis

Ending Rough Sleeping Data Framework: Quality Assurance Review

Published 29 February 2024

Applies to England

1. Acknowledgements

Thank you to the Department for Levelling Up, Housing and Communities (DLUHC), Centre for Homelessness Impact (CHI) and to the rough sleeping data-led framework early adopter areas who have given time to provide feedback and valuable insights for the development of this piece of work.

Report by Lili Lainé of Homeless Link, commissioned by DLUHC.

2. Executive summary

This report presents the findings of a research study conducted by Homeless Link and commissioned by DLUHC. The research investigates the implementation of Phase I of the new rough sleeping data-led framework developed by DLUHC, CHI and five early adopter areas (London, Greater Manchester, Newcastle, West Midlands and Bournemouth, Christchurch and Poole). The primary objective was to understand existing approaches, methods, and processes used by local areas to collect and report on the data requirements introduced with the framework. The research aimed to appraise what is needed for robust and consistent data collection and to establish a verification process for improved data quality standards.

The research findings are based on a detailed review of approaches and processes from 13 local areas. Overall, the study confirms the robustness of Phase I indicators while recognising that there will always be variability and potential inaccuracies in data. There is a clear need for enhanced guidance, verification processes, and detailed explanations of strengths and limitations. This is important not only for improving comparability but also for ensuring that users are well-informed of these issues when utilising the data.

Overall, areas for improvement should seek to address:

  • The gap in data management and technical capacity that exists between local areas. Whilst some areas have the right capacity and systems to ensure good data quality standards, others still rely heavily on manual systems, and this presents capacity and capability challenges. Supporting the latter should be prioritised to ensure a baseline standard and enhance overall data quality, consistency and reliability.
  • Clearer definitions and guidance for specific indicators to improve comparability and consistency. Some indicators, particularly the R1 single night snapshot estimate, exhibit flexibility in their definitions, resulting in diverse interpretations that lead to inconsistencies and comparability issues.

In summary, the following recommendations are proposed for consideration by DLUHC:

  • Develop a manual outlining guidelines and tips, and including resources, for achieving good data quality standards. This could be structured according to the data quality dimensions used by the government in its Data Quality Framework.
  • Implement a flexible verification model building upon the ongoing engagement of rough sleeping advisors with local areas. This model would capture essential information on local areas’ data collection approaches, methods, and processes annually, utilising a concise assessment questionnaire. The objective is to identify areas that may benefit from additional support or guidance to achieve high-quality data. Additionally, the model aims to highlight shared experiences and challenges with the framework, providing valuable insights to DLUHC for future improvements.

A summary of the purpose, collection methods, strengths and limitations of all indicators can be found in Appendix A.

3. Introduction

Homeless Link was commissioned by the Department for Levelling Up, Housing and Communities (DLUHC) to undertake a piece of research looking at how local areas in England have implemented the new rough sleeping data-led framework and to appraise data collection methods and validation processes.

4. Background

In 2022, the government’s updated rough sleeping strategy ‘Ending Rough Sleeping for good’ was published. The strategy identifies the steps to be taken towards the vision of ending rough sleeping for good, and for the first time a clear definition of what this means was adopted: ‘we will have ended rough sleeping when it is prevented where possible, and where it does occur it is rare, brief and non-recurrent’. This has led to the development of a new data-led framework to be used to measure progress towards ending rough sleeping, developed by the Department for Levelling Up, Housing and Communities (DLUHC) with support from the Centre for Homelessness Impact (CHI) as well as five ‘early adopter’ areas (Newcastle City, Bournemouth, Christchurch and Poole, and Local Authorities in London, Greater Manchester, and the West Midlands). These areas have been participating in the development of the indicators, refinement of the definitions and trialling data collection since March 2022, providing an expert ‘sounding board’ to test the framework prior to national roll out.

The framework is a national data collection model, which was rolled out England-wide in May 2023. Data is now gathered by local authorities on a monthly basis to measure progress towards the vision of ending rough sleeping for good, using five key core indicators. The first quarterly publication of data from the framework was released on 30 November 2023. The new monthly data is a significant step forward in terms of offering an enhanced understanding of the rough sleeping population. It will support local government to measure progress towards the shared vision to end rough sleeping, and shape local and national responses to support vulnerable people. With more efficient data collection and reporting processes through harmonised definitions, the hope is also for the data-led framework to lead to more effective service delivery as well as policy and funding decisions based on a single source of comparable information.

5. Objectives and purpose of the research

To enhance the quality of the data supporting the new data-led framework for addressing rough sleeping, it is crucial to assess how local areas currently collect this information. This includes examining the methods employed in order to measure consistency, comparability, and data quality, which, along with providing a better understanding of the strengths and limitations of the indicators, can be used to develop improved guidance.

The research has two objectives:

  • Understanding existing approaches and processes for the local collection and verification of the data underpinning the new indicators, including the potential challenges that local areas are facing, best practices and limitations. Findings will be used to provide recommendations for standardised methods of data collection and implementation of the framework, resulting in greater clarity and higher-quality, more robust data.
  • Understanding what steps and verification approaches could be taken for the new data-led framework to ensure a similar verification standard to the annual rough sleeping snapshot.

6. Methodology

To deliver on the objectives, Homeless Link carried out a series of research activities:

The development and dissemination of a detailed audit questionnaire directed to ‘early adopter’ local areas to collect detailed information on data collection processes including use of systems; stakeholders involved; quality assurance, checks and validations mechanisms in place; interpretation of definitions; key challenges and improvements made/to be made; strengths and limitations of the data[footnote 1].

The questionnaire was completed by 10 local areas: Manchester; Rochdale; Oldham; Coventry; Wolverhampton; Sandwell; Walsall; Solihull; Bournemouth, Christchurch, and Poole; and Newcastle. Follow-up discussions were undertaken when additional information was needed and to provide a deeper understanding into the rationale, practicalities and perceived barriers of the methods used.

These areas were all among the ‘early adopter’ areas which have worked closely with DLUHC and CHI to develop, test, and implement the new data-led framework. In this way, they have all engaged with the data-led framework in some capacity and therefore were more likely to be slightly more ‘advanced’ (at least in their reflections around what works and what is challenging) compared to those areas who had not been involved in the testing phase prior to national roll out. However, considering the diversity of the areas in terms of size, geography and operational context, the findings should be illustrative and reflective of the existing approaches and methods used by local areas across England to capture the new data.

Structured interviews with three London Boroughs (LB Enfield, LB Kingston-Upon-Thames, and LB Westminster) were undertaken, as well as with the Greater London Authority (GLA) and London Councils, to understand any specific issues related to CHAIN (Combined Homelessness and Information Network), Greater London’s case management system which is used to pre-fill the indicators in the framework.

7. Definitions

The new data-led framework introduced a new shared definition that ‘Ending rough sleeping means preventing it whenever possible, and where it cannot be prevented, making it rare, brief and non-recurring’.

Under these four broad themes, the core indicators are defined as:

7.1 Prevented

  • P1 - Number of new people seen sleeping rough
    This comprises two estimated figures:

    • P1 (monthly): the number of new people seen sleeping rough over the reporting month.
    • P1 (snapshot): the number of new people seen sleeping rough on a single night.

A person is considered ‘new’ if they have not been seen sleeping rough in the Local Authority in the 5 calendar years (60 months) preceding the date they were seen sleeping rough during the current reporting period.

  • P2 - Number of people seen sleeping rough after being discharged from institutions
    This comprises six estimated figures:

    • P2A (five figures): the number of people sleeping rough in the reporting month who have left institutions in the last 85 days.
    • P2B (one figure): the number of people sleeping rough in the reporting month who are under 25 and who are care leavers.

For P2A, a person is counted as having left an institution if they report having been discharged from any of the following within the last 85 days (12 weeks + 1 day): prisons (adult and youth); other justice accommodation, for example accommodation provided by the National Probation Service; general and psychiatric hospitals; UK Armed Forces; National Asylum Support Services accommodation.

7.2 Rare

  • R1 - Number of people sleeping rough
    This comprises two estimated figures:

    • R1 (monthly): the number of people seen sleeping rough over the reporting month.
    • R1 (snapshot): the number of people seen sleeping rough on a single night.

7.3 Brief

  • B1 - Number of people experiencing long-term rough sleeping

    This comprises one estimated figure:

    • B1: the number of people experiencing multiple and/or sustained episodes of rough sleeping.

A person will be considered as experiencing multiple and/or sustained episodes of rough sleeping if they have been seen sleeping rough in the reporting month and have also been seen in 3 or more of the last 12 months.

7.4 Non-recurring

  • NR1 - Number of people returning to rough sleeping
    This comprises one estimated figure:

    • NR1: the number of people who were seen sleeping rough previously and have returned to the streets after a period of time.

A person is considered as a ‘returner’ if they have been seen sleeping rough again after no contact for 2 or more quarters or 180 days, whichever is shorter, measured from the last date the person was seen.
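The B1 and NR1 rules above are, in practice, date-window calculations over a person’s recorded sighting history. The following is a minimal, illustrative sketch of one reading of those rules; the function, field names and simplifications (for example, approximating the 12-month window as 365 days and using the 180-day threshold for NR1) are hypothetical and do not reflect any system actually used by local areas.

```python
from datetime import date, timedelta

def classify_history(sightings: list[date], year: int, month: int) -> dict[str, bool]:
    """Apply one reading of the B1 and NR1 rules to one person's sighting dates.

    `sightings` holds every date on which the person was recorded sleeping
    rough in the local authority's records; `year` and `month` identify the
    reporting month.
    """
    month_start = date(year, month, 1)
    in_month = sorted(d for d in sightings if (d.year, d.month) == (year, month))
    if not in_month:
        # Not seen in the reporting month, so neither rule applies.
        return {"long_term_B1": False, "returner_NR1": False}

    # B1: seen in the reporting month and seen in 3 or more months of the last
    # 12 (counted here as distinct calendar months in the trailing year,
    # including the reporting month itself - one reading of the definition).
    window_start = month_start - timedelta(days=365)
    months_seen = {(d.year, d.month) for d in sightings if d >= window_start}
    long_term = len(months_seen) >= 3

    # NR1: a 'returner' if seen again after no contact for 180 days or more,
    # measured from the last date the person was previously seen (180 days
    # being the shorter of the two thresholds in the definition).
    earlier = [d for d in sightings if d < in_month[0]]
    returner = bool(earlier) and (in_month[0] - max(earlier)).days >= 180

    return {"long_term_B1": long_term, "returner_NR1": returner}

# Hypothetical history: seen in two months of late 2022, then again in June 2023.
history = [date(2022, 11, 3), date(2022, 12, 15), date(2023, 6, 20)]
print(classify_history(history, 2023, 6))  # {'long_term_B1': True, 'returner_NR1': True}
```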

8. Understanding local approaches to data collection and verification

The findings presented in this report build upon deep-dive research into the approaches and methods used by 13 local areas to capture the data underpinning the indicators of the new data-led framework. This section provides an overview of the contexts in which the data is collected, highlighting the overarching factors that may impact data collection and verification at local level, and ultimately data quality.

8.1 A high-level overview of the process

Since May 2023, all local areas in England have been submitting monthly figures for the data-led framework to DLUHC through the DELTA portal. These indicators have been built into a broader set of management information that local areas report on monthly.

Below is a high-level overview of the data collection process:

  • Collection – Typically, data collection is undertaken by a commissioned or in-house rough sleeping outreach team. Relevant information is collected by outreach workers during ‘sweeps’ or visits following a referral. Usually, outreach workers document the data in real-time using standardised forms or digital tools which may include checkboxes, drop-down menus and open-text fields to capture various data elements. The data is usually recorded as close to the point of contact as possible to minimise errors.
  • Data entry – Collected data is entered into a database or electronic system. This step is usually performed by outreach workers themselves. If outreach teams record their data into specialist software, validation rules and algorithms are usually applied to identify potential errors or discrepancies such as missing information or outliers. If outreach teams are working on spreadsheets, the process is more manual and more vulnerable to human error and potential inconsistencies. However, outreach workers usually carry out manual checks to ensure completeness and consistency. This includes cross-referencing between systems if local areas have more than one.
  • Data review / quality assurance – The data collected and entered into outreach teams’ systems each day is usually reviewed by the team lead, who carries out additional checks after each outreach session. Beyond quality assurance, outreach team leads review the data in order to monitor outreach needs and activities. This step may include asking clarifying questions or confirming details.
  • Validation – At month end and prior to data submission, outreach teams review their monthly data and discuss their final figures. The data is usually submitted via DELTA by the local authorities’ rough sleeping leads.

Each local area surveyed has a data collection process similar to the one outlined above but their practices, protocols and processes vary.

8.2 Operational and technical context

The way rough sleeping data is collected and verified at local level is intrinsically linked to how outreach is delivered locally. It is therefore important to consider the operational and technological contexts in which the data is collected in order to understand the different approaches and methods used by local areas and, overall, to appraise the quality, consistency and comparability of the data underpinning the new data-led framework.

  • Data collection:
    • Of the 13 audited areas, seven had in-house outreach services, and the remaining six areas used commissioned services.
  • Outreach activities and processes:
    • All areas conducted outreach sweeps at least once a week, with eight areas conducting sweeps at least five times a week. All areas reported conducting sweeps in the early morning, with three areas occasionally also conducting night sessions.
    • All areas followed up on StreetLink referrals and reports throughout the day.
    • All areas noted that all new rough sleepers and bedded down contacts with existing rough sleepers were recorded on systems daily.
  • Software systems:
    • Over half of the audited areas (seven) had two different software systems in use by their outreach teams and Housing Options teams.
      • Of these, four outreach teams used spreadsheets.
    • Three areas (outside of London) had a single data system in use across outreach and Housing Options teams.
    • The three London boroughs that were audited used the pan-London CHAIN system, with two areas also using additional software.
  • Use of single identifiers:
    • Most of the audited areas (eight) made use of single identifiers across systems.
    • The remainder (five areas) did not use single identifiers, meaning that manual checks were required.
  • In-house data analysts:
    • Eight of the audited areas had an in-house data analyst.

Appendix B presents the above in a summary table, split by each area.

Responsibilities

Data is either collected directly by the local authorities through their in-house rough sleeping teams or by commissioned services which both deliver outreach activities and are responsible for data collection.

Outreach capacity

Local areas have varied resources and capacity to deliver outreach. The frequency of outreach activities as well as the intensity of the sweeps can have an impact on the robustness of the data collected.

In addition to this, the level and quality of partnership work can also influence the data outputs. A local area with established communication channels with key partner agencies as well as clear protocols for information sharing is likely to be able to produce better quality data as well as introduce more efficient verification processes than a local area with weaker multi-agency links.

Technical capacity

Technical capacity and abilities are very varied across audited local areas. For this reason, processes and protocols for recording, validating, and verifying data will inevitably be diverse. The process of data collection, verification and reporting will likely be more streamlined and consistent in local areas that have higher technical capabilities and use an integrated data management system. Likewise, some local areas have in-house data analysts who can make the process of reporting and analysing data easier.

Data collection and systems

The tools used to collect rough sleeping data are varied.

If the service is commissioned, and especially in smaller local authorities that may not have access to an integrated data system such as CHAIN (London) or GMThink (Manchester), outreach teams usually collect and collate data through their own systems (for example, In-form, Opal, Excel) whilst also reporting to the local authority after each outreach session on number of people seen and other relevant information that would be required to report on the indicators.

If the service is in-house, local areas are more likely to have a single system, however this is not always the case. Integrated case management systems are used to record data from initial contact through to support plan, emergency accommodation and into long-term sustainable accommodation. They can integrate with the Homelessness Reduction Act requirements and act as a Customer Relationship Management (CRM) system, allowing for the recording of contacts and serving as a document repository.

In London, CHAIN provides a centralised platform for collecting, storing, and managing data on rough sleeping across London. CHAIN enforces standardised data collection protocols, ensuring consistency in the type of information gathered.

A number of local areas are also working with different systems, where outreach teams use a database to log and record rough sleeping data which is not integrated with the database used by Housing Options and other council services.

Working with manual systems

Overall, those relying on more manual systems such as MS Excel face several risks and limitations in comparison to those that have a more integrated case management system which usually has built-in data validation and verification features, more robust security features, real-time collaboration capability and an ability to scale, analyse data and produce reports. They can also easily integrate with other systems.

Key risks and limitations include data entry errors (for example, typos or incorrect formats); an increased risk of data inconsistencies due to reduced integration with other systems and often fewer validation checks; limited scalability, reporting and charting capabilities; and the need to establish a range of manual checks, which can be burdensome.

Local areas working on MS Excel are usually storing their databases on a secure cloud system. In order to maintain data integrity and avoid human errors, most databases only allow the outreach team leads to delete or modify what has been entered and saved.

Data retention 

Some local areas are able to access information that is up to 10 years old, while others can only access records for a shorter period, no longer than four years. This limits the time span over which some indicators (for example, NR1) can be calculated.

9. Focus on the indicators

9.1 R1: Number of people sleeping rough

R1 is the base indicator for rough sleeping, which is used in percentage calculations for other indicators. The indicator is reported as the number of individuals seen sleeping rough during the reporting period. Two figures make up the indicator:

  • An estimated monthly figure of the number of people seen sleeping rough over the reporting month.
  • An estimated snapshot figure of the number of people seen sleeping rough on a given night in the reporting month.

For each area, the R1 estimate as a rate per 100,000 of population will be automatically calculated. 
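For illustration only (the figures are hypothetical): an area with an R1 monthly estimate of 40 people and a resident population of 250,000 would have a rate of (40 ÷ 250,000) × 100,000 = 16 per 100,000.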

The estimated monthly figure

The estimated monthly figure captures a full month of rough sleeping, and it differs from the single night snapshot, which captures rough sleeping at a point in time.

Gathering the data

In all audited local areas, the figure is produced by adding up the number of unique individuals seen bedded down across the month by the outreach team during their shifts or sweeps. The approach taken for gathering and recording the data is also similar, with local outreach teams recording the contacts they have had with people sleeping rough each day on their database or system.

The data collected is therefore directly linked to their outreach activity, including geographic coverage and the intensity of the sweeps carried out. It is also reliant on reports (for example, Streetlink) and referrals from partner agencies, especially in order to capture people who do not engage with statutory services and outreach teams.

Implementing the indicator

Local areas did not have to make significant changes to their data infrastructure or processes to implement this indicator. Those with case management and case note software were already able to extract the number of rough sleepers seen over the course of a month. Those working with spreadsheets introduced daily outreach reports to log and record daily contacts from outreach activities.

Verifying and validating the figures

For this indicator to be as accurate as possible, it is important for local areas to ensure that people sleeping rough are appropriately identified and that daily outreach contacts are appropriately recorded and logged into local areas’ systems throughout the reporting periods.

Checking duplicates and identifying the unknowns

Unlike the estimated snapshot figure, only ‘known’ or ‘identified’ individuals are included in the figure. Building the monthly estimate also requires de-duplicating the data and making sure that someone seen sleeping rough over the month is only counted once. Outreach records usually include identifiers such as names or descriptions of the person, but identification has been reported as a challenge by some local outreach teams, who have described the steps they take to identify ‘unknowns’ across the month so that the monthly figure is as accurate as possible.

  • Local area example 1: On their sweeps, if someone is seen but cannot be identified (for example, because they are hidden under their bedding) or is not willing to engage, the local outreach team still logs all the information they have about the individual, such as a description of their appearance or bedding, in a specific spreadsheet. The team then engages with partners informally over the month and at the monthly Rough Sleeper Action Group to try to identify these ‘unknown’ individuals and capture information on them.

  • Local area example 2: The local area has built-in duplication checks based on date of birth and client names. The checks are run at least once monthly before data validation and submission.
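For areas maintaining their own spreadsheets or databases, the kind of duplication check described in example 2 can be approximated by normalising names and dates of birth and flagging matching records for manual review. A minimal, hypothetical sketch (the column names and file are illustrative, not those of any particular system):

```python
import csv
from collections import defaultdict

def find_possible_duplicates(rows):
    """Group outreach records that share a normalised name and date of birth.

    `rows` is an iterable of dicts with 'name' and 'dob' keys (for example,
    rows read from a monthly outreach log with csv.DictReader). Matching
    records are flagged for manual review rather than merged automatically.
    """
    groups = defaultdict(list)
    for row in rows:
        name = " ".join((row.get("name") or "").strip().lower().split())
        dob = (row.get("dob") or "").strip()
        if name and dob:
            groups[(name, dob)].append(row)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

# Example usage against a hypothetical monthly log:
# with open("outreach_log_march.csv", newline="") as f:
#     duplicates = find_possible_duplicates(csv.DictReader(f))
#     for (name, dob), recs in duplicates.items():
#         print(f"Review {len(recs)} records for {name} ({dob})")
```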

Working with others

Whilst none of the local areas have a formal verification protocol in place involving partner agencies, most local areas can rely on existing communication channels and multi-agency working groups to sense-check their data on rough sleepers seen across the month. 

  • Local area example 1:  All the outreach contacts are reviewed, cross-checked, and validated via weekly RSI meetings. These are attended by partners (multi-agency) bi-monthly.

  • Local area example 2: The communication between partners is mainly informal but information is shared weekly between the outreach team and key partners such as Change Grow Live (CGL) and the Salvation Army. The outreach team engages with broader relevant partners (for example, probation, mental health hospitals) where gaps in their knowledge exist and on a case by-case basis.

  • Local area example 3: A weekly Rough Sleeping Action Group brings multi-agency partners together and is used to sense-check the data.

Reported strengths and limitations
Strengths

This indicator is a widely understood measure and the parameters for data collection are clear. All audited local areas were confident in their ability to provide a good and representative estimate based on logged outreach contacts.

Local areas consider that the indicator complements the estimated single night snapshot figure well, providing additional value around:

  • Continuous monitoring: the figure provides a more continuous and ongoing understanding of the rough sleeping population, helping in understanding ‘the big picture’, trends and movements, as well as understanding need and demand for homelessness assistance.
  • Capturing hidden homelessness better (although there are still clear limitations): the outreach team can work to identify individuals who would not necessarily be visible on a single night, including by engaging with partners throughout the month.
Limitations

As this is a ‘baseline’ figure, its key limitation applies to all indicators. This is:

  • Hidden homelessness - although the way this figure is produced allows outreach teams to work to identify any unknown or unidentified individuals, many individuals experiencing rough sleeping may not be visible and therefore identified or may actively avoid being counted. This includes those sleeping rough in a more transient manner as well as groups like women, people of colour and members of the LGBTQ+ community who tend to sleep rough in more hidden locations.
Key considerations for this figure

Despite the diverse approaches taken by local areas, there is a shared understanding and interpretation of the definition provided.

Overall, there is no data quality concern for this estimate figure, provided that data collectors and users understand and recognise its limitations, and that everything is done to ensure quality assurance, good use of system(s), maximum identification of rough sleepers (which partly relies on the intensity and extent of outreach provided) and accurate recording of daily outreach contacts.

The single night ‘snapshot’ figure

The single night ‘snapshot’ figure records only those people seen, or thought to be, sleeping rough on a single ‘typical’ night and does not include everyone sleeping rough in areas across the reporting period.

Approaches to data gathering and collection

The approaches taken by local authorities for capturing the single night snapshot figure vary greatly across local areas as well as within local areas from one reporting month to another. These derive from existing approaches and methodologies (for example, the three approaches of the annual snapshot estimate methodology) or previous funding requirements (for example, 2018 RSI funding required local areas to carry out bi-monthly rough sleeper counts).  

The different approaches taken by audited local areas to produce the single night snapshot estimate are outlined in the Local approaches to capturing single night snapshot figure table, and are summarised here:

  • A count-based snapshot estimate from outreach contact logs: Most local areas produce the figure by adding up all unique individual contacts seen bedded down on a chosen night or by producing an average of daily outreach contacts over a period during the reporting month or over the reporting month. Many areas alternate this method with a physical street count every other month.
    • Local area example 1: The figure is produced by pulling out outreach contacts’ numbers from the last outreach day of the month.
    • Local area example 2: The outreach team looks at the average number of daily outreach logs over the reporting month and picks a date on which the outreach logs reflect this average and are therefore representative of the context/picture of the whole month.
  • A count-based snapshot estimate from physical street count: A significant number of areas undertake a physical street count of their area on a chosen typical night every month or bi-monthly. The resources, capacity and protocol are however (understandably) less robust than for the annual snapshot, with outreach teams covering key hotspots as well as areas where they have seen people sleeping rough throughout the month.
    • Local area example 3: A street count is undertaken every month. This is the same week each month, but the days of the week change to ensure that the approach accounts for people’s patterns and behaviours.
    • Local area example 4: A street count has been undertaken bi-monthly since 2018. This is always organised on the last Thursday evening of the month.
  • An evidence-based estimate including a spotlight count: Some local areas undertake an evidence-based estimate including a spotlight count, following the annual snapshot methodology closely.
    • Local area example 5: The evidence-based estimate is produced based on intelligence from partner agencies, along with a spotlight count on key hot spots on the last outreach day of the month, every month.

It is not uncommon for local areas to change their approach from month to month. Areas are advised to use one of the three snapshot approaches every two months.

Local approaches to capturing single night snapshot figure

The following table highlights the different snapshot methods used across the 13 participating local authorities and contains details about how they select the night around which to conduct their estimate.

| Area | Count-based estimate: outreach contact count | Count-based estimate: street count | Evidence-based estimate with spotlight count | Details | Selection of single night |
| --- | --- | --- | --- | --- | --- |
| 1 | Yes | Yes | – | Method changes from one month to another; street count every other month; estimate is an average figure of daily outreach contacts over a 2-week period | Street count: last Thursday evening of the month; 2-week snapshot: last two weeks of the month |
| 2 | Yes | – | – | Same method every month; estimate figure is produced based on outreach logs on a representative night | Not a specific date; picked so it is representative of the context/picture given across the whole month (figure-driven rather than date-driven) |
| 3 | – | – | Yes | Same method every month; evidence-based estimate with spotlight count on key hotspots | Last outreach day of the month |
| 4 | – | Yes | – | Same method every month; street count | Same week each month but alternate day (weekdays only, avoiding nights of benefits payments) |
| 5 | Yes | Yes | – | Method changes from one month to another; street count every other month; estimate figure is produced based on the number of outreach contacts on a representative night | Street count: last Friday of the month; estimate: not a specific date, picked so it is representative of the whole month (figure-driven rather than date-driven) |
| 6 | Yes | – | – | Same method every month; estimate figure is produced based on outreach logs of a chosen typical night | Last outreach day of the month |
| 7 | Yes | Yes | – | Method changes from one month to another; street count every other month; estimate is based on daily outreach team activity and local intelligence from partners | Not a specific date; picked so it is representative of the whole month (figure-driven rather than date-driven) |
| 8 | Yes | – | – | Same method every month; estimate figure is produced based on outreach logs of a chosen typical night | Randomly selected date from the outreach log |
| 9 | Yes | Yes | – | Method changes from one month to another; street count every other month; estimate is an average figure of daily outreach contacts over the month | – |
| 10 | Yes | Yes | – | Method changes from one month to another; street count every other month; estimate is an average figure of daily outreach contacts over the month | Same week each month but alternate day (weekdays only, avoiding nights where benefits payments have been made) |
| 11 | Yes | Yes | – | Method changes from one month to another; street count every other month | Street count: different day each month (weekdays only); estimate: no single night but an average over 5 days |
| 12 | Yes | Yes | – | Method changes from one month to another; street count every other month | – |
| 13 | Yes | Yes | – | Method changes from one month to another; street count every other month | – |
| Total | 11 | 9 | 1 | – | – |

Those undertaking a count-based snapshot estimate via street count broadly follow the principles set out in the Homeless Link annual snapshot guidance to plan and carry out their count. However, the capacity and resources put into these vary across areas.

  • Local area example 6: The bi-monthly count involves a number of volunteers sourced from across the homelessness sector and a local partnership to support the outreach team with data collection. The team manager monitors the spreadsheet as data is submitted and checks to remove duplicates.
  • Local area example 7: The bi-monthly count involves key partners including CGL, the Salvation Army, Crisis Skylight etc. All known hotspots are visited, and intelligence gathered by all different partners.
  • Local area example 8: The count usually involves one or two external partners, but local partners’ involvement is less significant than for the annual snapshot. Yet, the team will cover more ground than on a usual outreach session, following pre-defined routes and visiting all known locations.
Choosing the ‘single night’

As described in the table above, there is no consistency in the way local areas are selecting their ‘single night’. The current definition and associated guidance exhibit a degree of flexibility that permits varied interpretations.

Of those undertaking a count-based snapshot estimate from a physical street count, some local areas have developed a systematic and consistent approach to picking their representative ‘single night’: the majority either choose the same week every month but alternate the day of the week to account for unseen patterns, or choose the last outreach day of the month for consistency and ease of logistics. Some local areas do not have consistent approaches for choosing their ‘single night’ but consider several factors when deciding each month, including avoiding weekends, periods around events, and nights surrounding benefit payments.

Of those building their snapshot estimates based on outreach contact logs, many do not choose a ‘single night’ but rather calculate the average of daily outreach contacts logged across a period (for example, the last 5 days of the month or the last 2 weeks of the month) or across the whole month. Once the average figure is established, the ‘single night’ that is closest to this figure is chosen. This means that the figure is likely to be more representative of the picture across the whole month; however, this approach moves away from the intention of the ‘single night’ snapshot figure.
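In practice, this ‘figure-driven’ selection amounts to averaging the daily contact counts and choosing the night whose count sits closest to that average. A minimal sketch using hypothetical daily figures (the function and data are illustrative only):

```python
def representative_night(daily_counts: dict[str, int]) -> tuple[str, int]:
    """Pick the night whose count of unique rough sleepers is closest to the
    average across the period (a figure-driven rather than date-driven choice).

    `daily_counts` maps an ISO date string to the number of unique individuals
    logged by outreach that night.
    """
    average = sum(daily_counts.values()) / len(daily_counts)
    night = min(daily_counts, key=lambda d: abs(daily_counts[d] - average))
    return night, daily_counts[night]

# Hypothetical month of outreach logs:
logs = {"2023-06-05": 12, "2023-06-12": 9, "2023-06-19": 15, "2023-06-26": 11}
night, count = representative_night(logs)  # average = 11.75 -> ('2023-06-05', 12)
```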

Most local areas alternate between the two methods, and therefore the ‘single night’ will be different from one month to another. This introduces some challenges when it comes to comparison: averages smooth out nightly fluctuations, while a single night snapshot provides specific, unsmoothed counts.

Unlike the annual snapshot estimate where local areas try as much as possible to coordinate with neighbouring local authorities to pick the same night for their rough sleeping counts, there are no consistent cross-boundary considerations for the monthly single night snapshot estimates. This potentially leads to double counting with neighbouring authorities.

Validation and verification

For the local areas undertaking count-based snapshot estimates, data is collected either on paper or digital forms. These are consistently completed on the night/morning of the snapshot and checked by the outreach team leads, who also work to identify duplicates – where they can – based on the names and dates of birth collected, or descriptions of individuals. This has been consistently reported as challenging, with some local areas acknowledging relatively high numbers of unidentified individuals after each snapshot count. The final figure produced by the outreach team leads is discussed and sense-checked with the outreach workers before submission. Overall, the steps provided in the Homeless Link guidance around data collection and recording are closely followed by local areas.

For local areas producing their snapshot estimate based on daily outreach contacts, the figure is usually validated by the outreach team lead through a quick sense-check and comparison with previous monthly figures. It is not necessarily verified further, as the data is usually extracted from a system in which data entries have already been checked after each outreach session.

Reported strengths and limitations
Strengths

Since local areas have been planning and carrying out the annual rough sleeping snapshot estimates since 2010, this indicator is well understood and most local areas feel confident in the robustness of their methods and the quality of their data, although they acknowledge that what is delivered is less thorough than for the annual snapshot.

The monthly single night snapshot estimate is considered to complement the annual snapshot estimate well, allowing local areas to monitor trends and fluctuations in the number of rough sleepers and get a more granular understanding of who is on the street on given nights throughout the year. It also complements the monthly estimate well and can serve as a useful benchmark.

Limitations

Local areas identified a number of limitations for this figure, and whilst some of these are common to all indicators, some specific challenges around consistency of data collection and comparability of data have been identified and raised. Reported limitations include:

  • Representativeness: just as the over-the-month count estimate has limitations around representativeness because the figure does not include unidentified individuals, the single night snapshot count estimate has limitations because not all people sleeping rough may be visible or in places that are easily accessible to outreach teams. The caveat around under-representation of some specific groups also applies to this figure.
  • Variability: the number of rough sleepers can significantly vary from one night to another due to a range of factors and a single night count may not capture the dynamic nature of street homelessness.
  • Verification and validation: limited resources and capacity mean that it is more difficult to involve multi-agency partners and independent partners in the count.
  • Interpretation and comparability: with most local areas using different methods for collecting data and identifying their ‘single night’, it is difficult to interpret and compare data between areas and from one month to another.

In light of the above, most local areas reported finding the over-the-month count estimate more helpful for understanding rough sleeping and for indicating broader trends.

Key considerations for this figure
Key points for audited local areas

Given the different contexts in which local areas are operating, as well as the varied capacity and resources of their outreach teams, some level of variation in approaches and data collection methods is inevitable. This is also the case for the verified annual snapshot estimate methodology, which sets out three different methodologies to cater for different spatial and operational contexts.

However, as it stands, there are some concerns around the varied interpretations of the definition and the ‘single night’, and the varied approaches taken, resulting in a lack of consistency in data collection. This creates clear challenges in comparing the data in each local area from one month to another, and between areas across the same period. Overall, more consistency and robustness could be brought in through further clarifications and guidance to help local areas achieve better data quality.

Key points for all local areas

Considering the variety of interpretation and data collection models that audited local areas are using to produce this figure, it is likely our audited sample does not reflect all the variations that exist amongst all local areas in England. 

To achieve greater robustness and allow for meaningful use and comparison of the data collected, the following should be considered:

  • Ensuring that local areas undertaking bi-monthly street counts are doing so on the same months to allow for more comparability (for example, Jan/Mar/May/July/Sept/Nov)
  • Providing additional parameters and guidance around choosing the ‘single night’. Whilst local areas have demonstrated that they have developed their own robust approach, those using an average figure across a number of days or across the month are moving away from the intention behind the ‘snapshot’ figure approach. In addition, a number of local areas have expressed interest in understanding how others are going about choosing their ‘single night’. Guidance could support the development of more consistent approaches.

Given the varied approaches, DLUHC should ensure that the methods used to produce the figures submitted monthly are reported on so that these variations and associated limitations can be considered if any analysis is done on the data, especially if the data is aggregated and used for comparability purposes.

9.2 P1: Number of new people sleeping rough

The indicator is reported as the number of individuals seen sleeping rough for the first time during the reporting period. Two figures make up the indicator:

  • An estimated monthly figure of the number of new people seen sleeping rough over the reporting month.
  • An estimated snapshot figure of the number of new people seen sleeping rough on a given night in the reporting month.

A person is considered ‘new’ if they have not been seen sleeping rough in the Local Authority in the five calendar years (60 months) preceding the date they were seen sleeping rough during the current reporting period.

In London however, a person is considered ‘new’ if they have not been seen sleeping rough in one of the London boroughs in the five calendar years preceding the date on which they were seen sleeping rough during the current reporting period.

Data gathering and collection

P1 monthly

This indicator builds on indicator R1, as it reports each month on all newly identified rough sleepers seen on a single night, as well as those seen bedded down across the month by the outreach team during their shifts or sweeps or engaged with through outreach activities. In this way, the robustness of this indicator relies on the robustness of the R1 monthly estimate.

The approach taken for gathering and recording the data is similar across local areas, with outreach teams consistently recording any contact they have had with newly identified rough sleepers each day on their database or system.

Implementing the indicators

Local areas had already been collecting data in relation to ‘new’ rough sleepers since October 2020. However, the indicator introduced a more specific definition of ‘new’.

Overall, local areas did not have to make significant changes to the way data is collected and recorded to introduce this new indicator. For most, existing data systems were easily adapted to the new requirements. The following changes were undertaken:

  • All made sure that the dates on which people were seen sleeping rough for the first time were consistently recorded in their systems.
  • For local areas with commissioned outreach service(s), changes were made to the monthly return request to include the date people were identified as rough sleeping.
  • Those with a centralised data system created an algorithm that would identify all bedded down clients within a required timescale, allowing them to easily extract the data from their system each month.
  • Those working on spreadsheets made changes to their data collection tools in order to quickly identify those that would meet the ‘new’ threshold.

Two local areas, each working with two separate systems, reported needing to perform substantial checks to reconcile data between systems in order to gain access to the historical data needed to understand whether a newly identified rough sleeper is indeed ‘new’ or ‘returning’. This is likely to apply to more local authorities beyond the audited local areas. These local areas reported that checking historical records is relatively straightforward to do on a monthly basis to produce the P1 ‘over-the-month’ estimate, but much harder to do for the P1 single night estimate.
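The historical check described above is, in essence, a 60-month lookback over whatever sighting records are available, potentially spread across more than one system. A minimal, hypothetical sketch (the function, system layout and field names are illustrative only, and the 60 months are approximated as 5 × 365 days):

```python
from datetime import date, timedelta

def is_new_rough_sleeper(person_id: str, seen_on: date,
                         histories: list[dict[str, list[date]]]) -> bool:
    """Return True if no sighting of the person is recorded in the 60 months
    before the date they were seen this month (P1's definition of 'new').

    `histories` is one {person_id: [sighting dates]} mapping per data source,
    for example the outreach team's log and the Housing Options database, so
    that both systems are checked before someone is counted as 'new'.
    """
    lookback_start = seen_on - timedelta(days=5 * 365)  # 60 months, approximated
    for history in histories:
        for past_sighting in history.get(person_id, []):
            if lookback_start <= past_sighting < seen_on:
                return False
    return True
```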

Working with historical data

Of the 13 audited local areas, four do not have consistent records going back five years to enable them to report on this indicator as outlined in the definition of “new”. However, areas reported confidence in the level of quality of their data for the past four years and therefore should be able to report appropriately on this indicator next year. It is believed that this will be the case for most local areas that started their data collection journey in 2018 with the data collection requirements attached to the RSI funding.

Verification and validation

Local areas currently lack well-defined verification protocols for the specified indicator. The existing checks for all indicators include addressing duplicates, engaging with partners when faced with unclear or confusing self-disclosures, and conducting a general sense-check of data. The primary verification activity involves a comprehensive examination of the local database to confirm whether individuals have received previous support, typically using names and dates of birth for this purpose.

Additionally, local areas perform consistent monthly checks to generate figures for the entire month. However, verifying the single-night snapshot estimate poses greater challenges, as individuals may not be appropriately identified during street counts.

Reported strengths and limitations
Strengths

Local areas reported feeling confident with this indicator when they have five or more years of historical records. They also reported the usefulness of this indicator, especially in tracking progress in relation to prevention.

Limitations

The following limitations were identified and discussed:

  • Working with historical data / multiple systems: As described in the previous section, a number of local areas either do not have access to consistent rough sleeping records beyond four years or have to do some checks across two different systems in order to interrogate whether individuals have been seen in the five-year period.
  • Cross-boundary considerations: Outside of London, local areas identify rough sleepers as ‘new’ if they are new to their areas but there is a potential for a number of people to have a history of rough sleeping in another local authority. This would not be captured through this indicator. To mitigate this, some local areas make notes of this when this is known or disclosed by people rough sleeping themselves or trusted partners so this knowledge can be taken into consideration when analysing or reporting on the indicator. This shows the importance of the qualitative narrative accompanying the data.
  • Overlap with ‘returners’: The indicator NR1 captures individuals who have returned to the street after a period of not being seen for six months. Some individuals can return to the street after a period of 5 years or more, and in such cases they would be captured under both indicators P1 and NR1.
Key considerations for this indicator
Key points for audited local areas

There is a shared understanding and interpretation of the provided definition for P1 across local areas. For the monthly estimate, the data collection approach and methods are consistently applied. However, for the snapshot estimate, the challenges and barriers are similar to those for the R1 snapshot estimate figure.

A significant challenge for this indicator lies in accessing and processing historical data spanning at least five years. Some local areas face difficulties due to inconsistent historical data or the necessity to navigate multiple systems, both of which could potentially result in increased human errors or inconsistencies.

Overall, there is no major concern about data quality for this estimate figure, provided that data collectors introduce thorough quality checks, ensure effective system utilisation, minimise human and processing errors, and maximise the identification of rough sleepers.

Key considerations for all local areas

Based on the audit findings, several local areas are encountering difficulties in maintaining high-quality data beyond a four-year timeframe, impacting their ability to identify ‘new rough sleepers’ accurately. Despite the initiation of Rough Sleeping Initiative (RSI) funding and associated data collection in 2018, some areas seem to face challenges in establishing and sustaining effective data collection systems. Considering the time elapsed since the inception of RSI, it is expected that local areas would be well-positioned to report on this indicator in the upcoming year.

Moreover, it is likely that many local areas currently employ multiple systems for cross-verifying whether individuals have been previously identified as rough sleepers. This practice introduces the potential for errors and inefficiencies. To address these issues, there is a pressing need for standardised procedures, such as aligning data formats across systems, implementing data matching algorithms, or consolidating into a unified system. These measures would contribute to improved accuracy and consistency in data reporting.

9.3 P2: Number of people seen sleeping rough after being discharged from institutions

  • P2A (five figures)

    The indicator is reported as the number of people seen sleeping rough having left an institution recently if they report having been discharged from one of the institutions below within the last 85 days (12 weeks + 1 day).

    • Prisons (Adult and youth)
    • Other justice accommodation, for example accommodation provided by the National Probation Service (that is, Approved Premises)
    • General and psychiatric hospitals
    • UK Armed Forces
    • National Asylum Support Services Accommodation
  • P2B (one figure)
    The indicator is reported as the number of people seen sleeping rough that are aged under 25 and are care leavers.

Data gathering and collection

As this indicator builds on indicator R1, the robustness of this indicator relies on the robustness of the R1 monthly estimate. 

To gather the information, local areas can count referrals that they receive from institutions, but mainly rely on self-disclosure from people experiencing rough sleeping. Outreach teams seek to collect the information during their first contacts with a new or returning rough sleeper, or once their assessment is undertaken.

Relying on self-disclosure can be challenging as people experiencing rough sleeping may not be willing to disclose information on their backgrounds for reasons including trust issues or stigma and shame. A number of local outreach teams described how this information is sometimes revealed or discovered months after initial contact, leading to inaccuracies in this indicator, especially for institutions for which an 85-day recording period is specified, since the information might not come to light before this window has closed. It is also common that the information disclosed by people experiencing rough sleeping is unclear or inconsistent, therefore requiring the outreach team to inquire with the relevant institution to check accuracy or receive additional details. Audited local areas have described the actions taken to mitigate the challenges in relation to self-disclosure and to improve data quality. This includes discussing information on newly identified, or returning, rough sleepers with partner agencies at regular multi-agency rough sleeping intelligence meetings and cross-checking information from other sources including housing assessments.

Prison

Information is gathered through self-disclosure or Duty to Refer (DTR) referrals received from probation and prisons.

Most local areas were previously reporting on this information and have existing communication channels and information sharing processes in place with relevant partners.

Other approved justice accommodation

Information is gathered through self-disclosure or DTRs / referrals received from probation and other partners. This information was not reported on previously, or was typically captured under ‘prison’.

Local areas reported a lack of knowledge of criminal justice systems and services.

General and psychiatric hospitals

Information is gathered through self-disclosure or information sharing with relevant partners including hospitals, Adult Social Care, CGL or GPs. The ability of local areas to report on this figure varies greatly, with some areas reporting established relationships with hospitals, and others reporting difficulties.

UK armed forces

Information is gathered through self-disclosure or referrals received from armed forces charities or veteran services. Local areas were not consistently reporting on this information previously. This was usually not a ‘priority’ question asked of newly identified rough sleepers.

Asylum support services accommodation

Information is gathered through self-disclosure or referrals received from the Home Office, Adult Social Care, or local refugee charities. Local areas were reporting on this information previously and are relatively confident in their ability to report on this indicator, although they have stressed the need for prompt and timely information sharing from partners.

Care leavers under 25

Information is gathered through self-disclosure or referrals received from Children’s Services / Care Leavers teams as well as local charities. Most local areas were reporting on care leavers previously. This information is usually disclosed by individuals, and local areas are relatively confident in their ability to report on this indicator.

Implementing the indicator

Overall, local areas were already collecting some data on people discharged into homelessness, but this was not consistent nor consolidated to include information from all institutions and there was no clear timeframe used. In addition, information on discharges typically included people who are at risk of homelessness, not necessarily rough sleeping.

  • Local area example 1: Before the implementation of the data-led framework, the local outreach team were already recording whether new people sleeping rough were prison leavers, however this was not time-banded. As part of the flow assessments, outreach workers were also recording whether people’s last settled base was an institution, but all types of institutions were grouped together.
  • Local area example 2: Before the implementation of the data-led framework, information on last settled base and whether an individual was discharged from an institution were typically recorded in case notes rather than in the flow assessment feature.

To implement the indicator, local areas reported having to:

  • Create new fields in their data tools allowing outreach teams to collect and input the appropriate information for each new or returning rough sleeper, including institution type and date of leaving last settled base in order to calculate if it was within the 85-day period.
  • Brief outreach teams to ask the right questions during their sweeps to gather the information appropriately, as well as to record the information more thoroughly within flow assessment features instead of case notes.

Validation and verification

The verification and validation processes are inconsistent across local areas. On top of the standard checks carried out for all the indicators, some local authorities seek to sense-check their information with the relevant institutions when this is straightforward and when pre-existing relationships are in place.

  • Local area example 3: The local outreach team attends a local prison release meeting every month. This pre-dated the implementation of the framework but intelligence on new / returning rough sleepers is now discussed to check accuracy.
  • Local area example 4: The local outreach team has an information sharing protocol in place with the local prison and receives a list of all the individuals leaving the prison every week.
  • Local area example 5: The local outreach team has a Hospital Discharge lead who works closely with the Hospital Discharge team sharing intelligence and verifying the monthly figure. The team also consults with the Council’s Migration team as well as the Care Leavers team so they can sense-check the monthly returns before submission.

The amount of time needed to share data and get the information validated by institutions has been reported as a clear barrier to further cross-checking and verification of data. In addition, several local areas mentioned the absence of data sharing agreements in place to allow for easy sharing of information between partner agencies. Overall, regular rough sleeping multi-agency meetings can support good information sharing and sense-check of existing information.

Key strengths and limitations

Strengths

Local areas reported on the usefulness of this indicator whilst stressing that some work is needed to ensure provision of better-quality estimates for the different figures. Local areas described how this data will support them in better identifying gaps in support provision, as well as in developing better partnership working with relevant institutions, especially since reporting on these indicators holds institutions accountable for the wellbeing of the individuals they release. A couple of areas mentioned how this might encourage institutions to consider better post-release outcomes and support needs, as well as better information sharing and joint working.

Limitations

This indicator requires outreach teams to identify within the reporting month whether rough sleepers have been discharged from an institution in the past three months. Unlike the other indicators, the outreach teams are heavily reliant on self-disclosure and timely information sharing from partners in order to get the data needed. This reliance on self-disclosure and information sharing from partners is the main identified challenge, hindering better accuracy and quality data. A number of areas raised that figures are likely to be inaccurate, yet still useful. There was also a sense that the responsibility for collecting better data was shared with partners, and therefore less within areas’ individual control.

  • Local area example 6: The local area raised data quality concerns about whether data is provided in time by some agencies, noting that self-disclosure sometimes highlights discrepancies.
  • Local area example 7: The local outreach team highlighted that some work is needed to improve communication streams and joint working with existing services and institutions in order to get better data.

Key considerations for this indicator

Key points for audited local areas

The interpretation of this indicator remains consistent across local areas; however, there are variations in the levels of data quality and validation standards achieved. Achieving an accurate estimate necessitates effective information-sharing practices among partner agencies, including relevant institutions. Local areas with established communication channels with relevant services are better positioned in this regard compared to those without such established channels.

Key considerations for all local areas

The quality of this indicator in all local areas is intrinsically linked to the quality of partnership working and information sharing with relevant institution partners.

Local areas should be encouraged to develop relationships with relevant institutions and to introduce information sharing agreements to maximise data accuracy and quality. Work is needed to ensure that partners, beyond rough sleeping and outreach teams, are aware of the indicator and take responsibility for timely information sharing.

9.4 B1: Number of people experiencing long-term rough sleeping

The indicator B1 is reported as the number of people experiencing multiple and/or sustained episodes of rough sleeping.  

A person is considered as experiencing multiple and/or sustained episodes of rough sleeping if they have been seen in the reporting month and have also been seen sleeping rough in three or more of the last 12 months.

Data gathering and collection

This indicator makes use of previous collections of indicator R1 and therefore builds upon the data gathered and collected for R1. In this way, the robustness of this indicator relies on the robustness of the R1 monthly estimate and appropriate records of outreach contacts.

Implementing the indicator

Local areas did not have to make significant changes to the way data is collected and recorded to introduce this new indicator. For most, existing data systems were easily adapted to the new requirements. Overall, all 13 local areas created an algorithm or added a new field in their data collection tool or database showing the number of months in the last 12 in which someone was seen bedded down, allowing them to extract the data easily from their system or spreadsheet each month.
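
As a rough illustration of the kind of calculation areas embedded in their tools or spreadsheets, the sketch below counts the distinct months in which a person was seen bedded down within the last 12 months and applies the B1 definition quoted above (reading ‘the last 12 months’ as including the reporting month); the data layout and function names are illustrative assumptions rather than any specific system.

```python
from datetime import date

def _month_index(d: date) -> int:
    # Represent a calendar month as a single integer so month arithmetic is simple
    return d.year * 12 + d.month

def months_seen_in_last_12(contact_dates: list[date], reporting_month: date) -> int:
    """Count distinct calendar months, within the 12 months up to and including
    the reporting month, in which the person was seen bedded down."""
    current = _month_index(reporting_month)
    window = {current - offset for offset in range(12)}  # reporting month plus previous 11
    return len({_month_index(d) for d in contact_dates} & window)

def meets_b1(contact_dates: list[date], reporting_month: date) -> bool:
    """B1: seen in the reporting month and in three or more of the last 12 months."""
    seen_this_month = any(_month_index(d) == _month_index(reporting_month) for d in contact_dates)
    return seen_this_month and months_seen_in_last_12(contact_dates, reporting_month) >= 3

# Illustrative use: contacts in March, June and September, reporting month September 2023
contacts = [date(2023, 3, 5), date(2023, 6, 12), date(2023, 9, 2)]
print(meets_b1(contacts, date(2023, 9, 1)))  # True
```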

Validation and verification

Local areas do not have specific validation and verification processes beyond the standard checks carried out for all the indicators. All are confident about the robustness and quality of the data they produce monthly.

Key strengths and limitations

Strengths

Local areas described how this indicator provides a meaningful indication of those likely facing complex challenges and requiring tailored support to address the unique needs and barriers likely faced by long-term homeless individuals.

Limitations

Beyond the limitations that are common to all indicators, local areas identified the following challenge which mainly relates to the indicator rather than the data collection / verification:

  • Overlook and overlap: With the current definition, the indicator could capture individuals who secured accommodation and subsequently lost it; these would still be recorded as ‘long-term’ rough sleepers, so the data may not provide a complete or perfectly accurate representation.

Key considerations for this indicator

The interpretation of this indicator, and approach and method used to collect the data, are consistent across local areas.

Overall, there is no data quality concern for this estimate figure, provided that data collectors and users understand and recognise its limitations, and that all is done to ensure good use of system(s), to maximise identification of rough sleepers by outreach teams, and to ensure appropriate data entry.

9.5 NR1: Number of people returning to rough sleeping 

The indicator NR1 is reported as the number of individuals who were seen sleeping rough previously and have returned to the streets after a period of time. 

A person is considered as a ‘returner’ if they have been seen sleeping rough again after no contact for two or more quarters or 180 days, whichever is shorter, measured from the last date the person was seen.

Data gathering and collection

This indicator makes use of previous collections of indicator R1 and therefore builds upon the data gathered and collected for R1. In this way, the robustness of this indicator relies on the robustness of the R1 monthly estimate and appropriate records of outreach contacts.

Implementing the indicator

Local areas did not have to make significant changes to the way data is collected and recorded to introduce this new indicator. For most, existing data systems were easily adapted to the new requirements. All 13 local areas created an algorithm or added a new field in their data collection tool which records the last date the person was seen sleeping rough.
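
A minimal sketch of how such a ‘last seen’ field can drive the classification is shown below; the 180-day threshold follows the NR1 definition above, the five-year lookback follows the P1 definition, and the record layout is an illustrative assumption rather than any area’s actual implementation.

```python
from datetime import date

RETURNER_GAP_DAYS = 180      # per the NR1 definition above
NEW_LOOKBACK_DAYS = 5 * 365  # approximate 60-month lookback used for P1

def indicator_flags(date_seen: date, last_seen: date | None) -> set[str]:
    """Flag a contact against P1 ('new') and NR1 ('returner') given the last
    date the person was previously seen sleeping rough (None if never seen)."""
    flags: set[str] = set()
    if last_seen is None or (date_seen - last_seen).days > NEW_LOOKBACK_DAYS:
        flags.add("P1_new")          # no contact within roughly five years
    if last_seen is not None and (date_seen - last_seen).days >= RETURNER_GAP_DAYS:
        flags.add("NR1_returner")    # returning after 180 or more days without contact
    return flags

# A gap of more than five years sets both flags, reflecting the P1/NR1 overlap
# noted under the limitations below
print(sorted(indicator_flags(date(2023, 9, 1), date(2017, 1, 15))))  # ['NR1_returner', 'P1_new']
```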

Challenges related to implementation are similar to the challenges experienced for indicator P1 (‘new’ rough sleeper) in relation to working with historical data and multiple systems.  

As noted in the section on the P1 indicator, two local areas, each using two separate software systems, reported needing to perform substantial checks to reconcile data between systems in order to have access to the historical data needed to understand whether a newly identified rough sleeper is indeed ‘new’ or ‘returning’. Other areas are known to have changed software providers in the last few years, meaning they may face similar issues.

Local areas commonly retain historical data for up to seven years, a practice not dictated by specific regulation but rather guided by general best practices in record-keeping. However, as noted earlier, several audited local areas lack uniform records extending beyond four years. Consequently, the absence of consistent data makes it challenging for local authorities to identify individuals who may have previously been seen on the streets more than four years ago.

Verification and validation

Local areas do not have clear verification protocols in place for this indicator beyond the checks carried out for all indicators around duplicates and general data sense-checks. The main activity undertaken by local areas is a thorough check of their database to ensure that individuals have not been previously seen or supported. Usually, names and dates of birth are used for this purpose.
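
As an illustration of that kind of check, the sketch below flags potential duplicates by normalising names and comparing dates of birth; the record structure and matching rule are assumptions for the example, and real checks may need fuzzier matching (for example, to handle aliases or approximate dates of birth).

```python
from datetime import date
from typing import NamedTuple

class PersonRecord(NamedTuple):
    record_id: str
    name: str
    date_of_birth: date

def match_key(record: PersonRecord) -> tuple[str, date]:
    # Normalise casing and internal spacing so trivial differences do not hide duplicates
    return (" ".join(record.name.lower().split()), record.date_of_birth)

def flag_potential_duplicates(records: list[PersonRecord]) -> list[tuple[str, str]]:
    """Return pairs of record IDs that share the same normalised name and date of birth."""
    first_seen: dict[tuple[str, date], str] = {}
    pairs: list[tuple[str, str]] = []
    for rec in records:
        key = match_key(rec)
        if key in first_seen:
            pairs.append((first_seen[key], rec.record_id))
        else:
            first_seen[key] = rec.record_id
    return pairs

# Illustrative use: the same person entered twice with different spacing and casing
records = [
    PersonRecord("A001", "John  Smith", date(1985, 4, 2)),
    PersonRecord("A002", "john smith", date(1985, 4, 2)),
]
print(flag_potential_duplicates(records))  # [('A001', 'A002')]
```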

Key strengths and limitations

Strengths

Local areas described how this indicator provides a meaningful indication of those that have fallen through the gaps of support after a period of housing stability or experienced barriers preventing them from maintaining stable housing. This will help them explore what may have gone wrong and identify potential gaps in support provision.

Limitations

Beyond the limitations that are common to all indicators, local areas identified the following challenges, which mainly relate to the indicator rather than the data collection or verification:

  • Overlap with ‘new rough sleepers’: If someone with a history of rough sleeping is seen sleeping rough again after a period of five years without any contacts, they would be captured under both P1 (‘new’) and NR1 (‘returning’).
  • Retention of historical data: As mentioned above, most local areas are unable to keep records for more than seven years, meaning that those returning to rough sleeping after more than seven years would be captured as ‘new’.

Key considerations for this indicator

Key points for audited areas

The interpretation of this indicator, and the approach and method used to collect the data, are consistent across local areas.

A challenge for this indicator lies in accessing and processing historical data. Some local areas face difficulties due to inconsistent historical data or the necessity to navigate multiple systems, both of which could potentially result in increased human errors or inconsistencies.

Key considerations for all local areas

It is likely that many local areas currently employ multiple systems for cross-verifying whether individuals have been previously identified as rough sleepers. This practice introduces the potential for errors and inefficiencies. To address these issues, there is a pressing need for standardised procedures, such as aligning data formats across systems, implementing data matching algorithms, or consolidating into a unified system. These measures would contribute to improved accuracy and consistency in data reporting.

10. Key points and recommendations

10.1 Working towards better data quality

Overall, local areas are achieving varied levels of data quality, collection, and validation standards. This is intrinsically linked to the operational context in which data capture is undertaken and is, to some extent, inevitable. While recognising that there will always be variability and potential inaccuracies in data, there is a clear need for enhanced guidance, verification processes, and detailed explanations of strengths and limitations. This is important not only for improving comparability but also for ensuring that users are well-informed of these issues when utilising the data.

It is also important to emphasise that data should not be relied upon in isolation. Instead, it should help ignite discussions within local areas and with partners to enhance support and services, and be used in conjunction with other local evidence to gain a comprehensive understanding of the situation at local level.

Whilst there is no major data quality concern for the data underpinning the new indicators, there is room for improvement to enhance robustness and consistency. This should particularly seek to address:

  • The existing gap in data management and technical abilities/capacity between local areas. Recognising the prevalence of manual systems and the use of multiple systems in certain local areas, it is clear that there is a need for targeted support and additional guidance. This will encourage good data practices and appropriate quality assurance checks to enhance data quality and reduce the potential for human error as well as the burden on staff.
  • Areas where current indicator definitions and associated guidance exhibit a degree of flexibility that permits varied interpretations. Additional parameters and clarifications into the different methods that can be used for capturing the single night snapshot estimate (R1) should be sought to improve comparability and consistency across local areas.
  • Good practice guidance for improving data quality and validation, including working with partners; using single identifiers / checking duplicates; embedding formulas/algorithms into database systems; understanding the limitations of the data; how to use and analyse the data to inform or prioritise interventions.

There are two high-level recommendations for DLUHC:

  1. As well as providing additional guidance in the existing data-led framework implementation guide, develop a detailed manual providing guidelines and tips in relation to data quality.
  2. Implement a verification model which captures information from local areas on data collection approaches, methods and processes, and which actively supports selected areas.

10.2 Improved guidance

New guidance issued by DLUHC should outline guidelines, tips, and resources for achieving good data quality standards. It could be structured according to the data quality dimensions in the government’s Data Quality Framework (2020) and used by the Data Management Association (DAMA): timeliness, reliability, validity, integrity, accuracy and uniqueness. This should identify the minimum standards which all local areas should work towards, as outlined below.

For each area of data quality standards, the topics to include in guidance are:

Reliability (consistency in data collection and reporting)
  • Consistency in data collection approaches: options and methods
  • How to work with historical data

Uniqueness (unique records build trust in the data)
  • How to effectively use unique identifiers / working towards implementing unique identifiers
  • Checking for duplicates and introducing formulae or algorithms to flag duplicates and support production of P1, B1 and NR1

Accuracy (we have accuracy when data reflects reality)
  • Choosing the right method / frequency of outreach
  • Identifying hard-to-reach groups
  • Working with partners / information sharing

Validity (data conformity)
  • Implementing new data entry validity checks
  • Understanding validation checks in DELTA

Integrity (reliability and consistency of the database/system)
  • Access control
  • Periodically reviewing data
  • Data storage
  • Using multiple systems

Timeliness (up-to-date data and timely reporting)
  • Reporting via DELTA

10.3 Quality assuring the data-led framework

Acknowledging the diverse operational contexts in which data collection takes place at local level, the variability in capacity and resources (both human and technical) across local areas, and the ambition to incorporate further indicators into the ending rough sleeping data framework in the future, it is clear that any quality assurance model will have to be flexible. It should seek to improve data collection processes as well as ensure minimum standards of quality and compliance.

Traditional quality assurance reviews typically focus on assessing whether local areas meet established standards or benchmarks. Whilst this is useful, it is recommended that the quality assurance model for this framework take a more comprehensive approach to also actively support local areas in improving their data practices, recognising that data quality can be a dynamic area and that the goal is to foster continuous improvement.

Overview of the proposed quality assurance model

The proposed model will be designed to foster a culture of accountability and improvement and will comprise two components aimed at achieving balance between local autonomy and impartial oversight: annual data maturity checks and support for building capacity in those areas with lower data maturity.

Annual data maturity reviews
  • DLUHC could engage with each local authority annually to capture key information on their data collection and validation processes, as well as understand any challenges they may face in reporting on the indicators monthly.
  • A series of questions based on a template could be used to understand how local areas are doing in relation to the key data quality principles: timeliness, accuracy, reliability, uniqueness, validity, and integrity. An example of a questionnaire that could be used is given below.
Timeliness

Is the data reported on monthly through DELTA and on time?

❏ Yes ❏ No

Comments:

Accuracy

Is there reasonable assurance that the data collection method chosen (in particular for the snapshot estimate) is the most appropriate considering the local area’s context?   

❏ Yes ❏ No

Comments:

Is there reasonable assurance that the local area is maximising data quality and accuracy by engaging with relevant partners as appropriate (or taking the steps to do so)? For example, multi-agency meetings, seeking information sharing agreements with key institutions and partners.  

❏ Yes ❏ No

Comments:

Reliability

Are data collection approaches and methods consistent across the months and following the guidance?

❏ Yes ❏ No

Comments:

Does the local area have access to at least five years of historical data?

❏ Yes ❏ No

Comments:

Infrastructure & system(s)

To provide the data underpinning the indicators, is a single system/database used?

❏ Yes ❏ No

Comments:

System(s) used:

Uniqueness

Are unique identifiers used?

❏ Yes ❏ No

Comments:

If no, is there a clear process to avoid double counting people and checking whether they have been previously seen and recorded in the system?

❏ Yes ❏ No

Comments:

Validity

Is new data entered in the system systematically reviewed?

❏ Yes ❏ No

Comments:

Are there rules in place to ensure the completeness of data entries?

❏ Yes ❏ No

Comments:

Integrity

Is there reasonable assurance that steps are being taken to limit human errors and maintain data integrity (for example, data storage, access control)?

❏ Yes ❏ No

Comments:

Burden

Ask the local authority to rate the burden that data collection / reporting places on their team (1 = no burden; 5 = severe burden) 

1    2    3    4    5

Support need

Would the local area benefit from support?

❏ Yes ❏ No

Comments:

Capacity building support for those with less data maturity

Based on the assessment outlined in the previous section, a sample of local areas could be selected for additional support. This should focus on those struggling to meet the minimum standards provided in the guidance. The support could be targeted to the needs of each area and cover a range of different support offers.

This component ensures that struggling local areas receive the attention and assistance needed to bring their data practices up to par.

Suggested timescales
  • Questionnaire: Jan - April 2024
  • DLUHC intelligence review: April 2024
  • Local areas selected for continuous support: May - June 2024

11. Appendices

11.1 Appendix A: Indicators summary table: strengths and limitations

(P1) New people sleeping rough

Purpose: To track how effectively rough sleeping is being prevented by understanding the prevalence of new people sleeping rough both over the course of the month and on a single night.

How is this information gathered? Gathered by outreach workers as part of their regular outreach activity, therefore linked to the coverage and frequency of this activity, as well as reliance on referrals from partner agencies to help identify people that may not engage with services.

Strengths:
  • Helpful to understand progress with prevention activity at local level

Limitations:
  • Relies on local areas having or building up accurate historical records to identify new individuals
  • The indicator highlights people new to each local authority, but in London this is used to mean “new to the Greater London area”, so data is not readily comparable across England
  • Some individuals can return to the street after a period of five years or more, which means they would be captured in both the P1 and NR1 indicators

(P2) People sleeping rough who have been discharged from an institution

Purpose: To track how effectively rough sleeping is being prevented by understanding the prevalence of people sleeping rough who have recently left an institution without accommodation.

How is this information gathered? Gathered by outreach workers as part of their regular outreach activity. Reliant on a combination of self-disclosure by people sleeping rough and/or information sharing from institutions.

Strengths:
  • Used to identify gaps in support provision and develop partnership working with institutions

Limitations:
  • Verification and validation processes are inconsistent across local areas
  • Requires outreach teams to identify within the reporting month whether rough sleepers have been discharged from an institution in the past 3 months
  • Outreach teams are heavily reliant on self-disclosure and timely information sharing from partners in order to get the data needed
  • Reliant on data sharing to improve quality and accuracy

(R1) People sleeping rough – over the month

Purpose: To track the prevalence of rough sleeping over the course of the month.

How is this information gathered? Gathered by outreach workers as part of their regular outreach activity, therefore linked to the coverage and frequency of this activity, as well as reliance on referrals from partner agencies to help identify people that may not engage with services. Outreach teams record their unique contacts across the month on their local database or system, which they then use to report to DLUHC.

Strengths:
  • Clearly defined
  • Based on outreach contacts
  • Provides a continuous and ongoing understanding of the rough sleeping population
  • Helps to capture hidden homelessness

Limitations:
  • Only includes known people and people identified by outreach activity, so may not include all hidden homeless groups
  • Relies on local systems in place to identify unique contacts and not double count individuals

(R1) People sleeping rough – single night snapshot

Purpose: To track the prevalence of rough sleeping on a single night.

How is this information gathered? Local authorities are advised to use a snapshot approach which will provide the most robust figure. This should be one of the three approaches used for the official rough sleeping snapshot, although if no snapshot has been conducted within the month, local authorities should gather their intelligence, data sources and records to establish what a single night figure would be and submit this as their estimate.

Strengths:
  • Well-established approach based on the annual snapshot, which has been in place since 2010
  • Complements the over-the-month figure and helps to understand who is on the street on specific nights of the year
  • Helps to monitor trends and fluctuations at local level
  • Provides a benchmark to compare with the over-the-course-of-the-month rough sleeping figure

Limitations:
  • Less thorough than the annual single night snapshot in terms of planning and involvement of independent partners
  • Only includes people seen/identified by outreach workers
  • Can be impacted by the weather, where people choose to sleep, the date and time chosen, and the availability of alternatives such as an available night shelter
  • Challenges around consistency and comparability of approach month on month and between areas, as there is more local discretion compared to the annual snapshot

(B1) People sleeping rough long term

Purpose: To assess how effectively local systems are able to rapidly identify people sleeping rough and support them off the streets, and then into long-term accommodation.

How is this information gathered? Gathered by outreach workers as part of their regular outreach activity, and reliant on local systems to record how often people are seen bedded down.

Strengths:
  • Used to identify people who likely have complex challenges and require tailored support

Limitations:
  • Only includes known people and people identified by outreach activity, so may not include all hidden homeless groups
  • Relies on local systems in place to identify unique contacts and not double count individuals
  • Relies on local areas having or building up accurate historical records to identify individuals

(NR1) People returning to sleeping rough

Purpose: To track how well local areas are doing at ensuring people who have previously slept rough are supported to avoid returning to the streets.

How is this information gathered? Gathered by outreach workers as part of their regular outreach activity, and reliant on local systems to identify people who have previously slept rough.

Strengths:
  • Used to identify potential gaps in support provision and barriers to housing stability

Limitations:
  • Relies on local areas having or building up accurate historical records to identify individuals who have returned to sleeping rough in their local area
  • Verification and validation processes are inconsistent across local areas
  • Limited comparability across England, as outside of London the definition refers to people returning to the local authority
  • Some individuals can return to the street after a period of five years or more, which means they would be captured in both the P1 and NR1 indicators

11.2 Appendix B: Summary of data collection practices

This table provides an overview of the operational context for each audited local area. It sets out who is responsible for data collection, the frequency of outreach activities and processes for recording data, the system(s) used and whether the area has a dedicated data analyst who could support with data collection / validation / analysis.

Early adopter 1
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Daily outreach sweeps (early mornings and occasional night sessions); reporting on StreetLink and reports throughout the day; all new rough sleepers and bedded down contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Single data system and software used by both outreach and Housing Options teams
  • Use of single identifier[footnote 2]: Yes
  • In-house data analyst: Yes

Early adopter 2
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Daily outreach sweeps (early mornings and occasional night sessions); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach and Housing Options teams
  • Use of single identifier: No (manual checks required)
  • In-house data analyst: Yes

Early adopter 3
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Mon-Fri outreach sweeps (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach (spreadsheet) and Housing Options teams
  • Use of single identifier: Yes
  • In-house data analyst: No

Early adopter 4
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Mon-Sat outreach sweeps (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach and Housing Options teams
  • Use of single identifier: No (manual checks required)
  • In-house data analyst: Yes

Early adopter 5
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Mon-Fri outreach sweeps (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Single data system and software used by both outreach and Housing Options teams
  • Use of single identifier: No (manual checks required)
  • In-house data analyst: No

Early adopter 6
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Outreach sweeps three times a week (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach (spreadsheet) and Housing Options teams
  • Use of single identifier: Yes
  • In-house data analyst: No

Early adopter 7
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Outreach sweeps daily (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Single data system and software used by both outreach and Housing Options teams
  • Use of single identifier: Yes
  • In-house data analyst: Yes

Early adopter 8
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Mon-Fri outreach sweeps (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach and Housing Options teams
  • Use of single identifier: Yes, but not common across systems beyond GMThink
  • In-house data analyst: Yes

Early adopter 9
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Outreach sweeps at least twice a week (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach (spreadsheet) and Housing Options teams
  • Use of single identifier: No (manual checks required)
  • In-house data analyst: No

Early adopter 10
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Reporting on StreetLink and reports throughout the day; outreach sweep at least once a week (early morning); all new rough sleepers and contacts with existing rough sleepers recorded on system daily
  • Software / systems used: Two different software systems used by outreach (spreadsheet) and Housing Options teams
  • Use of single identifier: No (manual checks required)
  • In-house data analyst: No

Early adopter 11
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Daily outreach sweeps (early mornings and occasional night sessions); reporting on StreetLink and reports throughout the day; all new rough sleepers and bedded down contacts with existing rough sleepers recorded on systems daily
  • Software / systems used: CHAIN used by outreach team
  • Use of single identifier: Yes (CHAIN number)
  • In-house data analyst: Yes

Early adopter 12
  • Data collector: In-house outreach team
  • Outreach activities and processes for recording data: Outreach sweeps three times a week (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and bedded down contacts with existing rough sleepers recorded on systems daily
  • Software / systems used: CHAIN and additional software used by outreach team
  • Use of single identifier: Yes (CHAIN number)
  • In-house data analyst: Yes

Early adopter 13
  • Data collector: Commissioned service
  • Outreach activities and processes for recording data: Outreach sweeps three times a week (early mornings); reporting on StreetLink and reports throughout the day; all new rough sleepers and bedded down contacts with existing rough sleepers recorded on systems daily
  • Software / systems used: CHAIN and additional software used by outreach team
  • Use of single identifier: Yes (CHAIN number)
  • In-house data analyst: Yes

11.3 Appendix C: Audit questionnaire

Early adopters audit questionnaire  

DLUHC has commissioned Homeless Link to carry out a feasibility study to better understand how local areas are gathering the information on the new rough sleeping indicators. We are interested in understanding the process(es) used for collecting / collating / checking the data, what local areas had to do and implement to feel confident in the collection, validation and reporting of such information, and the challenges, limitations and strengths of the data used.

The findings of the feasibility study will be used to:  

  1. Understand the variety of methods, identify best practices, and develop guidance for all local areas  
  2. Establish an approach to verification for the data-led framework to ensure best data quality.  
Early adopter ID  

Local Authority:
Local Authority Lead Coordinator:
Contact details:

Background 

When collecting data for the rough sleeping annual snapshot, does your local area usually carry out:
❏ A count-based estimate 
❏ An evidence-based estimate
❏ An evidence-based estimate with a spotlight count 

Please explain the rationale behind the choice of method:

When collecting data for the monthly rough sleeping survey, does your local area usually (in the past 2 years) carry out:
❏ A count-based estimate 
❏ An evidence-based estimate 
❏ An evidence-based estimate with a spotlight count 
❏ Not consistent / it depends:  

If different from the annual snapshot method, please explain the rationale behind the chosen method:

Capacity and outreach activities 

Which internal monitoring systems/software are you using to record information on your rough sleeping population (for example, In-Form, CHAIN etc.). Please list all the systems / software you use.

Is your local area able to make changes to the data collection tools you use (for example, adding a new field to the data structure), or are you relying on a third party to make changes to the data infrastructure?

Does your local area have a dedicated data team / data analyst?

❏ No  ❏ Yes

Does your local area have an in-house outreach / rough sleeping team?

❏ No  ❏ Yes

If yes, do you have a nominated data-lead in the team?

❏ No  ❏ Yes

If no, tell us about any partnership / commissioned outreach services in place and if their responsibilities include collecting, monitoring, and reporting on rough sleeping data:

Please describe frequency / nature of outreach activities (for example, frequency of sweeps, visits in response to report)

When a rough sleeper is seen by the outreach team and information is collected, what steps are taken by the outreach team to record the information on the system? Is information recorded on the system every time the outreach team makes contact with an individual sleeping rough (for example, date seen)?

Can all outreach workers update existing and add new rough sleeping information on the systems/database?

❏ No  ❏ Yes

If no, please detail:

What is done to maintain data integrity in an environment where multiple users access the same system/database at the same time (for example, access policy, validation process before new information is inserted into the database)?

The new indicators: collecting, validating and reporting data
R1: Number of people sleeping rough

For this indicator, you have to provide a single night figure (snapshot) and a monthly figure.

Did your local area make any changes in the way this information is collected and monitored since the introduction of the new framework?
❏ No  ❏ Yes

If yes, please provide details:

Please describe the process(es) used to collect the data to produce the two figures your local area needs to provide monthly (this might be different for the two figures, please highlight the differences):

Please describe what is done to check accuracy (for example, remove duplicates, sense-check with relevant local organisation, ensure completeness) and validate (for example, compare with other sources/historical data) the two figures (this might be different for the two figures, please highlight the differences):

Are methodological or practical considerations given to people sleeping rough across local authorities’ boundaries? Are any steps taken to address this challenge?

Please list the agencies / partners supporting the data collection or verification, and their level of involvement / role (this might be different for the two figures, please highlight the differences):

How are communications established and managed with them (for example, protocols, frequency, formal vs informal)?

For the point-in-time snapshot figure, how does your local area pick the ‘single’ night? What are the key considerations? Any challenges?

What are the challenges your local area faces in collecting the two figures and ensuring these are as accurate as possible? What could be improved?

In your view, what are the strengths and limitations of the R1 indicator and its definition?

Do you have any data quality concerns (for example, any sources of bias, coverage, etc.)?

P1: Number of new people sleeping rough

For this indicator, you provide a single night figure (snapshot) and a monthly figure. 

A person is considered ‘new’ if they have not been seen sleeping rough in the Local Authority in the 5 calendar years (60 months) preceding the date they were seen sleeping rough during the current reporting period.

Does your area have records going back 5 years to enable you to report on this indicator?
❏ No  ❏ Yes

How far are you able to go back in the database?

Is the approach used to ascertain if someone is a ‘new’ rough sleeper different for the snapshot and the monthly estimate? Please expand.

Are there any cross-boundary considerations (beyond local authority boundaries) given to ascertain whether someone is ‘new’ (for example, ‘new’ to the local authority vs ‘new’ to the metropolitan area/region)?

Please describe what you had to change and implement (for example, gathering information processes, data system update) in order to collect and record this new information (this might be different for the two figures, please highlight the differences):

Please describe what is done to check accuracy (for example, remove duplicates, ensure completeness and robust estimate) and validate the two figures (this might be different for the two figures, please highlight the differences):

Please list the agencies / partners supporting the data collection or verification, and their level of involvement / role (this might be different for the two figures, please highlight the differences):

What are the challenges your local area faces in collecting and reporting on this new information, and ensuring this is as accurate as possible? What could be improved?

In your view, what are the strengths and limitations of the P1 indicator and its definition? A person is considered ‘new’ if they have not been seen sleeping rough in the Local Authority in the 5 calendar years (60 months) preceding the date they were seen sleeping rough during the current reporting period.

P2: People seen sleeping rough after being discharged from institutions

For this indicator, you have to provide 6 figures.

A person is counted as having left an institution recently if they report having been discharged from

  1. Prison
  2. Other approved justice accommodation
  3. General and psychiatric hospitals
  4. Discharged from the UK Armed Forces
  5. National Asylum Support Services Accommodation

within the last 85 days (12 weeks + 1 day), or:

  1. Children’s social services in relation to care, and they are aged under 25.

Was your local area already collecting and recording this information?

People seen sleeping rough after being discharged from:
Prison: 
❏ No  ❏ Yes

UK Armed forces: 
❏ No  ❏ Yes

Other approved justice accommodation:
❏ No  ❏ Yes

Asylum Support Services accommodation:
❏ No  ❏ Yes

General & psychiatric hospitals:
  ❏ No  ❏ Yes

Care leavers under 25:
❏ No  ❏ Yes

Where you have ticked yes, please detail how this information was collected and recorded, and what definition was used:

Please describe what you had to change and implement (for example, gathering information processes, data system update) in order to collect and record this new information

Please describe what is done to check accuracy (for example, remove duplicates, ensure completeness and robust estimate) and validate the 6 figures (this might be different for the 6 figures, please highlight the differences):

Please list the agencies / partners supporting the data collection or  verification, and their level of involvement / role (these might vary depending on the type of institution, please highlight stakeholders involved for the different figures):

What are the challenges your local area faces in collecting and reporting on this new information, and ensuring this is as accurate as possible? What could be improved?

In your view, what are the strengths and limitations of the P2 indicator and its definition? A person is counted as having left an institution recently if they report having been discharged from X INSTITUTION within the last 85 days (12 weeks + 1 day) (or is aged under 25 and is a care leaver).

Do you have any data quality concerns (for example, any sources of bias, coverage etc.)?

B1: Number of people experiencing long-term rough sleeping

For this indicator, you provide 1 estimated figure.

A person will meet the criteria for this indicator if they have been seen recently (within the reporting month) and have also been seen out in 3 or more months out of the last 12 months.

Please describe what you had to change and implement (for example, gathering information processes, data system update) in order to collect and record this new information as defined in B1:

Please describe what is done to check accuracy (for example, remove duplicates, sense-check with relevant local organisation, ensure completeness) and validate (for example, compare with other sources/historical data) the figure:

Please list the agencies / partners supporting the data collection or verification, and their level of involvement / role:

What are the challenges your local area faces in collecting and reporting this new information, and ensuring this is as accurate as possible? What could be improved?

In your view, what are the strengths and limitations of the B1 indicator and its definition?
A person is considered to be sleeping rough long term if they have been seen recently (within the reporting month) and have also been seen out in 3 or more months out of the last 12 months.

Do you have any data quality concerns (for example, any sources of bias, coverage etc.)?

NR1 Number of people returning to rough sleeping

For this indicator, you provide 1 estimated figure.

A ‘returner’ is defined as a person seen sleeping rough again after no contact for 2 or more quarters or 180 days, whichever is shorter, measured from the last date the person was seen.

Please describe what you had to change and implement (for example, gathering information processes, data system update) in order to collect and record this new information as defined in NR1:

Please describe what is done to check accuracy (for example, remove duplicates, sense-check with relevant local organisation, ensure completeness) and validate (for example, compare with other sources/historical data) the figure:

Please list the agencies / partners supporting the data collection or verification, and their level of involvement / role:

What are the challenges your local area faces in collecting and reporting on this new information, and ensuring this is as accurate as possible? What could be improved?

In your view, what are the strengths and limitations of the NR1 indicator and its definition?

A ‘returner’ is defined as a person seen sleeping rough again after no contact for 2 or more quarters or 180 days, whichever is shorter, measured from the last date the person was seen.

Do you have any data quality concerns (for example, any sources of bias, coverage, etc.)?

Experience of early adopters

How were definitions communicated with outreach workers and what did your local area have to do to ensure consistency and good use of the definitions?

Overall, did the monitoring of these new indicators lead to new insights and learnings into your rough sleeping population and the identification of unknown / un-evidenced issues? If so, please give one or two examples.

What tools, templates and guidance would be useful for local areas implementing new processes to gather the new information and report on the new indicators?

What could be improved that would make you/your local area confident that the figures you provide are as accurate as possible?

Any additional comments, insights, key learnings?


  1. The questionnaire can be found in Appendix C. 

  2. If a single identifier is used in a case management system, it means it is possible for the system to flag when a service user has previously been recorded in the system. Without a single identifier, staff need to manually search systems to identify whether each individual has existing records.