2. Improving data and insight

This chapter discusses how local Changing Futures projects took action to improve data and evidence, by making changes both to what data is available and how it is shared and used, and to what organisations have in place to enable this.

Why are data and insight important?

The Changing Futures programme set out to improve the local availability and use of data on multiple disadvantage. Data and evidence can be used at different levels throughout the local system. For instance, the right data and evidence can help to:

1. Build buy-in to change by raising awareness of multiple disadvantage and the costs, both for people and services, of not meeting needs. For example, Changing Futures areas have carried out cost-avoidance analysis to highlight the impact that responding effectively to multiple disadvantage can have on local services.

2. Proactively identify and reach people experiencing multiple disadvantage or who are at risk. Changing Futures areas have been able to identify groups of people, such as those who are leaving prison and women who are ‘sofa surfing’, and engage them with specialist support.

3. Reduce the need for people experiencing multiple disadvantage to repeat their story multiple times; this can be re-traumatising as well as frustrating, and lead to people disengaging from services.

4. Improve how individual staff support people who are experiencing multiple disadvantage. Efficient sharing of data between services means they have a fuller picture of an individual’s experiences, needs, goals, preferences, and agreed support plan.

5. Support service improvement and redesign through insight into how people are experiencing services. Changing Futures areas have been able to understand who is (or is not) receiving support, how people are moving through the system, and where they are falling out of the system altogether – whether through disengaging or being excluded.

6. Support strategic planning and decision-making through insight into the prevalence and nature of need. Some Changing Futures areas are finding that the numbers of local people experiencing multiple disadvantage are far higher than was once thought.

Things to consider

What do you want to do with data and insight?

  • What actions or decisions do you want to inform? Different types of data and analysis are needed for strategic planning, compared to service delivery. Do you seek to highlight problems in the system or to demonstrate the impact of different ways of working?
  • Who is your audience or intended user of the data and insight? What kind of information do they value, or is most likely to get their attention? What are their expectations for the types of data and evidence that should be produced, and how do they make use of it?

What is available to build on?

  • What evidence is already being produced? What data activity is already taking place? Could you contribute to this or make use of the results, for example by adding questions to a regular survey? As well as local projects, find out what has been done in other areas or nationally. This could help not only to address your evidence needs, but also to provide ideas for developing data solutions.
  • What are the different data and analytical solutions available to achieve your aims? Is new data collection really needed? Explore what data is already available and how it could be better used. Is it ‘good enough’ for your needs? Could restructuring data, linking it to other sources, or applying more advanced analytical methods help to answer your questions?
  • Consider data-sharing regulations and policies early on. What data-sharing agreements are already in place with partners? What are you legitimately allowed to do with available data?

Who needs to be involved?

  • Involve people with lived experience of multiple disadvantage in generating data and evidence. There may be peer research or lived experience groups already in your area.
  • Think about the specialist and technical skills needed. Do partner organisations have data analysts who you could collaborate with?
  • Get operational-level staff to understand and support data initiatives. Their support is often critical to ensuring quality in data collection. They also have useful insights that can help interpret data and research results.
  • Ensure you have a strategic-level project sponsor. The influence of senior colleagues may be needed to unlock access to data.

What were the barriers?

Siloed data systems and lack of data sharing

Changing Futures teams described how multiple records on people experiencing multiple disadvantage were held by different services, and were not shared or pooled. This meant that services had only a partial picture of the people they were supporting. Better data sharing was needed – both to ensure that people had the right support and were kept safe, and so that people did not have to constantly repeat their story. Where data sharing did take place, it was not necessarily systematic; rather, it could rely on individual contacts and relationships that might be lost when staff changed.

Challenges in assessing the extent of multiple disadvantage

Some local areas struggled to estimate the number of people experiencing multiple disadvantage in their region. The nature of multiple disadvantage means there is no single data source that can be referred to. Furthermore, some groups of people experiencing disadvantage are less evident in the data. This poor data availability reduces the quality of the evidence available to decision-makers at the strategic level. To understand local need, including for its Joint Strategic Needs Assessment, one area relied on local-level estimates from a national study based on data that was several years old. Whilst having an estimate of prevalence was a significant step forward, they told us this estimate was still “really crude”.

Poor data quality

Out-of-date or inaccurate information at the individual level was thought to affect people’s access to services. For example, one stakeholder described how information on service users with a history of arson provided limited details of the offence, context, or when it took place. This led to poor assessments of current risk, and as a result, could have a negative impact on people’s access to services.

“The quality of risk information is really, really poor in [our area], so you end up with lots of people who have services refused or declined based on their risk information; but their risk information is out of date, lacking context, doesn’t have the right details, stuff like that.”

Focusing on performance management rather than insight

How data was used was also felt to be an issue in some Changing Futures areas. Staff in one area described the need to shift the focus from numbers to understanding what is happening to service users. They wanted to change the emphasis from performance to understanding, so it was less about ‘closing 100 cases’ and more about being interested in why people were dropping out of treatment; in other words, getting to the story behind the numbers.

What worked well

Thinking about purpose

Changing Futures areas produced, shared and analysed data for several different purposes. What you want to be able to do with data, and the questions you want to answer, affect the kind and quality of data that is required. This in turn will shape the data capabilities, resources and projects you need to develop.

For example, when one local area considered the type of data that their frontline workers would collect, they focused on casework journals rather than standardised outcome measures, as their priority was to support learning, rather than demonstrating progress on key performance indicators (KPIs). The case study Rethinking data for decision-making describes how that team questioned why the data used to inform decision-making had been produced in particular ways in the past, and whether they needed to produce different kinds of data to encourage a different perspective.

Connecting data problems to what is happening to service users and services can help people to understand why changes to data collection or systems are needed.

And we’ve really tried to put a person into [the data], in terms of, you’ve got this person, they’ve got all these issues going on, but no one can see it, and that’s why people give up. That’s why people don’t get what they need. Because they get sent from pillar to post, and we’re trying to really humanise that to make people really aware of the impact that not sharing information can have on an individual.

Making better use of what you have

Several Changing Futures areas made progress by encouraging people to use data resources and permissions that were already in place. In some cases, this was done alongside working towards longer-term changes such as area-wide data-sharing strategies.

One area realised that people were not sharing data because they were uncertain about what they could legally do, particularly in the context of GDPR.(2) Changing Futures staff organised talks and training on data sharing to encourage people to share and use data. Another team described working closely with an information governance expert. It is important to check data policies and data-sharing agreements at an early stage. For instance, one team member described how her organisation could use data because people had already consented to its re-use and sharing.

One Changing Futures project undertook an exercise to map the different datasets on people experiencing different domains of disadvantage at local, regional and national levels, including those belonging to smaller services or organisations. Although they could not immediately access all the datasets identified, they were able to conduct analyses that produced new insights into local needs. For example, analysis of one voluntary sector organisation’s data identified a group of women experiencing homelessness who did not appear in statutory sector datasets. 
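As a purely illustrative sketch of the kind of cross-referencing described above, the following Python snippet compares a voluntary sector client list against a statutory dataset to find people who appear only in the former. The file names, column names and matching key are hypothetical assumptions; in practice the matching rule, and the legal basis for linking the data at all, would need to be agreed under the data-sharing arrangements discussed earlier.

```python
# Illustrative sketch only: find people in a voluntary sector dataset who do
# not appear in statutory records. Column names ("date_of_birth", "initials")
# and file names are hypothetical; real matching keys depend on what each
# dataset holds and on the data-sharing agreement in place.
import pandas as pd

voluntary = pd.read_csv("voluntary_sector_clients.csv")   # hypothetical file
statutory = pd.read_csv("statutory_homelessness.csv")     # hypothetical file

# Use whatever identifiers both datasets reliably hold as the matching key.
key = ["date_of_birth", "initials"]

matched = voluntary.merge(statutory[key].drop_duplicates(), on=key,
                          how="left", indicator=True)

# People known to the voluntary sector but absent from statutory records.
not_in_statutory = matched[matched["_merge"] == "left_only"]
print(f"{len(not_in_statutory)} of {len(voluntary)} clients "
      "do not appear in the statutory dataset")
```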

But there’ll be data sets everywhere that we don’t know about, is the issue. So, there’ll be people within councils, within charities, within small, tiny services who have got massive datasets that are just theirs because they just need it for the women that they’re working with, or the men that they’re working with, or whoever.

Undertaking formal data audits or mapping, however, may not always be successful: one area described having a very low response rate when they tried this. Taking a more informal approach and working with close partners may be more productive.

As well as uncovering existing data, Changing Futures areas also worked with the data that they had in new ways. Several Changing Futures teams included staff with private sector experience, which one role-holder said helped with thinking about different ways to analyse and use data. The Nottingham case study describes how staff used already available data to help a key local provider understand how the prevalence of multiple disadvantage was impacting their core aims and challenges, including its effects on their costs.

Pooling analytical resources

Several areas put together cross-organisational project teams or taskforces of data and research specialists. In one case, there were several different projects in a local area with similar data needs, which were able to form a joint working group on data. In another case, cross-organisational team members conducted an initial analysis of service-user data that could not be shared outside their organisation. Once individuals in the dataset were no longer identifiable, the data could be worked on further by other team members outside the organisation.
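One common pattern for the within-organisation preparation described above is to replace direct identifiers with a consistent pseudonym before an extract is passed to analysts outside the organisation. The short Python sketch below illustrates the idea only; the field names and the salted-hash approach are assumptions, and whether the resulting extract is genuinely non-identifiable is a judgement for information governance specialists, not for the code.

```python
# Minimal, illustrative pseudonymisation before sharing a dataset outside the
# organisation. This is not a full anonymisation process: whether the result
# is safe to share depends on the remaining fields and your IG review.
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-value-held-by-the-data-owner"  # hypothetical

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a consistent, non-reversible token."""
    return hashlib.sha256((SALT + identifier).encode("utf-8")).hexdigest()[:12]

records = pd.read_csv("service_user_contacts.csv")  # hypothetical extract

# Keep a linkable pseudonym, then drop the direct identifiers entirely.
records["person_id"] = records["nhs_number"].astype(str).map(pseudonymise)
records = records.drop(columns=["nhs_number", "name", "date_of_birth", "address"])

records.to_csv("contacts_pseudonymised.csv", index=False)
```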

One Changing Futures area funded embedded roles within partner organisations. This ensured there were people on the team who knew the partner organisation’s data and systems well and, crucially, could access them directly. This facilitated collaborative work to improve data and insights on multiple disadvantage.

Case study: Northumbria – Rethinking data for decision-making

In Northumbria, as in other areas, services’ routinely collected data did not show what was happening to service users as they interacted with services, either within a single service or across the system. Instead, the focus was on KPIs.

The data has been really a carrot and stick type approach, where … the main driving position is to achieve in these targets. If you’re achieving these targets, then … there’s no need to look outside of what you’re doing, or the work, or the style you’re working in, because you’re doing well.

The Northumbria team thought about what data would help to understand services from the user’s point of view. Starting with several service users with whom they had developed trusting relationships, the team approached fourteen services, including NHS trusts, police, probation, and ambulance. With their service users’ consent, they were able to obtain data on past service interactions.

Changing Futures Northumbria asked each service for details of every single interaction the person had had with it since adulthood: what it was for, the outcome, and whether any other services were involved. This was a substantial task and required significant relationship-building. The team say it was important to start this early. Where Changing Futures did not already have a senior-level relationship, they started with frontline staff and collaborated to find the person able to authorise data sharing. One team member calculated that getting the data had taken 538 emails, 34 face-to-face meetings, and 70 Teams calls.

Using data from services and from the service users themselves, the Northumbria team were able to demonstrate that some service users were consuming substantial amounts of workforce capacity and resources – in one instance, years of police constable time. They also showed that frequent A&E attenders were also in frequent contact with other services, and that individuals’ circumstances worsened over time.

The resulting report, called ‘The Burning Platform’, is a starting point, they say, for having a different conversation about services and the demand on them: “all of this unplanned activity, which is absolutely crippling the system.” So far, the report has generated significant interest from a range of stakeholders, including health services, the local authority and the police. The next step for the team is to work with services to help shape their responses to the findings.

I suppose, the challenge for us is translating curiosity into a concrete commitment to do stuff … what we would be encouraging organisations to do is to think about lived experience, definitely; is to think about data differently; is to think about commissioning differently within that.

You can read more about the work that Changing Futures Northumbria has done (PDF, 1136KB)

Influencing the design of data systems

In one area, a new data system was designed to replace several others in the locality. Staff worked with the team implementing it, so that problems they had identified were known to the system developers, and could be designed out:

I think we’ve won [the implementation team] round, but yes, basically saying, ‘This is how we would like risk to be recorded,’ or ‘This is how we’d recommend it.’ And then putting in things like automatic review points and stuff like that, so the system itself triggers staff to have to update and refresh information, and you can’t just let data languish on systems for years and years and years. So how can we automate some of the data quality challenges?

Building relationships to ensure insights are acted upon

Partnership working and collaboration are also key to fostering an environment where the results of data analysis are well received and acted upon. Research findings may challenge organisations’ assumptions about their services, and whether and in what ways they need to change. It can be important to first build understanding and trust with partners, so that when evidence is presented, organisations can engage with it positively and use it to improve support.

It’s about fostering a culture that’s about learning rather than about trying to put blame on anything. …The fact that we can share this weird fact about, you know, ‘Oh, do you know more SMD [severe and multiple disadvantage] patients get [re]admitted than not?’ And it would be seen as, ‘Oh, that’s interesting, I wonder why,’ rather than, ‘What do you mean? Are you saying we can’t look after them in the community?’

Working collaboratively can also help identify opportunities to use data insights. One area undertook multi-agency mapping of service users’ journeys. The collaborative process of producing the journey maps enabled staff working across different services to build a shared understanding of the challenges, barriers and points of failure within the system. It also contributed to a more open and supportive culture of reflection about areas for improvement, creating a more fertile environment for future improvement work. The resulting journey maps have been disseminated widely, to foster understanding of the local system. The use of journey mapping is explored further in the chapter on Joining up services. See also the Proxy data and ‘a reasonable assumption’ case study.

Making use of the insights and expertise of people with lived experience

Changing Futures areas broadened the types of data that were produced and valued by different audiences. Critically, this involved more data and evidence centred on the insights and experiences of people with lived experience of multiple disadvantage. For example, in several areas, direct feedback from people with lived experience was given a central place in multiple disadvantage strategic needs assessments. In another area, appreciative inquiry was used to understand people’s views on support, system strengths and areas for development.(3)

A commissioner commented that data collected from service users and frontline staff can tell a different story from that routinely collected by services:

Because so many times, we tell people what’s happening in their own service and they’re like, ‘That doesn’t happen.’ And it does happen, they just don’t know about it… So, yes, take time to listen and have those conversations with people if you want to understand exactly what is going on and what the barriers are.

The co-production of data and evidence with people with lived experience takes this further. Within Changing Futures, people with lived experience have been involved at national and local levels in identifying relevant research questions, helping to plan and carry out research activities, and helping to interpret data and identify what actions need to be taken. Embedding the involvement of people with lived experience in producing data and insight has been a key achievement of the programme.

In several areas, people with lived experience of multiple disadvantage undertook research to support multiple disadvantage needs assessments. They were able to engage with people who may not otherwise have participated, by building trust, meeting people where they were, and through more creative methods such as cooking or walking groups. In one area, a network of researchers with lived experience of multiple disadvantage was hosted across different services: this meant that data was generated from different parts of the local system, and the host service gained information to understand and improve its services.

However, involving people with lived experience in data and research needs to be supported. Alongside specific training on research skills, peer researchers will need the same support as other volunteers with lived experience. You can find some resources on support for lived experience volunteers in the Resources and further reading section.

Lessons learned

Data cultures can be powerful

Capturing high-quality data for monitoring and evaluation was challenging on the Changing Futures programme. Several areas described difficulties in getting frontline staff to prioritise keeping accurate and timely records. Lack of time, not understanding or seeing any benefit from data collection, and poor data systems can all undermine workforce motivation. One Changing Futures staff member described how worker morale could be affected by

[systems that require workers] to fill in five [different forms] every time I go and speak to somebody for ten minutes.

Data collection and evaluation may also be experienced as monitoring or inspection, rather than as something that helps improve support.

Getting good-quality data requires working with people so that they understand what they need to do with data, and why. Making even small changes, such as introducing a new form to be completed, can require extensive work to help people change their behaviour. One Changing Futures worker described visiting staff and sitting with them to explain how to complete forms correctly. See the case study Proxy data and working on ‘a reasonable assumption’ for more on these challenges.

Entrenched beliefs about data requirements may need to be challenged. Specialist data staff may have beliefs about what kind and how much data is needed, but as the case studies in this toolkit demonstrate, changing these beliefs can lead to better data and insight. Similarly, decision-makers sometimes believe that quantitative data is superior to qualitative data. However, there is evidence from Changing Futures areas that this is changing, with some commissioners accepting more qualitative data, such as case studies, as part of contract monitoring.

Iterative rather than wholesale change may be more realistic

Some areas initially aimed to have a single local shared case management system. In reality, different stakeholders had already made significant investment in their systems, so were unwilling to change, and it was difficult to design a single system that met the needs of all the agencies involved. As a result, several areas pivoted to testing shared access to an existing data system on a smaller scale.

Just the sheer number of case management systems and the sheer number of organisations that work across [the area], made it really, really hard to think whether it was realistic to expect that we were going to be able to launch a new system that people would sign up to. So, we decided to approach it in a slightly different way, which is to think about what existing systems do we have, and what are the opportunities for interface between those existing systems, or sharing or viewing them.

Dedicated data and evidence management roles are needed

Changing Futures projects had varying numbers and types of team roles dedicated to data and evidence. The level of expertise needed to work on issues such as cross-organisational data sharing should not be underestimated. Some areas found it helpful to have dedicated senior roles that supported better data and insight. For example, one area had been using a platform-based case management system to share data among local partner organisations. However, a new data and project lead created a data dashboard that could share insights on the cohort more effectively across the partnership. Dedicated roles like this can create the time and space to focus on getting the best out of data. Data and research professionals will have the specialist skills, knowledge and experience to open up new opportunities and create innovative solutions.

Case study: Nottingham – Proxy data and ‘a reasonable assumption’

A long-standing issue facing Nottingham was the limited collection and use of data on service users’ experience of multiple disadvantage in key local organisations. This limited the available local intelligence on the needs and service-use of people experiencing multiple disadvantage.

At the start of Changing Futures, most local organisations were recording and using information only in relation to their own service’s focus and their own interactions with service users, rather than systematically recording multiple disadvantage. Changing Futures began to help organisations use their data to identify and respond to disadvantage amongst service users, including referring people to specialist case-working support.

So, a lot of the work that we did early days was to try … to colour that picture in a little bit, and get parts of the system … to view outside of themselves… So, for example, if you want substance use support, but you’re also leaving prison, or you are street homeless, you might then get referred to a more specialist team.

Initially, a few Changing Futures partner organisations introduced severe and multiple disadvantage (SMD) coding: practitioners would tick a box when they had a service user they believed was experiencing multiple disadvantage. Implementing this solution created its own challenges for some organisations, however. Coding relied on individual staff members being aware of the need to do this, being able to identify multiple disadvantage, knowing how and when to complete the right box on the right form, and doing so consistently. In a local NHS trust, with over 11,000 staff, the team struggled to embed this approach:

And while it didn’t seem like a big deal, because [of capacity issues], nobody focused on it. So, what we needed to do was to kickstart the awareness, so that then they would look out for [SMD] more once they realised how many people it did impact. But again, people didn’t fill it in until they thought that it was useful to fill in … and that’s when we said, ‘Okay, we’ve got to do this differently. Our clinicians are drowning a lot of the time,’ like I think most of the NHS.

Data team members examined the data that was already being collected by the Trust, and realised that some information recorded from risk assessments and past contacts with services could be used to identify which patients were experiencing multiple disadvantage. So, they pivoted from asking staff to assess and record multiple disadvantage to using this proxy data: “using existing datasets that go back for a fair amount of time, to say with a reasonable assumption that’s probably somebody who’s experiencing multiple disadvantage.”
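To make the ‘reasonable assumption’ approach concrete, a proxy flag can be derived from data an organisation already holds, rather than asking busy staff to complete a new field. The Python sketch below is a simplified, hypothetical illustration: the indicator columns and the ‘two or more domains’ threshold are assumptions, and any real definition would need to be agreed locally and tested against practitioner judgement.

```python
# Illustrative proxy flag for severe and multiple disadvantage (SMD), derived
# from data the organisation already holds. Column names and the "two or more
# domains" rule are hypothetical and would need local agreement.
import pandas as pd

patients = pd.read_csv("patient_history_extract.csv")  # hypothetical extract

# Each proxy indicator marks one domain of disadvantage inferred from
# existing records (risk assessments, past contacts, coded diagnoses).
domains = pd.DataFrame({
    "homelessness": patients["no_fixed_abode_flag"] == 1,
    "substance_use": patients["substance_misuse_contacts"] > 0,
    "offending": patients["justice_involvement_flag"] == 1,
    "mental_health": patients["mh_inpatient_admissions"] > 0,
})

patients["smd_domains"] = domains.sum(axis=1)
patients["smd_proxy"] = patients["smd_domains"] >= 2  # 'good enough' threshold

# Proportion of the cohort flagged as probably experiencing multiple disadvantage.
print(patients["smd_proxy"].mean())
```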

A key learning point for data and analytics professionals was to shift from seeking perfect data to data that was ‘good enough’ for its intended purpose.

We know who’s engaging with what services. If they’ve achieved support from that service, it’s probably because they have a need. That is good enough to be counted as a disadvantage. And I used to go, ‘Well, it’s not really, is it?’ But actually, you know what? I was wrong. I think it is, it’s good enough … that is a level of support that somebody requires, and if there’s a poor outcome in that area due to other disadvantages, that’s SMD.

Using existing patient data, the analytical team were able to provide not only figures on how many patients were experiencing multiple disadvantage, but also insights into their service use over time. The analysis changed the Trust’s understanding of its patient cohorts. The prevalence of multiple disadvantage amongst patients was far higher than expected – and crucially, a clear pattern emerged of very high and repeated use of reactive and emergency interventions, including repeated admissions to inpatient mental health care. Whilst some staff had thought that people experiencing multiple disadvantage were not being referred or were not attending their assessments, the analysis found that the problem was in fact keeping a person engaged with the service. As a result of this work, there is greater awareness of the impact of multiple disadvantage on patients, staff and budgets within the Trust, and of the need to further develop patient pathways and strengthen collaboration with other partners dealing with housing, social support, and other needs.

I think there is a big assumption of ‘Well, when you’re talking about severe multiple disadvantage, it’s probably, like, 0.2% of the people we see.’ So, why would we invest in training, why would we invest in staff speciality? Being able to show them, no, actually it’s more like 20% of the people you see, gives more teams a good reason to invest in that staff training and that staff knowledge.

These findings were being used to take conversations forward at both strategic and operational levels, about the pressures that disengagement and high numbers of readmissions were placing on services, and how these could be addressed. For example, team members became aware that patient pathways with high numbers of patients experiencing multiple disadvantage were up for review. They worked with colleagues to use the data analysis results to improve these pathways and make them more responsive to people experiencing multiple disadvantage.

For other partner organisations in Nottingham, the use of similar proxy indicators is now being explored, to help support the case within each organisation for more proactive, preventative, and connected care and support. This has been an important step in shifting perspectives on multiple disadvantage, and how this relates to the core responsibilities and challenges within each organisation – as well as the opportunities to improve outcomes through more responsive and collaborative interventions.

Getting started: The problem tree

The Nottingham case study on proxy data demonstrates that Changing Futures teams often tried several approaches to getting the data and insight outcomes they wanted. One way to think through the data problems you want to tackle is to construct a problem tree.

The example below illustrates how a problem tree might have been used in the case study above. The problem that Nottingham faced is on the main trunk of the tree: organisations were not assessing and recording multiple disadvantage. At the roots of the tree are some possible reasons why this might be happening. On the branches are the impacts of the problem.

The Nottingham team addressed one of the possible causes – people not expecting to encounter multiple disadvantage among service users – through raising awareness. However, when this did not work, they revisited the impacts of the problem (the tree branches) to see if there were other options. Was there another way for organisations to produce the data on the prevalence of multiple disadvantage? What else could be done to generate data that would support service improvement?

Resources and further reading

The ‘Burning Platform’ project: Evidencing ‘failure demand’ in public services (PDF, 1136KB): This report provides a brief outline of the Burning Platform work that was carried out by Changing Futures Northumbria. The article The ‘liberated method’ – a transcendent public service innovation in polycrisis discusses how data can be used to support the move towards more relational public services. There is also a podcast exploring how data was used to inform learning and change. See also the Securing strategic alignment chapter for useful resources on cost-benefit analysis.

This journey mapping webinar outlines how Changing Futures Plymouth used journey mapping as a reflective, co-produced practice to understand how systems respond to individuals experiencing multiple disadvantage.

The Information Commissioner’s Office provides a wealth of advice, guidance, checklists, case studies, and other resources relating to the management of data and information, including extensive resources on GDPR.

Several Changing Futures areas focused on producing data and insight to inform a dedicated multiple disadvantage needs assessment. Bristol’s Multiple Disadvantage Needs Assessment sets out the prevalence, profile and needs of local people experiencing multiple disadvantage, to inform local strategy. Furthermore, the Multiple Disadvantage Joint Strategic Needs Assessment (JSNA) from Surrey examines the needs of individuals experiencing multiple disadvantage, using a co-produced, mixed methods approach.

(2) General Data Protection Regulation (GDPR): European Union legislation that governs how personal data is handled.

(3) Appreciative Inquiry is a way of looking at organisational change that focuses on identifying and doing more of what is already working, rather than looking for problems and trying to fix them.