Independent report

Review into the operational effectiveness of the Care Quality Commission: full report

Updated 17 October 2024

Applies to England

Foreword

I was asked to carry out a review of the Care Quality Commission (CQC) in May 2024. Over the last 4 months, I have spoken to around 200 senior managers, caregivers and clinicians working in the health and care sector, as well as a wide range of patient and user[footnote 1] groups, and over 100 people within CQC - the majority of the executive team, all non-executive directors, senior managers, operational staff, the staff forum, unions and national professional advisers. I have received over 125 emails from members of staff within CQC.

Virtually all have shared with me considerable concerns about the functioning of the organisation, with a high degree of consistency in the comments made. At the same time, they recognise the need for a strong, credible and effective regulator of health and social care services.

I particularly want to mention, and thank, the large number of people within CQC, at all levels of the organisation, who have been open in sharing their concerns. The last couple of years have not been an easy time for them, seeing changes imposed that they recognised would be problematic. They have been, and continue to be, professional in their approach to work and committed to ensuring high-quality health and care services.

An interim report of my work, providing a high-level summary of my emerging findings, was published in July 2024 to inform thinking around changes needed to start the process of improving CQC. This full report reflects the recent conversations I have had with user groups and a larger number of staff than had been possible previously, and provides greater depth to my findings and recommendations.

There is an urgent need for a rapid turnaround of CQC - a process that has already started with the appointment of an interim chief executive in June 2024 and the announcement of further changes following the publication of my interim report. I am pleased to see the openness and honesty with which the organisation has begun to address the changes required.

The health and care sector accounts for around 12% of the economy[footnote 2] and 21% of public expenditure[footnote 3], and is one of the most significant drivers of health, public satisfaction and economic growth[footnote 4]. It needs - and deserves - a high-performing regulator.

Dr Penelope Dash, Independent Lead Reviewer

Executive summary

CQC is the independent regulator of healthcare and adult social care in England. A review of CQC started in May 2024. The review has heard from over 300 people from across the health and care sectors (providers, user and patient groups, and national leaders) and within CQC, and has analysed CQC’s performance data.

In 2021, CQC launched a major change programme to make the assessment process for health and social care providers simpler and more insight driven, drawing on a wide range of data about quality of care, with the ability to prioritise assessments and inspections. The change programme included new IT systems, changes to how operational (inspection) teams are structured and a new regulatory approach - the single assessment framework (SAF).

The review’s findings

The review has found significant failings in the internal workings of CQC, which have led to a substantial loss of credibility within the health and social care sectors, a deterioration in the ability of CQC to identify poor performance and support a drive to improve quality - and a direct impact on the capacity and capability of both the social care and the healthcare sectors to deliver much-needed improvements in care.

The conclusions of the review are summarised around 10 topics.

Conclusion 1: poor operational performance 

There has been a stark reduction in activity, with just 6,700 inspections and assessments carried out in 2023, compared with almost 15,800 in 2019. This has resulted in:

  • a backlog in new registrations of health and care providers
  • delays in re-inspecting after a ‘requires improvement’ or ‘inadequate’ rating
  • increasing age of ratings

The review has concluded that poor operational performance is impacting CQC’s ability to ensure that health and social care services provide people with safe, effective and compassionate care, negatively impacting the opportunity to improve health and social care services, and, in some cases, for providers to deliver services at all. 

Conclusion 2: significant challenges with the provider portal and regulatory platform

New IT systems were introduced at CQC from 2021 onwards. However, the deployment of new systems resulted in significant problems for users and staff.

The review has concluded that poorly performing IT systems are hampering CQC’s ability to roll out the SAF, and are causing considerable frustration and lost time for providers and CQC staff.

Conclusion 3: delays in producing reports and poor-quality reports

All sectors told the review that they can wait for several months to receive reports and ratings following assessments. The review has heard multiple comments about poor-quality reports - these have come from providers and from members of the public. 

Poor-quality and delayed reports hamper users’ ability to access information, and limit the credibility and impact of assessments for providers.

Conclusion 4: loss of credibility within the health and care sectors due to the loss of sector expertise and wider restructuring, resulting in lost opportunities for improvement

CQC underwent an internal restructuring in 2023, alongside the introduction of the SAF and new IT systems. The restructuring moved operational staff from 3 directorates with a focus on specific sectors into integrated teams operating at a local level, resulting in a loss of expertise.

The review has found that the current model of generalist inspectors and a lack of expertise at senior levels of CQC, combined with a loss of relationships across CQC and providers, is impacting the credibility of CQC, resulting in a lost opportunity to improve health and social care services.

Conclusion 5: concerns around the single assessment framework (SAF) and its application

The SAF has set out 34 areas of care quality (called ‘quality statements’) that could be applied to any provider of health or social care with a subset applied to assessments of integrated care systems (ICSs) and local authorities. These align to the 5 domains of quality used for many years and referred to as ‘key questions’ within the SAF. For each of the 34 quality statements, there are 6 ‘evidence categories’. These are: people experience, staff experience, partner experience, observations, processes and outcomes.

The review has identified 7 concerns with the SAF as follows:

  • the way in which the SAF is described is poorly laid out on the CQC website, not well communicated internally or externally, and uses vague language
  • there is limited information available for providers and users or patients as to what care looks like under each of the ratings categories, resulting in a lack of consistency in how care is assessed and a lost opportunity for improvement
  • there are questions about how data on user and patient experience is collected and used
  • more could be done to support and encourage innovation in care delivery
  • there is insufficient attention paid to the effectiveness of care and a lack of focus on outcomes (including inequalities in outcomes)
  • there is no reference to use of resources or the efficient and economic delivery of care, which is a significant gap
  • there is little reference to, or acknowledgement of, the challenges in balancing risk and ensuring high-quality care across an organisation or wider health and care system

Conclusion 6: lack of clarity regarding how ratings are calculated and concerning use of the outcome of previous inspections (that often took place several years ago) to calculate a current rating

The review has learnt that overall ratings for a provider may be calculated by aggregating the outcomes from inspections over several years. This cannot be credible or right. Furthermore, providers do not understand how ratings are calculated and, as a result, believe them to be the product of a complicated algorithm, or a “magic box”.

Ratings matter - they are used by users and their friends and family, they are used by commissioning bodies (the NHS, private health insurers and local authorities), and they drive effective use of capacity in the sector. They are a significant factor in staff recruitment and retention.

Conclusion 7: there are opportunities to improve CQC’s assessment of local authority Care Act duties

The Health and Care Act 2022 gave powers to CQC to assess local authorities’ delivery of their adult social care duties after several reports and publications identified a gap in accountability and oversight of adult social care. The review found broad support for the overall assessment framework but also heard feedback that the assessment process and reporting could be improved. 

Conclusion 8: ICS assessments are in early stages of development with a number of concerns shared

The Health and Care Act 2022 introduced a new duty for CQC to review and assess ICSs. Statute sets out 3 priority areas for CQC to look at: leadership, integration and quality of care; and the Secretary of State can set priorities on other themes. CQC developed a methodology for these assessments, which was tested in pilots in Dorset and Birmingham and Solihull, but wider rollout has been paused as a result of a number of concerns shared with the review.

Conclusion 9: CQC could do more to support improvements in quality across the health and care sector

The review heard a consistent comment that CQC should not be an improvement body per se, but, at the same time, could do more to support the health and care sectors to improve. It could do this, for example, through the description of best practice and greater sharing of new models of care delivery, leading international examples of high-quality care and more innovative approaches - particularly the use of technology. 

Governance structures within organisations are crucial to improvement. A greater focus on how organisations are approaching and delivering improvement, rather than looking at input metrics, could enable more significant improvements in quality of care.

Conclusion 10: there are opportunities to improve the sponsorship relationship between CQC and the Department of Health and Social Care (DHSC)

DHSC’s sponsorship of CQC should promote and maintain an effective working relationship between the department and CQC, which should, in turn, facilitate high-quality, accountable, efficient and effective services to the public.

The review has found that DHSC could do more to ensure that CQC is sponsored effectively, in line with the government’s Arm’s length body sponsorship code of good practice.

The review’s recommendations

The health and care sector is one of the most significant drivers of health, public satisfaction and economic growth. It needs - and deserves - a high-performing regulator.

In order to restore confidence and credibility and support improvements in health and social care, there is a need to:

  • rapidly improve operational performance, fix the provider portal and regulatory platform, and improve the quality of reports
  • rebuild expertise and relationships with providers
  • review the SAF to make it fit for purpose with clear descriptors and a far greater focus on effectiveness, outcomes and use of resources
  • clarify how ratings are calculated and make the results more transparent
  • continue to evolve and improve local authority assessments
  • formally pause ICS assessments
  • strengthen sponsorship arrangements

A second review considering the wider landscape for quality of care, with an initial focus on safety, will be published in early 2025.

Background and context

Introduction

CQC is the independent regulator of healthcare and adult social care in England. It was established in 2009 under the Health and Social Care Act 2008 and brought together the Commission for Social Care Inspection (CSCI), the Mental Health Act Commission and the Healthcare Commission. It is an executive non-departmental public body sponsored by DHSC.

CQC monitors, inspects and regulates services to make sure they:

  • meet fundamental standards of safety
  • provide effective care to maximise outcomes
  • are caring and responsive to all users
  • are well led with robust governance structures[footnote 5] and processes in place

It takes action to protect those who use services. It conducts performance assessments and rates providers of services (with some exceptions). Assessments and ratings are publicly available.

While this review is focused on the SAF, it should be noted that CQC has a range of other statutory responsibilities - for example, its role in market oversight, monitoring the Mental Health Act 1983 and publishing the annual State of Care report.

In its Principles of effective regulation, the National Audit Office (NAO) states that regulation plays a crucial role in many aspects of public policy, serving diverse purposes such as protecting and benefiting individuals, businesses and the environment, as well as supporting economic growth. Regulation can take various forms, ranging from strict, prescriptive rules and enforcement measures to lighter-touch approaches that include guidance and codes of practice. To ensure regulatory effectiveness, NAO outlines principles in 4 main areas: design, analysis, intervention and learning.

These principles include:

  • defining a clear overall purpose based on a good understanding of the issues that the regulation aims to address
  • setting specific regulatory objectives
  • ensuring accountability
  • designing an appropriate organisational structure 

Before the early 1990s, there was little objective assessment of the quality of health and care, despite the early attempts of notable figures such as Florence Nightingale and Ernest Codman[footnote 6]. With increasing recognition of the high levels of variation in quality of care, combined with high-profile exposés of very poor outcomes, such as the Bristol heart scandal, the decision was taken to set up an independent reviewer of quality.

The National Care Standards Commission was established by the Care Standards Act 2000 as a non-departmental public body to regulate independent health and social care services, and improve the quality of those services in England.

In 2004, 2 separate bodies were established by the Health and Social Care (Community Health and Standards) Act 2003. CSCI’s role was to inspect and regulate social care services. The Healthcare Commission was responsible for assessing and reporting on the performance of both NHS and independent healthcare organisations to ensure they provided high standards of care. In parallel, the Mental Health Act Commission had been overseeing mental health services since the early 1980s.

CQC was established in the Health and Social Care Act 2008 and was operational from 1 April 2009. It replaced the Healthcare Commission, Mental Health Act Commission and CSCI. The formation of CQC streamlined oversight and introduced inspection frameworks.

CQC has been through a number of iterations since its formation. The first phase was based on a generalist approach to inspection and the emphasis was on compliance or non-compliance with standards.

In 2013, CQC introduced a new approach to inspections and ratings following numerous critical reports. The new approach used larger and more expert inspection teams to produce provider ratings for 5 quality domains:

  • safe
  • effective
  • caring
  • responsive
  • well led

Three chief inspectors were appointed who led both the development and delivery of the new inspection programmes. 

In 2021, CQC launched a strategy, A new strategy for the changing world of health and social care, to drive improvements and encourage innovation across the health and care system, and tackle health inequalities. Alongside this strategy, CQC embarked on an ambitious transformation programme. The wide-ranging changes implemented from 2021 to 2022 included:

  • introducing a new regulatory approach for health and care providers, ICSs and local authorities, a core component of which was the single assessment framework (SAF). The new framework was intended to make the assessment process simpler and more insight driven by drawing on a wide range of data about quality of care, with the ability to prioritise assessments and inspections
  • establishing a new regulatory leadership team to shape CQC’s priorities and drive improvement
  • changing how operational (inspection) teams are structured
  • implementing a new provider portal and regulatory platform

The Health and Care Act 2022 gave CQC powers to assess care at local authority and ICS level. Assessment of local authorities began in December 2023 but ICS assessments have not yet commenced. 

The history of quality regulation in the NHS is shown in ‘Appendix 1: quality and safety history and context’ below.

Different sectors

All providers of healthcare and adult social care regulated activities in England must register with CQC, including:

  • acute and specialist hospitals
  • mental health and community care services
  • social care providers
  • GPs
  • dentists
  • the independent sector

The size and scale of these sectors vary considerably. Social care providers are the largest sector by number of organisations (over 50% of all organisations regulated by CQC), while NHS trusts comprise the largest by revenue (total revenue of NHS trusts is around £121.2 billion[footnote 7] compared with around £38 billion[footnote 8] for social care providers). More information is shown in the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report.

While there are many commonalities across healthcare and social care, and the sub-sectors within these, there are also important differences, particularly around the model of commissioning (self-pay, private health insurance, or local authority or NHS commissioned) and the structure of provision (large private or independent sector, small private or independent sector, or NHS provider). The different sectors are described in more detail below.

Social care

Social care is partly a consumer market - 37% of residential social care is purchased privately by individuals or their families or carers[footnote 9] and 23% of the community care (largely domiciliary care) market is funded privately[footnote 10]. Local authorities have responsibility for ensuring the quality and safety of care, and care availability for all care users, as well as responsibility for commissioning the majority of social care packages (in residential or community settings).

Users and their carers, family or friends look for objective data on the quality of care[footnote 11] and assessment ratings can play a significant role in this. In the CareAware survey carried out by CQC, 76% of respondents considered an inspection report and rating for a care home when choosing a home. One in 10 people changed their choice of care home after checking the CQC inspection report and rating.

Social care is predominantly privately provided. There are around 19,000 care providers (organisations)[footnote 12]. Of the 6,000-plus care home providers (organisations), the largest 10 comprise about 18% of the market[footnote 12].

Across these, CQC has the power to assess and rate approximately 40,900 locations. There has been an increase in the number of social care locations that CQC has the power to assess and rate over the last 5 years, with the total number increasing from around 35,900 in 2019 to around 40,900 in 2024 - a 14% increase. This is largely driven by an increase in domiciliary care providers, increasing from around 9,700 in 2019 to around 13,600 in 2024 (see the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report and ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024: supplementary data tables’).

Around 10% of the market is private equity backed[footnote 13] with this concentrated in the largest providers. The larger providers will typically have their own quality assurance processes, reviewing and monitoring quality of care, putting in processes to improve, and incentivising high-quality care - but this may not be the case for smaller organisations.

Independent healthcare providers

Independent healthcare providers also operate in a mixed market. The independent healthcare sector is worth around £21 billion[footnote 14] with direct consumer self-pay accounting for around £1.4 billion in 2021[footnote 15] and private health insurance commissioning (or paying for) around £7 billion in 2022[footnote 16]. Separate sources provide an estimate of the value of NHS commissioning from the private sector in England of around £13 billion in 2022 to 2023[footnote 17]. The private health insurance sector has a high level of market dominance by the 2 largest insurers - BUPA (35%) and AXA PPP (32%), according to 2020 figures[footnote 16].

In the self-pay market, users are fully reliant on an independent regulator that can assure minimum safety standards and provide information on the wider quality of care, particularly the outcomes of care, in order to support patients and users to make informed choices based on reliable information. 

Some private health insurers have systems in place to vet and monitor the healthcare providers within their network. They may require providers to meet specific safety and wider quality criteria, undergo regular audits, and adhere to best practice guidelines. They may also engage in selective contracting, partnering only with those providers who meet their quality standards, thereby providing an informal form of regulation within their networks.

NHS commissioners (largely integrated care boards (ICBs)) are responsible for assuring the quality of care purchased from independent providers on behalf of local residents for whom they commission care.

The provision of private healthcare is, in parts, a fragmented market with a small number of large providers in mental health care provision and the surgical inpatient and diagnostic sectors, but an increasing number of smaller providers in the wider sector. As in social care, larger providers are likely to have their own quality assurance mechanisms in place, but this is less likely in smaller providers.

Dentists and GPs

Dentists and GPs are independent contractors with much of their income from the NHS (particularly in the case of GPs), and are increasingly overseen by ICBs that contract for services and actively manage their performance in order to ensure high-quality care and improve efficiency in the provision of care.

NHS providers

For NHS providers, there are substantive governance structures in place with quality committees in all providers and a requirement for these to report to boards (with opportunities for detailed scrutiny). They are predominantly commissioned by ICBs, which have a statutory requirement to assure quality of care in all providers. This is supplemented by regional and national governance structures within NHS England.

Local authorities

Local authorities are fundamentally different. They are responsible for direct work with people through social work and integrated delivery with local NHS services, alongside a role in assessing needs, commissioning services and/or arranging services for those who need help and advice. They also have a local safeguarding and quality assurance role.

CQC is looking at both the regulated services provided and commissioned by local authorities, and the quality of adult social care more widely.

Integrated care systems (ICSs)

Similarly, ICSs are not providers per se. Rather, an ICS is a coalition of health and social care providers along with their commissioners (local authorities and ICBs) within a defined geography.

ICBs are accountable to NHS England. Local authorities are accountable to their local populations through local elections held every 4 years.

These differences in governance structures and oversight mechanisms across sectors are significant. The regulation of services should reflect these differences.

The impact of CQC on the improvement of health and social care providers

Over the past 20 years, the quality of the health and care sector in England has seen significant improvement driven by:

  • advances in technology
  • increasing transparency about variation in the delivery and quality of care
  • an increased emphasis on seeing patients and users as consumers who can both choose where and by whom to receive care, and play a more active role in considering what care they receive

Since its creation in 2009, CQC has established quality standards, conducted inspections, published the outcomes of those inspections and enforced compliance - all with the aim of providing more information to users and patients, holding providers accountable and fostering a culture of continuous improvement. CQC has produced guidance for providers, and shared best practice, through targeted projects in sectors and across themes, and aggregated insights to show systemic trends across health and social care.

One notable example of CQC’s achievement through its regulatory activities is its involvement in the investigation of poor-quality maternity care at Morecambe Bay. The Morecambe Bay Investigation was triggered by concerns over a series of maternal and neonatal deaths at Furness General Hospital, part of the University Hospitals of Morecambe Bay NHS Foundation Trust, between 2004 and 2013.

CQC’s inspections uncovered poor clinical practices, inadequate staffing, and weak leadership, leading to significant scrutiny of the hospital’s management. This prompted a detailed investigation with recommendations for improvement and resulted in substantial policy and practice changes, improved safety protocols, enhanced staff training and better governance.

A report into the impact of CQC on provider performance carried out by Alliance Manchester Business School and the King’s Fund, published in September 2018, examined how CQC’s regulatory activities affect health and social care providers. The report developed a framework that outlines 8 mechanisms through which regulation can impact provider performance: anticipatory, directive, organisational, relational, informational, stakeholder, lateral and systemic.

Overall, the report found that providers support the need for quality regulation and viewed CQC’s approach as an improvement over the previous system. The report was able to identify impact through qualitative data, and noted that regulatory interactions can lead to internal reflections and improvements in team dynamics, leadership and organisational culture. Positive ongoing interactions between CQC inspectors and providers can facilitate continuous improvement and better communication. However, the report found it difficult to identify a significant impact from inspections and ratings, while noting that CQC’s conceptualisation of the quality of care in the 5 domains - safe, effective, caring, responsive and well led - had been embraced and had become a pervasive framing of the quality of care.

In 2019, CQC commissioned a further report from the Alliance Manchester Business School into its impact on quality of care as part of the evaluation of CQC’s 5-year strategy, A new strategy for the changing world of health and social care, which had been published in May 2016. CQC’s impact on the quality of care: an assessment on CQC’s contribution and suggestions for improvement (PDF, 893KB), published in 2020, found that CQC could be an important driver for change, but more work needed to be done to enable this to happen more consistently and across the whole of CQC.

CQC continues to carry out a number of research and evaluation projects, which are outlined in its annual reports.

Methodology

The terms of reference for this review were to examine the suitability of the SAF methodology for inspections and ratings with specific questions regarding the degree to which CQC supports innovation in care delivery and the efficient and economic delivery of health and care services. The full terms of reference are in ‘Appendix 2: review terms of reference’.

While CQC has responsibility for assessing the provision of care in a very wide range of settings - for example, in prisons, children’s homes and defence medical services - this review has been limited to the areas that make up the overwhelming majority of its work:

  • social care providers (residential and community based)
  • NHS providers (trusts and GPs)
  • dentists
  • independent sector (private and charitable) healthcare providers

The review has been informed by one-to-one interviews and roundtable discussions with around 300 people. This includes:

  • patients and users, and their representative groups
  • social care providers (both smaller and larger providers)
  • executive directors and regional directors of NHS England
  • ICB CEOs and chairs
  • NHS trust CEOs, chairs, medical directors and nurse directors
  • GPs
  • the British Dental Association
  • independent sector healthcare providers
  • senior managers of local authorities

At CQC, the review has spoken to:

  • the majority of the executive team (including the chief executive who served from 2018 to June 2024)
  • the chair and non-executive directors
  • national professional advisers
  • most of the wider leadership team
  • members of the staff forum
  • trade union representatives

A list of all participants is shown in ‘Appendix 3: list of people spoken to for this review’.

The interim report was informed by a snapshot of data and summary tables provided by CQC to the review team in June 2024. The review team has since constructed a historical data set, using management information provided by CQC in August and September 2024, to independently analyse data on CQC performance. A full summary is available in the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report with significant findings referenced in the report.

Conclusions

This review has found significant failings in the internal workings of CQC, which have led to a substantial loss of credibility within the health and social care sectors, deterioration in the ability of CQC to identify poor performance and support a drive to improved quality, and a direct impact on the capacity and capability of both the social care and healthcare sectors to deliver much-needed improvements in care.

The conclusions are summarised around 10 topics:

  1. Poor operational performance.
  2. Significant challenges with the provider portal and regulatory platform.
  3. Delays in producing reports and poor-quality reports.
  4. Loss of credibility within the health and care sectors due to the loss of sector expertise and wider restructuring.
  5. The SAF and its application.
  6. Lack of clarity regarding how ratings are calculated and concerning use of the outcome of previous inspections (often carried out several years ago) to calculate a current rating.
  7. CQC’s assessment of local authorities’ Care Act duties.
  8. ICS assessments.
  9. Supporting providers to improve quality across the health and care sector.
  10. The sponsorship relationship between CQC and DHSC.

The review has heard that many people within CQC tried to raise concerns about the changes made over the last few years - the introduction of the SAF, the new provider portal and regulatory platform, and the organisational restructure. Members of CQC’s staff forum, trade union representatives at CQC, senior professional advisers and a number of executives all told the review that they gave feedback and raised concerns about the changes being implemented but did not feel listened to:

We expressed numerous concerns around how the SAF/new ways of working would present challenges for hospitals/mental health service inspections as the SAF didn’t seem to consider the complexities of these. We said how the changes would hold us back from having oversight of risk within services, not being able to inspect as regularly as leaders envisioned, problems with the new IT system, as well as issues with staffing and morale. We weren’t listened to, leaders went ahead anyway, and now we are where we are.

– CQC staff member

Staff engagement sessions, ‘task and finish’ groups, and invitations for staff to provide feedback all took place, but did not appear to have any impact on the decisions being taken:

We’ve had willing people in the organisation who wanted to make a real contribution to these changes that were just point blank ignored and isolated.

– CQC staff member

There wasn’t really any meaningful consultation process in place at the time.

– CQC staff member

The review has received more than 125 emails from CQC staff members, providing a consistent account of the last few years, and has seen 2 letters from the recognised trade unions of CQC to former Secretaries of State for Health and Social Care informing them of significant issues.

The review acknowledges that there will always be internal resistance to any change or transformation programme, but the scale and consistency of comments is striking.

Conclusion 1: poor operational performance 

In order to ensure that health and social care services provide people with high-quality care, CQC registers organisations that apply to carry out regulated activities, carries out inspections of services and publishes information on its judgements.

Operational performance was impacted by the COVID-19 pandemic. In March 2020, CQC paused routine inspections and focused activity on where there was a risk to people’s safety. However, almost 2 and a half years after the steps in the previous government’s ‘Living with COVID-19’ strategy were implemented, the review has heard that CQC’s operational performance is still not where it should be.

There has been a stark reduction in activity with just 6,700 inspections and assessments carried out in 2023 to 2024, partly due to the rollout of the SAF[footnote 18]. This compares with around 15,800 inspections conducted in 2019 to 2020 (see the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report and ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024: supplementary data tables’). CQC’s unpublished business plan for 2024 to 2027 includes a target for 16,000 assessments to be carried out in 2024 to 2025.

The reduction in activity has resulted in considerable delays in re-inspecting providers after a ‘requires improvement’ or ‘inadequate’ rating. While there is a risk-based approach to prioritising assessments, which means providers with poorer ratings are inspected more frequently, a number of providers are currently “stuck in ‘requires improvement’” despite improving services and making the changes required by CQC. Data from CQC shows that the time taken to carry out a re-inspection after an ‘inadequate’ rating has increased from 87 days in 2015 to 136 days in 2024, and the time to carry out a re-inspection following a ‘requires improvement’ rating has risen from 142 days to 360 days in the same time frame (see the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report and ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024: supplementary data tables’). Interviewees told the review that this could result in hospital discharge teams refusing to discharge people to them, or local authorities refusing to contract with providers, with a further knock-on impact on capacity, staff morale within providers and the overall ability of providers to operate in a sustainable way:

LAs [local authorities]/ICBs won’t contract with us, banks won’t lend money to care homes without a CQC approval, investors get put off…

– Head of quality at a care home provider

The reduction in activity has resulted in some organisations not being inspected for several years. Data provided by CQC suggests that the oldest rating for a social care organisation is from February 2016 (over 8 years old) and the oldest rating for an NHS hospital (acute non-specialist) is from June 2014 (around 10 years old). The average age of current ratings across all locations is 3.9 years (or 3 years and 10 months) as of 30 July 2024, although this varies by location type. Furthermore, of the locations CQC has the power to inspect, CQC estimates that around 19% have never been rated (see the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report).

As well as carrying out fewer inspections overall, CQC has moved away from ‘comprehensive’ inspections to more ‘focused’ inspections examining a few service areas or key questions (quality domains). Of inspections carried out between 1 January 2024 and 30 July 2024, 43% were assessments under the new SAF, a fifth were ‘comprehensive’ and 36% were ‘focused’. In 2023, of 6,734 inspections, 3,107 were ‘comprehensive’ and 3,598 were ‘focused’. This compares with the years 2015 to 2019, when around 90% of all inspections were ‘comprehensive’ (see the accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report and ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024: supplementary data tables’).

The reduction in activity, and the lack of timely inspections and ratings (or, in some cases, any at all), means that patients and users are unable to compare services with others in order to help them choose care, and providers do not have the insights from an expert inspection, resulting in a missed opportunity for improvement. In some sectors, alternative quality ratings are beginning to come to the fore to fill the gap[footnote 19] - care home quality leads told the review that they end up relying on local authority provider assessment and market management solution reports to provide an objective view of quality.

As well as delays in carrying out inspections, there is a backlog in registrations of new providers, though this does need to be set against the increase in demand for registration over the last few years. The total number of locations CQC has the power to rate has increased by around 11% from 2019 to 2024. According to CQC’s ‘Corporate performance report (2023/24 year end)’, at the end of 2023 to 2024, 54% of applications pending completion were more than 10 weeks old[footnote 18]. CQC has a key performance indicator (KPI) to reduce this proportion, but it has instead risen from 22% at the end of 2022 to 2023. The review heard that the backlog in registrations was a particular problem for small providers trying to set up a new care home, domiciliary care service or a healthcare service, and could result in lost revenues and investment, which had a knock-on impact on capacity.

The performance of CQC’s call centre is poor, with interviewees telling the review that calls took a long time to be answered. Data from CQC shows the average time for calls in relation to registration (the most common reason for calling) to be answered between January and June 2024 was 19 minutes. CQC does have a KPI to achieve a 60% to 80% response rate on its national customer service centre call lines and this was achieved in 2023 to 2024, with a response rate of between 63% and 76% across the 4 lines. This means that between a quarter and more than a third of calls were dropped before they were answered. For calls related to registrations, only 66% were answered, with 34% abandoned - equivalent to nearly 19,000 calls. Almost a third of safeguarding-related calls were abandoned despite an average wait of only around 7 minutes. More recent data shared by CQC showed that there has been some improvement over recent months, with 79% of general enquiry-related calls and 76% of mental health-related calls answered before the caller rang off.

Poor operational performance has not been helped by poor data management within CQC. In developing this report, the review has found it challenging to use CQC data sets to analyse trends and patterns over time.

The review has concluded that poor operational performance is undermining CQC’s ability to ensure that health and social care services provide people with safe, effective and compassionate care, limiting the opportunity to improve health and social care services and, in some cases, affecting providers’ ability to deliver services at all.

Conclusion 2: significant challenges with the provider portal and the regulatory platform

New IT systems were introduced into CQC from 2021 onwards. The provider portal was launched in July 2023 but was not used in significant numbers until April 2024. The regulatory platform was launched in November 2023 for assessments, with registration and enforcement added by April 2024.

They were implemented with the intention of:

  • improving operations and communications with providers
  • enabling a move to a much more insight-driven approach to regulation
  • highlighting emerging risks
  • supporting more risk-informed, responsive assessments and inspections

However, the deployment of the new systems resulted in significant problems for users. The review has heard that providers cannot easily upload documents, that there are problems if the named user is away or off sick, and that a password reset can take hours to come through. This takes staff away from delivering or supporting frontline care and causes considerable frustration:

There are so many practical problems with the portal.

– Care provider

… Simple things like not having an auto response to let you know the data has been uploaded correctly are not there and that adds to the stress and difficulties.

– Care provider

These problems are not limited to providers, with CQC staff also telling the review about numerous problems with the system on their side. The review heard that uploading evidence to the platform is the single biggest issue:

It is demoralising, it takes so long and literally is frustrating.

– CQC staff member

… the time it takes to gather information and upload it to the system has meant that it now takes much longer to carry out an inspection, upload information and publish a report. We produced far more inspection reports with our old ways of working. I am not able to inspect/assess anywhere near the same frequency as I did before.

– CQC staff member

The review has heard of a number of issues with how the new system and regulatory platform are managing safeguarding concerns and reports of serious untoward incidents. It is not the role of CQC to investigate safeguarding concerns - responsibility for this sits with local authorities under the Care Act 2014. However, if CQC is the first recipient of safeguarding information, it should be classified as a ‘priority 1’ and a referral made to the local authority. Changes made during development of the regulatory platform have had unintended consequences: referrals do not contain adequate information for the local authority to act on, and CQC staff face challenges in ensuring that concerns raised are managed in a timely manner.

The review has concluded that poorly performing IT systems are hampering CQC’s ability to roll out the SAF and appropriately manage concerns raised with CQC, and cause considerable frustration and time loss for providers and CQC staff.

Conclusion 3: delays in producing reports and poor-quality reports

As well as delays in carrying out assessments, all sectors told the review that they can wait for several months to receive reports and ratings following assessments. This increases the burden on, and stress of, staff and results in lost time when quality improvements could have been made.

The review has heard multiple comments about poor-quality reports - these have come from providers and from members of the public. Specific issues raised include:

  • poor structuring of reports, which are hard to read and follow
  • different messages in summaries than in the main report
  • some sections copied from other providers
  • disparity between the tone and evidence used in a report, and the subsequent rating awarded
  • difficulty for users in understanding assessment reports, lacking explanation of type of residents and limited information on number of beds
  • lack of clarity in dates of assessments - for example, providing a recent date for an update when information was in fact gathered by CQC often years before, without clear explanation
  • a lengthy and complicated scoring system that is not easy to understand

The review has heard that some of this is due to challenges with the information technology within CQC. Irrespective of cause, poor-quality and delayed reports hamper users’ ability to access information, and limit the credibility and impact of assessments for providers. There should be greater consistency in the quality of reports and learning from examples of better-quality outputs that have been published.

Conclusion 4: loss of credibility within the health and care sectors due to the loss of sector expertise and wider restructuring, resulting in lost opportunities for improvement

CQC underwent an internal restructuring in 2023, alongside the introduction of the SAF and the new IT systems. This was an ambitious combination:

… Changing team structure, changing IT and introducing a new framework all at the same time was bonkers.

– Senior CQC executive

The restructuring moved operational staff from 3 directorates with a focus on specific sectors into integrated teams operating at a local level. These integrated assessment and inspection teams (IAITs) include a combination of inspectors and assessors, with those moving from the previous model retaining their previous specialisms. Under the previous model, ‘inspection managers’ worked within the sector-based directorates. Now ‘operational managers’ oversee an integrated team of inspectors and assessors with different specialisms.

When recruiting new staff, CQC does try to recruit individuals with a health or social care background, although the type and level of experience will vary. On recruitment, new inspectors and assessors are allocated to a specialism - for example, ‘inspector, adult social care’ or ‘assessor, primary care’. An adult social care inspector will then, for the most part, do adult social care inspections and not generally be expected to work outside of their sector.

However, in practice, local teams may have an unequal balance of inspectors with different specialisms, with the composition of IAITs dictated by individuals’ geographical locations, regardless of their previous specialism. Where there are vacancies within a team, or recruitment, capacity or demand challenges, inspectors are required to inspect in an area that is new to them and does not match their specialism (for example, an adult social care inspector will need to conduct inspection work for a mental health service if that is a priority activity for their team). Similarly, some operational managers provide assurance or support for the activity of inspectors for a sector in which they may not have previous experience.

In these cases, CQC seeks to ensure that inspectors are supported by others with greater experience in the sector, ‘experts by experience’ (people who have recent personal experience of using or caring for someone who uses health or social care services), senior specialists or national professional advisers, as well as ensuring that inspectors have a strong understanding of the regulations and fundamental standards.

However, the review heard that:

… the new structure may result in situations where there is an isolated single person, with a mental health background or knowledge of mental health, on an inspection team looking at a mental health unit.

– Senior executive at CQC

… [staff] lost their professional home, they lost their sense of team, they lost their expertise.

– Former senior inspector at CQC

While the review recognises that prior experience of a sector is not a prerequisite for someone to be a credible assessor and that experience does not, in itself, bring credibility, the review has heard consistently from providers about a reduction in expertise and seniority of inspection teams over the last couple of years:

… [it is] not clear they have the expertise/capabilities to carry out well led reviews - we often have very different views of the performance of people and boards, having worked with them over long periods of time.

– NHS England regional director

The loss of sector expertise has been compounded by changes to the roles of chief inspector. Where previously there were 3 roles - chief inspectors of social care, primary care and hospitals, each headed up by highly respected national figures with deep expertise in their sector - there are now 2 chief inspectors: one for adult social care and integrated care, and one for healthcare.

The healthcare leadership team consists of a mental health nurse, a pharmacist and an NHS manager. They are supported by a medical director and national professional advisers who are drawn from a wide range of backgrounds. The previous chief inspector of healthcare was unfortunately taken ill over a year ago and has been unable to work since. A request to bring in a new chief inspector for healthcare was first discussed with DHSC in September 2023, with a formal request made in February 2024.

The lack of sector expertise results in providers not trusting the outcomes of reviews and not feeling they have the opportunity to learn from others, especially from highly regarded peers.

This lack of expertise has been compounded by a reduction in ongoing relationships between CQC staff and providers. The chief executive in post prior to 2018 and chief inspectors would spend considerable time with senior members of the health and care sectors, building relationships, hearing their perspectives on care delivery, and explaining and sharing the insights CQC was gathering. This has been described on both sides as invaluable and has been largely lost.

At the local level, inspection teams would similarly build relationships with senior leaders from across the sectors to build confidence and support early awareness of emerging problems:

We’ve lost local relationships - lost the ability to speak to people who understand what’s happening locally.

– Senior executive at CQC

The review has found that the current model of generalist inspectors and a lack of expertise at senior levels of CQC, combined with a loss of relationships across CQC and providers, is impacting the credibility of CQC, resulting in a lost opportunity to improve health and social care services.

Conclusion 5: concerns around the single assessment framework (SAF) and its application

The SAF was developed as part of CQC’s 2021 strategy, ‘A new strategy for the changing world of health and social care’, to address the duplication and inefficiency inherent in the predecessor assessment frameworks, and to more fully realise its 2016 strategic ambition to develop a single, shared view of quality for health and care across all sectors.

It was hoped that data and insights could be collected in advance across all areas of care in order to have an ‘always on’ assessment model. Where new evidence was collected or received by CQC, it could be considered (based on professional judgement) with quality statements and ratings updated accordingly. Data and insights are supplemented by the evidence collected by CQC’s operations teams and support CQC to have a continued focus on the risk of poor care, so that emerging problems can be identified at an early stage, and organisations with data suggesting poorer-quality care can be assessed and inspected more frequently.

This review considers this concept and approach to be sensible and in line with regulation in other sectors, but it is clearly dependent on reviewing robust data and insights in a timely manner.

In 2022, CQC shifted its planned launch of the new SAF from January 2023 to later in 2023 because of internal delays and feedback from providers. The SAF was subsequently rolled out in November 2023 to a small number of providers across sectors as part of the early adopter programme with rollout continuing in a phased manner.

The framework sets out 34 areas of care quality (called ‘quality statements’) that could be applied to any provider of health or social care, with a subset applied to assessments of ICSs and local authorities. These align to the 5 domains of quality used for many years and referred to as ‘key questions’ within the SAF. CQC describes these in its documentation on the SAF as below:

  • safe:
    • safety is a priority for everyone and leaders embed a culture of openness and collaboration
    • people are always safe and protected from bullying, harassment, avoidable harm, neglect, abuse and discrimination
    • people’s liberty is protected where this is in their best interests and in line with legislation
  • effective:
    • people and communities have the best possible outcomes because their needs are assessed
    • their care, support and treatment reflects these needs and any protected equality characteristics
    • services work in harmony, with people at the centre of their care. Leaders instil a culture of improvement, where understanding current outcomes and exploring best practice is part of everyday work
  • caring:
    • people are always treated with kindness, empathy and compassion
    • people understand that they matter and that their experience of how they are treated and supported matters
    • their privacy and dignity is respected
    • every effort is made to take their wishes into account and respect their choices, to achieve the best possible outcomes for them. This includes supporting people to live as independently as possible
  • responsive:
    • people and communities are always at the centre of how care is planned and delivered
    • the health and care needs of people and communities are understood and they are actively involved in planning care that meets these needs
    • care, support and treatment is easily accessible, including physical access
    • people can access care in ways that meet their personal circumstances and protected equality characteristics
  • well led:
    • there is an inclusive and positive culture of continuous learning and improvement. This is based on meeting the needs of people who use services and wider communities, and all leaders and staff share this
    • leaders proactively support staff and collaborate with partners to deliver care that is safe, integrated, person-centred and sustainable, and to reduce inequalities

For each of these 34 quality statements, there are 6 ‘evidence categories’ where information about quality of care is collected. These are:

  • people experience
  • staff experience
  • partner experience
  • observations
  • processes
  • outcomes

However, the SAF is intended to be flexible and not all areas are considered for all 34 quality statements. It is tailored to different sectors by determining which evidence categories are relevant for each quality statement for each sector. Within the SAF, there are priority quality statements identified for each sector, which are the starting point of a planned assessment. Additional quality statements may then be chosen based on the risk and improvement context for providers.

The number of priority quality statements and priority evidence categories assessed varies across sectors and sub-sectors. For the sectors reviewed, the quality statements assessed under each key question vary considerably. The average number of quality statements currently used in SAF assessments is 9.2 (as of 30 July 2024) or under a third of the total 34 quality statements. The most commonly assessed quality statement under ‘safe’ was ‘safe and effective staffing’, with 98% of ‘safe’ assessments across the 6 sectors being assessed against this quality statement. This compares with only 30% of ‘safe’ assessments looking at ‘safe systems, pathways and transitions’. The accompanying ‘Analysis of Care Quality Commission data on inspections, assessments and ratings, 2014 to 2024’ report illustrates which evidence categories are considered for each quality statement and which ones are priority quality statements for different sectors.

The review has identified 7 concerns with the SAF:

  1. The way in which the SAF is described is poorly laid out on the CQC website, not well communicated internally or externally and uses vague language.
  2. There is limited information available for providers and users or patients as to what care looks like under each of the ratings categories, resulting in a lack of consistency in how care is assessed and a lost opportunity for improvement.
  3. There are questions about how data on user and patient experience is collected and used.
  4. More could be done to support and encourage innovation in care delivery.
  5. There is insufficient attention paid to the effectiveness of care and a lack of focus on outcomes (including inequalities in outcomes).
  6. There is no reference to use of resources or the efficient and economic delivery of care in the SAF, which is a significant gap, despite this being stated in section 3 of the Health and Social Care Act 2008.
  7. There is little reference to, or acknowledgement of, the challenges in balancing risk and ensuring high-quality care across an organisation or wider health and care system.

Concern 1: the way in which the SAF is described is poorly laid out on the CQC website, not well communicated internally or externally and uses vague language

The summary of the SAF (as set out at the start of this section) should be easy for any health or care provider, or any member of the public, to find and access. However, the SAF is instead described on CQC’s website in a way that the review found confusing. The pages are poorly laid out, lack structure and numbering, and do not include a summary like the one set out above.

Furthermore, the descriptions of each of the 5 key questions (safe, effective, responsive, caring and well led) are extremely generic, use vague language, and lack tangible, objective measures of quality or of its impact on users and patients:

The SAF, although [it] supposedly reduces repetition, duplicates statements, is confusing, [and] badly worded.

– CQC staff member

The 117 pages of the woolly SAF are unwieldy, hard to use, difficult to comprehend and purport to cover all care services. The style is off-putting with the ‘we’ and ‘I’ statements.

– Representative of a patient or user group

Furthermore, the review has found that, across the executive team, few were able to describe the 34/6 framework (34 quality statements and 6 evidence categories), the rationale for prioritising particular quality statements in each sector, the rationale for which evidence categories are used for different quality statements, the way in which ratings are calculated and so on. This should be widely understood in CQC irrespective of role or background.

Concern 2: there is limited information available for providers and users or patients as to what care looks like under each of the ratings categories, resulting in a lack of consistency in how care is assessed and a lost opportunity for improvement

The descriptions as to what ‘good’ looks like for each quality statement do exist. For example, for the ‘safeguarding’ quality statement, the description of ‘good’ says: “We work with people to understand what being safe means to them as well as with our partners on the best way to achieve this. We concentrate on improving people’s lives while protecting their right to live in safety, free from bullying, harassment, abuse, discrimination, avoidable harm and neglect. We make sure we share concerns quickly and appropriately.”

There is more detail available in guides to providers - for example, that shown in ‘Appendix 4: example of a rating descriptor’ and on the CQC website. However, while there used to be tangible, clear and objective descriptions of each area assessed in the old model for each of the 4 ratings categories - see ‘Appendix 5: examples of rating descriptors in previous assessment model’ - this has not yet been developed for the new framework.

The lack of clear descriptions does not support organisations to improve. The review heard time and time again from providers that they struggle to know what inspectors are looking for, they are not learning from them and, as a result, they don’t know what they need to do to be better:

What does outstanding look like? We don’t have it.

– CEO at an NHS foundation trust

They should not be an improvement body but should have a clear view as to what good/outstanding care looks like.

– NHS England senior executive

[There is a] failure from CQC to clearly articulate what it means to be each rating.

– Senior executive at a care provider

The review also heard that inspectors struggled to articulate the SAF and the definitions they should be working to. It was told that, in some instances, operational staff lacked guidance on how to inspect or assess providers under the SAF, with previous information on the intranet no longer accessible. Strict limits on the number of words that can be written for each evidence category mean that issues cannot be reported in enough detail to help organisations improve:

We previously had detailed ratings guidance and would carefully assess each rating against that. Deputy chief inspectors would assess each rating to check.

– Previous senior inspector at CQC

It’s not clear an assessor/inspector knows how to allocate a score of 1 to 4 - it is very subjective.

– Senior executive at CQC

[The] lack of a clear description for each quality statement and evidence category means it’s hard to know why CQC decides to prosecute one provider and not another.

– Lawyer working in the healthcare sector

Many providers referred to a lack of consistency in ratings awarded to providers. This is not a new issue for CQC, but the review found that this has, so far, not been effectively addressed by the move to the SAF. Those who work across multiple sites (for example, a large care home provider with multiple sites or a large group of GP practices) report ratings differing from one site to another, when they know, having spent far more time with them, that their performance does not align with the reports. This applies in all directions - for example, their poorer-quality providers are getting better ratings than their top providers and vice versa:

There is [a] major problem at the moment with consistency - even within a single large provider, one home can be rated as ‘requires improvement’ while another is ‘outstanding’. [The] difference may be down to very small things such as signage. That’s not credible. CQC needs to be much clearer on what good looks like - which would then make it easier for assessors/inspectors.

– Senior executive at a large social care provider

Concern 3: there are questions about how data on user and patient experience is collected and used

With the development of the SAF, CQC has sought to place greater emphasis on people’s experience of the care they receive. Indeed, a senior inspector stated: “People’s experience is the most important thing we look at.”

CQC has a range of methods for gathering feedback on people’s experience. Data is drawn from national surveys (for example, the NHS Patient Survey Programme, NHS England GP Patient Survey and the Personal Social Services Adult Social Care Survey, also known as the Adult Social Care Survey), all of which have had some statistical analysis applied and which exclude any results relating to fewer than 30 users at any one site.

Surveys are supplemented by interviews with service users. The review heard concerns from providers that relatively small numbers of users or patients are interviewed - sometimes as few as tens of users for a service that looks after thousands of people a year in the case of a GP practice, or hundreds of thousands in the case of a hospital.

The review heard from users, patients and their representative groups that there is confusion about the mechanisms through which patients and users can give feedback to CQC about individual providers. While CQC can use complaints as evidence, it is not CQC’s role to investigate or respond to complaints. 

The ways in which complaints are managed across health and social care will be further considered in a second review, looking at the wider safety landscape. That review will also consider the breadth of bodies currently collecting patient and user experience feedback, and how this could be more effectively channelled and used as a basis for assessment and improvement.

CQC does assess providers on whether they are actively seeking, listening to and responding to the views of people who have had, or are most likely to have, a poorer experience of care, or those who face more barriers to accessing services.

There is similarly a need to ensure representative surveys of all staff. CQC has been using NHS Staff Survey data as a long-standing source of evidence in their assessments and ratings of NHS trusts. This is established in dashboards available for operations teams, and used in preparatory analysis carried out for specific NHS trust inspections. It is supplemented by interviews with staff on site during inspections, with similar concerns expressed about the representativeness of views collected from a small proportion of staff.

Concern 4: more could be done to encourage and support innovation in care delivery

The review was asked to consider how CQC considers innovation in care delivery.

Innovation is considered within the SAF under the ‘learning, innovation and improvement’ and ‘governance, management and sustainability’ quality statements under the ‘well led’ key question, and the ‘delivering evidence-based care and treatment’ quality statement under the ‘effective’ key question. There is also some CQC guidance on using and sharing innovation to reduce health inequalities.

Furthermore, CQC is actively seeking to develop operational guidance on inspection of new approaches such as virtual wards and vision-based monitoring systems in inpatient mental health units to help operational colleagues understand how they can assess these appropriately. The Capturing innovation to accelerate improvement project has developed 4 innovation case studies and a resources map to support services. CQC has sought to support digitisation of social care records by including the adoption of digital records as a measure of best practice in the SAF, and by communicating future expectations for providers on the role of digitisation in improving and maintaining care quality.

There is some feedback from provider surveys: CQC’s own annual survey of providers showed that just under 50% of providers agreed or strongly agreed that CQC’s information supports their service’s efforts to innovate or adopt innovation from elsewhere. In 2024, NHS Providers published research, Good quality regulation: how CQC can support trusts to deliver and improve, that highlighted the need for CQC to do more to “support and encourage improvement and innovations”.

The review heard from providers that there is insufficient focus on encouraging, driving and supporting innovation in care delivery:

Regulator doesn’t have a great tradition of innovation. Used to hear from members: ‘inspector didn’t like this, that or the other innovation’.

– Care provider

While the then-chief inspector of adult social care recognised and applauded our innovation, the local inspectorate teams did not have the same response.

– Care provider

While CQC is seeking to identify some innovations that can be woven into its assessment framework, the opportunities to endorse and embed change and improvement are far greater. For example, a recent report highlighted multiple opportunities for greater adoption of technology and wider innovation in care delivery[footnote 20].

Supporting innovation could be a galvanising factor in driving better-quality, more efficient and more responsive care across all sectors. There is scope for CQC to further expand its work to ensure a greater focus on driving innovation - and sharing best practices. 

In comparison, for example, Ofsted works closely with the Department for Education to consider what changes and improvements to schools are planned, and incorporates those into its inspection framework. While CQC does work closely with DHSC to incorporate new legislative requirements into the assessment framework, CQC and DHSC should consider how to ensure a greater focus on innovation and new models of care.

Concern 5: there is insufficient attention paid to the effectiveness of care and a lack of focus on outcomes (including inequalities in outcomes)

The 5 key questions of quality (previously called quality domains) - safe, effective, caring, responsive and well led - are well established and intended to be used as a holistic overview of quality. However, there has been an increasing focus on the ‘safe’ key question with a surprising lack of attention to ‘effective’. While this may reflect an increasing focus on organisations where there are concerns about safety, it fails to recognise the crucial importance of effectiveness, which looks at the impact (or outcomes) of care on improving people’s health and wellbeing.

Between 4 December 2023 and 30 July 2024, CQC completed 980 assessments under the new framework (this figure includes unpublished assessments). Of these, 75% looked at the ‘safe’ key question, whereas only 38% looked at the ‘effective’ key question. The proportion of assessments looking at each key question varies across sectors, with 90% of adult social care assessments looking at ‘safe’ compared with 67% of secondary and specialist care assessments (which includes NHS acute hospitals). No community health assessments had looked at the key questions ‘effective’ and ‘responsive’, and no secondary and specialist care assessments had looked at the key question ‘caring’.

Of all the quality statements considered during the same time period, 40% were within the ‘safe’ key question and 13% within the ‘effective’ key question. Among secondary and specialist care assessments carried out, 63% of quality statements considered were within the ‘safe’ key question and 9% within the ‘effective’ key question.

The number of quality statements focused on during assessments also varied from sector to sector. On average, 9.2 out of 34 quality statements were used, ranging from 2.1 for mental health assessments to 11.6 for adult social care assessments. Of the assessments that looked at the key question ‘effective’, only 34% considered the quality statement ‘monitoring and improving outcomes’.

Within the framework, there is an evidence category for outcomes data but little attention is given to this. For example, within primary care, only 2 of 100 (2%) evidence categories considered across 34 quality statements refer to outcomes of care. Of the 34 quality statements considered for primary care, only 6% have outcomes as an evidence category (compared with 68% looking at people’s experience). Even when outcomes are considered, there is a very narrow set of metrics routinely reviewed. This is despite there being considerable data available about outcomes of care in primary care, such as how well controlled an individual’s diabetes is and the impact of that care - for example, rates of admission to hospital with the complications of a long-term condition (renal failure in people with diabetes, acute cardiovascular events in people with high blood pressure or atrial fibrillation).

A number of providers commented that inspections tended to concentrate on more trivial issues. For example, at a primary care providers roundtable, the review heard that the failure to calibrate 1 out of 15 sets of weighing scales was reported on. Other providers referred to an overemphasis on documentation of things like employment practices, which, while being important, should not be done at the expense of assessing all key questions including the outcomes of care:

[It is] not clear they focus on the real things that matter.

– NHS England director

200 pieces of ‘evidence’ were asked for but it was not clear what they were used for.

– Social care provider

While there is limited nationally consistent data available on the outcomes of social care at a care provider level, CQC does collect data from providers on adverse events in care, including serious injuries, safeguarding incidents and deaths. At a local authority level, slightly more outcomes data[footnote 21] is available and assessments do take into account measures from the Adult Social Care Outcomes Framework, including social care-related user quality of life.

Within healthcare, NHS England publishes over 30 national clinical audits, along with data from the Getting It Right First Time (GIRFT) programme - both of which could be drawn on more to provide comparisons of outcomes across providers. The GIRFT programme provides data on mortality rates, length of stay, post-surgical infection rates and hospital re-admission rates in more than 40 surgical and medical specialties, but is not used by CQC.

The review has heard positive feedback about the work of the primary medical services team in CQC in introducing clinical searches of GP records. These allow for reviews of prescribing, monitoring of high-risk medicines, management of long-term conditions and potential missed diagnoses. An initial evaluation of the searches found that 97% of inspectors, 91% of specialist clinical advisers and 65% of providers believed they helped to identify risks around safe and effective patient care.

Across effectiveness and outcome data, the review has struggled to find reference to measures looking at outcomes by different population groups, in particular the Core20PLUS5 groups, though the review understands this is being considered in CQC’s developing approach to assessing ICSs.

The review was similarly surprised to see the lack of measures of outcomes in independent sector providers, particularly given the emphasis on this in the Paterson Inquiry report. The Paterson case exposed significant gaps in the oversight of private healthcare providers and across the NHS - the accompanying review found:

  • a lack of effective oversight
  • the need for a regulatory framework that not only enforces standards but also actively monitors and audits clinical practices and outcomes within the independent sector
  • a need for more accessible whistleblowing mechanisms within the private sector

Following the Paterson case, there has been significant work in the private sector, supported by CQC, to strengthen clinical governance across the industry including the development of the Medical Practitioners Assurance Framework (MPAF), which has been embedded in both the CQC inspection regime and the NHS Standard Contract. The MPAF is a contemporary consensus view of medical best practice designed as a framework for clinical governance across the sector.

Concern 6: there is no reference to use of resources or efficient and economic delivery of care in the SAF, despite this being stated in section 3 of the Health and Social Care Act 2008

The review was asked to consider how CQC supports the efficient, effective and economic delivery of care. This is part of the scope of CQC and was set out in the Health and Social Care Act 2008. However, within the SAF, there is no quality statement that considers use of resources or efficient delivery of care. The review understands that ‘use of resources’ assessments used to be conducted by NHS England, but were paused during the pandemic and are no longer done.

The lack of an objective assessment of the efficient and economic delivery of care is disappointing as effective use of resources is one of the most impactful ways of improving quality of care for any provider. More efficient deployment of staff and more efficient use of assets (such as beds, diagnostics and theatres) enables more people to be cared for and better care for individual patients. Furthermore, a number of recognised metrics of high-quality care are also good metrics of efficient services and good use of resources - for example, length of stay in an inpatient facility[footnote 22].

The quality statement on safe and effective staffing assesses staffing levels based on whether the provider has the staff numbers recommended in guidance from national bodies, but does not independently consider whether services could be delivered in a more efficient and effective way:

We could just add staff and we would get a better rating, but it wouldn’t necessarily improve care.

– CEO at an NHS foundation trust

The review understands that inspectors are encouraged to discuss staffing levels with providers before making an assessment, but there appears to be a disconnect between CQC policy on this issue and providers’ experience of inspections:

NHS organisations are continually doing dynamic assessments of staffing levels/risk levels, but CQC come in and simply say you’re below establishment and need more staff.

– NHS England regional director

CQC has a tendency to always say you need more staff.

– NHS England regional director

The review has understood that ‘safe staffing’ levels are set out by NHS England and other regulatory or professional bodies. It is not clear how the ‘safe staffing’ levels in the guidance are set and whether they are grounded in an efficient model of care and/or promote scenarios where technology, scale and optimal process management are used to best effect. CQC should clarify the evidence base or lack thereof.

While ‘safe staffing’ can be a valid and useful concept, when bluntly applied, ‘safe staffing’ ratios can be a significant constraint on ensuring efficient, effective and economic delivery of care, restricting innovation and improvement in care delivery, diverting resources to specific areas, and putting quality of care at risk.

An independent regulator should not be assessing against provider-stipulated inputs, but should be focused on overall outcomes for patients and users, and challenge provider capture, as can be seen in other national regulators such as Ofgem.

Concern 7: there is little reference to, or acknowledgement of, the challenges in balancing risk and ensuring high-quality care across an organisation or wider health and care system

The review has heard from a number of providers - and commissioners of care - about the lack of recognition within the SAF to the challenges in balancing risk in the health and care sectors. Providers have referred to a number of examples where one aspect of care is assessed in isolation of the consequences for other areas of care and, therefore, other patients and users.

For example, within the social care sector, the review heard of a requirement that staff do not lift someone following a fall. This can then result in an ambulance being called and a person being taken to hospital with implications for their own wellbeing (recognising the negative impact of hospital admissions on older, frail people) and for others who may then incur a delay waiting for an ambulance.

Within the healthcare sector, providers referred to ‘safe staffing’ assessments and CQC requirements to add staff to one area of care (for example, an inpatient ward) that can then result in other services (for example, community services) not being available and a greater number of patients or users being compromised[footnote 23][footnote 24].  

Across health and care systems, the review heard of care homes refusing to accept users after 4pm as “it will be marked down by CQC”, in turn leading to further time in hospital, with implications for the user and a knock-on impact on others awaiting admission to hospital. CQC has said that none of these examples are stipulated by CQC or are regulatory requirements, so it is not clear why these perceptions persist.

Health and social care are both complex and have inherent risks. Leaders and staff working in the sectors need to balance risks within and across services on a daily basis. This should be given greater recognition within the CQC assessment framework to ensure a far greater focus on improving outcomes for users, patients and populations.

Conclusion 6: lack of clarity regarding how ratings are calculated and concerning use of the outcome of previous inspections (that often took place several years ago) to calculate a current rating

The review has been concerned to find that overall ratings for a provider may be calculated by aggregating the outcomes from inspections over several years. Examples of this are shown in ‘Appendix 6: examples of combined ratings’. This cannot be credible or right.

The review understands that this approach is longstanding and did not change as a result of the introduction of the SAF, but may not have been transparent before.

The SAF was intended to prevent the use of inspections (and associated ratings) from previous years as more frequent assessments would be undertaken based on emerging data and intelligence, but, because CQC is not doing the number of assessments required to update ratings, the problem continues. CQC intends to mitigate this by using individual quality statement and key question scores instead of aggregated ratings, and by assessing more quality statements to improve robustness.

Providers do not understand how ratings are calculated and, as a result, believe it is a complicated algorithm, or a “magic box”. This results in a sense among providers that it is “impossible to change ratings”:

During the assessment and the assessment feedback, it was mentioned by the inspectors that there was evidence that ‘well led’ had improved significantly and, when queried why more quality statements were not assessed, they indicated that this wasn’t an option available to them, while also noting that it was probable that the rating would have changed with the improvements noted.

– CEO of a care provider

A similar theme was heard from people working within CQC, with some inspectors and senior professional advisers also commenting that they didn’t know how ratings were calculated.

CQC is seeking to bring greater clarity to how ratings are calculated, and is developing materials to facilitate communication and build transparency.

Ratings matter - they are used by users and their friends and family, they are used by commissioning bodies (NHS, private health insurers and local authorities), and they drive effective use of capacity in the sector. They are a significant factor in staff recruitment and retention:

CQC ratings have significant implications for care homes - if CQC says something isn’t ‘safe’, then the home is at risk of being prosecuted.

– Lawyer

Ratings need to be credible and transparent.

Conclusion 7: there are opportunities to improve CQC’s assessment of local authority Care Act duties

The Health and Care Act 2022 gave powers to CQC to assess local authorities’ delivery of their adult social care duties after several reports and publications identified a gap in accountability and oversight of adult social care[footnote 25].

Local authorities are assessed across 4 themes (working with people, providing support, ensuring safety within the system and leadership), with a total of 9 quality statements across these 4 themes, which are a subset of the 34 quality statements in the SAF.

Formal assessment commenced in December 2023 following a pilot phase involving 5 volunteer local authorities. CQC aims to complete baseline assessment of all 153 local authorities by December 2025. The review spoke to all 9 local authorities that have already been through the entire assessment process (with reports published), as well as representative bodies within the sector. The review found broad support for the assessment framework as it is designed, in line with a wider sector response[footnote 26]. However, it also heard feedback that the assessment process and reporting could be improved. Both were regarded less favourably than Ofsted reviews of children’s social care.

A number of areas were commented on, including:

  • the relatively small number of cases reviewed. Typically, 6 individual cases are tracked - a small number in proportion to the number of long-term users of care supported by local authorities (the average number of long-term users supported over a year is around 5,600, ranging from 30 in the Isles of Scilly to over 22,000 in Lancashire[footnote 27]). The review understands that CQC also gathers feedback on people’s experiences
  • engagement between the CQC team and local authority staff. This received mixed feedback - some reporting that this worked well, others feeling a stronger relationship could have been built. This was contrasted with Ofsted’s ‘keeping in touch’ meetings. Some local authorities commented that there was insufficient opportunity to discuss and reflect on CQC’s findings during the assessment. This was felt to be a missed opportunity for learning and improvement. CQC recognises that there is more to be done to build relationships with individual local authorities and is considering the introduction of relationship owners and annual engagement meetings in addition to current feedback meetings following an assessment
  • the expertise of teams. There were fewer comments than in other sectors, but there was a perception among some local authorities that the assessment teams lacked the expertise and insight into how local authorities work in adult social care - there were reports that very few had social work experience. Views on executive reviewers were mixed and, in some cases, there was concern about their seniority
  • insufficient descriptors of what ‘good’ or ‘outstanding’ looks like to enable local authorities to improve
  • the length of time between the request for information and inspection was difficult to manage with an impact on capacity and morale within local authorities
  • there were challenges in the process for factual accuracy checking - with local authorities stating that this was burdensome and conclusions sometimes being based on data that was significantly out of date

A particular gap noted was in the commissioning of social care. The review was told that ministers who initiated the review of local authorities had expected that CQC would look at how effectively they were commissioning services. This means:

  • understanding residents’ needs (now and into the future)
  • building a deep knowledge of different models of care provision, including more innovative models of care
  • having analytical insights into the most efficient and effective models of care delivery (taking into account scale, technology and different workforce models)
  • agreeing and negotiating contracts with providers over longer periods of time (to avoid spot purchasing and competing against other local authorities)
  • ensuring a stable and robust social care market

There are concerns as to how well local authorities enact these functions with the lack of effective commissioning highlighted in a recent report on care for older people[footnote 20] and in comments made in the review:

Commissioning is a major gap everywhere - fragmented across too many bodies - too many commissioning LAs [local authorities], not linked in enough with ICBs.

– Senior policy leader

It is not clear from current assessments how comprehensively commissioning functions are assessed, which misses an opportunity to improve commissioning capabilities and, as a result, the quality and efficiency of care.

Given that the approach to re-assessment is still to be designed, it is difficult to comment on the role of CQC in local authority improvement. The review understands CQC is in the process of designing reassessment with the sector. At a roundtable for local authority representatives, the review heard that:

Ofsted’s ILACs [Inspecting local authority children’s services] allows for a proportionate reassessment/review period, so shorter for those that have been deemed ‘requiring improvement’ or ‘poor’ [and] longer period for those rated ‘good’. CQC could consider adoption of a similar approach.

Conclusion 8: integrated care system (ICS) assessments are in early stages of development with a number of concerns shared

Under the Health and Care Act 2022, CQC was given the duty to review and assess ICSs. The assessment was intended to provide independent assurance to the public and Parliament about how well health and social care organisations within an ICS area collaborate to plan and deliver high-quality care.

CQC is required by statute to look at 3 areas: leadership, integration and quality of care - the latter, presumably, to pull together information from individual providers within an ICS. CQC developed a methodology for these assessments, which was tested in pilots in Dorset and Birmingham and Solihull, but wider rollout has been paused. 

The NHS Confederation and CQC have both shared feedback to date, as have a number of people spoken to as part of this review. 

The following concerns were highlighted:

  • questions as to whether processes or outcomes are being assessed. ICSs have 4 objectives (improving outcomes in population health and healthcare, reducing inequalities in outcomes, experience and access, enhancing productivity and value for money, and helping the NHS support broader social and economic impact) - however, the assessment process has not particularly focused on these objectives and it is not clear how progress against them will be measured
  • lack of descriptors as to what ‘good’ looks like, particularly recognising different structures or arrangements across ICSs
  • questions as to what specific data (metrics) should be considered to enable a meaningful assessment of leadership, integration and quality across an ICS - and the performance of the ICS against its 4 objectives
  • concerns around duplication of provider assessments - some data requests related to provider data that could or should have been considered in provider assessments rather than ICS assessments
  • the need for CQC to recognise the challenges of clinical risk management in working effectively across multiple providers
  • difficulties in meaningfully hearing from residents about their views of quality of care across a whole system
  • the time taken to prepare for the CQC assessment and the associated costs - both direct (CQC fees) and indirect (time of staff), given this is on top of the costs for providers within an ICS
  • overlaps with the NHS England Oversight Framework, which sets out how NHS England will assess ICBs and ICSs

Conclusion 9: CQC could do more to support improvements in quality across the health and care sector

The review heard a consistent comment that CQC should not be an improvement body per se, but, at the same time, could do more to support the health and care sectors to improve. 

CQC has been, and continues to be, an impactful prompt with many providers highlighting that preparation for an inspection can be a positive opportunity for self-reflection and learning - and that a poor rating from CQC is a very strong incentive to improve:

What CQC say[s] does matter. People do take notice - there is credibility and people do take it seriously.

– NHS England regional director

There are opportunities for far more than this. Specifically:

  • as highlighted previously, the description of best practice, and of what ‘good’ and ‘outstanding’ delivery of care looks like, should be a central source of guidance during any inspection. While there are some examples of this on CQC’s website, there could be far more, such as around new models of care delivery, leading international examples and more innovative approaches - particularly the use of technology. CQC could be a substantive repository of high-quality and innovative models of care
  • reports need to be clearer with opportunities to improve set out and encouragement to develop clear action plans. Ideally, these would then be followed up on in a timely manner
  • inspection teams should inspire as much as instruct - helping organisations to understand where there are opportunities for improvement and setting out what a better model of care could look like. This requires high-calibre inspection teams who are seen as credible and knowledgeable
  • many organisations, both within and outside the health and care sectors, take a very proactive approach to the collection of, and use of, user feedback - this could be far more systematic and comprehensive in the health and care sector  
  • governance structures within organisations are crucial to improvement - with clear roles, responsibilities and accountabilities, all aimed at continuously innovating and improving. A greater focus on how organisations are approaching and delivering improvement (becoming self-improving organisations) rather than looking at input metrics, could enable more significant improvements in quality of care. The review notes CQC’s research into improvement approaches within organisations - see the Rapid literature review: improvement cultures in health and adult social care settings

NHS Providers’ report Good quality regulation: how CQC can support trusts to deliver and improve explores how CQC’s approach could become more supportive and constructive in the future. The report advocates for CQC to actively support improvement and innovation by sharing best practices and engaging in improvement-focused conversations. As the report says: “CQC should make the most of its privileged observer position by sharing good practice, engaging in improvement-focused conversations with providers, and working with organisations that have a direct role in improvement.”

Conclusion 10: there are opportunities to improve the sponsorship relationship between CQC and DHSC

DHSC’s sponsorship of CQC should promote and maintain an effective working relationship between the department and CQC, which should, in turn, facilitate high-quality, accountable, efficient and effective services to the public.

The review has found that DHSC could do more to ensure that CQC is sponsored effectively, in line with the government’s Arm’s length body sponsorship code of good practice. For example, DHSC should do more to ensure that CQC is meeting its KPIs in terms of operational performance and hold it to account through regular discussion of timely management information linked to those KPIs. The review has heard of long delays in DHSC responding to requests from CQC - for example, to replace senior roles. This has not helped CQC and should be addressed.

The National Quality Board (NQB) is responsible for ensuring high-quality health and social care services. As such, it could be expected that it would have extensively reviewed the SAF prior to its launch. The review understands that the NQB had an initial presentation on the SAF in 2021, but this was a very high-level description of the aims of the SAF, including the intention to focus on user experience. It appears there was no further discussion and so the opportunity was lost to consider the details of the 34 quality statements, the 6 evidence categories, and what specific measures would be used to ensure the regulator was able to support high-quality, efficient and effective care.

The NQB should consider its role in agreeing:

  • definitions of high-quality care
  • how to measure and quantify the effectiveness of care - in particular, what outcome metrics to use
  • how to weave innovation - particularly the use of technology and new models of care - into CQC’s assessment framework
  • how to quantify use of resources to support optimal allocation of resources to improve health and wellbeing, and optimal deployment of resources within providers
  • how to assess trade-offs or balance risk considerations
  • use of resources, and identifying how and where limited public funds could be spent most effectively

Other areas for further consideration

A number of areas have been raised with the review team but not yet considered in detail. These are:

  1. One-word ratings.
  2. Finances within CQC - both how CQC is funded, and the costs of running the organisation efficiently and effectively.
  3. The need to ensure the NHS Federated Data Platform results in a single ‘data lake’ across the health and social care sectors.
  4. The wider regulatory landscape and the burden of regulation, including the relationship between CQC and the NHS England Oversight Framework.

More details on each of these follow.

1. One-word ratings

While the pros and cons of one-word ratings have not been raised much during conversations carried out for the review, the government recently announced that Ofsted would end the use of one-word ratings, so it would be reasonable to consider their use in health and social care similarly. A recent review of regulation by NHS Providers, A pivotal moment for regulation: regulation and oversight survey 2024, raised concerns about the use of one-word ratings among a wider range of topics, as referred to earlier. Following the Ofsted announcement, the Local Government Association called for CQC to scrap the use of single-word ratings in its assessments of local authorities’ adult social care services.

Changes to one-word ratings could be beneficial in allowing greater clarity to be brought to the different key questions of quality, allowing a ‘balanced scorecard’ approach across ‘safe’, ‘effective’, ‘caring’, ‘responsive’ and ‘well led’. Consideration could also be given to providing greater transparency of ratings across different service lines and sites in larger providers. All this needs to be set against the need for a straightforward narrative that is accessible to users and patients.

2. CQC finances

CQC is currently funded largely through fees charged to providers. CQC is required through HM Treasury guidance Managing public money to recover the full cost of its regulatory services through fees charged to registered providers - so-called full chargeable cost recovery. CQC must consult on any changes to its statutory scheme of fees, and both HM Treasury agreement and Secretary of State consent are required before changes to fee levels can come into effect.

The current funding model presents a number of challenges, namely how to:

  • ensure efficient and effective service delivery from CQC when providers are obligated to pay
  • decide on fee levels
  • ensure that resources available match the requirements of CQC - while still remaining efficient and effective

While this is presumably a role for DHSC as the sponsor organisation, greater consideration could be given to how to quantify the above, and to where and how decisions regarding resourcing are made.

3. Single ‘data lake’ across the health and care sectors

The review has heard that there is a significant opportunity to build a single repository of data on quality of care (including use of resources) across the health and care sectors. This would benefit CQC (and all organisations in the health and care sector), and would bring a more streamlined and efficient approach to performance management and improvement across all services.

The Federated Data Platform was established to create a “single version of the truth” approach to data within healthcare. Consideration should be given to how best to build on this to create a common set of data about quality of care, across all domains and sectors.

4. The wider regulatory landscape

The wider regulatory landscape is extensive, with a growth in the number of bodies over recent years. Overlapping responsibilities of these bodies can create confusion for providers and impose an unhelpfully large burden of regulation:

With over 12 different regulatory/inquiry body frameworks for maternity care - resulting in, for example, well over 100 recommendations for Shrewsbury and Telford maternity unit - it’s an extremely complex and crowded landscape.

– Senior manager at NHS England

A mapping of the regulatory landscape of healthcare identified over 100 organisations which exert some regulatory influence on NHS provider organisations and recommended a review to ensure more effective and responsive regulation[footnote 28].

Within the NHS, ICBs are responsible for ensuring high-quality providers (NHS trusts, GPs and services commissioned from independent providers) and NHS England is responsible for overseeing ICBs to ensure they are delivering against their 4 objectives as outlined above.

As a result, there is significant overlap between the role of NHS England, ICBs and CQC.

Recommendations

There are 7 recommendations:

  1. Rapidly improve operational performance, fix the provider portal and regulatory platform, improve use of performance data within CQC, and improve the quality and timeliness of reports.
  2. Rebuild expertise within the organisation and relationships with providers in order to resurrect credibility.
  3. Review the SAF and how it is implemented to ensure it is fit for purpose, with clear descriptors, and a far greater focus on effectiveness, outcomes, innovative models of care delivery and use of resources.
  4. Clarify how ratings are calculated and make the results more transparent.
  5. Continue to evolve and improve local authority assessments.
  6. Formally pause ICS assessments.
  7. Strengthen sponsorship arrangements to facilitate CQC’s provision of accountable, efficient and effective services to the public.

Recommendation 1: rapidly improve operational performance, fix the provider portal and regulatory platform, improve use of data, and improve the timeliness and quality of reports

The interim chief executive of CQC is already making progress towards redressing poor operational performance, including bringing in more staff, particularly those with prior experience of working in CQC. CQC should agree operational performance targets or KPIs in high-priority areas, in conjunction with DHSC, to drive and track progress.

CQC will need to set out how, and by when, it will make the changes required to the provider portal and regulatory platform. CQC should also ensure that there is far more consideration given to working with providers to seek feedback on progress.

Urgent action is needed to ensure a timely and appropriate response to concerns raised around safeguarding and serious untoward incidents.

The quality of reports needs to be significantly improved with clear structure, labelling and findings.

Recommendation 2: rebuild expertise within the organisation and relationships with providers in order to resurrect credibility

There is an urgent need to appoint highly regarded senior clinicians as Chief Inspector of Hospitals and Chief Inspector of Primary and Community Care. Working closely with the chief inspectors and the national professional advisers, there should be rapid moves to rebuild sector expertise in all teams.

The review heard a strong message from providers across sectors about the opportunity to create a sense of pride and incentive in working as a specialist adviser with CQC. Consideration should be given to a programme whereby the top-performing managers, carers and clinicians from across health and social care apply, or are appointed, to become assessors for 1 to 2 weeks a year, with acceptance on the programme regarded as a high accolade.

The executive leadership team of CQC - which should include the 3 chief inspectors - should rebuild relationships across the health and care sectors, share progress being made on improvements to CQC and continually seek input.

Recommendation 3: review the SAF and how it is implemented to make it fit for purpose

There needs to be a wholesale review of the SAF to address the concerns raised. Professor Sir Mike Richards is now working with CQC to initiate this.

Specifically, to:

  • improve the quality of documentation on the CQC website
  • appropriately describe each key question
  • set out clear definitions of what ‘outstanding’, ‘good’, ‘requires improvement’ and ‘inadequate’ look like for each evidence category and for each quality statement, as per the previous key lines of enquiry
  • request credible sector experts to revisit which quality statements to prioritise and how to assess and measure them
  • give greater emphasis to the ‘effective’ key question
  • give greater emphasis to, and use of, outcome measures. CQC should build on the work it has done with partners, particularly the Healthcare Quality Improvement Partnership, GIRFT and national clinical audits, to expand the range of outcome measures it uses
  • give greater emphasis and prominence to use of resources within the ‘effective’ and ‘well led’ key questions - and build the skills and capabilities to assess this
  • build recognition and understanding of the balance of risk within and across organisations, and adapt the SAF accordingly
  • significantly improve transparency and robustness of the data used for patient, user and staff experience
  • build greater knowledge and insights into innovation in healthcare and social care - including new models of care - and weave these into the quality statements
  • ensure inspectors and other operational staff are fully trained in how to conduct inspections under the SAF and have access to all relevant materials

Recommendation 4: clarify how ratings are calculated and make the results more transparent, particularly where multi-year inspections and ratings have been used

The approach used to calculate ratings should be transparent and clearly explained on CQC’s website. It should be clear to all providers and users. The use of multi-year assessments in calculating ratings and in reports should be reconsidered and greater transparency given to how these are being used in the meantime.

Recommendation 5: continue to evolve and improve local authority assessments

CQC has been clear that the assessment process for local authorities will evolve during baselining. It is now 9 months into the 2-year baselining period. CQC should consider feedback it has received, alongside the findings in this review, in order to improve the process of assessment, continuously improving its robustness and the experience of local authorities.

Recommendation 6: pause ICS assessments

Given the difficulties to date in agreeing how best to assess ICSs, the need to ensure alignment with the NHS England Oversight Framework and the considerable challenges within CQC, it is recommended that ICS assessments be paused for now, with the nascent ICS assessment team redeployed within CQC.

Recommendation 7: strengthen sponsorship arrangements to facilitate CQC’s provision of accountable, efficient and effective services to the public

Given the need for DHSC support, CQC and DHSC should work together to strengthen DHSC’s arrangements for sponsorship of CQC, reaching an advanced level of maturity against the Arm’s length body sponsorship code of good practice. This should be underpinned by more regular performance reviews between DHSC and CQC to reinforce and check progress against the recommendations in this report.

Metrics for performance review should be enhanced with clear performance targets set for the next 6 to 12 months. Meetings should include senior civil servants at DHSC (ideally the relevant directors general) and should take place on a monthly basis. CQC should consider strengthening partnerships with those it regulates, including setting out more clearly what providers can expect from the regulator. Strengthened sponsorship arrangements will further reinforce accountability.

It is recognised that a number of the recommendations made within this report will require wider system consideration - for example, how to ensure a sufficient focus on effectiveness and outcomes, and that use of resources is woven through all quality assessments and recommendations. DHSC will need to lead or co-ordinate this work. As part of it, the terms of reference of the NQB should be reviewed.

Next steps

DHSC should support CQC in progressing the next steps.

Over the next 4 months, a second review will report on proposed improvements to the wider landscape for quality of care, with a focus on patient safety.

Over the next 6 months, there needs to be:

  • rapid improvements to operational performance within CQC
  • significant steps taken towards rebuilding expertise within CQC
  • significant steps taken towards fostering stronger relationships with providers and the wider sectors in order to resurrect credibility

Over the next 12 months, the SAF needs to be fundamentally enhanced and improved with:

  • a review of quality statements
  • far greater emphasis on effectiveness, outcomes, innovation and use of resources
  • clear descriptors for each quality statement or evidence category

Appendix 1: quality and safety history and context

This list is not exhaustive.

1990s

Increasing interest in quality of care accompanied by increasing role of clinical audit.

Under the Labour administration, the Department of Health published ‘Quality in the NHS’ in 1998, which was seen as a step change in focusing on systematic improvement in the quality of care.

Investigation into the Bristol heart scandal.

1999

Establishment of the Commission for Health Improvement (CHI). The statutory functions conferred on CHI were set out in section 20 of the Health Act 1999 and the Commission for Health Improvement (Functions) Regulations 1999 to 2000.

2000 

Establishment of the National Care Standards Commission by the Care Standards Act 2000 as a non-departmental public body to regulate independent health and social care services, and improve the quality of those services in England. It was set up on 9 April 2001.

2001

Creation of the National Patient Safety Agency (NPSA).

Establishment of the Shipman Inquiry.

2003

Creation of Council for Healthcare Regulatory Excellence.

Establishment of Medicines and Healthcare products Regulatory Agency.

2004

Establishment of the Commission for Social Care Inspection by the Health and Social Care (Community Health and Standards) Act 2003. Under the terms of the 2003 act, the commission assumed responsibility for the functions previously exercised by CHI.

2006

Creation of the National Patient Safety Forum.

2009

Establishment of NQB.

Establishment of CQC, replacing the Healthcare Commission, the Mental Health Act Commission and the Commission for Social Care Inspection.

NPSA publishes first version of Never Events policy and framework.

2010

NPSA publishes a National Framework for Reporting and Learning from Serious Incidents Requiring Investigation (the Serious Incident Framework).

2012

Council for Healthcare Regulatory Excellence becomes the Professional Standards Authority for Health and Social Care.

NPSA transferred to NHS England under provisions in the Health and Social Care Act 2012.

2013

Robert Francis QC’s Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry published in February.

CQC introduced its new regulatory model.

2014

Introduction of Regulation 20: duty of candour for trusts through the Health and Social Care Act 2008 (Regulated Activities) Regulations 2014.

Establishment of Patient Safety Collaboratives.

CQC rolls out a new approach to regulating adult social care in England.

2015

Ongoing implementation of comprehensive inspections and ratings for all NHS and care providers by CQC, and a focus by CQC on patient safety in response to the Mid Staffordshire NHS Foundation Trust Public Inquiry.

2016

Launch of the GIRFT programme.

CQC publishes its review into learning from deaths, Learning, candour and accountability.

2017

DHSC establishes the Healthcare Safety Investigation Branch.

2018

CQC publishes Opening the door to change, looking at why patient safety incidents like ‘never events’ were still occurring.

2019

The first NHS Patient Safety Strategy is published by NHS England.

2023

CQC introduced the single assessment framework.

New duty on CQC to assess local authorities’ delivery of their duties under part 1 of the Care Act 2014.

Health Services Safety Investigations Body is established as an arm’s length body.

Appendix 2: review terms of reference

Summary

To examine the suitability of CQC’s new SAF methodology for inspections and ratings. In particular to:

  • ensure the new approach supports the efficient, effective and economic provision of health and care services
  • contrast the previous approach, which prioritised routine and/or some reactive inspections, with the new, more sophisticated approach informed by emerging risks and data. The new inspection methodology for ICSs - both the NHS and social care components - will also be reviewed
  • consider what can be done to ensure appropriate alignment between the NHS Oversight Framework and CQC inspection and ratings
  • consider what can be done to ensure trusts respond effectively, efficiently and economically to CQC inspections and ratings
  • consider whether CQC is appropriately set up, in both its leadership and staffing, to ensure that its new statutory role of assuring local government social care functions is as effective as possible alongside its wider responsibilities, and how it will review and monitor this over time
  • examine how senior NHS leaders can be more involved in CQC inspections and actions they can take to positively support CQC activity to ensure CQC’s work is translated into strong outcomes for patients
  • examine how social care inspections and ratings make use of the user voice and capture patient experience

Areas of focus

The main areas of focus include:

  • staffing and service innovation - is CQC inspection an actual or perceived barrier to workforce reform and change, and service innovation? If so, how are these barriers being addressed?
  • provider responses - do CQC’s investigations and ratings drive the correct responses among providers, in respect of ensuring the delivery of safe and efficient, effective and economic services?
  • data quality and learning - what more could be done to ensure that regulated bodies understand the importance of ensuring that the data they produce on which regulation activity is based is of sufficient quality?
  • patient satisfaction and access - do inspections and ratings take account not only of the access statistics but patient experience of service access more broadly? Are patients’ voices being heard, both in health and social care?
  • is the inspection and/or assurance approach, including CQC’s leadership and staffing, appropriate for:
    • ICSs (health and social care aspects)
    • local government social care functions?
  • how will the new scoring system affect ratings and the 5 key areas (safe, caring, responsive, effective and well led)?
  • are CQC’s regulations and processes fit for an age of digital healthcare?

Appendix 3: list of people spoken to for this review

Over the course of the review, we spoke to:

  • 6 CQC executives including the previous CEO
  • the CQC chair and 8 CQC non-executive directors
  • 16 members of the CQC senior leadership team
  • 11 CQC organisational leads
  • 12 CQC specialist advisers and national professional advisers
  • 34 wider CQC staff members
  • 7 representatives from trade unions
  • 8 DHSC senior civil servants
  • 1 senior civil servant from the Ministry of Housing, Communities and Local Government (previously the Department for Levelling Up, Housing and Communities)
  • 9 NHS England national directors including chair
  • 7 NHS England regional directors
  • 52 NHS trust leaders including chairs, chief executives and medical directors of NHS trusts (spread across acute and specialist community and mental health trusts, and foundation trusts)
  • 3 members, 1 director and 1 chief executive of NHS-related bodies
  • 16 general practitioners and general practitioner leaders
  • 3 senior members of the British Dental Association
  • 11 senior members of organisations in the independent sector
  • 6 senior members of social care provider representative organisations
  • 8 members of individual adult social care providers
  • 22 quality leaders of social care providers
  • 8 local authority directors or chief executives of adult social care
  • 9 ICB chairs and chief executives
  • 6 chairs and chief executives of statutory and quality-related health bodies
  • 3 contributors from academia or think tanks
  • 14 senior members from user voice organisations
  • 4 people from London councils
  • 10 other individuals

A subset of the people listed above participated in an advisory board to review emerging findings and recommendations, meeting twice during the development of this report.

Appendix 4: example of a rating descriptor

CQC defines the ‘safeguarding’ quality statement as:

  • there is a strong understanding of safeguarding and how to take appropriate action
  • people are supported to understand safeguarding, what being safe means to them, and how to raise concerns when they don’t feel safe, or they have concerns about the safety of other people
  • there are effective systems, processes and practices to make sure people are protected from abuse and neglect
  • there is a commitment to taking immediate action to keep people safe from abuse and neglect. This includes working with partners in a collaborative way
  • people are appropriately supported when they feel unsafe or experience abuse or neglect
  • where applicable, there is a clear understanding of the Deprivation of Liberty Safeguards (DoLS) and this is only used when it is in the best interest of the person
  • safeguarding systems, processes and practices mean that people’s human rights are upheld and they are protected from discrimination
  • people are supported to understand their rights, including their human rights, rights under the Mental Capacity Act 2005 and their rights under the Equality Act 2010

Appendix 5: examples of rating descriptors in previous assessment model

We have taken this from page 26 of CQC’s Key lines of enquiry for healthcare services, which includes ratings characteristics.

We have reproduced, for ease of reference, the rating characteristics for ‘safe’ in healthcare services below.

Safe

By safe, we mean people are protected from abuse and avoidable harm.

Note: abuse can be physical, sexual, mental or psychological, financial, neglect, institutional or discriminatory abuse.

The ratings are:

  • outstanding: people are protected by a strong comprehensive safety system, and a focus on openness, transparency and learning when things go wrong
  • good: people are protected from avoidable harm and abuse. Legal requirements are met
  • requires improvement: there is an increased risk that people are harmed or there is limited assurance about safety. Regulations may or may not be met
  • inadequate: people are not safe or at high risk of avoidable harm or abuse. Normally some regulations are not met

S1 (CQC code for safe): how do systems, processes and practices keep people safe and safeguarded from abuse?

Outstanding

Outstanding means:

  • there are comprehensive systems to keep people safe, which take account of current best practice. The whole team is engaged in reviewing and improving safety and safeguarding systems. People who use services are at the centre of safeguarding and protection from discrimination
  • innovation is encouraged to achieve sustained improvements in safety and continual reductions in harm

Good

Good means:

  • there are clearly defined and embedded systems, processes and standard operating procedures to keep people safe and safeguarded from abuse, using local safeguarding procedures whenever necessary. These:
    • are reliable and minimise the potential for error
    • reflect national, professional guidance and legislation
    • are appropriate for the care setting and address people’s diverse needs
    • are understood by all staff and implemented consistently
    • are reviewed regularly and improved when needed
  • staff have received up-to-date training in all safety systems, processes and practices
  • safeguarding adults, children and young people at risk is given sufficient priority. Staff take a proactive approach to safeguarding and focus on early identification. They take steps to prevent abuse or discrimination that might cause avoidable harm, respond appropriately to any signs or allegations of abuse and work effectively with others, including people using the service, to agree and implement protection plans. There is active and appropriate engagement in local safeguarding procedures and effective work with other relevant organisations, including when people experience harassment or abuse in the community

Requires improvement

Requires improvement means:

  • systems, processes and standard operating procedures are not always reliable or appropriate to keep people safe
  • monitoring whether safety systems are implemented is not robust. There are some concerns about the consistency of understanding and the number of staff who are aware of them
  • safeguarding is not given sufficient priority at all times. Systems are not fully embedded, staff do not always respond quickly enough, or there are shortfalls in the system of engaging with local safeguarding processes and with people using the service
  • there is an inconsistent approach to protecting people from discrimination

Inadequate

Inadequate means:

  • safety systems, processes and standard operating procedures are not fit for purpose
  • there is wilful or routine disregard of standard operating or safety procedures
  • there is insufficient attention to safeguarding children and adults. Staff do not recognise or respond appropriately to abuse or discriminatory practice
  • care premises, equipment and facilities are unsafe

Appendix 6: examples of combined ratings

The tables below show how locations can hold ‘combined ratings’. A combined rating is where the overall ratings are aggregated from 2 or more inspections on different dates.

This can happen at the domain level (from focused inspections of a subset of domains), at the service level (from focused inspections of individual services, which are more common for a larger location such as an acute hospital), or both.
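The carry-forward mechanics can be illustrated with a simplified sketch. This is not CQC’s actual algorithm: it assumes a plain ‘latest inspection wins’ rule per domain and ignores any weighting across services, but it shows how domains that are not re-inspected retain their earlier ratings at location level.

```python
from datetime import date

def combine_ratings(inspections):
    """Merge domain ratings from multiple inspections of one location.

    `inspections` is a list of (inspection_date, {domain: rating}) pairs.
    The most recent rating for each domain wins - a simplified model of
    how a combined rating carries forward domains not re-inspected.
    """
    combined = {}
    for _when, ratings in sorted(inspections, key=lambda pair: pair[0]):
        combined.update(ratings)  # later inspections overwrite earlier ones
    return combined

# Modelled on Table 5f (Ranelagh House): the 2023 focused inspection
# re-rated only 'safe' and 'well led'; other domains carry over from 2019.
full_2019 = (date(2019, 10, 17), {
    "safe": "Requires improvement", "caring": "Good", "effective": "Good",
    "responsive": "Good", "well led": "Good",
})
focused_2023 = (date(2023, 3, 22), {
    "safe": "Requires improvement", "well led": "Requires improvement",
})

print(combine_ratings([full_2019, focused_2023]))
```

Under this sketch, ‘well led’ is downgraded to ‘requires improvement’ while ‘caring’, ‘effective’ and ‘responsive’ retain their 2019 ratings, matching the pattern in Table 5f even though the real overall rating also depends on aggregation rules not modelled here.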

The data used to create these was sourced from CQC reports found under Find and compare services on the following care providers:

  • Manchester University NHS Foundation Trust March 2019 report, which includes data on the ratings for Wythenshawe Hospital and St Mary’s Hospital
  • Wythenshawe Hospital and St Mary’s Hospital July 2023 reports
  • Aire Valley Surgery March 2016 report
  • Aire Valley Surgery January 2020 report
  • Ranelagh House October 2019 report
  • Ranelagh House March 2023 report

Wythenshawe Hospital

Wythenshawe Hospital had a full inspection across all services and domains, except the ‘effective’ rating for outpatients, in March 2019.

In July 2023, only the maternity service was inspected with its ‘safe’, ‘well led’ and overall ratings all being downgraded.

This led to Wythenshawe Hospital’s location-level ratings for ‘safe’, ‘well led’ and overall being downgraded to ‘requires improvement’, even though several services were not re-inspected.

Table 5a: ratings for Wythenshawe Hospital, published March 2019

Service | Safe rating | Effective rating | Caring rating | Responsive rating | Well led rating | Overall rating
Urgent and emergency care | Requires improvement | Requires improvement | Good | Requires improvement | Good | Requires improvement
Medical care (including older people’s care) | Good | Good | Good | Requires improvement | Good | Good
Surgery | Good | Good | Good | Good | Good | Good
Critical care | Good | Good | Outstanding | Outstanding | Good | Outstanding
Maternity | Good | Good | Good | Good | Good | Good
Services for children and young people | Good | Good | Good | Outstanding | Requires improvement | Good
End of life care | Good | Good | Outstanding | Good | Good | Good
Outpatients | Good | Not rated | Good | Good | Good | Good
Overall | Good | Good | Outstanding | Requires improvement | Good | Good

Table 5b: ratings for Wythenshawe Hospital published July 2023

Service | Safe rating | Well led rating | Overall rating
Maternity | Inadequate | Requires improvement | Requires improvement
Overall | Requires improvement | Requires improvement | Requires improvement

St Mary’s Hospital

Similarly, St Mary’s Hospital had a full inspection on all services and domains in March 2019.

In July 2023, only the maternity service was inspected with its ‘safe’, ‘well led’ and overall ratings all being downgraded.

This led to the location-level ratings for these 2 domains and the overall rating being downgraded: to ‘requires improvement’ for overall and ‘well led’, and to ‘inadequate’ for ‘safe’.

Table 5c: ratings for St Mary’s Hospital, published March 2019

Service | Safe rating | Effective rating | Caring rating | Responsive rating | Well led rating | Overall rating
Maternity | Good | Good | Good | Good | Good | Good
Neonatal services | Good | Good | Outstanding | Good | Good | Good
Overall | Good | Good | Outstanding | Good | Good | Good

Table 5d: ratings for St Mary’s Hospital, published July 2023

Service | Safe rating | Well led rating | Overall rating
Maternity | Inadequate | Requires improvement | Requires improvement
Overall | Inadequate | Requires improvement | Requires improvement

Aire Valley Surgery and Ranelagh House

These combined ratings do not occur exclusively for service-level inspections in hospitals: Aire Valley Surgery (a GP practice) and Ranelagh House (a care home) also had focused domain inspections, in January 2020 and March 2023 respectively.

For Ranelagh House, the re-inspection and downgrading of the ‘well led’ domain to ‘requires improvement’ led to the overall location rating being downgraded to ‘requires improvement’.

Table 5e: ratings for Aire Valley Surgery in March 2016 and January 2020

Publication date | Safe rating | Caring rating | Effective rating | Responsive rating | Well led rating | Overall rating
9 January 2020 | Not inspected | Not inspected | Good | Not inspected | Good | Good
21 March 2016 | Good | Good | Good | Good | Good | Good

Table 5f: ratings for Ranelagh House in October 2019 and March 2023

Publication date | Safe rating | Caring rating | Effective rating | Responsive rating | Well led rating | Overall rating
22 March 2023 | Requires improvement | Not inspected | Not inspected | Not inspected | Requires improvement | Requires improvement
17 October 2019 | Requires improvement | Good | Good | Good | Good | Good
  1. Note: some groups refer to users of social care as “those who draw on care and support in social care”. 

  2. Total UK public and private expenditure on health and social care is taken from tables 1a and 7a of the Office for National Statistics’ (ONS) UK Health Accounts data set, which states that health-related expenditure totalled £292.476 billion in 2023 and long-term care (social care) £12.069 billion in 2022 - this is the most up-to-date data available. UK GDP is estimated at around £2.5 trillion in 2022 to 2023, according to HM Treasury’s GDP deflators at market prices, and money GDP June 2024 (Quarterly National Accounts). Taken together, this leads to an estimate of approximately 12% of GDP. 

  3. Total public expenditure on health in 2022 to 2023 was 18.4% of total managed expenditure (£213.3 billion on health and £1,157.4 billion total managed expenditure), as stated in HM Treasury’s Public spending statistics: May 2024. Total managed expenditure includes all outgoings from government, which includes resource and capital spending on services, and all benefits and debt repayments. Total social care spending is £27.3 billion for the UK. This is the total of ‘sickness and disability - of which personal social services’ and ‘old age - of which personal social services’ - see HM Treasury’s Public Expenditure Statistical Analyses 2024, page 74, rows 10.1 and 10.2 of Table 5.2. Taken together, health and social care public expenditure is estimated to be approximately 21% of total public expenditure. 
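The percentages quoted in footnotes 2 and 3 can be reproduced directly from the £ billion figures given there; a quick arithmetic check:

```python
# Footnote 2: health + long-term care spend as a share of UK GDP
health = 292.476          # £bn, UK Health Accounts, 2023
long_term_care = 12.069   # £bn, 2022 (latest available)
gdp = 2500.0              # £bn, approximate UK GDP, 2022 to 2023

share_of_gdp = (health + long_term_care) / gdp
print(f"{share_of_gdp:.1%}")  # roughly 12% of GDP

# Footnote 3: public health + social care spend as a share of
# total managed expenditure
public_health = 213.3     # £bn, 2022 to 2023
social_care = 27.3        # £bn, UK personal social services
tme = 1157.4              # £bn, total managed expenditure

share_of_tme = (public_health + social_care) / tme
print(f"{share_of_tme:.1%}")  # roughly 21% of total public expenditure
```

Both results match the rounded figures stated in the footnotes.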

  4. NHS Confederation. Creating better health value: understanding the economic impact of NHS spending by care setting. 2023. 

  5. For example, the Nolan Principles, which provide a framework for ethical behaviour and good governance in public life. 

  6. Healthcare Quality Improvement Partnership. Clinical audit: a manual for lay members of the clinical audit team. 2012. 

  7. NHS England. Consolidated NHS provider accounts: annual report and accounts 2022 to 2023, page 8. 2024. 

  8. DHSC and Government Office for Science. Evidence review for adult social care reform, pages 10-11, paragraph 2.7, figure 2. 2021. 

  9. ONS. Care homes and estimating the self-funding population, England: 2022 to 2023. 2023. 

  10. ONS. Estimating the size of the self-funding population in the community, England: 2022 to 2023. 2023. 

  11. Competition and Markets Authority (CMA). Care homes market study. 2016. 

  12. DHSC analysis of the ‘Using CQC data’ care provider directory with filters, accessed August 2024, using the location type of ‘social care organisation’ for a broad definition of adult social care and treating each brand as a single provider. 

  13. ‘Care Homes for Older People UK Market Report’, 34th edition. 2023. LaingBuisson, London. 

  14. Due to definitional issues, no source has a UK or England-based value on the same definition of ‘the independent sector’ as regulated by CQC. 

  15. ‘Private Healthcare Self-Pay UK Market Report’, 5th edition, page 2. 2023. LaingBuisson, London. 

  16. ‘UK Healthcare Market Review’, 34th edition. LaingBuisson, London. 

  17. DHSC. DHSC annual report and accounts: 2022 to 2023. 2024: page 5, table 69. This states that total independent sector expenditure of £12.7 billion consists of £11.454 billion from independent sector providers and £1.264 billion from voluntary sector or not-for-profit providers. 

  18. CQC. CQC Board meeting: 22 May 2024. 2024: see ‘Corporate performance report (2023/24 year end) - appendix’. 

  19. Such as Norfolk County Council’s Provider assessment and market management solution (PAMMS), an online assessment tool used to help assess the quality of care delivered by providers of adult social care services in Norfolk. 

  20. Cavendish C, Moberg G and Freedman J. ‘A better old age? Improving health and care outcomes for the over-65s in the UK.’ 2024: M-RCBG Associate Working Paper No. 236, Mossavar-Rahmani Center for Business and Government, Harvard Kennedy School. 

  21. Han TS, Murray P, Robin J, Wilkinson P, Fluck D and Fry CH. ‘Evaluation of the association of length of stay in hospital and outcomes.’ International Journal for Quality in Health Care 2022: volume 34, issue 2. 

  22. NHS Confederation and CF. ‘Unlocking the power of health beyond the hospital: supporting communities to prosper.’ 2023. 

  23. DHSC. Independent investigation of the NHS in England. 2024. 

  24. For example, NAO’s 2021 report The adult social care market in England and CMA’s 2017 Care homes market study: summary of final report called for “greater accountability for local authorities in delivering on their care obligations”. 

  25. Association of Directors of Adult Social Services (ADASS). ADASS Spring Survey 2024. 2024. 

  26. NHS England. Adult Social Care Statistics in England: An Overview. 2024. 

  27. Oikonomou E, Carthey J, Macrae C and others. ‘Patient safety regulation in the NHS: mapping the regulatory landscape of healthcare.’ BMJ Open 2019: volume 9, issue 7.