Consultation outcome

Improving the way Ofsted inspects education: report on the responses to the consultation

Updated 9 September 2025

Foreword from His Majesty’s Chief Inspector

Ofsted exists to keep children and learners safe, raise standards of their education and care, and improve their lives. That is our core purpose and driving mission. We know we must deliver our mission with professionalism, courtesy, empathy and respect for the professionals we work with. They – like us – dedicate their lives to improving the lives of children and learners. We are determined to do more to improve how we work with these dedicated professionals and provide parents and carers with the information they want.

Each and every day, our inspectors see brilliant practice in the nurseries, childminders, children’s homes, colleges, all forms of schools, young offender institutions, teacher training providers and the myriad other providers of education and care we inspect and regulate. We champion this brilliance through our reporting to parents and carers, government and the public.

Sadly, despite the best efforts of professionals, we also identify unacceptable practice. It is our job to call out practice that undermines children’s safety or robs them of their one chance at an education that allows them to thrive. Identifying poor practice is the first step in delivering urgent and necessary improvement and support.

All too often, we inspect education providers where aspirations for children and learners – especially those from the most disadvantaged backgrounds – are too low: where behaviour is unacceptable and where children’s one chance at fulfilling their potential is squandered. We regularly find local areas failing to provide adequate support for children and young people with special educational needs and/or disabilities (SEND). And, tragically, we still uncover early years providers where children’s lives are at risk because of unacceptable standards of care.

It is our duty to call these unacceptable standards out.

The proposals we consulted on in recent months were developed in line with this mission – as well as the tens of thousands of pieces of feedback received through the Big Listen, the largest consultation Ofsted has ever carried out. In the aftermath of the tragic death of Ruth Perry, headteacher of Caversham Primary School, and the subsequent Coroner’s inquiry, we were determined to hear feedback from everyone connected with our work. We remain equally determined to maintain good working relationships with education professionals, while inspecting on behalf of children and their parents.

In the Big Listen, we heard clearly from parents and professionals that they wanted a move away from a single overall grade. We heard that parents wanted more information, with clear grades across the range of areas we inspect when we visit nurseries, childminders, schools, colleges and other education providers. Independent research found that this proposal was supported by a majority of the professionals. However, there was also a clear desire for more granular assessments of providers, taking into account context and showing clear areas for improvement.

The Ofsted report card combines the preferences of parents and professionals, delivering a system that continues to identify brilliance, drives high and rising standards across all providers and calls out unacceptable practice. In addition, it gives more nuance – both at a glance and for those interested in delving into greater detail.

It gives a more detailed picture of a provider’s strengths and identifies more precisely the areas for improvement. It increases the incentive to improve; the new ‘exceptional’ grade will identify the very best education practice in the country.

And our new approach improves accountability rather than lessening it. Where standards are not yet high enough, we will return quickly to check that progress is being made.

We hope these measures, including the significant improvements made as a direct result of the feedback from professionals, will be welcomed. We will continue to engage constructively with all professionals, so we can learn from their feedback.

We know the vast majority of professionals understand the importance of accountability in keeping children safe and improving their lives. They want Ofsted to perform that vital role, but they want us to improve how we work. We believe our plan delivers on that ambition.

However, a small but vocal minority are calling for reduced accountability or for the removal of grading altogether. We do not agree. And parents and carers do not agree either.

The changes we are introducing are fair and empathetic for professionals, but without losing sight of our core purpose to raise standards.

Our new report cards will include the nuanced content parents want. Our improved inspection practice – built on a new methodology, updated evidence-gathering processes, and bolstered training – will change the look and feel of inspection. We will embed transparency through all our work, clarifying our inspection practices and processes through the new operational guides, publishing our training, and being clear about the data we use. And we will add an extra inspector to inspection teams for schools to boost inspection capacity, to support leaders and to make sure we gather the evidence we need.

These reforms are further proof of how we are resetting Ofsted’s culture through our actions in response to the Big Listen, working with the education workforce to raise standards. Ultimately, children only get one childhood. This is why we are putting disadvantaged and vulnerable children and learners at the heart of what we do, as we continue to strive to keep them safe and improve their lives.

Sir Martyn Oliver, His Majesty’s Chief Inspector, Ofsted

Introduction

We are undergoing a major cycle of reform. This consultation response sets out our biggest and most consequential changes.

Our response to the Big Listen in 2024 set out 132 actions to reform and change our work. We publicly track these actions in our Big Listen action monitoring reports. That response set an ambition to reset our relationship with those we regulate and inspect, working collaboratively with them to put children and learners first. As part of this ambition, we committed to revising our inspection framework and introducing inspection report cards.

On 3 February 2025, we launched a consultation on our proposals to improve education inspections and our new report cards for providers. The proposals covered:

  • early years providers (not childminder agencies or those on the voluntary childcare register)
  • state-funded schools
  • non-association independent schools
  • further education and skills (FE and skills) providers
  • initial teacher education (ITE) providers

The consultation ran from 3 February 2025 to 28 April 2025. It was open to the public and promoted widely through the media, our website and social media channels. We sought the views of key stakeholders and interested parties through a variety of methods.

The Department for Education (DfE) ran a parallel consultation to gather feedback on its approach to, and the principles of, school accountability; the introduction of school profiles; and its approach to improvement and support for state-funded schools.

We published the draft toolkits for early years, state-funded schools, non-association independent schools, FE and skills, and ITE as part of the consultation. We have now published the updated toolkits and operating guides alongside this response.

The findings in this response are based on the feedback gathered through:

  • free-text comments received through the online consultation
  • focus groups carried out with practitioners, inspectors, parents and carers
  • discussions with organisations representing children and young people with SEND and employers in the FE and skills sector
  • discussions with our 7 external reference groups and experts and representatives from across the sectors we inspect
  • test visits using our proposed approach and inspection methodology, in which we heard directly from professionals working in education providers about their experience of the proposed new approach
  • independent qualitative and quantitative research into parents’ and children’s views of report cards, which we commissioned YouGov to carry out; this included a nationally representative survey of 1,090 parents in England on their thoughts on report cards, and 7 focus groups involving 57 parents
  • user testing of our proposed report card designs with parents and professionals
  • commissioned research into conceptualising vulnerability for inspection and regulation by the National Children’s Bureau and engaging with parents, provider staff, and learners on inspection by the Behavioural Insights Team (a research and innovation consultancy)
  • an independent review of the impact of our inspection reforms on the workload and well-being of the education workforce

In total, we received more than 6,500 responses to the consultation. We heard from people from every education sector that we inspect. Education professionals were the largest group of respondents (75%), followed by parents and carers (21%). There was some overlap between these groups. The technical annex gives more detail on the methodology across different strands of data collection.

This evidence base has informed the final drafts of our toolkits and operational guides for education inspection, which we have published alongside this response.

Summary of changes

We have listened. We have listened to parents and children through independent polling, focus groups, commissioned research, and the over 1,300 responses that parents and carers returned to the consultation. We have listened to over 4,900 professionals in the education sectors we inspect, through the responses they returned to the consultation, through our engagement events, through our meetings with education unions and representative bodies, and by hearing directly from leaders and inspectors during test inspections. And we have listened to experts, from our 7 external reference groups, from the DfE, and from our direct engagement. We have listened and we have made changes as a result of their responses.

The proposals we consulted on

As a reminder, we consulted on a range of proposals to renew our inspection framework, and to make changes to our inspection materials and methodology, in order to increase transparency and build stronger relationships with those we inspect. The proposals were:

  • Report cards (Proposal 1) – these would give parents, carers and employers more detailed information than the current reports, including a new 5-point grading scale to evaluate more areas of a provider’s work and short summaries of what inspectors found.
  • Education inspection toolkits (Proposal 2) – this tool shows providers and inspectors the evaluation areas that inspections will focus on and how we will evaluate and grade providers.
  • Inspection methodology (Proposal 3) – changes to how we carry out inspections.
  • Full inspections and monitoring inspections, state-funded schools (Proposal 4) – we plan to end ungraded inspections of state-funded schools and change our monitoring programmes so that, where needed, we can check that timely action is taken to raise standards.
  • Identifying state-funded schools causing concern (Proposal 5) – our approach to how we will place a school into a category of concern.

We proposed changes to a 30-year-old, well-established, extensively copied and well-understood approach to reporting on the quality of education provision.

Before we stopped using the overall effectiveness judgement last year, about 90% of schools were graded ‘good’ or ‘outstanding’. Currently, 98% of early years settings and almost 80% of FE and skills settings are graded ‘good’ or ‘outstanding’. This means most of the highly diverse range of education provision in England is summed up by 2 simple descriptors, which is unhelpful to parents and unfair to providers.

Research carried out for us as part of the Big Listen consultation had found that only 3 in 10 professionals (29%) and 4 in 10 parents (38%) supported single-word judgements for overall effectiveness.

By adding a 5-point grading scale across a range of evaluation areas, we can offer more differentiation and therefore more information for parents and providers. Report cards have been, and should be, designed first and foremost for parents and carers. They have multiple purposes, but they exist primarily to advise parents and carers on whom to trust with the care and education of their child, and to inform learners about their critical educational choices. This is why we listen to parents and carers first when it comes to reporting.

We think that parents have given us overwhelming backing for these reforms. We commissioned YouGov to carry out independent research on the views of parents on the proposed report cards. Two thirds (66%) of parents of school-age children independently polled by YouGov in a nationally representative sample told us that they want Ofsted to continue to grade schools using a scale, regardless of whether it was a 4- or 5-point scale (only 10% said they were opposed to this). Almost two thirds (64%) told us they agreed with the addition of an ‘exemplary’ grade (9% disagreed) – the fifth point on the 5-point scale. Two thirds (67%) told us they prefer the new report card to the way we currently report (15% said they preferred current reports). In focus groups YouGov held with parents of children in schools and nurseries, we heard similarly strong support for our plans.

Parentkind, a network for parents of school children, told us:

“… the proposed report card and accompanying toolkits clearly support better parental engagement. We believe Ofsted has made parents a priority through this consultation.”

We are delighted that parents so clearly view our new approach as an improvement.

According to a More In Common poll commissioned by Schools Week, 71% of parents said they felt that the new grading system is fairer on teachers (17% said the current system is fairer). We agree.

Professionals, sector representatives and education experts offered a mixture of constructive feedback on our proposals. They praised our plans to improve inspection practice and criticised proposals they thought were unnecessarily complex or under-explained. We also received challenge from teacher and headteacher unions that opposed our proposals to continue giving grades as part of inspections.

The polling feedback from parents tells us our proposals on report cards were broadly right. When it comes to inspection practice, it is crucial that we prioritise the views of those who experience inspections. This is why we have listened closely to the constructive feedback from education professionals, heard through the consultation, through direct engagement and through our test visits, and have made many changes to the inspection toolkits and methodology, as well as broader changes to our approach to inspection.

High-level summary of actions we are taking

We will put children first, breaking barriers down by raising standards up – especially for the most disadvantaged and vulnerable

We have focused our reforms to inspection on addressing barriers to learning – such as disadvantage and SEND – through a strengthened approach to inclusion.

We will shine a spotlight on achievement by sharing national outcomes and context data alongside our inspection findings, because children only get one chance at an education.

We have rigorously tested our proposed 5-point scale for inspection grades, which we believe will raise standards and give more information to parents, carers and learners.

We will improve the way we report and the content of inspection findings to parents, carers and learners

We will introduce a new report card to make our inspection findings clear and have refined our design.

We have revised the terms used to describe new grades and evaluation areas.

We have reduced the number of evaluation areas.

We will improve the look and feel of inspection for education providers and professionals, taking a more collaborative approach

We have updated our new inspection toolkits with a new structure and approach.

We have revised our approach to inspection, through a more collaborative process.

We will offer webinars and events for education professionals.

We will increase assurance by boosting the capacity of school inspection teams, increasing the frequency of early years inspections and reducing registration wait times

We will increase the frequency of routine inspections for regulated early years providers to a 4-year window.*

We will inspect early years providers within 12 to 18 months of registering, down from the current wait of up to 30 months.*

We will boost inspection capacity for schools by adding an extra inspector to inspection teams.

*Early years regulatory inspection changes to take place from April 2026.

We will maintain our focus on having care and concern for the well-being and workload of providers

We will ensure that our inspections cover providers’ existing responsibilities – we do not expect any provider to be doing more than it needs to just ‘for Ofsted’.

We will invite providers to nominate a staff member to help streamline inspection workloads, and offer support for those nominees (a non-compulsory position).

We have set reasonable timeframes for inspections, to reduce the length of the inspection day.

We will take context into account on inspection

We will start inspections by talking with leaders about the context their provider works in so that inspectors have that insight.

We will adapt our inspection approaches to different types of education providers, as set out in the toolkits.

We will introduce a service called ‘Ofsted: explore an area’ to give insight into local education provision.

We will make inspections consistent in approach and continue to improve our process for challenging inspections

All inspections of maintained nursery schools, schools and FE and skills settings will be led by His Majesty’s Inspectors (HMI).

We will enhance and evaluate our established processes for quality-assuring inspections and improving the consistency of inspections.

We will improve our approach to enabling challenge by making further upgrades to our complaints process.

We will offer constructive advice on what providers need to do to improve, based on conversations with leaders

We will give clear recommendations on what to improve, as part of our reporting.

We will introduce new types of monitoring inspections for schools and FE and skills providers that need attention.

We will set actions for early years settings with evaluation areas graded ‘needs attention’ or ‘urgent improvement’.

The actions we are taking and changes we have made

Evaluation areas – schools (state-funded schools and non-association independent schools) 

The school report cards will cover grades across a range of evaluation areas, which we have rigorously tested with parents. By giving individual grades across a range of evaluation areas, we will offer a more rounded and nuanced view of the quality of a school.

Following feedback from the consultation, we are taking the following actions:

  • We have reduced the total number of evaluation areas we proposed in school (including non-association independent school) inspections from 8 to 6. This will ensure our inspections are more consistent and will reduce the workload for school staff on inspection.

  • The 6 core areas intentionally reflect the areas we are required by law to inspect in schools.[footnote 1] This reflects our intention to inspect schools against their statutory responsibilities, mirroring our own statutory duties. We are not inspecting schools against standards beyond what is already expected of them by law.

  • We have merged ‘developing teaching’ with ‘curriculum’ to create a single ‘curriculum and teaching’ evaluation area. ‘Developing teaching’ as a standalone area caused some confusion with consultation respondents, especially parents in the YouGov focus groups.

  • We have also merged ‘behaviour and attitudes’ with ‘attendance’ to create a single ‘attendance and behaviour’ evaluation area. We are emphasising attendance first because of the importance of addressing the ‘epidemic of school absence’ that the government has identified.

  • The core evaluation areas that are graded on the 5-point scale for all schools are now:

    • ‘inclusion’
    • ‘curriculum and teaching’
    • ‘achievement’
    • ‘attendance and behaviour’
    • ‘personal development and well-being’
    • ‘leadership and governance’
  • We will have separate ‘early years’ and ‘post-16 provision’ evaluation areas for inspections of schools with early years and/or sixth form provision.

  • We will inspect ‘safeguarding’ as a separate ‘met’ or ‘not met’ core evaluation area with an accompanying narrative. We consulted in the Big Listen about decoupling safeguarding from the ‘leadership’ evaluation area.

  • Non-association independent schools will also be inspected against the same evaluation areas as state-funded schools but will continue to be evaluated against the independent school standards.

Evaluation areas – early years

The evaluation areas in the early years toolkits are designed to align with the early years foundation stage (EYFS) statutory framework.

To ensure consistency for parents when making decisions about their children’s education, we have kept the names of these evaluation areas as similar as possible to those used in schools. However, in response to what we heard through the consultation, we have tailored the content to reflect the unique context of early years settings.

Following feedback from the consultation, we are taking the following actions:

  • We have placed the ‘safeguarding’ evaluation area at the top of all the toolkits (and report cards), both in early years and across schools and FE and skills remits. ‘Safeguarding’ will be given a ‘met’ or ‘not met’ evaluation. This is because ensuring that early years settings are safe and suitable for children is our most important priority.

  • In early years, as in schools, we have merged ‘developing teaching’ with ‘curriculum’ to create a single ‘curriculum and teaching’ evaluation area. Some consultation respondents, particularly childminders, found having ‘developing teaching’ as a standalone area problematic. Parents in the YouGov focus groups also misunderstood what it meant as an evaluation area.

  • As well as ‘safeguarding’, the evaluation areas for early years are:

    • ‘inclusion’
    • ‘curriculum and teaching’
    • ‘achievement’
    • ‘behaviour, attitudes and establishing routines’
    • ‘children’s welfare and well-being’
    • ‘leadership and governance’
  • We will disapply the ‘curriculum and teaching’ and ‘achievement’ evaluation areas on inspections of out-of-school childcare settings. It would not be possible to reach a grade for these settings, given their statutory responsibilities.

Evaluation areas – FE and skills

In FE and skills, the evaluation areas are designed to allow parents and learners to make informed choices between providers. These might range from a sixth-form college of a few hundred students, or an independent specialist college of tens of learners, to an FE college of thousands of learners and apprentices.

Given this context, we have spread the evaluation areas across provision and provider types so that learners can see our evaluation of the type of provision most appropriate for them. This also offers an evaluation of leadership, inclusion and safeguarding across the provider as a whole.

Following feedback from the consultation, we are taking the following actions:

  • We have merged ‘developing teaching and training’ with ‘curriculum’ to create a single ‘curriculum, teaching and training’ evaluation area. This was to reduce the overall number of evaluations and to reflect the similar merger in schools and early years.

  • We have introduced 3 evaluation areas for the provider as a whole:

    • ‘safeguarding’ (‘met’ or ‘not met’)
    • ‘inclusion’
    • ‘leadership and governance’
  • We also have 3 evaluation areas for each provision type offered (education programmes for young people; provision for learners with high needs; apprenticeships and/or adult learning programmes):

    • ‘curriculum, teaching and training’
    • ‘achievement’
    • ‘participation and development’
  • We have introduced an additional evaluation area for colleges and designated specialist institutions: ‘contribution to skills needs’. This covers the provider as a whole.

  • We added ‘governance’ to the ‘leadership’ grade in response to concerns from sector respondents that governance might otherwise appear less important in the FE and skills sector than it is in schools and early years.

  • Following feedback from stakeholders, we have increased the ‘contribution to skills needs’ evaluation area from 3 progress evaluations to the 5-point scale like all other areas across remits (other than ‘safeguarding’).

  • We heard some concerns that the ‘personal development’ and ‘behaviour and attitudes’ criteria from the education inspection framework should be more thoroughly embedded in the toolkits. We have reviewed the ‘participation and development’ evaluation area and have ensured that the ‘personal development’ and ‘behaviour and attitudes’ content from the framework remains in the revised toolkit.

Evaluation areas – ITE

We have mirrored the approach to evaluation areas in initial teacher education (ITE), where we heard some similar concerns from stakeholders about the overall number of areas proposed. 

Following feedback from the consultation, we are taking the following actions:

  • We have combined ‘curriculum’ and ‘teaching’ to create a ‘curriculum, teaching and training’ evaluation area.

  • We have reduced the total number of evaluation areas from 6 to 5:

    • ‘inclusion’
    • ‘curriculum, teaching and training’
    • ‘achievement’
    • ‘professional behaviours, personal development and well-being’
    • ‘leadership’

Grading scale

We have extensively tested views on the report card and the grading scale. The feedback from parents was clear – the polling shows that they overwhelmingly welcomed our proposed changes.

However, we also heard many concerns in the consultation about the proposed grading scale from education professionals, especially in schools.

We wanted to give parents the scale they wanted while giving confidence to professionals and providers.

Following feedback from the consultation, we are taking the following actions:

  • We will grade all evaluation areas on a 5-point scale. The government’s manifesto specifically said that it wants to introduce a ‘new report card system telling parents clearly how schools are performing.’ As we have already explained, parents have given resounding support for our proposals. Parents also strongly support the 5-point grading scale: 87% of those surveyed said they would find a 5-point scale (running from ‘causing concern’ to ‘exemplary’ – the grading terms we consulted on) useful.

  • We are bringing together the most popular preferences of parents and professionals by combining separate grades, rating scales and short summaries into new report cards. The proposed 5-point scale was based on the Big Listen findings. In the independent research we commissioned as part of the Big Listen, parents ranked ‘separate judgements for each inspection area’ highest (76% in favour) and a ‘rating scale of 0 to 5’ second highest (61%). Individual education professionals ranked ‘separate judgements’ as second highest (59%) and education professionals representing providers ranked it third highest (53% in favour). About a third of both groups favoured rating scales. The highest-rated options for providers and professionals were ‘bullet point summaries of our findings’ (67% and 66% in favour).

  • We will adopt a combined approach using both a 5-point scale and a narrative explanation for each grade across all evaluation areas, except for ‘safeguarding’, which will be a ‘met’ or ‘not met’ judgement. This gives parents, carers and learners a clear snapshot of information and allows them to compare providers. It also offers a detailed narrative explanation of how each grade was reached. We believe this approach strikes the right balance between the need for clarity and the need for accountability.

    In the responses to the consultation, there were some calls for us to move to narrative reports only (the second favourite option among providers in the Big Listen research, and the third favourite option among individual education professionals). This was also a particularly strong view among some representative organisations in the schools sector. However, views across other sectors were mixed. A major representative body in the early years sector said a 5-point scale would provide more nuance and detail, and more meaningful feedback to improve practice, than our previous approach.

    We think a combined approach works best. The narrative gives a detailed picture of performance while the 5-point scale offers the clarity and simplicity that parents asked for. We believe this approach is a fairer way of holding providers accountable for their performance and showing a clear and accurate picture of that performance.

    We have heard that the ability to compare providers ‘at a glance’ is an important and useful feature of our reporting. A narrative on its own would not have the same level of clarity as the combined approach or be as easy for parents to use.

  • We firmly believe that a 5-point scale sets high expectations for truly exceptional standards, while encouraging or instigating improvement when it is needed. We considered multiple grading scale options through the consultation. Responses from professionals were mixed about the other forms of grading scale we proposed. We opted against a ‘pass/fail’ style (‘met’ or ‘not met’) option because it lacks nuance and does not offer the information parents want. A 3-point scale would exacerbate the current problem of most outcomes sitting in the 2 ‘good’ or ‘outstanding’ boxes, reducing the nuance parents have called for and the stretch in standards that children and learners deserve. And a 4-point scale would return us to the status quo and suggest an equivalence to the previous scale. We did not want new terminology to be compared with old terms, such as ‘inadequate’, which we heard through the Big Listen caused much upset and anxiety.
  • We have also considered the risk that points scales (whether a binary ‘met’ or ‘not met’ scale, a 5-point scale, or any other points scale) introduce ‘cliff edges’ into the assessment. We consider that a 5-point scale, applied across 6 core evaluation areas, reduces this risk because judgements are spread across more grades and evaluation areas. Each grade will be published alongside a narrative explaining inspectors’ justification for it; this will be particularly important when a provider falls close to the borderline between grades.

  • We have changed the names of grades in response to feedback. We heard concerns from professionals and parents about the terminology used to describe grades, such as ‘secure’ being confusing and ‘causing concern’ being too harsh on providers. We tested views through 2 rounds of independent polling of parents and children and independent focus groups of parents, as well as listening to feedback from the sectors we inspect. As a result, we have changed the names of each grade:

    • ‘causing concern’ to ‘urgent improvement’
    • ‘attention needed’ to ‘needs attention’
    • ‘secure’ to ‘expected standard’
    • ‘strong’ to ‘strong standard’
    • ‘exemplary’ to ‘exceptional’

Grading and inspection methodology and toolkits

We received thousands of responses about our inspection toolkits, their content, and how we grade providers. We have carried out extensive engagement with experts, sector representatives, providers and our own inspectors to test and hone our toolkits and methodology.

Following feedback from the consultation, we are taking the following actions:

  • We have tightened the definitions of ‘expected standard’ and ‘strong standard’ so the differentiation between both is clearer across all toolkits. We heard through consultation feedback and test visits that the differentiation between ‘expected standard’ (previously ‘secure’) and ‘strong standard’ (previously ‘strong’) was not clear enough. Through extensive revision, challenge and testing, we have clarified these definitions across every toolkit and every remit we inspect.

  • We have developed a new methodology for grading providers. We heard concerns not only about the differentiation between grades but also that our inspection methodology left too much to inspectors’ discretion, which could lead to inconsistency. We have now changed it from a ‘best fit’ model of evaluation, which allowed inspectors to award grades by determining a ‘best fit’ across a range of standards, to a ‘secure fit’ model. This means that each standard within a grade must be met before that grade can be awarded. We believe this will help keep grading as consistent as possible.

  • To ensure that our approach works for the different types of providers we inspect, we have reduced the number of standards in the toolkits and offered far more clarity on the evidence that inspectors could gather to determine whether the standards for each grade have been achieved. This will help providers to understand the expectations on inspection.

  • The new methodology will start by gathering evidence at the ‘expected standard’ (previously ‘secure’). The toolkits guide inspectors in gathering evidence for that evaluation area’s standards. All the ‘expected standards’ need to be met before inspectors can consider evidence against the ‘strong standard’.

  • We have restructured the toolkits around the ‘expected standard’. The 3 grades likely to be awarded most often will appear on one page:

  • ‘needs attention’
  • ‘expected standard’
  • ‘strong standard’

The 2 extremes then sit on the following page:

  • ‘urgent improvement’
  • ‘exceptional’

The ‘expected standard’ is in the middle of the page of the toolkit because this is what we would typically expect to see on inspections. It covers the statutory, professional and non-statutory guidance that providers are already expected to follow.

The ‘strong standard’, with its tighter definitions, looks for evidence of practice that is consistent, embedded and highly impactful. It sits to the right of the ‘expected standard’.

An evaluation area will be graded ‘needs attention’ when the ‘expected standard’ of the particular evaluation area is not met because weaknesses or inconsistencies in practice have a negative impact on children, pupils and learners in general or on a particular group (see ‘inclusion’ section).

  • We will make recommendations or set actions on what to improve. This was a widespread request from education professionals in the Big Listen. Specifically:

  • In schools, FE and skills and ITE, an area graded ‘needs attention’ will be accompanied by one or more recommendations, which will describe what needs to improve (but not how to do it).

  • This will be the same for early years provision in maintained nursery schools and for school-based early years provision. For other early years settings, where we are the regulator, our approach will be slightly different. Where we identify an evaluation area as ‘needs attention’, we will set out clear actions. Actions for early years settings will provide a framework for addressing weaknesses, to ensure the safety, well-being and learning of the children in their care.

  • We have simplified the process of awarding ‘exceptional’ (previously ‘exemplary’). We are no longer proposing to ask providers to submit case studies of exemplary practice to the Ofsted Academy for approval. Many respondents found this too complex or were worried it would create an additional burden. Instead, inspectors will evaluate ‘exceptional’ practice in the same way as other grades: using their evidence and applying the toolkit during inspection. It will be a grade, like all the others, that will be provisionally awarded at the end of inspection and subject to our usual quality assurance and consistency checking.

    For an ‘exceptional’ grade to be awarded for a particular evaluation area, all the ‘strong standards’ need to be met; inspectors can then look at their evidence against the standards for ‘exceptional’. The practice needs to be sustained (evident over time rather than a recent improvement). It needs to have a transformational impact on the outcomes and experiences of disadvantaged children, pupils or learners (depending on the type of provision), those with SEND and those who are known to children’s social care. There must also be no significant areas for improvement that leaders have not already prioritised.

    After receiving the grade, leaders should share their exceptional success in the evaluation area(s) identified as ‘exceptional’. This could be through any appropriate method, for example with other schools/providers, professionals, their community and stakeholders, including local and/or national networks. This is to support system-wide improvement.

  • We will grade an area as ‘urgent improvement’ when we:

    • evaluate it to be failing overall or failing a significant group of children or learners
    • identify serious, critical or systemic shortcomings in practice, policy or performance against professional, statutory or non-statutory guidance and requirements

If we identify that standards for children and learners must be urgently improved, we will not hesitate to call it out.

We have further information on how we will support providers that receive this grade in the sections below.

Inclusion (supporting those from disadvantaged backgrounds, who have SEND or who are known, or previously known, to children’s social care)

We have used the consultation responses and extensive evidence to inform our approach to inclusion. This includes evidence from the DfE, through its expert advisory group on inclusion; from the Education Endowment Foundation; from our external reference group on inclusion; and from research we commissioned from the National Children’s Bureau on how we might conceptualise vulnerability for inspection and regulation.

Following feedback from the consultation, we are taking the following actions:

  • We will put children first by raising standards, especially for the most disadvantaged and vulnerable. This is why we will introduce a new ‘inclusion’ evaluation area within the report card. Inspectors will evaluate whether education providers are identifying and offering high-quality support for all children and learners, especially those who are disadvantaged, those with SEND, and those who are known to children’s social care.

  • We will always start with what the government asks providers to focus on, through their statutory obligations or non-statutory guidance. This will include how providers are using targeted funding (such as the pupil premium or high-needs funding) to support children and learners who are disadvantaged, and those who have SEND or are known to children’s social care.

  • We have made inclusion both a specific evaluation area and a key theme across other evaluation areas. The grade for the ‘inclusion’ evaluation area will be based on the specific standards in that area, which focus on leaders’ ambitions, intent and identification of those who need support. Inspectors will also consider the impact of leaders’ work on inclusion across other evaluation areas. From leadership to teaching to behaviour, all aspects of education provision should support children and learners who are disadvantaged, those who have SEND, and those who are known to children’s social care.

  • We have revised how we describe inclusion. We agree it is not for Ofsted to ‘define’ inclusion, and so we are describing this as our approach to inclusion. This is captured in the way we have drafted our toolkits and inspection instruments. Throughout the toolkits, you will see that we refer to those children and learners who are disadvantaged, those with SEND, and those who are known to children’s social care. We will always consider where those with protected characteristics are negatively impacted by these barriers in a provider’s context. Our approach to inclusion is also explained in full in the footnotes.[footnote 2]

Well-being and workload

We heard concerns raised through the consultation about the additional workload that some feared the revised framework could generate, and the implications for the well-being of the professionals we inspect. We have taken this concern extremely seriously and have taken action to address it.

Following feedback from the consultation, we are taking the following actions:

  • Nothing in the standards set out in the toolkits should add to a provider’s workload. Our toolkits are built on the requirements, standards and expectations already placed on leaders and their provision. This includes statutory and non-statutory guidance, standards that education professionals should be performing to, and the educational evidence that suggests the most effective strategies in securing better outcomes for all children and learners. Inspections can help providers focus on meeting these expectations more efficiently and effectively. We do not expect any provider to be doing more than it needs to just ‘for Ofsted’.

  • We are increasing inspection capacity for schools. We will add an extra inspector to inspection teams for one day of all full inspections of all state-funded schools. Having this extra inspector should allow the lead inspector more time to focus on engaging with leaders, coordinating their inspection team, and overseeing and quality assuring the inspection. This means we can reduce pressure on leaders through the inspection process. The extra inspector will enable lead inspectors and leaders to collaborate closely throughout the inspection, and should ease any anxiety for leaders by acting as a regular point of contact, while allowing the wider inspection team the time to gather evidence to inform their evaluations.

  • We want inspections to take place within clear and reasonable timeframes. We recognise that inspection days can be long and that this places a burden on both providers and inspectors. The operating guides set out clear guidance to inspectors on the times at which they can arrive on site and the suggested latest times that they should be departing on each day of inspection, to cap the number of hours spent on site. For example, we are asking schools inspectors to finish the first day of inspection at 5pm and we will review whether the second day can finish earlier.

  • We commissioned an independent review of the impact of the proposed reforms on leaders’, practitioners’ and inspectors’ workloads, mental health and well-being. The review was carried out by Sinéad McBrearty, Chief Executive Officer at Education Support (a mental health and well-being charity for the education workforce across the UK).

  • We have summarised the review’s recommendations and set out our response to them in the relevant section of this response.

  • As explained, we have designed our inspection methodology with a clear view to reducing workload for the education workforce, and we have tested and revised that approach based on test visits and providers’ feedback on the workload implications of our inspection reforms.

  • We believe the changes we have set out, such as reducing the number of evaluation areas, clarifying the distinction between grades, and changing our approach to ‘exceptional’, will ease concerns about any workload and well-being implications.

  • We have removed the deep-dive methodology as the main structure for schools and FE and skills settings, to reduce the workload impact on middle leaders. See the section on adapting to different settings.

  • Inspectors will evaluate providers’ work to support and promote the well-being of leaders and staff. We have included this in the ‘leadership and governance’ evaluation area of the toolkit in all inspection remits.

  • We will invite providers to nominate an individual working within the setting/provision to act as a nominee, where appropriate. The nominee will support planning, communication and ongoing engagement throughout the inspection, helping to streamline the workload. This role already exists in FE and skills inspections and is a welcome, supportive measure during inspections. FE and skills providers will also be invited to nominate a shadow nominee, replacing the role of the skills nominee. You can find more details on the role in the operating guides. The Ofsted Academy will offer webinars and events for nominees to support them in their role.

  • The nominee is not a compulsory role. We will not expect childminder settings to have one. Providers will not be at a disadvantage if they feel unable to have an individual member of staff take up this role. Similarly, if leaders, especially of smaller providers, wish to and feel able to take on the role themselves, then they are welcome to do so. This remains a separate role and function from that of the nominated individual in early years, who is the representative of a registered body and the primary contact with Ofsted on application, registration and compliance matters.

  • We will continue our policy and processes to support the well-being of leaders and staff during an inspection. This includes allowing inspectors to pause an inspection, particularly if they have concerns about the well-being of a leader or staff member and need support from the body responsible for that person. We have set up a national team for our inspectors and providers to help with any well-being concerns during inspection. We also ask providers at the point of notification to inform us of any agreed reasonable adjustments that we should take account of. We will continue our provider contact helpline, which providers can use to speak to a senior person in Ofsted during the inspection process, until they have received the draft report.

  • In the Big Listen, we heard that the inspection process and inspectors should have more care and concern for the well-being of the leaders and staff we inspect. We have taken this on board. In the Big Listen response, we set out 132 actions to improve Ofsted’s culture and reset the relationship with those we inspect, which we track and publish in monitoring reports to update on our progress.

  • We have embedded mental health awareness into all our training. We have also introduced:
    • a policy to allow inspectors to pause an inspection, including when they have well-being concerns about a leader or member of staff
    • a provider contact helpline
    • a national team to help with any well-being concerns during an inspection
    • an ‘inspection welfare, support and guidance hub’
  • We will continue to instil our values of always treating people with professionalism, courtesy, empathy and respect through everything we do.

  • We have evaluated the impact of our proposals on groups protected by the Equality Act 2010. The equality impact assessment considers how the renewed framework will affect providers’ staff with certain disabilities, such as those relating to mental health. We have outlined the actions Ofsted has taken to protect the well-being and mental health of provider staff and to minimise workload pressures.

  • We have addressed concerns about consistency of inspection. We also heard that inconsistency of inspection practice and inspection outcomes can lead to anxiety among education professionals. We take these concerns very seriously. We revised our grading methodology so that the grading scale and consistency of grading should not drive unnecessary anxiety. We discuss consistency more in the next section.

Consistency

We have listened to the feedback from the consultation that raised questions around the consistency of our inspection practices and reporting. Our revised framework will deliver consistent, fair inspections, with in-depth insights across a broad range of evaluation areas.

Following feedback from the consultation, we are taking the following actions:

  • All inspections of maintained nursery schools, schools (including non-association independent schools) and FE and skills providers will be led by HMI, whom we employ directly. Inspection leads will also include Ofsted Inspectors (OIs) with recent experience as HMI (typically those who have worked as HMI within the past 3 years). We already have HMI-led inspections in ITE.

  • This means we will have a smaller pool of lead inspectors, more regular training and more regular consistency checks, which will increase the consistency of our evaluations. These inspectors will be brought together regularly for consistency checking, professional reflection and quality assurance work.

  • For schools, we made the decision to give notification of inspections on a Monday (for early years, FE and skills and ITE providers, notification depends on the type of inspection and institution). As well as reducing unnecessary pressure in the system, this means that all our schools HMI are usually available on the Friday of each week for work including quality assurance and consistency checking. HMI bring rich experience of leading inspections and an in-depth understanding of our framework, toolkits and inspection methodology, born of this experience.

  • Our expert contracted OIs (who usually work in the education sector, are highly experienced and have considerable expertise) will continue to be part of inspection teams. We will make the best use of OIs’ current sector knowledge and experience by deploying them as team inspectors and aiming to match their expertise to specific types of provision. 

  • We will match inspectors’ specific sector expertise to the provision they inspect. Most school and FE and skills inspections will have at least one inspector, either lead or team, with previous experience of working in a similar type of provision. All our early years inspectors (employed and contracted) have experience of working in the early years sector, so they already have the relevant expertise.

  • We will have a slightly different approach for registered early years settings, where we are also the regulator. These include most types of group-based settings (such as nurseries, pre-schools and out-of-school clubs) and childminders. These settings will continue to be inspected by our early years regulatory inspectors and OIs, all of whom have specific sector expertise. We will be supporting our early years inspectors (employed and contracted) with bespoke training to ensure consistent application of our reformed framework across different types of settings.

  • We will provide clear information to be used on all inspections. Our updated toolkits give clearer instructions to inspectors on how to gather evidence and apply the toolkit to different types of providers, and guide their evaluation of the evidence. We have strengthened our descriptions across the grading scale to ensure they are more nuanced and clearer. We have replaced the inspection handbooks with operating guides, which have clear instructions for inspectors on how to apply the methodology and inspection activities available to them and guidance on when to contact the duty desk.

  • We will introduce a programme of work to assess consistency in school inspections. As part of the quality assurance process, a senior inspector will shadow a sample of live inspections. During the inspection, the senior inspector will guide and advise the inspection team to ensure the consistency of the inspection outcome. After the inspection, any initial differences between the senior inspector’s evaluations and the team’s will be analysed by our research and evaluation team. We will consider that information, alongside our wider consistency activity, including scenario-based testing for inspectors, to help us update and improve our training and inspection materials. 

  • We will continue with quality assurance visits for early years as we implement the renewed framework. These visits will give assurance that the renewed methodology, such as professional dialogue, is helping inspectors to gather robust evidence to find out what it is typically like for children at that setting and to support the grades they reach.

  • We will continue having weekly consistency assurance meetings. Our National Director, Education; National Director, Delivery; or National Director, Regulation and Social Care will lead a rigorous consistency evaluation with regional directors and inspectors to review the week’s inspection findings.

  • Our national hubs will continue to improve consistency across all our work. We set up these hubs last year to improve consistency across Ofsted by taking a centralised approach to different aspects of our work. They include the ‘enhanced consistency and moderation’ hub, which offers additional scrutiny to the reports on providers causing us the greatest concern, independently of the original inspection team.

  • The Ofsted Academy will run the largest package of training Ofsted has ever offered for an inspection launch. Ahead of the implementation of the renewed framework, all senior HMI will have led pilots before leading on the renewed methodology. We will give all senior HMI and early years senior officers extensive training to quality assure the inspections of other inspectors ahead of them leading an inspection. We will also give early years regulatory inspectors and OIs in early years additional bespoke training. All inspectors will complete a comprehensive package of training on the renewed framework before inspections start in November. Our extensive training package will improve consistency across inspections by making sure inspectors have the right expertise to make evaluations in different settings and for different types of providers.

Context

We have listened carefully to providers who told us that inspections should fairly reflect the unique context they work in. In response, we are introducing changes that will help ensure that inspection findings are grounded in a clear understanding of each provider’s unique circumstances.

Following feedback from the consultation, we are taking the following actions:

  • Inspectors will use the planning call to providers to understand their context. This includes the children or learners’ needs and leaders’ evaluation of the provider’s strengths and areas for improvement. During the call, which will take place before the inspection via video in most cases, inspectors will discuss the extent to which leaders understand the provider’s context, strengths and areas for development. The call will also help to establish a strong professional relationship between the lead inspector and providers from the outset of an inspection.

  • We are introducing a new service, ‘Ofsted: explore an area’. Although report cards focus on what it is like to be a child or learner at a specific provider, the service will bring together data to show what education provision is like in and around a local area. This will explain how the provider’s performance sits within its local context. ‘Ofsted: explore an area’ will go live in November.

  • We will also share data about the provider’s context alongside the report card. This will include demographic and outcome information. Published summary data – the latest data available at the point when the inspection took place – will sit alongside report cards and complement the qualitative findings from inspections:
    • For state-funded schools, this will show the profile of pupils (such as the number of pupils receiving free school meals), as well as performance and attendance data.
    • For early years settings, data on the number of places will help inspectors and parents viewing the report card understand the capacity and nature of provision.
    • For FE and skills providers, data on the number of learners and types of provision will help illustrate the scale and focus of a provider’s work.

      Where available, this further contextual information will be included within pre-inspection materials that inspectors use (such as the inspection data summary report (IDSR) for schools).

      We have set out in full the data we will publish alongside each report card.
  • We will further train inspectors to use data as a starting point for understanding context. We will continue to hold all schools to the same standards, but inspections will be better informed by context.

    Inspectors will triangulate the data, where available in pre-inspection materials (such as the IDSR), with other evidence they gather on inspection. We will also train them to understand the limitations of data – such as its use in small schools – and to understand how to interpret contextual factors. This approach will also support consistency.

  • We have started to develop a ‘similar schools’ comparison measure. This will help inspectors and schools to understand how schools compare with those in a similar context. We will discuss with stakeholders and experts whether this measure may be a valuable way of adding more contextual information for inspections. We will ensure that it is consistent with any approach the DfE adopts in this area and will work closely with the DfE as we develop ours.

Adapting inspections for different types of providers

We heard questions raised in the consultation about how we will adapt our approach for different types of providers.

Following feedback from the consultation, we are taking the following actions:

  • We will give detailed instructions to inspectors on how to adapt the format of an inspection or apply the toolkit, as appropriate, to a range of types of provision. These are set out for each evaluation area in the toolkits and operating guides; they will be a key aspect of inspector training. Different types of provision include: out-of-school settings delivering childcare; childminders; special schools; alternative provision (AP); small settings; and the wide diversity of FE and skills provision. Understanding the context of providers is key to ensuring that the renewed methodology is appropriate to all settings we inspect.

  • We will shine a spotlight on instances when schools have made decisions that compromise the education and care of vulnerable children. Our state-funded schools operating guide explains how inspectors should evaluate a school’s use of off-site education provision or any SEN resourced provision/unit. This will help to ensure that inspections give an accurate and fair assessment of how well the school meets the needs of all its pupils, including those in unregistered AP or specialist settings.

  • We have replaced the deep dive evidence-gathering structure on inspection with a new approach for schools and FE and skills inspections. The operating guides describe this in detail. The approach will allow leaders and inspectors to reflect on each provider’s unique context and its improvement priorities. The toolkits set out evidence-gathering approaches as well as how inspectors consider the evaluation areas for different types of providers. The flexibility of evidence-gathering activities will be proportionate to the type of provision. This will also ease the pressure of inspection on middle leaders, which was welcomed in consultation responses.

Report card – format and visual

We have made changes to the report cards to make them more user-friendly and accessible. Our changes are based on extensive research, feedback in consultation responses, direct engagement with stakeholders, polling and focus groups with parents, and – crucially – extensive user testing directly with parents and professionals.

Following feedback from the consultation, we are taking the following actions:

  • The report card will include a summary and a detailed report on each of the evaluation areas that we have evaluated the provider against. The report card summary gives an overview of the evaluation areas, showing where each sits on the evaluation scale in a colour-coded table. Colours range from red for ‘urgent improvement’ to blue for ‘exceptional’.

  • The detailed report will sit below the overview grid and provide a narrative for each evaluation area. It will explain the strengths and areas for development. ‘Safeguarding’ sits at the top of the detailed report to enable parents, carers and providers to easily find this important information.

  • The report card will also include an overview of ‘what it is like to attend this provider’.

  • In addition to publishing report cards for each of our individual inspections, we will also continue to publish relevant summary management information and statistics.[footnote 3]

  • The report card has been designed to be used on both desktop and mobile devices, based on our user research of how parents typically engage with Ofsted reports.

Schools report card

Ofsted: new school report card

Early years report card

Ofsted: new early years report card

Further education and skills report card

Ofsted: new FE and skills report card

Assurance and accountability: early years providers

In the Big Listen, we heard that 7 in 10 parents and carers place value on an Ofsted report when choosing a setting for their child. However, only 1 in 3 respondents agreed that Ofsted reassures them of the learning and development of children at early years settings. This is why we are improving our reporting, as set out above, but we are also intent on providing greater assurance and accountability in early years to ensure that children get the best start in life.

We are taking the following actions:

  • We will increase the frequency of routine inspections for early years. We will reduce the 6-year inspection cycle for regulated early years settings to a 4-year cycle (from April 2026). This will give far more assurance to parents and carers that early years settings are safe and suitable for children, while offering greater contact and feedback for settings on what they can improve. This was announced by the DfE in its ‘Giving every child the best start in life’ report in July.

  • We will also inspect providers more frequently if we have evaluated any area as ‘needs attention’ or ‘urgent improvement’. We may also inspect them more frequently if we have received concerns or information about them that could suggest there are risks to children.

  • We will bring forward providers’ first full inspections so that they take place within 12 to 18 months of registration. We know that the time a newly registered early years setting waits to be inspected is a source of frustration, both for the setting and for parents. From April 2026, we will reduce expected waiting times by half, from up to 30 months to 12 to 18 months.

  • We will set actions for early years providers when any evaluation area is graded as ‘needs attention’ or ‘urgent improvement’. Providers must meet all of the requirements set out in the statutory framework for the EYFS to be graded ‘expected standard’, so receiving one of these lower grades means they have not met all of those requirements. When this happens, the following apply:

    • If breaches to EYFS requirements are found, but they do not have a significant impact on children’s safety, well-being and/or learning and development and there are no concerns regarding the setting’s suitability or capacity to improve, we should grade that evaluation area as ‘needs attention’.
    • If breaches to EYFS requirements do have a significant impact on children’s safety, well-being and/or learning and development and/or there are concerns about the setting’s suitability and/or capacity to improve, then the evaluation area is ‘urgent improvement’.

In either case, we will take proportionate regulatory action to ensure that the setting meets the relevant statutory requirements. This may include giving the setting actions. If any areas are graded as ‘needs attention’, we will reinspect within 12 months. If any areas are graded as ‘urgent improvement’, we will reinspect within 6 months.

The early years operating guide sets out the processes in more detail.

Identifying schools causing concern

Ofsted is required by law (section 44 of the Education Act 2005) to identify schools that meet the statutory definitions of categories of concern. Our process for identifying schools causing concern follows the 2-step assessment below.

Figure 1: Placing a school into a category of concern

Following feedback from the consultation, we are taking the following actions:

  • We will refer to the terms set out in law to distinguish the categories of concern for schools, and will align our terminology with that set out in the Act:

    • Schools with widespread issues – and that do not have the leadership capacity to resolve them – are categorised as ‘special measures’.
    • Schools with more specific (but still serious) issues – and in which leadership is deemed to have the capacity to bring about the rapid improvement needed itself – will be categorised as ‘requires significant improvement’ (which we previously called ‘serious weaknesses’).

      This aligns our terminology with that set out in the Education Act 2005.
  • We have implemented a ‘suspend and return’ policy in schools. Inspectors can suspend an inspection to allow a school to resolve minor issues with safeguarding within 3 months, as long as that is the only issue.

    The DfE’s consultation on school accountability sets out its proposed approach to school improvement for schools in the ‘special measures’ and ‘requires significant improvement’ categories.

Monitoring for schools

We want to offer reassurance to parents and carers, as well as the regulator (the DfE), that schools are improving after we have found that improvement is needed.

We believe that monitoring inspections are an opportunity for eligible schools to have their improvements systematically recognised, and to allow us to support that improvement. They also reduce pressure on leaders so they are not stuck with unwanted inspection outcomes for years.

Following feedback from the consultation, we are taking the following actions:

  • Schools graded as ‘urgent improvement’, and therefore in a category of concern, will receive a monitoring inspection each term following the publication of the report card. The monitoring inspections will focus on the areas for improvement identified at the full inspection.

    The number of monitoring inspections will be tailored to the circumstances of the school, within the following parameters:

    • Schools that require significant improvement will receive up to 5 inspections within 18 months of the last full inspection.
    • Schools that require special measures will receive up to 6 inspections within 24 months of the last full inspection.
    • There will be one inspection per school term.

At the end of each monitoring inspection, inspectors will check whether the school is ready to be removed from its category of concern. This will determine whether monitoring continues or whether the school has improved enough to have a full inspection, at which the category of concern can be removed. We will set out the findings of each monitoring inspection and publish them in the report card.

If inspectors consider that a school in a category of concern has improved so that it can be taken out of a category of concern, they may deem that monitoring inspection to be a full inspection. They will then complete all the activities of a full inspection and produce a full report card, with updated grades.

  • Schools with any evaluation area that is graded as ‘needs attention’ will also receive monitoring. Their monitoring programme will start with an initial call to the headteacher to discuss the progress the school has made towards reaching the ‘expected standard’ since their full inspection. These monitoring inspections will only look at the evaluation areas that were graded below the ‘expected standard’.[footnote 4]
    Where possible, we will allocate an inspector to the school throughout the monitoring programme. The inspector will offer regular opportunities for dialogue with the school’s leaders about the timing of the monitoring inspection and the steps they are taking to improve. We hope this helps to build an enduring relationship between the school and the inspector.

    A monitoring inspection can move a grade up, to ‘expected standard’ or above, and we will update the report card accordingly.

    Once the school is graded ‘expected standard’ or above in all areas, the monitoring programme will end. The school can expect a full inspection within the normal cycle (4 years from the last full inspection).

    A monitoring inspection can also move a grade down to ‘urgent improvement’. This is likely to lead to the inspection being deemed a full inspection, after which we will publish a new report card with a full suite of new grades. This is also likely to mean that the school will receive additional monitoring as a school causing concern.

  • We will continue to carry out other, usually urgent, monitoring inspections of schools. We call these ‘focused inspections’. We did not consult on this type of inspection because the qualifying criteria remain the same as before: we carry them out in response to information that causes us concern, for example a qualifying complaint made to Ofsted.

    Focused inspections can take place with or without notice. The outcome and grades of the school will not be changed as a result of the focused inspection. However, we will update the report card to inform parents and carers of the focused inspection findings.

    If, during a focused inspection, inspectors find evidence that raises sufficient concern that the school would be likely to be graded ‘urgent improvement’ on a full inspection, they may deem the monitoring inspection to be a full inspection. They will then carry out all the activities and evaluations of a full inspection and produce a full report card.

  • We will no longer carry out ungraded inspections. From November 2025, all routine inspections will be full inspections. The other forms of inspection – monitoring and ‘focused’ inspections – will apply as described above. This will simplify inspection so that every school knows exactly what kind of inspection it will receive and how often. It also gives parents and carers more clarity about the most up-to-date grades for their child’s school.

  • We will continue with full inspections every 4 years as part of the school inspection cycle. Schools that are being monitored and do not improve by the end of the monitoring programme may receive a full inspection sooner.

  • We will publish the school monitoring inspection operating guide later in the autumn, which will set out our approach in more detail.

Monitoring for FE and skills providers

As with schools, we want to offer reassurance to parents, carers and learners about the quality of FE and skills providers, while supporting providers to improve through monitoring inspections.

We are taking the following actions:

  • FE and skills providers with any evaluation area graded as ‘urgent improvement’ will receive monitoring inspections. These will only focus on the areas for improvement identified at the full inspection. The initial monitoring inspection is likely to be within 6 months of the publication of the full inspection report card. If a second monitoring inspection is required, this is likely to be within 12 months of the publication of the full inspection report card.

  • FE and skills providers with any evaluation area graded as ‘needs attention’ will also receive monitoring. The monitoring inspection will only focus on the areas graded ‘needs attention’ at the previous inspection. If a provider has improved the quality of provision in these areas, the grade will change and we will publish an updated report card. If these areas have not improved, the grade will not change. If they have declined, then we will schedule a full inspection of the provider.

    When all evaluation areas that were ‘needs attention’ are graded as ‘expected standard’, the monitoring programme will end.

  • We will continue to carry out new provider monitoring inspections for those providers that newly become directly or publicly funded to deliver education and/or training. These new providers will receive a monitoring inspection within 18 months of starting to deliver that provision. This will only result in progress grades.

    Newly merged colleges will receive a monitoring inspection before receiving a full inspection within 3 years of the merger.

  • We will continue to carry out focused monitoring inspections if we receive any information that causes us serious concern, such as a safeguarding incident.

  • We will publish the FE and skills monitoring inspection operating guide later in the autumn, which will set out our approach in more detail.

  • We will adopt a ‘suspend and return’ policy in FE and skills inspections, as we have done for schools. Inspectors can suspend an inspection to allow a provider to resolve issues with safeguarding within 3 months, where there are no concerns in other evaluation areas.

Reinspection for ITE providers

We will continue with our usual approach to reinspection for ITE providers.

We are taking the following actions:

  • ITE providers that receive a grade below the ‘expected standard’ in any evaluation area will be reinspected within 12 months of their full inspection. The reinspection will only consider the evaluation area(s) graded below the ‘expected standard’.

Complaints

In the Big Listen, we heard criticism about how we handle complaints and that our process was not independent enough. The independent review of the impact of our inspection reforms on the workload and well-being of the education workforce that we commissioned found that these concerns remain.

We have taken steps to address these concerns, and will do more. So far, we have done the following:

  • We set up a new provider contact helpline. This advises callers on the complaints process. This is not intended to reduce complaints but to increase the transparency and understanding of the process.

  • We set up a ‘complaints against Ofsted’ hub to centralise complaints. This means that inspectors assessing complaints are always from a different region to the one from which the complaint originated. We believe this is leading to a more independent and consistent approach.

  • We are improving contact with complainants. We have introduced the opportunity for complainants to talk to the investigating officer so that the investigating officer fully understands the concerns being raised. We are also including more detail in our complaint investigation outcome letters to explain clearly why investigators reached their conclusions. This means that complainants get more information about how grades were decided, and examples of the evidence gathered.

  • We introduced complaints panels with external sector representatives, which began operating in January 2025. They review whether we have handled a sample of complaints fairly and in line with our policy. After a successful first 2 terms of operation, we will strengthen these panels even further by bringing more external representation into the process.

  • We will continue to work on strengthening our complaints process, including introducing more external representation on to complaints panels. Under our new Chair, we expect the Ofsted Board to take a significant role in developing our complaints policy. The Board will challenge the quality and independence of our processes and monitor this work.

Engaging children, learners, parents, carers, providers and staff on inspection

In our response to the Big Listen, we committed to improving how we engage with children, learners, parents, carers, providers and staff, both during inspections of education providers and more widely. We had heard from some parents that we should improve how we engage with them on inspection.

We commissioned the Behavioural Insights Team to carry out research with stakeholders to understand what improvements we can make to ensure that parents, children and staff can share their views with inspectors during inspections. 

We are taking the following actions:

  • We will improve how we engage with children, learners, parents and carers when we inspect education providers.  Our renewed inspection methodology emphasises the importance of speaking to pupils on inspection.

    All our inspectors receive refresher training on speaking to children and learners during inspection. We have used the Behavioural Insights Team’s advice to revise this training, including around how to make conversations with inspectors feel less formal and more approachable to put children and learners at ease.

  • We are considering new ways to strengthen how we hear the views of parents, children, learners and staff as we roll out our inspection reforms. We are developing videos that we can share to tell children and staff what they can expect from inspection, how they can share their views with an inspector and how inspectors will use what they say. We expect these to be ready later in the academic year.

What we heard – changing how we report inspection findings

We heard powerful feedback on how we propose to implement the government’s plan for new report cards for school inspection, which we are also applying across other education inspections. This has helped us to make changes to our report card design, toolkits and methodology (see the ‘Summary of changes’ section).

Report cards (Proposal 1)

Report cards – these will give parents, carers and employers more detailed information than the old inspection reports, including a new 5-point grading scale to evaluate more areas of a provider’s work and short summaries of what inspectors found.    

What we heard: from parents

The response to our plans from parents was resoundingly positive.

We asked YouGov to carry out independent quantitative and qualitative research with parents. This found that parents were very familiar with Ofsted. Of the parents that YouGov surveyed:

  • 99% had heard of Ofsted
  • 51% had heard a lot about Ofsted
  • 79% would trust what they read in an Ofsted report at least a little, with trust increasing significantly among those who had already read a report

However, parents also told YouGov in surveys and focus groups that the information we provide should be better. The findings of the YouGov research we commissioned suggest that what we proposed is a big improvement:

  • 67% of parents preferred the new report card
  • only 15% preferred the old format

These findings were consistent with a More In Common poll commissioned by ‘Schools Week’ in February 2025. In this:

  • 65% of parents preferred the new report card system
  • only 16% of parents preferred the previous format
  • 71% of parents felt the new grading system was fairer on teachers
  • only 17% said they felt the previous system was fairer

We also asked YouGov to research how useful the new reporting format was. This found that, of the parents surveyed:

  • 78% agreed the information on the new report card will be useful to them as a parent
  • 70% said the new report card tells them what they need to know about a school or sixth form
  • 78% said the new report card clearly tells them what a provider is doing well and where further work is needed
  • 87% said they would find a 5-point scale useful (from ‘causing concern’ to ‘exemplary’ – which is now called ‘exceptional’)
  • 64% agreed with the addition of an ‘exemplary’ grade

We commissioned YouGov to run a series of focus groups with parents of school-age children and children in early years settings to find out their views on the report card format, the grading scale, and the evaluation areas.

Parents of school-age children were very positive about the 5-point scale:

It [the scale] felt easy to understand and was often associated with 5 stars, though adding stars did not feel necessary.

[It added] ‘a much-needed “middle ground” and therefore reduce pressure.’

Since there are more ‘good’ areas on the scale, schools are provided with more to strive for, as they could effectively be performing well in 3 different categories.

The 5-point scale is a particular highlight of the new approach. The majority felt that it reduces pressure, allows for more nuance, gives parents and schools more insight and offers schools more flexibility.

YouGov also ran a focus group with parents who worked in education, mostly teachers in primary and secondary schools. This showed us that there was not an acute divide between parents in general and parents who are also education professionals on how we report. Parents who were education professionals also welcomed the changes:

I think that this format tries to strike a balance between providing a simple snapshot of a school, and providing more detail. Considering parents as the audience, I think this is a far more preferable format.

As a member of staff, or leader, it shows which areas are key to focus on to improve, and which to maintain. I also like the increased breadth of areas.

Honestly, I really like it. I might be tempted to suggest that differentiating between the red/orange and three shades of green would be clearer. Maybe not something as simple as a dotted line straight down between ‘attention needed’ and ‘secure’, but something. Maybe a light green box behind all of the areas in ‘secure’, ‘strong’ and ‘exemplary’ – to show that ‘secure’ is a positive thing.

In the YouGov focus groups with parents of children in early years settings, we saw a contrast in views. Parents who sent their children to group-based early years providers (such as nurseries) were far more positive about the report card format and scale than parents whose children were looked after by childminders. For parents of children who were looked after by childminders, the choice of provider was informed by factors other than Ofsted’s reporting, such as their relationship with the childminder being more personal.

The YouGov research also found that 84% of parents surveyed said the colour-coding system from dark green to red was useful. However, in its qualitative research, YouGov found that, while ‘the traffic-light colour approach is universally understood’, there were some challenges around accessibility. We found the same challenges in our user research, as the proposed 3 different shades of green did not have enough colour contrast for users to differentiate between them. Some alternative colour options were proposed for us to consider based on parents’ insights, which we have now adopted following extensive direct user testing.

In polling, 86% of parents surveyed said the labelling of the evaluation areas was useful. However, in focus groups, we were able to get specific views on the terms we had used. We found that terms such as ‘developing teaching’, ‘secure’, ‘causing concern’ and ‘exemplary’ were misunderstood or challenged. In response to these concerns, we renamed the evaluation areas and grading scale terminology.

The YouGov research was designed to hear from a representative sample of parents. Our online consultation attracted more negative views. Online respondents were not as in favour of the 5-point scale or the report card format as those in the YouGov polling sample. Many were also critical of the number of evaluation areas proposed. Those who responded positively to the consultation said that the report card was ‘easy to use for everyone’ and appreciated the parent-friendly layout that allows readers to easily search for relevant information (unlike a full report in the previous style). Parents frequently asked for context alongside data in the report cards to help inform their views. Some were also concerned that early years providers were being evaluated using school-based metrics.

What we heard: from professionals who work in education providers

The proposals have generated a mixed and sometimes negative reaction from early years professionals, headteachers, teachers, FE and skills professionals and ITE providers.

Education professionals welcomed some of our changes through the consultation feedback, such as removing the overall effectiveness grade across all remits and the greater nuance and detail in the report cards. They also saw the value of publishing data but stated that any performance data would need to be accompanied by contextual information, such as data on the demographics of providers and insights on issues such as inclusion from the inspections.

However, we also heard many concerns about our proposed reporting system. We heard a range of views on different approaches to reporting, particularly different forms of grading scale. A common thread in the feedback was the preference for a more narrative-based inspection report, or a ‘met or not met’ grading system, as opposed to a scaled grading system.

What we heard: from the early years and schools sectors

Respondents from the early years and schools sectors offered similar views, but early years representatives were distinctly more positive than schools representatives. A major childcare and early years provider representative said they believe the new grading system will help providers to identify areas to improve more precisely, due to the 5-point scale providing more granular feedback on what providers need to improve.

Organisations representing school professionals, including headteachers and teachers, had a strongly negative reaction to the report card proposals. This included a media campaign opposing our plans, which criticised: the principle of grading schools; the grading scale; the increased number of evaluation areas (despite these organisations also having concerns about a single overall effectiveness grade); and the proposed use of colours.

Submissions from these organisations expressed strong concerns that there would be high-stakes pressure and increased workload associated with the proposals, including by using any form of grading scale. They were also concerned about how consistent inspectors’ grading would be across the proposed increased number of evaluation areas, and the practicalities of inspectors using the toolkits across a 5-point scale, due to weaknesses in the descriptors within them.

Consultation responses from early years and school professionals also shared concerns about the implications for workload, fairness, the grading scale, and the complexity of evaluation criteria.

Some school professionals liked the concept of sharing best practice and the recognition they would receive through the new ‘exemplary’ grade (now ‘exceptional’). However, they were hesitant about the workload associated with the proposed case-study submission approach. We have now removed this approach and will instead encourage schools and other providers to share their ‘exceptional’ practice in other ways (see the toolkits for detail).

Through the user research we carried out, we were able to further explore some people’s preference for a more narrative-based inspection report. We were able to confirm the level of detail and narrative that users would like from the report cards, and have made changes in light of this.

What we heard: from the FE and skills and ITE sectors

We had far fewer respondents from the FE and skills and ITE sectors. Those we did hear from expressed views on our reporting that were similar to those from the early years and schools sectors.

One of the main concerns specific to the FE and skills sector was that data on issues such as completion could be misinterpreted by people who have limited knowledge of the sector.

FE and skills respondents also cautioned that there is no single source of achievement data that we could use for all FE and skills provision types. Some said that specialist providers would not be well represented by qualification data because some or all learners would be working towards personalised learning goals, rather than external accreditation. Representatives of nominees in the FE and skills sector were positive about the proposed 5-point scale, and their member survey indicated strong overall support for this approach.

Like other respondents, ITE providers expressed concerns about the number of evaluation areas. Some organisations recommended that we reduce the total number to allow inspectors enough time to thoroughly cover everything in the toolkit.

Proposal 1: our changes and actions

We have set out our changes and actions on reporting in the ‘Summary of changes’ section.

Changing our inspection practice (proposals 2, 3, 4 and 5, plus additional questions)

Education inspection toolkits – the toolkits show providers and inspectors the evaluation areas that we will focus inspections on and how we will assess and grade them.

What we heard: early years providers

Across Ofsted-led focus groups with early years professionals, and in many consultation responses, we heard broadly encouraging feedback on the clarity and relevance of the proposed toolkit.

Respondents to the consultation said that the toolkit provides transparent information about what inspectors are looking for and could be helpful for self-assessment.

In focus groups, early years providers generally welcomed the content and structure of the toolkit, as well as the focus on inclusion throughout.

Some responses, particularly from childminders, noted that the level of detail in the toolkit may need further adjustment to better reflect their specific context, for example the purposes of different provision types and the capacity of smaller settings to prepare for inspection.

There were broader questions and comments about the suitability of the toolkit for different early years providers. Respondents wanted more detail about how inspectors would adapt it for different settings, and some suggested that it could be more closely aligned with the purpose and principles of their provision. Others recommended that the language in the toolkit should better reflect the language used by early years providers.

Some consultation responses suggested that there were too many evaluation areas for early years. There were some concerns that the breadth of the toolkit could have an impact on staff’s workload and well-being. Some were also concerned about the relationship between inspectors and early years providers. They highlighted the need for comprehensive training to support consistent and constructive inspections.

What we heard: state-funded schools

Across all education remits, inspectors and providers on test visits, as well as consultation respondents, raised issues about the differentiation between the ‘secure’ (now ‘expected standard’) and ‘strong’ (now ‘strong standard’) grades across the toolkits. This also came through strongly in responses from the schools sector, both in the consultation and in direct feedback.

Many respondents felt that the toolkits did not do enough to acknowledge the challenges faced by schools in deprived areas or those with limited resources. They wanted us to revise the toolkit to better accommodate the context of schools and the realities they face, particularly in terms of resources and recruitment challenges, leadership structures, and the socioeconomic backgrounds of students. Some also raised concerns that schools may be judged unfairly in the achievement and attendance section.

Respondents encouraged us to make use of contextual data to improve inspectors’ understanding of the different degrees of challenge schools face. Another broad concern was the breadth of the overall reforms, such as the number of evaluation areas and associated toolkits, and the workload this may generate. However, the inclusion of staff well-being in the toolkits was welcomed.

We heard that the term ‘developing teaching’ was not clear and not everyone understood it. However, respondents welcomed our focus on professional development within this evaluation area.

We asked, in the consultation, whether the toolkit would work in practice for special schools and the AP we inspect. Respondents noted the importance of adapting the toolkits for these settings and ensuring that inspectors who inspect special schools and AP have the right level of expertise to make informed and consistent evaluations.

What we heard: independent schools (non-association independent schools)

Respondents on behalf of independent schools broadly welcomed the alignment with state-funded schools.

Across all evaluation areas, independent school respondents stated that the toolkit should refer to existing professional standards and guidance, where these are available.

They also highlighted the need for inspectors to understand the context they work in, especially regarding small settings.

As with state-funded schools, independent school respondents stated that the toolkit would need to be adapted for the independent special schools and independent AP settings we inspect.

What we heard: FE and skills providers

Feedback on the toolkits from professionals working in FE and skills reflected a wide range of views and valuable insights.

The consultation responses noted the complexity of inspection for FE and skills providers that have multiple types of provision. Some respondents noted that the number of potential evaluations – up to 20 – could be difficult to manage. Several stakeholders recommended merging evaluation areas to simplify the toolkit. Some focus group participants recommended further refining the evaluation areas to align more closely with the schools toolkit, for example expanding ‘leadership’ to ‘leadership and governance’ and strengthening the emphasis on ‘personal development’, as seen in the education inspection framework.

The consultation also revealed mixed views on the toolkit’s suitability across different FE and skills provision. Some respondents felt the toolkit was well suited to 16 to 18 college provision; others found it less applicable or harder to use for independent learning providers and apprenticeship programmes.

What we heard: ITE providers

The ITE toolkit received some positive feedback through the consultation response. Respondents welcomed the overall relevance of the toolkits to ITE. Some consultation respondents and focus group attendees noted that we could change some key terms in the toolkit to better align with the ITE sector.

Several ITE stakeholders felt that we needed to do further work to ensure that the standards in the toolkit were suitable and terminology clearly distinguished between trainee, teacher, leader and mentor, for example.

Consultation respondents raised specific concerns about how inspectors would measure achievement; they wanted further details on the extent to which success would be measured against trainees’ outcomes or the provider’s performance.

What we heard: inclusion specifically

Our focus on inclusion was widely welcomed across all education remits. Respondents felt that inclusion was reflected as a core priority across all toolkits. A number of education professionals agreed that providers needed to get it right for the most vulnerable and disadvantaged in order to get it right for everyone.

Although many education professionals welcomed our proposed definition of inclusion, some suggested that we should leave a formal definition to the DfE.

Education providers also wanted inspectors to recognise the context they were operating in. Across all remits, respondents raised the importance of recognising systemic challenges such as funding, parental responsibilities, availability of health services and availability of social care services. They cautioned that all of these have an impact on the extent to which they could be inclusive.

There were also some concerns about how the toolkit would be applied outside mainstream school settings. Consultation respondents noted that it would be challenging for small settings to evidence their inclusive practice during an inspection. Early years providers, in particular, questioned how inspectors would consider the impact a setting can have if it only cares for a child for a fraction of the week. FE and skills and ITE respondents also cautioned that inclusion looks different for them, and that inclusive practice for adult learners has to be based on consent.

Proposal 2: our changes and actions

We have set out our changes and actions on the toolkits in the ‘Summary of changes’ section.

Proposal 3: inspection methodology

Inspection methodology – changes to how we carry out inspection.

What we heard overall

We heard thoughtful and constructive challenge on the ‘look and feel’ of inspection – our methodology and overall approach.

Consultation respondents liked the shift towards a more supportive, empathetic approach to inspections. They also appreciated the starting point of ‘expected standard’ (formerly ‘secure’) for schools, unless evidence suggested otherwise. Across all educational remits, professionals and inspectors who responded to our consultation or participated in testing visits stated that the new approach to inspection was more collaborative.

However, many education professionals in the online consultation were dissatisfied with the overall changes. They felt that the reforms did not go far enough in addressing their concerns. They were concerned that the increase in evaluation areas could lead to greater workload, more stress and negative impacts on well-being. Some were also concerned that the reforms would not adequately account for the unique contexts of individual providers, leading to unfair outcomes that lack nuance.

Respondents were also uncertain about what an inspection would entail without deep dives. There was a sense of heightened anxiety about the proposed new methodology, because respondents were familiar with the current process but not with what would replace it.

What we heard: early years and schools sectors

Early years and schools professionals welcomed the changes to our methodology to make inspections feel more collaborative.

In focus groups, headteachers and leaders said that they welcomed the opportunity to explain the context of their provision during the planning call and to demonstrate the work they are doing on inspection. They also generally liked the fact that, under the methodology, inspections will start from ‘secure’ (now ‘expected standard’). Early years respondents to the online consultation also said that their workload might reduce as a result of this change. Some focus group attendees thought it was still unclear how the methodology and inspection process would work in practice and that the wording of the toolkits should be made clearer.

Consultation and focus group evidence suggested that school professionals particularly welcomed our removal of deep dives, as this would reduce the pressure on, and workload of, middle leaders. They also noted that this change would allow inspectors to spend more time with leaders and pupils.

There was also support for making senior leaders the focus of inspections, and the emphasis on professional dialogue and collaboration. Some school inspectors also welcomed the removal of deep dives; they felt that the methodology had become too narrow. Other inspectors wanted more clarity on what the renewed methodology would be like without deep dives.

There were mixed views about how the role of the nominee would work. Some thought it may help to give assurance on inspection; others thought it would be an unnecessary burden for smaller providers, especially childminders who work alone. Early years professionals and inspectors asked how the notification call would work in practice for childminders and other small settings.

Representatives of school group leaders welcomed our more nuanced approach to evaluating school performance; emphasis on inclusion; and commitment to identifying and sharing best practice to raise standards system-wide.

They also recommended ways to strengthen the validity and reliability of inspections, such as by simplifying the grading system, particularly around the use of ‘exemplary’; refining the wording of evaluation areas; merging the assessments of teaching and curriculum; and treating inclusion as an aggregated evaluation area. They also advised ensuring that the methodology supports consistent grades and reducing the volume of proposed monitoring activity.

Representative groups of school professionals, such as headteacher and teacher unions, were highly critical of all the proposals in the consultation. They raised concerns about the perceived increase in pressure on school leaders, the feasibility and fairness of inspections, and the lack of sensitivity to different school contexts. Respondents welcomed the removal of deep dives and supported our commitment to continue to emphasise inspectors’ professionalism, courtesy, empathy and respect.

The alignment of independent school inspections to state-funded school inspections was broadly welcomed, to maintain a level playing field across school types.

What we heard: FE and skills and ITE sectors

Focus group participants and online consultation respondents had mixed views on the removal of deep dives in FE and skills inspections. Some felt that the removal would help ease the pressure on middle leaders during inspection. Others felt that the focus on subjects through deep dives had helped to drive up standards.

FE and skills respondents were also concerned about whether the proposals would be appropriate in a range of provider types, and how inspectors would take account of the context of these providers.

In ITE focus groups, providers spoke about their experiences of inspection. They said that they welcomed the focus on collaboration. Some mentioned their concerns about potential increases in workload.

Proposal 3: our changes and actions

We have set out our changes and actions on inspection methodology in the ‘Summary of changes’ section.

Proposal 4: full inspections and monitoring inspections of state-funded schools

Full inspections and monitoring inspections of state-funded schools – we plan to end ungraded inspections of state-funded schools and to change our monitoring programmes so that we can check that schools are taking timely action to raise standards.

The increase in monitoring requirements under the renewed framework raised some concerns from education professionals who responded to the online consultation. They were worried that the proposed frequency of monitoring inspections would increase their workload, that the visits would divert their focus away from making improvements, and that this could negatively affect staff’s mental health and well-being.

Representatives of school group leaders were also concerned that the proposed frequency of monitoring inspections could add burden without a clear benefit. We heard from another representative body that the frequency would be disproportionate, and would not allow schools enough time to make meaningful improvements.

Respondents to the online consultation also noted the potential overlap of our proposed monitoring with the DfE’s ‘regional improvement for standards and excellence’ (RISE) teams. They were concerned that if a school is subject to both Ofsted monitoring and support from the DfE’s RISE teams, then this would increase the stress for staff and could lead to mixed messages and inefficiencies. They said schools might be at risk of receiving support from too many external sources, especially when advice might differ.

Monitoring also came up in responses to our question about what we could do to help reduce or manage any unintended consequences of the changes. Respondents suggested removing grades and reducing the friction of inspection, but they also noted that increased monitoring inspections would encourage Ofsted to take more of an advisory/support role for schools. This showed that monitoring is seen as a positive by some in the sector.

Proposal 4: our changes and actions

We have set out our approach to monitoring.

Proposal 5: identifying state-funded schools causing concern 

Identifying state-funded schools causing concern – a new approach to how we will place a school into a category of concern.

In response to the question on how we propose to identify schools causing concern, many agreed with the process and welcomed its clarity.

Some of the more negative respondents considered the language to be stigmatising; they felt that labelling a school as causing concern would erode the community’s trust and damage staff morale. We have since changed the terminology to ‘urgent improvement’.

They said that the renewed framework increases the number of potential points of failure and could increase the risk of schools being judged negatively. Some felt this may discourage leaders’ aspirations and hinder efforts to continuously improve.

Proposal 5: our changes and actions

We have set out our approach to identifying schools causing concern.

What we heard from current and former inspectors about all the proposals

Inspectors who responded to the consultation had a range of views. They praised the nuance that report cards would bring to inspections and said that having 5 grades would allow them to distinguish between providers that currently just about reach a ‘good’ rating and those that are not quite ‘outstanding’. However, many noted that they would need to be trained to grade consistently across the 5-point scale.

Some inspectors stressed the importance of having a clear ‘expected standard’ grade that looks at whether providers are meeting their statutory and non-statutory responsibilities.

Others thought the number of evaluation areas and associated toolkits might increase their workload. Some raised questions about what the changes to the methodology would mean for inspection practice, such as the removal of deep dives.

Many thought the changes would lead to more supportive inspection practice.

We have set out our approach to grading and inspection methodology and toolkits.

The workload and well-being implications of our inspection reforms for the education workforce

We know professionals understand that our first duty is to the children and learners we are charged with protecting. It is our job to call out practice that undermines children’s safety or robs them of their one chance at an education that allows them to thrive.

Sadly, we too regularly uncover bad actors working in education and care. We must always be vigilant so that wrongdoing does not go undetected. But more often, unacceptably low standards are not the result of malign intent. Instead, well-intentioned professionals are struggling in difficult circumstances. In these cases, we must still act to protect children and learners while enabling professionals to receive the support they need.

This is why reducing workload and promoting well-being remain central to our approach. We will maintain our focus on raising standards and holding providers to account. But we also believe giving education professionals time and space to receive support allows them to do their best to raise standards and protect children.

We accept we have a challenge. To change our approach, we have to change. The government committed to this change in its manifesto, and parents and professionals called on us to change through the Big Listen. But, as the independent workload and well-being review we commissioned recognised, any change to the framework is likely to result in stress due to the potential workload consequences that may come from adapting to a renewed framework. We are doing what we can to alleviate the pressures of that change on professionals, while being true to our duties to children and learners, and our responsibilities to parents and carers to offer them the nuanced information they have called for.

This section sets out what we are doing to address the workload and well-being implications of our inspection reforms through:

  • our new inspection methodology
  • refining our approach through test visits
  • responding to an independent review of the reforms

How our new inspection methodology supports providers’ and inspectors’ well-being and workload

We have considered leaders’ workload and well-being from the outset of our reforms and in our methodology design. In the consultation, we stated:

We do not want inspection to add to leaders’ workloads. We want it to come together with the everyday business of running a provider, so that it does not detract leaders from what they are already doing to continuously improve their provision. To support this approach, our toolkits will take account of the standards and expectations already placed on leaders and their provision. This includes statutory and non-statutory guidance, professional standards and the educational research that suggests the most effective strategies in securing better outcomes for all learners.

This commitment remains embedded in our approach.

How we are addressing workload concerns

First and foremost, nothing in the standards set out in the toolkits should add to a provider’s workload. Our toolkits are built on the requirements, standards and expectations already placed on leaders and their provision. This includes statutory and non-statutory guidance and standards that professionals should be meeting. They are also based on the research and inspection evidence that suggests the most effective strategies in securing better outcomes for all children and learners.

We believe inspections will help providers focus on meeting those expectations more efficiently and effectively. We do not expect any provider to be doing more than it needs to just ‘for Ofsted’.

We have built on this through our inspection practices. Our revised grading is more nuanced, fair and informative, and we believe it better supports well-being than the previous model or the alternatives considered.

As set out earlier, we have designed the inspection methodology to be more collaborative, to minimise the stress of inspections. This thinking even flows through to how we have structured the toolkits themselves. The toolkits group the 3 most common grades (‘needs attention’, ‘expected standard’ and ‘strong standard’) on one page, with the 2 extremes (‘urgent improvement’ and ‘exceptional’) shown separately. This will help to focus attention during an inspection on the areas where most providers sit, and make it clear that inspectors are not looking to ‘catch leaders out’, as some falsely fear.

We have revised the grading methodology so that it is fairer and more informative, while reducing unnecessary anxiety. We will keep leaders and nominees (where relevant) informed about likely grading outcomes through regular reflection meetings, which will be an opportunity to review emerging evidence (for more detail, see the operating guides). This will help to reduce the build-up of anxiety around revealing the inspection grades at the end of the process.

We are introducing a more detailed report card with a 5-point grading scale. This recognises providers’ strengths and areas for improvement. It offers a more nuanced form of reporting and replaces the previous ‘overall effectiveness’ grade that we know from the Big Listen caused much anxiety across the education workforce. We also believe that we can reduce anxiety by ensuring consistency in grading.

We are assuring leaders about how we will see the context of their provision, and how we will adapt our inspections to different settings. Our section on monitoring and reinspection sets out how we can ease the concerns of those leaders worried about the ‘needs attention’ and ‘urgent improvement’ grades, and how they can be improved within an inspection cycle.

We have a section on everything we are doing to reduce the workload and support the well-being of leaders and education professionals. This explains how:

  • we are increasing school inspection capacity so lead inspectors can offer more support to leaders
  • we have removed the deep dive methodology for schools and FE and skills providers, to reduce the workload impact on middle leaders
  • we are introducing the role of a ‘nominee’ to all education inspections (FE and skills inspections already have this) to assist communication between the school and inspection team, which should also support leaders’ workload and well-being
  • we want inspections to take place within clear and reasonable timeframes to reduce the workload impact of the inspection itself; as part of the test visits to schools, we asked inspectors to ensure that they leave the school by 5pm on the first day
  • we will continue enabling inspections to be paused when well-being concerns are raised
  • we will continue giving inspectors mental health training

We stand by the steadfast commitment we made in the Big Listen to reset the relationship with, and consider the well-being of, those we inspect in any changes we make. This is why we have taken the concerns raised during the consultation about workload and well-being very seriously.

Refining our approach based on findings from the test visits

We have been determined to refine our approach through testing. Through March, April and May, we held a series of test visits across a range of providers in the different education remits we inspect. We wanted to understand the impact of our proposals on the ground.

This first round of test visits was based on the toolkits and methodology we consulted on in February.

After the test visits, we asked providers whether the proposed methodology was likely to reduce their inspection-related workload compared with how we currently inspect.

The findings were mixed. About half of all providers agreed that it would reduce their workload and half disagreed. Responses from schools were split roughly 50/50, while more early years providers agreed and more FE and skills providers disagreed. As part of this feedback, leaders across all remits reported that they appreciated being more involved in the inspection, even though it was demanding on their time.

After the consultation closed, we made significant changes ahead of further testing. These included:

  • reducing the number of evaluation areas across remits
  • revising our methodology
  • clarifying and constraining expectations on providers
  • revising the role of the nominee
  • making it clear that inspection must take place within a set timeframe

After making these changes, we carried out a subsequent series of test visits through June and July. Those test visits allowed us to assess the impact of these changes.

Feedback from the test visits

Feedback from the test visits also gave us more evidence that we can build on. Providers remained divided on whether the reforms will reduce their inspection-related workload in the years between inspections. Most early years providers agreed that their workload during inspection days would reduce compared with previous inspection frameworks, but a larger proportion of schools and FE and skills providers said that theirs would not, as they felt that there is always a need to prepare for inspections.

We had positive feedback on other elements of the methodology. Almost all early years providers and schools involved in the test visits agreed that the planning call helped them to understand what to expect. Almost all early years providers and half of schools said that the proposed inspection methodology did not negatively impact on their day-to-day running. All early years providers and almost all schools said that accompanying inspectors and talking with them on the visit helped to develop a shared understanding of their provision’s strengths and areas for development. All early years providers and most schools said that, overall, they were satisfied with the way evidence was gathered. Almost all early years providers and schools, most FE and skills providers and all ITE providers felt that conversations with inspectors about grading were collaborative.

These findings bolster our confidence that our approach will be more transparent, less intrusive, more supportive and more collaborative – which should combine to reduce anxiety and support the well-being of those we inspect.

Independent review of the impact of our inspection reforms on the workload and well-being of the education workforce

In addition to this extensive testing of our approach, we commissioned Sinéad McBrearty, Chief Executive Officer at Education Support, to carry out an independent review of the impact of our inspection reforms on the workload and well-being of the education workforce.

The review took a ‘mental health impact assessment’ approach as the framework for its analysis. It gave us recommendations on how to manage the initial stress that inevitably accompanies change, and the potential workload pressures that may come with it.

The review was split into priority actions and secondary actions for Ofsted. We have responded accordingly.

Independent review: priority actions

Recommendation 1: Explore and implement changes to reduce the isolation and individual responsibility felt by headteachers and principals.

Strong leadership is vital to a school’s success, but it is never the responsibility of just one person. It is shaped by a group of leaders, including those who have statutory responsibilities for the well-being of the headteacher. To reflect this, we will name the headteacher, the chair of governors or trustees, and the chief executive of the multi-academy trust (if applicable) on school report cards. This makes it clear that inspection outcomes are a collective responsibility.

We also recognise that leaders’ well-being and workload are influenced not only by reporting but by the whole process of inspection. As we have set out in the inspection methodology and workload and well-being sections, we are significantly improving this process through our reforms. The inspection methodology is designed to ease leaders’ workload by tailoring inspection activity to each provider’s context, involving leaders throughout, and reducing the likelihood of unexpected findings through the sharing of emerging grades. Introducing an optional ‘nominee’ role for all remits should ease the inspection process and help reduce the demands placed on providers. This builds on changes we have already made to address headteacher isolation, including allowing all headteachers and teachers to have a colleague from their school or trust join discussions with inspectors.

In addition, the DfE’s revised accountability model, combined with our approach to monitoring inspections, which can review and update any grade below ‘expected standard’, gives leaders a clear opportunity to make rapid improvements and to have these recognised in subsequent monitoring visits. If a provider improves, we will update its report card, ensuring that providers are fairly represented to parents and the public. This change has important implications for well-being, as progress can be recognised promptly.

Recommendation 2: Invest significantly in the well-being and professional development of the HMI workforce.

We want to minimise stress and workload pressures for inspectors as well as providers, to ensure that they are at their best.

We will add an extra inspector to school inspection teams for the first day, to boost inspection capacity and support inspection teams. By shortening inspection days we will reduce inspectors’ workload, and by improving opportunities for dialogue between inspectors and providers we will make the inspection experience more positive for everyone involved.

We have engaged closely with inspectors’ trades unions on our plans.

We have developed a comprehensive package of training for inspectors for the launch of the renewed framework. This training will help to refresh the core skills, knowledge and behaviours that inspectors need to carry out inspections effectively. It will also help prepare them to inspect with the professional, propositional, procedural and conditional knowledge they need to be at their best. The training will include refresher sessions on mental health and well-being, which build on the training that we rolled out on this topic following the Prevention of Future Deaths report in 2024.

Recommendation 3: Introduce an unequivocal mechanism for independence in the complaints process.

We have already made significant changes to how we handle complaints in response to concerns raised about the process. However, we are determined to go further to build trust in how we do this.

We are improving communication with complainants: investigating officers offer direct conversations to better understand their concerns. We have set up complaints panels with external sector representatives, who review whether complaints are handled fairly. These panels began in January 2025 and will be strengthened further by increased involvement from external representatives to enhance transparency and trust in the process.

We are continuing to work closely with the DfE on how we can introduce further independence into the complaints process. The DfE contracts with the Independent Complaints Adjudication Service for Ofsted, which is run by an independent body and gives recommendations to us on how to improve our complaints handling.

Under our new Chair, we expect the Ofsted Board to take a significant role in developing our complaints policy. The Board will challenge us on the quality and independence of our processes and monitor this work.

Recommendation 4: Develop a clear protocol for responding to individuals in acute distress or at risk of suicide.

In response to the Prevention of Future Deaths report, we introduced measures to respond to individuals in distress. This included a policy allowing inspectors to pause an inspection if they have concerns about an individual’s well-being. We also embedded mental health awareness in all inspector training. We will update that training regularly in response to the latest research and guidance. When the British Standards Institution’s standard dedicated to suicide awareness has been finalised and published, we will review it and ensure that our training reflects it.

We have also launched a provider contact helpline and created an ‘inspection welfare, support and guidance hub’ to offer support and guidance to inspectors and providers during the inspection process. These steps are part of our wider commitment to ensuring that everyone we inspect is treated with professionalism, empathy, courtesy and respect. We have completed every action we committed to in our response to the Prevention of Future Deaths report.

We have developed our inspection approach to take account of concerns where we are responsible for addressing them. For example, we designed our pause policy to create space during an inspection for responsible bodies to support an individual experiencing distress. Ofsted is not the appropriate organisation to provide that support itself; we should not step in where others have a responsibility to do so.

Alongside the Prevention of Future Deaths report, we also commissioned Dame Christine Gilbert to lead a learning review of Ofsted’s response to the death of Ruth Perry. This looked at the actions we took in response to hearing about Ruth Perry’s death, our communication and engagement with stakeholders, information-sharing within Ofsted, and the support we offered internally to staff. After every Board meeting, we publish a report on our progress in completing the actions set out in the Big Listen, including in our responses to Dame Christine Gilbert’s review. From September, Dame Christine will become Chair of the Ofsted Board.

The DfE also responded to the Prevention of Future Deaths report, committing to improve communication with schools, review safeguarding guidance, and strengthen support for school and college leaders.

Recommendation 5: Monitor the unintended consequences of the revised framework highlighted in this report.

As part of preparing for our inspection reforms, we have carried out the workload impact assessment we set out above. This included testing the impact of our reforms ‘on the ground’ through test visits, as well as commissioning this independent review. We have taken on board the findings from those test visits and the recommendations of this review to inform our changes.

As we start inspections under the revised framework, we want to keep checking for any unintended consequences. In autumn, we will invite a random sample of providers to take part in ‘exit interviews’ with His Majesty’s Chief Inspector, the National Director for Education and other senior Ofsted officials. These interviews will supplement the standard post-inspection survey and give us deeper insight into the impact of the changes.

We will also start holding ‘roundtable’ meetings with sector representatives to gather qualitative feedback on the impact of the reforms in real time. We will continue to listen to, reflect on and respond to any challenges. We have also commissioned an independent evaluation of the renewed framework. This will start with a baseline study in summer/autumn 2025, followed by in-depth qualitative research in spring 2026 and an ongoing post-inspection survey beginning in spring 2026 and continuing in summer and winter.

We will use these insights to help us respond to any emerging issues as fast as possible and to adjust the framework when needed. We do not want the framework to be ‘fixed’. We intend to amend it as necessary to take into account changes to government policy, experience of inspections on the ground, feedback from stakeholders and evidence from research and reviews.

Independent review: secondary actions

Recommendation 6: Develop and monitor key performance indicators to track the progress of key actions identified in this report.

We are committed to being a transparent, learning organisation; we will continue to review all available evidence on the impact of the renewed framework to inform our future improvements.

As part of this, we will carry out a comprehensive evaluation programme to understand both the implementation and impact of the framework. This will include:

  • an independent evaluation of the renewed framework – beginning with a baseline study in summer/autumn 2025, followed by in-depth qualitative research in spring 2026 and an ongoing post-inspection survey
  • work to assess consistency, including quality assurance processes, desk-based analysis using vignettes, and data monitoring through weekly consistency meetings
  • further evaluation activities focused on engaging with parents, carers, providers and inspectors to gather insights into how they are finding the framework in practice

Our Strategy and Delivery Unit, set up in response to the Dame Christine Gilbert Review, will track the progress we make against each of the actions we have committed to in this response, and give regular updates to the Ofsted Board.

Recommendation 7: Carefully monitor and be prepared to revise the amount of inspector time that can be allocated to contested inspections.

We have made significant reforms to how we grade providers, to make the inspection process more collaborative and more consistent. We believe these changes will lead to fewer contested inspections.

For more challenging inspections, our regional directors will be able to give inspectors more time to gather evidence to inform their grading.

If an inspection is contested through the complaints process, we will dedicate expertise from across Ofsted to review the inspection outcome and ensure that we give an accurate grade.

Recommendation 8: Develop a plan to address the particularly low level of trust in Ofsted among primary schools.

Rebuilding trust in the inspection system is a priority for Ofsted. We understand that trust must be earned through openness, fairness and a clear commitment to listening and responding.

The Big Listen was a key step in this effort. It gave all those we inspect, including leaders, staff, parents and carers, the opportunity to share their experiences and concerns. We heard the need for more transparency, greater empathy during inspections, and a system that better supports well-being while maintaining high standards for children and learners.

We believe that the reforms to the renewed framework will help to instil greater trust in inspections. Several measures in our renewed framework directly address the concerns that primary school professionals raised – such as removing deep dives, which were particularly difficult for small primary schools to manage. The toolkits also have specific sections explaining how inspectors should adapt inspection activity for smaller settings, such as primary schools. Schools will also have the option to have a ‘nominee’ who can liaise directly with the inspection team, which will support a more collaborative inspection experience.

We will train our inspectors to ensure that they are well equipped to understand the specific context and challenges of different providers, including primary schools.

All primary school inspections will be led by inspectors with expertise in the primary phase. In the rare cases where this is not possible, we will use additional quality assurance measures. We aim to improve the consistency and accuracy of inspection findings in primary settings.

Independent review: our conclusion

We are confident that our new approach will promote stronger collaboration, greater consistency and renewed confidence in the inspection system. However, we recognise that our renewed approach, and the process of change itself, may create some additional workload for some providers. This is why we have taken the extensive steps set out above to alleviate these concerns.

The independent review focused mainly on the schools sector, in which workload and well-being had been the subject of much attention and a major concern of sector representative organisations for some time.

This does not detract from our focus on the workload and well-being concerns of all the other sectors we inspect and regulate. For instance, as we increase the frequency of routine inspections for early years, we will review the workload and well-being implications of early years inspections and what we can do to mitigate them. In the long run, we believe that more frequent inspections will give greater assurance to providers, professionals and parents alike.

We recognise that inspections can be stressful. That is to some extent inevitable in an inspection system fundamentally aimed at ensuring that proper standards of education and safeguarding are in place, and that parents are fully informed on those matters. However, we are determined to minimise this stress where we can. We fully believe the changes we have made do this, and that they will lead to a more informative, transparent and fairer system of reporting that better serves children and learners, parents and carers, and professionals and providers.

Next steps

Starting inspections

We will start inspecting under the revised framework, using the operating guides and toolkits, from:

  • 10 November 2025 for early years, state-funded schools and FE and skills inspections
  • January 2026 for ITE and non-association independent school inspections

This will give providers at least a full 2 months to become familiar with the changes.

For state-funded school inspections, we will prioritise volunteers for full inspections in the weeks between 10 November and Christmas. These inspections will result in a report card, with a complete set of grades. We will return to the normal schedule for state-funded schools towards the end of the period and not before 1 December. If there are enough volunteers, we will continue to prioritise them after 1 December. We will not carry out inspections in the final week before Christmas. 

Our Deputy Chief Inspector will review all requests for an inspection deferral to make sure each case is treated with the utmost sensitivity and consideration.

Training our inspectors

There will be a steady and consistent start to inspections. This month, we will use the end-to-end piloting process as an opportunity for as many HMI and early years regulatory inspectors as possible to experience and apply the new methodology. We expect all HMI to take part in a pilot inspection.

We will carefully structure our schedule to ensure that our senior HMI lead the first inspections, with HMI on the inspection teams or shadowing these inspections. From November until the end of the year, all inspectors will go through the process of shadowing, teaming and learning. We want to ensure that they are all confident with the renewed framework.

To support a steady and assured start, our National Director for Education and Principal Inspector will quality assure the work of the lead inspectors after their pilot visits to providers in early autumn. This will ensure that they are confidently able to carry out inspections to the required standard.

Evaluating the reforms

We will also evaluate the implementation and impact of our reforms. The evaluation plan will include:

  • an externally commissioned evaluation of the renewed inspection framework; we will ask an independent supplier to carry out a baseline study in summer/autumn 2025, preliminary in-depth qualitative research in the spring term of 2026, and a rolling post-inspection evaluation survey that will continue each year
  • a programme of work to measure our consistency, including through HMI shadow inspections, a desk-based study using vignettes, and data that informs the weekly consistency meetings
  • further activities, such as engagement with parents and carers, providers and inspectors about the implementation of the framework

Annex – data in report cards 

We will publish data alongside the report cards to illustrate the providers’ and learners’ contexts and, where available, the performance and attendance data that we will use to support inspection.

The data will be:

  • already made public by the DfE or another government body
  • what was available and correct at the time of inspection; it will stay as this ‘point in time’ data until the provider’s next full inspection

When we have carried out additional analysis on the published data, for example comparing a provider’s figure with the national average, we will include a clear explanation of the methodology we used.

The data we will publish for each remit

Early years providers: provider context

  • Age range of children: the age range of children who attend the provision
  • Number of places: the total number of early years registered children that may attend the provision at any one time

Independent schools (non-association independent schools): school context

  • School capacity: the number of pupils the school can accommodate
  • Number of pupils on roll: the number of pupils currently at the school

Independent schools (non-association independent schools): pupil context

  • Number of pupils with an education, health and care (EHC) plan: the number of pupils with an EHC plan
  • Number of pupils with special educational needs (SEN) support: the number of pupils who receive SEN support

State-funded schools: school context

  • School capacity: the number of pupils the school can accommodate
  • Number of pupils on roll: the number of pupils currently at the school
  • Resourced provision or SEND unit (if applicable): whether the school has resourced provision or a SEND unit
  • Type of specialist provision (if applicable): the type of SEND provision offered at the school

State-funded schools: pupil context

  • % of pupils eligible for free school meals – ever 6: the proportion of pupils eligible for free school meals at any point in the last 6 years
  • % of pupils with an EHC plan: the proportion of pupils with an EHC plan
  • % of pupils with SEN support: the proportion of pupils who receive SEN support
  • School location deprivation: the deprivation level of the school’s local area relative to the national level

State-funded schools: performance – key stage 2*

Data will not be provided for special schools or AP. Where possible, data will be provided for disadvantaged pupils.

  • % meeting expected standard in reading, writing and maths: the proportion of pupils meeting the expected standards in combined reading, writing and maths
  • % meeting expected standard in reading: the proportion of pupils meeting the expected standards in reading
  • % meeting expected standard in writing: the proportion of pupils meeting the expected standards in writing
  • % meeting expected standard in maths: the proportion of pupils meeting the expected standards in maths

State-funded schools: performance – key stage 4

Data will not be provided for special schools or AP. Where possible, data will be provided for disadvantaged pupils.

  • Attainment 8 score for school: a score based on how well pupils have performed in up to 8 qualifications
  • % pupils achieving grade 5 for English and maths (combined measure): the proportion of pupils who achieved a grade 5 or above in English and maths GCSEs
  • Progress 8 score for school: a score showing pupils’ progress between the end of key stage 2 and the end of key stage 4
  • Destinations at age 16: the proportion of school leavers by destination after key stage 4

State-funded schools: performance – 16 to 18*

Data will not be provided for special schools or AP.

  • A-level average point score: the average points that students achieved per A-level entry
  • A-level value added: a score showing students’ progress between the end of key stage 4 and the end of their academic qualification studies

State-funded schools: absence

  • Overall absence rate: total sessions missed due to absence as a percentage of all possible sessions
  • Persistent absence rate: the percentage of pupils who missed 10% or more of possible sessions
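
To make the two absence measures above concrete, the short sketch below shows how they could be derived from per-pupil session counts. It is purely illustrative: the figures, the function name and the data structure are invented for this example, and the sketch is not the DfE’s published calculation.

```python
# Illustrative sketch only: invented per-pupil session counts (a 'session' is a
# half-day in DfE attendance data). Not the DfE's published methodology.

def absence_measures(pupils: list[tuple[int, int]]) -> tuple[float, float]:
    """Return (overall absence rate %, persistent absence rate %).

    `pupils` is a list of (possible_sessions, missed_sessions) pairs, one per pupil.
    """
    total_possible = sum(possible for possible, _ in pupils)
    total_missed = sum(missed for _, missed in pupils)
    # Overall absence rate: total sessions missed as a percentage of all possible sessions.
    overall_rate = 100 * total_missed / total_possible

    # Persistent absence rate: the percentage of pupils who missed 10% or more
    # of their own possible sessions.
    persistently_absent = sum(1 for possible, missed in pupils if missed >= 0.1 * possible)
    persistent_rate = 100 * persistently_absent / len(pupils)

    return overall_rate, persistent_rate


# Example with three pupils, each with 300 possible sessions in the year.
overall, persistent = absence_measures([(300, 9), (300, 45), (300, 21)])
print(f"overall absence: {overall:.1f}%, persistent absence: {persistent:.1f}%")
# -> overall absence: 8.3%, persistent absence: 33.3%
```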

*In the autumn term, we will use unpublished ‘provisional’ key stage 2 and key stage 5 data to support some inspections. The data published alongside the report cards for these inspections will therefore not include this unpublished data at the time of first release. We will update the data after the DfE publishes the ‘revised’ key stage 2 and key stage 5 datasets.

FE and skills providers: learner context

  • Number of learners on education programmes for young people at time of inspection: the number of 16- to 18-year-olds currently at the provider taking part in education and training; this will include a small number of 19-year-olds, or those aged 19 to 25 with an EHC plan, who are on a study programme
  • Number of learners on adult learning programmes at time of inspection: the number of adults, aged 19 and over, currently taking part in education and training at the provider
  • Number of apprentices at time of inspection: the number of apprentices aged 16 and over at the provider
  • Number of learners receiving high-needs funding at time of inspection: the number of learners and apprentices aged 16 and over, currently at the provider, who are receiving high-needs funding, with an EHC plan

FE and skills providers: performance data 

  • 16 to 18 overall achievement rate: the proportion of learners aged 16 to 18 who were due to complete their education and training qualification that year and achieved it
  • 19+ overall achievement rate: the proportion of learners aged 19 and over who were due to complete their education and training qualification that year and achieved it
  • Apprenticeship pass rate: the proportion of apprentices who completed their apprenticeship and achieved a pass
  • Apprenticeship overall achievement rate: the proportion of apprentices who were due to complete their apprenticeship that year and achieved it

ITE providers: provider context

  • Number of partners by phase of education: the number of partner organisations the provider works with to deliver training across the phases of early years, primary, secondary and FE

ITE providers: trainee context

  • Number of trainees on early years teacher training: the number of trainees preparing to teach in the early years phase currently studying with the provider
  • Number of trainees on primary teacher training: the number of trainees preparing to teach in the primary phase currently studying with the provider
  • Number of trainees on secondary teacher training: the number of trainees preparing to teach in the secondary phase currently studying with the provider
  • Number of trainees on FE and skills teacher training: the number of trainees preparing to teach in the FE and skills phase currently studying with the provider

In addition to the data above, the provider’s page on our Find an inspection report site (where the report card will be published) will continue to contain an ‘About this school/setting/provider’ section. This section includes up-to-date information, such as the provider’s address and provision type.

Annex for figure

Data for Figure 1: Placing a school into a category of concern

  • Step 1: Has any evaluation area, other than leadership, been graded as ‘urgent improvement’, or has safeguarding been graded as ‘not met’? If yes, go to step 2a; if no, go to step 2b.
  • Step 2a: Has leadership and governance been graded as ‘urgent improvement’? If yes, the school is placed in special measures; if no, it requires significant improvement.
  • Step 2b: Has leadership been graded as ‘urgent improvement’? If yes, the school requires significant improvement; if no, it is not placed in a category of concern.

See Figure 1.
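
To show how these steps combine, here is a minimal sketch of the same decision logic in code. It is illustrative only: the function name and the example evaluation-area labels are invented for this sketch, and it is not part of the published framework or operating guides.

```python
# Illustrative sketch of the Figure 1 decision steps. The grade label
# 'urgent improvement' matches the report card; safeguarding 'not met' is
# represented by a boolean, and the area names used below are examples only.

def category_of_concern(grades: dict[str, str], safeguarding_met: bool) -> str:
    """Apply the Figure 1 steps to a set of evaluation-area grades."""
    leadership_urgent = grades.get("leadership and governance") == "urgent improvement"
    other_area_urgent = any(
        grade == "urgent improvement"
        for area, grade in grades.items()
        if area != "leadership and governance"
    )

    # Step 1: any area other than leadership at 'urgent improvement', or safeguarding not met?
    if other_area_urgent or not safeguarding_met:
        # Step 2a: is leadership and governance also at 'urgent improvement'?
        return "special measures" if leadership_urgent else "requires significant improvement"

    # Step 2b: leadership at 'urgent improvement' on its own?
    return "requires significant improvement" if leadership_urgent else "no category of concern"


# Example: one evaluation area at 'urgent improvement', leadership graded higher,
# safeguarding met -> 'requires significant improvement'.
print(category_of_concern(
    {"leadership and governance": "expected standard", "achievement": "urgent improvement"},
    safeguarding_met=True,
))
```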

  1. The Education Act 2005 says that the Chief Inspector’s school inspection reports must cover:

    • the achievement of pupils
    • the quality of teaching
    • the quality of the leadership in and management of the school
    • the behaviour and safety of pupils

    and must also consider:

    • the spiritual, moral, social and cultural [personal] development of pupils at the school
    • the extent to which the education at the school meets the needs of the range of pupils at the school, and in particular the needs of pupils who have SEND (covered by the inclusion evaluation area)

    In the past, we did not always directly reflect these duties in the names of our evaluation areas or sub-evaluations, but they have always been covered by our school reports. 

  2. Ofsted’s approach to inclusion:

    Inclusive providers are at the heart of their communities. They have high expectations and aspirations for every child and learner. They are particularly alert to the needs of those who may require the most support to achieve well and have positive experiences of education, including those who are disadvantaged, those with SEND, those who are known (or previously known) to children’s social care, and those who may face other barriers to their learning and/or well-being.

    They recognise that barriers to learning and well-being are dynamic and not always fixed traits – and that these arise from multiple interacting factors at individual, family, provider, community and societal levels. 

    Leaders set a clear and ambitious vision for inclusion at the provider. They put this at the core of their planning and policies and communicate it to children, learners, staff, and parents and carers. They create a culture in which every child and learner belongs, and feels safe, welcomed and valued. They make sure that all children and learners access a high-quality education, taught by experts with high ambition who strive to develop every child and learner’s potential. They encourage all to participate in wider enrichment opportunities, so that all children and learners can achieve, belong and thrive.

    Providers identify needs early, showing compassion and curiosity to identify those who experience hidden vulnerabilities. They make reasonable adjustments, including ensuring that the learning environment is accessible and supportive. They support transitions between phases and, where appropriate, deliver evidence-based, targeted support for those who need it.

    Providers work in a close and effective partnership with parents and carers and other agencies to secure the best possible outcomes for every child and learner, regardless of their starting points. 

  3. This will ensure that we provide a window into our overall activity and can set out what we are finding at an aggregated level, broken down into key groups such as provider type. We will process and handle this information in line with the code of practice for statistics, ensuring appropriate quality and using methodologies that best meet users’ needs for timely and publicly available information. 

  4. We determine the timing of monitoring inspections using the evaluation areas that were identified as needing to improve. For example, if only one or two areas need to improve, a monitoring visit is likely to take place fairly soon after the full inspection. If most evaluation areas are graded as ‘needs attention’, then a full inspection may be more suitable, but it may come later. The allocated inspector and leaders will discuss the best timing of a monitoring visit. However, we expect all schools with ‘needs attention’ grades to receive a monitoring inspection, usually no later than 2 years after we publish the report card from their full inspection.