Improving education inspections: methods statement
Published 9 September 2025
Applies to England
Strands of evidence
Between 3 February and 28 April 2025, we ran a 12-week public consultation on improving the way Ofsted inspects education. We collected a wide range of evidence through our:
- online consultation survey – this was open to everyone
- focus group discussions – these were for groups of stakeholders (inspectors and sector professionals), to allow more in-depth discussion of relevant topics
- YouGov polling of parents/carers – to help us to hear from a representative sample of parents/carers
- test visits with providers – to test the toolkits we consulted on and our proposed inspection methodology across a range of providers
These strands complemented one another and fed into our consultation response.
We wanted to know what the public and professionals think about our proposed plans for inspecting schools and other settings where children and adults learn. The specific remits are:
- early years
- state-funded schools
- non-association independent schools
- further education (FE) and skills providers
- initial teacher education (ITE) providers
Our focus was on the following main areas:
- report cards, including grading scales
- education inspection toolkits – evaluation areas that we will focus on in inspections and how we will assess and grade providers
- inspection methodology – changes to how we carry out inspection
- potential impact of the proposed reforms on workload and well-being, anticipated unintended consequences, and what could help reduce or manage unintended consequences
The changes to the above areas subsequently led to changes to our education inspection framework documents for the given remits.
Ethics
We gained ethical approval for all strands led by Ofsted (consultation survey, focus group discussions and test visits with providers) from our research ethics committee. This committee includes external academic experts.
Participation was voluntary. Each participant or respondent had a right to refuse to participate, stop taking part or decide not to respond to some questions or discussion topics.
Consultation survey
The survey questions were written in plain English and produced in a format compatible with screen readers to aid accessibility. We did not use colours.
The survey was anonymous for individual respondents. Only those responding on behalf of an organisation were asked to name the organisation. IP addresses were removed from the data that we analysed.
The demographic data was stored separately. It was also separately analysed, aggregated and reported in the Respondents and participants section. In line with standard research ethics guidelines, responses were not used to identify any respondent.
Our privacy notice also informed respondents that we may use artificial intelligence software to support our analysis process.
We asked respondents and participants not to provide information that could identify them or lead to their identification, or to disclose personal information about someone else. The privacy notice made it clear that respondents and participants should not use this consultation to report any complaints, or safety and safeguarding concerns. They were informed of the appropriate channels for doing so.
Focus groups and roundtables
For the Ofsted-run focus groups, we informed participants that they could withdraw at any time up until the focus group or roundtable finished. After that, they would not be able to withdraw their contributions, which would be anonymised. We explained all this in the information sheet for participants, as well as at the beginning of each focus group or roundtable.
Focus group and roundtable facilitators reviewed all notes or session transcripts to make sure that no participant was identifiable.
YouGov-run research
YouGov[footnote 1] followed its own ethical processes for this research. It also followed the ethical guidelines of the Market Research Society, the Economic and Social Research Council, the Association of Qualitative Research and the Social Research Association. YouGov’s approach to safeguarding and ethics aligns with the Government Social Research Ethical Assurance for Social and Behavioural Research guidance.
YouGov communicated the purpose of the research to participants and provided relevant opt-outs. Data collection was based on voluntary informed participation. Participants could withdraw consent at any point. Anonymisation and pseudonymisation techniques were used to protect individuals’ privacy rights, in line with the General Data Protection Regulation and all relevant data protection standards.
The survey was designed to be accessible to all respondents from a variety of devices. Accessibility was also ensured in the qualitative strand. For example, participants could choose a method of interviewing where necessary (such as video- or audio-only interviews, or telephone or face-to-face interviews).
Test visits
Participation in the test visits with providers was also voluntary. Each provider had a right to refuse to participate, to stop taking part, or to decide not to respond to the follow-up survey. We told providers that we would use the information we gathered during the visit to develop future changes to how we inspect. They were also assured that the visit would remain separate from, and have no impact on, any future inspection outcomes. We made it clear that we would not share the information from the visits with anyone outside Ofsted, unless it was appropriate in relation to safeguarding and/or required by law.
In the surveys sent to inspectors and providers after the test visits, we asked for the name of the school/provider and the respondent’s role in the school/provider. This was to help us to analyse data and look for any trends between respondents. Data was aggregated and pseudonymised for analysis and reporting, meaning that it could not be linked to individuals or to individual schools/providers. Providers were informed that their school/provider would not be named or be identifiable in any way in our summary of insights or training materials. Providers were also informed that we would not publish a standalone report based on the test visits.
Data collection
Consultation survey
The online survey included 102 open-ended questions, covering all 5 education remits that Ofsted inspects; the remaining questions were fixed-choice questions about respondents’ backgrounds. Respondents could choose which questions they answered.
All respondents were asked to answer questions on how we propose to report inspection findings. We were also specifically interested in the views of education professionals and organisations on our education inspection toolkits, the changes to inspection methodology, and any potential impacts of the changes on their workload or well-being. Respondents could decide whether they wanted to comment on 1 remit or more.
Ofsted-run focus group discussions and a roundtable
The purpose of the focus groups was to complement the consultation survey and to allow for more detailed discussion of themes relevant to particular groups. The questions were designed to correspond to those asked in the consultation survey, with some adaptations to meet the needs of specialist groups (such as employers).
There was a total of 107 participants. For the early years, schools and ITE remits, we carried out separate focus-group discussions with practitioners and with inspectors. In FE and skills, we also ran focus-group discussions with employers and employer providers. For special educational needs and/or disabilities (SEND), we ran a roundtable with SEND sector representatives.
Ofsted regional or remit teams put forward names of inspectors who might be interested in taking part, or providers who might be willing to send practitioners. Further practitioners were recruited by asking for expressions of interest from attendees of Ofsted’s consultation webinars. For the SEND roundtable, the Ofsted inclusion team put forward a list of key stakeholders from across the sector who were invited to attend. Ofsted also worked with the Association of Employment and Learning Providers to invite their members to focus groups. The participants were recruited on a voluntary basis, with no incentives.
The focus groups were conducted by Ofsted staff. The exception was focus-group discussions with employers and employer providers. These were run by the Association of Employment and Learning Providers.
YouGov polls and focus-group discussions
Ofsted commissioned YouGov to survey parents through a poll and through text-based focus-group discussions. The poll was used to collect quantitative data from a representative sample of parents through fixed-choice questions, while the discussions were used to obtain qualitative data. The focus was on parents’ perceptions of the current way of reporting inspection outcomes and the proposed report card, including comprehension and reactions to the layout, colours, 5-point scale and proposed evaluation areas.
The parent poll ran between 5 and 11 March 2025. YouGov conducted 5 interactive text-based focus groups with parents in March 2025, each comprising 7 to 10 participants. A further 2 text-based focus groups took place in April 2025, each comprising 9 parents of children in early years (aged 0 to 5).
Test visits
Alongside the consultation, we tested our proposed new approach by visiting a range of early years providers, state-funded and independent schools, FE and skills providers and ITE providers.
Through these visits we explored the following things:
- feasibility – to understand how well the proposed inspection toolkits and inspection methodology supported inspectors to collect evidence
- validity – to understand the extent to which the proposed toolkits enabled inspectors to collect evidence on the most important aspects of education
- alignment – to explore how aligned inspectors were in their understanding of the draft revised framework
- scalability – to understand how well the evaluation areas in the proposed toolkits worked together and whether they covered the right areas, included everything they needed to, and avoided unnecessary or unhelpful repetition
These test visits allowed us to explore and understand how the proposed inspection methodology, evaluation areas and toolkits would work in practice, and to gather the views of providers and inspectors once they had experienced a test visit. The visits helped us to identify issues and improvements that might be needed before any changes were implemented. They also gave inspectors an early opportunity to become familiar with the main features of the proposed new approach.
Across all the phases of test visits, inspectors visited a total of 287 providers across England who had volunteered to take part. This included early years providers (86), state-funded schools (143), independent schools (26), FE and skills providers (24) and ITE providers (8). Early years providers were visited by Early Years Regulatory Inspectors and other remits by His Majesty’s Inspectors. A small number of Ofsted Inspectors (17) took part in test visits in state-funded schools.
These voluntary visits took different forms and happened in different phases.
Phase 1: Thematic and paired thematic visits
Thematic visits
A thematic visit was a limited visit to a provider, to test the feasibility and validity of specific aspects of the proposed methodology and toolkits. The focus of these visits differed across the remits.
Paired thematic visits
These visits happened in the early years, state-funded schools, and FE and skills remits.
- In early years and state-funded schools, they took the form of a limited visit to providers by more than 1 inspector.
- In FE and skills, an extra inspector shadowed the lead inspector on some of the test visits.
These visits allowed us to test how well aligned inspectors were on the process they would follow if the proposals were implemented, given the evidence they were collecting. They also showed us the extent to which inspectors shared views on the provider’s strengths and areas for improvement.
Phase 2: Test visits
Test visits tested the whole proposed inspection methodology and toolkits (excluding safeguarding and testing of new digital systems or proposed report cards/grading, due to development timeframes). Inspectors were generally on site for the same period of time as they would be during a ‘normal’ inspection. Inspection teams were the size they would usually be for the provider’s type and size.
Phase 3: Grading test visits
Grading test visits took place once the consultation had closed, using the toolkit that was consulted on. These visits tried to replicate a fuller inspection experience and therefore included grading conversations and report writing. The focus was on evaluating the proposed grading process and report writing.
An additional inspector joined a small number of these visits (10 in total: 4 in early years, 3 in state-funded schools, 1 in independent schools and 2 in FE and skills) to test a revised version of the proposed toolkit. This inspector shadowed the lead inspector, observing inspection activity and noting their own evidence based on this version of the proposed toolkit. They used the proposed revised toolkit to guide their evaluations and grading decisions, but they did not discuss their evaluations with the provider, nor did the provider see this version of the toolkit during the visit.
This allowed us to make an early evaluation of the proposed revised toolkit. It also enabled us to start assessing how well this toolkit supported inspectors with their evaluations, compared with the version of the toolkit included in the consultation survey.
Phase 4: Revised toolkit grading test visits
Grading test visits of the proposed revised toolkit took place in June and early July 2025, once insights from the earlier grading test visits had been analysed and further revisions made to the version of the toolkits included in the consultation survey. As with the earlier grading test visits, these replicated a fuller inspection experience under the proposed new regime. Inspectors had full grading discussions with providers and insights from these discussions fed into the proposed new style of report writing.
As with earlier visits, inspectors did not evaluate safeguarding. These visits tested the feasibility, validity and scalability of the proposed revised toolkits, and fed into further toolkit revisions, operating guides and inspector training.
Following each visit, inspectors and providers completed an online survey. Their feedback allowed us to reflect on the process, methodology, effectiveness and usability of the proposed toolkits. Insights from the thematic visit surveys allowed us to make changes to the methodology and toolkit ahead of the test visits. Insights from test visits allowed us to consider additional changes as well as identify training and development needs for inspectors. Inspectors were also invited to take part in several focus groups that were held in addition to the survey.
Further, the evidence collected by inspectors during 19 test visits (5 in early years, 5 in state-funded schools, 3 in independent schools, 3 in FE and skills and 3 in ITE) and 14 revised toolkit grading visits (8 in early years, 4 in state-funded schools, 1 in independent schools and 1 in FE and skills) was quality assured.
All the insights gathered in Phase 4 fed into our response to the consultation.
Data analysis
Consultation survey
Closed-question responses
The only closed questions in the survey collected information on respondents’ demographic characteristics. Counts and percentages relating to those questions were calculated without weighting.
Hybrid approach to analysing free-text responses
All 102 questions asking for views on the proposed changes to inspection were open-ended. There was no limit to response length. As this yielded a vast amount of qualitative data, we adopted a hybrid approach to analysis: manual analysis and artificial intelligence (AI) analysis.[footnote 2]
First, we carried out a manual analysis of a subset of responses to each question, drawing on social research expertise to identify emerging themes. We then fed the top 2 themes per question to the AI, asking it to set those aside and find any other themes we might have missed. We used AI to analyse all submitted responses. Finally, we integrated the AI findings with the manual findings. The steps we took in the manual and AI phases are explained further in the next 2 subsections.
Manual analysis of free-text responses
We used thematic analysis for open-ended responses to identify main and recurrent themes.
We created an initial coding framework based on a sample of early responses. The coding framework evolved as the research team began to apply it to more data. We made sure the researchers’ work was consistent by providing an induction, making checks and putting support systems in place (for example having daily meetings, creating a group support chat and reviewing early coding samples).
Each response was first allocated a sentiment code to indicate the overarching sentiment: positive, negative, mixed, neutral or irrelevant (where the response was not relevant to the question). Each response was then allocated at least 1 thematic code. Coding allowed us to synthesise information across all the coded responses and identify recurrent themes. We did not include any information that could identify respondents.
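To make the coding scheme concrete, here is a minimal sketch of how a coded response could be represented in Python. The field names, question labels and theme codes are illustrative assumptions, not Ofsted’s internal schema.

```python
from dataclasses import dataclass
from typing import List

SENTIMENT_CODES = {"positive", "negative", "mixed", "neutral", "irrelevant"}

@dataclass
class CodedResponse:
    """One free-text answer with its manual codes (illustrative schema only)."""
    response_id: str    # anonymised identifier, never the respondent's identity
    question_id: str    # hypothetical question label, e.g. "Q12"
    sentiment: str      # exactly 1 overarching sentiment code
    themes: List[str]   # at least 1 thematic code

    def __post_init__(self):
        if self.sentiment not in SENTIMENT_CODES:
            raise ValueError(f"unknown sentiment code: {self.sentiment}")
        if not self.themes:
            raise ValueError("each response needs at least 1 thematic code")

# Example: a response that welcomes report cards but raises workload concerns
r = CodedResponse("r0001", "Q12", "mixed", ["report cards", "workload"])
```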
In some cases, we broke down the analysis by groups of respondents.
For most questions on reporting, we compared findings from the following 2 groups:
- parents/carers (who are not education professionals)
- everyone else (education professionals who are also parents/carers, education professionals who are not parents/carers and all other respondent roles)[footnote 3]
We specifically wanted to tease out the views of parents/carers given that they are the main users of report cards.
For the question on ‘What do you think about our evaluation areas?’, we compared findings across the following respondent groups.
- Early years:
- parents/carers (who are not education professionals)
- everyone else
- State-funded and independent schools:
- parents/carers (who are not education professionals)
- education professionals (regardless of their parental status)
- everyone else
- FE and skills:
- parents/carers (who are not education professionals)
- learners
- everyone else
For all questions in the state-funded schools section of the survey (on the toolkits, inspection methodology and impact), the respondent breakdown was the following:
- education professionals
- everyone else
This breakdown from the manual analysis could not be replicated in the AI analysis, because we did not have enough respondents in the ‘everyone else’ group. Therefore, we treated ‘education professionals’ and ‘everyone else’ as a single group in the AI analysis, to make sure that all responses were analysed.
We did not break respondents down for the other questions in the survey. Instead, we analysed all respondents as a single group. The only exceptions were questions in the early years section of the survey, where we split respondents into 2 groups: childminders and everyone else.
AI analysis of free-text responses
We only used AI to analyse responses from subgroups of respondents when there were sufficient responses (250 or more) to a specific free-text question. For questions with fewer than 250 responses, manual analysis continued until all responses were analysed.
Before we analysed the data using AI, we replaced common acronyms, such as ‘SEND’, with their full forms so that we could better identify topics and themes across the survey responses during subsequent text analysis.
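A minimal sketch of this pre-processing step, assuming a simple lookup table (the full acronym list used in the analysis is not published here):

```python
import re

# Illustrative acronym map; the consultation's actual list is not published.
ACRONYMS = {
    "SEND": "special educational needs and/or disabilities",
    "FE": "further education",
    "ITE": "initial teacher education",
}

# Word boundaries (\b) stop short acronyms matching inside ordinary words.
_ACRONYM_RE = re.compile(r"\b(" + "|".join(map(re.escape, ACRONYMS)) + r")\b")

def expand_acronyms(text: str) -> str:
    """Replace known acronyms with their full forms before topic analysis."""
    return _ACRONYM_RE.sub(lambda m: ACRONYMS[m.group(1)], text)

print(expand_acronyms("SEND support in FE settings varies."))
# -> "special educational needs and/or disabilities support in further education settings varies."
```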
We used OpenAI’s GPT-4o and Meta’s Llama-3-70B large language models (LLMs).
For each free-text question with 250 or more responses, we prompted OpenAI’s GPT-4 Turbo LLM to ignore the 2 most prevalent topics identified through the manual analysis (because we already had in-depth analysis of these) and then to summarise the responses into the 10 most frequently occurring topics, with a description of each topic. Access to this LLM was provided through Microsoft’s Azure OpenAI Service, which does not share data with third parties, and the LLM was accessed through Ofsted computers.
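As an illustration of this step, the sketch below shows how such a prompt could be issued through the Azure OpenAI Service using the openai Python SDK. The deployment name, environment variables and prompt wording are assumptions for illustration; the statement does not publish the exact prompt used.

```python
import os
from openai import AzureOpenAI  # openai>=1.x SDK

# Endpoint, key and deployment name are placeholders, not Ofsted's configuration.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def extract_topics(responses: list[str], manual_themes: list[str]) -> str:
    """Ask the model for the 10 most frequent topics, setting aside themes
    already covered by the manual analysis."""
    prompt = (
        "Summarise the survey responses below into the 10 most frequently "
        "occurring topics and describe each topic. Ignore these topics, which "
        f"have already been analysed: {', '.join(manual_themes)}.\n\n"
        + "\n---\n".join(responses)
    )
    completion = client.chat.completions.create(
        model="gpt-4-turbo",  # the Azure *deployment* name; hypothetical here
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content
```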
LLMs can only process a limited number of words at a time; GPT-4 Turbo has a limit of around 100,000 words. Where a question received a higher volume of text, the responses were split into batches of roughly equal size. Consequently, as the number of responses to a question grew, so did the number of AI-generated topics. To analyse each topic, we split the responses into sentences and prompted Meta’s open-source Llama-3-70B LLM to assign each sentence to 1 of the topics.
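A minimal sketch of the batching logic, assuming a conservative word budget below the model’s limit; the budget value and function name are illustrative:

```python
def batch_by_word_budget(responses: list[str], budget: int = 90_000) -> list[list[str]]:
    """Split responses into batches that each fit within the model's word limit."""
    batches: list[list[str]] = []
    current: list[str] = []
    words = 0
    for response in responses:
        n = len(response.split())
        if current and words + n > budget:
            batches.append(current)   # budget reached: start a new batch
            current, words = [], 0
        current.append(response)
        words += n
    if current:
        batches.append(current)
    return batches

# Each batch is summarised separately, which is why high-volume questions
# ended up with more AI-generated topics than low-volume ones.
```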
Validation of AI analysis
Categorising each sentence allowed us to identify which topics were prevalent and which were not. We considered a topic prevalent if at least 50 respondents, or 5% or more of the total sample, were assigned to the topic by the AI.
Every prevalent AI-generated topic was validated by a team of data science and social research professionals to check that the output genuinely reflected the survey responses. If 2 or more topics were not prevalent but clearly overlapped, we merged them together. The merged topic was validated if it met the threshold of at least 50 respondents or 5% or more of the total respondent pool.
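The prevalence rule lends itself to a direct check. A minimal sketch follows, with the thresholds taken from the text above and the function name illustrative:

```python
def is_prevalent(assigned: int, total_sample: int) -> bool:
    """A topic is prevalent with at least 50 respondents, or 5% or more of the sample."""
    return assigned >= 50 or assigned / total_sample >= 0.05

print(is_prevalent(40, 700))         # True: 40/700 is about 5.7% of the sample
print(is_prevalent(30, 1_000))       # False: under 50 respondents and only 3%
# A merged topic is re-tested against the same threshold:
print(is_prevalent(18 + 35, 2_000))  # True: two overlapping topics merged to 53 respondents
```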
Each AI-generated topic plus the corresponding sentences in the survey responses were combined into an output document. Researchers validated the AI-generated topic by reading at least 50 sentences to determine whether the topic was valid. Any topics that did not pass this validation were discarded.
Responses from individuals and organisations
We received responses from individuals and responses submitted on behalf of organisations. We analysed these together. Given that organisational responses may represent the views of more than 1 respondent, we also read and considered organisational responses separately. All these responses fed into our consultation response.
During manual analysis, we found that many individual respondents copied a pro-forma response to Ofsted’s consultation that the National Association of Head Teachers (NAHT) provided to its members.[footnote 4] We carefully analysed the NAHT organisational response, as well as all the individual responses that combined elements of the NAHT pro-forma response with other views these individuals wanted to share. However, we did not individually analyse the responses of another group of over 500 respondents who either replicated the NAHT pro-forma word-for-word or followed it closely.[footnote 5] Nevertheless, we made sure that the scale of support for the NAHT organisational views was accounted for.
During the AI analysis, we also removed individual responses that were direct copies of the NAHT pro-forma response, so that they did not diminish the quality of the AI analysis. Using an automated process, we identified over 500 respondents whose responses were direct copies.
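As a rough illustration of how such an automated check can work, the sketch below flags responses that match the pro-forma text once case, punctuation and whitespace are normalised. The statement does not describe Ofsted’s actual matching process, so this is an assumption about one plausible approach, with stand-in text:

```python
import re

def normalise(text: str) -> str:
    """Lower-case and strip punctuation/extra whitespace so trivial edits don't hide a copy."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def is_direct_copy(response: str, pro_forma: str) -> bool:
    return normalise(response) == normalise(pro_forma)

PRO_FORMA = "We believe the proposed report card will increase workload."  # stand-in text
responses = [
    "We believe the proposed  report card will increase workload!",  # copy with trivial edits
    "My own view is that the toolkits need more detail.",            # original response
]
print(sum(is_direct_copy(r, PRO_FORMA) for r in responses))  # 1
```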
Focus-group discussions
We carried out a thematic analysis of the Ofsted-run focus group discussions to identify key and recurrent themes, as well as themes specific to particular groups of participants.
YouGov analysed qualitative focus-group discussion data to identify key and recurrent themes.
Test visits
Survey and focus-group responses were analysed on a weekly basis. Quantitative data was analysed for each question to establish inspector/provider sentiment and to identify areas of strength and weakness in the proposed methodology and toolkit. Qualitative open responses and focus-group data were reviewed and then used to help the team understand patterns emerging in the quantitative data. Findings from across all the test visits were then analysed together to provide an overall view across all the visits and across the remits.
The evidence reviews focused on the evidence collected, to check how inspectors had triangulated, evaluated and synthesised it during the test visit. We evaluated inspectors’ use of key inspection activities, such as start-of-day walks, joint observations, document reviews and hearing learners’ voices, as well as the consideration inspectors had given to the context of each provision. The reviews also explored the breadth and depth of the evidence collected, to evaluate its sufficiency. These findings were then merged with findings from the surveys and focus groups to provide an overall set of findings for internal use.
Respondents and participants
Consultation survey – demographic characteristics
We heard from 6,556 respondents with a wide range of demographic characteristics.
- Age: the majority of respondents who answered the question about age were between 35 and 54 (1,973; 63%), with 21% (647) aged 55 to 64, and 5% (150) aged 65 or over. Approximately 8% (251) were under 35 (14 to 34). Over half (3,416; 52%) did not answer the question.
- Sex: of the 3,143 who responded to this question, most (2,140; 68%) identified as female, and 896 (29%) as male. A small number (18; less than 1%) preferred to self-describe, and 89 (3%) preferred not to say.
- Sexual orientation: most of those who answered this question (2,586; 85%) identified as straight/heterosexual, while 137 (5%) were gay or lesbian, 40 bisexual (1%), and 39 (1%) preferred to self-describe; 255 (8%) preferred not to say, and 3,499 (53%) gave no response.
- Special educational needs (SEN): only 122 respondents to this question (4%) considered themselves to have SEN, while 2,871 (93%) said they did not; 99 (3%) preferred not to say, and 3,464 (53%) did not respond.
- Disability: a minority of those who responded to this question reported having a disability (255; 7%), while the majority (2,775; 89%) said they did not, and 107 (3%) preferred not to say. 3,449 (53%) did not respond.
- Ethnicity: of 3,068 respondents to this question, 95% (2,905) identified as white; 2% (65) were Asian or Asian British. The remaining groups represented 1% of respondents each: 34 identified as Black, Black British, Caribbean or African; 44 as mixed or multiple ethnic groups; and 20 as other ethnic groups.
- Religion/belief: the most common religion/belief was Christian (1,472; 48%), followed by none (1,185; 39%). Small numbers reported other religions, each under 1% of respondents: Muslim (24), Jewish (13), Hindu (10), Sikh (10), and Buddhist (9). 284 (9%) preferred not to say, and over half of respondents (3,486; 53%) did not respond.
Consultation survey – respondent roles
In terms of their role, most survey respondents were education professionals (see Table 1, Survey data column). The rest were: parents; local authority staff; social care and health-care professionals; charity, third sector, academic, research or policy professionals; members of the public; children/learners; Ofsted inspectors and other Ofsted staff; and various organisations.
As respondents could select more than 1 role, we allocated each respondent to a primary role only, to facilitate breakdowns as part of qualitative analysis. Allocation was guided by which role was likely to dominate a respondent’s answers, given the survey’s subject matter of education inspection. For example, we assumed that ‘education professional’ would be the main role shaping the answers of a teacher or headteacher who was also a parent/carer (see Table 1, Primary assigned role column).
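One way to implement this kind of allocation is a fixed precedence list, as in the sketch below. The statement gives the principle but not an explicit ordering, so the precedence shown is a hypothetical example:

```python
# Hypothetical precedence: earlier roles dominate when a respondent ticks several.
ROLE_PRECEDENCE = [
    "Education professional",
    "Current or former education or regulatory inspector",
    "Local authority staff",
    "Parent/carer",
    "Child/learner",
    "Member of the public",
]

def primary_role(selected: list[str]) -> str:
    """Assign the highest-precedence role; fall back to the first selection."""
    for role in ROLE_PRECEDENCE:
        if role in selected:
            return role
    return selected[0] if selected else "Prefer not to say"

print(primary_role(["Parent/carer", "Education professional"]))
# -> "Education professional", matching the teacher-who-is-also-a-parent example above
```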
Table 1: Respondents’ roles (values rounded to nearest whole %)
Respondent role | Survey data* | Primary assigned role |
---|---|---|
Education professional | 4,899 (75%) | 4,809 (73%) |
Parent/carer | 1,354 (21%) | 699 (11%) |
Other (please specify)** | 405 (6%) | z |
Current or former education or regulatory inspector | 321 (5%) | 334 (5%) |
Charity, third sector, academic, research or policy professional | 204 (3%) | 120 (2%) |
Chief executive officer or chair of an association | 201 (3%) | 118 (2%) |
Member of the public | 198 (3%) | 127 (2%) |
Local authority staff | 198 (3%) | 101 (2%) |
Other: (Governor) | z | 96 (1%) |
Prefer not to say | 67 (1%) | 63 (1%) |
Child/learner | 63 (1%) | 62 (1%) |
Health-care professional | 18 (<1%) | 5 (<1%) |
Social care professional | 15 (<1%) | 9 (<1%) |
Non-inspection staff of an inspectorate or regulator*** | 15 (<1%) | z |
Organisational response | z | 13 (<1%) |
Total respondents | 6,556 | 6,556 |
* Multiple responses allowed. Percentages are based on total number of respondents (n=6,556).
** Responses entered under ‘Other’ were reviewed individually. If no other predefined options were selected, the write-in response was analysed and, where possible, mapped to one of the existing predefined categories.
*** Due to the low number of selections for this role, respondents who chose it were reassigned to other roles, as appropriate (such as parents).
Note: z means not applicable.
Consultation survey – workplace of education professionals
We asked education professionals to specify the type of provider they work in. Most education professionals (63%) reported working in a state-funded school, while the rest worked in an early years setting or childcare provider, an FE and skills provider, an ITE/training provider, or an independent school (see Figure 1).
Figure 1: Please tell us where you work (based on 4,899 respondents who identified as an education professional)
A more detailed breakdown of respondents’ workplaces across several remits is displayed in Table 2 below.
Table 2: Workplace of education professionals by provider type (based on 4,899 respondents who identified as an education professional in our survey)
Provider type | Count (%) |
---|---|
Maintained primary school | 1,228 (25%) |
Maintained secondary school | 233 (5%) |
Maintained special school | 102 (2%) |
Maintained junior school | 68 (1%) |
Maintained nursery school | 28 (1%) |
Pupil referral unit | 14 (<1%) |
Maintained all-through school | 10 (<1%) |
Maintained middle school | 6 (<1%) |
Total of maintained schools (including community, foundation and voluntary schools) | 1,689 (34%) |
Academy or free primary school | 787 (16%) |
Academy or free secondary school | 485 (10%) |
Academy or free special school | 71 (1%) |
Academy or free junior school | 37 (1%) |
Academy or free all-through school | 29 (1%) |
Alternative provision | 13 (<1%) |
Academy or free middle school | 11 (<1%) |
Total of academy or free schools (including studio schools and technology colleges) | 1,433 (29%) |
Multi-academy trust, diocese or other school group/chain | 320 (7%) |
Other (please specify) | 276 (6%) |
Prefer not to say | 71 (1%) |
Online education provider | 8 (<1%) |
Children’s social care setting | 4 (<1%) |
Total of other | 679 (14%) |
Childcare on non-domestic premises | 249 (5%) |
Childminder who works alone | 212 (4%) |
Childminder who works with other childminders and/or assistants | 88 (2%) |
Childcare on domestic premises | 20 (<1%) |
Before and/or after school provider | 4 (<1%) |
Total of registered early years and/or childcare providers | 573 (12%) |
General FE/tertiary college | 140 (3%) |
Independent learning provider | 72 (1%) |
Higher education institution | 34 (1%) |
Sixth-form college | 34 (1%) |
Local authority adult education service | 28 (1%) |
Independent specialist college | 25 (1%) |
Prison or young offender institution | 17 (<1%) |
16 to 19 academy | 7 (<1%) |
Total of FE and skills providers | 357 (7%) |
Higher education institution | 54 (1%) |
School-centred initial teacher training provider | 19 (<1%) |
General FE/tertiary college | 8 (<1%) |
Independent learning provider | 6 (<1%) |
Total of ITE/initial teacher training providers | 87 (2%) |
Independent special school | 35 (1%) |
Independent primary school | 14 (<1%) |
Independent secondary school | 12 (<1%) |
Independent all-through school | 11 (<1%) |
Independent alternative provision | 6 (<1%) |
Independent residential special school or boarding school | 3 (<1%) |
Total of independent schools | 81 (2%) |
Total of all providers | 4,899 (100%) |
Of the 4,249 education professionals who answered the question on the ethos of their workplace, 954 (22%) work in a school or organisation with a faith ethos. Most respondents (3,295; 78%) work in a school or organisation that does not have a faith ethos.
Most education professionals from state-funded schools who responded to our survey work in a primary school (68%). The next largest proportion comes from secondary schools (23%). These percentages broadly reflect the national percentages (see Table 3).
Table 3: Breakdown of the education professionals’ workforce by phase of education – state-funded schools
Phase of education | Survey respondents* | National data** |
---|---|---|
All-through | 39 (1%) | 17,848 (2%) |
Middle school | 17 (1%) | 4,554 (1%) |
Nursery | 28 (1%) | 4,079 (<1%) |
Primary | 2,120 (68%) | 523,329 (58%) |
Secondary | 718 (23%) | 273,243 (30%) |
Special | 200 (6%) | 86,954 (10%) |
Total | 3,122 (100%) | 910,007 (100%) |
* Based on the workplace of 3,122 respondents who identified as education professionals working in the state-funded school sector (out of 4,899 education professionals who completed the survey).
** The national figures include teachers, teaching assistants and leadership non-teachers in state-funded nursery, primary, all-through, secondary and special (including alternative provision and pupil referral unit) schools in England in November 2023. They are drawn from the School Workforce Census 2023, accessed on 28 May 2025.
Most respondents from state-funded schools work in local authority (LA)-maintained schools (54%), while 46% work in academies or free schools. This is the reverse of the national picture (see Table 4).
Table 4: Breakdown of the education professionals’ workforce by school type – state-funded schools
School type | Survey respondents* | National data** |
---|---|---|
Academies/free schools | 1,433 (46%) | 494,213 (54%) |
LA-maintained | 1,689 (54%) | 415,794 (46%) |
Total | 3,122 (100%) | 910,007 (100%) |
* Based on the workplace of 3,122 respondents who identified as education professionals working in the state-funded school sector (out of 4,899 education professionals who completed the survey).
** The national figures include teachers, teaching assistants and leadership non-teachers in state-funded nursery, primary, all through, secondary, special, alternative provision and pupil referral unit schools in England in November 2023. They are drawn from School Workforce Census 2023, accessed on 28 May 2025.
Ofsted-run focus-group discussions and a roundtable
Table 5 displays numbers and groups of participants across the remits.
Table 5: Focus group discussions and participants
Attendees | Early years | Schools | FE and skills | ITE | SEND | Total |
---|---|---|---|---|---|---|
Inspectors | 5 | 5 | 9 | 7 | 7 | 33 |
Practitioners | Childminders: 4; Nurseries: 6 | Primary: 5; Secondary: 4 | 14 | 9 | 9 | 51 |
SEND sector representatives | z | z | z | z | 11 | 11 |
Employers or employer providers | z | z | Employers: 6; Employer providers: 6 | z | z | 12 |
Total number of participants | 15 | 14 | 35 | 16 | 27 | 107 |
Note: z means not applicable.
Where sampling was necessary, this was conducted to make sure we had a regional spread and a variety of providers and practitioner roles.
YouGov polls and focus-group discussions
The YouGov poll sample consisted of 1,090 parents in England aged 18 and above. To ensure a representative sample, quotas were set during fieldwork, and results were weighted by age, marital status, social grade, gender and region. Parents working in the education sector were excluded.
The sample of parents for the focus groups included a mix of ages, genders, social grades and regions across England. Parents from ethnic minority backgrounds were included. Participants had mixed levels of awareness of, and attitudes towards, Ofsted reporting. This was the group breakdown:
- Group 1 – parents of primary school children
- Group 2 – parents of secondary school children
- Group 3 – parents working in education
- Group 4 – parents of children at schools most recently graded outstanding or good
- Group 5 – parents of children at schools most recently graded requires improvement or inadequate
- Group 6 – parents of children aged 0 to 5, currently at nursery
- Group 7 – parents of children aged 0 to 5, currently at registered childminders
Test visits
Test visits aimed to cover as many different provider types as possible within each remit. All the provider types within each remit had at least 1 visit.
We tested the proposed revised framework in a variety of settings and provider types. This included:
in early years:
- nurseries
- pre-schools
- out-of-school settings
- sessional care
- childminders
in state-funded schools:
- maintained nursery schools
- primary schools (including small primary schools with fewer than 150 pupils, infant and junior schools)
- secondary schools
- all-through schools
- special schools
- alternative provision
in non-association independent schools:
- special schools
- faith-based schools
- other independent schools
in FE and skills:
- general FE colleges
- specialist FE colleges
- independent learning providers
- independent specialist colleges
- higher education providers
- sixth-form colleges
in ITE:
- higher education institutions
- school-centred initial teacher training (SCITT)
The tables below detail the number of visits in each phase of testing and the number of inspector and provider responses to the survey. ITE is excluded in Table 9 as there were no relevant visits in the timeframe.
Table 6: Phase 1 (thematic visits) – cross-remit visit numbers and survey response rates
Provider type | Total visits | Inspector responses | Provider responses |
---|---|---|---|
Total cross-remit | 85 | 109 | 62 |
State-funded schools | 38 | 60 | 26 |
Independent schools | 6 | 4 | 4 |
Early years | 35 | 31 | 29 |
FE and skills | 3 | 6 | 3 |
ITE | 3 | 8 | 0 |
Table 7: Phase 2 (test visits) – cross-remit visit numbers and survey response rates
Provider group | Total visits | Inspector responses | Provider responses |
---|---|---|---|
Total cross-remit | 110 | 267 | 99 |
State-funded schools | 58 | 153 | 59 |
Independent schools | 13 | 12 | 8 |
Early years | 23 | 22 | 17 |
FE and skills | 13 | 63 | 13 |
ITE | 3 | 17 | 2 |
Table 8: Phase 3 (grading test visits) – cross-remit visit numbers and survey response rates
Provider group | Total visits | Inspector responses | Provider responses |
---|---|---|---|
Total cross-remit | 44 | 98 | 41 |
State-funded schools | 19 | 42 | 20 |
Independent schools | 5 | 6 | 4 |
Early years | 14 | 11 | 13 |
FE and skills | 4 | 28 | 3 |
ITE | 2 | 11 | 1 |
Table 9: Phase 4 (revised toolkit grading test visits) – cross-remit visit numbers and survey response rates
Provider group | Total visits | Inspector responses | Provider responses |
---|---|---|---|
Total cross-remit | 48 | 104 | 38 |
State-funded schools | 28 | 67 | 22 |
Independent schools | 2 | 4 | 2 |
Early years | 14 | 13 | 12 |
FE and skills | 4 | 20 | 2 |
Footnotes

1. YouGov is ISO27001:2022 certified and holds Cyber Essentials Plus certification. It is a company member of the Market Research Society and the British Polling Council. ↩
2. It is becoming increasingly common across government to analyse free-text responses with assistance from the government’s AI-powered consultation data analysis tool. ↩
3. See ‘Consultation survey – respondent roles’ in the ‘Respondents and participants’ section for a further description of our survey respondents and breakdowns. ↩
4. ‘NAHT responds to Ofsted’s consultation on changes to school inspection in England’, NAHT, April 2025. ↩
5. These closely aligned responses typically showed only minor deviations – such as slight rewording or shortening – without introducing new or substantive content. Submissions that demonstrated clear individual input, such as original commentary or significant modifications, were retained for analysis. ↩