© Crown copyright 2019
This publication is licensed under the terms of the Open Government Licence v3.0 except where otherwise stated. To view this licence, visit nationalarchives.gov.uk/doc/open-government-licence/version/3 or write to the Information Policy Team, The National Archives, Kew, London TW9 4DU, or email: email@example.com.
Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.
This publication is available at https://www.gov.uk/government/publications/risk-assessment-methodology-for-maintain-schools-and-academies/risk-assessment-methodology-good-and-outstanding-maintained-schools-and-academies
This is the risk assessment process that Ofsted uses to help schedule inspections of good and outstanding maintained schools and academies.
We use risk assessment to ensure that our approach to inspection is proportionate and to focus our efforts where they can have the greatest impact. We also use risk assessment to identify good and exempt outstanding schools whose performance gives us cause for concern. It is important to note that the risk assessment process is not used in any way to pre-judge inspection outcomes. Inspectors do not have access to the risk assessments when inspecting schools.
For a number of years, we have used statistical models to ensure proportionate inspection. Our new methodology has not changed this and we believe it will improve our capacity to identify concerns about performance. This is a note on the methodology applied using 2018 published data. The methodology will be applied to school inspections from the summer term of the 2019 academic year. This note will be updated annually.
The risk assessment process
Risk assessment has 2 stages:
- stage 1 involves an assessment of each school based on analysis of published data
- stage 2 involves a more in-depth ‘desk-based’ review of a wider range of available information
At stage 2, Senior Her Majesty’s Inspectors (SHMI) review potential inspections to ensure that the most appropriate inspection type (if any) is carried out.
Stage 1: analysis of published historic data
Our methodology is unchanged this year. Once again we are using a technique known as ‘supervised machine learning’. This is a way of getting computers to make decisions without being explicitly programmed with rules. A common application is classifying items into two or more groups.
In a typical application, there will be a large dataset called ‘training data’ for which we already know which groups the items belong to. This is used to train the machine learning algorithm to classify new, unseen items. For example, a ‘spam’ filter can be trained by giving it lots of emails that users have marked as spam and lots of non-spam emails; the algorithm works out the differences between them.
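The spam-filter example above can be sketched in a few lines of code. This is a generic illustration of supervised learning, not part of Ofsted's methodology: the training emails and the simple naive Bayes classifier here are illustrative assumptions.

```python
# A minimal, self-contained sketch of supervised learning in the spirit
# of the spam-filter example. The training data and classifier are
# illustrative assumptions, not Ofsted's model.
import math
from collections import Counter

def train(labelled_emails):
    """Count words per class and emails per class from (text, label) pairs."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    class_counts = Counter()
    for text, label in labelled_emails:
        word_counts[label].update(text.lower().split())
        class_counts[label] += 1
    return word_counts, class_counts

def classify(text, word_counts, class_counts):
    """Return the class with the higher add-one-smoothed log probability."""
    vocab = set(word_counts["spam"]) | set(word_counts["ham"])
    scores = {}
    for label in ("spam", "ham"):
        n_words = sum(word_counts[label].values())
        # log prior + sum of log word likelihoods
        score = math.log(class_counts[label] / sum(class_counts.values()))
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / (n_words + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Hypothetical training data: emails already labelled by users.
training = [
    ("win a free prize now", "spam"),
    ("free cash click now", "spam"),
    ("meeting agenda attached", "ham"),
    ("lunch tomorrow with the team", "ham"),
]
word_counts, class_counts = train(training)
print(classify("claim your free prize", word_counts, class_counts))  # spam
```

The point is simply that the algorithm is never given explicit rules; it learns the distinguishing word patterns from the labelled examples alone.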
Machine learning applied to inspection outcomes
Known inspection outcomes over the 2017/18 academic year were retrospectively predicted using a machine learning algorithm. The training dataset consisted of inspection outcomes and a wide range of other data that was available when the inspections took place.
The data sources are:
- progress and attainment data from the Department for Education
- school workforce census data
- Parent View responses
The machine learning algorithm combined these data to find an optimal fit to the known inspection outcomes. The outcomes were coded as ‘Good or better’ or ‘Less than good’.
One drawback to machine learning can be that the predictions vary slightly according to the training dataset used.
To help overcome this, we fitted a large number of models to slightly different training sets. In this way, we effectively produced a probability that a forthcoming inspection will result in a less than good judgement. This is our ‘raw risk score’, which takes a value between 0 and 1.
It’s important to remember that:
- this algorithm is only used at stage 1 of the risk assessment process and SHMI reviews follow on from this
- in no way do the algorithm results impact on or determine inspection judgements
Additional risk information
Additional information is incorporated into the risk assessment process, but is not included in the machine learning algorithm. This data is used to ‘fine-tune’ risk assessments.
The additional information used is:
- qualifying complaints about schools
- schools with high levels of pupil mobility
- time since last inspection, and inspection framework inspected under, for schools exempt from routine inspection
Inclusion criteria for stage 1 of the risk assessment
Schools with cohorts of fewer than 11 pupils at the relevant key stage are not included in stage 1 of the risk assessment.
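As a sketch, the inclusion rule above amounts to a simple filter. The school records and field names here are hypothetical; only the threshold of 11 pupils comes from the rule itself.

```python
# A minimal sketch of the stage 1 inclusion rule: schools whose cohort
# at the relevant key stage has fewer than 11 pupils are excluded.
# The school records and field names here are hypothetical.
schools = [
    {"name": "School A", "cohort_size": 45},
    {"name": "School B", "cohort_size": 8},   # fewer than 11: excluded
    {"name": "School C", "cohort_size": 11},  # exactly 11: included
]

MIN_COHORT = 11  # minimum cohort size for inclusion in stage 1

included = [s for s in schools if s["cohort_size"] >= MIN_COHORT]
print([s["name"] for s in included])  # ['School A', 'School C']
```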
Stage 2: desk-based review
SHMI within each region review the information provided by stage 1 of the risk assessment process. They also review risk by considering:
- the outcomes of any inspections, such as survey inspections, that we have carried out since the last routine inspection
- qualifying complaints about the school referred to us by parents
- statutory warning notices
- any other significant concerns that are brought to our attention
Timing of inspections
For further information on the use of risk assessment and the timing of inspections, please refer to the school inspection handbook.