Apprenticeship training provider accountability framework and specification
Updated 28 January 2026
Applies to England
This apprenticeship accountability framework (AAF) and its specification inform our assessment of the quality of provision delivered by apprenticeship training providers.
We expect training providers to routinely review and assess the quality and success of their individual programmes. We expect them to act proactively to improve their provision and shape their apprenticeship offer. This framework supports training providers to:
- review their own performance
- identify potential quality issues
- support their continuous improvement
Changes to the AAF: January 2026
From the end of January 2026, this framework will incorporate several changes to ensure its indicators remain focused on current and meaningful measures of provider quality.
Ofsted thresholds have been updated to align with the revised Ofsted reporting arrangements introduced in November 2025. These changes reflect updates to inspection outcomes and judgement structures. We are suspending 3 supplementary indicators:
- breaks in learning
- end-point assessment organisation data
- off-the-job training
The apprentices past planned end date indicator has been refined to focus more precisely on a key area of risk. We have also removed the distinction between quality and supplementary indicators.
The indicators being removed no longer provide sufficient value, in their current form and method of calculation, to support the early identification of provider risk or underperformance.
The underlying areas these indicators relate to remain important. We will continue to monitor the relevant data and keep our approach under review.
Any future inclusion of indicators will be informed by improved measures that more accurately reflect provider quality and support effective risk management.
Accountability policy for apprenticeship training providers
The quality of training is a key factor in whether an apprentice completes their apprenticeship and progresses successfully.
The Department for Education (DfE) aims to:
- protect the experience of all apprentices, increase achievements and improve quality
- ensure that pockets of poor provision do not affect the overall performance of apprenticeships or the programme brand
- enable employers to realise the full benefits of apprenticeship training and ensure provision meets workforce needs
This framework sets out a range of quality indicators with defined thresholds that training providers can use to support self-improvement.
Where performance falls below expected levels, you should expect DfE to review your data as part of its risk-based oversight approach.
Thresholds within this framework are used to highlight potential risks to the quality of a provider’s apprenticeship provision and to determine whether a management review is required. Crossing a threshold does not result in automatic intervention or contractual action. Thresholds are used to inform risk identification and any next steps are considered on a case-by-case basis, in line with the approach set out later in this document.
The thresholds reflect our minimum expectations and should not be used as targets or aspirations. Providers with a large number of apprentices should expect closer monitoring and performance management to support continuous improvement.
We will:
- use the framework to assess performance throughout the academic year – this will ensure that if we need to take intervention action, it is timely and proportionate
- make full use of the range of existing contractual measures detailed in your apprenticeship provider agreement
- evaluate each case on its own merits – this may include acting in cases where performance compares poorly with other providers delivering similar provision where improvement conversations have not led to progress, or where non-priority provision is being delivered
- take into account contextual factors that are evidently impacting a provider’s performance or data, ensuring that decisions are informed by a full understanding of the provider’s operating environment
Framework policy principles
Quality is the foundation of a successful apprenticeship system. Every apprentice should have access to training that equips them with the skills, knowledge and behaviours needed for their chosen career. High-quality provision:
- supports achievement and progression for apprentices
- ensures employers have a skilled workforce
- delivers value for government investment by addressing skills gaps and driving economic growth
The AAF is designed to identify and respond to poor-quality provision. It prioritises quality before growth and holds providers to transparent standards that reflect the maturity and ambition of the programme.
The framework is underpinned by 5 principles.
Data-driven
We use a wide range of quality indicators to give us a rounded overview of provider delivery.
Risk-based
We focus on providers where quality concerns may exist, so we can intervene to protect the interests of apprentices.
Encourage self-improvement
We aim to:
- identify risks to the quality of apprenticeship provision and learner outcomes early
- protect the interests of apprentices
- support self-improvement while making clear that responsibility for improvement rests with the provider
Timeliness
We monitor performance data throughout the academic year to enable early conversations and interventions. Providers should also review their own data regularly.
Proportionality
We take proportionate action based on the level of risk to apprentices and employers, the capacity of providers to improve, and contextual factors.
Indicators we use to review provider performance
From the end of January 2026, this framework uses a single set of quality indicators to review provider performance. The previous distinction between quality and supplementary indicators has been removed to simplify the framework and improve clarity.
Reviews will assess the latest available data for each indicator for the current academic year and the previous academic year (where applicable).
We will continue to keep the indicators and their thresholds under review to ensure they remain effective in identifying low-quality provision and enable targeted oversight.
Performance classifications
To support consistent and transparent assessment of apprenticeship training providers, this framework uses 3 performance classifications. These apply across all indicators and inform the level of engagement and oversight by DfE, helping to ensure that any intervention is timely, proportionate and focused on protecting the interests of apprentices. These classifications are:
At risk
Provider performance falls significantly below expected thresholds. This would normally trigger a performance review and management conversation, which may lead to the application of one of the contractual measures detailed in the apprenticeship provider agreement.
Needs improvement
Where provider performance falls below the relevant threshold, it may indicate a risk to apprentices and employers and require further attention, with improvement to be realised by the provider. Account managers may initiate a management conversation to better understand the reasons for underperformance and may request supporting evidence. You may be asked to develop and implement an improvement plan, or other proportionate contractual controls may be considered where appropriate. Where a provider remains in this category over time without demonstrable improvement, more stringent intervention may be considered.
On track
Provider performance meets or exceeds all required quality indicator thresholds. No performance review is triggered.
Interventions and contractual remedies
Where a provider is classified as being ‘at risk’ or as ‘needs improvement’ in relation to the quality indicators, we may take action including contractual interventions and remedies as set out in your apprenticeship provider agreement.
The various interventions and actions include, but are not limited to, those set out in Range of interventions.
Quality indicators and thresholds
Outcomes from Ofsted inspection
Ofsted inspections assess the quality of apprenticeship training to ensure it:
- meets the needs of apprentices and employers, and supports learner progress
- reflects effective leadership and governance
Inspection outcomes are used within this framework to inform DfE’s assessment of risk and provider performance classification.
Revised inspection framework
As of November 2025, Ofsted inspections of apprenticeship provision are conducted under the updated education inspection framework and toolkit for further education and skills. The only exceptions are the level 5 learning and skills teacher and level 6 teacher apprenticeship standards, which are inspected under the ITE inspection toolkit.
We will consider training providers to be ‘at risk’ if Ofsted issues:
- an ‘urgent improvement’ judgement for ‘leadership and governance’ or ‘inclusion’ at whole-provider level
- a ‘not met’ judgement for ‘safeguarding’ at whole-provider level
- an ‘urgent improvement’ judgement for any provision-type level evaluation areas for apprenticeships
We’ll consider organisations as ‘needs improvement’ if Ofsted issues:
- a ‘needs attention’ judgement for ‘leadership and governance’ or ‘inclusion’ evaluation areas at whole-provider level
- a ‘needs attention’ judgement for any provision-type level evaluation area for apprenticeships
- an ‘insufficient progress’ assessment at a new-provider monitoring inspection

We may prioritise providers for review under the accountability framework if they’ve not yet been inspected by Ofsted, or if account managers assess providers as having made insufficient progress from their last Ofsted inspection.
DfE shall not take into account any assessment by Ofsted in relation to the Ofsted evaluation area that is concerned with ‘contribution to meeting skills needs’.
Providers not yet inspected under the November 2025 Ofsted framework
Training providers that have not yet been inspected under the November 2025 Ofsted framework will continue to be classified under their most recent judgement using the inspection outcome from the previous framework until a new inspection has taken place.
We’ll consider organisations to be ‘at risk’ if Ofsted judges them to be:
- inadequate for ‘apprenticeships’
- inadequate for ‘overall effectiveness’ under its further education (FE) and skills remit, where there is no separate apprenticeship judgement
The contractual action we may take in these circumstances is set out in your apprenticeship provider agreement.
We may prioritise providers under the accountability framework if they’ve not yet been inspected by Ofsted, or if they’ve been inspected by Ofsted and received:
- a ‘requires improvement’ judgement at full inspection
- an ‘insufficient progress’ assessment at a new-provider monitoring inspection
Apprenticeship achievement rate
Apprenticeship achievement rates are calculated as part of qualification achievement rates (QARs).
We’ll assess organisations with an all-age apprenticeship achievement rate:
- of less than 50% as ‘at risk’
- greater than or equal to 50% and less than 60% as ‘needs improvement’
In-year QAR releases are provisional but will be used within the AAF to help identify providers that may be at risk. These updates are reflected in the accountability framework dashboard in line with the main QAR dashboard releases:
- R10 in June
- R12 in August
- R14 in January and March
Foundation apprenticeships
Foundation apprenticeships were introduced in August 2025 and we recognise that providers will need time to adapt their delivery models and approaches during this first year.
For this reason, foundation apprenticeship achievement rates will not be included in a provider’s QAR for the apprenticeship accountability framework for the 2025 to 2026 academic year. Achievement rates for foundation apprenticeships will, however, continue to be published in the national achievement rate tables as part of the overall programme.
This does not limit DfE’s ability to take action under the apprenticeship provider agreement where there are wider concerns about compliance, delivery or quality in relation to foundation apprenticeships. DfE reserves its contractual rights accordingly.
We will:
- monitor the early indicators of quality and performance at provider and programme level for foundation apprenticeships
- protect the integrity of the apprenticeship system and ensure high quality outcomes for all apprentices
We will continue to work with providers to ensure that:
- they have the support they need to deliver high-quality training
- apprentices benefit from a positive and meaningful apprenticeship experience
Apprenticeship retention rate
Apprenticeship retention rates are calculated as part of QARs.
We’ll assess organisations with an all-age apprenticeship retention rate:
- of less than 52% as ‘at risk’
- greater than or equal to 52% and less than 62% as ‘needs improvement’
In-year QAR releases are provisional but will be used within the AAF to help identify providers that may be at risk. These updates are reflected in the accountability framework dashboard in line with the main QAR dashboard releases:
- R10 in June
- R12 in August
- R14 in January and March
Apprentice feedback
This is collected through an apprentice’s ‘My apprenticeship’ account on the apprenticeship service.
Information on how the apprentice feedback score is calculated is available.
In their apprenticeship service provider accounts, providers can see single academic year scores for the past 5 years, an average score based on scores for the past 5 years and the detail behind their scores.
To ensure we only review recent performance, the framework will use the most recent academic year rating for assessment against the threshold. This will be displayed on the accountability framework dashboard, to which providers should refer.
Organisations with apprentice feedback of less than 2.5 will be assessed as ‘needs improvement’.
Employer feedback
This is collected through a questionnaire. Information on how the employer feedback score is calculated is available.
In their apprenticeship service provider accounts, providers can see single academic year scores for the past 5 years, an average score based on scores for the past 5 years and the detail behind their scores.
To ensure we only review recent performance, the framework will use the most recent academic year rating for assessment against the threshold. This will be displayed on the accountability framework dashboard, to which providers should refer.
Organisations with average employer feedback of less than 2.5 will be assessed as ‘needs improvement’.
Apprentices past planned end date (APPED)
This indicator now focuses solely on apprentices who are continuing (or intending to continue) training and are past their planned training end date. Prolonged extension of training can indicate barriers to timely completion, reduced momentum and an increased risk of non‑completion.
Apprentices in this group will be identified using Individualised Learner Record (ILR) data by selecting apprentices whose planned end date has passed and who remain in learning (ILR Completion Status Code 1).
We will assess organisations as:
- ‘at risk’ if greater than or equal to 15% of apprentices are past their planned end date and have not completed the learning activities leading to the learning aim
- ‘needs improvement’ if greater than or equal to 10% and less than 15% of apprentices are past their planned end date and have not completed the learning activities leading to the learning aim
Withdrawals
This indicator refers to apprentices who have withdrawn from their learning activities prior to completion.
We’ll assess organisations as:
- ‘at risk’ if their withdrawals are greater than or equal to 20% of the total number of apprentices
- ‘needs improvement’ if their withdrawals are greater than or equal to 15% and less than 20% of the total number of apprentices
Review process
We will continually monitor provider performance against the indicators. We may contact you at any point in the academic year to discuss performance that falls below thresholds or diverges from improvement trajectories agreed with account managers.
If a management conversation is initiated, the relevant data will be shared with the provider and reviewed together to ensure transparency and accuracy.
All data used within this framework is drawn from published sources and data that providers already own or can access, including information presented in the accountability framework dashboard.
Before a review
Account managers will identify any provider that is ‘at risk’ or ‘needs improvement’. They will contact you to set out where data indicates provision has fallen below any thresholds.
The account manager will usually arrange a management review.
If there are any mitigating factors that can be evidenced, share these with your account manager promptly.
During the management review
The account manager will review provider data against the framework’s quality indicators.
The discussion or correspondence is focused on where your data indicates performance is below required thresholds. We’ll discuss:
- evidenced reasons for performance falling below the specified thresholds
- your track record, previous performance and responses to earlier issues
- your capacity to realise improvement, including actions already taken, when you expect to see impact, and progress in relation to improvement plans or targets we have previously agreed with you, as well as your ability to accurately forecast and meet such targets
- how your performance compares with sector or standard benchmarks (where appropriate)
- whether your expected timeframe for improvement is realistic and reasonable, given the scale of the issue, and its impact on apprentices and employers
- contextual factors, including the extent to which your provision aligns with government priorities, relative market share and wider considerations, such as the number of other providers operating in the relevant sector or area and the quality of comparable provision
The account manager may ask for further information. You should provide this within the agreed time period.
After the review
Following the management conversation, the account manager will inform you in writing if any follow-up action is required.
We will consider all relevant factors discussed at the meeting. If, based on this review, we judge that your performance against the framework thresholds reflects issues within your control, we may consider appropriate intervention in line with your apprenticeship provider agreement.
Range of interventions
This framework does not, in itself, exercise contractual powers or create legal rights or obligations. It operates as a risk identification and engagement framework to inform discussion, oversight and decision making. Any contractual action is taken separately, in accordance with the terms of the apprenticeship provider agreement, and only following consideration of the individual circumstances of each case.
Your apprenticeship provider agreement sets out the full range of intervention actions we may take. These include but are not limited to:
- requiring you to produce an improvement plan with associated targets and demonstrate your ability to achieve the necessary improvements within a realistic and appropriate timeframe
- restricting your ability to recruit new apprentices (either for specific or for all occupational standards)
- restricting your sub-contracting arrangements
- withholding or suspending funding for a fixed or indefinite period
- capping funding for delivery of new standards, for a fixed or indefinite period
- limiting growth while improvement is demonstrated
- putting in place other contractual conditions as appropriate
- terminating your contract, where necessary
No intervention
If you are ‘on track’ against all indicators, we’ll not contact you to arrange a performance review but may engage you to highlight and showcase good practice.
There may be instances where a provider is flagged as ‘at risk’ or ‘needs improvement’ for one or more indicators, but we determine that intervention action is not required. In all cases, decisions will be based on a review of the data and any evidenced contextual factors, ensuring that oversight remains proportionate and informed.
Contextual factors
We’ll take relevant evidenced contextual factors into account when reviewing your performance. These include, but are not limited to:
Benchmarked QARs
Benchmarked QAR data will be reviewed as part of the management conversation. It will support the evaluation of each case through more relevant and transparent comparison of performance with providers delivering similar provision and therefore facing similar sectoral challenges.
Benchmarked data will not lead to arbitrary action. Recognising that providers delivering similar provision may be impacted by different provider-level challenges, we’ll consider wider contextual information – for example, delivery location, learner and employer profiles, and cohort or provider size – when reviewing benchmarked data with you.
Further information on benchmarked QAR data can be found in Annex 3.
Provider information
We aim to expand the data available on the AAF dashboard to aid self-improvement, inform management conversations and improve transparency. Future additions may include data on market share, geographical coverage and employer and learner characteristics. A timetable for implementation will be confirmed once ongoing work is complete, and we expect to provide an update in early 2026.
Apprentices with protected characteristics and from disadvantaged backgrounds, including care leavers
We’ll consider the profile of a provider’s cohort when we review their performance.
Every apprentice deserves excellence in their training provision, and all those considering an apprenticeship need to know that apprenticeships are accessible and inclusive.
We recognise that some apprentices, due to their background or protected characteristics, may require additional support and time to achieve. We will take this into account when reviewing your performance data, while ensuring all apprentices have access to high-quality provision that enables success.
The provider guide to delivering high-quality apprenticeships provides guidance on the additional funding and support available to support different cohorts to achieve.
Small or new apprenticeship provision
When deciding on intervention action for underperforming providers, we’ll consider whether they:
- have small cohorts
- offer new or immature provision
We expect these providers to set realistic improvement targets as a priority. We will challenge them on reasonable progress and evidence of impact with a particular focus on ensuring quality is embedded before any growth is considered. Smaller or new providers are brought into the market to address identified gaps, and we expect them to establish a credible track record in the provision for which they were given market access before considering expansion into other areas.
English and maths
We will track the numbers of apprentices aged 19 and over undertaking English and maths qualifications, including withdrawals and pass rates. This monitoring helps us ensure that the flexibility introduced for adult apprentices is being used appropriately, and that any emerging trends – such as high numbers of starts without corresponding completions – are identified early.
It also supports oversight of the quality and effectiveness of English and maths provision, given its importance in enabling apprentices to progress and achieve.
Data timeliness and accuracy
It’s important that your data accurately reflects the true status and progress of your apprentice population and performance at any point in the year. Your provider agreement sets out your obligations with regard to the accurate and timely reporting of data. Any deliberate misreporting or manipulation of data is considered a serious breach of the agreement and likely to result in contractual action.
You can use the following tools to test your data’s credibility:
- funding information service
- provider data self-assessment toolkit (PDSAT)
- post-16 monitoring reports dashboard in view your education data
- analyse FE data tool (AFED) – this includes an overarching dashboard for the quality view
Complaints and feedback
You can complain about an intervention through our customer help portal, if you’re unable to resolve this with your account manager directly.
Annex 1: AAF quality indicator thresholds
| ID | Indicator | At risk | Needs improvement |
|---|---|---|---|
| 001 | Outcomes from Ofsted reports | Received ‘not met’ judgement for safeguarding. Or received ‘urgent improvement’ judgement for any ‘whole provider’ level or ‘provision-type’ level evaluation area. | Received a ‘needs attention’ judgement for any ‘whole provider’ level or ‘provision-type’ level evaluation area for apprenticeships. Or received an ‘insufficient progress’ assessment at a new-provider monitoring inspection. |
| 002 | Achievement rates – part of qualification achievement rates (QAR) | less than 50% | greater than or equal to 50% and less than 60% |
| 003 | Retention rates – part of QAR | less than 52% | greater than or equal to 52% and less than 62% |
| 004 | Withdrawals | greater than or equal to 20% | greater than or equal to 15% and less than 20% |
| 005 | Employer feedback | N/A | average feedback less than 2.5 |
| 006 | Apprentice feedback | N/A | average feedback less than 2.5 |
| 007 | Apprentices past planned end date (APPED) | greater than or equal to 15% of apprentices past their planned end date who have not completed the learning activities leading to the learning aim (Completion Status 1) | greater than or equal to 10% and less than 15% of apprentices past their planned end date who have not completed the learning activities leading to the learning aim (Completion Status 1) |
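The rate-based rows of the table above are simple range checks. The following sketch illustrates how they combine, assuming a provider’s percentages are already calculated; the function names are hypothetical and not part of the framework.

```python
# Illustrative sketch of the Annex 1 rate thresholds. Higher-is-better
# indicators (achievement, retention) and lower-is-better share indicators
# (withdrawals, APPED) are checked in opposite directions.

def classify_rate(value, at_risk_below, needs_improvement_below):
    """Classification for a 'higher is better' rate indicator."""
    if value < at_risk_below:
        return "at risk"
    if value < needs_improvement_below:
        return "needs improvement"
    return "on track"

def classify_share(value, needs_improvement_at, at_risk_at):
    """Classification for a 'lower is better' share indicator."""
    if value >= at_risk_at:
        return "at risk"
    if value >= needs_improvement_at:
        return "needs improvement"
    return "on track"

# Achievement rate (ID 002): at risk below 50%, needs improvement below 60%
print(classify_rate(48.0, 50, 60))   # at risk
# Retention rate (ID 003): at risk below 52%, needs improvement below 62%
print(classify_rate(55.0, 52, 62))   # needs improvement
# Withdrawals (ID 004): needs improvement from 15%, at risk from 20%
print(classify_share(17.5, 15, 20))  # needs improvement
# APPED (ID 007): needs improvement from 10%, at risk from 15%
print(classify_share(9.0, 10, 15))   # on track
```

Note that, as the policy sections above make clear, crossing a threshold informs risk identification only and does not trigger automatic intervention.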
Annex 2: Definitions used for quality indicators
Data source
This framework uses multiple data sources to assess provider quality. Each source contributes to different indicators within the framework:
Individualised Learner Record (ILR)
Most quality indicators are derived from ILR data submitted via the Submit Learner Data service, which enables DfE-funded providers to validate and submit their data. This includes information on learner status, achievement, retention and withdrawals.
Ofsted
Inspection outcomes are sourced directly from Ofsted, not the ILR. These reflect the quality of apprenticeship training as assessed under Ofsted’s inspection frameworks.
Apprentice and employer feedback
Feedback data is collected through surveys completed on the apprenticeship service. Apprentices and employers are invited to provide feedback at regular intervals after the apprenticeship begins. This data is not part of the ILR.
Note: Only the latest ILR return is used to generate the AAF. Outputs are available on view your education data.
ILR-derived indicators: denominator definition
For all ILR-derived indicators, the denominator is defined as the total number of apprentices within the academic year, regardless of completion status.
This includes:
- new starts
- existing apprentices
- apprentices on both apprenticeship standards and frameworks
We exclude apprentices who do not meet the qualifying period of a minimum of 42 days. This is set out in the apprenticeship funding rules. We may monitor the volume of leavers in the first 42 days of apprenticeships in future updates.
Technically, this covers records with:
- an Aim Type of 1 (‘programme aim’)
- a Programme type of:
- 2 (‘advanced level apprenticeship’)
- 3 (‘intermediate level apprenticeship’)
- 20 (‘higher apprenticeship – level 4’)
- 21 (‘higher apprenticeship – level 5’)
- 22 (‘higher apprenticeship – level 6’)
- 23 (‘higher apprenticeship – level 7+’)
- 25 (‘apprenticeship standard’)
- a Funding model of:
- 35 (‘adult skills’)
- 36 (‘apprenticeships’)
- 81 (‘other adult’)
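The denominator filter above can be sketched as follows. The field names mirror the ILR fields described in this annex, but the dictionary layout and the `days_in_learning` helper field are assumptions for illustration only.

```python
# Hypothetical sketch of the ILR denominator filter: programme aims on
# apprenticeship programme types and funding models, past the 42-day
# qualifying period.

APPRENTICESHIP_PROG_TYPES = {2, 3, 20, 21, 22, 23, 25}
APPRENTICESHIP_FUND_MODELS = {35, 36, 81}
QUALIFYING_DAYS = 42  # minimum qualifying period from the funding rules

def in_denominator(record):
    """True if an ILR record counts towards the indicator denominator."""
    return (
        record["aim_type"] == 1  # programme aim
        and record["prog_type"] in APPRENTICESHIP_PROG_TYPES
        and record["fund_model"] in APPRENTICESHIP_FUND_MODELS
        and record["days_in_learning"] >= QUALIFYING_DAYS
    )
```

In use, the denominator would be the count of qualifying records, for example `sum(in_denominator(r) for r in records)`.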
Ofsted
Ofsted inspections assess the quality of apprenticeship training to ensure:
- it is high-quality and meets the needs of employers and apprentices
- apprentices learn, develop and make progress as they should
- providers continuously improve
- providers are accountable for the public funding received for apprenticeships
From the point a provider begins delivering funded apprenticeship training, they are subject to Ofsted inspection. Ofsted inspects all apprenticeships under their further education and skills framework, except the:
- level 5 learning and skills teacher apprenticeship standard
- level 6 teacher apprenticeship standard

Ofsted inspects these 2 apprenticeships under their initial teacher education framework.
Ofsted also inspects all level 6 and level 7 apprenticeships, including degree apprenticeships.
A detailed summary of what to expect from inspections is available in Ofsted’s guide to inspections for providers.
In November 2025, Ofsted implemented a renewed education inspection framework, including a new toolkit for further education and skills.
Note: Ofsted ratings are not currently included in the apprenticeship accountability dashboard. Providers can access their report on the Ofsted website.
Qualification achievement rate (QAR)
Qualification achievement rates (QARs) measure provider performance and quality in a given academic year. They are calculated using ILR data to determine the proportion of learning aims successfully completed, expressed as a percentage.
For further information, refer to:
- Introduction to Qualification Achievement Rates (QARs)
- Qualification achievement rates 2024 to 2025 guidance
Achievement rate
The achievement rate is calculated as learning aims achieved as a percentage of the total number of aims in the cohort.
Retention rate
The retention rate is calculated as learning aims recorded as complete as a percentage of the total number of aims that ended.
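The two rate definitions above are straightforward percentages. This simplified sketch shows the arithmetic only; the full QAR methodology applies further rules and exclusions that are not reproduced here.

```python
# Simplified sketch of the QAR achievement and retention calculations.

def achievement_rate(aims_achieved, total_aims_in_cohort):
    """Learning aims achieved as a percentage of all aims in the cohort."""
    return 100 * aims_achieved / total_aims_in_cohort

def retention_rate(aims_completed, total_aims_ended):
    """Aims recorded as complete as a percentage of aims that ended."""
    return 100 * aims_completed / total_aims_ended

# For example, 48 achieved aims out of a cohort of 100 gives a rate of 48%,
# which would fall below the framework's 50% 'at risk' threshold
print(achievement_rate(48, 100))  # 48.0
```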
Apprentice and employer feedback
Feedback from apprentices and employers is managed by the apprenticeship service.
The apprenticeship service automatically invites both apprentices and employers to provide feedback on the training delivered by their training provider. This process begins 3 months after the apprenticeship starts and continues throughout its duration.
Apprentice feedback
Training providers receive 2 types of feedback from apprentices:
1. Overall rating
Apprentices are asked to rate the quality of training from very poor to excellent. These ratings are displayed as a star rating out of 4.
2. Detailed ratings
Apprentices also rate specific aspects of the training, including:
- the structure of the training
- communication from the training provider
- how prior learning is taken into account
These ratings are shown as a bar chart.
If no apprentice feedback is visible, it means fewer than 5 apprentices have responded. This threshold helps protect the anonymity of those providing feedback.
Employer feedback
Training providers receive 2 types of feedback from employers:
1. Overall rating
Employers are asked to rate the quality of training from very poor to excellent. These ratings are displayed as a star rating out of 4.
2. Detailed ratings
Employers also provide feedback on specific aspects of the training. Areas they can rate include:
- your training facilities
- how you carry out initial assessments
- your communication with employers
These ratings help identify strengths and areas for improvement from the employer’s perspective.
Apprentices past planned end date (APPED)
This indicator identifies apprentices who have passed their planned end date but have not yet completed their learning.
This definition uses the following fields from the ILR:
- completion status of 1 (‘the learner is continuing or intending to continue the learning activities leading to the learning aim’)
- learning actual end date
- learning planned end date
The definition also uses a field called ILR Freeze Date. From R01 in September to R11 in July, this is the date that the ILR data collection for that period closed, as defined in column B of the ILR freeze schedule.
For R12 in August through to R14 in October, this date is coded to 31 July, as the data collection closes following the end of the academic year.
Total
Set to 1 if either of the following conditions is true.
If the Completion Status is 1, the Learning Actual End Date is not populated, and the ILR Freeze Date is at least 1 day after the Learning Planned End Date.
If the Completion Status is 1, the Learning Actual End Date is populated, and the Learning Actual End Date is at least 1 day after the Learning Planned End Date.
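The two conditions above can be sketched in code. This is an illustrative Python sketch only, not the official calculation: the function name and parameter names are assumptions, and the real derivation works over ILR records rather than individual values.

```python
from datetime import date
from typing import Optional

def apped_flag(completion_status: int,
               planned_end: date,
               actual_end: Optional[date],
               ilr_freeze: date) -> int:
    """Illustrative APPED flag: 1 if the apprentice is past their planned
    end date but has not completed, per the two conditions above."""
    if completion_status != 1:  # must be continuing (status 1)
        return 0
    if actual_end is None:
        # Condition 1: no actual end date, and the ILR freeze date is at
        # least 1 day after the planned end date
        return 1 if (ilr_freeze - planned_end).days >= 1 else 0
    # Condition 2: the actual end date is at least 1 day after the
    # planned end date
    return 1 if (actual_end - planned_end).days >= 1 else 0

# Example: planned end 30 June, freeze date 31 July, still continuing
print(apped_flag(1, date(2025, 6, 30), None, date(2025, 7, 31)))  # → 1
```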
Withdrawals
This definition uses the following fields from the ILR:
- completion status of 3 (‘the learner has withdrawn from the learning activities leading to the learning aim’)
- learning actual end date
- learning start date
Total
Set to 1 if the Completion Status is 3 and the Learning Actual End Date is more than 42 days after the Learning Start Date.
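The withdrawal condition can be sketched in the same way. Again this is an illustrative sketch with assumed names, not the official calculation.

```python
from datetime import date

def withdrawal_flag(completion_status: int, start: date, actual_end: date) -> int:
    """Illustrative withdrawal flag: 1 if the learner withdrew
    (completion status 3) more than 42 days after starting."""
    if completion_status == 3 and (actual_end - start).days > 42:
        return 1
    return 0

# Withdrawal 59 days after the start date counts; 31 days would not
print(withdrawal_flag(3, date(2025, 1, 1), date(2025, 3, 1)))  # → 1
```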
Annex 3: Guidance and technical notes for benchmarked QAR data
Benchmarked data can be found by selecting ‘check here to see how your stats compare to national comparison’ on the QAR section of the AAF dashboard.
It includes comparative benchmarking data, setting out providers’ overall, sector subject area and standard-level QARs against national equivalents.
It also includes detailed benchmarked standard-level QAR data, based on quartiles – this shows providers in more detail how their delivery compares to providers delivering the same standard, by setting out whether they are performing better than 25%, 50% or 75% of other providers delivering the same standard, or are in the lowest 25%. As this is a new way of presenting data, we have provided technical notes to support providers’ understanding.
AAF indicator thresholds will remain the route for identifying which providers will be contacted for an AAF management conversation. The indicator thresholds for QARs are based on providers’ overall QAR. Benchmarked QAR data will not be used as an indicator threshold.
Benchmarked data will be updated once a year with full-year data, once national achievement rates tables (NARTs) have been published. The front page of AAF achievements data will continue to be updated in-year alongside in-year QAR updates.
Benchmarked data will remain within each provider’s private AAF dashboard and will not be shared publicly or published on Explore Education Statistics (EES). The quartile benchmarked data is confidential information belonging to DfE that is not in the public domain and therefore should not be used in public-facing materials.
We welcome feedback on the first iteration of this data, especially on how we can improve its presentation in the AAF dashboard and the explanation in the technical notes. Email provider.strategy@education.gov.uk.
Quartile benchmarking: technical notes
How quartile benchmarking is calculated
Quartiles divide an ordered list of providers’ qualification achievement rates for a given standard (that is, ordered from lowest to highest) into 4 equal-sized groups to reflect the relative performance of providers.
A provider is assessed as achieving better than 25%, 50% or 75% of other providers for a given standard if they fall, respectively, into the second, third or fourth quartile of this ordered list of achievement rates for that standard.
The QAR value is calculated at the 25th, 50th (median) and 75th percentile of an ordered list of achievement rates for a given standard. Each provider’s QAR is then compared to these values to calculate the quartile they fall into and the proportion of providers they are performing better than for that standard, as follows:
- providers with a QAR less than or equal to the value at the 25th percentile: their QAR for this standard is within the lowest 25% of all QARs for this standard (the lowest quartile)
- providers with a QAR less than or equal to the median, and above the 25th percentile: their QAR for this standard is better than that of at least 25% of other providers delivering this standard
- providers with a QAR less than or equal to the 75th percentile, and above the median: their QAR for this standard is better than that of at least 50% of other providers delivering this standard
- providers with a QAR above the 75th percentile: their QAR for this standard is better than that of at least 75% of other providers delivering this standard (the top quartile)
The minimum and maximum quartile boundaries for the quartile a provider falls into for a standard are set out in the dashboard under the columns ‘quartile boundary (minimum)’ and ‘quartile boundary (maximum)’.
The quartile a provider falls into is determined only by the value of their QAR. This calculation is based on the exact QAR value, calculated as achievers ÷ leavers (× 100), without rounding to the single decimal place QAR is usually displayed with.
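The boundary calculation and quartile assignment described above can be sketched as follows. This is an illustrative Python sketch that assumes linearly interpolated percentiles over the ordered QAR values; the official calculation method may differ, and the function names and sample figures are invented for illustration.

```python
def percentile(sorted_vals, p):
    """Linearly interpolated p-th percentile (0-100) of an ascending list."""
    if len(sorted_vals) == 1:
        return sorted_vals[0]
    k = (len(sorted_vals) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

def quartile_for(qar, all_qars):
    """Return the quartile (1 = lowest, 4 = top) a provider's QAR falls
    into. A QAR exactly equal to a boundary falls into the quartile
    below, as set out in scenario 4."""
    s = sorted(all_qars)
    q1, q2, q3 = (percentile(s, p) for p in (25, 50, 75))
    if qar <= q1:
        return 1
    if qar <= q2:
        return 2
    if qar <= q3:
        return 3
    return 4

# Hypothetical QARs for 8 providers delivering the same standard
qars = [40, 55, 60, 62, 70, 75, 80, 88]
print(quartile_for(70, qars))  # → 3
```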
Scenarios
Scenario 1
The quartiled approach reflects the relative performance of providers. For standards where most providers have high QARs, the boundaries between quartiles will also be high. For standards where most providers have low QARs, the boundaries will be low. This means that:
- providers securing high QARs in high-performing standards may not fall into the higher quartiles when benchmarked against other providers delivering the same standard because their performance is good but relatively lower than other providers delivering the same standard
- providers delivering standards that usually achieve low QAR rates may fall into the higher quartiles despite having a low QAR, because their performance is relatively better than other providers delivering the same standard
Scenario 2
In any instance where a provider has exactly the same QAR for a standard as another provider (a tie), they will both fall in the same quartile.
Scenario 3
Where the 25th, 50th (median) and 75th percentile value does not fall exactly on an existing provider, a calculated value is used. This means the boundary value may not be a QAR achieved by a specific provider.
For instance, in a standard offered by 12 providers, each quartile would ideally contain 3 providers if no ties existed. Calculated values would be used to create boundaries falling between the third and fourth, sixth and seventh, and ninth and 10th provider to split the providers into 4 groups of 3.
Scenario 4
Where the QAR of a provider is exactly equal to a boundary, the provider falls into the quartile below to ensure the statement ‘For this standard, your QAR is within the lowest 25% of other providers or better than at least 25%, 50% or 75% of other providers’ holds true.
Scenario 5
For standards where many providers achieve a 100% or 0% QAR, the calculated quartile boundaries may be 100% or 0%, so, in some standards, a 100% QAR may not place a provider into the highest quartile. In this instance, enough providers have 100% QAR for this standard that 100% is not an achievement better than a full 75% of other providers, as ties are not counted.
Scenario 6
At present, there must be at least 2 providers delivering a standard for quartile benchmarking data to be generated for that standard. We’ll explore setting a further minimum in future iterations of this data if user testing suggests this is appropriate. Quartile benchmarked data has therefore not been generated for standards delivered by only one provider. This means providers that are the sole deliverers of standards will see comparative benchmarked data but not quartile benchmarked data for these standards.
Scenario 7
The use of calculated quartile boundaries, as explained in scenario 3, will have a greater effect on standards offered by fewer providers, as follows.
Where a standard is delivered by only 2 providers, the calculated boundaries would fall 25%, 50% and 75% of the way between the 2 providers’ QAR values. This means one provider will fall in the lowest and one in the highest quartile.
Having more than 2 providers on a standard gives more data for the boundary calculations, increasing the accuracy of the calculated boundaries.
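The two-provider case above can be checked numerically. This sketch again assumes linearly interpolated percentiles, under which each boundary falls a fixed fraction of the way between the two (hypothetical) QAR values.

```python
# Hypothetical QARs for the only 2 providers delivering a standard
a, b = 40.0, 80.0

# With linear interpolation, the 25th, 50th and 75th percentile
# boundaries fall 25%, 50% and 75% of the way between the two values
boundaries = [a + (b - a) * p / 100 for p in (25, 50, 75)]
print(boundaries)  # → [50.0, 60.0, 70.0]

# Provider a sits at or below the 25th-percentile boundary, so falls in
# the lowest quartile; provider b is above the 75th-percentile boundary,
# so falls in the highest quartile.
```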
Where numbers are too small for even quartile splits, the quartile given represents the proportion of providers a provider would be achieving better than if the standard were more widely offered and achievement followed the current pattern.
As quartiles are a measure of the spread of current data, quartile benchmarking is a more accurate indicator where there is more data for comparison, as this allows for more even quartile splits and more accurate projections.