Research and analysis

GCSE, AS and A level summer report 2021

Published 16 December 2021

Applies to England

Background

Ofqual regulates the 4 exam boards that award GCSEs, AS and A levels in England. The exam boards – AQA, OCR, Pearson and WJEC (Eduqas) – provide these qualifications to schools and colleges.

In January 2021, the government decided that it would not be fair for GCSE, AS and A level exams to take place in summer 2021 because of the disruption to students’ education caused by the coronavirus (COVID-19) pandemic. Instead, students received grades based on assessments by their teachers: Teacher Assessed Grades or TAGs. We aimed to make sure that students had the greatest opportunity to show the full breadth of their knowledge and understanding based on what they had been taught.

The cancellation of exams changed the way we monitored exam boards as they delivered students’ grades. We focused on ensuring that GCSE, AS and A levels were issued on time with results that were, as far as possible, accurate and indicative of student performance, in the absence of external assessments.

Following a joint consultation with the Department for Education (DfE) on how grades should be awarded, we put in place our regulatory framework for general qualifications for 2021 (the General Qualifications Alternative Awarding Framework). This required the exam boards to support teachers to assess their students using a range of evidence to make a judgement about the grade at which they had performed, focusing on the content they had been taught.

This year a total of 1.2 million students received grades for GCSE, AS and A levels.

Staff at school, college and other exam centres submitted 5.7 million TAGs for students taking GCSE, AS and A level qualifications in England this summer. These constituted:

  • 4.9 million GCSE TAGs
  • 57,360 AS TAGs
  • 754,520 A level TAGs

This was a significant achievement by teachers, school and college leaders, and support staff. They were asked to take on a difficult and important task in assessing students’ performance to determine their grades. This was on top of the other pandemic-related pressures throughout the year. We worked with the exam boards to provide guidance and support to make this challenge more manageable, and we thank teachers and other centre staff for all their work this year to ensure that judgments were made, internally quality assured and submitted on time. The window for TAG submission ran from 26 May to 18 June. Over 98% of centres submitted their TAGs by the deadline.

As exams were cancelled this year, teachers determined grades using a range of evidence, for example mock exams, class tests and any non-exam assessment already completed. No single set of assessment arrangements could have taken into account the differing degrees of learning lost by students due to the pandemic. Centres were therefore given flexibility to decide the nature of this evidence, while ensuring that they only assessed students on content they had been taught.

To assure the quality of their approach, centres had to:

  • ensure that at least 2 people were involved in determining each TAG
  • have the head of school or college sign off the grades to confirm they were a true representation of student performance and that the appropriate process had been followed
  • detail their approach to determining and quality assuring grades in a centre policy
  • submit to the exam boards a sample of the evidence on which their judgments were based

In addition, exam boards:

  • checked information about each centre’s policy
  • reviewed the profile of grades submitted by each centre
  • scrutinised samples of students’ work

Where exam boards had concerns about a centre’s approach or the TAGs they submitted, these were followed up with the school or college and, in some cases, teachers reconsidered their judgements and submitted revised grades.

This was the second year in which the summer exam series did not go ahead as planned, due to the pandemic. In summer 2020 teachers were asked to determine grades they expected their students to have achieved had exams taken place. For more detail about the awarding of GCSE, AS and A level grades in 2020, you can see our blog post evaluating summer 2020 awarding and our Annual Report and Accounts 2020 to 2021.

Introduction

This report includes a summary of the actions that Ofqual took as the qualifications regulator and the resulting actions by exam boards. These include issues identified in the lead-up to, during and immediately after the issue of results for summer 2021. This report covers only GCSE, AS and A level qualifications offered in England, and all monitoring data concerns centres in England only.

Given the cancellation of exams, our regulation and the resulting work of the exam boards had a different focus to that in a normal year. In place of the usual distribution and processing of scripts, DfE policy and our regulations required the 4 GCSE, AS and A level exam boards to:

  • issue guidance to support teachers in determining TAGs
  • provide materials centres could adapt to create their own assessments
  • review centre policies to help teachers adopt a consistent approach to determining TAGs
  • collect Head of Centre declarations stating that students and learners had been assessed in line with the guidance
  • review all centres’ TAG submissions, and scrutinise a sample of the evidence on which they were based, to ensure they accurately reflected students’ performance
  • issue results and certificates
  • consider appeals from students who believed that an error had been made in determining their grade

These requirements are significantly different from the normal role of the exam boards when exams take place. For this reason, meaningful comparisons cannot be drawn between the monitoring data in this report and those we have published in previous years.

Preparations for delivering assessments in GCSE, AS and A levels begin over a year before they are due to be sat. Some of the monitoring data in this report therefore relates to activity undertaken by the exam boards before the decision was made to cancel exams and does not reflect the approach that was used to award grades in summer 2021.

Following the cancellation of exams, the exam boards stopped their preparations to deliver advance information and assessment adaptations. They worked together through the Joint Council for Qualifications (JCQ) to issue new guidance and ensure that their expectations of centres were consistent, aiming to minimise burden on centres as far as possible while meeting the requirements of our regulatory framework. The exam boards also worked together to ensure that they could deliver external quality assurance measures consistently across centres in line with our principles.

Individual exam boards remained responsible for managing, and reporting to us, any issues that arose in the delivery of their qualifications. As in any normal year we monitored the actions exam boards took and intervened where necessary to protect standards, public confidence, or to mitigate any impact on students. As regulator, Ofqual monitors exam boards’ management of any incidents and, after results are published, we evaluate the cause of each incident, its impact and how effectively it was managed by each board. We decide if any regulatory response is necessary. We follow up specific incidents with individual exam boards, consider the focus of our ongoing monitoring and, where appropriate, conduct additional work to understand how to minimise the likelihood of particular types of issue from reoccurring.

As we did not oversee the delivery of exams for summer 2021, we adapted our monitoring approach to reflect the role of the exam boards in awarding grades determined by TAGs. We regulate the exam boards, not centres, and as such we present in this report data on the aspects of the arrangements managed and delivered by the exam boards in awarding qualifications. We did not directly monitor how centres determined TAGs.

This report does not cover the exam boards’ delivery of an autumn 2021 exam series for all GCSEs, A levels and some AS levels. This series was open to all students who received a TAG this year or who the exam boards reasonably believed would have entered for exams in summer 2021, had they taken place.

Incidents

Exam boards must promptly notify Ofqual of any existing or potential incident which could have an impact on standards, public confidence in qualifications, or their ability to develop, deliver or award qualifications in a way which complies with our rules (these instances are referred to as Adverse Effects under Condition B3 of our General Conditions of Recognition). We call these reports ‘event notifications’, and we received fewer this year than in pre-pandemic exam series: exam boards made 83 event notifications in relation to summer 2021, compared with 232 in summer 2019.

This reflects the different arrangements this year. Many of the processes in a pre-pandemic series – that would usually present significant risks to exam boards’ delivery of qualifications – did not take place. For example, as there were no formal exams, there was no confidential assessment material and therefore no risk of the security of such materials being breached. Instead, this year all reported security breaches related to students being given their results early.

Chart 1 shows an overview of event notifications for summer 2021. These categories are explained in more detail later in this report.

Chart 1. Types of issues reported

Type of issue Number of event notifications
Security breach 28
Delivery failure 19
Malpractice 17
Incorrect results 11
Other 6
Assessment material error 2

None of these issues was substantial enough to threaten the timely release of GCSE, AS and A level results this year, and nearly all students received their grades on results day. Most notifications from exam boards reflected aspects of the awarding arrangements outside their direct control, giving the boards fewer opportunities to prevent the issues arising.

The external quality assurance carried out by the exam boards resulted in changes to the grades submitted by centres in only a small proportion of cases. This reflects the fact that overall, where exam board subject specialists looked at the samples of evidence on which teachers had based their judgments, they found that for most centres the evidence supported the grades awarded.

The delivery of grades in summer 2021 followed 4 distinct phases:

  1. planning
  2. delivery
  3. quality assurance
  4. appeals

We have structured this report to reflect these phases.

Phase 1: Planning

Exam board readiness

The role of the boards was very different this year due to the cancellation of exams. As a consequence, we adapted our regulatory requirements for summer 2021 via the General Qualifications Alternative Awarding Framework and accompanying guidance. These publications outlined the expectations of exam boards involved in the awarding of GCSE, AS, A level, Advanced Extension Awards and Project Qualifications in 2021 in the context of the pandemic.

In response to this, the exam boards published guidance in March 2021 that set out the approach they would require centres to take to determine GCSE, AS and A level grades in summer 2021. This was supplemented by additional guidance on other aspects of the arrangements this year, such as grading and appeals.

JCQ also produced guidance aimed at students and parents to help address the anxiety that may have been felt by students not able to take exams as they had planned. This sat alongside Ofqual’s Student Guide to Awarding: Summer 2021. All of these documents were written to be as accessible as possible to students, to keep them informed about the alternative arrangements in place this year.

The exam boards collaborated through JCQ to devise an alternative way for students to receive grades and to provide consistent guidance and support to centres, in line with government policy and our regulatory requirements. Exam boards worked together to plan how the new processes required this year, such as quality assurance and appeals, would be delivered, and to ensure there was sufficient capacity and expertise in place to do so. The exam boards tried as far as possible to ensure that centres had to engage with only one exam board at a time, to minimise burden.

In April 2021, once the General Qualifications Alternative Awarding Framework had come into effect, we met with each exam board to assess their readiness to award grades to students.

In these meetings we focused on the extent to which they had identified and were managing the risks to the safe delivery of results. This included new processes which exam boards put in place to support centres to determine TAGs. We also sought assurances that they were on track to recruit, train and supervise sufficient subject specialists and other staff to deliver the quality assurance and appeals arrangements.

We identified no serious concerns but used this opportunity to confirm our view of the key areas of risk for the summer series that they would need to manage. We met with each exam board regularly throughout the spring and summer to discuss their preparations and progress. Due to the amount of collaboration required between exam boards to deliver the assessment arrangements this year we also met regularly with the exam boards on a collective basis as JCQ.

Entries

Schools and colleges submit entries to the exam boards for each qualification their students will take. In May 2021 we published statistics on provisional entries for GCSE, AS and A level qualifications in summer 2021. These showed that:

  • overall, GCSE entries remained stable (approximately 5.3 million) this year (an increase of 0.4% on 2020), though a rise in entries from year 11 students masked decreases in entries from lower year groups and from candidates older than 16
  • there were small increases in the proportion of entries for the higher tier in most tiered GCSEs compared to 2020 (ranging from 2 to 5 percentage points)
  • A level entries for summer 2021 increased by 3% on 2020 (756,230 in 2021 compared to 731,855 in 2020), partly reflecting a change in the size of the overall cohort
  • AS entries for summer 2021 decreased by 33% on 2020 (58,300 in 2021 compared to 86,970 in 2020), continuing a trend seen in these qualifications since reforms decoupling them from A levels

Note that the number of GCSE entries recorded in the Official Statistics is greater than the number of GCSE TAGs submitted. This is because teachers submitted a single TAG for GCSE combined science but this qualification counts as 2 GCSEs.
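
As a simple reconciliation (an illustrative identity, not a calculation taken from the Official Statistics), let T denote the number of combined science TAGs, each of which counts as 2 GCSE entries:

```latex
\begin{align*}
\text{GCSE entries} &= (\text{GCSE TAGs} - T)\times 1 + T \times 2\\
                    &= \text{GCSE TAGs} + T
\end{align*}
```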

Entries reflect the information submitted to the exam boards at that time, but the final entries are always expected to vary. Information on final entry numbers for GCSE, AS and A level subjects in England in summer 2021 can be found in the results tables published by JCQ.

Access arrangements and reasonable adjustments

In a standard examination series, exam boards adjust some exam arrangements for students, or allow centres to make the necessary adjustments. Ofqual does not prescribe what arrangements exam boards should provide, but requires all exam boards to have clear, published details about who qualifies for these arrangements and what arrangements may be given.

Access arrangements

Access arrangements are provisions made for students, agreed before they take an assessment, to ensure that they can be validly assessed and are not unfairly disadvantaged due to a disability, temporary illness, or injury or if their first language is not English. Access arrangements can be provided for any students taking exams or non-exam assessments who meet the eligibility criteria.

Individual students may require more than one form of access arrangement.

Reasonable adjustments

Access arrangements granted for disabled students are known as reasonable adjustments. If a student has a disability (defined by the Equality Act 2010 as meaning the student has a physical or mental impairment that has a substantial and long-term negative effect on their ability to do normal daily activities) they are legally entitled to reasonable adjustments.

Modified papers

Access arrangements also cover the provision of modified papers. These are papers or tasks which have been adapted to make them more accessible for particular students – for example, by enlarging the font size for candidates with a visual impairment or by providing papers in braille.

Access arrangement statistics

In November 2021 we published statistics on applications for access arrangements during the 2020 to 2021 academic year. When exams were cancelled on 4 January 2021, the deadlines for requesting modified papers and other forms of reasonable adjustment were imminent. Centres were encouraged to continue to submit applications despite the cancellation of exams.

For the 2020 to 2021 academic year there were 447,555 access arrangements approved. The majority of schools and colleges (88.2%) requested access arrangements for one or more of their students.

However, for summer 2021 centres themselves administered adjustments for students, when they assessed them to determine their TAGs. We do not know how many adjustments centres made for their students.

Post-assessment adjustments

Where a student experiences a temporary illness, injury or other personal circumstance that affects them at the time of their assessment, this falls under special consideration. If approved, this would usually result in an enhancement to their marks.

As with access arrangements, centres administered these arrangements themselves for summer 2021 as part of the process of determining a TAG.

Additional assessment materials (AAMs)

Exam boards provided a package of materials (AAMs) to centres. This included questions and mark schemes that teachers could choose to use as a source of evidence to help determine TAGs, alongside other evidence of student performance such as non-exam assessments, mock exam results, or work completed in classes or as homework. The AAMs were primarily drawn from past exam papers, including some material which had not previously been published. Exam boards provided guidance for teachers on how to adapt the AAMs (where appropriate and feasible for centres to do so) to assess students only on the content that they had been taught.

Teachers did not have to use these AAMs and could write their own assessments if they preferred. It was not possible to set compulsory assessments that would have accounted for the differing disruption and impact on content coverage students experienced. This is why exams were cancelled. Instead teachers decided how, and on what, to assess their students. Teachers also needed to be able to determine when to assess their students to take account of their individual circumstances and the amount of other performance evidence available.

The exam boards first made the AAMs available to centres via secure areas of their websites on 31 March. These were provided with the relevant mark schemes for the questions, and with mapping grids indicating the past papers from which specific assessment materials had been taken. Exam boards then published the AAMs on their public websites on 19 April, without mark schemes and with reduced mapping grids. The exam boards also issued to centres performance data for some of the items included in the AAMs, indicating how students had performed on them when they were originally taken in an exam series. They also provided marking and grading exemplars of student work, where available, to support teachers to determine TAGs.

We required the exam boards to publish on their websites the content of past questions included in the AAMs so that they could be seen by all students, including private candidates. Students were assessed by their teachers at different times, so content would otherwise have been shared by students who had already seen the materials. This would have been – or would have been perceived to be – unfair to students assessed after others had seen the materials. Because the AAMs were freely available to all, they could not be ‘leaked’.

The status of these AAMs differed from confidential question papers for an exam series because they were made publicly available to all during the series. Removing controls around the AAMs meant that there were no notifications of security breaches in assessments this year.

Where material in the AAMs already existed in modified formats from when the papers were originally taken, exam boards signposted these and helped centres to adapt them for their students – for example, providing AAM content in braille or larger font sizes. Where such modified formats did not already exist, the exam boards provided modified materials on request.

As errors in the AAMs could have impacted on centres’ ability to use them to assess their students, we wrote to the exam boards as they were preparing the AAMs to draw their attention to question paper errors that had been reported to us in previous exam series. We asked that they check that these had been rectified before making the AAMs available.

During the summer, we received 2 notifications regarding errors which had been identified by centres in the AAMs. These were minor errors in past mark schemes which had previously gone unnoticed. This number of errors was significantly smaller than we usually see in an exam series. This was unsurprising as the content had been used in previous assessments.

Phase 2: Delivery

The normal risks to safe delivery did not apply this year. There were, however, new risks for centres and the exam boards to manage. We monitored exam board delivery and where these risks were realised, made sure that they managed issues effectively and quickly to minimise any negative impact on students.

Overall, there were far fewer event notifications in relation to qualification delivery than usual.

Security breaches

The security breaches usually notified to us involve confidential assessment material or an exam result being released or seen ahead of its scheduled release time. We ask exam boards to tell us when there has been a potential security breach that was avoided, or where an actual breach has occurred. Where either an actual or potential breach happens, we expect the affected exam board to investigate and take all reasonable steps to mitigate its impact.

In previous years, when exams have taken place, the security breaches that exam boards have notified us of have mainly related to the leak of an exam paper or question before the exam had been taken. This year, however, all reported security breaches related to students being given their results early.

The nature of the arrangements for this year meant that teachers knew candidates’ grades in advance, which is not usually the case. Teachers were not allowed to tell students the TAGs they submitted to the exam board.

Exam boards reported 27 events relating to the early release of results at 23 centres. Some of these were notifications about the early release of results at the same centre made by different exam boards. Other notifications covered the early release of results at more than one centre.

At 18 centres students were told some of their results before results day, ranging from the point TAGs were submitted to the exam boards to just prior to when they were due to be released. This was due to a failure in the centres’ systems or processes. Teachers’ judgements were either unwittingly stored in insecure areas of centres’ networks which students or their parent or carers could access, or automatic notifications were sent to students or their parents or carers via centres’ software for tracking and sharing students’ progress.

There was also a change in the time until which results were embargoed on results day (from 8am to 8.30am). The majority of centres noted this change and amended their systems. However, 5 centres did not, which led to some of their students being told their results on results day but before the lifting of the 8.30am embargo. As these students were told their results only shortly before the official release time, it is unlikely they would have gained any advantage over other students.

In a very small number of cases, teachers allegedly disclosed TAGs to students ahead of results day. Exam boards treated these as potential instances of malpractice or maladministration and investigated them accordingly. These cases are captured in the malpractice statistics later in this report.

Delivery failure

There were 16 notifications of cyber-attacks on centres and 3 notifications of IT failures by exam boards reported to us this year which could have adversely affected the delivery of qualifications.

Most of the 16 cyber-attack notifications concerned attacks on more than one centre. We wrote to exam boards to clarify our expectations for reporting this type of event, as there had been an increase in the number of cyber-attack notifications compared to previous years. We monitored these cases closely due to the potential compromise of centres’ ability to submit TAGs or provide supporting evidence for quality assurance purposes.

In total, 77 centres were reported as having been potentially affected by a cyber-attack. The exam boards put in place alternative arrangements to ensure centres that had been impacted by cyber-attacks were able to submit their grades. The boards were also flexible on their deadlines for centres who temporarily lost access to data.

The remaining 3 events were IT failures of the portals which some of the exam boards used to receive TAGs and evidence from centres. These occurred close to the respective submission deadlines, so had the potential to affect key stages in the delivery of the awarding arrangements this year. However, they were quickly resolved and had limited impact on centres’ ability to submit their evidence on time.

Exam boards managed these 19 events relating to delivery risks and results were awarded and released to students on time. They continued to liaise with centres subject to cyber-attacks beyond results days to manage any potential impact on the centres’ ability to process students’ requests for centre reviews and appeals.

Malpractice

Everyone involved in the delivery of qualifications has a role to play in preventing and reporting malpractice, whether a teacher, student, or member of exam board staff. We take allegations of malpractice very seriously and we expect exam boards to do the same.

Exam boards require schools, colleges and others involved in the delivery of assessments to report all suspected incidents of malpractice and to cooperate with any subsequent investigation. Exam boards must investigate all instances where there are reasonable grounds for an allegation of malpractice. Where malpractice is proven, the exam board should take proportionate action against those responsible.

We do not require exam boards to notify us about all cases of suspected malpractice while they are still under investigation. They tell us only of the most serious issues, including those that might affect public confidence due to their impact on a large number of students or awarding organisations. This year we asked the exam boards to inform us about centres who had not responded to requests to review and resubmit their TAGs following external quality assurance, and were therefore at risk of having their results withheld on results day pending a malpractice investigation.

The arrangements this year created different opportunities for centres and students to commit maladministration or malpractice. However, the external quality assurance arrangements also provided an extra opportunity for exam boards to detect potential candidate or centre malpractice or maladministration when they were scrutinising samples of centres’ evidence.

Exam boards must also provide us with information about the total number of investigations they are both conducting and have completed at the point at which we collect the data. For GCSEs, AS and A levels we publish data on the number of offences and penalties imposed by the exam boards.

The main trends in malpractice in GCSE, AS and A level for the summer 2021 exam series were:

  • There were 295 penalties issued to students in 2021, up from 20 in 2020, representing a very small proportion of the more than 6 million total entries this year.
  • There were 35 penalties issued to school or college staff in 2021, up from 25 in 2020. This represents a very small proportion of the total number of staff in England (nearly 355,000 in state-funded secondary schools alone).
  • There were fewer than 5 penalties issued to schools or colleges in 2021, down from 15 in 2020.

New categories of penalty and offence were introduced in 2020, to capture malpractice cases related to the centre assessment grade process put in place due to the pandemic.

These new categories were bias or discrimination, and negligence (types of offences), and referral to Teaching Regulation Agency (type of penalty). Of these, some cases of bias or discrimination were reported in 2020, but none of these cases resulted in a penalty being imposed. As such, they are not included in the numbers of penalties reported above or in the data tables. No cases of bias or discrimination, or negligence, were reported in 2021.

Exam board notifications of malpractice investigations

This summer, the exam boards notified us of 13 allegations of serious malpractice that they were investigating, affecting 12 centres.

Chart 2. Malpractice notifications

Perpetrators of malpractice in relation to notifications Number of notifications
Centre 11
Centre staff 5
Unknown or missing 1

The different nature of the arrangements this year meant there were no notifications of malpractice by examiners. There were fewer opportunities for candidates to commit serious malpractice as there was no secure assessment material and JCQ rules on the conduct of external assessments did not apply.

Whistleblowers

Students, teachers, parents, and others can also directly report to us concerns about malpractice by schools and colleges.

We consider all allegations received and will raise them with the exam board in question, if appropriate. Two of the malpractice notifications this summer resulted from allegations about wrongdoing that we passed on to the exam boards.

We do not pass on the names of individuals who do not wish to be identified, but we share the allegations where doing so will not lead to these individuals being identified. When we pass allegations on to exam boards about potential teacher, centre or student malpractice we monitor the action they take. We follow up where necessary to assure ourselves that the allegations have been properly investigated and, if appropriate, that sanctions have been applied. We investigate any concerns regarding an exam board’s approach.

We will report our whistleblowing data from April 2021 to March 2022 in full in our 2022 Annual Report. Data from the previous reporting period (April 2020 to March 2021) can be found in our 2021 Annual Report.

Ofqual received 125 allegations of malpractice in the period 1 April to 1 November 2021, of which 37 were raised by people about practice within their own workplace. This represents a very small proportion of the 5,864 centres in England who entered 1.2 million students for over 6 million GCSE, AS and A level qualifications this year. This summer, many allegations related to the arrangements that centres had put in place for determining TAGs. As there was considerable flexibility built into the guidance for the way that centres could gather evidence to determine TAGs, most allegations of this nature concerned practice which was within the scope of the guidance.

Chart 3 shows the number of allegations received this summer by month.

Chart 3. Whistleblowing and malpractice cases received by Ofqual

Month Malpractice Whistleblower Total
April 2021 24 5 29
May 2021 28 6 34
June 2021 15 6 21
July 2021 4 11 15
August 2021 11 5 16
September 2021 4 2 6
October 2021 1 2 3
November 2021 5 2 7

Phase 3: Quality assurance

Our regulatory requirements for 2021 reflected government policy that teachers should assess their students and decide on the grade that best reflected their performance in assessments based only on the parts of their courses they had been taught.

Centres were given discretion to decide how to assess their students, because they had been affected in different ways by the pandemic. This allowed them to:

  • take into account relevant work already undertaken, including coursework or non-exam assessments
  • set new assessments written by teachers or using questions provided by the exam boards
  • vary the approach used for individual students where that was appropriate for their individual circumstances (for instance those lacking the same range of existing evidence as their peers, or who may not have received a reasonable adjustment they were entitled to at the time that an assessment was undertaken)

Exam boards were responsible for quality assurance of the process followed by centres for teachers to determine grades, but not for the accuracy of each individual TAG assigned by a teacher.

Exam boards made initial contact with centres to confirm that they had understood the requirements in their guidance. The exam boards required each centre to put in place an internal quality assurance process, which the centre described in their centre policy. Centres had to standardise their grading judgements. Exam boards reviewed information about all centres’ policies. Exam boards also required all centres to provide them with samples of student work on which their TAGs had been based. The exam boards checked a sample of this work.

Our regulatory framework required the exam boards to determine and implement a sampling methodology which would ensure evidence was scrutinised from a broad range of centre types before results were issued. It also required the exam boards to identify some centres for scrutiny on the basis of specific criteria.

As the regulator we monitored the progress of the exam boards’ quality assurance to check that it complied with the principles we had set. A summary of each stage is provided below.

Stages of the quality assurance process

Schools and colleges each set out in their centre policy how they would assess their students and determine their TAGs, within the guidance provided by the exam boards. The boards contacted schools and colleges as they were developing their policies to make sure that they understood what they were required to do.

These policies explained the steps that schools and colleges would take to ensure that their grades were properly determined. This included making sure that TAGs were checked by at least 2 teachers. When submitting TAGs, the head of each school and college also had to make a declaration confirming that they had been produced in line with the requirements and the centre’s policy.

The centre policies were then submitted to the exam boards who checked them all. Where exam boards had concerns about the approach a school or college planned to take, the school or college was required to make changes.

After the TAGs were sent to the exam boards, the boards reviewed the data submitted and required each school and college to send in the work for a sample of subjects and students. The exam boards selected the subjects and the specific students. Schools and colleges had 48 hours in which to submit their evidence. The exam boards scrutinised a selection of the student work submitted.

Some centres were selected to have their students’ work scrutinised based on specific criteria. The exam boards also looked at work from some schools and colleges selected at random. They made sure work from schools and colleges of all different types (for example, academies, independent schools, further education colleges and sixth form colleges) was looked at, and that schools and colleges from all regions were included.

Further details on the stages of the quality assurance process, and how the exam boards acted to deliver quality assurance according to the principles and framework we had set out, can be found in JCQ’s External Quality Assurance process summary.

These quality assurance arrangements were specific to the summer 2021 series. We have not taken decisions on the quality assurance process that would be used if summer 2022 exams were cancelled and TAGs had to be used again. If students’ work were to be sampled again as part of the quality assurance process, we would expect the exam boards to vary the sampling approach used in 2021, so that no teacher or centre could predict which students’ work would be looked at.

Initial contact

Prior to the deadline for the submission of centre policies on 30 April, the exam boards attempted to contact all centres by phone to make sure they understood what they had to do. Between them, the exam boards attempted 16,097 telephone calls and were successful in speaking to staff at 5,228 centres (89%). Exam boards made multiple attempts to contact the remaining centres prior to the deadline for submitting centre policies.

The exam boards divided responsibility for making these calls between them, according to their share of the market. Through JCQ the exam boards agreed the topics these calls would cover, and a shared process for ensuring that queries raised by centres during the telephone call were appropriately addressed.

Stage 1: centre policies

Every school and college was required to set out its approach to assessing students and quality assuring TAGs, in line with the exam boards’ guidance. The guidance set out that the centre policy should:

  • outline the roles and responsibilities of individuals in the centre, in relation to determining TAGs
  • detail the training and support provided for newly qualified teachers (NQTs) and training around objectivity in decision making
  • set out the approach for the determination of grades including how evidence would be used
  • describe the process that would be adopted where a potential conflict of interest had been identified, such as where a student was related to the teacher
  • outline the internal quality assurance processes in place including arrangements to standardise judgments and consider TAGs against results from previous years when exams took place (2017 to 2019)
  • detail any provision for private candidates, if applicable

Centres were required to submit their centre policy, and a summary of it, to the exam boards, which checked them all.

The window for the submission of centre policies was from 12 April to 30 April. Pearson collected the centre policies and then distributed them to the exam boards, according to their market share, for review. Working together through JCQ, the exam boards devised common training for the staff undertaking these reviews to ensure that their expectations of the policies were consistent. They quality assured their decisions and feedback.

For 686 centres, exam boards identified further information they needed before they could approve the centre’s policy. Where a centre policy was not submitted despite repeated contact from the relevant exam board, this was treated as potential malpractice. No centre received grades until it had submitted its policy.

Stage 2: virtual visits

The exam boards fed back to schools and colleges where they had concerns about the content of the centre policy and required them to make changes.

Exam boards arranged ‘virtual visits’ with 74 centres (fewer than 1%) whose policies they considered needed to be changed. During these visits, exam board representatives discussed the centre’s proposed approach to determining TAGs with the centre’s senior leaders.

These virtual visits took place during May and June. The visited centres then had to revise and re-submit their policies. The exam boards checked the resubmitted policies to confirm that the necessary changes had been made.

Stage 3: post-submission sampling

Once centres had submitted their TAGs, the exam boards reviewed these against their previous results, and required each school and college to submit a sample of the evidence on which their TAGs had been determined. This consisted of the work of at least 5 students in 2 or 3 subjects, depending on whether the centre had entered students for both GCSE and A level subjects. The exam boards selected the subjects and the specific students for whom this work would be provided.

Where possible one of these GCSE subjects was either English language or mathematics. Schools and colleges had 48 hours in which to submit their evidence. This was a tight turn-around to allow exam boards to complete their external quality assurance activity before schools and colleges broke up for the summer, and to accommodate the earlier closing dates of some independent schools. More than 90% of centres submitted their evidence within this 48-hour window. A centre’s failure to submit this evidence despite chase activity by the exam boards was treated as malpractice, and put centres at risk of having their results withheld.

Once the evidence had been received, the exam boards selected a sample of centres whose evidence they would review. The exam boards made sure that work from different types of schools and colleges was reviewed, and that there was broadly representative subject and regional coverage. Some centres were selected on a risk-based approach, and other centres were selected randomly.

Potential risk factors which meant that a school or college was more likely to be selected for sampling included those which:

  • had submitted TAGs that were unusually high or unusually low compared to their previous years’ results
  • had a significant change in their entry pattern relative to previous years
  • were a new centre
  • had been subject to a credible malpractice allegation

The basis on which centres were selected for random and targeted sampling, and how the exam boards determined and implemented the quality assurance methodology we required of them, is set out in detail in the JCQ technical description of the sampling process adopted for summer 2021.
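
By way of illustration only, the sketch below shows one way risk-based and random selection could be combined. All field names, flags and the sampling fraction are hypothetical assumptions for this example, and it omits the balancing by centre type and region described above; the exam boards’ actual methodology is the one set out in the JCQ technical description.

```python
# Hypothetical sketch of combining risk-based and random centre selection.
# Not the exam boards' actual methodology (see the JCQ technical
# description); field names, flags and fractions are illustrative only.
import random

RISK_FLAGS = (
    "unusual_tag_profile",     # TAGs unusually high or low vs previous results
    "entry_pattern_change",    # significant change in entry pattern
    "new_centre",              # no previous results to compare against
    "malpractice_allegation",  # subject to a credible allegation
)

def select_centres(centres, random_fraction=0.1, seed=2021):
    """Return risk-flagged centres plus a random draw from the remainder."""
    rng = random.Random(seed)
    risk_based = [c for c in centres if any(c.get(flag) for flag in RISK_FLAGS)]
    flagged = {c["id"] for c in risk_based}
    remainder = [c for c in centres if c["id"] not in flagged]
    n_random = min(round(len(centres) * random_fraction), len(remainder))
    return risk_based, rng.sample(remainder, n_random)

centres = [
    {"id": "C001", "type": "secondary", "region": "West Midlands"},
    {"id": "C002", "type": "FE college", "region": "North of England",
     "unusual_tag_profile": True},
    {"id": "C003", "type": "independent", "region": "South-West England"},
]
risk_based, random_pick = select_centres(centres, random_fraction=0.5)
print(len(risk_based), "risk-based;", len(random_pick), "randomly selected")
```

In practice the exam boards also balanced the sample across centre types and regions, as the monitoring figures below show.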

We monitored the sampling conducted by the exam boards. We found:

  • the exam boards looked at evidence of students’ performance from 1,101 out of 5,864 centres in England (19%) with GCSE, AS and A level entries
  • between them, these centres made 22% of the total GCSE, AS and A level entries recorded in England this summer
  • of these 1,101 centres, 55% were secondary schools or academies, 18% were independent or selective centres, 13% were FE colleges, sixth forms or tertiary colleges and 13% were other centre types (including free schools); broadly in line with the proportion of each centre type nationally but sampling relatively more FE colleges, sixth-form colleges and tertiary colleges and relatively fewer centres in the ‘other’ category (see Charts 4 and 5)
  • the exam boards looked at centres by region broadly in proportion with their national distribution, but sampled relatively more centres in South-East England and South London, the East of England and North-East London, and Lancashire and West Yorkshire, and relatively fewer centres in the East Midlands and the Humber, the North of England, North-West London and South-Central England, South-West England and the West Midlands, as a proportion of the number of centres in those regions (see Charts 6 and 7 below)

Chart 4. Breakdown of all centres making entries by type

Centre type %
Secondary schools and academies 53.27
Other (including free schools) 20.41
Independent and selective centres 17.58
FE colleges, sixth-form colleges and tertiary colleges 8.73

Chart 5. Breakdown of centres sampled by type

Centre type %
Secondary schools and academies 55.4
Other (including free schools) 13.17
Independent and selective centres 18.16
FE colleges, sixth-form colleges and tertiary colleges 13.26

Chart 6. Breakdown of all centres making entries by region

Region % of centres by region
East Midlands and the Humber 10.90
East of England and North-East London 11.61
Lancashire and West Yorkshire 15.01
North of England 6.87
North-West London and South-Central England 15.50
South-East England and South London 17.43
South-West England 9.65
West Midlands 13.03

Chart 7. Breakdown of centres sampled by region

Region %
East Midlands and the Humber 8.90
East of England and North-East London 13.62
Lancashire and West Yorkshire 16.26
North of England 5.81
North-West London and South-Central England 14.90
South-East England and South London 19.80
South-West England 8.36
West Midlands 12.35

The samples of students’ work were reviewed by subject experts (usually senior examiners) appointed by the exam boards, who checked that the TAGs submitted were supported by the evidence. Subject experts did not re-mark or moderate the work; they looked at the evidence of students’ performance holistically and in context, as teachers had been asked to do.

Where the subject experts were not assured that the TAGs were supported by the evidence, they had a professional conversation with senior leaders at the school or college to probe their rationale for the TAGs submitted. The exam boards’ subject experts looked at further evidence and discussed the centre’s approach to determining grades more generally. Of the 1,101 centres whose students’ work was looked at by the exam boards:

  • 159 were subject to such additional scrutiny
  • 133 had their original TAGs upheld following further exemplification by centre staff
  • 26 were asked to revisit their TAGs

Changes to TAGs as a result of quality assurance

The rules we put in place this year did not allow exam boards to change a centre’s TAGs. Rather, where the exam boards found that TAGs were not supported by the evidence, they asked the school or college to revisit their TAGs.

Where, following a professional conversation with senior staff, the exam board subject experts were still not satisfied that the evidence supported the grades the centre had awarded, the centre was asked to revisit some or all of its TAG judgements in light of the feedback. This could be restricted to the grades submitted for a limited number of students or for students in a given subject, or could extend to all of the centre’s students.

Where centres resubmitted some or all of their TAGs, these were reviewed again by the exam boards to ensure that the subject expert’s feedback had been taken into account and the new TAG was supported by the evidence.

We monitored the changes to TAGs as a result of the quality assurance process. Across the 26 centres asked to revisit their judgments:

  • 195 TAGs changed, of which
  • 179 TAGs decreased and
  • 16 TAGs increased

A summary of the scale of TAG changes by qualification level is provided in Chart 8.

Chart 8. TAG changes by +/- grades and qualification level

Grade levels changed Number of changes made at A level Number of changes made at AS level Number of changes made at GCSE Total
-3 1 0 1 2
-2 18 0 51 69
-1 28 0 80 108
+1 0 0 12 12
+2 0 0 3 3
+3 0 0 1 1
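
As a check, the chart is consistent with the totals above:

```latex
\underbrace{2 + 69 + 108}_{\text{decreases}} = 179, \qquad
\underbrace{12 + 3 + 1}_{\text{increases}} = 16, \qquad
179 + 16 = 195\ \text{TAG changes in total}
```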

Withheld results

For a very small number of centres, discussions between the centre and the exam board about resubmission of TAGs extended past the quality assurance period and were still to be resolved at the time results were released.

Where exam boards had outstanding concerns about a centre’s TAGs, we required the exam board not to issue results until the concerns had been resolved. We asked the exam boards to tell us where there was a risk that students would not receive results on time because these discussions were still ongoing with centres.

In most cases, exam boards and centres worked together to ensure that any quality assurance concerns were resolved before results days. The numbers of centres whose results were withheld on results days were:

  • 2 centres had some of their grades withheld on A level results day
  • 6 centres had some of their grades withheld on GCSE results day

Centres for which these concerns could not be resolved, because they did not engage with the exam boards’ request to review their grades, were investigated for malpractice. In all of these cases, grades had been issued where appropriate, or the resulting malpractice cases had concluded, by the time of publication – ensuring that almost all students were able to make decisions about their next steps at the intended time.

Some of the notifications we received concerned results for qualifications other than GCSE, AS and A levels, so are not included in these figures.

Phase 4: Post results

When we published our decisions on how grades for GCSEs, AS and A levels should be determined in summer 2021, we confirmed that students would have the right to appeal. As there was no marking, the usual arrangements relating to Reviews of Marking, Moderation and Appeals could not be used. We therefore consulted on the Appeals Guidance to which the exam boards had to have regard when considering appeals. Unlike in other years, students themselves (rather than their centres) could decide whether to appeal, with the appeal being submitted by the centre on the student’s behalf. We also confirmed that grade protection on appeal would not be in place. Students were therefore asked to confirm that they were aware that their grade could go up, down or remain the same when they appealed.

The appeals process

We required exam boards to take a 2-stage approach to appeals.

Stage 1

Students who believed their grade did not reflect their performance could ask their centre to check whether it made an administrative or procedural error when determining their TAG. If the centre found such an error and that, as a result, it had submitted the wrong grade to the exam board, the centre explained the nature of its mistake and asked the exam board to change the grade.

Stage 2

If a student still believed their grade was wrong following the outcome of their centre review, they could ask their centre to submit an appeal on their behalf to the exam board. The student needed to state what they thought had gone wrong by selecting the appropriate grounds of appeal. The evidence centres were required to provide differed depending on the grounds of appeal the student selected.

The centre was required to provide the exam board with the evidence used to determine the student’s grade, together with the centre’s justification for the grade, the student’s concerns and details of the process used by the centre to determine the grade. For appeals on academic judgment grounds, the exam board considered whether the evidence of the student’s performance indicated that the grade represented a reasonable exercise of academic judgment or whether the selection of work they used to decide the grade was unreasonable.

If the exam board decided the grade was supported by the evidence, it did not change the grade. If the exam board decided the grade was not supported by the evidence, it changed the grade. The exam board could also consider whether the school or college had made a procedural error, or whether the exam board itself had made an administrative error.

Monitoring of appeals

We monitored the volume and progress of appeals through weekly data collections from exam boards. Students could appeal on one or more grounds, for example unreasonable exercise of academic judgement regarding selection of evidence and/or determination of a TAG and/or a procedural error by the centre. Appeals where multiple grounds were selected were more complicated and usually took longer.

Exam boards offered a priority appeals process for students taking A levels who had applied to higher education and who had missed out on their firm choice offer. Exam boards set a deadline of 23 August 2021 for such appeals so that they could aim to complete them by 8 September 2021. We monitored the progress of these priority appeals, and exam boards notified us where they did not consider that they would be able to complete priority appeals by that deadline.

Although the priority process did not extend to GCSE appeals, exam boards endeavoured to process GCSE appeals by the priority deadline where students’ progression depended on their GCSE grades.

The final deadline for appeals to be received by exam boards was 17 September. Exam boards continued to receive applications after this date and accepted appeals where there was an appropriate explanation for missing the deadline.

Summer 2021 appeal outcomes

We will be publishing official statistics on appeals in GCSEs, AS and A levels for summer 2021 early in 2022.

Incorrect results

We expect exam boards to issue correct results. However, we recognise that errors could be made by anyone involved in the TAG process. It was important that exam boards could recognise an error and correct it where appropriate.

Exam boards are required to have regard to our guidance on making changes to incorrect results which explains the factors they should take into account in deciding whether to correct a result, including any potential negative impact from doing so.

This year we told the exam boards that we expected them to notify us of any corrections to incorrect results where there had been, or there was the potential for, an Adverse Effect.

Exam boards notified us of 11 incidents of incorrect results this summer where there was the potential for an Adverse Effect on the students concerned. Of these, 8 concerned instances where a centre had identified that it had submitted the wrong grade to the exam board and asked for it to be changed. These will be reflected in the summer 2021 appeals outcome data due to be published early in 2022. The other 3 notifications related to administrative errors made by the exam boards during the processing of appeals.

Exam boards explained to us how they had considered our guidance when balancing the potential Adverse Effect on the students concerned against any negative impact that correcting the result might have caused.

Other event notifications

Exam boards provided 6 event notifications that did not fall into a specific category. Of these, 3 concerned the potential that exam boards would not be able to complete priority appeals by the deadline (discussed under ‘Monitoring of appeals’), including where they were unable to obtain further information or evidence from centres. We monitored the progress of these appeals, and required the exam boards to inform us when they had been completed and how they intended to manage any impact on the learners involved.

The remaining 3 notifications covered late candidate entry, an appeal made by a centre for the incorrect subject, and qualification fees.

Conclusions and next steps

The summer 2021 series saw more than 6 million results issued on time to 1.2 million students, despite the cancellation of exams. Between them these students received qualifications in 385 different GCSE, AS and A levels, allowing them to move on to the next stage of their lives.

Under normal circumstances exams are the fairest form of assessment. The government is firmly committed to GCSE, AS and A level exams going ahead in England in summer 2022, and we and the exam boards are working towards this aim. However, we have also consulted on and confirmed that TAGs would be used in summer 2022 in the unlikely event that exams do not proceed as planned. While the number of issues was small this series, and those that arose would not occur if exams take place, there may be steps that could be taken to better manage them or prevent reoccurrence if TAGs are used again.

We are now looking ahead to the exam boards’ delivery of summer 2022. We have confirmed some changes to the exam and wider assessment arrangements. We have also published guidance for teachers to help them prepare for the unlikely event that exams are cancelled. We will next discuss with the exam boards the work we had expected them to do following the 2019 summer series, had the pandemic not happened. We will review the expectations we had at that time to prepare for the subsequent exam series and consider the focus of our regulatory activity in summer 2022.

In doing so, we will be mindful of the extra demands on the exam boards this year as they deliver assessments with advance information and other adaptations made to GCSE, AS and A levels while also preparing for the delivery of contingency arrangements should the need for them arise.

Some of the areas we will be discussing with boards are:

Malpractice

In 2019 we welcomed the recommendations made by JCQ’s independent commission into malpractice. Some of the recommendations support work that we had underway before the pandemic, for example improving the quality of the access arrangements data that exam boards provide. This data will allow us to explore the increase in requests for extra time and to review whether the system is operating as effectively as possible.

We will continue our efforts to prevent malpractice, working closely with exam boards, schools and colleges. To help exam boards do all they can to prevent malpractice, investigate it with the necessary rigour, and take appropriate action against those responsible, we have recently published updated guidance on malpractice and maladministration following public consultation.

We will also continue to focus on raising awareness among students and parents of what constitutes malpractice, taking into account the relevant assessment arrangements that may come into play. This could include communications discouraging students from taking prohibited materials, such as phones, into exams, and advice on what students should do if they encounter real or hoax breached assessment materials on social media.

Question paper security

In September 2019 we met with exam boards and their representative body, JCQ, and shared our suggestions on reducing the risk of exam paper leaks in the future. We will continue to monitor progress made on improving question paper security.

2022 delivery risks

As we do every year, we will review the exam boards’ readiness for the challenges of – and risks to – a successful 2022 summer exam series. We will also continue to monitor boards’ planning and preparation for potential contingency arrangements, in case exams for the summer 2022 series are cancelled.