Consultation outcome

Analysis of responses: Ensuring the resilience of the qualifications system

Updated 21 September 2023

Background

In 2022 and 2023, exams and other formal assessments took place as planned in a return to pre-COVID-19 pandemic assessment arrangements. The future cancellation of exams and the need for alternative assessment arrangements remain very unlikely. Good public policy, however, means ensuring appropriate contingencies are in place, even for unlikely scenarios.

Resilience arrangements for GCSEs, AS and A levels, Project Qualifications and Advanced Extension Award (AEA) were put into place for the 2023 exam series following consultation with stakeholders in the Autumn of 2022. These arrangements focused on the provision of guidance to schools and colleges on how to gather robust assessment evidence that could be used to determine Teacher Assessed Grades (TAGs) in the unlikely event that government determined exams could not go ahead. Ofqual also expected awarding organisations offering vocational and technical qualifications (VTQs) and other qualifications with similar assessment arrangements to take account of the guidance and determine whether it was applicable to them. The arrangements were designed to increase resilience in the examination system with minimal burden on schools and colleges. The consultation revealed broad support for the arrangements proposed for 2023, and two-thirds of respondents supported, in principle, the idea that the arrangements might be continued beyond 2023.

In May 2023, Ofqual and the Department for Education (DfE) launched a consultation inviting views on introducing resilience arrangements for schools and colleges in the longer term. While the risks related to the COVID-19 pandemic are, hopefully, ever diminishing, the experiences of the pandemic served to highlight the importance of prudent public policy in ensuring qualifications can still be awarded even in the most unlikely and unexpected of circumstances.

The consultation proposed that guidance be provided for GCSEs, AS and A levels, Project Qualifications and AEA on gathering and storing evidence of student performance. The proposed guidance was designed to allow schools and colleges to decide how to gather evidence of student performance in assessments in ways that align with their normal arrangements for preparing students for exams. The consultation asked for the views of stakeholders, including students, teachers, awarding organisations, schools and colleges and their representatives on the proposed guidance. In particular, it sought their views on whether the proposed guidance supported the gathering of evidence to build resilience in the system while minimising any additional burden on teachers and students.

To help ensure the parity of student treatment across qualification types, it was decided to also formally consult on arrangements for VTQs and other qualifications used alongside or instead of GCSEs, AS and A levels for progression to further or higher study. This included Technical Qualifications within T Level. The arrangements put into place for the 2023 exam series set out that Ofqual expected awarding organisations offering VTQs used to support progression to further or higher study to take account of the guidance. It was proposed that there should be more formal requirements for these qualifications going forwards.

Given the wide variety of VTQs, the proposed resilience arrangements would not be appropriate for all. These arrangements would not apply to VTQs and other qualifications that assess occupational or professional competence or proficiency, or act as a licence to practise, including apprenticeship end-point assessments (EPAs), as these qualifications would not be awarded on the basis of a TAG or other alternative evidence.

The consultation, therefore, proposed that Ofqual require awarding organisations to consider whether it was necessary and appropriate to put in place guidance for schools and colleges on gathering evidence of student performance that could be used to support TAGs should exams and formal assessments not be able to go ahead. Where this was deemed appropriate, it was proposed that Ofqual require awarding organisations to provide such guidance.

For these qualifications, common guidance was not proposed as any guidance would need to consider the design of each qualification. The consultation did, however, note that awarding organisations should consider the proposed guidance for general qualifications and could draw on it or reproduce it wherever appropriate.

A second part to the consultation was undertaken solely by Ofqual, and outlined the regulatory approach needed to implement the proposed arrangements. The proposals in this second part related to the new General Condition, the Qualification Level Conditions and the statutory Guidance that would be put in place to implement the policy proposals outlined in the first part of the consultation. These matters are solely within the remit of Ofqual.

The consultation was available online for 12 weeks and received 40 responses. Responses to the consultation have been used to inform the arrangements put in place for gathering assessment evidence to support resilience in the exams system. The decisions taken on the final form of the guidance are set out in a separate decisions document, as well as the final guidance itself.

Approach to analysis

The consultation was available to be completed through an online form from 10 May 2023 until 2 August 2023. The consultation included 24 questions on proposed arrangements to build resilience in the exam system. The questions were:

  • quantitative – having a format of a 5-point scale (strongly agree, agree, neither agree nor disagree, disagree, strongly disagree), or two-option questions (yes/no), or three-option questions (positive/neutral/negative)
  • qualitative – open-ended questions where respondents could provide comments on the proposals

Respondents were invited to identify whether they were offering personal responses or official responses for their organisation.

‘Awarding organisations’ has been used to describe organisations which offer qualifications regulated by Ofqual, such as exam boards. These include organisations that offer GCSEs, AS and A level qualifications, VTQs and other qualifications.

Throughout the analyses presented in this report, the answers to quantitative questions are summarised in tables. The Appendix section includes tables of the responses to the quantitative questions aggregated across all respondent types.

All responses to the qualitative questions have been read in full. For qualitative questions, the key themes that emerged from respondents’ answers are presented. A selection of comments from respondents has been included to represent the range of views expressed. Some of the comments have been edited to correct spelling or grammatical errors and to keep respondents’ identities anonymous. In editing though, care has been taken to ensure any such changes do not alter the meaning of the comments.

Respondents could submit their final response without having replied to all questions. Many respondents skipped the qualitative questions or replied with “N/A”, ‘nothing to add’, “nil” or similar. These answers are included in the total number of responses presented in the document.

The report is organised into the following sections:

  • Guidance on collecting evidence of student performance in the academic year

  • Arrangements for private candidates

  • Guidance on the regulatory approach to collecting evidence

  • Equality impact assessment

  • Regulatory impact assessment

The questions are presented in the same order as in the consultation document.

Where the document refers to schools and colleges, this includes schools, colleges and other exams centres.

It is important to note that, given not all respondents offered comments when responding to questions, the comments analysed may not be fully representative of the range of views across all respondents. They may also give an unbalanced representation if respondents with negative or positive views were more likely to offer comments.

Many comments made throughout the consultation were repeated by the same respondents across multiple questions. Sometimes this meant that the comments were not directly relevant to the particular question being asked.

Who responded?

Respondent type Number of respondents
Organisational – awarding organisation 13
Personal – members of Senior Leadership Teams (SLTs) at schools and colleges 7
School or college 7
Other representative or interest group 6
Personal – teachers (responding in a personal capacity) 4
Organisational – other 2
Organisational – local authority 1
Total number of respondents 40

It is important to note that one of the 6 ‘representative or interest group’ respondents submitted a letter response which only directly addressed Question 1.1, so it is only analysed in the figures for Question 1.1.

Overarching themes

Respondents tended to agree that guidance should be provided to schools and colleges on how to collect and retain evidence to help award grades in the unlikely event exams do not go ahead as planned.

With regards to the proposed guidance for GCSEs, AS and A levels, and AEA, most respondents agreed that it would be helpful for schools and colleges to collect and retain evidence in a proportionate way in line with their existing arrangements to help determine TAGs if exams do not go ahead in the future. Respondents also generally agreed that this would not add significant burden to students, schools and colleges, beyond their existing assessment arrangements. Respondents were more likely to disagree than agree, however, that the proposed guidance would minimise additional teacher workload. A variety of views were expressed throughout the consultation responses, and some themes emerged in responses across multiple questions.

First, a small number of respondents questioned the need for such arrangements given the low likelihood of exams and assessments not going ahead. Some respondents believed that the guidance would require additional administrative, financial and teacher work time to implement, causing problems for schools and colleges to resource. Respondents here focused on the impact on teacher workload from setting and marking assessments, and on the retention of evidence, as well as a potential increase in costs (for example, printing copies of assessments for students). Several also suggested that only the grades or a small sample of evidence from each cohort should be retained to avoid this.

Second, respondents expressed concerns about the way mock assessments are used and perceived. Some suggested the arrangements would change the purpose of these assessments from formative to summative and emphasised the potential impact on student mental health. Others suggested that mock examination results are not indicative of students’ final performance in formal assessments, and that schools and colleges would decide to introduce additional assessments to gather more representative evidence of student performance. The implications of this focused on the impact on students, a predicted increase in staff workload, and reduced learning time.

Third, some respondents raised concerns that the proposed guidance may be followed inconsistently by different schools and colleges, and worried that this could lead to unfair or unrepresentative outcomes for students, should the evidence retained be needed to award grades. Similarly, some respondents raised concerns that they or others may inadvertently implement the proposals incorrectly. These issues led to several respondents indicating they would want parts of the proposals to be clarified, including how to ensure they were complying with any new regulations. Some went further and suggested that additional regulation to that proposed should be considered to ensure that the proposals were followed consistently by all, to prevent unfair outcomes and to prevent inadvertent failures to comply.

For VTQs and qualifications other than GCSEs, AS and A levels, Project Qualifications and AEA, which are used for progression to further or higher study, the majority of respondents agreed with the proposal that awarding organisations should be required to consider if it is necessary and appropriate to have resilience arrangements in place, and if so, to provide guidance to their centres on the arrangements. Respondents did, however, request further clarification and highlight areas of concern with the proposed approach. These comments were received across multiple questions relating to the proposed approach and covered the following themes.

Respondents highlighted the need for awarding organisations’ arrangements to minimise the burden on centres as far as possible. They suggested that, to do this, awarding organisations offering similar qualifications should seek to align their approaches. Respondents felt that if awarding organisations took different approaches, it would place a greater burden on schools and colleges that offered qualifications from many different awarding organisations. It was also suggested that for some qualifications, formative assessment information already held and collected by schools could be sufficient.

Respondents also asked about the scope of the arrangements. This included requests for further clarification of which qualifications the arrangements would cover, such as whether specific qualifications like functional skills qualifications would be included, and whether the arrangements were just for linear VTQs. Respondents also raised concerns that the arrangements could be misunderstood as creating an expectation that the evidence could also be used on an individual basis where a single student, or students from a single school or college, were not able to take exams or formal assessments.

The respondents that disagreed with the proposed approach tended to feel that there were alternative ways to provide resilience in the qualifications system.

Detailed analysis

Part 1: Ofqual and the Department for Education’s proposed guidance for future resilience arrangements

Questions covered in this section

In this section of the consultation, DfE and Ofqual set out the proposed resilience arrangements for the long-term to facilitate the gathering of evidence of student performance by centres, which would enable grades to be awarded in the unlikely event that exams cannot go ahead as planned.

For GCSEs, AS and A levels, Project Qualifications and AEA, it was proposed that guidance be provided to schools and colleges on gathering and retaining evidence of student performance. This evidence could then be used as an alternative means to award qualifications, such as through TAGs, in the unlikely event that government determined exams should be cancelled again in the future.

For VTQs and other qualifications used alongside or instead of GCSEs, AS and A levels for progression to further or higher study, including Technical Qualifications within T Level, it was proposed that similar arrangements should be put in place, but these would need to be set by awarding organisations to allow the arrangements to reflect the design of their qualifications. Given the wide variety of VTQs, the proposed resilience arrangements would not be appropriate for all. For VTQs and other qualifications that assess occupational or professional competence or proficiency, or act as a licence to practise, including apprenticeship end-point assessments, these resilience arrangements would not apply, as they would not be awarded based on TAGs or other alternative evidence.

It was proposed that awarding organisations should determine for which qualifications the arrangements would be necessary and, where they were, determine what the guidance might be, noting the need for them to take into account the proposed approach for GCSEs, AS and A levels, Project Qualifications and AEA.

The questions asked in this section of the consultation sought views on whether the proposed approach was appropriate and acceptable.

Question 1.1

Do you agree that for the future (on a long-term basis), guidance should be provided to support schools and colleges in gathering evidence of student performance should exams not be able to go ahead as planned?

Response Number of responses
Yes 32
No 6

Forty responses were received to the consultation overall, including one that was submitted as a letter. The letter response only explicitly referenced Question 1.1, so it has been reflected in the analysis of responses to this question only. In total, there were 38 responses to this question. One respondent did not answer ‘Yes’ or ‘No’ but provided comments, and one respondent did not respond to this question at all; these 2 respondents are not featured in the above figures, but the comments were analysed.

Overall, more than three-quarters of respondents agreed that guidance should be provided for long-term use. Less than a fifth disagreed.

Four of the respondents who answered ‘No’ were representing schools and colleges or were personal responses from teachers or members of SLTs. One was from an awarding organisation, and one identified itself as a union. More than three-quarters of respondents across these categories, however, agreed that guidance should be provided.

There were 28 comments received in response to this question. All respondents who did not provide comments to this question agreed with the proposals. This means that the comments may not be representative of the full range of views and may focus on issues where respondents either disagreed with the proposals or supported them but raised some concerns. Some of the comments also pre-empted subsequent questions so many of the themes raised were repeated in response to subsequent questions.

The most common theme in the comments was respondents’ positive reception of the proposed guidance. Those commenting positively suggested that the proposals would provide reassurance about future scenarios where exams could be cancelled. They also suggested that the proposals would not be excessively burdensome to schools and colleges to implement.

“Providing parents, students and teachers with certainty over contingency arrangements should formal examinations once again be cancelled gives clarity and provides peace of mind.” (Interest or representative group)

“The proposals are reasonable and do not place an excessive burden on schools and colleges in so far as they reflect existing practices.” (Awarding organisation)

Most respondents who provided comments did, however, have some concerns about the proposals, even if they were supportive of them overall. In particular, the most common themes raised in the comments were that the proposed guidance could increase teacher workload and/or present additional administrative and financial burdens upon schools, colleges and their staff.

“The financial impact of gathering evidence of student performance - photocopying mock papers etc is costly especially at a time when school budgets are extremely tight… There must also be consideration of the impact the evidence gathering, processing and storing has on workload when the education section is already struggling with excessively high workloads for employees; particularly when the evidence is to be used as a fail safe for an extremely unlikely event.” (Personal – teacher)

Several respondents also held the view that the guidance may encourage or lead to over-assessment of students, which they suggested could reduce learning time and put additional pressure on students.

“The expectation that centres should gather robust evidence during the course is likely to have a detrimental impact on learning. Centres following the guidance would have to reduce learning time/formative assessment to create more time to deliver semi-formal summative assessments.” (Awarding organisation)

Another concern shared by a small number of respondents was that the guidance may not be implemented consistently and may therefore lead to unfair or unequal outcomes.

“The proposed approach attempts to replicate high stakes exams approaches without any of the formal controls required to achieve that level of validity.” (Awarding organisation)

“…All evidence would be different between schools.” (Personal – teacher)

A small number of comments indicated that their authors did not view mock exam results as indicative of students’ attainment in their final assessments.

“There is no mechanism for trajectory from performance in exams taken before the end of linear courses to the likely outcome from the summative assessment in the summer of Year 2 of the course. Every school does mock examinations at different times and for a school like ours we would expect at least one and possibly 2 grades enhancement from January practice exams to the final exams in the summer.” (School or college)

Finally, a small number of respondents indicated that in order to avoid problems with implementation, additional guidance or clarification may be needed, such as the guidance being clearer on the frequency and content of the mock examinations.

Some suggested that additional regulatory mechanisms may also be necessary to ensure compliance with the guidance.

“The nature of this guidance must be transparent and clear for schools and colleges. It should include whether it is mandatory or optional, and what sanctions (e.g., maladministration) would be implemented for not following the guidance. However, the use of the word ‘guidance’ is unhelpful here. If the DfE and Ofqual want all centres to carry out these proposals, then ‘requirement’ is a more transparent and helpful term.” (Union)

Question 1.2

Do you agree that awarding organisations offering VTQs should consider if it is necessary and appropriate to have resilience arrangements in place, and if so, provide necessary guidance to centres?

Please add any comments to explain your response.

Response Number of responses
Yes 31
No 5

Thirty-six responses were received to this question. Over three-quarters of respondents to the question agreed with the proposal. Those that disagreed with the proposal comprised 3 personal responses from senior leadership team members or teachers, one awarding organisation and one union. Views were evenly split among the teachers responding in a personal capacity. Most awarding organisations and unions agreed with the proposals.

Twenty-three comments were received in response to this question, 18 from those that agreed and 5 from those that did not. All respondents who did not provide comments to this question agreed with the proposals. This means that the comments may not represent the full range of views and may focus on issues where respondents either disagreed with the proposals or supported them but raised some concerns.

Several respondents that agreed with the approach talked about the reasons for their support. Respondents mentioned the need for parity of approaches between GCSEs and A levels, and VTQs and other qualifications used for progression to further or higher study, welcoming the proposed flexibility to enable awarding organisations to decide on an approach best suited to their qualifications and highlighting that the design of many VTQs already builds in resilience to the qualifications.

“To ensure that there is parity between general, vocational and technical qualifications used for similar purposes, it is essential that all students are able to achieve grades in the event that exams cannot take place.” (Awarding organisation)

Most respondents that agreed with the overall approach, including both awarding organisations and other types of respondents, also identified areas for further consideration or clarification. This included the need for consistency in approaches across awarding organisations offering similar qualifications. It was felt that this would both reduce the burden placed on schools and colleges and maintain comparability within different types of qualifications, such as T Levels, by ensuring awarding organisations take similar approaches. In addition, one respondent suggested that where differences were necessary, it would be important that the communications with schools and colleges were clear and consistent to avoid confusion.

Two respondents also commented more generally on the burden the arrangements would place on schools, colleges and teachers. This included the financial impact of gathering the evidence of student performance and the administrative burden the arrangements might place on them.

“There must also be consideration of the impact the evidence gathering, processing and storing has on workload when the education section is already struggling with excessively high workloads for employees; particularly when the evidence is to be used as a fail safe for an extremely unlikely event.” (Teacher)

Two respondents also specifically referenced the approach for T Levels, suggesting it would be helpful for there to be consistency across awarding organisations offering these qualifications. One suggested that because there are specific requirements set by Ofqual for T Levels, Ofqual should set the approach to guidance for these qualifications to ensure consistency. Another suggested that awarding organisations should be required to include their guidance when they submit their qualifications, including T Levels, for review by Ofqual and the Institute for Apprenticeships and Technical Education (IfATE).

Several respondents also requested further clarity on which qualifications would be covered by the arrangements. One asked specifically whether the arrangements would only cover linear VTQs. Another asked whether the arrangements would include Functional Skills and ESOL Skills for Life qualifications.

One respondent also mentioned the need to consider the arrangements for nested qualifications, so that students who were taking a nested qualification or part way through a qualification could be certificated where necessary.

One respondent highlighted the need for awarding organisations to keep the arrangements under review and to clearly communicate with their schools and colleges, providing advance notice, if they make a change.

Two respondents also talked about the arrangements that would be necessary if exams and other formal assessments did not go ahead. It was suggested that at least 50% of the work should have been completed for a grade to be given based on alternative evidence. It was also highlighted that it would be necessary for any arrangements to ensure that professional competencies are still assessed.

Of those respondents that disagreed with the approach, most provided negative comments which were not specifically related to the question, such as requesting a return to students taking exams and other formal assessments. Of those that were responding to the question, 2 respondents suggested alternative ways that evidence could be collected. This included using multiple sources of evidence from assessments taken across the course of study or using formative assessment information already held and collected by schools.

One respondent said that results based on any alternative evidence would not be as valid or reliable as results based on exams and formal assessments. It was instead suggested that, given the arrangements were to be used in exceptional circumstances, there was no need to look to replicate the environment of exams and formal assessments to generate that evidence. The respondent suggested that clear communications around the status of any grades issued using this evidence would be important.

The ability of awarding organisations to ensure schools and colleges were following their guidance was also questioned. It was suggested that the approach set an expectation that the evidence retained could then be used to award a grade in other situations, for example, in response to individual adverse circumstances. It was also felt that collecting evidence would have a detrimental effect on teaching and learning.

One respondent also raised concerns that the arrangements may not be implemented consistently and may therefore lead to unfair or unequal outcomes.

Proposed Guidance for GCSEs, AS and A levels, Project Qualifications and AEA

Questions covered in this section

This section of the consultation sought to gather views on the proposed draft guidance to schools and colleges on the gathering and retention of evidence of student performance for GCSEs, AS and A levels, Project Qualifications and AEA.

The draft guidance was designed to minimise the impact for schools, colleges, teachers and students, with arrangements similar to 2023 and scaled back from those in 2021 and 2022, in light of the experience of schools and colleges. The draft guidance was designed to enable teachers to gather evidence in line with their existing formative assessment processes and to best support students preparing to take their exams.

Question 1.3

Do you agree that this proposed guidance for 2024 and beyond would help schools and colleges to collect and retain evidence in a proportionate way in line with their existing arrangements to help determine TAGs if exams do not go ahead in the future?

Response Number of responses
Yes 22
No 13

There were 35 responses received to this question. Overall, nearly 63% of respondents agreed that the guidance would help schools and colleges retain evidence in a proportionate way to help determine TAGs if exams do not go ahead in the future. Four respondents to the wider consultation did not answer ‘Yes’ or ‘No’, of which one did provide comments; these responses are therefore not featured in the above figures, but the comments were included in the analysis.

Of the 13 respondents who answered ‘No’, 8 were either representing schools and colleges (organisational responses from a school or college) or were personal responses from teachers or members of a senior leadership team. Two were from awarding organisations, and 3 identified themselves as ‘Organisational – other’. In the same categories of respondents (those representing schools and colleges, those from awarding organisations, and those identifying as ‘Organisational – other’), however, more than half of respondents answered ‘Yes’.

There were 26 comments received in response to this question.

Of those respondents who did not provide comments, 3 had not answered ‘Yes’ or ‘No’, and 12 answered ‘Yes’. This means the comments are not necessarily representative of the full range of views and may focus on issues where respondents either disagreed with the proposals or supported them but raised some concerns. Some of the comments also pre-empted subsequent questions, so many of the themes raised were repeated in response to subsequent questions.

There were 2 main themes in the comments. The first was respondents’ positivity. Several respondents agreed that the proposed guidance was a proportionate way of supporting schools and colleges to retain evidence. Several comments stated this was partly because the guidance supported existing assessment arrangements. All of the explicitly positive comments came from awarding organisations.

“We agree that the proposed guidance for 2024 and beyond would help schools and colleges to collect and retain evidence proportionally and in line with existing arrangements. This guidance will hopefully reduce any potential unnecessary burden on schools, colleges, and students by ensuring clarity on expectations.” (Awarding organisation)

The second prominent theme was that the proposed guidance would present additional and/or disproportionate administrative and financial burdens upon schools, colleges and their staff. This echoes points raised in response to the first question, particularly with regards to the financial and administrative costs of copying assessment papers.

Several respondents raised similar views to those given in response to the first question that the guidance risks over-assessment and/or inequality in awarded grades if exams were to be cancelled. This is because of a lack of consistency of practice across schools and colleges, and because some schools or colleges may conduct extra assessments to account for student performance as the year progresses.

“The lack of guidance on how the evidence would be used, is likely to result in some schools still wishing to pursue a second set of mocks, closer to the final exams, as it is not clear whether we would be allowed to inflate the grade achieved in, for example, Dec/Jan to reflect the additional learning that would have taken place by the summer.” (Personal – teacher)

A small number of comments referenced concerns that the guidance may lead to additional teacher workload, and/or may cause additional stress to students if they feel mock exam results carry greater significance than they had done previously. These views had also been raised in response to the first question.

“All this does is produce further stress and more pressure on mental health.” (Personal – SLT)

A small number of comments suggested that additional guidance may be helpful to offer clarity and reassurance about how the evidence may be used to award grades if exams were cancelled, and to further minimise burden on schools, colleges and staff.

“Reassures centres re assessment load - best practice guidance could be best.” (School or college)

Finally, one comment indicated overall support for the guidance but suggested that keeping a full set of mock exam papers for each student may be disproportionately burdensome, and that it would be better to retain a sample of the work produced to be used as evidence.

“We agree that this guidance for 2024 and beyond would help schools and colleges collect and retain evidence, but that if it is to be proportionate, then the moderation processes in place in schools and colleges should mean that only a sample of work needs to be retained as evidence.” (Local authority)

Question 1.4

To what extent do you agree or disagree that the guidance set out minimises any additional burden on students beyond the existing assessment arrangements, such as mock exams, in place in centres?

Response Number of responses
Strongly Agree 3
Agree 14
Neither Agree nor Disagree 10
Disagree 4
Strongly Disagree 5

There were 36 responses received to this question. Overall, almost half (47%) of respondents agreed or strongly agreed that the guidance would minimise burden on students. Twenty-seven percent of respondents neither agreed nor disagreed, and 25% of respondents disagreed or strongly disagreed.

Responses did vary among different groups, however. More than half of awarding organisations that responded agreed, while less than a third disagreed or strongly disagreed, and the rest neither agreed nor disagreed. Members of SLTs, and schools and colleges, were more likely to agree or strongly agree. ‘Other representative or interest group’ responses primarily neither agreed nor disagreed, while teachers were more likely to neither agree nor disagree.

In total, we received 25 comments about this question. Of those respondents who did not provide comments, 3 answered ‘strongly agree’, 7 answered ‘agree’, and 2 neither agreed nor disagreed. Many comments were positive in that respondents felt the guidance minimised the potential impact upon students.

“Students would normally expect to take one full set of mock exams, under exam conditions in their final year of study. If the guidance is clear that this is the expectation on centres then this will minimise any additional burden on students.” (Representative or interest group)

One of the issues most frequently raised by those that disagreed or neither agreed nor disagreed was that having this guidance in place was burdensome in terms of the administrative time and costs associated with retaining evidence.

Several respondents commented that the guidance would negatively impact on students’ well-being, by increasing the importance of mock exam results due to their role in awarding grades should exams be cancelled.

“Any assessment, even if only for one mock examination, that could bear even the remote possibility of leading to a final grade carries with it an additional burden that countermands all the work we do about attitudes to learning and embracing mistakes.” (School or college)

A small number of respondents raised, again as elsewhere, concerns that mock assessment results are not indicative of student attainment in their final exams. Some also suggested that the guidance may lead to schools and colleges over-assessing to gather the evidence the guidance requests, and reiterated concerns about the impact of the guidance on students’ well-being.

A small number of responses suggested that the guidance is not strict enough, in that it is guidance rather than a mandated requirement. Some respondents suggested that this could lead to issues such as inconsistency or some schools and colleges over-assessing their students.

“Guidance is not mandatory. Different centres will take different approaches or even no approach.” (Awarding organisation)

Another respondent did not comment on whether the guidance minimised the impact upon students, but instead repeated comments they made elsewhere about the burden they believed the proposals would place upon schools and colleges and their staff.

“In order to minimise additional burden, a sample of work being kept would be sufficient. This would allow teachers to share work with students to use formatively for improvement from mock exams. It would minimise the additional burden if the retained evidence could be kept as a copy either electronically or in physical form for a sample of the cohort.” (Local authority)

Question 1.5

To what extent do you agree or disagree that the guidance set out above would minimise any additional teacher workload beyond existing assessment arrangements, such as mock exams, in place in centres? Please add any comments to explain your response:

Response Number of responses
Strongly Agree 1
Agree 12
Neither Agree nor Disagree 6
Disagree 8
Strongly Disagree 9

There were 36 responses to this question. Overall, 36% of respondents agreed or strongly agreed that the guidance would minimise additional teacher workload. Nearly 17% neither agreed nor disagreed, while 47% respondents disagreed or strongly disagreed.

Looking at the responses from different groups, disagreement (including ‘strongly disagree’) was highest amongst senior leaders (71%) and teachers (100%) who were providing a personal response. Most awarding organisations responding agreed (54%) that the guidance would minimise additional teacher workload, while 28% either disagreed or strongly disagreed, and the rest neither agreed nor disagreed. Forty-two percent of organisational school or college respondents agreed, with the rest of organisational school and college respondents being evenly split between those who either disagreed or strongly disagreed, and those who neither agreed nor disagreed. Slightly more representative or interest groups disagreed than agreed, but the numbers were small (only 3 respondents).

Of the 36 respondents to this question, 29 included comments. Of those respondents who did not provide comments, one answered ‘Strongly agree’, 4 answered ‘agree’, one neither agreed nor disagreed, and one answered ‘disagree’.

Most commonly, the concerns raised by respondents in their comments centred on teacher workload being increased due to the view that they would spend additional time facilitating, marking, and retaining evidence of mock exams.

“The section of the guidance on retention of work will increase teacher workload as recognised by Ofqual. Making copies of evidence, scanning for digitisation and indexing for future reference all takes significant time in addition to teachers’ ordinary workload and will also create a cost burden on centres.” (Representative or interest group)

Additionally, several respondents repeated concerns raised elsewhere that the proposals may lead to additional assessments being put in place for students, either to ensure students can demonstrate as strong a performance as possible, or because of changes in internal processes to accommodate the guidance. They raised these concerns again here in the context of the impact additional assessments could have on teacher workload, such as through increased marking.

There were, however, several positive comments made by those who agreed that the guidance minimises additional teacher workload. These comments suggested that teacher workload would not be increased because the guidance seeks to fit in with existing assessment arrangements in schools and colleges.

“We feel the language is clear that over-assessing students is not desirable, and therefore additional workload on teachers through marking of assessments should not go beyond those arrangements typically adopted in schools and colleges each year.” (Awarding organisation)

A small number of other comments suggested that the proposed guidance was not sufficiently clear or prescriptive, or that additional guidance should be provided to ensure consistency, prevent over-assessment, and minimise additional teacher workload.

“The principle set out in the guidance at Annex A that total assessment time should not exceed the total time that students would spend taking exams for the relevant qualification, plus any time spent on non-exam assessment is important. The excessive resilience-related assessments referred to in the consultation document in 2023 were the result of centre practices that ignored this principle. As a minimum expectation, it is essential that centres do not depart from this principle without a justifiable reason for doing so…. This principle should, therefore, be reflected in qualification specifications given that some centres used their discretion to ignore it despite its inclusion in previous versions of the guidance set out at Annex A.” (Representative or interest group)

Question 1.6

Are there any parts of the guidance which you think could be improved? Please be specific about which part of the guidance you are referring to and how it might be improved.

There were 30 responses to this question. Comments covered a wide range of topics, not all of which were related to the guidance in question. Some respondents simply reiterated that they did not support the proposed guidance and associated arrangements without referencing particular parts or offering suggestions for improvement. A small number of respondents commented but only to note that they did not have any specific suggestions for improvement.

Some comments reiterated points already made in response to other questions. These included comments stating that the retention of scripts would be an administrative burden and may result in additional teacher workload.

Of those which suggested improvements, several suggested it would be less burdensome if schools and colleges were able to either retain student assessment scores rather than the assessment papers, or if they were able to retain a sample of the papers instead of all of them.

“The guidance should be clear that samples of work need to be retained at each grade but that it is not necessary for every piece of work for every student to be retained, provided that the centre has documented its assessment and moderation processes.” (Local authority)

“Retention of evidence - could the school not simply keep marks (which are recorded as a matter of course) rather than pupil work which adds an element of administration that should not be discounted.” (Personal – SLT)

A few comments referenced concerns that mock assessment grades may not be indicative of students’ final exam attainment. Some of these requested clarity or made suggestions on how teachers could represent expected student performance improvement in the months after a mock assessment in the event TAGs, or similar, had to be used. This is beyond the scope of the consultation, which focuses on the gathering and retention of evidence; the proposed guidance made it clear that further guidance on TAGs would be provided when required.

Several comments indicated that some respondents felt additional rules and regulation would be necessary to ensure use, and consistency in use, of the guidance by schools and colleges. These comments focused on the regulatory status of the guidance rather than the content therein.

“To level up - have national fairness and equity - all schools should be mandated to offer only one set of mocks per academic year. If you do not have national rules - there, logically, can only be iniquity.” (Personal – teacher)

There were a small number of respondents who suggested additional information and clarity on specific circumstances would be helpful, such as how long evidence should be retained for, or what information should be retained in addition to evidence of student performance.

“The guidance on the retention of evidence could be strengthened by adding additional information on the need to retain the information until after the deadline for any appeals has passed. This will ensure that centres are clear that the retention should extend beyond the date upon which grades are issued to learners. It could also be improved by stating that the retention of information about any special consideration applied to the assessment is also required until after the deadline for any appeals. This will ensure that evidence that is useful in making an appeal decision is available to AOs.” (Awarding organisation)

One respondent gave a detailed response concerning the particular needs of students for whom English is not their first language. They suggested additional guidance may be required for schools and colleges in this instance to ensure those students are not disadvantaged in the event that grades are awarded by alternative means to exams. This response relates more strongly to the Equality Impact Assessment, which is analysed later.

“Centres must ensure that all relevant information regarding the proposed arrangements should be clearly relayed to pupils using EAL and their parents/carers and fully understood by them. It is important to remind centres that parents and carers may also be new to English or at the early stages of language acquisition themselves, may not be literate in English, and/or may be unfamiliar with the English education system and how it works.” (Organisational – other)

Private Candidates

Questions covered in this section

This section of the consultation focuses on the proposed resilience arrangements for private candidates. The proposed guidance stated that some private candidates may want centres to assess them during the academic year, alongside the centre’s students, in line with the guidance. Alternatively, private candidates could be assessed only in the unlikely event it is confirmed that exams will not take place as intended, in which case they would be assessed in a compressed period.

Question 1.7

To what extent do you agree or disagree that this would be the best approach for private candidates? Please add any comments you have on the proposed approach, and/or any views you have on alternative approaches.

Response Number of responses
Strongly Agree 2
Agree 18
Neither Agree nor Disagree 11
Disagree 2
Strongly Disagree 2

There were 35 responses to this question. More than half of respondents (57%) strongly agreed or agreed that the guidance would be the best approach for private candidates. This was followed by 31% of respondents neither agreeing nor disagreeing and only 11% disagreeing or strongly disagreeing. Of those that disagreed or strongly disagreed, 2 were awarding organisations, one was a school or college, and one was a member of a senior leadership team; respondents who agreed, or neither agreed nor disagreed, were a mixed group.

There were 29 comments in response to this question. Of those respondents who did not provide comments, 4 answered ‘agree’ and 2 neither agreed nor disagreed.

As with other questions, several comments raised issues which have been covered under the section above on overarching themes, including the additional burden, costs and time for schools, colleges and their staff.

“Given the need to gather evidence through the academic year, this probably creates additional burden for centres and private candidates but is also a ‘least bad’ option.” (Awarding organisation)

There were 2 main themes to comments offered in response to this question. The first centred on concern for potentially unequal opportunities for private candidates to benefit from this guidance, due either to the lack of access to schools and colleges willing to accept private candidates, or a lack of consistency between those centres in following the guidance.

“Some private candidates struggle to find a centre willing to accept their entry within a reasonable travelling distance of their home. It is therefore likely that a significant proportion of private candidates will be unable to find a centre willing to host additional assessments in addition to hosting the exam itself.” (Awarding organisation)

The second significant theme that emerged was the view that further information may be required for schools and colleges about how to manage private candidates in line with the guidance. Some of these comments were beyond the scope of this consultation as they referred to the need for further guidance when determining grades, rather than further guidance for how to gather and retain evidence.

“Further clarification is required. Centres will need guidance on the process which they will be expected to be followed to standardise the evidence from private candidates with that of students taught by the centre, as they will be assessed at a different time, and many will have a different evidence base.” (Awarding organisation)

“It is unclear who would choose the assessment, the contents and duration and who will be marking this work.” (Awarding organisation)

Part 2: Ofqual proposals on Conditions and statutory Guidance

The questions in this section of the consultation relate to Ofqual’s proposed regulatory approach outlined in Part 1 of this consultation. The proposals relate to the General Conditions of Recognition, Qualification Level Conditions and statutory Guidance that would be put in place to implement the policy proposals outlined in Part 1.

Question 2.1

Do you have any comments on the drafting of Condition C2.6?

Twelve respondents provided comments to this question. This included 7 awarding organisations, 2 unions, 2 teachers and one other representative or interest group. The remaining respondents to the consultation (27 respondents) either did not answer the question or confirmed that they did not have any comments.

Of those that provided comments, several respondents suggested the use of the word ‘throughout’ in the drafting of the Condition implies that gathering the evidence would involve several series of assessments. It was suggested this could be changed to ‘during’ instead, which would also give awarding organisations more flexibility as to when in the year it would be most appropriate to gather the evidence.

“It would be helpful to change the word ‘throughout’ to ‘during’ C2.6 (a)… ‘gather evidence throughout the academic year’. Using the word ‘throughout’ suggests that Centres should be constantly gathering evidence.” (Awarding organisation)

Several respondents also mentioned the burden on schools and colleges and the need to ensure any guidance developed by awarding organisations would not place unnecessary burden on them. This included one respondent who raised concerns that the Condition did not set any limitations on the assessment an awarding organisation might consider.

“The wording of Condition C2.6 does not set any limitations on the assessment an awarding organisation might consider. It does not reflect the guidance to centres which suggests one set of mocks under exam conditions is sufficient.” (Union)

One respondent raised concerns that the drafted Condition left the need to gather evidence at the discretion of awarding organisations. The respondent felt this would lead to inconsistencies across awarding organisations which could potentially damage the public perception of the qualification. The respondent did, however, suggest this risk could be mitigated by JCQ and other similar organisations.

Another respondent questioned how awarding organisations would ensure that, where necessary and appropriate, schools and colleges were collecting evidence of student attainment.

It was also suggested that the Condition should not come into effect partway through an academic year without being amended to take into account the workloads of teachers and students.

One respondent also disagreed that a new Condition requiring all awarding organisations to consider if it is necessary and appropriate to have resilience arrangements in place should be introduced. Instead, they said, awarding organisations should be required to include their guidance when they submit their qualifications, including T Levels, for review to Ofqual and IfATE as part of any funding or other approvals process.

Comments were also made on the drafting. One respondent suggested the word ‘resilience’ was not being used in the way it was ordinarily used and so suggested that it would be helpful for Ofqual to either define ‘resilience’ or remove it from the drafting. It was also suggested that drafting of the Condition should be amended so that each of the points follow on from the stem of the Condition.

One respondent also commented on the drafting of the statutory Guidance, and so that response has instead been considered under Question 2.2.

One respondent’s comment did not relate to the drafting of the new Condition or guidance and so was beyond the scope of this question.

Question 2.2

Do you have any comments on the drafting of the addition to statutory Guidance for Condition C2?

Thirteen respondents provided comments to the question. This included 8 awarding organisations, 2 unions, 2 teachers and one other representative or interest group. The remaining respondents to the consultation (26 respondents) either did not answer the question or confirmed they did not have any comments.

Of those that provided comments, 2 respondents suggested it would be helpful for the Guidance to clarify the circumstances when evidence gathered might be used. They suggested that some awarding organisations, particularly those that do not offer GCSEs and A levels, might misunderstand the arrangements and think they could apply to other adverse circumstances.

“It is important to ensure everyone is aware this is only where “exams did not go ahead” at all, for any learners. There is a risk that awarding organisations or centres could misunderstand and think this relates to individual special consideration cases or issues for specific centres (like a fire or flooding).” (Awarding organisation)

Several respondents made specific comments on the drafting. Two respondents repeated the request to use another word instead of ‘throughout’ in the drafting as it suggests that schools and colleges should be constantly gathering evidence. Another suggested that where the statutory Guidance lists the situations when it might be appropriate for centres to gather evidence, it would be helpful to make it clearer that it would be when all 3 criteria apply.

Several respondents requested additional guidance to clarify for which qualifications the arrangements would be necessary. It was also asked what the arrangements would be should exams and other formal assessments not go ahead for qualifications that act as a licence to practise.

“Further guidance on what Ofqual defines as ‘similar VTQs’ is needed and whether this will follow the same approach used during COVID-19 to identify similar VTQs to GQs, or whether there will be a difference.” (Awarding organisation)

One respondent also suggested that the statutory Guidance should include the need for awarding organisations to consider the arrangements for nested qualifications (where a smaller qualification is part of, or nested within, a larger qualification), so that students taking a nested qualification or part way through a qualification can be certificated where necessary.

One respondent also commented that just because a qualification is taken alongside or instead of a GCSE or A level, it does not mean similar approaches would be appropriate. They suggested it would be helpful for the statutory Guidance to provide assurances that for some qualifications, the arrangements needed for schools and colleges to collect evidence of student attainment might be different to those set out in the guidance for GCSEs, AS and A level, Advanced Extension Awards and Project Qualifications.

One respondent suggested it would be helpful to explain what would be deemed ‘other suitable evidence’, such that it would not be necessary for schools and colleges to collect evidence of student attainment. The respondent suggested this might differ between an established qualification and one that has only just started to be taught, where familiarity with subject content and assessment structures is likely to be lower.

Two respondents made comments that were out of scope of this consultation. Several respondents also repeated their comments on the drafting of the Condition already made in response to question 2.1.

Question 2.3

Do you have any comments on our proposal to amend Condition GCSE4.8?

There were 28 responses to this question; however, the majority simply responded ‘No’ to indicate they had no comments. Five respondents offered comments. Given their small number, there were no particular themes to these comments, but some are highlighted below.

A small number of the comments made were positive, indicating support for the proposed amendment to Condition GCSE4.8.

“It makes sense to have a Condition whereby AOs should alert centres to the importance of evidence gathering. This seems a sensible approach.” (Representative or interest group)

A small number of respondents emphasised that it would be important for Ofqual to communicate effectively with awarding organisations about changes to the guidance so they could be confident they were supplying schools and colleges with accurate information, and that they were following the guidance correctly.

Question 2.4

Do you have any comments on our proposal to amend Condition GCE4.3?

There were 28 responses to this question. However, the majority responded ‘No’ to indicate they had no comments. Six respondents offered comments. Given so few respondents offered comments, the comments are not necessarily representative of the full range of views and may focus on some views more than others. Some of the comments may also be pertinent to questions asked elsewhere.

There were no particular themes to the comments due to their small number but, as in responses received for the previous question, some respondents wanted additional information or reassurance concerning clear communication from Ofqual to ensure they could be confident in their compliance.

Question 2.5

Do you have any comments on the changes to the title of Condition Project1 and the drafting of Condition Project1.2?

There were 29 responses to this question. The majority, however, simply responded ‘No’ to indicate they had no comments. Seven respondents offered comments. Given so few respondents offered comments, the comments are not necessarily representative of the full range of views and may focus on some views more than others. Some of the comments may also be pertinent to questions asked elsewhere.

There were no particular themes to the comments, as there were so few responses. Notably, however, some indicated that they were unsure if Project Qualifications should be included in the guidance.

“Inclusion of the Project qualification in this guidance seems unnecessary… the qualification does not match Ofqual’s description of qualifications that are likely to be in scope (i.e. where the qualification does not include non-exam assessment, or it follows an academic year cycle and there are limited assessment opportunities).” (Awarding organisation)

Question 2.6

Do you have any comments on the drafting of Condition AEA4.3?

There were 28 responses to this question. The majority, however, responded ‘No’ to indicate they had no comments. Six respondents offered comments. Given so few respondents offered comments, the comments are not necessarily representative of the full range of views and may focus on some views more than others. Some of the comments may also be pertinent to questions asked elsewhere.

There were no particular themes to the comments, as there were so few responses. A small number of responses, however, referenced the burden these changes and new Conditions could place upon schools and colleges. As elsewhere in the consultation, this burden was attributed to the administrative costs of implementing the guidance, whether through evidence retention processes or the work of understanding and engaging with the new requirements.

An awarding organisation also referenced the potential difficulty for schools and colleges in identifying students likely to take the AEA early enough in the year to collect evidence of student performance.

“It should be noted that as the AEA is designed to provide extra challenge to some students following A level maths courses, centres may not know at the start of an academic year which of their students (if any) will be entered for the AEA exam as well as the GCE maths exams. This decision may only be made around Feb/March of the final year when exam entries are due to be made.” (Awarding organisation)

Equality impact assessment

Questions covered in this section

In developing the proposals included in the consultation, there was consideration of the impact that the proposals might have on students because of their protected characteristics. In this section of the consultation, respondents were asked if they agreed with the impacts identified by DfE and Ofqual, whether there were other impacts not identified, and whether there were additional ways to mitigate these impacts.

Question 3.1

Do you believe the proposed arrangements (any or all) would have a positive impact on particular groups of students because of their protected characteristics?

Response Number of responses
Yes 10
No 20

There were 30 responses to this question. Of those that did respond, two-thirds did not think the proposed arrangements would have a positive impact on particular groups of students because of their protected characteristics. One third of respondents thought there would be a positive impact.

Among the respondents believing there would be a positive impact, awarding organisations made up the majority (70%), with the rest being schools or colleges, or teachers.

Among the respondents that did not believe there would be a positive impact, teachers, schools or colleges, and awarding organisations were the joint largest categories, each accounting for 20%. A further 15% were representative or interest groups and 15% were members of SLTs. Overall, within each respondent type, more respondents did not think there would be a positive impact than did.

Question 3.2

Do you believe the proposed arrangements (any or all) would have a negative impact on particular groups of students because of their protected characteristics?

Response Number of responses
Yes 10
No 20

There were 30 responses to this question. Of those that did respond, two-thirds did not think the proposed arrangements would have a negative impact on particular groups of students because of their protected characteristics. One third of respondents answered ‘yes’, indicating they thought there would be a negative impact.

Among those that answered ‘yes’, the only respondent type in which a majority responded ‘yes’ was representative or interest groups, though the numbers were small. Forty per cent of those who responded ‘yes’ were awarding organisations; however, the majority of that respondent type indicated they did not think there would be a negative impact.

Awarding organisations made up the majority of those who did not think there would be a negative impact, at 35%. This was followed by schools and colleges at 25% and members of SLTs at 20%. A local authority respondent and a representative or interest group also did not think there would be a negative impact.

Question 3.3

Do you have any comments on the impact of the arrangements on particular groups of students because of their protected characteristics?

There were 27 responses to this question. It is worth noting that nearly half of these were ‘N/A’, ‘No’, or ‘None’ responses. Of those that did offer substantive comments, most were awarding organisations, with a small number of other organisations and teachers responding in a personal capacity too. Some of these responses were very detailed, and several themes emerged.

Many of the comments referenced a reported increase in the number of students requesting reasonable adjustments and additional support during their studies and for their exams. Some of these comments went on to suggest that schools and colleges may struggle to resource this increase, which could negatively impact particular groups of students by preventing them from completing a full set of mock exams or limiting their performance during these assessments.

“Centres are already struggling to meet the increased demand for access arrangements due to a lack of space, staff and resources (communicate-ed, 2022) and may struggle to replicate exam conditions fully in a mock series.” (Representative or interest group)

Several comments highlighted that students with inconsistent school attendance, or those with inconsistent performance because of attendance issues or other circumstances, may be disadvantaged. The comments suggested they would have fewer opportunities to complete a full set of mock examinations or would be underperforming at the time of the mock exams.

“Some students may have a negative impact where they have not attended an educational setting for varying reasons. They may not have enough of the required ‘evidence’ to submit for an overall grade.” (Awarding organisation)

Some comments suggested students with protected characteristics or in particular socio-economic circumstances may be more disadvantaged than other students. This could be because of the lasting effects of the pandemic, or could arise in the event of a future large-scale crisis in which education is disrupted and/or exams are cancelled. In particular, these respondents highlighted the greater difficulty such students face in accessing education during the course of their studies.

“In a public health emergency, it seems likely that more disadvantaged students will be disproportionately affected, e.g., missing classes, facing more barriers to access, experiencing digital exclusion etc.” (Representative or interest group)

A small number of comments were positive about the impact the proposals could have on particular groups of students, suggesting they may alleviate anxiety by enabling practice for formal assessments, or provide reassurance about scenarios in which government determines that exams cannot go ahead, or in which students are unable to complete their exams for personal reasons.

“Additionally, some students may find the assessments used for gathering evidence beneficial for their preparation for formal exams and assessments.” (Awarding organisation)

Finally, a small number of respondents suggested there may be disadvantages for those students undertaking qualifications other than GCSEs, AS and A levels, Project Qualifications and AEA, due to the lack of centrally set guidance for these.

“There is a risk that students are disadvantaged because the guidance is only provided for some qualifications. Students for other qualifications, where it might be a valid approach, would be disadvantaged.” (Awarding organisation)

Regulatory impact assessment

Questions covered in this section

This section of the consultation asked respondents whether there were additional activities associated with delivering the proposed resilience arrangements that had not been identified in the consultation, whether the proposed arrangements would incur additional costs, and whether there were alternative approaches that could reduce burden and costs.

Question 4.1

Do you believe resilience arrangements in place for 2023 increased the burden on schools, colleges and staff over and above business as usual?

Response Number of responses
Yes 20
No 11

There were 31 responses to this question. Of those that did respond, about two-thirds thought the resilience arrangements in place for 2023 did increase burdens on schools, colleges and staff over and above business as usual. Approximately a third did not.

Of those that answered ‘no’, the most common respondent type was ‘Awarding organisation’, accounting for just under half. Teachers, members of SLTs, and schools and colleges accounted for half of those who answered ‘yes’.

Eighteen comments were received in response to this question. The most common theme was the view that the arrangements created additional burden in terms of the administrative and financial costs associated with retaining evidence of student performance, including the impact this had on teacher workload. This point has also been raised in relation to other questions in the consultation.

“The expectation to retain evidence over the last few years placed a significant burden on schools and colleges. While most schools and colleges already ran mock exam series, they were not used to retaining the evidence from these. This became either a physical challenge (storing thousands of scripts) or a workload challenge (scanning in scripts page by page).” (Representative or interest group)

Other comments suggested the arrangements had led to additional assessments of students, which also increased burden in terms of teacher workload.

“The arrangements for 2023 were not sufficiently clear about what level of assessment was necessary to provide evidence for a TAG if needed. As recognised by Ofqual, this lack of clarity led to some centres putting on assessments in addition to an ordinary set of mock exams. This increased burden on centres and staff with a consequential reduction in teaching time and increase in workload around the administration of assessments.” (Representative or interest group)

A small number of respondents highlighted that the guidance also had implications for schools’ and colleges’ resources because some students required special consideration and/or reasonable adjustments at the time of their mock exams.

“There are more learners requesting small group or individual rooms for exams. There are more learners than ever asking for special consideration under mental health arrangements than seen previously.” (School or college)

There were also, however, positive comments about the 2023 arrangements, with some respondents suggesting the additional burden had been minimal due to the guidance enabling schools to comply where possible using existing mock exam arrangements.

“The requirements prompted some centres to consider and plan their approach to mocks and to how they quality assured all aspects of the process, including marking, more carefully than they might otherwise have done and this will have led to additional work. However, most centres have reported that the requirements fitted well with their business as usual arrangements.” (Awarding organisation)

Question 4.2

Do you believe the proposed resilience arrangements for 2024 and beyond will increase burden on schools, colleges and staff over and above business as usual?

Response Number of responses
Yes 22
No 11

There were 33 responses to this question. Of those who responded, two-thirds answered ‘yes’ and a third answered ‘no’.

Awarding organisations made up approximately 55% of the ‘no’ responses. As with the previous question, teachers, schools and colleges, and members of SLTs were collectively most likely to answer ‘yes’, making up 50% of those who believed the proposed arrangements would increase burden over and above business as usual.

There were 21 comments received in response to this question. Of those respondents who did not provide comments, 7 had answered ‘Yes’ and 5 had answered ‘No’.

The most common theme among the comments was the view that the arrangements would create additional burden for schools, colleges and their staff, in terms of the administrative and financial costs associated with retaining evidence of student performance, including the impact on teacher workload.

“The cost of someone processing paper exam papers etc to be able to store them digitally is large. It is time consuming.” (Personal – teacher)

“Unless schools have planned for secure mocks already. An additional activity or need to store evidence will increase workload and stress.” (Awarding organisation)

A few responses highlighted concerns that the guidance could lead to additional assessments, and that this would have a corresponding impact on teacher workload. It was also suggested there would be an impact on students due to additional assessments possibly leading to less time being available for learning.

“The expectation that centres should gather robust evidence during the course is likely to have a detrimental impact on learning. Centres following the guidance would have to reduce learning time/formative assessment to create more time to deliver semi-formal summative assessments. This will be to the detriment of students.” (Awarding organisation)

There were also a small number of positive comments suggesting the proposed arrangements would be unlikely to increase the burden on schools, colleges and staff. This feedback centred on the belief that the guidance could be applied within existing arrangements for most schools and colleges.

“It should be possible to integrate these arrangements into planned assessment schedules with relatively little additional work.” (Representative or interest group)

Question 4.3

Do you believe resilience arrangements in place for 2023 had an overall positive, neutral or negative impact on students?

Response Number of responses
Positive 7
Neutral 20
Negative 3

There were 30 responses to this question. Of those that responded, two-thirds indicated they felt the 2023 arrangements had a neutral impact on students. Almost a quarter felt the impact was positive, and only 10% felt it was negative.

The respondent types were fairly evenly split across the different responses. Awarding organisations were more likely to answer ‘neutral’ (61%), with the remainder answering ‘positive’. Members of SLTs made up the majority of the ‘negative’ responses, although these were very small numbers in total. Collectively, teachers, schools and colleges, and members of SLTs were most likely to answer ‘neutral’.

Question 4.4

Do you believe the proposed resilience arrangements for 2024 and beyond will have a positive, neutral or negative impact on students?

Response Number of responses
Positive 8
Neutral 18
Negative 6

There were 32 responses to this question. Of those who did respond, more than half believed the proposed arrangements for 2024 and beyond would have a neutral impact on students. Of the remainder, slightly more thought the impact would be positive than negative (25% against 19%).

Awarding organisations were fairly evenly split between thinking there would be a positive or a neutral impact and made up the majority of ‘positive’ responses (approximately 63%). Collectively, teachers, schools and colleges, and members of SLTs were most likely to think the impact would be neutral. The negative responses were predominantly from awarding organisations (28%), teachers (22%), members of SLTs (17%), and schools and colleges (17%).

There were 19 comments in response to this question.

Many comments were positive about the proposed guidance and its potential impact on students, suggesting it offered a ‘safety net’ and a sense of reassurance about what would happen if exams were cancelled again.

“This should benefit learners as it is a safety net to ensure their progress against the learning outcomes is monitored and evidenced.” (Awarding organisation)

The most prominent theme among the less positive comments was concern for the wellbeing of students, in terms of the potential for increased stress and anxiety if students feel their mock assessments have become more important as a result. Some of these respondents, however, reflected that despite the additional stress, it was still positive to have resilience arrangements in place.

“We have been informed by our members that some students who are anxious about exams, including those whose anxiety levels increased because of Covid, have been negatively impacted by changes such as these.” (Representative or interest group)

A small number of responses reiterated concerns about the potential for inconsistency in how schools and colleges follow the guidance, and the subsequent potential for some to introduce additional assessments, which could negatively impact students.

“If the guidance is left open to interpretation, as it currently is, and this leads to a significant number of centres over-assessing their students then the impact will be negative.” (Representative or interest group)

A small number of responses drew out the differential impact the proposed guidance could have on certain groups of students, for example if they have protected characteristics or if English is not their first language.

“For some pupils using EAL the arrangements are likely to be beneficial, as ensuring that they have experience of formal assessments will help them to better prepare for their exams in the summer. However, anxiety is likely to be even greater for recently arrived pupils who will also have been wrestling with learning a new language.” (Organisational – other)

Question 4.5

Are there additional burdens associated with the delivery of the proposed arrangements on which we are consulting that we have not identified above?

Response Number of responses
Yes 9
No 19

There were 28 responses to this question. Approximately two-thirds of respondents answered ‘no’, and the remaining third answered ‘yes’. Awarding organisations represented the majority of the ‘yes’ responses, but gave almost as many ‘no’ responses. All schools and colleges that answered this question answered ‘no’, as did three-quarters of both teachers and members of SLTs. The remaining responses were spread evenly across respondent types.

There were 13 comments in response to this question.

Several of the respondents reiterated points they or others had made elsewhere in the consultation, such as concerns about the administrative and financial cost of retaining evidence of student performance, and the impact on teacher workload.

One respondent referenced a lack of information regarding how teachers could be confident the evidence retained is representative of student performance if some students change tier of entry after their mock exam.

“Many teachers use the mocks as a judge of the final tier of entry and so the proposed arrangements make no reference to this and provide no guidance on approaches to take if the evidence base indicates that the student is taking papers from the wrong tier.” (Awarding organisation)

One respondent suggested the guidance would also increase demand on awarding organisations, because evidence of student performance could potentially be used to address issues other than a scenario in which government determines that exams cannot go ahead.

“There would also be an impact on AOs in terms of responding to queries about whether the alternative evidence that the centre has gathered can be used to award a grade to students who have missed assessments.” (Awarding organisation)

One respondent raised concerns that if the proposals were implemented, it may encourage the view that there are alternative means of achieving a grade if individual students are unable to sit exams or have faced other adverse circumstances.

“There is a large risk that this will lead to some centres expecting the awarding organisation to use the data for cohort or individual adverse circumstances.” (Awarding organisation)

Question 4.6

What additional costs do you expect you would incur through implementing the proposed arrangements on which we are consulting? What costs would you save? Please distinguish in your response between those costs or savings that relate to preparing to put the proposed arrangements in place, from those that would only be realised if the arrangements were required.

There were 23 responses to this question. The most common theme in the responses was the administrative cost of retaining evidence of student performance.

“Photocopying costs: paper, ink, time of person operating photocopier, time to arrange the copied papers, time to store safely and time to rotate historical evidence.” (Personal – teacher)

There were a few comments from awarding organisations suggesting there would be additional costs associated with any requirement upon them to assess whether the proposed guidance and Conditions were relevant to their qualifications, and any subsequent requirement upon them to issue suitable guidance about this. These costs were identified as being primarily related to the need for additional staff, or more time from existing staff, to develop new guidance, engage with schools and colleges and respond to enquiries about any new guidance.

“Costs of creating the guidance. Costs of monitoring schools (if that is expected from Ofqual). Costs to monitor and maintain additional regulations.” (Awarding organisation)

Several comments reiterated concerns about an increase in teacher and wider staff workload, in terms of communication with parents and students, additional support to those with protected characteristics or language barriers, additional invigilation requirements, and administration.

Question 4.7

Do you have any views on how we could reduce burden and costs while achieving the same aims?

There were 20 responses to this question. Most of the suggestions for reducing burden and costs centred on retaining alternative evidence of student attainment, such as only the marks achieved in mock exams or a sample of mock exam papers, to avoid the costs associated with retaining larger volumes of evidence.

“We propose that centres document their processes and are required to keep only a sample of work at the different grades.” (Local authority)

A small number of respondents suggested there was no need for the proposals at all, with one suggesting exams should go ahead regardless of future circumstances.

“One option is committing to run the examinations in a safe way regardless of the situation.” (Awarding organisation)

Some comments were out of scope of the consultation rather than suggestions that might reduce the associated burden and costs.

Question 4.8

Are there any examples of best practice for evidence retention which reduce financial and administrative costs which you are able to share with us?

There were 19 responses to this question. Most respondents commented to say they had no suggestions, or were not aware of any best practice, and so there were few themes to analyse. The only significant theme was that several comments referenced digital solutions to help reduce the costs of retaining evidence of student performance.

Appendix A: Analytical tables of the responses to the quantitative questions aggregated over all respondent types

Respondent type Number of respondents
Other representative or interest group 6
Awarding organisation 13
Personal – SLT (senior leadership team) 7
Personal – teacher (responding in a personal capacity) 4
Organisational – local authority 1
Organisational – other 2
Organisational – schools or college 7
Total number of respondents 40

Breakdown of responses for each question

Question 1.1

Do you agree that, for the future (on a long-term basis), guidance should be provided to support schools and colleges in gathering evidence of student performance should exams not be able to go ahead as planned?

Response Number of responses
Yes 32
No 6
Not Answered 2

Question 1.2

Do you agree that awarding organisations offering VTQs should consider if it is necessary and appropriate to have resilience arrangements in place, and if so, provide necessary guidance to centres?

Response Number of responses
Yes 31
No 5
Not Answered 3

Question 1.3

Do you agree that this proposed guidance for 2024 and beyond would help schools and colleges to collect and retain evidence in a proportionate way in line with their existing arrangements to help determine TAGs if exams do not go ahead in the future?

Response Number of responses
Yes 22
No 13
Not Answered 4

Question 1.4

To what extent do you agree or disagree that the guidance set out minimises any additional burden on students beyond the existing assessment arrangements, such as mock exams, in place in centres?

Response Number of responses
Strongly Agree 3
Agree 14
Neither Agree nor Disagree 10
Disagree 4
Strongly Disagree 5
Not Answered 3

Question 1.5

To what extent do you agree or disagree that the guidance set out above would minimise any additional teacher workload beyond existing assessment arrangements, such as mock exams, in place in centres? Please add any comments to explain your response:

Response Number of responses
Strongly Agree 1
Agree 12
Neither Agree nor Disagree 6
Disagree 8
Strongly Disagree 9
Not Answered 3

Question 1.7

To what extent do you agree or disagree that this would be the best approach for private candidates? Please add any comments you have on the proposed approach, and/or any views you have on alternative approaches.

Response Number of responses
Strongly Agree 2
Agree 18
Neither Agree nor Disagree 11
Disagree 2
Strongly Disagree 2
Not Answered 4

Question 3.1

Do you believe the proposed arrangements (any or all) would have a positive impact on particular groups of students because of their protected characteristics?

Response Number of responses
Yes 10
No 20
Not Answered 9

Question 3.2

Do you believe the proposed arrangements (any or all) would have a negative impact on particular groups of students because of their protected characteristics?

Response Number of responses
Yes 10
No 20
Not Answered 9

Question 4.1

Do you believe resilience arrangements in place for 2023 increased the burden on schools, colleges and staff over and above business as usual?

Response Number of responses
Yes 20
No 11
Not Answered 8

Question 4.2

Do you believe the proposed resilience arrangements for 2024 and beyond will increase burden on schools, colleges and staff over and above business as usual?

Response Number of responses
Yes 22
No 11
Not Answered 6

Question 4.3

Do you believe resilience arrangements in place for 2023 had an overall positive, neutral or negative impact on students?

Response Number of responses
Positive 7
Neutral 20
Negative 3
Not Answered 9

Question 4.4

Do you believe the proposed resilience arrangements for 2024 and beyond will have a positive, neutral or negative impact on students?

Response Number of responses
Positive 8
Neutral 18
Negative 6
Not Answered 7

Question 4.5

Are there additional burdens associated with the delivery of the proposed arrangements on which we are consulting that we have not identified above?

Response Number of responses
Yes 9
No 19
Not Answered 11

Appendix B: List of organisational respondents

When completing the consultation questionnaire, respondents were asked to indicate whether they were responding as an individual or on behalf of an organisation. These are the organisations that submitted a non-confidential response:

  • Parentkind
  • Grey Court School
  • JAGS
  • Marlborough College
  • Godolphin and Latymer School
  • Rutlish School
  • Southmoor Academy Sunderland
  • Wildern School
  • Buckinghamshire School
  • Luminate
  • WJEC
  • RSL Awards
  • Open Awards
  • NCFE
  • NAHT
  • NEBOSH
  • Association of Colleges (AoC)
  • International Baccalaureate Organisation
  • National Association of Schoolmasters and Union of Women Teachers (NASUWT)
  • Training Qualifications UK
  • ASCL
  • Pearson
  • York College
  • City & Guilds
  • Hampshire Inspection and Advisory Service, Hampshire County Council
  • AQA
  • Chartered Institute of Building
  • National Education Union
  • OCR
  • Cambridge Assessment International Education