Consultation outcome

Regulating Digital Functional Skills qualifications

Updated 31 March 2022

Summary

Respondents were supportive of our proposals. There was very strong support for our approaches and proposals on:

  • the weighting of marks allocated to the assessment of practical digital skills within assessments
  • the coverage and sampling of subject content
  • differentiation between qualification levels
  • designing assessments with a real-life focus and connections between skills areas
  • minimum and maximum assessment times

There were still high levels of support for the following proposals, though with slightly more disagreement:

  • use of on-screen and online assessment
  • the number of components and assessments
  • setting minimum and maximum assessment times at 90 to 120 minutes respectively, for both qualification levels
  • introducing a qualification-level condition to ensure that awarding organisations cannot make FSQs in ICT available at any level after a 12 month transitional period

We also received a small number of comments on our proposed Conditions, requirements and Guidance.

Background

The Department for Education (referred to as the Department in this document) is introducing new qualifications called Digital Functional Skills qualifications (Digital FSQs) that seek to provide students with the core digital skills needed to participate fully in society. The Department is introducing them as part of its plans to improve adult basic digital skills, and the new qualifications will sit alongside Essential Digital Skills qualifications as part of the government’s adult digital offer.

As set out by the Department, Digital FSQs will be introduced from August 2023 and will be new qualifications replacing the existing Functional Skills Qualifications in Information and Communication Technology (FSQs in ICT). Unlike FSQs in ICT, which are available at Level 1, Level 2 and Entry levels 1, 2 and 3, Digital FSQs will be based on Entry level 3 and Level 1 subject content.

The Department published the final subject content on 29 October 2021 following a consultation in May 2019. Awarding organisations will use this subject content to create the new qualifications.

Ofqual will regulate Digital FSQs and consulted on its initial policy approach to regulating them in May 2019, at the same time as the Department’s consultation on the subject content. Ofqual’s May 2019 consultation analysis and consultation decisions documents were published on 25 November 2021.

This consultation analysis document considers the responses Ofqual received to its remaining policy proposals and on the draft Conditions, requirements and Guidance for regulating the new qualifications that were consulted on from 25 November 2021 to 27 January 2022.

Approach to analysis

The consultation included 25 questions (including regulatory and equality impact questions) and was published on our website. Respondents could complete the questions using Ofqual’s online consultation platform.

Respondents to this consultation were self-selecting, so the sample of those that chose to reply cannot be considered as representative of any group.

Responses to the consultation questions are presented in the order they were asked. For each of the questions, Ofqual presented background contextual information, followed by proposals, and then asked respondents to indicate agreement and provide additional comment if they wished. Respondents did not have to answer all the questions.

In some instances, respondents answered a question with comments that did not relate to that question. Where this was the case, those responses were reported against the question to which the response related, rather than the question against which it was provided.

Who responded?

Our consultation on regulating Digital FSQs was open between 25 November 2021 and 27 January 2022. Respondents could complete the questions using Ofqual’s online consultation platform.

Ofqual held an online consultation event with awarding organisations that had expressed an interest in providing the new qualifications on 9 December 2021. This event attracted 22 attendees from 5 awarding organisations. Four of these awarding organisations submitted official written responses to the consultation.

We received 23 written responses to our consultation. Three responses were in the form of letters sent to our public enquiry mailbox. The rest were received via our Citizen Space consultation platform. Fifteen were official responses from the following organisations:

  • 8 responses from awarding organisations
  • 4 responses from other representative or interest groups
  • 1 response from a local authority
  • 1 response from a training provider
  • 1 response from a private sector, not-for-profit company

We also received 8 personal responses:

  • 3 responses from teachers
  • 2 responses from consultants
  • 1 response from an awarding organisation employee
  • 1 response from a local authority employee
  • 1 other response

22 of the respondents are based in England and one is based in Wales.

Detailed analysis

Question 1a

To what extent do you agree or disagree with our proposals to set rules around the weighting of marks which could be gained through questions assessing practical digital skills and those gained through questions assessing knowledge?

5 respondents strongly agreed. 14 respondents agreed. 1 respondent neither agreed nor disagreed. No respondents either disagreed or strongly disagreed.

There were 20 responses to this question. Nineteen respondents either strongly agreed or agreed with our proposal to set rules around the weighting of marks which could be gained through questions assessing practical digital skills and those gained through questions assessing knowledge. One respondent neither agreed nor disagreed. Twelve respondents provided comments.

Respondents that agreed or strongly agreed with our proposal thought it would provide clarity and consistency around assessment design across different awarding organisations.

Respondents also said the proposed weighting of marks for the demonstration of practical skills (see question 1b) would be in line with the purpose of the qualification.

One awarding organisation agreed that the constructs being assessed aligned closely with our proposed weighting for the assessment of skills and knowledge, but suggested this could only be confirmed once it had started designing the qualification and the assessments.

The one respondent that neither agreed nor disagreed with our proposals suggested that, in some instances, knowledge would need to be assessed discretely and, in others, the assessment of knowledge would need to be embedded within the assessment of practical skills.

Question 1b

To what extent do you agree or disagree with our proposals to set a minimum weighting of 70% of marks to be gained through the assessment of practical digital skills at both Entry level 3 and Level 1?

8 respondents strongly agreed. 10 respondents agreed. 1 respondent neither agreed nor disagreed. 1 respondent disagreed. No respondents strongly disagreed.

There were 20 responses to this question. Eighteen respondents either strongly agreed or agreed with our proposal to set a minimum weighting of 70% of marks to be gained through the assessment of practical digital skills at both Entry level 3 and Level 1. One respondent disagreed and one neither agreed nor disagreed. Thirteen respondents provided comments.

Many respondents agreed that the purpose of the qualification and the subject content indicated that the majority of the assessment should test practical digital skills. One respondent felt our proposed weightings would allow the design of an engaging assessment of practical digital skills. Another respondent commented that our proposals would allow providers to focus on practical training, rather than focusing on preparing students for knowledge and practice tests.

One respondent suggested setting a maximum weighting for the assessment of practical digital skills, to avoid any knowledge-based assessment becoming redundant through being given a very small weighting.

The respondent that neither agreed nor disagreed suggested our proposals to set a minimum weighting of 70% may be too high for students at Entry level 3, based on their experience with current students, who have not always been able to demonstrate their practical skills in formal assessment situations. They did agree that our proposals are appropriate for students at Level 1.

The respondent that disagreed with our proposals was an awarding organisation. It welcomed our approach of setting a minimum weighting, rather than a maximum weighting. However, based on its experience of assessment of the subject content in Essential Digital Skills qualifications (EDSQs), it felt different weightings are needed at Entry level 3 and Level 1.

Question 1c

To what extent do you agree or disagree with our proposals to require awarding organisations to provide a rationale for their interpretation of subject content statements, together with their weightings, in their assessment strategy?

6 respondents strongly agreed. 10 respondents agreed. 4 respondents neither agreed nor disagreed. No respondents either disagreed or strongly disagreed.

There were 20 responses to this question. Sixteen respondents either strongly agreed or agreed with our proposal to require awarding organisations to provide a rationale for their interpretation of subject content statements, together with their weightings, in their assessment strategy. Four respondents neither agreed nor disagreed. Twelve respondents provided comments.

Those that agreed or strongly agreed with our proposal said it would help to ensure consistency, transparency and comparability among awarding organisations and would minimise areas that may have previously been interpreted differently by different awarding organisations. An awarding organisation commented that requiring awarding organisations to develop an assessment strategy would ensure its approach is in line with the Department’s steer and would also create an audit trail for decisions and feedback.

Some respondents commented that guidance on the level of detail needed for the rationale in the assessment strategies would be helpful, to reduce levels of inconsistency within the information provided by different awarding organisations.

A small number of respondents felt this proposal would allow centres to make an informed choice on which awarding organisation would be best suited to meet the needs of their students.

One respondent also commented that Ofqual would need to ensure that its proposals did not negatively impact timescales and first teaching dates.

Question 2a

To what extent do you agree or disagree with our proposals to require at both qualification levels that the design of the assessment must be able to cover all subject content statements, over time?

4 respondents strongly agreed. 12 respondents agreed. 3 respondents neither agreed nor disagreed. 1 respondent disagreed. No respondents strongly disagreed.

There were 20 responses to this question. Sixteen respondents either strongly agreed or agreed with our proposal to require at both qualification levels that the design of the assessment must be able to cover all subject content statements over time. Three respondents neither agreed nor disagreed and one disagreed. Eleven respondents provided comments.

Respondents welcomed our proposal to permit sampling of the subject content statements within an assessment (or pair of assessments), while acknowledging the importance that all subject content statements were covered over time.

Some respondents asked for clarification on what was meant by ‘over time’, saying that otherwise it was left up to awarding organisations to interpret what ‘over time’ meant.

One awarding organisation suggested that we should specify a minimum percentage of the number of subject content statements to be covered in each assessment (or pair of assessments), and that awarding organisations should evidence how subject content statements were covered over time.

In contrast, some other awarding organisations said that it should be up to the awarding organisation to decide what covering all subject content statements ‘over time’ looked like as this was standard practice in assessment design, but that a rationale for the design of their assessment should be provided in their assessment strategy.

Of those respondents that disagreed or neither agreed nor disagreed with our proposal, one representative organisation thought that, in light of the disruption arising from the pandemic, it was not sensible to expect teaching to cover all the statements. One awarding organisation also thought that some of the content could be challenging to assess. It singled out the Entry level 3 video conferencing content (subject content 3.3) as of particular concern, stating it would be challenging to assess in a formal assessment, as it would not be easily sampled with other content and might also compromise the predictability of the papers.

The awarding organisation that disagreed with our proposals thought it would be difficult to assess all subject content over time, in a time-constrained, externally set assessment. It gave examples of subject content statements at both levels which it felt would be difficult to assess. These included capturing digital media in a video format (subject content 2.4), using both local and remote storage to retrieve information (subject content 1.7), and saving a file on cloud storage using one device and opening it with another (subject content 1.6). It also said that centres and students might not have access to multiple devices or to both local and remote storage facilities.

Question 2b

To what extent do you agree or disagree with our proposals to require at both qualification levels that assessments may sample the subject content statements and the bracketed subject content in the subject content statements?

4 respondents strongly agreed. 15 respondents agreed. 1 respondent neither agreed nor disagreed. No respondents disagreed or strongly disagreed.

There were 20 responses to this question. Nineteen respondents either strongly agreed or agreed with our proposal to permit assessments at both qualification levels to sample the subject content statements and the bracketed subject content within those statements. No respondents disagreed. Seven respondents provided comments.

Respondents, including 2 awarding organisations, felt our proposals supported both the validity and reliability of assessments by enabling the development of more interesting, less predictable and less contrived assessments.

One awarding organisation thought guidance on what sampling may look like could be useful to ensure consistency across awarding organisations and avoid assessment predictability.

One awarding organisation noted that while it welcomed that all subject content statements, including bracketed subject content, did not have to be assessed in one assessment, it still had a concern that some skills statements were difficult to assess in an externally set assessment. It thought it would be particularly difficult at Level 1, where all assessment must be externally set and marked.

Another awarding organisation commented it was important that awarding organisations were able to select from the subject content statements, provided that all subject content statements were covered over time.

Question 2c

To what extent do you agree or disagree with our proposals to require at both qualification levels that assessments must cover as many of the subject content statements as possible?

3 respondents strongly agreed. 10 respondents agreed. 4 respondents neither agreed nor disagreed. 2 respondents disagreed. 1 respondent strongly disagreed.

There were 20 responses to this question. Thirteen respondents either strongly agreed or agreed with our proposals to require at both qualification levels that assessments must cover as many of the subject content statements as possible. Three respondents either strongly disagreed or disagreed and 4 neither agreed nor disagreed. Eleven respondents provided comments.

Those that agreed or strongly agreed with our proposal said it would allow students to demonstrate their competence across a range of skills and areas, and so support progression. One respondent emphasised the need for assessments to be realistic and meaningful.

Those that disagreed or neither agreed nor disagreed felt the proposal was too subjective. Some awarding organisations suggested we should prescribe a minimum number of subject content statements to be covered within an assessment (or pair of assessments) to support consistency across awarding organisations.

Other awarding organisations suggested it should be for each awarding organisation to state and justify in its assessment strategy its coverage of subject content statements, based on its experience of developing assessment materials. One awarding organisation argued that the workable extent of subject content coverage could only be determined once an assessment approach was being designed and tested, taking account of all other requirements.

One respondent also argued that a better approach would be to consider whether an assessment should assess student ability in-depth rather than requiring as many subject content statements as possible to be covered, as requiring the latter might lead to an assessment which was ‘light touch’, or superficial and lacking rigour.

One respondent, replying in a personal capacity, reflected that although testing of understanding is a critical aspect for any assessment, the assessment must be in an accessible format if it is to effectively and reliably measure a student’s ability.

The one respondent that strongly disagreed with the proposal was a personal response from an awarding organisation employee. No comments were left that explained why this respondent strongly disagreed with the proposal.

Question 2d

To what extent do you agree or disagree with our proposals to require at both qualification levels that assessments must cover subject content statements from each skill area?

5 respondents strongly agreed. 13 respondents agreed. 1 respondent neither agreed nor disagreed. No respondents disagreed or strongly disagreed.

There were 20 responses to this question, although one respondent did not answer the survey question to indicate to what extent they agreed or disagreed with the proposal. Eighteen respondents either strongly agreed or agreed with our proposals to require that assessments must cover subject content statements from each skill area, at both qualification levels. No respondents disagreed. Nine respondents provided comments.

Comments provided in response to this question often repeated those given for Question 2b. These were that the proposal supported both the validity and reliability of assessments, and that guidance on what coverage of the skills areas was expected could be useful.

Other comments included that the proposals were essential to ensure the qualifications reflected their purpose and would help provide for comparability across awarding organisations. One awarding organisation commented that this approach was consistent with that taken by awarding organisations for EDSQs and that the same approach should apply to Digital FSQs.

One awarding organisation observed that, as in many cases the skills are interwoven across the 5 areas, it would welcome further clarification on whether mark schemes would be expected to include a rubric which ensured achievement across each skill area.

Other awarding organisations also highlighted the interrelationships in the subject content across the skills areas and thought that some assessment tasks should include content from more than one skills area.

Two awarding organisations observed that, as the subject content for some of the skills areas was predominantly knowledge-based, the assessment could become predictable over the lifetime of the qualification, and hence some flexibility should be given to awarding organisations on this matter to ensure more valid and reliable assessments.

Question 3

To what extent do you agree or disagree that we should issue guidance to support consistency between awarding organisations when differentiating between qualification levels?

All respondents either strongly agreed or agreed. 13 respondents strongly agreed. 7 respondents agreed.

There were 20 responses to this question. All 20 respondents either strongly agreed or agreed with our proposal to issue guidance to support consistency between awarding organisations when differentiating between qualification levels. Twelve respondents provided comments.

Most respondents agreed that issuing guidance would increase consistency between awarding organisations and would be helpful for employers, centres and students. One respondent suggested this could encourage collaboration between awarding organisations, similar to that which exists for other qualifications.

Another respondent stated this would avoid awarding organisations being selected based on the ease of passing and progressing to the next level, bringing more credibility to the qualifications. They stated that it would also allow providers the opportunity to make informed decisions around what qualification level to enter students for.

One awarding organisation suggested it would be keen to provide input and help shape this guidance.

Question 4a

To what extent do you agree or disagree with our proposal to issue guidance that assessment tasks are authentic and relevant to the workplace and everyday life and require the use of digital devices?

All respondents either strongly agreed or agreed. 14 respondents strongly agreed. 6 respondents agreed.

There were 20 responses to this question. All 20 respondents either strongly agreed or agreed with our proposal to issue guidance that assessment tasks are authentic and relevant to the workplace and everyday life and require the use of digital devices. Twelve respondents provided comments.

Respondents said our proposal was essential to ensure the needs of the target audience of the qualification are met and that the assessments support the purpose and content of the qualification. Respondents felt strongly that this qualification should reflect the needs of society and be based on genuine digital skills and activities.

A representative organisation raised concerns that our proposal may bring changes that are unfair to students that have not had any work or volunteering experience over the past two years, particularly considering the number of students that have missed workplace experience due to the pandemic.

An awarding organisation questioned the meaning of ‘authentic’ and asked who would decide this. A similar point was made on what constitutes a ‘digital device’ and whether all students would have equal access to said digital devices, depending on their socio-economic background. In contrast, one respondent advised we should not be overly prescriptive about specific devices or interfaces because of the pace of change with technology.

One awarding organisation suggested we would need to allow enough flexibility in the design of assessment tasks to avoid further disadvantaging students in prisons or youth offender institutions (for example, system limits due to security restrictions could impact how a student would complete the assessment task).

Question 4b

To what extent do you agree or disagree with our proposal to expect that assessments are delivered on-screen and online, but to allow paper-based assessment materials to be made available as an exception, where this can be justified?

7 respondents strongly agreed. 9 respondents agreed. 1 respondent neither agreed nor disagreed. 2 respondents disagreed. 1 respondent strongly disagreed.

There were 20 responses to this question. Sixteen respondents either strongly agreed or agreed with our proposal to expect that assessments are delivered on-screen and online, but to allow paper-based assessment materials to be made available as an exception, where this can be justified. Three respondents strongly disagreed or disagreed, and one neither agreed nor disagreed. Eighteen respondents provided comments, including respondents who did not complete the consultation survey and responded by email.

Of those that agreed with our proposal, some respondents commented on assessments being delivered on-screen and online, whereas others commented on allowing paper-based assessment materials as an exception.

Sixteen respondents strongly agreed or agreed with our expectation that assessments should be delivered on-screen and online. Five of these questioned why paper-based assessment materials should be available at all as an exception. They suggested that a paper-based assessment would not be authentic for a digital qualification, that practical digital skills should be demonstrated on a digital platform, and that it might be difficult to develop valid or innovative paper-based assessments for skills-based assessments.

One awarding organisation suggested assistive technology should be used in place of paper-based assessment materials, where needed as a reasonable adjustment.

There was some support, however, for permitting paper-based assessment materials by exception. One respondent observed that some students with additional support needs have struggled with recalling tasks from one screen to the next while completing an exercise to demonstrate skills. It was argued that, for those assessments where multiple screens are interchanged, a paper copy of the task would be beneficial.

One awarding organisation said some centres still had limited IT infrastructure, so clarity on whether on-screen but offline assessment would be permitted would be welcomed.

Some respondents thought paper-based assessment materials should be made available to students with disabilities, if requested, by exception, and that this would help to make Digital FSQs more accessible to students with learning disabilities.

The respondent who strongly disagreed with the proposal to allow paper-based assessment materials said that, as Digital FSQs are meant to improve digital literacy, this requires assessments to be delivered on-screen.

The other respondents that disagreed or neither agreed nor disagreed with the proposal were from awarding organisations. One agreed that assessment should be delivered on-screen and online, but disagreed that paper-based assessment materials should be made available as an exception, except in the case of requests for reasonable adjustments. It stated that its experience of providing EDSQs suggested training providers were able to access the tests online and that it had not received reports of training providers not having the IT infrastructure. It stated that it was also aware that the secure estate (prisons and youth offender institutions) believed that its students should have a similar experience to other students.

One awarding organisation thought that clarity was needed on what was an exception. It said it would not want to create a separate process for exceptions, in addition to any process for reasonable adjustments or Special Consideration, and queried what other exceptions there could be. It also asked if it would need to consider the application of any exception rules to EDSQs too.

Another awarding organisation stated that, although it agreed with the approach in principle, the decision to produce paper-based assessments should be left to the awarding organisation to justify in its assessment strategy. It also thought the proposal raised questions about the level of justification required for paper-based assessment and expectations of comparability across awarding organisations. It argued that Ofqual could instead set a requirement that assessments must be taken as a whole using a digital device, rather than setting a requirement for paper-based assessments as an exception. It asserted this would be in keeping with the Department for Education’s principle that “assessments should be designed so that they can be delivered onscreen and/or on-line, reflecting today’s digital world”.

A representative organisation had concerns that this proposal might require awarding organisations to upgrade their IT hardware and software before they would be in a position to deliver the new qualifications. It noted that on-screen and online assessments would be a new development above and beyond what was currently being offered for FSQs in ICT and reformed FSQs in maths and English, and that the cost of technological developments might make some awarding organisations reluctant to offer Digital FSQs.

An interest group recommended that further clarification be provided regarding the range of digital devices required for assessment. Although it welcomed our proposals that awarding organisations should make full use of technology to enhance the quality and relevance of assessments, it thought that our expectation for centres to provide a sufficient range of devices might be a barrier to accessing Digital FSQs, especially in more disadvantaged areas. It suggested that one option around this might be to revise how the knowledge and practical elements of the qualification are assessed, allowing for possible grouped combinations that centres could tailor to their needs.

Question 4c

To what extent do you agree or disagree with our proposal to set out in guidance that we would expect awarding organisations to ensure that any of their centres who wish to offer assessment materials in a paper-based format can justify that there is a need for them to do so and explain to us in their assessment strategy how they will be so assured?

7 respondents strongly agreed. 7 respondents agreed. 2 respondents neither agreed nor disagreed. 3 respondents disagreed. 1 respondent strongly disagreed.

There were 20 responses to this question. Fourteen respondents either strongly agreed or agreed with our proposal to set out in guidance that we would expect awarding organisations to ensure that any of their centres who wish to offer assessment materials in a paper-based format can justify that there is a need for them to do so, and explain to us in their assessment strategy how they will be so assured. Four respondents either strongly disagreed or disagreed. Eleven respondents provided comments.

One respondent repeated its view, as expressed in Question 4b, that centres that wish to offer paper-based assessment materials should only be permitted to do so in cases of reasonable adjustment.

Two other awarding organisations agreed that centres should provide justification for offering paper-based assessment materials. One further suggested that awarding organisations should also explain in their assessment strategies their approach, the parameters, and the centre quality assurance arrangements they expected to see in place.

Two representative organisations also agreed with this proposed approach, with one also concurring that paper-based assessment materials should only be permitted in cases of reasonable adjustment. A training provider thought that having a clear justification was a minimum basis for permitting paper-based assessment materials. Otherwise, it argued, schools, colleges and training providers might seek an easier option for their students taking the qualifications.

One respondent strongly disagreed with our proposal. This was a personal response from a college employee. The reasons given for strong disagreement were identical to those given by respondents that agreed with our proposal. This respondent contended that since Digital FSQs are digital courses developing digital skills, paper-based assessment materials should be permitted only in exceptional circumstances, by prior arrangement with the awarding organisation.

Another respondent, replying in a personal capacity, thought that if digital and paper-based assessment materials were equivalent, and met agreed principles and criteria for their use, then it was unclear why individual justifications to permit paper-based assessment materials were necessary.

Two awarding organisations disagreed with the proposal. One thought that submitting a justification to offer paper-based assessment materials placed a considerable regulatory burden on awarding organisations and centres. It was suggested that Ofqual could ease this burden by setting a requirement that assessments must be offered on-screen or online, rather than setting a requirement for paper-based assessment materials to be an exception. It also contended the proposal raised questions about the level of justification required for the use of paper-based assessment materials and expectations of comparability across awarding organisations when managing this.

The second awarding organisation that disagreed with this proposal argued centres that requested to use paper-based assessment materials should be limited to making such requests under reasonable adjustments or Special Consideration.

An awarding organisation that neither agreed nor disagreed with our proposal said that more clarity was needed on what would constitute a justification for exceptions to the normal process. It also thought that awarding organisations should consider how reasonable adjustments could be embedded into on-screen assessments, such as larger text or screen background colours.

Question 4d

To what extent do you agree or disagree with our proposals to require awarding organisations to explain in their assessment strategy how they will manage any risks arising where paper-based assessment materials are made available on-demand?

8 respondents strongly agreed. 10 respondents agreed. 2 respondents neither agreed nor disagreed. No respondents either disagreed or strongly disagreed.

There were 20 responses to this question. Eighteen respondents either strongly agreed or agreed with our proposals to require awarding organisations to explain in their assessment strategy how they will manage any risks arising where paper-based assessment materials are made available on-demand. Two respondents neither agreed nor disagreed. Ten respondents provided comments.

Respondents welcomed the proposal that awarding organisations should explain their risk mitigation plans for on-demand, paper-based assessment, with 5 commenting that this policy proposal should be a minimum requirement of all awarding organisations. Some awarding organisations stated they would have risk-assessed the provision of paper-based, on-demand assessments even if Ofqual did not make it a requirement.

One awarding organisation said it required more clarity on what constituted an exception and thought it would be useful to have some guidance outlining this.

One awarding organisation that agreed with this policy proposal did so on the proviso that it did not relate to reasonable adjustments or Special Consideration.

Question 4e

Do you have any comments on the proposed definitions for on-screen and online?

20 respondents considered this question. Ten respondents provided comments on the proposed definitions for on-screen and online. Half of the comments provided came from awarding organisations.

The representative organisations and the training provider that commented broadly agreed with our definitions for on-screen and online, finding them clear and practical. The 2 personal responses provided were also satisfied with the definitions.

Comments received from the 5 awarding organisations were mixed. Comments centred around how Ofqual’s proposed definitions for on-screen and online differed from their own, or how they differed from assessments that they currently provided for other ICT qualifications.

One awarding organisation stated that, although the proposed definitions differed from its own, its e-assessment platform was on-screen and online. It noted, however, that not all adaptations that might come under Special Consideration could be provided through its platform. Furthermore, it stated that if the assessment was downloaded from the platform, it would technically be running offline but on-screen, and could then be uploaded once back online. This concern was echoed by another 2 awarding organisations, which thought the distinction between the 2 proposed definitions was unclear and required further work. Both questioned whether the definitions covered instances where the assessment itself was completed with an internet connection, or where a connection was required only for successful deployment rather than just submission.

Another awarding organisation that currently provides other ICT qualifications thought it very useful to have definitions for on-screen and online, so that it could ascertain what was and was not permitted. It observed that, while it met the proposed definition for online in the way it delivered its paper-based assessments for FSQs in ICT, it did not meet the on-screen definition for those qualifications. It argued it would be difficult to envisage adapting paper-based versions of assessments to meet the definition, or designing assessments where the question papers were delivered on-screen and available on-demand but were not online assessments. It also noted that on-demand, paper-based assessments would raise issues for keeping question papers secure. It thought that the proposed definition should include the use of an e-assessment platform, either as an example of online assessment or as a requirement for it.

Question 5a

To what extent do you agree or disagree with our proposals to require awarding organisations to design qualifications at both qualification levels with a single component?

6 respondents strongly agreed. 7 respondents agreed. 5 respondents neither agreed nor disagreed. 1 respondent disagreed. 1 respondent strongly disagreed.

There were 20 responses to this question. Thirteen respondents strongly agreed or agreed with our proposal to require awarding organisations to design qualifications at both qualification levels with a single component. Two respondents disagreed or strongly disagreed. Five respondents neither agreed nor disagreed. Nine respondents provided comments.

Of those that agreed with our proposals, many respondents commented on the need for consistency across awarding organisations. Respondents felt a single component was the best option, either because of the size of the qualification (the hours of guided learning), because it was not desirable to have a qualification with separate knowledge and practical skills components, or to provide clarity to awarding organisations and students.

One respondent repeated the point made earlier in response to Question 1b, that there should be a maximum weighting for the percentage of marks achieved through the assessment of practical digital skills.

One respondent that neither agreed nor disagreed with the proposal said that a single component might not meet the needs of all students. Another respondent that neither agreed nor disagreed stated Ofqual should ensure students are not overwhelmed by the prospect of a single assessment, particularly if the student has previously had a poor experience relating to assessments.

A respondent that disagreed with our proposals felt it would be better to assess the students’ knowledge and skills through 2 separate assessments, as this would better identify the areas that students need to work on.

Question 5b

To what extent do you agree or disagree with our proposals to permit a maximum of 2 assessments within a component, at both qualification levels?

3 respondents strongly agreed. 9 respondents agreed. 5 respondents neither agreed nor disagreed. 3 respondents disagreed. No respondents strongly disagreed.

There were 20 responses to this question. Twelve respondents either strongly agreed or agreed with our proposal to permit a maximum of 2 assessments within a component, at both qualification levels. Three respondents disagreed and 5 neither agreed nor disagreed. Nine respondents provided comments.

Respondents agreed that setting such an expectation would aid comparability across awarding organisations. Some said this would aid centres in the selection of awarding organisations, as it would stop decisions being made based on the volume of assessment.

Some respondents agreed our proposals were sensible due to the size and guided learning hours for the qualification. One respondent felt this was a sensible approach as students at these levels may not be accustomed to large numbers of assessments.

One respondent asked for clarification on whether the assessment can or must be completed in one sitting.

Another respondent suggested there should be some flexibility to allow awarding organisations to choose the number of assessments, based on the design principles or technology platforms used for assessments. They did not disagree with our proposal to permit a maximum of 2 assessments.

One respondent, who neither agreed nor disagreed with our proposal, commented they agreed there should be a limited number of assessments. However, they felt this could result in longer assessment times and could impact students with special educational needs, who may need supervised rest breaks. They also felt that setting rules about the maximum number of assessments may be overly restrictive for awarding organisations, as they felt a certain amount of flexibility would be required to develop assessments which covered all subject content statements.

One of the respondents that disagreed with our proposals felt this would prevent awarding organisations from creating innovative forms of assessment. It also stated that several shorter assessments would be easier for centres to deliver, given that students needed to use digital devices and the assessments could not be completed in traditional exam rooms.

Another respondent that disagreed reported feedback from colleges that it was challenging to deliver assessments that did not fall within class hours, and that assessments scheduled outside class hours would result in non-attendance, because adult students with other commitments, such as work, may not be able to commit to different or extended time slots. The respondent felt the ideal duration for any one component should be a maximum of one hour and no longer than 120 minutes.

Question 6a

To what extent do you agree or disagree with our proposals to set a requirement on the minimum and maximum overall assessment time?

6 respondents strongly agreed. 13 respondents agreed. 2 respondents neither agreed nor disagreed. No respondents either disagreed or strongly disagreed.

There were 21 responses to this question. Nineteen respondents either strongly agreed or agreed with our proposal to set a requirement on the minimum and maximum overall assessment time. None disagreed and 2 respondents neither agreed nor disagreed. Eleven respondents provided comments, including one who did not complete the consultation survey and responded by email.

Of those that agreed with our proposals, many said this would ensure sufficient comparability across awarding organisations, while still allowing flexibility for awarding organisations to be innovative in their assessment design. One respondent that agreed felt our proposal was in keeping with the regulatory approach taken for the reformed FSQs in English and maths. This respondent also thought that it provided a clear framework for awarding organisations to work within, as well as enough flexibility.

Three respondents agreed with setting a requirement on the minimum and maximum overall assessment time, as long as students who required longer assessment time or other support could be supported by exceptions and would not be excluded.

One respondent commented that centres may select awarding organisations based on assessment time and that shorter assessment times could suggest that the assessment may be easier to pass than those with longer assessment times.

One awarding organisation queried whether it would be able to provide input to help shape the requirements.

One respondent that neither agreed nor disagreed understood the need for comparability across awarding organisations. However, they were unsure whether it would be helpful to set minimum and maximum times until awarding organisations had time to develop specimen assessments which comply with all design requirements.

Question 6b

To what extent do you agree or disagree with our proposal for minimum and maximum overall assessment time to be set at 90 to 120 minutes, at both qualification levels? Please also provide any comments on whether using paper-based assessment materials could mean that additional time is necessary.

2 respondents strongly agreed. 11 respondents agreed. 5 respondents neither agreed nor disagreed. 1 respondent disagreed. 1 respondent strongly disagreed.

There were 20 responses to this question. Thirteen respondents either strongly agreed or agreed with our proposal for minimum and maximum overall assessment time to be set at 90 to 120 minutes, at both qualification levels. Two respondents disagreed or strongly disagreed, and 5 neither agreed nor disagreed. Twelve respondents provided comments on our proposal, including one respondent who did not complete the consultation survey and responded by email. Some respondents also commented on whether the provision of paper-based assessment materials could mean additional time is necessary.

Of those that agreed with our proposals, many felt extra time should not be permitted for paper-based assessments unless as part of a reasonable adjustment for students with additional needs. In contrast, one respondent that agreed with our proposal said it would be highly likely that paper-based assessments would require additional time.

One respondent that neither agreed nor disagreed with our proposals suggested an additional 15 to 20 minutes may be needed for paper-based assessments, depending on the rationale for using this approach in the first place.

Some respondents suggested further guidance on whether printing, or any externally set observed tasks, would need to be completed within the overall assessment time, or whether additional time outside the set assessment time could be used for these.

Some respondents (one that agreed with our proposals, one that neither agreed nor disagreed, and one that disagreed) felt that students at Entry level 3 may struggle with the overall assessment time and could feel overwhelmed or fatigued, with one respondent suggesting students at this level often struggle with focus. This respondent also suggested good practice would not allow any student to work at a screen for 90 to 120 minutes without a break.

One of the respondents that disagreed with our proposal agreed with the principle of setting a minimum and maximum overall assessment time to aid comparability across awarding organisations. They felt, however, that the minimum of 90 minutes was too long for students at Entry level 3 and could mean the qualification became inaccessible to the students that would benefit most from Digital FSQs. Another respondent suggested it would be useful to have guidance around permitting breaks within the set assessment time and proposed that the maximum overall assessment time be set at 60 minutes.

Two awarding organisations that neither agreed nor disagreed said that proposed assessment times did not align with those for EDSQs, where many awarding organisations ran EDSQ assessments that exceed the maximum time proposed for Digital FSQs. One respondent queried whether more could be done to ensure consistency between the 2 types of digital qualification.

Question 7

To what extent do you agree or disagree with our proposals to introduce a qualification-level condition to ensure that awarding organisations will not be able to make FSQs in ICT at any level available after the 12 month transitional period?

6 respondents strongly agreed. 6 respondents agreed. 6 respondents neither agreed nor disagreed. 2 respondents disagreed. No respondents strongly disagreed.

There were 20 responses to this question. Twelve respondents either strongly agreed or agreed with our proposal to introduce a qualification-level condition to ensure that awarding organisations will not be able to make FSQs in ICT at any level available after the 12 month transitional period. Two respondents disagreed and 6 neither agreed nor disagreed. Seven respondents provided comments.

Of those that agreed or strongly agreed with our proposal, most commented on the 12 month transitional period rather than on the introduction of the qualification-level Condition, agreeing that this should be the maximum period for which FSQs in ICT remain available. One respondent commented this would allow sufficient time for mid-flight students to complete their qualification. Another respondent commented they would not support the transitional period being any longer, as running both qualifications at the same time would place a burden on centres and awarding organisations.

One awarding organisation suggested that having a heavily publicised end date, supported by a communications campaign, with reminders from Ofqual and the Department, would be beneficial.

One awarding organisation that neither agreed nor disagreed with our proposal commented that, although it agreed in principle that there should be a maximum 12 month transitional period, further consideration should be given to the needs of students taking FSQs in ICT within apprenticeships, as they may not complete within the 12 month period. It suggested further research be carried out to ascertain whether this would be manageable for centres. This awarding organisation said it would encourage a swift transition to avoid running both qualifications at the same time, and also suggested Ofqual clarify exactly when the 12 month transitional period would begin.

One respondent that disagreed was concerned to ensure that mid-flight students were given time to complete their current qualifications, considering the ongoing impact of the pandemic.

Question 8

Do you have any comments on our proposed Conditions and requirements?

21 respondents considered this question. Fifteen respondents stated they had no further comments to make. Seven respondents made comments on our proposed Conditions and requirements, including one respondent who did not complete the consultation survey and responded by email.

A representative organisation thought the proposed Conditions and requirements were very detailed, and so it was important that the accompanying guidance was clear.

A training provider wondered if, especially at Entry level 3, delivery of the qualifications was realistic, in light of the number of hours of guided learning (55 GLH).

An interest group welcomed the proposal to set out expectations for differentiation between the 2 qualification levels in the subject level rules. It was thought this would create consistency across awarding organisations. However, it also thought that further clarification could be provided with regards to progression between Entry level 3 and Level 1, with particular reference to accreditation of prior learning.

One awarding organisation thought that plain English could have been better employed in the drafting of the conditions and requirements.

Another awarding organisation reflected that, since it did not currently offer FSQs, any conditions and requirements that were introduced would not require it to make any changes to its current processes or practices. It imagined, however, that those awarding organisations that did provide FSQs in ICT might have comments on the changes these could require to their own processes or practices.

Another awarding organisation noted that draft Condition DFS3 required awarding organisations to have an assessment strategy for Digital FSQs, which it argued would be a burden to produce, review and maintain. It also stated that assessment strategies only existed because Ofqual required them, and that they were not used by schools, colleges and training providers.

The same awarding organisation also noted that Condition DFS6.2 required awarding organisations to keep guided learning and Total Qualification Time (TQT) under review, but argued that, as the hours of guided learning for Digital FSQs were set by the Department, it was unclear what actions it could take if it found that the hours of guided learning were not appropriate for the qualification.

Another awarding organisation questioned the need to permit adaptations of contexts in assessments at Entry level 3 but not Level 1 (Condition DFS7), given that the purpose and requirements of the qualifications expect students to respond to work or life contexts, and wondered whether allowing this change would make assessments less authentic. It also noted that EDSQs do not specify an approach to awarding and, given the similarities in content and purpose between Digital FSQs and EDSQs, wondered whether it was reasonable to create this disparity in approach.

Question 9

Do you have any comments on our proposed guidance?

20 respondents considered this question. Seventeen respondents stated they had no further comments to make. Three respondents left further comments on our proposed guidance.

A representative organisation thought that, although the guidance was clear, publishing it in a separate document might lead to confusion. It suggested that the Conditions, requirements and guidance should be set out alongside each other so that all were absolutely clear.

The awarding organisation that thought that better use could have been made of plain English for the proposed Conditions and requirements made the same comment about the guidance.

One awarding organisation raised 2 concerns. Firstly, it wondered whether the section in the draft guidance on on-screen and online assessment went against Ofqual’s priorities to maintain standards and promote public confidence in qualifications. It argued that Ofqual’s expectation that assessments were delivered to students on-screen and online, even where this was outside the requirements of the subject content, was unnecessary to maintain standards, and asserted that setting an additional barrier that could not be justified by the content could undermine public confidence in the qualifications. Secondly, it observed that the section on assessment availability stated that awarding organisations should consider how students were entered for and took assessments, individually or as a group. It thought further clarification on this point would be useful, and that it might be helpful to separate the section on being entered for the assessment from that on taking the assessment.

The respondent also included comments on the definitions of on-screen and online and on the availability of paper-based assessment materials as an exception, which have been reported under the appropriate consultation question.

Equalities impact

Ofqual is a public body, which means we are bound by the public sector equality duty in the Equality Act 2010. In Annex B of the consultation, we set out how this duty interacts with our statutory objectives and other duties.

We published an updated equalities impact assessment alongside our technical consultation and explained that we had not identified any additional equalities impacts since the previous consultation on Digital FSQs. However, we asked for views on any additional impacts that respondents had identified, as well as on how any identified additional impacts might be mitigated.

Ten respondents indicated they had identified additional equalities impacts, with some referring to particular groups of students that they believed would be affected.

Four responses referred to students with special educational needs and disabilities (SEND), one of which suggested a potential positive impact for such students. Of these 4 respondents, 2 outlined the need for appropriate assistive technology to be available to such students and one mentioned the need for carers to understand how to work with such technologies safely. Two respondents mentioned the lack of provision at levels below Entry level 3 as having a potential impact on students with SEND.

Three respondents outlined concerns related to students’ access to digital devices, which was flagged as an issue that could have adverse socio-economic impacts. One respondent indicated concerns that students in prison would struggle to achieve full marks in the assessments due to restricted access to the internet.

Individual respondents identified issues for particular groups of students. One respondent suggested that alternative provision should be considered for students who do not have English as a first language. One respondent referred to the needs of students with physical disabilities and stated that awarding organisations would need to ensure that such students who take paper-based assessments are neither inadvertently disadvantaged nor advantaged in doing so. One respondent queried how Digital FSQs would be marketed to diverse audiences. One respondent raised concerns with our proposed approach to Special Consideration using paper-based assessments.

Although it was unclear how their concerns specifically related to equalities, 2 respondents raised issues related to cybersecurity and the potential for students to be adversely affected in the event of a cyberattack.

Regulatory impact

In our consultation, we identified regulatory impacts which could arise from our proposals and explained that there was likely to be an increased burden as a result. We asked for views from respondents on the regulatory impacts we identified, on whether there could be further regulatory impacts arising from our proposals and how best to mitigate those impacts. Four respondents indicated that there were additional regulatory impacts that we had not identified.

Of these, 2 respondents referred to additional impacts that could arise from the proposal for on-screen and online assessment. One awarding organisation suggested that the burden could be reduced by not requiring schools, colleges and training providers to submit a justification to awarding organisations each time a student required a paper-based assessment, with online assessments being made available but not required. A representative organisation was concerned that the approach would require awarding organisations to invest in technological developments to enable them to offer Digital FSQs, and that the cost of this might make some awarding organisations reluctant to offer the qualifications.

One respondent raised the additional burden created by our requirements for awarding organisations to develop an assessment strategy for Digital FSQs.

One awarding organisation suggested that Ofqual’s FSQ Conditions would be more usable and easier to understand if there were qualification-level conditions with appendices for each subject, rather than 3 separate subject-level conditions. The respondent suggested that the regulatory burden on awarding organisations arising from our proposals could be reduced if we were to take such an approach.

In addition, although no regulatory impact was referred to in conjunction with this issue, one respondent suggested that the use of Digital FSQs within apprenticeships should be further explored.

Annex A: List of organisational respondents

When completing the consultation questionnaire, respondents were asked to indicate whether they were responding as an individual or on behalf of an organisation. These are the organisations that submitted a non-confidential response:

  • Association of Colleges
  • Association of School and College Leaders
  • Babington
  • City & Guilds
  • Federation of Awarding Bodies
  • Gateway Qualifications Ltd
  • iCan Qualifications
  • Kent Community Learning and Skills
  • NCFE
  • NOCN
  • Open Awards
  • Pearson Education
  • ScaleUp Institute
  • The St Martin’s Group
  • Training Qualifications UK