Research and analysis

The use of assistive technologies for assessment

Published 10 June 2021

Authors

  • Stuart Cadwallader
  • Diana Tonin

With thanks to

  • the teachers and students who participated in this research
  • the Access Consultation Forum
  • the Research Advisory Group

Executive summary

Ofqual wishes to better understand the use of Assistive Technology (AT) for assessment to ensure that its approach to regulation remains fit for purpose. AT is an umbrella term that can, in the broadest sense, include any device, software or system that is used to support an individual who has some form of disability or impairment.

In the context of education and assessment, examples include:

  • scanning and reading pens
  • eye-tracking technology
  • braille note takers
  • magnification software and equipment
  • modified equipment (for example, enlarged keyboards)
  • word processing software
  • generic ATs (for example, tablets and digital recorders)

Official statistics about the use of access arrangements for GCSEs and A levels show that the number of students using a non-interactive electronic question paper, the format most commonly used alongside various forms of AT, has increased significantly since 2015 (Ofqual, 2019a).

This report describes a qualitative study which was undertaken to foster greater understanding of how AT is being deployed in practice. Data collection took place across 2 phases. For Phase 1, we conducted 4 one-to-one interviews and 5 focus groups (each of 2-3 people) with teachers and special educational needs coordinators (SENCos). For Phase 2, we conducted 6 focus groups (2-5 people) and 1 telephone interview with students who had completed assessments within the past few years.

Among other things, participants were asked to describe the AT which they or their students used for assessment, whether they encountered any barriers to using it, and how they felt the use of AT had impacted on their experience of assessment and their performance. The discussions were diverse and covered a wide range of ATs. Data was analysed thematically and 5 broad themes emerged.

Reasonable adjustments are usually bespoke to the student

Each student has unique requirements that vary across subjects and contexts. This means that rules and guidelines that govern the use of AT in assessment need to be clear in their underpinning principles but flexible enough to allow students to make bespoke arrangements that reflect their normal way of working and maintain the validity of the assessment.

Examination question papers could be more compatible with AT

The degree of compatibility between assessment materials and the AT with which they are used, though not fundamentally problematic, has scope for improvement. In particular, it may be that the digital file formats in which examination papers are provided could be tested with a wider variety of AT hardware and software.

Schools may not always be fully aware of how AT may be used for assessment

Teachers and students may not always have all of the information about what is possible and permissible with AT. This means that there is the potential for some students to miss out on arrangements that may better support their needs because their teachers are either unaware of a given AT or unsure of whether it would contravene the rules. There may therefore be scope to improve the way in which ‘best practice’ is shared.

AT often has an impact on the experience of undertaking assessment

In some cases, the use of AT can unavoidably alter the experience of taking the assessment for students. This is particularly true where the AT requires significant additional time or the use of a specific assessment environment for effective use.

AT removes barriers to assessment but sometimes also changes how it is accessed

Where skills such as reading, spelling and grammar are not part of the assessment, AT can remove barriers to access, mitigating some forms of construct irrelevant variance (CIV)[footnote 1]. However, the use of AT can also sometimes draw upon other processes and skills that are not part of the assessment, thus introducing other forms of CIV. It is important to consider this balance. In many cases AT is vital because it provides the only means by which a student may access an assessment and demonstrate their knowledge and skills.

This study illustrates some of the issues around the use of AT in assessment and informs Ofqual’s understanding of what teachers and students experience when using AT in practice. Ofqual continues to work with awarding organisations and other stakeholders towards the goal of ensuring that the regulatory approach minimises the risk of malpractice and maladministration while not inadvertently impeding good practice.

Introduction

Background

Ofqual requires awarding organisations (AOs) to design their assessments such that they do not present any unjustifiable barriers to students, regardless of any disability or impairment that a student may have (Condition D2, Ofqual, 2019b). If a student underperforms on a given assessment because it is not sufficiently accessible to them (rather than because they lack the targeted knowledge and skills), then that assessment has an issue with validity. Ofqual therefore seeks to provide rules and guidance to AOs that will support them in providing reasonable adjustments that optimise assessment validity and promote fairness for all candidates. To this end, Ofqual chairs a group called the Access Consultation Forum (ACF), which meets at least twice a year to discuss issues of qualification accessibility. Attendees include AOs, the Joint Council for Qualifications (JCQ) and various organisations who represent individuals with particular needs or disabilities (Ofqual, 2019c).

Before proceeding, it is helpful to consider the terminology used in this report. The terms ‘reasonable adjustments’, ‘special consideration’ and ‘access arrangements’ can all be used to describe changes made to an assessment to make it more accessible for a learner. ‘Reasonable adjustments’ refer to changes to how an assessment is delivered to make it more accessible for disabled learners. ‘Special consideration’ refers to adjustments that may be made for reasons other than disability, such as illness, injury or bereavement. ‘Access arrangements’ is a broader term, often used within centres, that encompasses both special consideration and reasonable adjustments. This report focuses predominantly on reasonable adjustments.

The JCQ provides rules and guidance to schools and colleges (centres) with regard to the use of access arrangements (including reasonable adjustments) for assessment. They state that access arrangements exist to: ‘…allow candidates with specific needs, such as special educational needs, disabilities or temporary injuries, to access the assessment and show what they know and can do without changing the demands of the assessment’ (JCQ, 2019, p. 3). The provision of access arrangements is the primary way through which AOs seek to facilitate reasonable adjustments for those students who need them, something which is required by legislation (Equality Act, 2010).

One reasonable adjustment, which is the focus of this report, is the use of Assistive Technology (AT). AT is an umbrella term that can, in the broadest sense, include any device, software or system that is used to support an individual who has some form of disability or impairment (Edyburn, 2004; Watts, O’Brian, & Wojcik, 2003). Although the term AT can be used to refer to relatively non-specialist equipment (such as reading glasses), in the context of education and assessment it tends to relate to electronic devices and computer software.

The JCQ regulations cover the use of ATs, explicitly mentioning everything from word processors and computer readers (which convert text into synthesised speech) through to more specialist pieces of equipment such as electronic braillers and eye gaze trackers (which allow individuals with motor impairments to communicate using their eye movements). Centres are able to apply for their students to use individual or multiple ATs and to combine these with other reasonable adjustments (such as additional time), depending on their specific needs.

Ofqual’s official statistics provide insight into the use of access arrangements for GCSE, AS and A level assessments (Ofqual, 2019a). The number of students using a non-interactive electronic question paper, the format most commonly used alongside various forms of AT, increased from 9,870 to 22,115 between 2015 and 2019[footnote 2]. Ofqual wishes to better understand the use of AT for assessment to ensure that its approach to regulation remains fit for purpose. It is important that the regulatory approach minimises the risk of malpractice and maladministration but also that it does not in any way inadvertently impede good practice. Woods, James and Hipkiss (2018) specifically highlight the need for Ofqual to regularly consult and communicate with stakeholders (including people who use AT for assessment) to inform the sector’s approach to reasonable adjustments. Building on input from the ACF, this report describes a qualitative study which Ofqual undertook with a small but diverse range of practitioners and students to gain a more nuanced understanding of how AT is being deployed for assessment in practice.
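To put the scale of this change in context (a simple calculation based on the figures above, rather than an additional official statistic), the rise from 9,870 to 22,115 is an increase of 12,245 students, or roughly 124%, over the 4-year period.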

Literature review

An inherent challenge for research of this kind is that technology is rapidly evolving. New software and hardware are constantly being developed and refined, and pedagogy and policy are often quite changeable as a result. This natural state of fluctuation means that some of the research evidence can become dated rather quickly. In addition, as discussed above, the term AT covers a wide variety of technologies deployed to support a significant breadth of needs. Much of the literature therefore tends to focus on specific groups of ATs that relate to particular types of need. For example, the issues associated with ATs used by visually impaired students (for example, Nees and Berry, 2013) are often quite different to those associated with reading difficulties (for example, Svensson et al., 2019) or physical impairments (for example, Hemmingsson, Lidström, and Nygård, 2009). Given this breadth, this report does not attempt to provide a comprehensive review of the research on AT and education; instead, it focuses on how the use of AT may interact with high stakes summative assessment. It is worth noting that in 2019 the Department for Education (DfE) developed a strategy for education providers and the technology industry to help improve and increase the effective use of technology in education. Within this strategy, the DfE have recently published a literature review on the use of assistive technology in educational settings (DfE, 2020).

The following literature review is divided into 3 sections. First, we discuss some of the theory behind the use of reasonable adjustments and AT in the context of assessment, with specific reference to ‘universal design’ and the ‘interaction hypothesis’. Second, we discuss the provision of AT in education in general, to understand how its use in assessment may relate to students’ normal ways of working. Finally, we consider how the use of AT is managed in the context of high stakes summative assessment in England.

Universal design and the interaction hypothesis

The purpose of providing reasonable adjustments for examinations (or ‘accommodations’, as they are sometimes called outside of England) is to ensure, as far as possible, equality of opportunity. This means that reasonable adjustments are intended to provide flexibility so that all students are able to engage with assessment in a manner appropriate to their needs. In other words, they allow students to demonstrate their knowledge, skills and understanding, unhindered by any disability, so that they can be fairly assessed. In this regard, access arrangements are intended to counteract any construct irrelevant variance[footnote 3] (see Messick, 1989) that is caused by the interaction between a student’s needs and the format of the assessment.
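One informal way to picture this, borrowing the familiar classical test theory style of score decomposition as an illustration (it is not a formulation used in the sources cited here), is to write an observed assessment score as

X = T + C + E

where T is the construct-relevant component (the knowledge and skills the assessment targets), C is the construct-irrelevant component (for example, the effect of an inaccessible format on a particular student) and E is random error. On this reading, a well-designed reasonable adjustment aims to reduce C towards zero for the students who need it, while leaving T, and the standard of performance required, unchanged.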

One method that assessment developers may adopt is to try to develop materials that adhere to the principles of ‘Universal Design’ (Story, 1998). These principles are intended to ensure that products and environments are, as far as possible, usable by all people without the need for adaptation. When applied to assessment, an examination with a universal design would be accessible to all potential candidates and there would therefore be no need for access arrangements. Indeed, many of the barriers faced by those with impairments are often also experienced by those without them (Madriaga et al., 2010).

Though universal design would be the ideal, the reality is that a ‘one size fits all’ approach is not always achievable in practice and reasonable adjustments are often necessary for supporting individuals with particular needs. The ‘Interaction Hypothesis’ states that such arrangements should, if correctly implemented, lead to better assessment outcomes only for those students who need them. In other words, a reasonable adjustment used by someone who has no need of it would not improve their performance (Sireci, Scarpati, and Li, 2005). For example, a student who is not visually impaired would be very unlikely to gain a better assessment score were they to be provided with a braille version of a question paper (in addition to the standard printed version).

Unfortunately, the evidence relating to the impact of various reasonable adjustments on performance is patchy and findings are somewhat mixed (Cormier, Altman, Shyyan, and Thurlow, 2010). For example, though a study may generate evidence that individuals with dyslexia (and only those with dyslexia) benefit from having assessment materials read to them (for example Fletcher et al., 2006), findings can be difficult to generalise. This is because research often focuses on specific reasonable adjustments tailored for specific assessments and specific groups of students.

Evaluating the impact of a reasonable adjustment is often further complicated where it may be provided for a variety of reasons. For example, extra time to complete an assessment may be granted because the student is dyslexic, because of emotional difficulties, because they require time to use an assistive technology, or for a wide range of other reasons. However, if performance is in any way contingent on the student’s speed (meaning the test is ‘speeded’, whether intentionally or not), then most students may benefit from extra time, regardless of any particular need they may have (Lu and Sireci, 2007).

In order to maintain validity, reasonable adjustments must leave the assessment construct (the target skill or knowledge) unchanged, along with the standard of performance that is required (Kettler, 2012). However, there can often be an interaction between the design of the assessment and the access arrangement. Such potential interactions need to be considered when arranging and regulating the use of AT to ensure fairness to all candidates. This can be a complex challenge.

The use of Assistive Technology in education and assessment

The JCQ’s guidance makes it clear that the use of Assistive Technology for examinations should reflect the student’s ‘normal way of working’ (JCQ, 2019). In other words, the student should be regularly using the AT for their day to day education if that AT is to be permissible for use in the examination. This means that the availability of AT for assessment is largely dependent on what is generally available to the student for their education.

There are various models which describe the ways in which AT and supportive services may be designed, selected and delivered in the context of education (Edyburn, 2001; National Centre for Special Education, 2016). The World Health Organization’s Global Cooperation on Assistive Technology (GATE) emphasises the importance of focussing AT provision on the person, implying that good deployment of AT ‘requires an individualized and holistic understanding of the value and meaning of AT for the individual …focusing on the person, in context, and then considering the condition and/or the technology’ (Desmond et al., 2018, p. 437).

One theme that emerges throughout the literature is that there is a constant need to inform all stakeholders, be they policy makers, teachers, parents, or students about the latest ATs which are available and how they may be used. In 2009, the British Educational Communications and Technology Agency (BECTA) conducted research into technology and inclusion in England (BECTA, 2009). The report found that the availability of AT was not consistent across schools. They suggested a number of ways through which provision of ATs could be improved, including centralised online resources to inform schools about available AT and a system for advising policy makers on how to address inconsistencies in assessment, funding and expertise.

Successful provision of AT is likely to be contingent on schools and teachers having contemporary knowledge of the ATs that are available and how they may be of benefit to students, particularly given the wide range of choice (Conderman, 2015; Forgrave, 2002; Hayhoe, 2014; Netherton and Deal, 2006). It is important to note that the mere availability of an AT is not alone enough to ensure that it is successfully adopted. Adoption is considered contingent on the ‘five Ps’: People, Products, Provisioning, Personnel and Policy (Smith et al., 2018). An AT Product can only drive educational improvement where it is facilitated by the other four Ps. For example, a useful product will only be successfully adopted where policy allows and where personnel have the expertise and capacity to operate and maintain the AT.

Copley and Ziviani (2004) examined the literature on the use of AT with children and found that inadequate staff training and support, negative staff attitudes, substandard planning processes, poor funding, and difficulties procuring and managing equipment could all be barriers to the adoption of AT. A study conducted by Lamond and Cunningham (2019) found that teachers with higher levels of AT knowledge and better computer literacy were more likely to perceive AT as useful for their students. This finding is supported by a study by Woodbury (2015), who found that training sessions were one way to improve teachers’ implementation of AT.

Of course, the successful use of AT also requires buy-in from students. Draffan, Evans and Blenkhorn (2007) conducted a telephone survey of students who were using AT as a means to support their dyslexia. They found that engagement with training was variable, and concluded that this might mean that some of the students were unlikely to be using the available technology to its optimal effect.

Even if a teacher, parent or student is engaged and well informed, there are many implications for the selection of a particular AT which need to be considered. Satsangi, Miller and Savage (2019) discuss how, alongside considerations of usability and cost, the educational and emotional impact of the AT on each individual student must be considered. For example, will the AT promote autonomy and independence? Will it align with the student’s preferences? Will it impact on the student’s social interactions and confidence? All of these are very important considerations that go beyond assessment performance.

The Department for Education’s literature review on the use of assistive technology in educational settings (DfE, 2020) summarises the available evidence concerning the use of ATs in education and the current limitations. The review highlights a lack of research-based evidence to inform AT interventions in practice, including limited empirical work on the relationship between accessible educational materials and AT and a lack of universal screening for AT interventions. Although there is much advocacy by educators, parents, students and policymakers on the importance of AT, its potential is not yet being realised because of the limited availability of appropriate AT for those students who need it. The lack of a universal screening procedure to ascertain student needs means that someone within the school must identify and advocate for a pupil. This contributes to inequitable access because identification and implementation depends on staff training and support. The author concludes that AT has important applications for students and suggests that a coordinated effort between policy makers, administrators, educators, researchers and industry is required for AT to realise its potential to provide students who need it with the means for accessing and engaging in the curriculum.

This literature, taken together, demonstrates the importance of viewing the use of AT for assessment in the wider context of how students use technology in their daily activities. Though a particular AT may offer functional support for an individual with a particular impairment, there is a clear need to understand the extent to which that individual is already familiar with the AT through either training or regular use.

Assistive technology as a reasonable adjustment

As discussed, in England assistive technology may be used as a reasonable adjustment for an examination where it reflects the individual’s normal way of working (JCQ, 2019). AT may be deployed in combination with other reasonable adjustments where this is appropriate for the individual concerned. For example, if an AT is time consuming or tiring for the student to operate, the centre may also request that they be given extra time. Similarly, AT can sometimes be used to remove the need for another reasonable adjustment. For example, students may prefer to use text-to-speech software instead of a ‘human reader’ (an assistant who reads the examination materials out loud to the student during the assessment).

Ensuring that reasonable adjustments are efficiently governed and administered is a challenge. Woods (2007) suggests that in order for access arrangements (including reasonable adjustments) to be successfully implemented in schools their administration must be fair and manageable. Along with the rules and guidance around the use of access arrangements, the JCQ also operate the system that manages and records applications. In 2008 they significantly changed the process with the introduction of the ‘Access Arrangements Online’ system.

Woods, James and Hipkiss (2018) explored the views of special educational needs co-ordinators (SENCos) with regard to this system using a questionnaire. Of those surveyed, 46% perceived the system for administering access arrangements to be fair while 21% stated that it was not fair[footnote 4]. In terms of manageability, 25% of respondents perceived the system to be manageable while 43% felt that it was not manageable. The respondents’ qualitative responses suggested that their concerns centred on both the amount of time the application process took and its complexity. There is a tension between the need for a straightforward process that reduces burden upon teachers and the need for a system which is robust enough to deter and prevent potential malpractice.

The use of AT as a reasonable adjustment is also contingent on Awarding Organisations (AOs) providing compatible assessment materials. Nees and Berry (2013) discuss how text-to-speech AT can provide equitable access to assessment for those with disabilities but that there remain both technical and social barriers to implementation. Technical barriers often relate to the availability and compatibility of digital versions of the assessment materials, such as electronic question papers. For example, particular versions of software may be partially or completely incompatible with the format of the digital materials that are supplied by the AO. The extent to which AOs currently adhere to recommended accessibility standards, as provided by UKAAF (UK Association for Accessible Formats, 2019), is somewhat unclear. The sheer range of available AT, and the fact that it is constantly evolving, make it difficult to ensure electronic assessment materials are universally compatible.

Research questions

The literature review has provided some context for exploring teacher and student perspectives on the use of AT for assessment. The research study itself is driven by the following open and explorative research questions:

  1. Under what circumstances do centres and learners use AT for assessment?
  2. What do centres and learners perceive to be the strengths and weaknesses of current arrangements around AT?
  3. What is the learner experience of using AT for assessment?
  4. What is the impact of using AT on the quality of the assessment?
  5. What are the barriers to effective use of AT?

The intention is not to provide comprehensive answers to these questions but rather to improve our understanding of the context in which learners and teachers may work and the challenges they may face when using AT for assessment. By developing our understanding, through case studies of real-world practice, we will enhance our capability for informed regulation.

Methodology

Research design

The research questions outlined above are open and explorative, lending themselves to a qualitative approach. Recent literature has highlighted the need for policy makers to take greater account of student voice with regard to matters that affect them, noting students’ ability to provide powerful and unique insights (Barrance and Elwood, 2018; Woods, McCaldin, Hipkiss, Tyrell, and Dawes, 2019; Woods, Parkinson, and Lewis, 2010). The intention for this study is to focus on the first-hand experiences of both practitioners and students, allowing themes to emerge naturally through the discussions.

Given this brief, a semi-structured interview approach was selected. This method allows emerging themes to be explored in depth and enriched by context, meaning that the strengths and weaknesses of current rules and guidance may be considered and examples of best practice identified. It is important to note briefly that findings from qualitative research of this type should not be generalised too broadly because samples are of insufficient size to allow us to confidently estimate the prevalence of particular issues. What the approach instead facilitates is a deeper understanding of the mechanisms that underpin the use of AT in context.

Data collection was divided across 2 phases. Interviews were conducted with teachers and/or Special Educational Needs Coordinators (SENCos) in Phase 1, while students who were AT users were interviewed for Phase 2. The phased approach was a methodological decision. By speaking with teachers and SENCos first, we hoped to gain an overview of policy and practice in schools before we engaged with students about their personal experiences. The phasing also provided us with an opportunity to take advice from teachers about the needs of the students we would be interviewing, helping us to ensure that the research process would be suitably accessible and comfortable for those involved.

Interviews were undertaken on either a one-to-one basis or in small focus groups of 2 to 5 interviewees. This approach was chosen for both practical and methodological reasons. Practically, it was easier to organise group interviews in cases where participants had busy schedules and researchers needed to travel and could therefore make only a single trip. Methodologically, it is important to note that these 2 interview types create differing social dynamics and therefore elicit slightly different types of data. Generally speaking, one-to-one interviews are considered slightly better for exploring themes in depth while focus groups elicit somewhat wider ranging discussions and may generate more ideas (Breen, 2006). By combining the two, this study aims to draw on the relative qualities of each approach.

Ofqual researchers travelled to conduct the interviews for each phase (where applicable). Interviews ranged between 18 and 90 minutes in duration (the mean duration was 55 minutes). Interviews took place, where possible, in a quiet room and all were audio recorded. The audio files were transcribed verbatim for analysis. One student was interviewed over the phone and 3 students used Augmentative and Alternative Communication (AAC) to participate. Each interview was guided by a schedule, which was intended to provide a framework for discussion that did not prescribe the topic or prevent the interviewers or participants from pursuing useful tangents. The interview schedule for the teacher and SENCo interviews was slightly different to that for the student interviews, though they covered the same general topics. The interview schedules were developed with feedback from the ACF.

All participants were required to read an information sheet and complete ‘informed consent’ paperwork prior to their interview. In the case of those students who were under the age of 16, we also required signed informed consent paperwork from a parent or guardian. Given that some of the young people involved had impairments that could make them vulnerable or uncomfortable, we discussed with their teachers, prior to the interviews, how best to communicate with them. In such cases, these teachers were present for the interview to support the student should they have required any assistance or reassurance.

Participants

Potential participants were identified with support from the ACF, whose members kindly recommended schools, colleges and individuals to us for Phase 1. In most cases, members of the ACF made contact on our behalf (either directly or via their mailing lists), with potential participants then directly contacting Ofqual to volunteer. Recruitment was therefore opportunistic, with the explicit purpose of identifying those with particular experience or expertise in facilitating the use of AT for their students. For Phase 2, students were identified by the Phase 1 contacts (where possible) and invited to participate. In total, 9 schools took part in Phase 1 and 5 of these schools also participated in Phase 2, with 1 additional school recruited for Phase 2 (see Table 1 for a breakdown).

Table 1 Study Participants

Centre ID | Centre or participant type | Phase 1 (teachers and SENCos) | Phase 2 (students)
A | Sixth Form College | Yes | No
B | Academy Sponsor Led | Yes | Yes
C | Academy Converter | Yes | Yes
D | Independent School | Yes | Yes
E | Independent School | Yes | No
F | Community Special School | Yes | Yes
G | Community Special School | Yes | Yes
H | Local Authority Peripatetic staff | Yes | Yes
I | Local Authority Peripatetic staff | Yes | No
J | Voluntary Aided School | No | Yes

In Phase 1 we conducted 4 interviews and 5 focus groups (of 2-3 people). The sample included 3 individual teachers who are peripatetic, supporting a portfolio of centres within a specific Local Authority (LA). Though these individuals were not based at a particular school or college, they had a breadth of experience based on their work across multiple centres. The other interviewees were based at specific schools or colleges. The primary contact for each school chose whether to also invite colleagues who they felt would add value to the discussion because of their experience and expertise (in these cases we conducted focus groups). The sample for Phase 2 included 20 students who had completed their GCSEs or A levels within the past 1-3 years. We conducted 6 focus groups (of 2-5 people) and 1 telephone interview for Phase 2.

The interviews for Phase 1 were conducted between June and July 2019 and the interviews for Phase 2 were conducted between November 2019 and January 2020. The discussions in both phases were diverse and included a range of ATs used by the students, including: text-to-speech, speech-to-text, scanning and reading pens, eye-tracking software, braille note takers, magnification software and equipment, modified equipment (for example enlarged keyboards), word processing software, and generic ATs (for example tablets, digital recorders).

Findings

This section discusses the research findings under the 5 broad themes:

  1. Reasonable adjustments are usually bespoke to the student
  2. Examination question papers could be more compatible with AT
  3. Schools may not always be fully aware of how AT may be used for assessment
  4. AT has variable impact on the experience of undertaking assessment
  5. AT removes barriers to assessment but sometimes changes how it is accessed

Though these themes are presented separately, they link and interact with one another. Before describing and discussing these themes in detail, it is worth noting that we have chosen not to provide counts for how frequently each theme arose or how often particular views were expressed. The reason for this is that such counts would obscure the unique circumstances of the centres we visited, which are highly diverse in terms of both their provision and the student populations for whom they cater. Our thematic analysis incorporates a wide range of ATs and contexts, therefore constituting a selection of case studies rather than a sample from which it is possible to generalise. The perceptions of those we interviewed help us to understand the type of issues teachers and learners may be facing and how these are influenced by context, but do not tell us how prevalent these issues may be across all schools in England. We will now explore each of the 5 themes in turn.

Theme 1: Reasonable adjustments are usually bespoke to the student

Decisions around the deployment of assistive technology appear to be largely driven by the specific needs of the student. Under this theme we will discuss how there is a high degree of personalisation that extends beyond the selection of particular equipment and software. Decisions often involve consideration of contextual factors relating to the academic subject, the environment in which the assessment is to take place, and the current physical, attitudinal and emotional condition of the individual student.

Specific ATs may work better for some students than others, even if, on the surface, the needs they are attempting to fulfil are similar. Not only is it important to find the correct AT(s) for that individual, but it is also important to identify the correct way for them to use those ATs. The process can therefore take time and involve a degree of trial and error.

We have an initial assessment that we do to look at the need for assistive technology, and what are the areas of difficulty, to identify the right type of technology for that child. And then obviously through continual working with a child we’re constantly assessing whether that’s meeting the need of that child, and whether we need to look at other options. Teacher, Academy Converter, C

The needs of individuals change over time, while technology is also constantly evolving. It is important to frequently re-assess students, identifying and meeting new needs as they arise. Through re-evaluation, teachers attempt to ensure that their students still qualify for any arrangements that they have been previously granted.

Because we’re very conscious that when you have a diagnosis of a specific learning difficulty when you’re younger, obviously that can change, your needs can change over time, so it’s a case of obviously making sure that they still qualify when they get to those key stages, but also sometimes a new need has arisen. So there might be somebody who didn’t qualify for a reader for example when they had the report done, but then as things have got more difficult and they’re finding the curriculum harder, there’s more reading, there’s bigger vocabulary, then they might actually end up qualifying for a reader later on. Teacher, Independent School, E

In some cases, individual students may be able to compensate for their needs at GCSE level but may then struggle to access A levels because of the more intensive curriculum. In other words, the ‘step up’ to A level may bring a need for AT into sharper focus, where previously the student had been coping (perhaps unnoticed) without it.

And possibly those really quite bright dyslexics that have compensated very well up until the end of GCSE and have done very well, that the volume of reading for A level just tips them over the edge and cracks start to appear… Teacher, Independent School, D

It is important that centres identify and support students who run out of time or otherwise struggle with exams. In one centre, this took the form of an internal process whereby students who were consistently running out of time during exams were identified and referred for additional testing.

We pick up a lot of students that have never had an exam arrangement before. They know they’ve run out of time, they know they’ve struggled with exams, but nobody has ever picked [this] up. Teacher, Sixth Form College, A

Here we can see an interaction between the provision of AT and another reasonable adjustment: extra time. It is frequently the case that these 2 adjustments are required in unison, sometimes to meet the needs of the individual student and sometimes simply because the AT itself is time consuming to operate. In the example below, the student, who had previously struggled to finish exam papers on time, found that access to an AT (and the additional time required to use it) had helped them to achieve grades that they felt better reflected their ability in the subject.

It’ll be a scribe, a reader and a computer, and extra time as well. …The amount of work I can get done in the amount of time is, like, significantly different. I can get so much more done. But that’s still not necessarily the amount of like what’s expected. But I can, I’m more likely to finish the exam now than I was before so that’s good. Student, Independent School, D

A point which is clearly reflected in JCQ’s guidelines is that reasonable adjustments should mirror the student’s normal way of working. Regardless of the ATs being discussed, teachers appeared to be well aware of this guidance and keen to ensure that students’ classroom experience of ATs was aligned with what they would experience in an exam.

The idea is that I don’t want anyone to ever walk into an exam and do something incredibly different from what they normally do… Teacher, Sixth Form College, A

…it really comes down to what the student prefers, because the JCQ documentation is very clear, it has to be their normal way of working, so whatever they would normally do in lessons is what we try and replicate in exams. Teacher, Community Special School, G

Despite this, defining the ‘normal way of working’ may not always be straightforward. Individual students may use different combinations of ATs for different subjects and contexts, sometimes using multiple ATs and sometimes no ATs at all. For example, students may decide to handwrite short, structured questions and to type essay-style questions. This depends on the student’s preferences and needs.

I have a few that do mixed options, so for biology they might choose to handwrite for paper 1 and 2 because they’re short answers and then they type for paper 3 because that’s got the long answers. So really it’s about a conversation with teaching staff as to the format. Teacher, Sixth Form College, A

For readers who particularly only need odd words or odd sentences read rather than whole papers I think the reading pen provides a much better option for those students over the [speech-to-text software] and especially over a human reader. Teacher, Independent School, F

The students offered insights as to why they may prefer to use different ATs for different subjects. The use of technology where students are required to read long passages of text and write long form answers was considered beneficial, while it was not considered as important for subjects such as maths and sciences, where less reading and shorter responses tend to be required. Depending on their needs, students may find it preferable to use pen and paper where graphs and diagrams are required.

For me it was because I was too slow at writing, so it became a problem keeping up for like written subjects. For maths I don’t use it…. Sometimes in physics I get it when I need it. So I just have it alongside a sheet of lined paper that I can make notes on like handwritten or type, I normally type notes. But like diagrams I can do by hand. Student, Voluntary Aided School, J

It is worth noting that in certain circumstances, where the arrangement may impact on the validity of the assessment outcome, some reasonable adjustments are not permitted. For example, students are not allowed a human reader for certain English examination papers, but they can use a computer reader (an AT). Therefore, some students that were familiar with assistance from a human were required to adapt and use AT for the exam.

Students with visual impairments may prefer to use a Perkins Brailler (a braille typewriter) for some subjects but either a computer or a BrailleNote (a braille laptop computer) for other subjects. One student explained a preference to use the Perkins Brailler for graphs, while another used the BrailleNote for scientific subjects because of the inbuilt scientific calculator.

So the Brailler, the Perkins Brailler… that’s sort of useful for when you need to actually write a graph… if you’re doing a coordinate graph it’s just easier to put the graph paper into the Perkins Brailler and then do the axis on a Perkins than it is to do it another way. It’s just quicker. And I’ve done it before so… I’d be quite confident in doing it that way as well. Student, Community Special School, G

…for the maths and science GCSE where I used the BrailleNote machine. […] They have a scientific calculator built in which, out of all the different types of scientific calculators I’ve tried for the maths and the science GCSE, was the best I thought. So I used that for my calculations for the maths and the science. …it was just an extra piece of equipment that helped for those two exams. Student, Local Authority, I

However, the BrailleNote was often preferred for its speed and flexibility when writing and modifying larger bodies of text.

Then I’d use a Braille Note or a laptop or computer. Just because on a Brailler, when you’re writing that sort of amount, when you’re writing long question answers it can take some time. Plus if you make a mistake on a Brailler you can’t, it’s quite hard to get rid of the mistake as you can’t really go back and delete; you have to rub it out and that causes the paper to look untidy and stuff like that. Student, Community Special School, G

It is perhaps unsurprising that the bespoke nature of students’ normal working arrangements often leads to unique logistical arrangements on the day of the examination. Students who are using AT often require considerably more desk space than their peers so that they may accommodate their electronic devices and any adapted assessment materials. To illustrate this, one teacher described having to give up their office on an annual basis during the summer because they had a large curved desk which could accommodate a large amount of assistive technology. One visually impaired student described how they required 3 tablet computers in an exam, each one locked down to display a single document (thus preventing access to any functionality that may provide them with an unfair advantage).

But in my A-level… I had a graphical calculator and a normal calculator and I had three different [tablets], because you can’t have them all on the same [tablet] because of restrictions. So I had, the order it would go: formula booklet, graphical, scientific, the question paper, which was massive, answer booklet… Student, Academy Converter, C

It may also be that a typical examination environment (a large and silent school hall) is not conducive to the needs of the student. Some students may require a room with fewer or no people in close proximity, particularly if they need the use of a human reader alongside their AT.

We’ve had a hearing impaired student where obviously the assistant has to speak up a little bit. Now, we took the decision, rightly, to have that one-to-one in a single room where they’re isolated, because other students can overhear what’s being said; whereas, other students who are blind, there may be three or four in a room but they’re spaced out in such a way that overhearing would be very, very difficult. So we’ve just come to, over time, make judgements on that according to our pupils’ needs…. Another example is that we’ve had students here with seizures. So obviously it’s right and proper to put them in isolation as well in case they have such a seizure and that would impact on other students in the room. Teacher, Community Special School, G

It is worth noting that several teachers suggested that their school was likely to be unusual in the extent to which they strived to find the right ATs by evaluating (and re-evaluating) their students’ needs. This reiterates the notion that the centres where we conducted interviews may be demonstrating ‘best practice’, informed by their experience of working with cohorts that have particular needs. As we will discuss, other centres may lack the expertise and resource necessary to be as proactive.

Though much of this theme is not explicitly instructive with regard to the regulation of ATs for assessment, it provides an important overview of some of the potential issues facing teachers and students. The interaction between the student and their AT is usually unique and is often fluctuating, meaning that any rules around the use of AT must be sufficiently flexible.

Theme 2: Examination question papers could be more compatible with AT

One of the major themes of the discussions was about technical compatibility between various ATs and examination materials. Practitioners and users of AT can sometimes be frustrated because the examination papers do not work optimally with the AT.

The prime example of this related to the fact that non-interactive electronic papers were usually provided in the ‘PDF’ format. There were concerns about how reliably and universally these files worked across the range of available ATs (particularly screen-reading software) and whether AOs were doing enough to test this compatibility. This issue appears to relate to the various types of screen reader software that are favoured by students who have visual impairments. There are a wide range of screen readers available, all of which may interact somewhat differently with a given PDF.

So you can order the non-interactive PDF, which you’re recommended to do. But you need to know if it works with your screen reader, and if it doesn’t you need to try a different screen reader, and the child needs to get used to it. And it doesn’t matter which screen reader you choose, there will be little quirks… If it’s an American one it might say ‘period’, or it might say ‘tick’… I think that would help if there was a standard for the PDF papers in terms of accessibility and how it was made. They [the students] would know exactly how it’s coming, and then they could definitely work out what software they want to use with it, or screen reader, and then just get ready for it by practising with PDFs made to that kind of standard. Peripatetic Staff, Local Authority, L

They [the students] use screen readers… and they [AOs] never seem to test the papers on that. …what will happen is [the screen reader] will read some of it and then it won’t anymore, there’ll be bits missed out. …I don’t think they’ve tested them on any screen reader, because things like diagrams and stuff, if you put a label on it saying, this is a picture of a cat, the screen reader will read it out to you. Teacher, Community Special School, G

In the example below, two students discuss an issue whereby their AT sometimes failed to identify sub-questions (for example ‘1(a).i’) within the overall structure of a question. This created a risk that they might fail to notice a sub-question and leave it unanswered.

S1: Yeah, instead of having like 1(a) part (ii), you could just have like just one question [to] work through. S2: But yeah and just the fact that if you have one small line at the bottom of a page underneath a big question, it’s quite easy to miss that. Students, Academy Converter, C

Some of those we interviewed expressed a preference for receiving examination papers in the formats used by word processors, stating that these tended to be more compatible with ATs.[footnote 5]

And the… [software] hates PDFs. It works in a completely different way. So, on a [word processor] document it’s a bit like, have you ever used it, you highlight it, don’t you, as if you were going to copy or cut it, click on the button and it reads it to you. On a PDF you have to create what looks like a textbox around the text and then click on read. And it’s really fiddly. Teacher, Sixth Form College, A

Alongside the issue of technical compatibility, non-interactive electronic question papers do not allow students to type their responses on to them directly. This is in contrast to standard paper-based examinations, where students handwrite their responses in the space provided on the page, usually directly below the question. The use of digital papers therefore often requires students to use two laptops: one to display the non-interactive paper and one for use as a word processor for typing the response.

So say for example a child is using a laptop and has got a non-interactive PDF paper… they generally probably will have two laptops, one with the paper in view and one that they’re typing on, so it can be quite complicated. So I’m not sure why you can’t type straight into a PDF document. Teacher, Independent School, E

In addition, the ability to enlarge text on non-interactive papers is often limited to the ‘zoom’ feature, which can be problematic for visually impaired students.

The PDF can’t always be read by the screen readers…. And as you enlarge it, it becomes a bit blurry and fuzzy, so [a word processor file format] is better, because with [it] the student can then change the font size if they want to. Peripatetic Staff, Local Authority, I

… bigger doesn’t necessarily mean better. It’s such a misconception that, oh, well, just make it in A3 and that makes it bigger. No, because if you take something that is really thin and grainy and make it big, well, now you’ve got something that’s big and grainy…what you’re actually doing is just making it so there’s more stuff to see and I can’t actually put all of that into my head at the same time. Student, Academy Converter, C

Given the need to prepare students for examinations, some teachers prepare their students to work with PDFs so that they are familiar with using their AT with the format of the non-interactive papers.

The other thing that is difficult is we’re then actually having to teach them two different ways, because the rest of the year they’re using it on really easy to use [word processor] documents… and then at the end we’re having to make sure that they actually know how it works in a PDF, because they’re very rarely using PDF documents in class. Teacher, Sixth Form College, A

The PDF, it can be a little tricky to use but you just have to remember the steps… So it’s just getting familiar with the processes and making little cards for them to say, OK, just press this, this, this and then, yeah. We’ve had a few issues with the PDF reader but I don’t think that’s the PDF reader’s fault, I think that might be just our technology! But anything that could make it easier would be great. Teacher, Independent School, E

The quotes above focus on the compatibility of screen readers and other AT with non-interactive electronic papers. There were also cases where interviewees mentioned compatibility issues with physical copies of the exam papers. For example, one teacher suggested that when exam papers are designed they should take into account the diversity of students that will be taking the exam, thus evoking the ideas of universal design discussed in the literature review.

I do sometimes wonder whether the people that are… designing these papers whether they do think about the broadness of the students that are sitting them. If you’re mindful that these options are available to your students, the paper should be able to be accessed by the options that are available. So if you look at the reading pen, that’s a very low grade exam access arrangement. But if you’re designing papers that don’t fit the way in which that works you’re taking away actually a really useful arrangement, and could be forcing someone out of their normal way of working to engage with a different form of technology just so they can access that paper. Teacher, Sixth Form College, A

To try to overcome the challenges that are sometimes presented by the interaction between the format of examination papers and ATs, schools tend to request different versions of the same paper for the students. This allows students to swap between the two versions, depending on what they are trying to achieve.

Students sometimes when they have the exam papers in front of them, for whatever reason they’re still not comfortable with the font size and presentation of it. So they might choose to use an ordinary copy of the paper, normal size print and then use a [magnifying device] to read it. And so they can make the print larger or smaller and change different settings to make it black on white, white on black. Teacher, Community Special School, G

I think the most important thing to me was to have a normal size paper and also an enlarged copy of the paper… So I would have my A3 paper on my lap… here with the A4 paper and there would be a monitor screen in front of me. So while I would be reading what was showing on the screen, I’d be reading that and then I’d go to my left and start to write down what the answer was. So I think it was really convenient to me and I think it was quite efficient as well. Student, Community Special School, G

In addition to non-interactive digital examination papers, those who use AT often also require modified papers. For those students with visual impairments, any errors or inconsistencies in the translation from the original paper to the modified version can present additional challenges. A single typographic error can completely change the meaning of the question. It is unclear whether such errors are more or less prevalent for modified papers and whether exam boards operate similar checking procedures for both the modified and standard versions of the paper.

So for example for computer science there’s a little bit of programming involved so you have to read a line of code and interpret it or correct it, or that type of thing. And my computer science GCSE was, the braille for that code I needed to read and correct was wrong. So I tried to correct, I thought I was correcting [it]. So basically I corrected the wrong thing. I corrected their braille instead of what I should have corrected. So yeah, so both the maths GCSE and the computer science GCSE kind of did put that little bit of extra stress on me because they’d been wrong twice. The accessibility bit had been wrong twice. So yeah it, by the time A-levels came around, I mean I did AS as well in my first year and one of them was wrong as well. It was just a spelling mistake in the computer braille again but you know it is still, it’s still, well it still affected me anyway. It still made me a little bit wary basically. Student, Local Authority, I

This theme demonstrates that there is still scope for improvement in the design of assessment materials, be they on paper or digital, standard or modified, so that they can be more compatible with ATs and better meet the diverse needs of students. It may be that examiners can further embrace the principles of universal design when developing their question papers.
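One way to make concrete the kind of compatibility testing that interviewees felt was missing is a basic automated check of whether the text of an electronic question paper can be extracted programmatically, since text that cannot be extracted is unlikely to be voiced reliably by screen-reading software. The short sketch below is an illustration only: it assumes the open-source pypdf library and a hypothetical file name, it is not a procedure described by any participant or used by Ofqual or the AOs, and it would complement rather than replace testing with the screen readers that students actually use.

```python
# Illustrative sketch only: a rough, automatable proxy for screen reader
# compatibility of a non-interactive electronic question paper in PDF format.
# Assumes the open-source pypdf library; "question_paper.pdf" is hypothetical.
from pypdf import PdfReader


def flag_low_text_pages(path: str, min_chars: int = 40) -> list[int]:
    """Return 1-indexed page numbers with little or no extractable text.

    Pages that yield almost no text are often scanned images or untagged
    content, which text-to-speech tools may skip or misread.
    """
    reader = PdfReader(path)
    flagged = []
    for number, page in enumerate(reader.pages, start=1):
        text = page.extract_text() or ""
        if len(text.strip()) < min_chars:
            flagged.append(number)
    return flagged


if __name__ == "__main__":
    for page_number in flag_low_text_pages("question_paper.pdf"):
        print(f"Page {page_number}: little or no machine-readable text")
```

Even where text extraction succeeds, a check of this kind says nothing about reading order, the labelling of sub-questions or descriptions of diagrams, all of which interviewees reported as points of failure, so manual testing with a range of screen readers would still be needed.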

Theme 3: Schools may not always be fully aware of how AT may be used for assessment

Many, if not all, of the schools and colleges involved in the study were well informed with regard to the availability of, and potential uses for, AT. They had a detailed understanding of the rules and guidance around implementing AT for high stakes assessment and were able to reflect on where they felt there were ambiguities with regard to what was permissible. Given the opportunistic nature of the sample, it is likely that this high level of knowledge is somewhat unusual. None the less, the insights allow us to consider the availability and clarity of key information.

Whether real or perceived, the interviews suggest that there may be some grey areas in the rules and guidance. For instance, there was uncertainty over whether particular programmes or apps on a tablet computer needed to be deactivated for an examination.

I said use the [calculator] on the [tablet]. But they [the teachers in the school] were worried about that… It would just be [a case of] switching off anything that would give the student an advantage. Peripatetic Staff, Local Authority, H

One significant challenge is maintaining rules and guidance in an environment where technology (and terminology) advances so quickly. It may be difficult for teachers to ascertain whether particular software or equipment is permissible. The interpretation of the rules and guidance may sometimes drive behaviours which do not truly reflect their principles or ‘spirit’. The examples below illustrate areas where differences in interpretation have created uncertainty and may have led to some valid uses of AT for assessment being prevented.

I feel like there’s lots of things, there’s lots of assistive technology out there and things have moved on a lot, [some newer AT] might be allowed but it’s maybe not specified. Teacher, Academy Converter, C

We haven’t used this in exams [Software which allows you to edit a PDF] and we were wondering with the JCQ document, it says it’s a non-interactive PDF. And whether that non-interactive is because the exam boards don’t want people to use that as an editable document, or whether it’s just that the PDF is not normally editable, whether it would be allowed or not. Teacher, Academy Converter, C

The exam officers are obviously not the teachers in the school, and they’re quite driven by the regulations and the rules. And I think they’re sometimes scared or don’t always understand… they haven’t quite got all the information or they read the regulations and maybe if it doesn’t mention [tablets] but mentions assistive technology then maybe sometimes they’re not sure [what is] allowed. Peripatetic Staff, Local Authority, H

Despite these difficulties, the general principles underpinning the current JCQ rules and regulations were not challenged – any criticism focussed on particular details, suggesting that the fundamentals of the approach were considered broadly appropriate by the teachers and learners we interviewed.

It was encouraging to hear examples of teachers engaging in regular and ad-hoc activities intended to share good practice between schools. For example, one school arranged for one of its students to visit a nearby school and speak about how they use AT.

There’s a school down the road who use readers and they’ve been thinking about it, so we’ve been giving them advice. They’ve come in and we’ve spent a few hours having meetings with them, showing them software. In fact one of the girls that was our first tester… came and did a presentation about it to these random strangers from another school about her experience using it and how it changed her life and how to use it and that was really good. Teacher, Independent School, E

The centres we have engaged with generally represent case studies of best practice. Some of the specialist schools had dedicated resources and expertise that were used to ensure that students had the best possible AT provision. Though this will not be possible at most schools, it is important that best practice is shared to ensure that all centres are aware of what is possible and permissible. The greater risk may not be that centres fail to follow the rules around the use of AT (for which they could be sanctioned), but rather that they assume the rules are inflexible and therefore feel unable to provide the most suitable arrangements for their students.

There’s a lot of grey areas where people… get so panicky about exams, what they can and can’t do that even as a professional it can be difficult to convince a setting that they can do it if it’s not written in black and white. Teacher, Academy Converter, C

Theme 4: AT has variable impact on the experience of undertaking assessment

This theme reflects the experience of administering and undertaking high stakes assessments using AT. Given the diversity of contexts, the range of available ATs and the bespoke nature of student needs, it is reasonable to state that the impact of using ATs is idiosyncratic to the user. In line with the interaction hypothesis, every individual and each assessment interacts differently with the specific set of arrangements (and AT) that are in place.

We have already discussed how the demands of some AT (desk space, power outlets) may sometimes necessitate the use of non-standard assessment environments (Theme 1). This can, of course, have a significant impact on the way in which a student experiences the assessment. Teachers can sometimes try to mitigate this impact through additional exam preparation. This can involve anything from providing training and written instructions about how to use technology during an exam (for example, a reminder to save work often) through to helping students to visualise what the exam room will look like and thus reduce their anxiety on the day.

And we like to prep our students, we’ve got photos, got them somewhere, of the exam room, so they know exactly what they’re going into. So for some students they’re OK in smaller groups, smaller rooms, in a classroom, it looks like the space that they’re used to rather than the massive exam hall which can be very difficult for some students. Teacher, Sixth Form College, A

So, when you’re in a heightened state of taking an exam and you’re stressed out, pressing the save button sometimes you might miss that step and then maybe you’ve lost an entire paragraph or an entire essay. So it’s drilling them beforehand to say, you must keep pressing save, you must do this, you must do that, so the typing is fine. Teacher, Independent School, D

This type of pragmatic advice is important as students may feel under pressure. One student described how saving work frequently had become natural to them following an incident in which their technology had failed and erased a significant amount of their work during an exam.

I mean it’s a natural thing you know. I guess the only time I didn’t do it then was because I was focusing on the questions and I really… you know, lose my focus, lose the track of everything… It really wasn’t something I wanted to do. I guess I paid for it but you know. At least I know now. Student, Community Special School, G

The potential for technology to fail can be an additional stress on the day of the assessment, one that can affect students and teachers alike.

[the students get stressed] if we can’t get a PDF reader to work and we then have to call in the IT Department who aren’t in this building and then it’s delaying the start of their exam because they don’t have the facility that they’re normally used to having. And it’s not only that, it’s the stress to the other pupils in that room, because luckily with the assistive technology we can put them in the same room as anybody else that’s typing, they’re just plugged in with earphones, so they’re listening. So yeah, there’s lots of things that can go wrong on the day! Teacher, Independent School, E

Depending on the AT being used and how it combines with other reasonable adjustments, assessment can be significantly more physically and mentally demanding for the student. One of the major impacts of AT comes from the fact that it often requires additional time to use, which extends the duration of the exams, in some cases very substantially.

I do feel sorry for them when they’ve got a couple of exams: one in the morning, one in the afternoon. […] they had an English exam, which was effectively four hours, and then a history exam, which was effectively three and a half hours, so that’s seven/eight hours of exams in one day. That’s tough. So psychologically and mentally it is really wearing on them. Teacher, Community Special School, G

By the end of the last exam I was, I was very tired. …I was slightly regretting the decision to do the whole eight hours or whatever. But that, yeah that was just, it was a bad exam all round to be honest that last one. Student, Local Authority, I

In some cases, the demands on students can be particularly high. One student explained that they had been granted 700% extra time (that is, 8 times the standard duration) so that they could complete their GCSE paper using eye-tracking technology. This meant that they had to complete the exam over a 3-day period and isolate themselves between the exam sessions (specifically, they were not permitted social contact with their peers or access to the internet, in case they gained an unfair advantage). Despite this challenge, they had relished the opportunity that the technology had given them to take a GCSE exam in spite of their physical impairment.

When I did my biology I had 700% extra time so 3 days isolated from my mates. I used my [Eye gaze software] to dictate my answers. The actual chance to give it a go and come out with a grade using my eyes. Student, Community Special School, F

Indeed, despite the challenges faced by many of those interviewed, the availability of AT for examinations was viewed positively in comparison with other arrangements, such as having a human scribe. This is because AT often appears to be important for a student’s sense of independence and therefore their self-esteem and confidence. In many cases, the positive impact of AT is likely to extend beyond assessment and education and into other aspects of the students’ lives. These wider considerations influenced both whether an AT was used and the choice of AT. The quotes below illustrate the nuance of this point.

[With regard to typing responses on a laptop] I think for the students it just allows them independence, it fosters independence for them. It shows them that they can write… so if they’re currently only able to write nine words or ten words per minute and they’re typing 25 words per minute they’re doubling then probably tripling their output… their hands are then able to keep up probably with their fast thinking, so they can see that it’s a mechanical reason as to why they’re not getting things done on the paper possibly and that just helps then with their self-esteem. Teacher, Independent School, D

[…] it is best to think long term independence, because our main goal is them having technology that’s going to help them in the future to go to A-levels, to go to university, to work, so it’s putting all of those things together. Peripatetic staff, Local Authority, H

I think for me, anyway, I’d prefer to have laptop skills rather than the Braille Note skills because it’s something that is used a lot more widely in the working world, you know, rather than a Braille Note. Student, Local Authority, I

It is not surprising that the use of AT somewhat changes the examination experience for the student. It is useful to be aware of this when preparing assessment materials and when considering requests for reasonable adjustments.

Theme 5: AT removes barriers to assessment but sometimes also changes how it is accessed

As well as changing the experience of undertaking assessment, AT can also have an impact on the nature of the assessment itself. This theme illustrates some of the ways in which elements of the assessment can be somewhat changed (often unavoidably) through the use of AT.

The intention behind the use of AT is to remove barriers to valid assessment by allowing all students fair access to it. Skills such as writing, reading and spelling are often important for communicating a response to an examination question, but they are not always explicitly covered by the assessment criteria. In such cases, these factors are sources of ‘construct irrelevant variance’ (CIV); they can affect the assessment outcome even though they are not part of what is being assessed, weakening the validity of that assessment. AT is therefore best used to level the ‘playing field’ by helping students to reach their potential without giving them an advantage over those taking the assessment without AT.
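To make this concept concrete, it can be sketched (purely as an illustration; this notation does not appear in the study itself) using the classical measurement theory view that an observed exam score is the sum of a construct-relevant component, a construct-irrelevant component and random error:

$$X = C + I + E$$

where X is the observed score, C reflects the knowledge and skills targeted by the assessment, I captures construct-irrelevant influences (for example, handwriting speed, reading accuracy, or the additional steps needed to operate a particular AT), and E is random error. On this view, a well-chosen AT reduces the contribution of I without inflating C.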

Essentially I find that the technologies that I’m using are doing exactly what they should do in that they’re levelling the playing field. And that’s what any exam arrangement should do. It should never give somebody an advantage over somebody else. Teacher, Sixth Form College, A

For example, for students who find reading challenging, text-to-speech software or a reading pen can help them to understand the question and increase their reading speed. The technology is helping them to understand and access the question but will not improve their performance if they do not have the targeted level of knowledge and skills.

We have some students that without it [text to speech] will misunderstand the question entirely just by reading one word wrong because their accuracy is so poor and that one word they misread can change the entire thing that they’ve written. And other people it’s just that they take so long to read the information, by the time they’ve read it half the exam has gone… So I think it is really effective to help people access the exams. Teacher, Independent School, D

[The] reading pen helps me to make my answer more sensible. So I think reading pen saved me my grades. Student, Independent School, D

Word processing affords students greater capacity to edit and restructure their work. Though such editing is possible using pen and paper, it is likely to be more challenging and time consuming, and may leave the student’s work looking untidy.

One of the big things which has changed from GCSE and A level for me is like structure matters so much. So it’s when you’re on your laptop you’re able, if you see that you, so let’s say you’ve got like a set structure you have to do, like point, explain point, evidence, explain, link, whatever, then it allows you, if you think you haven’t developed one of those bits enough, you can go back into a certain part and develop that certain part within that paragraph. Student, Independent School, D

So you can get all of your thoughts down, because sometimes when I’m writing something I get to the end and I think oh no hang on, I was trying to put too much down at once… And you can edit it as well… It’s really useful, definitely lose marks without it. Student, Voluntary Aided School, J

It is worth noting that not all students benefit from the use of a word processor.

I’ve had two teachers come back to me to say that [the student does] better work handwriting than they do typing and I don’t know whether that’s because it’s almost like a stream of consciousness, their fingers are just going and they’re not taking a pause to think about… what they’re writing or what’s going on. But for their exam they’ve come back and said they don’t want them on a laptop. Teacher, Independent School, D

This suggests that the manner in which a response is provided (typed or handwritten) can change how that response is prepared and structured.

AT can alter the nature of assessment in other ways too, requiring the use of processes that are not part of the assessment criteria. For example, we have already discussed examples of students managing multiple devices. In these cases, it seems that the use of technology requires additional planning and management, which may offset some of the accessibility benefits.

I can’t think of any situation where using the technology would actually put the student at an advantage. […] I mean having screen readers, having electronic calculator, actually it’s just giving the student with a visual impairment access to what the sighted individual would have in the exam situation. And it’s actually physically more difficult, because you’ve having to work between lots of things, rather than just looking at one sheet of paper aren’t you? Teacher, Academy Converter, C

There may also be additional cognitive demands placed on candidates who are using certain ATs. For example, for candidates using eye-tracking software, tasks that may be relatively simple on paper or on a keyboard, such as using a capital letter, may require an additional series of steps.

Yeah, it’s all the additional things with the punctuation things… because you’ve got limited space on a screen ‘punctuation pages’ [which must be navigated to using the software] are a button and then a page, so you’ve got so many extra steps to be doing that I think it would take a lot longer time.

…But then in order to do a calculation he’s then got to come out of that screen, go into his calculator, perform the calculation and then put that back into the [word processor] document… but it’s the time taken to come out of one screen, into the other one, then back into the other one. Teacher, Community Special School, G

Students with visual impairments experience similar physical and cognitive complexity because they need to switch between typing and reading in braille, while also remaining orientated so that they can navigate the various materials and devices on their desk. The issue is not merely the additional time that this requires but also how it affects the student’s process. The need to navigate and manipulate the software interrupts the flow of their writing and they need to store more information in their short-term memory. This additional cognitive demand is construct irrelevant variance because their performance may be affected by their capacity to manage this process, even though it is not relevant to the assessment criteria and is not required by those students who are taking the assessment by standard means.

If you or I are accessing a piece of text that we need to read and we’re just writing, we can do that, right, but a braillist has to physically move from what they’re typing to read and then coming back, so there’s an orientation element there that takes some time, but also mentally they’re having to switch from what they’re doing… So there’s a possibility that they’ve forgotten their flow in what they were writing, whereas you or I wouldn’t really have that issue. I could confidently argue that there’s a greater mental capacity required for a braillist to do all those operations than there is for you or I. Teacher, Community Special School, H

[For] English language, having to switch from listening to the article from reading the question on the paper to writing an answer on a different, on another [tablet], that that did take quite some time. Student, Academy Converter, C

Equally, completing an assessment that requires the analysis of an image or video demands significant adaptation for visually impaired students. Rather than viewing an image, these students are often required to listen to a verbal description of it – something that may at least somewhat alter the nature or demand of the assessment.

I remember there was a photo of, it was a child worker in a… factory or something who was making clothes and they were like ‘oh they look tired’. It literally was saying they look tired, like a couple of things, there’s clothes around them kind of thing. But one of the answers that someone in my class gave was the fact that it was in, all the clothes were black and white, so it was conveying a desperate image, this sort of thing. But it didn’t say that in the description, so there were things that I just didn’t have. Student, Academy Converter, C

No matter what level of description there is in a diagram that’s always going to, I think, take away from that experience of “seeing”, in inverted commas, the diagram for themselves. So it’s a little bit like the question in history exams where you have a photograph and that has to be described to them. They always find that the most difficult, because invariably the description that they give in the adapted paper is wholly insufficient. Teacher, Community Special School, H

In the case below, candidates for an Art assessment were required to choose from several optional topics, some of which required them to view a video. The student felt effectively excluded from those options.

The only thing with the art one [assessment] was that there was actually links to videos that you had to watch for the exam, which were described to me by my art teacher. And they weren’t audio videos whatsoever they were just visual so that I was completely reliant on her description. …there was actually six different topics you could choose from. Now, three of the topics came with a video and three of the topics came with a text description, all different things… So in that respect it kind of it took away, well it didn’t take away the choice but it made me lean more towards the text topics than the video topics if that makes sense. Student, Local Authority, I

Although the interviews mainly focussed on written assessment, there were also some interesting examples from other forms of assessment. In the example below a non-verbal student was taking a qualification in drama, for which one of the assessment criteria involved demonstrating expression through one’s voice. The teacher clearly felt that the criterion was too rigid and was unreasonably excluding those students who used synthesised speech.

That happened this year with the [qualification in] drama. There is a performance and part of the critique we received back was that the student was using a monotone, a voice without expression. Well, the student is non-verbal and her machine is a computer generated voice. It doesn’t have expression […].You can show expression in a variety of ways and it doesn’t have to be in your voice, because that’s why we get good at reading body language in the classroom. Teacher, Community Special School, F

The discussion above has focussed on how, in some cases, the introduction of AT subtly alters the nature of the assessment and introduces elements of CIV. However, it is worth noting that, overall, most of those we interviewed argued that AT had removed more CIV than it had introduced, thus improving the overall validity of the assessment for the individual. They believed that the AT was allowing students to better demonstrate their knowledge and skills against the assessment criteria.

Before we used assistive technology she qualified for a reader because she has severe dyslexia, but she wouldn’t speak to the person who was there to read. She wouldn’t ask the person to read because she doesn’t ‘do’ speaking to people. […] We got the assistive technology, the computer reader software when she was beginning of year 11, introduced her to it and it was like a different person. It opened up everything to her and she used it, well, uses it, she’s just finished her A-levels, she used it so much that her grades drastically improved. Teacher, Independent School, E

My typing speed is something over 100 so obviously I can just get more done because I’ll type faster than the others. …I can plan more. It’s not like I finish extremely early, it’s just that I can make more detailed things. I’d say that’s good, and it gives a more accurate picture of my skill in that subject. It’s not that I have, I don’t see it as an unfair advantage, even though it may be perceived that way, it’s that I’m actually just showing my skill at a subject better with more ability to write, I can plan it more. Student, Voluntary Aided School, J

…it’s probably hugely liberating to a whole cohort of pupils that would never be able to get to the next stage of education without it and so all that wasted potential in the past, those pupils now have the ability to go through. Teacher, Independent School, D

To summarise, AT may introduce some degree of construct irrelevant variance by changing the way in which students approach and conduct an assessment. However, this is balanced by the construct irrelevant variance that AT may be counteracting by allowing the student an opportunity to fairly access the assessment and demonstrate the relevant knowledge and skills.

Discussion

To summarise, the following 5 themes emerged from our analysis of the data provided by these case studies.

Reasonable adjustments are usually bespoke to the student

Each student has unique requirements that vary across subjects and contexts. This means that rules and guidelines that govern the use of AT in assessment need to be clear in their underpinning principles but flexible enough to allow students to make bespoke arrangements that reflect their normal way of working and maintain the validity of the assessment.

Examination question papers could be more compatible with AT

The degree of compatibility between assessment materials and the AT with which they are used, though not fundamentally problematic, has scope for improvement. In particular, it may be that the digital file formats in which examination papers are provided could be tested with a wider variety of AT hardware and software.

Schools may not always be fully aware of how AT may be used for assessment

Teachers and students may not always have all of the information about what is possible and/or permissible with AT. This means that there is the potential for some students to miss out on arrangements that may better support their needs because their teachers are either unaware of a given AT or unsure of whether it would contravene the rules. There may therefore be scope to improve the way in which ‘best practice’ is shared.

AT often has an impact on the experience of undertaking assessment

In some cases, the use of AT can unavoidably alter the experience of taking the assessment for students. This is particularly true where the AT requires significant additional time or the use of a specific assessment environment for effective use.

AT removes barriers to assessment but sometimes also changes how it is accessed

Where skills such as reading, spelling and grammar are not part of the assessment, AT can remove barriers to access, mitigating some forms of construct irrelevant variance (CIV). However, the use of AT can also sometimes draw upon other processes and skills that are not part of the assessment, thus introducing other forms of CIV. It is important to consider this balance. In many cases AT is vital because it provides the only means by which a student may access an assessment and demonstrate their knowledge and skills.

The main limitation of these findings is the self-selecting nature of the teachers and students who participated. We used opportunistic recruitment to build our sample, and it is likely that the schools that were interested in engaging with the research were also those that were well informed and more vocal about the use of assistive technology for exams. Although we included a range of schools in the study (and they were not all specialist), we sampled only a small number of centres and the findings cannot therefore be generalised to the entire population. Instead, we have a series of case studies that illuminate some of the potential factors and how they may manifest in context.

Overall, this study illustrates some of the issues around the use of AT in assessment and informs Ofqual’s understanding of what teachers and students experience when using AT in practice. The findings illuminate areas for possible improvement with regard to the design of assessment materials. Ideally, close adherence to the principles of universal design would mitigate issues of compatibility between assessment materials and AT. However, this will not always be possible and it is therefore important that awarding organisations (AOs) continue to strive to ensure that non-standard assessment materials are tested for compatibility across the full range of ATs.

It is also important to raise awareness of the potential benefits of ATs and to foster a more precise understanding of what is and is not permitted in examinations. JCQ rules and guidelines are in place to ensure fairness but, if misunderstood, have the potential to create anxiety that may discourage teachers from adopting AT that may support their students. Ofqual will continue to work with awarding organisations and other stakeholders towards the goal of ensuring that our regulatory approach minimises the risk of malpractice and maladministration while not inadvertently impeding good practice.

References

Barrance, R., & Elwood, J. (2018). National assessment policy reform 14–16 and its consequences for young people: student views and experiences of GCSE reform in Northern Ireland and Wales. Assessment in Education: Principles, Policy and Practice.

BECTA. (2009). Becta: The current technology and inclusion landscape.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

Breen, R. L. (2006). A Practical Guide to Focus-Group Research. Journal of Geography in Higher Education, 30(3), 463–475.

Conderman, G. (2015). Assistive Technologies: A Lifeline for Learning. Kappa Delta Pi Record, 51(4), 173–178.

Copley, J., & Ziviani, J. (2004). Barriers to the use of assistive technology for children with multiple disabilities. Occupational Therapy International.

Cormier, D. C., Altman, J. R., Shyyan, V., & Thurlow, M. L. (2010). A Summary of the Research on the Effects of Test Accommodations: 2007-2008 (Technical Report 56). Minneapolis, MN: University of Minnesota.

Department for Education. (2020). Rapid literature review on assistive technology in education.

Desmond, D., Layton, N., Bentley, J., Boot, F. H., Borg, J., Dhungana, B. M., … Scherer, M. J. (2018). Assistive technology and people: a position paper from the first global research, innovation and education on assistive technology (GREAT) summit. Disability and Rehabilitation: Assistive Technology, 13(5), 437–444.

Draffan, E. A., Evans, D. G., & Blenkhorn, P. (2007). Use of assistive technology by students with dyslexia in post-secondary education. Disability and Rehabilitation: Assistive Technology, 2(2), 105–116.

Edyburn, D. (2001). Models, Theories, and Frameworks: Contributions to Understanding Special Education Technology. Special Education Technology Practice, 4.

Edyburn, D. (2004). Rethinking assistive technology. Special Education Technology Practice, 5, 16–23.

Equality Act (2010).

Fletcher, J. M., Francis, D. J., Boudousquie, A., Copeland, K., Young, V., Kalinowski, S., & Vaughn, S. (2006). Effects of Accommodations on High-Stakes Testing for Students with Reading Disabilities. Exceptional Children, 72(2), 136–150.

Forgrave, K. E. (2002). Assistive Technology: Empowering Students with Learning Disabilities. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 75(3), 122–126.

Hayhoe, S. (2014). The need for inclusive accessible technologies for students with disabilities and learning difficulties. In Learning in a Digitalized Age: Plugged in, Turned on, Totally Engaged? (pp. 257–274). Melton, UK: John Catt Educational Publishing.

Hemmingsson, H., Lidström, H., & Nygård, L. (2009). Assistive technology devices in educational settings: Students’ perspective. Assistive Technology Research Series, 25, 619–621.

JCQ. (2019). Access Arrangements and Reasonable Adjustments 2019-20.

Kettler, R. J. (2012). Testing Accommodations: Theory and research to inform practice. International Journal of Disability, Development and Education, 59(1), 53–66.

Lamond, B., & Cunningham, T. (2019). Understanding Teacher Perceptions of Assistive Technology. Journal of Special Education Technology, 35(2), 97–108.

Lu, Y., & Sireci, S. G. (2007). Validity Issues in Test Speededness. Educational Measurement: Issues and Practice, 26(4), 29–37.

Madriaga, M., Hanson, K., Heaton, C., Kay, H., Newitt, S., & Walker, A. (2010). Confronting similar challenges? Disabled and non‐disabled students’ learning and assessment experiences. Studies in Higher Education, 35(6), 647–658.

Messick, S. (1989). Meaning and Values in Test Validation: The Science and Ethics of Assessment. Educational Researcher, 18(2), 5–11.

National Centre for Special Education. (2016). Assistive Technology/Equipment in Supporting the Education of Children with Special Educational Needs – What Works Best?

Nees, M., & Berry, L. (2013). Audio assistive technology and accommodations for students with visual impairments: Potentials and problems for delivering curricula and educational assessments. Performance Enhancement & Health, 2, 101–109.

Netherton, D., & Deal, W. (2006). [Assistive technology in the classroom](http://mhess1.pbworks.com/f/Assistive+Technology+in+the+Classroom+by+Netherton.pdf).

Ofqual. (2019a). Access arrangements for GCSE, AS and A Level: 2017 to 2019 academic year.

Ofqual. (2019b). Ofqual Handbook: General Conditions of Recognition.

Ofqual. (2019c). Access Consultation Forum terms of reference.

Satsangi, R., Miller, B., & Savage, M. N. (2019). Helping teachers make informed decisions when selecting assistive technology for secondary students with disabilities. Preventing School Failure: Alternative Education for Children and Youth, 63(2), 97–104.

Sireci, S. G., Scarpati, S. E., & Li, S. (2005). Test Accommodations for Students with Disabilities: An Analysis of the Interaction Hypothesis. Review of Educational Research, 75(4), 457–490.

Smith, R. O., Scherer, M. J., Cooper, R., Bell, D., Hobbs, D. A., Pettersson, C., … Bauer, S. (2018). Assistive technology products: a position paper from the first global research, innovation, and education on assistive technology (GREAT) summit. Disability and Rehabilitation: Assistive Technology, 13(5), 473–485.

Story, M. F. (1998). Maximizing Usability: The Principles of Universal Design. Assistive Technology, 10(1), 4–12.

Svensson, I., Nordström, T., Lindeblad, E., Gustafson, S., Björn, M., Sand, C., … Nilsson, S. (2019). Effects of assistive technology for students with reading and writing disabilities. Disability and Rehabilitation: Assistive Technology, 1–13.

UK Association for Accessible Formats. (2019). UKAAF minimum standards.

Watts, E. H., O’Brian, M., & Wojcik, B. W. (2003). Four Models of Assistive Technology Consideration: How Do They Compare to Recommended Educational Assessment Practices? Journal of Special Education Technology, 19(1), 43–56.

Woodbury, R. J. (2015). The Effects of a Training Session on Teacher Knowledge, Perceptions, and Implementation of Assistive Technology in Secondary Schools. All Graduate Plan B and Other Reports.

Woods, K. (2007). Access to General Certificate of Secondary Education (GCSE) examinations for students with special educational needs: What is “best practice”? British Journal of Special Education, 34, 89–95.

Woods, K., James, A., & Hipkiss, A. (2018). Best practice in access arrangements made for England’s General Certificates of Secondary Education (GCSEs): where are we 10 years on? British Journal of Special Education, 45(3), 236–255.

Woods, K., McCaldin, T., Hipkiss, A., Tyrell, B., & Dawes, M. (2019). Linking rights, needs, and fairness in high-stakes assessments and examinations. Oxford Review of Education.

Woods, K., Parkinson, G., & Lewis, S. (2010). Investigating Access to Educational Assessment for Students with Disabilities. School Psychology International, 31(1), 21–41.

  1. Construct irrelevant variance (CIV) is variation in performance caused by factors that are unrelated to the skills and knowledge targeted by the assessment. CIV is therefore a threat to validity. 

  2. The available statistics do not provide a breakdown of specific ATs. For example, the precise number of candidates who use a computer reader is unknown because the data is not currently collected in such a way that computer and human readers can be distinguished. 

  3. Construct irrelevant variance (CIV) is variation in performance caused by factors that are unrelated to the skills and knowledge targeted by the assessment. CIV is therefore a threat to validity. 

  4. These figures exclude those surveyed who did not respond to this specific question or those who described their view as ‘ambivalent’. 

  5. It is unclear whether there would be any issues, technical or otherwise, with the provision of non-interactive papers in these formats.