Accessibility monitoring of public sector websites and mobile apps from 2022 to 2024
Published 17 December 2024
1. Executive summary
This report provides information on the accessibility monitoring of public sector websites and mobile applications (apps) undertaken within the UK under The Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 (accessibility regulations).
Whilst the accessibility regulations were amended in 2022 to remove references to European Union legislation and process, the UK Government will still publish a report every 3 years for transparency. This is the second published report, covering monitoring from 2022 to 2024.
The accessibility regulations require UK public sector organisations to make their websites and mobile applications (apps) accessible to disabled people. An accessibility statement, detailing any accessibility issues and who to contact if there is a problem, must be published for each website and app.
The Government Digital Service (GDS) monitors compliance with the accessibility regulations and this report contains the monitoring findings from January 2022 to September 2024.
Accessibility monitoring of websites and apps continues across the public sector, including organisations’ main websites and secondary websites
Monitoring has continued to test the main websites of organisations across the public sector. Monitoring of some areas of the public sector, such as the NHS and other health organisations, was paused in 2020-2021 due to Covid, but was resumed and included in this monitoring period.
We believe we have tested the main websites of the majority of organisations in the UK public sector between 2020 and 2024 (excluding public sector organisations that are exempt from the accessibility regulations). We have now started to look at secondary websites, such as organisations’ recruitment sites.
We have monitored 1,203 websites and 21 mobile apps during this monitoring period.
Accessibility issues were found on nearly all tested websites and apps, similar to the previous monitoring period. After sending a report to the organisation and giving them time to fix the issues (normally 12 weeks), 68% had fixed the issues found or had short-term plans for fixing the remaining issues. This is a slight increase from the previous monitoring period (59%).
The monitoring process gets the public sector to fix their accessibility issues
We found 29,787 accessibility issues across monitored websites and apps. Of these, 16,482 (55.3%) were fixed by the public sector organisations during the monitoring. Further issues have been mitigated by organisations removing unneeded content or redesigning and rebuilding the website or app. 3,693 issues (12.4%) were not fixed when we retested the website or app.
The main issues we found during monitoring were:
- not enough colour contrast between text and the background, which affects people with visual impairments
- lack of visible focus, which affects keyboard users and screen reader users
- problems using a site or app with a keyboard, which affects users who have trouble operating a pointing device such as a mouse and screen reader users
- sites that do not adjust to the shape and size of the browser or device (known as ‘reflow’), which affects users who need to use a particular device, magnification level or screen orientation
Enforcement of the accessibility regulations by the Equality and Human Rights Commission (EHRC) and Equality Commission for Northern Ireland (ECNI) contributes to further accessibility issues being fixed by public sector organisations.
Accessibility statements are being published, but they are not all kept up-to-date
85% of websites and mobile apps had published an accessibility statement when initially monitored. After the monitoring process, 97% had published a statement and 65% included all necessary information detailed in the model accessibility statement template. We have seen during monitoring that many accessibility statements are now out-of-date and have not been reviewed in the last 12 months.
2. How we monitor website and mobile app accessibility
2.1 Summary
The accessibility regulations require the Cabinet Office to monitor the accessibility of public sector websites and mobile apps. This is carried out by GDS.
Monitoring covers the accessibility of a site or app against the World Wide Web Consortium (W3C) Web Content Accessibility Guidelines (WCAG) levels A and AA. The version of the standard used during this reporting period was WCAG 2.1. Monitoring from October 2024 onwards uses WCAG 2.2, which adds 6 new success criteria. During the monitoring process, we also review the accessibility statement that must be published to meet the accessibility regulations.
All areas of the UK public sector are in scope of the accessibility regulations, apart from public service broadcasters, some non-governmental organisations, nurseries and schools. The definition of public sector bodies is given in Section 4 of the accessibility regulations.
More information about the testing process, the test coverage correlation to WCAG A and AA, and the tools used for monitoring can be found in the Appendix.
There are 3 types of testing used in accessibility monitoring. The accessibility regulations were amended in 2022 to remove references to the European Union’s mandated process but for this monitoring period we have kept the same 3 test types:
- simplified testing covers a small sample of pages and uses mainly automated accessibility testing with some manual testing
- detailed testing covers a wider range of pages, each tested against all relevant WCAG success criteria
- mobile app testing which is similar to detailed testing, but across the screens and flows of mobile apps
For all test types, we send a report of our findings to the public sector body, who then have a period of time (normally 12 weeks) to fix accessibility issues and report their progress to us. We then reassess the accessibility issues from the report and any statement issues. We pass a summary of the case and remaining accessibility issues to the relevant equality body (the EHRC in England, Scotland and Wales, or ECNI in Northern Ireland) to make further decisions on compliance or enforcement action.
2.2 Simplified testing process
Summary
Simplified audits examine a small selection of webpages. An automated accessibility checker is run on each page and then the auditor carries out some manual checks for common accessibility issues.
Simplified tests do not show every accessibility issue and we encourage organisations to carry out a full accessibility audit.
Preparing for an audit
We identify core pages like the home page and contact page with a sample of other pages from across the site. We look for different types of pages that contain a range of elements, such as tables or different layout styles. We also specifically look for forms and PDFs.
Auditing
We use an automated accessibility testing extension for Google Chrome called Axe, created by Deque Systems. We record severe or critical errors found by the automated tool, as these result in barriers for disabled users and prevent them from accessing website elements or content. We manually check these issues to ensure quality and include them in the report sent to the organisation.
We then carry out manual tests for some of the most common barriers to users with accessibility needs that are not likely to be found by automated tests.
This includes tabbing through each page with a keyboard to check that:
- the user can navigate to all links and interactive elements without a mouse (WCAG 2.1.1 Keyboard)
- there are no keyboard traps (WCAG 2.1.2 No Keyboard Trap)
- the focus surrounding elements is visible (WCAG 2.4.7 Focus Visible)
- focus order is logical (WCAG 2.4.3 Focus Order)
- there are no unexpected actions when tabbing (WCAG 3.2.1 On Focus)
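Some focus-order problems can also be spotted statically: a positive `tabindex` value overrides a page's natural tab order and is a frequent cause of illogical focus order (WCAG 2.4.3 Focus Order). As a toy illustration of the idea (not the tooling used in monitoring), such values can be flagged with Python's standard `html.parser`:

```python
from html.parser import HTMLParser

class TabindexChecker(HTMLParser):
    """Flag positive tabindex values, which override the natural tab order
    and commonly produce an illogical focus order (WCAG 2.4.3)."""

    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        # tabindex="0" and tabindex="-1" are fine; only positive values reorder focus
        if value is not None and value.lstrip("-").isdigit() and int(value) > 0:
            self.findings.append((tag, int(value)))

checker = TabindexChecker()
checker.feed('<a href="/" tabindex="3">Home</a><button tabindex="0">OK</button>')
print(checker.findings)  # [('a', 3)]
```

A check like this only finds one narrow class of focus-order problem; the keyboard walkthrough described above remains necessary.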
Each page is viewed at 100%, 200% and 400% zoom, and we simulate viewing the page on a small screen. This is designed to check that the page is still navigable and readable with no content cut off (WCAG 1.4.4 Resize text and WCAG 1.4.10 Reflow).
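One coarse automated hint for reflow problems is the absence of a responsive viewport meta tag, although the manual zoom checks described above are still needed to confirm how a page actually behaves. A hypothetical sketch, again using only Python's standard library:

```python
from html.parser import HTMLParser

class ViewportMetaFinder(HTMLParser):
    """Detect a viewport meta tag. Its absence is a common (though not
    conclusive) hint that a page will not reflow on small screens."""

    def __init__(self):
        super().__init__()
        self.has_viewport_meta = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport_meta = True

finder = ViewportMetaFinder()
finder.feed('<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>')
print(finder.has_viewport_meta)  # True
```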
We look for audio/visual content such as images, videos, podcasts and animations. This is to check that:
- there are accessible alternatives for video-only or audio-only content (WCAG 1.2.1 Audio-only and Video-only)
- there are no images of text with significant content (WCAG 1.4.5 Images of Text)
- moving content such as animations and carousels can be paused, stopped or hidden (WCAG 2.2.2 Pause, Stop, Hide)
There are WCAG success criteria that we do not explicitly test for in the simplified process, but will note if we experience them. This includes the page refreshing unexpectedly (WCAG 2.2.1 Timing Adjustable) and audio playing for more than 3 seconds with no way to stop it (WCAG 1.4.2 Audio Control).
We look for an accessibility statement and check that this meets the required legal format. Statements must include all mandatory wording, list all known accessibility issues and be up-to-date.
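A first pass over a statement can be automated by searching for required wording. The sketch below is purely hypothetical: the phrases listed are illustrative stand-ins, not the actual mandatory wording from the model accessibility statement.

```python
# Illustrative stand-ins only - not the real mandatory wording from the
# model accessibility statement template.
REQUIRED_PHRASES = [
    "this accessibility statement applies to",
    "compliance status",
    "preparation of this accessibility statement",
]

def missing_phrases(statement_text: str) -> list[str]:
    """Return the required phrases absent from the statement (case-insensitive)."""
    text = statement_text.lower()
    return [p for p in REQUIRED_PHRASES if p not in text]

draft = "This accessibility statement applies to www.example.gov.uk."
print(missing_phrases(draft))
```

A real review also has to judge content, for example whether the listed issues match what testing actually found, which no string search can do.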
2.3 Detailed testing process
Summary
Detailed audits take an in-depth look at the accessibility of a website. We test against the full range of WCAG success criteria to level A and AA (55 success criteria - see Appendix A). Assistive technology is used to check compliance as well as the automated and manual methods used in simplified testing.
Before an audit
Before a detailed audit happens, we request information about the website from the organisation to help us plan the audit. This includes:
- most used pages
- page template types
- previous accessibility audit reports
- access to the website/service if it is not publicly available
Preparing for an audit
When the information is provided it is used to decide on a sample of pages to test. We try to include a range of pages from the website or service such as:
- some of the most popular pages
- a range of template types
- at least one service end-to-end where possible
Previous accessibility audits carried out by the organisation will be considered when deciding the coverage of our monitoring. We will use the information in previous audits if they are recent (normally within the last year) and follow a complete WCAG testing methodology.
The previous audit is compared against the sample of pages chosen for testing by the monitoring body. If a page has already been tested by the organisation and the results are satisfactory, we carry out a simplified test on that page. Any other pages receive a detailed test against all WCAG criteria.
Auditing
A detailed test covers all WCAG success criteria in scope at level A and AA. This is similar to a full WCAG audit, with the auditor checking against each success criterion, but the test remains a sample-based approach across a selection of pages and does not provide full coverage of the website. The auditor uses automated testing as well as manual checks. Auditing includes analysing the code of the website and using assistive technology to test the experience.
We publish our team’s testing guide, which explains what auditors look for to meet each success criterion.
2.4 Mobile app testing process
The mobile application tests take an in-depth look at the accessibility of a mobile app. We test against the full range of WCAG success criteria to level A and AA. Mobile app audits follow the same process as detailed website audits, explained in Section 2.3. We have published a blog post on how mobile apps undergo accessibility monitoring.
For mobile application tests, the latest Android and iOS versions of an application are tested where available.
2.5 What happens after an audit
After the audit, a report detailing the accessibility issues found is sent to the public sector organisation that runs the website or app. They are asked to acknowledge the report and given time (normally 12 weeks) to fix the accessibility issues found. The auditor will answer any questions organisations may have about the issues found in the report to help them identify and fix issues.
We send the report to the contacts in the organisations’ accessibility statements. If we cannot find contact details in an accessibility statement, we check the contact page or use feedback forms to request contact information.
If we do not hear from organisations after a month and have tried all known methods of communication, we forward information about the website to the equality bodies with a recommendation of further compliance and enforcement action.
After 12 weeks the public sector body is contacted to give an update on their progress. The auditor will retest the issues we initially raised, but this is not a full test of the website or mobile application.
We then make a recommendation on whether we think the equality body should evaluate the evidence and potentially take further enforcement action. An email will be sent to the public sector body to confirm the outcome of the monitoring.
2.6 After testing and correspondence
The accessibility report, information from the monitoring process and correspondence, and our recommendation, are passed to the EHRC in England, Scotland and Wales or ECNI in Northern Ireland, who are responsible for enforcement of equality and non-discrimination laws. They decide whether any further action is required and will contact the public sector body directly if necessary.
3. What has been monitored
3.1 Composition of the sample
The data provided is for monitoring between 1 January 2022 and 1 September 2024.
1,203 websites were tested and 21 mobile apps were tested.
Lists of UK public sector organisations were compiled from publicly available data. Where possible, we have gathered information on the size or impact of organisations, based on factors such as population or staff/student numbers. Some larger organisations were selected at random for detailed tests. If the organisation had a mobile app, this may have been monitored instead of their website. The remainder of organisations were randomised and used as lists for simplified testing.
In 2022 to 2024, we continued working through organisation lists for each operational area of the public sector, such as Further Education colleges, local government and police. This included areas of the public sector where we had paused monitoring in 2020/2021 due to Covid, such as health organisations.
We believe we have tested the main website of the majority of organisations in the UK public sector between 2020 and 2024 (excluding public sector organisations that are exempt from the accessibility regulations). We have started monitoring secondary websites published by public sector organisations, such as recruitment websites.
Simplified testing
1,151 websites were tested between 1 January 2022 and 1 September 2024.
On 1 January 2022, there were 204 website cases in progress, but these are excluded from the data analysis in Section 4 of this report. 49 of the 204 websites were monitored from 20 November 2021 (the end of the monitoring period analysed in the previous report) to 31 December 2021. Of the 204 websites, 117 (57%) websites were passed to the equality bodies with a recommendation of no further action and 87 (43%) had a recommendation for equality bodies to consider whether further enforcement was appropriate.
On 1 September 2024:
- we have finished the monitoring process for 927 websites
- 7 websites are being tested or having their report written
- 156 websites have been sent their report and are in the 12 week period that is provided for fixing issues
- 37 websites have reached the end of the 12 week period and we are reviewing their response
- 24 website cases were closed during the process, due to being exempt - these include organisations which were found to belong to the private sector, or websites in the process of closing or being merged into a separate site
A search engine was used to find the main website of each organisation. For some sectors, such as local authorities, we have started testing secondary sites (websites run by an organisation that are not its main website). For secondary sites, we accessed the organisation’s main website and identified suitable secondary sites from it.
92 simplified tests were retests of websites previously tested in 2020-2021.
We included 7 complaints received via the Equality Advisory Support Service (EASS) in the simplified testing sample.
Geographic distribution of simplified testing
Location | Percentage |
---|---|
England | 74.5% |
Scotland | 12.2% |
Wales | 7.0% |
UK-wide | 5.0% |
Northern Ireland | 1.4% |
Simplified testing across different public sector areas
Area | Percentage |
---|---|
Education | 31.4% |
Local government | 23.9% |
Health | 19.4% |
Central government, including agencies and Arm’s Length Bodies (ALBs) | 16.2% |
Law and public safety | 9.2% |
Detailed testing
52 websites were monitored within the monitoring period.
4 of these were complaints that were referred to the monitoring body by EASS but were not suitable for a simplified test due to the nature of the complaint.
6 were for websites which had previously undergone a simplified test.
In this monitoring period we focused on the websites of larger public sector organisations from different public sector areas and geographic locations. We have now started to monitor digital services offered by public sector organisations as well as their main website.
Geographic distribution of detailed testing
Location | Percentage |
---|---|
England | 61.5% |
Scotland | 15.4% |
Northern Ireland | 9.6% |
Wales | 9.6% |
UK-wide | 3.8% |
Detailed testing across different public sector areas
Area | Percentage |
---|---|
Local government | 48.1% |
Health | 17.3% |
Education | 13.5% |
Law and public safety | 11.5% |
Central government, including agencies and ALBs | 9.6% |
Mobile app testing
21 mobile applications (apps) were monitored within the monitoring period.
In each case we tested both the iOS and Android versions of the app, counting each platform version as a separate app. The only exception is 1 app where we tested 2 versions of the iOS app.
3 were complaints that were referred to the monitoring body by EASS.
In this monitoring period we focused on apps from larger public sector bodies from different sectors and geographic locations, where the public sector body was thought to have a significant impact on the general public (locally or nationally), or had the highest number of downloads based on the app stores.
10 of the apps were for organisations that had previously undergone a simplified test on their main website.
Geographic distribution of mobile app testing
Location | Percentage |
---|---|
England | 52.4% |
Scotland | 19.0% |
UK-wide | 9.5% |
Northern Ireland | 9.5% |
Wales | 9.5% |
Sectors for mobile app testing
Area | Percentage |
---|---|
Local government | 38.1% |
Transport | 23.8% |
Central government, including agencies and ALBs | 19.0% |
Health | 19.0% |
4. What we found
Note: the number of issues cannot be directly compared to the 2020-2021 report due to changes in test methodology and recording. Multiple instances of the same accessibility failure on 1 page of the website (or screen of an app) are counted as 1 issue.
4.1 Summary
Test type | No of tests | Issues found | Issues fixed | Completed cases | Recommend no further action | Recommend compliance or enforcement |
---|---|---|---|---|---|---|
Simplified | 1,151 | 26,171 | 13,882 | 927 | 640 | 276 |
Detailed | 52 | 2,736 | 1,857 | 52 | 26 | 26 |
Mobile app | 21 | 880 | 743 | 20 | 12 | 8 |
Total | 1,224 | 29,787 | 16,482 | 999 | 678 | 310 |
55.3% of accessibility issues found in monitoring were fixed by the public sector organisation.
In 67.9% of completed cases, organisations had fixed the issues found or had a short-term plan to fix remaining issues, so we recommended to the equality bodies that no further compliance and enforcement action was required.
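Both headline percentages follow directly from the totals row of the summary table:

```python
# Totals taken from the summary table in Section 4.1.
issues_found = 29_787
issues_fixed = 16_482
completed_cases = 999
no_further_action = 678

print(f"{issues_fixed / issues_found:.1%}")          # share of issues fixed
print(f"{no_further_action / completed_cases:.1%}")  # share of completed cases
```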
4.2. Simplified tests
We have carried out simplified monitoring on 1,151 websites.
We found no accessibility issues for 10 websites. Half of the websites had 23 or fewer issues. The most issues found was 73. As mentioned in the testing methodology, simplified tests only cover parts of WCAG and test on a small number of pages.
For 69 websites, the organisation was sent the monitoring report but did not acknowledge or respond to multiple attempts to contact. The cases were closed and referred to the equality bodies for further compliance and enforcement action.
After sending the report, correspondence with the public sector organisations, and a retest after 12 weeks, 303 websites had fixed the issues found or had a short-term plan to fix remaining issues.
With further correspondence and work with the public sector organisations, 640 (70% of completed cases) websites were passed to the equality bodies with a recommendation of no further action. 276 (30%) had a recommendation for equality bodies to consider whether further enforcement was appropriate. 26 cases were stopped during the process, for example if the organisation was not considered to be in the public sector. 209 cases were still in progress on 1 September 2024.
The compliance rate of 70% is a slight improvement on the previous monitoring period (59%).
There was some variation of compliance between different areas within the public sector.
Area of the public sector | Acceptable at 12 week test | Recommended no further action to equality body |
---|---|---|
Central government, including agencies and ALBs | 35% | 75% |
Local government | 29% | 73% |
Health | 35% | 70% |
Education | 32% | 64% |
Law and public safety | 30% | 65% |
In total, 26,171 accessibility issues were found. 8,596 issues were not retested at 12 weeks, either because the page or site had been removed or because we received no response from the public sector organisation. 13,882 issues were fixed. 3,693 issues remained unfixed.
The top 10 issues found are:
WCAG criterion | Percentage of issues |
---|---|
2.4.7 Focus Visible | 13.3% |
1.4.3 Contrast (minimum) | 13.0% |
2.1.1 Keyboard | 10.8% |
1.4.10 Reflow | 9.2% |
4.1.2 Name, Role, Value | 8.6% |
1.3.1 Info and Relationships | 6.4% |
2.4.4 Link Purpose (In Context) | 6.2% |
1.4.4 Resize Text | 5.5% |
2.4.3 Focus Order | 4.0% |
1.1.1 Non-text Content | 3.0% |
The 5 most common criteria account for 55% of the total issues, and the 10 most common criteria account for 80% of the total issues.
Across the sites monitored by simplified testing, the top 10 WCAG criteria found are:
WCAG criterion | Percentage of websites |
---|---|
1.4.3 Contrast (minimum) | 77.3% |
2.4.7 Focus Visible | 76.3% |
2.1.1 Keyboard | 63.6% |
1.4.10 Reflow | 55.4% |
4.1.2 Name, Role, Value | 53.2% |
2.4.2 Page Titled | 51.9% |
1.3.1 Info and Relationships | 51.7% |
2.4.4 Link Purpose (In Context) | 46.5% |
1.1.1 Non-text Content | 36.8% |
1.4.4 Resize Text | 34.1% |
Observations from simplified testing
Homepages often contain elements that cause accessibility problems, such as greater use of images containing text, or carousels. Cookie notices and other elements overlaid on top of the main content can also generate errors.
Websites often have elements that are not keyboard accessible. A dropdown navigation menu can have multiple links available, however, it can be inaccessible if the keyboard cannot be used to expand the menu. The keyboard may also tab through the menus whilst they are minimised, meaning the keyboard focus is no longer visible.
There is also normally at least one issue with the PDF that is tested, even though the checks are limited to technical tests, such as the inclusion of a PDF title and language.
4.3. Detailed tests
We have carried out detailed monitoring on 52 websites.
At the end of the monitoring process, 26 websites had fixed the issues found or had a short-term plan to fix remaining issues. These were passed to the equality bodies with a recommendation of no further action. 26 had a recommendation for equality bodies to consider whether further enforcement was appropriate.
Issues found during detailed testing
At the time of each detailed test, none of the 52 websites were fully compliant.
The total number of issues found in each test ranged from 14 to 116. The average number of issues per test was 53.
The number of unique issues per page on each website ranged from 0.7 to 5.4 within each test. The average number of unique issues per page across every website was 2.7.
2,736 accessibility issues were found in detailed testing. 1,857 were resolved, representing 68% of total issues. The percentage of issues resolved as a result of each detailed monitoring process ranged from 0% to 98%.
The most common issues were:
WCAG criterion | Percentage of issues |
---|---|
1.4.3 Contrast (Minimum) | 13.8% |
4.1.2 Name, Role, Value | 10.8% |
1.3.1 Info and Relationships | 10.8% |
2.4.7 Focus Visible | 7.1% |
1.1.1 Non-text Content | 6.2% |
4.1.1 Parsing - removed from WCAG 2.2 | 6.1% |
1.4.10 Reflow | 4.9% |
2.1.1 Keyboard | 3.7% |
1.4.11 Non-text Contrast | 3.3% |
2.4.4 Link Purpose (In Context) | 3.2% |
The success criterion 4.1.1 Parsing became obsolete in October 2023 and we did not report on it from that point.
The 5 most common criteria account for 49% of the total issues, and the 10 most common criteria account for 70% of the total issues.
Observations from detailed testing
The most frequently overlooked issues with colour contrast are often caused by:
- text that is placed over images
- pale placeholder text in form inputs
- the colours used for elements on keyboard focus or pointer hover
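These contrast failures are judged against a precise formula: WCAG derives a contrast ratio from the relative luminance of the two colours, and level AA requires at least 4.5:1 for normal-size text (1.4.3 Contrast (Minimum)). A minimal sketch of the calculation in Python, for illustration only (this is not the monitoring tooling):

```python
def srgb_to_linear(c: float) -> float:
    """Convert an sRGB channel (0-1) to linear light, per the WCAG 2.1 definition."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance L = 0.2126 R + 0.7152 G + 0.0722 B (linearised channels)."""
    r, g, b = (srgb_to_linear(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter colour as L1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white gives the maximum possible ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Mid-grey text on white narrowly fails the 4.5:1 AA threshold.
print(contrast_ratio((119, 119, 119), (255, 255, 255)) >= 4.5)  # False
```

Greys that look perfectly readable often land just under the threshold, which is one reason contrast is the most common failure in these audits.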
There are often issues where form elements do not have accessible names or are not programmatically associated with their visible labels. Label association issues can occur where one label covers multiple inputs, for example separate day, month and year fields where only one input is associated, or where individual radio buttons are labelled but not linked to the overarching question they answer.
Programmatic issues also occur with content that can be opened and closed, where the state (for example ‘expanded’ or ‘collapsed’) is not announced to users of assistive technology. Similarly, content is sometimes only visually hidden and can still be reached by keyboard or assistive technology users, making navigation more confusing.
We sometimes find carousels and sliders that cannot be used with a keyboard, have no means to pause them, or have unlabelled controls.
Menus often do not support keyboard access, have unpredictable behaviour on hover, and have hidden focus. In some cases, keyboard focus goes through the menu even when the menu is closed. In others, the menu stays open even when a user tabs past the menu. Also, menu options are often hard to access at high zoom levels. These issues will be more prominent when monitoring for WCAG 2.2 due to the new Focus Not Obscured criterion.
4.4. Mobile app tests
We have carried out mobile app monitoring on 21 mobile apps.
At the end of the monitoring process, organisations responsible for 12 of the apps had fixed the issues found or had a short-term plan to fix remaining issues and were passed to the equality bodies with a recommendation of no further action.
The organisations responsible for 8 of the apps had a recommendation for equality bodies to evaluate the case to see if further enforcement was appropriate.
The remaining 1 app was not pursued further as the app was decommissioned.
Issues found during mobile app testing
At the time of each initial test, none of the 21 apps were fully compliant with the accessibility regulations.
The total number of issues found in each mobile app test ranged from 13 to 76. The average number of issues per test was 42.
The number of unique issues per screen on each app ranged from 0.7 to 5.5. The average number of unique issues per screen across every app was 2.5.
In this monitoring period, 880 issues were detected in mobile app testing and the most common issues were:
WCAG success criterion | Percentage of issues |
---|---|
4.1.2 Name, Role, Value | 19.2% |
1.3.1 Info and Relationships | 14.7% |
1.4.3 Contrast (Minimum) | 13.6% |
2.1.1 Keyboard | 10.6% |
1.4.4 Resize Text | 6.0% |
1.4.11 Non-text Contrast | 5.8% |
2.4.6 Headings and Labels | 5.5% |
1.1.1 Non-text Content | 4.1% |
2.4.7 Focus Visible | 3.9% |
4.1.3 Status Messages | 2.4% |
The 5 most common criteria in this table account for 64% of the total issues, and the 10 most common criteria account for 86% of the total issues.
Out of the 880 issues we found, 743 were resolved, representing 84% of total issues.
Across all cases (excluding the decommissioned app), the percentage of issues resolved as a result of each test ranged from 38% to 100%.
Observations from mobile app testing
The most common failures for mobile apps mirror those from detailed testing, but we notice that some issues tend to be more prevalent across the entire app or in large sections of it.
Common app accessibility issues include:
- large sections of the app or commonly recurring elements that are not accessible with a keyboard
- app content that does not increase in size when system settings are changed (and with no alternative method provided in the app)
- lack of support for both portrait and landscape mode, which is essential for people who need to view a mobile device in one particular setting
We have noticed that issues with keyboard access and orientation tend to be the accessibility issues that remain unfixed in mobile apps.
Testing with a keyboard helps to identify some accessibility issues, for example whether components have correctly defined roles and are in a logical order. Keyboard access also has parallels with switch control access.
4.5. Retests of websites from the previous monitoring period
As part of our monitoring, we tested some websites for a second time. We took a random sample of sites that had undergone a simplified test in 2020 or 2021 and either carried out a simplified or detailed test, depending on the nature and size of the organisation. In total, 97 retests were undertaken: 92 were simplified tests and 5 were detailed tests.
We compared statistics for 47 cases where both the initial and second test were simplified tests. However, it should be noted that direct comparison between first and second tests is difficult due to factors such as:
- changes to how we monitor
- less comprehensive data collection in our first years of monitoring
- large scale website redesign where the site tested the second time was very different to the site when tested initially
In 2020-2021, when the first test of each website was undertaken, all 47 websites were found to have issues. By comparison, 3 of the 47 websites were found to have no issues when we tested them again.
As the number of pages we have tested varies, we looked at the average number of issues per page. This was 3.3 for initial tests and 2.0 for retests.
2 organisations did not respond when sent the report from the retest - both of these organisations corresponded with us the first time. They were referred to the equality bodies for potential further compliance and enforcement action.
At the 12 week test after sending the report, more websites had all issues fixed and fewer had no issues fixed.
Issues fixed after 12 weeks | Initial test | Retest |
---|---|---|
None | 17% | 11% |
Some issues fixed | 40% | 30% |
All issues fixed | 43% | 59% |
To see if organisations’ websites were remaining accessible, or getting better or worse, we analysed the outcome of the monitoring process.
Initial test outcome | Retest outcome | No of websites |
---|---|---|
Recommend no further action | Recommend no further action | 17 |
For enforcement consideration | Recommend no further action | 6 |
Recommend no further action | For enforcement consideration | 6 |
For enforcement consideration | For enforcement consideration | 18 |
We generally see websites getting more accessible when redesigned and redeveloped, but some public sector websites are still made without considering accessibility. We see that knowledge of the regulations and digital accessibility can disappear as team members leave or change roles.
Our monitoring suggests there are many organisations that do not meet the accessibility regulations. A larger sample will be needed to draw conclusions and make recommendations, and we intend to retest more websites and apps in the next monitoring period.
5. Accessibility statements
173 websites did not have an accessibility statement at the time of the initial test. After monitoring, 38 still did not have one.
Only a small number of statements were fully compliant when initially tested.
Test type | Compliant on initial test | Compliant at end of monitoring |
---|---|---|
Simplified | 8% | 65% |
Detailed | 4% | 88% |
Mobile app | 0% | 76% |
The most common issues with accessibility statements include:
- mandatory wording missing, including the wording to specify what the accessibility statement applies to
- issues being found in our tests which are not included in the statement
- accessibility pages which do not follow the model accessibility statement
- organisations directly copying and pasting text from the sample statement template without replacing it with their own organisation’s information
We have also seen many statements that are out of date. Statements must be reviewed yearly and updated to show this. Statements often refer to future fixes for issues where the dates have now passed. This makes it difficult for users to understand what issues may remain on the site.
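The yearly-review and passed-fix-date problems described above are mechanical enough to sketch in code. The following helper is illustrative only (the function name, threshold and message wording are our own, not part of the monitoring tooling):

```python
from datetime import date

REVIEW_INTERVAL_DAYS = 365  # statements must be reviewed at least yearly


def statement_warnings(last_reviewed: date, fix_by_dates: list[date],
                       today: date) -> list[str]:
    """Return warnings for a statement's review date and promised fix dates."""
    warnings = []
    if (today - last_reviewed).days > REVIEW_INTERVAL_DAYS:
        warnings.append("statement has not been reviewed in the last year")
    for fix_by in fix_by_dates:
        if fix_by < today:
            warnings.append(f"promised fix date {fix_by.isoformat()} has passed")
    return warnings


# both warnings fire: a stale review and a passed fix date
print(statement_warnings(date(2023, 1, 10), [date(2023, 6, 1)], date(2024, 12, 17)))
```

A statement reviewed within the last year, with no overdue fix dates, would produce an empty list.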
6. Disproportionate burden
Disproportionate burden can be claimed where fully meeting the accessibility regulations costs too much for an organisation to reasonably complete. Lack of time or knowledge does not constitute a disproportionate burden and the majority of digital content should be fully accessible.
Organisations are legally required to carry out an assessment of the extent to which compliance with the accessibility regulations imposes a disproportionate burden.
Type of test | Percentage with a disproportionate burden claim at closure |
---|---|
Simplified | 17% |
Detailed | 37% |
Mobile app | 57% |
Disproportionate burden claims for detailed testing are higher than those on simplified tests. This could be because we find issues with a wider range of functionality. For example, if videos are missing audio description, it can be a disproportionate effort to add this in after publication, especially if there are many to remedy. We also test end-to-end services in depth. These are often provided by third parties and not within an organisation’s direct control.
Disproportionate burden claims on mobile apps are also high. This could be because of the number of keyboard and orientation issues we find. If apps do not support this functionality, they are likely to require elements to be completely rewritten.
Understanding of disproportionate burden varies greatly and often causes confusion and questions. Common issues include:
- organisations claim disproportionate burden without having carried out an assessment beforehand
- organisations incorrectly apply a disproportionate burden exemption to content that is out of scope of the accessibility regulations (for example, PDF documents published before September 2018)
Organisations are often unclear on what kind of information to include in their assessment. A good disproportionate burden assessment is a cost/benefit exercise and should consider factors such as:
- how much it will cost to fix the issue
- the amount allocated to spend on the website annually
- how extra costs would impact the organisation’s budget
- the number of users the issue impacts if not fixed
- benefits that fixing issues would bring to users
- how long an organisation expects this disproportionate burden to apply
- if the site or service is procured or outsourced, how long the third party supplier is contracted for, and how much it would cost to re-tender or renegotiate the contract to get the issues fixed
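As an illustration only, the cost/benefit factors above could be summarised numerically. The figures, function name and ratios here are hypothetical and do not represent an official assessment method:

```python
def burden_ratios(fix_cost: float, annual_site_budget: float,
                  users_affected: int, total_users: int) -> dict:
    """Illustrative cost/benefit figures for a disproportionate burden
    assessment (hypothetical sketch, not an official GDS method)."""
    return {
        "cost_as_share_of_budget": fix_cost / annual_site_budget,
        "share_of_users_affected": users_affected / total_users,
        "cost_per_affected_user": fix_cost / users_affected,
    }


# e.g. a £5,000 fix on a £20,000 annual budget, affecting 250 of 10,000 users
print(burden_ratios(5000, 20000, 250, 10000))
# → {'cost_as_share_of_budget': 0.25, 'share_of_users_affected': 0.025,
#    'cost_per_affected_user': 20.0}
```

A real assessment would also weigh the qualitative factors in the list, such as contract terms and how long the burden is expected to apply.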
7. Feedback from public sector organisations that were monitored
We invite organisations that have completed the audit process to share their feedback and experience with us by filling out an anonymous survey.
Since January 2022, 276 organisations have responded to the survey.
Impact of monitoring
Organisations have shared that receiving the monitoring report has had a positive impact on their organisation:
- 83% of organisations reported that it has helped them fix accessibility issues on their website
- 78% of organisations reported it helped them create or improve their accessibility statement
- 71% of organisations reported it prompted their organisation to take action on accessibility
- 62% of organisations reported it helped raise awareness of accessibility regulations
Quotes from the survey reinforce that the audit has empowered people to take action to meet the accessibility regulations:
- “It helped the team to mobilise the organisation around accessibility remediation, especially as there is a need to secure compliance with WCAG 2.2 now.”
- “The process is very much to the point and serious. Thankfully, we were already prepared and on track so the process was straightforward. We required flexibility in timing and that was given.”
- “The report highlighted the importance of reviewing the website and statement regularly.”
- “The audit was positive as it showed that we are on top of most of the issues even though we are not fully accessible and proved to the Director that our work on making the website accessible is worthwhile.”
- “The audit itself was a surprise but we learned a lot from it and it was incredibly helpful.”
Some organisations found the timelines for fixing issues too short, the process too rigid, or that there was not enough technical support:
- “It came out of the blue and felt stressful to receive an official notice in that way. I think the timeline is too short for a meaningful turnaround.”
- “It would be good if there could be training offered, or experts on hand to help with things such as writing statements or explaining how to fix certain issues. I had to read some very complex documents on the Government website, and try to understand them.”
- “GDS did not understand that timelines were hard to set due to the distributed ownership/procurement/budget, etc. I asked for a conversation to discuss our unique situation and was told this was not possible.”
Barriers to meeting the accessibility regulations
Organisations reported that they are dealing with the following barriers to meeting the accessibility regulations:
- funding (53%)
- awareness about the accessibility regulations (53%)
- training about accessibility (45%)
- timing (37%)
- lack of support from internal leaders (33%)
Other barriers included difficulties engaging with contractors and technology restrictions preventing use of accessibility tools.
Organisations are taking steps to overcome some of these barriers. This includes using resources within the accessibility community, having more open conversations with senior leaders, providing accessibility training to staff and setting up a specific accessibility working group.
14% of respondents said their organisation had received accessibility complaints from users, though some of these concerned usability rather than accessibility.
8. Lessons learnt from the accessibility monitoring process
The majority of organisations acknowledge our report, but some organisations do not reply to us until after we inform them we are sending their information to the enforcement bodies. A minority (6%) never respond to our report or subsequent emails. This is lower than the previous monitoring period due to extra attempts to contact the organisations and more work to find the right teams within organisations. Organisations that do not respond are passed on to the equality bodies for potential further compliance and enforcement action.
For simplified testing, we do not contact the organisation before monitoring and some organisations have fed back that they would have liked to be notified. For detailed testing, we do ask for information upfront. We find that organisations often ask for a delay (for example, due to upcoming changes or redevelopment) or start making changes as we are monitoring, which can result in having to re-test some pages, increasing the time taken to test.
Since our last report, we have moved from sending our simplified test reports as email attachments to publishing them as web pages, to maximise their accessibility.
We have worked with the equality bodies to understand what factors they may take into account when deciding on further compliance and enforcement action. This has led us to spend more time in correspondence with monitored organisations to ensure as many as possible of the issues we found are fixed. This takes more auditor time, but has contributed to the increase in the compliance rate for simplified test cases.
Some organisations ask for a call to go through the report. Due to the volume of websites being tested by each auditor, we cannot offer this, and we need to keep written records of correspondence with organisations. We share links to the cross-Government accessibility community and other online resources to help support them through the monitoring process.
Our mobile app testing has helped us learn more detail on how apps behave, helping us resolve queries more quickly. Examples include how roles work in Android apps and knowledge of different tools available to help us with testing.
We have also reviewed the coverage of detailed tests, reducing the average number of pages we test and giving ourselves more flexibility to run a light touch test on repeated functionality. This approach maintains a similar level of coverage while taking less time than before.
9. Complaints, compliance and enforcement
9.1. Complaints
The Equality Advisory Support Service (EASS) and the Equality Commission for Northern Ireland (ECNI) can receive complaints from the public when users encounter accessibility issues on websites and apps, and the issues have not been resolved by the public sector organisation.
In this reporting period, EASS have referred 12 cases to us where members of the public have complained about a website or app. 10 of the cases referred to the general accessibility of the site, with 4 of these specifically mentioning screen reader usage. 4 cases mentioned accessibility statements being unavailable or incorrect. No complaints were received through ECNI.
In 2 cases, no action was taken as one organisation was not covered by the accessibility regulations and the other organisation withdrew the mobile app that was being complained about.
In 5 cases, the complaint was of a general nature and the website was reviewed as a simplified test.
In the other 5 cases, the complaint was more specific and a detailed or mobile app test was conducted.
9.2. Compliance and enforcement
Once GDS has completed monitoring, we send information about the case to either the Equality and Human Rights Commission (EHRC) for English, Scottish and Welsh organisations or ECNI for Northern Irish organisations.
GDS provides technical support, including retests, to the equality bodies as they investigate and carry out any compliance and enforcement activity.
We asked the EHRC and ECNI to provide information on enforcement and their approach, which is included in Sections 9.3 and 9.4.
9.3. EHRC enforcement approach
The EHRC enforces the accessibility regulations in Great Britain. To carry out its regulatory work, the EHRC has a range of legal powers under the Equality Act 2006. Ensuring accessibility of public sector websites is a strategic priority for the EHRC. The EHRC will consider the GDS reports and recommendations on whether further action should be taken against an organisation.
Where a public sector body has failed to engage with GDS and/or failed to fix the majority of non-compliance issues, the EHRC will consider whether it is proportionate to take enforcement action. It does this in accordance with its litigation and enforcement policy.
The EHRC has a two-stage approach to enforcement. First, it seeks compliance from the organisation: it sends an initial letter setting out the scope of its enforcement powers and requesting details of the efforts made to fix the outstanding accessibility issues, together with a specified time frame for making the fixes.
The organisation is informed that failure to do so may result in enforcement action. Second, if the organisation fails or refuses to comply, the EHRC will proceed with enforcing the regulations; this may include entering into a binding agreement to comply with the regulations, or conducting an investigation into the failure to make reasonable adjustments to ensure website accessibility.
Since the start of the EHRC’s role, the majority of GDS reports have not needed to recommend further action. To date, of the 93 public sector bodies in England and Wales that the EHRC has sent initial letters to since 1 January 2022, compliance was secured with 66 without the need for enforcement, and the remaining matters are ongoing. Of the 5 public sector bodies the EHRC has written to in Scotland, compliance was secured with 2, and it continues to engage with the remaining 3.
9.4. ECNI enforcement approach
The Equality Commission for NI is Northern Ireland’s statutory equality body. It was established by the Northern Ireland Act 1998. Its powers and duties derive from a number of statutes enacted over the past decades.
Enforcement is undertaken by the ECNI in Northern Ireland, using existing enforcement powers under the Disability Discrimination Act 1995 (DDA 1995), as amended.
The Equality Commission may choose to take action based on information provided by GDS, using its existing enforcement powers under the DDA 1995 and related enforcement powers under Article 9 (1a) and (2) of the Equality (Disability, etc.) (Northern Ireland) Order 2000.
Under the disability discrimination legislation, enforcement in Northern Ireland is by way of legal action in a county court, brought by a service user. The service user must be able to show that they are disabled, in that they have a physical or mental impairment which has a substantial and long-term adverse effect on their ability to carry out normal day-to-day activities.
Service providers (including public sector website or app based services) are under a duty to make reasonable adjustments to ensure that disabled users can access the service. This means that they must remove barriers that could make it impossible or unreasonably difficult for disabled people to access the service.
There are strict time limits for lodging proceedings in a county court. Cases must be lodged within 6 months of the alleged discriminatory act. While a court has discretion to extend the time limit, this is exercised very sparingly.
It is a matter for a county court judge, upon hearing all of the evidence, to decide whether or not disability discrimination has occurred.
On receipt of a complaint by a disabled person concerning access to a website or mobile app of a public sector body, the Equality Commission will provide advice to the service user and may provide legal assistance for an individual with legal proceedings. The Commission determines which cases to assist under its Policy for the Provision of Legal Advice and Assistance to Individuals. In the case of inaccessibility of a UK-wide website or mobile app, the Commission may refer the matter to the Equality and Human Rights Commission’s enforcement team.
Individuals with a complaint regarding website or mobile app accessibility in Northern Ireland may contact the Equality Commission for NI on 028 90 500 600 and ask to speak to a Discrimination Advice Officer.
10. Appendices
A. Correlation of monitoring against technical standards
For the period of this report, our testing was based on the World Wide Web Consortium’s Web Content Accessibility Guidelines (WCAG) version 2.1 levels A and AA.
WCAG 2.2 was published on 5 October 2023 and became the legal technical standard for the accessibility regulations. Our team has been monitoring for this standard since October 2024.
The following table covers our test approaches for each WCAG success criterion using these categories:
- Yes: we test the success criterion manually or using tools, whichever is most appropriate
- Automated only: we only flag automatically-detectable failures of the success criterion
- Limited: there is not an easy way to test for this on a mobile device, but we will flag potential issues if we find them
- No: we do not actively test for it
- Exempt: the success criterion is exempt from the accessibility regulations
Success criterion | Level | Simplified | Detailed | Mobile |
---|---|---|---|---|
1.1.1 Non-text Content | A | Automated only | Yes | Yes |
1.2.1 Audio-only and Video-only (Prerecorded) | A | Yes | Yes | Yes |
1.2.2 Captions (Prerecorded) | A | Yes | Yes | Yes |
1.2.3 Audio Description or Media Alternative (Prerecorded) | A | No | Yes | Yes |
1.2.4 Captions (Live) | AA | Exempt | Exempt | Exempt |
1.2.5 Audio Description (Prerecorded) | AA | No | Yes | Yes |
1.3.1 Info and Relationships | A | Automated only | Yes | Yes |
1.3.2 Meaningful Sequence | A | No | Yes | Yes |
1.3.3 Sensory Characteristics | A | No | Yes | Yes |
1.3.4 Orientation | AA | Automated only | Yes | Yes |
1.3.5 Identify Input Purpose | AA | Automated only | Yes | Yes |
1.4.1 Use of Color | A | Automated only | Yes | Yes |
1.4.2 Audio Control | A | Yes | Yes | Yes |
1.4.3 Contrast (Minimum) | AA | Yes | Yes | Yes |
1.4.4 Resize text | AA | Yes | Yes | Yes |
1.4.5 Images of Text | AA | Yes | Yes | Yes |
1.4.10 Reflow | AA | Yes | Yes | Limited |
1.4.11 Non-text Contrast | AA | No | Yes | Yes |
1.4.12 Text Spacing | AA | Automated only | Yes | Limited |
1.4.13 Content on Hover or Focus | AA | Yes | Yes | Yes |
2.1.1 Keyboard | A | Yes | Yes | Yes |
2.1.2 No Keyboard Trap | A | Yes | Yes | Yes |
2.1.4 Character Key Shortcuts | A | No | Yes | Yes |
2.2.1 Timing Adjustable | A | Yes | Yes | Yes |
2.2.2 Pause, Stop, Hide | A | Yes | Yes | Yes |
2.3.1 Three Flashes or Below Threshold | A | No | Yes | Yes |
2.4.1 Bypass Blocks | A | Automated only | Yes | Yes |
2.4.2 Page Titled | A | Automated only | Yes | Yes |
2.4.3 Focus Order | A | Yes | Yes | Yes |
2.4.4 Link Purpose (In Context) | A | Automated only | Yes | Yes |
2.4.5 Multiple Ways | AA | No | Yes | Yes |
2.4.6 Headings and Labels | AA | No | Yes | Yes |
2.4.7 Focus Visible | AA | Yes | Yes | Yes |
2.4.11 Focus Not Obscured (Minimum) (from October 2024) | AA | Yes | Yes | Yes |
2.5.1 Pointer Gestures | A | No | Yes | Yes |
2.5.2 Pointer Cancellation | A | No | Yes | Yes |
2.5.3 Label in Name | A | Automated only | Yes | Yes |
2.5.4 Motion Actuation | A | No | Yes | Yes |
2.5.7 Dragging Movements (from October 2024) | AA | Yes | Yes | Yes |
2.5.8 Target Size (Minimum) (from October 2024) | AA | Automated only | Yes | Yes |
3.1.1 Language of Page | A | Automated only | Yes | Limited |
3.1.2 Language of Parts | AA | Automated only | Yes | Limited |
3.2.1 On Focus | A | Yes | Yes | Yes |
3.2.2 On Input | A | No | Yes | Yes |
3.2.3 Consistent Navigation | AA | No | Yes | Yes |
3.2.4 Consistent Identification | AA | No | Yes | Yes |
3.2.6 Consistent Help (from October 2024) | A | No | Yes | Yes |
3.3.1 Error Identification | A | No | Yes | Yes |
3.3.2 Labels or Instructions | A | No | Yes | Yes |
3.3.3 Error Suggestion | AA | No | Yes | Yes |
3.3.4 Error Prevention (Legal, Financial, Data) | AA | No | Yes | Yes |
3.3.7 Redundant Entry (from October 2024) | A | No | Yes | Yes |
3.3.8 Accessible Authentication (Minimum) (from October 2024) | AA | No | Yes | Yes |
4.1.1 Parsing (removed from testing from October 2023) | A | Automated only | Yes | Limited |
4.1.2 Name, Role, Value | A | Automated only | Yes | Yes |
4.1.3 Status Messages | AA | No | Yes | Yes |
B. Tools used
To aid with the audits, the following main tools are used:
Tool | Test type |
---|---|
Colour Contrast Analyser from TPGi | Simplified, detailed and mobile |
Axe from Deque | Simplified and detailed |
Wave from WebAIM | Simplified and detailed |
Adobe Acrobat Pro’s accessibility checker | Simplified and detailed |
Web Developer extension | Detailed only |
NVDA screen reader | Detailed only |
VoiceOver for Mac | Detailed only |
VoiceOver for iOS | Mobile only |
TalkBack for Android | Mobile only |
External keyboard | Mobile only |
Android accessibility scanner | Mobile only |
JAWS screen reader | Only used to verify specific complaints |
Dragon speech recognition software | Only used to verify specific complaints |
Website tests are mostly carried out in the Google Chrome browser but others have been used where deemed appropriate. Due to the equipment available, simplified tests are mainly conducted on Macs whereas detailed tests are completed on Windows laptops for wider access to assistive technology.
Android mobile apps are tested on a Pixel 4a phone and iOS applications on an iPhone 11.
C. Accessibility testing
These are the checks that may be performed for a detailed test. For simplified and mobile tests, we perform a subset of these checks as shown in the correlation table in Appendix A.
The monitoring methodology is available at https://www.gov.uk/guidance/accessibility-monitoring-how-we-test.
Keyboard and zoom
Tab through the page at 100%, 200% and 400% zoom, interacting with elements. This helps us test that a site or application works predictably with a keyboard at different zoom levels, that no functionality has been lost, and that pages respond as expected.
Automated check
Use Axe to pick up automatically detectable issues. Automated checkers are believed to detect approximately 30-40% of accessibility issues.
Sensory check
Check any sensory elements such as images, colours and media and ensure that:
- contrast levels are sufficient
- nothing relies solely on vision, colour, hearing or timing
- any distractions can be removed, for example by pausing autoplaying video
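The contrast part of this check follows the WCAG 2.x definitions of relative luminance and contrast ratio, which can be computed directly from two colours. A minimal sketch in Python (the function names are our own, not part of any testing tool):

```python
def _channel(c8: int) -> float:
    """Linearise one sRGB channel (0-255) per the WCAG definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4


def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)


# WCAG 2.1 AA (1.4.3) requires at least 4.5:1 for normal-size text
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white: 21.0
```

In practice we use tools such as TPGi’s Colour Contrast Analyser (listed in Appendix B) rather than computing ratios by hand.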
Screen reader
Detect and interact with elements such as headings, links, form controls and tables, using a screen reader. This helps us test that:
- page elements are coded consistently with how they appear visually
- the site or application is compatible with assistive technology
- links, headings and labels are available and correct
- forms can be filled in easily
- error messages are correctly identified with information on how to fix them
Tools
Use tools to test for specific criteria including those around language, text spacing and keyboard shortcuts.
Mobile and motion
Use a mobile device to test orientation and any gesture-dependent functionality. This helps us test that:
- pages work in both portrait and landscape modes
- functionality does not rely on motion or specific gestures
- accidental gestures can be cancelled
Whole site
Look for consistent navigation and element names across the site. This helps us test that:
- there are multiple ways to navigate around a site
- there is consistency across the site in how navigational elements work and how elements are named
We may also scan the site for accessibility, usability or functional issues that go beyond the scope of WCAG A and AA and report on significant issues; however, these are not mandatory for the organisation to fix under the accessibility regulations.
Accessibility statement checks
For all test types, we check that an accessibility statement:
- exists
- is easy to find from the home page or app download page
- matches the model statement
- contains correct and up to date information based on our findings
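The “exists” and “easy to find” checks could be approximated by scanning a page for an accessibility link. The heuristic below, using only Python’s standard library, is illustrative (the class name is our own, and the actual monitoring checks are performed manually):

```python
from html.parser import HTMLParser


class AccessibilityLinkFinder(HTMLParser):
    """Collect links whose href or text suggests an accessibility statement.
    Illustrative heuristic only; not part of the monitoring tooling."""

    def __init__(self):
        super().__init__()
        self.links: list[str] = []
        self._current_href: str | None = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")

    def handle_data(self, data):
        # match "accessibility" in either the URL or the link text
        if self._current_href and "accessibility" in (
                self._current_href + data).lower():
            self.links.append(self._current_href)
            self._current_href = None

    def handle_endtag(self, tag):
        if tag == "a":
            self._current_href = None


page = '<footer><a href="/accessibility-statement">Accessibility</a></footer>'
finder = AccessibilityLinkFinder()
finder.feed(page)
print(finder.links)  # ['/accessibility-statement']
```

A real check also needs a human judgement on whether the statement is genuinely easy to find, matches the model statement and reflects our test findings.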