Corporate report

Accessibility monitoring of public sector websites and mobile apps 2020-2021

Published 20 December 2021

1. Executive summary

The Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 require public sector organisations to make their websites and mobile apps accessible. An accessibility statement, detailing any accessibility issues and who to contact if there is a problem, should be published for each website and app.

The Central Digital and Data Office monitors compliance with the regulations and this report contains the findings from monitoring from February 2020 to November 2021.

We have focussed on larger public sector organisations for our monitoring, especially central government (including agencies and arm’s length bodies), larger local government and central health organisations. This is a proportionate approach, because such sites have a higher impact, and it recognises the burden of extra coronavirus-related work across the public sector. We also found that smaller organisations generally did not have the capability or capacity to fix accessibility issues on their websites easily.

Accessibility issues were found on nearly all tested websites. After sending a report to the website owner and giving them time to fix the issues (normally 12 weeks), 59% had fixed the issues found or had short-term timelines for fixing them.

The main issues found were lack of visible focus, which affects keyboard users; low colour contrast, which affects visually impaired users; and parsing issues, which affect users of assistive technology. PDFs are generally less accessible than web pages, and often do not contain information that helps assistive technology interpret the content.

Accessibility statements are a new requirement for public sector websites that were introduced with the accessibility regulations. 90% of websites had an accessibility statement when monitored, but only 7% of statements contained all required information. After the monitoring process, 80% had fully compliant accessibility statements, and only 0.5% had no statement at all.

Many statements were written at the time the regulations were implemented (September 2018 for new websites and September 2019 for all remaining sites). These are now becoming out of date and organisations need to review them and keep statements updated.

Accessibility statements contain contact information for the organisation. We use this to contact organisations when they are monitored. 20% of organisations did not respond to our contact, and we are concerned that users reporting accessibility issues will also get no response. Organisations must make sure that reports of inaccessibility are received by the right team, taken seriously and responded to within a reasonable timeframe.

2. How we monitor website and mobile app accessibility

2.1 Summary

The Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 (“the accessibility regulations”) require the Cabinet Office to monitor the accessibility of public sector websites and mobile apps. This is carried out by the Central Digital and Data Office (CDDO).

Monitoring covers the accessibility of a site against ETSI EN 301 549, which for websites and mobile apps is similar to the W3C Web Content Accessibility Guidelines version 2.1 (WCAG 2.1) levels A and AA. We also review the accessibility statement that must be published to meet the accessibility regulations.

More information about the testing process, the coverage of testing against WCAG 2.1 A and AA, and the tools used for monitoring can be found in the Appendix.

There are 3 kinds of testing:

  • simplified testing covers a small sample of pages and uses mainly automated accessibility testing with some manual testing
  • detailed testing covers a wider range of pages, each tested against all relevant WCAG success criteria
  • mobile app testing, which is similar to detailed testing, but across the screens and flows within mobile apps

For all test types, we send a report of our findings to the public sector body, who then have a period of time (12 weeks) to make fixes and report their progress to us. We then reassess and pass information to the relevant equality body to make further decisions on compliance or enforcement action.

2.2 Simplified testing process

Summary

Simplified audits examine a small selection of webpages. An automated accessibility checker is run on each page, alongside some quick manual checks.

Preparing for an audit

We identify core pages, like the home page and contact page, with a sample of other pages from across the site. We look for different types of pages that contain a range of elements, such as tables or different layout styles, when possible. We also specifically look for a form, and for PDFs that have an administrative purpose or have been published since 23 September 2018.

Auditing

We use an automated accessibility testing extension for Google Chrome called Axe, created by Deque Systems. We only record severe or critical errors and manually check issues flagged to make sure they are not just best practice suggestions.

We do manual tests for some of the most common barriers to users with accessibility needs that are not likely to be found by automated tests.

This includes tabbing through each page without a mouse to check that:

  • the user can navigate to all links and interactive elements without a mouse (WCAG 2.1.1 Keyboard)
  • there are no keyboard traps (WCAG 2.1.2 No Keyboard Trap)
  • the focus surrounding elements is visible (WCAG 2.4.7 Focus Visible)
  • focus order is logical (WCAG 2.4.3 Focus Order)
  • there are no unexpected actions when tabbing (WCAG 3.2.1 On Focus)

Each page is viewed at different zoom settings up to 400%, and we simulate viewing the page on a small screen. This checks that the page is still navigable and readable, with no content cut off (WCAG 1.4.4 Resize text and WCAG 1.4.10 Reflow).

We look for audio/visual content such as images, videos, podcasts and animations. This is to check that:

  • there are accessible alternatives for video-only or audio-only content (WCAG 1.2.1 Audio-only and Video-only)
  • there are no images of text with significant content (WCAG 1.4.5 Images of Text)
  • animations and carousels etc. follow WCAG 2.2.2 Pause, Stop, Hide requirements

There are issues we do not explicitly look for, but will note if we find them. These include the page refreshing unexpectedly (WCAG 2.2.1 Timing Adjustable) and audio that plays for more than 3 seconds with no way to stop it (WCAG 1.4.2 Audio Control).

We sometimes find usability errors and bad practice issues that do not fail WCAG. If they are significant, we include them separately in our report, making clear that they are not failures.

Simplified tests do not show every accessibility error and we encourage organisations to get a full audit.

We look for an accessibility statement and check that this meets the required legislative format. These must include all mandatory wording, list all known accessibility issues and be up-to-date.
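Parts of the statement check described above can be sketched in code. The phrases below are abbreviated placeholders for illustration only; the actual mandatory wording is set out in the model accessibility statement, and this is not the tool used by the monitoring body.

```python
# Illustrative sketch of checking an accessibility statement for required
# elements. REQUIRED_PHRASES are simplified placeholders, not the real
# legislative wording.
REQUIRED_PHRASES = [
    "accessibility statement applies to",
    "compliance status",
    "enforcement procedure",
]

def missing_phrases(statement_text: str) -> list[str]:
    """Return the required placeholder phrases absent from the statement."""
    text = statement_text.lower()
    return [p for p in REQUIRED_PHRASES if p not in text]

draft = "This accessibility statement applies to www.example.gov.uk ..."
print(missing_phrases(draft))  # ['compliance status', 'enforcement procedure']
```

A real check would also verify the listed accessibility issues and the preparation and review dates.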

2.3 Detailed testing process

Summary

Detailed audits take a more in-depth look at a website. They test against the full range of WCAG 2.1 success criteria to level A and AA. Assistive technology is used to check compliance, as well as the automated and manual methods used in simplified testing.

Before an audit

Before a detailed audit happens, the public sector body is contacted to get information such as:

  • statistics on page usage
  • page template types
  • previous audit reports
  • access to the website/service if it is not publicly available
  • disproportionate burden assessments (if applicable)

Preparing for an audit

When the information is provided it is used to decide on a sample of pages to test. We try to include a range of pages from the website or mobile application such as:

  • some of the most popular pages
  • a range of template types
  • at least one service end-to-end (where possible)
  • other pages the monitoring body thinks need to be tested

The regulations allow previous accessibility audits to be considered when deciding the coverage of our monitoring, provided they meet certain criteria set out in the regulations, including how recent the audit is and whether it was a detailed audit.

If the previous audit provided by the public sector body is sufficient, it is compared against the sample of pages chosen for testing by the monitoring body. If a page has already been tested satisfactorily as part of the public sector body’s own audit, the regulations say the monitoring body is only required to do a simplified test on that page. Any other pages will need a detailed test.

Auditing

The detailed test covers all WCAG 2.1 success criteria in scope at level A and AA, but remains a sample-based approach and does not provide full coverage of the website.

2.4 Mobile app testing process

For mobile application tests, both Android and iOS versions of an application are tested.

The mobile application tests follow the same process as detailed audits, and are tested against WCAG 2.1 level A and AA success criteria, except for criteria that are exempt from mobile testing.

2.5 What happens after an audit

After the audit a report detailing accessibility issues found is sent to the public sector body that runs the website. They are asked to acknowledge the report and given 12 weeks to fix issues.

We send the report to the contacts in the organisations’ accessibility statements. If we cannot find contact details in an accessibility statement, we check the contact page or use feedback forms to request contact information. If we do not hear from organisations after a month and have tried all known methods of communication, we forward information about the website to the enforcement bodies.

After 12 weeks the public sector body is contacted to give an update on their progress. The monitoring body may retest certain parts of the website or mobile application to confirm the issues no longer exist but this does not constitute a full retest of the website or mobile application.

We then make a recommendation on whether we think further compliance action should be taken. An email will be sent to the public sector body to confirm the outcome of the monitoring.

2.6 After testing and correspondence

Information about the monitoring process and outcome, as well as our recommendation, is passed to the Equality and Human Rights Commission (EHRC) in England, Scotland and Wales, or the Equality Commission for Northern Ireland (ECNI), which are responsible for enforcement of equality and non-discrimination laws. They decide whether any further action is required and will contact the public sector body directly if necessary.

Periodically CDDO will, on behalf of the Minister for the Cabinet Office, publish a list of websites with non-compliant accessibility statements. This is expected to start in 2022.

3. What has been monitored

3.1 Composition of the sample

Monitoring of websites and mobile apps is ongoing. The data is provided from February 2020, when monitoring started, until 19 November 2021.

612 websites have been tested (593 by the simplified testing method and 19 by the detailed testing method). 2 mobile apps have been tested.

Simplified testing

593 websites were tested:

  • 13 websites are currently being tested or having their report written
  • 113 websites have been sent their report and are in the 12 week period provided for fixing issues
  • 22 websites have reached the end of the 12 week period and we are reviewing their response
  • 403 websites have finished the monitoring process
  • 42 cases were closed during the process because the website was exempt, was private sector, or the organisation was closing or merging

In 2020 and 2021 our aim was to monitor a sample of all public sector organisation types and website types and sizes, including some of those with exemptions, to learn about any differences needed in the monitoring process. We started by sampling across a range of organisations, but because of the burden of extra coronavirus-related work, and because smaller organisations had problems making changes to their sites easily, we have focussed most of our testing on larger public sector organisations, especially central government (including agencies and arm’s length bodies), larger local government and large and central health organisations.

We created lists of public sector organisations, which were then randomised for testing. A search engine was used to find each organisation’s main website, which was then tested.

We included complaints received via the Equality Advisory Support Service (EASS) in the sample.

Geographic distribution of simplified testing

Location Percentage
UK wide 26.5%
England 57.9%
Northern Ireland 4.7%
Scotland 6.2%
Wales 4.7%

Sectors for simplified testing

Sector Percentage
Central government, including agencies and ALBs 31.4%
Education 3.6%
Health 9.3%
Law and public safety 4.0%
Local government 51.7%

Detailed testing

19 websites and services have entered the detailed testing process.

These are at different stages:

  • 3 websites and services that have been contacted, have provided the initial information and are in our backlog (due to be tested this year)
  • 1 service where testing has finished and the report is being compiled
  • 7 websites and services that have been sent their report and are in the 12 week period that is provided for fixing issues
  • 3 websites and services that have reached the end of the 12 week period and we are reviewing their response
  • 5 websites and services where we have finished the monitoring process

Three of the websites and services tested were complaints referred to the monitoring body by the Equality Advisory and Support Service (EASS) that were too complicated to test through simplified testing.

In this monitoring period we focussed on larger public sector bodies, starting with councils. Organisations to test were sampled by looking for council regions that had the largest population sizes, and that had not already been through the simplified test process. Complaints received via EASS that were not suitable for simplified testing were also prioritised and included in the sample.

Geographic distribution of detailed testing

Location Number
UK wide 2
England 14
Northern Ireland 2
Scotland 1
Wales 0

Sectors for detailed testing

Sector Number
Central government, including agencies and ALBs 2
Health 1
Local councils 16

Mobile app testing

For mobile testing, 4 mobile applications were audited: 2 applications, each on 2 operating system platforms (iOS and Android).

Testing has finished and the report has been sent.

For mobile applications, an exercise was done to identify public sector apps, and the chosen applications were either large local authority apps or widely downloaded applications.

The 4 applications tested were in the health sector and were UK-wide applications.

4. What we found

4.1 Simplified tests

We found no accessibility issues for 8 websites, of which 5 had compliant accessibility statements. Approximately half of the websites tested had fewer than 5 issues, 75% of sites had fewer than 9 issues, and we found at most 26 issues on one site. As mentioned in the testing methodology, simplified tests only cover parts of WCAG and test a small number of pages.

After sending the report, corresponding with the public sector organisations, and retesting after 12 weeks, 229 websites (59%) had fixed the issues found or had a short-term roadmap to fix remaining issues, and 159 (41%) still had some issues remaining. 40 cases were stopped before retesting because the website was exempt, deemed to be private sector, or had been closed or merged with another website.

From a sample of 421 tests, 2691 issues were found, including:

WCAG % of total
2.4.7 Focus Visible 12%
4.1.2 Name, Role, Value 11%
1.4.3 Contrast (Minimum) 11%
1.3.1 Info and Relationships 9%
2.4.4 Link Purpose (In Context) 8%
2.1.1 Keyboard 8%
1.4.4 Resize text 8%
1.1.1 Non-text Content 7%
1.4.1 Use of Colour 5%
1.4.10 Reflow 5%
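Several of the criteria above can be checked mechanically. As an illustration (this is not the monitoring toolchain), WCAG defines the 1.4.3 contrast minimum by a precise formula: each sRGB channel is linearised, combined into a relative luminance, and the ratio between the lighter and darker luminance must be at least 4.5:1 for normal-size text.

```python
# Sketch of the WCAG 2.1 contrast-ratio calculation (success criterion 1.4.3).
# Illustrative only; not the tool used by the monitoring body.

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour given as '#rrggbb'."""
    hex_colour = hex_colour.lstrip("#")
    channels = []
    for i in (0, 2, 4):
        c = int(hex_colour[i:i + 2], 16) / 255
        # Linearise each sRGB channel as defined by WCAG
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours; AA requires >= 4.5 for normal text."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# '#767676' on white is roughly the lightest grey that passes AA for normal text
print(round(contrast_ratio("#767676", "#ffffff"), 2))  # 4.54
```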

Observations from simplified testing

We usually find at least one accessibility issue through automated tests and an issue through manual testing.

There is also normally an issue with the PDF that is tested, even though the checks are limited to technical tests, such as inclusion of a PDF title and language.
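The PDF checks mentioned above are purely technical, and the simplest of them can be sketched as follows. This is a crude, stdlib-only heuristic over raw PDF bytes: real PDFs often store these entries in compressed or indirect objects, so production checks use a proper PDF library, and this is not the monitoring body's tool.

```python
# Crude illustration of the PDF metadata checks described above: look for a
# document title and a default language in the raw bytes. Handles only the
# simplest uncompressed case; illustrative, not a production check.
import re

def pdf_metadata_issues(raw: bytes) -> list[str]:
    issues = []
    if not re.search(rb"/Title\s*\(", raw):
        issues.append("no document title (assistive technology may read the filename)")
    if not re.search(rb"/Lang\s*\(", raw):
        issues.append("no default language (screen readers may use the wrong voice)")
    return issues

sample = b"%PDF-1.7\n1 0 obj\n<< /Title (Annual report) >>\nendobj\n"
print(pdf_metadata_issues(sample))  # only the missing-language issue
```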

Many issues are found across all pages, as they are part of the template used throughout the site. Homepages often have extra elements that may cause accessibility problems, such as more use of images that contain text, or carousels. Cookie notices, or other elements that have been added to the page, can also generate errors.

Some sites use a third party provider for forms, and may not use the same template as the rest of the site. These can present different accessibility issues, some of which are seen across all sites that use the same third party provider.

4.2 Detailed tests

Out of 19 sites that have been tested so far, we have completed the monitoring process for 4 of them.

Two of the four public sector websites that were audited showed good levels of improvement after the 12 week period provided for fixing issues, with at least 89% of issues fixed. The other sites fixed fewer issues: one fixed 37% and the other fixed none of the issues identified in the report.

All four public sector websites had non-compliant accessibility statements when they were initially tested. All had accessibility issues missing from the statement, and two were missing mandatory wording. All four websites made changes to their statements after our report was sent and had compliant statements at the end of the monitoring process.

Issues found during detailed testing

The total number of issues found in each detailed test ranged from 15 to 151, with an average of 52 issues per test.

The average number of unique issues per page ranged from 0.66 to 8.39 across tests, with an overall average of 2.13.

From the detailed tests carried out up to November 2021, 830 issues were detected and the most common issues were:

WCAG % of total
4.1.1 Parsing 23.5%
4.1.2 Name, Role, Value 8.7%
1.3.1 Info and Relationships 8.0%
1.4.3 Contrast (Minimum) 7.7%
2.4.7 Focus Visible 5.8%
1.4.10 Reflow 4.6%
1.1.1 Non-text Content 4.3%
1.4.4 Resize text 4.1%
2.1.1 Keyboard 2.8%
2.4.2 Page Titled 2.8%

The first 5 criteria account for 53% of the total issues, and the first 10 criteria account for 72% of the total issues.

The top three (Parsing, Name/Role/Value and Info and Relationships, covering 40% of issues) relate to writing code in a robust way and ensuring that the visual and programmatic elements of a site are consistent.
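One classic 4.1.1 Parsing failure is duplicate id attributes, which can break the links that assistive technology relies on between labels, headings and descriptions. The stdlib-only sketch below flags duplicates; it is illustrative and not the tool used for monitoring.

```python
# Illustrative check for duplicate id attributes, a common WCAG 4.1.1
# Parsing failure. Uses only the standard library.
from collections import Counter
from html.parser import HTMLParser

class IdCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value:
                self.ids.append(value)

def duplicate_ids(html: str) -> list[str]:
    """Return id values that appear more than once in the document."""
    collector = IdCollector()
    collector.feed(html)
    return [i for i, n in Counter(collector.ids).items() if n > 1]

page = '<nav id="menu"></nav><div id="menu"></div><main id="content"></main>'
print(duplicate_ids(page))  # ['menu']
```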

Observations from detailed testing

While all organisations so far have been able to provide evidence of recent accessibility audits, these have varied in depth from scheduled automated checks to comprehensive manual audits.

Simpler pages written using standard, semantic HTML tend to be the most accessible.

Complex sites containing media, animations and forms tend to have the most issues as there are more things that can go wrong.

Common components that cause accessibility issues include:

  • form fields with no label associated programmatically - these issues are particularly likely to block screen reader users from using a site independently as they may not have another way of identifying each field’s purpose
  • images with no alternative text
  • images containing important text which is not available any other way
  • online form pages which do not respond well to high levels of zoom, either shrinking or requiring two-dimensional scrolling
  • carousels that cannot be paused or operated using the keyboard
  • PDF documents not having titles or languages set in their metadata
  • page add-ons, such as cookie banners or email subscription prompts, which either are not keyboard-accessible or require the user to tab through the whole page before they can be accessed
  • unresponsive components such as buttons containing text, where the text is cut off as it is enlarged
  • links which are not underlined and rely on a user to distinguish two similar colours to identify them

Online forms often have more issues per page than average. Some online forms perform very well in automated checks but fail manual checks. Forms can be managed by a different team from the core website, or outsourced to a third party, leading to a different experience from the rest of the website.

4.3 Mobile tests

We contacted the first organisations sampled for mobile app testing in August 2021. The first report is being compiled.

Android and iOS versions of the same app are counted as separate applications. However, common issues which affect both operating systems have been reported back to organisations as a single issue.

2 mobile applications have been tested in this monitoring period (the Android and iOS versions of the same app), and 4 more are in test.

Issues found in mobile testing

The iOS app had 21 issues and the Android app had 19 issues. The most common issues were:

WCAG % of total
4.1.2 Name, Role, Value 22.5%
2.1.1 Keyboard 17.5%
1.4.4 Resize text 12.5%
2.4.7 Focus Visible 12.5%
1.3.1 Info and Relationships 12.5%
1.4.3 Contrast (Minimum) 12.5%
2.2.1 Timing Adjustable 5.0%
2.2.2 Pause, Stop, Hide 5.0%
2.5.3 Label in Name 5.0%

As only a small amount of mobile testing has been carried out so far, it is difficult to draw any conclusions from this data.

So far, iOS and Android versions of purportedly the same app often behave very differently, and apps using embedded web pages behave differently from native app screens and respond to operating system preferences in different ways.

At this early stage, Android apps appear to have better assistive technology support.

5. Accessibility statements

At the time of initial testing, 39 sites (7%) had a compliant accessibility statement, 461 sites (83%) had a statement but with some mandatory information missing, and 55 (10%) had no statement.

After monitoring and correspondence, 319 sites (80%) had a compliant accessibility statement, 80 sites (20%) still had some mandatory information missing, and 2 (0.5%) had no statement.

The most common issues with accessibility statements include:

  • mandatory wording missing, including the wording that specifies what the accessibility statement applies to
  • issues found in our tests that are not included in the statement
  • old accessibility statements, not updated since before the regulations came into force, that do not resemble the required legislative format
  • organisations copying text directly from the sample statement template without replacing it with their own organisation’s information

Generally the statements have been easy to find, and are included in the header or footer of the website. Some organisations have split the statement across many pages, or have published it as a PDF or Word document, which makes it harder to understand.

It is not always clear where accessibility issues are found, and what the problems are. This means that users cannot easily understand where and how they may have issues using the website.

Many statements have not been updated since first publication, often in 2020 or earlier. They refer to future fixes for issues where the dates have now passed. Again, this means that users cannot understand what issues may remain on the site. Sites should review the statement every year, and add the date of review to the statement so users know the statement is up-to-date.

From a small initial analysis of public sector mobile applications made prior to sampling, few had an accessibility statement. Out of 13 apps, only 4 had an accessibility statement that we could find.

6. Disproportionate burden

Disproportionate burden may be claimed where the impact of fully meeting the accessibility regulations is too great for an organisation to reasonably bear. For example, a specific accessibility issue may be expensive to fix while the benefit to users is too low to justify the resources. Lack of time or knowledge does not constitute a disproportionate burden, and the majority of digital content should be fully accessible.

Organisations are legally required to carry out an assessment of the extent to which compliance with the accessibility regulations imposes a disproportionate burden.

32% of websites have claimed disproportionate burden and 68% have not. The understanding of disproportionate burden varies greatly and has often caused confusion and questions. Common issues include:

  • organisations copying the disproportionate burden claim directly from the sample accessibility statement
  • organisations claiming disproportionate burden without having carried out an assessment beforehand
  • misunderstanding of when the disproportionate burden exemption applies, as opposed to exemptions for certain pages and information from the accessibility regulations

Some organisations, however, have been able to provide a detailed assessment on request and this can paint a clear picture of the organisation’s thought process during the assessment, including costs and benefits to their users.

Organisations are often unclear on what kind of information to include in their assessment. A good disproportionate burden assessment is a cost/benefit exercise and should consider factors such as:

  • how much it will cost to fix the issue
  • the amount allocated to spend on the website annually
  • how extra costs would impact the organisation’s budget
  • the number of users the issue impacts if not fixed
  • benefits that fixing issues would bring to users
  • how long an organisation expects this disproportionate burden to apply
  • if the site or service is procured or outsourced, how long the third party supplier is contracted for, and how much it would cost to re-tender or renegotiate the contract to get the issues fixed
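The factors above amount to simple cost/benefit arithmetic, which a tiny sketch can make concrete. The field names and the example figures are entirely hypothetical; the regulations do not prescribe a formula, and each assessment must be argued on its own facts.

```python
# Purely illustrative arithmetic for the cost/benefit factors listed above.
# Field names and figures are hypothetical; this is not a prescribed method.

def burden_summary(fix_cost: float, annual_budget: float, users_affected: int) -> dict:
    return {
        "cost_share_of_budget": round(fix_cost / annual_budget, 2),
        "cost_per_affected_user": round(fix_cost / users_affected, 2),
    }

# e.g. a £15,000 fix on a £50,000 annual website budget, affecting 300 users
print(burden_summary(15_000, 50_000, 300))
# {'cost_share_of_budget': 0.3, 'cost_per_affected_user': 50.0}
```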

Both of the apps tested so far have made a disproportionate burden claim. In both cases a joint assessment had been published on the organisation’s website beforehand. However, there is insufficient data to draw further conclusions.

7. Lessons learnt from the feedback sent by the monitoring body to the public sector bodies monitored

7.1 Simplified testing

Reactions to receiving our accessibility audit reports have been mixed. Some organisations are grateful to have a starting point and welcome our audits, which can push website accessibility up the priority list, but some organisations do not respond positively, and the report can sometimes be the first they have heard of the accessibility regulations.

The majority of organisations acknowledge our report within a month, but some organisations will not reply to us until after we inform them we are sending their information to the enforcement bodies.

A minority (roughly 20%) never respond to our report or subsequent emails. We get better response rates when there is a direct email address on an accessibility statement or contact page rather than when relying on website contact forms.

Willingness and cooperation usually depend on organisation size and the resources available. The pandemic has inevitably affected this.

Organisations will ask us for guidance and help. Most contact us within the 12 weeks to ask questions. We started sending out more information with our reports and saw a reduction in the number of questions we were asked. Due to the volume of websites being tested, we have not been able to engage more closely with monitored organisations or provide more information about how to fix specific issues we found.

7.2 Detailed and mobile testing

It is often hard for an organisation to provide us with access to a dedicated test environment due to security controls, although it has been done.

This may be a larger barrier to testing when the sample includes more:

  • services
  • transactional websites
  • internal sites
  • intranets

We need to submit data on live services to test a whole process. This requires careful coordination with the public sector body and in some cases may not even be possible.

We occasionally find issues that are accessibility failures but do not fail a specific WCAG criterion. In each case we use our judgement on the likely impact and raise them as advisory issues with the public sector body.

Issues found on parts of the website managed by a third party can take longer to fix, due to the need to contact suppliers and lead times set in contracts. Despite this, some third party suppliers have made their services more accessible and fixed issues raised by public sector bodies.

8. Feedback from public sector organisations that were monitored

We invited all public sector organisations that were monitored by the simplified test process, where the process is complete, to answer some questions about their experience. We received a total of 86 responses.

Receiving the accessibility report had a positive impact on organisations:

  • 73% reported that it helped them fix accessibility issues with their website
  • 63% said it helped them create or improve their accessibility statement
  • 43% said it helped raise accessibility awareness in their organisation
  • 20% said it helped them secure the resources needed to fix accessibility issues

Quotes from the survey reinforce that the audit has empowered people to start taking action to meet the accessibility regulations:

“Having an audit under the name of the Cabinet Office did help us in getting leverage when it came to asking other teams to make changes to their forms/pages.”

“Escalated the importance of ensuring public facing content is accessible.”

“Prompted the establishment of an accessibility roadmap.”

There are some organisations where receiving the audit made no difference (~7%):

“To be honest it had no impact on anyone within the organisation other than the Communications Team. We found it quite difficult to resolve changes as they were structural/developer issues that only our provider could amend and found explaining the importance of these changes very challenging.”

For some organisations (10%) it created unnecessary work:

“It impacted ‘business as usual’ work and took valuable resources away from priority pandemic-related work.”

Only around a third of respondents (34%) reported having an in-house accessibility specialist. To compensate, organisations rely on contractors and online accessibility testing services.

66% reported that they are using the guidance we publish on GOV.UK to meet the accessibility regulations.

Public sector organisations reported that they are dealing with the following barriers to meeting the accessibility regulations:

  • lack of awareness about the regulations (55%)
  • funding issues (53%)
  • lack of accessibility training (52%)
  • needing more accessibility guidance (45%)
  • difficulties collaborating with contractors (34%)

Other barriers included a lack of resources to continuously fix accessibility issues on the website, difficulties in motivating the organisation to spend on costs associated with addressing accessibility issues, and the organisation’s willingness to change.

9. Complaints, compliance and enforcement

9.1 Complaints

The Equality Advisory Support Service (EASS) and the Equality Commission for Northern Ireland (ECNI) receive complaints from the public when users encounter accessibility issues on websites and apps that have not been resolved by the public sector organisation.

They have referred 14 cases to us where members of the public have complained about a website. 6 of these cases concerned accessibility statements not being available on the website and 3 were about screen readers not working effectively. All of these cases were assessed when sent to us by either EASS or ECNI.

Complaints are tested via the simplified testing process unless they are too complicated to test this way, such as services or transactions within a website, in which case the complaint is moved to the detailed testing process so we can access the relevant pages.

In one case, a complaint reached the organisation through multiple channels and they commissioned an audit from an independent accessibility consultant. We reviewed this audit and used it as part of our detailed testing.

9.2 Compliance and enforcement

Once CDDO has completed its assessment of a public sector website or mobile application, we send information about the case to either the Equality and Human Rights Commission (EHRC) for English, Scottish and Welsh organisations or the Equality Commission for Northern Ireland for Northern Irish organisations.

CDDO provides technical support, including retests, to the equality bodies as they investigate and carry out any enforcement activity.

EHRC enforcement approach

The EHRC enforces the accessibility regulations in Great Britain. To carry out its regulatory work, the EHRC has a range of legal powers under the Equality Act 2006. Ensuring accessibility of public sector websites is a strategic priority for the EHRC. The EHRC will consider the CDDO reports and recommendations on whether further action should be taken against an organisation.

Where a public sector body has failed to engage with CDDO and/or failed to fix the majority of non-compliance issues, the EHRC will consider whether it is proportionate to take enforcement action. It does this in accordance with its litigation and enforcement policy.

The EHRC has a two-stage approach to enforcement. First, it seeks compliance from the organisation: it sends an initial letter setting out the scope of its enforcement powers and requesting details of the efforts made to fix the outstanding accessibility issues, together with a specified time frame for making the fixes.

The organisation is informed that failure to do so may result in enforcement action. Second, if the organisation fails or refuses to comply, the EHRC will proceed with enforcing the regulations. This may include entering into a binding agreement to comply with the regulations, or conducting an investigation into the failure to make reasonable adjustments to ensure website accessibility.

Since the start of the EHRC’s role, the majority of CDDO reports have not needed to recommend further action. To date, the EHRC has secured compliance without the need for enforcement action.

ECNI enforcement approach

The Equality Commission for NI is Northern Ireland’s statutory equality body. It was established by the Northern Ireland Act 1998. Its powers and duties derive from a number of statutes enacted over the past decades. Enforcement is undertaken by the ECNI in Northern Ireland, using existing enforcement powers under the Disability Discrimination Act 1995 (DDA 1995).

The Equality Commission may choose to take action based on information provided by CDDO, using its existing enforcement powers under the DDA 1995 and related enforcement powers under Article 9 (1a) and (2) of the Equality (Disability, etc.) (Northern Ireland) Order 2000.

Under the disability discrimination legislation, enforcement in Northern Ireland is through legal action in a county court. The service user must be able to show that they are disabled, in that they have a physical, sensory or mental impairment which has a substantial and long-term adverse effect on their ability to carry out normal day-to-day activities.

Service providers (including public sector website or app based services) are under a duty to make reasonable adjustments to ensure that disabled users can access the service. This means that they must remove barriers that could make it impossible or unreasonably difficult for disabled people to access the service.

There are strict time limits for lodging proceedings in a county court: cases must be lodged within 6 months of the alleged discriminatory act. While a court has discretion to extend the time limit, this is used very sparingly. It is a matter for a county court judge, upon hearing all of the evidence, to decide whether or not disability discrimination has occurred.

On receipt of a complaint from a disabled person about access to a public sector body’s website or mobile app, the Equality Commission will provide advice to the service user and may provide legal assistance with legal proceedings.

The Commission determines which cases to assist under its Policy for the Provision of Legal Advice and Assistance. In the case of inaccessibility of a UK wide website or mobile app the Commission may refer the matter to the Equality and Human Rights Commission’s enforcement team.

The Equality Commission has considered two complaints by disabled people. Both are ongoing. In one complaint, proceedings of disability discrimination have been lodged in a county court in Northern Ireland; in the other, the matter has been referred to the Equality and Human Rights Commission for consideration. CDDO has referred monitoring reports of 13 Northern Ireland based websites to the Equality Commission.

The Commission is in the process of contacting the relevant public sector bodies regarding their plans to improve accessibility and noting that complaints from individuals may result in legal action under the Disability Discrimination Act 1995.

Individuals with a complaint of website accessibility in Northern Ireland may contact the Equality Commission for NI on 028 90 500 600 and ask to speak to a Discrimination Advice Officer.

10. Accessibility training, communications and support for the public sector

The accessibility capability team in CDDO provides support and guidance for the public sector.

10.1 Resources

CDDO publishes information for government and the wider public sector around digital accessibility, such as information about making your service accessible, accessibility personas, publishing accessible documents and testing for accessibility.

There is guidance to help public sector bodies understand accessibility regulation requirements and how to ​​make a website or app accessible and publish an accessibility statement.

We publish an accessibility in government blog.

10.2 Communities

The team oversees the government accessibility community, which includes over 1500 people interested in accessibility in the public sector. It’s a place where people can ask questions, share their experiences, and contribute to the development of best practice.

10.3 Training

The team holds weekly Accessibility Clinics where guests can ask any accessibility-related questions they may have. The team also hosts two monthly webinars, Introduction to Accessibility and a Demo of Assistive Technology, which have been attended by roughly 700 people over the last year.

Both the clinics and webinars are open to anybody working in or with the government and can be booked by looking for the advertised events on the cross-government accessibility Slack channel.

10.4 Communications campaign

Government Digital Service (GDS), and then CDDO, ran a communications campaign from 2019 onwards to tell the public sector about the new accessibility regulations, and to promote the published guidance to help the public sector comply.

The campaign included blog posts, stakeholder outreach, social media outreach, events and panels including a GDS-run event for Global Accessibility Awareness Day 2020. We held 6 webinars across Global Accessibility Awareness Day with 4,000 registrations to the event. As a result of attending the webinars, 84% of public sector ‘website owners’ agreed they had a better understanding of what they can do to meet the accessibility regulations.

The campaign won a bronze award in the digital category of the Public Service Communications Awards 2020.

We have run surveys with the public sector around each of the deadlines in the accessibility regulations. These indicate that awareness of the deadlines has consistently been high and that the public sector knows what actions to take to become compliant.

We have also promoted the way that users can complain about inaccessible websites and mobile apps. There is a page on GOV.UK about the rights users have to be able to access online public sector information and services, and how to complain. We’ve also developed more information for users with Scope.

11. National Disability Strategy

This report is published in line with the National Disability Strategy.

The National Disability Strategy, published in July 2021, sets out the government’s plan to improve the everyday lives of disabled people.

Everyday experience for disabled people is very different from that of non-disabled people. Many disabled people:

  • navigate inaccessible and inflexible workplaces or education settings
  • use unresponsive and fragmented public services that do not meet their needs
  • find themselves barred from exercising rights such as voting and serving on a jury

The strategy commits the government to putting disabled people at the heart of policy-making and service delivery – laying the foundations for longer term, transformative change.

The government will work across government departments to embed the following elements, which underpin our future approach to disability:

  • ensure fairness and equality – we will empower disabled people by promoting fairness and equality of opportunities, outcomes and experiences, including work and access to products and services
  • consider disability from the start – we will embed inclusive and accessible approaches and services to avoid creating disabling experiences from the outset
  • support independent living – we will actively encourage initiatives that support all disabled people to have choice and control in life
  • increase participation – we will enable greater inclusion of a diverse disabled population in the development and delivery of services, products and policies
  • deliver joined up responses – we will work across organisational boundaries and improve data and evidence to better understand and respond to complex issues that affect disabled people

These elements complement those of the UN Convention on the Rights of Persons with Disabilities (UNCRPD).

12. Appendices

12.1 Correlation of monitoring against technical standards

Our testing is based on the EN 301 549 standard, version 2.1.2, which maps closely to Web Content Accessibility Guidelines (WCAG) version 2.1 levels A and AA.

Using table A1 in Annex A of the standard, we are required to test the following:

  • conditional requirements based on the functionality of the websites in scope (from clauses 5, 6 and 7 of EN 301 549) - however none of the web pages or applications we have tested so far have included functionality which brings this into scope
  • requirements based on Web Content Accessibility Guidelines (clause 9)
  • requirements for support documentation (clause 12) - however the majority of websites do not have separate support documentation so this does not apply in most cases - we routinely test help pages as part of detailed and mobile testing if they exist

Mobile testing requirements are listed in table A2 of EN 301 549 and documents are covered in clause 10.

The following table covers our test approaches for each Web Content Accessibility Guideline using these categories:

  • Yes: we test the success criterion manually or using tools, whichever is most appropriate
  • Automated only: we only flag automatically-detectable failures of the success criterion
  • Limited: there is not an easy way to test for this on a mobile but we will flag potential issues if we find them
  • No: we do not actively test for it
  • Exempt: the success criterion is exempt from the regulations

Clause 9 of EN 301 549 uses the same numbering as WCAG v2.1, preceded with a 9. For example, WCAG 1.1.1 (non-text content) is the same as EN 301 549 clause 9.1.1.1.
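
As a trivial illustration, this mapping can be expressed as a one-line function:

```python
def wcag_to_en301549(criterion: str) -> str:
    """Map a WCAG 2.1 success criterion number to its EN 301 549 clause
    number, which is simply the same number prefixed with '9.'."""
    return "9." + criterion

print(wcag_to_en301549("1.1.1"))  # 9.1.1.1
```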

WCAG Level Simplified Detailed Mobile
1.1.1 Non-text Content A Automated only Yes Yes
1.2.1 Audio-only and Video-only (Prerecorded) A Yes Yes Yes
1.2.2 Captions (Prerecorded) A Yes Yes Yes
1.2.3 Audio Description or Media Alternative (Prerecorded) A No Yes Yes
1.2.4 Captions (Live) AA Exempt Exempt Exempt
1.2.5 Audio Description (Prerecorded) AA No Yes Yes
1.3.1 Info and Relationships A Automated only Yes Yes
1.3.2 Meaningful Sequence A No Yes Yes
1.3.3 Sensory Characteristics A No Yes Yes
1.3.4 Orientation AA Automated only Yes Yes
1.3.5 Identify Input Purpose AA Automated only Yes Yes
1.4.1 Use of Color A Automated only Yes Yes
1.4.2 Audio Control A Yes Yes Yes
1.4.3 Contrast (Minimum) AA Yes Yes Yes
1.4.4 Resize text AA Yes Yes Yes
1.4.5 Images of Text AA Yes Yes Yes
1.4.10 Reflow AA Yes Yes Limited
1.4.11 Non-text Contrast AA No Yes Yes
1.4.12 Text Spacing AA Automated only Yes Limited
1.4.13 Content on Hover or Focus AA Yes Yes Yes
2.1.1 Keyboard A Yes Yes Yes
2.1.2 No Keyboard Trap A Yes Yes Yes
2.1.4 Character Key Shortcuts A No Yes Yes
2.2.1 Timing Adjustable A Yes Yes Yes
2.2.2 Pause, Stop, Hide A Yes Yes Yes
2.3.1 Three Flashes or Below Threshold A No Yes Yes
2.4.1 Bypass Blocks A Automated only Yes Exempt
2.4.2 Page Titled A Automated only Yes Exempt
2.4.3 Focus Order A Yes Yes Yes
2.4.4 Link Purpose (In Context) A Automated only Yes Yes
2.4.5 Multiple Ways AA No Yes Exempt
2.4.6 Headings and Labels AA No Yes Yes
2.4.7 Focus Visible AA Yes Yes Yes
2.5.1 Pointer Gestures A No Yes Yes
2.5.2 Pointer Cancellation A No Yes Yes
2.5.3 Label in Name A Automated only Yes Yes
2.5.4 Motion Actuation A No Yes Yes
3.1.1 Language of Page A Automated only Yes Yes
3.1.2 Language of Parts AA Automated only Yes Exempt
3.2.1 On Focus A Yes Yes Yes
3.2.2 On Input A No Yes Yes
3.2.3 Consistent Navigation AA No Yes Exempt
3.2.4 Consistent Identification AA No Yes Exempt
3.3.1 Error Identification A No Yes Yes
3.3.2 Labels or Instructions A No Yes Yes
3.3.3 Error Suggestion AA No Yes Yes
3.3.4 Error Prevention (Legal, Financial, Data) AA No Yes Yes
4.1.1 Parsing A Automated only Yes Limited
4.1.2 Name, Role, Value A Automated only Yes Yes
4.1.3 Status Messages AA No Yes Exempt

1.2.4 Captions (Live) is not tested as live time-based media is specifically exempt from the accessibility regulations.

The list of exempt mobile criteria comes from table A2 of EN 301 549.

12.2 Tools used

In this monitoring period, tests of websites have mainly been carried out in a Google Chrome browser but others may be used where deemed appropriate. Due to the equipment available, simplified tests are largely conducted on Macs and detailed tests on Windows laptops.

Android mobile applications are tested on a Pixel 4a phone and iOS applications on an iPhone 11.

To aid with the audits, the following tools are used.

Tool Used in type of test
Colour Contrast Analyser from TPGi Simplified, detailed and mobile
Contrast checker from WebAIM Simplified, detailed and mobile
Axe from Deque Simplified and detailed
Wave from WebAIM Detailed only
Web Developer extension Detailed only
NVDA screen reader Detailed only
Adobe Acrobat Pro DC’s accessibility checker Detailed only
Text spacing bookmarklet Detailed only
WCAG parsing compliance bookmarklets Detailed only
Character Key Shortcut bookmarklet Detailed only
VoiceOver for iOS Mobile only
TalkBack for Android Mobile only
External keyboard for mobile Mobile only
Android accessibility scanner Mobile only
JAWS screen reader Only used to verify specific complaints
Dragon speech recognition software Only used to verify specific complaints

12.3 Accessibility testing

The order and grouping of the WCAG checks varies depending on the test approach (simplified, detailed and mobile), and what works best for the individual auditor.

Here is a summary of the checks that may be performed for a detailed test. For simplified and mobile tests, we perform a subset of these checks as shown in the correlation table.

Keyboard and zoom

Tab through the page and see how it behaves at different zoom levels. This helps us test that a site or application:

  • is keyboard-accessible
  • works at different zoom levels
  • does not have keyboard shortcuts which interfere with website use

Automated check

Use Axe to pick up automatically-detectable issues. Automated checkers are believed to detect approximately 30-40% of accessibility issues.

Sensory check

Check that content does not rely on using a particular sensory characteristic (such as audio or colour), and is not distracting. This helps us test that:

  • contrast levels are sufficient
  • images have alternative text
  • essential text is not provided in image form
  • additional content such as tooltips is accessible
  • there are alternatives to audio and video media
  • autoplaying audio and video can be stopped
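
The contrast check follows the WCAG 2.1 definition of contrast ratio, which can be computed directly from colour values. This is a minimal sketch of the calculation (the function names are ours, and the colours are illustrative):

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an sRGB colour given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    """Contrast ratio between two colours, from 1:1 (identical) to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)

# Success criterion 1.4.3 requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```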

Screen reader

Detect and interact with elements such as headings, links, form controls and tables, using a screen reader. This helps us test that:

  • page elements are coded consistently with how they appear visually
  • the site or application is compatible with assistive technology
  • links, headings and labels are available and correct
  • forms can be filled in easily
  • error messages are correctly identified with information on how to fix them

Tools

Use the web inspector and tools to test specific criteria such as language, text spacing and keyboard shortcuts. This helps us test that:

  • pages have the correct title and language
  • page content is in a logical order programmatically
  • hypertext markup language (HTML) is valid
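
As a rough illustration of checks like these, the sketch below uses Python’s standard-library HTML parser to read a page’s language attribute and title. Real audits use browser developer tools rather than a script, and the markup here is invented for illustration:

```python
from html.parser import HTMLParser

class LangAndTitleChecker(HTMLParser):
    """Collect the lang attribute of the <html> element and the page title."""
    def __init__(self):
        super().__init__()
        self.lang = None
        self.title = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "html":
            self.lang = dict(attrs).get("lang")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

page = '<html lang="en"><head><title>Contact us</title></head><body></body></html>'
checker = LangAndTitleChecker()
checker.feed(page)
print(checker.lang, checker.title)  # en Contact us
```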

Mobile and motion

Use a mobile device to test orientation and any gesture-dependent functionality. This helps us test that:

  • pages work in both portrait and landscape modes
  • functionality does not rely on motion or specific gestures
  • accidental gestures can be cancelled

Whole site

Look for consistent navigation and element names across the site. This helps us test that:

  • there are multiple ways to navigate around a site
  • there is consistency across the site in how navigational elements work and how elements are named

We may also scan the site for accessibility, usability or functional issues that go beyond the scope of WCAG 2.1 levels A and AA, and report on significant issues; however, the accessibility regulations do not require organisations to fix these.

Accessibility statement checks

For all test types, we check that an accessibility statement:

  • exists
  • meets the requirements for where it should be located
  • meets the requirements of the accessibility regulations
  • contains correct and up to date information based on our findings