Guidance

Accessibility monitoring: how we test

Find out how the accessibility monitoring methodology is used to test websites and mobile apps under the Accessibility Regulations

Accessibility monitoring methodology

Monitoring looks at the accessibility of a site or mobile app against W3C Web Content Accessibility Guidelines (WCAG) levels A and AA.

The Government Digital Service (GDS) is working on how to assess the new success criteria in WCAG 2.2 and will start monitoring for the extra criteria in October 2024. Until then, we will monitor the accessibility of websites and apps against WCAG 2.1 level AA.

GDS also reviews the accessibility statement that must be published to meet the accessibility regulations.

There are 3 types of testing:

  • simplified testing covers a small sample of pages and uses mainly automated accessibility testing with some manual testing
  • detailed testing covers a wider range of pages, each tested against all relevant WCAG success criteria
  • mobile app testing, which is similar to detailed testing, but across the screens and flows within mobile apps

For all test types, we send a report of our findings to the public sector body, which then has 12 weeks to make fixes and report its progress to us. We reassess and pass the information to the relevant equality body, which makes further decisions on compliance or enforcement action.

Simplified testing process

Summary

Simplified audits examine a small selection of web pages. An automated accessibility checker is run on each page, alongside some quick manual checks.

Preparing for an audit

We look at core pages like the home page and contact page, along with a sample of other pages from across the site. Where possible, we look for different types of pages that contain a range of elements, like tables or different layout styles. We also specifically look for a form, and for PDFs that have an administrative purpose or have been published since 23 September 2018.

Auditing

We use an automated accessibility testing extension for Google Chrome called Axe, created by Deque Systems. We only record serious or critical errors, and we manually check the issues highlighted to make sure they are not just best-practice suggestions.
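
As an illustration of the severity filter described above, the sketch below processes the JSON results object that axe-core produces, keeping only violations whose impact is serious or critical. This is not part of our tooling; only the field names (`violations`, `impact`, `id`) follow the axe-core results format, and the sample issues are invented for the example.

```python
# Illustrative sketch: keep only the serious and critical violations
# from an axe-core results object. Field names follow the axe-core
# JSON output format; the sample issues below are invented.

RECORDED_IMPACTS = {"serious", "critical"}

def record_violations(axe_results):
    """Return the violations whose impact is serious or critical."""
    return [
        v for v in axe_results.get("violations", [])
        if v.get("impact") in RECORDED_IMPACTS
    ]

sample = {
    "violations": [
        {"id": "color-contrast", "impact": "serious"},
        {"id": "region", "impact": "moderate"},
        {"id": "image-alt", "impact": "critical"},
    ]
}

print([v["id"] for v in record_violations(sample)])
# → ['color-contrast', 'image-alt']
```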

We do manual tests for some of the most common barriers to users with accessibility needs that are not likely to be found by automated tests.

This includes tabbing through each page without a mouse to check:

  • the user can navigate to all links and interactive elements without a mouse (WCAG 2.1.1 Keyboard)
  • there are no keyboard traps (WCAG 2.1.2 No Keyboard Trap)
  • the focus indicator around elements is visible (WCAG 2.4.7 Focus Visible)
  • focus order is logical (WCAG 2.4.3 Focus Order)
  • there are no unexpected actions when tabbing (WCAG 3.2.1 On Focus)
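
The keyboard-trap check above can be pictured as following focus from element to element. The sketch below is a simplified model for illustration, not a tool we use: it represents each Tab press as a lookup in a next-focus mapping (the element ids are invented) and reports any cycle from which focus never escapes.

```python
# Hypothetical model of the keyboard-trap check (WCAG 2.1.2).
# Pressing Tab is modelled as a next-focus mapping; "END" is a
# sentinel meaning focus has left the page content normally.

def find_keyboard_trap(next_focus, start):
    """Follow Tab presses from `start`; return the trapped cycle of
    element ids if focus never reaches "END", otherwise None."""
    seen = []
    current = start
    while current != "END":
        if current in seen:
            return seen[seen.index(current):]  # the repeating cycle
        seen.append(current)
        current = next_focus[current]
    return None

# A page where a modal's two controls tab only between each other:
tab_order = {
    "skip-link": "nav",
    "nav": "modal-ok",
    "modal-ok": "modal-cancel",
    "modal-cancel": "modal-ok",  # trap: focus never leaves the modal
}
print(find_keyboard_trap(tab_order, "skip-link"))
# → ['modal-ok', 'modal-cancel']
```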

Each page is viewed at different zoom settings up to 400%, and we simulate viewing the page on a small screen. This is designed to check page navigation and readability with no content cut off (WCAG 1.4.4 Resize text and WCAG 1.4.10 Reflow).
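
The 400% figure follows from WCAG 1.4.10, which requires content to reflow without horizontal scrolling at a viewport width of 320 CSS pixels; on a typical 1280-pixel-wide window, 400% zoom gives exactly that width. A small sketch of the arithmetic (the function name is ours, for illustration):

```python
# Why 400% zoom is the reflow test point: zooming divides the
# effective CSS viewport width, and WCAG 1.4.10 requires no
# horizontal scrolling at a width of 320 CSS pixels.

def effective_css_width(window_width_px, zoom_percent):
    """Viewport width in CSS pixels at a given browser zoom level."""
    return window_width_px / (zoom_percent / 100)

# A 1280px-wide window at 400% zoom hits the 320 CSS px reflow target:
print(effective_css_width(1280, 400))  # → 320.0
```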

We look for audio/visual content such as images, videos, podcasts and animations. This is to check:

  • there are accessible alternatives for video-only or audio-only content (WCAG 1.2.1 Audio-only and Video-only)
  • there are no images of text with significant content (WCAG 1.4.5 Images of Text)
  • animations and carousels follow WCAG 2.2.2 Pause, Stop, Hide requirements

There are issues we do not directly look for, but will note if we find them. This includes the page refreshing unexpectedly (WCAG 2.2.1 Timing Adjustable) and audio playing for more than 3 seconds with no way to stop it (WCAG 1.4.2 Audio Control).

We sometimes find usability errors and bad-practice issues that do not fail WCAG. If they are significant, we include them separately in our report, making it clear they are not WCAG failures.

Simplified tests do not uncover every accessibility error, so we encourage organisations to get a full audit.

We look for an accessibility statement and check that this meets the required legislative format. This must include all mandatory wording, list all known accessibility issues and be up to date.
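
A crude sketch of this kind of statement check, assuming a list of phrases to search for: the phrases below are placeholders for illustration only, not the legally required wording, which comes from the government's model accessibility statement.

```python
# Illustrative sketch of checking a statement for required elements.
# The phrases are placeholders, NOT the mandated legal wording.

REQUIRED_PHRASES = [
    "accessibility statement",  # placeholder: statement title wording
    "compliant",                # placeholder: compliance status wording
    "prepared",                 # placeholder: preparation date wording
]

def missing_phrases(statement_text):
    """Return the required phrases not found in the statement."""
    text = statement_text.lower()
    return [p for p in REQUIRED_PHRASES if p not in text]

sample_statement = (
    "Accessibility statement for Example Service. "
    "This website is partially compliant with WCAG 2.1 AA. "
    "This statement was prepared on 1 March 2023."
)
print(missing_phrases(sample_statement))  # → []
```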

Detailed testing process

Summary

Detailed audits take a more in-depth look at a website. They test against the full range of WCAG success criteria to level A and AA. We use assistive technology to check compliance as well as the automated and manual methods used in simplified testing.

Before an audit

Before a detailed audit happens, we contact the public sector body to get information such as:

  • statistics on page usage
  • page template types
  • previous audit reports
  • access to the website or service if it is not publicly available
  • disproportionate burden assessments (if applicable)

Preparing for an audit

We use the information provided to decide on a sample of pages to test. We try to include a range of pages from the website or mobile application such as:

  • some of the most popular pages
  • a range of template types
  • at least one service end-to-end (where possible)
  • any other pages the monitoring body considers need to be tested

The regulations allow previous accessibility audits to be taken into account when deciding the coverage of our monitoring, provided they meet certain criteria, including how recent and how detailed the audit is.

If a previous audit provided by the public sector body is sufficient, it is compared against the sample of pages chosen for testing by the monitoring body. For any page already tested satisfactorily as part of that audit, the regulations require the monitoring body to do only a simplified test. All other pages need a detailed test.

Auditing

The detailed test covers all WCAG success criteria in scope at level A and AA, but remains a sample-based approach and does not provide full coverage of the website.

Mobile app testing process

For mobile application tests, both Android and iOS versions of an application are tested.

Mobile application tests follow the same process as detailed audits and are assessed against WCAG level A and AA success criteria.

What happens after an audit

After the audit, we send the public sector body that runs the website a report detailing the accessibility issues found. They are asked to acknowledge the report and are given 12 weeks to fix the issues.

We send the report to the contacts in the organisations’ accessibility statements. If we cannot find contact details in an accessibility statement, we check the contact page or use feedback forms to request contact information. If we do not hear from organisations after a month and have tried all known methods of communication, we forward information about the website to the enforcement bodies.

After 12 weeks the public sector body is contacted to give an update on their progress. The monitoring body may retest certain parts of the website or mobile application to confirm the issues no longer exist but this does not constitute a full retest of the website or mobile application.

We then make a recommendation on whether we think further compliance action should be taken. An email will be sent to the public sector body to confirm the outcome of the monitoring.

After testing and correspondence

Information about the monitoring process and outcome, as well as our recommendation, is passed to the Equality and Human Rights Commission (EHRC) in England, Scotland and Wales, or the Equality Commission for Northern Ireland (ECNI), who are responsible for enforcing equality and non-discrimination laws. They decide whether any further action is required and will contact the public sector body directly if necessary.

Published 9 February 2022
Last updated 5 October 2023
  1. Added information about WCAG 2.2 and what GDS is doing as a result of the new success criteria in WCAG 2.2.

  2. We've updated the guidance to reflect responsibility moving to Government Digital Service (GDS) from the Central Digital and Data Office (CDDO).

  3. First published.