The Digital by Default Service Standard is a set of criteria that digital teams building government services must meet. Meeting the standard will mean digital services are of a consistently high quality: easy to improve, safe, secure and able to fulfil user needs.

The standard:

  • needs to be met by all new or redesigned transactional government services going live after April 2014
  • has to be maintained after a government service has gone live
  • aims to make digital services so good that people prefer to carry out the transaction online rather than by phone, post or in person

The service manual will help digital teams to meet the standard and to select people with the skills they need.

If a service doesn't pass its Government Digital Service (GDS) assessment, it won't be awarded the standard and won't appear on GOV.UK.

The Digital by Default Service Standard was a commitment the government made in its Digital Strategy.

Teams must meet the criteria below, and maintain this quality for the full life of their service.

The criteria

  1. Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for digital and assisted digital service design.
  2. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

  3. Evaluate what user data and information the service will be providing or storing, and address the security level, legal responsibilities, and risks associated with the service (consulting with experts where appropriate).

  4. Evaluate the privacy risks to make sure that personal data collection requirements are appropriate.
  5. Evaluate what tools and systems will be used to build, host, operate and measure the service, and how to procure them.
  6. Build the service using the agile, iterative and user-centred methods set out in the manual.
  7. Establish performance benchmarks, in consultation with GDS, using the 4 key performance indicators (KPIs) defined in the manual, against which the service will be measured.

  8. Analyse the prototype service’s success, and translate user feedback into features and tasks for the next phase of development.

  9. Create a service that is simple and intuitive enough that users succeed first time, unaided.

  10. Put appropriate assisted digital support in place that’s aimed towards those who genuinely need it.

  11. Plan (with GDS) for the phasing out of any existing alternative channels, where appropriate.

  12. Integrate the service with any non-digital sections required for legal reasons.

  13. Build a service consistent with the user experience of the rest of GOV.UK by using the design patterns and the style guide.
  14. Make sure that you have the capacity and technical flexibility to update and improve the service on a very frequent basis.

  15. Make all new source code open and reusable, and publish it under appropriate licences (or give a convincing explanation as to why this can’t be done for specific subsets of the source code).
  16. Use open standards and common government platforms (eg GOV.UK Verify) where available.
  17. Be able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices. Use dummy accounts and a representative sample of users.
  18. Use analytics tools that collect performance data.
  19. Build a service that can be iterated on a frequent basis and make sure resources are in place to do so.

  20. Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users.
  21. Establish a benchmark for user satisfaction across the digital and assisted digital service. Report performance data on the Performance Platform.
  22. Establish a benchmark for completion rates across the digital and assisted digital service. Report performance data on the Performance Platform.
  23. Make a plan (with supporting evidence) to achieve a low cost per transaction across the digital and assisted digital service. Report performance data on the Performance Platform.
  24. Make a plan (with supporting evidence) to achieve high digital take-up, and provide assisted digital support for users who really need it. Report performance data on the Performance Platform.
  25. Make a plan for the event of the service being taken temporarily offline.

  26. Test the service from beginning to end with the minister responsible for it.
