GOV.UK Design System Live Assessment

The report for GDS’s GOV.UK Design System live assessment on 11 March 2021

Digital Service Standard assessment report

GOV.UK Design System

From: Central Digital and Data Office
Assessment date: 11/03/21
Stage: Live
Result: Not met
Service provider: Government Digital Service

Service description

The GOV.UK Design System provides teams across government with the styles, components and patterns they need to design and build user-centred digital services.

Components and patterns are contributed to the Design System by the cross-government community. Each new component or pattern goes through a validation process before it’s incorporated into the Design System. This reduces duplication and promotes reuse of research and design efforts across departments. This is part of point 13 of the Service Standard.

Using styles, components and patterns from the Design System makes it easier for teams to build services that look like GOV.UK and that are accessible.

The Design System team maintains the platform, provides support, ensures that it continues to meet accessibility standards, and improves and expands the components and patterns available.

Service users

  • people who use public services
  • people who run public services
  • people who build public services
  • people who create design resources for people who build public services

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • user needs and personas are clearly defined and referred to, with a shared understanding shown throughout the team
  • there is a demonstrated understanding of the barriers users face within the service, such as contributors feeling they lack expertise, perceptions that the Design System is limited for internal applications, or incompatible language frameworks
  • research into user needs has included a focus on accessibility and inclusivity

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure a continued focus on user needs for the application as a service. There is a risk that a focus on component design and iteration could eclipse continued development and understanding of the wider user needs for the service itself. Research on needs around guidance, content, navigation, and all aspects of the user journey as a whole should continue in Live, particularly in light of the noted complexity of the journeys through the design system for various users
  • continue to understand barriers to usage of the Design System within primary audiences while the team plans to expand their set of user needs to include non-governmental organisations. The community-led approach is fantastic, and the plans to continue to gauge satisfaction in this forum are great. The team should make sure there is still some attention on those not using the system, and conduct ongoing research into needs to further mitigate barriers such as awareness, support, onboarding and accessibility

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have regularly conducted informational and evaluative user research with a variety of appropriate methods, from inception through the alpha phases
  • the whole team are involved in user research; this came through in other sections as well, with team members referencing examples of how research activity has influenced their work
  • the team proactively shares findings with the community for transparency and accountability (for example, from the recent user satisfaction survey)
  • the team draw on several sources of insight, including feedback from the working group, the user community and service assessment reports, and user research is linked into the defined performance framework

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how user research can support continued iteration of the service as a whole in Live. It was noted that research throughout beta has informed prioritisation and helped to validate thinking, but further research could continue to explore opportunities to improve and iterate on the service itself

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a well-resourced team in place
  • the team have continued to recruit a range of roles that have been reviewed and adapted as the work has changed throughout Beta
  • the team have a clear onboarding and induction process and ensure that new team members are able to learn quickly and effectively about work done previously on the design system

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they are able to demonstrate how their new Performance Analyst is working with designers and researchers in the iteration and improvement of the service

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team use a range of agile methods and offer a variety of engagement and collaboration opportunities, both across the whole team and within smaller groups
  • the team have responded well to the challenges posed by Covid-19 and fully remote working and have adapted their processes and ways of working
  • the team were able to describe how they regularly review and adapt their ways of working to better support the needs of the team and its members

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team respond to feedback and have frequently iterated on areas of their service, such as their support model, to better meet the needs of users and team members
  • there is a clearly defined process within the team for identifying and prioritising new content to be added to the Design System, and which content should be iterated on
  • the team have refined and automated much of the process for releasing new content into the Design System, reducing risk and disruption to users
  • the team have created a hierarchy for and mapped out the user needs for the entire end-to-end user journey through the service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to engage with the community to find better ways to communicate when changes have been made to the content within the design system
  • be more transparent about the process the team follows to prioritise content for development or improvement, and engage the community earlier in this process
  • continue to iterate on the Design System service itself, rather than just the content within it. Ensure that any design and content decisions made to iterate and improve the Design System service are based on a combination of user research, usability testing and analytics

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team relies solely on open, non-proprietary web standards for producing the component library and the guidance around it
  • the way the components and patterns are built and distributed by GOV.UK Frontend and documented in the Design System supports and complements the Service manual and the Government Design Principles
  • the team re-evaluated their tooling and migrated their CI to increase efficiency and reduce costs

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the component library is extremely light on dependencies, with the current ones used only in development
  • the team ran a threat modelling workshop using the STRIDE methodology (covering spoofing, tampering, repudiation, information disclosure, denial of service and elevation of privilege), and defined current and potential mitigations against various threats and attacks
  • no personal data is stored on the server and no cookies are sent to the client

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • for the Design System, the team uses GOV.UK PaaS as a common platform
  • the team relies solely on open, non-proprietary web standards for producing the component library and the guidance around it
  • they’re setting a standard on “what good looks like” which makes it easier to share components across departments

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • they have an effective deployment and publishing environment
  • they test the end-to-end service emulating multiple devices (desktop and mobile), including automated accessibility testing (a sketch of this kind of check follows this list)
  • they spin up and tear down environments with ease, for each potential contribution to both GOV.UK Frontend and the Design System
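
The report does not name the tools the team uses for this testing. Purely as an illustration, the sketch below shows one way automated accessibility checks with device emulation can be wired up, assuming Puppeteer and the axe-core engine via the @axe-core/puppeteer package; the page URL and device choice are hypothetical, not the team’s actual configuration.

    import puppeteer, { KnownDevices } from 'puppeteer';
    import type { Device } from 'puppeteer';
    import { AxePuppeteer } from '@axe-core/puppeteer';

    // Pages and devices to check -- illustrative choices only
    const PAGES = ['https://design-system.service.gov.uk/components/button/'];
    const DEVICES: Array<{ name: string; device?: Device }> = [
      { name: 'desktop' },                                  // default desktop viewport
      { name: 'mobile', device: KnownDevices['iPhone X'] }, // emulated mobile device
    ];

    async function main(): Promise<void> {
      const browser = await puppeteer.launch();
      let violationCount = 0;

      for (const url of PAGES) {
        for (const { name, device } of DEVICES) {
          const page = await browser.newPage();
          if (device) {
            await page.emulate(device);
          }
          await page.goto(url, { waitUntil: 'networkidle0' });

          // Run the axe-core accessibility engine against the rendered page
          const results = await new AxePuppeteer(page).analyze();
          for (const v of results.violations) {
            console.error(`${url} [${name}]: ${v.id} - ${v.help}`);
          }
          violationCount += results.violations.length;
          await page.close();
        }
      }

      await browser.close();
      // A non-zero exit code fails the build, so regressions block a release
      process.exitCode = violationCount > 0 ? 1 : 0;
    }

    main();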

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses GOV.UK PaaS for keeping the service live, benefiting from the support and maintenance provided by the common platform

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an exemplary approach to ensuring all the components and patterns included in the Design System meet access needs, through both compliance with accessibility regulations and usability testing. The team often go beyond what could be considered the normal expectations for accessibility, ensuring any services built with the Design System are as inclusive and accessible as possible
  • the team has submitted the Design System for multiple accessibility audits and have identified a number of issues that have now been fixed in order to make the site fully compliant with accessibility regulations. The team has also put together a detailed accessibility statement
  • the team has a strong understanding of the entry points into their service and demonstrated how their service had been designed to support these
  • the team have identified multiple non-linear end-to-end journeys through their service, both through an understanding of user needs and through analysis of the issues raised through their support channels
  • the team are making good use of their support channels to identify opportunities to improve their service
  • the team are aware that a dependence on GitHub might present a barrier for some users, and are using their support channels to provide an alternative way for users to interact with the parts of their service that would otherwise be inaccessible

What the team needs to explore

Continuing into live, the team needs to:

  • ensure that any design and content decisions that impact the Design System are based on user research, usability testing and analytics. Based on the research repo, it looks like the majority of usability research conducted on the Design System occurred during the earlier phases of the development of the service. The team have already recognised a need to run usability testing on the Design System to look at issues around navigation, but they should expand on this so that future updates to the design and content of the Design System (rather than the elements contained within) continue to be assessed directly with users through usability testing
  • conduct usability testing on the Design System service with users who have access needs or use assistive technologies. The team has already done a lot of work to ensure the service meets accessibility regulations, and frequently conducts accessibility research on the patterns and components in the design system. It was not clear however that they have conducted any usability research on how their primary users use the Design System itself (with a focus on accessibility)
  • conduct further research to identify opportunities to expand on the documentation included in the Design System. The team should explore whether providing more detailed information about accessibility, user research or the technical implementation of each component or pattern could benefit users
  • build a deeper understanding of a user’s end-to-end journey through the service, based on a combination of research and analytics. The team demonstrated a strong understanding of the various activities a user might perform within their service, but it was not clear they had a good understanding of how users transition between these activities, where they loop between them, and where they may exit the service

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team clearly puts a large amount of effort into prioritising, developing and iterating all the styles, components and patterns within the Design System, which ultimately helps the users of their service meet this point of the service standard when developing other government services
  • the team are aware of how users search for components and patterns, and have consequently introduced elements that are not currently included within the Design System, such as navigation menus and search with autocomplete, to support these behaviours
  • where the team is unable to support the development of variants of the Design System for use in different design tools or development environments, they are making it simple for users to discover and access resources developed by the wider cross-government community

What the team needs to explore

Moving into live, the team needs to:

  • continue to explore ways to create greater transparency around the process followed by the team to prioritise, develop or iterate each style, component and pattern in the Design System. The team has clearly already recognised the importance of doing this, and are currently exploring how to make the contribution process clearer and more inclusive
  • continue exploring how to improve their approach to capturing, documenting and publishing the user research that has contributed to the development of a component or pattern within the Design System
  • continue exploring how common elements used on the GOV.UK website and the Design System, such as the icons used for the detail and accordion components, can be made more consistent
  • continue to engage with other government departments, especially those which have their own design systems, in order to identify efficiencies which could increase the speed at which components and patterns are prioritised, developed and iterated

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have created processes to help support users with assisted digital needs, for example completing steps in GitHub on behalf of users where required
  • the team have conducted research with users of the service and have worked to ensure the service meets their needs
  • the team hold regular meetings with owners of other design systems to learn from others and share their own knowledge and experience

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to conduct research with and test the service with those using the Design System, to ensure it continues to meet their needs
  • regularly assess whether their support processes for assisted digital users are sustainable, or whether they need iterating to ensure they do not become too time-consuming for the team

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • a variety of methods are being used to collect feedback from users
  • a range of data sources are being used, appropriate to this service
  • there is a plan for which analytics are needed going forward, with dedicated analytical resource to focus on this; however, there was a gap in analytics during beta without that dedicated resource

16. Identify performance indicators

Decision

The service did not meet point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • a performance framework has been developed
  • engagement is being measured
  • volume of support requests is measured

What the team needs to explore

Before their next assessment, the team needs to:

  • analyse website usage (as the team had already planned in their next steps) and use evidence from data on the user journey to identify pain points that can be iterated. Insights from research should be gathered to complement the evidence from the data. The team should understand the baseline and how changes will be measured, then measure the impact of the changes to show whether or not they were successful, making more use of A/B testing of different options (a minimal sketch of such a test follows this list)
  • if possible, understand more about the volumes and groups that aren’t using the Design System, to improve the team’s knowledge of the whole potential population. A lot is known about current users, and research has looked at new users (for example, designers new to their roles), but there are potential users that it is hard to get data on. It would be good to try to quantify the whole potential population. Communicating potentially useful updates to those that don’t currently use the Design System could increase uptake
  • circumstances beyond the team’s control have meant that they haven’t had a performance analyst when needed, so others have picked up some of the essential analytical tasks, which is good. There is also a lot of data being collected, but it wasn’t clear that data and evidence were being used to drive decision making on iterations to the service. The panel recommends making use of the Performance Analyst who is new to the team: to understand issues, to prioritise work to iterate and improve the service based on evidence from data and analytics, and to show how design, research and analytics work jointly on iterations, as the team clearly wanted and intended. This should happen before the service moves into live
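
The report recommends more use of A/B testing without describing any tooling for it. As a hedged sketch only, the following shows how the impact of a single change could be checked with a standard two-proportion z-test; the variant names and counts are made up for illustration.

    // Compare task success between an existing design (A) and a candidate
    // iteration (B) using a two-sided two-proportion z-test.
    interface Variant {
      successes: number; // e.g. users who completed the journey
      trials: number;    // users exposed to the variant
    }

    // Standard normal CDF, using the Abramowitz & Stegun erf approximation
    function phi(z: number): number {
      const x = Math.abs(z) / Math.SQRT2;
      const t = 1 / (1 + 0.3275911 * x);
      const erf =
        1 -
        (((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t +
          0.254829592) * t) * Math.exp(-x * x);
      return z >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
    }

    function twoProportionZTest(a: Variant, b: Variant): { z: number; pValue: number } {
      const p1 = a.successes / a.trials;
      const p2 = b.successes / b.trials;
      // Pooled proportion under the null hypothesis of no difference
      const pooled = (a.successes + b.successes) / (a.trials + b.trials);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / a.trials + 1 / b.trials));
      const z = (p2 - p1) / se;
      return { z, pValue: 2 * (1 - phi(Math.abs(z))) };
    }

    // Hypothetical counts: 42.0% vs 46.8% task completion
    const control: Variant = { successes: 420, trials: 1000 };
    const candidate: Variant = { successes: 468, trials: 1000 };

    const { z, pValue } = twoProportionZTest(control, candidate);
    console.log(`z = ${z.toFixed(2)}, p = ${pValue.toFixed(4)}`);
    // A small p-value (say, below 0.05) suggests the difference is unlikely to be
    // chance; otherwise keep gathering data or retain the baseline design.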

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • this is a non-transactional service, so reporting on the Performance Platform / data.gov.uk is not required
  • high-level figures were quoted in the overview and were known by the team, and costs and savings figures are being measured
  • download trends are being measured, and seasonality and comparison to last year are being taken into account

What the team needs to explore

Before their next assessment, the team needs to:

  • use data and insights to report on the metrics identified in the performance framework, to show evidence on whether user needs are being met

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been shown to, and is supported by, the GDS Director General. This has included advocating for the Design System through actions like sending a letter to all DDaT leaders asking them to adopt the cookie banner component from the Design System

Published 10 January 2022