GOV.UK design system alpha assessment

The report from the alpha assessment for GDS's GOV.UK design system service on 5 June 2017.

From: Government Digital Service
Assessment date: 5 June 2017
Stage: Alpha
Result: Not met
Service provider: GDS

To meet the Standard the service should:

  • Refocus on service design rather than platform development. Expand the scope of the alpha to include needs which are more service-focussed.

For example:

  • I need patterns to be authoritative and evidence-based so I can trust them enough to use them.
  • I need to be able to contribute to the development of design patterns so I can help improve patterns over time.

Exploring needs of this nature in the alpha phase should not require further development of the prototype, but rather additional user research; any approaches identified should inform the end-to-end service design, which will be of benefit in the beta phase.

  • Engage with users in access needs communities to understand their needs and test the alpha service.
  • Ensure the team has the right mix of roles to deliver a service.

About the service

Description

GOV.UK Design System is a place for service teams to find styles, components, and patterns to use in designing government services.

Service users

The users of this service are primarily designers and developers who build digital services in government.

Detail

User needs

Overall the panel felt that although some excellent work had been conducted to understand the needs of these users, the majority of the needs that had been prioritised for focus in alpha were based on the development of a platform rather than the overall service.

The detailed research conducted by the team has enabled them to identify key questions around the use of design patterns - namely how people understand which patterns to use, where they find the patterns, how relevant a pattern is to a particular service, how the patterns can be integrated, how users can contribute to patterns, how users know if a pattern has been updated, and the rationale behind a pattern.

A single platform will only address these needs in part. The team should consider at alpha stage what other service elements are required to work with the platform and what the overall experience would be for users. Potentially, this could include more online or offline products or processes, such as a pattern standard and the practicalities of pattern assurance. These elements should be considered at the same time as platform development in case dependencies arise.

The team developed six user types during alpha, mapping their integration with the community and their skills. Going forward, this will help create empathy for the different types of users and also help with recruitment.

Although the likelihood of people with low digital skills being users of the service is low, the possibility of someone needing to use the service who has lower awareness of GDS and the design community across government is high. In the beta the team should aim to understand more about these types of users and how they can be engaged.

In summary, the panel recommends that, along with exploring the service-based needs further, the team should conduct research with users with access needs and test ideas around updating pages (including how to let users know a service has been updated) before moving into the beta.

Team

The team is a multidisciplinary mix of digital specialists who are supportive of each other and collectively demonstrate a passion and commitment to the task at hand.

The team have stated they would like to expand to include a performance analyst and content designer, which is understandable for platform development. However, GOV.UK Design System is a service, so the team should consider whether they have the right skills in sufficient numbers to deliver the end-to-end service design that is required.

In particular, the panel recommends the team considers additional effort in the technical development, research and product/service areas for future phases, rather than attempting to absorb this work within the existing team.

Technology

The panel were pleased to see the team have explored a number of different technical components as part of developing their solution. Ultimately, the team have decided to abandon the current codebase, though intend to reuse much of the same technology stack.

The team have not implemented any automated testing or deployment pipeline, which is acceptable at alpha; they plan to implement this in beta with the new codebase, which will allow them to iterate confidently.

The team have published their code, but without a licence, so it cannot be reused. Also in support of reuse, the team should ensure the codebase is modular in beta.

The team are still evaluating the hosting solution, and it looks likely they will use the GDS PaaS, though it is understood the platform might not meet all of their needs. However, the team described a positive relationship with the PaaS team, which may help influence the PaaS roadmap to better meet their needs.

Improvements have been made since the workshop in terms of linking to a history of each pattern, and consideration has been given to demonstrating the currency of each pattern. However, there should be a bi-directional link between the version in the design system and the version in the frontend toolkit.

The current search solution caters neither for users without JavaScript nor for typos and spelling mistakes; the panel therefore recommends revisiting the decision to treat search as a browser-based enhancement.
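As an illustration of the kind of typo tolerance a server-side search could offer, the sketch below uses Python's standard-library difflib to fall back to fuzzy matching when no exact match is found. It is a minimal sketch, not the team's implementation: the pattern names and function are hypothetical, and a real service would search a proper content index.

```python
from difflib import get_close_matches

# Hypothetical pattern names; a real index would come from the
# design system's published content.
PATTERN_NAMES = ["radios", "checkboxes", "date input", "error summary", "task list"]

def search_patterns(query, names=PATTERN_NAMES):
    """Return pattern names matching the query, tolerating small typos.

    Exact substring matches are returned first; otherwise fall back to
    fuzzy matching so misspellings like 'chekboxes' still find results.
    Running server-side, this also works for users without JavaScript.
    """
    q = query.strip().lower()
    exact = [n for n in names if q in n]
    if exact:
        return exact
    return get_close_matches(q, names, n=3, cutoff=0.6)
```

For example, `search_patterns("chekboxes")` would still return `["checkboxes"]` despite the misspelling.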

Design

The team has conducted desk research on other pattern libraries, and investigated some of the limitations with similar and pre-existing libraries. They have chosen a format currently used for technical documentation, and have iterated around this.

They have explored a number of iterations within the framework of technical documentation through alpha, predominantly using a left-nav and subsequently introducing an additional horizontal primary nav, which has seen greater success for findability.

The team conducted card sorting and then usability testing to arrive at the current information architecture (IA), and this has gone through several rounds of iteration. The team demonstrated IA improvements they’ve made, however there is scope for further work here, particularly around the ‘patterns’ section. Research has identified ‘pattern’ as an ambiguous label in this context, and including it as a top-level section makes the other navigation labels appear less clearly delineated. The choices made around IA now are likely to have long-lasting impacts, so it’s important to go into private beta with a reasonably mature approach.

The team have faced, and will continue to face, challenges around naming individual patterns. They demonstrated a good approach through tagging and search to improve findability. The team is likely to need continual feedback on how users would name things.

They have also given some thought to how their design approach will work for users of assistive technology, mostly based on how the technical documentation pattern behaved. They are planning to explore and test for accessibility in private beta, but this may not give them enough time and flexibility to change their design approach, if needed.

The team have done some early exploration on how their design approach will work on mobile/smaller screens, but this wasn’t demonstrated in the assessment.

The team have developed content templates for consistent page-level information offerings, based on needs identified through research.

The team hasn’t yet considered routes into the library, aside from redirects from existing resources. Finding and being able to use patterns are two of the main aims of the MVP, so finding the library in the first place seems worthy of exploration in alpha.

There was some discussion around the library needing to scale, particularly to accommodate departmental patterns. Although not discussed in the session, it would be useful for the team to explore whether and how to accommodate content design patterns – which could be cross-governmental and departmental. (Assuming the ambition is to host all patterns in one place.)

Recommendations

To pass the reassessment, the service team must:

  • Continue to iterate the IA, particularly looking at the ‘patterns’ section – e.g. make sure the label conveys these are multi-page patterns, and ensure items in this section are appropriately high level.
  • Explore how the design approach you’ve chosen works on common assistive technologies, including screen magnifiers, as soon as possible.
  • Explore routes into the library, including how users new to government would come to know about it in the first place.
  • Explore ideas around updating pages, including how to let users know a service has been updated.
  • Ensure the publicly available prototype has an alpha service banner in line with GOV.UK guidelines.
  • Explore ideas around versioning in alignment with the frontend toolkit.
  • Re-present with representation from the frontend toolkit team in order to demonstrate an end-to-end service.
  • Explore alternatives to a browser based only search solution.
  • Add an appropriate licence to the existing codebase.
  • Put in place additional development capability to provide peer review assurance; at minimum this should be part-time support.

The service team should also:

  • Make sure responsive designs are evolved enough for the team to be confident that they are feasible, and plan to test these during private beta.
  • Explore including content design patterns.

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Digital Service Standard points

Point  Description                                                                       Result
1      Understanding user needs                                                          Not met
2      Improving the service based on user research and usability testing                Met
3      Having a sustainable, multidisciplinary team in place                             Not met
4      Building using agile, iterative and user-centred methods                          Met
5      Iterating and improving the service on a frequent basis                           Met
6      Evaluating tools, systems, and ways of procuring them                             Met
7      Managing data, security level, legal responsibilities, privacy issues and risks   Met
8      Making code available as open source                                              Met
9      Using open standards and common government platforms                              Met
10     Testing the end-to-end service, and browser and device testing                    Met
11     Planning for the service being taken temporarily offline                          Met
12     Creating a simple and intuitive service                                           Not met
13     Ensuring consistency with the design and style of GOV.UK                          Not met
14     Encouraging digital take-up                                                       Not met
15     Using analytics tools to collect and act on performance data                      Met
16     Defining KPIs and establishing performance benchmarks                             Met
17     Reporting performance data on the Performance Platform                            Met
18     Testing the service with the minister responsible for it                          Met
Published 24 July 2018