Build with data (registers) alpha assessment

The report from the alpha assessment for GDS's build with data service on 6 June 2017.

From: Central Digital and Data Office
Assessment date: 2017-06-06
Stage: Alpha
Result: Met
Service provider: GDS

The service met the standard because of a robust approach to user research, evidence of responding to research findings, and demonstration that the team has explored the difficult edges of the problem space. The assessment panel wrestled with the scope of the service and, while happy to allow the team to proceed, strongly recommend bringing a more end-to-end service to the beta assessment.

About the service

Description

The service makes it simple for people building government services to rely on authoritative government data sets.

Service users

The users of this service are people building government services who need to use authoritative government data.

Detail

User needs

The team has a dedicated user researcher. There is strong evidence that a significant amount of user research has been carried out to identify the potential users of this service and their needs. The team acknowledged that its initial assumption of users being purely technical had needed to be challenged, and a much broader group of users is now being considered.

The user researcher is using a ‘jobs to be done’ approach, which is understood by the whole team and suits the nature of the service; a good rationale for this approach was provided. There is evidence of a broad range of user research methods being used, and a good process for reporting, tracking and sharing the outcomes of the research.

There is a list of user needs, but given the broad range of user groups and the many associated user needs, it’s not clear how routinely the team talks about and refers back to these needs. It is recommended that the user needs are consolidated to make it easier for the service team to consider and refer to them regularly.

As a result of the decision to split the project into two, this service only addresses the needs of one user group (Government Service Teams), with the needs of the other user group (Government Data Owners) being addressed by the Registers Design Authority. It is recommended that for the beta assessment the two teams are assessed as an end-to-end service, to ensure that any potential conflicts between the different user needs have been addressed.

As the service is primarily used by service teams, there is no requirement to provide an Assisted Digital service. However, the team has acknowledged that some users have low digital skills, and has redesigned some elements to take account of this. The team has also put effort into addressing accessibility needs, which is encouraging.

Given that the service is already publicly available, the team should address the confusion caused by marking the service as alpha while marking some registers as beta. It can’t be assumed that users will understand what this means, and it must be made clearer (for non-technical users in particular) that this is an alpha service, and what that means in terms of trusting and using the registers.

Team

The panel were pleased to see a well-staffed, multi-functional team that is not dependent on contractors, and that the team were able to demonstrate they will be able to continue to improve the service.

Technology

The team have a mature existing code base and demonstrated they had explored a large number of technical options in its development.

They have a monitoring solution that notifies the team when the service is unavailable, and they take a best-efforts approach to restoring it.
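
As an illustration only, the sketch below shows the shape of such an availability check: poll a public endpoint and raise an alert when it does not respond. The register URL and the alerting action are assumptions, not the team’s actual monitoring configuration.

```python
# Minimal availability check, for illustration only: the register URL and
# the alerting action are assumptions, not the team's actual configuration.
import urllib.request

REGISTER_URL = "https://country.register.gov.uk"  # illustrative register URL
TIMEOUT_SECONDS = 10

def is_available(url: str) -> bool:
    """Return True if the service responds with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_SECONDS) as response:
            return response.status == 200
    except OSError:  # covers URLError, connection failures and timeouts
        return False

if __name__ == "__main__":
    if not is_available(REGISTER_URL):
        # A real monitor would page the team (for example via a webhook to
        # an alerting service) rather than just printing.
        print(f"ALERT: {REGISTER_URL} is unavailable")
```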

The team have identified that the primary threats to the service mostly involve falsified data, and have produced some sensible mitigations for those risks.

The team have built vulnerability checking into their pipeline, which was good to see.
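
For illustration, a pipeline gate of this kind might look like the sketch below, which fails the build whenever a dependency scan reports findings. The `scan-dependencies` command and its JSON output format are hypothetical stand-ins for whichever scanner the team actually runs.

```python
# Sketch of a pipeline gate that fails the build on vulnerability findings.
# The `scan-dependencies` command and its JSON report format are hypothetical
# stand-ins for whichever scanner the pipeline actually runs.
import json
import subprocess
import sys

def run_scan() -> list:
    # Hypothetical scanner that prints a JSON array of findings and exits 0.
    result = subprocess.run(
        ["scan-dependencies", "--format", "json"],
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    findings = run_scan()
    for finding in findings:
        print(f"{finding.get('severity', 'unknown')}: {finding.get('id', '?')}")
    # A non-zero exit code fails the pipeline stage and blocks the deploy.
    sys.exit(1 if findings else 0)
```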

The team have developed modular code with good test coverage, and have open-sourced it under the MIT licence.

The team demonstrated an effective deployment infrastructure and described further aspirations for what they want to achieve in beta. It is important that the team maintain a like-for-like solution if they do opt to migrate to the GDS PaaS or another hosting solution, in order to preserve the value the team have invested in this.

The team have implemented many open standards and are actively looking to migrate to common government platforms such as GDS PaaS.

The team have not implemented any proprietary solutions that would attract licensing costs or other long-term arrangements.

The team have developed a highly resilient service. However, as consumers naturally come to depend on it, the team should look at how to set expectations better in terms of SLAs, and how they can communicate planned and unplanned downtime to consumers.

The panel recommend the team look to the Accessing GaaP Services work, once it restarts, for authenticating data providers; this would remove the requirement for the team to store and manage users manually.

Design

The scope of the MVP for alpha was based on a hypothesis that users need to access up-to-date information.

The team have already started to consider accessibility during the alpha, and have undertaken a workshop with the GDS accessibility team, as well as testing prototypes using assistive technologies.

The team are testing whether users succeed first time by using the feedback form provided on the register pages.

The team demonstrated they had tried different approaches and iterated the user journeys to registers as well as how the information is displayed on the register home pages. They also showed how they were exploring different service views to meet the needs of non-technical users.

This work should be continued in beta. However, it was not clear how users would navigate the different pages (including product pages), which page would be their entry point, and how this would affect discovery. During beta, work should be done to map out and rationalise user journeys and discovery, including routes from web search and external links.

The use of alpha and beta to describe the maturity of the data is confusing and the team are aware of this issue. Users may not understand whether the beta banner refers to the data or the service. This should be resolved before beginning the private beta and users should be clear about the maturity and stability of the data.

Recommendations

To pass the next assessment, the service team must:

  • Explore ways to notify consumers of new, updated, redacted or removed data in the registry (a starting point is sketched after this list).
  • Ensure that all code is stored in repositories in line with GDS policies. This might include migration into the ‘alphagov’ organisation.
  • Explore ways to set consumers’ expectations around service availability, particularly if developers are linking to registers live or integrating a data refresh into their build pipelines.
  • Implement performance testing, ideally within the delivery pipeline (see the second sketch after this list).
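
As a starting point for the first recommendation, the sketch below shows a consumer polling a register’s append-only entry log for anything added since the last entry it processed. The /entries.json endpoint, its start and limit parameters and the field names are assumptions modelled on the registers API of the time, so treat the exact paths and fields as illustrative.

```python
# Sketch of a change-notification poller for a register's append-only entry
# log. The endpoint, query parameters and field names are assumptions based
# on the registers API of the time; treat them as illustrative.
import json
import urllib.request

REGISTER = "https://country.register.gov.uk"  # illustrative register

def entries_since(last_seen: int, limit: int = 100) -> list:
    """Fetch entries appended after `last_seen` (an entry number)."""
    url = f"{REGISTER}/entries.json?start={last_seen + 1}&limit={limit}"
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.load(response)

if __name__ == "__main__":
    # A real consumer would persist the last entry number it processed;
    # it is hard-coded here purely for illustration.
    for entry in entries_since(last_seen=200):
        # Each new entry records an addition, update or removal in the register.
        print(entry.get("entry-number"), entry.get("key"))
```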
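
For the performance testing recommendation, a simple pipeline check might sample response times and fail the stage when a latency budget is exceeded, as sketched below. The endpoint, sample count and budget are all illustrative.

```python
# Sketch of a simple performance check for a delivery pipeline: sample
# response times for an endpoint and fail the stage when the 95th percentile
# exceeds a budget. The URL, sample count and budget are all illustrative.
import statistics
import sys
import time
import urllib.request

URL = "https://country.register.gov.uk/records.json"  # illustrative endpoint
SAMPLES = 20
P95_BUDGET_SECONDS = 0.5  # illustrative latency budget

def sample_latency(url: str) -> float:
    """Time one full request, including reading the response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    latencies = [sample_latency(URL) for _ in range(SAMPLES)]
    p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    print(f"p95 latency: {p95:.3f}s over {SAMPLES} samples")
    # A non-zero exit code fails the pipeline stage if the budget is blown.
    sys.exit(1 if p95 > P95_BUDGET_SECONDS else 0)
```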

The service team should also:

  • Write some blog posts and put together some talks to share with other teams some of the interesting user research techniques used, particularly the ‘jobs to be done’ approach and natural language processing of qualitative interviews.
  • Consider presenting with representation from the data.gov.uk team.
  • Write some guidance about how the code could be reused.
  • Strongly consider presenting for a beta assessment as a united service with the Registers Design Authority to demonstrate an end-to-end service that has taken into account all the user needs from both ends. This will make it more likely that a beta assessment will be successful.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 24 July 2018