The report from the alpha assessment for HMRC's API Developer Hub service on 31 March 2016.
Service provider: HM Revenue and Customs
The service met the Standard because:
There is clear evidence that the service meets a genuine user need, and that the team has done the research necessary to understand that need and fulfil it.
The team has built a service for developers that is clear and easy to navigate. The team has understood the particular needs of the developer audience and has worked to provide guidance and support that enables developers to access and use the services appropriately.
The team has selected and built a technical stack that takes into account the complexity of the underlying service and provides a good, well-structured interface for people accessing government services, without simply exposing HMRC's internal organisation to its users.
About the service
Service Manager: Umer Ehsan
Digital Leader: Mark Dearnley
The API Developer Hub will provide developers with information about the APIs offered by HMRC, documentation on how to use them, and allow developers to register for access to use the test and production APIs.
Detail of the assessment
Lead Assessor: Michael Brunton-Spall
Researching and understanding user needs [points 1, 2, 10]
The assessment panel was impressed by the user research that has been carried out. The service team had a good overall understanding of user needs, and was able to show clear examples of how user research and user needs feed into the design and improvement of the service. In addition to the regular rounds of research, the service team have carried out an accessibility audit and have acted on what they learnt from it.
The team and how it’s working [point 3, 4]
The team are suitably skilled and have grown when necessary. In particular, the team have a dedicated WebOps team member.
The team are empowered and have iterated their processes. The team analysed and selected the tools to deliver the API Hub. When moving from alpha to private beta, the team were able to adjust the length of sprints from 1 to 2 weeks in order to better align with the user research cycle. The team also decided to stop using a physical board because it was duplicating the information in Jira, which was easier to update.
The team are working in a way that’s consistent with the service manual. The team had a clear understanding of their scrum process and ceremonies. The team was mostly co-located but also used a range of tools (Slack, Jira, Hangouts) to ensure regular contact.
The governance model is proportionate and there are clear, measurable goals. The Service Manager sits with the team and regularly attends Show and Tells. The team are using burn up charts to estimate progress and health of the project, which is reported to the Programme Manager weekly. There is monthly engagement with senior management on budget and staffing.
Designing and testing the service [points 12, 13, 18]
The team has a clear idea for a minimum viable product that they can deliver. The team’s focus will be on new APIs and they do not plan to make changes to existing APIs or how they are provided.
The team were not able to provide details of who, centrally in HMRC, will be responsible for evaluating and controlling the appropriate use of the range of APIs published by the multiple service teams at HMRC. While this responsibility may not sit within the team, the panel would like to see this considered further.
The team had made efforts to understand security and privacy issues from the perspective of an end user who accesses their information held by HMRC through third party software. The team’s findings were similar to those documented by large tech companies - that end users infrequently consider the implications of granting third party applications access to their data.
Technology, security and resilience [points 5, 6, 7, 8, 9, 11]
The team have made a good effort to identify the correct technology and tools for the service. They were able to explain clearly how they ensured that their technology was as decoupled from external technology as possible (REST rather than bespoke RPC).
The team was able to explain in detail how the service has been tested, including at its internal service boundaries. Some performance testing has been done and we recommend that the team make this a regular part of their workflow. In addition, we recommend that the team adopt automated checking for zero-downtime deployments as part of their release process.
The team demonstrated their approach to security concerns and threats. They were able to explain how they have taken steps to mitigate some attacks (for example, through their gated access).
The team have monitoring in place to detect when the service is temporarily unavailable, and understand their relationship with HMRC's WebOps team.
The team was not able to demonstrate how the service would protect users from erroneous uses of the APIs; the current approach puts the onus on the user to understand (and read) the terms and conditions that the software company provides.
The level of security around the use of APIs focussed strongly on technical security and fraud prevention, rather than on protecting users from software that uses their data in ways that could endanger them.
Improving take-up and reporting performance [points 14, 15, 16, 17]
The service has clearly thought about how to best measure the performance of the service at its most basic level and is investigating further ways to understand the performance of the service. The team had considered additional KPIs for the service, such as the types of developers accessing it.
The team has looked at how to migrate people from the existing services onto the new service, and at how best to enable users of various skill levels to succeed with the service.
The team have identified some potential user journeys that they plan to use to inform their data analytics work during public beta. The team also intend to use analytics filters to improve their understanding of how users interact with the service and where they drop out.
The service team should:
Understand what constitutes appropriate uses of an API, which APIs should be used together and which should not.
Gain a more detailed understanding of the service’s security posture and risks to end users that may result because of the broad API offering.
Continue to test terminology and improve their understanding of the needs of developers new to working in the tax industry or with APIs.
Continue to research with end users that use third party software to access HMRC-held data about them, to better understand their data security needs and how best to meet those needs.
Further develop their understanding of the user journeys and use the larger user base in public beta to find opportunities for research to help with this.
Develop a plan for out of hours support.
Develop a plan for transferring knowledge from contractors to HMRC employees.
Digital Service Standard points
Published: 13 April 2017
Assessment date: 31 March 2016