Register and maintain a trust alpha assessment

From: Central Digital and Data Office  
Assessment date: 31 July 2019  
Stage: Alpha  
Result: Met  
Service provider: HMRC  

Service description

This service is required as part of the EU’s 4th Anti-Money Laundering Directive (4MLD). There is a legislative requirement to hold a central register of trusts with a UK tax liability. This will allow the data to be shared between member states to tackle money laundering and terrorist financing.

Currently, the service provides online access via iForm functionality, allowing individuals (implemented from July 2017) and agents (implemented from October 2017) to register a trust or estate.

The existing registration process will be superseded by a microservice that will allow users to:

  • register a trust or estate
  • view the data held by HMRC
  • change and update that data
  • declare annually that the information provided is accurate and up to date
  • close a trust or estate.

Service users

The users of the service are individuals (20% of users) who need to register a trust or estate, and agents (80% of users) acting on behalf of clients who need to register a trust or estate.

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed a good understanding of their users’ needs and demonstrated how these were reflected in the service
  • the team described users’ pain points with the existing iForm and showed how the alpha design addressed these.

What the team needs to explore

Before their next assessment, the team needs to:

  • be clear whether they are representing a group of users or just one, so findings are not misrepresented. The team had responded to their workshop recommendations to look at their personas. However, it was still unclear to them and the panel whether they had personas, case studies or a combination of the two. Demographic information should be used with caution on personas so as not to suggest it is about one person. For example, if age is a relevant factor in people’s behaviour, a range would be a better way to represent it. The summary document, which shows the scale of people’s knowledge about trusts and their digital capability, was an excellent way of presenting how people are similar and different
  • ensure all their user needs represent the actual user need, not a solution. For example, ‘I need to print’ probably comes from people’s mistrust of submitting information to government. Although the team have good evidence of some users’ reliance on paper, presenting solutions as needs can limit the design team where there may be a better way to meet that need, both now and in the future.

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had made good efforts to address their workshop recommendations around broadening research in terms of location, accessibility and so on. It is important for a successful beta phase that this approach continues
  • the plan to combine the betas for registration and variations is a good one. This will help the team establish whether the end-to-end service works for users
  • the team described good plans for private beta, including working closely with the new Performance Analyst.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they continue to undertake a proportionate amount of research with individual users. Working with the contact centre on recruitment should support this
  • ensure they undertake face-to-face, contextual research wherever possible. It was clear this had the most impact on the team in terms of learning
  • recognise that the end-to-end journey is very long, and that ensuring users can complete it and that it meets their needs is a challenge. Scenario tests are a useful tool, but the team needs to be mindful of their limitations and ensure they track real user journeys through the private beta service as much as possible.

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a wide range of roles within it, including a dedicated product manager, user researcher, user experience designer and content designer
  • the service manager seems very engaged in the service and part of the team. She was able to answer questions at varying levels of detail and is clearly a champion of the service with stakeholders, protecting the team from the different interested parties that could cloud the decision-making process
  • the team are ensuring that the different disciplines work together and share their insights across the design and development process.

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the whole team is involved in the design and development of the service, particularly through playing back videos of the research sessions and through the Friday morning prototype walkthroughs, which enable everyone to discuss and understand the direction of the service
  • the product manager seems very open to challenge on his prioritisation decisions and could give examples of those priorities changing on receipt of research or feedback.

What the team needs to explore

Before their next assessment, the team needs to:

  • take stock of how much they are changing their ways of working each sprint. While it is fantastic that they are using their retrospectives to feed back on ways of working, constant chopping and changing could have an unintended impact on the team’s cadence.

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have been through a large number of design iterations and could articulate what changes had been made and what research had driven them
  • the team are dedicated to continuing to iterate and improve the service, making it as simple to use as possible. This is particularly important as the service consists of approximately 250 different screens, and most users will see between 60 and 90 screens while interacting with the service. The team have already identified future ways to reduce the handoffs within the service
  • the team are planning on using the HMRC A/B testing suite to further improve the service, utilising Google Analytics to identify aspects of the service to be tested.

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is using familiar technologies that allow for quick prototyping: Node.js, Nunjucks, MongoDB, Heroku (see the sketch after this list)
  • the service team considered (and ruled out) using the existing electronic form application, which would not have met user needs and scalability requirements
  • the service team considered HMRC’s standard MDTP platform for the alpha phase and ruled it out as it would have slowed down development.
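
As an illustration of the kind of quick prototyping this stack allows, the sketch below renders a single page with Express and Nunjucks on Node.js. It is a minimal sketch only: the route, template name and port are invented for the example and are not taken from the team’s prototype.

```js
// Minimal prototyping sketch (hypothetical, not the team's actual code):
// an Express app rendering a Nunjucks template, the typical pattern for
// quick HTML prototypes on Node.js.
const express = require('express');
const nunjucks = require('nunjucks');

const app = express();

// Point Nunjucks at a local "views" directory and wire it into Express.
nunjucks.configure('views', { express: app, autoescape: true });

// A single illustrative route; the path and template data are placeholders.
app.get('/register-a-trust/start', (req, res) => {
  res.render('start.njk', { serviceName: 'Register a Trust' });
});

app.listen(3000, () => console.log('Prototype running on http://localhost:3000'));
```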

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team worked closely with the designers and researchers from the Security Integration Team to prototype integration with Government Gateway
  • the service team implemented data encryption at rest (MongoDB database) and in transit (HTTPS), and password protected its prototype websites (a sketch of this kind of protection follows this list)
  • the team implemented automated penetration testing on the service to identify and mitigate any OWASP vulnerabilities
  • the team had an external penetration test performed on the registration MVP service
  • the team have integrated transaction monitoring into the solution to reduce potential fraud risks
  • the service team plans to host its beta on HMRC’s MDTP platform, which will provide the required security protection and isolation of backend applications.
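
To make the prototype-level protections above concrete, here is a minimal sketch of password protecting a Node.js prototype and serving it over HTTPS. It is an assumption-laden example: the express-basic-auth middleware, the environment variable name and the certificate paths are choices made for this sketch, not details of the team’s setup.

```js
// Hypothetical sketch: HTTP basic authentication in front of a prototype,
// served over HTTPS so data is also encrypted in transit.
// Credentials and certificate paths are placeholders for illustration only.
const express = require('express');
const basicAuth = require('express-basic-auth');
const https = require('https');
const fs = require('fs');

const app = express();

// Require a username and password before any prototype page is served.
app.use(basicAuth({
  users: { prototype: process.env.PROTOTYPE_PASSWORD || 'change-me' },
  challenge: true,
}));

app.get('/', (req, res) => res.send('Password-protected prototype'));

// Terminate TLS in the prototype itself; in practice a platform such as
// Heroku or MDTP would usually handle HTTPS termination instead.
https.createServer({
  key: fs.readFileSync('tls/prototype-key.pem'),
  cert: fs.readFileSync('tls/prototype-cert.pem'),
}, app).listen(8443);
```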

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team plans to make private beta code open source.

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the alpha version is built using open source technologies: Node.js, Nunjucks, MongoDB, Nginx, Docker
  • the service team used the GOV.UK Prototype Kit
  • communication between microservices is done over standard HTTPS, using REST APIs with JSON payloads (see the sketch after this list)
  • the service team plans to host the beta version of the service on HMRC’s standard MDTP platform
  • the service team plans to integrate the service with the existing authentication and backend (ETMP) services.
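
As a rough illustration of the HTTPS/REST/JSON style of communication described above, the sketch below shows one service calling another service’s JSON endpoint. The host, path and payload shape are invented for the example and do not describe the real trusts or ETMP APIs; it assumes Node.js 18 or later, where fetch is built in.

```js
// Hypothetical example of one microservice calling another over HTTPS,
// posting a JSON body to a REST endpoint and reading a JSON response.
// The URL and payload are placeholders, not the real trusts or ETMP APIs.
async function submitRegistration(registration) {
  const response = await fetch('https://trusts-registration.example/registrations', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(registration),
  });

  if (!response.ok) {
    throw new Error(`Registration API responded with status ${response.status}`);
  }

  // Both sides exchange plain JSON, keeping the integration standards-based.
  return response.json();
}
```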

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is building a suite of contract tests for the RESTful APIs that the trusts microservice uses
  • the team developed acceptance test suites (Cucumber/Selenium) which regularly run smoke tests of the user interface (a simplified example follows this list)
  • the team have contract and integration tested the APIs into ETMP
  • the team continuously run performance testing.
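
To give a flavour of the Selenium-driven smoke tests mentioned above, here is a simplified sketch using the selenium-webdriver package for Node.js. The team’s real suites sit behind Cucumber feature files; the URL, selector and expected heading below are placeholders invented for this example.

```js
// Hypothetical smoke-test sketch with selenium-webdriver: load a page and
// check its main heading. The real suites are Cucumber/Selenium; this strips
// away the feature-file layer to show the underlying browser check.
const { Builder, By, until } = require('selenium-webdriver');
const assert = require('assert');

async function smokeTestStartPage() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    // Placeholder URL, not the team's environment.
    await driver.get('https://example-test-environment/register-a-trust/start');

    // Wait for the main heading and check the page has rendered as expected.
    const heading = await driver.wait(until.elementLocated(By.css('h1')), 5000);
    assert.strictEqual(await heading.getText(), 'Register a trust');
  } finally {
    await driver.quit();
  }
}

smokeTestStartPage().catch((err) => {
  console.error('Smoke test failed:', err);
  process.exit(1);
});
```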

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is planning to use the existing MDTP platform for hosting the service in the beta phase, which would meet the availability requirements
  • the team understands the implications of the service being unavailable. It has gained this understanding through its experience with the existing electronic form and a good relationship with the call centres that handle queries about the service.

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has iterated the service based on user research, in regular and fast cycles
  • the team demonstrated a good understanding of the end-to-end journey
  • the new parts of the journey introduced since the last assurance event followed similar design patterns to previous parts, with the team building on what they already knew worked rather than starting from scratch.

What the team needs to explore

Before their next assessment, the team needs to:

  • make progress on improving how clients approve an agent to act on their behalf. This part of the journey seems the most complicated, with a lot of back and forth between different parties, and it currently relies on HMRC’s ‘agent services’ team. The team should prototype and test their proposed improvements to assess the impact they would have on users’ journeys. This should help them make the case to the agent services team to improve it as a priority.

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have contributed findings on design patterns into the GOV.UK Design System, and been a part of the working group for introducing and approving new patterns.

What the team needs to explore

Before their next assessment, the team needs to:

  • have a clear, verb-based name for the service. It’s worth mentioning that it is absolutely acceptable to iterate the name of the service over time in response to user research and as the scope changes. The current name ‘Register a Trust’ seems to work for now, but could change to cover the other things a user will be able to do in the future.

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • plan how the service will be rolled out and when the legislative obligations will be enforced. Users should be given enough time to start using the service before the obligations are enforced. The team are fully aware of this, but it would be good to see a fully worked-out plan by the next assessment.

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are already using the performance data that is currently collected from the iForm to help prioritise and research the service
  • the team plan to use Google Analytics going forward, with the aim of using it to identify and prioritise A/B tests to run on the HMRC A/B testing suite.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have begun to consider their key performance indicators
  • the team are planning on comparing the new service with the existing form to measure whether it is meeting more needs, for example user satisfaction.

What the team needs to explore

Before their next assessment, the team needs to:

  • work further with the business to try to identify any other indicators that would help assess whether the service was successfully solving the problem for the user.

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had already begun conversations with the GOV.UK Performance Platform Team.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue their engagement with the GOV.UK Performance Platform Team and ensure that they are ready to onboard as soon as the service is mature enough.

18. Test with the minister

Decision

Point 18 of the Standard does not apply at this stage.

Published 5 September 2019