Help To Save - alpha

The report from the alpha assessment for HMRC’s Help to Save service on 15 November 2017.

From: Central Digital and Data Office
Assessment date: 15 November 2017
Stage: Alpha
Result: Met
Service provider: HMRC

The service met the Standard because:

  • User research is thorough and firmly embedded into the team and process. Key user groups have been identified and detailed personas created.
  • The service has been significantly iterated throughout the alpha period and the improvements are based on research findings, with no resistance shown to throwing old versions of the prototype away.
  • A multi-discipline team is in place, working in a collaborative and iterative fashion, with a clear roadmap and a prioritised backlog.

About the service


The service enables working people on low incomes to build up their savings.

Service users

The users of this service are working people on low incomes who are in receipt of either Working Tax Credits or Universal Credit.


User needs

There is a clear policy intent for the service - to encourage people to get into a savings habit. (Over 8 million people have no savings. 13 million don’t have enough to support themselves for a month if their income drops by 25%.) The policy is aimed at lower-income families, and approximately 4 million people will be eligible to use the service.

From initial user research, key user groups have been identified and detailed personas were created. User research is thorough and firmly embedded into the team and process. There is a full time user researcher facilitating all of the usability testing sessions, who feeds back to the team via a face-to-face meeting and who documents the main findings.

User research and usability testing is taking place within each 3-week cycle. The team has run 17 rounds of usability testing across 12 different locations in the UK. A variety of methods have been used, including user interviews and prototype testing on different devices, taking place in usability testing labs, shared spaces and home visits. The team has a user research plan going forward and will be conducting more contextual research in users’ homes, allowing users to use their own devices and to diarise their experiences.

The service team has been using a third-party recruiter to ensure users with assisted digital needs are included in their user research, and 16 assisted digital customers have formed part of the usability testing. The most challenging users were those with literacy and numeracy needs, and the team have used their findings to design content that meets these needs. The team demonstrated illustrative examples of where the service design has been iterated based on this feedback.

The most challenging pain point was users understanding the final bonus payment. The team have iterated various designs and content around this, and are conscious it is an area they need to focus on as they move into beta. The panel noted how important it is that users understand the bonus calculations at the beginning of the scheme, as this is the main benefit of using the service. The panel recommends the team continue to focus on this and iterate solutions to meet the needs of users.

The accessibility lead at HMRC has been involved in reviewing the service design, and there will be a full accessibility audit when the service moves into private beta; the team have already been in touch with the Digital Accessibility Centre in Neath to arrange this for February.

Team

The team are successfully working as a multi-disciplinary team despite the challenges of not being co-located and being made up of a core HMRC team and a number of external partners.

The team are working in 3 week sprints, with a clear and effective feedback cycle. They have regular retrospectives and are using them to improve how they work. They have a clear roadmap and a prioritised backlog.

Technology

As the service is in alpha, only the prototype of the service is usually assessed. However, it appears that a large portion of the final back-office is already being written. This is unusual, though understandable, and makes it unclear which components of the architecture are owned by the team and which are part of HMRC’s infrastructure. This distinction will become crucial when the service goes through beta assessment, as each component in scope will need to be examined.

Another risk with designing the back-office early is that, as beta progresses and the service changes, new demands will sometimes be put on the backend, which shouldn’t be so far advanced that it’s difficult to adapt. The panel trusts the team will continue building the back-office with that in mind. This goes both ways (as the team has pointed out about eligibility checking): technical limitations may unfortunately result in a completely different user experience.

The panel welcomes the fact that the prototype was built using the GOV.UK Prototype Kit, which demonstrates an understanding of the design patterns that will be helpful when designing the beta. However, the panel notes that some patterns weren’t used or were used incorrectly:

  • There was no hint text for user input, such as examples of the expected sort code or email address format.
  • Limiting the amount of money to be paid in was done using the browser’s number min/max attributes, which behave inconsistently across browsers and are missing in older ones. Validation should always be done by the backend to provide consistent error messaging.
  • Tables were sometimes used for lists, which would cause screen readers to announce the content incorrectly.
  • The money amount pattern should be used.
  • Callout patterns, e.g. ‘Confirm withdrawal’ (grey boxes), are not consistent with existing GOV.UK patterns.
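On the validation point, backend validation means the limits are enforced regardless of browser support for min/max. A minimal sketch of what server-side checking of a deposit amount could look like is below; the £1 to £50 limits and the error wording are illustrative assumptions for this sketch, not confirmed service rules:

```python
# Illustrative server-side validation of a user-entered deposit amount.
# The £1-£50 monthly limits below are assumptions for this sketch.
from decimal import Decimal, InvalidOperation

MIN_DEPOSIT = Decimal("1.00")
MAX_DEPOSIT = Decimal("50.00")

def validate_deposit(raw: str) -> tuple[bool, str]:
    """Return (is_valid, error_message) for a user-entered amount.

    Runs on the server, so the same rules and error messages apply
    whether or not the browser supports the number input's min/max.
    """
    try:
        # Accept input with or without a leading pound sign.
        amount = Decimal(raw.strip().lstrip("£"))
    except InvalidOperation:
        return False, "Enter an amount in pounds and pence, like 25.00"
    if amount < MIN_DEPOSIT:
        return False, "The amount must be at least £1"
    if amount > MAX_DEPOSIT:
        return False, "The amount must be £50 or less"
    return True, ""
```

Because the check runs server-side, the same error message is shown on every browser, which browser-only min/max attributes cannot guarantee.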

We note that continuing to use the tools and patterns provided by the govuk-elements and govuk-frontend-toolkit packages will complement the team’s skills in building a service that is “mobile-first”, which seems to be particularly important to the users of the service.

We are also concerned about the team’s future plan to integrate the service into a mobile app. While we appreciate that HMRC’s app is popular and it would make sense to add Help to Save as an additional feature, mobile apps are rarely justified because of concerns around accessibility, openness, and maintenance.

Given that the service deals with personal finance and personal data, there should be special attention to security issues. Even though parts of the back-office are already used and presumably security-tested, a new user-facing service opens new potential threats. However, the system will also generate interesting data that is of public interest: for the government to be able to report on the success of the scheme, or for citizens to see their tax money being put to good use. In that respect, opening that data through channels like the Performance Platform is an exciting prospect.

Design

Ideally we would have seen a variety of concepts explored during the early alpha period but the team has focussed on one. The single concept is driven by the policy constraints but more importantly, by users’ mental models around managing money online.

The single concept has been significantly iterated throughout the alpha period and the improvements are based on research findings. The eligibility > registration > enrolment process has been reduced from 13 pages to 4.

The team’s prototype is high-fidelity and clearly a useful research tool. They demonstrated no resistance to throwing old versions away. The team’s wall presentations showed 8 versions of the prototype, mapped to the pain points which have been addressed along the way. Their iterations have no reliance on complex UI.

The prototype presented had a broken journey where users choose to enter their bank details at another time. This was raised as something to remedy and test, as the service is dependent on the user having entered them. It will need to be clearer where returning users go to access the service. Currently, returning users are directed back to the Help to Save guidance pages, where the sign-in link is on the 5th page.

The panel didn’t explore the ‘get help’ journey (the prototype has a broken link) during the service assessment. For the next assessment, the team need to be able to demonstrate how users of each service channel are able to get help when they need it, and who is providing that support.

The service name is user-centric and was tested with Treejack. The team are tracking the completion rate of the journeys within the service. They will also be using diary studies as a more qualitative method.

The panel noted that the start and end of the service was not fully understood. How will users find the service if they are eligible but not looking for it? What does the end of the service look like? How are users informed that the service is ending?

The service is using Worldline as the payment provider as this is an existing payment provider for NS&I. However, the panel recommends using GOV.UK Pay as this would make the journey both more consistent in terms of design, as well as potentially being more cost effective and easier to maintain. Similarly other GaaP services like PaaS and Notify should be investigated and considered.

The team mentioned the possibility of re-skinning the Worldline product to maintain the GOV.UK identity throughout the user journey. The team should bear in mind that if the NS&I part of the journey isn’t on the GOV.UK domain, it can’t use the GOV.UK font, the crest, or call itself GOV.UK.

The team have made plans to explore their users’ accessibility needs and feed them into personas. They will also be looking into video content on GOV.UK. There is a team of accessibility testers at HMRC which the Help to Save team will be working with going forward.

The team is sharing their design ideas and problems with the wider design community and accepting lots of feedback. They have done design spikes (in particular around the problematic 2nd bonus period) and design crits, and are active on the cross-government Slack design community.

Overall the design of the service is simple and user-centric. The team’s visualisations of prototype evolution, policy timeline, rollout strategy and in particular personas, were impressive and helpful. Looking ahead to private beta and their next assessment, the team need to pay special attention to how the service performs when users start managing their own money.

Analytics

The team have considered how to measure the success of their service and have engaged with the Performance Platform team. In addition to the mandatory KPIs, they plan to measure the service against industry data for savings among low income households.

Analytics packages have been considered so that the team can monitor error messages and drop off rates and use these to make improvements to the service.

Recommendations

To pass the next assessment, the service team must:

  • Continue to focus on helping users understand the bonus calculations at the beginning of the scheme and iterate solutions to meet the needs of users
  • Ensure that GOV.UK design patterns are always used
  • Act on the findings from the content review sent alongside this report
  • Ensure that the new user-facing service is fully secure
  • Design and test the journey for users returning to log back into the service
  • Design and test ‘get help’ journeys and confirm who is providing that support

The service team should also:

  • Consider using GOV.UK Pay to make the journey more consistent in terms of design and potentially more cost effective and easy to maintain. Similarly other GaaP services like PaaS and Notify should be investigated
  • Be mindful of the risks of designing the back-office early and ensure that it can adapt as beta progresses and changes
  • Consider alternatives to integrating the service into a mobile app
  • Be able to demonstrate the ‘enter bank details later’ journey
  • Show how users are made aware of the service if they are eligible but don’t know about it or are not looking for it
  • Design and test what the end of the service looks like
  • Prepare for the impact on the service of users managing their own money

Next Steps

You should follow the recommendations made in this report before arranging your beta assessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline n/a
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it n/a
Published 24 July 2018