Help To Save live assessment

Low-income people or families who want an incentive to save regularly and get into a habit of saving.

From: Central Digital and Data Office
Assessment date: 13 February 2019
Stage: Live
Result: Met
Service provider: HMRC

Service description

Low-income people or families who want an incentive to save regularly and get into a habit of saving. For these savers, the incentive is a 50% bonus at years 2 and 4.

Service users

UK residents and Crown servants abroad can use the Help to Save service if they are in receipt of Tax Credits or Universal Credit and have household earnings at least equivalent to 16 hours at the National Living Wage.

1. Understand user needs

Decision

The team met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have a broad and thorough understanding of their users’ needs. During private beta, the team constantly iterated the service based on user testing with 44 users, using a combination of face-to-face and telephone interviews. The team have also surveyed 508 users.
  • The service team are using personas to segment and draw out user needs, and these have been used throughout the development of the service. The team demonstrated that they could evidence all of the user needs. The team’s understanding that they needed to change content and the transaction to build a savings mindset and get users saving more was particularly impressive.

What the team needs to explore

Before their next assessment, the team needs to:

  • Do more work with the customer service desk to ensure that the service is accessible to users who cannot use the digital service and need someone to help them open an account.
  • Ensure that the accessibility issue identified in the Worldline payment functionality is fixed in October, and that anyone using the service is given additional support while the issue remains live. The panel was reassured that the issue was a small one.

2. Do ongoing user research

Decision

The team met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have a regular, planned user research schedule and expect to have funding for a high level of iteration after they go live. They intend to continue work on a backlog of ideas and research.
  • The team did more research on withdrawals, users who don’t have Government Gateway, ineligible users and rejected payment journeys.
  • They have tested the offline processes and plan to work closely with organisations that support users, such as Citizens Advice.

What the team needs to explore

Before their next assessment, the team needs to:

  • In beta, the team were encouraged to test with Universal Credit users, as they may have unique issues with things like managing multiple accounts. The team’s analysis shows that Working Tax Credit users are representative of these users, as they are being moved over to Universal Credit and would have Government Gateway accounts. The panel would encourage the team to consider users of Universal Credit who would expect to be able to use their digital identity through GOV.UK Verify.

3. Have a multidisciplinary team

Decision

The team met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • The service team are multi-disciplinary and co-located. This was very evident and ensures that the team can iterate the service on an ongoing basis. This will be especially important once the service is live, allowing them to continuously improve it.
  • The team are fully empowered to prioritise the backlog and iterate the service against user needs. The team demonstrated that they have autonomy, and the Digital Service Manager gives them cover to develop the service without impediment from the board. The team have a very knowledgeable Product Owner who understands the backlog, including the dependencies on other parties contracted to deliver the solution: the HMRC back end services team, the DWP Universal Credit team, and the NS&I front end and back end service teams.
  • The team have a dedicated Performance Analyst who works closely with the Product Owner and User Researcher to ensure that analytics are used to improve the service. This was evident when the team demonstrated how they used analytics to inform the development of the Digital Service. The team have a good rhythm to the sprints and have ensured that user research and analytics are at its heart. Significant work has also been done with the content to educate users on how to save on an ongoing basis.

What the team needs to explore

Before their next assessment, the team needs to:

  • The team are working well with other services that they depend on, such as NS&I, and showed detailed work on how they could change pages under NS&I’s control. The team need to ensure that all areas of content and the journey are within their control, so that they can change all elements of the service without impediment.
  • The team should think about knowledge transfer from contract staff to permanent staff, and how that can be woven into service contracts to build digital capability and change the ratio of contract to permanent staff in favour of permanent staff.

4. Use agile methods

Decision

The team met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • The team are using agile ceremonies to allow them to manage backlog items and iterate and test on an ongoing basis. These include daily stand-ups, sprint planning, retrospectives and show and tells which are well attended with senior level buy-in.
  • The team are empowered to manage the backlog and report into a Service Board through the Digital Service Manager.
  • The agile nature of the team with no impediments to release will allow them to iterate against user needs when the service goes into Live.

What the team needs to explore

Before their next assessment, the team needs to:

  • Think about embedding policy colleagues in the digital service team. This will allow policy to see user needs coming downstream and allow policymakers to make policy based on user needs.
  • Test the Verify journey to establish whether it can be considered for use in the service instead of the HMRC alternative.
  • Ensure that the helpdesk processes and triage alert the service team daily, rather than weekly, to any problems that occur in the live service. This will allow the service team to resolve problems quickly.

5. Iterate and improve frequently

Decision

The team met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • The team worked well together, combining user research, performance analysis, content and design. The team clearly demonstrated how they understood problems, made changes and measured the success of those changes.
  • It was good to hear that the team are improving the experience for back end processes and internal screens for call handlers.
  • The team showed many iterations of the product, demonstrating that they iterate frequently to ensure that users can use the service and complete the transaction first time. A user satisfaction rate of 93% added further evidence of this.

6. Evaluate tools and systems

Decision

The team met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has adopted MDTP, which is a tried and trusted HMRC common platform - the overall architecture of the system and the tools have been guided by that.
  • Not much has changed since the beta assessment. The team has continued improving things, for example making the OAuth token mechanism more robust and the code pipeline more efficient.
  • The team is well aware of the dependencies they have on other systems and is happy with the SLAs those systems provide.
  • They are aware of the minor usability issues in Worldline (their payment gateway) and have contacted the supplier to get a commitment to improve them.

7. Understand security and privacy issues

Decision

The team met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has proven to be extremely conscious of security and privacy. Both the HMRC and NS&I subteams follow best practices, such as running independent penetration tests and threat modelling sessions at regular intervals or when the system architecture changes.
  • The service runs a fraud detection engine and transaction monitoring. It uses Akamai to prevent DDoS attacks. Other relevant features are ZAP tests, HTTP security headers and encrypted cookies.
  • The service relies on external trusted services for user registration and login and employs multi-factor authentication.
  • The team is educating users about the risks of cyber attacks.
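
For illustration, HTTP security headers of the kind described above typically look like the following. These are example values only, not the service’s actual configuration:

```text
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Content-Security-Policy: default-src 'self'
Set-Cookie: session=<encrypted session data>; Secure; HttpOnly
```

Headers like these instruct browsers to use HTTPS only, block content-type sniffing and framing, restrict where page resources can be loaded from, and keep session cookies out of reach of client-side scripts.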

What the team needs to explore

Before their next assessment, the team needs to:

  • The team should collaborate more closely with the HMRC Cybersecurity team to be fully prepared for potential attacks - for example, an attacker may try to steal financial information via phishing emails or other means.

8. Make all new source code open

Decision

The team met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • The team is genuinely working in the open. They publish their code on GitHub and allow people to collaborate.

What the team needs to explore

Before their next assessment, the team needs to:

  • The copyright owner is missing in the LICENSE files of the GitHub repositories - it should be “Crown Copyright”.
  • NS&I should work on releasing the less sensitive code (e.g. front end) as Open Source.
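
For illustration, and assuming the repositories use the Apache 2.0 licence (common for HMRC code published on GitHub), the missing notice would look something like:

```text
Copyright 2019 HM Revenue & Customs (Crown Copyright)

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
```

The year and exact wording are examples; the key point is that the copyright owner line at the top of each LICENSE file should not be left as the template placeholder.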

9. Use open standards and common platforms

Decision

The team met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • The team is making extensive use of RESTful APIs and is using common components, such as Government Gateway, the GOV.UK Frontend and GOV.UK Notify.
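
As a sketch of the kind of RESTful integration mentioned above, GOV.UK Notify’s v2 API accepts a JSON body for email notifications like the one built below. The endpoint is Notify’s published v2 email endpoint; the template ID and personalisation fields are invented for this example and are not the service’s real values:

```python
import json

# GOV.UK Notify's published v2 email endpoint (real requests also
# need an Authorization header with a JWT derived from an API key).
NOTIFY_EMAIL_ENDPOINT = "https://api.notifications.service.gov.uk/v2/notifications/email"


def build_email_request(email_address: str, template_id: str, personalisation: dict) -> str:
    """Build the JSON body for a GOV.UK Notify v2 email notification."""
    body = {
        "email_address": email_address,
        "template_id": template_id,
        "personalisation": personalisation,
    }
    return json.dumps(body)


# Hypothetical values for illustration only.
payload = build_email_request(
    "saver@example.com",
    "11111111-1111-1111-1111-111111111111",
    {"first_name": "Sam"},
)
```

In practice a service team would use the official `notifications-python-client` library rather than constructing requests by hand; the sketch only shows the shape of the payload that crosses the API boundary.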

What the team needs to explore

Before their next assessment, the team needs to:

  • The team should liaise with DWP and GDS regarding a possible integration of GOV.UK Verify as an alternative method for the identity verification of users. The majority of users of the Help to Save service would be users of Universal Credit (UC) as well, and UC supports Verify. We should try to limit cases where users need to go through identity verification twice. This would be a major piece of work, as the MDTP platform does not support Verify at the moment.
  • If an opportunity comes up where the benefits outweigh the cost, the team should consider a migration from Worldline to GOV.UK Pay.

10. Test the end-to-end service

Decision

The team met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • Both the HMRC and NS&I teams have automated end-to-end tests in place and are also able to test the whole end-to-end journey.
  • They have a projection of the number of users in both the short and longer term. They have run performance tests and sized the infrastructure to account for around 5 times the peak load. During public beta, they had spikes of traffic and registrations and handled them without a problem.

What the team needs to explore

Before their next assessment, the team needs to:

  • The team should carry out even more testing of the whole user journey including the identity verification part. It would be interesting to engage with users who access Universal Credit via Verify to understand the impact of having to do identity verification twice.

11. Make a plan for being offline

Decision

The team met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have developed several custom “shutter pages” for when the system is unavailable, depending on the type of impact for the user. The team uses Pagerduty for their escalation policy in the case of an outage or incident.
  • The service is also available by phone or via regular online banking.
  • Support is available 24/7 in the case of high priority incidents detected by the service’s monitoring tools.
  • Most changes can be deployed with zero downtime. A particular class of changes makes the banking system unavailable for a prolonged period of time - however, those changes are rare, the team is fully aware of them, and the service can still be accessed via online banking during that time.

12. Make sure users succeed first time

Decision

The team met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have been able to influence policy based on user research by sharing feedback on the bonus, unhappy paths like the ineligible journeys, and the appeals process with HM Treasury. In one example, the policy was changed to allow the 280 users who mistakenly closed their accounts to reopen them.

What the team needs to explore

Before their next assessment, the team needs to:

  • The team should continue to conduct regular accessibility reviews and deliver on recommendations from the DAC accessibility audit.

13. Make the user experience consistent with GOV.UK

Decision

The team met point 13 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • After registration the URL switches to hts.nsandi.com/account. There is a piece of work underway at the moment to change this to help-to-save.service.gov.uk/account, with a target date of March 2019. This is important for a consistent user experience, as the GOV.UK branding can only be applied to .service.gov.uk domains.

14. Encourage everyone to use the digital service

Decision

The team met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • The team demonstrated that they were encouraging users to use the digital service and linking in well with Universal Credit to ensure that users had no barriers to using the service. They focused on content and nudging to encourage users to use the digital service regularly to save, thereby changing behaviours and saving habits.
  • The team were working closely with Communications and Social Media teams to create awareness of the service. They also got really good exposure when Help to Save was endorsed on the Martin Lewis Money Show, and the service team saw a surge in traffic when the show was aired.

15. Collect performance data

Decision

The team met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • The use of data in many areas of the team to understand the user journey and drive improvements to the service was impressive.
  • The service team clearly demonstrated the use of data in understanding user behaviour, identifying service issues and driving improvements to the service. They have gathered and used data from a diverse range of sources intelligently to provide a very complete and detailed picture of performance across the user journey, both online and offline. The tools that they use for the collection of data have had the necessary governance and are appropriate for the task of gathering user insight.
  • Security of data is part of the processes embedded within the digital performance analyst role and data is regularly audited to prevent the unnecessary collection of PII (personally identifiable information).

What the team needs to explore

Before their next assessment, the team needs to:

  • While in Live, the team should look at the personas and see whether take-up by each persona was as expected. If not, they should investigate the reasons and take appropriate actions.

16. Identify performance indicators

Decision

The team met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have demonstrated their performance measurement plans where the success measures for the service are outlined and metrics and targets developed for their measurement. They have gone beyond the 4 mandatory KPIs to have a suite of KPIs that measure many aspects of the user journey and success against user needs and policy objectives.
  • A wide range of data sources, both online and offline are used to measure performance against the range of KPIs and this has enabled the team to drive positive improvements to the service and also to the user journey.
  • Performance against the KPIs is monitored, reported and shared regularly across the team and with stakeholders.

What the team needs to explore

Before their next assessment, the team needs to:

  • Use some A/B testing to assess changes to the service before deploying them to all users of the service.

17. Report performance data on the Performance Platform

Decision

The team met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have established a dedicated page for the service on the Performance Platform and are reporting regular trend data for 3 of the 4 mandatory KPIs. Cost per Transaction is yet to be reported, but a metric has been developed.

What the team needs to explore

Before their next assessment, the team needs to:

  • Report Cost per Transaction once sign off on the metric has been completed.

18. Test with the minister

Decision

The team met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have tested with the minister, who fully endorsed the digital service, giving senior-level buy-in at the very top. This is great to see.

Updates to this page

Published 6 March 2019