Check your State Pension: live assessment report

The report from the live assessment of HMRC and DWP's Check your State Pension service on 27 April 2017.

From: Central Digital and Data Office
Assessment date: 27 April 2017
Stage: Live
Result: Met
Service provider: HMRC & DWP

The service met the Standard because:

  • The team successfully brought together HMRC and DWP colleagues to deliver a cross-government service.
  • The team have evidenced their users’ needs through research and shown they have the flexibility to respond to new and changing needs.
  • The team has done the hard work to keep a complex calculation simple for users.

About the service

Description

The Check your State Pension service provides users with information that they can use to plan for their retirement.

Service users

This service is for everyone entitled to a UK state pension.

Detail

User needs

There was strong evidence that the team had run regular user research and used this to improve the service. The team had clearly iterated the service during Beta, making changes as new user needs emerged. For example, the team recognised that people too old to use the service were still trying to, so they changed the design to let them know earlier.

The team recognised a need for users to understand how they could improve their pension situation. They had explored how they could convey this information to users within their service and found that the complexity of the information meant users’ needs were better met by using the service as a prompt for a conversation with a specialist advisor. Research in live should monitor the service’s effectiveness in getting users to appropriate advice to meet this need.

Analytics show a large number of users failing to access the service at the ID verification stage. During live the team should work with GOV.UK Verify and Government Gateway teams to find ways to address this.

Care had been taken to ensure that the service was accessible. The team had relied quite heavily on expert reviews to achieve this with a smaller number of actual users with access needs testing the service. Ongoing research in live should include a greater number of users with access needs.

The team has done well to research with organisations that support users who might need assisted digital help. As the service runs in live, the team should be satisfied that these sources of support continue to be available as they are currently reliant on informal agreements. The team should also work with these organisations to encourage as many users as possible to use the digital service. The team had also done good work to raise the quality of paper statements to be in line with the digital service.

There was an encouraging plan to continue user research into the live phase. The team expressed aspirations for a more holistic approach that places the service in the context of pensions and retirement planning. There is an opportunity to further understand the needs of younger users who are further from retirement, as well as those viewing this information alongside information about other pensions and sources of money for retirement.

Team

This service has been developed by a digital team comprising HMRC and DWP staff co-located in Newcastle. The team are clear on the shared vision for the service and have been empowered to build the service to meet user needs.

The team intend to hand over the live service to HMRC’s business as usual (BAU) team, who have responsibility for other HMRC products on the Tax Platform. The BAU team includes the mix of disciplines expected for live services and DWP intend to supplement this with additional team members. The BAU team will be responsible for small iterations to the service and for supporting the live operation. For bigger changes the service will need to be handed back to a dedicated team. The panel encourage HMRC and DWP to keep the BAU arrangement under review to ensure the depth of knowledge of the users and service is preserved and that it doesn’t deter bigger pieces of work.

The team described the support they’ve received from their management and governance structures, especially as they report to two departments. The team have benefited from positive engagement with policy and other stakeholders throughout development. The panel were particularly impressed to hear the ministers responsible have tested the service and observed user research sessions.

The panel consider this team a great example of cross-government collaboration and service team empowerment. The panel hope the collaboration continues into live and proves a useful model for other services to copy.

Technology

The team has done an admirable job ‘doing the hard work to make things simple’. Pension calculations are complex behind the scenes and the team has worked to hide this complexity from users and focus on user needs. This is especially impressive given the cross-departmental nature of the project.

The service performs well and is regularly tested but the team should anticipate that events may drive significant spikes of user traffic. The testing regime should include periodically testing to failure in order to establish a baseline. As far as possible, non-functional testing should be built into the normal workflow after committing a change. Of course, if the underlying platform is migrated to another cloud provider the team should anticipate that performance will need to be re-tested against the new profile.
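As an illustration only, the sketch below shows one way a ramp-to-failure check could be scripted and run after each change. The endpoint URL, error-rate threshold and ramp profile are assumptions, not the team's actual test tooling.

```python
# Minimal ramp-to-failure load check (illustrative sketch, not the team's tooling).
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINT = "https://example.service.gov.uk/check-your-state-pension"  # hypothetical URL
ERROR_RATE_LIMIT = 0.05   # stop ramping once more than 5% of requests fail
REQUESTS_PER_STEP = 50

def hit_endpoint(_):
    """Return True if a request succeeds within a 10 second timeout."""
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=10) as resp:
            return 200 <= resp.status < 400
    except OSError:
        return False

def ramp_to_failure(max_workers=200, step=10):
    """Increase concurrency until the error rate breaches the limit.

    Returns the last concurrency level that stayed within the limit,
    which serves as the performance baseline to track over time."""
    baseline = 0
    for workers in range(step, max_workers + 1, step):
        start = time.time()
        with ThreadPoolExecutor(max_workers=workers) as pool:
            results = list(pool.map(hit_endpoint, range(REQUESTS_PER_STEP)))
        error_rate = 1 - (sum(results) / len(results))
        print(f"{workers} workers: error rate {error_rate:.1%}, "
              f"{time.time() - start:.1f}s elapsed")
        if error_rate > ERROR_RATE_LIMIT:
            return baseline
        baseline = workers
    return baseline

if __name__ == "__main__":
    print("Baseline concurrency:", ramp_to_failure())
```

A script like this could be triggered from the team's existing pipeline after a change is committed, so the baseline is re-established whenever the platform or its hosting changes.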

The team has understood and planned for the government’s cloud-first policy, and has a plan to migrate to a public cloud environment – they should continue on this path. The radical shortening of deployment times is a great sign of a maturing and resilient software and server environment.

It is a further sign of a mature team that the top-up payments feature was removed from the scope of the service based on data and user research. Although not a feature of the service, top-up payments may be part of a user’s onward journey, so in live the team should consider better signposting to help users continue. The panel encourage the team to blog publicly about the decision process they’ve gone through as it’s likely to be useful for others.

As the service does not store significant personally identifiable information within its own store, there is limited application of current Data Protection Act concerns, but the team should plan for the coming implementation of the General Data Protection Regulation, which expands personal data to include tokens such as IP addresses and tightens data handling requirements around retention and redaction.
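As a hedged illustration of the kind of redaction this implies, the sketch below masks anything that looks like an IPv4 address before it reaches logs or analytics. The log format and placeholder text are assumptions, not the team's implementation.

```python
# Illustrative sketch: redact IP addresses from log lines before storage.
import re

IPV4_PATTERN = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact_ip_addresses(log_line: str, placeholder: str = "[redacted-ip]") -> str:
    """Replace anything that looks like an IPv4 address with a placeholder."""
    return IPV4_PATTERN.sub(placeholder, log_line)

# Example:
# redact_ip_addresses("User 203.0.113.42 viewed their forecast")
# -> "User [redacted-ip] viewed their forecast"
```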

Recognising that internal rogue actors are the most serious threat to the system’s security, the team should develop a strategy to more proactively monitor the egress of unauthorised data from the network – alarm bells should ring if unexpected data is going the wrong way through the DMZ.

The transformation this service represents is an excellent example of technical collaboration between two large public bodies, and the team is encouraged to write up some of their results, perhaps on the GDS Technology blog.

Design

From a design point of view, this is a near exemplary government service with an exemplary team. It solves a clear user need in a simple way, using common government design patterns and components. The team have iterated on the design throughout their Beta phase resulting in significant increases in their KPIs. They also demonstrated a clear plan for the Live team to continue to iterate the design in response to user research and policy changes.

It would be great to see the team blog more about the great work they are doing, so that other service teams around government can learn from them. Of particular note was the increase in click-through rate on the start page - from 60% to 80% over 4 iterations - and also the increase in users’ understanding of their state pension as the team added more clarity to the visual design and content, including removing the tabs interaction mentioned in the service’s Beta assessment.

The team demonstrated a strong commitment to not increasing the scope of their service beyond the user need of checking their state pension. The team have investigated sensible ideas which relate to that core user need, prototyped those ideas, and have good reasons for not taking them forward into the production service.

Consistency with advertising campaign

It was great to hear that the team have been working with the campaign team to measure the success of the campaign related to digital take-up, showing a real understanding of the wider user journey.

However, the branding of the campaign is inconsistent with the branding of the digital service and the advertising materials don’t meet the accessibility guidelines we hold our digital services to. While marketing isn’t within the service team’s remit, the team should work with marketing colleagues to share best practice and agree a consistent brand users recognise and can trust.

Identity Verification

The service uses both Government Gateway and GOV.UK Verify to identify users. The cross-government strategy is for services for individuals to use GOV.UK Verify and there are opportunities to simplify the GOV.UK Verify journey for users accessing this service.

The panel recommend the team look at improving the use of GOV.UK Verify in live as it could help increase the number of users that succeed at accessing the digital service. GOV.UK Verify have separately shared a proposal for increasing digital uptake; the key recommendations are:

  1. Test to see if using GOV.UK Verify at a lower level of assurance (LOA1) improves completion rates (with an ‘uplift’ to LOA2, if required, to view sensitive information in the NI record).
  2. Test whether increasing the prominence of GOV.UK Verify increases the number of users successfully accessing the service - this could be achieved through A/B testing.
  3. Test whether the current design, which favours one identification route above the other, is helping users to choose the path that’s easiest for them and reuses existing authentication details.
  4. The team should seek to improve their understanding of the journey for users who don’t succeed using Government Gateway or GOV.UK Verify and what users need at that point.

If these tests show better user completion the team should consider and recommend any wider changes needed to the Tax Platform to help other teams benefit from what’s learned.
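As a rough sketch of the A/B comparison suggested in point 2, the example below deterministically buckets sessions into two variants and computes a simple completion rate for each. The variant names and counters are hypothetical, and a real test would agree sample sizes and significance thresholds in advance.

```python
# Minimal A/B bucketing and completion-rate sketch (hypothetical variant names).
import hashlib

def assign_variant(session_id: str) -> str:
    """Deterministically bucket a session into 'control' or 'prominent-verify'."""
    digest = hashlib.sha256(session_id.encode()).hexdigest()
    return "prominent-verify" if int(digest, 16) % 2 else "control"

def completion_rate(completed: int, started: int) -> float:
    """Share of users who started identification and went on to reach the service."""
    return completed / started if started else 0.0
```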

Analytics

The team has had an embedded performance analyst since the end of private beta. The team explained how the performance analysis work was integrated into sprints and the product roadmap.

The service and product managers had worked with the analyst to develop a performance framework for the service, identifying and collecting data sources and developing KPIs.

The analyst has successfully promoted the value and importance of data both within the team, through weekly data show and tells and aligning work with ‘hypothesis boards’; and to wider stakeholders through the development of dashboards which collate performance data from a range of sources.

The team provided examples of the effective use of data - for example:

  • The hypothesis that changing the text to ‘this is the most you can get’ would provide more context and reduce traffic to call centres was proved. Given take-up of the service, if this change had not been made, call volumes would have been overwhelming.
  • Combining data from GOV.UK and HMRC to show an improved click-through rate, demonstrating the value of data analysis in improving triage.
  • Predictive analysis of the impact of media mentions of the service gave the service manager confidence that calls could be managed within normal call centre channels.

The service has now been handed over to the Live Services team, but continues to have an analyst available (from HMRC). Along with a user researcher, the analyst will input to the weekly service forum to inform prioritisation of work.

The team is not yet reporting cost per transaction on the Performance Platform. The team has a good idea of live service costs, which are running at a third of clerical costs, but needs more data about development and infrastructure costs to report an accurate figure.
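For illustration, cost per transaction is the total cost of providing the service divided by the number of completed transactions. The cost categories in the sketch below are assumptions drawn from this report, not HMRC's actual cost model.

```python
# Illustrative cost-per-transaction calculation (cost categories are assumptions).
def cost_per_transaction(staff_costs: float,
                         infrastructure_costs: float,
                         development_costs: float,
                         completed_transactions: int) -> float:
    """Total cost of running the service divided by completed transactions."""
    total = staff_costs + infrastructure_costs + development_costs
    return total / completed_transactions if completed_transactions else 0.0
```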

Recommendations

The service team should:

  • Work with GOV.UK Verify to simplify the user identification journey and test ways of improving the user identification completion rate.
  • Finalise collection of service cost data and publish cost per transaction to the Performance Platform.
  • Continue to liaise with departments and agencies with similar ‘look-up’ services such as DVLA to share benchmarking and measurement approaches.
  • Adjust the domain configuration to follow GOV.UK policy so the service domain isn’t returned in search results.
  • Adjust performance and load tests to test to the point of failure.
  • Proactively monitor the egress of data from the network environment.
  • Plan for the upcoming General Data Protection Regulation, especially regarding handling user tokens or IP addresses in logging and analytics systems.

Next Steps

You should follow the recommendations made in this report as you continue to develop your service.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 24 July 2018