Guaranteed Minimum Pension - Beta Assessment

The report from the beta assessment for HMRC's Guaranteed Minimum Pension service on 21 March 2016.

Stage Beta
Result Met
Service provider HM Revenue and Customs

The Guaranteed Minimum Pension service met the beta Digital Service Standard because:

• The service provides a clear way for users to obtain Guaranteed Minimum Pension (GMP) value calculations.

• The service team has good coverage of all team roles and is working in an agile way, iterating the service in short sprint cycles.

• The service team has tested pension-specific terminology with service users and amended the terms used in the service to meet user needs.

About the service

Service Manager: N Cranston

Digital Leader: Mark Dearnley

The Guaranteed Minimum Pension service will allow pension scheme administrators (PSAs) to calculate a member’s guaranteed minimum pension after contracting-out ends.

Detail of the assessment

Lead Assessor: Thomas Moore

Researching and understanding user needs [points 1, 2, 10]

The team has used the database of pension scheme administrators, The Pensions Regulator, and associated bodies to recruit user research candidates, and has also engaged with pension scheme administrators through trade fairs. User engagement was initially through online channels; the service team has used ‘snowball recruitment’ (asking users to suggest other users for further research) to try to capture the needs of those with low digital ability.

User research has been conducted with both large and small companies, using lab-based, contextual, and observational research. Users were involved in workshops during the prototyping phase of the service development, and in some cases follow-up research has been conducted during beta development with those involved in the earlier stages. During contextual research sessions, the team’s user researcher has effectively separated users of the service from company management to ensure that the needs of the service’s actual users are captured.

Although a considerable number of research sessions have been conducted with PSAs in large and medium-sized companies, research with micro-businesses (those with fewer than 10 employees) is yet to be undertaken. This is an area where further user research should be focused. Micro-businesses may not have the level of digital support that larger companies have, and may also be unfamiliar with some of the industry-standard terms the service uses.

The service team has tested the calculation outcome page with users and made changes to it, but the panel did not see evidence that users can apply the numbers reported on the outcome page accurately in their work. The panel recommends that in subsequent research the team asks users to show how the various reported numbers will be used. This will help the team to check that users have understood the results correctly.

Running the service team [point 3]

The team covers all the roles expected of an agile service delivery team, and the service manager is empowered to make decisions regarding the release of new iterations of the service. The team operates in two-week sprints with contextual user research forming part of the sprint cycle. Team members in all roles attend user research sessions, helping to build the connection between those developing the service and those using it. The routes the team described for reporting concerns and issues ensure issues can be highlighted and addressed early.

Designing and testing the service [points 4, 5, 11, 12, 13, 18]

The service has presented a clear path for pension scheme administrators to follow in order to reach the desired GMP values at the end of the service. Work so far has focused on developing a service to handle single calculations; the assessment panel looks forward to seeing the implementation of a bulk-upload facility by the time of the live assessment. The service team has tested pension-specific terminology with service users, and amended the terms so that they are readily understood by users of the service. The service should always look to simplify terms to help new entrants.

The service follows GOV.UK design patterns, but further work is needed to improve error messaging. Whilst error messaging does indicate in which field the error has occurred, it currently neither explains the error nor describes the remedial action the user needs to take.

The service should ensure terms are used consistently throughout, for example the use of ‘opposite gender calculation’ in the service and ‘true/opposite calculation’ on the outcome page.

The calculation outcome page presents users with an array of numbers, only two of which are critical to a pension scheme administrator’s work. Some of the numbers are presented in a spreadsheet format that doesn’t follow common spreadsheet conventions. This should be tested further with users to ensure it is easy to understand and consistent with their needs.

Technology, security and resilience [points 6, 7, 8, 9]

The service team has evaluated the sensitivity of the data they present and has determined it is of low sensitivity.

The team has built the service to be hosted on HMRC’s tax platform. The tooling, frameworks, and processes have been well considered, and the team was able to migrate from the SQUID API to the new DES API with minimal effort since their alpha assessment.

The team have made suitable plans to mitigate service downtime and respond to unexpected outages.

The team has committed to open sourcing their codebase, demonstrated by the release of their frontend, and has explained their intent to release the backend too.

Improving take-up and reporting performance [points 14, 15, 16, 17]

The service team has set the service up to monitor the four key performance indicators (KPIs), and is keen to look at a further metric: average time to completion. The team now knows the processing cost per transaction. The team is using funnels to monitor user journeys and to identify the points at which users are ‘dropping out’ of the service.

The paper service will remain in place in the interim for bulk submissions of GMP value requests, and in the longer term for requests on behalf of those working in certain professions. The service team plans to promote digital take-up at conferences and through bulletins to pension scheme administrators, using existing communication channels with administrators to promote the service.

Recommendations

Ahead of the live assessment the service must:

• Identify users working in micro-businesses (those with fewer than 10 employees) and conduct user research with this group; ensure the terms used in the service are understood by them.

• Continue to conduct contextual research to evaluate how users do multiple calculations and what the outcome values are used for.

• Amend error messages to ensure users understand the reason why they are seeing the error message, and what remedial action they need to take.

• Review the outcome page to ensure users can readily find the information they need.

• Continue to engage with the Performance Platform team to publish KPIs.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 27 January 2017