Look up residency status for relief at source beta assessment

The report from the beta assessment of HMRC's look up residency status for relief at source service on 26 April 2018.

From: Government Digital Service
Assessment date: 26 April 2018
Stage: Beta
Result: Met
Service provider: HMRC

The service met the Standard because:

  • It meets a clear and valuable need for Pension Scheme Administrators
  • It is based on good ongoing user research and design iteration
  • Its underpinning technology is strong and resilient

About the service


The service allows pension scheme administrators to look up the HMRC-held residency status of their members so that they can claim the appropriate rate of tax relief.

Service users

The users of this service are employees of Pension Scheme Administrators.


User needs

The team has involved around 100 users in user research altogether. This is an impressive number given that the service has an estimated total of 1,000 to 1,500 potential users. Most research methods have focused on remote observation and telephone interviews, supplemented by communication with service users via other channels.

The team has received positive feedback from users in research sessions. Research showed that users need a bulk upload facility to submit large numbers of member records at once. This has caused some problems, since data entered in bulk must comply with specific formats. Research also identified a need to search for multiple members.

Users know how to access the relevant support channels when they have problems with the service, because they already use the same support for the wider service, not only its online component. These support channels have also been publicised via newsletters and other means.

The key concern from the alpha assessment was digital accessibility and the involvement of people with varying levels of digital literacy. The team ran an internal accessibility assessment, which highlighted areas that need to be improved. Despite continued efforts, the team has not been able to recruit any users with accessibility needs or low digital literacy, because such individuals do not currently work in these roles. The team will keep trying to recruit these users, though we recognise this may be especially hard: the pool of available users is relatively small and may not represent the general population. It will be important to verify that the changes made after the internal accessibility assessment work for people who have these needs.

The service meets the assessment, but the team should keep trying to involve people with accessibility needs and people with disabilities to make sure the service works for everyone. Even if no one with these needs currently works in the industry, there is no reason to assume that will remain the case in future, which is why it is good practice to ensure the service meets everyone's needs. We recognise the hard work the team has put into recruitment so far, and accept that these efforts may continue to be unsuccessful.


Team

The panel was pleased that the service team is multidisciplinary and is using agile methods to build a user-centred service. The team is currently split between permanent and interim members, and may wish to consider whether they would benefit from more permanent members as they approach the Live stage, something which may be achieved when they hand over to the 'Live Support' team later in the service's life cycle.


Technology

The service is based on the standard HMRC tech stack, written in Scala using the Play framework and hosted on the common tax platform. It is separated into frontend and backend services, and the backend is also exposed as an API via the common API Platform. It also uses existing HMRC services for file upload and virus scanning, as well as Government Gateway for authorisation to the frontend.

The team is conscious of issues regarding storage of personal information and has taken care to ensure the service does not hold any data for the long term. Files generated as responses to bulk uploads are temporarily stored in MongoDB using GridFS with a TTL of three days, and are deleted immediately once accessed. There is no other persistence.
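The three-day expiry described above can be expressed as a standard MongoDB TTL index. A minimal sketch in Python (illustrative only; the service itself is written in Scala, and the field name here is an assumption, not the service's actual schema):

```python
from datetime import timedelta

# Seconds after which MongoDB's TTL monitor removes a document,
# counted from the timestamp stored in the indexed field.
THREE_DAYS = int(timedelta(days=3).total_seconds())

def ttl_index_spec(field="uploadDate"):
    """Build a TTL index specification for the given timestamp field.

    With a driver such as pymongo this would be applied as:
        collection.create_index(field, expireAfterSeconds=THREE_DAYS)
    (hypothetical collection; the real service stores files via GridFS).
    """
    return {"key": {field: 1}, "expireAfterSeconds": THREE_DAYS}
```

Immediate deletion on access, as the team described, would still need to be handled in application code, since the TTL index only covers the three-day fallback.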

Environments for development, QA and staging are available and testing is integrated into the development process. The team is currently releasing on average twice per sprint. Performance testing has been carried out on all parts of the service, including the bulk upload functionality which has been tested with various different sizes of file.

The service is being supported by the team during the beta. Developers have access to logs via Kibana and metrics via a Grafana dashboard, including values such as response times, memory usage, file upload times, etc. Once the service is live, support will be carried out by one of the common support teams; the dev team are compiling a runbook for common issues and queries.

The code is all available in the open on GitHub.


Design

The team has done substantial work in their private beta phase to try and make the service intuitive and frictionless for users. However, there is more work to be done to ensure that users succeed first time.

One of the key challenges the team faces is helping users to upload a .csv file of their members' details. The team has tested this page, but 14 of the 21 people tested so far could not upload a .csv without getting an error message. The team has iterated on the page to address this, but was still waiting to see the effect of those changes. For this reason, the service cannot be said to meet point 12 of the standard right now.

There is also an issue with the CSV upload function when JavaScript is turned off in the browser. The validation appears to happen in JavaScript, which means that if that file fails to load for any reason (such as a poor connection), the service will not work correctly.
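One way to remove this dependency is to duplicate the checks on the server, so validation runs whether or not the client-side JavaScript ever loads. A minimal sketch of the idea (the column names and rules are hypothetical, not the service's actual file format):

```python
import csv
import io

REQUIRED_COLUMNS = {"firstName", "lastName", "nino"}  # hypothetical columns

def validate_members_csv(raw):
    """Server-side CSV validation that works with or without client-side JS.

    Returns a list of error messages; an empty list means the file passed.
    """
    errors = []
    reader = csv.DictReader(io.StringIO(raw))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return ["missing columns: " + ", ".join(sorted(missing))]
    for line_no, row in enumerate(reader, start=2):  # row 1 is the header
        if not (row.get("nino") or "").strip():
            errors.append("row %d: National Insurance number is blank" % line_no)
    return errors
```

Client-side validation can then be layered on top purely as a usability enhancement, with the server remaining the source of truth.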

This is not the only service on GOV.UK that needs users to upload a CSV file. Report landfill data and GOV.UK Notify are two examples of services solving this same problem, and both are documented in the GOV.UK Draft Patterns. Looking across government at how these problems have already been solved could lead to a successful design more quickly, as well as making sure the service is consistent with the rest of GOV.UK. This relates to point 13 of the standard, so the team needs to look closely at it before their next assessment. They should make sure they have made use of the work done elsewhere in government on uploading a CSV, and it would be great to see the team feed anything they learn in research back into that pattern.

Other smaller points that will cause problems for some users include validation that rejects legitimate non-alphabetic characters in names (eg é, or hyphens in double-barrelled surnames), and a month field that does not accept text input (eg September or Sep instead of 9).
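Both issues come down to making the validation rules more permissive. A sketch of what lenient parsing might look like (helper names are illustrative, not the service's code):

```python
import re
import unicodedata

# Unicode-aware: \w minus digits and underscore leaves only letters,
# so accented characters such as é are accepted; hyphens, apostrophes
# and spaces are allowed between name parts.
NAME_RE = re.compile(r"^[^\W\d_]+(?:[-' ][^\W\d_]+)*$")

def valid_name(name):
    """Accept letters with diacritics, hyphens and apostrophes."""
    return bool(NAME_RE.match(unicodedata.normalize("NFC", name.strip())))

MONTHS = ["january", "february", "march", "april", "may", "june", "july",
          "august", "september", "october", "november", "december"]

def parse_month(value):
    """Accept '9', '09', 'Sep' or 'September'; return 1-12 or None."""
    v = value.strip().lower()
    if v.isdigit():
        return int(v) if 1 <= int(v) <= 12 else None
    if len(v) >= 3:  # require at least three letters to avoid ambiguity
        for num, name in enumerate(MONTHS, start=1):
            if name.startswith(v):
                return num
    return None
```

The principle is the usual one of being liberal in what the form accepts while normalising to a canonical representation before storage.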

The team are working in the right way. They showed that they are doing regular user research and responding to difficulties that users are having. The work that the team have done on making sure users understand the start of the service was very good.

However, this assessment comes slightly too early for the service to conclusively meet all points of the standard, possibly because the overall project timelines have been quite tight. The planned public beta period is also very short, so the team should make sure they really have the time to make the service work well for all users before coming in for a Live assessment.


Analytics

The team showed that they have consistently used data and analytics to feed into building the service and have prioritised key metrics to report on. Prior to the assessment, the team agreed with GDS that, as the service is non-transactional, it would not be suitable for the Performance Platform. In lieu of this, the team has internal dashboards which are regularly monitored and discussed.

While the team do not have a Performance Analyst embedded with them they are following an established practice at HMRC of accessing Performance Analysis resource from a central team. They are able to ensure continuity by working with the same analysts each time and there are regular meetings and processes in place to get measurement issues added to the team’s backlog - including weekly catch-ups with the User Researcher and Business Analyst. The Performance Analysts have adapted the Performance Framework to suit the needs of the team and are combining different data sources to validate their approach.

The team acknowledge the current difficulty in tracking users who go back through the journey and edit their responses, and have a plan in place to measure this more effectively. They are regularly iterating to improve the quality of data available, for example adding custom dimensions so that they don’t need to rely on the user explorer feature in Google Analytics. This will also ensure the reporting can scale as the service gets more widely adopted.

For the next assessment it would be good to understand how performance is tracked between GOV.UK and the service, as the team suggest they will have the resource to monitor this. It would also be useful to understand more about the strategy behind the A/B testing the team plans to carry out as traffic to the service grows. As the service develops the team will no doubt focus on different journeys; the panel would be interested to learn more about how they will use data to inform decisions about these journeys, particularly in more complex scenarios where users don't come back in time to view their files.


Recommendations

To pass the next assessment the service team must:

  • Show that most users can successfully upload the CSV file without getting error messages
  • Make sure the validation on the CSV upload works with JavaScript turned off
  • Use, or contribute research to, the CSV pattern linked in the report
  • Make sure the validation on the form fields is lenient enough to accept common genuine user input, such as diacritic characters in names and months entered as words

The service team should also:

  • Not apply for a Live assessment until they are confident that they are ready for it

The service was almost too early to be assessed for Beta; the panel believes it would have been in a much stronger position given only an additional 2 to 4 weeks. As it is, the panel is willing to pass the service despite point 12 not being met, because failing the team would probably slow or block their ability to improve on this point. The service team should strongly consider their confidence in meeting the Live standard before applying for assessment.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

This service now has permission to launch on a GOV.UK service domain. These instructions explain how to set up your *.service.gov.uk domain.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Ensuring users succeed first time Not met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 6 August 2018