Claim Flat Rate Expenses alpha assessment

Service provider: HMRC

Service description

The service allows a user to claim for Flat Rate Expenses (washing and laundering uniforms, small tools, protective clothing etc) in real time.

Service users

The users of this service are employees who can claim tax relief on expenses for uniforms, protective clothing and small tools.

1. Understand user needs

Decision

The team met point 1 of the Standard.

What the team has done well

The panel were impressed that:

  • User needs have been identified and iterated through user research across discovery and alpha, which has included users with assisted digital needs. The user needs are universal across the 4 personas the team have identified; the team have also identified an anti-persona.
  • Through the research the team have understood users’ knowledge, behaviour and pain points, as well as their digital capabilities and their entitlement to expenses. The team have developed an evidence map of the users’ journey through the service.
  • The team have worked to understand not only users’ needs while using the service, but also their experience, needs and pain points in finding out about the service and understanding their eligibility to claim expenses.
  • The team have shared their insights and findings with other relevant teams. They have consistently engaged with policy and operational colleagues and done research with back office staff and call agents.

What the team needs to explore

Before their next assessment, the team needs to:

  • In beta it will be important to understand the needs of the full end-to-end journey, by testing the letter sent to users claiming for previous tax years to make sure it is also meeting user needs.
  • Time permitting, it would be useful to research the user journey for those who are unable to authenticate online, to understand whether they can continue their journey through another channel. The panel realises this part of the service is universal to HMRC, but it would be useful for this team and the employee expenses team to know whether or not these needs are being met.

2. Do ongoing user research

Decision

The team met point 2 of the Standard.

What the team has done well

The panel were impressed that:

  • It is clear that user needs were understood before prototyping began, and throughout alpha prototypes were iterated based on evidence from user research. This was very clearly communicated through the team’s posters. The team acknowledged the pain points that still need work in their prototype, which will be iterated in beta.
  • The team have used a variety of appropriate user research methods with a range of users throughout alpha.
  • The team have a tight schedule to deliver their private beta, but they have a clear research plan and have thought thoroughly about what work needs to be done in each sprint, which methods and tools they will use, and how they will recruit participants.

What the team needs to explore

Before their next assessment, the team needs to:

  • The team have a solid plan for accessibility needs research in beta; it just needs to be emphasised that a wide range of access requirements should be included in the research brief to the specialist recruitment agency. The team should also expand the assisted digital requirements they include beyond dyslexia and colour blindness.
  • It would be useful for the team to understand whether users would claim for uniform and tool expenses and professional subscriptions at the same time or at different times, to understand how this would affect their behaviour and experience of using the service.

3. Have a multidisciplinary team

Decision

The team met point 3 of the Standard.

What the team has done well

The panel were impressed that:

  • There is a co-located, multidisciplinary, agile team in place, consisting of a Service Manager, Product Manager, User Researcher, Interaction Designer, Scrum Master, Lead Developer, 2 Developers, Apprentice Developer, Content Designer, Business Analyst and Performance Analyst. All members of the team are working full time on this project, except the Content Designer and Performance Analyst, who each spend 50% of their time on it.
  • Going into beta, the team will stay mainly the same, and another Performance Analyst might join the team.

4. Use agile methods

Decision

The team met point 4 of the Standard.

What the team has done well

The panel were impressed that:

  • The team demonstrated a mature use and good understanding of agile tools and techniques. They are using Scrum and working in 2-week sprints. They are using Jira to track and manage work, as well as Confluence for their documentation (for example, user research documents and architectural diagrams). They are also making use of Google Drive and the whole G Suite of products. Developers use Slack for day-to-day communication, and the service prototype is on GitHub.
  • The team showed that they are regularly engaging with their stakeholders, doing show and tells and inviting them to user research sessions.

5. Iterate and improve frequently

Decision

The team met point 5 of the Standard.

What the team has done well

The panel were impressed that:

  • This is a user-driven project: prioritisation is informed mainly by user research, and the team appears empowered to take decisions on their service.
  • The team demonstrated how user research feeds into their sprints and showed examples of issues that have been highlighted by user research, iterated on and tested further.
  • The team showed their plan for beta and how user research will fit into their sprints and iterations.

6. Evaluate tools and systems

Decision

The team met point 6 of the Standard.

What the team has done well

The panel were impressed that:

  • The team chose to develop the project using Play, an open source web application framework written in Scala. This standard tech stack is commonly used in HMRC systems, which allows easier integration with existing services and makes the service easier to maintain and develop further at the organisational level.
  • They chose the latest stable version of the web application framework (v2.6), given that support for the version currently widely used in their department (v2.5) will be dropped soon and they want to avoid an unnecessary migration.
  • They are using common open source technologies for automation, monitoring, alerts and logging (e.g. Jenkins, Grafana, Kibana).

7. Understand security and privacy issues

Decision

The team met point 7 of the Standard.

What the team has done well

The panel were impressed that:

  • Threats to the service are minimal, but they have been identified and measures are in place to mitigate them. A penetration test of the external service is planned.
  • The backend service used by this service is continually tested and has been running in production for a while now.
  • The external service sits behind a Government Gateway login; users’ data is safely stored and transmitted to the backend service.

8. Make all new source code open

Decision

The team met point 8 of the Standard.

What the team has done well

The panel were impressed that:

  • The code for this application is publicly available on GitHub.

What the team needs to explore

Before their next assessment, the team needs to:

  • Document the code in the public repository and explain how someone else can reuse the code.

9. Use open standards and common platforms

Decision

The team met point 9 of the Standard.

What the team has done well

The panel were impressed that:

  • The team chose open standards, open source solutions and common platforms throughout the service (development, continuous integration, monitoring, alerts, logging).
  • An HMRC scaffolding tool was used to automatically generate the MVC structure of the project. This allowed quick iterations and gave the ability to start from scratch at any point while reusing code and patterns used across HMRC services.

What the team needs to explore

Before their next assessment, the team needs to:

  • Consider GOV.UK Verify as an alternative to Government Gateway, which is currently being used for identity assurance.
  • Consider updating the scaffolding project to use the new Design System code, as it provides accessibility enhancements and keeps the service up to date with the latest frontend improvements.

10. Test the end-to-end service

Decision

The team met point 10 of the Standard.

What the team has done well

The panel were impressed that:

  • Multi-environment, automated end-to-end testing is in place, including unit and functional tests as well as browser journey tests.
  • Load testing is performed using open source tools (Gatling), and monitoring alerts are set up. Peak events are considered as well as normal loads.
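
As an illustration of the kind of load test the report refers to, a minimal Gatling simulation in Scala might look like the sketch below. This is not the team’s actual test: the base URL, page paths, and load figures are all placeholder assumptions, and running it requires the Gatling runtime on the classpath.

```scala
import scala.concurrent.duration._

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ClaimJourneySimulation extends Simulation {

  // Placeholder base URL -- not the service's real endpoint
  val httpProtocol = http.baseUrl("https://www.example.gov.uk")

  // A short user journey through two illustrative pages of the claim
  val scn = scenario("Flat Rate Expenses claim journey")
    .exec(http("Start page").get("/claim-flat-rate-expenses"))
    .pause(2.seconds)
    .exec(http("Industry page").get("/claim-flat-rate-expenses/industry"))

  setUp(
    scn.inject(
      rampUsers(50).during(60.seconds),   // normal load
      nothingFor(10.seconds),
      rampUsers(250).during(30.seconds)   // simulated peak event
    )
  ).protocols(httpProtocol)
}
```

Separating a steady ramp from a sharper burst in the injection profile is one way to exercise both the normal and peak conditions the team described.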

11. Make a plan for being offline

Decision

The team met point 11 of the Standard.

What the team has done well

The panel were impressed that:

  • There is an outage plan in place, serving appropriate error pages and messages; dates and times are announced in advance in case of planned maintenance.

12. Make sure users succeed the first time

Decision

The team met point 12 of the Standard.

What the team has done well

The panel were impressed that:

  • The service team were able to talk through different iterations of previous approaches. For example, the number of tax years a user can claim for; removal of the PAYE reference; inclusion of the claim amount to remove uncertainty; and splitting the industry and occupation search into multiple pages.
  • The journey through the service is straightforward and short, with the right amount of content at each stage to help users complete the task.

What the team needs to explore

Before their next assessment, the team needs to:

  • There are some dead ends in the journey, such as when claiming for multiple jobs and where the user is already claiming tax relief for a given year, that need improving through better signposting and setting of expectations.
  • In the early iterations of the service, the industry and occupation were collected via one autocomplete. In later iterations, and based on user feedback, the team split the step into multiple pages. However, following the split, the page headings weren’t carried over. For example, the original question was “What industry do you work in and what is your occupation?” The subsequent page title is “Industry 1 of 3” and so on. The team should follow convention and name the pages appropriately, for example: “What industry do you work in?” and “What is your occupation?” If a qualifier is needed for clarity and accessibility, formats such as “What industry do you work in? (Page 1 of 3)” and “What is your healthcare occupation?” should be considered.
  • The interaction pattern used to progress the user to the next page of industry categories is unconventional. The panel suggests either replacing “show more industries” with an ‘or’ separator and a radio option stating “none of the above”, or including all industries on the same page (assuming there is a limited set).
  • The question relating to how much of the applicant’s expenses have been repaid by their employer is ambiguous. While the service team suggested the applicant would know their employer was reimbursing them, they should consider including this explicitly in the page’s heading, for example, “How much of your expenses has your employer paid back to you?”.

13. Make the user experience consistent with GOV.UK

Decision

The team met point 13 of the Standard.

What the team has done well

The panel were impressed that:

  • The service team has made good use of the GOV.UK design patterns to present information in a clear and consistent way, giving users the confidence that this is an official HMRC service.

What the team needs to explore

Before their next assessment, the team needs to:

  • The service team recognised the name of the service is a placeholder. It is vital they choose a name that means users can find the service easily when they search online, understand what it does, and quickly decide whether to use it. An example service name could be: “Claim for tax relief on employee expenses.”

14. Encourage everyone to use the digital service

Decision

The team met point 14 of the Standard.

What the team has done well

The panel were impressed that:

  • The team showed how users complete questions about their claim before they are asked to sign in to Government Gateway. They are shown what tax relief they are entitled to before they are required to sign in and complete their claim, which increases their motivation to continue with the journey and encourages them to proceed with the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • Develop a clearer plan for how they intend to encourage use of the digital service.

15. Collect performance data

Decision

The team met point 15 of the Standard.

What the team has done well

The panel were impressed that:

  • The team’s performance framework, especially at the alpha stage, showed very clearly how user needs are linked to KPIs and what data sources will be used to collect the data.
  • There is a good range of data sources, and the team demonstrated a good understanding of how this data will be used.

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure that the SRO has signed off on using Google Tag Manager.
  • Ensure that Performance Analysts are not allowed to publish Google tags. Final publishing should be done by the Tech Lead, a Developer, or someone similarly qualified.

16. Identify performance indicators

Decision

The team met point 16 of the Standard.

What the team has done well

The panel were impressed that:

  • The team showed that they have a performance framework completed by their Performance Analyst. It was great to see that the performance indicators are directly linked to user needs and follow GDS guidance.
  • Another Performance Analyst will be joining the team in beta stage.

What the team needs to explore

Before the next assessment, the team needs to:

  • Consider formulating some team level KPIs in addition to their service level ones.

17. Report performance data on the Performance Platform

Decision

The team met point 17 of the Standard.

What the team has done well

The panel were impressed that:

  • The team are planning on publishing a dashboard and have already engaged with the Performance Platform team. Their performance analyst will be working with the Performance Platform team to ensure a dashboard is ready for public beta.

18. Test with the minister

Decision

Point 18 of the Standard is not applicable at this stage.

Updates to this page

Published 4 March 2019