Pay A Tax Penalty

The report for HMRC's Pay A Tax Penalty alpha assessment on 2 September 2021

Service Standard assessment report

Pay A Tax Penalty

From: Central Digital & Data Office (CDDO)
Assessment date: 02/09/2021
Stage: Alpha
Result: Met
Service provider: HMRC

Service description

HMRC intends to harmonise penalties across VAT and ITSA, as current penalty systems vary significantly across different taxes.

Our service will present the penalty points, penalties, interest and appeal journeys to VAT users, with a view to the service being reusable for other tax regimes.

Service users

  • VAT users on ETMP: a combination of MTD mandated, MTD voluntary and former MTD (opted out) users, and their agents.
  • The current ETMP population for VAT is 1.03 million.
  • The total VAT population is 2.2 million and it is expected that all of these users will be on ETMP when our MVP service goes public beta in April 2022.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified the main end user groups and developed personas around them
  • user needs are clearly formulated
  • the team has considered inclusion aspects such as level of digital literacy and accessibility

What the team needs to explore

Before their next assessment, the team needs to:

  • identify secondary users. From the presentation it could be inferred that caseworkers and/or call centre agents might be secondary users, but the team needs to do research to identify any secondary users of the service and make sure they understand their needs
  • clarify the methodology used to develop their personas, and whether they drew on any theoretical frameworks, such as grounded theory or activity theory. See this MOJ blog as an example of good practice: https://medium.com/uxr-content/your-personas-probably-suck-heres-how-you-can-build-them-better-b2b32a45c93b
  • in future presentations, give a brief summary of the types of research performed and, for each type, how many rounds and participants were included, as well as the assumptions that were tested

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is informed of the end-to-end service journey
  • the team has demonstrated awareness of the primary user groups
  • the team recognises that the service has many dependencies on other services. The team is working within these constraints to make sure the front stage of the service is in line with other services, creating a unified experience for the user

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how changes to the dashboard might affect users’ experience across services, since the HMRC portal is used by both VAT and non-VAT registered business users. The panel suggests doing research on user groups that are not involved in a VAT transaction but could be affected by an interface change
  • address the recruitment challenges the team mentioned. For instance, they had limited success recruiting users with low digital confidence. This must be given priority, and the team should be supported with timely recruitment of participants for research sessions

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is collaborating with other service teams and departments to provide a joined-up experience across all channels
  • the team demonstrated the use of existing backstage processes from other HMRC services to streamline internal caseworker processes by making use of APIs. This is a good example of leveraging technology across departments and eliminating manual processes
  • the team has aligned language across all communication channels such as letters, service pages and other digital communication channels

What the team needs to explore

Before their next assessment, the team needs to:

  • the team must make sure that users of other channels, such as letters, have the same level of experience as users of the digital channel
  • the team mentioned that more than a quarter of businesses rely on agents to file their VAT. The panel suggests doing more research on agents and how they handle client processes, so that this learning can be used to improve communications, including the introduction of the new points system
  • the team is aware of the various design constraints, especially when collaborating with other services and departments, and is working towards resolving them. The policy team and the department need to ensure that the team’s priorities are considered in a timely manner for smooth delivery of the service

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated evidence of user-centred design
  • the team is practising agile ways of testing and iterating design. For instance, they did many design iterations of the appeal journey based on user feedback to make the service easy to use. The panel suggests they continue this practice
  • the team has used known design patterns (HMRC and GOV.UK design system)
  • the team has considered user feedback and modified existing design patterns (the tag for the summary card) to suit user needs. The panel suggests feeding these findings back to the design system community so that other designers in government can benefit from these valuable contributions
  • the team is aware that financial processes rely heavily on specialist terminology and acronyms, and is actively working towards minimising their use

What the team needs to explore

Before their next assessment, the team needs to:

  • the team should consider breaking content down into hierarchies for easy navigation and to reduce cognitive load. The panel suggests exploring design system components, such as inset text, to break down content
  • the team must do a content audit to maintain consistency in content format across the service. For instance, the “VAT penalties and appeals - Overview” page uses 2 different heading styles, “You owe” and “I cannot pay today”, even though they fall under the same taxonomy level. Similar inconsistencies in headings, titles, subtitles, content, links and buttons were noticed across the service pages

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated plans to leverage HMRC’s existing support model and channels for people with accessibility needs and assisted digital users
  • the team is exploring alternative research methods, such as speaking to call centre agents or face-to-face usability testing for users with accessibility needs

What the team needs to explore

Before their next assessment, the team needs to:

  • consider which groups of users might be excluded from the service. Assumptions like “agents do not have access needs” need to be revisited and explored. The team also needs to consider whether digital literacy and accessibility are the only inclusion factors to take into account
  • include potentially excluded groups in the personas the team developed
  • the team should consider an external accessibility audit such as Digital Accessibility Centre (DAC) audit before the public beta launch
  • the team must test and iterate the end-to-end journeys with assisted digital users and users with low digital confidence before launch
  • the team must test and measure how the existing support model performs and provide necessary guidance to make it efficient for the internal caseworkers
  • the team should do end-to-end testing of all journeys with all user groups on various screen sizes and devices, including scenarios such as using the document upload feature during an appeal journey on a mobile device

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is fully multidisciplinary, has a clear team ethos, and has successful processes in place for onboarding new team members and getting them up to speed quickly. The team is a mixture of contract and permanent staff, and has funding to see it into the next phase
  • the team is working well together, approaching activities like user research as a team sport, and regularly reviewing and iterating its ways of working

What the team needs to explore

Before their next assessment, the team needs to:

  • overcome the obstacles that they are facing with regard to external factors, for example:
      • being prioritised by the Whitehall publishing team; it is essential that this work is completed ready for the legislation go-live date in April 2022
      • being able to demonstrate a successful end-to-end test of the system, including APIs into the PEGA team
      • ensuring that they can pick up (and have funding to complete) the changes required to the VAT View and Change service

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has strong agile ways of working, they are using ceremonies and they have iterated their ways of working
  • the team has a clear view of their myriad dependencies, and is working with stakeholders to keep them updated on progress and issues as they arise. They have defined ways of working with policy, stakeholders and other digital teams
  • the team has clearly put a lot of effort into understanding their velocity and cadence, and have a strong confidence in their roadmap commitments

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they carry on working in this way, bringing new colleagues, stakeholders and users with them on their journey

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to talk through and demonstrate how their service had been iterated, as well as how they have been able to effect iteration on other services based on their research
  • the team has done extensive user research and were able to give examples of how they had iterated the appeal journey in response to usability testing
  • the evidence supplied to support this point was an excellent documentation of changes

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they have a plan to iterate the service if needed at go live based on early usage
  • continue iterations and improvements of the service during development

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • data is secured in transit as well as at rest; the service does not store any personally identifiable information (PII) and data is not persisted
  • the team is planning for GDPR and DPIA reports before moving to private beta
  • certificate management is done at the wider MDTP platform level
  • the team has completed ZAP security testing

What the team needs to explore

Before their next assessment, the team needs to:

  • complete the DPIA and publish the privacy and cookie policies
  • complete the penetration test and address any recommendations

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a named performance analyst on the team, and that the team has completed a performance framework workshop to understand what success looks like
  • the team has linked the standard KPIs plus their other metrics back to their success factors
  • the team has considered both the information they will need to provide and the information they will need to get from others to ensure they have a strong evidence base for their decisions post April 2022

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they have a route to acquire the call centre data, and also access to the call recordings so that they can measure quant and qual impacts of their service on the support team

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using an appropriate set of tools and technologies, such as MongoDB and Scala with the Play framework
  • the team is using Jira, Confluence, a project roadmap and a planning tracker for project management and prioritisation
  • the team is using various test tools, such as Selenium WebDriver (automated UI testing), Cucumber (requirements), WAVE (manual accessibility testing), ZAP (security testing) and Gatling

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has published the source code on GitHub
  • the team is using open source tools for development, testing and release

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using common components from HMRC’s digital common services, including the MDTP platform, the Auth service, Government Gateway credentials and transaction monitoring
  • the team is using GOV.UK standard data formats and patterns, including the WCAG 2.1 guidelines

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the HMRC contact centre, the service is available 24/7/365, and AWS SLAs have been agreed with the MDTP platform, which achieves resilience through multi-AZ deployment in a single AWS region (London). Cross-region replication cannot be enabled, primarily because of the legal requirement to keep all data within UK jurisdiction
  • the team has CI/CD pipelines using Git/GitHub and Jenkins, with automated tests and test tools
  • the team is using various alerting and monitoring tools and dashboards, such as Google Analytics, Google Tag Manager, Splunk, Google Data Studio and Deskpro

Next Steps

Pass - Alpha

This service can now move into a private beta phase, subject to implementing the recommendations outlined in the report and getting approval from the GDS spend control team. The service must pass their public beta assessment before launching their public beta.

The panel recommends this service sits a beta assessment in around 3 to 6 months’ time. Speak to your Digital Engagement Manager to arrange it as soon as possible.

To get the service ready to launch on GOV.UK the team needs to:

Published 22 September 2021