EMCS Trader Front End beta assessment

From: Central Digital & Data Office (CDDO)
Assessment date: 28/09/2023
Stage: Beta
Result: Met
Service provider: HMRC

Service description

The Excise Movement and Control System (EMCS) is a computerised system for monitoring the movement of excise duty-suspended goods in the UK and EU.

Excise goods include alcohol, cigarettes and related tobacco products, and energy products such as hydrocarbon oils and biofuels.

Most excise goods become subject to duty as soon as they are produced or imported into the EU. This duty can be suspended which means it doesn’t have to be paid until the product is released for consumption.

This process is mandated by both UK and EU law. Any UK organisation moving duty-suspended excise goods is legally required to record these movements using the EMCS trader front end (TFE) or third-party software that connects to EMCS via an API.

Service users

This service is for external users (people at organisations). Users of the service include:

  • Alcohol producers (breweries, wineries or distilleries), which are legally required to hold excise warehouse status
  • Excise warehouse keepers who move or store excise goods, either their own or on behalf of others
  • Import and export companies that specialise in moving excise goods
  • Customs agents and brokers that have registered to use EMCS and act on behalf of the owners of excise goods
  • Large supermarkets that need to move and receive excise goods
  • Sole traders and smaller businesses that want to move or trade excise goods
  • Logistics companies (such as DPD) that move excise goods on behalf of others
  • Freight forwarding companies that move excise goods on behalf of others
  • Haulage companies that move shipments of excise goods on behalf of others

1. Understand users and their needs

Decision

The service met point 1 of the Standard with conditions.

What the team has done well

The panel was impressed that:

  • the team has iterated their designs around user needs, feedback and evidence - for example, by including hint text where users needed help or further information
  • the team has carried out research with seven users with declared access needs, and three users of assistive technology, to better understand what the service needs to do for these users
  • the team keeps thorough research documentation, such as their user needs spreadsheet, and has collaborated on iterating user needs
  • the team has iterated their personas in light of new evidence

What the team needs to explore

Before the next assessment, the team needs to:

  • review their personas and consider whether the entirely separate persona for users with access needs is necessary. As currently set out, it suggests that 'freight forwarder' or any of the other roles would have no access needs, when access needs could occur across all of the personas the team has generated. With further research evidence, the team may still conclude that a separate persona is needed.
  • review their user needs against the guidance in the GDS Service Manual. Currently, many of the team's user needs refer directly to EMCS, which describes a solution rather than a problem, making them closer to user stories than user needs. The panel appreciates that this is a tricky area, as the solution had already been decided for the team. Since use of EMCS is mandatory, 'jobs to be done' (tasks that users must complete) may be a more helpful framework than user needs.
  • consider a broader range of methods, such as diary studies, tree testing and card sorting, to build a richer picture of users' mental models and gain further qualitative insights

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has spoken to a range of users during research, as requested at alpha
  • the team has mapped out complex service journeys using tools like Mural
  • the team has created a high-fidelity coded prototype to demonstrate a range of interactions in the service

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to look at the service and its touchpoints in the broader context of trade

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed an awareness of the wider service ecosystem
  • the team has tried to reduce cognitive load in a service with complex subject matter
  • the team is engaging with their wider organisation, including the EMCS helpdesk, in anticipation of future support queries

What the team needs to explore

Before the next assessment, the team needs to:

  • consider bringing non-digital assets (printed e-AD and ARC) into scope for testing and iteration

4. Make the service simple to use

The panel appreciated the level of preparedness shown for the assessment.

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has established a good working relationship with policy colleagues and brought subject matter experts from policy into content discussions
  • the team is using design elements like hint text and reveals to assist user understanding and introduce users to specialist terminology
  • content iteration based on prototype testing and user feedback was demonstrated
  • front end designs have been iterated since Alpha and are consistent with government patterns

What the team needs to explore

Before the next assessment, the team needs to:

  • review and correct some minor content inconsistencies with GDS style in the prototype, for example in the way times and dates are written - a 2i review to make all content consistent with either GDS or HMRC style is recommended
  • consider if Heading 1s can be written as statements instead of questions (for example, ‘Gross mass of the wine’ instead of ‘What is the gross mass of the wine?’) and if some instances of button label text can be improved
  • consider if the overall number of interactions (as shown in prototype) could be reduced
  • when non-digital assets (printed documents) come into scope, include them within the design and user testing remit

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated an iterative approach to design based on user feedback
  • content has been included as an integral part of the service for user testing
  • an accessibility audit is planned and preparation of an accessibility statement is in progress

What the team needs to explore

Before the next assessment, the team needs to:

  • review how they ask users about accessibility needs in their screener form. At the moment, the team asks users whether they have any 'access needs'. They follow this up with examples, but users may not necessarily know what they are being asked. It may be more fruitful to ask something like 'Do you have any conditions or disabilities that affect your day-to-day life or how you use technology?' or 'Do you have any of the below conditions?' and then briefly explain why the question is being asked.
  • continue to develop understanding of any barriers users may face in arriving at the service

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the multidisciplinary team is still largely in place from alpha, ensuring continuity and momentum
  • the team continues to work with an embedded subject matter expert to ensure expert input as appropriate

What the team needs to explore

Before the next assessment, the team needs to:

  • consider what team size and operating model will be needed when the service moves into live, in the context of broader HMRC live service run and support and the current proportion of suppliers to civil servants

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated strong agile ways of working in terms of tools, ceremonies, and cadence
  • the team has a good awareness of how they connect with other services and service teams that feed into theirs. Good stakeholder relationships exist to give and receive feedback.

What the team needs to explore

Before the next assessment, the team needs to:

  • nil

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a robust workflow for collaborative user-centred design, including Mural, a coded prototype, and the use of copy decks for content consistency in production code

What the team needs to explore

Before the next assessment, the team needs to:

  • nil

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses HMRC’s common Auth service for trader authentication and authorisation, which uses two-factor authentication
  • the microservices use their own system-generated credentials
  • the service will not collect Personally Identifiable Information (PII), as names and addresses will only relate to the trader’s organisation
  • the Data Protection Impact Assessment (DPIA) has been approved
  • the Business Impact Assessment (BIA) has been approved
  • the Security Risk Assessment (SRA) has been approved

What the team needs to explore

Before the next assessment, the team needs to:

  • provide more detail about the quality assurance testing that has been performed for the service
  • provide evidence that relevant IT Health Checks and pen testing have been run for the Multi-Channel Digital Tax Platform (MDTP) where the service is hosted

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed a performance framework which has given them a set of KPIs derived from the user needs and aims they have identified; the team acknowledged this is still in development, with some gaps to be filled, and the indicators they use are subject to change as the service develops
  • the team has made sensible interpretations of the mandatory KPIs, including using proxy measures for cost per transaction, which is difficult to measure meaningfully for this service
  • the team complied with the analytics cookie policy; this means they are missing data for about 80% of journeys, and therefore use back-end data to compensate where possible
  • the team outlined plans for development of analytics, including experimentation with AI tools
  • the team holds weekly analytics meetings with representatives from across the team, where findings are discussed and agreed

What the team needs to explore

Before the next assessment, the team needs to:

  • provide better examples of how data has helped to improve and develop the service - although the team did explain how data had given them confidence in changes already made, for example by improving transaction times and reducing looping
  • make better use of contact centre data to understand where users have problems - at present, the contact centre data the team receives is difficult to relate to specific parts of the service because of the way it is recorded; the team explained that they planned to meet with contact centre representatives to improve this

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • technical choices discounted short-term solutions, such as reskinning the existing service, which would not have supported the roadmap for the Multi-Channel Digital Tax Platform (MDTP)
  • the service is being rewritten using the best practices and standards for the MDTP, which will support its maintainability for future iterations
  • the architecture has been designed to support the migration from integrating with older systems to switching to new downstream endpoints

What the team needs to explore

Before the next assessment, the team needs to:

  • follow up on the alpha assessment recommendations of showing what technical learning can be gained from other HMRC Border and Trade services, such as the Customs Declaration Service (CDS), Goods Vehicle Movement Service (GVMS) and New Computerised Transit System (NCTS)
  • provide more detail about how the existing Excise Movement and Control System (EMCS) will be decommissioned

12. Make new source code open

https://github.com/hmrc/emcs-tfe-frontend

https://github.com/hmrc/emcs-tfe

https://github.com/hmrc/emcs-tfe-reference-data

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • microservice code is hosted in public GitHub repos
  • following the recommendation from the alpha assessment, testing stubs have been made available

What the team needs to explore

Before the next assessment, the team needs to:

  • continue to improve the README.md of each repository to explain what the microservice does and how it fits into the overall platform
  • continue to review if more dependent module code can be open sourced

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • standards and best practices from the Multi-Channel Digital Tax Platform (MDTP) are used to support future development of the service
  • the team continues to follow GOV.UK Design System patterns
  • the team is using government patterns in a coded prototype and has used design crits as a forum for feedback

What the team needs to explore

Before the next assessment, the team needs to:

  • follow up on the alpha assessment recommendation to investigate opportunities to share components and patterns with the wider community of users. EMCS has been implemented by at least 27 other countries, according to an EU web page.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service will make best use of the logging and monitoring capability that the Multi-Channel Digital Tax Platform (MDTP) provides
  • Kibana and Grafana dashboards will be customised based on new information as the service progresses through Beta
  • PagerDuty alerts have been set up to notify the team of any immediate issues with the service
  • disaster recovery practices and support will follow other MDTP services

What the team needs to explore

Before the next assessment, the team needs to:

  • provide more detail about the MDTP support capability that the service relies upon
  • provide more detail about the reliability of the existing Excise Movement and Control System (EMCS) and how the service will mitigate risks with its dependencies on legacy systems
  • follow up on the alpha assessment recommendation to consider whether asking users to reinput is the best way to manage a service outage
Published 11 December 2023