Make an export declaration online

Service Standard report for HMRC's Make an export declaration online live assessment

Service Standard assessment report

Make an export declaration online

From: CDDO
Assessment date: 16/07/2024
Stage: Live Assessment
Result: Red
Service provider: HMRC

Service description

This service aims to solve the problem of…

  • making an export declaration online, facilitating the movement of goods across UK borders for exporters and intermediaries. The service allows users to submit, view and manage declarations, which tell HMRC what is leaving the UK, when, how and where it is going.

Service users

This service is for…

  • intermediaries (such as shipping agents, customs agents, freight forwarders)
  • exporters (international traders exporting goods outside of the UK)
  • new exporters (traders who have not previously exported goods outside the UK, or outside the EU before Brexit)

Things the service team have done well:
  • contributing to the programme’s longer term, overarching service design work
  • collaborating with HMRC’s guidance team to design improvements to related, pre-service GOV.UK guidance
  • gaining excellent insights into users and their needs, including through some highly impressive work with call centre and support staff
  • making design changes to reduce pain points identified through usability testing
  • developing a useful understanding of the different kinds of barriers to use faced by their users, such as dyslexia and English as a second language
  • building up organised documentation about design decisions
  • setting up performance analytics, which are demonstrably driving evidence-based decisions and action through a multidisciplinary, iterative approach

1. Understand users and their needs

Decision

The service was rated green for point 1 of the Standard.

2. Solve a whole problem for users

Decision

The service was rated green for point 2 of the Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of: 

  • a simple, joined-up way for users to get support while using this complicated service. The ‘fix declaration errors’ screen has a link to a form on GOV.UK (used in a number of different export-related journeys) to get help with errors. Whilst this direct link is better than the more complicated journey in the older, wider service (phoning up, then getting an email with a link to the form), the user still has to provide details that they have already submitted in the service. The panel recommends that the team explore with the team responsible for the form whether and how this could be improved.

4. Make the service simple to use

Decision

The service was rated red for point 4 of the Standard.

During the assessment, we didn’t see evidence of: 

  • the right balance between reducing the work involved in supporting the service and reducing the burden on the user. The panel appreciates that doing the hard work to make it simple, as detailed in the following points, may mean more ongoing support for the live front end service than the team had envisaged
  • the needs of non-expert users being met, even though new exporters are one of the main user groups. The team told us that if a user thought the service looked too difficult to use, they could pay an agent to make the declaration for them. This means they have not yet made the service simple enough to satisfy the needs of one of the main user groups 
  • a rationale that’s consistent with the Service Standard for the decision to apply no business rules during the front-end journey. This decision means the user has to deal with errors at the end, after the data has been run through business rules in another system. It requires the user to try to understand and fix an error relating to something they might have entered early in the form, possibly a long time ago and even in a different session. Fixing each error means repeating the journey through the service, unless the user notices and uses the secondary button that takes them straight back to the error screen
  • enough work being done to make the errors, and how to fix them, understandable. An example is “Error 1 - The combination of 2 values you entered is not allowed (CDS12056)”, which doesn’t tell the user which 2 values are involved
  • enough exploration of ways to further simplify the screens and journey. The team has put effort into educating the user through clear, in-service guidance, for example by adding explanations in blue boxes for things that new users might be unfamiliar with. But the team didn’t demonstrate that they had adequately explored alternatives to this, especially alternatives that would result in removing content that some users don’t need at all (for example about MUCRs, if they haven’t been given a MUCR). Could the DUCR be partly completed for the user with information that’s already known, rather than trying to educate them on the required DUCR format? There’s some content whose purpose isn’t clear, for example, what action should a user take, at this point, when they see “Your loader will need to shut the MUCR before the departure of the goods”?
  • a clear user need for the links out to complicated policy and technical information on GOV.UK web pages. Whilst the team has tried to give some guidance on screen, inclusion of the links means there are competing sources of information on screen, which can increase cognitive burden. The team explained that there’s not much use of the links and that in testing, users tended not to find the information on the GOV.UK pages helpful. Where the team believes there’s still a need for guidance, the panel recommends doing more work with legal and policy colleagues to establish what specific information users really need in order to make a legally complete declaration, and to meet that need in as easy-to-understand a way as possible in the service itself, without expecting the user to leave the service to read complicated GOV.UK information

5. Make sure everyone can use the service 

Decision

The service was rated red for point 5 of the Standard.

During the assessment, we didn’t see evidence of: 

  • the team’s excellent understanding of different types of users, and the contexts they’re in, having enough of an impact on the overall design of the service. For example, the expectation was that those users who might struggle to use the service would pay an agent to do it for them instead
  • making the service easy enough for those who will struggle more than most with complicated jargon, lots of content on screen, and the need to move possibly multiple times between the service and GOV.UK guidance. For example, a user with ADHD said: “I tend to read the question only and not read anything else unless I have to”, but the resulting user need is documented as “I need clear instructions so that I understand what I need to do to complete the section of the form” rather than “I need the question to be self-explanatory”
  • a broader understanding of how English as a second language (ESL) and issues such as dyslexia affect the entire user journey rather than specific areas
  • a new accessibility audit confirming full compliance with WCAG 2.1 AA. The team should also be aware that by the time their service gets reassessed, it may need to be compliant with WCAG 2.2 AA. More information is available in ‘Making your service accessible: an introduction’ in the GOV.UK Service Manual

6. Have a multidisciplinary team

Decision

The service was rated amber for point 6 of the Standard.

During the assessment, we didn’t see evidence of: 

  • what the team’s resource profile will look like once the service is live. It was evident that further work is needed on the design of the service to make it more accessible beyond expert users. The panel is concerned that this work hasn’t been planned for sufficiently
  • a plan to address roadmap enhancements. These will need to be factored into the handover to the live service to ensure they are not forgotten once the service has transitioned

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated amber for point 8 of the Standard.

During the assessment, we didn’t see evidence of: 

  • statistical analysis and applying that rigour to hypothesis testing – the panel recommends not relying solely on visual comparison, as this risks false positives and negatives, and iteration based on inconclusive evidence (an illustrative example follows this list)
  • iterating the performance framework or the key performance indicators (KPIs) – it’s essential for the team and stakeholders to understand what success looks like and why, and to connect this with user needs. The panel recommends being clearer about what you’re measuring and why, especially when handing over into live, which is not intended for continued iteration at the same rate as public beta
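
As an illustration of the kind of statistical rigour the panel has in mind, the sketch below shows a simple two-proportion z-test that could be applied when comparing, for example, task completion rates before and after an iteration. This is not part of the team’s existing tooling: the function and figures are hypothetical and are included only to show how a hypothesis test separates a real change from noise.

```python
# Illustrative only: a two-proportion z-test comparing task completion rates
# before and after a design iteration. All figures are made-up example numbers,
# not data from this service.
from statistics import NormalDist

def two_proportion_z_test(successes_a, total_a, successes_b, total_b):
    """Return the z statistic and two-sided p-value for the difference
    between two observed proportions (for example, completion rates)."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    # Pooled proportion under the null hypothesis of no real difference
    p_pool = (successes_a + successes_b) / (total_a + total_b)
    standard_error = (p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b)) ** 0.5
    z = (p_b - p_a) / standard_error
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: 180 of 300 users completed before the change,
# 205 of 310 after it.
z, p = two_proportion_z_test(180, 300, 205, 310)
print(f"z = {z:.2f}, p = {p:.3f}")
```

In this hypothetical example, an apparent rise in completion from 60% to around 66% gives p ≈ 0.12, which would not normally be treated as conclusive evidence of improvement on its own – the kind of result that visual comparison alone could easily misread.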

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

10. Define what success looks like and publish performance data

Decision

The service was rated amber for point 10 of the Standard.

During the assessment, we didn’t see evidence of: 

  • an up-to-date, comprehensive, live-ready performance framework, outlining the key performance indicators (KPIs) and the link through from business outcomes to meeting user needs – the panel recommends giving more context this way and setting out clearer hypotheses to test
  • how and why the KPIs were fit for purpose and were the most appropriate drivers/indicators of success – in particular, the panel recommends getting a much higher return rate to make customer satisfaction more representative of user experience
  • internal analytics or success measures, specifically HMRC staff satisfaction and feedback around their part in the service, or measurement of operational processes, such as cost to serve – the focus was solely on external users and their issues
  • using performance data or insights from related HMRC and government services to identify common issues and resolutions
  • where or how performance data would be published or shared, particularly to inform decisions around similar and related HMRC and government services

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated amber for point 13 of the Standard.

During the assessment, we didn’t see evidence of:

  • close enough attention to the detail of design patterns – for example, we’ve seen an error message that could result in an incorrect answer; a radio pattern used where checkboxes should be, so that the user can see what they’ve already entered in the conditional reveals; a question requiring a long explanation that does not follow the H1 content pattern for complex questions, resulting in two differing instructions on screen; and link text that could be more specific (“More details about what is required on this page”)

14. Operate a reliable service

Decision

The service was rated amber for point 14 of the Standard.

During the assessment, we didn’t see evidence of: 

  • sufficient knowledge transfer and process documentation for the live support team, who will be responsible for handling incidents and operating a reliable service

Next Steps

Red

In order for the service to continue to the next phase of development, it must meet the Standard and get GDS spend approvals. The service must be reassessed against the points of the Standard that are rated red at this assessment.

Updates to this page

Published 1 October 2025