Check how to import or export goods

The beta Service Standard assessment report for the Cabinet Office's Check how to import or export goods service, assessed on 15 July 2021

Service Standard assessment report

Check how to import or export goods

From: Central Digital & Data Office (CDDO)
Assessment date: 15/07/2021
Stage: Beta
Result: Met
Service provider: Cabinet Office

Previous assessment reports

  • Alpha assessment report: 29th October 2020 - Met

Service description

This is a digital service that provides structured guidance in one location for specific goods that a trader is trying to import or export. “Check how to import or export goods” (CHIEG) is a component of the Single Trade Window (STW), which will become a digital service providing a single source of truth for the import and export process across HMG.

Service users

This service is for:

UK traders who have little to no import or export experience: they currently trade only within the UK and/or EU, bringing in or sending out goods that do not require customs declarations.

UK freight forwarders who manage the import and/or export process for traders: they act as agents and can handle all or part of the import/export process depending on their client’s requirements (for example, customs clearance or movement of goods).

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team treated user research as a team sport, involving the whole team in research and cultivating a shared understanding of users
  • the team adapted to the challenges of not being able to do contextual observation in person with a mix of moderated and unmoderated methods, including diary studies
  • the team conducted research with users with a range of access needs, and with proxy users via assisted digital routes
  • the team understood the impact of political policy decisions, especially around Northern Ireland, on the service’s users and adapted research as a result
  • the team triangulated insights from research, performance analytics and contact centre feedback to identify areas for research or iteration

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct contextual observation of real users using the real service, as and when department policies allow for face-to-face research
  • explore writing user needs that are less solution-driven to give the team a clearer focus on what problem a user needs to solve
  • exhaust their options for doing real-world testing of the assisted digital route through the service if at all possible

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working in the open and across organisational boundaries, building relationships with departments such as HMRC, DIT, DEFRA, FSA, GDS, Home Office and BEIS, all of which own services that are part of the importer’s end-to-end journey
  • the team has prioritised developing the licensing journey which is causing significant issues for users and has plans to address other pain points in the end-to-end journey through public beta

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate through analytics that the service has returning users from across the UK who successfully complete their journeys once functionality to complete transactions is developed or integrated, supported by case studies showing how the service has benefited a small or medium-sized trader in successfully importing their goods into the UK
  • continue ongoing collaboration with other government departments to work towards minimising the number of times users have to provide the same information to government
  • consider adding a roadmap link to their service, similar to GOV.UK Pay, to give users an indication of when new features will be added to the service

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team provided excellent examples of where they have been working with front line operations to design the call journeys
  • the team has strong relationships with other teams and departments
  • the team has spoken to over 40 of the existing services in the importing and exporting area to understand what different needs are being met already
  • the team has built relationships with over 60 policy owners across government responsible for the guidance and services that are referenced and linked to, and has ensured it has early oversight of any future policy changes
  • the team has clearly demonstrated the need for a solution in this challenging and complicated area of government

What the team needs to explore

Before their next assessment, the team needs to:

  • continue ongoing work to create a service that is joined up and provides more functionality than a signposting service. As mentioned in the presentation, one of the key findings is that users want a single entry point to complete all their import activities, but this non-transactional service does not currently solve that fragmented journey. The guidance linked to from this service is mostly generic outside of licensing, and where users need to use other services to complete their documentation to import goods they are asked similar questions, such as the commodity code, so it is not a single joined-up experience. The assessors appreciate, however, that this will not be solved quickly or easily
  • for public beta, demonstrate that Welsh language support for the service has progressed

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is consistent and follows the GOV.UK design system from an interaction design point of view
  • there have been several rounds of testing with iterative design improvements that are well documented

What the team needs to explore

Before their next assessment, the team needs to:

  • review and address all ‘should’ recommendations in the content audit. If the findings from the content audit also apply to the export service, the insights should be shared with the export service team
  • merge the export and import journeys and have a consistent look and feel across both services

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the service meets WCAG 2.1 AA standards and the accessibility statement is up to date
  • a good level of live service support is being provided, and the service has been designed with the live service support teams
  • the team has done a lot of qualitative research with their potential users

What the team needs to explore

Before their next assessment, the team needs to:

  • build the commodity code search service into the import journey, as this is currently an area of high drop-off

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has all the roles expected of a multidisciplinary team at private beta, and this will remain the same for public beta
  • the team acknowledges there has been a heavy reliance on contractors so far but progress has been made to recruit more civil servants. A Product Lead and Senior Delivery Lead have joined the team and a Service Design Lead will be joining shortly
  • the team has an established process for sharing knowledge, including regular feedback between contractors and civil servants to document technology, research and design choices
  • the team has regular feedback loops with DIT’s Check how to export goods (CHEG) service and HMRC’s Trade Tariff service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to reduce the reliance on contractors to create a sustainable service team

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using agile ways of working effectively with the appropriate ceremonies in place
  • the team provided multiple examples of how they’ve adapted based on retrospective feedback, including moving some parts of the team to kanban to better reflect the user recruitment cycle, creating more consistent user stories against a defined criteria and ensuring everyone has the opportunity to contribute during meetings
  • the Product Owner has made a conscious effort to minimise the number of governance meetings team members need to attend, carefully picking the appropriate people in each case. This should continue going forward to ensure the meetings are held with people at the right level and continue to add value

What the team needs to explore

Before their next assessment, the team needs to:

  • be empowered to change the direction of the service if the outcome of following the recommendations in points 2, 3 and 10 shows that it’s not the right solution to the problem. For example, if the team discovers that the service isn’t having the expected impact with importers in reality, it should be able to review and explore other ways to solve the problem

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has consistently taken on board user research insights to iterate the service, for example informing users when they are being directed to another service for the export journey
  • the team has worked with the GOV.UK content team to iterate the start page content

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • because the service is not transactional, it does not collect any personal information from users
  • the team has engaged with the DPO and was informed that a DPIA is not required, as the service does not hold any personal data and only provides information
  • the team has published the cookie policy and privacy policy
  • data in transit is secured and encrypted using TLS 1.2 or greater, and connections are directed through an AWS CloudFront WAF. Connections between the microservices within the Kubernetes cluster are encrypted via the Istio service mesh
  • outgoing connections from the CHIEG services to the AWS RDS PostgreSQL database, AWS Elasticache Redis and the Online Trade Tariff API are all encrypted using TLS 1.2 or greater
  • for data at rest, the PostgreSQL database is provisioned using AWS RDS and configured to be encrypted at rest; the Redis session store is provisioned using AWS Elasticache and configured to be encrypted at rest; and the logs within Elasticsearch are encrypted in transit and at rest (provided by Elastic Cloud). A minimal illustrative sketch of this configuration follows this list
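
To illustrate what this looks like in practice, the sketch below shows how encryption at rest for the RDS and Elasticache stores could be declared in Terraform. It is a minimal, hypothetical example using assumed names and instance sizes, not the team’s actual configuration.

    # Illustrative sketch only: hypothetical names and assumed instance sizes;
    # networking and most other settings omitted for brevity.
    resource "aws_db_instance" "cms" {
      identifier                  = "chieg-cms"     # hypothetical identifier
      engine                      = "postgres"
      instance_class              = "db.t3.medium"  # assumed size
      allocated_storage           = 20
      username                    = "cms"           # hypothetical user
      manage_master_user_password = true            # credential managed by RDS/Secrets Manager
      storage_encrypted           = true            # data at rest encrypted with AWS KMS
    }

    resource "aws_elasticache_replication_group" "sessions" {
      replication_group_id       = "chieg-sessions" # hypothetical identifier
      description                = "CMS UI session store"
      engine                     = "redis"
      node_type                  = "cache.t3.small" # assumed size
      num_cache_clusters         = 2
      at_rest_encryption_enabled = true             # data at rest encrypted
      transit_encryption_enabled = true             # TLS for data in transit
    }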

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a full-time performance analyst
  • the team has well-integrated Google Analytics data collection and visualisation that will enable the data to be shared more widely
  • the team has worked on a performance framework
  • the team is developing metrics that measure success outside of the digital process
  • the team is collaborating with other services to show success and impacts across these
  • the team has defined some success measures and measured the impact of changes made in line with these

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to develop the service so that it solves a “whole problem” for users. This will make it possible to more clearly measure that it is meeting their user needs. Until this work is further developed, it is difficult to know whether the measures are accurately measuring success
  • continue defining goals and measures that focus on the real problem that the service is trying to solve
  • be able to show (through performance analytics as well as qualitative research) a measurable benefit for users that proves the service is meeting the user needs it set out to meet - related to Point 2

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has used the right set of tools and technologies for managing frontend, backend, databases, development, testing and deployment. Some of them are NodeJS, TypeScript, ExpressJS, Nunjucks, GOV.UK Frontend, Java 11, Spring Boot, Spring WebFlux (non-blocking reactive streams), JPA and Flyway (for database schema management). The CMS data is stored in AWS RDS (PostgreSQL) and the CMS UI session data is stored in AWS Elasticache (Redis). The analytics dashboards are built using Tableau and the data is processed from Google Analytics via Google BigQuery. Code is stored in GitHub and Jenkins is used for the primary CI and CD pipelines
  • manual testing is also carried out, including cross-browser testing with BrowserStack, and manual performance testing is carried out with JMeter
  • a suite of automated Selenium end-to-end tests is maintained using BDD, which runs as part of the CI pipeline. Automated Axe accessibility tests run as part of the CI pipeline. Automated OWASP ZAP security tests run as part of the CI pipeline. GitHub CodeQL is used to perform automated static analysis of the code. Dependabot is used for dependency security scanning
  • all infrastructure is built within the AWS cloud. This is built entirely using Terraform infrastructure-as-code, deployed using Terragrunt via Jenkins. The microservices are containerised using Docker and run within an Elastic Kubernetes Service (EKS) cluster. Istio is in use as a service mesh within the cluster. Traffic is routed through a CloudFront WAF/CDN (an illustrative sketch follows this list)
  • security (Splunk) logs are monitored by the Cabinet Office SecOps team
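
To make the infrastructure-as-code approach concrete, the sketch below shows how the EKS cluster and its worker nodes could be declared in Terraform. It is a hypothetical, simplified example: the names, IAM roles and variables are assumptions, and the CloudFront WAF/CDN layer, Istio and Terragrunt wiring are omitted.

    # Illustrative sketch only: hypothetical names; IAM roles and variables
    # assumed to be declared elsewhere; not the team's actual code.
    resource "aws_eks_cluster" "chieg" {
      name     = "chieg"                            # hypothetical cluster name
      role_arn = aws_iam_role.eks_cluster.arn       # assumed control-plane role
      vpc_config {
        subnet_ids = var.private_subnet_ids         # assumed variable
      }
    }

    resource "aws_eks_node_group" "workers" {
      cluster_name    = aws_eks_cluster.chieg.name
      node_group_name = "chieg-workers"             # hypothetical node group name
      node_role_arn   = aws_iam_role.eks_nodes.arn  # assumed worker-node role
      subnet_ids      = var.private_subnet_ids
      scaling_config {                              # containerised workloads scale between 2 and 5 nodes
        desired_size = 3
        min_size     = 2
        max_size     = 5
      }
    }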

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is storing code in private repositories but will make it open once in public beta in mid-August
  • the team is using various open source tools and technologies

What the team needs to explore

Before their next assessment, the team needs to:

  • publish the code in the open

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using GOV.UK Frontend
  • the team is using HMRC and Cabinet Office shared tools and technologies

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the SLA is for the service to be available 24/7, with an uptime non-functional requirement (NFR) of 99.5%. First-line (end-user) technical support is provided by HMRC DCST (Digital Customer Service Team). Assisted digital end users will be supported via the same route (a support form completed on the user’s behalf by the assisted digital agent). Internal users (HMRC) will be able to raise incidents through the HMRC IT Service Desk, which will be routed through to second and third-line teams
  • second and third line support will be provided by the Single Trade Window Platform and Check how to import and export goods UK team
  • the platform team will have at least one support engineer available 24/7
  • the ongoing Check how to import and export goods UK delivery team will supply business hours technical support (9-5 Mon-Fri, excluding bank holidays)
  • Major Incident Management will be provided by HMRC CE&BO
  • as the service is non-transactional, is deemed non-critical, and essentially amalgamates information from several other GOV.UK services, in the event of a service outage users can gather the same information, albeit in a less user-friendly format, from those other GOV.UK services
  • the platform is provisioned as infrastructure-as-code (using Terraform), so in the event of disaster recovery the team can rebuild the environments from it
  • the platform is designed with a recovery time objective (RTO) of 12 hours, which is sufficient for this service

Published 12 October 2021