Electronic Travel Authorisation (ETA) beta assessment

Service Standard assessment report for the Home Office's Electronic Travel Authorisation (ETA)

Assessment date: 6 February 2025
Stage: Beta
Result: Amber
Service provider: Home Office

Previous assessment reports

https://www.gov.uk/service-standard-reports/electronic-travel-authorisation-eta-alpha-assessment

Service description

This service allows eligible users to apply for and obtain an Electronic Travel Authorisation (ETA): a multi-entry permission to travel to the UK, valid for a period of two years.

Service users

This service is for applicants to the ETA service.

Identified customer groups are:

  • travellers who currently need an Electronic Visa Waiver (EVW) to travel to the UK
  • travellers who currently do not need permission to travel to the UK

User groups within these are:

  • traveller completing the application for themselves
  • someone else completing the application on behalf of a traveller

Secondary users are:

  • carriers, such as airlines, ferry and train companies
  • caseworkers
  • Border Force officers

Things the service team has done well:

  • positive iterations have been made since the previous assessment
  • the team are delivering a complex cross-cutting service at pace, against a background of restrictive policy constraints and deadlines

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • completed end-to-end testing with assistive technology users; this testing is planned and in scope, but it's crucial to ensure its completion to confirm the service works for everyone
  • clear evidence that the journey for users applying on behalf of someone else functions effectively; we raised this concern because users must scroll through multiple screens before being informed that they are not eligible to use the app, and an improvement to consider would be splitting the journey earlier
  • a clear understanding of why some users choose not to use the mobile app, including cases where users could use the app but prefer the web version, possibly due to translation needs, screen size or assistive technology; we also didn't see evidence of how users navigate to the web app, which requires four screens to access, and this may be particularly difficult for those whose first language is not English, as they might not immediately realise a web option is available from the initial content

Optional advice to help the service team continually improve the service:

  • the team has carried out research through various avenues despite challenges and blockers and has a solid research plan. However, the plan should explicitly include people applying on behalf of someone else and ensure unhappy paths and risky assumptions are addressed
  • the team could further explore users who are not using the app or have limited online skills, including who they rely on for support, as well as ensuring the assisted digital support route is tested or identifying ways to help users complete the process independently
  • the team is working with third parties, such as travel agents and carriers, to flag the need for an ETA. We suggest also considering how users will identify the genuine ETA site, given the risk of fake apps and organisations charging a premium to apply on a user’s behalf
  • the team should ensure that elements like webchat are considered part of the overall service, so research, analysis and other insights are shared across the wider service
  • the team should gather evidence on when people apply in relation to their travel plans to provide valuable insight

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

During the assessment, we didn’t see evidence of:

  • user-centred guidance on which platform to use, perhaps using routing for the different user groups (navigation to the web journey has intentionally been made difficult; this should not be the case, considering the web journey seems preferable for significant user groups: people applying for someone else, those who cannot understand English but can use browser translation, and those who need to use a large screen)

Optional advice to help the service team continually improve the service

  • stronger evidence of service design should have been provided; this was previously raised in the alpha report: ‘provide more service mapping evidence ahead of the event so the assessors can more clearly see that the team have thoroughly considered the end-to-end service including context and non-digital elements’

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

During the assessment, we didn’t see evidence of:

  • user-centred guidance on which platform to use (see point 2 notes)

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • a GDS content review of the app and web journeys; iteration will be required based on its recommendations (specifics are not covered here because the team did not provide designs for review ahead of the assessment; for future assessments, all app and web screens should be submitted a week before the session)

5. Make sure everyone can use the service

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • comprehensive end-to-end testing with assistive technology users. This is planned and included within scope. It is vital to ensure this testing is fully completed to confirm that the service is accessible and effective for all users

Optional advice to help the service team continually improve the service:

  • in the absence of official assisted digital support, the assumption that third parties will adequately support users who have difficulty with the app requires further validation
  • the service is currently available in English only, which will be a barrier for many service users; the team plan to test translations (illustrations alone will not be sufficient)

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated amber for point 8 of the Standard.

During the assessment, we didn’t see evidence of:

  • testing with users of assistive technology incorporated into each iteration
  • iterations incorporating recommendations from a GDS content review

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

Optional advice to help the service team continually improve the service:

  • the team should keep making sure that data isn’t kept on the user’s device longer than needed, and keep track of zero-day vulnerabilities, which appear regularly and can allow some apps to access data from other apps
  • the same applies to the web journey, where browser vulnerabilities may allow cross-site snooping (a hypothetical sketch follows this list)
  • likewise, it is often tempting to trust any third-party systems this service uses; however, some of the team’s attention should be devoted to monitoring any incidents those systems might suffer, or any problems arising from their integration
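
To illustrate the kind of browser-level mitigation the panel has in mind, the sketch below sets cross-origin isolation and related response headers in an Express middleware. This is a hypothetical example: the service's actual web stack and configuration were not shared with the panel, so the framework and header set shown here are assumptions, not a review of the team's setup.

```typescript
// Minimal sketch of hardening HTTP responses against cross-site data
// leaks. The Express-based web tier is an assumption for illustration;
// the panel has not seen the service's actual stack.
import express from "express";

const app = express();

app.use((_req: express.Request, res: express.Response, next: express.NextFunction) => {
  // Isolate the browsing context so pages on other origins cannot keep
  // a scriptable window handle onto the service.
  res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
  // Stop other origins loading this service's responses as subresources.
  res.setHeader("Cross-Origin-Resource-Policy", "same-origin");
  // Prevent MIME-type sniffing of responses.
  res.setHeader("X-Content-Type-Options", "nosniff");
  // Avoid leaking application URLs to third-party sites.
  res.setHeader("Referrer-Policy", "no-referrer");
  next();
});

app.listen(3000);
```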

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

Optional advice to help the service team continually improve the service

  • given that around 75% of users use the app, the team should implement Google Analytics in the app as soon as possible; this will also enable the completion rate to be measured for the app (a hypothetical sketch follows this list)
  • the ETA page on GOV.UK states “The fastest way to apply is using the UK ETA app.” However, data presented at the service assessment indicated that in September the “TIME TO COMPLETE APPLICATION - APP” metric was 27 seconds slower than using the web form, and the time to complete using the app won’t include the time taken to download and install it; the messaging on the GOV.UK page should therefore be reconsidered
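
For illustration only, the sketch below shows how an application-completion event might be logged to Google Analytics for Firebase if the app were built with React Native; the app's actual platform, analytics package, event name and parameters were not shared with the panel, so everything shown here is an assumption.

```typescript
// Hypothetical sketch: logging an application-completion event from a
// React Native app via Google Analytics for Firebase. The package,
// event name and parameters are assumptions; the ETA app's real
// platform and analytics setup were not shared with the panel.
import analytics from "@react-native-firebase/analytics";

// Call this when the user reaches the application-complete screen, so
// completion rate can be derived as completions divided by starts.
export async function logApplicationCompleted(durationSeconds: number): Promise<void> {
  await analytics().logEvent("application_completed", {
    channel: "app",
    duration_seconds: durationSeconds,
  });
}
```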

11. Choose the right tools and technology

Decision

The service was green for point 11 of the Standard.

Optional advice to help the service team continually improve the service

  • for the app, the team should consider download time as part of the user experience when testing
  • for the web service, the team should also consider performance, in particular assets loaded upfront; React is known for needing to download a large amount of code upfront, which may be slow for users on low-powered devices or poor connections (a code-splitting sketch follows this list)
  • given that the app offers all the necessary features, it will be tempting to put less effort into maintaining and improving the web version; the panel hopes this won’t be the case. Indeed, browser vendors are working to standardise features like NFC support, and the team should track this effort (and maybe contribute to it), as it might lead to the browser version being able to support all app features, making the app unnecessary (see the Web NFC sketch after this list)

  • the team should follow GOV.UK improvements, especially around foreign currency payments
  • since the code wasn’t shared with the panel, we were not able to assess whether the team and third-party suppliers can write code that is robust and secure, and that adheres to modern software engineering standards
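
To illustrate one way of reducing that upfront payload, the sketch below lazy-loads a route-level component with React.lazy and Suspense, so its code is downloaded only when the user reaches that step. The component and file names are invented for illustration; the service's real code was not available to the panel.

```typescript
// Hypothetical sketch of route-level code splitting in React, keeping
// the initial download small for slow devices and connections. The
// component names are invented; the service's real code was not shared.
import React, { Suspense, lazy } from "react";

// This step's code is fetched only when the user actually reaches it.
const DocumentUploadStep = lazy(() => import("./DocumentUploadStep"));

export function ApplicationJourney({ step }: { step: string }) {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      {step === "document-upload" ? <DocumentUploadStep /> : null}
    </Suspense>
  );
}
```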
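
On the NFC point, Chromium-based browsers on Android already expose an experimental Web NFC API (NDEFReader), sketched below. Support is limited, and the current API covers NDEF tags rather than the chip protocol used by passports, so this indicates a direction of travel rather than something the service could rely on today.

```typescript
// Sketch of the experimental Web NFC API (currently Chromium on Android
// only). Web NFC today reads NDEF tags, not the ISO/IEC 14443 chips in
// passports, so further standardisation would be needed before the
// browser could match the app's document-scanning feature.
async function scanNfcTag(): Promise<void> {
  if (!("NDEFReader" in window)) {
    console.log("Web NFC is not supported in this browser");
    return;
  }
  // NDEFReader is not yet in TypeScript's standard DOM typings.
  const reader = new (window as any).NDEFReader();
  await reader.scan(); // Prompts the user for NFC permission.
  reader.onreading = (event: any) => {
    console.log(`Read NFC tag with serial number ${event.serialNumber}`);
  };
}
```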

12. Make new source code open

Decision

The service was rated amber for point 12 of the Standard.

  • the code wasn’t made available to the panel, who therefore weren’t able to assess good software development practices
  • the team weren’t allowed to work in the open
  • there was no evidence presented of internal discussion about publishing the code

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated green for point 13 of the Standard.

Optional advice to help the service team continually improve the service

  • see point 4. Adequate design review will need to happen to meet point 13 in any future assessment
  • see point 11. Assessment of the code will also need to happen to meet point 13
  • also from point 11, following or contributing to standardisation efforts for NFC support in the browser could be beneficial

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.

Optional advice to help the service team continually improve the service

  • the technical team includes members from five suppliers, and the service relies on several external third-party services for tasks such as identity verification. This creates a risk if business partners change, as knowledge is often lost. The team is encouraged to make sure that, in such an event, knowledge is properly transferred, by keeping documentation up to date, maintaining a decisions log, and generally avoiding vendor lock-in
  • the team has performed an evaluation of possible threats, such as fraud or disruption of the service by bad actors. We expect the team will continue evolving their threat model as the service goes to public beta, as new attack vectors or types of bad actors will probably be identified
  • the comfortable three-day SLA spares the team from providing 24/7 support; we hope the team will nonetheless have everything in place to minimise traveller frustration in the event of a system failure


Next steps

Amber - beta

This service can now move into a public beta phase, subject to addressing the amber points within three months and to GDS spend approval.

This service now has permission to launch on a GOV.UK service domain with a beta banner. These instructions explain how to set up your *.service.gov.uk domain.

Updates to this page

Published 9 October 2025