The Electronic Visa Waiver - Beta Assessment

The report from the beta assessment of the Home Office Electronic Visa Waiver service on 2 February 2016.

Stage: Beta
Result: Met
Service provider: Home Office

Result of service assessment

The Electronic Visa Waiver service is seeking permission to launch on a service.gov.uk domain as a Beta service.

Outcome of service assessment

After consideration, the assessment panel has concluded that the Electronic Visa Waiver service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria, and should proceed to launch as a Beta service on a service.gov.uk domain.

Reasons

The team has conducted a great deal of valuable user research with real users performing the actual task, making an extra effort to carry out this research in Kuwait and in airports. They gave many examples of how this research has informed the service, and provided evidence of iteration through the alpha and private beta phases.

Whilst there are a number of recommendations in this report, the assessment panel agreed that a public beta phase would offer the best opportunity to implement these recommendations and learn from a much greater volume of users.

Recommendations

The panel recommends that the service team prioritise the following recommendations and start work on these areas ahead of the public beta launch scheduled for late February.

Accessibility

The team must carry out research sessions with users who have a range of accessibility needs, including users of assistive technology in both languages, in order to understand whether these users can complete the service unaided. As the service uses dynamic content (the country typeahead, for example), the code must be properly marked up with the correct ARIA attributes, as sketched below.
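As a rough illustration only, the sketch below shows the kind of ARIA attributes a dynamic typeahead needs so that screen readers can announce suggestions. The element structure and the findCountries lookup are assumptions made for the example, not the service's actual code.

```typescript
// Illustrative sketch: wiring ARIA attributes onto a country typeahead.
// The element structure and findCountries() are assumptions, not the
// service's real code.
function enhanceTypeahead(input: HTMLInputElement, listbox: HTMLUListElement): void {
  // Identify the input as a combobox controlling a listbox of suggestions.
  input.setAttribute('role', 'combobox');
  input.setAttribute('aria-expanded', 'false');
  input.setAttribute('aria-autocomplete', 'list');
  input.setAttribute('aria-controls', listbox.id);
  listbox.setAttribute('role', 'listbox');

  input.addEventListener('input', () => {
    const matches = findCountries(input.value);
    listbox.innerHTML = '';
    matches.forEach((name, i) => {
      const option = document.createElement('li');
      option.setAttribute('role', 'option'); // each suggestion is an option
      option.id = `${listbox.id}-option-${i}`;
      option.textContent = name;
      listbox.appendChild(option);
    });
    // Tell assistive technology whether the suggestion list is open.
    input.setAttribute('aria-expanded', matches.length > 0 ? 'true' : 'false');
  });
}

// Assumed lookup; the real service would filter its own country list.
declare function findCountries(query: string): string[];
```

Keyboard focus management (aria-activedescendant and arrow-key handling) would also be needed; it is omitted here for brevity.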

A thorough accessibility audit by a third party must also be undertaken.

Service Name

The name Electronic Visa Waiver does not sufficiently reflect the user need, and there is no direct translation for “waiver” in Arabic. The panel noted the team’s finding that the acronym EVW is well known amongst the service’s users. The team should research alternative names for the service, following the “good services are verbs” advice.

Progressive enhancement

The service currently requires JavaScript to function. If for any reason the JavaScript is not delivered to the user, it is impossible for them to complete the transaction. The team must ensure that the service works without JavaScript first, and use it only to enhance interactions, as in the sketch below.
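A minimal sketch of this approach, assuming a standard HTML form that already posts to the server (the form and field ids are invented for the example): the script below only layers inline validation on top, so if it never loads the form still submits and the server validates instead.

```typescript
// Illustrative enhancement script; the form and field ids are assumptions.
const form = document.querySelector<HTMLFormElement>('#evw-form');
const passport = document.querySelector<HTMLInputElement>('#passport-number');

if (form && passport) {
  form.addEventListener('submit', (event) => {
    if (passport.value.trim() === '') {
      // Enhancement only: skip the server round trip and flag the error now.
      event.preventDefault();
      passport.setAttribute('aria-invalid', 'true');
      passport.focus();
    }
    // Otherwise the form submits normally, exactly as it would without JS.
  });
}
```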

Security

The team should conduct a penetration test of the service.

Testing

The team should implement a programme of cross-browser and cross-device testing on the service.

One thing per page

Currently some pages in the service ask the user for multiple pieces of information. As the service already uses two languages, the pages are quite complex, especially when validation errors are also displayed. The service should test a version that uses the “one thing per page” pattern to help reduce this complexity; a sketch of the pattern follows below. Asking a single question on each page also reduces the ‘interleaved text’ effect: the current service presents English and Arabic text in separate blocks on content pages, but alternates English and Arabic sentences on the more complex form pages.
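As a rough sketch only (the route paths, question names and session handling are assumptions, not the service’s actual design), the “one thing per page” pattern splits a long form into a sequence of single-question pages, storing each answer before moving on:

```typescript
// Illustrative "one thing per page" flow using Express and express-session.
// Paths, field names and the session shape are assumptions for the example.
import express from 'express';
import session from 'express-session';

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: 'replace-me', resave: false, saveUninitialized: false }));

// Render one question per page (plain HTML, so it works without JavaScript).
function questionPage(name: string, label: string): string {
  return `<form method="post">
    <label for="${name}">${label}</label>
    <input id="${name}" name="${name}" type="text">
    <button type="submit">Continue</button>
  </form>`;
}

app.get('/passport-number', (_req, res) => {
  res.send(questionPage('passport-number', 'What is your passport number?'));
});

app.post('/passport-number', (req, res) => {
  // Store the single answer, then move to the next single-question page.
  (req.session as any).passportNumber = req.body['passport-number'];
  res.redirect('/date-of-birth');
});

app.listen(3000);
```

Each page then needs only one question and one error message in each language, which directly addresses the interleaving problem described above.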

Multiple languages

Designing a service that uses multiple languages brings unique challenges. The team needs to be certain that they have fully explored the different options for creating a bilingual service that is simple, clear and meets users’ needs. It is recommended that the team prototype different designs and patterns and test them with users to explore as many options as possible. Testing a design that might not be technically feasible is a good way to provoke users into talking about the problems, and can reveal insights that may not be surfaced by choosing the safe path.

In addition, the following work should be completed before returning for a live assessment:

• Undertake user research with the internal users of the casework system.

• Develop a clearer articulation of the user needs for the service.

• Ensure that there is a permanent, full-time service manager in place for the service. This person should have the seniority necessary to be responsible for resources and the total operation of the service, as specified by the service manual:

https://www.gov.uk/service-manual/the-team/service-manager

• Ensure that a member of the team is tasked with analysing the analytics data for the service in public beta. This person should be responsible for proactively identifying and developing user stories that result from this source.

• Release the code developed for the service under an appropriate open source licence. Any decisions made not to release code will need to be fully justified at live assessment stage.

• Develop plans for dealing with a denial of service (DoS) attack originating from multiple IP addresses (a sketch of one mitigation layer follows after this list).

• Proceed with plans to use scrubbed data to enable more realistic testing to take place in the pre-production environment.

• Finalise the KPIs for the service and make sure data is available to report on these. In particular, measuring internal casework performance will become critical as usage of the service ramps up considerably in public beta.

• Identify alternatives to the current hosting provider and develop plans for managing migration in order to avoid vendor lock-in.

• Check that the cookies policy details the correct Google Analytics cookie names.
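On the DoS point above, a minimal sketch of one application-level mitigation, assuming an Express service and the express-rate-limit package (the window and limit values are illustrative, not a recommendation for this service). Per-IP limiting in the application does not replace network-level protection, such as a CDN or upstream filtering, against a genuinely distributed attack.

```typescript
// Illustrative per-IP rate limiting with express-rate-limit; the window and
// request limit are example values only.
import express from 'express';
import rateLimit from 'express-rate-limit';

const app = express();

app.use(rateLimit({
  windowMs: 15 * 60 * 1000, // count requests over a 15-minute window
  max: 100,                 // allow at most 100 requests per IP per window
}));

app.get('/', (_req, res) => {
  res.send('ok');
});

app.listen(3000);
```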

The panel also noted that:

• The service team agreed to return for a further service standard assessment before onboarding further countries or adding a payment option.

• When the team considers the addition of payment options, they should engage with the GOV.UK Pay team. The anticipated three-month public beta period before returning for a live assessment will be challenging in light of the additional functionality that the service team wants to add; the team may want to review their timescales in light of the recommendations detailed in this report.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 27 January 2017