Fitness to Drive - Beta Assessment

The report from the beta assessment for DVLA Fitness to Drive service on 24 August 2016.

Stage: Beta
Result: Met
Service Provider: Driver and Vehicle Licensing Agency

The service met the Standard because:

  • The service team have demonstrated clear progress in building a user-focused service since the last assessment.
  • The team has a clear road map to build out the service from the initial small number of medical conditions.

About the service

The service allows citizens who are drivers to tell DVLA about a relevant medical condition or to renew their existing medical driving licence. The project is delivering a new online notification service enabling customers to tell DVLA about their medical condition, and a similar service enabling customers to renew their existing medical driving licence.

Detail of the assessment

Lead Assessor: Jeremy Gould

The service owner and team demonstrated good progress in developing the functionality of the service and have a clear road map to build out quickly from a small (but significant in terms of transactions) number of medical conditions to a wider, more comprehensive service.

The team’s knowledge and understanding of the user need was clear and their approach to managing the development of the service was consistent with good practice as set out in the service manual.

User needs

Following on from alpha, the team continue to work hard to find and meet with potential users of the service through a variety of channels, including those with the lowest levels of digital skills, confidence or access. The panel were pleased to learn that the team have conducted 14 rounds of usability testing across the country, in user research labs and in people’s homes. The objective of the research was clear within the team; however, the design of the tasks was not well articulated, and care should be taken to design usability testing tasks that will answer the team’s questions.

Assisted digital user needs

The team have appointed an Assisted Digital lead, and have made good progress usability testing the phone support in the lab, and identifying other support groups such as charities. They recognise that these support groups often don’t have the necessary digital skills themselves and are looking into other support routes.

There was some confusion over the recruitment of users with lower confidence and low digital skills, who were conflated with users with accessibility needs. The team should make sure that assisted digital and accessibility are treated as two separate issues, and care should be taken to make sure the two are tested separately.

Team

Release Cycle

For various internal and external reasons the team have adopted a pattern of releasing four weeks (two sprints) of work in one go, although they have occasionally fast-tracked fixes for a quicker release. There can therefore be a lag of up to four weeks between a developer completing work and it being deployed to production.

We would encourage the team to consider whether there may be a net benefit in empowering developers to release work as soon as it is done, holding work back only by exception.

Team structure

The team are dependent on a contractor for interaction design and usability testing expertise, so the team should ensure that appropriate knowledge transfer to permanent staff happens.

Technology

The panel was impressed by the team’s approach to choosing technology, coding in the open and addressing security and privacy issues. The panel have outlined below a few areas where there is potential to improve.

Legacy systems

The team has chosen to focus on the front end user experience for this service, and has pragmatically integrated with existing systems and processes by generating and emailing a PDF version of the existing paper form into DVLA’s case processing system. Without updating the backend systems there are limits on how much automation can be achieved, and thus on how quickly and cost-effectively the end-to-end transaction (declaration to updated licence on the doormat) can be fulfilled.

The choice of a PDF as the interface, combined with the security conscious decision not to store historic data in the frontend means that much information about the user’s answers is not available in a digital form, limiting the DVLA’s ability to service emerging user needs. For example, if DVLA needed to contact users who answered “Yes” to a particular question within the service, it would not be feasible to extract the list of users, as the answers have been distilled into a PDF and then processed by a caseworker - there’s no database containing the answers that users gave.

Recommendation: The panel would encourage the team to work with other affected teams to understand the needs and advance the business case for refreshing DVLA’s backend systems with a view to automation and ability to stand up RESTful interfaces.
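To illustrate the limitation described above, the sketch below assumes a refreshed backend could hold answers as structured data alongside the generated PDF; the store, case references and question keys are all hypothetical, not DVLA’s actual systems.

```python
# Hypothetical sketch: if a refreshed backend stored submitted answers
# as structured data (rather than only a PDF), questions such as
# "which users answered yes to a particular question" would remain
# answerable. All names here are illustrative assumptions.

def record_submission(store, case_reference, answers):
    """Persist the raw answers keyed by case reference before the PDF
    is generated, so they remain queryable later."""
    store[case_reference] = dict(answers)

def users_who_answered(store, question, value):
    """Return the case references whose answer to `question` was `value`."""
    return [ref for ref, answers in store.items()
            if answers.get(question) == value]
```

For example, `users_who_answered(store, "q7", "yes")` would return the cases that would otherwise require a caseworker to re-read each PDF.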

Zero downtime deployment

In discussion, it became clear that there was potential to improve the user experience for users who are mid-session during a deployment. If the number or order of steps in the journey changes between the old and new version of the service, the user can end up with an invalid set of answers and see an error page. We discussed some potential ways to eliminate this issue using technical measures such as pinning in flight users to a version of the journey and specifically coding backwards compatibility into new versions of the journey.

Recommendation: The panel encourage the team to account for in-flight users during development and within the deployment process.
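One way to picture the pinning approach discussed above is sketched below: each session records the journey version it started on, and routing uses that pinned version rather than the latest deployment. The journey definitions and session fields are illustrative assumptions, not the team’s implementation.

```python
# Hypothetical sketch of pinning in-flight users to the journey
# version they started on, so a deployment that changes the number
# or order of steps does not invalidate their answers.

JOURNEYS = {
    "v1": ["licence", "condition", "confirm"],                # old step order
    "v2": ["licence", "condition", "gp-details", "confirm"],  # new release adds a step
}

CURRENT_VERSION = "v2"  # the version the latest deployment serves

def start_session(session):
    # Record the version at the start of the journey;
    # it never changes mid-flight.
    session["journey_version"] = CURRENT_VERSION
    session["step_index"] = 0

def next_step(session):
    # Route against the pinned version, not the latest deployment.
    steps = JOURNEYS[session.get("journey_version", CURRENT_VERSION)]
    i = session["step_index"]
    if i + 1 < len(steps):
        session["step_index"] = i + 1
    return steps[session["step_index"]]
```

A user whose session was pinned to "v1" before a deployment continues through the three-step journey, while new sessions pick up the four-step "v2" journey.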

Differences between pre-prod and prod deployment

In the current development process, completed work is deployed continually to the pre-prod environment, whereas the prod environment moves from a four-week-old release to the current release during deployment. The transition that happens in production has therefore not been tested in full anywhere else. This hasn’t been an issue so far.

Recommendation: The team should update the release process to ensure pre-prod and prod deployment are identical.

Disaster Recovery

The service is currently hosted on VMware-based hosting in a single data centre. The team estimated it would take 10 days to recover the service in a second data centre, and stated that in this event users would have the option of downloading and completing the paper forms. While this risk may be palatable for now, future automation will reduce capacity in the paper channel.

Recommendation: The team should continue to investigate hosting options.

Monitoring

While plans to monitor the service in public beta and beyond appeared robust, the team informed us that existing monitoring has not reported any outages over the last few months. It is likely that there have been some short outages in that period, based on outages experienced by other services hosted within the same data centre, so it is possible that the monitoring tools are not functioning as expected.

Recommendation: The team should review and test the monitoring systems’ ability to detect and alert on common modes of failure.
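A minimal sketch of what testing the monitoring against common failure modes could look like is shown below, assuming a health check that classifies a probe result; the function, thresholds and failure labels are illustrative, not the team’s tooling.

```python
# Hypothetical sketch: a monitoring check should be exercised against
# common failure modes (timeout, server error, unexpected response
# body), not only the happy path. Names and thresholds are assumptions.

def classify_health(status_code, body, elapsed_seconds, timeout=5.0):
    """Return an alert reason for a probe result, or None if healthy."""
    if elapsed_seconds >= timeout:
        return "timeout"
    if status_code >= 500:
        return "server_error"
    if "OK" not in body:
        return "unexpected_body"
    return None
```

A monitoring test suite would feed each simulated failure mode through the check and assert that an alert reason is produced, which would surface the kind of silent-monitoring problem described above.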

Use of GOV.UK Verify

All users must currently log in using GOV.UK Verify in order to use the online service. If the user cannot Verify, they are required to use the paper form. The team is aware that the overall service must work for all users and is working on simplifying the paper form.

Recommendation: The team should consider whether the improved user experience within the digital service can be offered to users who cannot Verify too - for example by allowing user answers to populate a form which can be printed, signed and posted.

Design

Looking up a medical condition

For a user to look up a medical condition on the private beta, they need to type the name of the condition using exactly the same spelling that DVLA use in the database. This causes difficulty for conditions like Déjà vu, which have accents on the letters (and so are hard to type on a keyboard), and for conditions that have multiple names. The service team have been prototyping how to fix this using synonyms, and plan to start implementing this feature into the beta from next sprint. This feature will significantly improve usability, and the service should not proceed into public beta until it is implemented.
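The kind of matching described above could be sketched as follows: normalise away accents and case, then map synonyms onto canonical condition names. The condition list and synonyms are illustrative assumptions, not DVLA’s data or the team’s prototype.

```python
# Hypothetical sketch of accent-insensitive, synonym-aware condition
# lookup, so "deja vu" matches "Déjà vu" and alternative names resolve
# to the same condition. All data here is illustrative.
import unicodedata

CONDITIONS = {
    "déjà vu": "Déjà vu",
    "epilepsy": "Epilepsy",
}

SYNONYMS = {
    "seizures": "epilepsy",  # alternative name for the same condition
}

def normalise(text):
    # Decompose accented characters and drop the combining marks,
    # so "Déjà" and "deja" compare equal.
    decomposed = unicodedata.normalize("NFKD", text.lower().strip())
    return "".join(c for c in decomposed if not unicodedata.combining(c))

def lookup(query):
    key = normalise(query)
    key = SYNONYMS.get(key, key)  # map synonyms to the canonical name
    index = {normalise(name): display for name, display in CONDITIONS.items()}
    return index.get(key)
```

With this approach a user typing "Deja Vu" on a plain keyboard, or "seizures" instead of "Epilepsy", still finds the condition.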

Testing with alternative routes into the digital service

So far the service team have prioritised their research on users declaring their own medical condition, after having been told to do so by a doctor. The panel would like to see them collaborate with the NHS Digital teams to investigate alternative ways of making users aware of the service.

The panel would also like to see the team do research with users who don’t have the medical condition themselves, but are investigating whether someone else they know with a medical condition (close relative etc) should still have their driving licence.

As far fewer people currently declare a medical condition to DVLA than is statistically likely, the team are working on the assumption that most users don’t declare their medical conditions (even though they are legally obliged to). The panel would like to see the team experiment with less punitive-sounding language where possible, eg “isn’t safe to drive” rather than “revoke licence”, to see if it affects how willing users are to engage with the service.

Non-standard UI pattern on the driving licence question

The team have done the right thing by simplifying the question in this way, but should use the GDS design pattern for conditionally revealing content unless they have strong evidence from user research that their design works better. If they do have that evidence, they should contribute it back to the design community so that we can iterate and improve the GDS pattern.

Case management system

The back-end automation and fulfilment system has not progressed from the last assessment, and still involves a PDF report being emailed to a case management system and a decision letter being sent back to the user via post. The team should consider what is required after the user submits their information (eg the reports pushed into the legacy case management system and the paper notifications back to users).

Analytics

The panel were pleased to see that the team are measuring performance across the full service, from notification to informing the user of the outcome.

The team’s digital take-up target is still 55%, and there was little evidence of plans to increase this. The team should continue to develop plans for increasing digital take up beyond the 55% of users who have a stated preference for digital and an appropriate plan to phase out non-digital channels in the longer term.

Recommendations

To pass the next assessment, the service team must:

  • Have explored options for a more ambitious take-up target than the current 55% of users.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 18 January 2017