Add a Driving Licence Check Code to your Mobile Phone Beta Assessment Report

The report from the beta assessment of DVLA's Add a Driving Licence Check Code to your Mobile Phone service on 14 June 2018.

From: Central Digital and Data Office
Assessment date: 14 June 2018
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Licensing Agency (DVLA)

The service met the Standard because:

  • the service team have addressed the conditions raised at the alpha assessment, particularly focussing on the security and technology used to deliver the service
  • the service team have agreed to run a robust private beta with at least one external organisation, which will focus on understanding contextual use of the service
  • the service team have agreed to undertake further research with users with access needs in context
  • the service team have agreed to undertake further research on cross-channel journeys, particularly users’ needs around desktop and mobile devices and integration with the existing “View and share your driving licence information” service

Description

The service enables users to create a check code which they can save to their mobile phone and use to give organisations access to their live driving licence information. This will be used by members of the public looking to hire cars, and by organisations who employ drivers and need to meet their legislative duties to ensure that all drivers are eligible to drive.

Service users

The users of this service are members of the public who drive for a living or who want to rent a car, and organisations that rent out cars or employ drivers.

User needs

The team has identified a need for a simpler way to share check codes. They have used a variety of methods in their research, which has built some empathy for users within the team.

The panel was pleased to see that the team have acted on the recommendations from the alpha assessment. Since alpha the service has been tested with a variety of users including licence checkers, people with low digital skills and some people with access needs.

The research appears to have focussed on what users want rather than what they need. It was unclear what the most difficult user problems were and how research was used to help the team understand and solve them.

The team showed that they had undertaken a number of user testing sessions; however, these have focussed heavily on car rental companies and organisations rather than on members of the public.

The panel was also concerned that usability testing only took place in labs and hotels. The lack of real world context (for both company audiences and members of the public) means that there are gaps in the team’s understanding of how the service might be used from end-to-end.

Contextual testing would help the team understand how the existence of desktop and mobile versions of the service might lead to problems for different user groups and for users with access needs.

During a private beta the panel would expect a wider range of partners from priority user groups to use the service. Testing within DVLA is insufficient. It isn’t a varied enough sample of contexts and participants to surface potential problems.

Given the potential scope of the service, the team should have included a larger number of users with access needs in their beta. By this point the panel would expect the team to have a clearer understanding of their users’ access needs. The panel was pleased to see that the team intends to continue testing with users with access needs in public beta. The panel recommends that they approach this in a more systematic way so they understand which users with access needs will have the most problems.

Team

The service team were represented at the assessment by a Service Manager, Service Designer, Delivery Manager, Developer and Interaction Designer. Notably, there was no User Researcher present. User testing activities have been undertaken (and were presented at the assessment) by the team’s Customer Insight Lead, who was present.

The service team were proud to be made up exclusively of permanent DVLA staff. This is rare and should be celebrated. However, the team should also note that the work of this service is relatively new and novel within government, and in such circumstances it may be appropriate to seek outside skills and influences, looking for opportunities for knowledge sharing in order to build capability within the team.

The service team provided the panel with a comprehensive view of their team structure, which is heavily weighted towards development and technologists. There is an established and adept development team in place, made up of Developers, Architects, Business Analysts and Testers.

However, given the novel nature of the service, the team should have considered whether it was appropriate to increase the number of User Researchers within the team at beta, especially given the changing nature of the technology and security decisions, which would have a direct impact on usability.

The team described their agile ceremonies and prioritisation exercises involving all members of the team. With a team heavily weighted towards development and technology practitioners, the service team should ensure that the voice of the user is heard; this may be more difficult for a less experienced user researcher in a large team.

The team should develop a private beta focussing on contextual understanding of users’ needs and behaviour, bringing in contractor resource or more experienced User Researchers from within the organisation where needed, to ensure that the findings of this phase are focussed on users’ needs in context.

Technology

The technology stack of the service largely extends the existing DVLA data platforms and the AWS-based components and hosting of the View/Check Driving Licence service. As such, there is reduced technical risk in deploying the new service. Live support is available for the service 24/7, and the service development team appears well integrated with operational considerations. Technology choices seem appropriate.

The Drone.io/Spinnaker deployment process built around Docker images is an advancement for DVLA and fits well with a continuous integration/delivery model, and the team has the capacity to deploy production patches within hours if needed, with zero downtime. The actual target pace for release deployment is considerably slower, however; rather than deploying monthly as planned, we encourage the team to move to a rolling, continuous deployment model in which the service is gradually improved daily as a matter of course.

It is appropriate to offer both an iOS version and a generic web version of the mobile application, provided that a wide range of devices and browsers are used to test the application and no group of users is excluded. We expect the team to regularly verify that no users are unreasonably excluded from this service. This requirement to provide equal access will require the team to follow the mobile technology space closely and continue to upgrade the service to remain compatible with all platforms. We hope that the team will continue to monitor the development of Progressive Web Application technologies carefully, as the standardisation of Service Worker proxies as a cross-platform way to handle offline access to the application seems to fit the use case, and would result in less vendor-specific code being needed.
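
As an illustration of the Service Worker approach mentioned above, the minimal sketch below (in TypeScript) pre-caches the check code page so it can still be presented to a checker while the device is offline. The route and asset names are illustrative assumptions, not DVLA's implementation.

```typescript
// Minimal Service Worker sketch: pre-cache the check code page so it
// can still be shown to a checker while the device is offline.
// Route and asset names are illustrative assumptions.
const sw = self as unknown as ServiceWorkerGlobalScope;

const CACHE = 'check-code-v1';
const PRECACHE = ['/check-code', '/assets/app.css', '/assets/app.js'];

sw.addEventListener('install', (event) => {
  // Cache the pages and assets needed to display the code offline.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(PRECACHE)));
});

sw.addEventListener('fetch', (event) => {
  // Network-first: prefer fresh licence data, but fall back to the
  // cached copy so an offline user can still present their check code.
  event.respondWith(
    fetch(event.request)
      .then((response) => {
        if (event.request.method === 'GET') {
          const copy = response.clone();
          caches.open(CACHE).then((cache) => cache.put(event.request, copy));
        }
        return response;
      })
      .catch(async () => (await caches.match(event.request)) ?? Response.error())
  );
});
```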

DVLA’s information assurance and cybersecurity functions are not permitting the service team to work in the open by publishing the code for the service to an open repository, and seem to have set a standard prohibiting the publishing of “business logic”. This does not meet point 8 of the Digital Service Standard. We welcome the publication of some new libraries to an open repository immediately following the assessment, but it is not clear whether this represents all the new code being developed. We encourage DVLA to move to a model of “working in the open”, with the very limited exceptions of keys and credentials, algorithms used to detect fraud, and code pointing to unreleased policy. The team must also create scripts to continually refresh the open code repositories upon release of code to production, so that the open and closed repositories remain in sync. We recognise that this is a journey from closed to open code and that incremental steps must be taken, but we emphasise that working in the open is an important part of government’s strategy to improve the quality and security characteristics of digital services.
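
A sketch of the kind of sync script recommended above follows: on each release to production, mirror the released code to the open repository so the two stay in step. The remote URL and ref below are placeholders, not DVLA's actual repositories.

```typescript
// Sketch of a release-time sync script: mirror the code just released
// to production into the open repository so the open and closed
// repositories stay in step. The remote URL and ref are placeholders.
import { execSync } from 'node:child_process';

const OPEN_REMOTE = 'git@github.com:dvla/example-service.git'; // placeholder
const RELEASE_REF = process.argv[2] ?? 'main'; // tag or branch just released

function run(cmd: string): void {
  console.log(`$ ${cmd}`);
  execSync(cmd, { stdio: 'inherit' });
}

// Fetch the exact ref that went to production, then push it (with
// history) to the open repository. This assumes keys, credentials and
// other exempt material are never committed to the repository at all.
run(`git fetch origin ${RELEASE_REF}`);
run(`git push ${OPEN_REMOTE} FETCH_HEAD:refs/heads/main`);
```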

Design

The service team has made good use of the GOV.UK design patterns to present information in a clear and consistent way, giving users the confidence that this is an official DVLA service.

The journey through the service is straightforward and short, with the right amount of content at each stage to help users complete the task. Where there was potential for dead ends, such as a user not holding a photocard licence, support is provided to allow the journey to continue. Possible areas for improvement include guiding non-British licence holders, for whom the facility is not available, to a positive outcome, and the terms and conditions page, where content is hidden.

The service team were able to talk through different iterations of previous approaches, for example, the service’s name, the pre-GOV.UK Verify information page, the confirmation page and pages targeted at Android device users.

While there is utility in adding a driving licence check code to your phone, not least the ability to be offline when presenting the check code for checking, it is not clear whether the current design meets users’ needs or is merely part of a longer-term DVLA mobile strategy. Furthermore, the private beta did not engage enough users to inform the service’s design direction or give the panel confidence that the service meets latent user needs. The service team need to recruit a more substantial number of presenters and checkers to build confidence in their approach.

It is unclear whether this service should be standalone or an additional journey within the current “View or share your driving licence information” service. The service team were not able to clearly articulate where the journey would start: via a sidebar link, via the central GOV.UK directory, within the current ‘view or share’ function, or at the point of check code creation.

The proposed service does not allow users to start their journey on a desktop and finish on a mobile, or vice versa: a typical cross-channel user journey. Similarly, there is no provision for adding or sending an existing or new check code to a mobile device from the “share your licence information” screen in the current “View or share your driving licence information” service. The panel strongly recommends the service team support a cross-channel journey.

Furthermore, the service team was not able to justify why the current view or share service will not be enhanced to include a QR code and, perhaps, photograph. The addition of the former will provide greater utility to the existing facility and align with the plan for checkers to scan a code rather than typing it.
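
To illustrate the scanning flow described above, the minimal sketch below renders a check code as a QR code, using the open-source `qrcode` npm package; the check code value shown is a made-up example, not a real DVLA code.

```typescript
// Minimal sketch: render a check code as a QR code so a checker can
// scan it instead of typing it. Uses the open-source `qrcode` npm
// package; the code value below is a made-up example.
import QRCode from 'qrcode';

async function renderCheckCode(checkCode: string): Promise<string> {
  // Produce an SVG string that could be embedded in the
  // "share your licence information" page alongside the text code.
  return QRCode.toString(checkCode, { type: 'svg', errorCorrectionLevel: 'M' });
}

renderCheckCode('Ab cd Ef 12').then((svg) => console.log(svg.slice(0, 80)));
```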

The authentication mechanism and level of assurance differ significantly between the web and mobile versions of the service. The web version requires a driving licence number, National Insurance number and postcode to authenticate at Level of Assurance 1 (LoA1), while the mobile version uses GOV.UK Verify to authenticate at Level of Assurance 2 (LoA2).

The mobile wallet card expires after 21 days. While the card includes the expiry date, the service team recognise this could be an area of frustration for users if they fail to notice that the code has expired when presenting it to a checker. The panel recommends exploring a generic web-based version of the card to avoid issues with check code expiry while also supporting the broadest number of users.
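
As a small worked example of the 21-day expiry described above, the sketch below computes how many days remain on a check code so an interface could warn the user before the code lapses; the function and dates are illustrative, not DVLA's implementation.

```typescript
// Worked example of the 21-day check code expiry: given the date a
// code was issued, work out how many days remain so the UI can warn
// the user before they present a lapsed code to a checker.
const CODE_LIFETIME_DAYS = 21;
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function daysRemaining(issuedAt: Date, now: Date = new Date()): number {
  const expiresAt = issuedAt.getTime() + CODE_LIFETIME_DAYS * MS_PER_DAY;
  return Math.ceil((expiresAt - now.getTime()) / MS_PER_DAY);
}

// Example: a code issued on 1 June has 7 days left on 15 June,
// so the wallet card could show an "expires soon" warning.
const remaining = daysRemaining(new Date('2018-06-01'), new Date('2018-06-15'));
console.log(remaining <= 0 ? 'Code expired' : `${remaining} day(s) remaining`);
```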

The panel was pleased to hear that the DVLA has a dedicated support line for the view and share service, with additional people being allocated to support this service.

However, the panel was disappointed to discover that the service team had put little effort into designing for users with accessibility needs. Only five people had participated in the associated research during private beta.

The service team had engaged the Digital Accessibility Centre (DAC) to conduct an audit of their service last September, with a second audit to take place in July. However, this is too little and too late. Designing for accessibility needs must be at the forefront of service development.

Analytics

The service team are actively using digital analytics to help build the service. They understand the importance of using data and analytics to feed into building a digital service and have clearly considered the metrics they need to achieve this.

They work closely with performance analysts, who provide them with analytical expertise. The performance analysts feed any analysis findings back into the service team. They also use alerts from their analytics to report on any performance issues they find; these are fed back to the appropriate members of the service team.

They use the premium version of Google Analytics, which has been approved by their SIRO. They have not anonymised IP addresses because there is a separation between the user and Google Analytics: the user’s IP address is never sent to Google Analytics.
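
To illustrate the kind of separation described above, the sketch below relays page views to Google Analytics server-side via the (Universal Analytics) Measurement Protocol, so the user's IP address never reaches Google; the tracking ID and endpoint path are placeholder assumptions.

```typescript
// Sketch of a server-side analytics relay: the browser posts page
// views to the service, and the service forwards them to Google
// Analytics via the Measurement Protocol. Google only ever sees the
// server's IP; the user's IP is deliberately never forwarded (the
// optional `uip` parameter is omitted). The tracking ID is a placeholder.
import { createServer } from 'node:http';
import { randomUUID } from 'node:crypto';

const GA_ENDPOINT = 'https://www.google-analytics.com/collect';
const TRACKING_ID = 'UA-00000000-0'; // placeholder GA property ID

createServer(async (req, res) => {
  if (req.method === 'POST' && req.url === '/analytics/pageview') {
    let body = '';
    for await (const chunk of req) body += chunk;
    const { page, clientId } = JSON.parse(body);

    // Forward only the fields Google Analytics needs for a page view.
    const hit = new URLSearchParams({
      v: '1',                        // Measurement Protocol version
      tid: TRACKING_ID,              // GA property to record against
      cid: clientId ?? randomUUID(), // anonymous client identifier
      t: 'pageview',
      dp: page,                      // document path, e.g. /check-code
    });
    await fetch(GA_ENDPOINT, { method: 'POST', body: hit });
    res.writeHead(204).end();
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);
```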

They use a number of data sources, including:

  • digital analytics
  • user feedback from surveys, including data from assisted digital users obtained from the call centre
  • telephone call centre feedback, with call data analysed

They have built a number of dashboards to help monitor how the service is working.

The team have worked with the Performance Platform team and agreed not to publish the cost per transaction KPI as it has some commercial sensitivities.

The service team have decided on a comprehensive set of metrics and measures that will help them assess the performance of the service. They worked with the Performance Analytics team to determine the best measures to use, based on that team’s previous experience and what has worked well with other services.

Overall, the service team have a very good analytics implementation. It is clear that their analytical strategy is well thought out and should provide the data they need to iterate and improve the service.

Recommendations

The service team must:

  • work with GDS to design and run a robust private beta with at least one external organisation which will focus on understanding contextual use of the service
  • conduct significant further research with users with access needs, making sure that testing can happen in context rather than in a lab scenario
  • do more to understand user needs relating to cross-channel journeys (desktop and mobile) and to consider further integration with the existing “View and share your driving licence information” service
  • implement the recommendations from the Technology section relating to the ongoing publishing of open-source code, and increase the frequency of releases

The service team should also:

  • consider the User Research capability and resources within the team, given the novel nature of this service, taking steps to learn from external organisations or other government departments with similar expertise in order to increase User Research capability within the team
  • undertake further contextual user research with members of the public
  • address all of the feedback within the Technology section of this assessment report
  • run an Analytics Framework session. GDS Performance Analysts find these frameworks very useful for determining the best metrics to measure how well the service meets user needs and delivers its benefits
  • continue to track the development of Progressive Web Application technology and, if possible, unify the application platform.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Do ongoing user research Met
3 Have a multidisciplinary team Met
4 Use agile methods Met
5 Iterate and improve frequently Met
6 Evaluate tools and systems Met
7 Understand security and privacy issues Met
8 Make all new source code open Met
9 Use open standards and common platforms Met
10 Test the end-to-end service Met
11 Make a plan for being offline Met
12 Make sure users succeed first time Met
13 Make the user experience consistent with GOV.UK Met
14 Encourage everyone to use the digital service Met
15 Collect performance data Met
16 Identify performance indicators Met
17 Report performance data on the Performance Platform Met
18 Test with the minister Met
Published 13 February 2019