Driving Theory Test Booking - Alpha Assessment

The report from the alpha assessment of the DVSA’s Driving Theory Test Booking service on 20 January 2016.

Stage: Alpha
Result: Not Met
Service provider: DfT / Driver and Vehicle Standards Agency (DVSA)

Outcome of service assessment

After consideration, the assessment panel has concluded that the driving theory test booking service is not yet on track to meet the Digital Service Standard at this early stage of development.

Reasons

The panel was able to take many positives from the assessment and was assured by the enthusiasm shown by the service manager and the team to build a service that meets user needs.

The team is doing research and has identified the 3 key user groups for their service: candidates; approved driving instructors and trainers; and additional users, including family, friends and support workers.

The service manager was able to demonstrate they’re empowered to make decisions about the service, citing a number of examples. One of these was the decision to remove an ethnicity questionnaire from the end of the user journey having seen it test very poorly with users.

The team are generally following the agile working methods set out in the service manual, despite some of the constraints imposed by the outsourced model they are operating under. The team are using the GOV.UK prototyping kit and the service looks like GOV.UK.

The service did not meet a number of the points in the standard, relating to user research, the team, and design. The panel’s reasons for each are outlined below.

User research - points 1 and 2

The service team did not demonstrate a deep understanding of their users and their needs. Given how broad the potential user group is for booking a driving theory test, the panel felt that the coverage of the research up to this point was insufficient to progress from the alpha phase.

The panel was also concerned that the team's research did not encompass speaking to all of the identified user groups, and that it has primarily focussed on people who may never need to use the service. The feedback they have received is therefore less accurate and insightful than it would be from people with a real need to book a driving theory test. The panel understands that recruiting users and observing them at their point of need is difficult; however, testing with users who are likely to engage with the service in real life would provide invaluable feedback. In addition, the team's research is weighted towards those who have used the existing service. This gives an indication of preference and whether their design is an improvement on the existing service, rather than helping them understand whether the new service meets user needs.

The panel had concerns that the team have not yet accounted for the needs of users who may need assisted digital support to book a test. By recruiting research participants primarily through online methods, the team are not fully considering those who may need support. The team have assumed the 1% of people who book through the phone channel do so because they want to, rather than because they can’t use the online service, and have not carried out research with users who have used third parties for support.

Although the service team were explicit that they will continue doing user research in the next phase, they didn't have a clear plan covering how often they will do research, who they will do it with, and what they hope to learn.

The service team up to this point has had some input from a central DVSA research team during discovery and access to part of a DVLA user researcher’s time during alpha. However, they do not have a dedicated user researcher embedded in the team to lead on the research challenge for this service. The panel believes that the concerns listed above could be addressed by recruiting an experienced user researcher and embedding them in the team to lead the research moving forward.

The team - point 3

The assessment panel had concerns that there were gaps in the multidisciplinary team that are important for building a service that meets user needs. In particular, the team does not have a dedicated user researcher working at least 3 days per week, a dedicated performance analyst, or a content designer.

The panel felt there was not a clear separation between some of the key roles in the multidisciplinary team. For example, the service manager, rather than a performance analyst, is responsible for identifying actionable data insights that can be used to improve the service. The designer in the team (who is clearly very capable and impressed the panel) is currently stretched across a number of roles: at the moment they are acting as the designer and the front-end developer, and are also making content decisions.

Design - points 12 and 13

It was clear in the assessment that the designer is keen to follow the GDS design patterns and principles, and is using the prototyping kit. However, the panel's view is that the prototype is not yet on track to progress from alpha.

The panel felt the designer in the team was overstretched by covering design, front-end development, content decisions and the interpretation of user research findings. The panel also felt that the design solutions to issues found in research default to those already in the supplier software, rather than trying other GDS patterns that may be more appropriate.

One of the design principles is "do the hard work to make it simple". The panel felt there were examples in the prototype where the service is not intuitive or easy to comprehend. For example, picking a test centre and time slot was difficult to understand and relies on a key to explain the interaction. The error message pattern is being used as a reminder for users to do something, which felt like a way of compensating for parts of the user journey that are not yet intuitive enough.

The assessment panel were assured that the service manager and designer will prioritise significant changes if needed to ensure the service meets user needs. However, doing this requires a dedicated user researcher to find the problems with the prototype in the first instance.

Exemptions

Tools and systems - point 6

The service team are locked into specific technology choices due to the nature of the procurement. While they do have flexibility with parts of the toolchain, they are locked into the core tools for the duration of the contract.

Although this means the service is unable to pass this point, the assessment panel do recognise that the contract was signed before the GDS spend controls and the service standard were introduced. The panel also recognised that the service team are constrained to operate within the terms of the contract.

Open Source - point 8

The nature of the contract is such that the supplier owns all the intellectual property, so making any of it open source is constrained by the terms of the contract. The MIT licence used on the GOV.UK toolkit and GOV.UK prototyping kit (which the supplier has used) means that the supplier is not required to publish any changes they make to these. However, it was mentioned that independent pieces of work that are exclusively for this service could be negotiated separately.

Recommendations

For alpha review reassessment

User research - points 1 and 2

  • Conduct user research with each of the 3 key user groups identified by the service team to better understand their needs and see how the developing service meets those needs.

  • Intercept users at their point of need wherever possible for usability testing, or conduct research with those for whom booking a test is likely to be a real need, rather than relying on scenario-based testing, which can be unengaging for users.

  • Conduct user research with potential users of the service who have the lowest level of digital skills and establish their barriers to using the digital service. Recruit participants through offline methods - third parties and approved driving instructors may be able to help with this.

  • Recruit an experienced user researcher and embed them in the team to lead the user research and usability testing.

  • Develop a plan for user research and usability testing for the remainder of alpha and for the subsequent beta development.

The team - point 3

  • Recruit an experienced user researcher and embed them in the team. Recruit a content designer and performance analyst.
  • Ensure there is a separation of key roles within the multidisciplinary team so that the same person is not performing a number of roles and can focus on their area of expertise.

Open Source - point 8

  • Explore open sourcing the prototype that has been created with the GOV.UK prototyping kit and ensure all new patterns are added there first so that other parts of government can easily make use of them.

Design - points 12 and 13

  • Prototype and research the newer cross-government design patterns, including appointment bookings, check your answers and confirmation pages (see the design report).

  • Identify the pain points in the prototype through user research and usability testing; and iterate the prototype based on those findings.

  • The prototype is currently using error message styles to highlight important information and explain the meaning of icons. This is misleading for users, as it implies that they have done something wrong. The team should explore alternative ways of displaying the information without resorting to error messages, such as using text descriptions instead of icons and simplifying the quantity of information given (for example, is there a validated user need to know whether there are cafe facilities near a test centre?).

  • You can currently only use the service if you have an email address, but that is only made clear at the end of the process. The email address and phone number questions should be moved to the eligibility section at the beginning of the service to avoid wasting users’ time if they don’t have an email address.

Digital Service Standard points

Point Description Result
1 Understanding user needs Not Met
2 Improving the service based on user research and usability testing Not Met
3 Having a sustainable, multidisciplinary team in place Not Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them N/A
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source N/A
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not Met
13 Ensuring consistency with the design and style of GOV.UK Not Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

N/A - service is exempt from meeting these points of the Digital Service Standard due to the constraints of the legacy contract.

Published 22 December 2016