Book Theory Test Beta Service Standard assessment report

From: Central Digital and Data Office (CDDO)
Assessment date: 18 May 2021
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Standards Agency (DVSA)

Service description

Book driving theory test for cars, motorcycles and other vehicles. Used by members of the public, Approved Driving Instructors and in-house theory testing centres (IHTTCs).

Service users

  • Learner drivers
  • Driving instructors
  • Operational staff at IHTTCs
  • DVSA support staff
  • Test centre site managers

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team built on the foundational research from alpha, identified the remaining gaps and research needs, and used these to set out a comprehensive research plan and roadmap for this phase
  • the team walked through clear and relevant user profiles for the service, embodying key user needs, pain points, and design hypotheses/iterations
  • the levels of validation for the profiles were clearly shown: green for validated, amber for needing further iteration, and red for needing primary research
  • it is encouraging that two of the primary profiles had individual accessibility needs, around dyslexia and hearing loss, factored in, rather than treating users with accessibility needs as entirely separate profiles to be dealt with separately
  • there was evidence that the team had addressed users with accessibility needs and the type of support they would need. This was shown through research recommendations and design iterations that highlighted the support and affordances that should be made available to these users
  • the entire team showed evidence of collaborating in research activities, and this was highlighted throughout the session
  • the team showed evidence of acting on feedback from the mock assessment, for instance around analysing the service data from the third-party supplier and in the general presentation on the day, and this is to be commended

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to prioritise user research with users who have visual impairment needs (screen enlargement, colour blindness and so on) in order to build on the work accomplished so far. This is crucial, given that recommendations for this group will provide benefits across most user groups. Note: this recommendation is separate from the work being done via DAC, who are providing help with the technical accessibility of the service, although it is also useful to use such organisations to test the service with users with accessibility needs
  • as a consideration for the future, while visual representations of user profiles do indicate user needs, pain points and so on, it is also useful for audiences new to the project to see a simple summary table of the key needs for each profile (perhaps as needs, user stories or even requirements), and then, where relevant, to highlight the needs that are common across profiles and those unique to subgroups or individual profiles

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to use the user profiles above to discuss and analyse the scope of the service, and subsequently identify the areas that need further research, using a private/public beta research roadmap
  • the team worked through a lockdown and was able to articulate some of the benefits of doing research remotely during this phase, as well as the steps taken to mitigate the impact of remote access for users with assisted digital needs, which is praiseworthy

What the team needs to explore

Before their next assessment, the team needs to:

  • build on the solid understanding of users with access needs and enact the research plans they articulated
  • explicitly factor the future research roadmap into future planning and, where blockers are predicted, agree mitigation strategies upfront rather than repeating past blockers (see the comment above about widening the remit of external groups such as DAC to conduct their own user testing with users with accessibility needs on the team's behalf). It is hoped that the easing of lockdown will mitigate some of the challenges faced
  • following on from the above, given the sensitive nature of the transition between third-party and in-house service provision, have a core research plan and accompanying mitigations leading up to September 2021, to avoid being stymied by restrictive contractual or supplier agreements, as may have been the case to date

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has managed this large and complex service by running a number of scrum teams and covering gaps in capability with a mix of permanent team members and 2 key suppliers
  • the team works closely with stakeholders
  • team members are all invited to user research sessions and attend show & tells

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that supplier knowledge is passed on to the DVSA team
  • plan for the scaling down of the team after the September release, ensuring that key roles continue throughout the Beta so that the service can continue to be iterated

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team collaborates well with a feedback loop through user research, service design and development
  • agile practices have been followed well, with the team working in fortnightly sprints with daily standups, show & tells and backlog prioritisation sessions
  • the team manage the size and complexity of the service with one project backlog and multiple scrum teams, who all come together in a regular scrum of scrums to refine and prioritise the backlog
  • the team link back their work to a roadmap and to the design principles
  • there is a clear governance and escalation process
  • the team are working collaboratively with stakeholders such as test centre networks

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated numerous iterations across the learner driver, trainer booker and in-house theory test centre user journeys in response to user research, citing seven significant changes with more than 30 iterations between versions
  • the team has spent time iterating the landing page, paying special attention to the needs of users with accessibility issues

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to iterate the service as they learn from launching with real users across all three user journeys

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the right set of tools and technologies, including GOV.UK design patterns, agile delivery and Confluence
  • the team is using Azure, Confluence and MS Dynamics
  • the team is using the right set of testing and CI/CD tools

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how the IHTTC part of the service, which does not currently work with JavaScript turned off, can be progressively enhanced to work without it (a sketch of this approach follows)
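
A minimal sketch of progressive enhancement for a booking step, assuming a Node/Express-style backend; the IHTTC service's actual stack is not named in this report, and the route and field names here are hypothetical:

```typescript
import express from "express";

const app = express();
// Parse standard HTML form posts, so no client-side JavaScript is required
app.use(express.urlencoded({ extended: false }));

// Hypothetical route: the slot-selection form submits as a plain HTML POST
app.post("/ihttc/select-slot", (req, res) => {
  const slot = req.body.slot as string | undefined;
  if (!slot) {
    // Validate on the server, not in client-side JavaScript
    return res.redirect("/ihttc/select-slot?error=no-slot-selected");
  }
  // Persist the choice and move to the next step of the booking journey
  res.redirect("/ihttc/confirm");
});

app.listen(3000);
```

Client-side JavaScript can then layer enhancements, such as inline availability checks, on top of this form without being required for the core task.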

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team reported that a penetration test has been done and another is planned soon; the one issue reported in the pen test has been resolved
  • the team has completed the DPIA; the GDPR work and data retention policy are in progress
  • a Cyber Security Engineer is embedded in the project
  • the team uses good monitoring, auditing and logging tools
  • the team reported that data is encrypted in transit and at rest, and that only authorised users can access production data, with an audit log captured

What the team needs to explore

Before their next assessment, the team needs to:

  • agree and publish the GDPR, data retention and cookie policies

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team reported that a policy has been agreed for open sourcing non-sensitive codebases
  • the team has created the first repositories in DVSA's public GitHub
  • the team is using Confluence as the repository for delivery and technical team documentation

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team reported that they are using various common components from DVSA and GOV.UK (an illustrative GOV.UK Notify call follows the list):
      • DVSA architectural principles and DVSA Azure Active Directory
      • GOV.UK Notify
      • DVSA-wide MS Dynamics CRM
      • DVSA-wide Customer Payments Management System (CPMS)
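
For illustration only, sending a booking confirmation email through GOV.UK Notify's Node client could look like the sketch below; the client and its sendEmail call are the documented API, but the template ID, email address, personalisation fields and reference are hypothetical:

```typescript
// npm package: notifications-node-client (GOV.UK Notify's official Node client)
import { NotifyClient } from "notifications-node-client";

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

// Hypothetical template and personalisation for a theory test booking confirmation
notifyClient
  .sendEmail("booking-confirmation-template-id", "candidate@example.com", {
    personalisation: {
      test_centre: "Example Test Centre",
      test_date: "1 September 2021",
    },
    reference: "booking-ref-12345",
  })
  .then(() => console.log("confirmation email queued"))
  .catch((err: unknown) => console.error("Notify call failed", err));
```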

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team reported that they run automated unit and integration tests on all commits, include accessibility and vulnerability checking in builds, and hold integration test events (a sketch of an automated accessibility check follows the list)
  • the team is doing performance testing in an NFT (non-functional testing) environment at production scale
  • the team is using BrowserStack for multi-browser UI testing
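
As an illustration of the kind of accessibility check that can run in a build, a minimal sketch using Playwright with axe-core; the report does not name the team's actual tooling, so this pairing is an assumption and the URL is a placeholder:

```typescript
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

// Runs axe-core's accessibility rules against the page on every build
// and fails the build if any violations are found.
test("booking start page has no detectable accessibility violations", async ({ page }) => {
  await page.goto("https://example.com/book-theory-test"); // placeholder URL
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});
```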

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has the right processes and workflows in place for when the service is offline
  • the team has SLAs with various stakeholders to maintain the required service availability
  • the team reported that the right level of backup and recovery processes is in place

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that the team has:

  • used information and data about the existing service, which has high user satisfaction, to inform the design of the replacement service
  • tested multiple different design approaches before deciding on one, specifically the ‘try before you buy’ test
  • designed for mobile-first, and tested with users on a range of devices
  • done extensive research on the public booking journey with disabled users and users with low digital confidence, and has iterated the service based on their findings
  • analysed user research and Google Analytics to make sure the service is using the language that users use
  • embedded a service centre staff member in the team to help design the assisted digital approach
  • carried out rounds of research into assisted digital - including journeys booked wholly online, and journeys that start online and end offline - with users with a range of digital skills and support needs
  • iterated the assisted digital journey in response to findings, notably to avoid frustrating users with low digital skills, for example by removing the question asking whether they knew they could book online
  • carried out usability testing of all parts of the service with disabled users and users with low digital skills, prioritising the public booking journey

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to test and improve the assisted digital journey with real staff and real customers, monitoring user pain points in particular. For example, consider whether the support team can have access to a user's previous booking progress so users do not have to provide the same information twice in different formats
  • make sure insights are shared between the digital and support teams so the online and offline services can be iterated in tandem, using insights from one channel to inform the other. For example, some users may call the phone line thinking they have "bespoke" questions, but the answers may be useful to many users
  • work with the supplier of the IHTTC and Trainer Booker services to address the fact that many basic tasks do not work without JavaScript; government services should be progressively enhanced (see the sketch under point 6)
  • continue to iterate the service based on user research and data once some users are able to book tests, before the full launch in September

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that the team:

  • is using the GOV.UK Design System and style guide
  • has researched and tested new patterns for appointment booking with users
  • has a start page on GOV.UK that has been tested with users
  • is re-using the DVSA's payments tool rather than building something from scratch

What the team needs to explore

Before their next assessment, the team needs to:

  • replace the multi-factor authentication component with one that uses the GOV.UK Design System elements, so the service looks trustworthy
  • contribute what they’ve learned about appointment booking to the issue on the GOV.UK Design System backlog
  • use the payment card details pattern when asking users for this information
  • follow the extra design recommendations, as the way some Design System elements are used does not always make it clear what users are supposed to do

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is almost 100% online with a customer support centre for assisted digital needs
  • a great deal of work has been done to improve the journey for users with support needs
  • the team is working with DAC to help identify and meet support needs

What the team needs to explore

Before their next assessment, the team needs to:

  • consider if more can be done so that users who need support to take the test can book online if they want, so they do not have a poorer experience that takes longer
  • continue to monitor digital uptake and do user testing to ensure that the customer support centre meets the needs of assisted digital users
  • ensure that all the recommendations from the Accessibility Review are adhered to

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has carefully considered its data needs and, using a performance framework, ensures the data being collected is focused and relevant for measuring the performance of the service and whether it meets users' needs
  • the team uses Google Analytics to measure the performance of the service
  • the team has anonymised the IP address from the user's browser and ensures that no personally identifiable information is passed to Google Analytics or their data reporting (a sketch of this follows the list)
  • their SIRO and security team have signed off the use of Google Analytics as the user behaviour data collection tool
  • while the team is currently using basic Google Analytics tagging, they have plans in place to enhance the data collection, in particular to look at the performance of error messages on the service
  • the team shared some excellent examples of using user behaviour data to feed into building and improving the service, including using search terms to make sure the language used on the service is appropriate
  • the team used data to set the scope and decide where to focus efforts, including identifying users who may need additional assistance
  • the team used feedback and funnel drop-off data to help inform user research hypotheses and support their hypothesis-led design
  • the team is using a cookie opt-in solution, with a plan in place to analyse the opt-in rate of cookies on the service
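
A minimal sketch of what browser-side IP anonymisation looks like with Google Analytics' gtag.js, assuming a Universal Analytics property; the report does not state which GA setup the team uses, and the tracking ID is a placeholder:

```typescript
// gtag.js is loaded by the standard Google Analytics snippet;
// declared here so TypeScript accepts the call.
declare function gtag(...args: unknown[]): void;

// Placeholder tracking ID. With anonymize_ip set, Google Analytics zeroes
// the last octet of the user's IP address before storing it, so no full
// (personally identifiable) IP reaches the reporting data.
gtag("config", "UA-XXXXXXX-1", { anonymize_ip: true });
```

In a cookie opt-in setup like the one the team describes, this configuration call would only run after the user consents to analytics cookies.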

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has produced a comprehensive performance framework that is used to identify performance indicators, and has linked the indicators back to user needs, ensuring they can measure whether those needs are being met
  • the team has set up relevant baselines using data from their CRM, survey data and reports provided by their current service providers
  • their data team provides monthly feedback, with insights fed back into the product team; this feedback loop will shorten when the service is in public beta

What the team needs to explore

Before their next assessment, the team needs to:

  • look at reducing the time it takes for the insights team to provide insights to the product team

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is in consultation with the DVSA team responsible for uploading performance data to data.gov.uk, with a long-term view of automating the updates
  • in the meantime, the team will update the performance data on data.gov.uk manually
  • the team is reporting the 4 mandatory KPIs

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an appointment scheduled for 1 July to test the service with the Chief Executive

Published 22 July 2021