Commercial vehicle service - alpha

The report from the alpha assessment for DVSA's Commercial vehicle service on 8 February 2018.

From: Central Digital and Data Office
Assessment date: 8 February 2018
Stage: Alpha
Result: Met
Service provider: DfT DVSA

The service met the Standard because:

  • The team has a clear understanding of the user needs the service aims to address.
  • The team have frequently iterated their product in response to user research.
  • The service team is multi-disciplinary, working in a collaborative way using agile tools and techniques to continually improve and iterate the service.

About the service

Description

The ‘manage your authorised testing facility’ service allows operators to manage their business interactions with DVSA. This includes making payments, viewing current account balances, and viewing annual testing activity, with the ability to download invoices and monthly statements for financial reconciliation.

The ‘digital test result capture’ application is an integral part of the wider service and captures its key transaction: annual test results for heavy vehicles, recorded to comply with current legislation in the interest of road safety.

Service users

Manage your authorised testing facility:

  • Business to business (DVSA to Authorised Testing Facility Operators)
  • ATF (Authorised Testing Facility) managers, administrators, finance personnel and testing lane managers

Digital test result capture:

  • Internal users (DVSA to DVSA)
  • Skilled DVSA employees: vehicle standards assessors, senior vehicle standards assessors, technical team leaders, network business managers and DVSA administrators

Detail

User needs

The team had a good understanding of the users of the service, and volunteered stories learned through the testing they had observed, indicating that the team had developed empathy for the users. The service team was able to demonstrate that they had kept the user at the heart of the design and development of the service.

The impact of the research was clear, and the team presented evidence that research findings had directly influenced the design of the service. For instance, an early hypothesis that would have strongly shaped the direction of the design was abandoned following clear findings from initial research. It is clear that research findings were communicated effectively within the team: by sharing findings at the earliest opportunity, using evidence-based personas, and displaying key research outcomes visibly in the team’s shared working spaces.

There has been limited research so far with users who have accessibility or assisted digital needs; specific thought must be given to how these needs will be addressed in ongoing testing.

Team

The team is made up of a mix of suppliers and civil servants, led by an empowered Service Manager. It is clear that they are passionate about the project and have all taken the time to observe user research. The team clearly work well together and are using agile ceremonies and tools to manage their product and development lifecycles.

The team is set up to deliver the two parts of the Commercial Vehicle Service, the ‘manage your authorised testing facility’ service and the ‘digital test result capture’ application, with a dedicated Product Manager, Product Owner, Business Analyst and developers for each. Other roles are shared across both products.

The team recognise that some additional roles need to be filled as they move into private beta and beyond. In addition to the roles the team have identified, it is recommended that they also consider a dedicated Agile Delivery Manager to manage the scrum team as they prepare for private beta.

Technology

A comprehensive and well-factored architecture was presented that, when fully implemented, should be able to support all aspects of the service. The team has clearly focused on delivering technology that enables the highest value activities to be prototyped and explored first. The use of the GOV.UK prototype kit, independent of the rest of the target architecture, allowed the team to respond quickly to user research findings and service design improvements, and to iterate their product quickly, conducting 8 or 9 iterations with individual ATFs during the alpha period. As the service users are well known, we recommend that the team conduct more formal research into the browsers in use and how they are upgraded.

Good practice is being followed with respect to working in the open, using open source and knowing when to keep things closed.

The team have set out a realistic plan for evolving the technology into private beta although the details of the CRM (Customer Relationship Management) System remain undecided pending the conclusion of the tender process. The business and architectural implications of the CRM choice are well understood, but delivering the high quality user experience promised by the alpha testing must remain the primary goal of the technology. The Technology Code of Practice offers guidance on how to balance these concerns. When migrating away from the prototyping kit the team should consider designs that allow them to retain distinct boundaries between their own functionality and that provided by the CRM and thus allow modules of the architecture to remain independently replaceable and procurable. This will allow the team to avoid lock-in and help them to retain complete ownership of the experience they offer their users.
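As an illustration of the kind of boundary the panel has in mind, the sketch below shows service code depending on a small interface the team owns, with the CRM hidden behind an adapter. This is a hypothetical sketch, not DVSA code: the interface and class names, endpoints and fields are all illustrative, since the CRM itself is still undecided.

```typescript
// Hypothetical sketch: the service owns this interface, and whichever CRM
// wins the tender is wrapped in an adapter that implements it.

interface AccountBalance {
  accountId: string;
  balancePence: number; // hold money in pence to avoid floating-point errors
  retrievedAt: Date;
}

// The boundary the rest of the service depends on.
interface AccountsProvider {
  getBalance(accountId: string): Promise<AccountBalance>;
  recordPayment(accountId: string, amountPence: number, reference: string): Promise<void>;
}

// One adapter per candidate CRM. Swapping CRMs means writing a new adapter,
// not rewriting the service, which keeps the module independently
// replaceable and procurable.
class ExampleCrmAdapter implements AccountsProvider {
  constructor(private baseUrl: string, private apiKey: string) {}

  async getBalance(accountId: string): Promise<AccountBalance> {
    const res = await fetch(`${this.baseUrl}/accounts/${accountId}`, {
      headers: { Authorization: `Bearer ${this.apiKey}` },
    });
    if (!res.ok) throw new Error(`CRM returned ${res.status}`);
    const body = await res.json();
    return { accountId, balancePence: body.balancePence, retrievedAt: new Date() };
  }

  async recordPayment(accountId: string, amountPence: number, reference: string): Promise<void> {
    const res = await fetch(`${this.baseUrl}/accounts/${accountId}/payments`, {
      method: 'POST',
      headers: { Authorization: `Bearer ${this.apiKey}`, 'Content-Type': 'application/json' },
      body: JSON.stringify({ amountPence, reference }),
    });
    if (!res.ok) throw new Error(`CRM returned ${res.status}`);
  }
}
```

Because the rest of the codebase only ever sees AccountsProvider, the user experience stays under the team’s control whichever CRM is chosen.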

A lot of thinking and analysis has been conducted around monitoring, planning for downtime, security and threat modelling. The work done by the Information Management Security specialist seemed particularly comprehensive. For private beta, thought must be given to how smaller scale problems will be handled, such as ensuring the correct balance is always displayed, even while testing devices are offline or while the existing paper process is being transformed. The process outlined for issuing and managing user credentials must also be thoroughly verified from user, administrative and technological perspectives.
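To make the balance concern concrete, the sketch below (hypothetical, with illustrative names) shows one way to avoid silently presenting a stale figure: the service tracks how many test results are still waiting to sync from offline devices and says so alongside the balance.

```typescript
// Hypothetical sketch of presenting a possibly-stale account balance honestly.

interface BalanceView {
  balancePence: number;
  lastSyncedAt: Date;
  pendingOfflineTests: number; // tests captured on devices that have not yet synced
}

function describeBalance(view: BalanceView): string {
  const pounds = (view.balancePence / 100).toFixed(2);
  if (view.pendingOfflineTests > 0) {
    // Be explicit that the figure predates some recorded tests.
    return `£${pounds} as of ${view.lastSyncedAt.toISOString()}: ` +
      `${view.pendingOfflineTests} test(s) recorded offline and not yet deducted`;
  }
  return `£${pounds} (up to date)`;
}

// Example: two tests captured on an offline device are still outstanding.
console.log(describeBalance({
  balancePence: 45000,
  lastSyncedAt: new Date(),
  pendingOfflineTests: 2,
}));
```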

The team should reconnect with the GOV.UK Pay team now, as there may be opportunities to work closely on the direct debit functionality.
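For reference, creating a payment through GOV.UK Pay’s public card payments API is a single authenticated POST; direct debit support was still in development at the time, which is why timelines need to be confirmed with the Pay team. In the sketch below, the API key, reference and URLs are placeholders.

```typescript
// Sketch of GOV.UK Pay's create-payment call (card payments API).
// PAY_API_KEY and the reference/return_url values are placeholders.

async function createTopUpPayment(): Promise<void> {
  const res = await fetch('https://publicapi.payments.service.gov.uk/v1/payments', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.PAY_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      amount: 12500, // £125.00, expressed in pence
      reference: 'ATF-TOPUP-0001',
      description: 'ATF account top-up',
      return_url: 'https://example.service.gov.uk/payments/complete',
    }),
  });
  if (!res.ok) throw new Error(`GOV.UK Pay returned ${res.status}`);
  const payment = await res.json();
  // GOV.UK Pay returns a next_url that the user is redirected to in order to pay.
  console.log(payment._links.next_url.href);
}
```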

The panel was impressed by how the team delivered a very agile project while not only working within DVSA’s existing, established processes and frameworks, but also using them effectively to deliver a lot of value.

Design

The team presented clear examples of testing their prototypes with users, and making significant improvements based on that research. For example, improving the way the Account Balance is conveyed to the user, and finding what users needed in the Account History.

The panel was impressed that the team looked at underlying service pain points, such as payments and refunds, and how the data could be out of sync, with long lags. This is a great example of doing the hard work to make it simple, and we hope to continue to see a focus on these kinds of improvements in beta. The team mentioned looking into auto top-up, which sounds like it could again be a real benefit to end users. If you manage to do these things, we recommend blogging about them.

It was also great to hear thinking around re-using existing systems (MOT Reminder and MOT History) - this will reduce wasted effort and make the user experience more consistent.

The interface design seems consistent with GOV.UK design patterns, and makes good use of the right patterns for the right problems. We recommend collaborating with the cross-government design community and sharing your research for any areas where you are deviating from published patterns, for example the status box with ‘Your account balance is at the recommended amount’. This will help with assessment at beta if you are using any non-standard patterns.

Analytics

The team have registered their service with the Performance Platform.

In addition to the four mandatory KPIs, the team have looked at other variables to help them analyse the success of the new service. Fewer calls to the Contact Centre are expected, as well as a reduction in the number of replacement certificates issued due to errors, and an associated reduction in cost.

There is a view that if an MOT is booked in early, rather than left until the last minute, the vehicle is more likely to pass. The team therefore plan to measure improvements in the pass rate and use this as a measure of success.

Recommendations

To pass the next assessment, the service team must:

  • Undertake user testing with users who have accessibility and assisted digital needs, ideally within each sprint. For instance, the team should investigate the impact of the RAG (red, amber, green) colour system for account funds with users who are colourblind.
  • Ensure the service is intuitive enough to use without training. Some isolated comments were made about how users benefited from being talked through the prototype, which is to be expected at this early stage; at beta, the team should concentrate on removing that need.
  • Do further research to establish the most appropriate name for the service.
  • Ensure that the implementation of the CRM is done in a way that puts user needs first.
  • Work with GOV.UK Pay to ascertain timelines for a direct debit integration.

The service team should also:

  • Allow the user researcher to concentrate on one service instead of two. They did an excellent job of keeping the team informed, but could have achieved more with a narrower focus. As a time-saving measure, make more use of remote testing methods.
  • Continue to look at additional roles and consider recruiting an Agile Delivery Manager for each Product Team.
  • Continue to grow resource in order to reduce the reliance on suppliers.
  • Keep iterating and revising the content.
  • Collaborate with the cross-government design community and share their research for any areas where non-standard patterns are used.
  • Conduct more formal research into the browsers in use and how they are upgraded.
  • Ensure thought is given to how smaller scale problems will be handled, such as ensuring the correct balance is always displayed, even while testing devices are offline or while the existing paper process is being transformed.
  • Ensure the process outlined for issuing and managing user credentials is thoroughly verified from user, administrative and technological perspectives.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | n/a
Published 6 August 2018