Get an annual MOT reminder - beta reassessment

The report from the beta reassessment for DVSA’s Get an annual MOT reminder service on 29 September 2017.

From: Central Digital and Data Office
Assessment date: 29 September 2017
Stage: Beta2
Result: Met
Service provider: DVSA (DfT)

The service met the Standard because:

  • The team have regularly iterated the service in response to user feedback, and have demonstrated that the service meets user needs.
  • The team have created a simple and intuitive service that we believe will be a success at public beta.
  • The team have a strong approach in place for measuring performance of the service and making decisions with data.

About the service

Description

The service enables users to sign up for reminders of when the MOT is due for GB-registered vehicles. Reminders are currently sent by email and text message.

Service users

Primarily motorists - those with some responsibility or a vested interest in when an MOT is due.

Detail

As this was a reassessment, the panel focused on the areas found ‘not met’ in the previous report.

The team should be commended on how well they have taken on the feedback from the previous assessment. All points on the service assessment are now met, and the team are in an excellent position to succeed in public beta. The team should also be commended on how well they prepared for this service assessment. They came with clear evidence of how they had met the service standard, which made our job as assessors relatively easy.

Note that the team’s private beta was findable from search engines and had been widely publicised by third parties, which meant the private beta already had 100,000 users. At a minimum, the team should ensure future private betas are not indexable by search engines. The assessments team may want to consider whether private betas should be available so publicly in future.

User needs

Since its first beta assessment, the service team has clearly taken the previous panel’s recommendations on user needs seriously. The team has evolved from minimal guerrilla usability testing and a survey to a programme of regular usability testing backed up by additional discovery work. The panel is satisfied that there is a solid plan for user research, evidenced by a robust backlog and testing scheduled for upcoming sprints.

The discovery work, usability testing and monitoring of the beta feedback survey significantly influenced the direction of the service. In particular, it led the team to add an SMS service and to allow drivers who have registered a new car to use the service. These are great examples of how sufficient user research has allowed the team to truly understand its users’ needs and develop a service that meets the most important ones.

The service team did a full review of its approach to the assisted digital journey. The first step was to include plenty of users at the lower end of the digital inclusion scale in research rounds. The team went further by classifying users according to additional circumstances and the kinds of help they receive from others. The assisted digital channel was tested successfully in the lab. The call centre has been trained in the assisted digital journey, which will be deployed in the days following the service assessment.

By including users with different access needs in usability testing and undergoing two assessments with the Digital Accessibility Centre, the team is in a good position to provide an accessible service. The team should ensure that every future round of testing includes at least one or two users with access needs.

Team

The MOT Reminders team are clearly a strong team, well versed in agile practices. Working and releasing in a two-week sprint cycle (but capable of releasing faster if necessary), they demonstrated numerous strong examples of incrementally improving the service based on user feedback and data. The team is a mix of civil servants and interim staff, and a clear plan is in place to ensure capability is transferred to civil servants over time. The service is already performing well, and the team have a prioritised backlog of user problems to focus on in public beta. The panel is confident in the team’s ability to make this service a success at public beta.

Technology

The team have taken on board the comments about the data sharing aspects of the service. A workshop held in September clarified DVSA’s intentions regarding opening parts of the Trade API. Even though actions are yet to be taken by DVSA at large in that respect, the panel consider that the team have acted on the conclusions of the previous assessment.

The panel’s other technology questions were answered satisfactorily, and the panel’s conclusion remains that the service meets those points.

Several recommendations have been made below.

Design

Since the previous assessment the team have iterated the design of the service, based on research, in such a way that all users can succeed first time.

Those parts of the design that were not consistent with GOV.UK, as well as the broken back button behaviour, have been changed. As before, the pages are clear and well laid out.

Particularly commendable is the hard work done to remove restrictions (for example, on vehicles less than 3 years old). This in turn makes the service simpler by removing the need to handle eligibility.

There are some design challenges which the team hasn’t been able to tackle before taking the service public. They should now be looking at:

  • How they’re going to raise awareness of the service

  • Understanding, and designing for, the user journey from garages and the vehicle tax services

  • How garages could help or direct users (especially those with assisted digital needs) to sign up for text message reminders, without having to use the online form.

There are some more minor interaction and content issues, detailed separately (attached to the email), which the panel would expect to be addressed before the next assessment.

Analytics

The team has addressed the panel’s concerns from the previous assessment.

Data for the 4 mandatory KPIs is being published regularly to the Performance Platform (https://www.gov.uk/performance/mot-reminder), although the dashboard is not yet publicly available and is still labelled as a prototype. The team should request that it is made public as soon as possible.

The completion rate accounts for the whole journey, from signing up to confirming the email address or text message. The team demonstrated improvements to the completion rate resulting from changes they’d made to the service.

A significant number of users are providing ratings, and the team gave specific examples where user feedback has led to service improvements. It would be interesting to see more work done in this area after users start to receive their reminders.

The team showed a range of indicators being used internally, and the panel agreed that these were useful in measuring the service’s performance.

The team is using an API to share the service’s data with organisations interested in road safety issues. Some work is needed to ensure that data is uploaded to data.gov.uk more frequently.

Recommendations

As the service met the Standard at this reassessment, there are no mandatory recommendations.

The service team should:

  • Monitor cases in which users do not succeed with STOP codes from SMS and, if necessary, improve the service to better handle this use case; a minimal handler sketch follows this list. Consider sharing what you’ve learned in this area with the cross-government digital community.
  • Ensure that every future round of testing includes at least one or two users with access needs.
  • The service should expire tokens that haven’t been confirmed; otherwise, it could effectively build a list of email addresses and car registration plates of people who have not consented to be on it. A common duration for unconfirmed tokens is 24 hours; a sketch of such a cleanup job follows this list. Similarly, quick deletion of the records of people who have opted out of the service is highly recommended, as there isn’t an obvious reason why these should be kept for up to 5 years.
  • Explore anti-personas and possible abuse scenarios beyond standard cyberattack mitigation. Even though the risks of data leaks and possible mischief are low given the nature of the service and the vehicle information already available publicly, it is important to consider all risk factors and how to detect and mitigate them.
  • Consider building a generic reminder system, as this is likely to become a common service across GOV.UK. In particular, making the source code easier to localise would be beneficial: like most government services, it may eventually need to be translated into Welsh, or even reused outside government.
  • Even though the team did follow the recommendation to code in the open, there is room for improvement in how the code is publicly shared. The repositories should include descriptions of the service and give an idea of its technical architecture. The way code is managed in source control should also be improved, with better change descriptions and finer-grained commits.
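
A minimal, hypothetical sketch of the STOP-keyword handling mentioned in the first recommendation above. The names (handle_inbound_sms, subscriptions.delete_by_phone) and the keyword set are illustrative assumptions, not the team’s actual code; the point is that failed opt-outs are logged so they can be monitored.

```python
import logging

logger = logging.getLogger("sms-stop-handler")

# Opt-out keywords commonly honoured by SMS providers (assumed set).
STOP_KEYWORDS = {"STOP", "STOPALL", "UNSUBSCRIBE", "CANCEL", "END", "QUIT"}

def handle_inbound_sms(sender: str, body: str, subscriptions) -> bool:
    """Unsubscribe the sender if the message is an opt-out request.

    `subscriptions` is a hypothetical data-access object; returns True
    if a subscription was removed.
    """
    if body.strip().upper() not in STOP_KEYWORDS:
        return False  # not an opt-out request
    removed = subscriptions.delete_by_phone(sender)
    if not removed:
        # The user tried to opt out but no record matched (for example, a
        # number-formatting mismatch). Log it so these failures can be
        # monitored and the service improved.
        logger.warning("STOP received from %s but no subscription found", sender)
    return removed
```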
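
Similarly, a sketch of the recommended token expiry, assuming sign-ups are stored in a relational table with `confirmed` and `created_at` columns (both names are illustrative). The job would run on a schedule, for example hourly, so unconfirmed records never outlive the 24-hour window.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# 24 hours is the common duration for unconfirmed tokens noted above.
UNCONFIRMED_TTL = timedelta(hours=24)

def purge_unconfirmed(conn: sqlite3.Connection) -> int:
    """Delete sign-up records that were never confirmed and are older than
    the TTL, so unconfirmed email addresses and registration plates are
    not retained. Returns the number of rows deleted."""
    cutoff = (datetime.now(timezone.utc) - UNCONFIRMED_TTL).isoformat()
    cur = conn.execute(
        "DELETE FROM subscriptions WHERE confirmed = 0 AND created_at < ?",
        (cutoff,),
    )
    conn.commit()
    return cur.rowcount
```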

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

This service now has permission to launch on a GOV.UK service domain. These instructions explain how to set up your *.service.gov.uk domain.


Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met in Beta 1
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met in Beta 1
7 Managing data, security level, legal responsibilities, privacy issues and risks Met in Beta 1
8 Making code available as open source Met in Beta 1
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met in Beta 1
11 Planning for the service being taken temporarily offline Met in Beta 1
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met in Beta 1
15 Using analytics tools to collect and act on performance data Met in Beta 1
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met in Beta 1
Published 30 July 2018