Report MOT fraud / MOT reminders beta assessment

The report from the beta assessment of DVSA's Report MOT Fraud and MOT Reminders services on 9 June 2017.

From: Central Digital and Data Office
Assessment date: 09 June 2017
Stage: Beta
Result: Not met
Service provider: Driver and Vehicle Standards Agency

To meet the Standard the service should:

  • Conduct further user research, including with those with accessibility requirements and low digital skills.
  • Research and develop appropriate support routes for users with low digital skills.
  • Iterate and improve the service on a more frequent and incremental basis.

About the service

Description

The MOT Reminders service lets users know when a vehicle is due for its annual MOT. Report MOT Fraud allows users to report garages they suspect of being involved in fraudulent MOT-related activity.

Service users

The users of these services are predominantly the motoring public and people working in the garage trade.

Detail

User needs

Insufficient research has been undertaken to support either the MOT Reminders or Report MOT Fraud services in moving into public beta.

Discovery

User research in discovery has been conducted solely through surveys with a relatively low number of users. This has led to methodological issues, such as a reliance on questions that ask users what they want rather than identifying the underlying user needs required to drive service design.

For example, there was a relatively even split between users who would prefer reminders through SMS versus email. In place of further research to understand why this was the case, the team made the decision to go forward with an email solution. Whilst developing the service with a single channel (i.e. email) is a reasonable approach for an MVP, it is not reasonable to do so without an informed understanding of underlying user needs.

Usability testing

Iterative testing has been insufficient for the MOT Reminders service, with only a single round of testing conducted. This testing was done in a single garage, with an ad-hoc group of users who happened to use the garage that day. There are definite benefits in conducting some testing in a garage setting, as users test the service in context; however, a single round in one garage means that testing was done with a potentially homogenous group of users biased towards their experiences of that one garage. Changes were made based on feedback during testing, but these changes have yet to be tested with a new group of users. No meaningful usability testing has been conducted on the Report MOT Fraud service.

For a service ready for beta, the panel would expect several rounds of research that implement and then test design improvements based on user needs, as well as some testing of the service in the context of the rest of GOV.UK.

Assisted digital

MOT Reminders

For the MOT Reminders service, there was no evidence of an assisted digital plan. The team needs to consider how people who are unable to access or use the service online might still be supported.

Report MOT Fraud

For the Report MOT Fraud service, the team has given some consideration to an assisted digital plan, such as having the fraud report form used by contact centre representatives to record calls. However, this support should be researched with users, tested and iterated.

Accessibility

There is no evidence of discovery or testing with users who have access needs, nor of improvements made to the service based on an understanding of those needs. This is a basic prerequisite for releasing a service into beta, especially given that the service is intended for a broad range of citizens.

Team

The two services form part of a wider MOT Testing programme, and as such are well supported with coverage of key agile team roles.

The two services demonstrate a similar approach to that taken for features of the Record MOT Test Results service. This is understandable, as both services are small in scale and function. However, as with those features, insufficient user research, testing, iteration and improvement has been done to support them. Although they are clearly distinct, standalone services, given their size it seems sensible to develop them as part of the wider programme rather than have a dedicated service team working on them. Support should nonetheless be proportionate and sufficient: concentrated periods of several sprints of work spread through the year, with single user research sessions and major service changes made on the basis of those findings, is not an appropriate approach.

Technology

The services generally re-use components from the larger MOT Testing service, so there is little new to add in this context. The technical stack is modern, fits the DVSA standards, makes good use of commodity public cloud and offers the ability to scale over time. The use of a serverless architecture for the MOT Reminders service is an interesting addition, and others in government would probably benefit from learning more about the DVSA approach to this component. While the team noted some issues with the cold-start time of these Lambda instances, the current work-around seems to be functional for the moment, though in the longer term a better solution might be needed to avoid cold-start latency. The team is to be commended for the integration of common government platforms such as GOV.UK Notify.
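As an illustration of the kind of work-around commonly used for cold starts (the handler below is a sketch with assumed names and payloads, not DVSA's implementation), a scheduled "keep warm" ping can be routed past the real reminder logic so that a function instance stays initialised:

```python
# Illustrative sketch only: a scheduled "keep warm" ping for a serverless
# function, assuming an AWS Lambda-style handler. This is not DVSA's code.
import json


def handler(event, context):
    # A scheduled rule (for example a CloudWatch Events schedule) can invoke
    # the function every few minutes with a marker payload so that an
    # instance stays warm and real requests avoid cold-start latency.
    if event.get("keep_warm"):
        return {"statusCode": 200, "body": "warm"}

    # Normal path: queue an MOT reminder email for the requested vehicle.
    registration = event.get("registration", "UNKNOWN")
    # ... look up the MOT due date and call GOV.UK Notify here ...
    return {
        "statusCode": 200,
        "body": json.dumps({"registration": registration, "reminder": "queued"}),
    }
```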

The panel does not consider that the service currently meets point 9 of the Service Standard, specifically to “manage the common data you hold and meet your open data responsibility”. The data collected and held through the MOT services is of public interest, high volume and not obviously sensitive. As such, the panel believes that the team has an obligation to make the data it collects available in clear, complete, machine-readable formats. This might be by restarting the publication of datasets to data.gov.uk (currently out of date), or by supporting APIs that make the full datasets queryable. As a more minor point, the team should consider using the existing open standard for the exchange of calendar events (RFC 5545) in the Reminders service to prompt users to create a suitable reminder event.
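For illustration only, a minimal RFC 5545 (iCalendar) event for an MOT due date could be generated as in the sketch below; the registration, dates and "roughly a month before" timing are invented assumptions rather than agreed service behaviour:

```python
# Minimal sketch of an RFC 5545 (iCalendar) reminder event for an MOT due
# date. The registration, dates and reminder offset are invented for
# illustration only.
from datetime import date, timedelta
from uuid import uuid4


def mot_reminder_ics(registration: str, due: date) -> str:
    """Return an all-day VEVENT roughly a month before the MOT due date."""
    reminder_day = due - timedelta(days=30)  # placeholder offset, not policy
    stamp = reminder_day.strftime("%Y%m%d")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//mot-reminder//EN",
        "BEGIN:VEVENT",
        f"UID:{uuid4()}@example.invalid",
        f"DTSTAMP:{stamp}T000000Z",
        f"DTSTART;VALUE=DATE:{stamp}",
        f"SUMMARY:MOT due for {registration} on {due.isoformat()}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])


print(mot_reminder_ics("AB12 CDE", date(2018, 3, 14)))
```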

The team is to be commended on the continued evolution of the deployment infrastructure and the Blue/Green deployment pattern, which already seems to be having an effect on uptime and certainly on the flexibility of releases. We urge the team to continue to ensure that the deployment instances are as generic as possible, and configured entirely in code.

Security and privacy are generally good and IT Health Checks are being performed, but the panel would like to see more specifics about how the personally identifiable information now being collected in either service is retained and expired: this includes email addresses in the Reminders service, but also identifying information collected in the less structured Fraud service. The unstructured data, in particular, creates a new burden for the retention, redaction and deletion of personally identifiable information under EU regulations, and the team should have a clear technical approach to this in good time for the General Data Protection Regulation coming into force in May 2018.
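As a sketch of the kind of retention rule the panel has in mind for the Reminders data (the 60-day grace period and the record shape below are assumptions, not agreed policy), subscriber email addresses could be purged a fixed period after the MOT due date has passed:

```python
# Illustrative retention rule: remove personal data once a subscription has
# passed its MOT due date by a fixed grace period. The 60-day period and the
# record structure are assumptions for illustration only.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class ReminderSubscription:
    email: str
    registration: str
    mot_due: date


RETENTION_AFTER_DUE = timedelta(days=60)  # assumed policy, not agreed


def expired(sub: ReminderSubscription, today: date) -> bool:
    """True once the subscription has passed its retention window."""
    return today > sub.mot_due + RETENTION_AFTER_DUE


def purge(subs: list[ReminderSubscription], today: date) -> list[ReminderSubscription]:
    """Keep only subscriptions still within their retention period."""
    return [s for s in subs if not expired(s, today)]
```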

Design

The design of the service is generally clear, well laid out, and consistent with GOV.UK. The following issues in particular need to be resolved:

General

  • Ensure line lengths don’t go over ⅔ of the page width on desktop. A ⅔ column layout from GOV.UK Elements can be used to achieve this.

MOT Reminders

  • Eligibility is currently only presented on the GOV.UK start page. This should be user tested as part of the end-to-end journey to see whether users are aware of these restrictions at the start of the service.
  • The back navigation button doesn’t work; using it results in an error message.
  • The ‘check your answers’ page uses a non-standard blue header for vehicle details. A more standard layout, with standard GDS elements, must be used unless there’s a demonstrable user need for a new element (in which case, this will need to be shared with the wider cross-government design community).

Report MOT Fraud

  • In the order in which questions are asked in the Report MOT Fraud service, the required question should be asked first.
  • Questions should be well-worded and tested with users to allow a range of reporting scenarios.

Analytics

Key Performance Indicators (KPIs)

The team has a number of indicators they are using internally to measure the success of the two services.

MOT Reminders

The panel recommends that the service captures user satisfaction ratings and comments immediately after a user signs up for reminders, and investigates how to collect feedback from users after reminders have been received.

To be useful, completion rate should be measured from the time when users start to complete the form until they verify their email address. This will be harder to measure using web analytics as the second stage may be completed much later or on a different device.
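One possible approach (sketched below with assumed event shapes) is to join form-start events to email-verification events on a shared subscription identifier rather than a web session, so that completions made on a later visit or a different device are still counted:

```python
# Sketch of measuring completion rate by joining form-start events to email
# verification events on a shared subscription ID rather than a web session.
# The event shapes and IDs are assumptions for illustration.
def completion_rate(starts: set[str], verifications: set[str]) -> float:
    """Fraction of started sign-ups whose subscription ID was later verified."""
    if not starts:
        return 0.0
    return len(starts & verifications) / len(starts)


starts = {"sub-001", "sub-002", "sub-003", "sub-004"}
verifications = {"sub-001", "sub-003"}
print(f"Completion rate: {completion_rate(starts, verifications):.0%}")  # 50%
```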

Because costs will be largely fixed and there is only one channel, the published cost per transaction is expected to correlate inversely with the take-up of the service: with fixed running costs, doubling the number of sign-ups roughly halves the cost per transaction. This will change if SMS reminders are offered to users.

Report MOT Fraud

As there is no direct follow-up with users reporting fraud, the four mandatory KPIs may be neither easy to measure nor able to provide useful insight to improve the service.

The team plans to measure the quality of information reported. The panel recommends that the digital service is measured separately from the telephone service to identify which elicits better quality information.

The team should consider using A/B testing of fraud reporting questions to improve the quality of information received.
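A simple, illustrative way to run such a test (the variant wording and visitor identifiers below are invented) is to assign each anonymous visitor deterministically to one question variant by hashing their identifier, so the same visitor always sees the same wording:

```python
# Sketch of deterministic A/B assignment for fraud-report question wording.
# Variant texts and visitor IDs are invented placeholders.
import hashlib

VARIANTS = {
    "A": "What did you see at the garage that made you suspect fraud?",
    "B": "Describe what happened, including dates and the people involved.",
}


def assign_variant(visitor_id: str) -> str:
    """Hash an anonymous visitor ID and map it to variant A or B."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"


for vid in ("visitor-1", "visitor-2", "visitor-3"):
    key = assign_variant(vid)
    print(vid, key, VARIANTS[key])
```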

Performance Indicators

The team has contacted the Performance Platform team, but has yet to supply metrics for a dashboard.

Recommendations

To pass the reassessment, the service team must:

  • Conduct further user research, including with users with accessibility requirements and low digital skills.
  • Review and revise user research methods to develop a deeper understanding of user needs and motivations.
  • Research and develop appropriate support routes for users with low digital skills.
  • Iterate and improve the service on a more frequent and incremental basis.
  • Address design and content concerns.
  • Report performance metrics to the Performance Platform.
  • Meet the full government open data commitment by making all data collected available in machine-readable open formats, by restarting the publication of datasets to data.gov.uk and/or by supporting an appropriate API directly.

The service team should also:

  • Capture user satisfaction ratings and comments immediately after a user signs up for reminders, and investigate how to collect feedback from users after reminders have been received.
  • Separate measures of quality of information between the telephone and online services to facilitate comparison between the two channels.
  • Consider A/B testing for fraud reporting questions to improve the quality of information received.
  • Create a plan to manage Personally Identifiable Information (PII) and to meet the new requirements of the General Data Protection Regulation (GDPR) in both services in good time for the new legislation coming into force in May 2018. This should include potential responses to Subject Access Requests, data redaction and deletion, and a reasonable term of data retention. New content may also be needed.
  • Adjust the monitoring tooling of the applications to trigger explicit alerts if personal or other data is exfiltrated from the production environment, rather than relying on subsequent forensic log file analysis.
  • Operate in a true coding-in-the-open style, moving beyond periodic deployment-based open source releases.

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Not met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Not met
5 Iterating and improving the service on a frequent basis Not met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Not met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Not met
17 Reporting performance data on the Performance Platform Not met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 30 July 2018