Check if a vehicle is taxed or has an MOT - live reassessment

The report from the live reassessment for DVLA's Check if a vehicle is taxed or has an MOT service on 15 March 2018.

Service Standard reassessment report

Check if a vehicle is taxed or has an MOT

From: Government Digital Service
Assessment date: 15 March 2018
Stage: Live
Result: Met
Service provider: DfT DVLA

The service met the Standard because:

  • There is clear evidence that the service is meeting user needs on a large scale, and it is doing so simply, cheaply and to a high level of user satisfaction.
  • There is regular iteration of the service based on analytics and user feedback from a variety of sources. The team was able to demonstrate what it learned during its beta and how it addressed problems raised in previous assessments.
  • There is a multi-disciplinary team co-located in DVLA, which is well-informed about its users and its product. This team will continue to actively improve the user experience, performance and operation of the service during the Live phase.

About the service


The service enables users to search by a vehicle’s registration number to find out if that vehicle is taxed and has an MOT.

Service users

The users of this service are citizens and professional users in the vehicle industry; both groups share the need to find out the current tax and MOT details for a vehicle, alongside some basic specifications that help identify that they are looking at the right vehicle.

Detail

Over the course of the Beta, DVLA has delivered an intuitive and reliable service, which has some of the highest volumes of usage of any service in government. The service team is an enthusiastic multi-disciplinary group of civil servants who own the direction of the service. They have an excellent understanding of their users and are making good use of data to inform continuous improvement. The panel is confident that they are meeting user needs, and are capable of operating and improving the service in a Live phase.

User needs

Users are able to search for any vehicle that may be of interest to them, meaning there are a variety of contexts in which the service can be used. The service is primarily used by customers who are interested in purchasing a vehicle and those wanting to check the status of an existing vehicle. Previously, this information was available only by inspecting the now-abolished paper tax disc displayed in a vehicle, or by calling or writing to DVLA and DVSA. Providing the digital service means the information is available anywhere, at the point of need and at no cost to the user.

Following a simplification in the scope of the service (following an earlier unsuccessful Live assessment), the service team has put significant effort into developing a more detailed understanding of the users of the service, and this effort was clearly shown at reassessment.

Through research, the team has identified the range of potential users of the service, and has been able to articulate the situations under which users would encounter the service and which tasks they were aiming to complete through using the service. While the user needs were sometimes framed as preferences rather than needs, it is clear that user goals are well understood and referenced throughout the development and design process.

The research undertaken during Beta took many forms (monitoring customer feedback, analytics, and contextual interviews), and the various methods used were evidently effective. Research has been inclusive (including taking place with a range of cohorts and in locations around the UK), and the researchers have made efforts to perform both contextual and usability research with users lacking in digital skills or digital confidence, as reflected in the evidence-based personas produced.

A high proportion of users are able to use the service successfully on the first attempt, and general feedback for the service is very good, as reported through both quantitative and qualitative research.

Some research has also been done with users who have accessibility needs, notably potential users with dyslexia. While accessibility audits have been successfully completed, it would be beneficial to identify additional users with other accessibility needs to develop further confidence in the service’s ease of use.

All of the disciplines represented at the assessment were able to contribute to answers relating to users and their needs, and it was evident that empathy with the users of the service has been developed across the disciplines in the team. It was clear that the research has been effectively communicated within the service team, both through the team’s direct involvement in research sessions and when findings are fed back. Building on this, it would be beneficial to ensure that the findings that have come out of research on this service are communicated with other service teams within DVLA.

The assessment panel were pleased to hear the team speak about their ongoing plans for user research to inform further improvements to the service once it enters its Live phase.

The service team spoke with pride about the relationships they had established with users of the service during the development of the Beta. This engagement should continue in the form of invitations to participate in further user research, and the sharing of updates about the performance of the service and any further improvements made to it, informed by insights from that research.

Team

The service team includes all the disciplines necessary to design, build and operate the service. This team enjoys a high degree of autonomy, with support from the Service Manager who in turn operates with full agency over the service. The team is staffed entirely by co-located civil servants, who have worked together in this unit over the course of the Beta and who will remain in place once the service has gone live.

The team works in regular iterations, following an agile delivery framework. Stories are based on user needs, prioritised in a backlog by the product manager and reviewed at the beginning of each iteration by the team members who will be carrying out the work. Forums like daily stand-ups and show-and-tells help keep the team informed of one another’s work, and regular retrospectives take place, enabling the team to learn through delivery and evolve their culture, processes and tooling over time.

Releases are made on a regular basis, and the team are ready and able to make improvements as and when they are required. The team talked about their pride in being able to demonstrate the Beta to their Minister and bring the service forward for Live assessment; however, they were also clear that there was much that they still needed to learn and improve in the service, and there is a roadmap of design and technology options that they want to explore further through testing.

The DVLA team should actively be posting and presenting on what they have learned in getting their service to Live. This is evidently an experienced team whose lessons could be usefully shared with communities of practice and with other service teams across government that are bringing similar services through the delivery phases.

Technology

A detailed technical review was conducted prior to the assessment, the results of which were discussed alongside other points raised by the panel in the assessment.

The service has good practices in place to ensure the right things are built with appropriate technologies, and so it is meeting the standards necessary to move to the Live phase.

The service has been running successfully for a considerable time in Beta, and a mature technical stack that allows zero-downtime deployments is in place. Changes introduced during the Beta phase included a move to Jenkins, serving content via a CDN and a new API for bulk queries. Robust support arrangements are in place, appropriate for a service that currently handles around 750 million transactions per year.

The service has an effective deployment environment and is tested regularly in an environment similar to live. Load testing has been carried out to a point well beyond the anticipated growth in transactions. The service can be run from more than one physical location; if the service is taken temporarily offline, users are shown a message advising them of this while a standard recovery procedure identifies the problem, restores the service and keeps users informed.
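As a rough illustration of how load-test targets of this kind might be sized, the sketch below works back from the report’s ~750 million transactions per year figure. The peak-to-mean factor, latency and multipliers are invented assumptions for the example, not the team’s actual method:

```python
import math

SECONDS_PER_YEAR = 365 * 24 * 3600


def average_rate(transactions_per_year: int) -> float:
    """Mean requests per second implied by an annual volume."""
    return transactions_per_year / SECONDS_PER_YEAR


def workers_needed(target_rps: float, mean_latency_s: float) -> int:
    """Little's law: concurrency ~= arrival rate x time in system."""
    return math.ceil(target_rps * mean_latency_s)


mean_rps = average_rate(750_000_000)   # ~23.8 requests/second on average
peak_rps = mean_rps * 10               # assumed 10x peak-to-mean factor
target_rps = peak_rps * 3              # test "well beyond" anticipated load

print(f"mean {mean_rps:.1f} rps, load-test target {target_rps:.0f} rps, "
      f"~{workers_needed(target_rps, 0.2)} workers at 200 ms mean latency")
```

On these assumed figures the mean rate is about 24 requests per second, so a load test "well beyond" anticipated growth would need to sustain several hundred requests per second.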

The service team described their approach to security and risk management in the pre-assessment technical call, and it was thorough. At the assessment, examples were given of how the team has ensured that data held by the service cannot be extrapolated to determine whether the driver of a particular vehicle is disabled. The service appears at the top of most relevant web searches, and the team has a process in place for removing copycat sites that seek solely to charge for information that is free to the citizen from DVLA.

Much of the code is available in an internet source code repository and the team has made contributions to the open source community. There has been good engagement with the open source team at GDS and it was clear that appropriate thought had gone into why certain areas of code would not be released.

Content and design

A detailed content and design review was conducted prior to the assessment, the results of which were discussed alongside other points raised by the panel in the assessment.

Aside from some content and design issues which will be detailed in a separate document, the service is consistent with the user experience of the rest of GOV.UK, including the use of design patterns and content that make the service simple and intuitive to use, to the standard required to progress to the Live phase.

The service team set out clearly the iterations made to the service’s content and design during the Beta, and were able to demonstrate where they had made improvements based on user feedback, analytics and accessibility audits. For example, removing the requirement to specify the make of the vehicle increased success rates. Improvements to the experience on mobile devices were also evident, based on findings from talking with users about when and where they access the service. Some issues were identified in the accessibility audit; the pass against this point is conditional on these issues being resolved as a matter of priority.

Another significant change was the removal of tabs containing additional information about the vehicle’s history, tax rates and payment options. These tabs were removed based on an accessibility audit. Since then, accessible tab patterns have been developed, and the team talked about reintroducing tabs. Any such iteration should involve further user research to determine whether the inclusion of the additional information and tabs meets evidenced user needs.

Seeing the tax rates for a vehicle and the options to pay requires the user to provide additional information, which adds steps to the journey. The panel were satisfied that using information that requires possession of the vehicle logbook helps to prevent fraud and protect personal information, such as whether the owner is disabled or the vehicle is owned by the military. Further specific testing should take place on the steps to see tax rates and the complexity of the results pages, particularly as mobile usage is expected to increase and ongoing payment steps may be introduced.

DVLA recognises Check if a vehicle is taxed or has an MOT as a service that sits at the beginning of ongoing user journeys (for example, to pay tax, see a vehicle’s MOT history or get a vehicle log book). Links to related information and services are displayed on the right of the page, categorised by the responsible organisation. Further testing and use of analytics is encouraged to determine the visibility and usage of these links, especially for users on mobile, where the links fall to the bottom of the page. Reviewing the results and curating these links should be a collaborative effort between DVLA and DVSA.

It was noted that there are also multiple start pages leading to the service. Routes into the service should be carefully monitored using analytics, and any start pages found to be unnecessary should be removed. It was felt the team could be making better use of the vast analytics and performance information at their disposal, particularly around understanding common user journeys.

Analytics

A Performance Platform dashboard for the service has been available throughout the Beta, providing information on the 4 mandatory key performance indicators. Based on the latest information shared by the team for these KPIs, it is clear that digital take-up is high and increasing, costs are low and decreasing, and user satisfaction is good. The service also has one of the highest numbers of completed online transactions reported on the Performance Platform (719m in 2017/18).

The team outlined a wide range of additional metrics they use to assess the performance of their service. These include measures on usage volumes, how people use the service (devices, browsers etc), user feedback, and service availability. Analysis is shared with the team through regular reports and the insights are used to inform improvements to the service.

The team self-serve with data from Google Analytics (including dedicated funnel reports) and the contact centre. The team also includes a Performance Analyst but, as the service is a mature one, the analyst is shared with other teams. This is a sensible use of resource: other members of the team demonstrated familiarity with data and Google Analytics, and it makes sense that they would only call on a specialist for more detailed analysis.

Data has been used to iterate during Beta. An excellent example is how data demonstrating that users would drop out of the service at the question about the make and model of the car was used to drive an iteration to remove this question. The data confirmed that after this change the drop-off rate decreased.
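The drop-off analysis described above can be sketched as follows. The step names and counts are invented for illustration; the real figures would come from the team’s Google Analytics funnel reports:

```python
def drop_off_rates(funnel):
    """Fraction of users lost at each step-to-step transition,
    given an ordered mapping of step name -> users reaching it."""
    steps = list(funnel.items())
    return {
        f"{a} -> {b}": 1 - count_b / count_a
        for (a, count_a), (b, count_b) in zip(steps, steps[1:])
    }


# Hypothetical funnels before and after removing the make/model question.
before = {"enter registration": 1000, "confirm make and model": 620, "results": 580}
after = {"enter registration": 1000, "results": 930}

for name, funnel in (("before", before), ("after", after)):
    for transition, rate in drop_off_rates(funnel).items():
        print(f"{name}: {transition}: {rate:.0%} drop-off")
```

Comparing the overall drop-off before and after a change in this way is how the team confirmed that removing the question reduced abandonment.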

The team are commended for paying detailed attention to use by device. Data on the level of use by mobile devices prompted user research to understand more about this, yielding insights into use at the roadside by users such as AA roadside assistance staff and council workers.

Data is continuing to drive further improvements to meet user needs. Event tracking on the use of links to other related services shows that the MOT service is the most popular next destination, so the team are starting a project to collaborate with that service to improve the journey. This is a particularly strong example since it demonstrates the use of data to improve end-to-end journeys by users when they touch several services instead of treating each service as a separate silo.
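Summarising that event-tracking data might look something like the sketch below; the link labels and click counts are hypothetical:

```python
from collections import Counter

# Hypothetical click counts from event tracking on outbound links.
outbound_clicks = Counter({
    "Check MOT history (DVSA)": 5200,
    "Tax your vehicle (DVLA)": 3100,
    "Get a vehicle log book (DVLA)": 900,
})


def top_destination(clicks: Counter) -> str:
    """Most-clicked outbound link, i.e. the most popular next step."""
    destination, _count = clicks.most_common(1)[0]
    return destination


print(top_destination(outbound_clicks))
```

On these invented figures, the MOT history service is the most popular next destination, matching the pattern the team reported.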

Recommendations

The service team should continue to iterate and improve the service.

Recommendations following this Live assessment include:

  • Keep up the momentum on user testing and monitoring of analytics once live, to inform the content and design optimisations flagged in this assessment.
  • The team was obviously enthusiastic about further developments that could be made to the service; it is important to remember that any changes or additions made to the service must continue to be user-led and in response to needs uncovered through research.
  • It would be beneficial to ensure that research findings from this service are communicated with other service teams within DVLA.
  • Share your experiences of designing, building and operating a digital service at this scale with the wider government community.
  • Continue to regularly open source the codebase and move towards coding in the open wherever possible.
  • The users of DVLA services are engaged and interested in their development, which is excellent. Continue to take advantage of this interest by publishing blog posts and release notes.

Next steps

You should follow the recommendations made in this report and the supplementary reports, as you continue to develop your service.

Annex A - Content review

Annex B - Accessibility review

Annex C - Performance analysis review

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 6 August 2018