Pay a DVLA fine - Live Assessment

The report from the live assessment of the DfT DVLA 'Pay a DVLA fine' service on 24 January 2018.

From: Central Digital and Data Office
Assessment date: 24 January 2018
Stage: Live
Result: Met
Service provider: DfT DVLA

The service met the Standard because:

  • The service team are capable civil servants who own the direction of the service.
  • They have an excellent understanding of their users and a data-driven approach to improving the service.
  • They have created an intuitive and reliable service.

About the service

Description

Keepers of untaxed or uninsured vehicles receive a penalty, which they have a set period to pay before the case is referred for court action or debt collection. Prior to ‘Pay a DVLA fine’, enforcement payments were made by phone (restricted to a 9-5, 5-day-a-week service) or by posting a cheque, an outdated method. This service adds an online channel in line with the Agency’s digital vision.

Pay a DVLA fine gives users the ability to pay quickly and easily, 24/7, wherever they wish to access the service. 46% of ‘Pay a DVLA fine’ payments are made outside office hours and 45% are made via tablet or smartphone. The most recent improvements allow users to tax their vehicle as well as pay the fine.

Service users

Customers who fail to:

1. pay their Vehicle Excise Duty (vehicle tax)
2. insure their taxed vehicle

These can be vehicle keepers, individuals and companies.

Detail

It was a pleasure to assess another highly capable DVLA service team. The team are making good use of data, have an excellent understanding of their users and have created an intuitive and reliable service. They have taken ownership of the service from third party contractors and built a strong team of civil servants. The panel is confident that they are capable of operating and improving a live service.

User needs

The panel was impressed by the cohesiveness of the team, their knowledge about the users of the service, and the user research conducted since discovery.

The team has a good understanding of the users and their needs, with the mission being to modernise and transform a largely paper-based system into an online service. This has resulted in increased compliance and less administrative burden on the DVLA.

Development of the service started with users being able to make payments for just one type of fine. Through monitoring customer feedback, analytics, and contextual interviews, the ability to make payments for three additional types of fine has been added: Section 29, Continuous Insurance Enforcement, and direct debit offenders (for arrears). Each of these deals with approximately 150,000 penalties per year (approximately 600,000 overall).

Since the beta assessment the team has continued to conduct contextual interviews and usability testing with users, including users who manage fleets and trade organisations. A number of different journeys have been tested, with participants presented with a penalty letter and then asked to try to pay the fine, either by searching via a search engine or by entering the URL into the address bar.

The panel was pleased that the team has continued to iterate the penalty letter that users receive, and has consulted, and continues to consult, other organisations such as TV Licensing to get advice and follow best practice. This has resulted in digital take-up increasing from 60% to more than 70%.

The team also conducts research on a regular basis with their contact centre: listening to calls, consulting call centre staff, and involving them in usability testing of the service. The team also talked about how they conduct thematic analysis on all their call, email, and social media data, and how this feeds into the development of the service. It would be helpful if the team could blog about this work, as other teams would be interested in it and would almost certainly benefit from their experience.

The team continues to involve people with access needs and low digital skills in their research. The team is also looking to move towards WCAG AAA compliance over the next year, and has almost finished building the assisted digital platform for the contact centre, which they’ll need to continue to test. DACC is used for consultation on accessibility issues and some testing, but the team is clear that this won’t replace testing with real users.

As mentioned previously, the team is cohesive in the research it conducts on the service. All members of the team attend research sessions and are able to speak confidently about the findings from research and how it has helped develop the service. The team is also well connected in DVLA; they work well and share research with other service teams, attending show and tells and other research feedback sessions.

Team

The panel were very impressed with the capability of the team - they work together well in an agile way and have good governance structures in place with the wider organisation. The team have done particularly well in transitioning away from third party suppliers towards civil servants. Every member of the team we met was a civil servant. They have good plans and processes in place for future development, and have demonstrated that they are capable of operating and continuously improving the live service.

Whilst the reshuffle has meant they have been unable to test the service with the minister, they have tested with their local MP and Welsh Assembly Member as well as senior civil servants within DVLA.

Note:

Update after the assessment report was issued: the team have tested with the minister since this report was produced.

Technology

The service has undergone significant changes in personnel and technology since the assessment at public beta. The team have taken over ownership from third party contractors, building up their own development capability and processes. They have also migrated the service from UK Cloud to AWS (Amazon Web Services) and implemented a lot of new functionality.

It was really impressive to hear how the team have taken ownership of the service. Their enthusiasm and commitment are clear, and they have implemented processes and practices around development and testing which help them deliver high-quality code.

The application technology stack has not changed significantly, and in fact the team have retained the same fundamental architecture and components that were inherited and migrated to AWS.

Unfortunately, the service is not coded in the open, so it’s not possible to see the development team’s work in action. However, the team have set up an appropriate process of publishing their code repositories to public GitHub sprint by sprint. They have opened all their code with the exception of one repository, which integrates directly with the legacy platform, and the VSS platform code itself. The process for open sourcing the legacy code would be arduous, and with the platform being replaced it is unnecessary.

The monitoring instrumentation of the service is very good. The team have put automation in place, such as Kibana log watchers and certificate validation, and make good use of vendor software capabilities to automate detection, management and response. Appropriate tools and remote system access are available to 24/7 on-call responders.
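
As an illustration of the kind of automation described above, the sketch below shows a minimal certificate expiry check in Python. The host, the 30-day threshold and the overall structure are assumptions for illustration, not the team’s actual tooling.

```python
# Minimal sketch of an automated certificate expiry check.
# The host and the 30-day threshold are illustrative assumptions.
import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expiry(host: str, port: int = 443) -> int:
    """Return the number of days until the TLS certificate for host expires."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    remaining = days_until_cert_expiry("www.gov.uk")
    if remaining < 30:  # warn well before expiry so renewal can be scheduled
        print(f"WARNING: certificate expires in {remaining} days")
    else:
        print(f"Certificate OK: {remaining} days remaining")
```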

Risk and security processes are also mature, with a risk owner in the team and appropriate interactions with the wider departmental risk team and leadership. Security reviews and penetration testing are regular, and managed by DVLA’s central security team. The service team have access to that team for consultation.

There were two points raised at public beta to be addressed in future, which were downtime during deployments and the absence of infrastructure redundancy. Unfortunately, neither of these has been fully resolved, although improvements have been made in both cases.

While user-facing downtime associated with releases has not been wholly eliminated, the team have reduced the impact from several hours to under an hour thanks to their efforts in scripting and automation. The downtime is a limitation of the underlying legacy platform, VSS, which is being replaced as part of DVLA’s enterprise architecture transformation. Zero-downtime (blue/green) deployments are expected to be enabled by the transformation programme, and the team should proactively engage with the architecture group in DVLA to ensure that this will be the case.
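
For context, the sketch below illustrates the blue/green pattern referred to above: a new (‘green’) environment is deployed alongside the live (‘blue’) one and traffic is switched over in a single step, so users see no downtime. It assumes an AWS Application Load Balancer and uses placeholder ARNs; it is not a description of the VSS replacement.

```python
# Illustrative blue/green traffic switch: repoint a load balancer listener from
# the current ("blue") target group to a newly deployed ("green") one.
# The ARNs are placeholders; this is a sketch of the pattern, not the team's setup.
import boto3

elbv2 = boto3.client("elbv2")

LISTENER_ARN = "arn:aws:elasticloadbalancing:eu-west-2:123456789012:listener/..."              # placeholder
GREEN_TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:eu-west-2:123456789012:targetgroup/..."  # placeholder

def switch_traffic_to_green() -> None:
    """Send all traffic to the green environment in a single step."""
    elbv2.modify_listener(
        ListenerArn=LISTENER_ARN,
        DefaultActions=[{"Type": "forward", "TargetGroupArn": GREEN_TARGET_GROUP_ARN}],
    )

if __name__ == "__main__":
    switch_traffic_to_green()
```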

Multi-datacentre redundancy has not been tackled and is something the team should pursue. At some point the single availability zone will fail. Adding redundancy in at least one further availability zone in the same region should be prioritised, and if possible the team should implement multi-region redundancy.
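
The sketch below illustrates one common way of adding that redundancy on AWS: an Auto Scaling group spread across subnets in two availability zones. The names, subnet IDs and sizes are placeholders, not the team’s configuration.

```python
# Sketch of spreading instances across two availability zones with an Auto
# Scaling group, so the failure of a single zone does not take the service down.
# Names, subnet IDs and sizes are placeholders, not the team's configuration.
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="pay-a-fine-web",  # hypothetical name
    LaunchTemplate={"LaunchTemplateName": "pay-a-fine-web", "Version": "$Latest"},
    MinSize=2,
    MaxSize=4,
    # Subnets in two different availability zones within the same region.
    VPCZoneIdentifier="subnet-aaaa1111,subnet-bbbb2222",
)
```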

Overall, despite there being areas to improve, ‘Pay a DVLA fine’ represents an excellent service from a technical perspective. The team has full lifecycle ownership, and at each stage has good practices in place to ensure the right thing is built with appropriate technologies.

Design

The name of the service is inconsistent: is it ‘Pay a DVLA fine’ or ‘Pay a DVLA penalty’? The start page uses the word ‘fine’, but the service uses ‘penalty’. We recommend using ‘fine’ because it’s plain English, more widely understood and follows the GOV.UK style guide. ‘Fine’ is also better because other services and content on GOV.UK about this topic use ‘fine’.

But the most important thing is that there’s consistent wording across all parts of the service, including the letter users receive before they start.

The main entry point to the service for a user is receiving a letter telling them they have a fine to pay. The panel was impressed by the substantial work the team has done to iterate the paper part of the service. User research and A/B testing to improve the letters have resulted in a higher proportion of users paying their fine online.
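
For illustration, a comparison between two letter variants could be checked with a simple two-proportion z-test, as sketched below. The counts are invented and are not the team’s data.

```python
# Two-proportion z-test comparing online take-up between two letter variants.
# The counts below are invented purely for illustration.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(paid_a: int, sent_a: int, paid_b: int, sent_b: int):
    """Return (z, p_value) for the difference in take-up between variants A and B."""
    p_a, p_b = paid_a / sent_a, paid_b / sent_b
    pooled = (paid_a + paid_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Example: variant B of the letter lifts online take-up from 60% to 63%.
z, p = two_proportion_z_test(paid_a=6000, sent_a=10000, paid_b=6300, sent_b=10000)
print(f"z = {z:.2f}, p = {p:.4f}")
```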

The team has designed a service that is largely consistent with GOV.UK patterns and styles. The few deviations from those patterns are backed up by evidence from user research. The design team performs regular audits on services across DVLA to ensure that patterns remain up-to-date and consistent. They are also engaged in the wider government design community and contributing to the GOV.UK Design System, which is encouraging for the ongoing design of the service.

The team have a good incident management system in place in case the service goes down.

The team have set a target of 80% digital take-up within the next 5 years, which is reasonable given the nature of the service (many users will want to speak to a customer service representative to discuss their fine before paying).

Analytics

The team has dedicated analysts, with further analytical support from the wider enforcement programme.

Analysis is shared with the team through weekly reports. Insights raised by analysts help the team to develop new scenarios for further investigation by user researchers.

The team outlined a wide range of metrics they use to assess the performance of their service. These include measures on volumes (number of people paying each type of fine through each channel), how people use the service (devices, browsers etc), user feedback (through the web form and phone), and payments/arrears processed.

A Performance Platform dashboard has been created and it was agreed that this is now ready to go live.

The team is publishing an aggregated completion rate for the different types of fine, but internally it uses segmentation to identify differing patterns of behaviour. The team clearly demonstrated an awareness of the points at which users drop out, and the reasons this happens, which were largely beyond their control (e.g. inability to pay).
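
The sketch below illustrates the difference between the aggregated completion rate that is published and the segmented view used internally, using invented figures.

```python
# Aggregated vs segmented completion rates for the different fine types.
# The figures are invented for illustration only.
import pandas as pd

data = pd.DataFrame({
    "fine_type": ["section_29", "continuous_insurance", "direct_debit_arrears"],
    "started":   [10000, 8000, 6000],
    "completed": [7400, 6300, 3900],
})

# Segmented view: completion rate per fine type reveals differing behaviour.
data["completion_rate"] = data["completed"] / data["started"]

# Aggregated view: a single headline figure across all fine types.
aggregated = data["completed"].sum() / data["started"].sum()

print(data[["fine_type", "completion_rate"]])
print(f"Aggregated completion rate: {aggregated:.1%}")
```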

The team has been granted an exemption from publishing cost per transaction for the service because the costs are bound up with wider enforcement activities.

However, they use costs internally, and the difference in costs for each channel is used to justify the shift to digital. An increase in collection of arrears has also been credited to the digital channel.

All phone calls to DVLA call centres are tagged using speech analytics. The user research team has easy access to calls relating to the fine payments service, which has helped them to understand their users. The panel strongly encourages the team to share their experience of using speech analytics with other teams.
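
As a simple illustration of thematic tagging, the sketch below assigns themes to call transcripts by keyword matching. The themes and keywords are assumptions; the report does not describe the team’s actual speech analytics tooling.

```python
# Deliberately simple keyword-based tagging of call transcripts into themes.
# The themes and keywords are assumptions made for illustration.
THEMES = {
    "payment_difficulty": ["can't pay", "cannot pay", "instalments", "arrears"],
    "letter_confusion":   ["letter", "reference number", "penalty notice"],
    "website_issue":      ["website", "error", "card declined"],
}

def tag_transcript(transcript: str) -> list:
    """Return the themes whose keywords appear in the transcript."""
    text = transcript.lower()
    return [theme for theme, keywords in THEMES.items()
            if any(keyword in text for keyword in keywords)]

print(tag_transcript("I got a letter but the website showed an error when I tried to pay"))
# ['letter_confusion', 'website_issue']
```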

Recommendations

The service team should:

  • Share with the wider government their experience of using thematic analysis on call, email and social media data to inform development of the service.
  • Use consistent wording across all channels.
  • Prioritise infrastructure redundancy in at least one more availability zone.
  • Ensure zero-downtime deployments are achieved as part of DVLA’s enterprise architecture transformation.
  • Continue to regularly open source the codebase and move towards coding in the open if possible.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 30 July 2018